Principles of Information Science
Chapter 3
Measures of Information
3-1 Measures of Random Syntactic Information
Shannon Theory of Information
Key points:

1. Information is something that can be used to remove uncertainty.
2. The amount of information can then be measured by the amount of uncertainty it removes.
3. In the case of communications, only the waveform is concerned, while meaning and value are ignored.
4. Uncertainty, and thus information, are statistical in nature, and statistical mathematics is enough.
Shannon Theorem of Random Entropy
The measure of uncertainty will take the form of

H_S(p_1, \dots, p_N) = - k \sum_{n=1}^{N} p_n \log p_n

if the conditions below are satisfied:

(1) H_S should be a continuous function of p_n, for all n;
(2) H_S should be a monotonically increasing function of N when p_n = 1/N for all n;
(3) H_S should observe the rule of stepped weighting summation.
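For a concrete feel of this measure, here is a minimal Python sketch (our own illustration, not part of the original theorem), taking k = 1 and base-2 logarithms so that H_S is measured in bits:

```python
import math

def shannon_entropy(probs, k=1.0):
    """H_S(p_1, ..., p_N) = -k * sum of p_n * log2(p_n); zero-probability terms contribute 0."""
    return -k * sum(p * math.log2(p) for p in probs if p > 0)

# Condition (2): for equal probabilities p_n = 1/N, H_S increases with N.
for N in (2, 4, 8):
    print(N, shannon_entropy([1.0 / N] * N))   # 1.0, 2.0, 3.0 bits
```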
The rule of stepped weighting summation:

[Figure: a choice among x_1, x_2, x_3 with probabilities p_1 = 1/2, p_2 = 1/3, p_3 = 1/6 is decomposed into two stages: first choose between x_1 and the group {x_2, x_3} with probabilities 1/2 and 1/2; then, within the group, choose x_2 or x_3 with conditional probabilities 2/3 and 1/3.]

Accordingly,

H_S(1/2, 1/3, 1/6) = H_S(1/2, 1/2) + (1/2) H_S(2/3, 1/3)
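This grouping identity is easy to verify numerically; a short sketch (again assuming k = 1 and base-2 logarithms):

```python
import math

def H(probs):
    # Entropy in bits (k = 1); zero-probability terms are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

lhs = H([1/2, 1/3, 1/6])
rhs = H([1/2, 1/2]) + (1/2) * H([2/3, 1/3])
print(lhs, rhs)                    # both ≈ 1.4591 bits
print(abs(lhs - rhs) < 1e-12)      # True
```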
Proof:

(a) In the case of equal probabilities

Let

A(N) = H_S(1/N, \dots, 1/N)

By use of condition (3), it is then easy to have

A(MN) = H_S(1/MN, \dots, 1/MN)
      = H_S(1/M, \dots, 1/M) + \sum_{i=1}^{M} (1/M) H_S(1/N, \dots, 1/N)
      = A(M) + A(N)

Then

A(N^2) = 2 A(N),   A(S^a) = a A(S),   A(t^b) = b A(t)
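The additivity A(MN) = A(M) + A(N) can be spot-checked the same way, e.g. with M = 2 and N = 3 (a sketch with k = 1 and base-2 logarithms):

```python
import math

def A(N):
    # A(N) = H_S(1/N, ..., 1/N) with k = 1, in bits.
    return -sum((1.0 / N) * math.log2(1.0 / N) for _ in range(N))

M, N = 2, 3
print(A(M * N), A(M) + A(N))   # both equal log2(6) ≈ 2.585 bits
```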
For any given b, it is always possible to find a proper a such that

S^a \le t^b < S^{a+1}        (*)

or equivalently, taking logarithms and dividing by b \log S,

a/b \le (\log t) / (\log S) < a/b + 1/b
On the other hand, applying the monotonically increasing quantity A to (*), we have

A(S^a) \le A(t^b) < A(S^{a+1})

or

a A(S) \le b A(t) < (a+1) A(S)        (**)

and, dividing by b A(S),

a/b \le A(t) / A(S) < a/b + 1/b        (***)
Thus, since both ratios lie in the same interval [a/b, a/b + 1/b),

| A(t)/A(S) - (\log t)/(\log S) | < 1/b

When b is taken large enough, this forces A(t)/A(S) = (\log t)/(\log S), and hence

A(t) = k \log t

with k = A(S) / \log S a positive constant.
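To watch the squeeze converge, one can fix concrete values of S and t and let b grow; a small illustrative sketch (S = 2 and t = 3 are our own choices):

```python
import math

S, t = 2, 3
for b in (5, 50, 500):
    a = math.floor(b * math.log(t, S))         # the integer a with S**a <= t**b < S**(a+1)
    gap = abs(a / b - math.log(t, S))
    print(b, a, round(a / b, 5), gap < 1 / b)  # the gap stays below 1/b
```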
(b) In the case of unequal and rational probabilities

Let

p_i = n_i / \sum_{i=1}^{N} n_i,   i = 1, \dots, N

with positive integers n_i; then the unequal-distribution entropy H_S(p_1, \dots, p_N) reduces to the case of equal probabilities.
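The reduction can also be checked numerically: grouping \sum n_i equally likely outcomes into blocks of sizes n_i and applying the stepped weighting rule reproduces the unequal-distribution entropy. A sketch with the (hypothetical) weights n = (3, 2, 1), which give back the probabilities 1/2, 1/3, 1/6 used earlier:

```python
import math

ns = [3, 2, 1]                       # hypothetical integer weights n_i
n = sum(ns)
ps = [ni / n for ni in ns]           # p_i = n_i / sum(n_i)  ->  1/2, 1/3, 1/6

# Stepped weighting: A(n) = H_S(p_1, ..., p_N) + sum_i p_i * A(n_i), with A(m) = log2(m),
# hence H_S(p) = log2(n) - sum_i p_i * log2(n_i).
H_grouping = math.log2(n) - sum(p * math.log2(ni) for p, ni in zip(ps, ns))
H_direct = -sum(p * math.log2(p) for p in ps)
print(H_grouping, H_direct)          # equal: ≈ 1.4591 bits
```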