Principles of Information Science
Chapter 5
Principles of Information Transferring:
Communication Theory
1. Model of Communication System
[Figure: communication system model. Source → Transformation T → Channel C → Inverse Transformation T⁻¹ → Sink, with noise entering the channel; X and Y denote the channel input and output.]
Transformation: to match the source with the channel.
Source, sink, and channel are given beforehand.
The Functions of Transformation
- Modulation: spectrum matching, seeking better performance/cost
- Amplification: improving the signal-to-noise ratio
- Equalization: adjusting channel characteristics
- Source Coding: improving transmission efficiency
- Channel Coding: noise immunity
- Cryptographic Coding: security protection
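As an illustration of how channel coding buys noise immunity, here is a minimal sketch of a 3-repetition code over a binary symmetric channel (the crossover probability 0.1 and message length are illustrative choices, not values from the text):

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def encode(bits):
    """3-repetition channel code: transmit each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each group of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

rng = random.Random(0)
p = 0.1                                   # channel crossover probability (illustrative)
msg = [rng.randint(0, 1) for _ in range(10000)]

raw_errors = sum(a != b for a, b in zip(msg, bsc(msg, p, rng))) / len(msg)
coded = decode(bsc(encode(msg), p, rng))
coded_errors = sum(a != b for a, b in zip(msg, coded)) / len(msg)

# Uncoded error rate ≈ p = 0.1; coded error rate ≈ 3p² - 2p³ ≈ 0.028.
print(raw_errors, coded_errors)
```

The repetition code trades transmission efficiency (rate 1/3) for a lower residual error rate, which is exactly the source-coding/channel-coding tension in the list above.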
2. Model Analysis
Ignoring content and utility factors, the
communication model exhibits statistical
properties.
Source Entropy: H(X); Joint Entropy: H(X, Y)
I(X; Y) = H(X) - H(X|Y)
= H(Y) - H(Y|X)
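The two expressions for I(X; Y) can be checked numerically. A small sketch with an illustrative 2×2 joint distribution (the probabilities are made up for the example):

```python
import math

# Illustrative joint distribution p(x, y) over X ∈ {0, 1}, Y ∈ {0, 1}.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginal distributions.
p_x = {x: sum(v for (a, _), v in p_xy.items() if a == x) for x in (0, 1)}
p_y = {y: sum(v for (_, b), v in p_xy.items() if b == y) for y in (0, 1)}

def H(dist):
    """Shannon entropy in bits of a dict of probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

H_X, H_Y, H_XY = H(p_x), H(p_y), H(p_xy)
H_X_given_Y = H_XY - H_Y      # chain rule: H(X|Y) = H(X, Y) - H(Y)
H_Y_given_X = H_XY - H_X      # chain rule: H(Y|X) = H(X, Y) - H(X)

I1 = H_X - H_X_given_Y
I2 = H_Y - H_Y_given_X
print(I1, I2)                 # the two expressions for I(X; Y) coincide
```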
A fundamental feature of communication:
recovering the transmitted waveform at the receiving
end, with a certain fidelity, in the presence of noise.
Definition (via mutual information):
Channel Capacity  C = max_{p(x)} I(X; Y)
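This maximization over input distributions can be sketched numerically for a binary symmetric channel (the crossover probability 0.1 is an illustrative choice): scan p(X=1) = q over a grid and compare the maximum of I(X; Y) with the known closed form 1 - H₂(ε).

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

eps = 0.1  # BSC crossover probability (illustrative)

def mutual_info(q):
    """I(X; Y) for a BSC with input distribution P(X=1) = q."""
    p_y1 = q * (1 - eps) + (1 - q) * eps   # output distribution
    return h2(p_y1) - h2(eps)              # I(X; Y) = H(Y) - H(Y|X)

# Maximize over the input distribution by a fine grid search.
C = max(mutual_info(q / 1000) for q in range(1001))
print(C, 1 - h2(eps))   # the grid maximum matches the closed form 1 - H2(eps)
```

The maximum is attained at the uniform input q = 1/2, which is what the symmetry of the channel suggests.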
Example (AWGN Channel):

p(y|x) = (2πσ²)^(-1/2) exp[-(y - x)²/(2σ²)]

I(X; Y) = H(Y) - H(Y|X)
        = H(Y) + ∫∫ p(x) p(y|x) log₂ p(y|x) dy dx
        = H(Y) - (1/2) log₂(2πeσ²)

The only way to maximize I(X; Y) is to maximize H(Y).
At fixed power, this requires Y to be a Gaussian variable
with zero mean; since Y = X + N, it requires X to be a
Gaussian variable with zero mean.

Let P_Y = P + σ², where P is the signal power and
N = σ² is the noise power. Then

C = (1/2) log₂(2πeP_Y) - (1/2) log₂(2πeσ²)
  = (1/2) log₂(P_Y/σ²)
  = (1/2) log₂(1 + P/N)   (bit/symbol)
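The claim that a Gaussian Y maximizes H(Y) at fixed power can be illustrated by comparing the standard closed-form differential entropies of three zero-mean distributions with the same variance σ² (the value of σ² below is an arbitrary example):

```python
import math

sigma2 = 2.0  # common variance (illustrative)

# Differential entropies in nats for zero-mean distributions of variance sigma2.
h_gaussian = 0.5 * math.log(2 * math.pi * math.e * sigma2)
h_uniform  = 0.5 * math.log(12 * sigma2)     # Uniform(-a, a) with a² = 3σ²
h_laplace  = 1 + 0.5 * math.log(2 * sigma2)  # Laplace(b) with 2b² = σ²

print(h_gaussian, h_uniform, h_laplace)      # the Gaussian entropy is the largest
```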
If X is a white Gaussian signal with bandwidth F and
duration T, then 2FT symbols are transmitted during the
time T, i.e. 2F symbols per second. Therefore, we have

C = F log₂(1 + P/N)  bit/second   (N = σ²),

the famous capacity formula.
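A quick numeric reading of the capacity formula; the channel parameters below (a 3100 Hz bandwidth at 30 dB SNR) are illustrative examples, not values from the text:

```python
import math

def shannon_capacity(F, P_over_N):
    """Channel capacity C = F * log2(1 + P/N) in bit/second."""
    return F * math.log2(1 + P_over_N)

# Illustrative example: bandwidth F = 3100 Hz, SNR = 30 dB, i.e. P/N = 1000.
C = shannon_capacity(3100, 1000)
print(round(C))   # ≈ 30898 bit/second
```

Note that capacity grows linearly in bandwidth F but only logarithmically in the signal-to-noise ratio P/N.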
[Figure: signal volume depicted as a box with dimensions F (bandwidth), T (duration), and P (power).]
Signal Volume to be transmitted through a channel
must be smaller than the channel capacity provided.
From
C = F log₂(1 + P/N):
A) F, T, P are basic parameters of a channel.
C) Signal Division:
   Frequency Division -- FDMA
   Time Division -- TDMA
Time Freque