Information Theory and Coding
Reference
Digital and Analog Communication Systems, K. Sam Shanmugam
Chapter 1: Introduction to Electronic Communications Systems
The main purpose of an electronic communications system is to transfer information from one place to another. Electronic communications can be viewed as the transmission, reception and processing of information between two or more locations using electronic circuits and devices.
Basic Communication Model
The basic communication model shows the communication flow between two points.
Source: the sender of the information
Sink: the receiver of the information
Channel: the transmission path/medium carrying the information between the source and the sink
Communication System Model
Transmission channel: the physical link between the communicating parties
Modulator: transforms the source signal so that it is physically suitable for the transmission channel
Transmitter: introduces the modulated signal into the channel (and also acts as an amplifier)
Receiver: detects the signal on the channel and amplifies it (to compensate for attenuation)
Demodulator: recovers the original source signal from the received signal and passes it to the recipient
Model of a Communication System
The figure below shows the basic functional blocks of a communication system. The overall purpose of this system is to transfer information from one point in space and time, called the source, to another point, the user destination.
Input transducer
The message produced by the source is not electrical. Hence an input transducer is required to convert the message into a time-varying electrical quantity called the message signal.
Elements of Digital Communication
Information Theory
It is the study of communication engineering combined with mathematics.
A communication engineer has to contend with:
Limited power
Inevitable background noise
Limited bandwidth
Information Theory deals with
The measure of source information
The information capacity of the channel
Coding
Information Measure
This is used to determine the information rate of discrete sources.
Consider two messages:
"A dog bites a man": high probability, less information.
"A man bites a dog": low probability, high information.
So we can say that
Information ∝ 1 / (probability of occurrence)
Also, we can state the following three rules from intuition.
Rule 1: The information I(mk) approaches 0 as pk approaches 1.
Mathematically, I(mk) → 0 as pk → 1.
e.g. "The sun rises in the east."
Rule 2: The information content I(mk) must be non-negative; it may be zero.
Mathematically, I(mk) >= 0 for 0 <= pk <= 1.
e.g. "The sun rises in the west."
Rule 3: The information content of a message with higher probability is less than the information content of a message with lower probability.
Mathematically, I(mk) > I(mj) when pk < pj.
Also, for two messages we can state that the information content of the combined messages is the same as the sum of the information content of each message, provided their occurrences are mutually independent.
e.g. "There will be sunny weather today." and "There will be cloudy weather tomorrow."
Mathematically,
I(mk and mj) = I(mk, mj) = I(mk) + I(mj)
So the question is: which function can we use to measure the information?
Information = F(1/Probability)
Requirements the function must satisfy:
1. Its output must be a non-negative quantity.
2. Its minimum value is 0.
3. It should turn a product into a summation.
Information
I(m_k) = \log_b(1/p_k)
Here b may be 2, e or 10:
if b = 2, the unit is bits;
if b = e, the unit is nats;
if b = 10, the unit is decits.
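As a quick check (added here, not part of the original slides), the logarithmic measure satisfies all three requirements listed above:

For 0 < p_k \le 1 we have 1/p_k \ge 1, so I(m_k) = \log_b(1/p_k) \ge 0, with minimum value 0 at p_k = 1.
For mutually independent messages,
I(m_k, m_j) = \log_b \frac{1}{p_k p_j} = \log_b \frac{1}{p_k} + \log_b \frac{1}{p_j} = I(m_k) + I(m_j).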
Conversion Between Units
\log_2 v = \frac{\ln v}{\ln 2} = \frac{\log_{10} v}{\log_{10} 2}
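As an illustration (a minimal sketch, not from the slides; the function name self_information is my own), the same quantity can be computed in bits, nats or decits:

import math

def self_information(p, base=2):
    """Information content I = log_base(1/p) of a message with probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return math.log(1.0 / p, base)

p = 0.25
print(self_information(p, 2))       # 2.0 bits
print(self_information(p, math.e))  # ~1.386 nats
print(self_information(p, 10))      # ~0.602 decits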
Example
A source generates one of four symbols during each interval with probabilities p1 = 1/2, p2 = 1/4, p3 = p4 = 1/8. Find the information content of each of these messages.
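A worked solution (added for completeness) follows directly from I(m_k) = \log_2(1/p_k):

I(m_1) = \log_2 2 = 1 bit,  I(m_2) = \log_2 4 = 2 bits,  I(m_3) = I(m_4) = \log_2 8 = 3 bits.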
Average Information Content (Entropy) of Symbols in Long Independent Sequences
It is necessary to define the information content of a particular symbol, since the communication channel deals with symbols.
Here we make the following assumptions:
1. The source is stationary, so the probabilities remain constant with time.
2. Successive symbols are statistically independent and are emitted at an average rate of r symbols per second.
Average Information Content
Suppose a source emits M possible symbols s1, s2, ..., sM having probabilities of occurrence p1, p2, ..., pM, where
\sum_{i=1}^{M} p_i = 1
For a long message containing N (>> M) symbols,
s1 will occur about p1 N times,
s2 will occur about p2 N times, and so on.
Since s1 occurs p1 N times, the information contributed by s1 is I1 = p1 N \log_2(1/p1).
Similarly, the information contributed by s2 is I2 = p2 N \log_2(1/p2), and so on.
Hence the total information content is
I_{total} = \sum_{i=1}^{M} N p_i \log_2(1/p_i)
The average information per symbol is obtained by dividing by N:
H = I_{total}/N = \sum_{i=1}^{M} p_i \log_2(1/p_i)   bits/symbol
It means that in a long message we can expect H bits of information per symbol. Another name for H is entropy.
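A small Python sketch (added for illustration; the function name entropy is my own) that computes H for a discrete memoryless source:

import math

def entropy(probs, base=2):
    """Average information content H = sum of p_i * log_base(1/p_i)."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

# The four-symbol source from the earlier example: p = 1/2, 1/4, 1/8, 1/8
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol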
Information Rate
Information rate = total information / time taken
Here n symbols are transmitted at r symbols per second, so the time taken is Tb = n/r, and the total information is nH. Hence
R = \frac{nH}{n/r} = rH   bits/sec
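Continuing the illustrative sketch above (again with assumed names and values), the information rate is simply the entropy multiplied by the symbol rate:

import math

def information_rate(probs, r):
    """R = r * H in bits per second, for a symbol rate r (symbols/sec)."""
    H = sum(p * math.log2(1.0 / p) for p in probs if p > 0)  # entropy in bits/symbol
    return r * H

# e.g. the four-symbol source emitting 1000 symbols per second
print(information_rate([0.5, 0.25, 0.125, 0.125], r=1000))  # 1750.0 bits/sec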
Preliminary
Assume a discrete-time digital information source S:
M = {a1, ..., ak}... the set of symbols of S
(S is said to be a k-ary information source.)
Xt...the symbol which S produces at time t
The sequence X1, ..., Xn is called a message produced by S.
Example: S = a fair die. If the message is 5, 2, then X1 = 5 and X2 = 2.
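Since this 6-ary source is equiprobable, a quick added calculation gives the information carried per symbol:

I(X_t) = \log_2 6 \approx 2.585 bits per symbol, so, assuming independent rolls, the two-symbol message 5, 2 carries about 2 \log_2 6 \approx 5.17 bits.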
Average Information Content (Entropy) of Symbols in Long Dependent Sequences
Dependent = memory: the occurrence of the present symbol depends on previous symbols.
Information of Sources with Memory
Most real-world sources exhibit memory, resulting in correlated source signals; this property is retained during sampling and quantisation.
Here, memory can be modelled by a Markov process.
Markov Information Source
A simple model of an information source with memory: the choice of the next symbol depends on at most m previous symbols (an m-th order Markov source).
m = 0: memoryless source
m = 1: simple Markov source
A Markov process is the most complete model to describe sources with memory; it is a probabilistic model. (Andrey Markov, 1856-1922)
The most widely used Markov process is the 1st-order Markov process, where:
Pi = P(Xi) is the probability of occurrence of state Xi. Imagine starting an experiment at time index t = 0; the process S(0) starts from state Xi with probability Pi, hence Pi is an a priori probability.
The transition probability pij describes the probability of the process changing from state Xi to state Xj, and hence is the conditional probability p(S(t) = Xj | S(t-1) = Xi) = pij.
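A brief Python sketch (illustrative only; the two states and all numerical values are my own assumptions) of a first-order Markov source defined by a priori probabilities Pi and transition probabilities pij:

import random

# A priori probabilities P_i and transition probabilities p_ij for two states, 0 and 1
P = [0.5, 0.5]                  # P(S(0) = state i)
p = [[0.9, 0.1],                # p[i][j] = P(S(t) = j | S(t-1) = i)
     [0.2, 0.8]]

def sample_markov(n, P, p):
    """Generate n successive states of a first-order Markov process."""
    state = random.choices([0, 1], weights=P)[0]
    states = [state]
    for _ in range(n - 1):
        state = random.choices([0, 1], weights=p[state])[0]
        states.append(state)
    return states

print(sample_markov(20, P, p))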
Example of (Simple) Markov Source
S ... a memoryless and stationary source with P(0) = q, P(1) = 1 - q.
The output Xt is formed from S and a 1-bit register R holding the previous output Xt-1 (in effect, Xt = S XOR Xt-1):
If Xt-1 = 0, then R = 0:
S = 0 → Xt = 0 ... P(Xt = 0 | Xt-1 = 0) = q
S = 1 → Xt = 1 ... P(Xt = 1 | Xt-1 = 0) = 1 - q
If Xt-1 = 1, then R = 1:
S = 0 → Xt = 1 ... P(Xt = 1 | Xt-1 = 1) = q
S = 1 → Xt = 0 ... P(Xt = 0 | Xt-1 = 1) = 1 - q
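A short simulation sketch (my own illustration; the value of q is assumed) of this construction, estimating the conditional probabilities empirically:

import random

q = 0.7          # assumed P(S = 0)
N = 100_000      # number of generated symbols

prev = 0         # initial register content R
counts = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 0}  # (X_{t-1}, X_t) pairs
for _ in range(N):
    s = 0 if random.random() < q else 1  # memoryless source output
    x = s ^ prev                         # X_t = S XOR X_{t-1} (1-bit register)
    counts[(prev, x)] += 1
    prev = x

# Estimated conditional probabilities; P(0|0) and P(1|1) should both approach q
for b in (0, 1):
    total = counts[(b, 0)] + counts[(b, 1)]
    print(f"P(Xt=0|Xt-1={b}) ~ {counts[(b, 0)] / total:.3f}, "
          f"P(Xt=1|Xt-1={b}) ~ {counts[(b, 1)] / total:.3f}")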
Markov Source as a Finite State Machine
For an m-th order k-ary Markov source:
The next symbol depends on the previous m symbols.
The model is in one of k^m internal states.
The state changes when a new symbol is generated.
The source can therefore be described as a finite state machine.
[State diagram of the example source: two states, 0 and 1, with edges labelled "generated symbol / probability". From state 0: 0/q (stay in state 0) and 1/(1-q) (go to state 1). From state 1: 1/q (stay in state 1) and 0/(1-q) (go to state 0).]
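As a final illustrative sketch (the value of q and all names are my own assumptions), the same two-state source written directly as a finite state machine with a transition table:

import random

q = 0.7  # assumed probability on the "stay" edges of the state diagram
# transitions[state] = list of (next_state, emitted_symbol, probability)
transitions = {
    0: [(0, 0, q), (1, 1, 1 - q)],
    1: [(1, 1, q), (0, 0, 1 - q)],
}

def run_fsm(n, start=0):
    """Generate n symbols from the two-state Markov source described by the diagram."""
    state, out = start, []
    for _ in range(n):
        edges = transitions[state]
        nxt = random.choices(edges, weights=[e[2] for e in edges])[0]
        state = nxt[0]
        out.append(nxt[1])
    return out

print(run_fsm(30))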