
Information Theory

INFORMATION THEORY AND CODING, by Kulkarni & Shivaprakasha
Copyright © 2015 Wiley India Pvt. Ltd. All rights reserved.
Measure of Information
Consider the following two statements:

It rained heavily in Cherrapunji yesterday.
There was heavy rainfall in Rajasthan last night.

The first statement is unsurprising, since Cherrapunji is among the wettest places on Earth, and so it conveys little information. The second is highly unexpected, since Rajasthan is largely desert, and so it conveys much more. The information content of a message is therefore tied to how improbable the message is.

The self-information of a message $k$, denoted $I_k$, is inversely related to its probability of occurrence $p_k$: the less likely the message, the more information it conveys.

The information conveyed by a message cannot be negative:
$I_k \ge 0$

If the event is certain, the corresponding information conveyed by the event is zero:
if $p_k = 1$, then $I_k = 0$

The information conveyed by composite independent messages is the sum of the individual self-information contents (see the sketch below):
$I(m_1, m_2) = I(m_1) + I(m_2)$
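A minimal Python sketch (not from the slides) that checks these three axioms numerically; the probabilities are illustrative:

```python
import math

def self_information(p: float) -> float:
    """Self-information in bits of a message with probability p (0 < p <= 1)."""
    return -math.log2(p)

# Non-negativity: I_k >= 0 for any valid probability
assert self_information(0.25) >= 0

# A certain event conveys no information: p_k = 1 gives I_k = 0
assert self_information(1.0) == 0.0

# Additivity for independent messages: the joint probability is the
# product p1 * p2, so I(m1, m2) = I(m1) + I(m2)
p1, p2 = 0.5, 0.125
assert math.isclose(self_information(p1 * p2),
                    self_information(p1) + self_information(p2))
```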

The self-information content of a message is given by
$I_k = \log_r \left( \frac{1}{p_k} \right) = -\log_r p_k$
where the base $r$ of the logarithm fixes the unit:

If $r = 2$, the unit is bits
If $r = e$, the unit is nats
If $r = 10$, the unit is hartleys (also called decits)
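A short sketch (not from the slides) computing the same self-information in each of the three units:

```python
import math

def self_information(p: float, base: float) -> float:
    """Self-information of probability p, in units set by the log base."""
    return -math.log(p) / math.log(base)

p = 0.1
print(f"{self_information(p, 2):.4f} bits")        # r = 2
print(f"{self_information(p, math.e):.4f} nats")   # r = e
print(f"{self_information(p, 10):.4f} hartleys")   # r = 10
```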

Average Information Content (Entropy) of a Zero-Memory Source
Source alphabet: $S = \{s_1, s_2, \ldots, s_q\}$
Symbol probabilities: $P = \{p_1, p_2, \ldots, p_q\}$, with $\sum_{k=1}^{q} p_k = 1$
In a long message of $N$ symbols, symbol $s_k$ occurs about $N p_k$ times, so the total amount of self-information is
$I_{\text{total}} = \sum_{k=1}^{q} N p_k \log_2 \frac{1}{p_k}$ bits
The average information content per symbol is
$H(S) = \frac{I_{\text{total}}}{N} = \sum_{k=1}^{q} p_k \log_2 \frac{1}{p_k}$ bits/symbol
The entropy of the source is
$H(S) = \sum_{k=1}^{q} p_k \log_2 \frac{1}{p_k}$ bits/symbol

The average information rate is
$R_s = r_s H(S)$ bits/second
where $r_s$ is the symbol rate in symbols/second.
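A minimal Python sketch (not from the slides); the source probabilities and the symbol rate are illustrative:

```python
import math

def entropy(probs):
    """Entropy H(S) in bits/symbol of a zero-memory source."""
    assert math.isclose(sum(probs), 1.0), "probabilities must sum to 1"
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

P = [0.5, 0.25, 0.125, 0.125]   # illustrative 4-symbol source
r_s = 1000                      # assumed symbol rate, symbols/second
H = entropy(P)                  # 1.75 bits/symbol for this source
print(f"H(S) = {H} bits/symbol, R_s = {r_s * H} bits/second")
```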

Properties of Entropy
Entropy is a continuous function of the symbol probabilities.
Entropy is a symmetric function of its arguments: permuting the $p_k$ leaves $H(S)$ unchanged.
Upper bound on entropy: $H(S) \le \log_2 q$, with equality if and only if all $q$ symbols are equiprobable ($p_k = 1/q$).

Source efficiency:
$\eta = \frac{H(S)}{H_{\max}} = \frac{H(S)}{\log_2 q}$

Redundancy:
$1 - \eta$
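A short sketch (not from the slides) computing efficiency and redundancy for an illustrative source:

```python
import math

def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

P = [0.5, 0.25, 0.125, 0.125]   # illustrative 4-symbol source
H = entropy(P)                  # 1.75 bits/symbol
H_max = math.log2(len(P))       # upper bound log2(q) = 2 bits/symbol
eta = H / H_max
print(f"efficiency = {eta}, redundancy = {1 - eta}")

# The bound is reached when all symbols are equiprobable
assert math.isclose(entropy([0.25] * 4), H_max)
```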

Extension of a Zero-Memory Source

The $n$th-order extension $S^n$ treats each block of $n$ source symbols as a single symbol; it has $q^n$ symbols, and since the source has no memory, each block's probability is the product of its constituent symbol probabilities.

For the $n$th-order extension of $S$:
$H(S^n) = nH(S)$
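A quick numerical check of this identity for $n = 2$ (a sketch, not from the slides; the source probabilities are illustrative):

```python
import math
from itertools import product

def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

P = [0.7, 0.2, 0.1]   # illustrative zero-memory source S
n = 2
# Each symbol of S^n is an n-tuple; no memory => probability is a product
P_ext = [math.prod(block) for block in product(P, repeat=n)]
assert math.isclose(sum(P_ext), 1.0)
assert math.isclose(entropy(P_ext), n * entropy(P))   # H(S^n) = n H(S)
```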

Entropy of a Source with Memory

A Markov model is used to represent a source with memory: the probability of the next symbol depends on the current state, with transition probabilities $p_{ij}$ from state $i$ to state $j$.

State probabilities: the stationary probabilities $P_i$ satisfy
$P_j = \sum_i P_i \, p_{ij}$, with $\sum_i P_i = 1$

Source entropy: the entropy of state $i$ is $H_i = \sum_j p_{ij} \log_2 \frac{1}{p_{ij}}$, and the source entropy is the weighted average
$H = \sum_i P_i H_i$
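A minimal sketch (not from the slides) for an illustrative two-state Markov source:

```python
import math

# Illustrative transition matrix: p[i][j] = P(next state j | current state i)
p = [[0.9, 0.1],
     [0.4, 0.6]]

# For a two-state chain the stationary probabilities have a closed form
P1 = p[1][0] / (p[0][1] + p[1][0])   # = 0.8
P2 = 1.0 - P1                        # = 0.2

def state_entropy(row):
    """Entropy of one state's outgoing transition distribution."""
    return sum(q * math.log2(1.0 / q) for q in row if q > 0)

# Source entropy is the stationary-weighted average of the state entropies
H = P1 * state_entropy(p[0]) + P2 * state_entropy(p[1])
print(f"P1 = {P1}, P2 = {P2}, H = {H:.4f} bits/symbol")   # H ≈ 0.5694
```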

Average information rate:
$R = r_s H$ bits/second

The average information content per symbol in messages of length $L$ is
$G_L = \frac{1}{L} \sum_i P(m_i) \log_2 \frac{1}{P(m_i)}$
where the sum runs over all messages $m_i$ of length $L$.

Condition to be satisfied: $G_L$ is non-increasing in $L$, $G_L \ge H$, and
$\lim_{L \to \infty} G_L = H$
The sketch below illustrates this convergence.
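A sketch (not from the slides) that enumerates all length-$L$ messages of the two-state source above and shows $G_L$ decreasing toward $H$:

```python
import math
from itertools import product

p = [[0.9, 0.1], [0.4, 0.6]]   # same illustrative two-state source as above
P_stat = [0.8, 0.2]            # its stationary state probabilities

def G(L):
    """Average information per symbol over all length-L messages."""
    total = 0.0
    for msg in product(range(2), repeat=L):
        prob = P_stat[msg[0]]               # probability of the starting symbol
        for a, b in zip(msg, msg[1:]):
            prob *= p[a][b]                 # chain of transition probabilities
        total += prob * math.log2(1.0 / prob)
    return total / L

for L in (1, 2, 4, 8):
    print(f"G_{L} = {G(L):.4f}")   # non-increasing, approaching H ≈ 0.5694
```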

Further Reading
[1] K. Sam Shanmugam, Digital and Analog Communication Systems, John Wiley & Sons: New York, 1979.
[2] Simon Haykin, An Introduction to Analog and Digital Communications, John Wiley & Sons: New York, 1989.
[3] Ranjan Bose, Information Theory, Coding and Cryptography, Tata McGraw-Hill: New Delhi, 2007.
[4] Simon Haykin and Michael Moher, Modern Wireless Communications, Dorling Kindersley (India) Pvt. Ltd.: New Delhi, 2007.
[5] Claude Elwood Shannon, "A Mathematical Theory of Communication", Bell System Technical Journal, 27, pp. 379-423, 623-656, 1948.
[6] Ian A. Glover and Peter M. Grant, Digital Communications, Pearson Education: UK, 2004.
[7] Bernard Sklar and Pabitra K. Ray, Digital Communications: Fundamentals and Applications, Dorling Kindersley (India) Pvt. Ltd.: India, 2001.
[8] Andrea Goldsmith, Wireless Communications, Cambridge University Press: UK, 2005.
[9] John G. Proakis and Masoud Salehi, Contemporary Communication Systems Using MATLAB, PWS Publishing Company: Boston, MA, 1998.

