Noise in Communication Systems
Information and Measure of Information

Introduction
Noise is a general term used to describe an unwanted signal which affects a wanted signal. These unwanted signals arise from a variety of sources which may be considered in one of two main categories:
• Interference, usually from a human source (man-made)
• Naturally occurring random noise
Interference
Interference arises, for example, from other communication systems (crosstalk), 50 Hz supplies (hum) and their harmonics, switched-mode power supplies, thyristor circuits, ignition systems (car spark plugs), motors, etc.
Thermal Noise (Johnson Noise)
This type of noise is generated by all resistances (e.g. a resistor, a semiconductor, the resistance of a resonant circuit, i.e. the real part of the impedance, a cable, etc.).
Experimental results (by Johnson) and theoretical studies (by Nyquist) give the mean square noise voltage as

$\overline{V^2} = 4kTBR \quad (\text{volt}^2)$

where k = Boltzmann's constant = $1.38 \times 10^{-23}$ joules per K
T = absolute temperature (K)
B = bandwidth over which the noise is measured (Hz)
R = resistance (ohms)
Thermal Noise (Johnson Noise) (Cont’d)
The law relating noise power, N, to the temperature and bandwidth is
N = kTB watts
Thermal noise is often referred to as ‘white noise’ because it has a
uniform ‘spectral density’.
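As a quick numerical check, the sketch below (Python) evaluates N = kTB and the corresponding RMS thermal noise voltage √(4kTBR); the resistance, temperature and bandwidth values are assumed here purely for illustration.

```python
import math

k = 1.38e-23  # Boltzmann's constant (J/K)

def thermal_noise_power(T, B):
    """Thermal noise power N = kTB in watts."""
    return k * T * B

def thermal_noise_rms_voltage(T, B, R):
    """RMS thermal noise voltage sqrt(4kTBR) in volts."""
    return math.sqrt(4 * k * T * B * R)

# Assumed example values: 290 K, 10 kHz bandwidth, 10 kOhm resistor
N = thermal_noise_power(290, 10e3)               # ~4.0e-17 W
Vn = thermal_noise_rms_voltage(290, 10e3, 10e3)  # ~1.27e-6 V (about 1.3 uV)
print(f"N = {N:.3e} W, Vn = {Vn:.3e} V")
```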
Shot Noise
• Shot noise was originally used to describe noise due to random
fluctuations in electron emission from cathodes in vacuum tubes
(called shot noise by analogy with lead shot).
• Shot noise also occurs in semiconductors due to the liberation of
charge carriers.
• For pn junctions the mean square shot noise current is

$\overline{I_n^2} = 2(I_{DC} + 2I_o)\, q_e B \quad (\text{amp}^2)$

where
$I_{DC}$ is the direct current through the pn junction (amps)
$I_o$ is the reverse saturation current (amps)
$q_e$ is the electron charge = $1.6 \times 10^{-19}$ coulombs
B is the effective noise bandwidth (Hz)
• Shot noise is found to have a uniform spectral density, as for thermal noise.
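A short sketch of the shot-noise formula above; the junction current, reverse saturation current and bandwidth are assumed values chosen only for illustration.

```python
import math

q_e = 1.6e-19  # electron charge (C)

def shot_noise_rms_current(I_dc, I_o, B):
    """RMS shot noise current sqrt(2*(I_DC + 2*I_o)*q_e*B) in amps."""
    return math.sqrt(2 * (I_dc + 2 * I_o) * q_e * B)

# Assumed example values: 1 mA junction current, 25 nA reverse saturation current, 10 kHz bandwidth
I_n = shot_noise_rms_current(I_dc=1e-3, I_o=25e-9, B=10e3)
print(f"In = {I_n:.3e} A")   # ~1.8e-9 A
```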
Low Frequency or Flicker Noise
Active devices (integrated circuits, diodes, transistors, etc.) also exhibit a low-frequency noise which is frequency dependent (i.e. non-uniform), known as flicker noise or 'one-over-f' noise.
Excess Resistor Noise
Thermal noise in resistors does not vary with frequency, as previously noted, but many resistors also generate an additional frequency-dependent noise referred to as excess noise.
Burst Noise or Popcorn Noise
Some semiconductors also produce burst or popcorn noise with a spectral density which is proportional to $1/f^2$.
Noise Evaluation
The essence of calculations and measurements is to determine the signal power to noise power ratio, i.e. the (S/N) ratio or the (S/N) expression in dB.

$\left(\frac{S}{N}\right)_{ratio} = \frac{S}{N}$

$\left(\frac{S}{N}\right)_{dB} = 10 \log_{10}\left(\frac{S}{N}\right)$

Also recall that

$S_{dBm} = 10 \log_{10}\left(\frac{S\,(\text{mW})}{1\,\text{mW}}\right)$ and $N_{dBm} = 10 \log_{10}\left(\frac{N\,(\text{mW})}{1\,\text{mW}}\right)$

i.e. $\left(\frac{S}{N}\right)_{dB} = 10 \log_{10} S - 10 \log_{10} N = S_{dBm} - N_{dBm}$
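A minimal sketch of these dB relationships, confirming that (S/N) in dB equals S_dBm − N_dBm; the signal and noise powers are assumed values for illustration.

```python
import math

def to_dBm(power_mW):
    """Convert a power in mW to dBm (reference 1 mW)."""
    return 10 * math.log10(power_mW / 1.0)

S_mW, N_mW = 2.0, 0.002                 # assumed signal and noise powers in mW
snr_dB = 10 * math.log10(S_mW / N_mW)
print(snr_dB)                           # 30.0 dB
print(to_dBm(S_mW) - to_dBm(N_mW))      # also 30.0 dB
```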
Noise Evaluation (Cont’d)
The amplitude of the noise, at any frequency or in any band of frequencies (e.g. 1 Hz, 10 Hz, …, 100 kHz), has a Gaussian probability distribution.
Noise Evaluation (Cont’d)
Noise may be quantified in terms of the noise power spectral density, $p_0$ watts per Hz, from which the noise power N may be expressed as

$N = p_0 B_n$ watts

[Figure: an ideal low pass filter of bandwidth B Hz = $B_n$, for which $N = p_0 B_n$ watts, and a practical LPF, for which the 3 dB bandwidth is shown; the noise does not suddenly cease at $B_{3dB}$.]

Therefore $B_n > B_{3dB}$; $B_n$ depends on the actual filter.

$N = p_0 B_n$

In general the equivalent noise bandwidth is greater than $B_{3dB}$.
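The equivalent noise bandwidth can be estimated numerically from a filter's power response. The sketch below assumes a first-order RC low-pass response (not taken from the notes), for which $B_n$ comes out at about $(\pi/2) B_{3dB} \approx 1.57\, B_{3dB}$, illustrating that $B_n > B_{3dB}$.

```python
import numpy as np

f_3dB = 1e3                                   # assumed 3 dB bandwidth (Hz)
f = np.linspace(0.0, 1000 * f_3dB, 1_000_000)
H2 = 1.0 / (1.0 + (f / f_3dB) ** 2)           # power response of a first-order low-pass filter

# Equivalent noise bandwidth: integral of |H(f)|^2 df divided by the peak |H(0)|^2
df = f[1] - f[0]
B_n = np.sum(H2) * df / H2[0]
print(B_n / f_3dB)                            # ~1.57 (= pi/2), i.e. Bn > B_3dB
```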
Analysis of Noise In Communication Systems
Thermal Noise (Johnson noise)
This thermal noise may be represented by an equivalent circuit as shown below
$\overline{V^2} = 4kTBR \quad (\text{volt}^2)$ (mean square value, power)

then $V_{RMS} = \sqrt{\overline{V^2}} = 2\sqrt{kTBR} = V_n$

i.e. $V_n$ is the RMS noise voltage.

A) System bandwidth = B Hz:
$N = \text{constant} \times B = KB$ watts
B) System bandwidth = 2B Hz:
$N = \text{constant} \times 2B = 2KB$ watts

For A, $\frac{S}{N} = \frac{S}{KB}$; for B, $\frac{S}{N} = \frac{S}{2KB}$
Analysis of Noise In Communication Systems (Cont’d)
Resistors in Series
Assume that R1 is at temperature T1 and R2 is at temperature T2; then

$\overline{V_n^2} = \overline{V_{n1}^2} + \overline{V_{n2}^2}$

$\overline{V_{n1}^2} = 4kT_1 B R_1$

$\overline{V_{n2}^2} = 4kT_2 B R_2$

$\overline{V_n^2} = 4kB\,(T_1 R_1 + T_2 R_2)$

If $T_1 = T_2 = T$,

$\overline{V_n^2} = 4kTB\,(R_1 + R_2)$

i.e. resistors in series at the same temperature behave as a single resistor $R_1 + R_2$.
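A small sketch of the series result, with assumed resistor values, checking that two resistors at the same temperature give the same noise as a single resistor of value R1 + R2.

```python
import math

k = 1.38e-23  # Boltzmann's constant (J/K)

def series_noise_voltage(R1, T1, R2, T2, B):
    """RMS noise of two series resistors: sqrt(4kB(T1*R1 + T2*R2))."""
    return math.sqrt(4 * k * B * (T1 * R1 + T2 * R2))

# Assumed values: both resistors at 290 K, 10 kHz bandwidth
B, T = 10e3, 290
v_series = series_noise_voltage(5e3, T, 15e3, T, B)
v_single = math.sqrt(4 * k * T * B * (5e3 + 15e3))   # single 20 kOhm resistor
print(v_series, v_single)                            # identical
```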
Analysis of Noise In Communication Systems (Cont’d)
Resistors in Parallel

$V_{o1} = V_{n1}\,\frac{R_2}{R_1 + R_2} \qquad V_{o2} = V_{n2}\,\frac{R_1}{R_1 + R_2}$

$\overline{V_n^2} = \overline{V_{o1}^2} + \overline{V_{o2}^2}$

$\overline{V_n^2} = \frac{4kB\left(T_1 R_1 R_2^2 + T_2 R_2 R_1^2\right)}{(R_1 + R_2)^2} = \frac{4kB\, R_1 R_2\,(T_1 R_2 + T_2 R_1)}{(R_1 + R_2)^2}$

If $T_1 = T_2 = T$,

$\overline{V_n^2} = 4kTB\,\frac{R_1 R_2}{R_1 + R_2}$

i.e. resistors in parallel at the same temperature behave as a single resistor $\frac{R_1 R_2}{R_1 + R_2}$.
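Similarly, a sketch of the parallel case (assumed values), checking that at a common temperature the combination behaves as a single resistor R1R2/(R1 + R2).

```python
import math

k = 1.38e-23  # Boltzmann's constant (J/K)

def parallel_noise_voltage(R1, T1, R2, T2, B):
    """RMS noise of two parallel resistors: sqrt(4kB*R1*R2*(T1*R2 + T2*R1)/(R1+R2)^2)."""
    return math.sqrt(4 * k * B * R1 * R2 * (T1 * R2 + T2 * R1) / (R1 + R2) ** 2)

B, T = 10e3, 290                     # assumed bandwidth and common temperature
R1, R2 = 5e3, 15e3
v_par = parallel_noise_voltage(R1, T, R2, T, B)
v_single = math.sqrt(4 * k * T * B * (R1 * R2 / (R1 + R2)))  # single 3.75 kOhm resistor
print(v_par, v_single)               # identical
```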
Matched Communication Systems
In communication systems we are usually concerned
with the noise (i.e. S/N) at the receiver end of the system.
The transmission path may take, for example, one of the forms shown below.
An equivalent circuit, when the line is connected to the receiver is shown below.
Matched Communication Systems (Cont’d)
Signal to Noise
The signal to noise ratio is given by

$\frac{S}{N} = \frac{\text{Signal Power}}{\text{Noise Power}}$

The signal to noise ratio in dB is expressed by

$\left(\frac{S}{N}\right)_{dB} = 10 \log_{10}\left(\frac{S}{N}\right)$

$\left(\frac{S}{N}\right)_{dB} = S_{dBm} - N_{dBm}$ for S and N measured in mW.
Noise Factor – Noise Figure
Consider the network shown below,
Noise Factor – Noise Figure (Cont’d)
• The amount of noise added by the network is embodied in the
Noise Factor F, which is defined by
$\text{Noise factor } F = \frac{(S/N)_{IN}}{(S/N)_{OUT}}$

• F equals 1 for a noiseless network and in general F > 1. The noise figure is the noise factor quoted in dB,
i.e. Noise Figure $F_{dB} = 10 \log_{10} F$, with $F_{dB} \geq 0$ dB
• The noise figure / factor is a measure of how much a network degrades $(S/N)_{IN}$; the lower the value of F, the better the network.
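A tiny sketch of the definition, using assumed input and output S/N ratios (linear, not dB).

```python
import math

def noise_factor(snr_in, snr_out):
    """Noise factor F = (S/N)_in / (S/N)_out, both given as linear ratios."""
    return snr_in / snr_out

F = noise_factor(snr_in=1000, snr_out=250)   # assumed example ratios
F_dB = 10 * math.log10(F)                    # noise figure in dB
print(F, F_dB)                               # 4.0, ~6.02 dB
```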
Noise Figure – Noise Factor for Active Elements
For active elements with power gain G>1, we have
$F = \frac{(S/N)_{IN}}{(S/N)_{OUT}} = \frac{S_{IN}\, N_{OUT}}{N_{IN}\, S_{OUT}}$  But  $S_{OUT} = G\, S_{IN}$

Therefore

$F = \frac{S_{IN}\, N_{OUT}}{N_{IN}\, G\, S_{IN}} = \frac{N_{OUT}}{G\, N_{IN}}$

Since in general F > 1, $N_{OUT}$ is increased by noise due to the active element, i.e. $N_a$ represents ‘added’ noise measured at the output. This added noise may be referred to the input as extra noise, i.e. an equivalent diagram is
Noise Figure – Noise Factor for Active Elements (Cont’d)
Ne is extra noise due to active elements referred to the input; the element is thus
effectively noiseless.
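This input-referred picture can be sketched numerically: for an assumed noise factor F and gain G, the extra input-referred noise is N_e = (F − 1)·N_IN and the output noise is G·(N_IN + N_e) = F·G·N_IN. The values below are illustrative only.

```python
k = 1.38e-23                    # Boltzmann's constant (J/K)
B, T = 1e6, 290                 # assumed bandwidth and source temperature
N_in = k * T * B                # input noise power (W), kTB

F, G = 4.0, 100.0               # assumed noise factor and power gain
N_e = (F - 1) * N_in            # extra noise referred to the input
N_out = G * (N_in + N_e)        # total output noise
print(N_out, F * G * N_in)      # identical, as expected from F = N_out / (G * N_in)
```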
Noise Temperature
Noise Figure – Noise Factor for Passive Elements
System Noise Figure
Assume that a system comprises the elements shown below,
Assume that these are now cascaded and connected to an aerial at the input, with $N_{IN} = N_{ae}$ from the aerial.

Now, $N_{OUT} = G_3 (N_{IN3} + N_{e3}) = G_3 (N_{IN3} + (F_3 - 1) N_{IN})$

Since $N_{IN3} = G_2 (N_{IN2} + N_{e2}) = G_2 (N_{IN2} + (F_2 - 1) N_{IN})$

and similarly $N_{IN2} = G_1 (N_{ae} + (F_1 - 1) N_{IN})$
System Noise Figure (Cont’d)
$N_{OUT} = G_3 G_2 G_1 N_{ae} + G_3 G_2 G_1 (F_1 - 1) N_{IN} + G_3 G_2 (F_2 - 1) N_{IN} + G_3 (F_3 - 1) N_{IN}$

The overall system noise factor is

$F_{sys} = \frac{N_{OUT}}{G\, N_{IN}} = \frac{N_{OUT}}{G_1 G_2 G_3\, N_{ae}}$

$= 1 + (F_1 - 1)\frac{N_{IN}}{N_{ae}} + \frac{(F_2 - 1)}{G_1}\frac{N_{IN}}{N_{ae}} + \frac{(F_3 - 1)}{G_1 G_2}\frac{N_{IN}}{N_{ae}}$

With $N_{IN} = N_{ae}$, this generalises for n stages to

$F_{sys} = F_1 + \frac{F_2 - 1}{G_1} + \frac{F_3 - 1}{G_1 G_2} + \frac{F_4 - 1}{G_1 G_2 G_3} + \dots + \frac{F_n - 1}{G_1 G_2 \dots G_{n-1}}$

This equation is called the Friis formula.
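A sketch of the Friis formula for an assumed three-stage cascade (the stage gains and noise factors are illustrative values, not from the notes); it shows that the first stage dominates the overall noise figure.

```python
import math

def friis(noise_factors, gains):
    """Overall noise factor of a cascade: F1 + (F2-1)/G1 + (F3-1)/(G1*G2) + ..."""
    F_sys, g_product = noise_factors[0], 1.0
    for F, G_prev in zip(noise_factors[1:], gains):
        g_product *= G_prev
        F_sys += (F - 1) / g_product
    return F_sys

# Assumed example: low-noise amplifier, then a lossy mixer, then an IF amplifier
F = [1.26, 6.3, 3.2]      # linear noise factors (about 1 dB, 8 dB, 5 dB)
G = [100.0, 0.25]         # linear gains of stages 1 and 2 (20 dB, -6 dB)
F_sys = friis(F, G)
print(F_sys, 10 * math.log10(F_sys))   # ~1.40, i.e. ~1.5 dB overall
```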
System Noise Temperature
Additive White Gaussian Noise
Additive
Noise is usually additive in that it adds to the information bearing signal. A model of the
received signal with additive noise is shown below
White
White noise: power spectral density $p_0(f)$ = constant.
Gaussian
We generally assume that noise voltage amplitudes have a Gaussian or Normal distribution.
What is information?
• Can we measure information?
• Consider the following two sentences:
1. There is a traffic jam on I5
2. There is a traffic jam on I5 near Exit 234
Sentence 2 seems to have more information than sentence 1. From the semantic viewpoint, sentence 2 provides more useful information.
What is information?
• It is hard to measure the “semantic” information!
• Consider the following two sentences
1. There is a traffic jam on I5 near Exit 160
2. There is a traffic jam on I5 near Exit 234
It’s not clear whether sentence 1 or 2 would have more information!
What is information?
• Let’s attempt a different definition of information.
– How about counting the number of letters in
the two sentences:
1. There is a traffic jam on I5 (22 letters)
2. There is a traffic jam on I5 near Exit 234 (33 letters)
Definitely something we can measure and compare!
What is information?
• First attempt to quantify information by Hartley (1928).
– Every symbol of the message has a choice of s possibilities.
– A message of length l can therefore have $s^l$ distinguishable possibilities.
– The information measure is then the logarithm of $s^l$:

$I = \log(s^l) = l \log(s)$

Intuitively, this definition makes sense: if one symbol (letter) has information $\log(s)$, then a sentence of length l should have l times more information, i.e. $l \log(s)$.
(It is interesting to note that log is the only function f that satisfies $f(s^l) = l\, f(s)$.)
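A small sketch of Hartley's measure in bits (log base 2), assuming an alphabet of 26 letters and a message of length 33, as in the second sentence above.

```python
import math

def hartley_information(s, l, base=2):
    """Hartley's measure: I = log(s^l) = l * log(s)."""
    return l * math.log(s, base)

# Assumed example: alphabet of 26 letters, message length 33
print(hartley_information(26, 33))       # ~155.1 bits
print(33 * math.log2(26))                # same value, computed directly
```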
How about measuring information as the number of Yes/No questions one has to ask to find the correct answer in the simple games below?

[2 × 2 grid, cells 1–4: 2 questions are needed.]
[4 × 4 grid, cells 1–16: 4 questions are needed.]

The randomness is due to uncertainty about where the circle is!
Shannon’s Information Theory
Claude Shannon, “A Mathematical Theory of Communication,” The Bell System Technical Journal, 1948.

Shannon’s measure of information is the number of bits needed to represent the amount of uncertainty (randomness) in a data source, and is defined as the entropy
$H = -\sum_{i=1}^{n} p_i \log(p_i)$

where there are n symbols 1, 2, …, n, each with probability of occurrence $p_i$.
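A minimal sketch of the entropy formula (log base 2, so the result is in bits); the probability distributions are assumed examples.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)) in bits; terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))          # 1.0 bit  (fair coin)
print(entropy([0.9, 0.1]))          # ~0.469 bits (biased coin)
print(entropy([0.25] * 4))          # 2.0 bits (four equiprobable symbols)
```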
Shannon’s Entropy
• Consider the following string consisting of symbols a and b:
abaabaababbbaabbabab…
– On average, there are equal numbers of a and b.
– The string can be considered as the output of the source below, with equal probability of outputting symbol a or b:

[Diagram: a source emitting symbol a with probability 0.5 and symbol b with probability 0.5.]

We want to characterize the average information generated by the source!
Intuition on Shannon’s Entropy
Why $H = -\sum_{i=1}^{n} p_i \log(p_i)$?

Suppose you have a long random string of two binary symbols 0 and 1, and the probabilities of symbols 1 and 0 are $p_1$ and $p_0$.

Ex: 00100100101101001100001000100110001 …

If the string is long enough, say of length N, it is likely to contain $Np_0$ 0’s and $Np_1$ 1’s.

The probability of this particular string pattern occurring is

$p = p_0^{Np_0}\, p_1^{Np_1}$

Hence, the number of possible patterns is $1/p = p_0^{-Np_0}\, p_1^{-Np_1}$

The number of bits needed to represent all possible patterns is

$\log\!\left(p_0^{-Np_0}\, p_1^{-Np_1}\right) = -N \sum_{i=0}^{1} p_i \log p_i$

The average number of bits to represent each symbol is therefore

$-\sum_{i=0}^{1} p_i \log p_i$
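This counting argument can be checked numerically: for a long binary string with an assumed P(1) = 0.1, the bits per symbol needed to index the typical patterns, (1/N)·log2(number of such strings), approaches the entropy. A sketch under these assumptions:

```python
import math
from math import comb

p1 = 0.1                                # assumed probability of symbol 1
N = 10_000                              # assumed string length
n1 = round(N * p1)                      # expected number of 1's in a typical string

# Number of strings with exactly n1 ones, and the bits needed to index them
patterns = comb(N, n1)
bits_per_symbol = math.log2(patterns) / N

H = -(p1 * math.log2(p1) + (1 - p1) * math.log2(1 - p1))
print(bits_per_symbol, H)               # both ~0.47 bits/symbol
```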
More Intuition on Entropy
• Assume a binary memoryless source, e.g., a flip of a coin. How much
information do we receive when we are told that the outcome is heads?
– If it’s a fair coin, i.e., P(heads) = P (tails) = 0.5, we say that the amount
of information is 1 bit.
– If we already know that it will be (or was) heads, i.e., P(heads) = 1, the
amount of information is zero!
– If the coin is not fair, e.g., P(heads) = 0.9, the amount of information is
more than zero but less than one bit!
– Intuitively, the amount of information received is the same if P(heads) =
0.9 or P (heads) = 0.1.
Self Information
• So, let’s look at it the way Shannon did.
• Assume a memoryless source with
– alphabet A = (a1, …, an)
– symbol probabilities (p1, …, pn).
• How much information do we get when finding out that the next
symbol is ai?
• According to Shannon, the self-information of $a_i$ is $I(a_i) = -\log(p_i)$.
Why?
Assume two independent events A and B, with probabilities P(A) = pA and P(B) = pB.
For both events to happen, the probability is pA · pB. However, the amounts of information should be added, not multiplied.
Logarithms satisfy this!
Also, we want the information to increase with decreasing probability, so let’s use the negative logarithm.
Self Information
Example 1:
Example 2:
Which logarithm? Pick the one you like! If you pick the natural log, you’ll measure in nats; if you pick the 10-log, you’ll get Hartleys; if you pick the 2-log (like everyone else), you’ll get bits.
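A short sketch of these units, using an assumed symbol probability of 0.25.

```python
import math

p = 0.25                                # assumed symbol probability
bits = -math.log2(p)                    # 2.0 bits
nats = -math.log(p)                     # ~1.386 nats
hartleys = -math.log10(p)               # ~0.602 Hartleys
print(bits, nats, hartleys)
print(nats / math.log(2))               # converting nats back to bits: 2.0
```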
Self Information
On average over all the symbols, we get:

$H(X) = -\sum_{i} p_i \log(p_i)$

H(X) is called the first-order entropy of the source.
This can be regarded as the degree of uncertainty
about the following symbol.
Entropy
Example: Binary Memoryless Source
BMS output: 01101000…

Let P(1) = p and P(0) = 1 − p.

Then $H(p) = -p \log_2(p) - (1 - p) \log_2(1 - p)$

[Plot: H(p) versus p, rising from 0 at p = 0 to a maximum of 1 bit at p = 0.5 and falling back to 0 at p = 1.]

The uncertainty (information) is greatest when p = 0.5.
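A sketch of the binary entropy function, confirming the maximum of 1 bit at p = 0.5.

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.25, 0.5, 0.9, 1.0):
    print(p, round(binary_entropy(p), 3))
# the maximum value, 1.0 bit, occurs at p = 0.5
```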
Example
Three symbols a, b, c with corresponding probabilities:
P = {0.5, 0.25, 0.25}
What is H(P)?
Three weather conditions in Corvallis: Rain, sunny, cloudy with
corresponding probabilities:
Q = {0.48, 0.32, 0.20}
What is H(Q)?
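A sketch answering both questions with an entropy helper (repeated here so the snippet stands alone; log base 2, results in bits).

```python
import math

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

P = [0.5, 0.25, 0.25]
Q = [0.48, 0.32, 0.20]
print(entropy(P))   # 1.5 bits
print(entropy(Q))   # ~1.499 bits, slightly below log2(3) ~ 1.585
```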
Entropy: Three properties
1. It can be shown that 0 ≤ H ≤ log N.
2. Maximum entropy (H = log N) is reached when all
symbols are equiprobable, i.e.,
pi = 1/N.
3. The difference log N – H is called the redundancy of
the source.