Module 3 – Codes on graphs
Part 1
Recap
In the previous classes we saw:
● How to compress a source of information
● The limit on the speed of reliable communication through a noisy channel
● And, a peep into linear block codes – a mechanism to correct channel errors
In this module
We will see,
● A new class of codes called convolutional codes
● Their decoding algorithm
● A class of codes, called Turbo codes, that have excellent error-correction properties
Get packed for the tour ...
Introduction
Problem with large block lengths
● Incurs a lot of decoding delay. For example, if the block length is 1024 bits and the data rate is 10 kbps, the delay is about 0.1 seconds. This cannot be tolerated in real-time applications.
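The delay figure above is just the time to buffer one full block before decoding can begin, and can be checked with one line of arithmetic:

```python
block_length_bits = 1024
data_rate_bps = 10_000            # 10 kbps
delay_s = block_length_bits / data_rate_bps
print(delay_s)                    # 0.1024 -- about 0.1 s of buffering
```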
Alternative
● Encode small blocks of data of length k0, called frames, into blocks of length n0.
● The difference is that the encoded bits depend on both the present frame and past frames.
● Thus the encoder has memory.
● Such codes are known as tree codes (we will see why).
Recap – Block encoder
Encoder in a figure
What do you make out from this?
Convolutional codes
● We assume that we have an infinitely long stream of incoming bits. (Thanks to the huge volume of data these days, it's not a bad assumption.)
● This stream of bits is broken up into segments of k0 bits, called information frames.
● The encoder consists of two parts:
– Memory, which is basically a shift register
– A logic circuit
● The memory of the encoder can store m information frames.
● Each time a new frame arrives, it is shifted into the shift register and the oldest frame is discarded.
● At the end of any frame time the encoder has the m most recent information frames in its memory, which corresponds to mk0 information bits.
● When a new frame arrives, the encoder computes the codeword frame using the new frame and the m previously stored frames.
● The computation of the codeword frame is done by the logic circuit.
● This codeword frame is then shifted out.
● The oldest information frame in the memory is then discarded and the most recent information frame is shifted into the memory.
● The encoder is now ready for the next incoming frame.
● For each information frame of k0 bits the encoder generates a codeword frame of n0 bits.
● Note that the same information frame does not always generate the same codeword frame, since the codeword frame depends on the memory of the encoder as well.
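The shift-register operation described above can be sketched in a few lines of Python. The specific generator taps used here (g1(D) = 1 + D + D^2, g2(D) = 1 + D^2, so k0 = 1, n0 = 2, m = 2) are the ones from the example analyzed later in this module; they are one concrete choice, not part of the general definition.

```python
def conv_encode(bits, taps=((1, 1, 1), (1, 0, 1))):
    """Rate-1/2 convolutional encoder: k0 = 1 input bit in, n0 = 2 output bits out.

    Each tuple in `taps` lists a generator's connections to
    (i_n, i_{n-1}, i_{n-2}); e.g. (1, 1, 1) means 1 + D + D^2.
    """
    m = len(taps[0]) - 1          # memory: number of past frames stored
    state = [0] * m               # shift register, most recent bit first
    out = []
    for b in bits:
        window = [b] + state      # current bit plus the m stored bits
        for g in taps:            # logic circuit: one mod-2 sum per output line
            out.append(sum(t * w for t, w in zip(g, window)) % 2)
        state = [b] + state[:-1]  # shift in the new bit, discard the oldest
    return out

print(conv_encode([1, 0, 1]))     # -> [1, 1, 1, 0, 0, 0]
```

Note how the third input bit produces output 00 even though the first input bit (also a 1) produced 11: the output depends on the register contents, not just the current frame.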
Tree code
● The constraint length of the encoder is defined as the number of bits it can store in its memory.
Example
Example contd
Example contd
Analyzing the encoder
● (Do it on board. Refer to the next slide.)
Example contd : Analysis
Example contd
Example contd
Incoming bit | Present state of the encoder | Outgoing bits | Next state of the encoder
0 | 00 | 00 | 00
1 | 00 | 11 | 10
0 | 01 | 11 | 00
1 | 01 | 00 | 10
0 | 10 | 01 | 01
1 | 10 | 10 | 11
0 | 11 | 10 | 01
1 | 11 | 01 | 11
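The transition table above can be regenerated by enumerating every (state, input bit) pair. One detail is assumed here: which generator feeds the first outgoing bit. The tap order below (1 + D^2 first, then 1 + D + D^2) is the one that reproduces the table's outgoing-bits column.

```python
from itertools import product

def step(bit, state, taps=((1, 0, 1), (1, 1, 1))):
    """One transition of the rate-1/2 encoder.

    `state` is (i_{n-1}, i_{n-2}). The tap order is an assumption
    about wiring, chosen to match the table's output convention.
    """
    window = (bit,) + state
    out = tuple(sum(g * w for g, w in zip(gen, window)) % 2 for gen in taps)
    next_state = (bit, state[0])          # shift the new bit in, drop the oldest
    return out, next_state

# Enumerate all 4 states x 2 inputs = 8 rows of the table.
for state, bit in product([(0, 0), (0, 1), (1, 0), (1, 1)], (0, 1)):
    out, nxt = step(bit, state)
    print(bit, state, out, nxt)
```

Each printed row matches one row of the table, which is exactly the information the state diagram on the next slide encodes graphically.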
State diagram
Example contd
Encode the bit stream 1 0 0 1 1 0 1 using the trellis
Encoding 10100011
● It means the bit i_0 comes at time t = 0, i_1 comes with unit delay, i_2 comes with a delay of 2 units, etc.
Polynomial description of the encoder
● a_n = i_n + i_{n-1} + i_{n-2}
● Let I(D) = i_0 + i_1 D + i_2 D^2 + i_3 D^3 + i_4 D^4 + ...
● Then the nth term of C(D) is (i_n + i_{n-1} + i_{n-2}) D^n, so C(D) = I(D) (1 + D + D^2). (Work out on board and convince yourself of this.)
● Therefore, g11(D) = 1 + D + D^2
● b_n = i_n + i_{n-2} (do it by yourself)
● Then, g12(D) = 1 + D^2
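The claim C(D) = I(D)(1 + D + D^2) is ordinary polynomial multiplication with coefficients reduced mod 2, so it can be checked numerically. A minimal sketch, representing polynomials as coefficient lists (lowest degree first):

```python
def poly_mul_gf2(p, q):
    """Multiply two binary polynomials given as coefficient lists
    (lowest degree first), reducing each coefficient mod 2."""
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] ^= a & b        # addition in GF(2) is XOR
    return r

# I(D) = 1 + D^2 (information bits 1 0 1), g11(D) = 1 + D + D^2:
print(poly_mul_gf2([1, 0, 1], [1, 1, 1]))   # [1, 1, 0, 1, 1], i.e. 1 + D + D^3 + D^4
```

The first three coefficients 1, 1, 0 agree with the a-outputs the shift-register encoder produces for the inputs 1, 0, 1; the tail terms D^3 and D^4 are the extra bits produced if the register is then flushed with zeros.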
Polynomial description (do it on board)
Another example
● What is the rate and constraint length of this encoder?
Encoding using polynomials
● C(D) = I(D) * G(D), where C(D) is 1 × n0, I(D) is 1 × k0, and G(D) is k0 × n0.
● Previous example: k0 = 1, n0 = 2
● I(D) = i_0 + i_1 D + i_2 D^2 + i_3 D^3 + i_4 D^4 + ...
● G(D) = [ 1   1 + D^4 ]
● C(D) = I(D) * G(D) = [ C1(D)   C2(D) ], where
– C1(D) = i_0 + i_1 D + i_2 D^2 + i_3 D^3 + ...
– C2(D) = <write down by yourself>
Generating function
● Consider the following example.
Matrix description of convolutional codes