1.Coding Theory
Transcript of 1.Coding Theory
CODING THEORY
A Bird’s Eye View
Text Books
Shu Lin and Daniel J. Costello Jr., "Error Control Coding: Fundamentals and Applications", Prentice Hall Inc.
R. E. Blahut, "Theory and Practice of Error Control Coding", McGraw-Hill.
References:
Rolf Johannesson and Kamil Sh. Zigangirov, "Fundamentals of Convolutional Coding", Universities Press (India) Ltd., 2001.
Proakis, "Digital Communications", McGraw-Hill.
Introduction
Role of Channel Coding in Digital
Communication System
Block Codes and Convolutional Codes
Channel Models
Decoding Rules
Error Correction Schemes
References
Shannon’s Theorem (1948)
Noisy Coding Theorem due to Shannon:
Roughly: consider a channel with capacity C. If we are willing to settle for a rate of transmission that is strictly below C, then there is an encoding scheme for the source data that will reduce the probability of a decision error to any desired level.
Problem: the proof is not constructive! To this day, no one has found a way to construct the coding schemes promised by Shannon's theorem.
Shannon’s Theorem (1948)-contd
Additional concerns:
Is the coding scheme easy to implement, both in
encoding and decoding?
May require extremely long codes.
The Shannon-Hartley Theorem
Gives us the theoretical maximum bit-rate that can be transmitted with an arbitrarily small bit-error rate (BER), for a given average signal power, over a channel of bandwidth B Hz affected by AWGN.
For any given BER, however small, we can find a coding technique that achieves it; the smaller the given BER, the more complicated the coding technique will be.
Shannon-Hartley Theorem-contd.
Let the channel bandwidth be B Hz and signal
to noise ratio be S/N (not in dB).
C = B · log2(1 + S/N)  bits/sec
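As a quick numerical check, the Shannon-Hartley capacity C = B · log2(1 + S/N) can be evaluated directly. The 3 kHz bandwidth and 30 dB SNR below are illustrative values, not figures from the slides:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N) in bits/sec."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative telephone-line-like channel: B = 3000 Hz, SNR = 30 dB.
snr_linear = 10 ** (30 / 10)           # 30 dB -> 1000 (linear)
C = shannon_capacity(3000, snr_linear)
print(round(C))                        # roughly 29902 bits/sec
```

Note that S/N must be converted from dB to a linear ratio before applying the formula, exactly as the slide's "(not in dB)" caveat warns.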
Shannon-Hartley Theorem-contd.
For a given bandwidth B and a given S/N,
we can find a way of transmitting data at a
bit-rate R bits/second, with a bit-error rate
(BER) as low as we like, as long as R ≤ C.
Now assume we wish to transmit at an
average energy/bit of Eb and the AWGN
noise has two sided power spectral density
N0 /2 Watts per Hz. It follows that the signal
power S = EbR and the noise power N = N0B
Watts.
Shannon-Hartley Theorem-contd.
The R/B ratio is called the bandwidth efficiency, in bits/sec/Hz: how many bits per second we get for each Hz of bandwidth. We want this to be as high as possible. Eb/N0 is the normalised average energy per bit, where the normalisation is with respect to the one-sided PSD of the noise.
The law gives the following bound:
Shannon-Hartley Theorem-contd.
Setting R = C, with S = Eb·R and N = N0·B:

R/B = log2(1 + Eb·R / (N0·B))

(Eb/N0)min = (2^(R/B) − 1) / (R/B)
Shannon Limit
The bound gives the minimum possible normalised energy per bit satisfying the Shannon-Hartley law.
If we draw a graph of (Eb/N0)min against R/B, we observe that (Eb/N0)min never goes below about 0.69, which is about −1.6 dB.
Therefore, if our normalised energy per bit is less than −1.6 dB, we can never satisfy the Shannon-Hartley law, however inefficient (in terms of bits/sec/Hz) we are prepared to be.
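The −1.6 dB figure can be recovered numerically from the bound (Eb/N0)min = (2^(R/B) − 1)/(R/B). This short sketch (illustrative values) evaluates the bound as the bandwidth efficiency R/B shrinks toward zero:

```python
import math

def ebn0_min(eta):
    """Minimum Eb/N0 (linear) for bandwidth efficiency eta = R/B."""
    return (2 ** eta - 1) / eta

# As eta -> 0 the bound approaches ln 2 ~ 0.693, i.e. about -1.59 dB.
for eta in (2.0, 1.0, 0.1, 0.001):
    lin = ebn0_min(eta)
    print(f"R/B = {eta:6}: Eb/N0 >= {lin:.4f} ({10 * math.log10(lin):+.2f} dB)")
```

At R/B = 1 the bound is exactly 1 (0 dB); only in the limit of vanishing spectral efficiency does it bottom out at ln 2, the Shannon limit.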
Shannon Limit-contd.
There exists a limiting value of (Eb/N0 ) below
which there cannot be error free communication
at any transmission rate.
The curve R = C will divide the achievable and
non-achievable regions.
Modulation-Coding trade-off
For Pb = 10⁻⁵, BPSK modulation requires Eb/N0 = 9.6 dB (optimum uncoded binary modulation).
For this case, Shannon's work promised a performance improvement of 11.2 dB over uncoded binary modulation, through the use of coding techniques.
Today, "Turbo Codes" are capable of achieving an improvement close to this.
"Turbo Codes" are near-Shannon-limit error-correcting codes.
Coding Theory-Introduction
Main problem:
A stream of source data, in the form of 0’s and 1’s, is being transmitted over a communication channel, such as a telephone line. Occasionally, disruptions can occur in the channel, causing 0’s to turn into 1’s and vice versa.
Question: How can we tell when the original data has been changed, and when it has, how can we recover the original data?
Coding Theory-Introduction
Easy things to try:
Do nothing. If a channel error occurs with probability p, then the probability of making a decision error is p.
Send each bit 3 times in succession. The bit that occurs the majority of the time, gets picked. (E.g. 010 => 0)
Repetition codes!!
Coding Theory-Introduction
Generalize the above: send each bit n times and choose the majority bit. In this way we can make the probability of a decision error arbitrarily small, but this is inefficient in terms of transmission rate.
As n increases the achievable BER reduces, at the expense of increased codeword length (reduced code rate).
Repetition coding is inefficient…
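The repeat-n-times scheme above is only a few lines of code. This sketch (illustrative) encodes, majority-decodes, and computes the decision-error probability for n = 3:

```python
from collections import Counter

def encode(bits, n=3):
    """Repeat every bit n times: [0, 1] -> [0, 0, 0, 1, 1, 1] for n = 3."""
    return [b for bit in bits for b in [bit] * n]

def decode(received, n=3):
    """Majority vote over each block of n copies (e.g. 010 -> 0)."""
    return [Counter(received[i:i + n]).most_common(1)[0][0]
            for i in range(0, len(received), n)]

# For n = 3 and channel bit-flip probability p, majority decoding fails
# only when 2 or 3 of the copies flip: P_e = 3 p^2 (1 - p) + p^3.
p = 0.01
print(decode([0, 1, 0]))             # [0]
print(3 * p**2 * (1 - p) + p**3)     # ~2.98e-4, far below p itself
```

The error probability drops from p to roughly 3p², but the code rate falls to 1/3, illustrating the rate/reliability trade-off the slide describes.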
Coding Theory Introduction (cont’d)
Encode source information, by adding additional
information (redundancy), that can be used to detect,
and perhaps correct, errors in transmission. The more
redundancy we add, the more reliably we can detect and
correct errors, but the less efficient we become at
transmitting the source data.
Error control applications
Data communication networks (Ethernet, FDDI, WAN, Bluetooth)
Satellite and Deep space communications
Cellular mobile communications
Modems
Computer buses
Magnetic disks and tapes
CDs, DVDs. Digital sound needs ECC!
Error control categories
The error control problem can be classified in several ways:
Types of error control coding: detection vs. correction
Types of errors: how much clustering (random, burst, etc.)
Types of codes: block vs. convolutional
Error Control Strategies
Error detection.
Goal: avoid accepting faulty data.
Lost data may be unfortunate; wrong data may be disastrous.
(Forward) error correction (FEC or ECC).
Use redundancy in encoded message to estimate from the received data what message was actually sent.
The best estimate is usually the "closest" message. The optimal estimate is the message that is most probable given what is received.
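As a minimal concrete instance of detection-only coding, a single even-parity bit (an illustrative sketch, not a scheme named in the slides) catches any odd number of bit flips but can neither locate nor correct them:

```python
def add_parity(bits):
    """Append one bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def passes_check(word):
    """A received word is accepted only if its weight is still even."""
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])    # -> [1, 0, 1, 1, 1]
print(passes_check(word))          # True
word[2] ^= 1                       # a single channel error...
print(passes_check(word))          # False: detected, but not correctable
```

This is the "avoid accepting faulty data" strategy in its simplest form; correcting the error rather than just flagging it requires the extra redundancy of an FEC code.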
Types of Channel Codes
Block Codes (codes with a strong algebraic flavor)
~1950: Hamming code (single error correction)
All codes in the 50's were too weak compared to the codes promised by Shannon
Major breakthrough around 1960:
BCH codes…
Reed-Solomon codes…
Capable of correcting multiple errors
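The single-error correction of the ~1950 Hamming code can be sketched for the (7,4) case. This toy decoder (an illustrative sketch) uses the classic trick of ordering the parity-check columns as the binary numbers 1..7, so the syndrome directly spells out the position of the flipped bit:

```python
# Parity-check matrix H of the (7,4) Hamming code: column i holds the
# 3-bit binary representation of i + 1, most significant bit first.
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def correct_single_error(received):
    """Fix at most one flipped bit in a 7-bit received word."""
    s = [sum(h * b for h, b in zip(row, received)) % 2 for row in H]
    pos = s[0] * 4 + s[1] * 2 + s[2]   # syndrome read as a number 0..7
    fixed = list(received)
    if pos:                            # 0 means "no error detected"
        fixed[pos - 1] ^= 1
    return fixed

codeword = [1, 1, 1, 0, 0, 0, 0]       # satisfies H c = 0 (mod 2)
noisy = list(codeword); noisy[4] ^= 1  # channel flips bit 5
print(correct_single_error(noisy) == codeword)  # True
```

Two or more flips defeat the decoder, which is exactly why the multiple-error-correcting BCH and Reed-Solomon codes of 1960 were such a breakthrough.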
Convolutional Codes
Codes with a probabilistic flavor.
Introduced in the late 1950's, but gained popularity after the introduction of the Viterbi algorithm in 1967.
Developed from the idea of sequential decoding.
Non-block codes: codewords are generated by a convolution operation on the information sequence.
Coding Schemes: Trend…
Since the 1970's, the two avenues of research have worked together.
This resulted in development towards the codes promised by Shannon.
Today, "Turbo Codes" are capable of achieving performance close to the Shannon limit.
Coding Schemes
Applications demand a wide range of data rates, block sizes, and error rates.
No single error-protection scheme works for all applications; some require the use of multiple coding techniques.
A common combination uses an inner convolutional code and an outer Reed-Solomon code.