Module-2


2.1) NPTEL Video Links: Module-2, Lectures 8, 9 and 26 to 31

Sl. No.  Module No.  Lecture No.  Topics covered  Video Link

1 Mod 02 Lec-08 Information theory (part-1) http://nptel.ac.in/courses/117101051/8

2 Mod 02 Lec-09 Information theory (part-2) http://nptel.ac.in/courses/117101051/9

3 Mod 02 Lec-26 Source coding (part-1) http://nptel.ac.in/courses/117101051/26

4 Mod 02 Lec-27 Source coding (part-2) http://nptel.ac.in/courses/117101051/27

5 Mod 02 Lec-28 Source coding (part-3) http://nptel.ac.in/courses/117101051/28

6 Mod 02 Lec-29 Source coding (part-4) http://nptel.ac.in/courses/117101051/29

7 Mod 02 Lec-30 Channel coding http://nptel.ac.in/courses/117101051/30

8 Mod 02 Lec-31 Fundamentals of OFDM http://nptel.ac.in/courses/117101051/31

NPTEL Web Link: http://nptel.ac.in/courses/Webcourse-contents/IIT%20Kharagpur/Digi%20Comm/New_index1.html


2.2) Questions:

Questions from Video Lectures of NPTEL

Sl. No.  Question  Video No.  Time (minutes)

1 What is the need of processing source in communication? Explain with a block

diagram

08 4

2 What are the basic operations performed at the transmitter? 08 6

3 How does source coding differ from channel coding in communication? 08 7

4 What are the functions performed by source code? Explain with an example. 08 8

5 In a horse race there are eight horses the probabilities of winning of these

horses are as follows:

Horse:                   H1    H2    H3    H4    H5    H6    H7    H8
Probability of winning:  1/2   1/4   1/8   1/16  1/64  1/64  1/64  1/64

i) Find the information conveyed by each horse.

ii) How to assign code length for each horse information?

iii) Find the average information conveyed.

iv) Find the average length assigned practically.

v) What is the entropy when equal length code is assigned?

08 9
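A minimal Python sketch (an illustration, not part of the NPTEL material) for the horse-race question above: it computes the self-information -log2 p of each outcome, the entropy, the ideal code lengths, and the fixed-length alternative.

```python
import math

# Winning probabilities of the eight horses (from the question above).
p = [1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64]

# i) Self-information of each outcome, I(x) = -log2 p(x), in bits.
info = [-math.log2(pi) for pi in p]
print("Self-information (bits):", info)        # 1, 2, 3, 4, 6, 6, 6, 6

# ii) Every probability is a power of 1/2, so the ideal code length
#     for each horse is exactly -log2 p(x) bits.
lengths = [int(i) for i in info]

# iii) Entropy = average information conveyed.
H = sum(pi * ii for pi, ii in zip(p, info))
print("Entropy H =", H, "bits/symbol")          # 2.0 bits

# iv) Average length of this variable-length assignment equals H here.
L_avg = sum(pi * li for pi, li in zip(p, lengths))
print("Average code length =", L_avg)           # 2.0 bits

# v) With equal-length codes we need ceil(log2 8) = 3 bits per horse.
print("Fixed-length code needs", math.ceil(math.log2(len(p))), "bits")
```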

6 How to measure the information of any event based on the probability of the

event?

08 23

7 Define entropy of the system. 08 25

8 Which are the units used to measure entropy? 08 27

9 Why is information expressed in log form? 08 27

10 In a binary source there are two symbols with probabilities p and (1-p). Plot the entropy as a function of p and state when the entropy is maximum.

08 28
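The binary entropy function behind question 10 can be tabulated with a few lines of Python (an illustrative sketch); the values confirm that H(p) peaks at 1 bit when p = 0.5.

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Tabulate H(p) over [0, 1]; the peak value of 1 bit occurs at p = 0.5.
for i in range(11):
    p = i / 10
    print(f"p = {p:.1f}  H(p) = {binary_entropy(p):.4f} bits")
```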

11 A source has four symbols with probabilities as follows:

Symbols:        A    B    C    D
Probabilities:  1/2  1/4  1/8  1/8

Find the entropy of the source and discuss how to assign code for these symbols?

08 33

12 Explain the properties of entropy. 08 38

13 What are minimum and maximum values of average codeword length Lav? 08 40

14 Explain Shannon source coding theorem. 08 42

15 Define joint entropy of the two random variables. 08 47

16 A source has alphabet X={0,1} with probabilities p and 1-p; similarly, another source has alphabet Y={0,1} with probabilities q and 1-q. Find the joint entropy if the random variables are independent.

08 52

17 Define conditional entropy 09 6

18 State and prove chain rule as applied to entropies. 09 10

19 The joint probability matrix is given by

P(X,Y) =
[ 1/8    1/16   1/32   1/32 ]
[ 1/16   1/8    1/32   1/32 ]
[ 1/16   1/16   1/16   1/16 ]
[ 1/4    0      0      0    ]

Find the probabilities of the input and output symbols. Also find H(X), H(Y), H(X,Y), H(X/Y) and H(Y/X)

09 17
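An illustrative numpy sketch for question 19, assuming the joint probability matrix reconstructed above (any joint matrix with rows indexing X and columns indexing Y works the same way): it derives the marginals and all five entropies.

```python
import numpy as np

# Joint probability matrix P(X,Y); rows index X, columns index Y.
P = np.array([[1/8,  1/16, 1/32, 1/32],
              [1/16, 1/8,  1/32, 1/32],
              [1/16, 1/16, 1/16, 1/16],
              [1/4,  0,    0,    0  ]])

def H(p):
    """Entropy in bits of a probability vector, ignoring zero entries."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

p_x = P.sum(axis=1)          # input-symbol probabilities
p_y = P.sum(axis=0)          # output-symbol probabilities

H_X  = H(p_x)
H_Y  = H(p_y)
H_XY = H(P.flatten())
H_X_given_Y = H_XY - H_Y     # H(X/Y) = H(X,Y) - H(Y)
H_Y_given_X = H_XY - H_X     # H(Y/X) = H(X,Y) - H(X)

print(H_X, H_Y, H_XY, H_X_given_Y, H_Y_given_X)
```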

20 Write the Venn diagram of channel entropies 09 28

21 Define mutual information and what is its significance? 09 30

22 Prove that when X and Y are independent events then I(X;Y)=0 09 34

23 Prove the following identities

I(X;Y)=H(X)-H(X/Y)

09 36

24 State and prove channel coding theorem. 09 42

25 Define channel capacity for a discrete memoryless channel. 09 48

26 With suitable graph explain upper and lower bound on the probability of

error.

09 51

27 With block diagram explain the working of a communication system 26 2

28 What is the function of a source code? Why do we need it in a communication system?

26 3

29 What is the difference between lossless coding and lossy coding? 26 9

30 What is a discrete memoryless source? Explain with an example. 26 10

31 A source has four symbols with TWO different code sets as follows:

SYMBOLS   CODE-A   CODE-B
X0        00       0
X1        01       10
X2        10       110
X3        11       111

Find the average length of the code and the entropy of the source, for Code-A and Code-B.

26 12

32 What are the desired properties of a source code? What is the condition for a

code to be uniquely decodable?

26 15, 16

33 What is prefix of a code? What is its significance? 26 18

34 With suitable example explain the procedure of Huffman coding 26 23

35 A source has five symbols with probabilities as follows:

Symbols:        S1    S2    S3    S4    S5
Probabilities:  0.2   0.15  0.25  0.25  0.15

Obtain Huffman coding. Also find the efficiency and redundancy.

26 24
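A compact, illustrative Huffman encoder (heapq-based; the tie-breaking may differ from the hand construction in the lecture), applied to the five-symbol source of question 35 to get the average length, efficiency and redundancy.

```python
import heapq, math

def huffman(probs):
    """Return a dict symbol -> binary codeword for the given probabilities."""
    # Heap items: (probability, tie-breaker, {symbol: partial codeword})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least probable nodes
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"S1": 0.2, "S2": 0.15, "S3": 0.25, "S4": 0.25, "S5": 0.15}
code = huffman(probs)
L = sum(probs[s] * len(w) for s, w in code.items())     # average length
H = -sum(p * math.log2(p) for p in probs.values())      # entropy
print(code)
print(f"L = {L:.2f}, H = {H:.3f}, efficiency = {H/L:.3f}, redundancy = {1 - H/L:.3f}")
```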

36 Explain the properties of Huffman coding. 26 28

37 A source has four symbols S={S1, S2 ,S3, S4} with source alphabet X={0 1}

SYMBOLS CODE A CODE B Code C Code D

S1 0 0 10 0

S2 0 010 00 10

S3 0 01 11 110

S4 0 10 110 111

Identify the codes as

i) Singular

ii) Non-singular but uniquely decodable

iii) Uniquely decodable but not prefix

iv) Prefix

26 32

38 A source has seven symbols with probabilities as follows:

Symbols:        X1    X2    X3    X4    X5    X6    X7
Probabilities:  0.49  0.26  0.12  0.04  0.04  0.05  0.02

Find

i) a binary Huffman code

ii) the expected code length for this encoding

iii) What is the minimum length of any fixed-length code for this random variable?

26 35

39 Which of these codes cannot be Huffman codes for any probability

assignment?

a) {0, 10, 11}

b) {00, 01, 10, 110}

c) {01, 10}

26 48

40 State and prove necessary condition for Kraft’s inequality 27 13

41 State and prove sufficient condition for Kraft’s inequality 27 24

42 What is an optimal code? Explain with a suitable example 27 37

43 For an optimal code prove that

27 40

44 27 49

45 A source has TWO symbols with probabilities as follows

Symbols probabilities

X1 0.9

X2 0.1

Find the average code word length, entropy and efficiency of Huffman

coding and comment on the result

28 8

46 What is block-wise encoding? Explain with a suitable example. 28 11

47 How can the average codeword length be made equal to the entropy of the system for
any discrete memoryless source?

28 12

48 For the nth extension prove that H(X) <= Ln/n < H(X) + 1/n

28 12

49 What are the disadvantages of Huffman code? 28 18

50 Explain the procedure of Shannon-Fano-Elias coding or Arithmetic coding. 28 19

51 Prove that in Shannon-Fano-Elias coding the average codeword length

L < H(S)+2

28 35
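An illustrative Shannon-Fano-Elias encoder: each symbol is coded by truncating the midpoint of its CDF interval to ceil(log2(1/p)) + 1 bits, which is why the average length always satisfies L < H(S) + 2. The probabilities below are only an assumed example.

```python
import math

def sfe_code(probs):
    """Shannon-Fano-Elias code: symbol -> codeword (binary string)."""
    codes = {}
    F = 0.0                                   # cumulative probability so far
    for sym, p in probs.items():
        Fbar = F + p / 2                      # midpoint of the symbol's interval
        l = math.ceil(math.log2(1 / p)) + 1   # codeword length
        # Take the first l bits of the binary expansion of Fbar.
        bits, x = "", Fbar
        for _ in range(l):
            x *= 2
            bits += "1" if x >= 1 else "0"
            x -= int(x)
        codes[sym] = bits
        F += p
    return codes

probs = {"S1": 0.25, "S2": 0.25, "S3": 0.2, "S4": 0.15, "S5": 0.15}
codes = sfe_code(probs)
L = sum(probs[s] * len(c) for s, c in codes.items())
H = -sum(p * math.log2(p) for p in probs.values())
print(codes)
print(f"L = {L:.2f} < H + 2 = {H + 2:.2f}")
```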


52 A source generates five symbols with probabilities 0.2, 0.05, 0.45, 0.15 and

0.15.

Which of these statements are true?

a) There is a binary UDC [uniquely decodable code] for this source with

average length = 2.

b) There is a binary UDC [uniquely decodable code] for this source with

average length <3.2.

c) There is no binary UDC [uniquely decodable code] for this source with

average length < 2.5.

d) Also find the binary Huffman code and its average length.

28 42

53 Using the Kraft inequality, show that there is a prefix code with codeword lengths 6, 6, 5, 4, 4, 4, 4, 4, 3, 3, 3, 2

a) construct such a code

b) specify a set of source symbol probabilities such that this code gives

the average length = H(X) for that source.

28 49
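An illustrative check of question 53: the Kraft sum for the given lengths equals 1, and a canonical (sorted-lengths) construction then yields a concrete prefix code; with dyadic probabilities 2^(-li) the average length equals H(X).

```python
from math import isclose

lengths = [6, 6, 5, 4, 4, 4, 4, 4, 3, 3, 3, 2]

# Kraft sum: a prefix code with these lengths exists iff sum(2^-l) <= 1.
kraft = sum(2 ** -l for l in lengths)
print("Kraft sum =", kraft)                 # 1.0, so a prefix code exists

# Canonical construction: assign codewords in order of increasing length.
codewords = []
value, prev_len = 0, 0                      # running codeword value
for l in sorted(lengths):
    value <<= (l - prev_len)                # extend to the new length
    codewords.append(format(value, f"0{l}b"))
    value += 1
    prev_len = l
print(codewords)

# With dyadic probabilities p_i = 2^-l_i the average length equals H(X).
probs = [2 ** -l for l in lengths]
assert isclose(sum(probs), 1.0)
```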

54 Find the binary Huffman code for the source with probabilities

(1/3, 1/5, 1/5, 2/15, 2/15). Show that this code is also optimal for the source

with probability (1/5, 1/5, 1/5, 1/5, 1/5).

28 54

55 Find a probability distribution {p1, p2, p3, p4} such that there are two optimal

codes that assign different length {li} to the four symbols.

28 54

56 Discuss the procedure of Block-wise SFA with suitable example. 29 13

55 What is lexicographic ordering? What is its significance in coding? 29 14

56 What are the two phases of Arithmetic codes? 29 15

57 What are the advantages of Arithmetic coding and briefly discuss its

procedure?

29 19

58 Explain the iterative encoding procedure of Arithmetic codes. 29 24
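A minimal sketch of the iterative interval update used in arithmetic encoding (illustrative only; it returns the final tag interval and its midpoint, without the bit-level renormalisation a practical coder would use). The three-symbol model is an assumed example.

```python
# Symbol model: probabilities and the cumulative distribution intervals.
probs = {"a": 0.65, "b": 0.15, "c": 0.2}
cdf, running = {}, 0.0
for s, p in probs.items():
    cdf[s] = (running, running + p)          # [F(s-), F(s)) interval
    running += p

def encode(sequence):
    """Iteratively shrink [low, high) and return the midpoint tag."""
    low, high = 0.0, 1.0
    for s in sequence:
        width = high - low
        lo_s, hi_s = cdf[s]
        # l(n) and u(n) recursion: rescale the symbol interval into [low, high).
        low, high = low + width * lo_s, low + width * hi_s
    return (low + high) / 2, (low, high)

tag, interval = encode("acab")
print("tag =", tag, "interval =", interval)
```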

59 Briefly discuss how the conditional symbol probabilities are computed in arithmetic coding for the Laplace model

29 26

60 Briefly discuss how the conditional symbol probabilities are computed in arithmetic coding for the Dirichlet model

29 28

61 Briefly discuss with an example how to calculate different probabilities in

arithmetic coding.

29 31

62 With an example explain how a file compression encoding is performed in

arithmetic coding.

29 34


63 With an example explain how a file compression decoding is performed in

arithmetic coding.

29 37

64 Explain the procedure of Lempel-Ziv coding. 29 39

65 Explain the encoding procedure of Lempel-Ziv coding with suitable example. 29 46

66 Explain the decoding procedure of Lempel-Ziv coding with suitable example. 29 49

67 Take the string 0010100001000000001100000100100000100000000001 and

perform Lempel-Ziv coding on this string. What is the compression ratio?

29 53

68 Decode the string 00101011101100100100011010101000011 that was

encoded using the basic Lempel-Ziv algorithm.

29 53
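An illustrative LZ78-style Lempel-Ziv encoder (one common variant; the lecture's parsing may differ slightly), run on the string from question 67 together with a rough coded-size estimate.

```python
import math

def lz78_encode(s):
    """Parse s into (dictionary index, next symbol) pairs, LZ78 style."""
    dictionary = {"": 0}          # phrase -> index; index 0 is the empty phrase
    output, phrase = [], ""
    for ch in s:
        if phrase + ch in dictionary:
            phrase += ch                       # keep extending the match
        else:
            output.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:                                 # flush a trailing, already-seen phrase
        output.append((dictionary[phrase[:-1]], phrase[-1]))
    return output

s = "0010100001000000001100000100100000100000000001"
pairs = lz78_encode(s)
print(pairs)

# Rough size estimate: pair i needs ceil(log2(i+1)) bits for the index + 1 data bit.
bits = sum(max(1, math.ceil(math.log2(i + 1))) + 1 for i in range(len(pairs)))
print(f"{len(s)} source bits -> roughly {bits} coded bits")
```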

69 With neat block diagram, explain the role of source and channel coder in

digital communication system.

30 3

70 What is the purpose of channel coder in communication system? 30 4

71 Consider a binary symmetric channel with probability of error p=0.25, find the

channel capacity.

30 8
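A short check (illustrative) of question 71: the capacity of a binary symmetric channel is C = 1 - H(p), about 0.189 bits per channel use for p = 0.25.

```python
import math

def binary_entropy(p):
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.25
C = 1 - binary_entropy(p)        # capacity of a BSC in bits per channel use
print(f"C = {C:.4f} bits/use")   # about 0.1887
```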

72 What is repetition code? Explain with an example. 30 16

73 Define the code rate of a repetition code. How many errors can an n-times repetition code detect and correct?

30 18

74 What is parity check code? Explain with an example. 30 21

75 Define code rate in case of parity check code. 30 28

76 How is the difference between any two code words defined? 30 29

77 What is Hamming distance between any two code vectors? 30 30

78 What is dmin of a code? Explain its significance in coding theory. 30 31

79 Explain the decoding procedure in case of source code. 30 39

80 Explain the design procedure of (7, 4) Hamming code. 30 47

81 How to check error in the received code of Hamming code? Explain with

suitable example.

30 49
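An illustrative (7,4) Hamming code sketch in one common systematic form (the generator and parity-check matrices assumed here may be arranged differently in the lecture): it encodes four data bits, injects a single bit error, and locates it from the syndrome.

```python
import numpy as np

# Systematic (7,4) Hamming code: codeword = [d1 d2 d3 d4 p1 p2 p3].
G = np.array([[1,0,0,0, 1,1,0],
              [0,1,0,0, 1,0,1],
              [0,0,1,0, 0,1,1],
              [0,0,0,1, 1,1,1]])
H = np.array([[1,1,0,1, 1,0,0],
              [1,0,1,1, 0,1,0],
              [0,1,1,1, 0,0,1]])

def encode(data):
    return (np.array(data) @ G) % 2

def syndrome(received):
    """Return the syndrome; all-zero means no detectable error."""
    return (H @ np.array(received)) % 2

data = [1, 0, 1, 1]
code = encode(data)
received = code.copy()
received[2] ^= 1                    # flip one bit to simulate a channel error
s = syndrome(received)
print("codeword:", code, "syndrome:", s)
# The nonzero syndrome matches column 2 of H, so bit 2 is flipped back.
received[2] ^= 1 if np.array_equal(s, H[:, 2]) else 0
print("corrected:", received, "matches:", np.array_equal(received, code))
```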

82 Define DTFT of a signal x(n) 31 4

83 Find the DTFT of a given signal and sketch its spectrum.

31 5

84 Find the DTFT of the signal x(n) - x(n-1); also mention the magnitude and phase of the spectrum.

31 6

85 Define Inverse DTFT of a signal 31 7

86 Briefly discuss linear and convolution property of DTFT. 31 8

87 Briefly discuss modulation property of DTFT. 31 10


88 Define DFT of a signal x(n) 31 11

89 Define Inverse DFT of a signal x(n) 31 12

90 Define DFT in matrix form 31 13

91 Define IDFT in matrix form 31 18

92 How does cyclic convolution differ from linear convolution? 31 20

93 Define cyclic convolution 31 20

94 Briefly discuss the following properties of DFT

i) Linearity

ii) Point-wise multiplication

iii) cyclic shift

31 22
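A short numpy sketch (illustrative, with assumed example sequences) contrasting linear and cyclic convolution and verifying the DFT point-wise multiplication property: the IDFT of X(k)H(k) equals the cyclic convolution.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
h = np.array([1.0, 1.0, 0.0, 0.0])

# Linear convolution has length len(x) + len(h) - 1 = 7.
linear = np.convolve(x, h)

# Cyclic (circular) convolution of length N = 4, computed directly...
N = len(x)
cyclic = np.array([sum(x[m] * h[(n - m) % N] for m in range(N)) for n in range(N)])

# ...and via the DFT point-wise multiplication property.
cyclic_dft = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))

print("linear :", linear)        # [1, 3, 5, 7, 4, 0, 0]
print("cyclic :", cyclic)        # [5, 3, 5, 7]  (wrap-around of the linear result)
print("via DFT:", cyclic_dft)
```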

95 Find the linear convolution of the two given sequences.

31 25

96 What are problems with single carrier communication? 31 34

97 Briefly discuss the fundamentals of OFDM with frequency selective channel. 31 38

98 Briefly discuss the working of M-band OFDM 31 39

99 Explain the role of subbands in OFDM 31 40

100 What are the characteristics of an ideal carrier? 31 40

101 How to choose efficient carriers in OFDM? 31 42

102 With block diagram explain efficient transmitter structure. 31 43

103 How to avoid ISI in OFDM? 31 46

104 With a suitable example explain how the cyclic prefix avoids ISI in OFDM. 31 47
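An illustrative numpy sketch for question 104 (the block length, prefix length and channel taps are assumed values): after discarding the cyclic prefix, the received block equals a circular convolution of the transmitted block with the channel, so each subcarrier only needs a one-tap equaliser and no ISI remains.

```python
import numpy as np

N, L = 8, 2                                   # block length, channel memory
rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=N)     # one OFDM block (after the IFFT, say)
h = np.array([0.8, 0.5, 0.3])                 # frequency-selective channel taps

# Transmit with a cyclic prefix of length L (>= channel memory).
tx = np.concatenate([symbols[-L:], symbols])

# Channel = linear convolution; keep only the N samples after the prefix.
rx = np.convolve(tx, h)[L:L + N]

# These N samples equal the circular convolution of the block with h,
# so in the DFT domain each subcarrier sees a flat (one-tap) channel.
H = np.fft.fft(h, N)
equalised = np.fft.ifft(np.fft.fft(rx) / H)
print(np.allclose(np.real(equalised), symbols))   # True: no ISI left
```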

105 With neat block diagram explain the working of Transceiver structure in

OFDM

31 50

106 Briefly discuss the power allocation in OFDM. 31 52

107 What is bit allocation in OFDM? Explain with a suitable example. 31 54


2.3) Quiz Questions

Questions: Fill in the blanks.

Q. No.  Question  Answer

1 If all the events are equally likely then the entropy is ____________ Maximum

2 The average codeword length Lav is always less than ____ H(X)+1

3 The minimum number of bits used to represent the source is_______ Entropy

4 When X and Y are independent events then I(X;Y)=_______ zero

5 For a band-limited signal of bandwidth W, for no loss of information
the sampling rate should be ____
2W (Nyquist rate)

6 Source coding is needed when the channel may not have the capacity

to communicate at the ___________rate

source information

7 A sampled signal has to be _____ before transmission through a

discrete memory-less channel.

Quantized

8 By ________ the value of n, we can have L as close to H(X) as desired. increasing

9 Shannon-Fano-Elias coding is ________ asymptotically optimum

10 Arithmetic coding can handle sources with correlation and thus gives
better _____ for a source with correlation

compression

11 Dictionary based algorithms are not efficient for _____ transmission. Image

12 For small source alphabets, we have efficient coding only if we use
________ blocks of source symbols.

long

13 By using long blocks we can achieve length per symbol close to the

__________ of the source.

Entropy

14 In source coding we remove the ________ Redundancy

15 In arithmetic coding the upper and the lower limit can be

computed___________

Recursively

16 In case of arithmetic coding only information required by the tag

generation is ________

Cumulative distribution

function

17 In case of arithmetic coding tag for any sequence can be computed

________

Sequentially

18 In Shannon-Fano-Elias coding the ____________ distribution function is
used for coding

Cumulative


19 To approach channel capacity we need codes of _______ length large

20 If n is the length of a repetition code, then the number of errors
corrected is __________
(n-1)/2 (for odd n)

21 In case of parity check code we can detect ______ error. one

22 The difference between two code words 101011 and 110110 is___ Four

23 In case of n-times repetition code the dmin=_____ n

24 If we have a code with minimum distance d then we can correct all

the errors, if the number of errors is __________the d/2.

less than

25 Single carrier performance is affected due to high _____ in some

bands, since all used frequencies are given equal importance.

attenuation

26 In OFDM the spectrum is divided into _________ Narrow subbands

27 In OFDM power and rate of transmission in a band depends on the

response of the ____in that band.

channel

28 By using cyclic prefix we can avoid ___________ in OFDM ISI

29 The number of bits and the constellation can be chosen for a sub-

channel based on the ________in that sub-channel and the required

probability of error

SNR


2.4) True or False:

Q: State whether the following statements are True or False.

Q. No.  Statement  Answer

1 We should use fewer bits for frequent events T

2 T

3 If X and Y are dependent events then H(X,Y)=H(X)+H(Y) F

4 If X and Y are dependent events then the mutual information conveyed is zero. F

5 If R < C, there exists a coding scheme of rate R so that the average probability of
error can be made arbitrarily small.
T

6 e F

7 If R>C, then real communication is not possible. T

8 Telephone quality speech signal can be compressed to as little as 1-2kbps T

9 Uniform distribution of the source may allow efficient representation of the signal at

a low rate.

F

10 Morse code is an example of non-uniform distribution of the source T

11 Huffman code is a prefix code T

12 The condition sum over i of 2^(-li) <= 1 is necessary for any code to be a prefix code.
T

13 Kraft inequality will not hold good for uniquely decodable code. F

14 A code for a random variable X is optimum if there is no code for the same random

variable with smaller average length.

T

15 Huffman code cannot be applied for sources with unknown statistics. T

16 Block-wise SFA has to compute all probability masses. F

17 Dictionary based algorithms are most efficient for text transmission. T

18 By continuing to block symbols together, we find that redundancy drops to

acceptable values when we block eight symbols together.

T


19 For small source alphabets, we have efficient coding only if we use short blocks of

source symbols.

F

20 By using long blocks we can achieve length per symbol close to the entropy of the

source.

T

21 Huffman coding is ideal for long block coding F

22 In arithmetic code first we generate a tag T

23 In arithmetic coding, the interval containing the tag value for a given sequence is

disjoint from the interval containing the tag values of all other sequences.

T

24 In case of arithmetic coding the tag for any sequence is given by the difference of

upper and lower limit.

F

25 In arithmetic coding each tag should be represented by a unique binary code T

26 In arithmetic coding the values of l(n) and u(n) come closer and closer as n gets
larger.

T

27 For large length sequence Huffman coding is better than Arithmetic coding. F

28 In case of Lempel-Ziv coding we maintain a list of substrings that appeared before T

29 In case of Lempel-Ziv coding no explicit knowledge of the source statistics is required T

30 In channel coding we remove the redundancy F

31 Channel coding is used to reduce probability of error for finite block of symbol. T

32 In case of parity check code we can correct errors also F

33 The minimum hamming distance of a parity check code is two T

34 T

35 In single carrier communication, the frequency selective channel introduces ISI at the

receiver.

T

36 In single carrier communication equalization may reduce noise greatly in frequencies

where channel response is poor.

F

37 In OFDM we use single carrier communication F

38 In OFDM separate data is transmitted in each band using different carrier T

39 In OFDM no ISI since in each narrow subband, the channel response is almost flat. T

40 By using zero insertion we can avoid ISI in OFDM T


2.5) Frequently Asked Questions [ FAQ ]:

Q. No.  Question  Video No.

1 What is the need of processing source in communication? Explain with a block

diagram.

08

2 How does source coding differ from channel coding in communication? 08

3 What are the functions performed by source code? Explain with an example. 08

4 How to measure the information of any event based on the probability of the

event?

08

5 Define entropy of the source. 08

6 Which are the units used to measure entropy? 08

7 Why is information expressed in log form? 08

8 In a binary source there are two symbols with probabilities p and (1-p). Plot the
entropy as a function of p and state when the entropy is maximum.

08

9 A source has four symbols with probabilities as follows

Symbols probabilities

A 1/2

B 1/4

C 1/8

D 1/8

Find the entropy of the source and discuss how to assign code for these

symbols?

08

10 Explain the properties of entropy. 08

11 What are minimum and maximum values of average codeword length Lav? 08

12 Explain Shannon source coding theorem. 08

13 A source has alphabet X={0,1} with probabilities p and 1-p, similarly another

source has alphabet Y={0,1} with probabilities q and 1-q, Find the joint entropy

if random variables are independent.

08

14 The joint probability matrix is given by

P(X,Y) =
[ 1/8    1/16   1/32   1/32 ]
[ 1/16   1/8    1/32   1/32 ]
[ 1/16   1/16   1/16   1/16 ]
[ 1/4    0      0      0    ]

Find the probabilities of the input and output symbols. Also find H(X), H(Y), H(X,Y), H(X/Y) and H(Y/X)

09

15 Write the Venn diagram of channel entropies 09

16 Define mutual information and what is its significance? 09

17 Prove the following identities

I(X;Y)=H(X)-H(X/Y)

09

18 State and prove channel coding theorem. 09

19 With suitable graph explain upper and lower bound on the probability of error. 09

20 What is the difference between lossless coding and lossy coding? 26

21 With suitable example explain the procedure of Huffman coding 26

22 A source has five symbols with probabilities as follows

Symbols probabilities

S1 0.2

S2 0.15

S3 0.25

S4 0.25

S5 0.15

Obtain Huffman coding also find the efficiency and redundancy.

26

23 Explain the properties of Huffman coding. 26

24 A source has four symbols S={S1, S2 ,S3, S4}

with source alphabet X={0 1}

SYMBOLS CODE A CODE B Code C Code D

S1 0 0 10 0

S2 0 010 00 10

S3 0 01 11 110

S4 0 10 110 111

Identify the codes as

i) Singular

ii) Non-singular but uniquely decodable

iii) Uniquely decodable but not prefix

iv) Prefix

26

25 A source has seven symbols with probabilities as follows

Symbols:        X1    X2    X3    X4    X5    X6    X7
Probabilities:  0.49  0.26  0.12  0.04  0.04  0.05  0.02

Find

i) a binary Huffman code

ii) the expected code length for this encoding

iii) What is the minimum length of any fixed length code for this random

variable?

26

26 Which of these codes cannot be Huffman codes for any probability

assignment?

a) {0, 10, 11}

b) {00, 01, 10, 110}

c) {01, 10}

26

27 State and prove necessary condition for Kraft’s inequality 27

28 State and prove sufficient condition for Kraft’s inequality 27

29 What is an optimal code? Explain with a suitable example 27

30 For an optimal code prove that

27

31 27

32 A source has TWO symbols with probabilities as follows

Symbols probabilities

X1 0.9

X2 0.1

Find the average code word length, entropy and efficiency of Huffman coding

and comment on the result

28

33 What are the disadvantages of Huffman code? 28

34 Explain the procedure of Shannon-Fano-Elias coding or Arithmetic coding. 28

35 Prove that in Shannon-Fano-Elias coding the average codeword length L < H(S)+2. 28

36 A source generates five symbols with probabilities 0.2, 0.05, 0.45, 0.15 and

0.15.

Which of these statements are true?

a) There is a binary UDC [uniquely decodable code] for this source with

average length = 2.

b) There is a binary UDC [uniquely decodable code] for this source with

average length <3.2.

c) There is no binary UDC [uniquely decodable code] for this source with

average length < 2.5.

d) Also find the binary Huffman code and its average length.

28

37 Using the Kraft inequality, show that there is a prefix code with codeword lengths 6, 6, 5, 4, 4, 4, 4, 4, 3, 3, 3, 2

a) construct such a code

b) specify a set of source symbol probabilities such that this code gives

the average length = H(X) for that source.

28

38 Find the binary Huffman code for the source with probabilities

(1/3, 1/5, 1/5, 2/15, 2/15). Argue that this code is also optimal for the source

with probabilities (1/5, 1/5, 1/5, 1/5, 1/5).

28

39 Discuss the procedure of Block-wise SFA with suitable example. 29

40 What is lexicographic ordering? What is its significance in coding? 29

41 What are the advantages of Arithmetic coding and briefly discuss its

procedure?

29

42 Explain the iterative encoding procedure of Arithmetic codes. 29

43 Briefly discuss how the conditional symbol probabilities are computed in arithmetic coding for the Laplace model

29

44 Briefly discus with an example, how to calculate different probabilities in

arithmetic coding.

29

45 With an example explain how a file compression encoding is performed in

arithmetic coding.

29

46 With an example explain how a file compression decoding is performed in

arithmetic coding.

29

47 Explain the procedure of Lempel-Ziv coding 29


48 Explain the encoding procedure of Lempel-Ziv coding with suitable example. 29

49 Explain the decoding procedure of Lempel-Ziv coding with suitable example. 29

50 Take the string 0010100001000000001100000100100000100000000001 and

perform Lempel-Ziv coding on this string. What is the compression ratio?

29

51 Decode the string 00101011101100100100011010101000011 that was

encoded using the basic Lempel-Ziv algorithm

29

52 What is the purpose of channel coder in communication system? 30

53 Consider a binary symmetric channel with probability of error p=0.25, find the

channel capacity.

30

54 What is repetition code? Explain with an example 30

55 Define code rate in case of repetition code. How many errors an n-times

repetition code will detect and correct?

30

56 What is parity check code? Explain with an example 30

55 How is the difference between any two code words defined? 30

56 What is Hamming distance between any two code vectors? 30

57 Explain the decoding procedure in case of source code. 30

58 Explain the design procedure of (7, 4) Hamming code. 30

59 How to check error in the received code of Hamming code? Explain with

suitable example.

30

60 Briefly discuss linear and convolution property of DTFT. 31

61 Briefly discuss modulation property of DTFT. 31

62 Define DFT of a signal x(n) 31

63 How does cyclic convolution differ from linear convolution? 31

64 Define cyclic convolution 31

65 Briefly discuss the following properties of DFT

i) Linearity

ii) Point-wise multiplication

iii) cyclic shift

31

66 What are problems with single carrier communication? 31

67 Briefly discuss the fundamentals of OFDM with frequency selective channel. 31

68 Explain the role of subbands in OFDM 31

69 What are the characteristics of an ideal carrier? 31

70 How to choose efficient carriers in OFDM? 31

71 With block diagram explain efficient transmitter structure. 31


72 How to avoid ISI in OFDM? 31

73 With a suitable example explain how the cyclic prefix avoids ISI in OFDM. 31

74 With neat block diagram explain the working of Transceiver structure in OFDM 31

75 Briefly discuss the power allocation in OFDM. 31


2.6) Assignment Questions

Q. No.  Questions

1 A zero memory source has a source alphabet

S = {s1, s2, s3, s4, s5, s6} with

P = {0.3, 0.2, 0.15, 0.15, 0.15, 0.05}

Find the entropy of the source

2 Which of the following statements conveys more information, and why?

i) Tomorrow may be holiday.

ii) Our college students won the state level football match.

iii) Rose is red.

3 A source has three symbols S1, S2 and S3 with probabilities 0.7, 0.2 and 0.1 respectively.

i) Find the information conveyed by each symbol and comment on the result.

ii) Also find the entropy of the system

4 In a binary symmetric channel the channel matrix is given by

P(B/A) = [ 1-p    p  ]
         [  p   1-p ]

with p{a=0} = w and p{a=1} = 1-w.

Derive an expression for the mutual information

5 Prove the following identities of information measure

i)

ii)

6 Prove the following identities of information measure

i) H(X,Y)=H(X)+H(Y/X)

ii) H(X,Y)=H(Y)+H(X/Y)

7 Define the following entropies

i) H(A)

ii) H(B)

iii) H(A,B)

iv) H(A/B) and

v) H(B/A)

8 A source has five symbols with probabilities as follows

Symbols   Probabilities
S1        0.1
S2        0.4
S3        0.2
S4        0.15
S5        0.15

Obtain the Huffman code; also find the efficiency and redundancy.

9 A source has five symbols with probabilities as follows

Symbols probabilities

S1 0.2

S2 0.2

S3 0.25

S4 0.2

S5 0.15

Obtain the Huffman code; find its efficiency and redundancy,

i) keeping the combined symbol as high as possible in the subsequent levels

ii) keeping the combined symbol as low as possible in the subsequent levels when equal
probabilities occur.

Comment on the results.

10 A source has five symbols with probabilities as follows

Symbols:        S1    S2    S3    S4    S5
Probabilities:  0.1   0.25  0.25  0.2   0.2

Obtain the Huffman code and also draw the tree for the binary Huffman code.

11 What are the advantages of using the cumulative distribution function in Shannon-Fano-Elias
coding?

12 Prove that in Shannon –Fano-Elias coding average codeword length

L < H(S)+2

13 In a binary channel the channel matrix is given by

P(B/A) = [ 2/3    1/3  ]
         [ 1/10   9/10 ]

with p{a=0} = 3/4 and p{a=1} = 1/4

i) Write the noise diagram

ii) Find the probabilities of the output symbols

iii) Also find the backward probabilities
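An illustrative computation for assignment question 13: the output probabilities follow from the law of total probability and the backward probabilities from Bayes' rule.

```python
import numpy as np

P_b_given_a = np.array([[2/3,  1/3],     # row a=0: P(b=0|a=0), P(b=1|a=0)
                        [1/10, 9/10]])   # row a=1
p_a = np.array([3/4, 1/4])               # input probabilities

# ii) Output probabilities: P(b) = sum_a P(a) P(b|a)
p_b = p_a @ P_b_given_a
print("P(b):", p_b)                      # [0.525, 0.475]

# iii) Backward probabilities: P(a|b) = P(a) P(b|a) / P(b)
P_a_given_b = (p_a[:, None] * P_b_given_a) / p_b
print("P(a|b):\n", P_a_given_b)
```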


14 A source has Three symbols with probabilities as follows

Symbols probabilities S1 0.9 S2 0.06 S3 0.04

Find the average code word length, entropy, efficiency and redundancy of Huffman coding

and comment on the result

15 A source has Three symbols with probabilities as follows

Symbols probabilities

S1 0.9

S2 0.08

S3 0.02

Find the average code word length, entropy, efficiency and redundancy of Huffman coding

for second extension and comment on the result

16 A source has four symbols with probabilities as follows

Symbols probabilities

S1 0.2

S2 0.55

S3 0.15

S4 0.1

Obtain Shannon –Fano-Elias coding.

17 A source has five symbols with probabilities as follows

Symbols probabilities

S1 0.4

S2 0.2

S3 0.2

S4 0.1

S5 0.1

Obtain the Shannon-Fano-Elias code and comment on the result.

18 A source has Three symbols with probabilities as follows

Symbols probabilities

S1 0.65

S2 0.15

S3 0.2

Obtain the arithmetic code and comment on the result.

2.7) Test your skill:

Sl. No.

Questions

1 Consider the following case: a card is drawn from a deck.
(i) You are told it is a spade. How much information did you receive?
(ii) How much information would you receive if you were told that the card drawn is an ace?
(iii) If you are told that the card drawn is the ace of spades, how much information did you receive?
(iv) Is the information obtained in (iii) the sum of the information obtained in (i) and (ii)? Justify your answer.

2 The output of an information source contains 128 symbols 16 of which occur with a probability of 1/32 and the remaining 112 occur with a probability of 1/224. The source emits 2000 symbols per second. Assuming that the symbols are chosen independently, find the average information of this source. What could be the maximum entropy of the system?

3 In conventional telegraphy we use two symbols, the dot (.) and the dash (_). Assuming the
dash is twice as long as the dot and half as probable, find the average symbol rate and the entropy rate.

4 A source has four symbols S={S1, S2, S3, S4} with source alphabet X={0, 1}. Find which of the codes satisfy Kraft's inequality and which of the codes are instantaneous codes.

SYMBOLS   CODE A   CODE B   CODE C   CODE D   CODE E
S1        00       0        0        0        0
S2        01       100      10       10       10
S3        10       110      11       110      110
S4        11       111      111      100      11

5 A source has five symbols with probabilities as follows

Symbols:        S1    S2    S3    S4    S5
Probabilities:  0.2   0.2   0.25  0.2   0.15

Obtain the Huffman code; find its efficiency and redundancy,
i) keeping the combined symbol as high as possible in the subsequent levels
ii) keeping the combined symbol as low as possible in the subsequent levels when equal probabilities occur.
Comment on the results.

6 A source has four symbols with probabilities as follows

Symbols:        S1    S2    S3    S4
Probabilities:  0.2   0.55  0.15  0.1

Obtain the Shannon-Fano-Elias code.

7 In a communication system, a transmitter has 3 input symbols A = {a1, a2, a3} and receiver also has 3 output symbols B ={b1, b2, b3}. The matrix given below shows JPM with some marginal probabilities:

i) Find the missing probabilities (*) in the table.
ii) Find P(b3/a1) and P(a1/b3).
iii) Are the events a1 and b1 statistically independent? Why?

8 For the given channel matrix, compute the mutual information I(X;Y) with P(x1) = 0.45 and
P(x2) = 0.55

              y1     y2     y3
P(Y/X) = x1 [ 2/3    1/3    0   ]
         x2 [ 0      1/6    5/6 ]


2.8) Additional link

Module-2 General Links

http://www.youtube.com/watch?v=C-o2jcLFxyk&list=PLWMqMAYxtBM-IeOSmNkT-KEcgru8EkzCs
http://www.youtube.com/watch?v=R4OlXb9aTvQ
http://www.youtube.com/watch?v=JnJq3Py0dyM
http://www.yovisto.com/video/20224
http://www.youtube.com/watch?v=UrefKMSEuAI&list=PLE125425EC837021F

Q. No.  Question  Video / Web Links

1 Noisy channel http://videolectures.net/mackay_course_06 http://www.yovisto.com/video/20230

http://www.indigosim.com/tutorials/communication/t3s3.htm

http://www.st-andrews.ac.uk/~www_pa/Scots_Guide/iandm/part8/page1.html

http://en.wikibooks.org/wiki/Data_Coding_Theory/Shannon_capacity

2 Rate of transmission Rb and Channel capacity C,

http://en.wikipedia.org/wiki/Channel_capacity

http://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theorem

http://www.cis.temple.edu/~giorgio/cis307/readings/datatrans.html

3 Source encoder http://members.chello.nl/~m.heijligers/DAChtml/digcom/digcom.html

http://zone.ni.com/devzone/cda/ph/p/id/82

4 Zero memory sources?

http://www.google.co.in/url?sa=t&rct=j&q=&esrc=s&source=web&cd=3&cad=rja&ved=0CDcQFjAC&url=http%3A%2F%2Fwww.cse.msu.edu%2F~cse847%2Fslides%2Fintroinfo.ppt&ei=lBe9UreAJMPqrAet6YDwDA&usg=AFQjCNEuNfVU4lYWFYS1FQmN3-l2gTR_eg&bvm=bv.58187178,d.bmk

http://www.cs.cmu.edu/~roni/10601-slides/info-theory-x4.pdf

5 Maximum entropy http://en.wikipedia.org/wiki/Entropy_%28information_theory%29

http://web.eecs.utk.edu/~mclennan/Classes/420-594-F07/handouts/Lecture-04.pdf

http://www.princeton.edu/~achaney/tmve/wiki100k/docs/Entropy_%28information_theory%29.html

6 Source extension? https://www.cs.auckland.ac.nz/courses/compsci314s2c/resources/InfoTheory.pdf

http://people.irisa.fr/Olivier.Le_Meur/teaching/InformationTheory_DIIC3_INC.pdf

http://meru.cecs.missouri.edu/courses/cecs401/dict2.pdf

7 Redundancy in a source?

http://www.youtube.com/watch?v=JnJq3Py0dyM

http://en.wikipedia.org/wiki/Redundancy_%28information_theory%29

http://www.ifi.uzh.ch/ee/fileadmin/user_upload/teaching/hs09/L2_InformationTheory.pdf

8 Average codeword length with upper and lower bound

http://www.cim.mcgill.ca/~langer/423/lecture5.pdf

http://www.cims.nyu.edu/~chou/notes/infotheory.pdf

9 Shanon’s first theorem

http://www.cims.nyu.edu/~chou/notes/infotheory.pdf

https://ccnet.stanford.edu/cgi-bin/course.cgi?cc=ee10sc&action=handout_download&handout_id=ID13160335419045

https://www.cse.ust.hk/~golin/Talks/Shannon_Coding_Extensions.pdf

http://people.irisa.fr/Olivier.Le_Meur/teaching/InformationTheory_DIIC3_INC.pdf

10 Huffman coding http://people.cs.nctu.edu.tw/~cjtsai/courses/imc/classnotes/imc12_03_Huffman.pdf

http://www.cim.mcgill.ca/~langer/423/lecture3.pdf

http://www.cimt.plymouth.ac.uk/resources/codes/codes_u17_text.pdf

http://www.eit.lth.se/fileadmin/eit/courses/eit080/InfoTheorySH/InfoTheoryPart1c.pdf

11 Properties of entropy

http://www.youtube.com/watch?v=LodZWzrbayY

http://www.renyi.hu/~revesz/epy.pdf

http://cgm.cs.mcgill.ca/~soss/cs644/projects/simon/Entropy.html

http://en.wikipedia.org/wiki/Entropy_%28information_theory%29

12 Noiseless coding theorem

https://ccnet.stanford.edu/cgi-bin/course.cgi?cc=ee10sc&action=handout_download&handout_id=ID13160335419045

https://www.cse.ust.hk/~golin/Talks/Shannon_Coding_Extensions.pdf

http://en.wikipedia.org/wiki/Shannon%27s_source_coding_theorem

http://poincare.matf.bg.ac.rs/nastavno/viktor/Channel_Capacity.pdf

13 Introduction to information channel

http://www.exp-math.uni-essen.de/~vinck/information%20theory/lecture%202013%20info%20theory/chapter%204%20channel-coding%20BW.pdf

http://www.stanford.edu/~montanar/RESEARCH/BOOK/partA.pdf

http://poincare.matf.bg.ac.rs/nastavno/viktor/Channel_Capacity.pdf#page=1&zoom=auto,0,654

http://www-public.it-sudparis.eu/~uro/cours-pdf/poly.pdf

14 Equivocation and mutual information

http://www.ece.uvic.ca/~agullive%20/joint.pdf

http://skynet.ee.ic.ac.uk/notes/CS_2011_3_comm_channels.pdf

http://www2.tu-ilmenau.de/nt/de/teachings/vorlesungen/itsc_master/folien/script.pdf

http://www-public.it-sudparis.eu/~uro/cours-pdf/poly.pdf

15 Properties of different information channels

http://paginas.fe.up.pt/~vinhoza/itpa/lecture3.pdf

http://www2.maths.lth.se/media/thesis/2012/hampus-wessman-MATX01.pdf

https://www.ti.rwth-aachen.de/teaching/ti/data/save_dir/ti1/WS1011/chap2_handouts.pdf

http://people.csail.mit.edu/madhu/FT02/scribe/lect02.pdf

16 Venn diagram of channel entropies

http://clem.dii.unisi.it/~vipp/files/TIC/dispense.pdf

http://people.irisa.fr/Olivier.Le_Meur/teaching/InformationTheory_DIIC3_INC.pdf

https://www.cs.uic.edu/pub/ECE534/WebHome/ch2.pdf

https://www.cs.princeton.edu/picasso/mats/intro-to-info_jp.pdf

17 Channel capacity http://clem.dii.unisi.it/~vipp/files/TIC/dispense.pdf

http://chamilo2.grenet.fr/inp/courses/PHELMAA3SIC5PMSCSF0/document/M2R_SIPT/Info_Th_ChI_II_III.pdf

http://www.icg.isy.liu.se/courses/infotheory/lect5.pdf

http://poincare.matf.bg.ac.rs/nastavno/viktor/Channel_Capacity.pdf
