A Mathematical Theory of Communication (after C. E. Shannon)
Alex Vlasiuk
Alex A Mathematical Theory of Communication 1 / 17
Image: IEEE Information Theory Society
Why Shannon?
- "the father of information theory"
- ideas from the 1948 paper are ubiquitous
- (hopefully) some can be explained through handwaving:
© Jeff Portaro, Noun Project
- was on my desktop

Shannon, Claude Elwood. "A mathematical theory of communication." ACM SIGMOBILE Mobile Computing and Communications Review 5.1 (2001): 3-55.
Setting
Capacity and states of a channel

Symbols: S1, ..., Sn with certain durations t1, ..., tn. Allowed combinations of symbols are signals.

Capacity of a channel:

$$C = \lim_{T\to\infty} \frac{\log N(T)}{T},$$

where N(T) is the number of allowed signals of duration T. Units: bits per second.
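The limit defining C can be approximated numerically. A minimal sketch, assuming a hypothetical channel with two symbols of durations 1 and 2 time units (the durations are illustrative, not from the slide):

```python
import math

# Hypothetical channel: two symbols with durations 1 and 2 time units.
# A signal of duration T ends in one of the two symbols, so the count
# of allowed signals satisfies N(T) = N(T-1) + N(T-2), a Fibonacci-type
# recursion.
def capacity_estimate(T):
    N = [0] * (T + 1)
    N[0] = 1  # the empty signal
    for t in range(1, T + 1):
        N[t] = N[t - 1] + (N[t - 2] if t >= 2 else 0)
    return math.log2(N[T]) / T  # (log N(T)) / T, in bits per unit time

# As T grows this approaches log2 of the golden ratio, about 0.694.
print(capacity_estimate(500))
```

This is exactly Shannon's telegraph-style counting argument: capacity comes out of how fast the number of legal signals grows with duration.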
Graphical representation of a Markov process

The source is a stochastic (random) process.

Example. Alphabet: A, B, C, with given transition probabilities (shown as a state diagram).
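The slide's state diagram is not reproduced in this transcript, but the idea can be sketched with made-up transition probabilities (the matrix below is illustrative, not the one from the slide):

```python
import random

# Illustrative transition probabilities for a three-letter Markov source:
# P[state][next] = probability of emitting `next` after `state`.
P = {
    "A": {"A": 0.1, "B": 0.6, "C": 0.3},
    "B": {"A": 0.5, "B": 0.2, "C": 0.3},
    "C": {"A": 0.4, "B": 0.4, "C": 0.2},
}

def sample_source(n, start="A", seed=0):
    """Emit n letters by walking the chain of transition probabilities."""
    rng = random.Random(seed)
    state, out = start, []
    for _ in range(n):
        out.append(state)
        nxt = list(P[state])
        state = rng.choices(nxt, weights=[P[state][s] for s in nxt])[0]
    return "".join(out)

print(sample_source(30))
```

Each emitted letter depends only on the current state, which is the defining property of a Markov source.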
Example: approximations to English

Using a 27-symbol alphabet (26 letters + space).
- symbols independent and equiprobable:
  XFOML RXKHRJFFJUJ ZLPWCFWKCYJ FFJEYVKCQSGHYD QPAAMKBZAACIBZLHJQD
- symbols independent but with the letter frequencies of English text:
  OCRO HLI RGWR NMIELWIS EU LL NBNESEBYA TH EEI ALHENHTTPA OOBTTVA NAH BRL
- digram structure as in English:
  ON IE ANTSOUTINYS ARE T INCTORE ST BE S DEAMY ACHIN D ILONASIVE TUCOOWE AT TEASONARE FUSO TIZIN ANDY TOBE SEACE CTISBE

"One opens a book at random and selects a letter at random on the page. This letter is recorded. The book is then opened to another page and one reads until this letter is encountered. The succeeding letter is then recorded."
- trigram structure as in English:
  IN NO IST LAT WHEY CRATICT FROURE BIRS GROCID PONDENOME OF DEMONSTURES OF THE REPTAGIN IS REGOACTIONA OF CRE
- first-order word approximation:
  REPRESENTING AND SPEEDILY IS AN GOOD APT OR COME CAN DIFFERENT NATURAL HERE HE THE A IN CAME THE TO OF TO EXPERT GRAY COME TO FURNISHES THE LINE MESSAGE HAD BE THESE
- second-order word approximation:
  THE HEAD AND IN FRONTAL ATTACK ON AN ENGLISH WRITER THAT THE CHARACTER OF THIS POINT IS THEREFORE ANOTHER METHOD FOR THE LETTERS THAT THE TIME OF WHO EVER TOLD THE PROBLEM FOR AN UNEXPECTED
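The letter-level approximations above can be reproduced in a few lines of code. A sketch of the digram case, using a stand-in training text (Shannon worked from published digram-frequency tables and the book-sampling trick quoted earlier):

```python
import random
from collections import defaultdict

# Stand-in training text; any longer English sample works the same way.
TEXT = ("THE QUICK BROWN FOX JUMPS OVER THE LAZY DOG WHILE THE "
        "DOG SLEEPS AND THE FOX RUNS OVER THE GREEN HILL")

def digram_counts(text):
    # counts[a][b] = number of times letter b follows letter a
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, n=40, seed=1):
    rng = random.Random(seed)
    ch = rng.choice(sorted(counts))
    out = [ch]
    for _ in range(n - 1):
        nxt = counts.get(ch)
        if not nxt:  # dead end: restart from a random letter
            ch = rng.choice(sorted(counts))
        else:
            letters = sorted(nxt)
            ch = rng.choices(letters, weights=[nxt[l] for l in letters])[0]
        out.append(ch)
    return "".join(out)

print(generate(digram_counts(TEXT)))
```

Extending the context from one letter to two gives the trigram approximation, and conditioning on whole words instead of letters gives the word-level approximations.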
Entropy

A set of possible events with probabilities p1, p2, ..., pn.

Need: a measure of uncertainty in the outcome:

$$H = -\sum_{i=1}^{n} p_i \log p_i$$

Example: two possibilities with probabilities p and q = 1 − p:

$$H = -(p \log p + q \log q)$$
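A quick numeric check of the two-outcome case (a sketch; base-2 logarithms, so H is in bits):

```python
import math

def entropy(probs):
    # H = -sum p_i log2 p_i; terms with p_i = 0 contribute nothing
    return -sum(p * math.log2(p) for p in probs if p > 0)

# H is maximal (1 bit) at p = q = 1/2 and drops to 0 as the outcome
# becomes certain.
for p in (0.5, 0.9, 0.99, 1.0):
    print(p, entropy([p, 1 - p]))
```

The curve of H against p is the familiar inverted-U: uncertainty is largest when the two outcomes are equally likely.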
Conditional entropy and entropy of a source

x, y – events:

$$H(x) + H(y) \ge H(x, y) = H(x) + H_x(y)$$

A source has states with entropies H_i and transition probabilities p_i(j); then

$$H = \sum_i P_i H_i = -\sum_{i,j} P_i \, p_i(j) \log p_i(j)$$

Different units! It turns out that

$$\lim_{N\to\infty} \frac{\log n(q)}{N} = H,$$

where n(q) is the number of the most probable sequences of length N with total probability q, for any q ≠ 0, 1.

Entropy of a source: bits per symbol.
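The formula H = Σ P_i H_i can be computed directly once the stationary probabilities P_i are known. A sketch with an illustrative transition matrix (not the slide's example), finding P_i by power iteration:

```python
import math

# Illustrative transition matrix: P[i][j] = p_i(j).
P = [
    [0.1, 0.6, 0.3],
    [0.5, 0.2, 0.3],
    [0.4, 0.4, 0.2],
]

def stationary(P, iters=1000):
    # Power iteration: repeatedly push a distribution through P until
    # it stops changing.
    pi = [1.0 / len(P)] * len(P)
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(len(P)))
              for j in range(len(P))]
    return pi

def source_entropy(P):
    # H = sum_i P_i H_i = -sum_{i,j} P_i p_i(j) log p_i(j)
    pi = stationary(P)
    return -sum(pi[i] * p * math.log2(p)
                for i, row in enumerate(P) for p in row if p > 0)

print(source_entropy(P))  # bits per symbol
```

Each state's own uncertainty H_i is weighted by how often the chain visits that state.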
Noiseless case

Theorem (the fundamental theorem for a noiseless channel)
Let a source have entropy H bits/symbol and a channel have capacity C bits/second. Then it is possible to encode the output of the source so as to transmit at the average rate C/H − ε symbols/second over the channel, where ε is arbitrarily small. It is not possible to transmit at an average rate greater than C/H.

$$\frac{[C]}{[H]} = \frac{\text{bits/s}}{\text{bits/sym}} = \text{sym/s}$$

The proof involves constructing an explicit code that achieves the required rate: Shannon–Fano coding.
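The coding step behind the achievability half can be sketched: give each symbol a codeword of length ⌈−log₂ p⌉. The Kraft sum then stays ≤ 1, so a prefix code with these lengths exists, and the average length L satisfies H ≤ L < H + 1. The probabilities below are illustrative:

```python
import math

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # illustrative

# Shannon-Fano-style code lengths: ceil(-log2 p) bits per symbol.
lengths = {s: math.ceil(-math.log2(p)) for s, p in probs.items()}

H = -sum(p * math.log2(p) for p in probs.values())  # source entropy
L = sum(probs[s] * lengths[s] for s in probs)       # average code length
kraft = sum(2.0 ** -lengths[s] for s in probs)      # must be <= 1

print(lengths, H, L, kraft)
```

Because these probabilities are dyadic, L equals H exactly here; in general L can exceed H by up to one bit per symbol, and coding longer blocks drives the per-symbol overhead toward zero.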
Noisy case

Source output: x; decoded output: y. The noise is a stochastic process as well.
Equivocation
Hy(x) – the equivocation: the conditional entropy of the sent x given the received y.
Actual transmission rate:
R = H(x) − Hy(x).
Capacity of a noisy channel (maximum over all sources):
C = max (H(x) − Hy(x)).
Example: transmitting 1000 bits/second with probabilities p0 = p1 = 1/2. On average, 1 symbol in 100 is received incorrectly.
Saying that the rate is 990 bits/second (= 1000 × 0.99) is not reasonable: we don't know where the errors occur.
With the above definition of Hy(x) (if y = 1 is received, the probability that x = 1 was sent is 0.99, etc.):
Hy(x) = −(0.99 log 0.99 + 0.01 log 0.01) ≈ 0.081 bits/symbol.
Thus the actual transmission rate is
R = 1000 − 81 = 919 bits/second.
Alex A Mathematical Theory of Communication 14 / 17
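The arithmetic on this slide can be reproduced with a few lines; the binary entropy helper `h2` is our own name, not notation from the paper:

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

source_rate = 1000          # bits/second; p0 = p1 = 1/2, so H(x) = 1 bit/symbol
p_err = 0.01                # 1 symbol in 100 received incorrectly

# Equivocation per symbol: the residual uncertainty about x given y.
Hy_x = h2(p_err)

# Actual transmission rate R = H(x) - Hy(x), scaled to bits/second.
R = source_rate * (1 - Hy_x)
print(round(Hy_x, 3))       # 0.081
print(round(R))             # 919
```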
Theorem (the fundamental theorem for a discrete channel with noise)
Let a discrete channel have the capacity C and a discrete source the entropy per second H. If H ≤ C, there exists a coding system with an arbitrarily small frequency of errors (or an arbitrarily small equivocation Hy(x)) during transmission. If H > C, it is possible to encode the source so that the equivocation is less than H − C + ε, where ε is arbitrarily small. There is no method of encoding which gives an equivocation less than H − C.
Alex A Mathematical Theory of Communication 15 / 17
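A minimal illustration of trading rate for reliability, which the theorem says can be pushed much further: a 3× repetition code over a simulated binary symmetric channel with crossover probability 0.01 (the channel model and code here are illustrative assumptions, not Shannon's construction) drops the symbol error rate from about 10⁻² to roughly 3p²(1 − p) + p³ ≈ 3 × 10⁻⁴, at rate 1/3.

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel: flip each bit with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def encode3(bits):
    """Repeat each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode3(bits):
    """Majority vote over each group of three received bits."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

rng = random.Random(0)
msg = [rng.randint(0, 1) for _ in range(100_000)]
received = decode3(bsc(encode3(msg), 0.01, rng))
errors = sum(a != b for a, b in zip(msg, received))

# Analytically, P(error) = 3 p^2 (1 - p) + p^3 ≈ 0.000298 for p = 0.01.
print(errors / len(msg))
```

The theorem promises far more than this brute-force trade: arbitrarily small error at any rate below C, not just at rate 1/3.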
Shannon-Fano coding
Image: Wikimedia
Alex A Mathematical Theory of Communication 16 / 17
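Since the figure itself is not reproduced here, a compact sketch of the idea: sort the symbols by probability, then recursively split the list into two parts of roughly equal total probability, prefixing one part's codes with '0' and the other's with '1'. The greedy split rule below is a simplification, and the symbol names and probabilities are made up for illustration:

```python
def shannon_fano(symbols):
    """Shannon-Fano coding: assign prefix-free binary codes by recursively
    splitting the probability-sorted symbol list near half the total mass."""
    def split(items, prefix):
        if len(items) == 1:
            sym, _ = items[0]
            codes[sym] = prefix or "0"
            return
        total = sum(p for _, p in items)
        acc, cut = 0.0, 1
        for i, (_, p) in enumerate(items[:-1], start=1):
            acc += p
            cut = i
            if acc >= total / 2:    # greedy: stop once half the mass is reached
                break
        split(items[:cut], prefix + "0")
        split(items[cut:], prefix + "1")

    codes = {}
    ordered = sorted(symbols.items(), key=lambda kv: -kv[1])
    split(ordered, "")
    return codes

# Hypothetical source; for dyadic probabilities like these the average code
# length equals the entropy (1.75 bits/symbol).
codes = shannon_fano({"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125})
print(codes)  # {'A': '0', 'B': '10', 'C': '110', 'D': '111'}
```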
Thanks!