Information Theory Student



    Information theory


    Information theory and entropy

    • Information theory tries to solve the problem of communicating as much data as possible over a noisy channel

    • The measure of data is entropy

    • Claude Shannon first demonstrated that reliable communication over a noisy channel is possible (jump-started the digital age)


    Measure of Information

    • Information content of symbol si (in bits): -log2 p(si)

    • Examples:

      - p(si) = 1 has no information

      - smaller p(si) has more information, as it was unexpected or surprising


    • Related to predictability or scarcity value

    • The more predictable or probable a particular message, the less information conveyed by transmitting that message

    • Highly probable messages contain little information

      - P(message) = 1 carries zero information

      - P(message) = 0 carries infinite information

    • Information content of a message m:

      I(m) = log2(1/P(m)) = -log2 P(m)

      P(m) = 1  =>  I(m) = 0
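    A minimal sketch of this definition in Python (the helper name information_content is ours, not from the slides):

    ```python
    import math

    def information_content(p: float) -> float:
        """Information content of a message with probability p: log2(1/p) bits."""
        return math.log2(1 / p)

    print(information_content(1.0))    # 0.0: a certain message carries no information
    print(information_content(0.5))    # 1.0 bit
    print(information_content(0.125))  # 3.0 bits: rarer messages carry more information
    ```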


    Multisymbol alphabets

    Vocabulary size | No. of binary digits | Symbol probability
    ----------------|----------------------|-------------------
          2         |          1           |        1/2
          4         |          2           |        1/4
          8         |          3           |        1/8
          :         |          :           |         :
         128        |          7           |       1/128
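    The table's pattern, binary digits = log2(vocabulary size) and probability = 1/size for equiprobable symbols, can be checked in a few lines (a sketch, not from the slides):

    ```python
    import math

    # Each row: N equiprobable symbols need log2(N) binary digits,
    # and each symbol occurs with probability 1/N.
    for n in (2, 4, 8, 128):
        print(f"{n:>3}  {int(math.log2(n))}  1/{n}")
    ```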


    Entropy of a Binary Source

    • Entropy (H) = average amount of information conveyed per symbol

    • For an alphabet of size N, assuming that symbols are statistically independent:

      H = Σm P(m) log2(1/P(m))   bit/symbol

    • For a 2-symbol alphabet (0, 1), if we let P(1) = p then P(0) = 1 - p, and

      H = p log2(1/p) + (1 - p) log2(1/(1 - p))   bits/symbol
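    A sketch of both formulas in Python (the function names entropy and binary_entropy are ours):

    ```python
    import math

    def entropy(probs):
        """H = sum of P(m) * log2(1/P(m)) over all symbols, in bits per symbol."""
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    def binary_entropy(p):
        """Two-symbol special case: H = p*log2(1/p) + (1-p)*log2(1/(1-p))."""
        return entropy([p, 1 - p])

    print(binary_entropy(0.5))   # 1.0 bit/symbol: the equiprobable maximum
    print(binary_entropy(0.25))  # ~0.811 bits/symbol: the example on the next slide
    ```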


    Entropy Example

    • Alphabet = {A, B}; p(A) = 3/4, p(B) = 1/4

    • Compute Entropy (H):

      H = -(3/4) log2(3/4) - (1/4) log2(1/4) ≈ 0.81 bits

    • Maximum uncertainty (gives largest H) occurs when all probabilities are equal


    [Plot: binary entropy H versus P, rising from 0 at P = 0 to a maximum of 1 bit at P = 1/2 and back to 0 at P = 1]

    • Entropy is maximized when the symbols are equiprobable
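    The claim is easy to check numerically; a self-contained sketch (binary_entropy restates the formula from the binary-source slide):

    ```python
    import math

    def binary_entropy(p):
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    # Sweep H(p) over a grid; the peak sits at p = 1/2, where H = 1 bit/symbol.
    best_p = max((i / 100 for i in range(1, 100)), key=binary_entropy)
    print(best_p, binary_entropy(best_p))  # 0.5 1.0
    ```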


    • In the binary case, as either of the two messages becomes more likely, the entropy decreases


    Conditional Entropy and Redundancy

    • For sources in which each symbol selected is not statistically independent of all previous symbols (i.e. sources with memory), the joint and conditional statistics of symbol sequences must be considered

    • A source with a memory of one symbol has an entropy given by

      H = Σi Σj P(i,j) log2(1/P(j|i))   bit/symbol

      where P(i,j) = probability of the source selecting i and j, and
      P(j|i) = probability that the source will select j given that it previously selected i; thus the equation can be re-expressed as:

      H = Σi P(i) Σj P(j|i) log2(1/P(j|i))   bit/symbol
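    A sketch of the conditional-entropy formula in Python; the function name and the two-symbol transition probabilities below are illustrative assumptions, not from the slides:

    ```python
    import math

    def conditional_entropy(p_i, p_j_given_i):
        """H = sum over i of P(i) * sum over j of P(j|i) * log2(1/P(j|i)), bit/symbol."""
        return sum(
            p_i[i] * sum(p * math.log2(1 / p) for p in p_j_given_i[i].values() if p > 0)
            for i in p_i
        )

    # Illustrative source with memory: each symbol tends to repeat itself.
    p_i = {"A": 0.5, "B": 0.5}
    p_j_given_i = {"A": {"A": 0.9, "B": 0.1},
                   "B": {"A": 0.1, "B": 0.9}}
    print(conditional_entropy(p_i, p_j_given_i))  # ~0.469, below the 1.0 of a memoryless source
    ```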


    • Redundancy: the difference between the actual entropy of a source and the maximum entropy the source could have if its symbols were independent and equiprobable

    • E.g.: Find the entropy, redundancy and information rate of a four-symbol source (A, B, C, D) with a symbol rate of 10 symbols/s and symbol selection probabilities of 1/2, 1/4, 1/8 and 1/8 under the following conditions (a numeric sketch of part (a) follows below):

      (a) The source is memoryless (i.e. the source is statistically independent)

      (b) The source has a one-symbol memory such that no two consecutively selected symbols can be the same

      H = Σi P(i) Σj P(j|i) log2(1/P(j|i))   bit/symbol

      R = Hmax - H   bit/symbol
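    A numeric sketch of part (a), assuming the probabilities and rate as reconstructed above (1/2, 1/4, 1/8, 1/8 at 10 symbols/s); part (b) would apply the conditional-entropy formula above with the no-repeat constraint:

    ```python
    import math

    probs = [1/2, 1/4, 1/8, 1/8]                  # assumed symbol probabilities
    H = sum(p * math.log2(1 / p) for p in probs)  # 1.75 bits/symbol
    H_max = math.log2(len(probs))                 # 2.0 bits/symbol (equiprobable)
    R = H_max - H                                 # 0.25 bit/symbol of redundancy
    info_rate = 10 * H                            # 17.5 bits/s at 10 symbols/s
    print(H, H_max, R, info_rate)
    ```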


    Code Efficiency

    • A code efficiency can be defined as:

      η = (Hcode / Hmax) × 100%
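    As a sketch (the function name is ours), applied to the memoryless four-symbol source above:

    ```python
    def code_efficiency(h_code: float, h_max: float) -> float:
        """Code efficiency as a percentage: eta = (H_code / H_max) * 100%."""
        return 100.0 * h_code / h_max

    print(code_efficiency(1.75, 2.0))  # 87.5 (%)
    ```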