
Lecture 18: Gaussian Channel

• Gaussian channel

• Gaussian channel capacity

Dr. Yao Xie, ECE587, Information Theory, Duke University


Mona Lisa in AWGN

[Figure: two panels, "Mona Lisa" and "Noisy Mona Lisa": the original image and the same image corrupted by additive white Gaussian noise]


Gaussian channel

• the most important continuous alphabet channel: AWGN

• Yi = Xi + Zi, noise Zi ∼ N (0, N), independent of Xi

• a model for many communication channels: satellite links, wireless phones

[Block diagram: noise Zi is added to the input Xi to produce the output Yi = Xi + Zi]
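To make the model concrete, here is a minimal simulation sketch of one block of channel uses (numpy assumed; the values of P and N are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
n, P, N = 100_000, 1.0, 0.25        # channel uses, signal power, noise power (assumed)

X = rng.normal(0.0, np.sqrt(P), n)  # input; N(0, P) turns out to be capacity-achieving
Z = rng.normal(0.0, np.sqrt(N), n)  # noise Z_i ~ N(0, N), independent of X
Y = X + Z                           # AWGN channel: Y_i = X_i + Z_i

print(np.mean(X**2), np.mean(Y**2)) # empirical powers: ~P and ~P + N
```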


Channel capacity of AWGN

• intuition: C = log number of distinguishable inputs

• if N = 0, C = ∞

• if no power constraint on the input, C = ∞

• to make it more meaningful, impose an average power constraint: any codeword (x_1, ..., x_n) must satisfy

$$\frac{1}{n}\sum_{i=1}^{n} x_i^2 \le P$$


Naive way of using Gaussian channel

• Binary phase-shift keying (BPSK)

• transmit 1 bit over the channel

• 1 → +√P, 0 → −√P

• Y = ±√P + Z

• probability of error:

$$P_e = 1 - \Phi\left(\sqrt{P/N}\right)$$

where Φ is the standard normal cumulative distribution function (CDF):

$$\Phi(x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\, dt$$

• this converts the Gaussian channel into a discrete BSC with crossover probability p = Pe; information is lost in the quantization
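A short sketch computing this error probability, with a Monte Carlo cross-check (standard library only; P and N are assumed values):

```python
import math
import random

P, N = 1.0, 0.25                   # assumed signal power and noise variance

# Pe = 1 - Phi(sqrt(P/N)) = Q(sqrt(P/N)); use Q(x) = erfc(x / sqrt(2)) / 2
pe = 0.5 * math.erfc(math.sqrt(P / N) / math.sqrt(2))

# Monte Carlo: send +sqrt(P), decode by the sign of Y = sqrt(P) + Z
trials = 200_000
errors = sum(math.sqrt(P) + random.gauss(0.0, math.sqrt(N)) < 0
             for _ in range(trials))
print(pe, errors / trials)         # closed form vs. simulation, both ~0.0228
```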


Definition: Gaussian channel capacity

• by definition,

$$C = \max_{f(x)\,:\,E[X^2] \le P} I(X;Y)$$

• evaluating this maximization gives

$$C = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)$$

with the maximum attained when X ∼ N(0, P)
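A small numeric sketch (assumed values) that also quantifies the previous slide's point: hard-decision BPSK turns the channel into a BSC whose capacity 1 − H(Pe) falls short of (1/2) log(1 + P/N):

```python
import math

def awgn_capacity(P: float, N: float) -> float:
    """C = (1/2) log2(1 + P/N), bits per channel use."""
    return 0.5 * math.log2(1.0 + P / N)

def bsc_capacity(p: float) -> float:
    """Capacity of a BSC with crossover probability p: 1 - H(p)."""
    return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

P, N = 1.0, 0.25                                      # assumed values
pe = 0.5 * math.erfc(math.sqrt(P / N) / math.sqrt(2)) # BPSK error probability
print(awgn_capacity(P, N))                            # ~1.16 bits per use
print(bsc_capacity(pe))                               # ~0.84: quantization loses information
```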


C as maximum data rate

• we can also show that this C is the supremum of the rates achievable over the AWGN channel

• definition: a rate R is achievable for the Gaussian channel with power constraint P if there exists a sequence of (2^{nR}, n) codes whose maximal probability of error

$$\lambda^{(n)} = \max_{1 \le i \le 2^{nR}} \lambda_i \to 0 \quad \text{as } n \to \infty$$


Sphere packing

why can we construct (2^{nC}, n) codes with a small probability of error?

Fix one codeword

• consider any codeword of length n

• received vector ∼ N(true codeword, N·I)

• with high probability, the received vector is contained in a sphere of radius √(n(N + ε)) around the true codeword (see the sketch below)

• assign each ball a codeword
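A quick Monte Carlo check of this concentration claim (numpy assumed; taking the true codeword at the origin, the squared distance of the received vector from it is just ‖Z‖²):

```python
import numpy as np

rng = np.random.default_rng(0)
n, N, eps, trials = 2000, 1.0, 0.1, 2000  # dimension, noise power, slack (assumed)

Z = rng.normal(0.0, np.sqrt(N), size=(trials, n))  # noise around the codeword
inside = np.mean(np.sum(Z**2, axis=1) <= n * (N + eps))
print(inside)  # fraction inside radius sqrt(n(N + eps)); very close to 1
```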


Consider all codewords

• with the power constraint, with high probability the received vectors lie in a sphere of radius √(n(P + N))

• volume of an n-dimensional sphere of radius r is C_n·r^n for a constant C_n

• how many codewords can we pack in a “total power sphere”?

$$\frac{C_n\,(n(P+N))^{n/2}}{C_n\,(nN)^{n/2}} = \left(1 + \frac{P}{N}\right)^{n/2}$$

• rate of this codebook = log2(number of codewords)/n:

$$\frac{1}{n}\log_2\left(1 + \frac{P}{N}\right)^{n/2} = \frac{1}{2}\log_2\left(1 + \frac{P}{N}\right)$$
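A tiny sanity check of this counting argument (illustrative values):

```python
import math

P, N, n = 1.0, 0.25, 100                  # assumed power, noise power, blocklength

num_balls = (1.0 + P / N) ** (n / 2)      # (1 + P/N)^(n/2) disjoint decoding balls
rate = math.log2(num_balls) / n           # bits per channel use
print(rate, 0.5 * math.log2(1.0 + P / N)) # both ~1.161
```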


Sphere packing

[Figure: sphere-packing illustration: small decoding spheres of radius √(nN) packed inside the sphere of radius √(n(P + N)) of likely received vectors]


Gaussian channel capacity theorem

Theorem. The capacity of a Gaussian channel with power constraint P and noise variance N is

$$C = \frac{1}{2}\log\left(1 + \frac{P}{N}\right) \quad \text{bits per transmission}$$

Proof: 1) achievability; 2) converse


New stuff in proof

Achievability:

• codeword elements generated i.i.d. according to X_j(i) ∼ N(0, P − ε), so

$$\frac{1}{n}\sum_{j=1}^{n} X_j^2(i) \to P - \epsilon$$

• power outage error event

$$E_0 = \left\{\frac{1}{n}\sum_{j=1}^{n} X_j^2(1) > P\right\}$$

has small probability by the law of large numbers (simulated in the sketch below)
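A small simulation of the outage event E_0 (numpy assumed; ε, n, and the number of codewords are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
P, eps, n, codewords = 1.0, 0.1, 2000, 5000   # assumed values

# Entries i.i.d. N(0, P - eps); check how often a codeword violates the constraint.
X = rng.normal(0.0, np.sqrt(P - eps), size=(codewords, n))
outage = np.mean(np.mean(X**2, axis=1) > P)
print(outage)   # tiny, and -> 0 as n grows, by the law of large numbers
```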

Converse: uses the fact that the Gaussian distribution has maximum entropy among all distributions with a given variance


Band-limited channels

• more realistic channel model: band-limited continuous AWGN

Y(t) = (X(t) + Z(t)) ∗ h(t)

(∗ denotes convolution)

• Nyquist-Shannon sampling theorem: sampling a signal band-limited to W Hz once every 1/(2W) seconds (i.e., at 2W samples per second) is sufficient to reconstruct the signal from its samples

"two samples per period" of the highest frequency (illustrated in the sketch below)
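A rough numerical illustration of reconstruction by sinc interpolation at rate 2W (numpy assumed; the sample window is finite, so agreement is only approximate and degrades near the window edges):

```python
import numpy as np

W = 4.0                          # bandwidth in Hz (assumed)
T = 1.0 / (2.0 * W)              # sampling interval 1/(2W): 2W samples per second

def x(t):
    # test signal band-limited to W: tones at 1 Hz and 3 Hz, both below W
    return np.cos(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 3.0 * t)

n = np.arange(-200, 201)         # finitely many samples x(nT)
samples = x(n * T)

t = np.linspace(-1.0, 1.0, 101)  # evaluate well inside the sample window
# Whittaker-Shannon interpolation: xhat(t) = sum_n x(nT) sinc((t - nT)/T)
xhat = samples @ np.sinc((t[None, :] - (n * T)[:, None]) / T)
print(np.max(np.abs(xhat - x(t))))  # small error, limited by the finite window
```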


Capacity of continuous-time band-limited AWGN

• noise has power spectral density N0/2 watts/hertz, bandwidth W hertz, noise power = N0W

• signal power P watts

• 2W samples each second

• channel capacity

$$C = W\log\left(1 + \frac{P}{N_0 W}\right) \quad \text{bits per second}$$

• when W → ∞, C → (P/N0) log2 e bits per second
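A short check of the formula and its wideband limit (P and N0 are assumed values):

```python
import math

P, N0 = 1.0, 0.01                    # signal power and noise PSD level (assumed)

def c_bandlimited(W: float) -> float:
    """C = W log2(1 + P/(N0 W)), bits per second."""
    return W * math.log2(1.0 + P / (N0 * W))

for W in (10.0, 100.0, 1000.0, 1e6):
    print(W, c_bandlimited(W))       # increases with W ...
print((P / N0) * math.log2(math.e))  # ... toward the limit (P/N0) log2(e) ~ 144.27
```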


Telephone line

• telephone signals are band-limited to 3300 Hz

• SNR = 33 dB: P/(N0W) ≈ 2000

• capacity ≈ 36 kb/s

• practical modems achieve transmission rates up to 33.6 kb/s uplink and downlink

• V.90 modems achieve 56 kb/s downlink (asymmetric data rates)
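The 36 kb/s figure can be reproduced directly:

```python
import math

W = 3300.0                  # telephone bandwidth in Hz
snr = 10 ** (33.0 / 10)     # 33 dB ~ 2000

C = W * math.log2(1.0 + snr)
print(C)                    # ~36,200 bits per second, i.e. about 36 kb/s
```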


Summary

• additive white Gaussian noise (AWGN) channel

• noise power N, signal power constraint P, capacity

$$C = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)$$

• band-limited channel with bandwidth = W

$$C = W\log\left(1 + \frac{P}{N_0 W}\right)$$
