
Page 1:

Quantum Shannon Theory

Patrick Hayden (McGill)

http://www.cs.mcgill.ca/~patrick/QLogic2005.ppt

17 July 2005, Q-Logic Meets Q-Info

Page 2:

Overview

Part I: What is Shannon theory? What does it have to do with quantum mechanics? Some quantum Shannon theory highlights

Part II: Resource inequalities, a skeleton key

Page 3:

Information (Shannon) theory

A practical question: How to best make use of a given communications resource?

A mathematico-epistemological question: How to quantify uncertainty and information?

Shannon: Solved the first by considering the second. A mathematical theory of communication [1948]


Page 4:

Quantifying uncertainty

Entropy: H(X) = -∑_x p(x) log₂ p(x)

Proportional to the entropy of statistical physics. Term suggested by von Neumann (more on him soon).

Can arrive at the definition axiomatically: H(X,Y) = H(X) + H(Y) for independent X, Y, etc.

Operational point of view…

Page 5:

X₁ X₂ … Xₙ

Compression

Source of independent copies of X

{0,1}^n: 2^n possible strings

2^{nH(X)} typical strings

If X is binary: 0000100111010100010101100101, with about nP(X=0) 0's and nP(X=1) 1's

Can compress n copies of X to a binary string of length ~nH(X)
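To make the counting concrete, here is a minimal Python sketch (not part of the original slides; the bias P(X=1) = 0.11 and the length n = 1000 are made-up illustrative values) comparing the number of all strings with the number of typical strings:

import numpy as np

def shannon_entropy(probs):
    # H(X) = -sum_x p(x) log2 p(x); zero-probability outcomes contribute nothing
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical biased binary source
p = [0.89, 0.11]      # P(X=0), P(X=1)
n = 1000
H = shannon_entropy(p)

print(f"H(X) = {H:.3f} bits")
print(f"All strings:     2^{n}")
print(f"Typical strings: about 2^{n * H:.0f}")
print(f"So n = {n} copies compress to roughly {n * H:.0f} bits instead of {n}.")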

Page 6:

Quantifying information

[Venn diagram of H(X), H(Y), H(X,Y), H(X|Y), H(Y|X), and I(X;Y)]

Information is that which reduces uncertainty.

Uncertainty in X when the value of Y is known:
H(X|Y) = H(X,Y) - H(Y) = E_Y H(X|Y=y)

I(X;Y) = H(X) - H(X|Y) = H(X) + H(Y) - H(X,Y)
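These quantities are easy to compute numerically. A short Python sketch (the joint distribution below is a made-up example, not from the slides):

import numpy as np

def H(p):
    # Shannon entropy in bits of any probability array; zeros are skipped
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical joint distribution p(x, y): rows index X, columns index Y
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])

H_X, H_Y, H_XY = H(pxy.sum(axis=1)), H(pxy.sum(axis=0)), H(pxy)
H_X_given_Y = H_XY - H_Y              # H(X|Y) = H(X,Y) - H(Y)
I_XY = H_X + H_Y - H_XY               # I(X;Y) = H(X) + H(Y) - H(X,Y)

print(f"H(X) = {H_X:.3f}, H(Y) = {H_Y:.3f}, H(X,Y) = {H_XY:.3f}")
print(f"H(X|Y) = {H_X_given_Y:.3f}, I(X;Y) = {I_XY:.3f}")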

Page 7:

Sending information through noisy channels

Statistical model of a noisy channel N: transition probabilities p(y|x)

m → Encoding → N → Decoding → m'

Shannon's noisy coding theorem: In the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through N is given by the formula C(N) = max_{p(X)} I(X;Y).
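As a concrete instance of this formula, here is a Python sketch that brute-forces max I(X;Y) for a binary symmetric channel; the channel choice and the flip probability f = 0.1 are my own illustrative assumptions, and the result can be checked against the known closed form 1 - h₂(f):

import numpy as np

def h2(p):
    # binary entropy in bits
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def bsc_capacity(f, grid=10001):
    # maximize I(X;Y) = H(Y) - H(Y|X) over the input distribution q = P(X=1)
    best = 0.0
    for q in np.linspace(0.0, 1.0, grid):
        p_y1 = q * (1 - f) + (1 - q) * f    # P(Y=1)
        best = max(best, h2(p_y1) - h2(f))  # H(Y|X) = h2(f) for every input
    return best

f = 0.1   # made-up flip probability
print(f"max I(X;Y) over inputs: {bsc_capacity(f):.4f} bits per use")
print(f"closed form 1 - h2(f):  {1 - h2(f):.4f} bits per use")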

Page 8:

Shannon theory provides

Practically speaking: A holy grail for error-correcting codes

Conceptually speaking: An operationally-motivated way of thinking about correlations

What’s missing (for a quantum mechanic)? Features from linear structure:

Entanglement and non-orthogonality

Page 9:

Quantum Shannon Theory provides

General theory of interconvertibility between different types of communications resources: qubits, cbits, ebits, cobits, sbits…

Relies on a major simplifying assumption: computation is free

Minor simplifying assumption: noise and data have regular structure

Page 10:

Quantifying uncertainty

Let ρ = ∑_x p(x) |x⟩⟨x| be a density operator. von Neumann entropy:

H(ρ) = -tr[ρ log ρ]

Equal to the Shannon entropy of the eigenvalues of ρ.

Analog of a joint random variable: ρ_AB describes a composite system AB

H(A) = H(ρ_A) = H(tr_B ρ_AB)
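A minimal Python sketch of these definitions (the density operators below are made-up examples): the von Neumann entropy is computed from eigenvalues, and the reduced state from a partial trace.

import numpy as np

def von_neumann_entropy(rho):
    # H(rho) = -tr[rho log2 rho] = Shannon entropy of the eigenvalues
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def partial_trace_B(rho_AB, dA, dB):
    # rho_A = tr_B rho_AB for a composite system of dimensions dA x dB
    return np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

rho = np.diag([0.75, 0.25])                      # hypothetical single-qubit state
print("H(rho) =", round(von_neumann_entropy(rho), 4))

rho_AB = np.kron(rho, np.diag([0.5, 0.5]))       # hypothetical product state on AB
rho_A = partial_trace_B(rho_AB, 2, 2)
print("H(A) = H(tr_B rho_AB) =", round(von_neumann_entropy(rho_A), 4))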

Page 11:


Compression

Source of independent copies of ρ_AB

dim(effective support of ρ_B^{⊗n}) ~ 2^{nH(B)} (aka the typical subspace)

Can compress n copies of B to a system of ~nH(B) qubits while preserving correlations with A

No statistical assumptions: just quantum mechanics!

[Schumacher, Petz]

Page 12:

Quantifying information

[Venn diagram of H(A), H(B), H(AB), H(A|B), H(B|A)]

Uncertainty in A when the value of B is known?
H(A|B) = H(AB) - H(B)

Example: |ψ⟩_AB = (|0⟩_A|0⟩_B + |1⟩_A|1⟩_B)/√2
ρ_B = I/2
H(A|B) = 0 - 1 = -1

Conditional entropy can be negative!
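A quick numerical check of this example in Python (only the slide's Bell-state example, no new assumptions):

import numpy as np

def S(rho):
    # von Neumann entropy in bits
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

# |psi> = (|00> + |11>)/sqrt(2), A = first qubit, B = second qubit
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(psi, psi)

# trace out A to get rho_B = I/2
rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)

print("H(AB) =", round(S(rho_AB), 4))                             # 0 (pure state)
print("H(B)  =", round(S(rho_B), 4))                              # 1 (maximally mixed qubit)
print("H(A|B) = H(AB) - H(B) =", round(S(rho_AB) - S(rho_B), 4))  # -1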

Page 13:

Quantifying information

[Venn diagram of H(A), H(B), H(AB), H(A|B), H(B|A), I(A;B)]

Information is that which reduces uncertainty.

Uncertainty in A when the value of B is known?
H(A|B) = H(AB) - H(B)

I(A;B) = H(A) - H(A|B) = H(A) + H(B) - H(AB) ≥ 0

Page 14:

Data processing inequality (strong subadditivity)

[Diagram: Alice holds A, Bob holds B; over time Bob applies a local operation (an isometry U followed by discarding part of its output), turning B into B']

I(A;B) ≥ I(A;B')
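A small numerical sanity check of the inequality in Python. The local operation is a Pauli noise channel acting on B alone, and the particular state and noise level are made-up values for illustration:

import numpy as np

def S(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def reduced(rho_AB, keep):
    # partial trace of a two-qubit state; keep=0 returns rho_A, keep=1 returns rho_B
    r = rho_AB.reshape(2, 2, 2, 2)
    return np.trace(r, axis1=1, axis2=3) if keep == 0 else np.trace(r, axis1=0, axis2=2)

def mutual_info(rho_AB):
    return S(reduced(rho_AB, 0)) + S(reduced(rho_AB, 1)) - S(rho_AB)

def pauli_noise_on_B(rho_AB, p):
    # with probability 1-p do nothing; with probability p/3 each apply X, Y or Z to qubit B
    I2 = np.eye(2)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]])
    Z = np.diag([1.0, -1.0]).astype(complex)
    out = (1 - p) * rho_AB.astype(complex)
    for P in (X, Y, Z):
        K = np.kron(I2, P)
        out += (p / 3) * K @ rho_AB @ K.conj().T
    return out

psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = 0.9 * np.outer(psi, psi) + 0.1 * np.eye(4) / 4   # slightly noisy Bell state (made up)

before = mutual_info(rho)
after = mutual_info(pauli_noise_on_B(rho, 0.2))
print(f"I(A;B) before: {before:.4f}, after local noise on B: {after:.4f}")
assert after <= before + 1e-9   # data processing: local operations cannot increase I(A;B)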

Page 15:

Sending classical information through noisy channels

Physical model of a noisy channel N: a trace-preserving, completely positive map

m → Encoding (state preparation) → N^{⊗n} → Decoding (measurement) → m'

HSW noisy coding theorem: In the limit of many uses, the optimal rate at which Alice can send messages reliably to Bob through N is given by the (regularization of the) formula

C(N) = max_{{p(x), ρ_x}} I(X;B),

where the mutual information is evaluated on the classical-quantum state σ_XB = ∑_x p(x) |x⟩⟨x|_X ⊗ N(ρ_x).

Page 16:

Sending classical information through noisy channels

m → Encoding (state preparation) → N^{⊗n} → Decoding (measurement) → m'

[Packing picture: the typical subspace of B^{⊗n} has dimension ~2^{nH(B)}; the output cloud generated by each codeword X₁, X₂, …, Xₙ occupies a subspace of dimension ~2^{nH(B|A)}]
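The counting behind this picture is the same as in the classical case; written out as a worked equation (a restatement of the figure's quantities, not an addition to the slide's claims):

\[
\frac{2^{nH(B)}}{2^{nH(B|A)}}
= 2^{\,n[H(B)-H(B|A)]}
= 2^{\,nI(A;B)}
\quad\text{reliably distinguishable codewords.}
\]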

Page 17:

Sending quantum information through noisy channels

Physical model of a noisy channel N: a trace-preserving, completely positive map

|ψ⟩ ∈ C^d → Encoding (TPCP map) → N^{⊗n} → Decoding (TPCP map)

LSD noisy coding theorem: In the limit of many uses, the optimal rate at which Alice can reliably send qubits to Bob ((1/n) log d) through N is given by the (regularization of the) formula

Q(N) = max_ρ [H(B) - H(AB)] = max_ρ [-H(A|B)],

where the entropies are evaluated on (I_A ⊗ N)(φ_ρ) for a purification φ_ρ of the input ρ. A conditional entropy!

Page 18:

Entanglement and privacy: More than an analogy

How to send a private message from Alice to Bob?

Wiretap channel p(y,z|x): Alice sends x = x₁x₂…xₙ, Bob receives y = y₁y₂…yₙ, the eavesdropper receives z = z₁z₂…zₙ

[AC93] Can send private messages at rate I(X;Y) - I(X;Z)

[Code picture: from the set of all strings x, choose 2^{n(I(X;Y)-ε)} codewords at random and group them into sets of size 2^{n(I(X;Z)+ε)}]

Page 19:

Entanglement and privacy: More than an analogy

How to send a private message from Alice to Bob?

The channel as an isometry U_{A'→BE}^{⊗n}: the input |x⟩_{A'} is mapped to |ψ_x⟩_{BE} = U^{⊗n}|x⟩

[D03] Can send private messages at rate I(X;A) - I(X;E)

[Code picture: from the set of all strings x, choose 2^{n(I(X;A)-ε)} codewords at random and group them into sets of size 2^{n(I(X;E)+ε)}]

Page 20:

Entanglement and privacy: More than an analogy

How to send a private message from Alice to Bob?

U_{A'→BE}^{⊗n}: ∑_x p_x^{1/2} |x⟩_A |x⟩_{A'} ↦ ∑_x p_x^{1/2} |x⟩_A |ψ_x⟩_{BE}

[SW97, D03] Can send private messages at rate I(X;A) - I(X;E) = H(A) - H(E)

H(E) = H(AB)

[Code picture: from the set of all strings x, choose 2^{n(I(X;A)-ε)} codewords at random and group them into sets of size 2^{n(I(X;E)+ε)}]
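The identity H(E) = H(AB) follows from purity of the global state; a short worked derivation in LaTeX (only restating the slide's own quantities):

\[
|\Phi\rangle_{ABE} = \sum_x p_x^{1/2}\,|x\rangle_A\,|\psi_x\rangle_{BE}
\ \text{ is pure}
\;\Longrightarrow\;
H(E) = H(AB),
\]
\[
\text{so the private rate } H(A) - H(E) = H(A) - H(AB) = -\,H(B|A),
\ \text{again a (negative) conditional entropy.}
\]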

Page 21:

Notions of distinguishability

Basic requirement: quantum channels do not increase “distinguishability”

Fidelity:
F(ρ,σ) = max |⟨φ_ρ|φ_σ⟩|², maximized over purifications φ_ρ of ρ and φ_σ of σ
F(ρ,σ) = {tr[(ρ^{1/2} σ ρ^{1/2})^{1/2}]}²
F = 0 for perfectly distinguishable, F = 1 for identical
F(N(ρ), N(σ)) ≥ F(ρ,σ)

Trace distance:
T(ρ,σ) = ||ρ - σ||₁
T(ρ,σ) = 2 max |p(k=0|ρ) - p(k=0|σ)|, where the max is over POVMs {M_k}
T = 2 for perfectly distinguishable, T = 0 for identical
T(ρ,σ) ≥ T(N(ρ), N(σ))

Statements made today hold for both measures
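Both measures are straightforward to evaluate numerically. A minimal Python sketch (the two single-qubit states are made-up examples):

import numpy as np

def psd_sqrt(m):
    # square root of a positive semidefinite Hermitian matrix via eigendecomposition
    w, v = np.linalg.eigh(m)
    w = np.clip(w, 0.0, None)
    return (v * np.sqrt(w)) @ v.conj().T

def fidelity(rho, sigma):
    # F(rho, sigma) = ( tr sqrt( sqrt(rho) sigma sqrt(rho) ) )^2
    sr = psd_sqrt(rho)
    return float(np.real(np.trace(psd_sqrt(sr @ sigma @ sr))) ** 2)

def trace_distance(rho, sigma):
    # T(rho, sigma) = || rho - sigma ||_1 = sum of |eigenvalues| of the difference
    ev = np.linalg.eigvalsh(rho - sigma)
    return float(np.sum(np.abs(ev)))

rho = np.diag([1.0, 0.0])       # |0><0|
sigma = np.diag([0.9, 0.1])     # slightly mixed hypothetical state

print("F =", round(fidelity(rho, sigma), 4))        # 1 = identical, 0 = perfectly distinguishable
print("T =", round(trace_distance(rho, sigma), 4))  # 0 = identical, 2 = perfectly distinguishable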

Page 22:

Conclusions: Part I

Information theory can be generalized to analyze quantum information processing

Yields a rich theory of surprising conceptual simplicity

Operational approach to thinking about quantum mechanics: compression, data transmission, superdense coding, subspace transmission, teleportation

Page 23:

Some references:

Part I: Standard textbooks:
* Cover & Thomas, Elements of Information Theory.
* Nielsen & Chuang, Quantum Computation and Quantum Information. (and references therein)

Part II: Papers available at arxiv.org:
* Devetak, The private classical capacity and quantum capacity of a quantum channel, quant-ph/0304127.
* Devetak, Harrow & Winter, A family of quantum protocols, quant-ph/0308044.
* Horodecki, Oppenheim & Winter, Quantum information can be negative, quant-ph/0505062.