Generalised probabilistic theories and the extension complexity of polytopes


Serge Massar

From Foundations to Combinatorial Optimisation

• Physical theories: classical, quantum, generalised probabilistic theories (GPT)
• Factorisation of the communication / slack matrix: linear, SDP, conic
• Extended formulations: linear, SDP, conic
• Polytopes & combinatorial optimisation
• Communication complexity


M. Yannakakis, Expressing Combinatorial Problems by Linear Programs, STOC 1988.
J. Gouveia, P. A. Parrilo, R. R. Thomas, Lifts of Convex Sets and Cone Factorizations, Math. Oper. Res. 2013.
S. Fiorini, S. Massar, S. Pokutta, H. R. Tiwary, R. de Wolf, Linear vs. Semidefinite Extended Formulations: Exponential Separation and Strong Lower Bounds, STOC 2012.
S. Fiorini, S. Massar, M. K. Patra, H. R. Tiwary, Generalised probabilistic theories and the extension complexity of polytopes, arXiv:1310.4125.


Generalised Probabilistic Theories

• Minimal framework in which to build physical theories:
– States = a convex set
– Measurements: predict the probability of each outcome
• Adding axioms restricts the framework to classical or quantum theory.
– Aim: find «natural» axioms for quantum theory (Fuchs, Brassard, Hardy, Barrett, Masanes & Müller, D'Ariano et al., ...).
• GPTs with «unphysical» behaviour → rule them out:
– PR boxes make communication complexity trivial (van Dam '05).
– Correlations that violate the Tsirelson bound violate Information Causality (Pawłowski et al. '09).

A bit of geometry

Generalised Probabilistic Theories

• A mixture of states is again a state – the state space is convex.
• The theory predicts the probability of each measurement outcome.

Generalised Probabilistic Theory GPT(C,u)

• Space of unnormalised states = a cone C ⊂ R^n; states are vectors w ∈ C.
• Effects belong to the dual cone C* = {e : ⟨e, w⟩ ≥ 0 for all w ∈ C}.
• Normalisation:
– Unit: a fixed u ∈ C*.
– Normalised state: ⟨u, w⟩ = 1.
– Measurement: a collection of effects {e_i} ⊂ C* with Σ_i e_i = u.
– Probability of outcome i: P(i) = ⟨e_i, w⟩.

[Figure: the cone C, its dual cone C*, the unit u, and the hyperplane of normalised states.]

Classical Theory

• Cone C = R^n_+, the positive orthant; unit u = (1,1,…,1).
• Normalised state w = (p1, p2, …, pn): a probability distribution over the possible states.
• Canonical measurement {e_i}, with e_i = (0,…,0,1,0,…,0).
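As a quick sanity check (standard, not spelled out on the slide), the GPT rules reproduce classical probability for GPT(R^n_+, u):

\[
\langle u, w \rangle = \sum_{i=1}^{n} p_i = 1, \qquad P(i) = \langle e_i, w \rangle = p_i .
\]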

Quantum Theory

• Cone C = cone of positive semidefinite matrices; unit u = I, the identity matrix.
• Normalised states = density matrices.
• Measurements = POVMs.
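Here the pairing is the trace inner product, so the same rules reduce to the Born rule (standard, stated for concreteness):

\[
\langle u, \rho \rangle = \mathrm{Tr}\,\rho = 1, \qquad P(i) = \mathrm{Tr}(E_i\,\rho), \qquad \sum_i E_i = I .
\]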

Lorentz Cone / Second Order Cone Programming

• C_SOCP = {x = (x_0, x_1, …, x_n) : x_1^2 + x_2^2 + … + x_n^2 ≤ x_0^2, x_0 ≥ 0}.
• The Lorentz cone has a natural SDP formulation → it is a subcone of the cone of SDP matrices.
• It can be approximated arbitrarily well using linear inequalities.
• LPs ⊂ SOCPs ⊂ SDPs: each class of programs generalises the previous one.
• Status?
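The "natural SDP formulation" mentioned above can be written explicitly via an arrow matrix (a well-known representation, added here for concreteness):

\[
(x_0, \bar{x}) \in C_{\mathrm{SOCP}} \iff
\begin{pmatrix} x_0 & \bar{x}^{\mathsf{T}} \\ \bar{x} & x_0 I_n \end{pmatrix} \succeq 0 ,
\]

which follows from the Schur complement: for x_0 > 0 the matrix is positive semidefinite iff x_0^2 I_n ⪰ x̄ x̄^T, i.e. ‖x̄‖ ≤ x_0.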

Completely Positive and Co-positive Cones

• Open question: other interesting families of cones?
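The slides rely on the standard definitions of these two cones (restated here; they are also what the Bürer reduction below uses):

\[
\mathcal{CP}_n = \mathrm{conv}\{\, x x^{\mathsf{T}} : x \in \mathbb{R}^n_{+} \,\}, \qquad
\mathcal{COP}_n = \{\, A = A^{\mathsf{T}} : y^{\mathsf{T}} A y \ge 0 \ \text{for all } y \in \mathbb{R}^n_{+} \,\},
\]

each of which is the dual cone of the other.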

One-way communication complexity

[Figure: Alice receives input a and sends the GPT state w(a); Bob receives input b, performs the measurement M(b), and outputs the result r.]

Classical Capacity

• Holevo-type theorem:
– How much classical information can be stored in a GPT state? Max I(A:R)?
– At most log(n) bits can be stored in a GPT state when the cone C lives in R^n.

[Figure: Alice encodes a into the state w(a); Bob measures M and obtains the result r.]

Proof 1: Refining Measurements

Generalised Probabilistic Theory GPT(C,u)
• States: w ∈ C, ⟨u, w⟩ = 1. • Measurement: {e_i} ⊂ C*, Σ_i e_i = u.
• Refining measurements:
– If e_i = p f_i + (1-p) g_i with f_i, g_i ∈ C* and 0 < p < 1,
– then we can refine the measurement to contain the effect p f_i and the effect (1-p) g_i rather than e_i.
• Theorem: measurements can be refined so that all effects are extreme points of C* (Krein-Milman theorem).

Proof 2: Extremal Measurements

Generalised Probabilistic Theory GPT(C,u)
• States: w ∈ C, ⟨u, w⟩ = 1. • Measurement: {e_i} ⊂ C*, Σ_i e_i = u.
• Convex combinations of measurements:
– M1 = {e_i} & M2 = {f_i}
– p M1 + (1-p) M2 = {p e_i + (1-p) f_i}
• If a measurement M has m > n outcomes:
– Carathéodory: there exists a subset S of at most n effects and weights μ_i ≥ 0 such that u = Σ_{i∈S} μ_i e_i.
– Hence M = p M1 + (1-p) M2, where M1 has n outcomes and M2 has m-1 outcomes.
• By recurrence: every measurement can be written as a convex combination of measurements with at most n effects.
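One standard way to realise such a decomposition (my reconstruction of the elided algebra; the slide states only the result) uses a linear dependence among the effects. If m > n, the effects e_1, …, e_m ∈ R^n are linearly dependent, so Σ_i λ_i e_i = 0 for some λ ≠ 0, and

\[
M = \tfrac{1}{2} M_{+} + \tfrac{1}{2} M_{-}, \qquad
M_{\pm} = \{ (1 \pm \varepsilon \lambda_i)\, e_i \}_{i=1}^{m} .
\]

For ε ≤ 1/max_i |λ_i| every coefficient 1 ± ελ_i is non-negative, so the rescaled effects stay in C* and still sum to u ± ε Σ_i λ_i e_i = u; choosing ε = 1/max_i |λ_i| makes one effect of M_+ or M_- vanish, giving a component measurement with at most m−1 outcomes, and the argument recurses.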

Proof 3: Classical Capacity of GPT

• Holevo-type theorem for GPT(C,u) with C ⊂ R^n:
– Refining a measurement, or decomposing it into a convex combination, can only increase the capacity of the channel.
– Capacity of the channel ≤ log(# of measurement outcomes), hence capacity ≤ log(n) bits.
• OPEN QUESTION:
– Get better bounds on the classical capacity for specific theories?

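Spelled out, the chain of inequalities behind Proof 3 is the standard information-theoretic one (added for completeness): once the measurement is refined and decomposed so that it has at most n outcomes, the outcome R of a measurement on a state encoding A satisfies

\[
I(A : R) \;\le\; H(R) \;\le\; \log(\#\,\text{outcomes}) \;\le\; \log n .
\]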


Randomised one-way communication complexity with positive outcomes

Theorem: Randomised one-way communication with positive outcomes, using GPT(C,u) states and one bit of classical communication, produces on average C_ab on inputs a, b if and only if the matrix C = (C_ab) admits a cone factorisation over C.

[Figure: Alice receives a and sends the state w(a) together with one classical bit in {0,1}; Bob receives b and outputs a non-negative value r(i,b) ≥ 0.]
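Concretely, the cone factorisation in the theorem means (notation as in GPT(C,u) above; the explicit formula was an image in the transcript):

\[
C_{ab} = \langle u_b, w_a \rangle \qquad \text{with } w_a \in C, \ u_b \in C^{*} ,
\]

where Alice's message encodes w_a and Bob's measurement-plus-postprocessing implements u_b, so the expectation of his non-negative output equals C_ab.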

Different Cone factorisations

• The same theorem, instantiated for different cones C, captures different resources: linear factorisations ↔ classical communication, SDP factorisations ↔ quantum communication, general cone factorisations ↔ GPT(C,u) communication.

Physical Theories

• Classical• Quantum• Generalised Probablisitic

Theories (GPT)

Factorisation ofCommunication / Slack Matrix

• Linear• SDP• Conic

ExtendedFormulations

• linear• SDP• Conic

Polytopes& Combinat.Optimisation

Comm.Complexity

From Foundations to Combinatorial Optimisation

Background: solving NP by LP?

• Famous P problem: linear programming (Khachiyan '79).
• Famous NP-hard problem: the travelling salesman problem (TSP).
• A polynomial-size LP for TSP would show P = NP.
• Swart ('86-'87) claimed to have found such LPs.
• Yannakakis ('88) showed that any symmetric LP for TSP needs exponential size.
• Swart's LPs were symmetric, so they couldn't work.
• 20-year open problem: what about non-symmetric LPs?
• There are examples where non-symmetry helps a lot (Kaibel '10).
• Any LP for TSP needs exponential size (Fiorini et al. '12).

Polytope

• P = conv{vertices v} = {x : A_e x ≥ b_e for every facet e} (the convention used for the slack matrix below).

[Figure: a polytope with a vertex v and a facet e marked.]

Combinatorial Polytopes

• Travelling Salesman Problem (TSP) polytope:
– R^{n(n-1)/2}: one coordinate per edge of the complete graph on n vertices.
– Cycle C: characteristic vector v_C = (1,0,0,1,1,…,0).
– P_TSP = conv{v_C}.
– Shortest cycle: minimise the linear function Σ_e c_e x_e over P_TSP, where c_e are the edge lengths.
• Correlation polytope COR(n) (defined below).
• Bell polytope with 2 parties, N settings, 2 outcomes.
• Linear optimisation over these polytopes is NP-hard.
• Deciding whether a point belongs to the polytope is NP-hard.
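For reference, the definition of the correlation polytope used throughout (it matches arXiv:1310.4125; the formula itself survived only as an image):

\[
\mathrm{COR}(n) = \mathrm{conv}\{\, b\, b^{\mathsf{T}} \;:\; b \in \{0,1\}^{n} \,\} \subset \mathbb{R}^{n \times n} .
\]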


Extended Formulations

• View the polytope as the projection of a simpler object in a higher-dimensional space.

[Figure: the projection p maps the extended formulation Q onto the polytope P.]

Linear Extensions: the higher-dimensional object is a polytope.

[Figure: the projection p maps Q onto P.]

Size of a linear extended formulation = # of facets of Q.
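A standard example of the savings this buys (not from the slides): the cross-polytope {x ∈ R^d : Σ_i |x_i| ≤ 1} has 2^d facets, yet it is the projection of a polytope with only 2d+1 facets,

\[
Q = \{ (x, y) \in \mathbb{R}^{2d} : -y_i \le x_i \le y_i \ (i = 1,\dots,d), \ \textstyle\sum_i y_i \le 1 \},
\]

since projecting out the y variables recovers exactly the condition Σ_i |x_i| ≤ 1.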

Conic extensions: the extended object is the intersection of a cone and a hyperplane.

[Figure: the cone C intersected with a hyperplane gives Q, which projects via p onto the polytope P.]

Conic extensions

• Linear extensions: C = positive orthant.
• SDP extensions: C = cone of SDP matrices.
• Conic extensions: C = any cone in R^n.
• Why this construction?
– Small extensions exist for many problems.
– Algorithmics: optimising over a small extended formulation is efficient for linear and SDP extensions.
– It is possible to obtain lower bounds on the size of extensions.

[Figure: cone C; Q = intersection of C with a hyperplane; Q projects onto the polytope P.]
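In symbols, the picture says (standard definition, consistent with the factorisation theorem below): P has a conic extension over the cone C when

\[
P = \pi\big( C \cap \{ y : \langle h, y \rangle = 1 \} \big)
\]

for some linear projection π and some vector h defining the hyperplane.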


Slack Matrix of a Polytope

• P = conv{vertices} = {x : A_e x − b_e ≥ 0}.
• Slack matrix: S_ve = A_e x_v − b_e, the slack of vertex v with respect to facet e (its distance to the facet, up to normalisation of A_e).

[Figure: a polytope with a vertex v and a facet e marked.]
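A tiny worked example (mine, not from the slides): for the square P = [0,1]^2, with facet inequalities x ≥ 0, 1−x ≥ 0, y ≥ 0, 1−y ≥ 0 and vertices (0,0), (1,0), (0,1), (1,1) taken in that order, the slack matrix is

\[
S = \begin{pmatrix}
0 & 1 & 0 & 1 \\
1 & 0 & 0 & 1 \\
0 & 1 & 1 & 0 \\
1 & 0 & 1 & 0
\end{pmatrix},
\]

rows indexed by vertices and columns by facets; each entry records how far inside the given facet the given vertex sits.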

Factorisation Theorem (Yannakakis '88)

Theorem: A polytope P has a cone-C extension
• iff its slack matrix S has a conic factorisation over C,
• iff Alice and Bob can solve the communication complexity problem based on S_ev by sending GPT(C,u) states.

[Figure: Alice holds a facet e, Bob holds a vertex v; Alice sends a GPT(C) state; Bob outputs s with ⟨s⟩ = S_ev.]
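Written out, the conic factorisation of the slack matrix has the same shape as the cone factorisation of C_ab above (the formula was an image in the transcript):

\[
S_{ve} = \langle y_e, x_v \rangle \qquad \text{with } x_v \in C \text{ for every vertex } v, \ \ y_e \in C^{*} \text{ for every facet } e .
\]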


S. Fiorini, S. Massar, S. Pokutta, H. R. Tiwary, R. de Wolf, Linear vs. Semidefinite Extended Formulations: Exponential Separation and Strong Lower Bounds, STOC 2012.
• There do not exist polynomial-size linear extensions of the TSP polytope.

A Classical versus Quantum gap

[Figure: Alice (input a) and Bob (input b) use classical or quantum communication; Bob outputs m with ⟨m⟩ = M_ab.]

Theorem: the linear extension complexity of the correlation polytope is exponential, 2^{Ω(n)}.

[Figure: the same task restricted to classical communication.]

Linear extension complexity of polytopes

OPEN QUESTION

• Prove that the SDP (quantum) extension complexity of the TSP, correlation, etc. polytopes is exponential.
– Strongly conjectured to be true.
– The converse (a polynomial-size SDP extension) would almost imply P = NP.
– Requires a method to lower bound quantum communication complexity in the average-output model (one cannot give the parties shared randomness).


S. Fiorini, S. Massar, M. K. Patra, H. R. Tiwary, Generalised probabilistic theories and the extension complexity of polytopes, arXiv:1310.4125.
• GPTs based on the cone of completely positive matrices allow an exponential saving with respect to classical (and, conjecturally, quantum) communication.
• All combinatorial polytopes (those whose vertices are computable by polynomial-size circuits) have polynomial-size completely positive extensions.

Recall: Completely Positive and Co-positive Cones

Completely Positive extension of the Correlation Polytope

• Theorem: the correlation polytope COR(n) has a size-(2n+1) extension over the completely positive cone.
– Sketch of proof:
• Consider an arbitrary linear optimisation over COR(n).
• Use the equivalence of Bürer (2009) with linear optimisation over the completely positive cone CP_{2n+1}.
• This implies that COR(n) is the projection of the intersection of CP_{2n+1} with a hyperplane.
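For orientation, Bürer's equivalence states (roughly; see Bürer 2009 for the exact hypotheses) that a linearly constrained binary quadratic program equals a linear program over the completely positive cone:

\[
\min\{\, x^{\mathsf{T}} Q x + 2 c^{\mathsf{T}} x \;:\; a_i^{\mathsf{T}} x = b_i, \ x \ge 0, \ x_j \in \{0,1\} \ (j \in B) \,\}
\]

\[
= \ \min\Big\{\, \langle Q, X \rangle + 2 c^{\mathsf{T}} x \;:\; a_i^{\mathsf{T}} x = b_i, \ a_i^{\mathsf{T}} X a_i = b_i^{2}, \ X_{jj} = x_j \ (j \in B), \ \begin{pmatrix} 1 & x^{\mathsf{T}} \\ x & X \end{pmatrix} \in \mathcal{CP}_{n+1} \Big\} .
\]

Note that the right-hand side is linear in the variables (x, X), which is what turns the quadratic problem into a conic linear program.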

Polynomially definable 0/1-polytopes

• Theorem (Maksimenko 2012): all polynomially definable 0/1-polytopes in R^d are projections of faces of the correlation polytope COR(poly(d)).
• Corollary: all polynomially definable 0/1-polytopes in R^d have poly(d)-size extensions over the completely positive cone.
– This generalises a large number of special cases proved before.
– A «Cook-Levin»-like theorem for combinatorial polytopes.

Summary

• Generalised probabilistic theories:
– Holevo-type theorem for GPTs.
• Connection between classical/quantum/GPT communication complexity and extensions of polytopes:
– Exponential lower bound on the linear extension complexity of the COR and TSP polytopes.
– All 0/1 combinatorial polytopes have small extensions over the completely positive cone.
– Hence GPT(completely positive cone) allows an exponential saving with respect to classical (and, conjecturally, quantum) communication.
• Use this to rule out the theory? (Of course, there are many other reasons to rule out the theory using other axioms.)
• OPEN QUESTIONS: gaps between classical/quantum/GPT for:
– other models of communication complexity?
– models of computation?


M. Yannakakis, Expressing Combinatorial Problems by Linear Programs, STOC 1988.

J. Gouveia, P. A. Parrilo, R. R. Thomas, Lifts of Convex Sets and Cone Factorizations, Math. Oper. Res. 2013.

S. Fiorini, S. Massar, S. Pokutta, H. R. Tiwary, R. de Wolf, Linear vs. Semidefinite Extended Formulations: Exponential Separation and Strong Lower Bounds, STOC 2012.
• There do not exist polynomial-size linear extensions of the TSP polytope.

S. Fiorini, S. Massar, M. K. Patra, H. R. Tiwary, Generalised probabilistic theories and the extension complexity of polytopes, arXiv:1310.4125.
• All combinatorial polytopes (vertices computable by polynomial-size circuits) have polynomial-size completely positive extensions.
• GPTs based on the cone of completely positive matrices allow an exponential saving with respect to classical (and, conjecturally, quantum) communication.