
Hardness of Approximation for the TSP

Michael Lampis

LAMSADE, Université Paris Dauphine

Sep 2, 2015

Overview

Parameterized Approximation Schemes 2 / 39

• Hardness of Approximation
  • What is it?
  • How to do it?
  • (Easy) Examples
• The PCP Theorem
  • What is it?
  • How to use it?
• The Traveling Salesman Problem
  • Approximation algorithms
  • Strategy for Proving Hardness
• Other tools
  • Expander Graphs
  • Bounded-Occurrence CSPs
• A full reduction for the TSP

Hardness of Approximation

• Day Summary: Approximation algorithms with a performance guarantee
• Reminder:
  • We have an (NP-hard) optimization problem.
  • We want to design an algorithm that gives a "good enough" solution.
  • We want a guarantee of this: for all instances I we have SOL(I)/OPT(I) < r.
• How close to 1 can we get the approximation ratio r?
• Typical situation:
  • An initial algorithm gives some (bad) r.
  • Then someone comes up with an improvement.
  • Repeat...
  • Until we are stuck! Now what?
• Goal of the theory of Hardness of Approximation: prove that we are not incompetent!

Hardness of Approximation: Can we do it?

• Approximation Algorithms vs. Hardness
• Poly-time algorithms vs. NP-completeness
• Algorithms vs. Complexity

The two are related!

• The main tool will be algorithmic: Reductions
• Reminder: Basic tool of NP-hardness:
  • A is NP-hard. Reduce A to B. → B is NP-hard.
• Approximation version: Approximation-Preserving Reductions
  • Idea: A has no good approximation algorithm.
  • We reduce A to B.
  • Conclusion: B has no good approximation algorithm.

Hardness of Approximation: Can we do it?

There are a couple of serious problems with this approach.

• What is the "first" hard-to-approximate problem?
  • Recall: Cook's theorem gives us a "first" NP-hard problem. Then we reduce from that.
  • Here, we don't have a problem to begin from...
• How can we prove that a problem does not have a good approximation algorithm?
  • This would imply that it does not have a poly-time exact algorithm.
  • That is, P ≠ NP!

Hardness of Approximation: How to do it

• We cannot avoid the second problem (without resolving P vs. NP).
• We will prove all our hardness results assuming P ≠ NP.
• We can solve the first problem using gap-introducing reductions.
• A gap-introducing reduction from SAT to a problem A has the following properties:
  • Given a SAT formula φ, it produces in polynomial time an instance I of A.
  • (Completeness): If φ is satisfiable then OPT(I) > c.
  • (Soundness): If φ is not satisfiable then OPT(I) < s.
• This establishes that no algorithm can achieve an approximation ratio better than c/s.
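To see why such a reduction rules out good approximations, here is a minimal sketch for the maximization case, under the convention SOL ≥ OPT/r. The names `reduce_sat` and `approx` are hypothetical placeholders for the gap reduction and a claimed approximation algorithm; they are not from the slides.

```python
# A gap-introducing reduction plus a too-good approximation
# algorithm would decide SAT outright. `reduce_sat` and `approx`
# are hypothetical stand-ins, not real library functions.

def decides_sat(phi, reduce_sat, approx, c, s):
    """Maximization setting with gap (c, s), where c > s.
    If phi is satisfiable:   OPT > c, so an algorithm with ratio
      r < c/s returns a solution of value > c/(c/s) = s.
    If phi is unsatisfiable: OPT < s, so ANY solution has value < s."""
    instance = reduce_sat(phi)   # poly-time gap reduction
    value = approx(instance)     # claimed ratio better than c/s
    return value > s             # True iff phi is satisfiable
```

Since deciding SAT in polynomial time would imply P = NP, no such `approx` can exist (assuming P ≠ NP).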

Gap introduction: An easy example

Recall the NP-hard Graph Coloring problem:

• Given a graph G(V,E), we want to find a coloring of the vertices such that any two neighbors have different colors.
• Objective: Minimize the number of colors used.

• Suppose my friend Bob claims to have designed an algorithm for Graph Coloring with approximation ratio 1.1.
• This would prove that P=NP!
• Recall: Deciding if a graph can be colored with 3 colors is NP-hard.
• Translation: there is a reduction which, given a SAT formula φ, produces either a graph that can be 3-colored, or one that needs more colors.
• Run Bob's algorithm on this graph:
  • If the graph can be 3-colored, the algorithm is guaranteed to produce a solution with at most 3 × 1.1 = 3.3, i.e. at most 3, colors (!!)
  • Otherwise, the algorithm will return a solution with at least 4 colors.
• From the number of colors of the solution we can deduce whether the formula was satisfiable!
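Bob's argument fits in a few lines of Python. `bobs_algorithm` is of course hypothetical; the sketch relies only on its claimed guarantee SOL ≤ 1.1 · OPT.

```python
# Sketch: a 1.1-approximation for Graph Coloring would decide
# 3-colorability, and hence SAT. `bobs_algorithm` is hypothetical;
# we rely only on its claimed guarantee SOL <= 1.1 * OPT.

def is_three_colorable(graph, bobs_algorithm):
    colors_used = bobs_algorithm(graph)  # claimed: <= 1.1 * OPT
    # If OPT <= 3 then SOL <= 3.3, and colors are integers, so SOL <= 3.
    # If OPT >= 4 then every proper coloring, SOL included, uses >= 4.
    return colors_used <= 3
```

The key point is that chromatic numbers are integers, so the 3.3 bound collapses to 3 and the output separates the two cases exactly.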

TSP

Traveling Salesman Problem:

• Given: An edge-weighted complete graph whose weights obey the triangle inequality.
• Output: A tour that visits each vertex exactly once.
• Objective: Minimize total cost.

What if we don't have the triangle inequality?

Reduction from Hamiltonian Cycle:

• Ham. Cycle: Given a graph, is there a cycle that visits each vertex exactly once?
• Given a graph G(V,E), construct an instance of TSP:
  • Each edge ∈ E has weight 1.
  • Each non-edge has weight w.
• YES: There is a TSP tour with weight |V|.
• NO: Any TSP tour has weight ≥ |V| − 1 + w.
• → No algorithm can have a ratio better than (|V| − 1 + w)/|V|.
• We can now set w to something huge! (e.g. w = 2^n)
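The construction itself is mechanical; a small sketch follows (the vertex naming and input format are my own choices, not from the slides):

```python
# Sketch of the Hamiltonian Cycle -> TSP gap reduction above.
# Vertices are 0..n-1; `edges` is a set of frozensets {u, v}.

def ham_cycle_to_tsp(n, edges, w):
    """Return a complete symmetric weight matrix: edges of the
    input graph get weight 1, non-edges weight w. A Hamiltonian
    cycle yields a tour of weight n; otherwise every tour uses
    at least one non-edge and costs >= n - 1 + w."""
    weight = [[0] * n for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            cost = 1 if frozenset((u, v)) in edges else w
            weight[u][v] = weight[v][u] = cost
    return weight
```

With w = 2^n the YES/NO gap, and hence the inapproximability ratio, becomes exponential.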

Gap Introduction: A non-trivial example

Graph Balancing / Scheduling with Restricted Assignment:

• Given: n machines and m jobs. Each job has a duration and a set of machines it is allowed to run on.
• Output: An assignment of jobs to machines.
• Objective: Minimize the makespan (the time needed for the last machine to finish all its jobs).

We are mainly interested in the case of the problem where each job can run on two machines.

Gap Introduction: A non-trivial example

• Target Theorem: There is no approximation algorithm for Graph Balancing with a ratio better than 3/2.

Plan: a gap-introducing reduction from 3-SAT with:

• Satisfiable formula → maximum load 2
• Unsatisfiable formula → maximum load ≥ 3

Thm: 3-OCC-3-SAT is NP-hard.

• This is the version of 3-SAT where each variable appears at most 3 times and each literal at most twice.
• Proof:
  • Replace the occurrences of variable x with fresh variables x1, x2, ..., xn.
  • Add the clauses (x1 → x2) ∧ (x2 → x3) ∧ ... ∧ (xn → x1).
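The occurrence-splitting step of this proof can be sketched as follows. The clause encoding (lists of literal strings like "x" and "-x") is an assumption made for illustration; each implication x → y is written as the 2-clause (¬x ∨ y).

```python
# Sketch: split the k occurrences of `var` into fresh copies
# var1..vark, then add the implication cycle var1 -> var2 ->
# ... -> vark -> var1, which forces all copies to agree.
# Clauses are lists of literal strings, e.g. ["x", "-y", "z"].

def split_occurrences(clauses, var):
    new_clauses, count = [], 0
    for clause in clauses:
        renamed = []
        for lit in clause:
            if lit.lstrip("-") == var:
                count += 1
                sign = "-" if lit.startswith("-") else ""
                renamed.append(f"{sign}{var}{count}")
            else:
                renamed.append(lit)
        new_clauses.append(renamed)
    # implication var_i -> var_{i+1} as the 2-clause (-var_i or var_{i+1})
    for i in range(1, count + 1):
        nxt = i % count + 1
        new_clauses.append([f"-{var}{i}", f"{var}{nxt}"])
    return new_clauses
```

Each fresh copy now occurs exactly three times: once in its renamed clause and twice in the implication cycle, with each literal appearing at most twice, as the theorem requires.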

Example continued

Reduction: 3-OCC-3-SAT → Graph Balancing

• For each variable, create an edge of weight 2 and two vertices (its two literals).
• For each clause, create a vertex and connect it to its literals; here c1 = (x1 ∨ ¬x2 ∨ x3).
• A truth assignment orients the heavy edges towards the false literal.
• To achieve load 2 we must find a true literal in each clause.

Recap

• Gap-introducing reductions:
  • Reduce an NP-hard problem to instances of our problem which are very different in the YES/NO cases.
  • This implies hardness of approximation for our problem.
  • Next step: reduce to other problems...
• Unfortunately, direct gap-introducing reductions are very rare.
  • They usually work for problems of the form Max-Min.
  • They do not work for Min-Avg, Min-Sum, ...
• How do we prove that such problems are hard?

The PCP Theorem

Min-Max or Min-Sum?

• Consider the MAX-3-SAT problem:
  • Given: a 3-SAT formula
  • Objective: Find an assignment that satisfies the most clauses
• We can try the same trick to prove it's hard to approximate:
  • YES: OPT(I) = m
  • NO: OPT(I) ≤ m − 1
  • → No approximation better than (m − 1)/m
• Unfortunately, this ratio is basically 1...
• Generally, direct gap-introducing reductions are hard to do for problems where a bad instance needs to have "many" things wrong.
• To prove that such problems are hard we generally need the famous PCP theorem.

The PCP theorem: approximation view

Theorem: There is a polynomial-time reduction from 3-SAT to 3-SAT with the following properties:

• If the original formula φ is satisfiable, then the new formula φ′ is also satisfiable.
• If the original formula is not satisfiable, then any assignment satisfies at most an r fraction of the clauses of φ′, where r < 1 is a constant independent of φ.

• Translation: The PCP theorem gives a gap-introducing reduction to MAX-3-SAT.
• This produces a "starting problem" from which we can do reductions to show that other problems are hard.
• In this way, the PCP theorem is to approximation hardness what Cook's theorem is to NP-completeness.
• But it is also much more...

The PCP theorem: proof-checking view

• Problems in NP: ∃ a short proof for YES instances
  • E.g. SAT, 3-Coloring
  • Mathematical theorems themselves!?
• Given such a proof/certificate, how can we verify it's correct?
  • We have to read it, of course.
  • All of it???
• PCP theorem (informal statement):
  • There is a way to write the proof so that its size stays roughly the same, but it can be verified with high probability by reading a constant number of bits.
• This is unbelievable! (and it made the NY Times)

The PCP theorem: implications

• Equivalence of the two forms: a 3-SAT formula for which it is easy to verify a certificate (assignment) is a formula for which every assignment makes many clauses false.
• Using the PCP theorem we get some (tiny) constant for the hardness of MAX-3-SAT.

Is this all?

[Håstad 2001]

• There is no better than 7/8-approximation for MAX-E3-SAT.
• There is no better than 1/2-approximation for MAX-E3-LIN2.
  • In MAX-E3-LIN2 we are given equations of the form x ⊕ y ⊕ z = 0 and want to satisfy as many as possible.
• MAX-E3-LIN2 is a common starting point for inapproximability reductions.

These results match the performance of the trivial algorithm!
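The "trivial algorithm" here is just a uniformly random assignment: an E3-SAT clause on three distinct variables is falsified by exactly one of the 8 assignments to its variables, so it is satisfied with probability 7/8 (and a random assignment satisfies each XOR equation with probability 1/2). A small sketch, with a clause encoding of my own choosing:

```python
import random

# Sketch of the trivial randomized algorithm for MAX-E3-SAT.
# A clause is a list of (variable_index, negated) pairs; the
# literal is true when the assigned value differs from `negated`.

def random_assignment_fraction(clauses, n, trials=2000, seed=0):
    """Best fraction of satisfied clauses over repeated uniformly
    random assignments to n variables (seeded for reproducibility)."""
    rng = random.Random(seed)
    best = 0.0
    for _ in range(trials):
        a = [rng.random() < 0.5 for _ in range(n)]
        sat = sum(any(a[v] != neg for v, neg in cl) for cl in clauses)
        best = max(best, sat / len(clauses))
    return best
```

Håstad's results say that, assuming P ≠ NP, no polynomial-time algorithm can beat this 7/8 (or 1/2) guarantee.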

The Traveling Salesman Problem

Input:

• An edge-weighted graph G(V,E)

Objective:

• Find an ordering of the vertices v1, v2, ..., vn such that d(v1, v2) + d(v2, v3) + ... + d(vn, v1) is minimized.
• Here d(vi, vj) is the shortest-path distance between vi and vj in G.


TSP Approximations – Upper bounds

• 3/2-approximation (Christofides 1976)

For the graphic (unit-weight) case:

• 3/2 − ε approximation (Oveis Gharan et al. FOCS '11)
• 1.461-approximation (Mömke and Svensson FOCS '11)
• 13/9-approximation (Mucha STACS '12)
• 1.4-approximation (Sebő and Vygen arXiv '12)

• For ATSP the best ratio is O(log n / log log n) (Asadpour et al. SODA '10)

TSP Approximations – Lower bounds

• The problem is APX-hard (Papadimitriou and Yannakakis '93)
• 5381/5380-inapproximable, ATSP 2805/2804 (Engebretsen STACS '99)
• 3813/3812-inapproximable (Böckenhauer et al. STACS '00)
• 220/219-inapproximable, ATSP 117/116 (Papadimitriou and Vempala STOC '00, Combinatorica '06)

Current best (Karpinski, L., Schmied):

Theorem: It is NP-hard to approximate TSP better than 123/122 and ATSP better than 75/74.

Notice the huge distance between the best algorithm (50% error) and hardness (0.8% error)...

Reduction Technique

We reduce some inapproximable CSP (e.g. MAX-3-SAT) to TSP:

• First, design some gadgets to represent the clauses.
• Then, add some choice vertices to represent truth assignments to variables.
• For each variable, create a path through the clauses where it appears positive...
• ...and another path for its negative appearances.
• A truth assignment dictates a general path.
• We must make sure that gadgets are cheaper to traverse if the corresponding clause is satisfied.
• If a clause is not satisfied, we will pay more. We need many clauses to be unsatisfied in a NO instance to have a big gap. (PCP theorem)
• For the converse direction we must also make sure that "cheating" tours are not optimal!

How to ensure consistency

• Basic idea here: consistency would be easy if each variable occurred at most c times, for c a constant.
  • Cheating would only help a tour "fix" a bounded number of clauses.
• We will rely on techniques and tools used to prove inapproximability for bounded-occurrence CSPs.
  • This is where expander graphs are important.
  • Main tool: "amplifier graph" constructions due to Berman and Karpinski.
• Expander graphs are a generally useful tool, so let's take a look at what they are...

Expander and Amplifier Graphs

Expander Graphs

Parameterized Approximation Schemes 26 / 39

• Informal description:

An expander graph is a well-connected and sparse graph.

Expander Graphs

Parameterized Approximation Schemes 26 / 39

• Informal description:

An expander graph is a well-connected and sparse graph.

• Definition:

A graph G(V,E) is an expander if

• For all S ⊆ V with |S| ≤ |V |2 we have for some constant c

|E(S, V \ S)|

|S|≥ c

• The maximum degree ∆ is bounded

Expander Graphs

Parameterized Approximation Schemes 26 / 39

• Informal description:

An expander graph is a well-connected and sparse graph.

• In any possible partition of the vertices into two sets, there are

many edges crossing the cut.

• This is achieved even though the graph has low degree, therefore

few edges.

Expander Graphs

Parameterized Approximation Schemes 26 / 39

• Informal description:

An expander graph is a well-connected and sparse graph.

• In any possible partition of the vertices into two sets, there are

many edges crossing the cut.

• This is achieved even though the graph has low degree, therefore

few edges.

Example:

Expander Graphs

Parameterized Approximation Schemes 26 / 39

• Informal description:

An expander graph is a well-connected and sparse graph.

• In any possible partition of the vertices into two sets, there are

many edges crossing the cut.

• This is achieved even though the graph has low degree, therefore

few edges.

Example:

A complete bipartite graph is well-connected

but not sparse.

Expander Graphs

Parameterized Approximation Schemes 26 / 39

• Informal description:

An expander graph is a well-connected and sparse graph.

• In any possible partition of the vertices into two sets, there are many edges crossing the cut.

• This is achieved even though the graph has low degree, and therefore few edges.

Examples:

• A complete bipartite graph is well-connected but not sparse.

• A grid is sparse but not well-connected.

• An infinite binary tree is a good expander.
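The cut condition can be checked directly on small graphs. A minimal Python sketch (my addition, not from the slides; `edge_expansion` is a hypothetical helper) that brute-forces the edge expansion of the two extreme examples:

```python
from itertools import combinations

def edge_expansion(n, edges):
    """Brute-force min over nonempty S with |S| <= n/2 of |cut(S)| / |S|."""
    best = float("inf")
    for k in range(1, n // 2 + 1):
        for S in combinations(range(n), k):
            S = set(S)
            cut = sum(1 for u, v in edges if (u in S) != (v in S))
            best = min(best, cut / len(S))
    return best

cycle = [(i, (i + 1) % 8) for i in range(8)]            # sparse, badly connected
k44 = [(i, j) for i in range(4) for j in range(4, 8)]   # well connected, but dense

# a contiguous half of the cycle is cut by only 2 edges: expansion 2/4 = 0.5
assert edge_expansion(8, cycle) == 0.5
# K4,4 expands well (a side with s vertices is cut by at least 2s edges),
# but it has 16 edges on 8 vertices -- not sparse
assert edge_expansion(8, k44) == 2.0
```

An expander family achieves constant expansion with constant degree, which neither example does.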

Applications of Expanders

Parameterized Approximation Schemes 27 / 39

Expander graphs have a number of applications:

• Proof of the PCP theorem

• Derandomization

• Error-correcting codes

• . . . and inapproximability of bounded-occurrence CSPs!

Expanders and inapproximability

• Consider the standard reduction from 3-SAT to 3-OCC-3-SAT:

• Replace the appearances of variable x with fresh variables x1, x2, . . . , xn

• Add the clauses (x1 → x2) ∧ (x2 → x3) ∧ . . . ∧ (xn → x1)

Problem: This does not preserve inapproximability!

• We could add (xi → xj) for all i, j.

• This ensures consistency, but it adds too many clauses and does not decrease the number of occurrences!

• Instead, we modify the reduction using a 1-expander [Papadimitriou, Yannakakis 91].

• Recall: a 1-expander is a graph such that in each partition of the vertices, the number of edges crossing the cut is larger than the number of vertices in the smaller part.

• Replace the appearances of variable x with fresh variables x1, x2, . . . , xn

• Construct an n-vertex 1-expander.

• For each edge (i, j) add the clauses (xi → xj) ∧ (xj → xi)

Why does this work?

• Suppose that in the new instance the optimal assignment sets some of the xi’s to 0 and others to 1.

• This gives a partition of the 1-expander.

• Each edge cut by the partition corresponds to an unsatisfied clause.

• Number of cut edges > number of minority-assigned vertices = number of clauses lost by being consistent.

Hence, it is always optimal to give the same value to all the xi’s.

• Also, because expander graphs are sparse, only a linear number of clauses is added.

• This gives some inapproximability constant.
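The counting argument can be sanity-checked exhaustively. A small sketch (my addition, not from the slides) using K5, which is trivially a 1-expander — though of course not sparse — to verify that every inconsistent assignment to the copies violates more consistency clauses than it could gain:

```python
from itertools import product, combinations

# K5 stands in as a (dense) 1-expander: a side with s vertices is cut by
# s * (5 - s) > s edges, so every cut exceeds the size of its smaller side.
n = 5
edges = list(combinations(range(n), 2))  # consistency clauses xi <-> xj

for bits in product([0, 1], repeat=n):
    ones = sum(bits)
    if ones in (0, n):
        continue  # consistent assignments violate no consistency clause
    violated = sum(1 for u, v in edges if bits[u] != bits[v])
    minority = min(ones, n - ones)
    # flipping the minority side to agree loses at most `minority` original
    # clauses but regains strictly more consistency clauses: consistency wins
    assert violated > minority
```

The actual construction needs a sparse 1-expander so that only linearly many clauses are added; K5 here only illustrates the cut-versus-minority counting.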

Limits of expanders

Parameterized Approximation Schemes 28 / 39

• Expanders sound useful. But how good expanders can we get?

We want:

• Low degree – few edges

• High expansion (at least 1).

These are conflicting goals!

• The smallest ∆ for which we currently know we can have expansion 1 is ∆ = 6. [Bollobás 88]

• Problem: ∆ = 6 is too large, and ∆ = 5 probably won’t work. . .

Amplifiers

Parameterized Approximation Schemes 29 / 39

• Amplifiers are expanders for some of the vertices.

• The other vertices are thrown in to make consistency easier to achieve.

• This allows us to get a smaller ∆.

5-regular amplifier [Berman, Karpinski 03]

• Bipartite graph: n vertices on the left, 0.8n vertices on the right.

• 4-regular on the left, 5-regular on the right.

• The graph is constructed randomly.

• Crucial property: w.h.p. any partition cuts more edges than the number of left vertices in the smaller set.

3-regular wheel amplifier [Berman, Karpinski 01]

• Start with a cycle on 7n vertices.

• Every seventh vertex is a contact vertex. The other vertices are checkers.

• Take a random perfect matching of the checkers.
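The wheel amplifier is simple enough to build directly. A minimal sketch (my code following the three bullets above; the amplifier property itself only holds with high probability and is not verified here):

```python
import random

def wheel_amplifier(n, seed=0):
    """Build the [Berman, Karpinski 01] wheel amplifier on 7n vertices."""
    rng = random.Random(seed)
    V = 7 * n
    edges = [(i, (i + 1) % V) for i in range(V)]    # the cycle
    checkers = [v for v in range(V) if v % 7 != 0]  # contacts are 0, 7, 14, ...
    rng.shuffle(checkers)
    # random perfect matching of the 6n checkers (an even number)
    edges += [(checkers[i], checkers[i + 1]) for i in range(0, len(checkers), 2)]
    return V, edges

V, edges = wheel_amplifier(4)
deg = [0] * V
for u, v in edges:
    deg[u] += 1
    deg[v] += 1
# contact vertices keep degree 2; each checker gains one matching edge: degree 3
assert all(deg[v] == (2 if v % 7 == 0 else 3) for v in range(V))
```

The contacts (degree 2) are the vertices that carry the variable occurrences; the degree-3 checkers only enforce consistency.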

Back to the Reduction

Overview

Parameterized Approximation Schemes 31 / 39

We start from an instance of MAX-E3-LIN2: given a set of linear equations (mod 2), each of size three, satisfy as many as possible.

The problem is known to be 2-inapproximable [Håstad].

We use the Berman-Karpinski amplifier construction to obtain an instance where each variable appears exactly 5 times (and most equations have size 2).

A simple trick reduces this to the 1in3 predicate.

From this instance we construct a graph.

1in3-SAT

Parameterized Approximation Schemes 32 / 39

Input:

A set of clauses (l1 ∨ l2 ∨ l3), where l1, l2, l3 are literals.

Objective:

A clause is satisfied if exactly one of its literals is true. Satisfy as many clauses as possible.

• It is easy to reduce MAX-LIN2 to this problem.

• Especially for size-two equations: (x + y = 1) ↔ (x ∨ y).

• This naturally gives a gadget for the TSP:

• In the TSP we’d like to visit each vertex at least once, but not more than once (to save cost).
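The size-two equivalence can be verified exhaustively; the sketch below (my addition, not from the slides) also shows why size-three equations need the extra trick:

```python
from itertools import product

def one_in_three(*literals):
    """A 1in3 clause is satisfied iff exactly one of its literals is true."""
    return sum(literals) == 1

# the size-two case used in the reduction: x + y = 1 (mod 2) <-> 1in3(x, y)
for x, y in product([0, 1], repeat=2):
    assert ((x + y) % 2 == 1) == one_in_three(x, y)

# a size-three equation x + y + z = 1 is NOT the same as 1in3(x, y, z):
# x = y = z = 1 satisfies the equation but not the clause
assert (1 + 1 + 1) % 2 == 1 and not one_in_three(1, 1, 1)
```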

TSP and Euler tours

Parameterized Approximation Schemes 33 / 39


• A TSP tour gives an Eulerian multi-graph composed of edges of G.

• An Eulerian multi-graph composed of edges of G gives a TSP tour.

• TSP ≡ select a multiplicity for each edge so that the resulting multi-graph is Eulerian and the total cost is minimized.

• Note: no edge needs to be used more than twice.
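The reformulation is easy to state in code. A small sketch (my addition; `is_eulerian` is a hypothetical helper) that checks whether a chosen multiplicity function yields an Eulerian multi-graph, i.e. one that is connected with all degrees even:

```python
from collections import Counter

def is_eulerian(n, multiplicity):
    """multiplicity maps each edge (u, v) to its chosen count >= 0."""
    deg = Counter()
    adj = {v: set() for v in range(n)}
    for (u, v), m in multiplicity.items():
        if m > 0:
            deg[u] += m
            deg[v] += m
            adj[u].add(v)
            adj[v].add(u)
    # connectivity over the used edges (every vertex must be visited)
    seen, stack = {0}, [0]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == n and all(d % 2 == 0 for d in deg.values())

# triangle 0-1-2 plus a pendant edge 2-3: doubling the pendant edge fixes parity
assert not is_eulerian(4, {(0, 1): 1, (1, 2): 1, (0, 2): 1, (2, 3): 1})
assert is_eulerian(4, {(0, 1): 1, (1, 2): 1, (0, 2): 1, (2, 3): 2})
```

The TSP then asks for the cheapest such multiplicity function, which is why pendant-like forced edges end up used exactly once or twice.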

Gadget – Forced Edges

Parameterized Approximation Schemes 34 / 39

We would like to be able to dictate in our construction that a certain edge has to be used at least once.

If we had directed edges, this could be achieved by adding a dummy intermediate vertex.

Here, we add many intermediate vertices and evenly distribute the weight w among them. Think of B as very large.

At most one of the new edges may be unused, and in that case all the others are used twice.

In that case, adding two copies of the unused edge to the solution doesn’t hurt much (for B sufficiently large).
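The cost accounting behind the forced-edge gadget, as a quick arithmetic sketch (my numbers; w = 10 and B = 1000 are illustrative, not from the slides):

```python
# an edge of weight w is split into B sub-edges of weight w/B each
w, B = 10.0, 1000

honest = B * (w / B)               # every sub-edge used once: total cost ~ w
dishonest = 2 * (B - 1) * (w / B)  # skip one sub-edge, double the B-1 others
repair = 2 * (w / B)               # instead, add two copies of the skipped edge

assert abs(honest - w) < 1e-9
assert dishonest > honest          # skipping inside the gadget never pays
assert repair < 0.05               # the repair cost vanishes as B grows
```

So any tour can be assumed to use every forced edge at least once, at a negligible extra cost.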

1in3 Gadget

Parameterized Approximation Schemes 35 / 39

Let’s design a gadget for (x ∨ y ∨ z):

• First, three entry/exit points.

• Connect them . . . with forced edges.

• The gadget is a connected component. A good tour visits it once . . . like this.

• This corresponds to an unsatisfied clause.

• This corresponds to a dishonest tour.

• The dishonest tour pays this edge twice. How expensive must it be before cheating becomes suboptimal?

Note that w = 10 suffices, since the two cheating variables appear in at most 10 clauses.

Construction

Parameterized Approximation Schemes 36 / 39

High-level view: construct an origin s and two terminal vertices for each variable.

• Connect them with forced edges.

• Add the gadgets.

• An honest traversal for x2 looks like this.

• A dishonest traversal looks like this . . . but it must cheat in two places.

There are as many doubly-used forced edges as affected variables → w ≤ 5.

In fact, there is no need to write off the affected clauses: use a random assignment for the cheated variables and some of them will still be satisfied.

Under the carpet

Parameterized Approximation Schemes 37 / 39

• Many details missing:

• Dishonest variables are set randomly, but not independently, to ensure that some clauses are satisfied with probability 1.

• The structure of the instance (from the BK amplifier) must be taken into account to calculate the final constant.

Theorem:

There is no 185/184 approximation algorithm for the TSP, unless P=NP.

Can we do better?

Summary

Parameterized Approximation Schemes 38 / 39

• Hardness of Approximation theory is the evil twin of the theory of approximation algorithms.

• It relies on some deep mathematical tools:

• The PCP theorem, expander graphs, . . .

• We discussed some general common patterns:

• Local vs. global errors, gaps, . . .

• The area is still under construction!

• We are still far from the answer for the TSP and many other prominent problems!

• For Graph Balancing the answer is between 1.5 and 1.75.

• Can we make more progress?

The end

Parameterized Approximation Schemes 39 / 39

Questions?