Lecture 14: Linear Programming Relaxation and Rounding


Rafael Oliveira

University of Waterloo, Cheriton School of Computer Science

rafael.oliveira.teaching@gmail.com

June 24, 2021


Overview

Part I: Why Relax & Round?

Vertex Cover

Set Cover

Conclusion

Acknowledgements


Motivation - NP-hard problems

Many important optimization problems are NP-hard to solve.

What do we do when we see one?

1 Find approximate solutions in polynomial time!
2 Sometimes we even do that for problems in P (but we want much, much faster solutions)

Integer Linear Program (ILP):

    minimize   c^T x
    subject to Ax ≤ b
               x ∈ N^n

Advantage of ILPs: a very expressive language for formulating optimization problems (they capture many combinatorial optimization problems).

Disadvantage of ILPs: they capture even NP-hard problems, so solving ILPs is itself NP-hard.

But we know how to solve LPs. Can we get partial credit in life?

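To make "we know how to solve LPs" concrete, here is a minimal sketch (not part of the slides) that hands an LP of the form min c^T x subject to Ax ≤ b, x ≥ 0 to SciPy's linprog; the numbers are a made-up toy instance whose optimum happens to be fractional.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy instance: minimize x1 + x2
# subject to x1 + 2*x2 >= 1 and 2*x1 + x2 >= 1, with x >= 0.
c = np.array([1.0, 1.0])
A_ub = np.array([[-1.0, -2.0],   # -(x1 + 2*x2) <= -1
                 [-2.0, -1.0]])  # -(2*x1 + x2) <= -1
b_ub = np.array([-1.0, -1.0])

# linprog solves min c^T x s.t. A_ub @ x <= b_ub within the given bounds.
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)  # optimum x = (1/3, 1/3) with value 2/3 -- fractional
```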


Example

Maximum Independent Set:

G(V, E) a graph.

Independent set: S ⊆ V such that u, v ∈ S ⇒ {u, v} ∉ E.

Integer Linear Program:

    maximize   ∑_{v∈V} x_v
    subject to x_u + x_v ≤ 1   for {u, v} ∈ E
               x_v ∈ {0, 1}    for v ∈ V

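As a small illustration (not from the slides), the sketch below builds the LP relaxation of this independent-set ILP for a triangle graph; the graph and the variable names are made up. On the triangle the relaxation's optimum is 3/2, attained only at x_v = 1/2 for every vertex, while the largest independent set has size 1, a first hint of why going from fractional to integral solutions can be delicate.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy graph: a triangle on vertices 0, 1, 2.
vertices = [0, 1, 2]
edges = [(0, 1), (0, 2), (1, 2)]

# LP relaxation of the independent-set ILP:
#   maximize sum_v x_v  s.t.  x_u + x_v <= 1 for each edge, 0 <= x_v <= 1.
# linprog minimizes, so we negate the objective.
c = -np.ones(len(vertices))
A_ub = np.zeros((len(edges), len(vertices)))
for row, (u, v) in enumerate(edges):
    A_ub[row, u] = 1.0
    A_ub[row, v] = 1.0
b_ub = np.ones(len(edges))

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * len(vertices))
print(res.x)     # [0.5, 0.5, 0.5]: the fractional optimum
print(-res.fun)  # 1.5, while the best integral solution has value 1
```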


Relax... & Round!

In our quest to get efficient (exact or approximate) algorithms for problems of interest, the following strategy is very useful:

1 Formulate the combinatorial optimization problem as an ILP.

2 Derive an LP from the ILP by removing the integrality constraints.

This is called an LP relaxation.

3 We are still minimizing the same objective function, but over a (potentially) larger set of solutions:

opt(LP) ≤ opt(ILP)

4 Solve the LP optimally using an efficient algorithm.

1 If the LP solution has integral values, then it is a solution to the ILP and we are done.

2 If the solution has fractional values, then we have to devise a rounding procedure that transforms

fractional solutions → integral solutions

while guaranteeing that the cost of the rounded solution is at most c · opt(LP) ≤ c · opt(ILP), for some approximation factor c ≥ 1 (the inequality chain below makes this precise).

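For a minimization problem, the guarantee behind this strategy can be written as a single chain of inequalities (a restatement of the slide above, with ALG denoting the cost of the rounded integral solution):

```latex
% ALG is feasible for the ILP, and the rounding guarantees ALG <= c * opt(LP):
\[
  \mathrm{opt}(\mathrm{LP}) \;\le\; \mathrm{opt}(\mathrm{ILP})
  \;\le\; \mathrm{ALG}
  \;\le\; c \cdot \mathrm{opt}(\mathrm{LP})
  \;\le\; c \cdot \mathrm{opt}(\mathrm{ILP}),
\]
% so the rounded solution is within a factor c of the true integral optimum.
```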


Not all LPs created equal

When solving the LP

    minimize   c^T x
    subject to Ax = b
               x ≥ 0

it is important to understand the geometry of the feasible set & how nice the corner points are, as they are the candidates for the optimum solution.

Let P := {x ∈ R^n_{≥0} | Ax = b}.

Vertex Solutions: a solution x ∈ P is a vertex solution if there is no y ≠ 0 such that x + y ∈ P and x − y ∈ P.

Extreme Point Solutions: x ∈ P is an extreme point solution if ∃u ∈ R^n such that x is the unique optimum solution to the LP with feasible set P and objective u^T x.

Basic Solutions: let supp(x) := {i ∈ [n] | x_i > 0} be the set of nonzero coordinates of x. Then x ∈ P is a basic solution ⇔ the columns of A indexed by supp(x) are linearly independent.

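A tiny worked example (not on the slides) instantiating all three notions, for A = (1 1), b = 1, i.e. P = {x ∈ R^2_{≥0} : x_1 + x_2 = 1}:

```latex
% P is the segment joining (1,0) and (0,1).
\begin{itemize}
  \item $x=(1,0)$ is a vertex (no $y\neq 0$ with $x\pm y\in P$), an extreme point
        (the unique maximizer of $u^T x$ for $u=(1,0)$), and basic
        ($\mathrm{supp}(x)=\{1\}$, and the single column $a_1=1$ is linearly independent).
  \item $x=(\tfrac12,\tfrac12)$ is none of these: $y=(\tfrac12,-\tfrac12)$ keeps
        $x\pm y\in P$, and the columns indexed by $\mathrm{supp}(x)=\{1,2\}$ are both
        equal to $1$, hence linearly dependent.
\end{itemize}
```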



Vertex Cover

Setup:

Input: a graph G(V, E).

Output: a minimum set of vertices that “touches” all edges of the graph. That is, a minimum set S ⊆ V such that for each edge {u, v} ∈ E we have

|S ∩ {u, v}| ≥ 1.

Weighted version: associate to each vertex v ∈ V a cost c_v ∈ R_{≥0}.

1 Set up the ILP:

    minimize   ∑_{u∈V} c_u · x_u
    subject to x_u + x_v ≥ 1   for {u, v} ∈ E
               x_u ∈ {0, 1}    for u ∈ V



Simple 2-approximation (unweighted)

1 List edges of E in any order. Set S = ∅

2 For each {u, v} ∈ E:
    1 If S ∩ {u, v} = ∅, then S ← S ∪ {u, v}

3 return S

Proof of correctness:

By construction, S is a vertex cover.

If we added elements to S k times, then |S| = 2k and G has a matching of size k, which means that the optimum vertex cover has size at least k.

Thus, we get a 2-approximation (a short implementation sketch follows below).

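A minimal sketch of this greedy procedure (not from the slides; the edge-list representation and the tiny example are made up):

```python
def vertex_cover_2approx(edges):
    """Greedy 2-approximation for unweighted vertex cover.

    Scans the edges in any order; whenever an edge is not yet covered,
    adds *both* endpoints. The chosen edges form a matching, so the
    returned cover has size at most twice the optimum.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover


# Tiny made-up example: a path 0-1-2-3.
print(vertex_cover_2approx([(0, 1), (1, 2), (2, 3)]))
# {0, 1, 2, 3} for this edge order; the optimum cover is {1, 2}.
```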


What can go wrong in the weighted case?

[Figure: the original algorithm vs. a heuristic that picks the lowest-weight vertex only.]



Vertex Cover - LP relaxation

1 Set up the ILP:

    minimize   ∑_{u∈V} c_u · x_u
    subject to x_u + x_v ≥ 1   for {u, v} ∈ E
               x_u ∈ {0, 1}    for u ∈ V

2 Drop the integrality constraints:

    minimize   ∑_{u∈V} c_u · x_u
    subject to x_u + x_v ≥ 1   for {u, v} ∈ E
               0 ≤ x_u ≤ 1     for u ∈ V

3 Solve the LP. Get an optimal solution z for the LP, where z = (z_u)_{u∈V}.

4 Round the LP solution as follows: round each z_u to the nearest integer (a code sketch follows below).

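A sketch of steps 2-4 using SciPy (not from the slides; the solver choice, the function name, and the tiny example graph are assumptions):

```python
import numpy as np
from scipy.optimize import linprog

def vertex_cover_lp_round(vertices, edges, cost):
    """LP relaxation of weighted vertex cover + threshold rounding at 1/2."""
    idx = {v: i for i, v in enumerate(vertices)}
    c = np.array([cost[v] for v in vertices], dtype=float)

    # x_u + x_v >= 1 for each edge, written as -x_u - x_v <= -1 for linprog.
    A_ub = np.zeros((len(edges), len(vertices)))
    for row, (u, v) in enumerate(edges):
        A_ub[row, idx[u]] = -1.0
        A_ub[row, idx[v]] = -1.0
    b_ub = -np.ones(len(edges))

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * len(vertices))
    z = res.x
    # Round: keep every vertex whose fractional value is at least 1/2
    # (small tolerance guards against floating-point noise).
    return {v for v in vertices if z[idx[v]] >= 0.5 - 1e-9}

# Made-up example: a triangle with unit costs.
print(vertex_cover_lp_round([0, 1, 2], [(0, 1), (0, 2), (1, 2)],
                            {0: 1.0, 1: 1.0, 2: 1.0}))
```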


Vertex Cover - Analysis

1 Drop the integrality constraints:

    minimize   ∑_{u∈V} c_u · x_u
    subject to x_u + x_v ≥ 1   for {u, v} ∈ E
               0 ≤ x_u ≤ 1     for u ∈ V

2 Solve the LP. Get an optimal solution z for the LP.

3 Round each z_v to the nearest integer. That is, y_v = 1 if z_v ≥ 1/2, and y_v = 0 if 0 ≤ z_v < 1/2.

4 y is an integral cover by construction:

5 each edge is covered, since given {u, v} ∈ E, at least one of z_u, z_v is ≥ 1/2 (by feasibility of the LP).

6 Cost of y is (using y_u ≤ 2 · z_u, which holds since y_u = 1 only when z_u ≥ 1/2):

    ∑_{u∈V} c_u · y_u ≤ ∑_{u∈V} c_u · (2 · z_u) = 2 · opt(LP) ≤ 2 · opt(ILP)

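The same bound written as one displayed chain (a restatement of the slide; the only fact used is that y_u = 1 forces z_u ≥ 1/2, hence y_u ≤ 2 z_u):

```latex
\[
  \sum_{u \in V} c_u \, y_u
  \;\le\; \sum_{u \in V} c_u \,(2 z_u)
  \;=\; 2 \cdot \mathrm{opt}(\mathrm{LP})
  \;\le\; 2 \cdot \mathrm{opt}(\mathrm{ILP}).
\]
```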



Set Cover

Setup:

Input: a finite set U and a collection S_1, S_2, . . . , S_n of subsets of U.

Output: a smallest collection of sets, i.e. an index set I ⊆ [n] of minimum size such that

⋃_{i∈I} S_i = U.

Weighted version: associate to each set S_i a weight w_i ∈ R_{≥0}.

1 Set up the ILP:

    minimize   ∑_{i∈[n]} w_i · x_i
    subject to ∑_{i: v∈S_i} x_i ≥ 1   for v ∈ U
               x_i ∈ {0, 1}           for i ∈ [n]



Set Cover - Relax...

1 Obtain the LP relaxation:

    minimize   ∑_{i∈[n]} w_i · x_i
    subject to ∑_{i: v∈S_i} x_i ≥ 1   for v ∈ U
               0 ≤ x_i ≤ 1            for i ∈ [n]

2 Suppose we end up with a fractional solution z ∈ [0, 1]^n when we solve the LP above. Now we need to come up with a rounding scheme.

3 Can we just round each coordinate z_i to the nearest integer (like in vertex cover)?

4 Not really. Say v ∈ U is in 20 sets, and we got z_i = 1/20 for each of the sets S_i containing v. Then the rounding procedure above would not select any such set!



Set Cover - Rounding

1 Think of z_i as the “probability” that we would pick set S_i.

2 The solution z describes an “optimal probability distribution” over ways to choose the sets S_i.

3 Okay, but how do we cover?

Algorithm (Random Pick)

1 Input: values z = (z_1, . . . , z_n) ∈ [0, 1]^n such that z is a solution to our LP
2 Output: a set cover for U
3 Set I = ∅
4 For i = 1, . . . , n:
    with probability z_i, set I = I ∪ {i}
5 return I

4 The expected cost of the picked sets is ∑_{i=1}^n w_i · z_i, which is the optimum for the LP. But will this process cover U? (A sketch of Random Pick appears below.)

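A minimal sketch of Random Pick (not from the slides; the list representation of z and the sample values are made up):

```python
import random

def random_pick(z):
    """One round of Random Pick: include set i independently with probability z[i]."""
    return {i for i, zi in enumerate(z) if random.random() < zi}

# Made-up fractional solution for 4 sets.
print(random_pick([0.5, 0.5, 1.0, 0.05]))
```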


Analyzing Random Pick

Let's consider the Random Pick process from the point of view of v ∈ U.

Say v ∈ S_1, . . . , S_k (for simplicity).

As long as we select one of the S_i's above, we are good (w.r.t. v).

We select S_i with probability z_i, where

    ∑_{i=1}^k z_i ≥ 1

because z is a feasible solution to our LP.

What is the probability that v is covered in Random Pick?

Definitely not 1. Think about the case k = 2 and z_1 = z_2 = 1/2 (then v is covered with probability 1 − 1/4 = 3/4).

If we had many elements like that, we would expect many elements to be left uncovered. How do we deal with this?

By perseverance! :)



Probability that Element is Covered

Lemma (Probability of Covering an Element)

In a sequence of k independent experiments, in which the i-th experiment has success probability pi, and

∑_{i=1}^{k} pi ≥ 1,

then with probability ≥ 1 − 1/e at least one experiment is successful.

Probability that no experiment is successful:

(1 − p1) · (1 − p2) · · · (1 − pk)

Since 1 − x ≤ e^{−x} for x ∈ [0, 1], the probability of failure is

∏_{i=1}^{k} (1 − pi) ≤ ∏_{i=1}^{k} e^{−pi} = e^{−(p1+···+pk)} ≤ 1/e

69 / 100
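
A quick numerical sanity check of this lemma, as a minimal Python sketch: the probabilities below are an arbitrary illustrative choice that sums to 1, so the lemma predicts success probability at least 1 − 1/e ≈ 0.632.

import math
import random

# Hypothetical success probabilities for k = 4 independent experiments.
# They sum to 1, so the lemma predicts success probability >= 1 - 1/e.
p = [0.50, 0.25, 0.15, 0.10]

trials = 100_000
successes = sum(
    1 for _ in range(trials)
    if any(random.random() < pi for pi in p)   # at least one experiment succeeds
)

print("empirical success probability:", successes / trials)   # ~0.71 for these p
print("lemma's lower bound 1 - 1/e  :", 1 - math.exp(-1))      # ~0.632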

Randomized Rounding

Algorithm (Randomized Rounding)

1 Input: values z = (z1, . . . , zn) ∈ [0, 1]^n s.t. z is a solution to our LP

2 Output: a set cover for U

3 Set I = ∅

4 While there is an element v ∈ U uncovered:
      For i = 1, . . . , n:
          with probability zi, set I = I ∪ {i}

5 return I

(A code sketch of this procedure follows the slide.)

To analyze this, we need to show that we don't execute the for loop too many times.

Lemma (Probability Decay)

Let t ∈ N. The probability that the for loop will be executed more than ln(|U|) + t times is at most e^{−t}.

73 / 100
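
As referenced on the slide above, here is a minimal Python sketch of the Randomized Rounding loop. It assumes the fractional LP solution z has already been computed; the toy instance (U, S, z) at the bottom is purely illustrative.

import random

def randomized_rounding(universe, sets, z):
    """Round a fractional set-cover solution z into an index set I whose sets cover universe."""
    universe = set(universe)
    chosen = set()                       # the index set I
    covered = set()
    while covered != universe:           # while some v in U is uncovered
        for i, zi in enumerate(z):
            if random.random() < zi:     # pick set i with probability z_i
                chosen.add(i)
                covered |= sets[i]
    return chosen

# Tiny illustrative instance: each element lies in two sets and z sums to 1 on each,
# so z is feasible for the LP relaxation.
U = {1, 2, 3, 4}
S = [{1, 2}, {2, 3}, {3, 4}, {1, 4}]
z = [0.5, 0.5, 0.5, 0.5]
print(sorted(randomized_rounding(U, S, z)))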

Proof of Lemma

Lemma (Probability Decay)

Let t ∈ N. The probability that the for loop will be executed more than ln(|U|) + t times is at most e^{−t}.

The probability that the for loop is executed more than ln(|U|) + t times is the probability that there is an uncovered element after the (ln(|U|) + t)-th iteration.

Let v ∈ U. For each iteration of the loop, there is a probability of at most 1/e that v is not covered (by our previous lemma).

Probability that v is not covered after ln(|U|) + t iterations is at most

(1/e)^{ln(|U|)+t} = (1/|U|) · e^{−t}

Union bound over all v ∈ U.

77 / 100
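
Spelling out the union bound in the last step (a worked version of the computation above):

Pr[some v ∈ U is uncovered after ln(|U|) + t iterations] ≤ ∑_{v ∈ U} (1/e)^{ln(|U|)+t} = |U| · (1/|U|) · e^{−t} = e^{−t}.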

Cost of Rounded Solution

Now that we know we will cover U with high probability, we need to bound the cost of the solution we came up with.

At each execution of the for loop, the expected weight added to our cover is at most

∑_{i=1}^{n} wi · zi

After t iterations of the for loop, the expected total weight is at most

t · ∑_{i=1}^{n} wi · zi

By Markov's inequality: Pr[X ≥ 2 · E[X]] ≤ 1/2.

Lemma (Cost of Rounding)

Given z optimal for the LP, our randomized rounding outputs, with probability ≥ 0.45, a feasible solution to set cover of cost ≤ 2 · (ln(|U|) + 3) · OPT(ILP).

82 / 100
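
For reference, the Markov step used above, with X the (nonnegative) weight of the rounded solution:

Markov's inequality: Pr[X ≥ a] ≤ E[X]/a for any a > 0. Taking a = 2 · E[X] gives Pr[X ≥ 2 · E[X]] ≤ E[X]/(2 · E[X]) = 1/2.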

Cost of Rounding

Lemma (Cost of Rounding)

Given z optimal for the LP, our randomized rounding outputs, with probability ≥ 0.45, a feasible solution to set cover of cost ≤ 2 · (ln(|U|) + 3) · OPT(ILP).

1 Let t = ln(|U|) + 3. There is probability at most e^{−3} < 0.05 that the while loop runs for more than t iterations.

2 After t iterations, the expected weight is at most

ω := t · ∑_{i=1}^{n} wi · zi ≤ t · OPT(ILP)

3 Markov ⇒ the probability that our solution has weight ≥ 2 · ω is ≤ 1/2.

4 By the union bound, with probability ≤ 0.55 we either run for more than t iterations, or our solution has weight ≥ 2ω.

5 Thus, with probability ≥ 0.45, we stop within t iterations and construct a solution to set cover with cost ≤ 2t · OPT(ILP).

(A quick numeric check of these constants follows the slide.)

87 / 100
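
As referenced on the slide above, a quick numeric check of these constants, as a minimal Python sketch; the value |U| = 1000 is an arbitrary illustrative choice.

import math

U_size = 1000                          # illustrative universe size
t = math.log(U_size) + 3               # iteration budget from the lemma

print("Pr[run longer than t]      :", math.exp(-3))             # ~0.0498 < 0.05
print("Pr[weight >= 2*omega]      :", 0.5)                       # Markov
print("union bound on failure     :", math.exp(-3) + 0.5)        # ~0.55
print("success probability        :", 1 - (math.exp(-3) + 0.5))  # ~0.45
print("approx. factor 2*(ln|U|+3) :", 2 * t)                      # ~19.8 for |U| = 1000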

Putting Everything Together

1 Formulate the set cover problem as an ILP

2 Derive an LP from the ILP (LP relaxation)

3 We are still minimizing the same objective function (weight of the cover), but over a (potentially) larger (fractional) set of solutions.

OPT(LP) ≤ OPT(ILP)

4 Solve the LP optimally using an efficient algorithm.

1 If the LP solution has integral values, then it is a solution to the ILP and we are done

2 If it has fractional values, apply the rounding procedure

With the Randomized Rounding algorithm, with probability ≥ 0.45 we get

cost(rounded solution) ≤ 2 · (ln(|U|) + 3) · OPT(ILP)

(A code sketch of this pipeline follows the slide.)

93 / 100
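
As referenced on the slide above, a minimal end-to-end sketch of this pipeline in Python. It assumes scipy is available for the LP step; the weighted instance (U, S, w) is a hypothetical toy example.

import math
import random
import numpy as np
from scipy.optimize import linprog

# Toy weighted set-cover instance (illustrative only).
U = [0, 1, 2, 3, 4]
S = [{0, 1, 2}, {1, 3}, {2, 4}, {3, 4}, {0, 4}]
w = [3.0, 1.0, 1.0, 1.0, 2.0]
n = len(S)

# Steps 1-2: LP relaxation of the ILP.
#   minimize sum_i w_i z_i  s.t.  sum_{i : v in S_i} z_i >= 1 for all v,  0 <= z_i <= 1
# linprog expects <= constraints, so the covering constraints are negated.
A_ub = np.array([[-1.0 if v in S[i] else 0.0 for i in range(n)] for v in U])
b_ub = -np.ones(len(U))
res = linprog(c=w, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n, method="highs")
z = res.x                               # fractional optimum; res.fun = OPT(LP) <= OPT(ILP)

# Step 4 (fractional case): randomized rounding until every element is covered.
chosen, covered = set(), set()
while covered != set(U):
    for i in range(n):
        if random.random() < z[i]:
            chosen.add(i)
            covered |= S[i]

cost = sum(w[i] for i in chosen)
print("OPT(LP) lower bound :", res.fun)
print("rounded cover       :", sorted(chosen), "with cost", cost)
# The analysis bounds the cost (w.p. >= 0.45) by 2*(ln|U|+3)*OPT(LP) <= 2*(ln|U|+3)*OPT(ILP).
print("analysis bound      :", 2 * (math.log(len(U)) + 3) * res.fun)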

Conclusion

Integer Linear Programming - very general, and pervasive in (combinatorial) algorithmic life

ILPs are NP-hard

Rounding to the rescue!

Solve LP and round the solution

Deterministic rounding when solutions are nice

Randomized rounding when things are a bit more complicated

99 / 100

Acknowledgement

Lecture based largely on:

Lectures 7-8 of Luca’s Optimization class

See Luca’s vertex cover notes at https://lucatrevisan.github.io/teaching/cs261-11/lecture07.pdf

See Luca’s set cover notes at https://lucatrevisan.github.io/teaching/cs261-11/lecture08.pdf

100 / 100