CS420 Lecture 9 Dynamic Programming

Transcript of CS420 Lecture 9 Dynamic Programming.

Page 1:

CS420 Lecture 9 Dynamic Programming

Page 2:

Optimization Problems

• In optimization problems a set of choices is to be made to arrive at an optimum, and subproblems are encountered along the way.

• This often leads to a recursive definition of a solution. However, the recursive algorithm is often inefficient in that it solves the same subproblem many times.

• Dynamic programming avoids this repetition by solving the problem bottom up and storing the solutions of subproblems.

Page 3:

Dynamic vs Greedy

Compared to a greedy algorithm, there is no predetermined local best choice of a subsolution; instead, the best solution is chosen by computing a set of alternatives and picking the best one.

Page 4:

Dynamic Programming

• Dynamic programming has the following steps:
  – Characterize the structure of the problem
  – Recursively define the optimum
  – Compute the optimum bottom up, storing values of subsolutions
  – Construct the optimum from the stored data

Page 5:

Assembly line scheduling

• A factory has two assembly lines with the same stations S1,j and S2,j, j=1..n.

• Each station Si,j has its own assembly time Ai,j; it takes ei time to enter assembly line i and xi time to leave assembly line i.

• For rush jobs the product can switch lines to go through faster.

• Switching lines after station Si,j costs ti,j time.
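As a concrete (non-lecture) illustration, the instance data can be held in a few small arrays. In the Python sketch below the station times are the ones from the example figure on the next slide, while the entry, exit and transfer times are made-up values for illustration only:

# One possible plain-Python encoding of an instance (0-based indices):
#   a[i][j]  assembly time A(i+1, j+1)
#   t[i][j]  time to switch away from line i+1 after station j+1
#   e[i]     time to enter line i+1
#   x[i]     time to leave line i+1
instance = {
    "a": [[7, 9, 3],          # line 1 station times (from the example figure)
          [8, 5, 6]],         # line 2 station times (from the example figure)
    "t": [[2, 2], [3, 1]],    # transfer times: assumed, for illustration only
    "e": [2, 4],              # entry times:    assumed, for illustration only
    "x": [3, 2],              # exit times:     assumed, for illustration only
}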

Page 6:

[Figure: two assembly lines with three stations each. The assembly times are 7, 9, 3 on line 1 and 8, 5, 6 on line 2; the entry, exit and line-transfer times are also marked on the diagram.]

Fastest way to get through?

Page 7:

[Figure: the same two-line, three-station diagram.]

Fastest way to get through: S1,1, S2,2 and S1,3.

Page 8:

Structure of the Optimal Solution

• The fastest way to get through S1,j is:

  – only one way for j=1: enter line 1 and go through S1,1

  – two ways for j>1: from S1,j-1 with no extra cost, or from S2,j-1 with the switch cost t2,j-1

• A similar argument holds for S2,j.

Page 9:

Optimal Substructure

• The fastest way to get through S1,j is based on the fastest ways to get through S1,j-1 and S2,j-1.

• Otherwise a faster way could be constructed (contradiction).

• This property is called optimal substructure and indicates that a recursive definition of the optimum, based on optimal solutions of subproblems, can be formulated.

Page 10:

Recursive definition

• The fastest way through the assembly lines:

  f* = min(f1(n)+x1, f2(n)+x2)

  f1(j) = e1+A1,1                                  if j=1
        = min(f1(j-1)+A1,j, f2(j-1)+t2,j-1+A1,j)   if j>1

  f2(j) = e2+A2,1                                  if j=1
        = min(f2(j-1)+A2,j, f1(j-1)+t1,j-1+A2,j)   if j>1

• Draw a call tree: how many times is fi(n-j) called?

Page 11:

Recursive definition

• The fastest way through the assembly lines:

  f* = min(f1(n)+x1, f2(n)+x2)

  f1(j) = e1+A1,1                                  if j=1
        = min(f1(j-1)+A1,j, f2(j-1)+t2,j-1+A1,j)   if j>1

  f2(j) = e2+A2,1                                  if j=1
        = min(f2(j-1)+A2,j, f1(j-1)+t1,j-1+A2,j)   if j>1

• Draw a call tree: how many times is fi(n-j) called? 2^j times (equivalently, fi(j) is called 2^(n-j) times), so the recursive algorithm has exponential complexity.
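To make the repetition visible, here is a small top-down sketch of this recurrence in plain Python (my own code, not the lecture's; 0-based indexing is an implementation choice). Every call spawns two more calls, which is where the exponential blow-up comes from:

# Naive top-down evaluation of the recurrence above.
# a[i][j]: assembly time at station j of line i (0-based)
# t[i][j]: transfer time when leaving line i after station j
# e[i], x[i]: entry and exit times of line i
def f(i, j, a, t, e):
    if j == 0:
        return e[i] + a[i][0]
    other = 1 - i
    return min(f(i, j - 1, a, t, e) + a[i][j],
               f(other, j - 1, a, t, e) + t[other][j - 1] + a[i][j])

def fastest_naive(a, t, e, x):
    n = len(a[0])
    return min(f(0, n - 1, a, t, e) + x[0],
               f(1, n - 1, a, t, e) + x[1])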

Page 12:

Bottom Up Approach

• In the recursive definition we define f1(n) and f2(n) top down, in terms of f1(n-1) and f2(n-1).

• When we compute fi(j) bottom up, in increasing j order, we can reuse the stored fi(j-1) values in the computation of fi(j).
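A minimal bottom-up version of the same computation in plain Python (my own sketch; the station times are taken from the example figure, while the entry, exit and transfer times are assumed values chosen to match the numbers worked out on the next slides):

def fastest_bottom_up(a, t, e, x):
    # f[i][j]: fastest time through station j of line i (0-based);
    # each value is computed once, reusing f[.][j-1] -- O(n) total.
    n = len(a[0])
    f = [[0] * n for _ in range(2)]
    for i in (0, 1):
        f[i][0] = e[i] + a[i][0]
    for j in range(1, n):
        for i in (0, 1):
            other = 1 - i
            f[i][j] = min(f[i][j - 1] + a[i][j],
                          f[other][j - 1] + t[other][j - 1] + a[i][j])
    return min(f[0][n - 1] + x[0], f[1][n - 1] + x[1])

a = [[7, 9, 3], [8, 5, 6]]       # station times from the example figure
t = [[2, 2], [3, 1]]             # transfer times: assumed values
e, x = [2, 4], [3, 2]            # entry/exit times: assumed values
print(fastest_bottom_up(a, t, e, x))   # 23, as in the worked example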

Page 13:

[Figure: the same assembly-line diagram.]

Time at end of Si,1?

Page 14:

[Figure: the diagram with the times at the end of the first stations filled in: f1(1) = 9, f2(1) = 12.]

Best time at end of Si,2? Source?

Page 15:

[Figure: the diagram with f1(1) = 9, f2(1) = 12 and the times at the end of the second stations added: f1(2) = 18, f2(2) = 16.]

Best time at end of Si,3? Source?

Page 16:

[Figure: the diagram with all station times filled in: f1(1) = 9, f2(1) = 12, f1(2) = 18, f2(2) = 16, f1(3) = 20, f2(3) = 22.]

Best finish time? Source?

Page 17:

[Figure: the completed diagram, with the finish time added.]

Best finish time: f* = 23, reached via line 1 (the path S1,1, S2,2, S1,3).

Page 18:

FastLine(A, t, e, x, n) {
  f1[1] = e[1] + A[1,1]; f2[1] = e[2] + A[2,1]
  for j = 2 to n {
    l1[j] = minSource1(j)
    l2[j] = minSource2(j)
  }
  if (f1[n] + x[1] <= f2[n] + x[2]) {
    fstar = f1[n] + x[1]; lstar = 1
  } else {
    fstar = f2[n] + x[2]; lstar = 2
  }
}

Page 19:

minSource1(j) {
  if (f1[j-1] + A[1,j] <= f2[j-1] + t[2,j-1] + A[1,j]) {
    f1[j] = f1[j-1] + A[1,j]
    return 1
  } else {
    f1[j] = f2[j-1] + t[2,j-1] + A[1,j]
    return 2
  }
}

Page 20:

minSource2(j) {
  if (f2[j-1] + A[2,j] <= f1[j-1] + t[1,j-1] + A[2,j]) {
    f2[j] = f2[j-1] + A[2,j]
    return 2
  } else {
    f2[j] = f1[j-1] + t[1,j-1] + A[2,j]
    return 1
  }
}

Page 21:

The arrays l1 and l2 keep track of which source station is used: li[j] is the line number whose station j-1 is used in the fastest way through Si,j.

i = lstar
print "line" i ", station" n
for j = n downto 2 {
  i = li[j]
  print "line" i ", station" j-1
}
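A small Python sketch of this reconstruction (my own code; it assumes the choices are stored in a dictionary l keyed by (line, station), which stands in for the l1/l2 arrays of the pseudocode):

def print_stations(l, lstar, n):
    # l[(i, j)]: line whose station j-1 feeds the fastest way through
    # station j of line i; lstar: line used at the last station.
    # Like the pseudocode above, stations are printed last-to-first.
    i = lstar
    print("line", i, ", station", n)
    for j in range(n, 1, -1):
        i = l[(i, j)]
        print("line", i, ", station", j - 1)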

Page 22:

Optimal substructure

• Dynamic programming works when a problem has optimal substructure.

• Not all problems have optimal substructure.

Page 23:

Shortest Paths

• Find a shortest path, in terms of the number of edges, from u to v (u ≠ v) in an unweighted graph.

• In lecture 7 we saw that breadth first search finds such shortest paths, and that these paths have no cycles.

• Any path from u to v has an intermediate vertex w (which can be u or v). The shortest path from u to v consists of the shortest path from u to w and the shortest path from w to v. Why?

Page 24:

Shortest Paths

• Find a shortest path, in terms of the number of edges, from u to v (u ≠ v) in an unweighted graph.

• In lecture 7 we saw that breadth first search finds such shortest paths, and that these paths have no cycles.

• Any path from u to v has an intermediate vertex w (which can be u or v). The shortest path from u to v consists of the shortest path from u to w and the shortest path from w to v. If not, there would be a shorter shortest path (contradiction). So?
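A quick, self-contained check of this cut-and-paste argument on a small made-up graph, using BFS distances as in lecture 7 (my own sketch, not from the lecture):

from collections import deque

def bfs_dist(adj, s):
    # Unweighted shortest-path distances (edge counts) from s.
    dist = {s: 0}
    q = deque([s])
    while q:
        v = q.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    return dist

# Made-up graph: a short route u-a-b-v and a longer detour u-c-d-e-v.
adj = {"u": ["a", "c"], "a": ["u", "b"], "b": ["a", "v"],
       "c": ["u", "d"], "d": ["c", "e"], "e": ["d", "v"],
       "v": ["b", "e"]}
# "a" lies on a shortest u-v path; the two pieces are themselves shortest:
assert bfs_dist(adj, "u")["v"] == bfs_dist(adj, "u")["a"] + bfs_dist(adj, "a")["v"]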

Page 25:

Shortest Paths

• ... has optimal substructure.

Page 26:

Longest simple paths

• Simple: no cycles.

• Again in an unweighted graph.

• Suppose again w is on the path from u to v.

• Then the longest simple path from u to v does not need to consist of the longest simple paths from u to w and from w to v.

Page 27:

[Figure: a triangle graph on the vertices Q, R and T.]

Longest simple path from Q to T?

Page 28:

[Figure: the same triangle graph on Q, R and T.]

Longest simple path from Q to T: QRT

Longest simple path from R to T?

Page 29:

[Figure: the same triangle graph on Q, R and T.]

Longest simple path from Q to T: QRT
Longest simple path from R to T: RQT

The longest simple path from Q to T is Q -> R -> T, but it does not consist of the longest simple paths from Q to R and from R to T. The longest simple path problem has no optimal substructure!

Page 30:

Subsequence

• A sequence X=<x1,x2,..,xm> is a subsequence of sequence Y=<y1,y2,..,yn> if there is a strictly increasing sequence of indices I=<i1,i2,..,im> such that:

for all j=1,2,..,m we have X[j]=Y[ij].

• E.g., X=<b,c,d,b>, Y=<a,b,c,b,d,a,b>, I=<2,3,5,7>.

• How many subsequences does Y have?

Page 31:

Subsequence

• A sequence X=<x1,x2,..,xm> is a subsequence of sequence Y=<y1,y2,..,yn> if there is a strictly increasing sequence of indices I=<i1,i2,..,im> such that:

for all j=1,2,..,m we have X[j]=Y[ij].

• E.g., X=<b,c,d,b>, Y=<a,b,c,b,d,a,b>, I=<2,3,5,7>.

• How many subsequences does Y have? 2^|Y|.
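The 2^|Y| count can be checked directly by enumerating index sets; a small illustration in plain Python (my own sketch, not from the lecture):

from itertools import combinations

def all_subsequences(y):
    # One subsequence per choice of index set, so |Y| symbols give
    # 2**|Y| subsequences (counting duplicates and the empty one).
    n = len(y)
    return [tuple(y[i] for i in idx)
            for r in range(n + 1)
            for idx in combinations(range(n), r)]

y = ("a", "b", "c", "b", "d", "a", "b")
assert len(all_subsequences(y)) == 2 ** len(y)   # 128 for |Y| = 7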

Page 32:

Longest Common Subsequence (LCS)

• A sequence Z is a common subsequence of X and Y if it is a subsequence of X and of Y.

• The longest common subsequence problem (LCS) is to find, well yes, the longest common subsequence of two sequences X and Y.

• Trial and error takes exponential time, because there is an exponential number of subsequences.

Page 33:

LCS has optimal substructure

Xi is the length-i prefix of X.

Cormen et al. prove that for two sequences X=<x1,x2,..,xm> and Y=<y1,y2,..,yn> and an LCS Z=<z1,z2,..,zk> of X and Y:

1. if xm=yn then zk=xm and Zk-1 is an LCS of Xm-1 and Yn-1

2. if xm≠yn then zk≠xm implies that Z is an LCS of Xm-1 and Y

3. if xm≠yn then zk≠yn implies that Z is an LCS of Yn-1 and X

Page 34:

Recursive algorithm for LCS

1. if xm=yn then zk=xm and Zk-1 is an LCS of Xm-1 and Yn-1

2. if xm≠yn then zk≠xm implies that Z is an LCS of Xm-1 and Y

3. if xm≠yn then zk≠yn implies that Z is an LCS of Yn-1 and X

This gives rise to a recursive formulation:

LCS(Xm, Yn):

if xm=yn, find the LCS of Xm-1 and Yn-1 and append xm to it,

else find the LCSs of Xm-1 and Y and of Yn-1 and X

and pick the largest. Complexity? (call tree)
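A direct Python transcription of this recursive formulation (my own sketch, not the lecture's code); the same prefix pairs are solved over and over, which is what makes the call tree exponential:

def lcs_rec(x, y, i=None, j=None):
    # i, j are prefix lengths; default to the whole sequences.
    if i is None:
        i, j = len(x), len(y)
    if i == 0 or j == 0:
        return ""
    if x[i - 1] == y[j - 1]:
        return lcs_rec(x, y, i - 1, j - 1) + x[i - 1]
    left = lcs_rec(x, y, i - 1, j)
    up = lcs_rec(x, y, i, j - 1)
    return left if len(left) >= len(up) else up

print(lcs_rec("bab", "ababa"))   # prints "bab"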

Page 35:

Length of LCS

The length C[i,j] of an LCS of Xi and Yj:

C[i,j] = 0                           if i=0 or j=0
       = C[i-1,j-1] + 1              if i,j>0 and xi=yj
       = max(C[i,j-1], C[i-1,j])     if i,j>0 and xi≠yj

This can be computed in dynamic-programming fashion: we build C bottom up, and alongside it a table B. B[i,j] points at the entry chosen in the recurrence for C[i,j]: ↖ for C[i-1,j-1], ← for C[i,j-1], ↑ for C[i-1,j]. In which order?

Page 36:

Length of LCS

The length C[i,j] of an LCS of Xi and Yj:

C[i,j] = 0                           if i=0 or j=0
       = C[i-1,j-1] + 1              if i,j>0 and xi=yj
       = max(C[i,j-1], C[i-1,j])     if i,j>0 and xi≠yj

This can be computed in dynamic-programming fashion: we build C bottom up, and alongside it a table B. B[i,j] points at the entry chosen in the recurrence for C[i,j]: ↖ for C[i-1,j-1], ← for C[i,j-1], ↑ for C[i-1,j]. In which order? Definition before use: row-major order is OK.

Page 37:

LCS algorithm

for i in 1 to m
  C[i,0] = 0
for j in 0 to n
  C[0,j] = 0
for i = 1 to m
  for j = 1 to n
    if (X[i] == Y[j]) {
      C[i,j] = C[i-1,j-1]+1; B[i,j] = ↖
    } else if (C[i-1,j] >= C[i,j-1]) {
      C[i,j] = C[i-1,j]; B[i,j] = ↑
    } else {
      C[i,j] = C[i,j-1]; B[i,j] = ←
    }

Page 38:

Try it on X = bab, Y = ababa:

C:                  B:
0 0 0 0 0 0
0 0 1 1 1 1         ↑ ↖ ← ↖ ←
0 1 1 2 2 2         ↖ ↑ ↖ ← ↖
0 1 2 2 3 3         ↑ ↖ ↑ ↖ ←
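A runnable Python version of the table construction above (my own sketch; it reproduces the C table for X = bab, Y = ababa, with the arrows of B stored as strings):

def lcs_table(x, y):
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    b = [[" "] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
                b[i][j] = "↖"          # match: came from C[i-1,j-1]
            elif c[i - 1][j] >= c[i][j - 1]:
                c[i][j] = c[i - 1][j]
                b[i][j] = "↑"          # came from the row above
            else:
                c[i][j] = c[i][j - 1]
                b[i][j] = "←"          # came from the column to the left
    return c, b

c, b = lcs_table("bab", "ababa")
for row in c:
    print(row)      # last row ends in 3: the LCS of bab and ababa has length 3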

Page 39:

Finding LCS

• By backtracking through B, starting at B[m,n], we can print the LCS.

• A ↖ at B[i,j] indicates that X[i]=Y[j] is the end of the prefix of the LCS still to be printed.

Page 40:

Print LCS

Print-LCS(i, j) {
  if (i == 0 or j == 0) return
  if (B[i,j] == ↖) {
    Print-LCS(i-1, j-1); print(X[i])
  } else if (B[i,j] == ↑)
    Print-LCS(i-1, j)
  else
    Print-LCS(i, j-1)
}

Page 41:

LCS

• The B table is not necessary; C can be used in Print-LCS (see the sketch after this list).

• The creation of C and B in LCS takes O(mn) time and space

• Print-LCS takes O(m+n) time.
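To illustrate the first bullet, here is a minimal sketch (my own, not from the lecture) that reconstructs an LCS from the C table alone, without B:

def lcs_from_c(x, y):
    # Build C as in the LCS algorithm, then walk back from C[m][n]
    # using only C: on a match step diagonally and record the symbol,
    # otherwise move toward the neighbour that supplied the value.
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1])
            i -= 1
            j -= 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

print(lcs_from_c("bab", "ababa"))   # prints "bab"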