Transcript of CSCC73 Fall 2018

CSCC73 Fall 2018

Introduction

Anna Bretscher

1 / 25

Welcome!

All the course info is on the course website.

http://www.utsc.utoronto.ca/bretscher/c73

The syllabus has all the “rules”.

The content schedule is on the lectures page.

The due dates and deadlines are on a Google calendar.

The course breakdown is simple:

Term Work: 8 assignments, 5% each
Midterm: 20%
Final Exam: 40%

2 / 25

Resources

Lectures. Monday 11-12, Wednesday 9.20 - 11

Tutorials. May review material or contain new material

Piazza. Rules on the syllabus, but good way to get help with concepts.

Office Hours. Monday 12.10-1.30, Wednesday 1.10-2

TA Office Hours. TBD

Textbooks

One of:

Algorithm Design by Kleinberg & Tardos

Algorithms by Dasgupta, Papadimitriou & Vazirani

3 / 25

CSCC73 Algorithm Design and Analysis

Q. What is it all about?

A. Optimization problems.

I A large variety of problems are characterized as optimization problems.

I Problems where many solutions may exist, but we want the best one.

I Generally, this requires maximizing or minimizing an objective function.

I Usually, the solutions must obey some feasibility constraints.

4 / 25

Principle of Optimality

A problem obeys the principle of optimality if it is possible to

I subdivide the problem into sub-problems

I the sub-problems have optimal solutions

I combining the optimal solutions to the sub-problems constructs an optimal solution to the original problem.

5 / 25

Objectives of CSCC73

Content

Learn several different algorithm paradigms for solving optimization problems, including

I greedy,

I divide and conquer,

I dynamic programming,

I max flow and applications,

I linear programming.

6 / 25

Skills

I know the definitions of the various techniques;

I be able to recognize algorithms that employ these techniques;

I be able to write algorithms using these techniques;

I understand what it means for algorithms written using these techniques to be correct;

I be able to prove that algorithms written using these techniques are correct;

I be able to analyze the efficiency of algorithms written using these techniques.

7 / 25

A first example

Making Change

A cashier has an unlimited number of toonies, loonies, quarters, dimes and nickels.

Problem

Q. How can the cashier make change of $3.90 using the least number of coins?

A. Select 1 toonie, 1 loonie, 3 quarters, 1 dime, 1 nickel.

Q. In general, how can one make change of value $X using the least number of coins?

A. Obvious Algorithm: Choose the most valuable coin ≤ $X, then subtract this value from $X and continue.

This is a greedy algorithm.
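As a concrete illustration, here is a minimal Python sketch of the cashier's rule (the coin values in cents are the Canadian denominations named above; the function name is just for illustration):

# Greedy change-making: repeatedly take the most valuable coin that still fits.
COINS = [200, 100, 25, 10, 5]          # toonie, loonie, quarter, dime, nickel, in cents

def make_change(x):
    """Return the list of coins the greedy cashier hands back for x cents."""
    change = []
    for coin in COINS:                  # most valuable coin first
        while coin <= x:
            change.append(coin)
            x -= coin
    return change

print(make_change(390))   # [200, 100, 25, 25, 25, 10, 5]: 7 coins, as on the slide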

8 / 25

Greedy Algorithms

Idea

Choose the “best-looking” possibility without considering how the decision will affect future choices.

Greedy Paradigm

The solution is defined to be a group of choices from a set.

For example, a selection of coins from the set of {toonies, loonies, quarters, dimes, nickels}.

9 / 25

Greedy Paradigm

General Algorithm Strategy

Sort the choices into a list
Repeat until no more items can be chosen
    if the first element in the list is a legal choice
        Select the item
    else
        Delete the item from the list
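For reference, a minimal Python sketch of this generic strategy, assuming the caller supplies the sort key and the legality test (both hypothetical placeholders, e.g. coin value or meeting finish time); the single pass over the sorted list plays the role of the select/delete loop above:

# Generic greedy template: sort the candidate choices, then scan once,
# keeping each candidate that is still legal given the choices made so far.
def greedy(candidates, key, is_legal):
    solution = []
    for c in sorted(candidates, key=key):       # "best-looking" candidates first
        if is_legal(c, solution):
            solution.append(c)
    return solution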

Complexity of Greedy Algorithms

I Sorting the elements of the set

⇒ O(n log n) (usually)

I Selection/Deletion of items

⇒ O(n) × (time required to test legality)

10 / 25

Correctness of Greedy Algorithms

I There exist problems for which the greedy strategy will not always find the optimal solution.

I Hence, we need to prove the correctness of each greedy algorithm we create.

Q. Who is awake?

A problem obeys the principle of optimality if it is possible to

1. subdivide the problem into subproblems

2. the subproblems have optimal solutions

3. combining the optimal solutions to the subproblems constructs an optimal solution to the original problem.

11 / 25

Correctness of Greedy Algorithms

A problem P admits a greedy solution when it satisfies the principle of optimality such that...

1. The problem is subdivided by choosing the first element e of the solution, resulting in a single subproblem p (i.e. P − {e})

2. p has an optimal solution

3. combining e with an optimal solution to p guarantees an optimal solution to P

This means that the greedy choice e belongs to an optimal solution.

12 / 25

Proof Strategies

Greedy Stays Ahead. Prove that the greedy choice is at least as good as an optimal choice. Alternatively, can think of this as: the greedy choice belongs to some optimal solution.

Exchange Argument. Prove that an optimal solution can be gradually transformed into the greedy solution without reducing its quality.

Q. Which proof technique should we try for the change making example?

Q. If our monetary system did not have nickels, would our greedy algorithm be correct?

A. No...think of $0.30.
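To see the counterexample concretely, here is a small illustrative Python check comparing the greedy rule against a brute-force optimum in a hypothetical nickel-free system:

NO_NICKELS = [200, 100, 25, 10]        # hypothetical coin system with no nickels

def greedy(x, coins):
    picked = []
    for c in coins:
        while c <= x:
            picked.append(c)
            x -= c
    return picked if x == 0 else None  # greedy can get stuck

def fewest(x, coins):
    # smallest coin list summing to exactly x, built up amount by amount
    best = {0: []}
    for amount in range(1, x + 1):
        options = [best[amount - c] + [c] for c in coins
                   if amount - c in best and best[amount - c] is not None]
        best[amount] = min(options, key=len) if options else None
    return best[x]

print(greedy(30, NO_NICKELS))   # None: after taking the quarter, the last 5c cannot be made
print(fewest(30, NO_NICKELS))   # [10, 10, 10]: three dimes do the job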

Q. Why does our greedy algorithm work?

13 / 25

Proof of Change Making Algorithm - Greedy Stays Ahead

Q. Since we are solving the problem by repeatedly solving a slightly smaller problem, which proof method might be a good fit?

A. Induction.

Main Observation.

Q. At most how many loonies, quarters, dimes and nickels can the optimal solution have?

A. 1 loonie, 3 quarters, 2 dimes, 1 nickel

Q. Why?

A. 2 nickels = 1 dime, 3 dimes = 1 quarter and 1 nickel, 2 loonies = 1 toonie

14 / 25

Greedy Choice is in Optimal

Consider each of the possible coins c as the first choice using our greedy strategy.

What does it imply about the value of X, and why is c in the optimal solution?

I a nickel: then X = 5. Notice that the solution must consist of only a nickel (in a world with no pennies, we jump by 5s).

I a dime: Then 10 ≤ X < 25. Since we can use at most 1 nickel, we must need at least one dime... so a dime is in the optimal solution.

I a quarter: Then 25 ≤ X < $1. Suppose the optimal solution only uses dimes and nickels: it can have at most 1 nickel and 1 dime, or 2 dimes, both worth less than 25c, so it cannot make up X. Hence a quarter is in the optimal solution.

15 / 25

Greedy Choice is in Optimal

Consider each of the possible coins c as the first choice using our greedy strategy.

What does it imply about the value of X, and why is c in the optimal solution?

I a loonie: $1 ≤ X < $2. The maximum an optimal solution can attain using only quarters, dimes and nickels is 95c, hence a loonie must belong to the optimal solution.

I a toonie: $2 ≤ X. The maximum an optimal solution can attain without a toonie is $1.95, hence a toonie must belong to the optimal solution.

In every case, the greedy choice must belong to an optimal solution. Hence, by induction on X − c, the cashier’s algorithm is optimal.
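The proof is what matters, but it is easy to cross-check numerically; the sketch below (illustrative only, not part of the slides) compares the cashier's coin count against a brute-force optimum computed by a simple dynamic-programming table:

COINS = [200, 100, 25, 10, 5]               # toonie, loonie, quarter, dime, nickel, in cents

def greedy_count(x):
    # number of coins the cashier's rule uses for x cents
    n = 0
    for c in COINS:
        n += x // c
        x %= c
    return n

def optimal_counts(limit):
    # min_coins[x] = fewest coins that make exactly x cents, for 0 <= x <= limit
    INF = float("inf")
    min_coins = [0] + [INF] * limit
    for x in range(1, limit + 1):
        min_coins[x] = min((min_coins[x - c] + 1 for c in COINS if c <= x), default=INF)
    return min_coins

opt = optimal_counts(1000)
assert all(greedy_count(x) == opt[x] for x in range(0, 1001, 5))
print("greedy matches the optimum for every multiple of 5 cents up to $10")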

16 / 25

Another Example: Meeting Scheduling

Problem

I You are in charge of scheduling meetings for your boss.

I There are a number of people who have requested meeting times by specifying a start time si and a finish time fi.

I Your task is to try to schedule as many meetings as possible such that no two overlap.

[Embedded slides: Kevin Wayne, Chapter 4 “Greedy Algorithms”. Copyright © 2005 Pearson-Addison Wesley. All rights reserved.]

4.1 Interval Scheduling

Interval scheduling.

I Job j starts at sj and finishes at fj.

I Two jobs are compatible if they don't overlap.

I Goal: find a maximum subset of mutually compatible jobs.

[Figure (Wayne slide 3): eight example jobs a-h drawn on a time line from 0 to 11.]

Interval Scheduling: Greedy Algorithms

Greedy template. Consider jobs in some order. Take each job provided it's compatible with the ones already taken.

I [Earliest start time] Consider jobs in ascending order of start time sj.

I [Earliest finish time] Consider jobs in ascending order of finish time fj.

I [Shortest interval] Consider jobs in ascending order of interval length fj − sj.

I [Fewest conflicts] For each job, count the number of conflicting jobs cj. Schedule in ascending order of conflicts cj.

17 / 25

Greedy choices?

Possible greedy selection choices:

I by start time:

[Figure (Wayne slide 5): a counterexample where the earliest-start-time rule fails.]

Problem: the first one that starts may be the longest.

I by shortest duration:

[Figure (Wayne slide 5): a counterexample where the shortest-interval rule fails.]

Problem: it is possible that one shortest meeting knocks out two longer meetings.

18 / 25

Greedy choices?

Possible greedy selection choices:

I by fewest overlap count:

[Figure (Wayne slide 5): a counterexample where the fewest-conflicts rule fails.]

I ?? by finish time

19 / 25

The Algorithm

Algorithm Greedy_Schedule
Input: Set of meetings.
Output: A max set S of non-overlapping meetings.

L ← meetings sorted by finish time
# ℓi ∈ L has start time si and finish time fi, so f1 ≤ f2 ≤ . . . ≤ fn−1 ≤ fn
S = ∅
i = 1
while i ≤ n:
    if ℓi compatible with S
        S = S ∪ {ℓi}
    i = i + 1
return S
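A runnable Python version of Greedy_Schedule, assuming meetings are given as (start, finish) pairs; compatibility is checked against the finish time of the last meeting taken, as in the pseudocode:

def greedy_schedule(meetings):
    # earliest-finish-time greedy: returns a maximum set of non-overlapping meetings
    S = []
    last_finish = float("-inf")
    for s, f in sorted(meetings, key=lambda m: m[1]):   # sort by finish time
        if s >= last_finish:                            # compatible with everything in S
            S.append((s, f))
            last_finish = f
    return S

# illustrative example: three mutually compatible meetings can be kept
print(greedy_schedule([(0, 6), (1, 4), (3, 5), (3, 8), (4, 7), (5, 9), (6, 10), (8, 11)]))
# -> [(1, 4), (4, 7), (8, 11)]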

20 / 25

Correctness of Greedy_Schedule - Exchange Argument

Proof by contradiction. Assume that the greedy set is not optimal.

I Let I = i1, i2, . . . , ik be the schedule selected by the greedy algorithm.

I Let J = j1, j2, . . . , jm be an optimal schedule sorted by finish time.

Q. What do we know about m and k?

I Let r be the largest index such that ir = jr, so i1 = j1, i2 = j2, . . . , ir = jr

In words, ir+1 is the first meeting that disagrees with the optimal solution, so ir+1 ≠ jr+1.

Q. What do we know about ir+1?

A. The finish time of ir+1 is less than or equal to the finish time of jr+1.

I Why do we care?

21 / 25

Proof cont...

[Figure (Wayne slide 7): the greedy schedule i1, . . . , ik lined up against an optimal schedule j1, . . . , jm. Job ir+1 finishes before jr+1, so why not replace jr+1 with ir+1?]

Notice we can swap ir+1 for jr+1 and the optimal solution is now more similar to the greedy solution.

[Figure (Wayne slide 8): after the swap, the solution is still feasible and optimal, but this contradicts the maximality of r.]

By induction, the greedy solution is optimal.

22 / 25

Interval Partitioning

Given a set of lectures, find the minimum number of classrooms needed to schedule all lectures so that no two occur at the same time in the same room.

[Embedded slides: Kevin Wayne, 4.1 Interval Partitioning.]

Interval partitioning.

I Lecture j starts at sj and finishes at fj.

I Goal: find the minimum number of classrooms to schedule all lectures so that no two occur at the same time in the same room.

[Figure (Wayne slide 10): a schedule that uses 4 classrooms for 10 lectures a-j.]

[Figure (Wayne slide 11): the same 10 lectures scheduled using only 3 classrooms.]

Def. The depth of a set of open intervals is the maximum number that contain any given time.

Key observation. Number of classrooms needed ≥ depth.

Ex. Depth of the schedule above = 3 (lectures a, b and c all contain 9:30) ⇒ the 3-classroom schedule is optimal.

Q. Does there always exist a schedule using a number of classrooms equal to the depth of the intervals?

Q. What is the smallest that the minimum number of classrooms can be?

A. The max overlap. Call this the depth.
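A small sketch of how the depth can be computed, sweeping over sorted start/finish events (treating lectures as half-open intervals is an assumption here, so a lecture ending at 10:30 does not conflict with one starting at 10:30):

def depth(lectures):
    # maximum number of lectures in progress at any one time; lectures are (start, finish) pairs
    events = [(s, +1) for s, f in lectures] + [(f, -1) for s, f in lectures]
    events.sort(key=lambda e: (e[0], e[1]))     # at equal times, process a finish before a start
    best = current = 0
    for _, delta in events:
        current += delta
        best = max(best, current)
    return best

print(depth([(9.0, 10.5), (9.0, 12.5), (9.0, 11.0)]))   # 3: all three overlap at 9:30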

Q. Can we always find a schedule using depth classrooms?

23 / 25

Interval Partitioning Algorithm

Q. What should our greedy choice be?

A. Start time !

Let d represent the number of rooms used so far.

sort lectures by start time, s1 ≤ s2 ≤ · · · ≤ sn
d = 0
for j = 1 .. n:
    if lecture j can be scheduled in some existing room k
        schedule lecture j in room k
    else
        open a new room d + 1 and schedule lecture j in room d + 1
        d = d + 1
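A runnable Python sketch of this greedy; the min-heap of room finish times is an implementation choice (not from the slides) that makes the "can lecture j reuse a room" test take O(log d) time:

import heapq

def partition_lectures(lectures):
    # assign (start, finish) lectures to rooms greedily by start time; return the number of rooms used
    rooms = []                           # min-heap of finish times, one entry per open room
    for s, f in sorted(lectures):        # consider lectures in order of start time
        if rooms and rooms[0] <= s:      # the earliest-finishing room is already free: reuse it
            heapq.heapreplace(rooms, f)
        else:                            # every open room is still busy: open a new one
            heapq.heappush(rooms, f)
    return len(rooms)

print(partition_lectures([(9.0, 10.5), (9.0, 12.5), (9.0, 11.0), (11.0, 12.5)]))   # 3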

24 / 25

Correctness that the Greedy Algorithm is Optimal

Greedy algorithm stays ahead...

I Suppose d is the number of classrooms allocated by the greedy algorithm.

I Consider when the dth classroom is opened.

I Classroom d is opened because a lecture j conflicted with lectures in all previous d − 1 classrooms.

I Since we sorted by start time, this means that d − 1 lectures started before or at sj and finish after sj, i.e., they overlap with the start time sj.

I Thus, we have d lectures overlapping at time sj + ε, so every optimal solution requires d classrooms.

Note. The greedy algorithm provides a conflict-free schedule.

25 / 25