Lecture 7


Transcript of Lecture 7

Page 1: Lecture  7

Lecture 7

Page 2: Lecture  7

Solution by Substitution Method

T(n) = 2T(n/2) + n

• Substitute n/2 into the main equation:
  2T(n/2) = 2(2T(n/4) + n/2) = 4T(n/4) + n, and so T(n) = 4T(n/4) + 2n
• Again, by substituting n/4, we can see:
  4T(n/4) = 4(2T(n/8) + n/4) = 8T(n/8) + n, and so T(n) = 8T(n/8) + 3n
• Continuing in this manner, we obtain:
  T(n) = 2^k T(n/2^k) + k·n
  Using k = lg n:
  T(n) = n·T(1) + n·lg n = n·lg n + n
  T(n) = O(n lg n)
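As a quick sanity check (my own addition, not part of the slides), the small Python sketch below evaluates the recurrence directly for powers of two and compares it with the closed form n·lg n + n, assuming T(1) = 1:

from math import log2

def T(n):
    # Direct evaluation of T(n) = 2*T(n/2) + n with T(1) = 1;
    # n is assumed to be a power of 2.
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * int(log2(n)) + n   # closed form: n*lg(n) + n
print("T(n) = n*lg(n) + n holds for n = 2, 4, ..., 1024")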

Page 3: Lecture  7


Overview

• Divide and Conquer

• Merge Sort

• Quick Sort

Page 4: Lecture  7

Quick Sort

• Divide:
  – Pick any element p as the pivot, e.g., the first element
  – Partition the remaining elements into FirstPart, which contains all elements < p, and SecondPart, which contains all elements ≥ p
• Recursively sort FirstPart and SecondPart
• Combine: no work is necessary, since sorting is done in place (a small sketch of this scheme follows below)
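The following short Python sketch (my own illustration, not from the slides) mirrors this divide-and-conquer description directly. It builds new FirstPart/SecondPart lists rather than sorting in place, so it is simpler but less space-efficient than the in-place Quick-Sort presented on the next slides:

def quick_sort_copy(A):
    if len(A) <= 1:                             # base case: already sorted
        return A
    p = A[0]                                    # pivot: the first element
    first_part  = [x for x in A[1:] if x < p]   # all elements < p
    second_part = [x for x in A[1:] if x >= p]  # all elements >= p
    return quick_sort_copy(first_part) + [p] + quick_sort_copy(second_part)

print(quick_sort_copy([4, 8, 6, 3, 5, 1, 7, 2]))   # [1, 2, 3, 4, 5, 6, 7, 8]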

Page 5: Lecture  7

Quick Sort

[Diagram] Array A with pivot p: Partition rearranges A into FirstPart (x < p), the pivot p, and SecondPart (p ≤ x). A recursive call sorts each part, giving Sorted FirstPart | p | Sorted SecondPart, i.e., the whole array is Sorted.

Page 6: Lecture  7

Quick Sort

Quick-Sort(A, left, right)
    if left ≥ right then
        return
    else
        middle ← Partition(A, left, right)
        Quick-Sort(A, left, middle−1)
        Quick-Sort(A, middle+1, right)
    end if

Page 7: Lecture  7

Partition

[Diagram] A starts with the pivot p in the first position. During the scan, A is arranged as p | x < p | p ≤ x, with the two regions growing behind the scan. After the final swap, A is arranged as x < p | p | p ≤ x.

Page 8: Lecture  7


Partition Example

A: 4 8 6 3 5 1 7 2

Page 9: Lecture  7

Partition Example

A: 4 8 6 3 5 1 7 2
i=0, j=1

Page 10: Lecture  7

Partition Example

A: 4 8 6 3 5 1 7 2
i=0, j=1  (examining A[1] = 8; 8 ≥ 4, no swap)

Page 11: Lecture  7

Partition Example

A: 4 8 6 3 5 1 7 2
i=0, j=2  (examining A[2] = 6; 6 ≥ 4, no swap)

Page 12: Lecture  7

Partition Example

A: 4 8 6 3 5 1 7 2  →  4 3 6 8 5 1 7 2
j=3  (A[3] = 3 < 4, so i becomes 1 and A[1] = 8 is swapped with A[3] = 3)

Page 13: Lecture  7

Partition Example

A: 4 3 6 8 5 1 7 2
i=1, j=4  (examining A[4] = 5; 5 ≥ 4, no swap)

Page 14: Lecture  7

Partition Example

A: 4 3 6 8 5 1 7 2
i=1, j=5  (examining A[5] = 1)

Page 15: Lecture  7

Partition Example

A: 4 3 6 8 5 1 7 2  →  4 3 1 8 5 6 7 2
j=5  (A[5] = 1 < 4, so i becomes 2 and A[2] = 6 is swapped with A[5] = 1)

Page 16: Lecture  7

Partition Example

A: 4 3 1 8 5 6 7 2
i=2, j=6  (examining A[6] = 7; 7 ≥ 4, no swap)

Page 17: Lecture  7

Partition Example

A: 4 3 1 8 5 6 7 2  →  4 3 1 2 5 6 7 8
j=7  (A[7] = 2 < 4, so i becomes 3 and A[3] = 8 is swapped with A[7] = 2)

Page 18: Lecture  7

Partition Example

A: 4 3 1 2 5 6 7 8
i=3, j=8  (the loop is finished)

Page 19: Lecture  7

Partition Example

A: 4 3 1 2 5 6 7 8  →  2 3 1 4 5 6 7 8
(finally, the pivot A[0] = 4 is swapped with A[i] = A[3] = 2)

Page 20: Lecture  7

Partition Example

A: 2 3 1 | 4 | 5 6 7 8
   x < 4       4 ≤ x

The pivot 4 is now in its correct position.

Page 21: Lecture  7

Partition(A, left, right)
1.  x ← A[left]
2.  i ← left
3.  for j ← left+1 to right
4.      if A[j] < x then
5.          i ← i + 1
6.          swap(A[i], A[j])
7.      end if
8.  end for j
9.  swap(A[i], A[left])
10. return i

n = right − left + 1.  Time: cn for some constant c.  Space: constant (in place).
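A direct Python rendering of this pseudocode (my own sketch, reusing the names from the slides) is shown below, together with the Quick-Sort routine from page 6, so the whole algorithm can be run on the example array:

def partition(A, left, right):
    x = A[left]                        # pivot: the leftmost element
    i = left                           # end of the "< x" region
    for j in range(left + 1, right + 1):
        if A[j] < x:
            i += 1
            A[i], A[j] = A[j], A[i]    # move A[j] into the "< x" region
    A[i], A[left] = A[left], A[i]      # put the pivot between the two parts
    return i                           # final position of the pivot

def quick_sort(A, left, right):
    if left >= right:                  # base case: at most one element
        return
    middle = partition(A, left, right)
    quick_sort(A, left, middle - 1)
    quick_sort(A, middle + 1, right)

A = [4, 8, 6, 3, 5, 1, 7, 2]
quick_sort(A, 0, len(A) - 1)
print(A)                               # [1, 2, 3, 4, 5, 6, 7, 8]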

Page 22: Lecture  7

Quick-Sort(A, 0, 7), partition

A: 4 8 6 3 5 1 7 2  →  2 3 1 | 4 | 5 6 7 8

Page 23: Lecture  7

Quick-Sort(A, 0, 7); Quick-Sort(A, 0, 2), partition

A: 2 3 1 | 4 | 5 6 7 8  →  1 | 2 | 3 | 4 | 5 6 7 8

Page 24: Lecture  7

Quick-Sort(A, 0, 7); Quick-Sort(A, 0, 0), base case, return

A: 1 | 2 | 3 | 4 | 5 6 7 8

Page 25: Lecture  7

Quick-Sort(A, 0, 7); Quick-Sort(A, 1, 1), base case

A: 1 | 2 | 3 | 4 | 5 6 7 8

Page 26: Lecture  7

Quick-Sort(A, 0, 7); Quick-Sort(A, 2, 2), return; Quick-Sort(A, 0, 2), return

A: 1 2 3 | 4 | 5 6 7 8

Page 27: Lecture  7

Quick-Sort(A, 0, 7); Quick-Sort(A, 4, 7), partition

A: 1 2 3 | 4 | 5 | 6 7 8

Page 28: Lecture  7

Quick-Sort(A, 0, 7); Quick-Sort(A, 5, 7), partition

A: 1 2 3 | 4 | 5 | 6 | 7 8

Page 29: Lecture  7

Quick-Sort(A, 0, 7); Quick-Sort(A, 6, 7), partition

A: 1 2 3 | 4 | 5 | 6 | 7 | 8

Page 30: Lecture  7

Quick-Sort(A, 0, 7); Quick-Sort(A, 7, 7), base case, return

A: 1 2 3 | 4 | 5 | 6 | 7 | 8

Page 31: Lecture  7

Quick-Sort(A, 0, 7); Quick-Sort(A, 6, 7), return

A: 1 2 3 | 4 | 5 | 6 | 7 8

Page 32: Lecture  7

Quick-Sort(A, 0, 7); Quick-Sort(A, 5, 7), return

A: 1 2 3 | 4 | 5 | 6 7 8

Page 33: Lecture  7

Quick-Sort(A, 0, 7); Quick-Sort(A, 4, 7), return

A: 1 2 3 | 4 | 5 6 7 8

Page 34: Lecture  7

Quick-Sort(A, 0, 7), done!

A: 1 2 3 4 5 6 7 8

Page 35: Lecture  7

Quick-Sort: Best Case (Even Partition)

[Recursion tree] The top level has one subproblem of size n, the next level two of size n/2, the next four of size n/4, and so on, down to subproblems of constant size.

Cost per level:
• level 0: cn
• level 1: 2 × cn/2 = cn
• level 2: 4 × cn/4 = cn
• …
• bottom level: n/3 subproblems of size 3, costing n/3 × 3c = cn

There are about log n levels, so the total time is Θ(n log n).

Page 36: Lecture  7

Quick-Sort: Worst Case (Unbalanced Partition)

[Recursion tree] Each partition removes only the pivot, so the subproblem sizes are n, n−1, n−2, …, 3, 2, and the costs per level are cn, c(n−1), c(n−2), …, 3c, 2c.

This happens only if the input is sorted or reversely sorted.

Total time: Θ(n²)  (the sum is written out below)
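For completeness, here is the worst-case sum written out (my own arithmetic, consistent with the level costs listed above), in LaTeX:

\[
  T(n) \;=\; cn + c(n-1) + c(n-2) + \cdots + 2c
       \;=\; c\sum_{k=2}^{n} k
       \;=\; c\left(\frac{n(n+1)}{2} - 1\right)
       \;=\; \Theta(n^2)
\]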

Page 37: Lecture  7

Quick-Sort: an Average Case

• Suppose the split is always 1/10 : 9/10.

[Recursion tree] The subproblem sizes at successive levels are n; then 0.1n and 0.9n; then 0.01n, 0.09n, 0.09n and 0.81n; and so on. Each level costs at most cn. The shallowest branch reaches the base case after about log₁₀ n levels, and the deepest after about log_{10/9} n levels.

Total time: Θ(n log n)  (a worked bound follows below)
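A short worked bound for this split (my own arithmetic, not shown on the slide): the larger part shrinks by a factor of 10/9 at every level, so there are at most log_{10/9} n levels, each costing at most cn:

\[
  T(n) \;\le\; cn \cdot \log_{10/9} n
       \;=\; cn \cdot \frac{\log_2 n}{\log_2 (10/9)}
       \;=\; O(n \log n)
\]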

Page 38: Lecture  7

Quick-Sort Summary

• Time
  – Most of the work is done in partitioning.
  – The average case takes Θ(n log n) time.
  – The worst case takes Θ(n²) time.
• Space
  – Sorts in place, i.e., does not require additional space.

Page 39: Lecture  7

Recursion

• Recursion is more than just a programming technique. It has two other uses in computer science and software engineering, namely:

• as a way of describing, defining, or specifying things.

• as a way of designing solutions to problems (divide and conquer).

Page 40: Lecture  7
Page 41: Lecture  7

• In general, we can define the factorial function in the following way:

Page 42: Lecture  7

Iterative Definition

• This is an iterative definition of the factorial function.

• It is iterative because the definition only contains the algorithm parameters and not the algorithm itself.

• This will be easier to see after defining the recursive implementation.

Page 43: Lecture  7

Recursive Definition

• We can also define the factorial function in the following way:

Page 44: Lecture  7

Iterative vs. Recursive

• Iterative (function does NOT call itself):
  factorial(n) = 1                               if n = 0
  factorial(n) = n × (n−1) × (n−2) × … × 2 × 1   if n > 0

• Recursive (function calls itself):
  factorial(n) = 1                       if n = 0
  factorial(n) = n × factorial(n−1)      if n > 0

Page 45: Lecture  7

Recursion

• To see how the recursion works, let’s break down the factorial function to solve factorial(3)
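The expansion itself is not reproduced in this transcript, so here is a reconstruction (my own, following the recursive definition on page 44):

\begin{align*}
  \text{factorial}(3) &= 3 \times \text{factorial}(2) \\
                      &= 3 \times 2 \times \text{factorial}(1) \\
                      &= 3 \times 2 \times 1 \times \text{factorial}(0) \\
                      &= 3 \times 2 \times 1 \times 1 = 6
\end{align*}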

Page 46: Lecture  7

Breakdown

• Here, we see that we start at the top level, factorial(3), and simplify the problem into 3 x factorial(2).

• Now, we have a slightly less complicated problem in factorial(2), and we simplify this problem into 2 x factorial(1).

Page 47: Lecture  7

Breakdown

• We continue this process until we are able to reach a problem that has a known solution.

• In this case, that known solution is factorial(0) = 1.

• The functions then return in reverse order to complete the solution.

Page 48: Lecture  7

Breakdown

• This known solution is called the base case.

• Every recursive algorithm must have a base case to simplify to.

• Otherwise, the algorithm would run forever (or until the computer ran out of memory).

Page 49: Lecture  7

Breakdown

• The other parts of the algorithm, excluding the base case, are known as the general case.

• For example:
  3 × factorial(2)   general case
  2 × factorial(1)   general case
  etc.

Page 50: Lecture  7

Iterative Algorithm

factorial(n) {
    i = 1
    factN = 1
    loop (i <= n)
        factN = factN * i
        i = i + 1
    end loop
    return factN
}

The iterative solution is very straightforward. We simply loop through all the integers between 1 and n and multiply them together.

Page 51: Lecture  7

Recursive Algorithm

factorial(n) {
    if (n = 0)
        return 1
    else
        return n * factorial(n-1)
    end if
}

Note how much simpler the code for the recursive version of the algorithm is compared with the iterative version: we have eliminated the loop and implemented the algorithm with a single 'if' statement (a runnable version of both is sketched below).
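For reference, here is a runnable Python version of both algorithms (my own translation of the pseudocode on pages 50 and 51; the names factorial_iterative and factorial_recursive are assumed for illustration):

def factorial_iterative(n):
    fact_n = 1
    for i in range(1, n + 1):   # multiply 1 * 2 * ... * n
        fact_n *= i
    return fact_n

def factorial_recursive(n):
    if n == 0:                  # base case
        return 1
    return n * factorial_recursive(n - 1)   # general case

print(factorial_iterative(5), factorial_recursive(5))   # 120 120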

Page 52: Lecture  7

How Recursion Works

• To truly understand how recursion works we need to first explore how any function call works.

• When a program calls a subroutine (function) the current function must suspend its processing.

• The called function then takes over control of the program.

Page 53: Lecture  7

How Recursion Works

• When the function is finished, it needs to return to the function that called it.
• The calling function then 'wakes up' and continues processing.
• One important point in this interaction is that, unless changed through call-by-reference, all local data in the calling module must remain unchanged.

Page 54: Lecture  7

How Recursion Works

• Therefore, when a function is called, some information needs to be saved in order to return the calling module back to its original state (i.e., the state it was in before the call).

• We need to save information such as the local variables and the spot in the code to return to after the called function is finished.

Page 55: Lecture  7

How Recursion Works

• To do this we use a stack.
• Before a function is called, all relevant data is stored in a stackframe.
• This stackframe is then pushed onto the system stack.
• After the called function is finished, it simply pops the system stack to return to the original state.

Page 56: Lecture  7

How Recursion Works

• By using a stack, we can have functions call other functions which can call other functions, etc.

• Because the stack is a first-in, last-out data structure, as the stackframes are popped, the data comes out in the correct order.
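To make the push/pop order visible, the small Python sketch below (my own illustration, not from the slides) prints a line when each call starts and when it returns; the returns come back in the reverse order of the calls, exactly as the stack description predicts:

def factorial(n, depth=0):
    print("  " * depth + f"call factorial({n})")                   # frame pushed
    result = 1 if n == 0 else n * factorial(n - 1, depth + 1)
    print("  " * depth + f"return {result} from factorial({n})")   # frame popped
    return result

factorial(3)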

Page 57: Lecture  7

Main disadvantage of programming recursively

• The main disadvantage of programming recursively is that, while it makes it easier to write simple and elegant programs, it also makes it easier to write inefficient ones.

• When we use recursion to solve problems, we are interested exclusively in correctness, and not at all in efficiency. Consequently, our simple, elegant recursive algorithms may be inherently inefficient.

Page 58: Lecture  7

Limitations of Recursion

• Recursive solutions may involve extensive overhead because they use function calls.

• When a call is made, it takes time to build a stackframe and push it onto the system stack.

• Conversely, when a return is executed, the stackframe must be popped from the stack and the local variables reset to their previous values – this also takes time.

Page 59: Lecture  7

Limitations of Recursion

• In general, recursive algorithms run slower than their iterative counterparts.

• Also, every time we make a call, we must use some of the memory resources to make room for the stackframe.

Page 60: Lecture  7

Limitations of Recursion

• Therefore, if the recursion is deep, say, factorial(1000), we may run out of memory.

• Because of this, it is usually best to develop iterative algorithms when we are working with large numbers.

Page 61: Lecture  7

• Recursion is based upon calling the same function over and over, whereas iteration simply 'jumps back' to the beginning of the loop. A function call is often more expensive than a jump.