1
Time Analysis
Analyzing an algorithm = estimating the resources it requires.
Time: How long will it take to execute?
Impossible to find the exact value: it depends on the implementation, compiler, and architecture.
So let's use a different measure of time, e.g. the number of steps/simple operations.
Space: Amount of temporary storage required. We don't count the input.
2
Time Analysis
Goals: Compute the running time of an algorithm. Compare the running times of algorithms that solve the same problem.
Observations: Since the time it takes to execute an algorithm usually depends on the size of the input, we express the algorithm's time complexity as a function of the size of the input.
3
Time Analysis
Two different data sets of the same size may result in different running times
e.g. a sorting algorithm may run faster if the input array is already sorted.
As the size of the input increases, one algorithm's running time may increase much faster than another's
The first algorithm will be preferred for small inputs but the second will be chosen when the input is expected to be large.
4
Time Analysis
Ultimately, we want to discover how fast the running time of an algorithm increases as the size of the input increases. This is called the order of growth of the algorithm
Since the running time of an algorithm on an input of size n depends on the way the data is organized, we'll need to consider separate cases depending on whether the data is organized in a "favorable" way or not.
5
Time Analysis
Best case analysis: Given the algorithm and the input of size n that makes it run fastest (compared to all other possible inputs of size n), what is the running time?
Worst case analysis: Given the algorithm and the input of size n that makes it run slowest (compared to all other possible inputs of size n), what is the running time?
A bad worst-case complexity doesn't necessarily mean that the algorithm should be rejected.
Average case analysis: Given the algorithm and a typical, average input of size n, what is the running time?
6
Time Analysis
Iterative algorithms: Concentrate on the time it takes to execute the loops.
Recursive algorithms: Come up with a recursive function expressing the time and solve it.
7
Example: Sorting
Sorting algorithm: insertion sort. The idea:
Divide the array in two imaginary parts: sorted, unsorted.
The sorted part is initially empty.
Pick the first element from the unsorted part and insert it in its correct slot in the sorted part.
Do the insertion by traversing the sorted part to find where the element should be placed. Shift the other elements over to make room for it.
Repeat the process.
This algorithm is very efficient for sorting small arrays.
8
Example continued
As we "move" items to the sorted part of the array, this imaginary wall between the parts moves towards the end of the array. When it reaches the edge, we are done!
9
Example continued
Input:  array A of size n
Output: A, sorted in ascending order

function InsertionSort(A[1..n])
begin
    for i := 2 to n
        item := A[i]
        j := i - 1
        while (j > 0 and A[j] > item)
            A[j+1] := A[j]
            j := j - 1
        end while
        A[j+1] := item
    end for
end
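The pseudocode above translates almost line-for-line into C. A minimal sketch, using 0-based indexing (so the outer loop starts at index 1 instead of 2):

```c
/* Insertion sort, following the pseudocode above with 0-based indexing. */
void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int item = a[i];
        int j = i - 1;
        /* Shift larger elements one slot to the right to make room. */
        while (j >= 0 && a[j] > item) {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = item;   /* drop item into its correct slot */
    }
}
```

Calling insertion_sort on {5, 2, 4, 6, 1, 3} with n = 6 leaves the array sorted as {1, 2, 3, 4, 5, 6}.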
10
Example continued
The same pseudocode, annotated with how many times each line is executed (tj = the number of iterations of the while loop for a given i; it depends on the input):

    for i := 2 to n                        n times
        item := A[i]                       n-1 times
        j := i - 1                         n-1 times
        while (j > 0 and A[j] > item)      tj times, for each i
            A[j+1] := A[j]                 tj - 1 times, for each i
            j := j - 1                     tj - 1 times, for each i
        end while
        A[j+1] := item                     n-1 times
    end for
11
Example continued
Best case for insertion sort: The best case occurs when the array is already sorted. Then, the while loop is not executed and the running time of the algorithm is a linear function of n.
Worst case for insertion sort: The worst case occurs when the array is sorted in reverse. Then, for each value of i, the while loop is executed i-1 times. The running time of the algorithm is a quadratic function of n.
Average case for insertion sort: On average, the while loop is executed i/2 times for each value of i. The running time is again a quadratic function of n (albeit with a smaller coefficient than in the worst case).
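One way to see the best/worst-case gap concretely is to count how many times the while-loop body runs. A sketch (count_shifts is a hypothetical helper for illustration, not part of the slides):

```c
#include <stdlib.h>

/* Runs insertion sort on a copy of the input and returns how many times
   the while-loop body (the shifting step) executes.
   Hypothetical helper for illustration only. */
long count_shifts(const int *input, int n) {
    int *a = malloc(n * sizeof *a);
    long shifts = 0;
    for (int i = 0; i < n; i++) a[i] = input[i];
    for (int i = 1; i < n; i++) {
        int item = a[i], j = i - 1;
        while (j >= 0 && a[j] > item) {
            a[j + 1] = a[j];
            j--;
            shifts++;          /* one execution of the while-loop body */
        }
        a[j + 1] = item;
    }
    free(a);
    return shifts;
}
```

For a sorted array the count is 0 (best case, linear overall because of the outer loop); for a reverse-sorted array of size n it is n(n-1)/2 (worst case, quadratic).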
12
Algorithm Analysis
Observations: Some terms grow faster than others as the size of the input increases ==> these determine the rate of growth.
Example 1: In n² + 4n, n² grows much faster than 4n. We say that this is a quadratic function and concentrate on the term n².
Example 2: Both n² + 4n and n² are quadratic functions. They grow at approximately the same rate. Since they have the same rate of growth, we consider them "equivalent".
We are interested in finding upper (and lower) bounds for the order of growth of an algorithm.
13
Sorting: Lower bounds
We found that the running time of insertion sort is quadratic. Can we perhaps do better?
Comparison sort = a family of algorithms that use comparisons to determine the sorted order of a collection. Example: insertion sort.
Decision tree = a tree that represents the comparisons performed by a sorting algorithm on input of a given size.
14
Sorting: Lower bounds
Decision tree for 3 elements:
[Figure: a decision tree for sorting three elements a, b, c. The internal nodes hold the comparisons a ? b, a ? c, and b ? c; the six leaves hold the permutations (a, b, c), (a, c, b), (c, a, b), (b, a, c), (b, c, a), (c, b, a).]
Each internal node contains a comparison.
Each leaf contains a permutation.
Algorithm execution = a path from the root to a leaf.
15
Sorting: Lower bounds
Each internal node contains a comparison. Each leaf contains a permutation. Algorithm execution = a path from the root to a leaf. Worst-case number of comparisons = height of the decision tree.
Idea: If we find a lower bound on the height of the decision tree, we'll have a lower bound on the running time of any comparison-based sorting algorithm.
A decision tree that sorts n elements has at least n! leaves, so its height is at least lg(n!) ≈ n lg n.
This means that comparison-based sorting algorithms whose worst-case running time is at most n lg n are optimal!
16
MergeSort
Main idea:
break the list in two
sort each half recursively (base case: a single-element array is already sorted)
merge the two halves
17
Sorting: MergeSort

    void mergesort(int array[], int low, int high) {
        int mid;
        if (low < high) {
            mid = (low + high) / 2;
            mergesort(array, low, mid);
            mergesort(array, mid + 1, high);
            merge(array, low, mid, mid + 1, high);
        }
    }

/* merge() is used to merge the two sorted subarrays, [low..mid] and [mid+1..high], into one sorted array. This is done by selecting the smallest element at the front of each subarray and placing it in the "final" array. */
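The comment above describes merge() but the slides do not list it. One possible implementation matching the five-argument call above (a sketch, assuming a temporary heap-allocated buffer; mid2 is expected to equal mid + 1):

```c
#include <stdlib.h>
#include <string.h>

/* Merges the sorted runs array[low..mid] and array[mid2..high] into one
   sorted run, by repeatedly taking the smaller front element of the two
   runs.  A sketch, not the slides' own code. */
void merge(int array[], int low, int mid, int mid2, int high) {
    int n = high - low + 1;
    int *tmp = malloc(n * sizeof *tmp);
    int i = low, j = mid2, k = 0;
    while (i <= mid && j <= high)              /* pick the smaller front */
        tmp[k++] = (array[i] <= array[j]) ? array[i++] : array[j++];
    while (i <= mid)  tmp[k++] = array[i++];   /* copy any leftovers */
    while (j <= high) tmp[k++] = array[j++];
    memcpy(&array[low], tmp, n * sizeof *tmp); /* copy back in place */
    free(tmp);
}
```

Note the <= in the comparison: taking from the left run on ties is what makes mergesort stable.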
18
Sorting: MergeSort
Running time: Time to mergesort n items = twice the time to mergesort n/2 items + the time to merge a total of n items
T(n) = 2T(n/2) + n = ... = O(n lg n)
19
Quicksort
Basic idea:
Pick an element (called the pivot).
Partition the array in two subsequences: those smaller than or equal to the pivot and those larger than or equal to the pivot.
Sort each subsequence recursively.
20
Quicksort
Partitioning an array. Goal: move all elements smaller than or equal to the pivot to the left of the array and all elements larger than or equal to the pivot to the right of the array. In the end, the right part of the array will contain elements that are larger than or equal to those in the left part.
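A common way to implement the partitioning step is the Lomuto scheme, which uses the last element as the pivot. A sketch (not necessarily the variant the slides have in mind):

```c
/* Lomuto-style partition: uses a[high] as the pivot and returns the
   pivot's final index.  Elements <= pivot end up to its left, elements
   >= pivot to its right.  A sketch, not the slides' own code. */
int partition(int a[], int low, int high) {
    int pivot = a[high];
    int i = low - 1;                       /* end of the "<= pivot" region */
    for (int j = low; j < high; j++) {
        if (a[j] <= pivot) {               /* a[j] belongs in the left part */
            i++;
            int t = a[i]; a[i] = a[j]; a[j] = t;
        }
    }
    int t = a[i + 1]; a[i + 1] = a[high]; a[high] = t;
    return i + 1;
}

void quicksort(int a[], int low, int high) {
    if (low < high) {
        int p = partition(a, low, high);
        quicksort(a, low, p - 1);          /* sort each side recursively */
        quicksort(a, p + 1, high);
    }
}
```

With this scheme an already-sorted array triggers the worst case described below: every partition splits off subsequences of lengths 0 and n-1.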
21
Quicksort
Running time: depends on the selection of the pivot.
Best case (array partitioned in half every time):
time to quicksort n elements = time to partition the array + twice the time to quicksort n/2 elements
T(n) = 2T(n/2) + n = ... = O(n lg n)
22
Quicksort
Running time: depends on the selection of the pivot.
Average case: Still O(n lg n) but with a slightly larger constant.
Worst case: Partitioning n elements results in two subsequences of lengths 1 and n-1.
T(n) = T(n-1) + n = ... = O(n²)
23
Stable sorting algorithms
A sorting algorithm is stable if it does not change the relative order of items with the same value.
This is important when satellite data is carried around with the element being sorted.
Example: ATM transactions where the key is the account number. Initially the transactions are ordered by time, but if we sort them by account number, the relative order of the transactions must stay the same.
Insertion sort and mergesort are stable. Quicksort is not stable.
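The ATM example can be sketched in C: each record carries its arrival order as satellite data, and insertion sort's strict > comparison never moves equal keys past each other, which is why it is stable. (struct Txn and sort_txns are illustrative names, not from the slides.)

```c
/* A transaction record: the sort key plus "satellite" data. */
struct Txn {
    int account;   /* sort key */
    int arrival;   /* satellite data: original (time) order */
};

/* Insertion sort by account number.  The strict > comparison stops the
   shifting as soon as an equal key is seen, so records with the same
   account keep their original relative order: the sort is stable. */
void sort_txns(struct Txn t[], int n) {
    for (int i = 1; i < n; i++) {
        struct Txn item = t[i];
        int j = i - 1;
        while (j >= 0 && t[j].account > item.account) {
            t[j + 1] = t[j];
            j--;
        }
        t[j + 1] = item;
    }
}
```

Sorting the transactions {account 2, account 1, account 2, account 1} by account leaves the two account-1 records, and the two account-2 records, in their original time order.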
24
Space requirements
Mergesort needs additional space, Θ(n), for the temporary array.
All the other algorithms that we saw sort the arrays in place and need only constant additional space for temporary variables.
25
Comparison
Insertion sort
Speed: very good in the best case or on partially sorted arrays, bad in the worst case. Very little overhead.
Space: constant.
Stability: stable.
When to use: when the array is small or we already know that it's almost sorted.
Mergesort
Speed: optimal time in all cases. Overhead due to recursion.
Space: requires a copy of the array (extra n). Extra stack space for recursion.
Stability: stable.
When to use: when sorting a very large array, when time is critical, when space is not an issue.
Quicksort
Speed: optimal best/average case, bad worst case. Overhead due to recursion.
Space: constant. Extra stack space for recursion.
Stability: not stable.
When to use: when sorting a large array and time is not critical. Fast in practice.