Chapter 2


Transcript of Chapter 2

Page 1: Chapter 2

Chapter 2

Program Performance – Part 2

Page 2: Chapter 2

Step Counts

• Instead of accounting for the time spent only on selected operations, the step-count method accounts for the time spent in all parts of the program/function

• Program step: loosely defined as a syntactically or semantically meaningful segment of a program whose execution time is independent of the instance characteristics

• Each of the following counts as a single step, regardless of the operand values:

return a + b*c/(a - b)*4;

x = y;

Page 3: Chapter 2

Use a global variable to count program steps

Count = 2n + 3
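The instrumented code is not reproduced in the transcript; below is a minimal sketch, assuming the textbook's Sum(a, n), which adds the n elements of array a:

int count = 0;   // global step counter

template <class T>
T Sum(T a[], int n)
{
    T tsum = 0;
    count++;                  // one step for tsum = 0
    for (int i = 0; i < n; i++) {
        count++;              // one step for each loop test (n times)
        tsum += a[i];
        count++;              // one step for each addition (n times)
    }
    count++;                  // one step for the final, failing loop test
    count++;                  // one step for the return
    return tsum;
}

After Sum(a, n) returns, count has increased by 1 + n + n + 1 + 1 = 2n + 3, matching the count above.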

Page 4: Chapter 2

Counting steps in a recursive function

• t_Rsum(0) = 2
• t_Rsum(n) = 2 + t_Rsum(n - 1) for n > 0
• Unrolling: t_Rsum(n) = 2 + 2 + t_Rsum(n - 2) = … = 2n + t_Rsum(0)
• Hence t_Rsum(n) = 2(n + 1) for n >= 0
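A minimal sketch of the recursive version, instrumented with the same global count (assuming the textbook's Rsum, which sums a[0..n-1] recursively):

template <class T>
T Rsum(T a[], int n)
{
    count++;                  // one step for the n > 0 test
    if (n > 0) {
        count++;              // one step for the recursive return
        return Rsum(a, n - 1) + a[n - 1];
    }
    count++;                  // one step for the base-case return
    return 0;
}

The base case costs 2 steps, and each recursive level adds 2 more, giving the 2(n + 1) total derived above.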

Page 5: Chapter 2

Matrix Addition
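The code for this slide is not reproduced in the transcript; a plausible sketch of the matrix-addition function whose steps are counted on the next page (the name Add is assumed):

template <class T>
void Add(T** a, T** b, T** c, int rows, int cols)
{
    for (int i = 0; i < rows; i++)          // tested rows + 1 times
        for (int j = 0; j < cols; j++)      // tested rows*(cols + 1) times
            c[i][j] = a[i][j] + b[i][j];    // executes rows*cols times
}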

Page 6: Chapter 2

Count steps in Matrix Addition

count = 2·rows·cols + 2·rows + 1

(outer for loop: rows + 1 steps; inner for loop: rows·(cols + 1) steps; assignment: rows·cols steps)

Page 7: Chapter 2

Using a Step Table: Sum
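The table itself is not reproduced in the transcript; a reconstruction for Sum, charging one step per execution (s/e) of each statement:

Statement                   s/e    Frequency    Total steps
tsum = 0                    1      1            1
for (i = 0; i < n; i++)     1      n + 1        n + 1
tsum += a[i]                1      n            n
return tsum                 1      1            1
Total                                           2n + 3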

Page 8: Chapter 2

Rsum
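The Rsum table is likewise not reproduced; a reconstruction based on the Rsum sketch above:

Statement                            s/e    Frequency    Total steps
if (n > 0)                           1      n + 1        n + 1
return Rsum(a, n - 1) + a[n - 1]     1      n            n
return 0                             1      1            1
Total                                                    2n + 2 = 2(n + 1)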

Page 9: Chapter 2

Matrix Addition
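A reconstruction of the step table for the Add sketch above:

Statement                        s/e    Frequency          Total steps
for (i = 0; i < rows; i++)       1      rows + 1           rows + 1
for (j = 0; j < cols; j++)       1      rows*(cols + 1)    rows*cols + rows
c[i][j] = a[i][j] + b[i][j]      1      rows*cols          rows*cols
Total                                                      2*rows*cols + 2*rows + 1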

Page 10: Chapter 2

Matrix Transpose

template <class T>
void transpose(T** a, int rows)
{
    // swap across the diagonal; j starts at i + 1 so each pair is swapped once
    for (int i = 0; i < rows; i++)
        for (int j = i + 1; j < rows; j++)
            swap(a[i][j], a[j][i]);
}

Page 11: Chapter 2

Matrix Transpose

Page 12: Chapter 2

Inefficient way to compute the prefix sums

b[j] = a[0] + a[1] + … + a[j] = Σ_{i=0}^{j} a[i], for j = 0, 1, …, n-1

Note: the number of s/e for Sum() varies depending on its parameters
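A minimal sketch of the inefficient computation, assuming it calls the instrumented Sum from earlier and the function name Inef referenced on a later slide:

template <class T>
void Inef(T a[], T b[], int n)
{
    for (int j = 0; j < n; j++)
        b[j] = Sum(a, j + 1);   // recomputes each prefix sum from scratch
}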

Page 13: Chapter 2

Steps Per Execution

• Sum(a, n) requires 2n + 3 steps
• Sum(a, j + 1) requires 2(j + 1) + 3 = 2j + 5 steps
• The assignment b[j] = Sum(a, j + 1) adds one step ==> 2j + 6 steps
• Total: Σ_{j=0}^{n-1} (2j + 6) = n(n + 5)

Page 14: Chapter 2

Prefix sums

Page 15: Chapter 2

Sequential Search - Best case
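The search code is not reproduced in the transcript; a plausible sketch of sequential search, to which the best-, worst-, and average-case analyses on these slides refer:

template <class T>
int SequentialSearch(T a[], int n, const T& x)
{
    int i;
    for (i = 0; i < n && a[i] != x; i++);   // scan until x is found or the array is exhausted
    if (i == n) return -1;                   // x is not in a[0..n-1]
    return i;
}

Best case: x == a[0], a constant number of steps. Worst case: x is the last element or absent, so the cost grows linearly in n.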

Page 16: Chapter 2

Sequential Search - Worst case

Page 17: Chapter 2

Average for successful searches

• X has equal probability of being any one of the n elements of a
• Step count if X is a[j]: j + 4

Page 18: Chapter 2

Average for successful searches

t_AVG^{SequentialSearch}(n) = (1/n) · Σ_{j=0}^{n-1} (j + 4) = (n + 7)/2

Page 19: Chapter 2

Insertion in a Sorted Array – Best Case
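The insertion code is not reproduced in the transcript; a plausible sketch (the name Insert is assumed) of inserting x into the sorted array a[0..n-1]:

template <class T>
void Insert(T a[], int& n, const T& x)
{
    int i;
    for (i = n - 1; i >= 0 && x < a[i]; i--)
        a[i + 1] = a[i];   // shift larger elements one position right
    a[i + 1] = x;          // drop x into the hole
    n++;
}

Best case: x >= a[n-1] and nothing shifts. Worst case: x < a[0] and all n elements shift.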

Page 20: Chapter 2

Insertion – Worst Case

Page 21: Chapter 2

Insertion - Average

• The step count for inserting into position j is 2n - 2j + 3
• Average count:

(1/(n + 1)) · Σ_{j=0}^{n} (2n - 2j + 3) = (1/(n + 1)) · Σ_{k=0}^{n} (2k + 3) = n + 3

Page 22: Chapter 2

Asymptotic Notation

• Objectives of performance evaluation:
  – Compare the time complexities of two programs that compute the same function
  – Predict the growth in run time as the instance characteristics change
• Neither the operation-count nor the step-count method is accurate for these objectives
  – Op count: counts some operations and ignores others
  – Step count: the definition of a step is inexact

Page 23: Chapter 2

Asymptotic Notation

• Consider two programs:
  – Program A with complexity c1n² + c2n
  – Program B with complexity c3n
• Program B is faster than program A for sufficiently large values of n
• For small values of n, either could be faster, and it may not matter anyway
• There is a break-even point for n beyond which B is always faster than A

Page 24: Chapter 2

Asymptotic Notation

• Describes the behavior of the space and time complexities of programs for LARGE instance characteristics
  – Establishes a relative order among functions
  – Compares their relative rates of growth
• Allows us to make meaningful, though inexact, statements about the complexity of programs

Page 25: Chapter 2

Mathematical background

T(n) denotes the time or space complexity of a program

Big-Oh: the growth rate of T(n) is <= f(n)

• T(n) = O(f(n)) iff positive constants c and n0 exist such that T(n) <= c·f(n) whenever n >= n0
• f is an upper-bound function for T
• Example: "Algorithm A is O(n²)" means that, for data sets big enough (n > n0), algorithm A executes fewer than c·n² steps (c a positive constant)

Page 26: Chapter 2

The Idea

• Example: 1000n
  – larger than n² for small values of n
  – n² grows at a faster rate, so n² eventually becomes the larger function
• Here we have T(n) = 1000n, f(n) = n², n0 = 1000, and c = 1
  – T(n) <= c·f(n) for n >= n0
• Thus we say that 1000n = O(n²)
• Note that a tighter upper bound (in fact O(n)) is possible

Page 27: Chapter 2

Example

• Suppose T(n) = 10n² + 4n + 2
• For n >= 2, T(n) <= 10n² + 5n
• For n >= 5, T(n) <= 11n²
• T(n) = O(n²)

Page 28: Chapter 2

Big Oh Ratio Theorem

• T(n) = O(f(n)) iff lim_{n→∞} (T(n)/f(n)) < c for some finite constant c
• f(n) dominates T(n)

Page 29: Chapter 2

Examples

• Suppose T(n) = 10n² + 4n + 2
• T(n)/n² = 10 + 4/n + 2/n²
• lim_{n→∞} (T(n)/n²) = 10
• T(n) = O(n²)

Page 30: Chapter 2

Common Orders of Magnitude

Function     Name
1            Constant
log n        Logarithmic
log²n        Log-squared
n            Linear
n log n      Log-linear
n²           Quadratic
n³           Cubic
2ⁿ           Exponential
n!           Factorial

Page 31: Chapter 2

Loose Bounds

• Suppose T(n) = 10n² + 4n + 2
• 10n² + 4n + 2 <= 11n³ for n >= 2
• T(n) = O(n³), a correct but loose bound
• We need to get the smallest upper bound, here O(n²)

Page 32: Chapter 2

Polynomials

• If T(n) = a_m·n^m + … + a_1·n + a_0, then T(n) = O(n^m)

Page 33: Chapter 2

Omega Notation - Lower Bound

Omega:

• T(n) = Ω(g(n)) iff positive constants c and n0 exist such that T(n) >= c·g(n) for all n >= n0
• Establishes a lower bound
• e.g., T(n) = c1n² + c2n
• c1n² + c2n >= c1n² for all n >= 1
• T(n) >= c1n² for all n >= 1
• T(n) is Ω(n²)
• Note: T(n) is also Ω(n) and Ω(1). We need to get the largest lower bound.

Page 34: Chapter 2

Omega Ratio Theorem

• T(n) = Ω(f(n)) iff lim_{n→∞} (f(n)/T(n)) <= c for some finite constant c

Page 35: Chapter 2

Lower Bound of Polynomials

• If T(n) = a_m·n^m + … + a_1·n + a_0 with a_m > 0, then T(n) = Ω(n^m)
• T(n) = n⁴ + 3500n³ + 400n² + 1
• T(n) is Ω(n⁴)

Page 36: Chapter 2

Theta Notation

Theta: when O and Ω meet, we indicate it with Θ notation

• Definition: T(n) = Θ(h(n)) iff positive constants c1, c2, and n0 exist such that c1·h(n) <= T(n) <= c2·h(n) for all n >= n0
• T(n) = Θ(h(n)) iff T(n) = O(h(n)) and T(n) = Ω(h(n))
• e.g., T(n) = 3n + 8
• 3n <= 3n + 8 <= 11n for n >= 1
• T(n) = Θ(n)
• T(n) = 20·log2(n) + 8 = Θ(log2(n))
• log2(n) < 20·log2(n) + 8 <= 21·log2(n) for all n >= 256

Page 37: Chapter 2

Theta Notation (contd.)

• T(n) = 1000n
• T(n) = O(n²), but T(n) != Θ(n²) because T(n) != Ω(n²)

Page 38: Chapter 2

Theta of Polynomials

• If T(n) = a_m·n^m + … + a_1·n + a_0 with a_m > 0, then T(n) = Θ(n^m)

Page 39: Chapter 2

Little-o Notation

Little-oh: the growth rate of T(n) is < p(n)

• T(n) = o(p(n)) iff T(n) = O(p(n)) and T(n) != Θ(p(n))
• e.g., T(n) = 1000n
• T(n) = o(n²)

Page 40: Chapter 2

Simplifying Rules

• If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)).

• If f(n) is O(kg(n)) for any k>0, then f(n) is O(g(n)).

• If f1(n) = O(g1(n)) and f2(n) = O(g2(n)), then

(a) (f1 + f2)(n) = max(O(g1(n)), O(g2(n)))

(b) f1(n) * f2(n) = O(g1(n) * g2(n))
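For example, if f1(n) = O(n²) and f2(n) = O(n), then rule (a) gives (f1 + f2)(n) = O(n²) and rule (b) gives f1(n) * f2(n) = O(n³).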

Page 41: Chapter 2

Some Points

• DO NOT include constants or low-order terms inside a Big-Oh
• For example:
  – T(n) = O(2n²) or
  – T(n) = O(n² + n)
• are the same as:
  – T(n) = O(n²)

Page 42: Chapter 2

Examples

• Example 1: a = b;
  This assignment takes constant time, so it is Θ(1)

• Example 2:

sum = 0;
for (i = 0; i <= n; i++)
    sum += n;

• time complexity is Θ(n)

Page 43: Chapter 2

Examples (contd.)

a = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        a++;

• time complexity is Θ(n²)

Page 44: Chapter 2

Examples (contd.)

a = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= i; j++)
        a++;

• the a++ statement executes 1 + 2 + … + n = n(n + 1)/2 times
• time complexity is Θ(n²)

Page 45: Chapter 2

Examples (contd.)

a = 0;                       // Θ(1)
for (i = 1; i <= n; i++)
    for (j = 1; j <= i; j++)
        a++;                 // Θ(n²) in total
for (k = 1; k <= n; k++)     // Θ(n)
    A[k] = k - 1;

• time complexity is Θ(1) + Θ(n²) + Θ(n) = Θ(n²)

Page 46: Chapter 2

Examples (contd.)

• Not all doubly nested loops execute n² times

a = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j *= 2)
        a++;

• The inner loop executes about log2(n) times
• The outer loop executes n times
• time complexity is Θ(n log2(n))

Page 47: Chapter 2

First determine the asymptotic complexity of each statement, and then add up

Page 48: Chapter 2

Asymptotic complexity of Rsum

Page 49: Chapter 2

Asymptotic complexity of Matrix Addition

Page 50: Chapter 2

Asymptotic complexity of Transpose

Page 51: Chapter 2

Asymptotic complexity of Inef

Page 52: Chapter 2

Asymptotic complexity of Sequential Search

Page 53: Chapter 2

Binary Search

Worst-case complexity is Θ(log n)
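The code is not reproduced in the transcript; a plausible sketch of binary search over a sorted array (the name BinarySearch is assumed):

template <class T>
int BinarySearch(T a[], int n, const T& x)
{
    int left = 0, right = n - 1;
    while (left <= right) {
        int middle = (left + right) / 2;    // probe the midpoint
        if (x == a[middle]) return middle;
        if (x > a[middle]) left = middle + 1;
        else right = middle - 1;
    }
    return -1;   // x not found
}

Each iteration halves the remaining range, so at most about log2(n) + 1 iterations occur, which gives the Θ(log n) worst case.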

Page 54: Chapter 2

Performance Measurement

Chapter 2 Section 6

Page 55: Chapter 2

Run time on a pseudo machine

Page 56: Chapter 2

Conclusions

• The utility of a program with exponential complexity is limited to small n (typically n <= 40)
• Programs whose complexity is a high-degree polynomial are also of limited utility
• Linear complexity is desirable in practice

Page 57: Chapter 2

Performance Measurement

• Obtain the actual space and time requirements of a program
• Choosing instance sizes
• Developing the test data - data that exhibits the best-, worst-, and average-case time complexities (the average case via randomly generated data)
• Setting up the experiment - write a program that measures the desired run times
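A minimal sketch of such a measurement program, assuming insertion sort as the program under test; the repeated-runs loop anticipates the "Measuring with repeated runs" slide below, since a single run is too short for the clock's granularity:

#include <ctime>
#include <cstdlib>
#include <iostream>

void insertionSort(int a[], int n)   // sort a[0..n-1] into nondecreasing order
{
    for (int i = 1; i < n; i++) {
        int t = a[i], j;
        for (j = i - 1; j >= 0 && t < a[j]; j--)
            a[j + 1] = a[j];
        a[j + 1] = t;
    }
}

int main()
{
    const int n = 1000;            // instance size (chosen for illustration)
    const int repetitions = 100;   // repeated runs to overcome clock granularity
    static int a[n];
    clock_t start = clock();
    for (int r = 0; r < repetitions; r++) {
        for (int i = 0; i < n; i++)
            a[i] = rand() % 10000;   // random data for the average case
        insertionSort(a, n);
    }
    double elapsed = double(clock() - start) / CLOCKS_PER_SEC;
    std::cout << "time per sort: " << elapsed / repetitions << " s\n";
    return 0;
}

Note that the data-generation loop is timed along with the sort; the "Do without overhead" slides below refine the experiment by measuring that overhead separately and subtracting it.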

Page 58: Chapter 2

Measuring the performance of Insertion Sort Program

Page 59: Chapter 2

Measuring the performance of Insertion Sort Program (continued)

Page 60: Chapter 2

Experimental results - Insertion Sort

Page 61: Chapter 2

Measuring with repeated runs

Page 62: Chapter 2

Do without overhead
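The slide's code is not reproduced; a minimal sketch of the idea, reusing insertionSort from the sketch above: time the data generation alone, then subtract it from the combined time.

// assumes insertionSort, <ctime>, and <cstdlib> from the earlier sketch
double timeLoop(bool doSort, int a[], int n, int repetitions)
{
    clock_t start = clock();
    for (int r = 0; r < repetitions; r++) {
        for (int i = 0; i < n; i++)
            a[i] = rand() % 10000;   // data generation: the overhead
        if (doSort)
            insertionSort(a, n);
    }
    return double(clock() - start) / CLOCKS_PER_SEC;
}

// net sorting time = timeLoop(true, a, n, reps) - timeLoop(false, a, n, reps)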

Page 63: Chapter 2

Do without overhead (continued)

Page 64: Chapter 2

Overhead

Page 65: Chapter 2

End of Chapter 2