CSCE 2100: Computing Foundations 1 Running Time of Programs


CSCE 2100: Computing Foundations 1

Running Time of Programs

Tamara Schneider, Summer 2013


What is Efficiency?

• Time it takes to run a program?
• Resources
  – Storage space taken by variables
  – Traffic generated on the computer network
  – Amount of data moved to and from disk


Summarizing Running Time

Benchmarking
• Use of benchmarks: a small collection of typical inputs

Analysis
• Group inputs based on size

Running time is influenced by various factors
• Computer
• Compiler


Running Time

• worst-case running time: maximum running time over all inputs of size n

• average running time: average running time over all inputs of size n

• best-case running time: minimum running time over all inputs of size n
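
A small illustration (added here, not from the slides, in C like the later code segments): linear search over an array of size n, where the three cases differ.

/* Return the index of key in a[0..n-1], or -1 if it is absent.
   Best case:  key is at a[0]                -> 1 comparison
   Worst case: key is absent (or at a[n-1])  -> n comparisons
   Average case (key present, each position equally likely): about n/2 comparisons */
int linear_search(const int a[], int n, int key)
{
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;
    return -1;
}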


Worst, Best, and Average Case


Running Time of a Program

T(n) is the running time of a program as a function of the input size n.
– For example, T(n) = cn indicates that the running time is linearly proportional to the size of the input, that is, linear time.


Running Time of Simple Statements

We assume that “primitive operations” take a single instruction.
– Arithmetic operations (+, %, *, -, ...)
– Logical operations (&&, ||, ...)
– Accessing operations (A[i], x->y, ...)
– Simple assignment
– Calls to library functions (scanf, printf, ...)
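
For instance (an illustration added here, not from the slides), under this convention the single statement below costs four primitive operations:

A[i] = x * y + 1;   /* one array access, one multiplication, one addition, one assignment: 4 operations */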

Code Segment 1

sum = 0;
for(i=0; i<n; i++)
    sum++;

Counting instructions:
• sum = 0;               1
• for(i=0; i<n; i++)     1 + (n+1) + n = 2n+2
  (one initialization of i, n+1 tests of i<n, n increments i++)
• sum++;                 How many times? Executed n times, 1 instruction each

Total: 1 + (2n+2) + n*1 = 3n + 3

Complexity?
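
As a sanity check (a sketch added here, not on the slides), the loop can be instrumented with a counter that is bumped once per primitive operation; under the convention above it reports exactly 3n + 3.

#include <stdio.h>

int main(void)
{
    int n = 10;
    long ops = 0;                      /* number of primitive operations */
    int sum, i;

    ops++; sum = 0;                    /* 1   : assignment          */
    for (ops++, i = 0;                 /* 1   : loop initialization */
         ops++, i < n;                 /* n+1 : loop tests          */
         ops++, i++) {                 /* n   : increments          */
        ops++; sum++;                  /* n   : loop body           */
    }

    printf("sum = %d, operations = %ld, 3n+3 = %d\n", sum, ops, 3 * n + 3);
    return 0;
}

For n = 10 this prints 33 operations, matching 3n + 3.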

Code Segment 2

sum = 0;
for(i=0; i<n; i++)
    for(j=0; j<n; j++)
        sum++;

Counting instructions:
• sum = 0;               1
• outer for loop         2n+2
• inner for loop         2n+2 for each of the n outer iterations
• sum++;                 1 instruction, executed n*n times

Total: 1 + (2n+2) + (2n+2)*n + n*n*1

Complexity?
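
Expanding the total (a step the slides leave implicit) gives the closed form that reappears in the solutions table at the end:

1 + (2n+2) + (2n+2)·n + n·n·1 = 1 + 2n + 2 + 2n² + 2n + n² = 3n² + 4n + 3, which is O(n²).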

Code Segment 3

sum = 0;
for(i=0; i<n; i++)
    for(j=0; j<n*n; j++)
        sum++;

• sum = 0;               1
• sum++;                 1 instruction per execution
• outer for loop         2n+2
• inner for loop         still 2n+2? (No: one initialization, n²+1 tests of j<n*n, and n² increments give 2n²+2 per outer iteration.)

Complexity?

Code Segment 4

sum = 0;
for(i=0; i<=n; i++)
    for(j=0; j<i; j++)
        sum++;

• sum = 0;               1
• sum++;                 1 instruction per execution
• outer for loop         2n+4? (i runs from 0 to n, so n+1 iterations)
• inner for loop         depends on i

How often does the inner body execute?
i=0: never   i=1: j=0   i=2: j=0,1   i=3: j=0,1,2   ...   i=n: j=0,1,...,n-1

Complexity?
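
A short worked sum (not spelled out on the slide) pins down the dominant term: sum++ executes once for every pair (i, j) with 0 ≤ j < i ≤ n, so it runs

0 + 1 + 2 + ... + n = n(n+1)/2 = ½n² + ½n

times. The exact instruction count is messy, but this already shows the quadratic growth behind the O(n²) answer in the solutions table.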

How Do Running Times Compare?

[Plot comparing the growth of four running-time functions: n², 3·2ⁿ, 3n-1, and n-1.]
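
A few sample values (added here for illustration; the slide shows only the plot) make the ordering concrete:

n = 5:   n-1 = 4    3n-1 = 14   n² = 25    3·2ⁿ = 96
n = 10:  n-1 = 9    3n-1 = 29   n² = 100   3·2ⁿ = 3072
n = 20:  n-1 = 19   3n-1 = 59   n² = 400   3·2ⁿ = 3,145,728

The exponential term quickly dominates, even though all four start out in the same range.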

Towards “Big Oh”

[Plot: time t versus input size n. T(n) describes the runtime of some program, e.g. T(n) = 2n² - 4n + 3; it is plotted together with c·f(n) = 5n², i.e. c = 5 and f(n) = n², which lies above T(n) from some point n0 on.]

We can observe that for input sizes n ≥ n0, the graph of the function c·f(n) has a higher time value than the graph of the function T(n).

For n ≥ n0, c·f(n) is an upper bound on T(n), i.e. c·f(n) ≥ T(n).


Big-Oh [1]

• It is too much work to use the exact number of machine instructions
• Instead, hide the details
  – average number of compiler-generated machine instructions
  – average number of instructions executed by a machine per second
• Simplification
  – Instead of 4m-1, write O(m)
• O(m) ?!


Big-Oh [2]

Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0 such that ∀ n ≥ n0: T(n) ≤ c·f(n)

• We restrict the argument n to integers
• T(n) is nonnegative for all n

∃ “there exists”   ∀ “for all”


Big-Oh - Example [1]

Example 1:
T(0) = 1
T(1) = 4
T(2) = 9
In general: T(n) = (n+1)²

Is T(n) also O(n²)?

Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ c·f(n)


Big-Oh - Example [2]

T(n) = (n+1)². We want to show that T(n) is O(n²). In other words, f(n) = n².

If this is true, there exist an integer n0 and a constant c > 0 such that for all n ≥ n0: T(n) ≤ c·n².

Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ c·f(n)


Big-Oh - Example [3]

T(n) ≤ c·n² ⇔ (n+1)² ≤ c·n²

Choose c = 4, n0 = 1: show that (n+1)² ≤ 4n² for n ≥ 1.

(n+1)² = n² + 2n + 1
       ≤ n² + 2n² + 1      (since 2n ≤ 2n² for n ≥ 1)
       = 3n² + 1
       ≤ 3n² + n²          (since 1 ≤ n² for n ≥ 1)
       = 4n²
       = c·n²

Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ c·f(n)


Big-Oh - Example [Alt 3]

T(n) ≤ c·n² ⇔ (n+1)² ≤ c·n²

Choose c = 2, n0 = 3: show that (n+1)² ≤ 2n² for n ≥ 3.

(n+1)² = n² + 2n + 1
       ≤ n² + n²           (since 2n + 1 ≤ n² for all n ≥ 3)
       = 2n²
       = c·n²

Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ c·f(n)


Simplification Rules for Big-Oh

• Constant factors can be omitted
  – O(54n²) = O(n²)
• Lower-order terms can be omitted
  – O(n⁴ + n²) = O(n⁴)
  – O(n²) + O(1) = O(n²)
• Note that the highest-order term should never be negative.
  – Lower-order terms can be negative.
  – Negative terms can be omitted since they do not increase the runtime.


Transitivity [1]

What is transitivity?
– A relation ☺ is transitive if A ☺ B and B ☺ C imply A ☺ C
– Example: a < b and b < c, then a < c
  e.g. 2 < 4 and 4 < 7, then 2 < 7, since “<” is transitive

Is Big Oh transitive?

Transitivity [2]

• If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n))
  – f(n) is O(g(n)): ∃ n1, c1 such that f(n) ≤ c1·g(n) ∀ n ≥ n1
  – g(n) is O(h(n)): ∃ n2, c2 such that g(n) ≤ c2·h(n) ∀ n ≥ n2
  – Choose n0 = max{n1, n2} and c = c1·c2
  – Then for all n ≥ n0: f(n) ≤ c1·g(n) ≤ c1·(c2·h(n)) = c·h(n) ⇒ f(n) is O(h(n))

Tightness

• Use constant factor “1”
• Use the tightest upper bound that we can prove
  – 3n is O(n²) and O(n) and O(2n)
    Which one should we use?

Summation Rule [1]

Consider a program that contains 2 parts:

• Part 1 takes T1(n) time and is O(f1(n))
• Part 2 takes T2(n) time and is O(f2(n))
• We also know that f2 grows no faster than f1 ⇒ f2(n) is O(f1(n))
• What is the running time of the entire program?
  – T1(n) + T2(n) is O(f1(n) + f2(n))
  – But can we simplify this?

Summation Rule [2]

• T1(n) + T2(n) is O(f1(n)), since f2 grows no faster than f1
• Proof:
  T1(n) ≤ c1·f1(n) for n ≥ n1
  T2(n) ≤ c2·f2(n) for n ≥ n2
  f2(n) ≤ c3·f1(n) for n ≥ n3
  Let n0 = max{n1, n2, n3}. Then for all n ≥ n0:
  T1(n) + T2(n) ≤ c1·f1(n) + c2·f2(n)
               ≤ c1·f1(n) + c2·c3·f1(n)
               = (c1 + c2·c3)·f1(n) = c·f1(n)   with c = c1 + c2·c3
  ⇒ T1(n) + T2(n) is O(f1(n))


Summation Rule - Example

//make A identity matrix
scanf("%d", &d);             O(1)
for(i=0; i<n; i++)
    for(j=0; j<n; j++)
        A[i][j] = 0;         nested loops: O(n²)
for(i=0; i<n; i++)
    A[i][i] = 1;             single loop: O(n)

O(1) + O(n²) + O(n) = O(n²)
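
A self-contained version of the fragment (the declarations, the fixed size N, and the printout are assumptions added here; the slide shows only the loops):

#include <stdio.h>

#define N 4                            /* assumed matrix size for this sketch */

int main(void)
{
    int A[N][N];
    int d, i, j;
    int n = N;

    if (scanf("%d", &d) != 1)          /* O(1): read one value, as in the fragment */
        return 1;

    for (i = 0; i < n; i++)            /* O(n^2): zero out the whole matrix */
        for (j = 0; j < n; j++)
            A[i][j] = 0;

    for (i = 0; i < n; i++)            /* O(n): set the diagonal */
        A[i][i] = 1;

    for (i = 0; i < n; i++) {          /* print the resulting identity matrix */
        for (j = 0; j < n; j++)
            printf("%d ", A[i][j]);
        printf("\n");
    }
    return 0;
}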

Summary of Rules & Concepts [1]

• Worst-case, average-case, and best-case running time are compared for a fixed input size n, not for varying n!
• Counting instructions
  – Assume 1 instruction for assignments, simple calculations, comparisons, etc.
• Definition of Big-Oh:
  T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ c·f(n)

Summary of Rules & Concepts [2]

• Rule 1: Constant factors can be omitted
  – Example: O(3n⁵) = O(n⁵)
• Rule 2: Low-order terms can be omitted
  – Example: O(3n⁵ + 10n⁴ - 4n³ + n + 1) = O(3n⁵)
• We can combine Rule 1 and Rule 2:
  – Example: O(3n⁵ + 10n⁴ - 4n³ + n + 1) = O(n⁵)

Summary of Rules & Concepts [3]

• For O(f(n) + g(n)), we can neglect the function with the slower growth rate.
  – Example: O(f(n) + g(n)) = O(n + n·log n) = O(n·log n)
• Transitivity: If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n))
  – Example: f(n) = 3n, g(n) = n², h(n) = n⁶
    3n is O(n²) and n² is O(n⁶) ⇒ 3n is O(n⁶)
• Tightness: We try to find an upper-bound Big-Oh that is as small as possible.
  – Example: n² is O(n⁶), but O(n²) is a much tighter (and better) bound.


Solutions to Instruction Counts on Code Segments

                    Instructions      Big-Oh
Code Segment 1      3n + 3            O(n)
Code Segment 2      3n² + 4n + 3      O(n²)
Code Segment 3      3n³ + 4n + 3      O(n³)
Code Segment 4      Argh!             O(n²)