The running time of an algorithm as input size approaches infinity is called the asymptotic running...


Transcript of The running time of an algorithm as input size approaches infinity is called the asymptotic running...

Page 1: The running time of an algorithm as input size approaches infinity is called the asymptotic running time. We study different notations for asymptotic efficiency.
Page 2:

The running time of an algorithm as input size approaches infinity is called the asymptotic running time.

We study different notations for asymptotic efficiency.

In particular, we study tight bounds, upper bounds and lower bounds.

Page 3:

Why do we need the different sets? Definition of the sets O (big oh), Ω (Omega),

Θ (Theta), and o (little oh). Classifying examples:

◦ Using the original definition
◦ Using limits

Page 4:

Let f(n) and g(n) be asymptotically nonnegative functions whose domains are the set of natural numbers N={0,1,2,…}.

A function g(n) is asymptotically nonnegative if g(n) ≥ 0 for all n ≥ n0, where n0 ∈ N.

Page 5:

Big “oh” - asymptotic upper bound on the growth of an algorithm

When do we use Big Oh?
1. Theory of NP-completeness
2. To provide information on the maximum number of operations that an algorithm performs
◦ If an algorithm is O(n²) in the worst case, this means that in the worst case it performs at most cn² operations

Page 6:

O(f(n)) is the set of functions g(n) such that: there exist a positive real constant c and a nonnegative integer N for which g(n) ≤ c·f(n) for all n ≥ N.

f(n) is called an asymptotic upper bound for g(n).

Page 7:

[Figure: plot of g(n) and c·f(n) versus n; for all n ≥ N, g(n) stays below c·f(n).]

Page 8:

[Figure: plot of n² + 10n and 2n² for n from 0 to 30; the curves cross at n = 10.]

Example: n² + 10n ∈ O(n²). Take c = 2 and N = 10; then n² + 10n ≤ 2n² for all n ≥ 10.
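The chosen witnesses can be checked numerically; a minimal sketch (an illustration over a finite range, not a substitute for the algebra):

```python
# Check the slide's witnesses c = 2, N = 10 for n^2 + 10n in O(n^2):
# the inequality n^2 + 10n <= 2*n^2 should hold for every n >= 10.
def bound_holds(n):
    return n**2 + 10*n <= 2 * n**2

assert all(bound_holds(n) for n in range(10, 10_000))
assert not bound_holds(9)  # just below N = 10 the inequality fails
```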

Page 9:

Show that 5n + 2 ∈ O(n).

Proof: From the definition of Big Oh, there must exist c > 0 and an integer N > 0 such that 0 ≤ 5n + 2 ≤ cn for all n ≥ N.

Dividing both sides of the inequality by n > 0 we get: 0 ≤ 5 + 2/n ≤ c.

2/n ≤ 2, and 2/n > 0 becomes smaller as n increases.

There are many choices here for c and N.

Page 10:

If we choose N = 1, then 5 + 2/n ≤ 5 + 2/1 = 7. So any c ≥ 7 works. Choose c = 7.

If we choose c = 6, then 0 ≤ 5 + 2/n ≤ 6 requires 2/n ≤ 1, so any N ≥ 2 works. Choose N = 2.

In either case (we only need one!) we have c > 0 and N > 0 such that 0 ≤ 5n + 2 ≤ cn for all n ≥ N. So the definition is satisfied and

5n + 2 ∈ O(n)
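Either pair of witnesses can be verified numerically over a finite range; a small sketch:

```python
# Witnesses for 5n + 2 in O(n): either (c = 7, N = 1) or (c = 6, N = 2).
def holds(c, N, upto=10_000):
    return all(0 <= 5*n + 2 <= c*n for n in range(N, upto))

assert holds(7, 1)
assert holds(6, 2)
assert not holds(6, 1)  # c = 6 fails at n = 1, since 5*1 + 2 = 7 > 6
```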

Page 11:

Show that n² ∉ O(n). We will prove by contradiction that the definition cannot be satisfied.

Assume that n² ∈ O(n). From the definition of Big Oh, there must exist c > 0 and an integer N > 0 such that 0 ≤ n² ≤ cn for all n ≥ N.

Dividing the inequality by n > 0 we get 0 ≤ n ≤ c for all n ≥ N.

But n ≤ c cannot be true for any n > max{c, N}, contradicting our assumption.

So there is no constant c > 0 such that n ≤ c is satisfied for all n ≥ N, and n² ∉ O(n).
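The contradiction can be made concrete: whatever constant c is proposed, any n larger than c already breaks the inequality. A minimal sketch:

```python
# For ANY candidate constant c, taking n just above c violates n^2 <= c*n,
# because n^2 <= c*n is equivalent to n <= c.
for c in (1, 10, 1_000_000):
    n = c + 1
    assert n**2 > c * n  # so no constant c can work for all n
```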

Page 12:

1,000,000 n² ∈ O(n²)? Why/why not? True

(n - 1)n/2 ∈ O(n²)? Why/why not? True

n/2 ∈ O(n²)? Why/why not? True

lg(n²) ∈ O(lg n)? Why/why not? True

n² ∈ O(n)? Why/why not? False

Page 13:

Omega (Ω) - asymptotic lower bound on the growth of an algorithm or a problem*

When do we use Omega? To provide information on the minimum number of operations that an algorithm performs
◦ If an algorithm is Ω(n) in the best case, this means that in the best case its instruction count is at least cn
◦ If it is Ω(n²) in the worst case, this means that in the worst case its instruction count is at least cn²

Page 14:

Ω(f(n)) is the set of functions g(n) such that: there exist a positive real constant c and a nonnegative integer N for which g(n) ≥ c·f(n) for all n ≥ N.

f(n) is called an asymptotic lower bound for g(n).

Page 15:

[Figure: plot of g(n) and c·f(n) versus n; for all n ≥ N, g(n) stays above c·f(n).]

Page 16:

Show that 5n - 20 ∈ Ω(n).

Proof: From the definition of Omega, there must exist c > 0 and an integer N > 0 such that 0 ≤ cn ≤ 5n - 20 for all n ≥ N.

Dividing the inequality by n > 0 we get: 0 ≤ c ≤ 5 - 20/n for all n ≥ N.

20/n ≤ 20, and 20/n becomes smaller as n grows.

There are many choices here for c and N. Since c > 0, we need 5 - 20/n > 0, so N > 4.

If we choose N = 5, then 1 = 5 - 20/5 ≤ 5 - 20/n, so any 0 < c ≤ 1 works. Choose c = 1/2.

If we choose c = 4, then 5 - 20/n ≥ 4 requires N ≥ 20. Choose N = 20.

In either case (we only need one!) we have c > 0 and N > 0 such that 0 ≤ cn ≤ 5n - 20 for all n ≥ N. So the definition is satisfied and 5n - 20 ∈ Ω(n).
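Both witness pairs can again be checked numerically over a finite range; a minimal sketch:

```python
# Witnesses for 5n - 20 in Omega(n): either (c = 1/2, N = 5) or (c = 4, N = 20).
def holds(c, N, upto=10_000):
    return all(0 <= c*n <= 5*n - 20 for n in range(N, upto))

assert holds(0.5, 5)
assert holds(4, 20)
assert not holds(4, 19)  # at n = 19: 4*19 = 76 > 5*19 - 20 = 75
```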

Page 17:

1,000,000 n² ∈ Ω(n²)? Why/why not? True

(n - 1)n/2 ∈ Ω(n²)? Why/why not? True

n/2 ∈ Ω(n²)? Why/why not? False

lg(n²) ∈ Ω(lg n)? Why/why not? True

n² ∈ Ω(n)? Why/why not? True

Page 18:

Theta (Θ) - asymptotic tight bound on the growth rate of an algorithm
◦ If an algorithm is Θ(n²) in the worst and average cases, this means that in the worst and average cases it performs roughly cn² operations
◦ Binary search is Θ(lg n) in the worst and average cases. This means that in the worst and average cases binary search performs roughly c·lg n operations

Page 19:

Θ(f(n)) is the set of functions g(n) such that: there exist positive real constants c and d, and a nonnegative integer N, for which c·f(n) ≤ g(n) ≤ d·f(n) for all n ≥ N.

f(n) is called an asymptotically tight bound for g(n).

Page 20:

[Figure: plot of c·f(n), g(n), and d·f(n) versus n; for all n ≥ N, g(n) lies between c·f(n) and d·f(n).]

Page 21:

Θ(f(n)) = O(f(n)) ∩ Ω(f(n))

[Figure: Venn diagram of the three sets relative to n².]
◦ In O(n²) but not Ω(n²): log n, 5n + 3, n^1.5, n log n
◦ In Θ(n²): n², (1/2)n², 5n² + 3n
◦ In Ω(n²) but not O(n²): n⁵, 2ⁿ

Page 22:

Does (1/2)n² - 3n ∈ Θ(n²)?

We use the last definition and show:

1. (1/2)n² - 3n ∈ O(n²)

2. (1/2)n² - 3n ∈ Ω(n²)

Page 23:

Does (1/2)n² - 3n ∈ O(n²)?

From the definition, there must exist c > 0 and N > 0 such that 0 ≤ (1/2)n² - 3n ≤ cn² for all n ≥ N.

Dividing the inequality by n² > 0 we get: 0 ≤ 1/2 - 3/n ≤ c for all n ≥ N.

Clearly any c ≥ 1/2 can be chosen. Choose c = 1/2.

Then 0 ≤ 1/2 - 3/n requires 3/n ≤ 1/2, i.e. n ≥ 6. Choose N = 6.

Page 24:

Does (1/2)n² - 3n ∈ Ω(n²)?

There must exist c > 0 and N > 0 such that 0 ≤ cn² ≤ (1/2)n² - 3n for all n ≥ N.

Dividing by n² > 0 we get: 0 ≤ c ≤ 1/2 - 3/n for all n ≥ N.

Since c > 0 we need 1/2 - 3/n > 0, i.e. n > 6. Since 3/n → 0 for growing n, choose c = 1/4.

Then 1/4 ≤ 1/2 - 3/n requires 3/n ≤ 1/4, i.e. n ≥ 12. So c = 1/4 and N = 12.

Both definitions are satisfied with c = 1/4, d = 1/2, and N = 12, so (1/2)n² - 3n ∈ Θ(n²).
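The Θ sandwich with these witnesses can be checked numerically; a minimal sketch:

```python
# Theta witnesses for (1/2)n^2 - 3n: c = 1/4, d = 1/2, N = 12, i.e.
# (1/4)n^2 <= (1/2)n^2 - 3n <= (1/2)n^2 for all n >= 12.
def sandwiched(n):
    g = 0.5 * n**2 - 3*n
    return 0.25 * n**2 <= g <= 0.5 * n**2

assert all(sandwiched(n) for n in range(12, 10_000))
assert not sandwiched(11)  # the lower bound fails just below N = 12
```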

Page 25:

1,000,000 n² ∈ Θ(n²)? Why/why not? True

(n - 1)n/2 ∈ Θ(n²)? Why/why not? True

n/2 ∈ Θ(n²)? Why/why not? False

lg(n²) ∈ Θ(lg n)? Why/why not? True

n² ∈ Θ(n)? Why/why not? False

Page 26:

o(f(n)) is the set of functions g(n) which satisfy the following condition: for every positive real constant c, there exists a positive integer N for which

g(n) ≤ c·f(n) for all n ≥ N

Page 27:

Little “oh” - used to denote an upper bound that is not asymptotically tight.
◦ n is in o(n³).
◦ n is not in o(n).

Page 28:

Show that n ∈ o(n²).

Let c > 0 be given. We need to find an N such that for all n ≥ N, n ≤ cn².

If we divide both sides of this inequality by cn, we get 1/c ≤ n.

Therefore we can choose any N ≥ 1/c.
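The little-oh condition must hold for every c, so a numerical sketch has to loop over several candidate constants, computing N = ⌈1/c⌉ for each (an illustration over a finite range, not a proof):

```python
import math

# For n in o(n^2): given ANY c > 0, N = ceil(1/c) works, since n >= 1/c
# implies c*n >= 1 and hence n <= c*n^2.
for c in (2.0, 0.5, 0.01, 0.001):
    N = math.ceil(1 / c)
    assert all(n <= c * n**2 for n in range(N, N + 1_000))
```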

Page 29:

Show that n ∉ o(5n). Proof by contradiction: let c = 1/6. If n ∈ o(5n), then there must exist some N such that n ≤ (5/6)n for all n ≥ N. But this is false for every n > 0. This contradiction proves that n is not in o(5n).

Page 30:

Using limits to classify functions (the limit must exist):

◦ If lim_{n→∞} f(n)/g(n) = c with c > 0, then f(n) = Θ(g(n)).
◦ If lim_{n→∞} f(n)/g(n) = 0, then f(n) = o(g(n)).
◦ If lim_{n→∞} f(n)/g(n) = ∞, then g(n) = o(f(n)).
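The limit test can be illustrated numerically by evaluating the ratio at a large n (a heuristic hint at the limit, not a proof; the example functions are chosen for illustration):

```python
# Evaluate f(n)/g(n) at a large n as a hint about the limit.
def ratio(f, g, n=10**6):
    return f(n) / g(n)

# f = 3n^2 + n, g = n^2: ratio near 3, a positive constant -> suggests Theta.
assert abs(ratio(lambda n: 3*n**2 + n, lambda n: n**2) - 3) < 0.01

# f = n, g = n^2: ratio near 0 -> suggests f = o(g).
assert ratio(lambda n: n, lambda n: n**2) < 0.01
```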

Page 31:

L'Hopital's rule: if f(x) and g(x) are both differentiable with derivatives f'(x) and g'(x), respectively, and if

lim_{x→∞} f(x) = lim_{x→∞} g(x) = ∞,

then

lim_{x→∞} f(x)/g(x) = lim_{x→∞} f'(x)/g'(x)

whenever the limit on the right exists.

Page 32:
Page 33:

Show that lg n ∈ o(n).

lg n = ln n / ln 2, so (lg n)' = 1/(n ln 2) and n' = 1.

By L'Hopital's rule:

lim_{n→∞} lg n / n = lim_{n→∞} (lg n)' / n' = lim_{n→∞} 1/(n ln 2) = 0,

so lg n ∈ o(n).
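A numerical illustration of the limit (not a proof): the ratio lg(n)/n shrinks toward 0 as n grows.

```python
import math

# lg(n)/n at a few sample sizes; the sequence should decrease toward 0.
ratios = [math.log2(n) / n for n in (10, 10**3, 10**6)]
assert ratios[0] > ratios[1] > ratios[2]  # strictly decreasing on this sample
assert ratios[2] < 1e-4                   # lg(10^6)/10^6 is about 2e-5
```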

Page 34:

Show that n^k ∈ o(2ⁿ), where k is a positive integer.

Note that 2ⁿ = e^(n ln 2), so (2ⁿ)' = 2ⁿ ln 2.

Applying L'Hopital's rule k times:

lim_{n→∞} n^k / 2ⁿ = lim_{n→∞} k·n^(k-1) / (2ⁿ ln 2) = lim_{n→∞} k(k-1)·n^(k-2) / (2ⁿ (ln 2)²) = … = lim_{n→∞} k! / (2ⁿ (ln 2)^k) = 0,

so n^k ∈ o(2ⁿ).
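Again a numerical illustration, not a proof: even for a large exponent k, the ratio n^k / 2ⁿ collapses to 0.

```python
# n^k / 2^n at a few sample sizes for k = 10; polynomial growth is
# overtaken by exponential growth once n is moderately large.
k = 10
ratios = [n**k / 2**n for n in (20, 60, 200)]
assert ratios[0] > ratios[1] > ratios[2]
assert ratios[2] < 1e-30  # 200^10 / 2^200 is astronomically small
```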

Page 35:

Properties of growth functions

Each notation behaves like a comparison operator:
◦ O behaves like ≤
◦ Ω behaves like ≥
◦ Θ behaves like =
◦ o behaves like <

Page 36:

Using limits we get:

lim_{n→∞} ln n / n^a = lim_{n→∞} (1/n) / (a·n^(a-1)) = lim_{n→∞} 1/(a·n^a) = 0

So ln n = o(n^a) for any a > 0. When the exponent a is very small, we need to look at very large values of n to see that n^a > ln n.
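The slow convergence for small a can be seen numerically (an illustration, not a proof; the sample values are chosen for this sketch):

```python
import math

# ln(n)/n^a tends to 0 for every a > 0, but the smaller a is,
# the slower the ratio shrinks.
def ratio(n, a):
    return math.log(n) / n**a

# For a = 0.5 the ratio is already small at n = 10^6 ...
assert ratio(10**6, 0.5) < 0.02
# ... but for a = 0.01 it is still large there: n^0.01 grows very slowly.
assert ratio(10**6, 0.01) > 10
```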

Page 37:

O(g(n)) = { f(n) | there exist a positive constant c and a positive integer N such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ N }

o(g(n)) = { f(n) | for any positive constant c > 0, there exists a positive integer N such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ N }

For 'o' the inequality holds for all positive constants, whereas for 'O' the inequality holds for some positive constant.

Page 38:

Lower-order terms of a function do not matter, since they are dominated by the highest-order term.

Constants (multiplied by the highest-order term) do not matter, since they do not affect the asymptotic growth rate.

All logarithms with base b > 1 belong to Θ(lg n), since

log_b n = lg n / lg b = c·lg n, where c = 1/lg b is a constant.
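The change-of-base identity behind this fact can be checked directly; a minimal sketch:

```python
import math

# log_b(n) = lg(n) / lg(b) = c * lg(n), with constant c = 1 / lg(b),
# so every base-b logarithm (b > 1) differs from lg(n) only by a constant factor.
n, b = 1024.0, 10
c = 1 / math.log2(b)
assert abs(math.log(n, b) - c * math.log2(n)) < 1e-9
```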

Page 39:

We say a function f(n) is polynomially bounded if f(n) = O(n^k) for some positive constant k.

We say a function f(n) is polylogarithmically bounded if f(n) = O(lg^k n) for some positive constant k.

Exponential functions grow faster than positive polynomial functions.

Polynomial functions grow faster than polylogarithmic functions.

Page 40:

What does n² + 2n + 99 = n² + Θ(n) mean?

It means n² + 2n + 99 = n² + f(n), where the function f(n) is from the set Θ(n). In fact f(n) = 2n + 99.

Using notation in this manner can help to eliminate non-affecting details and clutter in an equation.

2n² + 5n + 21 = 2n² + Θ(n) = Θ(n²)

Page 41:

Transitivity:

If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then f(n) = Θ(h(n)).

If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).

If f(n) = Ω(g(n)) and g(n) = Ω(h(n)), then f(n) = Ω(h(n)).

If f(n) = o(g(n)) and g(n) = o(h(n)), then f(n) = o(h(n)).

Page 42:

Reflexivity:

f(n) = Θ(f(n)).

f(n) = O(f(n)).

f(n) = Ω(f(n)).

"o" is not reflexive.

Page 43:

Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).

Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).