Class 18: Measuring Cost
Transcript of Class 18: Measuring Cost
cs1120, Fall 2011
David Evans
3 October 2011
[Photo: Colossus Rebuilt, Bletchley Park, Summer 2004]
Plan
How Computer Scientists Measure Cost
Asymptotic Operators
Assistant Coaches’ Review Sessions for Exam 1:
Tuesday (tomorrow), 6:30pm, Rice 442
Wednesday, 7:30pm, Rice 442
(define (fibo-loop n)
  (cdr (loop 1 (cons 0 1)
             (lambda (i) (< i n))
             inc
             (lambda (i v) (cons (cdr v) (+ (car v) (cdr v)))))))

(define (fibo-rec n)
  (if (= n 0) 0
      (if (= n 1) 1
          (+ (fibo-rec (- n 1)) (fibo-rec (- n 2))))))
> (time (fibo-rec 2))
cpu time: 0 real time: 0 gc time: 0
1
> (time (fibo-rec 5))
cpu time: 0 real time: 0 gc time: 0
5
> (time (fibo-rec 20))
cpu time: 0 real time: 2 gc time: 0
6765
> (time (fibo-rec 30))
cpu time: 359 real time: 370 gc time: 0
832040
> (time (fibo-rec 60))
still waiting since Friday…

> (time (fibo-loop 2))
cpu time: 0 real time: 0 gc time: 0
1
> (time (fibo-loop 5))
cpu time: 0 real time: 0 gc time: 0
5
> (time (fibo-loop 20))
cpu time: 0 real time: 0 gc time: 0
6765
> (time (fibo-loop 30))
cpu time: 0 real time: 0 gc time: 0
832040
> (time (fibo-loop 60))
cpu time: 0 real time: 0 gc time: 0
1548008755920
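The timing gap is easy to reproduce in any language. Below is a Python sketch of the two procedures (my translation, not the course's Scheme code): the iterative version carries the pair (fib(i-1), fib(i)) through the loop, just as fibo-loop threads a cons pair.

```python
def fibo_rec(n):
    """Naive recursive Fibonacci, mirroring fibo-rec: two recursive
    calls per step, so the total work explodes as n grows."""
    if n == 0:
        return 0
    if n == 1:
        return 1
    return fibo_rec(n - 1) + fibo_rec(n - 2)


def fibo_loop(n):
    """Iterative Fibonacci, mirroring fibo-loop: carry the pair
    (fib(i-1), fib(i)) through the loop, one addition per step."""
    a, b = 0, 1              # the (cons 0 1) starting pair
    for _ in range(1, n):
        a, b = b, a + b      # (cons (cdr v) (+ (car v) (cdr v)))
    return b
```

Here fibo_loop(60) returns 1548008755920 instantly, matching the transcript, while fibo_rec(60) would make on the order of 10¹² calls.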
Number of “expensive” calls for fibo-rec: roughly φⁿ, where n is the value of the input and φ = (1 + √5)/2 ≈ 1.618 is the “golden ratio”.
Number of “expensive” calls for fibo-loop: about n.
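The φⁿ growth of fibo-rec's call count can be checked empirically. A small Python sketch (the rec_calls counter is my instrumentation, not part of the lecture):

```python
import math

def rec_calls(n):
    """Number of procedure calls fibo-rec makes on input n,
    counting the initial call."""
    if n <= 1:
        return 1
    return 1 + rec_calls(n - 1) + rec_calls(n - 2)

phi = (1 + math.sqrt(5)) / 2   # the golden ratio, ~1.618

for n in (10, 15, 20):
    ratio = rec_calls(n + 1) / rec_calls(n)
    print(n, rec_calls(n), round(ratio, 3))
# The ratio of successive call counts approaches phi, so the total
# work grows like phi**n, while fibo-loop needs only n additions.
```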
How Computer Scientists Measure Cost
Abstract: hide all the details that will change when you get your next laptop, to capture the fundamental cost of the procedure.
Cost: the number of steps for a Turing Machine to execute the procedure.
Order of Growth: what matters is how the cost scales with the size of the input.
Size of input: how many TM squares are needed to represent it.
Usually, we can determine these without actually writing a TM version of our procedure!
Orders of Growth
Asymptotic Operators
These notations define sets of functions
In computing, the function inside the operator is (usually) a mapping from the size of the input to the number of steps required. We use the asymptotic operators to abstract away all the silly details about particular computers.
Big O
• Intuition: the set of functions that grow no faster than f (more formal definition soon)
• Asymptotic growth rate: as the input to f approaches infinity, how fast does the value of f increase?
– Hence, only the fastest-growing term in f matters.
Examples
[Diagram: nested sets, O(n²) inside O(n³); 12n² + n is in O(n²); n^2.5 is in O(n³) but not O(n²); n^3.1 − n² lies outside O(n³); functions grow faster moving outward.]
Formal Definition
f ∈ O(g) if and only if there are positive constants c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀.
O Examples
Is x in O(x²)? Yes: take c = 1, n₀ = 1.
Is 10x in O(x)? Yes: take c = 10, n₀ = 1.
Is x² in O(x)? No: for any fixed c, x² > c·x once x > c.
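These membership questions can be sanity-checked numerically. The helper below is my own illustration: a finite check can support but never prove an asymptotic claim, though it can refute a specific witness pair (c, n₀).

```python
def holds_big_o(f, g, c, n0, upto=1000):
    """Check f(n) <= c * g(n) for every n0 <= n <= upto.
    A pass is only evidence; a failure refutes the witnesses (c, n0)."""
    return all(f(n) <= c * g(n) for n in range(n0, upto + 1))

# x is in O(x^2): witnesses c = 1, n0 = 1.
assert holds_big_o(lambda x: x, lambda x: x * x, c=1, n0=1)
# 10x is in O(x): witnesses c = 10, n0 = 1.
assert holds_big_o(lambda x: 10 * x, lambda x: x, c=10, n0=1)
# x^2 is not in O(x): any fixed c fails once x > c.
assert not holds_big_o(lambda x: x * x, lambda x: x, c=100, n0=1)
```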
Lower Bound: Ω (Omega)
f ∈ Ω(g) if and only if there are positive constants c and n₀ such that f(n) ≥ c·g(n) for all n ≥ n₀.
Only difference from O: this was ≤, and is now ≥.
[Diagram repeated: O(n²) inside O(n³), with 12n² + n in O(n²), n^2.5 in O(n³) but not O(n²), and n^3.1 − n² outside O(n³).]
Where is Ω(n²)? (It contains 12n² + n, n^2.5, and n^3.1 − n²: everything that grows at least as fast as n².)
Inside-Out
[Diagram: the Ω sets nest the other way around: Ω(n³) sits inside Ω(n²); n^3.1 − n² is in Ω(n³); n^2.5 and 12n² + n are in Ω(n²) but not Ω(n³); O(n²) enters from the slower-growing side; functions grow slower moving outward.]
Recap
• Big-O: functions that grow no faster than f
• Omega (Ω): functions that grow no slower than f
The Sets O(f ) and Ω(f )
[Diagram: along an axis of increasingly fast-growing functions, O(f) covers everything that grows no faster than f, and Ω(f) everything that grows no slower; the two meet at f.]
[Diagram repeated: O(n²), O(n³), and Ω(n²) with the example functions 12n² + n, n^2.5, and n^3.1 − n².]
What else might be useful?
Theta (“Order of”)
Intuition: the set of functions that grow as fast as f.
Definition: f ∈ Θ(g) if and only if f ∈ O(g) and f ∈ Ω(g).
Slang: when people say “f is order g”, that means f ∈ Θ(g).
Tight Bound: Theta (Θ)
[Diagram: Θ(n²) is the overlap of O(n²) and Ω(n²); 12n² + n is in Θ(n²); n^2.5 and n^3.1 − n² are not.]
Θ Examples
• Is 10n in Θ(n)?
– Yes, since 10n is in Ω(n) and 10n is in O(n).
– It doesn’t matter that you choose different c values for each part; they are independent.
• Is n² in Θ(n)?
– No, since n² is not in O(n).
• Is n in Θ(n²)?
– No, since n is not in Ω(n²).
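Since Θ membership is just the O and Ω checks together, the same witness-checking idea extends directly. Again a Python illustration of mine, not from the lecture (finite-range checks, not proofs):

```python
def holds_big_o(f, g, c, n0, upto=1000):
    """Check f(n) <= c * g(n) for n0 <= n <= upto (the O direction)."""
    return all(f(n) <= c * g(n) for n in range(n0, upto + 1))

def holds_omega(f, g, c, n0, upto=1000):
    """Check f(n) >= c * g(n) for n0 <= n <= upto (the Omega direction)."""
    return all(f(n) >= c * g(n) for n in range(n0, upto + 1))

# 10n is in Theta(n): different c values for each direction are fine.
assert holds_big_o(lambda n: 10 * n, lambda n: n, c=10, n0=1)
assert holds_omega(lambda n: 10 * n, lambda n: n, c=1, n0=1)
# n is not in Theta(n^2): the O side holds, but the Omega side fails.
assert holds_big_o(lambda n: n, lambda n: n * n, c=1, n0=1)
assert not holds_omega(lambda n: n, lambda n: n * n, c=0.01, n0=1)
```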
The Sets O(f ), Ω(f ), and Θ(f )
[Diagram: along an axis of increasingly fast-growing functions, O(f) extends toward the slower-growing side of f, Ω(f) toward the faster-growing side, and Θ(f) is their intersection around f.]
How big are O(f ), Ω(f ), and Θ(f )?
Summary
Which of these would we most like to know about costproc(n) = the number of steps to execute proc on an input of size n?
• Big-O: functions that grow no faster than f
• Omega (Ω): functions that grow no slower than f
• Theta (Θ): functions that grow as fast as f (a tight bound, so the one we would most like to know)
Complexity of Problems
So, why do we need O and Ω?
Computer scientists often care about the complexity of problems, not algorithms. The complexity of a problem is the complexity of the best possible algorithm that solves it. An algorithm gives an upper bound (O) on a problem’s complexity; showing that no algorithm can do better gives a lower bound (Ω).
Algorithm Analysis
What is the asymptotic running time of the Scheme procedure below?

(define (bigger a b) (if (> a b) a b))
Charge
Assistant Coaches’ Review Sessions for Exam 1:
Tuesday (tomorrow), 6:30pm, Rice 442
Wednesday, 7:30pm, Rice 442
Next class:
analyzing the costs of bigger
analyzing the costs of brute-force-lorenz