Self-Adjusting Computation
Umut Acar
Carnegie Mellon University
Joint work with Guy Blelloch, Robert Harper, Srinath Sridhar, Jorge Vittes, Maverick Woo
14 January 2004 Workshop on Dynamic Algorithms and Applications
Dynamic Algorithms
Maintain their input-output relationship as the input changes
Example: A dynamic MST algorithm maintains the MST of a graph as the user inserts and deletes edges
Useful in many applications involving interactive systems, motion, ...
Developing Dynamic Algorithms: Approach I
Dynamic by design
Many papers: Agarwal, Atallah, Bash, Bentley, Chan, Cohen, Demaine, Eppstein, Even, Frederickson, Galil, Guibas, Henzinger, Hershberger, King, Italiano, Mehlhorn, Overmars, Powell, Ramalingam, Roditty, Reif, Reps, Sleator, Tamassia, Tarjan, Thorup, Vitter, ...
Efficient algorithms but can be complex
Approach II: Re-execute the algorithm when the input changes
Very simple
General
Poor performance
Smart re-execution
Suppose we can identify the pieces of the execution affected by the input change
Re-execute by re-building only the affected pieces
Execution (A,I)
Execution (A,I+)
Smart Re-execution
Time to re-execute = O(distance between executions)
Execution (A,I)
Execution (A,I+)
Incremental Computation or Dynamization
General techniques for transforming static algorithms into dynamic ones
Many papers: Alpern, Demers, Field, Hoover, Horwitz, Hudak, Liu, de Moor, Paige, Pugh, Reps, Ryder, Strom, Teitelbaum, Weiser, Yellin, ...
Most effective techniques are
Static Dependence Graphs [Demers, Reps, Teitelbaum '81]
Memoization [Pugh, Teitelbaum '89]
These techniques work well for certain problems
Bridging the two worlds
Dynamization simplifies development of dynamic algorithms
but generally yields inefficient algorithms
Algorithmic techniques yield good performance
Can we have the best of both worlds?
Our Work
Dynamization techniques:
Dynamic dependence graphs [Acar, Blelloch, Harper '02]
Adaptive memoization [Acar, Blelloch, Harper '04]
Stability: Technique for analyzing performance [ABHVW ‘04]
Provides a reduction from dynamic to static problems
Reduces solving a dynamic problem to finding a stable solution to the corresponding static problem
Example: Dynamizing the parallel tree contraction algorithm [Miller, Reif '85] yields an efficient solution to the dynamic trees problem [Sleator, Tarjan '83] [ABHVW SODA '04]
Outline
Dynamic Dependence Graphs
Adaptive Memoization
Applications to
Sorting
Kinetic Data Structures, with experimental results
Retroactive Data Structures
Dynamic Dependence Graphs
Control dependences arise from function calls
Dynamic Dependence Graphs
Control dependences arise from function calls
Data dependences arise from reading/writing the memory
[figure: dynamic dependence graph over nodes a, b, c]
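The read/write interface behind these dependence graphs can be made concrete. Below is a minimal sketch in Python (not the authors' SML library; `Mod`, `read`, and `write` are illustrative names): writes record which readers depend on a value, so changing an input re-runs exactly the affected reads.

```python
# Hypothetical toy model of data dependences in a dynamic dependence graph.
class Mod:
    """A modifiable reference; `read` registers a data dependence."""
    def __init__(self, value=None):
        self.value = value
        self.readers = []            # closures to re-run when value changes

    def read(self, use):
        self.readers.append(use)     # record the data dependence
        use(self.value)              # run the reader on the current value

    def write(self, value):
        if value != self.value:      # constant-time equality test
            self.value = value
            for use in list(self.readers):
                use(value)           # propagate the change to dependents

# Example: dest = src + 1, maintained automatically as src changes.
src = Mod(3)
dest = Mod()
src.read(lambda v: dest.write(v + 1))
assert dest.value == 4
src.write(10)                        # change propagation re-runs the reader
assert dest.value == 11
```

This naive version propagates eagerly and tracks only data dependences; the real system also records control dependences (function calls) and re-executes in original execution order, as described later under "The Internals".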
Change Propagation
[figure: change propagation on the dynamic dependence graph]
Change Propagation with Memoization
[figure: change propagation with memoization on the dynamic dependence graph]
Change Propagation with Adaptive Memoization
[figure: change propagation with adaptive memoization on the dynamic dependence graph]
The Internals
1. Order maintenance data structure [Dietz, Sleator '87]
Time-stamp vertices of the DDG in sequential execution order
2. Priority queue for change propagation
Priority = time stamp
Re-execute functions in sequential execution order
Ensures that a value is updated before being read
3. Hash tables for memoization
Remember results from the previous execution only
4. Constant-time equality tests
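Items 1 and 2 combine into a simple loop. The sketch below is hypothetical Python, with plain integer timestamps standing in for the Dietz-Sleator order-maintenance structure: invalidated reads are re-executed in their original sequential order, which is what guarantees every value is updated before it is read.

```python
import heapq

def propagate(queue):
    """queue holds (timestamp, thunk) pairs for invalidated reads."""
    while queue:
        t, thunk = heapq.heappop(queue)   # earliest execution time first
        thunk()                            # re-run; may enqueue later reads

# Example: two invalidated reads pushed out of order still run in order.
log = []
q = []
heapq.heappush(q, (7, lambda: log.append("late read")))
heapq.heappush(q, (2, lambda: log.append("early read")))
propagate(q)
assert log == ["early read", "late read"]
```

In the real system timestamps come from the order-maintenance structure because re-executed code can allocate new time stamps between existing ones, which fixed integers cannot do.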
Standard Quicksort

fun qsort (l) =
  let
    fun qs (l, rest) =
      case l of
        NIL => rest
      | CONS(h, t) =>
          let
            (smaller, bigger) = split (h, t)
            sbigger = qs (bigger, rest)
          in
            qs (smaller, CONS(h, sbigger))
          end
  in
    qs (l, NIL)
  end
Dynamic Quicksort

fun qsort (l) =
  let
    fun qs (l, rest, d) =
      read (l, fn l' =>
        case l' of
          NIL => write (d, rest)
        | CONS(h, t) =>
            let
              (less, bigger) = split (h, t)
              sbigger = mod (fn d => qs (bigger, rest, d))
            in
              qs (less, CONS(h, sbigger), d)
            end)
  in
    mod (fn d => qs (l, NIL, d))
  end
Performance of Quicksort
Dynamized Quicksort updates its output in expected
O(log n) time for insertions/deletions at the end of the input
O(n) time for insertions/deletions at the beginning of the input
O(log n) time for insertions/deletions at a random location
Other results, for insertions/deletions anywhere in the input
Dynamized Mergesort: expected O(log n)
Dynamized Insertion Sort: expected O(n)
Dynamized minimum/maximum/sum/...: expected O(log n)
Function Call Tree for Quicksort
Insertion at the end of the input
Insertion in the middle
Insertion at the start, in linear time
[figure: quicksort call tree for the input]
Input: 15, 30, 26, 1, 5, 16, 27, 9, 3, 35, 46
Insertion at the start, in linear time
[figure: quicksort call trees before and after inserting 20 at the head of the input]
Input: 20, 15, 30, 26, 1, 5, 16, 27, 9, 3, 35, 46
Kinetic Data Structures [Basch, Guibas, Hershberger '99]
Goal: Maintain properties of continuously moving objects
Example: A kinetic convex-hull data structure maintains the convex hull of a set of continuously moving objects
Kinetic Data Structures
Run a static algorithm to obtain a proof of the property
Certificate = comparison + failure time
Insert the certificates into a priority queue
Priority = failure time
A framework for handling motion [Guibas, Karavelas, Russel, ALENEX '04]
while queue not empty do {
  certificate = remove (queue)
  flip (certificate)
  update the certificate set (proof)
}
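The event loop above can be made concrete with a small example. This is a hypothetical Python sketch of 1-D kinetic sorting (not from the talk): each certificate asserts that two adjacent points stay in order, and fails at the time their linear trajectories cross.

```python
import heapq

def failure_time(a, b, now):
    """Time when point a = (x0, v) overtakes b, or None if never."""
    (xa, va), (xb, vb) = a, b
    if va <= vb:
        return None                      # the gap never shrinks
    t = (xb - xa) / (va - vb)
    return t if t > now else None

def kinetic_sort(points, until):
    """Maintain sorted order of linearly moving points up to time `until`."""
    order = sorted(range(len(points)), key=lambda i: points[i][0])
    queue = []                           # certificates, priority = failure time

    def certify(k, now):
        t = failure_time(points[order[k]], points[order[k + 1]], now)
        if t is not None and t <= until:
            heapq.heappush(queue, (t, k, order[k], order[k + 1]))

    for k in range(len(order) - 1):
        certify(k, 0.0)
    while queue:
        t, k, i, j = heapq.heappop(queue)
        if (order[k], order[k + 1]) != (i, j):
            continue                     # stale certificate, skip it
        order[k], order[k + 1] = order[k + 1], order[k]   # flip
        for kk in (k - 1, k, k + 1):     # update the affected certificates
            if 0 <= kk < len(order) - 1:
                certify(kk, t)
    return order

# Points as (initial position, velocity): p0 overtakes p1 and p2.
pts = [(0.0, 2.0), (1.0, 0.0), (3.0, -1.0)]
assert kinetic_sort(pts, until=10.0) == [2, 1, 0]
```

The `update the certificate set` step corresponds to `certify` being re-run only for the pairs adjacent to the flip; everything else in the proof is untouched.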
Kinetic Data Structures via Self-Adjusting Computation
Update the proof automatically with change propagation
A library for kinetic data structures [Acar, Blelloch, Vittes]
Quicksort: expected O(1); Mergesort: expected O(1)
Quick Hull, Chan's algorithm, Merge Hull: expected O(log n)
while queue not empty do {
  certificate = remove (queue)
  flip (certificate)
  propagate ()
}
Quick Hull: Find Min and Max
[figure: a set of points A through P in the plane]
[A B C D E F G H I J K L M N O P]
Quick Hull: Furthest Point
[A B D F G H J K M O P]
Quick Hull: Filter
[ [A B F J ] [J O P] ]
Quick Hull: Find left hull
[ [A B] [B J] [J O] [O P] ]
Quick Hull: Done
[ [A B] [B J] [J O] [O P] ]
Static Quick Hull

fun findHull (line as (p1, p2), l, hull) =
  let
    pts = filter l (fn p => Geo.lineside (p, line))
  in
    case pts of
      EMPTY => CONS(p1, hull)
    | _ =>
        let
          pm = max (Geo.dist line) l
          left = findHull ((pm, p2), l, hull)
          full = findHull ((p1, pm), l, left)
        in
          full
        end
  end

fun quickHull l =
  let
    (mx, xx) = minmax (Geo.minX, Geo.maxX) l
  in
    findHull ((mx, xx), l, CONS(xx, NIL))
  end
Kinetic Quick Hull

fun findHull (line as (p1, p2), l, hull, dest) =
  let
    pts = filter l (fn p => Kin.lineside (p, line))
  in
    modr (fn dest =>
      read l (fn l =>
        case l of
          NIL => write (dest, CONS(p1, hull))
        | _ =>
            read (max (Kin.dist line) l) (fn pm =>
              let
                gr = modr (fn d => findHull ((pm, p2), l, hull, d))
              in
                findHull ((p1, pm), l, gr, dest)
              end)))
  end

fun quickHull l =
  let
    (mx, xx) = minmax (Kin.minX, Kin.maxX) l
  in
    modr (fn d =>
      read (mx, xx) (fn (mx, xx) =>
        split ((mx, xx), l, CONS(xx, NIL), d)))
  end
Kinetic Quick Hull
[plot: certificates per event vs. input size]
Dynamic and Kinetic Changes
Often interested in dynamic as well as kinetic changes
Insert and delete objects
Change the motion plan, e.g., direction, velocity
Easily programmed via self-adjusting computation
Example: the Kinetic Quick Hull code is both dynamic and kinetic
Batch changes
Real-time changes: can maintain partially correct data structures (stop propagation when time expires)
Retroactive Data Structures [Demaine, Iacono, Langerman '04]
Can change the sequence of operations performed on the data structure
Example: A retroactive queue would allow the user to go back in time and insert/remove an item
Retroactive Data Structures via Self-Adjusting Computation
Dynamize the static algorithm that takes as input the list of operations performed
Example: retroactive queues
Input: list of insert/remove operations
Output: list of items removed
Retroactive change: change the input list and propagate
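The recipe above can be sketched in a few lines. This hypothetical Python toy shows the static algorithm being dynamized: it consumes the whole operation list and produces the list of removed items, so a retroactive change is just an edit to the input list followed by recomputation (which change propagation would perform incrementally).

```python
def run_queue(ops):
    """ops: list of ('insert', x) or ('remove',); returns items removed."""
    queue, removed = [], []
    for op in ops:
        if op[0] == 'insert':
            queue.append(op[1])
        elif queue:
            removed.append(queue.pop(0))   # FIFO removal
    return removed

ops = [('insert', 'a'), ('insert', 'b'), ('remove',)]
assert run_queue(ops) == ['a']

# Retroactive change: go back in time and insert 'z' before everything.
ops.insert(0, ('insert', 'z'))
assert run_queue(ops) == ['z']
```

Plain recomputation here is O(n) per change; the point of the reduction is that self-adjusting computation re-runs only the part of the execution the edited operation affects.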
Rake and Compress Trees [Acar, Blelloch, Vittes]
Obtained by dynamizing tree contraction [ABHVW ‘04]
Experimental analysis
Implemented and applied to a broad set of applications
Path queries, subtree queries, non-local queries, etc.
For path queries, compared to Link-Cut Trees [Werneck]
Structural changes are relatively slow; data changes are faster
Conclusions
Automatic dynamization techniques can yield efficient dynamic and kinetic algorithms and data structures
General-purpose techniques for
transforming static algorithms into dynamic and kinetic ones
analyzing their performance
Applications to kinetic and retroactive data structures
Reduce dynamic problems to static problems
Future work: lots of interesting problems
Dynamic/kinetic/retroactive data structures
Thank you!