Page 1

Self-Adjusting Computation

Robert Harper

Carnegie Mellon University

(With Umut Acar and Guy Blelloch)

Page 2

The Problem

• Given a static algorithm, obtain a dynamic, or incremental, version.
  – Maintain a sorted list under insertions and deletions.
  – Maintain a convex hull under motions of the points.
  – Maintain the semantics of a program under edits.

Page 3

Example: Sorting

Input:  5, 1, 4, 2, 3      Output:  1, 2, 3, 4, 5
Input:  5, 1, 2, 3         Output:  1, 2, 3, 5      (after deleting 4)

Page 4

Dynamic Algorithms

• There is a large body of work on dynamic / incremental algorithms.
  – Specific techniques for specific problems.

• Our interest is in general methods, rather than ad hoc solutions.
  – Applying them to a variety of problems.
  – Understanding when these methods apply.

Page 5

Self-Adjusting Computation

• Self-adjusting computation is a method for “dynamizing” a static algorithm.
  – Start with a static algorithm for a problem.
  – Make it robust under specified changes.

• Goal: “fast” response to “small” change.
  – “Fast” and “small” are problem-specific!
  – As ever, the analysis can be difficult.

Page 6

Self-Adjusting Computation

• Generalizes incremental computation.
  – Attribute grammars and circuit models assume static control dependencies.
  – SAC permits dynamic dependencies.

• Combines algorithmic and programming-language techniques.
  – Linguistic tools to ensure correctness relative to the static algorithm.
  – Algorithmic techniques for efficient implementation.

Page 7

Self-Adjusting Computation

• Adaptivity: Propagate the effects of a change to the input through to the output.

• Selective Memoization: Reuse old results, provided they remain valid after the change.

• Adaptive Memoization: Reuse old results, even though they may not be valid after the change.

Page 8

Model of Computation

• Purely functional programming model.
  – Data structures are persistent.
  – No implicit side effects or mutation.

• Imperative model of change (see the interface sketch below).
  – Run on the initial input to obtain the output.
  – Make modifications to the input.
  – Propagate the changes to the output.
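As a rough OCaml sketch of this model of change, the signature below simply names the operations involved; the module name SELF_ADJUSTING and the function names are assumptions for exposition, not the authors' actual API.

  (* Hypothetical interface for the imperative model of change. *)
  module type SELF_ADJUSTING = sig
    type 'a modref                            (* a modifiable reference *)
    val create    : 'a -> 'a modref           (* allocate a modifiable *)
    val write     : 'a modref -> 'a -> unit   (* modify an input *)
    val read      : 'a modref -> 'a           (* inspect a value *)
    val propagate : unit -> unit              (* push input changes through to the output *)
  end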

Page 9

Model of Computation

[Figure: steps of execution (pure/persistent) run along one axis and stages of revision (impure/ephemeral) along the other.]

Page 10

A Simple Example: Map

  data cell = nil | cons of int × list
  and list = cell

  fun map (l : list) = map' l

  and map' c =
    case c of
      nil ⇒ nil
    | cons (h, t) ⇒ cons (h+10, map t)
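For reference, a direct transcription of this static map into plain OCaml (ordinary immutable lists, no modifiables yet):

  (* Static map: add 10 to every element of an immutable list. *)
  type cell = Nil | Cons of int * cell

  let rec map (l : cell) : cell =
    match l with
    | Nil -> Nil
    | Cons (h, t) -> Cons (h + 10, map t)

  (* Example: map [2; 3; 4] yields [12; 13; 14]. *)
  let _ = map (Cons (2, Cons (3, Cons (4, Nil))))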

Page 11

Static Version of Map

• Running map on the input list 2, 3, 4 …

• … yields the new output list 12, 13, 14.

Page 12

Dynamic Version of Map

• To permit insertions and deletions, lists are made modifiable:

  data cell = nil | cons of int × list
  and list = cell mod
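One crude way to picture this in OCaml is to pretend, just for the moment, that a modifiable is nothing more than a mutable ref cell; the insert_after helper below is invented for illustration, and dependency tracking is omitted entirely.

  (* A list whose tails are mutable refs; insertion mutates one tail in place. *)
  type cell = Nil | Cons of int * clist
  and clist = cell ref

  (* Insert x immediately after the cell currently stored in m. *)
  let insert_after (m : clist) (x : int) : unit =
    match !m with
    | Nil -> m := Cons (x, ref Nil)
    | Cons (_, t) -> t := Cons (x, ref !t)

  let () =
    let l : clist = ref (Cons (2, ref (Cons (3, ref (Cons (5, ref Nil)))))) in
    match !l with
    | Cons (_, t) -> insert_after t 4   (* l now represents 2, 3, 4, 5 *)
    | Nil -> ()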

Page 13

Dynamic Version of Map

Insertion changes a modifiable:

[Figure: the input list 2, 3, 5 becomes 2, 3, 4, 5 by writing a new cons cell for 4 into a modifiable.]

Page 14

Dynamic Version of Map

We’d like to obtain the result 12, 13, 14, 15 from the old output 12, 13, 15.

Page 15

Dynamic Version of Map

• Can we update the result in O(1) time?
  – Make one new call to map.
  – Splice the new cell into the “old” result.

• Yes, using self-adjusting computation!
  – Adaptivity: call map on the new node.
  – Memoization: re-synchronize with the suffix.

Page 16

Adaptivity Overview

• To make map adaptive, ensure that
  – changes invalidate results that depend on the modified value, and
  – computations dependent on a change are re-run with the “new” value.

• Two key ideas:
  – Make access to modifiables explicit.
  – Maintain dependencies dynamically.

Page 17

Adaptive Map

  data cell = nil | cons of int × list
  and list = cell mod

  fun map (l : list) =
    mod (let mod c = l in write (map' c))
    (* mod ...    : allocate a new modifiable *)
    (* let mod c  : read the old modifiable   *)
    (* write ...  : write the new modifiable  *)

  and map' c =
    case c of
      nil ⇒ nil
    | cons (h, t) ⇒ cons (h+10, map t)
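To make the mechanism concrete, here is a deliberately naive OCaml sketch of adaptivity: every modifiable remembers the reader computations registered on it, and a write simply re-runs them all. Containment ordering, trace cleanup, and memoization are all ignored, so this illustrates only the dependency-tracking idea, not the actual implementation.

  (* Toy modifiables: a value plus the readers that depend on it. *)
  type 'a modref = {
    mutable value   : 'a;
    mutable readers : ('a -> unit) list;
  }

  let create v = { value = v; readers = [] }

  (* read m body: run body now, and re-run it whenever m is written. *)
  let read m body =
    m.readers <- body :: m.readers;
    body m.value

  (* write m v: update the value and propagate to every recorded reader. *)
  let write m v =
    m.value <- v;
    List.iter (fun body -> body v) m.readers

  (* Modifiable lists, as on the slides. *)
  type cell = Nil | Cons of int * clist
  and clist = cell modref

  (* Adaptive map: each output cell is produced by a reader of the
     corresponding input cell, so writing an input cell re-runs map
     from that point on (the linear-time behaviour of page 25). *)
  let rec amap (l : clist) : clist =
    let out = create Nil in
    read l (fun c ->
      match c with
      | Nil -> write out Nil
      | Cons (h, t) -> write out (Cons (h + 10, amap t)));
    out

Writing a new cons cell into an input modifiable (the insertion on page 13) now rebuilds the corresponding output suffix automatically; memoization, introduced below, is what avoids rebuilding the part that did not change.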

Page 18

Adaptive Map

Modification to input: 4 is inserted into the list 2, 3, 5.

The modified cell is the argument of map', which must be re-run.

Page 19

Adaptive Map

• The associated output is invalidated, and the suffix is re-created.

[Figure: the output prefix 12, 13 is kept; the result of map' is written into the invalidated cell, re-creating the suffix 14, 15.]

Page 20

Adaptive Programming

• Crux: dependencies among modifiables.
  – Writing a modifiable invalidates any computation that reads it.
  – One read can be contained within another.

• Dependencies are fully dynamic!
  – Cells are allocated dynamically.
  – Reads affect control flow.

Page 21

Adaptive Programming

• Change propagation consists of
  – re-running the readers of changed cells, and
  – updating dependencies during the re-run.

• To ensure correctness,
  – all dependencies must be accurately tracked, and
  – the containment ordering must be maintained.

• Linguistic tools enforce these requirements!

Page 22

Type System for Adaptivity

• The type mod is a modality.
  – It comes from lax modal logic.
  – It therefore forms a monad.

• Two modes of expression:
  – Stable: ordinary functional code, not affected by changes.
  – Changeable: affected by change; will be written to another modifiable.

Page 23

Type System for Adaptivity

• Elimination form for mod:

    let mod x = s in c end

  – Read the modifiable given by s.
  – Bind its value to x and evaluate c.

• Makes dependencies explicit:
  – Records the read of the given modifiable.
  – Re-runs c with the new x if it changes.
  – Reads within c are contained in this read.

Page 24

Type System for Adaptivity

• Execution maintains a trace of adaptive events:
  – creation of a modifiable,
  – writes to a modifiable, and
  – reads of a modifiable.

• Containment is recorded using the Dietz–Sleator order-maintenance algorithm (a toy version appears below).
  – Associate time intervals with events.
  – Re-run a reader within its “old” time interval.
  – Requires arbitrary fractionation of time steps.
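As a crude picture of “arbitrary fractionation,” the OCaml sketch below stands in for the order-maintenance structure by labelling timestamps with floats: an event inserted between two existing ones gets the midpoint label. The real Dietz–Sleator structure relabels integer tags to avoid exhausting precision; the float trick is only for illustration.

  (* Naive timestamps: floats ordered by <; inserting between two
     timestamps takes the midpoint, "fractionating" the step. *)
  type timestamp = float

  let insert_between (t1 : timestamp) (t2 : timestamp) : timestamp =
    (t1 +. t2) /. 2.0

  let precedes (t1 : timestamp) (t2 : timestamp) : bool = t1 < t2

  (* Example: the read of a tail starts inside the interval of the read
     of the head cell, so its timestamp is placed inside that interval. *)
  let () =
    let start_of_run = 0.0 and end_of_run = 1.0 in
    let read_head = insert_between start_of_run end_of_run in
    let read_tail = insert_between read_head end_of_run in
    assert (precedes read_head read_tail)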

Page 25

Adaptive Map

• Responds to an insertion in linear time:

[Figure: the output 12, 13, 14, 15; the entire suffix after the insertion point is recomputed.]

Page 26

Memoizing Map

• For a constant-time update, we must re-synchronize with the old result.
  – Results after the insertion point remain valid despite the change.
  – Re-use, rather than recompute, to save time.

• Selective memoization is a general technique for achieving this.

Page 27

Selective Memoization

• Standard memoization is data-driven (see the sketch below).
  – Associate with a function f a finite set of ordered pairs (x, f(x)).
  – Consult the memo table before the call; update it after the call.

• It cannot handle partial dependencies.
  – E.g., read only the first 10 elements of an array.
  – E.g., use an approximation of the input.
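The data-driven scheme is the familiar one; a minimal OCaml sketch, keyed on the whole argument:

  (* Standard memoization: a table of (x, f x) pairs keyed on the entire
     input, consulted before the call and updated after it. *)
  let memoize (f : 'a -> 'b) : 'a -> 'b =
    let table : ('a, 'b) Hashtbl.t = Hashtbl.create 16 in
    fun x ->
      match Hashtbl.find_opt table x with
      | Some y -> y
      | None ->
          let y = f x in
          Hashtbl.add table x y;
          y

  (* Because the key is the whole argument, a call that needs only part
     of its input still misses whenever any irrelevant part differs. *)
  let _square = memoize (fun n -> n * n)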

Page 28

Selective Memoization

• Selective memoization is control-driven.
  – Guided by the exploration of the input.
  – Sensitive to approximations.

• Associate results with control paths (a hand-rolled sketch follows).
  – “Have I been here before?”
  – The control path records the dependencies of the output on the input.
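Here is a hand-rolled OCaml illustration of keying on the control path rather than on the whole input; the function f and the path type are invented for exposition. The function touches its second argument on only one branch, and the memo key records exactly the branch taken plus the values actually read.

  (* The "control path" for f: either the negative branch was taken and b
     was never examined, or the other branch was taken and b was read. *)
  type path = NegBranch | PosBranch of int

  let table : (path, int) Hashtbl.t = Hashtbl.create 16

  let f (a : int) (b : int) : int =
    let path = if a < 0 then NegBranch else PosBranch b in
    match Hashtbl.find_opt table path with
    | Some r -> r                        (* "Have I been here before?" *)
    | None ->
        let r = if a < 0 then 0 else b * b in
        Hashtbl.add table path r;
        r

  (* f (-1) 10 and f (-7) 99 share one memo entry: on that control path
     the result does not depend on b, so b is not part of the key. *)
  let () = assert (f (-1) 10 = f (-7) 99)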

Page 29

Memoized Adaptive Map

  fun map (l : list) =
    mod (let mod c = l in write (map' c))

  and memo map' c =
    mcase c of
      nil ⇒ return (nil)
      (* depends only on nil/cons *)
    | cons (h, t) ⇒
        let !h' = h and !t' = t in
        return (cons (h'+10, map t'))
        (* depends on nil/cons, the head, and the tail *)
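Revisiting the toy OCaml engine from the sketch on page 17, the self-contained version below adds a memo table keyed on the identity of the input modifiable (each modifiable carries an integer id invented for this purpose). This is a simplification of the slides' content-based memo match, but it makes the constant-time re-use concrete: after an insertion, only the new cell is processed, and the unchanged suffix is a memo hit. Containment and cleanup are still ignored.

  (* Toy modifiables, now carrying an id so they can serve as memo keys
     ("modifiables compare by reference", page 35). *)
  type 'a modref = {
    id              : int;
    mutable value   : 'a;
    mutable readers : ('a -> unit) list;
  }

  let next_id = ref 0
  let create v = incr next_id; { id = !next_id; value = v; readers = [] }

  let read m body = m.readers <- body :: m.readers; body m.value
  let write m v = m.value <- v; List.iter (fun body -> body v) m.readers

  type cell = Nil | Cons of int * clist
  and clist = cell modref

  (* Memo table: input-cell id  ->  output cell previously produced for it. *)
  let memo : (int, clist) Hashtbl.t = Hashtbl.create 16

  let rec amap (l : clist) : clist =
    match Hashtbl.find_opt memo l.id with
    | Some out -> out                    (* re-synchronize with the old result *)
    | None ->
        let out = create Nil in
        Hashtbl.add memo l.id out;
        read l (fun c ->
          match c with
          | Nil -> write out Nil
          | Cons (h, t) -> write out (Cons (h + 10, amap t)));
        out

  (* Inserting 4 after the cell holding 3 re-runs only that cell's reader;
     amap on the unchanged suffix is a memo hit, so the update is O(1). *)
  let () =
    let m5 = create (Cons (5, create Nil)) in
    let m3 = create (Cons (3, m5)) in
    let m2 = create (Cons (2, m3)) in
    let _out = amap m2 in
    write m3 (Cons (3, create (Cons (4, m5))))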

Page 30

Memoized Adaptive Map

• With selective memoization we obtain a constant-time update after an insertion.

[Figure: the input 2, 3, 5 with 4 inserted.]

Page 31

Memoized Programming

• Selective memoization requires an accurate record of I/O dependencies.
  – Which aspects of the input are relevant to the output?
  – Sensitive to dynamic control flow.

• Linguistic support provides
  – specification of approximations, and
  – accurate maintenance of dependencies.

Page 32

Type System for Memoization

• Based on S4 modal logic for necessity.
  – Truth assumptions: restricted variables.
  – Validity assumptions: ordinary variables.

• The modality ! means the value is necessary.
  – !(int × int): both components are necessary.
  – !int × !int: either, neither, or both parts may be needed, depending on control flow.

Page 33

Type System for Memoization

• Key idea: variables are classified as restricted or unrestricted.
  – Arguments to memoized functions are restricted.
  – Results of memoized functions may not involve restricted variables.
  – The elimination form for ! binds an unrestricted variable.

• This ensures that the relevant portion of the input must be explicitly “touched” to record a dependency.

Page 34

Selective Memoization

  fun map (l : list) = …

  and memo map' c =
    mcase c of
      nil ⇒ return (nil)
    | cons (h, t) ⇒                      (* h, t   : restricted   *)
        let !h' = h and !t' = t in        (* h', t' : unrestricted *)
        return (cons (h'+10, map t'))

Page 35

Adaptive Memoization

• The effectiveness of memoization depends on preserving identity.
  – Modifiables compare “by reference”.
  – Copying a modifiable impedes re-use.

• This conflicts with the functional programming model (see the sharing check below).
  – E.g., functional insertions copy structure.
  – This undermines the effectiveness of memoization.
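To see the copying concretely, here is a small OCaml check (with an invented insert_at helper) that a purely functional insertion rebuilds every cell before the insertion point and shares only the suffix.

  (* Functional insertion copies the spine up to the insertion point;
     only the suffix is shared with the original list. *)
  type cell = Nil | Cons of int * cell

  let rec insert_at (i : int) (x : int) (l : cell) : cell =
    match i, l with
    | 0, _ -> Cons (x, l)
    | _, Nil -> Cons (x, Nil)
    | _, Cons (h, t) -> Cons (h, insert_at (i - 1) x t)

  let () =
    let suffix = Cons (5, Nil) in
    let l  = Cons (2, Cons (3, suffix)) in       (* 2, 3, 5    *)
    let l' = insert_at 2 4 l in                  (* 2, 3, 4, 5 *)
    (match l' with
     | Cons (_, Cons (_, Cons (_, s'))) -> assert (s' == suffix)   (* suffix shared *)
     | _ -> assert false);
    (match l, l' with
     | Cons (_, t), Cons (_, t') -> assert (t != t')               (* prefix copied *)
     | _ -> assert false)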

Page 36

Adaptive Memoization

• Consider again applying map to the input list 2, 3, 5.

• We obtain the new modifiable list 12, 13, 15.

Page 37

Adaptive Memoization

• Now functionally insert an element: 4 is inserted into 2, 3, 5, copying the cells before the insertion point.

Page 38

Adaptive Memoization

• Running map on the result yields 12, 13, 14, 15.

Page 39

Adaptive Memoization

• Subsequent runs propagate the effect:

[Figure: the corresponding list at a later stage, 22, 23, 24, 25.]

Page 40

Adaptive Memoization

• Ideally, we’d re-use the “old” prefix!

• But what justifies this?

[Figure: the desired output 12, 13, 14, 15, with the prefix 12, 13 re-used from the old result.]

Page 41

Memoization and Adaptivity

• Consider a memoized “copy” function:

    fun copy (!m : int mod) =
      return (mod (let mod x = m in x))

• What happens if we modify m?
  – The value might change “under our feet”.
  – But adaptivity restores correctness!

Page 42

Memoization and Adaptivity

• Initial run of copy:

• Calls to copy at later stages return the “old” cell, which is updated by adaptivity.

[Figure: a modifiable containing 43 is copied into a cell containing 43; after the source is changed to 17, change propagation updates the copies to 17 as well.]

Page 43

Adaptive Memoization

• Permit inaccurate memoization.
  – This allows recovery of “old” cells.
  – The cached result will be incorrect.

• Adapt the incorrect result to restore correctness.
  – Use change propagation to revise the answer.
  – This is only sensible in conjunction with adaptivity!

Page 44

Adaptive Memoization

  fun map (l : list) = …

  and memo map' c =
    mcase c of
      nil ⇒ return (nil)
    | cons (h, t) ⇒
        let !h' = h in
        let ?t' = t in                    (* do not record a dependency on the tail *)
        return (cons (h'+10, map t'))

  (* The memo match is only on nil/cons and the head. *)

Page 45

Adaptively Memoized Map

• On the initial input 2, 3, 5 …

• … map yields the output 12, 13, 15.

Page 46

Adaptively Memoized Map

• After a functional update, the input is 2, 3, 5 with 4 inserted, i.e., 2, 3, 4, 5.

Page 47

Adaptively Memoized Map

• Now map yields the inaccurate result 12, 13, 15.

• The memo matches on the head value 2, yielding the old result, whose tail is incorrect.

Page 48

Adaptively Memoized Map

• Restore accuracy by self-assignment:

[Figure: the input list 2, 3, 5 with 4 inserted, shown before and after the self-assignment that triggers change propagation.]

Page 49

Adaptively Memoized Map

• The change to the input propagates to the output, yielding 12, 13, 14, 15.

Page 50

Some Results

• Quicksort: expected O(lg n) update after insert or delete at a random position.

• Mergesort: expected O(lg n) update after insert or delete.

• Tree Contraction: O(lg n) update after adding or deleting an edge.

• Kinetic Quickhull: O(lg n) per event (measured).

Page 51

Ongoing Work

• When can an algorithm be dynamized?
  – Consider the edit distance between traces for a class of input changes.
  – A small edit distance suggests we can build a dynamic version using SAC.

• What is a good semantic model?
  – Current methods are rather ad hoc.
  – Are there better models?

Page 52

Conclusion

• Self-adjusting computation is a powerful method for building dynamic algorithms.
  – Systematic methodology.
  – Simple correctness criteria.
  – Easy to implement.

• The interplay between linguistic and algorithmic methods is vital!
  – Linguistic methods are also a powerful algorithmic tool!

Page 53

Questions?