Randomization in Graph Optimization Problems
Transcript of Randomization in Graph Optimization Problems
Randomization in Graph Optimization Problems
David Karger, MIT
http://theory.lcs.mit.edu/~karger
Randomized Algorithms
Flip coins to decide what to do next
Avoid the hard work of making the "right" choice
Often faster and simpler than deterministic algorithms
Different from average-case analysis:
» input is worst case
» algorithm adds randomness
Methods
Random selection
» if most candidate choices are "good", then a random choice is probably good
Random sampling
» generate a small random subproblem
» solve it, extrapolate to the whole problem
Monte Carlo simulation
» simulations estimate event likelihoods
Randomized rounding for approximation
Cuts in Graphs
Focus on undirected graphs
A cut is a vertex partition
Its value is the number (or total weight) of crossing edges
Optimization with Cuts
Cut values determine the solution of many graph optimization problems:
» min-cut / max-flow
» multicommodity flow (sort of)
» bisection / separator
» network reliability
» network design
Randomization helps solve these problems
Presentation Assumption
For the entire presentation, we consider unweighted graphs (all edges have weight/capacity one)
All results apply unchanged to arbitrarily weighted graphs:
» integer weights = parallel edges
» rational weights scale to integers
» analysis unaffected
» some implementation details differ
Basic Probability
Conditional probability:
» Pr[A ∩ B] = Pr[A] · Pr[B | A]
Independent events multiply:
» Pr[A ∩ B] = Pr[A] · Pr[B]
Union bound:
» Pr[X ∪ Y] ≤ Pr[X] + Pr[Y]
Linearity of expectation:
» E[X + Y] = E[X] + E[Y]
Random Selection for Minimum Cuts
Random choices are good when problems are rare
Minimum Cut
Smallest cut of the graph
Cheapest way to separate it into 2 parts
Various applications:
» network reliability (small cuts are the weakest points)
» subtour elimination constraints for TSP
» separation oracle for network design
Not the s-t min-cut
Max-flow/Min-cut
s-t flow: edge-disjoint packing of s-t paths
s-t cut: a cut separating s and t
[FF]: s-t max-flow = s-t min-cut
» max-flow saturates all s-t min-cuts
» most efficient way to find s-t min-cuts
[GH]: min-cut is the "all-pairs" s-t min-cut
» find it using n flow computations
Flow Algorithms
Push-relabel [GT]:
» push "excess" around the graph till it's gone
» max-flow in O*(mn) (note: O* hides log factors)
– recently O*(m^(3/2)) [GR]
» min-cut in O*(mn²) --- "harder" than flow
Pipelining [HO]:
» save push/relabel data between flows
» min-cut in O*(mn) --- "as easy" as flow
Contraction
Find an edge that doesn't cross the min-cut
Contract (merge) its endpoints into 1 vertex
Contraction Algorithm
Repeat n - 2 times:
» find a non-min-cut edge
» contract it (keep parallel edges)
Each contraction decrements the number of vertices
At the end, 2 vertices are left
» they define a unique cut
» which corresponds to a min-cut of the starting graph
Picking an Edge
Must contract only non-min-cut edges
[NI]: an O(m)-time algorithm to pick such an edge
» n contractions: O(mn) time for min-cut
» slightly faster than flows
If only we could find an edge faster….
Idea: min-cut edges are few
Randomize
Repeat until 2 vertices remain:
» pick a random edge
» contract it
(keep fingers crossed)
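The randomized loop above fits in a few lines. A minimal Python sketch (the union-find edge-list representation and the trial count are illustrative choices, not the talk's implementation):

```python
import random

def contract_once(n, edges, rng):
    """One trial: contract uniformly random edges until 2 supervertices
    remain, then return the value of the cut they define."""
    parent = list(range(n))

    def find(v):  # union-find with path halving
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    alive = n
    while alive > 2:
        u, v = rng.choice(edges)      # uniform over surviving parallel edges
        ru, rv = find(u), find(v)
        if ru != rv:                  # ignore self-loops of merged vertices
            parent[ru] = rv           # contract: merge the two endpoints
            alive -= 1
    # cut value = edges whose endpoints ended up in different supervertices
    return sum(1 for u, v in edges if find(u) != find(v))

def min_cut(n, edges, trials, seed=0):
    """Repeat to amplify the >= 2/n(n-1) per-trial success probability."""
    rng = random.Random(seed)
    return min(contract_once(n, edges, rng) for _ in range(trials))

# a 4-cycle: its minimum cut has value 2
print(min_cut(4, [(0, 1), (1, 2), (2, 3), (3, 0)], trials=100))
```

Sampling a random entry of the original edge list is uniform over the surviving multigraph's edges: parallel edges stay in the list, and self-loops are simply rejected.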
Analysis I
Min-cut is small --- few edges
» suppose the graph has min-cut c
» then the minimum degree is at least c
» thus there are at least nc/2 edges
A random edge is probably safe:
Pr[min-cut edge] ≤ c/(nc/2) = 2/n
(easy generalization to the capacitated case)
Analysis II
The algorithm succeeds if it never accidentally contracts a min-cut edge
It contracts the number of vertices from n down to 2
When k vertices remain, the chance of error is 2/k
» thus, the chance of being right is 1 - 2/k
Pr[always right] is the product of the probabilities of being right each time
Analysis III
Pr[success] = ∏ Pr[k-th contraction safe]
= (1 - 2/n)(1 - 2/(n-1)) ⋯ (1 - 2/4)(1 - 2/3)
= ((n-2)/n)((n-3)/(n-1)) ⋯ (2/4)(1/3)
= 2/n(n-1)
…not too good!
Repetition
Repetition amplifies the success probability
» basic failure probability 1 - 2/n²
» so repeat 7n² times
Pr[complete failure] = Pr[fail 7n² times]
= (Pr[fail once])^(7n²)
= (1 - 2/n²)^(7n²)
< e^(-14) < 10^(-6)
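This arithmetic is easy to check numerically (a sanity check added here, not part of the talk):

```python
n = 100
fail_once = 1 - 2 / n**2              # per-trial failure probability
fail_all = fail_once ** (7 * n**2)    # failing all 7n^2 independent trials
# (1 - 2/n^2)^(7n^2) <= e^(-14) < 10^(-6)
assert fail_all < 1e-6
print(fail_all)
```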
How fast?
Easy to perform 1 trial in O(m) time
» just use an array of edges, no fancy data structures
But we need n² trials: O(mn²) time
Simpler than flows, but slower
An Improvement [KS]
When k vertices remain, the error probability is 2/k
» big when k is small
Idea: once k is small, change the algorithm
» the algorithm needs to be safer
» but can afford to be slower
Amplify by repetition!
» repeat the base algorithm many times
Recursive Algorithm
Algorithm RCA(G, n):
{G has n vertices}
repeat twice:
  randomly contract G down to n/√2 vertices
  RCA(G, n/√2)
(each repetition has a 50-50 chance of avoiding the min-cut)
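A compact sketch of the recursion in Python (the multigraph representation, the base-case cutoff, and the brute-force base case are assumptions for illustration; real implementations are more careful):

```python
import math
import random

def contract_to(vertices, edges, target, rng):
    """Randomly contract the multigraph until `target` supervertices remain;
    return the contracted multigraph (self-loops dropped)."""
    parent = {v: v for v in vertices}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    alive = len(vertices)
    while alive > target:
        u, v = rng.choice(edges)
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            alive -= 1
    new_edges = [(find(u), find(v)) for u, v in edges if find(u) != find(v)]
    return {find(v) for v in vertices}, new_edges

def rca(vertices, edges, rng):
    """Contract to n/sqrt(2) vertices and recurse, twice; return best cut seen."""
    n = len(vertices)
    best = len(edges)                      # any cut has value at most m
    if n <= 6:                             # small base case: many full trials
        for _ in range(n * n):
            _, es = contract_to(vertices, edges, 2, rng)
            best = min(best, len(es))
        return best
    target = max(2, math.ceil(n / math.sqrt(2)))
    for _ in range(2):                     # 50-50 chance each avoids the min-cut
        vs, es = contract_to(vertices, edges, target, rng)
        best = min(best, rca(vs, es, rng))
    return best
```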
Main Theorem
On any capacitated, undirected graph, Algorithm RCA:
» runs in O*(n²) time with simple structures
» finds the min-cut with probability Ω(1/log n)
Thus, O(log n) repetitions suffice to find the minimum cut (failure probability 10^(-6)) in O(n² log² n) time.
Proof Outline
The graph has O(n²) (capacitated) edges
So O(n²) work to contract, then two subproblems of size n/√2
» T(n) = 2T(n/√2) + O(n²) = O(n² log n)
The algorithm fails only if both iterations fail
» an iteration succeeds if its contractions and its recursion succeed
» P(n) = 1 - [1 - ½P(n/√2)]² = Ω(1/log n)
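The success recurrence can be iterated numerically to see the Ω(1/log n) behavior (the base case and constants below are illustrative):

```python
import math

def success_prob(n):
    """P(n) = 1 - (1 - P(n/sqrt(2))/2)^2, with P = 1 at the base case."""
    if n <= 6:
        return 1.0
    p = success_prob(n / math.sqrt(2))
    return 1 - (1 - p / 2) ** 2

for n in (1e2, 1e4, 1e6):
    # the product P(n) * ln(n) stays bounded, i.e. P(n) = Theta(1/log n)
    print(n, success_prob(n), success_prob(n) * math.log(n))
```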
Failure Modes
Monte Carlo algorithms always run fast and probably give the right answer
Las Vegas algorithms probably run fast and always give the right answer
To make a Monte Carlo algorithm Las Vegas, we need a way to check the answer
» repeat until the answer is right
No fast min-cut check is known (flow is slow!)
How do we verify a minimum cut?
Enumerating Cuts
The probabilistic method, backwards
Cut Counting
The original contraction algorithm finds any given min-cut with probability at least 2/n(n-1)
Only one cut is found per run
Disjoint events, so the probabilities add
So there are at most n(n-1)/2 min-cuts
» otherwise the probabilities would sum to more than one
Tight:
» a cycle has exactly this many min-cuts
Enumeration
RCA as stated has constant probability of finding any given min-cut
If run O(log n) times, the probability of missing a given min-cut drops to 1/n³
But there are only n² min-cuts
So the probability of missing any is at most 1/n
So with probability 1 - 1/n, we find all of them
» O(n² log³ n) time
Generalization
If G has min-cut c, a cut of value at most αc is an α-mincut
Lemma: the contraction algorithm finds any given α-mincut with probability Ω(n^(-2α))
» proof: just add the α factor to the basic analysis
Corollary: there are O(n^(2α)) α-mincuts
Corollary: we can find them all in O*(n^(2α)) time
» just change the contraction factor in RCA
Summary
A simple, fast min-cut algorithm
» random selection avoids rare problems
A generalization to near-minimum cuts
A bound on the number of small cuts
» the probabilistic method, backwards
Random Sampling
Random Sampling
A general tool for faster algorithms:
» pick a small, representative sample
» analyze it quickly (it is small)
» extrapolate to the original (it is representative)
Speed-accuracy tradeoff:
» a smaller sample means less time
» but also less accuracy
A Polling Problem
Population of size m
Subset of c red members
Goal: estimate c
Naïve method: check the whole population
Faster method: sampling
» choose a random subset of the population
» use the relative frequency in the sample as an estimate for the frequency in the population
Analysis: Chernoff Bound
Random variables Xᵢ ∈ [0,1]
Sum X = Σ Xᵢ
Bound the deviation from the expectation:
Pr[ |X - E[X]| ≥ εE[X] ] < exp(-ε²E[X]/4)
"Probably, X ∈ (1 ± ε)E[X]"
If E[X] ≥ 4(ln n)/ε², "tight concentration":
» deviation by ε has probability < 1/n
Application to Polling
Choose each member with probability p
Let X be the total number of reds seen
Then E[X] = pc
So estimate ĉ by X/p
Note ĉ is accurate to within 1 ± ε iff X is within 1 ± ε of its expectation:
ĉ = X/p ∈ (1 ± ε)E[X]/p = (1 ± ε)c
Analysis
Let Xᵢ = 1 if the i-th red item is chosen, else 0
Then X = Σ Xᵢ
The Chernoff bound applies:
» Pr[deviation by ε] < exp(-ε²pc/4)
» which is < 1/n if pc > 4(log n)/ε²
Pretty tight:
» if pc < 1, we likely see no red samples
» so no meaningful estimate
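The whole polling scheme fits in a few lines (the population encoding and parameter values are illustrative, not from the talk):

```python
import random

def estimate_reds(population, p, rng):
    """Sample each member independently with probability p, then scale the
    number of sampled reds by 1/p to estimate the true red count."""
    hits = sum(1 for is_red in population if is_red and rng.random() < p)
    return hits / p

rng = random.Random(0)
m, c = 100_000, 10_000                    # population size, true red count
population = [i < c for i in range(m)]
estimate = estimate_reds(population, p=0.05, rng=rng)
# E[hits] = pc = 500, comfortably above the concentration threshold,
# so the estimate concentrates near c
print(estimate)
```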
Sampling for Min-Cuts
Min-cut Duality
[Edmonds]: min-cut = max tree packing
» convert to a directed graph
» pick a "source" vertex s (it doesn't matter which)
» pack spanning trees directed away from s
[Gabow] "augmenting trees":
» add a tree in O*(m) time
» min-cut c (via max packing) in O*(mc)
» great if m and c are small…
Example: a graph with min-cut 2 packs 2 spanning trees directed away from the source (directed min-cut 2).
Random Sampling
Gabow's algorithm is great if m and c are small
Random sampling:
» reduces m and c
» scales cut values (in expectation)
» if we pick half the edges, we get half of each cut
So find tree packings and cuts in samples
Problem: maybe some cuts deviate a lot
Sampling Theorem
Given a graph G, build a sample G(p) by including each edge with probability p
A cut of value v in G has expected value pv in G(p)
Definition: the "constant" ρ = 8(ln n)/ε²
Theorem: with high probability, all of the exponentially many cuts in G(ρ/c) have (1 ± ε) times their expected values.
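A toy experiment in the spirit of the theorem (the graph K16, the rate p = 1/2, and the cuts checked are choices made here, not the talk's): sampled cut values land near p times their originals, with more relative noise on small cuts --- which is exactly why the theorem needs the min-cut to be at least ρ.

```python
import random
from itertools import combinations

def sample_graph(edges, p, rng):
    """The skeleton G(p): keep each edge independently with probability p."""
    return [e for e in edges if rng.random() < p]

def cut_value(edges, side):
    """Number of edges with exactly one endpoint in `side`."""
    return sum((u in side) != (v in side) for u, v in edges)

rng = random.Random(0)
edges = list(combinations(range(16), 2))   # complete graph K16
sampled = sample_graph(edges, p=0.5, rng=rng)

for k in (1, 4, 8):
    side = set(range(k))
    full, sub = cut_value(edges, side), cut_value(sampled, side)
    print(k, full, sub)   # sub is near 0.5 * full, noisier for small cuts
```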
A Simple Application
[Gabow] packs trees in O*(mc) time
Build G(ρ/c):
» minimum expected cut ρ
» by the theorem, its min-cut is probably near ρ
» find its min-cut in O*(m) time using [Gabow]
» it corresponds to a near-minimum cut of G
Result: (1 + ε) times min-cut in O*(m/ε²) time
Proof of Sampling: Idea
The Chernoff bound says the probability of a large deviation in any one cut's value is small
Problem: there are exponentially many cuts. Perhaps some deviate a great deal
Solution: we showed there are few small cuts
» only small cuts are likely to deviate much
» but they are few, so the Chernoff bound applies
Proof of Sampling
Sampling with probability ρ/c:
» a cut of value αc has mean αρ
» [Chernoff]: it deviates from its expected size by more than ε with probability at most n^(-3α)
At most n^(2α) cuts have value αc
Pr[any cut of value αc deviates] = O(n^(-α))
Sum over all α
Las Vegas Algorithms
Finding Good Certificates
Approximate Tree Packing
Break the edges into c/ρ random groups
Each looks like a sample at rate ρ/c:
» O*(mρ/c) edges
» each has minimum expected cut ρ
» so the theorem says its min-cut is at least (1 - ε)ρ
So each group has a tree packing of size (1 - ε)ρ
[Gabow] finds it in time O*(mρ/c) per group
» so the overall time is (c/ρ) · O*(mρ/c) = O*(m)
Las Vegas Algorithm
The packing algorithm is Monte Carlo
We previously found an approximate cut (faster)
If the two are close, each "certifies" the other:
» the cut exceeds the optimum cut
» the packing is below the optimum cut
If not, re-run both
Result: Las Vegas, expected time O*(m)
Exact Algorithm
Randomly partition the edges into two groups
» each is like a ½-sample: ε = O*(c^(-1/2))
Recursively pack trees in each half
» c/2 - O*(c^(1/2)) trees each
Merge the packings
» gives a packing of size c - O*(c^(1/2))
» augment it to a maximum packing: O*(mc^(1/2))
T(m,c) = 2T(m/2, c/2) + O*(mc^(1/2)) = O*(mc^(1/2))
Nearly Linear Time
Analyze Trees
Recall: [G] packs c edge-disjoint directed spanning trees
Corollary: in such a packing, some tree crosses the min-cut only twice
To find the min-cut:
» find a tree packing
» find the smallest cut that at most 2 tree edges cross
Constraint Trees
Min-cut c:
» c directed trees
» 2c directed min-cut edges
» so on average, two min-cut edges per tree
Definition: a tree 2-crosses a cut if at most two of its edges cross it
Finding the Cut
From the crossing tree edges, deduce the cut:
» remove the tree edges
» no other tree edges cross the cut
» so each component is entirely on one side
» and opposite its "neighbor's" side
Two Problems
Packing trees takes too long
» Gabow's runtime is O*(mc)
Too many trees to check
» we only claimed that one (of c) is good
Solution: sampling
Sampling
Use G(ρ/c) with ε = 1/8:
» pack O*(ρ) trees in O*(m) time
» the original min-cut has O(ρ) edges in G(ρ/c)
» some tree 2-crosses it in G(ρ/c)
» …and thus 2-crosses it in G
Analyze the O*(ρ) trees in G
» time O*(m) per tree
Monte Carlo
Simple First Step
Discuss the case where one tree edge crosses the min-cut
Analyzing a Tree
Root the tree, so that each candidate cut removes a subtree
Use a dynamic program up from the leaves to determine subtree cuts efficiently
Given the cuts at the children of a node, compute the cut at the parent
Definitions:
» v↓ is the set of nodes below v
» C(v) is the value of the cut at subtree v
The Dynamic Program
For a node u with children v and w:
C(u) = C(v) + C(w) - 2·C(v,w)
where C(v,w) counts the edges with least common ancestor u: each such edge is counted in both C(v) and C(w) ("keep") but does not cross the cut at u ("discard")
Algorithm: 1-Crossing Trees
Compute the edges' LCAs: O(m)
Compute the "cuts" at the leaves:
» cut values = degrees
» each edge is incident on at most two leaves
» total time O(m)
Dynamic program upwards: O(n)
Total: O(m + n)
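The subtree-cut values can be sketched via an equivalent identity: C(v) = (sum of degrees in v's subtree) - 2 · (edges whose LCA lies in the subtree, i.e. both endpoints inside). The naive parent-walking LCA below makes this O(mn) rather than the slide's O(m + n), but it shows the bottom-up structure (names and representation are assumptions made here):

```python
def subtree_cut_values(tree_children, root, graph_edges):
    """For each vertex v of a rooted spanning tree, return C(v): the number
    of graph edges with exactly one endpoint inside v's subtree."""
    parent, depth, order = {root: None}, {root: 0}, [root]
    for v in order:                        # BFS for parents and depths
        for ch in tree_children.get(v, []):
            parent[ch], depth[ch] = v, depth[v] + 1
            order.append(ch)

    def lca(a, b):                         # naive parent-walking LCA
        while depth[a] > depth[b]:
            a = parent[a]
        while depth[b] > depth[a]:
            b = parent[b]
        while a != b:
            a, b = parent[a], parent[b]
        return a

    deg = {v: 0 for v in order}
    lca_count = {v: 0 for v in order}
    for a, b in graph_edges:
        deg[a] += 1
        deg[b] += 1
        lca_count[lca(a, b)] += 1          # edge lies inside subtree(lca)

    sumdeg, within = dict(deg), dict(lca_count)
    for v in reversed(order):              # children before parents
        for ch in tree_children.get(v, []):
            sumdeg[v] += sumdeg[ch]
            within[v] += within[ch]
    return {v: sumdeg[v] - 2 * within[v] for v in order}
```

For example, with the tree rooted at 0 given by children {0: [1, 2], 1: [3, 4]} and graph edges [(0,1), (0,2), (1,3), (1,4), (3,4), (2,4)], the cut at subtree 1 has value 2: only (0,1) and (2,4) leave it.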
2-Crossing Trees
A cut now corresponds to a pair of subtrees v and w:
cut(v,w) = C(v) + C(w) - 2·C(v,w)
» n² table entries
» filled in O(n²) time with a dynamic program
Linear Time
The bottleneck is the C(v,w) computations
Avoid them: find the right "twin" w for each v:
min over w of [ C(v) + C(w) - 2·C(v,w) ]
Compute this using the addpath and minpath operations of dynamic trees [ST]
Result: O(m log³ n) time (messy)
How do we verify a minimum cut?
Network Design
Randomized Rounding
Problem Statement
Given vertices, and a cost c_vw to buy an edge from v to w, find the minimum-cost purchase that creates a graph with the desired connectivity properties
Example: minimum-cost k-connected graph
Generally NP-hard
Recent approximation algorithms [GW], [JV]
Integer Linear Program
Variable x_vw = 1 if we buy edge vw
Solution cost: Σ x_vw c_vw
Constraint: for every cut, Σ x_vw ≥ k
Relaxing integrality gives a tractable LP
» exponentially many cut constraints
» but separation oracles exist (e.g., min-cut)
What is the integrality gap?
Randomized Rounding
Given the LP solution values x_vw
Build a graph where edge vw is present with probability x_vw
The expected cost is at most opt: Σ x_vw c_vw
The expected number of edges crossing any cut satisfies that cut's constraint
If the expected number is large for every cut, the sampling theorem applies
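The rounding step itself is one line; the fractional values below are toy numbers, not the output of a real LP solve:

```python
import random

def round_solution(x, rng):
    """Randomized rounding: buy edge e independently with probability x[e]."""
    return {e for e, xe in x.items() if rng.random() < xe}

# toy fractional LP solution (illustrative values)
x = {(0, 1): 1.0, (1, 2): 0.5, (2, 3): 1.0, (3, 0): 0.5, (0, 2): 0.25}
cost = {e: 1.0 for e in x}
lp_cost = sum(x[e] * cost[e] for e in x)           # = 3.25

rng = random.Random(0)
trials = 20_000
avg = sum(sum(cost[e] for e in round_solution(x, rng))
          for _ in range(trials)) / trials
print(lp_cost, avg)    # the average rounded cost concentrates near lp_cost
```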
k-Connected Subgraph
The fractional solution is k-connected
So every cut has (expected) ≥ k edges crossing it in the rounded solution
The sampling theorem says every cut has at least k - O((k log n)^(1/2)) edges
A close approximation for large k
Can often repair: e.g., get a k-connected subgraph at cost 1 + O(((log n)/k)^(1/2)) times the minimum
Repair Methods
Slightly increase all x_vw before rounding:
» e.g., multiply by (1 + ε)
» works fine, but some x_vw become > 1
» a problem if we only want a single use of each edge
Round to an approximation, then fix:
» solve an "augmentation problem" using other network design techniques
» may give a worse approximation, but only on a small part of the cost
Nonuniform Sampling
Concentrate on the important things
[Benczur-Karger, Karger, Karger-Levine]
s-t Min-Cuts
Recall: if G has min-cut c, then in G(ρ/c) all cuts approximate their expected values to within ε.
Applications:
» min-cut in O*(mc) time [G] → approximate/exact in O*((m/c) · c) = O*(m)
» s-t min-cut of value v in O*(mv) → approximate in O*(mv/c) time
Trouble if c is small and v is large.
The Problem
Cut sampling relied on the Chernoff bound
Chernoff bounds require that no single edge is a large fraction of the expectation of any cut it crosses
If the sampling rate is much less than 1/c, each edge across a min-cut is too significant
But: if an edge only crosses large cuts, then a sampling rate much less than 1/c is OK!
Biased Sampling
The original sampling theorem is weak when:
» m is large
» c is small
But if m is large:
» then G has dense regions
» where c must be large
» where we can sample more sparsely
Problem → old time → new time:
» approx. s-t min-cut: O*(mv) → O*(nv/ε²)
» approx. s-t min-cut: O*(mn) → O*(n²/ε²)
» approx. s-t max-flow: O*(m^(3/2)) → O*(mn^(1/2)/ε)
» flow of value v: O*(mv) → O*(nv)
(since m → n/ε² in weighted, undirected graphs)
Strong Components
Definition: a k-strong component is a maximal vertex-induced subgraph with min-cut k.
(figure: components of strength 2 and 3)
Nonuniform Sampling
Definition: an edge is k-strong if its endpoints are in the same k-strong component.
This is stricter than having k-connected endpoints.
Definition: the strong connectivity c_e of an edge e is the largest k for which e is k-strong.
Plan: sample dense regions lightly
Nonuniform Sampling
Idea: if an edge is k-strong, then it lies in a k-connected graph
So it is "safe" to sample it with probability 1/k
Problem: if we sample edges with different probabilities, E[cut value] gets messy
Solution: if we sample e with probability p_e, give it weight 1/p_e
Then E[cut value] = original cut value
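The reweighting trick in miniature (the graph and probabilities below are illustrative): a kept edge gets weight 1/p_e, so any fixed cut's sampled weight is correct in expectation.

```python
import random

def compress(edges, p_e, rng):
    """Keep edge e with probability p_e[e]; kept edges get weight 1/p_e[e]."""
    return {e: 1.0 / p_e[e] for e in edges if rng.random() < p_e[e]}

edges = [(0, 1), (1, 2), (2, 0), (0, 3)]
p_e = {(0, 1): 0.5, (1, 2): 0.5, (2, 0): 0.5, (0, 3): 1.0}  # weak bridge kept

rng = random.Random(0)
trials = 20_000
avg = 0.0
for _ in range(trials):
    sample = compress(edges, p_e, rng)
    # cut separating vertex 0: crossed by (0,1), (2,0), (0,3); true value 3
    avg += sum(w for (u, v), w in sample.items() if 0 in (u, v))
avg /= trials
print(avg)    # concentrates near 3, the original cut value
```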
Compression Theorem
Definition: given compression probabilities p_e, the compressed graph G[p_e]:
» includes edge e with probability p_e, and
» gives it weight 1/p_e if included
Note E[G[p_e]] = G
Theorem: G[ρ/c_e]:
» approximates all cuts to within ε
» has O(ρn) edges
Application
Compress the graph to O*(n/ε²) edges
Find an s-t max-flow in the compressed graph
This gives an s-t min-cut in the compressed graph
And so an approximate s-t min-cut in the original
Assorted runtimes:
» [GT] O(mn) becomes O*(n²/ε²)
» [FF] O(mv) becomes O(nv/ε²)
» [GR] O(m^(3/2)) becomes O(n^(3/2)/ε³)
Proof (Approximation)
Basic idea: in a k-strong component, edges get sampled with probability ρ/k
» the original sampling theorem works there
Problem: some edges may lie in stronger components, and are sampled less often
Induct up from the strongest components:
» apply the original sampling theorem inside
» then "freeze" them so they don't affect the weaker parts
Strength Lemma
Lemma: Σ 1/c_e ≤ n
» consider a connected component C of G
» suppose C has min-cut k
» then every edge e in C has c_e ≥ k
» so the k edges crossing C's min-cut have Σ 1/c_e ≤ k · (1/k) = 1
» delete these edges ("cost" 1)
» repeat at most n - 1 times: no more edges!
Proof (Edge Count)
Edge e is included with probability ρ/c_e
So the expected number of edges is Σ ρ/c_e
We saw Σ 1/c_e ≤ n
So the expected number is at most ρn
Construction
To sample, we must find the edge strengths
» we can't find them exactly, but an approximation suffices
Sparse certificates identify weak edges:
» constructed in linear time [NI]
» contain all edges crossing cuts of value ≤ k
» iterate until the strong components emerge
Iterate for the 2^i-strong edges, for all i
» tricks make it strongly polynomial
NI Certificate Algorithm
Repeat k times:
» find a spanning forest
» delete it
Each iteration deletes one edge from every nonempty cut (the forest is spanning)
So at the end, every edge crossing a cut of size ≤ k has been deleted; the deleted forests form the certificate
[NI] pipelines all the iterations in O(m) time
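A simplified version of the certificate construction (unpipelined, so O(km) instead of the O(m) of [NI]; the representation is an assumption made here):

```python
def sparse_certificate(n, edges, k):
    """Repeat k times: extract a spanning forest and delete it. The union of
    the deleted forests contains every edge crossing a cut of value <= k."""
    remaining = list(edges)
    certificate = []
    for _ in range(k):
        parent = list(range(n))

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v

        forest, rest = [], []
        for u, v in remaining:             # greedy spanning forest
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                forest.append((u, v))
            else:
                rest.append((u, v))
        certificate.extend(forest)
        remaining = rest
    return certificate
```

On K5 with k = 2, for instance, the certificate has at most 2(n - 1) = 8 edges yet still crosses every vertex cut at least twice.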
Approximate Flows
Uniform sampling led to tree algorithms:
» randomly partition the edges
» merge trees from each partition element
Compression is problematic for flow:
» edge capacities are changed
» so flow path capacities are distorted
» a flow in the compressed graph doesn't fit in the original graph
Smoothing
If an edge has strength ce, divide it into ρ/ce parallel edges of capacity ce/ρ
» Creates Σ ρ/ce ≤ ρn edges
Now each edge is only a 1/ρ fraction of any cut of its strong component
» So sampling a 1/2 fraction works
» So dividing into 2 groups works
» Yields a (1-ε) max-flow in O*(mn^1/2 / ε) time
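The edge-division step can be sketched as follows (a hypothetical helper, not the slides' implementation; it assumes strengths have already been computed and rounds the piece count up so it is integral):

```python
import math

def smooth(edges, rho):
    """Smoothing sketch: replace each edge by roughly rho/c_e parallel
    pieces whose capacities sum to the original capacity, so no single
    piece is more than about a 1/rho fraction of any cut of its strong
    component.  `edges` maps (u, v) -> (capacity, strength c_e)."""
    out = []
    for (u, v), (cap, strength) in edges.items():
        pieces = max(1, math.ceil(rho / strength))
        out.extend((u, v, cap / pieces) for _ in range(pieces))
    return out
```

The total capacity between each vertex pair is preserved; only the granularity changes.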
Exact Flow Algorithms
Sampling from residual graphs
Residual Graphs
Sampling can be used to approximate cuts and flows
A non-maximum flow can be made maximum by augmenting paths
But the residual graph is directed. Can sampling help?
» Yes, to a limited extent
First Try
Suppose current flow has value f
» residual flow value v - f
Lemma: if all edges sampled with probability ρv/(c(v-f)), then, w.h.p., all directed cuts are within ε of expectations
» Original undirected sampling used ρ/c
Expectations nonzero, so no empty cut
So, some augmenting path exists
Application
When residual flow is i, seek an augmenting path in a sample of mρv/(ic) edges. Time O(mρv/(ic)).
Sum over all i from v down to 1
Total O(mρv (log v)/c) since Σ 1/i = O(log v)
Here, ε can be any constant < 1 (say ½)
So ρ = O(log n)
Overall runtime O*(mv/c)
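Summing the per-phase work over residual flow values i = v down to 1 gives the harmonic-series bound quoted above (with ρ = O(log n)):

```latex
\sum_{i=1}^{v} \frac{m\rho v}{ic}
  \;=\; \frac{m\rho v}{c}\sum_{i=1}^{v}\frac{1}{i}
  \;=\; \frac{m\rho v}{c}\,O(\log v)
  \;=\; O^{*}\!\left(\frac{mv}{c}\right)
```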
Proof
Augmenting a unit of flow from s to t decrements the residual capacity of each s-t cut by exactly one
Analysis
Each s-t cut loses f edges, had at least v
So, it has at least (v-f)/v times as many edges as before
But we increase the sampling probability by a factor of v/(v-f)
So the expected number of sampled edges is no worse than before
So Chernoff and union bound apply as before
Strong Connectivity
Drawback of previous: dependence on minimum cut c
Solution: use strong connectivities
Initialize γ = 1
Repeat until done
» Sample edges with probabilities γ/ke
» Look for augmenting path
» If don't find, double γ
Analysis
Theorem: if we sample with probabilities γ/ke, and γ > ρv/(v-f), then we will find an augmenting path w.h.p.
Runtime:
» γ always within a factor of 2 of the "right" ρv/(v-f)
» As in compression, edge count O(γn)
» So runtime O(n Σ ρv/i) = O*(nv)
Summary
Nonuniform sampling for cuts and flows
Approximate cuts in O*(n²) time
» for arbitrary flow value
Max flow in O*(nv) time
» only useful for "small" flow value
» but does work for weighted graphs
» large flow open
Network Reliability
Monte Carlo estimation
The Problem
Input:
» Graph G with n vertices
» Edge failure probabilities
  – For exposition, fix a single p
Output:
» FAIL(p): probability G is disconnected by edge failures
Approximation Algorithms
Computing FAIL(p) is #P-complete [V]
» Exact algorithm seems unlikely
Approximation scheme
» Given G, p, ε, outputs an ε-approximation
» May be randomized:
  – succeed with high probability
» Fully polynomial (FPRAS) if runtime is polynomial in n, 1/ε
Monte Carlo Simulation
Flip a coin for each edge, test graph
» k failures in t trials: FAIL(p) ≈ k/t
» E[k/t] = FAIL(p)
How many trials needed for confidence?
» "bad luck" on trials can yield bad estimate
» clearly need at least 1/FAIL(p)
Chernoff bound: O*(1/(ε² FAIL(p))) trials suffice to give probable accuracy within ε
» Time O*(m/(ε² FAIL(p)))
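A minimal sketch of the naive simulation (hypothetical helper names; connectivity of the surviving graph is checked with union-find):

```python
import random

def connected(n, edges):
    """Return True iff the graph on n vertices with these edges is connected."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    comps = n
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            comps -= 1
    return comps == 1

def estimate_fail(n, edges, p, trials, seed=0):
    """Naive Monte Carlo: each trial fails every edge independently with
    probability p; return the fraction of trials that disconnect G."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        survivors = [e for e in edges if rng.random() >= p]
        if not connected(n, survivors):
            failures += 1
    return failures / trials
```

With p = 0 or p = 1 the estimate is exactly 0 or 1; in between, the Chernoff bound above governs how many trials are needed for a reliable answer.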
Chernoff Bound
Random variables Xi ∈ [0,1]
Sum X = Σ Xi
Bound deviation from expectation:
  Pr[ |X - E[X]| ≥ ε E[X] ] < exp(-ε² E[X] / 4)
If E[X] ≥ 4(log n)/ε², "tight concentration"
» Deviation by ε has probability < 1/n
No one variable is a big part of E[X]
Application
Let Xi = 1 if trial i is a failure, else 0
Let X = X1 + … + Xt
Then E[X] = t · FAIL(p)
Chernoff says X is within relative ε of E[X] with probability 1 - exp(-ε² t FAIL(p) / 4)
So choose t to cancel the other terms
» "High probability": t = O(log n / (ε² FAIL(p)))
» Deviation by ε with probability < 1/n
For network reliability
Random edge failures
» Estimate FAIL(p) = Pr[graph disconnects]
Naïve Monte Carlo simulation
» Chernoff bound ("tight concentration"):
  Pr[ |X - E[X]| ≥ ε E[X] ] ≤ exp(-ε² E[X] / 4)
» O(log n / (ε² FAIL(p))) trials expect O(log n / ε²) network failures, sufficient for Chernoff
» So estimate within ε in O*(m/(ε² FAIL(p))) time
Rare Events
When FAIL(p) is too small, it takes too long to collect sufficient statistics
Solution: skew trials to make the interesting event more likely
But in a way that lets you recover the original probability
DNF Counting
Given DNF formula (OR of ANDs)
  (e1 ∧ e2 ∧ e3) ∨ (e1 ∧ e4) ∨ (e2 ∧ e6)
Each variable set true with probability p
Estimate Pr[formula true]
» #P-complete
FPRAS [KL, KLM]
» Skew to make true outcomes "common"
» Time linear in formula size
Rewrite problem
Assume p = 1/2
» Count satisfying assignments
"Satisfaction matrix"
» Truth table with one column per clause
» Sij = 1 if ith assignment satisfies jth clause
We want the number of nonzero rows
Satisfaction Matrix

                Clauses
Assignments   1  1  0  1
              0  0  0  0
              0  1  0  0
              1  0  1  0

3 nonzero rows
Randomly sampling rows won't work
» Might be too few nonzero rows
New sample space
» Normalize each nonzero row to sum to one
» So the sum of all nonzeros is the desired value
» Goal: estimate the average nonzero
» Method: sample random nonzeros

                 Clauses
Assignments   1/3  1/3   0   1/3
               0    0    0    0
               0    1    0    0
              1/2   0   1/2   0
Sampling Nonzeros
We know the number of nonzeros per column
» If an assignment satisfies a given clause, all variables in the clause must be true
» All other variables are unconstrained
Estimate the average by random sampling
» Know the number of nonzeros per column
» So can pick a random column
» Then pick a random true-for-column assignment
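The whole scheme can be sketched for the monotone, p = 1/2 case from the slides (a hypothetical function, not the authors' code; clause j has 2^(n-|j|) satisfying assignments, i.e. nonzeros in its column):

```python
import random

def karp_luby_count(n, clauses, samples, seed=0):
    """Karp-Luby-style estimate of the number of satisfying assignments
    of a monotone DNF (clauses = sets of variables that must all be true).
    Samples a uniformly random nonzero of the satisfaction matrix: pick a
    column with probability proportional to its nonzero count, then a
    uniform assignment satisfying that clause.  The sample value is the
    normalized row entry, 1/(#clauses the assignment satisfies)."""
    rng = random.Random(seed)
    weights = [2 ** (n - len(c)) for c in clauses]   # nonzeros per column
    total = sum(weights)                             # total nonzero entries
    acc = 0.0
    for _ in range(samples):
        j = rng.choices(range(len(clauses)), weights=weights)[0]
        # variables in clause j forced true, the rest fair coin flips
        assign = [v in clauses[j] or rng.random() < 0.5 for v in range(n)]
        satisfied = sum(1 for c in clauses if all(assign[v] for v in c))
        acc += 1.0 / satisfied
    return total * acc / samples   # (#nonzeros) * (average nonzero value)
```

The average nonzero value times the number of nonzero entries is exactly the number of nonzero rows, i.e. the count of satisfying assignments.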
Few Samples Needed
Suppose k clauses
Then E[sample] > 1/k
» 1 ≤ #satisfied clauses ≤ k
» 1 ≥ sample value ≥ 1/k
Taking O(k log n / ε²) samples gives a "large" mean
So Chernoff says the sample mean is probably a good estimate
Reliability Connection
Reliability as DNF counting:
» Variable per edge, true if edge fails
» Cut fails if all its edges do (AND of edge vars)
» Graph fails if some cut does (OR of cuts)
» FAIL(p) = Pr[formula true]
Problem: the DNF has 2^n clauses
Focus on Small Cuts
Fact: FAIL(p) > p^c
Theorem: if p^c = 1/n^(2+δ), then Pr[a cut of value > αc fails] < n^(-αδ)
Corollary: FAIL(p) ≈ Pr[some α-mincut fails], where α = 1 + 2/δ
Recall: O(n^2α) α-mincuts
Enumerate with RCA, run DNF counting
Review
Contraction Algorithm
» O(n^2α) α-mincuts
» Enumerate in O*(n^2α) time
Proof of Theorem
Given p^c = 1/n^(2+δ)
At most n^2α cuts have value αc
Each fails with probability p^(αc) = 1/n^(α(2+δ))
Pr[any cut of value αc fails] = O(n^(-αδ))
Sum over all α ≥ 1
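The union bound behind this step, written out with the slide's parameters:

```latex
\Pr[\text{some cut of value } \alpha c \text{ fails}]
  \;\le\; n^{2\alpha}\, p^{\alpha c}
  \;=\; n^{2\alpha}\, n^{-\alpha(2+\delta)}
  \;=\; n^{-\alpha\delta}
```

Summing this bound over all α ≥ 1 keeps the total failure probability of large cuts at O(n^(-δ)).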
Algorithm
RCA can enumerate all α-minimum cuts with high probability in O*(n^2α) time.
Given the α-minimum cuts, can ε-estimate the probability one fails via Monte Carlo simulation for DNF counting (formula size O(n^2α))
Corollary: when FAIL(p) < n^-(2+δ), can ε-approximate it in O*(cn^(2+4/δ)) time
Combine
For large FAIL(p), naïve Monte Carlo
For small FAIL(p), RCA/DNF counting
Balance: ε-approximation in O*(mn^3.5/ε²) time
Implementations show it practical for hundreds of nodes
Again, no way to verify correctness
Summary
Naïve Monte Carlo simulation works well for common events
Need to adapt for rare events
Cut structure and DNF counting let us do this for network reliability
Conclusions
Conclusion
Randomization is a crucial tool for algorithm design
Often yields algorithms that are faster or simpler than traditional counterparts
In particular, gives significant improvements for core problems in graph algorithms
Randomized Methods
Random selection
» if most candidate choices "good", then a random choice is probably good
Monte Carlo simulation
» simulations estimate event likelihoods
Random sampling
» generate a small random subproblem
» solve, extrapolate to whole problem
Randomized Rounding for approximation
Random Selection
When most choices are good, pick one at random
Recursive contraction algorithm for minimum cuts
» Extremely simple (also to implement)
» Fast in theory and in practice [CGKLS]
Monte Carlo
To estimate event likelihood, run trials
» Slow for very rare events
» Bias samples to reveal the rare event
» FPRAS for network reliability
Random Sampling
Generate representative subproblem
Use it to estimate solution to the whole
» Gives approximate solution
» May be quickly repaired to exact solution
Bias sample toward "important" or "sensitive" parts of problem
New max-flow and min-cut algorithms
Randomized Rounding
Convert fractional to integral solutions
» Get approximation algorithms for integer programs
» "Sampling" from a well-designed sample space of feasible solutions
» Good approximations for network design
Generalization
Our techniques work because undirected graphs are matroids
All our results extend/are special cases:
» Packing bases
» Finding minimum "quotients"
» Matroid optimization (MST)
Directed Graphs?
Directed graphs are not matroids
Directed graphs can have lots of minimum cuts
Sampling doesn't appear to work
Open problems
Flow in O(n²) time
» Eliminate v dependence
» Apply to weighted graphs with large flows
» Flow in O(m) time?
Las Vegas algorithms
» Finding good certificates
Deterministic algorithms
» Deterministic construction of "samples"
» Deterministically compress a graph
Randomization in Graph Optimization Problems
David Karger
MIT
http://theory.lcs.mit.edu/~karger