A Trainable Graph Combination Scheme for Belief Propagation


Page 1: A Trainable Graph Combination Scheme for Belief Propagation

A Trainable Graph Combination Scheme for Belief Propagation

Kai Ju Liu

New York University

Page 2: A Trainable Graph Combination Scheme for Belief Propagation

Images

Page 3: A Trainable Graph Combination Scheme for Belief Propagation

Pairwise Markov Random Field

[Figure: example graph with vertices 1–5]

• Basic structure: vertices, edges

Page 4: A Trainable Graph Combination Scheme for Belief Propagation

Pairwise Markov Random Field

• Basic structure: vertices, edges

• Vertex i has a set of possible states $X_i$ and an observed value $y_i$

• Compatibility between states and observed values: $\phi_i(x_i, y_i)$

• Compatibility between neighboring vertices i and j: $\psi_{ij}(x_i, x_j)$

[Figure: the example graph with state sets $X_1,\ldots,X_5$, observed values $y_1,\ldots,y_5$, and edge compatibilities $\psi_{12}, \psi_{23}, \psi_{34}, \psi_{35}, \psi_{45}$]

Page 5: A Trainable Graph Combination Scheme for Belief Propagation

Pairwise MRF: Probabilities

• Joint probability:

$$p(x_1,\ldots,x_5) = \frac{1}{Z} \prod_i \phi_i(x_i, y_i) \prod_{(i,j)} \psi_{ij}(x_i, x_j)$$

• Marginal probability:

$$p_i(x_i) = \sum_{x_j,\; j \neq i} p(x_1,\ldots,x_5)$$

– Advantage: allows averaging over ambiguous states
– Disadvantage: complexity exponential in the number of vertices
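The exponential cost is easy to see in code. Below is a minimal brute-force sketch (not from the talk) that evaluates the joint and one marginal on the five-vertex example graph; the potentials are arbitrary random stand-ins, and $\phi_i$ is treated as already evaluated at the observed $y_i$. The sum over all $S^5$ configurations is exactly what makes naive marginalization exponential in the number of vertices.

```python
import itertools
import numpy as np

# The five-vertex example graph; the edge set includes the loop 3-4-5
vertices = [1, 2, 3, 4, 5]
edges = [(1, 2), (2, 3), (3, 4), (3, 5), (4, 5)]
S = 2  # states per vertex (binary, for illustration)

rng = np.random.default_rng(0)
# Stand-in potentials (assumed, not from the talk); phi[i] plays the role
# of phi_i(x_i, y_i) already evaluated at the observed y_i
phi = {i: rng.random(S) + 0.1 for i in vertices}
psi = {e: rng.random((S, S)) + 0.1 for e in edges}

def joint_unnormalized(x):
    """prod_i phi_i(x_i) * prod_(i,j) psi_ij(x_i, x_j), without the 1/Z."""
    p = 1.0
    for i in vertices:
        p *= phi[i][x[i]]
    for i, j in edges:
        p *= psi[(i, j)][x[i], x[j]]
    return p

def marginal(v):
    """p_v(x_v) by summing the joint over all S**len(vertices) configurations."""
    m = np.zeros(S)
    for states in itertools.product(range(S), repeat=len(vertices)):
        x = dict(zip(vertices, states))
        m[x[v]] += joint_unnormalized(x)
    return m / m.sum()  # dividing by the total performs the 1/Z normalization

print(marginal(3))  # a probability vector over vertex 3's states
```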

Page 6: A Trainable Graph Combination Scheme for Belief Propagation

Belief Propagation

[Figure: the example graph with vertices 1–5]

Page 7: A Trainable Graph Combination Scheme for Belief Propagation

Belief Propagation

• Beliefs replace probabilities:

$$b_i(x_i) = \frac{1}{z_i}\, \phi_i(x_i, y_i) \prod_{j \in N(i)} m_{ji}(x_i)$$

• Messages propagate information:

$$m_{ji}(x_i) = \sum_{x_j \in X_j} \phi_j(x_j, y_j)\, \psi_{ij}(x_i, x_j) \prod_{k \in N(j) \setminus i} m_{kj}(x_j)$$

[Figure: the example graph with beliefs $b_1,\ldots,b_5$ and messages $m_{12}(x_2)$, $m_{21}(x_1)$, $m_{23}(x_3)$, $m_{32}(x_2)$, $m_{34}(x_4)$, $m_{43}(x_3)$, $m_{35}(x_5)$, $m_{53}(x_3)$]
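The two update rules above can be sketched directly on a small singly-connected graph. The example below (a three-vertex chain with assumed random potentials, $\phi_i$ again pre-evaluated at the observed $y_i$) computes the belief at the middle vertex from its two incoming messages and checks it against the brute-force marginal; on a graph without loops the two agree exactly.

```python
import itertools
import numpy as np

# Exact BP on the singly-connected chain 1-2-3, with stand-in potentials
S = 3
rng = np.random.default_rng(1)
phi = [rng.random(S) + 0.1 for _ in range(3)]  # phi[0..2] for vertices 1..3
psi12 = rng.random((S, S)) + 0.1               # psi12[x1, x2]
psi23 = rng.random((S, S)) + 0.1               # psi23[x2, x3]

# Messages into vertex 2; vertices 1 and 3 are leaves, so no inner product term
m_12 = psi12.T @ phi[0]   # m_12(x2) = sum_x1 phi_1(x1) psi12(x1, x2)
m_32 = psi23 @ phi[2]     # m_32(x2) = sum_x3 phi_3(x3) psi23(x2, x3)

b2 = phi[1] * m_12 * m_32
b2 /= b2.sum()            # normalized belief at vertex 2

# Brute-force marginal for comparison
p2 = np.zeros(S)
for x1, x2, x3 in itertools.product(range(S), repeat=3):
    p2[x2] += phi[0][x1] * phi[1][x2] * phi[2][x3] * psi12[x1, x2] * psi23[x2, x3]
p2 /= p2.sum()

print(np.allclose(b2, p2))  # True: on an SCG, beliefs equal marginals
```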

Page 8: A Trainable Graph Combination Scheme for Belief Propagation

BP: Questions

• When can we calculate beliefs exactly?
• When do beliefs equal probabilities?
• When is belief propagation efficient?

Answer: Singly-Connected Graphs (SCG's)
• Graphs without loops
• Messages terminate at leaf vertices
• Beliefs equal probabilities
• Complexity in the previous example reduced from $13S^5$ to $24S^2$

Page 9: A Trainable Graph Combination Scheme for Belief Propagation

BP on Loopy Graphs

• Messages do not terminate

• Energy approximation schemes [Freeman et al.]
– Standard belief propagation
– Generalized belief propagation

• Standard belief propagation
– Approximates the Gibbs free energy of the system by the Bethe free energy
– Iterates, requiring convergence criteria

[Figure: a four-vertex loop 1–2–3–4 with messages $m_{21}(x_1)$, $m_{32}(x_2)$, $m_{43}(x_3)$, $m_{14}(x_4)$ circulating without terminating]
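For reference, the Bethe approximation the slide alludes to is commonly written as follows (in the notation of Yedidia, Freeman, and Weiss, with pairwise beliefs $b_{ij}$, single-vertex beliefs $b_i$, and $q_i$ the number of neighbors of vertex $i$; the talk's exact form may differ):

$$F_{\text{Bethe}} = \sum_{(i,j)} \sum_{x_i, x_j} b_{ij}(x_i, x_j) \ln \frac{b_{ij}(x_i, x_j)}{\psi_{ij}(x_i, x_j)\, \phi_i(x_i)\, \phi_j(x_j)} \;-\; \sum_i (q_i - 1) \sum_{x_i} b_i(x_i) \ln \frac{b_i(x_i)}{\phi_i(x_i)}$$

Standard BP iterates message updates whose fixed points are stationary points of this functional, which is why convergence criteria are needed on loopy graphs.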

Page 10: A Trainable Graph Combination Scheme for Belief Propagation

BP on Loopy Graphs

• Tree-based reparameterization [Wainwright]
– Reparameterizes distributions on singly-connected graphs
– Convergence improved compared to standard belief propagation
– Permits calculation of bounds on approximation errors

Page 11: A Trainable Graph Combination Scheme for Belief Propagation

BP-TwoGraphs

• Eliminates iteration
• Utilizes advantages of SCG's

Page 12: A Trainable Graph Combination Scheme for Belief Propagation

BP-TwoGraphs

• Consider a loopy graph with n vertices

• Select two sets of SCG's that approximate the graph: $G_1, \ldots, G_n$ and $H_1, \ldots, H_n$

• Calculate beliefs on each set of SCG's: $b_i^G(x_i)$ and $b_i^H(x_i)$

• Select the set of beliefs with minimum entropy:

$$b_i = \arg\min_{b \in \{b_i^G,\, b_i^H\}} \Bigl( -\sum_{x_i} b(x_i) \log b(x_i) \Bigr)$$
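The minimum-entropy selection step can be sketched in a few lines. The belief vectors below are made-up examples (not from the talk); the point is only that the lower-entropy, i.e. more confident, belief wins at each vertex.

```python
import numpy as np

def entropy(b):
    """Shannon entropy H(b) = -sum_x b(x) log b(x), with 0 log 0 = 0."""
    b = np.asarray(b, dtype=float)
    logs = np.log(b, where=b > 0, out=np.zeros_like(b))
    return -np.sum(b * logs)

def select_beliefs(beliefs_G, beliefs_H):
    """Per vertex, keep whichever candidate belief has lower entropy."""
    return [bG if entropy(bG) <= entropy(bH) else bH
            for bG, bH in zip(beliefs_G, beliefs_H)]

# Example: the G-belief is sharper at vertex 1, the H-belief at vertex 2
bG = [np.array([0.9, 0.1]), np.array([0.5, 0.5])]
bH = [np.array([0.6, 0.4]), np.array([0.2, 0.8])]
chosen = select_beliefs(bG, bH)  # keeps [0.9, 0.1] then [0.2, 0.8]
```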

Page 13: A Trainable Graph Combination Scheme for Belief Propagation

BP-TwoGraphs on Images

• Rectangular grid of pixel vertices

• Hi: horizontal graphs

• Gi: vertical graphs

[Figure: original graph, horizontal graph, vertical graph]
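A natural starting point for the horizontal and vertical graph sets is to split the grid's edges by orientation, as in the sketch below; note this is only the raw material, and the talk's exact construction of the SCG sets $H_i$ and $G_i$ may differ.

```python
# Split the edges of a W x H pixel grid by orientation; pixels are (x, y)
# with x in [0, W) and y in [0, H). A sketch, not the talk's exact scheme.
def grid_edges(W, H):
    horizontal = [((x, y), (x + 1, y)) for y in range(H) for x in range(W - 1)]
    vertical = [((x, y), (x, y + 1)) for x in range(W) for y in range(H - 1)]
    return horizontal, vertical

h, v = grid_edges(4, 3)
print(len(h), len(v))  # (W-1)*H = 9 horizontal edges, W*(H-1) = 8 vertical
```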

Page 14: A Trainable Graph Combination Scheme for Belief Propagation

Image Segmentation

[Figure: original image → add noise → segment]

Page 15: A Trainable Graph Combination Scheme for Belief Propagation

Image Segmentation Results

Page 16: A Trainable Graph Combination Scheme for Belief Propagation

Image Segmentation Revisited

[Figure: noisy image with ground truth; max-flow result with ground truth]

Page 17: A Trainable Graph Combination Scheme for Belief Propagation

Image Segmentation: Horizontal Graph Analysis

Page 18: A Trainable Graph Combination Scheme for Belief Propagation

Image Segmentation: Vertical Graph Analysis

Page 19: A Trainable Graph Combination Scheme for Belief Propagation

BP-TwoLines

• Rectangular grid of pixel vertices

• Hi: horizontal lines

• Gi: vertical lines

[Figure: original graph, horizontal line, vertical line]

Page 20: A Trainable Graph Combination Scheme for Belief Propagation

Image Segmentation Results II

Page 21: A Trainable Graph Combination Scheme for Belief Propagation

Image Segmentation Results III

Page 22: A Trainable Graph Combination Scheme for Belief Propagation

Natural Image Segmentation

Page 23: A Trainable Graph Combination Scheme for Belief Propagation

Boundary-Based Image Segmentation: Window Vertices

• Square 2-by-2 window of pixels

• Each pixel has two states

– foreground

– background
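Since each of the four pixels in a 2-by-2 window is independently foreground or background, a window vertex has $2^4 = 16$ possible states. A one-line enumeration sketch:

```python
from itertools import product

# Enumerate the states of a 2x2 window vertex: one state per assignment
# of its 4 pixels to {background, foreground}, giving 2**4 = 16 states
window_states = list(product(("background", "foreground"), repeat=4))
print(len(window_states))  # 16
```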

Page 24: A Trainable Graph Combination Scheme for Belief Propagation

Boundary-Based Image Segmentation: Overlap

Page 25: A Trainable Graph Combination Scheme for Belief Propagation

Boundary-Based Image Segmentation: Graph

Page 26: A Trainable Graph Combination Scheme for Belief Propagation

Real Image Segmentation: Training

Page 27: A Trainable Graph Combination Scheme for Belief Propagation

Real Image Segmentation: Results

Page 28: A Trainable Graph Combination Scheme for Belief Propagation

Real Image Segmentation: Gorilla Results

Page 29: A Trainable Graph Combination Scheme for Belief Propagation

Conclusion

• BP-TwoGraphs
– Accurate and efficient
– Extensive use of beliefs
– Trainable parameters

• Future work
– Multiple states
– Stereo
– Image fusion