Motion estimation


Page 1: Motion estimation

Motion estimation

Digital Visual Effects, Spring 2008, Yung-Yu Chuang, 2008/4/8

with slides by Michael Black and P. Anandan

Page 2: Motion estimation

Announcements

• Artifacts #1 voting http://140.112.29.103/~hsiao/cgi-bin/votepage.cgi

• Some scribes are online
• Project #2 checkpoint due on this Friday.

Send an image to TA.

Page 3: Motion estimation

Motion estimation

• Parametric motion (image alignment)
• Tracking
• Optical flow

Page 4: Motion estimation

Parametric motion

direct method for image stitching

Page 5: Motion estimation

Tracking

Page 6: Motion estimation

Optical flow

Page 7: Motion estimation

Three assumptions

• Brightness consistency
• Spatial coherence
• Temporal persistence

Page 8: Motion estimation

Brightness consistency

Image measurements (e.g. brightness) in a small region remain the same although their location may change.

Page 9: Motion estimation

Spatial coherence

• Neighboring points in the scene typically belong to the same surface and hence typically have similar motions.

• Since they also project to nearby pixels in the image, we expect spatial coherence in image flow.

Page 10: Motion estimation

Temporal persistence

The image motion of a surface patch changes gradually over time.

Page 11: Motion estimation

Image registration

Goal: register a template image T(x) and an input image I(x), where x = (x, y)^T (i.e. warp I so that it matches T).

Image alignment: I(x) and T(x) are two images.
Tracking: T(x) is a small patch around a point p in the image at time t; I(x) is the image at time t+1.
Optical flow: T(x) and I(x) are patches of the images at times t and t+1.


Page 12: Motion estimation

Simple approach (for translation)

• Minimize brightness difference

E(u, v) = \sum_{x,y} \left[ I(x+u, y+v) - T(x, y) \right]^2

Page 13: Motion estimation

Simple SSD algorithm

For each offset (u, v)

compute E(u,v);

Choose (u, v) which minimizes E(u,v);

Problems:
• Not efficient
• No sub-pixel accuracy
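For concreteness, here is a minimal Python/NumPy sketch of this exhaustive SSD search (not part of the original slides; the function name, the grayscale float inputs of equal shape, and the max_shift search range are assumptions):

import numpy as np

def ssd_translation(I, T, max_shift=8):
    """Exhaustive SSD search for a purely translational motion (u, v).

    I and T are 2-D float arrays of the same shape. Every integer offset
    in [-max_shift, max_shift]^2 is tried, and the offset minimizing the
    sum of squared differences over the overlapping region is returned.
    Assumes max_shift is much smaller than the image size.
    """
    h, w = T.shape
    best_err, best_uv = np.inf, (0, 0)
    for v in range(-max_shift, max_shift + 1):
        for u in range(-max_shift, max_shift + 1):
            # Overlap of T(x, y) and I(x+u, y+v), clipped at the borders.
            x0, x1 = max(0, -u), min(w, w - u)
            y0, y1 = max(0, -v), min(h, h - v)
            diff = I[y0 + v:y1 + v, x0 + u:x1 + u] - T[y0:y1, x0:x1]
            # Normalizing by the overlap area is a practical tweak (not on
            # the slides) so that large offsets are not favored.
            err = np.sum(diff ** 2) / diff.size
            if err < best_err:
                best_err, best_uv = err, (u, v)
    return best_uv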

Page 14: Motion estimation

Lucas-Kanade algorithm

Page 15: Motion estimation

Newton’s method

• Root finding for f(x) = 0
• March along x and test for sign changes
• Determine the step Δx (small → slow; large → may miss the root)

Page 16: Motion estimation

Newton’s method

• Root finding for f(x)=0

Taylor's expansion:

f(x) = f(x_0) + f'(x_0)(x - x_0) + \frac{1}{2} f''(x_0)(x - x_0)^2 + \cdots

Keeping only the first-order terms and setting f(x) = 0:

f(x) \approx f(x_0) + f'(x_0)(x - x_0)

\Delta x = -\frac{f(x_n)}{f'(x_n)}

x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}

Page 17: Motion estimation

Newton’s method

• Root finding for f(x)=0

[Figure: successive iterates x_0, x_1, x_2 converging toward the root]

Page 18: Motion estimation

Newton’s method

pick x = x_0
iterate
    compute \Delta x = -f(x) / f'(x)
    update x by x + \Delta x
until converge

Root finding is useful for optimization: to minimize g(x), find a root of f(x) = g'(x) = 0.
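A short Python sketch of this iteration (illustrative, not from the slides; f and fprime are assumed to be user-supplied callables):

def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Newton's method for f(x) = 0: repeat x <- x - f(x) / f'(x)."""
    x = x0
    for _ in range(max_iter):
        dx = -f(x) / fprime(x)
        x += dx
        if abs(dx) < tol:       # converged
            break
    return x

# Example: minimize g(x) = (x - 2)^2 + 1 by finding the root of g'(x) = 2(x - 2).
print(newton(lambda x: 2 * (x - 2), lambda x: 2.0, x0=10.0))   # ~2.0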

Page 19: Motion estimation

Lucas-Kanade algorithm

E(u, v) = \sum_{x,y} \left[ I(x+u, y+v) - T(x, y) \right]^2

First-order Taylor expansion:  I(x+u, y+v) \approx I(x, y) + u I_x + v I_y

E(u, v) \approx \sum_{x,y} \left[ I(x, y) + u I_x + v I_y - T(x, y) \right]^2

0 = \frac{\partial E}{\partial u} = \sum_{x,y} 2 I_x \left[ I(x, y) + u I_x + v I_y - T(x, y) \right]

0 = \frac{\partial E}{\partial v} = \sum_{x,y} 2 I_y \left[ I(x, y) + u I_x + v I_y - T(x, y) \right]

Page 20: Motion estimation

Lucas-Kanade algorithm

0 = \frac{\partial E}{\partial u} = \sum_{x,y} 2 I_x \left[ I(x, y) + u I_x + v I_y - T(x, y) \right]

0 = \frac{\partial E}{\partial v} = \sum_{x,y} 2 I_y \left[ I(x, y) + u I_x + v I_y - T(x, y) \right]

Rearranging gives a 2×2 linear system:

\sum_{x,y} I_x^2 \, u + \sum_{x,y} I_x I_y \, v = \sum_{x,y} I_x \left[ T(x, y) - I(x, y) \right]

\sum_{x,y} I_x I_y \, u + \sum_{x,y} I_y^2 \, v = \sum_{x,y} I_y \left[ T(x, y) - I(x, y) \right]

\begin{bmatrix} \sum I_x^2 & \sum I_x I_y \\ \sum I_x I_y & \sum I_y^2 \end{bmatrix}
\begin{bmatrix} u \\ v \end{bmatrix}
=
\begin{bmatrix} \sum I_x \left[ T(x, y) - I(x, y) \right] \\ \sum I_y \left[ T(x, y) - I(x, y) \right] \end{bmatrix}

Page 21: Motion estimation

Lucas-Kanade algorithm

iterate
    shift I(x, y) with (u, v)
    compute gradient images I_x, I_y
    compute error image T(x, y) - I(x, y)
    compute the Hessian matrix
    solve the linear system below for (\Delta u, \Delta v)
    (u, v) = (u, v) + (\Delta u, \Delta v)
until converge

\begin{bmatrix} \sum I_x^2 & \sum I_x I_y \\ \sum I_x I_y & \sum I_y^2 \end{bmatrix}
\begin{bmatrix} \Delta u \\ \Delta v \end{bmatrix}
=
\begin{bmatrix} \sum I_x \left[ T(x, y) - I(x, y) \right] \\ \sum I_y \left[ T(x, y) - I(x, y) \right] \end{bmatrix}
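A Python/NumPy sketch of the translational iteration above (illustrative; it assumes grayscale float images of the same shape and uses scipy.ndimage.shift for the sub-pixel warp):

import numpy as np
from scipy.ndimage import shift as nd_shift

def lucas_kanade_translation(I, T, num_iters=20):
    """Estimate a single translation (u, v) aligning I to T.

    Each iteration shifts I (and its gradients) by the current estimate,
    forms the error image T - I_shifted, and solves the 2x2 linear system
    of Page 20 for the update (du, dv).
    """
    u = v = 0.0
    Iy, Ix = np.gradient(I)                      # image gradients
    for _ in range(num_iters):
        Iw  = nd_shift(I,  (-v, -u), order=1)    # I(x+u, y+v)
        Ixw = nd_shift(Ix, (-v, -u), order=1)
        Iyw = nd_shift(Iy, (-v, -u), order=1)
        err = T - Iw                             # error image
        A = np.array([[np.sum(Ixw * Ixw), np.sum(Ixw * Iyw)],
                      [np.sum(Ixw * Iyw), np.sum(Iyw * Iyw)]])
        b = np.array([np.sum(Ixw * err), np.sum(Iyw * err)])
        du, dv = np.linalg.solve(A, b)
        u, v = u + du, v + dv
        if du * du + dv * dv < 1e-8:             # converged
            break
    return u, v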

Page 22: Motion estimation

Parametric model

The SSD objective for translation,

E(u, v) = \sum_{x,y} \left[ I(x+u, y+v) - T(x, y) \right]^2,

generalizes to a parametric warp W(x; p):

E(p) = \sum_{x} \left[ I(W(x; p)) - T(x) \right]^2

translation:  W(x; p) = \begin{bmatrix} x + d_x \\ y + d_y \end{bmatrix},  p = (d_x, d_y)^T

affine:  W(x; p) = \begin{bmatrix} (1 + d_{xx}) x + d_{xy} y + d_x \\ d_{yx} x + (1 + d_{yy}) y + d_y \end{bmatrix} = A x + d,  p = (d_{xx}, d_{yx}, d_{xy}, d_{yy}, d_x, d_y)^T

Our goal is to find p to minimize E(p) for all x in T's domain.

Page 23: Motion estimation

Parametric model

minimize \sum_{x} \left[ I(W(x; p + \Delta p)) - T(x) \right]^2 with respect to \Delta p

First-order Taylor expansion of the warped image:

I(W(x; p + \Delta p)) \approx I(W(x; p)) + \nabla I \frac{\partial W}{\partial p} \Delta p

minimize \sum_{x} \left[ I(W(x; p)) + \nabla I \frac{\partial W}{\partial p} \Delta p - T(x) \right]^2

Page 24: Motion estimation

Parametric model

\sum_{x} \left[ I(W(x; p)) + \nabla I \frac{\partial W}{\partial p} \Delta p - T(x) \right]^2

where I(W(x; p)) is the warped image, \nabla I is the image gradient, \partial W / \partial p is the Jacobian of the warp, and T(x) is the target image.

\frac{\partial W}{\partial p} =
\begin{bmatrix}
\frac{\partial W_x}{\partial p_1} & \frac{\partial W_x}{\partial p_2} & \cdots & \frac{\partial W_x}{\partial p_n} \\
\frac{\partial W_y}{\partial p_1} & \frac{\partial W_y}{\partial p_2} & \cdots & \frac{\partial W_y}{\partial p_n}
\end{bmatrix}

Page 25: Motion estimation

Jacobian matrix

• The Jacobian matrix is the matrix of all first-order partial derivatives of a vector-valued function F: \mathbb{R}^n \to \mathbb{R}^m,

F(x_1, \ldots, x_n) = \left( f_1(x_1, \ldots, x_n), f_2(x_1, \ldots, x_n), \ldots, f_m(x_1, \ldots, x_n) \right)

J_F(x_1, \ldots, x_n) = \frac{\partial (f_1, \ldots, f_m)}{\partial (x_1, \ldots, x_n)} =
\begin{bmatrix}
\frac{\partial f_1}{\partial x_1} & \cdots & \frac{\partial f_1}{\partial x_n} \\
\vdots & \ddots & \vdots \\
\frac{\partial f_m}{\partial x_1} & \cdots & \frac{\partial f_m}{\partial x_n}
\end{bmatrix}

First-order approximation:  F(x + \Delta x) \approx F(x) + J_F(x) \Delta x

Page 26: Motion estimation

Jacobian matrix

Example (spherical coordinates):  F: \mathbb{R}^+ \times [0, \pi] \times [0, 2\pi) \to \mathbb{R}^3,

F(r, \phi, \theta) = (t, u, v),  t = r \sin\phi \cos\theta,  u = r \sin\phi \sin\theta,  v = r \cos\phi

J_F(r, \phi, \theta) =
\begin{bmatrix}
\partial t / \partial r & \partial t / \partial \phi & \partial t / \partial \theta \\
\partial u / \partial r & \partial u / \partial \phi & \partial u / \partial \theta \\
\partial v / \partial r & \partial v / \partial \phi & \partial v / \partial \theta
\end{bmatrix}
=
\begin{bmatrix}
\sin\phi \cos\theta & r \cos\phi \cos\theta & -r \sin\phi \sin\theta \\
\sin\phi \sin\theta & r \cos\phi \sin\theta & r \sin\phi \cos\theta \\
\cos\phi & -r \sin\phi & 0
\end{bmatrix}

Page 27: Motion estimation

Parametric model

(The same expression as on Page 24, repeated before expanding the Jacobian of the warp:)

\sum_{x} \left[ I(W(x; p)) + \nabla I \frac{\partial W}{\partial p} \Delta p - T(x) \right]^2

where I(W(x; p)) is the warped image, \nabla I is the image gradient, \partial W / \partial p is the Jacobian of the warp, and T(x) is the target image.

Page 28: Motion estimation

Jacobian of the warp

For example, for affine

W(x; p) = \begin{bmatrix} (1 + d_{xx}) x + d_{xy} y + d_x \\ d_{yx} x + (1 + d_{yy}) y + d_y \end{bmatrix}
= \begin{bmatrix} 1 + d_{xx} & d_{xy} & d_x \\ d_{yx} & 1 + d_{yy} & d_y \end{bmatrix}
\begin{bmatrix} x \\ y \\ 1 \end{bmatrix}

With the parameters ordered p = (d_{xx}, d_{yx}, d_{xy}, d_{yy}, d_x, d_y)^T, the Jacobian of the warp is

\frac{\partial W}{\partial p} =
\begin{bmatrix}
x & 0 & y & 0 & 1 & 0 \\
0 & x & 0 & y & 0 & 1
\end{bmatrix}
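A small Python/NumPy sketch of evaluating this 2x6 Jacobian at every pixel (illustrative; the function name and the (height, width, 2, 6) output layout are assumptions):

import numpy as np

def affine_warp_jacobian(height, width):
    """dW/dp of the affine warp at every pixel.

    With p = (dxx, dyx, dxy, dyy, dx, dy), the Jacobian at pixel (x, y) is
    [[x, 0, y, 0, 1, 0], [0, x, 0, y, 0, 1]].
    Returns an array of shape (height, width, 2, 6).
    """
    y, x = np.mgrid[0:height, 0:width].astype(float)
    zero, one = np.zeros_like(x), np.ones_like(x)
    row_x = np.stack([x, zero, y, zero, one, zero], axis=-1)   # dWx/dp
    row_y = np.stack([zero, x, zero, y, zero, one], axis=-1)   # dWy/dp
    return np.stack([row_x, row_y], axis=-2)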

Page 29: Motion estimation

Parametric model

\Delta p = \arg\min_{\Delta p} \sum_{x} \left[ I(W(x; p)) + \nabla I \frac{\partial W}{\partial p} \Delta p - T(x) \right]^2

Setting the derivative with respect to \Delta p to zero:

0 = \sum_{x} \left[ \nabla I \frac{\partial W}{\partial p} \right]^T \left[ I(W(x; p)) + \nabla I \frac{\partial W}{\partial p} \Delta p - T(x) \right]

\Delta p = H^{-1} \sum_{x} \left[ \nabla I \frac{\partial W}{\partial p} \right]^T \left[ T(x) - I(W(x; p)) \right]

where H is the (approximated) Hessian

H = \sum_{x} \left[ \nabla I \frac{\partial W}{\partial p} \right]^T \left[ \nabla I \frac{\partial W}{\partial p} \right]

Page 30: Motion estimation

Lucas-Kanade algorithm

iterate
1) warp I with W(x; p)
2) compute error image T(x, y) - I(W(x; p))
3) compute the gradient image \nabla I at W(x; p)
4) evaluate the Jacobian \partial W / \partial p at (x; p)
5) compute the steepest-descent images \nabla I \, \partial W / \partial p
6) compute the Hessian H = \sum_{x} [ \nabla I \, \partial W / \partial p ]^T [ \nabla I \, \partial W / \partial p ]
7) compute \sum_{x} [ \nabla I \, \partial W / \partial p ]^T [ T(x) - I(W(x; p)) ]
8) solve \Delta p = H^{-1} \sum_{x} [ \nabla I \, \partial W / \partial p ]^T [ T(x) - I(W(x; p)) ]
9) update p by p + \Delta p
until converge
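A compact Python/NumPy sketch of this forward-additive loop for the affine warp of Page 28 (illustrative, not the course implementation; it assumes grayscale float images of the same shape and uses scipy.ndimage.map_coordinates for the warp):

import numpy as np
from scipy.ndimage import map_coordinates

def lucas_kanade_affine(I, T, num_iters=30):
    """Estimate affine parameters p = (dxx, dyx, dxy, dyy, dx, dy) aligning I to T."""
    h, w = T.shape
    p = np.zeros(6)
    y, x = np.mgrid[0:h, 0:w].astype(float)
    Iy, Ix = np.gradient(I)

    # Jacobian dW/dp at every pixel, shape (h, w, 2, 6), columns ordered as p.
    zero, one = np.zeros_like(x), np.ones_like(x)
    dWdp = np.stack([np.stack([x, zero, y, zero, one, zero], -1),
                     np.stack([zero, x, zero, y, zero, one], -1)], -2)

    def warp(img, p):
        # Sample img at W(x; p); map_coordinates expects (row, col) coordinates.
        wx = (1 + p[0]) * x + p[2] * y + p[4]
        wy = p[1] * x + (1 + p[3]) * y + p[5]
        return map_coordinates(img, [wy, wx], order=1)

    for _ in range(num_iters):
        Iw, Ixw, Iyw = warp(I, p), warp(Ix, p), warp(Iy, p)   # steps 1, 3
        err = T - Iw                                          # step 2
        grad = np.stack([Ixw, Iyw], -1)                       # (h, w, 2)
        sd = np.einsum('hwc,hwcp->hwp', grad, dWdp)           # step 5: steepest descent
        H = np.einsum('hwp,hwq->pq', sd, sd)                  # step 6: 6x6 Hessian
        b = np.einsum('hwp,hw->p', sd, err)                   # step 7
        dp = np.linalg.solve(H, b)                            # step 8
        p += dp                                               # step 9
        if np.dot(dp, dp) < 1e-10:
            break
    return p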

Pages 31-41: Motion estimation

(Each of these slides repeats the update equation \Delta p = H^{-1} \sum_{x} [ \nabla I \, \partial W / \partial p ]^T [ T(x) - I(W(x; p)) ]; the accompanying figures are not reproduced in this transcript.)

Page 42: Motion estimation

Coarse-to-fine strategy

[Figure: coarse-to-fine estimation. Pyramids are constructed for both images; at each level the current motion estimate a is used to warp J toward I (J → J_w), the estimate is refined against I, and the refined a is propagated to the next finer level, yielding the final estimate a_out.]
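A Python/NumPy sketch of the coarse-to-fine idea for the translational case (illustrative; the 2x downsampling with scipy.ndimage.zoom and the small inner solver are simplifying assumptions):

import numpy as np
from scipy.ndimage import shift as nd_shift, zoom

def single_level_flow(I, T, u, v, num_iters=10):
    """Refine a translational estimate (u, v) at one pyramid level."""
    Iy, Ix = np.gradient(I)
    for _ in range(num_iters):
        Iw  = nd_shift(I,  (-v, -u), order=1)
        Ixw = nd_shift(Ix, (-v, -u), order=1)
        Iyw = nd_shift(Iy, (-v, -u), order=1)
        err = T - Iw
        A = np.array([[np.sum(Ixw ** 2),   np.sum(Ixw * Iyw)],
                      [np.sum(Ixw * Iyw),  np.sum(Iyw ** 2)]])
        b = np.array([np.sum(Ixw * err), np.sum(Iyw * err)])
        du, dv = np.linalg.solve(A, b)
        u, v = u + du, v + dv
    return u, v

def coarse_to_fine_flow(I, T, num_levels=4):
    """Estimate (u, v) from the coarsest pyramid level down to the finest."""
    pyr_I, pyr_T = [I], [T]
    for _ in range(num_levels - 1):
        pyr_I.append(zoom(pyr_I[-1], 0.5, order=1))   # 2x downsampling
        pyr_T.append(zoom(pyr_T[-1], 0.5, order=1))
    u = v = 0.0
    for Il, Tl in zip(reversed(pyr_I), reversed(pyr_T)):
        u, v = single_level_flow(Il, Tl, u, v)
        if Il is not pyr_I[0]:     # propagate the estimate to the next finer level
            u, v = 2 * u, 2 * v
    return u, v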

Page 43: Motion estimation

Application of image alignment

Page 44: Motion estimation

Direct vs. feature-based

• Direct methods use all the image information and can be very accurate, but they depend on the fragile "brightness constancy" assumption.
• Iterative approaches require initialization.
• They are not robust to illumination changes and noisy images.
• In the early days, direct methods worked better.
• Feature-based methods are now more robust and potentially faster.
• Even better, they can recognize panoramas without initialization.

Page 45: Motion estimation

Tracking

Page 46: Motion estimation

Tracking

I(x,y,t) I(x,y,t+1)

Page 47: Motion estimation

Tracking

Brightness constancy:  I(x + u, y + v, t + 1) - I(x, y, t) = 0

First-order Taylor expansion:

I(x, y, t) + I_x(x, y, t) u + I_y(x, y, t) v + I_t(x, y, t) - I(x, y, t) = 0

I_x(x, y, t) u + I_y(x, y, t) v + I_t(x, y, t) = 0

I_x u + I_y v + I_t = 0    (optical flow constraint equation)

Page 48: Motion estimation

Optical flow constraint equation

Page 49: Motion estimation

Multiple constraints

Page 50: Motion estimation

Area-based method

• Assume spatial smoothness

Page 51: Motion estimation

Area-based method

• Assume spatial smoothness

E(u, v) = \sum_{x,y} \left( I_x u + I_y v + I_t \right)^2

Page 52: Motion estimation

Area-based method

The 2×2 matrix \begin{bmatrix} \sum I_x^2 & \sum I_x I_y \\ \sum I_x I_y & \sum I_y^2 \end{bmatrix} must be invertible.

Page 53: Motion estimation

Area-based method

• The eigenvalues tell us about the local image structure.
• They also tell us how well we can estimate the flow in both directions.
• Link to the Harris corner detector.

Page 54: Motion estimation

Textured area

Page 55: Motion estimation

Edge

Page 56: Motion estimation

Homogeneous area

Page 57: Motion estimation

KLT tracking

• Select features by \min(\lambda_1, \lambda_2)
• Monitor features by measuring dissimilarity
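A Python/NumPy sketch of selecting features by the minimum eigenvalue of the windowed gradient matrix, in the spirit of Shi & Tomasi (illustrative; the box window via scipy.ndimage.uniform_filter and the lack of non-maximum suppression are simplifications):

import numpy as np
from scipy.ndimage import uniform_filter

def min_eigenvalue_map(I, window=7):
    """Per-pixel min eigenvalue of [[sum Ix^2, sum Ix Iy], [sum Ix Iy, sum Iy^2]].

    uniform_filter returns the windowed mean, i.e. the windowed sum up to a
    constant factor, which does not change the ranking of the pixels.
    """
    Iy, Ix = np.gradient(I)
    Sxx = uniform_filter(Ix * Ix, window)
    Syy = uniform_filter(Iy * Iy, window)
    Sxy = uniform_filter(Ix * Iy, window)
    # Closed-form eigenvalues of a symmetric 2x2 matrix.
    half_trace = 0.5 * (Sxx + Syy)
    disc = np.sqrt(0.25 * (Sxx - Syy) ** 2 + Sxy ** 2)
    return half_trace - disc            # lambda_min

def select_features(I, window=7, num_features=100):
    """Return the (row, col) positions with the largest lambda_min."""
    lam = min_eigenvalue_map(I, window)
    idx = np.argsort(lam.ravel())[::-1][:num_features]
    return np.column_stack(np.unravel_index(idx, lam.shape))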

Page 58: Motion estimation

Aperture problem

Page 59: Motion estimation

Aperture problem

Page 60: Motion estimation

Aperture problem

Page 62: Motion estimation

Aperture problem

• A larger window reduces ambiguity, but it more easily violates the spatial-smoothness assumption.

Page 63: Motion estimation
Page 64: Motion estimation
Page 65: Motion estimation
Page 66: Motion estimation

KLT tracking

http://www.ces.clemson.edu/~stb/klt/

Page 67: Motion estimation

KLT tracking

http://www.ces.clemson.edu/~stb/klt/

Page 68: Motion estimation

SIFT tracking (matching actually)

Frame 0 Frame 10

Page 69: Motion estimation

SIFT tracking

Frame 0 Frame 100

Page 70: Motion estimation

SIFT tracking

Frame 0 Frame 200

Page 71: Motion estimation

KLT vs. SIFT tracking

• KLT has a larger accumulated error, partly because our KLT implementation doesn't include the affine transformation?
• SIFT is surprisingly robust.
• Combination of SIFT and KLT (example): http://www.frc.ri.cmu.edu/projects/buzzard/smalls/

Page 72: Motion estimation

Rotoscoping (Max Fleischer 1914)

1937

Page 73: Motion estimation

Tracking for rotoscoping

Page 74: Motion estimation

Tracking for rotoscoping

Page 75: Motion estimation

Waking life (2001)

Page 76: Motion estimation

A Scanner Darkly (2006)

• Rotoshop, proprietary software. Each minute of animation required 500 hours of work.

Page 77: Motion estimation

Optical flow

Page 78: Motion estimation

Single-motion assumption

Violated by:
• Motion discontinuity
• Shadows
• Transparency
• Specular reflection
• …

Page 79: Motion estimation

Multiple motion

Page 80: Motion estimation

Multiple motion

Page 81: Motion estimation

Simple problem: fit a line

Page 82: Motion estimation

Least-square fit

Page 83: Motion estimation

Least-square fit

Page 84: Motion estimation

Robust statistics

• Recover the best fit for the majority of the data

• Detect and reject outliers

Page 85: Motion estimation

Approach

Page 86: Motion estimation

Robust weighting

Truncated quadratic

Page 87: Motion estimation

Robust weighting

Geman & McClure
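For reference, commonly cited forms of the two robust rho-functions named on these slides (the exact parameterization and scaling used in the lecture figures may differ):

% Truncated quadratic: quadratic for small residuals, but the penalty is
% capped so large residuals (outliers) stop influencing the fit.
\rho_{\mathrm{TQ}}(x;\,\tau) = \min\left(x^{2},\ \tau\right)

% Geman-McClure: a smooth redescending estimator; its influence function
% \psi = \rho' falls back toward zero for large |x|.
\rho_{\mathrm{GM}}(x;\,\sigma) = \frac{x^{2}}{\sigma^{2} + x^{2}},
\qquad
\psi_{\mathrm{GM}}(x;\,\sigma) = \frac{2\,\sigma^{2}\,x}{\left(\sigma^{2} + x^{2}\right)^{2}}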

Page 88: Motion estimation

Robust estimation

Page 89: Motion estimation

Fragmented occlusion

Page 90: Motion estimation
Page 91: Motion estimation
Page 92: Motion estimation

Regularization and dense optical flow

• Neighboring points in the scene typically belong to the same surface and hence typically have similar motions.

• Since they also project to nearby pixels in the image, we expect spatial coherence in image flow.

Page 93: Motion estimation
Page 94: Motion estimation
Page 95: Motion estimation
Page 96: Motion estimation
Page 97: Motion estimation
Page 98: Motion estimation
Page 99: Motion estimation

[Figure: input image, horizontal motion, vertical motion]

Page 100: Motion estimation
Page 101: Motion estimation
Page 102: Motion estimation

Application of optical flow

video matching

Page 103: Motion estimation
Page 104: Motion estimation

Input for the NPR algorithm

Page 105: Motion estimation

Brushes

Page 106: Motion estimation

Edge clipping

Page 107: Motion estimation

Gradient

Page 108: Motion estimation

Smooth gradient

Page 109: Motion estimation

Textured brush

Page 110: Motion estimation

Edge clipping

Page 111: Motion estimation

Temporal artifacts

Frame-by-frame application of the NPR algorithm

Page 112: Motion estimation

Temporal coherence

Page 113: Motion estimation

What Dreams May Come (1998)

Page 114: Motion estimation

References

• B. D. Lucas and T. Kanade, An Iterative Image Registration Technique with an Application to Stereo Vision, Proceedings of the 1981 DARPA Image Understanding Workshop, 1981, pp. 121-130.
• J. R. Bergen, P. Anandan, K. J. Hanna and R. Hingorani, Hierarchical Model-Based Motion Estimation, ECCV 1992, pp. 237-252.
• J. Shi and C. Tomasi, Good Features to Track, CVPR 1994, pp. 593-600.
• Michael Black and P. Anandan, The Robust Estimation of Multiple Motions: Parametric and Piecewise-Smooth Flow Fields, Computer Vision and Image Understanding, 1996, pp. 75-104.
• S. Baker and I. Matthews, Lucas-Kanade 20 Years On: A Unifying Framework, International Journal of Computer Vision, 56(3), 2004, pp. 221-255.
• Peter Litwinowicz, Processing Images and Video for an Impressionist Effect, SIGGRAPH 1997.
• Aseem Agarwala, Aaron Hertzmann, David Salesin and Steven Seitz, Keyframe-Based Tracking for Rotoscoping and Animation, SIGGRAPH 2004, pp. 584-591.