CSE486, Penn State, Robert Collins

Lecture 15: Robust Estimation: RANSAC

RECALL: Parameter Estimation: General Strategy

• Least-squares estimation from point correspondences

Let's say we have found point matches between two images, and we think they are related by some parametric transformation (e.g. translation, scaled Euclidean, affine). How do we estimate the parameters of that transformation?

But there are problems with that approach....

Problem: Outliers

Loosely speaking, outliers are points that don't "fit" the model.

[Figure: data with two points labeled "outlier"]

Bad Data => Outliers

Loosely speaking, outliers are points that don't "fit" the model. Points that do fit are called "inliers".

[Figure: the same data, with the two outliers and the inliers labeled]

Problem with Outliers

Least squares estimation is sensitive to outliers, so a few outliers can greatly skew the result.

[Figure: compare a clean least-squares fit against least-squares regression with outliers]

Solution: estimation methods that are robust to outliers.

Outliers aren't the Only Problem

Multiple structures can also skew the results. (The fit procedure implicitly assumes there is only one instance of the model in the data.)


Robust Estimation

• View estimation as a two-stage process:
  – Classify data points as outliers or inliers
  – Fit model to inliers while ignoring outliers

• Example technique: RANSAC (RANdom SAmple Consensus)

M. A. Fischler and R. C. Bolles (June 1981). "Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography". Comm. of the ACM 24(6): 381-395.

RANSAC Procedure

[Figure sequence: minimal samples of points are drawn at random, a model is fit to each sample, and the number of points that agree with it is counted: Count = 4, Count = 6, Count = 19, Count = 13. The hypothesis with the largest consensus count (19) is the best fit found.]

RANSAC pseudocode (Forsyth & Ponce)

[Algorithm box: generic RANSAC with parameters s (number of points per sample), N (number of samples), d (distance threshold for counting a point as an inlier), and T (acceptable size of the consensus set).]
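A minimal sketch of this procedure in Python for the 2D line-fitting case (not the lecture's code; the function name and defaults are illustrative). It draws N random samples of s = 2 points, fits a line to each, and keeps the hypothesis with the largest consensus count within distance d:

    import numpy as np

    def ransac_line(points, n_samples, d):
        """Minimal RANSAC sketch for 2D line fitting: s = 2 points per
        sample, N = n_samples trials, d = inlier distance threshold."""
        rng = np.random.default_rng(0)
        best_line, best_count = None, -1
        for _ in range(n_samples):                 # N random trials
            p1, p2 = points[rng.choice(len(points), size=2, replace=False)]
            # Line through p1, p2 as a*x + b*y + c = 0 with unit normal (a, b).
            a, b = p2[1] - p1[1], p1[0] - p2[0]
            norm = np.hypot(a, b)
            if norm == 0:
                continue                           # degenerate: identical points
            a, b = a / norm, b / norm
            c = -(a * p1[0] + b * p1[1])
            dist = np.abs(points @ np.array([a, b]) + c)
            count = int((dist < d).sum())          # consensus count for this line
            if count > best_count:
                best_line, best_count = (a, b, c), count
        return best_line, best_count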


How Many Samples to Choose?

Solve the following for N:

    1 - (1 - (1 - e)^s)^N = p

e = probability that a point is an outlier
s = number of points in a sample
N = number of samples (we want to compute this)
p = desired probability that we get a good sample

Where in the world did that come from? Build the expression up term by term:

• (1 - e): probability that choosing one point yields an inlier.
• (1 - e)^s: probability of choosing s inliers in a row (sample only contains inliers).
• 1 - (1 - e)^s: probability that one or more points in the sample were outliers (sample is contaminated).
• (1 - (1 - e)^s)^N: probability that all N samples were contaminated.
• 1 - (1 - (1 - e)^s)^N: probability that at least one sample was not contaminated (at least one sample of s points is composed of only inliers).
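Solving for N gives N = log(1 - p) / log(1 - (1 - e)^s), rounded up. A small helper (the name is ours, not the lecture's) that reproduces the entries in the table on the next slide:

    import math

    def num_samples(e: float, s: int, p: float = 0.99) -> int:
        """Smallest N with 1 - (1 - (1 - e)^s)^N >= p."""
        return math.ceil(math.log(1 - p) / math.log(1 - (1 - e) ** s))

    # e.g. s=2 with 20% outliers -> 5 samples; s=8 with 50% outliers -> 1177
    print(num_samples(0.20, 2), num_samples(0.50, 8))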


How many samples?

Choose N so that, with probability p, at least one random sample is free from outliers. e.g. p = 0.99:

                 proportion of outliers e
    s     5%   10%   20%   25%   30%   40%   50%
    2      2     3     5     6     7    11    17
    3      3     4     7     9    11    19    35
    4      3     5     9    13    17    34    72
    5      4     6    12    17    26    57   146
    6      4     7    16    24    37    97   293
    7      4     8    20    33    54   163   588
    8      5     9    26    44    78   272  1177

Example: N for the line-fitting problem

• n = 12 points
• Minimal sample size s = 2
• 2 outliers: e = 2/12 = 1/6, so use the 20% column
• So N = 5 gives us a 99% chance of getting a pure-inlier sample
  – Compared to N = 66 by trying every pair of points

from Hartley & Zisserman

Acceptable consensus set?

• We have seen that we don't have to exhaustively sample subsets of points; we just need to randomly sample N subsets.
• However, typically, we don't even have to sample N sets!
• Early termination: terminate as soon as some hypothesis gathers a consensus set as large as the expected number of inliers:

    T = (1 - e) * (total number of data points)
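As a tiny illustration (the function name is ours, not the lecture's), the expected-inlier count used as the termination threshold:

    def acceptable_consensus_size(e: float, n_points: int) -> float:
        """T = (1 - e) * (total number of data points)."""
        return (1.0 - e) * n_points

    # e.g. 20% outliers among 100 matches -> stop at a consensus set of 80
    print(acceptable_consensus_size(0.20, 100))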

RANSAC: Picking Distance Threshold d

• Usually chosen empirically
• But... when measurement error is known to be Gaussian with mean 0 and variance σ²:
  – Sum of squared errors follows a χ² distribution with m DOF, where m is the DOF of the error measure (the codimension)
  – (dimension + codimension) = dimension of parameter space
  • E.g., m = 1 for line fitting because error is perpendicular distance
  • E.g., m = 2 for point distance
• Examples for probability p = 0.95 that a point is an inlier:

    m    Model                        d²
    1    Line, fundamental matrix     3.84 σ²
    2    Homography, camera matrix    5.99 σ²
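The multipliers 3.84 and 5.99 are just the χ² inverse CDF at 0.95 for m = 1 and m = 2; a quick check with SciPy (a verification, not the lecture's code):

    from scipy.stats import chi2

    # d^2 = chi2_inv(0.95, m) * sigma^2 reproduces the table's multipliers
    for m, model in [(1, "line, fundamental matrix"),
                     (2, "homography, camera matrix")]:
        print(f"m={m} ({model}): d^2 = {chi2.ppf(0.95, df=m):.2f} * sigma^2")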

After RANSAC

• RANSAC divides data into inliers and outliers and yields an estimate computed from the minimal sample with the greatest inlier support
• Improve this initial estimate with least-squares estimation over all inliers (i.e., standard minimization)
• Find inliers w.r.t. that L.S. line, and compute L.S. one more time

from Hartley & Zisserman
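One round of that refinement for the line case might look like this (a sketch under our own naming, assuming points is an (n, 2) array and d is the threshold from before):

    import numpy as np

    def refine_fit(points, inliers, d):
        """Least-squares line fit over the current inliers, then re-select
        inliers against the refined line. Run twice to match the slide's
        'compute L.S. one more time'."""
        x, y = points[inliers, 0], points[inliers, 1]
        a, b = np.polyfit(x, y, deg=1)             # L.S. line y = a*x + b
        resid = np.abs(a * points[:, 0] - points[:, 1] + b) / np.hypot(a, 1.0)
        return (a, b), resid < d                   # refined line, new inlier mask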

Practical Example

• Stabilizing aerial imagery using RANSAC:
  – find corners in two images
  – hypothesize matches using NCC
  – do RANSAC to find matches consistent with an affine transformation
  – take the inlier set found and estimate a full projective transformation (homography)


Stabilization Application

Input: two images from an aerial video sequence. Note that the motion of the camera is "disturbing".

Stabilization Application

Step 1: extract Harris corners from both frames. We use a small threshold for R because we want LOTS of corners (fodder for our next step, which is matching).

[Figures: Harris corners for first frame; detailed view]
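A compact sketch of such a detector (our own helper with an illustrative threshold value; this recalls the earlier corner-detection lecture rather than code from this one):

    import numpy as np
    from scipy.ndimage import gaussian_filter, maximum_filter

    def harris_corners(img, sigma=1.0, k=0.05, r_thresh=1e-4):
        """Harris response R = det(M) - k*trace(M)^2; a deliberately small
        r_thresh keeps LOTS of corners."""
        Iy, Ix = np.gradient(img.astype(float))
        Sxx = gaussian_filter(Ix * Ix, sigma)
        Syy = gaussian_filter(Iy * Iy, sigma)
        Sxy = gaussian_filter(Ix * Iy, sigma)
        R = (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2
        peaks = (R == maximum_filter(R, size=5)) & (R > r_thresh)
        return np.argwhere(peaks)                  # (row, col) corner locations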

Stabilization Application

Step 2: hypothesize matches. For each corner in image1, look for a matching intensity patch in image2 using NCC. Make sure matching pairs have the highest NCC match scores in BOTH directions.

[Figure: a corner's best match from 1 to 2 and the best match from 2 to 1 coincide: mutually compatible!]
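Given a matrix of NCC scores between the two corner sets, the mutual-consistency test is a few lines (a sketch; computing the score matrix from intensity patches is assumed):

    import numpy as np

    def mutual_matches(score):
        """score[i, j] = NCC between corner i in image1 and corner j in
        image2; keep only pairs that are each other's best match."""
        best12 = score.argmax(axis=1)   # best match in image2 for each corner in image1
        best21 = score.argmax(axis=0)   # best match in image1 for each corner in image2
        return [(i, j) for i, j in enumerate(best12) if best21[j] == i]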

Stabilization Application

Step 2: hypothesize matches (continued).

[Figure: all hypothesized matches drawn between the two frames. Yikes!]

As you can see, a lot of false matches get hypothesized. The job of RANSAC will be to clean this mess up.

Stabilization Application

Step 3: Use RANSAC to robustly fit the best affine transformation to the set of point matches.

[Figure: the unit square in (x, y) mapped by the transform to a parallelogram in (x', y')]

    x'_i = a x_i + b y_i + c
    y'_i = d x_i + e y_i + f

How many unknowns? How many point matches are needed?

Stabilization Application

Step 3: Use RANSAC to robustly fit the best affine transformation to the set of point matches.

An affine transformation has 6 degrees of freedom. We therefore need 3 point matches [each gives 2 equations].

Randomly sample sets of 3 point matches. For each, compute the unique affine transformation they define. How?


Stabilization Application

How to compute an affine transformation from 3 point matches? Use least squares! (Renewed life for a nonrobust approach.) See the sketch below.

Then transform all points from image1 to image2 using that computed transformation, and see how many other matches confirm the hypothesis.

Repeat N times.
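Setting up the 2n × 6 linear system and scoring a hypothesis might look like this (a sketch with our own function names; src and dst are (n, 2) arrays of matched points):

    import numpy as np

    def fit_affine(src, dst):
        """Solve x' = a x + b y + c, y' = d x + e y + f for (a, b, c, d, e, f).
        With exactly 3 non-collinear matches the solution is unique; with
        more, it is the least-squares estimate."""
        rows, rhs = [], []
        for (x, y), (xp, yp) in zip(src, dst):
            rows += [[x, y, 1, 0, 0, 0], [0, 0, 0, x, y, 1]]
            rhs += [xp, yp]
        params, *_ = np.linalg.lstsq(np.array(rows, float),
                                     np.array(rhs, float), rcond=None)
        return params

    def count_support(params, src, dst, thresh):
        """Consensus count: matches mapped to within thresh of their partner."""
        a, b, c, d, e, f = params
        pred = np.stack([a * src[:, 0] + b * src[:, 1] + c,
                         d * src[:, 0] + e * src[:, 1] + f], axis=1)
        return int((np.linalg.norm(pred - dst, axis=1) < thresh).sum())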

Stabilization Application

[Figures: original point matches (left) and labels from RANSAC (right). Green: inliers; red: outliers.]

Stabilization Example

Step 4: Take the inlier set labeled by RANSAC, and now use least squares to estimate a projective transformation that aligns the images. (We will discuss this ad nauseam in a later lecture.)

[Figure: the unit square in (x, y) mapped by a projective transformation to a general quadrilateral in (x', y')]

Stabilization Example

Step 4: estimate the projective transformation that aligns the images.

Now it is easier for people (and computers) to see the moving objects.

Stabilization Examples

[Figures: additional aerial sequences, each showing the original point matches and the labels from RANSAC (green: inliers, red: outliers), followed by the stabilized result.]