Post on 06-Sep-2018
CS 4495 Computer Vision
Tracking 2: Particle Filters
Aaron Bobick, School of Interactive Computing
Administrivia
PS5 due Wed Nov 12, 11:55pm
Hopefully PS6 out Thurs Nov 13, due Nov 23rd
Problem set resubmission policy: full questions only. By email to me and the TAs. You get 50% credit to replace whatever you got last time on that question. Must be submitted by DEC 1. NO EXCEPTIONS.
Tracking
Slides still adapted from Kristen Grauman, Deva Ramanan, but mostly from Svetlana Lazebnik. And now more adaptations from Sebastian Thrun, Dieter Fox, and someone who did great particle filter illustrations but whose name is lost to web thievery.
Recall: Some examples http://www.youtube.com/watch?v=InqV34BcheM
Detection vs. tracking
t=1 t=2 t=20 t=21
Detection vs. tracking
Detection: We detect the object independently in each frame and can record its position over time, e.g., based on the blob's centroid or detection-window coordinates.
Detection vs. tracking
Tracking with dynamics: We use image measurements to estimate the position of the object, but also incorporate position predicted by dynamics, i.e., our expectation of the object's motion pattern.
Tracking with dynamics
Use a model of expected motion to predict where objects will occur in the next frame, even before seeing the image.
Intent: Do less work looking for the object; restrict the search. Get improved estimates, since measurement noise is tempered by smoothness and dynamics priors.
Assumption: continuous motion patterns: the camera is not moving instantly to a new viewpoint; objects do not disappear and reappear in different places in the scene; gradual change in pose between camera and scene.
Tracking as induction
Base case: Assume we have an initial prior that predicts the state in the absence of any evidence: P(X0). At the first frame, correct this given the value of Y0 = y0.
Given a corrected estimate for frame t: predict for frame t+1, then correct for frame t+1.
Simplifying assumptions
Only the immediate past matters (dynamics model):
P(X_t | X_0, …, X_{t-1}) = P(X_t | X_{t-1})
Measurements depend only on the current state (observation model):
P(Y_t | X_0, Y_0, …, X_{t-1}, Y_{t-1}, X_t) = P(Y_t | X_t)
Steps of tracking
Prediction: What is the next state of the object given past measurements?
P(X_t | Y_0 = y_0, …, Y_{t-1} = y_{t-1})
Correction: Compute an updated estimate of the state from prediction and measurements.
P(X_t | Y_0 = y_0, …, Y_t = y_t)
Tracking can be seen as the process of propagating the posterior distribution of state given measurements across time.
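The two steps can be written as a single recursion. Under the Markov assumptions above, prediction is an integral over the dynamics model and correction is an application of Bayes rule (a standard restatement, in the slides' notation):

```latex
\begin{align*}
\text{Predict: }\; & P(X_t \mid y_0,\ldots,y_{t-1})
  = \int P(X_t \mid X_{t-1})\, P(X_{t-1} \mid y_0,\ldots,y_{t-1})\, dX_{t-1} \\
\text{Correct: }\; & P(X_t \mid y_0,\ldots,y_t)
  \propto P(y_t \mid X_t)\, P(X_t \mid y_0,\ldots,y_{t-1})
\end{align*}
```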
The Kalman filter
Method for tracking linear dynamical models in Gaussian noise.
The predicted/corrected state distributions are Gaussian: you only need to maintain the mean and covariance, and the calculations are easy (all the integrals can be done in closed form).
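To see how little state a Kalman filter actually has to carry, here is a minimal 1-D predict/correct cycle. This is a sketch, not from the slides: the additive random-walk dynamics, the control `u`, and the noise variances `q` and `r` are illustrative assumptions.

```python
def kalman_1d(mu, sigma2, z, u, q, r):
    """One predict/correct cycle of a 1-D Kalman filter.
    mu, sigma2: current state mean and variance
    z: measurement, u: control (e.g., commanded motion)
    q: process-noise variance, r: measurement-noise variance
    """
    # Predict: shift the mean by the motion, inflate the variance
    mu_pred = mu + u
    s2_pred = sigma2 + q
    # Correct: blend prediction and measurement by their certainties
    k = s2_pred / (s2_pred + r)          # Kalman gain
    mu_new = mu_pred + k * (z - mu_pred)
    s2_new = (1.0 - k) * s2_pred
    return mu_new, s2_new

# One step: prior N(0, 1), no motion, measurement z = 1
mu, s2 = kalman_1d(0.0, 1.0, z=1.0, u=0.0, q=0.25, r=1.0)
```

Note that the corrected mean lands between prediction and measurement, and the corrected variance is always smaller than the predicted one: exactly the "combine two Gaussians" behavior discussed below.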
Propagation of Gaussian densities
Kalman Filters
Correction: posterior ∝ measurement (evidence) × prior:
p(x | z) ∝ p(z | x) p(x)
Prediction (new prior from previous posterior):
p(x') = ∫ p(x' | x) p(x) dx
Tracking with KFs: Gaussians!
[Figure: Gaussians in the (x, y) plane at each stage: initial estimate → prediction → measurement → update.]
A Quiz
Given a prior and a measurement (evidence), sketch the posterior p(x | y) ∝ p(y | x) p(x). Does this agree with your intuition?
Kalman filter pros and cons
Pros: simple updates, compact and efficient.
Cons: can be sensitive to the process noise model; unimodal distribution, only a single hypothesis; restricted class of motions defined by the linear model.
Extensions: called Extended Kalman Filtering (EKF).
So what might we do if not Gaussian? Or even unimodal? E.g., the object seems to be detected (noisily) in two locations in the next image that are both plausible in terms of dynamics and the belief of where the object was in the last image.
Propagation of general densities
Before we go any further
In particle filtering the measurements are written as z and not as y. So we'll start seeing z's.
Particle Filters: Basic Idea
The density p(x_t) is represented by a set of n (weighted) particles X_t: both where the particles are and their weights carry the density. The weights sum to 1, and the approximation approaches p(x) (with equality) as n → ∞. p(x = 0) is now the probability of drawing an x with value (really close to) 0.
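A tiny numeric illustration of this representation (the density here, a Gaussian at 2, is a made-up example, not from the slides): expectations become weighted sums over particles, and probability mass near a value is just the total weight of nearby particles.

```python
import numpy as np

rng = np.random.default_rng(0)

# n weighted particles approximating a density p(x):
# here, samples from N(2, 1) with uniform weights.
n = 10_000
particles = rng.normal(2.0, 1.0, size=n)
weights = np.full(n, 1.0 / n)          # weights sum to 1

# Expectations under p are weighted sums over the particles.
mean_est = np.sum(weights * particles)

# Mass near x = 0 is the total weight of particles really close to 0.
p_near_zero = np.sum(weights[np.abs(particles) < 0.1])
```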
A slight shift
For tracking we had the notion of a dynamics model, and for the Kalman filter a linear dynamics model. At each time step the prediction was based upon a linear transform of the state, so it could easily include velocity, acceleration, etc. The linear form was necessary to make the math work out; to do non-linear dynamics you need to linearize about the current state (remember Jacobians?).
In general, the state transition matrix represented what was known or expected to change from t to t+1. We can generalize that notion as an action or input u_t at time t.
Graphical Model Representation
1st-order Markovian dynamics:
p(x_t | x_{0:t-1}, z_{1:t-1}, u_{1:t}) = p(x_t | x_{t-1}, u_t)
Observation model:
p(z_t | x_{0:t}, z_{1:t-1}, u_{1:t}) = p(z_t | x_t)
Bayes Filters: Framework
Given: a stream of observations z and action data u:
data_t = {u_1, z_1, …, u_t, z_t}
Sensor model p(z | x). Action model p(x_t | u_t, x_{t-1}). Prior probability of the system state p(x).
Wanted: an estimate of the state X of a dynamical system. The posterior of the state is also called the Belief:
Bel(x_t) = P(x_t | u_1, z_1, …, u_t, z_t)
Bayes Filters
z = observation, u = action, x = state

Bel(x_t) = P(x_t | u_1, z_1, …, u_t, z_t)
  (Bayes)             = η P(z_t | x_t, u_1, z_1, …, u_t) P(x_t | u_1, z_1, …, u_t)
  (Markov)            = η P(z_t | x_t) P(x_t | u_1, z_1, …, u_t)
  (Total probability) = η P(z_t | x_t) ∫ P(x_t | u_1, z_1, …, u_t, x_{t-1}) P(x_{t-1} | u_1, z_1, …, u_t) dx_{t-1}
  (Markov)            = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) P(x_{t-1} | u_1, z_1, …, z_{t-1}) dx_{t-1}
                      = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) Bel(x_{t-1}) dx_{t-1}

The integral is the prediction before taking the measurement (total probability over the prior); multiplying by P(z_t | x_t) and normalizing is the correction.
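The recursion can be exercised directly on a discrete grid, where the integral becomes a sum (here a convolution). The 5-cell world, the "noisy stay-put" motion kernel, and the sensor values below are made-up illustrations, not from the slides.

```python
import numpy as np

def bayes_filter_step(bel, motion_kernel, likelihood):
    """One cycle of a discrete Bayes filter over a 1-D grid.
    bel: prior belief over grid cells (sums to 1)
    motion_kernel: p(x_t | x_{t-1}, u_t) as a shift-invariant kernel
    likelihood: p(z_t | x_t) evaluated at each cell
    """
    # Prediction: total probability (convolve belief with the dynamics)
    pred = np.convolve(bel, motion_kernel, mode="same")
    # Correction: Bayes rule, then normalize (the eta term)
    post = likelihood * pred
    return post / post.sum()

bel = np.full(5, 0.2)                            # uniform prior over 5 cells
kernel = np.array([0.0, 0.25, 0.5, 0.25, 0.0])   # noisy "stay put" motion
lik = np.array([0.1, 0.1, 0.8, 0.1, 0.1])        # sensor favors cell 2
post = bayes_filter_step(bel, kernel, lik)
```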
The intuitions behind the particle filter
Each particle represents one possible state (e.g., position and velocity).
The intuitions behind the particle filter
Two fundamental steps to filtering:
1. Make a prediction based upon the previous belief:
   Kalman: predict forward the mean based upon the dynamics and then add uncertainty based upon the process noise (dynamics noise).
   Particle: generate the prediction distribution by sampling particles based upon their probability, propagate each particle's state based upon the dynamics, then perturb based upon noise.
2. Update the belief based upon the measurement:
   Kalman: combine the two Gaussians (prediction and measurement guess) to get one new Gaussian.
   Particle filter: use Bayes rule to assign each particle a weight based upon the likelihood.
Bayes Rule reminder
p(x | z) = p(z | x) p(x) / p(z)
         ∝ p(z | x) p(x)
p(x) is the prior before the measurement.
To illustrate
Assume a simple robot with a noisy range sensor looking to its side.
Particle Filters
When the density is not easily represented analytically (i.e., by a couple of Gaussians), represent it by a set of (possibly weighted) samples.
Sensor Information
Robot sees a hole: p(z = "hole" | x)
Robot Motion
Move and resample with action u:
Bel(x) ← ∫ p(x | u, x') Bel(x') dx'
Next Sensor Reading
Robot sees a hole again
Robot Moves Again
Move and Resample again
Particle Filter Algorithm (Sequential Importance Resampling)
1. Algorithm particle_filter(S_{t-1}, u_t, z_t):
2.   S_t = ∅, η = 0
3.   For i = 1 … n   (resample: generate i new samples)
4.     Sample index j(i) from the discrete distribution given by w_{t-1}
5.     Sample x_t^i from p(x_t | x_{t-1}, u_t) using x_{t-1}^{j(i)} and control u_t
6.     Compute the importance weight (or reweight): w_t^i = p(z_t | x_t^i)
7.     Update the normalization factor: η = η + w_t^i
8.     Insert ⟨x_t^i, w_t^i⟩ into S_t
9.   For i = 1 … n
10.    Normalize the weights: w_t^i = w_t^i / η
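The loop above can be sketched in a few lines of NumPy. The SIR structure (resample, propagate, reweight, normalize) follows the algorithm; the 1-D drift dynamics and Gaussian sensor model are toy assumptions added to make it runnable.

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter_step(particles, weights, u, z, dynamics, likelihood):
    """One SIR cycle: resample, propagate, reweight, normalize.
    dynamics(x, u): samples x_t given x_{t-1} and control u
    likelihood(z, x): p(z | x), the importance weight
    """
    n = len(particles)
    # Steps 3-4: resample indices j(i) from the discrete distribution w
    idx = rng.choice(n, size=n, p=weights)
    # Step 5: sample each new particle from the dynamics model
    new_particles = dynamics(particles[idx], u)
    # Steps 6-10: importance weights from the sensor model, then normalize
    new_weights = likelihood(z, new_particles)
    new_weights /= new_weights.sum()
    return new_particles, new_weights

# Toy 1-D example: state drifts by u with noise, sensor is Gaussian.
dyn = lambda x, u: x + u + rng.normal(0.0, 0.5, size=x.shape)
lik = lambda z, x: np.exp(-0.5 * (z - x) ** 2)

p = rng.normal(0.0, 2.0, size=5000)
w = np.full(5000, 1.0 / 5000)
p, w = particle_filter_step(p, w, u=1.0, z=1.0, dynamics=dyn, likelihood=lik)
est = np.sum(w * p)   # weighted-mean state estimate
```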
Graphical steps of particle filtering
A real robot sensing problem
Assume a robot knows a 3D map of its world. It has a noisy depth sensor or sensors whose sensing uncertainty is known. It moves from frame to frame. How well can it know where it is (x, y, θ)?
Proximity Sensor Model
Laser sensor Sonar sensor
No return!
Sample-based Localization (sonar)
Smithsonian Museum of American History
How about simple vision?
Using Ceiling Maps for Localization
Dellaert, et al. 1997
Vision-based Localization
[Figure: the measurement z is compared against the predicted view h(x) from the ceiling map to give P(z | x).]
Under a Light
Measurement z: P(z|x):
Next to a Light
Measurement z: P(z|x):
Elsewhere
Measurement z: P(z|x):
Global Localization Using Vision
Odometry Only
Using Vision
To do real tracking
x is the state. But of what? The object? Some representation of the object?
z is the measurement. But what measurement? And how does it relate to the state?
Where do you get your dynamics from?
Sensor model: p(x | z) = p(z | x) p(x) / p(z) ∝ p(z | x) p(x)
State dynamics: p(x_t | x_{t-1}, u_t)
The source
Particle filter tracking - state
The object to be tracked here is a hand-initialized contour. Could have been the image pixels. Which is better?
Its state is its affine deformation. How many parameters? Each particle represents those six parameters.
[Isard 1998] Picture of the states represented by the top-weighted particles, and the mean state.
More complex state
Tracking of a hand movement using an edge detector. State is translation and rotation of the hand plus the shape of the hand: 12 DOF.
Particle filter tracking - measurement
Suppose x is a hand-initialized contour. What is z?
Gaussian in the distance to the nearest high-contrast feature, summed over the contour. [Isard 1998]
More tracking contours
Head tracking with contour models (Zhihong et al. 2002).
How did it handle occlusion? With velocity? Without velocity?
How did you get the dynamics?
Getting the dynamics
Why are we doing all this work? Because tracking is hard. If it weren't hard we could just detect the contour. If we could just detect the contour we could get correct trajectories. If we can get correct trajectories we can…?
A different model
Hands and head movement tracking using color models and optical flow (Tung et al. 2008).
State: location of a colored blob (x, y). Prediction based upon flow. Sensor model: color match.
How about a really, really simple model?
State is just the location of an image patch: x, y.
Dynamics: just random noise.
Sensor model: average squared difference of pixel intensities. Really a similarity model: more similar is more likely.
Oh, you need a patch.
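The sensor model for this patch tracker can be sketched in a few lines: turn the mean squared difference of intensities into a weight so that more-similar candidates are more likely. The Gaussian form and the `sigma` value are assumptions about how fast similarity should fall off, not something the slides specify.

```python
import numpy as np

def patch_likelihood(patch, candidate, sigma=20.0):
    """Similarity-as-likelihood for a simple patch tracker.
    Mean squared intensity difference mapped through a Gaussian,
    so identical patches get weight 1 and dissimilar ones near 0.
    sigma (assumed) controls how sharply similarity falls off.
    """
    mse = np.mean((patch.astype(float) - candidate.astype(float)) ** 2)
    return np.exp(-mse / (2.0 * sigma ** 2))

template = np.full((8, 8), 100.0)
same = patch_likelihood(template, np.full((8, 8), 100.0))
far = patch_likelihood(template, np.full((8, 8), 150.0))
```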
An even better model
Suppose you want to track a region of colors. What would be a good model/state? Location, region size, distribution of colors.
What would be a good sensor model? Similarity of distributions.
PF: Practical Considerations
If dealing with highly peaked observations: add noise to the observation and prediction models; use better proposal distributions (e.g., perform a Kalman filter step to determine the proposal).
Overestimating noise often reduces the number of required samples.
Recover from failure by selectively adding samples from observations, or by uniformly adding some samples.
Resample only when necessary (efficiency of representation measured by the variance of the weights).
A detail: Resampling method can matter
Given: a set S of weighted samples.
Wanted: a random sample, where the probability of drawing x_i is given by w_i.
Typically done n times with replacement to generate a new sample set S'. Or even not done except when needed (too many low-weight particles).
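One common "when needed" test (a standard heuristic, not specific to these slides) is the effective sample size, which is exactly a function of the variance of the weights:

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum(w_i^2): near n when weights are uniform,
    near 1 when one particle carries all the mass. A common rule
    (assumed here) is to resample only when N_eff < n / 2."""
    return 1.0 / np.sum(np.asarray(weights) ** 2)

uniform = effective_sample_size([0.25, 0.25, 0.25, 0.25])
peaked = effective_sample_size([0.97, 0.01, 0.01, 0.01])
```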
Resampling
[Figure: the weights w_1 … w_n arranged around a roulette wheel, sampled either with n independent spins or with n evenly spaced pointers.]
Roulette wheel: binary search, O(n log n).
Stochastic universal sampling (systematic resampling): linear time complexity; easy to implement; low variance.
Resampling Algorithm
1. Algorithm systematic_resampling(S, n):
2.   S' = ∅, c_1 = w^1
3.   For i = 2 … n   (generate cdf)
4.     c_i = c_{i-1} + w^i
5.   u_1 ~ U[0, 1/n], i = 1   (initialize threshold)
6.   For j = 1 … n   (draw samples)
7.     While (u_j > c_i)   (skip until next threshold reached)
8.       i = i + 1
9.     Insert ⟨x^i, 1/n⟩ into S'
10.    u_{j+1} = u_j + 1/n   (increment threshold)
11.  Return S'
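The algorithm above transcribes almost line for line into NumPy (variable names chosen to mirror the pseudocode): one random offset, then n evenly spaced thresholds swept against the cumulative weights, giving linear time and low variance.

```python
import numpy as np

def systematic_resample(particles, weights, rng=None):
    """Systematic (stochastic universal) resampling in linear time."""
    rng = rng or np.random.default_rng()
    n = len(weights)
    c = np.cumsum(weights)                 # steps 2-4: cdf of the weights
    u = rng.uniform(0.0, 1.0 / n)          # step 5: single random start
    idx = np.zeros(n, dtype=int)
    i = 0
    for j in range(n):                     # step 6: n evenly spaced thresholds
        while u > c[i]:                    # steps 7-8: skip to next threshold
            i += 1
        idx[j] = i                         # step 9: insert particle i
        u += 1.0 / n                       # step 10: increment threshold
    # resampled set gets uniform weights 1/n
    return particles[idx], np.full(n, 1.0 / n)

# All the weight on the middle particle: every draw must pick it.
p = np.array([10.0, 20.0, 30.0])
w = np.array([0.0, 1.0, 0.0])
newp, neww = systematic_resample(p, w, np.random.default_rng(0))
```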
Tracking issues
Initialization: manual, background subtraction, detection.
Tracking issues
Obtaining the observation and dynamics models:
Dynamics model: learn (very difficult) or specify using domain knowledge.
Generative observation model: render the state on top of the image and compare.
Tracking issues
Prediction vs. correction:
If the dynamics model is too strong, tracking will end up ignoring the data.
If the observation model is too strong, tracking is reduced to repeated detection.
Tracking issues
Data association: what if we don't know which measurements to associate with which tracks?
Data association
So far, we've assumed the entire measurement to be relevant to determining the state. In reality, there may be uninformative measurements (clutter), or measurements may belong to different tracked objects.
Data association: the task of determining which measurements go with which tracks.
Data association
Simple strategy: only pay attention to the measurement that is closest to the prediction.
Source: Lana Lazebnik
Data association
Simple strategy: only pay attention to the measurement that is closest to the prediction.
Doesn't always work. Alternative: keep track of multiple hypotheses at once.
Source: Lana Lazebnik
Data association
Simple strategy: only pay attention to the measurement that is closest to the prediction.
More sophisticated strategy: keep track of multiple state/observation hypotheses. Can be done with particle filtering.
This is a general problem in computer vision; there is no easy solution.
Tracking issues
Drift: errors caused by the dynamical model, the observation model, and data association tend to accumulate over time.
Drift
D. Ramanan, D. Forsyth, and A. Zisserman. Tracking People by Learning their Appearance. PAMI 2007.
http://www.ics.uci.edu/%7Edramanan/papers/trackingpeople/index.html