Real-Time Tracking of Non-Rigid Objects using Mean Shift
Dorin Comaniciu, Visvanathan Ramesh (Imaging & Visualization Dept., Siemens Corporate Research Inc.), Peter Meer (Rutgers University)
Slide 2
Outline: Introduction, Mean Shift Analysis, Tracking Algorithm, Experiments, Conclusion
Slide 4
Introduction The proposed tracking is appropriate for a large variety of objects with different color/texture patterns. The mean shift iterations are employed to find the target candidate that is most similar to a given target model, with the similarity expressed by a metric based on the Bhattacharyya coefficient.
Slide 6
Sample Mean Shift
Slide 7
Kernel Density Estimation Multivariate kernel density estimation: \hat{f}(x) = \frac{1}{n h^d} \sum_{i=1}^{n} K\left(\frac{x - x_i}{h}\right). Common kernels: Gaussian, K(x) = (2\pi)^{-d/2} \exp(-\|x\|^2/2), and Epanechnikov, K(x) \propto (1 - \|x\|^2) for \|x\| \le 1 and 0 otherwise.
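As an illustrative sketch (not from the slides), the two kernels can be written directly in Python; the Epanechnikov normalization constant shown here is the one-dimensional case, c = 3/4:

```python
import numpy as np

def gaussian_kernel(x):
    """Multivariate Gaussian kernel K(x) = (2*pi)^(-d/2) * exp(-||x||^2 / 2)."""
    x = np.atleast_2d(x)
    d = x.shape[1]
    return (2 * np.pi) ** (-d / 2) * np.exp(-0.5 * np.sum(x ** 2, axis=1))

def epanechnikov_kernel(x):
    """Epanechnikov kernel: proportional to (1 - ||x||^2) inside the unit
    ball, 0 outside. The constant 3/4 normalizes it for d = 1 only."""
    x = np.atleast_2d(x)
    sq = np.sum(x ** 2, axis=1)
    return np.where(sq <= 1.0, 0.75 * (1.0 - sq), 0.0)
```

The Epanechnikov kernel has compact support (it vanishes outside the unit ball), which is what later makes its mean-shift step a simple local average.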
Slide 8
Kernel Density Estimation (2) In statistics, kernel density estimation (KDE) is a non-parametric way to estimate the probability density function (PDF) of a random variable. Kernel: in non-parametric statistics, a kernel is a weighting function used in non-parametric estimation techniques. Kernels are used in kernel density estimation to estimate the density functions of random variables.
Slide 9
Nonparametric Statistics Nonparametric statistics are statistics not based on parameterized families of probability distributions.
Slide 10
Non-parametric Models (1) A histogram is a simple nonparametric estimate of a probability distribution. Kernel density estimation provides better estimates of the density than histograms.
Slide 11
Non-parametric Models (2) Kernel density estimates are closely related to histograms, but can be endowed with properties such as smoothness or continuity by using a suitable kernel.
Slide 12
Histogram vs. Kernel Density Estimator, using these 6 data points: x1 = 2.1, x2 = 1.3, x3 = 0.4, x4 = 1.9, x5 = 5.1, x6 = 6.2. Left (histogram): 6 bins, each of width 2. Right (KDE): normal kernel with variance 2.25; the individual kernels are indicated by the red dashed lines.
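The right-hand plot can be reproduced with a few lines of Python; this is a minimal sketch using the slide's six points and a normal kernel with variance 2.25 (bandwidth h = 1.5):

```python
import numpy as np

# The six data points from the slide.
data = np.array([2.1, 1.3, 0.4, 1.9, 5.1, 6.2])
h = 1.5  # bandwidth; the slide's variance 2.25 equals h^2

def kde(x, data, h):
    """1-D kernel density estimate with a normal kernel of std h:
    the average of Gaussian bumps centered at each data point."""
    x = np.asarray(x, dtype=float)
    u = (x[..., None] - data) / h
    return np.mean(np.exp(-0.5 * u ** 2) / (h * np.sqrt(2 * np.pi)), axis=-1)

# Evaluate on a grid: the estimate is high where the data cluster
# (around 1-2) and low in the gap near 4.
grid = np.linspace(-3, 11, 2001)
density = kde(grid, data, h)
```

Unlike the histogram, the estimate is smooth and does not depend on an arbitrary bin origin.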
Slide 13
Kernel and Kernel Profile (1) The kernel density estimator: \hat{f}(x) = \frac{1}{n h^d} \sum_{i=1}^{n} K\left(\frac{x - x_i}{h}\right). A special class of radially symmetric kernels can be written K(x) = c_k\, k(\|x\|^2), where c_k makes K(x) integrate to 1. The function k(\cdot) is called the profile of the kernel.
Slide 14
Kernel and Kernel Profile (2) We can use the profile to rewrite the estimator: \hat{f}_{h,k}(x) = \frac{c_k}{n h^d} \sum_{i=1}^{n} k\left(\left\|\frac{x - x_i}{h}\right\|^2\right).
Slide 15
Derivative Kernel and Profile Define g(x) = -k'(x), the negative derivative of the kernel profile. The kernel corresponding to g(x) is G(x) = c_g\, g(\|x\|^2). The kernel K(x) is called the shadow of G(x).
Slide 16
Density Gradient Estimation Let's compute the gradient of the kernel estimate: \nabla \hat{f}_{h,k}(x) = \frac{2 c_k}{n h^{d+2}} \sum_{i=1}^{n} (x_i - x)\, g\left(\left\|\frac{x - x_i}{h}\right\|^2\right).
Slide 17
Mean-Shift Vector (1) Factor the gradient as \nabla \hat{f}_{h,k}(x) = \frac{2 c_k}{n h^{d+2}} \left[\sum_{i=1}^{n} g_i\right] \left[\frac{\sum_{i=1}^{n} x_i g_i}{\sum_{i=1}^{n} g_i} - x\right], with g_i = g(\|\frac{x - x_i}{h}\|^2). The first bracket is proportional to the kernel density estimate \hat{f}_{h,g}(x) using kernel G. In other words, the second bracket is the mean shift vector m_{h,g}(x) = \frac{\sum_i x_i g_i}{\sum_i g_i} - x.
Slide 18
Mean-Shift Vector (2) We can treat m_{h,g}(x) \propto \frac{\nabla \hat{f}_{h,k}(x)}{\hat{f}_{h,g}(x)} as a normalized density gradient estimate. The shift to the local mean points toward the region of largest density change; iterating the shift climbs to a mode of the density.
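The mode-seeking iteration above can be sketched in one dimension; this is a minimal illustration (not the paper's code) using a Gaussian profile, where each step moves the point to the weighted local mean of the data:

```python
import numpy as np

def mean_shift_mode(x, data, h, tol=1e-6, max_iter=200):
    """Seek a density mode by iterating the mean-shift step.
    With a Gaussian profile g(u) = exp(-u/2), the new position is the
    g-weighted mean of the data, i.e. x plus the mean shift vector."""
    for _ in range(max_iter):
        w = np.exp(-0.5 * ((x - data) / h) ** 2)  # g(||(x - x_i)/h||^2)
        new_x = np.sum(w * data) / np.sum(w)      # local weighted mean
        if abs(new_x - x) < tol:                  # converged: shift ~ 0
            break
        x = new_x
    return x

# Two well-separated clusters: starting near either one converges to
# that cluster's mode, illustrating the hill-climbing behaviour.
rng0, rng1 = np.random.default_rng(0), np.random.default_rng(1)
data = np.concatenate([rng0.normal(0.0, 0.5, 200), rng1.normal(5.0, 0.5, 200)])
```

Note that mean shift finds the *nearest* mode, which is exactly why the tracker later assumes a single mode in the searched neighborhood.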
Slide 20
Non-Rigid Object Tracking
Slide 21
Mean-Shift Object Tracking General Framework: Target Representation. Choose a reference model in the current frame; choose a feature space; represent the model in the chosen feature space.
Slide 22
Mean-Shift Object Tracking General Framework: Target Localization. Start from the position of the model in the current frame; search in the model's neighborhood in the next frame; find the best candidate by maximizing a similarity function; repeat the same process in the next pair of frames. (Figure: model and candidate in the current frame.)
Slide 23
Mean-Shift Object Tracking Target Representation. Choose a reference target model; choose a feature space (quantized color space); represent the model by its PDF in the feature space. [From "Kernel-Based Object Tracking", by Comaniciu, Ramesh, Meer]
Slide 24
Mean-Shift Object Tracking Finding the PDF of the target model. Given target pixel locations \{x_i\} and a differentiable, isotropic, convex, monotonically decreasing kernel profile k (peripheral pixels get smaller weights because they are more affected by occlusion and background interference): probability of feature u in the model, centered at 0: q_u = C \sum_i k(\|x_i\|^2)\, \delta[b(x_i) - u]; probability of feature u in a candidate centered at y: p_u(y) = C_h \sum_i k\left(\left\|\frac{y - x_i}{h}\right\|^2\right) \delta[b(x_i) - u]. Here C and C_h are normalization factors and k(\cdot) supplies the pixel weight.
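A minimal sketch of the model histogram q_u, under simplifying assumptions not in the slides: a grayscale (rather than quantized color) feature space, a square patch normalized to the unit circle, and the Epanechnikov profile k(u) = 1 - u. The helper name `target_model` is hypothetical:

```python
import numpy as np

def target_model(patch, n_bins=16):
    """Kernel-weighted histogram q_u of a square grayscale patch.
    Pixels near the center get Epanechnikov-profile weight
    k(||x*||^2) = max(1 - ||x*||^2, 0); peripheral pixels, more likely
    corrupted by occlusion or background, count less."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Normalized coordinates so the patch fits in the unit circle.
    ny = (ys - (h - 1) / 2) / ((h - 1) / 2)
    nx = (xs - (w - 1) / 2) / ((w - 1) / 2)
    k = np.maximum(1.0 - (nx ** 2 + ny ** 2), 0.0)  # per-pixel weight
    # b(x_i): map each pixel's intensity to a histogram bin.
    bins = (patch.astype(float) / 256 * n_bins).astype(int).clip(0, n_bins - 1)
    q = np.bincount(bins.ravel(), weights=k.ravel(), minlength=n_bins)
    return q / q.sum()  # normalization C: sum_u q_u = 1
```

The candidate histogram p_u(y) is computed the same way, on the window centered at y.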
Slide 25
Mean-Shift Object Tracking Similarity Function. Target model: q = \{q_u\}, with \sum_u q_u = 1. Target candidate: p(y) = \{p_u(y)\}, with \sum_u p_u = 1. Similarity function: f(y) = \rho[p(y), q] = \sum_u \sqrt{p_u(y)\, q_u}, the Bhattacharyya coefficient.
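The coefficient and the derived distance are a couple of lines of Python; a minimal sketch:

```python
import numpy as np

def bhattacharyya(p, q):
    """Bhattacharyya coefficient rho = sum_u sqrt(p_u * q_u).
    Equals 1 for identical distributions, 0 for non-overlapping ones."""
    return np.sum(np.sqrt(np.asarray(p) * np.asarray(q)))

def bhattacharyya_distance(p, q):
    """The distance minimized by the tracker: d(y) = sqrt(1 - rho)."""
    return np.sqrt(1.0 - bhattacharyya(p, q))
```

Geometrically, rho is the cosine of the angle between the unit vectors (sqrt(p_1), ..., sqrt(p_m)) and (sqrt(q_1), ..., sqrt(q_m)).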
Slide 26
Mean-Shift Object Tracking Target Localization Algorithm. Start from the position of the model in the current frame; search in the model's neighborhood in the next frame; find the best candidate by maximizing a similarity function.
Slide 27
Mean-Shift Object Tracking Approximating the Similarity Function. Model location: y_0; candidate location: y. A linear (Taylor) approximation around y_0 gives \rho[p(y), q] \approx \frac{1}{2} \sum_u \sqrt{p_u(y_0)\, q_u} + \frac{1}{2} \sum_u p_u(y) \sqrt{\frac{q_u}{p_u(y_0)}}. The first term is independent of y; the second term is a kernel density estimate as a function of y, with pixel weights w_i = \sum_u \sqrt{q_u / p_u(y_0)}\, \delta[b(x_i) - u].
Slide 28
Mean-Shift Object Tracking Maximizing the Similarity Function. The mode of the approximated similarity function is the sought maximum. Important assumptions: there is one mode in the searched neighborhood, and the target representation provides sufficient discrimination.
Slide 29
Mean-Shift Object Tracking Applying Mean-Shift. Original mean-shift: find the mode of \hat{f}(x) using y_1 = \frac{\sum_i x_i\, g(\|\frac{y_0 - x_i}{h}\|^2)}{\sum_i g(\|\frac{y_0 - x_i}{h}\|^2)}. Extended mean-shift: find the mode of the similarity function using y_1 = \frac{\sum_i x_i\, w_i\, g(\|\frac{y_0 - x_i}{h}\|^2)}{\sum_i w_i\, g(\|\frac{y_0 - x_i}{h}\|^2)}.
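One extended mean-shift step can be sketched as follows. This is a simplified illustration, not the paper's implementation: it assumes a grayscale frame, a uniform (rather than Epanechnikov) kernel for the candidate histogram so that g is constant inside the window, and a hypothetical helper name `mean_shift_track_step`:

```python
import numpy as np

def mean_shift_track_step(frame, y0, q, half, n_bins=16):
    """One extended mean-shift step: move from y0 toward the location
    maximizing the Bhattacharyya coefficient with target model q.
    With a uniform profile, the new location is simply the mean of the
    candidate pixel coordinates weighted by w_i = sqrt(q_u / p_u)."""
    y, x = y0
    ys, xs = np.mgrid[y - half:y + half + 1, x - half:x + half + 1]
    patch = frame[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    bins = (patch / 256 * n_bins).astype(int).clip(0, n_bins - 1)
    # Candidate histogram p_u(y0) (uniform kernel for simplicity).
    p = np.bincount(bins.ravel(), minlength=n_bins).astype(float)
    p /= p.sum()
    # Per-bin weights sqrt(q_u / p_u), broadcast back to pixels.
    with np.errstate(divide="ignore", invalid="ignore"):
        w_u = np.where(p > 0, np.sqrt(q / p), 0.0)
    w = w_u[bins]
    # New location y1: weight-weighted mean of pixel coordinates.
    return (np.sum(ys * w) / np.sum(w), np.sum(xs * w) / np.sum(w))

# Synthetic frame: a bright square target offset from the start position.
frame = np.zeros((60, 60))
frame[28:38, 32:42] = 200.0
q = np.zeros(16)
q[12] = 1.0  # model: target pixels have value 200, i.e. bin 12
```

Starting at (30, 30), a single step pulls the window center toward the bright square; the full tracker repeats this until the shift falls below a threshold, then moves on to the next frame.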
Slide 32
Mean-Shift Object Tracking Results. Feature space: RGB color space with 32 x 32 x 32 bins. Target: manually selected on the 1st frame. Average number of mean-shift iterations: 4. Sequence: 154 frames of 352 x 240 pixels.
Slide 35
Conclusion By exploiting the spatial gradient of the statistical measure (the Bhattacharyya coefficient), the new method achieves real-time tracking performance, while effectively rejecting background clutter and partial occlusions.