Object Detection with Discriminatively Trained Part Based Models
P.F. Felzenszwalb, R.B. Girshick, D. McAllester and D. Ramanan
PAMI 2010
Presented by: Philippe Weinzaepfel, 1st February 2013
Introduction Object localization
Object localization
Goal: detect and localize objects from generic categories in static images
Training: bounding boxes around objects
Challenges:
Illumination changes
Viewpoint
Intra-class variability
Non-rigid deformations
Philippe Weinzaepfel, Latent SVM for Object Detection, 1st February 2013
Introduction Part-based model
Part-based model
A collection of parts arranged in a deformable configuration
Part locations are not known: latent variables
In this paper: a star model (1 root + multiple parts)
Part filters at twice the resolution of the root filter
Score of a detection:

score(model, x) = score(root, x) + ∑_{p∈parts} max_y [score(p, y) − cost(p, x, y)]
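The star-model score above can be sketched in code. This is a toy illustration with NumPy arrays standing in for filter responses; the function and variable names are invented for the example, not the paper's implementation:

```python
import numpy as np

def star_score(root_score, part_score_maps, deform_costs):
    """Score of a star model at a fixed root location (toy sketch).

    root_score: scalar response of the root filter.
    part_score_maps: list of 2-D arrays, part filter responses over
        candidate placements y.
    deform_costs: list of 2-D arrays, the displacement penalty
        cost(p, x, y) for each candidate placement.
    """
    total = root_score
    for scores, costs in zip(part_score_maps, deform_costs):
        # Each part contributes its best placement: max_y [score - cost].
        total += np.max(scores - costs)
    return total

# Toy example: one part whose best placement trades score against cost.
part = np.array([[1.0, 3.0], [0.5, 2.0]])
cost = np.array([[0.0, 2.5], [1.0, 0.5]])
print(star_score(2.0, [part], [cost]))  # 2.0 + max(part - cost) = 3.5
```

Note that the max over placements makes the model tolerant to deformation: a part may move away from its ideal location if the gain in filter response outweighs the displacement cost.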
Introduction Part-based model
Part-based model with latent variables
Simple model
Score: fβ(x) = β·Φ(x)

Part-based model
Score: fβ(x) = max_{z∈Z(x)} β·Φ(x, z)
β: model parameters
z: specification of the object configuration

Latent SVM to learn β, and data-mining for hard negative examples
Introduction Mixture of part-based models
Mixture of part-based models
Score: max over components
Include the component in z
Introduction Outline
Outline
1 Model
2 Latent SVM
3 Training Models
4 Features
5 Post-processing
6 Some experimental results
Model Feature pyramid
Linear filter of features
Each part: a linear filter over a feature map
Build a feature pyramid H
5 levels per octave at training, 10 at testing
Weight vector F
Position p = (x, y, l)
Score of F at p: F′·φ(H, p)
Model Deformable part models
Deformable part models
A model:
A root filter F0
n parts Pi, each with a filter Fi, an anchor vi, and a deformation cost di
A bias term b (to compare models within a mixture)

An object hypothesis:
Positions of the root and parts z = (p0, ..., pn)
pi = (xi, yi, li)
Parts are placed at twice the resolution of the root
Model Deformable part models
Deformable part models
Score of a hypothesis:
score(p0, ..., pn) = ∑_{i=0}^{n} F′i·φ(H, pi) − ∑_{i=1}^{n} di·φd(dxi, dyi) + b
(dxi, dyi) = (xi, yi) − (2(x0, y0) + vi)
φd(dx, dy) = (dx, dy, dx², dy²)

Link to Latent SVM: the score can be written as β·Ψ(H, z) with:
β = (F′0, ..., F′n, d1, ..., dn, b)
Ψ(H, z) = (φ(H, p0), ..., φ(H, pn), −φd(dx1, dy1), ..., −φd(dxn, dyn), 1)
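The equivalence between the summed score and the single dot product β·Ψ(H, z) can be checked numerically. The dimensions and random values below are arbitrary stand-ins for real filters and features:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes: n = 2 parts, filter features of length 4, deformation
# features phi_d of length 4: (dx, dy, dx^2, dy^2).
F = [rng.normal(size=4) for _ in range(3)]    # F'_0 (root), F'_1, F'_2
d = [rng.normal(size=4) for _ in range(2)]    # deformation costs d_1, d_2
b = 0.7                                       # bias term

phi = [rng.normal(size=4) for _ in range(3)]  # phi(H, p_i), i = 0..2
disp = [(1.0, -2.0), (0.5, 0.0)]              # displacements (dx_i, dy_i)

def phi_d(dx, dy):
    return np.array([dx, dy, dx * dx, dy * dy])

# Score written as the sum on the slide.
score = (sum(Fi @ p for Fi, p in zip(F, phi))
         - sum(di @ phi_d(dx, dy) for di, (dx, dy) in zip(d, disp))
         + b)

# The same score as a single dot product beta . Psi(H, z): filters and
# deformation weights are concatenated into beta, features (with negated
# deformation features and a trailing 1 for the bias) into Psi.
beta = np.concatenate(F + d + [np.array([b])])
Psi = np.concatenate(phi
                     + [-phi_d(dx, dy) for dx, dy in disp]
                     + [np.array([1.0])])
assert np.isclose(score, beta @ Psi)
```

This concatenation is what lets the part-based score fit the linear form required by the latent SVM.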
Model Detection process
Model Mixture of models
Mixture of models
Mixture M of m models: M = (M1, ..., Mm)
z = (c, p0, ..., pn_c) = (c, z′)
score(z) = βc·Ψ(H, z′)
Score as β·Ψ(H, z):
β = (β1, ..., βm)
Ψ(H, z) = (0, ..., 0, Ψ(H, z′), 0, ..., 0)
Latent SVM Problem
Problem
Classifier that scores an example x with:

fβ(x) = max_{z∈Z(x)} β·Φ(x, z)

Z(x): the set of possible latent values for x

As for an SVM, we learn β by minimizing

LD(β) = ½‖β‖² + C ∑_{i=1}^{n} max(0, 1 − yi fβ(xi))

with D = (⟨x1, y1⟩, ..., ⟨xn, yn⟩) a set of labeled examples
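The objective can be computed directly for small toy data. In this sketch each example is represented by its list of candidate feature vectors Φ(x, z), one per latent value; names are invented for the illustration:

```python
import numpy as np

def f_beta(beta, Phis):
    """Latent score: max over the candidate feature vectors Phi(x, z)."""
    return max(beta @ v for v in Phis)

def lsvm_objective(beta, data, C):
    """L_D(beta) = 0.5 ||beta||^2 + C * sum_i max(0, 1 - y_i f_beta(x_i)).

    data: list of (Phis, y) pairs, Phis being the candidate vectors of x_i.
    """
    hinge = sum(max(0.0, 1.0 - y * f_beta(beta, Phis)) for Phis, y in data)
    return 0.5 * (beta @ beta) + C * hinge

beta = np.array([1.0, -1.0])
data = [([np.array([2.0, 0.0]), np.array([0.0, 1.0])], +1),  # f = 2
        ([np.array([0.5, 0.0])], -1)]                        # f = 0.5
print(lsvm_objective(beta, data, C=1.0))  # 0.5*2 + (0 + 1.5) = 2.5
```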
Latent SVM Semi-convexity
Semi-convexity
LD(β) = ½‖β‖² + C ∑_{i=1}^{n} max(0, 1 − yi fβ(xi))

A latent SVM is semi-convex:
The loss function is convex in β for negative examples.
LD(β) becomes convex when latent variables are specified for the positive examples.

SVM convexity:
fβ is linear in β
The hinge loss is convex as a maximum of two convex functions

LSVM:
fβ is convex as a maximum of convex functions
For negative examples, the loss function is convex
If there is a single possible latent value for each positive example, fβ is linear
Latent SVM Optimization
Optimization
Zp: a specification of a latent value for each positive example
D(Zp): derived from D by fixing the latent values of the positive examples
LD(β) = min_{Zp} LD(β, Zp) = min_{Zp} L_{D(Zp)}(β)
LD(β) ≤ LD(β, Zp)

Algorithm:
Relabel positive examples: optimize LD(β, Zp) over Zp, i.e. select zi = argmax_{z∈Z(xi)} β·Φ(xi, z)
Optimize β: stochastic gradient descent
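The two alternating steps can be sketched as a coordinate-descent skeleton. The `relabel` and `optimize` callbacks are placeholders for the latent completion and the gradient-descent solver; this is a schematic sketch, not the paper's training code:

```python
def train_latent_svm(beta, positives, negatives, n_rounds,
                     relabel, optimize):
    """Coordinate descent on L_D(beta, Zp) (schematic sketch).

    relabel(beta, x) -> best latent value z for a positive example x;
    optimize(beta, labeled_pos, negatives) -> new beta with Zp fixed.
    Both are assumed callbacks; the paper uses stochastic gradient
    descent for the second step.
    """
    for _ in range(n_rounds):
        # Step 1: fix beta, pick the best latent value for each positive.
        Zp = [relabel(beta, x) for x in positives]
        # Step 2: fix Zp, minimize the now-convex objective over beta.
        beta = optimize(beta, list(zip(positives, Zp)), negatives)
    return beta
```

Each round can only decrease LD(β, Zp), which upper-bounds LD(β), so the procedure reaches a local optimum of the latent objective.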
Latent SVM Optimization
Stochastic gradient descent
Gradient descent
LD(β) = ½‖β‖² + C ∑_{i=1}^{n} max(0, 1 − yi fβ(xi))

∇LD(β) = β + C ∑_{i=1}^{n} h(β, xi, yi)   (a subgradient)

h(β, xi, yi) = 0 if yi fβ(xi) ≥ 1, and −yi Φ(xi, zi(β)) otherwise

Issue: it is too costly to go over all the data to compute the exact gradient.
Latent SVM Optimization
Stochastic gradient descent
Approximation of the gradient
Approximate ∑_{i=1}^{n} h(β, xi, yi) by n·h(β, xi, yi) for a single random example ⟨xi, yi⟩

Algorithm:
Let αt be the learning rate at iteration t
Let i be a random example
Let zi = argmax_{z∈Z(xi)} β·Φ(xi, z)
Update β = β − αt ∇i LD(β)

Convergence:
Learning rate αt = 1/t
Depends on the number of training examples
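A single stochastic subgradient step then looks as follows; the example is again represented by its candidate feature vectors, and the names are invented for this sketch:

```python
import numpy as np

def sgd_step(beta, Phis, y, n, C, alpha):
    """One stochastic subgradient step on one drawn example (sketch).

    Phis: candidate feature vectors Phi(x_i, z) of the drawn example;
    n: total number of training examples (the single-example term is
    scaled by n to approximate the full sum in the gradient).
    """
    # Latent completion: z_i = argmax_z beta . Phi(x_i, z).
    v = max(Phis, key=lambda u: beta @ u)
    if y * (beta @ v) >= 1:
        grad = beta                      # only the regularizer contributes
    else:
        grad = beta - C * n * y * v      # add the hinge subgradient term
    return beta - alpha * grad
```

For a violated margin the update pushes β toward y·v, i.e. toward scoring the completed example on the correct side.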
Latent SVM Data-mining hard examples
Recall: Data-mining hard examples in SVM
Many negative instances
Bootstrapping: collect hard negatives as examples incorrectly classified by a previous model.

Hard and easy examples:
H(β, D) = {⟨x, y⟩ ∈ D | y fβ(x) < 1}
E(β, D) = {⟨x, y⟩ ∈ D | y fβ(x) > 1}
β*(D) = argmin_β LD(β)   (assumed unique)

Goal: find a subset C ⊆ D such that β*(C) = β*(D).
Latent SVM Data-mining hard examples
Recall: Data-mining hard examples in SVM
Algorithm:
βt = β*(Ct)   (model training)
If H(βt, D) ⊆ Ct, stop and return βt
Remove easy examples
Add hard examples

Proof:
Let C ⊆ D. If H(β*(C), D) ⊆ C, then β*(C) = β*(D).
The algorithm terminates.
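The bootstrapping loop can be sketched generically over a cache of examples; `train`, `hard` and `easy` are placeholder callbacks standing in for SVM training and the margin tests above:

```python
def mine_hard_examples(train, hard, easy, cache, data, max_rounds=10):
    """Hard-example mining loop on an example cache (schematic sketch).

    train(cache) -> model fit on the current cache;
    hard(model, data) -> set of examples with y * f(x) < 1;
    easy(model, cache) -> set of cached examples with y * f(x) > 1.
    Stops once every hard example is already in the cache, at which
    point the cached model equals the one trained on all of `data`.
    """
    model = None
    for _ in range(max_rounds):
        model = train(cache)
        H = hard(model, data)
        if H <= cache:                 # all hard examples already cached
            break
        cache = (cache - easy(model, cache)) | H
    return model
```

The point of the procedure is that `data` (all negative windows in all images) never has to fit in memory; only the cache does.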
Latent SVM Data-mining hard examples
Data-mining hard examples in LSVM
Feature cache F: a set of pairs (i, v) with 1 ≤ i ≤ n and v = Φ(xi, z) for some z ∈ Z(xi) (a single entry per positive example)
I(F): the examples indexed by F
LF(β) = ½‖β‖² + C ∑_{i∈I(F)} max(0, 1 − yi max_{(i,v)∈F} β·v)

Modified stochastic gradient descent:
Let αt be the learning rate at iteration t
Let i ∈ I(F) be a random example indexed by F
Let vi = argmax_{(i,v)∈F} β·v
Update β = β − αt ∇i LF(β)

We would like to find a small F such that β*(F) = β*(D(Zp)):
H(β, D) = {(i, Φ(xi, zi)) | zi = argmax_{z∈Z(xi)} β·Φ(xi, z) and yi(β·Φ(xi, zi)) < 1}
E(β, F) = {(i, v) ∈ F | yi(β·v) > 1}
Then apply the same procedure as for the SVM.
Training Models Learning parameters
Training Models Initialization
Initialization
Phase 1: Initializing root filters
Split the positive examples into m groups according to aspect ratio
Decide the size of each Fi according to aspect ratio and area
Train each root filter Fi with a classical SVM

Phase 2: Merging components
Train the mixture of models with no parts (latent variables: component and p0)

Phase 3: Initializing part filters
Initialize 6 rectangles to cover the high-energy regions of the root filter
Anchors at the center, or in symmetric pairs about the vertical axis
di = (0, 0, 1, 1)
Features HOG, PCA and analytic dimension reduction
Features
36-dimensional HOG descriptors (9 orientations, 4 normalizations, contrast-insensitive)
PCA: 11 main eigenvectors, but projection is costly
Similar results when using 9+4 analytic basis vectors
(18+9)+4 = 31 dimensions for contrast-sensitive and contrast-insensitive features
Post-processing Bounding box prediction
Bounding box prediction
Input z: the position of each part, plus the root width
Output (x1, y1, x2, y2): the predicted bounding box
How: for each component of a mixture, 4 linear functions learned by least-squares regression on training data
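Each of the 4 output coordinates is fit by ordinary least squares. The sketch below uses synthetic stand-in features for the geometry derived from z (the paper's exact feature set is not reproduced here):

```python
import numpy as np

# Toy setup: learn one linear function per output coordinate by least
# squares, mapping detection geometry to a corrected corner coordinate.
# Rows of G: one training detection each; columns: synthetic stand-ins
# for features derived from z (e.g. part positions, root width).
rng = np.random.default_rng(1)
G = rng.normal(size=(50, 3))
true_w = np.array([0.5, -1.0, 2.0])          # planted linear relation
x1 = G @ true_w                              # target corner coordinate

w, *_ = np.linalg.lstsq(G, x1, rcond=None)   # linear least-squares fit
assert np.allclose(w, true_w)
```

With enough noise-free training pairs, the fit recovers the planted weights exactly; on real detections the regression only nudges the boxes, but it measurably improves overlap with the ground truth.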
Post-processing Non-Maxima Suppression
Non-Maxima Suppression
Issue: for one object, there are many overlapping detections

Solution: sort the detections by score
Add the detections one by one, skipping any that overlaps an already-kept detection by at least 50%
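The greedy procedure above can be sketched as follows. The box format (x1, y1, x2, y2) and the choice of measuring overlap against the candidate box's own area are illustrative assumptions:

```python
def non_maxima_suppression(boxes, scores, overlap_thresh=0.5):
    """Greedy NMS sketch: keep detections in score order, skipping any
    box that overlaps an already-kept box by at least the threshold."""
    def overlap(a, b):
        # Intersection area over the candidate box b's own area.
        ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
        iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
        area_b = (b[2] - b[0]) * (b[3] - b[1])
        return ix * iy / area_b if area_b > 0 else 0.0

    order = sorted(range(len(boxes)), key=lambda i: -scores[i])
    kept = []
    for i in order:
        if all(overlap(boxes[k], boxes[i]) < overlap_thresh for k in kept):
            kept.append(i)
    return kept

boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (20, 20, 30, 30)]
scores = [0.9, 0.8, 0.7]
print(non_maxima_suppression(boxes, scores))  # [0, 2]
```

Here the second box overlaps the top-scoring one by 81% of its area, so it is suppressed, while the distant third box survives.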
Post-processing Contextual information
Contextual information
Rescore each detection
Best score for each of the k categories: c(I) = (s1, s2, ..., sk)
For a detection (B, s), classify g = (σ(s), x1/w, y1/h, x2/w, y2/h, c(I))
Learned from training data
Some experimental results Models
Models
Some experimental results Detections
Detections
Conclusion
Conclusion
Recent extensions:
Speeding up detection using a cascade classifier
Grammar models

Thanks for your attention.
Any questions?