Formation et Analyse d’Images Session 8


1

Formation et Analyse d’Images, Session 8

Daniela Hall

14 November 2005

2

Course Overview

• Session 1 (19/09/05)
  – Overview
  – Human vision
  – Homogeneous coordinates
  – Camera models

• Session 2 (26/09/05)
  – Tensor notation
  – Image transformations
  – Homography computation

• Session 3 (3/10/05)
  – Camera calibration
  – Reflection models
  – Color spaces

• Session 4 (10/10/05)
  – Pixel based image analysis

• 17/10/05: course is replaced by Modelisation surfacique

3

Course overview

• Session 5 + 6 (24/10/05, 9:45 – 12:45)
  – Contrast description
  – Hough transform

• Session 7 (7/11/05)
  – Kalman filter

• Session 8 (14/11/05)
  – Tracking of regions, pixels, and lines

• Session 9 (21/11/05)
  – Gaussian filter operators

• Session 10 (5/12/05)
  – Scale Space

• Session 11 (12/12/05)
  – Stereo vision
  – Epipolar geometry

• Session 12 (16/01/06): exercises and questions

4

Session overview

1. Tracking of objects

2. Architecture of the robust tracker

3. Tracking using Kalman filter

4. Tracking using CONDENSATION

5

Robust tracking of objects

[Block diagram: trigger regions feed the detection of new targets; the list of targets is predicted into a list of predictions, which is corrected using detection measurements, and fed back into the list of targets.]

6

Tracking system

• Tracking system: detects the position of targets at each time instant (e.g. using background differencing)

7

Tracking system

• Supervisor
  – calls image acquisition, target observation and detection in a cycle

• Target observation module
  – ensures robust tracking by predicting target positions with a Kalman filter

• Detection module
  – verifies the predicted positions by measuring the detection energy within the search region given by the Kalman filter
  – creates new targets by evaluating the detection energy within trigger regions

• Parameters
  – noise threshold, detection energy threshold, parameters for splitting and merging

8

Detection by background differencing

• I = (IR, IG, IB) image, B = (BR, BG, BB) background

• Compute a binary difference image Id, where all pixels whose difference diff is larger than the noise threshold w are set to one.

• Then compute the connected components of Id to detect the pixels that belong to a target.

• For each target, compute the mean and covariance of its pixels. The covariance is transformed into the width and height of the bounding box and the orientation of the target.
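A sketch of this detection step, assuming a Euclidean color difference and 4-connected components (the slide does not specify either choice):

```python
import numpy as np
from collections import deque

def detect_targets(I, B, w):
    """Binary-difference detection: returns (mean, covariance) per target."""
    # Binary difference image Id: 1 where the color difference exceeds w.
    diff = np.linalg.norm(I.astype(float) - B.astype(float), axis=-1)
    Id = diff > w
    seen = np.zeros_like(Id, dtype=bool)
    targets = []
    H, W = Id.shape
    for y0 in range(H):
        for x0 in range(W):
            if not Id[y0, x0] or seen[y0, x0]:
                continue
            # BFS over 4-connected foreground neighbours = one component.
            comp, q = [], deque([(y0, x0)])
            seen[y0, x0] = True
            while q:
                y, x = q.popleft()
                comp.append((x, y))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < H and 0 <= nx < W and Id[ny, nx] and not seen[ny, nx]:
                        seen[ny, nx] = True
                        q.append((ny, nx))
            pts = np.array(comp, dtype=float)
            mean = pts.mean(axis=0)
            # Covariance of the pixel coordinates -> bounding box + orientation.
            cov = np.cov(pts.T) if len(pts) > 1 else np.zeros((2, 2))
            targets.append((mean, cov))
    return targets
```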

9

Real-time target detection

• Computing connected components for an entire image is computationally expensive.

• Idea:
  – Restrict the search for targets to a small number of search regions.

• These regions are:
  – Entry regions marked by the user
  – Search regions obtained from the Kalman filter, which predicts the next most likely position of each current target.

10

Background adaptation to increase robustness of detection

• In long-term tracking, the illumination of the scene changes. Image differencing against a static background then causes many false detections.

• The background is updated regularly by a running average:
  B_t = (1 − α)·B_{t−1} + α·I_t

• t time, α = 0.1 background adaptation parameter

• Background adaptation allows the background to incorporate slow illumination changes.
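A one-line sketch of this update, assuming the standard running-average rule B_t = (1 − α)·B_{t−1} + α·I_t with the slide's α = 0.1:

```python
import numpy as np

def update_background(B, I, alpha=0.1):
    """Blend the current image into the background so that slow
    illumination changes are absorbed over time."""
    return (1.0 - alpha) * B + alpha * I
```

Repeated application makes B converge toward a stationary I at a rate set by α, while transient foreground objects contribute only weakly.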

11

Example

• Detection module

• Parameters: detection energy threshold
  – energy threshold too high: targets are missed or split
  – energy threshold too low: false detections

• Problem: the energy threshold depends on illumination and target appearance

12

Session overview

1. Tracking of objects

2. Architecture of the robust tracker

3. Tracking using Kalman filter

4. Tracking using CONDENSATION

13

Tracking

• Targets are represented by position (x,y) and covariance.

• A first order Kalman filter is used to predict the position of the target in the next frame.

• The Kalman filter provides a ROI in which to look for the target. The ROI is computed from the a posteriori estimate x̂_k and the a posteriori error covariance P_k.
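One common choice (an assumption here; the slide does not give the exact formula) is to take the ROI as a box of a few standard deviations around the predicted position, read off the diagonal of P_k:

```python
import numpy as np

def search_region(x_hat, P, n_sigma=3.0):
    """Axis-aligned ROI around the a posteriori position estimate,
    sized by n_sigma standard deviations from the error covariance."""
    sx, sy = np.sqrt(P[0, 0]), np.sqrt(P[1, 1])
    cx, cy = x_hat[0], x_hat[1]
    return (cx - n_sigma * sx, cy - n_sigma * sy,
            cx + n_sigma * sx, cy + n_sigma * sy)
```

A larger P_k (more uncertainty) automatically widens the search region, which is exactly the behaviour the detection module relies on.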

14

Example

15

Example: Tracking bouncing ball

• Specifications:
  – constant background
  – colored ball

• Problems:
  – noisy observations
  – motion blur
  – rapid motion changes

Thanks to B. Fisher (University of Edinburgh) for providing slides and figures for this example. http://homepages.inf.ed.ac.uk/rbf/AVAUDIO/lect8.pdf

16

Ball physical model

• Position z_k = (x, y)

• Position update z_k = z_{k−1} + v_{k−1}·Δt

• Velocity update v_k = v_{k−1} + a_{k−1}·Δt

• Acceleration (gravity down) a_k = (0, g)^T
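A minimal simulation of this physical model, assuming image coordinates where y grows downward so gravity is a positive constant g:

```python
def step(z, v, g=9.81, dt=0.1):
    """One Euler step of the ball model: integrate position with the
    old velocity, then velocity with the constant acceleration (0, g)."""
    z_next = (z[0] + v[0] * dt, z[1] + v[1] * dt)  # z_k = z_{k-1} + v_{k-1} dt
    v_next = (v[0], v[1] + g * dt)                 # v_k = v_{k-1} + a_{k-1} dt
    return z_next, v_next
```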

17

Robust tracking of objects

• Measurement: z_k = (x, y)^T

• State vector: x_k = (x, y, x', y')^T

• State equation: x_k = A·x_{k−1} + B·u_k

• Prediction: x̂_k = A·x̂_{k−1} + B·u_k

• Measurement model: z_k = H·x_k + v_k, with
  H = | 1 0 0 0 |
      | 0 1 0 0 |

• State control (gravity): B·u_k = (0, 0, 0, g·Δt)^T

18

Robust tracking of objects

• Measurement noise error covariance:
  R_k = | 0.285 0.005 |
        | 0.005 0.046 |

• Temporal matrix:
  A = | 1 0 Δt 0  |
      | 0 1 0  Δt |
      | 0 0 1  0  |
      | 0 0 0  1  |

• Process noise error covariance: Q_k = a·I, with a = 0.01

• a affects the computation speed (a large a increases the uncertainty and therefore the size of the search regions)
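A minimal predict/correct cycle of this first-order tracker, using the state (x, y, x', y') and the matrices A, H, Q = a·I and R given on the slides; the state control term B·u (gravity) is omitted here for brevity:

```python
import numpy as np

dt = 1.0
A = np.array([[1, 0, dt, 0],     # temporal matrix: constant-velocity model
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],      # measure position only
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)             # process noise Q = a*I, a = 0.01
R = np.array([[0.285, 0.005],    # measurement noise covariance
              [0.005, 0.046]])

def kalman_step(x, P, z):
    """One Kalman iteration: predict the state, then correct with z."""
    # Prediction
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Correction
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new
```

The diagonal of the corrected P directly sizes the search region handed to the detection module.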

19

Kalman filter successes

20

Kalman filter failures

21

Kalman filter analysis

• smooths noisy observations

• the dynamic model fails at the bounce and at the stop

• could estimate the ball radius

• could plot a 95%-likelihood boundary of the ball position (the boundary would grow when the fit is bad)

22

Session overview

1. Tracking of objects

2. Architecture of the robust tracker

3. Tracking using Kalman filter

4. Tracking using CONDENSATION

23

Tracking by CONDENSATION

• CONDENSATION: Conditional Density Propagation. Also known as particle filtering.

Ref: M. Isard and A. Blake, CONDENSATION for visual tracking, International Journal of Computer Vision, 29(1), 1998. http://www.robots.ox.ac.uk/%7Econtours/

24

CONDENSATION tracking

• keeps multiple hypotheses

• updates them using new data

• selects hypotheses probabilistically

• copes with very noisy data and with process state changes

• tunable computation load (by choosing the number of particles)

25

CONDENSATION algorithm

• Given a set of N hypotheses at time k, H_k = {x_{1,k}, ..., x_{N,k}}, with associated probabilities {p(x_{1,k}), ..., p(x_{N,k})}

• Repeat N times to generate H_{k+1}:
  1. randomly select a hypothesis x_{u,k} from H_k with probability p(x_{u,k})
  2. generate a new state vector s_k from a distribution centered at x_{u,k}
  3. get the new state vector using the dynamic model x_{k+1} = f(s_k) and the Kalman filter
  4. evaluate the probability p(z_{k+1} | x_{k+1}) of the observed data z_{k+1} given the state x_{k+1}
  5. use Bayes' rule to get p(x_{k+1} | z_{k+1})
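The five steps can be sketched for a 1-D state; the identity dynamic model, the diffusion scale, and the Gaussian likelihood below are illustrative assumptions, not the lecture's choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def condensation_step(hyps, probs, z, sigma_diff=0.5, sigma_obs=1.0):
    """One CONDENSATION iteration over a 1-D particle set."""
    N = len(hyps)
    # 1. resample hypotheses according to their probabilities
    picked = rng.choice(hyps, size=N, p=probs)
    # 2. diffuse: sample a new state around each selected hypothesis
    s = picked + rng.normal(0.0, sigma_diff, N)
    # 3. apply the dynamic model (identity here)
    x_new = s
    # 4. likelihood p(z | x) of the observation under each hypothesis
    like = np.exp(-0.5 * ((z - x_new) / sigma_obs) ** 2)
    # 5. normalize (Bayes rule with equal weight on resampled particles)
    probs_new = like / like.sum()
    return x_new, probs_new
```

Iterating this concentrates the particle cloud around states that explain the observations, while the diffusion keeps alternative hypotheses alive.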

26

CONDENSATION algorithm

Figure from book Isard, Blake: Active Contours

27

Why does CONDENSATION tracking work?

• many slightly different hypotheses increase the chance that one of them fits better

• the dynamic model allows switching between different motion models
  – motion models of the bouncing ball: bounce, freefall, stop

• sampling by probability weeds out bad hypotheses

28

Tracking of bouncing ball

1. Select 100 hypotheses x_k with probabilities p(x_k)

2. Use the estimated covariance P to create state samples s_k

3. Define a situation switching model

29

Tracking of bouncing ball

• If in STOP: y' = 0

• If in BOUNCE: x' = -0.7·x', and also add some random y' motion, y' = y' + r

• If in FREEFALL: use the freefall motion model, y' = y' + g·Δt and x' = x' + r

• Then use the Kalman filter to predict x̂_k

4. Estimate the goodness of each hypothesis by 1/||H·x_k − z_k||²

5. p(x_k) is estimated from the goodness by normalization.
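The situation switching above can be sketched as follows; modelling the random terms r as small Gaussian perturbations is an assumption, as the slides do not specify their distribution:

```python
import random

def switch_model(situation, vx, vy, g=9.81, dt=0.1, r_scale=0.1):
    """Update the velocity (vx, vy) according to the active situation."""
    r = random.gauss(0.0, r_scale)
    if situation == "STOP":
        vy = 0.0                  # ball at rest: no vertical motion
    elif situation == "BOUNCE":
        vx = -0.7 * vx            # damped reversal, as on the slide
        vy = vy + r               # plus some random y' motion
    elif situation == "FREEFALL":
        vy = vy + g * dt          # freefall: v_k = v_{k-1} + a dt
        vx = vx + r
    return vx, vy
```

Each particle carries its own situation, so the filter can keep bounce and freefall hypotheses alive simultaneously around the moment of impact.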

30

Example of sampling effects

31

Kalman filter failures fixed

32

Comparison Kalman vs condensation

• Kalman:
  – assumes a Gaussian motion model
  – easy to parametrize
  – fast

• Condensation:
  – can track objects with non-Gaussian motion
  – very good for multi-modal motion models
  – simple algorithm
  – reasonably fast