
Bayesian Filtering for Location Estimation

D. Fox, J. Hightower, L. Liao, D. Schulz, and G. Borriello

Presented by: Honggang Zhang

Outline

• Basic idea of Bayes filters
• Several types of Bayes filters
• Some applications

Bayes Filters

System state dynamics: $x_t = f(x_{t-1}, w_t)$

Observation dynamics: $z_t = g(x_t, v_t)$

We are interested in estimating the system state from noisy observations, i.e. in the belief or posterior density

$Bel(x_t) = p(x_t \mid z_1, \ldots, z_t)$,

where $z_{1:(t-1)}$ denotes $z_1, \ldots, z_{t-1}$.

From the belief definition above, the Bayes filter is constructed in two steps:

Predict: $p(x_t \mid z_{1:(t-1)}) = \int p(x_t \mid x_{t-1}, z_{1:(t-1)})\, p(x_{t-1} \mid z_{1:(t-1)})\, dx_{t-1}$

Update: $p(x_t \mid z_t, z_{1:(t-1)}) = \dfrac{p(z_t \mid x_t, z_{1:(t-1)})\, p(x_t \mid z_{1:(t-1)})}{p(z_t \mid z_{1:(t-1)})}$

Recall the law of total probability and Bayes' rule:

$p(x_t) = \int p(x_t \mid x_{t-1})\, p(x_{t-1})\, dx_{t-1}$

$p(x_t \mid z_t) = \dfrac{p(z_t \mid x_t)\, p(x_t)}{p(z_t)}$

Assumptions: Markov process.

Predict: $p(x_t \mid z_{1:(t-1)}) = \int p(x_t \mid x_{t-1}, z_{1:(t-1)})\, p(x_{t-1} \mid z_{1:(t-1)})\, dx_{t-1}$

Replace $p(x_t \mid x_{t-1}, z_{1:(t-1)})$ with $p(x_t \mid x_{t-1})$.

Update: $p(x_t \mid z_t, z_{1:(t-1)}) = \dfrac{p(z_t \mid x_t, z_{1:(t-1)})\, p(x_t \mid z_{1:(t-1)})}{p(z_t \mid z_{1:(t-1)})}$

Replace $p(z_t \mid x_t, z_{1:(t-1)})$ with $p(z_t \mid x_t)$, which gives

$p(x_t \mid z_t, z_{1:(t-1)}) \propto p(z_t \mid x_t)\, p(x_t \mid z_{1:(t-1)})$

Bayes Filter

Predict: $p(x_t \mid z_{1:(t-1)}) = \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid z_{1:(t-1)})\, dx_{t-1}$

How to use it? What else to know?

Motion model: $p(x_t \mid x_{t-1})$

Perceptual model: $p(z_t \mid x_t)$

Start from: $p(x_0 \mid z_0) = \dfrac{p(z_0 \mid x_0)\, p(x_0)}{p(z_0)}$

Example 1

Step 0: initialization

$Bel(x_0)$ or $p(x_0)$

Step 1: updating

$Bel(x_0)$ or $p(x_0 \mid z_0) \propto p(z_0 \mid x_0)\, p(x_0)$

Example 1 (continued)

Step 2: predicting

$Bel(x_1)$ or $p(x_1 \mid z_0) = \int p(x_1 \mid x_0)\, p(x_0 \mid z_0)\, dx_0$

Step 3: updating

$Bel(x_1)$ or $p(x_1 \mid z_{0:1}) \propto p(z_1 \mid x_1)\, p(x_1 \mid z_0)$

Step 4: predicting

$Bel(x_2)$ or $p(x_2 \mid z_{0:1}) = \int p(x_2 \mid x_1)\, p(x_1 \mid z_{0:1})\, dx_1$
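As a concrete illustration, the following is a minimal sketch in Python/NumPy of these five steps on a 1-D corridor discretized into cells. The likelihood vectors and the motion kernel are invented purely for illustration and are not taken from the paper.

```python
import numpy as np

n = 10                                    # corridor discretized into 10 cells
belief = np.full(n, 1.0 / n)              # Step 0: uniform prior Bel(x0) = p(x0)

def update(belief, likelihood):
    # Updating: Bel(x) proportional to p(z | x) * Bel(x)
    posterior = likelihood * belief
    return posterior / posterior.sum()

def predict(belief, kernel):
    # Predicting: Bel(x_t) = sum over x_{t-1} of p(x_t | x_{t-1}) Bel(x_{t-1});
    # kernel holds [p(step = -1), p(step = 0), p(step = +1)]
    new_belief = np.convolve(belief, kernel, mode="same")
    return new_belief / new_belief.sum()  # renormalize (mass truncated at the borders)

motion = np.array([0.1, 0.2, 0.7])        # hypothetical motion model: mostly one cell to the right

# Hypothetical sensor likelihoods p(z | x) for two observations z0, z1
like_z0 = np.array([.02, .05, .40, .25, .10, .08, .04, .03, .02, .01])
like_z1 = np.array([.01, .02, .05, .30, .35, .15, .06, .03, .02, .01])

belief = update(belief, like_z0)          # Step 1: updating with z0
belief = predict(belief, motion)          # Step 2: predicting
belief = update(belief, like_z1)          # Step 3: updating with z1
belief = predict(belief, motion)          # Step 4: predicting
```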

Several types of Bayes filters

• They differ in how they represent probability densities:
  – Kalman filter
  – Multi-hypothesis filter
  – Grid-based approach
  – Topological approach
  – Particle filter

Kalman Filter

Recall the general problem:

$x_t = f(x_{t-1}, w_t)$

$z_t = g(x_t, v_t)$

Assumptions of Kalman Filter:

$x_t = A_t x_{t-1} + w_t$, where $w_t \sim N(0, Q_t)$

$z_t = C_t x_t + v_t$, where $v_t \sim N(0, R_t)$

$Bel(x_t) = N(x_t; \mu_t, \Sigma_t)$: the belief of the Kalman filter is a unimodal Gaussian.

Advantage: computational efficiency
Disadvantage: the assumptions are too restrictive
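To make the predict/update cycle concrete, here is a minimal sketch in Python/NumPy of the linear-Gaussian case; the matrices $A$, $C$, $Q$, $R$ correspond to the assumptions above, while the concrete values in the usage lines are invented only for illustration.

```python
import numpy as np

def kf_predict(mu, Sigma, A, Q):
    # Prediction: push the Gaussian belief through x_t = A x_{t-1} + w_t
    mu_pred = A @ mu
    Sigma_pred = A @ Sigma @ A.T + Q
    return mu_pred, Sigma_pred

def kf_update(mu_pred, Sigma_pred, z, C, R):
    # Update: correct the predicted belief with the measurement z_t = C x_t + v_t
    S = C @ Sigma_pred @ C.T + R                 # innovation covariance
    K = Sigma_pred @ C.T @ np.linalg.inv(S)      # Kalman gain
    mu = mu_pred + K @ (z - C @ mu_pred)
    Sigma = (np.eye(len(mu)) - K @ C) @ Sigma_pred
    return mu, Sigma

# Illustrative constant-velocity example (values are made up):
dt = 1.0
A = np.array([[1.0, dt], [0.0, 1.0]])            # state: [position, velocity]
C = np.array([[1.0, 0.0]])                       # only position is observed
Q = 0.01 * np.eye(2)
R = np.array([[0.25]])

mu, Sigma = np.zeros(2), np.eye(2)
mu, Sigma = kf_predict(mu, Sigma, A, Q)
mu, Sigma = kf_update(mu, Sigma, z=np.array([1.2]), C=C, R=R)
```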

Multi-hypothesis Tracking

• Belief is a mixture of Gaussians

• Tracking each Gaussian hypothesis using a Kalman filter

• Deciding weights on the basis of how well each hypothesis predicts the sensor measurements

• Advantage:
  – Can represent multimodal beliefs (mixtures of Gaussians)

• Disadvantage:
  – Computationally expensive
  – Difficult to decide on hypotheses

$Bel(x_t) \approx \sum_i w_t^{(i)}\, N(x_t; \mu_t^{(i)}, \Sigma_t^{(i)})$

Grid-based Approaches

• Using discrete, piecewise constant representations of the belief

• Tessellate the environment into small patches, with each patch holding the belief that the object is in it

• Advantage:
  – Able to represent arbitrary distributions over the discrete state space
• Disadvantage:
  – Computational and space complexity required to keep the position grid in memory and update it

Topological approaches

• A graph representing the state space
  – Nodes represent the object's location (e.g. a room)
  – Edges represent connectivity (e.g. a hallway)
• Advantage:
  – Efficiency, because the state space is small
• Disadvantage:
  – Coarseness of the representation

Particle filters

• Also known as Sequential Monte Carlo Methods

• Representing belief by sets of samples or particles

• The $w_t^{(i)}$ are nonnegative weights called importance factors

• The updating procedure is sequential importance sampling with re-sampling

$Bel(x_t) \approx S_t = \{ \langle x_t^{(i)}, w_t^{(i)} \rangle \mid i = 1, \ldots, n \}$

Example 2: Particle Filter

Step 0: initialization. Each particle has the same weight.

Step 1: updating weights. Weights are proportional to $p(z \mid x)$.

Step 2: predicting. Predict the new locations of the particles.

Step 3: updating weights. Weights are proportional to $p(z \mid x)$. Particles become more concentrated in the region where the person is more likely to be.

Step 4: predicting. Predict the new locations of the particles.
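A minimal sketch of these steps in Python/NumPy for a 1-D state; the Gaussian sensor model, the velocity, and the noise levels are assumptions chosen only for illustration, not the paper's models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 0: initialization -- n particles spread uniformly, all with equal weight.
n = 1000
particles = rng.uniform(0.0, 10.0, size=n)        # e.g. position along a corridor
weights = np.full(n, 1.0 / n)

def update(particles, weights, z, sensor_sigma=0.5):
    # Updating: weight each particle by the likelihood p(z | x)
    likelihood = np.exp(-0.5 * ((z - particles) / sensor_sigma) ** 2)
    w = weights * likelihood
    return w / w.sum()

def resample(particles, weights):
    # Re-sampling: draw particles with probability proportional to their weights
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

def predict(particles, velocity=1.0, motion_sigma=0.3):
    # Predicting: move every particle according to the motion model plus noise
    return particles + velocity + rng.normal(0.0, motion_sigma, size=len(particles))

# One sequential-importance-sampling cycle for a hypothetical measurement z = 3.2
weights = update(particles, weights, z=3.2)
particles, weights = resample(particles, weights)
particles = predict(particles)
```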

Compare Particle Filter with Bayes Filter with Known Distribution

(Figure: predicting and updating steps of Example 1 and Example 2 shown side by side.)

Comments on Particle Filters

• Advantage:
  – Able to represent arbitrary densities
  – Converges to the true posterior even for non-Gaussian, nonlinear systems
  – Efficient in the sense that particles tend to focus on regions with high probability
• Disadvantage:
  – Worst-case complexity grows exponentially in the dimension of the state space

Comparison

                 Kalman     Multihypothesis   Grid       Topology   Particle
Belief           Unimodal   Multimodal        Discrete   Discrete   Discrete
Accuracy         +          +                 0          -          +
Robustness       0          +                 +          +          +
Sensor variety   -          -                 +          0          +
Efficiency       +          0                 -          0          0
Implementation   0          -                 0          0          +

+ : good; 0 : neutral; - : weak

Example Applications

• Particle Filters (unconstrained)
• Particle Filters (constrained)
• Combination of Particle Filters and Kalman Filters

Sensors

• Ultrasound and infrared sensors: less accurate, but measurements come with an identity
• Laser range finder: accurate, but anonymous

Example Indoor Environment

Red circles: ultrasound ID sensors

Blue squares: infrared ID sensors

Using Particle Filters (unconstrained)

• Due to the high noise level of the ultrasound and infrared sensors, we use particle filters

• Whenever a sensor detects the person, the particles are updated

Using Particle Filters (unconstrained): Another Example


Using Particle Filters (constrained)

A more efficient way to use particle filters:

• Constraining the state space to locations on a Voronoi graph (a structure similar to a skeleton of the environment's free space), as sketched below
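One simple way to realize such a constraint (a sketch under assumptions, not necessarily the exact scheme used in the paper) is to represent the Voronoi graph as a set of line-segment edges and snap each predicted particle back onto the nearest edge:

```python
import numpy as np

# Hypothetical Voronoi graph: straight edges given by pairs of 2-D endpoints
edges = [(np.array([0.0, 0.0]), np.array([10.0, 0.0])),    # a hallway
         (np.array([5.0, 0.0]), np.array([5.0, 8.0]))]     # a corridor branching off

def project_to_graph(p):
    # Snap a 2-D point to the closest point on the closest graph edge
    best, best_d = None, np.inf
    for a, b in edges:
        ab = b - a
        t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        q = a + t * ab
        d = np.linalg.norm(p - q)
        if d < best_d:
            best, best_d = q, d
    return best

def predict(particles, motion_sigma=0.4):
    # Diffuse particles with a simple motion model, then constrain them to the graph
    moved = particles + np.random.normal(0.0, motion_sigma, size=particles.shape)
    return np.array([project_to_graph(p) for p in moved])
```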

Combining Particle and Kalman Filters to Solve the Data Association Problem

Area covered by ID sensors

Data Association Problem

In areas 3 and 4, the identities of A and B are known.

In areas 5 and 6, the ambiguity can be resolved, but additional hypotheses are needed.

Laser range finder

• Track individual people using Kalman filters on the laser range data

• A particle filter maintains multiple hypotheses about the identities of the people
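The following is a hypothetical sketch of how such a combination can look: anonymous laser tracks are assumed to be maintained by Kalman filters (not shown), and each particle is one hypothesis about which track belongs to which person, reweighted whenever an ID sensor fires. All names and values here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

people = ["A", "B"]
tracks = [0, 1]                          # anonymous laser-range-finder tracks

rng = np.random.default_rng(0)
n = 200
# Each particle is an assignment hypothesis: a permutation mapping person index -> track id
particles = [rng.permutation(tracks) for _ in range(n)]
weights = np.full(n, 1.0 / n)

def update_with_id_reading(particles, weights, person, track_positions, reading_pos, sigma=1.0):
    # When an ID sensor detects `person` near `reading_pos`, reweight each assignment
    # hypothesis by how close that person's assigned laser track currently is.
    p_idx = people.index(person)
    lik = np.array([
        np.exp(-0.5 * np.sum((track_positions[a[p_idx]] - reading_pos) ** 2) / sigma ** 2)
        for a in particles
    ])
    w = weights * lik
    return w / w.sum()

# Example: track 0 is near the infrared sensor that just detected person "B"
track_positions = {0: np.array([2.0, 1.0]), 1: np.array([7.5, 4.0])}
weights = update_with_id_reading(particles, weights, "B", track_positions, np.array([2.2, 1.1]))
```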


Conclusion

• “The Location Stack”: a general framework with publicly available implementation

• Probabilistic techniques have tremendous potential for inference problems

Questions?