
Particle Filters

Outline
1. Introduction to particle filters
   1. Recursive Bayesian estimation
2. Bayesian importance sampling
   1. Sequential importance sampling (SIS)
   2. Sampling importance resampling (SIR)
3. Improvements to SIR
   1. On-line Markov chain Monte Carlo
4. Basic particle filter algorithm
5. Example for robot localization
6. Conclusions

Motivation for particle filters

• But what if the distribution in our problem is not Gaussian?

Key Idea of Particle Filters

• Idea: concentrate more samples where we expect the solution to be

Motion Model Reminder

• The density of samples represents the expected probability of the robot's location

Global Localization of Robot with Sonar
http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/animations/global-floor.gif

•This is the lost robot problem

Particles are used for probability density function approximation

Particle sets can be used to approximate functions

Function Approximation

The more particles fall into an interval, the higher the probability of that interval.

How to draw samples from a function/distribution?

Importance Sampling Principle

• w = f / g
• f is often called the target
• g is often called the proposal
• Pre-condition: f(x) > 0 implies g(x) > 0

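A minimal numerical sketch of this principle (the densities f and g below are illustrative assumptions, not taken from the slides): samples are drawn from the proposal g, and the weights w = f/g correct the mismatch so that weighted averages approximate expectations under the target f.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):                      # target: standard normal density
    return np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

def g_pdf(x):                  # proposal: wider normal, so g(x) > 0 wherever f(x) > 0
    return np.exp(-0.5 * (x / 2)**2) / (2 * np.sqrt(2 * np.pi))

x = rng.normal(0.0, 2.0, size=10_000)    # draw samples from g
w = f(x) / g_pdf(x)                      # importance weights w = f / g

# Weighted estimate of E_f[x^2] (true value is 1 for a standard normal)
print((w * x**2).sum() / w.sum())
```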

Importance sampling: another example of calculating weight samples

• How do we formally calculate the f/g value?

Importance Sampling Formulas for f, g, and f/g

Target distribution $f$:
$$p(x \mid z_1, z_2, \ldots, z_n) = \frac{\prod_k p(z_k \mid x)\, p(x)}{p(z_1, z_2, \ldots, z_n)}$$

Sampling distribution $g$:
$$p(x \mid z_l) = \frac{p(z_l \mid x)\, p(x)}{p(z_l)}$$

Importance weights $w = f/g$:
$$w = \frac{p(x \mid z_1, z_2, \ldots, z_n)}{p(x \mid z_l)} = \frac{p(z_l) \prod_{k \neq l} p(z_k \mid x)}{p(z_1, z_2, \ldots, z_n)}$$

History of the Monte Carlo Idea and especially Particle Filters

• First attempts: simulations of growing polymers
  – M. N. Rosenbluth and A. W. Rosenbluth, "Monte Carlo calculation of the average extension of molecular chains," Journal of Chemical Physics, vol. 23, no. 2, pp. 356–359, 1955.
• First application in signal processing (1993)
  – N. J. Gordon, D. J. Salmond, and A. F. M. Smith, "Novel approach to nonlinear/non-Gaussian Bayesian state estimation," IEE Proceedings-F, vol. 140, no. 2, pp. 107–113, 1993.
• Books
  – A. Doucet, N. de Freitas, and N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Springer, 2001.
  – B. Ristic, S. Arulampalam, N. Gordon, Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004.
• Tutorials
  – M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking," IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174–188, 2002.

What is the problem that we want to solve?

•The problem is tracking the state of a system as it evolves over time

•Sequentially arriving (noisy or ambiguous) observations

•We want to know: the best possible estimate of the hidden variables

Solution: Sequential Update

• Storing and processing all incoming measurements is inconvenient and may be impossible

• Recursive filtering:
  1. Predict the next state pdf from the current estimate
  2. Update the prediction using sequentially arriving new measurements
• Optimal Bayesian solution: recursively calculate the exact posterior density

These lead to various particle filters

Particle Filters

1. Sequential Monte Carlo methods for on-line learning within a Bayesian framework.

2. Known as:
   1. Particle filters
   2. Sequential sampling-importance resampling (SIR)
   3. Bootstrap filters
   4. Condensation trackers
   5. Interacting particle approximations
   6. Survival of the fittest

Particle Filter characteristics

Approaches to Particle Filters

METAPHORS

Particle filters

• Sequential and Monte Carlo properties
• Represent the belief by a set of samples, or particles:
$$Bel(x_t) \approx S_t = \{\langle x_t^i, w_t^i\rangle \mid i = 1, \ldots, n\}$$
• The $w_t^i$ are nonnegative weights called importance factors
• The updating procedure is sequential importance sampling with re-sampling (a small sketch of this representation follows below)
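A tiny sketch of this weighted-particle representation (the particle values and weights below are made up for illustration): the belief is just a set of samples with weights, and estimates are weighted sums.

```python
import numpy as np

particles = np.array([0.1, 0.4, 0.5, 0.9])   # x_t^i: sampled states
weights = np.array([0.1, 0.2, 0.4, 0.3])     # w_t^i: importance factors, sum to 1

mean_estimate = (weights * particles).sum()   # E[x_t] under Bel(x_t)
print(mean_estimate)
```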

Tracking in 1D: the blue trajectory is the target; the best of 10 particles is shown in red.

A short, more formal introduction to Particle Filters and Monte Carlo Localization

Proximity Sensor Model Reminder

Particle filtering ideas
• Recursive Bayesian filter by Monte Carlo sampling
• The idea: represent the posterior density by a set of random particles with associated weights
• Compute estimates based on these samples and weights

(Figure: the posterior density over the sample space, represented by weighted particles.)

1. Particle filters are based on recursive generation of random measures that approximate the distributions of the unknowns.
2. Random measures: particles and importance weights.
3. As new observations become available, the particles and the weights are propagated by exploiting Bayes' theorem.

Mathematical tools needed for Particle Filters

• Recall the law of total probability and Bayes' rule:
$$p(x_t) = \int p(x_t \mid x_{t-1})\, p(x_{t-1})\, dx_{t-1}$$
$$p(x_t \mid z_t) = \frac{p(z_t \mid x_t)\, p(x_t)}{p(z_t)}$$

Recursive Bayesian estimation (I)

• Recursive filter:
  – System model: $x_k = f_k(x_{k-1}, w_{k-1})$, i.e. $p(x_k \mid x_{k-1})$
  – Measurement model: $y_k = h_k(x_k, v_k)$, i.e. $p(y_k \mid x_k)$
  – Information available: $D_k = \{y_1, \ldots, y_k\}$ and the prior $p(x_0)$

Recursive Bayesian estimation (II)

• Seek: $p(x_{k+i} \mid D_k)$
  – i = 0: filtering
  – i > 0: prediction
  – i < 0: smoothing
• Prediction:
$$p(x_k \mid D_{k-1}) = \int p(x_k, x_{k-1} \mid D_{k-1})\, dx_{k-1} = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid D_{k-1})\, dx_{k-1}$$
  – since: $p(x_k, x_{k-1} \mid D_{k-1}) = p(x_k \mid x_{k-1}, D_{k-1})\, p(x_{k-1} \mid D_{k-1}) = p(x_k \mid x_{k-1})\, p(x_{k-1} \mid D_{k-1})$

Recursive Bayesian estimation (III)

• Update:
$$p(x_k \mid D_k) = \frac{p(y_k \mid x_k)\, p(x_k \mid D_{k-1})}{p(y_k \mid D_{k-1})}$$
• where:
$$p(y_k \mid D_{k-1}) = \int p(y_k, x_k \mid D_{k-1})\, dx_k = \int p(y_k \mid x_k)\, p(x_k \mid D_{k-1})\, dx_k$$
  – since: $p(y_k, x_k \mid D_{k-1}) = p(y_k \mid x_k, D_{k-1})\, p(x_k \mid D_{k-1}) = p(y_k \mid x_k)\, p(x_k \mid D_{k-1})$

Bayes Filters (second pass)

• System state dynamics: $x_t = f_t(x_{t-1}, w_t)$
• Observation dynamics: $z_t = g_t(x_t, v_t)$
• We are estimating the system state from noisy observations.
• We are interested in the belief, or posterior density:
$$Bel(x_t) = p(x_t \mid z_1, \ldots, z_t), \qquad \text{where } z_{1:(t-1)} = \{z_1, \ldots, z_{t-1}\}$$

• From the above, we construct the two steps of the Bayes filter:
• Predict:
$$p(x_t \mid z_{1:(t-1)}) = \int p(x_t \mid x_{t-1}, z_{1:(t-1)})\, p(x_{t-1} \mid z_{1:(t-1)})\, dx_{t-1}$$
• Update:
$$p(x_t \mid z_{1:t}) = \frac{p(z_t \mid x_t, z_{1:(t-1)})\, p(x_t \mid z_{1:(t-1)})}{p(z_t \mid z_{1:(t-1)})}$$

• Assumption: the process is Markov, so
  – replace $p(x_t \mid x_{t-1}, z_{1:(t-1)})$ with $p(x_t \mid x_{t-1})$
  – replace $p(z_t \mid x_t, z_{1:(t-1)})$ with $p(z_t \mid x_t)$

• Bayes filter:
  – Predict: $p(x_t \mid z_{1:(t-1)}) = \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid z_{1:(t-1)})\, dx_{t-1}$
  – Update: $p(x_t \mid z_{1:t}) \propto p(z_t \mid x_t)\, p(x_t \mid z_{1:(t-1)})$

• How to use it? What else do we need to know?
  – Motion model: $p(x_t \mid x_{t-1})$
  – Perceptual model: $p(z_t \mid x_t)$
  – Start from: $p(x_0 \mid z_0) = \dfrac{p(z_0 \mid x_0)\, p(x_0)}{p(z_0)}$
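To make the two steps concrete, here is a minimal grid-based (histogram) Bayes filter sketch; the 1-D state space, transition matrix, and Gaussian likelihood below are assumptions for illustration, not models from the slides.

```python
import numpy as np

n = 50
belief = np.full(n, 1.0 / n)                 # p(x_0): uniform prior

# Motion model p(x_t | x_{t-1}): move right by one cell, with some blur
trans = np.zeros((n, n))
for j in range(n):
    for di, p in [(0, 0.1), (1, 0.8), (2, 0.1)]:
        trans[(j + di) % n, j] += p

def likelihood(z, states):
    # Perceptual model p(z_t | x_t): Gaussian around the observed cell
    return np.exp(-0.5 * ((states - z) / 2.0) ** 2)

for z in [5, 6, 8, 9]:                       # sequentially arriving observations
    belief = trans @ belief                  # predict: sum over p(x|x') Bel(x')
    belief *= likelihood(z, np.arange(n))    # update: multiply by p(z|x)
    belief /= belief.sum()                   # normalize

print(np.argmax(belief))                     # MAP estimate of the state
```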

Particle Filters: Compare Gaussian and Particle Filters

Example 1: theoretical PDF

• Step 0, initialization:
$$Bel(x_0) = p(x_0)$$
• Step 1, updating:
$$Bel(x_0) = p(x_0 \mid z_0) \propto p(z_0 \mid x_0)\, p(x_0)$$

Example 2: Particle Filter

•Step 0: initialization

•Each particle has the same weight

•Step 1: updating weights. Weights are proportional to p(z|x)

• Example 1 (continued)
• Step 2, predicting:
$$Bel^-(x_1) = p(x_1 \mid z_0) = \int p(x_1 \mid x_0)\, p(x_0 \mid z_0)\, dx_0$$
• Step 3, updating:
$$Bel(x_1) = p(x_1 \mid z_1) \propto p(z_1 \mid x_1)\, p(x_1 \mid z_0)$$
• Step 4, predicting:
$$Bel^-(x_2) = p(x_2 \mid z_1) = \int p(x_2 \mid x_1)\, p(x_1 \mid z_1)\, dx_1$$


Robot Motion

Example 2: Particle Filter

• Particles are more concentrated in the region where the person is more likely to be
• Step 2, predicting: predict the new locations of the particles
• Step 3, updating weights: weights are proportional to p(z|x)
• Step 4, predicting: predict the new locations of the particles again

Robot Motion

Compare Particle Filter with Bayes Filter with Known Distribution

(Figure: Examples 1 and 2 compared side by side during the predicting and updating steps.)

Classical approximations

• Analytical methods: – Extended Kalman filter,– Gaussian sums… (Alspach et al. 1971)

• Perform poorly in numerous cases of interest

• Numerical methods:– point masses approximations,– splines. (Bucy 1971, de Figueiro 1974…)

• Very complex to implement, not flexible.

Monte Carlo Localization

Mobile Robot Localization

Each particle is a potential pose of the robot.

The proposal distribution is the motion model of the robot (prediction step).

The observation model is used to compute the importance weight (correction step).


Sample-based Localization (sonar)

Random samples and the pdf (I)
• Take p(x) = Gamma(4,1)
• Generate some random samples
• Plot a histogram and a basic approximation to the pdf

(Figure: histogram of 200 samples and the corresponding pdf approximation.)

Random samples and the pdf (II)

(Figure: histogram-based pdf approximations for 500 and 1000 samples.)

Random samples and the pdf (III)

(Figure: histogram-based pdf approximations for 5000 and 200,000 samples; the approximation improves as the number of samples grows. A small numerical sketch of this experiment follows below.)
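A small numerical version of this experiment, assuming NumPy's gamma sampler and the closed-form Gamma(4,1) density p(x) = x³e⁻ˣ/3!: the average histogram error shrinks as the sample count grows.

```python
import numpy as np

rng = np.random.default_rng(1)

def gamma41_pdf(x):
    # Gamma(4,1) density: x^3 exp(-x) / Gamma(4), with Gamma(4) = 3! = 6
    return x**3 * np.exp(-x) / 6.0

for n in [200, 500, 1000, 5000, 200_000]:
    x = rng.gamma(shape=4.0, scale=1.0, size=n)
    hist, edges = np.histogram(x, bins=30, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    err = np.abs(hist - gamma41_pdf(centers)).mean()
    print(n, round(err, 4))   # error shrinks as the sample count grows
```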

Importance Sampling

Importance Sampling
• Unfortunately it is often not possible to sample directly from the posterior distribution, but we can use importance sampling.

• Let p(x) be a pdf from which it is difficult to draw samples.

• Let xi ~ q(x), i=1, …, N, be samples that are easily generated from a proposal pdf q, which is called an importance density.

• Then an approximation to the density p is given by
$$p(x) \approx \sum_{i=1}^{N} w^i\, \delta(x - x^i), \qquad \text{where } w^i \propto \frac{p(x^i)}{q(x^i)}$$

Bayesian Importance Sampling

• By drawing samples $x_{0:k}^i$ from a known, easy-to-sample proposal distribution $q(x_{0:k} \mid D_k)$ we obtain:
$$p(x_{0:k} \mid D_k) \approx \sum_{i=1}^{N} w_k^i\, \delta(x_{0:k} - x_{0:k}^i)$$
• where
$$w_k^i \propto \frac{p(x_{0:k}^i \mid D_k)}{q(x_{0:k}^i \mid D_k)}$$
• are the normalized weights.

Sensor Information: Importance Sampling

Sequential Importance Sampling (I)
• Factorizing the proposal distribution:
$$q(x_{0:k} \mid D_k) = q(x_0) \prod_{j=1}^{k} q(x_j \mid x_{0:j-1}, D_j)$$
• and remembering that the state evolution is modeled as a Markov process,
• we obtain a recursive estimate of the importance weights:
$$w_k = w_{k-1}\, \frac{p(y_k \mid x_k)\, p(x_k \mid x_{k-1})}{q(x_k \mid x_{0:k-1}, D_k)}$$
• The factorization is obtained by recursively applying
$$q(x_{0:k} \mid D_k) = q(x_k \mid x_{0:k-1}, D_k)\, q(x_{0:k-1} \mid D_{k-1})$$

•Sequential Importance Sampling (SIS) Particle Filter

SIS Particle Filter Algorithm
$$[\{x_k^i, w_k^i\}_{i=1}^{N}] = \mathrm{SIS}\big[\{x_{k-1}^i, w_{k-1}^i\}_{i=1}^{N}, z_k\big]$$
• for i = 1 : N
  – Draw a particle: $x_k^i \sim q(x_k \mid x_{k-1}^i, z_k)$
  – Assign a weight: $w_k^i = w_{k-1}^i\, \dfrac{p(z_k \mid x_k^i)\, p(x_k^i \mid x_{k-1}^i)}{q(x_k^i \mid x_{0:k-1}^i, D_k)}$
• end
• (k is the index over time and i is the particle index)
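A minimal SIS step, sketched under assumed 1-D Gaussian models and the common simplification of using the motion model as the proposal q(x_k | x_{k-1}, z_k) = p(x_k | x_{k-1}), so the weight update reduces to w_k = w_{k-1} · p(z_k | x_k):

```python
import numpy as np

rng = np.random.default_rng(2)

def sis_step(x, w, z, motion_std=1.0, meas_std=0.5):
    # Draw each particle from the proposal (here, the motion model)
    x_new = x + rng.normal(0.0, motion_std, size=x.shape)
    # Assign weights: likelihood p(z | x) under a Gaussian sensor model
    lik = np.exp(-0.5 * ((z - x_new) / meas_std) ** 2)
    w_new = w * lik
    return x_new, w_new / w_new.sum()      # normalize the weights

N = 1000
x = rng.normal(0.0, 1.0, size=N)           # initial particles
w = np.full(N, 1.0 / N)                    # uniform initial weights
for z in [0.2, 0.5, 0.9]:                  # sequentially arriving measurements
    x, w = sis_step(x, w, z)
print((w * x).sum())                       # weighted mean state estimate
```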

Rejection Sampling

Let us assume that f(x)<1 for all x

Sample x from a uniform distribution

Sample c from [0,1]

if f(x) > c, keep the sample; otherwise reject the sample

Rejection Sampling

(Figure: rejection sampling on the curve f(x); a sample x is kept when its c falls below f(x), and rejected otherwise.)
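A short sketch of the rejection-sampling loop above; the target f here is an assumed example density bounded by 1, not one from the slides.

```python
import numpy as np

rng = np.random.default_rng(3)

def f(x):
    # Example target with f(x) < 1 for all x in [0, 1]
    return 0.9 * np.exp(-0.5 * ((x - 0.5) / 0.15) ** 2)

samples = []
while len(samples) < 10_000:
    x = rng.uniform(0.0, 1.0)     # sample x from a uniform distribution
    c = rng.uniform(0.0, 1.0)     # sample c from [0, 1]
    if f(x) > c:                  # keep the sample, otherwise reject it
        samples.append(x)

print(np.mean(samples))           # accepted samples are distributed ∝ f
```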

Importance Sampling with Resampling: Landmark Detection Example

Distributions

•Wanted: samples distributed according to p(x| z1, z2, z3)

This is easy!
• We can draw samples from p(x|z_l) by adding noise to the detection parameters.

Importance sampling with Resampling

• After Resampling
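The slides do not commit to a particular resampling scheme; one common choice is the low-variance (systematic) resampler sketched below, which duplicates heavy particles, drops light ones, and resets the weights to uniform.

```python
import numpy as np

def systematic_resample(particles, weights, rng):
    n = len(particles)
    positions = (rng.uniform() + np.arange(n)) / n   # one draw, n strata
    cumsum = np.cumsum(weights)
    idx = np.searchsorted(cumsum, positions)         # invert the weight CDF
    return particles[idx], np.full(n, 1.0 / n)       # weights reset to uniform

rng = np.random.default_rng(6)
p = np.array([0.0, 1.0, 2.0, 3.0])
w = np.array([0.05, 0.05, 0.1, 0.8])
print(systematic_resample(p, w, rng))   # mostly copies of the heavy particle
```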

Particle Filter Algorithm

weight = target distribution / proposal distribution

• Draw $x_{t-1}^i$ from $Bel(x_{t-1})$
• Draw $x_t^i$ from $p(x_t \mid x_{t-1}^i, u_{t-1})$
• Importance factor for $x_t^i$:
$$w_t^i = \frac{\text{target distribution}}{\text{proposal distribution}} = \frac{p(z_t \mid x_t)\, p(x_t \mid x_{t-1}, u_{t-1})\, Bel(x_{t-1})}{p(x_t \mid x_{t-1}, u_{t-1})\, Bel(x_{t-1})} \propto p(z_t \mid x_t)$$
$$Bel(x_t) \propto p(z_t \mid x_t) \int p(x_t \mid x_{t-1}, u_{t-1})\, Bel(x_{t-1})\, dx_{t-1}$$

Particle Filter Algorithm

1. Algorithm particle_filter($S_{t-1}$, $u_{t-1}$, $z_t$):
2. $S_t = \emptyset$, $\eta = 0$
3. For $i = 1, \ldots, n$ — generate new samples:
4.   Sample an index $j(i)$ from the discrete distribution given by $w_{t-1}$
5.   Sample $x_t^i$ from $p(x_t \mid x_{t-1}, u_{t-1})$ using $x_{t-1}^{j(i)}$ and $u_{t-1}$
6.   Compute the importance weight $w_t^i = p(z_t \mid x_t^i)$
7.   Update the normalization factor $\eta = \eta + w_t^i$
8.   Insert $S_t = S_t \cup \{\langle x_t^i, w_t^i\rangle\}$
9. For $i = 1, \ldots, n$:
10.  Normalize the weights: $w_t^i = w_t^i / \eta$
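A compact sketch of this algorithm for a 1-D robot; the Gaussian motion and measurement models and their standard deviations are assumptions for illustration, not part of the algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(4)

def particle_filter(x_prev, w_prev, u, z, meas_std=0.5, motion_std=0.3):
    n = len(x_prev)
    # Steps 3-4: sample indices j(i) from the discrete distribution w_{t-1}
    idx = rng.choice(n, size=n, p=w_prev)
    # Step 5: sample x_t^i from p(x_t | x_{t-1}^{j(i)}, u_{t-1})
    x = x_prev[idx] + u + rng.normal(0.0, motion_std, size=n)
    # Step 6: importance weights w_t^i = p(z_t | x_t^i)
    w = np.exp(-0.5 * ((z - x) / meas_std) ** 2)
    # Steps 7-10: normalize the weights
    return x, w / w.sum()

n = 500
x = rng.uniform(-5.0, 5.0, size=n)     # global initialization: uniform particles
w = np.full(n, 1.0 / n)
for u, z in [(1.0, 1.1), (1.0, 2.0), (1.0, 3.2)]:   # controls and measurements
    x, w = particle_filter(x, w, u, z)
print((w * x).sum())                   # weighted mean pose estimate
```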

Particle Filter for Localization

Particle Filter in Matlab

• Matlab code: truex is a vector of 100 positions to be tracked.

Application: Particle Filter for Localization (Known Map)

Sources
• Longin Jan Latecki • Keith Copsey • Paul E. Rybski • Cyrill Stachniss • Sebastian Thrun • Alex Teichman • Michael Pfeiffer • J. Hightower • L. Liao • D. Schulz • G. Borriello • Honggang Zhang • Wolfram Burgard • Dieter Fox • Giorgio Grisetti • Maren Bennewitz • Christian Plagemann • Dirk Haehnel • Mike Montemerlo • Nick Roy • Kai Arras • Patrick Pfaff • Miodrag Bolic • Haris Baltzakis

Perfect Monte Carlo Simulation
• Recall that $x_{0:k} = \{x_0, \ldots, x_k\}$.
• Random samples $x_{0:k}^i$ are drawn from the posterior distribution.
• Represent the posterior distribution using a set of samples, or particles:
$$p(x_{0:k} \mid D_k) \approx \frac{1}{N} \sum_{i=1}^{N} \delta(x_{0:k} - x_{0:k}^i)$$
• It is easy to approximate expectations of the form
$$E(g(x_{0:k})) = \int g(x_{0:k})\, p(x_{0:k} \mid D_k)\, dx_{0:k}$$
• by:
$$E(g(x_{0:k})) \approx \frac{1}{N} \sum_{i=1}^{N} g(x_{0:k}^i)$$
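A one-screen sketch of this: when we can sample directly from the posterior (here assumed to be a standard normal purely for illustration), expectations are approximated by plain sample means.

```python
import numpy as np

rng = np.random.default_rng(5)

x = rng.normal(0.0, 1.0, size=100_000)    # x^i drawn from the posterior p(x)
print(x.mean(), (x**2).mean())            # approx E[x] = 0 and E[x^2] = 1
```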