Page 1: Particle Filter/Monte Carlo Localization

City College of New York

Dr. Jizhong Xiao
Department of Electrical Engineering
CUNY City College
[email protected]

Particle Filter/Monte Carlo Localization

Advanced Mobile Robotics

Probabilistic Robotics:

Page 2: Particle Filter/Monte Carlo Localization


Probabilistic Robotics

Bayes Filter Implementations

Particle filters
Monte Carlo Localization

Page 3: Particle Filter/Monte Carlo Localization


Sample-based Localization (sonar)

Page 4: Particle Filter/Monte Carlo Localization

Particle Filter

Definition: A particle filter is a Bayes-filter implementation that samples the robot's work space with a set of weighted state samples, where the weights are derived from the belief distribution of the previous stage.

Basic principle:
• Set of state hypotheses ("particles")
• Survival of the fittest

Page 5: Particle Filter/Monte Carlo Localization

Why Particle Filters

• Represent belief by random samples
• Estimation of non-Gaussian, nonlinear processes

Particle filtering is a non-parametric inference algorithm:
• suited to tracking non-linear dynamics
• efficiently represents non-Gaussian distributions

Page 6: Particle Filter/Monte Carlo Localization

Function Approximation

Particle sets can be used to approximate functions.

The more particles fall into an interval, the higher the probability of that interval
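As a small illustration of this idea, the Python sketch below builds a histogram approximation from a set of 1-D samples; the interval bounds and bin count are assumptions of the example, not part of the slide.

def histogram_from_particles(samples, x_min, x_max, n_bins):
    """Approximate a density from samples: the more particles fall into a bin,
    the higher that bin's estimated probability."""
    counts = [0] * n_bins
    width = (x_max - x_min) / n_bins
    for x in samples:
        if x_min <= x < x_max:
            counts[int((x - x_min) / width)] += 1
    total = max(len(samples), 1)
    return [c / total for c in counts]

# e.g. samples drawn from some belief distribution
print(histogram_from_particles([0.1, 0.12, 0.13, 0.7, 0.72], 0.0, 1.0, 10))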


Page 7: Particle Filter/Monte Carlo Localization


Particle Filter Projection


Page 8: Particle Filter/Monte Carlo Localization

Rejection Sampling

• Let us assume that f(x) < 1 for all x
• Sample x from a uniform distribution
• Sample c from [0,1]
• If f(x) > c, keep the sample; otherwise reject it
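A minimal Python sketch of this procedure; the bimodal target f in the example is an illustrative assumption, chosen only so that f(x) < 1 everywhere.

import math, random

def rejection_sample(f, n, x_min=0.0, x_max=1.0):
    """Rejection sampling as on the slide: propose x uniformly, accept if f(x) > c."""
    samples = []
    while len(samples) < n:
        x = random.uniform(x_min, x_max)   # sample x from a uniform distribution
        c = random.uniform(0.0, 1.0)       # sample c from [0, 1]
        if f(x) > c:                       # keep the sample, otherwise reject it
            samples.append(x)
    return samples

# Illustrative bimodal target with f(x) < 1 for all x
f = lambda x: 0.5 * math.exp(-((x - 0.3) / 0.1) ** 2) + 0.9 * math.exp(-((x - 0.75) / 0.05) ** 2)
samples = rejection_sample(f, 1000)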

Page 9: Particle Filter/Monte Carlo Localization


Importance Sampling Principle

We can even use a different distribution g to generate samples from f.

By introducing an importance weight w, we can account for the "differences between g and f":

w = f / g

f is often called the target distribution
g is often called the proposal distribution

Page 10: Particle Filter/Monte Carlo Localization


Importance Sampling

Samples cannot be drawn conveniently from the target distribution f.

Instead, the importance sampler draws samples from the proposal distribution g, which has a simpler form.

A sample of f is obtained by attaching the weight f/g to each sample x.

Importance factor w:

w = target distribution / proposal distribution = f(x) / g(x)
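The weighting scheme can be sketched in a few lines of Python; the Gaussian target and proposal below are illustrative assumptions, chosen only so that both densities can be evaluated.

import math, random

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

target   = lambda x: gauss_pdf(x, 1.0, 0.2)   # f: the distribution we actually want
proposal = lambda x: gauss_pdf(x, 0.0, 2.0)   # g: the distribution we can sample from

# Draw from g, attach importance weights w = f/g, then normalize
samples = [random.gauss(0.0, 2.0) for _ in range(5000)]
weights = [target(x) / proposal(x) for x in samples]
total = sum(weights)
weights = [w / total for w in weights]

# The weighted sample set now approximates f: e.g. its weighted mean is close to 1.0
print(sum(w * x for w, x in zip(weights, samples)))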

Page 11: Particle Filter/Monte Carlo Localization


Particle Filter Basics

Sample: Randomly select M particles based on weights (same particle may be picked multiple times)

Predict: Move particles according to deterministic dynamics

Measure: Get a likelihood for each new sample by making a prediction about the image's local appearance and comparing; then update the particle's weight accordingly


Page 12: Particle Filter/Monte Carlo Localization


Particle Filter Basics

Particle filters represent a distribution by a set of samples drawn from the posterior distribution.

The denser a sub-region of the state space is populated by samples, the more likely it is that the true state falls into this region.

Such a representation is approximate, but it is nonparametric and can therefore represent a much broader space of distributions than Gaussians.

The weights of the particles are given by the measurement model.

Re-sampling redistributes the particles approximately according to the posterior.

The re-sampling step is a probabilistic implementation of the Darwinian idea of survival of the fittest: it refocuses the particle set on regions of the state space with high posterior probability.

By doing so, it focuses the computational resources of the filter algorithm on the regions of the state space where they matter the most.

Page 13: Particle Filter/Monte Carlo Localization


Importance Sampling with Resampling: Landmark Detection Example

Page 14: Particle Filter/Monte Carlo Localization


Distributions

Page 15: Particle Filter/Monte Carlo Localization


Distributions

Wanted: samples distributed according to p(x| z1, z2, z3)

Page 16: Particle Filter/Monte Carlo Localization


This is Easy! We can draw samples from p(x|zl) by adding noise to the detection parameters.
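For instance, in Python; the detection pose and the noise level below are made-up values for illustration.

import random

def sample_from_detection(detected_pose, sigma=0.05):
    """Draw a sample from p(x | z_l) by perturbing the detection parameters
    (x, y, heading) with Gaussian noise of assumed standard deviation sigma."""
    return tuple(v + random.gauss(0.0, sigma) for v in detected_pose)

# A landmark detection suggests the robot is near pose (2.0, 1.5, 0.3)
samples = [sample_from_detection((2.0, 1.5, 0.3)) for _ in range(100)]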

Page 17: Particle Filter/Monte Carlo Localization


Importance Sampling with Resampling

Target distribution f:

   p(x | z1, ..., zn) = [ ∏k p(zk | x) ] p(x) / p(z1, ..., zn)

Sampling distribution g:

   p(x | zl) = p(zl | x) p(x) / p(zl)

Importance weights w = f / g:

   w = p(x | z1, ..., zn) / p(x | zl) = p(zl) ∏k≠l p(zk | x) / p(z1, ..., zn)

Page 18: Particle Filter/Monte Carlo Localization


Importance Sampling with Resampling

Weighted samples → after resampling

Page 19: Particle Filter/Monte Carlo Localization


Particle Filter Algorithm

• Draw x_{t-1}^i from Bel(x_{t-1})
• Draw x_t^i from p(x_t | x_{t-1}^i, u_{t-1})
• Importance factor for x_t^i:

   w_t^i = target distribution / proposal distribution
         = η p(z_t | x_t) p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) / [ p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) ]
         ∝ p(z_t | x_t)

   where  Bel(x_t) = η p(z_t | x_t) ∫ p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) dx_{t-1}

• Prediction (Action):      bel⁻(x_t) = ∫ p(x_t | u_t, x_{t-1}) bel(x_{t-1}) dx_{t-1}

• Correction (Measurement): bel(x_t) = η p(z_t | x_t) bel⁻(x_t)

Page 20: Particle Filter/Monte Carlo Localization


Particle Filter Algorithm


• Each particle is a hypothesis about the true world state at time t
• Sampling from the state transition distribution
• Importance factor: incorporates the measurement into the particle set
• Re-sampling / importance sampling

Page 21: Particle Filter/Monte Carlo Localization


Particle Filter Algorithm

Line 4 – hypothetical state
- sampling from the state transition distribution
- the set of particles obtained after M iterations is the filter's representation of the predicted belief bel⁻(x_t)

Line 5 – importance factor
- incorporates the measurement z_t into the particle set
- the set of weighted particles represents the Bayes filter posterior:  bel(x_t) ∝ p(z_t | x_t^[m]) bel⁻(x_t)

Lines 8 to 11 – importance sampling
- the particle distribution changes by incorporating the importance weights into the re-sampling process
- survival of the fittest: after the resampling step, the particle set is refocused on the regions of state space with higher posterior probability, distributed according to bel(x_t)

Page 22: Particle Filter/Monte Carlo Localization


Particle Filter Algorithm (from Fox's slides)

1.  Algorithm particle_filter(S_{t-1}, u_{t-1}, z_t):
2.     S_t = ∅,  α = 0
3.     For i = 1 … n                                      (Generate new samples)
4.        Sample index j(i) from the discrete distribution given by w_{t-1}
5.        Sample x_t^i from p(x_t | x_{t-1}, u_{t-1}) using x_{t-1}^{j(i)} and u_{t-1}
6.        w_t^i = p(z_t | x_t^i)                          (Compute importance weight)
7.        α = α + w_t^i                                   (Update normalization factor)
8.        S_t = S_t ∪ {⟨x_t^i, w_t^i⟩}                    (Insert)
9.     For i = 1 … n
10.       w_t^i = w_t^i / α                               (Normalize weights)
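A compact Python rendering of this update follows; it is only a sketch, and the motion_sample and measurement_likelihood callables stand in for p(x_t | x_{t-1}, u_{t-1}) and p(z_t | x_t), which the slides leave abstract.

import random

def particle_filter(S_prev, u_prev, z_t, motion_sample, measurement_likelihood):
    """One update of the weighted particle set S_prev (list of (state, weight) pairs)."""
    states, weights = zip(*S_prev)
    S_t, alpha = [], 0.0
    for _ in range(len(S_prev)):
        # Line 4: sample an index from the discrete distribution given by the weights
        x_prev = random.choices(states, weights=weights, k=1)[0]
        # Line 5: sample a new state from the motion model
        x_t = motion_sample(x_prev, u_prev)
        # Line 6: importance weight from the measurement model
        w_t = measurement_likelihood(z_t, x_t)
        alpha += w_t                       # Line 7: normalization factor
        S_t.append((x_t, w_t))             # Line 8: insert
    # Lines 9-10: normalize weights
    return [(x, w / alpha) for (x, w) in S_t]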

Page 23: Particle Filter/Monte Carlo Localization


Resampling

• Given: Set S of weighted samples.

• Wanted: a random sample, where the probability of drawing x_i is given by w_i.

• Typically done n times with replacement to generate new sample set S’.
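In Python, this n-fold drawing with replacement might look like the following sketch (particle states and weights are assumed to be paired in a list):

import random

def resample(S):
    """Draw len(S) times with replacement; the probability of drawing x_i is w_i."""
    states = [x for x, _ in S]
    weights = [w for _, w in S]
    n = len(S)
    drawn = random.choices(states, weights=weights, k=n)
    return [(x, 1.0 / n) for x in drawn]   # resampled set S' with uniform weights

The systematic resampler on the next slide achieves the same goal with a single random number and lower variance.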

Page 24: Particle Filter/Monte Carlo Localization


Resampling Algorithm

1.  Algorithm systematic_resampling(S, n):
2.     S' = ∅,  c_1 = w^1
3.     For i = 2 … n                          (Generate cdf)
4.        c_i = c_{i-1} + w^i
5.     u_1 ~ U[0, 1/n],  i = 1                (Initialize threshold)
6.     For j = 1 … n                          (Draw samples …)
7.        While ( u_j > c_i )                 (Skip until next threshold reached)
8.           i = i + 1
9.        S' = S' ∪ {⟨x^i, 1/n⟩}              (Insert)
10.       u_{j+1} = u_j + 1/n                 (Increment threshold)
11.    Return S'

Also called stochastic universal sampling.
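The same procedure, written as a Python sketch (the weights are assumed to be normalized so that the cdf ends at 1):

import random

def systematic_resampling(S, n):
    """Systematic (stochastic universal) resampling of the weighted set S,
    a list of (state, weight) pairs with weights summing to 1."""
    states = [x for x, _ in S]
    weights = [w for _, w in S]
    # Lines 2-4: cumulative distribution of the weights
    cdf, c = [], 0.0
    for w in weights:
        c += w
        cdf.append(c)
    u = random.uniform(0.0, 1.0 / n)           # Line 5: single random start threshold
    S_new, i = [], 0
    for _ in range(n):                         # Line 6: draw n samples
        while u > cdf[i] and i < len(cdf) - 1:
            i += 1                             # Lines 7-8: skip until threshold reached
        S_new.append((states[i], 1.0 / n))     # Line 9: insert with weight 1/n
        u += 1.0 / n                           # Line 10: increment threshold
    return S_new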

Page 25: Particle Filter/Monte Carlo Localization


Particle Filters

Pose particles drawn at random and uniformly over the entire pose space

Page 26: Particle Filter/Monte Carlo Localization


Bel(x) ← α p(z | x) Bel(x)

w ← α p(z | x) Bel(x) / Bel(x) = α p(z | x)

Sensor Information: Importance Sampling

After the robot senses the door, MCL assigns an importance factor to each particle.

Page 27: Particle Filter/Monte Carlo Localization


Bel(x) ← ∫ p(x | u, x') Bel(x') dx'

Robot Motion

Incorporating the robot motion and resampling leads to a new particle set with uniform importance weights, but with an increased number of particles near the three likely places.

Page 28: Particle Filter/Monte Carlo Localization


Bel(x) ← α p(z | x) Bel(x)

w ← α p(z | x) Bel(x) / Bel(x) = α p(z | x)

Sensor Information: Importance Sampling

The new measurement assigns non-uniform importance weights to the particle set; most of the cumulative probability mass is now centered on the second door.

Page 29: Particle Filter/Monte Carlo Localization


Robot Motion

Bel(x) ← ∫ p(x | u, x') Bel(x') dx'

Further motion leads to another re-sampling step, and a step in which a new particle set is generated according to the motion model

Page 30: Particle Filter/Monte Carlo Localization


Practical Considerations

Particles represent a discrete approximation of continuous beliefs.

1. Density Extraction (Estimation): extracting a continuous density from the samples.

Density estimation methods (a sketch of the first and third appears below):
• Gaussian approximation – unimodal distributions
• Histogram – multi-modal distributions; the probability of each bin is obtained by summing the weights of the particles that fall in its range
• Kernel Density Estimation – multi-modal, smooth; each particle is used as a kernel (e.g., a Gaussian kernel placed at each particle), and the overall density is a mixture of the kernel densities
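A 1-D Python sketch of two of these methods; the kernel bandwidth is an assumed tuning parameter, not a value from the slides.

import math

def gaussian_approximation(particles):
    """Fit a single Gaussian: weighted mean and variance of (x, w) particles
    whose weights sum to 1 (suitable for unimodal beliefs)."""
    mean = sum(w * x for x, w in particles)
    var = sum(w * (x - mean) ** 2 for x, w in particles)
    return mean, var

def kernel_density(particles, query, bandwidth=0.1):
    """Kernel density estimate: place a Gaussian kernel at each particle and
    evaluate the weighted mixture at 'query'."""
    norm = 1.0 / (bandwidth * math.sqrt(2.0 * math.pi))
    return sum(w * norm * math.exp(-0.5 * ((query - x) / bandwidth) ** 2)
               for x, w in particles)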

Page 31: Particle Filter/Monte Carlo Localization


Different Ways of Extracting Densities from Particles


Gaussian Approximation

Histogram Approximation

Kernel density Estimate

Page 32: Particle Filter/Monte Carlo Localization


Properties of Particle Filters

Sampling Variance
• Variation due to random sampling
• The sampling variance decreases with the number of samples
• A higher number of samples results in more accurate approximations with less variability
• If enough samples are chosen, the sample-based belief is "close enough" to the true belief implied by the robot's observations

Page 33: Particle Filter/Monte Carlo Localization


Drawbacks

• In order to explore a significant part of the state space, the number of particles must be very large, which induces complexity problems and is not well suited to real-time implementation.
• Particle filter methods are very sensitive to inconsistent measurements and to high measurement errors.

Page 34: Particle Filter/Monte Carlo Localization


Page 35: Particle Filter/Monte Carlo Localization


Importance Sampling with Resampling: Reducing Sampling Error

1. Variance reduction
- Reduce the frequency of resampling.
- Maintain the importance weights in memory and update them as follows: when no resampling takes place, w_t^[m] = p(z_t | x_t^[m]) w_{t-1}^[m]; whenever resampling takes place, reset w_t^[m] = 1.
- Resampling too often increases the risk of losing diversity; resampling too infrequently wastes many samples in regions of low probability.

2. Low variance sampling
- Computes a single random number; any number points to exactly one particle.
- Instead of choosing M random numbers and selecting particles accordingly, this procedure computes a single random number and selects samples according to this number, still with a probability proportional to the sample weight.

Page 36: Particle Filter/Monte Carlo Localization


Low variance resampling


Draw a random number r in the interval [0, 1/M].

Select particles by repeatedly adding the fixed amount 1/M to r and choosing the particle that corresponds to the resulting number.
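As a Python sketch (M particles with weights summing to 1); this mirrors the stochastic universal sampling routine given earlier.

import random

def low_variance_resampling(particles, weights):
    """Pick M particles using a single random number r in [0, 1/M] and the
    equally spaced thresholds r, r + 1/M, r + 2/M, ..."""
    M = len(particles)
    r = random.uniform(0.0, 1.0 / M)
    c = weights[0]                      # running cumulative weight
    i = 0
    selected = []
    for m in range(M):
        u = r + m / M                   # m-th threshold
        while u > c and i < M - 1:
            i += 1
            c += weights[i]
        selected.append(particles[i])
    return selected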

Page 37: Particle Filter/Monte Carlo Localization


Low Variance Resampling Procedure

Principle:
• Selection involves a sequential stochastic process
• Samples are selected with respect to a single random number r, but with a probability proportional to the sample weight

Page 38: Particle Filter/Monte Carlo Localization


Advantages of Particle Filters:

• can deal with non-linearities
• can deal with non-Gaussian noise
• mostly parallelizable
• easy to implement
• PFs focus adaptively on probable regions of state space

Page 39: Particle Filter/Monte Carlo Localization


Summary

• Particle filters are an implementation of recursive Bayesian filtering.
• They represent the posterior by a set of weighted samples.
• In the context of localization, the particles are propagated according to the motion model.
• They are then weighted according to the likelihood of the observations.
• In a re-sampling step, new particles are drawn with a probability proportional to the likelihood of the observation.

Page 40: Particle Filter/Monte Carlo Localization


Page 41: Particle Filter/Monte Carlo Localization


Monte Carlo Localization

• One of the most popular particle filter methods for robot localization

• Reference:
  – Dieter Fox, Wolfram Burgard, Frank Dellaert, Sebastian Thrun, "Monte Carlo Localization: Efficient Position Estimation for Mobile Robots", Proc. 16th National Conference on Artificial Intelligence (AAAI-99), July 1999

Page 42: Particle Filter/Monte Carlo Localization


MCL in Action

"Monte Carlo" localization refers to the resampling of the distribution each time a new observation is integrated.

Page 43: Particle Filter/Monte Carlo Localization


Monte Carlo Localization

• The probability density function is represented by samples randomly drawn from it.
• It can also represent multi-modal distributions, and thus localize the robot globally.
• It considerably reduces the amount of memory required and can integrate measurements at a higher rate.
• The state is not discretized, so the method is more accurate than grid-based methods.
• It is easy to implement.

Page 44: Particle Filter/Monte Carlo Localization


Robot modeling

p( z | x, m )   sensor model
(map m and location x)

(figure: two potential observations z, with p( z | x, m ) = .75 for one and p( z | x, m ) = .05 for the other)

Page 45: Particle Filter/Monte Carlo Localization


Robot modeling

p( z | x, m ) sensor model

p( x_new | x_old, u, m ) action model

“probabilistic kinematics” -- encoder uncertainty

• red lines indicate the commanded action
• the cloud indicates the likelihood of various final states


Page 46: Particle Filter/Monte Carlo Localization


Robot modeling: how-to

p(z | x, m ) sensor model

p( x_new | x_old, u, m ) action model

(0) Model the physics of the sensor/actuators (with error estimates): theoretical modeling

(1) Measure lots of sensing/action results and create a model from them: empirical modeling

• take N measurements, find the mean (μ) and standard deviation (σ), and then use a Gaussian model
• or use some other easily manipulated model, e.g.

   p( x ) = 0 if |x - μ| > σ,  1 otherwise
   p( x ) = 0 if |x - μ| > σ,  1 - |x - μ|/σ otherwise

(2) Make something up ...
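A tiny Python sketch of the empirical option (1); the readings below are invented numbers, and a Gaussian is only one of the easily manipulated models the slide mentions.

import math

def fit_gaussian(measurements):
    """Take N measurements and fit the mean and standard deviation."""
    n = len(measurements)
    mu = sum(measurements) / n
    sigma = math.sqrt(sum((m - mu) ** 2 for m in measurements) / n)
    return mu, sigma

def gaussian_sensor_model(z, mu, sigma):
    """Evaluate p(z | x, m) under the fitted Gaussian model."""
    return math.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# e.g. repeated range readings of a wall known to be 2.0 m away
readings = [1.96, 2.04, 1.99, 2.02, 2.01, 1.97, 2.03]
mu, sigma = fit_gaussian(readings)
print(gaussian_sensor_model(2.00, mu, sigma))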

Page 47: Particle Filter/Monte Carlo Localization


Motion Model Reminder

(figure: motion samples spreading from the Start pose)

Page 48: Particle Filter/Monte Carlo Localization


Proximity Sensor Model Reminder

(figure panels: laser sensor, sonar sensor)

Page 49: Particle Filter/Monte Carlo Localization


Monte Carlo Localization

Start by assuming p( x0 ) is the uniform distribution.

Take K samples of x0 and weight each with an importance factor of 1/K.

Page 50: Particle Filter/Monte Carlo Localization


Monte Carlo Localization

Start by assuming p( x0 ) is the uniform distribution.

Take K samples of x0 and weight each with an importance factor of 1/K.

Get the current sensor observation, z1.

For each sample point x0, multiply the importance factor by p(z1 | x0, m).

Page 51: Particle Filter/Monte Carlo Localization


Monte Carlo Localization

Start by assuming p( x0 ) is the uniform distribution.

Take K samples of x0 and weight each with an importance factor of 1/K.

Get the current sensor observation, z1.

For each sample point x0, multiply the importance factor by p(z1 | x0, m).

Normalize (make sure the importance factors add to 1).

You now have an approximation of p(x1 | z1, …, m), and the distribution is no longer uniform.

Page 52: Particle Filter/Monte Carlo Localization


Monte Carlo Localization

Start by assuming p( x0 ) is the uniform distribution.

Take K samples of x0 and weight each with an importance factor of 1/K.

Get the current sensor observation, z1.

For each sample point x0, multiply the importance factor by p(z1 | x0, m).

Normalize (make sure the importance factors add to 1).

You now have an approximation of p(x1 | z1, …, m), and the distribution is no longer uniform.

Create x1 samples by dividing up large clumps: each point spawns new ones in proportion to its importance factor.

Page 53: Particle Filter/Monte Carlo Localization


Monte Carlo Localization

Start by assuming p( x0 ) is the uniform distribution.

Take K samples of x0 and weight each with an importance factor of 1/K.

Get the current sensor observation, z1.

For each sample point x0, multiply the importance factor by p(z1 | x0, m).

Normalize (make sure the importance factors add to 1).

You now have an approximation of p(x1 | z1, …, m), and the distribution is no longer uniform.

Create x1 samples by dividing up large clumps: each point spawns new ones in proportion to its importance factor.

The robot moves, u1. For each sample x1, move it according to the model p(x2 | u1, x1, m).
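The steps above combine into one update loop. The Python sketch below assumes motion_sample and sensor_likelihood callables for p(x2 | u1, x1, m) and p(z | x, m), which the slides describe only pictorially.

import random

def mcl_update(particles, u, z, motion_sample, sensor_likelihood):
    """One Monte Carlo Localization cycle over K (pose, weight) particles."""
    K = len(particles)
    # Move each sample according to the motion model
    moved = [(motion_sample(x, u), w) for x, w in particles]
    # Multiply each importance factor by the observation likelihood and normalize
    weighted = [(x, w * sensor_likelihood(z, x)) for x, w in moved]
    total = sum(w for _, w in weighted)
    weighted = [(x, w / total) for x, w in weighted]
    # Resample: large clumps of probability spawn proportionally more samples
    poses = [x for x, _ in weighted]
    weights = [w for _, w in weighted]
    resampled = random.choices(poses, weights=weights, k=K)
    return [(x, 1.0 / K) for x in resampled]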

Page 54: Particle Filter/Monte Carlo Localization


Monte Carlo Localization


Page 55: Particle Filter/Monte Carlo Localization


MCL in Action

"Monte Carlo" localization refers to the resampling of the distribution each time a new observation is integrated.

Page 56: Particle Filter/Monte Carlo Localization


Initial Distribution

Page 57: Particle Filter/Monte Carlo Localization


After Incorporating Ten Ultrasound Scans

Page 58: Particle Filter/Monte Carlo Localization


After Incorporating 65 Ultrasound Scans

Page 59: Particle Filter/Monte Carlo Localization


Estimated Path

Page 60: Particle Filter/Monte Carlo Localization


Using Ceiling Maps for Localization

Page 61: Particle Filter/Monte Carlo Localization


Vision-based Localization

(figure: measurement z, expected measurement h(x), and likelihood P(z|x))

Page 62: Particle Filter/Monte Carlo Localization


Under a Light

Measurement z:   P(z|x):

Page 63: Particle Filter/Monte Carlo Localization


Next to a Light

Measurement z:   P(z|x):

Page 64: Particle Filter/Monte Carlo Localization


Elsewhere

Measurement z:   P(z|x):

Page 65: Particle Filter/Monte Carlo Localization


Global Localization Using Vision

Page 66: Particle Filter/Monte Carlo Localization


Robots in Action: Albert

Page 67: Particle Filter/Monte Carlo Localization


Application: Rhino and Albert Synchronized in Munich and Bonn

[Robotics And Automation Magazine, to appear]

Page 68: Particle Filter/Monte Carlo Localization


Localization for AIBO robots

Page 69: Particle Filter/Monte Carlo Localization


Limitations

• The approach described so far is able to
  – track the pose of a mobile robot, and
  – globally localize the robot.

• How can we deal with localization errors (i.e., the kidnapped robot problem)?

Page 70: Particle Filter/Monte Carlo Localization


Approaches

• Randomly insert samples (the robot can be teleported at any point in time).

• Insert random samples proportional to the average likelihood of the particles (the robot has been teleported with higher probability when the likelihood of its observations drops).
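A hedged Python sketch of the second idea; the random_pose generator, the un-normalized likelihoods, and the expected average likelihood are assumptions of the example rather than part of the slides.

import random

def resample_with_random_injection(particles, likelihoods, random_pose,
                                   expected_avg_likelihood):
    """Resampling step that also injects random poses: the lower the average
    measurement likelihood, the more random samples are inserted."""
    M = len(particles)
    avg = sum(likelihoods) / M            # average un-normalized p(z | x) over particles
    p_random = max(0.0, 1.0 - avg / expected_avg_likelihood)
    new_particles = []
    for _ in range(M):
        if random.random() < p_random:
            new_particles.append(random_pose())    # kidnapped-robot hypothesis
        else:
            new_particles.append(random.choices(particles, weights=likelihoods, k=1)[0])
    return new_particles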

Page 71: Particle Filter/Monte Carlo Localization


Random Samples: Vision-Based Localization

936 images, 4 MB, 0.6 secs/image
Trajectory of the robot:

Page 72: Particle Filter/Monte Carlo Localization


Odometry Information

Page 73: Particle Filter/Monte Carlo Localization


Image Sequence

Page 74: Particle Filter/Monte Carlo Localization


Resulting Trajectories

Position tracking:

Page 75: Particle Filter/Monte Carlo Localization


Resulting Trajectories

Global localization:

Page 76: Particle Filter/Monte Carlo Localization


Global Localization

Page 77: Particle Filter/Monte Carlo Localization


Kidnapping the Robot

Page 78: Particle Filter/Monte Carlo Localization


Summary

• Particle filters are an implementation of recursive Bayesian filtering.
• They represent the posterior by a set of weighted samples.
• In the context of localization, the particles are propagated according to the motion model.
• They are then weighted according to the likelihood of the observations.
• In a re-sampling step, new particles are drawn with a probability proportional to the likelihood of the observation.

Page 79: Particle Filter/Monte Carlo Localization


References

1. Dieter Fox, Wolfram Burgard, Frank Dellaert, Sebastian Thrun, “Monte Carlo Localization: Efficient Position Estimation for Mobile Robots”, Proc. 16th National Conference on Artificial Intelligence, AAAI’99, July 1999

2. Dieter Fox, Wolfram Burgard, Sebastian Thrun, “Markov Localization for Mobile Robots in Dynamic Environments”, J. of Artificial Intelligence Research 11 (1999) 391-427

3. Sebastian Thrun, “Probabilistic Algorithms in Robotics”, Technical Report CMU-CS-00-126, School of Computer Science, Carnegie Mellon University, Pittsburgh, USA, 2000

Page 80: Particle Filter/Monte Carlo Localization


Thank You