
Page 1: Introduction to  Mobile Robotics

Introduction to Mobile Robotics

Bayes Filter Implementations

Gaussian filters

Page 2: Introduction to  Mobile Robotics

Bayes Filter Reminder

• Prediction: $\overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1})\, bel(x_{t-1})\, dx_{t-1}$

• Correction: $bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{bel}(x_t)$

Page 3: Introduction to  Mobile Robotics

Gaussians

Univariate: $p(x) \sim N(\mu, \sigma^2)$:

$p(x) = \dfrac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\dfrac{1}{2}\dfrac{(x-\mu)^2}{\sigma^2}\right)$

Multivariate: $p(\mathbf{x}) \sim N(\boldsymbol{\mu}, \Sigma)$:

$p(\mathbf{x}) = \dfrac{1}{(2\pi)^{d/2}|\Sigma|^{1/2}} \exp\left(-\dfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^T \Sigma^{-1} (\mathbf{x}-\boldsymbol{\mu})\right)$

Page 4: Introduction to  Mobile Robotics

Properties of Gaussians

$X \sim N(\mu, \sigma^2),\quad Y = aX + b \;\Rightarrow\; Y \sim N(a\mu + b,\ a^2\sigma^2)$

$X_1 \sim N(\mu_1, \sigma_1^2),\ X_2 \sim N(\mu_2, \sigma_2^2) \;\Rightarrow\; p(X_1)\cdot p(X_2) \sim N\!\left(\dfrac{\sigma_2^2}{\sigma_1^2+\sigma_2^2}\mu_1 + \dfrac{\sigma_1^2}{\sigma_1^2+\sigma_2^2}\mu_2,\ \dfrac{1}{\sigma_1^{-2}+\sigma_2^{-2}}\right)$

Page 5: Introduction to  Mobile Robotics

Multivariate Gaussians

• We stay in the “Gaussian world” as long as we start with Gaussians and perform only linear transformations.

$X \sim N(\mu, \Sigma),\quad Y = AX + B \;\Rightarrow\; Y \sim N(A\mu + B,\ A\Sigma A^T)$

$X_1 \sim N(\mu_1, \Sigma_1),\ X_2 \sim N(\mu_2, \Sigma_2) \;\Rightarrow\; p(X_1)\cdot p(X_2) \sim N\!\left(\dfrac{\Sigma_2}{\Sigma_1+\Sigma_2}\mu_1 + \dfrac{\Sigma_1}{\Sigma_1+\Sigma_2}\mu_2,\ \dfrac{1}{\Sigma_1^{-1}+\Sigma_2^{-1}}\right)$
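These two properties can be checked numerically. The sketch below (matrices and values are illustrative, not from the slides) verifies the linear-transformation rule by Monte Carlo and evaluates the 1D product formula:

```python
import numpy as np

# Linear transformation: X ~ N(mu, Sigma), Y = AX + B  =>  Y ~ N(A mu + B, A Sigma A^T)
rng = np.random.default_rng(0)
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
A = np.array([[1.0, -1.0], [0.5, 2.0]])
B = np.array([0.0, 3.0])

Y = rng.multivariate_normal(mu, Sigma, size=200_000) @ A.T + B
assert np.allclose(Y.mean(axis=0), A @ mu + B, atol=0.02)
assert np.allclose(np.cov(Y.T), A @ Sigma @ A.T, atol=0.05)

# Product of two Gaussian densities (1D): again Gaussian, with precisions
# adding and the mean pulled toward the more certain factor.
s1, s2 = 1.0, 3.0     # variances
m1, m2 = 0.0, 4.0     # means
s = 1.0 / (1.0 / s1 + 1.0 / s2)
m = s2 / (s1 + s2) * m1 + s1 / (s1 + s2) * m2
print(m, s)   # 1.0 0.75, closer to m1, whose variance is smaller
```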

Page 6: Introduction to  Mobile Robotics

Discrete Kalman Filter

Estimates the state x of a discrete-time controlled process that is governed by the linear stochastic difference equation

$x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t$

with a measurement

$z_t = C_t x_t + \delta_t$

Page 7: Introduction to  Mobile Robotics

Components of a Kalman Filter

$A_t$: Matrix (n×n) that describes how the state evolves from t-1 to t without controls or noise.

$B_t$: Matrix (n×l) that describes how the control $u_t$ changes the state from t-1 to t.

$C_t$: Matrix (k×n) that describes how to map the state $x_t$ to an observation $z_t$.

$\varepsilon_t$, $\delta_t$: Random variables representing the process and measurement noise, assumed to be independent and normally distributed with covariances $R_t$ and $Q_t$, respectively.

Page 8: Introduction to  Mobile Robotics


Kalman Filter Updates in 1D

Page 9: Introduction to  Mobile Robotics

Kalman Filter Updates in 1D

$bel(x_t) = \begin{cases} \mu_t = \bar\mu_t + K_t(z_t - \bar\mu_t) \\ \sigma_t^2 = (1 - K_t)\bar\sigma_t^2 \end{cases}$ with $K_t = \dfrac{\bar\sigma_t^2}{\bar\sigma_t^2 + \sigma_{obs,t}^2}$

$bel(x_t) = \begin{cases} \mu_t = \bar\mu_t + K_t(z_t - C_t\bar\mu_t) \\ \Sigma_t = (I - K_t C_t)\bar\Sigma_t \end{cases}$ with $K_t = \bar\Sigma_t C_t^T (C_t \bar\Sigma_t C_t^T + Q_t)^{-1}$

Page 10: Introduction to  Mobile Robotics

Kalman Filter Updates in 1D

$\overline{bel}(x_t) = \begin{cases} \bar\mu_t = a_t\mu_{t-1} + b_t u_t \\ \bar\sigma_t^2 = a_t^2\sigma_{t-1}^2 + \sigma_{act,t}^2 \end{cases}$

$\overline{bel}(x_t) = \begin{cases} \bar\mu_t = A_t\mu_{t-1} + B_t u_t \\ \bar\Sigma_t = A_t\Sigma_{t-1}A_t^T + R_t \end{cases}$
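As a sanity check, the 1D updates can be run directly; the motion, measurement, and noise values below are illustrative:

```python
# 1D Kalman filter updates, transcribed from the equations above.
# a, b and the noise variances are illustrative values.
def predict(mu, var, u, a=1.0, b=1.0, var_act=0.5):
    # mu_bar = a*mu + b*u ; var_bar = a^2 * var + var_act
    return a * mu + b * u, a * a * var + var_act

def correct(mu_bar, var_bar, z, var_obs=0.5):
    # K = var_bar / (var_bar + var_obs) blends prediction and measurement
    K = var_bar / (var_bar + var_obs)
    return mu_bar + K * (z - mu_bar), (1.0 - K) * var_bar

mu, var = 0.0, 1.0                    # initial belief
mu, var = predict(mu, var, u=2.0)     # predict: mu = 2.0, var = 1.5
mu, var = correct(mu, var, z=2.2)     # correct with a nearby measurement
print(mu, var)                        # 2.15 0.375
```

Note how the corrected mean lands between the prediction (2.0) and the measurement (2.2), and the variance shrinks below both the predicted and the measurement variance.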

Page 11: Introduction to  Mobile Robotics

Kalman Filter Updates

Page 12: Introduction to  Mobile Robotics

Linear Gaussian Systems: Initialization

• Initial belief is normally distributed:

$bel(x_0) = N(x_0;\ \mu_0, \Sigma_0)$

Page 13: Introduction to  Mobile Robotics

Linear Gaussian Systems: Dynamics

• Dynamics are linear function of state and control plus additive noise:

$x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t$

$p(x_t \mid u_t, x_{t-1}) = N(x_t;\ A_t x_{t-1} + B_t u_t,\ R_t)$

$\overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1})\, bel(x_{t-1})\, dx_{t-1}$, with $p(x_t \mid u_t, x_{t-1}) \sim N(x_t;\ A_t x_{t-1} + B_t u_t,\ R_t)$ and $bel(x_{t-1}) \sim N(x_{t-1};\ \mu_{t-1}, \Sigma_{t-1})$

Page 14: Introduction to  Mobile Robotics

Linear Gaussian Systems: Dynamics

$\overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1})\, bel(x_{t-1})\, dx_{t-1}$, with $p(x_t \mid u_t, x_{t-1}) \sim N(x_t;\ A_t x_{t-1} + B_t u_t,\ R_t)$ and $bel(x_{t-1}) \sim N(x_{t-1};\ \mu_{t-1}, \Sigma_{t-1})$

$\overline{bel}(x_t) = \eta \int \exp\left(-\dfrac{1}{2}(x_t - A_t x_{t-1} - B_t u_t)^T R_t^{-1} (x_t - A_t x_{t-1} - B_t u_t)\right) \exp\left(-\dfrac{1}{2}(x_{t-1} - \mu_{t-1})^T \Sigma_{t-1}^{-1} (x_{t-1} - \mu_{t-1})\right) dx_{t-1}$

$\overline{bel}(x_t) = \begin{cases} \bar\mu_t = A_t\mu_{t-1} + B_t u_t \\ \bar\Sigma_t = A_t\Sigma_{t-1}A_t^T + R_t \end{cases}$

Page 15: Introduction to  Mobile Robotics

Linear Gaussian Systems: Observations

• Observations are linear function of state plus additive noise:

$z_t = C_t x_t + \delta_t$

$p(z_t \mid x_t) = N(z_t;\ C_t x_t,\ Q_t)$

$bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{bel}(x_t)$, with $p(z_t \mid x_t) \sim N(z_t;\ C_t x_t,\ Q_t)$ and $\overline{bel}(x_t) \sim N(x_t;\ \bar\mu_t, \bar\Sigma_t)$

Page 16: Introduction to  Mobile Robotics

Linear Gaussian Systems: Observations

$bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{bel}(x_t)$, with $p(z_t \mid x_t) \sim N(z_t;\ C_t x_t,\ Q_t)$ and $\overline{bel}(x_t) \sim N(x_t;\ \bar\mu_t, \bar\Sigma_t)$

$bel(x_t) = \eta \exp\left(-\dfrac{1}{2}(z_t - C_t x_t)^T Q_t^{-1} (z_t - C_t x_t)\right) \exp\left(-\dfrac{1}{2}(x_t - \bar\mu_t)^T \bar\Sigma_t^{-1} (x_t - \bar\mu_t)\right)$

$bel(x_t) = \begin{cases} \mu_t = \bar\mu_t + K_t(z_t - C_t\bar\mu_t) \\ \Sigma_t = (I - K_t C_t)\bar\Sigma_t \end{cases}$ with $K_t = \bar\Sigma_t C_t^T (C_t\bar\Sigma_t C_t^T + Q_t)^{-1}$

Page 17: Introduction to  Mobile Robotics

Kalman Filter Algorithm

1. Algorithm Kalman_filter($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$):

2. Prediction:
3. $\bar\mu_t = A_t\mu_{t-1} + B_t u_t$
4. $\bar\Sigma_t = A_t\Sigma_{t-1}A_t^T + R_t$

5. Correction:
6. $K_t = \bar\Sigma_t C_t^T (C_t\bar\Sigma_t C_t^T + Q_t)^{-1}$
7. $\mu_t = \bar\mu_t + K_t(z_t - C_t\bar\mu_t)$
8. $\Sigma_t = (I - K_t C_t)\bar\Sigma_t$

9. Return $\mu_t$, $\Sigma_t$
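Lines 2-9 translate almost verbatim to NumPy. The sketch below runs one step on an assumed constant-velocity system; the concrete A, B, C, R, Q values are illustrative, not from the slides:

```python
import numpy as np

def kalman_filter(mu, Sigma, u, z, A, B, C, R, Q):
    # Prediction (lines 3-4)
    mu_bar = A @ mu + B @ u
    Sigma_bar = A @ Sigma @ A.T + R
    # Correction (lines 6-8)
    K = Sigma_bar @ C.T @ np.linalg.inv(C @ Sigma_bar @ C.T + Q)
    mu_new = mu_bar + K @ (z - C @ mu_bar)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new

# Illustrative constant-velocity model: state [position, velocity],
# control = acceleration, only the position is measured.
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
C = np.array([[1.0, 0.0]])
R = 0.01 * np.eye(2)       # process noise covariance R_t
Q = np.array([[0.25]])     # measurement noise covariance Q_t

mu, Sigma = np.zeros(2), np.eye(2)
mu, Sigma = kalman_filter(mu, Sigma, np.array([1.0]), np.array([0.02]),
                          A, B, C, R, Q)
print(mu, np.diag(Sigma))  # position variance drops well below its prior of 1
```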

Page 18: Introduction to  Mobile Robotics

The Prediction-Correction-Cycle

Prediction:

$\overline{bel}(x_t) = \begin{cases} \bar\mu_t = a_t\mu_{t-1} + b_t u_t \\ \bar\sigma_t^2 = a_t^2\sigma_{t-1}^2 + \sigma_{act,t}^2 \end{cases}$

$\overline{bel}(x_t) = \begin{cases} \bar\mu_t = A_t\mu_{t-1} + B_t u_t \\ \bar\Sigma_t = A_t\Sigma_{t-1}A_t^T + R_t \end{cases}$

Page 19: Introduction to  Mobile Robotics

The Prediction-Correction-Cycle

Correction:

$bel(x_t) = \begin{cases} \mu_t = \bar\mu_t + K_t(z_t - \bar\mu_t) \\ \sigma_t^2 = (1 - K_t)\bar\sigma_t^2 \end{cases}$, $K_t = \dfrac{\bar\sigma_t^2}{\bar\sigma_t^2 + \sigma_{obs,t}^2}$

$bel(x_t) = \begin{cases} \mu_t = \bar\mu_t + K_t(z_t - C_t\bar\mu_t) \\ \Sigma_t = (I - K_t C_t)\bar\Sigma_t \end{cases}$, $K_t = \bar\Sigma_t C_t^T (C_t\bar\Sigma_t C_t^T + Q_t)^{-1}$

Page 20: Introduction to  Mobile Robotics

The Prediction-Correction-Cycle

Prediction:

$\overline{bel}(x_t) = \begin{cases} \bar\mu_t = a_t\mu_{t-1} + b_t u_t \\ \bar\sigma_t^2 = a_t^2\sigma_{t-1}^2 + \sigma_{act,t}^2 \end{cases}$, $\overline{bel}(x_t) = \begin{cases} \bar\mu_t = A_t\mu_{t-1} + B_t u_t \\ \bar\Sigma_t = A_t\Sigma_{t-1}A_t^T + R_t \end{cases}$

Correction:

$bel(x_t) = \begin{cases} \mu_t = \bar\mu_t + K_t(z_t - \bar\mu_t) \\ \sigma_t^2 = (1 - K_t)\bar\sigma_t^2 \end{cases}$, $K_t = \dfrac{\bar\sigma_t^2}{\bar\sigma_t^2 + \sigma_{obs,t}^2}$

$bel(x_t) = \begin{cases} \mu_t = \bar\mu_t + K_t(z_t - C_t\bar\mu_t) \\ \Sigma_t = (I - K_t C_t)\bar\Sigma_t \end{cases}$, $K_t = \bar\Sigma_t C_t^T (C_t\bar\Sigma_t C_t^T + Q_t)^{-1}$

Page 21: Introduction to  Mobile Robotics

Kalman Filter Summary

• Highly efficient: Polynomial in measurement dimensionality k and state dimensionality n: $O(k^{2.376} + n^2)$

•Optimal for linear Gaussian systems!

•Most robotics systems are nonlinear!

Page 22: Introduction to  Mobile Robotics

Nonlinear Dynamic Systems

•Most realistic robotic problems involve nonlinear functions

$x_t = g(u_t, x_{t-1})$

$z_t = h(x_t)$

Page 23: Introduction to  Mobile Robotics

Linearity Assumption Revisited

Page 24: Introduction to  Mobile Robotics

Non-linear Function

Page 25: Introduction to  Mobile Robotics

EKF Linearization (1)

Page 26: Introduction to  Mobile Robotics

EKF Linearization (2)

Page 27: Introduction to  Mobile Robotics

EKF Linearization (3)

Page 28: Introduction to  Mobile Robotics

EKF Linearization: First Order Taylor Series Expansion

• Prediction:

$g(u_t, x_{t-1}) \approx g(u_t, \mu_{t-1}) + \dfrac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}}(x_{t-1} - \mu_{t-1}) = g(u_t, \mu_{t-1}) + G_t(x_{t-1} - \mu_{t-1})$

• Correction:

$h(x_t) \approx h(\bar\mu_t) + \dfrac{\partial h(\bar\mu_t)}{\partial x_t}(x_t - \bar\mu_t) = h(\bar\mu_t) + H_t(x_t - \bar\mu_t)$

Page 29: Introduction to  Mobile Robotics

EKF Algorithm

1. Extended_Kalman_filter($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$):

2. Prediction:
3. $\bar\mu_t = g(u_t, \mu_{t-1})$    (KF: $\bar\mu_t = A_t\mu_{t-1} + B_t u_t$)
4. $\bar\Sigma_t = G_t\Sigma_{t-1}G_t^T + R_t$    (KF: $\bar\Sigma_t = A_t\Sigma_{t-1}A_t^T + R_t$)

5. Correction:
6. $K_t = \bar\Sigma_t H_t^T (H_t\bar\Sigma_t H_t^T + Q_t)^{-1}$    (KF: $K_t = \bar\Sigma_t C_t^T (C_t\bar\Sigma_t C_t^T + Q_t)^{-1}$)
7. $\mu_t = \bar\mu_t + K_t(z_t - h(\bar\mu_t))$    (KF: $\mu_t = \bar\mu_t + K_t(z_t - C_t\bar\mu_t)$)
8. $\Sigma_t = (I - K_t H_t)\bar\Sigma_t$    (KF: $\Sigma_t = (I - K_t C_t)\bar\Sigma_t$)

9. Return $\mu_t$, $\Sigma_t$

with $G_t = \dfrac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}}$ and $H_t = \dfrac{\partial h(\bar\mu_t)}{\partial x_t}$
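The loop above can be sketched generically. In this sketch the Jacobians $G_t$ and $H_t$ come from finite differences rather than analytic derivatives, and the models (additive motion, range to the origin) are illustrative stand-ins, not the models used later in the slides:

```python
import numpy as np

def jacobian(f, x, eps=1e-6):
    # Finite-difference stand-in for the analytic Jacobians G_t, H_t
    fx = f(x)
    J = np.zeros((len(fx), len(x)))
    for i in range(len(x)):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (f(x + dx) - fx) / eps
    return J

def ekf_step(mu, Sigma, u, z, g, h, R, Q):
    # Prediction (lines 3-4): linearize g around the previous mean
    G = jacobian(lambda x: g(u, x), mu)
    mu_bar = g(u, mu)
    Sigma_bar = G @ Sigma @ G.T + R
    # Correction (lines 6-8): linearize h around the predicted mean
    H = jacobian(h, mu_bar)
    K = Sigma_bar @ H.T @ np.linalg.inv(H @ Sigma_bar @ H.T + Q)
    mu_new = mu_bar + K @ (z - h(mu_bar))
    Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma_bar
    return mu_new, Sigma_new

g = lambda u, x: x + u                          # additive motion (illustrative)
h = lambda x: np.array([np.hypot(x[0], x[1])])  # range to the origin (nonlinear)

mu, Sigma = np.array([3.0, 4.0]), 0.1 * np.eye(2)
R, Q = 0.01 * np.eye(2), np.array([[0.05]])
mu, Sigma = ekf_step(mu, Sigma, np.zeros(2), np.array([5.2]), g, h, R, Q)
print(mu, np.hypot(mu[0], mu[1]))  # mean nudged outward toward range 5.2
```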

Page 30: Introduction to  Mobile Robotics

Localization

• Given
  • Map of the environment.
  • Sequence of sensor measurements.

• Wanted
  • Estimate of the robot’s position.

• Problem classes
  • Position tracking
  • Global localization
  • Kidnapped robot problem (recovery)

“Using sensory information to locate the robot in its environment is the most fundamental problem to providing a mobile robot with autonomous capabilities.” [Cox ’91]

Page 31: Introduction to  Mobile Robotics

Landmark-based Localization

Page 32: Introduction to  Mobile Robotics

1. EKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, m):

Prediction:

2. $G_t = \dfrac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}} = \begin{pmatrix} \frac{\partial x'}{\partial \mu_{t-1,x}} & \frac{\partial x'}{\partial \mu_{t-1,y}} & \frac{\partial x'}{\partial \mu_{t-1,\theta}} \\ \frac{\partial y'}{\partial \mu_{t-1,x}} & \frac{\partial y'}{\partial \mu_{t-1,y}} & \frac{\partial y'}{\partial \mu_{t-1,\theta}} \\ \frac{\partial \theta'}{\partial \mu_{t-1,x}} & \frac{\partial \theta'}{\partial \mu_{t-1,y}} & \frac{\partial \theta'}{\partial \mu_{t-1,\theta}} \end{pmatrix}$   (Jacobian of g w.r.t. location)

3. $V_t = \dfrac{\partial g(u_t, \mu_{t-1})}{\partial u_t} = \begin{pmatrix} \frac{\partial x'}{\partial v_t} & \frac{\partial x'}{\partial \omega_t} \\ \frac{\partial y'}{\partial v_t} & \frac{\partial y'}{\partial \omega_t} \\ \frac{\partial \theta'}{\partial v_t} & \frac{\partial \theta'}{\partial \omega_t} \end{pmatrix}$   (Jacobian of g w.r.t. control)

4. $M_t = \begin{pmatrix} \alpha_1|v_t|^2 + \alpha_2|\omega_t|^2 & 0 \\ 0 & \alpha_3|v_t|^2 + \alpha_4|\omega_t|^2 \end{pmatrix}$   (Motion noise)

5. $\bar\mu_t = g(u_t, \mu_{t-1})$   (Predicted mean)

6. $\bar\Sigma_t = G_t\Sigma_{t-1}G_t^T + V_t M_t V_t^T$   (Predicted covariance)

Page 33: Introduction to  Mobile Robotics

1. EKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, m):

Correction:

2. $\hat z_t = \begin{pmatrix} \sqrt{(m_x - \bar\mu_{t,x})^2 + (m_y - \bar\mu_{t,y})^2} \\ \operatorname{atan2}(m_y - \bar\mu_{t,y},\ m_x - \bar\mu_{t,x}) - \bar\mu_{t,\theta} \end{pmatrix}$   (Predicted measurement mean)

3. $H_t = \dfrac{\partial h(\bar\mu_t, m)}{\partial x_t} = \begin{pmatrix} \frac{\partial r_t}{\partial \bar\mu_{t,x}} & \frac{\partial r_t}{\partial \bar\mu_{t,y}} & \frac{\partial r_t}{\partial \bar\mu_{t,\theta}} \\ \frac{\partial \phi_t}{\partial \bar\mu_{t,x}} & \frac{\partial \phi_t}{\partial \bar\mu_{t,y}} & \frac{\partial \phi_t}{\partial \bar\mu_{t,\theta}} \end{pmatrix}$   (Jacobian of h w.r.t. location)

4. $Q_t = \begin{pmatrix} \sigma_r^2 & 0 \\ 0 & \sigma_\phi^2 \end{pmatrix}$

5. $S_t = H_t\bar\Sigma_t H_t^T + Q_t$   (Pred. measurement covariance)

6. $K_t = \bar\Sigma_t H_t^T S_t^{-1}$   (Kalman gain)

7. $\mu_t = \bar\mu_t + K_t(z_t - \hat z_t)$   (Updated mean)

8. $\Sigma_t = (I - K_t H_t)\bar\Sigma_t$   (Updated covariance)
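Lines 2-3 of the correction can be made concrete for a range-bearing sensor. The landmark and pose values below are illustrative; the Jacobian uses the standard analytic partials of range and bearing with respect to the pose $(x, y, \theta)$:

```python
import numpy as np

def predicted_measurement(mu_bar, m):
    # z_hat: predicted range and bearing to landmark m from pose mu_bar
    dx, dy = m[0] - mu_bar[0], m[1] - mu_bar[1]
    q = dx * dx + dy * dy
    r = np.sqrt(q)
    z_hat = np.array([r, np.arctan2(dy, dx) - mu_bar[2]])
    # H: 2x3 Jacobian of (range, bearing) w.r.t. (x, y, theta)
    H = np.array([[-dx / r, -dy / r, 0.0],
                  [dy / q, -dx / q, -1.0]])
    return z_hat, H

mu_bar = np.array([0.0, 0.0, 0.0])   # robot at origin, facing +x
m = np.array([3.0, 4.0])             # landmark position
z_hat, H = predicted_measurement(mu_bar, m)
print(z_hat)   # range 5.0, bearing atan2(4, 3)
```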

Page 34: Introduction to  Mobile Robotics

EKF Prediction Step

Page 35: Introduction to  Mobile Robotics

EKF Observation Prediction Step

Page 36: Introduction to  Mobile Robotics

EKF Correction Step

Page 37: Introduction to  Mobile Robotics

Estimation Sequence (1)

Page 38: Introduction to  Mobile Robotics

Estimation Sequence (2)

Page 39: Introduction to  Mobile Robotics

Comparison to Ground Truth

Page 40: Introduction to  Mobile Robotics

EKF Summary

• Highly efficient: Polynomial in measurement dimensionality k and state dimensionality n: $O(k^{2.376} + n^2)$

• Not optimal!

• Can diverge if nonlinearities are large!

• Works surprisingly well even when all assumptions are violated!

Page 41: Introduction to  Mobile Robotics

Linearization via Unscented Transform

EKF UKF

Page 42: Introduction to  Mobile Robotics

UKF Sigma-Point Estimate (2)

EKF UKF

Page 43: Introduction to  Mobile Robotics

UKF Sigma-Point Estimate (3)

EKF UKF

Page 44: Introduction to  Mobile Robotics


Unscented Transform Intuition

• Sample a set of 2N+1 points from an N-dimensional Gaussian distribution

• For each sample, compute its mapping via f(·)

• Recover a Gaussian approximation from the samples

Page 45: Introduction to  Mobile Robotics

Unscented Transform

Sigma points:

$\chi^0 = \mu$

$\chi^i = \mu + \left(\sqrt{(n+\lambda)\Sigma}\right)_i$ for $i = 1, \dots, n$

$\chi^i = \mu - \left(\sqrt{(n+\lambda)\Sigma}\right)_{i-n}$ for $i = n+1, \dots, 2n$

Weights:

$w_m^0 = \dfrac{\lambda}{n+\lambda}$, $w_c^0 = \dfrac{\lambda}{n+\lambda} + (1 - \alpha^2 + \beta)$

$w_m^i = w_c^i = \dfrac{1}{2(n+\lambda)}$ for $i = 1, \dots, 2n$

Pass sigma points through nonlinear function:

$\mathcal{Y}^i = g(\chi^i)$

Recover mean and covariance:

$\mu' = \sum_{i=0}^{2n} w_m^i \mathcal{Y}^i$

$\Sigma' = \sum_{i=0}^{2n} w_c^i (\mathcal{Y}^i - \mu')(\mathcal{Y}^i - \mu')^T$
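The sigma-point construction and recovery above can be sketched directly. The scaling parameters alpha, beta, kappa below are illustrative choices (not values from the slides), and the nonlinearity is a polar-to-Cartesian conversion:

```python
import numpy as np

def unscented_transform(mu, Sigma, g, alpha=1.0, beta=2.0, kappa=0.0):
    # Sigma points and weights, following the equations above
    n = len(mu)
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * Sigma)    # matrix square root
    X = np.vstack([mu, mu + L.T, mu - L.T])      # 2n+1 sigma points
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    # Pass the points through g, then recover mean and covariance
    Y = np.array([g(x) for x in X])
    mu_p = wm @ Y
    d = Y - mu_p
    Sigma_p = (wc[:, None] * d).T @ d
    return mu_p, Sigma_p

# Polar (r, theta) -> Cartesian (x, y) as the nonlinear map
g = lambda p: np.array([p[0] * np.cos(p[1]), p[0] * np.sin(p[1])])
mu_p, Sigma_p = unscented_transform(np.array([1.0, 0.0]),
                                    np.diag([0.01, 0.01]), g)
print(mu_p)  # close to (1, 0); the mean is pulled slightly inward by the arc
```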

Page 46: Introduction to  Mobile Robotics

Unscented Prediction

• Construct an N+M dimensional Gaussian from the previous state distribution $x_{t-1}$ and the controls $u_{t-1}$

• For each of the 2(N+M)+1 samples $\langle x, u \rangle^{(i)}$, compute its mapping via $g(x, u)$

• Recover a Gaussian approximation $\bar x_t$ from the samples

Page 47: Introduction to  Mobile Robotics

UKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, m):

Prediction:

$M_t = \begin{pmatrix} \alpha_1|v_t|^2 + \alpha_2|\omega_t|^2 & 0 \\ 0 & \alpha_3|v_t|^2 + \alpha_4|\omega_t|^2 \end{pmatrix}$   (Motion noise)

$Q_t = \begin{pmatrix} \sigma_r^2 & 0 \\ 0 & \sigma_\phi^2 \end{pmatrix}$   (Measurement noise)

$\mu_{t-1}^a = \begin{pmatrix} \mu_{t-1}^T & (0\ 0)^T & (0\ 0)^T \end{pmatrix}^T$   (Augmented state mean)

$\Sigma_{t-1}^a = \begin{pmatrix} \Sigma_{t-1} & 0 & 0 \\ 0 & M_t & 0 \\ 0 & 0 & Q_t \end{pmatrix}$   (Augmented covariance)

$\chi_{t-1}^a = \begin{pmatrix} \mu_{t-1}^a & \mu_{t-1}^a + \gamma\sqrt{\Sigma_{t-1}^a} & \mu_{t-1}^a - \gamma\sqrt{\Sigma_{t-1}^a} \end{pmatrix}$   (Sigma points)

$\bar\chi_t^x = g(u_t + \chi_t^u, \chi_{t-1}^x)$   (Prediction of sigma points)

$\bar\mu_t = \sum_{i=0}^{2L} w_m^i\, \bar\chi_{i,t}^x$   (Predicted mean)

$\bar\Sigma_t = \sum_{i=0}^{2L} w_c^i (\bar\chi_{i,t}^x - \bar\mu_t)(\bar\chi_{i,t}^x - \bar\mu_t)^T$   (Predicted covariance)

Page 48: Introduction to  Mobile Robotics

Unscented Correction

• Sample from the predicted state $\bar x_t$ and the observation noise to obtain the expected measurement $\bar z_t$ via $h(x_t)$.

• Compute the cross-correlation matrix of measurements and states, and perform a Kalman update.

Page 49: Introduction to  Mobile Robotics

UKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, m):

Correction:

$\bar Z_t = h(\bar\chi_t^x) + \chi_t^z$   (Measurement sigma points)

$\hat z_t = \sum_{i=0}^{2L} w_m^i\, \bar Z_{i,t}$   (Predicted measurement mean)

$S_t = \sum_{i=0}^{2L} w_c^i (\bar Z_{i,t} - \hat z_t)(\bar Z_{i,t} - \hat z_t)^T$   (Pred. measurement covariance)

$\Sigma_t^{x,z} = \sum_{i=0}^{2L} w_c^i (\bar\chi_{i,t}^x - \bar\mu_t)(\bar Z_{i,t} - \hat z_t)^T$   (Cross-covariance)

$K_t = \Sigma_t^{x,z} S_t^{-1}$   (Kalman gain)

$\mu_t = \bar\mu_t + K_t(z_t - \hat z_t)$   (Updated mean)

$\Sigma_t = \bar\Sigma_t - K_t S_t K_t^T$   (Updated covariance)

Page 50: Introduction to  Mobile Robotics

1. EKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, m):

Correction:

2. $\hat z_t = \begin{pmatrix} \sqrt{(m_x - \bar\mu_{t,x})^2 + (m_y - \bar\mu_{t,y})^2} \\ \operatorname{atan2}(m_y - \bar\mu_{t,y},\ m_x - \bar\mu_{t,x}) - \bar\mu_{t,\theta} \end{pmatrix}$   (Predicted measurement mean)

3. $H_t = \dfrac{\partial h(\bar\mu_t, m)}{\partial x_t} = \begin{pmatrix} \frac{\partial r_t}{\partial \bar\mu_{t,x}} & \frac{\partial r_t}{\partial \bar\mu_{t,y}} & \frac{\partial r_t}{\partial \bar\mu_{t,\theta}} \\ \frac{\partial \phi_t}{\partial \bar\mu_{t,x}} & \frac{\partial \phi_t}{\partial \bar\mu_{t,y}} & \frac{\partial \phi_t}{\partial \bar\mu_{t,\theta}} \end{pmatrix}$   (Jacobian of h w.r.t. location)

4. $Q_t = \begin{pmatrix} \sigma_r^2 & 0 \\ 0 & \sigma_\phi^2 \end{pmatrix}$

5. $S_t = H_t\bar\Sigma_t H_t^T + Q_t$   (Pred. measurement covariance)

6. $K_t = \bar\Sigma_t H_t^T S_t^{-1}$   (Kalman gain)

7. $\mu_t = \bar\mu_t + K_t(z_t - \hat z_t)$   (Updated mean)

8. $\Sigma_t = (I - K_t H_t)\bar\Sigma_t$   (Updated covariance)

Page 51: Introduction to  Mobile Robotics

UKF Prediction Step

Page 52: Introduction to  Mobile Robotics

UKF Observation Prediction Step

Page 53: Introduction to  Mobile Robotics

UKF Correction Step

Page 54: Introduction to  Mobile Robotics

EKF Correction Step

Page 55: Introduction to  Mobile Robotics

Estimation Sequence

EKF PF UKF

Page 56: Introduction to  Mobile Robotics

Estimation Sequence

EKF UKF

Page 57: Introduction to  Mobile Robotics

Prediction Quality

EKF UKF

Page 58: Introduction to  Mobile Robotics

UKF Summary

• Highly efficient: Same complexity as EKF, typically a constant factor slower in practical applications

•Better linearization than EKF: Accurate in first two terms of Taylor expansion (EKF only first term)

•Derivative-free: No Jacobians needed

•Still not optimal!

Page 59: Introduction to  Mobile Robotics

Kalman Filter-based System

• [Arras et al. 98]:
  • Laser range-finder and vision
  • High precision (<1cm accuracy)

Courtesy of K. Arras

Page 60: Introduction to  Mobile Robotics

Multi-hypothesis Tracking

Page 61: Introduction to  Mobile Robotics

Localization With MHT

• Belief is represented by multiple hypotheses

• Each hypothesis is tracked by a Kalman filter

• Additional problems:
  • Data association: Which observation corresponds to which hypothesis?
  • Hypothesis management: When to add / delete hypotheses?

• Huge body of literature on target tracking, motion correspondence etc.

Page 62: Introduction to  Mobile Robotics

MHT: Implemented System (1)

• Hypotheses are extracted from LRF scans

• Each hypothesis has a probability of being the correct one: $H_i = \{\hat x_i, \Sigma_i, P(H_i)\}$

• Hypothesis probability is computed using Bayes’ rule: $P(H_i \mid s) = \dfrac{P(s \mid H_i)\, P(H_i)}{P(s)}$

• Hypotheses with low probability are deleted.

• New candidates are extracted from LRF scans: $C_j = \{z_j, R_j\}$

[Jensfelt et al. ’00]
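The Bayes-rule reweighting and low-probability pruning described above can be sketched as follows; the priors, likelihoods, and pruning threshold are illustrative:

```python
# Reweight each hypothesis probability P(H_i) by the likelihood P(s|H_i)
# of the sensor data s, normalize by P(s), then prune weak hypotheses.
def update_hypotheses(priors, likelihoods, threshold=0.05):
    # P(H_i | s) = P(s | H_i) * P(H_i) / P(s), with P(s) the normalizer
    joint = [l * p for l, p in zip(likelihoods, priors)]
    p_s = sum(joint)
    posteriors = [j / p_s for j in joint]
    # Delete hypotheses with low probability, renormalize the survivors
    kept = [p if p >= threshold else 0.0 for p in posteriors]
    total = sum(kept)
    return [p / total for p in kept]

priors = [0.5, 0.3, 0.2]
likelihoods = [0.9, 0.05, 0.4]   # scan strongly supports hypothesis 0
post = update_hypotheses(priors, likelihoods)
print(post)   # hypothesis 1 falls below the threshold and is deleted
```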

Page 63: Introduction to  Mobile Robotics

MHT: Implemented System (2)

Courtesy of P. Jensfelt and S. Kristensen

Page 64: Introduction to  Mobile Robotics

MHT: Implemented System (3)

Example run: map and trajectory, # hypotheses vs. time, P(Hbest)

Courtesy of P. Jensfelt and S. Kristensen