Kalman Localization
2.166 Lecture 7, 29 September 2008
These viewgraphs are based heavily on materials generously made available to me by Dr. Paul Newman, Oxford (SSS 06)
Next Topics for Discussion
• EKF localization with beacons (this is research assignment 2)
• Particle Filter localization with beacons (demonstration at the start of class?)
• How to extend this to the real world?
  – How do we get the map? (build it online)
  – Feature extraction (not just points)
  – Matching features to the model (a.k.a. data association or correspondence)
  – Robustness
  – Scalability
Towards EKF SLAM (RA3)
• Some handouts:
  – Seminal paper by Smith, Self, & Cheeseman
  – Tutorial papers by Durrant-Whyte and Bailey
  – SSS 06 lecture notes by Paul Newman
  – Localization using beacons paper from 1991
  – Read Chap. 10 of Thrun, Burgard, and Fox
• Today we will talk about:
  – EKF navigation architectures (P. Newman)
  – Composition and inversion of coordinate transforms (Smith, Self, and Cheeseman)
  – An introductory discussion of SLAM
Overall class roadmap
• Research Assignment 3:
  – EKF SLAM in MATLAB
  – Jacques has provided a template for you, very similar to research assignment 2
  – Data association is provided
  – Features are points
  – Scale of the environment is small
• Time to start thinking substantially about the project (meet JL one-on-one?)
• Next 3-4 lectures will cover SLAM basic machinery, data association, scaling, etc.
• First guest lecture: Oct. 15, Information-Form SLAM, Dr. Matt Walter
Example: AUV Navigation with Beacons
Typical LBL Data (Charles River, 1994)
Navigation results with and without outlier rejection
Much more challenging conditions – Deep Ocean (Juan de Fuca, 1995)
Italy, 2002
[Figures: estimated AUV trajectory; travel time measurements (LBL); navigation without and with outlier rejection.]
Using The EKF in Navigation
From: P. Newman SSS 06
Navigation Architecture (P. Newman)
[Block diagram: a hidden control acts on the physics, which is to be estimated. Encoder counts feed a DR model in the supplied onboard system, whose dead-reckoned output drifts badly. The navigation layer (to be installed by us) integrates this output with other measurements to produce the required, corrected navigation estimates.]
EKF Summary
One Cycle of the Kalman filter
From: Bar-Shalom
Reminder: the i|j notation
x̂(i|j) is the estimate of x at time i given data up to t = j. The estimation error x̃(i|j) = x(i) - x̂(i|j) (true minus estimated) is useful for derivations, but we can never use it in a calculation, as x is the unknown truth!
Nonlinear Kalman Filtering
Same trick as in nonlinear least squares:
• Linearize around a current estimate using the Jacobian
• The problem becomes linear again
• Recalculate the Jacobians at each iteration
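As a concrete illustration (not from the slides), here is a minimal Python sketch of that linearization step, using a hypothetical finite-difference helper to recompute the Jacobian at the current estimate; the beacon location and estimate values are made up:

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Finite-difference Jacobian of f at x (illustrative helper,
    not part of the lecture materials)."""
    fx = f(x)
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (f(x + dx) - fx) / eps
    return J

# Example: linearize a range measurement h(x) = ||x - beacon||
beacon = np.array([10.0, 5.0])
h = lambda x: np.array([np.linalg.norm(x - beacon)])
x_est = np.array([2.0, 1.0])          # current estimate
H = numerical_jacobian(h, x_est)      # recomputed at every new estimate
```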
One Cycle of the EKF
From: Bar-Shalom
Implementing this as part of a robust navigation architecture:
• Asynchronicity
• Prediction covariance inflation
• Update covariance deflation
• Observability
• Correlations
From: P. Newman SSS 06
[Flowchart: one cycle of the asynchronous filter loop]
Prediction (input to plant enters here, e.g. steering wheel angle):
  New control u(k) available?
    yes: x(k|k-1) = F x(k-1|k-1) + B u(k)
         P(k|k-1) = F P(k-1|k-1) F^T + G Q G^T
    no (prediction is last estimate):
         x(k|k-1) = x(k-1|k-1)
         P(k|k-1) = P(k-1|k-1)
Update (data from sensors enters the algorithm here, e.g. compass measurement):
  New observation z(k) available?
    yes (prepare for update):
         v(k) = z(k) - H x(k|k-1)
         S = H P(k|k-1) H^T + R
         W = P(k|k-1) H^T S^-1
         x(k|k) = x(k|k-1) + W v(k)
         P(k|k) = P(k|k-1) - W S W^T
    no (estimate is last prediction):
         x(k|k) = x(k|k-1)
         P(k|k) = P(k|k-1)
k = k+1; delay; repeat.
From: P. Newman SSS 06
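A minimal Python sketch of one cycle of this asynchronous loop (my own structuring, not code from the lecture); F, B, G, H, Q, R are the usual linear-model matrices and are assumed given:

```python
import numpy as np

def kf_cycle(x, P, F, B, G, Q, H, R, u=None, z=None):
    """One cycle of the asynchronous Kalman filter loop above.
    u is None if no new control arrived; z is None if no new observation."""
    # Prediction
    if u is not None:
        x = F @ x + B @ u
        P = F @ P @ F.T + G @ Q @ G.T
    # else: prediction is the last estimate (x, P unchanged)

    # Update
    if z is not None:
        v = z - H @ x                     # innovation
        S = H @ P @ H.T + R               # innovation covariance
        W = P @ H.T @ np.linalg.inv(S)    # Kalman gain
        x = x + W @ v
        P = P - W @ S @ W.T
    # else: estimate is the last prediction
    return x, P
```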
Vehicle Models - Prediction
[Figure: bicycle model in the global reference frame. Vehicle position (x, y), heading θ, steering angle φ, wheelbase L, speed V; the instantaneous center of rotation lies at the intersection of the wheel axes.]
Truth model: the control is the pair (V, φ), and the noise is in the control…
Effect of control noise on uncertainty:
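A hedged sketch of how control noise inflates the pose covariance, assuming a simple Euler-discretized bicycle model (the slides' exact truth model is not reproduced in this transcript; the wheelbase and timestep values are illustrative):

```python
import numpy as np

# Assumed discrete-time bicycle model: state x = [x, y, theta], control u = [V, phi]
L, dt = 2.0, 0.1          # wheelbase and timestep (illustrative values)

def predict(x, u):
    xv, yv, th = x
    V, phi = u
    return np.array([xv + dt * V * np.cos(th),
                     yv + dt * V * np.sin(th),
                     th + dt * V * np.tan(phi) / L])

def jacobians(x, u):
    _, _, th = x
    V, phi = u
    Fx = np.array([[1, 0, -dt * V * np.sin(th)],     # w.r.t. the state
                   [0, 1,  dt * V * np.cos(th)],
                   [0, 0,  1]])
    Fu = np.array([[dt * np.cos(th), 0],             # w.r.t. the control
                   [dt * np.sin(th), 0],
                   [dt * np.tan(phi) / L, dt * V / (L * np.cos(phi) ** 2)]])
    return Fx, Fu

# Covariance prediction: the noise enters through the control
x = np.array([0.0, 0.0, 0.1]); P = np.diag([0.01, 0.01, 0.001])
u = np.array([1.0, 0.05]);     Q = np.diag([0.1, 0.01])   # control noise strength
Fx, Fu = jacobians(x, u)
x_pred = predict(x, u)
P_pred = Fx @ P @ Fx.T + Fu @ Q @ Fu.T
```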
Composing Spatial Relationships (Smith, Self, & Cheeseman, 1990)
[Figure: three frames 1, 2, 3, with relative transforms x_{1,2} (frame 1 to 2), x_{2,3} (frame 2 to 3), and the composed transform x_{1,3}.]
Compounding transformations
O-Plus and O-Minus Notation
Jacobians: J1 and J2
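A minimal Python sketch of the SE(2) compounding operation, its inverse, and the Jacobians J1 and J2, following the standard relations from Smith, Self, & Cheeseman (the function names are mine):

```python
import numpy as np

def oplus(x1, x2):
    """Compound two poses/transforms: x_{1,3} = x_{1,2} (+) x_{2,3}."""
    x, y, th = x1
    return np.array([x + x2[0] * np.cos(th) - x2[1] * np.sin(th),
                     y + x2[0] * np.sin(th) + x2[1] * np.cos(th),
                     th + x2[2]])

def ominus(x):
    """Invert a pose/transform (reverse the relationship)."""
    xx, yy, th = x
    return np.array([-xx * np.cos(th) - yy * np.sin(th),
                      xx * np.sin(th) - yy * np.cos(th),
                     -th])

def J1(x1, x2):
    """Jacobian of oplus w.r.t. its first argument."""
    th = x1[2]
    return np.array([[1, 0, -x2[0] * np.sin(th) - x2[1] * np.cos(th)],
                     [0, 1,  x2[0] * np.cos(th) - x2[1] * np.sin(th)],
                     [0, 0, 1]])

def J2(x1, x2):
    """Jacobian of oplus w.r.t. its second argument."""
    th = x1[2]
    return np.array([[np.cos(th), -np.sin(th), 0],
                     [np.sin(th),  np.cos(th), 0],
                     [0, 0, 1]])

# Covariance of a compounded relationship (independent errors):
# Sigma_13 = J1 Sigma_12 J1^T + J2 Sigma_23 J2^T
```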
Deduce an Incremental Move
[Figure: successive dead-reckoned poses x_o(k-1) and x_o(k) in the global frame, linked by the incremental move u(k).]
The dead-reckoned poses themselves can be in massive error, but the common error is subtracted out when we deduce the move u(k) = (⊖ x_o(k-1)) ⊕ x_o(k). Use this "move" as a control.
Substitution into the prediction equation (using J1 and J2 as Jacobians):
x(k|k-1) = x(k-1|k-1) ⊕ u(k)
P(k|k-1) = J1 P(k-1|k-1) J1^T + J2 U_o J2^T, where U_o is the diagonal covariance matrix (3x3) of the error in u_o.
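A short sketch of this trick, assuming the oplus/ominus/J1/J2 helpers from the previous block are in scope and that U_o is the diagonal control-noise covariance:

```python
import numpy as np
# Assumes oplus, ominus, J1, J2 from the previous sketch are in scope.

def dr_prediction(x_est, P, xo_prev, xo_now, Uo):
    """EKF prediction driven by successive dead-reckoned poses.
    The (possibly huge) common dead-reckoning error cancels in u."""
    u = oplus(ominus(xo_prev), xo_now)     # deduced incremental move
    x_pred = oplus(x_est, u)
    F1, F2 = J1(x_est, u), J2(x_est, u)
    P_pred = F1 @ P @ F1.T + F2 @ Uo @ F2.T
    return x_pred, P_pred
```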
Mapping vs Localisation
Problem Geometry
Landmarks / Features
Things that stand out to a sensor: corners, windows, walls, bright patches, texture…
[Figure: map with a point feature called "i".]
Observations / Measurements
Relative (on-vehicle sensing of the environment):
• Radar
• Cameras
• Odometry (really)
• Sonar
• Laser
Absolute (relies on infrastructure):
• GPS
• Compass
How smart can we be with relative-only measurements?
From Bayes Rule…
The input is measurements conditioned on the map and the vehicle.
We want to use Bayes rule to "invert" this and get maps and vehicles given measurements.
Data: Z^k = {z(1), …, z(k)}
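In symbols, the inversion the slide refers to is the standard recursive Bayes step (a sketch in conventional notation, since the slide's own equations are not reproduced in this transcript):

```latex
% Sensor model: measurements conditioned on vehicle pose x_v and map M.
% Bayes rule "inverts" this to give vehicle and map given measurements:
\[
p(x_v, M \mid Z^k)
  = \frac{p(z(k) \mid x_v, M)\, p(x_v, M \mid Z^{k-1})}{p(z(k) \mid Z^{k-1})}
\]
```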
Problem I - Localisation
KF implementation using o-plus and o-minus
Remember: u is the control, the J's are a fancy way of writing the Jacobians of the composition operator, and Q is the strength of the noise in the plant model.
Plant Model
Processing Data
Implementation
[Plots: location innovation and location covariance over time; the covariance grows where no features are seen.]
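A hedged Python sketch of the update step for this localization problem, under an assumed range-bearing observation of a known beacon (the slides' exact measurement model is not shown in this transcript):

```python
import numpy as np

def beacon_update(x, P, z, beacon, R):
    """EKF update for a range-bearing observation of a known beacon.
    x = [x, y, theta] vehicle pose; z = [range, bearing]."""
    dx, dy = beacon[0] - x[0], beacon[1] - x[1]
    r = np.hypot(dx, dy)
    z_pred = np.array([r, np.arctan2(dy, dx) - x[2]])
    # Jacobian of h w.r.t. the vehicle state
    H = np.array([[-dx / r,    -dy / r,     0],
                  [ dy / r**2, -dx / r**2, -1]])
    v = z - z_pred                                   # innovation
    v[1] = np.arctan2(np.sin(v[1]), np.cos(v[1]))    # wrap bearing to [-pi, pi]
    S = H @ P @ H.T + R
    W = P @ H.T @ np.linalg.inv(S)
    return x + W @ v, P - W @ S @ W.T
```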
Problem II - Mapping
With known vehicle
Map
The state vector is the map
But how is the map built? Key point: the state vector GROWS!
New, bigger map = [old map; observation of the new feature, suitably transformed]
"State augmentation"
How is P augmented? Simple! Use the transformation-of-covariance rule.
G is the feature initialisation function, leading to:
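A sketch of the augmentation rule in Python (my own abstraction): the new feature is x_f = g(x, z), and Gx, Gz denote the Jacobians of g with respect to the state and the observation:

```python
import numpy as np

def augment(x, P, z, g, Gx, Gz, R):
    """Grow the state with a new feature x_f = g(x, z).
    g: initialisation function; Gx, Gz: its Jacobians; R: obs noise."""
    x_new = np.concatenate([x, g(x, z)])
    n, m = P.shape[0], Gx.shape[0]
    P_new = np.zeros((n + m, n + m))
    P_new[:n, :n] = P                                # old block unchanged
    P_new[:n, n:] = P @ Gx.T                         # cross-covariances
    P_new[n:, :n] = Gx @ P
    P_new[n:, n:] = Gx @ P @ Gx.T + Gz @ R @ Gz.T    # new feature block
    return x_new, P_new
```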
So what are models h and f?
h is a function of the feature being observed, e.g. a range and a bearing from the vehicle to feature i (the bearing is the angle from the vehicle to the feature minus the vehicle orientation).
f is simply the identity transformation: the features do not move.
Turn the handle on the EKF:
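A hedged sketch of what "turning the handle" looks like for the mapping problem, under an assumed range-bearing model and a known vehicle pose; note that the observation Jacobian H is zero everywhere except the two columns belonging to feature i:

```python
import numpy as np

def h_and_H(map_state, i, xv):
    """Range-bearing prediction for point feature i in the stacked map state.
    xv = [x, y, theta] is the (known) vehicle pose."""
    xf, yf = map_state[2 * i], map_state[2 * i + 1]
    dx, dy = xf - xv[0], yf - xv[1]
    r = np.hypot(dx, dy)
    z_pred = np.array([r, np.arctan2(dy, dx) - xv[2]])
    # H is zero except the 2x2 block for feature i
    H = np.zeros((2, map_state.size))
    H[:, 2 * i:2 * i + 2] = np.array([[ dx / r,     dy / r],
                                      [-dy / r**2,  dx / r**2]])
    return z_pred, H
```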
All hail the Oracle! How do we know what feature we are observing?
Problem III - SLAM
“Build a map and use it at the same time”
“This is a cornerstone of autonomy”
Bayesian Framework
How is that sum evaluated?
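The "sum" in question is the marginalisation in the standard recursive Bayes filter (sketched here in conventional notation, since the slide's equation is not reproduced in this transcript):

```latex
\[
p(x_k \mid Z^{k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid Z^{k-1})\, dx_{k-1},
\qquad
p(x_k \mid Z^k) \propto p(z_k \mid x_k)\, p(x_k \mid Z^{k-1}).
\]
```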
A current area of interest/debate:
– Monte Carlo methods
– Thin junction trees
– Grid-based techniques
– Kalman filter
• All have their individual pros and cons
• All try to estimate p(xk | Zk), the state of the world given the data
Naïve SLAM
A union of Localisation and Mapping
Why naïve? Computation!
State vector has vehicle AND map
Prediction:
Note: the features stay still, so no noise is added for them and their Jacobian is the identity.
Note: the control is noisy: u = u_nominal + noise.
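A Python sketch of this prediction step (my own structuring): only the vehicle block of the joint state and covariance is propagated, and the vehicle-map cross-covariances are rotated through the vehicle Jacobian, while the map block is untouched:

```python
import numpy as np

def slam_predict(x, P, u, Q, f, Fx, Fu):
    """Naive SLAM prediction. x = [vehicle (3); map (2 per feature)].
    f, Fx, Fu: vehicle motion model and its Jacobians (assumed given)."""
    nv = 3
    x_pred = x.copy()
    x_pred[:nv] = f(x[:nv], u)            # only the vehicle moves
    A, B = Fx(x[:nv], u), Fu(x[:nv], u)
    P_pred = P.copy()
    P_pred[:nv, :nv] = A @ P[:nv, :nv] @ A.T + B @ Q @ B.T
    P_pred[:nv, nv:] = A @ P[:nv, nv:]    # vehicle-map cross terms
    P_pred[nv:, :nv] = P_pred[:nv, nv:].T # map block: identity Jacobian, no noise
    return x_pred, P_pred
```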
Feature Initialisation: this whole function is y(·) from the discussion of state augmentation in the mapping section.
These last two lines are g().
This is our new, expanded covariance.
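A hedged Python sketch of the SLAM version of feature initialisation, assuming a range-bearing observation z = [r, b], so that g(x_v, z) = [x_v + r cos(θ+b), y_v + r sin(θ+b)]; unlike the pure-mapping case, the cross-covariances with the entire existing state must be filled in:

```python
import numpy as np

def slam_add_feature(x, P, z, R):
    """Augment the SLAM state with a feature seen at range/bearing z = [r, b]."""
    xv, yv, th = x[:3]
    r, b = z
    c, s = np.cos(th + b), np.sin(th + b)
    x_new = np.concatenate([x, [xv + r * c, yv + r * s]])   # g(x_v, z)
    Gv = np.array([[1, 0, -r * s],       # d g / d vehicle pose
                   [0, 1,  r * c]])
    Gz = np.array([[c, -r * s],          # d g / d observation
                   [s,  r * c]])
    n = x.size
    P_new = np.zeros((n + 2, n + 2))
    P_new[:n, :n] = P
    cross = Gv @ P[:3, :n]               # cross-covariance with the whole state
    P_new[n:, :n] = cross
    P_new[:n, n:] = cross.T
    P_new[n:, n:] = Gv @ P[:3, :3] @ Gv.T + Gz @ R @ Gz.T
    return x_new, P_new
```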