Course Calendar

Class  Date          Contents
1      Sep. 26       Course information & course overview
2      Oct. 4        Bayes Estimation
3      Oct. 11       Classical Bayes Estimation: Kalman Filter
4      Oct. 18       Simulation-based Bayesian Methods
5      Oct. 25       Modern Bayesian Estimation: Particle Filter
6      Nov. 1        HMM (Hidden Markov Model)
-      Nov. 8        No class
7      Nov. 15       Supervised Learning
8      Nov. 29       Bayesian Decision
9      Dec. 6        PCA (Principal Component Analysis)
10     Dec. 13       ICA (Independent Component Analysis)
11     Dec. 20       Applications of PCA and ICA
12     Dec. 27       Clustering, k-means, et al.
13     Jan. 17       Other Topics 1: Kernel machines
14     Jan. 22 (Tue) Other Topics 2
Lecture Plan

Simulation-based Bayesian Estimation - Prelude to the Particle Filter -

1. Why simulation-based?
2. Monte Carlo Sampling Methods
   - Historical example
   - Monte Carlo Approximation
3. Sampling Theory
   - Sample Generation Method
   - Importance Sampling
1. Why simulation-based?

Go nonlinear and non-Gaussian:

- Kalman Filter: linear system, Gaussian density, analytic form.
- Extended Kalman Filter (and the Unscented Kalman Filter): linearization of a nonlinear system.
- Particle Filter: nonlinear system, non-Gaussian density, simulation-based (Monte Carlo approaches).
2. Monte Carlo Methods
2.1 Historical example
- Buffon's Needle -

A needle of length l is dropped at random on a flat surface ruled with parallel lines a distance d > l apart. What is the probability that the needle crosses one of the lines?

    Probability = 2l / (πd)

Drop the needle n times and let M be the number of times it crosses a line. Then

    E[M] = 2ln / (πd),  so  π ≈ 2ln / (dM).

The experiment by Captain Fox (1864) estimated π = 3.1416.
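The crossing experiment above is easy to reproduce numerically. The sketch below is an illustrative simulation (function name and parameter defaults are my own, not from the lecture): a needle crosses a line when the distance from its centre to the nearest line is less than (l/2)·sin θ.

```python
import math
import random

def buffon_pi_estimate(n_drops: int, needle_len: float = 1.0,
                       line_gap: float = 2.0, seed: int = 0) -> float:
    """Estimate pi via Buffon's needle (requires line_gap > needle_len)."""
    rng = random.Random(seed)
    crossings = 0
    for _ in range(n_drops):
        # Distance from needle centre to nearest line: uniform on [0, d/2].
        y = rng.uniform(0.0, line_gap / 2.0)
        # Needle angle relative to the lines: uniform on [0, pi/2] by symmetry.
        theta = rng.uniform(0.0, math.pi / 2.0)
        if y <= (needle_len / 2.0) * math.sin(theta):
            crossings += 1
    # P(cross) = 2l/(pi d)  =>  pi ~= 2 l n / (d M)
    if crossings == 0:
        raise ValueError("no crossings observed; increase n_drops")
    return 2.0 * needle_len * n_drops / (line_gap * crossings)

print(buffon_pi_estimate(1_000_000))  # close to 3.14 for large n_drops
```

The estimate converges slowly (error shrinks like 1/sqrt(n)), which already hints at the convergence remarks for Monte Carlo integration below.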
The Monte Carlo (MC) method is a collection of techniques that perform estimation through random sampling.
2.2 Monte Carlo Approximation (Integration)

Let us perform a numerical integration:

    I = ∫ g(x) dx,   x: a multidimensional vector.

Factorize g(x) = f(x) p(x), where p(x) is interpreted as a probability density satisfying p(x) ≥ 0 and ∫ p(x) dx = 1.

For the new expression I = E[f(x)] = ∫ f(x) p(x) dx:

(1) Draw N samples x^(i), i = 1, ..., N, independently from the density p(x).
(2) Approximate p(x) by the empirical density
        p̂(x) = (1/N) Σ_{i=1}^{N} δ(x − x^(i)).
(3) Î = E[f(x)] ≈ ∫ f(x) p̂(x) dx = (1/N) Σ_{i=1}^{N} f(x^(i)).

This approximation is referred to as Monte Carlo integration.
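As a minimal sketch of steps (1)-(3), the code below (helper names are my own) estimates I = E_p[x²] = 1 with p(x) the standard normal density, simply by averaging f over samples drawn from p:

```python
import random

def mc_integrate(f, sampler, n: int) -> float:
    """Steps (1)-(3): draw n samples from p and average f over them."""
    return sum(f(sampler()) for _ in range(n)) / n

rng = random.Random(42)
# p(x): standard normal density; f(x) = x^2, so I = E[x^2] = 1.
estimate = mc_integrate(lambda x: x * x, lambda: rng.gauss(0.0, 1.0), 100_000)
print(estimate)  # close to 1.0
```

Note that the sampler, not the integrand, carries all the information about p(x); f is evaluated only at the sampled points.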
Remarks:
1) Conventional numerical integration techniques are inefficient because:
   - the number of points to be evaluated grows with the dimensionality of the parameter space, and
   - only a very small proportion of the evaluation points makes a significant contribution to the integral.
2) The key idea of MC is to represent the target distribution as a set of random samples.

The MC method provides appropriate results in a statistical sense (convergence, unbiasedness, etc.) as long as we generate proper samples. The law of large numbers guarantees the convergence of MC integration.
3. Sampling Theory

Monte Carlo simulation starts by generating random samples from a known distribution.

3.1 Sample generation method

From a given probability density function (PDF) p_X(x) to a target density p_Z(z):

    x ~ p_X(x)  --T(x)-->  z ~ p_Z(z)

Corollary: Given an input random variable x with PDF p_X(x), the PDF of z = T(x), where T is monotonic, one-to-one, and invertible (x = T^{-1}(z)), is given by

    p_Z(z) = p_X(T^{-1}(z)) |∂x/∂z|,

where ∂x/∂z is the Jacobian of the transformation T^{-1}. (When x and z are scalars, |·| denotes the absolute value.)
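A standard instance of this corollary is inverse-transform sampling; the sketch below (function name and the exponential target are my own illustration, not from the lecture) turns uniform samples into exponential ones:

```python
import math
import random

def sample_exponential(rate: float, n: int, seed: int = 0) -> list:
    """Inverse-transform sampling: if x ~ Uniform(0,1), then
    z = T(x) = -ln(1 - x) / rate has the Exponential(rate) density.
    By the corollary, p_Z(z) = p_X(T^{-1}(z)) |dx/dz| = rate * exp(-rate * z).
    """
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / rate for _ in range(n)]

samples = sample_exponential(rate=2.0, n=100_000)
print(sum(samples) / len(samples))  # sample mean, close to 1/rate = 0.5
```

Here T^{-1} is the inverse CDF of the target, so the Jacobian factor in the corollary reproduces exactly the target density.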
3.2 Importance sampling

In practice, it is often difficult to generate the samples x^(i) directly from the density p(x). Instead, the samples can be generated from q(x), referred to as the importance distribution or proposal distribution, whose PDF is similar to p(x).

Here "similarity" of q(x) means: q(x) > 0 wherever p(x) > 0.

In terms of q(x), we have

    I = ∫ f(x) w(x) q(x) dx,

where w(x) := p(x)/q(x) is referred to as the importance weight.

The importance sampling approximation of I can then be written as

    Î = ∫ f(x) w(x) q̂(x) dx = (1/N) Σ_{i=1}^{N} w(x^(i)) f(x^(i)),

where the samples x^(i) are drawn from q(x).
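The weighted average above can be sketched as follows (the densities p = N(0,1) and q = N(0,2), and the helper names, are my own illustrative choices): draw from the proposal q, reweight each draw by w(x) = p(x)/q(x), and average.

```python
import math
import random

def normal_pdf(x: float, mu: float, sigma: float) -> float:
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def importance_sample(f, n: int, seed: int = 0) -> float:
    """Estimate I = E_p[f(x)] for p = N(0,1), drawing from proposal q = N(0,2).
    Each draw x^(i) ~ q is reweighted by w(x) = p(x)/q(x)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 2.0)                                     # x^(i) ~ q
        w = normal_pdf(x, 0.0, 1.0) / normal_pdf(x, 0.0, 2.0)       # importance weight
        total += w * f(x)
    return total / n

print(importance_sample(lambda x: x * x, 100_000))  # close to E_p[x^2] = 1.0
```

Choosing q with heavier tails than p (here, standard deviation 2 versus 1) keeps the weights bounded, which is why the support condition above matters in practice.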
Case: p(x) can be evaluated only up to a normalization constant, i.e.

    p(x) = p̃(x)/Z_p  (Z_p unknown),   q(x) = q̃(x)/Z_q  (Z_q unknown).

Then

    I = ∫ f(x) p(x) dx = (Z_q/Z_p) ∫ f(x) (p̃(x)/q̃(x)) q(x) dx
      ≈ (Z_q/Z_p) (1/N) Σ_{i=1}^{N} w̃(x^(i)) f(x^(i)),

where

    w̃(x^(i)) = p̃(x^(i)) / q̃(x^(i)).

Consider the ratio Z_p/Z_q; it can be evaluated using the same sample set as follows:

    Z_p/Z_q = (1/Z_q) ∫ p̃(x) dx = ∫ (p̃(x)/q̃(x)) q(x) dx ≈ (1/N) Σ_{i=1}^{N} w̃(x^(i)).

Thus

    Z_q/Z_p ≈ N / Σ_{i=1}^{N} w̃(x^(i)).

Therefore,

    Î = Σ_{i=1}^{N} w^(i) f(x^(i)),

where the normalized importance weights are

    w^(i) := w̃(x^(i)) / Σ_{j=1}^{N} w̃(x^(j)).
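The self-normalized estimator can be sketched as below. The unnormalized target p̃(x) = exp(-x²/2) (a standard normal with its constant dropped) and proposal q̃(x) = exp(-x²/8) are my own illustrative choices; the point is that only the ratios w̃ = p̃/q̃ are ever needed, so Z_p and Z_q cancel.

```python
import math
import random

def self_normalized_is(f, p_tilde, q_sampler, q_tilde, n: int, seed: int = 0) -> float:
    """Self-normalized importance sampling: only unnormalized densities
    p~ and q~ are required; Z_p/Z_q cancels in the normalized weights."""
    rng = random.Random(seed)
    xs = [q_sampler(rng) for _ in range(n)]
    w = [p_tilde(x) / q_tilde(x) for x in xs]               # w~(x^(i))
    w_sum = sum(w)
    # I_hat = sum_i w^(i) f(x^(i)),  w^(i) = w~(x^(i)) / sum_j w~(x^(j))
    return sum(wi / w_sum * f(x) for wi, x in zip(w, xs))

# Target: standard normal known only up to a constant, p~(x) = exp(-x^2/2).
# Proposal: N(0, 2), also used unnormalized, q~(x) = exp(-x^2/8).
estimate = self_normalized_is(
    f=lambda x: x * x,
    p_tilde=lambda x: math.exp(-0.5 * x * x),
    q_sampler=lambda rng: rng.gauss(0.0, 2.0),
    q_tilde=lambda x: math.exp(-x * x / 8.0),
    n=100_000,
)
print(estimate)  # close to E[x^2] = 1.0 under the target
```

This is exactly the weighting scheme the particle filter relies on, since posterior densities in filtering are typically known only up to their normalizing constant.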