
Sequential Quasi Monte Carlo

N. Chopin (CREST-ENSAE)

nicolas.chopin@ensae.fr

joint work with Mathieu Gerber (CREST, Université de Lausanne)

Outline

Particle filtering (a.k.a. Sequential Monte Carlo) is a set of Monte Carlo techniques for sequential inference in state-space models. Since PF is built on standard Monte Carlo, its error rate is $\mathcal{O}_P(N^{-1/2})$.

Quasi Monte Carlo (QMC) is a substitute for standard Monte Carlo (MC), which typically converges at the faster rate $\mathcal{O}(N^{-1+\varepsilon})$. However, standard QMC is usually defined for IID problems.

The purpose of this work is to derive a QMC version of PF, which we call SQMC (Sequential Quasi Monte Carlo).

QMC basics

Consider the standard MC approximation

$$\frac{1}{N}\sum_{n=1}^{N}\varphi(u^n) \approx \int_{[0,1]^d}\varphi(u)\,\mathrm{d}u$$

where the $N$ vectors $u^n$ are IID variables simulated from $\mathcal{U}([0,1]^d)$.

In QMC, we replace $u^{1:N}$ by a set of $N$ points that are more evenly distributed on the hyper-cube $[0,1]^d$. This idea is formalised through the notion of discrepancy.
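To make the two rates concrete, here is a minimal Python sketch (using NumPy and SciPy's scipy.stats.qmc module; the test integrand is an illustrative choice of ours, not from the slides) that estimates the same integral with IID uniforms and with a Sobol point set:

```python
import numpy as np
from scipy.stats import qmc

def phi(u):
    # Illustrative integrand on [0,1]^2; its exact integral equals 1.
    return np.prod(0.5 * np.pi * np.sin(np.pi * u), axis=1)

d, N = 2, 256
u_mc = np.random.default_rng(0).random((N, d))            # IID U([0,1]^2)
u_qmc = qmc.Sobol(d=d, scramble=True, seed=0).random(N)   # Sobol point set

print("MC estimate :", phi(u_mc).mean())    # error of order N^(-1/2)
print("QMC estimate:", phi(u_qmc).mean())   # error of order N^(-1+eps)
```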

QMC vs MC in one plot

[Figure: two scatter plots over $[0,1]^2$.] QMC versus MC: $N = 256$ points sampled independently and uniformly in $[0,1]^2$ (left); QMC sequence (Sobol) in $[0,1]^2$ of the same length (right).

Discrepancy

Koksma–Hlawka inequality:

$$\left|\frac{1}{N}\sum_{n=1}^{N}\varphi(u^n) - \int_{[0,1]^d}\varphi(u)\,\mathrm{d}u\right| \le V(\varphi)\,D^\star(u^{1:N})$$

where $V(\varphi)$ depends only on $\varphi$, and the star discrepancy is defined as:

$$D^\star(u^{1:N}) = \sup_{[0,b]}\left|\frac{1}{N}\sum_{n=1}^{N}\mathbf{1}\{u^n \in [0,b]\} - \prod_{i=1}^{d} b_i\right|.$$

There are various ways to construct point sets $P_N = \{u^{1:N}\}$ so that $D^\star(u^{1:N}) = \mathcal{O}(N^{-1+\varepsilon})$. (Describing these different constructions is beyond the scope of this talk.)
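SciPy ships L2 analogues of this quantity (not the sup-based $D^\star$ itself, which is expensive to compute exactly); under that caveat, a short sketch comparing an IID sample with a Sobol set:

```python
import numpy as np
from scipy.stats import qmc

N, d = 256, 2
u_mc = np.random.default_rng(1).random((N, d))
u_qmc = qmc.Sobol(d=d, scramble=True, seed=1).random(N)

# method='L2-star' is an L2 version of the star discrepancy;
# lower values mean a more evenly spread point set.
print("MC :", qmc.discrepancy(u_mc, method="L2-star"))
print("QMC:", qmc.discrepancy(u_qmc, method="L2-star"))
```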

Particle Filtering: Feynman-Kac formulation

Consider a Markov chain $(x_t)$, with $x_0 \sim m_0(\mathrm{d}x_0)$ and $x_t|x_{t-1} \sim m_t(x_{t-1}, \mathrm{d}x_t)$, taking values in $\mathcal{X} \subset \mathbb{R}^d$, and a sequence of potential functions $G_t : \mathcal{X} \times \mathcal{X} \to \mathbb{R}$. One would like to compute quantities such as

$$\mathbb{Q}_t(\varphi) = \frac{1}{Z_t}\,\mathbb{E}\left[\varphi(x_t)\,G_0(x_0)\prod_{s=1}^{t} G_s(x_{s-1}, x_s)\right], \quad \text{with } Z_t = \mathbb{E}\left[G_0(x_0)\prod_{s=1}^{t} G_s(x_{s-1}, x_s)\right].$$

Application to Hidden Markov models

The Feynman-Kac formalism is rather abstract, but consider a Hidden Markov model, with latent Markov process $(x_t)$ and observed process $(y_t)$, such that $y_t|x_t \sim f^Y(y_t|x_t)$. If we take

$$G_t(x_{t-1}, x_t) = f^Y(y_t|x_t)$$

we observe that $\mathbb{Q}_t(\varphi)$ is the filtering expectation of function $\varphi$, and that $Z_t$ is the marginal likelihood of the data $y_{0:t}$:

$$Z_t = \int_{\mathcal{X}^{t+1}} m_0(\mathrm{d}x_0)\,f^Y(y_0|x_0)\prod_{s=1}^{t}\left\{m_s(x_{s-1}, \mathrm{d}x_s)\,f^Y(y_s|x_s)\right\}.$$

Particle filtering: the algorithm

Operations must be performed for all $n \in 1:N$.

At time 0,

(a) Generate $x_0^n \sim m_0(\mathrm{d}x_0)$.

(b) Compute $W_0^n = G_0(x_0^n)/\sum_{m=1}^{N} G_0(x_0^m)$ and $Z_0^N = N^{-1}\sum_{n=1}^{N} G_0(x_0^n)$.

Recursively, for time $t = 1:T$,

(a) Generate $a_{t-1}^n \sim \mathcal{M}(W_{t-1}^{1:N})$.

(b) Generate $x_t^n \sim m_t(x_{t-1}^{a_{t-1}^n}, \mathrm{d}x_t)$.

(c) Compute $W_t^n = G_t(x_{t-1}^{a_{t-1}^n}, x_t^n)/\sum_{m=1}^{N} G_t(x_{t-1}^{a_{t-1}^m}, x_t^m)$ and $Z_t^N = Z_{t-1}^N\left\{N^{-1}\sum_{n=1}^{N} G_t(x_{t-1}^{a_{t-1}^n}, x_t^n)\right\}$.

A minimal code sketch of this recursion is given below.
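The sketch is one straightforward NumPy rendering of the algorithm, with multinomial resampling; the callables m0_rvs, mt_rvs and Gt are hypothetical placeholders for the model at hand:

```python
import numpy as np

def particle_filter(m0_rvs, mt_rvs, Gt, T, N, seed=0):
    """Particle filter in the slides' notation; returns (W_T, x_T, Z_T^N).

    m0_rvs(N, rng)         -> (N, d) draws from m_0
    mt_rvs(t, x_prev, rng) -> (N, d) draws from m_t(x_prev, .)
    Gt(t, x_prev, x)       -> (N,) potentials (x_prev is None at t = 0)
    """
    rng = np.random.default_rng(seed)
    x = m0_rvs(N, rng)
    g = Gt(0, None, x)                    # step (b) at time 0
    Z, W = g.mean(), g / g.sum()
    for t in range(1, T + 1):
        a = rng.choice(N, size=N, p=W)    # (a) ancestors ~ M(W_{t-1}^{1:N})
        x_prev = x[a]
        x = mt_rvs(t, x_prev, rng)        # (b) mutation
        g = Gt(t, x_prev, x)
        Z *= g.mean()                     # (c) update Z_t^N ...
        W = g / g.sum()                   # ... and the normalised weights
    return W, x, Z
```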

PF output

At iteration $t$, compute

$$\mathbb{Q}_t^N(\varphi) = \sum_{n=1}^{N} W_t^n\,\varphi(x_t^n)$$

to approximate $\mathbb{Q}_t(\varphi)$ (the filtering expectation of $\varphi$). In addition, compute $Z_t^N$ as an approximation of $Z_t$ (the likelihood of the data).

Cartoon representation

[Cartoon illustration of particle filtering.] Source for image: some dark corner of the Internet.

A closer look at resampling

The resampling step (a) ensures that particle $x_{t-1}^n$ has $O_{t-1}^n$ offspring, where $O_{t-1}^n$ is random and such that $\mathbb{E}(O_{t-1}^n) = N W_{t-1}^n$.

In this way, particles with a large (normalised) weight get many offspring, while particles with small weights are likely to generate zero offspring.

One way to formalise resampling is that of drawing $N$ times independently from

$$\sum_{n=1}^{N} W_{t-1}^n\,\delta_{x_{t-1}^n}(\mathrm{d}\tilde{x}_{t-1})$$

which is our current approximation of $\mathbb{Q}_{t-1}(\mathrm{d}x_{t-1})$.

Formalisation

In fact, we can formalise the succession of the resampling step (a) and the mutation step (b) at iteration $t$ as an importance sampling step from the random probability measure

$$\overline{\mathbb{Q}}_t^N\big(\mathrm{d}(\tilde{x}_{t-1}, x_t)\big) = \sum_{n=1}^{N} W_{t-1}^n\,\delta_{x_{t-1}^n}(\mathrm{d}\tilde{x}_{t-1})\,m_t(\tilde{x}_{t-1}, \mathrm{d}x_t)$$

to

$$\mathbb{Q}_t^N\big(\mathrm{d}(\tilde{x}_{t-1}, x_t)\big) \propto \overline{\mathbb{Q}}_t^N\big(\mathrm{d}(\tilde{x}_{t-1}, x_t)\big)\,G_t(\tilde{x}_{t-1}, x_t).$$

Idea: use QMC instead of MC to sample $N$ points from $\overline{\mathbb{Q}}_t^N\big(\mathrm{d}(\tilde{x}_{t-1}, x_t)\big)$. The main difficulty is that this distribution is partly discrete, partly continuous.

Case d = 1

[Figure: the empirical CDF of the weighted particles; uniforms $u^1, u^2, u^3$ on the vertical axis are mapped through its inverse to particles $x^{(1)}, x^{(2)}, x^{(3)}$.]

Let $u_t^n = (v_t^n, w_t^n)$ be uniform variates in $[0,1]^2$. Then

1. Use the inverse transform to obtain $\tilde{x}_{t-1}^n = \hat{F}^{-1}(v_t^n)$, where $\hat{F}$ is the empirical cdf of $\sum_{n=1}^{N} W_{t-1}^n\,\delta_{x_{t-1}^n}(\mathrm{d}\tilde{x}_{t-1})$.

2. Sample $x_t^n \sim m_t(\tilde{x}_{t-1}^n, \mathrm{d}x_t)$ as $x_t^n = \Gamma_t(\tilde{x}_{t-1}^n, w_t^n)$, where $\Gamma_t$ is e.g. the inverse CDF of $m_t(\tilde{x}_{t-1}^n, \mathrm{d}x_t)$ (or some other appropriate deterministic function).
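Step 1 is simply the inverse of the empirical CDF, which reduces to a vectorised search once the weights are accumulated; a minimal sketch:

```python
import numpy as np

def inverse_cdf_indices(W, u):
    """Indices F_hat^{-1}(u) for normalised weights W (both 1-d arrays).

    For each uniform u, returns the smallest n such that
    cumsum(W)[n] >= u; feeding it sorted uniforms gives the
    ordered resampling used by SQMC.
    """
    return np.searchsorted(np.cumsum(W), u)

# Example: three particles with weights 0.2, 0.5, 0.3.
print(inverse_cdf_indices(np.array([0.2, 0.5, 0.3]),
                          np.array([0.1, 0.4, 0.9])))   # -> [0 1 2]
```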

From d = 1 to d > 1

When $d > 1$, we cannot use the inverse CDF method to sample from the empirical distribution

$$\sum_{n=1}^{N} W_{t-1}^n\,\delta_{x_{t-1}^n}(\mathrm{d}\tilde{x}_{t-1}).$$

Idea: we "project" the $x_{t-1}^n$'s into $[0,1]$ through the (generalised) inverse of the Hilbert curve, which is a fractal, space-filling curve $H : [0,1] \to [0,1]^d$.

More precisely, we transform $\mathcal{X}$ into $[0,1]^d$ through some function $\psi$, then we transform $[0,1]^d$ into $[0,1]$ through $h = H^{-1}$.

Hilbert curve

[Figure: the first iterations of the construction of the Hilbert curve in two dimensions.] The Hilbert curve is the limit of this sequence. Note the locality property of the Hilbert curve: if two points are close in $[0,1]^d$, then the corresponding transformed points remain close in $[0,1]$. (Source for the plot: Wikipedia)
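Below is one possible Hilbert sort, a sketch based on the third-party hilbertcurve package (an assumption on our part; any routine returning Hilbert indices would do), with $\psi$ taken to be a simple min-max rescaling of each coordinate:

```python
import numpy as np
from hilbertcurve.hilbertcurve import HilbertCurve  # pip install hilbertcurve

def hilbert_sort(x, p=10):
    """Permutation sigma ordering particles x (shape (N, d)) along h(psi(x)).

    p is the number of bits per dimension of the discretised curve.
    """
    N, d = x.shape
    lo, hi = x.min(axis=0), x.max(axis=0)
    psi_x = (x - lo) / np.where(hi > lo, hi - lo, 1.0)   # psi: X -> [0,1]^d
    # Discretise onto the 2^p grid on which the curve index is defined.
    grid = np.minimum((psi_x * (2**p - 1)).astype(int), 2**p - 1)
    keys = HilbertCurve(p, d).distances_from_points(grid.tolist())
    return np.argsort(keys)
```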

SQMC Algorithm

At time 0,

(a) Generate a QMC point set $u_0^{1:N}$ in $[0,1]^d$, and compute $x_0^n = \Gamma_0(u_0^n)$ (e.g. $\Gamma_0 = F_{m_0}^{-1}$).

(b) Compute $W_0^n = G_0(x_0^n)/\sum_{m=1}^{N} G_0(x_0^m)$.

Recursively, for time $t = 1:T$,

(a) Generate a QMC point set $\mathbf{u}_t^{1:N}$ in $[0,1]^{d+1}$; let $\mathbf{u}_t^n = (u_t^n, v_t^n)$, with $u_t^n \in [0,1]$ and $v_t^n \in [0,1]^d$.

(b) Hilbert sort: find a permutation $\sigma$ such that $h \circ \psi(x_{t-1}^{\sigma(1)}) \le \ldots \le h \circ \psi(x_{t-1}^{\sigma(N)})$.

(c) Generate $a_{t-1}^{1:N}$ using the inverse CDF algorithm, with inputs $\mathrm{sort}(u_t^{1:N})$ and $W_{t-1}^{\sigma(1:N)}$, and compute $x_t^n = \Gamma_t(x_{t-1}^{\sigma(a_{t-1}^n)}, v_t^n)$ (e.g. $\Gamma_t = F_{m_t}^{-1}$).

(d) Compute $W_t^n = G_t(x_{t-1}^{\sigma(a_{t-1}^n)}, x_t^n)/\sum_{m=1}^{N} G_t(x_{t-1}^{\sigma(a_{t-1}^m)}, x_t^m)$.

A code sketch of one such iteration follows.
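Putting the pieces together, one SQMC iteration might look like the following sketch (it reuses hilbert_sort from the previous snippet; Gamma_t and Gt are hypothetical model callables, and sobol is a scipy.stats.qmc.Sobol engine of dimension d + 1):

```python
import numpy as np

def sqmc_step(x_prev, W_prev, Gamma_t, Gt, sobol):
    """One SQMC iteration (t >= 1), following steps (a)-(d) above."""
    N = x_prev.shape[0]
    u = sobol.random(N)                    # (a) QMC point set in [0,1]^{d+1}
    sigma = hilbert_sort(x_prev)           # (b) Hilbert sort of the particles
    order = np.argsort(u[:, 0])            # sort the first components
    cdf = np.cumsum(W_prev[sigma])
    a = sigma[np.searchsorted(cdf, u[order, 0])]  # (c) inverse-CDF ancestors
    x = Gamma_t(x_prev[a], u[order, 1:])   # deterministic mutation with v_t^n
    g = Gt(x_prev[a], x)
    return x, g / g.sum()                  # (d) normalised weights
```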

Some remarks

• Because two sort operations are performed, the complexity of SQMC is $\mathcal{O}(N \log N)$. (Compare with $\mathcal{O}(N)$ for SMC.)

• The main requirement to implement SQMC is that one may simulate from the Markov kernel $m_t(x_{t-1}, \mathrm{d}x_t)$ by computing $x_t = \Gamma_t(x_{t-1}, u_t)$, where $u_t \sim \mathcal{U}([0,1]^d)$, for some deterministic function $\Gamma_t$ (e.g. the multivariate inverse CDF); see the sketch below.

• The dimension of the point sets $\mathbf{u}_t^{1:N}$ is $d+1$: the first component is for selecting the parent particle, the $d$ remaining components are for sampling $x_t^n$ given $x_{t-1}^{a_{t-1}^n}$.
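For a linear-Gaussian kernel (such as the transition of the stochastic-volatility example later in the talk), a valid $\Gamma_t$ is the componentwise normal quantile followed by a Cholesky factor; a minimal sketch, with a factory name of our own:

```python
import numpy as np
from scipy.stats import norm

def make_gamma_linear_gaussian(mu, Phi, Psi):
    """Gamma_t(x_prev, u) for x_t = mu + Phi (x_{t-1} - mu) + Psi^{1/2} nu_t."""
    L = np.linalg.cholesky(Psi)
    def Gamma(x_prev, u):
        # u in (0,1)^d -> z ~ N(0, I_d); scrambled point sets avoid
        # u = 0 or 1 (where the quantile is infinite) almost surely.
        z = norm.ppf(u)
        return mu + (x_prev - mu) @ Phi.T + z @ L.T
    return Gamma
```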

Extensions

• If we use RQMC (randomised QMC) point sets $\mathbf{u}_t^{1:N}$, then SQMC generates an unbiased estimate of the marginal likelihood $Z_t$.

• This means we can use SQMC within the PMCMC framework. (More precisely, we can run e.g. a PMMH algorithm, where the likelihood of the data is computed via SQMC instead of SMC.)

• We can also adapt quite easily the different particle smoothing algorithms: forward smoothing, backward smoothing, two-filter smoothing.
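One common RQMC construction is a scrambled Sobol point set, available in SciPy; each point is then marginally uniform on $[0,1]^{d+1}$, which is what drives the unbiasedness of $Z_t^N$:

```python
from scipy.stats import qmc

# A freshly scrambled engine (new seed) at each time step t yields an
# RQMC point set: marginally U([0,1]^{d+1}), jointly low-discrepancy.
engine = qmc.Sobol(d=3, scramble=True, seed=2014)   # d + 1 = 3 when d = 2
u_t = engine.random(1024)                           # N = 1024 points
```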

A bit of theory

We were able to establish the following types of results: consistency,

$$\mathbb{Q}_t^N(\varphi) - \mathbb{Q}_t(\varphi) \to 0, \quad \text{as } N \to +\infty$$

for certain functions $\varphi$, and a rate of convergence,

$$\mathrm{MSE}\left[\mathbb{Q}_t^N(\varphi)\right] = \mathcal{O}(N^{-1})$$

(under technical conditions, and for certain types of RQMC point sets). The theory is non-standard and borrows heavily from QMC concepts.

Examples: Kitagawa (d = 1)

Well-known toy example (Kitagawa, 1998):

$$y_t = \frac{x_t^2}{a} + \epsilon_t, \qquad x_t = b_1 x_{t-1} + b_2\,\frac{x_{t-1}}{1 + x_{t-1}^2} + b_3 \cos(b_4 t) + \sigma \nu_t$$

No parameter estimation (parameters are set to their true values). We compare SQMC with SMC (based on systematic resampling), both in terms of $N$ and in terms of CPU time.
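A sketch simulating this model; the numerical values (a = 20, b = (0.5, 25, 8, 1.2), σ² = 10, unit observation noise) are the classic ones from the literature and are assumptions here, since the slide leaves the parameters symbolic:

```python
import numpy as np

def simulate_kitagawa(T, a=20.0, b1=0.5, b2=25.0, b3=8.0, b4=1.2,
                      sigma=np.sqrt(10.0), seed=0):
    """Simulate (x_{0:T}, y_{0:T}) from the Kitagawa model above."""
    rng = np.random.default_rng(seed)
    x, y = np.empty(T + 1), np.empty(T + 1)
    x[0] = rng.normal()
    y[0] = x[0] ** 2 / a + rng.normal()
    for t in range(1, T + 1):
        x[t] = (b1 * x[t - 1] + b2 * x[t - 1] / (1 + x[t - 1] ** 2)
                + b3 * np.cos(b4 * t) + sigma * rng.normal())
        y[t] = x[t] ** 2 / a + rng.normal()
    return x, y
```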

Examples: Kitagawa (d = 1)

[Figure: min/max envelope of the estimates versus the number of particles (left); variance versus CPU time in seconds, both axes on a log10 scale (right).] Log-likelihood evaluation (based on $T = 100$ data points and 500 independent SMC and SQMC runs).

Examples: Kitagawa (d = 1)

[Figure: gain factor versus time step, for $N = 50$ and $N = 100$ (left) and for $N = 10{,}000$ and $N = 50{,}000$ (right).] Filtering: computing $\mathbb{E}(x_t|y_{0:t})$ at each iteration $t$. The gain factor is MSE(SMC)/MSE(SQMC).

Examples: Multivariate Stochastic Volatility

Model is

$$\begin{cases} y_t = S_t^{1/2}\,\epsilon_t \\ x_t = \mu + \Phi(x_{t-1} - \mu) + \Psi^{1/2}\,\nu_t \end{cases}$$

with possibly correlated noise terms: $(\epsilon_t, \nu_t) \sim N_{2d}(0, C)$. We shall focus on $d = 2$ and $d = 4$.

Examples: Multivariate Stochastic Volatility (d = 2)

[Figure: gain factor versus the number of particles (left); variance versus CPU time in seconds, both axes on a log10 scale (right).] Log-likelihood evaluation (based on $T = 400$ data points and 200 independent SMC and SQMC runs).

Examples: Multivariate Stochastic Volatility (d = 2)

[Figure: gain factor versus time step, for $N = 128$, $256$, $512$ (left) and for $N = 131{,}072$ (right).] Filtering.

Examples: Multivariate Stochastic Volatility (d = 4)

[Figure: gain factor versus the number of particles (left); variance versus CPU time in seconds, both axes on a log10 scale (right).] Log-likelihood estimation.

Conclusion

• The only requirement to replace SMC with SQMC is that the simulation of $x_t^n|x_{t-1}^n$ may be written as $x_t^n = \Gamma_t(x_{t-1}^n, u_t^n)$, where $u_t^n \sim \mathcal{U}([0,1]^d)$.

• We observe very impressive gains in performance (even for small $N$ or $d = 6$).

• Supporting theory.

Further work

• Adaptive resampling (trigger resampling steps only when weight degeneracy is too high).

• Adapt SQMC to situations where sampling from $m_t(x_{t-1}^n, \mathrm{d}x_t)$ involves some accept/reject mechanism (e.g. Metropolis). In this way, we could develop SQMC counterparts of SMC samplers (Del Moral et al, 2006).

• SQMC$^2$ (C. et al, 2013)?

Soon on arXiv!