
Stochastic Integration

Prakash Balachandran

Department of Mathematics

Duke University

June 11, 2008

These notes are based on Durrett’s Stochastic Calculus, Revuz and Yor’s Continuous Martingales and

Brownian Motion, and Kuo’s Introduction to Stochastic Integration.

1 Preliminaries

Definition: A continuous-time process X_t is said to be a continuous local martingale w.r.t. {F_t, t ≥ 0} if there are stopping times T_n ↑ ∞ such that

X^{T_n}_t =  X_{T_n ∧ t}  on {T_n > 0},
             0            on {T_n = 0}

is a martingale w.r.t. {F_{t ∧ T_n} : t ≥ 0}. The stopping times {T_n} are said to reduce X.

Remarks:

1. I brooded over why we set X^T_t = 0 on {T = 0} in the definition, and this is the only explanation I could find:

If we defined

X^T_t = X_{T ∧ t} on {T ≥ 0},

then X^T_0 = X_0, so according to the above definition of a local martingale, X^T_t being a martingale would imply

E[X^T_0] = E[X_0] < ∞.

So, with that definition of X^T_t, X_0 would have to be integrable.


Since we want to consider more general processes in which X_0 need not be integrable, we set

X^T_t =  X_{T ∧ t}  on {T > 0},
         0          on {T = 0},

so that

X^T_0 =  X_0  on {T > 0},
         0    on {T = 0},

and according to the definition of a local martingale, X^T_t a martingale implies that

E[X^T_0] = E[X_0; T > 0] < ∞,

which does not necessarily imply that X_0 is integrable, since the expectation on the left only sees X_0 on the event {T > 0}.

Thus, our definition of a continuous local martingale frees us from integrability of X_0.

2. We say that a process Y is locally A if there is a sequence of stopping times Tn ↑ ∞ so that the

stopped processes Y Tt has property A.

Now, you might ask why the hell we should care about continuous local martingales. Again, I thought

about this a lot, and these are the only reasons I could salvage:

Example 1: Let B_t = (B^{(1)}_t, ..., B^{(n)}_t) be n-dimensional Brownian motion, where {B^{(j)}_t}_{j=1}^n are independent Brownian motions on R.

Suppose we're interested in the process ‖B_t‖ = ((B^{(1)}_t)^2 + ··· + (B^{(n)}_t)^2)^{1/2}; it can be shown that:

‖B_t‖ = ∑_{j=1}^n ∫_0^t (B^{(j)}_s / ‖B_s‖) dB^{(j)}_s + ((n − 1)/2) ∫_0^t (1/‖B_s‖) ds,

and that

W_t = ∑_{j=1}^n ∫_0^t (B^{(j)}_s / ‖B_s‖) dB^{(j)}_s

is a Brownian motion, and hence, a martingale.

Now, what about the second integral above? It's not immediately obvious how this integral behaves, but it's certainly not a martingale. In fact, it can be shown that (for n = 3) the process 1/‖B_t‖, t ≥ 0, is a continuous local martingale that is not a true martingale, while ∫_0^t ds/‖B_s‖ is a continuous increasing process, hence of locally bounded variation.


Since a continuous martingale is certainly a continuous local martingale, it follows that ‖B_t‖ is the sum of a continuous local martingale (namely W_t) and a continuous process of locally bounded variation; that is, a continuous semimartingale.

Now, suppose that a particle is exhibiting Brownian motion. If w(t, ω) is any suitable process which represents a quantity that varies with the distance from the origin to the particle, the process

W_t(ω) = ∫_0^t w(s, ω) d‖B_s‖(ω)

represents the total accumulation of this quantity along a path of Brownian motion. So, we need to know how to integrate processes w.r.t. continuous local martingales (the martingale part of integrators like ‖B_t‖) to evaluate this quantity (and ensure that it does, in fact, exist).
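To see concretely why the local/true martingale distinction matters, here is a minimal numerical sketch (Python/NumPy; the simulation setup is my own illustration, not from the notes). For 3-dimensional Brownian motion started away from the origin, 1/‖B_t‖ is a continuous local martingale but not a martingale, so its mean drifts downward instead of staying constant:

```python
import numpy as np

# Sketch (assumed setup): 3-d Brownian motion started at x0 = (1, 0, 0).
# If 1/||B_t|| were a true martingale, E[1/||B_t||] would stay at 1/||x0|| = 1.
rng = np.random.default_rng(0)
n_paths, n_steps, T = 10_000, 200, 4.0
dt = T / n_steps

increments = np.sqrt(dt) * rng.standard_normal((n_paths, n_steps, 3))
B = np.array([1.0, 0.0, 0.0]) + np.cumsum(increments, axis=1)  # paths of B_t
inv_norm = 1.0 / np.linalg.norm(B, axis=2)                     # 1/||B_t||

for k in (0, n_steps // 4, n_steps // 2, n_steps - 1):
    print(f"t = {(k + 1) * dt:4.2f}   E[1/||B_t||] ~ {inv_norm[:, k].mean():.3f}")
```

The printed means visibly decrease with t, which is exactly the failure of the martingale property that the local-martingale framework is designed to accommodate.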

Example 2: Let X_t be a continuous martingale, and let φ be a convex function (imagine that X_t is the interest rate at time t, so that φ(X_t) = e^{−tX_t} is the present value of a dollar made in t years).

Theorem 1 If E[|φ(X_t)|] < ∞ for each t, then φ(X_t) is a submartingale.

Proof: Jensen's inequality for conditional expectation states that if φ is convex, and E[|X_t|], E[|φ(X_t)|] < ∞ for each t, then for s < t:

φ(X_s) = φ(E[X_t|F_s]) ≤ E[φ(X_t)|F_s].

On the other hand, we have:

Theorem 2 φ(Xt) is always a local submartingale.

For the proof of this, see the corollary after Theorem 4.

So, if φ(t) is a cash flow from now to time T, then ∫_0^T φ(t) e^{−tX_t} dt = ∫_0^T φ(t) dY_t is the net present value of this cash flow, where we've set dY_t = e^{−tX_t} dt. Again, we need to know how to integrate processes w.r.t. continuous local martingales to evaluate this quantity (and ensure that it does, in fact, exist).
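For orientation, here is a small pathwise sketch of the quantity being discussed (Python/NumPy; the rate model and cash flow below are invented for illustration and are not part of the notes). It approximates ∫_0^T φ(t) e^{−tX_t} dt by a Riemann sum along one simulated path of X_t:

```python
import numpy as np

# Hypothetical example: X_t is a "rate" path modeled as a small Brownian
# perturbation around 3%, and phi(t) is a constant cash-flow rate of 100/year.
rng = np.random.default_rng(1)
T, n_steps = 10.0, 1_000
dt = T / n_steps
t = np.linspace(dt, T, n_steps)

X = 0.03 + np.cumsum(0.01 * np.sqrt(dt) * rng.standard_normal(n_steps))
phi = np.full(n_steps, 100.0)

discount = np.exp(-t * X)              # pathwise discount factor e^{-t X_t}
npv = np.sum(phi * discount * dt)      # Riemann-sum approximation of the integral
print(f"approximate NPV along this path: {npv:.2f}")
```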

Example 3:

Definition: Let L_ad(Ω, L^2[a,b]) denote the space of stochastic processes f(t, ω) satisfying:

1. f(t, ω) is adapted to the filtration F_t of Brownian motion.

2. ∫_a^b |f(t, ω)|^2 dt < ∞ almost surely.

Definition: Define L^2([a,b] × Ω) to be the space of F_t-adapted processes f(t, ω) such that ∫_a^b E[|f(t, ω)|^2] dt < ∞.


Now, by Fubini's Theorem, if f ∈ L^2([a,b] × Ω) then

E[ ∫_a^b |f(t, ω)|^2 dt ] = ∫_a^b E[|f(t, ω)|^2] dt < ∞.

Thus ∫_a^b |f(t, ω)|^2 dt < ∞ almost surely, so that f ∈ L_ad(Ω, L^2[a,b]). Since f was arbitrary, we must have

L^2([a,b] × Ω) ⊆ L_ad(Ω, L^2[a,b]).

Now, in stochastic calculus one constructs the integral ∫_0^t f(s, ω) dB_s for f ∈ L^2([a,b] × Ω). In this case, ∫_0^t f(s, ω) dB_s is a martingale.

However, when f ∈ L_ad(Ω, L^2[a,b]), the integral ∫_0^t f(s, ω) dB_s need not be a martingale.

Now, in order to proceed, we need a couple of theorems about continuous local martingales, and theorems concerning variance and covariance processes. They may seem irrelevant now, but they'll come in handy later.


2 Continuous Local Martingales

The first section was supposed to convince you why you should care about continuous local martingales.

Now, we prove some theorems.

Theorem 3 (The Optional Stopping Theorem) Let X be a continuous local martingale. If S ≤ T are stopping times, and X_{T∧t} is a uniformly integrable martingale, then E[X_T |F_S] = X_S.

Proof: The classic Optional Stopping Theorem states that:

If L ≤ M are stopping times and Y_{M∧n} is a uniformly integrable martingale w.r.t. G_n, then

E[Y_M |G_L] = Y_L.

To extend the result from discrete to continuous time, let S_n = ([2^n S] + 1)/2^n. Applying the discrete time result to the uniformly integrable martingale Y_m = X_{T ∧ m2^{−n}} with L = 2^n S_n and M = ∞, we have

E[X_T |F_{S_n}] = X_{T∧S_n}.

Now, the dominated convergence theorem for conditional expectation states:

If Z_n → Z a.s., |Z_n| ≤ W for all n where E[W] < ∞, and F_n ↑ F_∞, then

E[Z_n|F_n] → E[Z|F_∞] a.s.

Taking Z_n = X_T and F_n = F_{S_n}, and noticing that X_{T∧t} a uniformly integrable martingale implies that E[|X_T|] < ∞, we have

E[X_T |F_{S_n}] → E[X_T |F_S]

a.s. Since E[X_T |F_{S_n}] = X_{T∧S_n} → X_S a.s., we have that X_S = E[X_T |F_S].


Theorem 4 If X is a continuous local martingale, we can always take the sequence which reduces X to be

T_n = inf{t : |X_t| > n}

or any other sequence T'_n ≤ T_n that has T'_n ↑ ∞ as n ↑ ∞.

Proof: Let S_n be a sequence that reduces X. If s < t, then applying the optional stopping theorem to X^{S_n}_r at the times r = s ∧ T'_m and r = t ∧ T'_m gives:

E[X_{t∧T'_m∧S_n} 1_{S_n>0} | F_{s∧T'_m∧S_n}] = X_{s∧T'_m∧S_n} 1_{S_n>0}.

Multiplying by 1_{T'_m>0} ∈ F_0 ⊆ F_{s∧T'_m∧S_n}:

E[X_{t∧T'_m∧S_n} 1_{T'_m>0, S_n>0} | F_{s∧T'_m∧S_n}] = X_{s∧T'_m∧S_n} 1_{T'_m>0, S_n>0}.

As n ↑ ∞, F_{s∧T'_m∧S_n} ↑ F_{s∧T'_m}, and X_{r∧T'_m∧S_n} 1_{S_n>0, T'_m>0} → X_{r∧T'_m} 1_{T'_m>0} for all r ≥ 0, with |X_{r∧T'_m∧S_n} 1_{S_n>0, T'_m>0}| ≤ m, so it follows from the dominated convergence theorem for conditional expectation that:

E[X_{t∧T'_m} 1_{T'_m>0} | F_{s∧T'_m}] = X_{s∧T'_m} 1_{T'_m>0}.

Corollary 1 If X is a continuous martingale, and φ is a convex function, then φ(X_t) is a continuous local submartingale.

Proof: By Theorem 4, we can take T_n = inf{t : |X_t| > n} as a sequence of stopping times that reduce X_t. By definition of X^{T_n}_t, we therefore have |X^{T_n}_t| ≤ n, so E[|X^{T_n}_t|] ≤ n for all t ≥ 0.

Now, since |X^{T_n}_t| ≤ n, φ(X^{T_n}_t) takes values in φ([−n, n]). Since φ is convex, it is continuous, so that φ([−n, n]) is bounded. Hence, |φ(X^{T_n}_t)| ≤ M for some 0 < M < ∞, and so E[|φ(X^{T_n}_t)|] ≤ M.

So, by Jensen's inequality, for s ≤ t:

E[φ(X^{T_n}_t)|F_{T_n∧s}] ≥ φ(E[X^{T_n}_t|F_{T_n∧s}]) = φ(X^{T_n}_s).

Thus, φ(X^{T_n}_t) is a submartingale, so that φ(X_t) is a local submartingale.


In the proof of Theorem 4, we used the fact that X^{T_n}_t is a martingale w.r.t. F_{t∧T_n}, t ≥ 0, as per the definition of a continuous local martingale. In general, we have

Theorem 5 Let S be a stopping time. Then, X^S_t is a martingale w.r.t. F_{t∧S}, t ≥ 0 if and only if it is a martingale w.r.t. F_t, t ≥ 0.

Proof: (⇐) We begin by proving the following

Claim 1 If S ≤ T are stopping times, then F_S ⊆ F_T.

Proof of Claim: Recall that if S is a stopping time,

F_S = {A : A ∩ {S ≤ t} ∈ F_t for all t ≥ 0}.

Now, let A ∈ F_S. Then, A ∩ {S ≤ t} ∈ F_t. Since {T ≤ t} ⊆ {S ≤ t} (because S ≤ T),

{T ≤ t} = {T ≤ t} ∩ {S ≤ t}.

Finally, since T is a stopping time, {T ≤ t} ∈ F_t, so that we have:

A ∩ {T ≤ t} = (A ∩ {S ≤ t}) ∩ {T ≤ t} ∈ F_t.

Thus, A ∈ F_T since t was arbitrary, and so F_S ⊆ F_T. ◇

Now, suppose that X^S_t is a martingale w.r.t. F_t, t ≥ 0. Then, for s ≤ t:

E[X^S_t|F_s] = X^S_s  ⇔  ∫_A X^S_s = ∫_A X^S_t

for all A ∈ F_s. By the claim, F_{s∧S} ⊆ F_s, so the same identity holds for any A ∈ F_{s∧S}, and therefore

E[X^S_t|F_{s∧S}] = X^S_s.

Since s, t were arbitrary, X^S_t is a martingale w.r.t. F_{t∧S}, t ≥ 0.

(⇒): Suppose that X^S_t is a martingale w.r.t. F_{t∧S}, t ≥ 0. Let A ∈ F_s.

Claim 2 A ∩ {S > s} ∈ F_{S∧s}.

Proof of Claim: If r < s and ω ∈ {S > s}, then

(S ∧ s)(ω) = s > r.

Thus, ω ∈ {S ∧ s ≤ r}^c, so {S > s} ⊆ {S ∧ s ≤ r}^c, and hence

{S > s} ∩ {S ∧ s ≤ r} = ∅  ⇒  A ∩ {S > s} ∩ {S ∧ s ≤ r} = ∅ ∈ F_r

for r < s.


If r ≥ s, then F_s ⊆ F_r. Thus, A ∈ F_s ⊆ F_r and {S > s} = {S ≤ s}^c ∈ F_s ⊆ F_r. Thus, since S ∧ s is a stopping time:

(A ∩ {S > s}) ∩ {S ∧ s ≤ r} ∈ F_r.

So, for all values of r, (A ∩ {S > s}) ∩ {S ∧ s ≤ r} ∈ F_r, so that A ∩ {S > s} ∈ F_{S∧s}. ◇

By the above claim, A ∩ {S > s} ∈ F_{S∧s}, so that since X^S_t is a martingale w.r.t. F_{t∧S}, we have for s ≤ t:

E[X^S_t|F_{s∧S}] = X^S_s  ⇔  ∫_{A∩{S>s}} X^S_t = ∫_{A∩{S>s}} X^S_s.

On the other hand,

∫_{A∩{S≤s}} X^S_t = ∫_{A∩{S≤s}} X_S · 1_{S>0} = ∫_{A∩{S≤s}} X^S_s.

So, for s ≤ t:

∫_A X^S_t = ∫_{A∩{S>s}} X^S_t + ∫_{A∩{S≤s}} X^S_t = ∫_{A∩{S>s}} X^S_s + ∫_{A∩{S≤s}} X^S_s = ∫_A X^S_s

⇒ E[X^S_t|F_s] = X^S_s

by uniqueness of conditional expectation.

Thus, in the definition of a continuous local martingale, X^{T_n}_t can be allowed to be a martingale w.r.t. F_t. We use this in Theorem 6.

When is a continuous local martingale a martingale?

Definition: A real-valued process X is said to be of class DL if for every t > 0, the family of random variables {X_T}, where T ranges through all stopping times less than t, is uniformly integrable.

Theorem 6 A local martingale is a martingale if and only if it is of class DL.

Proof: (⇒): Suppose that X_t is a local martingale that is a martingale, let t > 0 be fixed, and let T be a stopping time s.t. T ≤ t. Since X_t is a (true) martingale, we have that

X_s = E[X_t|F_s]

for s ≤ t.

Now, Theorem 5.1 of Durrett states:

Given a probability space (Ω, F, P) and an X ∈ L^1, the family {E[X|G] : G is a σ-field ⊆ F} is uniformly integrable.


So, X_{s∧t} = E[X_t|F_s], which is a uniformly integrable martingale (in s) by Durrett's theorem, since X_t ∈ L^1(Ω, F, P).

Thus, the Optional Stopping Theorem implies X_T = E[X_t|F_T], so that {X_T}_{T≤t} = {E[X_t|F_T]}_{T≤t}, the latter of which is again uniformly integrable by Durrett's Theorem.

Since t > 0 was arbitrary, the result follows.

(⇐): Suppose that X_t is a local martingale that is of class DL. Let T_n be a sequence of stopping times that reduce X.

For fixed t ≥ 0, then, we have that X^{T_n}_t → X_t a.s. as n → ∞. By hypothesis, {X^{T_n}_t}_{n≥0} are uniformly integrable, so that we must have X^{T_n}_t → X_t in L^1. Thus, X_t ∈ L^1(Ω, F, P), and since t was arbitrary, this is true for all t > 0.

Let s ≤ t.

Now, X^{T_n}_t → X_t a.s. together with uniform integrability of {X^{T_n}_t} implies that this convergence happens in L^1.

Claim 3 X^{T_n}_t → X_t in L^1 implies E[X^{T_n}_t|F_s] → E[X_t|F_s] in L^1.

Proof of Claim:

‖E[X_t|F_s] − E[X^{T_n}_t|F_s]‖_1 = ‖E[X_t − X^{T_n}_t|F_s]‖_1 ≤ ‖X_t − X^{T_n}_t‖_1 < ε

for n > N, since X^{T_n}_t → X_t in L^1. ◇

By the claim, E[X^{T_n}_t|F_s] → E[X_t|F_s] in L^1.

On the other hand, since X_t is a local martingale (recall Theorem 5), E[X^{T_n}_t|F_s] = X^{T_n}_s → X_s a.s. as n → ∞, so that since the X^{T_n}_s are uniformly integrable, E[X^{T_n}_t|F_s] → X_s in L^1.

Thus, X_s = E[X_t|F_s].


Corollary 2 If X_t is a local martingale, and

E[ sup_{0≤s≤t} |X_s| ] < ∞

for each t, then X_t is a martingale.

Proof: Let t > 0 be fixed. For 0 ≤ k ≤ t, |X_k| ≤ sup_{0≤s≤t} |X_s|. Thus, if T is a stopping time less than t,

|X_T| ≤ sup_{0≤s≤t} |X_s|.

Since sup_{0≤s≤t} |X_s| is an integrable function, it follows immediately that the family {X_T}, where T ranges through all stopping times less than t, is uniformly integrable (it is dominated by a single integrable random variable). Since t was arbitrary, the result follows from Theorem 6.

Corollary 3 A bounded local martingale is a martingale.

Proof: Obvious.

Theorem 7 Let X_t be a local martingale on [0, τ). If

E[ sup_{0≤s≤τ} |X_s| ] < ∞,

then X_τ = lim_{t↑τ} X_t exists and E[X_0] = E[X_τ].

Proof: |X_k| ≤ sup_{0≤s≤τ} |X_s| for 0 ≤ k < τ, so that E[|X_k|] ≤ E[sup_{0≤s≤τ} |X_s|] < ∞ for 0 ≤ k < τ.

Thus, {X_t}_{0≤t<τ} is uniformly integrable.

The method of Theorem 6 (⇐) carries over, so that X_t is a martingale on [0, τ).

By the Martingale Convergence Theorem (which we can apply since E[|X_k|] ≤ E[sup_{0≤s≤τ} |X_s|] < ∞ for 0 ≤ k < τ implies sup_{0≤s<τ} E[|X_s|] ≤ E[sup_{0≤s≤τ} |X_s|] < ∞), X_τ(ω) := lim_{t→τ} X_t(ω) exists for a.e. ω ∈ Ω.

Since {X_t}_{0≤t<τ} is uniformly integrable, this convergence occurs in L^1. But then,

E[X_τ] = lim_{t→τ} E[X_t] = E[X_0]

since X_t is a martingale.


3 The Doob-Meyer Decomposition: Motivation

In this section, we motivate the construction of variance and covariance processes for continuous local

martingales, which is crucial in the construction of stochastic integrals w.r.t. continuous local martingales

as we shall see.

In this section, unless otherwise specified, we fix a Brownian motion Bt and a filtration Ft such that:

1. For each t, Bt is Ft-measurable.

2. For any s ≤ t, the random variable B_t − B_s is independent of the σ-field F_s.

Recall that for any Brownian motion, 〈B〉_t = t, where 〈B〉_t is the quadratic variation of B_t. This immediately implies (2) in the following

Definition: Define L^2_ad([a,b] × Ω) to be the space of all stochastic processes f(t, ω), a ≤ t ≤ b, ω ∈ Ω, such that:

1. f(t, ω) is adapted to the filtration F_t.

2. ∫_a^b E[|f(t)|^2] dt = ∫_a^b E[|f(t)|^2] d〈B〉_t < ∞.

Also recall that when constructing a theory of integration w.r.t. a Brownian motion, we begin by constructing the stochastic integral

∫_a^b f(t) dB_t

for f ∈ L^2_ad([a,b] × Ω).

Now, we want a more general formalism for integrating a class of processes w.r.t. a generalized martingale which, in the case of Brownian motion, will reduce to the above.

Definition: Let G_t be a right-continuous filtration. We let L denote the collection of all jointly measurable stochastic processes X(t, ω) such that:

1. X_t is adapted w.r.t. G_t.

2. Almost all sample paths of X_t are left continuous.

Furthermore, we define P to be the smallest σ-field of subsets of [a,b] × Ω with respect to which all the stochastic processes in L are measurable. A stochastic process Y(t, ω) that is P-measurable is said to be predictable.


The motivation for the definition of a predictable process comes from the following argument: if Y_t is a predictable process, then almost all its values at time t can be determined [with certainty] from the information available strictly before time t, since left continuity of the process Y_t implies that for almost every ω ∈ Ω and any sequence t_n ↑ t as n → ∞:

lim_{n→∞} Y_{t_n}(ω) = Y_t(ω).
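To see why evaluating the integrand at the left endpoint (the predictable choice) matters, here is a small sketch (Python/NumPy; my own illustration, not from the notes) comparing left-point and right-point Riemann sums for ∫_0^T B_s dB_s. The left-point sums approximate the Itô integral (B_T^2 − T)/2, while the right-point sums converge to a different limit, (B_T^2 + T)/2:

```python
import numpy as np

# Sketch: left-endpoint vs right-endpoint sums for "integrating B against dB".
rng = np.random.default_rng(2)
T, n_steps, n_paths = 1.0, 2_000, 5_000
dt = T / n_steps

dB = np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))
B = np.cumsum(dB, axis=1)
B_left = np.hstack([np.zeros((n_paths, 1)), B[:, :-1]])   # B at left endpoints

left_sum = np.sum(B_left * dB, axis=1)    # predictable (Ito) approximation
right_sum = np.sum(B * dB, axis=1)        # non-predictable (right endpoint)

print("E[left sum]  ~", left_sum.mean())   # ~ 0: the Ito integral is centered
print("E[right sum] ~", right_sum.mean())  # ~ T: picks up the quadratic variation
```

The two choices differ (in the mean) by exactly the quadratic variation 〈B〉_T = T, which is why the integrands in this theory are required to be predictable.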

Now, we have the following theorem [a version of which we shall prove in the next section for continuous local martingales]:

Theorem 8 (Doob-Meyer) Let M_t, a ≤ t ≤ b, be a right continuous, square integrable martingale with left hand limits. Then, there exists a unique decomposition

(M_t)^2 = L_t + A_t,  a ≤ t ≤ b,

where L_t is a right-continuous martingale with left-hand limits, and A_t is a predictable, right continuous, increasing process such that A_a ≡ 0 and E[A_t] < ∞ for all a ≤ t ≤ b.

The above theorem certainly applies to the square integrable process B_t.

Claim 4 In the case M_t = B_t in Doob-Meyer, A_t = 〈B〉_t = t.

Proof of Claim 4: WLOG, we may take a = 0 and b = t_0. Define P_t = (B_t)^2 − t. Then, for 0 ≤ s ≤ t ≤ t_0:

E[(B_t)^2|F_s] = E[(B_t − B_s + B_s)^2|F_s] = E[(B_t − B_s)^2 + 2B_s(B_t − B_s) + (B_s)^2|F_s]

= E[(B_t − B_s)^2] + 2B_s E[B_t − B_s] + (B_s)^2 = t − s + (B_s)^2

⇒ E[P_t|F_s] = E[(B_t)^2 − t|F_s] = (B_s)^2 − s = P_s.

Thus, P_t = (B_t)^2 − t is a martingale, so that (B_t)^2 = P_t + t. Clearly, A_t = t satisfies all the conditions that A_t must satisfy in Doob-Meyer, so by uniqueness, A_t = t = 〈B〉_t.

◇
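As a quick numerical check of Claim 4 (a sketch in Python/NumPy, not part of the notes), one can compute the discretized quadratic variation ∑_k (B_{t_k} − B_{t_{k−1}})^2 of a simulated Brownian path over increasingly fine partitions of [0, t] and watch it settle near t:

```python
import numpy as np

# Sketch: the sum of squared increments of one Brownian path over [0, t]
# approaches <B>_t = t as the partition mesh goes to zero.
rng = np.random.default_rng(3)
t = 2.0
n_fine = 2 ** 16                                    # fine grid used to build the path
dB = np.sqrt(t / n_fine) * rng.standard_normal(n_fine)
B = np.concatenate([[0.0], np.cumsum(dB)])          # B on the fine grid, B_0 = 0

for n in (2 ** 4, 2 ** 8, 2 ** 12, 2 ** 16):        # coarser sub-partitions
    idx = np.arange(0, n_fine + 1, n_fine // n)     # every (n_fine/n)-th grid point
    increments = np.diff(B[idx])
    print(f"n = {n:6d} subintervals:  Q_t = {np.sum(increments**2):.4f}")
print(f"target <B>_t = t = {t}")
```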

So, another way of viewing the integral w.r.t. a martingale M_t adapted to the filtration G_t is the following: first, we look for the unique process (guaranteed by Doob-Meyer) 〈M〉_t such that

L_t = (M_t)^2 − 〈M〉_t

is a martingale. Then, we make the


Definition: Define L^2_pred([a,b]_{〈M〉} × Ω) to be the space of all stochastic processes f(t, ω), a ≤ t ≤ b, ω ∈ Ω, such that:

1. f(t, ω) is predictable w.r.t. G_t.

2. ∫_a^b E[|f(t)|^2] d〈M〉_t < ∞.

Then, we proceed to construct the integral

∫_a^b f(t) dM_t

for f ∈ L^2_pred([a,b]_{〈M〉} × Ω).

It's clear that in the case M_t = B_t and G_t = F_t, the above formulation coincides with the original construction of the stochastic integral w.r.t. B_t reviewed at the beginning of this section.

For right continuous, square integrable martingales M_t with left hand limits, at least, this procedure works. In the case where M_t is a continuous local martingale, we do the same thing. However, it's not immediately clear:

1. that we have a version of Doob-Meyer for continuous local martingales;

2. how the construction of the integral is affected by the stopping times T_n that reduce M_t, if at all.

In the next section, we deal with the first problem. Then, we proceed to remedy the second.


4 Variance and Covariance Processes

We take L and P as defined in section 3.

Theorem 9 If X_t is a continuous local martingale, then we define the variance process 〈X〉_t to be the unique continuous predictable increasing process A_t that has A_0 ≡ 0 and makes X_t^2 − A_t a local martingale.

Definition: If X and Y are two continuous local martingales, we let

〈X, Y〉_t = (1/4)(〈X + Y〉_t − 〈X − Y〉_t).

We call 〈X, Y〉_t the covariance of X and Y.

Based on the discussion in the first section, it's clear why we're interested in variance processes. It is convenient to define covariance processes since they are very useful and have quite nice properties, such as:

Theorem 10 〈·, ·〉_t is a symmetric bilinear form on the class of continuous local martingales.

We might prove it this time around. If not, hopefully next time. Two questions I'm still pondering are:

1. Can you turn this into an inner product?

2. If so, how can you characterize the class of processes that is the completion of this space?

The proof of theorem 9 is long, but it is instructive to go through it, since it develops techniques that will be useful later. In order to proceed, recall that any predictable discrete time martingale is constant [why?]. There is a result analogous to this in continuous time, and we use it to prove the uniqueness statement in theorem 9:

Theorem 11 Any continuous local martingale X_t that is predictable and locally of bounded variation is constant (in time).

Proof of theorem 11: By subtracting X_0, we may assume that X_0 ≡ 0. Thus, we wish to show that X_t ≡ 0 for all t > 0 almost surely.

Let V_t(ω) = sup_{π∈Π_t} T_π(ω) be the variation of X_s(ω) on [0, t], where Π_t denotes the set of all (finite) partitions π = {0 = t_0 < t_1 < ··· < t_N = t} of [0, t], and where for a given partition of this sort,

T_π(ω) = ∑_{m=1}^N |X_{t_m}(ω) − X_{t_{m−1}}(ω)|.

Lemma 1 For almost all ω ∈ Ω, t ↦ V_t(ω) is continuous.

Proof of Lemma 1: First, notice that for any ω ∈ Ω, t ↦ V_t(ω) is increasing:


For s < t, [0, s] ⊂ [0, t], so that any finite partition π = {0 = t_0 < t_1 < ··· < t_N = s} of [0, s] gives a finite partition π′ = {0 = t_0 < t_1 < ··· < t_N = s < t_{N+1} = t} of [0, t]. Thus, for any finite partition π of [0, s], T_π(ω) ≤ T_{π′}(ω), where π′ is a finite partition of [0, t], so that

T_π(ω) ≤ T_{π′}(ω) ≤ sup_{π∈Π_t} T_π(ω) = V_t(ω)

⇒ V_s(ω) = sup_{π∈Π_s} T_π(ω) ≤ sup_{π∈Π_t} T_π(ω) = V_t(ω).

Since ω was arbitrary, this is true for all ω ∈ Ω.

Thus, to show that t ↦ V_t is continuous a.s., it suffices to show that for almost all ω ∈ Ω, t ↦ V_t(ω) has no discontinuities (of the first kind).

Claim 5 For any ω ∈ Ω, V_u(ω) = V_s(ω) + V_s^u(ω), where V_s^u(ω) is the variation of X_t(ω) on [s, u].

Proof of Claim 5: Take any two partitions s = t_0 < t_1 < ··· < t_N = u and 0 = t_{−N′} < t_{−N′+1} < ··· < t_0 = s. Then:

∑_{m=−N′+1}^{0} |X_{t_m}(ω) − X_{t_{m−1}}(ω)| + ∑_{m=1}^{N} |X_{t_m}(ω) − X_{t_{m−1}}(ω)| ≤ V_s(ω) + V_s^u(ω).

Now, the LHS is T_π(ω) for π = {0 = t_{−N′} < ··· < t_0 = s < ··· < t_N = u}. Given an arbitrary partition π′ of [0, u], it's clear that there exists a partition π′′ such that π′ ⊂ π′′ and π′′ is of the form π. Since T_{π′}(ω) ≤ T_{π′′}(ω), and π′ was arbitrary, we have that V_u(ω) ≤ V_s(ω) + V_s^u(ω).

For the other inequality, note that 0 = t_{−N′} < ··· < t_0 = s < ··· < t_N = u is a partition of [0, u]. Thus:

V_u(ω) ≥ ∑_{m=−N′+1}^{N} |X_{t_m}(ω) − X_{t_{m−1}}(ω)| = ∑_{m=−N′+1}^{0} |X_{t_m}(ω) − X_{t_{m−1}}(ω)| + ∑_{m=1}^{N} |X_{t_m}(ω) − X_{t_{m−1}}(ω)|.

Now, fixing one of the partitions on the RHS, we may take the supremum over the other, and then proceed to take the supremum over the remaining one. Thus V_u(ω) ≥ V_s(ω) + V_s^u(ω), so that V_u(ω) = V_s(ω) + V_s^u(ω). ◇

Now, by hypothesis, X_s is of locally bounded variation. So, there exists a sequence of stopping times T_n ↑ ∞ a.s. such that X^{T_n}_s(ω) is of bounded variation in time. Let

A = {ω ∈ Ω : T_n(ω) ↑ ∞},
B = {ω ∈ Ω : X_t(ω) is continuous}.

By definition, P[A] = P[B] = 1 ⇒ P[A ∩ B] = 1. Now, let ω ∈ A ∩ B be fixed, and suppose that s ↦ V_s(ω) has a discontinuity at t. Choosing n large enough so that T_n(ω) > t, there exist s_0 ≤ t < u_0 such that X_s(ω) is of bounded variation on [s_0, u_0].


Since s ↦ V_s(ω) has a discontinuity at t, there exists ε > 0 such that for every δ > 0, u − s < δ implies V_u(ω) − V_s(ω) > 3ε where s < t < u. By Claim 5 then, for every δ > 0, u − s < δ implies V_s^u(ω) > 3ε where s < t < u.

Pick δ′ > 0 so that if |r − s| < δ′ then |X_s − X_r| < ε (using uniform continuity of X_s(ω) on [s_0, u_0]).

Assuming s_n and u_n have been defined, pick a partition of [s_n, u_n] not containing t with mesh less than δ′ and variation greater than 2ε (this is possible since for every δ > 0, u − s < δ implies V_s^u(ω) > 3ε where s < t < u).

Let s_{n+1} be the largest point in the partition less than t, and u_{n+1} be the smallest point in the partition larger than t. Then u_{n+1} − s_{n+1} < δ′ ⇒ |X_{s_{n+1}}(ω) − X_{u_{n+1}}(ω)| < ε. Thus:

∑_{m=1}^N |X_{t_m}(ω) − X_{t_{m−1}}(ω)| > 2ε  ⇒  ∑_{m=1, t_m ≠ s_{n+1}, u_{n+1}}^N |X_{t_m}(ω) − X_{t_{m−1}}(ω)| > 2ε − |X_{u_{n+1}}(ω) − X_{s_{n+1}}(ω)| > ε.

By omitting the points s_{n+1} and u_{n+1} from the partition, we obtain a partition for [s_n, u_n] \ [s_{n+1}, u_{n+1}]. Thus, after taking supremums:

V_{[s_n,u_n] \ [s_{n+1},u_{n+1}]}(ω) = V_{s_n}^{u_n}(ω) − V_{s_{n+1}}^{u_{n+1}}(ω) > ε.

Thus, V_{s_0}^{u_0}(ω) > Mε for arbitrarily large integer values M, so it must be infinite, contradicting that X_s(ω) has bounded variation on [s_0, u_0].

Thus, t ↦ V_t(ω) must be continuous for almost every ω ∈ Ω, since ω ∈ A ∩ B was arbitrary. ◇

Now, we needed Lemma 1 in order to guarantee that the functions

S_n(ω) = inf{s : V_s(ω) ≥ n}

are stopping times (why?).

Lemma 2 The S_n reduce X_t.

Proof of Lemma 2: Recall theorem 4:

If X is a continuous local martingale, we can always take the sequence which reduces X to be T_n = inf{t : |X_t| > n}, or any other sequence T'_n ≤ T_n that has T'_n ↑ ∞ as n ↑ ∞.

Now, suppose that t satisfies n < |X_t| = |X_t − X_0| ≤ V_t. Then V_t > n, so that

{t : |X_t| > n} ⊆ {t : V_t ≥ n}  ⇒  S_n = inf{t : V_t ≥ n} ≤ inf{t : |X_t| > n} = T_n.


Certainly, V_s ≥ n + 1 implies V_s ≥ n, so that

{t : V_t ≥ n + 1} ⊆ {t : V_t ≥ n}  ⇒  S_n = inf{t : V_t ≥ n} ≤ inf{t : V_t ≥ n + 1} = S_{n+1}.

Finally, since t ↦ V_t is continuous (hence finite) a.s., it's clear that lim_{n→∞} S_n(ω) = ∞ almost surely. Thus, the S_n reduce X_t. ◇

Now, fix some n > 0. Then, t ≤ S_n implies |X_t| ≤ V_t ≤ n. By Lemma 2, M_t = X_{t∧S_n} is a bounded martingale.

Now, if s < t:

E[(M_t − M_s)^2|F_s] = E[M_t^2|F_s] − 2M_s E[M_t|F_s] + M_s^2 = E[M_t^2|F_s] − M_s^2 = E[M_t^2 − M_s^2|F_s]

(we refer to this relationship as orthogonality of martingale increments). If 0 = t_0 < t_1 < ··· < t_N = t is a partition of [0, t], then since M_0 = X_0 = 0 we have:

E[M_t^2] = E[∑_{m=1}^N (M_{t_m}^2 − M_{t_{m−1}}^2)] = E[∑_{m=1}^N (M_{t_m} − M_{t_{m−1}})^2] ≤ E[V_{t∧S_n} sup_m |M_{t_m} − M_{t_{m−1}}|] ≤ n E[sup_m |M_{t_m} − M_{t_{m−1}}|].

Taking a sequence of partitions ∆_n = {0 = t^n_0 < t^n_1 < ··· < t^n_{k(n)} = t} in which the mesh |∆_n| = sup_m |t^n_m − t^n_{m−1}| → 0, continuity of sample paths implies sup_m |M_{t^n_m} − M_{t^n_{m−1}}| → 0 a.s.

Since sup_m |M_{t_m} − M_{t_{m−1}}| ≤ 2n, the bounded convergence theorem implies

E[sup_m |M_{t^n_m} − M_{t^n_{m−1}}|] → 0.

Thus, E[M_t^2] = 0, so that M_t = 0 a.s.

Let A_t = {ω ∈ Ω : M_t(ω) ≠ 0}. Then, since t above was arbitrary, P[A_t] = 0 for any t, so that

P[ ⋃_{t∈Q, t≥0} A_t ] = 0.

Thus, with probability 1, M_t = 0 for all rational t. By continuity of sample paths, we have that M_t = 0 for all t, with probability 1. Since this holds for every n and S_n ↑ ∞, it follows that X_t ≡ 0 for all t, as desired.

Uniqueness in theorem 9: Suppose that A_t and A'_t are two continuous, predictable, increasing processes that have A_0 = A'_0 ≡ 0 and make X_t^2 − A_t and X_t^2 − A'_t local martingales. If T_n reduce X_t^2 − A_t and T'_n reduce X_t^2 − A'_t, it's clear that T_n ∧ T'_n reduce (X_t^2 − A_t) − (X_t^2 − A'_t) = A'_t − A_t, so that A'_t − A_t is a continuous local martingale.

It's clear that A'_t − A_t is predictable, since each of A'_t and A_t is predictable.


Finally, A'_t − A_t is locally of bounded variation. To see this, take the stopping times S_n = T_n ∧ T'_n. Clearly, T_n ∧ T'_n ↑ ∞, and the stopped processes A'_{t∧T_n∧T'_n} − A_{t∧T_n∧T'_n} are of bounded variation for each ω, being the difference of two increasing processes on the random interval [0, T_n(ω) ∧ T'_n(ω)].

Thus, by theorem 11, A'_t − A_t must be constant, so that since A'_0 = A_0 = 0, A'_t − A_t = 0 for all t. Thus, A'_t = A_t for all t.

The existence proof is a little more difficult, but uses some great analysis.

Existence in theorem 9:

We proceed in steps:

Step 1: Proof of existence in theorem 9 when X_t is a bounded martingale (note that uniqueness follows from the previous argument):

Given a partition ∆ = {0 = t_0 < t_1 < ···} with lim_{n→∞} t_n = ∞, let k(t) = sup{k : t_k < t} be the index of the last partition point before t; note that k(t) is not a random variable, but a number.

Define

Q^∆_t(X) = ∑_{k=1}^{k(t)} (X_{t_k} − X_{t_{k−1}})^2 + (X_t − X_{t_{k(t)}})^2.

Lemma 3 If X_t is a bounded continuous martingale, then X_t^2 − Q^∆_t(X) is a martingale.

Proof of Lemma 3: First, notice that

Q^∆_t − Q^∆_s = ∑_{k=1}^{k(t)} (X_{t_k} − X_{t_{k−1}})^2 + (X_t − X_{t_{k(t)}})^2 − ∑_{k=1}^{k(s)} (X_{t_k} − X_{t_{k−1}})^2 − (X_s − X_{t_{k(s)}})^2

= (X_{t_{k(s)+1}} − X_{t_{k(s)}})^2 − (X_s − X_{t_{k(s)}})^2 + ∑_{k=k(s)+2}^{k(t)} (X_{t_k} − X_{t_{k−1}})^2 + (X_t − X_{t_{k(t)}})^2.

Define u_i = t_i for k(s) ≤ i ≤ k(t) and u_{k(t)+1} = t. Then, writing Q^∆_t = Q^∆_s + (Q^∆_t − Q^∆_s):

E[X_t^2 − Q^∆_t(X)|F_s]

= E[X_t^2|F_s] − Q^∆_s(X) − E[ ∑_{i=k(s)+2}^{k(t)+1} (X_{u_i} − X_{u_{i−1}})^2 | F_s ] − E[(X_{t_{k(s)+1}} − X_{t_{k(s)}})^2|F_s] + E[(X_s − X_{t_{k(s)}})^2|F_s]

= E[X_t^2|F_s] − Q^∆_s(X) − E[ ∑_{i=k(s)+2}^{k(t)+1} (X_{u_i}^2 − X_{u_{i−1}}^2) | F_s ] − E[X_{t_{k(s)+1}}^2 − 2X_{t_{k(s)+1}}X_{t_{k(s)}} + X_{t_{k(s)}}^2 | F_s] + E[X_s^2 − 2X_s X_{t_{k(s)}} + X_{t_{k(s)}}^2 | F_s]

= −Q^∆_s(X) + E[X_{t_{k(s)+1}}^2|F_s] − E[X_{t_{k(s)+1}}^2|F_s] + 2X_s X_{t_{k(s)}} − X_{t_{k(s)}}^2 + X_s^2 − 2X_s X_{t_{k(s)}} + X_{t_{k(s)}}^2

= X_s^2 − Q^∆_s(X),

where in the first equality we have used the fact that Q^∆_s(X) is F_s measurable, and in the second equality we have used the orthogonality of martingale increments. ◇

Lemma 4 Let X_t be a bounded continuous martingale. Fix r > 0 and let ∆_n be a sequence of partitions

0 = t^n_0 < ··· < t^n_{k_n} = r

of [0, r] with mesh |∆_n| = sup_k |t^n_k − t^n_{k−1}| → 0. Then, Q^{∆_n}_r(X) converges to a limit in L^2(Ω, F, P).

Proof of Lemma 4: First, we begin with some notation. If ∆ and ∆′ are two partitions of [0, r], we let ∆∆′ denote the partition obtained by taking all the points in ∆ and ∆′.

Now, by lemma 3, for fixed partitions ∆ and ∆′ of [0, r], we have that for a bounded continuous martingale X_t,

Y_t = (X_t^2 − Q^{∆′}_t) − (X_t^2 − Q^∆_t) = Q^∆_t − Q^{∆′}_t

is again a bounded martingale (since |X_t| ≤ M for all t ≥ 0 implies |Q^∆_t − Q^{∆′}_t| ≤ K_M for some constant K_M, the partitions ∆ and ∆′ being fixed). Thus, again by lemma 3, Z_t = (Y_t)^2 − Q^{∆∆′}_t(Y) is a martingale with Z_0 = 0, so that

E[Z_r] = 0  ⇒  E[(Q^∆_r − Q^{∆′}_r)^2] = E[(Y_r)^2] = E[Q^{∆∆′}_r(Y)].

Now,

2a^2 + 2b^2 − (a + b)^2 = (a − b)^2 ≥ 0

for any real numbers a and b, so that

(a + b)^2 ≤ 2(a^2 + b^2)

for any real numbers a, b. Thus:

Q^{∆∆′}_r(Y) = ∑_{k=1}^{k(r)} (Y_{t_k} − Y_{t_{k−1}})^2 + (Y_r − Y_{t_{k(r)}})^2

= ∑_{k=1}^{k(r)} [(Q^∆_{t_k} − Q^{∆′}_{t_k}) − (Q^∆_{t_{k−1}} − Q^{∆′}_{t_{k−1}})]^2 + [(Q^∆_r − Q^{∆′}_r) − (Q^∆_{t_{k(r)}} − Q^{∆′}_{t_{k(r)}})]^2

= ∑_{k=1}^{k(r)} [(Q^∆_{t_k} − Q^∆_{t_{k−1}}) − (Q^{∆′}_{t_k} − Q^{∆′}_{t_{k−1}})]^2 + [(Q^∆_r − Q^∆_{t_{k(r)}}) − (Q^{∆′}_r − Q^{∆′}_{t_{k(r)}})]^2

≤ 2 { ∑_{k=1}^{k(r)} [(Q^∆_{t_k} − Q^∆_{t_{k−1}})^2 + (Q^{∆′}_{t_k} − Q^{∆′}_{t_{k−1}})^2] + (Q^∆_r − Q^∆_{t_{k(r)}})^2 + (Q^{∆′}_r − Q^{∆′}_{t_{k(r)}})^2 }

= 2 [ Q^{∆∆′}_r(Q^∆) + Q^{∆∆′}_r(Q^{∆′}) ],

where the points t_k above are those of the combined partition ∆∆′.


Putting it all together then, we have:

E[(Q^∆_r − Q^{∆′}_r)^2] = E[Q^{∆∆′}_r(Y)] ≤ 2E[Q^{∆∆′}_r(Q^∆) + Q^{∆∆′}_r(Q^{∆′})].

Thus, to show that Q^{∆_n}_r(X) is Cauchy in L^2(Ω, F, P), and hence converges in this space since it is complete, it is sufficient to show that

|∆| + |∆′| → 0  ⇒  E[Q^{∆∆′}_r(Q^∆)] → 0.

To do this, let {s_k}_{k=1}^n = ∆∆′ and {t_j} = ∆. Let s_k ∈ ∆∆′ and t_j ∈ ∆ be such that t_j ≤ s_k < s_{k+1} ≤ t_{j+1}. Then:

Q^∆_{s_{k+1}} − Q^∆_{s_k} = (X_{s_{k+1}} − X_{t_j})^2 − (X_{s_k} − X_{t_j})^2 = (X_{s_{k+1}} − X_{s_k})^2 + 2(X_{s_{k+1}} − X_{s_k})(X_{s_k} − X_{t_j})

= (X_{s_{k+1}} − X_{s_k})(X_{s_{k+1}} + X_{s_k} − 2X_{t_j})

⇒ Q^{∆∆′}_r(Q^∆) ≤ Q^{∆∆′}_r(X) · sup_k (X_{s_{k+1}} + X_{s_k} − 2X_{t_{j(k)}})^2,

where j(k) = sup{j : t_j ≤ s_k}. By the Cauchy-Schwarz inequality:

E[Q^{∆∆′}_r(Q^∆)] ≤ E[Q^{∆∆′}_r(X)^2]^{1/2} · E[ sup_k (X_{s_{k+1}} + X_{s_k} − 2X_{t_{j(k)}})^4 ]^{1/2}.

Since the sample paths of X_t are continuous almost surely, sup_k (X_{s_{k+1}} + X_{s_k} − 2X_{t_{j(k)}})^4 → 0 almost surely as |∆| + |∆′| → 0. Since sup_k (X_{s_{k+1}} + X_{s_k} − 2X_{t_{j(k)}})^4 ≤ (4M)^4, the bounded convergence theorem implies that

E[ sup_k (X_{s_{k+1}} + X_{s_k} − 2X_{t_{j(k)}})^4 ]^{1/2} → 0

as |∆| + |∆′| → 0.

Thus, it remains to show that E[Q^{∆∆′}_r(X)^2] is bounded. To do this, note that:

Q^{∆∆′}_r(X)^2 = ( ∑_{m=1}^n (X_{s_m} − X_{s_{m−1}})^2 )^2

= ∑_{m=1}^n (X_{s_m} − X_{s_{m−1}})^4 + 2 ∑_{m=1}^{n−1} (X_{s_m} − X_{s_{m−1}})^2 ∑_{j=m+1}^n (X_{s_j} − X_{s_{j−1}})^2

= ∑_{m=1}^n (X_{s_m} − X_{s_{m−1}})^4 + 2 ∑_{m=1}^{n−1} (X_{s_m} − X_{s_{m−1}})^2 (Q^{∆∆′}_r(X) − Q^{∆∆′}_{s_m}(X))

⇒ E[Q^{∆∆′}_r(X)^2] = E[ ∑_{m=1}^n (X_{s_m} − X_{s_{m−1}})^4 ] + 2E[ ∑_{m=1}^{n−1} (X_{s_m} − X_{s_{m−1}})^2 (Q^{∆∆′}_r(X) − Q^{∆∆′}_{s_m}(X)) ].

To bound the first term on the RHS, note that |X_t| ≤ M for all t implies:

E[ ∑_{m=1}^n (X_{s_m} − X_{s_{m−1}})^4 ] ≤ (2M)^2 E[ ∑_{m=1}^n (X_{s_m} − X_{s_{m−1}})^2 ] = 4M^2 E[ ∑_{m=1}^n (X_{s_m}^2 − X_{s_{m−1}}^2) ] ≤ 4M^2 E[X_r^2] ≤ 4M^4,


where in the first equality we have used that the orthogonality of martingale increments,

E[(X_{s_m} − X_{s_{m−1}})^2|F_{s_{m−1}}] = E[X_{s_m}^2 − X_{s_{m−1}}^2|F_{s_{m−1}}],

implies

E[(X_{s_m} − X_{s_{m−1}})^2] = E[X_{s_m}^2 − X_{s_{m−1}}^2].

For the second term on the RHS, note that (X_{s_m} − X_{s_{m−1}})^2 ∈ F_{s_m}. By lemma 3 and orthogonality of martingale increments:

E[Q^∆_t(X) − Q^∆_s(X)|F_s] = E[X_t^2 − X_s^2|F_s] = E[(X_t − X_s)^2|F_s].

So:

E[(X_{s_m} − X_{s_{m−1}})^2 (Q^{∆∆′}_r(X) − Q^{∆∆′}_{s_m}(X)) | F_{s_m}] = (X_{s_m} − X_{s_{m−1}})^2 E[Q^{∆∆′}_r(X) − Q^{∆∆′}_{s_m}(X) | F_{s_m}]

= (X_{s_m} − X_{s_{m−1}})^2 E[(X_r − X_{s_m})^2|F_{s_m}] ≤ (X_{s_m} − X_{s_{m−1}})^2 (2M)^2

⇒ E[ ∑_{m=1}^{n−1} E[(X_{s_m} − X_{s_{m−1}})^2 (Q^{∆∆′}_r(X) − Q^{∆∆′}_{s_m}(X)) | F_{s_m}] ] ≤ 4M^2 E[ ∑_{m=1}^{n−1} (X_{s_m} − X_{s_{m−1}})^2 ]

⇒ E[ ∑_{m=1}^{n−1} (X_{s_m} − X_{s_{m−1}})^2 (Q^{∆∆′}_r(X) − Q^{∆∆′}_{s_m}(X)) ] ≤ 4M^2 E[ ∑_{m=1}^{n−1} (X_{s_m}^2 − X_{s_{m−1}}^2) ] ≤ 4M^2 E[X_r^2] ≤ 4M^4.

Thus,

E[Q^{∆∆′}_r(X)^2] ≤ 4M^4 + 2 · 4M^4 = 12M^4. ◇

Lemma 5 Let ∆_n be as in Lemma 4. Then, there exists a subsequence {∆_{n_k}} such that {Q^{∆_{n_k}}_t} converges uniformly a.s. on [0, r].

Proof of Lemma 5: Since {Q^{∆_n}_r} converges in L^2(Ω, F, P), it is Cauchy in this space. So, choose a subsequence {Q^{∆_{n_k}}_r} such that for m ≥ n_k,

E[ |Q^{∆_m}_r − Q^{∆_{n_k}}_r|^2 ] < 2^{−k}.

Let

A_k = { ω ∈ Ω : sup_{t≤r} |Q^{∆_{n_{k+1}}}_t(ω) − Q^{∆_{n_k}}_t(ω)| > 1/k^2 }.

By Doob's maximal inequality (applied to the martingale Q^{∆_{n_{k+1}}}_t − Q^{∆_{n_k}}_t):

P[A_k] = P[ sup_{t≤r} |Q^{∆_{n_{k+1}}}_t − Q^{∆_{n_k}}_t| ≥ 1/k^2 ] ≤ k^4 E[ |Q^{∆_{n_{k+1}}}_r − Q^{∆_{n_k}}_r|^2 ] < k^4/2^k.


Since the RHS is summable, Borel-Cantelli implies that P[lim sup_k A_k] = 0. So, for almost all ω ∈ Ω, there exists N_ω such that k > N_ω implies

sup_{t≤r} |Q^{∆_{n_{k+1}}}_t(ω) − Q^{∆_{n_k}}_t(ω)| < 1/k^2.

So, for m > m′ > N_ω:

sup_{t≤r} |Q^{∆_{n_m}}_t(ω) − Q^{∆_{n_{m′}}}_t(ω)| ≤ ∑_{k=m′}^{m−1} sup_{t≤r} |Q^{∆_{n_{k+1}}}_t(ω) − Q^{∆_{n_k}}_t(ω)| < ∑_{k=m′}^{m−1} 1/k^2.

Since the series ∑_{k=1}^∞ 1/k^2 converges, given ε > 0 there exists N such that m, m′ > N implies

∑_{k=m′}^{m−1} 1/k^2 < ε.

Thus, for m, m′ > max{N, N_ω}:

sup_{t≤r} |Q^{∆_{n_m}}_t(ω) − Q^{∆_{n_{m′}}}_t(ω)| < ε.

Thus, {Q^{∆_{n_k}}_t} converges uniformly almost surely on [0, r]. ◇

In what follows, call the limiting function in Lemma 5 A^r_t, and define it to be zero outside [0, r].

Now, for each ∆_{n_k} in lemma 5, we can extend it to a partition ∆′_{n_k} of [0, r + 1] such that |∆′_{n_k}| → 0. Then, for this sequence of partitions, lemma 4 implies that Q^{∆′_{n_k}}_{r+1} converges to a limit in L^2(Ω, F, P).

Repeating the procedure in lemma 5, we can select a subsequence {Q^{∆′_{n_{k_j}}}_t} which converges uniformly almost surely on [0, r + 1]. Call the limiting function A^{r+1}_t, and similarly define it to be zero outside [0, r + 1].

It's clear that for t ≤ r, Q^{∆′_{n_{k_j}}}_t = Q^{∆_{n_{k_j}}}_t, so that A^r_t = A^{r+1}_t for t ≤ r.

Repeating the procedure above, we obtain a sequence of functions {A^{r+j}_t}_{j=0}^∞ such that A^{r+j}_t is continuous on [0, r + j] a.s., and A^{r+j}_t = A^{r+k}_t for t ≤ min{r + j, r + k}. So, we can unambiguously define A_t(ω) = lim_{j→∞} A^{r+j}_t(ω).

If D_j = {ω ∈ Ω : A^{r+j}_t(ω) is not continuous on [0, r + j]}, then clearly D = ⋃_{j=0}^∞ D_j has measure zero. Thus, for ω ∈ D^c, it's clear that A_t(ω) will be continuous, so that A_t is continuous a.s.

It's clear from the construction that A_t is predictable.

To show that A_t is increasing, it is sufficient to show that each A^{r+j}_t is increasing. To this end, let ∆_n be the partition of [0, r + j] with points at k2^{−n}(r + j) for 0 ≤ k ≤ 2^n; clearly, taking this sequence of partitions doesn't alter the above arguments, so that Q^{∆_n}_t → A^{r+j}_t uniformly a.e. on [0, r + j].


Clearly, ∆_{n+1} is a refinement of ∆_n and ⋃_{n=1}^∞ ∆_n is dense in [0, r + j]. Thus, for any pair s < t in ⋃_{n=1}^∞ ∆_n there exists n_0 such that s and t belong to ∆_n for n ≥ n_0. Thus, Q^{∆_n}_s ≤ Q^{∆_n}_t for n ≥ n_0, so that A^{r+j}_s ≤ A^{r+j}_t. Since this is true for any s < t with s, t ∈ ⋃_{n=1}^∞ ∆_n, by continuity of the process it must hold everywhere on [0, r + j].

Thus, A_t is continuous, predictable, and increasing. All we need to verify now is that (X_t)^2 − A_t is a martingale.

Now, for each j, A^{r+j}_t is the limit of processes of the form Q^{∆_{n_k}}_t which converge uniformly almost surely to A^{r+j}_t. Thus, we have convergence in probability.

Similarly, since Q^{∆_{n_k}}_t was obtained as a subsequence of a sequence converging in L^2(Ω, F, P) to, say, Q_t, the subsequence Q^{∆_{n_k}}_t converges to Q_t in L^2(Ω, F, P), so that we also have convergence in probability.

Thus, Q_t = A^{r+j}_t with probability 1, so that Q^{∆_{n_k}}_t converges to A^{r+j}_t in L^2(Ω, F, P).

Claim 6 Suppose that for each n, Z^n_t is a martingale w.r.t. F_t, and that for each t, Z^n_t → Z_t in L^p(Ω, F, P) where p ≥ 1. Then, Z_t is a martingale.

Proof of Claim 6: Recall that since we're working over the finite measure space (Ω, F, P), convergence in L^p(Ω, F, P) implies convergence in L^1(Ω, F, P) (why?).

Now, the martingale property implies that for s < t, E[Z^n_t|F_s] = Z^n_s, so that for any A ∈ F_s,

∫_A E[Z^n_t|F_s] = ∫_A Z^n_s.

Since Z^n_s → Z_s in L^p (and hence in L^1) we have that

lim_{n→∞} ∫_A E[Z^n_t|F_s] = lim_{n→∞} ∫_A Z^n_s = ∫_A Z_s.

Now,

E[|E[Z^n_t|F_s] − E[Z_t|F_s]|^p] = E[|E[Z^n_t − Z_t|F_s]|^p] ≤ E[E[|Z^n_t − Z_t|^p|F_s]] = E[|Z^n_t − Z_t|^p],

where we have used the conditional Jensen inequality and linearity of conditional expectation.

Thus, E[Z^n_t|F_s] → E[Z_t|F_s] in L^p(Ω, F, P), so that E[Z^n_t|F_s] → E[Z_t|F_s] in L^1(Ω, F, P). Thus:

∫_A E[Z_t|F_s] = lim_{n→∞} ∫_A E[Z^n_t|F_s] = ∫_A Z_s,

so that since A ∈ F_s was arbitrary,

E[Z_t|F_s] = Z_s. ◇

Thus, since (X_t)^2 − Q^{∆_{n_k}}_t is a martingale by lemma 3, and Q^{∆_{n_k}}_t → A^{r+j}_t in L^2(Ω, F, P), Claim 6 shows that (X_t)^2 − A^{r+j}_t is a martingale. Thus, (X_t)^2 − A_t is obviously a martingale.


Step 2: Proof of existence in theorem 9 when X_t is a local martingale.

Lemma 6 Let X be a bounded martingale, and T be a stopping time. Then 〈X^T〉 = 〈X〉^T.

Proof of Lemma 6: By the construction in step 1, M_t = (X_t)^2 − 〈X〉_t is a martingale. Then M^T_t = (X^T_t)^2 − 〈X〉_{T∧t} is a martingale, so that by uniqueness of the process 〈X^T〉, we get 〈X^T〉 = 〈X〉^T. ◇

Now, let X_t be a continuous local martingale, with a sequence of stopping times T_n that reduce it. WLOG, we may take the stopping times to be the canonical times T_n = inf{t : |X_t| > n}.

Then, Y^n = X^{T_n} · 1_{T_n>0} is a bounded martingale.

By the results in step 1, there is a unique continuous, predictable, increasing process A^n_t such that (Y^n_t)^2 − A^n_t is a martingale. By lemma 6, for t ≤ T_n, A^n_t = A^{n+1}_t, so that we may unambiguously define 〈X〉_t = A^n_t for t ≤ T_n.

Clearly, 〈X〉_t is continuous, predictable, and increasing. By definition,

X^2_{T_n∧t} · 1_{T_n>0} − 〈X〉_{T_n∧t}

is a martingale, so that (X_t)^2 − 〈X〉_t is a local martingale.

We proceed now to prove the analogue of the above for the covariance process. In particular, it is very useful in computing 〈X, Y〉_t.

Theorem 12 Suppose that X_t and Y_t are continuous local martingales. Then 〈X, Y〉_t is the unique continuous predictable process A_t that is locally of bounded variation, has A_0 = 0, and makes X_tY_t − A_t a local martingale.

Proof of Theorem 12: By definition,

X_tY_t − 〈X, Y〉_t = (1/4)[ ((X_t + Y_t)^2 − 〈X + Y〉_t) − ((X_t − Y_t)^2 − 〈X − Y〉_t) ]

is [obviously] a continuous local martingale.

To prove uniqueness, notice that if A_t and A'_t are two processes with the desired properties, then A_t − A'_t = (X_tY_t − A'_t) − (X_tY_t − A_t) is a continuous local martingale that is of locally bounded variation. Hence, by theorem 11, it must be constant, and since A_0 − A'_0 = 0, we get A_t = A'_t for all t.
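As a sanity check on the covariance process (a Python/NumPy sketch of my own, not from the notes), one can discretize the polarization formula 〈X, Y〉_t = (1/4)(〈X + Y〉_t − 〈X − Y〉_t) for two correlated Brownian motions with correlation ρ, for which the true covariance process is 〈X, Y〉_t = ρt:

```python
import numpy as np

# Sketch: estimate <X, Y>_t for correlated Brownian motions via polarization.
rng = np.random.default_rng(4)
t, n_steps, rho = 1.0, 100_000, 0.6
dt = t / n_steps

dW1 = np.sqrt(dt) * rng.standard_normal(n_steps)
dW2 = np.sqrt(dt) * rng.standard_normal(n_steps)
dX = dW1
dY = rho * dW1 + np.sqrt(1 - rho**2) * dW2        # increments correlated with rho

def quad_var(increments):
    """Discretized quadratic variation: sum of squared increments."""
    return np.sum(increments**2)

cov_est = 0.25 * (quad_var(dX + dY) - quad_var(dX - dY))
print(f"estimated <X, Y>_t = {cov_est:.4f}, expected rho * t = {rho * t:.4f}")
```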


5 Integration

5.1 Integration w.r.t. Bounded Continuous Martingales

It's time to put all our hard work in section 4 to use. In this section, we establish how to integrate predictable processes w.r.t. bounded martingales. Once we establish the Kunita-Watanabe inequality, we will be able to integrate w.r.t. continuous local martingales, and then ultimately, continuous semimartingales.

We proceed in three steps: 1) define the integral for basic integrands, 2) extend the definition to simple

integrands, 3) take limits of simple integrands and define the integral for square integrable integrands.

Throughout the integration section, we will continually verify the following three (desirable) properties

for each class of integrands and integrators:

1. If H and K are predictable, then ((H +K) ·X)t = (H ·X)t + (K ·X)t.

2. If H is predictable and X,Y are continuous bounded martingales, then (H · (X + Y ))t = (H ·X)t + (H · Y )t.

3. For H, K predictable and X, Y bounded continuous martingales, 〈H·X, K·Y〉_t = ∫_0^t H_s K_s d〈X, Y〉_s.

5.1.1 Basic Integrands

Definition: We say that H(s, ω) is a basic predictable process if H(s, ω) = 1_{(a,b]}(s) C(ω) where C ∈ F_a. We set Π_0 to be the set of basic predictable processes, and bΠ_0 to be the set of bounded basic predictable processes.

Definition: When X_t is a continuous martingale and H ∈ Π_0, we define

∫_0^∞ H_s dX_s = C(ω)(X_b(ω) − X_a(ω)),

(H·X)_t = ∫_0^t H_s dX_s = ∫ H_s 1_{[0,t]}(s) dX_s.

Theorem 13 If X_t is a continuous martingale and H ∈ bΠ_0, then (H·X)_t is a continuous martingale.

Proof of Theorem 13:

(H·X)_t =  0                for 0 ≤ t ≤ a,
           C(X_t − X_a)     for a ≤ t ≤ b,
           C(X_b − X_a)     for b ≤ t < ∞,


so it's clear that (H·X)_t is continuous, (H·X)_t ∈ F_t, and E[|(H·X)_t|] < ∞. To verify the martingale property, let s < t. There are three cases we need to check:

1. t < a: Since s < t < a, this implies that (H·X)_t = 0, so that E[(H·X)_t|F_s] = 0 = (H·X)_s.

2. a ≤ t ≤ b: In the first case, where s < a ≤ t ≤ b,

E[(H·X)_t|F_s] = E[E[(H·X)_t|F_a]|F_s] = E[E[C(X_t − X_a)|F_a]|F_s] = E[C E[X_t − X_a|F_a]|F_s] = 0 = (H·X)_s,

where we have used the martingale property E[X_t|F_a] = X_a.

In the second case, where a ≤ s < t ≤ b,

E[(H·X)_t|F_s] = E[C(X_t − X_a)|F_s] = C E[X_t − X_a|F_s] = C(X_s − X_a) = (H·X)_s,

since X_a, C ∈ F_a ⊆ F_s and E[X_t|F_s] = X_s.

3. b < t < ∞: In the first case, s < a:

E[(H·X)_t|F_s] = E[C(X_b − X_a)|F_s] = E[E[C(X_b − X_a)|F_a]|F_s] = E[C E[X_b − X_a|F_a]|F_s] = 0 = (H·X)_s,

since E[X_b|F_a] = X_a by the martingale property.

In the second case, a ≤ s ≤ b:

E[(H·X)_t|F_s] = E[C(X_b − X_a)|F_s] = C E[X_b − X_a|F_s] = C(X_s − X_a) = (H·X)_s,

since X_a, C ∈ F_a ⊆ F_s and E[X_b|F_s] = X_s.

In the third case, b < s < ∞:

E[(H·X)_t|F_s] = E[C(X_b − X_a)|F_s] = C(X_b − X_a) = (H·X)_s.

Corollary 4 If Y_t is constant for t ∉ [a, b] and E[Y_t|F_s] = Y_s when a ≤ s ≤ t ≤ b, then Y_t is a martingale.


5.1.2 Simple Integrands

Definition: We say that H(s, ω) is a simple predictable process if H can be written as the sum of a finite number of basic predictable processes. We set Π_1 to be the set of simple predictable processes, and bΠ_1 to be the set of bounded simple predictable processes.

It's clear that if H ∈ Π_1, then H can be written

H(s, ω) = ∑_{j=1}^m 1_{(t_{j−1},t_j]}(s) C_j(ω),

where t_0 < t_1 < ··· < t_m and C_j ∈ F_{t_{j−1}}. In this case, we make the

Definition:

∫_0^∞ H_s dX_s = ∑_{j=1}^m C_j (X_{t_j} − X_{t_{j−1}}).

Remark: while the representation of H above is not unique, it's clear that this definition of the integral does not depend on the choice of representation of H.
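Here is a minimal sketch (Python/NumPy; my own illustration, with Brownian motion standing in for the martingale X) of the simple-integrand integral ∑_j C_j (X_{t_j} − X_{t_{j−1}}), with C_j chosen bounded and F_{t_{j−1}}-measurable (here C_j = tanh(X_{t_{j−1}})). Averaging over many paths, the integral has mean approximately zero, consistent with Corollary 5 below (the integral of a bounded simple integrand is again a martingale started at 0):

```python
import numpy as np

# Sketch: (H . X)_T = sum_j C_j (X_{t_j} - X_{t_{j-1}}) with C_j measurable
# w.r.t. F_{t_{j-1}}; X is a simulated Brownian motion.
rng = np.random.default_rng(5)
T, m, n_paths = 1.0, 50, 20_000
dt = T / m

dX = np.sqrt(dt) * rng.standard_normal((n_paths, m))
X = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dX, axis=1)])  # X at t_0,...,t_m

C = np.tanh(X[:, :-1])              # C_j = tanh(X_{t_{j-1}}): bounded, left-endpoint
integral = np.sum(C * np.diff(X, axis=1), axis=1)

print("E[(H . X)_T] ~", integral.mean())   # ~ 0: the stochastic integral is centered
```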

Theorem 14 Suppose X and Y are continuous martingales. If H, K ∈ Π_1, then

((H + K)·X)_t = (H·X)_t + (K·X)_t,

(H·(X + Y))_t = (H·X)_t + (H·Y)_t.

Proof of Theorem 14: Let H^1 = H, H^2 = K. By subdividing the intervals if necessary, we may assume H^j_s = ∑_{k=1}^m 1_{(t_{k−1},t_k]}(s) C^j_k for j = 1, 2. Thus,

((H + K)·X)_t = ∑_{j=1}^m (C^1_j + C^2_j)(X_{t_j} − X_{t_{j−1}}) = (H·X)_t + (K·X)_t.

In the second case, writing H_s = ∑_{i=1}^m 1_{(t_{i−1},t_i]}(s) C_i, we have:

(H·(X + Y))_t = ∑_{i=1}^m C_i(X_{t_i} + Y_{t_i} − X_{t_{i−1}} − Y_{t_{i−1}}) = (H·X)_t + (H·Y)_t.

Using theorem 13, since the sum of a finite number of continuous martingales is again a continuous martingale, the first part of theorem 14 implies:

Corollary 5 If X is a continuous martingale and H ∈ bΠ_1, then (H·X)_t is a continuous martingale.

Theorem 15 If X and Y are bounded continuous martingales and H, K ∈ bΠ_1, then

〈H·X, K·Y〉_t = ∫_0^t H_s K_s d〈X, Y〉_s.

Consequently, E[(H·X)_t (K·Y)_t] = E[∫_0^t H_s K_s d〈X, Y〉_s], and

E[(H·X)^2_t] = E[ ∫_0^t H^2_s d〈X〉_s ].

Remark: Here and in what follows, integrals w.r.t. 〈X, Y〉_s are Lebesgue-Stieltjes integrals. That is, since 〈X, Y〉_s is locally of bounded variation, for almost all ω ∈ Ω, 〈X, Y〉_s(ω) is a function of bounded variation on the random time intervals [0, T_n(ω)], where T_n(ω) ↑ ∞. So, on each of these intervals we may write 〈X, Y〉_t(ω) = A^n_t(ω) − B^n_t(ω), where A^n_t and B^n_t are increasing functions, and hence make sense of the Lebesgue-Stieltjes integral ∫_0^t f(s, ω) d〈X, Y〉_s(ω) for almost all ω and any finite t.

Proof of Theorem 15: To prove these results, we note that it is sufficient to prove that

Z_t = (H·X)_t (K·Y)_t − ∫_0^t H_s K_s d〈X, Y〉_s

is a martingale, since the first result will then follow from theorem 12, the second from taking expectations of both sides, and the third by taking H = K and X = Y.

To prove that Z_t is a martingale, we note that

((H^1 + H^2)·X)_t (K·Y)_t = (H^1·X)_t (K·Y)_t + (H^2·X)_t (K·Y)_t,

∫_0^t (H^1_s + H^2_s) K_s d〈X, Y〉_s = ∫_0^t H^1_s K_s d〈X, Y〉_s + ∫_0^t H^2_s K_s d〈X, Y〉_s.

So, if the result holds for the pairs (H^1, K) and (H^2, K), then it holds for (H^1 + H^2, K). Similarly, if the result holds for (H, K^1) and (H, K^2), then it holds for (H, K^1 + K^2).

Thus, it is sufficient to prove that Z_t is a martingale for H = 1_{(a,b]}C, K = 1_{(c,d]}D. WLOG, we may assume that 1) b ≤ c or 2) a = c, b = d (since otherwise we could decompose the intervals into disjoint half-open intervals and apply the previous argument).

Case 1: In this case, ∫_0^t H_s K_s d〈X, Y〉_s = 0 for all t. Thus, we need to show that (H·X)_t (K·Y)_t is a martingale. To prove this, observe that if J = C(X_b − X_a) D 1_{(c,d]}, then (H·X)_t (K·Y)_t = (J·Y)_t is a martingale by theorem 13.

Case 2: In this case

Z_s =  0                                                           for s ≤ a,
       CD[(X_s − X_a)(Y_s − Y_a) − (〈X, Y〉_s − 〈X, Y〉_a)]          for a ≤ s ≤ b,
       CD[(X_b − X_a)(Y_b − Y_a) − (〈X, Y〉_b − 〈X, Y〉_a)]          for s ≥ b,


so it suffices to check the martingale property for a ≤ s ≤ t ≤ b (recall corollary 4). To do this, note that

Z_t − Z_s = CD[X_tY_t − X_sY_s − X_a(Y_t − Y_s) − Y_a(X_t − X_s) − (〈X, Y〉_t − 〈X, Y〉_s)].

Taking conditional expectations and noting that X_a, Y_a ∈ F_a ⊆ F_s and E[Y_t − Y_s|F_s] = 0 = E[X_t − X_s|F_s], we have

E[Z_t − Z_s|F_s] = CD · E[X_tY_t − 〈X, Y〉_t − (X_sY_s − 〈X, Y〉_s)|F_s] = 0.

5.1.3 Square integrable integrands

Definition: For any martingale X, let Π_2(X) denote the set of all predictable processes H such that

‖H‖_X = ( E[ ∫ H^2_s d〈X〉_s ] )^{1/2} < ∞.

Since integrating w.r.t. 〈X〉_s for each ω in the Lebesgue-Stieltjes sense is the same as integrating w.r.t. the σ-finite Borel measure generated by 〈X〉_s(ω) in the Lebesgue sense, it's clear that

Claim 7 ‖·‖_X is a norm on Π_2(X).

Remark: In this definition, we have used the Doob-Meyer decomposition for the bounded continuous martingale X_t developed in section 4.

Definition: We define M^2 to be the set of all martingales adapted to {F_t}_{t≥0} such that

‖X‖_2 = ( sup_{t≥0} E[X^2_t] )^{1/2} < ∞.

Theorem 16 If X is a bounded continuous martingale and H ∈ bΠ_1, then ‖H·X‖_2 = ‖H‖_X.

Proof of Theorem 16: From theorem 15:

‖H‖^2_X = E[ ∫ H^2_s d〈X〉_s ] = sup_{t≥0} E[ ∫_0^t H^2_s d〈X〉_s ] = sup_{t≥0} E[(H·X)^2_t] = ‖H·X‖^2_2.
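For the special case X = B (so d〈X〉_s = ds), Theorem 16 is the familiar Itô isometry, and it is easy to check numerically. The following sketch (Python/NumPy; my own illustration, not from the notes) uses the bounded predictable integrand H_s = tanh(B_s) evaluated at left endpoints:

```python
import numpy as np

# Sketch: check E[(H . B)_T^2] ~ E[ int_0^T H_s^2 ds ] for H_s = tanh(B_s),
# with H evaluated at the left endpoint of each subinterval (predictable choice).
rng = np.random.default_rng(6)
T, n_steps, n_paths = 1.0, 1_000, 20_000
dt = T / n_steps

dB = np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])
H = np.tanh(B[:, :-1])                       # H at left endpoints, bounded by 1

lhs = np.mean(np.sum(H * dB, axis=1) ** 2)   # E[(H . B)_T^2]
rhs = np.mean(np.sum(H**2 * dt, axis=1))     # E[ int_0^T H_s^2 ds ]
print(f"E[(H.B)_T^2] ~ {lhs:.4f},   E[int H^2 ds] ~ {rhs:.4f}")
```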

Now, we show that we can define the integral of H ∈ Π_2(X) w.r.t. X by taking limits in bΠ_1:

Lemma 7 If for 1 ≤ i ≤ k we have X^i ∈ M^2 and H ∈ ⋂_{i=1}^k Π_2(X^i), then there is a sequence H^n ∈ bΠ_1 with ‖H^n − H‖_{X^i} → 0 for 1 ≤ i ≤ k.

Proof of Lemma 7: Fix i, and let X = X^i. Since X ∈ M^2, X is a martingale, and hence a continuous local martingale, so that we have the local martingale Z_t = (X_t)^2 − 〈X〉_t. Now, let T_n be the canonical stopping times that reduce X_t; it was shown in the construction of the process 〈X〉_t that T_n reduces Z_t, so that

Z^{T_n}_t 1_{T_n>0} = (X^{T_n}_t)^2 1_{T_n>0} − 〈X〉^{T_n}_t 1_{T_n>0}

is a martingale.

Thus:

E[Z_0 1_{T_n>0}] = E[Z^{T_n}_t 1_{T_n>0}]

⇒ E[X^2_0 1_{T_n>0}] = E[X^2_{T_n∧t} 1_{T_n>0}] − E[〈X〉_{T_n∧t} 1_{T_n>0}].

Now, by the L^2 maximal inequality:

E[ ( sup_{t≥0} |X_t| )^2 ] ≤ 4 sup_{t≥0} E[X^2_t] < ∞.

Thus, sup_{t≥0} |X_t| ∈ L^2, so that |X_{T_n∧t}|^2 ≤ ( sup_{t≥0} |X_t| )^2. Since X^2_{T_n∧t} → X^2_t almost surely, the dominated convergence and monotone convergence theorems imply:

E[X^2_0] = E[X^2_t] − E[〈X〉_t],

so that E[X^2_0], E[X^2_t] < ∞ for all t imply that E[〈X〉_t] < ∞.

So, if H ∈ bΠ_1, we may write |H_t| ≤ M for all t ≥ 0 and H_t = 0 for t > u (for some finite u), so that

‖H‖^2_{X^i} = E[ ∫ H^2_s d〈X^i〉_s ] ≤ M^2 E[〈X^i〉_u] < ∞.

Thus, bΠ_1 ⊂ ⋂_{i=1}^k Π_2(X^i), since X^i in the above argument was arbitrary. We need this in order to guarantee that ‖H‖_{X^i} makes sense for H ∈ bΠ_1.

Now, let

H_t = { G ∈ ⋂_{i=1}^k Π_2(X^i) : G vanishes on (t, ∞) and the conclusion holds }

and recall the

Theorem 17 (Monotone Class Theorem) Let A be a collection of subsets of a measure space X that contains X and is closed under intersection. Let H be a vector space of real valued functions on X satisfying:

1. If A ∈ A, then 1_A ∈ H.

2. If 0 ≤ f_n ∈ H and f_n ↑ f, where f is a bounded function, then f ∈ H.

Then, H contains all the bounded functions on X that are measurable w.r.t. σ(A).


We want to apply the monotone class theorem to H_t, the space [0, t] × Ω, and the collection A of sets of the form (r, s] × A with r < s ≤ t and A ∈ F_r. Clearly, this class of sets is closed under intersections and contains [0, t] × Ω.

Also, any set S ∈ A has 1_S ∈ H_t, since 1_{(r,s]} 1_A ∈ H_t. This also shows that H_t is nonempty.

By definition, it's clear that H_t is a vector space. In what follows, we use the definitions of L and P as given in section 3.

Now, suppose that 0 ≤ G^n ∈ H_t and that G^n ↑ G, where G is bounded and in ⋂_{i=1}^k Π_2(X^i). Since each G^n is predictable, each G^n is P-measurable, so that G is P-measurable, and hence predictable, since the convergence is a.e. Also, since G^n ∈ H_t implies that G^n vanishes on (t, ∞) for every n, monotone convergence implies that G vanishes on (t, ∞).

Now, the dominated convergence theorem implies that

‖G − G^n‖^2_{X^i} = E[ ∫ (G_s − G^n_s)^2 d〈X^i〉_s ] → 0

for each 1 ≤ i ≤ k, so we can pick n_{ij} large enough that ‖G − G^{n_{ij}}‖_{X^i} < 2^{−(j+1)}. Letting n_j = max_{1≤i≤k} n_{ij}, we have ‖G − G^{n_j}‖_{X^i} < 2^{−(j+1)} for all 1 ≤ i ≤ k.

Now, since G^{n_j} ∈ H_t, we can find a sequence of functions H^{n_j,m} ∈ bΠ_1 such that ‖H^{n_j,m} − G^{n_j}‖_{X^i} → 0 for all 1 ≤ i ≤ k. Assuming {m_i}_{i=1}^{j−1} have been defined, we can pick m_j > max_{1≤i≤j−1} m_i large enough that ‖H^{n_j,m_j} − G^{n_j}‖_{X^i} < 2^{−(j+1)}. By construction then, we have a sequence of functions H^{n_j,m_j} ∈ bΠ_1 such that

‖H^{n_j,m_j} − G‖_{X^i} ≤ ‖H^{n_j,m_j} − G^{n_j}‖_{X^i} + ‖G^{n_j} − G‖_{X^i} < 2^{−j}

for all 1 ≤ i ≤ k. Thus, G ∈ H_t.

So, the monotone class theorem implies that H_t contains all bounded predictable processes that vanish on (t, ∞).

If K ∈ ⋂_{i=1}^k Π_2(X^i), and we define K^n = K 1_{|K|≤n} 1_{[0,n]}, then the dominated convergence theorem implies

‖K^n − K‖_{X^i} → 0

for each 1 ≤ i ≤ k. So we can pick n_{ij} large enough that ‖K − K^{n_{ij}}‖_{X^i} < 2^{−(j+1)}. Letting n_j = max_{1≤i≤k} n_{ij}, we have ‖K − K^{n_j}‖_{X^i} < 2^{−(j+1)} for all 1 ≤ i ≤ k.

Since K^{n_j} is bounded, predictable, and vanishes on (n_j, ∞) for each j, K^{n_j} ∈ H_{n_j}, so that there exists a sequence of functions K^{n_j,m} ∈ bΠ_1 such that

‖K^{n_j,m} − K^{n_j}‖_{X^i} → 0


for all 1 ≤ i ≤ k. Assuming {m_i}_{i=1}^{j−1} have been defined, we can pick m_j > max_{1≤i≤j−1} m_i large enough that ‖K^{n_j,m_j} − K^{n_j}‖_{X^i} < 2^{−(j+1)}. Thus, we obtain a sequence of functions K^{n_j,m_j} ∈ bΠ_1 such that

‖K^{n_j,m_j} − K‖_{X^i} ≤ ‖K^{n_j,m_j} − K^{n_j}‖_{X^i} + ‖K^{n_j} − K‖_{X^i} < 2^{−j}

for all 1 ≤ i ≤ k.

Theorem 18 M^2 is complete.

Proof of Theorem 18: In order to proceed, we prove the following

Lemma 8 Let M_t be a continuous martingale for which M_t ∈ L^2(Ω, F, P) for all t. Then

sup_{t≥0} E[M^2_t] < ∞  ⇔  ∑_{k=1}^∞ E[(M_{t_k} − M_{t_{k−1}})^2] < ∞

for any sequence {t_k}_{k=0}^∞ such that t_k ↑ ∞ strictly.

In particular, when this occurs, we have that M_t → M_∞ almost surely and in L^2(Ω, F, P).

Proof of Lemma 8: Let {t_k}_{k=0}^∞ be a sequence of real numbers s.t. t_k ↑ ∞ strictly.

Now, for t_i < t_j < t_m < t_n,

E[M_{t_n}|F_{t_m}] = M_{t_m}

implies that M_{t_n} − M_{t_m} is orthogonal to L^2(F_{t_m}) (recall that for functions f ∈ L^2(F), E[f|F_t] is the projection of f onto the closed subspace L^2(F_t)).

So, since M_{t_j} − M_{t_i} is F_{t_m}-measurable,

〈M_{t_m} − M_{t_n}, M_{t_j} − M_{t_i}〉 = 0.

Thus, since we may write M_{t_j} = M_{t_0} + ∑_{k=1}^j (M_{t_k} − M_{t_{k−1}}), this implies

E[M^2_{t_j}] = E[M^2_{t_0}] + ∑_{k=1}^j E[(M_{t_k} − M_{t_{k−1}})^2].   (1)

(⇒): By (1):

E[M^2_{t_0}] + ∑_{k=1}^j E[(M_{t_k} − M_{t_{k−1}})^2] = E[M^2_{t_j}] ≤ sup_{t≥0} E[M^2_t]  ⇒  ∑_{k=1}^j E[(M_{t_k} − M_{t_{k−1}})^2] ≤ sup_{t≥0} E[M^2_t] < ∞.

Since this is true for all j and the sequence {∑_{k=1}^j E[(M_{t_k} − M_{t_{k−1}})^2]} is monotonically increasing, we must have that

∑_{k=1}^∞ E[(M_{t_k} − M_{t_{k−1}})^2] ≤ sup_{t≥0} E[M^2_t] < ∞.


Since the sequence tk was arbitrary, the result follows.

$(\Leftarrow)$: By (1),
$$E[M_{t_j}^2] = E[M_{t_0}^2] + \sum_{k=1}^j E[(M_{t_k}-M_{t_{k-1}})^2] \le E[M_{t_0}^2] + \sum_{k=1}^\infty E[(M_{t_k}-M_{t_{k-1}})^2] = K < \infty.$$

Since the sequence $I = \{t_k\}$ was arbitrary, we may take it to be a strictly increasing sequence that is dense in $[0,\infty)$. In that case $E[M_t^2] \le K < \infty$ for all $t \in I$. By continuity of sample paths and Fatou's lemma (approximating any $t \ge 0$ by points of $I$), we have
$$E[M_t^2] \le K$$
for all $t \ge 0$. Thus, $\sup_{t\ge 0} E[M_t^2] \le K < \infty$.

To prove the final claim, note that by the above, either condition implies that $M_t$ is bounded in $L^2(\Omega,\mathcal{F},P)$, and hence in $L^1(\Omega,\mathcal{F},P)$ since $L^2(\Omega,\mathcal{F},P) \subseteq L^1(\Omega,\mathcal{F},P)$. So, by the Martingale Convergence Theorem, $M_\infty = \lim_{t\to\infty} M_t$ exists for almost all $\omega \in \Omega$.

For any $\{t_k\}_{k=0}^\infty$ with $t_k \uparrow \infty$ strictly, $E[(M_{t_{n+r}} - M_{t_n})^2] = \sum_{k=n+1}^{n+r} E[(M_{t_k}-M_{t_{k-1}})^2]$ by the discussion above. Letting $r \to \infty$ and using Fatou's lemma,
$$E\left[(M_\infty - M_{t_n})^2\right] \le \sum_{k\ge n+1} E[(M_{t_k}-M_{t_{k-1}})^2].$$

The RHS is the tail of the convergent series $\sum_{k=1}^\infty E[(M_{t_k}-M_{t_{k-1}})^2]$, so that
$$\lim_{n\to\infty} E\left[(M_\infty - M_{t_n})^2\right] = 0.$$

Since the sequence $\{t_n\}$ was arbitrary, the result follows immediately. $\square$

Now, given a martingale $M \in \mathcal{M}^2$, define the function
$$\Phi : (\mathcal{M}^2, ||\cdot||_2) \to L^2(\Omega,\mathcal{F}_\infty,P)$$
by $\Phi(M) = M_\infty$, where $\mathcal{F}_\infty = \sigma(\mathcal{F}_t : t \ge 0)$; this is well defined by lemma 8. Again by lemma 8, we have that $M_t \to M_\infty$ in $L^2(\Omega,\mathcal{F},P)$, so that
$$E[M_\infty^2] = \lim_{t\to\infty} E[M_t^2] = \sup_{t\ge 0} E[M_t^2]$$
(the first equality is the convergence of the $L^2$ norms; the second comes from the fact that $M_t^2$ is a continuous submartingale, so that $E[M_t^2]$ is nondecreasing in $t$: use the conditional Jensen inequality and the fact that $M_t \in L^2$ for all $t \ge 0$). Thus, $\Phi$ preserves norms.

To prove that $\Phi$ is an isomorphism, it suffices to show (1) injectivity and (2) surjectivity, since it is obvious that $\Phi$ is linear.


1. Suppose $X_\infty = \Phi(X) = \Phi(Y) = Y_\infty$. Then, since the martingales $X_t$ and $Y_t$ can be recovered from their limits via $Y_t = E[Y_\infty\mid\mathcal{F}_t] = E[X_\infty\mid\mathcal{F}_t] = X_t$, it's clear that $X = Y$.

2. Suppose $Y \in L^2(\mathcal{F}_\infty)$. Then $Y \in L^1(\mathcal{F}_\infty) \subseteq L^1(\mathcal{F})$, so that $Y_t = E[Y\mid\mathcal{F}_t]$ is a martingale (it's certainly $L^1$ by construction, adapted by definition, and satisfies the martingale property since for $s \le t$, $E[Y_t\mid\mathcal{F}_s] = E[E[Y\mid\mathcal{F}_t]\mid\mathcal{F}_s] = E[Y\mid\mathcal{F}_s] = Y_s$).

By the conditional Jensen inequality,
$$E[Y_t^2] = E\left[E[Y\mid\mathcal{F}_t]^2\right] \le E\left[E[Y^2\mid\mathcal{F}_t]\right] = E[Y^2] < \infty,$$
so that $\sup_{t\ge 0} E[Y_t^2] < \infty$ and so $(Y_t) \in \mathcal{M}^2$.

Finally, since $Y_t \to Y_\infty$ a.s., we have that $Y_\infty = \lim_{t\to\infty} E[Y\mid\mathcal{F}_t] = E[Y\mid\mathcal{F}_\infty] = Y$. Thus, $\Phi((Y_t)) = Y$.

Thus, $\Phi$ is an isometry onto $L^2(\mathcal{F}_\infty)$, which is complete, so $\mathcal{M}^2$ must be complete.

We may now define the integral for integrands in $\Pi_2(X)$. To define $H\cdot X$ when $X$ is a continuous bounded martingale and $H \in \Pi_2(X)$, let $H^n \in b\Pi_1$ be such that $||H^n - H||_X \to 0$, the existence of which is guaranteed by lemma 7. Since $H^n \in b\Pi_1$, $H^n\cdot X$ is a continuous martingale for each $n$ and
$$E\left[(H^n\cdot X)_t^2\right] = E\left[\int_0^t (H^n_s)^2 \, d\langle X\rangle_s\right]$$
by corollary 5 and theorem 15. So
$$E\left[((H^n - H^m)\cdot X)_t^2\right] \le E\left[\int (H^n_s - H^m_s)^2 \, d\langle X\rangle_s\right] = ||H^n - H^m||_X^2$$
$$\Rightarrow\; ||H^n\cdot X - H^m\cdot X||_2^2 = \sup_{t\ge 0} E\left[((H^n - H^m)\cdot X)_t^2\right] \le ||H^n - H^m||_X^2.$$

Thus, since the sequence $\{H^n\}$ is Cauchy in $\Pi_2(X)$, the sequence $\{H^n\cdot X\}$ is Cauchy in $\mathcal{M}^2$, and so must converge to a limit in this space by theorem 18, which we call $H\cdot X$. It's clear by using a shuffle sequence that this definition does not depend on the sequence $\{H^n\}$.
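To make the construction concrete, here is a minimal numerical sketch (not from the notes or their references; the grid size, the horizon $T = 1$, and the choice of integrand $H_s = B_s$ are illustrative assumptions). It approximates $(H\cdot B)_T$ for a Brownian motion $B$ by the integral of a simple, left-point (hence predictable) integrand, mirroring the $b\Pi_1$ approximation above, and checks the result against the closed form $\int_0^T B_s\, dB_s = \frac12(B_T^2 - T)$ given by Ito's formula.

```python
import numpy as np

# Minimal sketch: approximate (H . B)_T for the integrand H_s = B_s by the
# integral of a simple, piecewise-constant (left-point) integrand on a grid,
# which is the bPi_1-type approximation used in the construction above.
rng = np.random.default_rng(0)
T, n_steps, n_paths = 1.0, 2000, 10000
dt = T / n_steps

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))      # Brownian increments
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])  # B_0 = 0

H_left = B[:, :-1]                      # integrand evaluated at left endpoints
approx = np.sum(H_left * dB, axis=1)    # sum_k H_{t_{k-1}} (B_{t_k} - B_{t_{k-1}})

exact = 0.5 * (B[:, -1] ** 2 - T)       # Ito: int_0^T B_s dB_s = (B_T^2 - T)/2
print("mean squared error:", np.mean((approx - exact) ** 2))    # -> 0 as dt -> 0
```

The mean squared error here shrinks with the mesh size, which mirrors the $\mathcal{M}^2$-convergence of the simple approximations used in the definition.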

Theorem 19 If $X$ is a bounded continuous martingale and $H \in \Pi_2(X)$, then $H\cdot X \in \mathcal{M}^2$ and is continuous.

Proof of Theorem 19: By definition, $H\cdot X \in \mathcal{M}^2$.

To show that $H\cdot X$ is continuous, note that if $H^n \in b\Pi_1$ have $||H^n - H||_X \to 0$, then the $H^n\cdot X$ are continuous by corollary 5. Since $||H^n\cdot X - H\cdot X||_2 \to 0$, the Chebyshev and $L^2$ maximal inequalities imply
$$P\left[\sup_{t\ge 0}|(H^n\cdot X)_t - (H\cdot X)_t| > \varepsilon\right] \le \frac{E\left[\left(\sup_{t\ge 0}|(H^n\cdot X)_t - (H\cdot X)_t|\right)^2\right]}{\varepsilon^2} \le \frac{4\,||H^n\cdot X - H\cdot X||_2^2}{\varepsilon^2} \to 0.$$

Thus, $\sup_{t\ge 0}|(H^n\cdot X)_t - (H\cdot X)_t|$ converges to $0$ in probability, so that there exists a subsequence $(H^{n_j}\cdot X)_t$ of $(H^n\cdot X)_t$ such that $\sup_{t\ge 0}|(H^{n_j}\cdot X)_t - (H\cdot X)_t| \to 0$ almost surely.

Thus, for almost all $\omega\in\Omega$, we have that
$$|(H^{n_j}\cdot X)_t(\omega) - (H\cdot X)_t(\omega)| \le \sup_{t\ge 0}|(H^{n_j}\cdot X)_t(\omega) - (H\cdot X)_t(\omega)| \to 0$$
for all $t\ge 0$. Thus, $(H^{n_j}\cdot X)_t$ converges uniformly almost surely to $(H\cdot X)_t$, so that $(H\cdot X)_t$ is continuous almost surely.

Theorem 20 If X is a bounded continuous martingale, and H,K ∈ Π2(X) then H +K ∈ Π2(X) and

((H +K) ·X)t = (H ·X)t + (K ·X)t.

Proof of Theorem 20: Since $\Pi_2(X)$ is a normed linear space,
$$||H+K||_X \le ||H||_X + ||K||_X < \infty$$
since by hypothesis $H, K \in \Pi_2(X)$. Thus, $H+K \in \Pi_2(X)$.

Now, let $H^n, K^n \in b\Pi_1$ be such that $||H^n - H||_X \to 0$ and $||K^n - K||_X \to 0$. By a trivial application of the triangle inequality,
$$||(H^n + K^n) - (H+K)||_X \le ||H^n - H||_X + ||K^n - K||_X \to 0.$$
Let $((H+K)\cdot X)$ denote the martingale in $\mathcal{M}^2$ to which $((H^n + K^n)\cdot X)$ converges.

By theorem 14,
$$((H^n + K^n)\cdot X)_t = (H^n\cdot X)_t + (K^n\cdot X)_t,$$
so that
$$||((H+K)\cdot X)_t - (H\cdot X)_t - (K\cdot X)_t||_2 \le ||((H+K)\cdot X)_t - ((H^n+K^n)\cdot X)_t||_2 + ||(H^n\cdot X)_t + (K^n\cdot X)_t - (H\cdot X)_t - (K\cdot X)_t||_2$$
$$\le ||((H+K)\cdot X)_t - ((H^n+K^n)\cdot X)_t||_2 + ||(H^n\cdot X)_t - (H\cdot X)_t||_2 + ||(K^n\cdot X)_t - (K\cdot X)_t||_2 \to 0.$$

Thus,
$$((H+K)\cdot X)_t = (H\cdot X)_t + (K\cdot X)_t$$
almost surely.


5.2 The Kunita-Watanabe Inequality

The Kunita-Watanabe inequality states:

Theorem 21 For any two continuous local martingales $M$ and $N$ and measurable processes $H$ and $K$, the inequality
$$\int_0^t |H_s K_s|\, d|\langle M,N\rangle|_s \le \left[\int_0^t H_s^2\, d\langle M\rangle_s\right]^{\frac12}\left[\int_0^t K_s^2\, d\langle N\rangle_s\right]^{\frac12}$$
holds a.s. for $t \le \infty$.

Proof of Theorem 21: First, we show that
$$\left|\int_0^t H_s K_s\, d\langle M,N\rangle_s\right| \le \left[\int_0^t H_s^2\, d\langle M\rangle_s\right]^{\frac12}\left[\int_0^t K_s^2\, d\langle N\rangle_s\right]^{\frac12} \qquad (2)$$
for simple integrands $K$ and $H$.

Towards this end, let $K = \sum_{j=1}^m 1_{(t_{j-1},t_j]}(s)\, K_j$ where $K_j \in \mathcal{F}_{t_{j-1}}$. By subdividing intervals if necessary, we may assume that $H = \sum_{j=1}^m 1_{(t_{j-1},t_j]}(s)\, H_j$ where $H_j \in \mathcal{F}_{t_{j-1}}$.

Now, we define $\langle M,N\rangle_s^t = \langle M,N\rangle_t - \langle M,N\rangle_s$. Since
$$0 \le \langle M+rN, M+rN\rangle_s^t = \langle M,M\rangle_s^t + 2r\langle M,N\rangle_s^t + r^2\langle N,N\rangle_s^t$$
for every $r\in\mathbb{R}$, we have, after minimizing this quadratic in $r$, that
$$|\langle M,N\rangle_s^t| \le \left(\langle M\rangle_s^t\right)^{\frac12}\left(\langle N\rangle_s^t\right)^{\frac12} \quad \text{a.s.}$$
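Explicitly, when $\langle N\rangle_s^t > 0$ the quadratic in $r$ is minimized at $r^* = -\langle M,N\rangle_s^t / \langle N\rangle_s^t$, and plugging this in gives
$$0 \le \langle M\rangle_s^t - \frac{\big(\langle M,N\rangle_s^t\big)^2}{\langle N\rangle_s^t} \quad\Longrightarrow\quad \big(\langle M,N\rangle_s^t\big)^2 \le \langle M\rangle_s^t\,\langle N\rangle_s^t;$$
when $\langle N\rangle_s^t = 0$, letting $r\to\pm\infty$ in the displayed inequality forces $\langle M,N\rangle_s^t = 0$, so the bound holds in either case.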

So,
$$\left|\int_0^t H_s K_s\, d\langle M,N\rangle_s\right| \le \sum_{j=1}^m |H_j K_j|\left|\langle M,N\rangle_{t_{j-1}}^{t_j}\right| \le \sum_{j=1}^m |H_j||K_j|\left(\langle M\rangle_{t_{j-1}}^{t_j}\right)^{\frac12}\left(\langle N\rangle_{t_{j-1}}^{t_j}\right)^{\frac12}$$
$$\le \left(\sum_{j=1}^m H_j^2\, \langle M\rangle_{t_{j-1}}^{t_j}\right)^{\frac12}\left(\sum_{j=1}^m K_j^2\, \langle N\rangle_{t_{j-1}}^{t_j}\right)^{\frac12} = \left(\int_0^t H_s^2\, d\langle M\rangle_s\right)^{\frac12}\left(\int_0^t K_s^2\, d\langle N\rangle_s\right)^{\frac12},$$
where the last inequality is the discrete Cauchy-Schwarz inequality.

Now, if $H$ and $K$ are bounded measurable processes, then there exist simple processes $H^n \to H$ and $K^n \to K$ everywhere. Since the sets over which we integrated in the preceding argument are of the form $[0,t]$, which are compact, the measures $d\langle M\rangle_s$ and $d\langle N\rangle_s$ are finite on them for almost every $\omega$. Thus, we may apply the bounded convergence theorem, so that (2) holds for bounded measurable processes $H$ and $K$.

Now, we have the following


Proposition 1 Let $\mu$ be a complex measure on a $\sigma$-algebra $\mathcal{M}$ on $X$. Then there is a measurable function $h$ such that $|h(x)| = 1$ for all $x \in X$ and such that
$$d\mu = h\, d|\mu|.$$

For the proof, see Rudin's Real and Complex Analysis, pg. 124.

In our case, we want to apply the result to the real signed measure $d\langle M,N\rangle_s$ and its total variation measure $d|\langle M,N\rangle|_s$. The proposition then implies that for almost all $\omega\in\Omega$ there exists a Radon-Nikodym derivative $J_s(\omega)$ taking values in $\{-1,1\}$ such that $d\langle M,N\rangle_s(\omega) = J_s(\omega)\, d|\langle M,N\rangle|_s(\omega)$.

So, when $H$ and $K$ are bounded measurable processes, replacing $H$ by $H J\,\mathrm{sgn}(HK)$ on the LHS of (2), which is again a bounded measurable process, we obtain
$$\int_0^t |H_s K_s|\, d|\langle M,N\rangle|_s \le \left[\int_0^t H_s^2\, d\langle M\rangle_s\right]^{\frac12}\left[\int_0^t K_s^2\, d\langle N\rangle_s\right]^{\frac12}. \qquad (3)$$

Finally, for arbitrary measurable processes $H$ and $K$, we can find increasing sequences of bounded processes that converge to $|H|$ and $|K|$ respectively. The monotone convergence theorem then establishes (3) for arbitrary measurable processes, and so the proof is complete.

Theorem 22 If $X$ and $Y$ are bounded continuous martingales and $H \in \Pi_2(X)\cap\Pi_2(Y)$, then $H \in \Pi_2(X+Y)$ and
$$(H\cdot(X+Y))_t = (H\cdot X)_t + (H\cdot Y)_t.$$

Proof of Theorem 22: Recall that $\langle X,Y\rangle_s$ is the unique continuous predictable process $A_t$ that is locally of bounded variation, has $A_0 = 0$, and makes $X_tY_t - A_t$ a local martingale.

So, in the case when $Y_t = X_t$, it's clear that $\langle X,X\rangle_s = \langle X\rangle_s$. Furthermore, bilinearity of $\langle\cdot,\cdot\rangle_t$ is immediate from its uniqueness and definition, so that
$$\langle X+Y\rangle_s = \langle X+Y, X+Y\rangle_s = \langle X\rangle_s + \langle Y\rangle_s + 2\langle X,Y\rangle_s.$$

Now, the Kunita-Watanabe inequality implies
$$|\langle X,Y\rangle|_t \le \left(\langle X\rangle_t\langle Y\rangle_t\right)^{\frac12} \le \frac{\langle X\rangle_t + \langle Y\rangle_t}{2},$$
the latter inequality coming from the fact that $0\le (a-b)^2 \Rightarrow 2ab\le a^2+b^2$ for all $a,b\in\mathbb{R}$.

Thus $\langle X+Y\rangle_t \le 2\left(\langle X\rangle_t + \langle Y\rangle_t\right)$, and so for $H\in\Pi_2(X)\cap\Pi_2(Y)$, we have that $H\in\Pi_2(X+Y)$.
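Spelling this last step out for convenience: the same argument applied to increments gives $\langle X+Y\rangle_t - \langle X+Y\rangle_s \le 2\big(\langle X\rangle_t - \langle X\rangle_s\big) + 2\big(\langle Y\rangle_t - \langle Y\rangle_s\big)$, so the measure $d\langle X+Y\rangle$ is dominated by $2\,d\langle X\rangle + 2\,d\langle Y\rangle$, and hence
$$||H||_{X+Y}^2 = E\left[\int H_s^2\, d\langle X+Y\rangle_s\right] \le 2E\left[\int H_s^2\, d\langle X\rangle_s\right] + 2E\left[\int H_s^2\, d\langle Y\rangle_s\right] = 2||H||_X^2 + 2||H||_Y^2 < \infty.$$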

Now, by lemma 7, we can find $H^n\in b\Pi_1$ such that $||H^n-H||_Z\to 0$ for $Z = X, Y, X+Y$. By theorem 14,
$$(H^n\cdot(X+Y))_t = (H^n\cdot X)_t + (H^n\cdot Y)_t.$$


So,
$$||(H\cdot(X+Y))_t - (H\cdot X)_t - (H\cdot Y)_t||_2 \le ||(H\cdot(X+Y))_t - (H^n\cdot(X+Y))_t||_2 + ||(H^n\cdot(X+Y))_t - (H\cdot X)_t - (H\cdot Y)_t||_2$$
$$\le ||(H\cdot(X+Y))_t - (H^n\cdot(X+Y))_t||_2 + ||(H^n\cdot X)_t - (H\cdot X)_t||_2 + ||(H^n\cdot Y)_t - (H\cdot Y)_t||_2$$
$$\le ||H-H^n||_{X+Y} + ||H-H^n||_X + ||H-H^n||_Y \to 0.$$

Thus,
$$(H\cdot(X+Y))_t = (H\cdot X)_t + (H\cdot Y)_t \quad \text{a.s.}$$

Theorem 23 If $X, Y$ are bounded continuous martingales, $H\in\Pi_2(X)$, $K\in\Pi_2(Y)$, then
$$\langle H\cdot X, K\cdot Y\rangle_t = \int_0^t H_s K_s\, d\langle X,Y\rangle_s.$$

Proof of Theorem 23: By the discussion in the proof of theorem 15, it is sufficient to show that
$$Z_t = (H\cdot X)_t (K\cdot Y)_t - \int_0^t H_s K_s\, d\langle X,Y\rangle_s \qquad (4)$$
is a martingale. Let $H^n$ and $K^n$ be sequences of elements of $b\Pi_1$ that converge to $H$ and $K$ in $\Pi_2(X)$ and $\Pi_2(Y)$ respectively, and let $Z_t^n$ be the quantity that results when $H^n$ and $K^n$ replace $H$ and $K$ in (4).

By theorem 15, $Z_t^n$ is a martingale. By claim 6, it is sufficient to show that $Z_t^n \to Z_t$ in $L^1$ to show that $Z_t$ is a martingale.

Now,
$$E\left[\sup_{t\ge 0}|(H^n\cdot X)_t(K^n\cdot Y)_t - (H\cdot X)_t(K\cdot Y)_t|\right] \le E\left[\sup_{t\ge 0}|((H^n-H)\cdot X)_t(K^n\cdot Y)_t|\right] + E\left[\sup_{t\ge 0}|(H\cdot X)_t((K^n-K)\cdot Y)_t|\right].$$

To take care of the first term, note that
$$E\left[\sup_{t\ge 0}|((H^n-H)\cdot X)_t(K^n\cdot Y)_t|\right] \le E\left[\sup_{t\ge 0}|((H^n-H)\cdot X)_t|\, \sup_{t\ge 0}|(K^n\cdot Y)_t|\right]$$
$$\le E\left[\left(\sup_{t\ge 0}|((H^n-H)\cdot X)_t|\right)^2\right]^{\frac12} E\left[\left(\sup_{t\ge 0}|(K^n\cdot Y)_t|\right)^2\right]^{\frac12}$$
$$\le 4\,||(H^n-H)\cdot X||_2\, ||K^n\cdot Y||_2 = 4\,||H^n-H||_X\, ||K^n||_Y \to 0$$
as $n\to\infty$, since $||H^n-H||_X \to 0$ and $||K^n||_Y \to ||K||_Y < \infty$ as $n\to\infty$, and where we have used the Cauchy-Schwarz inequality and the $L^2$ maximal inequality.

Similarly, to take care of the second term, note that
$$E\left[\sup_{t\ge 0}|(H\cdot X)_t((K^n-K)\cdot Y)_t|\right] \le E\left[\sup_{t\ge 0}|(H\cdot X)_t|\, \sup_{t\ge 0}|((K^n-K)\cdot Y)_t|\right]$$
$$\le E\left[\left(\sup_{t\ge 0}|(H\cdot X)_t|\right)^2\right]^{\frac12} E\left[\left(\sup_{t\ge 0}|((K^n-K)\cdot Y)_t|\right)^2\right]^{\frac12}$$
$$\le 4\,||H\cdot X||_2\,||(K^n-K)\cdot Y||_2 = 4\,||H||_X\,||K-K^n||_Y \to 0$$
as $n\to\infty$, since $||K-K^n||_Y\to 0$ as $n\to\infty$ and $||H||_X < \infty$.

Thus, we have that $(H^n\cdot X)_t(K^n\cdot Y)_t \to (H\cdot X)_t(K\cdot Y)_t$ in $L^1$.

Now,
$$\left|\int_0^t H^n_s K^n_s\, d\langle X,Y\rangle_s - \int_0^t H_s K_s\, d\langle X,Y\rangle_s\right| \le \int_0^t |H^n_s K^n_s - H_s K_s|\, d|\langle X,Y\rangle|_s$$
$$\le \int_0^t |H^n_s - H_s||K^n_s|\, d|\langle X,Y\rangle|_s + \int_0^t |H_s||K^n_s - K_s|\, d|\langle X,Y\rangle|_s.$$

By the Kunita-Watanabe inequality,
$$\int_0^t |H^n_s - H_s||K^n_s|\, d|\langle X,Y\rangle|_s \le \left[\int_0^t |H^n_s - H_s|^2\, d\langle X\rangle_s\right]^{\frac12}\left[\int_0^t |K^n_s|^2\, d\langle Y\rangle_s\right]^{\frac12}$$
$$\Rightarrow\; E\left[\int_0^t |H^n_s - H_s||K^n_s|\, d|\langle X,Y\rangle|_s\right] \le E\left[\left(\int_0^t |H^n_s - H_s|^2\, d\langle X\rangle_s\right)^{\frac12}\left(\int_0^t |K^n_s|^2\, d\langle Y\rangle_s\right)^{\frac12}\right]$$
$$\le \left(E\left[\int_0^t |H^n_s - H_s|^2\, d\langle X\rangle_s\right]\right)^{\frac12}\left(E\left[\int_0^t |K^n_s|^2\, d\langle Y\rangle_s\right]\right)^{\frac12} \to 0$$
as $n\to\infty$, since $||H^n-H||_X\to 0$ and $||K^n||_Y \to ||K||_Y < \infty$ (the last inequality is the Cauchy-Schwarz inequality on $\Omega$).

Similarly,
$$\int_0^t |H_s||K^n_s - K_s|\, d|\langle X,Y\rangle|_s \le \left[\int_0^t |H_s|^2\, d\langle X\rangle_s\right]^{\frac12}\left[\int_0^t |K^n_s - K_s|^2\, d\langle Y\rangle_s\right]^{\frac12}$$
$$\Rightarrow\; E\left[\int_0^t |H_s||K^n_s - K_s|\, d|\langle X,Y\rangle|_s\right] \le E\left[\left(\int_0^t |H_s|^2\, d\langle X\rangle_s\right)^{\frac12}\left(\int_0^t |K^n_s - K_s|^2\, d\langle Y\rangle_s\right)^{\frac12}\right]$$
$$\le \left(E\left[\int_0^t |H_s|^2\, d\langle X\rangle_s\right]\right)^{\frac12}\left(E\left[\int_0^t |K^n_s - K_s|^2\, d\langle Y\rangle_s\right]\right)^{\frac12} \to 0$$
as $n\to\infty$, since $||H||_X < \infty$ and $||K^n-K||_Y\to 0$ as $n\to\infty$.


Thus
$$\int_0^t H^n_s K^n_s\, d\langle X,Y\rangle_s \to \int_0^t H_s K_s\, d\langle X,Y\rangle_s$$
in $L^1$, so that
$$E[|Z^n_t - Z_t|] = E\left[\left|(H^n\cdot X)_t(K^n\cdot Y)_t - \int_0^t H^n_s K^n_s\, d\langle X,Y\rangle_s - \left((H\cdot X)_t(K\cdot Y)_t - \int_0^t H_s K_s\, d\langle X,Y\rangle_s\right)\right|\right]$$
$$\le E\left[|(H^n\cdot X)_t(K^n\cdot Y)_t - (H\cdot X)_t(K\cdot Y)_t|\right] + E\left[\left|\int_0^t H^n_s K^n_s\, d\langle X,Y\rangle_s - \int_0^t H_s K_s\, d\langle X,Y\rangle_s\right|\right] \to 0$$
as $n\to\infty$, so that $Z^n_t \to Z_t$ in $L^1$.


5.3 Integration w.r.t. Local Martingales

In this section we will extend the integral so that the integrators are allowed to be continuous local martingales, and the integrands are in
$$\Pi_3(X) = \left\{H \text{ predictable} : \int_0^t H_s^2\, d\langle X\rangle_s < \infty \text{ a.s. for all } t \ge 0\right\}.$$
It turns out that this is the largest possible class of integrands.

In order to extend the integral from bounded martingales to local martingales, we first prove the following:

Theorem 24 Suppose $X$ is a bounded continuous martingale, $H, K \in \Pi_2(X)$, and $H_s = K_s$ for $s \le T$, where $T$ is a stopping time. Then $(H\cdot X)_s = (K\cdot X)_s$ for $s \le T$.

Proof of Theorem 24: We may write $H_s = H^1_s + H^2_s$ and $K_s = K^1_s + K^2_s$, where
$$H^1_s = K^1_s = H_s 1_{\{s\le T\}} = K_s 1_{\{s\le T\}}, \qquad H^2_s = H_s 1_{\{s>T\}}, \qquad K^2_s = K_s 1_{\{s>T\}}.$$

Since $H, K \in \Pi_2(X)$, it's clear that $H^1, H^2, K^1, K^2 \in \Pi_2(X)$, and that
$$(H^1\cdot X)_t = (K^1\cdot X)_t$$
for all $t\ge 0$. Since $H^2_s = K^2_s = 0$ for $s\le T$, theorem 23 implies
$$\langle H^2\cdot X\rangle_s = \langle K^2\cdot X\rangle_s = 0, \qquad s\le T.$$

In order to proceed further, we need the following two claims:

Claim 8 Let $T$ be a stopping time and $X, Y$ continuous local martingales. Then $\langle X^T, Y^T\rangle = \langle X,Y\rangle^T$.

Proof of Claim 8: First, notice that if $X$ is a continuous local martingale, then $X^T$ is a continuous local martingale. To see this, let $T_n$ be a sequence of stopping times that reduce the continuous local martingale $X$. Then
$$(X^T_t)^{T_n} = X^{T\wedge T_n}_t = (X^{T_n}_t)^T.$$
Since $X^{T_n}_t$ is by definition a martingale for each $n$, the RHS is again a continuous martingale for each $n$, so that the sequence of stopping times $T_n$ that reduce $X$ also reduces $X^T$.

Let $Z_t = (X_t)^2 - \langle X\rangle_t$, so that since $X_t$ is a continuous local martingale, so is $Z_t$.

Let $T_n$ reduce $Z$ and $T'_n$ reduce $X$ in such a way that $X^{T'_n}$ is bounded. Let $S_n = T_n\wedge T'_n$. Since $T_n\uparrow\infty$ and $T'_n\uparrow\infty$, it's clear that $S_n\uparrow\infty$. Since $S_n\le T_n$ and $S_n\le T'_n$, by theorem 4, $S_n$ reduces $Z$ and $X$.


So,
$$Z^{T\wedge S_n}_t = \left((X_t)^2\right)^{T\wedge S_n} - \langle X\rangle^{T\wedge S_n}_t = \left(X^{T\wedge S_n}_t\right)^2 - \langle X\rangle^{T\wedge S_n}_t.$$
Since $Z^{T\wedge S_n}_t$ is a martingale, and $X^{T\wedge S_n}_t$ is a bounded continuous martingale, we must have that $\langle X^{T\wedge S_n}\rangle_t = \langle X\rangle^{T\wedge S_n}_t$ by uniqueness of the process $\langle X^{T\wedge S_n}\rangle_t$. Letting $n\to\infty$ above, we have
$$Z^T_t = \left(X^T_t\right)^2 - \lim_{n\to\infty}\langle X^{T\wedge S_n}\rangle_t,$$
so that by uniqueness of the process $\langle X^T\rangle_t$,
$$\langle X^T\rangle_t = \lim_{n\to\infty}\langle X^{T\wedge S_n}\rangle_t = \lim_{n\to\infty}\langle X\rangle^{T\wedge S_n}_t = \langle X\rangle^T_t.$$

Thus, by definition of $\langle X,Y\rangle$,
$$\langle X^T, Y^T\rangle_t = \frac14\left[\langle X^T+Y^T\rangle_t - \langle X^T-Y^T\rangle_t\right] = \frac14\left[\langle X+Y\rangle^T_t - \langle X-Y\rangle^T_t\right] = \langle X,Y\rangle^T_t. \quad\square$$

Claim 9 Suppose that the continuous local martingale $X$ has $\langle X\rangle_t = 0$ for $t\le T$, where $T$ is a stopping time. Then $X_t = X_0$ for $t\le T$ a.s.

Proof of Claim 9: WLOG, we may assume $X_0 = 0$. First, let's assume that $X$ is a continuous bounded martingale that satisfies the hypotheses. Then, we know that $Z_t = X_t^2 - \langle X\rangle_t$ is a martingale.

Using the $L^2$ maximal inequality, we obtain
$$E\left[\sup_{t\le n} X^2_{t\wedge T}\right] = E\left[\left(\sup_{t\le n} |X_{t\wedge T}|\right)^2\right] \le 4E\left[X^2_{T\wedge n}\right].$$

Now, for all $t\le n$, the optional stopping theorem applied to the martingale $Z_t$ and the stopping times $t\wedge T \le n\wedge T$ yields
$$E\left[X^2_{t\wedge T} - \langle X\rangle_{t\wedge T}\right] = E\left[X^2_{n\wedge T} - \langle X\rangle_{n\wedge T}\right] \;\Rightarrow\; E\left[X^2_{n\wedge T}\right] = E\left[\langle X\rangle_{n\wedge T}\right],$$
where we let $t=0$. Thus, since $\langle X\rangle_{n\wedge T} = 0$,
$$E\left[\sup_{t\le n} X^2_{t\wedge T}\right] \le 4E\left[\langle X\rangle_{n\wedge T}\right] = 0 \;\Rightarrow\; E\left[\sup_{t\le n} X^2_{t\wedge T}\right] = 0.$$

Since the martingale $X_t$ is bounded, the bounded convergence theorem implies
$$E\left[\sup_{t\ge 0} X^2_{t\wedge T}\right] = 0.$$


Thus, $X_{t\wedge T} = 0$ for all $t$ almost surely, so that $X_t = 0$ for $t\le T$ almost surely.

Now let $X$ be a continuous local martingale that satisfies the hypotheses, reduced by the canonical stopping times $T_n$. Then by Claim 8, $\langle X^{T_n}\rangle_t = \langle X\rangle^{T_n}_t = 0$ for $t\le T\wedge T_n$. Since $X^{T_n}_t$ is a continuous bounded martingale for each $n$, the previous argument applies, so that we have $X^{T_n}_t = X_0$ for $t\le T\wedge T_n$ almost surely. Since this is true for all $n$, letting $n\to\infty$, we have that $X_t = X_0$ for $t\le T$ almost surely. $\square$

Applying claim 9 to the martingales $H^2\cdot X$ and $K^2\cdot X$, we obtain that $(H^2\cdot X)_t = (K^2\cdot X)_t = 0$ for $t\le T$.

Thus,
$$(H\cdot X)_t = (H^1\cdot X)_t + (H^2\cdot X)_t, \qquad (K\cdot X)_t = (K^1\cdot X)_t + (K^2\cdot X)_t$$
$$\Rightarrow\; (H\cdot X)_t = (K\cdot X)_t$$
for $t\le T$.

To extend the definition of the integral, let $X$ be a continuous local martingale reduced by the canonical times $S_n$, and let $H\in\Pi_3(X)$. Define
$$R_n(\omega) = \inf\left\{0\le t<\infty : \int_0^t H_s^2(\omega)\, d\langle X\rangle_s(\omega) \ge n\right\}.$$

Since $H\in\Pi_3(X)$, it's clear that $R_n\uparrow\infty$ almost surely as $n\uparrow\infty$. Now, let $T_n$ be stopping times such that $T_n\uparrow\infty$ almost surely and
$$T_n(\omega) \le R_n(\omega)\wedge S_n(\omega),$$
and define
$$H^{(n)}_t(\omega) = H_t(\omega)\, 1_{\{t\le T_n(\omega)\}}.$$

It's clear by definition that the $T_n$ reduce $X$. Now,
$$\int \left(H^{(n)}_s(\omega)\right)^2 d\langle X\rangle_s(\omega) = \int \left(H_s(\omega)1_{\{s\le T_n(\omega)\}}\right)^2 d\langle X\rangle_s(\omega) = \int_0^{T_n(\omega)} H_s^2(\omega)\, d\langle X\rangle_s(\omega) \le n$$
$$\Rightarrow\; E\left[\int \left(H^{(n)}_s\right)^2 d\langle X\rangle_s\right] \le n < \infty.$$
Thus, $H^{(n)}\in\Pi_2(X)$ for every $n$, and so $\left(H^{(n)}\cdot X\right)_t$ is a quantity that we've previously constructed, for all $n$.

Now, for $n\le m$, we have that $H^{(n)}_t(\omega) = H^{(m)}_t(\omega)$ for $t\le T_n(\omega)$. So, by theorem 24,
$$(H^{(n)}\cdot X)_s = (H^{(m)}\cdot X)_s$$
for $s\le T_n$. Thus, we may unambiguously define
$$(H\cdot X)_s(\omega) = \left(H^{(n)}\cdot X\right)_s(\omega)$$
for $s\le T_n(\omega)$. Since $T_n\uparrow\infty$ almost surely, this definition extends to all non-negative values of $s$.

To show that this is independent of the choice of stopping times $T_n$, suppose that $T'_n$ is another sequence of stopping times such that $T'_n\uparrow\infty$ a.s. and $T'_n\le R_n\wedge S_n$.

Define $Q_n = T_n\wedge T'_n$. Then
$$\left(H^{T_n}\cdot X\right)_t = \left(H^{T'_n}\cdot X\right)_t$$
for $t\le Q_n$. Since $Q_n\uparrow\infty$, it follows immediately that $(H\cdot X)_t$ is independent of the sequence of stopping times.

Theorem 25 If $X$ is a continuous local martingale and $H\in\Pi_3(X)$, then
$$H^T\cdot X = (H\cdot X)^T = H\cdot X^T = H^T\cdot X^T.$$

Theorem 26 If $X$ is a continuous local martingale and $H\in\Pi_3(X)$, then $(H\cdot X)_t$ is a continuous local martingale.

Proof of Theorem 26: By stopping at $R_n\wedge S_n$ as defined above and using theorem 25, it suffices to show that if $X$ is a bounded martingale and $H\in\Pi_2(X)$ then $(H\cdot X)_t$ is a continuous martingale. But this is immediate from theorem 19.

Theorem 27 Let $X$ and $Y$ be continuous local martingales. If $H, K\in\Pi_3(X)$ then $H+K\in\Pi_3(X)$ and
$$((H+K)\cdot X)_t = (H\cdot X)_t + (K\cdot X)_t.$$
If $H\in\Pi_3(X)\cap\Pi_3(Y)$ then $H\in\Pi_3(X+Y)$ and
$$(H\cdot(X+Y))_t = (H\cdot X)_t + (H\cdot Y)_t.$$

Theorem 28 If $X$ and $Y$ are continuous local martingales, $H\in\Pi_3(X)$ and $K\in\Pi_3(Y)$, then
$$\langle H\cdot X, K\cdot Y\rangle_t = \int_0^t H_s K_s\, d\langle X,Y\rangle_s.$$

Theorem 29 If $X$ is a continuous local martingale and $H\in\Pi_2(X)$, then $H\cdot X\in\mathcal{M}^2$ and $||H\cdot X||_2 = ||H||_X$.

5.4 Integration w.r.t. Semimartingales

Definition: $X$ is said to be a semimartingale if $X_t = M_t + A_t$, where $M_t$ is a continuous local martingale and $A_t$ is a continuous adapted process that is locally of bounded variation.


Theorem 30 Let $X_t$ be a continuous semimartingale. If the continuous processes $M_t$ and $A_t$ are chosen so that $A_0 = 0$, then the decomposition $X_t = M_t + A_t$ is unique.

Proof of Theorem 30: If $M'_t + A'_t$ is another decomposition with $A'_0 = 0$, then $M_t + A_t = X_t = M'_t + A'_t$, so that $A_t - A'_t = M'_t - M_t$. Since this is a continuous local martingale that is locally of bounded variation, theorem 11 implies that $A_t - A'_t$ is constant, so that $A_t - A'_t = A_0 - A'_0 = 0$ for all $t$. Thus, $M_t = M'_t$ and $A_t = A'_t$ for all $t$.

Why should we care about semimartingales? First, we recall Ito's formula for continuous local martingales:

Theorem 31 Suppose $X$ is a continuous local martingale, and $f$ is a function with two continuous derivatives. Then, with probability 1, for all $t\ge 0$,
$$f(X_t) - f(X_0) = \int_0^t f'(X_s)\, dX_s + \frac12\int_0^t f''(X_s)\, d\langle X\rangle_s.$$

1. If X is a continuous local martingale and f is C2 then Ito’s formula shows that f(Xt) is always a

semimartingale but is not a local martingale unless f ′′(x) = 0 for all x. It can be shown that if X

is a continuous semimartingale and f is C2 then f(Xt) is again a semimartingale.

2. Define an easy integrand to be a process of the form
$$H = \sum_{j=0}^n H_j 1_{(T_j, T_{j+1}]}$$
where $0 = T_0\le T_1\le\cdots\le T_{n+1}$ are stopping times and the $H_j\in\mathcal{F}_{T_j}$ satisfy $|H_j| < \infty$ a.s.

Let $b\Pi_{e,t}$ be the collection of bounded easy predictable processes that vanish on $(t,\infty)$, equipped with the uniform norm
$$||H||_u = \sup_{s,\omega} |H_s(\omega)|.$$

Finally, let $L^0$ be the collection of all random variables topologized by convergence in probability, which is induced by the metric
$$||X||_0 = E\left[\frac{|X|}{1+|X|}\right].$$

A result proved by Bichteler and Dellacherie states:

Theorem 32 If $H\mapsto (H\cdot X)_t$ is continuous from $b\Pi_{e,t}\to L^0$ for all $t$, then $X$ is a semimartingale.

The class of integrands we want to integrate is as follows:

Definition: We say $H\in lb\Pi$, the set of locally bounded predictable processes, if there is a sequence of stopping times $T_n\uparrow\infty$ such that $|H(s,\omega)|\le n$ for $s\le T_n$.


To define the integral w.r.t. the semimartingale $X_t = M_t + A_t$, first note that we can define
$$(H\cdot A)_t(\omega) = \int_0^t H_s(\omega)\, dA_s(\omega)$$
as a Lebesgue-Stieltjes integral (which exists for a.e. $\omega$).

To integrate w.r.t. the continuous local martingale $M_t$, note that
$$\int_0^{T_n} H_s^2\, d\langle M\rangle_s \le n^2\langle M\rangle_{T_n} < \infty.$$
Thus, since $T_n\uparrow\infty$, we have that $H\in\Pi_3(M)$, so that since $H$ was arbitrary, $lb\Pi\subset\Pi_3(M)$. Thus, we can define $(H\cdot M)_t$, and let
$$(H\cdot X)_t = (H\cdot M)_t + (H\cdot A)_t.$$

From this definition, it's immediate that

Theorem 33 If $X$ is a continuous semimartingale and $H\in lb\Pi$, then $(H\cdot X)_t$ is a continuous semimartingale.

Definition: If $X = M+A$ and $X' = M'+A'$ are continuous semimartingales, we define the covariance $\langle X', X\rangle_t = \langle M, M'\rangle_t$.

6 Concluding Remarks

1. Note that almost everything we've done above works when, instead of continuous (local) martingales, we deal with square integrable cadlag (local) martingales (of course, care must be taken to deal with the jumps). In particular, probably the most important square integrable cadlag process is the Poisson process:

Definition: A process $N_t$ is a Poisson process with parameter $\lambda$ if

(a) $P[\omega\in\Omega : N_0(\omega) = 0] = 1$.

(b) If $t_1 < \cdots < t_n$, the increments $N_{t_j} - N_{t_{j-1}}$, $1\le j\le n$, are independent, where we set $t_0 = 0$.

(c) For any $0\le s < t$, $P[N_t - N_s = k] = \frac{e^{-\lambda(t-s)}(\lambda(t-s))^k}{k!}$, $k = 0, 1, 2, \dots$.

(d) The sample paths of $N_t$ are cadlag a.s.

So, appropriately replacing $X_t$ with $N_t$ when necessary, the theory developed above also lets us integrate the appropriate integrands w.r.t. the Poisson process (a small simulation sketch of $N_t$ appears after this list).


2. I've looked around, and it's quite common to use semimartingales in Finance (see Shiryaev, A. N. (1999). Essentials of Stochastic Finance: Facts, Models, Theory. World Scientific). The price process of a financial asset is commonly modeled as $S_t = e^{H_t}$, where $H_t$ is a semimartingale.

3. We've already encountered an example involving continuous local martingales, namely the Bessel process
$$\sqrt{\left(B^{(1)}_t\right)^2 + \cdots + \left(B^{(n)}_t\right)^2} = ||B_t|| = \sum_{j=1}^n \int_0^t \frac{B^{(j)}_s}{||B_s||}\, dB^{(j)}_s + \frac{n-1}{2}\int_0^t \frac{1}{||B_s||}\, ds.$$

Another process used in applications as an integrand is $f(t) = e^{B(t)^k}$. For $k\ge 3$, we have
$$E\left[|f(t)|^2\right] = E\left[e^{2B(t)^k}\right] = \int_{-\infty}^\infty e^{2x^k}\,\frac{e^{-\frac{x^2}{2t}}}{\sqrt{2\pi t}}\, dx = \infty.$$
Thus, $f\notin\Pi_2(B)$. On the other hand, since almost all sample paths of $f(t)$ are continuous,
$$\int_0^t |f(s)|^2\, d\langle B\rangle_s = \int_0^t |f(s)|^2\, ds < \infty$$
almost surely for all $t\ge 0$. Thus, $f\in\Pi_3(B)$.

4. What’s useful about what we’ve done is that in general it requires hard computation to check

whether a process H belongs to Π2(B), and hence hard to verify that H · B even exists. On the

other hand, it’s much easier to check that a processH belongs to Π3(B), so that it’s easier to verify

the existence of H ·B.

5. Consider the SDE
$$dX_t = \sigma(X_t)\, dZ_t \qquad (5)$$
where $Z_t$ is an $\mathbb{R}^n$-valued continuous semimartingale, and $\sigma : \mathbb{R}^n\to M_{n,n}$, the space of $n\times n$ matrices.

If this equation has a solution, and $f\in C^2(\mathbb{R}^n)$, then
$$f(X_t) = f(X_0) + \int_0^t f_{x_i}(X_s)\,\sigma^i_\alpha(X_s)\, dZ^\alpha_s + \frac12\int_0^t f_{x_i,x_j}(X_s)\,\sigma^i_\alpha(X_s)\,\sigma^j_\beta(X_s)\, d\langle Z^\alpha, Z^\beta\rangle_s \qquad (6)$$
(this is the generalized Ito formula, with summation over repeated indices).

Of course, in order to use this, we need to prove the existence of the solution $X_t$.

Theorem 34 Suppose that $\sigma$ is globally Lipschitz and $X_0$ is square integrable. Then, (5) has a unique solution.

Notice how general (and constraining) this is! If $Z_t$ is obtained through some Ito formula calculation, instead of worrying about the martingale and bounded variation processes separately [which could be messy], we can just lump it all into one quantity $Z_t$ (which could be neater) and invoke theorem 34. (A simple simulation sketch for an equation of the form (5) appears after these remarks.)


6. Let's go back to (5) and (6). Recall that if $H\in\Pi_2(X)$ and $X$ is a martingale, then $H\cdot X$ is a martingale. Thus, $E[(H\cdot X)_t] = 0$. However, for $H\in lb\Pi$ where $X$ is a semimartingale, we know that $H\cdot X$ is a semimartingale, so it's definitely not necessarily true that $E[(H\cdot X)_t] = 0$. Even if $H\in\Pi_3(X)$ and $X$ is a continuous local martingale, we still can't conclude that $E[(H\cdot X)_t] = 0$ (to be more concrete, think of when $H$ is a Bessel process, or the process $e^{B(t)^k}$ for $k\ge 3$).

Thus, taking expectations in (6), the integral w.r.t. $Z^\alpha_s$ will contribute nontrivially. Thus, SDEs driven by semimartingales (continuous local martingales) yield more interesting dynamics.

7. Theorem 34 implies that the solution $X_t$ runs for all time, since the growth of $\sigma$ is at most linear (recall the Lipschitz condition). When $\sigma$ is only locally Lipschitz, we have to allow for the possibility of explosion. For example, the solution to
$$\frac{dX_t}{dt} = X_t^2, \qquad X_0 = 1$$
is $X_t = \frac{1}{1-t}$, which explodes at $t=1$. In general, let $M$ be a locally compact metric space, and let $\hat M = M\cup\{\partial_M\}$ denote its one point compactification.

Definition: An $\hat M$-valued path $x$ with explosion time $e = e(x) > 0$ is a continuous map $x : [0,\infty)\to\hat M$ such that $x_t\in M$ for $0\le t < e$ and $x_t = \partial_M$ for all $t\ge e$ if $e<\infty$.

It can be shown that the explosion time $e(X)$ of a continuous process $X_t$ is a stopping time. We have the theorem:

Theorem 35 Let $Z$ be a semimartingale defined up to a stopping time $\tau$. Then, there is a unique solution $X$ to (5) up to the stopping time $e(X)\wedge\tau$. If $Y$ is another solution up to a stopping time $\eta\le\tau$, then $\eta\le e(X)\wedge\tau$ and $X_t = Y_t$ for $0\le t < e(X)\wedge\eta$.

8. Finally, let $U(r)$ be some radially symmetric potential, and let $B_t$ be the motion of a body undergoing Brownian motion. Then, the integral $\int_0^t U(||B_s||)\, d||B_s||$ represents the work done by that particle. Of course, we would need sufficient hypotheses on $U$ to guarantee that this integral exists, but that's exactly what we've been doing all along.
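As promised in remark 1, here is a small simulation sketch of the Poisson process (an illustration only: the rate $\lambda = 2$, the horizon, the pair $(s,t)$, and the number of paths are arbitrary choices). It builds paths from independent exponential interarrival times and checks the increment law in property (c) through its mean and variance.

```python
import numpy as np

# Sketch: simulate Poisson paths N_t with rate lam by summing independent
# Exponential(lam) interarrival times, then check property (c) numerically:
# N_t - N_s should have mean and variance lam * (t - s).
rng = np.random.default_rng(1)
lam, T, n_paths = 2.0, 10.0, 20000
s, t = 3.0, 7.0

def N(time, arrival_times):
    """N_time = number of arrivals up to 'time' (cadlag by construction)."""
    return np.searchsorted(arrival_times, time, side="right")

increments = np.empty(n_paths)
for i in range(n_paths):
    gaps = rng.exponential(1.0 / lam, size=int(4 * lam * T) + 50)  # covers [0, T]
    arrivals = np.cumsum(gaps)
    increments[i] = N(t, arrivals) - N(s, arrivals)

print("empirical mean, var of N_t - N_s:", increments.mean(), increments.var())
print("lam * (t - s):", lam * (t - s))   # both should be close to 8
```

The compensated process $N_t - \lambda t$ is then the square integrable cadlag martingale one would integrate against, as described in remark 1.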

A future topic: now that we know all about integration, how would you go about simulating the paths of

Brownian motion, or any continuous local martingale?
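On that closing question, the simplest starting point is to simulate Brownian increments on a grid and take cumulative sums; the same increments can then drive an Euler-type scheme for an equation of the form (5), as referenced in remark 5. The following is only a sketch under illustrative assumptions (scalar case, $Z = B$, $\sigma(x) = 0.2x$, $X_0 = 1$, and an arbitrary grid), not anything prescribed by the notes.

```python
import numpy as np

# Sketch: simulate Brownian motion on a grid and use the same increments to
# drive a scalar Euler scheme for dX_t = sigma(X_t) dB_t, an instance of (5)
# with Z = B. sigma, X_0, T and the grid are illustrative choices.
rng = np.random.default_rng(2)
T, n_steps, n_paths = 1.0, 1000, 5000
dt = T / n_steps

def sigma(x):
    return 0.2 * x          # globally Lipschitz, so theorem 34 applies

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])   # Brownian paths

X = np.empty_like(B)
X[:, 0] = 1.0               # X_0 = 1 (square integrable)
for k in range(n_steps):
    X[:, k + 1] = X[:, k] + sigma(X[:, k]) * dB[:, k]            # Euler step

# With no drift term, the scheme should preserve the mean, echoing E[(H . B)_t] = 0.
print("E[B_T] ~ 0:", B[:, -1].mean())
print("E[X_T] ~ 1:", X[:, -1].mean())
```

As the step size shrinks, such Euler paths converge to the solution guaranteed by theorem 34 (with strong order 1/2 under the Lipschitz assumption); simulating more general continuous local martingales is exactly the future topic left open above.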
