Lectures on Stochastic Analysis
Autumn 2014 version

Xue-Mei Li
The University of Warwick

Typeset: January 22, 2017



Contents

1 Introduction
  1.1 Theory of Integration (Lecture 1)
    1.1.1 Appendix
  1.2 Stochastic Processes, Brownian Motions (Lecture 1)
    1.2.1 Ito's Integration Theory w.r.t. Brownian Motion (Lecture 2)
    1.2.2 Stochastic Integral Equations and Parabolic PDE (Lecture 2)
  1.3 Appendix. Sample Paths of a Brownian Motion
  1.4 References

2 Stochastic Processes
  2.1 Lecture 3. Kolmogorov's Extension Theorem
  2.2 Lecture 4. Kolmogorov's Continuity Theorem
  2.3 Wiener Space and Wiener Measure (Lecture 5)
  2.4 Construction by White Noise (Lecture 5)
  2.5 Appendix A. Functions on the Wiener Space
  2.6 Appendix B. Borel Measures and Tensor σ-algebras

3 Conditional Expectations and Uniform Integrability
  3.1 Conditional Expectations (Lecture 6)
  3.2 Properties of Conditional Expectations (Lectures 6-7)
    3.2.1 Disintegration and Orthogonal Projection (Lecture 7)
    3.2.2 Appendix*
  3.3 Uniform Integrability (Lecture 7)
  3.4 Appendix
  3.5 Appendix. Absolute Continuity of Measures

4 Martingales
  4.1 Definitions (Lecture 7)
  4.2 Discrete Time Martingales (Lecture 8)
  4.3 Discrete Integrals (Lecture 8)
  4.4 The Upper Crossing Theorem and Martingale Convergence Theorem (Lecture 9)
  4.5 Stopping Times (Lecture 10)
  4.6 The Optional Stopping Theorems (Lecture 11)
  4.7 Doob's Optional Stopping Theorem (Lecture 12)
  4.8 Right End of a Martingale and OST II (Lecture 13)
  4.9 Martingale Inequalities (Lectures 14-15)

5 Continuous Local Martingales and the Quadratic Variation Process
  5.1 Lectures 14-15. Local Martingales
  5.2 The Quadratic Variation Process (Lectures 15-16)
  5.3 Local Martingale Inequality and Levy's Martingale Characterization Theorem (Lecture 18)
    5.3.1 Appendix
  5.4 The Hilbert Space of L2-bounded Martingales (Lecture 18)

6 Stochastic Integration
  6.1 Introduction (Lectures 18-19)
    6.1.1 Integration w.r.t. Stochastic Processes of Finite Variation
  6.2 Space of Integrands (Lectures 18-19)
  6.3 Lecture 20. Characterization of Stochastic Integrals
  6.4 Integration w.r.t. Semi-martingales (Lecture 21)
  6.5 Stochastic Integration w.r.t. Semi-martingales (Lecture 22)
    6.5.1 Appendix
  6.6 Ito's Formula (Lectures 22-23)

7 Stochastic Differential Equations
  7.1 Stochastic Processes Defined up to a Random Time (Lecture 24)
  7.2 Concepts
  7.3 Stochastic Integral Equations (Lectures 22-26)
  7.4 Examples
  7.5 Notions of Solutions (Lectures 26-27)
  7.6 Notions of Uniqueness (Lecture 27)
    7.6.1 The Yamada-Watanabe Theorem
  7.7 Markov Process and Transition Function (Lecture 26)
    7.7.1 Semigroups and Generators
    7.7.2 Solutions of SDEs as Markov Processes
  7.8 Existence of Solutions and the Martingale Problem
  7.9 Localisation

8 Girsanov Transform
  8.1 Girsanov Theorem for Martingales (Lecture 28)
  8.2 Girsanov for Martingales

9 Appendix
  9.1 Lyapunov Function Test
    9.1.1 Strong Completeness, Flow


Prologue

Prerequisites: a working knowledge of probability theory, measure theory, the theory of integration, functional analysis, and metric spaces is required.

What do we cover in this course and why?

We cover the theory of martingales, the basics of Brownian motion, the theory of stochastic integration, and the basic theory of stochastic differential equations. This will provide the foundation for advancing to topics offered on stochastic flows and the geometry of stochastic differential equations, leading to stochastic partial differential equations and Malliavin calculus.

What are Brownian motions? They result from summing many small and independent influential factors (law of large numbers) over a time interval [0, t], t ≥ 0. So we are talking about Gaussian laws that change with time t.

What are martingales? A stochastic process is a martingale if, roughly speaking, its conditional average value at a future time t, given its value at an earlier time s, is its value at s. On average you expect to see what is already statistically known. Continuous martingales and local martingales can be represented as stochastic integrals with respect to a Brownian motion (the Integral Representation Theorem, or Clark-Ocone formula).

What are Markov processes? The conditional average of the future value of a Markov process, given knowledge of its past up to now, is the same as the conditional average of the future value given knowledge of its present state only. The Dubins-Schwarz Theorem says that a continuous martingale is a time change of a Brownian motion, i.e. a Brownian motion run at a random clock; the random clock is the quadratic variation of the martingale. However, the time change may not be Markovian, and hence the process may not be a Markov process.


Acknowledgement

These notes benefited from readings by those who attended my lectures. I would like to especially thank Michael Coffey, Owen Daniel and Wojciech Ozanski.


Chapter 1

Introduction

1.1 Theory of Integration (Lecture 1)

People have developed various theories of integration. We will explore the concept of stochastic integration, which is not covered by any of the theories below.

Definition 1.1 A function f : [a, b] → R is Riemann integrable if there exists a number I s.t. for any number ε > 0 there exists δ > 0 s.t. for any tagged partition ∆ : a = t0 < t1 < · · · < tn = b, t∗i ∈ [ti−1, ti], with mesh maxi(ti − ti−1) < δ,

\[
\left| \sum_{i=1}^{n} f(t_i^*)\,(t_i - t_{i-1}) - I \right| < \varepsilon.
\]

The number I, which will be denoted by \(\int_a^b f(t)\,dt\), is the Riemann integral of f.

Definition 1.2 Let f, g : [a, b] → R be bounded functions. We say f is Riemann-Stieltjes integrable w.r.t. g if there exists a number I s.t. for any number ε > 0 there exists δ > 0 s.t. for any tagged partition ∆ : a = t0 < t1 < · · · < tn = b, t∗i ∈ [ti−1, ti], with mesh maxi(ti − ti−1) < δ,

\[
\left| \sum_{i=1}^{n} f(t_i^*)\,\big(g(t_i) - g(t_{i-1})\big) - I \right| < \varepsilon.
\]

The number I is the Riemann-Stieltjes integral of f w.r.t. g and will be denoted by \(\int_a^b f(t)\,dg(t)\).
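The tagged sums in Definitions 1.1 and 1.2 are easy to evaluate numerically. The following is a minimal sketch (the function name `rs_sum` is ours, not from the notes): for smooth g the Riemann-Stieltjes sums converge to \(\int f(t)\,g'(t)\,dt\), so with f(t) = t and g(t) = t^2 on [0, 1] the limit is 2/3.

```python
# Riemann-Stieltjes sums S = sum_i f(t_i*)(g(t_i) - g(t_{i-1})) over a tagged
# partition; for smooth g they converge to the integral of f(t) g'(t) dt.

def rs_sum(f, g, a, b, n):
    """Riemann-Stieltjes sum over the uniform partition with n intervals,
    tagging each interval at its left endpoint."""
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        t_star = a + i * h          # tag t_i* = t_{i-1}
        total += f(t_star) * (g(a + (i + 1) * h) - g(a + i * h))
    return total

# Example: f(t) = t, g(t) = t^2 on [0, 1]; the exact value is
# int_0^1 t * 2t dt = 2/3.
approx = rs_sum(lambda t: t, lambda t: t * t, 0.0, 1.0, 100000)
print(approx)  # close to 0.6666...
```

As the mesh 1/n shrinks, the sums stabilise; the same experiment with g a Brownian path would not stabilise pathwise, which is the point of the discussion below.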


If µ is a finite measure on [a, b] and f is a bounded measurable function, we can define \(\int_{[a,b]} f(s)\,\mu(ds)\). The so-called regulated functions are those in the closure of the step functions in the uniform topology. Regulated functions are hence bounded Lebesgue measurable functions and are integrable with respect to the Lebesgue measure.

Definition 1.3 A function g : [a, b] → R has bounded variation if

\[
\|g\|_{TV([a,b])} \equiv \mathrm{Var}(g, [a,b]) := \sup_{\Delta} \sum_{i=1}^{n} |g(t_i) - g(t_{i-1})| < \infty,
\]

where the supremum is taken over all partitions ∆ : a = t0 < t1 < · · · < tn = b. The collection of such functions is denoted by BV([a, b]).

Both \(\|g\|_{TV([a,b])}\) and Var(g, [a, b]) are commonly used notations for the total variation of g over [a, b]. If g is continuous and of finite total variation, the variation can be obtained by taking a sequence of partitions ∆n whose mesh converges to zero and taking the limit \(\lim_{|\Delta_n| \to 0} \sum_{i=1}^{n} |g(t_i) - g(t_{i-1})|\).
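For a continuous g this limit can be computed directly; here is a minimal numerical sketch (the helper `total_variation` is ours, not from the notes). Since sin is monotone on each of four subintervals of [0, 2π], its total variation there is 4.

```python
import math

# Total variation of a continuous g over [a, b], approximated by summing
# |g(t_i) - g(t_{i-1})| over a fine uniform partition; for continuous g
# these sums converge to Var(g, [a, b]) as the mesh tends to zero.

def total_variation(g, a, b, n=100000):
    h = (b - a) / n
    return sum(abs(g(a + (i + 1) * h) - g(a + i * h)) for i in range(n))

# g(t) = sin(t) on [0, 2*pi]: monotone on four pieces, so Var = 1+2+1 = 4.
tv = total_variation(math.sin, 0.0, 2.0 * math.pi)
print(tv)  # close to 4
```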

Theorem 1.1 A real valued function of bounded variation on [a, b] is the differenceof two monotone functions on [a, b].

Theorem 1.2 There is a one to one correspondence between functions g ∈ BV(R+) which are right continuous and Radon measures µg on R+, given by

\[
g(t) - g(0) = \mu_g([0, t]).
\]

If f is integrable with respect to µg, we say f is Stieltjes integrable w.r.t. g and define

\[
\int f_s\, dg_s = \int f_s\, d\mu_g(s).
\]

If g(x) = x we have Lebesgue integrals.

Young integral: if f is Hölder continuous of exponent α and g is Hölder continuous of exponent β with α + β > 1, the Young integral \(\int f\,dg\) can be defined.

Ito integral: we do not assume much on the regularity of the integrand (fs): left continuous and adapted is sufficient. The integrator (Bs), on the other hand, is almost surely not Hölder continuous of any order α > 1/2.

1.1.1 Appendix

A measure is Radon if it is inner regular, i.e. for any Borel set B and ε > 0 there exists a compact K ⊂ B with µ(B\K) < ε. Note that if g is of bounded variation and right continuous, for any positive t1 < t2, µg((t1, t2]) := g(t2) − g(t1), and µg({t}) = g(t) − g(t−) is the jump of g at t.


Example 1.1 If g is increasing, \(\|g\|_{TV([a,b])} = g(b) - g(a)\). If f ∈ C1([a, b]) then f ∈ BV([a, b]). If µ is a finite positive measure, set f(x) = µ((−∞, x]). Then f is of finite total variation, increasing and right continuous, and limx→−∞ f(x) = 0.

If f ∈ BV([a, b]) it has a derivative at almost every x ∈ [a, b].

Theorem 1.3 If f is Lebesgue integrable on [a, b], then \(\int_a^x f(t)\,dt\) is a continuous function of finite variation. If g ∈ BV([a, b]) and f ∈ C([a, b];R) then f is Riemann-Stieltjes integrable with respect to g, and

\[
\left| \int_a^b f_s\, dg_s \right| \le \|g\|_{TV([a,b])} \cdot |f|_\infty.
\]

1.2 Stochastic Processes, Brownian Motions (Lecture 1)

Let (Ω, F, µ) be a measure space. Let E be a separable complete metric space with a σ-algebra B, which is usually the Borel σ-algebra. A function f : Ω → E is said to be measurable if the pre-image of any measurable set B, f−1(B) = {ω : f(ω) ∈ B}, is a measurable set, i.e. belongs to F. Measurable functions on a probability space are also called random variables. The concept of measurability is analogous to that of continuity: the first is determined by σ-algebras and the latter by topologies.

Let I be an index set, indicating time, e.g. [0, T ], [a, b], [0,∞) or N .

Definition 1.4 A stochastic process on a separable metric space E is a map X :I × Ω→ E s.t. for any t ∈ I , X(t) : Ω→ E is measurable.

Remark. In other words, a stochastic process consists of a family of measurable functions Xt : (Ω, F) → (E, B). Recall that the tensor σ-algebra ⊗I B is the smallest one such that for all α ∈ I, the coordinate mapping πα : (E^I, ⊗I B) → (E, B) is measurable. For each ω, we may view the function t ∈ I ↦ Xt(ω) ∈ E as an element of E^I. Then X· : (Ω, F) → (E^I, ⊗I B) is measurable if and only if each Xt : (Ω, F) → (E, B) is measurable.

Example 1.2 (1) Take Ω = [0, 1], F = B([0, 1]), and P the Lebesgue measure. Take I = {1, 2, . . . }. Define Xn(ω) = ω^n. These are continuous functions from [0, 1] to R and are Borel measurable.

(2) Take I = [0, 3]. Let X, Y : Ω → R be two random variables on a measure space (Ω, F). Then \(X_t(\omega) = X(\omega) 1_{[0, 1/2]}(t) + Y(\omega) 1_{(1/2, 3]}(t)\) is a stochastic process.


Definition 1.5 Let I be an interval.

(1) A stochastic process (Xt, t ∈ I) with state space E is said to be sample continuous (or path continuous, or a continuous process) if t ↦ Xt(ω) is continuous for almost all ω.

(2) A stochastic process is cadlag if t ↦ Xt(ω) has left limits and is right continuous for almost all ω. Cadlag processes have jumps at their points of discontinuity.

(3) A stochastic process (Xt, t ∈ I) is said to have independent increments if for any finite number of disjoint intervals [ui, vi], i = 1, . . . , n, the random variables \(\{X_{v_i} - X_{u_i}\}_{i=1}^n\) are independent.

(4) A stochastic process (Xt, t ≥ 0) is Gaussian if for any n ∈ N and any numbers 0 ≤ t1 < · · · < tn, the distribution of the random variable (Xt1, . . . , Xtn), with values in R^n, is Gaussian.

Definition 1.6 A stochastic process (Bt : t ≥ 0) on R1 is a standard Brownian motion if B0 = 0 and the following hold:

(1) it is sample continuous,

(2) it has independent increments,

(3) for any 0 ≤ s < t, the distribution of Bt −Bs is N(0, t− s).
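The definition translates directly into a simulation: increments over small subintervals are independent Gaussians, and summing them over [s, t] produces a sample of Bt − Bs. The following Monte Carlo sketch (script and variable names are ours, not from the notes) checks property (3) numerically: the increment over [0.5, 2] should have mean 0 and variance t − s = 1.5.

```python
import random

# Sample B_t - B_s by summing independent N(0, dt) increments over [s, t],
# then check that the result is centred with variance t - s (property (3)).
random.seed(0)
s, t = 0.5, 2.0
dt = 0.01
n_steps = int(round((t - s) / dt))
n_paths = 10000

samples = []
for _ in range(n_paths):
    x = 0.0
    for _ in range(n_steps):
        x += random.gauss(0.0, dt ** 0.5)   # independent N(0, dt) increments
    samples.append(x)                        # one sample of B_t - B_s

mean = sum(samples) / n_paths
var = sum(v * v for v in samples) / n_paths
print(mean, var)  # mean near 0, variance near t - s = 1.5
```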

Let W^d_0 or C0([0,∞), R^d) denote the space of continuous paths in R^d with initial value 0:

\[
W_0^d = C_0([0,\infty), \mathbb{R}^d) := \{\sigma : \mathbb{R}^+ \to \mathbb{R}^d : \sigma \text{ is continuous and } \sigma(0) = 0\}.
\]

We may treat (Bt) as a measurable map into W^d_0 equipped with its Borel σ-algebra:

\[
B : \Omega \to W_0^d, \qquad \omega \mapsto (B_t(\omega),\ t \ge 0).
\]

It induces a measure on (W^d_0, B(W^d_0)) which will be called the Wiener measure. The probability space (W^d_0, B(W^d_0), µ) is called the Wiener space. The evaluation map evt : (W^d_0, B(W^d_0), µ) → R^d, given by evt(σ) = σ(t), is a Brownian motion on the Wiener space. Let us visualize a basket of continuous curves, dropped down according to µ: what we see will be the sample paths of the Brownian motion.

What does a typical Brownian path look like? We have the following facts:


Proposition 1.4 (1) For almost all ω and any pair of positive numbers a < b, Var(B·(ω), [a, b]) = ∞.

(2) For almost all ω, the path t ↦ Bt(ω) is not Hölder continuous of any order α > 1/2.

The integration theories we mentioned earlier therefore fail to define \(\int_0^t B_s(\omega)\,dB_s(\omega)\) path by path.

1.2.1 Ito’s Integration Theory w.r.t. Brownian Motion (Lecture 2)

Let I ⊂ R and let (Xt, t ∈ I) be a stochastic process.

Definition 1.7 (a) A family {Ft}t∈I of sub-σ-algebras of F is a filtration if Fs ⊂ Ft whenever s, t ∈ I, s < t.

(b) (Ω, F, Ft, P) is called a filtered probability space.

(c) (Xt : t ∈ I) is Ft-adapted if Xt is Ft-measurable for each t ∈ I.

(d) The natural filtration F^X_t of (Xt, t ≥ 0) is the smallest σ-algebra with respect to which each Xs, s ≤ t, is measurable.

Adapted means that the process does not look into the future.

Definition 1.8 An Ft-adapted Brownian motion (Bt) is an (Ft)-Brownian motion if for each s ≥ 0, the process (Bt+s − Bs, t ≥ 0) is independent of Fs.

Let (Kt) be a stochastic process that is piecewise constant:

\[
K_t(\omega) = K_{-1}(\omega)\, 1_{\{0\}}(t) + \sum_{i=0}^{\infty} K_i(\omega)\, 1_{(t_i, t_{i+1}]}(t),
\]

where 0 = t0 < t1 < t2 < . . . with limn→∞ tn = ∞. If t ∈ (tn, tn+1], we define an elementary integral:

\[
\int_0^t K_s\,dB_s = \sum_{i=0}^{n-1} K_i(\omega)\big(B_{t_{i+1}}(\omega) - B_{t_i}(\omega)\big) + K_n(\omega)\big(B_t(\omega) - B_{t_n}(\omega)\big).
\]

If f is a left continuous and adapted stochastic process, we wish to define \(\int_0^t f_s\,dB_s\). Let ∆n be a sequence of partitions of [0, t] with mesh converging to zero. On each partition we have a piecewise constant process and an elementary integral, and we define:

\[
\int_0^t f_s\,dB_s = \lim_{|\Delta_n| \to 0} \sum_{t_i \in \Delta_n} f_{t_i}(\omega)\big(B_{t_{i+1}}(\omega) - B_{t_i}(\omega)\big).
\]


This convergence is in probability; we do not hope, in general, for almost sure convergence. Such integrals will be local martingales.
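As an illustration (a numerical sketch of ours, not part of the notes): take fs = Bs and approximate \(\int_0^1 B\,dB\) by left-endpoint sums along one sample path. Ito's formula, proved later in the notes, gives \(\int_0^t B\,dB = (B_t^2 - t)/2\), so the left-endpoint sums differ from the classical guess Bt^2/2 by roughly t/2, reflecting the nonzero quadratic variation.

```python
import random

# Left-endpoint (Ito) Riemann sums for int_0^1 B dB along one sample path.
random.seed(1)
t, n = 1.0, 200000
dt = t / n

B = 0.0
ito_sum = 0.0
for _ in range(n):
    dB = random.gauss(0.0, dt ** 0.5)
    ito_sum += B * dB        # integrand evaluated at the LEFT endpoint
    B += dB

# Ito's formula: int_0^t B dB = (B_t^2 - t) / 2
print(ito_sum, (B * B - t) / 2.0)  # the two values nearly agree
```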

1.2.2 Stochastic Integral Equations and Parabolic PDE (Lecture 2)

Let σ, σ0 : R → R be Lipschitz continuous functions and (Bt) a Brownian motion. We seek a stochastic process (xt) that satisfies the stochastic integral equation

\[
x_t = x_0 + \int_0^t \sigma(x_s)\,dB_s + \int_0^t \sigma_0(x_s)\,ds.
\]

For each initial value x0, we denote the solution by (Ft(x0), t ≥ 0) and its probability distribution by P(t, x0, dy). Assume that the solution is unique and exists for all time (non-explosion). Let f : R → R be bounded Borel measurable. We define

\[
P_t f(x) := E f(F_t(x)) = \int_{\mathbb{R}} f(y)\, P(t, x, dy).
\]

Then the Chapman-Kolmogorov equation holds,

\[
P(t+s, x, A) = \int_{\mathbb{R}} P(t, y, A)\, P(s, x, dy),
\]

and (Ft(x0)) is a Markov process. Then

\[
P_{t+s} f(x) = \int_{\mathbb{R}} f(z)\, P(t+s, x, dz)
= \int_{\mathbb{R}} \int_{\mathbb{R}} f(z)\, P(t, y, dz)\, P(s, x, dy)
= \int_{\mathbb{R}} P_t f(y)\, P(s, x, dy) = P_s P_t f(x).
\]

Let us define

\[
A f(x) := \lim_{t \to 0} \frac{P_t f(x) - f(x)}{t},
\]

whenever the limit exists. We say that A is the (infinitesimal) generator of the Markov process; its domain consists of the functions f for which the limit exists.

Then under suitable conditions, Ptf solves the Kolmogorov equation

\[
\frac{d}{dt} P_t f = P_t(A f)
\]

and the partial differential equation:

\[
\frac{d}{dt} P_t f = A(P_t f).
\]

The generator A has a formal expression:

\[
A f(x) = \frac{1}{2} (\sigma(x))^2 f''(x) + \sigma_0(x) f'(x).
\]
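The formula can be sanity-checked numerically in the simplest case. The sketch below is ours: it uses constant coefficients, so the SDE has the explicit solution x_t = x + σ0 t + σ B_t, and it estimates (Ptf(x) − f(x))/t for small t with f(x) = x^2, comparing the result with Af(x) = (1/2)σ^2 f''(x) + σ0 f'(x).

```python
import random

# Check (P_t f - f)/t ~ A f = (1/2) sigma^2 f'' + sigma_0 f' for the SDE
# dx = sigma dB + sigma_0 dt with constant coefficients, where
# x_t = x + sigma_0 * t + sigma * B_t exactly.
random.seed(2)
sigma, sigma0 = 1.0, 2.0
f = lambda y: y * y
x, t, n = 0.5, 0.01, 200000

# Monte Carlo estimate of P_t f(x) = E f(x_t)
est = sum(f(x + sigma0 * t + random.gauss(0.0, sigma * t ** 0.5))
          for _ in range(n)) / n

approx_Af = (est - f(x)) / t
exact_Af = 0.5 * sigma ** 2 * 2.0 + sigma0 * 2.0 * x   # f'' = 2, f'(x) = 2x
print(approx_Af, exact_Af)  # both near 3
```

The small residual discrepancy is the O(t) error in replacing the limit by a finite difference, plus Monte Carlo noise.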


Example 1.3 If (Bt) is a d-dimensional Brownian motion, it solves dxt = dBt with x0 = 0, and

\[
P_t f(x) = E f(x + B_t)
= \int_{\mathbb{R}^d} f(x+y)\, \frac{1}{(2\pi t)^{d/2}} e^{-\frac{|y|^2}{2t}}\,dy
= \int_{\mathbb{R}^d} f(y)\, \frac{1}{(2\pi t)^{d/2}} e^{-\frac{|y-x|^2}{2t}}\,dy.
\]

If we differentiate Ptf for suitable f, we see that Ptf solves the heat equation

\[
\frac{\partial u}{\partial t} = \frac{1}{2} \sum_{i=1}^{d} \frac{\partial^2 u}{\partial x_i^2}.
\]

For x, y ∈ R^d let

\[
p(t, x, y) = \frac{1}{(2\pi t)^{d/2}}\, e^{-\frac{|x-y|^2}{2t}},
\]

and let Kt(x) = p(t, 0, x). We define a probability measure P(t, x, ·) by

\[
P(t, x, A) = \int_A p(t, x, y)\,dy = \frac{1}{(2\pi t)^{d/2}} \int_A e^{-\frac{|y-x|^2}{2t}}\,dy.
\]

Each measure P(t, x, ·) is a Gaussian measure with mean x and covariance t times the identity. This family of measures P(t, x, ·) is called the heat kernel measures.

Exercise 1.1 Prove that p(t, x, y) satisfies the Chapman-Kolmogorov equation

\[
\int_{\mathbb{R}^d} p(s, x, y)\, p(t, y, z)\,dy = p(s+t, x, z).
\]
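Before proving the identity, it can be checked numerically in dimension one (a sketch of ours, not part of the notes): quadrature of the convolution of two heat kernels reproduces the heat kernel at the summed time.

```python
import math

# Numerical check of Chapman-Kolmogorov for the 1-d heat kernel
#   p(t, x, y) = exp(-(x-y)^2 / (2t)) / sqrt(2*pi*t):
#   int p(s, x, y) p(t, y, z) dy = p(s + t, x, z).

def p(t, x, y):
    return math.exp(-(x - y) ** 2 / (2.0 * t)) / math.sqrt(2.0 * math.pi * t)

s, t, x, z = 0.3, 0.7, -0.2, 1.1

# Trapezoidal quadrature over y in [-10, 10]; the Gaussian tails beyond
# this window are negligible.
n, lo, hi = 4000, -10.0, 10.0
h = (hi - lo) / n
ys = [lo + i * h for i in range(n + 1)]
vals = [p(s, x, y) * p(t, y, z) for y in ys]
integral = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

print(integral, p(s + t, x, z))  # the two numbers agree closely
```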

We quote a standard theorem from PDE which, together with Ito's formula, gives the Kolmogorov equation:

Theorem 1.5 Let f ∈ L^p(R^d; R) where 1 ≤ p ≤ ∞. Then

\[
K_t * f(x) := \int_{\mathbb{R}^d} f(y)\, K_t(x - y)\,dy = \int_{\mathbb{R}^d} f(y)\, p(t, x, y)\,dy
\]

satisfies the heat equation

\[
\frac{d u_t}{dt} = \frac{1}{2} \Delta u_t, \qquad u_0(x) = f(x),
\]

on R^d × (0, ∞). Furthermore:

(1) if 1 ≤ p < ∞, then Kt ∗ f → f in L^p as t → 0;

(2) if f ∈ L^∞ ∩ C(R^d, R), then Kt ∗ f is continuous on R^d × [0, ∞) (with K0 ∗ f = f).


1.3 Appendix. Sample Paths of a Brownian Motion

Proposition 1.6 Let \(\Delta_n : a = t_0^n < t_1^n < \cdots < t_{M_n+1}^n = b\) be a sequence of partitions of [a, b] with |∆n| → 0. Define

\[
T_n = \sum_{i=0}^{M_n} \big(B_{t_{i+1}^n} - B_{t_i^n}\big)^2.
\]

Then

\[
\lim_{n \to \infty} E\big(T_n - (b - a)\big)^2 = 0.
\]

In particular Tn converges in probability to b − a. There is a sub-sequence of partitions ∆nk such that Tnk → b − a, a.s.

Proof Firstly,

\[
E T_n = \sum_{i=0}^{M_n} E\big(B_{t_{i+1}^n} - B_{t_i^n}\big)^2 = \sum_{i=0}^{M_n} (t_{i+1}^n - t_i^n) = b - a.
\]

By the independent increments property of the Brownian motion,

\[
\begin{aligned}
E\big(T_n - (b-a)\big)^2 = \mathrm{var}(T_n)
&= \sum_{i=0}^{M_n} \mathrm{var}\Big(\big(B_{t_{i+1}^n} - B_{t_i^n}\big)^2\Big) \\
&= \sum_{i=0}^{M_n} \mathrm{var}\big((t_{i+1}^n - t_i^n)\, B_1^2\big), \qquad \text{since } B_t - B_s \stackrel{d}{=} \sqrt{t - s}\, B_1, \\
&= \sum_{i=0}^{M_n} (t_{i+1}^n - t_i^n)^2\, \mathrm{var}(B_1^2) \\
&\le \max_i (t_{i+1}^n - t_i^n) \sum_{i=0}^{M_n} (t_{i+1}^n - t_i^n)\, \mathrm{var}(B_1^2) \\
&= \max_i (t_{i+1}^n - t_i^n)\,(b - a)\, \mathrm{var}(B_1^2) \to 0.
\end{aligned}
\]

The first statement of Proposition 1.6 holds. Now L^2 convergence implies convergence in probability, and so there is a sub-sequence that converges almost surely.

If the partitions are dyadic, i.e. each interval is divided in two at each step, the whole sequence converges almost surely, [14]. Recall Definition 1.3 for the total variation of a function.
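A quick simulation (ours, not from the notes) shows the dichotomy at work: on [0, 1] the squared-increment sums Tn concentrate at b − a = 1, while the absolute-increment sums, which would stay bounded if the path had finite variation, grow like the square root of the number of intervals.

```python
import random

# Proposition 1.6 in simulation: on [a, b] = [0, 1] the sum of squared
# Brownian increments over a fine partition concentrates at b - a = 1,
# while the sum of |increments| (the total variation sums) blows up.
random.seed(3)
n = 100000
dt = 1.0 / n
incs = [random.gauss(0.0, dt ** 0.5) for _ in range(n)]

quad_var = sum(d * d for d in incs)
abs_var = sum(abs(d) for d in incs)
print(quad_var, abs_var)  # quad_var near 1; abs_var of order sqrt(n)
```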


Proposition 1.7 For almost all ω, the Brownian path t ↦ Bt(ω) has infinite total variation on any interval [a, b], and it is not Hölder continuous of any order α > 1/2.

Proof Fix an ω. Since B has almost surely continuous paths, we only consider those ω with t ↦ Bt(ω) continuous.

(1) Suppose that \(\|B(\omega)\|_{TV([a,b])} < \infty\). Let us consider a sequence of partitions ∆n such that \(T_n = \sum_{i=0}^{M_n} (B_{t_{i+1}^n} - B_{t_i^n})^2\) converges almost surely to b − a, see Proposition 1.6. Then

\[
\begin{aligned}
\sum_{i=0}^{M_n} \big(B_{t_{i+1}^n}(\omega) - B_{t_i^n}(\omega)\big)^2
&\le \max_i \big|B_{t_{i+1}^n}(\omega) - B_{t_i^n}(\omega)\big| \cdot \sum_{i=0}^{M_n} \big|B_{t_{i+1}^n}(\omega) - B_{t_i^n}(\omega)\big| \\
&\le \max_i \big|B_{t_{i+1}^n}(\omega) - B_{t_i^n}(\omega)\big| \cdot \|B(\omega)\|_{TV([a,b])} \to 0.
\end{aligned}
\]

The convergence follows from the fact that Bt(ω) is uniformly continuous on [a, b]. This contradicts the fact that \(\sum_{i=0}^{M_n} (B_{t_{i+1}^n}(\omega) - B_{t_i^n}(\omega))^2\) converges to b − a > 0.

(2) Suppose that |Bt(ω) − Bs(ω)| ≤ C(ω)|t − s|^α for all s, t ∈ [a, b], where C(ω) is a constant for each ω, for some α > 1/2. Then

\[
\begin{aligned}
\sum_{i=0}^{M_n} \big|B_{t_{i+1}^n}(\omega) - B_{t_i^n}(\omega)\big|^2
&\le C^2(\omega) \sum_{i=0}^{M_n} |t_{i+1}^n - t_i^n|^{2\alpha} \\
&\le C^2(\omega)\, |\Delta_n|^{2\alpha - 1} \sum_{i=0}^{M_n} (t_{i+1}^n - t_i^n) \\
&= C^2(\omega)\,(b - a)\, |\Delta_n|^{2\alpha - 1} \to 0,
\end{aligned}
\]

as 2α − 1 > 0. This contradicts Proposition 1.6.

1.4 References

For a comprehensive study of martingales we refer to "Continuous Martingales and Brownian Motion" by D. Revuz and M. Yor [24]. An enjoyable introduction to martingales is the book "Probability with Martingales" by D. Williams [30]. For further reading on Brownian motion see M. Yor's recent books, e.g. [31], the book [18] by R. Mansuy and M. Yor, and [19] by P. Mörters and Y. Peres.

For an overall reference on stochastic differential equations we refer to "Stochastic Differential Equations and Diffusion Processes, second edition" by N. Ikeda and S. Watanabe [13]. The small book [16] by H. Kunita is nice to read. There are two lovely books by A. Friedman, "Stochastic Differential Equations and Applications" [9, 10], and "Stochastic Differential Equations" by I. Gihman and A. V. Skorohod [11]. Another book that is good for working out examples is "Stochastic Stability of Differential Equations" by R. Z. Khasminskii [12]. Two books that are good for beginners are "Stochastic Differential Equations" by B. Øksendal [20] and "Brownian Motion and Stochastic Calculus" by I. Karatzas and S. E. Shreve [15]. The book by Øksendal has six editions; I like editions three and four: they are neat and compact. For further studies there is "Diffusions, Markov Processes and Martingales" by C. Rogers and D. Williams [26, 25]. Another lovely reference book is "Foundations of Modern Probability" by Kallenberg [14]. For stochastic integrals for stochastic processes with jumps read Protter [22]. For SDEs driven by space-time martingales see "Stochastic Flows and Stochastic Differential Equations" by H. Kunita [17]. For SDEs on manifolds see "Stochastic Differential Equations on Manifolds" by K. D. Elworthy [4]. For work from the point of view of random dynamics see "Random Dynamical Systems" by L. Arnold [1] and "Random Perturbations of Dynamical Systems" by M. I. Freidlin and A. D. Wentzell. For further work on the geometry of SDEs have a look at the books "On the Geometry of Diffusion Operators and Stochastic Flows" [7] and "The Geometry of Filtering" [5] by K. D. Elworthy, Y. LeJan and X.-M. Li. For a theory of Markov processes, and especially the treatment of the martingale problem, see "Multidimensional Diffusion Processes" by D. Stroock and S. R. S. Varadhan [28]. There are a number of nice and slim books by the two authors, see D. W. Stroock [27] and S. R. S. Varadhan [29].

If you wish to review the theory of integration, try Royden's book "Real Analysis"; it is easy to read and useful as a reference. For further study on measures see "Real Analysis" by Folland [8]. Have a read of "Probability Measures on Metric Spaces" by Parthasarathy [21] for a deep theory of measures. The books "Measure Theory, vol. 1 & 2" by Bogachev [3] are quite useful. For some aspects of measures on the Wiener space see "Convergence of Probability Measures" by Billingsley [2].


Chapter 2

Stochastic Processes

In lectures 3-4, we discuss the existence of a Brownian motion on R.

2.1 Lecture 3. Kolmogorov’s Extension Theorem

Let (X, B1) and (Y, B2) be measurable spaces and µ a measure on (X, B1). Let Φ : X → Y be a measurable function. It induces a pushforward measure on (Y, B2):

\[
(\Phi_*\mu)(A) = \mu(\{x : \Phi(x) \in A\}).
\]

If f : Y → R is a Φ∗µ-integrable function then

\[
\int_X (f \circ \Phi)(x)\,\mu(dx) = \int_Y f(y)\,(\Phi_*\mu)(dy).
\]

A measurable function f : Ω → E induces a measure on E, which is called the probability distribution of f and will be denoted by Pf.

Definition 2.1 Let (Xt, t ∈ [0,∞)) be a stochastic process on a metric space S. For n ∈ N and 0 ≤ t1 < t2 < · · · < tn we denote by µt1,...,tn the probability measure on S^n pushed forward by (Xt1, Xt2, . . . , Xtn). The family of probability measures µt1,...,tn is the family of finite dimensional distributions of the stochastic process (Xt).


Example 2.1 Let (Bt) be a one dimensional Brownian motion, and Ai Borel sets in R. Then

\[
P(B_{t_1} \in A_1, \ldots, B_{t_k} \in A_k)
= \int_{A_1} \cdots \int_{A_k} p_{t_1}(0, y_1)\, p_{t_2 - t_1}(y_1, y_2) \cdots p_{t_k - t_{k-1}}(y_{k-1}, y_k)\,dy_k \cdots dy_1.
\]

Proof We prove this for k = 2; the rest is left as an exercise. Let f, g : R → R be bounded measurable functions. Then for s ≤ t,

\[
\begin{aligned}
E f(B_s) g(B_t) &= E\big( E\big( f(B_s)\, g(B_t - B_s + B_s) \mid \mathcal{F}_s \big) \big) \\
&= E\big( f(B_s)\, E\big( g(B_t - B_s + B_s) \mid \mathcal{F}_s \big) \big) \\
&= E\Big( f(B_s) \int_{\mathbb{R}} g(z + B_s)\, p_{t-s}(0, z)\,dz \Big) \\
&= \int_{\mathbb{R}} \int_{\mathbb{R}} f(x)\, g(y)\, p(s, 0, x)\, p(t - s, x, y)\,dy\,dx.
\end{aligned}
\]

Hence (Bs, Bt) has distribution p(s, 0, x) p(t − s, x, y) dy dx.

Let I be an arbitrary index set and let (Xα, Bα), α ∈ I, be a family of measurable spaces. The Cartesian product XI = Πα∈I Xα is the set of all maps x defined on I with x(α) ∈ Xα. We define the coordinate map πα : XI → Xα by πα(x) = x(α). The tensor σ-algebra, also called the product σ-algebra, on XI is the smallest σ-algebra such that each πα is measurable:

\[
\otimes_{\alpha \in I} \mathcal{B}_\alpha = \sigma\{\pi_\alpha^{-1}(A_\alpha) : A_\alpha \in \mathcal{B}_\alpha,\ \alpha \in I\}.
\]

For any F2 ⊂ F1 ⊂ I let πF1,F2(x) denote the restriction to F2 of x ∈ XF1. A family of measures {µF : F ⊂ I, #F < ∞} is consistent if (1) µF is a measure on (XF, ⊗α∈F Bα); (2) for any F2 ⊂ F1 ⊂ I, (πF1,F2)∗µF1 = µF2.

(The coordinate maps form a commutative triangle: πF2 = πF1,F2 ∘ πF1 as maps from XI, via XF1, to XF2.)

We note that a separable complete metric space with its Borel σ-algebra is a standard measure space. See Parthasarathy [21] for details and for a proof of the following theorem.


Theorem 2.1 (Kolmogorov's Extension Theorem) Let (Xα, Bα, α ∈ I) be 'standard' measure spaces. Given a consistent family of probability measures {µF : F ⊂ I, #F < ∞}, there exists a unique probability measure µ on XI s.t. (πF)∗µ = µF for every finite F ⊂ I.

Example 2.2 Let us define a family of finite dimensional probability measures µt1,...,tn, 0 < t1 < · · · < tn, n ∈ N, as below. For Ai ∈ B(R^d),

\[
\mu_{t_1,\ldots,t_n}(\Pi_{j=1}^n A_j)
= \int_{A_1} \cdots \int_{A_n} p(t_1, 0, y_1)\, p(t_2 - t_1, y_1, y_2) \cdots p(t_n - t_{n-1}, y_{n-1}, y_n)\,dy_n \cdots dy_1.
\]

Let Eα = R, α ∈ [0, 1]; then EI = R^{[0,1]} and the coordinate maps are πt : x ∈ EI ↦ x(t). This is a consistent family of probability measures (exercise). By Kolmogorov's Extension Theorem, there exists a measure µ on (R^{[0,1]}, ⊗[0,1]B(R)) such that its pushforward by the map πt1,...,tn is µt1,...,tn. Then (πt, t ∈ [0, 1]) is a stochastic process on the probability space (R^{[0,1]}, ⊗[0,1]B(R), µ) with the properties that it has independent increments, π0(x) = 0 almost surely, and πt − πs ∼ N(0, t − s) (exercise).

2.2 Lecture 4. Kolmogorov's Continuity Theorem

Definition 2.2 1. Two stochastic processes Xt and Yt on the same probability space are modifications of each other if for each t, P(Xt = Yt) = 1. The exceptional set {ω : Xt(ω) ≠ Yt(ω)} may depend on t.

2. Two stochastic processes Xt and Yt on the same probability space are indistinguishable from each other if P(Xt = Yt for all t) = 1.

Let E be a Banach space with norm ‖ · ‖, e.g. E = R^d.

Definition 2.3 Let α ∈ (0, 1) and let I be an interval of R.

(1) A function f : I → E is Hölder continuous of exponent α if there is a constant C such that for all t, s ∈ I,

\[
|f(t) - f(s)| \le C|t - s|^\alpha.
\]

(2) A function f : I → E is locally Hölder continuous of exponent α if on any compact subinterval [a, b] ⊂ I,

\[
\sup_{t \ne s,\ t, s \in [a,b]} \frac{|f(t) - f(s)|}{|t - s|^\alpha} < \infty.
\]


Theorem 2.2 (Kolmogorov's Continuity Theorem) Let (xt, t ∈ I) be a stochastic process with values in a separable Banach space (E, | · |). Suppose that there exist positive constants p, δ and C such that for all s, t ∈ I,

\[
E|x_t - x_s|^p \le C|t - s|^{1 + \delta}.
\]

Then there is a continuous modification \((\tilde{x}_t, t \in I)\) of (xt, t ∈ I), s.t. for any α ∈ (0, δ/p) and [a, b] ⊂ I,

\[
E \sup_{s \ne t,\ s, t \in [a,b]} \left( \frac{|\tilde{x}_s - \tilde{x}_t|}{|t - s|^\alpha} \right)^p < \infty.
\]

Example 2.3 Let (xt) be a stochastic process with xt − xs ∼ N(0, t − s). Then for any p ≥ 1,

\[
\begin{aligned}
E|x_t - x_s|^p &= \frac{1}{\sqrt{2\pi(t-s)}} \int_{-\infty}^{\infty} |y|^p\, e^{-\frac{|y|^2}{2(t-s)}}\,dy \\
&= |t - s|^{p/2}\, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} |z|^p\, e^{-\frac{|z|^2}{2}}\,dz \\
&= E\big(|x_1 - x_0|^p\big)\, |t - s|^{p/2} < \infty.
\end{aligned}
\]
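In numbers (a Monte Carlo sketch of ours): for p = 4 the constant is E|x1 − x0|^4 = 3, so E|xt − xs|^4 = 3(t − s)^2, i.e. the hypothesis of Theorem 2.2 holds with p = 4 and 1 + δ = 2. This yields Hölder exponents up to δ/p = 1/4, and letting p → ∞ gives every exponent below 1/2.

```python
import random

# Monte Carlo check of E|x_t - x_s|^4 = 3 (t - s)^2 for Gaussian increments
# x_t - x_s ~ N(0, t - s).
random.seed(4)
gap = 0.3                             # the time gap t - s
n = 200000
m4 = sum(random.gauss(0.0, gap ** 0.5) ** 4 for _ in range(n)) / n
print(m4, 3.0 * gap ** 2)  # estimate vs exact value 0.27
```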

2.3 Wiener space and Wiener Measure (Lecture 5)

Let us consider the separable Banach space

\[
W_0^d = C_0([0,1]; \mathbb{R}^d) = \{\omega : [0,1] \to \mathbb{R}^d \text{ continuous},\ \omega(0) = 0\}
\]

with the uniform norm \(\|\omega\| = \sup_{0 \le t \le 1} |\omega(t)|\) and distance

\[
d(\omega_1, \omega_2) = \sup_{0 \le t \le 1} |\omega_1(t) - \omega_2(t)| = \sup_{t_i \in \mathbb{Q} \cap [0,1]} |\omega_1(t_i) - \omega_2(t_i)|.
\]

The Borel σ-algebra on W d0 is generated by open balls. Let fi, i ∈ N be a dense

set of W d0 . Since

ω ∈W d0 : d(ω, ω0) < a = ∩ti∈Q∩[0,1]ω ∈W d

0 : |ω(ti)− ω0(ti)| < a,

B(W d0 ) = σω ∈W d

0 : |ω(ti)− fk(ti)| < a : ti ∈ Q, k ∈ N.

To ease notation, take d = 1 and write W0 := W 10 .


Theorem 2.3 Let π_t : W_0 → R be the evaluation maps π_t(ω) = ω_t. There is a probability measure μ on (W_0, B(W_0)) such that for any 0 < t_1 < · · · < t_n and any A_i ∈ B(R),

(π_{t_1}, . . . , π_{t_n})_*(μ)(Π_{i=1}^n A_i) = ∫_{A_1} · · · ∫_{A_n} p(t_1, 0, y_1) p(t_2 − t_1, y_1, y_2) · · · p(t_n − t_{n−1}, y_{n−1}, y_n) dy_n · · · dy_1.

In particular, (π_t, t ≤ T) is a standard Brownian motion on (W_0, B(W_0), μ).

Proof Let us take T = 1 for simplicity and let E = {f_k} be a countable dense subset of W_0. Open balls in W_0 are determined by 'cylindrical sets' of the form

{ω ∈ W_0 : |ω(t_i) − f(t_i)| ≤ r, 1 ≤ i ≤ n}, f ∈ E, r > 0, n ∈ N.

These sets are in ⊗_{[0,1]}B(R). A continuous path is determined by its values on Q ∩ [0, 1]; however, we cannot determine whether an arbitrary function from [0, 1] to R is continuous by a countable number of evaluations. Hence W_0 ∉ ⊗_{[0,1]}B(R).

We construct a map

Φ : (R^{[0,1]}, ⊗_{[0,1]}B(R)) → (W_0, B(W_0))

in the following way. If x : [0, 1] → R is continuous when restricted to Q, we set Φ(x)(t_i) = x(t_i) for t_i ∈ Q and extend the values of Φ(x) continuously to the remaining points:

Φ(x)(t) = lim_{t_i→t, t_i∈Q} x(t_i).

Otherwise we set Φ(x)(t) = 0 for all t ∈ [0, 1]. Let μ̄ be the measure on ⊗_{[0,1]}B(R) given in Example 2.2. By Kolmogorov's continuity theorem, μ̄({x : Φ(x) ≠ x}) = 0. We add all subsets of measurable sets of zero measure to obtain a completion of the σ-algebra ⊗_{[0,1]}B(R). The map Φ is then measurable: for any q ∈ Q take B_q ∈ B(R); then ∩_{q∈Q∩[0,1]}{x : x_q ∈ B_q} belongs to ⊗_{[0,1]}B(R). Let μ = Φ_*(μ̄). This is the required measure: for any n ∈ N, t_1, . . . , t_n ∈ [0, 1] and B_i ∈ B(R),

μ(∩_{i=1}^n {ω ∈ W_0 : π_{t_i}(ω) ∈ B_i}) = μ̄(∩_{i=1}^n {x ∈ R^{[0,1]} : π_{t_i}(Φ(x)) ∈ B_i}) = μ̄(∩_{i=1}^n {x ∈ R^{[0,1]} : π_{t_i}(x) ∈ B_i}).

By Example 2.2, the required property of μ follows.


2.4 Construction by White noise (Lecture 5)

Let (e_i) be an o.n.b. of L²([0, 1]; R). Let

x_t = Σ_{i=1}^∞ ξ_i ∫_0^t e_i(s) ds,

where the ξ_i are independent random variables with distribution N(0, 1). Then for each t the sum converges in L², i.e.

lim_{n→∞} sup_{m≥0} E( Σ_{i=n}^{n+m} ξ_i ∫_0^t e_i(s) ds )² = 0.

To see this, note that

∫_0^t e_i(s) ds = ⟨1_{[0,t]}, e_i⟩_{L²([0,1];R)}.

Since 1_{[0,t]} ∈ L², by Parseval's theorem,

Σ_{i=1}^∞ ( ∫_0^t e_i(s) ds )² = ‖1_{[0,t]}‖²_{L²} = t < ∞.

Let

x_t^{(n)} = Σ_{i=1}^n ξ_i ∫_0^t e_i(s) ds.

For each t, there exists a subsequence (x_t^{(n_k)}, k ∈ N) which converges almost surely to the limit x_t. It is easy to compute the distribution of x_t^{(n)}: it is a mean-zero Gaussian random variable with variance

Σ_{i=1}^n ( ∫_0^t e_i(s) ds )².

For any λ ∈ R,

E e^{iλ x_t^{(n)}} = e^{−(λ²/2) Σ_{i=1}^n (∫_0^t e_i(s) ds)²} → e^{−(λ²/2) ‖1_{[0,t]}‖²_{L²}} = e^{−λ²t/2}.

Since E e^{iλ x_t^{(n)}} → E e^{iλ x_t}, we see that E e^{iλ x_t} = e^{−λ²t/2}.


A similar computation shows that (x_t) has independent increments: for 0 = t_0 < t_1 < · · · < t_k and λ_1, . . . , λ_k ∈ R,

E e^{i Σ_j λ_j (x_{t_j} − x_{t_{j−1}})} = Π_j e^{−(λ_j²/2)(t_j − t_{j−1})} = Π_j E e^{iλ_j (x_{t_j} − x_{t_{j−1}})}.

We choose a special basis of L². Let (e_n) be the Haar functions, so that the functions S_n(t) = ∫_0^t e_n(s) ds form the Schauder basis. Then the convergence can be shown to be uniform in t on compact subintervals, from which it follows that (x_t) has continuous sample paths. This proves that (x_t) is a Brownian motion.
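The partial sums x_t^{(n)} are easy to simulate. The sketch below is a hypothetical illustration using the trigonometric o.n.b. e_1 = 1, e_k(s) = √2 cos((k−1)πs) of L²([0, 1]) instead of the Haar basis (so ∫_0^t e_k = √2 sin((k−1)πt)/((k−1)π) for k ≥ 2); it checks that the variance of the truncated sum at time t is approximately t:

```python
import math, random

def bm_partial_sum(xis, t):
    # x_t^{(n)} = sum_i xi_i * int_0^t e_i(s) ds for the o.n.b.
    # e_1 = 1, e_k(s) = sqrt(2) cos((k-1) pi s), k >= 2, of L^2([0,1])
    x = xis[0] * t
    for k in range(1, len(xis)):
        x += xis[k] * math.sqrt(2) * math.sin(k * math.pi * t) / (k * math.pi)
    return x

rng = random.Random(0)
n_modes, n_paths, t = 200, 2000, 0.5
samples = [bm_partial_sum([rng.gauss(0, 1) for _ in range(n_modes)], t)
           for _ in range(n_paths)]
mean = sum(samples) / n_paths
var = sum((x - mean) ** 2 for x in samples) / n_paths   # should be close to t
```

The truncation error in the variance is the tail Σ_{i>n} (∫_0^t e_i)², which is small for n = 200; the sample variance then sits near t up to Monte Carlo error.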

2.5 Appendix A. Functions on the Wiener Space

Let 0 = t_0 < t_1 < · · · < t_k and let g : (R^d)^k → R be a Borel measurable function. Functions of the type f(ω) = g(ω_{t_1}, . . . , ω_{t_k}) are called cylindrical functions.

Example 2.4 1. Cylindrical: (a) f(ω) = ω(2); (b) f(ω) = ω(1) + (ω(1))².

2. Not cylindrical: (c) f(ω) = max_{0≤s≤1} ω(s); (d) f(ω) = ∫_0^1 ω_s ds.

Let us integrate a cylindrical function:

∫_{W_0} g(ω_{t_1}, . . . , ω_{t_k}) dμ(ω) = ∫_{W_0} g(π_{t_1,...,t_k}(ω)) dμ(ω)
= ∫_{(R^d)^k} g(y) d(π_{t_1,...,t_k})_* μ(y)
= ∫_{(R^d)^k} g(y_1, . . . , y_k) Π_{i=1}^k p(t_i − t_{i−1}, y_{i−1}, y_i) dy,

where y_0 = 0, t_0 = 0 and dy = Π_{i=1}^k dy_i. In particular, if s < t,

E π_t = ∫_{W_0} ω_t dμ(ω) = ∫_{R^d} y p(t, 0, y) dy = 0,

∫_{W_0} g(ω_s, ω_t) dμ(ω) = ∫_{R²} g(x, y) p(s, 0, x) p(t − s, x, y) dy dx.
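As a sanity check of the last formula, the integral of a cylindrical function can be estimated by Monte Carlo. The sketch below is an illustration (the sample size N and the choice g(x, y) = xy are arbitrary): it samples (ω_s, ω_t) from the finite-dimensional density p(s, 0, x)p(t − s, x, y) and recovers E(ω_s ω_t) = min(s, t).

```python
import random

rng = random.Random(1)
s, t, N = 0.3, 0.7, 20000
total = 0.0
for _ in range(N):
    ws = rng.gauss(0.0, s ** 0.5)              # omega_s ~ N(0, s)
    wt = ws + rng.gauss(0.0, (t - s) ** 0.5)   # increment ~ N(0, t-s), independent
    total += ws * wt                           # g(omega_s, omega_t) with g(x, y) = x*y
estimate = total / N                           # should be close to min(s, t) = s
```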

2.6 Appendix B. Borel Measures and Tensor σ-algebras

Let (E_α, F_α), α ∈ I, be measurable spaces. The tensor or product σ-algebra of the σ-algebras F_α, α ∈ I, is

F_I = σ{π_α^{−1}(A_α) : A_α ∈ F_α, α ∈ I},

where π_α : E_I → E_α denotes the projection given by the formula π_α(x) = x(α).


Proposition 2.4 For each α ∈ I let G_α be a generating set of F_α. Then

F_I = σ{π_α^{−1}(A_α) : A_α ∈ G_α, α ∈ I}.

If I is a countable set, then

F_I = σ{Π_{α∈I} A_α : A_α ∈ F_α}.

Proof (1) It is clear that C_I := σ{π_α^{−1}(A_α) : A_α ∈ G_α, α ∈ I} ⊂ F_I. Conversely, for each α the collection {A ⊂ E_α : π_α^{−1}(A) ∈ C_I} is a σ-algebra containing G_α, hence containing F_α; so every generator of F_I lies in C_I and F_I ⊂ C_I.

(2) It is clear that F_I ⊂ σ{Π_{α∈I} A_α : A_α ∈ F_α}. Conversely, observe that Π_{α∈I} A_α = ∩_{α∈I} π_α^{−1}(A_α); since I is countable, the latter belongs to F_I.

Let X be a metric space and B(X) its Borel σ-algebra; a measure on the Borel σ-algebra is called a Borel measure. If X is a separable metric space, the metric topology satisfies the second axiom of countability, i.e. there exists a countable base, and this countable base generates B(X). Let (X_α, α ∈ I) be separable metric spaces. The product topology on X_I is the coarsest topology for which the projections are continuous; it is generated by sets of the form π_α^{−1}(A_α), where α ∈ I and A_α is an open set of X_α. The coordinate mappings are measurable with respect to the Borel σ-algebra on the product space, and ⊗_α B(X_α) ⊂ B(Π_{α∈I} X_α).

Proposition 2.5 (Thm 1.10 in [21]) Let X_1, X_2, . . . be separable metric spaces and X = Π_{i=1}^∞ X_i. Then B(X) = ⊗_{n=1}^∞ B(X_n).


Chapter 3

Conditional Expectations and Uniform Integrability

Definition 3.1 Let p ≥ 1.

1. A family of Borel measurable functions f_α on a measure space is L^p bounded if sup_α ∫ |f_α|^p < ∞.

2. A stochastic process (X_t) is L^p integrable if E(|X_t|^p) < ∞ for all t; it is L^p bounded if sup_t E(|X_t|^p) < ∞.

3.1 Conditional Expectations (Lecture 6)

Definition 3.2 Let X ∈ L¹(Ω, F, P) be a random variable and let G be a sub-σ-algebra of F. A conditional expectation of X given G is any G-measurable integrable random variable Y such that

∫_A X dP = ∫_A Y dP, ∀A ∈ G. (3.1)

Theorem 3.1 Let X ∈ L1(Ω,F , P ).

(1) If Y1, Y2 ∈ L1(Ω,G, P ) are conditional expectations of X then Y1 = Y2 a.s.

(2) If a, b ∈ R, X1, X2 ∈ L1(Ω,F , P ) then E(aX1 + bX2|G) = aE(X1|G) +bE(X2|G).

(3) The conditional expectation of X given G exists.

(4) If X ≥ 0, E(X|G) ≥ 0.


We denote by E(X|G) or E{X|G} any version of the conditional expectation of X given G.

Proof (1) We first prove uniqueness. Let Y_1, Y_2 be G-measurable integrable random variables such that for any A ∈ G,

∫_A (Y_1 − Y_2) dP = 0.

Taking A = {Y_1 > Y_2} and A = {Y_1 < Y_2}, both in G, shows these sets have measure zero; hence Y_1 = Y_2 a.s.

(2) The linearity follows from uniqueness.

(3) and (4). Assume first that X ≥ 0. Define Q(A) = ∫_A X(ω) dP(ω) for A ∈ G. Then Q is a measure on G, and P restricts to a measure on G. If P(A) = 0 then Q(A) = 0. By the Radon–Nikodym theorem, there exists a non-negative random variable dQ/dP, belonging to L¹(Ω, G, P), such that

Q(A) = ∫_A X(ω) dP(ω) = ∫_A (dQ/dP) dP.

Thus dQ/dP satisfies (3.1) and is a conditional expectation of X given G; since dQ/dP ≥ 0, this also proves (4).

Let X ∈ L¹. Then X = X⁺ − X⁻ where X⁺, X⁻ are non-negative functions in L¹. By the above they have conditional expectations, and we define

E{X|G} = E{X⁺|G} − E{X⁻|G}.

(The conditional expectation can also be obtained directly from the Radon–Nikodym theorem for signed measures.) This proves (3).

Proposition 3.2 For all bounded G-measurable functions g,

∫_Ω g(ω) X(ω) dP(ω) = ∫_Ω g(ω) E{X|G}(ω) dP(ω). (3.2)

3.2 Properties of Conditional Expectations ( Lecture 6-7)

Proposition 3.3 Let X,Y ∈ L1(Ω,F , P ) and G a sub-σ-algebra of F .

1. Positivity Preserving. If X ≤ Y , then E(X|G) ≤ E(Y |G).

2. Linearity. For all a, b ∈ R,

E(aX + bY |G) = aE(X|G) + bE(Y |G).


3. |E(X|G)| ≤ E(|X| |G).

4. If X is G-measurable, E(X|G) = X .

5. If σ(X) is independent of G, E(X|G) = EX a.s.

6. Taking out what is known: If X is G measurable, XY ∈ L1 then

E(XY |G) = XE(Y |G).

7. E(E(X|G) ) = EX .

8. Tower property: If G1 is a sub σ-algebra of G2 then

E(X|G1) = E (E(X|G1)|G2) = E (E(X|G2)|G1) .

9. Conditional Jensen’s Inequality. Let φ : Rd → R be a convex function.Then

φ (E(X|G)) ≤ E(φ(X)|G).

For p ≥ 1, ‖E(X|G)‖Lp ≤ ‖X‖Lp .

10. Conditional Dominated Convergence Theorem. If X_n → X a.s. and |X_n| ≤ g for some g ∈ L¹, then

E(X_n|G) → E(X|G) a.s.

11. L1 convergence. If Xn → X in L1 then E(Xn|G)→ E(X|G) in L1.

12. Monotone Convergence Theorem. If Xn ≥ 0 and Xn increases with n thenE(Xn|G) increases to E(limn→∞Xn|G).

13. Fatou’s Lemma. If Xn ≥ 0,

E(lim infn→∞

Xn|G) ≤ lim infn→∞

E(Xn|G).

14. If σ(X) ∨ G is independent of A, then E(X|A ∨ G) = E(X|G).

Proposition 3.4 Let E be a metric space and h : E × E → R a Borel measurable function. Let X, Y be independent random variables with state space E such that h(X, Y) ∈ L¹, and let H(y) = E(h(X, y)). Then

E(h(X, Y)|σ(Y)) = H(Y).


3.2.1 Disintegration and Orthogonal Projection (Lecture 7)

Let G be a sub-σ-algebra of a σ-algebra F. Since L²(Ω, F, P) is a Hilbert space and L²(Ω, G, P) is a closed subspace of it, the projection theorem (§II.2, Functional Analysis [23]) provides the orthogonal projection

π : L²(Ω, F, P) → L²(Ω, G, P),

characterised by πf ∈ L²(Ω, G, P) and f − π(f) ⊥ L²(Ω, G, P).

We will see below that the conditional expectation of an L² function is precisely its L² orthogonal projection onto L²(Ω, G, P). We give below a second proof for the existence of conditional expectations.

Proof (1) Let X ∈ L²(Ω, F, P). Then for any h ∈ L²(Ω, G, P),

⟨X − πX, h⟩_{L²(Ω,F,P)} = 0.

That is,

∫_Ω X h dP = ∫_Ω π(X) h dP.

Let A ∈ G and take h = 1_A to see that

πX = E{X|G}.

(2) Let X ∈ L¹ with X ≥ 0. Let 0 ≤ X_1 ≤ X_2 ≤ · · · be a sequence of bounded positive functions, increasing with n, converging to X pointwise. Then X_n ∈ L², πX_n exists, and πX_n ≥ 0. Furthermore, for any A ∈ G,

∫_A X_n dP = ∫_A πX_n dP.

Since the X_n increase, so do the πX_n, and lim_{n→∞} πX_n exists a.s. By the monotone convergence theorem,

∫_A X dP = lim_{n→∞} ∫_A X_n dP = lim_{n→∞} ∫_A πX_n dP = ∫_A lim_{n→∞} πX_n dP,

so lim_{n→∞} πX_n is a conditional expectation of X given G.


(3) Finally, for X ∈ L¹ not necessarily positive, let X = X⁺ − X⁻ and define E{X|G} = E{X⁺|G} − E{X⁻|G}.

Remark 3.1 Let X ∈ L²(Ω, F, P). Then πX is the unique element of L²(Ω, G, P) such that

E|X − πX|² = min_{Y ∈ L²(Ω,G,P)} E|X − Y|².

3.2.2 Appendix*

At this point we note a simple problem from filtering theory. Let Y_t be the observation process of a signal process X_t. What is the best estimate for X_t given Y_s, s ≤ t? We have seen that in the L² case the conditional expectation is an L² minimizer. We therefore define the L² estimator to be

X̂_t := E{X_t | σ{Y_s : 0 ≤ s ≤ t}}.

The concern in filtering is to find the conditional distribution, and the conditional density when it exists, of X_t given the observations {Y_s, s ≤ t}.

In linear filtering, we assume that

X_t(ω) = X_0(ω) + W_t(ω) + ∫_0^t F(s) X_s(ω) ds + ∫_0^t f(s) ds, (3.3)

Y_t(ω) = ∫_0^t H(s) X_s ds + ∫_0^t h(s) ds + B_t(ω). (3.4)

Here (W_t), (B_t) are independent Brownian motions, both independent of X_0. We assume that F, f, H, h : R⁺ → R are bounded measurable functions. This leads to the Kalman filter, linear filtering theory, and the Zakai equation.

3.3 Uniform Integrability (Lecture 7)

Let (Ω, F, μ) be a (σ-finite) measure space and I an index set.

Definition 3.3 A family of real-valued measurable functions (f_α, α ∈ I) is uniformly integrable (u.i.) if

lim_{C→∞} sup_{α∈I} ∫_{{|f_α|≥C}} |f_α| dμ = 0.

28

Page 30: Lectures on Stochastic Analysis Autumn 2014 version · 2017. 6. 6. · We cover the theory of martingales, basics of Brownian motions, theory of stochas-tic integration, basic theory

Lemma 3.5 (Uniform Integrability of Conditional Expectations) Let X : Ω → R be in L¹. Then the family of functions

{E{X|G} : G is a sub-σ-algebra of F}

is uniformly integrable.

Proof Exercise.

Theorem 3.7 (Vitali Convergence Theorem) Let f_n ∈ L^p(μ), p ∈ [1, ∞). Then the following are equivalent:

1. f_n → f in L^p, i.e. lim_{n→∞} ‖f_n − f‖_p = 0.

2. {|f_n|^p} is uniformly integrable and f_n → f in measure.

3. ∫ |f_n|^p dμ → ∫ |f|^p dμ and f_n → f in measure.

3.4 Appendix

Let (S, A, μ) be a measure space and let f, f_α : S → R be Borel measurable functions.

Proposition 3.8 Let μ be a σ-finite measure and f ∈ L¹(μ). Then for every ε > 0 there is δ > 0 such that for all A with μ(A) < δ,

∫_A |f| dμ < ε.

Proof Define the measure ν(A) = ∫_A f dμ. It is a signed measure whose positive and negative parts are absolutely continuous w.r.t. μ. By considering ν⁺, ν⁻ separately, we may and will assume that f ≥ 0, so that ν is a finite positive measure. If the conclusion does not hold, there exists ε > 0 such that for each n there is a set A_n with μ(A_n) < 1/2ⁿ and

ν(A_n) = ∫_{A_n} |f| dμ ≥ ε.


Let A = ∩_{n=1}^∞ ∪_{k=n}^∞ A_k. Since μ(∪_{k=n}^∞ A_k) ≤ Σ_{k=n}^∞ 2^{−k} = 2^{1−n},

μ(A) = lim_{n→∞} μ(∪_{k=n}^∞ A_k) = 0.

In particular ∫_A f dμ = 0. But, by continuity from above of the finite measure ν,

ν(A) = lim_{n→∞} ν(∪_{k=n}^∞ A_k) ≥ lim sup_{n→∞} ν(A_n) ≥ ε.

This gives a contradiction.

Definition 3.4 A family of integrable real-valued functions (f_α, α ∈ I) is uniformly absolutely continuous if for every ε > 0 there is a number δ > 0 such that if a measurable set A has μ(A) < δ, then for all α ∈ I,

∫_A |f_α| dμ < ε.

Proposition 3.9 Let μ be a finite measure and (f_α, α ∈ I) a family of integrable real-valued functions. The following statements are equivalent:

(1) (f_α, α ∈ I) is uniformly integrable (u.i.);

(2) (f_α, α ∈ I) is L¹ bounded and uniformly absolutely continuous;

(3) (de la Vallée-Poussin criterion) there exists an increasing convex function Φ : R⁺ → R⁺ such that lim_{x→∞} Φ(x)/x = ∞ and sup_α E(Φ(|f_α|)) < ∞.

Proposition 3.10 Let (S, A, μ) be a measure space and suppose that the f_n : S → R belong to L¹.

1. If f_n → f in L¹ then {f_n} is L¹ bounded.

2. If f_n → f in L¹ then {f_n} is uniformly absolutely continuous (see exercise 11, section 3.2 in [8]).

3. Suppose that μ is a finite measure. If f_n → f in measure and {f_n} is uniformly absolutely continuous, then f_n → f in L¹.

Proof By the Riesz–Fischer theorem, L¹ is a complete Banach space. (1) is obvious.

(2) Suppose that f_n → f in L¹. For any ε > 0 there is N(ε) such that

sup_{n≥N} ∫ |f_n − f| dμ < ε/2.


By Proposition 3.8, there is α > 0 such that if μ(A) < α then

∫_A |f| dμ < ε/2 and sup_{k≤N−1} ∫_A |f_k| dμ < ε.

For n ≥ N,

∫_A |f_n| dμ ≤ ∫ |f_n − f| dμ + ∫_A |f| dμ < ε.

(3) We may assume that μ = P is a probability measure. Suppose that {f_n} is uniformly absolutely continuous and f_n → f in measure, i.e. for any ε > 0,

lim_{n→∞} P(|f_n − f| > ε/3) = 0.

Let ε > 0. Choose δ(ε) > 0 such that if E is a measurable set with P(E) < δ, then

sup_n ∫_E |f_n| dP < ε/3, ∫_E |f| dP < ε/3.

There exists N(ε, δ) such that P(|f_n − f| > ε/3) < δ whenever n ≥ N(δ, ε). For such n,

∫ |f_n − f| dP ≤ ∫_{{|f_n−f|≤ε/3}} |f_n − f| dP + ∫_{{|f_n−f|>ε/3}} |f_n| dP + ∫_{{|f_n−f|>ε/3}} |f| dP < ε. (3.5)

It follows that f_n → f in L¹.

3.5 Appendix. Absolute continuity of measures

Definition 3.5 1. Let (Ω, F) be a measurable space and let P and Q be two measures on it. The measure Q is said to be absolutely continuous with respect to P if Q(A) = 0 whenever P(A) = 0, A ∈ F. This will be denoted by Q << P.

2. They are said to be equivalent, denoted by Q ∼ P, if each is absolutely continuous with respect to the other.

Theorem 3.11 (Radon–Nikodym Theorem) Let P and Q be σ-finite measures on (Ω, F). If Q << P, there is a non-negative measurable function Ω → R, which we denote by dQ/dP, such that for each measurable set A we have

Q(A) = ∫_A (dQ/dP)(ω) dP(ω).


The function dQ/dP : Ω → R is called the Radon–Nikodym derivative of Q with respect to P. We also say that dQ/dP is the density of Q with respect to P. This function is unique up to P-null sets.

Note that if Q is a finite measure then dQ/dP ∈ L¹(Ω, F, P). If P is a probability measure and ∫_Ω (dQ/dP)(ω) dP(ω) = 1, then Q is a probability measure.

If furthermore dQ/dP > 0, then for any A with Q(A) = 0,

P(A) = ∫_A dP = ∫_A (dQ/dP)^{−1} (dQ/dP) dP = ∫_A (dQ/dP)^{−1} dQ = 0.

Hence P << Q, the two measures are equivalent, and (dP/dQ) · (dQ/dP) = 1.

Example 3.1 Let Ω = [0, 1) and P the Lebesgue measure. Let A_i^n = [i/2ⁿ, (i+1)/2ⁿ), i = 0, 1, . . . , 2ⁿ − 1, and F_n = σ{A_0^n, A_1^n, . . . , A_{2ⁿ−1}^n}. Let μ be a finite measure on F_n. Check that

(dμ/dP)(x) = Σ_i (μ(A_i^n)/P(A_i^n)) 1_{A_i^n}(x), x ∈ [0, 1).
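A sketch of this example: since P(A_i^n) = 2^{−n}, the density is the step function taking the value 2ⁿ μ(A_i^n) on A_i^n. The code below (the particular cell masses are an arbitrary illustration) computes this step function from the cell masses and checks that it integrates back to μ([0, 1)):

```python
def dyadic_density(cell_masses):
    # given mu(A_i^n) for the 2^n dyadic cells, return dmu/dP on F_n as a
    # step function: value mu(A_i^n) / P(A_i^n) on A_i^n, where P(A_i^n) = 2^{-n}
    m = len(cell_masses)            # m = 2^n
    return [mass * m for mass in cell_masses]

# example: a probability measure with masses (1/2, 1/4, 1/8, 1/8) on the cells of F_2
masses = [0.5, 0.25, 0.125, 0.125]
density = dyadic_density(masses)    # [2.0, 1.0, 0.5, 0.5]
```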

Two measures Q₁ and Q₂ are (mutually) singular if there is a measurable set A such that Q₁(A) = 0 and Q₂(Aᶜ) = 0.

Example 3.2 Let Ω = [0, 1] and P the Lebesgue measure. Define Q₁ by dQ₁/dP = 2·1_{[0,1/2]}. Then Q₁ << P but P is not absolutely continuous with respect to Q₁. Define Q₂ by dQ₂/dP = 2·1_{[1/2,1]}. The two measures Q₁ and Q₂ are singular.


Chapter 4

Martingales

A filtration (F_t, t ≥ 0) is right continuous if F_{t+} := ∩_{h>0} F_{t+h} equals F_t. Let F_∞ = ∨_{t≥0} F_t = σ(∪_{t≥0} F_t), the smallest σ-algebra containing every σ-algebra F_t, t ≥ 0. The completion of a σ-algebra F_t is normally obtained by adding all subsets of P-null sets in F_∞, and is called the augmented σ-algebra.

The standard assumption on the filtration is that it is right continuous andeach σ-algebra is complete.

The filtration (G_t) with G_t := F_{t+} is right continuous. The natural filtration of a continuous process is not necessarily right continuous. Let F_s^B := σ{B_r : 0 ≤ r ≤ s} be the natural filtration of a Brownian motion (B_t), completed with respect to P. Then (F_s^B) is right continuous; this is due to Blumenthal's 0–1 law.

Definition 4.1 An (F_t)-adapted stochastic process (X_t) is an F_t-Brownian motion if it is a Brownian motion and if for every pair of numbers 0 ≤ s < t, the increment X_t − X_s is independent of F_s.

4.1 Definitions (Lecture 7)

Definition 4.2 Let Ft be a filtration on (Ω,F , P ). An adapted stochastic process(Xt, t ∈ I)

(1) is a -martingale if E|Xt| <∞ and

EXt|Fs = Xs, ∀s ≤ t.

(2) is a (integrable) sub-martingale, if E|Xt| < ∞ and EXt|Fs ≥ Xs for alls ≤ t. (In [24], X+

t ∈ L1 is assumed instead of Xt ∈ L1)


(3) is an (integrable) super-martingale if E|X_t| < ∞ and E{X_t|F_s} ≤ X_s for all s ≤ t. (In [24], X_t⁻ ∈ L¹ is assumed instead of X_t ∈ L¹.)

If (Xt) is a super-martingale then (−Xt) is a sub-martingale. If (Xt) is both asub-martingale and a super-martingale, it is a martingale.

Example 4.1 Let f ∈ L¹ and f_t = E{f|F_t}. Then (f_t) is a martingale.

Example 4.2 Take Ω = [0, 1], define F₁ to be the Borel sets of [0, 1] and P the Lebesgue measure. Define F_t to be the σ-algebra generated by the collection of functions which are Borel measurable when restricted to [0, t] and constant on [t, 1]. Let f : [0, 1] → R be an integrable function and define M_t = E{f|F_t}. Then

M_t(x) = f(x) if x ≤ t, and M_t(x) = (1/(1−t)) ∫_t^1 f(r) dr if x > t.

Check that for s < t and x ≤ s, E{M_t|F_s}(x) = f(x) = M_s(x), while for x > s,

E{M_t|F_s}(x) = (1/(1−s)) [ ∫_s^t f(r) dr + ∫_t^1 M_t(r) dr ]
= (1/(1−s)) [ ∫_s^t f(r) dr + ∫_t^1 ( (1/(1−t)) ∫_t^1 f(u) du ) dr ]
= (1/(1−s)) ∫_s^1 f(r) dr
= M_s(x).

Hence E{M_t|F_s} = M_s.

4.2 Discrete time martingales (Lecture 8)

Proposition 4.1 An integrable (F_n)-adapted process (X_n, n ∈ N) is an (F_n)-martingale if and only if for all n,

E(X_{n+1}|F_n) = X_n.

This can be proved by induction, using the tower property.

Example 4.3 Let X_n, n ∈ N, be a sequence of independent integrable random variables and let F_k = σ{X_1, X_2, . . . , X_k}.

1. Suppose that E(X_n) = 0 for all n. Then S_n = Σ_{j=1}^n X_j is a martingale:

E(S_n|F_{n−1}) = E(X_n|F_{n−1}) + S_{n−1} = E(X_n) + S_{n−1} = S_{n−1}.

If the X_n : Ω → {1, −1} are Bernoulli variables, (S_n) is said to be a simple random walk.


2. Let X̃_n = X_n + 1. Then

S̃_n = Σ_{k=1}^n X̃_k = Σ_{k=1}^n X_k + n = S_n + n

is a sub-martingale.

3. Suppose that X_i ≥ 0 and E(X_n) = 1 for all n. Then M_n = Π_{i=1}^n X_i is a discrete-time martingale.
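For the simple random walk, the martingale property E(S_{k+1}|F_k) = S_k can be verified exactly by enumerating the coin space: conditioning on F_k means averaging over all outcomes that share the first k coordinates. A small sketch (n = 5 is an arbitrary choice):

```python
from itertools import product

n = 5
outcomes = list(product([-1, 1], repeat=n))   # uniform coin space, 2^n points
for k in range(n):
    # atoms of F_k are the sets of outcomes sharing the first k coordinates
    atoms = {}
    for w in outcomes:
        atoms.setdefault(w[:k], []).append(w)
    for prefix, ws in atoms.items():
        s_k = sum(prefix)                      # S_k is constant on the atom
        # E(S_{k+1} | F_k) = average of S_{k+1} over the atom
        avg = sum(sum(w[:k + 1]) for w in ws) / len(ws)
        assert abs(avg - s_k) < 1e-12
```

The assertion holds exactly: within each atom, the (k+1)-st coin is +1 and −1 equally often, so its contribution averages to zero.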

4.3 Discrete Integrals (Lecture 8)

Let X_n be the value of an asset at time n; we may take F_n = σ{X_1, . . . , X_n}, so that X_n ∈ F_n. Let H_n be the number of stakes one puts down at time n − 1, based on the values of X_1, . . . , X_{n−1}, i.e. H_n ∈ F_{n−1}. A stochastic process (H_n) with H_n ∈ F_{n−1} is said to be previsible. The total winnings at time n will be denoted by (H · X)_n:

(H · X)_0 = 0,
(H · X)_1 = H_1(X_1 − X_0),
(H · X)_n = H_1(X_1 − X_0) + · · · + H_n(X_n − X_{n−1}), n ≥ 1.

These are discrete 'stochastic integrals' and will be denoted by ∫_0^n H_s dX_s.

Lemma 4.2 Let (Hn) be previsible with |Hn(ω)| ≤ K for some K > 0.

(1) If (Xn) is an (Fn) martingale then H ·X is a martingale.

(2) Suppose that Hn ≥ 0. If (Xn) is a super-martingale then so is H ·X .

Proof (1) Since Hn is bounded, (H ·X)n ∈ L1 for each n. Furthermore,

E(H ·X)n+1|Fn = (H ·X)n +Hn+1EXn+1 −Xn|Fn.

If (Xn) is a martingale, the last term vanishes and

E(H ·X)n+1|Fn = (H ·X)n.

(2) If H_n ≥ 0 and (X_n) is a super-martingale, then H_{n+1} E{X_{n+1} − X_n|F_n} ≤ 0, hence

E{(H · X)_{n+1}|F_n} ≤ (H · X)_n.
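The discrete integral and Lemma 4.2 are straightforward to check computationally. The sketch below implements (H · X) and verifies, by exact enumeration over the coin space, that E(H · X)_k = 0 for a bounded previsible H against the simple random walk (the particular strategy H chosen here is an arbitrary illustration):

```python
from itertools import product

def discrete_integral(H, X):
    # (H.X)_n = sum_{k=1}^n H_k (X_k - X_{k-1}); the entry H[0] is unused
    out = [0.0]
    for k in range(1, len(X)):
        out.append(out[-1] + H[k] * (X[k] - X[k - 1]))
    return out

n = 5
outcomes = list(product([-1, 1], repeat=n))
expectations = [0.0] * (n + 1)
for w in outcomes:
    X = [0]
    for step in w:
        X.append(X[-1] + step)                 # simple random walk X_0, ..., X_n
    # previsible strategy: H_k depends only on X_0, ..., X_{k-1}
    H = [0] + [1 if X[k - 1] < 0 else 2 for k in range(1, n + 1)]
    hx = discrete_integral(H, X)
    for k in range(n + 1):
        expectations[k] += hx[k] / len(outcomes)
# since (H.X) is a martingale started at 0, every E(H.X)_k vanishes
```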


4.4 The Upper Crossing Theorem and Martingale Convergence Theorem (Lecture 9)

Let a < b. By 'an up-crossing' of [a, b] by (X_n) we mean a journey starting from below a and ending above b: say X_n < a and m = inf{m > n : X_m > b}; then connecting the points X_n, X_{n+1}, . . . , X_m gives an 'up-crossing' in the graph.

Lemma 4.3 Let (X_n, n ∈ N) be a super-martingale and let U_N([a, b])(ω) be the number of up-crossings of [a, b] made by the process (X_n) by time N. Then

(b − a) E U_N([a, b]) ≤ E(X_N − a)⁻.

Proof (Sketch) Let H be the betting strategy that plays 1 unit when X_n < a and continues playing until X gets above b, then stops. Then

(H · X)_N ≥ (b − a) U_N([a, b]) − [X_N(ω) − a]⁻.

Take expectations and use the fact that H · X is a super-martingale (Lemma 4.2) to see that

0 = E(H · X)_0 ≥ E(H · X)_N ≥ (b − a) E U_N([a, b]) − E[X_N − a]⁻.
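The up-crossing count U_N([a, b]) used above can be computed by mimicking the betting strategy: start "holding" when the path drops below a, and record a completed crossing when it subsequently rises above b. A minimal sketch:

```python
def upcrossings(xs, a, b):
    # number of completed journeys of the sequence xs from below a to above b
    count, below = 0, False
    for x in xs:
        if x < a:
            below = True              # start (or continue) holding
        elif x > b and below:
            count += 1                # a completed up-crossing
            below = False
    return count
```

For example, the path 0, 2, 0, 2, 0 makes two up-crossings of [0.5, 1.5], while 2, 0, 2 makes one.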

If (a_n) is a sequence that crosses from below a to above b infinitely often for some a < b, then (a_n) cannot have a limit; conversely, if (a_n) does not have a limit, there are numbers a < b which it crosses infinitely often. This is the philosophy behind the following martingale convergence theorem.

Theorem 4.4 Let (X_n, n ∈ N) be a discrete-time super-martingale.

(1) Suppose that sup_n E(X_n⁻) < ∞. Then X_∞ := lim_{n→∞} X_n exists almost surely.

(2) If moreover sup_n E|X_n| < ∞, then X_∞ ∈ L¹.

Proof If lim_{n→∞} X_n(ω) does not exist, there are two rational numbers a(ω) < b(ω) such that

lim inf_{N→∞} X_N(ω) < a(ω) < b(ω) < lim sup_{N→∞} X_N(ω).

Let A be the set of ω such that lim_{N→∞} X_N does not exist. It is clear that

A ⊂ ∪_{a,b∈Q, a<b} {ω : lim inf_{N→∞} X_N(ω) < a < b < lim sup_{N→∞} X_N(ω)}.


Let us prove that for each pair of rational numbers a < b, P(Λ_{a,b}) = 0, where

Λ_{a,b} = {ω : lim inf_{N→∞} X_N(ω) < a < b < lim sup_{N→∞} X_N(ω)}.

If ω ∈ Λ_{a,b}, the path makes infinitely many up-crossings of [a, b]: lim_{N→∞} U_N([a, b])(ω) = ∞. Hence

Λ_{a,b} ⊂ {ω : lim_{N→∞} U_N([a, b]) = ∞}.

By Doob's upper crossing lemma (Lemma 4.3),

(b − a) lim_{N→∞} E U_N([a, b]) ≤ sup_N E(X_N − a)⁻ ≤ sup_N E((X_N)⁻ + |a|) < ∞.

By the monotone convergence theorem,

E lim_{N→∞} U_N([a, b]) = lim_{N→∞} E U_N([a, b]) < ∞.

In particular lim_{N→∞} U_N([a, b]) < ∞ almost surely, and P(Λ_{a,b}) = 0.

If (X_n) is L¹ bounded, we apply Fatou's lemma:

E|lim_{N→∞} X_N| ≤ E lim_{N→∞} |X_N| ≤ lim inf_{N→∞} E|X_N| ≤ sup_N E|X_N| < ∞.

Remark 4.1 If (X_n) is a sub-martingale, we must control its positive part: if sup_n E(X_n)⁺ < ∞ then lim_{n→∞} X_n exists a.s. Note also that |X_n| = (X_n)⁺ + (X_n)⁻, so (X_n) is L¹ bounded if and only if

sup_n E(X_n)⁻ < ∞ and sup_n E(X_n)⁺ < ∞.

If (X_t) is a continuous-time super-martingale, it converges along every increasing sequence {t_k} by applying the above convergence theorem to the discrete-time super-martingale (X_{t_k}). A function f : R⁺ → R has a limit lim_{t→T} f(t) if and only if for every sequence t_n → T, lim_{n→∞} f(t_n) exists, and all these limits agree. However, we do not have control over the exceptional sets on which the discrete-time convergence fails, unless some continuity condition is imposed; in that case the values of the process are determined by their values on Q and on the countable set of jump times.

Let T ∈ R⁺ ∪ {∞} and I = (0, T).


Theorem 4.5 Let (X_t, t ∈ I) be a right continuous stochastic process. Then lim_{t→T} X_t exists almost surely if one of the following conditions holds:

(1) (X_t) is a super-martingale with sup_{t<T} E(X_t⁻) < ∞;

(2) (X_t) is a sub-martingale with sup_{t<T} E(X_t⁺) < ∞.

It is easy to see this: for a.e. ω, the function t ↦ X_t(ω) has a limit along every increasing sequence of times; one may simply arrange the rational numbers and the discontinuity points of X_·(ω) in increasing order.

Corollary 4.6 If the filtration (Ft) satisfies the usual assumptions, and Y ∈ L1,we may choose Yt among versions of E(Y |Ft) such that (Yt) is a cadlag martin-gale.

4.5 Stopping Times (Lecture 10)

A stopping time is, roughly speaking, the time that an event has arrived. Thistime is ∞ if the event does not arrive. Let I ⊂ R+ and (Ω,F ,Ft, P ) a filteredprobability space.

Definition 4.3 1. A function T : Ω→ I ∪ ∞ is a (Ft, t ∈ I) stopping timeif ω : T (ω) ≤ t ∈ Ft for all t ∈ I .

2. Given a stochastic process (Xt, t ∈ I). The stopped process XT is definedby XT

t (ω) = XT (ω)∧t(ω).

Example 4.4 (a) A constant time is a stopping time.

(b) T (ω) ≡ ∞ is also a stopping time.

Proposition 4.7 A function T : Ω → N is an (F_n, n ∈ N)-stopping time if and only if {T = n} ∈ F_n for all n.

Proof If T is a stopping time, {T = n} = {T ≤ n} ∩ {T ≤ n − 1}ᶜ ∈ F_n. Conversely, {T ≤ n} = ∪_{i=1}^n {T = i} ∈ F_n if {T = n} ∈ F_n for all n.

Let (X_t) be a stochastic process with values in S. For B ∈ B(S) let

T_B(ω) = inf{t > 0 : X_t(ω) ∈ B};

T_B is called the hitting time of B by (X_t). By convention, inf(∅) = +∞.


Example 4.5 Suppose that (X_n) is (F_n)-adapted and let B be a measurable set. Then T_B is an (F_n)-stopping time:

{T_B ≤ n} = ∪_{1≤k≤n} {ω : X_k(ω) ∈ B} ∈ F_n.

If (X_t) is a right continuous (F_t)-adapted stochastic process, the hitting time of an open set is an F_{t+}-stopping time. Recall one of the usual assumptions: F_t = F_{t+}. The first hitting time of a closed set by a continuous (F_t)-adapted stochastic process is an F_t-stopping time.
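In discrete time, the hitting time and the stopped process are easy to realize. A minimal sketch, in which the convention inf(∅) = +∞ is represented by returning None:

```python
def hitting_time(path, B):
    # T_B = inf{n > 0 : X_n in B}; None plays the role of +infinity
    for n in range(1, len(path)):
        if path[n] in B:
            return n
    return None

def stopped_path(path, T):
    # the stopped process X^T_n = X_{T ^ n}
    if T is None:
        return list(path)
    return [path[min(n, T)] for n in range(len(path))]

path = [0, 1, 2, 1, 3]
T = hitting_time(path, {2, 3})   # first n > 0 with X_n in {2, 3}: here T = 2
```

After time T the stopped path is frozen at the value X_T, matching Definition 4.3.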

Proposition 4.8 Let S, T, Tn be stopping times.

(1) Then S ∨ T = max(S, T ), S ∧ T = min(S, T ) are stopping times.

(2) lim supn→∞ Tn and lim infn→∞ Tn are stopping times.

Proof Part (1) follows from the following observations:

{max(S, T) ≤ t} = {S ≤ t} ∩ {T ≤ t}, {min(S, T) ≤ t} = {S ≤ t} ∪ {T ≤ t}.

Since

lim sup_{n→∞} T_n = inf_{n≥1} sup_{k≥n} T_k, lim inf_{n→∞} T_n = sup_{n≥1} inf_{k≥n} T_k,

it suffices to prove that the supremum and the infimum of a sequence of stopping times are stopping times. For the supremum,

{sup_n T_n ≤ t} = ∩_n {T_n ≤ t} ∈ F_t.

For the infimum, {inf_n T_n < t} = ∪_n {T_n < t}, hence

{inf_n T_n ≤ t} = ∩_{m≥1} ∪_n {T_n < t + 1/m} ∈ F_{t+} = F_t,

using the standing assumption that the filtration is right continuous.

Definition 4.4 Let T be a stopping time. Define

F_T = {A ∈ F_∞ : A ∩ {T ≤ t} ∈ F_t, ∀t ≥ 0}.

If T = t is a constant time, F_T agrees with F_t. If T takes values in N, then F_T = {A ∈ F_∞ : A ∩ {T = n} ∈ F_n, ∀n ∈ N}.

Theorem 4.9 (1) If T is a stopping time and (X_t) is a progressively measurable stochastic process, then X_T is F_T-measurable.

(2) If T is a finite stopping time, then F_T = σ{X_T : X is cadlag and adapted}.


For a proof see Revuz-Yor [24] and Protter [22].

Proposition 4.10 Let S, T be stopping times.

(1) If S ≤ T then FS ⊂ FT .

(2) Let S ≤ T and A ∈ FS . Then S1A + T1Ac is a stopping time.

(3) S is FS measurable.

(4) FS ∩ S ≤ T ⊂ FS∧T .

Proof

(1) If A ∈ F_S, then since {T ≤ t} ⊂ {S ≤ t},

A ∩ {T ≤ t} = (A ∩ {S ≤ t}) ∩ {T ≤ t} ∈ F_t,

and hence A ∈ F_T.

(2) For A ∈ F_S ⊂ F_T,

{S1_A + T1_{Aᶜ} ≤ t} = ({S ≤ t} ∩ A) ∪ ({T ≤ t} ∩ Aᶜ) ∈ F_t.

(3) For r, t ∈ R, {S ≤ r} ∩ {S ≤ t} = {S ≤ min(r, t)} ∈ F_t. Hence {S ≤ r} ∈ F_S.

(4) Take A ∈ F_S and t ≥ 0. Then

A ∩ {S ≤ T} ∩ {S ∧ T ≤ t} = (A ∩ {S ≤ t}) ∩ {S ∧ t ≤ T ∧ t} ∈ F_t,

which follows as S ∧ t and T ∧ t are F_t-measurable. Hence A ∩ {S ≤ T} ∈ F_{S∧T}.

For a nice account of stopping times see Kallenberg [14].

Proposition 4.11 Let T be a stopping time.

(1) If (X_n) is a martingale, so is the stopped process (X^T_n).

(2) If (X_n) is a super-martingale, so is (X^T_n).


Proof For any n ∈ N,

X^T_n = Σ_{i=1}^{n−1} X_i 1_{{T=i}} + X_n 1_{{T≥n}}.

Hence for every n, E|X^T_n| ≤ Σ_{i=1}^n E|X_i| < ∞. We observe that 1_{{T≥n}} = 1 − 1_{{T≤n−1}} is F_{n−1}-measurable. Hence

E(X^T_n|F_{n−1}) = Σ_{i=1}^{n−1} X_i 1_{{T=i}} + E(X_n 1_{{T≥n}}|F_{n−1})
= Σ_{i=1}^{n−1} X_i 1_{{T=i}} + 1_{{T≥n}} E(X_n|F_{n−1})
= Σ_{i=1}^{n−2} X_i 1_{{T=i}} + X_{n−1} 1_{{T=n−1}} + 1_{{T≥n}} X_{n−1}
= X^T_{n−1}.

In case (2), 1_{{T≥n}} E(X_n|F_{n−1}) ≤ X_{n−1} 1_{{T≥n}} and E(X^T_n|F_{n−1}) ≤ X^T_{n−1}.

4.6 The Optional Stopping Theorems (Lecture 11)

By a bounded stopping time T we mean that there exists a number C such that T(ω) ≤ C a.s. Let I be an ordered countable set of positive real numbers.

Proposition 4.12 (Doob's Elementary Optional Stopping Theorem) Let (X_r, r ∈ I) be a super-martingale and let S ≤ T be bounded stopping times. Then

E(X_T) ≤ E(X_S).

If furthermore (X_r, r ∈ I) is a martingale, equality holds.

Proof We give the proof for integer times. Let H_n = 1_{{T≥n}} − 1_{{S≥n}} = 1_{{S<n≤T}}; H is previsible since {T ≥ n} = {T ≤ n − 1}ᶜ ∈ F_{n−1}. With S ≤ T ≤ N,

(H · X)_N = Σ_{S<i≤T} (X_i − X_{i−1}) = (X_T − X_{T−1}) + · · · + (X_{S+1} − X_S) = X_T − X_S.

Since H is non-negative and bounded and (X_n) is a super-martingale, H · X is a super-martingale (Lemma 4.2). Thus E(H · X)_N ≤ E(H · X)_0 = 0 and E(X_T) ≤ E(X_S).


4.7 Doob’s Optional Stopping Theorem (Lecture 12)

Let I be any index set.

Proposition 4.13 Let (X_t, t ∈ I) be an integrable, right or left continuous (or progressively measurable) stochastic process.

(1) Suppose that for all bounded stopping times S ≤ T, EX_T = EX_S. Then

E(X_T | F_S) = X_S.

(2) Suppose that for all bounded stopping times S ≤ T, EX_T ≤ EX_S. Then

E(X_T | F_S) ≤ X_S.

Proof Let S ≤ T be two stopping times bounded by C. Let A ∈ F_S. Define τ = S1_A + T1_{A^c} ≤ C; it is a stopping time by Proposition 4.10. Then

E(X_T) = E[X_T 1_A] + E[X_T 1_{A^c}],
E(X_τ) = E[X_S 1_A] + E[X_T 1_{A^c}].

(1) For the first statement, E(X_τ) = E(X_T) by assumption. Thus E[X_T 1_A] = E[X_S 1_A] for all A ∈ F_S. It follows that E(X_T | F_S) = X_S.

(2) For the second statement, E(X_τ) ≥ E(X_T) by the assumption, giving

E(X_T 1_A) ≤ E(X_S 1_A).

Since E(X_T 1_A) = E( E(X_T | F_S) 1_A ), we have

E( (E(X_T | F_S) − X_S) 1_A ) ≤ 0

for any A ∈ F_S. Hence E(X_T | F_S) ≤ X_S.

Note that if X_∞ = lim_{t→∞} X_t exists, the above works with T replaced by ∞.

Theorem 4.14 (Doob's Optional Stopping Theorem) Let S and T be two bounded stopping times such that S ≤ T.

(1) Let (X_t, t ≥ 0) be a right continuous martingale. Then

E(X_T | F_S) = X_S, a.s.


(2) Let (Xt, t ≥ 0) be a right continuous super-martingale. Then

E(X_T | F_S) ≤ X_S

almost surely.

Proof We prove part (1). Let K ∈ R be such that S(ω) ≤ T(ω) ≤ K a.s. Let

S_n = (⌊2^n S⌋ + 1)/2^n.

In other words,

S_n(ω) = (j + 1)/2^n if S(ω) ∈ [j/2^n, (j + 1)/2^n), j = 0, 1, 2, . . . .

Then S_n decreases with n and |S_n − S| ≤ 1/2^n → 0. If t ∈ [m/2^n, (m + 1)/2^n),

{S_n ≤ t} = {S ≤ m/2^n} ∈ F_{m/2^n} ⊂ F_t.

So the S_n are stopping times. Recall that (X_t) is integrable. By Doob's elementary optional stopping theorem,

X_{S_n} = E(X_K | F_{S_n}).

Since {E(X_K | F_{S_n}), n ∈ N} is uniformly integrable (see Lemma 3.5), by Proposition 3.10,

E(X_S) = E lim_{n→∞} X_{S_n} = lim_{n→∞} E(X_{S_n}) = lim_{n→∞} E( E(X_K | F_{S_n}) ) = E(X_K).

We have used the right continuity of the process. That E(X_T | F_S) = X_S follows from Proposition 4.13.
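The dyadic approximation S_n used in this proof is easy to visualise numerically. The snippet below is an illustrative sketch (the value 0.7310 is an arbitrary sample of S(ω), not from the notes): S_n decreases to S from above with error at most 2^{−n}.

```python
import math

def dyadic_approx(s, n):
    # S_n = (floor(2**n * s) + 1) / 2**n: the level-n dyadic rational
    # strictly above s, exactly as in the proof of Theorem 4.14.
    return (math.floor(2**n * s) + 1) / 2**n

s = 0.7310   # an arbitrary sample value of the stopping time S
approx = [dyadic_approx(s, n) for n in range(1, 12)]
assert all(a > s for a in approx)                        # S_n > S
assert all(x >= y for x, y in zip(approx, approx[1:]))   # S_n decreases
assert all(abs(a - s) <= 2**-n
           for n, a in enumerate(approx, start=1))       # |S_n - S| <= 2^-n
print(approx[:3])
```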

4.8 Right End of a Martingale and OST II (Lecture 13)

Theorem 4.15 (Closability of Martingales) If (X_t, t ∈ [0, T)) is a right continuous martingale, the following statements are equivalent:

(1) X_t converges to a random variable X_T in L¹ (i.e. lim_{t→T} E|X_t − X_T| = 0).

(2) There exists an L¹ random variable X_T s.t. X_t = E(X_T | F_t) for any 0 ≤ t < T.


(3) (Xt, t < T ) is uniformly integrable.

Proof In all cases sup_t E|X_t| < ∞ and, by the convergence theorem (Proposition 4.5), X_T := lim_{t→T} X_t exists almost surely.

• That (1) is equivalent to (3) is standard, cf. part (5) of Proposition 3.10.

• Assume (2). By Lemma 3.5, the conditional random variables (X_t, t ≥ 0) are uniformly integrable, hence (3) holds.

• Assume (3): (X_t, t < T) is uniformly integrable. By Proposition 3.9, sup_t E|X_t| is bounded; by Theorem 4.5, lim_{t→T} X_t belongs to L¹. By the martingale property, for any T > u > t, X_t = E(X_u | F_t). By the uniform integrability,

X_t = lim_{u→T} E(X_u | F_t) = E(X_T | F_t),

giving (2).

We define X_∞ = lim_{t→∞} X_t when the limit exists. If lim_{t→∞} X_t exists and T is a stopping time, we define X_T = X_∞ on {T = ∞}.

Theorem 4.16 (The Optional Stopping Theorem II) Let (X_t, t ≥ 0) be a uniformly integrable sub-martingale. Let S ≤ T be stopping times (not necessarily bounded). Then

E(X_T | F_S) ≥ X_S, E(X_∞ | F_T) ≥ X_T.

If (X_t, t ≥ 0) is furthermore a uniformly integrable martingale, then

E(X_T | F_S) = X_S, E(X_∞ | F_T) = X_T.

Proof We only prove the case of a martingale (the sub-martingale case is left as an exercise). We only need to prove E(X_S) = E(X_∞) for any stopping time S; see the proof of Proposition 4.13 with T replaced by ∞.

By Theorem 4.15, for all t > 0,

X_t = E(X_∞ | F_t).

Let A_n^k = {S ∈ [k/2^n, (k+1)/2^n)} and S_n = ∑_k ((k+1)/2^n) 1_{A_n^k}. Let A ∈ F_{S_n}. Then

E(X_{S_n} 1_{A_n^k} 1_A) = E(X_{(k+1)/2^n} 1_{A_n^k} 1_A) = E(X_∞ 1_{A_n^k} 1_A).

Summing over all k,

E(X_{S_n} 1_A) = E(X_∞ 1_A). (4.1)

In particular for any B ∈ F_S ⊂ F_{S_n},

E(X_{S_n} 1_B) = E(X_∞ 1_B).

Hence X_{S_n} = E(X_∞ | F_{S_n}), so {X_{S_n}, n ∈ N} is uniformly integrable. Taking n → ∞ we see that

E(X_S 1_B) = lim_{n→∞} E(X_{S_n} 1_B) = E(X_∞ 1_B).

This proves that E(X_∞ | F_S) = X_S.

Corollary 4.17 (1) If (M_t, 0 ≤ t ≤ a) is a right continuous martingale then {M_T : T a stopping time, T ≤ a} is uniformly integrable.

(2) If (M_t, 0 ≤ t ≤ ∞) is a uniformly integrable right continuous martingale then {M_T : T any stopping time} is uniformly integrable.

Proof (1) M_T = E(M_a | F_T) and (2) M_T = E(M_∞ | F_T).

4.9 Martingale Inequalities (Lecture 14-15)

The inequalities in this section are important. The proofs are straightforward and I only cover the proof of the maximal inequality in the lecture.

Lemma 4.18 (Maximal Inequality) If (X_k) is a sub-martingale (or a martingale), then for λ > 0,

λ P( max_{0≤k≤N} X_k ≥ λ ) ≤ E( X_N 1_{max_{0≤k≤N} X_k ≥ λ} ).

Proof This follows by letting T = inf{k : X_k ≥ λ}, and setting T = N if the infimum exceeds N. Then T is a bounded stopping time. By the optional stopping theorem,

E(X_N) ≥ E(X_T) = E( X_T 1_{max_{0≤k≤N} X_k ≥ λ} ) + E( X_T 1_{max_{0≤k≤N} X_k < λ} )
≥ λ P( max_{0≤k≤N} X_k ≥ λ ) + E( X_N 1_{max_{0≤k≤N} X_k < λ} ).

It follows that

λ P( max_{0≤k≤N} X_k ≥ λ ) ≤ E( X_N 1_{max_{0≤k≤N} X_k ≥ λ} ).
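As an illustrative sanity check (an aside, not from the notes), the maximal inequality can be verified exactly for the simple symmetric random walk by enumerating all paths:

```python
from itertools import product

def maximal_inequality_check(N=12, lam=3):
    # Exact check of Lemma 4.18 for the simple symmetric random walk,
    # a martingale: lam * P(max_k X_k >= lam) <= E[X_N ; max_k X_k >= lam].
    hits = 0        # paths whose running maximum reaches lam
    tail_sum = 0    # sum of X_N over those paths
    for steps in product((-1, 1), repeat=N):
        x, m = 0, 0
        for s in steps:
            x += s
            m = max(m, x)
        if m >= lam:
            hits += 1
            tail_sum += x
    paths = 2**N
    return lam * hits / paths, tail_sum / paths

lhs, rhs = maximal_inequality_check()
assert lhs <= rhs
print(lhs, rhs)
```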


Lemma 4.19 Suppose that (X_k) is a martingale or a positive sub-martingale. Let X* = sup_{0≤k≤n} |X_k|. Then for p > 1,

E sup_{0≤k≤n} |X_k|^p ≤ ( p/(p − 1) )^p E|X_n|^p.

Proof If (X_k) is a martingale, |X_k| is a sub-martingale, so it is sufficient to work with positive sub-martingales. For any constant C,

E(X* ∧ C)^p = E ∫_0^{X*∧C} p t^{p−1} dt = ∫_0^C p t^{p−1} E 1_{t≤X*} dt.

Apply the maximal inequality:

∫_0^C p t^{p−1} E 1_{t≤X*} dt ≤ ∫_0^C p t^{p−2} E( |X_n| 1_{X*≥t} ) dt
= E( |X_n| ∫_0^C p t^{p−2} 1_{X*≥t} dt ) = (p/(p − 1)) E( |X_n| (X* ∧ C)^{p−1} ).

By Hölder's inequality,

E( |X_n| (X* ∧ C)^{p−1} ) ≤ (E|X_n|^p)^{1/p} [E(X* ∧ C)^p]^{(p−1)/p}.

To summarise, we have

E(X* ∧ C)^p ≤ ( p/(p − 1) )^p E|X_n|^p.

The required inequality follows by taking C to infinity.

We return to continuous time stochastic processes. Let I = [a, b], I = [a, b) or I = [a, ∞). Let (X_t, t ∈ I) be a right continuous martingale or a positive sub-martingale. For p > 1, |X_t|^p is a sub-martingale; in particular E|X_t|^p increases with t. Set

X* = sup_{t∈I} |X_t|.

If (B_t) is a Brownian motion, then by the reflection principle

P( sup_{s≤t} B_s ≥ a ) = 2P(B_t ≥ a) = P(|B_t| ≥ a) ≤ (1/a^p) E|B_t|^p.

We do not study this equality in this course, and will instead study the L^p inequality below.
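The reflection identity has an exact discrete analogue for the simple symmetric random walk (a standard fact, included here as an illustrative aside): P(max_k S_k ≥ a) = 2P(S_N > a) + P(S_N = a). The snippet below verifies it by enumeration.

```python
from itertools import product

def walk_reflection(N=14, a=3):
    # Discrete analogue of P(sup_{s<=t} B_s >= a) = 2 P(B_t >= a):
    # reflecting a simple-symmetric-walk path at its first visit to level a
    # gives the exact identity P(max_k S_k >= a) = 2 P(S_N > a) + P(S_N = a).
    hit = gt = eq = 0
    for steps in product((-1, 1), repeat=N):
        s, m = 0, 0
        for d in steps:
            s += d
            m = max(m, s)
        hit += m >= a
        gt += s > a
        eq += s == a
    return hit, 2 * gt + eq   # counts of paths: both sides of the identity

lhs, rhs = walk_reflection()
assert lhs == rhs
print(lhs, rhs)
```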


Proposition 4.20 Let (X_t, t ∈ I) be a right continuous martingale or a positive sub-martingale, where I is an interval.

(1) Maximal Inequality. For p ≥ 1 and λ > 0,

P( sup_{t∈I} |X_t| ≥ λ ) ≤ (1/λ^p) sup_{t∈I} E|X_t|^p.

(2) Doob's L^p inequality. Let p > 1. Then

E( sup_{t∈I} |X_t|^p ) ≤ ( p/(p − 1) )^p sup_{t∈I} E|X_t|^p.

Proof If (X_k) is a martingale or a positive sub-martingale, then for p ≥ 1, (|X_k|^p) is a sub-martingale and sup_{0≤k≤n} E|X_k|^p = E|X_n|^p. The maximal inequality for a finite index set extends to stochastic processes with a countable index set. As the values of (X_t) are determined by (X_t, t ∈ Q ∩ I) by right continuity, the required inequality follows.
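To see Doob's L^p inequality in action, here is an illustrative exact computation (an aside, not from the notes) for the simple symmetric walk with p = 2, where the constant is (p/(p − 1))^p = 4:

```python
from itertools import product

def doob_l2_check(N=12):
    # Exact check of Doob's L^2 inequality (Proposition 4.20(2)) for the
    # simple symmetric walk: E[ max_k |X_k|^2 ] <= 4 E[|X_N|^2] = 4N.
    lhs_sum = 0
    for steps in product((-1, 1), repeat=N):
        x, m = 0, 0
        for s in steps:
            x += s
            m = max(m, abs(x))
        lhs_sum += m * m
    return lhs_sum / 2**N, 4 * N

lhs, rhs = doob_l2_check()
assert lhs <= rhs
print(lhs, rhs)
```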


Chapter 5

Continuous Local Martingales and the Quadratic Variation Process

5.1 Lecture 14-15. Local Martingales

We fix a filtered probability space (Ω, F, F_t, P).

Definition 5.1 Let (X_t) be an F_t-adapted stochastic process. If there exists a sequence of stopping times {T_n} with T_n ≤ T_m for n ≤ m and lim_{n→∞} T_n = ∞ a.s., with the property that for each n, (X_{T_n∧t} 1_{T_n>0}, t ≥ 0) is a uniformly integrable martingale, we say that (X_t) is a local martingale and that {T_n} reduces X.

For t = 0, X_{T_n∧0} 1_{T_n>0} = X_0 1_{T_n>0}.

A sample continuous martingale is a local martingale (take T_n = n). A bounded continuous local martingale is a martingale. The following definition comes from [6].

Definition 5.2 (1) A local martingale that is not a martingale is a strictly local martingale.

(2) An n-dimensional stochastic process (X^1_t, . . . , X^n_t) is an F_t local martingale if each component is an F_t local martingale.

(3) A (special/decomposable) semi-martingale is an adapted stochastic process such that X_t = X_0 + M_t + A_t, where X_0 is a random variable, (M_t) is a local martingale and (A_t) a process of finite total variation, with M_0 = A_0 = 0. The decomposition is called the Doob-Meyer decomposition.

(4) A continuous semi-martingale (X_t) is of the form X_t = X_0 + M_t + A_t where M_t and A_t are continuous.

The terminology semi-martingale was traditionally introduced to define stochastic integrals. If a stochastic process can be decomposed as in (3), it is indeed a semi-martingale in the traditional sense. Processes of the form X_t = X_0 + M_t + A_t are traditionally called special or decomposable semi-martingales. We will drop the qualifier 'special' or 'decomposable'.

Remark 5.1 (1) A local martingale (M_t) is a martingale if for each t, with {T_n} the reducing sequence of stopping times, the family {M_{T_n∧t}, n ≥ 0} is uniformly integrable. It is a martingale if for all t > 0,

{M_T : T a bounded stopping time, T ≤ t}

is uniformly integrable.

(2) A local martingale (M_t) is a martingale if |M_t| ≤ Z for all t, where Z ∈ L¹. In particular a bounded local martingale is a martingale.

(3) If (X_t) is a martingale then EX_t = EX_0. If (X_t) is a local martingale this no longer holds; indeed, given any function m(t) of bounded variation there is a local martingale whose expectation process is m(t). A local martingale which is not a martingale is called a strictly local martingale, otherwise it is a true martingale; see Elworthy-Li-Yor [6] for related discussions.

(4) A stochastic integral, as defined in the next chapter, with respect to a continuous local martingale is a local martingale.

Theorem 5.1 Let (M_t, t ≤ T) be a continuous local martingale and let A = {ω : the total variation of t ↦ M_t(ω) on [0, T] is finite}. Then for almost all ω ∈ A, M_t(ω) = M_0(ω) for all t ∈ [0, T].

Appendix

Proof of Theorem 5.1. This proof is the same as that for Brownian motion. We may assume that M_0 = 0. Let t ≤ T. First let

M_TV(t, ω) = sup_∆ ∑_{j=0}^{N−1} |M_{t_{j+1}}(ω) − M_{t_j}(ω)|,

where ∆ ranges through all partitions 0 = t_0 < t_1 < · · · < t_N = t of [0, t]. It is increasing and continuous in t. Let

T_n = inf{t : M_TV(t) ≥ n}.

Fix n and write X_t = M_{T_n∧t}. Then X_t is bounded by n: since M_0 = 0,

|X_t| = |M_{T_n∧t}| ≤ |M_{T_n∧t} − M_0| ≤ M_TV(t) ≤ n.

Also (X_t) is a martingale. For a martingale, E(X_t)² = E ∑_{i=0}^{N−1} |X_{t_{i+1}} − X_{t_i}|². Indeed,

E ∑_{i=0}^{N−1} |X_{t_{i+1}} − X_{t_i}|² = ∑_{i=0}^{N−1} E( E( |X_{t_{i+1}} − X_{t_i}|² | F_{t_i} ) )
= ∑_{i=0}^{N−1} E( E( X²_{t_{i+1}} − 2X_{t_{i+1}}X_{t_i} + X²_{t_i} | F_{t_i} ) )
= ∑_{i=0}^{N−1} ( EX²_{t_{i+1}} − EX²_{t_i} ) = EX²_t.

Hence

EX²_t = E ∑_{i=0}^{N−1} |X_{t_{i+1}} − X_{t_i}|² ≤ E( max_i |X_{t_{i+1}} − X_{t_i}| ∑_{i=0}^{N−1} |X_{t_{i+1}} − X_{t_i}| ) ≤ n E( max_i |X_{t_{i+1}} − X_{t_i}| ).

Since X is uniformly continuous on [0, t] and bounded, by the dominated convergence theorem E( max_i |X_{t_{i+1}} − X_{t_i}| ) → 0 as the mesh of the partition tends to 0. Hence E(X²_t) = 0, that is, M_{T_n∧t} = 0 almost surely. This implies that M_t = 0 on {t < T_n} for any n. On the set A where the total variation is finite, T_n → ∞; taking n → ∞ we conclude that, almost surely on A, M_t = M_0 for all t ≤ T.


5.2 The Quadratic Variation Process (Lecture 15-16)

Definition 5.3 Let (X_t) and (Y_t) be two continuous processes. If for any sequence of partitions ∆_n = {t^n_j} with |∆_n| → 0,

lim_{n→∞} ∑_{j=0}^{∞} (X_{t∧t^n_{j+1}} − X_{t∧t^n_j})(Y_{t∧t^n_{j+1}} − Y_{t∧t^n_j})

exists in probability, we define the limit to be ⟨X, Y⟩_t.

In particular,

⟨X, X⟩_t = lim_{n→∞} ∑_{j=0}^{∞} (X_{t∧t^n_{j+1}} − X_{t∧t^n_j})², the limit taken in probability.
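A quick Monte Carlo illustration (an aside, using an assumed Euler discretisation of Brownian motion): the sum of squared increments of a simulated Brownian path over a partition of [0, 1] approaches ⟨B, B⟩_1 = 1 as the mesh shrinks.

```python
import random, math

def quadratic_variation(t=1.0, n=100_000, seed=0):
    # Simulate Brownian increments over a partition of [0, t] with mesh
    # t/n and return the sum of squared increments; by Definition 5.3
    # this converges in probability to <B, B>_t = t as the mesh -> 0.
    rng = random.Random(seed)
    sd = math.sqrt(t / n)
    return sum(rng.gauss(0.0, sd) ** 2 for _ in range(n))

qv = quadratic_variation()
assert abs(qv - 1.0) < 0.05
print(round(qv, 3))
```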

Theorem 5.2 For any continuous local martingales (M_t) and (N_t), there exists a unique continuous process ⟨M, N⟩_t of finite variation vanishing at 0 such that M_t N_t − ⟨M, N⟩_t is a continuous local martingale. This process is called the bracket process or the quadratic variation of (M_t) and (N_t).

Theorem 5.3 The stochastic process (⟨M, N⟩_t) has the following properties:

(1) It is symmetric and bilinear, and

⟨M, N⟩ = (1/4)[⟨M + N, M + N⟩ − ⟨M − N, M − N⟩]. (5.1)

(2) ⟨M − M_0, N − N_0⟩_t = ⟨M, N⟩_t.

(3) ⟨M⟩_t ≡ ⟨M, M⟩_t is increasing.

(4) If (M_t) is bounded, M²_t − ⟨M⟩_t is a martingale.

(5) For any t and any sequence of partitions ∆_n : 0 ≤ t^n_0 < · · · < t^n_{M_n} = t of [0, t] with |∆_n| → 0,

lim_{n→∞} sup_{s≤t} | ∑_{j=0}^{M_n} (M_{s∧t^n_{j+1}} − M_{s∧t^n_j})(N_{s∧t^n_{j+1}} − N_{s∧t^n_j}) − ⟨M, N⟩_s | = 0.

The convergence is in probability.

Proposition 5.4 • If (A_t) is a continuous process of finite variation and (X_t) is a continuous semi-martingale, then

⟨X, A⟩_t = 0.

• If X_t = M_t + A_t and Y_t = N_t + C_t are two continuous semi-martingales with local martingale parts M and N, then

⟨X, Y⟩_t = ⟨M, N⟩_t.

Proposition 5.5 Let T be a stopping time and let M and N be continuous local martingales. Then

⟨M^T, N^T⟩ = ⟨M, N⟩^T = ⟨M, N^T⟩.

Proof If (M_t) is bounded, then (M^T_t) is a martingale and (M^T_t)² − ⟨M^T, M^T⟩_t is a martingale. Since (M^T_t)² − ⟨M, M⟩_{T∧t} = (M² − ⟨M, M⟩)^T_t is also a martingale (a stopped martingale), uniqueness of the bracket gives

⟨M^T, M^T⟩_t = ⟨M, M⟩_{T∧t}.

For unbounded local martingales, take a localising sequence {T_n} such that (M^{T_n}_t) is bounded. Then

⟨M^{T_n∧T}, M^{T_n∧T}⟩_t = ⟨M, M⟩_{T∧T_n∧t}.

Thus (M^{T∧T_n}_t)² − ⟨M, M⟩_{T∧T_n∧t} is a martingale, hence (M^T_t)² − ⟨M, M⟩_{T∧t} is a local martingale, which means that ⟨M^T, M^T⟩ = ⟨M, M⟩^T.

Now

M^T_t N^T_t − M_0 N_0 − (1/4)[⟨M + N, M + N⟩^T_t − ⟨M − N, M − N⟩^T_t]

is a local martingale. Hence ⟨M^T, N^T⟩ = ⟨M, N⟩^T. Similarly one shows that ⟨M, N^T⟩ = ⟨M^T, N^T⟩ = ⟨M, N⟩^T.

Proposition 5.6 If M is a continuous local martingale then ⟨M, M⟩_t = 0 if and only if M_s = M_0 for s ∈ [0, t].

Proof Assume that ⟨M, M⟩_t = 0. First suppose that M is bounded and M_0 = 0. Let s ≤ t. Then E(M_s − M_0)² = E(M_s)² = E⟨M, M⟩_s = 0, so M_s = M_0 for all s ≤ t. Otherwise let {T_n} be a reducing sequence of stopping times; then M_{T_n∧t} − M_0 = 0 almost surely for each n, and we take n → ∞ to complete the proof. For the converse, see Revuz-Yor, Proposition 1.13.


5.3 Local Martingale Inequality and Lévy's Martingale Characterization Theorem (Lecture 18)

Let I = [0, T] if T ∈ R⁺, or I = [0, ∞). Let X* = sup_{t∈I} |X_t|. In this section we state, without proof, some important theorems.

Theorem 5.7 (Burkholder-Davis-Gundy Inequality) For every p > 0, there exist universal constants c_p and C_p such that if (M_t) is a continuous local martingale with M_0 = 0,

c_p E( ⟨M, M⟩_T )^{p/2} ≤ E( sup_{t∈I} |M_t| )^p ≤ C_p E( ⟨M, M⟩_T )^{p/2}.

Remark 5.2 Let (M_t) be a continuous local martingale with M_0 = 0. If sup_{t<∞} |M_t| ∈ L¹ then (M_t) is a martingale: for any stopping time τ,

sup_{t<∞} |M_{t∧τ}| ≤ sup_{t<∞} |M_t|, sup_τ |M_τ| ≤ sup_{t<∞} |M_t|,

so the stopped processes are dominated by an integrable random variable.

Theorem 5.8 (Lévy's Martingale Characterization Theorem) An F_t-adapted continuous real valued stochastic process (B_t) vanishing at 0 is a standard F_t-Brownian motion if and only if (B_t) is an F_t-martingale with quadratic variation t.

Theorem 5.9 Let T be a finite stopping time. Then (B_{T+s} − B_T, s ≥ 0) is a Brownian motion.

Definition 5.4 An n-dimensional stochastic process (X^1_t, . . . , X^n_t) is an F_t local martingale if each component is an F_t local martingale.

The multi-dimensional version:

Theorem 5.10 (Lévy's Martingale Characterization Theorem) An (F_t)-adapted sample continuous stochastic process (B_t) in Rᵈ vanishing at 0 is an (F_t)-Brownian motion if and only if each component (B^i_t) is an (F_t) local martingale and ⟨B^i, B^j⟩_t = δ_{ij} t.

Theorem 5.11 (Dambis, Dubins-Schwarz) Let (F_t) be a right continuous filtration. Let (M_t) be a continuous local martingale vanishing at 0 such that ⟨M, M⟩_∞ = ∞. Define

T_t = inf{s : ⟨M, M⟩_s > t}.

Then B_t := M_{T_t} is an F_{T_t}-Brownian motion and M_t = B_{⟨M,M⟩_t} a.s.


The condition on the bracket ensures that the time change T_t is almost surely finite for all t. The proof applies Lévy's characterization theorem for Brownian motions (Theorem 5.8).

5.3.1 Appendix

The following proof uses Itô's formula.

Proof of Theorem 5.8. If (B_t) is an F_t-Brownian motion we already know that it is a martingale with quadratic variation t. Suppose conversely that (B_t) is a martingale with quadratic variation t. Let a be a real number. We apply Itô's formula to the function (x, y) ↦ e^{iax + a²y/2} and the stochastic process (B_t, t):

e^{iaB_t + a²t/2} = 1 + ia ∫_0^t e^{iaB_s + a²s/2} dB_s + (a²/2) ∫_0^t e^{iaB_s + a²s/2} ds + (1/2)(ia)² ∫_0^t e^{iaB_s + a²s/2} ds.

The two ds-terms cancel, so

e^{iaB_t + a²t/2} = 1 + ia ∫_0^t e^{iaB_s + a²s/2} dB_s

and e^{iaB_t + a²t/2} is a martingale. This means that

E( e^{ia(B_t − B_s)} | F_s ) = e^{−a²(t−s)/2}.

Let φ(a) := E( e^{ia(B_t − B_s)} | F_s ). Since φ(a) is non-random,

E e^{ia(B_t − B_s)} = E( e^{ia(B_t − B_s)} | F_s ).

Thus B_t − B_s is independent of F_s and is a Gaussian random variable with distribution N(0, t − s).

Proof of Theorem 5.10. Proof of the 'if' part: for any λ ∈ Rᵈ, Y_t = ⟨λ, X_t⟩ = ∑_j λ_j X^j_t is a local martingale with bracket |λ|²t. The exponential martingale

exp( i⟨λ, X_t⟩ + (1/2)|λ|²t )

is a martingale as it is bounded on any compact time interval, hence

E( exp( i⟨λ, X_t − X_s⟩ ) | F_s ) = e^{−(1/2)|λ|²(t−s)}.

This is sufficient to show that X_t − X_s is independent of F_s and

E exp( i⟨λ, X_t − X_s⟩ ) = e^{−(1/2)|λ|²(t−s)},

which implies that X_t − X_s ∼ N(0, (t − s)I).

Proof of the 'only if' part. First, for s < t,

E(X^i_t | F_s) = E(X^i_t − X^i_s + X^i_s | F_s) = E(X^i_t − X^i_s) + X^i_s = X^i_s


and each (X^i_t) is a martingale. For s < t,

E( (X^i_t)² − (X^i_s)² | F_s ) = E( (X^i_t − X^i_s)² | F_s ) = E(X^i_t − X^i_s)² = t − s.

Then ⟨X^i, X^i⟩_t = t, and the bracket of independent Brownian motions is zero.
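An illustrative simulation of the bracket conditions in Theorem 5.10 (an aside, using a simple Euler discretisation): for two independent simulated Brownian motions, ⟨B^i, B^i⟩_1 ≈ 1 while the cross bracket ⟨B^1, B^2⟩_1 ≈ 0.

```python
import random, math

def brackets(t=1.0, n=100_000, seed=1):
    # Sums of products of increments of two independent simulated Brownian
    # motions: estimate <B^1,B^1>_t, <B^2,B^2>_t (≈ t) and <B^1,B^2>_t (≈ 0).
    rng = random.Random(seed)
    sd = math.sqrt(t / n)
    q11 = q22 = q12 = 0.0
    for _ in range(n):
        d1, d2 = rng.gauss(0.0, sd), rng.gauss(0.0, sd)
        q11 += d1 * d1
        q22 += d2 * d2
        q12 += d1 * d2
    return q11, q22, q12

q11, q22, q12 = brackets()
assert abs(q11 - 1.0) < 0.05 and abs(q22 - 1.0) < 0.05 and abs(q12) < 0.05
print(round(q11, 3), round(q22, 3), round(q12, 3))
```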

5.4 The Hilbert Space of L² Bounded Martingales (Lecture 18)

Let (Ω, F, F_t, P) be a filtered probability space, where (F_t) satisfies the usual right continuity and completion assumptions. If f ∈ L¹, then Y_t = E(f | F_t) can be chosen in such a way that (Y_t) is cadlag (in particular right continuous).

Theorem 5.12 Let H² be the space of L² bounded right continuous martingales. Let

‖M‖_{H²} = √(E(M_∞)²) = lim_{t→∞} √(E(M_t)²).

Then H² is a Hilbert space, and the subspace of continuous L² bounded martingales is closed in H².

Proof Let (M_t) ∈ H². Then M_∞ = lim_{t→∞} M_t exists and M_t = E(M_∞ | F_t). By Doob's L² inequality,

E sup_t (M_t)² ≤ 4 sup_t E(M_t)² < ∞.

Hence {(M_t)²} is uniformly integrable and M_∞ ∈ L². There is a one-to-one correspondence between H² and L²(Ω, F_∞, P).

Since ⟨M, M⟩_t increases with t and is positive, by Fatou's lemma,

E⟨M, M⟩_∞ = E lim_{t→∞} ⟨M, M⟩_t ≤ lim_{t→∞} E⟨M, M⟩_t = sup_t E(M_t)² < ∞.

This means that (M_t)² − ⟨M, M⟩_t is a uniformly integrable martingale with a limit as t → ∞. In particular,

E(M_∞)² = E⟨M, M⟩_∞.

To see that the subspace of continuous martingales is closed, let (M^{(n)}_t) be continuous L² bounded martingales converging to (M_t) in H². By Doob's L² inequality (Proposition 4.20(2)),

E sup_t (M^n_t − M_t)² ≤ 4 sup_t E(M^n_t − M_t)² → 0.

In particular there is a subsequence (M^{n_k}) such that

sup_t (M^{n_k}_t − M_t)² → 0

almost surely. This implies that (M_t) is continuous almost surely, so the subspace of continuous martingales is closed in H².

Theorem 5.13 If (M_t) is a continuous local martingale with M_0 = 0, the following statements are equivalent:

1. (M_t) is an L² bounded martingale.

2. E⟨M, M⟩_∞ < ∞.

If either holds, (M_t)² − ⟨M, M⟩_t is a uniformly integrable martingale and

E(M_t)² = E⟨M, M⟩_t, E(M_∞)² = E⟨M, M⟩_∞.

In particular, if M ∈ H² with M_0 = 0, then

(‖M‖_{H²})² = E⟨M, M⟩_∞. (5.2)


Chapter 6

Stochastic Integration

We aim to define the stochastic integral ∫_0^t f_s dX_s, where X_s = X_0 + M_s + A_s is a right continuous semi-martingale (Definition 5.2) and (f_t) is a left continuous measurable process. In this chapter we assume that (M_s) is continuous, in which case the bracket process is also continuous.

We take the index set to be [0, ∞) or [0, T] for a real number T. For simplicity, in most cases we deal only with [0, ∞). The [0, T] case can either be dealt with analogously or follows by localization with 1_{[0,T]}. This matters because Brownian motion is not L² bounded on R⁺, though it is L² bounded on any bounded time interval.

6.1 Introduction (Lectures 18-19)

Let E denote the collection of elementary processes:

E = { (K_t) : K_t(ω) = K_{−1}(ω)1_{{0}}(t) + ∑_{i=0}^{∞} K_i(ω)1_{(t_i,t_{i+1}]}(t) },

where 0 = t_0 < · · · < t_n < . . . is any sequence of positive numbers increasing to infinity, K_{−1} ∈ F_0, K_i ∈ F_{t_i}, and the |K_i| are bounded. Let K ∈ E and let (M_s) be a stochastic process. We previously defined the elementary integral

(I_M)_t(K) := (K · M)_t = ∑_{i=0}^{∞} K_i (M_{t_{i+1}∧t} − M_{t_i∧t}).

We seek a class of stochastic processes (f_s) with the property that there exists a sequence of processes Hⁿ ∈ E with Hⁿ converging to f (in some sense), and ∫_0^t Hⁿ(s) dX_s converging (in some sense) to a limit; the limit will be a candidate for the stochastic integral ∫_0^t f_s dX_s. This is a continuity property. The standard convergence used for stochastic processes is uniform convergence on compacts in probability (u.c.p.): a sequence of stochastic processes fⁿ is said to converge in the u.c.p. topology to a stochastic process f if for all t ≥ 0,

lim_{n→∞} sup_{0≤s≤t} |fⁿ(s) − f(s)| = 0

in probability.

Let α and β be numbers satisfying α + β ≤ 1. It is not possible to find a universal bilinear operation from the space of 'C^α regular functions' and the space of 'C^β regular functions', extending the Riemann-Stieltjes integral, such that the bilinear operation is continuous. Hence a stochastic integral cannot, in general, be interpreted as a pathwise integral.

Question

(1) For which class of stochastic processes (X_t) is the map K ∈ E ↦ ∫_0^t K_s dX_s continuous from E to the set D of cadlag adapted processes?

(2) What is the closure of E in D with respect to the u.c.p. topology? And what are suitable step processes which can be used in the continuity argument?

The search for an answer to this question led to the coinage of the terminology 'semi-martingale'. If X_t = X_0 + M_t + A_t, then it is a 'semi-martingale'. From now on we will be concerned only with continuous special semi-martingales, and we omit the word special. The second question leads to the simple predictable stochastic processes,

S = { (K_t) : K_t(ω) = K_{−1}(ω)1_{{0}}(t) + ∑_{i=1}^{n} K_i(ω)1_{(T_i,T_{i+1}]}(t) },

where 0 = T_0 < T_1 < . . . is an increasing sequence of stopping times converging to infinity, with K_i finite valued, K_i ∈ F_{T_i} and K_{−1} ∈ F_0. This set is dense, in the u.c.p. topology, in L, the set of left continuous adapted stochastic processes with right limits.

We will be dealing with the simpler case where the integrators are sample continuous. To begin with, we will use a Hilbert space construction and a suitable L² norm for the convergence.


6.1.1 Integration w.r.t. Stochastic Processes of Finite Variation

If (A_s, s ≥ 0) is a right continuous function of finite variation with A_0 = 0, there is an associated Borel measure µ_A on [0, ∞) determined by µ_A((c, d]) = A(d) − A(c). Note that µ_A({d}) = A(d) − A(d−); if A is continuous the measure does not charge singletons.

Write

A_s = (A_TV(s) + A_s)/2 − (A_TV(s) − A_s)/2,

where A_TV is the total variation process. Recall that a signed measure µ decomposes as a difference of two positive measures: µ = µ⁺ − µ⁻. For the Radon measure µ_A, |µ_A| is the measure determined by A_TV. Let f : R⁺ → R be integrable, with integral denoted by ∫_{[0,∞)} f_s dA_s, and set ∫_0^t f_s dA_s = ∫_{[0,∞)} 1_{(0,t]}(s) f_s dµ_A(s). If f : R⁺ → R is left continuous,

∫_0^t f_s dA_s = lim_{|∆_n|→0} ∑_j f(t^n_j)( A(t^n_{j+1}) − A(t^n_j) ).

We may allow f_s and A_s to be random; the above procedure then holds for each ω for which A_s(ω) is of finite variation. There is however the added complication of measurability. We assume that f is progressively measurable (by progressive measurability we include the assumption that f : R⁺ × Ω → R is measurable with respect to B(R⁺) ⊗ F_∞) and that A is a right continuous finite variation process (recall, in particular, that A_s is adapted). Then ∫_0^t f_s(ω) dA_s(ω) is a right continuous process of finite variation; it is furthermore continuous if (A_s) is sample continuous.

Recall that ⟨M, M⟩ corresponds to a positive measure and ⟨M, N⟩ to a signed measure, written as µ⁺ − µ⁻ where µ⁺, µ⁻ are positive measures. By |⟨M, N⟩| we mean the measure corresponding to µ⁺ + µ⁻.

Lemma 6.1 For s ≤ t define ⟨M, N⟩^s_t := ⟨M, N⟩_t − ⟨M, N⟩_s. Then

⟨M, N⟩^s_t ≤ √(⟨M, M⟩_t − ⟨M, M⟩_s) √(⟨N, N⟩_t − ⟨N, N⟩_s).

Proof For any a, ⟨M − aN, M − aN⟩_t ≥ 0, which means ⟨M, M⟩_t + a²⟨N, N⟩_t ≥ 2a⟨M, N⟩_t. Take a = √(⟨M, M⟩_t/⟨N, N⟩_t) to see that ⟨M, N⟩_t ≤ √(⟨M, M⟩_t ⟨N, N⟩_t). A similar proof shows that for s < t,

⟨M, N⟩_t − ⟨M, N⟩_s ≤ √(⟨M, M⟩_t − ⟨M, M⟩_s) √(⟨N, N⟩_t − ⟨N, N⟩_s).


Let H_s, K_s be measurable processes, by which we mean Borel measurable functions from (R⁺ × Ω, B(R⁺) ⊗ F_∞) to (R, B(R)). Approximating them by elementary functions leads to the following theorem.

Theorem 6.2 Let (M_t) and (N_t) be two continuous local martingales, and let (H_t) and (K_t) be measurable processes. Then for t ≤ ∞,

∫_0^t |H_s||K_s| d|⟨M, N⟩|_s ≤ √( ∫_0^t |H_s|² d⟨M, M⟩_s ) √( ∫_0^t |K_s|² d⟨N, N⟩_s ), a.s.

The inequality states in particular that the left hand side is finite if the right hand side is. If furthermore H ∈ L¹(d⟨M, M⟩_s) and K ∈ L¹(d⟨N, N⟩_s),

| ∫_0^t H_s K_s d⟨M, N⟩_s | ≤ √( ∫_0^t |H_s|² d⟨M, M⟩_s ) √( ∫_0^t |K_s|² d⟨N, N⟩_s ), a.s.

Proof This is a schematic proof. Let (H_s) and (K_s) be elementary processes from E. Let 0 = t_1 < · · · < t_{N+1} be a partition such that on each sub-interval both H_s(ω) and K_s(ω) are constant in s. We write, for H_0, K_0 ∈ F_0 and H_i, K_i ∈ F_{t_i},

H_t(ω) = H_0(ω)1_{{0}}(t) + ∑_{i=1}^{N} H_i(ω)1_{(t_i,t_{i+1}]}(t),
K_t(ω) = K_0(ω)1_{{0}}(t) + ∑_{i=1}^{N} K_i(ω)1_{(t_i,t_{i+1}]}(t).

Then

| ∫_0^t H_s(ω)K_s(ω) d⟨M, N⟩_s(ω) | = | ∑_i H_i(ω)K_i(ω) ⟨M, N⟩^{t_i}_{t_{i+1}}(ω) | ≤ ∑_i |H_i(ω)||K_i(ω)| |⟨M, N⟩^{t_i}_{t_{i+1}}(ω)|
≤ √( ∑_i |H_i(ω)|² ⟨M, M⟩^{t_i}_{t_{i+1}}(ω) ) √( ∑_i |K_i(ω)|² ⟨N, N⟩^{t_i}_{t_{i+1}}(ω) )
= ( ∫_0^∞ (H_s(ω))² d⟨M, M⟩_s(ω) )^{1/2} ( ∫_0^∞ (K_s(ω))² d⟨N, N⟩_s(ω) )^{1/2}.

Take the appropriate limit to see that the second inequality holds.


Let ε_s = d⟨M, N⟩_s/d|⟨M, N⟩|_s (a density of modulus one) and set H̃_s = H_s sign(H_s K_s) ε_s, so that |H̃_s| = |H_s| and

∫_0^t |H_s||K_s| d|⟨M, N⟩|_s = ∫_0^t H̃_s K_s d⟨M, N⟩_s.

Applying the second inequality to H̃ and K, we see that the first inequality holds for all bounded measurable processes (H_s) and (K_s). If they are not bounded, we take a sequence of cut-off functions for H_s and K_s to see that the first inequality always holds; both sides may, however, be infinite.

Applying Hölder's inequality to the above yields the following.

Corollary 6.3 (Kunita-Watanabe Inequality) For t ≤ ∞ and p, q > 1 with 1/p + 1/q = 1,

E ∫_0^t |H_s||K_s| d|⟨M, N⟩|_s ≤ ( E( ∫_0^t |H_s|² d⟨M, M⟩_s )^{p/2} )^{1/p} ( E( ∫_0^t |K_s|² d⟨N, N⟩_s )^{q/2} )^{1/q}.

6.2 Space of Integrands (Lecture 18-19)

Definition 6.1 Let E be a metric space. A stochastic process X : I × Ω → E is progressively measurable if

(1) X : I × Ω → E is measurable;

(2) for each t > 0, X : [0, t] × Ω → E is a measurable map with respect to the product σ-algebra B([0, t]) ⊗ F_t.

Proposition 6.4 Left or right continuous adapted processes are progressively measurable.

Proof Exercise.

Let H2 be the space of L2 bounded continuous martingales.

Definition 6.2 For M ∈ H², define L²(M) to be the space of progressively measurable stochastic processes (f_t) such that

‖f‖²_{L²(M)} := E ∫_0^∞ (f_s)² d⟨M, M⟩_s < ∞;

that is,

L²(M) = { (f_t) : f is progressively measurable, E ∫_0^∞ (f_s)² d⟨M, M⟩_s < ∞ }.


For H, K ∈ L²(M), we define

⟨H, K⟩_{L²(M)} = E ∫_0^∞ H_s K_s d⟨M⟩_s.

Let P_M be the measure on R⁺ × Ω determined by, for Γ ∈ B(R⁺) ⊗ F_∞,

P_M(Γ) = E ∫_0^∞ 1_Γ d⟨M, M⟩_s.

Then L²(M) = L²(R⁺ × Ω, B(R⁺) ⊗ F_∞, P_M), and we see that L²(M) is a Hilbert space. By a standard estimate, cf. the Kunita-Watanabe inequality (Corollary 6.3),

|⟨H, K⟩_{L²(M)}| ≤ ‖H‖_{L²(M)} ‖K‖_{L²(M)}.

Let N be an integer and 0 = t_1 < · · · < t_{N+1}. Let

K_t(ω) = K_0(ω)1_{{0}}(t) + ∑_{i=1}^{N} K_i(ω)1_{(t_i,t_{i+1}]}(t),

where K_0 ∈ F_0, K_i ∈ F_{t_i}. Let M ∈ H²; then

E ∫_0^∞ K²_s d⟨M, M⟩_s ≤ |K|²_∞ E( ⟨M, M⟩_{t_{N+1}} )

and E ⊂ ∩_{M∈H²} L²(M). In other words, every elementary process belongs to L²(M) for any M ∈ H².

Proposition 6.5 The set of elementary processes is dense in L²(M).

Proof We prove the case when f ∈ L²(M) is left continuous. First assume f is bounded and let

g_n(s, ω) = f_0(ω)1_{{0}}(s) + ∑_{j≥1} f_{j/2^n}(ω)1_{(j/2^n, (j+1)/2^n]}(s).

Note that f_{j/2^n} is F_{j/2^n}-measurable. Since f is left continuous and bounded, g_n → f pointwise and |g_n|_∞ ≤ |f|_∞, so by the dominated convergence theorem g_n → f in L²(M). If f is not bounded, let f^n_s = f_s 1_{|f_s|≤n}. Then

‖f^n − f‖²_{L²(M)} ≤ E ∫_0^∞ f²_s(ω) 1_{{(s,ω):|f(s,ω)|≥n}} d⟨M, M⟩_s(ω) → 0 as n → ∞.


Note to the proof. If f is only assumed to be progressively measurable, the proof is as follows. Let f ∈ L²(M) be orthogonal to E. Then for any s < t and bounded K ∈ F_s,

0 = ⟨f, K1_{(s,t]}⟩_{L²(M)} = E ∫_0^∞ f_r K 1_{(s,t]}(r) d⟨M, M⟩_r = E( K ∫_s^t f_r d⟨M, M⟩_r ).

In particular, for any A ∈ F_s,

E( 1_A ∫_0^t f_r d⟨M, M⟩_r ) = E( 1_A ∫_0^s f_r d⟨M, M⟩_r ).

Hence ( ∫_0^t f_r d⟨M, M⟩_r, t ≥ 0 ), which is integrable by the Kunita-Watanabe inequality (Corollary 6.3), is a continuous martingale and is also a finite variation process; by Theorem 5.1 it vanishes, hence f = 0 in L²(M).

We also use the notation K · M = ∫_0^· K_s dM_s.

Proposition 6.6 Let $M\in\mathcal{H}^2$ and $K\in\mathcal{E}$. Then $\big(\int_0^t K_s\,dM_s,\ t\ge 0\big)$ is an $L^2$-bounded continuous martingale.

Proof Let $K_t = K_0 1_{\{0\}}(t) + \sum_i K_i 1_{(t_i,t_{i+1}]}(t)$; then
\[ \int_0^t K_r\,dM_r = \sum_i K_i\big(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}\big). \]
Consider the $i$-th interval $[t_i,t_{i+1}]$. If $t_i\ge t$, then $M_{t_{i+1}\wedge t} - M_{t_i\wedge t} = 0$. If $t\in(t_j,t_{j+1})$, then $M_{t_{j+1}\wedge t} - M_{t_j\wedge t} = M_t - M_{t_j}$. We may therefore assume that the summation runs from $1$ to $N$ with $t_{N+1} = t$, so that
\[ \int_0^t K_r\,dM_r = \sum_{i=1}^N K_i\big(M_{t_{i+1}} - M_{t_i}\big). \]
Let $s<t$; we compute
\[ E\Big(\sum_{i=1}^N K_i\big(M_{t_{i+1}} - M_{t_i}\big)\,\Big|\,\mathcal{F}_s\Big). \]

Let us analyze the $i$-th interval $I_i = (t_i,t_{i+1}]$. If $t_{i+1}\le s$, then
\[ E\big(K_i(M_{t_{i+1}} - M_{t_i})\,\big|\,\mathcal{F}_s\big) = K_i(M_{t_{i+1}} - M_{t_i}) = K_i(M_{t_{i+1}\wedge s} - M_{t_i\wedge s}). \]
If $s\le t_i$, then by the tower property,
\[ E\big(K_i(M_{t_{i+1}} - M_{t_i})\,\big|\,\mathcal{F}_s\big) = E\Big(K_i\,E\big(M_{t_{i+1}} - M_{t_i}\,\big|\,\mathcal{F}_{t_i}\big)\,\Big|\,\mathcal{F}_s\Big) = 0 = K_i(M_{t_{i+1}\wedge s} - M_{t_i\wedge s}). \]
If $t_i < s < t_{i+1}$,
\[ E\big(K_i(M_{t_{i+1}} - M_{t_i})\,\big|\,\mathcal{F}_s\big) = K_i\,E\big(M_{t_{i+1}}\,\big|\,\mathcal{F}_s\big) - K_iM_{t_i} = K_i(M_{t_{i+1}\wedge s} - M_{t_i\wedge s}). \]
Summing up the three cases, we obtain
\[ E\big((K\cdot M)_t\,\big|\,\mathcal{F}_s\big) = \sum_i K_i(M_{t_{i+1}\wedge s} - M_{t_i\wedge s}) = (K\cdot M)_s, \]
and $K\cdot M$ is a martingale.

Definition 6.3 Denote by $\mathcal{H}^2_0$ the subspace of $\mathcal{H}^2$ whose elements $M$ satisfy $M_0 = 0$. Then $\mathcal{H}^2_0$ is a closed subspace of $\mathcal{H}^2$, and $\sqrt{E\langle M,M\rangle_\infty} = \sqrt{E(M_\infty)^2}$ is a norm on $\mathcal{H}^2_0$. See Theorems 5.12 and 5.13.

Proposition 6.7 Let $M\in\mathcal{H}^2$ and $K\in\mathcal{E}$. Then the elementary integral $\big(\int_0^t K_s\,dM_s,\ t\ge 0\big)$ belongs to $\mathcal{H}^2$. The map
\[ K\in\mathcal{E} \;\mapsto\; K\cdot M\in\mathcal{H}^2_0 \]
is linear, and
\[ \Big\|\int_0^\cdot K_s\,dM_s\Big\|_{\mathcal{H}^2} = \|K\|_{L^2(M)} \qquad \text{(Ito isometry).} \]

Proof Linearity is clear. We prove the Ito isometry; the cross terms vanish by the martingale property, so
\begin{align*}
\big(\|K\cdot M\|_{\mathcal{H}^2}\big)^2 &= E\Big(\sum_i K_i\big(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}\big)\Big)^2
= \sum_i E\Big(K_i^2\big((M_{t_{i+1}\wedge t})^2 - (M_{t_i\wedge t})^2\big)\Big)\\
&= \sum_i E\Big(K_i^2\,E\big((M_{t_{i+1}\wedge t})^2 - (M_{t_i\wedge t})^2\,\big|\,\mathcal{F}_{t_i}\big)\Big)
= \sum_i E\Big(K_i^2\big(\langle M,M\rangle_{t_{i+1}\wedge t} - \langle M,M\rangle_{t_i\wedge t}\big)\Big).
\end{align*}
We use the fact that $M_t^2 - \langle M,M\rangle_t$ is a martingale, cf. Theorem 5.13. The isometry follows from this and the computation
\[ \big(\|K\|_{L^2(M)}\big)^2 = E\int_0^\infty K_s^2\,d\langle M,M\rangle_s = E\sum_i K_i^2\big(\langle M,M\rangle_{t_{i+1}\wedge t} - \langle M,M\rangle_{t_i\wedge t}\big). \]

Definition 6.4 (and Theorem) Let $f\in L^2(M)$ and let $(f_n)\subset\mathcal{E}$ be a sequence converging to $f$ in $L^2(M)$. Then $\int_0^t f_n(s)\,dM_s$ converges; we define the limit to be $\int_0^t f_s\,dM_s$. This limit is independent of the choice of the approximating sequence.

Proof Since $f_n\to f$, $\big(\int_0^\cdot f_n(s)\,dM_s\big)$ is a Cauchy sequence in $\mathcal{H}^2_0$: by the Ito isometry,
\[ \Big\|\int_0^\cdot f_n(s)\,dM_s - \int_0^\cdot f_m(s)\,dM_s\Big\|_{\mathcal{H}^2}^2 = \Big\|\int_0^\cdot\big(f_n(s)-f_m(s)\big)\,dM_s\Big\|_{\mathcal{H}^2}^2 = E\int_0^\infty\big(f_n(s)-f_m(s)\big)^2\,d\langle M,M\rangle_s \to 0. \]
The uniqueness follows by the standard argument. Suppose that $g_n\to f$ in $L^2(M)$. By Proposition 6.7,
\[ \Big\|\int_0^\cdot f_n(s)\,dM_s - \int_0^\cdot g_n(s)\,dM_s\Big\|_{\mathcal{H}^2} = \Big\|\int_0^\cdot\big(f_n(s)-g_n(s)\big)\,dM_s\Big\|_{\mathcal{H}^2} \stackrel{\text{Prop.\ 6.7}}{=} \|f_n-g_n\|_{L^2(M)} \le \|f_n-f\|_{L^2(M)} + \|g_n-f\|_{L^2(M)} \to 0. \]

Exercise 6.1 Let $M\in\mathcal{H}^2$ and $K\in\mathcal{E}$. Prove that the elementary integral $I_t = \int_0^t K_s\,dM_s$ satisfies, for any $N\in\mathcal{H}^2$,
\begin{equation} \langle I,N\rangle_t = \int_0^t K_s\,d\langle M,N\rangle_s, \qquad \forall t\ge 0. \tag{6.1} \end{equation}

6.3 Lecture 20. Characterization of Stochastic Integrals

All separable infinite dimensional Hilbert spaces are isomorphic (take an orthonormal basis in each space to construct the isometry); hence $L^2(M)\cong\mathcal{H}^2_0$. Below we construct an explicit isometric map
\[ K\mapsto I(K,M), \]
which we will call the Ito integral. It agrees with the elementary integral if $K\in\mathcal{E}$ and is denoted by $\int_0^t K_s\,dM_s$.

Note that the value $\langle M,N\rangle_t$ is unchanged under the transformation $M\mapsto M+c$, where $c$ is a constant.

Theorem 6.8 Given $M\in\mathcal{H}^2$ and $K\in L^2(M)$, there is a unique process $I\equiv I(K,M)\in\mathcal{H}^2$, vanishing at $0$, such that
\begin{equation} \langle I,N\rangle_t = \int_0^t K_s\,d\langle M,N\rangle_s, \qquad \forall N\in\mathcal{H}^2,\ \forall t\ge 0. \tag{6.2} \end{equation}
Furthermore, the map $K\in L^2(M)\mapsto I(K,M)\in\mathcal{H}^2_0$ is a linear isometry.

The identity (6.2) is equivalent to
\begin{equation} \langle I,N\rangle_\infty = \int_0^\infty K_s\,d\langle M,N\rangle_s, \qquad \forall N\in\mathcal{H}^2. \tag{6.3} \end{equation}

Proof Since $\langle M,N\rangle = \langle M,N-N_0\rangle$, it is sufficient to prove that (6.2) holds for all $N\in\mathcal{H}^2_0$.

(a) Uniqueness. Suppose that there are two martingales $I_1,I_2\in\mathcal{H}^2_0$ such that
\[ \langle I_1,N\rangle_t = \int_0^t K_s\,d\langle M,N\rangle_s = \langle I_2,N\rangle_t. \]
Then $\langle I_1-I_2,N\rangle_t = 0$ for any $N\in\mathcal{H}^2_0$. In particular $E\langle I_1-I_2,I_1-I_2\rangle_\infty = 0$, which implies that $I_1 = I_2$.

(b) Existence.

(b1) We define a real valued linear map
\[ U:\mathcal{H}^2_0\to\mathbf{R}, \qquad U(N) = E\int_0^\infty K_s\,d\langle M,N\rangle_s. \]

(b2) We apply the Kunita-Watanabe inequality:
\begin{equation} |U(N)| \le E\Big|\int_0^\infty K_s\,d\langle M,N\rangle_s\Big| \le \Big(E\int_0^\infty K_s^2\,d\langle M,M\rangle_s\Big)^{\frac12}\sqrt{E\langle N,N\rangle_\infty} \le \|K\|_{L^2(M)}\|N\|_{\mathcal{H}^2_0} < \infty. \tag{6.4} \end{equation}
This proves that $U$ is a bounded linear functional. By the Riesz Representation Theorem, there is a unique element $I$ of $\mathcal{H}^2_0$ such that
\[ U(N) = \langle I,N\rangle_{\mathcal{H}^2_0} = E\langle I(K,M),N\rangle_\infty. \]
This can be rewritten as
\begin{equation} E\Big(\int_0^\infty K_s\,d\langle M,N\rangle_s\Big) = E\big(\langle I(K,M),N\rangle_\infty\big). \tag{6.5} \end{equation}

(b3) Define
\[ X_t := I_tN_t - \int_0^t K_s\,d\langle M,N\rangle_s. \]
We prove that $(X_t)$ is a martingale with initial value $0$; by the defining property of the bracket process, this gives $\langle I,N\rangle_t = \int_0^t K_s\,d\langle M,N\rangle_s$, both sides vanishing at $0$.
Let $\tau$ be any bounded stopping time. We prove that $E(X_\tau) = 0$. By Theorem 5.13, $I_tN_t - \langle I,N\rangle_t$ is a uniformly integrable martingale, and
\[ E I_\tau N_\tau = E\langle I,N\rangle_\tau = E\langle I,N\rangle_{\tau\wedge\infty} = E\langle I,N^\tau\rangle_\infty \stackrel{(6.5)}{=} E\int_0^\infty K_s\,d\langle M,N^\tau\rangle_s = E\int_0^\infty 1_{\{s<\tau\}}K_s\,d\langle M,N\rangle_s = E\int_0^\tau K_s\,d\langle M,N\rangle_s. \]
This shows that $EX_\tau = EX_0 = 0$, and $(X_t)$ is a martingale.

(c) We show that the map $K\mapsto I(K,M)$ is a linear isometry. The linearity is clear: it follows from the linearity of the bracket $\langle M,N\rangle$ and the linearity of $K\mapsto\int_0^t K_s\,d\langle M,N\rangle_s$. Take $K,K'\in L^2(M)$. Then
\[ \langle I(K,M),I(K',M)\rangle_{\mathcal{H}^2_0} = E\Big(\int_0^\infty K_s\,d\langle M,I(K',M)\rangle_s\Big) = E\int_0^\infty K_s\,d\Big(\int_0^s K'_r\,d\langle M,M\rangle_r\Big) = E\int_0^\infty K_sK'_s\,d\langle M,M\rangle_s = \langle K,K'\rangle_{L^2(M)}. \]
The isometry follows.

Proposition 6.9 Let $K\in\mathcal{E}$ and $M\in\mathcal{H}^2$. The integral $I(K,M)$ defined in Theorem 6.8 agrees with the elementary integral. Consequently, for all $K\in L^2(M)$,
\[ I(K,M)_t = \int_0^t K_s\,dM_s, \]
where the right hand side is defined by Definition 6.4.

Proof Exercise.

6.4 Integration w.r.t. Semi-martingales (Lecture 21)

The identity (6.2) in Theorem 6.8 leads easily to the following.

Proposition 6.10 Let $H\in L^2(M)$ and $K\in L^2(N)$, where $M,N\in\mathcal{H}^2$. Then
\[ \Big\langle \int_0^\cdot H_s\,dM_s,\ \int_0^\cdot K_s\,dN_s \Big\rangle_t = \int_0^t H_sK_s\,d\langle M,N\rangle_s. \]

Proposition 6.11 Let $\tau$ be a stopping time. Then for $(M_t)\in\mathcal{H}^2$ and $(K_t)\in L^2(M)$,
\[ \int_0^{\tau\wedge t} K_s\,dM_s = \int_0^t K_s\,dM^\tau_s = \int_0^t K_s 1_{[0,\tau]}(s)\,dM_s. \]

Proof Take $N\in\mathcal{H}^2$. Then for any $t\in[0,\infty]$,
\[ \Big\langle\int_0^{\tau\wedge\cdot} K_s\,dM_s,\ N\Big\rangle_t = \Big\langle\int_0^\cdot K_s\,dM_s,\ N^\tau\Big\rangle_t = \int_0^t K_s\,d\langle M,N^\tau\rangle_s = \int_0^t K_s\,d\langle M^\tau,N\rangle_s = \Big\langle\int_0^\cdot K_s\,dM^\tau_s,\ N\Big\rangle_t. \]
The first required equality follows. We have used the property of martingale brackets $\langle M,N^\tau\rangle = \langle M^\tau,N\rangle$. For the second equality note that
\[ \Big\langle\int_0^\cdot K_s 1_{\{s\le\tau\}}\,dM_s,\ N\Big\rangle_t = \int_0^t K_s 1_{\{s\le\tau\}}\,d\langle M,N\rangle_s. \]
By properties of Lebesgue-Stieltjes integrals,
\[ \int_0^t K_s 1_{\{s\le\tau\}}\,d\langle M,N\rangle_s = \int_0^t K_s\,d\langle M,N\rangle^\tau_s = \int_0^t K_s\,d\langle M^\tau,N\rangle_s = \Big\langle\int_0^\cdot K_s\,dM^\tau_s,\ N\Big\rangle_t. \]
The second required equality follows.

Corollary 6.12 Let $S\le T$ be stopping times and let $\tau$ be a stopping time. On $\{\tau\le S\wedge T\}$,
\[ \int_0^\tau K_s\,dM^S_s = \int_0^\tau K_s\,dM^T_s. \]
In particular, if $S\le T$, then on $\{t\le S\}$,
\[ \int_0^t K_s\,dM^S_s = \int_0^t K_s\,dM^T_s. \]

Proof Just note that if $S$ is a stopping time, then for all $t>0$,
\[ \int_0^t K_s\,dM^S_s = \int_0^{t\wedge S} K_s\,dM_s. \]
Then
\[ 1_{\{\tau\le S\wedge T\}}\int_0^\tau K_s\,dM^S_s = 1_{\{\tau\le S\wedge T\}}\int_0^{\tau\wedge S} K_s\,dM_s = 1_{\{\tau\le S\wedge T\}}\int_0^\tau K_s\,dM_s. \]
Similarly,
\[ 1_{\{\tau\le S\wedge T\}}\int_0^\tau K_s\,dM^T_s = 1_{\{\tau\le S\wedge T\}}\int_0^\tau K_s\,dM_s. \]
Thus
\[ 1_{\{\tau\le S\wedge T\}}\int_0^\tau K_s\,dM^S_s = 1_{\{\tau\le S\wedge T\}}\int_0^\tau K_s\,dM^T_s. \]

Definition 6.5 Let $M$ be a continuous local martingale. Let $L^2_{\mathrm{loc}}(M)$ be the space of progressively measurable stochastic processes $(K_t)$ for which there is a sequence of stopping times $T_n$ increasing to infinity such that
\[ E\int_0^{T_n} K_s^2\,d\langle M,M\rangle_s < \infty. \]
Equivalently, $L^2_{\mathrm{loc}}(M)$ consists of all progressively measurable $K$ such that
\[ \int_0^t K_s^2\,d\langle M,M\rangle_s < \infty \quad \text{a.s. for every } t. \]
Recall the notation
\[ (H\cdot\langle M,N\rangle)_t = \int_0^t H_s\,d\langle M,N\rangle_s. \]

Proposition 6.13 Let $M$ be a continuous local martingale with $M_0 = 0$. If $H\in L^2_{\mathrm{loc}}(M)$, there exists a unique continuous local martingale $I(H,M)$ with $I(H,M)_0 = 0$ and
\[ \langle I(H,M),N\rangle_t = (H\cdot\langle M,N\rangle)_t, \qquad \forall t\ge 0, \]
for all continuous local martingales $(N_t)$.

Proof Step 1: Uniqueness. Let $I_1,I_2$ be continuous local martingales vanishing at $0$ and satisfying
\[ \langle I_1,N\rangle_t = (H\cdot\langle M,N\rangle)_t = \langle I_2,N\rangle_t, \qquad \forall t\ge 0. \]
Let $T_n$ be a sequence of stopping times that reduces both $I_1$ and $I_2$ and makes both processes bounded. Then for all $N\in\mathcal{H}^2_0$ and $j=1,2$,
\[ \langle I_j,N\rangle^{T_n}_t = \int_0^{t\wedge T_n} H_s\,d\langle M,N\rangle_s = \int_0^t H_s\,d\langle M^{T_n},N\rangle_s, \qquad \langle I_j^{T_n},N\rangle = \langle I_j,N\rangle^{T_n}. \]
It follows that $\langle I_1^{T_n},N\rangle = \langle I_2^{T_n},N\rangle$ for all $N\in\mathcal{H}^2_0$. Since $I_1^{T_n},I_2^{T_n}$ are $L^2$-bounded martingales, $I_1^{T_n} = I_2^{T_n}$. Since $T_n\to\infty$, $I_1 = I_2$ a.s.

Step 2: Existence. Let
\[ T_n = \inf\Big\{t\ge 0:\ \int_0^t (1+H_s^2)\,d\langle M,M\rangle_s \ge n\Big\}. \]
Then $(T_n)$ is an increasing sequence of stopping times increasing to infinity such that $M^{T_n}\in\mathcal{H}^2$ and $\|H1_{\{s\le T_n\}}\|_{L^2(M^{T_n})}$ is bounded by $n$. For $n<m$, on $\{t<T_n\}$,
\[ \int_0^t H_s\,dM^{T_n}_s = \int_0^t H_s\,dM^{T_m}_s. \]
Define
\[ I(H,M)(t) \equiv \int_0^t H_s\,dM_s(\omega) = \int_0^t 1_{\{s\le T_n\}}H_s\,dM^{T_n}_s(\omega), \qquad \omega\in\{t<T_n\}. \]
Since
\[ I(H,M)^{T_n}_t = \int_0^t 1_{\{s\le T_n\}}H_s\,dM^{T_n}_s \]
are martingales, $I(H,M)$ is a local martingale.
Let $N$ be a continuous local martingale with reducing stopping times $S_n$ such that $N^{S_n}$ is bounded. Then
\[ \langle I(H,M),N\rangle^{T_n\wedge S_n}_t = \big\langle I(H,M)^{T_n},N^{S_n}\big\rangle_t = \Big\langle\int_0^\cdot 1_{\{s\le T_n\}}H_s\,dM^{T_n}_s,\ N^{S_n}\Big\rangle_t = \int_0^t 1_{\{s\le T_n\}}H_s\,d\langle M^{T_n},N^{S_n}\rangle_s = \int_0^{t\wedge T_n\wedge S_n} H_s\,d\langle M,N\rangle_s. \]
Taking $n\to\infty$, we see that $\langle I(H,M),N\rangle_t = \int_0^t H_s\,d\langle M,N\rangle_s$.

Definition 6.6 A progressively measurable process $f$ is locally bounded if there is an increasing sequence of stopping times $T_n$, with $\lim_{n\to\infty}T_n = \infty$, and constants $C_n$ such that $|f^{T_n}|\le C_n$ for all $n$.

Both continuous and convex functions are locally bounded. Any locally bounded process belongs to $L^2_{\mathrm{loc}}(M)$ for any continuous local martingale $M$.

6.5 Stochastic Integration w.r.t. Semi-Martingales (Lecture 22)

Definition 6.7 If $X_t = M_t + A_t$ is a continuous semi-martingale and $f$ is a progressively measurable locally bounded stochastic process, we define
\[ \int_0^t f_s\,dX_s = \int_0^t f_s\,dM_s + \int_0^t f_s\,dA_s. \]

Proposition 6.14 Let $X,Y$ be continuous semi-martingales. Let $f,g,K$ be locally bounded and progressively measurable. Let $a,b\in\mathbf{R}$.

1. $\int_0^t (af_s+bg_s)\,dX_s = a\int_0^t f_s\,dX_s + b\int_0^t g_s\,dX_s$.

2. $\int_0^t f_s\,d(aX_s+bY_s) = a\int_0^t f_s\,dX_s + b\int_0^t f_s\,dY_s$.

3. $\int_0^t f_s\,d\big(\int_0^s g_r\,dX_r\big) = \int_0^t f_sg_s\,dX_s$.

4. For any stopping time $\tau$, $\int_0^\tau K_s\,dX_s = \int_0^\infty 1_{\{s\le\tau\}}K_s\,dX_s = \int_0^\infty K_s\,dX^\tau_s$.

5. If $X$ is of bounded total variation on $[0,t]$, so is the integral $\int_0^\cdot K_s\,dX_s$; and if $X$ is a local martingale, so is $\int K_s\,dX_s$. In particular, for a semi-martingale $X_t$ this gives the Doob-Meyer decomposition of $\int_0^\cdot K_s\,dX_s$, cf. Definition 5.2.

Definition 6.8 Let $X,Y$ be continuous semi-martingales. The Stratonovich integral is defined as
\[ \int_0^t Y_s\circ dX_s = \int_0^t Y_s\,dX_s + \frac12\langle X,Y\rangle_t. \]

Also note that for continuous processes, the Riemann sums corresponding to a sequence of partitions whose modulus goes to zero converge to the stochastic integral in probability. This convergence does not help with computation: although there are subsequences that converge a.s., we do not know which subsequence of the partitions would work, and it is likely to differ for different integrands and different times.

Proposition 6.15 If $(K_t)$ is left continuous and $\Delta_n: 0 = t^n_0 < t^n_1 < \dots < t^n_{N_n} = t$ is a sequence of partitions of $[0,t]$ whose modulus goes to zero, then
\[ \int_0^t K_s\,dX_s = \lim_{n\to\infty}\sum_{i=0}^{N_n-1} K_{t^n_i}\big(X_{t^n_{i+1}} - X_{t^n_i}\big), \]
the sum converging in probability.

The proof is a consequence of Proposition 6.16 below.
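Proposition 6.15 can be checked numerically. The following Python sketch (an illustration, not part of the notes' formal development) takes $X = B$ a simulated Brownian path on $[0,1]$: the left-point sums approximate the Ito integral $\int_0^1 B_s\,dB_s = (B_1^2-1)/2$, while midpoint sums reproduce the Stratonovich value $B_1^2/2$ of Definition 6.8.

```python
import math
import random

random.seed(0)

def riemann_sums(n_steps):
    """Simulate one Brownian path on [0, 1] and return the left-point (Ito)
    and midpoint (Stratonovich) Riemann sums for the integral of B dB."""
    dt = 1.0 / n_steps
    b = [0.0]
    for _ in range(n_steps):
        b.append(b[-1] + random.gauss(0.0, math.sqrt(dt)))
    left = sum(b[i] * (b[i + 1] - b[i]) for i in range(n_steps))
    mid = sum(0.5 * (b[i] + b[i + 1]) * (b[i + 1] - b[i]) for i in range(n_steps))
    return left, mid, b[-1]

left, mid, b1 = riemann_sums(100_000)
# Ito value (B_1^2 - 1)/2 versus Stratonovich value B_1^2 / 2:
print(left, (b1 ** 2 - 1.0) / 2.0)
print(mid, b1 ** 2 / 2.0)
```

The midpoint sums telescope exactly to $B_1^2/2$ for every path; the left-point sums differ from them by half the sum of squared increments, which converges to half the quadratic variation, in agreement with Definition 6.8.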

6.5.1 Appendix

Proposition 6.16 (Dominated Convergence Theorem) Let $(X_t)$ be a continuous semi-martingale. Let $(K^n_t)$ be a sequence of locally bounded progressively measurable stochastic processes converging to $(K_t)$. Suppose that there is a locally bounded progressively measurable stochastic process $(F_t)$ such that $|K^n_t|\le F_t$ for every $n$ and $t\ge 0$. Then
\[ \lim_{n\to\infty}\sup_{s\le t}\Big|\int_0^s K^n_r\,dX_r - \int_0^s K_r\,dX_r\Big| = 0, \]
the convergence being in probability. That is, $\big(\int_0^s K^n_r\,dX_r,\ s\ge 0\big)$ converges u.c.p.

If $(K^n_t)\to(K_t)$ in $L^2(M)$, this follows from the Ito isometry. Otherwise a localisation procedure leads to the conclusion. See Theorem 2.12 in Revuz-Yor [24].

6.6 Ito's Formula (Lecture 22-23)

We define
\[ \int_s^t H_r\,dX_r = \int_0^t H_r\,dX_r - \int_0^s H_r\,dX_r. \]
An $\mathbf{R}^n$-valued stochastic process $X_t = (X^1_t,\dots,X^n_t)$ is respectively a martingale, local martingale, semi-martingale or a Brownian motion if each $(X^i_t)$ is such a process. If $(X_t,\ t\ge 0)$ is a continuous semi-martingale, we denote by $\langle X,X\rangle_t$ the matrix whose entries are $\langle X^i,X^j\rangle_t$.

Theorem 6.17 (Ito's Formula) Let $X_t = (X^1_t,\dots,X^n_t)$ be an $\mathbf{R}^n$-valued sample continuous semi-martingale and $f$ a $C^2$ real valued function on $\mathbf{R}^n$. Then for $s<t$,
\[ f(X_t) = f(X_s) + \sum_{i=1}^n\int_s^t \frac{\partial f}{\partial x_i}(X_r)\,dX^i_r + \frac12\sum_{i,j=1}^n\int_s^t \frac{\partial^2 f}{\partial x_i\partial x_j}(X_r)\,d\langle X^i,X^j\rangle_r. \]
In short hand,
\[ f(X_t) = f(X_s) + \int_s^t (Df)(X_r)\,dX_r + \frac12\int_s^t (D^2f)(X_r)\,d\langle X,X\rangle_r. \]
If $T$ is a stopping time, apply Ito's formula to $Y_t = X_{T\wedge t}$ to see that
\[ f(X_{T\wedge t}) = f(X_0) + \int_0^{T\wedge t} (Df)(X_r)\,dX_r + \frac12\int_0^{T\wedge t} (D^2f)(X_r)\,d\langle X,X\rangle_r. \]

Proof We give a sketch proof for $n=1$. By the localization technique, it is sufficient to prove Ito's formula for a continuous semi-martingale which takes values in $[-C,C]$ for some $C>0$.

If $f:\mathbf{R}\to\mathbf{R}$ is a $C^2$ function, let us recall Taylor's Theorem with an integral remainder term:
\begin{align*}
f(y) &= f(y_0) + f'(y_0)(y-y_0) + \int_{y_0}^y (y-z)f''(z)\,dz\\
&= f(y_0) + f'(y_0)(y-y_0) + \frac12 f''(y_0)(y-y_0)^2 + \int_{y_0}^y (y-z)\big(f''(z)-f''(y_0)\big)\,dz.
\end{align*}
Let $R(y_0,y)$, which depends on $y$, $y_0$ and $f$, denote the remainder term. Since $f''$ is uniformly continuous over the compact interval $[-C,C]$, for $y,y_0\in[-C,C]$, as $y\to y_0$,
\[ \sup_{z\in[y_0,y]}|f''(z)-f''(y_0)| \to 0, \]
with the rate of convergence depending only on the distance $|y-y_0|$, not on their individual positions. Thus
\[ |R(y_0,y)| = \Big|\int_{y_0}^y (y-z)\big(f''(z)-f''(y_0)\big)\,dz\Big| \le |y-y_0|\Big|\int_{y_0}^y |f''(z)-f''(y_0)|\,dz\Big| \le (y-y_0)^2\sup_{z\in[y_0,y]}|f''(z)-f''(y_0)|. \]
Let us take a partition $\Delta_n: s = t^n_0 < t^n_1 < \dots < t^n_{N(n)} = t$ and suppose that $(X_t)$ is a continuous real valued semi-martingale bounded by $C$. Then
\begin{align*}
f(X_t) - f(X_s) &= \sum_{i=0}^{N(n)-1}\big(f(X_{t^n_{i+1}}) - f(X_{t^n_i})\big)\\
&= \sum_{i=0}^{N(n)-1} f'(X_{t^n_i})\big(X_{t^n_{i+1}} - X_{t^n_i}\big) + \frac12\sum_{i=0}^{N(n)-1} f''(X_{t^n_i})\big(X_{t^n_{i+1}} - X_{t^n_i}\big)^2 + \sum_{i=0}^{N(n)-1} R\big(X_{t^n_i}, X_{t^n_{i+1}}\big).
\end{align*}
The first term on the right hand side converges in probability to $\int_s^t f'(X_r)\,dX_r$, see Proposition 6.15. The second term converges in probability to $\frac12\int_s^t f''(X_r)\,d\langle X,X\rangle_r$. (If $X_t$ is continuous and has finite total variation, the second term converges to $0$.)

The remainder term, denoted by $R_n$, is bounded by
\[ |R_n| \le \sum_{i=0}^{N(n)-1}\,\sup_{z\in[X_{t^n_i},\,X_{t^n_{i+1}}]}\big|f''(z) - f''(X_{t^n_i})\big|\,\big(X_{t^n_{i+1}} - X_{t^n_i}\big)^2. \]
For any $\varepsilon>0$, if the partition modulus is sufficiently small, then by the uniform continuity of $f''$ and of the sample path,
\[ |R_n| \le \varepsilon\sum_{i=0}^{N(n)-1}\big(X_{t^n_{i+1}} - X_{t^n_i}\big)^2. \]
Since $\sum_{i=0}^{N(n)-1}\big(X_{t^n_{i+1}} - X_{t^n_i}\big)^2$ converges in probability to $\langle X,X\rangle_t - \langle X,X\rangle_s$ as $\max_{0\le i\le N(n)-1}(t^n_{i+1}-t^n_i)\to 0$, this concludes the formula for real valued semi-martingales with values in a compact set.

Proposition 6.18 (The product formula) If $X_t$ and $Y_t$ are real valued semi-martingales, then
\[ X_tY_t = X_0Y_0 + \int_0^t X_s\,dY_s + \int_0^t Y_s\,dX_s + \langle X,Y\rangle_t. \]
This also provides an understanding of, and can even serve as a definition for, the bracket process:
\[ \langle X,Y\rangle_t = X_tY_t - X_0Y_0 - \int_0^t X_s\,dY_s - \int_0^t Y_s\,dX_s. \]

Example 6.1 If $B_t = (B^1_t,\dots,B^n_t)$ is an $n$-dimensional Brownian motion, then $\langle B^i,B^j\rangle_t = \delta_{ij}t$, $|B_t|^2 = \sum_i |B^i_t|^2$, and
\[ |B_t|^2 = 2\sum_{i=1}^n\int_0^t B^i_s\,dB^i_s + nt. \]
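Taking expectations in the last identity, the stochastic integrals vanish (they are mean-zero martingales), so $E|B_t|^2 = nt$. A quick Monte Carlo sanity check, using only that $B_t\sim N(0,tI_n)$ (a sketch, not part of the notes):

```python
import random

random.seed(1)

n, t, samples = 3, 2.0, 200_000
acc = 0.0
for _ in range(samples):
    # |B_t|^2 for an n-dimensional Brownian motion: sum of n squared N(0, t) variables.
    acc += sum(random.gauss(0.0, t ** 0.5) ** 2 for _ in range(n))
mean_sq = acc / samples
print(mean_sq)  # close to n * t = 6.0
```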

Example 6.2 Let $(M_t)$ be a continuous semi-martingale. Then $X_t = e^{M_t-\frac12\langle M,M\rangle_t}$ satisfies the equation
\[ X_t = e^{M_0} + \int_0^t X_s\,dM_s. \]
Let $Y_t := M_t - \frac12\langle M,M\rangle_t$; then $\langle Y,Y\rangle_t = \langle M,M\rangle_t$ and $X_t = e^{Y_t}$. Let $f(x) = e^x$ and apply Ito's formula to the function $f$ and the process $(Y_t)$:
\begin{align*}
X_t = e^{Y_t} &= e^{Y_0} + \int_0^t e^{Y_s}\,dY_s + \frac12\int_0^t e^{Y_s}\,d\langle Y,Y\rangle_s\\
&= e^{M_0} + \int_0^t e^{Y_s}\,dM_s - \frac12\int_0^t e^{Y_s}\,d\langle M,M\rangle_s + \frac12\int_0^t e^{Y_s}\,d\langle Y,Y\rangle_s\\
&= e^{M_0} + \int_0^t X_s\,dM_s.
\end{align*}

Definition 6.9 If $(M_t)$ is a continuous local martingale, then $e^{M_t-\frac12\langle M,M\rangle_t}$ is a continuous local martingale and is called the exponential martingale of $M_t$.
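For $M = B$ a standard Brownian motion, the exponential martingale is $N_t = e^{B_t - t/2}$ and $EN_t = 1$ for all $t$ (cf. Theorem 6.20 below). This can be verified by Monte Carlo, using only the law $B_t\sim N(0,t)$ (a sketch, not part of the notes):

```python
import math
import random

random.seed(2)

t, samples = 1.5, 400_000
acc = 0.0
for _ in range(samples):
    b = random.gauss(0.0, math.sqrt(t))   # B_t ~ N(0, t)
    acc += math.exp(b - t / 2.0)          # N_t = exp(B_t - t/2)
mean_n = acc / samples
print(mean_n)  # close to E N_t = 1
```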

Theorem 6.19 1. Let $(X_t)$ be a continuous semi-martingale. Assume that $\frac{\partial}{\partial t}F(t,x)$ and $\frac{\partial^2}{\partial x_i\partial x_j}F(t,x)$, $i,j = 1,\dots,d$, exist and are continuous functions. Then
\[ F(t,X_t) = F(0,X_0) + \int_0^t \frac{\partial F}{\partial s}(s,X_s)\,ds + \int_0^t DF(s,X_s)\,dX_s + \frac12\int_0^t D^2F(s,X_s)\,d\langle X,X\rangle_s. \]

2. Ito's formula holds for complex valued semi-martingales $X_t + \sqrt{-1}\,Y_t$.

Appendix

Theorem 6.20 Let $(M_t)$ be a continuous local martingale. The exponential martingale $N_t := e^{M_t-\frac12\langle M,M\rangle_t}$ is a martingale if and only if $E(N_t) = 1$ for all $t$.

Proof If $(N_t)$ is a martingale, the statement that its expectation is constant in $t$ follows from the definition. We prove the converse. Since $(N_t)$ is a positive continuous local martingale, it is a super-martingale: for a reducing sequence of stopping times $T_n$ and any pair of real numbers $0\le s\le t$, we apply Fatou's lemma,
\[ E(N_t\,|\,\mathcal{F}_s) \le \lim_{n\to\infty} E(N^{T_n}_t\,|\,\mathcal{F}_s) = \lim_{n\to\infty} N^{T_n}_s = N_s. \]
Let $T$ be a stopping time bounded by a positive number $K$. By the optional stopping theorem,
\[ E(N_T) \ge E(N_K) = 1, \qquad E(N_T) \le E(N_0) = 1. \]
Thus $E(N_T) = 1$ and $(N_t)$ is a martingale.

This can be generalized to stochastic processes that are not positive valued. Let $(M_t)$ be a continuous local martingale with $E|M_0|<\infty$. Suppose that the family $\{M^-_T:\ T \text{ a bounded stopping time}\}$ is uniformly integrable. Then $(M_t)$ is a super-martingale. It is a martingale if and only if $EM_t = EM_0$; see Prop. 2.2 in [6].

Chapter 7

Stochastic Differential Equations

7.1 Stochastic processes defined up to a random time (Lecture 24)

Let $(B_t)$ be a real valued Brownian motion. The stochastic process $X_t(\omega) := \frac{1}{2-B_t(\omega)}$ is defined up to the first time $B_t(\omega)$ reaches $2$. We denote this time by $\tau$:
\[ \tau(\omega) = \inf\{t\ge 0:\ B_t(\omega)\ge 2\}. \]
For any given time $t$, no matter how small it is, there is a set of paths of positive probability (measured with respect to the Wiener measure on $C([0,t];\mathbf{R}^d)$) which will have reached $2$ by time $t$:
\[ P(\tau\le t) = P\big(\sup_{s\le t} B_s\ge 2\big) = 2P(B_t\ge 2) = \sqrt{\frac{2}{\pi}}\int_{2/\sqrt{t}}^\infty e^{-\frac{y^2}{2}}\,dy > 0. \]
This probability converges to zero as $t\to 0$. We say that $(X_t)$ is defined up to $\tau$, and $\tau$ is called its life time or explosion time.

Exercise 7.1 (1) Use Ito's formula to prove that on $\{t<\tau(\omega)\}$ the following holds almost surely:
\[ X_t = \frac12 + \int_0^t (X_s)^2\,dB_s + \int_0^t (X_s)^3\,ds. \]
Let $c\neq 0$ and define
\[ \tau_c(\omega) = \inf\{t\ge 0:\ B_t(\omega)\ge c\}. \]
Prove that for $c\le 2$, on $\{t<\tau_c(\omega)\}$, the above identity holds a.s. The process $(X_t,\ t<\tau_2)$ is the maximal solution to the above integral equation.

(2) Replace $(B_t)$ by another Brownian motion $(W_t)$. Find a stochastic process $(X_t)$ that solves
\[ X_t = \frac12 + \int_0^t (X_s)^2\,dW_s + \int_0^t (X_s)^3\,ds. \]

(3) Let $a\in\mathbf{R}$; find a stochastic process $(X_t)$ that solves
\[ X_t = a + \int_0^t (X_s)^2\,dB_s + \int_0^t (X_s)^3\,ds. \]
Prove your claim.

Let $\mathbf{R}^d\cup\{\Delta\}$ be the one point compactification of $\mathbf{R}^d$, the topological space whose open sets are the open sets of $\mathbf{R}^d$ together with sets of the form $(\mathbf{R}^d\setminus K)\cup\{\Delta\}$, where $K$ denotes a compact set. Given a process $(X_t,\ t<\tau)$ on $\mathbf{R}^d$, we define a process $(X_t,\ t\ge 0)$ on $\mathbf{R}^d\cup\{\Delta\}$:
\[ X_t(\omega) = \begin{cases} X_t(\omega), & \text{if } t<\tau(\omega),\\ \Delta, & \text{if } t\ge\tau(\omega). \end{cases} \]
If $(X_t,\ t<\tau)$ is a continuous process on $\mathbf{R}^d$, then $(X_t)$ is a continuous process on $\mathbf{R}^d\cup\{\Delta\}$. Define $W(\mathbf{R}^d)\equiv C([0,T];\mathbf{R}^d\cup\{\Delta\})$ whose elements satisfy: if $Y_s = \Delta$ then $Y_t = \Delta$ for all $t\ge s$. The last condition means that once a process enters the coffin state it does not return.

7.2 Concepts

For $i = 1,2,\dots,m$, let $\sigma_i, b:\mathbf{R}\times\mathbf{R}^d\to\mathbf{R}^d$ be Lipschitz continuous functions, or more generally Borel measurable functions. Let $B_t = (B^1_t,\dots,B^m_t)$ be an $\mathbf{R}^m$-valued Brownian motion on the filtered probability space $(\Omega,\mathcal{F},\mathcal{F}_t,P)$. We study first the integral equation of Markovian type:
\begin{equation} x_t = x_0 + \sum_{k=1}^m\int_0^t \sigma_k(s,x_s)\,dB^k_s + \int_0^t b(s,x_s)\,ds. \tag{7.1} \end{equation}
Once we are familiar with the basic estimates, we will proceed to the concept of stochastic differential equations.

A stochastic process $(x_t(\omega))$ is a solution to the above integral equation if for all $t$ the identity (7.1) holds almost surely. If the functions $\sigma_i(t,x)$ and $b(t,x)$ do not depend on $t$, the SDE is said to be time homogeneous.

We concern ourselves, mostly, with time homogeneous SDEs (of Markovian type), letting $\sigma_i, b:\mathbf{R}^d\to\mathbf{R}^d$ be Borel measurable locally bounded functions:
\begin{equation} E(\sigma,b):\qquad dx_t = \sum_{i=1}^m \sigma_i(x_t)\,dB^i_t + b(x_t)\,dt. \tag{7.2} \end{equation}

Example 7.1 The following SDE on $\mathbf{R}^d$ is not Markovian:
\[ dx_t = \Big(\int_0^t x_r\,dr\Big)\,dB_t, \]
equivalently, $x_t = x_0 + \int_0^t\big(\int_0^s x_r\,dr\big)\,dB_s$. Let us define $\sigma: W^d\to\mathbf{R}$, where $W^d$ is the Wiener space, by $\sigma_t(w) := \int_0^t w_r\,dr$. Note that $\sigma(x_\cdot)$ depends on the value of the whole path of the solution.

7.3 Stochastic Integral Equations (Lectures 22-26)

Let $(\Omega,\mathcal{F},\mathcal{F}_t,P)$ be a filtered probability space satisfying the standard assumptions, and let $(B_t)$ be a real valued Brownian motion. Letting $f_s = (f^1_s,\dots,f^d_s)$ be a progressively measurable stochastic process with values in $\mathbf{R}^d$, we define
\[ \int_0^t f_s\,dB_s = \Big(\int_0^t f^1_s\,dB_s,\ \dots,\ \int_0^t f^d_s\,dB_s\Big). \]
Denote by $|x| = \sqrt{\sum_{i=1}^d (x^i)^2}$ the Euclidean norm of $x = (x^1,\dots,x^d)\in\mathbf{R}^d$.

Lemma 7.1 Let $T>0$ and $t\le T$. Let $(f_s,\ s\le T)$ be a progressively measurable $\mathbf{R}^d$-valued stochastic process.

(1) There exists a universal constant $C$ such that
\[ E\sup_{t\le T}\Big|\int_0^t f_s\,dB_s\Big|^2 \le C\int_0^T E|f_s|^2\,ds. \]

(2)
\[ E\Big(\sup_{t\le T}\Big|\int_0^t f_s\,ds\Big|^2\Big) \le T\int_0^T E|f_s|^2\,ds. \]

Proof For (1): by the Burkholder-Davis-Gundy inequality (Theorem 5.7), there exists a universal constant $C$ such that
\[ E\sup_{t\le T}\Big|\int_0^t f_s\,dB_s\Big|^2 \le \sum_{i=1}^d E\Big(\sup_{t\le T}\int_0^t f^i_s\,dB_s\Big)^2 \le CE\sum_{i=1}^d\int_0^T (f^i_s)^2\,ds = C\int_0^T E|f_s|^2\,ds. \]
(2) By the Cauchy-Schwarz inequality, for any $0\le t\le T$,
\[ E\sup_{t\le T}\Big|\int_0^t f_s\,ds\Big|^2 \le \sum_{i=1}^d E\sup_{t\le T}\Big(\int_0^t f^i_s\,ds\Big)^2 \le TE\int_0^T |f_s|^2\,ds = T\int_0^T E|f_s|^2\,ds. \]

In particular, if $(B^k_t)$, $k = 1,\dots,m$, are independent Brownian motions and the $f_k(s)$ are progressively measurable $\mathbf{R}^d$-valued stochastic processes, then from the triangle inequality for norms and (1) in Lemma 7.1,
\begin{align*}
E\Big|\sup_{t\le T}\sum_{k=1}^m\int_0^t f_k(s)\,dB^k_s\Big|^2 &\le E\Big(\sup_{t\le T}\sum_{k=1}^m\Big|\int_0^t f_k(s)\,dB^k_s\Big|\Big)^2\\
&\le 2^{m-1}\sum_{k=1}^m E\Big|\sup_{t\le T}\int_0^t f_k(s)\,dB^k_s\Big|^2 \le 2^{m-1}C\sum_{k=1}^m\int_0^T E|f_k(s)|^2\,ds.
\end{align*}

Definition 7.1 A function $f:\mathbf{R}^d\to\mathbf{R}$ is said to grow at most linearly if there exists a constant $C$ such that
\[ |f(x)| \le C(1+|x|) \]
for all $x\in\mathbf{R}^d$.

Lemma 7.2 If $f:\mathbf{R}^d\to\mathbf{R}$ is Lipschitz continuous, then $f$ grows at most linearly.

Proof For any $x\in\mathbf{R}^d$,
\[ |f(x)| \le |f(x)-f(0)| + |f(0)| \le |f|_{\mathrm{Lip}}|x| + |f(0)| \le C(1+|x|), \]
where $C = \max(|f(0)|,|f|_{\mathrm{Lip}})$.

Theorem 7.3 For $k = 1,2,\dots,m$, let $\sigma_k, b:\mathbf{R}^d\to\mathbf{R}^d$ be Lipschitz continuous functions. Let $B_t = (B^1_t,\dots,B^m_t)$ be an $\mathbf{R}^m$-valued Brownian motion on $(\Omega,\mathcal{F},\mathcal{F}_t,P)$. Then for each $x_0\in\mathbf{R}^d$ there exists a unique continuous $\mathcal{F}^B_t$-adapted stochastic process such that
\[ x_t = x_0 + \sum_{k=1}^m\int_0^t \sigma_k(x_s)\,dB^k_s + \int_0^t b(x_s)\,ds \]
for all $t$, a.s. Furthermore, for each $t$, $x_t$ is $\mathcal{F}^B_t$-measurable.

Proof Fix $T>1$. Define, for all $t\in[0,T]$,
\begin{align*}
x^{(0)}_t &= x_0,\\
x^{(1)}_t &= x_0 + \sum_{k=1}^m\int_0^t \sigma_k(x_0)\,dB^k_r + \int_0^t b(x_0)\,dr,\\
&\ \,\vdots\\
x^{(n+1)}_t &= x_0 + \sum_{k=1}^m\int_0^t \sigma_k(x^{(n)}_r)\,dB^k_r + \int_0^t b(x^{(n)}_r)\,dr.
\end{align*}
For $u\le T$, using Doob's $L^2$ inequality on the Brownian terms,
\begin{align*}
E\sup_{t\le u}|x^{(1)}_t - x_0|^2 &\le 2E\sup_{t\le u}\Big|\sum_{k=1}^m \sigma_k(x_0)B^k_t\Big|^2 + 2uE|b(x_0)|^2\\
&\le 2^{m+2}\sum_{k=1}^m E\big|\sigma_k(x_0)B^k_u\big|^2 + 2u|b(x_0)|^2\\
&\le 2^{m+2}\sum_{k=1}^m C^2(1+|x_0|)^2E(B^k_u)^2 + 2uC^2(1+|x_0|)^2\\
&= (2^{m+2}mu + 2u)\,C^2(1+|x_0|)^2 =: C_0,
\end{align*}
where $C$ is the common linear growth constant for the $\sigma_k$ and $b$. By induction and analogous estimation, $E\sup_{t\le u}|x^{(n)}_t|^2$ is finite and the stochastic integrals make sense. By construction, each $(x^{(n)}_t)$ is sample continuous and is adapted to the filtration of $(B_t)$.

We estimate the differences between iterations:

\begin{align*}
&E\sup_{s\le t}|x^{(n+1)}_s - x^{(n)}_s|^2\\
&= E\sup_{s\le t}\Big|\sum_{k=1}^m\int_0^s\big(\sigma_k(x^{(n)}_r) - \sigma_k(x^{(n-1)}_r)\big)\,dB^k_r + \int_0^s\big(b(x^{(n)}_r) - b(x^{(n-1)}_r)\big)\,dr\Big|^2\\
&\le 2m\sum_{k=1}^m E\sup_{s\le t}\Big|\int_0^s\big(\sigma_k(x^{(n)}_r) - \sigma_k(x^{(n-1)}_r)\big)\,dB^k_r\Big|^2 + 2E\sup_{s\le t}\Big|\int_0^s\big(b(x^{(n)}_r) - b(x^{(n-1)}_r)\big)\,dr\Big|^2.
\end{align*}
Let $K$ be the common Lipschitz constant for the $\sigma_k$ and $b$. By Lemma 7.1,
\begin{align*}
E\sup_{s\le t}|x^{(n+1)}_s - x^{(n)}_s|^2 &\le 2mC\sum_{k=1}^m E\Big(\int_0^t\big|\sigma_k(x^{(n)}_r) - \sigma_k(x^{(n-1)}_r)\big|^2\,dr\Big) + 2TE\int_0^t\big|b(x^{(n)}_r) - b(x^{(n-1)}_r)\big|^2\,dr\\
&\le 2mC\sum_{k=1}^m K^2\int_0^t E\big|x^{(n)}_r - x^{(n-1)}_r\big|^2\,dr + 2TK^2\int_0^t E\big|x^{(n)}_r - x^{(n-1)}_r\big|^2\,dr.
\end{align*}
Let
\[ D = 2m^2CK^2 + 2TK^2. \]
Then
\begin{align*}
E\sup_{s\le t}|x^{(n+1)}_s - x^{(n)}_s|^2 &\le D\int_0^t E\big|x^{(n)}_r - x^{(n-1)}_r\big|^2\,dr \le D\int_0^t E\sup_{r\le s_1}\big|x^{(n)}_r - x^{(n-1)}_r\big|^2\,ds_1\\
&\le D^2\int_0^t\int_0^{s_1} E\sup_{r\le s_2}\big|x^{(n-1)}_r - x^{(n-2)}_r\big|^2\,ds_2\,ds_1\\
&\le D^n\int_0^t\int_0^{s_1}\dots\int_0^{s_{n-1}} E\sup_{r\le s_n}\big|x^{(1)}_r - x^{(0)}_r\big|^2\,ds_n\dots ds_2\,ds_1.
\end{align*}
By induction we see that
\[ E\sup_{s\le t}|x^{(n+1)}_s - x^{(n)}_s|^2 \le C_1\frac{D^nT^n}{n!}, \]
where $C_1 = E\sup_{t\le T}|x^{(1)}_t - x_0|^2 \le C_0$. By the Minkowski inequality,
\[ \sqrt{E\Big(\sum_{k=1}^\infty\sup_{s\le t}|x^{(k+1)}_s - x^{(k)}_s|\Big)^2} \le \sum_{k=1}^\infty\Big(E\sup_{s\le t}|x^{(k+1)}_s - x^{(k)}_s|^2\Big)^{\frac12} \le \sqrt{C_1}\sum_{k=1}^\infty\sqrt{\frac{D^kT^k}{k!}} < \infty. \]

In particular, for almost every $\omega$,
\[ \sum_{k=1}^\infty\sup_{s\le t}\big|x^{(k+1)}_s(\omega) - x^{(k)}_s(\omega)\big| < \infty. \]
For such $\omega$, $\big(x^{(n)}_\cdot(\omega)\big)$ is a Cauchy sequence in $C([0,t];\mathbf{R}^d)$. Let $x_t(\omega) = \lim_{n\to\infty}x^{(n)}_t(\omega)$. The process is continuous in time by the uniform convergence. We take $n\to\infty$ in
\[ x^{(n)}_t = x_0 + \sum_{k=1}^m\int_0^t \sigma_k(x^{(n-1)}_r)\,dB^k_r + \int_0^t b(x^{(n-1)}_r)\,dr. \]
As $n\to\infty$, $\int_0^t \sigma_k(x^{(n)}_s)\,dB^k_s \to \int_0^t \sigma_k(x_s)\,dB^k_s$ in probability. There is an almost surely convergent subsequence, and this proves that
\[ x_t = x_0 + \sum_{k=1}^m\int_0^t \sigma_k(x_s)\,dB^k_s + \int_0^t b(x_s)\,ds. \]
Since each $(x^{(n)}_t)$ is adapted to the filtration of $(B_t)$, so is its limit.
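The Picard iteration can be imitated on a single discretized Brownian path. The sketch below (illustrative only; the coefficients $\sigma = \sin$, $b = \cos$ are an arbitrary Lipschitz choice, not from the notes) iterates the discretized map $y_t = x_0 + \int_0^t\sigma(x_s)\,dB_s + \int_0^t b(x_s)\,ds$ with left-point sums. On a fixed grid, the fixed point of this discrete map is exactly the Euler-Maruyama path built from the same increments.

```python
import math
import random

random.seed(4)

# One fixed set of Brownian increments on [0, 1]:
n = 200
dt = 1.0 / n
dB = [random.gauss(0.0, math.sqrt(dt)) for _ in range(n)]

sigma, b = math.sin, math.cos   # illustrative Lipschitz coefficients
x0 = 0.5

def picard_step(x):
    """One Picard iteration: evaluate both integrals along the previous iterate."""
    y = [x0]
    for i in range(n):
        y.append(y[i] + sigma(x[i]) * dB[i] + b(x[i]) * dt)
    return y

x = [x0] * (n + 1)              # x^(0): the constant path
iters = 0
while True:
    x_new = picard_step(x)
    iters += 1
    if x_new == x:              # discrete fixed point reached
        break
    x = x_new

# The fixed point coincides with the Euler-Maruyama path for the same increments:
euler = [x0]
for i in range(n):
    euler.append(euler[i] + sigma(euler[i]) * dB[i] + b(euler[i]) * dt)

gap = max(abs(u - v) for u, v in zip(x, euler))
print(iters, gap)
```

After $k$ iterations the first $k+1$ grid values already agree with the Euler path, so the loop is guaranteed to stop within $n+1$ iterations; in practice the factorial bound of the proof makes it stop much sooner.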

(2) Uniqueness. Let $(x_t)$ and $(y_t)$ be two solutions with $x_0 = y_0$ a.s., and let $C^*$ be the universal constant of Lemma 7.1. Then
\begin{align*}
&E\sup_{s\le t}|x_s - y_s|^2\\
&= E\sup_{s\le t}\Big|\sum_{k=1}^m\int_0^s\big(\sigma_k(x_r) - \sigma_k(y_r)\big)\,dB^k_r + \int_0^s\big(b(x_r) - b(y_r)\big)\,dr\Big|^2\\
&\le E\sup_{s\le t}\Big(\sum_{k=1}^m\Big|\int_0^s\big(\sigma_k(x_r) - \sigma_k(y_r)\big)\,dB^k_r\Big| + \Big|\int_0^s\big(b(x_r) - b(y_r)\big)\,dr\Big|\Big)^2\\
&\le 2m\sum_{k=1}^m E\Big(\sup_{s\le t}\Big|\int_0^s\big(\sigma_k(x_r) - \sigma_k(y_r)\big)\,dB^k_r\Big|\Big)^2 + 2E\Big(\sup_{s\le t}\Big|\int_0^s\big(b(x_r) - b(y_r)\big)\,dr\Big|\Big)^2\\
&\le 2mC^*\sum_{k=1}^m E\Big(\int_0^t|\sigma_k(x_r) - \sigma_k(y_r)|^2\,dr\Big) + 2TE\int_0^t|b(x_r) - b(y_r)|^2\,dr\\
&\le 2mC^*\sum_{k=1}^m K^2\int_0^t E|x_r - y_r|^2\,dr + 2TK^2\int_0^t E|x_r - y_r|^2\,dr\\
&\le \big(2m^2C^*K^2 + 2TK^2\big)\int_0^t E\Big(\sup_{r\le s}|x_r - y_r|^2\Big)\,ds.
\end{align*}
By Gronwall's inequality,
\[ E\sup_{s\le t}|x_s - y_s|^2 = 0. \]
In particular, $\sup_{s\le t}|x_s - y_s| = 0$ almost surely.

Lemma 7.4 (Gronwall's Inequality) Let $T>0$. Suppose that $f:[0,T]\to\mathbf{R}_+$ is a locally bounded Borel function for which there are two real numbers $C$ and $K\ge 0$ such that for all $0\le t\le T$,
\[ f(t) \le C + K\int_0^t f(s)\,ds. \]
Then
\[ f(t) \le Ce^{Kt}, \qquad t\le T. \]
In particular, if $C = 0$, then $f(t) = 0$ for all $t\le T$.

Definition 7.2 A solution $(x_t,\ t<\tau)$ of an SDE is a maximal solution if, whenever $(y_t,\ t<\zeta)$ is any other solution on the same probability space, with the same driving noise and with $x_0 = y_0$ a.s., then $\tau\ge\zeta$ a.s. We say that $\tau$ is the explosion time or the life time of $(x_t)$.

By localisation, or by cutting off the functions $\sigma_k$, we have the following theorem.

Theorem 7.5 Suppose that for $k = 1,\dots,m$, $\sigma_k:\mathbf{R}^d\to\mathbf{R}^d$ and $b:\mathbf{R}^d\to\mathbf{R}^d$ are locally Lipschitz continuous, i.e. for each $N\in\mathbf{N}$ there exists a number $K_N$ such that for all $x,y$ with $|x|\le N$, $|y|\le N$,
\[ |\sigma_k(x) - \sigma_k(y)| \le K_N|x-y|, \qquad |b(x) - b(y)| \le K_N|x-y|. \]
Then there is a maximal solution $(x_t,\ t<\tau)$. If $(x_t,\ t<\tau)$ and $(y_t,\ t<\zeta)$ are two maximal solutions with the same initial value $x\in\mathbf{R}^d$, then $\tau = \zeta$ a.s. and $(x_t)$ and $(y_t)$ are indistinguishable.

7.4 Examples

Example 7.2 Consider the ODE $\dot{x}(t) = ax(t)$ on $\mathbf{R}$, where $a\in\mathbf{R}$. Let $x_0\in\mathbf{R}$. Then $x(t) = x_0e^{at}$ is a solution with initial value $x_0$, defined for all $t\ge 0$. Let $\varphi_t(x_0) = x_0e^{at}$. Then $(t,x)\mapsto\varphi_t(x)$ is continuous and $\varphi_{t+s}(x_0) = \varphi_t(\varphi_s(x_0))$.

Example 7.3 Linear equation. Let $a,b\in\mathbf{R}$ and $d = m = 1$. Then
\[ x(t) = x_0\,e^{aB_t - \frac{a^2}{2}t + bt} \]
solves
\[ dx_t = a\,x_t\,dB_t + b\,x_t\,dt, \qquad x(0) = x_0. \]
The solution exists for all time. Is this solution unique? The answer is yes: if $y_t$ is a solution starting from the same point, one can compute and prove that $E|x_t - y_t|^2 = 0$ for all $t$, which implies that $x_t = y_t$ a.s. for each $t$. However, we do not prove it here; see Theorem 7.19.

Example 7.4 Additive noise. Consider a particle of mass 1, subject to a forcewhich is proportional to its own speed, is subject to vt = −kvt. Its random pertur-bation equation is the Langevin equation:

dvt(ω) = −kvt(ω)dt+ dBt(ω).


For each realisation of the noise (that is, for each ω), the solution is an Ornstein–Uhlenbeck process,

v_t(ω) = v_0 e^{−kt} + ∫_0^t e^{−k(t−r)} dB_r(ω).

We check that this process satisfies −k ∫_0^t v_s ds = v_t − v_0 − B_t. Indeed, using integration by parts,

−k ∫_0^t v_s ds = −k v_0 ∫_0^t e^{−ks} ds − ∫_0^t ( k e^{−ks} ∫_0^s e^{kr} dB_r ) ds

= −v_0 + v_0 e^{−kt} + ∫_0^t ( ∫_0^s e^{kr} dB_r ) d(e^{−ks})

= −v_0 + v_0 e^{−kt} + ( e^{−ks} ∫_0^s e^{kr} dB_r )|_{s=0}^{s=t} − ∫_0^t e^{−ks} d( ∫_0^s e^{kr} dB_r )

= −v_0 + v_0 e^{−kt} + e^{−kt} ∫_0^t e^{kr} dB_r − B_t

= −v_0 + v_t − B_t.

This proves that the Ornstein–Uhlenbeck process is a solution to the Langevin equation, with life time ∞.
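The pathwise identity just verified can also be seen on a discretised Brownian path. The sketch below (our own illustration; the step count and constants are arbitrary choices) builds the stochastic convolution ∫_0^t e^{−k(t−r)} dB_r with a left-point rule and checks that the residual of −k ∫_0^t v_s ds = v_t − v_0 − B_t is of the size of the discretisation error:

```python
import numpy as np

# Pathwise check of Example 7.4: v_t = v_0 e^{-kt} + \int_0^t e^{-k(t-r)} dB_r
# should satisfy -k \int_0^t v_s ds = v_t - v_0 - B_t up to discretisation error.
rng = np.random.default_rng(1)
k, v0, T, n = 1.0, 0.3, 1.0, 100_000
dt = T / n
dB = rng.normal(0.0, np.sqrt(dt), size=n)

v = np.empty(n + 1)
v[0] = v0
decay = np.exp(-k * dt)
for i in range(n):
    # exact one-step recursion for the left-point discretised convolution
    v[i + 1] = decay * (v[i] + dB[i])

# residual of  v_T - v_0 - B_T + k * (left Riemann sum of \int v_s ds)
residual = abs(v[-1] - v0 - dB.sum() + k * dt * v[:-1].sum())
```

The residual decays like √dt as the grid is refined, consistent with the exact identity holding in the limit.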

Example 7.5 (1) Small perturbation. Let ε > 0 be a small number,

x_t^ε = x_0 + ∫_0^t b(x_s^ε) ds + ε B_t.

As ε → 0, x_t^ε → x_t, the solution of the unperturbed equation ẋ_t = b(x_t). (Exercise)

(2) Let y_t^ε = y_0 + ε ∫_0^t b(y_s^ε) ds + √ε W_t. Assume that b is bounded. As ε → 0, y_t^ε converges uniformly in time on any finite time interval [0, t]:

E sup_{0≤s≤t} |y_s^ε − y_0| → 0.

7.5 Notions of Solutions (Lectures 26-27)

For i = 1, 2, . . . , m, let σ_i, b : R_+ × R^d → R^d be Borel measurable locally bounded functions. Let B_t = (B_t^1, . . . , B_t^m) be an R^m-valued Brownian motion.

Definition 7.3 A d-dimensional stochastic process (x_t, t < τ), where τ ≤ ∞, on a probability space (Ω, G, P) is a solution to the SDE (of Markovian type)

dx_t = σ(t, x_t) dB_t + b(t, x_t) dt  (7.3)

if there exists a filtration (F_t) such that:

(1) x_t is adapted to F_t;

(2) there is an (F_t)-Brownian motion B_t = (B_t^1, . . . , B_t^m) with B_0 = 0;

(3) for all stopping times T < τ, the following makes sense and holds almost surely:

x_T = x_0 + ∑_{k=1}^m ∫_0^T σ_k(s, x_s) dB_s^k + ∫_0^T b(s, x_s) ds.

We may replace (3) by:

(3') there is an adapted continuous stochastic process x_· in C([0, ∞); R^d ∪ {∆}) such that for all t ≥ 0,

x_t = x_0 + ∑_{k=1}^m ∫_0^t σ_k(s, x_s) dB_s^k + ∫_0^t b(s, x_s) ds, a.s.

In essence the SDE holds on {t < τ(ω)}. The maximal time τ up to which a solution is defined is the explosion time; the solution (x_t, t < τ) is the maximal solution.

Definition 7.4 A solution is a global solution if its life time is infinite. We say that the SDE does not explode from x_0 if its solution from x_0 is global. We say that the SDE does not explode if all of its solutions are global.

Example 7.6 (Tanaka's SDE) Consider the equation on R,

dx_t = sign(x_t) dB_t,

where

sign(x) = −1 if x ≤ 0,  1 if x > 0.

Let (W_t) be a Brownian motion on any probability space with W_0 = 0, and let x ∈ R. Define

B_t = ∫_0^t sign(x + W_s) dW_s.

This is a local martingale with quadratic variation t and hence a Brownian motion. Furthermore,

∫_0^t sign(x + W_s) dB_s = ∫_0^t (sign(x + W_s))² dW_s = ∫_0^t dW_s = W_t.

Thus

x + W_t = x + ∫_0^t sign(x + W_s) dB_s,

and x + W_t solves Tanaka's equation.
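The algebra behind this construction — sign² = 1, so integrating sign(x + W) against B recovers W — can be made concrete on a discretised path. The sketch below (our own illustration, with arbitrary parameters) builds the increments of B from a simulated path W and checks that the discrete Tanaka equation for x + W holds exactly:

```python
import numpy as np

# Discrete version of the weak-solution construction in Example 7.6:
# from a Brownian path W build dB_s = sign(x + W_s) dW_s; since sign^2 = 1,
# integrating sign(x + W_s) against B recovers W exactly, so x + W_t
# satisfies the discrete Tanaka equation driven by B.
rng = np.random.default_rng(2)
x, n = 0.1, 50_000
dt = 1.0 / n
dW = rng.normal(0.0, np.sqrt(dt), size=n)
W = np.concatenate(([0.0], np.cumsum(dW)))

sgn = np.where(x + W[:-1] > 0, 1.0, -1.0)   # sign(x + W_s) at left endpoints
dB = sgn * dW                                # increments of B_t = \int sign(x+W) dW
lhs = x + np.cumsum(sgn * dB)                # x + \int_0^t sign(x + W_s) dB_s
rhs = x + W[1:]                              # x + W_t

max_gap = np.max(np.abs(lhs - rhs))          # zero up to floating point
```

Here the identity is exact at the discrete level because multiplying by ±1 twice is an exact floating-point operation.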

Example 7.7 Consider dx_t = x_t dB_t on R. Let d ≥ 2 and Ω = W^d = C([0, ∞); R^d) with the Borel σ-algebra of the metric

d(ω_1, ω_2) = ∑_{n=1}^∞ 2^{−n} ( sup_{t≤n} |ω_1(t) − ω_2(t)| ∧ 1 ),

and the filtration generated by the evaluation maps (π_s : s ≤ t). Let B_t^1 = π_t^1, the projection to the first component evaluated at time t. Then x_t^1 = x e^{π_t^1 − t/2} is a solution with initial value x. Let B_t^2 = π_t^2; then x_t^2 = x e^{π_t^2 − t/2} is a solution with initial value x. The two solutions x_t^1 and x_t^2 are driven by independent Brownian motions, and are themselves independent! However there is a single universal function F_t(x, ω) = x e^{ω(t) − t/2} producing both.

Let (x_t) be a solution with initial value x_0. Suppose that the solution (x_0, ω) ↦ (x_s(ω), s ≤ t) is measurable with respect to B(R^d) ⊗ F_t^B. Then there exists a Borel measurable function F_t on R^d × C([0, t]; R^d) such that

x_t(ω) = F_t(x_0, (B_s, 0 ≤ s ≤ t)).

This raises the following questions.

Questions.

1. Given any y_0 ∈ R^d, is F_t(y_0, B_·) a solution with initial value y_0?

2. Given another probability space (Ω′, F′, F_t′, P′) and a Brownian motion (W_t), is y_t = F_t(y_0, (W_s, 0 ≤ s ≤ t)) a solution to

y_t = y_0 + ∑_{k=1}^m ∫_0^t σ_k(y_s) dW_s^k + ∫_0^t b(y_s) ds?

Definition 7.5 A solution (x_t, B_t) on (Ω, F, F_t, P) is said to be a strong solution if x_t is adapted to the filtration of (B_t) for each t. By a weak solution we mean one which is not strong.

Suppose (x_t, B_t) is a solution on the canonical probability space, so B_t(ω) = ω(t). If x_t is measurable w.r.t. the natural filtration of (B_t), then there exists a Borel measurable function F_t(x_0, ·) : W_0^m → W^d such that

x_t(ω) = F_t(x_0, (B_s(ω), s ≤ t)) = F_t(x_0, (ω(s), s ≤ t)).


7.6 Notions of uniqueness (Lecture 27)

Example 7.8 (Tanaka's SDE) Consider the equation on R,

dx_t = sign(x_t) dB_t,

where

sign(x) = −1 if x ≤ 0,  1 if x > 0.

If (x_t) is a solution, then x_t − x_0 = ∫_0^t sign(x_s) dB_s is a Brownian motion: the stochastic integral ∫_0^t sign(x_s) dB_s is a martingale with quadratic variation t, so by Lévy's characterisation theorem the distribution of (x_s − x_0, s ≤ t) is the Wiener measure on C([0, t]; R). In other words, the probability distribution of (x_t) is that of a Brownian motion with initial value x_0.

Definition 7.6 If, whenever (x_t) and (x̃_t) are two solutions with x_0 = x̃_0 almost surely, the probability distribution of {x_t : t ≥ 0} is the same as the probability distribution of {x̃_t : t ≥ 0}, we say that uniqueness in law holds.

Uniqueness in law holds for Tanaka's equation. Uniqueness in law implies the following stronger conclusion: whenever x_0 and x̃_0 have the same distribution, the corresponding solutions have the same law.

Example 7.9 (Tanaka's SDE) If (x_t) solves Tanaka's equation x_t = ∫_0^t sign(x_s) dB_s (initial value 0), then so does (−x_t).

Definition 7.7 We say that pathwise uniqueness holds for an SDE if, whenever (x_t) and (x̃_t) are two solutions of the SDE on the same probability space (Ω, F, F_t, P) with the same driving Brownian motion (B_t) and the same initial data (x_0 = x̃_0 a.s.), then x_t = x̃_t for all t ≥ 0 almost surely.

Pathwise uniqueness fails for Tanaka's equation.

Example 7.10 Consider the ODE ẋ_t = (x_t)^α, α < 1, which has two solutions from zero: the trivial solution 0 and x_t = ((1 − α) t)^{1/(1−α)}. Both forms of uniqueness fail.

Example 7.11 Dimension d = 1. Consider dx_t = σ(x_t) dW_t. Suppose that σ is Hölder continuous of order α: |σ(x) − σ(y)| ≤ c|x − y|^α for all x, y. If α ≥ 1/2 then pathwise uniqueness holds for dx_t = σ(x_t) dW_t; if α < 1/2 it no longer holds in general. For α > 1/2 this goes back to Skorohod (1962–65) and Tanaka (1964); the α = 1/2 case is credited to Yamada–Watanabe.


7.6.1 The Yamada-Watanabe Theorem

The following beautiful, and somewhat surprising, theorem of Yamada and Watanabe states that the existence of a weak solution for every initial distribution, together with pathwise uniqueness, implies the existence of a unique strong solution.

Proposition 7.6 If pathwise uniqueness holds then any solution is a strong solu-tion and uniqueness in law holds.

For the precise meaning of 'universally measurable', see p. 163 of Ikeda–Watanabe's book [13].

Theorem 7.7 (The Yamada–Watanabe Theorem) Suppose that for each initial probability distribution there is a weak solution to the SDE, and that pathwise uniqueness holds. Then there exists a unique strong solution. By this we mean that there is a progressively measurable map F : R^d × W_0^m → W^d, where the σ-algebras are 'universally complete', such that:

1. for any probability measure µ on R^d there exists F̃, measurable w.r.t. the (µ × P)-completion of B(R^d × W_0^m), such that F̃(x, ω) = F(x, ω) a.s.; if ξ_0 ∈ F_0 we set F(ξ_0, B) = F̃(ξ_0, B);

2. for any Brownian motion (B_t) on a probability space (Ω, F, F_t, P) and any ξ_0 ∈ F_0, x_t = F_t(ξ_0, B) is a solution to the SDE with driving noise (B_t) and initial value ξ_0;

3. if x_t is a solution to the equation with driving noise (B_t), then x_t = F_t(x_0, B) a.s.

In other words, for any B_t and x_0 ∈ R^d, F_t(x_0, B) is a solution with driving noise B_t; and if x_t is a solution on a filtered probability space with driving noise B_t, then x_t = F_t(x_0, B) a.s.

We do not prove this theorem, but refer to Ikeda–Watanabe and Revuz–Yor. The following observation is important for the proof of the Yamada–Watanabe Theorem: given two solutions on two probability spaces, we can build them on the same probability space W^d × W^d × W_0^m.

Lemma 7.8 Let f, g be locally bounded predictable processes (measurable with respect to the filtration generated by left continuous processes), and let B, W be continuous semi-martingales. If (f, B) = (g, W) in distribution, then

(f, B, ∫_0^t f_s dB_s) =(law)= (g, W, ∫_0^t g_s dW_s),

i.e. they have the same probability distribution.


See Exercise 5.16 in Revuz–Yor.

7.7 Markov process and Transition function (Lecture 26)

Definition 7.8 An F_t-adapted stochastic process X_t is a Markov process if for all real valued bounded Borel measurable functions f and all 0 ≤ s ≤ t,

E{f(X_t) | F_s} = E{f(X_t) | X_s}.

It is strong Markov if for all stopping times τ and all t ≥ 0, E{f(X_{τ+t}) | F_τ} = E{f(X_{τ+t}) | X_τ}.

Definition 7.9 Let (E, B(E)) be a measurable space. A family of probability measures P(s, x, ·), 0 ≤ s < ∞, x ∈ E, on E is a Markov transition function for a time homogeneous Markov process if:

1. For all s ≥ 0 and x ∈ E, A ↦ P(s, x, A) is a probability measure on B(E).

2. For all x ∈ E, P(0, x, ·) = δ_x, the delta measure at x.

3. For all A ∈ B(E), (s, x) ↦ P(s, x, A) : [0, ∞) × E → R is bounded and Borel measurable.

4. For all s, t ≥ 0, all x ∈ E, and A ∈ B(E),

P(s + t, x, A) = ∫_{y∈E} P(s, x, dy) P(t, y, A).

This equation is the Chapman–Kolmogorov equation.

If f is a bounded measurable function, we define

P_t f(x) = ∫ f(y) P(t, x, dy).  (7.4)

Definition 7.10 Let B be a Banach space of functions. A family of bounded linear operators P_t is a semigroup if P_0 is the identity map and P_{s+t} = P_s P_t for all s, t ≥ 0.

Let (P_t) be defined by formula (7.4). Then it is a semigroup of bounded linear operators on the space of bounded measurable functions; cf. condition (2) and the Chapman–Kolmogorov equation.


Definition 7.11 An F_t-adapted stochastic process (X_t) is a time homogeneous Markov process w.r.t. F_t with Markov transition function P(t, x, ·) if for 0 ≤ s ≤ t and all bounded measurable f : (E, B(E)) → R,

E{f(X_{t+s}) | F_s} = P_t f(X_s) ≡ ∫ f(y) P(t, X_s, dy).  (7.5)

It is strong Markov if for any stopping time τ,

E{f(X_{s+τ}) | F_τ} = P_s f(X_τ).

The probability measure ν(·) = P(X_0 ∈ ·) is the initial distribution of the Markov process.

Exercise 7.2 Prove that (7.5) implies the Chapman–Kolmogorov equation.

Exercise 7.3 Let (X_t) be a Markov process with transition function P(t, x, ·) and initial distribution ν. Prove that for Borel measurable sets A_0, A_1, . . . , A_k of E and 0 ≤ t_1 < · · · < t_k,

P(X_0 ∈ A_0, X_{t_1} ∈ A_1, . . . , X_{t_k} ∈ A_k)
= ∫_{A_0} ∫_{A_1} · · · ∫_{A_k} P(t_k − t_{k−1}, x_{k−1}, dx_k) · · · P(t_2 − t_1, x_1, dx_2) P(t_1, x_0, dx_1) ν(dx_0).  (7.6)

If E = R^n, there exists a Markov process (X_t) whose finite dimensional distributions are determined by (7.6). Formula (7.6) is equivalent to the following: for any finite family of bounded measurable functions f_i : R^d → R,

E ∏_{i=0}^k f_i(X_{t_i})
= ∫_{R^d} f_0(x_0) ν(dx_0) ∫_{R^d} f_1(x_1) P(t_1, x_0, dx_1) · · · ∫_{R^d} f_k(x_k) P(t_k − t_{k−1}, x_{k−1}, dx_k)
= ∫_{R^d} f_0(x_0) P_{t_1}[ f_1 · · · P_{t_k − t_{k−1}} f_k ](x_0) ν(dx_0).

Remark 7.1 If (X_t) is a Markov process, we may be tempted to define operators T_{s,t}, t ≥ s ≥ 0, on bounded measurable functions by

T_{s,t} f(x) = E{f(X_t) | X_s = x}.

This is a good proposal, but it needs some thought. The conditional expectation E{f(X_t) | X_s} is defined up to a set of measure zero, and this set of measure zero may differ when the function f is changed. For the object E{f(X_t) | X_s = x} to be well defined we need to consider regular conditional probabilities, and such considerations are in general quite messy. However, if we begin with a transition function, there is no such problem.


7.7.1 Semigroup and Generators

Definition 7.12 A semigroup (P_t) is strongly continuous if lim_{t→0} P_t f = f for every f ∈ B. It is a contraction semigroup if ‖P_t‖ ≤ 1.

Definition 7.13 The infinitesimal generator of a semigroup (P_t) of bounded linear operators on a Banach space B is the linear operator L given by the formula

L f = lim_{t→0} (P_t f − f) / t.

The domain of L is the set of f ∈ B for which the above limit exists.

Proposition 7.9 Let (P_t) be a strongly continuous semigroup on a Banach space B with infinitesimal generator L, and let t ≥ 0. The following hold:

1. If f ∈ B, then ∫_0^t P_s f ds ∈ Dom(L) and

P_t f − f = L( ∫_0^t P_s f ds ).

2. If f ∈ Dom(L), then P_t f ∈ Dom(L),

P_t f − f = ∫_0^t L(P_s f) ds,  and  (d/dt) P_t f = L(P_t f) = P_t(L f).
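Both identities in Proposition 7.9 can be checked explicitly for the Brownian semigroup P_t f(x) = E f(x + B_t) with generator L = (1/2) d²/dx². The sketch below (our own illustration) uses f = cos, for which P_t f = e^{−t/2} cos x and L f = −(1/2) cos x in closed form, and verifies P_t f − f = ∫_0^t P_s(L f) ds by quadrature:

```python
import math

# Proposition 7.9(2) for the Brownian semigroup P_t f(x) = E f(x + B_t),
# generator L = (1/2) d^2/dx^2, and f = cos: here P_t f(x) = e^{-t/2} cos(x)
# and (L f)(x) = -(1/2) cos(x), so both sides are computable.
x, t = 0.7, 1.3
Ptf = math.exp(-t / 2) * math.cos(x)
f_val = math.cos(x)

# right-hand side: \int_0^t P_s(L f)(x) ds, trapezoidal rule
n = 10_000
ds = t / n
integral = 0.0
for i in range(n + 1):
    s = i * ds
    w = 0.5 if i in (0, n) else 1.0
    integral += w * (-0.5 * math.exp(-s / 2) * math.cos(x)) * ds

gap = abs((Ptf - f_val) - integral)
```

Analytically both sides equal (e^{−t/2} − 1) cos x, so the gap is only quadrature error.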

We will review a number of properties of the Markov transition function. Let E = R for simplicity. If P_{s,t}(x, dy) is absolutely continuous with respect to dy, its density is denoted by p(s, x, t, y).

Theorem 7.10 If the transition density p(s, x, t, y) of a diffusion process is measurable in all its arguments, then for each s < u < t, each x, and almost all y,

∫ p(s, x, u, z) p(u, z, t, y) dz = p(s, x, t, y).

Theorem 7.11 Assume that the transition density p(s, x, t, y) of a diffusion process satisfies:

1. For 0 ≤ s < t with t − s > δ > 0, p(s, x, t, y) is continuous and bounded in s, t, x.

2. p is twice differentiable in x and once differentiable in s.

Then for 0 < s < t, p(s, x, t, y) satisfies the backward Kolmogorov equation:

∂p/∂s (s, x, t, y) = −b(s, x) ∂p/∂x (s, x, t, y) − (1/2) σ(s, x) ∂²p/∂x² (s, x, t, y).


In the backward Kolmogorov equation we differentiate in the x-variable. An integration by parts of the formula gives:

Theorem 7.12 Assume that the transition density p(s, x, t, y) of a diffusion process is such that the partial derivatives ∂p/∂t (s, x, t, y), ∂p/∂y (s, x, t, y), ∂²p/∂y² (s, x, t, y) exist, and that ∂σ/∂y (t, y) and ∂b/∂y (t, y) exist. Then Kolmogorov's forward equation (the Fokker–Planck equation) holds for s < t:

∂p/∂t (s, x, t, y) = −∂/∂y ( b(t, y) p(s, x, t, y) ) + (1/2) ∂²/∂y² ( σ(t, y) p(s, x, t, y) ).
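For constant coefficients the transition density is Gaussian and the forward equation can be checked directly. The sketch below (our own illustration; constants are arbitrary) takes p(s, x, t, y) = N(y; x + b(t − s), σ(t − s)) and verifies the Fokker–Planck equation with finite differences at one point:

```python
import math

# Finite-difference check of the Fokker-Planck equation of Theorem 7.12 for
# constant b and sigma, where p(s, x, t, y) = N(y; x + b*(t - s), sigma*(t - s)).
b, sigma = 0.4, 0.9

def p(t, y, x=0.0, s=0.0):
    var = sigma * (t - s)
    return math.exp(-(y - x - b * (t - s)) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

t, y, h = 1.0, 0.5, 1e-4
dp_dt = (p(t + h, y) - p(t - h, y)) / (2 * h)
dp_dy = (p(t, y + h) - p(t, y - h)) / (2 * h)
d2p_dy2 = (p(t, y + h) - 2 * p(t, y) + p(t, y - h)) / h ** 2

# residual of  dp/dt + d/dy(b p) - (1/2) d^2/dy^2(sigma p)  (constant coefficients)
fp_residual = abs(dp_dt - (-b * dp_dy + 0.5 * sigma * d2p_dy2))
```

The residual is dominated by finite-difference truncation and rounding error, not by any failure of the PDE.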

7.7.2 Solutions of SDE as Markov process

We consider the time homogeneous SDE E(σ, b).

Theorem 7.13 If uniqueness in law holds, the solution is a Markov process.

Definition 7.14 Let (F_t(x), t < τ(x)) be a solution to the SDE with initial value x. For f : R^d → R bounded measurable we define

P_t f(x) = E[ f(F_t(x)) 1_{t<τ(x)} ].

We say that (P_t) is the probability semigroup of the SDE.

In the rest of the section we assume that for each x0, there is a unique globalsolution Ft(x0) to E(σ, b). The solution is also denoted by (xt).

Let f : R^d → R be C². We define a linear operator L : C²(R^d) → C(R^d) by

L f(x) = (1/2) ∑_{i,j=1}^d ( ∑_{k=1}^m σ_k^i(x) σ_k^j(x) ) ∂²f/∂x_i∂x_j (x) + ∑_{j=1}^d b^j(x) ∂f/∂x_j (x).

This linear operator is the infinitesimal generator of the SDE E(σ, b). We do not discuss its domain.
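The generator formula can be sanity-checked against the moments of a known solution. The sketch below (our own illustration, with arbitrary parameters) takes the one-dimensional SDE dx_t = a x_t dB_t + b x_t dt of Example 7.3 and f(x) = x²; then L f(x) = (a² + 2b)x², which should equal (d/dt) E f(x_t) at t = 0, since E x_t² = x² e^{(a²+2b)t} for geometric Brownian motion:

```python
import math

# Generator check for dx_t = a*x_t dB_t + b*x_t dt and f(x) = x^2:
#   L f(x) = (1/2)(a x)^2 f''(x) + b x f'(x) = (a^2 + 2 b) x^2,
# which should equal d/dt E f(x_t) at t = 0, where E x_t^2 = x^2 e^{(a^2+2b)t}.
a, b, x = 0.5, 0.2, 1.3
Lf = 0.5 * (a * x) ** 2 * 2 + b * x * (2 * x)

def second_moment(t):
    return x ** 2 * math.exp((a ** 2 + 2 * b) * t)

h = 1e-6
deriv_at_0 = (second_moment(h) - second_moment(0.0)) / h
gap = abs(Lf - deriv_at_0)
```

The gap is only the forward-difference error of the exponential moment formula.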

Exercise 7.4 If f : R^d → R is C², prove that

f(x_t) = f(x_0) + ∑_{j=1}^d ∑_{k=1}^m ∫_0^t ∂f/∂x_j (x_s) σ_k^j(x_s) dB_s^k + ∫_0^t L f(x_s) ds.  (7.7)

This leads to:


Lemma 7.14 • If f ∈ C²(R^d, R), then

M_t^f := f(x_t) − f(x_0) − ∫_0^t L f(x_s) ds

is a local martingale.

• Let (W_t) be the canonical process and P_x = (x_·)_* P. Then for any f ∈ C²(R^d, R),

f(W_t) − f(W_0) − ∫_0^t L f(W_s) ds

is a local martingale under P_x.

Remark 7.2 If f ∈ C_K^∞(R^d, R) (C^∞ smooth functions with compact support), and σ and b are continuous, bounded (and Lipschitz continuous), then

P_t f(x) = f(x) + ∫_0^t P_s(L f)(x) ds.

Proposition 7.15 Let u_t be a bounded regular solution to the Cauchy problem for the parabolic equation

∂_t u_t = L u_t,  u_0 = f.

Then u_t(x) = E u_0(x_t).

Proof We apply Itô's formula to (s, x) ↦ u_{T−s}(x):

u_{T−t}(x_t) = u_T(x) + ∫_0^t ( ∂/∂s (u_{T−s}) + L u_{T−s} )(x_s) ds + ∑_{k=1}^m ∑_{i=1}^d ∫_0^t ∂u_{T−s}/∂x_i (x_s) σ_k^i(x_s) dB_s^k.

The last term is a true martingale and vanishes after taking the expectation. Note that ∂/∂s (u_{T−s}) = −(∂/∂r) u_r |_{r=T−s}, so the middle term vanishes as well. Taking t → T we see that

u_T(x) = E u_0(x_T) = P_T f(x).

The equation

∂_t U_t + L U_t = 0,  0 ≤ t ≤ T,  U_T = g

is Kolmogorov's backward equation. If P_{T−t} g is sufficiently smooth that we can apply Itô's formula, it is clear that U_t = P_{T−t} g is the solution to this equation.


Lemma 7.16 1. If f ∈ C²(R^d, R), then

M_t^f := f(x_t) − f(x_0) − ∫_0^t L f(x_s) ds

is a local martingale.

2. Let (W_t) be the canonical process on the canonical probability space, and let P_x = (x_·)_*(P). Then for any f ∈ C²(R^d, R),

f(W_t) − f(W_0) − ∫_0^t L f(W_s) ds

is a local martingale.

This can be checked by Ito’s formula.

7.8 Existence of Solutions and The Martingale Problem

Definition 7.15 A probability measure µ on the canonical space (W^d, B(W^d)) solves the martingale problem for L if for any f ∈ C_K²,

f(W_t) − f(W_0) − ∫_0^t L f(W_s) ds

is a local martingale under µ.

The martingale problem method has been used and developed by Stroock and Varadhan.

Theorem 7.17 (Ikeda–Watanabe, p. 169) The existence of solutions to equation E(σ, b) is equivalent to the solvability of the martingale problem associated to L.

The idea of the proof is to construct Brownian motions B_t^i. Let f(x) = x^i and f(x) = x^i x^j, and compute L f. Then M_t^i = x_t^i − x_0^i − ∫_0^t b^i(x_s) ds is a local martingale; it is essentially

∑_k ∫_0^t σ_k^i(x_s) dB_s^k.

One can compute the brackets 〈M^i, M^j〉_t, which equal

∫_0^t ∑_k σ_k^i σ_k^j (x_s) ds.


7.9 Localisation

Theorem 7.18 If σ_k and b are locally Lipschitz continuous and grow at most linearly, then there is a unique global strong solution to E(σ, b).

Proof Let τ_n = inf{t : |x_t| ≥ n} and let f(x) = |x|². Then |x_t|² − |x_0|² − ∫_0^t L f(x_s) ds is a local martingale. Check that there exists C such that L f(x) ≤ C(1 + |x|²). This gives

E|x_{t∧τ_n}|² ≤ E|x_0|² + E ∫_0^{t∧τ_n} C(1 + |x_s|²) ds ≤ E|x_0|² + Ct + C E ∫_0^t |x_{s∧τ_n}|² ds.

By Gronwall's inequality,

E|x_{t∧τ_n}|² ≤ (E|x_0|² + Ct) e^{Ct}.

By Fatou's lemma, E|x_t|² < ∞, and so |x_t| < ∞ almost surely.

Theorem 7.19 Suppose that the coefficients of the SDE are locally Lipschitz continuous. Then for each initial value x_0 there is a unique strong solution (x_t, t < τ). Furthermore, lim_{t↑τ} |x_t| = ∞ on {ω : τ(ω) < ∞}.

Proof Write b = (b_1, . . . , b_d) in components. For each N, let b_j^N, j = 1, . . . , d, be a globally Lipschitz continuous function with b_j^N = b_j if |x| ≤ N and b_j^N = 0 if |x| > N + 1. Let b^N = (b_1^N, . . . , b_d^N). We define a sequence σ^N in the same way. Let x_t^N be the unique strong global solution to the SDE

dx_t^N = σ^N(x_t^N) dB_t + b^N(x_t^N) dt.

Let τ_N be the first time that |x_t^N| is greater than or equal to N. Then x_t^N agrees with x_t^{N+1} for t < τ_N, and τ_N increases with N. Define

x_t(ω) = x_t^N(ω) on {ω : t < τ_N(ω)}.

Then x_t is defined up to t < τ, where

τ = sup_N τ_N.

Note that the exit time of x_t from B_N is τ_N, and lim_{t↑τ(ω)} |x_t(ω)| = ∞ on {τ(ω) < ∞}. The sets Ω_N = {ω : t < τ_N(ω)} increase with N, and

P(∪_N Ω_N) = lim_{N→∞} P({ω : t < τ_N(ω)}).

On {t < τ(x_0)} = ∪_N Ω_N we have patched together a solution x_t. It is now clear that x_t is a maximal solution of E(σ, b), and it is a strong solution.

Theorem 7.20 Suppose that for k = 1, . . . , m, σ_k : R^d → R^d and b : R^d → R^d are locally Lipschitz continuous, i.e. for each N ∈ N there exists a number K_N such that for all x, y with |x| ≤ N, |y| ≤ N,

|σ_k(x) − σ_k(y)| ≤ K_N |x − y|,  |b(x) − b(y)| ≤ K_N |x − y|.

Then pathwise uniqueness holds.

Proof We prove the case d = m = 1. Let (x_t, t < τ) and (y_t, t < τ̃) be two solutions to

x_t = x_0 + ∫_0^t σ(x_s) dB_s + ∫_0^t b(x_s) ds

with the same initial value and the same Brownian motion, and with life times τ and τ̃. Let

τ_N = inf{t > 0 : |x_t| ≥ N},  τ̃_N = inf{t > 0 : |y_t| ≥ N}.

Then

x_{t∧τ_N} = x_0 + ∫_0^{t∧τ_N} σ(x_s) dB_s + ∫_0^{t∧τ_N} b(x_s) ds.

Furthermore τ = sup_N τ_N, and on {τ < ∞}, lim_{N→∞} |x_{τ_N}| = ∞.

Let ζ_N = min{τ_N, τ̃_N} and let T > 0. For t ≤ T,

|x_{t∧ζ_N} − y_{t∧ζ_N}|² = ( ∫_0^{t∧ζ_N} (σ(x_s) − σ(y_s)) dB_s + ∫_0^{t∧ζ_N} (b(x_s) − b(y_s)) ds )²

≤ 2 ( ∫_0^{t∧ζ_N} (σ(x_s) − σ(y_s)) dB_s )² + 2 ( ∫_0^{t∧ζ_N} (b(x_s) − b(y_s)) ds )²

≤ 2 ( ∫_0^t (σ(x_{s∧ζ_N}) − σ(y_{s∧ζ_N})) dB_s )² + 2 (t∧ζ_N) ∫_0^{t∧ζ_N} ( b(x_{s∧ζ_N}) − b(y_{s∧ζ_N}) )² ds.

In the last step we applied Hölder's inequality,

∫_0^t |f(s) g(s)| ds ≤ ( ∫_0^t |f(s)|^p ds )^{1/p} ( ∫_0^t |g(s)|^q ds )^{1/q},

with p = q = 2. Apply the Burkholder–Davis–Gundy inequality to see that, for some constant C,

E ( ∫_0^t ( σ(x_{s∧ζ_N}) − σ(y_{s∧ζ_N}) ) dB_s )² ≤ C E 〈 ∫_0^· ( σ(x_{s∧ζ_N}) − σ(y_{s∧ζ_N}) ) dB_s 〉_t = C E ∫_0^t ( σ(x_{s∧ζ_N}) − σ(y_{s∧ζ_N}) )² ds.

Hence

E|x_{t∧ζ_N} − y_{t∧ζ_N}|² ≤ 2C E ∫_0^t ( σ(x_{s∧ζ_N}) − σ(y_{s∧ζ_N}) )² ds + 2t ∫_0^t E ( b(x_{s∧ζ_N}) − b(y_{s∧ζ_N}) )² ds.

By local Lipschitz continuity,

E|x_{t∧ζ_N} − y_{t∧ζ_N}|² ≤ ( 2C K_N² + 2 K_N² t ) ∫_0^t E|x_{s∧ζ_N} − y_{s∧ζ_N}|² ds.

By Gronwall's inequality, E|x_{t∧ζ_N} − y_{t∧ζ_N}|² = 0 for all t ≤ T. Since T is arbitrary, E|x_{t∧ζ_N} − y_{t∧ζ_N}|² = 0 for all t. This implies that x_t = y_t on {t < τ_N ∧ τ̃_N}. By the sample continuity of (x_t) and (y_t), we see that τ_N = τ̃_N and x_t = y_t on {t < τ_N}. Thus τ = τ̃ and x_t = y_t for all t < τ.


Chapter 8

Girsanov Transform

Let (Ω, F) be a measurable space, and let Q, P be probability measures with Q absolutely continuous with respect to P (denoted Q << P). Then there exists a random variable D ∈ L¹(Ω, F, P) such that for any bounded measurable X : Ω → R,

∫_Ω X dQ = ∫_Ω X D dP.

If there is risk of confusion, we write E^P for the expectation with respect to P. Note that E^P(D) = 1. If, furthermore, 1/D ∈ L¹(Ω, F, Q), then Q is equivalent to P. Indeed, for any A ∈ F,

P(A) = ∫_A (1/D) D dP = ∫_A (1/D) dQ.

If Q(A) = 0, the integral on the right hand side vanishes, and so P(A) = 0. Conversely, if P is equivalent to Q, then dP/dQ = 1/D and 1/D ∈ L¹(Ω, F, Q).

8.1 Girsanov Theorem For Martingales (Lecture 28)

Let P and Q be equivalent probability measures on (Ω, F). Let f = dQ/dP. Then f ≥ 0 and E f = 1. Let f_t = E(f | F_t). Then (f_t) is a strictly positive integrable martingale. Let (F_t) be a complete and right continuous filtration, and let D ∈ L¹(Ω, F, P). We define

D_t = E{D | F_t}

and take (D_t) to be a càdlàg version. Then (D_t, t < ∞) is a closed martingale; see Proposition 4.15.


Proposition 8.1 Let (f_t) be a strictly positive continuous local martingale. Then there is a continuous local martingale N_t such that

f_t = e^{N_t − (1/2)〈N,N〉_t}.

Proof By Itô's formula,

log f_t = log f_0 + ∫_0^t df_s / f_s − (1/2) ∫_0^t (1/f_s²) d〈f, f〉_s.

Let N_t = log f_0 + ∫_0^t (1/f_s) df_s. Then

〈N, N〉_t = ∫_0^t (1/f_s²) d〈f, f〉_s,

and log f_t = N_t − (1/2)〈N, N〉_t.

Let (N_t) be a continuous local martingale. We define Q on F_t by dQ/dP = e^{N_t − (1/2)〈N,N〉_t}. If E(e^{N_t − (1/2)〈N,N〉_t}) = 1, then Q is a probability measure on (F_t). This is equivalent to (e^{N_s − (1/2)〈N,N〉_s}, s ≤ t) being a true martingale.

Theorem 8.2 (Novikov criterion) Let (N_t) be a continuous local martingale. The exponential local martingale e^{N_t − (1/2)〈N,N〉_t} is a true martingale if E( e^{(1/2)〈N,N〉_t} ) < ∞ for all t ≥ 0.
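For N_t = b B_t the Novikov condition is trivially satisfied, since 〈N, N〉_t = b²t, and the martingale property E e^{b B_t − b²t/2} = 1 can be verified by Monte Carlo. A minimal sketch (our own illustration; the seed and constants are arbitrary):

```python
import numpy as np

# Monte Carlo check of E exp(b*B_t - b^2 t / 2) = 1 for N_t = b*B_t;
# Novikov's criterion holds trivially since <N,N>_t = b^2 t < infinity.
rng = np.random.default_rng(3)
b, t, n = 0.5, 1.0, 1_000_000
B_t = rng.normal(0.0, np.sqrt(t), size=n)    # samples of B_t ~ N(0, t)
mc_mean = np.exp(b * B_t - 0.5 * b ** 2 * t).mean()
```

The sample mean concentrates around 1 with standard error of order 10^{-3} at this sample size.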

8.2 Girsanov for Martingales

Let P and Q be equivalent probability measures on (Ω, F) and let (F_t) be a standard filtration. Then

dQ/dP |_{F_t} = e^{N_t − (1/2)〈N,N〉_t}

for some continuous local martingale (N_t).

Proposition 8.3 Let f_t = e^{N_t − (1/2)〈N,N〉_t}, where (N_t) is a continuous local martingale, and let (M_t) be a continuous local martingale. Then

∫_0^t (1/f_s) d〈f, M〉_s = 〈M, N〉_t.


Proof Since df_t = f_t dN_t, for any sample continuous local martingale (M_t) we have

〈f, M〉_t = ∫_0^t f_s d〈M, N〉_s.

In particular,

∫_0^t (1/f_s) d〈f, M〉_s = 〈M, N〉_t.

Theorem 8.4 (Girsanov Theorem) Let (M_t) be a continuous (F_t)-martingale w.r.t. P. Then

M̃_t = M_t − 〈M, N〉_t

is a Q-martingale.

Proof We need only prove the case f_0 = 1, in which case N_0 = 0. Let s < t and A ∈ F_s; we prove that

∫_A M̃_t dQ = ∫_A M̃_s dQ,

equivalently,

∫_A M̃_t f_t dP = ∫_A M̃_s f_s dP.

It is sufficient to prove that M̃_t f_t = M_t f_t − 〈M, N〉_t f_t is a P-martingale. We apply Itô's formula:

〈M, N〉_t f_t = ∫_0^t f_s d〈M, N〉_s + ∫_0^t 〈M, N〉_s df_s = 〈f, M〉_t + ∫_0^t 〈M, N〉_s df_s.

Thus

M̃_t f_t = M_t f_t − 〈M, N〉_t f_t = M_t f_t − 〈f, M〉_t − ∫_0^t 〈M, N〉_s df_s.

Both M_t f_t − 〈f, M〉_t and ∫_0^t 〈M, N〉_s df_s, the latter being a stochastic integral w.r.t. a martingale, are martingales w.r.t. P. This completes the proof.


Corollary 8.5 Let (B_t) be a Brownian motion with respect to P. Then B̃_t = B_t − 〈B, N〉_t is a Q-Brownian motion.

Proof This is clear by Lévy's characterisation theorem: (B̃_t) is a continuous Q-local martingale with 〈B̃, B̃〉_t = t.

Example 8.1 Let b : R → R be Borel measurable and such that there is a unique (in law) solution to the SDE

dx_t = dB_t + b(x_t) dt.  (8.1)

Then for any Borel measurable set A,

P(x_t ∈ A) = E( 1_{x_0+B_t ∈ A} e^{∫_0^t b(x_0+B_s) dB_s − (1/2) ∫_0^t b²(x_0+B_s) ds} ).

Proof Let y_t = x_0 + B_t and N_t = ∫_0^t b(y_s) dB_s. Let Q be defined by the formula

dQ/dP = e^{∫_0^t b(x_0+B_s) dB_s − (1/2) ∫_0^t b²(x_0+B_s) ds}.

Then

〈N, B〉_t = ∫_0^t b(x_0 + B_s) ds.

Let

B̃_t = B_t − ∫_0^t b(y_s) ds.

By Girsanov's theorem, (B̃_t) is a Q-Brownian motion, and

dy_t = dB_t = dB̃_t + b(y_t) dt.

By uniqueness in law, (y_t) under Q has the same distribution as (x_t) under P. That is,

P(x_t ∈ A) = Q(y_t ∈ A) = E( 1_{x_0+B_t ∈ A} e^{∫_0^t b(x_0+B_s) dB_s − (1/2) ∫_0^t b²(x_0+B_s) ds} ).
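For a constant drift b everything in Example 8.1 is explicit: x_t = x_0 + B_t + bt, and the Girsanov weight is e^{b B_t − b²t/2}. The sketch below (our own illustration; the seed, drift, and threshold are arbitrary) estimates P(x_t ≥ c) both directly and by reweighting driftless paths, and compares with the exact Gaussian probability:

```python
import numpy as np
from math import erf, sqrt

# Monte Carlo illustration of Example 8.1 with constant drift b:
#   P(x_0 + B_t + b t >= c) = E[ 1_{x_0 + B_t >= c} exp(b B_t - b^2 t / 2) ].
rng = np.random.default_rng(4)
x0, b, t, c, n = 0.0, 0.7, 1.0, 1.0, 400_000
B_t = rng.normal(0.0, np.sqrt(t), size=n)

weights = np.exp(b * B_t - 0.5 * b ** 2 * t)
girsanov_est = np.mean((x0 + B_t >= c) * weights)   # reweighted driftless paths
direct_est = np.mean(x0 + B_t + b * t >= c)          # paths carrying the drift

def normal_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

exact = 1.0 - normal_cdf((c - x0 - b * t) / sqrt(t))
```

Both estimators agree with the exact value to Monte Carlo accuracy, illustrating the change-of-measure identity.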


Chapter 9

Appendix

9.1 Lyapunov Function Test

Definition 9.1 A C² function V : R^d → R_+ is a Lyapunov function for the explosion problem associated to an infinitesimal generator L if

(1) V ≥ 0,

(2) lim_{|x|→∞} V(x) = ∞, and

(3) LV ≤ cV + K for some constants c and K.

We will use Fatou's Lemma below: for non-negative measurable f_n,

∫ lim inf_{n→∞} f_n dµ ≤ lim inf_{n→∞} ∫ f_n dµ.

Proposition 9.1 Assume that the σ_k are continuous and that the SDE E(σ, b) has a solution x_t. If there is a Lyapunov function V for the generator L, then the SDE does not explode.

Proof Let τ be the life time of (x_t) and let ζ < τ be a stopping time. Since V is C² and the coefficients are continuous, LV is locally bounded. Apply Itô's formula to V and x_{t∧ζ}. Then

V(x_{t∧ζ}) = V(x_0) + ∫_0^{t∧ζ} (LV)(x_s) ds + ∑_{k=1}^m ∫_0^{t∧ζ} ∑_{j=1}^d ∂V/∂x_j (x_s) σ_k^j(x_s) dB_s^k

≤ V(x_0) + ∫_0^{t∧ζ} ( cV(x_s) + K ) ds + ∑_{k=1}^m ∫_0^{t∧ζ} ∑_{j=1}^d ∂V/∂x_j (x_s) σ_k^j(x_s) dB_s^k

≤ V(x_0) + Kt + c ∫_0^t V(x_{s∧ζ}) ds + ∑_{k=1}^m ∫_0^{t∧ζ} ∑_{j=1}^d ∂V/∂x_j (x_s) σ_k^j(x_s) dB_s^k.


Let τ_N = inf{t : |x_t| ≥ N} and take ζ = τ_N above. Since dV and the σ_k are continuous, and therefore bounded on the ball of radius N, the local martingale term is a true martingale. Note that V and LV ≤ cV + K are also bounded on B_N. Taking expectations we see that

E V(x_{t∧τ_N}) ≤ V(x_0) + Kt + c ∫_0^t E V(x_{s∧τ_N}) ds.

By Gronwall's lemma,

E V(x_{t∧τ_N}) ≤ ( V(x_0) + Kt ) e^{ct}.

Write V(N) for the infimum of V on {|x| = N}. Since

E V(x_{t∧τ_N}) = E V(x_t) 1_{t<τ_N} + E V(x_{τ_N}) 1_{t≥τ_N} ≥ V(N) P(t ≥ τ_N),

we obtain

V(N) P(t ≥ τ_N) ≤ ( V(x_0) + Kt ) e^{ct}.

By Fatou's lemma,

P(τ ≤ t) = E 1_{τ≤t} ≤ lim_{N→∞} P(t ≥ τ_N).

Since lim_{N→∞} V(N) = ∞,

lim_{N→∞} P(t ≥ τ_N) ≤ lim_{N→∞} (1/V(N)) ( V(x_0) + Kt ) e^{ct} = 0.

So τ ≥ t almost surely for every t, and there is no explosion.

Example 9.1 1. Assume that ∑_{k=1}^m |σ_k(x)|² ≤ c(1 + |x|²) and 〈b(x), x〉_{R^d} ≤ c(1 + |x|²). Then 1 + |x|² is a Lyapunov function:

L(|x|² + 1) = ∑_{k=1}^m ∑_{i=1}^d (σ_k^i(x))² + 2 ∑_{l=1}^d b^l(x) x^l = ∑_{k=1}^m |σ_k(x)|² + 2〈b(x), x〉 ≤ 3c(1 + |x|²).

2. Show that the SDE below does not explode:

dx_t = (y_t² − x_t²) dB_t^1 + 2 x_t y_t dB_t^2,

dy_t = −2 x_t y_t dB_t^1 + (x_t² − y_t²) dB_t^2.


3. Let r(x) = |(x_1, . . . , x_d)| = √(∑ x_i²) be the radius function, and let φ be a harmonic function:

φ(x) = c_1 |x| + c_2, dimension 1,

φ(x) = c_1 log |x| + c_2, dimension 2,

φ(x) = c_1 / |x|^{n−2} + c_2, dimension n > 2.

In dimensions 1 and 2, harmonic functions can be used to build Lyapunov functions for generators of the form C∆, where C may be a function; modify the function inside the ball of radius one so that it is smooth. Harmonic functions in dimension 3 or greater are not useful for explosion problems.

9.1.1 Strong Completeness, flow

Suppose that pathwise uniqueness holds and the SDE does not explode.

Definition 9.2 If for each point x there is a solution F_t(x, ω) to

dx_t = ∑_i σ_i(x_t) dB_t^i + b(x_t) dt,

and there is a version of F_t(x, ω) such that (t, x) ↦ F_t(x) is continuous, we say that the SDE is strongly complete.

Definition 9.3 Let ξ be F_s-measurable. We denote by F_{s,t}(ξ) the solution to

x_t = ξ + ∑_i ∫_s^t σ_i(x_r) dB_r^i + ∫_s^t b(x_r) dr.

For simplicity let F_t = F_{0,t}.

Definition 9.4 For a stopping time S we define the shift operator θ_S by θ_S B = B_{S+·} − B_S.

The process (θ_S B)_t := B_{S+t} − B_S is an (F_{S+t})-Brownian motion. If (B_t) is the canonical process on the Wiener space, this reads θ_S(ω)(t) = ω(S + t) − ω(S).


Theorem 9.2 Let 0 ≤ S ≤ T be stopping times, and assume that there is a unique global strong solution (F_t(·, B), t ≥ 0) to the SDE. Then the flow property holds:

F_{S,T}(F_S(x_0, B)) = F_T(x_0, B).  (9.1)

And the cocycle property holds:

F_{T−S}(F_S(x, ω), θ_S(ω)) = F_T(x, ω).  (9.2)

Proof The flow property follows from the pathwise uniqueness of the solution.

Remark. We do not prove this here. Given the existence of a strong solution and pathwise uniqueness, the cocycle property implies the Markov property, and the cocycle property with stopping times implies the strong Markov property. Indeed, since θ_S(B) is a Brownian motion independent of F_S,

E f(F_{S+t}(x, B)) = E [ E( f(F_t(F_S(x), θ_S(B))) | F_S ) ] = E [ (E f(F_t(·, θ_S(B))))(F_S(x)) ] = E [ P_t f(F_S(x)) ].
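For deterministic times S < T lying on the discretisation grid, the Euler–Maruyama scheme satisfies the cocycle identity (9.2) exactly: restarting at time S from the time-S value, driven by the shifted increments (the discrete analogue of θ_S B), reproduces the full run. The sketch below is a sanity check of this, for a scalar SDE with illustrative coefficients not taken from the notes.

```python
import numpy as np

rng = np.random.default_rng(1)

def euler(x0, dB, dt, sigma, b):
    """One-dimensional Euler-Maruyama solution map driven by increments dB."""
    x = x0
    for db in dB:
        x = x + sigma(x) * db + b(x) * dt
    return x

sigma = lambda x: 0.3 * np.cos(x)
b = lambda x: -0.5 * x
n, dt = 1000, 0.001            # T = 1.0
s = 400                        # S = 0.4, a grid point
dB = rng.normal(0.0, np.sqrt(dt), size=n)

# F_T(x): solve over [0, T] in one go.
xT = euler(1.0, dB, dt, sigma, b)

# F_{T-S}(F_S(x), theta_S B): restart at time S from F_S(x), driven by
# the shifted increments dB[s:].
xS = euler(1.0, dB[:s], dt, sigma, b)
xT2 = euler(xS, dB[s:], dt, sigma, b)

assert np.isclose(xT, xT2)     # the discrete scheme realises (9.2) exactly
```

The two computations perform the identical sequence of arithmetic operations, so the agreement here is exact rather than merely up to discretisation error.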



Bibliography

[1] Ludwig Arnold. Random dynamical systems. Springer Monographs in Mathematics. Springer-Verlag, Berlin, 1998.

[2] Patrick Billingsley. Convergence of probability measures. Wiley Series in Probability and Statistics. John Wiley & Sons Inc., New York, second edition, 1999. A Wiley-Interscience Publication.

[3] V. I. Bogachev. Measure theory. Vol. I, II. Springer-Verlag, Berlin, 2007.

[4] K. D. Elworthy. Stochastic differential equations on manifolds, volume 70 of London Mathematical Society Lecture Note Series. Cambridge University Press, Cambridge, 1982.

[5] K. D. Elworthy, Y. Le Jan, and Xue-Mei Li. On the geometry of diffusion operators and stochastic flows, volume 1720 of Lecture Notes in Mathematics. Springer-Verlag, Berlin, 1999.

[6] K. D. Elworthy, Xue-Mei Li, and M. Yor. The importance of strictly local martingales; applications to radial Ornstein-Uhlenbeck processes. Probab. Theory Related Fields, 115(3):325–355, 1999.

[7] K. David Elworthy, Yves Le Jan, and Xue-Mei Li. The geometry of filtering. Frontiers in Mathematics. Birkhäuser Verlag, Basel, 2010.

[8] Gerald B. Folland. Real analysis. Modern techniques and their applications. Pure and Applied Mathematics (New York). John Wiley & Sons Inc., New York, second edition, 1999. A Wiley-Interscience Publication.

[9] Avner Friedman. Stochastic differential equations and applications. Vol. 1. Academic Press [Harcourt Brace Jovanovich Publishers], New York, 1975. Probability and Mathematical Statistics, Vol. 28.



[10] Avner Friedman. Stochastic differential equations and applications. Vol. 2. Academic Press [Harcourt Brace Jovanovich Publishers], New York, 1976. Probability and Mathematical Statistics, Vol. 28.

[11] I. I. Gīhman and A. V. Skorohod. Stochastic differential equations. Springer-Verlag, New York, 1972. Translated from the Russian by Kenneth Wickwire. Ergebnisse der Mathematik und ihrer Grenzgebiete, Band 72.

[12] R. Z. Has'minskiĭ. Stochastic stability of differential equations, volume 7 of Monographs and Textbooks on Mechanics of Solids and Fluids: Mechanics and Analysis. Sijthoff & Noordhoff, Alphen aan den Rijn, 1980. Translated from the Russian by D. Louvish.

[13] Nobuyuki Ikeda and Shinzo Watanabe. Stochastic differential equations and diffusion processes, volume 24 of North-Holland Mathematical Library. North-Holland Publishing Co., Amsterdam, second edition, 1989.

[14] Olav Kallenberg. Foundations of modern probability. Probability and its Applications (New York). Springer-Verlag, New York, second edition, 2002.

[15] Ioannis Karatzas and Steven E. Shreve. Brownian motion and stochastic calculus, volume 113 of Graduate Texts in Mathematics. Springer-Verlag, New York, second edition, 1991.

[16] H. Kunita. Lectures on stochastic flows and applications, volume 78 of Tata Institute of Fundamental Research Lectures on Mathematics and Physics. Published for the Tata Institute of Fundamental Research, Bombay, 1986.

[17] Hiroshi Kunita. Stochastic flows and stochastic differential equations, volume 24 of Cambridge Studies in Advanced Mathematics. Cambridge University Press, Cambridge, 1990.

[18] Roger Mansuy and Marc Yor. Aspects of Brownian motion. Universitext. Springer-Verlag, Berlin, 2008.

[19] Peter Mörters and Yuval Peres. Brownian motion. Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press, Cambridge, 2010. With an appendix by Oded Schramm and Wendelin Werner.

[20] Bernt Øksendal. Stochastic differential equations. An introduction with applications. Universitext. Springer-Verlag, Berlin, sixth edition, 2003.

[21] K. R. Parthasarathy. Probability measures on metric spaces. AMS Chelsea Publishing, Providence, RI, 2005. Reprint of the 1967 original.



[22] Philip E. Protter. Stochastic integration and differential equations, volume 21 of Applications of Mathematics (New York). Springer-Verlag, Berlin, second edition, 2004. Stochastic Modelling and Applied Probability.

[23] Michael Reed and Barry Simon. Methods of modern mathematical physics. I. Functional analysis. Academic Press Inc. [Harcourt Brace Jovanovich Publishers], New York, second edition, 1980.

[24] Daniel Revuz and Marc Yor. Continuous martingales and Brownian motion, volume 293 of Grundlehren der Mathematischen Wissenschaften [Fundamental Principles of Mathematical Sciences]. Springer-Verlag, Berlin, third edition, 1999.

[25] L. C. G. Rogers and David Williams. Diffusions, Markov processes, and martingales. Vol. 2. Itô calculus. Wiley Series in Probability and Mathematical Statistics. John Wiley & Sons Inc., New York, 1987.

[26] L. C. G. Rogers and David Williams. Diffusions, Markov processes, and martingales. Vol. 1. Foundations. Wiley Series in Probability and Mathematical Statistics. John Wiley & Sons Ltd., Chichester, second edition, 1994.

[27] Daniel W. Stroock. An introduction to Markov processes, volume 230 of Graduate Texts in Mathematics. Springer-Verlag, Berlin, 2005.

[28] Daniel W. Stroock and S. R. Srinivasa Varadhan. Multidimensional diffusion processes, volume 233 of Grundlehren der Mathematischen Wissenschaften [Fundamental Principles of Mathematical Sciences]. Springer-Verlag, Berlin, 1979.

[29] S. R. S. Varadhan. Stochastic processes, volume 16 of Courant Lecture Notes in Mathematics. Courant Institute of Mathematical Sciences, New York, 2007.

[30] David Williams. Probability with martingales. Cambridge Mathematical Textbooks. Cambridge University Press, Cambridge, 1991.

[31] Marc Yor. Some aspects of Brownian motion. Part II. Some recent martingale problems. Lectures in Mathematics ETH Zürich. Birkhäuser Verlag, Basel, 1997.
