Stochastic Process, ACF, PACF, White Noise, Estimation

Arthur Berg · 2008-01-11



Outline

1 §2.1: Stationarity

2 §2.2: Autocovariance and Autocorrelation Functions

3 §2.4: White Noise

4 R: Random Walk

5 Homework 1b


Stochastic Process

Definition (stochastic process)

A stochastic process is a sequence of indexed random variables, denoted Z(ω, t), where ω belongs to a sample space and t belongs to an index set.

From here on out, we will simply write a stochastic process (or time series) as {Zt} (dropping the ω).

Notation. For the time units t1, t2, . . . , tn, denote the n-dimensional distribution function of Zt1, Zt2, . . . , Ztn as

FZt1,...,Ztn(x1, . . . , xn) = P(Zt1 ≤ x1, . . . , Ztn ≤ xn)

Example

Let Z1, . . . , Zn be iid N(0, 1). Then

FZt1,...,Ztn(x1, . . . , xn) = ∏_{j=1}^{n} Φ(xj)
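A quick numerical illustration of this factorization (an added sketch, not from the original slides; the sample size and the evaluation point (x1, x2) = (0.5, −1) are arbitrary choices): the empirical joint CDF of two iid standard normals is close to the product of the marginal Φ values.

> set.seed(1)
> z1 = rnorm(10000); z2 = rnorm(10000)   # two iid N(0,1) coordinates
> x1 = 0.5; x2 = -1                      # arbitrary evaluation point
> mean(z1 <= x1 & z2 <= x2)              # empirical joint CDF F(x1, x2)
> pnorm(x1) * pnorm(x2)                  # product of marginals, Phi(x1)*Phi(x2)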


Stationarity

Definition (Strong Stationarity (aka Strict, aka Complete))

A time series {Zt} is strongly stationary if, for every n and every shift h, the collection

{Zt1, Zt2, . . . , Ztn}

has the same joint distribution as the time-shifted set

{Zt1+h, Zt2+h, . . . , Ztn+h}

Strong stationarity implies the following:

All marginal distributions are equal, i.e. P(Zs ≤ c) = P(Zt ≤ c) for all s, t, and c. This is what is called “first-order stationary”.

The covariance of Zs and Zt (if it exists) is shift-invariant, i.e. cov(Zs+h, Zt+h) = cov(Zs, Zt) for any choice of h.

Strong stationarity typically assumes too much. This leads us to the weaker assumption of weak stationarity.


Mean Function

Definition (Mean Function)

The mean function of a time series {Zt} (if it exists) is given by

µt = E(Zt) = ∫_{−∞}^{∞} x ft(x) dx

Example (Mean of an iid MA(q))

Let at be iid N(0, σ²) and

Zt = at + θ1at−1 + θ2at−2 + · · ·+ θqat−q

then

µt = E(Zt) = E(at) + θ1 E(at−1) + θ2 E(at−2) + · · ·+ θq E(at−q) ≡ 0

(free of the time variable t)
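As a quick sanity check (an added sketch, not in the original slides; θ1 = 0.6, θ2 = 0.3, σ² = 1 and n = 10000 are arbitrary choices), simulate an MA(2) and look at its sample mean:

> a = rnorm(10000)                          # iid N(0,1) innovations a_t
> Z = filter(a, c(1, 0.6, 0.3), sides = 1)  # MA(2): Z_t = a_t + 0.6 a_{t-1} + 0.3 a_{t-2}
> mean(Z, na.rm = TRUE)                     # sample mean, close to the theoretical value 0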


White Noise

Definition (White Noise). White noise is a collection of uncorrelated random variables with constant mean and variance.

Notation. at ∼ WN(0, σ²) denotes white noise with mean zero and variance σ².

IID white noise. If as is independent of at for all s ≠ t, then at ∼ IID(0, σ²).

Gaussian white noise ⇒ IID. Suppose at is normally distributed. Uncorrelated + normality ⇒ independent, so it follows that at ∼ IID(0, σ²) (a stronger assumption).


Now your turn!

Mean of a Random Walk with Drift. Suppose Z0 = 0 and, for t > 0, Zt = δ + Zt−1 + at, where δ is a constant and at ∼ WN(0, σ²). What is the mean function of Zt? (That is, compute E[Zt].)

Note that

Zt = δ + Zt−1 + at
   = 2δ + Zt−2 + at + at−1    (since Zt−1 = δ + Zt−2 + at−1)
   ...
   = δt + ∑_{j=1}^{t} aj

Therefore we see

µt = E[Zt] = E[δt] + E[∑_{j=1}^{t} aj] = δt + ∑_{j=1}^{t} E[aj] = δt
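A Monte Carlo sketch of this result (added here, not part of the slides; δ = 0.2, t = 50, σ² = 1 and 5000 replications are arbitrary choices): averaging Zt over many simulated paths should give roughly δt.

> delta = 0.2; t = 50
> Zt = replicate(5000, delta*t + sum(rnorm(t)))   # Z_t = delta*t + a_1 + ... + a_t
> mean(Zt)                                        # close to the theoretical mean delta*t = 10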


The Covariance Function

Definition (Covariance Function)

The covariance function of a random sequence {Zt} is

γ(s, t) = cov(Zs,Zt) = E [(Zs − µs) (Zt − µt)]

So in particular, we have

var(Zt) = cov(Zt, Zt) = γ(t, t) = E[(Zt − µt)²]

Also note that γ(s, t) = γ(t, s) since cov(Zs,Zt) = cov(Zt,Zs).

Example (Covariance Function of White Noise)

Let at ∼ WN(0, σ²). By the definition of white noise, we have

γ(s, t) = cov(as, at) =
  σ²,  s = t
  0,   s ≠ t


Example (Covariance Function of MA(1))

Let at ∼ WN(0, σ²) and Zt = at + θat−1 (where θ is a constant); then

γ(s, t) = cov(at + θat−1, as + θas−1)
        = cov(at, as) + θ cov(at, as−1) + θ cov(at−1, as) + θ² cov(at−1, as−1)

If s = t, then

γ(s, t) = γ(t, t) = σ² + θ²σ² = (θ² + 1)σ²

If s = t − 1 or s = t + 1, then

γ(s, t) = γ(t, t + 1) = γ(t, t − 1) = θσ²

So all together we have

γ(s, t) =
  (θ² + 1)σ²,  if s = t
  θσ²,         if |s − t| = 1
  0,           otherwise
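Dividing the lag-one autocovariance by γ(t, t) gives the lag-one autocorrelation θ/(1 + θ²). A simulation sketch (added here, not in the slides; θ = 0.8 and n = 5000 are arbitrary) compares the sample lag-one autocorrelation with this value:

> theta = 0.8
> Z = arima.sim(n = 5000, model = list(ma = theta))   # MA(1) with N(0,1) innovations
> acf(Z, lag.max = 2, plot = FALSE)$acf[2]            # sample autocorrelation at lag 1
> theta / (1 + theta^2)                               # theoretical rho(1)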


Now your turn!

Covariance Function of a Random Walk with Drift. Suppose Z0 = 0 and, for t > 0, Zt = δ + Zt−1 + at, where δ is a constant and at ∼ WN(0, σ²). What is the covariance function of Zt? (That is, compute cov(Zs, Zt).)

From the representation

Zt = δt + ∑_{j=1}^{t} aj

we see

γ(s, t) = cov(Zs, Zt) = cov(∑_{j=1}^{s} aj, ∑_{k=1}^{t} ak) = ∑_{j=1}^{s} ∑_{k=1}^{t} cov(aj, ak) = min(s, t)σ²
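A Monte Carlo sketch of this covariance (added, not in the slides; s = 10, t = 30, σ² = 1 and 5000 replications are arbitrary choices). The drift term δt is constant, so it does not affect the covariance and is dropped below.

> s = 10; t = 30; nrep = 5000
> paths = replicate(nrep, cumsum(rnorm(t)))   # each column holds the partial sums a_1 + ... + a_k for k = 1..t
> cov(paths[s, ], paths[t, ])                 # sample covariance of Z_s and Z_t
> min(s, t)                                   # theoretical value min(s, t)*sigma^2 with sigma^2 = 1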


The Autocorrelation Function

Definition (Correlation Function)

The correlation function of a random sequence {Zt} is

ρ(s, t) = γ(s, t) / √(γ(s, s) γ(t, t))

The Cauchy–Schwarz inequality ⇒ |ρ(s, t)| ≤ 1.


Weak Stationarity

Definition (Weak Stationarity). A weakly stationary time series is a finite-variance process that

(i) has a mean function, µ, that is constant (so it doesn’t depend on t);

(ii) has a covariance function, γ(s, t), that depends on s and t only through the difference |s − t|.

From now on when we say stationary, we mean weakly stationary.

For stationary processes, the mean function is µ (no t).

Since γ(s, t) = γ(s + h, t + h) for stationary processes, we have γ(s, t) = γ(s − t, 0). Therefore we can view the covariance function as a function of only one variable (h = s − t); this is the autocovariance function γ(h).


Autocovariance and Autocorrelation Functions

Definition (Autocovariance Function). The autocovariance function of a stationary time series is

γh = γ(h) = E [(Zt+h − µ) (Zt − µ)]

(for any value of t).

Definition (Autocorrelation Function). The autocorrelation function of a stationary time series is

ρh = ρ(h) = γ(h) / γ(0)
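In R, both functions are estimated by acf(); a brief usage sketch (added here; the simulated MA(1) input is just an arbitrary example series):

> Z = arima.sim(n = 500, model = list(ma = 0.8))   # any stationary series will do
> acf(Z, type = "covariance")                      # sample autocovariance function
> acf(Z, type = "correlation")                     # sample autocorrelation function (the default)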


Properties of the Autocovariance/Autocorrelation Functions

γ0 = var(Zt);  ρ0 = 1
|γk| ≤ γ0;  |ρk| ≤ 1
γk = γ−k;  ρk = ρ−k

γk and ρk are positive semidefinite (p.s.d.), i.e.

∑_{i=1}^{n} ∑_{j=1}^{n} αi αj γ|ti−tj| ≥ 0   and   ∑_{i=1}^{n} ∑_{j=1}^{n} αi αj ρ|ti−tj| ≥ 0

for any set of times t1, . . . , tn and any real numbers α1, . . . , αn.

Proof.

Define X = ∑_{i=1}^{n} αi Zti; then

0 ≤ var(X) = ∑_{i=1}^{n} ∑_{j=1}^{n} αi αj cov(Zti, Ztj) = ∑_{i=1}^{n} ∑_{j=1}^{n} αi αj γ|ti−tj|
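Positive semidefiniteness can also be checked numerically. A sketch (added, not in the slides) builds the autocovariance matrix γ|ti−tj| for the MA(1) example with θ = 0.8, σ² = 1 at times 1, . . . , 10 and confirms its eigenvalues are nonnegative:

> theta = 0.8; sigma2 = 1
> gam = c((1 + theta^2)*sigma2, theta*sigma2, rep(0, 8))   # gamma_0, gamma_1, and gamma_2,...,gamma_9 for MA(1)
> G = toeplitz(gam)                                        # matrix with (i,j) entry gamma_{|i-j|}
> min(eigen(G, symmetric = TRUE)$values)                   # nonnegative (up to rounding), so G is p.s.d.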


White Noise

White noise, denoted by at ∼ WN(0, σ²), is by definition a weakly stationary process with autocovariance function

γk =
  σ²,  k = 0
  0,   k ≠ 0

and autocorrelation function

ρk =
  1,  k = 0
  0,  k ≠ 0

Not all white noise is boring like iid N(0, σ²). For example, stationary GARCH processes can have all the properties of white noise.
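A concrete sketch of such a process (added here, not in the slides): an ARCH(1) series with α0 = 0.2, α1 = 0.5, a simple special case of GARCH. The series itself is serially uncorrelated like white noise, yet its squares are clearly correlated, so it is far from iid.

> set.seed(1)
> n = 2000; a0 = 0.2; a1 = 0.5
> e = rnorm(n); x = numeric(n); x[1] = sqrt(a0)*e[1]
> for (t in 2:n) x[t] = sqrt(a0 + a1*x[t-1]^2)*e[t]   # ARCH(1): conditional variance depends on the past
> acf(x, plot = FALSE)$acf[2:4]                       # approximately 0: looks like white noise
> acf(x^2, plot = FALSE)$acf[2:4]                     # clearly nonzero: not independent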


Random Walk Simulation in R

Simulate the random walk Zt = 0.2 + Zt−1 + at where at is iid N(0, σ²).

Hint: use the representation Zt = 0.2t + ∑_{j=1}^{t} aj!

> w = rnorm(200,0,1)              # 200 iid N(0,1) innovations a_t
> x = cumsum(w)                   # random walk without drift
> wd = w + .2                     # innovations plus drift delta = 0.2
> xd = cumsum(wd)                 # random walk with drift: Z_t = 0.2t + sum of a_j
> plot.ts(xd, ylim=c(-5,55))      # plot the drifting walk
> lines(x)                        # overlay the driftless walk
> lines(.2*(1:200), lty="dashed") # overlay the mean function 0.2t
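One possible follow-up check (an added sketch, not from the slides): differencing the simulated walk recovers 0.2 + at, so its sample ACF should be near zero at all nonzero lags.

> acf(diff(xd))   # differenced series is just 0.2 + a_t, essentially white noise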


Homework 1b

Read the following sections from the textbook:

§2.5: Estimation of the Mean, Autocovariances, and Autocorrelations

§2.6: Moving Average and Autoregressive Representations of Time Series Processes

Do the following exercise: prove that a time series of iid random variables is strongly stationary.
