Dates for term tests

1. Friday, February 07
2. Friday, March 07
3. Friday, March 28


The Moving Average Time series of order q, MA(q)

Let {x_t | t ∈ T} be defined by the equation

x_t = α_0 u_t + α_1 u_{t-1} + α_2 u_{t-2} + … + α_q u_{t-q} + μ

where {u_t | t ∈ T} denotes a white noise time series with variance σ_u^2.

Then {x_t | t ∈ T} is called a Moving Average time series of order q (denoted MA(q)).

The autocovariance function for an MA(q) time series

σ(h) = σ_u^2 (α_0 α_{|h|} + α_1 α_{|h|+1} + … + α_{q-|h|} α_q) = σ_u^2 Σ_{i=0}^{q-|h|} α_i α_{i+|h|} if |h| ≤ q
σ(h) = 0 if |h| > q

The autocorrelation function for an MA(q) time series

ρ(h) = σ(h)/σ(0) = [Σ_{i=0}^{q-|h|} α_i α_{i+|h|}] / [Σ_{i=0}^{q} α_i^2] if |h| ≤ q
ρ(h) = 0 if |h| > q

The mean value for an MA(q) time series

E[x_t] = μ


Comment

The autocorrelation function ρ(h) of an MA(q) time series “cuts off” to zero after lag q.

[Figure: autocorrelation function of an MA(q) time series, plotted for lags 0 to 30; the ACF cuts off to zero after lag q.]
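As a check on the formulas above, the MA(q) autocovariance and autocorrelation can be computed directly from the coefficients. A minimal pure-Python sketch (function names are illustrative, not from the notes):

```python
def ma_acvf(alphas, sigma2, h):
    """Autocovariance sigma(h) of an MA(q) series with coefficients
    alphas = [alpha_0, ..., alpha_q] and white-noise variance sigma2."""
    q = len(alphas) - 1
    h = abs(h)
    if h > q:
        return 0.0  # the autocovariance "cuts off" after lag q
    return sigma2 * sum(alphas[i] * alphas[i + h] for i in range(q - h + 1))

def ma_acf(alphas, h):
    """Autocorrelation rho(h) = sigma(h) / sigma(0); sigma_u^2 cancels."""
    return ma_acvf(alphas, 1.0, h) / ma_acvf(alphas, 1.0, 0)

# MA(1) with alpha_0 = 1, alpha_1 = 0.5 and sigma_u = 1.1 (Example 1 later on):
print(round(ma_acvf([1, 0.5], 1.1**2, 0), 4))  # 1.5125
print(round(ma_acf([1, 0.5], 1), 4))           # 0.4
print(ma_acf([1, 0.5], 2))                     # 0.0
```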

The Autoregressive Time series of order p, AR(p)

Let {x_t | t ∈ T} be defined by the equation

x_t = β_1 x_{t-1} + β_2 x_{t-2} + … + β_p x_{t-p} + δ + u_t

where {u_t | t ∈ T} is a white noise time series with variance σ_u^2.

Then {x_t | t ∈ T} is called an Autoregressive time series of order p (denoted AR(p)).

The mean value of a stationary AR(p) series

μ = E[x_t] = δ / (1 − β_1 − β_2 − … − β_p)

The Autocovariance function σ(h) of a stationary AR(p) series

Satisfies the equations:

σ(0) = β_1 σ(1) + β_2 σ(2) + … + β_p σ(p) + σ_u^2
σ(1) = β_1 σ(0) + β_2 σ(1) + … + β_p σ(p-1)
σ(2) = β_1 σ(1) + β_2 σ(0) + … + β_p σ(p-2)
…
σ(p) = β_1 σ(p-1) + β_2 σ(p-2) + … + β_p σ(0)

and

σ(h) = β_1 σ(h-1) + β_2 σ(h-2) + … + β_p σ(h-p) for h > p

Yule-Walker Equations

The Autocorrelation function ρ(h) of a stationary AR(p) series satisfies the equations:

ρ(1) = β_1 + β_2 ρ(1) + … + β_p ρ(p-1)
ρ(2) = β_1 ρ(1) + β_2 + … + β_p ρ(p-2)
…
ρ(p) = β_1 ρ(p-1) + β_2 ρ(p-2) + … + β_p

and

ρ(h) = β_1 ρ(h-1) + β_2 ρ(h-2) + … + β_p ρ(h-p) for h > p

with

σ(0) = σ_u^2 / (1 − β_1 ρ(1) − β_2 ρ(2) − … − β_p ρ(p))

The general solution of these equations is

ρ(h) = c_1 (1/r_1)^h + c_2 (1/r_2)^h + … + c_p (1/r_p)^h

where c_1, c_2, …, c_p are determined by using the starting values of the sequence ρ(h), and r_1, r_2, …, r_p are the roots of the polynomial

β(x) = 1 − β_1 x − β_2 x^2 − … − β_p x^p = (1 − x/r_1)(1 − x/r_2) ⋯ (1 − x/r_p)

Conditions for stationarity

Autoregressive Time series of order p, AR(p)

For an AR(p) time series, consider the polynomial

β(x) = 1 − β_1 x − β_2 x^2 − … − β_p x^p = (1 − x/r_1)(1 − x/r_2) ⋯ (1 − x/r_p)

with roots r_1, r_2, …, r_p.

Then {x_t | t ∈ T} is stationary if |r_i| > 1 for all i.

If |r_i| < 1 for at least one i then {x_t | t ∈ T} exhibits deterministic behaviour.

If |r_i| ≥ 1 for all i and |r_i| = 1 for at least one i then {x_t | t ∈ T} exhibits non-stationary random behaviour.

Since

ρ(h) = c_1 (1/r_1)^h + c_2 (1/r_2)^h + … + c_p (1/r_p)^h

and |r_1| > 1, |r_2| > 1, …, |r_p| > 1 for a stationary AR(p) series, it follows that

lim_{h→∞} ρ(h) = 0

i.e. the autocorrelation function ρ(h) of a stationary AR(p) series “tails off” to zero.

[Figure: autocorrelation function of a stationary AR(p) time series, plotted for lags 0 to 30; the ACF tails off to zero.]

Special Cases: The AR(1) time series

Let {x_t | t ∈ T} be defined by the equation

x_t = β_1 x_{t-1} + δ + u_t

Consider the polynomial

β(x) = 1 − β_1 x = 1 − x/r_1

with root r_1 = 1/β_1.

1. {x_t | t ∈ T} is stationary if |r_1| > 1, i.e. |β_1| < 1.

2. If |r_1| < 1, i.e. |β_1| > 1, then {x_t | t ∈ T} exhibits deterministic behaviour.

3. If |r_1| = 1, i.e. |β_1| = 1, then {x_t | t ∈ T} exhibits non-stationary random behaviour.

Special Cases: The AR(2) time series

Let {x_t | t ∈ T} be defined by the equation

x_t = β_1 x_{t-1} + β_2 x_{t-2} + δ + u_t

Consider the polynomial

β(x) = 1 − β_1 x − β_2 x^2 = (1 − x/r_1)(1 − x/r_2)

where r_1 and r_2 are the roots of β(x).

1. {x_t | t ∈ T} is stationary if |r_1| > 1 and |r_2| > 1.

2. If |r_i| < 1 for at least one i then {x_t | t ∈ T} exhibits deterministic behaviour.

3. If |r_i| ≥ 1 for i = 1, 2 and |r_i| = 1 for at least one i then {x_t | t ∈ T} exhibits non-stationary random behaviour.

Stationarity holds if and only if β_1 + β_2 < 1, β_2 − β_1 < 1 and β_2 > −1.

These inequalities define a triangular region for β_1 and β_2.

Patterns of the ACF and PACF of AR(2) Time Series

[Figure: the triangular stationarity region in the (β_1, β_2) plane (−2 < β_1 < 2, −1 < β_2 < 1), divided into regions I, II, III and IV according to the pattern of the ACF and PACF; in the shaded region the roots of the AR operator are complex.]
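The triangle conditions and the root condition can be checked against each other numerically. A small sketch (function names are illustrative) applying the quadratic formula to β(x) = 1 − β_1 x − β_2 x^2, assuming β_2 ≠ 0:

```python
import cmath

def ar2_roots(b1, b2):
    """Roots of beta(x) = 1 - b1*x - b2*x^2 (requires b2 != 0)."""
    disc = cmath.sqrt(b1**2 + 4 * b2)
    return (b1 + disc) / (-2 * b2), (b1 - disc) / (-2 * b2)

def stationary_by_roots(b1, b2):
    # stationary iff both roots lie outside the unit circle
    return all(abs(r) > 1 for r in ar2_roots(b1, b2))

def stationary_by_triangle(b1, b2):
    # the triangular region: b1 + b2 < 1, b2 - b1 < 1, b2 > -1
    return (b1 + b2 < 1) and (b2 - b1 < 1) and (b2 > -1)

# AR(2) from Example 2 below: x_t = 0.4 x_{t-1} + 0.1 x_{t-2} + 1.2 + u_t
print(stationary_by_roots(0.4, 0.1), stationary_by_triangle(0.4, 0.1))  # True True
```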

The Mixed Autoregressive Moving Average Time Series of order (p, q), ARMA(p,q)

Let β_1, β_2, …, β_p, α_1, α_2, …, α_q, δ denote p + q + 1 numbers (parameters).

Let {u_t | t ∈ T} denote a white noise time series with variance σ_u^2 (independent, mean 0, variance σ_u^2).

Let {x_t | t ∈ T} be defined by the equation

x_t = β_1 x_{t-1} + β_2 x_{t-2} + … + β_p x_{t-p} + δ + u_t + α_1 u_{t-1} + α_2 u_{t-2} + … + α_q u_{t-q}

Then {x_t | t ∈ T} is called a Mixed Autoregressive-Moving Average time series, an ARMA(p,q) series.

Mean value, variance, autocovariance function and autocorrelation function of an ARMA(p,q) series

Similar to an AR(p) time series, for certain values of the parameters β_1, …, β_p an ARMA(p,q) time series may not be stationary.

An ARMA(p,q) time series is stationary if the roots (r_1, r_2, …, r_p) of the polynomial

β(x) = 1 − β_1 x − β_2 x^2 − … − β_p x^p

satisfy |r_i| > 1 for all i.

Assume that the ARMA(p,q) time series {x_t | t ∈ T} is stationary.

Let μ = E(x_t). Then

E(x_t) = β_1 E(x_{t-1}) + β_2 E(x_{t-2}) + … + β_p E(x_{t-p}) + δ + E(u_t) + α_1 E(u_{t-1}) + … + α_q E(u_{t-q})

or

μ = β_1 μ + β_2 μ + … + β_p μ + δ + 0 + 0 + … + 0

so that

μ = δ / (1 − β_1 − β_2 − … − β_p)

The Autocovariance function σ(h) of a stationary mixed autoregressive-moving average time series {x_t | t ∈ T} can be determined from the defining equation:

x_t = β_1 x_{t-1} + β_2 x_{t-2} + … + β_p x_{t-p} + δ + u_t + α_1 u_{t-1} + α_2 u_{t-2} + … + α_q u_{t-q}

Since δ = μ(1 − β_1 − β_2 − … − β_p), we may subtract the mean and assume μ = 0 and δ = 0, so that

x_t = β_1 x_{t-1} + … + β_p x_{t-p} + u_t + α_1 u_{t-1} + … + α_q u_{t-q}

Hence

σ(h) = E(x_{t+h} x_t)
     = β_1 E(x_{t+h-1} x_t) + … + β_p E(x_{t+h-p} x_t) + E(u_{t+h} x_t) + α_1 E(u_{t+h-1} x_t) + … + α_q E(u_{t+h-q} x_t)
     = β_1 σ(h-1) + … + β_p σ(h-p) + σ_ux(h) + α_1 σ_ux(h-1) + … + α_q σ_ux(h-q)

where σ_ux(h) = E(u_{t+h} x_t). Now

σ_ux(h) = E(u_{t+h} x_t)
        = β_1 E(u_{t+h} x_{t-1}) + … + β_p E(u_{t+h} x_{t-p}) + E(u_{t+h} u_t) + α_1 E(u_{t+h} u_{t-1}) + … + α_q E(u_{t+h} u_{t-q})
        = β_1 σ_ux(h+1) + … + β_p σ_ux(h+p) + σ_uu(h) + α_1 σ_uu(h+1) + … + α_q σ_uu(h+q)

Note that σ_ux(h) = E(u_{t+h} x_t) = 0 if h > 0 (future noise is uncorrelated with the present value), and that σ_uu(h) = E(u_{t+h} u_t) = σ_u^2 if h = 0 and 0 if h ≠ 0.

We need to calculate σ_ux(0), σ_ux(−1), …, σ_ux(−q). Note that σ_ux(0) = σ_u^2. Using

σ_ux(h) = β_1 σ_ux(h+1) + … + β_p σ_ux(h+p) + σ_uu(h) + α_1 σ_uu(h+1) + … + α_q σ_uu(h+q)

with σ_ux(h) = 0 if h > 0 and σ_uu(h) = σ_u^2 if h = 0 (0 otherwise), we obtain successively:

h : σ_ux(h)
0 : σ_u^2
−1 : β_1 σ_ux(0) + α_1 σ_u^2 = (β_1 + α_1) σ_u^2
−2 : β_1 σ_ux(−1) + β_2 σ_ux(0) + α_2 σ_u^2 = (β_1^2 + β_1 α_1 + β_2 + α_2) σ_u^2
−3 : β_1 σ_ux(−2) + β_2 σ_ux(−1) + β_3 σ_ux(0) + α_3 σ_u^2

and so on.

The autocovariance function σ(h) satisfies:

σ(h) = β_1 σ(h-1) + … + β_p σ(h-p) + σ_ux(h) + α_1 σ_ux(h-1) + … + α_q σ_ux(h-q)

For h = 0, 1, …, q:

σ(0) = β_1 σ(1) + … + β_p σ(p) + σ_ux(0) + α_1 σ_ux(−1) + … + α_q σ_ux(−q)
σ(1) = β_1 σ(0) + … + β_p σ(p-1) + α_1 σ_ux(0) + α_2 σ_ux(−1) + … + α_q σ_ux(−q+1)
…
σ(q) = β_1 σ(q-1) + … + β_p σ(q-p) + α_q σ_ux(0)

For h > q:

σ(h) = β_1 σ(h-1) + β_2 σ(h-2) + … + β_p σ(h-p)

We then use the first (p + 1) equations to determine σ(0), σ(1), …, σ(p).

We use the subsequent equations to determine σ(h) for h > p.

Example: The autocovariance function σ(h) for an ARMA(1,1) time series.

Here

σ(h) = β_1 σ(h-1) + σ_ux(h) + α_1 σ_ux(h-1)

For h = 0, 1:

σ(0) = β_1 σ(1) + σ_ux(0) + α_1 σ_ux(−1)
σ(1) = β_1 σ(0) + α_1 σ_ux(0)

and for h > 1: σ(h) = β_1 σ(h-1).

Since σ_ux(0) = σ_u^2 and σ_ux(−1) = (β_1 + α_1) σ_u^2, the first two equations become

σ(0) = β_1 σ(1) + [1 + α_1 β_1 + α_1^2] σ_u^2
σ(1) = β_1 σ(0) + α_1 σ_u^2

Substituting σ(0) into the second equation we get:

σ(1) = β_1^2 σ(1) + β_1 [1 + α_1 β_1 + α_1^2] σ_u^2 + α_1 σ_u^2

or

σ(1) = [(1 + α_1 β_1)(α_1 + β_1) / (1 − β_1^2)] σ_u^2

Substituting σ(1) into the first equation we get:

σ(0) = β_1 (1 + α_1 β_1)(α_1 + β_1) / (1 − β_1^2) σ_u^2 + [1 + α_1 β_1 + α_1^2] σ_u^2
     = [(1 + 2 α_1 β_1 + α_1^2) / (1 − β_1^2)] σ_u^2

For h > 1, σ(h) = β_1 σ(h-1), hence

σ(2) = β_1 σ(1) = β_1 (1 + α_1 β_1)(α_1 + β_1) / (1 − β_1^2) σ_u^2
σ(3) = β_1 σ(2) = β_1^2 (1 + α_1 β_1)(α_1 + β_1) / (1 − β_1^2) σ_u^2

and in general

σ(h) = β_1^{h-1} (1 + α_1 β_1)(α_1 + β_1) / (1 − β_1^2) σ_u^2
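The closed forms above can be cross-checked against the defining equations numerically. A small sketch (the parameter values β_1 = 0.4, α_1 = 0.3, σ_u^2 = 1 are illustrative assumptions, not from the notes):

```python
def arma11_acvf(b1, a1, sigma2, hmax):
    """Autocovariances sigma(0..hmax) of a stationary ARMA(1,1) series,
    using the closed forms derived above."""
    g0 = (1 + 2 * a1 * b1 + a1**2) / (1 - b1**2) * sigma2
    g1 = (1 + a1 * b1) * (a1 + b1) / (1 - b1**2) * sigma2
    g = [g0, g1]
    for _ in range(hmax - 1):
        g.append(b1 * g[-1])  # sigma(h) = beta_1 * sigma(h-1) for h > 1
    return g

b1, a1, s2 = 0.4, 0.3, 1.0
g = arma11_acvf(b1, a1, s2, 5)
# consistency with the h = 0 equation: sigma(0) = b1*sigma(1) + (1 + a1*b1 + a1^2)*s2
assert abs(g[0] - (b1 * g[1] + (1 + a1 * b1 + a1**2) * s2)) < 1e-12
# consistency with the h = 1 equation: sigma(1) = b1*sigma(0) + a1*s2
assert abs(g[1] - (b1 * g[0] + a1 * s2)) < 1e-12
```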

The Backshift Operator B

Consider the time series {x_t : t ∈ T} and let M denote the linear space spanned by the set of random variables {x_t : t ∈ T} (i.e. all linear combinations of elements of {x_t : t ∈ T} and their limits in mean square).

M is a vector space.

Let B be an operator on M defined by:

Bx_t = x_{t-1}

B is called the backshift operator.

Note:

1. B is linear:

B(c_1 x_{t-1} + c_2 x_{t-2} + … + c_k x_{t-k}) = c_1 Bx_{t-1} + c_2 Bx_{t-2} + … + c_k Bx_{t-k} = c_1 x_{t-2} + c_2 x_{t-3} + … + c_k x_{t-k-1}

2. We can also define the operator B^k with B^k x_t = B(B(…Bx_t)) = x_{t-k}.

3. The polynomial operator p(B) = c_0 I + c_1 B + c_2 B^2 + … + c_k B^k can also be defined by the equation

p(B)x_t = (c_0 I + c_1 B + c_2 B^2 + … + c_k B^k)x_t = c_0 x_t + c_1 x_{t-1} + c_2 x_{t-2} + … + c_k x_{t-k}

4. The power series operator p(B) = c_0 I + c_1 B + c_2 B^2 + … can also be defined by the equation

p(B)x_t = (c_0 I + c_1 B + c_2 B^2 + …)x_t = c_0 x_t + c_1 x_{t-1} + c_2 x_{t-2} + …

5. If p(B) = c_0 I + c_1 B + c_2 B^2 + … and q(B) = b_0 I + b_1 B + b_2 B^2 + … are such that p(B)q(B) = I, i.e. p(B)q(B)x_t = Ix_t = x_t, then q(B) is denoted by [p(B)]^{-1}.

Other operators closely related to B:

1. F = B^{-1}, the forward shift operator, defined by Fx_t = B^{-1}x_t = x_{t+1}, and

2. ∇ = I − B, the first difference operator, defined by ∇x_t = (I − B)x_t = x_t − x_{t-1}.

The Equation for an MA(q) time series

x_t = α_0 u_t + α_1 u_{t-1} + α_2 u_{t-2} + … + α_q u_{t-q} + μ

can be written

x_t = α(B) u_t + μ

where

α(B) = α_0 I + α_1 B + α_2 B^2 + … + α_q B^q

The Equation for an AR(p) time series

x_t = β_1 x_{t-1} + β_2 x_{t-2} + … + β_p x_{t-p} + δ + u_t

can be written

β(B) x_t = δ + u_t

where

β(B) = I − β_1 B − β_2 B^2 − … − β_p B^p

The Equation for an ARMA(p,q) time series

x_t = β_1 x_{t-1} + β_2 x_{t-2} + … + β_p x_{t-p} + δ + u_t + α_1 u_{t-1} + α_2 u_{t-2} + … + α_q u_{t-q}

can be written

β(B) x_t = α(B) u_t + δ

where

α(B) = α_0 I + α_1 B + α_2 B^2 + … + α_q B^q (with α_0 = 1)

and

β(B) = I − β_1 B − β_2 B^2 − … − β_p B^p

Some comments about the Backshift operator B

1. It is a useful notational device, allowing us to write the equations for MA(q), AR(p) and ARMA(p, q) in a very compact form;

2. It is also useful for making certain computations related to the time series described above.
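The polynomial operator is easy to mirror in code. A minimal sketch (function name illustrative) that applies p(B) = c_0 I + c_1 B + … + c_k B^k to a finite realisation of a series:

```python
def apply_poly_B(coeffs, x, t):
    """Compute p(B)x_t = c_0*x_t + c_1*x_{t-1} + ... + c_k*x_{t-k}
    for a finite list x, with coeffs = [c_0, c_1, ..., c_k].
    Requires t >= k so that all lagged values exist."""
    return sum(c * x[t - j] for j, c in enumerate(coeffs))

x = [5.0, 6.0, 8.0, 11.0]
# first difference operator: (I - B)x_t = x_t - x_{t-1}
print(apply_poly_B([1, -1], x, 3))    # 11.0 - 8.0 = 3.0
# an AR(1)-style operator beta(B) = I - 0.5 B
print(apply_poly_B([1, -0.5], x, 2))  # 8.0 - 0.5*6.0 = 5.0
```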

The partial autocorrelation function

A useful tool in time series analysis

The partial autocorrelation function

Recall that the autocorrelation function of an AR(p) process satisfies the equation:

ρ_x(h) = β_1 ρ_x(h-1) + β_2 ρ_x(h-2) + … + β_p ρ_x(h-p)

For 1 ≤ h ≤ p these equations (the Yule-Walker equations) become:

ρ_x(1) = β_1 + β_2 ρ_x(1) + … + β_p ρ_x(p-1)
ρ_x(2) = β_1 ρ_x(1) + β_2 + … + β_p ρ_x(p-2)
…
ρ_x(p) = β_1 ρ_x(p-1) + β_2 ρ_x(p-2) + … + β_p

In matrix notation:

[ ρ_x(1) ]   [ 1          ρ_x(1)     …  ρ_x(p-1) ] [ β_1 ]
[ ρ_x(2) ] = [ ρ_x(1)     1          …  ρ_x(p-2) ] [ β_2 ]
[   ⋮    ]   [   ⋮                         ⋮     ] [  ⋮  ]
[ ρ_x(p) ]   [ ρ_x(p-1)   ρ_x(p-2)   …  1        ] [ β_p ]

These equations can be used to find β_1, β_2, …, β_p, if the time series is known to be AR(p) and the autocorrelation function ρ_x(h) is known.

If the time series is not autoregressive the equations can still be used to solve for β_1, β_2, …, β_p, for any value of p ≥ 1. In this case the solutions β_1^(p), β_2^(p), …, β_p^(p) are the values that minimize the mean square error:

MSE = E[(x_t − β_1^(p) x_{t-1} − … − β_p^(p) x_{t-p})^2]

Definition: The partial autocorrelation function at lag k is defined to be:

        | 1          ρ_x(1)     …  ρ_x(1) |
        | ρ_x(1)     1          …  ρ_x(2) |
        |   ⋮                       ⋮     |
        | ρ_x(k-1)   ρ_x(k-2)   …  ρ_x(k) |
φ_kk =  ──────────────────────────────────────
        | 1          ρ_x(1)     …  ρ_x(k-1) |
        | ρ_x(1)     1          …  ρ_x(k-2) |
        |   ⋮                        ⋮      |
        | ρ_x(k-1)   ρ_x(k-2)   …  1        |

using Cramer's Rule: the denominator is the determinant of the k × k correlation matrix, and the numerator is the determinant of the same matrix with its last column replaced by (ρ_x(1), ρ_x(2), …, ρ_x(k))′.

Comment:

The partial autocorrelation function φ_kk is determined from the autocorrelation function ρ(h).

The partial autocorrelation function at lag k, φ_kk, is the last autoregressive parameter, β_k^(k), if the series is assumed to be an AR(k) series.

If the series is an AR(p) series then it is also an AR(k) series for k > p, with the autoregressive parameters equal to zero after p. Hence

φ_kk = β_k^(k) = 0 for k > p

Some more comments:

1. The partial autocorrelation function at lag k, φ_kk, can be interpreted as a corrected autocorrelation between x_t and x_{t-k}, conditioning on the intervening variables x_{t-1}, x_{t-2}, …, x_{t-k+1}.

2. If the time series is an AR(p) time series then φ_kk = 0 for k > p.

3. If the time series is an MA(q) time series then ρ_x(h) = 0 for h > q.

A General Recursive Formula for Autoregressive Parameters and the Partial Autocorrelation Function (PACF)

Let β_1^(k), β_2^(k), …, β_k^(k) denote the autoregressive parameters of order k satisfying the Yule-Walker equations:

ρ_1 = β_1^(k) + β_2^(k) ρ_1 + … + β_k^(k) ρ_{k-1}
ρ_2 = β_1^(k) ρ_1 + β_2^(k) + … + β_k^(k) ρ_{k-2}
…
ρ_k = β_1^(k) ρ_{k-1} + β_2^(k) ρ_{k-2} + … + β_k^(k)

Then it can be shown that:

φ_{k+1,k+1} = β_{k+1}^(k+1) = [ρ_{k+1} − Σ_{j=1}^{k} β_j^(k) ρ_{k+1-j}] / [1 − Σ_{j=1}^{k} β_j^(k) ρ_j]

and

β_j^(k+1) = β_j^(k) − β_{k+1}^(k+1) β_{k+1-j}^(k) for j = 1, 2, …, k
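This recursion (the Durbin-Levinson algorithm) is straightforward to implement from a given autocorrelation sequence. A minimal sketch (function name illustrative):

```python
def pacf_from_acf(rho, kmax):
    """Compute phi_11, ..., phi_{kmax,kmax} from rho = [rho_1, rho_2, ...]
    using the recursion above."""
    beta = [rho[0]]   # order-1 coefficient: beta_1^(1) = rho_1
    pacf = [rho[0]]   # phi_11 = rho_1
    for k in range(1, kmax):
        num = rho[k] - sum(b * rho[k - 1 - j] for j, b in enumerate(beta))
        den = 1.0 - sum(b * rho[j] for j, b in enumerate(beta))
        phi = num / den  # phi_{k+1,k+1}
        # update beta_j^(k+1) = beta_j^(k) - phi * beta_{k+1-j}^(k)
        beta = [b - phi * beta[k - 1 - j] for j, b in enumerate(beta)] + [phi]
        pacf.append(phi)
    return pacf

# MA(1) with rho_1 = 0.4 and rho_h = 0 for h > 1 (Example 1 below):
rho = [0.4] + [0.0] * 11
print([round(v, 4) for v in pacf_from_acf(rho, 5)])
# [0.4, -0.1905, 0.0941, -0.0469, 0.0234]
```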

Proof:

The Yule-Walker equations of order k can be written in matrix form as

P_k β_k = ρ_k, i.e. β_k = P_k^{-1} ρ_k

where β_k = (β_1^(k), β_2^(k), …, β_k^(k))′, ρ_k = (ρ_1, ρ_2, …, ρ_k)′ and P_k is the k × k correlation matrix

      [ 1         ρ_1       …  ρ_{k-1} ]
P_k = [ ρ_1       1         …  ρ_{k-2} ]
      [  ⋮                      ⋮      ]
      [ ρ_{k-1}   ρ_{k-2}   …  1       ]

The equations of order k + 1 can be partitioned as

[ P_k      A ρ_k ] [ β*_{k+1}       ]   [ ρ_k     ]
[ ρ_k′ A   1     ] [ β_{k+1}^(k+1)  ] = [ ρ_{k+1} ]

where β*_{k+1} = (β_1^(k+1), …, β_k^(k+1))′ and A is the k × k matrix that reverses order:

    [ 0  0  …  0  1 ]
A = [ 0  0  …  1  0 ]
    [  ⋮            ]
    [ 1  0  …  0  0 ]

The equations may be written

P_k β*_{k+1} + β_{k+1}^(k+1) A ρ_k = ρ_k
ρ_k′ A β*_{k+1} + β_{k+1}^(k+1) = ρ_{k+1}

Multiplying the first equation by P_k^{-1}, and using P_k^{-1} A = A P_k^{-1} (A reverses the order of rows and columns, and P_k is unchanged by this reversal):

β*_{k+1} = P_k^{-1} ρ_k − β_{k+1}^(k+1) A P_k^{-1} ρ_k = β_k − β_{k+1}^(k+1) A β_k

Substituting this into the second equation, and using AA = I:

ρ_k′ A β_k − β_{k+1}^(k+1) ρ_k′ β_k + β_{k+1}^(k+1) = ρ_{k+1}

hence

φ_{k+1,k+1} = β_{k+1}^(k+1) = (ρ_{k+1} − ρ_k′ A β_k) / (1 − ρ_k′ β_k) = [ρ_{k+1} − Σ_{j=1}^{k} β_j^(k) ρ_{k+1-j}] / [1 − Σ_{j=1}^{k} β_j^(k) ρ_j]

while β*_{k+1} = β_k − β_{k+1}^(k+1) A β_k gives

β_j^(k+1) = β_j^(k) − β_{k+1}^(k+1) β_{k+1-j}^(k) for j = 1, 2, …, k.

Some Examples

Example 1: MA(1) time series

Suppose that {x_t | t ∈ T} satisfies the following equation:

x_t = 12.0 + u_t + 0.5 u_{t-1}

where {u_t | t ∈ T} is white noise with σ_u = 1.1.

Find:
1. The mean of the series,
2. The variance of the series,
3. The autocorrelation function,
4. The partial autocorrelation function.

Solution

Now {x_t | t ∈ T} satisfies the equation x_t = 12.0 + u_t + 0.5 u_{t-1}. Thus:

1. The mean of the series: μ = 12.0.

The autocovariance function for an MA(1) series is

σ(h) = (1 + α_1^2) σ_u^2 = (1 + 0.5^2)(1.1)^2 = 1.5125 if h = 0
σ(h) = α_1 σ_u^2 = 0.5 (1.1)^2 = 0.605 if h = ±1
σ(h) = 0 if |h| > 1

Thus:

2. The variance of the series: σ(0) = 1.5125.

and

3. The autocorrelation function is:

ρ(h) = 1 if h = 0
ρ(h) = 0.605/1.5125 = 0.4 if h = ±1
ρ(h) = 0 if |h| > 1

4. The partial autocorrelation function at lag k is

        | 1          ρ_1       …  ρ_1 |
        | ρ_1        1         …  ρ_2 |
        |  ⋮                      ⋮   |
        | ρ_{k-1}    ρ_{k-2}   …  ρ_k |
φ_kk =  ──────────────────────────────────
        | 1          ρ_1       …  ρ_{k-1} |
        | ρ_1        1         …  ρ_{k-2} |
        |  ⋮                       ⋮      |
        | ρ_{k-1}    ρ_{k-2}   …  1       |

with ρ_1 = 0.4 and ρ_h = 0 for h > 1.

Thus

φ_11 = ρ_1 = 0.4

φ_22 = (ρ_2 − ρ_1^2) / (1 − ρ_1^2) = (0 − 0.4^2) / (1 − 0.4^2) = −0.16/0.84 = −0.19048

       | 1    0.4  0.4 |     | 1    0.4  0   |
φ_33 = | 0.4  1    0   |  ÷  | 0.4  1    0.4 |  = 0.064/0.68 = 0.0941
       | 0    0.4  0   |     | 0    0.4  1   |

       | 1    0.4  0    0.4 |     | 1    0.4  0    0   |
φ_44 = | 0.4  1    0.4  0   |  ÷  | 0.4  1    0.4  0   |  = −0.0256/0.5456 = −0.0469
       | 0    0.4  1    0   |     | 0    0.4  1    0.4 |
       | 0    0    0.4  0   |     | 0    0    0.4  1   |

φ_55 = 0.01024/0.4368 = 0.0234

φ_66 = −0.0117, φ_77 = 0.0059, φ_88 = −0.0029, φ_99 = 0.0015
φ_{10,10} = −0.0007, φ_{11,11} = 0.0004, φ_{12,12} = −0.00029

(the signs alternate and the magnitudes decay geometrically).

[Graph: Partial Autocorrelation function φ_kk, plotted for lags 0 to 11.]

Exercise: Use the recursive method to calculate φ_kk.

We start with φ_11 = β_1^(1) = ρ_1 and use

φ_{k+1,k+1} = β_{k+1}^(k+1) = [ρ_{k+1} − Σ_{j=1}^{k} β_j^(k) ρ_{k+1-j}] / [1 − Σ_{j=1}^{k} β_j^(k) ρ_j]

and

β_j^(k+1) = β_j^(k) − β_{k+1}^(k+1) β_{k+1-j}^(k) for j = 1, 2, …, k.

Here φ_11 = β_1^(1) = ρ_1 = 0.4, so

φ_22 = β_2^(2) = (ρ_2 − β_1^(1) ρ_1) / (1 − β_1^(1) ρ_1) = (0 − 0.4(0.4)) / (1 − 0.4(0.4)) = −0.19048

and

β_1^(2) = β_1^(1) − β_2^(2) β_1^(1) = 0.4 − (−0.19048)(0.4) = 0.476192

Then

φ_33 = β_3^(3) = (ρ_3 − β_1^(2) ρ_2 − β_2^(2) ρ_1) / (1 − β_1^(2) ρ_1 − β_2^(2) ρ_2)
     = (0 − 0 + 0.19048(0.4)) / (1 − 0.476192(0.4) − 0) = 0.0941, etc.

Example 2: AR(2) time series

Suppose that {x_t | t ∈ T} satisfies the following equation:

x_t = 0.4 x_{t-1} + 0.1 x_{t-2} + 1.2 + u_t

where {u_t | t ∈ T} is white noise with σ_u = 2.1.

Is the time series stationary?

Find:
1. The mean of the series,
2. The variance of the series,
3. The autocorrelation function,
4. The partial autocorrelation function.

1. The mean of the series:

μ = δ / (1 − β_1 − β_2) = 1.2 / (1 − 0.4 − 0.1) = 1.2/0.5 = 2.4

3. The autocorrelation function satisfies the Yule-Walker equations:

ρ_1 = β_1 + β_2 ρ_1 = 0.4 + 0.1 ρ_1
ρ_2 = β_1 ρ_1 + β_2 = 0.4 ρ_1 + 0.1

then ρ_h = β_1 ρ_{h-1} + β_2 ρ_{h-2} = 0.4 ρ_{h-1} + 0.1 ρ_{h-2} for h ≥ 3.

Hence

ρ_1 = 0.4 / (1 − 0.1) = 0.4/0.9 = 0.4444
ρ_2 = 0.4(0.4444) + 0.1 = 0.2778

h     0      1      2      3      4      5      6
ρ_h   1.0000 0.4444 0.2778 0.1556 0.0900 0.0516 0.0296

h     7      8      9      10     11     12     13
ρ_h   0.0170 0.0098 0.0056 0.0032 0.0018 0.0011 0.0006

2. The variance of the series:

σ(0) = σ_u^2 / (1 − β_1 ρ_1 − β_2 ρ_2) = 2.1^2 / (1 − 0.4(0.4444) − 0.1(0.2778)) = 4.41/0.7944 = 5.5510

4. The partial autocorrelation function:

φ_11 = ρ_1 = 0.4444

       | 1       0.4444 |     | 1       0.4444 |
φ_22 = | 0.4444  0.2778 |  ÷  | 0.4444  1      |  = (0.2778 − 0.4444^2)/(1 − 0.4444^2) = 0.1000

φ_33 = 0, since the numerator determinant

| 1       0.4444  0.4444 |
| 0.4444  1       0.2778 |
| 0.2778  0.4444  0.1556 |

is zero; in fact φ_kk = 0 for k ≥ 3.

The partial autocorrelation function of an AR(p) time series “cuts off” after p.
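The ρ_h table above can be reproduced with a few lines following the Yule-Walker recursion. A sketch (variable names illustrative):

```python
b1, b2 = 0.4, 0.1  # AR(2) parameters from Example 2

# starting values from the Yule-Walker equations, then the recursion
# rho_h = b1*rho_{h-1} + b2*rho_{h-2} for h >= 2
rho = [1.0, b1 / (1 - b2)]  # rho_0 = 1, rho_1 = 0.4/0.9
for h in range(2, 14):
    rho.append(b1 * rho[h - 1] + b2 * rho[h - 2])

print([round(r, 4) for r in rho[:7]])
# [1.0, 0.4444, 0.2778, 0.1556, 0.09, 0.0516, 0.0296]

# variance: sigma(0) = sigma_u^2 / (1 - b1*rho_1 - b2*rho_2)
sigma0 = 2.1**2 / (1 - b1 * rho[1] - b2 * rho[2])
print(round(sigma0, 4))
```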

Example 3: ARMA(1,2) time series

Suppose that {x_t | t ∈ T} satisfies the following equation:

x_t = 0.4 x_{t-1} + 3.2 + u_t + 0.3 u_{t-1} + 0.2 u_{t-2}

where {u_t | t ∈ T} is white noise with standard deviation σ_u = 1.6.

Is the time series stationary?

Find:
1. The mean of the series,
2. The variance of the series,
3. The autocorrelation function,
4. The partial autocorrelation function.

Is the time series stationary?

β(x) = 1 − β_1 x = 1 − 0.4x has root r_1 = 1/0.4 = 2.5.

Since |r_1| > 1, the time series is stationary.

Find:

1. The mean of the series:

μ = δ / (1 − β_1) = 3.2 / (1 − 0.4) = 3.2/0.6 = 16/3 = 5.333

The autocovariance function σ(h) satisfies:

σ(h) = β_1 σ(h-1) + σ_ux(h) + α_1 σ_ux(h-1) + α_2 σ_ux(h-2)

i.e.

σ(h) = 0.4 σ(h-1) + σ_ux(h) + 0.3 σ_ux(h-1) + 0.2 σ_ux(h-2)

For h = 0, 1, 2:

σ(0) = 0.4 σ(1) + σ_ux(0) + 0.3 σ_ux(−1) + 0.2 σ_ux(−2)
σ(1) = 0.4 σ(0) + 0.3 σ_ux(0) + 0.2 σ_ux(−1)
σ(2) = 0.4 σ(1) + 0.2 σ_ux(0)

and for h > q = 2: σ(h) = 0.4 σ(h-1), i.e. σ(3) = 0.4 σ(2), σ(4) = 0.4 σ(3), σ(5) = 0.4 σ(4), etc., where

σ_ux(0) = σ_u^2 = 1.6^2 = 2.56
σ_ux(−1) = (β_1 + α_1) σ_u^2 = (0.4 + 0.3)(2.56) = 1.792
σ_ux(−2) = (β_1^2 + β_1 α_1 + α_2) σ_u^2 = (0.4(0.7) + 0.2)(2.56) = 1.2288

Hence

σ(0) = 0.4 σ(1) + 2.56 + 0.3(1.792) + 0.2(1.2288) = 0.4 σ(1) + 3.34336
σ(1) = 0.4 σ(0) + 0.3(2.56) + 0.2(1.792) = 0.4 σ(0) + 1.1264
σ(2) = 0.4 σ(1) + 0.2(2.56) = 0.4 σ(1) + 0.512

We use the first two equations to find σ(0) and σ(1):

σ(0) = 0.4[0.4 σ(0) + 1.1264] + 3.34336
σ(0)(1 − 0.4^2) = 0.4(1.1264) + 3.34336
σ(0) = [0.4(1.1264) + 3.34336] / (1 − 0.4^2) = 4.516571
σ(1) = 0.4(4.516571) + 1.1264 = 2.933029

Then we use the third equation to find σ(2):

σ(2) = 0.4(2.933029) + 0.512 = 1.685211

and σ(h) = 0.4 σ(h-1) for h ≥ 3.

The autocovariance and autocorrelation functions:

h   σ(h)   ρ(h)
0   4.517  1.000
1   2.933  0.649
2   1.685  0.373
3   0.674  0.149
4   0.270  0.060
5   0.108  0.024
6   0.043  0.010
7   0.017  0.004

Spectral Theory for a stationary time series