
Introduction to Time Series Analysis. Lecture 5. Peter Bartlett

www.stat.berkeley.edu/~bartlett/courses/153-fall2010

Last lecture:

1. ACF, sample ACF
2. Properties of the sample ACF
3. Convergence in mean square

Introduction to Time Series Analysis. Lecture 5.

1. AR(1) as a linear process
2. Causality
3. Invertibility
4. AR(p) models
5. ARMA(p,q) models

AR(1) as a linear process

Let $\{X_t\}$ be the stationary solution to $X_t - \phi X_{t-1} = W_t$, where $W_t \sim WN(0, \sigma^2)$. If $|\phi| < 1$,
$$X_t = \sum_{j=0}^{\infty} \phi^j W_{t-j}$$
is the unique solution:
• This infinite sum converges in mean square, since $|\phi| < 1$ implies $\sum_j |\phi|^j < \infty$.
• It satisfies the AR(1) recurrence.
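As a numerical aside (not part of the original slides; it assumes numpy, and all variable names are our own), the following sketch builds the truncated sum $\sum_{j=0}^{m} \phi^j W_{t-j}$ and checks that it satisfies the AR(1) recurrence up to a $\phi^{m+1}$ truncation error:

```python
# Sketch: the truncated linear-process representation of an AR(1).
import numpy as np

rng = np.random.default_rng(0)
phi, sigma, n, m = 0.7, 1.0, 200, 100    # |phi| < 1; m = truncation point

w = rng.normal(0.0, sigma, size=n + m)   # white noise W_{-m}, ..., W_{n-1}
weights = phi ** np.arange(m + 1)        # psi_j = phi^j, geometrically decaying
x = np.convolve(w, weights)[m : m + n]   # X_t ~ sum_{j=0}^m phi^j W_{t-j}

# The recurrence X_t - phi X_{t-1} = W_t should hold up to ~phi^(m+1)
resid = x[1:] - phi * x[:-1] - w[m + 1 : m + n]
print(np.max(np.abs(resid)))             # tiny (~phi^(m+1)): representation checks out
```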

AR(1) in terms of the back-shift operator

We can write
$$X_t - \phi X_{t-1} = W_t \iff \underbrace{(1 - \phi B)}_{\phi(B)}\, X_t = W_t \iff \phi(B)\, X_t = W_t.$$

Recall that $B$ is the back-shift operator: $B X_t = X_{t-1}$.

AR(1) in terms of the back-shift operator

Also, we can write
$$X_t = \sum_{j=0}^{\infty} \phi^j W_{t-j} \iff X_t = \underbrace{\left(\sum_{j=0}^{\infty} \phi^j B^j\right)}_{\pi(B)} W_t \iff X_t = \pi(B)\, W_t.$$

AR(1) in terms of the back-shift operator

With these definitions,
$$\pi(B) = \sum_{j=0}^{\infty} \phi^j B^j \quad\text{and}\quad \phi(B) = 1 - \phi B,$$
we can check that $\pi(B) = \phi(B)^{-1}$:
$$\pi(B)\,\phi(B) = \sum_{j=0}^{\infty} \phi^j B^j (1 - \phi B) = \sum_{j=0}^{\infty} \phi^j B^j - \sum_{j=1}^{\infty} \phi^j B^j = 1.$$
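A small check of this operator identity (our own illustration, assuming numpy): multiplying the truncated series for $\pi(z)$ by $\phi(z) = 1 - \phi z$ leaves only the constant 1 plus a single order-$(m+1)$ remainder.

```python
# Sketch: pi(z) * phi(z) = 1 up to the truncation order.
import numpy as np

phi, m = 0.7, 20
pi_coeffs = phi ** np.arange(m + 1)      # pi(z): 1, phi, phi^2, ... (low order first)
phi_coeffs = np.array([1.0, -phi])       # phi(z) = 1 - phi z

product = np.convolve(pi_coeffs, phi_coeffs)   # polynomial multiplication
print(product)   # [1, 0, ..., 0, -phi^(m+1)]: the identity, plus a tiny remainder
```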

Thus,
$$\phi(B)\, X_t = W_t \;\Rightarrow\; \pi(B)\,\phi(B)\, X_t = \pi(B)\, W_t \;\iff\; X_t = \pi(B)\, W_t.$$

AR(1) in terms of the back-shift operator

Notice that manipulating operators like $\phi(B)$ and $\pi(B)$ is like manipulating polynomials:
$$\frac{1}{1 - \phi z} = 1 + \phi z + \phi^2 z^2 + \phi^3 z^3 + \cdots,$$
provided $|\phi| < 1$ and $|z| \le 1$.

Introduction to Time Series Analysis. Lecture 5.

1. AR(1) as a linear process
2. Causality
3. Invertibility
4. AR(p) models
5. ARMA(p,q) models

AR(1) and Causality

Let $\{X_t\}$ be the stationary solution to
$$X_t - \phi X_{t-1} = W_t,$$
where $W_t \sim WN(0, \sigma^2)$. If $|\phi| < 1$,
$$X_t = \sum_{j=0}^{\infty} \phi^j W_{t-j}.$$

What about $\phi = 1$? $\phi = -1$? $|\phi| > 1$?

AR(1) and Causality

If $|\phi| > 1$, $\pi(B)\, W_t$ does not converge. But we can rearrange
$$X_t = \phi X_{t-1} + W_t \quad\text{as}\quad X_{t-1} = \frac{1}{\phi} X_t - \frac{1}{\phi} W_t,$$
and we can check that the unique stationary solution is
$$X_t = -\sum_{j=1}^{\infty} \phi^{-j} W_{t+j}.$$

But... $X_t$ depends on future values of $W_t$.
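To see this solution concretely, here is a small simulation (ours, not the lecture's; it assumes numpy) that builds $X_t$ from future noise and confirms the recurrence still holds:

```python
# Sketch: the noncausal stationary solution of an AR(1) with |phi| > 1.
import numpy as np

rng = np.random.default_rng(1)
phi, n, m = 2.0, 200, 60                   # |phi| > 1; keep m future terms

w = rng.normal(size=n + m)                 # W_0, ..., W_{n+m-1}
inv_weights = phi ** -np.arange(1, m + 1)  # phi^{-1}, phi^{-2}, ...
x = np.array([-(inv_weights @ w[t + 1 : t + m + 1]) for t in range(n)])

# X_t = phi X_{t-1} + W_t should hold up to a phi^{-m} truncation error
resid = x[1:] - phi * x[:-1] - w[1:n]
print(np.max(np.abs(resid)))               # tiny: stationary, but not causal
```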

Causality

A linear process $\{X_t\}$ is causal (strictly, a causal function of $\{W_t\}$) if there is a
$$\psi(B) = \psi_0 + \psi_1 B + \psi_2 B^2 + \cdots \quad\text{with}\quad \sum_{j=0}^{\infty} |\psi_j| < \infty$$
and $X_t = \psi(B)\, W_t$.

AR(1) and Causality

• Causality is a property of $\{X_t\}$ and $\{W_t\}$.

• Consider the AR(1) process defined by $\phi(B)\, X_t = W_t$ (with $\phi(B) = 1 - \phi B$):

$\phi(B)\, X_t = W_t$ is causal iff $|\phi| < 1$,
iff the root $z_1$ of the polynomial $\phi(z) = 1 - \phi z$ satisfies $|z_1| > 1$.

AR(1) and Causality

• Consider the AR(1) process $\phi(B)\, X_t = W_t$ (with $\phi(B) = 1 - \phi B$): If $|\phi| > 1$, we can define an equivalent causal model,
$$X_t - \phi^{-1} X_{t-1} = \tilde{W}_t,$$
where $\tilde{W}_t$ is a new white noise sequence.

AR(1) and Causality

• Is an MA(1) process causal? (Yes: $X_t = W_t + \theta W_{t-1} = (1 + \theta B)\, W_t$ is already in the causal form $X_t = \psi(B)\, W_t$, with finitely many coefficients.)

Introduction to Time Series Analysis. Lecture 5.

1. AR(1) as a linear process
2. Causality
3. Invertibility
4. AR(p) models
5. ARMA(p,q) models

MA(1) and Invertibility

Define
$$X_t = W_t + \theta W_{t-1} = (1 + \theta B)\, W_t.$$
If $|\theta| < 1$, we can write
$$(1 + \theta B)^{-1} X_t = W_t \iff (1 - \theta B + \theta^2 B^2 - \theta^3 B^3 + \cdots)\, X_t = W_t \iff \sum_{j=0}^{\infty} (-\theta)^j X_{t-j} = W_t.$$

That is, we can write $W_t$ as a causal function of $X_t$. We say that this MA(1) is invertible.
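As a concrete check (our own sketch, assuming numpy): simulate an MA(1) with $|\theta| < 1$, apply the truncated series $\sum_{j=0}^{m} (-\theta)^j X_{t-j}$, and confirm it recovers the noise:

```python
# Sketch: inverting an MA(1) with |theta| < 1.
import numpy as np

rng = np.random.default_rng(2)
theta, n, m = 0.5, 500, 40

w = rng.normal(size=n)
x = w.copy()
x[1:] += theta * w[:-1]                   # X_t = W_t + theta W_{t-1}

coeffs = (-theta) ** np.arange(m + 1)     # pi_j = (-theta)^j
w_hat = np.convolve(x, coeffs)[:n]        # ~ sum_{j=0}^m (-theta)^j X_{t-j}
print(np.max(np.abs(w_hat - w)))          # ~theta^(m+1): W_t recovered from past X's
```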

MA(1) and Invertibility

$$X_t = W_t + \theta W_{t-1}$$

If $|\theta| > 1$, the sum $\sum_{j=0}^{\infty} (-\theta)^j X_{t-j}$ diverges, but we can write
$$W_{t-1} = -\theta^{-1} W_t + \theta^{-1} X_t.$$
Just like the noncausal AR(1), we can show that
$$W_t = -\sum_{j=1}^{\infty} (-\theta)^{-j} X_{t+j}.$$

That is, we can write $W_t$ as a linear function of $X_t$, but it is not causal. We say that this MA(1) is not invertible.

Invertibility

A linear process $\{X_t\}$ is invertible (strictly, an invertible function of $\{W_t\}$) if there is a
$$\pi(B) = \pi_0 + \pi_1 B + \pi_2 B^2 + \cdots \quad\text{with}\quad \sum_{j=0}^{\infty} |\pi_j| < \infty$$
and $W_t = \pi(B)\, X_t$.

MA(1) and Invertibility

• Invertibility is a property of $\{X_t\}$ and $\{W_t\}$.

• Consider the MA(1) process defined by $X_t = \theta(B)\, W_t$ (with $\theta(B) = 1 + \theta B$):

$X_t = \theta(B)\, W_t$ is invertible iff $|\theta| < 1$,
iff the root $z_1$ of the polynomial $\theta(z) = 1 + \theta z$ satisfies $|z_1| > 1$.

MA(1) and Invertibility

• Consider the MA(1) process $X_t = \theta(B)\, W_t$ (with $\theta(B) = 1 + \theta B$): If $|\theta| > 1$, we can define an equivalent invertible model in terms of a new white noise sequence.

• Is an AR(1) process invertible? (Yes: $W_t = X_t - \phi X_{t-1}$ already expresses $W_t$ as a finite causal function of $X_t$.)

Introduction to Time Series Analysis. Lecture 5.

1. AR(1) as a linear process
2. Causality
3. Invertibility
4. AR(p) models
5. ARMA(p,q) models

AR(p): Autoregressive models of order p

An AR(p) process $\{X_t\}$ is a stationary process that satisfies
$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t,$$
where $\{W_t\} \sim WN(0, \sigma^2)$.

Equivalently, $\phi(B)\, X_t = W_t$, where $\phi(B) = 1 - \phi_1 B - \cdots - \phi_p B^p$.

AR(p): Constraints on φ

Recall: For $p = 1$ (AR(1)), $\phi(B) = 1 - \phi_1 B$. This is an AR(1) model only if there is a stationary solution to $\phi(B)\, X_t = W_t$, which is equivalent to $|\phi_1| \neq 1$.

This is equivalent to the following condition on $\phi(z) = 1 - \phi_1 z$:
$$\forall z \in \mathbb{R},\quad \phi(z) = 0 \;\Rightarrow\; z \neq \pm 1;$$
equivalently,
$$\forall z \in \mathbb{C},\quad \phi(z) = 0 \;\Rightarrow\; |z| \neq 1,$$
where $\mathbb{C}$ is the set of complex numbers.

AR(p): Constraints on φ

Stationarity: $\forall z \in \mathbb{C},\ \phi(z) = 0 \Rightarrow |z| \neq 1$, where $\mathbb{C}$ is the set of complex numbers.

$\phi(z) = 1 - \phi_1 z$ has one root at $z_1 = 1/\phi_1 \in \mathbb{R}$. But the roots of a degree $p > 1$ polynomial might be complex. For stationarity, we want the roots of $\phi(z)$ to avoid the unit circle, $\{z \in \mathbb{C} : |z| = 1\}$.

AR(p): Stationarity and causality

Theorem: A (unique) stationary solution to $\phi(B)\, X_t = W_t$ exists iff
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p = 0 \;\Rightarrow\; |z| \neq 1.$$

This AR(p) process is causal iff
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p = 0 \;\Rightarrow\; |z| > 1.$$
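The theorem's conditions are easy to test numerically. A sketch (our helper, assuming numpy; the function name is ours), applied to the AR(2) example that appears later in the lecture:

```python
# Sketch: check stationarity/causality of an AR(p) via the roots of phi(z).
import numpy as np

def ar_root_moduli(phis):
    """Moduli of the roots of phi(z) = 1 - phi_1 z - ... - phi_p z^p."""
    # np.roots expects coefficients from the highest degree down to the constant
    coeffs = np.concatenate(([-c for c in reversed(phis)], [1.0]))
    return np.abs(np.roots(coeffs))

print(ar_root_moduli([0.5, -0.6]))   # both ~1.29 > 1: stationary and causal
print(ar_root_moduli([1.5]))         # root modulus 2/3 < 1: stationary, not causal
```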

Recall: Causality

A linear process $\{X_t\}$ is causal (strictly, a causal function of $\{W_t\}$) if there is a
$$\psi(B) = \psi_0 + \psi_1 B + \psi_2 B^2 + \cdots \quad\text{with}\quad \sum_{j=0}^{\infty} |\psi_j| < \infty$$
and $X_t = \psi(B)\, W_t$.

AR(p): Roots outside the unit circle imply causality (details)

$$\forall z \in \mathbb{C},\ |z| \le 1 \;\Rightarrow\; \phi(z) \neq 0$$
$$\iff\; \exists \{\psi_j\},\ \delta > 0 \text{ such that } \forall |z| \le 1 + \delta,\quad \frac{1}{\phi(z)} = \sum_{j=0}^{\infty} \psi_j z^j$$
$$\Rightarrow\; \forall |z| \le 1 + \delta,\quad |\psi_j z^j| \to 0, \text{ so } \limsup_j |\psi_j|^{1/j}\,|z| \le 1$$
$$\Rightarrow\; \exists j_0,\ \forall j \ge j_0,\quad |\psi_j|^{1/j} \le \frac{1}{1 + \delta/2} \;\Rightarrow\; \sum_{j=0}^{\infty} |\psi_j| < \infty.$$

So if $|z| \le 1 \Rightarrow \phi(z) \neq 0$, then $S_m = \sum_{j=0}^{m} \psi_j B^j W_t$ converges in mean square, so we have a stationary, causal time series $X_t = \phi^{-1}(B)\, W_t$.

Calculating ψ for an AR(p): coefficients

Example: $X_t = \psi(B)\, W_t \iff (1 - 0.5B + 0.6B^2)\, X_t = W_t$, so
$$1 = \psi(B)\,(1 - 0.5B + 0.6B^2)$$
$$\iff 1 = (\psi_0 + \psi_1 B + \psi_2 B^2 + \cdots)(1 - 0.5B + 0.6B^2)$$
$$\iff 1 = \psi_0,$$
$$0 = \psi_1 - 0.5\,\psi_0,$$
$$0 = \psi_2 - 0.5\,\psi_1 + 0.6\,\psi_0,$$
$$0 = \psi_3 - 0.5\,\psi_2 + 0.6\,\psi_1,$$
$$\vdots$$

Calculating ψ for an AR(p): example

$$\iff 1 = \psi_0, \qquad 0 = \psi_j \ (j < 0), \qquad 0 = \psi_j - 0.5\,\psi_{j-1} + 0.6\,\psi_{j-2} \ (j \ge 1)$$
$$\iff 1 = \psi_0, \qquad 0 = \psi_j \ (j < 0), \qquad 0 = \phi(B)\,\psi_j.$$

We can solve these linear difference equations in several ways:
• numerically (see the sketch below), or
• by guessing the form of a solution and using an inductive proof, or
• by using the theory of linear difference equations.
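For the numerical route, a short sketch (ours, not the lecture's; assumes numpy) that runs the difference equation forward for the example $(1 - 0.5B + 0.6B^2)\, X_t = W_t$:

```python
# Sketch: compute psi_0, psi_1, ... recursively from the difference equation
# psi_j = 0.5 psi_{j-1} - 0.6 psi_{j-2}, with psi_0 = 1 and psi_j = 0 for j < 0.
import numpy as np

phis = [0.5, -0.6]                 # phi_1, phi_2 for (1 - 0.5B + 0.6B^2)
m = 20
psi = np.zeros(m + 1)
psi[0] = 1.0
for j in range(1, m + 1):
    psi[j] = sum(phi_k * psi[j - k]
                 for k, phi_k in enumerate(phis, start=1) if j - k >= 0)

print(psi[:6])   # [1.0, 0.5, -0.35, -0.475, -0.0275, 0.27125]
```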

Calculating ψ for an AR(p): general case

$$\phi(B)\, X_t = W_t \iff X_t = \psi(B)\, W_t, \text{ so } 1 = \psi(B)\,\phi(B)$$
$$\iff 1 = (\psi_0 + \psi_1 B + \cdots)(1 - \phi_1 B - \cdots - \phi_p B^p)$$
$$\iff 1 = \psi_0,$$
$$0 = \psi_1 - \phi_1\,\psi_0,$$
$$0 = \psi_2 - \phi_1\,\psi_1 - \phi_2\,\psi_0,$$
$$\vdots$$
$$\iff 1 = \psi_0, \qquad 0 = \psi_j \ (j < 0), \qquad 0 = \phi(B)\,\psi_j.$$

Introduction to Time Series Analysis. Lecture 5.

1. AR(1) as a linear process
2. Causality
3. Invertibility
4. AR(p) models
5. ARMA(p,q) models

ARMA(p,q): Autoregressive moving average models

An ARMA(p,q) process $\{X_t\}$ is a stationary process that satisfies
$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q},$$
where $\{W_t\} \sim WN(0, \sigma^2)$.

• AR(p) = ARMA(p, 0): $\theta(B) = 1$.
• MA(q) = ARMA(0, q): $\phi(B) = 1$.

ARMA processes

ARMA models can accurately approximate many stationary processes:

For any stationary process with autocovariance $\gamma$, and any $k > 0$, there is an ARMA process $\{X_t\}$ for which
$$\gamma_X(h) = \gamma(h), \qquad h = 0, 1, \ldots, k.$$

ARMA(p,q): Autoregressive moving average models

An ARMA(p,q) process $\{X_t\}$ is a stationary process that satisfies
$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = W_t + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q},$$
where $\{W_t\} \sim WN(0, \sigma^2)$.

Usually, we insist that $\phi_p, \theta_q \neq 0$ and that the polynomials
$$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p, \qquad \theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q$$
have no common factors. This ensures the model is not, in fact, a lower-order ARMA model.
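A common factor is easy to spot numerically: $\phi(z)$ and $\theta(z)$ share a root. A sketch (our own, assuming numpy), with a hypothetical example in which an apparent ARMA(1,2) reduces to an MA(1):

```python
# Sketch: detect a common factor of phi(z) and theta(z) by comparing roots.
import numpy as np

def roots_low_first(coeffs):
    """Roots of c_0 + c_1 z + ... + c_d z^d (coefficients lowest order first)."""
    return np.roots(coeffs[::-1])          # np.roots wants highest order first

phi_z = np.array([1.0, -0.5])              # phi(z) = 1 - 0.5z
theta_z = np.array([1.0, -0.1, -0.2])      # theta(z) = (1 - 0.5z)(1 + 0.4z), expanded

print(roots_low_first(phi_z))              # [2.]
print(roots_low_first(theta_z))            # [2., -2.5]: z = 2 is shared, so the factor
# (1 - 0.5B) cancels and the model is really the MA(1)  X_t = W_t + 0.4 W_{t-1}
```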
