Chapter 3
Autoregressive and Moving-Average Models

3.1 Introduction

Let y be a random variable. We consider the elements of an observed time series {y_0, y_1, y_2, ..., y_t} as being realizations of this random variable. We also define a white-noise process. A sequence {ε_t} is a white-noise process if each value of the sequence has mean zero, has a constant variance, and is uncorrelated with all other realizations. Formally, {ε_t} is a white-noise process if, for each t,

    E(ε_t) = E(ε_{t-1}) = ··· = 0                                        (3.1)
    E(ε_t²) = E(ε_{t-1}²) = ··· = σ²
    E(ε_t ε_{t-s}) = E(ε_{t-j} ε_{t-j-s}) = 0   for all j and s

For the rest of these notes, {ε_t} will always denote a white-noise process. Figure 3.1 illustrates a white-noise process generated in Stata with the following code:

    clear
    set obs 150
    set seed 1000
    gen time = _n
    tsset time
    gen white = invnorm(uniform())
    twoway line white time, m(o) c(l) scheme(sj) ///
        ytitle("white-noise") ///
        title("White-Noise Process")

[Fig. 3.1 White-Noise Process, {ε_t}]

3.2 Stationarity

A stochastic process is said to be covariance-stationary if it has a finite mean and variance. That is, for all t, j, and t − s,

    E(y_t) = E(y_{t-s}) = µ                                              (3.2)
    E[(y_t − µ)²] = E[(y_{t-s} − µ)²] = σ_y²                             (3.3)
    E[(y_t − µ)(y_{t-s} − µ)] = E[(y_{t-j} − µ)(y_{t-j-s} − µ)] = γ_s    (3.4)

where µ, σ_y², and γ_s are all constants. For a covariance-stationary series, we can define the autocorrelation between y_t and y_{t-s} as

    ρ_s = γ_s / γ_0                                                      (3.5)

where both γ_s and γ_0 are defined in Equation 3.4. Obviously, ρ_0 = 1.

3.3 The Moving-Average Processes

3.3.1 The MA(1) Process

The first-order moving average process, or MA(1) process, is

    y_t = ε_t + θ ε_{t-1} = (1 + θL) ε_t                                 (3.6)

The MA(1) process expresses an observed series as a function of the current and lagged unobserved shocks. To develop an intuition of the behavior of the MA(1) process, we show the following three simulated realizations:

    y1_t = ε_t + 0.08 ε_{t-1} = (1 + 0.08L) ε_t                          (3.7)
    y2_t = ε_t + 0.98 ε_{t-1} = (1 + 0.98L) ε_t
    y3_t = ε_t − 0.98 ε_{t-1} = (1 − 0.98L) ε_t

In the first two variables (y1 and y2), past shocks feed positively into the current value of the series, with a small weight of θ = 0.08 in the first case and a large weight of θ = 0.98 in the second case. While one may think that the second case would produce a more persistent series, it does not. The structure of the MA(1) process, in which only the first lag of the shock appears on the right, forces it to have a very short memory, and hence weak dynamics. Figure 3.2 illustrates the generated series y1_t and y2_t. This figure shows the weak dynamics of MA(1) processes. In addition, it also shows that the y2 series is more volatile than y1. Following the previous Stata code, we can generate Figure 3.2 with:

    gen Y1 = white + 0.08*l.white
    gen Y2 = white + 0.98*l.white
    twoway (line Y1 Y2 time, clcolor(blue red)), scheme(sj) ///
        ytitle("Y1 and Y2") ///
        title("Two MA(1) Processes")

[Fig. 3.2 Two MA(1) Processes]

It is easy to see that the unconditional mean and variance of an MA(1) process are

    E(y_t) = E(ε_t) + θ E(ε_{t-1})                                       (3.8)
           = 0

and

    var(y_t) = var(ε_t) + θ² var(ε_{t-1})                                (3.9)
             = σ² + θ²σ²
             = σ²(1 + θ²)

Notice that for a given σ², as θ increases in absolute value, the unconditional variance increases as well. This explains why the y2 series is more volatile than y1.
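As a quick informal check of Equation 3.9, we can also generate the y3 series from Equation 3.7 and compare the sample standard deviations with the theoretical value sqrt(σ²(1 + θ²)); here σ² = 1 because the shocks were drawn as standard normals. This is only a sketch: the variable Y3 below is not part of the original code and is introduced solely for this check.

    * Y3 is an illustrative addition: the third MA(1) series from Equation 3.7
    gen Y3 = white - 0.98*l.white
    * sample moments of the three simulated MA(1) series
    summarize Y1 Y2 Y3
    * theoretical standard deviations implied by Equation 3.9 (sigma^2 = 1)
    display "sd of Y1: " sqrt(1 + 0.08^2)
    display "sd of Y2 and Y3: " sqrt(1 + 0.98^2)

With only 150 observations the sample and theoretical values will not match exactly, but Y2 and Y3 should clearly show a larger dispersion than Y1.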
The conditional mean and variance of an MA(1), where the conditioning information set is Ω_{t-1} = {ε_{t-1}, ε_{t-2}, ...}, are

    E(y_t | Ω_{t-1}) = E(ε_t + θ ε_{t-1} | Ω_{t-1})                      (3.10)
                     = E(ε_t | Ω_{t-1}) + E(θ ε_{t-1} | Ω_{t-1})
                     = θ ε_{t-1}

and

    var(y_t | Ω_{t-1}) = E((y_t − E(y_t | Ω_{t-1}))² | Ω_{t-1})          (3.11)
                       = E(ε_t² | Ω_{t-1})
                       = E(ε_t²)
                       = σ²

The conditional mean explicitly adapts to the information set, in contrast to the unconditional mean, which is constant. We will study the y1, y2, and y3 series further once we learn about the autocorrelation and the partial autocorrelation functions.

3.3.2 The MA(q) Process

The general finite-order moving average process of order q, or MA(q), is expressed as

    y_t = ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ··· + θ_q ε_{t-q} = B(L) ε_t    (3.12)

where, as you know,

    B(L) = 1 + θ_1 L + θ_2 L² + ··· + θ_q L^q

is a qth-order lag operator polynomial. The MA(q) process is a natural generalization of the MA(1). By allowing for more lags of the shocks on the right-hand side of the equation, the MA(q) process can capture richer dynamic patterns.

3.4 The Autoregressive Processes

3.4.1 The AR(1) Process

The autoregressive process has a simple motivation: it is simply a stochastic difference equation in which the current value of a series is linearly related to its past values, plus an additive stochastic shock. The first-order autoregressive process, AR(1), is

    y_t = ϕ y_{t-1} + ε_t                                                (3.13)

which can be written as

    (1 − ϕL) y_t = ε_t                                                   (3.14)

To illustrate the dynamics of different AR(1) processes, we simulate the realizations of the following four AR(1) processes:

    z1_t = +0.9 · z1_{t-1} + ε_t                                         (3.15)
    z2_t = +0.2 · z2_{t-1} + ε_t
    z3_t = −0.9 · z3_{t-1} + ε_t
    z4_t = −0.2 · z4_{t-1} + ε_t

where we keep the innovation sequence {ε_t} the same in each case. Figure 3.3 illustrates the time series graph of the z1_t and z2_t series, while Figure 3.4 illustrates z3_t and z4_t. These two figures were obtained using:

    gen Z1 = 0
    gen Z2 = 0
    gen Z3 = 0
    gen Z4 = 0
    replace Z1 = +0.9*l.Z1 + white if time > 1
    replace Z2 = +0.2*l.Z2 + white if time > 1
    replace Z3 = -0.9*l.Z3 + white if time > 1
    replace Z4 = -0.2*l.Z4 + white if time > 1
    twoway (line Z1 Z2 time, clcolor(blue red)), scheme(sj) ///
        ytitle("Z1 and Z2") ///
        title("Two AR(1) Processes")
    twoway (line Z3 Z4 time, clcolor(blue red)), scheme(sj) ///
        ytitle("Z3 and Z4") ///
        title("Two AR(1) Processes")

[Fig. 3.3 Two AR(1) Processes, z1_t and z2_t]
[Fig. 3.4 Two AR(1) Processes, z3_t and z4_t]

From the first figure we can see that the fluctuations in the AR(1) with parameter ϕ = 0.9 appear much more persistent than those of the AR(1) with parameter ϕ = 0.2. This contrasts sharply with the MA(1) process, which has a very short memory regardless of the parameter value. Hence, the AR(1) model is capable of capturing much more persistent dynamics. Figure 3.4 shows that the sign of ϕ is also critical for the dynamics of an AR(1) process. With a positive ϕ, a positive value is most likely followed by another positive value. However, with a negative ϕ, the series quickly changes from positive to negative and vice versa. Finally, when ϕ is negative, the dispersion of the series is larger the larger ϕ is in absolute value.

Consider again the simple AR(1) process. If we substitute backwards for the lagged y's on the right-hand side and use the lag operator, we can write

    y_t = ϕ y_{t-1} + ε_t                                                (3.16)
    y_t = ε_t + ϕ ε_{t-1} + ϕ² ε_{t-2} + ···
    y_t = [1 / (1 − ϕL)] ε_t

which is the moving-average representation of y_t. It is convergent if and only if |ϕ| < 1, which is the covariance stationarity condition for the AR(1) case.
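The moving-average representation can also be seen at work numerically. The following sketch is not part of the original code: the variable Z1_ma and the truncation at 40 lags are choices made only for this illustration. It rebuilds an approximation to Z1 from the truncated infinite sum in Equation 3.16; far enough from the start of the sample, the truncated sum closely tracks the recursively generated Z1.

    * illustrative sketch: truncated moving-average representation of Z1 (Equation 3.16)
    gen Z1_ma = white
    forvalues i = 1/40 {
        replace Z1_ma = Z1_ma + (0.9^`i')*l`i'.white if time > `i'
    }
    * compare the recursive series with its truncated moving-average approximation
    list time Z1 Z1_ma in 141/150
    corr Z1 Z1_ma if time > 41

The weights ϕ^i die out geometrically, so the terms omitted after 40 lags contribute very little to Z1.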
From the moving-average representation of the covariance-stationary AR(1) process, we can obtain the unconditional mean and variance,

    E(y_t) = E(ε_t + ϕ ε_{t-1} + ϕ² ε_{t-2} + ···)                       (3.17)
           = E(ε_t) + ϕ E(ε_{t-1}) + ϕ² E(ε_{t-2}) + ···
           = 0

and

    var(y_t) = var(ε_t + ϕ ε_{t-1} + ϕ² ε_{t-2} + ···)                   (3.18)
             = σ² + ϕ²σ² + ϕ⁴σ² + ···
             = σ² ∑_{i=0}^{∞} ϕ^{2i}
             = σ² / (1 − ϕ²)

The conditional moments are

    E(y_t | y_{t-1}) = E(ϕ y_{t-1} + ε_t | y_{t-1})                      (3.19)
                     = ϕ E(y_{t-1} | y_{t-1}) + E(ε_t | y_{t-1})
                     = ϕ y_{t-1} + 0
                     = ϕ y_{t-1}

and

    var(y_t | y_{t-1}) = var(ϕ y_{t-1} + ε_t | y_{t-1})                  (3.20)
                       = ϕ² var(y_{t-1} | y_{t-1}) + var(ε_t | y_{t-1})
                       = 0 + σ²
                       = σ²

It is important to note how the conditional mean adapts to the changing information set as the process evolves.
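The moments in Equations 3.17 through 3.19 can likewise be checked informally against the simulated series. The sketch below compares the sample standard deviations of Z1 and Z2 with sqrt(σ²/(1 − ϕ²)) from Equation 3.18 (with σ² = 1), and regresses Z1 on its own lag; the estimated slope should be roughly ϕ = 0.9, as suggested by the conditional mean in Equation 3.19. This is again only an illustrative addition to the chapter's code.

    * sample versus theoretical standard deviations (Equation 3.18, sigma^2 = 1)
    summarize Z1 Z2
    display "theoretical sd of Z1: " sqrt(1/(1 - 0.9^2))
    display "theoretical sd of Z2: " sqrt(1/(1 - 0.2^2))
    * the slope of Z1 on its first lag should be roughly phi = 0.9 (Equation 3.19)
    regress Z1 l.Z1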
