Chapter 4
Autoregressive and Moving-Average Models II

4.1 Box-Jenkins Model Selection

The Box-Jenkins approach to modeling involves three iterative steps to identify the appropriate ARIMA process to estimate and forecast a univariate time series.

1. Model selection, or the identification stage, involves visualizing the time plot of the series, the autocorrelation function, and the partial autocorrelation function. Plotting the time path of {y_t} provides useful information about outliers, missing values, and structural breaks in the data-generating process. Nonstationary variables may have a pronounced trend or appear to meander without a constant long-run mean or variance.

2. Parameter estimation involves finding the values of the model coefficients that provide the best fit to the data. In this stage the goal is to select a stationary and parsimonious model that fits the data well.

3. Model checking involves ensuring that the residuals from the estimated model mimic a white-noise process.

4.2 The Autocorrelation Function

We mentioned in the previous chapter that for a process to be covariance stationary, we need its mean to be stable over time,

    E(y_t) = E(y_{t-s}) = \mu   \forall t, s                                  (4.1)

its variance to be stable over time,

    E[(y_t - \mu)^2] = E[(y_{t-s} - \mu)^2] = \sigma^2   \forall t, s         (4.2)

and its autocovariance function to not depend on time,

    E[(y_t - \mu)(y_{t-s} - \mu)] = E[(y_{t-j} - \mu)(y_{t-j-s} - \mu)] = \gamma_s   \forall t, s, j   (4.3)

This last equation indicates that the autocovariance function \gamma_s depends only on the displacement s and not on the time t. The autocovariance function is important because it provides a basic summary of the cyclical dynamics in a covariance stationary series. Note that the autocovariance function is symmetric, \gamma_s = \gamma_{-s}. Also note that \gamma_0 = E[(y_t - \mu)(y_{t-0} - \mu)] = E[(y_t - \mu)^2] = \sigma^2.

For the same reasons one normally uses the correlation coefficient rather than the covariance, it is useful to define the autocorrelation function as follows:

    \rho_s = \gamma_s / \gamma_0                                              (4.4)

which is the same definition we gave before in Equation 3.5. The formula for the autocorrelation is just the usual correlation formula, tailored to the correlation between y_t and y_{t-s}. The sample analog is

    \hat{\rho}_s = \frac{\sum_{t=s+1}^{T} (y_t - \bar{y})(y_{t-s} - \bar{y})}{\sum_{t=1}^{T} (y_t - \bar{y})^2}   (4.5)

This estimator is the sample autocorrelation function, or correlogram.

It is of interest to know whether a series is a reasonable approximation to white noise, which is to say that all its autocorrelations are zero in the population. If a series is white noise, then the distribution of the sample autocorrelations in large samples is

    \hat{\rho}_s \sim N(0, 1/T)                                               (4.6)

That is, they are approximately normally distributed. In practice, 95% of the sample autocorrelations should fall in the interval \pm 2/\sqrt{T}. We can rewrite Equation 4.6 as

    \sqrt{T}\,\hat{\rho}_s \sim N(0, 1).                                      (4.7)

Then, squaring both sides, we obtain

    T\hat{\rho}_s^2 \sim \chi^2_1.                                            (4.8)

It can be shown that the sample autocorrelations at various displacements are approximately independent of one another. Hence, we can obtain the Box-Pierce Q-statistic as the sum of m independent \chi^2_1 variables,

    Q_{BP} = T \sum_{s=1}^{m} \hat{\rho}_s^2                                  (4.9)

which follows a \chi^2_m distribution under the null that y is white noise. A slight modification, designed to follow the \chi^2 distribution more closely in small samples, is the Ljung-Box Q-statistic,

    Q_{LB} = T(T+2) \sum_{s=1}^{m} \frac{\hat{\rho}_s^2}{T-s}.                (4.10)

Under the null that y is white noise, Q_{LB} is approximately distributed as a \chi^2_m random variable.
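Stata implements the Ljung-Box statistic of Equation 4.10 as its portmanteau white-noise test. As a minimal sketch of how these quantities are used in practice, the lines below simulate a white-noise series and test it; the sample size, seed, variable name, and lag horizon are illustrative assumptions, not values from the text.

* Sketch: simulate T = 200 white-noise observations and test them
* (T, the seed, and the 20-lag horizon are illustrative choices).
clear
set seed 12345
set obs 200
generate t = _n
tsset t
generate white = rnormal()      // i.i.d. N(0,1) draws

* Individual 95% band for sample autocorrelations: +/- 2/sqrt(T)
display "95% band: +/- " 2/sqrt(_N)

* Ljung-Box portmanteau test, Q_LB of Equation 4.10 with m = 20 lags
wntestq white, lags(20)

Here wntestq reports the portmanteau Q-statistic together with its p-value from the \chi^2_{20} distribution, mirroring Equation 4.10.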
4.3 The Partial Autocorrelation Function

The partial autocorrelation function, p(s), is just the coefficient of y_{t-s} in a population linear regression of y_t on y_{t-1}, y_{t-2}, ..., y_{t-s}. That is, if

    \hat{y}_t = \hat{c} + \hat{\beta}_1 y_{t-1} + \cdots + \hat{\beta}_s y_{t-s},   (4.11)

then the sample partial autocorrelation at displacement s is

    \hat{p}_s = \hat{\beta}_s.                                                (4.12)

If the series is white noise, approximately 95% of the sample partial autocorrelations should fall in the interval \pm 2/\sqrt{T}.

Figure 4.1 shows the autocorrelation and partial autocorrelation functions of the white-noise process presented in Figure 3.1. It was obtained using:

ac white, saving(acwhite)
pac white, saving(pacwhite)
gr combine acwhite.gph pacwhite.gph, col(1) ///
    iscale(0.7) fysize(100) ///
    title("ACF and PACF for a White-Noise Process")

Because the process is white noise, by definition it is uncorrelated over time. Hence, all population autocovariances and autocorrelations should be zero beyond displacement 0. The figure shows that most of the sample autocovariances and autocorrelations fall within the 95% confidence bands.¹

[Fig. 4.1 Autocorrelation and Partial Autocorrelation Functions of a White-Noise Process, {ε_t}. Top panel: autocorrelations of white, with Bartlett's formula for MA(q) 95% confidence bands. Bottom panel: partial autocorrelations of white, with 95% confidence bands (se = 1/sqrt(n)).]

To obtain the correlogram that comes with the Ljung-Box Q-statistic (Portmanteau (Q)), we need to type

corrgram white, lags(20)

to obtain

LAG        AC        PAC        Q       Prob>Q
------------------------------------------------
  1     -0.0048   -0.0047    .00346    0.9531
  2     -0.2227   -0.2230    7.6432    0.0219
  3     -0.0019   -0.0040    7.6437    0.0540
  4      0.1439    0.1003    10.878    0.0280
  5     -0.0363   -0.0405    11.086    0.0497
  6     -0.0584   -0.0058    11.626    0.0709
  7     -0.0299   -0.0481    11.768    0.1084
  8     -0.0213   -0.0574    11.841    0.1584
  9     -0.0439   -0.0612    12.152    0.2048
 10      0.1331    0.1353    15.037    0.1307
 11      0.0035   -0.0087    15.039    0.1807
 12      0.0543    0.1337    15.527    0.2139
 13      0.0382    0.0623    15.77     0.2618
 14      0.0331    0.0294    15.954    0.3162
 15     -0.0637   -0.0531    16.639    0.3409
 16     -0.0336   -0.0614    16.831    0.3966
 17      0.0198   -0.0016    16.899    0.4612
 18     -0.0044   -0.0128    16.902    0.5298
 19     -0.0671   -0.0215    17.686    0.5435
 20     -0.0791   -0.1179    18.784    0.5359

Nearly all Q-statistics fail to reject the null of a white-noise process.

¹ For the variance of \hat{\rho}_s given by Bartlett's formula for MA(q) processes used in Stata, see the Stata Manual. Moreover, Stata also has a Yule-Walker option to estimate the partial autocorrelations.
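Equations 4.11 and 4.12 can be checked directly in Stata: regressing the series on its first s lags and reading off the coefficient on the longest lag should closely match the PAC column of corrgram, up to small-sample details. A minimal sketch for displacement 2, reusing a white series as above, follows.

* Sketch: the sample partial autocorrelation at displacement 2 is the
* coefficient on the second lag in an OLS regression of the series on
* its first two lags (Equations 4.11-4.12).
regress white L.white L2.white
display "PAC at displacement 2 = " _b[L2.white]
* Compare with the lag-2 entry of the PAC column reported by:
* corrgram white, lags(20)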
4.3.1 Examples Using Simulated Processes

Now we present the autocorrelation and partial autocorrelation functions of the MA(1) and AR(1) processes simulated in Equations 3.7 and 3.15. Just a few observations. (1) The simulated process y1 has very short-lived dynamics and cannot be distinguished from a white-noise process. (2) Most of the action in the MA(1) processes appears in the partial autocorrelations. When θ is relatively large and positive, the partial autocorrelations fluctuate between positive and negative values and die out as the displacement increases. When θ is relatively large and negative, the partial autocorrelations are initially all negative. (3) In the AR(1) processes, when φ is small, the resulting processes (z2 and z4, not shown) cannot be distinguished from white noise. (4) Most of the action in the AR(1) processes appears in the autocorrelation function and not in the partial autocorrelation function. A positive and relatively large φ (see z1) results in positive and decaying autocorrelations. A negative and relatively large φ (see z3) results in autocorrelations that fluctuate between positive and negative values and die out as the displacement increases.

[Fig. 4.2 ACF and PACF for the Simulated Process y_{1t} = ε_t + 0.08 ε_{t-1}: autocorrelations and partial autocorrelations of y1, with 95% confidence bands.]

[Fig. 4.3 ACF and PACF for Y2: autocorrelations and partial autocorrelations of y2, with 95% confidence bands.]

4.4 Model Selection Criteria

As is well known in multiple regression analysis, adding regressors to an equation always improves the model fit, as measured by the R². Hence, a model with additional lags for p and/or q will certainly fit the current sample better. This will not necessarily make the model better, for example, for forecasting. In addition, more lags mean fewer degrees of freedom. The most commonly used model selection criteria for obtaining a parsimonious model are the Akaike Information Criterion (AIC) and the Schwarz Bayesian Criterion (SBC). In the likelihood-based form reported by Stata, their formulas are given by

    AIC = -2 \ln L + 2k
    SBC = -2 \ln L + k \ln T

where \ln L is the maximized log-likelihood, k is the number of estimated parameters, and T is the number of observations. In both cases, the preferred specification is the one with the smaller criterion value.
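After estimation, Stata's estat ic displays these criteria. The following minimal sketch compares a few candidate specifications for a hypothetical, already tsset series y; the candidate orders are assumptions for illustration, and the residual check ties back to the third Box-Jenkins step.

* Sketch: estimate candidate models for a series y and compare
* information criteria; a smaller AIC/BIC indicates a better
* trade-off between fit and parsimony. The orders are illustrative.
arima y, ar(1)
estat ic                        // AIC and BIC for the AR(1)
arima y, ma(1)
estat ic                        // AIC and BIC for the MA(1)
arima y, ar(1) ma(1)
estat ic                        // AIC and BIC for the ARMA(1,1)

* Model checking (Box-Jenkins step 3): the residuals of the chosen
* model should mimic white noise.
predict ehat, residuals
wntestq ehat, lags(20)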
