
1 Introduction to Autocorrelation

Autocorrelation occurs when the Gauss-Markov assumption of uncorrelated error terms is violated:

$$\mathrm{Cov}(\epsilon_t, \epsilon_s) \neq 0 \quad \text{for } t \neq s.$$

Autocorrelation is often paired with heteroscedasticity because it is another way in which the variance-covariance matrix of the true error terms differs from the Gauss-Markov assumption, $E(\epsilon\epsilon') = \sigma^2 I_n$.

In this case the errors covary: if one error is positive and large, the next error is likely to be positive and large, so one error can give us information about another. In general, the correlation between $\epsilon_t$ and $\epsilon_{t-k}$ is called autocorrelation of order $k$. The correlation between $\epsilon_t$ and $\epsilon_{t-1}$ is called autocorrelation of order 1 and is usually denoted by $\rho_1$. What does autocorrelation imply for the variance-covariance matrix of the errors?

 2  E(1) E(12) ...E(1n) 0 . . . E( ) =  . . .  2 E(n1) E(n2) ...E(n) Here the off-diagonal elements are the between the dif- ferent error terms and they are not zero.

2 Causes of Autocorrelation

What are the causes of Autocorrelation?

• Misspecification of the model

• Omitted variables

• Systematic errors in measurement

Omission of variables:

One factor that can cause autocorrelation is the omission of variables. Suppose $Y_t$ is related to $X_{2t}$ and $X_{3t}$, but we wrongfully do not include $X_{3t}$ in the model. Then the effect of $X_{3t}$ will be captured by the disturbances $\epsilon_t$. If $X_{3t}$, like many economic series, exhibits a trend over time, then $X_{3t}$ depends on $X_{3,t-1}$, $X_{3,t-2}$ and so on. Similarly, $\epsilon_t$ depends on $\epsilon_{t-1}$, $\epsilon_{t-2}$ and so on.

Misspecification of the model: Suppose $Y_t$ is related to $X_{2t}$ through a quadratic relationship:

$$Y_t = \beta_1 + \beta_2 X_{2t}^2 + \epsilon_t$$

but we wrongfully assume and estimate a straight line:

$$Y_t = \beta_1 + \beta_2 X_{2t} + \epsilon_t$$

Then the error term obtained from the straight-line fit will depend on $X_{2t}^2$.

Systematic error in measurement: Suppose a company updates its inventory at a given period of time. If a systematic error occurred, then the cumulative inventory stock will exhibit accumulated measurement errors. These errors will show up as an autocorrelated process.
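A small simulation sketch of the misspecification case (hypothetical data and values, not from the original notes): data generated from a quadratic relationship are fitted with a straight line, and the residuals inherit the systematic misfit, so their first-order autocorrelation $\rho_1$ is far from zero.

```python
import numpy as np

# Hypothetical illustration: fit a straight line to data generated
# from a quadratic relationship and inspect the residuals.
rng = np.random.default_rng(1)
n = 100
x = np.linspace(0, 10, n)                       # trending regressor
y = 1.0 + 0.5 * x**2 + rng.normal(scale=2.0, size=n)

# OLS fit of the (wrong) linear model y = b1 + b2*x
X = np.column_stack([np.ones(n), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ b

# First-order autocorrelation of the residuals (rho_1)
r1 = np.corrcoef(resid[1:], resid[:-1])[0, 1]
print(f"first-order autocorrelation of residuals: {r1:.2f}")  # typically close to 1 here
```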

3 First-order autocorrelation

The simplest and most commonly observed form is first-order autocorrelation. Consider the multiple regression model:

$$Y_t = \beta_1 + \beta_2 X_{2t} + \beta_3 X_{3t} + \cdots + \beta_p X_{pt} + \epsilon_t$$

in which the current observation of the error term $\epsilon_t$ is a function of the previous observation of the error term:

$$\epsilon_t = \rho\epsilon_{t-1} + u_t$$

The coefficient $\rho$ is called the first-order autocorrelation coefficient and takes values from $-1$ to $+1$. The size of $\rho$ determines the strength of the serial correlation, and we can distinguish three cases. If $\rho$ is zero, we have no autocorrelation. If $\rho$ approaches unity, the value of the previous observation of the error becomes more important in determining the value of the current error, and we have positive autocorrelation. If $\rho$ approaches $-1$, we have a high degree of negative autocorrelation.
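The three cases can be made concrete with a short simulation (a hypothetical sketch; the sample size and $\rho$ values are arbitrary choices). Each run generates the AR(1) error process above and reports the sample order-1 autocorrelation.

```python
import numpy as np

# Hypothetical illustration of the three cases for the AR(1) error
# eps_t = rho * eps_{t-1} + u_t: no, positive and negative autocorrelation.
rng = np.random.default_rng(2)
n = 1000

def simulate(rho):
    u = rng.normal(size=n)
    eps = np.zeros(n)
    for t in range(1, n):
        eps[t] = rho * eps[t - 1] + u[t]
    # sample order-1 autocorrelation of the simulated errors
    return np.corrcoef(eps[1:], eps[:-1])[0, 1]

for rho in (0.0, 0.9, -0.9):
    print(f"rho = {rho:+.1f}  ->  sample order-1 autocorrelation = {simulate(rho):+.2f}")
```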

4 Consequences of autocorrelation

In this section we discuss the consequences of autocorrelated errors:

• the impact on the ordinary least squares (OLS) estimates;
• the effects on hypothesis testing;
• the effects on forecasting and prediction.

5 Impacts on the OLS estimates

OLS estimates are unbiased and consistent even if the errors are autocorrelated. However, the problem lies with the efficiency of these estimates. In proving the Gauss-Markov theorem, which establishes efficiency, one of the steps involves minimizing the variance of the linear combination $\sum_t a_t\epsilon_t$:

$$\mathrm{Var}\!\left(\sum_t a_t\epsilon_t\right) = \sum_t a_t^2\sigma^2 + \sum\sum_{t\neq s} a_t a_s\,\mathrm{Cov}(\epsilon_t, \epsilon_s),$$

where the double summation is over $t \neq s$. If $\mathrm{Cov}(\epsilon_t, \epsilon_s) \neq 0$, the second term on the right-hand side does not vanish, so the best linear unbiased estimator (BLUE) that minimizes $\mathrm{Var}(\sum_t a_t\epsilon_t)$ will not be the same as the OLS estimator. The OLS estimator is therefore not BLUE and hence is inefficient.

Note: in this respect the consequences of heteroscedasticity and autocorrelation are the same.

If the autocorrelation in $\epsilon_t$ is positive and the independent variable $X_t$ grows over time, then the estimated residual variance $\hat\sigma^2$ will be an underestimate and the value of $R^2$ will be an overestimate. In general, the variance of the OLS estimates of the regression coefficients will be biased.
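A Monte Carlo sketch of this consequence (hypothetical and illustrative only; the sample size, $\rho$, and number of replications are arbitrary): with positively autocorrelated errors and a regressor that grows over time, the variance of the slope reported by the usual OLS formula is well below the true sampling variance observed across replications.

```python
import numpy as np

# Hypothetical Monte Carlo: positively autocorrelated errors + trending X.
# Compare the average OLS-reported variance of the slope with its true
# (simulated) sampling variance.
rng = np.random.default_rng(3)
n, rho, reps = 100, 0.8, 2000
x = np.arange(n, dtype=float)            # regressor that grows over time
X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)

slopes, reported_var = [], []
for _ in range(reps):
    u = rng.normal(size=n)
    eps = np.zeros(n)
    for t in range(1, n):
        eps[t] = rho * eps[t - 1] + u[t]
    y = 1.0 + 0.5 * x + eps
    b = XtX_inv @ X.T @ y                # OLS coefficients
    resid = y - X @ b
    s2 = resid @ resid / (n - 2)         # estimated residual variance
    slopes.append(b[1])
    reported_var.append(s2 * XtX_inv[1, 1])

print(f"true sampling variance of slope : {np.var(slopes):.5f}")
print(f"mean OLS-reported variance      : {np.mean(reported_var):.5f}")
# The reported variance is typically much smaller than the true one.
```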

6 Effects on tests of hypotheses

The presence of autocorrelation has a serious effect on hypothesis testing. When the autocorrelation is positive and the independent variable $X_t$ grows over time, the estimated standard errors will be smaller than the true standard errors, i.e. the standard errors will be underestimated. As a result of autocorrelation (a simulation sketch after this list illustrates the resulting over-rejection):

• the t-statistic, which divides the estimated coefficient by its standard error, will be overestimated;
• a regression coefficient that appears to be significant because of the overestimated t-statistic may not really be so.

• Since the estimated variances of the parameter estimates are biased and inconsistent, the t-statistics and F-statistics are no longer valid.
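To make the over-rejection concrete, here is a short simulation (hypothetical, not part of the original notes; sample size, $\rho$, and replication count are arbitrary choices): the regressor truly has no effect, yet the conventional t-test rejects far more often than its nominal 5% level because the standard errors are underestimated.

```python
import numpy as np

# Hypothetical illustration: under positive autocorrelation and a trending X,
# the conventional t-test of "slope = 0" rejects far more often than 5%.
rng = np.random.default_rng(4)
n, rho, reps = 100, 0.8, 2000
x = np.arange(n, dtype=float)
X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)
crit = 1.984                              # approx. two-sided 5% t critical value, df = 98

rejections = 0
for _ in range(reps):
    u = rng.normal(size=n)
    eps = np.zeros(n)
    for t in range(1, n):
        eps[t] = rho * eps[t - 1] + u[t]
    y = 1.0 + 0.0 * x + eps               # true slope is zero
    b = XtX_inv @ X.T @ y
    resid = y - X @ b
    se = np.sqrt(resid @ resid / (n - 2) * XtX_inv[1, 1])
    if abs(b[1] / se) > crit:
        rejections += 1

print(f"rejection rate at nominal 5% level: {rejections / reps:.2%}")
# Typically well above 5%: apparently 'significant' slopes may not be real.
```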

7 Effects on prediction and forecast

Forecasts based on OLS in the presence of autocorrelation will be unbiased; however, they will be inefficient because the estimates of the regression coefficients are inefficient. Suppose we ignore the AR(1) serial correlation and obtain OLS estimates $\hat\alpha$ and $\hat\beta$. The OLS prediction would be $\hat y_t = \hat\alpha + \hat\beta x_t$. However, in the case of first-order autocorrelation, $\epsilon_t$ is predictable from $\rho\epsilon_{t-1} + u_t$, provided $\rho$ can be estimated by $\hat\rho$.

Once we write $\hat\epsilon_t = \hat\rho\,\hat\epsilon_{t-1}$, the residual for the previous period, $\hat\epsilon_{t-1}$, is known at time $t$. Therefore, the AR(1) prediction will be

$$\tilde y_t = \hat\alpha + \hat\beta x_t + \hat\rho\,\hat\epsilon_{t-1} = \hat\alpha + \hat\beta x_t + \hat\rho\,(y_{t-1} - \hat\alpha - \hat\beta x_{t-1}),$$

making use of the fact that $\hat\epsilon_{t-1} = y_{t-1} - \hat\alpha - \hat\beta x_{t-1}$. Thus $\tilde y_t$ will be more efficient than the prediction obtained by the plain OLS procedure.
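A minimal sketch of this one-step-ahead AR(1) correction (hypothetical data and variable names; $\hat\rho$ is estimated here by regressing the residuals on their own lag, which is one common choice rather than the notes' prescribed method): fit OLS, estimate $\hat\rho$, then add $\hat\rho$ times the last in-sample residual to the OLS prediction.

```python
import numpy as np

# Hypothetical sketch of the AR(1)-corrected one-step-ahead forecast.
rng = np.random.default_rng(5)
n, rho = 200, 0.7

# Simulate y_t = 2 + 0.5 x_t + eps_t with AR(1) errors (n+1 periods).
x = rng.normal(size=n + 1)
u = rng.normal(size=n + 1)
eps = np.zeros(n + 1)
for t in range(1, n + 1):
    eps[t] = rho * eps[t - 1] + u[t]
y = 2.0 + 0.5 * x + eps

# OLS on the first n observations.
X = np.column_stack([np.ones(n), x[:n]])
a_hat, b_hat = np.linalg.lstsq(X, y[:n], rcond=None)[0]
resid = y[:n] - (a_hat + b_hat * x[:n])

# Estimate rho by regressing the residuals on their own lag (no intercept).
rho_hat = (resid[1:] @ resid[:-1]) / (resid[:-1] @ resid[:-1])

# Forecast of y_{n+1} given x_{n+1}: plain OLS vs. AR(1)-corrected.
y_ols = a_hat + b_hat * x[n]
y_ar1 = a_hat + b_hat * x[n] + rho_hat * resid[-1]
print(f"OLS: {y_ols:.3f}   AR(1)-corrected: {y_ar1:.3f}   actual: {y[n]:.3f}")
```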

8 Summary: Properties of Autocorrelation

If autocorrelation among the error terms in a regression model is ignored and the OLS procedure is used, then:

• the estimates and forecasts based on them will still be unbiased and consistent.

• the OLS estimates are no longer BLUE and will be inefficient.

• the estimated variances of the regression coefficients will be biased, and hence tests of hypotheses will be invalid.
