
LINEAR REGRESSION ANALYSIS

MODULE – XI

Lecture - 33

Autocorrelation

Dr. Shalabh, Department of Mathematics and Statistics, Indian Institute of Technology Kanpur


One of the basic assumptions in the linear regression model is that the random error components or disturbances are identically and independently distributed. So in the model y = X\beta + u, it is assumed that

E(u_t u_{t-s}) = \begin{cases} \sigma_u^2 & \text{if } s = 0 \\ 0 & \text{if } s \neq 0, \end{cases}

i.e., the correlation between the successive disturbances is zero.

When the assumption E(u_t u_{t-s}) = \sigma_u^2 for s = 0 is violated, i.e., the variance of the disturbance term does not remain constant, the problem of heteroskedasticity arises.

When the assumption E(u_t u_{t-s}) = 0 for s \neq 0 is violated, i.e., the variance of the disturbance term remains constant but the successive disturbance terms are correlated, the problem is termed the problem of autocorrelation.

When autocorrelation is present, some or all off-diagonal elements of E(uu') are nonzero.

Sometimes the study and explanatory variables have a natural ordering over time, i.e., the data is collected with respect to time. Such data is termed time series data. The disturbance terms in time series data are generally serially correlated.

The autocovariance at lag s is defined as

\gamma_s = E(u_t u_{t-s}), \quad s = 0, \pm 1, \pm 2, \ldots

At zero lag, we have constant variance, i.e.,

\gamma_0 = E(u_t^2) = \sigma_u^2.

The autocorrelation coefficient at lag s is defined as

\rho_s = \frac{E(u_t u_{t-s})}{\sqrt{\operatorname{Var}(u_t)\operatorname{Var}(u_{t-s})}} = \frac{\gamma_s}{\gamma_0}, \quad s = 0, \pm 1, \pm 2, \ldots

Assume that \rho_s and \gamma_s are symmetric in s, i.e., these coefficients are constant over time and depend only on the length of the lag s.
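As a minimal illustration of these definitions, the following numpy sketch computes the sample analogues of \gamma_s and \rho_s (the function names and the division by n are our own choices, not from the lecture):

    import numpy as np

    def sample_autocovariance(u, s):
        """Sample analogue of gamma_s = E(u_t u_{t-s})."""
        n = len(u)
        u = u - u.mean()  # center the series; the disturbances have zero mean
        return np.sum(u[s:] * u[:n - s]) / n

    def sample_autocorrelation(u, s):
        """Sample analogue of rho_s = gamma_s / gamma_0."""
        return sample_autocovariance(u, s) / sample_autocovariance(u, 0)

    # for an independent series, rho_1 should be near zero
    rng = np.random.default_rng(0)
    print(sample_autocorrelation(rng.normal(size=1000), 1))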

The autocorrelation between the successive terms (u_2 and u_1), (u_3 and u_2), ..., (u_n and u_{n-1}) gives the autocorrelation of order one, i.e., \rho_1.

Similarly, the autocorrelation between the terms (u_3 and u_1), (u_4 and u_2), ..., (u_n and u_{n-2}) gives the autocorrelation of order two, i.e., \rho_2.

Sources of autocorrelation

Some of the possible reasons for the introduction of autocorrelation in the data are as follows:

1. Carryover of effect, at least in part, is an important source of autocorrelation. For example, monthly data on household expenditure is influenced by the expenditure of the preceding month. Autocorrelation is present in cross-section data as well as in time series data. In cross-section data, the neighboring units tend to be similar with respect to the characteristic under study. In time series data, time is the factor that produces autocorrelation. Whenever some ordering of units is present, autocorrelation may arise.

2. Another source of autocorrelation is the effect of deletion of some variables. In regression modeling, it is not possible to include all the variables in the model. There can be various reasons for this: e.g., some variables may be qualitative, or direct observations on a variable may not be available. The joint effect of such deleted variables gives rise to autocorrelation in the data.

3. Misspecification of the form of the relationship can also introduce autocorrelation in the data. It is assumed that the form of relationship between the study and explanatory variables is linear. If log or exponential terms are present in the model, so that the linearity of the model is questionable, this too gives rise to autocorrelation in the data.

4. The difference between the observed and true values of a variable is called measurement error or errors-in-variables. The presence of measurement errors in the dependent variable may also introduce autocorrelation in the data.

Structure of disturbance term

Consider the situation where the disturbances are autocorrelated:

γγ01 γn− 1  γγ γ E(εε ') = 10n− 2    γγnn−−12 γ 0

1 ρρ11 n− ρρ 121  n− = γ 0    ρρnn−−12 1

1 ρρ11 n− ρρ 2 121  n− = σ u .    ρρnn−−12 1

Observe that there are now (n + k) parameters: \beta_1, \beta_2, \ldots, \beta_k, \sigma_u^2, \rho_1, \rho_2, \ldots, \rho_{n-1}. These (n + k) parameters are to be estimated on the basis of the available n observations. Since the number of parameters is larger than the number of observations, the situation is not good from the statistical point of view. In order to handle it, some special form and structure of the disturbance term has to be assumed, so that the number of parameters in the covariance matrix of the disturbance term can be reduced. The following structures are popular in autocorrelation:

1. Autoregressive (AR) process.
2. Moving average (MA) process.
3. Joint autoregressive moving average (ARMA) process.

Estimation under the first order autoregressive process

Consider a simple linear regression model

y_t = \beta_0 + \beta_1 X_t + u_t, \quad t = 1, 2, \ldots, n.

Assume the u_t's follow a first order autoregressive scheme defined as

u_t = \rho u_{t-1} + \varepsilon_t,

where

|\rho| < 1, \quad E(\varepsilon_t) = 0,

E(\varepsilon_t \varepsilon_{t+s}) = \begin{cases} \sigma_\varepsilon^2 & \text{if } s = 0 \\ 0 & \text{if } s \neq 0 \end{cases}

for all t = 1, 2, \ldots, n, where \rho is the first order autocorrelation between u_t and u_{t-1}, t = 1, 2, \ldots, n.

Now, by repeated substitution,

u_t = \rho u_{t-1} + \varepsilon_t
    = \rho(\rho u_{t-2} + \varepsilon_{t-1}) + \varepsilon_t
    = \varepsilon_t + \rho \varepsilon_{t-1} + \rho^2 \varepsilon_{t-2} + \cdots
    = \sum_{r=0}^{\infty} \rho^r \varepsilon_{t-r}.

Therefore

E(u_t) = 0,

E(u_t^2) = E(\varepsilon_t^2) + \rho^2 E(\varepsilon_{t-1}^2) + \rho^4 E(\varepsilon_{t-2}^2) + \cdots
         = (1 + \rho^2 + \rho^4 + \cdots)\,\sigma_\varepsilon^2 \quad (\text{the } \varepsilon_t\text{'s are serially independent}),

E(u_t^2) = \sigma_u^2 = \frac{\sigma_\varepsilon^2}{1 - \rho^2} \quad \text{for all } t.
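As a quick numerical check of this variance formula, one can simulate a long AR(1) series and compare the sample variance of u_t with \sigma_\varepsilon^2/(1 - \rho^2); the values of \rho, \sigma_\varepsilon and n below are arbitrary illustrations:

    import numpy as np

    rng = np.random.default_rng(1)
    rho, sigma_eps, n = 0.7, 1.0, 200_000

    # u_t = rho * u_{t-1} + eps_t, with u_0 drawn from the stationary distribution
    eps = rng.normal(scale=sigma_eps, size=n)
    u = np.empty(n)
    u[0] = eps[0] / np.sqrt(1 - rho**2)
    for t in range(1, n):
        u[t] = rho * u[t - 1] + eps[t]

    print(u.var())                      # sample variance of u_t
    print(sigma_eps**2 / (1 - rho**2))  # theoretical sigma_u^2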

Further,

E(u_t u_{t-1}) = E[(\varepsilon_t + \rho \varepsilon_{t-1} + \rho^2 \varepsilon_{t-2} + \cdots)(\varepsilon_{t-1} + \rho \varepsilon_{t-2} + \rho^2 \varepsilon_{t-3} + \cdots)]
= E[\{\varepsilon_t + \rho(\varepsilon_{t-1} + \rho \varepsilon_{t-2} + \cdots)\}\{\varepsilon_{t-1} + \rho \varepsilon_{t-2} + \cdots\}]
= \rho E[(\varepsilon_{t-1} + \rho \varepsilon_{t-2} + \cdots)^2]
= \rho \sigma_u^2.

Similarly,

E(u_t u_{t-2}) = \rho^2 \sigma_u^2.

In general,

E(u_t u_{t-s}) = \rho^s \sigma_u^2,

and hence

E(uu') = \Omega = \sigma_u^2 \begin{pmatrix} 1 & \rho & \rho^2 & \cdots & \rho^{n-1} \\ \rho & 1 & \rho & \cdots & \rho^{n-2} \\ \rho^2 & \rho & 1 & \cdots & \rho^{n-3} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \rho^{n-1} & \rho^{n-2} & \rho^{n-3} & \cdots & 1 \end{pmatrix}.
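This \Omega has (i, j) entry \sigma_u^2 \rho^{|i-j|}, so it can be built directly; a small numpy sketch with illustrative parameter values:

    import numpy as np

    def ar1_covariance(n, rho, sigma_u2):
        """Omega = E(uu') with (i, j) entry sigma_u^2 * rho^|i - j|."""
        idx = np.arange(n)
        return sigma_u2 * rho ** np.abs(idx[:, None] - idx[None, :])

    print(ar1_covariance(4, 0.7, 1.0))  # off-diagonal entries decay as rho^s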

Note that the disturbance terms are no longer independent and E(uu') \neq \sigma_u^2 I. The disturbances are nonspherical.

Consequences of autocorrelated disturbances

Consider the model with first order autoregressive disturbances

y = X\beta + u, \quad \text{where } y \text{ is } n \times 1, \; X \text{ is } n \times k, \; \beta \text{ is } k \times 1, \; u \text{ is } n \times 1,

u_t = \rho u_{t-1} + \varepsilon_t, \quad t = 1, 2, \ldots, n,

with assumptions

E(u) = 0, \quad E(uu') = \Omega,

E(\varepsilon_t) = 0, \quad E(\varepsilon_t \varepsilon_{t+s}) = \begin{cases} \sigma_\varepsilon^2 & \text{if } s = 0 \\ 0 & \text{if } s \neq 0, \end{cases}

where \Omega is a positive definite matrix. The ordinary least squares estimator (OLSE) of \beta is

b = (X'X)^{-1} X'y = (X'X)^{-1} X'(X\beta + u),

so

b - \beta = (X'X)^{-1} X'u,
E(b - \beta) = 0.

So OLSE remains unbiased under autocorrelated disturbances.
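A small Monte Carlo experiment illustrates this unbiasedness; everything below (design matrix, \beta, \rho) is an illustrative assumption, not from the lecture:

    import numpy as np

    rng = np.random.default_rng(2)
    n, rho, beta = 50, 0.7, np.array([2.0, 0.5])
    X = np.column_stack([np.ones(n), rng.normal(size=n)])  # fixed design: intercept + one regressor

    def ar1_errors():
        u = np.empty(n)
        u[0] = rng.normal() / np.sqrt(1 - rho**2)  # stationary start
        for t in range(1, n):
            u[t] = rho * u[t - 1] + rng.normal()
        return u

    reps, b_sum = 5000, np.zeros(2)
    for _ in range(reps):
        y = X @ beta + ar1_errors()
        b_sum += np.linalg.lstsq(X, y, rcond=None)[0]  # OLSE b = (X'X)^{-1} X'y

    print(b_sum / reps)  # averages out close to beta = (2.0, 0.5)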

The covariance matrix of b is

V(b) = E[(b - \beta)(b - \beta)']
     = (X'X)^{-1} X' E(uu') X (X'X)^{-1}
     = (X'X)^{-1} X' \Omega X (X'X)^{-1}
     \neq \sigma_u^2 (X'X)^{-1}.

The residual vector is

e = y - Xb = Hy = Hu, \quad \text{where } H = I - X(X'X)^{-1}X',

so

e'e = y'Hy = u'Hu,
E(e'e) = E(u'u) - E[u'X(X'X)^{-1}X'u]
       = n\sigma_u^2 - \operatorname{tr}[(X'X)^{-1} X' \Omega X].

Since s^2 = \dfrac{e'e}{n-1}, we get

E(s^2) = \frac{1}{n-1}\left( n\sigma_u^2 - \operatorname{tr}[(X'X)^{-1} X' \Omega X] \right),

so s^2 is a biased estimator of \sigma_u^2. In fact, s^2 has a downward bias.
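Both results, the sandwich form of V(b) and the bias of s^2, can be evaluated exactly for a given X and \Omega, without simulation; the design matrix below is an arbitrary illustration:

    import numpy as np

    rng = np.random.default_rng(3)
    n, rho, sigma_u2 = 30, 0.7, 1.0
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    idx = np.arange(n)
    Omega = sigma_u2 * rho ** np.abs(idx[:, None] - idx[None, :])  # E(uu') for AR(1)

    XtX_inv = np.linalg.inv(X.T @ X)
    V_true = XtX_inv @ X.T @ Omega @ X @ XtX_inv  # (X'X)^{-1} X' Omega X (X'X)^{-1}
    V_naive = sigma_u2 * XtX_inv                  # what the usual OLS formula reports
    print(np.diag(V_true))
    print(np.diag(V_naive))                       # the two differ under autocorrelation

    H = np.eye(n) - X @ XtX_inv @ X.T             # residual-maker matrix
    E_s2 = np.trace(H @ Omega) / (n - 1)          # E(s^2) = tr(H Omega) / (n - 1)
    print(E_s2, sigma_u2)                         # E(s^2) falls below sigma_u^2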

Application of OLS fails in the case of autocorrelation in the data and leads to serious consequences:

- an overly optimistic view from R^2,
- narrow confidence intervals,
- usual t-ratio and F-ratio tests provide misleading results,
- predictions may have large variances.

Since the disturbances are nonspherical, the generalized least squares estimator of \beta yields more efficient estimates than the OLSE. The GLSE of \beta is

\hat{\beta} = (X'\Omega^{-1}X)^{-1} X'\Omega^{-1} y,

with

E(\hat{\beta}) = \beta,
V(\hat{\beta}) = (X'\Omega^{-1}X)^{-1},

where the factor \sigma_u^2 is already contained in \Omega = E(uu'). The GLSE is the best linear unbiased estimator of \beta.
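A minimal sketch of the GLSE, assuming \Omega is known (the function name is ours):

    import numpy as np

    def gls(X, y, Omega):
        """GLSE: beta_hat = (X' Omega^{-1} X)^{-1} X' Omega^{-1} y."""
        Oi = np.linalg.inv(Omega)
        A = X.T @ Oi @ X
        beta_hat = np.linalg.solve(A, X.T @ Oi @ y)
        V = np.linalg.inv(A)  # V(beta_hat) = (X' Omega^{-1} X)^{-1} when Omega = E(uu')
        return beta_hat, V

With the AR(1) \Omega built in the earlier sketch, gls(X, y, Omega) reproduces the formulas above.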