
AR, MA and ARMA models

• The autoregressive process of order $p$, or AR($p$), is defined by the equation
$$X_t = \sum_{j=1}^{p} \phi_j X_{t-j} + \omega_t,$$
where $\omega_t \sim N(0, \sigma^2)$.
• $\phi = (\phi_1, \phi_2, \ldots, \phi_p)$ is the vector of model coefficients and $p$ is a non-negative integer.
• The AR model establishes that a realization at time $t$ is a linear combination of the $p$ previous realizations plus a noise term.
• For $p = 0$, $X_t = \omega_t$ and there is no autoregression term.
• The lag operator is denoted by $B$ and is used to express lagged values of the process, so $BX_t = X_{t-1}$, $B^2 X_t = X_{t-2}$, $B^3 X_t = X_{t-3}$, ..., $B^d X_t = X_{t-d}$.
• If we define
$$\Phi(B) = 1 - \sum_{j=1}^{p} \phi_j B^j = 1 - \phi_1 B - \phi_2 B^2 - \ldots - \phi_p B^p,$$
the AR($p$) process is given by the equation $\Phi(B) X_t = \omega_t$, $t = 1, \ldots, n$.
• $\Phi(B)$ is known as the characteristic polynomial of the process, and its roots determine whether or not the process is stationary.
• The moving average process of order $q$, or MA($q$), is defined as
$$X_t = \omega_t + \sum_{j=1}^{q} \theta_j \omega_{t-j}.$$
• Under this model, the observed process depends on previous $\omega_t$'s.
• The MA($q$) model can define a correlated noise structure in our data and goes beyond the traditional assumption where errors are iid.
• In lag operator notation, the MA($q$) process is given by the equation $X_t = \Theta(B)\omega_t$, where $\Theta(B) = 1 + \sum_{j=1}^{q} \theta_j B^j$.
• The general autoregressive moving average process of orders $p$ and $q$, or ARMA($p, q$), combines both AR and MA models into a single representation.
• The ARMA process of orders $p$ and $q$ is defined as
$$X_t = \sum_{j=1}^{p} \phi_j X_{t-j} + \sum_{j=1}^{q} \theta_j \omega_{t-j} + \omega_t.$$
• In lag operator notation, the ARMA($p, q$) process is given by $\Phi(B) X_t = \Theta(B)\omega_t$, $t = 1, \ldots, n$.
• Let's focus on the AR process and its characteristic polynomial.
• The characteristic polynomial can be expressed as
$$\Phi(B) = \prod_{i=1}^{p} (1 - \alpha_i B),$$
where the $\alpha_i$'s are the reciprocal roots.
• If $\beta_1, \beta_2, \ldots, \beta_p$ are such that $\Phi(\beta_i) = 0$ (roots of the polynomial), then $\beta_1 = 1/\alpha_1$, $\beta_2 = 1/\alpha_2$, ..., $\beta_p = 1/\alpha_p$.
• Theorem: If $X_t \sim$ AR($p$), $X_t$ is a stationary process if and only if the moduli of all the roots of the characteristic polynomial are greater than one, i.e. if $||\beta_i|| > 1$ for all $i = 1, 2, \ldots, p$, or equivalently if $||\alpha_i|| < 1$, $i = 1, 2, \ldots, p$.
• The $\alpha_i$'s are also known as the poles of the AR process.
• This theorem follows from the general linear process theory.
• Some of the poles or reciprocal roots can be real numbers and some can be complex numbers, and we need to distinguish between the two cases.
• For the complex case, we will use the representation $\alpha_i = r_i \exp(i\omega_i)$, $i = 1, \ldots, C$, so $C$ is the total number of conjugate pairs and $2C$ is the total number of complex poles.
• $r_i$ is the modulus of $\alpha_i$ and $\omega_i$ its frequency.
• The real reciprocal roots are denoted as $\alpha_i = r_i$, $i = 1, 2, \ldots, R$.
• Example. Consider the AR(1) process $X_t = \phi X_{t-1} + \omega_t$.
• In lag-operator notation this process is $(1 - \phi B) X_t = \omega_t$ and the characteristic polynomial is $\Phi(B) = (1 - \phi B)$.
• If $\Phi(B) = (1 - \phi B) = 0$, the only characteristic root is $\beta = 1/\phi$ (assuming $\phi \neq 0$).
• The AR(1) process is stationary if and only if $|\phi| < 1$, i.e. $-1 < \phi < 1$.
• The case where $\phi = 1$ corresponds to a Random Walk process with zero drift, $X_t = X_{t-1} + \omega_t$.
• This is a non-stationary explosive process.
• If we recursively apply the AR(1) equation, the Random Walk process can be expressed as $X_t = \omega_t + \omega_{t-1} + \omega_{t-2} + \ldots$. Then $\mathrm{Var}(X_t) = \sum_{j=0}^{\infty} \sigma^2 = \infty$.
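• As a quick illustration of the stationarity check for the AR(1), here is a minimal R sketch (the value of $\phi$ is chosen arbitrarily): it computes the characteristic root with polyroot, verifies that its modulus exceeds one, and simulates the process with arima.sim.

> phi <- 0.9                               # assumed AR(1) coefficient, for illustration only
> beta <- polyroot(c(1, -phi))             # root of Phi(B) = 1 - phi*B, here 1/phi
> Mod(beta) > 1                            # TRUE: modulus > 1, so the AR(1) is stationary
> alpha <- 1/beta                          # reciprocal root (pole), with Mod(alpha) < 1
> set.seed(1)
> x <- arima.sim(model = list(ar = phi), n = 500)   # simulate 500 values of the stationary AR(1)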
• Example. Consider the AR(2) process $X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \omega_t$.
• The characteristic polynomial is now
$$\Phi(B) = 1 - \phi_1 B - \phi_2 B^2.$$
• The solutions to $\Phi(B) = 0$ are
$$\beta_1 = \frac{-\phi_1 + \sqrt{\phi_1^2 + 4\phi_2}}{2\phi_2}, \qquad \beta_2 = \frac{-\phi_1 - \sqrt{\phi_1^2 + 4\phi_2}}{2\phi_2}.$$
• The reciprocal roots are
$$\alpha_1 = \frac{\phi_1 + \sqrt{\phi_1^2 + 4\phi_2}}{2}, \qquad \alpha_2 = \frac{\phi_1 - \sqrt{\phi_1^2 + 4\phi_2}}{2}.$$
• The AR(2) is stationary if and only if $||\alpha_1|| < 1$ and $||\alpha_2|| < 1$.
• These two conditions imply that $||\alpha_1 \alpha_2|| = |\phi_2| < 1$ and $||\alpha_1 + \alpha_2|| = |\phi_1| < 2$, which means $-1 < \phi_2 < 1$ and $-2 < \phi_1 < 2$.
• For $\alpha_1$ and $\alpha_2$ real numbers, $\phi_1^2 + 4\phi_2 \geq 0$, which implies $-1 < \alpha_2 \leq \alpha_1 < 1$ and, after some algebra, $\phi_1 + \phi_2 < 1$ and $\phi_2 - \phi_1 < 1$.
• In the complex case, $\phi_1^2 + 4\phi_2 < 0$, i.e. $\phi_2 < -\phi_1^2/4$.
• If we combine all the inequalities we obtain a region bounded by the lines $\phi_2 = 1 + \phi_1$, $\phi_2 = 1 - \phi_1$ and $\phi_2 = -1$.
• This is the region where the AR(2) process is stationary.
• For an AR($p$) with $p \geq 3$, the region where the process is stationary is much harder to describe explicitly.
• For the stationarity condition of the MA($q$) process, we need to rely on the general linear process.
• A general linear process is a random sequence $X_t$ of the form
$$X_t = \sum_{j=0}^{\infty} a_j \omega_{t-j},$$
where $\omega_t$ is a white noise sequence with variance $\sigma^2$.
• In lag operator notation, the general linear process is given by the expression $X_t = \Phi(B)^{-1}\omega_t$, where $\Phi(B)^{-1} = \sum_{j=0}^{\infty} a_j B^j$.
• Note first that, by the definition of the linear process, $E(X_t) = 0$.
• Then, the covariance between $X_t$ and $X_s$ is
$$E[X_t X_s] = \sum_{j=0}^{\infty} \sum_{l=0}^{\infty} a_j a_l E[\omega_{t-j}\omega_{s-l}] = \sigma^2 \sum_{j=0}^{\infty} a_j a_{j+s-t}, \qquad (s \geq t).$$
• The last expression depends on $t$ and $s$ only through the difference $s - t$. Therefore, the process is stationary if $\sum_{j=0}^{\infty} a_j a_{j+k}$ is finite for all non-negative integers $k$.
• Setting $k = 0$, we require that $\sum_{j=0}^{\infty} a_j^2 < \infty$.
• Given that a correlation is always between $-1$ and $1$, $|\gamma_k| \leq \gamma_0$, so if $\gamma_0 < \infty$ then $\sum_{j=0}^{\infty} a_j a_{j+k} < \infty$.
• Then $X_t$ is stationary if and only if $\sum_{j=0}^{\infty} a_j^2 < \infty$.
• The MA($q$) process can be written as a general linear process of the form $X_t = \sum_{j=0}^{\infty} a_j \omega_{t-j}$ with
$$a_j = \begin{cases} \theta_j & j = 0, \ldots, q \\ 0 & j = q+1, q+2, \ldots \end{cases}$$
where $\theta_0 = 1$.
• For the MA($q$) process, $\sum_{j=0}^{\infty} a_j^2 = \sum_{j=0}^{q} \theta_j^2 < \infty$, so a moving average process is always stationary.
• For the ARMA($p,q$) process given by $\Phi(B) X_t = \Theta(B)\omega_t$, $X_t$ is stationary if and only if the roots of $\Phi(B) = 0$ all have modulus greater than one, or equivalently all the reciprocal roots have modulus less than one.
• A concept related to the stationary linear process is that of an invertible process.
• Definition: A process $X_t$ is invertible if
$$X_t = \sum_{j=1}^{\infty} a_j X_{t-j} + \omega_t$$
with the restriction that $\sum_{j=1}^{\infty} a_j^2 < \infty$.
• Basically, an invertible process is an infinite autoregression.
• By definition the AR($p$) is invertible. We can set $a_1 = \phi_1, a_2 = \phi_2, \ldots, a_p = \phi_p$ and $a_j = 0$ for $j > p$. Then $\sum_{j=1}^{\infty} a_j^2 = \sum_{j=1}^{p} \phi_j^2$, which is finite.
• For an MA($q$) process we have $X_t = \Theta(B)\omega_t$. If we find a polynomial $\Theta(B)^{-1}$ such that $\Theta(B)\Theta(B)^{-1} = 1$, then we can invert the process, since $\Theta(B)^{-1} X_t = \omega_t$.
• The MA($q$) process is invertible if and only if the roots of $\Theta(B)$ all have modulus greater than one.
• To illustrate this last point, consider the MA(1) process $X_t = (1 - \theta B)\omega_t$.
• If $|\theta| < 1$ then
$$\Theta(B)^{-1} = \frac{1}{1 - \theta B} = \sum_{j=0}^{\infty} \theta^j B^j.$$
• Since $|\theta| < 1$, $\sum_{j=0}^{\infty} |\theta|^j < \infty$, so the process is invertible and has the representation
$$X_t = -\sum_{j=1}^{\infty} \theta^j X_{t-j} + \omega_t.$$
• The ARMA($p,q$) process is invertible whenever the MA part of the process is invertible, i.e. when $\Theta(B)$ has reciprocal roots with modulus less than one.
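• The same root-based checks extend directly to the AR(2) and MA(1) examples above. The minimal R sketch below (coefficient values chosen arbitrarily, for illustration only) computes the reciprocal roots and tests both the stationarity and the invertibility conditions.

> phi1 <- 0.5; phi2 <- -0.8                # assumed AR(2) coefficients, for illustration only
> beta <- polyroot(c(1, -phi1, -phi2))     # roots of Phi(B) = 1 - phi1*B - phi2*B^2
> alpha <- 1/beta                          # reciprocal roots (a complex conjugate pair here)
> all(Mod(alpha) < 1)                      # TRUE: the AR(2) is stationary
> (phi1 + phi2 < 1) && (phi2 - phi1 < 1) && (abs(phi2) < 1)   # TRUE: same check via the (phi1, phi2) region
> theta <- 0.6                             # assumed MA(1) coefficient, for illustration only
> Mod(polyroot(c(1, -theta))) > 1          # TRUE: root of Theta(B) = 1 - theta*B lies outside the unit circle, so the MA(1) is invertible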
Autocorrelation and Partial Autocorrelation

• The partial autocorrelation function (PACF) of a process $Z_t$ is defined as
$$P_k = \mathrm{Corr}(Z_t, Z_{t+k} \mid Z_{t+1}, \ldots, Z_{t+k-1}), \qquad k = 0, 1, 2, 3, \ldots$$
• The PACF is equal to the ordinary correlation between $Z_t - \hat{Z}_t$ and $Z_{t+k} - \hat{Z}_{t+k}$, where $\hat{Z}_t$ and $\hat{Z}_{t+k}$ are the "best" linear estimators of $Z_t$ and $Z_{t+k}$, respectively.
• The PACF can also be derived through an autoregressive model of order $k$:
$$Z_{t+k} = \phi_{k1} Z_{t+k-1} + \phi_{k2} Z_{t+k-2} + \ldots + \phi_{kk} Z_t + \omega_{t+k}.$$
• The coefficients $\phi_{k1}, \phi_{k2}, \ldots, \phi_{kk}$ define the PACF: the partial autocorrelation at lag $k$ is $\phi_{kk}$.
• We have a set of linear equations whose solution can be obtained via Cramer's rule.
• R/S-Plus includes an option to compute the PACF:
> acf(x, type="partial")
• The partial autocorrelation can be derived as follows. Suppose that $Z_t$ is a zero-mean stationary process.
• Consider a regression model where $Z_{t+k}$ is regressed on the $k$ lagged variables $Z_{t+k-1}, Z_{t+k-2}, \ldots, Z_t$, i.e.,
$$Z_{t+k} = \phi_{k1} Z_{t+k-1} + \phi_{k2} Z_{t+k-2} + \ldots + \phi_{kk} Z_t + \omega_{t+k}.$$
• $\phi_{ki}$ denotes the $i$-th regression parameter and $\omega_{t+k}$ is a normal error term uncorrelated with $Z_{t+k-j}$ for $j \geq 1$.
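• To connect the regression definition above with the acf output, here is a minimal R sketch (the simulated AR(2) coefficients and the lag $k$ are arbitrary illustrative choices): it fits the order-$k$ regression by least squares and compares its last coefficient, $\phi_{kk}$, with the sample PACF at lag $k$. Since acf computes the sample PACF from the sample autocovariances, the two estimates agree only approximately, but they should be close for a long series.

> set.seed(42)
> z <- arima.sim(model = list(ar = c(0.6, -0.3)), n = 1000)   # simulated AR(2); coefficients are illustrative
> k <- 2                                                      # lag at which we compute the PACF
> pacf_k <- acf(z, type = "partial", plot = FALSE)$acf[k]     # sample PACF at lag k
> Z <- embed(z, k + 1)               # each row is (z_{t+k}, z_{t+k-1}, ..., z_t)
> fit <- lm(Z[, 1] ~ Z[, -1] - 1)    # regress z_{t+k} on its k lags, no intercept (zero-mean process)
> coef(fit)[k]                       # phi_kk: approximately equal to pacf_k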