
GARCH processes - probabilistic properties (Part 1)

Alexander Lindner
Centre of Mathematical Sciences, Technical University of Munich
D-85747 Garching, Germany
[email protected]
http://www-m1.ma.tum.de/m4/pers/lindner/

Maphysto Workshop: Non-Linear Time Series Modeling
Copenhagen, September 27-30, 2004


Contents

1. Definition of GARCH(p,q) processes
2. Markov property
3. Strict stationarity of GARCH(1,1)
4. Existence of the 2nd moment of the stationary solution
5. Tail behaviour, extremal behaviour
6. What can be done for GARCH(p,q)?
7. GARCH is White Noise
8. ARMA representation of the squared GARCH process
9. The EGARCH process and further processes


GARCH(p,q) on N_0

Let (\varepsilon_t)_{t \in N_0} be i.i.d. with P(\varepsilon_0 = 0) = 0,   (1)

let \alpha_0 > 0, \alpha_1, ..., \alpha_p \ge 0, \beta_1, ..., \beta_q \ge 0,
and let (\sigma_0^2, ..., \sigma_{\max(p,q)-1}^2) be independent of \{\varepsilon_t : t \ge \max(p,q) - 1\}. Set

    Y_t = \sigma_t \varepsilon_t,   (2)

    \sigma_t^2 = \alpha_0 + \sum_{i=1}^p \alpha_i Y_{t-i}^2 + \sum_{j=1}^q \beta_j \sigma_{t-j}^2,   t \ge \max(p,q).   (3)

The sum \sum_i \alpha_i Y_{t-i}^2 is the ARCH part, the sum \sum_j \beta_j \sigma_{t-j}^2 the GARCH part.
(Y_t)_{t \in N_0} is called a GARCH(p,q) process.


GARCH(p,q) on Z

Equations (1), (2) and (3) hold for all t \in Z, and \varepsilon_t is independent of

    F_{t-1} := \sigma(Y_s : s \le t-1).

ARCH(p): Engle (1982)
GARCH(p,q): Bollerslev (1986)
GARCH = Generalized AutoRegressive Conditional Heteroscedasticity


Why conditional volatility?

Suppose \sigma_t^2 \in F_{t-1}, or equivalently \sigma_t^2 \in \sigma(\varepsilon_s : s \le t-1), for all t \in Z
(the process is "causal").
Suppose E\varepsilon_t = 0, E\varepsilon_t^2 = 1 and EY_t^2 < \infty. Then

    E(Y_t | F_{t-1}) = E(\sigma_t \varepsilon_t | F_{t-1}) = \sigma_t E(\varepsilon_t | F_{t-1}) = 0,

    V(Y_t | F_{t-1}) = E(\sigma_t^2 \varepsilon_t^2 | F_{t-1}) = \sigma_t^2 E(\varepsilon_t^2 | F_{t-1}) = \sigma_t^2.

Hence \sigma_t^2 is the conditional variance.


Markov property

(a) GARCH(1,1): (\sigma_t^2)_{t \in Z} is a Markov process, since

    \sigma_t^2 = \alpha_0 + (\alpha_1 \varepsilon_{t-1}^2 + \beta_1) \sigma_{t-1}^2.

(\sigma_t^2, Y_t)_{t \in Z} is a Markov process, but (Y_t)_{t \in Z} is not.

(b) GARCH(p,q) with \max(p,q) > 1: (\sigma_t^2)_{t \in Z} is not a Markov process,
but (\sigma_t^2, ..., \sigma_{t-\max(p,q)+1}^2)_{t \in Z} is a Markov process.


Strict stationarity - GARCH(1,1)

    \sigma_t^2 = \alpha_0 + \alpha_1 \sigma_{t-1}^2 \varepsilon_{t-1}^2 + \beta_1 \sigma_{t-1}^2
               = (\alpha_1 \varepsilon_{t-1}^2 + \beta_1) \sigma_{t-1}^2 + \alpha_0
              =: A_t \sigma_{t-1}^2 + B_t.   (4)

((A_t, B_t))_{t \in Z} is an i.i.d. sequence.
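The GARCH(1,1) recursion is easy to simulate directly from (2) and (3). A minimal sketch in Python, with purely illustrative parameters (alpha0 = 0.1, alpha1 = 0.1, beta1 = 0.8, standard normal innovations), which also estimates gamma = E log(alpha1 * eps^2 + beta1) by Monte Carlo, the quantity governing strict stationarity on the following slides:

```python
import numpy as np

def simulate_garch11(alpha0, alpha1, beta1, n, rng):
    """Simulate a GARCH(1,1) path: Y_t = sigma_t * eps_t,
    sigma_t^2 = alpha0 + alpha1 * Y_{t-1}^2 + beta1 * sigma_{t-1}^2."""
    eps = rng.standard_normal(n)
    sigma2 = np.empty(n)
    y = np.empty(n)
    sigma2[0] = alpha0 / (1.0 - alpha1 - beta1)  # start at the stationary variance
    y[0] = np.sqrt(sigma2[0]) * eps[0]
    for t in range(1, n):
        sigma2[t] = alpha0 + alpha1 * y[t - 1] ** 2 + beta1 * sigma2[t - 1]
        y[t] = np.sqrt(sigma2[t]) * eps[t]
    return y, sigma2

rng = np.random.default_rng(0)
alpha0, alpha1, beta1 = 0.1, 0.1, 0.8  # illustrative values only
y, sigma2 = simulate_garch11(alpha0, alpha1, beta1, 10_000, rng)

# Monte Carlo estimate of gamma = E log(alpha1 * eps^2 + beta1);
# a negative value indicates a strictly stationary solution exists.
z = rng.standard_normal(100_000)
gamma = np.mean(np.log(alpha1 * z ** 2 + beta1))
print(gamma)  # negative here, since alpha1 + beta1 < 1
```

Since alpha1 + beta1 = 0.9 < 1, Jensen's inequality gives E log A_1 < log E A_1 = log 0.9 < 0, which the Monte Carlo estimate confirms.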
Equation (4) is called a random recurrence equation. Iterating,

    \sigma_t^2 = A_t \sigma_{t-1}^2 + B_t
               = A_t A_{t-1} \sigma_{t-2}^2 + A_t B_{t-1} + B_t
               = ...
               = A_t \cdots A_{t-k} \sigma_{t-k-1}^2 + \sum_{i=0}^k A_t \cdots A_{t-i+1} B_{t-i},   (5)

where B_{t-i} = \alpha_0 for every i. Let k \to \infty and hope that (5) converges.


Strict stationarity - continued

Suppose \gamma := E \log A_1 < 0. By the strong law of large numbers (a random walk!),

    (1/k) (\log A_t + \log A_{t-1} + ... + \log A_{t-k+1}) \to E \log A_1   a.s. as k \to \infty.

Hence for almost every \omega there is n_0(\omega) such that

    (1/k) \log(A_t \cdots A_{t-k+1}) \le \gamma/2 < 0   for all k \ge n_0(\omega),

so |A_t \cdots A_{t-k+1}| \le e^{k\gamma/2} for all k \ge n_0(\omega).
Hence (5) converges almost surely.


Theorem (Nelson, 1990)

If

    E \log(\alpha_1 \varepsilon_1^2 + \beta_1) < 0,   (6)

then GARCH(1,1) has a strictly stationary solution. Its marginal distribution is given by

    \sigma_0^2 = \alpha_0 \sum_{i=0}^\infty (\alpha_1 \varepsilon_{-1}^2 + \beta_1) \cdots (\alpha_1 \varepsilon_{-i}^2 + \beta_1).   (7)

If \beta_1 > 0 (i.e. GARCH but not ARCH) and a strictly stationary solution exists,
then (6) holds.

Remark: If E \log^+ |A_1| < \infty, then

    \gamma = \inf \{ (n+1)^{-1} E \log |A_0 A_{-1} \cdots A_{-n}| : n \in N \}

is called the Lyapunov exponent of the sequence (A_n)_{t \in Z}.


When is EY_t^2 < \infty?

From (7) and the independence of the \varepsilon_t,

    E\sigma_0^2 = \alpha_0 \sum_{i=0}^\infty (E(\alpha_1 \varepsilon_1^2 + \beta_1))^i < \infty
              \iff \alpha_1 E\varepsilon_1^2 + \beta_1 < 1.

Since EY_t^2 = E\sigma_t^2 \, E\varepsilon_t^2, the stationary solution (Y_t)_{t \in Z} has finite
variance if and only if \alpha_1 E\varepsilon_1^2 + \beta_1 < 1.


Tail behaviour of stochastic recurrence equations

Theorem (Kesten 1973, Vervaat 1979, Goldie 1991)

Suppose (Z_t)_{t \in N} satisfies the stochastic recurrence equation

    Z_t = A_t Z_{t-1} + B_t,   t \in N,

where ((A_t, B_t))_{t \in N} is an i.i.d. sequence with (A_t, B_t) =_d (A, B).
Suppose there exists \kappa > 0 such that

(i)   the law of \log |A|, given A \ne 0, is not concentrated on rZ for any r > 0;
(ii)  E|A|^\kappa = 1;
(iii) E|A|^\kappa \log^+ |A| < \infty;
(iv)  E|B|^\kappa < \infty.

Then Z =_d AZ + B, where Z is independent of (A, B), has a unique solution in
distribution, which satisfies

    \lim_{x \to \infty} x^\kappa P(Z > x) = E[((AZ + B)^+)^\kappa - ((AZ)^+)^\kappa] / (\kappa E(|A|^\kappa \log |A|)) =: C \ge 0.   (8)


Tail behaviour of GARCH(1,1)

Corollary: Suppose \varepsilon_1 has a continuous distribution, P(\varepsilon_1 > 0) > 0, and all of
its moments exist.
Further, suppose that

    E \log(\alpha_1 \varepsilon_1^2 + \beta_1) < 0.

Then there exist \kappa > 0 and constants C_1 > 0, C_2 > 0 such that for the stationary
solutions (\sigma_t^2) and (Y_t):

    \lim_{x \to \infty} x^\kappa P(\sigma_0^2 > x) = C_1,
    \lim_{x \to \infty} x^{2\kappa} P(Y_0 > x) = C_2.

Mikosch and Stărică (2000): GARCH(1,1);
de Haan, Resnick, Rootzén and de Vries: ARCH(1).


Extremal behaviour of GARCH(1,1)

Mikosch and Stărică (2000) and de Haan et al. (1989) gave extreme value theory for
GARCH(1,1) and ARCH(1). Suppose a stationary solution (Y_t)_{t \in N_0} exists, and let
(\tilde Y_t)_{t \in N_0} be an i.i.d. sequence with \tilde Y_0 =_d Y_0. Then under the previous
assumptions there is \theta \in (0, 1) such that, for a certain sequence c_t > 0, t \in N,

    \lim_{t \to \infty} P( \max_{i=1,...,t} Y_i \le c_t x ) = \exp(-\theta x^{-2\kappa}),   x > 0,

    \lim_{t \to \infty} P( \max_{i=1,...,t} \tilde Y_i \le c_t x ) = \exp(-x^{-2\kappa}),   x > 0.

The GARCH(1,1) process has an extremal index \theta < 1, i.e. exceedances over large
thresholds occur in clusters, with an average cluster length of 1/\theta.


What can be done for GARCH(p,q)?

Set

    \tau_t := (\beta_1 + \alpha_1 \varepsilon_t^2, \beta_2, ..., \beta_{q-1}) \in R^{q-1},
    \xi_t := (\varepsilon_t^2, 0, ..., 0) \in R^{q-1},
    \alpha := (\alpha_2, ..., \alpha_{p-1}) \in R^{p-2},

let I_k denote the k x k identity matrix, and define the (p+q-1) x (p+q-1) matrix

    M_t := ( \tau_t    \beta_q   \alpha     \alpha_p )
           ( I_{q-1}   0         0          0        )
           ( \xi_t     0         0          0        )
           ( 0         0         I_{p-2}    0        ),

    N := (\alpha_0, 0, ..., 0)' \in R^{p+q-1},
    X_t := (\sigma_{t+1}^2, ..., \sigma_{t-q+2}^2, Y_t^2, ..., Y_{t-p+2}^2)'.

Then (Y_t)_{t \in Z} solves the GARCH(p,q) equation if and only if

    X_{t+1} = M_{t+1} X_t + N,   t \in Z.   (9)


GARCH(p,q) - continued

(9) is a random recurrence equation, so the theory for the existence of stationary
solutions can be applied. For example, if E\varepsilon_1 = 0 and E\varepsilon_1^2 = 1, then a
necessary and sufficient condition for the existence of a strictly stationary solution
with finite second moments is

    \sum_{i=1}^p \alpha_i + \sum_{j=1}^q \beta_j < 1.   (10)

(Bollerslev (1986), Bougerol and Picard (1992))

But stationary solutions can also exist if

    \sum_{i=1}^p \alpha_i + \sum_{j=1}^q \beta_j = 1.

A characterization of the existence of stationary solutions is achieved via the
Lyapunov exponent of the matrix sequence (M_t)_{t \in Z} (whether it is strictly
negative or not).
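For GARCH(1,1), where A = alpha1 * eps^2 + beta1, the tail index kappa above is the solution of E|A|^kappa = 1 and can be found numerically. A sketch (not from the slides) that replaces the expectation by a Monte Carlo average and then applies bisection, with illustrative parameters and standard normal innovations:

```python
import numpy as np

def tail_index_garch11(alpha1, beta1, n_mc=200_000, seed=1):
    """Solve E[(alpha1 * eps^2 + beta1)^kappa] = 1 for kappa > 0 by bisection,
    with eps ~ N(0,1) and the expectation replaced by a Monte Carlo average."""
    rng = np.random.default_rng(seed)
    a = alpha1 * rng.standard_normal(n_mc) ** 2 + beta1
    h = lambda kappa: np.mean(a ** kappa) - 1.0  # h(0) = 0, h convex

    lo, hi = 1e-6, 1.0
    while h(hi) < 0:              # expand until the root kappa is bracketed
        lo, hi = hi, 2.0 * hi
    for _ in range(60):           # bisection on [lo, hi]
        mid = 0.5 * (lo + hi)
        if h(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

kappa = tail_index_garch11(0.1, 0.8)
print(kappa)  # P(sigma_0^2 > x) then decays like x^{-kappa}
```

The bracketing step relies on h being negative near 0 (since E log A < 0) and eventually positive, which holds whenever P(A > 1) > 0; the Monte Carlo average makes this a rough estimate, not an exact root.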
GARCH(p,q) is White Noise

Suppose E\varepsilon_t = 0, E\varepsilon_t^2 = 1 and (10). Then for h \ge 1,

    EY_t = E(\sigma_t \varepsilon_t) = E(\sigma_t E(\varepsilon_t | F_{t-1})) = 0,

    E(Y_t Y_{t+h}) = E(\sigma_t \sigma_{t+h} \varepsilon_t E(\varepsilon_{t+h} | F_{t+h-1})) = 0,

since E(\varepsilon_{t+h} | F_{t+h-1}) = 0. So (Y_t)_{t \in Z} is White Noise,
but (Y_t^2)_{t \in Z} is not White Noise. Define

    u_t := Y_t^2 - \sigma_t^2 = \sigma_t^2 (\varepsilon_t^2 - 1).

Suppose Eu_t^2 < \infty and let h \ge 1. Then

    Eu_t = 0,

    E(u_t u_{t+h}) = E(\sigma_t^2 (\varepsilon_t^2 - 1) \sigma_{t+h}^2) \, E(\varepsilon_{t+h}^2 - 1) = 0.

Hence (u_t)_{t \in Z} is White Noise.


The ARMA representation of Y_t^2

    \sigma_t^2 = \alpha_0 + \sum_{i=1}^p \alpha_i Y_{t-i}^2 + \sum_{j=1}^q \beta_j \sigma_{t-j}^2.

Substitute \sigma_t^2 = Y_t^2 - u_t. Then

    Y_t^2 - \sum_{i=1}^p \alpha_i Y_{t-i}^2 - \sum_{j=1}^q \beta_j Y_{t-j}^2 = \alpha_0 + u_t - \sum_{j=1}^q \beta_j u_{t-j}.

Hence (Y_t^2)_{t \in Z} satisfies an ARMA(max(p,q), q) equation, and the autocorrelation
function and spectral density of (Y_t^2)_{t \in Z} are those of an ARMA(max(p,q), q)
process with the given parameters.


Drawbacks of GARCH

- Black (1976): volatility tends to rise in response to "bad news" and to fall in
  response to "good news".
- The volatility in the GARCH process is determined only by the magnitude of the
  previous return and shock, not by its sign.
- The parameters in GARCH are restricted to be positive to ensure positivity of
  \sigma_t^2. When estimating, however, the best fits are sometimes achieved for
  negative parameters.


Exponential GARCH (EGARCH) (Nelson, 1991)

Let (\varepsilon_t)_{t \in Z} be i.i.d. with mean 0 and variance 1, and set

    Y_t = \sigma_t \varepsilon_t,

    \log(\sigma_t^2) = \alpha_0 + \sum_{i=1}^p \alpha_i g(\varepsilon_{t-i}) + \sum_{j=1}^q \beta_j \log(\sigma_{t-j}^2),

    g(\varepsilon_t) = \theta \varepsilon_t + \gamma (|\varepsilon_t| - E|\varepsilon_t|),   \theta + \gamma \ne 0.

Most often, \varepsilon_t is assumed to be i.i.d. standard normal.
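An EGARCH(1,1) path can be sketched in the same way as GARCH(1,1); for standard normal innovations, E|eps_t| = sqrt(2/pi). The parameter values below are again purely illustrative, and no positivity constraints are needed since the recursion acts on log sigma_t^2:

```python
import numpy as np

def simulate_egarch11(alpha0, alpha1, beta1, theta, gamma, n, rng):
    """Simulate EGARCH(1,1): Y_t = sigma_t * eps_t,
    log sigma_t^2 = alpha0 + alpha1 * g(eps_{t-1}) + beta1 * log sigma_{t-1}^2,
    g(eps) = theta * eps + gamma * (|eps| - E|eps|)."""
    e_abs = np.sqrt(2.0 / np.pi)        # E|eps| for eps ~ N(0,1)
    eps = rng.standard_normal(n)
    logs2 = np.empty(n)
    logs2[0] = alpha0 / (1.0 - beta1)   # stationary mean of log sigma^2
    for t in range(1, n):
        g = theta * eps[t - 1] + gamma * (np.abs(eps[t - 1]) - e_abs)
        logs2[t] = alpha0 + alpha1 * g + beta1 * logs2[t - 1]
    y = np.exp(0.5 * logs2) * eps
    return y, np.exp(logs2)

rng = np.random.default_rng(2)
# theta < 0 makes negative shocks raise volatility more than positive ones,
# matching the "bad news" asymmetry noted by Black (1976)
y, sigma2 = simulate_egarch11(-0.1, 0.2, 0.95, -0.7, 0.3, 5_000, rng)
```

Because log sigma_t^2 follows an AR(1) recursion with |beta1| < 1, the simulated volatility stays positive and stable without any sign restrictions on the coefficients.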