
Stationarity, ACF, PACF, White Noise, Estimation

Stochastic Processes

Definition (Stochastic Process). A stochastic process is a sequence of indexed random variables, denoted $Z(\omega, t)$, where $\omega$ belongs to a sample space and $t$ belongs to an index set. From here on out, we will simply write a stochastic process (or time series) as $\{Z_t\}$ (dropping the $\omega$).

Notation

For the time points $t_1, t_2, \dots, t_n$, denote the $n$-dimensional distribution function of $Z_{t_1}, Z_{t_2}, \dots, Z_{t_n}$ as

$F_{Z_{t_1},\dots,Z_{t_n}}(x_1, \dots, x_n) = P(Z_{t_1} \le x_1, \dots, Z_{t_n} \le x_n)$

Example. Let $Z_1, \dots, Z_n \overset{\text{iid}}{\sim} \mathcal{N}(0,1)$. Then

$F_{Z_{t_1},\dots,Z_{t_n}}(x_1, \dots, x_n) = \prod_{j=1}^{n} \Phi(x_j)$
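This factorization is easy to check numerically in R; a minimal sketch, where $n = 3$ and the evaluation point are arbitrary choices:

x <- c(0.5, -1.0, 2.0)    # evaluation point (x_1, x_2, x_3)
prod(pnorm(x))            # F(x_1, x_2, x_3) = Phi(x_1) * Phi(x_2) * Phi(x_3)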

Stationarity

Definition (Strong Stationarity, aka Strict, aka Complete). A time series $\{x_t\}$ is strongly stationary if any collection

$\{x_{t_1}, x_{t_2}, \dots, x_{t_n}\}$

has the same joint distribution as the time-shifted set

$\{x_{t_1+h}, x_{t_2+h}, \dots, x_{t_n+h}\}$

Strong stationarity implies the following:
- All marginal distributions are equal, i.e. $P(Z_s \le c) = P(Z_t \le c)$ for all $s$, $t$, and $c$. This is what is called "first-order stationary".
- The cross-moment of $Z_s$ and $Z_t$ (if it exists) is shift-invariant, i.e. $E(Z_{s+h} Z_{t+h}) = E(Z_s Z_t)$ for any choice of $h$.

Strong stationarity typically assumes too much. This leads us to the weaker assumption of weak stationarity.

The Mean Function

Definition (Mean Function). The mean function of a time series $\{Z_t\}$ (if it exists) is given by

$\mu_t = E(Z_t) = \int_{-\infty}^{\infty} x\, f_t(x)\, dx$

Example (Mean of an iid MA(q)). Let $a_t \overset{\text{iid}}{\sim} \mathcal{N}(0, \sigma^2)$ and

$Z_t = a_t + \theta_1 a_{t-1} + \theta_2 a_{t-2} + \cdots + \theta_q a_{t-q}$

Then

$\mu_t = E(Z_t) = E(a_t) + \theta_1 E(a_{t-1}) + \theta_2 E(a_{t-2}) + \cdots + \theta_q E(a_{t-q}) \equiv 0$

which is free of the time variable $t$.
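The MA(q) mean calculation is easy to sanity-check by simulation; a minimal sketch in R, where the order $q = 2$, the coefficients, and the sample size are arbitrary choices:

set.seed(1)
n     <- 10000
theta <- c(0.6, -0.3)               # (theta_1, theta_2), arbitrary
a     <- rnorm(n + 2)               # a_t ~ iid N(0, 1)
# Z_t = a_t + theta_1 * a_{t-1} + theta_2 * a_{t-2}
Z <- a[3:(n + 2)] + theta[1] * a[2:(n + 1)] + theta[2] * a[1:n]
mean(Z)                             # should be near the theoretical mean 0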

White Noise

Definition (White Noise). White noise is a collection of uncorrelated random variables with constant mean and constant variance.

Notation. $a_t \sim \mathrm{WN}(0, \sigma^2)$ — white noise with mean zero and variance $\sigma^2$.

IID White Noise. If $a_s$ is independent of $a_t$ for all $s \ne t$, then we write $a_t \sim \mathrm{IID}(0, \sigma^2)$.

Gaussian White Noise $\Rightarrow$ IID. Suppose the $a_t$ are jointly normally distributed. Since uncorrelated + normality $\Rightarrow$ independent, it follows that $a_t \sim \mathrm{IID}(0, \sigma^2)$ (a stronger assumption).

Now your turn! Mean of a Random Walk with Drift

Suppose $Z_0 = 0$ and, for $t > 0$, $Z_t = \delta + Z_{t-1} + a_t$, where $\delta$ is a constant and $a_t \sim \mathrm{WN}(0, \sigma^2)$. What is the mean function of $Z_t$? (That is, compute $E[Z_t]$.)

Note that

$Z_t = \delta + Z_{t-1} + a_t = 2\delta + Z_{t-2} + a_{t-1} + a_t \quad (\text{since } Z_{t-1} = \delta + Z_{t-2} + a_{t-1}) = \cdots = \delta t + \sum_{j=1}^{t} a_j$

Therefore we see

$\mu_t = E[Z_t] = E[\delta t] + E\left[\sum_{j=1}^{t} a_j\right] = \delta t + \sum_{j=1}^{t} E[a_j] = \delta t$
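The answer $\mu_t = \delta t$ can be checked by averaging many simulated paths; a minimal sketch, taking $\delta = 0.2$, $\sigma = 1$, and 5000 replications (all arbitrary choices):

set.seed(1)
delta <- 0.2; n <- 100; reps <- 5000
# each column is one path built from the representation Z_t = delta*t + sum(a_j)
paths <- replicate(reps, delta * (1:n) + cumsum(rnorm(n)))
mean(paths[n, ])                    # average of Z_100; theory gives delta * t = 20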

The Covariance Function

Definition (Covariance Function). The covariance function of a random sequence $\{Z_t\}$ is

$\gamma(s,t) = \mathrm{cov}(Z_s, Z_t) = E[(Z_s - \mu_s)(Z_t - \mu_t)]$

So in particular, if $s = t$, then

$\mathrm{var}(Z_t) = \mathrm{cov}(Z_t, Z_t) = \gamma(t,t) = E\left[(Z_t - \mu_t)^2\right]$

Also note that $\gamma(s,t) = \gamma(t,s)$ since $\mathrm{cov}(Z_s, Z_t) = \mathrm{cov}(Z_t, Z_s)$.

Example (Covariance Function of White Noise). Let $a_t \sim \mathrm{WN}(0, \sigma^2)$. By the definition of white noise, we have

$\gamma(s,t) = E(a_s a_t) = \begin{cases} \sigma^2, & s = t \\ 0, & s \ne t \end{cases}$

Example (Covariance Function of MA(1)). Let $a_t \sim \mathrm{WN}(0, \sigma^2)$ and $Z_t = a_t + \theta a_{t-1}$ (where $\theta$ is a constant). Then

$\gamma(s,t) = \mathrm{cov}(a_t + \theta a_{t-1},\, a_s + \theta a_{s-1}) = \mathrm{cov}(a_t, a_s) + \theta\,\mathrm{cov}(a_t, a_{s-1}) + \theta\,\mathrm{cov}(a_{t-1}, a_s) + \theta^2\,\mathrm{cov}(a_{t-1}, a_{s-1})$

If $s = t$, then

$\gamma(s,t) = \gamma(t,t) = \sigma^2 + \theta^2\sigma^2 = (1 + \theta^2)\sigma^2$

If $s = t - 1$ or $s = t + 1$, then

$\gamma(s,t) = \gamma(t, t+1) = \gamma(t, t-1) = \theta\sigma^2$

So all together we have

$\gamma(s,t) = \begin{cases} (1 + \theta^2)\sigma^2, & \text{if } s = t \\ \theta\sigma^2, & \text{if } |s - t| = 1 \\ 0, & \text{else} \end{cases}$
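These MA(1) formulas can be verified empirically; a minimal sketch, with $\theta = 0.7$ and $\sigma = 1$ as arbitrary choices:

set.seed(1)
n     <- 50000
theta <- 0.7
a <- rnorm(n + 1)                   # a_t ~ WN(0, 1)
Z <- a[2:(n + 1)] + theta * a[1:n]  # Z_t = a_t + theta * a_{t-1}
var(Z)                              # theory: (1 + theta^2) * sigma^2 = 1.49
cov(Z[-1], Z[-n])                   # lag 1; theory: theta * sigma^2 = 0.7
cov(Z[-(1:2)], Z[-((n - 1):n)])     # lag 2; theory: 0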

Now your turn! Covariance Function of a Random Walk with Drift

Suppose $Z_0 = 0$ and, for $t > 0$, $Z_t = \delta + Z_{t-1} + a_t$, where $\delta$ is a constant and $a_t \sim \mathrm{WN}(0, \sigma^2)$. What is the covariance function of $Z_t$? (That is, compute $\mathrm{cov}(Z_s, Z_t)$.)

From the representation

$Z_t = \delta t + \sum_{j=1}^{t} a_j$

we see

$\gamma(s,t) = \mathrm{cov}(Z_s, Z_t) = \mathrm{cov}\left(\sum_{j=1}^{s} a_j, \sum_{k=1}^{t} a_k\right) = \sum_{j=1}^{s} \sum_{k=1}^{t} \mathrm{cov}(a_j, a_k) = \min(s,t)\,\sigma^2$

The Correlation Function

Definition (Correlation Function). The correlation function of a random sequence $\{Z_t\}$ is

$\rho(s,t) = \dfrac{\gamma(s,t)}{\sqrt{\gamma(s,s)\,\gamma(t,t)}}$

The Cauchy–Schwarz inequality implies $|\rho(s,t)| \le 1$.
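Both $\gamma(s,t) = \min(s,t)\sigma^2$ and the implied correlation $\rho(s,t) = \min(s,t)/\sqrt{st}$ can be checked by Monte Carlo; a minimal sketch, with $\delta = 0.2$, $\sigma = 1$, $s = 20$, and $t = 50$ as arbitrary choices (the deterministic drift does not affect the covariance):

set.seed(1)
delta <- 0.2; n <- 50; reps <- 20000
paths <- replicate(reps, delta * (1:n) + cumsum(rnorm(n)))
s <- 20; t <- 50
cov(paths[s, ], paths[t, ])         # theory: min(s, t) * sigma^2 = 20
cor(paths[s, ], paths[t, ])         # theory: min(s, t) / sqrt(s * t) = 0.632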

Weak Stationarity

Definition (Weak Stationarity). A weakly stationary time series is a finite-variance process that
(i) has a mean function, $\mu_t$, that is constant (so it doesn't depend on $t$);
(ii) has a covariance function, $\gamma(s,t)$, that depends on $s$ and $t$ only through the difference $|s - t|$.

From now on, when we say stationary, we mean weakly stationary.

For stationary processes, the mean function is $\mu$ (no $t$). Since $\gamma(s,t) = \gamma(s+h, t+h)$ for stationary processes, we have $\gamma(s,t) = \gamma(s-t, 0)$. Therefore we can view the covariance function as a function of only one variable ($h = s - t$); this is the autocovariance function $\gamma(h)$.

The Autocovariance and Autocorrelation Functions

Definition (Autocovariance Function). The autocovariance function of a stationary time series is

$\gamma_h = \gamma(h) = E[(Z_{t+h} - \mu)(Z_t - \mu)]$

(for any value of $t$).

Definition (Autocorrelation Function). The autocorrelation function of a stationary time series is

$\rho_h = \rho(h) = \dfrac{\gamma(h)}{\gamma(0)}$

Properties of the Autocovariance/Autocorrelation Functions

- $\gamma_0 = \mathrm{var}(Z_t)$; $\rho_0 = 1$
- $|\gamma_k| \le \gamma_0$; $|\rho_k| \le 1$
- $\gamma_k = \gamma_{-k}$; $\rho_k = \rho_{-k}$
- $\gamma_k$ and $\rho_k$ are positive semidefinite (p.s.d.), i.e.

$\sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j \gamma_{|t_i - t_j|} \ge 0 \quad\text{and}\quad \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j \rho_{|t_i - t_j|} \ge 0$

for any set of times $t_1, \dots, t_n$ and any real numbers $\alpha_1, \dots, \alpha_n$.

Proof. Define $X = \sum_{i=1}^{n} \alpha_i Z_{t_i}$. Then

$0 \le \mathrm{var}(X) = \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j\, \mathrm{cov}(Z_{t_i}, Z_{t_j}) = \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j\, \gamma_{|t_i - t_j|}$

White Noise

White noise, denoted by $a_t \sim \mathrm{WN}(0, \sigma^2)$, is by definition a weakly stationary process with autocovariance function

$\gamma_k = \begin{cases} \sigma^2, & k = 0 \\ 0, & k \ne 0 \end{cases}$

and autocorrelation function

$\rho_k = \begin{cases} 1, & k = 0 \\ 0, & k \ne 0 \end{cases}$

Not all white noise is as boring as iid $\mathcal{N}(0, \sigma^2)$. For example, stationary GARCH processes can have all the properties of white noise.
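The white-noise ACF is easy to visualize with R's built-in acf(); a minimal sketch using Gaussian white noise:

set.seed(1)
a <- rnorm(1000)                    # Gaussian white noise with sigma^2 = 1
acf(a, lag.max = 20)                # rho_0 = 1; nonzero lags should be near 0,
                                    # inside the +/- 1.96/sqrt(1000) band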

Random Walk Simulation in R

Simulate the random walk $Z_t = 0.2 + Z_{t-1} + a_t$, where $a_t \overset{\text{iid}}{\sim} \mathcal{N}(0, \sigma^2)$. Hint: use the representation $Z_t = 0.2\,t + \sum_{j=1}^{t} a_j$!

w  <- rnorm(200, 0, 1)               # a_t ~ iid N(0, 1)
x  <- cumsum(w)                      # random walk without drift
wd <- w + 0.2                        # increments with drift delta = 0.2
xd <- cumsum(wd)                     # random walk with drift
plot.ts(xd, ylim = c(-5, 55))        # plot the drifting walk
lines(x)                             # overlay the driftless walk
lines(0.2 * (1:200), lty = "dashed") # theoretical mean function mu_t = 0.2 t

Homework 1b

Read the following sections from the textbook:
- §2.5: Estimation of the Mean, Autocovariances, and Autocorrelations
- §2.6: Moving Average and Autoregressive Representations of Time Series Processes

Do the following exercise. Prove that a time series of iid random variables is strongly stationary.
