Stochastic Process, ACF, PACF, White Noise, Estimation

Outline
1 §2.1: Stationarity
2 §2.2: Autocovariance and Autocorrelation Functions
3 §2.4: White Noise
4 R: Random Walk
5 Homework 1b
Arthur Berg, Stationarity, ACF, White Noise, Estimation

Stochastic Process
Definition (stochastic process). A stochastic process is a sequence of indexed random variables, denoted $Z(\omega, t)$, where $\omega$ belongs to a sample space and $t$ belongs to an index set.
From here on out, we will simply write a stochastic process (or time series) as $\{Z_t\}$ (dropping the $\omega$).

Notation

For the time units $t_1, t_2, \dots, t_n$, denote the $n$-dimensional distribution function of $Z_{t_1}, Z_{t_2}, \dots, Z_{t_n}$ as
$$F_{Z_{t_1},\dots,Z_{t_n}}(x_1, \dots, x_n) = P\left(Z_{t_1} \le x_1, \dots, Z_{t_n} \le x_n\right)$$

Example. Let $Z_1, \dots, Z_n \overset{iid}{\sim} \mathcal{N}(0, 1)$. Then $F_{Z_{t_1},\dots,Z_{t_n}}(x_1, \dots, x_n) = \prod_{j=1}^{n} \Phi(x_j)$.
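This factorization can be checked by simulation; here is a minimal R sketch (R is the language used later in these slides; the variable names are our own, purely for illustration):

```r
# For iid N(0,1) variables the joint CDF factors, so a Monte Carlo
# estimate of F(0, 0) should match Phi(0)^2 = 0.25.
set.seed(1)
n.rep <- 1e5
Z1 <- rnorm(n.rep)
Z2 <- rnorm(n.rep)
empirical   <- mean(Z1 <= 0 & Z2 <= 0)  # Monte Carlo estimate of F(0, 0)
theoretical <- pnorm(0)^2               # Phi(0)^2 = 0.25
c(empirical, theoretical)
```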
Stationarity
Definition (Strong Stationarity (aka Strict, aka Complete))
A time series $\{x_t\}$ is strongly stationary if any collection $\{x_{t_1}, x_{t_2}, \dots, x_{t_n}\}$ has the same joint distribution as the time-shifted set $\{x_{t_1+h}, x_{t_2+h}, \dots, x_{t_n+h}\}$.
Strong stationarity implies the following:
All marginal distributions are equal, i.e. $P(Z_s \le c) = P(Z_t \le c)$ for all $s$, $t$, and $c$. This is what is called "first-order stationarity".
The covariance of $Z_s$ and $Z_t$ (if it exists) is shift-invariant, i.e. $\mathrm{cov}(Z_{s+h}, Z_{t+h}) = \mathrm{cov}(Z_s, Z_t)$ for any choice of $h$.

Strong stationarity typically assumes too much. This leads us to the weaker assumption of weak stationarity.
Mean Function
Definition (Mean Function)
The mean function of a time series $\{Z_t\}$ (if it exists) is given by
$$\mu_t = E(Z_t) = \int_{-\infty}^{\infty} x\, f_t(x)\, dx$$
Example (Mean of an iid MA(q))
Let $a_t \overset{iid}{\sim} \mathcal{N}(0, \sigma^2)$ and
$$Z_t = a_t + \theta_1 a_{t-1} + \theta_2 a_{t-2} + \cdots + \theta_q a_{t-q},$$
then
$$\mu_t = E(Z_t) = E(a_t) + \theta_1 E(a_{t-1}) + \theta_2 E(a_{t-2}) + \cdots + \theta_q E(a_{t-q}) \equiv 0$$
(free of the time variable $t$).
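A quick simulation sketch of this fact, with an arbitrary choice of $q = 2$ and coefficients (our own, not from the slides):

```r
# Simulate an MA(2) and check that the sample mean is near mu_t = 0.
set.seed(1)
n     <- 1e4
theta <- c(0.5, -0.3)          # theta_1, theta_2 (arbitrary values)
a     <- rnorm(n + 2)          # iid N(0, 1) innovations
Z     <- a[3:(n + 2)] + theta[1] * a[2:(n + 1)] + theta[2] * a[1:n]
mean(Z)                        # close to 0
```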
White Noise
Definition (White Noise) White noise is a collection of uncorrelated random variables with constant mean and variance.
Notation

$a_t \sim WN(0, \sigma^2)$: white noise with mean zero and variance $\sigma^2$.

IID WN: If $a_s$ is independent of $a_t$ for all $s \ne t$, then $a_t \sim IID(0, \sigma^2)$.

Gaussian White Noise $\Rightarrow$ IID: Suppose $a_t$ is normally distributed. Uncorrelated + normality $\Rightarrow$ independent. Thus it follows that $a_t \sim IID(0, \sigma^2)$ (a stronger assumption).
Now your turn!
Mean of a Random Walk with Drift
Suppose $Z_0 = 0$, and for $t > 0$, $Z_t = \delta + Z_{t-1} + a_t$ where $\delta$ is a constant and $a_t \sim WN(0, \sigma^2)$. What is the mean function of $Z_t$? (That is, compute $E[Z_t]$.)
Note that
\begin{align*}
Z_t &= \delta + Z_{t-1} + a_t \\
    &= 2\delta + Z_{t-2} + a_t + a_{t-1} && (Z_{t-1} = \delta + Z_{t-2} + a_{t-1}) \\
    &\;\;\vdots \\
    &= \delta t + \sum_{j=1}^{t} a_j
\end{align*}
Therefore we see
$$\mu_t = E[Z_t] = E[\delta t] + E\left[\sum_{j=1}^{t} a_j\right] = \delta t + \sum_{j=1}^{t} E[a_j] = \delta t$$
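The answer can be checked numerically with a short R sketch (parameter values below are our own choices):

```r
# Average many simulated drifting paths; the path-wise average at each t
# should track the mean function mu_t = delta * t.
set.seed(1)
delta <- 0.2; t.max <- 50; n.path <- 5000
paths <- apply(matrix(rnorm(t.max * n.path), t.max, n.path), 2,
               function(a) cumsum(delta + a))   # one column per path
mu.hat <- rowMeans(paths)      # estimated mean function, t = 1, ..., 50
mu.hat[t.max]                  # close to delta * t.max = 10
```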
The Covariance Function
Definition (Covariance Function)
The covariance function of a random sequence {Zt} is
$$\gamma(s, t) = \mathrm{cov}(Z_s, Z_t) = E\left[(Z_s - \mu_s)(Z_t - \mu_t)\right]$$
So in particular, we have
$$\mathrm{var}(Z_t) = \mathrm{cov}(Z_t, Z_t) = \gamma(t, t) = E\left[(Z_t - \mu_t)^2\right]$$
Also note that $\gamma(s, t) = \gamma(t, s)$ since $\mathrm{cov}(Z_s, Z_t) = \mathrm{cov}(Z_t, Z_s)$.

Example (Covariance Function of White Noise). Let $a_t \sim WN(0, \sigma^2)$. By the definition of white noise, we have
$$\gamma(s, t) = \mathrm{cov}(a_s, a_t) = \begin{cases} \sigma^2, & s = t \\ 0, & s \ne t \end{cases}$$
Example (Covariance Function of MA(1)). Let $a_t \sim WN(0, \sigma^2)$ and $Z_t = a_t + \theta a_{t-1}$ (where $\theta$ is a constant). Then
\begin{align*}
\gamma(s, t) &= \mathrm{cov}(a_t + \theta a_{t-1},\; a_s + \theta a_{s-1}) \\
&= \mathrm{cov}(a_t, a_s) + \theta\,\mathrm{cov}(a_t, a_{s-1}) + \theta\,\mathrm{cov}(a_{t-1}, a_s) + \theta^2\,\mathrm{cov}(a_{t-1}, a_{s-1})
\end{align*}
If $s = t$, then $\gamma(s, t) = \gamma(t, t) = \sigma^2 + \theta^2\sigma^2 = (\theta^2 + 1)\sigma^2$.
If $s = t - 1$ or $s = t + 1$, then $\gamma(s, t) = \gamma(t, t + 1) = \gamma(t, t - 1) = \theta\sigma^2$.
So all together we have
$$\gamma(s, t) = \begin{cases} (\theta^2 + 1)\sigma^2, & \text{if } s = t \\ \theta\sigma^2, & \text{if } |s - t| = 1 \\ 0, & \text{else} \end{cases}$$
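These three cases can be verified by simulation; a sketch in R with $\theta = 0.6$ and $\sigma^2 = 1$ (values of our own choosing):

```r
# Sample covariances of a simulated MA(1) should match the formulas above.
set.seed(1)
theta <- 0.6; n <- 5e4
a <- rnorm(n + 1)                         # WN(0, 1), i.e. sigma^2 = 1
Z <- a[2:(n + 1)] + theta * a[1:n]
g0.hat <- var(Z)                          # estimate of (1 + theta^2) sigma^2 = 1.36
g1.hat <- cov(Z[-n], Z[-1])               # lag 1: estimate of theta sigma^2 = 0.6
g2.hat <- cov(Z[1:(n - 2)], Z[3:n])       # lag 2: estimate of 0
c(g0.hat, g1.hat, g2.hat)
```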
Now your turn!
Covariance Function of a Random Walk with Drift
Suppose $Z_0 = 0$, and for $t > 0$, $Z_t = \delta + Z_{t-1} + a_t$ where $\delta$ is a constant and $a_t \sim WN(0, \sigma^2)$. What is the covariance function of $Z_t$? (That is, compute $\mathrm{cov}(Z_s, Z_t)$.)
From the representation
$$Z_t = \delta t + \sum_{j=1}^{t} a_j$$
we see
$$\gamma(s, t) = \mathrm{cov}(Z_s, Z_t) = \mathrm{cov}\left(\sum_{j=1}^{s} a_j,\; \sum_{k=1}^{t} a_k\right) = \sum_{j=1}^{s} \sum_{k=1}^{t} \mathrm{cov}(a_j, a_k) = \min(s, t)\,\sigma^2$$
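A simulation sketch of this covariance (with $\sigma^2 = 1$ and driftless paths, since the drift does not affect the covariance; the parameter choices are ours):

```r
# Simulate many random walk paths and check cov(Z_s, Z_t) = min(s, t) * sigma^2.
set.seed(1)
t.max <- 20; n.path <- 2e4
paths <- apply(matrix(rnorm(t.max * n.path), t.max, n.path), 2, cumsum)
g.hat <- cov(paths[5, ], paths[12, ])   # close to min(5, 12) * 1 = 5
g.hat
```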
The Autocorrelation Function
Definition (Correlation Function)
The correlation function of a random sequence $\{Z_t\}$ is
$$\rho(s, t) = \frac{\gamma(s, t)}{\sqrt{\gamma(s, s)\,\gamma(t, t)}}$$
Cauchy-Schwarz inequality $\Rightarrow |\rho(s, t)| \le 1$
Weak Stationarity
Definition (Weak Stationarity). A weakly stationary time series is a finite-variance process that
(i) has a mean function, $\mu$, that is constant (so it doesn't depend on $t$);
(ii) has a covariance function, $\gamma(s, t)$, that depends on $s$ and $t$ only through the difference $|s - t|$.
From now on when we say stationary, we mean weakly stationary. For stationary processes, the mean function is µ (no t). Since γ(s, t) = γ(s + h, t + h) for stationary processes, we have γ(s, t) = γ(s − t, 0). Therefore we can view the covariance function as a function of only one variable (h = s − t); this is the autocovariance function γ(h).
Autocovariance and Autocorrelation Functions
Definition (Autocovariance Function). The autocovariance function of a stationary time series is
$$\gamma_h = \gamma(h) = E\left[(Z_{t+h} - \mu)(Z_t - \mu)\right]$$
(for any value of $t$).

Definition (Autocorrelation Function). The autocorrelation function of a stationary time series is
$$\rho_h = \rho(h) = \frac{\gamma(h)}{\gamma(0)}$$
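As a worked instance of these definitions, the MA(1) covariances computed earlier give

```latex
\rho_0 = \frac{\gamma(0)}{\gamma(0)} = 1, \qquad
\rho_1 = \frac{\gamma(1)}{\gamma(0)}
       = \frac{\theta\sigma^2}{(1 + \theta^2)\sigma^2}
       = \frac{\theta}{1 + \theta^2}, \qquad
\rho_h = 0 \ \text{for } |h| \ge 2.
```

Note that $|\rho_1| \le 1/2$ here, with equality exactly when $\theta = \pm 1$.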
Properties of the Autocovariance/Autocorrelation Functions
$\gamma_0 = \mathrm{var}(Z_t)$; $\rho_0 = 1$
$|\gamma_k| \le \gamma_0$; $|\rho_k| \le 1$
$\gamma_k = \gamma_{-k}$; $\rho_k = \rho_{-k}$
$\gamma_k$ and $\rho_k$ are positive semidefinite (p.s.d.), i.e.
$$\sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j \gamma_{|t_i - t_j|} \ge 0 \quad \text{and} \quad \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j \rho_{|t_i - t_j|} \ge 0$$
for any set of times $t_1, \dots, t_n$ and any real numbers $\alpha_1, \dots, \alpha_n$.

Proof. Define $X = \sum_{i=1}^{n} \alpha_i Z_{t_i}$; then
$$0 \le \mathrm{var}(X) = \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j\, \mathrm{cov}(Z_{t_i}, Z_{t_j}) = \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j \gamma_{|t_i - t_j|}$$
White Noise
White noise, denoted by $a_t \sim WN(0, \sigma^2)$, is by definition a weakly stationary process with autocovariance function
$$\gamma_k = \begin{cases} \sigma^2, & k = 0 \\ 0, & k \ne 0 \end{cases}$$
and autocorrelation function
$$\rho_k = \begin{cases} 1, & k = 0 \\ 0, & k \ne 0 \end{cases}$$
Not all white noise is boring like iid $\mathcal{N}(0, \sigma^2)$. For example, stationary GARCH processes can have all the properties of white noise.
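A concrete sketch of that remark, using an ARCH(1) process, the simplest member of the GARCH family (the parameter values below are our own choices):

```r
# An ARCH(1) series is serially uncorrelated (white noise) but not
# independent: its squares are autocorrelated.
set.seed(1)
n <- 2e4; omega <- 0.2; alpha <- 0.5
e <- rnorm(n)                       # iid N(0, 1) shocks
a <- numeric(n)
a[1] <- sqrt(omega) * e[1]
for (t in 2:n) a[t] <- sqrt(omega + alpha * a[t - 1]^2) * e[t]
r.a  <- acf(a,   lag.max = 1, plot = FALSE)$acf[2]  # near 0: looks like WN
r.a2 <- acf(a^2, lag.max = 1, plot = FALSE)$acf[2]  # clearly positive
c(r.a, r.a2)
```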
Random Walk Simulation in R
Simulate the random walk $Z_t = 0.2 + Z_{t-1} + a_t$ where $a_t \overset{iid}{\sim} \mathcal{N}(0, \sigma^2)$.
Hint: use the representation $Z_t = 0.2\,t + \sum_{j=1}^{t} a_j$!
> w = rnorm(200, 0, 1)          # 200 draws of Gaussian white noise
> x = cumsum(w)                 # random walk without drift
> wd = w + .2                   # shift each innovation by the drift 0.2
> xd = cumsum(wd)               # random walk with drift
> plot.ts(xd, ylim=c(-5, 55))   # plot the drifting walk
> lines(x)                      # overlay the driftless walk
> lines(.2*(1:200), lty="dashed")  # overlay the mean function 0.2 t
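As a follow-up sketch (our own extension of the snippet above), the drift can be recovered from the simulated path, since the differences $Z_t - Z_{t-1} = 0.2 + a_t$ average to $0.2$:

```r
# Re-simulate the drifting walk and estimate the drift from its differences.
set.seed(1)
wd <- rnorm(200, 0, 1) + .2
xd <- cumsum(wd)
drift.hat <- mean(diff(xd))   # close to the true drift 0.2
drift.hat
```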
Homework 1b
Read the following sections from the textbook:
§2.5: Estimation of the Mean, Autocovariances, and Autocorrelations
§2.6: Moving Average and Autoregressive Representations of Time Series Processes

Do the following exercise: Prove that a time series of iid random variables is strongly stationary.