
STA 6857 — Autocorrelation and Cross-Correlation & Stationary Time Series (§1.4, 1.5)

Outline

1 Announcements

2 Autocorrelation and Cross-Correlation

3 Stationary Time Series

4 Homework 1c

Arthur Berg STA 6857—Autocorrelation and Cross-Correlation & Stationary Time Series (§1.4, 1.5) 2/ 25

Announcements

Our TA, Aixin Tan, will have office hours on Thursdays from 1–2pm in 218 Griffin-Floyd.

Homework 1c will be assigned today, and the last part of homework 1, homework 1d, will be assigned on Friday. Homework 1 will be collected on Friday, September 7. Don't wait till the last minute to do the homework, because more homework will follow next week.

Homework Questions from Last Time

We've seen the AR(p) and MA(q) models. Which one do we use?
In scientific modeling we wish to choose the model that best describes or explains the data. Later we will develop many techniques to help us choose and fit the best model for our data.

Is the white noise process in these models unique?
We will see later that any stationary time series can be described as an MA(∞) plus a "deterministic part" by the Wold decomposition, where the white noise process in the MA(∞) part is unique. So in short, the answer is yes. We will also see later how to estimate the white noise process, which will aid in forecasting.

Notational Disclaimer

For convenience, we will follow the textbook's style of not distinguishing the random sequence (typically denoted with capital letters in statistics) from an observation of the random sequence (typically denoted with lower-case letters in statistics). We shall use {x_t} in both situations, and the distinction between the random and non-random cases will be clear from the context.

Joint Distribution

The joint distribution contains all of the information about the time series. However, there is no feasible way to estimate the joint distribution without some strict assumptions.

Definition (Joint Distribution Function (Joint CDF))
Given time points t_1, t_2, ..., t_n, the joint CDF of x_{t_1}, x_{t_2}, ..., x_{t_n}, evaluated at constants c_1, ..., c_n, is given by

    F(c_1, c_2, ..., c_n) = P(x_{t_1} ≤ c_1, x_{t_2} ≤ c_2, ..., x_{t_n} ≤ c_n)

Example (CDF of w_t ~ iid N(0, 1))

    F(c_1, c_2, ..., c_n) = ∏_{t=1}^{n} Φ(c_t)

where

    Φ(x) = (1/√(2π)) ∫_{-∞}^{x} e^{-z²/2} dz
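Since the w_t in this example are independent, the joint CDF factors into a product of marginal Φ's. Here is a minimal Python sketch of that factorization (the helper names are my own; Φ is computed from the standard library's error function):

```python
import math

def Phi(x):
    """Standard normal CDF: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def joint_cdf_iid_normal(cs):
    """Joint CDF of iid N(0,1) variables at constants c_1, ..., c_n:
    by independence it is the product of the marginal Phi(c_t)'s."""
    p = 1.0
    for c in cs:
        p *= Phi(c)
    return p

# By independence, P(w_1 <= 0, w_2 <= 0) = Phi(0) * Phi(0) = 0.25
print(joint_cdf_iid_normal([0.0, 0.0]))  # → 0.25
```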

Univariate Distribution and Density Function

We may also be interested in the marginal distribution function at a particular time t. Knowing all of the marginal distributions cannot give you the full joint distribution.

One-dimensional distribution function:

    F_t(x) = P(x_t ≤ x)

with corresponding density function (if it exists)

    f_t(x) = ∂F_t(x)/∂x

Mean Function

Definition (Mean Function)
The mean function of a time series {x_t} is given by (if it exists)

    µ_t = E(x_t) = ∫_{-∞}^{∞} x f_t(x) dx

Example (Mean of an MA(q))
Let w_t ~ WN(0, σ²) and

    x_t = w_t + θ_1 w_{t-1} + θ_2 w_{t-2} + ··· + θ_q w_{t-q}

then

    µ_t = E(x_t) = E(w_t) + θ_1 E(w_{t-1}) + θ_2 E(w_{t-2}) + ··· + θ_q E(w_{t-q}) ≡ 0

(free of the time variable t)
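The MA(q) mean calculation can be illustrated with a small Monte Carlo sketch (the function name, seed, and parameter values below are my own illustrative choices): averaging x_t over many independent replications gives a value near 0 at every time t.

```python
import random
import statistics

def simulate_ma(thetas, n, sigma=1.0, rng=None):
    """Simulate n observations of x_t = w_t + theta_1 w_{t-1} + ... + theta_q w_{t-q}
    with Gaussian white noise w_t ~ N(0, sigma^2)."""
    rng = rng or random.Random()
    q = len(thetas)
    w = [rng.gauss(0.0, sigma) for _ in range(n + q)]  # includes q pre-sample noises
    return [w[t + q] + sum(th * w[t + q - 1 - j] for j, th in enumerate(thetas))
            for t in range(n)]

rng = random.Random(1)
reps = [simulate_ma([0.8, -0.4], 50, rng=rng) for _ in range(4000)]

# The sample mean of x_t across replications is near 0 for every t.
for t in (0, 10, 49):
    print(t, round(statistics.mean(rep[t] for rep in reps), 3))
```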

Mean Function Examples

Example (Mean of a Random Walk with Drift)
We saw before that if x_t = δ + x_{t-1} + w_t with x_0 = 0, then x_t has the representation

    x_t = δt + ∑_{j=1}^{t} w_j

and the mean function of x_t is

    µ_t = E(x_t) = δt

Example (Mean of Signal + Noise Model)
If x_t = s_t + w_t, where w_t is a mean-zero time series, then

    µ_t = E(x_t) = s_t

The Autocovariance Function

Definition (Autocovariance Function)
The autocovariance function of a general random sequence {x_t} is

    γ(s, t) = cov(x_s, x_t) = E[(x_s − µ_s)(x_t − µ_t)]

So in particular, we have

    var(x_t) = cov(x_t, x_t) = γ(t, t) = E[(x_t − µ_t)²]

Also note that γ(s, t) = γ(t, s) since cov(x_s, x_t) = cov(x_t, x_s).

Example (Autocovariance of White Noise)
Let w_t ~ WN(0, σ²). By the definition of white noise, we have

    γ(s, t) = cov(w_s, w_t) = σ² if s = t, and 0 if s ≠ t

Autocovariance Function of MA(1)

Example (Autocovariance of MA(1))
Let w_t ~ WN(0, σ²) and x_t = w_t + θ_1 w_{t-1}. Then

    γ(s, t) = cov(w_t + θ_1 w_{t-1}, w_s + θ_1 w_{s-1})
            = cov(w_t, w_s) + θ_1 cov(w_t, w_{s-1}) + θ_1 cov(w_{t-1}, w_s) + θ_1² cov(w_{t-1}, w_{s-1})

If s = t, then

    γ(s, t) = γ(t, t) = σ² + θ_1² σ² = (θ_1² + 1)σ²

If s = t − 1 or s = t + 1, then

    γ(s, t) = γ(t, t + 1) = γ(t, t − 1) = θ_1 σ²

So all together we have

    γ(s, t) = (θ_1² + 1)σ²  if s = t
            = θ_1 σ²        if |s − t| = 1
            = 0             otherwise

Autocovariance Function of a Random Walk

Example (Autocovariance of a Random Walk)
If x_t = ∑_{j=1}^{t} w_j, then

    γ(s, t) = cov(x_s, x_t) = cov( ∑_{j=1}^{s} w_j, ∑_{k=1}^{t} w_k ) = min(s, t) σ²

Note the dependence on s and t is not just a function of |s − t|! (The random walk is not stationary.)
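Both autocovariance calculations can be checked by simulation. A hedged sketch (θ_1 = 0.6, σ = 1, and the helper name are my own choices), estimating γ(s, t) as an average over independent replications:

```python
import random

def autocov(reps, s, t):
    """Monte Carlo estimate of gamma(s, t) = E[(x_s - mu_s)(x_t - mu_t)]
    from independent replications; both processes here have mean 0."""
    return sum(r[s] * r[t] for r in reps) / len(reps)

rng = random.Random(2)
theta, sigma, n_rep, n = 0.6, 1.0, 20000, 20

# MA(1): x_t = w_t + theta * w_{t-1}
ma1 = []
for _ in range(n_rep):
    w = [rng.gauss(0.0, sigma) for _ in range(n + 1)]
    ma1.append([w[t + 1] + theta * w[t] for t in range(n)])

print(round(autocov(ma1, 10, 10), 2))  # ≈ (1 + theta^2) * sigma^2 = 1.36
print(round(autocov(ma1, 10, 11), 2))  # ≈ theta * sigma^2 = 0.60
print(round(autocov(ma1, 10, 13), 2))  # ≈ 0 (lags beyond 1 are uncorrelated)

# Random walk: x_t = w_1 + ... + w_t, so gamma(s, t) = min(s, t) * sigma^2.
walks = []
for _ in range(n_rep):
    x, path = 0.0, []
    for _ in range(n):
        x += rng.gauss(0.0, sigma)
        path.append(x)
    walks.append(path)

# path[4] is x_5 and path[9] is x_10, so gamma(5, 10) = min(5, 10) * sigma^2 = 5.
print(round(autocov(walks, 4, 9), 1))  # ≈ 5
```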

The Autocorrelation Function

Definition (Autocorrelation Function)
The autocorrelation function (ACF) of a general random sequence {x_t} is

    ρ(s, t) = γ(s, t) / √(γ(s, s) γ(t, t))

As is well known from the Cauchy–Schwarz inequality, the correlation of two random variables is bounded between −1 and 1. Hence |ρ(s, t)| ≤ 1, and |ρ(s, t)| = 1 implies an exact linear relationship between x_s and x_t.

The Cross-Covariance and Cross-Correlation Functions

The predictability of one series from another is measured with the cross-covariance and cross-correlation functions.

Definition (Cross-Covariance Function)
The cross-covariance function of two general time series {x_t} and {y_t} is defined as

    γ_xy(s, t) = cov(x_s, y_t) = E[(x_s − µ_{x,s})(y_t − µ_{y,t})]

Definition (Cross-Correlation Function (CCF))
The cross-correlation function (CCF) of two general time series {x_t} and {y_t} is defined as

    ρ_xy(s, t) = γ_xy(s, t) / √(γ_x(s, s) γ_y(t, t))

Strict Stationarity

Definition (Strict Stationarity)
A time series {x_t} is strictly stationary if any collection

    {x_{t_1}, x_{t_2}, ..., x_{t_n}}

has the same joint distribution as the time-shifted set

    {x_{t_1 + h}, x_{t_2 + h}, ..., x_{t_n + h}}

Strict stationarity implies the following:

- All marginal distributions are equal, i.e. P(x_s ≤ c) = P(x_t ≤ c) for all s, t, and c.
- The autocovariance function is shift-invariant, i.e. γ(s, t) = γ(s + h, t + h).

Strict stationarity typically assumes too much. This leads us to the weaker assumption of weak stationarity.

Weak Stationarity

Definition (Weak Stationarity)
A weakly stationary time series is a finite-variance process that

(i) has a mean function, µ_t, that is constant (so it doesn't depend on t);
(ii) has an autocovariance function, γ(s, t), that depends on s and t only through the difference |s − t|.

From now on, when we say stationary, we mean weakly stationary.

- Since the mean function is free of t, we will simply drop the t and write µ (= µ_t).
- Since γ(s, t) = γ(s + h, t + h), we have γ(s, t) = γ(s − t, 0). So the autocovariance function can be thought of as a function of only one variable, h = s − t.
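One concrete way to see a failure of weak stationarity is the random walk: var(x_t) = t σ² grows with t, violating condition (ii) at s = t. A small simulation sketch (seed and sizes are arbitrary choices of mine):

```python
import random

rng = random.Random(5)
n_rep, n, sigma = 20000, 40, 1.0

# Estimate var(x_t) for a random walk x_t = w_1 + ... + w_t at each t.
# Weak stationarity would require var(x_t) constant in t; here var(x_t) = t * sigma^2.
sum_sq = [0.0] * n
for _ in range(n_rep):
    x = 0.0
    for t in range(n):
        x += rng.gauss(0.0, sigma)
        sum_sq[t] += x * x  # x_t has mean 0, so x_t^2 estimates its variance
var_hat = [s / n_rep for s in sum_sq]

for t in (5, 10, 20, 40):
    print(t, round(var_hat[t - 1], 1))  # ≈ t, so the walk is not stationary
```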

Autocovariance and ACF of a Stationary Time Series

Definition (Autocovariance Function of a Stationary Time Series)
The autocovariance function of a stationary time series is

    γ(h) = E[(x_{t+h} − µ)(x_t − µ)]

which is the same for all t.

Definition (Autocorrelation Function of a Stationary Time Series)
The autocorrelation function of a stationary time series is

    ρ(h) = γ(t + h, t) / √(γ(t + h, t + h) γ(t, t)) = γ(h) / γ(0)

Jointly Stationary Time Series

Definition (Jointly Stationary)
Two time series, {x_t} and {y_t}, are jointly stationary if they are each stationary and the cross-covariance function

    γ_xy(h) = E[(x_{t+h} − µ_x)(y_t − µ_y)]

is only a function of the lag h (for any value of t).

Definition (CCF of Jointly Stationary Time Series)
The cross-correlation function of jointly stationary time series {x_t} and {y_t} is

    ρ_xy(h) = γ_xy(h) / √(γ_x(0) γ_y(0))
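For a stationary series, the ACF depends on the lag alone and can be estimated from a single long realization. A sketch (the plug-in estimator below previews the estimation material of §1.6, and θ = 0.6 is my own illustrative value): for an MA(1), theory gives ρ(1) = θ/(1 + θ²) ≈ 0.441 and ρ(h) = 0 for h ≥ 2.

```python
import random

def sample_acf(x, h):
    """Plug-in estimate of rho(h) = gamma(h)/gamma(0) from one realization."""
    n = len(x)
    mu = sum(x) / n
    g0 = sum((v - mu) ** 2 for v in x) / n
    gh = sum((x[t + h] - mu) * (x[t] - mu) for t in range(n - h)) / n
    return gh / g0

rng = random.Random(3)
theta, n = 0.6, 200000
w = [rng.gauss(0.0, 1.0) for _ in range(n + 1)]
x = [w[t + 1] + theta * w[t] for t in range(n)]  # a single long MA(1) path

print(round(sample_acf(x, 1), 3))  # ≈ theta / (1 + theta^2) ≈ 0.441
print(round(sample_acf(x, 2), 3))  # ≈ 0
```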


Example Using Cross-Correlation

Example (Prediction Using Cross-Correlation)
Consider the model y_t = A x_{t−ℓ} + w_t, where ℓ is a positive integer. Then clearly one can use {x_t} to predict y_t. To be able to detect such a model, we consider the cross-covariance function. Assuming µ_x = µ_y = 0, we have

    γ_yx(h) = E(y_{t+h} x_t)
            = E[(A x_{t+h−ℓ} + w_{t+h}) x_t]
            = E(A x_{t+h−ℓ} x_t) + E(w_{t+h} x_t)
            = A γ_x(h − ℓ)

So the cross-covariance function is a shifted and scaled version of the autocovariance function of the {x_t} series.

Textbook Reading

Read the following sections from the textbook:

- §1.6 (Estimation of Correlation)
- §1.7 (Vector-Valued and Multidimensional Series)
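The shifted-peak structure in the cross-correlation example above can be seen numerically. A sketch under assumed values A = 2 and ℓ = 3 (my own choices), estimating γ_yx(h) = E(y_{t+h} x_t) from one long pair of realizations; since γ_x(0) = 1 here, the estimate should be near A at h = ℓ and near 0 at other lags.

```python
import random

def cross_cov(y, x, h):
    """Sample estimate of gamma_yx(h) = E[y_{t+h} x_t] for mean-zero series, h >= 0."""
    n = min(len(x), len(y))
    return sum(y[t + h] * x[t] for t in range(n - h)) / (n - h)

rng = random.Random(4)
A, lag, n = 2.0, 3, 100000

x = [rng.gauss(0.0, 1.0) for _ in range(n)]
# y_t = A * x_{t-lag} + w_t (the first `lag` values use only noise, for simplicity)
y = [(A * x[t - lag] if t >= lag else 0.0) + rng.gauss(0.0, 1.0) for t in range(n)]

# gamma_yx(h) = A * gamma_x(h - lag), peaking at h = lag with value A * gamma_x(0) = A.
for h in range(6):
    print(h, round(cross_cov(y, x, h), 2))
```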

Textbook Problems

Do the following exercises from the textbook:

- 1.6
- 1.7
- 1.9 (Note: cos(A − B) = cos A cos B + sin A sin B)
- 1.13
- 1.15
