STA 6857—Autocorrelation and Cross-Correlation & Stationary Time Series (§1.4, 1.5)

Arthur Berg

Outline
1. Announcements
2. Autocorrelation and Cross-Correlation
3. Stationary Time Series
4. Homework 1c

Announcements

Our TA, Aixin Tan, will have office hours on Thursdays from 1–2pm in 218 Griffin-Floyd.

Homework 1c will be assigned today, and the last part of homework 1, homework 1d, will be assigned on Friday. Homework 1 will be collected on Friday, September 7. Don't wait until the last minute to do the homework, because more homework will follow next week.

Homework Questions from Last Time

We've seen the AR(p) and MA(q) models. Which one do we use?
In scientific modeling we wish to choose the model that best describes or explains the data. Later we will develop many techniques to help us choose and fit the best model for our data.

Is the white noise process in these models unique?
We will see later that, by the Wold decomposition, any stationary time series can be described as an MA(∞) plus a "deterministic part", where the white noise process in the MA(∞) part is unique. So in short, the answer is yes. We will also see later how to estimate the white noise process, which will aid in forecasting.
Notational Disclaimer

For convenience, we will follow the textbook's style of not distinguishing the random sequence (typically denoted with capital letters in statistics) from an observation of the random sequence (typically denoted with lower-case letters in statistics). We shall use {x_t} in both situations, and the distinction between the random and non-random cases will be clear from the context.

Joint Distribution

The joint distribution contains all of the information about the time series, but there is no feasible way to estimate the joint distribution without some strict assumptions.

Definition (Joint Distribution Function (Joint CDF))
Given time points t_1, t_2, ..., t_n, the joint CDF of x_{t_1}, x_{t_2}, ..., x_{t_n}, evaluated at constants c_1, ..., c_n, is given by

F(c_1, c_2, ..., c_n) = P(x_{t_1} ≤ c_1, x_{t_2} ≤ c_2, ..., x_{t_n} ≤ c_n)

Example (CDF of w_t ~ iid N(0, 1))

F(c_1, c_2, ..., c_n) = ∏_{t=1}^{n} Φ(c_t)

where

Φ(x) = (1/√(2π)) ∫_{−∞}^{x} e^{−z²/2} dz
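As a small numerical sketch of the iid-normal example above (not part of the slides; the function names `phi` and `joint_cdf_iid_normal` are our own), the joint CDF factors into a product of standard normal marginal CDFs, which can be computed from the error function:

```python
import math

def phi(x):
    """Standard normal CDF, written in terms of the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def joint_cdf_iid_normal(c):
    """Joint CDF of iid N(0,1) variables evaluated at constants c_1, ..., c_n:
    by independence it is the product of the marginal CDFs Phi(c_t)."""
    prod = 1.0
    for ct in c:
        prod *= phi(ct)
    return prod

# With c = (0, 0): P(w1 <= 0, w2 <= 0) = Phi(0) * Phi(0) = 0.5 * 0.5
print(joint_cdf_iid_normal([0.0, 0.0]))  # → 0.25
```

The factorization holds only because the w_t are independent; for a general time series the joint CDF carries more information than the marginals.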
Univariate Distribution and Density

We may also be interested in the marginal distribution function at a particular time t. Note that knowing all marginal distributions cannot give you the full joint distribution.

The one-dimensional distribution function is

F_t(x) = P(x_t ≤ x)

with corresponding density function (if it exists)

f_t(x) = ∂F_t(x)/∂x

Mean Function

Definition (Mean Function)
The mean function of a time series {x_t} is given by (if it exists)

µ_t = E(x_t) = ∫_{−∞}^{∞} x f_t(x) dx

Example (Mean of an MA(q))
Let w_t ~ WN(0, σ²) and

x_t = w_t + θ_1 w_{t−1} + θ_2 w_{t−2} + ··· + θ_q w_{t−q}

Then

µ_t = E(x_t) = E(w_t) + θ_1 E(w_{t−1}) + θ_2 E(w_{t−2}) + ··· + θ_q E(w_{t−q}) ≡ 0

(free of the time variable t).
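As a quick simulation check on the MA(q) mean calculation (our own sketch, not from the slides; `simulate_ma` is a hypothetical helper name), the sample mean of a long simulated MA(2) path should be close to the theoretical mean of zero:

```python
import random

def simulate_ma(theta, n, sigma=1.0, seed=0):
    """Simulate x_t = w_t + theta_1 w_{t-1} + ... + theta_q w_{t-q}
    with w_t iid N(0, sigma^2)."""
    rng = random.Random(seed)
    q = len(theta)
    # Generate q extra noise terms so x_0 has a full history to draw on.
    w = [rng.gauss(0.0, sigma) for _ in range(n + q)]
    return [w[t + q] + sum(theta[j] * w[t + q - 1 - j] for j in range(q))
            for t in range(n)]

x = simulate_ma([0.9, -0.4], n=100_000)
sample_mean = sum(x) / len(x)
print(round(sample_mean, 3))  # close to the theoretical mean of 0
```

Since µ_t ≡ 0 regardless of t, the time average over one long realization lands near zero; the deviation shrinks at the usual 1/√n rate.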
Mean Function Examples

Example (Mean of a Random Walk with Drift)
We saw before that if x_t = δ + x_{t−1} + w_t with x_0 = 0, then x_t has the representation

x_t = δt + ∑_{j=1}^{t} w_j

and the mean function of x_t is

µ_t = E(x_t) = δt

Example (Mean of a Signal + Noise Model)
If x_t = s_t + w_t, where w_t is a mean-zero time series, then

µ_t = E(x_t) = s_t

The Autocovariance Function

Definition (Autocovariance Function)
The autocovariance function of a general random sequence {x_t} is

γ(s, t) = cov(x_s, x_t) = E[(x_s − µ_s)(x_t − µ_t)]

So in particular we have

var(x_t) = cov(x_t, x_t) = γ(t, t) = E[(x_t − µ_t)²]

Also note that γ(s, t) = γ(t, s), since cov(x_s, x_t) = cov(x_t, x_s).

Example (Autocovariance of White Noise)
Let w_t ~ WN(0, σ²). By the definition of white noise, we have

γ(s, t) = cov(w_s, w_t) = σ² if s = t, and 0 if s ≠ t.

Example (Autocovariance of an MA(1))
Let w_t ~ WN(0, σ²) and x_t = w_t + θ_1 w_{t−1}. Then

γ(s, t) = cov(w_s + θ_1 w_{s−1}, w_t + θ_1 w_{t−1})
        = cov(w_s, w_t) + θ_1 cov(w_s, w_{t−1}) + θ_1 cov(w_{s−1}, w_t) + θ_1² cov(w_{s−1}, w_{t−1})

If s = t, then γ(s, t) = γ(t, t) = σ² + θ_1² σ² = (1 + θ_1²) σ².
If s = t − 1 or s = t + 1, then γ(s, t) = γ(t, t + 1) = γ(t, t − 1) = θ_1 σ².
So all together we have

γ(s, t) = (1 + θ_1²) σ² if s = t;  θ_1 σ² if |s − t| = 1;  0 otherwise.

Example (Autocovariance of a Random Walk)
If x_t = ∑_{j=1}^{t} w_j, then

γ(s, t) = cov(x_s, x_t) = cov( ∑_{j=1}^{s} w_j , ∑_{k=1}^{t} w_k ) = min(s, t) σ²

Note the dependence on s and t is not just a function of |s − t|! (The random walk is not stationary.)

The predictability of one series from another is measured with the cross-covariance and cross-correlation functions.
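The MA(1) autocovariances derived above can be verified by simulation. This sketch (our own code; `sample_autocov` is an assumed helper name) compares sample autocovariances at lags 0, 1, and 2 against the theoretical values (1 + θ_1²)σ², θ_1 σ², and 0:

```python
import random

# Simulate an MA(1): x_t = w_t + theta * w_{t-1}, with w_t iid N(0, sigma^2).
theta, sigma, n = 0.6, 1.0, 200_000
rng = random.Random(42)
w = [rng.gauss(0.0, sigma) for _ in range(n + 1)]
x = [w[t + 1] + theta * w[t] for t in range(n)]

def sample_autocov(x, h):
    """Sample autocovariance at lag h: average of (x_t - mean)(x_{t+h} - mean)."""
    m = sum(x) / len(x)
    return sum((x[t] - m) * (x[t + h] - m) for t in range(len(x) - h)) / len(x)

print(sample_autocov(x, 0))  # ≈ (1 + theta^2) * sigma^2 = 1.36
print(sample_autocov(x, 1))  # ≈ theta * sigma^2 = 0.60
print(sample_autocov(x, 2))  # ≈ 0, since the MA(1) autocovariance cuts off after lag 1
```

The lag-2 estimate being near zero reflects the cutoff in the MA(1) autocovariance: x_t and x_{t+2} share no common white-noise terms.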
The Autocorrelation Function

Definition (Autocorrelation Function)
The autocorrelation function (ACF) of a general random sequence {x_t} is

ρ(s, t) = γ(s, t) / √(γ(s, s) γ(t, t))

As is well known from the Cauchy-Schwarz inequality, the correlation of two random variables is bounded between −1 and 1; hence |ρ(s, t)| ≤ 1. And |ρ(s, t)| = 1 implies an exact linear relationship between x_s and x_t.

The Cross-Covariance and Cross-Correlation Functions

Definition (Cross-Covariance Function)
The cross-covariance function of two general time series {x_t} and {y_t} is defined as

γ_xy(s, t) = cov(x_s, y_t) = E[(x_s − µ_{x,s})(y_t − µ_{y,t})]

Definition (Cross-Correlation Function (CCF))
The cross-correlation function (CCF) of two general time series {x_t} and {y_t} is defined as

ρ_xy(s, t) = γ_xy(s, t) / √(γ_x(s, s) γ_y(t, t))

Strict Stationarity

Definition (Strict Stationarity)
A time series {x_t} is strictly stationary if, for every time shift h, any collection

{x_{t_1}, x_{t_2}, ..., x_{t_n}}

has the same joint distribution as the time-shifted set

{x_{t_1 + h}, x_{t_2 + h}, ..., x_{t_n + h}}

Strict stationarity implies the following:
(i) All marginal distributions are equal, i.e. P(x_s ≤ c) = P(x_t ≤ c) for all s, t, and c.
(ii) The autocovariance function is shift-invariant, i.e. γ(s, t) = γ(s + h, t + h).

Weak Stationarity

Definition (Weak Stationarity)
A weakly stationary time series is a finite-variance process that
(i) has a mean function, µ_t, that is constant (so it doesn't depend on t);
(ii) has a covariance function, γ(s, t), that depends on s and t only through the difference |s − t|.

From now on, when we say stationary, we mean weakly stationary.

Since the mean function is free of t, we will simply drop the t and write µ (= µ_t).

Since γ(s, t) = γ(s + h, t + h), taking h = −t gives γ(s, t) = γ(s − t, 0). So the autocovariance function can be thought of as a function of only one variable, the lag s − t.
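To illustrate the distinction, a Monte Carlo sketch (our own code, not from the slides) estimates var(x_t) across many independent paths at two different times, for an MA(1) and for a random walk. The MA(1) variance is constant in t, while the random walk variance grows linearly with t, so the random walk fails condition (ii) of weak stationarity:

```python
import random

rng = random.Random(1)
n_paths, n_time = 2000, 200

def variance_at(paths, t):
    """Cross-path sample variance of x_t at a fixed time index t."""
    vals = [p[t] for p in paths]
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

ma1_paths, rw_paths = [], []
for _ in range(n_paths):
    w = [rng.gauss(0.0, 1.0) for _ in range(n_time + 1)]
    # MA(1) with theta = 0.6: var(x_t) = (1 + 0.6^2) sigma^2 = 1.36 at every t.
    ma1_paths.append([w[t + 1] + 0.6 * w[t] for t in range(n_time)])
    # Random walk: partial sums of the noise. With 0-based index t, the entry
    # at index t is a sum of t+1 noise terms, so var ≈ (t + 1) * sigma^2.
    rw, s = [], 0.0
    for t in range(n_time):
        s += w[t + 1]
        rw.append(s)
    rw_paths.append(rw)

print(variance_at(ma1_paths, 10), variance_at(ma1_paths, 150))  # both ≈ 1.36
print(variance_at(rw_paths, 10), variance_at(rw_paths, 150))    # ≈ 11 vs ≈ 151
```

A constant variance is necessary but not sufficient for weak stationarity; the autocovariance must also depend only on the lag, which the random walk's γ(s, t) = min(s, t)σ² violates.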
