LECTURES 2 - 3 : Stochastic Processes, Autocorrelation Function, Stationarity
Important points of Lecture 1: A time series $\{X_t\}$ is a series of observations taken sequentially over time: $x_t$ is the observation recorded at a specific time $t$. Characteristics of time series data: observations are dependent, become available at equally spaced time points, and are time-ordered. Such a series is a discrete time series. The purposes of time series analysis are to model a series and to predict or forecast its future values based on its history.

2.2 Some descriptive techniques. (Based on [BD] §1.3 and §1.4)

Take a step backwards: how do we describe a r.v. or a random vector?

• For a r.v. $X$: the d.f. $F_X(x) := P(X \le x)$, the mean $\mu = EX$, and the variance $\sigma^2 = \mathrm{Var}(X)$.

• For a random vector $(X_1, X_2)$: the joint d.f. $F_{X_1,X_2}(x_1, x_2) := P(X_1 \le x_1, X_2 \le x_2)$; the marginal d.f. $F_{X_1}(x_1) := P(X_1 \le x_1) \equiv F_{X_1,X_2}(x_1, \infty)$; the mean vector $(\mu_1, \mu_2) = (EX_1, EX_2)$; the variances $\sigma_1^2 = \mathrm{Var}(X_1)$ and $\sigma_2^2 = \mathrm{Var}(X_2)$; and the covariance $\mathrm{Cov}(X_1, X_2) = E(X_1 - \mu_1)(X_2 - \mu_2) \equiv E(X_1 X_2) - \mu_1 \mu_2$. Often we use the correlation, a normalized covariance: $\mathrm{Cor}(X_1, X_2) = \mathrm{Cov}(X_1, X_2)/(\sigma_1 \sigma_2)$.

To describe a process $X_1, X_2, \ldots$ we define:

(i) Def. The finite-dimensional (fi-di) distribution functions:
$$F_{t_1, \ldots, t_n}(x_1, \ldots, x_n) = P(X_{t_1} \le x_1, \ldots, X_{t_n} \le x_n),$$
i.e. the joint d.f. of the vector $(X_{t_1}, \ldots, X_{t_n})$.

(ii) First- and second-order moments:

• Mean: $\mu_X(t) = EX_t$.

• Variance: $\sigma_X^2(t) = E(X_t - \mu_X(t))^2 \equiv EX_t^2 - \mu_X^2(t)$.

• Autocovariance function: $\gamma_X(t, s) = \mathrm{Cov}(X_t, X_s) = E[(X_t - \mu_X(t))(X_s - \mu_X(s))] \equiv E(X_t X_s) - \mu_X(t)\mu_X(s)$. (Note: this is an infinite matrix.)

• Autocorrelation function:
$$\rho_X(t, s) = \mathrm{Cor}(X_t, X_s) = \frac{\mathrm{Cov}(X_t, X_s)}{\sqrt{\mathrm{Var}(X_t)\,\mathrm{Var}(X_s)}} = \frac{\gamma_X(t, s)}{\sigma_X(t)\,\sigma_X(s)}.$$

Properties of the process which are determined by the first- and second-order moments are called second-order properties. Example: stationarity (to be defined soon).

2.3 Meaning of the autocorrelation function. In the linear regression setting (PSTAT 5A), $\mathrm{Cor}(X, Y)$ measures linear dependence. Recall: the correlation is zero if there is no linear dependence, and it is close to $\pm 1$ for variables with strong linear dependence. In time series, the autocorrelation function is used to assess numerically the dependence between values of the series. Suppose for a moment that we observe only two r.v.'s, $X_t$ and $X_{t+1}$: then $\rho(t, t+1)$ is the correlation of $X_t$ and $X_{t+1}$. The prefix "auto-" conveys the notion of self-correlation (both variables come from the same time series): it is the correlation of the series with itself. When the series is smooth, the autocorrelation function remains large even when $t$ and $s$ are quite far apart, whereas very choppy series tend to have autocorrelation functions that are nearly zero for large separations. So the autocorrelation function tends to reflect the essential smoothness of the series.

Let us consider two "extreme" examples.

Example (Gaussian white noise). Consider the series $\{Z_t\}$, where the $Z_t$ are Gaussian, mean-zero r.v.'s with
$$\gamma_Z(t, s) = E(Z_t Z_s) = \begin{cases} 1, & s = t, \\ 0, & s \ne t, \end{cases}$$
so that the autocovariance function is 0 at all nonzero time separations. The series looks very choppy. [Figure: a sample path of Gaussian white noise.]

Example (smoothing). Apply the smoothing operation $X_t = \frac{1}{3}(Z_{t-1} + Z_t + Z_{t+1})$. Since $X_t$ and $X_{t+h}$ share $3 - |h|$ of the $Z$'s when $|h| \le 2$, each shared term contributing $1/9$, the autocovariance function is
$$\gamma_X(t, t+h) = \frac{3 - |h|}{9} \ \text{ for } |h| \le 2, \qquad \gamma_X(t, t+h) = 0 \ \text{ for } |h| > 2.$$
The autocorrelation function is the normalized autocovariance function: $\rho_X(t, t+h) = (3 - |h|)/3$ for $|h| \le 2$, i.e. $1, 2/3, 1/3$, and $0$ beyond lag 2. The smoothed series is visibly less choppy, and its correlations die out more slowly.
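The following is a small numerical sketch (my addition, not part of the lecture notes) illustrating the two extremes with NumPy. The helper name `sample_acf` is hypothetical; it implements the standard sample autocorrelation $\hat\rho(k) = \hat\gamma(k)/\hat\gamma(0)$, and the simulated values should land near the theoretical ACFs $1, 0, 0, \ldots$ (white noise) and $1, 2/3, 1/3, 0, \ldots$ (smoothed series).

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_acf(x, max_lag):
    """Sample ACF: rho_hat(k) = gamma_hat(k) / gamma_hat(0), where
    gamma_hat(k) = (1/n) * sum_t (x_t - xbar) * (x_{t+k} - xbar)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    gamma = np.array([np.dot(xc[:n - k], xc[k:]) / n for k in range(max_lag + 1)])
    return gamma / gamma[0]

# Gaussian white noise: mean zero, unit variance, uncorrelated across time.
n = 10_000
z = rng.standard_normal(n)

# Smoothed series X_t = (Z_{t-1} + Z_t + Z_{t+1}) / 3.
x = (z[:-2] + z[1:-1] + z[2:]) / 3.0

print("white noise ACF:", np.round(sample_acf(z, 4), 3))  # ~ [1, 0, 0, 0, 0]
print("smoothed ACF:   ", np.round(sample_acf(x, 4), 3))  # ~ [1, 0.667, 0.333, 0, 0]
```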
3. Stationarity

3.1 Stationarity and Strict Stationarity (Based on §1.4 and §2.1 of [BD])

Intuitively, stationarity means that the graphs of a realization of the series over two equal-length time intervals should exhibit similar statistical characteristics. On a graph:

• no trend;
• no seasonality;
• no change of variability;
• no apparent sharp changes of behavior.

Two approaches to stationarity:

(i) Def. A stochastic process is said to be strictly stationary if the joint distribution of the $n$ r.v.'s $X_{t_1}, \ldots, X_{t_n}$, for any set of times $t_1, \ldots, t_n$, is the same as the joint distribution of the $n$ r.v.'s $X_{t_1+k}, \ldots, X_{t_n+k}$ for any integer $k$.

(ii) Def. A process is said to be (weakly, second-order) stationary if
(a) $E|X_t|^2 < \infty$ for all $t \in T$;
(b) $EX_t = \mu$ for all $t \in T$;
(c) $\gamma_X(t, s) = \gamma_X(t + r, s + r)$ for all $t, s, r \in T$, so that $\gamma_X(t, s)$ depends only on the difference $t - s$ and we may write $\gamma_X(t - s)$.
Note: $\gamma_X(k)$ is often referred to as the value of the autocovariance function at lag $k$.

(iii) The ACF (autocorrelation function) of a stationary process is
$$\rho_X(k) = \frac{\gamma_X(k)}{\gamma_X(0)}.$$

(iv) Note: $\gamma_X(k) = \gamma_X(-k)$, $\rho_X(k) = \rho_X(-k)$, and $\gamma_X(0) = \sigma_X^2$.

(v) Example: Apply the smoothing operation to a WN series: $X_t = \frac{1}{3}(Z_{t-1} + Z_t + Z_{t+1})$. The autocovariance function, $\gamma_X(k) = (3 - |k|)/9$ for $|k| \le 2$ and $0$ otherwise, depends only on the lag $k$, so the smoothed series is stationary.

(vi) Example: $X_t = Z_1 + \cdots + Z_t$ (a random walk), where the $Z_t$ are i.i.d. r.v.'s with mean zero and variance $\sigma^2$. Here $\gamma_X(t, s) = \sigma^2 \min(t, s)$, which is not a function of $t - s$ alone; in particular $\mathrm{Var}(X_t) = t\sigma^2$ grows with $t$, so condition (c) fails and the process is non-stationary.

In general:

• strict stationarity together with a finite second moment implies (weak) stationarity;
• (weak) stationarity does not imply strict stationarity.

Fact: For Gaussian time series, stationarity = strict stationarity (because a Gaussian distribution is determined by its mean and covariance).
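As a numerical companion to example (vi), here is a small simulation sketch (my addition, not part of the lecture notes; it assumes standard normal steps, so $\sigma^2 = 1$). It estimates $\mathrm{Var}(X_t)$ across independent realizations of the random walk and shows the linear growth $t\sigma^2$, which is exactly how condition (c) of weak stationarity fails.

```python
import numpy as np

rng = np.random.default_rng(1)

# Many independent realizations of the random walk X_t = Z_1 + ... + Z_t
# with i.i.d. standard normal steps (so sigma^2 = 1).
n_paths, n_steps = 5_000, 200
z = rng.standard_normal((n_paths, n_steps))
x = np.cumsum(z, axis=1)

# Var(X_t), estimated across realizations, should be close to t * sigma^2:
# it depends on t, so gamma_X(t, t) is not a function of the lag alone.
for t in (10, 50, 100, 200):
    print(f"t = {t:3d}: sample Var(X_t) = {x[:, t - 1].var():6.1f} (theory: {t})")

# Cov(X_t, X_s) = sigma^2 * min(t, s): it depends on (t, s), not on t - s.
t, s = 50, 100
emp = np.mean(x[:, t - 1] * x[:, s - 1])  # E X_t = 0, so this estimates Cov
print(f"Cov(X_{t}, X_{s}) ~ {emp:5.1f} (theory: min(t, s) = {min(t, s)})")
```

By contrast, rerunning the same variance check on the white noise series $\{Z_t\}$ itself would give a roughly flat answer near $\sigma^2$ for every $t$, consistent with stationarity.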