
LECTURES 2 - 3 : Stochastic Processes. Autocovariance Function. Stationarity.

Important points of Lecture 1:

A time series {X_t} is a series of observations taken sequentially over time: x_t is an observation recorded at a specific time t. Characteristics of time series: observations are dependent, become available at equally spaced time points, and are time-ordered. Such a series is a discrete time series. The purposes of time series analysis are to model a series and to predict or forecast its future values based on the history of that series.

2.2 Some descriptive techniques. (Based on [BD] §1.3 and §1.4)

Take a step backwards: how do we describe a r.v. or a random vector?

• For a r.v. X: d.f. F_X(x) := P(X ≤ x), mean µ = EX, and variance σ² = Var(X).

• For a random vector (X_1, X_2): joint d.f. F_{X_1,X_2}(x_1, x_2) := P(X_1 ≤ x_1, X_2 ≤ x_2), marginal d.f. F_{X_1}(x_1) := P(X_1 ≤ x_1) ≡ F_{X_1,X_2}(x_1, ∞), mean vector (µ_1, µ_2) = (EX_1, EX_2), variances σ_1² = Var(X_1), σ_2² = Var(X_2), and Cov(X_1, X_2) = E(X_1 − µ_1)(X_2 − µ_2) ≡ E(X_1X_2) − µ_1µ_2. Often we use the correlation = normalized covariance:

Cor(X_1, X_2) = Cov(X_1, X_2) / (σ_1 σ_2)
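To keep these formulas concrete, here is a minimal Python sketch (not from [BD]; the simulated data and all names are made up for illustration) estimating the mean vector, variances, covariance, and correlation from samples:

```python
import numpy as np

# Illustration only: simulate two dependent r.v.'s and estimate the
# quantities defined above from n i.i.d. draws.
rng = np.random.default_rng(0)
n = 10_000
x1 = rng.normal(loc=1.0, scale=2.0, size=n)    # X1 with mu1 = 1, sigma1 = 2
x2 = 0.5 * x1 + rng.normal(size=n)             # X2 built to correlate with X1

mu1, mu2 = x1.mean(), x2.mean()                # mean vector (mu1, mu2)
var1, var2 = x1.var(ddof=1), x2.var(ddof=1)    # sigma1^2, sigma2^2
cov12 = np.cov(x1, x2)[0, 1]                   # Cov(X1, X2)
cor12 = cov12 / np.sqrt(var1 * var2)           # Cor(X1, X2), normalized covariance

print(f"means ({mu1:.2f}, {mu2:.2f}), cov {cov12:.2f}, cor {cor12:.2f}")
```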

To describe a process X_1, X_2, ... we define:

(i) Def. Finite-dimensional (fi-di) distribution functions:

F_{t_1,...,t_n}(x_1, ..., x_n) = P(X_{t_1} ≤ x_1, ..., X_{t_n} ≤ x_n),

i.e. this is the joint d.f. of the vector (X_{t_1}, ..., X_{t_n}).

(ii) First- and second-order moments.

• Mean: µ_X(t) = EX_t
• Variance: σ_X²(t) = E(X_t − µ_X(t))² ≡ EX_t² − µ_X(t)²

• Autocovariance function:

γ_X(t, s) = Cov(X_t, X_s) = E[(X_t − µ_X(t))(X_s − µ_X(s))] ≡ E(X_tX_s) − µ_X(t)µ_X(s)

(Note: this is an infinite matrix of values, one for each pair (t, s).)

• Autocorrelation function:

ρ_X(t, s) = Cor(X_t, X_s) = Cov(X_t, X_s) / √(Var(X_t) Var(X_s)) = γ_X(t, s) / (σ_X(t) σ_X(s))
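The following Python sketch (the estimators are standard, but the function names are my own) computes the sample versions of these functions from one observed series; it uses the divisor n, as in [BD]:

```python
import numpy as np

def sample_acvf(x, max_lag):
    """Sample autocovariance gamma_hat(0..max_lag) of a 1-D series,
    using the divisor n (not n - k)."""
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), np.mean(x)
    return np.array([np.sum((x[:n - k] - xbar) * (x[k:] - xbar)) / n
                     for k in range(max_lag + 1)])

def sample_acf(x, max_lag):
    """Sample autocorrelation rho_hat(k) = gamma_hat(k) / gamma_hat(0)."""
    g = sample_acvf(x, max_lag)
    return g / g[0]
```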

Properties of the process which are determined by the first- and second-order moments are called second-order properties. Example: stationarity (it will be defined soon!).

2.3 Meaning of the Autocorrelation function. Recall from PSTAT 5A: Cor(X, Y) measures linear dependence.

Recall: correlation is zero if there is no linear dependence, and it is close to ±1 for variables with high linear dependence. In TS, the autocorrelation function is used to assess numerically the dependence between two adjacent values. Let us say that for a TS we observe only two r.v.'s: X_t and X_{t+1}.

Then ρ(t, t + 1) is the correlation of X_t and X_{t+1}. The prefix auto conveys the notion of self-correlation (both variables come from the same TS): the correlation of the series with itself. When the TS is smooth, the autocorr. f'n is large even when t and s are quite far apart, whereas very choppy series tend to have autocorr. f'ns that are nearly zero for large separations. So the autocorr. f'n tends to reflect the essential smoothness of the TS. Let us consider two "extreme" examples:

Example: Gaussian white noise. Consider the series {Z_t}, where the Z_t are Gaussian, m.z. (mean-zero) r.v.'s and γ_Z(s, t) = E(Z_tZ_s) = 1 if s = t, and 0 if s ≠ t, so that the autocov. f'n is 0 at all nonzero time separations. The series looks very choppy.
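A quick simulation sketch (my own setup: 500 points, unit variance; consistent with the sample_acf estimator above) confirms that the sample ACF of white noise is near zero at every nonzero lag:

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.standard_normal(500)      # Gaussian white noise: mean zero, variance 1

zc = z - z.mean()
acf = np.correlate(zc, zc, mode="full")[len(z) - 1:] / np.sum(zc**2)
print(np.round(acf[:6], 3))       # ~[1, 0, 0, 0, 0, 0], up to O(1/sqrt(500)) noise
```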

Example. Apply the smoothing operation X_t = (1/3)(Z_{t−1} + Z_t + Z_{t+1}). The autocov. f'n is:

γ_X(t, s) = 3/9 if t = s, 2/9 if |t − s| = 1, 1/9 if |t − s| = 2, and 0 if |t − s| > 2.

The autocorr. f'n is the normalized autocov. f'n: ρ_X(t, s) = 1, 2/3, 1/3, and 0 at separations |t − s| = 0, 1, 2, and > 2 respectively, so the smoothed series is noticeably less choppy.
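A simulation sketch (my own setup, unit-variance white noise) checks these values: the estimated autocovariances of the smoothed series should be close to 3/9 ≈ 0.333, 2/9 ≈ 0.222, 1/9 ≈ 0.111, and then 0.

```python
import numpy as np

rng = np.random.default_rng(2)
z = rng.standard_normal(100_000)         # white noise with variance 1
x = (z[:-2] + z[1:-1] + z[2:]) / 3.0     # X_t = (Z_{t-1} + Z_t + Z_{t+1}) / 3

xc, n = x - x.mean(), len(x)
gamma = [np.sum(xc[:n - k] * xc[k:]) / n for k in range(4)]
print(np.round(gamma, 3))                # ~[0.333, 0.222, 0.111, 0.000]
```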

3. Stationarity

3.1 Stationarity and Strict Stationarity (Based on §1.4 and §2.1 of [BD])

Intuitively, stationarity means that the graphs over two equal-length time intervals of a realization of the TS should exhibit similar statistical characteristics.

On a graph:
• no trend
• no seasonality
• no change of variability
• no apparent sharp changes of behavior

Two approaches to stationarity (an empirical check sketch follows the definitions below):

(i) Def. A process is said to be strictly stationary if the joint probability distribution associated with the n r.v.'s X_{t_1}, ..., X_{t_n}, for any set of times t_1, ..., t_n, is the same as that associated with the n r.v.'s X_{t_1+k}, ..., X_{t_n+k} for any integer k.

(ii) Def. A process is said to be (weakly, second-order) stationary if
(a) E|X_t|² < ∞ for all t ∈ T
(b) EX_t = µ for all t ∈ T
(c) γ_X(t, s) = γ_X(t + r, s + r) = γ_X(t − s) for all t, s, r ∈ T.
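As an empirical illustration of conditions (b) and (c) (a sketch of my own, not a formal test): for a stationary series, the sample mean and the lag-1 sample autocovariance should look roughly the same over different stretches of the data.

```python
import numpy as np

def window_stats(x, n_windows=4, lag=1):
    """Rough stationarity check: compare the sample mean and the lag-`lag`
    sample autocovariance across equal-length windows of the series."""
    for w in np.array_split(np.asarray(x, dtype=float), n_windows):
        m = w.mean()
        g = np.mean((w[:-lag] - m) * (w[lag:] - m))
        print(f"mean = {m:7.3f}   gamma({lag}) = {g:7.3f}")

rng = np.random.default_rng(3)
window_stats(rng.standard_normal(4000))              # stationary: rows look alike
window_stats(np.cumsum(rng.standard_normal(4000)))   # random walk: rows differ wildly
```

(The second call previews example (vi) below: a random walk fails this check.)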

Note: The value γ_X(k) is often referred to as the value of the autocov. f'n at lag k.

(iii) The ACF (the autocorr. f'n) is

ρ_X(k) = γ_X(k) / γ_X(0)

(iv) Note: γ_X(k) = γ_X(−k), ρ_X(k) = ρ_X(−k), γ_X(0) = σ_X².

(v) Example: Apply the smoothing operation to a WN series: X_t = (1/3)(Z_{t−1} + Z_t + Z_{t+1}). The autocov. f'n is:

γ_X(k) = 3/9 if k = 0, 2/9 if |k| = 1, 1/9 if |k| = 2, and 0 if |k| > 2.
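The computation behind these values, written out (a standard bilinearity-of-covariance argument, assuming unit-variance white noise):

```latex
% gamma_X(k) for X_t = (Z_{t-1} + Z_t + Z_{t+1})/3, with E Z_t = 0 and
% E Z_t Z_s = 1 if t = s, 0 otherwise:
\gamma_X(k) = \operatorname{Cov}(X_t, X_{t+k})
  = \frac{1}{9} \sum_{i=-1}^{1} \sum_{j=-1}^{1} E\bigl[Z_{t+i} Z_{t+k+j}\bigr]
  = \frac{1}{9} \,\#\{(i,j) \in \{-1,0,1\}^2 : i = k + j\},
% which gives 3/9, 2/9, 1/9 for |k| = 0, 1, 2 and 0 for |k| > 2.
```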

(vi) Example (random walk): X_t = Z_1 + ... + Z_t, where the Z_t are i.i.d. r.v.'s with m.z. and variance σ².

Non-stationary: γ_X(t, s) = σ² min(t, s), so Var(X_t) = tσ² grows with t and condition (c) fails.
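A short simulation sketch (assumed setup: 5000 independent walks with unit-variance Gaussian steps) makes the growing variance visible:

```python
import numpy as np

rng = np.random.default_rng(4)
# 5000 sample paths of X_t = Z_1 + ... + Z_t, for t = 1..100
paths = np.cumsum(rng.standard_normal((5000, 100)), axis=1)

for t in (10, 50, 100):
    print(f"t = {t:3d}:  Var(X_t) ~ {paths[:, t - 1].var():6.1f}   (theory: t*sigma^2 = {t})")
# The variance grows linearly in t, so condition (c) of stationarity fails.
```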

In general:
• strict stationarity + finite second moments imply (weak) stationarity;
• stationarity does not imply strict stationarity.

Fact: For a Gaussian TS, stationarity = strict stationarity (because the Gaussian distribution is determined by its mean and covariance).
