
Summary and old exam exercises

Stationary stochastic processes

Maria Sandsten

Lecture 11

October 14 2019

Examination information

Important course topics

- Properties of a stationary process.

- Calculation of covariance functions from spectral densities.

- Characteristics of a covariance function and the corresponding realization.

- Calculation of mean values, covariance functions and spectral densities of filtered processes.

- Calculation of parameters, covariance functions and spectral densities of AR-processes and MA-processes (pole-zero plots).

- Cross-covariance and cross-spectrum calculations.

- Calculations of derivatives and integrals.

- Optimal filters (MSE, Wiener and matched filter).

- Sampling.

- Calculation of the mean value estimate.

- Knowledge of basic spectral estimation techniques.

A stationary stochastic process

A continuous time process, X(t), t ∈ R, or a discrete time process Xt, t = 0, ±1, ..., is weakly stationary if:

- The expected value, E[X(t)] = m, is constant,

- The covariance function, C[X(s), X(t)] = r(t − s) = r(τ), depends only on the lag τ,

- The variance, V[X(t)] = r(0),

- The correlation function is ρ(τ) = r(τ)/r(0).

For a real-valued stationary stochastic process we have

- V[X(t)] = r(0) ≥ 0,

- r(−τ) = r(τ),

- |r(τ)| ≤ r(0).

Spectral density

For a weakly stationary process there exists a positive, symmetric and integrable spectral density function R(f) such that

r(τ) = ∫_{−∞}^{∞} R(f) e^{i2πfτ} df,    R(f) = ∫_{−∞}^{∞} r(τ) e^{−i2πfτ} dτ,

in continuous time, and

r(τ) = ∫_{−1/2}^{1/2} R(f) e^{i2πfτ} df,    R(f) = Σ_{τ=−∞}^{∞} r(τ) e^{−i2πfτ},

in discrete time. If E[X(t)] = 0, the power and the variance coincide:

E[X²(t)] = V[X(t)] = r(0) = ∫_{−∞}^{∞} R(f) df.

Discrete time white noise

For discrete time white noise,

R(f) = σ², −1/2 < f ≤ 1/2,

and

r(τ) = σ² for τ = 0, and r(τ) = 0 for τ = ±1, ±2, ...
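As a numerical sanity check (a sketch using numpy, not part of the course material; the grid size is arbitrary), the inverse transform of the constant spectral density over (−1/2, 1/2] recovers the white-noise covariance:

```python
import numpy as np

# White noise: R(f) = sigma^2 on (-1/2, 1/2].  Numerically evaluate
# r(tau) = integral of R(f) e^{i 2 pi f tau} df at integer lags.
sigma2 = 3.0
f = np.linspace(-0.5, 0.5, 20001)      # dense frequency grid
R = np.full_like(f, sigma2)            # constant spectral density

def r(tau):
    """Numerical inverse Fourier transform at integer lag tau."""
    return np.trapz(R * np.exp(1j * 2 * np.pi * f * tau), f).real
```

At lag 0 the integral gives σ², and at nonzero integer lags it vanishes, as the slide states.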

Random harmonic function

The random harmonic function

X(t) = A cos(2πf₀t + φ), t ∈ R,

is a stationary process with A > 0 and φ ∈ Rect(0, 2π). The covariance function is

r(τ) = (E[A²]/2) cos(2πf₀τ), τ ∈ R,

for all f₀ > 0, and the spectral density is defined as

R(f) = (E[A²]/4) (δ(f − f₀) + δ(f + f₀)).

The random harmonic function is also defined for discrete time with 0 < f₀ ≤ 0.5.
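The covariance formula can be verified numerically by averaging X(t)X(t + τ) over the uniform phase φ (a sketch with an arbitrary fixed amplitude A, so that E[A²] = A²; the parameter values are chosen for illustration):

```python
import numpy as np

# Random harmonic function with fixed amplitude A and phase phi
# uniform on (0, 2*pi).  Averaging X(t)X(t+tau) over phi should give
# r(tau) = (E[A^2]/2) * cos(2*pi*f0*tau), independent of t.
A, f0, t = 2.0, 0.1, 0.7
phi = np.linspace(0.0, 2 * np.pi, 100001)

def r_num(tau):
    x1 = A * np.cos(2 * np.pi * f0 * t + phi)
    x2 = A * np.cos(2 * np.pi * f0 * (t + tau) + phi)
    return np.trapz(x1 * x2, phi) / (2 * np.pi)   # average over phi

def r_theory(tau):
    return (A ** 2 / 2) * np.cos(2 * np.pi * f0 * tau)
```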

Filtering of stationary processes

For continuous time processes the output Y(t), t ∈ R, is obtained from the input X(t), t ∈ R, through

Y(t) = ∫_{−∞}^{∞} h(u) X(t − u) du,

where h(t) is the impulse response. The corresponding frequency function is

H(f) = ∫_{−∞}^{∞} h(t) e^{−i2πft} dt.

For discrete time processes, the integrals are changed to sums.

Filtering of stationary processes

The mean value is

mY = mX ∫_{−∞}^{∞} h(u) du = mX H(0),

the covariance function is

rY(τ) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} h(u) h(v) rX(τ + u − v) du dv,

and the spectral density is

RY(f) = |H(f)|² RX(f).

For discrete time processes, the integrals are changed to sums.
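For a concrete discrete-time check (a sketch, with a hypothetical two-tap averaging filter and unit-variance white input), the time-domain covariance formula and the inverse transform of RY(f) = |H(f)|² RX(f) agree:

```python
import numpy as np

# Filter white noise (sigma^2 = 1) through h = [0.5, 0.5].
# Compare r_Y(tau) = sum_{u,v} h(u) h(v) r_X(tau + u - v)
# with the inverse transform of R_Y(f) = |H(f)|^2 * R_X(f).
h = np.array([0.5, 0.5])
f = np.linspace(-0.5, 0.5, 20001)
H = h[0] + h[1] * np.exp(-1j * 2 * np.pi * f)   # frequency function
RY = np.abs(H) ** 2                              # R_X(f) = 1

def rY_time(tau):
    # r_X(tau + u - v) equals 1 only when tau + u - v == 0
    return sum(h[u] * h[v] for u in range(2) for v in range(2)
               if tau + u - v == 0)

def rY_freq(tau):
    return np.trapz(RY * np.exp(1j * 2 * np.pi * f * tau), f).real
```

Both give rY(0) = 0.5 and rY(±1) = 0.25 here.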

The MA(q)-process

A moving average process of order q, MA(q), is given by

Xt = c0et + c1et−1 + ... + cqet−q,

where et, t = 0, ±1, ±2, ..., is a zero-mean white noise with variance σ². The expected value mX = 0 and the covariance function satisfies

rX(τ) = C[Xt, Xt+τ] ≠ 0 only for |τ| ≤ q.

The spectral density is the discrete time Fourier transform of rX(τ), or

RX(f) = σ² |c0 + c1 e^{−i2πf} + ... + cq e^{−i2πfq}|².
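The two descriptions are consistent, which can be checked for a small example (a sketch with a hypothetical MA(1) process Xt = et + 0.5 et−1, σ² = 1, where rX(0) = c0² + c1², rX(±1) = c0 c1, and rX(τ) = 0 beyond lag 1):

```python
import numpy as np

# MA(1): X_t = e_t + 0.5 e_{t-1}, sigma^2 = 1.
# Invert R_X(f) = sigma^2 |c0 + c1 e^{-i 2 pi f}|^2 numerically and
# compare with the direct covariances.
c = np.array([1.0, 0.5])
sigma2 = 1.0
f = np.linspace(-0.5, 0.5, 20001)
RX = sigma2 * np.abs(c[0] + c[1] * np.exp(-1j * 2 * np.pi * f)) ** 2

def r_from_spectrum(tau):
    return np.trapz(RX * np.exp(1j * 2 * np.pi * f * tau), f).real
```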

The AR(p)-process

An auto-regressive process of order p, AR(p), is given by

Xt + a1Xt−1 + a2Xt−2 + ... + apXt−p = et ,

where et, t = 0, ±1, ±2, ..., is zero-mean white Gaussian noise with variance σ². The expected value mX = 0, and the covariance function solves the Yule-Walker equations

rX(τ) + a1 rX(τ − 1) + ... + ap rX(τ − p) = σ² for τ = 0, and = 0 for τ = 1, 2, ...

The spectral density is

RX(f) = σ² / |1 + a1 e^{−i2πf} + ... + ap e^{−i2πfp}|².
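The Yule-Walker solution and the spectral density must give the same variance r(0). A sketch for a hypothetical AR(1) process Xt − 0.6 Xt−1 = et (so a1 = −0.6, σ² = 1), where Yule-Walker gives r(0) = σ²/(1 − 0.6²):

```python
import numpy as np

# AR(1): X_t - 0.6 X_{t-1} = e_t, i.e. a1 = -0.6, sigma^2 = 1.
# Integrate the spectral density over one period and compare with
# the closed-form variance r(0) = sigma^2 / (1 - 0.6^2).
a1, sigma2 = -0.6, 1.0
f = np.linspace(-0.5, 0.5, 200001)
RX = sigma2 / np.abs(1 + a1 * np.exp(-1j * 2 * np.pi * f)) ** 2
r0_spectral = np.trapz(RX, f)
r0_yw = sigma2 / (1 - 0.6 ** 2)
```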

Old exam 180108-1

The following figures show realizations, covariance functions and spectral densities of one AR(p)-process and one MA(q)-process. Determine, with motivation, which figures show realizations, which show covariance functions and which show spectral densities. Also state which realization, covariance function and spectral density belong together. Decide which type each process is (AR or MA) and which order it has. The order can be assumed to be less than 10 for both processes. Important! To receive full credit, correct motivations must be given for all answers. (10p)

Old exam 080306-5

A weakly stationary process Xt , t = 0, ±1, ±2 ..., is defined as

Xt − Xt−1 + 0.5Xt−2 = et + B

where et is white noise with expected value zero and variance σ² = 1. B is a stochastic variable with variance σB², which is independent of et. Compute the covariance function rX(τ) for τ = 0, ±1, ±2, ±3.

(20p)

Solution

Define a new zero-mean process Zt = Xt − cB, which is then an AR(2)-process: Zt − Zt−1 + 0.5Zt−2 = et. Substituting Xt = Zt + cB into Xt − Xt−1 + 0.5Xt−2 = et + B and taking expectations gives

E[Zt + cB] − E[Zt−1 + cB] + 0.5E[Zt−2 + cB] = E[et ] + E[B].

As E[Zt ] = 0 and E[et ] = 0 by definition,

cE[B] − cE[B] + 0.5cE[B] = E[B],

giving c = 2. The Yule-Walker equations for Zt are the same as in the AR(2) example from Example 8.pdf among the lecture notes, giving for σ² = 1: rZ(0) = 2.4, rZ(±1) = 1.6, rZ(±2) = 0.4 and rZ(±3) = −0.4. Accordingly

rX(τ) = C[Zt + 2B, Zt+τ + 2B] = rZ(τ) + 4σB².

(Also read Example 4.12, page 105).
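The quoted Yule-Walker values can be reproduced by solving the equations as a linear system (a numerical sketch, using r(−τ) = r(τ) to reduce to the unknowns r(0), r(1), r(2)):

```python
import numpy as np

# Yule-Walker for Z_t - Z_{t-1} + 0.5 Z_{t-2} = e_t
# (a1 = -1, a2 = 0.5, sigma^2 = 1):
#   tau = 0:  r(0) + a1 r(1) + a2 r(2) = sigma^2
#   tau = 1:  r(1) + a1 r(0) + a2 r(1) = 0   (using r(-1) = r(1))
#   tau = 2:  r(2) + a1 r(1) + a2 r(0) = 0
a1, a2, sigma2 = -1.0, 0.5, 1.0
A = np.array([[1.0, a1, a2],
              [a1, 1.0 + a2, 0.0],
              [a2, a1, 1.0]])
r0, r1, r2 = np.linalg.solve(A, [sigma2, 0.0, 0.0])
r3 = -a1 * r2 - a2 * r1     # recursion at tau = 3
```

This returns r(0) = 2.4, r(1) = 1.6, r(2) = 0.4 and r(3) = −0.4, matching the solution above.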

Cross-covariance, cross spectrum and coherence spectrum

The cross-covariance function is defined as

rX,Y(τ) = C[X(t), Y(t + τ)].

Note that rX,Y(τ) ≠ rX,Y(−τ), but rX,Y(τ) = rY,X(−τ). The complex-valued cross spectrum RX,Y(f) is defined so that

rX,Y(τ) = ∫ RX,Y(f) e^{i2πfτ} df.

The quadratic coherence spectrum is defined as

κ²X,Y(f) = |RX,Y(f)|² / (RX(f) RY(f)), with 0 ≤ κ²X,Y(f) ≤ 1.

The same formulations apply for discrete time processes.

Differentiation

A weakly stationary process X(t), t ∈ R, is said to be differentiable in quadratic mean with derivative X′(t) if rX(τ) is twice differentiable, or equivalently if

V[X′(t)] = ∫_{−∞}^{∞} (2πf)² RX(f) df < ∞.

We have mX′ = 0,

rX′(τ) = −rX″(τ),

and

RX′(f) = (2πf)² RX(f).

The cross-covariance function is rX,X′(t, t + τ) = r′X(τ), and rX,X′(t, t) = 0.
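As a numerical illustration of RX′(f) = (2πf)² RX(f), take the Gaussian covariance rX(τ) = e^{−τ²} (the one in exam 181102-2 below), whose spectral density is the standard Fourier pair RX(f) = √π e^{−π²f²}. Then V[X′(t)] = −rX″(0) = 2, which the spectral integral must reproduce (a sketch; the truncation at |f| = 10 is harmless since the Gaussian has decayed completely there):

```python
import numpy as np

# r_X(tau) = exp(-tau^2)  <->  R_X(f) = sqrt(pi) * exp(-pi^2 f^2).
# The derivative process has variance -r_X''(0) = 2; compare with
# the integral of (2 pi f)^2 R_X(f).
f = np.linspace(-10.0, 10.0, 200001)      # effectively (-inf, inf)
RX = np.sqrt(np.pi) * np.exp(-np.pi ** 2 * f ** 2)
var_deriv = np.trapz((2 * np.pi * f) ** 2 * RX, f)
```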

Old exam 171028-3

A zero-mean, weakly stationary process X(t), t ∈ R, has the spectral density

RX(f) = π e^{−2π|f|}.

State, with a short motivation, for each of the following statements whether it is right or wrong. Each correct answer with motivation gives 2 credits.

a) The covariance function is rX(τ) = 1/(1 + τ²).
b) The process is differentiable in quadratic mean.
c) The variance V[(X(0) + X(t))/2] approaches zero when t → ∞.
d) After filtering through a linear filter with frequency function H(f) = 1 + if, the spectral density of the process is constant for all f.
e) A new process Y(t) = 3X(t) − 2X′(t) is created. The process Y(t) is a Gaussian process.

(10p)

Old exam 181102-2

A stationary Gaussian process X (t), t ∈ R, has expected value E[X (t)] = 2 and covariance function

rX(τ) = e^{−τ²}.

Compute the probability,

P(X′(t) ≥ X(t)).

(10p)

Optimal filters

- Mean square error optimal filter: minimize E[(Y(t) − S(t))²].

- The matched filter for binary detection is

hopt(u) = s(T − u),

for the zero-mean white noise disturbance case. For equal decision errors, the decision level is k = sout(T)/2 with errors α = β = 1 − Φ(sout(T)/(2σN)).

- The Wiener filter frequency function is

Hopt(f) = RS(f) / (RS(f) + RN(f)),

where RS(f) and RN(f) are the signal and noise spectral densities, respectively.
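A small sketch of the Wiener weighting, with hypothetical spectral densities (RS(f) = 1/(1 + (2πf)²) for the signal and a flat RN(f) = 0.5 for the noise, both chosen only for illustration): Hopt(f) is real, lies between 0 and 1, and is largest where the signal dominates the noise.

```python
import numpy as np

# Wiener filter frequency function for assumed (hypothetical)
# signal and noise spectral densities.
f = np.linspace(-5.0, 5.0, 1001)
RS = 1.0 / (1.0 + (2 * np.pi * f) ** 2)   # signal spectral density
RN = np.full_like(f, 0.5)                  # flat noise spectral density
H = RS / (RS + RN)                         # H_opt(f) = RS / (RS + RN)
```

At f = 0 the signal-to-noise ratio is 2, so Hopt(0) = 2/3; far from f = 0 the filter suppresses the (noise-dominated) frequencies.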

Sampling

The continuous time process Y (t), t ∈ R is sampled to the discrete time sequence Zt = Y (t), t = 0, ±d, ±2d,.... The covariance function is

rZ (τ) = rY (τ), τ = 0, ±d, ±2d,...

and the spectral density

RZ(f) = Σ_{k=−∞}^{∞} RY(f + k fs), −fs/2 < f ≤ fs/2,

with fs = 1/d as the sampling frequency. With τ = nd, rZ (τ) is converted to rX (n), n = 0, ±1, ±2,..., and

RX (ν) = fs · RZ (νfs ),

for ν = f · d = f /fs .
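Aliasing reshapes the spectrum but preserves the total power. A sketch with a hypothetical triangular density RY(f) = max(0, 1 − |f|/2) (total power 2) sampled at fs = 1: the folded spectrum RZ is integrated over one period and compared with the original power.

```python
import numpy as np

# Fold a triangular spectral density R_Y(f) = max(0, 1 - |f|/2)
# at sampling frequency fs = 1 and check power preservation:
# integral of R_Z over (-fs/2, fs/2] = integral of R_Y over R.
fs = 1.0
f = np.linspace(-fs / 2, fs / 2, 20001)

def RY(f):
    return np.maximum(0.0, 1.0 - np.abs(f) / 2.0)

RZ = sum(RY(f + k * fs) for k in range(-10, 11))   # folded spectrum
power_z = np.trapz(RZ, f)                           # should equal 2
```

For this particular choice the aliases happen to sum to a completely flat RZ(f) = 2, i.e. the sampled process looks like white noise even though Y(t) is not.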

Estimation of mean

If X(t), t = 1, 2, ..., is weakly stationary with unknown expected value m, then

m̂n = (1/n) Σ_{t=1}^{n} X(t)

is an unbiased estimate of m, as E[m̂n] = m. The variance is

V[m̂n] = C[(1/n) Σ_{t=1}^{n} X(t), (1/n) Σ_{s=1}^{n} X(s)] = (1/n²) Σ_{τ=−n+1}^{n−1} (n − |τ|) r(τ).

For large n,

V[m̂n] ≈ (1/n) Σ_τ r(τ).
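The double-sum and single-sum forms of V[m̂n] are identical, which a quick sketch confirms for an assumed covariance r(τ) = 0.8^{|τ|} (a hypothetical example, not from the course material):

```python
# Variance of the mean estimate: compare the double sum over
# C[X(t), X(s)] with the single-sum form (1/n^2) sum (n - |tau|) r(tau).
n = 50
r = lambda tau: 0.8 ** abs(tau)

v_double = sum(r(t - s) for t in range(n) for s in range(n)) / n ** 2
v_single = sum((n - abs(tau)) * r(tau)
               for tau in range(-(n - 1), n)) / n ** 2
```

The large-n approximation (1/n) Σ r(τ) = 9/n = 0.18 is already close to the exact value at n = 50.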

Old exam 171028-5

We define the weakly stationary Gaussian process Y(t) = m + X(t), t ∈ R, where m is an unknown constant expected value. The process X(t) is assumed to have E[X(t)] = 0 and spectral density

RX(f) = 1 for |f − 2.1| ≤ 0.1, 1 for |f + 2.1| ≤ 0.1, and 0 otherwise.

An estimate of m should be found as

m̂n = (Y(d) + Y(2d) + ... + Y(nd))/n,

by sampling the process Y(t) with sample distance d. There are two possible choices of sample distance, d = 1/4 and d = 1/2.

a) Calculate the covariance function rY(τ). (4p)
b) Determine the spectral density of the sampled process Zt = Y(t), t = 0, ±d, ±2d, ..., for the two suggested sample distances, d = 1/4 and d = 1/2. (10p)
c) Determine nV[m̂n] when n → ∞ for d = 1/4 and d = 1/2. Which of the suggested sample distances would you prefer? (6p)

Spectrum estimation

The periodogram is defined as

R̂x(f) = (1/n) |Σ_t x(t) w(t) e^{−i2πft}|²,

where w(t) is a data window. With the Hanning window, the spectrum estimate has better leakage properties (lower sidelobes), although the resolution is somewhat degraded (wider mainlobe), in comparison to the rectangular window.

The variance of the periodogram is

V[R̂x(f)] ≈ R²X(f), 0 < |f| < 0.5.

Dividing the sequence into K possibly overlapping sequences and calculating the average of K approximately uncorrelated estimates (Welch method) reduces the variance to

V[R̂mv(f)] ≈ (1/K) R²X(f).
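The 1/K variance reduction is easy to see in simulation. A sketch using non-overlapping segments of white Gaussian noise (a simplification of the Welch method, which may also overlap the segments; segment length, K and the seed are arbitrary):

```python
import numpy as np

# Average K periodograms of non-overlapping white-noise segments and
# compare the spread across frequencies with a single periodogram.
rng = np.random.default_rng(0)
n, K = 256, 16
x = rng.standard_normal(n * K)

def periodogram(seg):
    return np.abs(np.fft.rfft(seg)) ** 2 / len(seg)

single = periodogram(x[:n])
avg = np.mean([periodogram(x[i * n:(i + 1) * n]) for i in range(K)],
              axis=0)
# exclude f = 0 and f = 0.5, where the periodogram behaves differently
var_single = np.var(single[1:-1])
var_avg = np.var(avg[1:-1])
```

Both estimates hover around the true spectral density RX(f) = 1, but the averaged one fluctuates far less, consistent with the 1/K factor.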

Old exam 180108-3

[Figure: two panels, A and B, each showing 10 log10(E[R̂x(f)]) in dB (from −20 to 30) versus frequency f, 0 ≤ f ≤ 0.4.]

a) In the figures above, the expected values of the periodogram with the rectangular and the Hanning window, calculated from n samples, are illustrated with solid lines. Which of the figures shows the expected value using each of the two windows? Motivate your answer.

b) The sequence is divided into K sequences and the final estimate is calculated as the average of K uncorrelated periodograms. How much does the variance decrease in comparison to the periodogram in a)? (10p)