Rutgers, The State University of New Jersey
College of Engineering, Department of Electrical and Computer Engineering

332:322 Principles of Communications Systems
Spring 2004, Problem Set 9

Haykin: 1.1–1.10

1. Consider a random process $X(t)$ defined by
\[
X(t) = \sin(2\pi f_c t),
\]
in which the frequency $f_c$ is a random variable uniformly distributed over the interval $[0, W]$. Show that $X(t)$ is nonstationary. Hint: examine specific sample functions of the random process $X(t)$ for the frequencies $f = W/2$, $W/4$, and $W$, say.

SOLUTION: An easy way to solve this problem is to find the mean of the random process $X(t)$:
\[
E[X(t)] = \frac{1}{W}\int_0^W \sin(2\pi f t)\,df
        = \frac{1}{2\pi W t}\left[1 - \cos(2\pi W t)\right].
\]
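The time dependence of this mean is easy to check numerically. A Monte Carlo sketch (not part of the assigned solution; $W = 1$ and the sample times are arbitrary illustrative choices):

```python
import numpy as np

# Draw one random frequency fc ~ Uniform[0, W] per sample function and
# estimate E[X(t)] = E[sin(2*pi*fc*t)] at a few fixed times t.
rng = np.random.default_rng(0)
W = 1.0
fc = rng.uniform(0.0, W, size=500_000)

def mean_estimate(t):
    """Monte Carlo estimate of E[X(t)]."""
    return np.sin(2 * np.pi * fc * t).mean()

def mean_analytic(t):
    """Closed form (1 - cos(2*pi*W*t)) / (2*pi*W*t) derived above."""
    return (1 - np.cos(2 * np.pi * W * t)) / (2 * np.pi * W * t)

for t in (0.25, 0.5, 1.5):
    print(t, mean_estimate(t), mean_analytic(t))
```

The estimates track the closed form and clearly change with $t$, which is the nonstationarity being shown.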

Clearly $E[X(t)]$ is a function of time, and hence the process $X(t)$ is not stationary.

2. Let $X$ and $Y$ be statistically independent Gaussian-distributed random variables, each with zero mean and unit variance. Define the Gaussian process
\[
Z(t) = X\cos(2\pi t) + Y\sin(2\pi t).
\]

(a) Determine the joint probability density function of the random variables $Z(t_1)$ and $Z(t_2)$ obtained by observing $Z(t)$ at times $t_1$ and $t_2$, respectively.

SOLUTION: Since every weighted sum of samples of the Gaussian process $Z(t)$ is Gaussian, $Z(t_1)$ and $Z(t_2)$ are jointly Gaussian random variables. Hence we need only find their means, variances, and correlation coefficient to evaluate the joint Gaussian PDF.
\[
E[Z(t_1)] = \cos(2\pi t_1)\,E[X] + \sin(2\pi t_1)\,E[Y].
\]
Since $E[X] = E[Y] = 0$, $E[Z(t_1)] = 0$. Similarly, $E[Z(t_2)] = 0$.
\begin{align*}
\mathrm{Cov}[Z(t_1)Z(t_2)]
&= E[Z(t_1)Z(t_2)] \\
&= E[(X\cos 2\pi t_1 + Y\sin 2\pi t_1)(X\cos 2\pi t_2 + Y\sin 2\pi t_2)] \\
&= \cos(2\pi t_1)\cos(2\pi t_2)\,E[X^2]
 + (\cos 2\pi t_1 \sin 2\pi t_2 + \sin 2\pi t_1 \cos 2\pi t_2)\,E[XY] \\
&\quad + \sin(2\pi t_1)\sin(2\pi t_2)\,E[Y^2].
\end{align*}
Noting that $E[X^2] = 1$, $E[Y^2] = 1$, and $E[XY] = E[X]\,E[Y] = 0$ (since $X$ and $Y$ are independent), we obtain
\[
\mathrm{Cov}[Z(t_1)Z(t_2)] = \cos(2\pi(t_1 - t_2)).
\]
$\sigma^2_{Z(t_1)} = E[Z^2(t_1)] = 1$; this result is obtained by putting $t_1 = t_2$ in $\mathrm{Cov}[Z(t_1)Z(t_2)]$. Similarly, $\sigma^2_{Z(t_2)} = E[Z^2(t_2)] = 1$.

The correlation coefficient is given by
\[
\rho = \frac{\mathrm{Cov}[Z(t_1)Z(t_2)]}{\sigma_{Z(t_1)}\sigma_{Z(t_2)}}
     = \cos(2\pi(t_1 - t_2)).
\]
Hence the joint PDF is
\[
f_{Z(t_1),Z(t_2)}(z_1, z_2) = C \exp[-Q(z_1, z_2)],
\]
where
\[
C = \frac{1}{2\pi\sqrt{1 - \cos^2(2\pi(t_1 - t_2))}}
  = \frac{1}{2\pi\,|\sin(2\pi(t_1 - t_2))|},
\]
\[
Q(z_1, z_2) = \frac{1}{2\sin^2(2\pi(t_1 - t_2))}
\left[z_1^2 - 2\cos(2\pi(t_1 - t_2))\,z_1 z_2 + z_2^2\right].
\]

(b) Is the process $Z(t)$ stationary? Why?

SOLUTION: We found that $E[Z(t)] = 0$ for all $t$ and that the covariance of $Z(t_1)$ and $Z(t_2)$ depends only on the time difference $t_1 - t_2$. The process $Z(t)$ is hence wide-sense stationary. Since it is Gaussian, it is also strict-sense stationary.
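The covariance result above can be confirmed by simulation. A sketch (the observation times $t_1$, $t_2$ are arbitrary illustrative choices):

```python
import numpy as np

# Estimate Cov[Z(t1)Z(t2)] for Z(t) = X cos(2*pi*t) + Y sin(2*pi*t), with
# X, Y independent standard normals, and compare with cos(2*pi*(t1 - t2)).
rng = np.random.default_rng(1)
n = 500_000
X = rng.standard_normal(n)
Y = rng.standard_normal(n)

def Z(t):
    return X * np.cos(2 * np.pi * t) + Y * np.sin(2 * np.pi * t)

t1, t2 = 0.30, 0.17
cov_sim = np.mean(Z(t1) * Z(t2))            # both means are zero
cov_theory = np.cos(2 * np.pi * (t1 - t2))
print(cov_sim, cov_theory)
```

The unit-variance property can be checked the same way by taking $t_1 = t_2$.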

3. The square wave $x(t)$ of FIGURE 1, of constant amplitude $A$, period $T_0$, and delay $t_d$, represents a sample function of a random process $X(t)$.

Figure 1: Square wave for $x(t)$

The delay is random, described by the probability density function
\[
f_{T_d}(t_d) =
\begin{cases}
\dfrac{1}{T_0}, & -\dfrac{T_0}{2} \le t_d \le \dfrac{T_0}{2} \\[2mm]
0, & \text{otherwise.}
\end{cases}
\]

(a) Determine the probability density function of the random variable $X(t_k)$ obtained by observing the random process $X(t)$ at time $t_k$.

SOLUTION: $X(t)$ is a square wave, and it takes on the two values $0$ and $A$ with equal probability. Hence the PDF is
\[
f_{X(t)}(x) = \frac{1}{2}\delta(x) + \frac{1}{2}\delta(x - A).
\]

(b) Determine the mean and autocorrelation function of $X(t)$ using ensemble averaging.

SOLUTION: Using the definition of the ensemble average for the mean of a random process, we find

\[
E[X(t)] = \int_{-\infty}^{\infty} x\, f_{X(t)}(x)\,dx
        = 0\cdot\frac{1}{2} + A\cdot\frac{1}{2} = \frac{A}{2}.
\]

Autocorrelation: denote the square wave with random delay $t_D$, period $T_0$, and amplitude $A$ by $A\,\mathrm{Sq}_{T_0}(t - t_D)$. Then the autocorrelation can be written as
\begin{align*}
R_X(\tau) &= E[A\,\mathrm{Sq}_{T_0}(t - t_D)\; A\,\mathrm{Sq}_{T_0}(t + \tau - t_D)] \\
&= A^2 \int_{-\infty}^{\infty} \mathrm{Sq}_{T_0}(t - t_D)\,\mathrm{Sq}_{T_0}(t + \tau - t_D)\, f_{T_D}(t_D)\, dt_D \\
&= \frac{A^2}{T_0} \int_{-T_0/2}^{T_0/2} \mathrm{Sq}_{T_0}(t - t_D)\,\mathrm{Sq}_{T_0}(t + \tau - t_D)\, dt_D \\
&= \frac{A^2}{2}\left(1 - \frac{2|\tau|}{T_0}\right), \qquad |\tau| \le \frac{T_0}{2}.
\end{align*}
Since the square wave is periodic with period $T_0$, $R_X(\tau)$ must also be periodic with period $T_0$.

(c) Determine the mean and autocorrelation function of $X(t)$ using time averaging.

SOLUTION: On a time-averaging basis, we note by inspection that the mean is

\[
\langle x(t) \rangle = \frac{A}{2},
\]
and the time-averaged autocorrelation is
\[
\langle x(t)\,x(t+\tau) \rangle
= \frac{1}{T_0}\int_{-T_0/2}^{T_0/2} x(t)\,x(t+\tau)\, dt
= \frac{A^2}{2}\left(1 - \frac{2|\tau|}{T_0}\right), \qquad |\tau| \le \frac{T_0}{2}.
\]
Again, the autocorrelation is periodic with period $T_0$.

(d) Establish whether or not $X(t)$ is stationary. In what sense is it ergodic?

SOLUTION: We note that ensemble averaging and time averaging yield the same mean and autocorrelation function. Therefore, $X(t)$ is ergodic in both the mean and the autocorrelation function. Since ergodicity implies wide-sense stationarity, it follows that $X(t)$ must be wide-sense stationary.

4. Consider two linear filters connected in cascade as in FIGURE 2. Let $X(t)$ be a stationary process with autocorrelation function $R_X(\tau)$. The random process appearing at the first filter output is $V(t)$, and at the second filter output, $Y(t)$.

Figure 2: Cascade of linear filters

(a) Find the autocorrelation function of $Y(t)$.

SOLUTION: The cascade connection of the two filters is equivalent to a single filter with impulse response
\[
h(t) = \int_{-\infty}^{\infty} h_1(u)\, h_2(t - u)\, du.
\]
The autocorrelation function of $Y(t)$ is then
\[
R_Y(\tau) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}
h(\tau_1)\, h(\tau_2)\, R_X(\tau - \tau_1 + \tau_2)\, d\tau_1\, d\tau_2.
\]

(b) Find the cross-correlation function $R_{VY}(\tau)$ of $V(t)$ and $Y(t)$.

SOLUTION: The cross-correlation function of $V(t)$ and $Y(t)$ is
\[
R_{VY}(\tau) = E[V(t)\,Y(t + \tau)].
\]
$V(t)$ and $Y(t)$ are related as follows:
\[
Y(t) = \int_{-\infty}^{\infty} V(\lambda)\, h_2(t - \lambda)\, d\lambda.
\]
Therefore,
\begin{align*}
R_{VY}(\tau) &= E\left[V(t)\int_{-\infty}^{\infty} V(\lambda)\, h_2(t + \tau - \lambda)\, d\lambda\right] \\
&= \int_{-\infty}^{\infty} h_2(t + \tau - \lambda)\, E[V(t)V(\lambda)]\, d\lambda \\
&= \int_{-\infty}^{\infty} h_2(t + \tau - \lambda)\, R_V(t - \lambda)\, d\lambda.
\end{align*}
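The cascade-equivalence step used in part (a) has an exact discrete-time analogue: filtering by $h_1$ and then $h_2$ is the same as filtering once by their convolution. A sketch with arbitrary illustrative filter taps:

```python
import numpy as np

# Cascade of two FIR filters versus the single equivalent filter h = h1 * h2.
rng = np.random.default_rng(3)
h1 = np.array([1.0, 0.5, 0.25])
h2 = np.array([0.8, -0.3])
x = rng.standard_normal(1000)              # stationary white input X[n]

v = np.convolve(x, h1)                     # first filter output V[n]
y_cascade = np.convolve(v, h2)             # second filter output Y[n]

h = np.convolve(h1, h2)                    # equivalent single impulse response
y_direct = np.convolve(x, h)

print(np.allclose(y_cascade, y_direct))    # the two outputs agree
```

Because the two outputs coincide sample by sample, their autocorrelations coincide as well, which is what lets $R_Y(\tau)$ be written in terms of the single impulse response $h$.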

5. A random telegraph signal $X(t)$, characterized by the autocorrelation function
\[
R_X(\tau) = \exp(-2v|\tau|),
\]
where $v$ is a constant, is applied to the low-pass RC filter of FIGURE 3. Determine the power spectral density and autocorrelation function of the random process at the filter output.

SOLUTION: The power spectral density of the random telegraph wave is
\begin{align*}
S_X(f) &= \int_{-\infty}^{\infty} R_X(\tau)\exp(-j2\pi f\tau)\, d\tau \\
&= \int_{-\infty}^{0} \exp(2v\tau)\exp(-j2\pi f\tau)\, d\tau
 + \int_{0}^{\infty} \exp(-2v\tau)\exp(-j2\pi f\tau)\, d\tau \\
&= \frac{1}{2v - j2\pi f} + \frac{1}{2v + j2\pi f}
 = \frac{v}{v^2 + \pi^2 f^2}.
\end{align*}

Figure 3: RC Filter

The transfer function of the filter is
\[
H(f) = \frac{1}{1 + j2\pi f RC}.
\]
Therefore, the PSD of the filter output is
\[
S_Y(f) = |H(f)|^2\, S_X(f)
       = \frac{v}{(v^2 + \pi^2 f^2)\left[1 + (2\pi f RC)^2\right]}.
\]
To find the autocorrelation function of the filter output, we first expand $S_Y(f)$ in partial fractions:
\[
S_Y(f) = \frac{v}{1 - 4R^2C^2v^2}
\left[\frac{1}{v^2 + \pi^2 f^2} - \frac{1}{(1/2RC)^2 + \pi^2 f^2}\right].
\]
Recognizing that

\[
\mathrm{IFT}\left\{\frac{1}{v^2 + \pi^2 f^2}\right\} = \frac{1}{v}\exp(-2v|\tau|),
\]
\[
\mathrm{IFT}\left\{\frac{1}{(1/2RC)^2 + \pi^2 f^2}\right\} = 2RC\exp(-|\tau|/RC),
\]
where IFT stands for the inverse Fourier transform, we get $R_Y(\tau) = \mathrm{IFT}\{S_Y(f)\}$:
\[
R_Y(\tau) = \frac{v}{1 - 4R^2C^2v^2}
\left[\frac{\exp(-2v|\tau|)}{v} - 2RC\exp\left(-\frac{|\tau|}{RC}\right)\right].
\]
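Both the partial-fraction expansion and the total output power can be checked numerically. Setting $\tau = 0$ in $R_Y(\tau)$ gives $R_Y(0) = 1/(1 + 2vRC)$, which must equal $\int S_Y(f)\,df$. A sketch using the illustrative values $v = 1$, $RC = 0.3$ (any values with $2vRC \ne 1$ work):

```python
import numpy as np

v, RC = 1.0, 0.3

def S_Y(f):
    """Output PSD |H(f)|^2 * S_X(f)."""
    return v / ((v**2 + np.pi**2 * f**2) * (1 + (2 * np.pi * f * RC)**2))

def S_Y_pf(f):
    """Partial-fraction form derived above."""
    a = 1 / (2 * RC)
    return (v / (1 - 4 * RC**2 * v**2)
            * (1 / (v**2 + np.pi**2 * f**2) - 1 / (a**2 + np.pi**2 * f**2)))

f = np.linspace(-200, 200, 400_001)
print(np.allclose(S_Y(f), S_Y_pf(f)))          # the two forms agree

# Total output power: integral of S_Y should equal R_Y(0) = 1/(1 + 2*v*RC).
power = np.sum(S_Y(f)) * (f[1] - f[0])
print(power, 1 / (1 + 2 * v * RC))
```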

6. A stationary Gaussian process $X(t)$ has zero mean and power spectral density $S_X(f)$. Determine the probability density function of a random variable obtained by observing the process $X(t)$ at some time $t_k$.

SOLUTION: Let $\mu_x$ be the mean and $\sigma_x^2$ the variance of the random variable $X_k$ obtained by observing the random process at time $t_k$. Then
\[
\mu_x = 0,
\]
\[
\sigma_x^2 = E[(X_k - \mu_x)^2] = E[X_k^2].
\]
We note that
\[
\sigma_x^2 = E[X_k^2] = \int_{-\infty}^{\infty} S_X(f)\, df.
\]
The PDF of the Gaussian random variable $X_k$ is given by
\[
f_{X_k}(x) = \frac{1}{\sqrt{2\pi\sigma_x^2}}
\exp\left(-\frac{x^2}{2\sigma_x^2}\right).
\]
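The variance-from-PSD relation can be made concrete with an assumed PSD; the problem leaves $S_X(f)$ general, so the ideal low-pass shape below ($S_X(f) = N_0/2$ for $|f| \le W$, zero elsewhere) is purely illustrative:

```python
import numpy as np

# Assumed ideal low-pass PSD: sigma_x^2 = integral of S_X = N0 * W.
N0, W = 2.0, 5.0
f = np.linspace(-2 * W, 2 * W, 200_001)
S_X = np.where(np.abs(f) <= W, N0 / 2, 0.0)
var = np.sum(S_X) * (f[1] - f[0])          # expect N0 * W = 10

# The first-order PDF is N(0, sigma_x^2); check it integrates to 1.
x = np.linspace(-50, 50, 400_001)
pdf = np.exp(-x**2 / (2 * var)) / np.sqrt(2 * np.pi * var)
print(var, np.sum(pdf) * (x[1] - x[0]))
```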

7. A stationary Gaussian process $X(t)$ with mean $\mu_x$ and variance $\sigma_X^2$ is passed through two linear filters with impulse responses $h_1(t)$ and $h_2(t)$, yielding processes $Y(t)$ and $Z(t)$, as shown in FIGURE 4.

Figure 4: Parallel systems.

(a) Determine the joint probability density function of the random variables $Y(t_1)$ and $Z(t_2)$.

SOLUTION: Since $X(t)$ is a Gaussian random process, the random variables $Y(t_1)$ and $Z(t_2)$ are jointly Gaussian. Hence, to find the joint PDF, we need the variances $\sigma^2_{Y_1}$ and $\sigma^2_{Z_2}$, the means $\mu_{Y_1}$ and $\mu_{Z_2}$, and the correlation coefficient $\rho = \mathrm{Cov}[Y(t_1)Z(t_2)]/(\sigma_{Y_1}\sigma_{Z_2})$.
\[
Y(t_1) = \int_{-\infty}^{\infty} X(t_1 - \tau)\, h_1(\tau)\, d\tau,
\qquad
\mu_{Y_1} = H_1(0)\,\mu_x,
\]
where $H_1(0) = \int_{-\infty}^{\infty} h_1(\tau)\, d\tau$ and $\mu_x$ is the mean of the stationary random process $X(t)$. Similarly,
\[
Z(t_2) = \int_{-\infty}^{\infty} X(t_2 - u)\, h_2(u)\, du,
\qquad
\mu_{Z_2} = H_2(0)\,\mu_x,
\]
where $H_2(0) = \int_{-\infty}^{\infty} h_2(u)\, du$.

The covariance of $Y(t_1)$ and $Z(t_2)$ is
\begin{align*}
\mathrm{Cov}[Y(t_1)Z(t_2)]
&= E[(Y(t_1) - \mu_{Y_1})(Z(t_2) - \mu_{Z_2})] \\
&= E\left[\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}
(X(t_1 - \tau) - \mu_x)(X(t_2 - u) - \mu_x)\, h_1(\tau)\, h_2(u)\, d\tau\, du\right] \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}
E[(X(t_1 - \tau) - \mu_x)(X(t_2 - u) - \mu_x)]\, h_1(\tau)\, h_2(u)\, d\tau\, du \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}
C_X(t_1 - t_2 - \tau + u)\, h_1(\tau)\, h_2(u)\, d\tau\, du,
\end{align*}
where $C_X(\tau)$ is the autocovariance function of $X(t)$.

\[
\sigma^2_{Y_1} = E[(Y(t_1) - \mu_{Y_1})^2]
= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}
C_X(\tau - u)\, h_1(\tau)\, h_1(u)\, d\tau\, du,
\]
\[
\sigma^2_{Z_2} = E[(Z(t_2) - \mu_{Z_2})^2]
= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}
C_X(\tau - u)\, h_2(\tau)\, h_2(u)\, d\tau\, du.
\]

Finally, the joint Gaussian PDF can be written as
\[
f_{Y(t_1),Z(t_2)}(y, z) = k\exp[-Q(y, z)],
\]
where
\[
k = \frac{1}{2\pi\sigma_{Y_1}\sigma_{Z_2}\sqrt{1 - \rho^2}},
\]
\[
Q(y, z) = \frac{1}{2(1 - \rho^2)}
\left[\frac{(y - \mu_{Y_1})^2}{\sigma^2_{Y_1}}
- \frac{2\rho\,(y - \mu_{Y_1})(z - \mu_{Z_2})}{\sigma_{Y_1}\sigma_{Z_2}}
+ \frac{(z - \mu_{Z_2})^2}{\sigma^2_{Z_2}}\right].
\]

(b) What conditions are necessary and sufficient to ensure that $Y(t_1)$ and $Z(t_2)$ are statistically independent?

SOLUTION: The random variables $Y(t_1)$ and $Z(t_2)$ are uncorrelated if and only if their covariance is zero. Since $Y(t)$ and $Z(t)$ are jointly Gaussian processes, it follows that $Y(t_1)$ and $Z(t_2)$ are statistically independent if and only if $\mathrm{Cov}[Y(t_1)Z(t_2)] = 0$. Therefore the necessary and sufficient condition for $Y(t_1)$ and $Z(t_2)$ to be statistically independent is that
\[
\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}
C_X(t_1 - t_2 - \tau + u)\, h_1(\tau)\, h_2(u)\, d\tau\, du = 0.
\]
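The covariance formula has a simple discrete-time analogue that can be checked by simulation. Assuming a white Gaussian input, $C_X[k] = \sigma^2\delta[k]$, the double sum collapses to $\mathrm{Cov}[Y(t_1)Z(t_2)] = \sigma^2\sum_u h_1[u + d]\,h_2[u]$ with $d = t_1 - t_2$. All taps and parameter values below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(4)
h1 = np.array([1.0, -0.4, 0.2])
h2 = np.array([0.5, 0.3])
sigma2, mu_x, d = 1.5, 1.0, 1
t2 = 8                                     # observation indices; t1 = t2 + d
t1 = t2 + d

# Many independent realizations of a white Gaussian input with mean mu_x.
X = mu_x + np.sqrt(sigma2) * rng.standard_normal((200_000, 16))
y = sum(h1[k] * X[:, t1 - k] for k in range(len(h1)))   # Y(t1)
z = sum(h2[k] * X[:, t2 - k] for k in range(len(h2)))   # Z(t2)
cov_sim = np.mean((y - y.mean()) * (z - z.mean()))

cov_theory = sigma2 * sum(h1[u + d] * h2[u] for u in range(len(h2)))
print(cov_sim, cov_theory)
```

The sample mean of `y` also matches $H_1(0)\mu_x$, i.e. `h1.sum() * mu_x`, consistent with the mean formula in part (a).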

8. A stationary Gaussian process $X(t)$ with zero mean and power spectral density $S_X(f)$ is applied to a linear filter whose impulse response $h(t)$ is shown in FIGURE 5. A sample $Y$ of the random process at the filter output is taken at time $T$.

Figure 5: $h(t)$ for problem 8

(a) Determine the mean and variance of $Y$.

SOLUTION: The filter output is

\[
Y(t) = \int_{-\infty}^{\infty} h(\tau)\, X(t - \tau)\, d\tau,
\qquad
Y(T) = \frac{1}{T}\int_0^T X(T - \tau)\, d\tau.
\]
Put $T - \tau = u$. Then the sample value of $Y(t)$ at $t = T$ equals
\[
Y = \frac{1}{T}\int_0^T X(u)\, du.
\]
The mean of $Y$ is therefore
\[
E[Y] = E\left[\frac{1}{T}\int_0^T X(u)\, du\right]
     = \frac{1}{T}\int_0^T E[X(u)]\, du = 0.
\]

The variance of $Y$ is
\begin{align*}
\sigma_Y^2 &= E[Y^2] - E[Y]^2 = R_Y(0) \\
&= \int_{-\infty}^{\infty} S_Y(f)\, df
 = \int_{-\infty}^{\infty} S_X(f)\, |H(f)|^2\, df,
\end{align*}
where
\begin{align*}
H(f) &= \int_{-\infty}^{\infty} h(t)\exp(-j2\pi f t)\, dt \\
&= \frac{1}{T}\int_0^T \exp(-j2\pi f t)\, dt \\
&= \operatorname{sinc}(fT)\exp(-j\pi f T).
\end{align*}
Therefore,
\[
\sigma_Y^2 = \int_{-\infty}^{\infty} S_X(f)\,\operatorname{sinc}^2(fT)\, df.
\]

(b) What is the probability density function of $Y$?

SOLUTION: Since the filter input is Gaussian, the filter output is also Gaussian, and it follows that $Y$ is Gaussian. Hence the PDF of $Y$ is
\[
f_Y(y) = \frac{1}{\sqrt{2\pi\sigma_Y^2}}
\exp\left(-\frac{y^2}{2\sigma_Y^2}\right).
\]
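The transfer function derived in part (a) can be verified numerically: the averaging filter $h(t) = 1/T$ on $[0, T]$ should have $H(f) = \operatorname{sinc}(fT)\,e^{-j\pi f T}$, with $\operatorname{sinc}(x) = \sin(\pi x)/(\pi x)$ (NumPy's `np.sinc` convention). $T = 0.5$ and the test frequencies are arbitrary illustrative values:

```python
import numpy as np

T = 0.5
t = np.linspace(0.0, T, 100_001)
dt = t[1] - t[0]

def H_numeric(f):
    """(1/T) * integral over [0, T] of exp(-j*2*pi*f*t), trapezoidal rule."""
    g = np.exp(-1j * 2 * np.pi * f * t)
    return np.sum(g[:-1] + g[1:]) * dt / 2 / T

def H_closed(f):
    """Closed form sinc(f*T) * exp(-j*pi*f*T) derived above."""
    return np.sinc(f * T) * np.exp(-1j * np.pi * f * T)

for f in (0.0, 0.7, 3.2):
    print(f, H_numeric(f), H_closed(f))
```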