Gaussian, Markov and stationary processes

Alejandro Ribeiro Dept. of Electrical and Systems Engineering University of Pennsylvania [email protected] http://www.seas.upenn.edu/users/~aribeiro/

November 11, 2016

Stoch. Systems Analysis Queues 1

Introduction and roadmap

Introduction and roadmap

Gaussian processes

Brownian motion and its variants

White Gaussian noise

Stochastic processes

▶ Assign a function X(t) to a random event

▶ Without restrictions, there is little to say about stochastic processes

▶ The memoryless property makes matters simpler and is not too restrictive

▶ Have also restricted attention to discrete time and/or discrete space

▶ Simplifies matters further but might be too restrictive

▶ Here time t and the range of X(t) values are continuous

▶ Time and/or state may be discrete as particular cases

▶ Restrict attention to (any type or a combination of types)
⇒ Markov processes (memoryless)
⇒ Gaussian processes (Gaussian probability distributions)
⇒ Stationary processes ("limit distributions")

Markov processes

▶ X(t) is a Markov process when the future is independent of the past

▶ For all t > s and arbitrary values x(t), x(s) and x(u) for all u < s

  P[X(t) ≤ x(t) | X(s) = x(s), X(u) = x(u), u < s] = P[X(t) ≤ x(t) | X(s) = x(s)]

▶ Memoryless property defined in terms of cdfs, not pmfs

▶ Memoryless property useful for the same reasons as in discrete time/state

▶ But not as useful as in discrete time/state

Gaussian processes

▶ X(t) is a Gaussian process when all prob. distributions are Gaussian

▶ For arbitrary times t1, t2, ..., tn it holds
⇒ Values X(t1), X(t2), ..., X(tn) are jointly Gaussian

▶ Will define more precisely later on

▶ Simplifies study because the Gaussian distribution is the simplest possible
⇒ Suffices to know means, variances and (cross-)covariances
⇒ Linear transformation of independent Gaussians is Gaussian
⇒ Linear transformation of jointly Gaussians is Gaussian

▶ More details later

Markov processes + Gaussian processes

▶ Markov (memoryless) and Gaussian properties are different
⇒ Will study cases when both hold

▶ Brownian motion, also known as the Wiener process

▶ Brownian motion with drift

▶ White Gaussian noise ⇒ linear evolution models

▶ Geometric Brownian motion ⇒ pricing of stocks, arbitrages, risk neutral measures, pricing of stock options (Black-Scholes)

Stationary processes

▶ Process X(t) is stationary if all probabilities are invariant to time shifts

▶ I.e., for arbitrary times t1, t2, ..., tn and arbitrary time shift s

  P[X(t1 + s) ≤ x1, X(t2 + s) ≤ x2, ..., X(tn + s) ≤ xn]
    = P[X(t1) ≤ x1, X(t2) ≤ x2, ..., X(tn) ≤ xn]

▶ The system's behavior is independent of the time origin

▶ Follows from our success in studying limit probabilities

▶ Stationarity ≈ study of limit distributions π

▶ Will study
⇒ Spectral analysis of stationary stochastic processes
⇒ Linear filtering of stationary stochastic processes

Gaussian processes

Introduction and roadmap

Gaussian processes

Brownian motion and its variants

White Gaussian noise

Jointly Gaussian variables

▶ Random variables (RVs) X1, X2, ..., Xn are jointly Gaussian (normal) if any linear combination of them is Gaussian

▶ We may also say the vector RV X = [X1, ..., Xn]^T is Gaussian (normal)

▶ Formally, for any a1, a2, ..., an (vector a = [a1, ..., an]^T) the variable

  Y = a1 X1 + a2 X2 + ... + an Xn = a^T X

▶ is normally distributed

▶ Consider 2 dimensions ⇒ 2 RVs X1 and X2 jointly normal

▶ To describe the joint distribution we have to specify
⇒ Means: μ1 = E[X1] and μ2 = E[X2]
⇒ Variances: σ11² = var[X1] = E[(X1 − μ1)²] and σ22² = var[X2] = E[(X2 − μ2)²]
⇒ Covariance: σ12² = cov(X1, X2) = E[(X1 − μ1)(X2 − μ2)]

Pdf of jointly normal RVs in 2 dimensions

▶ In 2 dimensions, define the mean vector μ = [μ1, μ2]^T

▶ And the covariance matrix C with elements (C is symmetric, C = C^T)

      ( σ11²  σ12² )
  C = ( σ12²  σ22² )

▶ The joint pdf of x is given by

  f_X(x) = (1 / (2π det^(1/2)(C))) exp( −(1/2) (x − μ)^T C^(−1) (x − μ) )

▶ Assumed that C is invertible and, as a consequence, det(C) ≠ 0

▶ Can verify that any linear combination a^T x is normal if the pdf of x is as given above

Pdf of jointly normal RVs in n dimensions

▶ For X ∈ ℝⁿ (n dimensions) define μ = E[X] and the covariance matrix

                  ( E[X1²]    E[X1X2]  ...  E[X1Xn] )
  C := E[XX^T] =  ( E[X2X1]   E[X2²]   ...  E[X2Xn] )
                  (    ⋮         ⋮      ⋱      ⋮    )
                  ( E[XnX1]   E[XnX2]  ...  E[Xn²]  )

▶ C symmetric. Consistent with the 2-dimensional definition. Made μ = 0

▶ Joint pdf of x defined as before (almost)

  f_X(x) = (1 / ((2π)^(n/2) det^(1/2)(C))) exp( −(1/2) (x − μ)^T C^(−1) (x − μ) )

▶ C invertible, therefore det(C) ≠ 0. All linear combinations normal

▶ Expected value μ and covariance matrix C completely specify the probability distribution of a Gaussian vector X
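The claim that a^T X is normal with mean a^T μ and variance a^T C a can be checked by simulation. A minimal pure-Python sketch; the 2-dimensional μ, C and coefficients a are arbitrary illustrative choices, not values from the slides:

```python
import math
import random
import statistics

# Check by simulation: if X is Gaussian with mean mu and covariance C,
# then Y = a^T X has mean a^T mu and variance a^T C a.
# mu, C and a below are arbitrary illustrative values.
random.seed(1)
mu = [1.0, -2.0]
C = [[2.0, 0.8],
     [0.8, 1.0]]

# 2x2 Cholesky factor L with C = L L^T, so X = mu + L Z has covariance C
l11 = math.sqrt(C[0][0])
l21 = C[1][0] / l11
l22 = math.sqrt(C[1][1] - l21 ** 2)

def sample_x():
    """One draw of the Gaussian vector X = mu + L Z, Z standard normal."""
    z1, z2 = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    return [mu[0] + l11 * z1, mu[1] + l21 * z1 + l22 * z2]

a = [0.5, -1.5]                       # coefficients of the linear combination
ys = [a[0] * x[0] + a[1] * x[1] for x in (sample_x() for _ in range(200000))]

mean_theory = a[0] * mu[0] + a[1] * mu[1]               # a^T mu = 3.5
var_theory = (a[0] ** 2 * C[0][0] + 2 * a[0] * a[1] * C[0][1]
              + a[1] ** 2 * C[1][1])                    # a^T C a = 1.55
print(statistics.mean(ys), mean_theory)
print(statistics.pvariance(ys), var_theory)
```

The sample mean and variance of Y match a^T μ and a^T C a up to Monte Carlo error; a normality test on ys would complete the check.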

A notational aside

▶ With x ∈ ℝⁿ, μ ∈ ℝⁿ and C ∈ ℝⁿˣⁿ, define the function N(μ, C; x) as

  N(μ, C; x) := (1 / ((2π)^(n/2) det^(1/2)(C))) exp( −(1/2) (x − μ)^T C^(−1) (x − μ) )

▶ μ and C are parameters, x is the argument of the function

▶ Let X ∈ ℝⁿ be a Gaussian vector with mean μ and covariance C

▶ Can write the pdf of X as

  f_X(x) = (1 / ((2π)^(n/2) det^(1/2)(C))) exp( −(1/2) (x − μ)^T C^(−1) (x − μ) ) := N(μ, C; x)

Gaussian processes

▶ Gaussian processes (GPs) generalize Gaussian vectors to infinite dimensions

▶ X(t) is a GP if any linear combination of values X(ti) is Gaussian

▶ I.e., for arbitrary times t1, t2, ..., tn and constants a1, a2, ..., an

  Y = a1 X(t1) + a2 X(t2) + ... + an X(tn)

▶ has a normal distribution

▶ t can be a continuous or discrete time index

▶ More generally, any linear functional of X(t) is normally distributed

▶ A functional is a function of a function

▶ E.g., the (random) integral Y = ∫_{t1}^{t2} X(t) dt has a normal distribution

▶ The integral functional is akin to a sum of values X(ti)

Joint pdfs in a Gaussian process

▶ Consider times t1, t2, ..., tn. The mean value μ(ti) at such times is

  μ(ti) = E[X(ti)]

▶ The cross-covariance between values at times ti and tj is

  C(ti, tj) = E[ (X(ti) − μ(ti)) (X(tj) − μ(tj)) ]

▶ The covariance matrix for values X(t1), X(t2), ..., X(tn) is then

                   ( C(t1,t1)  C(t1,t2)  ...  C(t1,tn) )
  C(t1, ..., tn) = ( C(t2,t1)  C(t2,t2)  ...  C(t2,tn) )
                   (    ⋮         ⋮       ⋱      ⋮     )
                   ( C(tn,t1)  C(tn,t2)  ...  C(tn,tn) )

▶ The joint pdf of X(t1), X(t2), ..., X(tn) is then given as

  f_{X(t1),...,X(tn)}(x1, ..., xn) = N( [μ(t1), ..., μ(tn)]^T, C(t1, ..., tn); [x1, ..., xn]^T )

Mean value and autocorrelation functions

▶ To specify a Gaussian process, it suffices to specify:
⇒ Mean value function ⇒ μ(t) = E[X(t)]; and
⇒ Autocorrelation function ⇒ R(t1, t2) = E[X(t1)X(t2)]

▶ Covariance obtained as C(t1, t2) = R(t1, t2) − μ(t1)μ(t2)

▶ For simplicity, most of the time will consider processes with μ(t) = 0

▶ Can always define the process Y(t) = X(t) − μ_X(t) with μ_Y(t) = 0

▶ In such case C(t1, t2) = R(t1, t2)

▶ Autocorrelation is a function of two variables t1 and t2

▶ Autocorrelation is a symmetric function ⇒ R(t1, t2) = R(t2, t1)

Probabilities in a Gaussian process

▶ All probs. in a GP can be expressed in terms of μ(t) and R(t1, t2)

▶ For example, the probability density function of X(t) is

  f_{X(t)}(xt) = (1 / √(2π (R(t,t) − μ²(t)))) exp( −(xt − μ(t))² / (2 (R(t,t) − μ²(t))) )

▶ For a zero-mean process with μ(t) = 0 for all t

  f_{X(t)}(xt) = (1 / √(2π R(t,t))) exp( −xt² / (2 R(t,t)) )

Joint and conditional probabilities in a GP

▶ For a zero-mean process consider two times t1 and t2

▶ The covariance matrix for X(t1) and X(t2) is

      ( R(t1,t1)  R(t1,t2) )
  C = ( R(t1,t2)  R(t2,t2) )

▶ The joint pdf of X(t1) and X(t2) is then given as

  f_{X(t1),X(t2)}(xt1, xt2) = (1 / (2π det^(1/2)(C))) exp( −(1/2) [xt1, xt2] C^(−1) [xt1, xt2]^T )

▶ The conditional pdf of X(t1) given X(t2) is computed as

  f_{X(t1)|X(t2)}(xt1 | xt2) = f_{X(t1),X(t2)}(xt1, xt2) / f_{X(t2)}(xt2)

Brownian motion and its variants

Introduction and roadmap

Gaussian processes

Brownian motion and its variants

White Gaussian noise

Brownian motion as limit of random walk

▶ Gaussian processes are natural models due to the central limit theorem

▶ Let us reconsider a symmetric random walk in one dimension

▶ Walker takes increasingly frequent and increasingly small steps

[Figure: sample path x(t) versus t, time step = h]

Brownian motion as limit of random walk

▶ Gaussian processes are natural models due to the central limit theorem

▶ Let us reconsider a symmetric random walk in one dimension

▶ Walker takes increasingly frequent and increasingly small steps

[Figure: sample path x(t) versus t, time step = h/2]

Brownian motion as limit of random walk

▶ Gaussian processes are natural models due to the central limit theorem

▶ Let us reconsider a symmetric random walk in one dimension

▶ Walker takes increasingly frequent and increasingly small steps

[Figure: sample path x(t) versus t, time step = h/4]

Random walk, time step h and step size σ√h

▶ Let X(t) be the position at time t, with X(0) = 0

▶ Let h be the time step and σ√h the size of each step

▶ Walker steps right or left with prob. 1/2 for each direction

▶ Given X(t) = x, the prob. distribution of the position at time t + h is

  P[ X(t+h) = x + σ√h | X(t) = x ] = 1/2
  P[ X(t+h) = x − σ√h | X(t) = x ] = 1/2

▶ Consider time T = Nh and index n = 1, 2, ..., N

▶ Define step RVs Yn = ±1, equiprobable ⇒ P[Yn = ±1] = 1/2

▶ Can write X[(n+1)h] in terms of X(nh) and Yn as

  X[(n+1)h] = X(nh) + (σ√h) Yn

Central limit theorem as h → 0

▶ Use recursively to write X(T) = X(Nh) as

  X(T) = X(Nh) = X(0) + (σ√h) Σ_{n=0}^{N−1} Yn = (σ√h) Σ_{n=0}^{N−1} Yn

▶ The Yn are independent, identically distributed with mean and variance

  E[Yn] = (1/2)(1) + (1/2)(−1) = 0
  var[Yn] = (1/2)(1)² + (1/2)(−1)² = 1

▶ As h → 0 we have N = T/h → ∞, and from the central limit theorem

  Σ_{n=0}^{N−1} Yn ~ N(0, N) = N(0, T/h)

▶ Therefore ⇒ X(T) ~ N(0, (σ²h)(T/h)) = N(0, σ²T)
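The limit can be checked numerically by simulating the walk at a small but finite h. A pure-Python sketch; σ, T, h and the number of realizations are illustrative choices:

```python
import random
import statistics

# Simulate the symmetric random walk with time step h and step size
# sigma*sqrt(h); X(T) should be close to N(0, sigma^2 T).
# sigma, T, h and the number of realizations are illustrative choices.
random.seed(2)
sigma, T, h = 2.0, 1.0, 1e-3
N = int(T / h)                        # number of steps
step = sigma * h ** 0.5               # step size sigma * sqrt(h)

def walk():
    """One realization of X(T): N equiprobable +/- steps."""
    x = 0.0
    for _ in range(N):
        x += step if random.random() < 0.5 else -step
    return x

xs = [walk() for _ in range(2000)]
print(statistics.mean(xs))            # near 0
print(statistics.pvariance(xs))       # near sigma^2 * T = 4.0
```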

Conditional distribution of later values

▶ More generally, consider times T = Nh and T + S = (N + M)h

▶ Let X(T) = x(T) be given. Can write X(T+S) as

  X(T+S) = x(T) + (σ√h) Σ_{n=N}^{N+M−1} Yn

▶ From the central limit theorem it then follows

  Σ_{n=N}^{N+M−1} Yn ~ N(0, (N+M) − N) = N(0, S/h)

▶ Therefore ⇒ X(T+S) | [X(T) = x(T)] ~ N(x(T), σ²S)

Definition of Brownian motion

▶ The former is for motivational purposes

▶ Define a Brownian motion process (a.k.a. Wiener process) as one satisfying

(i) X(t) normally distributed with zero mean and variance σ²t

  X(t) ~ N(0, σ²t)

(ii) Independent increments ⇒ For disjoint intervals (t1, t2) and (s1, s2), the increments X(t2) − X(t1) and X(s2) − X(s1) are independent RVs

(iii) Stationary increments ⇒ The probability distribution of the increment X(t+s) − X(s) is the same as the probability distribution of X(t)

▶ Property (ii) ⇒ Brownian motion is a Markov process

▶ Properties (i) and (ii) ⇒ Brownian motion is a Gaussian process
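Properties (i)-(iii) translate directly into a simulation recipe: accumulate independent N(0, σ²h) increments on a grid of step h. A pure-Python sketch; σ, h and the horizon are illustrative choices:

```python
import random
import statistics

# Simulate Brownian motion from its defining properties: independent,
# stationary Gaussian increments with variance sigma^2 * h.
# sigma, h and n_steps are illustrative choices.
random.seed(3)
sigma, h, n_steps = 1.5, 0.01, 500    # final time t = n_steps * h = 5.0

def brownian_final():
    """X(t) obtained by accumulating N(0, sigma^2 h) increments."""
    x = 0.0
    for _ in range(n_steps):
        x += random.gauss(0.0, sigma * h ** 0.5)
    return x

finals = [brownian_final() for _ in range(4000)]
print(statistics.mean(finals))        # near 0
print(statistics.pvariance(finals))   # near sigma^2 * t = 11.25
```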

Mean and autocorrelation of Brownian motion

▶ The mean function μ(t) = E[X(t)] is null for all times (by definition)

  μ(t) = E[X(t)] = 0

▶ For the autocorrelation R_X(t1, t2) start with times t1 < t2

▶ Use conditional expectations to write

  R_X(t1, t2) = E[X(t1)X(t2)] = E_{X(t1)}[ E_{X(t2)|X(t1)}[ X(t1)X(t2) | X(t1) ] ]

▶ In the innermost expectation X(t1) is a given constant, then

  R_X(t1, t2) = E_{X(t1)}[ X(t1) E_{X(t2)|X(t1)}[ X(t2) | X(t1) ] ]

▶ Start by computing the innermost expectation

Autocorrelation of Brownian motion (continued)

▶ The conditional distribution of X(t2) given X(t1) is

  X(t2) | X(t1) ~ N( X(t1), σ²(t2 − t1) )

▶ The innermost expectation is then ⇒ E_{X(t2)|X(t1)}[ X(t2) | X(t1) ] = X(t1)

▶ From where the autocorrelation follows as

  R_X(t1, t2) = E_{X(t1)}[ X(t1)X(t1) ] = E_{X(t1)}[ X²(t1) ] = σ² t1

▶ Repeating steps, if t2 < t1 ⇒ R_X(t1, t2) = σ² t2

▶ Autocorrelation of Brownian motion ⇒ R_X(t1, t2) = σ² min(t1, t2)
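The result R_X(t1, t2) = σ² min(t1, t2) can be verified by Monte Carlo, sampling the pair (X(t1), X(t2)) through the independent increment over (t1, t2). Times and σ below are arbitrary illustrative values:

```python
import random
import statistics

# Monte Carlo estimate of R_X(t1, t2) = E[X(t1) X(t2)] for Brownian
# motion; should be close to sigma^2 * min(t1, t2).
# sigma, t1 and t2 are arbitrary illustrative values.
random.seed(4)
sigma, t1, t2 = 1.0, 0.5, 2.0

def product_sample():
    """Sample X(t1) * X(t2) using independent increments."""
    x1 = random.gauss(0.0, sigma * t1 ** 0.5)
    # increment over (t1, t2) is independent of X(t1)
    x2 = x1 + random.gauss(0.0, sigma * (t2 - t1) ** 0.5)
    return x1 * x2

est = statistics.mean(product_sample() for _ in range(300000))
print(est)                            # near sigma^2 * min(t1, t2) = 0.5
```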

Brownian motion with drift

▶ Similar to Brownian motion, but start with a biased random walk

▶ Time step h, step size σ√h, right or left with different probs.

  P[ X(t+h) = x + σ√h | X(t) = x ] = (1/2)(1 + (μ/σ)√h)
  P[ X(t+h) = x − σ√h | X(t) = x ] = (1/2)(1 − (μ/σ)√h)

▶ If μ > 0 biased to the right, if μ negative, biased to the left

▶ Definition requires h small enough to make (μ/σ)√h ≤ 1

▶ Notice that the bias vanishes as √h. Same as the variance

Steps

▶ Define step RVs Yn = ±1 with probabilities

  P[Yn = 1] = (1/2)(1 + (μ/σ)√h),   P[Yn = −1] = (1/2)(1 − (μ/σ)√h)

▶ Expected value of Yn is

  E[Yn] = (1) P[Yn = 1] + (−1) P[Yn = −1]
        = (1/2)(1 + (μ/σ)√h) − (1/2)(1 − (μ/σ)√h)
        = (μ/σ)√h

▶ Second moment of Yn is

  E[Yn²] = (1)² P[Yn = 1] + (−1)² P[Yn = −1] = 1

▶ Variance of Yn is ⇒ var[Yn] = E[Yn²] − E²[Yn] = 1 − (μ²/σ²)h

Write as sum of steps & central limit theorem

▶ Can write X(t) in terms of the step RVs Yn

▶ Consider time T = Nh, index n = 1, 2, ..., N. Write X[(n+1)h] as

  X[(n+1)h] = X(nh) + (σ√h) Yn

▶ Use recursively to write X(T) = X(Nh) as

  X(T) = X(Nh) = X(0) + (σ√h) Σ_{n=0}^{N−1} Yn = (σ√h) Σ_{n=0}^{N−1} Yn

▶ As h → 0 we have N → ∞ and Σ_{n=0}^{N−1} Yn normally distributed

▶ As h → 0, X(T) tends to be normally distributed

▶ Need to determine mean and variance (and only mean and variance)

Mean and variance of X(T)

▶ Expected value of X(T) = scaled sum of expected values of Yn

  E[X(T)] = (σ√h)(N)(E[Yn]) = (σ√h)(N)((μ/σ)√h) = μT

▶ Used T = Nh

▶ Variance of X(T) = scaled sum of variances of Yn

  var[X(T)] = (σ√h)²(N)(var[Yn]) = σ²h (N)(1 − (μ²/σ²)h) → σ²T

▶ Used T = Nh and 1 − (μ²/σ²)h → 1

▶ Brownian motion with drift ⇒ X(t) ~ N(μt, σ²t)
⇒ Normal with mean μt and variance σ²t
⇒ Independent and stationary increments
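The same simulation as for the driftless walk, with the biased step probabilities, confirms the limit. A pure-Python sketch with illustrative μ, σ, T and h:

```python
import random
import statistics

# Biased random walk: step +sigma*sqrt(h) w.p. (1 + (mu/sigma)sqrt(h))/2,
# else -sigma*sqrt(h). X(T) should approach N(mu*T, sigma^2*T).
# mu, sigma, T and h are illustrative choices.
random.seed(5)
mu, sigma, T, h = 1.0, 2.0, 1.0, 1e-3
N = int(T / h)
step = sigma * h ** 0.5
p_up = 0.5 * (1 + (mu / sigma) * h ** 0.5)

def walk():
    x = 0.0
    for _ in range(N):
        x += step if random.random() < p_up else -step
    return x

xs = [walk() for _ in range(2000)]
print(statistics.mean(xs))            # near mu * T = 1.0
print(statistics.pvariance(xs))       # near sigma^2 * T = 4.0
```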

Geometric Brownian motion

▶ Next state follows by multiplying by a random value

▶ Instead of adding or subtracting a random quantity

▶ Define RVs Yn = ±1 with probabilities as in biased Brownian motion

  P[Yn = 1] = (1/2)(1 + (μ/σ)√h),   P[Yn = −1] = (1/2)(1 − (μ/σ)√h)

▶ Define the geometric random walk through the recursion

  Y[(n+1)h] = Y(nh) e^{(σ√h) Yn}

▶ When Yn = 1, increase Y[(n+1)h] by relative amount e^{(σ√h)}

▶ When Yn = −1, decrease Y[(n+1)h] by relative amount e^{−(σ√h)}

▶ Notice e^{±(σ√h)} ≈ 1 ± σ√h ⇒ suitable to model investment returns

Geometric Brownian motion as function of BMD

▶ Take logarithms on both sides of the recursive definition

  log( Y[(n+1)h] ) = log( Y(nh) ) + (σ√h) Yn

▶ Define X(nh) = log( Y(nh) ) ⇒ the recursion for X(nh) is

  X[(n+1)h] = X(nh) + (σ√h) Yn

▶ As h → 0 the process X(t) becomes a BMD with parameters μ and σ

▶ Given a BMD X(t) with parameters μ, σ, the process Y(t)

  Y(t) = e^{X(t)}

▶ is a geometric Brownian motion (GBM) with parameters μ, σ
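The log relation can be checked step by step: update the geometric walk multiplicatively and its logarithm additively, and confirm they agree along the whole path. All parameters are illustrative choices:

```python
import math
import random

# Run the geometric random walk Y and the additive walk X = log Y in
# parallel and verify Y = exp(X) along the path.
# mu, sigma, h and n_steps are illustrative choices.
random.seed(6)
mu, sigma, h, n_steps = 0.5, 1.0, 1e-3, 1000
p_up = 0.5 * (1 + (mu / sigma) * math.sqrt(h))

y, x = 1.0, 0.0                       # Y(0) = 1 so that X(0) = log Y(0) = 0
for _ in range(n_steps):
    s = 1 if random.random() < p_up else -1
    y *= math.exp(sigma * math.sqrt(h) * s)   # multiplicative update
    x += sigma * math.sqrt(h) * s             # additive update of log Y
print(y, math.exp(x))                 # the two values coincide
```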

White Gaussian noise

Introduction and roadmap

Gaussian processes

Brownian motion and its variants

White Gaussian noise

Delta function

▶ Consider a function δ_h(t) defined as

  δ_h(t) = 1/h if −h/2 ≤ t ≤ h/2,   0 else

▶ "Define" the delta function as the limit of δ_h(t) as h → 0

  δ(t) = lim_{h→0} δ_h(t) = ∞ if t = 0,   0 else

▶ Is this a function? Of course not

▶ Consider the integral of δ_h(t) in an interval that includes [−h/2, h/2]

  ∫_a^b δ_h(t) dt = 1, for any a, b such that a ≤ −h/2, h/2 ≤ b

▶ Integral is 1 independently of h

Delta function (continued)

▶ Another integral involving δ_h(t) (for h small)

  ∫_a^b f(t) δ_h(t) dt ≈ f(0) ∫_a^b δ_h(t) dt ≈ f(0),   a ≤ −h/2, h/2 ≤ b

▶ Define the generalized function δ(t) as the entity having the property

  ∫_a^b f(t) δ(t) dt = f(0) if a < 0 < b,   0 else

▶ The delta function permits taking derivatives of discontinuous functions

▶ A delta function is not defined; its action on other functions is

▶ Interpretation ⇒ A delta function cannot be observed directly, but can be observed through its effect on other functions
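The approximation ∫ f(t) δ_h(t) dt ≈ f(0) can be checked numerically with a Riemann sum; f(t) = cos(t) below is an arbitrary smooth test function:

```python
import math

# Riemann-sum check that the integral of f(t) * delta_h(t) tends to
# f(0) as h shrinks; f = cos is an arbitrary smooth test function.
def integral_f_delta_h(f, h, n=10000):
    """Midpoint Riemann sum of f(t) * (1/h) over [-h/2, h/2]."""
    dt = h / n
    return sum(f(-h / 2 + (k + 0.5) * dt) * (1.0 / h) * dt for k in range(n))

for h in (1.0, 0.1, 0.01):
    print(h, integral_f_delta_h(math.cos, h))   # tends to cos(0) = 1
```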

Heaviside's step function and delta function

▶ Integral of the delta function between −∞ and t

  ∫_{−∞}^t δ(u) du = { 0 if t < 0; 1 if 0 < t } := H(t)

▶ H(t) is defined as Heaviside's step function

▶ To maintain consistency with the fundamental theorem of calculus we define the derivative of Heaviside's step function as

  ∂H(t)/∂t = δ(t)

[Figure: Heaviside's step function H(t) and the delta function δ(t) versus t]

White Gaussian noise

▶ A white Gaussian noise (WGN) process W(t) is one with
⇒ Zero mean ⇒ E[W(t)] = 0 for all t
⇒ Delta function autocorrelation ⇒ R_W(t1, t2) = σ² δ(t1 − t2)

▶ To interpret W(t) consider a time step h and a process Wh(nh) with

  Wh(nh) ~ N(0, σ²/h)

▶ Values Wh(n1 h) and Wh(n2 h) at different times are independent

▶ White noise W(t) is the limit of the process Wh(nh) as h → 0

  W(t) = lim_{h→0} Wh(nh), with n = t/h

▶ The process Wh(nh) is the discrete-time representation of white noise
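A pure-Python sketch of this discrete-time representation: samples with variance σ²/h and negligible correlation across times. σ and h are illustrative choices:

```python
import random
import statistics

# Discrete-time representation of WGN: independent samples
# Wh(nh) ~ N(0, sigma^2 / h). sigma and h are illustrative choices.
random.seed(8)
sigma, h, n = 1.0, 0.01, 50000
wh = [random.gauss(0.0, sigma / h ** 0.5) for _ in range(n)]

print(statistics.pvariance(wh))       # near sigma^2 / h = 100
# lag-1 sample correlation should be near 0 (independence)
lag1 = (sum(wh[k] * wh[k + 1] for k in range(n - 1)) / (n - 1)
        / statistics.pvariance(wh))
print(lag1)
```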

Properties of WGN

▶ For different times t1 and t2, W(t1) and W(t2) are uncorrelated

  E[W(t1)W(t2)] = R_W(t1, t2) = 0

▶ But since W(t) is Gaussian, uncorrelatedness implies independence

▶ Values of W(t) at different times are independent

▶ WGN has infinite power ⇒ E[W²(t)] = R_W(t, t) = σ² δ(0)

▶ Therefore WGN does not represent any physical phenomenon

▶ However WGN is a convenient abstraction
⇒ approximates processes with large power and (nearly) independent samples

▶ Some processes can be modeled as post-processing of WGN

▶ Cannot observe WGN directly, but can model its effect on systems

Integral of white Gaussian noise

▶ Consider the integral of a WGN process W(t) ⇒ X(t) = ∫_0^t W(u) du

▶ Since integration is a linear functional and W(t) is a GP, X(t) is also a GP
⇒ To characterize X(t) just determine mean and autocorrelation

▶ The mean function μ(t) = E[X(t)] is null

  μ(t) = E[ ∫_0^t W(u) du ] = ∫_0^t E[W(u)] du = 0

▶ The autocorrelation R_X(t1, t2) is given by (assume t1 < t2)

  R_X(t1, t2) = E[ ( ∫_0^{t1} W(u1) du1 ) ( ∫_0^{t2} W(u2) du2 ) ]

Integral of white Gaussian noise (continued)

▶ Product of integrals is the double integral of the product

  R_X(t1, t2) = E[ ∫_0^{t1} ∫_0^{t2} W(u1) W(u2) du1 du2 ]

▶ Interchange expectation & integration

  R_X(t1, t2) = ∫_0^{t1} ∫_0^{t2} E[W(u1) W(u2)] du1 du2

▶ Definition and value of the autocorrelation ⇒ R_W(u1, u2) = σ² δ(u1 − u2)

  R_X(t1, t2) = ∫_0^{t1} ∫_0^{t2} σ² δ(u1 − u2) du2 du1
              = ∫_0^{t1} ∫_0^{t1} σ² δ(u1 − u2) du2 du1 + ∫_0^{t1} ∫_{t1}^{t2} σ² δ(u1 − u2) du2 du1
              = ∫_0^{t1} σ² du1 = σ² t1

▶ Same mean and autocorrelation as Brownian motion
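The conclusion can be checked with the discrete-time representation: approximating X(t) = ∫_0^t W(u) du by the Riemann sum Σ h·Wh(kh) gives a variance close to σ²t, the Brownian motion value. Parameters are illustrative choices:

```python
import random
import statistics

# Approximate X(t) = integral of WGN by the Riemann sum of h * Wh(kh),
# with Wh(kh) ~ N(0, sigma^2 / h); var[X(t)] should be near sigma^2 * t.
# sigma, h and t are illustrative choices.
random.seed(9)
sigma, h, t = 1.0, 0.01, 2.0
n = int(t / h)

def integral_of_wgn():
    return sum(h * random.gauss(0.0, sigma / h ** 0.5) for _ in range(n))

xs = [integral_of_wgn() for _ in range(4000)]
print(statistics.mean(xs))            # near 0
print(statistics.pvariance(xs))       # near sigma^2 * t = 2.0
```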

White Gaussian noise and Brownian motion

▶ GPs are uniquely determined by mean and autocorrelation
⇒ The integral of WGN is Brownian motion
⇒ Conversely, the derivative of Brownian motion is WGN

▶ I.e., with W(t) a WGN process and X(t) Brownian motion

  ∫_0^t W(u) du = X(t)  ⟺  ∂X(t)/∂t = W(t)

▶ Brownian motion can also be interpreted as a sum of Gaussians

▶ Not Bernoullis as before

▶ Any i.i.d. distribution with the same mean and variance would work

▶ This is fine, but derivatives and integrals involve limits

▶ What are these derivatives?

Mean square derivative of a stochastic process

▶ Consider a realization x(t) of the process X(t)

▶ The derivative of (lowercase) x(t) is

  ∂x(t)/∂t = lim_{h→0} ( x(t+h) − x(t) ) / h

▶ When this limit exists ⇒ the limit may not exist for all realizations

▶ Can define a sure limit (limit exists for all realizations), an almost sure limit (exists except for a zero-measure set of realizations), a limit in probability, etc.

▶ The definition used here is in the mean square sense

▶ The process ∂X(t)/∂t is the derivative of X(t) in the mean square sense if

  lim_{h→0} E[ ( (X(t+h) − X(t))/h − ∂X(t)/∂t )² ] = 0

Mean square integral of a stochastic process

▶ Likewise consider the integral of a realization x(t) of X(t)

  ∫_a^b x(t) dt = lim_{h→0} Σ_{n=1}^{(b−a)/h} h x(a + nh)

▶ The limit need not exist for all realizations

▶ Can define it in the sure sense, almost sure sense, in probability, etc.

▶ Adopt the definition in the mean square sense

▶ The process ∫_a^b X(t) dt is the integral of X(t) in the mean square sense if

  lim_{h→0} E[ ( Σ_{n=1}^{(b−a)/h} h X(a + nh) − ∫_a^b X(t) dt )² ] = 0

▶ Mean square sense convergence is convenient to work with autocorrelations and Gaussian processes

Example: Linear state model

▶ A stochastic process X(t) follows a linear state model if

  ∂X(t)/∂t = a X(t) + W(t)

▶ With W(t) WGN with autocorrelation R_W(t1, t2) = σ² δ(t1 − t2)

▶ Discrete-time representation of X(t) ⇒ X(nh) with step size h

▶ Solving the differential eq. between nh and (n+1)h (h small)

  X((n+1)h) ≈ X(nh) e^{ah} + ∫_{nh}^{(n+1)h} W(t) dt

▶ Defining X(n) = X(nh) and W(n) = ∫_{nh}^{(n+1)h} W(t) dt may write

  X(n+1) ≈ (1 + ah) X(n) + W(n)

▶ Where E[W²(n)] = σ²h and W(n1) independent of W(n2)
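The discrete recursion is easy to simulate. For a stable a < 0 the continuous-time model has steady-state variance σ²/(2|a|) (a standard Ornstein-Uhlenbeck fact, not stated on the slide), which the recursion should reproduce for small h. A pure-Python sketch with illustrative a, σ and h:

```python
import random
import statistics

# Simulate X(n+1) = (1 + a h) X(n) + W(n) with E[W(n)^2] = sigma^2 h
# for a stable a < 0; the long-run variance should be close to the
# steady-state value sigma^2 / (2|a|) of the continuous-time model.
# a, sigma and h are illustrative choices.
random.seed(10)
a, sigma, h = -1.0, 1.0, 0.01
n_steps = 3000                        # simulate up to time n_steps * h = 30

def simulate_final():
    x = 0.0
    for _ in range(n_steps):
        w = random.gauss(0.0, sigma * h ** 0.5)   # E[W(n)^2] = sigma^2 h
        x = (1 + a * h) * x + w
    return x

xs = [simulate_final() for _ in range(800)]
print(statistics.pvariance(xs))       # near sigma^2 / (2|a|) = 0.5
```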

Example: Vector linear state model

▶ A vector stochastic process X(t) follows a linear state model if

  ∂X(t)/∂t = A X(t) + W(t)

▶ With W(t) vector WGN ⇒ R_W(t1, t2) = σ² δ(t1 − t2) I

▶ Discrete-time representation of X(t) ⇒ X(nh) with step size h

▶ Solving the differential eq. between nh and (n+1)h (h small)

  X((n+1)h) ≈ e^{Ah} X(nh) + ∫_{nh}^{(n+1)h} W(t) dt

▶ Defining X(n) = X(nh) and W(n) = ∫_{nh}^{(n+1)h} W(t) dt may write

  X(n+1) ≈ (I + Ah) X(n) + W(n)

▶ Where E[W(n)W^T(n)] = σ²h I and W(n1) independent of W(n2)
