
Plan: Fun with White Noise, Last Part
1. The Poisson Process cont'd/revisited
2. Brownian Motion
3. Stationary Processes (if time allows – ha ha – or skip til later)

Next Time: Markov Processes (finally)

Reading: G&S 6.1, 6.2, generating function sheet
Homework 2 due Today
Homework 3 due next week

Figure 1. A sample path of the counting process $N_t$ against $t$, marking the arrival times $S_1, \ldots, S_5$ and the interarrival times $X_1 = S_1$, $X_2, \ldots, X_5$.

Example 2. Poisson Process. Consider a random walk with $S_0 = 0$ and $X_i$ iid Exponential$\langle\lambda\rangle$. (Note: if $W_i = X_i - 1/\lambda$, then $S_n = \sum_i W_i + n/\lambda$, a sum of white noise plus drift.) Then $S_n$ has a Gamma$\langle n, \lambda\rangle$ distribution.

Define $N_t = \max\{n \ge 0 \text{ such that } S_n \le t\}$. A few properties directly from the definition:

1. $N_0 = 0$.
2. $N_t \ge n \iff S_n \le t$.
3. $N_t = n \iff S_n \le t$ and $S_{n+1} > t$.
4. If $s < t$, $N_t - N_s$ counts the number of "arrivals" between $s$ and $t$. This $N_t - N_s$ is often called the increment of the process over $(s, t]$.

We have the following.

\begin{align}
P\{N_t = k\} &= P\{S_k \le t,\; S_{k+1} > t\} \tag{1}\\
&= E\, P\{S_k \le t,\; S_{k+1} > t \mid S_k\} \tag{2}\\
&= \int_0^t P\{X_{k+1} > t - u\}\, P\{S_k \text{ near } u\} \tag{3}\\
&= \int_0^t e^{-\lambda(t-u)}\, \frac{\lambda^k u^{k-1}}{\Gamma(k)}\, e^{-\lambda u}\, du \tag{4}\\
&= e^{-\lambda t}\, \frac{\lambda^k}{k!}\, k \int_0^t u^{k-1}\, du \tag{5}\\
&= e^{-\lambda t}\, \frac{(\lambda t)^k}{k!}, \tag{6}
\end{align}

so $N_t$ has a Poisson$\langle\lambda t\rangle$ distribution.

Next, we consider the distribution of $N_t - N_s$ for some $s < t$.
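As a quick sanity check on (6), here is a minimal Monte Carlo sketch (assuming numpy and scipy are available): build $S_n$ from iid Exponential$\langle\lambda\rangle$ gaps, count $N_t$, and compare the empirical distribution to the Poisson$\langle\lambda t\rangle$ pmf.

```python
# Sketch: N_t = max{n : S_n <= t} should be Poisson<lam * t>.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lam, t, reps = 2.0, 3.0, 100_000

# Interarrival times X_i ~ Exponential<lam>; S_n = X_1 + ... + X_n.
X = rng.exponential(1 / lam, size=(reps, 50))   # 50 gaps >> lam * t arrivals
S = np.cumsum(X, axis=1)
N_t = (S <= t).sum(axis=1)                      # N_t = #{n : S_n <= t}

for k in range(8):
    print(k, (N_t == k).mean(), stats.poisson.pmf(k, lam * t))
```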

\begin{align}
P\{N_t - N_s \ge k\}
&= P\{S_{k+N_s} \le t\} \tag{7}\\
&= P\{S_{k+N_s} - S_{N_s} \le t - S_{N_s}\} \tag{8}\\
&= P\{S_{k+N_s} - S_{N_s} \le t - s + (s - S_{N_s})\} \tag{9}\\
&= E\, P\{S_{k+N_s} - S_{N_s} \le t - s + (s - S_{N_s}) \mid N_s, S_{N_s}\} \tag{10}\\
&= \sum_{n=0}^\infty \int_0^s P\{S_{k+N_s} - S_{N_s} \le t - s + (s - S_{N_s}) \mid N_s = n,\, S_{N_s} \text{ near } u\}\, P\{N_s = n,\, S_{N_s} \text{ near } u\} \tag{11}\\
&= \sum_{n=0}^\infty \int_0^s P\{S_{k+n} - S_n \le t - s + (s - u) \mid N_s = n,\, S_{N_s} \text{ near } u\}\, P\{N_s = n,\, S_{N_s} \text{ near } u\} \tag{12}\\
&= \sum_{n=0}^\infty \int_0^s P\Big\{X_{n+1} - (s - u) + \sum_{i=n+2}^{n+k} X_i \le t - s \,\Big|\, N_s = n,\, S_{N_s} \text{ near } u\Big\}\, P\{N_s = n,\, S_{N_s} \text{ near } u\} \tag{13}\\
&= \sum_{n=0}^\infty \int_0^s P\Big\{X_{n+1} - (s - u) + \sum_{i=n+2}^{n+k} X_i \le t - s \,\Big|\, X_{n+1} > s - u,\, S_n \text{ near } u\Big\}\, P\{N_s = n,\, S_{N_s} \text{ near } u\} \tag{14}\\
&= \sum_{n=0}^\infty \int_0^s P\Big\{X_{n+1} - (s - u) + \sum_{i=n+2}^{n+k} X_i \le t - s \,\Big|\, X_{n+1} > s - u\Big\}\, P\{N_s = n,\, S_{N_s} \text{ near } u\} \tag{15}
\end{align}
\begin{align}
&= \sum_{n=0}^\infty \int_0^s P\Big\{X_{n+1} + \sum_{i=n+2}^{n+k} X_i \le t - s\Big\}\, P\{N_s = n,\, S_{N_s} \text{ near } u\} \tag{16}\\
&= \sum_{n=0}^\infty \int_0^s P\{S_{n+k} - S_n \le t - s\}\, P\{N_s = n,\, S_{N_s} \text{ near } u\} \tag{17}\\
&= \sum_{n=0}^\infty P\{S_{n+k} - S_n \le t - s\}\, P\{N_s = n\} \tag{18}\\
&= \sum_{n=0}^\infty P\{S_k \le t - s\}\, P\{N_s = n\} \tag{19}\\
&= P\{S_k \le t - s\} \tag{20}\\
&= P\{N_{t-s} \ge k\}. \tag{21}
\end{align}

The step from (15) to (16), where the conditioning is dropped, requires an argument. It follows from the relation

$$P\{X > t + u \mid X > u\} = P\{X > t\} \tag{22}$$

for a random variable $X$ with an Exponential distribution. Thus we see that $N_t - N_s$ has the same distribution as $N_{t-s}$.
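The memoryless relation (22) is easy to see numerically. A small sketch (assuming numpy):

```python
# Check P{X > t + u | X > u} = P{X > t} for X ~ Exponential<lam>.
import numpy as np

rng = np.random.default_rng(1)
lam, t, u = 1.5, 0.7, 1.2
X = rng.exponential(1 / lam, size=1_000_000)

lhs = (X > t + u).mean() / (X > u).mean()   # conditional probability estimate
rhs = (X > t).mean()
print(lhs, rhs, np.exp(-lam * t))           # all three should nearly agree
```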

Now consider $s' < t' < s < t$. We could use the same basic argument to show that $N_t - N_s$ and $N_{t'} - N_{s'}$ are independent, but it gets a bit messy. It's easier using generating functions. Note that
$$G(z; \lambda) = \sum_{n=0}^\infty e^{-\lambda}\, \frac{\lambda^n}{n!}\, z^n = e^{\lambda(z-1)}. \tag{23}$$
From this, we get the generating function of $N_t$:

\begin{align}
G_{N_t}(z) &= e^{\lambda t(z-1)} \tag{24}\\
&= e^{\lambda(t - s + s - t' + t' - s' + s' - 0)(z-1)} \tag{25}\\
&= e^{\lambda(t-s)(z-1)}\, e^{\lambda(s-t')(z-1)}\, e^{\lambda(t'-s')(z-1)}\, e^{\lambda(s'-0)(z-1)} \tag{26}\\
&= G_{N_{t-s}}(z)\, G_{N_{s-t'}}(z)\, G_{N_{t'-s'}}(z)\, G_{N_{s'-0}}(z) \tag{27}\\
&= G_{N_t - N_s}(z)\, G_{N_s - N_{t'}}(z)\, G_{N_{t'} - N_{s'}}(z)\, G_{N_{s'} - N_0}(z). \tag{28}
\end{align}

But $N_t = (N_t - N_s) + (N_s - N_{t'}) + (N_{t'} - N_{s'}) + (N_{s'} - N_0)$, and this equality of generating functions implies that the components are independent. (There's a brief argument needed, which I'll show you, to make this rigorous.) Conversely, if we show independence first, we could show equality in distribution with the same relation. Thus, from a white noise random walk, we get the following process.
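The algebra in (24)-(28) can also be checked by simulation. A sketch (assuming numpy): estimate $E[z^{N_t}]$ from Poisson draws and compare it to the product of the increment pgfs over the partition $0 < s' < t' < s < t$.

```python
# Check: pgf of N_t equals the product of the increment pgfs.
import numpy as np

rng = np.random.default_rng(2)
lam, z = 1.0, 0.6
sp, tp, s, t = 0.5, 1.0, 1.8, 2.5           # s' < t' < s < t
reps = 200_000

N_t = rng.poisson(lam * t, size=reps)        # N_t ~ Poisson<lam * t>
lhs = np.mean(z ** N_t)                      # Monte Carlo estimate of G_{N_t}(z)

# Product over increments, each Poisson<lam * interval length>:
lengths = [t - s, s - tp, tp - sp, sp - 0.0]
rhs = np.prod([np.exp(lam * d * (z - 1)) for d in lengths])
print(lhs, rhs)                              # should nearly agree
```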

Process 3. Let $(N_t)_{t \ge 0}$ be a process with the following properties:

1. $N_0 = 0$.
2. For $0 \le s < t$, $N_t - N_s$ has a Poisson$\langle\lambda(t - s)\rangle$ distribution.
3. For $0 \le s_1 < t_1 < s_2 < t_2 < \cdots < s_m < t_m$, the random variables $N_{t_i} - N_{s_i}$ are independent.

This is called a (homogeneous) Poisson process with rate $\lambda$. Property 2 has two parts. The first is the specific distribution of the increment $N_t - N_s$. The second is that the distribution of the increment depends on time only through $t - s$. This is the property of stationary increments: the distributions of increments in two time intervals of the same length are equal. Property 3 is called independent increments: the increments in disjoint time intervals are stochastically independent. Poisson processes are examples of both renewal processes and point processes, both of which we will study later.
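Both increment properties are visible in simulation. A sketch (assuming numpy; the helper N() is just for this example): build paths from Exponential$\langle\lambda\rangle$ gaps, then probe stationarity and independence of increments.

```python
# Sketch of Process 3: stationary and independent increments, empirically.
import numpy as np

rng = np.random.default_rng(3)
lam, reps = 2.0, 50_000
S = np.cumsum(rng.exponential(1 / lam, size=(reps, 80)), axis=1)

def N(t):
    return (S <= t).sum(axis=1)              # one count per simulated path

# Stationary increments: N_3 - N_1 should match N_2 in distribution.
print(np.mean(N(3.0) - N(1.0)), np.mean(N(2.0)))     # both ~ lam * 2

# Independent increments: disjoint intervals should be uncorrelated.
inc1, inc2 = N(1.0) - N(0.0), N(3.0) - N(2.0)
print(np.corrcoef(inc1, inc2)[0, 1])                 # ~ 0
```

Zero correlation is of course weaker than independence, but it is the cheap check; the generating function argument above is what gives independence.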

Definition 4. Let $L^2(0,1)$ be (up to some formal details) the set of functions $g$ on $(0,1)$ such that $\int_0^1 g^2 < \infty$. A complete, orthonormal basis $(\psi_n)$ for $L^2(0,1)$ is a countable subset of $L^2(0,1)$ such that $\int \psi_j \psi_k = \delta_{jk}$ and, for any $g \in L^2(0,1)$, we can find $c_n = \int g \psi_n$ such that
$$\int \left( g - \sum_{k=0}^n c_k \psi_k \right)^2 \to 0, \tag{29}$$
as $n \to \infty$.

Example 5. Brownian Motion. Define a process $(\xi_t)_{t \ge 0}$ as follows. Let $(A_n)_{n \ge 0}$ be a standard normal white noise process, i.e., the $A_n$ are iid Normal$\langle 0, 1\rangle$. Define
$$\xi_t(\omega) = \sum_{n=0}^\infty A_n(\omega)\, \psi_n(t), \tag{30}$$
for a particular complete orthonormal basis $(\psi_n)$. Taking some liberties (we'll see a more formal derivation another time), we will think of $\xi_t$ as a continuous white noise process, meaning that in some formal sense we should have $E \xi_t = 0$ and $E \xi_s \xi_t = \delta(s - t)$. Arguing loosely, this works:

\begin{align}
E \xi_t &= \sum_n E A_n\, \psi_n(t) = 0 \tag{31}\\
E \xi_s \xi_t &= \sum_{n,m} E A_n A_m\, \psi_n(t)\, \psi_m(s) \tag{32}\\
&= \sum_n \psi_n(t)\, \psi_n(s), \tag{33}
\end{align}

which "makes sense" when $s \ne t$ because

$$\int dt\, E \xi_s \xi_t = \sum_n \psi_n(s) \int dt\, \psi_n(t) = 0. \tag{34}$$

But don't take that last calculation too seriously.

Now, just as we got a random walk process by taking cumulative sums of a discrete white noise process, we can see what we get when we take cumulative integrals of a continuous white noise process. Define
$$W_t = \int_0^t \xi_s\, ds = \sum_{n=0}^\infty A_n \int_0^t \psi_n(s)\, ds, \tag{35}$$
where we choose a specific basis $(\psi_n)$.

Definition 6. Define $H = 1_{(0,1/2]} - 1_{(1/2,1]}$. Then let
$$H_{jk}(t) = 2^{j/2} H(2^j t - k). \tag{36}$$
Then $H_0 = 1_{(0,1]}$ and the $H_{jk}$ for $j \ge 0$, $k = 0, \ldots, 2^j - 1$ form a complete orthonormal basis for $L^2(0,1)$. It is called the Haar basis. To see this, note that $\int H_0^2 = 1$, $\int H_{jk} H_{j'k'} = \delta_{jj'} \delta_{kk'}$, and $\int H_{jk} = 0$. And we have that $\alpha H_0 + \sum_{j=0}^{J-1} \sum_{k=0}^{2^j - 1} \beta_{jk} H_{jk}$ gives a representation for all piecewise constant functions on dyadic intervals of length $2^{-J}$. (We'll have a cool martingale proof of this another time.)
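Orthonormality of the Haar basis is easy to verify numerically. A sketch (assuming numpy; the function name haar is mine): build the basis through level $j = 3$ on a fine midpoint grid and check that the Gram matrix is the identity.

```python
# Sketch: the Haar functions of Definition 6 are orthonormal.
import numpy as np

def haar(j, k, t):
    """H_jk(t) = 2^{j/2} H(2^j t - k), with H = 1_(0,1/2] - 1_(1/2,1]."""
    x = 2.0**j * t - k
    return 2.0**(j / 2) * (((0 < x) & (x <= 0.5)).astype(float)
                           - ((0.5 < x) & (x <= 1)).astype(float))

t = np.linspace(0, 1, 2**12, endpoint=False) + 2.0**-13   # midpoint grid
basis = [np.ones_like(t)]                                 # H_0 = 1_(0,1]
basis += [haar(j, k, t) for j in range(4) for k in range(2**j)]

# Inner products by midpoint quadrature, exact for these step functions:
G = np.array([[np.mean(u * v) for v in basis] for u in basis])
print(np.allclose(G, np.eye(len(basis)), atol=1e-8))      # Gram matrix ~ I
```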

Example 5 cont'd. Now order the Haar functions $(H_0, H_{00}, H_{10}, H_{11}, H_{20}, H_{21}, H_{22}, H_{23}, \ldots)$ and label these as $\psi_n$ for $n \ge 0$. (For $2^j \le n < 2^{j+1}$ and $j \in \mathbb{Z}_+$, take $k = n - 2^j$ and let $H_n \equiv H_{jk}$.) For $n \ge 1$, define
$$S_n(t) \equiv \int_0^t H_n(s)\, ds, \tag{37}$$
called the Schauder function. It follows that

$$W_t = \sum_{n \ge 0} A_n S_n(t). \tag{38}$$

Figure 7. The Schauder function $S_n(t)$ for $2^j \le n < 2^{j+1}$, $k = n - 2^j$: a triangular bump supported on $[k 2^{-j}, (k+1) 2^{-j}]$ with peak height $2^{-j/2-1}$.
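Here is a sketch (assuming numpy; the function name schauder is mine) of the tent functions in Figure 7 and the truncated expansion (38). Integrating the Haar step $H_n$ gives a tent that rises with slope $2^{j/2}$ and then falls, peaking at $2^{-j/2-1}$.

```python
# Sketch: Schauder functions and an approximate Brownian path via (38).
import numpy as np

def schauder(n, t):
    """S_n(t) = integral of H_n from 0 to t, for n >= 1 with n = 2^j + k."""
    j = int(np.floor(np.log2(n)))
    k = n - 2**j
    # Tent on [k 2^-j, (k+1) 2^-j], peak 2^(-j/2 - 1) at the midpoint.
    x = np.clip(2.0**j * t - k, 0.0, 1.0)
    return 2.0**(-j / 2) * np.minimum(x, 1.0 - x)

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 513)
n_max = 2**10                                  # truncation level
A = rng.standard_normal(n_max)                 # iid Normal<0,1> coefficients
W = A[0] * t                                   # n = 0 term: S_0(t) = t
for n in range(1, n_max):
    W = W + A[n] * schauder(n, t)              # approximate Brownian path
print(W[:5])
```

Plotting W against t shows the familiar rough Brownian path; raising n_max refines it dyadic level by dyadic level.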

Lemma 8. Let $(a_k)$ be a real sequence that satisfies $|a_k| = O(k^\gamma)$ for some $0 \le \gamma < 1/2$. Define $f(t) = \sum_{k \ge 0} a_k S_k(t)$ and let $f_n(t)$ be the corresponding partial sum. Then $f_n \to f$ uniformly on $(0,1)$, meaning that
$$\sup_{0 < t < 1} |f_n(t) - f(t)| \to 0. \tag{39}$$

Lemma 10. For $s, t \in [0,1]$, $\sum_{n \ge 0} S_n(s)\, S_n(t) = \min(s, t)$.

Proof of Lemma 10. Let $\phi_s = 1_{[0,s]}$. Then, if $s \le t$, Parseval's relation for the basis $(H_n)$ gives
$$s = \int_0^1 \phi_t \phi_s = \sum_{n \ge 0} a_n b_n, \tag{40}$$
where
\begin{align}
a_n &= \int_0^1 \phi_t H_n = S_n(t) \tag{41}\\
b_n &= \int_0^1 \phi_s H_n = S_n(s). \tag{42}
\end{align}
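Lemma 10 can be checked directly. A sketch (assuming numpy; schauder is redefined here so the snippet stands alone):

```python
# Check: sum_n S_n(s) S_n(t) converges to min(s, t).
import numpy as np

def schauder(n, t):
    """S_n(t) = integral of the Haar function H_n from 0 to t (n >= 1)."""
    j = int(np.floor(np.log2(n)))
    x = np.clip(2.0**j * t - (n - 2**j), 0.0, 1.0)
    return 2.0**(-j / 2) * np.minimum(x, 1.0 - x)

for s, t in [(0.3, 0.7), (0.3, 0.3)]:
    total = s * t                              # n = 0 term: S_0(u) = u
    total += sum(schauder(n, s) * schauder(n, t) for n in range(1, 2**14))
    print(total, min(s, t))                    # should nearly agree
```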

Example 5 cont'd. So $W_t$ as defined exists as a random function: since the $A_n$ are iid Normal$\langle 0,1\rangle$, almost surely $|A_n|$ grows slower than any positive power of $n$, so Lemma 8 applies to the partial sums of (38). We get the following:

\begin{align}
E W_t &= \sum_{n \ge 0} E A_n\, S_n(t) = 0 \tag{43}\\
E W_t^2 &= \sum_{n \ge 0} E A_n^2\, S_n(t)^2 = t \tag{44}
\end{align}

$$E W_s W_t = \sum_{n,m \ge 0} E A_n A_m\, S_n(t)\, S_m(s) = \min(s, t) \tag{45}$$

and, for $u < s < t$,

$$E (W_t - W_s) W_u = \sum_{n,m \ge 0} E A_n A_m\, (S_n(t) - S_n(s))\, S_m(u) = \min(t, u) - \min(s, u) = 0. \tag{46}$$

Moreover, using characteristic functions, with $s \le t$,

\begin{align}
E e^{i\lambda(W_t - W_s)} &= E e^{i\lambda \sum_n A_n (S_n(t) - S_n(s))} \tag{47}\\
&= \prod_{n=0}^\infty E e^{i\lambda A_n (S_n(t) - S_n(s))} \tag{48}\\
&= \prod_{n=0}^\infty e^{-\frac{\lambda^2}{2}(S_n(t) - S_n(s))^2} \tag{49}\\
&= e^{-\frac{\lambda^2}{2} \sum_n (S_n(t) - S_n(s))^2} \tag{50}\\
&= e^{-\frac{\lambda^2}{2}(t - 2s + s)} \tag{51}\\
&= e^{-\frac{\lambda^2}{2}(t - s)}, \tag{52}
\end{align}

using normality of the $A_n$'s. Hence, $W_t - W_s$ has a Normal$\langle 0, t - s\rangle$ distribution. We get the following process.

Process 11. Thus, derived from a white noise process, we get a process $(W_t)_{t \ge 0}$ with the following properties:

1. $W_0 = 0$.
2. For $0 \le s < t$, $W_t - W_s$ has a Normal$\langle 0, t - s\rangle$ distribution.
3. For $0 \le s_1 < t_1 < s_2 < t_2 < \cdots$, the random variables $W_{t_i} - W_{s_i}$ are independent.

We call this process a Wiener process or, equivalently, a Brownian Motion.
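Properties 1-3 also give the standard way to simulate a Wiener path directly, without any basis: accumulate independent Normal$\langle 0, dt\rangle$ increments. A sketch (assuming numpy):

```python
# Sketch of Process 11: a Wiener path from independent Gaussian increments.
import numpy as np

rng = np.random.default_rng(5)
T, n_steps = 1.0, 1000
dt = T / n_steps
dW = rng.normal(0.0, np.sqrt(dt), size=n_steps)    # W_{t+dt} - W_t
W = np.concatenate([[0.0], np.cumsum(dW)])         # W_0 = 0, then partial sums
t = np.linspace(0, T, n_steps + 1)

# Sanity checks across many paths: E W_T = 0 and Var W_T = T.
paths = np.cumsum(rng.normal(0, np.sqrt(dt), size=(20_000, n_steps)), axis=1)
print(paths[:, -1].mean(), paths[:, -1].var())     # ~ 0 and ~ T
```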

Definition 12. A stochastic process $X = (X_t)$ is called strongly stationary if for any $n \ge 1$, any $t_1, \ldots, t_n$, and any $h$, the two vectors

$$(X_{t_1}, \ldots, X_{t_n}) \quad \text{and} \quad (X_{t_1+h}, \ldots, X_{t_n+h}) \tag{53}$$

have the same distribution.

Definition 13. A stochastic process $X = (X_t)$ is called weakly stationary (aka second-order stationary) if for any $t_1$, $t_2$, and $h$,

\begin{align}
E X_{t_1} &= E X_{t_2} \tag{54}\\
\mathrm{Cov}(X_{t_1}, X_{t_2}) &= \mathrm{Cov}(X_{t_1+h}, X_{t_2+h}). \tag{55}
\end{align}

Examples 14.
1. Is the white noise process stationary? In which sense? (See the sketch after this list.)
2. What about the Poisson process? The Wiener process?
3. ARMA processes.
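For item 1, a sketch (assuming numpy) suggesting the answer for discrete white noise: the mean and the lag-$h$ covariances do not move when the time origin shifts, which is weak stationarity (and since the variables are iid, strong stationarity holds too; the simulation only probes the weak version).

```python
# Sketch: discrete white noise looks weakly stationary.
import numpy as np

rng = np.random.default_rng(6)
X = rng.standard_normal(size=(100_000, 50))   # many iid white noise paths
h = 3                                          # a fixed lag

for t0 in (5, 20, 40):                         # shift the time origin
    print(X[:, t0].mean(),                     # mean ~ 0 at every t0
          np.mean(X[:, t0] * X[:, t0 + h]))    # lag-3 covariance ~ 0 at every t0
```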

Definition 15. If $X$ is a (weakly) stationary process, then three important functions characterize the process:

1. The autocovariance function $R$ is defined by

$$R(t) = \mathrm{Cov}(X_0, X_t). \tag{56}$$

2. The autocorrelation function $\rho$ is defined by

$$\rho(t) = \mathrm{Cor}(X_0, X_t) = \frac{R(t)}{R(0)}. \tag{57}$$

3. The spectral distribution function $F$ is defined when $\rho$ has the representation

$$\rho(t) = \int e^{it\lambda}\, dF(\lambda) \tag{58}$$

for some distribution function $F$. The spectral density is the function $f(\lambda) = F'(\lambda)$.

Example 16. White noise: what’s in a name?
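A sketch of the point behind the name (assuming numpy): white noise has autocorrelation concentrated at lag 0, so its spectral density is flat, with all frequencies contributing equally, like white light.

```python
# Sketch: white noise has ~zero autocorrelation and a flat spectrum.
import numpy as np

rng = np.random.default_rng(7)
X = rng.standard_normal(2**16)

# Empirical autocorrelation rho(h) = R(h)/R(0) at a few lags:
print([round(np.mean(X[:-h] * X[h:]), 3) for h in (1, 2, 3)])    # ~ 0

# Averaged periodogram: a crude estimate of f(lambda), roughly constant.
segs = X.reshape(64, 1024)
periodogram = np.mean(np.abs(np.fft.rfft(segs, axis=1))**2, axis=0) / 1024
print(periodogram[1:6].round(2))                                 # ~ 1 across freqs
```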
