
Discrete time Markov chains

Random walks and branching processes

Bo Friis Nielsen
DTU Informatics
02407 Stochastic Processes 2, September 7 2021

Today:
- Random walks
- First step analysis revisited
- Branching processes
- Generating functions

Next week:
- Classification of states
- Classification of chains
- Discrete time Markov chains - invariant distributions

Two weeks from now:
- The Poisson process

Bo Friis Nielsen, Random walks and branching processes

Simple random walk with two absorbing barriers 0 and N

P = \begin{pmatrix}
1 & 0 & 0 & \cdots & 0 & 0 & 0 \\
q & 0 & p & \cdots & 0 & 0 & 0 \\
0 & q & 0 & \cdots & 0 & 0 & 0 \\
\vdots & & & \ddots & & & \vdots \\
0 & 0 & 0 & \cdots & q & 0 & p \\
0 & 0 & 0 & \cdots & 0 & 0 & 1
\end{pmatrix}

T = \min\{n \ge 0 : X_n \in \{0, N\}\}, \qquad u_k = P\{X_T = 0 \mid X_0 = k\}

Solution technique for the u_k

u_k = p u_{k+1} + q u_{k-1}, \quad k = 1, 2, \ldots, N-1, \qquad u_0 = 1, \quad u_N = 0

Rewriting the first equation using p + q = 1 we get

(p + q) u_k = p u_{k+1} + q u_{k-1}
\Leftrightarrow 0 = p(u_{k+1} - u_k) - q(u_k - u_{k-1})
\Leftrightarrow x_k = (q/p) x_{k-1}

with x_k = u_k - u_{k-1}, such that

x_k = (q/p)^{k-1} x_1

Recovering u_k

x_1 = u_1 - u_0 = u_1 - 1
x_2 = u_2 - u_1
\vdots
x_k = u_k - u_{k-1}

such that

u_1 = x_1 + 1
u_2 = x_2 + x_1 + 1
\vdots
u_k = x_k + x_{k-1} + \cdots + 1 = 1 + x_1 \sum_{i=0}^{k-1} (q/p)^i

Values of the absorption probabilities u_k

From u_N = 0 we get

0 = 1 + x_1 \sum_{i=0}^{N-1} (q/p)^i \Leftrightarrow x_1 = -\frac{1}{\sum_{i=0}^{N-1} (q/p)^i}

Leading to

u_k = \begin{cases} 1 - k/N = (N-k)/N & \text{when } p = q = \frac{1}{2} \\ \dfrac{(q/p)^k - (q/p)^N}{1 - (q/p)^N} & \text{when } p \ne q \end{cases}
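As a sanity check, the closed-form u_k can be compared with a direct solution of the boundary value problem u_k = p u_{k+1} + q u_{k-1}, u_0 = 1, u_N = 0. A minimal sketch in exact rational arithmetic (the helper names are ours, not from the course material):

```python
from fractions import Fraction

def u_closed_form(k, N, p):
    # Absorption probability at 0 starting from k, per the formula above
    q = 1 - p
    if p == q:
        return Fraction(N - k, N)
    r = q / p
    return (r**k - r**N) / (1 - r**N)

def u_recursion(N, p):
    # Solve u_k = p*u_{k+1} + q*u_{k-1}, u_0 = 1, u_N = 0 directly:
    # run the forward recursion for two trial values of u_1 and use
    # that u_N is affine in u_1 to hit the boundary condition u_N = 0.
    q = 1 - p
    def run(u1):
        u = [Fraction(1), Fraction(u1)]
        for k in range(1, N):
            u.append((u[k] - q * u[k - 1]) / p)
        return u
    a, b = run(0), run(1)
    t = -a[N] / (b[N] - a[N])          # the u_1 that makes u_N = 0
    return [ai + (bi - ai) * t for ai, bi in zip(a, b)]
```

Both routes agree for every starting state k, for fair and unfair walks alike.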


Direct calculation as opposed to first step analysis

P = \begin{pmatrix} Q & R \\ 0 & I \end{pmatrix}, \qquad
P^2 = \begin{pmatrix} Q & R \\ 0 & I \end{pmatrix} \begin{pmatrix} Q & R \\ 0 & I \end{pmatrix} = \begin{pmatrix} Q^2 & QR + R \\ 0 & I \end{pmatrix}

P^n = \begin{pmatrix} Q^n & Q^{n-1}R + Q^{n-2}R + \cdots + QR + R \\ 0 & I \end{pmatrix}

Expected number of visits to states

W_{ij}^{(n)} = E\left[\left.\sum_{\ell=0}^n \mathbf{1}(X_\ell = j) \right| X_0 = i\right], \quad \text{where } \mathbf{1}(X_\ell = j) = \begin{cases} 1 & \text{if } X_\ell = j \\ 0 & \text{if } X_\ell \ne j \end{cases}

W_{ij}^{(n)} = Q_{ij}^{(0)} + Q_{ij}^{(1)} + \cdots + Q_{ij}^{(n)}

In matrix notation we get

W^{(n)} = I + Q + Q^2 + \cdots + Q^n = I + Q\left(I + Q + \cdots + Q^{n-1}\right) = I + Q W^{(n-1)}

Elementwise we get the "first step analysis" equations

W_{ij}^{(n)} = \delta_{ij} + \sum_{k=0}^{r-1} P_{ik} W_{kj}^{(n-1)}

Limiting equations as n → ∞

W = I + Q + Q^2 + \cdots = \sum_{i=0}^{\infty} Q^i

Thus W = I + QW. From the latter we get (I - Q)W = I. When all states related to Q are transient (as we have assumed), we have

W = \sum_{i=0}^{\infty} Q^i = (I - Q)^{-1}

Absorption time

With T = \min\{n \ge 0 : r \le X_n \le N\} we have that

\sum_{n=0}^{T-1} \sum_{j=0}^{r-1} \mathbf{1}(X_n = j) = \sum_{n=0}^{T-1} 1 = T

Thus

E(T \mid X_0 = i) = E\left[\left.\sum_{j=0}^{r-1} \sum_{n=0}^{T-1} \mathbf{1}(X_n = j)\right| X_0 = i\right] = \sum_{j=0}^{r-1} E\left[\left.\sum_{n=0}^{T-1} \mathbf{1}(X_n = j)\right| X_0 = i\right] = \sum_{j=0}^{r-1} W_{ij}

where W_{ij} = E\left[\left.\sum_{n=0}^{T-1} \mathbf{1}(X_n = j)\right| X_0 = i\right]. In matrix formulation v = W \mathbf{1}, where v_i = E(T \mid X_0 = i) as last week, and \mathbf{1} is a column vector of ones.
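For the gambler's ruin chain this is easy to check numerically. A sketch with numpy, taking states 1, ..., N-1 as the transient block; for the fair walk the known answer is E(T | X_0 = k) = k(N - k):

```python
import numpy as np

N, p = 5, 0.5
q = 1 - p
n = N - 1                          # number of transient states 1..N-1
Q = np.zeros((n, n))               # transient-to-transient block of P
for i in range(n):
    if i > 0:
        Q[i, i - 1] = q            # step down
    if i < n - 1:
        Q[i, i + 1] = p            # step up

W = np.linalg.inv(np.eye(n) - Q)   # fundamental matrix W = (I - Q)^(-1)
v = W @ np.ones(n)                 # v_i = E(T | X_0 = i), mean time to absorption
```

For N = 5 and p = 1/2 this gives v = (4, 6, 6, 4), matching k(N - k).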

Absorption probabilities

U_{ij}^{(n)} = P\{T \le n, X_T = j \mid X_0 = i\}

U^{(1)} = R = IR
U^{(2)} = IR + QR
U^{(n)} = \left(I + Q + \cdots + Q^{n-1}\right) R = W^{(n-1)} R

Leading to

U = WR

Conditional expectation, discrete case (2.1)

P\{Y = y \mid X = x\} = \frac{P\{X = x, Y = y\}}{P\{X = x\}}, \qquad E[Y \mid X = x] = \sum_y y\, P\{Y = y \mid X = x\}

h(x) = E[Y \mid X = x] is a function of x, thus h(X) is a random variable, which we call E[Y \mid X]. Now

E[h(X)] = \sum_x P\{X = x\} h(x) = \sum_x P\{X = x\} \sum_y y\, P\{Y = y \mid X = x\}
= \sum_x \sum_y y\, P\{X = x\} \frac{P\{X = x, Y = y\}}{P\{X = x\}} = \sum_x \sum_y y\, P\{X = x, Y = y\} = E[Y]

so E[Y] = E\{E[Y \mid X]\} (and more generally E[g(Y)] = E\{E[g(Y) \mid X]\}).
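Continuing the numerical gambler's ruin sketch, U = WR should reproduce the absorption probabilities (N - k)/N derived for p = 1/2 earlier in the lecture:

```python
import numpy as np

N, p = 5, 0.5
q = 1 - p
n = N - 1                          # transient states 1..N-1
Q = np.zeros((n, n))
R = np.zeros((n, 2))               # columns: absorbing state 0, absorbing state N
for i in range(n):
    if i > 0:
        Q[i, i - 1] = q
    else:
        R[i, 0] = q                # from state 1 down into 0
    if i < n - 1:
        Q[i, i + 1] = p
    else:
        R[i, 1] = p                # from state N-1 up into N

W = np.linalg.inv(np.eye(n) - Q)
U = W @ R                          # U[i, j] = P{X_T = absorbing state j | X_0 = i+1}
```

Each row of U sums to 1 (absorption is certain), and the first column is (0.8, 0.6, 0.4, 0.2) for N = 5.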

Conditional variance, discrete case

Var[Y] = E[Y^2] - E[Y]^2 = E\{E[Y^2 \mid X]\} - E\{E[Y \mid X]\}^2
= E\{Var[Y \mid X] + E[Y \mid X]^2\} - E\{E[Y \mid X]\}^2
= E\{Var[Y \mid X]\} + E\{E[Y \mid X]^2\} - E\{E[Y \mid X]\}^2
= E\{Var[Y \mid X]\} + Var\{E[Y \mid X]\}

Random sum (2.3)

X = \xi_1 + \cdots + \xi_N = \sum_{i=1}^{N} \xi_i

where N is a random variable taking values among the non-negative integers, with

E(N) = \nu, \quad Var(N) = \tau^2, \quad E(\xi_i) = \mu, \quad Var(\xi_i) = \sigma^2

Then

E(X) = E(E(X \mid N)) = E(N\mu) = \nu\mu
Var(X) = E(Var(X \mid N)) + Var(E(X \mid N)) = E(N\sigma^2) + Var(N\mu) = \nu\sigma^2 + \tau^2\mu^2
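The random-sum moment formulas can be verified exactly on a small example by computing the full distribution of X by conditioning on N. The pmfs below are arbitrary toy choices, not from the slides:

```python
def convolve(a, b):
    # pmf of the sum of two independent variables, pmfs as {value: prob} dicts
    out = {}
    for x, px in a.items():
        for y, py in b.items():
            out[x + y] = out.get(x + y, 0.0) + px * py
    return out

def mean(pmf):
    return sum(x * px for x, px in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum((x - m) ** 2 * px for x, px in pmf.items())

N_pmf  = {0: 0.2, 1: 0.5, 2: 0.3}    # toy distribution of N
xi_pmf = {1: 0.4, 2: 0.6}            # toy distribution of the xi's

# Exact pmf of X = xi_1 + ... + xi_N by conditioning on N
x_pmf, s_pmf = {}, {0: 1.0}          # s_pmf is the pmf of xi_1 + ... + xi_n
for n in range(max(N_pmf) + 1):
    for x, px in s_pmf.items():
        x_pmf[x] = x_pmf.get(x, 0.0) + N_pmf.get(n, 0.0) * px
    s_pmf = convolve(s_pmf, xi_pmf)

nu, tau2 = mean(N_pmf), var(N_pmf)
mu, sig2 = mean(xi_pmf), var(xi_pmf)
```

Here mean(x_pmf) equals nu*mu and var(x_pmf) equals nu*sig2 + tau2*mu**2, up to floating-point rounding.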


Branching processes

X_{n+1} = \xi_1 + \xi_2 + \cdots + \xi_{X_n}

where the \xi_i are independent random variables with common probability mass function

P\{\xi_i = k\} = p_k

From a random sum interpretation we get

E(X_{n+1}) = \mu E(X_n) = \mu^{n+1}
Var(X_{n+1}) = \sigma^2 E(X_n) + \mu^2 Var(X_n) = \sigma^2 \mu^n + \mu^2 Var(X_n)
= \sigma^2 \mu^n + \mu^2 \left(\sigma^2 \mu^{n-1} + \mu^2 Var(X_{n-1})\right)

Extinction probabilities

Define N to be the random time of extinction (N can be defective, i.e. P\{N = \infty\} > 0).

u_n = P\{N \le n\} = P\{X_n = 0\}

And we get

u_n = \sum_{k=0}^{\infty} p_k u_{n-1}^k
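The recursion u_n = sum_k p_k u_{n-1}^k is just u_n = phi(u_{n-1}) with phi the offspring generating function, and iterating it from u_0 = 0 converges to the extinction probability, the smallest root of s = phi(s). A sketch for the toy offspring pmf p_0 = 1/4, p_1 = 1/4, p_2 = 1/2 (our example, not from the slides), where s = 1/4 + s/4 + s^2/2 has smallest root s = 1/2:

```python
p = [0.25, 0.25, 0.5]          # toy offspring pmf, mean mu = 1.25 > 1

def phi(s):
    # offspring probability generating function
    return sum(pk * s**k for k, pk in enumerate(p))

u = 0.0                        # u_0 = 0: the process starts with one individual
for _ in range(200):
    u = phi(u)                 # u_n = phi(u_{n-1}) = sum_k p_k * u_{n-1}^k
```

Since mu > 1 the extinction probability is strictly less than 1; the iteration settles at 1/2.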

Generating functions

The generating function - an important analytic tool:

\phi(s) = E\left[s^{\xi}\right] = \sum_{k=0}^{\infty} p_k s^k, \qquad p_k = \frac{1}{k!} \left. \frac{d^k \phi(s)}{ds^k} \right|_{s=0}

- Manipulations with probability distributions
- Determining the distribution of a sum of random variables
- Determining the distribution of a random sum of random variables
- Calculation of moments
- Unique characterisation of the distribution
- Same information as the CDF

Moments from generating functions

\left. \frac{d\phi(s)}{ds} \right|_{s=1} = \left. \sum_{k=1}^{\infty} p_k k s^{k-1} \right|_{s=1} = E(\xi)

Similarly

\left. \frac{d^2\phi(s)}{ds^2} \right|_{s=1} = \left. \sum_{k=2}^{\infty} p_k k(k-1) s^{k-2} \right|_{s=1} = E(\xi(\xi - 1)),

a factorial moment. From these,

Var(\xi) = \phi''(1) + \phi'(1) - (\phi'(1))^2


The sum of iid random variables

Remember: iid = Independent, Identically Distributed.

S_n = X_1 + X_2 + \cdots + X_n = \sum_{i=1}^{n} X_i

With p_x = P\{X_i = x\}, X_i \ge 0, we find for n = 2, S_2 = X_1 + X_2. The event \{S_2 = x\} can be decomposed into the set

\{(X_1 = 0, X_2 = x), (X_1 = 1, X_2 = x-1), \ldots, (X_1 = i, X_2 = x-i), \ldots, (X_1 = x, X_2 = 0)\}

The probability of the event \{S_2 = x\} is the sum of the probabilities of the individual outcomes.

Sum of iid random variables - continued

The probability of the outcome (X_1 = i, X_2 = x-i) is P\{X_1 = i, X_2 = x-i\} = P\{X_1 = i\} P\{X_2 = x-i\} by independence, which again is p_i p_{x-i}. In total we get

P\{S_2 = x\} = \sum_{i=0}^{x} p_i p_{x-i}
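The decomposition above is exactly a discrete convolution. A small sketch with two dice (a standard illustration, not from the slides):

```python
def convolve(p, q):
    # pmf of X1 + X2 for independent non-negative integer variables,
    # with pmfs given as lists indexed by value
    out = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            out[i + j] += pi * qj
    return out

die = [0.0] + [1 / 6] * 6      # uniform on {1, ..., 6}
two_dice = convolve(die, die)  # pmf of S_2 = X_1 + X_2
```

As expected, P{S_2 = 7} = 6/36 and P{S_2 = 2} = 1/36.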

Generating function - one example

Binomial distribution: p_k = \binom{n}{k} p^k (1-p)^{n-k}

\phi_{bin}(s) = \sum_{k=0}^{n} s^k p_k = \sum_{k=0}^{n} s^k \binom{n}{k} p^k (1-p)^{n-k} = \sum_{k=0}^{n} \binom{n}{k} (sp)^k (1-p)^{n-k} = (1 - p + ps)^n

Generating function - another example

Poisson distribution: p_k = \frac{\lambda^k}{k!} e^{-\lambda}

\phi_{poi}(s) = \sum_{k=0}^{\infty} s^k p_k = \sum_{k=0}^{\infty} s^k \frac{\lambda^k}{k!} e^{-\lambda} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(s\lambda)^k}{k!} = e^{-\lambda} e^{s\lambda} = e^{-\lambda(1-s)}


And now to the reason for all this ...

The convolution (sum of random variables) can be tough to deal with.

Theorem
If X and Y are independent then

\phi_{X+Y}(s) = \phi_X(s) \phi_Y(s)

where \phi_X(s) and \phi_Y(s) are the generating functions of X and Y.

A probabilistic proof (which I think is instructive):

\phi_{X+Y}(s) = E\left[s^{X+Y}\right] = E\left[s^X s^Y\right] = E\left[s^X\right] E\left[s^Y\right] = \phi_X(s) \phi_Y(s)

Sum of two Poisson distributed random variables

X \sim P(\lambda), \quad Y \sim P(\mu), \quad Z = X + Y, \qquad P\{X = x\} = p_x = e^{-\lambda} \frac{\lambda^x}{x!}

\phi_X(s) = e^{-\lambda(1-s)}, \qquad \phi_Y(s) = e^{-\mu(1-s)}

And we get

\phi_Z(s) = \phi_X(s) \phi_Y(s) = e^{-\lambda(1-s)} e^{-\mu(1-s)} = e^{-(\lambda+\mu)(1-s)}

such that Z \sim P(\lambda + \mu).
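The closure of the Poisson family under convolution can also be checked directly on the pmfs: convolving P(lambda) and P(mu) term by term reproduces the pmf of P(lambda + mu). A sketch (parameter values arbitrary):

```python
import math

def poisson_pmf(lam, K):
    # pmf of P(lam) on {0, ..., K}
    return [math.exp(-lam) * lam**k / math.factorial(k) for k in range(K + 1)]

lam, mu, K = 2.0, 3.0, 40
px, py = poisson_pmf(lam, K), poisson_pmf(mu, K)

# p_Z(k) = sum_i p_X(i) * p_Y(k - i); this sum is exact for every k <= K
pz = [sum(px[i] * py[k - i] for i in range(k + 1)) for k in range(K + 1)]
pz_target = poisson_pmf(lam + mu, K)
```

The two lists agree to floating-point precision, matching the generating-function argument.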

The Poisson example

X \sim P(\lambda), \qquad \phi_X(s) = e^{-\lambda(1-s)}, \qquad P\{X = x\} = p_x = e^{-\lambda} \frac{\lambda^x}{x!}

\phi'(s) = -(-\lambda) e^{-\lambda(1-s)} = \lambda e^{-\lambda(1-s)}

And we find

E(X) = \phi'(1) = \lambda e^0 = \lambda

\phi''(s) = \lambda^2 e^{-\lambda(1-s)}

V(X) = \phi''(1) + \phi'(1) - (\phi'(1))^2 = \lambda^2 + \lambda - \lambda^2 = \lambda

Sum of two binomial random variables with the same p

X \sim B(n, p), \quad Y \sim B(m, p), \quad Z = X + Y

\phi_X(s) = (1 - p + ps)^n, \qquad \phi_Y(s) = (1 - p + ps)^m

And we get

\phi_Z(s) = \phi_X(s) \phi_Y(s) = (1 - p + ps)^n (1 - p + ps)^m = (1 - p + ps)^{n+m}

such that Z \sim B(n + m, p).
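Numerically, phi'(1) = sum_k k p_k and phi''(1) = sum_k k(k-1) p_k, so the Poisson moment computation above can be mirrored on a truncated pmf (truncation point chosen so the tail is negligible):

```python
import math

lam, K = 4.0, 80   # truncation K large enough that the Poisson tail is negligible
p = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(K + 1)]

phi1 = sum(k * pk for k, pk in enumerate(p))             # phi'(1)  = E(X)
phi2 = sum(k * (k - 1) * pk for k, pk in enumerate(p))   # phi''(1) = E(X(X-1))
variance = phi2 + phi1 - phi1**2                         # Var via factorial moments
```

Both the mean and the variance come out equal to lambda, as derived from the generating function.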


Generating function - the geometric distribution

p_x = (1-p)^{x-1} p, \quad x = 1, 2, \ldots

\phi_{geo}(s) = \sum_{x=1}^{\infty} s^x p_x = \sum_{x=1}^{\infty} s^x (1-p)^{x-1} p = \sum_{x=1}^{\infty} s \left(s(1-p)\right)^{x-1} p

A useful power series is:

\sum_{i=0}^{N} a^i = \begin{cases} \frac{1 - a^{N+1}}{1 - a} & N < \infty, a \ne 1 \\ N + 1 & N < \infty, a = 1 \\ \frac{1}{1 - a} & N = \infty, |a| < 1 \end{cases}

And we get

\phi_{geo}(s) = \frac{sp}{1 - s(1-p)}

Generating function for random sum

h_X(s) = g_N(\phi(s))

Applied to the branching process we get

\phi_n(s) = \phi_{n-1}(\phi(s))
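The random-sum formula h_X(s) = g_N(phi(s)) can be illustrated with Poisson thinning: if N ~ P(nu) and the xi_i are Bernoulli(p), the random sum X should be P(nu*p). A sketch (the parameter values and the thinning example are our choice, not from the slides):

```python
import math

nu, p = 3.0, 0.25                  # N ~ P(nu), xi_i ~ Bernoulli(p)

def g_N(s):
    return math.exp(-nu * (1 - s))       # pgf of N ~ P(nu)

def phi(s):
    return 1 - p + p * s                 # pgf of a Bernoulli(p) variable

def h_X(s):
    return g_N(phi(s))                   # pgf of the random sum X

def h_target(s):
    return math.exp(-nu * p * (1 - s))   # pgf of P(nu * p)
```

Algebraically g_N(phi(s)) = exp(-nu(1 - (1 - p + ps))) = exp(-nu p (1 - s)), so the two functions coincide for every s.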

Generating function for the sum of independent random variables

X with pmf p_x, Y with pmf q_y, Z = X + Y; what is r_z = P\{Z = z\}?

P\{Z = z\} = r_z = \sum_{i=0}^{z} p_i q_{z-i}

Theorem ((23), page 153)
If X and Y are independent then

\phi_{X+Y}(s) = \phi_X(s) \phi_Y(s)

where \phi_X(s) and \phi_Y(s) are the generating functions of X and Y.

Sum of two geometric random variables with the same p

X \sim geo(p), \quad Y \sim geo(p), \quad Z = X + Y, \qquad P\{X = x\} = p_x = (1-p)^{x-1} p

\phi_X(s) = \frac{sp}{1 - s(1-p)}, \qquad \phi_Y(s) = \frac{sp}{1 - s(1-p)}

And we get

\phi_Z(s) = \phi_X(s) \phi_Y(s) = \frac{sp}{1 - s(1-p)} \cdot \frac{sp}{1 - s(1-p)} = \left(\frac{sp}{1 - s(1-p)}\right)^2

The density of this distribution is

P\{Z = z\} = h(z) = (z-1)(1-p)^{z-2} p^2

a negative binomial distribution.
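The stated negative binomial density can be verified by convolving two (truncated) geometric pmfs directly:

```python
p, K = 0.3, 60
# geometric pmf on {1, 2, ...}, truncated at K; list index = value
geo = [0.0] + [(1 - p)**(x - 1) * p for x in range(1, K + 1)]

# pmf of Z = X + Y by convolution; the sum is exact for every z <= K
conv = [sum(geo[i] * geo[z - i] for i in range(z + 1)) for z in range(K + 1)]

def negbin(z):
    # claimed density: P{Z = z} = (z - 1)(1 - p)^(z - 2) p^2 for z >= 2
    return (z - 1) * (1 - p)**(z - 2) * p**2
```

The convolution and the closed form agree for all z in the truncated range.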


Sum of k geometric random variables with the same p

More generally, for the sum of k geometric variables:

p_x = \binom{x-1}{k-1} (1-p)^{x-k} p^k, \qquad \phi_X(s) = \left(\frac{sp}{1 - s(1-p)}\right)^k

Characteristic functions and other transforms

- Characteristic function: E\left[e^{itX}\right]
- Moment generating function: E\left[e^{\theta X}\right]
- Laplace-Stieltjes transform: E\left[e^{-sX}\right]

EXAMPLE (exponential):

E\left[e^{\theta X}\right] = \int_0^{\infty} e^{\theta x} \lambda e^{-\lambda x} \, dx = \frac{\lambda}{\lambda - \theta}, \quad \theta < \lambda
