Probability vector, Markov chains, stochastic matrix
Section 4.9 Applications to Markov Chains
A probability vector is a vector with nonnegative entries that add up to 1. A stochastic matrix is a square matrix whose columns are probability vectors. A Markov chain is a sequence of probability vectors x0, x1, x2, ..., together with a stochastic matrix P, such that
x1 = Px0, x2 = Px1, x3 = Px2, ···
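These definitions are easy to check numerically. A minimal sketch, assuming NumPy is available (`is_stochastic` is an illustrative helper name, not from the notes):

```python
import numpy as np

def is_stochastic(P, tol=1e-9):
    """Check that P is a stochastic matrix: every column is a probability vector."""
    P = np.asarray(P, dtype=float)
    nonneg = np.all(P >= -tol)                         # all entries nonnegative
    cols_sum_to_one = np.allclose(P.sum(axis=0), 1.0)  # each column sums to 1
    return bool(nonneg and cols_sum_to_one)

P = np.array([[.95, .03],
              [.05, .97]])
print(is_stochastic(P))  # True: both columns are probability vectors
```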
A Markov chain of vectors in R^n describes a system or a sequence of experiments; xk is called the state vector. An example is the crunch-and-munch breakfast problem.

Example: model for population movement
Consider population movement between the city and the suburbs of a metropolitan region, governed by the migration matrix M:
               From
        City   Suburbs      To:
        .95     .03         City
        .05     .97         Suburbs
Each column vector of M is a probability vector, so M is a stochastic matrix. Suppose that the population in 2006 is 600,000 in the city and 400,000 in the suburbs. The initial state (a probability vector) is

    x0 = [.60]
         [.40]

Find the population distribution in the years 2007 and 2008.

Solution. The population distribution in 2007 is given by x1 = Mx0, i.e.,
    [.95  .03] [.60]   [.582]
    [.05  .97] [.40] = [.418]
58.2% of the region will live in the city in 2007, and 41.8% in the suburbs. To find the population distribution in the year 2008, compute x2 = Mx1, i.e.,
    [.95  .03] [.582]   [.565]
    [.05  .97] [.418] = [.435]
56.5% of the region will live in the city in 2008, and 43.5% in the suburbs.

Steady state vectors
If P is a stochastic matrix, then a steady state vector (or equilibrium vector) for P is a probability vector v such that Pv = v.
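The city/suburb computation above can be reproduced and carried further: iterating xk+1 = Mxk many times drives the state toward a vector fixed by M. A sketch, assuming NumPy:

```python
import numpy as np

M = np.array([[.95, .03],
              [.05, .97]])
x = np.array([.60, .40])   # 2006 distribution

x1 = M @ x                 # 2007 distribution
x2 = M @ x1                # 2008 distribution
print(np.round(x1, 3))     # [0.582 0.418]
print(np.round(x2, 3))     # [0.565 0.435]

# Iterating many steps, xk approaches a steady state v with Mv = v.
for _ in range(1000):
    x = M @ x
print(np.round(x, 3))      # [0.375 0.625]
```

Note that the long-run vector is independent of the rounding of intermediate states; it is a fixed point of M.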
Rent-a-Lemon has three locations from which to rent a car for one day: the airport, downtown, and the valley.
Daily Migration:
                  Rented From
       Airport   Downtown   Valley      Returned To:
         .95       .02       .05        Airport
         .03       .90       .05        Downtown
         .02       .08       .90        Valley

        [.95  .02  .05]
    M = [.03  .90  .05]    (migration matrix)
        [.02  .08  .90]
         [.5]   (initial fraction of cars at airport)
    x0 = [.3]   (initial fraction of cars downtown)
         [.2]   (initial fraction of cars at the valley location)

    (initial distribution vector, which is a probability vector)
Interpretation of Mx0
          [.95  .02  .05] [.5]
    Mx0 = [.03  .90  .05] [.3]
          [.02  .08  .90] [.2]

             [.95]      [.02]      [.05]
        = .5 [.03] + .3 [.90] + .2 [.08]
             [.02]      [.08]      [.90]
              ↑           ↑           ↑
        redistribution  redistribution  redistribution
        of airport      of downtown     of valley
        cars            cars            cars
Distribution after one day:

         [.95  .02  .05] [.5]   [0.491]
    x1 = [.03  .90  .05] [.3] = [0.295]
         [.02  .08  .90] [.2]   [0.214]
xk+1 = Mxk for k = 0,1,2,… (Markov Chain)
Distribution after two days:

         [.95  .02  .05] [0.491]   [0.483]
    x2 = [.03  .90  .05] [0.295] = [0.290]
         [.02  .08  .90] [0.214]   [0.226]

         [.95  .02  .05] [0.483]   [0.475]
    x3 = [.03  .90  .05] [0.290] = [0.287]
         [.02  .08  .90] [0.226]   [0.236]

         [.95  .02  .05] [0.475]   [0.468]
    x4 = [.03  .90  .05] [0.287] = [0.284]
         [.02  .08  .90] [0.236]   [0.244]

    ⋮

                 [0.417]
    x49 = Mx48 = [0.278]
                 [0.305]
                 [0.417]
    x50 = Mx49 = [0.278]    (long-term distribution)
                 [0.305]

    ⋮

        [0.417]
    x = [0.278]  is called a steady state vector, since x = Mx.
        [0.305]
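The day-by-day iteration for Rent-a-Lemon can be sketched in a few lines (assuming NumPy); after 50 applications of M the state is already very close to the steady state:

```python
import numpy as np

M = np.array([[.95, .02, .05],
              [.03, .90, .05],
              [.02, .08, .90]])
x = np.array([.5, .3, .2])   # initial distribution of cars

for _ in range(50):          # x50 after 50 applications of M
    x = M @ x
print(np.round(x, 3))        # close to the long-term distribution above
print(np.allclose(M @ x, x, atol=1e-2))  # approximately a fixed point of M
```

The exact steady state works out to (15/36, 10/36, 11/36) ≈ (0.417, 0.278, 0.306), matching the tabulated values up to rounding.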
Finding the Steady State Vector

    Mx = x
    Mx = Ix
    Mx - Ix = 0
    (M - I)x = 0

Solve (M - I)x = 0 to find the steady state vector. Note that the solution x must be a probability vector.
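Instead of row reducing (M - I)x = 0 by hand, the steady state can be found numerically as an eigenvector of M for eigenvalue 1, rescaled so its entries add up to 1. A sketch assuming NumPy (`steady_state` is an illustrative helper name):

```python
import numpy as np

def steady_state(P):
    """Steady state of a stochastic matrix P: the eigenvector for
    eigenvalue 1, rescaled into a probability vector."""
    vals, vecs = np.linalg.eig(P)
    i = np.argmin(np.abs(vals - 1.0))  # column whose eigenvalue is (closest to) 1
    v = np.real(vecs[:, i])
    return v / v.sum()                 # scale so entries sum to 1

M = np.array([[.95, .03],
              [.05, .97]])
print(np.round(steady_state(M), 3))    # [0.375 0.625]
```

Dividing by `v.sum()` also fixes the sign if the solver returns a negative eigenvector.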
EXAMPLE: Suppose that 3% of the population of the U.S. lives in the State of Washington. Suppose the migration of the population into and out of Washington State will be constant for many years according to the following migration probabilities. What percentage of the total U.S. population will eventually live in Washington?
    From:   WA    Rest of U.S.      To:
            .9       .01            WA
            .1       .99            Rest of U.S.
Solution.

    M = [.9  .01]        x = [% of people in WA    ]
        [.1  .99]            [% in the rest of U.S.]

    M - I = [.9  .01] - [1  0] = [-0.1   0.01]
            [.1  .99]   [0  1]   [ 0.1  -0.01]
Solve (M - I)x = 0:

    [-0.1   0.01  0]   [1  -0.1  0]
    [ 0.1  -0.01  0] ~ [0    0   0]

    x = [x1]   [0.1 x2]      [0.1]
        [x2] = [  x2  ] = x2 [ 1 ]
One solution:

    x = [ 1]
        [10]

The solution we want has entries that add up to one:

    x = [ 1/11]   [0.091]
        [10/11] ≈ [0.909]
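The hand computation can be verified directly: the vector (1/11, 10/11) is fixed by M exactly. A quick check, assuming NumPy:

```python
import numpy as np

M = np.array([[.9, .01],
              [.1, .99]])

# Steady state found by row reduction: x = [1/11, 10/11]
x = np.array([1/11, 10/11])
print(np.allclose(M @ x, x))   # True: Mx = x
print(np.round(x, 3))          # [0.091 0.909]
```

So about 9.1% of the U.S. population will eventually live in Washington, regardless of the 3% starting figure.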
The convergence of the state vectors to the steady state (a fixed point of the stochastic matrix) occurs very frequently. The feature required for this convergence is called regularity of P. A stochastic matrix P is regular if, for some integer k, P^k > 0 (all entries are strictly positive).

Example. Let

        [.5  .2  .3]
    P = [.3  .8  .3]
        [.2   0  .4]

Then P^2 is

    [.5  .2  .3] [.5  .2  .3]   [.37  .26  .33]
    [.3  .8  .3] [.3  .8  .3] = [.45  .70  .45]
    [.2   0  .4] [.2   0  .4]   [.18  .04  .22]

Hence P is regular.

Theorem. If P is an n × n regular stochastic matrix, then P has a unique steady state vector v. Further, if x0 is any initial state and xk+1 = Pxk for k = 0, 1, 2, ..., then the Markov chain {xk} converges to v.
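Regularity can be tested by computing successive powers of P until all entries are strictly positive (up to some cutoff power). A sketch assuming NumPy (`is_regular` and the cutoff `max_power` are illustrative choices, not from the text):

```python
import numpy as np

def is_regular(P, max_power=50):
    """Return True if some power P^k (1 <= k <= max_power) has all
    strictly positive entries."""
    P = np.asarray(P, dtype=float)
    Q = P.copy()
    for _ in range(max_power):
        if np.all(Q > 0):
            return True
        Q = Q @ P          # move on to the next power of P
    return False

P = np.array([[.5, .2, .3],
              [.3, .8, .3],
              [.2, 0., .4]])
print(is_regular(P))       # True: P^2 already has all positive entries
```

The identity matrix is stochastic but not regular (I^k = I always has zeros off the diagonal), which is why its chains need not converge to a unique steady state.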
Remark. The initial state does not affect the long-term behavior of the Markov chain.