Lecture-19: Reversibility

1 Introduction

Definition 1.1. A stochastic process $X : \Omega \to \mathcal{X}^{\mathbb{R}}$ is reversible if the vector $(X_{t_1}, \dots, X_{t_n})$ has the same distribution as $(X_{t-t_1}, \dots, X_{t-t_n})$ for all finite positive integers $n$, time instants $t_1 < t_2 < \cdots < t_n$, and shifts $t \in \mathbb{R}$.

Lemma 1.2. A reversible process is stationary.

Proof. Since $X$ is reversible, both $(X_{t_1}, \dots, X_{t_n})$ and $(X_{s+t_1}, \dots, X_{s+t_n})$ have the same distribution as $(X_{-t_1}, \dots, X_{-t_n})$ for each $n \in \mathbb{N}$ and $t_1 < \cdots < t_n$: apply the definition of reversibility with shift $t = 0$ to the instants $t_1 < \cdots < t_n$, and with shift $t = s$ to the instants $s+t_1 < \cdots < s+t_n$, respectively.

Definition 1.3. The space of distributions over the state space $\mathcal{X}$ is denoted by
$$\mathcal{P}(\mathcal{X}) \triangleq \Big\{ \alpha \in [0,1]^{\mathcal{X}} : \mathbf{1}^{\top}\alpha = \sum_{x \in \mathcal{X}} \alpha_x = 1 \Big\}.$$

Theorem 1.4. A stationary homogeneous Markov process $X : \Omega \to \mathcal{X}^{\mathbb{R}}$ with countable state space $\mathcal{X} \subseteq \mathbb{R}$ and probability transition kernel $P : \mathbb{R}_+ \to [0,1]^{\mathcal{X} \times \mathcal{X}}$ is reversible iff there exists a probability distribution $\pi \in \mathcal{P}(\mathcal{X})$ that satisfies the detailed balance conditions
$$\pi_x P_{xy}(t) = \pi_y P_{yx}(t) \quad \text{for all } x, y \in \mathcal{X} \text{ and times } t \in \mathbb{R}_+. \qquad (1)$$
When such a distribution $\pi$ exists, it is the equilibrium distribution of the process.

Proof. We assume that the process $X$ is reversible, and hence stationary. We denote the stationary distribution by $\pi$, and by reversibility of $X$ with shift $t = t_2 + t_1$, we have
$$P_{\pi}\{X_{t_1} = x, X_{t_2} = y\} = P_{\pi}\{X_{t_2} = x, X_{t_1} = y\}.$$
The left-hand side equals $\pi_x P_{xy}(t_2 - t_1)$ and the right-hand side equals $\pi_y P_{yx}(t_2 - t_1)$; hence, we obtain the detailed balance conditions in Eq. (1). Conversely, let $\pi$ be a distribution that satisfies the detailed balance conditions in Eq. (1). Summing both sides over $y \in \mathcal{X}$ gives $\pi_x = \sum_{y \in \mathcal{X}} \pi_y P_{yx}(t)$ for all $t$, so $\pi$ is the equilibrium distribution. Let $(x_1, \dots, x_m) \in \mathcal{X}^m$; applying the detailed balance equations in Eq. (1) repeatedly, we can write
$$\pi_{x_1} P_{x_1 x_2}(t_2 - t_1) \cdots P_{x_{m-1} x_m}(t_m - t_{m-1}) = \pi_{x_m} P_{x_m x_{m-1}}(t_m - t_{m-1}) \cdots P_{x_2 x_1}(t_2 - t_1).$$
For the homogeneous stationary Markov process $X$, it follows that for all $t_0 \in \mathbb{R}_+$
$$P_{\pi}\{X_{t_1} = x_1, \dots, X_{t_m} = x_m\} = P_{\pi}\{X_{t_0} = x_m, \dots, X_{t_0 + t_m - t_1} = x_1\}.$$
Since $m \in \mathbb{N}$ and $t_0, t_1, \dots, t_m$ were arbitrary, the reversibility follows.

Corollary 1.5. A stationary discrete time Markov chain $X : \Omega \to \mathcal{X}^{\mathbb{Z}}$ with transition matrix $P \in [0,1]^{\mathcal{X} \times \mathcal{X}}$ is reversible iff there exists a probability distribution $\pi \in \mathcal{P}(\mathcal{X})$ that satisfies the detailed balance conditions
$$\pi_x P_{xy} = \pi_y P_{yx}, \quad x, y \in \mathcal{X}. \qquad (2)$$
When such a distribution $\pi$ exists, it is the equilibrium distribution of the process.

Corollary 1.6. A stationary Markov process $X : \Omega \to \mathcal{X}^{\mathbb{R}}$ with generator matrix $Q \in \mathbb{R}^{\mathcal{X} \times \mathcal{X}}$ is reversible iff there exists a probability distribution $\pi \in \mathcal{P}(\mathcal{X})$ that satisfies the detailed balance conditions
$$\pi_x Q_{xy} = \pi_y Q_{yx}, \quad x, y \in \mathcal{X}. \qquad (3)$$
When such a distribution $\pi$ exists, it is the equilibrium distribution of the process.

Example 1.7 (Random walks on edge-weighted graphs). Consider an undirected graph $G = (\mathcal{X}, E)$ with vertex set $\mathcal{X}$ and edge set $E \subseteq \{\{x, y\} : x, y \in \mathcal{X}\}$, a subset of the unordered pairs of elements of $\mathcal{X}$. We say that $y$ is a neighbor of $x$ (and $x$ is a neighbor of $y$), denoted $x \sim y$, if $e = \{x, y\} \in E$. We assume a function $w : E \to \mathbb{R}_+$, such that $w_e$ is a positive number associated with each edge $e = \{x, y\} \in E$. Let $X_n \in \mathcal{X}$ denote the location of a particle on one of the graph vertices at the $n$th time-step. Consider the following random discrete-time movement of a particle on this graph from one vertex to another. If the particle is currently at vertex $x$, then it next moves to vertex $y$ with probability
$$P_{xy} \triangleq P(\{X_{n+1} = y\} \mid \{X_n = x\}) = \frac{w_e}{\sum_{f : x \in f} w_f} \mathbf{1}_{\{e = \{x, y\}\}}.$$
The Markov chain $X : \Omega \to \mathcal{X}^{\mathbb{N}}$ describing the sequence of vertices visited by the particle is a random walk on an undirected edge-weighted graph.
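The transition probabilities in Example 1.7 and the detailed balance conditions of Corollary 1.5 are easy to check numerically. The following is a minimal sketch (in Python with NumPy, not part of the original notes) on an arbitrarily chosen 4-vertex weighted graph; it builds the transition matrix $P$ from the edge weights and verifies detailed balance for the distribution that Proposition 1.8 below identifies as the stationary one.

    import numpy as np

    # Hypothetical symmetric weight matrix of a 4-vertex graph: W[x, y] = w_{x,y}, 0 if no edge.
    W = np.array([[0., 2., 1., 0.],
                  [2., 0., 3., 1.],
                  [1., 3., 0., 4.],
                  [0., 1., 4., 0.]])

    w_x = W.sum(axis=1)          # total weight of edges incident to vertex x
    P = W / w_x[:, None]         # P_{xy} = w_{x,y} / sum_{f: x in f} w_f, as in Example 1.7
    pi = w_x / W.sum()           # candidate distribution: w_x / (2 * total edge weight)

    flux = pi[:, None] * P       # flux[x, y] = pi_x P_{xy}
    print(np.allclose(flux, flux.T))  # detailed balance (2): pi_x P_{xy} = pi_y P_{yx}
    print(np.allclose(pi @ P, pi))    # pi is stationary, as Corollary 1.5 guarantees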
Google’s PageRank algorithm, which estimates the relative importance of webpages, is essentially a random walk on a graph!

Proposition 1.8. Consider an irreducible homogeneous Markov chain that describes a random walk on an edge-weighted graph with a finite number of vertices. In steady state, this Markov chain is time reversible, with stationary probability of being in a state $x \in \mathcal{X}$ given by
$$\pi_x = \frac{\sum_{f : x \in f} w_f}{2 \sum_{g \in E} w_g}. \qquad (4)$$

Proof. Using the definition of the transition probabilities for this Markov chain and the distribution $\pi$ defined in (4), we notice that
$$\pi_x P_{xy} = \frac{w_e}{2\sum_{g \in E} w_g} \mathbf{1}_{\{e = \{x, y\}\}}, \qquad \pi_y P_{yx} = \frac{w_e}{2\sum_{g \in E} w_g} \mathbf{1}_{\{e = \{x, y\}\}}.$$
Hence, the detailed balance equation is satisfied for each pair of states $x, y \in \mathcal{X}$, and the result follows.

We can also show the following dual result.

Lemma 1.9. Let $X : \Omega \to \mathcal{X}^{\mathbb{Z}_+}$ be a reversible Markov chain on a finite state space $\mathcal{X}$ with transition probability matrix $P \in [0,1]^{\mathcal{X} \times \mathcal{X}}$. Then, there exists a random walk on a weighted, undirected graph $G$ with the same transition probability matrix $P$.

Proof. We create a graph $G = (\mathcal{X}, E)$, where $\{x, y\} \in E$ if and only if $P_{xy} > 0$. For the stationary distribution $\pi : \mathcal{X} \to [0,1]$ of the Markov chain $X$, we set the edge weights
$$w_{\{x, y\}} \triangleq \pi_x P_{xy} = \pi_y P_{yx}.$$
With this choice of weights, it is easy to check that $w_x = \sum_{f : x \in f} w_f = \pi_x$, and the transition matrix associated with a random walk on this graph is exactly $P$.

Is every Markov chain reversible?

1. If the process is not stationary, then no. To see this, we observe that
$$P\{X_{t_1} = x_1, X_{t_2} = x_2\} = \nu_{t_1}(x_1) P_{x_1 x_2}(t_2 - t_1), \qquad P\{X_{t - t_2} = x_2, X_{t - t_1} = x_1\} = \nu_{t - t_2}(x_2) P_{x_2 x_1}(t_2 - t_1).$$
If the process is not stationary, the two probabilities cannot be equal for all times $t, t_1, t_2$ and states $x_1, x_2 \in \mathcal{X}$.

2. If the process is stationary, then it is still not reversible in general. Suppose we want to find a stationary distribution $\alpha \in \mathcal{P}(\mathcal{X})$ that satisfies the detailed balance equations $\alpha_x P_{xy} = \alpha_y P_{yx}$ for all states $x, y \in \mathcal{X}$. For an arbitrary Markov chain $X$, such a solution need not exist. To see this, consider states $x, y, z \in \mathcal{X}$ such that $P_{xy} P_{yz} > 0$. Reversibility would imply that
$$P_{\alpha}\{X_1 = x, X_2 = y, X_3 = z\} = P_{\alpha}\{X_1 = z, X_2 = y, X_3 = x\},$$
and hence
$$\frac{\alpha_x}{\alpha_z} = \frac{P_{zy} P_{yx}}{P_{xy} P_{yz}},$$
which in general differs from $\frac{P_{zx}}{P_{xz}}$, the ratio forced by the detailed balance equation for the pair $x, z$. Thus, we see that a necessary condition for time reversibility is $P_{xy} P_{yz} P_{zx} = P_{xz} P_{zy} P_{yx}$ for all $x, y, z \in \mathcal{X}$.

Theorem 1.10 (Kolmogorov’s criterion for reversibility of Markov chains). A stationary Markov chain $X : \Omega \to \mathcal{X}^{\mathbb{Z}}$ is time reversible if and only if, starting in state $x \in \mathcal{X}$, any path back to state $x$ has the same probability as the reversed path, for all initial states $x \in \mathcal{X}$. That is, for all $n \in \mathbb{N}$ and states $(x_1, \dots, x_n) \in \mathcal{X}^n$,
$$P_{x x_1} P_{x_1 x_2} \cdots P_{x_n x} = P_{x x_n} P_{x_n x_{n-1}} \cdots P_{x_1 x}.$$

Proof. The proof of necessity is as indicated above. To see the sufficiency part, fix states $x, y \in \mathcal{X}$. For any non-negative integer $n \in \mathbb{N}$, we compute
$$(P^{n+1})_{xy} P_{yx} = \sum_{x_1, \dots, x_n} P_{x x_1} \cdots P_{x_n y} P_{yx} = \sum_{x_1, \dots, x_n} P_{xy} P_{y x_n} \cdots P_{x_1 x} = P_{xy} (P^{n+1})_{yx}.$$
Taking the limit $n \to \infty$ and noticing that $(P^n)_{xy} \to \pi_y$ for all $x, y \in \mathcal{X}$, we get the desired result by appealing to Corollary 1.5.
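The three-state necessary condition derived above is easy to test numerically. Below is a minimal sketch (in Python with NumPy, not part of the original notes) that checks $P_{xy} P_{yz} P_{zx} = P_{xz} P_{zy} P_{yx}$ over all triples of states, for two hypothetical chains on a 3-cycle: a biased walk, which has the uniform stationary distribution but is not reversible, and a symmetric walk, which is reversible.

    import numpy as np
    from itertools import product

    def kolmogorov_3cycles(P):
        """Check the cycle condition P_xy P_yz P_zx == P_xz P_zy P_yx for all triples."""
        n = P.shape[0]
        return all(np.isclose(P[x, y] * P[y, z] * P[z, x],
                              P[x, z] * P[z, y] * P[y, x])
                   for x, y, z in product(range(n), repeat=3))

    # Biased walk on a 3-cycle: clockwise w.p. 2/3, counter-clockwise w.p. 1/3.
    P_biased = np.array([[0, 2/3, 1/3],
                         [1/3, 0, 2/3],
                         [2/3, 1/3, 0]])

    # Symmetric walk on the same cycle: reversible with respect to the uniform distribution.
    P_symmetric = np.array([[0, 1/2, 1/2],
                            [1/2, 0, 1/2],
                            [1/2, 1/2, 0]])

    print(kolmogorov_3cycles(P_biased))     # False: not reversible
    print(kolmogorov_3cycles(P_symmetric))  # True

For the full criterion of Theorem 1.10, cycles of every length must be checked; the three-state condition is only necessary, but it already rules out the biased walk.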
1.1 Reversible Processes

Definition 1.11. Let $X : \Omega \to \mathcal{X}^{\mathbb{R}}$ be a stationary homogeneous Markov process with stationary distribution $\pi \in \mathcal{P}(\mathcal{X})$ and generator matrix $Q \in \mathbb{R}^{\mathcal{X} \times \mathcal{X}}$. The probability flux from state $x$ to state $y$ is defined as $\lim_{t \to \infty} \frac{N_t^{xy}}{N_t}$, where
$$N_t^{xy} = \sum_{n \in \mathbb{N}} \mathbf{1}_{\{S_n \le t, X_n = y, X_{n-1} = x\}} \quad \text{and} \quad N_t = \sum_{n \in \mathbb{N}} \mathbf{1}_{\{S_n \le t\}}$$
respectively denote the total number of transitions from state $x$ to state $y$ and the total number of transitions in the time duration $(0, t]$, with $S_n$ the instant of the $n$th transition and $X_n$ the state entered at that transition.

Lemma 1.12. The probability flux from state $x$ to state $y$ is
$$\pi_x Q_{xy} = \lim_{t \to \infty} \frac{N_t^{xy}}{N_t}.$$

Lemma 1.13. For a stationary Markov process $X : \Omega \to \mathcal{X}^{\mathbb{R}}$, the probability flux balances across any cut $A \subseteq \mathcal{X}$, that is,
$$\sum_{x \in A} \sum_{y \notin A} \pi_x Q_{xy} = \sum_{x \in A} \sum_{y \notin A} \pi_y Q_{yx}.$$

Proof. Since each row of the generator sums to zero and from the global balance condition $\pi Q = 0$, we get
$$\sum_{x \in A} \sum_{y \in \mathcal{X}} \pi_x Q_{xy} = \sum_{x \in A} \sum_{y \in \mathcal{X}} \pi_y Q_{yx} = 0.$$
Further, we have the identity $\sum_{x \in A} \sum_{y \in A} \pi_x Q_{xy} = \sum_{x \in A} \sum_{y \in A} \pi_y Q_{yx}$. Subtracting the second identity from the first, we get the result.

Corollary 1.14. For $A = \{x\}$, the above equation reduces to the full balance equation for state $x$, i.e.,
$$\sum_{y \ne x} \pi_x Q_{xy} = \sum_{y \ne x} \pi_y Q_{yx}.$$

Example 1.15. We define two non-negative sequences of birth and death rates, denoted by $\lambda \in \mathbb{R}_+^{\mathbb{Z}_+}$ and $\mu \in \mathbb{R}_+^{\mathbb{N}}$. A Markov process $X : \Omega \to \mathbb{Z}_+^{\mathbb{R}_+}$ is called a birth-death process if its infinitesimal transition probabilities satisfy
$$P_{n, n+m}(h) = \big(1 - \lambda_n h - \mu_n h - o(h)\big) \mathbf{1}_{\{m = 0\}} + \lambda_n h\, \mathbf{1}_{\{m = 1\}} + \mu_n h\, \mathbf{1}_{\{m = -1\}} + o(h).$$
We say $f(h) = o(h)$ if $\lim_{h \to 0} f(h)/h = 0$. In other words, a birth-death process is any CTMC with generator of the form
$$Q = \begin{pmatrix} -\lambda_0 & \lambda_0 & 0 & 0 & 0 & \cdots \\ \mu_1 & -(\lambda_1 + \mu_1) & \lambda_1 & 0 & 0 & \cdots \\ 0 & \mu_2 & -(\lambda_2 + \mu_2) & \lambda_2 & 0 & \cdots \\ 0 & 0 & \mu_3 & -(\lambda_3 + \mu_3) & \lambda_3 & \cdots \\ \vdots & & & & \ddots & \end{pmatrix}.$$

Proposition 1.16. An ergodic birth-death process in steady-state is time-reversible.
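As a concrete illustration of Proposition 1.16 (a worked example added here, not part of the original notes), consider the M/M/1 queue: the birth-death process with constant rates $\lambda_n = \lambda$ and $\mu_n = \mu$, assumed ergodic so that $\rho = \lambda/\mu < 1$. Applying Lemma 1.13 to the cut $A = \{0, 1, \dots, n\}$, the only transitions across the cut are $n \to n+1$ at rate $\lambda$ and $n+1 \to n$ at rate $\mu$, so
$$\pi_n \lambda = \pi_{n+1} \mu \implies \pi_{n+1} = \rho\, \pi_n \implies \pi_n = (1 - \rho)\rho^n, \quad n \in \mathbb{Z}_+.$$
Since the only non-zero off-diagonal generator entries are $Q_{n, n+1} = \lambda$ and $Q_{n+1, n} = \mu$, this $\pi$ satisfies the detailed balance conditions (3) of Corollary 1.6 for every pair of states, and the process in steady state is therefore time-reversible.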
