Lecture-17: Reversibility

1 Introduction

Let T be an additive ordered group, such as Z or R. A stochastic process (X(t) ∈ X : t ∈ T) is reversible if (X(t1),...,X(tn)) has the same distribution as (X(τ −t1),...,X(τ −tn)) for all positive integers n, all finite collections of instants {t1,...,tn} ⊂ T, and all shifts τ ∈ T.

Lemma 1.1. A reversible process is stationary.

Proof. Since X(t) is reversible, both (X(t1),...,X(tn)) and (X(s+t1),...,X(s+tn)) have the same distribution as (X(−t1),...,X(−tn)) for each n ∈ N, t1,...,tn ∈ T, and shift s ∈ T (take τ = 0 and τ = s, respectively). Hence the process is stationary.

Let P(X) be the space of probability distributions on the state space X, i.e. P(X) ≜ {α ∈ [0,1]^X : ∑x∈X αx = 1}.

Theorem 1.2. A stationary homogeneous Markov process with state space X and probability transition kernel P(t) is reversible iff there exists a probability distribution π ∈ P(X) that satisfies the detailed balance conditions

πxPxy(t) = πyPyx(t) for all x,y ∈ X and times t ∈ T. (1)

When such a distribution π exists, it is the equilibrium distribution of the process.

Proof. First assume that X(t) is reversible, and hence stationary. We denote the stationary distribution by π, and by reversibility of X(t) we have

Pr{X(t1) = x,X(t2) = y} = Pr{X(t2) = x,X(t1) = y},

for τ = t2 +t1. Writing both sides in terms of π and the transition kernel, with t = t2 −t1, we obtain the detailed balance conditions (1). Conversely, let π be a distribution that satisfies the detailed balance conditions. Summing both sides of (1) over y ∈ X, we see that π is the equilibrium distribution. Let (x1,...,xm) ∈ X^m. Applying the detailed balance conditions (1) repeatedly, we can write

πx1 Px1x2(t2 −t1)···Pxm−1xm(tm −tm−1) = πxm Pxmxm−1(tm −tm−1)···Px2x1(t2 −t1).

Therefore, since X is a stationary homogeneous Markov process, it follows that for all t0 ∈ T

Pr{X(t1) = x1,...,X(tm) = xm} = Pr{X(t0) = xm,...,X(t0 +tm −t1) = x1}.

Since m ∈ N and t0,t1,...,tm were arbitrary, reversibility follows.

Corollary 1.3. A stationary discrete time Markov chain with state space X and transition matrix P is reversible iff there exists a probability distribution π that satisfies the detailed balance conditions

πi Pij = πj Pji, ∀i, j ∈ X. (2)

When such a distribution π exists, it is the equilibrium distribution of the process.

Corollary 1.4. A stationary Markov process with state space X and generator matrix Q is reversible iff there exists a probability distribution π that satisfies the detailed balance conditions

πi Qij = πj Qji, ∀i, j ∈ X. (3)

When such a distribution π exists, it is the equilibrium distribution of the process.
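As a quick sanity check, the detailed balance conditions (2) are easy to test numerically. The sketch below, assuming a hypothetical two-state chain with parameters a and b (any two-state chain is reversible), verifies the balance conditions and that the resulting π is indeed the equilibrium distribution.

```python
import numpy as np

# Hypothetical two-state chain with P = [[1-a, a], [b, 1-b]].
a, b = 0.3, 0.6
P = np.array([[1 - a, a], [b, 1 - b]])
pi = np.array([b, a]) / (a + b)       # candidate distribution

# Detailed balance (2): pi_0 P_01 = pi_1 P_10.
assert np.isclose(pi[0] * P[0, 1], pi[1] * P[1, 0])
# Summing the balance condition over one index gives pi P = pi,
# so pi is also the equilibrium distribution.
assert np.allclose(pi @ P, pi)
```

Note that stationarity follows from detailed balance by summation, but not conversely: a chain can have a stationary distribution without satisfying (2).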

Example 1.5 (Random walks on edge-weighted graphs). Consider an undirected graph G = (X,E) with vertex set X and edge set E = {{x,y} : x,y ∈ X}, a subset of unordered pairs of elements from X. We say that y is a neighbor of x (and x is a neighbor of y) if e = {x,y} ∈ E, and denote x ∼ y. We assume a weight function w : E → R+, so that we is a positive number associated with each edge e ∈ E. Let Xn ∈ X denote the location of a particle on one of the graph vertices at the nth time-step, and consider the following random discrete-time movement of the particle on this graph from one vertex to another. If the particle is presently at vertex x, then it next moves to vertex y with probability

Pxy ≜ P(Xn+1 = y|Xn = x) = (we/∑f:x∈f wf) 1{e={x,y}}.

The Markov chain describing the sequence of vertices visited by the particle is a random walk on an undirected edge-weighted graph. Google's PageRank algorithm, which estimates the relative importance of webpages, is essentially a random walk on a graph!

Proposition 1.6. Consider an irreducible homogeneous Markov chain that describes the random walk on an edge-weighted graph with a finite number of vertices. In steady state, this Markov chain is time reversible with stationary probability of being in a state x ∈ X given by

πx = (∑f:x∈f wf)/(2∑g∈E wg). (4)

Proof. Using the definition of transition probabilities for this Markov chain and the given distribution π defined in (4), we notice that

πx Pxy = (we/2∑g∈E wg) 1{e={x,y}}, πy Pyx = (we/2∑g∈E wg) 1{e={x,y}}.

Hence, the detailed balance condition for each pair of states x,y ∈ X is satisfied, and the result follows.

We can also show the following dual result.

Lemma 1.7. Let X be a reversible Markov chain on a finite state space X with transition probability matrix P. Then there exists a random walk on a weighted, undirected graph G with the same transition probability matrix P.
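Formula (4) can be checked numerically. The sketch below, assuming a hypothetical weighted graph on four vertices, builds the walk's transition matrix, forms π from (4), and verifies both stationarity and detailed balance.

```python
import numpy as np

# Hypothetical weighted graph on vertices {0, 1, 2, 3}.
edges = {(0, 1): 1.0, (1, 2): 2.0, (2, 3): 1.0, (0, 3): 4.0, (1, 3): 3.0}
n = 4
deg = np.zeros(n)                       # deg[x] = sum of w_f over edges f containing x
for (x, y), w in edges.items():
    deg[x] += w
    deg[y] += w

P = np.zeros((n, n))                    # random-walk transition matrix
for (x, y), w in edges.items():
    P[x, y] = w / deg[x]
    P[y, x] = w / deg[y]

pi = deg / (2 * sum(edges.values()))    # formula (4)

assert np.isclose(pi.sum(), 1.0)
assert np.allclose(pi @ P, pi)          # pi is stationary
flux = pi[:, None] * P                  # flux[x, y] = pi_x P_xy
assert np.allclose(flux, flux.T)        # detailed balance: time reversible
```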

Proof. We create a graph G = (X,E), where {x,y} ∈ E if and only if Pxy > 0. For the stationary distribution π of the Markov chain X, we set the edge weights

w{x,y} ≜ πx Pxy = πy Pyx.

With this choice of weights, it is easy to check that wx = ∑f:x∈f wf = ∑y πx Pxy = πx, and the transition matrix associated with a random walk on this graph is exactly P.
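The construction in this proof can also be checked numerically. The sketch below, assuming the same hypothetical two-state reversible chain as before, builds the edge weights w{x,y} = πx Pxy (self-loops included, since Pxx may be positive) and recovers P as the induced random walk.

```python
import numpy as np

# Hypothetical reversible chain: any two-state chain is reversible.
a, b = 0.3, 0.6
P = np.array([[1 - a, a], [b, 1 - b]])
pi = np.array([b, a]) / (a + b)         # its stationary distribution

W = pi[:, None] * P                     # edge weights w_{x,y} = pi_x P_xy
assert np.allclose(W, W.T)              # well defined: pi_x P_xy = pi_y P_yx

wx = W.sum(axis=1)                      # w_x = total weight at vertex x
assert np.allclose(wx, pi)              # w_x = pi_x, as claimed in the proof
P_walk = W / wx[:, None]                # transition matrix of the induced walk
assert np.allclose(P_walk, P)           # it recovers P exactly
```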

Is every Markov chain reversible? If we try to solve the equations necessary for time reversibility, xi Pij = xj Pji for all i, j ∈ X, for an arbitrary Markov chain, we may not find any solution. This is because, if Pij Pjk > 0, then these equations force

xi/xk = (Pkj Pji)/(Pij Pjk),

which need not equal Pki/Pik.

Thus, we see that a necessary condition for time reversibility is Pij Pjk Pki = Pik Pkj Pji for all i, j,k ∈ X.

Theorem 1.8 (Kolmogorov's criterion for reversibility of Markov chains). A stationary Markov chain is time reversible if and only if, starting in state i, any path back to state i has the same probability as the reversed path, for all initial states i ∈ X. That is,

Pii1 Pi1i2 ···Piki = Piik Pikik−1 ···Pi1i.

Proof. The proof of necessity is as indicated above. To see sufficiency, fix states i, j. For any positive integer k, we compute

(P^k)ij Pji = ∑i1,...,ik Pii1 ···Pikj Pji = ∑i1,...,ik Pij Pjik ···Pi1i = Pij (P^k)ji.

Taking k → ∞ and noticing that (P^k)ij → πj for all i, j ∈ X, we get πj Pji = πi Pij, and the desired result follows by appealing to Theorem 1.2.
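Kolmogorov's criterion lends itself to a direct numerical test. The sketch below checks every cycle through distinct states (which suffices on a finite chain) against its reversal, on two hypothetical examples: the random walk on a weighted triangle, which passes, and a deterministic cycle, which fails.

```python
import itertools
import numpy as np

# For every cycle i -> i1 -> ... -> ik -> i of distinct states, compare
# the forward and reversed path probabilities.
def kolmogorov_reversible(P, tol=1e-12):
    n = P.shape[0]
    for k in range(2, n):                      # 2-cycles balance trivially
        for cycle in itertools.permutations(range(n), k + 1):
            fwd = np.prod([P[cycle[m], cycle[m + 1]] for m in range(k)]) * P[cycle[-1], cycle[0]]
            rev = np.prod([P[cycle[m + 1], cycle[m]] for m in range(k)]) * P[cycle[0], cycle[-1]]
            if abs(fwd - rev) > tol:
                return False
    return True

# Random walk on the weighted triangle w01 = 1, w12 = 2, w02 = 3
# (a hypothetical instance of Example 1.5) passes the criterion ...
P1 = np.array([[0, 1/4, 3/4], [1/3, 0, 2/3], [3/5, 2/5, 0]])
assert kolmogorov_reversible(P1)
# ... while a deterministic cycle 0 -> 1 -> 2 -> 0 fails it.
P2 = np.array([[0.0, 1, 0], [0, 0, 1], [1, 0, 0]])
assert not kolmogorov_reversible(P2)
```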

1.1 Reversible Processes

Let X be a stationary homogeneous Markov process with stationary distribution π and generator matrix Q. The probability flux from state x to state y is defined as πx Qxy = limt→∞ Nxy(t)/t, where Nxy(t) = ∑n∈N 1{Sn≤t, Xn=y, Xn−1=x} denotes the total number of transitions from state x to state y in the time duration (0,t], and N(t) = ∑n∈N 1{Sn≤t} the total number of transitions.

Lemma 1.9. For a stationary Markov process, probability flux balances across a cut A ⊆ X, that is

∑i∈A ∑j∉A πi Qij = ∑i∈A ∑j∉A πj Qji.

Proof. From the full balance condition πQ = 0, we get ∑j∈A ∑i∈X πi Qij = 0. Further, since each row of Q sums to zero, we have ∑i∈A ∑j∈X πi Qij = 0. Subtracting the second identity from the first cancels the common term ∑i∈A ∑j∈A πi Qij, and the result follows.

Corollary 1.10. For A = {i}, the above equation reduces to the full balance equation for state i, i.e.,

∑j≠i πi Qij = ∑j≠i πj Qji.
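Lemma 1.9 is easy to test numerically. The sketch below, assuming a hypothetical irreducible 4-state generator Q, solves πQ = 0 and checks that the flux out of the cut A = {0, 1} equals the flux in.

```python
import numpy as np

# Hypothetical irreducible 4-state generator (rows sum to zero).
Q = np.array([[-3.0,  2.0,  1.0,  0.0],
              [ 1.0, -4.0,  2.0,  1.0],
              [ 0.0,  3.0, -5.0,  2.0],
              [ 1.0,  0.0,  1.0, -2.0]])

# Solve pi Q = 0 together with sum(pi) = 1 via least squares on the
# stacked system; the solution is exact for a proper generator.
A_sys = np.vstack([Q.T, np.ones(4)])
b = np.array([0.0, 0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A_sys, b, rcond=None)

A = {0, 1}                              # a cut of the state space
comp = set(range(4)) - A
out_flux = sum(pi[i] * Q[i, j] for i in A for j in comp)
in_flux = sum(pi[j] * Q[j, i] for i in A for j in comp)
assert np.isclose(out_flux, in_flux)    # flux balances across the cut
```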

Example 1.11. We define two non-negative sequences of birth and death rates, denoted by {λn : n ∈ N0} and {µn : n ∈ N0}, respectively. A Markov process {Xt ∈ N0 : t ∈ R} on the state space N0 is called a birth-death process if its infinitesimal transition probabilities satisfy

Pn,n+m(h) = (1 − λnh − µnh)1{m=0} + λnh1{m=1} + µnh1{m=−1} + o(h).

We say f(h) = o(h) if limh→0 f(h)/h = 0. In other words, a birth-death process is any CTMC with a generator of the form

Q = ⎡ −λ0        λ0           0           0         0   ··· ⎤
    ⎢  µ1     −(λ1+µ1)       λ1           0         0   ··· ⎥
    ⎢   0        µ2       −(λ2+µ2)       λ2         0   ··· ⎥
    ⎢   0         0          µ3       −(λ3+µ3)     λ3   ··· ⎥
    ⎣   ⋮         ⋮           ⋮           ⋮         ⋮    ⋱ ⎦

Proposition 1.12. An ergodic birth-death process in steady-state is time-reversible.
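A minimal numerical sketch, truncating the state space to {0,...,N} and assuming hypothetical rate sequences, builds this tridiagonal generator and confirms that the product-form distribution satisfies both detailed balance (3) and full balance πQ = 0.

```python
import numpy as np

N = 6                                            # truncation level (states 0..N)
lam = [1.0 + 0.1 * n for n in range(N)]          # hypothetical birth rates
mu = [0.0] + [2.0] * N                           # hypothetical death rates (mu_0 unused)

# Truncated birth-death generator with the tridiagonal structure above.
Q = np.zeros((N + 1, N + 1))
for n in range(N + 1):
    if n < N:
        Q[n, n + 1] = lam[n]
    if n > 0:
        Q[n, n - 1] = mu[n]
    Q[n, n] = -Q[n].sum()

# Detailed balance recursion pi_{n+1} mu_{n+1} = pi_n lam_n.
pi = np.ones(N + 1)
for n in range(N):
    pi[n + 1] = pi[n] * lam[n] / mu[n + 1]
pi /= pi.sum()

flux = pi[:, None] * Q
assert np.allclose(flux, flux.T)                 # detailed balance (3)
assert np.allclose(pi @ Q, np.zeros(N + 1))      # hence full balance
```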

Proof. Since the process is stationary, the probability flux must balance across any cut of the form A = {0,1,2,...,i}, i ≥ 0. But this is precisely the equation πiλi = πi+1µi+1, since the only possible transitions across this cut are i → i+1 and i+1 → i. Hence the detailed balance conditions hold, and the process is time-reversible.

In fact, the following more general statement can be proven using similar ideas.

Proposition 1.13. Consider an ergodic CTMC on a countable state space X with the following property: for any pair of states i ≠ j, there is a unique path i = i0 → i1 → ··· → in(i,j) = j of distinct states having positive probability. Then the CTMC in steady-state is reversible.

Example 1.14 (M/M/1 queue). The M/M/1 queue's generator defines a birth-death process with λn = λ and µn = µ. Hence, if it is stationary, then it must be time-reversible, with the equilibrium distribution π satisfying the detailed balance equations πnλ = πn+1µ for each n ∈ N0. This yields πn+1 = ρπn for the system load ρ = Eσ1/Eξ1 = λ/µ. Since ∑i≥0 πi = 1, we must have ρ < 1, and then πn = (1−ρ)ρ^n for each n ∈ N0. In other words, if λ < µ, then the equilibrium distribution of the number of customers in the system is geometric with parameter ρ = λ/µ. We say that the M/M/1 queue is in the stable regime when ρ < 1.

Corollary 1.15. The number of customers in a stable M/M/1 queueing system at equilibrium is a reversible Markov process.
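The geometric form of the equilibrium distribution can be sketched numerically for hypothetical rates λ < µ:

```python
import numpy as np

lam, mu = 2.0, 5.0                      # hypothetical arrival and service rates
rho = lam / mu                          # system load, rho < 1 (stable regime)
pi = [(1 - rho) * rho ** n for n in range(50)]   # truncated geometric distribution

# Detailed balance across each cut {0, ..., n}: pi_n lam = pi_{n+1} mu.
assert all(np.isclose(pi[n] * lam, pi[n + 1] * mu) for n in range(49))
# The truncated mass sums to 1 - rho^50, which is essentially 1.
assert np.isclose(sum(pi), 1.0)
```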

Further, since the M/M/1 queue is a reversible CTMC, the following theorem follows.

Theorem 1.16 (Burke). Departures from a stable M/M/1 queue in equilibrium form a Poisson process with the same rate as the arrivals.
