Chapter on Gibbs States

7 Gibbs states

Brook's theorem states that a positive probability measure on a finite product may be decomposed into factors indexed by the cliques of its dependency graph. Closely related to this is the well-known fact that a positive measure is a spatial Markov field on a graph $G$ if and only if it is a Gibbs state. The Ising and Potts models are introduced, and the $n$-vector model is mentioned.

7.1 Dependency graphs

Let $X = (X_1, X_2, \dots, X_n)$ be a family of random variables on a given probability space. For $i, j \in V = \{1, 2, \dots, n\}$ with $i \ne j$, we write $i \perp j$ if: $X_i$ and $X_j$ are independent conditional on $(X_k : k \ne i, j)$. The relation $\perp$ is thus symmetric, and it gives rise to a graph $G$ with vertex set $V$ and edge-set $E = \{\langle i, j \rangle : i \not\perp j\}$, called the dependency graph of $X$ (or of its law). We shall see that the law of $X$ may be expressed as a product over terms corresponding to complete subgraphs of $G$. A complete subgraph of $G$ is called a clique, and we write $\mathcal{K}$ for the set of all cliques of $G$. For notational simplicity later, we designate the empty subset of $V$ to be a clique, and thus $\varnothing \in \mathcal{K}$. A clique is maximal if no strict superset is a clique, and we write $\mathcal{M}$ for the set of maximal cliques of $G$.

We assume for simplicity that the $X_i$ take values in some countable subset $S$ of the reals $\mathbb{R}$. The law of $X$ gives rise to a probability mass function $\pi$ on $S^n$ given by
\[
\pi(x) = P(X_i = x_i \text{ for } i \in V), \qquad x = (x_1, x_2, \dots, x_n) \in S^n.
\]
It is easily seen by the definition of independence that $i \perp j$ if and only if $\pi$ may be factorized in the form
\[
\pi(x) = g(x_i, U)\,h(x_j, U), \qquad x \in S^n,
\]
for some functions $g$ and $h$, where $U = (x_k : k \ne i, j)$. For $K \in \mathcal{K}$ and $x \in S^n$, we write $x_K = (x_i : i \in K)$. We call $\pi$ positive if $\pi(x) > 0$ for all $x \in S^n$.

In the following, each function $f_K$ acts on the domain $S^K$.

7.1 Theorem [54]. Let $\pi$ be a positive probability mass function on $S^n$. There exist functions $f_K : S^K \to [0, \infty)$, $K \in \mathcal{M}$, such that
\[
(7.2) \qquad \pi(x) = \prod_{K \in \mathcal{M}} f_K(x_K), \qquad x \in S^n.
\]

In the simplest non-trivial example, let us assume that $i \perp j$ whenever $|i - j| \ge 2$. The maximal cliques are the pairs $\{i, i+1\}$, and the mass function $\pi$ may be expressed in the form
\[
\pi(x) = \prod_{i=1}^{n-1} f_i(x_i, x_{i+1}), \qquad x \in S^n,
\]
so that $X$ is a Markov chain, whatever the direction of time.

Proof. We shall show that $\pi$ may be expressed in the form
\[
(7.3) \qquad \pi(x) = \prod_{K \in \mathcal{K}} f_K(x_K), \qquad x \in S^n,
\]
for suitable $f_K$. Representation (7.2) follows from (7.3) by associating each $f_K$ with some maximal clique $K'$ containing $K$ as a subset.

A representation of $\pi$ in the form
\[
\pi(x) = \prod_r f_r(x)
\]
is said to separate $i$ and $j$ if every $f_r$ is a constant function of either $x_i$ or $x_j$, that is, no $f_r$ depends non-trivially on both $x_i$ and $x_j$. Let
\[
(7.4) \qquad \pi(x) = \prod_{A \in \mathcal{A}} f_A(x_A)
\]
be a factorization of $\pi$ for some family $\mathcal{A}$ of subsets of $V$, and suppose that the pair $i$, $j$ satisfies: $i \perp j$, but $i$ and $j$ are not separated in (7.4). We shall construct from (7.4) a factorization that separates every pair $r$, $s$ that is separated in (7.4), and in addition separates $i$, $j$. Continuing by iteration, we obtain a factorization that separates every pair $i$, $j$ satisfying $i \perp j$, and this has the required form (7.3).

Since $i \perp j$, $\pi$ may be expressed in the form
\[
(7.5) \qquad \pi(x) = g(x_i, U)\,h(x_j, U)
\]
for some $g$, $h$, where $U = (x_k : k \ne i, j)$. Fix $s, t \in S$, and write $h_t$ (respectively, $h_{s,t}$) for the function $h(x)$ evaluated with $x_j = t$ (respectively, $x_i = s$, $x_j = t$), and similarly for other functions of $x$. By (7.4),
\[
(7.6) \qquad \pi(x) = \pi_t(x)\,\frac{\pi(x)}{\pi_t(x)} = \pi_t(x) \prod_{A \in \mathcal{A}} \frac{f_A(x_A)}{f_{A,t}(x_A)}.
\]
By (7.5), the ratio
\[
\frac{\pi(x)}{\pi_t(x)} = \frac{h(x_j, U)}{h(t, U)}
\]
is independent of $x_i$, so that
\[
\frac{\pi(x)}{\pi_t(x)} = \prod_{A \in \mathcal{A}} \frac{f_{A,s}(x_A)}{f_{A,s,t}(x_A)}.
\]
By (7.6),
\[
\pi(x) = \Bigl(\prod_{A \in \mathcal{A}} f_{A,t}(x_A)\Bigr)\Bigl(\prod_{A \in \mathcal{A}} \frac{f_{A,s}(x_A)}{f_{A,s,t}(x_A)}\Bigr)
\]
is the required representation: each factor in the first product does not depend on $x_j$, and each factor in the second does not depend on $x_i$, so the claim is proved. $\square$

7.2 Markov and Gibbs random fields

Let $G = (V, E)$ be a finite graph, taken for simplicity without loops or multiple edges.
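The cliques $\mathcal{K}$ and maximal cliques $\mathcal{M}$ that appear in Theorem 7.1 can be enumerated by brute force for any small graph. A minimal sketch in Python, assuming an illustrative four-vertex graph (a triangle $1$-$2$-$3$ with pendant edge $3$-$4$):

```python
import itertools

# Enumerate the cliques K and the maximal cliques M of a small graph by
# brute force. The graph below (a triangle 1-2-3 with pendant edge 3-4)
# is an assumed example, not taken from the text.
V = [1, 2, 3, 4]
E = {frozenset(e) for e in [(1, 2), (2, 3), (1, 3), (3, 4)]}

def is_clique(K):
    # Every pair of vertices in K must be joined; the empty set qualifies,
    # matching the convention that the empty subset is a clique.
    return all(frozenset((u, v)) in E for u, v in itertools.combinations(K, 2))

cliques = [set(K) for r in range(len(V) + 1)
           for K in itertools.combinations(V, r) if is_clique(K)]
maximal = [K for K in cliques if not any(K < L for L in cliques)]

print(sorted(map(sorted, maximal)))  # [[1, 2, 3], [3, 4]]
```

By Theorem 7.1, a positive mass function whose dependency graph is this graph factorizes into one term per maximal clique, here $f_{\{1,2,3\}}$ and $f_{\{3,4\}}$.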
Within statistics and statistical mechanics, there has been a great deal of interest in probability measures having a type of 'spatial Markov property' given in terms of the neighbour relation of $G$. We shall restrict ourselves here to measures on the sample space $\Omega = \{0, 1\}^V$, while noting that the following results may be extended without material difficulty to a larger product $S^V$, where $S$ is finite or countably infinite.

The vector $\sigma \in \Omega$ may be placed in one–one correspondence with the subset $\eta(\sigma) = \{v \in V : \sigma_v = 1\}$ of $V$, and we shall use this correspondence freely. For any $W \subseteq V$, we define the external boundary
\[
\partial W = \{v \in V : v \notin W,\ v \sim w \text{ for some } w \in W\}.
\]
For $s = (s_v : v \in V) \in \Omega$, we write $s_W$ for the sub-vector $(s_w : w \in W)$. We refer to the configuration of vertices in $W$ as the 'state' of $W$.

7.7 Definition. A probability measure $\pi$ on $\Omega$ is said to be positive if $\pi(\sigma) > 0$ for all $\sigma \in \Omega$. It is called a Markov (random) field if it is positive and: for all $W \subseteq V$, conditional on the state of $V \setminus W$, the law of the state of $W$ depends only on the state of $\partial W$. That is, $\pi$ satisfies the global Markov property
\[
(7.8) \qquad \pi\bigl(\sigma_W = s_W \bigm| \sigma_{V \setminus W} = s_{V \setminus W}\bigr) = \pi\bigl(\sigma_W = s_W \bigm| \sigma_{\partial W} = s_{\partial W}\bigr),
\]
for all $s \in \Omega$ and $W \subseteq V$.

The key result about such measures is their representation in terms of a 'potential function' $\phi$, in a form known as a Gibbs random field (or sometimes 'Gibbs state'). Recall the set $\mathcal{K}$ of cliques of the graph $G$, and write $2^V$ for the set of all subsets (or 'power set') of $V$.

7.9 Definition. A probability measure $\pi$ on $\Omega$ is called a Gibbs (random) field if there exists a 'potential' function $\phi : 2^V \to \mathbb{R}$, satisfying $\phi_C = 0$ if $C \notin \mathcal{K}$, such that
\[
(7.10) \qquad \pi(B) = \exp\Bigl(\sum_{K \subseteq B} \phi_K\Bigr), \qquad B \subseteq V.
\]
We allow the empty set in the above summation, so that $\log \pi(\varnothing) = \phi_\varnothing$.

Condition (7.10) has been chosen for combinatorial simplicity. It is not the physicists' preferred definition of a Gibbs state.
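In (7.10) the normalization is implicit: the term $\phi_\varnothing = \log\pi(\varnothing)$ absorbs the normalizing constant. A small numerical sketch of (7.10), assuming the path graph on three vertices and illustrative potential values:

```python
import itertools
import math

# A potential phi on the non-empty cliques of the path graph 1-2-3.
# The numerical values are assumptions chosen for illustration; phi
# vanishes off the cliques, as Definition 7.9 requires.
V = [1, 2, 3]
phi = {(1,): 0.3, (2,): -0.1, (3,): 0.5, (1, 2): 0.7, (2, 3): -0.4}

def subsets(B):
    for r in range(len(B) + 1):
        yield from itertools.combinations(sorted(B), r)

def unnorm(B):
    # exp of the sum in (7.10), excluding the empty clique for now
    return math.exp(sum(phi.get(K, 0.0) for K in subsets(B) if K))

powerset = list(subsets(V))
Z = sum(unnorm(B) for B in powerset)
phi_empty = -math.log(Z)  # log pi(empty) = phi_empty absorbs normalization

pi = {B: math.exp(phi_empty) * unnorm(B) for B in powerset}
assert abs(sum(pi.values()) - 1.0) < 1e-12
print("pi is a probability measure; log pi(empty) =", round(phi_empty, 4))
```

Choosing $\phi_\varnothing = -\log Z$ is exactly what makes $\pi$ sum to $1$ over all $B \subseteq V$, which is why the empty clique is admitted in the summation.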
Let us define a Gibbs state as a probability measure $\pi$ on $\Omega$ such that there exist functions $f_K : \{0, 1\}^K \to \mathbb{R}$, $K \in \mathcal{K}$, with
\[
(7.11) \qquad \pi(\sigma) = \exp\Bigl(\sum_{K \in \mathcal{K}} f_K(\sigma_K)\Bigr), \qquad \sigma \in \Omega.
\]
It is immediate that $\pi$ satisfies (7.10) for some $\phi$ whenever it satisfies (7.11). The converse holds also, and is left for Exercise 7.1.

Gibbs fields are thus named after Josiah Willard Gibbs, whose volume [95] made available the foundations of statistical mechanics. A simplistic motivation for the form of (7.10) is as follows. Suppose that each state $\sigma$ has an energy $E_\sigma$, and a probability $\pi(\sigma)$. We constrain the average energy $E = \sum_\sigma E_\sigma \pi(\sigma)$ to be fixed, and we maximize the entropy
\[
\eta(\pi) = -\sum_{\sigma \in \Omega} \pi(\sigma) \log_2 \pi(\sigma).
\]
With the aid of a Lagrange multiplier $\beta$, we find that
\[
\pi(\sigma) \propto e^{-\beta E_\sigma}, \qquad \sigma \in \Omega.
\]
The theory of thermodynamics leads to the expression $\beta = 1/(kT)$, where $k$ is Boltzmann's constant and $T$ is (absolute) temperature. Formula (7.10) arises when the energy $E_\sigma$ may be expressed as the sum of the energies of the sub-systems indexed by cliques.

7.12 Theorem. A positive probability measure $\pi$ on $\Omega$ is a Markov random field if and only if it is a Gibbs random field. The potential function $\phi$ corresponding to the Markov field $\pi$ is given by
\[
\phi_K = \sum_{L \subseteq K} (-1)^{|K \setminus L|} \log \pi(L), \qquad K \in \mathcal{K}.
\]

A positive probability measure $\pi$ is said to have the local Markov property if it satisfies the global property (7.8) for all singleton sets $W$ and all $s \in \Omega$. The global property evidently implies the local property, and it turns out that the two properties are equivalent. For notational convenience, we denote a singleton set $\{w\}$ as $w$.

7.13 Proposition. Let $\pi$ be a positive probability measure on $\Omega$. The following three statements are equivalent:
(a) $\pi$ satisfies the global Markov property,
(b) $\pi$ satisfies the local Markov property,
(c) for all $A \subseteq V$ and any pair $u, v \in V$ with $u \notin A$, $v \in A$ and $u \not\sim v$,
\[
(7.14) \qquad \frac{\pi(A \cup u)}{\pi(A)} = \frac{\pi(A \cup u \setminus v)}{\pi(A \setminus v)}.
\]

Proof. First, assume (a), so that (b) holds trivially.
Let $u \notin A$, $v \in A$, and $u \not\sim v$. Applying (7.8) with $W = \{u\}$ and, for $w \ne u$, $s_w = 1$ if and only if $w \in A$, we find that
\begin{align*}
(7.15) \qquad \frac{\pi(A \cup u)}{\pi(A) + \pi(A \cup u)}
&= \pi(\sigma_u = 1 \mid \sigma_{V \setminus u} = A) \\
&= \pi(\sigma_u = 1 \mid \sigma_{\partial u} = A \cap \partial u) \\
&= \pi(\sigma_u = 1 \mid \sigma_{V \setminus u} = A \setminus v) \qquad \text{since } v \notin \partial u \\
&= \frac{\pi(A \cup u \setminus v)}{\pi(A \setminus v) + \pi(A \cup u \setminus v)}.
\end{align*}
Equation (7.15) is equivalent to (7.14), whence (b) and (c) are equivalent under (a).

It remains to show that the local property implies the global property. The proof requires a short calculation, and may be done either by Theorem 7.1 or within the proof of Theorem 7.12. We follow the first route here.

Assume that $\pi$ is positive and satisfies the local Markov property. Then $u \perp v$ for all $u, v \in V$ with $u \not\sim v$. By Theorem 7.1, there exist functions $f_K$, $K \in \mathcal{M}$, such that
\[
(7.16) \qquad \pi(A) = \prod_{K \in \mathcal{M}} f_K(A \cap K), \qquad A \subseteq V.
\]
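Both the potential formula of Theorem 7.12 and the ratio identity (7.14) can be checked numerically on a small example. A sketch, assuming the path graph $1$-$2$-$3$ with illustrative potential values (here $u = 1$ and $v = 3$ are non-adjacent):

```python
import itertools
import math

# A Gibbs field (7.10) on the path graph 1-2-3, hence by Theorem 7.12 a
# Markov field. The potential values are assumptions for illustration.
V = [1, 2, 3]
phi_true = {(1,): 0.3, (2,): -0.1, (3,): 0.5, (1, 2): 0.7, (2, 3): -0.4}

def subsets(B):
    for r in range(len(B) + 1):
        yield from itertools.combinations(sorted(B), r)

powerset = list(subsets(V))
Z = sum(math.exp(sum(phi_true.get(K, 0.0) for K in subsets(B) if K))
        for B in powerset)
pi = {B: math.exp(sum(phi_true.get(K, 0.0) for K in subsets(B) if K)) / Z
      for B in powerset}

# Theorem 7.12: recover the potential by inclusion-exclusion over log pi.
def phi(K):
    return sum((-1) ** (len(K) - len(L)) * math.log(pi[L]) for L in subsets(K))

for K in powerset:
    if K in ((1, 3), (1, 2, 3)):                  # not cliques of the path
        assert abs(phi(K)) < 1e-12
    elif K:
        assert abs(phi(K) - phi_true[K]) < 1e-12

# Identity (7.14) with u = 1, v = 3, for both sets A containing v but not u.
def key(S):
    return tuple(sorted(S))

for A in [{3}, {2, 3}]:
    lhs = pi[key(A | {1})] / pi[key(A)]
    rhs = pi[key((A | {1}) - {3})] / pi[key(A - {3})]
    assert abs(lhs - rhs) < 1e-12

print("phi recovered on cliques, zero off cliques; (7.14) holds for u=1, v=3")
```

The inclusion–exclusion sum recovers $\phi_K$ exactly on the cliques and returns $0$ on the non-cliques $\{1,3\}$ and $\{1,2,3\}$, as Theorem 7.12 asserts; the ratio in (7.14) is unchanged when the non-neighbour $v$ is removed, reflecting the Markov property at $u$.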
