Chapter on Gibbs States


7 Gibbs states

Brook's theorem states that a positive probability measure on a finite product may be decomposed into factors indexed by the cliques of its dependency graph. Closely related to this is the well-known fact that a positive measure is a spatial Markov field on a graph $G$ if and only if it is a Gibbs state. The Ising and Potts models are introduced, and the $n$-vector model is mentioned.

7.1 Dependency graphs

Let $X = (X_1, X_2, \dots, X_n)$ be a family of random variables on a given probability space. For $i, j \in V = \{1, 2, \dots, n\}$ with $i \ne j$, we write $i \perp j$ if: $X_i$ and $X_j$ are independent conditional on $(X_k : k \ne i, j)$. The relation $\perp$ is thus symmetric, and it gives rise to a graph $G$ with vertex set $V$ and edge-set $E = \{\langle i, j \rangle : i \not\perp j\}$, called the dependency graph of $X$ (or of its law). We shall see that the law of $X$ may be expressed as a product over terms corresponding to complete subgraphs of $G$. A complete subgraph of $G$ is called a clique, and we write $\mathcal K$ for the set of all cliques of $G$. For notational simplicity later, we designate the empty subset of $V$ to be a clique, and thus $\varnothing \in \mathcal K$. A clique is maximal if no strict superset is a clique, and we write $\mathcal M$ for the set of maximal cliques of $G$.

We assume for simplicity that the $X_i$ take values in some countable subset $S$ of the reals $\mathbb R$. The law of $X$ gives rise to a probability mass function $\pi$ on $S^n$ given by

    \pi(x) = P(X_i = x_i \text{ for } i \in V), \qquad x = (x_1, x_2, \dots, x_n) \in S^n.

It is easily seen by the definition of independence that $i \perp j$ if and only if $\pi$ may be factorized in the form

    \pi(x) = g(x_i, U)\, h(x_j, U), \qquad x \in S^n,

for some functions $g$ and $h$, where $U = (x_k : k \ne i, j)$. For $K \in \mathcal K$ and $x \in S^n$, we write $x_K = (x_i : i \in K)$. We call $\pi$ positive if $\pi(x) > 0$ for all $x \in S^n$. In the following, each function $f_K$ acts on the domain $S^K$.

7.1 Theorem [54]. Let $\pi$ be a positive probability mass function on $S^n$. There exist functions $f_K : S^K \to [0, \infty)$, $K \in \mathcal M$, such that

(7.2)    \pi(x) = \prod_{K \in \mathcal M} f_K(x_K), \qquad x \in S^n.
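As a concrete check of Theorem 7.1, a two-state Markov chain's mass function should factorize into pairwise terms indexed by the cliques $\{i, i+1\}$ of its dependency graph. The sketch below verifies this numerically; the initial law `mu`, the transition matrix `P`, and the choice to absorb `mu` into the first factor are illustrative assumptions, not taken from the text.

```python
import itertools

# Illustrative two-state chain of length n = 4 on S = {0, 1}.
S, n = [0, 1], 4
mu = {0: 0.3, 1: 0.7}                       # positive initial law
P = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.2, 1: 0.8}}  # positive transition matrix

def pi(x):
    """Path probability pi(x) = mu(x_1) P(x_1, x_2) ... P(x_{n-1}, x_n)."""
    p = mu[x[0]]
    for a, b in zip(x, x[1:]):
        p *= P[a][b]
    return p

def f(i, a, b):
    """Pairwise factor on the clique {i, i+1}; mu is absorbed into f_0."""
    return mu[a] * P[a][b] if i == 0 else P[a][b]

# Check pi(x) = prod_i f_i(x_i, x_{i+1}) for every configuration x.
for x in itertools.product(S, repeat=n):
    prod = 1.0
    for i in range(n - 1):
        prod *= f(i, x[i], x[i + 1])
    assert abs(prod - pi(x)) < 1e-12
print("pairwise factorization verified")
```

The factors are not unique: the initial law could equally be spread across several cliques, which is consistent with the theorem asserting only existence.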
In the simplest non-trivial example, let us assume that $i \perp j$ whenever $|i - j| \ge 2$. The maximal cliques are the pairs $\{i, i+1\}$, and the mass function $\pi$ may be expressed in the form

    \pi(x) = \prod_{i=1}^{n-1} f_i(x_i, x_{i+1}), \qquad x \in S^n,

so that $X$ is a Markov chain, whatever the direction of time.

Proof. We shall show that $\pi$ may be expressed in the form

(7.3)    \pi(x) = \prod_{K \in \mathcal K} f_K(x_K), \qquad x \in S^n,

for suitable $f_K$. Representation (7.2) follows from (7.3) by associating each $f_K$ with some maximal clique $K'$ containing $K$ as a subset.

A representation of $\pi$ in the form

    \pi(x) = \prod_r f_r(x)

is said to separate $i$ and $j$ if every $f_r$ is a constant function of either $x_i$ or $x_j$, that is, no $f_r$ depends non-trivially on both $x_i$ and $x_j$. Let

(7.4)    \pi(x) = \prod_{A \in \mathcal A} f_A(x_A)

be a factorization of $\pi$ for some family $\mathcal A$ of subsets of $V$, and suppose that the pair $i, j$ satisfies: $i \perp j$, but $i$ and $j$ are not separated in (7.4). We shall construct from (7.4) a factorization that separates every pair $r, s$ that is separated in (7.4), and in addition separates $i, j$. Continuing by iteration, we obtain a factorization that separates every pair $i, j$ satisfying $i \perp j$, and this has the required form (7.3).

Since $i \perp j$, $\pi$ may be expressed in the form

(7.5)    \pi(x) = g(x_i, U)\, h(x_j, U)

for some $g, h$, where $U = (x_k : k \ne i, j)$. Fix $s, t \in S$, and write $\bar h_t$ (respectively, $\bar h_{s,t}$) for the function $h(x)$ evaluated with $x_j = t$ (respectively, $x_i = s$, $x_j = t$); the same bar notation is used for other functions of $x$. By (7.4),

(7.6)    \pi(x) = \bar\pi_t \, \frac{\pi(x)}{\bar\pi_t} = \biggl( \prod_{A \in \mathcal A} \overline{f_A(x_A)}_t \biggr) \frac{\pi(x)}{\bar\pi_t}.

By (7.5), the ratio

    \frac{\pi(x)}{\bar\pi_t} = \frac{h(x_j, U)}{h(t, U)}

is independent of $x_i$, so that

    \frac{\pi(x)}{\bar\pi_t} = \prod_{A \in \mathcal A} \frac{\overline{f_A(x_A)}_s}{\overline{f_A(x_A)}_{s,t}}.

By (7.6),

    \pi(x) = \biggl( \prod_{A \in \mathcal A} \overline{f_A(x_A)}_t \biggr) \biggl( \prod_{A \in \mathcal A} \frac{\overline{f_A(x_A)}_s}{\overline{f_A(x_A)}_{s,t}} \biggr)

is the required representation, and the claim is proved. $\square$

7.2 Markov and Gibbs random fields

Let $G = (V, E)$ be a finite graph, taken for simplicity without loops or multiple edges.
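The cliques and maximal cliques of such a graph, on which the rest of this section relies, can be enumerated directly. A sketch for a small illustrative graph (the vertex and edge sets are assumptions chosen for the example):

```python
from itertools import combinations

# Illustrative graph: a triangle {1,2,3} with a path 3 - 4 - 5 attached.
V = [1, 2, 3, 4, 5]
E = {(1, 2), (2, 3), (1, 3), (3, 4), (4, 5)}

def is_clique(W):
    """W is a clique if every pair of its vertices is joined by an edge."""
    return all((a, b) in E or (b, a) in E for a, b in combinations(sorted(W), 2))

# K: all cliques, including the empty set by the text's convention.
K = [frozenset(W) for r in range(len(V) + 1)
     for W in combinations(V, r) if is_clique(W)]

# M: maximal cliques, i.e. cliques with no strict clique superset.
M = [C for C in K if not any(C < D for D in K)]

print(sorted(map(sorted, M)))  # [[1, 2, 3], [3, 4], [4, 5]]
```

Note that singletons and the empty set are (vacuously) cliques, but none of them is maximal here, since each is strictly contained in an edge or triangle.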
Within statistics and statistical mechanics, there has been a great deal of interest in probability measures having a type of 'spatial Markov property' given in terms of the neighbour relation of $G$. We shall restrict ourselves here to measures on the sample space $\Sigma = \{0, 1\}^V$, while noting that the following results may be extended without material difficulty to a larger product $S^V$, where $S$ is finite or countably infinite.

The vector $\sigma \in \Sigma$ may be placed in one-one correspondence with the subset $\eta(\sigma) = \{v \in V : \sigma_v = 1\}$ of $V$, and we shall use this correspondence freely. For any $W \subseteq V$, we define the external boundary

    \Delta W = \{v \in V : v \notin W,\ v \sim w \text{ for some } w \in W\}.

For $s = (s_v : v \in V) \in \Sigma$, we write $s_W$ for the sub-vector $(s_w : w \in W)$. We refer to the configuration of vertices in $W$ as the 'state' of $W$.

7.7 Definition. A probability measure $\pi$ on $\Sigma$ is said to be positive if $\pi(\sigma) > 0$ for all $\sigma \in \Sigma$. It is called a Markov (random) field if it is positive and: for all $W \subseteq V$, conditional on the state of $V \setminus W$, the law of the state of $W$ depends only on the state of $\Delta W$. That is, $\pi$ satisfies the global Markov property

(7.8)    \pi\bigl(\sigma_W = s_W \mid \sigma_{V \setminus W} = s_{V \setminus W}\bigr) = \pi\bigl(\sigma_W = s_W \mid \sigma_{\Delta W} = s_{\Delta W}\bigr),

for all $s \in \Sigma$ and $W \subseteq V$.

The key result about such measures is their representation in terms of a 'potential function' $\phi$, in a form known as a Gibbs random field (or sometimes 'Gibbs state'). Recall the set $\mathcal K$ of cliques of the graph $G$, and write $2^V$ for the set of all subsets (or 'power set') of $V$.

7.9 Definition. A probability measure $\pi$ on $\Sigma$ is called a Gibbs (random) field if there exists a 'potential' function $\phi : 2^V \to \mathbb R$, satisfying $\phi_C = 0$ if $C \notin \mathcal K$, such that

(7.10)    \pi(B) = \exp\biggl( \sum_{K \subseteq B} \phi_K \biggr), \qquad B \subseteq V.

We allow the empty set in the above summation, so that $\log \pi(\varnothing) = \phi_\varnothing$.

Condition (7.10) has been chosen for combinatorial simplicity. It is not the physicists' preferred definition of a Gibbs state.
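Definition 7.9 can be made concrete: fix a potential supported on the cliques, choose $\phi_\varnothing$ so that the measure is normalized, and check that (7.10) defines a probability measure with $\log \pi(\varnothing) = \phi_\varnothing$. In the sketch below the graph (a path on three vertices) and the non-empty-clique potential values are illustrative assumptions.

```python
import math
from itertools import combinations

# Illustrative graph: the path 1 - 2 - 3.
V = [1, 2, 3]
E = {(1, 2), (2, 3)}

def subsets(U):
    return [frozenset(W) for r in range(len(U) + 1) for W in combinations(U, r)]

def is_clique(W):
    return all((a, b) in E or (b, a) in E for a, b in combinations(sorted(W), 2))

cliques = [C for C in subsets(V) if is_clique(C)]

# An arbitrary (illustrative) potential on the non-empty cliques...
phi = {C: 0.1 * sum(C) for C in cliques if C}

def score(B):
    """Sum of phi_K over non-empty cliques K contained in B."""
    return sum(phi[C] for C in cliques if C and C <= B)

# ...with phi_empty chosen so that pi is a probability measure; then
# log pi(empty set) = phi_empty, as noted after (7.10).
phi_empty = -math.log(sum(math.exp(score(B)) for B in subsets(V)))

def pi(B):
    return math.exp(phi_empty + score(B))

assert abs(sum(pi(B) for B in subsets(V)) - 1.0) < 1e-12
assert abs(math.log(pi(frozenset())) - phi_empty) < 1e-12
print("Gibbs field normalized; log pi(empty set) equals phi_empty")
```

This also makes visible why (7.10) needs no separate normalizing constant: the constant is carried by the clique $\varnothing$.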
Let us define a Gibbs state as a probability measure $\pi$ on $\Sigma$ such that there exist functions $f_K : \{0, 1\}^K \to \mathbb R$, $K \in \mathcal K$, with

(7.11)    \pi(\sigma) = \exp\biggl( \sum_{K \in \mathcal K} f_K(\sigma_K) \biggr), \qquad \sigma \in \Sigma.

It is immediate that $\pi$ satisfies (7.10) for some $\phi$ whenever it satisfies (7.11). The converse holds also, and is left for Exercise 7.1.

Gibbs fields are thus named after Josiah Willard Gibbs, whose volume [95] made available the foundations of statistical mechanics. A simplistic motivation for the form of (7.10) is as follows. Suppose that each state $\sigma$ has an energy $E_\sigma$, and a probability $\pi(\sigma)$. We constrain the average energy $E = \sum_\sigma E_\sigma \pi(\sigma)$ to be fixed, and we maximize the entropy

    \eta(\pi) = -\sum_{\sigma \in \Sigma} \pi(\sigma) \log_2 \pi(\sigma).

With the aid of a Lagrange multiplier $\beta$, we find that

    \pi(\sigma) \propto e^{-\beta E_\sigma}, \qquad \sigma \in \Sigma.

The theory of thermodynamics leads to the expression $\beta = 1/(kT)$, where $k$ is Boltzmann's constant and $T$ is (absolute) temperature. Formula (7.10) arises when the energy $E_\sigma$ may be expressed as the sum of the energies of the sub-systems indexed by cliques.

7.12 Theorem. A positive probability measure $\pi$ on $\Sigma$ is a Markov random field if and only if it is a Gibbs random field. The potential function $\phi$ corresponding to the Markov field $\pi$ is given by

    \phi_K = \sum_{L \subseteq K} (-1)^{|K \setminus L|} \log \pi(L), \qquad K \in \mathcal K.

A positive probability measure $\pi$ is said to have the local Markov property if it satisfies the global property (7.8) for all singleton sets $W$ and all $s \in \Sigma$. The global property evidently implies the local property, and it turns out that the two properties are equivalent. For notational convenience, we denote a singleton set $\{w\}$ as $w$.

7.13 Proposition. Let $\pi$ be a positive probability measure on $\Sigma$. The following three statements are equivalent:
(a) $\pi$ satisfies the global Markov property,
(b) $\pi$ satisfies the local Markov property,
(c) for all $A \subseteq V$ and any pair $u, v \in V$ with $u \notin A$, $v \in A$ and $u \not\sim v$,

(7.14)    \frac{\pi(A \cup u)}{\pi(A)} = \frac{\pi(A \cup u \setminus v)}{\pi(A \setminus v)}.

Proof. First, assume (a), so that (b) holds trivially.
Let $u \notin A$, $v \in A$, and $u \not\sim v$. Applying (7.8) with $W = \{u\}$ and, for $w \ne u$, $s_w = 1$ if and only if $w \in A$, we find that

(7.15)
    \frac{\pi(A \cup u)}{\pi(A) + \pi(A \cup u)}
        = \pi(\sigma_u = 1 \mid \sigma_{V \setminus u} = A)
        = \pi(\sigma_u = 1 \mid \sigma_{\Delta u} = A \cap \Delta u)
        = \pi(\sigma_u = 1 \mid \sigma_{V \setminus u} = A \setminus v) \quad \text{since } v \notin \Delta u
        = \frac{\pi(A \cup u \setminus v)}{\pi(A \setminus v) + \pi(A \cup u \setminus v)}.

Equation (7.15) is equivalent to (7.14), whence (b) and (c) are equivalent under (a).

It remains to show that the local property implies the global property. The proof requires a short calculation, and may be done either by Theorem 7.1 or within the proof of Theorem 7.12. We follow the first route here. Assume that $\pi$ is positive and satisfies the local Markov property. Then $u \perp v$ for all $u, v \in V$ with $u \not\sim v$. By Theorem 7.1, there exist functions $f_K$, $K \in \mathcal M$, such that

(7.16)    \pi(A) = \prod_{K \in \mathcal M} f_K(A \cap K), \qquad A \subseteq V.
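Returning to Theorem 7.12, its potential formula can be checked numerically. The sketch below uses a product measure on the subsets of $V$ — a Markov random field whose dependency graph has no edges, so its cliques are exactly $\varnothing$ and the singletons — with illustrative weights. The computed $\phi$ vanishes off the cliques, and Möbius inversion over the Boolean lattice recovers $\log \pi$, i.e. (7.10) holds.

```python
import math
from itertools import combinations

# Product measure on subsets of V: pi(B) proportional to prod_{v in B} w_v.
# The weights w are illustrative assumptions.
V = [1, 2, 3]
w = {1: 0.5, 2: 2.0, 3: 1.5}

def subsets(U):
    return [frozenset(W) for r in range(len(U) + 1) for W in combinations(U, r)]

Z = 1.0
for v in V:
    Z *= 1.0 + w[v]  # normalizing constant: sum over all subsets

def pi(B):
    p = 1.0
    for v in B:
        p *= w[v]
    return p / Z

def phi(K):
    """phi_K = sum over L subseteq K of (-1)^{|K \\ L|} log pi(L)."""
    return sum((-1) ** len(K - L) * math.log(pi(L)) for L in subsets(K))

# phi vanishes off the cliques (here: the empty set and the singletons)...
for K in subsets(V):
    if len(K) >= 2:
        assert abs(phi(K)) < 1e-12
# ...and inversion recovers log pi(B) as the sum of phi_K over K subseteq B.
for B in subsets(V):
    total = sum(phi(K) for K in subsets(V) if K <= B)
    assert abs(total - math.log(pi(B))) < 1e-12
print("potential supported on cliques; (7.10) verified")
```

For this measure $\phi_{\{v\}} = \log w_v$ and $\phi_\varnothing = -\log Z$, so the inversion formula isolates exactly the per-clique interactions.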