Hidden Gibbs Models: Theory and Applications


– DRAFT –

Evgeny Verbitskiy
Mathematical Institute, Leiden University, The Netherlands
Johann Bernoulli Institute, Groningen University, The Netherlands

November 21, 2015

Contents

1 Introduction
  1.1 Hidden Models / Hidden Processes
    1.1.1 Equivalence of "random" and "deterministic" settings
    1.1.2 Deterministic chaos
  1.2 Hidden Gibbs Models
    1.2.1 How realistic is the Gibbs assumption?

I Theory

2 Gibbs states
  2.1 Gibbs-Boltzmann Ansatz
  2.2 Gibbs states for Lattice Systems
    2.2.1 Translation Invariant Gibbs states
    2.2.2 Regularity of Gibbs states
  2.3 Gibbs and other regular stochastic processes
    2.3.1 Markov Chains
    2.3.2 k-step Markov chains
    2.3.3 Variable Length Markov Chains
    2.3.4 Countable Mixtures of Markov Chains (CMMC's)
    2.3.5 g-measures (chains with complete connections)
  2.4 Gibbs measures in Dynamical Systems
    2.4.1 Equilibrium states

3 Functions of Markov processes
  3.1 Markov Chains (recap)
    3.1.1 Regular measures on Subshifts of Finite Type
  3.2 Functions of Markov Processes
    3.2.1 Functions of Markov Chains in Dynamical Systems
  3.3 Hidden Markov Models
  3.4 Functions of Markov Chains from the Thermodynamic point of view

4 Hidden Gibbs Processes
  4.1 Functions of Gibbs Processes
  4.2 Yayama
  4.3 Review: renormalization of g-measures
  4.4 Cluster Expansions
  4.5 Cluster expansion

5 Hidden Gibbs Fields
  5.1 Renormalization Transformations in Statistical Mechanics
  5.2 Examples of pathologies under renormalization
  5.3 General Properties of Renormalised Gibbs Fields
  5.4 Dobrushin's reconstruction program
    5.4.1 Generalized Gibbs states / "Classification" of singularities
    5.4.2 Variational Principles
  5.5 Criteria for Preservation of the Gibbs property under renormalization

II Applications

6 Hidden Markov Models
  6.1 Three basic problems for HMM's
    6.1.1 The Evaluation Problem
    6.1.2 The Decoding or State Estimation Problem
  6.2 Gibbs property of Hidden Markov Chains

7 Denoising

A Prerequisites
  A.1 Notation
  A.2 Measure theory
  A.3 Stochastic processes
  A.4 Ergodic theory
  A.5 Entropy
    A.5.1 Shannon's entropy rate per symbol
    A.5.2 Kolmogorov-Sinai entropy of measure-preserving systems

Chapter 1
Introduction

Last modified on April 3, 2015

Very often the true dynamics of a physical system is hidden from us: we are able to observe only a certain measurement of the state of the underlying system. For example, air temperature and wind speed are easily observable functions of the climate dynamics, while an electroencephalogram provides insight into the functioning of the brain.
Partial Observability · Noise · Coarse Graining

Observing only partial information naturally limits our ability to describe or model the underlying system, detect changes in dynamics, make predictions, etc. Nevertheless, these problems must be addressed in practical situations, and they are extremely challenging from the mathematical point of view.

Mathematics has proved to be extremely useful in dealing with imperfect data. A number of methods have been developed within various mathematical disciplines, such as

• Dynamical Systems
• Control Theory
• Decision Theory
• Information Theory
• Machine Learning
• Statistics

to address particular instances of this general problem of dealing with imperfect knowledge.

Remarkably, many of these problems have a common nature. For example, correcting signals corrupted by noisy channels during transmission, analyzing neuronal spikes, analyzing genomic sequences, and validating the so-called renormalization group methods of theoretical physics all have the same underlying mathematical structure: a hidden Gibbs model. In this model, the observable process is a function (either deterministic or random) of a process whose underlying probabilistic law is Gibbs. Gibbs models were introduced in Statistical Mechanics, and have been highly successful in modeling physical systems (ferromagnets, dilute gases, polymer chains). Nowadays Gibbs models are ubiquitous and are used by researchers from different fields, often implicitly, i.e., without knowledge that a given model belongs to a much wider class. The well-established Gibbs theory provides an excellent starting point for developing a Hidden Gibbs theory.

A subclass of Hidden Gibbs models is formed by the well-known Hidden Markov Models. Several examples have been considered in Statistical Mechanics and the theory of Dynamical Systems. However, many basic questions are still unanswered. Moreover, the relevance of Hidden Gibbs models to other areas, such as Information Theory, Bioinformatics, and Neuronal Dynamics, has never been made explicit and exploited.

Let us now formalise the problem, first by describing the mathematical paradigm to model partial observability, effects of noise, and coarse-graining, and then by introducing the class of Hidden Gibbs Models.

1.1 Hidden Models / Hidden Processes

Let us start with the following basic model, which we will specify further in the subsequent chapters.

The observable process $\{Y_t\}$ is a function of the hidden time-dependent process $\{X_t\}$.

We allow the process $\{X_t\}$ to be either

• a stationary random (stochastic) process,
• a deterministic dynamical process $X_{t+1} = f(X_t)$.

For simplicity, we will only consider processes with discrete time, i.e., $t \in \mathbb{Z}$ or $\mathbb{Z}_+$. However, much of what we discuss applies to continuous-time processes as well.

The process $\{Y_t\}$ is a function of $\{X_t\}$. The function can be

• deterministic: $Y_t = \varphi(X_t)$ for all $t$,
• or random: $Y_t \sim P_{X_t}(\cdot)$, i.e., $Y_t$ is chosen according to some probability distribution which depends on $X_t$.

Remark 1.1. If not stated otherwise, we will implicitly assume that, in case $Y_t$ is a random function of the underlying hidden process $X_t$, then $Y_t$ is chosen independently for every $t$. For example,
$$Y_t = X_t + Z_t,$$
where the "noise" $\{Z_t\}$ is a sequence of independent identically distributed random variables. In many practical situations discussed below one can easily allow for "dependence", e.g., $\{Z_t\}$ being a Markov process.
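To make the basic model concrete, here is a minimal numerical sketch (Python/NumPy; not part of the original text) of the two observation mechanisms just described: a deterministic observation $Y_t = \varphi(X_t)$ and a noisy observation $Y_t = X_t + Z_t$ of a hidden two-state Markov chain. The transition matrix, the observable $\varphi$, and the noise level are illustrative assumptions only.

```python
# Minimal sketch: a hidden process {X_t} observed deterministically or through noise.
import numpy as np

rng = np.random.default_rng(0)

def sample_markov_chain(P, n, x0=0):
    """Sample a hidden Markov chain {X_t} with transition matrix P (rows sum to 1)."""
    x = np.empty(n, dtype=int)
    x[0] = x0
    for t in range(1, n):
        x[t] = rng.choice(len(P), p=P[x[t - 1]])
    return x

# Illustrative two-state hidden chain (not taken from the text).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
X = sample_markov_chain(P, 1000)

# Deterministic observation: Y_t = phi(X_t), with phi(0) = -1, phi(1) = +1.
phi = np.array([-1.0, 1.0])
Y_det = phi[X]

# Random observation: Y_t = X_t + Z_t with i.i.d. Gaussian noise Z_t.
Z = rng.normal(scale=0.5, size=len(X))
Y_noisy = X + Z
```

Replacing the i.i.d. noise $\{Z_t\}$ by, say, a Markov process gives the dependent-noise variant mentioned in Remark 1.1.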
Moreover, as we will see later, the case of a random function can be reduced to the deterministic case; hence, any stationary process $\{Z_t\}$ can be used to model noise. However, despite the fact that the two models are equivalent (see the next subsection), it is often convenient to keep the dependence on the noise parameters explicit.

1.1.1 Equivalence of "random" and "deterministic" settings

There is a one-to-one correspondence between stationary processes and measure-preserving dynamical systems.

Proposition 1.2. Let $(\Omega, \mathcal{A}, \mu, T)$ be a measure-preserving dynamical system. Then for any measurable $\varphi : \Omega \to \mathbb{R}$,
$$Y_t^{(\omega)} = \varphi(T^t \omega), \qquad \omega \in \Omega,\ t \in \mathbb{Z}_+ \text{ or } \mathbb{Z}, \tag{1.1.1}$$
is a stationary process. In the opposite direction, any real-valued stationary process $\{Y_t\}$ gives rise to a measure-preserving dynamical system $(\Omega, \mathcal{A}, \mu, T)$ and a measurable function $\varphi : \Omega \to \mathbb{R}$ such that (1.1.1) holds.

Exercise 1.3. Prove Proposition 1.2.

1.1.2 Deterministic chaos

Chaotic systems are typically defined as systems whose trajectories depend sensitively on the initial conditions, that is, small differences in initial conditions result, over time, in substantial differences in the states: small causes can produce large effects. Simple observables (functions) $\{Y_t\}$ of trajectories of chaotic dynamical systems $\{X_t\}$, $X_{t+1} = f(X_t)$, can be indistinguishable from "random" processes.

Example 1.4. Consider the piecewise expanding map $f : [0,1] \to [0,1]$, $f(x) = 2x \bmod 1$, and the following observable $\varphi$:
$$\varphi(x) = \begin{cases} 0, & x \in [0, \tfrac{1}{2}] =: I_0, \\ 1, & x \in (\tfrac{1}{2}, 1] =: I_1. \end{cases}$$
Since the map $f$ is expanding (for $x, x'$ close, $d(f(x), f(x')) = 2\, d(x, x')$), the dynamical system is chaotic. Furthermore, one can easily show that the Lebesgue measure on $[0,1]$ is $f$-invariant, and the process $Y_n = \varphi(f^n(x))$ is actually a Bernoulli process.

Theorem 1.5 (Sinai's Theorem on Bernoulli factors).

At the same time, a time series often looks "completely random" while it has a very simple underlying dynamical structure. In the figure below, Gaussian white noise is compared visually with the so-called deterministic Gaussian white noise. Reconstruction techniques (cf. Chapter ??) easily allow one to distinguish the two time series, as well as to identify the underlying dynamics.

Figure 1.1: (a) Time series of the Gaussian and deterministic Gaussian white noise. (b) Corresponding probability densities of single observations. (c) Reconstruction plots: $X_{n+1}$ vs $X_n$.

Many vivid examples of close resemblance between "random" and "deterministic" processes can be found in the literature.
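The following short sketch (Python; not part of the original text) illustrates Example 1.4 and the reconstruction plots of Figure 1.1: the binary observable of the doubling map has an empirical frequency of ones close to $1/2$, as expected for a Bernoulli process, while the pairs $(X_n, X_{n+1})$ reveal the deterministic rule. Exact rational arithmetic is used because a naive floating-point iteration of $2x \bmod 1$ collapses to $0$ after roughly fifty steps.

```python
# Sketch of Example 1.4: the doubling map f(x) = 2x mod 1 with observable
# phi = 1 on I_1 = (1/2, 1], iterated exactly on rationals with a large odd
# denominator (floating point shifts the mantissa away and hits 0 quickly).
from fractions import Fraction
import random

random.seed(0)

def doubling_orbit(x0, n):
    """Exact orbit of X_{t+1} = 2 X_t mod 1."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append((2 * xs[-1]) % 1)
    return xs

q = 2 * random.getrandbits(128) + 1        # large odd denominator (illustrative choice)
x0 = Fraction(random.randrange(1, q), q)   # "typical" starting point
X = doubling_orbit(x0, 10_000)

# Observable phi: 0 on I_0 = [0, 1/2], 1 on I_1 = (1/2, 1].
Y = [1 if x > Fraction(1, 2) else 0 for x in X]
print("empirical frequency of 1s:", sum(Y) / len(Y))   # close to 1/2

# Reconstruction-plot data as in Figure 1.1(c): for the deterministic orbit the
# points (X_n, X_{n+1}) lie on the two branches of f; for i.i.d. noise they fill
# the unit square.
pairs_deterministic = [(float(a), float(b)) for a, b in zip(X[:-1], X[1:])]
noise = [random.random() for _ in range(len(X))]
pairs_random = list(zip(noise[:-1], noise[1:]))
```

Plotting `pairs_deterministic` against `pairs_random` reproduces the qualitative contrast of Figure 1.1(c): two straight branches versus a filled square.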