The Information-Theoretic Structure of Classical Spin Systems
Santa Fe Institute Working Paper 15-10-042
arXiv:1510.08954v2 [cond-mat.stat-mech] 13 Aug 2016

Anatomy of a Spin: The Information-Theoretic Structure of Classical Spin Systems

Vikram S. Vijayaraghavan,∗ Ryan G. James,† and James P. Crutchfield‡
Complexity Sciences Center and Physics Department, University of California at Davis,
One Shields Avenue, Davis, CA 95616
(Dated: August 16, 2016)

∗ [email protected]
† [email protected]
‡ [email protected]

Collective organization in matter plays a significant role in its expressed physical properties. Typically, it is detected via an order parameter, appropriately defined for each given system's observed emergent patterns. Recent developments in information theory, however, suggest quantifying collective organization in a system- and phenomenon-agnostic way: decompose the system's thermodynamic entropy density into a localized entropy, that solely contained in the dynamics at a single location, and a bound entropy, that stored in space as domains, clusters, excitations, or other emergent structures. We compute this decomposition and related quantities explicitly for the nearest-neighbor Ising model on the 1D chain, the Bethe lattice with coordination number k = 3, and the 2D square lattice, illustrating its generality and the functional insights it gives near and away from phase transitions. In particular, we consider the roles that different spin motifs play (in cluster bulk, cluster edges, and the like) and how these affect the dependencies between spins.

PACS numbers: 05.20.-y 89.75.Kd 05.45.-a 02.50.-r 89.70.+c
Keywords: Ising spin model, thermodynamic entropy density, dual total correlation, entropy rate, elusive information, enigmatic information, predictable information rate, complex system

I. INTRODUCTION

Collective behavior underlies a vast array of fascinating phenomena, many of critical importance to contemporary science and technology. Unusual material properties—such as superconductivity, metal-insulator transitions, and heavy fermions—have been attributed to collective behavior arising from the compounded interaction of system components [1]. Collective behavior is by no means limited to complex materials, however: the behavior of financial markets [2], epileptic seizures [3] and consciousness [4], and animal flocking behavior [5] are all now also seen as examples. And we now appreciate that it appears most prominently via phase transitions [6, and references therein].

Operationally, collective behavior is detected and quantified using correlations or, more recently, by mutual information [7], estimated either locally via pairwise component interactions or globally. Exploring spin systems familiar from statistical mechanics, here we argue that these diagnostics can be substantially refined to become more incisive tools for quantifying collective behavior. This is part of the larger endeavor of discovering ways to automatically detect collective behavior and the emergence of organization [8].

Along these lines, much effort has been invested to explore information-theoretic properties of the Ising model of statistical mechanics. Both Shaw [9] and Arnold [10] studied information in 1D spin-strips within the 2D Ising model. Feldman and Crutchfield [11–13] explored several generalizations of the excess entropy—a well-known mutual information measure for time series—in 1D and 2D Ising models as a function of coupling strength, showing that they were sensitive to spin patterns and can be used as a generalized order parameter. In one of the more thoroughgoing studies to date, Lau and Grassberger [14] computed the excess entropy using adjacent rings of spins to probe criticality in the 2D cylindrical Ising model. Barnett et al. [6] tracked information flows in a kinetic Ising system using both the pairwise and global mutual informations (generalized as the total correlation [15]; it appears below) and the transfer entropy [16]. Abdallah and Plumbley [17] employed an extensive version of the dual total correlation [15] on small, random spin glasses.

Despite their successful application, the measures used to monitor collective phenomena in these studies were motivated information-theoretically rather than physically, and so their thermodynamic relevance and structural interpretation have remained unclear. To address this, we decompose the thermodynamic entropy density for spin systems into a set of information measures, some already familiar in information theory [18], complex systems [19], and elsewhere [17]. For example, one measure that falls out naturally—the bound entropy—is that part of the thermodynamic entropy accounting for entropy shared between spins. This is in contrast to monitoring collective behavior as the difference between the thermodynamic entropy and the entropy of a hypothetical distribution over uncorrelated spins—the total correlation mentioned above. In this way, the decomposition provides a physical basis for informational measures of collective behavior, in particular showing how the measures lead to interpretations of emergent structure in system configurations.

To make this argument and illustrate its consequences, Section II first defines notation and lays the groundwork for our decomposition. Section III then derives the decomposition. Though this decomposition is given for regular spin lattices, in principle it is meaningful for any system with a well-defined thermodynamic entropy. Section IV outlines the computational methods for estimating the resulting quantities, using them to interpret emergent organization in the nearest-neighbor ferromagnetic Ising model in one dimension (Section IV A), in the Bethe lattice with coordination number k = 3 (Section IV B), and in the two-dimensional square Ising lattice (Section IV C). Having established the different types of entropy in spin systems and connections to their underlying physical structures, we conclude by suggesting applications and future directions.

II. SPIN ENTROPIES

We write σ to denote a configuration of spins on a lattice L, σ_i for the particular spin state at lattice site i ∈ L, and σ_{\i} for the collection of all spin states in a configuration excluding σ_i at site i. The random variable over all possible spin configurations is likewise denoted σ, that for a particular spin variable σ_i, and that for all spin variables but the one at site i, σ_{\i}. As a shorthand, we write σ ∈ σ and similar to mean indexing into the event space of the random variable σ.

We study the ferromagnetic spin-1/2 Ising model with nearest-neighbor interactions in the thermodynamic limit, whose Hamiltonian is given by:

\[
\mathcal{H}(\sigma) = -J \sum_{\langle i,j \rangle} \sigma_i \sigma_j ,
\tag{1}
\]

where ⟨i, j⟩ denotes all pairs (i, j) such that the sites i and j are directly connected in L and the interaction strength J is positive. We assume the system is in equilibrium and isolated, and so the probability of configuration σ occurring is given by the Boltzmann distribution:

\[
p(\sigma) = \frac{1}{Z} e^{-\mathcal{H}(\sigma)/k_B T} ,
\tag{2}
\]

where Z is the partition function.

The Boltzmann entropy of a statistical mechanical system assumes the constituent degrees of freedom (here, spins) are uncorrelated. To determine it, we use the isolated spin entropy:

\[
H[\sigma_0] = -\,p(\uparrow) \log_2 p(\uparrow) \;-\; p(\downarrow) \log_2 p(\downarrow) ,
\]

where p(↑) = (1 + m)/2 is the probability of a spin being up, p(↓) = (1 − m)/2 is the probability of a spin being down, and m = (#↑ − #↓)/|L| is the average magnetization in a configuration. The site index 0 was chosen arbitrarily and represents any single spin in the lattice. The system's Boltzmann entropy is then the extensive quantity H_B = |L| · H[σ_0].

As Jaynes [20] emphasizes, though correct for an ideal gas, H_B is not the empirically correct system entropy if a system develops internal correlations. In our case, if spins are correlated across the lattice, one must consider entire configurations, leading to the Gibbs entropy:

\[
H[\sigma] = -\sum_{\sigma} p(\sigma) \log_2 p(\sigma)
\tag{3}
\]

and the thermodynamic entropy density h, the entropy per spin when the entire lattice is considered:

\[
h = \frac{H[\sigma]}{|L|} .
\tag{4}
\]

Note that we dropped the factor k_B and used the base-2 logarithm. Our use of the letter h, as opposed to s, is meant to reflect this multiplicative-constant difference. It is easy to see that H[σ] ≤ H_B. Thus, the Boltzmann entropy is typically an overestimate of the true thermodynamic entropy.

More to the point, in the thermodynamic limit the Boltzmann entropy H_B and the Gibbs entropy H[σ] differ substantially. As Jaynes shows [20], the difference directly measures the effect of internal correlations on total energy and other thermodynamic state variables. Moreover, the difference does not vanish; rather, it increases proportionally to system size |L|.

Before leaving the differences between entropy definitions, it is important to note that Boltzmann's more familiar definition—S = k_B ln W—via the number W of microstates associated with a given thermodynamic macrostate is consistent with Gibbs' definition [20]. This follows from Shannon's deep insight, the asymptotic equipartition property: 2^{h|L|} measures the volume of typical microstates—the set of almost-equiprobable configurations that are simultaneously most numerous and capture the bulk of the probability, those that are typically realized. (See Refs. [21, Sec. 21], [7, Ch. 3] and [22, 23].)
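The entropies of this section can be made concrete by exact enumeration on a small lattice. The following sketch is our illustration, not code from the paper; the ring size N, coupling J, and temperature T (in units where k_B = 1) are arbitrary choices. It builds the Boltzmann distribution of Eq. (2) for a periodic 1D ring, then computes the Gibbs entropy (3), the entropy density (4), and the Boltzmann entropy H_B = |L| · H[σ_0], confirming H[σ] ≤ H_B:

```python
# Exact enumeration of a small ferromagnetic Ising ring, Eqs. (1)-(4).
# Illustration only: N = 8 spins with periodic boundaries stand in for the
# thermodynamic limit; J and T are arbitrary choices with k_B = 1.
from itertools import product
from math import exp, log2

N = 8          # number of spins on the ring
J = 1.0        # ferromagnetic coupling
T = 2.0        # temperature

configs = list(product([-1, +1], repeat=N))

def energy(sigma):
    # H(sigma) = -J * sum over nearest-neighbor pairs on the periodic ring
    return -J * sum(sigma[i] * sigma[(i + 1) % N] for i in range(N))

# Boltzmann distribution, Eq. (2)
weights = [exp(-energy(s) / T) for s in configs]
Z = sum(weights)
p = [w / Z for w in weights]

# Gibbs entropy, Eq. (3), and entropy density h, Eq. (4)
H_gibbs = -sum(pi * log2(pi) for pi in p)
h = H_gibbs / N

# Isolated spin entropy and Boltzmann entropy H_B = N * H[sigma_0];
# by translation symmetry every site is equivalent, so marginalize site 0.
p_up = sum(pi for pi, s in zip(p, configs) if s[0] == +1)
H_site = -p_up * log2(p_up) - (1 - p_up) * log2(1 - p_up)
H_B = N * H_site

print(f"h = {h:.4f} bits/spin, H[sigma] = {H_gibbs:.4f}, H_B = {H_B:.4f}")
assert H_gibbs <= H_B + 1e-12   # correlations make Gibbs <= Boltzmann
```

At zero field the single-spin marginal is unbiased, so H_B is N bits exactly, while nearest-neighbor correlations pull the Gibbs entropy below it; the gap is the total correlation mentioned in the Introduction.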
Thus, Boltzmann’s W should be interpreted as The Boltzmann entropy of a statistical mechanical system the size (phase space volume) of the typical set associated assumes the constituent degrees of freedom (here spins) with a particular thermodynamic macrostate. Given this 3 agreement, we focus Gibbs’ approach. Though, as we measures: now note, the contrast between Gibbs’ entropy H[σ] and |L| " # Boltzmann’s HB leads directly to our larger goals. X X R[σ] = p (σ) log p σi σ\ − 2 i i=1 σ∈σ |L| X = H σi σ\i , (7) i=1 and III. DECOMPOSING A SPIN’S THERMODYNAMIC ENTROPY X p (σ) B[σ] = p (σ) log (8) − 2 |L| σ∈σ Y p σi σ\i Since we are particularly interested here in monitoring i=0 |L| the appearance of internal correlations, the difference be- X = H [σ] H σi σ\ .