Comparing Monte Carlo Methods for Finding Ground States of Ising Spin Glasses: Population Annealing, Simulated Annealing, and Parallel Tempering
Wenlong Wang,1, ∗ Jonathan Machta,1, 2, † and Helmut G. Katzgraber3, 4, 2

1 Department of Physics, University of Massachusetts, Amherst, Massachusetts 01003, USA
2 Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, New Mexico 87501, USA
3 Department of Physics and Astronomy, Texas A&M University, College Station, Texas 77843-4242, USA
4 Materials Science and Engineering, Texas A&M University, College Station, Texas 77843, USA

∗ Electronic address: [email protected]
† Electronic address: [email protected]

arXiv:1412.2104v2 [cond-mat.dis-nn] 6 Jul 2015

Population annealing is a Monte Carlo algorithm that marries features from simulated-annealing and parallel-tempering Monte Carlo. As such, it is ideal to overcome large energy barriers in the free-energy landscape while minimizing a Hamiltonian. Thus, population-annealing Monte Carlo can be used as a heuristic to solve combinatorial optimization problems. We illustrate the capabilities of population-annealing Monte Carlo by computing ground states of the three-dimensional Ising spin glass with Gaussian disorder, while comparing to simulated-annealing and parallel-tempering Monte Carlo. Our results suggest that population-annealing Monte Carlo is significantly more efficient than simulated annealing but comparable to parallel-tempering Monte Carlo for finding spin-glass ground states.

I. INTRODUCTION

Spin glasses present one of the most difficult challenges in statistical physics [1]. Finding spin-glass ground states is important in statistical physics because some properties of the low-temperature spin-glass phase can be understood by studying ground states. For example, ground-state energies in different boundary conditions have been used to compute the stiffness exponent of spin glasses [2-4]. More generally, the problem of finding ground states of Ising spin glasses in three and more dimensions is a nondeterministic polynomial-time (NP) hard combinatorial optimization problem [5] and is thus closely related to other hard combinatorial optimization problems [6], such as protein folding [7] or the traveling salesman problem. As such, developing efficient algorithms to find the ground state of a spin-glass Hamiltonian, as well as of related problems that fall into the class of "quadratic unconstrained binary optimization problems," represents an important problem across multiple disciplines.

Many generally applicable computational methods have been developed to solve hard combinatorial optimization problems. Exact algorithms that efficiently explore the tree of system states include branch-and-cut algorithms [8]. Heuristic methods include genetic algorithms [9, 10], particle swarm optimization [11], and extremal optimization [12, 13]. The focus of this paper is on heuristic Monte Carlo methods based on thermal annealing approaches. In particular, we studied simulated annealing [14], parallel-tempering Monte Carlo [15-17], and population-annealing Monte Carlo [18]. The first two methods are well known and have been successfully applied to minimize Hamiltonians, while the third has been much less widely used in statistical physics, and a primary purpose of this paper is to introduce population annealing as an effective method for finding ground states of frustrated disordered spin systems.

Population-annealing Monte Carlo [18-22] is closely related to simulated annealing and also shares some similarities with parallel tempering. Both simulated annealing and population annealing involve taking the system through an annealing schedule from high to low temperature. Population annealing makes use of a population of replicas of the system that are simultaneously cooled and, at each temperature step, the population is resampled so it stays close to the equilibrium Gibbs distribution. The resampling step plays the same role as the replica-exchange step in parallel-tempering Monte Carlo. On the other hand, population annealing is an example of a sequential Monte Carlo algorithm [23], while parallel tempering is a Markov-chain Monte Carlo algorithm.

In a recent large-scale study [22], we have thoroughly tested population-annealing Monte Carlo against the well-established parallel-tempering Monte Carlo method. Simulations of thousands of instances of the Edwards-Anderson Ising spin glass show that population-annealing Monte Carlo is competitive with parallel-tempering Monte Carlo for doing large-scale spin-glass simulations at low but nonzero temperatures where thermalization is difficult. Not only is population-annealing Monte Carlo competitive with parallel-tempering Monte Carlo when it comes to speed and statistical errors, it has the added benefits that the free energy is readily accessible, multiple boundary conditions can be simulated at the same time, the position of the temperatures in the anneal schedule does not need to be tuned with as much care as in parallel tempering, and it is trivially parallelizable on multi-core architectures.

It is well known that parallel tempering is more efficient at finding spin-glass ground states than simulated annealing [24, 25] because parallel tempering is more efficient at overcoming free-energy barriers. Here we find that population annealing is comparably efficient to parallel-tempering Monte Carlo and, thus, also more efficient than simulated annealing.
Nonetheless, because of the strong similarities between population annealing and simulated annealing, a detailed comparison of the two algorithms is informative and sheds light on the importance of staying near equilibrium, even for heuristics designed to find ground states.

The outline of the paper is as follows. We first introduce our benchmark problem, the Edwards-Anderson Ising spin glass, and the population-annealing algorithm in Sec. II. We then study the properties of population annealing for finding ground states of the Edwards-Anderson model and compare population annealing with simulated annealing in Sec. III B. We conclude by comparing the efficiency of population annealing and parallel tempering in Sec. IV and present our conclusions in Sec. V.

II. MODELS AND METHODS

A. The Edwards-Anderson model

The Edwards-Anderson (EA) Ising spin-glass Hamiltonian is defined by

H = -\sum_{\langle ij \rangle} J_{ij} s_i s_j ,    (1)

where s_i = ±1 are Ising spins on a d-dimensional hypercubic lattice with periodic boundary conditions of size N = L^3. The summation ⟨ij⟩ is over all nearest-neighbor pairs. The couplings J_ij are independent Gaussian random variates with mean zero and variance one. For Gaussian disorder, with probability one, there is a unique pair of ground states for any finite system. We call a realization of the couplings {J_ij} a sample. Here we study the three-dimensional (3D) EA model.
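For concreteness, the model of Eq. (1) can be set up in a few lines of Python. This is only an illustrative sketch, not the authors' code: the helper names (`make_sample`, `energy`) and the convention of storing the +x, +y, +z couplings of each site in a `(3, L, L, L)` array are our own assumptions.

```python
import numpy as np

def make_sample(L, rng):
    """One disorder sample {J_ij}: independent Gaussian couplings
    (mean zero, variance one) on the +x, +y, +z bonds of an
    L x L x L periodic lattice, so each bond is stored exactly once."""
    return rng.standard_normal(size=(3, L, L, L))

def energy(spins, J):
    """EA energy H = -sum_<ij> J_ij s_i s_j of Eq. (1); np.roll
    supplies the periodic (wrapped) neighbor along each axis."""
    E = 0.0
    for axis in range(3):
        E -= np.sum(J[axis] * spins * np.roll(spins, -1, axis=axis))
    return float(E)

rng = np.random.default_rng(0)
L = 4
J = make_sample(L, rng)                                # one sample {J_ij}
spins = rng.choice(np.array([-1, 1]), size=(L, L, L))  # one random replica
E0 = energy(spins, J)
```

Note that `energy(spins, J)` is invariant under a global spin flip, reflecting the unique *pair* of ground states mentioned above.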
B. Population Annealing and Simulated Annealing

Population annealing (PA) and simulated annealing (SA) are closely related algorithms that may be used as heuristics to find ground states of the EA model. Both algorithms change the temperature of a system through an annealing schedule from a high temperature, where the system is easily thermalized, to a sufficiently low temperature, where there is a significant probability of finding the system in its ground state. At each temperature in the annealing schedule, N_S sweeps of a Markov-chain Monte Carlo algorithm are applied at the current temperature.

The resampling step in PA keeps the population close to the Gibbs distribution as the temperature is lowered by differentially reproducing replicas of the system depending on their energy. Lower-energy replicas may be copied several times and higher-energy replicas eliminated from the population. Consider a temperature step in which the temperature T = 1/β is lowered from β to β′, where β′ > β. The reweighting factor required to transform the Gibbs distribution from β to β′ for replica i with energy E_i is e^{-(β′-β)E_i}. The expected number of copies τ_i(β, β′) of replica i is proportional to the reweighting factor,

\tau_i(\beta, \beta') = \frac{e^{-(\beta'-\beta)E_i}}{Q(\beta, \beta')} ,    (2)

where Q is a normalization factor that keeps the expected population size fixed,

Q(\beta, \beta') = \frac{1}{R} \sum_{i=1}^{R_\beta} e^{-(\beta'-\beta)E_i} .    (3)

Here R_β is the actual population size at inverse temperature β. A useful feature of PA is that the absolute free energy can be easily and accurately computed from the sum of the logarithm of the normalizations Q at each temperature step. Details can be found in Ref. [19].

Resampling is carried out by choosing a number of copies to make for each replica in the population at β′. There are various ways to choose the integer number of copies n_i(β, β′) having the correct (real) expectation τ_i(β, β′). The population size can be fixed (R_β = R) by using multinomial resampling [20] or residual resampling [26]. Here we allow the population size to fluctuate slightly and use nearest-integer resampling: we let the number of copies be n_i(β, β′) = ⌊τ_i(β, β′)⌋ with probability ⌈τ_i(β, β′)⌉ − τ_i(β, β′), and n_i(β, β′) = ⌈τ_i(β, β′)⌉ otherwise. Here ⌊x⌋ is the greatest integer less than or equal to x and ⌈x⌉ is the least integer greater than or equal to x. This choice ensures that the mean of n_i(β, β′) is τ_i(β, β′) and that its variance is minimized.
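Equations (2) and (3) and the nearest-integer rounding rule translate directly into code. The sketch below is ours, not the authors' implementation; for simplicity it takes the target population size R to equal the current size, and it shifts the energies by their minimum before exponentiating (the shift cancels in the ratio of Eq. (2) and only guards against overflow).

```python
import numpy as np

def expected_copies(E, beta, beta_p):
    """tau_i of Eq. (2): reweighting factor exp(-(beta'-beta) E_i)
    divided by the normalization Q of Eq. (3), with R = len(E)."""
    w = np.exp(-(beta_p - beta) * (E - E.min()))  # shifted to avoid overflow
    return w / w.mean()                           # tau_i = w_i / Q

def nearest_integer_resample(tau, rng):
    """Nearest-integer resampling: n_i = floor(tau_i), or ceil(tau_i)
    with probability tau_i - floor(tau_i), so that <n_i> = tau_i."""
    lo = np.floor(tau)
    n = lo + (rng.random(tau.shape) < (tau - lo))
    return n.astype(int)

rng = np.random.default_rng(1)
E = rng.standard_normal(1000)                     # stand-in replica energies
tau = expected_copies(E, beta=1.0, beta_p=1.2)
n = nearest_integer_resample(tau, rng)            # copies of each replica
```

Because each n_i is drawn independently, the realized population size fluctuates slightly around R, exactly as described above.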
The resampling step ensures that the new population is representative of the Gibbs distribution at β′, although for a finite population R, biases are introduced because the low-energy states are not fully sampled. In addition, the population is now correlated due to the creation of multiple copies. Both of these problems are partially corrected by carrying out Metropolis sweeps, and the undersampling of low-energy states is reduced by increasing R. Indeed, for PA, which is an example of a sequential Monte Carlo method [23], systematic errors are eliminated
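The Metropolis sweeps that decorrelate the population between resampling steps can be sketched as follows. The sequential single-spin-flip update shown here is a standard choice rather than a detail taken from the paper, and the helper names are our assumptions; flipping s_i costs ΔE = 2 s_i h_i, where h_i = Σ_j J_ij s_j is the local field from the six neighbors.

```python
import numpy as np

def energy(spins, J):
    """EA energy of Eq. (1) on a periodic L^3 lattice; J has shape
    (3, L, L, L), holding the +x, +y, +z couplings of each site."""
    return float(-sum(np.sum(J[a] * spins * np.roll(spins, -1, axis=a))
                      for a in range(3)))

def metropolis_sweep(spins, J, beta, rng):
    """One sequential pass over the lattice: each flip is accepted
    with probability min(1, exp(-beta dE)).  Negative indices wrap,
    which implements the periodic boundary conditions."""
    L = spins.shape[0]
    for x in range(L):
        for y in range(L):
            for z in range(L):
                h = (J[0, x, y, z] * spins[(x + 1) % L, y, z]
                     + J[0, x - 1, y, z] * spins[x - 1, y, z]
                     + J[1, x, y, z] * spins[x, (y + 1) % L, z]
                     + J[1, x, y - 1, z] * spins[x, y - 1, z]
                     + J[2, x, y, z] * spins[x, y, (z + 1) % L]
                     + J[2, x, y, z - 1] * spins[x, y, z - 1])
                dE = 2.0 * spins[x, y, z] * h
                if dE <= 0.0 or rng.random() < np.exp(-beta * dE):
                    spins[x, y, z] = -spins[x, y, z]

rng = np.random.default_rng(2)
L, beta, NS = 4, 3.0, 20
J = rng.standard_normal(size=(3, L, L, L))
spins = rng.choice(np.array([-1, 1]), size=(L, L, L))
E_start = energy(spins, J)
for _ in range(NS):                     # N_S sweeps at the current temperature
    metropolis_sweep(spins, J, beta, rng)
E_end = energy(spins, J)
```

In a full PA step one would apply N_S such sweeps to every replica at β′ immediately after resampling, which is also what makes the algorithm trivially parallelizable over the population.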