
Entropy 2016, 18, 309; doi:10.3390/e18080309

Article

Potential of Entropic Force in Markov Systems with Nonequilibrium Steady State, Generalized Gibbs Function and Criticality

Lowell F. Thompson 1,2,*,† and Hong Qian 1,*,†

1 Department of Applied Mathematics, University of Washington, Seattle, WA 98195, USA
2 Pacific Northwest National Laboratory, 902 Battelle Blvd, Richland, WA 99352, USA
* Correspondence: [email protected] (L.F.T.); [email protected] (H.Q.); Tel.: +1-206-543-2584 (H.Q.)
† These authors contributed equally to this work.

Academic Editors: Hermann Haken and Juval Portugali
Received: 3 May 2016; Accepted: 15 August 2016; Published: 18 August 2016

Abstract: In this paper, we revisit the notion of the “minus logarithm of stationary probability” as a generalized potential in nonequilibrium systems and attempt to illustrate its central role in an axiomatic approach to stochastic nonequilibrium thermodynamics of complex systems. It is demonstrated that this quantity arises naturally both through monotonicity results of Markov processes and as the rate function when a stochastic process approaches a deterministic limit. We then undertake a more detailed mathematical analysis of the consequences of this quantity, culminating in a necessary and sufficient condition for the criticality of stochastic systems. This condition is then discussed in the context of recent results about criticality in biological systems.

Keywords: nonequilibrium steady states; stochastic nonequilibrium thermodynamics; generalized potentials; entropy

1. Introduction

This is part II of a series on stochastic nonlinear dynamics of complex systems. Part I [1] presents a chemical reaction kinetic perspective on complex systems in terms of a mesoscopic stochastic nonlinear kinetic approach (e.g., Delbrück–Gillespie processes) as well as a stochastic nonequilibrium thermodynamics (stoc-NET) in phase space.
One particularly important feature of the theory in [1] is that it takes the abstract mathematical concepts seriously; that is, it follows what the mathematics tells us [2]. For example, it was shown that the widely employed local equilibrium assumption in the traditional macroscopic theory of NET can be eliminated when one recognizes the fine distinction between the set of random events, the S in a probability space (S, F, P), and a random variable defined as an observable on top of the measurable space, x : S → R. The local equilibrium assumption is needed only when one applies the phase space stoc-NET to physically measurable transport processes [3].

The same chemical kinetic approach can be applied to other biological systems. Biological organisms are complex systems with a large number of heterogeneous constituents, which can be thought of as “individuals”. To develop a scientific theory with any predictive power for such a complex system, one must use a probabilistic treatment that classifies the individuals into “statistically identical groups” [4–6]. Thermodynamics and statistical mechanics provide a powerful conceptual framework, as well as a set of tools with which one can comprehend and analyze these systems. The fully developed statistical thermodynamic theory taught in college physics classes is mainly a theory of equilibrium systems. The application of its fundamental ideas, however, is not limited to equilibrium systems or molecular processes. Stoc-NET [3,7–10], along with the information theoretical approach [6,11–14], is a further development in this area.

One of the key elements of the theory presented in [1] was the nonequilibrium steady state (NESS) potential, or “energy”, defined as the minus logarithm of the stationary probability distribution of a kinetic model.
In the past, this quantity has appeared repeatedly in the literature [15–19], but most of the studies focus on its computation. In this paper, we attempt to illustrate its central role as a novel “law of force”, a necessary theoretical element in the stoc-NET of complex systems. Once this connection between energy and probability is established, it is possible to formally define probabilistic quantities analogous to other physical variables such as temperature and entropy. (Notice that the term “entropy” is somewhat overloaded. In the context of probability distributions, it typically refers to the Shannon entropy S = −∫ p(x) ln p(x) dx. In statistical physics, it is more often used to refer to the Gibbs or Boltzmann entropy, both of which are defined in terms of the volume of some region in the phase space of a Hamiltonian system. These various definitions are related but not equivalent. In this work, particularly in Section 4.1, we define analogues of Gibbs and Boltzmann entropies for probability distributions.)

In particular, we extend the notion of critical temperatures to the realm of stationary stochastic processes and find a necessary and sufficient condition for the existence of such criticalities. Loosely speaking, at low temperatures, the dynamics of a stochastic process are dominated by energy considerations and become nearly deterministic (i.e., the system is almost always in a ground state). At high temperatures, the dynamics are dominated by entropic considerations and become nearly uniform (i.e., the system traverses all states, regardless of energy). The former occurs for any stationary process, and the rate of approach is given by the energy. In contrast, entropic effects typically dominate only at infinite temperature, but some systems can reach uniformity at a finite temperature. We define such a temperature as critical.
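The two limits described above can be illustrated with a minimal numerical sketch. Here the five-state distribution, the variable names, and the specific β values are all hypothetical illustrations, not quantities from the paper: we take a toy stationary distribution p, define the generalized potential U(x) = −ln p(x), and power-scale the distribution as p(x)^β ∝ exp(−βU(x)), with β playing the role of inverse temperature.

```python
import numpy as np

# Hypothetical stationary distribution over five discrete states (illustrative only).
p = np.array([0.5, 0.25, 0.15, 0.07, 0.03])

# Generalized potential: "energy" as minus log stationary probability.
U = -np.log(p)

def p_beta(beta):
    """Power-scaled distribution p(x)**beta, renormalized.

    beta acts as an inverse temperature, since exp(-beta * U) = p**beta.
    """
    w = np.exp(-beta * U)
    return w / w.sum()

# Low temperature (large beta): mass concentrates on the ground state,
# i.e., the state minimizing U; the dynamics look nearly deterministic.
low_T = p_beta(50.0)

# High temperature (beta -> 0): the distribution approaches uniform;
# entropic considerations dominate and all states are visited alike.
high_T = p_beta(0.0)
```

In this sketch the uniform limit is only reached as β → 0 (infinite temperature); the paper's criticality condition concerns distributions that reach uniformity already at a finite temperature.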
Note that we are not presenting an alternative to the existing statistical mechanical literature on criticality and phase transition. Instead, we are attempting to generalize these notions from statistical mechanics to a much broader context where the concept of criticality does not yet exist. One can certainly craft stationary distributions from an equilibrium statistical mechanics problem and apply our theory, but this will not produce results that differ from classical approaches.

The paper is organized as follows: In Section 2, we provide a brief historical review of the use of the negative logarithm of a stationary probability distribution as an energy potential. In Section 2.1, we first look at the history of applying minus-log-probability to equilibrium chemical thermodynamics and briefly review Kirkwood’s fundamental idea of the potential of mean force and the notion of entropic force. In Section 2.2, we describe two recent results identifying the minus-log-probability as “energy”: a self-contained and consistent mesoscopic stoc-NET [20], and a precise agreement between its macroscopic limit and Gibbs’ theory [21,22]. These two results provide strong evidence for the validity of such an identification. In Section 2.3, we discuss, from a mathematical standpoint, the legitimacy and centrality of the stationary distribution in the “entropy inequality” for a Markov process. In Section 3, we propose a definition of the “corresponding deterministic dynamics” of a stochastic process using power-scaling of probability densities, and we show that the rate of convergence to this corresponding deterministic process coincides with the minus-log-probability definition of energy. With the justifications given in Sections 2 and 3, we carry out a more detailed analysis of such a probability distribution in Section 4. In Section 4.1, terms analogous to Boltzmann’s and Gibbs’ entropy are defined, along with their corresponding microcanonical partition functions.
We also discuss the relative merits of these definitions. In Section 4.2, we prove that the system has a critical temperature if and only if the Gibbs’ entropy of the system is asymptotic to the energy. In Section 4.3, we discuss several example distributions in order to emphasize some subtleties in the choice of distribution and to illustrate the connection between this theory and equilibrium statistical mechanics. Finally, in Section 5, the ideas from previous sections are related to some recent results on biological systems.

2. A Novel Law of Force: Potential of Entropic Force

In Boltzmann’s statistical mechanics, phenomenological thermodynamics is given a Newtonian mechanical basis. Based on the already well developed concepts of mechanical energy and its conservation, Boltzmann [23] derived the relation

p_eq(x) ∝ e^(−U(x)/k_B T), (1)

where U(x) is the mechanical energy of a microstate x and p_eq(x) is the probability of state x when the system is in thermal equilibrium, a concept which had also already been well established in thermodynamics via the notion of quasi-stationary processes. (It is important to distinguish between a mechanical microstate and a thermodynamic state. A thermodynamic state is a state of recurrent motion, defined by an entire level set A = {x | U(x) = E}. Thus, Boltzmann [23] also introduced his celebrated entropy S_B(E) = k_B ln W(E), where S_B is the entropy and W(E) is the number of microstates consistent with a given energy E. That is, W(E) is the cardinality of A. In terms of E, then p_eq(E) ∝ W(E) e^(−E/k_B T) = e^(−[E − T S_B(E)]/k_B T).) In a thermodynamic equilibrium, there is no net transport of any kind. (In the thermodynamics before Gibbs, macroscopic transport processes were driven by either a temperature or a pressure gradient in the three-dimensional physical space. In Gibbs’ macroscopic chemical thermodynamics, a chemical equilibrium has no net flux in the
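The relations in Equation (1) and the footnote above can be checked numerically. This is a sketch on a hypothetical set of microstate energies (the values, and the choice of units with k_B = 1 and T = 1, are illustrative, not from the paper): it builds the Boltzmann weights, groups microstates into thermodynamic states as level sets of the energy, and verifies that p_eq(E) ∝ W(E) e^(−E/T) equals e^(−[E − T S_B(E)]/T).

```python
import numpy as np
from collections import Counter

# Hypothetical microstate energies, in units where k_B = 1 (illustrative only).
energies = np.array([0.0, 0.0, 1.0, 1.0, 1.0, 2.0])
T = 1.0

# Microstate (Boltzmann) weights and partition function: p_eq(x) ∝ e^(-U(x)/T).
w = np.exp(-energies / T)
Z = w.sum()
p_micro = w / Z

# Thermodynamic states are level sets of the energy:
# W(E) counts microstates with energy E (the cardinality of A = {x | U(x) = E}).
W = Counter(energies)

# Boltzmann entropy S_B(E) = ln W(E)  (k_B = 1).
S_B = {E: np.log(n) for E, n in W.items()}

# Probability of a thermodynamic state: p_eq(E) ∝ W(E) e^(-E/T).
p_level = {E: n * np.exp(-E / T) / Z for E, n in W.items()}

# Equivalent form p_eq(E) ∝ e^(-[E - T*S_B(E)]/T), since e^(S_B) = W(E).
p_level_alt = {E: np.exp(-(E - T * S_B[E]) / T) / Z for E in W}
```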