Thermodynamic derivation of the fluctuation theorem and Jarzynski equality

MAARTEN H. P. AMBAUM*
Department of Meteorology, University of Reading, United Kingdom

*Email: [email protected]

Abstract

A thermodynamic expression for the analog of the canonical ensemble for nonequilibrium systems is described, based on a purely information-theoretical interpretation of entropy. It is shown that this nonequilibrium canonical distribution implies some important results from nonequilibrium thermodynamics, specifically the fluctuation theorem and the Jarzynski equality. Those results are therefore expected to be more widely applicable, for example, to macroscopic systems.

1 Introduction

The derivations of the fluctuation theorem [1, 2] and the Jarzynski equality [3] appear to depend on the underlying microscopic Hamiltonian dynamics. From this it would follow that these theorems are only relevant to microscopic systems, with their associated definitions of entropy and temperature. In contrast, a statistical mechanical description of macroscopic systems often depends on more general forms of entropy, primarily information entropy [4, 5, 6]. Two notable examples from fluid dynamics are the statistical mechanics of point vortices [7] and the statistical mechanics of two-dimensional incompressible flows [8]. In such cases, 'temperature' is defined in terms of the change of entropy with the energy of the system [9] or, equivalently, in terms of the Lagrange multiplier for the energy under the maximization of entropy at a given expectation value of the energy [10].

The question is whether for such macroscopic systems we can derive a fluctuation theorem or Jarzynski equality. This is of particular importance for climate science, as there are strong indications that the global state of the climate system and, more generally, other components of the Earth system may be governed by thermodynamic constraints on entropy production [11, 12, 13, 14, 15]. The theoretical underpinning of those thermodynamic constraints is still lacking. The presence of a fluctuation theorem for such systems would be of great importance.

Here we demonstrate that the information-theoretical definition of entropy implies the fluctuation theorem and the Jarzynski equality. It is shown that these results are due to the counting properties of entropy rather than the dynamics of the underlying system. As such, both these results are applicable to a much wider class of problems, specifically, macroscopic systems for which we can define an entropy and which are thermostatted in some general sense.

The central tenet is that for two states A and B of a system, defined by two sets of macroscopic parameters, the ratio of the probabilities p_B/p_A for the system to be in either state is

    p_B / p_A = exp(Δ_AB S / k),    (1)

with Δ_AB S the difference in entropy between the states A and B. This is essentially the Boltzmann definition of entropy: entropy is a counting property of the system. The theoretical background can be found in [10], where it is shown that this information-theoretical interpretation reproduces the statistical mechanics based on Gibbs entropy, but furthermore gives a justification of the Gibbs formulation as a statistical inference problem under limited knowledge of the system.

Of note is that the entropy only has meaning in relation to the macroscopic constraints on the system (indicated by the subscripts A and B above), constraints which can be arbitrarily complex and prescriptive, as may be needed for systems far from equilibrium. In an information-theoretical setting, the above definition of entropy is equivalent to the principle of indifference: the absence of any distinguishing information between microscopic states within any of the macroscopic states A or B is equivalent to equal prior (prior to obtaining additional macroscopic constraints) probabilities for the microscopic states [16]. Note also that we do not need to specify precisely at this point how the states are counted, or how an invariant measure can be defined on the phase space confined by A or B. The principle of indifference does not imply that all states are assumed equally probable; it is a statement that we cannot a priori assume a certain structure in phase space (such as a precisely defined invariant measure) in the absence of further information. The principle of indifference is not a statement about the structure of phase space; it is a principle of statistical inference, and it is the only admissible starting point from an information-theoretical point of view.

The above definition of entropy also ensures that entropy is an extensive property, such that for two independent systems considered together the total entropy is the sum of the individual entropies, S = S_1 + S_2. The Boltzmann constant k ensures dimensional compatibility with the classical thermodynamic entropy when the usual equilibrium assumptions are made [10, 17].

2 A general form for the canonical ensemble

Following Boltzmann, we define the entropy S_A as the logarithm of the number of states accessible to a system under given macroscopic constraints A. For an isolated system, the entropy is related to the size Φ_A of the accessible phase space,

    S_A = k ln Φ_A.    (2)

For a classical gas system, A is defined by the energy U, volume V, and molecule number N; the phase-space size Φ_A is the hyper-area of the energy shell, and it defines the usual microcanonical ensemble. For more complicated systems, where A may include several macroscopic order parameters, the energy shell becomes more confined; in the following we will still refer to the accessible phase space under constraints A as the energy shell. The hyper-area Φ_A is non-dimensionalised such that Φ_A(U) dU is proportional to the number of states between energies U and U + dU. We will not consider other multiplicative factors which make the argument of the logarithm non-dimensional; these contribute an additive entropy constant which will not be of interest to us here. Note also that the microcanonical ensemble does not include a notion of equilibrium: the system is assumed to be insulated, so it cannot equilibrate with an external system. It just moves around on the energy shell (defined by A), and the principle of indifference implies that all states, however improbable from a macroscopic point of view, are members of the ensemble. Of course, the number of unusual states (say, with non-uniform macroscopic properties not defined by A) is much lower than the number of regular states (say, with uniform macroscopic density) for macroscopic systems. Only for small systems does the distinction become important, but it does not invalidate the above formal definition of entropy.

The hyper-area of the energy shell, and thus the entropy, can be a function of several variables which are set as external constraints, such as the total energy U, system volume V, or particle number N for a simple gas system. For the canonical ensemble we consider a system that can exchange energy with some reservoir. We consider here only a theoretical canonical ensemble, in that we take the coupling between the two systems to be weak, such that the interaction energy vanishes compared to the relevant energy fluctuations in the system.

First, we need to define what a reservoir is. Following equilibrium thermodynamics, we formally define an 'inverse temperature' β = (kT)⁻¹ as

    β = k⁻¹ ∂S/∂U = Φ⁻¹ ∂Φ/∂U.    (3)

We make no claim about the equality of β and the classical equilibrium inverse temperature; β is the expansivity of phase space with energy and as such can be defined for any system, whether it is in thermodynamic equilibrium or not. When an isolated system is prepared far from equilibrium (for example, when it has a local equilibrium temperature which varies over the system), then β is still uniquely defined for the system as a non-local property of the energy shell that the system resides on. Because both energy and entropy in the weak-coupling limit are extensive quantities, β must be an intensive quantity.

Now consider a large isolated system R with total (internal) energy U_R. Let this system receive energy U′ from the environment. By expanding its entropy S_R in powers of U′, we can then write the entropy of this large system as

    S_R(U_R + U′) = S_R(U_R) + k U′ (β + (1/2) (∂β/∂U) U′ + O(U′²)).    (4)

We see that for finite U′, (∂β/∂U)⁻¹ has to be an extensive quantity. But that means that for a very large system ∂β/∂U = O(N⁻¹), where N is a measure of the size of the system (such as particle number). A reservoir is a system so much larger than the system of interest that these higher-order terms can be neglected:

    S_R(U_R + U′) = S_R(U_R) + kβU′.    (5)

Now let the system of interest take up energy U from the reservoir, so that the reservoir gives up energy U. We can write the hyper-area of the energy shell of the system, Φ_0, as a function of U. The total entropy of the system plus reservoir R can then be written as a function of the exchange energy U as

    S = S_0(U) + S_R(U_R) − kβU,    (6)

with S_0 = k ln Φ_0. The number of states at each level of exchange energy is therefore proportional to

    Φ(U) ∝ Φ_0(U) exp(−βU),    (7)

where we have omitted proportionality constants related to the additive entropy constants. Nowhere do we assume that the system is in equilibrium with the reservoir. This means that Φ(U) is the relevant measure to construct an ensemble average for the system, even for far-from-equilibrium systems. Even the reservoir can be locally out of equilibrium, as discussed above. We have also made no reference to the size of the system of interest, as long as it is much smaller than the reservoir. However, in contrast to systems in thermodynamic equilibrium, there is no guarantee
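The nonequilibrium canonical measure of Eq. (7), Φ(U) ∝ Φ_0(U) exp(−βU), can be used directly to build ensemble averages. As a minimal numerical sketch (the power-law form Φ_0(U) = U^α is an assumed toy choice, not taken from the paper): with this choice the weight is a gamma density, so the mean exchange energy should come out as ⟨U⟩ = (α + 1)/β, which a direct quadrature confirms.

```python
import numpy as np

# Assumed toy form for the system's energy-shell area: Phi_0(U) = U**alpha
alpha, beta = 3.0, 2.0

# Grid wide enough that the exp(-beta*U) tail beyond it is negligible
U = np.linspace(0.0, 50.0, 200_001)
w = U**alpha * np.exp(-beta * U)        # Phi(U) ∝ Phi_0(U) exp(-beta*U), Eq. (7)

# Ensemble average of U under the measure Phi(U); the uniform grid
# spacing cancels in the ratio of the two Riemann sums.
mean_U = (U * w).sum() / w.sum()

# For Phi_0 = U**alpha the weight is a Gamma(alpha+1, beta) density,
# whose mean is (alpha + 1) / beta = 2.0 here.
assert np.isclose(mean_U, (alpha + 1) / beta, rtol=1e-4)
```

Any other observable average over the system can be formed the same way, with Φ(U) as the weight; no equilibrium assumption enters beyond the weak coupling to the reservoir.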

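Equation (3) defines β as the expansivity of phase space with energy, with no appeal to equilibrium. A quick numerical check of this definition, again assuming an ideal-gas-like energy-shell area Φ(U) ∝ U^α purely for illustration: differentiating ln Φ recovers β = α/U, and working with ln Φ rather than Φ avoids floating-point overflow for large α.

```python
import numpy as np

# Hypothetical energy-shell area Phi(U) ~ U**alpha; alpha plays the
# role of ~3N/2 for an ideal gas of N = 1000 particles.
alpha = 1500.0
U = np.linspace(1.0, 10.0, 100_001)
logPhi = alpha * np.log(U)            # ln Phi, to avoid overflow of U**1500

# Eq. (3): beta = (1/k) dS/dU = d(ln Phi)/dU, the expansivity of
# phase space with energy, evaluated by finite differences.
beta_numeric = np.gradient(logPhi, U)
beta_exact = alpha / U

assert np.allclose(beta_numeric, beta_exact, rtol=1e-4)
```

Note that nothing in this computation asks whether the system is in equilibrium; β is simply a property of how the accessible phase space grows with energy.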

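Equation (1) rests on entropy being a counting property. A toy demonstration (the coin ensemble is an illustration of mine, not the paper's): take N coins with equally probable microstates, let the macrostate be the number of heads n, and set S(n) = k ln C(N, n). The ratio of macrostate probabilities is then exactly exp(Δ_AB S / k).

```python
from math import comb, log, exp, isclose

k = 1.380649e-23          # Boltzmann constant, J/K

N = 50                    # 50 coins; each of the 2**N microstates is equally probable

def S(n):
    """Boltzmann entropy of the macrostate 'n heads': S = k ln(number of microstates)."""
    return k * log(comb(N, n))

nA, nB = 20, 25
pA = comb(N, nA) / 2**N   # probability of macrostate A
pB = comb(N, nB) / 2**N   # probability of macrostate B

# Eq. (1): p_B / p_A = exp((S_B - S_A) / k), exact here by construction
assert isclose(pB / pA, exp((S(nB) - S(nA)) / k), rel_tol=1e-9)
```

The identity is exact because both sides reduce to the same ratio of state counts; this is the sense in which Eq. (1) is "essentially the Boltzmann definition of entropy".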