Thermodynamics of Markov Processes with Non-Extensive Entropy and Free Energy

Liangrong Peng [1,2], Hong Qian [3], Liu Hong [2,3] a)

[1] College of Mathematics and Data Science, Minjiang University, Fuzhou, 350108, P.R.C.
[2] Zhou Pei-Yuan Center for Applied Mathematics, Tsinghua University, Beijing, 100084, P.R.C.
[3] Department of Applied Mathematics, University of Washington, Seattle, WA 98195-3925, U.S.A.

(Dated: 1 April 2020)
arXiv:1005.1251v3 [math-ph] 31 Mar 2020

Statistical thermodynamics of small systems shows dramatic differences from that of normal systems. Parallel to the recently presented steady-state thermodynamic formalism for the master equation and the Fokker-Planck equation, we show that a "thermodynamic" theory can also be developed based on Tsallis' generalized entropy S^{(q)} = \sum_{i=1}^{N} (p_i - p_i^q)/[q(q-1)] and Shiino's generalized free energy F^{(q)} = [\sum_{i=1}^{N} p_i (p_i/\pi_i)^{q-1} - 1]/[q(q-1)], where \pi_i is the stationary distribution. dF^{(q)}/dt = -f_d^{(q)} \le 0, with equality if and only if the system is in its stationary state. dS^{(q)}/dt - Q_{ex}^{(q)} = f_d^{(q)}, where Q_{ex}^{(q)} characterizes the heat exchange. For systems approaching equilibrium with detailed balance, f_d^{(q)} is the product of Onsager's thermodynamic flux and force. However, it is discovered that the Onsager force is non-local. This is a consequence of the particular transformation invariance for zero energy of Tsallis' statistics.

Keywords: Tsallis Thermodynamics, Non-equilibrium Steady-State, Master Equation, Fokker-Planck Equation

a) Author to whom correspondence should be addressed. Electronic mail: [email protected], [email protected]

1. INTRODUCTION

The Boltzmann-Gibbs (B-G) statistical thermodynamics is one of the fundamental theories of modern physics, with wide applications in chemistry, biology, and even social sciences. Recently, a complete statistical thermodynamic formalism based on both the B-G entropy and the B-G free energy (relative entropy) has been presented for stochastic dynamics described by master equations [1, 2] and Fokker-Planck equations [3, 4]. The theory introduces several key quantities: the entropy production rate, the free energy dissipation rate, the excess and house-keeping heat, etc.

Statistical thermodynamics of small systems requires the traditional thermodynamic quantities, such as energy and entropy, to be non-extensive [5]. This extension of the classical B-G entropy provides a framework to study systems with long-range interactions, multi-fractal structures, long-term memory, or systems whose interaction with macroscopic parts cannot be ignored [6]. It is therefore natural to ask whether the thermodynamic formalism can be developed in parallel for non-extensive entropies and free energies, such as the Tsallis entropy [7], the Rényi entropy [8, 9], and many others [6, 10].

Here we focus on the Tsallis entropy, which enjoys the properties of concavity, experimental robustness (Lesche stability), non-extensivity (as a generalization of the B-G entropy), and a finite rate of entropy production [11, 12]. This work shows that a similar structure does exist for Tsallis thermodynamics, which would therefore be of interest to a wide range of applications, such as defect turbulence [13], optical lattices [14], mathematical finance [15], and so on.

Generalized entropy. The notions of Shannon's entropy, in the form -\sum_{i=1}^{N} f_i \ln f_i, and relative entropy, in the form \sum_{i=1}^{N} f_i \ln(f_i/p_i), arise from the asymptotic expressions for the probability of the frequency \{f_i\} within a large set of independent and identically distributed (i.i.d.) random samples, each with equal probability 1/N or with probabilities \{p_i\}, respectively. The exponent I(f_1, \ldots, f_N) in the asymptotic expression e^{-N I(f_1, \ldots, f_N)}, known as the large-deviations rate function, provides the divergence function for an information geometry [16]. When the samples are not independent, the probability of the frequency is no longer a simple multinomial distribution; Tsallis entropy is believed to arise in this context in connection with the q-binomial distribution [17].

The Shannon entropy is intimately related to the logarithmic function y = \ln(x), which is the inverse of the exponential function x = \exp(y). The exponential function is defined through the limit

    \exp(y) = \lim_{r \to 0} (1 + ry)^{1/r}.    (1)

The inverse function of the right-hand side of Eq. (1) is

    y = \frac{x^r - 1}{r}.    (2)

If one takes this as the analogue of y = \ln(x), then, corresponding to Gibbs' entropy -\sum_i p_i \ln p_i, one has a generalization of the entropy:

    -\sum_{i=1}^{N} p_i \, \frac{p_i^r - 1}{r}.    (3)

If we denote r = q - 1, then Eq. (3) becomes

    S^{(q)} = \sum_{i=1}^{N} p_i \, \frac{1 - p_i^{q-1}}{q(q-1)},    (4)

which recovers the Gibbs entropy as q \to 1. This is the non-extensive entropy first proposed by Tsallis [7], and q \in \mathbb{R} (q \ne 0, 1) is known as the non-extensive parameter. Here we have added an additional factor q to the denominator to maintain the definite sign of the free energy and of its dissipation rate, as will be shown later. Because of the relations in Eqs. (1) and (2), the Tsallis entropy enjoys many of the properties of the Gibbs entropy.
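As a quick numerical check of Eq. (4), the following minimal Python sketch (our own illustration, not from the paper; the test distribution is arbitrary) shows S^{(q)} approaching the Gibbs entropy -\sum_i p_i \ln p_i as q \to 1:

    import numpy as np

    def tsallis_entropy(p, q):
        # Eq. (4): S^(q) = sum_i p_i (1 - p_i^(q-1)) / [q(q-1)], valid for q != 0, 1
        return np.sum(p * (1.0 - p ** (q - 1.0))) / (q * (q - 1.0))

    p = np.array([0.5, 0.3, 0.2])              # arbitrary test distribution
    for q in (2.0, 1.5, 1.1, 1.01, 1.001):
        print(f"q = {q:5.3f}   S^(q) = {tsallis_entropy(p, q):.6f}")
    print(f"Gibbs entropy  -sum p ln p = {-np.sum(p * np.log(p)):.6f}")

Note that the additional factor q in the denominator does not affect this limit, since 1/q \to 1 as q \to 1.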
Relative entropy. Recognizing the correspondence between \ln(x) and (x^{q-1} - 1)/[q(q-1)] as q \to 1 [18, 19], a generalization of the relative entropy with respect to a distribution g_i can be similarly introduced [20]:

    H^{(q)}(\{p_i\} \| \{g_i\}) = \frac{1}{q(q-1)} \left[ \sum_{i=1}^{N} p_i \left( \frac{p_i}{g_i} \right)^{q-1} - 1 \right].    (5)

Statistical internal energy and H^{(q)} as the Gibbs free energy. Assuming \{\pi_i\} is a stationary distribution of the master equation in Eq. (12) below, one can define a statistical internal energy by applying Boltzmann's law in reverse [1]:

    \epsilon_i^{(q)} = \frac{\pi_i^{1-q} - 1}{q(q-1)},    (6)

in analogy to the B-G energy \epsilon_i = \ln(1/\pi_i). Then the mean statistical internal energy is

    U^{(q)} = \sum_{i=1}^{N} p_i^q \epsilon_i^{(q)} = \sum_{i=1}^{N} p_i^q \, \frac{\pi_i^{1-q} - 1}{q(q-1)}.    (7)

That the mean internal energy is the sum of p_i^q \epsilon_i^{(q)}, rather than the conventional p_i \epsilon_i, is a peculiar feature of Tsallis statistics. We then have the generalized free energy

    F^{(q)} = U^{(q)} - S^{(q)} = \frac{1}{q(q-1)} \left[ \sum_{i=1}^{N} p_i \left( \frac{p_i}{\pi_i} \right)^{q-1} - 1 \right] = H^{(q)}(\{p_i\} \| \{\pi_i\}).    (8)
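The identity in Eq. (8) is easily confirmed numerically. The sketch below (our own illustration; the value of q is arbitrary and the two distributions are hypothetical rather than derived from a model) evaluates S^{(q)}, U^{(q)}, and H^{(q)} independently from Eqs. (4), (7), and (5), and checks that U^{(q)} - S^{(q)} = H^{(q)}(\{p_i\} \| \{\pi_i\}):

    import numpy as np

    q  = 1.7                                   # arbitrary non-extensive parameter (q != 0, 1)
    p  = np.array([0.5, 0.3, 0.2])             # current distribution
    pi = np.array([0.2, 0.5, 0.3])             # hypothetical stationary distribution

    S = np.sum(p * (1 - p ** (q - 1))) / (q * (q - 1))           # entropy, Eq. (4)
    U = np.sum(p ** q * (pi ** (1 - q) - 1)) / (q * (q - 1))     # internal energy, Eq. (7)
    H = (np.sum(p * (p / pi) ** (q - 1)) - 1) / (q * (q - 1))    # relative entropy, Eq. (5) with g = pi

    print(U - S, H)    # the two values coincide, illustrating Eq. (8)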
Conditional entropy. Tsallis entropy does not enjoy the standard identity for the conditional entropy of a pair of correlated random variables A and B [19, 21]:

    S(A, B) = S(A) + S_A(B),    (9a)

where

    S(A, B) = -\sum_{i,j} p_i q_{ij} \log (p_i q_{ij}),    (9b)
    S(A) = -\sum_i p_i \log p_i,    (9c)
    S_A(B) = -\sum_i p_i \left( \sum_j q_{ij} \log q_{ij} \right).    (9d)

For Tsallis' entropy, one has

    S^{(q)}(A, B) = \frac{1}{q(q-1)} \left[ 1 - \sum_{i,j} (p_i q_{ij})^q \right]
                  = \frac{1}{q(q-1)} \left[ 1 - \sum_i p_i^q + \sum_i p_i^q - \sum_{i,j} (p_i q_{ij})^q \right]
                  = S^{(q)}(A) + \sum_i p_i^q \, \frac{1 - \sum_j q_{ij}^q}{q(q-1)}
                  = S^{(q)}(A) + \sum_i p_i^q S^{(q)}(B | A = i)
                  \le S^{(q)}(A) + \sum_i p_i S^{(q)}(B | A = i) = S^{(q)}(A) + S_A^{(q)}(B)

when q > 1, and the inequality sign is reversed when q < 1. Eq. (9), together with the requirements that the entropy reaches its maximum at the uniform distribution,

    S(\{p_1, p_2, p_3, \ldots, p_N\}) \le S\left( \left\{ \frac{1}{N}, \frac{1}{N}, \frac{1}{N}, \ldots, \frac{1}{N} \right\} \right),    (10)

and that the entropy is invariant under adding an impossible event,

    S(\{p_1, p_2, p_3, \ldots, p_N, 0\}) = S(\{p_1, p_2, p_3, \ldots, p_N\}),    (11)

provides Khinchin's uniqueness theorem for the Gibbs entropy [21]. It is noted that the Tsallis entropy does satisfy both Eqs. (10) and (11). By replacing Eq. (9) with a more general, non-extensive one, Abe has shown that the Tsallis entropy follows uniquely from the generalized version of Khinchin's axioms [22].

This paper is organized as follows. In Secs. 2 and 3, the Tsallis thermodynamics of master equations and of Fokker-Planck (F-P) equations is constructed separately, with emphasis on the correct non-negative form of the house-keeping heat. In Sec. 4, the underlying mathematical connection and correspondence between the two formalisms is explored and justified for both discretized F-P equations and tridiagonal master equations, in the limit of infinitesimal jump step in space. Sec. 5 provides two simple but insightful examples to illustrate the application of Tsallis thermodynamics and its difference from B-G thermodynamics. Finally, we summarize our results in Sec. 6. Proofs of the theorems in Sec. 4 can be found in the appendix for completeness.

2. TSALLIS THERMODYNAMICS OF MASTER EQUATIONS

We now consider a Markov process with the master equation

    \frac{dp_i(t)}{dt} = \sum_{j=1}^{N} (p_j w_{ji} - p_i w_{ij}), \quad i = 1, 2, \ldots, N,    (12)

which has been widely applied in modeling various irreversible processes, including quantum physics, chemical reactions, and open biological systems [23-25].

Relative entropy and positive free energy dissipation. We assume that both \{p_i(t)\} and \{g_i(t)\} satisfy the same master equation, with different initial distributions. Then the time derivative of the relative entropy is

    \frac{d}{dt} H^{(q)}(t) = \frac{1}{q(q-1)} \sum_{i=1}^{N} \left[ q \left( \frac{p_i(t)}{g_i(t)} \right)^{q-1} \frac{dp_i}{dt} - (q-1) \left( \frac{p_i(t)}{g_i(t)} \right)^{q} \frac{dg_i}{dt} \right]
    = \frac{1}{q(q-1)} \sum_{i,j=1}^{N} w_{ji} \left\{ q p_j \left[ \left( \frac{p_i}{g_i} \right)^{q-1} - \left( \frac{p_j}{g_j} \right)^{q-1} \right] - (q-1) g_j \left[ \left( \frac{p_i}{g_i} \right)^{q} - \left( \frac{p_j}{g_j} \right)^{q} \right] \right\}
    = \frac{1}{q(q-1)} \sum_{i,j=1}^{N} \frac{w_{ji} p_j^q}{g_j^{q-1}} \left\{ q \left( \frac{p_i g_j}{g_i p_j} \right)^{q-1} - (q-1) \left( \frac{p_i g_j}{g_i p_j} \right)^{q} - 1 \right\} \le 0.    (13)

The last step holds because w_{ij} \ge 0 for i \ne j, the summand vanishes for i = j, and, for the term in braces,

    \frac{q z^{q-1} - (q-1) z^q - 1}{q(q-1)} = -\frac{(1-z)^2}{q(q-1)} \, \frac{d}{dz} \frac{1 - z^q}{1 - z} \le 0 \quad (z > 0).    (14)

Now, if we take a stationary distribution \{\pi_i\} as the \{g_i\}, we obtain the time derivative of the generalized free energy F^{(q)}:

    \frac{dF^{(q)}}{dt} = \frac{1}{q-1} \sum_{i,j=1}^{N} w_{ji} p_j \left[ \left( \frac{p_i}{\pi_i} \right)^{q-1} - \left( \frac{p_j}{\pi_j} \right)^{q-1} \right] = -\frac{1}{2} \sum_{i,j=1}^{N} \phi_{ij} \Phi_{ij}^{(q)},    (15)

where

    \phi_{ij}(t) = p_i w_{ij} - p_j w_{ji}, \qquad \Phi_{ij}^{(q)}(t) = \frac{(p_i/\pi_i)^{q-1} - (p_j/\pi_j)^{q-1}}{q-1}.    (16)
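To see Eqs. (12) and (15) at work, the following minimal sketch (our own illustration; the three-state transition rates and the initial distribution are hypothetical, chosen only for demonstration) integrates the master equation with a forward Euler step and verifies that F^{(q)}(t) = H^{(q)}(\{p_i\} \| \{\pi_i\}) decays monotonically toward zero, up to the integration error:

    import numpy as np

    # Hypothetical transition rates w[i, j] from state i to state j (i != j).
    w = np.array([[0.0, 2.0, 1.0],
                  [1.0, 0.0, 3.0],
                  [2.0, 1.0, 0.0]])

    def rhs(p):
        # Master equation, Eq. (12): dp_i/dt = sum_j (p_j w_ji - p_i w_ij)
        return w.T @ p - p * w.sum(axis=1)

    # Stationary distribution: normalized null vector of the generator A, with A pi = 0.
    A = w.T - np.diag(w.sum(axis=1))
    pi = np.abs(np.linalg.svd(A)[2][-1])
    pi /= pi.sum()

    def F(p, q):
        # Generalized free energy, Eq. (8): F^(q) = H^(q)({p_i} || {pi_i})
        return (np.sum(p * (p / pi) ** (q - 1)) - 1) / (q * (q - 1))

    q, dt = 0.5, 1e-3
    p = np.array([0.9, 0.05, 0.05])            # arbitrary initial distribution
    values = [F(p, q)]
    for _ in range(5000):
        p = p + dt * rhs(p)                    # forward Euler step
        values.append(F(p, q))
    # The free energy never increases along the trajectory, illustrating Eq. (15).
    assert all(b <= a + 1e-9 for a, b in zip(values, values[1:]))
    print("F^(q) decayed from", values[0], "to", values[-1])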
Markovian dynamics with detailed balance.