arXiv:cond-mat/0409037v2 [cond-mat.stat-mech] 3 Sep 2004
Generalized Entropies and Statistical Mechanics

Fariel Shafee
Department of Physics, Princeton University, Princeton, NJ 08540 USA.∗

We consider the problem of defining the free energy and other thermodynamic functions when the entropy is given as a general function of the probability distribution, including nonextensive forms. We find that the free energy, which is central to the determination of all other quantities, can be obtained uniquely by numerical means even when it is the root of a transcendental equation. In particular we study the cases of the Tsallis form and a new form proposed by us recently. We compare the free energy, the internal energy and the specific heat of a simple system of two energy states for each of these forms.

PACS numbers: 05.70.-a, 05.90.+m, 89.90.+n
Keywords: entropy, nonextensive, probability distribution function, free energy, specific heat

I. INTRODUCTION

We have recently [1] proposed a new form of nonextensive entropy which depends on a parameter, similar to Tsallis entropy [2, 3, 4], and in a similar limit approaches Shannon's classical extensive entropy. We have shown how the definition of this new form of entropy can arise naturally in terms of the mixing of states in a phase cell when the cell is re-scaled, the parameter being a measure of the rescaling, and how Shannon's coding theorem [5] elucidates such an approach.

In this paper we shall adopt a more general attitude and try to develop the statistical mechanics of systems where the entropy is defined almost arbitrarily. Such a 'designer' entropy [6] may indeed be relevant in a specific context, but we shall not justify any specific form here. The applicability of the Tsallis form, which leads to a Levy-type pdf found in many physical contexts, is now well established [7, 8, 9], and in the earlier paper we commented on how our form may be more relevant in a context that demands a stiffer pdf. In this paper we concentrate on the use of the pdf to obtain macroscopic quantities for general entropy functions.

Central to the development of the statistical mechanics of a system is the definition of the free energy, because it is related to the normalization of the probability distribution function, which in turn controls the behavior of all the macroscopic properties of the ensemble. Hence we shall first establish a general prescription for obtaining the free energy, and then determine its value for a simple physical system as a function of the temperature, for both the Tsallis form and the newer form. We shall then use it to get the specific heat in both cases and see how it changes with the parameter at different temperatures.

II. ENTROPY AND PDF

The pdf is found by optimizing the function

    L = S + \alpha \left(1 - \sum_i p_i\right) + \beta \left(U - \sum_i p_i E_i\right)    (1)

where S is the entropy (whichever form), U the internal energy of the ensemble per constituent, p_i the probability of the i-th state, which has energy E_i, and \alpha and \beta are the Lagrange multipliers associated with the normalization of the pdf and the conservation of energy.
Let us make the assumption that the entropy S is a sum of components from all the states:

    S = \sum_i \phi(p_i)    (2)

where \phi is a generalized function which will in general be different from the Shannon form

    S = -\langle \log(p) \rangle \equiv -\sum_i p_i \log(p_i)    (3)

Optimizing L we get

    \phi'(p_i) = \alpha + \beta E_i    (4)

with the simple solution

    \phi(p_i) = (\alpha + \beta E_i)\, p_i    (5)

the constant of integration vanishing because there can be no contribution to the entropy from a state which has zero probability.

∗ Electronic address: [email protected]
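As a quick numerical sanity check of the stationarity condition (4), the sketch below (plain Python; the value of \beta and the three-level spectrum are illustrative assumptions, not taken from the paper) verifies that at the optimizing pdf the quantity \phi'(p_i) - \beta E_i is the same constant \alpha for every state. The Shannon form is used because its optimum, the Gibbs distribution, is known in closed form.

```python
import math

# Illustrative inverse temperature and spectrum (assumed values)
beta = 1.3
E = [-1.0, 0.5, 2.0]

# Gibbs distribution: the known optimum for the Shannon entropy
Z = sum(math.exp(-beta * e) for e in E)
p = [math.exp(-beta * e) / Z for e in E]

def phi(t):
    """Shannon choice phi(p) = -p*log(p) of Eq. (3)."""
    return -t * math.log(t)

def dphi(t, h=1e-6):
    """Central finite-difference approximation to phi'(t)."""
    return (phi(t + h) - phi(t - h)) / (2.0 * h)

# Eq. (4): phi'(p_i) = alpha + beta*E_i, so phi'(p_i) - beta*E_i
# must be one and the same constant alpha for all states i.
alphas = [dphi(p[i]) - beta * E[i] for i in range(len(E))]
print(alphas)  # all entries should agree to finite-difference accuracy
```

For the Shannon case the common constant works out to \log Z - 1, but the check itself uses only the generic condition (4) and would apply to any differentiable \phi.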
III. FREE ENERGY

We shall assume that the Helmholtz free energy A is defined by

    \beta (U - A) = S    (6)

Hence if S is nonextensive and U is extensive, then A must also be nonextensive.

Using Eq. 5 and Eq. 6 we get

    A = -\alpha / \beta    (7)

Hence we get the relation for the pdf

    p_i = \psi^{-1}(\beta (E_i - A))    (8)

with the definition

    \psi(p) = \phi(p) / p    (9)

Here we assume that the function \psi can be inverted. This may not always be the case for an arbitrary expression for the entropy, at least not in a manageable closed form. Finally, A can be obtained from the constraint equation

    \sum_i p_i = \sum_i \psi^{-1}(\beta (E_i - A)) = 1    (10)

Once A has been determined, it may be placed in Eq. 8 to get the pdf p_i for each of the states, all properly normalized, and we can find U and its derivative C, the specific heat.
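The prescription above — invert \psi and solve the constraint (10) for A — can be carried out by one-dimensional root finding whenever \psi^{-1} is positive and decreasing, since the total probability then grows monotonically with A. A minimal bisection sketch follows (plain Python; the helper name `solve_A`, the bracket, and the numerical values are assumptions for illustration), verified against the Shannon case where the closed form A = -\log(Q)/\beta of Eq. (13) is available.

```python
import math

def solve_A(psi_inv, energies, beta, lo=-50.0, hi=50.0):
    """Bisection solver for the free energy A from Eq. (10):
    sum_i psi_inv(beta*(E_i - A)) = 1.
    Assumes psi_inv is positive and decreasing, so the total
    probability increases monotonically with A and the root
    inside the bracket is unique."""
    def total_p(A):
        return sum(psi_inv(beta * (e - A)) for e in energies)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if total_p(mid) < 1.0 else (lo, mid)
    return 0.5 * (lo + hi)

# Shannon check: phi(p) = -p*log(p) gives psi(p) = -log(p), hence
# psi_inv(x) = exp(-x); the numerical root must then match
# A = -log(Q)/beta with Q the partition function.
beta = 0.7                     # illustrative inverse temperature
energies = [-1.0, 0.0, 2.0]    # illustrative three-level spectrum
A_num = solve_A(lambda x: math.exp(-x), energies, beta)
A_closed = -math.log(sum(math.exp(-beta * e) for e in energies)) / beta
print(A_num, A_closed)
```

The same `solve_A` can be reused for other entropy forms by swapping in the appropriate \psi^{-1}, e.g. the Tsallis kernel of Eq. (17), provided its argument stays inside the domain of definition over the whole bracket.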
For the simple system that we shall be Shannon value of A as ǫ → 0, because in that limit the concerned with, pressure and volume, or their analogues, Log (p) function in the definition of the Tsallis entropy will not enter our considerations, and hence we have only q also coincides with the natural logarithm. one specific heat, Cv, with β now identified as the inverse scale of energy, the temperature T , V. THERMODYNAMICS OF THE NEW ∂U ENTROPY C = −β2 (11) ∂β The entropy we proposed in ref.[1] is given by IV. SHANNON AND TSALLIS dM(q) THERMODYNAMICS S = = − pqlog(p ) (18) dq i i Xi The Shannon entropy Eq.3 immediately gives, using with mixing probability [1] Eq.8 and Eq.9 M(q)=1 − pq (19) −α−βEi β(A−Ei) i pi = e = e (12) Xi 3 We can also express it as the q-expectation value The third order difference between W (Z) and log(1+z) is only 1 z3, and hence the difference between the Tsallis q 24 S = −hlog(p)iq ≡ pi log(pi) (20) form and our form of entropy will be detectable at rather Xi low T , i.e. high β. Proceeding as in the case of Tsallis entropy we get, with ǫ =1 − q, VI. APPLICATION TO A SIMPLE SYSTEM −W (ǫβ(Ei−A))/ǫ pi = e (21) Let us consider the simplest nontrivial system, where we have only two energy eigenvalues ±E, as in a spin- where W is the Lambert function defined by 1/2 system. For a noninteracting ensemble of systems we shall expect the standard results [11] corresponding W (z)eW (z) = z (22) to Shannon entropy Like the Tsallis case, here too we have to obtain A by solving numerically the transcendental Eq. 10. For the A = − log[2 cosh(βE)]/β (29) specific heat we get S = log[2 cosh(βE)] − βE tanh(βE) (30) U = −E tanh(βE) (31) e−Wi(1+1/ǫ) 2 2 C = β2 E (E − A) (23) C = (βE) / cosh (βE) (32) i i 1+ W Xi i where we have written for brevity Wi = A W (ǫβ(Ei − A)) and have used the following identi- -1 ties related to the Lambert function [10]. 
V. THERMODYNAMICS OF THE NEW ENTROPY

The entropy we proposed in ref. [1] is given by

    S = \frac{dM(q)}{dq} = -\sum_i p_i^q \log(p_i)    (18)

with the mixing probability [1]

    M(q) = 1 - \sum_i p_i^q    (19)

We can also express it as the q-expectation value

    S = -\langle \log(p) \rangle_q \equiv -\sum_i p_i^q \log(p_i)    (20)

Proceeding as in the case of the Tsallis entropy we get, with \epsilon = 1 - q,

    p_i = e^{-W(\epsilon \beta (E_i - A))/\epsilon}    (21)

where W is the Lambert function, defined by

    W(z)\, e^{W(z)} = z    (22)

As in the Tsallis case, here too we have to obtain A by solving the transcendental Eq. 10 numerically. For the specific heat we get

    C = \beta^2 \sum_i E_i (E_i - A)\, \frac{e^{-W_i (1 + 1/\epsilon)}}{1 + W_i}    (23)

where we have written for brevity W_i = W(\epsilon \beta (E_i - A)) and have used the following identities related to the Lambert function [10]:

    W'(z) = \frac{W(z)}{z (1 + W(z))}    (24)

    W(z)/z = e^{-W(z)}    (25)

As W(z) \sim z for small z, we see that for small \epsilon we effectively recover the classical pdf and thermodynamics, as in the Tsallis case. So the parameter \epsilon is again a measure of the departure from standard statistical mechanics due to nonextensivity. However, our nonextensivity is functionally different from the Tsallis form [1], and the values of \epsilon in the two forms can only be compared in the limit of low \beta. If we use the power series expansion of W(z),

    W(z) = \sum_{n=1}^{\infty} \frac{(-n)^{n-1} z^n}{n!}    (26)

and compare it with the power series expansion of \log(1 + z) in a form of the Tsallis p_i similar to the new entropy form,

    p_i = e^{-\log(1 + \epsilon_T \beta (E_i - A))/\epsilon_T}    (27)

we get a cancelation of the parameter \epsilon_n of the new entropy and of the Tsallis parameter \epsilon_T in the first order, so that both distributions approach the Shannon pdf. The third order difference between W(z) and \log(1 + z) is only \frac{1}{24} z^3, and hence the difference between the Tsallis form and our form of entropy will be detectable at rather low T, i.e. high \beta.

VI. APPLICATION TO A SIMPLE SYSTEM

Let us consider the simplest nontrivial system, where we have only two energy eigenvalues \pm E, as in a spin-1/2 system. For a noninteracting ensemble of such systems we expect the standard results [11] corresponding to Shannon entropy:

    A = -\log[2 \cosh(\beta E)]/\beta    (29)

    S = \log[2 \cosh(\beta E)] - \beta E \tanh(\beta E)    (30)

    U = -E \tanh(\beta E)    (31)

    C = (\beta E)^2 / \cosh^2(\beta E)    (32)

FIG. 1: A vs. T for Shannon (top), Tsallis with \epsilon = 0.1 (middle) and Tsallis with \epsilon = 0.25 (bottom). [Plot not reproduced; A runs from about -1 to -4.5 as T goes from 1 to 6.]

FIG. 2: [Plot of S vs. T, with S between about 0.1 and 0.7 for T between 1 and 6; the caption is truncated in the source.]
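The new-entropy analogue of the above can be cross-checked numerically. The sketch below (plain Python; `lambert_w` and `solve_A_new` are hypothetical helper names, and the Newton tolerance, bisection bracket, and values \beta = E = 1 are illustrative assumptions) implements the Lambert function of Eq. (22) by Newton iteration, solves the normalization Eq. (10) with the pdf of Eq. (21) for the two-level system, and checks that for small \epsilon the root approaches the Shannon result of Eq. (29).

```python
import math

def lambert_w(z, tol=1e-14):
    """Principal branch of the Lambert function, W(z)*exp(W(z)) = z
    (Eq. 22), by Newton iteration; adequate for the small |z| used here."""
    w = math.log(1.0 + z)          # starting guess, valid for z > -1
    for _ in range(100):
        ew = math.exp(w)
        step = (w * ew - z) / (ew * (w + 1.0))
        w -= step
        if abs(step) < tol:
            break
    return w

def solve_A_new(eps, beta, E, lo=-50.0, hi=0.0):
    """Solve Eq. (10) with p_i = exp(-W(eps*beta*(E_i - A))/eps)
    (Eq. 21) for the two-level system with energies +/-E, by bisection
    over a hand-chosen bracket; total probability grows with A."""
    def total_p(A):
        return sum(math.exp(-lambert_w(eps * beta * (e - A)) / eps)
                   for e in (E, -E))
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if total_p(mid) < 1.0 else (lo, mid)
    return 0.5 * (lo + hi)

beta, E = 1.0, 1.0                                       # illustrative values
A_shannon = -math.log(2.0 * math.cosh(beta * E)) / beta  # Eq. (29)
A_new = solve_A_new(1e-4, beta, E)                       # epsilon -> 0 limit
print(A_shannon, A_new)
```

Since W(z) \sim z for small z, the exponent -W(\epsilon\beta(E_i - A))/\epsilon tends to -\beta(E_i - A), so the numerical root should agree with the Shannon free energy to O(\epsilon), consistent with the small-\epsilon discussion of Section V.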