arXiv:cond-mat/0409037v2 [cond-mat.stat-mech] 3 Sep 2004

Generalized Entropies and Statistical Mechanics

Fariel Shafee∗
Department of Physics, Princeton University, Princeton, NJ 08540 USA.

We consider the problem of defining free energy and other thermodynamic functions when the entropy is given as a general function of the probability distribution, including that for nonextensive forms. We find that the free energy, which is central to the determination of all other quantities, can be obtained uniquely even when it is the root of a transcendental equation. In particular we study numerically the cases of the Tsallis form and a new form proposed by us recently. We compare the free energy, the internal energy and the specific heat of a simple system of two energy states for each of these forms.

PACS numbers: 05.70.-a, 05.90.+m, 89.90.+n
Keywords: entropy, nonextensive, probability distribution function, free energy, specific heat

∗Electronic address: [email protected]

I. INTRODUCTION

We have recently proposed [1] a new form of nonextensive entropy which depends on a parameter, similar to Tsallis entropy [2, 3, 4], and which in a similar limit approaches Shannon's classical extensive entropy. We have shown how the definition for this new form of entropy can arise naturally in terms of the mixing of states in a cell when the phase cell is re-scaled, the parameter being a measure of the rescaling, and how such an approach elucidates Shannon's coding theorem [5].

In this paper we shall adopt a more general attitude and try to develop the statistical mechanics where the entropy of systems is defined almost arbitrarily, the 'designer' entropy [6]. Such a form may indeed be relevant in a specific context, but we shall not here try to justify the applicability of any specific pdf form. The Tsallis pdf is now well established in many physical contexts [7, 8, 9], and we have commented in an earlier paper about how our form may also be relevant in contexts where the physical situation demands a stiffer pdf than the Levy-type form found in the Tsallis case. In this paper we concentrate more on the use of the general pdf for macroscopic quantities.

Central to the development of the statistical mechanics of a system is the definition of the free energy, which is related to the normalization of the probability distribution function, because the free energy in turn controls all the macroscopic properties of the ensemble. Hence we shall first establish a general physical prescription to obtain the free energy, and then determine its value in the case of the Tsallis form and of the newer form. We shall then use it to get the specific heat in both cases and see how it changes with the change of the parameter at different temperatures.

II. ENTROPY AND PDF

Let us make the assumption that the entropy S is a sum of the components from all the states:

S = Σ_i φ(p_i)    (1)

The pdf is found by optimizing the function

L(p) = S − α (Σ_i p_i) − β (Σ_i p_i E_i)    (2)

where α and β are the Lagrange multipliers associated with the normalization of the pdf's and the conservation of energy; p_i is the probability of the i-th state, which has energy E_i, and U = Σ_i p_i E_i is the internal energy of the ensemble per constituent. Here φ is a generalized function which will in general be different from the Shannon form

S = −Σ_i p_i log(p_i)    (3)

We get from optimizing L the condition

φ(p_i)/p_i = α + βE_i    (4)

Multiplying Eq.4 by p_i and summing over all the states then yields the simple relation

S = α + βU    (5)
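The optimization above implies, for the Shannon form, the multiplier identity S = α + βU. A minimal numerical sketch of this check follows; the three-level spectrum and the value of β are arbitrary illustrative choices, not taken from the paper.

```python
import math

# Arbitrary illustrative three-level spectrum and inverse temperature.
energies = [0.3, -1.2, 2.0]
beta = 0.7

# Shannon solution p_i = exp(-alpha - beta*E_i); normalization fixes
# alpha = log(sum_i exp(-beta*E_i)).
alpha = math.log(sum(math.exp(-beta * E) for E in energies))
p = [math.exp(-alpha - beta * E) for E in energies]

S = -sum(pi * math.log(pi) for pi in p)        # Shannon entropy
U = sum(pi * E for pi, E in zip(p, energies))  # internal energy

# Multiplier identity S = alpha + beta*U, equivalent to beta*(U - A) = S
# with A = -alpha/beta.
residual = S - (alpha + beta * U)   # identically zero up to rounding
```

Any spectrum and any β > 0 give a residual at machine precision, since the identity is exact for the Shannon distribution.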

Any constant of integration in φ is set to zero, because there can be no contribution to the entropy from a state which has zero probability.

III. FREE ENERGY

We shall assume that the free energy A is defined by

β(U − A) = S    (6)

Hence if S is nonextensive and U is extensive, then A must also be nonextensive. Using Eq.5 and Eq.6 we get

A = −α/β    (7)

Hence we get the relation for the pdf

p_i = ψ^{-1}(β(E_i − A))    (8)

with the definition

ψ(p) = φ(p)/p    (9)

Here we assume that the function ψ can be inverted. This may not always be the case for an arbitrary expression for the entropy, at least not in a manageable closed form. Finally, A can be obtained from the constraint equation

Σ_i p_i = Σ_i ψ^{-1}(β(E_i − A)) = 1    (10)

Once A has been determined, it may be placed in Eq.8 to get the pdf p_i for each of the states, all properly normalized, and we can find U and its derivative C, the specific heat. For the simple system that we shall be concerned with, pressure and volume, or their analogues, will not enter our considerations, and hence we have only one specific heat, C_v, with β now identified as the inverse scale of energy, the temperature T:

C = −β² ∂U/∂β    (11)

IV. SHANNON AND TSALLIS THERMODYNAMICS

The Shannon entropy Eq.3 immediately gives, using Eq.8 and Eq.9,

p_i = e^{−α−βE_i} = e^{β(A−E_i)}    (12)

so that Eq.10 yields the familiar expression for A:

A = −log(Q)/β    (13)

where Q is the partition function

Q = Σ_i e^{−βE_i}    (14)

The exponential form of p_i in Eq.12 allows the separation of the A-dependent factor and hence such a simple expression for A in the case of Shannon entropy, giving us normal extensive thermodynamics.

In the Tsallis case we have

S = −(Σ_i p_i^q − 1)/(q − 1) = −⟨Log_q(p_i)⟩    (15)

using the q-logarithm

Log_q(p) = (p^{q−1} − 1)/(q − 1)    (16)

and this is easily seen to lead to, using the symbol ǫ for (1 − q),

p_i = [1/(1 + ǫβ(E_i − A))]^{1/ǫ}    (17)

We note that as it is no longer possible to separate out a common A-dependent factor, we can no longer find an expression for A in terms of the partition function in the usual way. Instead it is necessary to solve the normalization equation Eq.10. For a general value of ǫ this will give an infinite number of roots, but for values of ǫ corresponding to reciprocals of integers we shall have polynomial equations with a finite number of roots, which too may be complex in general. We shall later see, at least for the simple example we consider, that a real and stable root can be found that approaches the Shannon value of A as ǫ → 0, because in that limit the Log_q(p) function in the definition of the Tsallis entropy also coincides with the natural logarithm.

V. THERMODYNAMICS OF THE NEW ENTROPY

The entropy we proposed in ref. [1] is given by

S = dM(q)/dq = −Σ_i p_i^q log(p_i)    (18)

with the mixing probability [1]

M(q) = 1 − Σ_i p_i^q    (19)
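The determination of A from the normalization constraint can be organized as a one-dimensional root search for any invertible ψ. A minimal sketch using bisection in the Shannon case, where the closed form A = −log(Q)/β is available for comparison, follows; the bracket endpoints, iteration count, and the two-level spectrum are ad hoc choices for illustration.

```python
import math

def solve_A(psi_inv, energies, beta, lo=-50.0, hi=50.0, n_iter=200):
    """Bisection on f(A) = sum_i psi_inv(beta*(E_i - A)) - 1, which is
    monotone increasing in A for the pdf forms considered in the paper."""
    f = lambda A: sum(psi_inv(beta * (E - A)) for E in energies) - 1.0
    for _ in range(n_iter):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Shannon case: psi(p) = -log(p), so psi_inv(x) = exp(-x).
energies = [-1.0, 1.0]
beta = 1.0
A_num = solve_A(lambda x: math.exp(-x), energies, beta)

# Closed form A = -log(Q)/beta with Q the partition function (Eq. 13).
A_exact = -math.log(sum(math.exp(-beta * E) for E in energies)) / beta
```

The same `solve_A` can be reused for the Tsallis and new-entropy cases by swapping in the corresponding inverse of ψ, which is the sense in which the free energy is determined "uniquely" on the physical branch.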

We can also express it as the q-expectation value

S = −⟨log(p)⟩_q ≡ −Σ_i p_i^q log(p_i)    (20)

Proceeding as in the case of the Tsallis entropy we get, with ǫ = 1 − q,

p_i = e^{−W(ǫβ(E_i−A))/ǫ}    (21)

where W is the Lambert function defined by

W(z) e^{W(z)} = z    (22)

Like the Tsallis case, here too we have to obtain A by solving numerically the transcendental Eq.10. For the specific heat we get

C = β² Σ_i [e^{−W_i(1+1/ǫ)}/(1 + W_i)] E_i (E_i − A)    (23)

where we have written for brevity W_i = W(ǫβ(E_i − A)) and have used the following identities related to the Lambert function [10]:

W′(z) = W(z)/[z(1 + W(z))]    (24)

W(z)/z = e^{−W(z)}    (25)

As W(z) ∼ z for small z, we see that for small ǫ we effectively get the classical pdf and thermodynamics, as in the Tsallis case. So the parameter ǫ is again a measure of the departure from the standard due to nonextensivity. However, our nonextensivity is functionally different from the Tsallis form [1], and the values of ǫ in the two forms can only be compared in the limit of low β. If we use the power series expansion of W(z),

W(z) = Σ_{n=1}^∞ [(−n)^{n−1}/n!] z^n    (26)

and compare that with the power series expansion for log(1 + z) in a form of the Tsallis p_i similar to the new entropy form,

p_i = e^{−log(1+ǫ_T β(E_i−A))/ǫ_T}    (27)

we get a cancelation of the parameter ǫ_n of the new entropy and of the Tsallis parameter ǫ_T in the first order, so that both distributions approach the Shannon pdf, as we have already mentioned; but if we demand equality in the second order, we get

ǫ_n = (1/2) ǫ_T    (28)

With the parameters matched as in Eq.28, the third order difference between the two exponents is only (1/24) ǫ_T² z³, with z = β(E_i − A), and hence the difference between the Tsallis form and our form of entropy will be detectable only at rather low T, i.e. high β.

VI. APPLICATION TO A SIMPLE SYSTEM

Let us consider the simplest nontrivial system, where we have only two energy eigenvalues ±E, as in a spin-1/2 system. For a noninteracting ensemble of such systems we shall expect the standard results [11] corresponding to Shannon entropy:

A = −log[2 cosh(βE)]/β    (29)

S = log[2 cosh(βE)] − βE tanh(βE)    (30)

U = −E tanh(βE)    (31)

C = (βE)²/cosh²(βE)    (32)

For the Tsallis entropy we shall take ǫ = 0.25, 0.10 and solve numerically. The values are shown in Figs. 1-4. It is seen that the Tsallis entropy gives very similar shapes for all the variables, and for ǫ = 0.10 we get a fit much nearer to the Shannon form than for ǫ = 0.25. The specific heat shows the typical Schottky form for a two-level system.

FIG. 1: A for Shannon (top), Tsallis with ǫ = 0.1 (middle) and Tsallis with ǫ = 0.25 (bottom).

FIG. 2: S for the same three entropy forms (from bottom to top: Shannon, Tsallis (0.1), Tsallis (0.25)).
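The claim that the physical root of the normalization equation approaches the Shannon value of A as ǫ → 0 can be checked directly. The sketch below solves the two-level Tsallis normalization by bisection; the bracketing uses the fact that normalizability (each p_i ≤ 1) forces A ≤ min E_i, while the lower endpoint and iteration count are ad hoc choices.

```python
import math

def tsallis_A(eps, beta, energies, n_iter=200):
    """Solve sum_i (1 + eps*beta*(E_i - A))**(-1/eps) = 1 (Eqs. 10, 17).
    Each p_i <= 1 forces A <= min(E), so the physical root is bracketed
    on [lo, min(E)], where the sum is monotone increasing in A."""
    def f(A):
        return sum((1.0 + eps * beta * (E - A)) ** (-1.0 / eps)
                   for E in energies) - 1.0
    lo, hi = -50.0, min(energies)
    for _ in range(n_iter):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

beta, E = 1.0, 1.0
A_shannon = -math.log(2.0 * math.cosh(beta * E)) / beta   # Eq. 29
A_tsallis = tsallis_A(0.001, beta, [-E, E])               # approaches A_shannon
```

For ǫ = 1/4 the same constraint becomes a quartic polynomial equation, in line with the remark above about reciprocals of integers; bisection on the monotone real branch sidesteps the complex roots.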

FIG. 3: U for the same three entropy forms (same order as for S).

FIG. 4: C for the same three entropy forms (peaks bottom to top: Shannon, Tsallis (0.1), Tsallis (0.25)).

Though the variables involve both p_+ and p_−, after finding A we can replace p_− by 1 − p_+ for faster execution of the numerics.

For our new entropy too we first find the numerical value of A from the normalization condition

e^{−W_+/ǫ} + e^{−W_−/ǫ} = 1    (33)

with W_± = W(ǫβ(±E − A)), and then use this value to find U, S and C. In Figs. 5-8 we show the values corresponding to ǫ_n = 0.05, which is half the Tsallis parameter ǫ_T = 0.10 used, together with the Shannon values. We note that only for the S curve is there a perceptible difference between the Tsallis entropy and our new entropy, at values of β near 1.

FIG. 5: Comparison of A for Shannon (apart), Tsallis with ǫ = 0.1 and the new entropy with ǫ = 0.05 (superposed).

FIG. 6: Comparison of S for the same three forms of entropy (Shannon, Tsallis (0.1), new entropy (0.05)); the Tsallis curve lies just over the new entropy.

FIG. 7: Comparison of U for the same three entropies. Shannon is separated; the other two overlap.

FIG. 8: Specific heat C for the same three entropies. Again, only Shannon is separated.

VII. CONCLUSIONS

We have presented above a simple prescription for finding the important thermodynamic variables for any given form of the entropy as a function of the pdf's of the states. We note that despite the apparent complexity of the exponential or transcendental equations determining the primary variable, the Helmholtz free energy A, it is possible to numerically obtain stable values which approach the expected Shannon values in the right limit of the parameter used, both in the Tsallis case and in that of the newly defined entropy. For higher values of the parameter our entropy would give values differing significantly from the Shannon or even the Tsallis entropy, and there may be physical situations where that may indeed be the desirable characteristic. But at low values the corresponding values of the parameters for the two distributions produce virtually completely overlapping graphs.

The form of the entropy may be a reflection of the effective interaction among the constituent systems of the ensemble, the Shannon form being the limiting case of zero interaction, and the Tsallis form or ours being the results of different forms of interaction, with ǫ standing for a coupling constant. It may be interesting to investigate more complicated systems that are physically realizable and comparable. It is noteworthy that most of the thermodynamic functions we have considered here are not crucially dependent on the form of the entropy with adjusted coupling, though the value of the entropy itself may vary significantly in the different formulations. This probably indicates that the best way to discriminate between the suitability of different definitions of entropy in different contexts may be to compare quantities that relate most directly to the entropy.
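As an illustration of this numerical stability, the sketch below solves the new-entropy normalization (Eq. 33) for the two-level system. A small Newton iteration for the principal branch of the Lambert function is hand-rolled here in place of a library routine (it is an assumption of this sketch, not part of the paper), and the root is checked to approach the Shannon free energy as ǫ → 0.

```python
import math

def lambert_w(z, n_iter=100):
    """Principal branch of W(z)e^W(z) = z by Newton's method, for z >= 0
    (the only range needed below, where convergence is monotone)."""
    w = math.log(1.0 + z)          # starting guess, always above the root
    for _ in range(n_iter):
        ew = math.exp(w)
        w -= (w * ew - z) / (ew * (1.0 + w))
    return w

def new_entropy_A(eps, beta, energies, n_iter=200):
    """Solve sum_i exp(-W(eps*beta*(E_i - A))/eps) = 1 (Eqs. 21, 33).
    Each p_i <= 1 forces A <= min(E), so bisect on [lo, min(E)],
    where the sum is monotone increasing in A."""
    def f(A):
        return sum(math.exp(-lambert_w(eps * beta * (E - A)) / eps)
                   for E in energies) - 1.0
    lo, hi = -50.0, min(energies)
    for _ in range(n_iter):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

beta, E = 1.0, 1.0
A_shannon = -math.log(2.0 * math.cosh(beta * E)) / beta   # Eq. 29
A_new = new_entropy_A(0.001, beta, [-E, E])               # approaches A_shannon
```

Restricting the bracket to A ≤ min(E) keeps every argument of W nonnegative, so the branch-point issues of the Lambert function near z = −1/e never arise on the physical branch.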

[1] F. Shafee, "A new nonextensive entropy", nlin.AO/0406044 (2004).
[2] C. Tsallis, J. Stat. Phys. 52, 479 (1988).
[3] P. Grigolini, C. Tsallis and B.J. West, Chaos, Solitons and Fractals 13, 367 (2001).
[4] A.R. Plastino and A. Plastino, J. Phys. A 27, 5707 (1994).
[5] M.A. Nielsen and I.L. Chuang, Quantum Computation and Quantum Information (Cambridge U.P., NY, 2000).
[6] P.T. Landsberg, "Entropies Galore", Braz. J. Phys. 29, 46 (1999).
[7] C. Beck, "Nonextensive statistical mechanics and particle spectra", hep-ph/0004225 (2000).
[8] O. Sotolongo-Costa et al., "A nonextensive approach to DNA breaking by ionizing radiation", cond-mat/0201289 (2002).
[9] C. Wolf, "... for photons admitting Tsallis statistics", Fizika B 11, 1 (2002).
[10] R.M. Corless, G.H. Gonnet, D.E.G. Hare, D.J. Jeffrey and D.E. Knuth, "On the Lambert W function", U.W. Ontario preprint (1996).
[11] R.K. Pathria, Statistical Mechanics (Butterworth-Heinemann, Oxford, UK, 1996), p. 77.