
Mathematical structure derived from Tsallis statistics

Hiroki Suyari
Department of Information and Image Sciences, Chiba University, 263-8522, Japan
e-mail: [email protected], [email protected]

Abstract— In order to explain the ubiquitous existence of power-law behaviors in nature in a unified manner, Tsallis entropy, introduced in 1988, has been successfully applied to describing many physical behaviors. Tsallis entropy is a one-parameter generalization of Shannon entropy, which recovers the usual Boltzmann-Gibbs statistics based on Shannon entropy. Thus, the generalization of Boltzmann-Gibbs statistics by means of Tsallis entropy is called Tsallis statistics. Recently, we discovered the beautiful mathematical structure behind Tsallis statistics: the law of error, the q-Stirling formula, the q-multinomial coefficient, and the q-central limit theorem (numerical evidence only). In particular, the q-central limit theorem is a mathematical representation of the ubiquitous existence of power-law behaviors in nature. In this summary we briefly state these results without proof. Each mathematical proof is given in my references.

Index Terms— Tsallis entropy, q-logarithm, q-exponential, q-product, q-Gaussian, law of error in Tsallis statistics, q-Stirling's formula, q-multinomial coefficient, q-central limit theorem, Pascal's triangle in Tsallis statistics

Fig. 1. q-Gaussian probability density function $f(x) = \exp_q(-\beta_q x^2)\big/\!\int \exp_q(-\beta_q y^2)\,dy$ for $q = 0.2, 0.6, 1.0$ (Gaussian), $1.4, 1.8$, with mean $0$ and q-variance $1$; the density is narrower than the Gaussian for $q < 1$ and wider for $q > 1$.

I. INTRODUCTION

Tsallis entropy [1][2] in a discrete system and in a continuous system is defined, respectively, as

$$ S_q^{(d)} := \frac{1-\sum_{i=1}^{n} p_i^{\,q}}{q-1}, \quad q \in \mathbb{R}^{+}, \tag{1} $$

$$ S_q^{(c)} := \frac{1-\int f(x)^{q}\,dx}{q-1}, \quad q \in \mathbb{R}^{+}, \tag{2} $$

where $\{p_i\}_{i=1}^{n}$ is a probability distribution and $f$ is a probability density function. Tsallis entropy recovers Shannon entropy when $q \to 1$:

$$ \lim_{q\to 1} S_q^{(d)} = -\sum_{i=1}^{n} p_i \ln p_i, \qquad \lim_{q\to 1} S_q^{(c)} = -\int f(x)\ln f(x)\,dx. \tag{3} $$

The maximum entropy principle for Tsallis entropy $S_q^{(c)}$ under the constraints

$$ \int f(x)\,dx = 1, \qquad \frac{\int x^{2} f(x)^{q}\,dx}{\int f(y)^{q}\,dy} = \sigma^{2} \tag{4} $$

yields the so-called q-Gaussian probability density function:

$$ f(x) = \frac{\exp_q\!\left(-\beta_q x^{2}\right)}{\int \exp_q\!\left(-\beta_q y^{2}\right)dy} \propto \left[1-(1-q)\,\beta_q x^{2}\right]^{\frac{1}{1-q}}, \tag{5} $$

where $\exp_q(x)$ is the q-exponential function defined by

$$ \exp_q(x) := \begin{cases} \left[1+(1-q)\,x\right]^{\frac{1}{1-q}} & \text{if } 1+(1-q)\,x > 0, \\ 0 & \text{otherwise} \end{cases} \quad (x \in \mathbb{R}), \tag{6} $$

and $\beta_q$ is a positive constant related to $\sigma$ and $q$ [3][4]. For $q \to 1$, the q-Gaussian distribution (5) recovers the usual Gaussian distribution. The power form in the q-Gaussian (5) has been found to fit well many physical systems which cannot be studied systematically within the usual Boltzmann-Gibbs statistics [5][6].

The mathematical basis for Tsallis statistics comes from the deformed expressions for the logarithm and the exponential functions: the q-logarithm function

$$ \ln_q x := \frac{x^{1-q}-1}{1-q} \quad (x \geq 0,\ q \in \mathbb{R}^{+}) \tag{7} $$

and its inverse function, the q-exponential function (6). Using the q-logarithm function (7), Tsallis entropy (2) can be written as

$$ S_q^{(c)} = -\int f(x)^{q} \ln_q f(x)\,dx, \tag{8} $$

which is easily found to recover Shannon entropy when $q \to 1$.

The many successful applications of Tsallis statistics stimulate us to search for the new mathematical structure behind Tsallis statistics [7][8][9]. Recently, a new multiplication operation determined by the q-logarithm and the q-exponential function, which naturally emerge from Tsallis entropy, was presented in [10] and [11]. Using this algebra, we can obtain the beautiful mathematical structure behind Tsallis statistics. In this summary we briefly present our results without proofs; each proof can be found in my references. The contents consist of seven sections. Section I: introduction (this section); Section II: the new multiplication operation; Section III: law of error in Tsallis statistics; Section IV: q-Stirling's formula; Section V: q-multinomial coefficient and symmetry in Tsallis statistics; Section VI: q-central limit theorem in Tsallis statistics (numerical evidence only); Section VII: conclusion.

---
This work was partially supported by the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Encouragement of Young Scientists (B), 14780259, 2004.

II. THE NEW MULTIPLICATION OPERATION DETERMINED BY THE q-LOGARITHM FUNCTION AND THE q-EXPONENTIAL FUNCTION

The new multiplication operation $\otimes_q$ is introduced in [10] and [11] so as to satisfy the following equations:

$$ \ln_q (x \otimes_q y) = \ln_q x + \ln_q y, \tag{9} $$

$$ \exp_q(x) \otimes_q \exp_q(y) = \exp_q(x+y). \tag{10} $$
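The q-deformed pair (6)-(7) underlying these identities is simple to experiment with numerically. Below is a minimal Python sketch (our own illustration, not from the paper; the function names are ours) of $\ln_q$ and $\exp_q$ that checks the inverse relation and the $q \to 1$ limit:

```python
import math

def ln_q(x: float, q: float) -> float:
    """q-logarithm (7): (x^(1-q) - 1)/(1-q); the ordinary log at q = 1."""
    if q == 1.0:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def exp_q(x: float, q: float) -> float:
    """q-exponential (6): [1 + (1-q)x]^(1/(1-q)) on its support, 0 outside."""
    if q == 1.0:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

if __name__ == "__main__":
    # exp_q inverts ln_q on the positive reals, for q < 1 and q > 1 alike
    for q in (0.2, 0.6, 1.4, 1.8):
        assert abs(exp_q(ln_q(2.5, q), q) - 2.5) < 1e-9
    # q -> 1 recovers the ordinary logarithm
    assert abs(ln_q(2.5, 1.000001) - math.log(2.5)) < 1e-5
    # outside the support 1 + (1-q)x > 0, the q-exponential is cut off to 0
    assert exp_q(-10.0, 0.5) == 0.0
```

The cutoff branch in `exp_q` is what produces the compact support of the q-Gaussian (5) when $q < 1$.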

These lead us to the definition of $\otimes_q$ between two positive numbers:

$$ x \otimes_q y := \begin{cases} \left[x^{1-q}+y^{1-q}-1\right]^{\frac{1}{1-q}} & \text{if } x>0,\ y>0,\ x^{1-q}+y^{1-q}-1 > 0, \\ 0 & \text{otherwise}, \end{cases} \tag{11} $$

which is called the q-product in [11]. The q-product recovers the usual product in the sense that $\lim_{q\to 1}(x \otimes_q y) = xy$. The fundamental properties of the q-product $\otimes_q$ are almost the same as those of the usual product, but the distributive law does not hold in general:

$$ a\,(x \otimes_q y) \neq (ax) \otimes_q y \quad (a,x,y \in \mathbb{R}). \tag{12} $$

The properties of the q-product can be found in [10] and [11].

In order to see one of the validities of the q-product in Tsallis statistics, we recall the well-known expression of the exponential function $\exp(x)$:

$$ \exp(x) = \lim_{n\to\infty} \left(1+\frac{x}{n}\right)^{n}. \tag{13} $$

Replacing the power on the right-hand side of (13) by the $n$-fold q-product $\otimes_q$,

$$ x^{\otimes_q n} := \underbrace{x \otimes_q \cdots \otimes_q x}_{n\ \text{times}}, \tag{14} $$

$\exp_q(x)$ is obtained. In other words, $\lim_{n\to\infty}\left(1+\frac{x}{n}\right)^{\otimes_q n}$ coincides with $\exp_q(x)$:

$$ \exp_q(x) = \lim_{n\to\infty} \left(1+\frac{x}{n}\right)^{\otimes_q n}. \tag{15} $$

The proof of (15) is given in [12]. This coincidence indicates a validity of the q-product in Tsallis statistics. In fact, the results presented in the following sections reinforce it.

III. LAW OF ERROR IN TSALLIS STATISTICS

Consider the following situation: we get $n$ observed values

$$ x_1, x_2, \ldots, x_n \in \mathbb{R} \tag{16} $$

as results of $n$ mutually independent measurements of some observation. Each observed value $x_i$ $(i=1,\ldots,n)$ is a value of an independent, identically distributed (i.i.d. for short) random variable $X_i$ $(i=1,\ldots,n)$, respectively. There exists a true value $x$ satisfying the additive relation

$$ x_i = x + e_i \quad (i=1,\ldots,n), \tag{17} $$

where each $e_i$ is the error in the corresponding observation of the true value $x$. Thus, for each $X_i$ there exists a random variable $E_i$ such that $X_i = x + E_i$ $(i=1,\ldots,n)$. Every $E_i$ has the same differentiable probability density function $f$, because $X_1,\ldots,X_n$ are i.i.d. (i.e., $E_1,\ldots,E_n$ are also i.i.d.). Let $L(\theta)$ be a function of a variable $\theta$ defined by

$$ L(\theta) := f(x_1-\theta)\,f(x_2-\theta)\cdots f(x_n-\theta). \tag{18} $$

Theorem 1: If the function $L(\theta)$ of $\theta$ for any fixed $x_1,x_2,\ldots,x_n$ takes its maximum at

$$ \theta = \theta^{*} := \frac{x_1+x_2+\cdots+x_n}{n}, \tag{19} $$

then the probability density function $f$ must be a Gaussian probability density function:

$$ f(e) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\!\left(-\frac{e^{2}}{2\sigma^{2}}\right). \tag{20} $$

The above result goes by the name of "Gauss' law of error" [13]. Gauss' law of error tells us that in measurements it is most probable to assume a Gaussian probability distribution for additive noise, which is often taken as an assumption in many scientific fields. On the basis of Gauss' law of error, functions such as the error function are often used to estimate error rates in measurements.

Gauss' law of error is generalized to Tsallis statistics as follows, which results in the q-Gaussian as a generalization of the usual Gaussian distribution. Consider almost the same setting as in Gauss' law of error: we get $n$ observed values (16) as results of $n$ measurements of some observation. Each observed value $x_i$ $(i=1,\ldots,n)$ is a value of an identically distributed random variable $X_i$ $(i=1,\ldots,n)$, respectively. There exists a true value $x$ satisfying the additive relation (17), where each $e_i$ is the error in the corresponding observation of the true value $x$. Thus, for each $X_i$ there exists a random variable $E_i$ such that $X_i = x + E_i$ $(i=1,\ldots,n)$. Every $E_i$ has the same differentiable probability density function $f$, because $X_1,\ldots,X_n$ are identically distributed random variables (i.e., $E_1,\ldots,E_n$ are also so). Let $L_q(\theta)$ be a function of a variable $\theta$ defined by

$$ L_q(\theta) := f(x_1-\theta) \otimes_q f(x_2-\theta) \otimes_q \cdots \otimes_q f(x_n-\theta). \tag{21} $$

Theorem 2: If the function $L_q(\theta)$ of $\theta$ for any fixed $x_1,x_2,\ldots,x_n$ takes its maximum at (19), then the probability density function $f$ must be a q-Gaussian:

$$ f(e) = \frac{\exp_q\!\left(-\beta_q e^{2}\right)}{\int \exp_q\!\left(-\beta_q e^{2}\right)de}, \tag{22} $$

where $\beta_q$ is a q-dependent positive constant. See [14] for the proof. Our result $f(e)$ in (22) coincides with (5); here it is obtained by applying the q-product to the maximum likelihood principle instead of the maximum entropy principle.

IV. q-STIRLING'S FORMULA

By means of the q-product (11), the q-factorial $n!_q$ is naturally defined by

$$ n!_q := 1 \otimes_q \cdots \otimes_q n \tag{23} $$

for $n \in \mathbb{N}$ and $q > 0$. Using the definition (11), $\ln_q(n!_q)$ is explicitly expressed by

$$ \ln_q(n!_q) = \frac{\sum_{k=1}^{n} k^{1-q} - n}{1-q}. \tag{24} $$

If an approximation of $\ln_q(n!_q)$ is not needed, the above explicit form (24) should be used directly for its computation. However, in order to clarify the correspondence between the studies for $q=1$ and $q\neq 1$, an approximation of $\ln_q(n!_q)$ is useful. In fact, using the present q-Stirling's formula, we obtain the surprising mathematical structure in Tsallis statistics [15].

The tight q-Stirling's formula is derived as follows:

$$ \ln_q(n!_q) = \begin{cases} \left(n+\dfrac{1}{2}\right)\ln n - n + \theta_{n,1} + (1-\delta_1) & \text{if } q=1, \\[4pt] n - \ln n - \dfrac{1}{2n} + \theta_{n,2} - \delta_2 & \text{if } q=2, \\[4pt] \left(\dfrac{n}{2-q}+\dfrac{1}{2}\right)\ln_q n - \dfrac{n-1}{2-q} + \theta_{n,q} + \dfrac{\delta_q}{2-q} & \text{if } q>0 \text{ and } q\neq 1,2, \end{cases} \tag{25} $$

where

$$ \lim_{n\to\infty}\theta_{n,q}=0, \qquad 0<\theta_{n,1}<\frac{1}{12n}, \qquad e^{1-\delta_1}=\sqrt{2\pi}. \tag{26} $$

On the other hand, the rough q-Stirling's formula is reduced from (25) to

$$ \ln_q(n!_q) = \begin{cases} \dfrac{n}{2-q}\ln_q n - \dfrac{n}{2-q} + O\!\left(\ln_q n\right) & \text{if } q\neq 2, \\[4pt] n - \ln n + O(1) & \text{if } q=2, \end{cases} \tag{27} $$

which is more useful in applications in Tsallis statistics. See Sections V and VI and [15] for its applications. The derivation of the above formulas is given in [12].
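The q-product (11) and the limit (15) are easy to probe numerically. The following Python sketch (our own illustration; the function names are ours) verifies the defining identity (9) and watches the n-fold q-product (14) of $1+x/n$ approach $\exp_q(x)$:

```python
import math

def ln_q(x, q):
    """q-logarithm (7)."""
    return math.log(x) if q == 1 else (x ** (1 - q) - 1) / (1 - q)

def exp_q(x, q):
    """q-exponential (6), with the cutoff to 0 outside its support."""
    if q == 1:
        return math.exp(x)
    base = 1 + (1 - q) * x
    return base ** (1 / (1 - q)) if base > 0 else 0.0

def q_product(x, y, q):
    """q-product (11): [x^(1-q) + y^(1-q) - 1]^(1/(1-q)) when positive, else 0."""
    if q == 1:
        return x * y
    s = x ** (1 - q) + y ** (1 - q) - 1
    return s ** (1 / (1 - q)) if s > 0 else 0.0

def q_power(x, n, q):
    """n-fold q-product x^{(x)_q n} of (14), computed by iteration."""
    acc = x
    for _ in range(n - 1):
        acc = q_product(acc, x, q)
    return acc

if __name__ == "__main__":
    q, x, y = 1.5, 2.0, 3.0
    # identity (9): ln_q(x (x)_q y) = ln_q(x) + ln_q(y)
    assert abs(ln_q(q_product(x, y, q), q) - (ln_q(x, q) + ln_q(y, q))) < 1e-12
    # limit (15): (1 + 1/n)^{(x)_q n} -> exp_q(1) as n grows
    errs = [abs(q_power(1 + 1.0 / n, n, q) - exp_q(1.0, q)) for n in (10, 100, 1000)]
    assert errs[0] > errs[1] > errs[2]
```

One can likewise confirm the failure of the distributive law (12): `2 * q_product(x, y, q)` and `q_product(2 * x, y, q)` differ in general for $q \neq 1$.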

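The rough q-Stirling formula (27) can be compared directly against the exact expression (24). A small Python sketch (ours, not from the paper; it assumes $q \neq 1, 2$ so that both formulas apply) keeps only the leading terms of (27):

```python
import math

def ln_q(x, q):
    """q-logarithm (7)."""
    return math.log(x) if q == 1 else (x ** (1 - q) - 1) / (1 - q)

def lnq_qfactorial(n, q):
    """Exact ln_q(n!_q) via the explicit form (24); assumes q != 1."""
    return (sum(k ** (1 - q) for k in range(1, n + 1)) - n) / (1 - q)

def lnq_qfactorial_rough(n, q):
    """Leading terms of the rough q-Stirling formula (27), for q != 2."""
    return (n / (2 - q)) * ln_q(n, q) - n / (2 - q)

if __name__ == "__main__":
    q = 0.5
    for n in (100, 1000):
        exact = lnq_qfactorial(n, q)
        rough = lnq_qfactorial_rough(n, q)
        # the neglected correction is O(ln_q n), so the relative error is small
        assert abs(exact - rough) / abs(exact) < 0.05
```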
V. q-MULTINOMIAL COEFFICIENT AND SYMMETRY IN TSALLIS STATISTICS

We define the q-binomial coefficient by

$$ \binom{n}{k}_q := (n!_q) \oslash_q \left[(k!_q) \otimes_q ((n-k)!_q)\right] \quad (n,\ k\,(\le n) \in \mathbb{N}), \tag{28} $$

where $\oslash_q$ is the inverse operation to $\otimes_q$, defined by

$$ x \oslash_q y := \begin{cases} \left[x^{1-q}-y^{1-q}+1\right]^{\frac{1}{1-q}} & \text{if } x>0,\ y>0,\ x^{1-q}-y^{1-q}+1 > 0, \\ 0 & \text{otherwise}. \end{cases} \tag{29} $$

$\oslash_q$ is also characterized, similarly to $\otimes_q$ [10][11], by the requirements

$$ \ln_q\!\left(x \oslash_q y\right) = \ln_q x - \ln_q y, \tag{30} $$

$$ \exp_q(x) \oslash_q \exp_q(y) = \exp_q(x-y). \tag{31} $$

Applying the definitions of $\otimes_q$, $\oslash_q$ and $n!_q$ to (28), the q-binomial coefficient is explicitly written as

$$ \binom{n}{k}_q = \left[\sum_{\ell=1}^{n}\ell^{1-q} - \sum_{i=1}^{k}i^{1-q} - \sum_{j=1}^{n-k}j^{1-q} + 1\right]^{\frac{1}{1-q}}. \tag{32} $$

From the definition (28), it is clear that

$$ \lim_{q\to 1}\binom{n}{k}_q = \binom{n}{k} = \frac{n!}{k!\,(n-k)!}. \tag{33} $$

In general, when $q<1$, $\sum_{\ell=1}^{n}\ell^{1-q} - \sum_{i=1}^{k}i^{1-q} - \sum_{j=1}^{n-k}j^{1-q} + 1 > 0$. On the other hand, when $\sum_{\ell=1}^{n}\ell^{1-q} - \sum_{i=1}^{k}i^{1-q} - \sum_{j=1}^{n-k}j^{1-q} + 1 < 0$, the expression (32) takes complex values in general, which divides the formulations and discussions of the q-binomial coefficient into two cases: it takes a real number or a complex number. In order to avoid such separate formulations and discussions, we consider the q-logarithm of the q-binomial coefficient:

$$ \ln_q\binom{n}{k}_q = \ln_q(n!_q) - \ln_q(k!_q) - \ln_q((n-k)!_q). \tag{34} $$

The above definition (28) is artificial in the sense that it is formed by analogy with the usual binomial coefficient. However, when $n$ goes to infinity, the q-binomial coefficient (28) has a surprising relation to Tsallis entropy as follows:

$$ \ln_q\binom{n}{k}_q \simeq \begin{cases} \dfrac{n^{2-q}}{2-q}\,S_{2-q}\!\left(\dfrac{k}{n},\dfrac{n-k}{n}\right) & \text{if } q>0,\ q\neq 2, \\[6pt] -S_1(n) + S_1(k) + S_1(n-k) & \text{if } q=2, \end{cases} \tag{35} $$

where $S_q$ is Tsallis entropy (1) and $S_1(n)$ is Boltzmann entropy:

$$ S_1(n) := \ln n. \tag{36} $$

Applying the rough expression of the q-Stirling formula (27), the above relations (35) are easily proved [15]. The above correspondence (35) between the q-binomial coefficient (28) and Tsallis entropy (1) convinces us of the fact that the q-binomial coefficient (28) is well-defined in Tsallis statistics. Therefore, we can construct Pascal's triangle in Tsallis statistics [15].

The above relation (35) is easily generalized to the case of the q-multinomial coefficient. The q-multinomial coefficient in Tsallis statistics is defined in a similar way to the q-binomial coefficient (28):

$$ \binom{n}{n_1\ \cdots\ n_k}_q := (n!_q) \oslash_q \left[(n_1!_q) \otimes_q \cdots \otimes_q (n_k!_q)\right], \tag{37} $$

where

$$ n = \sum_{i=1}^{k} n_i, \qquad n_i \in \mathbb{N}\ (i=1,\ldots,k). \tag{38} $$

Applying the definitions of $\otimes_q$ and $\oslash_q$ to (37), the q-multinomial coefficient is explicitly written as

$$ \binom{n}{n_1\ \cdots\ n_k}_q = \left[\sum_{\ell=1}^{n}\ell^{1-q} - \sum_{i_1=1}^{n_1}i_1^{1-q} - \cdots - \sum_{i_k=1}^{n_k}i_k^{1-q} + 1\right]^{\frac{1}{1-q}}. \tag{39} $$

For the same reason as stated above for the q-binomial coefficient, we consider the q-logarithm of the q-multinomial coefficient:

$$ \ln_q\binom{n}{n_1\ \cdots\ n_k}_q = \ln_q(n!_q) - \ln_q(n_1!_q) - \cdots - \ln_q(n_k!_q). \tag{40} $$

From the definition (37), it is clear that

$$ \lim_{q\to 1}\binom{n}{n_1\ \cdots\ n_k}_q = \binom{n}{n_1\ \cdots\ n_k} = \frac{n!}{n_1!\cdots n_k!}. \tag{41} $$

When $n$ goes to infinity, the q-multinomial coefficient (37) has a relation to Tsallis entropy (1) similar to (35):

$$ \ln_q\binom{n}{n_1\ \cdots\ n_k}_q \simeq \begin{cases} \dfrac{n^{2-q}}{2-q}\,S_{2-q}\!\left(\dfrac{n_1}{n},\ldots,\dfrac{n_k}{n}\right) & \text{if } q>0,\ q\neq 2, \\[6pt] -S_1(n) + \sum_{i=1}^{k} S_1(n_i) & \text{if } q=2. \end{cases} \tag{42} $$

This is a natural generalization of (35). In the same way as in the case of the q-binomial coefficient, the above relation (42) is easily proved [15]. When $q\to 1$, (42) recovers the well-known result

$$ \ln\binom{n}{n_1\ \cdots\ n_k} \simeq n\,S_1\!\left(\frac{n_1}{n},\ldots,\frac{n_k}{n}\right), \tag{43} $$

where $S_1$ is Shannon entropy.

The relation (42) reveals a surprising symmetry: (42) is equivalent to

$$ \ln_{1-(1-q)}\binom{n}{n_1\ \cdots\ n_k}_{1-(1-q)} \simeq \frac{n^{1+(1-q)}}{1+(1-q)}\,S_{1+(1-q)}\!\left(\frac{n_1}{n},\ldots,\frac{n_k}{n}\right) \quad (q>0,\ q\neq 2). \tag{44} $$

This expression shows that behind Tsallis statistics there exists a symmetry with a factor $1-q$ around $q=1$. Substitution of some concrete values of $q$ into (42) or (44) helps us understand the symmetry mentioned above.

VI. q-CENTRAL LIMIT THEOREM IN TSALLIS STATISTICS (NUMERICAL EVIDENCE ONLY)

It is well known that any binomial distribution converges to a Gaussian distribution when $n$ goes to infinity. This is a typical example of the central limit theorem in the usual probability theory. By analogy with this famous result, each set of normalized q-binomial coefficients is expected to converge to the q-Gaussian distribution with the same $q$ when $n$ goes to infinity. As shown in this section, the present numerical results come up to our expectations.

In Fig. 2, for each $n$, the bars and the solid line represent a set of normalized q-binomial coefficients and the q-Gaussian distribution with normalized q-mean $0$ and normalized q-variance $1$, respectively, when $q=0.1$. Each of the three graphs in the first row of Fig. 2 shows the two kinds of probability distributions stated above, and the three graphs in the second row of Fig. 2 show the corresponding cumulative probability distributions, respectively. From Fig. 2, we find the convergence of a set of normalized q-binomial coefficients to a q-Gaussian distribution when $n$ goes to infinity. Other choices of $q$ exhibit convergences similar to the case $q=0.1$.

Note that we never use any curve fitting in these numerical computations. Every bar and solid line is computed and plotted independently of the others.
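The correspondence (35) behind these numerics is easy to check directly. The sketch below (our own check; the function names are ours) evaluates $\ln_q$ of the q-binomial coefficient through (24) and (34) and compares it with the Tsallis-entropy side of (35) for a value $q \neq 1, 2$:

```python
def lnq_qfactorial(n, q):
    """Exact ln_q(n!_q) via the explicit form (24); assumes q != 1."""
    return (sum(k ** (1 - q) for k in range(1, n + 1)) - n) / (1 - q)

def lnq_qbinom(n, k, q):
    """q-logarithm of the q-binomial coefficient, definition (34)."""
    return lnq_qfactorial(n, q) - lnq_qfactorial(k, q) - lnq_qfactorial(n - k, q)

def tsallis_entropy(probs, q):
    """Tsallis entropy (1) of a discrete distribution; assumes q != 1."""
    return (1 - sum(p ** q for p in probs)) / (q - 1)

if __name__ == "__main__":
    q, n, k = 0.5, 1000, 300
    lhs = lnq_qbinom(n, k, q)
    # right-hand side of (35): n^(2-q)/(2-q) * S_{2-q}(k/n, (n-k)/n)
    rhs = n ** (2 - q) / (2 - q) * tsallis_entropy([k / n, (n - k) / n], 2 - q)
    # the two sides agree to leading order for large n
    assert abs(lhs - rhs) / lhs < 0.02
```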

Fig. 2. Probability distributions (first row) and the corresponding cumulative probability distributions (second row) of the normalized q-binomial coefficients ($q=0.1$) and the q-Gaussian distribution for $n=5,10,20$.

Fig. 3. Transition of the maximal difference between the values of the two cumulative probabilities (normalized q-binomial coefficients and q-Gaussian distribution) for each $q=0.1,0.2,\ldots,0.9$ as $n\to\infty$ (horizontal axis: $n$ up to $100$).

In order to confirm these convergences more precisely, we compute the maximal difference $\Delta_{q,n}$ between the values of the two cumulative probabilities (a set of normalized q-binomial coefficients and the q-Gaussian distribution) for each $q=0.1,0.2,\ldots,0.9$ and $n$. $\Delta_{q,n}$ is defined by

$$ \Delta_{q,n} := \max_{i=0,\ldots,n}\left|F_{q\text{-bino}}(i) - F_{q\text{-Gauss}}(i)\right|, \tag{45} $$

where $F_{q\text{-bino}}(i)$ and $F_{q\text{-Gauss}}(i)$ are the cumulative probability distributions of a set of normalized q-binomial coefficients $p_{q\text{-bino}}(k)$ and its corresponding q-Gaussian distribution $f_{q\text{-Gauss}}(x)$, respectively:

$$ F_{q\text{-bino}}(i) := \sum_{k=0}^{i} p_{q\text{-bino}}(k), \qquad F_{q\text{-Gauss}}(i) := \int_{-\infty}^{i} f_{q\text{-Gauss}}(x)\,dx. \tag{46} $$

Fig. 3 shows the convergence of $\Delta_{q,n}$ to $0$ as $n\to\infty$ for $q=0.1,0.2,\ldots,0.9$. This result indicates that the limit of every convergence is the q-Gaussian distribution with the same $q\in(0,1]$ as that of the given set of normalized q-binomial coefficients. The present convergences reveal the possibility of the existence of a central limit theorem in Tsallis statistics, in which distributions converge to a q-Gaussian distribution as the nonextensive generalization of a Gaussian distribution. The central limit theorem in Tsallis statistics provides not only a mathematical result in Tsallis statistics but also the physical reason why power-law behaviors exist universally in many physical systems. In other words, the central limit theorem in Tsallis statistics mathematically explains the reason for the ubiquitous existence of power-law behaviors in nature.

VII. CONCLUSION

We discovered the conclusive and beautiful mathematical structure behind Tsallis statistics: the law of error [14], the q-Stirling formula [12], the q-multinomial coefficient [15], and the q-central limit theorem (numerical evidence only) [15]. The one-to-one correspondence (42) between the q-multinomial coefficient and Tsallis entropy provides us with significant mathematical structure such as the symmetry (44) and others [15]. Moreover, the convergence of each set of normalized q-binomial coefficients to the q-Gaussian with the same $q$ as $n$ increases indicates the existence of a central limit theorem in Tsallis statistics.

In all of our recently presented results, such as the law of error [14], the q-Stirling formula [12], the symmetry (44), and the present numerical evidence for the central limit theorem in Tsallis statistics [15], the q-product is found to play crucial roles. The q-product is uniquely determined by Tsallis entropy. Therefore, the introduction of Tsallis entropy provides rich foundations in mathematics and physics as a well-organized generalization of the Boltzmann-Gibbs-Shannon theory.

REFERENCES

[1] C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys. 52, 479-487 (1988).
[2] E.M.F. Curado and C. Tsallis, Generalized statistical mechanics: connection with thermodynamics, J. Phys. A 24, L69-L72 (1991); Corrigenda 24, 3187 (1991); 25, 1019 (1992).
[3] C. Tsallis, S.V.F. Levy, A.M.C. Souza and R. Maynard, Statistical-mechanical foundation of the ubiquity of Levy distributions in nature, Phys. Rev. Lett. 75, 3589-3593 (1995) [Erratum: 77, 5442 (1996)].
[4] D. Prato and C. Tsallis, Nonextensive foundation of Levy distributions, Phys. Rev. E 60, 2398-2401 (2000).
[5] C. Tsallis et al., Nonextensive Statistical Mechanics and Its Applications, edited by S. Abe and Y. Okamoto (Springer-Verlag, Heidelberg, 2001).
[6] C. Tsallis et al., Nonextensive Entropy: Interdisciplinary Applications, edited by M. Gell-Mann and C. Tsallis (Oxford Univ. Press, New York, 2004).
[7] H. Suyari, Nonextensive entropies derived from form invariance of pseudoadditivity, Phys. Rev. E 65, 066118 (7 pages) (2002).
[8] H. Suyari, On the most concise set of axioms and the uniqueness theorem for Tsallis entropy, J. Phys. A 35, 10731-10738 (2002).
[9] H. Suyari, Generalization of Shannon-Khinchin axioms to nonextensive systems and the uniqueness theorem for the nonextensive entropy, IEEE Trans. Inform. Theory, vol. 50, pp. 1783-1787 (2004).
[10] L. Nivanen, A. Le Mehaute and Q.A. Wang, Generalized algebra within a nonextensive statistics, Rep. Math. Phys. 52, 437-444 (2003).
[11] E.P. Borges, A possible deformed algebra and calculus inspired in nonextensive thermostatistics, Physica A 340, 95-101 (2004).
[12] H. Suyari, q-Stirling's formula in Tsallis statistics, LANL e-print cond-mat/0401541, submitted.
[13] A. Hald, A History of Mathematical Statistics from 1750 to 1930, Wiley, New York (1998).
[14] H. Suyari, Law of error in Tsallis statistics, LANL e-print cond-mat/0401540, submitted.
[15] H. Suyari, Mathematical structure derived from the q-multinomial coefficient in Tsallis statistics, LANL e-print cond-mat/0401546, submitted.