Ternary Representation of Stochastic Change and the Origin of Entropy and Its Fluctuations

Hong Qian,∗ Yu-Chen Cheng,† and Lowell F. Thompson‡
Department of Applied Mathematics, University of Washington, Seattle, WA 98195, U.S.A.
∗ [email protected]  † [email protected]  ‡ [email protected]

A change in a stochastic system has three representations: Probabilistic, statistical, and informational: (i) is based on a random variable u(ω) → ũ(ω); this induces (ii) the change of probability distributions F_u(x) → F_ũ(x), x ∈ R^n; and (iii) a change in the probability measure P → P̃ under the same observable u(ω). In the informational representation a change is quantified by the Radon-Nikodym derivative ln(dP̃/dP)(ω) = −ln(dF_u/dF_ũ)(x) when x = u(ω). Substituting a random variable into its own density function creates a fluctuating entropy whose expectation has been given by Shannon. Informational representation of a deterministic transformation on R^n reveals entropic and energetic terms, and the notions of configurational entropy of Boltzmann and Gibbs, and potential of mean force of Kirkwood. Mutual information arises for correlated u(ω) and ũ(ω); and a nonequilibrium thermodynamic entropy balance equation is identified.

I. INTRODUCTION

A change according to classical physics is simple: if one measures x_1 and x_2, which are traits, in real numbers, of a “same” type, then ∆x = x_2 − x_1 is the mathematical representation of the change; ∆x ∈ R^n. How to characterize a change in a complex world? To represent a complex, stochastic world [1], the theory of probability developed by A. N. Kolmogorov envisions an abstract space (Ω, F, P) called a probability space. Similar to the Hilbert space underlying quantum mechanics [2], one does not see or touch the objects in the probability space, ω ∈ Ω, nor the P. Rather, one observes the probability space through functions, say u(ω), called random variables, which map Ω → R^n. The same function maps the probability measure P to a cumulative probability distribution function (cdf) F_u(x), x ∈ R^n.

Now a change occurs; and based on observation(s) the F_u(x) is changed to F_ũ(x). In current statistical data science, one simply works with the two functions F_u(x) and F_ũ(x). In fact, the more complete description in the statistical representation is a joint probability distribution F_{uũ}(x_1, x_2) whose marginal distributions are F_u(x_1) = F_{uũ}(x_1, ∞) and F_ũ(x_2) = F_{uũ}(∞, x_2).

If, however, one explores a little more on the “origin” of the change, one realizes that there are two possible sources: a change in the P, or a change in the u(ω). If P → P̃ while the u(ω) is fixed, then according to measure theory one can characterize this “change of measure” in terms of a Radon-Nikodym (RN) derivative (dP̃/dP)(ω) [3–5]. In the rest of this paper, we will assume that all the measures under consideration are absolutely continuous with respect to each other and that all measures on R^n are absolutely continuous with respect to the Lebesgue measure. This ensures that all RN derivatives are well defined. Note, this is a mathematical object that is defined on the invisible probability space. It actually is itself a random variable, with expectation, variance, and statistics.

What is the relationship between this informational representation of the change and the observed F_u and F_ũ? For u(ω), ũ(ω) ∈ R, the answer is [52]:

    (dP̃/dP)(ω) = [ (dF_u/dF_ũ)(u(ω)) ]^{−1}.                    (1)

On the rhs of (1), (dF_u/dF_ũ)(x) is like a probability density function, which is only defined on R. However, substituting the random variable u(ω) into the probability density function, one obtains the lhs of (1). Putting a random variable back into the logarithm of its own density function to create a new random variable is the fundamental idea of fluctuating entropy in stochastic thermodynamics [6, 7], and the notion of self-information [8–10]. Its expected value then becomes the Shannon information entropy or the intimately related relative entropy [11]. The result in (1) can be generalized to u, ũ ∈ R^n; in this case,

    (dF_u/dF_ũ)(x_1, ···, x_n) = [ ∂^n F_u / ∂x_1···∂x_n ] / [ ∂^n F_ũ / ∂x_1···∂x_n ].    (2)

In the rest of the paper, we shall consider u(ω) ∈ R; but the results are generally valid for multidimensional u(ω).
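To make Eq. (1) concrete, here is a minimal numerical sketch of ours (not from the paper): when both cdfs have densities, (dF_u/dF_ũ)(x) = f_u(x)/f_ũ(x), so the rhs of (1) becomes f_ũ(u(ω))/f_u(u(ω)). The Gaussian choices F_u = N(0,1) and F_ũ = N(1,1), the seed, and the sample size are illustrative assumptions.

```python
# Sketch of Eq. (1) under assumed Gaussian observations (illustration only).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
u = rng.normal(0.0, 1.0, size=1_000_000)   # samples of u(w) under P, so u ~ F_u = N(0,1)

# Fluctuating RN derivative along the samples:
# dP~/dP(w) = [f_u(u)/f_u~(u)]^{-1} = f_u~(u(w)) / f_u(u(w)), with F_u~ = N(1,1)
rn = norm.pdf(u, loc=1.0, scale=1.0) / norm.pdf(u, loc=0.0, scale=1.0)

print(rn.mean())          # ~ 1: E^P[dP~/dP] = 1, as required of a probability measure
print(np.log(rn).mean())  # ~ -0.5: E^P[ln dP~/dP] = -D(P||P~) for this Gaussian pair
```

The mean of the RN derivative is 1, but its logarithm, the fluctuating entropy, varies from realization to realization; only its expectation gives a (relative) entropy.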
In this paper, we present key results based on this informational representation of stochastic change. We show that all types of entropy are unified under the single theory. The discussions are restricted to very simple cases; we only touch upon the stochastic change with a pair of correlated u(ω) → ũ(ω), whose respective generated σ-algebras are non-identical in general. The notion of “thermodynamic work” will appear then [5].

The informational and probabilistic representations of stochastic changes echo the Schrödinger and Heisenberg pictures in quantum dynamics [12]: in terms of wave functions in the abstract, invisible Hilbert space and in terms of self-adjoint operators as observables.

II. INFORMATIONAL REPRESENTATION OF STOCHASTIC CHANGE

Statistics and information: Push-forward and pull-back. Consider a sequence of real-valued random variables of a same physical origin and their individual cdfs:

    F_{u_1}(x), F_{u_2}(x), ···, F_{u_T}(x).                      (3)

According to the axiomatic theory of probability built on (Ω, F, P), there is a sequence of random variables

    u_1(ω), u_2(ω), ···, u_T(ω),                                  (4)

in which each u_k(ω) maps the P(ω) to the push-forward measure F_k(x), x ∈ R. Eq. (5) illustrates this, with u and ũ standing for any u_i and u_j:

    P(ω)  --u(ω)-->  F_u(x)
      |       \
      |        \ ũ(ω)                                             (5)
      v         v
    P̃(ω)  --u(ω)-->  F_ũ(x)

where the vertical arrow is the change of measure with (dP̃/dP)(ω) = [(dF_u/dF_ũ)(u(ω))]^{−1}.

Informational representation, thus, considers the (3) as the push-forward of a sequence of measures P_1 = P, P_2, ···, P_T under a single observable, say u_1(ω). This sequence of P_k can be represented through the fluctuating entropy inside the [···] below:

    dP_k(ω) = [ (dF_{u_1}/dF_{u_k})(u_1(ω)) ]^{−1} dP(ω).         (6)

Narratives based on the information representation have rich varieties. We have discussed above the information representation cast with a single, common random variable u(ω): P_k(ω) --u--> F_k(x). Alternatively, one can cast the information representation with a single, given F*(x) on R: P_k(ω) --u_k--> F*(x) for any sequence of u_k(ω). Actually there is a corresponding sequence of measures P_k, whose push-forwards are all F*(x) on the real line, independent of k. Then parallel to Eq. (5) we have a schematic:

    P(ω)    --u_k(ω)-->  F_{u_k}(x)
      |
      |                                                           (7)
      v
    P_k(ω)  --u_k(ω)-->  F*(x)

where the vertical arrow is (dP_k/dP)(ω) = [(dF_{u_k}/dF*)(u_k(ω))]^{−1}.

If the invariant F*(x) is the uniform distribution, e.g., when one chooses the Lebesgue measure on R with F*(x) = x, then one has [6]

    E^P[ −ln (dF_{u_k}/dx)(u_k(ω)) ] = −∫_R f_{u_k}(x) ln f_{u_k}(x) dx.

This is precisely the Shannon entropy! More generally, with a fixed F*(x) one has

    E^P[ ln (dF_{u_k}/dF*)(u_k(ω)) ] = ∫_R f_{u_k}(x) ln [ f_{u_k}(x)/f*(x) ] dx
        = −∫_R f_{u_k}(x) ln f*(x) dx + ∫_R f_{u_k}(x) ln f_{u_k}(x) dx.    (8)

This is exactly the relative entropy w.r.t. the stationary f*(x). In statistical thermodynamics, it is called free energy. The term −β^{−1} ln f*(x) on the rhs of (8) is called internal energy, where β^{−1} stands for the physical unit of energy; the first integral is then the mean internal energy.
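The statement that the expectation of the fluctuating entropy reproduces the Shannon entropy can be checked numerically. The sketch below is ours, with an assumed Gaussian f_{u_k} = N(0, σ²), whose differential entropy has the closed form (1/2) ln(2πeσ²).

```python
# Sketch: E^P[-ln f_u(u(w))] recovers the Shannon (differential) entropy,
# i.e., Eq. (8) with the Lebesgue reference F*(x) = x.  Gaussian example assumed.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
sigma = 2.0
u = rng.normal(0.0, sigma, size=1_000_000)       # samples of u_k(w) under P

xi = -np.log(norm.pdf(u, loc=0.0, scale=sigma))  # fluctuating entropy -ln f_uk(u_k(w))

print(xi.mean())                                    # sample mean of the fluctuating entropy
print(0.5 * np.log(2.0 * np.pi * np.e * sigma**2))  # closed form: (1/2) ln(2*pi*e*sigma^2)
```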
Essential facts on information. Several key mathematical facts concerning the information, as a random variable defined in (1), are worth stating.

First, even though the u(ω) appears on the rhs of the equation, the resulting lhs is independent of the u(ω): it is a random variable created from P, P̃, and the random variable ũ(ω), as clearly shown in Eq. (5). This should be compared with a well-known result in elementary probability: for a real-valued random variable η(ω) and its cdf F_η(x), the constructed random variable F_η(η(ω)) is uniformly distributed on [0, 1], independent of the nature of η(ω).

Second, if we denote the logarithm of (1) as ξ(ω), ξ(ω) = ln(dP̃/dP), then one has a result based on the change of measure for integration:

    E^P[η(ω)] = ∫_Ω η(ω) dP(ω) = ∫_Ω η(ω) (dP/dP̃)(ω) dP̃(ω)
              = ∫_Ω η(ω) e^{−ξ(ω)} dP̃(ω)
              = E^{P̃}[ η(ω) e^{−ξ(ω)} ].                          (9a)

And conversely,

    E^{P̃}[η(ω)] = E^P[ η(ω) e^{ξ(ω)} ].                           (9b)

In particular, when η = 1, the log-mean-exponential of the fluctuating ξ is zero. The incarnations of this equality have been discovered numerous times in thermodynamics, such as Zwanzig’s free energy perturbation method [13], the Jarzynski-Crooks equality [15, 36], and the Hatano-Sasa equality [16].

Third, one has an inequality,

    E^P[ξ(ω)] ≤ ln E^P[ e^{ξ(ω)} ] = 0.                           (10)

As we have discussed in [5], this inequality is the mathematical origin of almost all inequalities in connection to entropy in thermodynamics and information theory.

Fourth, let us again consider a real-valued random variable η(ω), with probability density function f_η(x), x ∈ R, and its information, e.g., fluctuating entropy ξ(ω) = −ln f_η(η(ω)). Then one has a new measure P̃ with (dP̃/dP)(ω) = e^{ξ(ω)}, and

    ∫_{x_1<η(ω)≤x_2} dP̃(ω) = ∫_{x_1<η(ω)≤x_2} e^{ξ(ω)} dP(ω)
                            = ∫_{x_1<η(ω)≤x_2} [ f_η(η(ω)) ]^{−1} dP(ω)
                            = ∫_{x_1}^{x_2} [ f_η(y) ]^{−1} f_η(y) dy = x_2 − x_1;

that is, P̃ assigns to the event {x_1 < η ≤ x_2} the Lebesgue measure of the interval (x_1, x_2].
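The second and third facts are also easy to verify numerically. In the sketch below (ours; the Gaussian pair, the test observable η(ω) = u²(ω), and the sample sizes are assumptions for illustration), ξ = ln(dP̃/dP) is evaluated along samples drawn under P:

```python
# Sketch of Eqs. (9b) and (10) with assumed P: u ~ N(0,1) and P~: u ~ N(1,1).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
u = rng.normal(0.0, 1.0, size=1_000_000)                # samples under P
xi = norm.logpdf(u, loc=1.0) - norm.logpdf(u, loc=0.0)  # xi(w) = ln(dP~/dP) via Eq. (1)

print(np.exp(xi).mean())   # ~ 1: the log-mean-exponential of xi is zero (eta = 1 in (9b))
print(xi.mean())           # ~ -0.5 <= 0: the inequality (10)

# Eq. (9b): E^{P~}[eta] = E^P[eta * e^xi], with the test observable eta(w) = u(w)**2
print((u**2 * np.exp(xi)).mean())                          # reweighted estimate, ~ 2
print((rng.normal(1.0, 1.0, size=1_000_000) ** 2).mean())  # direct sampling under P~, ~ 2
```

The reweighting in the last step is the same mechanism that underlies the free energy perturbation and Jarzynski-Crooks identities cited above.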