Entropy 2005, 7[4], 253-299
Entropy ISSN 1099-4300
www.mdpi.org/entropy/

Review

Differential entropy and time

Piotr Garbaczewski
Institute of Physics, University of Zielona Góra, ul. Szafrana 4a, 65-516 Zielona Góra, Poland
E-mail: [email protected]

Received: 22 August 2005 / Accepted: 17 October 2005 / Published: 18 October 2005
arXiv:quant-ph/0408192v5, 17 Oct 2005

Abstract: We give a detailed analysis of the Gibbs-type entropy notion and its dynamical behavior in the case of time-dependent continuous probability distributions of varied origins, related to classical and quantum systems. The purpose-dependent usage of conditional Kullback-Leibler and Gibbs (Shannon) entropies is explained in the case of non-equilibrium Smoluchowski processes. The very different temporal behaviors of the Gibbs and Kullback entropies are confronted. A specific conceptual niche is addressed, where quantum von Neumann, classical Kullback-Leibler and Gibbs entropies can be consistently introduced as information measures for the same physical system. If the dynamics of probability densities is driven by the Schrödinger-picture wave-packet evolution, Gibbs-type and related Fisher information functionals appear to quantify nontrivial power transfer processes in the mean. This observation is found to extend to classical dissipative processes, and supports the view that the Shannon entropy dynamics provides an insight into physically relevant non-equilibrium phenomena which are inaccessible in terms of the Kullback-Leibler entropy and typically ignored in the literature.

Keywords: differential entropy; von Neumann entropy; Shannon entropy; Kullback-Leibler entropy; Gibbs entropy; entropy methods; entropy functionals; Fisher information; Ornstein-Uhlenbeck process; Schrödinger picture evolution; dynamics of probability densities; invariant density.

PACS codes: 05.45.+b, 02.50.-r, 03.65.Ta, 03.67.-a

1 Introduction

Among numerous manifestations of the concept of entropy in physics and mathematics, the information-theory based entropy methods have been devised to investigate the large-time behavior of solutions of various (mostly dissipative) partial differential equations. Shannon, Kullback and von Neumann entropies are typical information-theory tools, designed to quantify the information content, and possibly the information loss, for various classical and quantum systems in a specified state.

For quantum systems the von Neumann entropy vanishes on pure states, hence one presumes to have complete information about the state of such a system. On the other hand, for pure states the Gibbs-type (Shannon, e.g. differential) entropy gives access to another information-theory level, associated with a probability distribution inferred from an $L^2(\mathbb{R}^n)$ wave packet.

Although Shannon or Kullback entropies are interpreted as information measures, it is quite natural to think of entropy as a measure of uncertainty. In view of the profound role played by the Shannon entropy in the formulation of entropic indeterminacy relations [1, 2], the term "information" is in the present paper mostly used in the technical sense, meaning the inverse of uncertainty. In physics, the notion of entropy is typically regarded as a measure of the degree of randomness and of the tendency of physical systems to become less and less organized. Throughout the paper we shall attribute a more concrete meaning to the term "organization", both in the classical and quantum contexts.
Namely, we shall pay special attention to quantifying, in terms of suitable entropy functionals, the degree of (de)localization of a probability distribution, and the dynamical behavior of this specific localization-uncertainty feature of a physical system.

1.1 Notions of entropy

Notions of entropy, information and uncertainty are intertwined and cannot be sharply differentiated. While entropy and uncertainty are, to some extent, synonymous measures of ignorance (lack of information), the complementary notion of information basically quantifies the ability of observers to make reliable predictions about the system [4, 6, 7]: the more aware one is of the chances of a concrete outcome, the lower is the uncertainty of this outcome. Normally, the growth of uncertainty is identified with an increase of the entropy, which in turn is interpreted as an information loss; consult e.g. standard formulations of the celebrated Boltzmann H-theorem.

Following Ref. [3], let us recall that entropy, be it thermodynamical (Gibbs-Boltzmann), dynamical, von Neumann, Wehrl, Shannon, Rényi, Tsallis or any other conceivable candidate, has an exceptional status among physical quantities. As a derived quantity it does not show up in any fundamental equation of motion. Generically, there is no a priori preferred notion of entropy in physical applications (except, perhaps, for the thermodynamical Clausius case), and its specific choice appears to be purpose-dependent.

As an obvious remnant of standard thermodynamical reasoning, one expects entropy to be a state function of the system (thermodynamical notions of equilibrium or near-equilibrium are implicit). This state connotation is a source of ambiguities, since inequivalent notions of the system state are used in the description of physical systems, be they classical, thermodynamical or quantum, not to mention the rather specialized meaning of "state" employed in standard information theory [4, 9, 7].

A primitive information-theory system is simply a bit, whose two admissible states are the binary digits 1 and 0. Its quantum equivalent is a qubit, whose admissible states are vectors in a two-dimensional Hilbert space, hence an infinity of pure states of a two-level quantum system. The information-theory framework, if extended to more complicated systems, employs a plethora of notions of state [4, 7]. As very special cases we may mention a phase-space point as the determinative of the state of a classical dynamical system, or the macroscopic notion of a thermodynamical state in its classical and quantum versions [3].

A symbolic mathematical representation of quantum states in terms of wave vectors and/or density operators is expected to provide experimentally verifiable "information" about the system. To obtain a catalogue of the corresponding statistical predictions, an a priori choice of suitable observables (and thus measurement procedures) is necessary. Only then may a casual interpretation of entropy, as a measure of one's uncertainty about the measurable properties of a system in a prescribed quantum state, acquire an unambiguous meaning. When adapting the state notion to the Hilbert space language of quantum theory, we realize that normalized wave functions and density operators, which are traditionally supposed to determine the quantum state, allow us to extend the notion of entropy to certain functionals of the state of the quantum system.
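As a simple point of reference for such functionals, consider the classical differential entropy of a Gaussian probability density; the computation is a textbook standard rather than anything specific to the present paper. For

$$ \rho(x) = (2\pi\sigma^2)^{-1/2} \exp\!\left[-(x-\mu)^2/(2\sigma^2)\right] $$

one finds

$$ \mathcal{S}(\rho) = -\int_{\mathbb{R}} \rho(x) \ln \rho(x)\, dx = \frac{1}{2} \ln(2\pi e \sigma^2)\,, $$

so that the entropy grows logarithmically with the width $\sigma$: delocalization raises $\mathcal{S}(\rho)$, while sharp localization lowers it, and $\mathcal{S}(\rho)$ turns negative once $\sigma^2 < 1/(2\pi e)$. Unlike its discrete counterpart, the differential entropy is unbounded from below, which is one reason why its time evolution, rather than its absolute value, is the physically interesting object.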
The von Neumann entropy

$$ \mathcal{S}(\hat\rho) = -k_B \, \mathrm{Tr}(\hat\rho \ln \hat\rho) \qquad (1) $$

of a quantum state (i.e. of the density operator $\hat\rho$), though often infinite, is typically related to the degree of departure from purity (i.e. the "mixedness" level) of the state, and is particularly useful in quantifying measurements performed upon finite quantum systems. For a given density operator $\hat\rho$, the von Neumann entropy is commonly accepted as a reliable measure of the information content (about the departure from purity) to be experimentally extracted from a quantum system in a given state. Only under very specific circumstances, like e.g. in an optimal "quantum experiment" [11, 12] which refers to a diagonal density operator (with $p_i$, $1 \leq i \leq N$, being its eigenvalues), can the information gain be described in terms of both the von Neumann and the standard Shannon measure of information:

$$ -\mathrm{Tr}(\hat\rho \ln \hat\rho) = -\sum_i p_i \ln p_i \,. \qquad (2) $$

Since the von Neumann entropy is invariant under unitary transformations, the result exhibits invariance under a change of the Hilbert space basis, and conservation in time for a closed system (when there is no information/energy exchange with the environment). Thus, the Schrödinger dynamics has no impact on the von Neumann encoding of information; see also [13, 14] for a related discussion.

Pure states have vanishing von Neumann entropy ($\mathcal{S}(\hat\rho) = 0$ "for the pure states and only for them", [3]) and are normally considered irrelevant from the quantum information theory perspective, since "one has complete information" [3] about such states. One may even say that a pure state is an unjustified over-idealization, since otherwise it would constitute e.g. a completely measured state of a system in an infinite Hilbert space [15]. A colloquial interpretation of this situation is: since the wave function provides a complete description of a quantum system, surely we have no uncertainty about this quantum system [16]. Note that, as a side comment, we find in Ref. [15] a minor excuse: "this idealization, often employed for position-momentum degrees of freedom, is usually an adequate approximation", to be read as an answer to an objection of Ref. [17]: "although continuous observables such as the position are familiar enough, they are really unphysical idealizations"; cf. in this connection [18] for an alternative view.

On the other hand, the classic Shannon entropy is known to be a natural measure of the amount of uncertainty related to measurements for pairs of observables, discrete and continuous on an equal footing, when a quantum system actually is in a pure state. Hence,
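Both features just described, the vanishing of (1) on pure states and the reduction (2) to the Shannon form for a diagonal density operator, are easy to check numerically. A minimal sketch (illustrative only, not from the paper; NumPy is assumed, units are chosen so that $k_B = 1$, and the sample states are arbitrary):

```python
import numpy as np

# Illustration of Eqs. (1)-(2), in units where k_B = 1.

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), evaluated via the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)      # spectrum of a Hermitian density matrix
    p = p[p > 1e-12]                 # discard numerical zeros (0 ln 0 := 0)
    return -np.sum(p * np.log(p))

def shannon_entropy(p):
    """-sum_i p_i ln p_i for a discrete probability vector."""
    p = np.asarray(p)
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

# Pure qubit state |+> = (|0> + |1>)/sqrt(2): a rank-one projector.
plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
rho_pure = np.outer(plus, plus)
print(von_neumann_entropy(rho_pure))          # ~ 0, as for any pure state

# Diagonal (mixed) density operator: Eq. (2) holds, so the von Neumann
# entropy equals the Shannon entropy of the eigenvalues p_i.
p = np.array([0.75, 0.25])
rho_diag = np.diag(p)
print(von_neumann_entropy(rho_diag))          # ~ 0.5623
print(shannon_entropy(p))                     # the same value

# Unitary invariance: S(U rho U^T) = S(rho) for a real rotation U,
# mirroring the basis independence discussed above.
th = 0.3
U = np.array([[np.cos(th), -np.sin(th)],
              [np.sin(th),  np.cos(th)]])
print(von_neumann_entropy(U @ rho_diag @ U.T))  # ~ 0.5623 again
```

The diagonal case reproduces Eq. (2) directly, while the rotated density matrix illustrates the basis independence that renders the von Neumann entropy insensitive to the unitary Schrödinger dynamics of a closed system.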