Entropy in Physics: An Overview of Definitions and Applications in Quantum Mechanics

Fabian Immanuel IJpelaar
January 22, 2021

Abstract

In modern physics, entropy is a well-known and important quantity. At its core, entropy is a function of the probabilities of a distribution, meant to describe the uncertainty in the outcome of the random process that the distribution represents. However, it has been used in many different fields and, as a consequence, has acquired many interpretations. Moreover, many different functions have been grouped under the name of entropy, each with its own uses and interpretations. In this work, we discuss the definitions, origins, and interpretations of many of these functions, as well as how they fit together. We also explicitly cover some of the applications that these entropies have found within physics, in particular in quantum physics. These applications include thermodynamics, measurement uncertainty in quantum mechanics, measures of mixedness in the density matrix and phase space formalisms, measures of entanglement, and parton distributions in QCD.

Master's Thesis Physics, Van Swinderen Institute
Supervisor: Daniël Boer
Second Examiner: Elisabetta Pallante

Contents

Introduction

1 The Entropies of Statistical Mechanics and Information Theory
  1.1 Statistical Mechanics and the Classical Entropy
    1.1.1 The Liouville Equation
    1.1.2 The Microcanonical Ensemble
    1.1.3 The Microcanonical Entropy
    1.1.4 Extensivity of the Microcanonical Entropy and Temperature
    1.1.5 The Canonical Ensemble
    1.1.6 The Canonical Entropy
    1.1.7 Closing Remarks
  1.2 The Shannon Entropy
    1.2.1 Comparison with the Standard Deviation
    1.2.2 Shannon Entropy as a Measure of (Lack of) Information
    1.2.3 Statistical Mechanics and Information Theory: Jaynes' Maximum Entropy Principle
    1.2.4 Properties
  1.3 The Differential (Continuous Shannon) Entropy
    1.3.1 Properties
    1.3.2 The Differential Entropy and the Standard Deviation
  1.4 Summary and Concluding Remarks

2 Coarse Graining Classical Distributions
  2.1 Aggregation of Discrete Probabilities
  2.2 Coarse Graining Continuous Distributions
  2.3 The Second Law and Mixing
  2.4 Concluding Remarks

3 Generalized Entropies
  3.1 The Shannon-Khinchin Axioms
    3.1.1 Extensivity
    3.1.2 Experimental Robustness
  3.2 The Maximum Entropy Method Revisited
  3.3 Tsallis Entropy
    3.3.1 Properties
    3.3.2 Relation to the Shannon Entropy
    3.3.3 Maximization
    3.3.4 Extensivity
    3.3.5 The Zeroth Law of Thermodynamics and the Tsallis Composition Law
  3.4 The Rényi Entropy
    3.4.1 Properties
    3.4.2 Entropy Maximization
  3.5 Scaling Exponents
  3.6 Imposing Extensivity
  3.7 Summary and Concluding Remarks

4 Density Matrices, Mixed States and Quantum Operations
  4.1 Introduction to the Density Matrix: The Particle Source
  4.2 The Density Matrix
  4.3 Schmidt Decomposition
  4.4 Quantum Operations
    4.4.1 Kraus Operators from Unitary Time Evolution
  4.5 Defining Mixedness through Majorization
    4.5.1 Schur-Convexity (Concavity)

5 Measurement Uncertainty in Discrete Hilbert Spaces
  5.1 The Uncertainty Principle
    5.1.1 The Maassen-Uffink Uncertainty Relation in a 2D Hilbert Space
  5.2 Concluding Remarks

6 Position and Momentum Uncertainty
  6.1 The Uncertainty Relations
  6.2 Infinite Square Well
    6.2.1 Comparison with the Classical Infinite Square Well
  6.3 Harmonic Oscillator
    6.3.1 Comparison with the Classical Harmonic Oscillator
    6.3.2 Coherent States and Squeezed States
  6.4 The Hydrogen Atom
  6.5 Maximum Entropy Wavefunctions
  6.6 Summary and Concluding Remarks

7 Measures of Mixedness
  7.1 The Von Neumann Entropy
    7.1.1 Von Neumann's Gedanken Experiment
    7.1.2 The Ensembles and Entropies of Quantum Statistical Physics
    7.1.3 Properties
    7.1.4 Entropy Maximization
  7.2 The Quantum Tsallis and Rényi Entropies
  7.3 Summary and Concluding Remarks

8 Entanglement and the Entanglement Entropy
  8.1 Quantifying Entanglement: The LOCC Paradigm
    8.1.1 Maximally and Minimally Entangled States
    8.1.2 Locally Manipulating Two Qubits
    8.1.3 The Distillable Entanglement and Entanglement Cost
    8.1.4 Beyond Pure States
  8.2 The Entanglement Entropy and Area Laws
  8.3 The Spin XY Model and the Ground State Entropy
  8.4 Entanglement Entropy in Field Theory
    8.4.1 The Real Time Approach
    8.4.2 The Euclidean Method
    8.4.3 The Replica Trick
    8.4.4 The Entanglement Entropy of the Free Scalar Field
  8.5 Summary and Concluding Remarks

9 Phase Space Entropies
  9.1 The Wigner and Husimi Representations
  9.2 The Linear Entropy and Wigner Distributions
    9.2.1 Properties
    9.2.2 Entropy Maximization
    9.2.3 Entropy of Coarse Graining
  9.3 Wehrl Entropy and the Husimi-Q Representation
    9.3.1 Quantum-Optical States
  9.4 Summary and Concluding Remarks

10 Entropy and Parton Distributions
  10.1 DIS and the Parton Model
    10.1.1 QCD and Corrections to Bjorken Scaling
  10.2 The Entropy of the Parton Distribution Functions
  10.3 The Entropy of Ignorance
    10.3.1 Entropy of Ignorance for a Spin System
    10.3.2 The Entropy of Ignorance vs the Von Neumann Entropy in the CCG Model
  10.4 Summary and Concluding Remarks

Conclusion

Appendices

A Hydrogen Atom Entropies

Introduction

Entropy, although ubiquitous throughout physics and even many other fields, is notorious for being hard to grasp. Von Neumann famously said to Shannon, who had developed his "own" entropy quantity but did not yet know what to call it [1]:

You should call it entropy, for two reasons.
In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage. (John von Neumann to Claude Shannon, 1949)

While this quote is a bit dated by now, entropy and entropy-like quantities are still being studied very intensively, and since then the number of "entropies" has only grown. Wehrl [2] has even gone so far as to say: "There is a tremendous variety of entropy-like quantities, especially in the classical case, and perhaps every month somebody invents a new one". Moreover, as we will discuss, all of these quantities are used within many different fields of research, so even for the same quantity interpretations may vary, adding another barrier to entry for anyone who wants to learn about the use of these entropies. The aim of this work is to serve as a general starting point for someone who wants to learn about these entropies and related topics, while still focusing on applications and interpretations in physics, and in particular quantum mechanics. Since we feel that much of the confusion about entropy comes from its past uses and interpretations, we give a brief overview of these in the rest of the introduction. In chapter 1, we show how the most well-known entropy, now called the Boltzmann-Gibbs-Shannon (BGS) entropy, is defined in classical statistical mechanics as well as in classical information theory. In chapter.
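Before the formal treatment of chapter 1, the core idea — entropy as a function of the probabilities of a distribution that quantifies the uncertainty in its outcome — can be illustrated with a minimal numerical sketch. The snippet below (purely illustrative, not part of the thesis) computes the discrete Shannon entropy H(p) = -Σ_i p_i log p_i:

```python
import math

def shannon_entropy(probs, base=math.e):
    """Shannon entropy H(p) = -sum_i p_i log(p_i) of a discrete distribution.

    Terms with p_i = 0 are skipped, using the convention 0 log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain: one bit of entropy (base-2 logarithm).
fair = shannon_entropy([0.5, 0.5], base=2)     # 1.0

# A biased coin is more predictable, so its entropy is strictly lower.
biased = shannon_entropy([0.9, 0.1], base=2)

# A certain outcome carries no uncertainty at all.
certain = shannon_entropy([1.0], base=2)       # 0.0
```

Note that the entropy depends only on the probabilities, not on the outcomes themselves; this is what allows the same functional form to appear in statistical mechanics and in information theory, as discussed in chapter 1.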