03. Entropy: Boltzmann, Gibbs, Shannon


I. Entropy in Statistical Mechanics.

• Goal: To explain the behavior of macroscopic systems in terms of the dynamical laws governing their microscopic constituents.
  - In particular: To provide a micro-dynamical explanation of the 2nd Law.

1. Boltzmann's Approach (Ludwig Boltzmann, 1844-1906).

• Consider different "macrostates" of a gas. Why does the gas prefer to be in the thermodynamic equilibrium macrostate, i.e., the macrostate with constant thermodynamic properties (temperature, volume, pressure, etc.)?
• Suppose the gas consists of N identical particles governed by Hamilton's equations of motion (the micro-dynamics).

Def. 1. A microstate X of a gas is a specification of the position (3 values) and momentum (3 values) of each of its N particles.

• Let Ω = phase space = the 6N-dim space of all possible microstates.
• Let ΩE = the region of Ω consisting of all microstates with constant energy E.
• The Hamiltonian dynamics maps an initial microstate Xi to a final microstate Xf. Can the 2nd Law be explained by recourse to this dynamics?

[Figure: the energy surface ΩE, with the dynamics carrying Xi to Xf.]

Def. 2. A macrostate Γ of a gas is a specification of the gas in terms of macroscopic properties (pressure, temperature, volume, etc.).

• Relation between microstates and macrostates: macrostates supervene on microstates!
  - To each microstate there corresponds exactly one macrostate.
  - Many distinct microstates can correspond to the same macrostate.
• So: ΩE is partitioned into a finite number of regions corresponding to macrostates, with each microstate X belonging to exactly one macrostate Γ(X).

[Figure: ΩE partitioned into macrostate regions; a microstate X sits inside its macrostate Γ(X).]

Boltzmann's Claim: The equilibrium macrostate Γeq is vastly larger than any other macrostate, so it contains the vast majority of possible microstates.

Def. 3. The Boltzmann Entropy is defined by

    SB(Γ(X)) = k log |Γ(X)|

where |Γ(X)| is the volume of Γ(X).

• So: SB(Γ(X)) is a measure of the size of Γ(X), and SB(Γ(X)) obtains its maximum value for Γeq.

[Figure: ΩE almost entirely filled by Γeq, with very small non-equilibrium macrostates at the edges.]

• Thus: SB increases over time because, for any initial microstate Xi, the dynamics will map Xi into Γeq very quickly, and then keep it there for an extremely long time.

Two Ways to Explain the Approach to Equilibrium:

(a) Appeal to Typicality (Goldstein 2001)

Claim: A system approaches equilibrium because equilibrium microstates are typical and nonequilibrium microstates are atypical.

• Why? For large N, ΩE is almost entirely filled up with equilibrium microstates. Hence they are "typical".
  - But: What is it about the dynamics that evolves atypical states to typical states? "If a system is in an atypical microstate, it does not evolve into an equilibrium microstate just because the latter is typical." (Frigg 2009)
  - Need to identify properties of the dynamics that guarantee atypical states evolve into typical states. And: need to show that these properties are typical.
  - Ex: If the dynamics is chaotic (in an appropriate sense), then (under certain conditions) any initial microstate Xi will quickly be mapped into Γeq and remain there for long periods of time. (Frigg 2009)

(b) Appeal to Probabilities

Claim: A system approaches equilibrium because it evolves from states of lower toward states of higher probability, and the equilibrium state is the state of highest probability.

• Associate probabilities with macrostates: the larger the macrostate, the greater the probability of finding a microstate in it.

  "In most cases, the initial state will be a very unlikely state. From this state the system will steadily evolve towards more likely states until it has finally reached the most likely state, i.e., the state of thermal equilibrium."

• Task: Make this a bit more precise (Boltzmann's combinatorial argument)...
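Boltzmann's Claim can be illustrated with a toy model (my own sketch, not part of the notes): N distinguishable particles, each in either the left or right half of a box, with the macrostate given by the left-half count n. The size of a macrostate is then the binomial coefficient C(N, n), and the equilibrium macrostate n = N/2 dwarfs every non-equilibrium one.

```python
import math

# Toy model: N labeled particles, each in the left or right half of a box.
# A "macrostate" is the left-half count n; its size |Γ| is the number of
# microstates realizing it, C(N, n). Equilibrium is n = N/2.
N = 100
sizes = {n: math.comb(N, n) for n in range(N + 1)}
total = 2 ** N          # total number of microstates
eq_size = sizes[N // 2]

# Fraction of ALL microstates lying in near-equilibrium macrostates
# (within 10 particles of an even split):
near_eq = sum(sizes[n] for n in range(40, 61)) / total
print(near_eq)                       # ≈ 0.96: the vast majority of microstates
print(eq_size > sizes[20] * 10**6)   # True: Γeq vastly larger than a far-from-eq macrostate
```

Already at N = 100 the near-equilibrium macrostates contain about 96% of all microstates; for N of the order of Avogadro's number the dominance is overwhelming.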
Boltzmann's Combinatorial Argument

• Start with the 6-dim phase space Ωµ of a single particle. (ΩE = N copies of Ωµ; a point in Ωµ is a single-particle microstate.)
• Partition Ωµ into ℓ cells w1, w2, ..., wℓ of size δw.
• A state of an N-particle system is given by N points in Ωµ.

Def. 4. An arrangement is a specification of which points lie in which cells.

Def. 5. A distribution is a specification of how many points (regardless of which ones) lie in each cell. It takes the form (n1, n2, ..., nℓ), where nj = # of points in wj.

• Ex: Arrangement #1: state of particle P6 in w1, state of particle P89 in w3, etc. Arrangement #2: state of P89 in w1, state of P6 in w3, etc. Both correspond to the same distribution, e.g. (1, 0, 2, 0, 1, 1, ...).
• Note: More than one arrangement can correspond to the same distribution.
• How many arrangements G(Di) are compatible with a given distribution Di = (n1, n2, ..., nℓ)?
• Answer:

    G(Di) = N! / (n1! n2! ⋯ nℓ!)

  This is the number of ways to arrange N distinguishable objects into ℓ bins with capacities n1, n2, ..., nℓ. (Recall: n! = n(n − 1)(n − 2)⋯1 is the number of ways to arrange n distinguishable objects, and 0! = 1.)

• Check: Let D1 = (N, 0, ..., 0) and D2 = (N − 1, 1, 0, ..., 0).
  - G(D1) = N!/N! = 1. (There is only one way for all N particles to be in w1.)
  - G(D2) = N!/(N − 1)! = N. (There are N different ways w2 could have one point in it; namely, if P1 was in it, or if P2 was in it, or if P3 was in it, etc.)
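The counting formula and the two checks can be verified directly; here is a short illustrative sketch (the helper `G` is my own, not from the notes), run for a small N so the numbers stay legible.

```python
import math
from functools import reduce

# Number of arrangements compatible with a distribution D = (n1, ..., nl):
# G(D) = N! / (n1! n2! ... nl!), where N = n1 + ... + nl.
def G(D):
    N = sum(D)
    return math.factorial(N) // reduce(lambda acc, n: acc * math.factorial(n), D, 1)

# The two checks from the text, for N = 5 particles in 4 cells:
N = 5
print(G((N, 0, 0, 0)))      # 1: only one way to put all N particles in w1
print(G((N - 1, 1, 0, 0)))  # 5 = N: any one of the N particles may sit in w2
print(G((2, 1, 1, 1)))      # 60: more even distributions admit many arrangements
```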
  "The probability of this distribution [Di] is then given by the number of permutations of which the elements of this distribution are capable, that is by the number [G(Di)]. As the most probable distribution, i.e., as the one corresponding to thermal equilibrium, we again regard that distribution for which this expression is maximal..."

• Again: The probability of a distribution Di is given by G(Di).
• And: Each distribution Di corresponds to a macrostate ΓDi. Why? Because a system's macroscopic properties (volume, pressure, temperature, etc.) depend only on how many particles are in particular microstates, and not on which particles are in which microstates.
• What is the size of this macrostate?
  - A point in ΩE corresponds to an arrangement of points in Ωµ.
  - The size of a macrostate ΓDi in ΩE is given by the number of points it contains (the number of arrangements compatible with Di) multiplied by a volume element of ΩE.
  - A volume element of ΩE is given by N copies of a volume element δw of Ωµ.
• So: the size of ΓDi is

    |ΓDi| = (# of arrangements compatible with Di) × (volume element of ΩE) = G(Di) δw^N

• In other words: The probability G(Di) of a distribution Di is proportional to the size of its corresponding macrostate ΓDi.
  - The equilibrium macrostate, being the largest, is the most probable; and a system evolves from states of low probability to states of high probability.
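The claim that a distribution's probability is proportional to the number of compatible arrangements can be verified by brute force for a tiny system (an illustrative sketch of my own, not from the notes): enumerate every arrangement of N particles into ℓ cells, assume each is equally likely, and tally the distributions that result.

```python
import itertools
import math
from collections import Counter

# Enumerate all arrangements (microstates) of N particles into l cells,
# each equally likely, and count how often each distribution occurs.
N, l = 4, 2
tally = Counter()
for arrangement in itertools.product(range(l), repeat=N):
    counts = tuple(arrangement.count(cell) for cell in range(l))
    tally[counts] += 1

# The tally for each distribution D equals G(D) = N!/(n1!...nl!):
for D, count in sorted(tally.items()):
    g = math.factorial(N) // math.prod(math.factorial(n) for n in D)
    print(D, count, count == g)

# The most frequent (most probable) distribution is the uniform one:
print(max(tally, key=tally.get))  # (2, 2)
```

With N = 4 and two cells, the tallies are 1, 4, 6, 4, 1 — exactly the values of G(D), with the even split (2, 2) the most probable, as the argument predicts.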
• The Boltzmann entropy of ΓDi is given by:

    SB(ΓDi) = k log(G(Di) δw^N)
            = k log G(Di) + Nk log δw
            = k log G(Di) + const.

  So SB is a measure of how large a macrostate is, and thus of how probable the corresponding distribution of microstates is.

Other Formulations of SB

• In terms of the occupation numbers nj, using Stirling's approximation log n! ≈ n log n − n and the constraint n1 + ... + nℓ = N:

    SB(ΓDi) = k log G(Di) + const.
            = k log( N! / (n1! n2! ⋯ nℓ!) ) + const.
            = k log N! − k log n1! − ... − k log nℓ! + const.
            ≈ k(N log N − N) − k(n1 log n1 − n1) − ... − k(nℓ log nℓ − nℓ) + const.
            = −k Σj nj log nj + const.

  (The linear terms cancel because Σj nj = N, and Nk log N is a constant.)

• In terms of microstate probabilities: let pj = nj/N = the probability of finding a randomly chosen microstate in cell wj. (These are probabilities for single-particle microstates, not for macrostates/distributions!) Then:

    SB(ΓDi) = −Nk Σj pj log pj + const.

  The biggest value of SB is for the distribution Di for which the pj's are all 1/ℓ; i.e., the nj's are all N/ℓ (the equilibrium distribution).

Relation between Boltzmann Entropy SB and Thermodynamic Entropy ST

• First: Let's derive the Maxwell-Boltzmann equilibrium distribution.
• Assume: A system of N weakly interacting particles described by:

    Σj nj = N        nj = # of particles in cell wj;  N = total # of particles
    Σj εj nj = U     εj = energy of a microstate in wj;  U = total internal energy

  (The weakly-interacting assumption means the total internal energy is just the sum of the energies of the individual particles.)

• Recall: SB(nj) = k(N log N − N) − k Σj (nj log nj − nj) + const.
• So:

    dSB/dnj = −k (d/dnj)(nj log nj − nj) = −k (log nj + nj/nj − 1) = −k log nj

• Or: dSB = −k Σj log nj dnj
• Now: SB takes its maximum value for the values nj* that solve

    dSB = −k Σj log nj* dnj = 0

  where the small changes to SB are due only to small changes dnj.
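Two of the claims above lend themselves to a quick numerical sanity check (a sketch of my own, not part of the notes): Stirling's approximation improves as n grows, and the per-particle entropy −Σj pj log pj is largest for the uniform distribution pj = 1/ℓ.

```python
import math

# Stirling's approximation log n! ≈ n log n − n, used in deriving the
# occupation-number form of SB. Its relative error shrinks as n grows:
rel_err = {}
for n in (10, 100, 1000):
    exact = math.lgamma(n + 1)          # lgamma(n+1) = log n!
    approx = n * math.log(n) - n
    rel_err[n] = abs(exact - approx) / exact
print(rel_err)  # error falls from ~14% at n=10 to ~0.07% at n=1000

# The per-particle entropy −Σ pj log pj is maximized by the uniform
# distribution pj = 1/l, i.e., by the equilibrium distribution:
def H(p):
    return -sum(x * math.log(x) for x in p if x > 0)

l = 4
uniform = [1 / l] * l
trials = [[0.7, 0.1, 0.1, 0.1], [0.4, 0.3, 0.2, 0.1], [0.25, 0.25, 0.3, 0.2]]
print(all(H(p) < H(uniform) for p in trials))  # True
```

For physical systems the nj are enormous, so the Stirling step in the derivation above introduces negligible error.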