Entropy According to Boltzmann
Kim Sharp, Department of Biochemistry and Biophysics, University of Pennsylvania, 2016


"The initial state in most cases is bound to be highly improbable and from it the system will always rapidly approach a more probable state until it finally reaches the most probable state, i.e. that of the heat equilibrium. If we apply this to the second basic theorem we will be able to identify that quantity which is usually called entropy with the probability of the particular state." (Boltzmann, 1877) (1)

Heat flows from a hot object to a cold one. Two gases placed into the same container will spontaneously mix. Stirring mixes two liquids; it does not un-mix them. Mechanical motion is irreversibly dissipated as heat due to friction. Scientists' attempts to explain these commonplace observations spurred the development of thermodynamics and statistical mechanics and led to profound insights about the properties of matter, heat and time's arrow. Central to this is the thermodynamic concept of entropy. 150 years ago (in 1865), Clausius famously said (2) "The energy of the universe remains constant. The entropy of the universe tends to a maximum." This is one formulation of the 2nd Law of Thermodynamics. In 1877 Boltzmann for the first time explained what entropy is and why, according to the 2nd Law of Thermodynamics, entropy increases (3). Boltzmann also showed that there were three contributions to entropy: from the motion of atoms (heat), from the distribution of atoms in space (position) (3), and from radiation (photon entropy) (4). These were the only known forms of entropy until the discovery of black hole entropy by Hawking and Bekenstein in the 1970's.

Boltzmann's own words, quoted above, are as concise and as accurate a statement on the topic as any made in the subsequent century and a half. Boltzmann's original papers, though almost unread today, are unsurpassed in clarity and depth of physical insight (see for example references (1, 5–7)). It is likely that many inaccurate and misleading statements about entropy and the 2nd law of thermodynamics could have been avoided by simply reading or quoting what Boltzmann actually said, rather than what his commentators or scientific opponents said he said!

"How do I love thee? Let me count the ways." - E. B. Browning

Boltzmann's formulation of the 2nd Law can be paraphrased as: "What we observe is that state or change of states that is the most probable, where most probable means: can be realized in the most ways." The 2nd Law, once expressed in this way, has the same retrospective obviousness and inevitability that Darwin's theory of evolution through natural selection and survival of the fittest does. Incidentally, Boltzmann was a strong admirer of Darwin's work, and he correctly characterized the struggle of living things as a struggle "not for elements nor energy...rather, it is a struggle for entropy (more accurately: negative entropy)" (8). Boltzmann's statement of the 2nd Law in terms of probability is not simply a tautology but a statement of great explanatory power. This is apparent once the exact meaning of the terms observe, state, most probable, and ways is established.
Atoms, Motion and Probability

The fundamental definition of entropy and the explanation of the 2nd Law must both be grounded upon probability, Boltzmann realized, because of two simple considerations: i) atoms are in constant motion, interacting with other atoms, exchanging energy, changing positions; ii) we can neither observe nor calculate the positions and velocities of every atom. The macroscopic phenomena we seek to explain using the 2nd Law reflect the aggregate behavior of large numbers of atoms that can potentially adopt a combinatorially larger number of configurations and distributions of energies. So we first consider the probability distribution for atomic positions in the simplest case, in order to define our terms and appreciate the consequences of averaging over the stupendously large numbers involved. Then, with just a little more work, the 'rule' for the probability distribution for the energies of atoms is established.

Distribution in space

To introduce the probability calculations, consider the distribution of identical molecules of an ideal gas between two equal volumes V, labeled L for left and R for right (Figure 1). From the postulate of equal a priori probabilities, each molecule has the same probability, 1/2, of being in either volume: the volumes are the same size, and each molecule is moving rapidly with a velocity determined by the temperature, undergoing elastic collisions with other molecules and the walls which change its direction, but otherwise making no interactions [1].

[Figure 1: two equal volumes, L and R; the state distribution shown is (N_L=2, N_R=2)]

So at any instant we might expect an equal number of molecules in each volume. Indeed, if there are two molecules, this is the most likely single 'state distribution', produced by 2 of the 4 arrangements or complexions [2]: LR and RL both belong to the state distribution (N_L=1, N_R=1). The more extreme state distributions (N_L=2, N_R=0) and (N_L=0, N_R=2) are less likely, there being 1 complexion belonging to each: LL and RR. Still, with only two molecules these more unequal state distributions are quite probable, with p=1/4 each. Distributing 4 molecules, the possibilities are given in Table 1:

Table 1
State distribution   # of complexions   Probability
(N_L=4, N_R=0)       1                  (1/2)^4 = 0.0625
(N_L=3, N_R=1)       4                  4/16 = 0.25
(N_L=2, N_R=2)       6                  6/16 = 0.375
(N_L=1, N_R=3)       4                  4/16 = 0.25
(N_L=0, N_R=4)       1                  (1/2)^4 = 0.0625

The probabilities are obtained as follows: each molecule independently has a probability of 1/2 of being in L or R, so the probability of any particular complexion of 4 molecules is 1/2 x 1/2 x 1/2 x 1/2 = (1/2)^4 = 1/16. There is one such complexion, LLLL, that gives the distribution (N_L=4, N_R=0), four (RLLL, LRLL, etc.) that give the distribution (N_L=3, N_R=1), and so on. There are more ways (complexions) to obtain an equal distribution (6) than any other distribution, while the probability of the more extreme distributions, all molecules in the left or the right volume, diminishes to p=1/16 each. More generally, for N molecules the total number of complexions is 2^N, and the number of these that have a given number of molecules N_L in the first volume and N_R = N - N_L in the second is given by the binomial coefficient:

W(N_L, N) = N! / (N_L! N_R!)    (1)

[1] Tolman (1936) has emphasized that the ultimate justification for the postulate of equal a priori probabilities is the empirical success of the resulting theory.
[2] Here LR means a 'complexion' where the first molecule is in the left-hand volume, the second one is in the right-hand volume, and so on. The state distribution is defined as the number of molecules in each volume. More generally, the state distribution describes the number of molecules/atoms lying within each small range of spatial positions and kinetic energies, exactly analogous to the partition function of post-quantum statistical mechanics.
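As a concrete check on Table 1 and equation (1), here is a minimal Python sketch (our illustration; the article itself contains no code) that tabulates W(N_L, N) and the probability of each state distribution for any N, using math.comb (Python 3.8+):

```python
# Minimal sketch (not from the article): tabulate W(N_L, N) = N!/(N_L! N_R!)
# and the probability p = W / 2^N of each state distribution.
from math import comb

def state_distribution_table(n):
    total = 2 ** n                      # total number of complexions
    for n_left in range(n, -1, -1):
        w = comb(n, n_left)             # complexions with N_L molecules on the left
        print(f"(N_L={n_left}, N_R={n - n_left})   W = {w}   p = {w / total:.4f}")

state_distribution_table(4)
# Output reproduces Table 1:
# (N_L=4, N_R=0)   W = 1   p = 0.0625
# (N_L=3, N_R=1)   W = 4   p = 0.2500
# (N_L=2, N_R=2)   W = 6   p = 0.3750
# (N_L=1, N_R=3)   W = 4   p = 0.2500
# (N_L=0, N_R=4)   W = 1   p = 0.0625
```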
The function W(N_L, N) is peaked, with a maximum at N_L = N_R given (for even N) by

W_max = W(N_L = N_R, N) = N! / ((N/2)!)^2    (2)

As N increases the peak becomes ever sharper; in other words, an ever larger fraction of the complexions belong to state distributions that resemble ever more closely the most probable one, N_L = N_R. This is illustrated in the three panels of Figure 2 for 10 thousand, 1 million and 100 million molecules.

[Figure 2: probability of the state distributions near N_L = N_R for N = 10^4, 10^6 and 10^8 molecules, showing the narrowing peak]

With 10 thousand molecules, the majority of complexions belong to state distributions lying within ±1% of the most probable one, N_L = N_R, although a small but significant fraction lie further away. For 1 million molecules, almost all (99.92%) lie within a quarter of a percent of N_L = N_R. For 100 million molecules, most (99%) lie within two hundredths of a percent of N_L = N_R. In other words, for large numbers of molecules almost every state distribution is indistinguishable from the most probable one, N_L = N_R, the state distribution that has the most complexions, that can occur in the most ways.

[Figure 3: one sampled state distribution for each of six values of N, from 100 to 10 million]

Figure 3 shows one computer-generated distribution for each of six values of N, starting with 100 and increasing by a factor of 10 each time up to 10 million molecules. Already for N=100,000 molecules the total number of possible complexions is stupendous, greater than 1 followed by 30,000 zeroes (!), yet the single sample belongs to a state distribution that differs negligibly from N_L = N_R, exactly as one expects from the narrowing peak in Figure 2. For the larger values of N there are even more possible complexions, on the order of 10^(0.3N), yet the single sample belongs to a state distribution even more closely resembling the most probable one. The point of this example is twofold:

1) Even picking just one complexion, it is overwhelmingly likely to correspond to a state distribution with N_L ≈ N_R. In no way is it necessary to sample all the complexions (the so-called ergodic hypothesis), or even any significant fraction of them.

2) As the specific complexion changes due to thermal motion (as molecules cross between L and R), the gas is overwhelmingly likely to stay in state distributions where N_L ≈ N_R if it is already there; if instead it has a complexion where L and R are very different, it is overwhelmingly likely to move to complexions belonging to less unequal state distributions, because there are so many more of the latter than the former.
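The single-sample experiment of Figure 3 is easy to repeat. Below is a hedged Python sketch (our illustration, not the article's code; the seed and the bit-counting trick are our choices): each fair random bit marks one molecule as being in L, so the count of 1-bits in an N-bit random integer is one sampled N_L.

```python
# Sketch: draw ONE random complexion for each N and see how far N_L strays
# from N/2, illustrating the point of Figure 3.
import random

random.seed(2016)  # fixed seed so this illustration is reproducible
for k in range(2, 8):                      # N = 100, 1000, ..., 10,000,000
    n = 10 ** k
    # getrandbits(n) yields n independent fair bits; bit i = 1 means
    # molecule i is in the left volume, so the number of 1-bits is N_L.
    n_left = bin(random.getrandbits(n)).count("1")
    dev_pct = 100.0 * abs(n_left / n - 0.5)
    print(f"N = {n:>10,}   N_L/N = {n_left / n:.5f}   deviation = {dev_pct:.4f}%")
```

Binomial statistics predict a typical deviation of N_L/N from 1/2 of about 1/(2*sqrt(N)), so each factor of 100 in N shrinks the deviation tenfold, consistent with the percentages quoted above for Figure 2.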