8.044: Statistical Physics I
Lecturer: Professor Nikta Fakhri
Notes by: Andrew Lin
Spring 2019

My recitations for this class were taught by Professor Wolfgang Ketterle.

1 February 5, 2019

This class's recitation teachers are Professor Jeremy England and Professor Wolfgang Ketterle, and Nicolas Romeo is the graduate TA. We're encouraged to talk to the teaching team about their research – Professor Fakhri and Professor England work in biophysics and nonequilibrium systems, and Professor Ketterle works in experimental atomic and molecular physics.

1.1 Course information

We can read the online syllabus for most of this information. Lectures will be in 6-120 from 11 to 12:30, and a 5-minute break will usually be given after about 50 minutes of class. The class's LMOD website will have lecture notes and problem sets posted – unlike some other classes, all pset solutions should be uploaded to the website, because the TAs can grade our homework online. This way, we never lose a pset and don't have to go to the drop boxes.

There are two textbooks for this class: Schroeder's "An Introduction to Thermal Physics" and Jaffe's "The Physics of Energy." We'll have a reading list that explains which sections correspond to each lecture.

Exam-wise, there are two midterms on March 12 and April 18, which take place during class and contribute 20 percent each to our grade. There is also a final during finals week that is worth 30 percent of our grade. The remaining 30 percent comes from 11 or 12 problem sets (lowest grade dropped). Office hours haven't been posted yet; they will also appear on the website once schedules are sorted out.

1.2 Why be excited about 8.044?

One of the driving principles behind this class is the phrase "More is different." We can check the course website for the reading "More is Different" by P. W. Anderson.

Definition 1
Thermodynamics is a branch of physics that provides phenomenological descriptions for the properties of macroscopic systems in thermal equilibrium.

Throughout this class, we'll define each of the words in the definition above, and more generally, we're going to learn about the physics of energy and matter as we experience it at normal, everyday time and length scales. The most important feature is that we're dealing with the physics of many particles at once – in fact, we're going to build a statistical description of roughly $10^{24}$ particles at a time. It would be very hard, and basically useless, to try to use ordinary equations of motion to describe the behavior of each particle.

Fact 2
Because thermodynamics is a study of global properties, like magnetism or hardness, the largeness of our systems will often actually be an advantage in calculations.

The concept of time asymmetry will also come up in this class. In Newton's laws, Maxwell's equations, or the Schrödinger equation, there is no real evidence that time needs to travel in a certain direction for the physics to be valid. But the "arrow of time" depends on some of the ideas we'll discuss in this class.

Two more ideas that will repeatedly come up are temperature and entropy. We'll spend a lot of time making those concepts precise, and we'll see that it doesn't make sense to talk about the temperature of an individual particle – temperature is only defined with respect to a larger system.
Meanwhile, entropy is possibly the most influential concept to come out of statistical mechanics: it was originally understood as a thermodynamic property of heat engines, which is where much of this field originated. But now, entropy is science's fundamental measure of disorder and information, and it can quantify ideas ranging from image compression to the heat death of the Universe.

Here's a list of some of the questions we'll be asking in this class:

• What is the difference between a solid, liquid, and gas?
• What makes a material an insulator or a conductor?
• How do we understand other properties of materials, like magnets, superfluids, superconductors, white dwarfs, neutron stars, the stretchiness of rubber, and the physics of living systems?

None of these are immediately apparent from the laws of Newton, Maxwell, or Schrödinger. Instead, we're going to need to develop a theoretical framework with two main parts:

• Thermodynamics: this is the machinery that describes macroscopic quantities such as entropy, temperature, and magnetization, and the relationships between them.
• Statistical mechanics: this is the statistical machinery at the microscopic level. What is each degree of freedom doing in our system?

These concepts have been incorporated into various other STEM fields: for example, they come up in Monte Carlo methods, descriptions of ensembles, understanding phases, nucleation, fluctuations, bioinformatics, and (now the foundation of most of physics) quantum statistical mechanics.

1.3 An example from biology

Many living systems perform processes that are irreversible, and the behavior of these processes can be quantified in terms of how much entropy is produced by them. Statistical physics and information theory help us do this! As a teaser, imagine we have a biological system where the movement of particles is influenced by both thermal motion and motor proteins. By watching a video, we can track each individual particle, and by comparing the trajectories run forward and backward in time, we can construct a relative entropy
$$\frac{\langle \dot{S} \rangle}{k_B} \equiv D[p_{\text{forward}} \,\|\, p_{\text{backward}}] = \sum p_f \ln \frac{p_f}{p_b},$$
which compares the probability distributions of forward and backward motion; the point is that this relates to the entropy production rate of the system (see the short numerical sketch below)! But it'll take us a lot of work to get to that kind of result, so we'll start with some definitions and important concepts.

To summarize this general overview, there are two complementary paths going on here:

Thermodynamics $\implies$ global properties (temperature, entropy, magnetization, etc.),
Statistical physics $\implies$ microscopic world to macroscopic world.

We'll also spend time on two "diversions": quantum mechanics will help us construct the important states that we will end up "counting" in statistical physics, and basic probability theory will give us a statistical description of the properties we're trying to describe (since entropy itself is an information-theoretic quantity)! To fully discuss these topics, we're going to need some mathematics, particularly multivariable calculus.

1.4 Definitions

We'll start by talking about the basic concepts of heat, internal energy, thermal energy, and temperature.

Definition 3 (Tentative)
Thermal energy is the collective energy contained in the relative motion of the large number of particles that compose a macroscopic system. Heat is the transfer of that thermal energy. (We'll try to be careful in distinguishing between energy and the transfer of that energy throughout this class.)
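As a brief aside before continuing with the definitions: the relative entropy from the biology teaser above can be estimated directly from trajectory data once the forward and backward motion have been binned into discrete outcomes. Below is a minimal Python sketch of that calculation; the two-outcome "hopping" probabilities are made-up numbers purely for illustration, not data from any actual experiment.

```python
import numpy as np

def relative_entropy(p_forward, p_backward):
    """Estimate D[p_forward || p_backward] = sum_i p_f(i) * ln(p_f(i) / p_b(i)).

    Both inputs are probability distributions over the same set of
    discretized trajectory outcomes (e.g. binned particle displacements).
    """
    p_f = np.asarray(p_forward, dtype=float)
    p_b = np.asarray(p_backward, dtype=float)
    # Keep only outcomes that actually occur in the forward distribution;
    # terms with p_f = 0 contribute nothing to the sum.
    mask = p_f > 0
    return np.sum(p_f[mask] * np.log(p_f[mask] / p_b[mask]))

# Toy example: a driven particle hops right more often than left in the
# forward movie, so the time-reversed movie has the opposite bias.
p_forward  = np.array([0.7, 0.3])   # P(step right), P(step left)
p_backward = np.array([0.3, 0.7])   # same movie played in reverse
print(relative_entropy(p_forward, p_backward))  # > 0 for an irreversible process
```

In practice the probabilities would come from histogramming many observed trajectory increments; the fact that the forward and reversed histograms differ is the signature of irreversibility that the relative entropy quantifies.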
Definition 4
Internal energy, often denoted U, is the sum of all contributions to the energy of a system as an isolated whole.

This internal energy U is usually made up of a sum of different contributions:

• Kinetic energy of molecular motion, including translational, vibrational, and rotational motion,
• Potential energy due to interactions between particles in the system, and
• Molecular, atomic, and nuclear binding energies.

Notably, this does not include the energy of an external field or the kinetic and potential energy of the system as a whole, because we care about behavior that is internal to our system of study.

Example 5
Consider a glass of water on a table, and compare it to the same glass at a higher height. This doesn't change the internal energy, even though the glass has gained some overall gravitational potential energy.

Definition 6 (Tentative)
Temperature is what we measure on a thermometer.

As a general rule, if we remove some internal energy from a system, the temperature will decrease. But there are cases where it will plateau as well! For example, if we plot temperature as a function of internal energy, the curve is roughly linear within each phase (solid, liquid, vapor) but plateaus during phase changes, because it takes some energy to transform ice to water and water to vapor. And now we're ready to make some slightly more precise definitions:

Definition 7
Let $U_0$ be the energy of a system at temperature $T = 0$. Thermal energy is the part of the internal energy of a system above $U_0$.

Notice that with this definition, the binding energy does not contribute to thermal energy (because it is present even at $T = 0$, where $U = U_0$), but the other sources of internal energy (kinetic energy, potential energy) will still contribute.

Definition 8
Heat is the transfer of thermal energy from one system to another system.

This is not a property of the system: instead, it's energy in motion! Heat transfer can occur as conduction or radiation, as a change in temperature, or through other processes at the microscopic level.

1.5 States and state variables

The next discussion is a little bit more subtle – we want to know what it means for our system to be in a particular state. In classical mechanics, a state is specified by the position and velocity of all objects at time t. So if we're given the two numbers $\{x_i(t), \dot{x}_i(t)\}$ for each i (that is, for every particle in our system), we have specified everything we might want to know. Meanwhile, in quantum mechanics, the state of a system is specified by quantum numbers: for example, $|n_1, \cdots, n_M\rangle$ (for some nonnegative integers $n_i$) is one way we might describe the system. But we have a completely different definition of "state" now that we're in a macroscopic system:

Definition 9
A system which has settled down is in a state of thermodynamic equilibrium, or thermodynamic macrostate.
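To get a feel for how coarse a macrostate description is compared to the microscopic one, here is a minimal sketch, assuming a toy system (chosen here purely for illustration) of N particles that can each sit in one of two energy levels, 0 or 1: many distinct microstates $|n_1, \cdots, n_N\rangle$ share the same total energy, and it is only that total – a macroscopic variable – that a thermodynamic description keeps track of.

```python
from itertools import product
from collections import Counter

# Toy system: N two-level particles, each with energy 0 or 1 (in units of epsilon).
# A microstate is the full list (n_1, ..., n_N); the macrostate here is just
# the total energy U = sum(n_i).
N = 4
microstates = list(product([0, 1], repeat=N))           # 2^N microstates in total
macrostates = Counter(sum(state) for state in microstates)

print(f"{len(microstates)} microstates for N = {N}")
for U, count in sorted(macrostates.items()):
    print(f"U = {U}: {count} microstates")              # binomial coefficients C(N, U)
```

For N = 4 the counts are just the binomial coefficients 1, 4, 6, 4, 1, but for $N \sim 10^{24}$ the number of microstates sharing a single macrostate is astronomically large – which is exactly why we need the statistical machinery this class will develop.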