The Canonical Ensemble


Stephen R. Addison
February 12, 2001

We will develop the method of canonical ensembles by considering a system placed in a heat bath at temperature T. The canonical ensemble is the assembly of systems with fixed N and V. In other words, we will consider an assembly of systems closed to one another by rigid, diathermal, impermeable walls. The energy of the microstates can fluctuate; the system is kept in equilibrium by being in contact with the heat bath at temperature T. Schematically, we can view this ensemble as:

State 1: E_1, V, N (bath at T)
State 2: E_2, V, N (bath at T)
State 3: E_3, V, N (bath at T)
...
State ν: E_ν, V, N (bath at T)

The system for which the canonical ensemble is appropriate can be thought of as a sub-system of a system for which the microcanonical ensemble is appropriate: a bath with energy E_0 containing the system with energy E_ν, the whole forming an isolated system with E, V, N fixed. Essentially, we examine one state and consider all the rest to be the heat bath. Thus the macroscopic system is specified by T, V, and N.

Let the combined energy of the system and the heat bath be E_0. The system with V, N will be in one of a variety of microstates E_1, E_2, ..., E_r, .... These energies could be degenerate in some cases, but we assume that they can be ordered:

E_1 ≤ E_2 ≤ E_3 ≤ ... ≤ E_r ≤ ...

We'll select δE so that we select one energy level but several microstates. Let the system be in a state with energy E_r; the energy of the reservoir is then E_0 − E_r. What is the probability that the system will be in a microstate with energy E_r?

When we considered an isolated system, we found that the probability of its being in a macrostate specified by (E, V, N, α) was proportional to the multiplicity Ω(E, V, N, α). In this situation, we could choose to analyze either the system or the heat bath. It will prove to be efficient to analyze the heat bath.
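The statement that the probability of a system state is proportional to the multiplicity left to the bath can be checked with a toy model. The following sketch (not from the notes; model and numbers are assumptions) treats the bath as an Einstein solid of n_bath oscillators, whose multiplicity with q energy quanta is the binomial coefficient C(q + n_bath − 1, q):

```python
from math import comb

# Toy sketch (not from the notes): treat the heat bath as an Einstein solid
# of n_bath oscillators, whose multiplicity with q energy quanta is
# Omega_2(q) = C(q + n_bath - 1, q).  All numbers here are illustrative.
def omega_bath(q, n_bath=50):
    return comb(q + n_bath - 1, q)

E0 = 100                      # total quanta shared by system + bath
system_levels = [0, 1, 2, 3]  # hypothetical system energies E_r, in quanta

# p_r is proportional to the bath multiplicity Omega_2(E0 - E_r)
weights = [omega_bath(E0 - Er) for Er in system_levels]
total = sum(weights)
p = [w / total for w in weights]

print(p)  # higher system energy leaves the bath fewer microstates
```

The probabilities normalize and decrease monotonically with E_r, which is the qualitative content of the argument that follows.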
The multiplicity of the heat bath is Ω_2(E_0 − E_r). We have an isolated system with two sub-systems; labelling the heat bath as system 2, we have

p_r = const × Ω_2(E_0 − E_r).

The ratio of the probabilities for the states E_i and E_j is

p_i / p_j = Ω_2(E_0 − E_i) / Ω_2(E_0 − E_j).

We could write similar expressions for all pairs of levels. If we consider three,

p_i / p_i = Ω_2(E_0 − E_i) / Ω_2(E_0 − E_i)
p_j / p_i = Ω_2(E_0 − E_j) / Ω_2(E_0 − E_i)
p_k / p_i = Ω_2(E_0 − E_k) / Ω_2(E_0 − E_i)

we can add them to give

(p_i + p_j + p_k) / p_i = [Ω_2(E_0 − E_i) + Ω_2(E_0 − E_j) + Ω_2(E_0 − E_k)] / Ω_2(E_0 − E_i).

We can generalize this result to yield

p_r = Ω_2(E_0 − E_r) / Σ_i Ω_2(E_0 − E_i),

where the sum is taken over all levels. Next, we rewrite the expression for p_r in terms of the reservoir entropy. S = k ln Ω, so S/k = ln Ω, and Ω = e^{S/k}. Using this we write

p_r = (constant) × exp[ S_2(E_0 − E_r) / k ].

With a large reservoir, we can assume that E_0 ≫ E_r. If the heat bath is large, this inequality holds for all states with a reasonable chance of occurring. Now we'll expand in a Taylor series about S_2(E_0). Recall

f(x_0 + a) = f(x_0) + a (df/dx)|_{x=x_0} + (a^2/2!) (d^2 f/dx^2)|_{x=x_0} + ...

so

(1/k) S_2(E_0 − E_r) = (1/k) S_2(E_0) − (E_r/k) ∂S_2(E_0)/∂E_0 + (E_r^2 / 2!k) ∂^2 S_2(E_0)/∂E_0^2 − ...

Now, we know that

∂S_2(E_0)/∂E = 1/T,

where T is the temperature of the heat bath. Thus, we have

(1/k) S_2(E_0 − E_r) = (1/k) S_2(E_0) − E_r/(kT).

All higher-order partial derivatives are zero by the assumption of a large heat bath:

∂^2 S_2/∂E^2 = ∂(1/T)/∂E = 0.

Now using

p_r = Ω_2(E_0 − E_r) / Σ_i Ω_2(E_0 − E_i)

and Ω = e^{S/k}, and substituting

β = 1/(kT),

we get

p_r = exp(S_2/k − βE_r) / Σ_i exp(S_2/k − βE_i)
    = e^{S_2/k} e^{−βE_r} / Σ_i e^{S_2/k} e^{−βE_i}
    = e^{−βE_r} / Σ_i e^{−βE_i}

p_r = e^{−βE_r} / Z.

Partition Functions and the Boltzmann Distribution

In the last equation, I have defined

Z = Σ_i e^{−βE_i} = Sum over States = Zustandssumme = Partition Function.

We will also use an alternate definition in which the sum runs over energy levels rather than states.
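A minimal numerical sketch of the Boltzmann distribution just derived; the energies and temperature are assumed values, in units where k = 1 so that β = 1/T:

```python
import math

# Minimal sketch of the Boltzmann distribution p_r = exp(-beta E_r) / Z;
# the energies and temperature below are illustrative assumptions (k = 1).
E = [0.0, 1.0, 2.0, 5.0]  # hypothetical microstate energies
T = 2.0
beta = 1.0 / T

Z = sum(math.exp(-beta * Er) for Er in E)   # partition function
p = [math.exp(-beta * Er) / Z for Er in E]  # Boltzmann distribution

# The constant prefactor exp(S_2/k) drops out of ratios: p_i/p_j depends
# only on the energy difference E_i - E_j, as in the derivation above.
print(p[0] / p[1], math.exp(-beta * (E[0] - E[1])))
```

Both printed numbers agree, illustrating that only energy differences matter once the bath entropy constant cancels.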
In this alternate definition, we let the degeneracy of the level E_i be g(E_i). Then

Z = Σ_{E_i} g(E_i) e^{−βE_i}.

The equation

p_r = e^{−βE_r} / Z

is called the Boltzmann distribution. The Boltzmann distribution gives the probability that a system is in a particular state when it is in a heat bath at temperature T. Z plays a central rôle in the study of systems at fixed T. The term e^{−βE_r} is called the Boltzmann factor.

It is now time to combine these ideas with our knowledge of probability. Recall

⟨x⟩ = Σ_i x_i f(x_i) = Σ_i p_i x_i.

In thermal physics, in the canonical ensemble, the probability distribution (p_i = f(x_i)) is the Boltzmann distribution, and the average is called an ensemble average.

Average Energy in the Canonical Ensemble

⟨E⟩ = Σ_r p_r E_r = (1/Z) Σ_r E_r e^{−βE_r}.

Let's simplify this result. Consider Z = Σ_r e^{−βE_r}:

∂Z/∂β = Σ_r (∂/∂β) e^{−βE_r} = Σ_r e^{−βE_r} (∂/∂β)(−βE_r) = −Σ_r E_r e^{−βE_r},

so

⟨E⟩ = (1/Z) Σ_r E_r e^{−βE_r} = −(1/Z) (∂/∂β) Σ_r e^{−βE_r} = −(1/Z) ∂Z/∂β

⟨E⟩ = −∂ ln Z / ∂β.

This is an average over the states of the system as it exchanges energy with the reservoir. The fluctuations around this energy are small.

Gibbs Entropy Formula

Consider a general macroscopic system with states labelled 1, 2, 3, ..., r, .... The probability that the system is in state r is p_r. Without further constraints, Σ_r p_r = 1 is all we can say about the system. The general definition of entropy is then

S = −k Σ_r p_r ln p_r.

This is the Gibbs entropy formula. We can deduce this formula for a generalized ensemble. Consider an ensemble of ν replicas of our system. We'll assume that each replica has the same probabilities p_1, p_2, p_3, ..., p_r, ... of being in the states 1, 2, 3, ..., r, .... Provided ν is large enough, the number of systems in the ensemble in state r is

ν_r = ν p_r.

The multiplicity Ω_ν for the ensemble with ν_1 subsystems in state 1, ν_2 subsystems in state 2, etc., is the number of ways the distribution can be realized:

Ω_ν = ν! / (ν_1! ν_2! ... ν_r! ...).

Now

S_ν = k ln Ω_ν = k ln [ν! / (ν_1! ν_2! ... ν_r! ...)]

S_ν = k ln ν! − k(ln ν_1! + ln ν_2! + ... + ln ν_r! + ...).
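The identity ⟨E⟩ = −∂ ln Z/∂β can be verified numerically. The sketch below uses an assumed two-level system with levels 0 and ε (k = 1 units; ε and β are illustrative values), comparing the direct ensemble average against a central finite difference of ln Z:

```python
import math

# Check <E> = -d(ln Z)/d(beta) for an assumed two-level system with levels
# 0 and eps, in units where k = 1; eps and beta are illustrative values.
eps = 1.0
beta = 0.7

def lnZ(b):
    return math.log(1.0 + math.exp(-b * eps))  # Z = 1 + exp(-b*eps)

# Direct ensemble average: <E> = sum_r E_r exp(-beta E_r) / Z
Z = 1.0 + math.exp(-beta * eps)
E_direct = eps * math.exp(-beta * eps) / Z

# Central finite difference approximation to -d(ln Z)/d(beta)
h = 1e-6
E_deriv = -(lnZ(beta + h) - lnZ(beta - h)) / (2 * h)

print(E_direct, E_deriv)
```

The two values agree to within the finite-difference error, confirming the derivative identity for this toy case.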
Recall Stirling's approximation, ln N! = N ln N − N, so

S_ν = k(ν ln ν − Σ_r ν_r ln ν_r)

(the −ν and −Σ_r ν_r terms cancel, since Σ_r ν_r = ν). But ν_r = ν p_r, so

S_ν = kν ln ν − k Σ_r ν p_r ln(ν p_r)
    = kν ln ν − kν Σ_r p_r ln(ν p_r)
    = kν ln ν − kν Σ_r p_r ln ν − kν Σ_r p_r ln p_r
    = kν ln ν − kν ln ν Σ_r p_r − kν Σ_r p_r ln p_r

S_ν = −kν Σ_r p_r ln p_r.

But entropy is extensive, so the entropy of one system (replica) is

S = S_ν / ν = −k Σ_r p_r ln p_r.

The Gibbs entropy formula is consistent with the Boltzmann entropy formula S = k ln Ω. In an isolated system with energy in the range E to E + δE, the number of microstates in the interval is Ω(E, V, N). The probability of finding the system in any one of these microstates is 1/Ω(E, V, N) in the range and 0 outside it. So, with

p_r = 1/Ω(E, V, N)   (there are Ω terms for an isolated system),

S = −k Σ_r p_r ln p_r = −k Σ_r p_r ln [1/Ω] = k ln Ω Σ_r p_r = k ln Ω.

Entropy of a System in a Heat Bath

To find the entropy of a system in a heat bath, we can use the Gibbs entropy formula and the Boltzmann distribution for p_r:

S = −k Σ_r p_r ln p_r   and   p_r = e^{−βE_r}/Z.

Combining these expressions and simplifying,

−S/k = Σ_r p_r ln p_r
     = Σ_r (e^{−βE_r}/Z) ln(e^{−βE_r}/Z)
     = Σ_r (e^{−βE_r}/Z) ln e^{−βE_r} − ln Z Σ_r e^{−βE_r}/Z
     = Σ_r (e^{−βE_r}/Z)(−βE_r) − ln Z
     = −β Σ_r E_r e^{−βE_r}/Z − ln Z,

but, by definition, we have

⟨E⟩ = Σ_r E_r e^{−βE_r}/Z,

so we can simplify the above result to yield

−S/k = −β⟨E⟩ − ln Z,

and finally we find

S = ⟨E⟩/T + k ln Z.

We will usually assume that the system energy is well defined and use ⟨E⟩ and E interchangeably; that is, the system's mean energy (which is an ensemble estimate) and the system's energy are interchangeable.
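The result S = ⟨E⟩/T + k ln Z can be checked against the Gibbs entropy formula directly. In this sketch the energy levels and temperature are assumed values, in units where k = 1:

```python
import math

# Verify numerically (k = 1, assumed energies) that the Gibbs entropy of
# the Boltzmann distribution reproduces S = <E>/T + ln Z.
E = [0.0, 0.5, 1.3, 2.0]  # hypothetical energy levels
T = 1.5
beta = 1.0 / T

Z = sum(math.exp(-beta * Er) for Er in E)
p = [math.exp(-beta * Er) / Z for Er in E]

S_gibbs = -sum(pr * math.log(pr) for pr in p)  # Gibbs entropy formula
E_avg = sum(pr * Er for pr, Er in zip(p, E))   # ensemble average energy
S_thermo = E_avg / T + math.log(Z)             # entropy in a heat bath

print(S_gibbs, S_thermo)
```

The two values coincide to machine precision, as the algebraic derivation above requires.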
Our development of the partition function through its ensemble tells us that Z = Z(T, V, N); thus S and ⟨E⟩ are also functions of T, V, and N.

Summary

S = −k Σ_r p_r ln p_r
p_r = e^{−βE_r}/Z
Z = Σ_r e^{−βE_r}
⟨E⟩ = (1/Z) Σ_r E_r e^{−βE_r}
⟨E⟩ = −∂ ln Z / ∂β
S = ⟨E⟩/T + k ln Z = S(T, V, N)

In these expressions, Z and ⟨E⟩, and as a consequence S, are defined as functions of T, V, and N. Contrast this with an isolated system, where S = S(E, V, N). For a macroscopic system at temperature T, energy fluctuations are negligible.