Introduction to Thermodynamics and Statistical Physics (114016) - Lecture Notes
Eyal Buks
Technion
September 24, 2017

Preface

to be written...

Contents

1. The Principle of Largest Uncertainty
   1.1 Entropy in Information Theory
       1.1.1 Example - Two States System
       1.1.2 Smallest and Largest Entropy
       1.1.3 The composition property
       1.1.4 Alternative Definition of Entropy
   1.2 Largest Uncertainty Estimator
       1.2.1 Useful Relations
       1.2.2 The Free Entropy
   1.3 The Principle of Largest Uncertainty in Statistical Mechanics
       1.3.1 Microcanonical Distribution
       1.3.2 Canonical Distribution
       1.3.3 Grandcanonical Distribution
       1.3.4 Temperature and Chemical Potential
   1.4 Time Evolution of Entropy of an Isolated System
   1.5 Thermal Equilibrium
       1.5.1 Externally Applied Potential Energy
   1.6 Free Entropy and Free Energies
   1.7 Problems Set 1
   1.8 Solutions Set 1

2. Ideal Gas
   2.1 A Particle in a Box
   2.2 Gibbs Paradox
   2.3 Fermions and Bosons
       2.3.1 Fermi-Dirac Distribution
       2.3.2 Bose-Einstein Distribution
       2.3.3 Classical Limit
   2.4 Ideal Gas in the Classical Limit
       2.4.1 Pressure
       2.4.2 Useful Relations
       2.4.3 Heat Capacity
       2.4.4 Internal Degrees of Freedom
   2.5 Processes in Ideal Gas
       2.5.1 Isothermal Process
       2.5.2 Isobaric Process
       2.5.3 Isochoric Process
       2.5.4 Isentropic Process
   2.6 Carnot Heat Engine
   2.7 Limits Imposed Upon the Efficiency
   2.8 Problems Set 2
   2.9 Solutions Set 2

3. Bosonic and Fermionic Systems
   3.1 Electromagnetic Radiation
       3.1.1 Electromagnetic Cavity
       3.1.2 Partition Function
       3.1.3 Cube Cavity
       3.1.4 Average Energy
       3.1.5 Stefan-Boltzmann Radiation Law
   3.2 Phonons in Solids
       3.2.1 One Dimensional Example
       3.2.2 The 3D Case
   3.3 Fermi Gas
       3.3.1 Orbital Partition Function
       3.3.2 Partition Function of the Gas
       3.3.3 Energy and Number of Particles
       3.3.4 Example: Electrons in Metal
   3.4 Semiconductor Statistics
   3.5 Problems Set 3
   3.6 Solutions Set 3

4. Classical Limit of Statistical Mechanics
   4.1 Classical Hamiltonian
       4.1.1 Hamilton-Jacobi Equations
       4.1.2 Example
       4.1.3 Example
   4.2 Density Function
       4.2.1 Equipartition Theorem
       4.2.2 Example
   4.3 Nyquist Noise
   4.4 Thermal Equilibrium From Stochastic Processes
       4.4.1 Langevin Equation
       4.4.2 The Smoluchowski-Chapman-Kolmogorov Relation
       4.4.3 The Fokker-Planck Equation
       4.4.4 The Potential Condition
       4.4.5 Free Energy
       4.4.6 Fokker-Planck Equation in One Dimension
       4.4.7 Ornstein-Uhlenbeck Process in One Dimension

5. Exercises
   5.1 Problems
   5.2 Solutions

References

Index

1. The Principle of Largest Uncertainty

In this chapter we discuss relations between information theory and statistical mechanics. We show that the canonical and grand canonical distributions can be obtained from Shannon's principle of maximum uncertainty [1, 2, 3]. Moreover, the time evolution of the entropy of an isolated system and the H theorem are discussed.

1.1 Entropy in Information Theory

The possible states of a given system are denoted as $e_m$, where $m = 1, 2, 3, \ldots$, and the probability that state $e_m$ is occupied is denoted by $p_m$. The normalization condition reads

$$\sum_m p_m = 1 \,. \tag{1.1}$$

For a given probability distribution $\{p_m\}$ the entropy is defined as

$$\sigma = -\sum_m p_m \log p_m \,. \tag{1.2}$$

Below we show that this quantity characterizes the uncertainty in the knowledge of the state of the system.
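A quick numerical illustration may help. The following is a minimal Python sketch (assuming NumPy is available; the function name `entropy` is ours, chosen for this example) that evaluates Eq. (1.2) for an arbitrary distribution, dropping vanishing terms in anticipation of the limit discussed below Eq. (1.4):

```python
import numpy as np

def entropy(p):
    """Evaluate sigma = -sum_m p_m log p_m, Eq. (1.2).

    Terms with p_m = 0 contribute nothing, consistent with
    the limit -p log p -> 0 as p -> 0 (see Eq. (1.4) below).
    """
    p = np.asarray(p, dtype=float)
    assert np.isclose(p.sum(), 1.0), "normalization condition, Eq. (1.1)"
    nz = p[p > 0]
    return -np.sum(nz * np.log(nz))

print(entropy([0.5, 0.25, 0.25]))  # 1.0397... = (3/2) log 2
print(entropy([1.0, 0.0, 0.0]))    # 0: the occupied state is known with certainty
```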
1.1.1 Example - Two States System

Consider a system which can occupy either state $e_1$ with probability $p$, or state $e_2$ with probability $1-p$, where $0 \le p \le 1$. The entropy is given by

$$\sigma = -p \log p - (1-p) \log (1-p) \,. \tag{1.3}$$

[Figure: the entropy $-p \log p - (1-p) \log (1-p)$ as a function of $p$ in the range $0 \le p \le 1$.]

As expected, the entropy vanishes at $p = 0$ and $p = 1$, since in both cases there is no uncertainty about which state is occupied by the system. The largest uncertainty is obtained at $p = 0.5$, for which $\sigma = \log 2 \simeq 0.69$.

1.1.2 Smallest and Largest Entropy

Smallest value. The term $-p \log p$ in the range $0 \le p \le 1$ is plotted in the figure below. Note that the value of $-p \log p$ in the limit $p \to 0$ can be calculated using L'Hospital's rule

$$\lim_{p \to 0} \left( -p \log p \right) = \lim_{p \to 0} \frac{\frac{\mathrm{d} \log p}{\mathrm{d} p}}{-\frac{\mathrm{d}}{\mathrm{d} p} \frac{1}{p}} = 0 \,. \tag{1.4}$$

[Figure: the term $-p \log p$ as a function of $p$ in the range $0 \le p \le 1$.]

From this figure, which shows that $-p \log p \ge 0$ in the range $0 \le p \le 1$, it is easy to infer that the smallest possible value of the entropy is zero. Moreover, since $-p \log p = 0$ iff $p = 0$ or $p = 1$, it is evident that $\sigma = 0$ iff the system occupies one of the states with probability one and all the other states with probability zero. In this case there is no uncertainty about which state is occupied by the system.
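Both figures above are straightforward to reproduce numerically. A minimal Python sketch (assuming NumPy is available) evaluates Eq. (1.3) near the end points and at $p = 0.5$, illustrating both the limit (1.4) and the maximum value $\log 2$:

```python
import numpy as np

# Two-state entropy, Eq. (1.3): sigma(p) = -p log p - (1 - p) log(1 - p),
# with vanishing terms dropped (0 log 0 = 0, justified by Eq. (1.4))
def sigma_two_state(p):
    return -sum(q * np.log(q) for q in (p, 1.0 - p) if q > 0)

for p in (0.0, 1e-6, 0.25, 0.5, 0.75, 1.0):
    print(f"p = {p:.6f},  sigma = {sigma_two_state(p):.6f}")
# sigma vanishes at p = 0 and p = 1 and peaks at p = 0.5,
# where sigma = log 2 = 0.6931..., as in the first figure above.
```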
Largest value. We seek a maximum point of the entropy $\sigma$ with respect to all probability distributions $\{p_m\}$ which satisfy the normalization condition. This constraint, which is given by Eq. (1.1), is expressed as

$$0 = g_0 (\bar{p}) = \sum_m p_m - 1 \,, \tag{1.5}$$

where $\bar{p}$ denotes the vector of probabilities

$$\bar{p} = (p_1, p_2, \ldots) \,. \tag{1.6}$$

A small change in $\sigma$ (denoted as $\delta \sigma$) due to a small change in $\bar{p}$ (denoted as $\delta \bar{p} = (\delta p_1, \delta p_2, \ldots)$) can be expressed as

$$\delta \sigma = \sum_m \frac{\partial \sigma}{\partial p_m} \delta p_m \,, \tag{1.7}$$

or in terms of the gradient of $\sigma$ (denoted as $\bar{\nabla} \sigma$) as

$$\delta \sigma = \bar{\nabla} \sigma \cdot \delta \bar{p} \,. \tag{1.8}$$

In addition, the variables $(p_1, p_2, \ldots)$ are subject to the constraint (1.5). Similarly to Eq. (1.8) we have

$$\delta g_0 = \bar{\nabla} g_0 \cdot \delta \bar{p} \,. \tag{1.9}$$

Both vectors $\bar{\nabla} \sigma$ and $\delta \bar{p}$ can be decomposed as

$$\bar{\nabla} \sigma = \left( \bar{\nabla} \sigma \right)_{\parallel} + \left( \bar{\nabla} \sigma \right)_{\perp} \,, \tag{1.10}$$

$$\delta \bar{p} = \left( \delta \bar{p} \right)_{\parallel} + \left( \delta \bar{p} \right)_{\perp} \,, \tag{1.11}$$

where $(\bar{\nabla} \sigma)_{\parallel}$ and $(\delta \bar{p})_{\parallel}$ are parallel to $\bar{\nabla} g_0$, and where $(\bar{\nabla} \sigma)_{\perp}$ and $(\delta \bar{p})_{\perp}$ are orthogonal to $\bar{\nabla} g_0$. Using this notation Eq. (1.8) can be expressed as

$$\delta \sigma = \left( \bar{\nabla} \sigma \right)_{\parallel} \cdot \left( \delta \bar{p} \right)_{\parallel} + \left( \bar{\nabla} \sigma \right)_{\perp} \cdot \left( \delta \bar{p} \right)_{\perp} \,. \tag{1.12}$$

Given that the constraint $g_0 (\bar{p}) = 0$ is satisfied at a given point $\bar{p}$, one has $g_0 (\bar{p} + \delta \bar{p}) = 0$ to first order in $\delta \bar{p}$ provided that $\delta \bar{p}$ is orthogonal to $\bar{\nabla} g_0$, namely, provided that $(\delta \bar{p})_{\parallel} = 0$. Thus, a stationary point of $\sigma$ (a maximum, a minimum or a saddle point) occurs iff for every small change $\delta \bar{p}$ which is orthogonal to $\bar{\nabla} g_0$ (namely, $\delta \bar{p} \cdot \bar{\nabla} g_0 = 0$) one has $0 = \delta \sigma = \bar{\nabla} \sigma \cdot \delta \bar{p}$. As can be seen from Eq. (1.12), this condition is fulfilled only when $(\bar{\nabla} \sigma)_{\perp} = 0$, namely only when the vectors $\bar{\nabla} \sigma$ and $\bar{\nabla} g_0$ are parallel to each other. In other words, only when

$$\bar{\nabla} \sigma = \xi_0 \bar{\nabla} g_0 \,, \tag{1.13}$$

where $\xi_0$ is a constant. This constant is called a Lagrange multiplier. Using Eqs. (1.2) and (1.5), the condition (1.13) is expressed as

$$\log p_m + 1 = -\xi_0 \,. \tag{1.14}$$

Let $M$ be the number of available states. From Eq. (1.14) we find that all probabilities are equal. Thus, using Eq. (1.5), one finds that

$$p_1 = p_2 = \cdots = \frac{1}{M} \,. \tag{1.15}$$

After finding this stationary point it is necessary to determine whether it is a maximum, a minimum or a saddle point. To do this we expand $\sigma$ to second order in $\delta \bar{p}$:

$$\begin{aligned}
\sigma (\bar{p} + \delta \bar{p}) &= \exp \left( \delta \bar{p} \cdot \bar{\nabla} \right) \sigma (\bar{p}) \\
&= \left( 1 + \delta \bar{p} \cdot \bar{\nabla} + \frac{\left( \delta \bar{p} \cdot \bar{\nabla} \right)^2}{2!} + \cdots \right) \sigma (\bar{p}) \\
&= \sigma (\bar{p}) + \delta \bar{p} \cdot \bar{\nabla} \sigma + \frac{\left( \delta \bar{p} \cdot \bar{\nabla} \right)^2}{2!} \sigma + \cdots \\
&= \sigma (\bar{p}) + \sum_m \frac{\partial \sigma}{\partial p_m} \delta p_m + \frac{1}{2} \sum_{m, m'} \frac{\partial^2 \sigma}{\partial p_m \partial p_{m'}} \delta p_m \delta p_{m'} + \cdots
\end{aligned} \tag{1.16}$$

Using Eq. (1.2) one finds that

$$\frac{\partial^2 \sigma}{\partial p_m \partial p_{m'}} = -\frac{1}{p_m} \delta_{m, m'} \,. \tag{1.17}$$

Since the probabilities $p_m$ are non-negative, one concludes that any stationary point of $\sigma$ is a local maximum. Moreover, since only a single stationary point was found, one concludes that the entropy $\sigma$ obtains its largest value, which is denoted as $\Lambda (M)$, and which is given by

$$\Lambda (M) = \sigma \left( \frac{1}{M}, \frac{1}{M}, \ldots, \frac{1}{M} \right) = \log M \,, \tag{1.18}$$

for the probability distribution given by Eq. (1.15). For this probability distribution, which maximizes $\sigma$, the state occupied by the system is, as expected, most uncertain. A numerical check of this result is sketched at the end of this section.

1.1.3 The composition property

The composition property is best illustrated using an example.
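Before working through that example, here is the numerical check of the maximum-entropy result (1.18) promised above. This is a minimal Python sketch (assuming NumPy and SciPy are available; the small threshold guarding $0 \log 0$ is an implementation detail, not part of the derivation) which maximizes $\sigma$ over the probability simplex for $M = 4$ by minimizing $-\sigma$ subject to the normalization constraint (1.5):

```python
import numpy as np
from scipy.optimize import minimize

M = 4  # number of available states

# Minimizing -sigma is equivalent to maximizing the entropy, Eq. (1.2)
def neg_entropy(p):
    nz = p[p > 1e-12]          # drop vanishing terms, 0 log 0 = 0
    return np.sum(nz * np.log(nz))

res = minimize(
    neg_entropy,
    x0=np.random.dirichlet(np.ones(M)),  # random starting point on the simplex
    bounds=[(0.0, 1.0)] * M,
    constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}],  # Eq. (1.5)
)

print(res.x)                # close to the uniform distribution, Eq. (1.15)
print(-res.fun, np.log(M))  # close to Lambda(M) = log M, Eq. (1.18)
```

A typical run returns $\bar{p} \approx (0.25, 0.25, 0.25, 0.25)$ and $\sigma \approx 1.386 \approx \log 4$, in agreement with Eqs. (1.15) and (1.18).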