Eyal Buks
Introduction to Thermodynamics and Statistical Physics (114016) - Lecture Notes
September 24, 2017
Technion
Preface
to be written...
Contents
1. The Principle of Largest Uncertainty
   1.1 Entropy in Information Theory
      1.1.1 Example - Two States System
      1.1.2 Smallest and Largest Entropy
      1.1.3 The composition property
      1.1.4 Alternative Definition of Entropy
   1.2 Largest Uncertainty Estimator
      1.2.1 Useful Relations
      1.2.2 The Free Entropy
   1.3 The Principle of Largest Uncertainty in Statistical Mechanics
      1.3.1 Microcanonical Distribution
      1.3.2 Canonical Distribution
      1.3.3 Grandcanonical Distribution
      1.3.4 Temperature and Chemical Potential
   1.4 Time Evolution of Entropy of an Isolated System
   1.5 Thermal Equilibrium
      1.5.1 Externally Applied Potential Energy
   1.6 Free Entropy and Free Energies
   1.7 Problems Set 1
   1.8 Solutions Set 1

2. Ideal Gas
   2.1 A Particle in a Box
   2.2 Gibbs Paradox
   2.3 Fermions and Bosons
      2.3.1 Fermi-Dirac Distribution
      2.3.2 Bose-Einstein Distribution
      2.3.3 Classical Limit
   2.4 Ideal Gas in the Classical Limit
      2.4.1 Pressure
      2.4.2 Useful Relations
      2.4.3 Heat Capacity
      2.4.4 Internal Degrees of Freedom
   2.5 Processes in Ideal Gas
      2.5.1 Isothermal Process
      2.5.2 Isobaric Process
      2.5.3 Isochoric Process
      2.5.4 Isentropic Process
   2.6 Carnot Heat Engine
   2.7 Limits Imposed Upon the Efficiency
   2.8 Problems Set 2
   2.9 Solutions Set 2

3. Bosonic and Fermionic Systems
   3.1 Electromagnetic Radiation
      3.1.1 Electromagnetic Cavity
      3.1.2 Partition Function
      3.1.3 Cube Cavity
      3.1.4 Average Energy
      3.1.5 Stefan-Boltzmann Radiation Law
   3.2 Phonons in Solids
      3.2.1 One Dimensional Example
      3.2.2 The 3D Case
   3.3 Fermi Gas
      3.3.1 Orbital Partition Function
      3.3.2 Partition Function of the Gas
      3.3.3 Energy and Number of Particles
      3.3.4 Example: Electrons in Metal
   3.4 Semiconductor Statistics
   3.5 Problems Set 3
   3.6 Solutions Set 3

4. Classical Limit of Statistical Mechanics
   4.1 Classical Hamiltonian
      4.1.1 Hamilton-Jacobi Equations
      4.1.2 Example
      4.1.3 Example
   4.2 Density Function
      4.2.1 Equipartition Theorem
      4.2.2 Example
   4.3 Nyquist Noise
   4.4 Thermal Equilibrium From Stochastic Processes
      4.4.1 Langevin Equation
      4.4.2 The Smoluchowski-Chapman-Kolmogorov Relation
      4.4.3 The Fokker-Planck Equation
      4.4.4 The Potential Condition
      4.4.5 Free Energy
      4.4.6 Fokker-Planck Equation in One Dimension
      4.4.7 Ornstein-Uhlenbeck Process in One Dimension
Eyal Buks Thermodynamics and Statistical Physics 6 Contents
   4.5 Problems Set 4
   4.6 Solutions Set 4

5. Exercises
   5.1 Problems
   5.2 Solutions

References

Index
1. The Principle of Largest Uncertainty
In this chapter we discuss relations between information theory and statistical mechanics. We show that the canonical and grand canonical distributions can be obtained from Shannon’s principle of maximum uncertainty [1, 2, 3]. Moreover, the time evolution of the entropy of an isolated system and the H theorem are discussed.
1.1 Entropy in Information Theory
The possible states of a given system are denoted as $e_m$, where $m = 1, 2, 3, \ldots$, and the probability that state $e_m$ is occupied is denoted by $p_m$. The normalization condition reads

$$\sum_m p_m = 1 \; . \tag{1.1}$$

For a given probability distribution $\{p_m\}$ the entropy is defined as

$$\sigma = -\sum_m p_m \log p_m \; . \tag{1.2}$$

Below we show that this quantity characterizes the uncertainty in the knowledge of the state of the system.
1.1.1 Example - Two States System
Consider a system which can occupy either state $e_1$ with probability $p$, or state $e_2$ with probability $1 - p$, where $0 \le p \le 1$. The entropy is given by

$$\sigma = -p \log p - (1 - p) \log (1 - p) \; . \tag{1.3}$$
[Figure: plot of $-p \log p - (1 - p) \log (1 - p)$ as a function of $p$]

As expected, the entropy vanishes at $p = 0$ and at $p = 1$, since in both cases there is no uncertainty in what is the state which is occupied by the system. The largest uncertainty is obtained at $p = 0.5$, for which $\sigma = \log 2 \simeq 0.69$.
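The numbers above are easy to reproduce. The sketch below evaluates the two-state entropy of Eq. (1.3) at a few points; the helper name `entropy2` is ours:

```python
import math

def entropy2(p):
    """Two-state entropy, Eq. (1.3); a term with probability 0 contributes 0."""
    return -sum(q * math.log(q) for q in (p, 1.0 - p) if q > 0.0)

# The entropy vanishes at the edges and peaks at p = 1/2, where sigma = log 2.
assert entropy2(0.0) == 0.0 and entropy2(1.0) == 0.0
sigma_max = entropy2(0.5)
assert abs(sigma_max - math.log(2.0)) < 1e-12
assert entropy2(0.25) < sigma_max
```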
1.1.2 Smallest and Largest Entropy
Smallest value. The term $-p \log p$ in the range $0 \le p \le 1$ is plotted in the figure below. Note that the value of $-p \log p$ in the limit $p \to 0$ can be calculated using L'Hospital's rule

$$\lim_{p \to 0} \left( -p \log p \right) = \lim_{p \to 0} \frac{-\frac{\mathrm{d} \log p}{\mathrm{d} p}}{\frac{\mathrm{d}}{\mathrm{d} p} \frac{1}{p}} = \lim_{p \to 0} p = 0 \; . \tag{1.4}$$

From this figure, which shows that $-p \log p \ge 0$ in the range $0 \le p \le 1$, it is easy to infer that the smallest possible value of the entropy is zero. Moreover, since $-p \log p = 0$ iff $p = 0$ or $p = 1$, it is evident that $\sigma = 0$ iff the system occupies one of the states with probability one and all the other states with probability zero. In this case there is no uncertainty in what is the state which is occupied by the system.
[Figure: plot of $-p \log p$ as a function of $p$]
Largest value. We seek a maximum point of the entropy $\sigma$ with respect to all probability distributions $\{p_m\}$ which satisfy the normalization condition. This constraint, which is given by Eq. (1.1), is expressed as

$$0 = g_0 (\bar{p}) = \sum_m p_m - 1 \; , \tag{1.5}$$

where $\bar{p}$ denotes the vector of probabilities

$$\bar{p} = (p_1, p_2, \ldots) \; . \tag{1.6}$$

A small change in $\sigma$ (denoted as $\delta \sigma$) due to a small change in $\bar{p}$ (denoted as $\delta \bar{p} = (\delta p_1, \delta p_2, \ldots)$) can be expressed as

$$\delta \sigma = \sum_m \frac{\partial \sigma}{\partial p_m} \delta p_m \; , \tag{1.7}$$

or in terms of the gradient of $\sigma$ (denoted as $\bar{\nabla} \sigma$) as

$$\delta \sigma = \bar{\nabla} \sigma \cdot \delta \bar{p} \; . \tag{1.8}$$

In addition the variables $(p_1, p_2, \ldots)$ are subjected to the constraint (1.5). Similarly to Eq. (1.8) we have

$$\delta g_0 = \bar{\nabla} g_0 \cdot \delta \bar{p} \; . \tag{1.9}$$

Both vectors $\bar{\nabla} \sigma$ and $\delta \bar{p}$ can be decomposed as

$$\bar{\nabla} \sigma = \bar{\nabla} \sigma_{\parallel} + \bar{\nabla} \sigma_{\perp} \; , \tag{1.10}$$

$$\delta \bar{p} = (\delta \bar{p})_{\parallel} + (\delta \bar{p})_{\perp} \; , \tag{1.11}$$

where $\bar{\nabla} \sigma_{\parallel}$ and $(\delta \bar{p})_{\parallel}$ are parallel to $\bar{\nabla} g_0$, and where $\bar{\nabla} \sigma_{\perp}$ and $(\delta \bar{p})_{\perp}$ are orthogonal to $\bar{\nabla} g_0$. Using this notation Eq. (1.8) can be expressed as
$$\delta \sigma = \bar{\nabla} \sigma_{\parallel} \cdot (\delta \bar{p})_{\parallel} + \bar{\nabla} \sigma_{\perp} \cdot (\delta \bar{p})_{\perp} \; . \tag{1.12}$$

Given that the constraint $g_0 (\bar{p}) = 0$ is satisfied at a given point $\bar{p}$, one has $g_0 (\bar{p} + \delta \bar{p}) = 0$ to first order in $\delta \bar{p}$ provided that $\delta \bar{p}$ is orthogonal to $\bar{\nabla} g_0$, namely, provided that $(\delta \bar{p})_{\parallel} = 0$. Thus, a stationary point (maximum, minimum or saddle point) of $\sigma$ occurs iff for every small change $\delta \bar{p}$ which is orthogonal to $\bar{\nabla} g_0$ (namely, $\delta \bar{p} \cdot \bar{\nabla} g_0 = 0$) one has $0 = \delta \sigma = \bar{\nabla} \sigma \cdot \delta \bar{p}$. As can be seen from Eq. (1.12), this condition is fulfilled only when $\bar{\nabla} \sigma_{\perp} = 0$, namely only when the vectors $\bar{\nabla} \sigma$ and $\bar{\nabla} g_0$ are parallel to each other. In other words, only when

$$\bar{\nabla} \sigma = \xi_0 \bar{\nabla} g_0 \; , \tag{1.13}$$

where $\xi_0$ is a constant. This constant is called a Lagrange multiplier. Using Eqs. (1.2) and (1.5) the condition (1.13) is expressed as

$$- \log p_m - 1 = \xi_0 \; . \tag{1.14}$$

Let $M$ be the number of available states. From Eq. (1.14) we find that all probabilities are equal. Thus, using Eq. (1.5), one finds that

$$p_1 = p_2 = \cdots = \frac{1}{M} \; . \tag{1.15}$$

After finding this stationary point it is necessary to determine whether it is a maximum, a minimum or a saddle point. To do this we expand $\sigma$ to second order in $\delta \bar{p}$

$$\begin{aligned}
\sigma (\bar{p} + \delta \bar{p}) &= \exp \left( \delta \bar{p} \cdot \bar{\nabla} \right) \sigma (\bar{p}) \\
&= \left( 1 + \delta \bar{p} \cdot \bar{\nabla} + \frac{\left( \delta \bar{p} \cdot \bar{\nabla} \right)^2}{2!} + \cdots \right) \sigma (\bar{p}) \\
&= \sigma (\bar{p}) + \delta \bar{p} \cdot \bar{\nabla} \sigma + \frac{\left( \delta \bar{p} \cdot \bar{\nabla} \right)^2}{2!} \sigma + \cdots \\
&= \sigma (\bar{p}) + \sum_m \frac{\partial \sigma}{\partial p_m} \delta p_m + \frac{1}{2} \sum_{m,m'} \frac{\partial^2 \sigma}{\partial p_m \partial p_{m'}} \delta p_m \delta p_{m'} + \cdots
\end{aligned} \tag{1.16}$$

Using Eq. (1.2) one finds that
$$\frac{\partial^2 \sigma}{\partial p_m \partial p_{m'}} = - \frac{1}{p_m} \delta_{m,m'} \; . \tag{1.17}$$
Since the probabilities $p_m$ are non-negative, one concludes that any stationary point of $\sigma$ is a local maximum point. Moreover, since only a single stationary point was found, one concludes that the entropy $\sigma$ obtains its largest value, which is denoted as $\Lambda (M)$, and which is given by
$$\Lambda (M) = \sigma \left( \frac{1}{M}, \frac{1}{M}, \ldots, \frac{1}{M} \right) = \log M \; , \tag{1.18}$$

for the probability distribution given by Eq. (1.15). For this probability distribution, which maximizes $\sigma$, as expected, the state which is occupied by the system is most uncertain.
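A quick numerical experiment supports Eq. (1.18): random normalized distributions over $M$ states never exceed $\log M$, while the uniform distribution attains it exactly. The sampling scheme below is an illustrative choice:

```python
import math, random

random.seed(0)
M = 5
log_M = math.log(M)

def entropy(p):
    # Eq. (1.2); terms with zero probability contribute nothing.
    return -sum(q * math.log(q) for q in p if q > 0.0)

# Draw random normalized distributions over M states; none exceeds log M.
for _ in range(1000):
    w = [random.random() for _ in range(M)]
    s = sum(w)
    p = [x / s for x in w]
    assert entropy(p) <= log_M + 1e-12

# The uniform distribution attains the bound, Eq. (1.18).
uniform = [1.0 / M] * M
assert abs(entropy(uniform) - log_M) < 1e-12
```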
1.1.3 The composition property
The composition property is best illustrated using an example.

Example - A Three States System. A system can occupy one of the states $e_1$, $e_2$ or $e_3$ with probabilities $p_1$, $p_2$ and $p_3$ respectively. The uncertainty associated with this probability distribution can be estimated in two ways, directly and indirectly. Directly, it is simply given by the definition of entropy in Eq. (1.2)

$$\sigma (p_1, p_2, p_3) = - p_1 \log p_1 - p_2 \log p_2 - p_3 \log p_3 \; . \tag{1.19}$$

Alternatively [see Fig. 1.1], the uncertainty can be decomposed as follows: (a) the system can either occupy state $e_1$ with probability $p_1$, or not occupy state $e_1$ with probability $1 - p_1$; (b) given that the system does not occupy state $e_1$, it can either occupy state $e_2$ with probability $p_2 / (1 - p_1)$ or occupy state $e_3$ with probability $p_3 / (1 - p_1)$. Assuming that uncertainty (entropy) is additive, the total uncertainty (entropy) is given by

$$\sigma_i = \sigma (p_1, 1 - p_1) + (1 - p_1) \, \sigma \left( \frac{p_2}{1 - p_1}, \frac{p_3}{1 - p_1} \right) \; . \tag{1.20}$$

The factor $(1 - p_1)$ in the second term is included since the uncertainty associated with the distinction between states $e_2$ and $e_3$ contributes only when state $e_1$ is not occupied, an event which occurs with probability $1 - p_1$. Using the definition (1.2) and the normalization condition
$$p_1 + p_2 + p_3 = 1 \; , \tag{1.21}$$

one finds

$$\begin{aligned}
\sigma_i &= - p_1 \log p_1 - (1 - p_1) \log (1 - p_1) \\
&\quad + (1 - p_1) \left( - \frac{p_2}{1 - p_1} \log \frac{p_2}{1 - p_1} - \frac{p_3}{1 - p_1} \log \frac{p_3}{1 - p_1} \right) \\
&= - p_1 \log p_1 - p_2 \log p_2 - p_3 \log p_3 \\
&\quad - (1 - p_1 - p_2 - p_3) \log (1 - p_1) \\
&= \sigma (p_1, p_2, p_3) \; ,
\end{aligned} \tag{1.22}$$
that is, for this example the entropy satisfies the decomposition property.
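The identity (1.22) can be checked numerically for any normalized choice of $p_1, p_2, p_3$; the values below are arbitrary:

```python
import math

def entropy(*p):
    # Eq. (1.2) for an explicit list of probabilities.
    return -sum(q * math.log(q) for q in p if q > 0.0)

p1, p2, p3 = 0.5, 0.3, 0.2   # any normalized choice works

direct = entropy(p1, p2, p3)                       # Eq. (1.19)
indirect = (entropy(p1, 1 - p1)                    # Eq. (1.20)
            + (1 - p1) * entropy(p2 / (1 - p1), p3 / (1 - p1)))
assert abs(direct - indirect) < 1e-12
```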
Fig. 1.1. The composition property - three states system.
The general case. The composition property in the general case can be defined as follows. Consider a system which can occupy one of the states $\{e_1, e_2, \ldots, e_{M_0}\}$ with probabilities $q_1, q_2, \ldots, q_{M_0}$ respectively. This set of states is grouped as follows. The first group includes the first $M_1$ states $\{e_1, e_2, \ldots, e_{M_1}\}$; the second group includes the next $M_2$ states $\{e_{M_1+1}, e_{M_1+2}, \ldots, e_{M_1+M_2}\}$, etc., where $M_1 + M_2 + \cdots = M_0$. The probability that one of the states in the first group is occupied is $p_1 = q_1 + q_2 + \cdots + q_{M_1}$, the probability that one of the states in the second group is occupied is $p_2 = q_{M_1+1} + q_{M_1+2} + \cdots + q_{M_1+M_2}$, etc., where
$$p_1 + p_2 + \cdots = 1 \; . \tag{1.23}$$

The composition property requires that the following holds [see Fig. 1.2]

$$\begin{aligned}
\sigma (q_1, q_2, \ldots, q_{M_0}) &= \sigma (p_1, p_2, \ldots) \\
&\quad + p_1 \, \sigma \left( \frac{q_1}{p_1}, \frac{q_2}{p_1}, \ldots, \frac{q_{M_1}}{p_1} \right) \\
&\quad + p_2 \, \sigma \left( \frac{q_{M_1+1}}{p_2}, \frac{q_{M_1+2}}{p_2}, \ldots, \frac{q_{M_1+M_2}}{p_2} \right) + \cdots
\end{aligned} \tag{1.24}$$
Using the definition (1.2) the following holds
$$\sigma (p_1, p_2, \ldots) = - p_1 \log p_1 - p_2 \log p_2 - \cdots \; , \tag{1.25}$$
Fig. 1.2. The composition property - the general case.
$$\begin{aligned}
p_1 \, \sigma \left( \frac{q_1}{p_1}, \frac{q_2}{p_1}, \ldots, \frac{q_{M_1}}{p_1} \right) &= p_1 \left( - \frac{q_1}{p_1} \log \frac{q_1}{p_1} - \frac{q_2}{p_1} \log \frac{q_2}{p_1} - \cdots - \frac{q_{M_1}}{p_1} \log \frac{q_{M_1}}{p_1} \right) \\
&= - q_1 \log q_1 - q_2 \log q_2 - \cdots - q_{M_1} \log q_{M_1} + p_1 \log p_1 \; ,
\end{aligned} \tag{1.26}$$
$$\begin{aligned}
p_2 \, \sigma \left( \frac{q_{M_1+1}}{p_2}, \frac{q_{M_1+2}}{p_2}, \ldots, \frac{q_{M_1+M_2}}{p_2} \right) &= - q_{M_1+1} \log q_{M_1+1} - q_{M_1+2} \log q_{M_1+2} \\
&\quad - \cdots - q_{M_1+M_2} \log q_{M_1+M_2} + p_2 \log p_2 \; ,
\end{aligned} \tag{1.27}$$
etc., thus it is evident that condition (1.24) is indeed satisfied.
1.1.4 Alternative Definition of Entropy
Following Shannon [1, 2], the entropy function $\sigma (p_1, p_2, \ldots, p_N)$ can be alternatively defined as follows:

1. $\sigma (p_1, p_2, \ldots, p_N)$ is a continuous function of its arguments $p_1, p_2, \ldots, p_N$.
2. If all probabilities are equal, namely if $p_1 = p_2 = \cdots = p_N = 1/N$, then the quantity $\Lambda (N) = \sigma (1/N, 1/N, \ldots, 1/N)$ is a monotonically increasing function of $N$.
3. The function $\sigma (p_1, p_2, \ldots, p_N)$ satisfies the composition property given by Eq. (1.24).
Exercise 1.1.1. Show that the above definition leads to the entropy given by Eq. (1.2) up to multiplication by a positive constant.
Solution 1.1.1. The first property allows approximating the probabilities $p_1, p_2, \ldots, p_N$ using rational numbers, namely $p_1 = M_1 / M_0$, $p_2 = M_2 / M_0$, etc., where $M_1, M_2, \ldots$ are integers and $M_0 = M_1 + M_2 + \cdots + M_N$. Using the composition property (1.24) one finds
$$\Lambda (M_0) = \sigma (p_1, p_2, \ldots, p_N) + p_1 \Lambda (M_1) + p_2 \Lambda (M_2) + \cdots \; . \tag{1.28}$$
In particular, consider the case where $M_1 = M_2 = \cdots = M_N = K$. For this case one finds
$$\Lambda (NK) = \Lambda (N) + \Lambda (K) \; . \tag{1.29}$$

Taking $K = N = 1$ yields

$$\Lambda (1) = 0 \; . \tag{1.30}$$

Taking $N = 1 + x$ yields

$$\frac{\Lambda (K + Kx) - \Lambda (K)}{Kx} = \frac{1}{K} \frac{\Lambda (1 + x)}{x} \; . \tag{1.31}$$

Taking the limit $x \to 0$ yields

$$\frac{\mathrm{d} \Lambda}{\mathrm{d} K} = \frac{C}{K} \; , \tag{1.32}$$

where

$$C = \lim_{x \to 0} \frac{\Lambda (1 + x)}{x} \; . \tag{1.33}$$

Integrating Eq. (1.32) and using the initial condition (1.30) yields
$$\Lambda (K) = C \log K \; . \tag{1.34}$$
Moreover, the second property requires that C > 0. Choosing C = 1 and using Eq. (1.28) yields
$$\begin{aligned}
\sigma (p_1, p_2, \ldots, p_N) &= \Lambda (M_0) - p_1 \Lambda (M_1) - p_2 \Lambda (M_2) - \cdots \\
&= - p_1 \log \frac{M_1}{M_0} - p_2 \log \frac{M_2}{M_0} - \cdots - p_N \log \frac{M_N}{M_0} \\
&= - p_1 \log p_1 - p_2 \log p_2 - \cdots - p_N \log p_N \; ,
\end{aligned} \tag{1.35}$$
in agreement with the definition (1.2).
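The argument can be checked numerically: taking $\Lambda (M) = \log M$ [Eq. (1.34) with $C = 1$] and rational probabilities $p_l = M_l / M_0$, the combination in Eq. (1.35) reproduces the entropy (1.2). The integers below are an arbitrary illustrative choice:

```python
import math

def Lambda(M):
    return math.log(M)   # Eq. (1.34) with C = 1

# Rational probabilities p_l = M_l / M_0, as in Solution 1.1.1.
Ms = [2, 3, 5]
M0 = sum(Ms)
p = [M / M0 for M in Ms]

sigma_from_Lambda = Lambda(M0) - sum(pl * Lambda(Ml) for pl, Ml in zip(p, Ms))  # Eq. (1.35)
sigma_direct = -sum(pl * math.log(pl) for pl in p)                             # Eq. (1.2)
assert abs(sigma_from_Lambda - sigma_direct) < 1e-12
```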
1.2 Largest Uncertainty Estimator
As before, the possible states of a given system are denoted as $e_m$, where $m = 1, 2, 3, \ldots$, and the probability that state $e_m$ is occupied is denoted by $p_m$. Let $X_l$ ($l = 1, 2, \ldots, L$) be a set of variables characterizing the system (e.g., energy, number of particles, etc.). Let $X_l (m)$ be the value which the variable $X_l$ takes when the system is in state $e_m$. Consider the case where the expectation values of the variables $X_l$ are given

$$\langle X_l \rangle = \sum_m p_m X_l (m) \; , \tag{1.36}$$

where $l = 1, 2, \ldots, L$. However, the probability distribution $\{p_m\}$ is not given. Clearly, in the general case the knowledge of $\langle X_1 \rangle, \langle X_2 \rangle, \ldots, \langle X_L \rangle$ is not sufficient to determine the probability distribution, because there are in general many different possibilities for choosing a probability distribution which is consistent with the constraints (1.36) and the normalization condition (1.1). For each such probability distribution the entropy can be calculated according to the definition (1.2). The probability distribution $\{p_m\}$ which is consistent with these conditions and has the largest possible entropy is called the largest uncertainty estimator (LUE). The LUE is found by seeking a stationary point of the entropy $\sigma$ with respect to all probability distributions $\{p_m\}$ which satisfy the normalization constraint (1.5) in addition to the constraints (1.36), which can be expressed as

$$0 = g_l (\bar{p}) = \sum_m p_m X_l (m) - \langle X_l \rangle \; , \tag{1.37}$$

where $l = 1, 2, \ldots, L$. To first order one has

$$\delta \sigma = \bar{\nabla} \sigma \cdot \delta \bar{p} \; , \tag{1.38a}$$

$$\delta g_l = \bar{\nabla} g_l \cdot \delta \bar{p} \; , \tag{1.38b}$$
where $l = 0, 1, 2, \ldots, L$. A stationary point of $\sigma$ occurs iff for every small change $\delta \bar{p}$ which is orthogonal to all the vectors $\bar{\nabla} g_0, \bar{\nabla} g_1, \bar{\nabla} g_2, \ldots, \bar{\nabla} g_L$ one has

$$0 = \delta \sigma = \bar{\nabla} \sigma \cdot \delta \bar{p} \; . \tag{1.39}$$

This condition is fulfilled only when the vector $\bar{\nabla} \sigma$ belongs to the subspace spanned by the vectors $\bar{\nabla} g_0, \bar{\nabla} g_1, \bar{\nabla} g_2, \ldots, \bar{\nabla} g_L$ [see also the discussion below Eq. (1.12) above]. In other words, only when

$$\bar{\nabla} \sigma = \xi_0 \bar{\nabla} g_0 + \xi_1 \bar{\nabla} g_1 + \xi_2 \bar{\nabla} g_2 + \cdots + \xi_L \bar{\nabla} g_L \; , \tag{1.40}$$
where the numbers $\xi_0, \xi_1, \ldots, \xi_L$, which are called Lagrange multipliers, are constants. Using Eqs. (1.2), (1.5) and (1.37) the condition (1.40) can be expressed as

$$- \log p_m - 1 = \xi_0 + \sum_{l=1}^{L} \xi_l X_l (m) \; . \tag{1.41}$$

From Eq. (1.41) one obtains

$$p_m = \exp (-1 - \xi_0) \exp \left( - \sum_{l=1}^{L} \xi_l X_l (m) \right) \; . \tag{1.42}$$
The Lagrange multipliers $\xi_0, \xi_1, \ldots, \xi_L$ can be determined from Eqs. (1.5) and (1.37)

$$1 = \sum_m p_m = \exp (-1 - \xi_0) \sum_m \exp \left( - \sum_{l=1}^{L} \xi_l X_l (m) \right) \; , \tag{1.43}$$

$$\langle X_l \rangle = \sum_m p_m X_l (m) = \exp (-1 - \xi_0) \sum_m \exp \left( - \sum_{l'=1}^{L} \xi_{l'} X_{l'} (m) \right) X_l (m) \; . \tag{1.44}$$
Using Eqs. (1.42) and (1.43) one finds
$$p_m = \frac{\exp \left( - \sum_{l=1}^{L} \xi_l X_l (m) \right)}{\sum_m \exp \left( - \sum_{l=1}^{L} \xi_l X_l (m) \right)} \; . \tag{1.45}$$

In terms of the partition function $\mathcal{Z}$, which is defined as
$$\mathcal{Z} = \sum_m \exp \left( - \sum_{l=1}^{L} \xi_l X_l (m) \right) \; , \tag{1.46}$$

one finds

$$p_m = \frac{1}{\mathcal{Z}} \exp \left( - \sum_{l=1}^{L} \xi_l X_l (m) \right) \; . \tag{1.47}$$

Using the same arguments as in section 1.1.2 above [see Eq. (1.16)] it is easy to show that at the stationary point that occurs for the probability distribution given by Eq. (1.47) the entropy obtains its largest value.
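As a sketch of how Eqs. (1.46) and (1.47) are used in practice, consider a single variable $X$ with states $m = 1, \ldots, 6$ (a die) and a prescribed mean; the Lagrange multiplier is found by bisection, using the fact that the mean is monotonically decreasing in $\xi$. The states and the target value are our own illustrative choice:

```python
import math

X = [1, 2, 3, 4, 5, 6]   # values X(m) of the single constrained variable
target = 4.5             # prescribed expectation value <X> (arbitrary)

def mean(xi):
    w = [math.exp(-xi * x) for x in X]   # weights of Eq. (1.47)
    Z = sum(w)                           # partition function, Eq. (1.46)
    return sum(wi * x for wi, x in zip(w, X)) / Z

# <X> decreases monotonically with xi, so bisection finds the multiplier.
lo, hi = -5.0, 5.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean(mid) > target:
        lo = mid
    else:
        hi = mid
xi = 0.5 * (lo + hi)
assert abs(mean(xi) - target) < 1e-9
```

Since the target mean (4.5) exceeds the unconstrained mean (3.5), the resulting multiplier is negative, tilting the distribution toward the larger values of $X$.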
1.2.1 Useful Relations
The expectation value $\langle X_l \rangle$ can be expressed as

$$\langle X_l \rangle = \sum_m p_m X_l (m) = \frac{1}{\mathcal{Z}} \sum_m \exp \left( - \sum_{l'=1}^{L} \xi_{l'} X_{l'} (m) \right) X_l (m) = - \frac{1}{\mathcal{Z}} \frac{\partial \mathcal{Z}}{\partial \xi_l} = - \frac{\partial \log \mathcal{Z}}{\partial \xi_l} \; . \tag{1.48}$$
Similarly, $\langle X_l^2 \rangle$ can be expressed as

$$\langle X_l^2 \rangle = \sum_m p_m X_l^2 (m) = \frac{1}{\mathcal{Z}} \sum_m \exp \left( - \sum_{l'=1}^{L} \xi_{l'} X_{l'} (m) \right) X_l^2 (m) = \frac{1}{\mathcal{Z}} \frac{\partial^2 \mathcal{Z}}{\partial \xi_l^2} \; . \tag{1.49}$$
Using Eqs. (1.48) and (1.49) one finds that the variance of the variable $X_l$ is given by

$$\left\langle (\Delta X_l)^2 \right\rangle = \left\langle \left( X_l - \langle X_l \rangle \right)^2 \right\rangle = \frac{1}{\mathcal{Z}} \frac{\partial^2 \mathcal{Z}}{\partial \xi_l^2} - \left( \frac{1}{\mathcal{Z}} \frac{\partial \mathcal{Z}}{\partial \xi_l} \right)^2 \; . \tag{1.50}$$

However, using the following identity
$$\frac{\partial^2 \log \mathcal{Z}}{\partial \xi_l^2} = \frac{\partial}{\partial \xi_l} \left( \frac{1}{\mathcal{Z}} \frac{\partial \mathcal{Z}}{\partial \xi_l} \right) = \frac{1}{\mathcal{Z}} \frac{\partial^2 \mathcal{Z}}{\partial \xi_l^2} - \left( \frac{1}{\mathcal{Z}} \frac{\partial \mathcal{Z}}{\partial \xi_l} \right)^2 \; , \tag{1.51}$$

one finds

$$\left\langle (\Delta X_l)^2 \right\rangle = \frac{\partial^2 \log \mathcal{Z}}{\partial \xi_l^2} \; . \tag{1.52}$$

Note that the above results, Eqs. (1.48) and (1.52), are valid only when $\mathcal{Z}$ is expressed as a function of the Lagrange multipliers, namely

$$\mathcal{Z} = \mathcal{Z} (\xi_1, \xi_2, \ldots, \xi_L) \; . \tag{1.53}$$

Using the definition of entropy (1.2) and Eq. (1.47) one finds
$$\begin{aligned}
\sigma &= - \sum_m p_m \log p_m \\
&= - \sum_m p_m \log \left( \frac{1}{\mathcal{Z}} \exp \left( - \sum_{l=1}^{L} \xi_l X_l (m) \right) \right) \\
&= \sum_m p_m \left( \log \mathcal{Z} + \sum_{l=1}^{L} \xi_l X_l (m) \right) \\
&= \log \mathcal{Z} + \sum_{l=1}^{L} \xi_l \sum_m p_m X_l (m) \; ,
\end{aligned} \tag{1.54}$$

thus

$$\sigma = \log \mathcal{Z} + \sum_{l=1}^{L} \xi_l \langle X_l \rangle \; . \tag{1.55}$$

Using the above relations one can also evaluate the partial derivative of the entropy $\sigma$ when it is expressed as a function of the expectation values, namely

$$\sigma = \sigma (\langle X_1 \rangle, \langle X_2 \rangle, \ldots, \langle X_L \rangle) \; . \tag{1.56}$$

Using Eq. (1.55) one has

$$\begin{aligned}
\frac{\partial \sigma}{\partial \langle X_l \rangle} &= \frac{\partial \log \mathcal{Z}}{\partial \langle X_l \rangle} + \sum_{l'=1}^{L} \frac{\partial \xi_{l'}}{\partial \langle X_l \rangle} \langle X_{l'} \rangle + \sum_{l'=1}^{L} \xi_{l'} \frac{\partial \langle X_{l'} \rangle}{\partial \langle X_l \rangle} \\
&= \frac{\partial \log \mathcal{Z}}{\partial \langle X_l \rangle} + \sum_{l'=1}^{L} \langle X_{l'} \rangle \frac{\partial \xi_{l'}}{\partial \langle X_l \rangle} + \xi_l \\
&= \sum_{l'=1}^{L} \frac{\partial \log \mathcal{Z}}{\partial \xi_{l'}} \frac{\partial \xi_{l'}}{\partial \langle X_l \rangle} + \sum_{l'=1}^{L} \langle X_{l'} \rangle \frac{\partial \xi_{l'}}{\partial \langle X_l \rangle} + \xi_l \; ,
\end{aligned} \tag{1.57}$$
thus using Eq. (1.48) one finds

$$\frac{\partial \sigma}{\partial \langle X_l \rangle} = \xi_l \; . \tag{1.58}$$

1.2.2 The Free Entropy
The free entropy $\sigma_F$ is defined as the term $\log \mathcal{Z}$ in Eq. (1.54)

$$\begin{aligned}
\sigma_F &= \log \mathcal{Z} \\
&= \sigma - \sum_{l=1}^{L} \xi_l \sum_m p_m X_l (m) \\
&= - \sum_m p_m \log p_m - \sum_{l=1}^{L} \xi_l \sum_m p_m X_l (m) \; .
\end{aligned} \tag{1.59}$$

The free entropy is commonly expressed as a function of the Lagrange multipliers

$$\sigma_F = \sigma_F (\xi_1, \xi_2, \ldots, \xi_L) \; . \tag{1.60}$$

We have seen above that the LUE maximizes $\sigma$ for given values of the expectation values $\langle X_1 \rangle, \langle X_2 \rangle, \ldots, \langle X_L \rangle$. We show below that a similar result can be obtained for the free entropy $\sigma_F$ with respect to given values of the Lagrange multipliers.
Claim. The LUE maximizes $\sigma_F$ for given values of the Lagrange multipliers $\xi_1, \xi_2, \ldots, \xi_L$.

Proof. As before, the normalization condition is expressed as

$$0 = g_0 (\bar{p}) = \sum_m p_m - 1 \; . \tag{1.61}$$

At a stationary point of $\sigma_F$, as we have seen previously, the following holds

$$\bar{\nabla} \sigma_F = \eta \bar{\nabla} g_0 \; , \tag{1.62}$$

where $\eta$ is a Lagrange multiplier. Thus

$$- (\log p_m + 1) - \sum_{l=1}^{L} \xi_l X_l (m) = \eta \; , \tag{1.63}$$

or

$$p_m = \exp (- \eta - 1) \exp \left( - \sum_{l=1}^{L} \xi_l X_l (m) \right) \; . \tag{1.64}$$
Table 1.1. The microcanonical, canonical and grandcanonical distributions.
  distribution      energy                          number of particles
  microcanonical    constrained: U(m) = U           constrained: N(m) = N
  canonical         average $\langle U \rangle$ is given    constrained: N(m) = N
  grandcanonical    average $\langle U \rangle$ is given    average $\langle N \rangle$ is given
This result is the same as the one given by Eq. (1.42). Taking into account the normalization condition (1.61) one obtains the same expression for $p_m$ as the one given by Eq. (1.47). Namely, the stationary point of $\sigma_F$ corresponds to the LUE probability distribution. Since

$$\frac{\partial^2 \sigma_F}{\partial p_m \partial p_{m'}} = - \frac{1}{p_m} \delta_{m,m'} < 0 \; , \tag{1.65}$$

one concludes that this stationary point is a maximum point [see Eq. (1.16)].
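The claim can be illustrated numerically: at fixed $\xi$, the free entropy of the LUE equals $\log \mathcal{Z}$, and no other normalized distribution exceeds it. The values of $X(m)$ and $\xi$ below are arbitrary:

```python
import math, random

random.seed(1)
X = [0.0, 1.0, 2.0]   # values X(m) of a single variable (illustrative)
xi = 0.7              # a fixed Lagrange multiplier (arbitrary choice)

def sigma_F(p):
    # Free entropy, Eq. (1.59), for a single variable X (L = 1).
    return (-sum(q * math.log(q) for q in p if q > 0.0)
            - xi * sum(q * x for q, x in zip(p, X)))

w = [math.exp(-xi * x) for x in X]
Z = sum(w)
lue = [wi / Z for wi in w]                        # the LUE, Eq. (1.47)
assert abs(sigma_F(lue) - math.log(Z)) < 1e-12    # sigma_F = log Z at the LUE

# Random normalized distributions never exceed the LUE value.
for _ in range(1000):
    v = [random.random() for _ in X]
    s = sum(v)
    assert sigma_F([q / s for q in v]) <= sigma_F(lue) + 1e-12
```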
1.3 The Principle of Largest Uncertainty in Statistical Mechanics
The energy and number of particles of state $e_m$ are denoted by $U (m)$ and $N (m)$ respectively. The probability that state $e_m$ is occupied is denoted as $p_m$. We consider below three cases (see table 1.1). In the first case (microcanonical distribution) the system is isolated and its total energy $U$ and number of particles $N$ are constrained, that is, for all accessible states $U (m) = U$ and $N (m) = N$. In the second case (canonical distribution) the system is allowed to exchange energy with the environment, and we assume that its average energy $\langle U \rangle$ is given. However, its number of particles is constrained, that is $N (m) = N$. In the third case (grandcanonical distribution) the system is allowed to exchange both energy and particles with the environment, and we assume that both the average energy $\langle U \rangle$ and the average number of particles $\langle N \rangle$ are given. However, in all cases, the probability distribution $\{p_m\}$ is not given. According to the principle of largest uncertainty in statistical mechanics the LUE is employed to estimate the probability distribution $\{p_m\}$; namely, we seek a probability distribution which is consistent with the normalization condition (1.1) and with the given expectation values (energy, in the second case, and both energy and number of particles, in the third case), and which maximizes the entropy.
1.3.1 Microcanonical Distribution

In this case no expectation values are given. Thus we seek a probability distribution which is consistent with the normalization condition (1.1), and which maximizes the entropy. The desired probability distribution is

$$p_1 = p_2 = \cdots = 1/M \; , \tag{1.66}$$

where $M$ is the number of accessible states of the system [see also Eq. (1.18)]. Using Eq. (1.2) the entropy for this case is given by
$$\sigma = \log M \; . \tag{1.67}$$
1.3.2 Canonical Distribution
Using Eq. (1.47) one finds that the probability distribution is given by

$$p_m = \frac{1}{\mathcal{Z}_c} \exp (- \beta U (m)) \; , \tag{1.68}$$

where $\beta$ is the Lagrange multiplier associated with the given expectation value $\langle U \rangle$, and the partition function is given by

$$\mathcal{Z}_c = \sum_m \exp (- \beta U (m)) \; . \tag{1.69}$$

The term $\exp (- \beta U (m))$ is called the Boltzmann factor. Moreover, Eq. (1.48) yields

$$\langle U \rangle = - \frac{\partial \log \mathcal{Z}_c}{\partial \beta} \; , \tag{1.70}$$

Eq. (1.52) yields

$$\left\langle (\Delta U)^2 \right\rangle = \frac{\partial^2 \log \mathcal{Z}_c}{\partial \beta^2} \; , \tag{1.71}$$

and Eq. (1.55) yields

$$\sigma = \log \mathcal{Z}_c + \beta \langle U \rangle \; . \tag{1.72}$$

Using Eq. (1.58) one can express the Lagrange multiplier $\beta$ as

$$\beta = \frac{\partial \sigma}{\partial \langle U \rangle} \; . \tag{1.73a}$$

The temperature $\tau$ is defined by

$$\frac{1}{\tau} = \beta \; . \tag{1.74}$$

Exercise 1.3.1. Consider a system that can be in one of two states having energies $\pm \varepsilon / 2$. Calculate the average energy $\langle U \rangle$ and the variance $\left\langle (\Delta U)^2 \right\rangle$ in thermal equilibrium at temperature $\tau$.
Solution: The partition function is given by Eq. (1.69)

$$\mathcal{Z}_c = \exp \left( \frac{\beta \varepsilon}{2} \right) + \exp \left( - \frac{\beta \varepsilon}{2} \right) = 2 \cosh \frac{\beta \varepsilon}{2} \; , \tag{1.75}$$

thus using Eqs. (1.70) and (1.71) one finds

$$\langle U \rangle = - \frac{\varepsilon}{2} \tanh \frac{\beta \varepsilon}{2} \; , \tag{1.76}$$

and

$$\left\langle (\Delta U)^2 \right\rangle = \left( \frac{\varepsilon}{2} \right)^2 \frac{1}{\cosh^2 \frac{\beta \varepsilon}{2}} \; , \tag{1.77}$$

where $\beta = 1/\tau$.
[Figure: plot of $- \tanh (1/x)$ as a function of $x$, showing the average energy $\langle U \rangle$ as a function of temperature]
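Eqs. (1.76) and (1.77) can be cross-checked against Eqs. (1.70) and (1.71) by differentiating $\log \mathcal{Z}_c$ numerically; the values of $\varepsilon$ and $\tau$ are arbitrary:

```python
import math

eps = 1.0    # level splitting (arbitrary units)
tau = 0.8    # temperature (arbitrary choice)
beta = 1.0 / tau

def logZ(b):
    # Log of the two-level partition function, Eq. (1.75).
    return math.log(2.0 * math.cosh(b * eps / 2.0))

h = 1e-4
U = -(logZ(beta + h) - logZ(beta - h)) / (2 * h)                   # Eq. (1.70)
varU = (logZ(beta + h) - 2 * logZ(beta) + logZ(beta - h)) / h**2   # Eq. (1.71)

assert abs(U - (-(eps / 2) * math.tanh(beta * eps / 2))) < 1e-6          # Eq. (1.76)
assert abs(varU - (eps / 2) ** 2 / math.cosh(beta * eps / 2) ** 2) < 1e-6  # Eq. (1.77)
```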
1.3.3 Grandcanonical Distribution
Using Eq. (1.47) one finds that the probability distribution is given by

$$p_m = \frac{1}{\mathcal{Z}_{gc}} \exp (- \beta U (m) - \eta N (m)) \; , \tag{1.78}$$

where $\beta$ and $\eta$ are the Lagrange multipliers associated with the given expectation values $\langle U \rangle$ and $\langle N \rangle$ respectively, and the partition function is given by

$$\mathcal{Z}_{gc} = \sum_m \exp (- \beta U (m) - \eta N (m)) \; . \tag{1.79}$$

The term $\exp (- \beta U (m) - \eta N (m))$ is called the Gibbs factor. Moreover, Eq. (1.48) yields
$$\langle U \rangle = - \left( \frac{\partial \log \mathcal{Z}_{gc}}{\partial \beta} \right)_{\eta} \; , \tag{1.80}$$

$$\langle N \rangle = - \left( \frac{\partial \log \mathcal{Z}_{gc}}{\partial \eta} \right)_{\beta} \; , \tag{1.81}$$

Eq. (1.52) yields

$$\left\langle (\Delta U)^2 \right\rangle = \left( \frac{\partial^2 \log \mathcal{Z}_{gc}}{\partial \beta^2} \right)_{\eta} \; , \tag{1.82}$$

$$\left\langle (\Delta N)^2 \right\rangle = \left( \frac{\partial^2 \log \mathcal{Z}_{gc}}{\partial \eta^2} \right)_{\beta} \; , \tag{1.83}$$

and Eq. (1.55) yields

$$\sigma = \log \mathcal{Z}_{gc} + \beta \langle U \rangle + \eta \langle N \rangle \; . \tag{1.84}$$
1.3.4 Temperature and Chemical Potential
In statistical mechanics the probability distributions of macroscopic parameters are typically extremely sharp and narrow. Consequently, in many cases no distinction is made between a parameter and its expectation value. That is, the expression for the entropy in Eq. (1.72) can be rewritten as

$$\sigma = \log \mathcal{Z}_c + \beta U \; , \tag{1.85}$$

and the one in Eq. (1.84) as

$$\sigma = \log \mathcal{Z}_{gc} + \beta U + \eta N \; . \tag{1.86}$$

Using Eq. (1.58) one can express the Lagrange multipliers $\beta$ and $\eta$ as

$$\beta = \left( \frac{\partial \sigma}{\partial U} \right)_N \; , \tag{1.87}$$

$$\eta = \left( \frac{\partial \sigma}{\partial N} \right)_U \; . \tag{1.88}$$

The chemical potential $\mu$ is defined as

$$\mu = - \tau \eta \; . \tag{1.89}$$
S = kBσ , (1.90)
where
$$k_B = 1.38 \times 10^{-23} \, \mathrm{J} \, \mathrm{K}^{-1} \tag{1.91}$$

is the Boltzmann constant. Moreover, the historical definition of the temperature is

$$T = \frac{\tau}{k_B} \; . \tag{1.92}$$

When the grandcanonical partition function is expressed in terms of $\beta$ and $\mu$ (instead of in terms of $\beta$ and $\eta$), it is convenient to rewrite Eqs. (1.80) and (1.81) as (see homework exercise 14 of chapter 1)

$$\langle U \rangle = - \left( \frac{\partial \log \mathcal{Z}_{gc}}{\partial \beta} \right)_{\mu} + \tau \mu \left( \frac{\partial \log \mathcal{Z}_{gc}}{\partial \mu} \right)_{\beta} \; , \tag{1.93}$$

$$\langle N \rangle = \lambda \frac{\partial \log \mathcal{Z}_{gc}}{\partial \lambda} \; , \tag{1.94}$$

where $\lambda$ is the fugacity, which is defined by

$$\lambda = \exp (\beta \mu) = e^{- \eta} \; . \tag{1.95}$$
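A minimal grandcanonical illustration of Eqs. (1.81) and (1.94): a single orbital which can hold $N = 0$ or $1$ particles, for which $\mathcal{Z}_{gc} = 1 + \exp (- \beta \varepsilon - \eta)$. This single-orbital model and the numerical values are our own illustrative choice:

```python
import math

e, beta, eta = 1.0, 2.0, -0.5   # orbital energy and multipliers (arbitrary)

def logZ(eta_):
    # Z_gc = 1 + exp(-beta*e - eta) for an orbital holding 0 or 1 particles.
    return math.log(1.0 + math.exp(-beta * e - eta_))

h = 1e-6
N_deriv = -(logZ(eta + h) - logZ(eta - h)) / (2 * h)   # Eq. (1.81)
N_exact = 1.0 / (math.exp(beta * e + eta) + 1.0)       # mean occupation
assert abs(N_deriv - N_exact) < 1e-9

# Eq. (1.94): the same result via the fugacity lam = exp(-eta).
lam = math.exp(-eta)
def logZ_lam(l):
    return math.log(1.0 + l * math.exp(-beta * e))
N_fug = lam * (logZ_lam(lam + h) - logZ_lam(lam - h)) / (2 * h)
assert abs(N_fug - N_exact) < 1e-9
```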
1.4 Time Evolution of Entropy of an Isolated System
Consider a perturbation which results in transitions between the states of an isolated system. Let $\Gamma_{rs}$ denote the resulting rate of transition from state $r$ to state $s$. The probability that state $s$ is occupied is denoted as $p_s$. The following theorem (the H theorem) states that if for every pair of states $r$ and $s$

$$\Gamma_{rs} = \Gamma_{sr} \; , \tag{1.96}$$

then

$$\frac{\mathrm{d} \sigma}{\mathrm{d} t} \ge 0 \; . \tag{1.97}$$

Moreover, equality holds iff $p_s = p_r$ for all pairs of states for which $\Gamma_{sr} \neq 0$. To prove this theorem we express the rate of change in the probability $p_r$ in terms of these transition rates

$$\frac{\mathrm{d} p_r}{\mathrm{d} t} = \sum_s p_s \Gamma_{sr} - \sum_s p_r \Gamma_{rs} \; . \tag{1.98}$$

The first term represents the transitions to state $r$, whereas the second one represents transitions from state $r$. Using property (1.96) one finds

$$\frac{\mathrm{d} p_r}{\mathrm{d} t} = \sum_s \Gamma_{sr} (p_s - p_r) \; . \tag{1.99}$$
The last result and the definition (1.2) allow calculating the rate of change of the entropy

$$\frac{\mathrm{d} \sigma}{\mathrm{d} t} = - \frac{\mathrm{d}}{\mathrm{d} t} \sum_r p_r \log p_r = - \sum_r \frac{\mathrm{d} p_r}{\mathrm{d} t} (\log p_r + 1) = - \sum_r \sum_s \Gamma_{sr} (p_s - p_r) (\log p_r + 1) \; . \tag{1.100}$$

On the other hand, using Eq. (1.96) and exchanging the summation indices allow rewriting the last result as

$$\frac{\mathrm{d} \sigma}{\mathrm{d} t} = \sum_r \sum_s \Gamma_{sr} (p_s - p_r) (\log p_s + 1) \; . \tag{1.101}$$

Thus, using both expressions (1.100) and (1.101) yields

$$\frac{\mathrm{d} \sigma}{\mathrm{d} t} = \frac{1}{2} \sum_r \sum_s \Gamma_{sr} (p_s - p_r) (\log p_s - \log p_r) \; . \tag{1.102}$$

In general, since $\log x$ is a monotonically increasing function

$$(p_s - p_r) (\log p_s - \log p_r) \ge 0 \; , \tag{1.103}$$

and equality holds iff $p_s = p_r$. Thus, in general

$$\frac{\mathrm{d} \sigma}{\mathrm{d} t} \ge 0 \; , \tag{1.104}$$

and equality holds iff $p_s = p_r$ holds for all pairs of states satisfying $\Gamma_{sr} \neq 0$. When $\sigma$ becomes time independent the system is said to be in thermal equilibrium. In thermal equilibrium, when all accessible states have the same probability, one finds using the definition (1.2)
$$\sigma = \log M \; , \tag{1.105}$$
where $M$ is the number of accessible states of the system. Note that the rates $\Gamma_{rs}$, which can be calculated using quantum mechanics, indeed satisfy the property (1.96) for the case of an isolated system.
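The H theorem can be watched in action by integrating the master equation (1.99) with symmetric, randomly chosen rates; all values below are illustrative:

```python
import math, random

random.seed(2)
M = 4
# Symmetric transition rates, Gamma_rs = Gamma_sr [Eq. (1.96)];
# the positive values below are arbitrary.
G = [[0.0] * M for _ in range(M)]
for r in range(M):
    for s in range(r + 1, M):
        G[r][s] = G[s][r] = random.uniform(0.1, 1.0)

p = [0.7, 0.2, 0.05, 0.05]   # an arbitrary initial distribution
dt, steps = 5e-3, 12000

def entropy(p):
    return -sum(q * math.log(q) for q in p if q > 0.0)

sigmas = [entropy(p)]
for _ in range(steps):
    # Master equation, Eq. (1.99): dp_r/dt = sum_s Gamma_sr (p_s - p_r).
    dp = [sum(G[s][r] * (p[s] - p[r]) for s in range(M)) for r in range(M)]
    p = [p[r] + dt * dp[r] for r in range(M)]
    sigmas.append(entropy(p))

# The entropy never decreases [Eq. (1.104)] and p tends to the uniform
# distribution, for which sigma = log M [Eq. (1.105)].
assert all(b >= a - 1e-12 for a, b in zip(sigmas, sigmas[1:]))
assert all(abs(q - 1.0 / M) < 1e-6 for q in p)
```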
1.5 Thermal Equilibrium
Consider two isolated systems denoted as $S_1$ and $S_2$. Let $\sigma_1 = \sigma_1 (U_1, N_1)$ and $\sigma_2 = \sigma_2 (U_2, N_2)$ be the entropy of the first and second system respectively, and let $\sigma = \sigma_1 + \sigma_2$ be the total entropy. The systems are brought to contact and now both energy and particles can be exchanged between the systems. Let $\delta U$ be an infinitesimal energy, and let $\delta N$ be an infinitesimal number of particles, which are transferred from system 1 to system 2. The corresponding change in the total entropy is given by

$$\begin{aligned}
\delta \sigma &= - \left( \frac{\partial \sigma_1}{\partial U_1} \right)_{N_1} \delta U + \left( \frac{\partial \sigma_2}{\partial U_2} \right)_{N_2} \delta U - \left( \frac{\partial \sigma_1}{\partial N_1} \right)_{U_1} \delta N + \left( \frac{\partial \sigma_2}{\partial N_2} \right)_{U_2} \delta N \\
&= \left( - \frac{1}{\tau_1} + \frac{1}{\tau_2} \right) \delta U - \left( - \frac{\mu_1}{\tau_1} + \frac{\mu_2}{\tau_2} \right) \delta N \; .
\end{aligned} \tag{1.106}$$
The change $\delta \sigma$ in the total entropy is obtained by removing a constraint. Thus, at the end of this process more states are accessible, and therefore, according to the principle of largest uncertainty it is expected that

$$\delta \sigma \ge 0 \; . \tag{1.107}$$

For the case where no particles can be exchanged ($\delta N = 0$) this implies that energy flows from the system of higher temperature to the system of lower temperature. Another important case is the case where $\tau_1 = \tau_2$, for which we conclude that particles flow from the system of higher chemical potential to the system of lower chemical potential. In thermal equilibrium the entropy of the total system obtains its largest possible value. This occurs when
$$\tau_1 = \tau_2 \tag{1.108}$$
and
$$\mu_1 = \mu_2 \; . \tag{1.109}$$
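Condition (1.108) can be illustrated with a toy model: take $\sigma_i (U_i) = c_i \log U_i$, for which $\tau_i = U_i / c_i$, and scan the split of a fixed total energy $U$; the total entropy peaks exactly where the two temperatures agree. The model and constants are our own choice:

```python
import math

c1, c2, U = 1.5, 2.5, 10.0   # toy entropy constants and total energy

def total_sigma(U1):
    # sigma_1(U1) + sigma_2(U - U1) for the toy model sigma_i = c_i log U_i.
    return c1 * math.log(U1) + c2 * math.log(U - U1)

# Scan the energy split and locate the maximum of the total entropy.
best_U1 = max((0.001 * k for k in range(1, 10000)), key=total_sigma)

tau1 = best_U1 / c1
tau2 = (U - best_U1) / c2
assert abs(tau1 - tau2) < 0.01   # temperatures agree at the maximum, Eq. (1.108)
```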
1.5.1 Externally Applied Potential Energy
In the presence of an externally applied potential energy $\mu_{ex}$ the total chemical potential $\mu_{tot}$ is given by

$$\mu_{tot} = \mu_{int} + \mu_{ex} \; , \tag{1.110}$$

where $\mu_{int}$ is the internal chemical potential. For example, for particles having charge $q$ in the presence of an electric potential $V$ one has

$$\mu_{ex} = qV \; , \tag{1.111}$$
whereas for particles having mass $m$ in a constant gravitational field $g$ one has
$$\mu_{ex} = mgz \; , \tag{1.112}$$

where $z$ is the height. The thermal equilibrium relation (1.109) is generalized in the presence of externally applied potential energy as
$$\mu_{tot,1} = \mu_{tot,2} \; . \tag{1.113}$$
1.6 Free Entropy and Free Energies
The free entropy [see Eq. (1.59)] for the canonical distribution is given by [see Eq. (1.85)]
$$\sigma_{F,c} = \sigma - \beta U \; , \tag{1.114}$$

whereas for the grandcanonical case it is given by [see Eq. (1.86)]

$$\sigma_{F,gc} = \sigma - \beta U - \eta N \; . \tag{1.115}$$

We define below two corresponding free energies, the canonical free energy (known also as the Helmholtz free energy)

$$F = - \tau \sigma_{F,c} = U - \tau \sigma \; , \tag{1.116}$$

and the grandcanonical free energy

$$G = - \tau \sigma_{F,gc} = U - \tau \sigma + \tau \eta N = U - \tau \sigma - \mu N \; .$$
In section 1.2.2 above we have shown that the LUE maximizes $\sigma_F$ for given values of the Lagrange multipliers $\xi_1, \xi_2, \ldots, \xi_L$. This principle can be implemented to show that:

• In equilibrium at a given temperature $\tau$ the Helmholtz free energy obtains its smallest possible value.
• In equilibrium at a given temperature $\tau$ and chemical potential $\mu$ the grandcanonical free energy obtains its smallest possible value.

Our main results are summarized in table 1.2 below.
1.7 Problems Set 1
Note: Problems 1-6 are taken from the book by Reif, chapter 1.
Table 1.2. Summary of main results.
General (given $\langle X_l \rangle$, $l = 1, 2, \ldots, L$): $\mathcal{Z} = \sum_m e^{- \sum_l \xi_l X_l (m)}$; $p_m = \frac{1}{\mathcal{Z}} e^{- \sum_l \xi_l X_l (m)}$; $\langle X_l \rangle = - \partial \log \mathcal{Z} / \partial \xi_l$; $\langle (\Delta X_l)^2 \rangle = \partial^2 \log \mathcal{Z} / \partial \xi_l^2$; $\sigma = \log \mathcal{Z} + \sum_l \xi_l \langle X_l \rangle$; $\xi_l = \partial \sigma / \partial \langle X_l \rangle$; $\sigma_F = \sigma - \sum_l \xi_l \langle X_l \rangle$ is maximized.

Microcanonical ($M$ states): $p_m = 1/M$; $\sigma = \log M$; $\sigma$ is maximized.

Canonical (given $\langle U \rangle$): $\mathcal{Z}_c = \sum_m e^{- \beta U (m)}$; $p_m = \frac{1}{\mathcal{Z}_c} e^{- \beta U (m)}$; $U = - \partial \log \mathcal{Z}_c / \partial \beta$; $\langle (\Delta U)^2 \rangle = \partial^2 \log \mathcal{Z}_c / \partial \beta^2$; $\sigma = \log \mathcal{Z}_c + \beta U$; $\beta = (\partial \sigma / \partial U)_N$; $F (\tau) = U - \tau \sigma$ is minimized.

Grandcanonical (given $\langle U \rangle$ and $\langle N \rangle$): $\mathcal{Z}_{gc} = \sum_m e^{- \beta U (m) - \eta N (m)}$; $p_m = \frac{1}{\mathcal{Z}_{gc}} e^{- \beta U (m) - \eta N (m)}$; $U = - (\partial \log \mathcal{Z}_{gc} / \partial \beta)_{\eta}$; $N = - (\partial \log \mathcal{Z}_{gc} / \partial \eta)_{\beta}$; $\langle (\Delta U)^2 \rangle = (\partial^2 \log \mathcal{Z}_{gc} / \partial \beta^2)_{\eta}$; $\langle (\Delta N)^2 \rangle = (\partial^2 \log \mathcal{Z}_{gc} / \partial \eta^2)_{\beta}$; $\sigma = \log \mathcal{Z}_{gc} + \beta U + \eta N$; $\beta = (\partial \sigma / \partial U)_N$, $\eta = (\partial \sigma / \partial N)_U$; $G (\tau, \mu) = U - \tau \sigma - \mu N$ is minimized.
1. A drunk starts out from a lamppost in the middle of a street, taking steps of equal length either to the right or to the left with equal probability. What is the probability that the man will again be at the lamppost after taking N steps
a) if N is even?
b) if N is odd?
2. In the game of Russian roulette, one inserts a single cartridge into the drum of a revolver, leaving the other five chambers of the drum empty. One then spins the drum, aims at one's head, and pulls the trigger.
a) What is the probability of being still alive after playing the game N times?
b) What is the probability of surviving (N − 1) turns in this game and then being shot the Nth time one pulls the trigger?
c) What is the mean number of times a player gets the opportunity of pulling the trigger in this macabre game?
3. Consider the random walk problem with p = q and let m = n₁ − n₂ denote the net displacement to the right. After a total of N steps, calculate the following mean values: ⟨m⟩, ⟨m²⟩, ⟨m³⟩, and ⟨m⁴⟩. Hint: Calculate the moment generating function.
4. The probability W(n) that an event characterized by a probability p occurs n times in N trials was shown to be given by the binomial distribution
W(n) = N!/[n!(N − n)!] pⁿ (1 − p)^{N−n} . (1.117)

Consider a situation where the probability p is small (p ≪ 1) and where one is interested in the case n ≪ N. (Note that if N is large, W(n) becomes very small if n → N because of the smallness of the factor pⁿ when p ≪ 1. Hence W(n) is indeed only appreciable when n ≪ N.) Several approximations can then be made to reduce Eq. (1.117) to a simpler form.
a) Using the result ln(1 − p) ≃ −p, show that (1 − p)^{N−n} ≃ e^{−Np}.
b) Show that N!/(N − n)! ≃ Nⁿ.
c) Hence show that Eq. (1.117) reduces to
W(n) = (λⁿ/n!) e^{−λ} ,

where λ ≡ Np is the mean number of events. This distribution is called the "Poisson distribution."
5. Consider the Poisson distribution of the preceding problem.
a) Show that it is properly normalized in the sense that Σ_{n=0}^{∞} W(n) = 1. (The sum can be extended to infinity to an excellent approximation, since W(n) is negligibly small when n ≳ N.)
b) Use the Poisson distribution to calculate ⟨n⟩.
c) Use the Poisson distribution to calculate ⟨(Δn)²⟩ = ⟨(n − ⟨n⟩)²⟩.
6. A molecule in a gas moves equal distances l between collisions with equal probability in any direction. After a total of N such displacements, what is the mean square displacement ⟨R²⟩ of the molecule from its starting point?
7. A multiple choice test contains 20 problems. The correct answer for each problem has to be chosen out of 5 options. Show that the probability to pass the test (namely to have at least 11 correct answers) using guessing only, is 5.6 × 10⁻⁴.
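As a numerical aside (not part of the original problem set), the Poisson limit of problem 4 and the normalization and moments of problem 5 can be cross-checked directly; the script below and its test values (N = 1000, p = 0.003) are illustrative assumptions:

```python
from math import exp, comb

def binomial_pmf(n, N, p):
    """Exact binomial probability W(n) of Eq. (1.117)."""
    return comb(N, n) * p**n * (1 - p)**(N - n)

def poisson_pmf(n, lam):
    """Poisson probability W(n) = lam**n * exp(-lam) / n!,
    built iteratively to avoid large factorials."""
    w = exp(-lam)
    for k in range(1, n + 1):
        w *= lam / k
    return w
```

For small p and n ≪ N the two probability mass functions agree closely, and summing n·W(n) and n²·W(n) reproduces ⟨n⟩ = λ and ⟨(Δn)²⟩ = λ.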
8. Consider two objects traveling in the xy plane. Object A starts from the point (0, 0) and object B starts from the point (N, N), where N is an integer. At each step both objects A and B simultaneously make a single move of length unity. Object A makes either a move to the right (x, y) → (x + 1, y) with probability 1/2 or an upward move (x, y) → (x, y + 1) with probability 1/2. On the other hand, object B makes either a move to the left (x, y) → (x − 1, y) with probability 1/2 or a downward move (x, y) → (x, y − 1) with probability 1/2. What is the probability that objects A and B meet along the way in the limit N → ∞?
9. Consider a die having 6 faces. All faces have equal probability of outcome. Initially, n faces are colored white and 6 − n faces are colored black, where n ∈ {0, 1, 2, ···, 6}. Each time the outcome is white (black) one black (white) face is turned into a white (black) face before the next roll. The process continues until all faces have the same color. What is the probability p_n that all faces will become white?
10. Alice, Bob and other N − 2 people are randomly seated at a round table. What is the probability p_C that Alice and Bob will be seated next to each other? What is the probability p_R that Alice and Bob will be seated next to each other for the case where the group is randomly seated in a row?
11. Write a computer function returning the value 1 with probability p and the value 0 with probability 1 − p for any given 0 ≤ p ≤ 1. The function can use another given function, which returns the value 1 with probability 1/2 and the value 0 with probability 1/2. Make sure the running time is finite.
12. Consider a system of N spins. Each spin can be in one of two possible states: in state 'up' the magnetic moment of each spin is +m, and in state 'down' it is −m. Let N₊ (N₋) be the number of spins in state 'up' ('down'), where N = N₊ + N₋. The total magnetic moment of the system is given by
M = m(N₊ − N₋) . (1.118)

Assume that the probability that the system occupies any of its 2^N possible states is equal. Moreover, assume that N ≫ 1. Let f(M) be the probability distribution of the random variable M (that is, M is considered in this approach as a continuous random variable). Use Stirling's formula

N! = (2πN)^{1/2} N^N exp[−N + 1/(12N) + ···] (1.119)

to show that

f(M) = [1/(m√(2πN))] exp[−M²/(2m²N)] . (1.120)

Use this result to evaluate the expectation value and the variance of M.
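A computational aside: problem 11 can be approached by comparing successive fair bits against the binary digits of p, deciding at the first disagreement. The sketch below is one possible implementation (the function name and the use of Python's random module are illustrative assumptions, not prescribed by the text):

```python
import random

def biased_bit(p, fair_bit=lambda: random.randint(0, 1)):
    """Return 1 with probability p using only fair bits.

    At step m the m-th binary digit of p is compared with a fresh
    fair bit; the first disagreement decides the outcome (return the
    digit itself).  The loop terminates after 2 iterations on average,
    so the running time is finite with probability one.
    """
    frac = p
    while True:
        frac *= 2
        digit = 1 if frac >= 1 else 0   # next binary digit of p
        frac -= digit
        if fair_bit() != digit:
            # fair bit 0 while digit 1 -> output 1;
            # fair bit 1 while digit 0 -> output 0; both equal digit
            return digit
```

The returned value is 1 with probability Σ_m σ_m (1/2)^m = p, since the comparison first disagrees at digit m with probability (1/2)^m.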
13. Consider a one dimensional random walk. The probabilities of transiting to the right and left are p and q = 1 − p respectively. The step size for both cases is a.
a) Show that the average displacement ⟨X⟩ after N steps is given by
⟨X⟩ = aN(2p − 1) = aN(p − q) . (1.121)
b) Show that the variance ⟨(X − ⟨X⟩)²⟩ is given by
⟨(X − ⟨X⟩)²⟩ = 4a²Npq . (1.122)
14. A classical harmonic oscillator of mass m and spring constant k oscillates with amplitude a. Show that the probability density function f(x), where f(x)dx is the probability that the mass would be found in the interval dx at x, is given by
f(x) = 1/[π√(a² − x²)] . (1.123)
15. A coin having probability p = 2/3 of coming up heads is flipped 6 times. Show that the entropy of the outcome of this experiment is σ = 3.8191 (use log in natural base in the definition of the entropy).
16. A fair coin is flipped until the first head occurs. Let X denote the number of flips required.
a) Find the entropy σ. In this exercise use log in base 2 in the definition
of the entropy, namely σ = −Σ_i p_i log₂ p_i.
b) A random variable X is drawn according to this distribution. Find an "efficient" sequence of yes-no questions of the form, "Is X contained in the set S?" Compare σ to the expected number of questions required to determine X.
17. In general the notation
(∂z/∂x)_y

is used to denote the partial derivative of z with respect to x, where the variable y is kept constant. That is, to correctly calculate this derivative the variable z has to be expressed as a function of x and y, namely, z = z(x, y).
a) Show that:
(∂y/∂x)_z = −(∂z/∂x)_y / (∂z/∂y)_x . (1.124)
b) Show that:
(∂z/∂x)_w = (∂z/∂x)_y + (∂z/∂y)_x (∂y/∂x)_w . (1.125)

18. Let Z_gc be a grandcanonical partition function.
a) Show that:
U = −(∂ log Z_gc/∂β)_µ + τµ (∂ log Z_gc/∂µ)_β , (1.126)

where τ is the temperature, β = 1/τ, and µ is the chemical potential.
b) Show that:
⟨N⟩ = λ (∂ log Z_gc/∂λ) , (1.127)

where
λ = exp (βµ) (1.128)
is the fugacity.
19. Consider an array of N distinguishable two-level (binary) systems. The two-level energies of each system are ±ε/2. Show that the temperature τ of the system is given by
τ = ε/[2 tanh⁻¹(−2U/(Nε))] , (1.129)
where U is the average total energy of the array. Note that the temperature can become negative if U > 0. Why is a negative temperature possible for this system?
20. Consider an array of N distinguishable quantum harmonic oscillators in thermal equilibrium at temperature τ. The resonance frequency of all oscillators is ω. The quantum energy levels of each oscillator are given by
ε_n = ℏω(n + 1/2) , (1.130)

where n = 0, 1, 2, ··· is an integer.
a) Show that the average energy of the system is given by
U = (Nℏω/2) coth(βℏω/2) , (1.131)

where β = 1/τ.
b) Show that the variance of the energy of the system is given by
⟨(ΔU)²⟩ = N(ℏω/2)²/sinh²(βℏω/2) . (1.132)

21. Consider a lattice containing N non-interacting atoms. Each atom has 3 non-degenerate energy levels E₁ = −ε, E₂ = 0, E₃ = ε. The system is at thermal equilibrium at temperature τ.
a) Show that the average energy of the system is
U = −2Nε sinh(βε)/[1 + 2 cosh(βε)] , (1.133)

where β = 1/τ.
b) Show that the variance of the energy of the system is given by
⟨(U − ⟨U⟩)²⟩ = 2Nε² [cosh(βε) + 2]/[1 + 2 cosh(βε)]² . (1.134)

22. Consider a one dimensional chain containing N ≫ 1 sections (see figure). Each section can be in one of two possible states. In the first one the section contributes a length a to the total length of the chain, whereas in the other state the section has no contribution to the total length of the chain. The total length of the chain is Nα, and the tension applied to the end points of the chain is F. The system is in thermal equilibrium at temperature τ.
a) Show that α is given by
α = (a/2)[1 + tanh(Fa/(2τ))] . (1.135)

b) Show that in the limit of high temperature the spring constant is given approximately by
k ≃ 4τ/(Na²) . (1.136)
(Figure: a chain of N two-state sections whose total length is Nα.)
23. A long elastic molecule can be modelled as a linear chain of N links. The state of each link is characterized by two quantum numbers l and n. The length of a link is either l = a or l = b. The vibrational state of a link
is modelled as a harmonic oscillator whose angular frequency is ω_a for a link of length a and ω_b for a link of length b. Thus, the energy of a link is

E_{n,l} = ℏω_a(n + 1/2) for l = a , and E_{n,l} = ℏω_b(n + 1/2) for l = b , (1.137)

with n = 0, 1, 2, ···. The chain is held under a tension F. Show that the mean length ⟨L⟩ of the chain in the limit of high temperature is given by
⟨L⟩ = N(aω_b + bω_a)/(ω_b + ω_a) + NF [ω_bω_a(a − b)²/(ω_b + ω_a)²] β + O(β²) , (1.138)

where β = 1/τ.
24. The elasticity of a rubber band can be described in terms of a one-dimensional model of N polymer molecules linked together end-to-end. The angle between successive links is equally likely to be 0° or 180°. The length of each polymer is d and the total length is L. The system is in thermal equilibrium at temperature τ. Show that the force f required to maintain a length L is given by
f = (τ/d) tanh⁻¹[L/(Nd)] . (1.139)

25. Consider a system which has two single particle states both of the same energy. When both states are unoccupied, the energy of the system is zero; when one state or the other is occupied by one particle, the energy is ε. We suppose that the energy of the system is much higher (infinitely higher) when both states are occupied. Show that in thermal equilibrium at temperature τ the average number of particles in the level is
⟨N⟩ = 2/(2 + exp[β(ε − µ)]) , (1.140)
where µ is the chemical potential and β = 1/τ.
26. Consider an array of N two-level particles. Each one can be in one of two states, having energy E₁ and E₂ respectively. The numbers of particles in states 1 and 2 are n₁ and n₂ respectively, where N = n₁ + n₂ (assume n₁ ≫ 1 and n₂ ≫ 1). Consider an energy exchange with a reservoir at temperature τ leading to population changes n₂ → n₂ − 1 and n₁ → n₁ + 1.
a) Calculate the entropy change of the two-level system, (Δσ)_2LS.
b) Calculate the entropy change of the reservoir, (Δσ)_R.
c) What can be said about the relation between (Δσ)_2LS and (Δσ)_R in thermal equilibrium? Use your answer to express the ratio n₂/n₁ as a function of E₁, E₂ and τ.
27. Consider a lattice containing N sites of one type, which is denoted as A, and the same number of sites of another type, which is denoted as B. The lattice is occupied by N atoms. The number of atoms occupying sites of type A is denoted as N_A, whereas the number of atoms occupying sites of type B is denoted as N_B, where N_A + N_B = N. Let ε be the energy necessary to move an atom from a lattice site of type A to a lattice site of type B. The system is in thermal equilibrium at temperature τ. Assume that N, N_A, N_B ≫ 1.
a) Calculate the entropy σ.
b) Calculate the average number N_B of atoms occupying sites of type B.
28. Consider a microcanonical ensemble of N quantum harmonic oscillators in thermal equilibrium at temperature τ. The resonance frequency of all oscillators is ω. The quantum energy levels of each oscillator are given by
ε_n = ℏω(n + 1/2) , (1.141)

where n = 0, 1, 2, ··· is an integer. The total energy E of the system is given by
E = ℏω(m + N/2) , (1.142)

where
m = Σ_{l=1}^{N} n_l , (1.143)

and n_l is the state number of oscillator l.
a) Calculate the number of states g(N, m) of the system with total energy ℏω(m + N/2).
b) Use this result to calculate the entropy σ of the system with total energy ℏω(m + N/2). Approximate the result by assuming that N ≫ 1 and m ≫ 1.
c) Use this result to calculate (in the same limit of N ≫ 1 and m ≫ 1) the average energy of the system U as a function of the temperature τ.
29. The energy of a donor level in a semiconductor is −ε when occupied by an electron (and the energy is zero otherwise). A donor level can be either occupied by a spin up electron or a spin down electron; however, it cannot be simultaneously occupied by two electrons. Express the average occupation of a donor state ⟨N_d⟩ as a function of ε and the chemical potential µ.
1.8 Solutions Set 1
1. Final answers:
a) [N!/((N/2)!(N/2)!)] (1/2)^N
b) 0
2. Final answers:
a) (5/6)^N
b) (5/6)^{N−1}(1/6)
c) In general
Σ_{N=0}^{∞} N x^{N−1} = (d/dx) Σ_{N=0}^{∞} x^N = (d/dx)[1/(1 − x)] = 1/(1 − x)² ,

thus
N̄ = Σ_{N=0}^{∞} N (5/6)^{N−1}(1/6) = (1/6) × 1/(1 − 5/6)² = 6 .

3. Let W(m) be the probability for taking n₁ steps to the right and n₂ = N − n₁ steps to the left, where m = n₁ − n₂ and N = n₁ + n₂. Using
n₁ = (N + m)/2 ,
n₂ = (N − m)/2 ,
one finds
W(m) = N!/{[(N + m)/2]! [(N − m)/2]!} p^{(N+m)/2} q^{(N−m)/2} .

It is convenient to employ the moment generating function, defined as
φ(t) = ⟨e^{tm}⟩ .
In general, the following holds
φ(t) = Σ_{k=0}^{∞} (t^k/k!) ⟨m^k⟩ ,

thus from the kth derivative of φ(t) one can calculate the kth moment of m
⟨m^k⟩ = φ^{(k)}(0) .
Using W (m) one finds
φ(t) = Σ_{m=−N}^{N} W(m) e^{tm} = Σ_{m=−N}^{N} N!/{[(N + m)/2]! [(N − m)/2]!} p^{(N+m)/2} q^{(N−m)/2} e^{tm} ,

or, using the summation variable n₁ = (N + m)/2, one has

φ(t) = Σ_{n₁=0}^{N} N!/[n₁!(N − n₁)!] p^{n₁} q^{N−n₁} e^{t(2n₁−N)} = e^{−tN} Σ_{n₁=0}^{N} N!/[n₁!(N − n₁)!] (pe^{2t})^{n₁} q^{N−n₁} = e^{−tN}(pe^{2t} + q)^N .

Using p = q = 1/2
φ(t) = [(e^t + e^{−t})/2]^N = (cosh t)^N .

Thus using the expansion
(cosh t)^N = 1 + (1/2!) N t² + (1/4!)[N + 3N(N − 1)] t⁴ + O(t⁵) ,
one finds
⟨m⟩ = 0 , ⟨m²⟩ = N , ⟨m³⟩ = 0 , ⟨m⁴⟩ = N(3N − 2) .
4. Using the binomial distribution
W(n) = N!/[n!(N − n)!] pⁿ(1 − p)^{N−n} = [N(N − 1) × ··· × (N − n + 1)/n!] pⁿ(1 − p)^{N−n} ≃ [(Np)ⁿ/n!] exp(−Np) .
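The closed-form moments from solution 3 can be cross-checked by direct summation over the binomial weights instead of differentiating the generating function; a minimal sketch (the test values are arbitrary):

```python
from math import comb

def walk_moment(N, k, p=0.5):
    """k-th moment of the net displacement m = n1 - n2 after N steps,
    summed directly over the binomial distribution of n1."""
    q = 1 - p
    return sum(comb(N, n1) * p**n1 * q**(N - n1) * (2 * n1 - N)**k
               for n1 in range(N + 1))
```

For p = q = 1/2 this reproduces ⟨m⟩ = 0, ⟨m²⟩ = N and ⟨m⁴⟩ = N(3N − 2).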
5.
a) Σ_{n=0}^{∞} W(n) = e^{−λ} Σ_{n=0}^{∞} λⁿ/n! = 1 .
b) As in Ex. 1-6, it is convenient to use the moment generating function
φ(t) = ⟨e^{tn}⟩ = Σ_{n=0}^{∞} e^{tn} W(n) = e^{−λ} Σ_{n=0}^{∞} (λe^t)ⁿ/n! = exp[λ(e^t − 1)] .

Using the expansion
exp[λ(e^t − 1)] = 1 + λt + (1/2)λ(1 + λ)t² + O(t³) ,
one finds ⟨n⟩ = λ .
c) Using the same expansion one finds ⟨n²⟩ = λ(1 + λ), thus
⟨(Δn)²⟩ = ⟨n²⟩ − ⟨n⟩² = λ(1 + λ) − λ² = λ .
6.
⟨R²⟩ = ⟨(Σ_{n=1}^{N} r_n)²⟩ = Σ_{n=1}^{N} ⟨r_n²⟩ + Σ_{n≠m} ⟨r_n · r_m⟩ = Nl² ,

since ⟨r_n²⟩ = l² and ⟨r_n · r_m⟩ = 0 for n ≠ m.
7.
Σ_{n=11}^{20} [20!/(n!(20 − n)!)] × 0.2ⁿ × 0.8^{20−n} = 5.6 × 10⁻⁴ . (1.144)
8. Let σ_An = 1 (σ_Bn = 1) if object A (B) makes a move to the right (left) at step n, and σ_An = 0 (σ_Bn = 0) if object A (B) makes an upward (downward) move at step n. The locations (x_Am, y_Am) of object A and (x_Bm, y_Bm) of object B after m steps are given by
(x_Am, y_Am) = (S_Am, m − S_Am) , (1.145)
(x_Bm, y_Bm) = (N − S_Bm, N − m + S_Bm) , (1.146)
where
S_Am = Σ_{n=1}^{m} σ_An , (1.147)
S_Bm = Σ_{n=1}^{m} σ_Bn . (1.148)
A meeting occurs if for some m
S_Am = N − S_Bm , (1.149)
m − S_Am = N − m + S_Bm , (1.150)
i.e. if
S_Am + S_Bm = N = 2m − N . (1.151)

Thus, a meeting is possible only after m = N steps, and it occurs if S_Am + S_Bm = N. Therefore, the probability is given by
p_N = 2^{−2N} (2N)!/(N!)² . (1.152)

With the help of Stirling's formula (1.119) one finds that
p_N ≃ 1/√(πN) in the limit N → ∞ . (1.153)
9. The following holds
p₀ = 0 ,
p₁ = (5/6)p₀ + (1/6)p₂ ,
p₂ = (4/6)p₁ + (2/6)p₃ ,
p₃ = (3/6)p₂ + (3/6)p₄ ,
p₄ = (2/6)p₃ + (4/6)p₅ ,
p₅ = (1/6)p₄ + (5/6)p₆ ,
p₆ = 1 ,
and thus
p₁ = 1/32 , p₂ = 3/16 , p₃ = 1/2 , p₄ = 13/16 , p₅ = 31/32 .
10. For the case of a round table (and N > 2)
p_C = [N × 2 × (N − 2)!]/N! = 2/(N − 1) , (1.154)

and for the case of a row
p_R = [(2 + (N − 2) × 2) × (N − 2)!]/N! = 2/N . (1.155)
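The closed forms p_C = 2/(N − 1) and p_R = 2/N can be confirmed by brute-force enumeration of all N! seatings for small N; a sketch (the labels 0 and 1 stand for Alice and Bob, an illustrative convention):

```python
from itertools import permutations
from math import factorial
from fractions import Fraction

def p_adjacent(N, circular):
    """Exact probability that persons 0 and 1 occupy neighbouring seats
    when N people are seated uniformly at random in a row
    (circular=False) or around a round table (circular=True)."""
    hits = 0
    for seating in permutations(range(N)):
        a, b = seating.index(0), seating.index(1)
        d = abs(a - b)
        if d == 1 or (circular and d == N - 1):   # neighbours
            hits += 1
    return Fraction(hits, factorial(N))
```

Enumeration is feasible only for small N, but that suffices to confirm Eqs. (1.154)-(1.155).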
11. Let the binary representation of p be given by
11. Let the binary representation of p be given by

p = Σ_{m=1}^{∞} σ_m (1/2)^m , (1.156)

where σ_m ∈ {0, 1}. Let Σ_m be a sequence of random variables generated by the given computer function (i.e. Σ_m = 1 with probability 1/2 and Σ_m = 0 with probability 1/2). The proposed function has a while loop running over integer values of the variable m starting from the value m = 1. At each iteration the random variable Σ_m is generated and compared with σ_m. If σ_m = Σ_m the value of m is increased by 1, i.e. m → m + 1, and the loop continues. If σ_m ≠ Σ_m the program stops; the value 1 is returned if σ_m > Σ_m and the value 0 is returned if σ_m < Σ_m. Note that the probability that the program will never stop vanishes, even when p is irrational and/or the number of nonzero binary digits σ_m is infinite.
12. Using
N₊ + N₋ = N , (1.157a)
N₊ − N₋ = M/m , (1.157b)
one has
N₊ = (N/2)[1 + M/(mN)] , (1.158a)
N₋ = (N/2)[1 − M/(mN)] , (1.158b)
or
N₊ = (N/2)(1 + x) , (1.159a)
N₋ = (N/2)(1 − x) , (1.159b)
where
x = M/(mN) .
The number of states having total magnetization M is given by
Ω(M) = N!/(N₊! N₋!) = N!/{[(N/2)(1 + x)]! [(N/2)(1 − x)]!} . (1.160)
Since all states have equal probability one has
f(M) = Ω(M)/2^N . (1.161)
Taking the natural logarithm of Stirling's formula one finds

log N! = N log N − N + O(log N) , (1.162)

thus in the limit N ≫ 1 one has

log f = −N log 2 + N log N − (N/2)(1 + x) log[(N/2)(1 + x)] − (N/2)(1 − x) log[(N/2)(1 − x)]

(the linear terms cancel), and therefore

log f = −(N/2)[(1 + x) log(1 + x) + (1 − x) log(1 − x)] = −(N/2)[log(1 − x²) + x log((1 + x)/(1 − x))] . (1.163)
The function log f(x) has a sharp peak near x = 0, thus we can approximate it by assuming x ≪ 1. To lowest order

log(1 − x²) + x log[(1 + x)/(1 − x)] = x² + O(x³) , (1.164)

thus

f(M) = A exp[−M²/(2m²N)] , (1.165)

where A is a normalization constant, which is determined by requiring that
1 = ∫_{−∞}^{∞} f(M) dM . (1.166)
∫_{−∞}^{∞} exp(−ay²) dy = √(π/a) , (1.167)
one finds
1/A = ∫_{−∞}^{∞} exp[−M²/(2m²N)] dM = m√(2πN) , (1.168)

thus

f(M) = [1/(m√(2πN))] exp[−M²/(2m²N)] . (1.169)

The expectation value is given by
⟨M⟩ = ∫_{−∞}^{∞} M f(M) dM = 0 , (1.170)

and the variance is given by
⟨(M − ⟨M⟩)²⟩ = ⟨M²⟩ = ∫_{−∞}^{∞} M² f(M) dM = m²N . (1.171)

13. The probability to have n steps to the right is given by
W(n) = N!/[n!(N − n)!] pⁿ q^{N−n} . (1.172)

a)
⟨n⟩ = Σ_{n=0}^{N} n N!/[n!(N − n)!] pⁿ q^{N−n} = p (∂/∂p) Σ_{n=0}^{N} N!/[n!(N − n)!] pⁿ q^{N−n} = p (∂/∂p)(p + q)^N = pN(p + q)^{N−1} = pN . (1.173)
X = an − a(N − n) = a(2n − N) , (1.174)
⟨X⟩ = aN(2p − 1) = aN(p − q) . (1.175)

b)
⟨n²⟩ = Σ_{n=0}^{N} n² N!/[n!(N − n)!] pⁿ q^{N−n} = Σ_{n=0}^{N} n(n − 1) N!/[n!(N − n)!] pⁿ q^{N−n} + ⟨n⟩ = p² (∂²/∂p²)(p + q)^N + ⟨n⟩ = p²N(N − 1) + pN .

Thus
⟨(n − ⟨n⟩)²⟩ = p²N(N − 1) + pN − p²N² = Npq , (1.176)
and
⟨(X − ⟨X⟩)²⟩ = 4a²Npq . (1.177)
14. The total energy is given by
E = kx²/2 + mẋ²/2 = ka²/2 , (1.178)
where a is the amplitude of oscillations. The time period T is given by
T = 2 ∫_{−a}^{a} dx/|ẋ| = 2√(m/k) ∫_{−a}^{a} dx/√(a² − x²) = 2π√(m/k) , (1.179)
thus
f(x) = 2/(T|ẋ|) = 1/[π√(a² − x²)] . (1.180)
15. The six experiments are independent, thus
σ = 6 × [−(2/3) ln(2/3) − (1/3) ln(1/3)] = 3.8191 .
16. The random variable X obtains the value n with probability p_n = qⁿ, where n = 1, 2, 3, ···, and q = 1/2.
a) The entropy is given by
σ = −Σ_{n=1}^{∞} p_n log₂ p_n = −Σ_{n=1}^{∞} qⁿ log₂ qⁿ = Σ_{n=1}^{∞} n qⁿ .

This can be rewritten as
σ = q (∂/∂q) Σ_{n=1}^{∞} qⁿ = q (∂/∂q)[1/(1 − q) − 1] = q/(1 − q)² = 2 .
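The value σ = 2 bits can also be checked by truncated numerical summation of the entropy series; a short sketch (the cutoff n = 60 is an illustrative choice, beyond which the tail is negligible):

```python
from math import log2

# Entropy in bits of p_n = (1/2)**n, n = 1, 2, ...
# Each term is -p_n * log2(p_n) = n * 2**-n, which decays fast,
# so truncating the sum at n = 60 leaves an error below ~1e-16.
sigma = -sum(0.5 ** n * log2(0.5 ** n) for n in range(1, 61))
```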
b) A series of questions is of the form: "Was a head obtained in the 1st flip?", "Was a head obtained in the 2nd flip?", etc. The expected number of questions required to find X is
1 × (1/2) + 2 × (1/4) + 3 × (1/8) + ··· = 2 ,
which is exactly the entropy σ.
17. a) Consider an infinitesimal change in the variable z = z(x, y)
δz = (∂z/∂x)_y δx + (∂z/∂y)_x δy . (1.181)

For a process for which z is a constant δz = 0, thus
0 = (∂z/∂x)_y (δx)_z + (∂z/∂y)_x (δy)_z . (1.182)
Dividing by (δx)_z yields
(∂z/∂x)_y = −(∂z/∂y)_x (δy)_z/(δx)_z = −(∂z/∂y)_x (∂y/∂x)_z ,
thus
(∂y/∂x)_z = −(∂z/∂x)_y / (∂z/∂y)_x . (1.183)
b) Consider a process for which the variable w is kept constant. An infinitesimal change in the variable z = z (x, y) is expressed as
(δz)_w = (∂z/∂x)_y (δx)_w + (∂z/∂y)_x (δy)_w . (1.184)
Dividing by (δx)_w yields
(δz)_w/(δx)_w = (∂z/∂x)_y + (∂z/∂y)_x (δy)_w/(δx)_w , (1.185)
or
(∂z/∂x)_w = (∂z/∂x)_y + (∂z/∂y)_x (∂y/∂x)_w . (1.186)
18. We have found in class that
U = −(∂ log Z_gc/∂β)_η , (1.187)
⟨N⟩ = −(∂ log Z_gc/∂η)_β . (1.188)
a) Using relation (1.125) one has
U = −(∂ log Z_gc/∂β)_η = −(∂ log Z_gc/∂β)_µ − (∂ log Z_gc/∂µ)_β (∂µ/∂β)_η = −(∂ log Z_gc/∂β)_µ − (η/β²)(∂ log Z_gc/∂µ)_β = −(∂ log Z_gc/∂β)_µ + τµ (∂ log Z_gc/∂µ)_β . (1.189)
b) Using Eq. (1.188) one has
⟨N⟩ = −(∂ log Z_gc/∂η)_β = −(∂µ/∂η)_β (∂ log Z_gc/∂µ)_β = τ (∂ log Z_gc/∂µ)_β , (1.190)
or in terms of the fugacity λ, which is defined by
λ = exp(βµ) = e^{−η} , (1.191)
one has
⟨N⟩ = τ (∂ log Z_gc/∂µ)_β = τ (∂λ/∂µ)(∂ log Z_gc/∂λ) = λ (∂ log Z_gc/∂λ) . (1.192)
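Relations (1.190) and (1.192) can be sanity-checked numerically on a toy grand partition function; the single-orbital form Z_gc = 1 + e^{β(µ−ε)} used below is an illustrative assumption, not taken from the text:

```python
from math import exp, log

beta, eps, mu = 1.7, 0.4, 0.9                       # arbitrary test values
logZ = lambda m: log(1 + exp(beta * (m - eps)))     # toy single-orbital Z_gc

# <N> via tau * d(log Z_gc)/d(mu), Eq. (1.190), by central difference
h = 1e-6
N_mu = (1 / beta) * (logZ(mu + h) - logZ(mu - h)) / (2 * h)

# <N> via lambda * d(log Z_gc)/d(lambda), Eq. (1.192), evaluated analytically:
# log Z_gc = log(1 + lam * exp(-beta*eps)) gives the occupation below
lam = exp(beta * mu)
N_lam = lam * exp(-beta * eps) / (1 + lam * exp(-beta * eps))
```

Both routes give the same occupation number, as (1.192) requires.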
19. The canonical partition function is given by
Z_c = Z₁^N , (1.193)

where
Z₁ = exp(βε/2) + exp(−βε/2) = 2 cosh(βε/2) . (1.194)
Thus
U = −∂ log Z_c/∂β = −N ∂ log Z₁/∂β = −(Nε/2) tanh(βε/2) , (1.195)
and
τ = ε/[2 tanh⁻¹(−2U/(Nε))] . (1.196)
The negative temperature originates from our assumption that the energy of a single magnet has an upper bound. In reality this is never the case.
20. The canonical partition function is given by
Z_c = Z₁^N , (1.197)

where
Z₁ = exp(−βℏω/2) Σ_{n=0}^{∞} exp(−βℏωn) = exp(−βℏω/2)/[1 − exp(−βℏω)] = 1/[2 sinh(βℏω/2)] . (1.198)

a)
U = −∂ log Z_c/∂β = −N ∂ log Z₁/∂β = (Nℏω/2) coth(βℏω/2) . (1.199)
b)
⟨(ΔU)²⟩ = ∂² log Z_c/∂β² = N ∂² log Z₁/∂β² = N(ℏω/2)²/sinh²(βℏω/2) . (1.200)

21. The canonical partition function is given by
Z_c = [exp(βε) + 1 + exp(−βε)]^N = [1 + 2 cosh(βε)]^N , (1.201)

where β = 1/τ.
a) Thus the average energy is
U = −∂ log Z_c/∂β = −2Nε sinh(βε)/[1 + 2 cosh(βε)] , (1.202)
⟨(U − ⟨U⟩)²⟩ = ∂² log Z_c/∂β² = −∂⟨U⟩/∂β = 2Nε² [cosh(βε) + 2]/[1 + 2 cosh(βε)]² . (1.203)
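Equations (1.202) and (1.203) can be verified against a direct summation over the three levels {−ε, 0, +ε}; a sketch with arbitrary test values:

```python
from math import exp, sinh, cosh

def level_stats(N, eps, beta):
    """Average energy and variance for N independent atoms with
    levels {-eps, 0, +eps}, by direct summation over the levels."""
    E = [-eps, 0.0, eps]
    z = sum(exp(-beta * e) for e in E)
    e1 = sum(e * exp(-beta * e) for e in E) / z
    e2 = sum(e * e * exp(-beta * e) for e in E) / z
    return N * e1, N * (e2 - e1 ** 2)

N, eps, beta = 10, 1.3, 0.45
U, var = level_stats(N, eps, beta)
U_formula = -2 * N * eps * sinh(beta * eps) / (1 + 2 * cosh(beta * eps))
var_formula = 2 * N * eps**2 * (cosh(beta * eps) + 2) / (1 + 2 * cosh(beta * eps))**2
```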
22. Each section can be in one of two possible states with corresponding energies 0 and −Fa.
a) By definition, α is the mean length of each section, which is given by
α = a exp(Faβ)/[1 + exp(Faβ)] = (a/2)[1 + tanh(Faβ/2)] , (1.204)

where β = 1/τ.
b) At high temperature, Faβ ≪ 1, the length of the chain L = Nα is given by
L = (Na/2)[1 + tanh(Faβ/2)] ≃ (Na/2)(1 + Faβ/2) , (1.205)

or
F = k(L − Na/2) , (1.206)
where the spring constant k is given by
k = 4τ/(Na²) . (1.207)
23. The average length of a single link is given by
⟨l⟩ = [a exp(βFa) Σ_{n=0}^{∞} exp(−βℏω_a(n + 1/2)) + b exp(βFb) Σ_{n=0}^{∞} exp(−βℏω_b(n + 1/2))] / [exp(βFa) Σ_{n=0}^{∞} exp(−βℏω_a(n + 1/2)) + exp(βFb) Σ_{n=0}^{∞} exp(−βℏω_b(n + 1/2))] = [a exp(βFa)/(2 sinh(βℏω_a/2)) + b exp(βFb)/(2 sinh(βℏω_b/2))] / [exp(βFa)/(2 sinh(βℏω_a/2)) + exp(βFb)/(2 sinh(βℏω_b/2))] . (1.208)
To first order in β
⟨l⟩ = (aω_b + bω_a)/(ω_b + ω_a) + [Fω_bω_a(a − b)²/(ω_b + ω_a)²] β + O(β²) . (1.209)

The average total length is ⟨L⟩ = N⟨l⟩.
24. The length L is given by
L = N⟨l⟩ ,

where ⟨l⟩ is the average contribution of a single molecule to the total length, which can be either +d or −d. The probability of each possibility is determined by the Boltzmann factor. The energy change due to flipping of one link from 0° to 180° is 2fd, therefore
⟨l⟩ = d (e^{βfd} − e^{−βfd})/(e^{βfd} + e^{−βfd}) = d tanh(βfd) ,

where β = 1/τ. Thus
L = Nd tanh (βfd) ,
or
f = (τ/d) tanh⁻¹[L/(Nd)] .

25. The grand partition function Z_gc is given by
Z_gc = 1 + 2 exp[β(µ − ε)] , (1.210)

thus
⟨N⟩ = (1/β) ∂ log Z_gc/∂µ = 2/(2 + exp[β(ε − µ)]) . (1.211)
26.
a)
(Δσ)_2LS = log[N!/((n₂ − 1)!(n₁ + 1)!)] − log[N!/(n₂!n₁!)] = log[n₂/(n₁ + 1)] ≃ log(n₂/n₁) . (1.212)
b)
(Δσ)_R = (E₂ − E₁)/τ . (1.213)
c) For a small change near thermal equilibrium one expects (Δσ)_2LS + (Δσ)_R = 0, thus
n₂/n₁ = exp[−(E₂ − E₁)/τ] . (1.214)
27. The number of ways to select N_B occupied sites of type B out of N sites is N!/[N_B!(N − N_B)!]. Similarly, the number of ways to select N_B empty sites of type A out of N sites is N!/[N_B!(N − N_B)!].
a) Thus
σ = log {N!/[N_B!(N − N_B)!]}² ≃ 2[N log N − N_B log N_B − (N − N_B) log(N − N_B)] . (1.215)
b) The energy of the system is given by U = NBε. Thus, the Helmholtz free energy is given by
F = U − τσ = U − 2τ[N log N − (U/ε) log(U/ε) − (N − U/ε) log(N − U/ε)] . (1.216)
At thermal equilibrium (∂F/∂U)_τ = 0, thus
0 = ∂F/∂U = 1 + (2τ/ε)[log(U/ε) − log(N − U/ε)] , (1.217)
or
(N − N_B)/N_B = exp[ε/(2τ)] , (1.218)
therefore
N_B = N/[1 + exp(ε/(2τ))] .
Alternatively, one can calculate the chemical potential from the requirement
1 = N_A/N + N_B/N , (1.219)
where
N_A/N = exp(βµ)/[1 + exp(βµ)] , (1.220a)
N_B/N = exp(βµ − βε)/[1 + exp(βµ − βε)] , (1.220b)
which is satisfied when
µ = ε/2 , (1.221)
thus
N_B = N/[1 + exp(ε/(2τ))] . (1.222)
28. In general,
g(N, m) = #{ways to distribute m identical balls in N boxes} .
Moreover,
#{ways to distribute m identical balls in N boxes} = #{ways to arrange m identical balls and N − 1 identical partitions in a line} .
a) Therefore
g(N, m) = (N − 1 + m)!/[(N − 1)! m!] . (1.223)
b) The entropy is given by
σ = log {(N − 1 + m)!/[(N − 1)! m!]} ≃ (N + m) log(N + m) − N log N − m log m , (1.224)
or in terms of the total energy E = ℏω(m + N/2):

σ = (E/ℏω + N/2) log(E/ℏω + N/2) − N log N − (E/ℏω − N/2) log(E/ℏω − N/2) . (1.225)
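The counting formula (1.223) can be verified by brute-force enumeration of the oscillator occupation numbers for small N and m; a sketch (the exhaustive search is feasible only for tiny systems, which is enough for a check):

```python
from itertools import product
from math import comb

def g_brute(N, m):
    """Count N-tuples (n_1, ..., n_N) of non-negative integers with
    sum m, i.e. the number of microstates with total energy
    hbar*omega*(m + N/2)."""
    return sum(1 for ns in product(range(m + 1), repeat=N) if sum(ns) == m)
```

The result matches the stars-and-bars count (N − 1 + m)!/[(N − 1)! m!].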
c) The temperature τ is given by
1/τ = ∂σ/∂E = (1/ℏω) log(E/ℏω + N/2) − (1/ℏω) log(E/ℏω − N/2) = (1/ℏω) log[(2E + Nℏω)/(2E − Nℏω)] . (1.226)
In the thermodynamical limit (N 1, m 1) the energy E and its average U are indistinguishable, thus≫ ≫
exp(ℏω/τ) = (2U + Nℏω)/(2U − Nℏω) , (1.227)

or
U = (Nℏω/2) coth[ℏω/(2τ)] . (1.228)
29. The grand canonical partition function is given by
ζ = 1 + 2λ exp (βε) , (1.229)
where λ = exp (βµ) is the fugacity, thus
⟨N_d⟩ = λ ∂ log ζ/∂λ = 2λe^{βε}/(1 + 2λe^{βε}) = 1/[1 + (1/2)e^{−β(ε+µ)}] . (1.230)
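Equation (1.230) can be checked against a direct evaluation of the three-state Gibbs sum {empty, spin-up, spin-down}; the numerical values below are arbitrary:

```python
from math import exp

def donor_occupation(eps, mu, beta):
    """Average occupation of a donor level from the three-state Gibbs
    sum: one empty state (E = 0, N = 0) and two occupied states
    (E = -eps, N = 1), cf. Eqs. (1.229)-(1.230)."""
    w_occ = 2 * exp(beta * (mu + eps))   # weight of the occupied states
    return w_occ / (1 + w_occ)

# closed form of Eq. (1.230)
eps, mu, beta = 0.3, 0.1, 2.0
closed = 1 / (1 + 0.5 * exp(-beta * (eps + mu)))
```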
2. Ideal Gas
In this chapter we study some basic properties of an ideal gas of massive identical particles. We start by considering a single particle in a box. We then discuss the statistical properties of an ensemble of identical indistinguishable particles and introduce the concepts of Fermions and Bosons. In the rest of this chapter we mainly focus on the classical limit. For this case we derive expressions for the pressure, heat capacity, energy and entropy, and discuss how internal degrees of freedom may modify these results. In the last part of this chapter we discuss an example of a heat engine based on an ideal gas (the Carnot heat engine). We show that the efficiency of such a heat engine, which employs a reversible process, obtains the largest possible value that is allowed by the second law of thermodynamics.
2.1 A Particle in a Box
Consider a particle having mass M in a box. For simplicity the box is assumed to have a cube shape with a volume V = L³. The corresponding potential energy is given by

V(x, y, z) = 0 for 0 ≤ x, y, z ≤ L , and V(x, y, z) = ∞ elsewhere . (2.1)

The quantum eigenstates and eigenenergies are determined by requiring that the wavefunction ψ(x, y, z) satisfies the Schrödinger equation

−(ℏ²/2M)(∂²ψ/∂x² + ∂²ψ/∂y² + ∂²ψ/∂z²) + Vψ = Eψ . (2.2)

In addition, we require that the wavefunction ψ vanishes on the surfaces of the box. The normalized solutions are given by
ψ_{n_x,n_y,n_z}(x, y, z) = (2/L)^{3/2} sin(n_xπx/L) sin(n_yπy/L) sin(n_zπz/L) , (2.3)

where
n_x, n_y, n_z = 1, 2, 3, ··· (2.4)
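The normalization factor (2/L)^{3/2} in Eq. (2.3) can be checked numerically, since the 3D integral of |ψ|² factorizes into three one-dimensional integrals; a sketch using a simple midpoint rule (the grid size is an illustrative choice):

```python
from math import pi, sin

def norm_1d(n, L, M=10000):
    """Midpoint-rule integral of (2/L) * sin(n*pi*x/L)**2 over [0, L].
    It should equal 1 for every quantum number n; the full 3D
    normalization of Eq. (2.3) is the product of three such factors."""
    dx = L / M
    return sum(2 / L * sin(n * pi * (i + 0.5) * dx / L) ** 2
               for i in range(M)) * dx
```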
The corresponding eigenenergies are given by