
Physics 309 Lecture 5

1 Review

Discuss Quiz 1. Assumptions for . Use of equipartition and degrees of freedom. Assumptions needed to write work as −∫P dV. Definitions of microstate, macrostate, and multiplicity. Example: the Einstein solid, N harmonic oscillators with total energy qhf, i.e., q units of energy. We found that the number of microstates for a macrostate with q units of energy divided among N oscillators is

Ω(q) = \binom{q + N − 1}{q}    (1)

If we assume that all microstates are equally probable, then the probability of a given macrostate is p(a) = Ω(a)/Ω(all). We're going to use this assumption, the fundamental assumption of statistical mechanics, throughout. It means that over time a system explores all of its microstates, so we'll almost always find it in the most probable macrostate.
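As a quick numerical check of Eq. (1), here's a short sketch (mine, not part of the lecture) that tabulates Ω(q) for a small solid; the choice N = 3 anticipates the two-solid example in the next section.

```python
from math import comb

def omega(N, q):
    """Multiplicity of an Einstein solid: N oscillators sharing q energy units, Eq. (1)."""
    return comb(q + N - 1, q)

N = 3
for q in range(7):
    print(f"q = {q}: Omega = {omega(N, q)}")
# Prints 1, 3, 6, 10, 15, 21, 28 -- the single-solid entries in the table below.
```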

2 Interacting systems

Let's imagine two Einstein solids, A and B, with N_A and N_B oscillators respectively, sharing q = q_A + q_B units of energy.

The multiplicity Ω of the whole system is the product of the multiplicities of the individual systems, Ω_A and Ω_B. It's easy enough to calculate this for small q and N, e.g., N_A = N_B = 3 and q = 6:

q_A   Ω_A   q_B   Ω_B   Ω = Ω_A Ω_B
 0      1    6     28        28
 1      3    5     21        63
 2      6    4     15        90
 3     10    3     10       100
 4     15    2      6        90
 5     21    1      3        63
 6     28    0      1        28
                   total:   462

So if we start with all the energy in A, i.e., q_A = 6, and let the system exchange energy at random, we'll most likely end up in the macrostate with q_A = 3: energy flowed from the more energetic solid to the less energetic one, just based on probability.
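The table is easy to reproduce; here's a sketch (again mine, not from the notes) that also confirms the total of 462 and the peak at q_A = 3:

```python
from math import comb

def omega(N, q):
    """Multiplicity of an Einstein solid with N oscillators and q energy units."""
    return comb(q + N - 1, q)

NA = NB = 3
q = 6

total = 0
for qA in range(q + 1):
    qB = q - qA
    combined = omega(NA, qA) * omega(NB, qB)
    total += combined
    print(f"qA={qA}  Omega_A={omega(NA, qA):3d}  qB={qB}  Omega_B={omega(NB, qB):3d}  "
          f"combined={combined:4d}")

print("total =", total)                          # 462
print("check =", comb(q + NA + NB - 1, q))       # multiplicity of a single 6-oscillator solid
```

The last line checks that the column sum equals the multiplicity of a single solid with N_A + N_B oscillators sharing q units, as it must.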

3 Large systems

3.1 Einstein solid

For the small system it was merely more likely that energy would flow "downhill", but we want to consider real systems with ~10^{23} particles, where we'll find that it's overwhelmingly likely. A useful mathematical tool is Stirling's approximation for the factorial of large numbers:

N! ≈ N^N e^{−N} √(2πN)    (2)

It's mostly used in log form, without the extra √(2πN) factor:

ln N! ≈ N ln N − N (3)
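Here's a quick numerical check of (2) and (3) against the exact factorial (my own sketch, not part of the notes):

```python
import math

def stirling_full(N):
    """Stirling's approximation with the sqrt(2*pi*N) factor, Eq. (2)."""
    return N**N * math.exp(-N) * math.sqrt(2 * math.pi * N)

def stirling_log(N):
    """Log form without the sqrt(2*pi*N) factor, Eq. (3)."""
    return N * math.log(N) - N

for N in (10, 100):
    exact = math.lgamma(N + 1)      # ln(N!)
    err_full = abs(math.log(stirling_full(N)) - exact) / exact
    err_log = abs(stirling_log(N) - exact) / exact
    print(f"N = {N:3d}: rel. error of ln Eq.(2) = {err_full:.1e}, of Eq.(3) = {err_log:.1e}")
```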

Eq. (3) is good to about 10% when N = 10 and to about 1% when N = 100. We'll also recall the Taylor expansion ln(1 + x) ≈ x for x ≪ 1. So let's go back to our single Einstein solid and see how its multiplicity behaves when N and q are large:

Ω(N, q) = (q + N − 1)! / [q! (N − 1)!]    (4)
         ≈ (q + N)! / (q! N!)    (5)

Take the log and use Stirling's approximation:

ln Ω = ln[(q + N)! / (q! N!)]    (7)
     ≈ (q + N) ln(q + N) − (q + N) − q ln q + q − N ln N + N    (8)
     = (q + N) ln(q + N) − q ln q − N ln N    (9)

Let's look at the first term in the "high-temperature" limit when q ≫ N:

ln(q + N) = ln[q(1 + N/q)]    (11)
          = ln q + ln(1 + N/q)    (12)
          ≈ ln q + N/q    (13)

so that

ln Ω ≈ N ln(q/N) + N + N²/q    (15)

and, since N²/q ≪ N in this limit, we can drop the last term to get

Ω(N, q) ≈ (eq/N)^N    (16)

Very strong dependence on N.
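A quick check of (16) against the exact multiplicity in the q ≫ N limit (my sketch; the parameter choices are arbitrary), working with logarithms since the numbers themselves are astronomical:

```python
import math

def ln_omega_exact(N, q):
    """Exact ln of the Einstein-solid multiplicity, ln[(q + N - 1)! / (q! (N - 1)!)]."""
    return math.lgamma(q + N) - math.lgamma(q + 1) - math.lgamma(N)

def ln_omega_highT(N, q):
    """ln of the high-temperature approximation (eq/N)^N, i.e. N ln(q/N) + N."""
    return N * math.log(q / N) + N

for N, q in [(100, 10_000), (100, 1_000_000), (1_000, 100_000_000)]:
    print(f"N = {N}, q = {q}: exact ln Omega = {ln_omega_exact(N, q):.1f}, "
          f"approx = {ln_omega_highT(N, q):.1f}")
```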

Now we can get back to the two-solid system, with N oscillators each. The total multiplicity is the product of the multiplicities:

Ω = (e/N)^{2N} (q_A q_B)^N    (18)

We can differentiate this wrt q_A to find that the peak is at q_A = q/2. Let's see what Ω looks like near the peak. Write q_A = q/2 + x and q_B = q/2 − x to get

Ω = (e/N)^{2N} [(q/2)² − x²]^N    (19)

With a little algebra and more approximation we find that

N ln[(q/2)² − x²] ≈ N [ln (q/2)² − (2x/q)²]    (20)

so that

Ω ∝ exp[−N (2x/q)²]    (21)

Congratulations! It's a Gaussian! It falls off to 1/e of the peak when N(2x/q)² = 1, i.e., x = q/(2√N). So the fractional width is 1/√N: one part in ten billion if N = 10^{20}! For real systems the fluctuations are completely negligible.
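The Gaussian shape and the q/(2√N) width can be confirmed directly from the exact multiplicities; here's a sketch (mine), with N and q chosen so that q ≫ N:

```python
import math

def ln_omega(N, q):
    """Exact ln of the Einstein-solid multiplicity for N oscillators and q energy units."""
    return math.lgamma(q + N) - math.lgamma(q + 1) - math.lgamma(N)

N = 100                                   # oscillators in each solid
q = 100_000                               # total energy units shared by the two solids
peak = 2 * ln_omega(N, q // 2)            # ln Omega_total at qA = q/2
width = round(q / (2 * math.sqrt(N)))     # predicted 1/e half-width, q / (2 sqrt(N))

for x in (0, width, 2 * width):
    qA = q // 2 + x
    drop = peak - (ln_omega(N, qA) + ln_omega(N, q - qA))
    print(f"x = {x:6d}: ln(Omega_peak / Omega) = {drop:.3f}")
# For a Gaussian of the predicted width the drop should be near 1 at x = width
# and near 4 at x = 2 * width.
```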

3.2 The ideal gas

Let's sketch out a similar analysis for the monatomic ideal gas. For one atom, the number of microstates must depend on the physical volume and on the volume in momentum space: Ω_1 ∝ V V_p. At fixed energy U = p²/2m, we have 2mU = p_x² + p_y² + p_z², which defines the surface of a sphere in momentum space. From QM we have ∆x ∆p_x = h in each dimension, and there are three dimensions, so

Ω_1 = V V_p / h³    (22)

For N particles, the constraint in momentum space becomes ∑_i p_i² = 2mU (the sum running over all 3N momentum components), so we need to know how to calculate the surface area of a momentum hypersphere in higher dimensions. We multiply the multiplicities for each particle together, but that overcounts by a factor of N! if the particles are indistinguishable. So we get:

Ω_N = (1/N!) × (V^N / h^{3N}) × (area of momentum hypersphere)    (23)

The surface area in d dimensions has to go like r^{d−1}, with r = √(2mU) and d = 3N, so for large N

Ω_N = f(N) V^N U^{3N/2}    (24)

For two interacting ideal gases with N particles each,

Ω_total = [f(N)]² (V_A V_B)^N (U_A U_B)^{3N/2}    (25)

The same treatment we used before gives peak widths in U and in V proportional to 1/√N, i.e., very narrow.
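To see just how narrow that peak is for the energy split in (25), here's a small sketch (mine, with an illustrative macroscopic N) that evaluates the suppression of (U_A U_B)^{3N/2} relative to its peak when U_A deviates from U/2 by a given fraction of U:

```python
import math

N = 1e20   # particles in each gas (illustrative macroscopic value)

def ln_suppression(delta, N=N):
    """ln(Omega_peak / Omega) for U_A = U/2 + delta*U.

    From (U_A U_B)^(3N/2) with U_A U_B = (U/2)^2 - (delta*U)^2, the drop relative
    to the peak is -(3N/2) * ln(1 - (2*delta)^2); log1p keeps precision for tiny delta.
    """
    return -(3 * N / 2) * math.log1p(-(2 * delta) ** 2)

for delta in (1e-10, 1e-8, 1e-6):
    print(f"U_A off by a fraction {delta:.0e} of U: multiplicity down by "
          f"a factor exp({ln_suppression(delta):.3g})")
# Even a part-per-million energy imbalance is suppressed by a factor of about exp(6e8):
# fluctuations much bigger than ~U/sqrt(N) simply never happen.
```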

4 Entropy

We can state what we've seen more generally as the second law of thermodynamics: any large system in equilibrium will be found in the macrostate with the greatest multiplicity. We more often deal with the entropy, defined as

S = k ln Ω    (26)

a measure of how "spread out" the energy is. Disconnected observations:

• Since entropy is a monotonic function of multiplicity, our statement of the 2nd law implies that any large system in equilibrium will be found in the state with the largest entropy.

• Ω is dimensionless, and we’d expect S to be dimensionless too, but for hysterical raisins that will become clear, it’s got that extra factor of k and thus units J/K.

• Once you know the macrostate, you know Ω, so entropy is a function of state.

• Multiplicities multiply, so entropies add.

Discuss entropy in terms of the connected Einstein solids: the equal-energy configuration is the highest-multiplicity state, therefore the highest-entropy state, and therefore the one we find the system in. We can decrease entropy in one place, but there will always be a greater increase in entropy elsewhere. Discuss Maxwell's demon in terms of our two Einstein solids: even Maxwell's demon can't do it, because of the entropy increase associated with storing information.

For an Einstein solid, Ω = (eq/N)^N, so S = Nk [ln(q/N) + 1]. For the ideal gas, if we put back in all the constants in Ω(U, V, N) and take the log, we get the Sackur-Tetrode equation

S = Nk { ln[ (V/N) (4πmU / (3Nh²))^{3/2} ] + 5/2 }    (27)

Lots of boring constants, but we care about the dependence on the variables V, U, and N. What does it tell us about entropy when these variables change? If we expand from V_i to V_f while keeping N and U (and thus T) constant, then ∆S = Nk ln(V_f/V_i). This is an isothermal expansion, so heat must be supplied to keep T constant; the added heat caused the increase in S. But in a free expansion we can increase S without adding heat, and the ∆S must be the same because S is a function of state.
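As a concrete illustration of (27) and of ∆S = Nk ln(V_f/V_i), here's a sketch of mine; the choice of one mole of helium at 300 K and atmospheric pressure is an assumption for the example, not something from the notes:

```python
import math

k = 1.380649e-23        # Boltzmann constant, J/K
h = 6.62607015e-34      # Planck constant, J*s
N_A = 6.02214076e23     # Avogadro's number

def sackur_tetrode(N, V, U, m):
    """Entropy of a monatomic ideal gas, Eq. (27)."""
    return N * k * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

# One mole of helium at room temperature and atmospheric pressure (example values).
N = N_A
T = 300.0                       # K
P = 101325.0                    # Pa
V = N * k * T / P               # ideal gas law
U = 1.5 * N * k * T             # monatomic gas: U = (3/2) N k T
m = 4.0026 * 1.66054e-27        # mass of a helium atom, kg

S = sackur_tetrode(N, V, U, m)
print(f"S = {S:.1f} J/K")       # comes out near 126 J/K

# Isothermal doubling of the volume at fixed N and U:
dS = sackur_tetrode(N, 2 * V, U, m) - S
print(f"dS = {dS:.2f} J/K   (compare N k ln 2 = {N * k * math.log(2):.2f} J/K)")
```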

4.1 Entropy of mixing and Gibbs' paradox

Separate two different species of gas with a partition and then remove the partition. They mix: each gas expands into volume 2V, so the entropy of each increases by Nk ln 2, and the total increase in entropy is 2Nk ln 2. But what if the gas on each side is the same? Then we can just use the Sackur-Tetrode equation and see what happens when we make the replacement N → 2N and U → 2U with V staying the same. We end up with S_new = S_A + S_B − 2Nk ln 2, because of the N under the V in the S-T equation. That N came from the 1/N! in the multiplicity, which was put there to account for the fact that gas molecules are indistinguishable.
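Here's a sketch (mine, with arbitrary example values) that checks both statements numerically, reading S_A and S_B as the entropies each gas would have on its own in the full doubled volume: distinct species pick up the 2Nk ln 2 of mixing entropy, while the same gas ends up at S_A + S_B − 2Nk ln 2, which is exactly the entropy it had before the partition was removed.

```python
import math

k = 1.380649e-23        # Boltzmann constant, J/K
h = 6.62607015e-34      # Planck constant, J*s

def sackur_tetrode(N, V, U, m):
    """Entropy of a monatomic ideal gas, Eq. (27)."""
    return N * k * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h**2))**1.5) + 2.5)

# Example values: N helium atoms with energy U in volume V on each side of the partition.
N = 6.022e23
V = 0.0224                       # m^3
U = 1.5 * N * k * 300.0          # J
m = 4.0026 * 1.66054e-27         # kg

S_before      = 2 * sackur_tetrode(N, V, U, m)          # two separate samples, volume V each
S_if_distinct = 2 * sackur_tetrode(N, 2 * V, U, m)      # each species spread over 2V
S_same_gas    = sackur_tetrode(2 * N, 2 * V, 2 * U, m)  # one gas: N -> 2N, U -> 2U in 2V

print(f"two distinct gases: dS = {S_if_distinct - S_before:.2f} J/K "
      f"(2 N k ln 2 = {2 * N * k * math.log(2):.2f} J/K)")
print(f"same gas:           dS = {S_same_gas - S_before:.2e} J/K (zero up to rounding)")
print("S_same_gas == S_A + S_B - 2 N k ln 2:",
      math.isclose(S_same_gas, S_if_distinct - 2 * N * k * math.log(2)))
```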

4.2 For next time

HW: problems 2.26, 2.27, 2.33, due Sep 15.

Read Sections 3.1 and 3.2 on the relationship of entropy with temperature and with heat.
