Part II Thermal & Statistical Physics Eugene Terentjev rm.245 Bragg

Relevant websites: http://www-teach.phy.cam.ac.uk/teaching/webpages.php http://www.bss.phy.cam.ac.uk/~emt1000/statphys.html

Michaelmas 2012

Energy
An over-arching concept, and an often-misused one. Physical “definition”: there exists a certain scalar quantity that does not change in all the possible transformations that Nature undergoes.

This is essentially an abstract mathematical constraint, but we really can’t tell what energy “is”…

Potential Energy Kinetic Energy

Mass Energy Thermal Energy

Dissipation of energy
In most areas of physics we consider only situations where the total energy of a given system is conserved. The forces may have done some work, but it was possible to convert it back into potential energy.

In many situations, however, some of the work will be “lost”. Where does the energy go, if there is a universal law of conservation of energy? We say it converts into “heat”. One often thinks of “heat” as just the kinetic energy of many particles, but since there are so many of them (about 10^23 in a spoonful of water) we have no hope of harvesting this energy back… so it is considered “lost”.

Dissipation of energy
This loss of energy into heat is called “energy dissipation”. It is the result of wet friction (or “dissipative friction”), which in most cases is a force proportional, and opposite in direction, to the velocity:

F_friction = −γ v

Compare this with dry friction (resistance), whose magnitude is proportional to the normal load N:

F_friction = µN

The force of friction is directly proportional to the applied load (Amontons’ 1st Law). The force of friction is independent of the apparent area of contact (Amontons’ 2nd Law). The force of friction is independent of the sliding velocity (Coulomb’s Law).

Rate of dissipation
Let us calculate the power loss in an oscillator (an example):

m ẍ = −k x − γ ẋ + F_ext(t)

P = F_ext(t) dx/dt = ( m d²x/dt² + k x + γ dx/dt ) dx/dt

P = F_ext · v = d/dt [ m v²/2 + k x²/2 ] + γ (dx/dt)²

Which means: you have to spend some external power to keep a particle moving against friction! If you don’t (P = 0), then the rate of energy loss is:

d/dt [Total Energy] = −γ v²  (“loss of energy” → heat)
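The energy balance above can be checked numerically. A minimal sketch (arbitrary units, parameter values chosen purely for illustration): integrate the damped oscillator with no external force and compare the energy actually lost with the accumulated friction integral ∫γv² dt.

```python
# Numerical sketch (arbitrary units): integrate m x'' = -k x - gamma x'
# with no external force, and check that the total energy
# E = m v^2/2 + k x^2/2 decays at the rate dE/dt = -gamma v^2.

m, k, gamma = 1.0, 4.0, 0.1     # illustrative parameters
x, v = 1.0, 0.0                 # start displaced, at rest
dt, steps = 1e-4, 200_000       # integrate to t = 20

def energy(x, v):
    return 0.5 * m * v**2 + 0.5 * k * x**2

E0 = energy(x, v)
dissipated = 0.0                # accumulates gamma * v^2 * dt
for _ in range(steps):
    a = (-k * x - gamma * v) / m
    # semi-implicit Euler keeps the oscillator stable over many periods
    v += a * dt
    x += v * dt
    dissipated += gamma * v**2 * dt

E1 = energy(x, v)
print(E0 - E1, dissipated)      # energy lost vs. friction integral
```

The two printed numbers agree to the accuracy of the time step, confirming that all the mechanical energy lost goes into the friction term.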

Work and Heat
The First Law of thermodynamics states: “Energy is conserved when heat is taken into account.” In more quantitative terms, any change of the internal energy of a system (which we now call U) is the sum of the work done on it and the heat supplied to it:

∆U = ∆W + ∆Q

If you remember, the work (interpreted as the change of P.E., to be specific) was determined as ∆W = −F·∆x. It is common to start the subject of thermodynamics with the analysis of gases, so the force is what acts on a piston of area A, and ∆W = −(F/A)·(A ∆l) = −P ∆V. So:

∆U = ∆Q − P ∆V

Ideal gas: equation of state
Boyle (1662): at fixed temperature, PV = constant

Charles (1787): at fixed mass (and pressure) of gas, V = V_0 + constant·(T − T_0). Extrapolating this linear relation, scientists very early determined a “zero temperature”, at which V = 0, which in today’s Celsius scale is at T = −273 °C. A similar linear P-T relationship was also known (Amontons 1702), and so this required

PV = constant·T

Finally, Avogadro (1811→1860) determined the constant, which had to be proportional to the amount of gas. He introduced the mol:

PV = nRT

A mol is the mass in grams of a substance equal to its molecular weight. Each mol contains exactly the same number of particles, N_A = 6×10^23!

The total number of particles is N = n N_A

Heat capacity
One of the basic properties of all forms of matter in response to heat is the heat capacity:

∆Q = C·∆T, i.e. C = dQ/dT

The heat capacity describes the change in temperature of a substance when it is supplied with a given amount of heat. But most materials (especially gases) can also expand on heating. So we must specify the conditions in which the heat capacity is measured, e.g. at constant volume or at constant pressure:

C_V = (dQ/dT)_V    C_p = (dQ/dT)_p

The difference between them is the work (−P dV) on expanding the body:

V = constant: dQ = dU = C_V dT

P = constant: dQ = dU + P dV = C_V dT + nR dT = (C_V + nR) dT

Temperature and Entropy
Entropy (Clausius 1850, Gibbs, Boltzmann) is a measure of the unavailability of a system’s energy to do work (see the 1st Law, ∆U = ∆W + ∆Q). In simple terms, the entropy S is the “heat per unit temperature”:

∆S = ∆Q/T, or dQ = T·dS

However, the modern (primary) definition of entropy is from statistical principles. This is often called the “Boltzmann entropy” and is defined as

S = k_B ln Ω

where Ω is the number of states the system can explore. The Second Law of thermodynamics states that the entropy of an isolated system can only increase with time (∆S ≥ 0) and approaches its maximal value in equilibrium. → Heat cannot spontaneously flow from a material at lower temperature to a material at higher temperature.

Variable particle number
Consider a gas in a box, and change the particle number (N → N + dN) while keeping the volume constant (dV = 0). If this addition is done reversibly, that is, not creating heat (dQ = T dS = 0), then the increase in the energy of this gas is

∆U = µ·∆N

This defines the chemical potential:

µ = (∂U/∂N)_{V,S}

In summary, for an arbitrary process capable of changing the volume of the gas (i.e. doing mechanical work), converting some of the energy into heat, and also exchanging particles, we have the energy increment:

dU = T dS − p dV + µ dN

Other sources of energy
A “paramagnetic salt” is a model system where each atom is fixed on a lattice and carries a magnetic moment (spin 1/2), independently of its neighbours. Without an external field the up and down spin energy levels are the same (ε_↑ = ε_↓ = 0), but a magnetic field splits the degeneracy: ε_↑ = mB, ε_↓ = −mB. Then

U = −M·B = (N_↑ − N_↓) mB    and    dU = T dS − M·dB

Please watch the dimensionality… People (and textbooks) are often flippant about the difference between “energy” and “energy density”, and in this case between the “magnetic moment” and the “magnetisation”…

Other sources of energy
The “simple harmonic oscillator” is a workhorse of physics (this is because any potential energy is quadratic near equilibrium).

For the potential V(x) = ½ a x², the nth energy level of the excited oscillation has the value

ε_n = ħω (n + ½)

where the natural frequency is ω = √(a/m). The total energy of an assembly of such oscillators, with equal gaps ∆E = ħω between levels, is then

U = Σ_i ħω (n_i + ½)

Other sources of energy
The Van der Waals gas is an empirical model system that takes into account the pair interactions between the particles of a gas: short-range repulsion and long-range attraction in the pair potential V(r):

( p + N²a/V² )(V − Nb) = N k_B T

or p = N k_B T/(V − Nb) − N²a/V²

We shall use this model in several contexts in this course. Here let us just note that with a potential like this, energy can be stored in kinetic or potential form, and one can convert kinetic ↔ potential energy, for example, by Joule expansion.

Equation of state
The equation of state is a relationship between (p, V, T), for instance:

pV = N k_B T   or   p = N k_B T/(V − Nb) − N²a/V²   (but not pV^γ = const, which describes a process, not the gas itself)

Such a relationship describes a surface in the 3D (p, V, T) space!
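A small sketch comparing the two equations of state quoted on this slide. The interaction parameters a and b below are hypothetical round numbers, not fitted to any real gas; in the dilute limit the van der Waals pressure should reduce to the ideal one.

```python
# Ideal gas p = N kB T / V versus the van der Waals form
# p = N kB T/(V - N b) - N^2 a / V^2 (a, b are per-particle constants).
kB = 1.380649e-23  # J/K

def p_ideal(N, V, T):
    return N * kB * T / V

def p_vdw(N, V, T, a, b):
    return N * kB * T / (V - N * b) - N**2 * a / V**2

# dilute limit: the two agree when V >> N*b and the attraction is weak
N, T = 1e20, 300.0
a, b = 1e-48, 1e-28     # hypothetical interaction parameters (SI units)
V = 1e-3                # one litre
print(p_ideal(N, V, T), p_vdw(N, V, T, a, b))
```

At this density the corrections are tiny; lowering V or raising N makes the two curves separate, which is exactly the regime used later for the gas-liquid transition.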

Thermodynamic variables
Thermodynamic variables are observable properties of any system. They fall into two categories, intensive and extensive:

System             Intensive                Extensive
Linear spring      F  force                 x  displacement
Gas                p  pressure              V  volume
Particle exchange  µ  chemical potential    N  number
Surface film       γ  surface tension       A  area
Electrostatic      V  potential             q  charge
Magnetic           B  “magnetic field”      M  “magnetic moment”
…any…              T  temperature           S  entropy

Intensive and extensive variables form conjugate pairs, whose product has the dimensionality of (and therefore represents) energy, e.g.

∆U = ∆Q + ∆W = T·∆S − P·∆V

Other forms of work: ∆W = −F·∆x, ∆W = γ·∆A, etc.

Thermodynamic potentials
We have just seen, for the mean internal energy of the system:

∆U = T·∆S − p·∆V, or dU = T·dS − p·dV

This means that (S, V) are the natural variables of U(S, V): to determine the value of U we must measure S and V (or “keep them under control”). But in some situations (e.g. in an open container) it may be hard to control or measure the volume; instead the pressure p would be a more natural variable. Introduce the enthalpy, H = U + pV. Then

dH = dU + d(pV) = T dS − p dV + p dV + V dp = T dS + V dp

We conclude that H = H(S, p), with the conjugate pair (p, V) “switched”: now V plays the role of a “force” while p is the “displacement”.

Thermodynamic potentials
What we have just seen, U(S,V) → H(S,p), is one example of a general “Legendre transformation” switching between the “force” and the “displacement” within a pair of thermodynamic variables. In many situations it may be hard to control or measure the entropy; instead the temperature T is a more natural variable. Introduce the free energy, F = U − TS. Then

dF = dU − d(TS) = T dS − p dV − T dS − S dT = −S dT − p dV

We conclude that F = F(T, V). In the same way we may introduce the Gibbs free energy, G = H − TS = F + pV:

dG = dF + d(pV) = −S dT − p dV + p dV + V dp = −S dT + V dp

Thermodynamic derivatives
Thermodynamic potentials are different forms of energy, expressed in the appropriate “natural variables”. We have just seen four: U(S,V) → H(S,p) → G(T,p) → F(T,V).

dU = T dS − p dV    dH = T dS + V dp    dG = −S dT + V dp    dF = −S dT − p dV

This means that we can take partial derivatives, in each case determining the corresponding thermodynamic force, e.g.

T = (∂U/∂S)_V = (∂H/∂S)_p    p = −(∂U/∂V)_S = −(∂F/∂V)_T

What will happen if we take a derivative with respect to a “wrong” variable? e.g.

(∂H/∂V)_S = (T dS + V dp)/dV at constant S = V (∂p/∂V)_S

(∂U/∂T)_V = T (∂S/∂T)_V = C_V, so dS = (C_V/T) dT at constant V

Maxwell relations
Thermodynamic potentials are different forms of energy, expressed in the appropriate “natural variables”. We have seen four: U(S,V) → H(S,p) → G(T,p) → F(T,V).

dU = T dS − p dV    dH = T dS + V dp    dG = −S dT + V dp    dF = −S dT − p dV

One can evaluate a second derivative in two ways, e.g.

∂²U/∂S∂V:  (∂T/∂V)_S = −(∂p/∂S)_V, one of the Maxwell relations

(∂S/∂p)_T = −(∂V/∂T)_p    (∂S/∂V)_T = (∂p/∂T)_V    (∂T/∂p)_S = (∂V/∂S)_p

Do you notice the pattern? 1) Which two variables → 2) Which potential → 3) What sign?

Analytic Methods
Chain rule: (∂x/∂y)_z = (∂x/∂u)_z (∂u/∂y)_z

Maxwell relations: for a potential Π(x, y), the equality of mixed second derivatives gives (∂X/∂y)_x = ±(∂Y/∂x)_y, where (X, x) and (Y, y) are conjugate pairs.

Reciprocity theorem: (∂x/∂y)_z = −(∂x/∂z)_y (∂z/∂y)_x

Entropy of an ideal gas. Suppose S = S(p, T):

dS = (∂S/∂T)_p dT + (∂S/∂p)_T dp = C_p dT/T − (∂V/∂T)_p dp = C_p dT/T − N k_B dp/p

Now integrate:

S = C_p ln T − N k_B ln p + const

= N s_0 + N k_B [ ln T^{5/2} − ln( N k_B T/V ) ] = N k_B ln( const · V T^{3/2}/N )

(using C_p = (5/2) N k_B for a monatomic gas and p = N k_B T/V)
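The integration above relies on the Maxwell relation (∂S/∂p)_T = −(∂V/∂T)_p. A minimal finite-difference sketch, in reduced units with N k_B = 1 and C_p = 5/2 (a monatomic ideal gas), checks that the integrated S(T, p) indeed obeys it:

```python
# Check that S(T,p) = Cp ln T - ln p obeys the Maxwell relation
# (dS/dp)_T = -(dV/dT)_p, with V(T,p) = T/p from the ideal gas law
# (units chosen so that N kB = 1).
import math

Cp = 2.5  # in units of N kB

def S(T, p):
    return Cp * math.log(T) - math.log(p)

def V(T, p):
    return T / p   # N kB T / p with N kB = 1

T0, p0, h = 300.0, 2.0, 1e-6
dS_dp = (S(T0, p0 + h) - S(T0, p0 - h)) / (2 * h)   # (dS/dp)_T
dV_dT = (V(T0 + h, p0) - V(T0 - h, p0)) / (2 * h)   # (dV/dT)_p
print(dS_dp, -dV_dT)   # both equal -1/p0
```

Both derivatives come out as −1/p, as the "pattern" on the previous slide predicts.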

C_p and C_V once again…
Let us use this as another example of a calculation. For any function, such as S(T, V):

dS = (∂S/∂T)_V dT + (∂S/∂V)_T dV

Now evaluate:

C_p = T (∂S/∂T)_p = T (∂S/∂T)_V + T (∂S/∂V)_T (∂V/∂T)_p

C_p = C_V + T (∂p/∂T)_V (∂V/∂T)_p

using the Maxwell relation (∂S/∂V)_T = (∂p/∂T)_V. We stop as soon as the combination (p, V, T) is reached. For an ideal gas:

(∂p/∂T)_V = N k_B/V ;  (∂V/∂T)_p = N k_B/p

so C_p = C_V + N k_B
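The ideal-gas result can be confirmed numerically in reduced units (N k_B = 1, monatomic gas): C_V from U = (3/2)T and C_p from the enthalpy H = U + pV = (5/2)T.

```python
# Numerical check of Cp - Cv = N kB for the monatomic ideal gas
# (units with N kB = 1): Cv = (dU/dT)_V, Cp = (dH/dT)_p.
U = lambda T: 1.5 * T          # mean energy
H = lambda T: U(T) + T         # enthalpy: pV = N kB T = T

T0, h = 300.0, 1e-5
Cv = (U(T0 + h) - U(T0 - h)) / (2 * h)
Cp = (H(T0 + h) - H(T0 - h)) / (2 * h)
print(Cp - Cv)   # 1.0, i.e. N kB
```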

Joule expansion

Isolated system, so ∆U = 0. During the expansion V_1 → V_2:

T_2 − T_1 = ∫_{V_1}^{V_2} (∂T/∂V)_U dV

(∂T/∂V)_U = −(∂T/∂U)_V (∂U/∂V)_T = −(1/C_V) (T dS − p dV)/dV at constant T = −(1/C_V) [ T (∂p/∂T)_V − p ]

We stop as soon as the combination (p, V, T) is reached. For an ideal gas, T (∂p/∂T)_V = N k_B T/V = p, so ∆T = 0.

Try to evaluate this for a non-ideal gas, e.g. p = N k_B T/(V − Nb) − N²a/V²
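A sketch of the suggested exercise. For the van der Waals pressure the boxed result gives (∂T/∂V)_U = −(N²a)/(C_V V²), so the gas cools on free expansion. The parameters below are only roughly nitrogen-like per-particle vdW constants, used for illustration:

```python
# Joule expansion of a van der Waals gas: integrate
# (dT/dV)_U = -(1/Cv)[T (dp/dT)_V - p] = -(N^2 a)/(Cv V^2)
# and compare with the closed form -(N^2 a/Cv)(1/V1 - 1/V2).
kB = 1.380649e-23
N = 6.022e23               # one mole of particles
a, b = 3.7e-49, 6.2e-29    # roughly nitrogen-like vdW constants (SI, per particle)
Cv = 2.5 * N * kB          # diatomic-like heat capacity (assumption)

def dTdV(V):
    return -(N**2 * a) / (Cv * V**2)

V1, V2, steps = 1e-3, 2e-3, 100_000   # expand from 1 to 2 litres
dV = (V2 - V1) / steps
dT_numeric = sum(dTdV(V1 + (i + 0.5) * dV) * dV for i in range(steps))
dT_exact = -(N**2 * a / Cv) * (1 / V1 - 1 / V2)
print(dT_numeric, dT_exact)   # a few degrees of cooling
```

Note that only the attraction constant a enters: b drops out of T(∂p/∂T)_V − p for the vdW form.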

Summary so far……

The 2nd Law of thermodynamics: “Entropy of a closed system increases to a maximum in equilibrium.”

dQ = T dS    S = k_B ln Ω

Thermodynamic variables come in conjugate pairs of “force”-“displacement”. A given set defines the corresponding thermodynamic potential:

“Enthalpies”: dU, dH, … = T dS + {Y dx}
“Free energies”: dF, dG, … = −S dT + {Y dx}

The mean energy U(S,V,N) is the only potential that depends on all the extensive variables! Hence U(λS, λV, λN) = λ U(S,V,N).

Tools of analytical thermodynamics: Maxwell relations – Reciprocity – Chain rule → stop at ( P,V,T )

Internal equilibrium
Let us consider a closed system, fully isolated from the outside. The 2nd Law demands that the change of entropy of this system can only be positive, ∆S ≥ 0, and that it is maximal in equilibrium. Let us divide the system into two parts, 1 and 2, which can exchange U, V and N.

dS_tot = dS_1 + dS_2 = (dU_1 + p_1 dV_1 − µ_1 dN_1)/T_1 + (dU_2 + p_2 dV_2 − µ_2 dN_2)/T_2

= (dU_1 + p_1 dV_1 − µ_1 dN_1)/T_1 + (−dU_1 − p_2 dV_1 + µ_2 dN_1)/T_2

= (1/T_1 − 1/T_2) dU_1 + (p_1/T_1 − p_2/T_2) dV_1 − (µ_1/T_1 − µ_2/T_2) dN_1

Setting dS_tot = 0 for arbitrary independent variations, between any two parts inside a closed system:

T_1 = T_2 ;  p_1 = p_2 ;  µ_1 = µ_2

Equilibrium in open systems
A much more relevant problem is that of a system interacting with a reservoir (the rest of the Universe). The 2nd Law is still in action, only it applies to the whole: as before, the two parts of the (closed) Universe can exchange U, V and N.

dS_tot = (dU_1 + p_1 dV_1 − µ_1 dN_1)/T_1 + (dU_R + p_R dV_R − µ_R dN_R)/T_R

Since ∆X_R = −∆X_1 for each exchanged quantity:

= (dU_1 + p_1 dV_1 − µ_1 dN_1)/T_1 + (−dU_1 − p_R dV_1 + µ_R dN_1)/T_R

= dS_1 + (−dU_1 − p_R dV_1 + µ_R dN_1)/T_R = (1/T_R)( T_R dS_1 − dU_1 − p_R dV_1 + µ_R dN_1 )

So far, no assumptions have been made; but surely the “reservoir” is very big…

Availability
Now assume the reservoir is so big that it “doesn’t notice” any changes our system inflicts on it:

dT_R = dp_R = dµ_R = 0, so p_R dV = d(p_R V), etc.

We can now define a new object for our system, called the availability A, as dA = −T_R dS_tot, where

dS_tot = (1/T_R)( T_R dS − dU − p_R dV + µ_R dN )

Meaning that: dA = −T_R dS + dU + p_R dV − µ_R dN

dA = (T − T_R) dS − (p − p_R) dV + (µ − µ_R) dN

• Availability is a function of the system variables (U, S, V, N)

• Since the R-variables are constant: A = U − T_R S + p_R V − µ_R N
• Approaching equilibrium, the availability can only decrease, dA ≤ 0; in equilibrium it is at its minimum.

Availability
dA ≡ −T_R dS_total = dU − T_R dS + p_R dV − µ_R dN

A particularly important aspect of the 2nd Law, in the form of the availability reaching its minimum when the system is in equilibrium with the (big) reservoir, is that for any variable X that characterises our system the probability is

P(X) ∝ exp( −A(X)/k_B T )

Now consider our system at constant T, V and N. Then:

dA at constant (T, V, N) = dU − T_R dS = d(U − TS) = dF(T, V, N)

… the Helmholtz free energy, which has T, V and N as its proper variables.

Check yourself: dA at constant (T, p, N); at constant (S, p, N); at constant (S, V, N).

Phase equilibrium
If we are looking at an open system, then it is under (p, T) control, and so has to be described by G(T, p, N), in contrast, e.g., to a fixed-volume vessel, which requires (V, T) and F. When the two phases coexist, the full potential is G = G_l + G_v. In equilibrium:

dA at constant (T, p) = dG = 0

dG_l = −S_l dT + V_l dp + µ_l dN_l = −dG_v = S_v dT − V_v dp − µ_v dN_v

Now dT = dp = 0 and dN_l = −dN_v, so µ_l = µ_v

This is part of our earlier equilibrium condition set for the two parts of a system. Whenever there is particle exchange, we find µ matching between the subsystems.

Phase equilibrium
Look at the familiar example of the VdW isotherms and the resulting gas-liquid transition. On the (p, V) plane:

At a temperature below the critical point there is a region of coexistence. How can we find the pressure at which this occurs, for a given T?

µ_l(A) = µ_v(E)

The Gibbs-Duhem equation, dµ = −s dT + v dp, gives (at constant T)

µ_v(E) = µ_l(A) + ∫_A^E (∂µ/∂p)_T dp = µ_l(A) + ∫_A^E v dp

so ∫_A^E v dp = 0: the equal-area rule gives the vapour pressure p_v(T).
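The equal-area rule can be carried out numerically. A sketch in reduced van der Waals units, where the isotherm is p(v) = 8T/(3v − 1) − 3/v² and the critical point sits at p = v = T = 1; for a given T < 1 we bisect on the pressure until the loop areas balance:

```python
# Maxwell equal-area construction for the reduced vdW gas at T = 0.9:
# find p such that integral_{v_l}^{v_g} p(v) dv = p (v_g - v_l),
# using the antiderivative P(v) = (8T/3) ln(3v - 1) + 3/v.
import math
import numpy as np

T = 0.9

def volumes(p):
    # real roots of 3 p v^3 - (p + 8T) v^2 + 9 v - 3 = 0, sorted
    r = np.roots([3 * p, -(p + 8 * T), 9.0, -3.0])
    r = sorted(x.real for x in r if abs(x.imag) < 1e-9)
    return r[0], r[-1]          # liquid and vapour volumes

def antideriv(v):
    return (8 * T / 3) * math.log(3 * v - 1) + 3 / v

def area_mismatch(p):
    vl, vg = volumes(p)
    return antideriv(vg) - antideriv(vl) - p * (vg - vl)

# bisection inside the pressure window where three real roots exist
lo, hi = 0.50, 0.70
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if area_mismatch(lo) * area_mismatch(mid) <= 0:
        hi = mid
    else:
        lo = mid

p_sat = 0.5 * (lo + hi)
print(p_sat)   # coexistence (vapour) pressure at T = 0.9, about 0.65
```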

Summary so far……..

Availability – the energy function that reflects the balance between the system and the reservoir:

dA = dU −TRdS + pRdV − µRdN

= (T − T_R) dS − (p − p_R) dV + (µ − µ_R) dN

A system interacting with a reservoir has the probability of its variable taking the value X given by

P(X) ∝ exp( −A(X)/k_B T )

When you control a certain set of variables of your system, e.g. (X, Y), then a small increment in availability is equal to the increment of the corresponding thermodynamic potential Π(X, Y), e.g.

dA at constant (T, V, N) = dF(T, V, N)

Microstates and Macrostate
A microstate is a particular configuration (realisation) of the system, with certain values of the microscopic parameters, e.g.

• a solution of the Schrödinger equation with an energy E_i
• the positions and velocities of the particles in a classical gas
• an arrangement of spins in a paramagnetic lattice

A macrostate is the set of all microstates which have a certain mean energy U and are subject to any other constraints: V, N, etc. For an isolated system, all microstates compatible with the given constraints are equally likely to occur.

Statistical mechanics is all about finding the number of microstates, Ω(U, V, N). The temperature is then defined by

1/(k_B T) = ∂ ln Ω/∂U

The Boltzmann entropy: S = kB ln Ω

Statistical entropy
The entropy (heat per unit temperature) is the measure of irreversibility, of chaos, of disorder, quantitatively measured as a “number of configurations”:

S = k_B ln Ω

Examples: 1) Identical particles in a box: Ω = N!

S = k_B ln(N!) ≈ k_B (N ln N − N), using the Stirling approximation N! ≈ N^N e^{−N}.

2) Several populations (N_1 + N_2 + N_3 + … = N): Ω = N!/(N_1! N_2! N_3! …)

S = k_B ln Ω ≈ k_B [ N ln N − N − Σ_i (N_i ln N_i − N_i) ] = k_B Σ_i N_i (ln N − ln N_i) = −N k_B Σ_i (N_i/N) ln(N_i/N)

The “Gibbs entropy”: S = −N k_B Σ_i P(i) ln P(i), with P(i) = N_i/N
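The Stirling step above can be checked directly: the exact multinomial count ln( N!/(N_1! N_2! …) ) approaches −N Σ_i P(i) ln P(i) for large populations. A minimal sketch with arbitrary populations:

```python
# Exact multinomial entropy versus the Gibbs form, for large Ni.
import math

pops = [4000, 2500, 2000, 1500]        # arbitrary illustrative populations
N = sum(pops)

# exact: ln( N! / (N1! N2! ...) ) via log-gamma, ln(n!) = lgamma(n+1)
exact = math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in pops)
# Stirling/Gibbs form: -N * sum_i P(i) ln P(i)
gibbs = -N * sum((n / N) * math.log(n / N) for n in pops)
print(exact, gibbs)   # agree to sub-percent level at N = 10^4
```

The residual difference is the O(ln N) correction dropped by Stirling, negligible for thermodynamic N ~ 10^23.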

Canonical ensemble
The microcanonical ensemble is just a collection of thermally isolated systems, all with the same energy, and hence the same probability of occurrence. The canonical (or standard) ensemble is a collection of systems all connected to a reservoir (and possibly to each other) so that they can exchange energy. So the mean energy U of this ensemble fluctuates; the particle number N is definite. There will also be a grand canonical ensemble (for the lack of a better name), in which the systems are also allowed to exchange particles, so that both U and N fluctuate (while the intensive variables T and µ are fixed). Gibbs derived the probability for a chosen subsystem (i) to be found in a microstate with energy E_i:

P(i) = (1/Z) e^{−E_i/k_B T}

Partition function
The Boltzmann factor determines the probability for a system to be found in a state with energy E_i:

P(i) = (1/Z)·e^{−E_i/k_B T}

where the normalisation factor is

Z = Σ_{all states i} exp( −E_i/k_B T )

This is not a mere normalisation, but a very important object in statistical physics, called the partition function. It encodes the statistical properties of the whole system. Its significance is mainly in its role in defining the free energy, a most important form of thermodynamic potential energy,

expressed as F = −k_B T ln Z, or Z = e^{−F/k_B T}.

We can find the average energy of the system:

U = ⟨E⟩ = Σ_i E_i P(i) = Σ_i E_i e^{−E_i/k_B T} / Σ_i e^{−E_i/k_B T}

U = −(1/Z) ∂Z/∂β = −∂ ln Z/∂β, with β = 1/k_B T
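The two routes to the mean energy can be checked against each other for any finite set of levels. A minimal sketch, with arbitrary energies in units where k_B = 1:

```python
# For an arbitrary finite level scheme, compute U both as sum_i E_i P(i)
# and as -d(ln Z)/d(beta) numerically; the two must agree.
import math

levels = [0.0, 1.0, 1.0, 2.5]    # arbitrary energies (degenerate pair included)
beta = 0.7                        # beta = 1/(kB T)

def lnZ(b):
    return math.log(sum(math.exp(-b * E) for E in levels))

Z = math.exp(lnZ(beta))
U_direct = sum(E * math.exp(-beta * E) for E in levels) / Z

h = 1e-6
U_deriv = -(lnZ(beta + h) - lnZ(beta - h)) / (2 * h)
print(U_direct, U_deriv)
```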

Maximum probability
It is now quite obvious that the maximum probability P(i) belongs to the macrostate with: [1] the lowest energy E_i, and [2] the largest “degeneracy” Ω_i, i.e. the number of microstates with the same energy:

Z = Σ_{all states} e^{−E_i/k_B T} = Σ_{E_i} Ω_i e^{−E_i/k_B T} = Σ_{E_i} e^{−(E_i − k_B T ln Ω_i)/k_B T} = Σ_{E_i} e^{−(E_i − T S_i)/k_B T} = Σ_{E_i} Z_i

Maximisation of the partition function, or (equivalently!) minimisation of the free energy F = U − TS, is the main driving force in all Nature:
Average (potential) energy U → minimum
Average entropy S → maximum
… in balance.

Summary so far……

The maximum probability P(i) is for the macrostate with: [1] the lowest energy E_i, and [2] the largest “degeneracy” Ω_i (i.e. the number of microstates with the same energy):

Z = Σ_{all states} e^{−E_i/k_B T} = Σ_{E_i} e^{−(E_i − k_B T ln Ω_i)/k_B T} = Σ_{E_i} Z_i

P(i) = (1/Z) e^{−(E_i − T S_i)/k_B T}

Maximisation of the partition function, or (equivalently!) minimisation of the free energy F_i = E_i − T S_i, is the main driving force in all natural processes:
Average energy U → minimum
Average entropy S → maximum
… in balance.

Two-level system

As one of the simplest examples of a real physical case that we can analyse within proper statistical mechanics, let us consider a 2-level system, with energies E = 0 and E = ε.

The object (e.g. an electron in an atom, or a spin) can exist on either of the two levels:

Z = Σ_{all states} e^{−E_i/k_B T} = 1 + e^{−ε/k_B T}

Once the partition function is found, you know everything about the system!

Route 1: mean energy U = −∂ ln Z/∂β, where β = 1/k_B T. Here we have

U = −∂/∂β [ ln(1 + e^{−βε}) ] = ε e^{−βε}/(1 + e^{−βε}) = ε/( e^{ε/k_B T} + 1 )

At low T the occupation is activated, U ≈ ε e^{−ε/k_B T}; at high T, U → ε/2.
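A quick numerical check of Route 1 (units with k_B = 1 and ε = 1): U from the numerical derivative of ln Z against the closed form, including the two limits quoted on the slide.

```python
# Two-level system: U = -d(lnZ)/d(beta) versus U = eps/(exp(eps/T) + 1).
import math

eps = 1.0

def lnZ(b):               # Z = 1 + exp(-beta * eps)
    return math.log(1 + math.exp(-b * eps))

def U_closed(T):
    return eps / (math.exp(eps / T) + 1)

T, h = 0.8, 1e-6
b = 1 / T
U_num = -(lnZ(b + h) - lnZ(b - h)) / (2 * h)
print(U_num, U_closed(T))     # agree

print(U_closed(1e4))          # high-T limit: approaches eps/2
print(U_closed(0.05), eps * math.exp(-eps / 0.05))   # low-T: activated
```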

Two-level system
Partition function: Z = 1 + e^{−ε/k_B T}. Once the partition function is found, you know everything about the system!

Route 2: free energy F = −k_B T ln Z = −k_B T ln( 1 + e^{−ε/k_B T} )

Recall the thermodynamic potential properties, dF = −S dT − P dV:

S = −(∂F/∂T)_V = k_B ln( 1 + e^{−ε/k_B T} ) + (ε/T)·1/( e^{ε/k_B T} + 1 )

At high temperature, S → k_B ln 2: S → max “wins”, and the energy follows… At low temperature, S ≈ (ε/T) e^{−ε/k_B T} → 0: U → min “wins”, and the entropy “loses out”…

Paramagnetism
Paramagnetism, in its simplest form, is analogous to the 2-level system. Consider a system (e.g. a crystalline solid) in which each atom has a magnetic moment [spin] m. Assume the spins do not interact (each is on its own). The selection rules for a spin s = ½ allow only 2 states for each spin, “up” and “down”, with the same energy. But if an external magnetic field B is imposed, the “up” and “down” states have different energies, E = ±mB:

Z_1 = Σ_{all states} e^{−E_i/k_B T} = e^{mB/k_B T} + e^{−mB/k_B T} = 2 cosh( mB/k_B T )

But there are N such atoms in the system, all independent: Z = Z_1^N. Once the partition function is found, you know everything about the system!

Paramagnetism
Partition function: Z = ( e^{mB/k_B T} + e^{−mB/k_B T} )^N

Route 2: free energy

F = −N k_B T ln( e^{mB/k_B T} + e^{−mB/k_B T} )

Magnetization M (exactly like the dielectric polarization P) should be defined as the sum of all m-dipoles per unit volume. It is an extensive thermodynamic variable, forming a conjugate pair with the (intensive) B-field (same as with P dE):

dF = −S dT − P dV − M dB (which sign?)

The average magnetization induced by an external B is determined by the appropriate thermodynamic derivative:

M = −(∂F/∂B)_{T,V} = N k_B T (d/dB) ln( e^{mB/k_B T} + e^{−mB/k_B T} ) = N m tanh( mB/k_B T )

which saturates at ±N m for large |mB|/k_B T.
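Route 2 can be tested numerically (units with k_B = 1, illustrative N, m, T): differentiate F with respect to B and compare with N m tanh(mB/T), including the saturation at large field.

```python
# Paramagnet: magnetization from M = -dF/dB of
# F = -N T ln(2 cosh(m B / T)), against M = N m tanh(m B / T).
import math

N, m, T = 100.0, 1.0, 2.0    # illustrative values

def F(B):
    return -N * T * math.log(2 * math.cosh(m * B / T))

B, h = 0.5, 1e-6
M_num = -(F(B + h) - F(B - h)) / (2 * h)
M_closed = N * m * math.tanh(m * B / T)
print(M_num, M_closed)                     # agree

print(N * m * math.tanh(m * 50 / T))       # saturation: approaches N m
```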

Quantum oscillator
An oscillator (e.g. a vibrating molecular bond) has an infinite number of states (labelled by n), separated by equal energy gaps: E_n = ħω(n + ½). Whether we can “see” this discrete nature of the oscillator motion depends on our equipment: whether it is sensitive to the accuracy ∆E = ħω. So, from the statistical point of view, we look for the partition function:

Z = Σ_{n=0}^∞ e^{−ħω(n+1/2)/k_B T}

This is a geometric progression (Σ aⁿ = 1/(1 − a)):

Z = e^{−ħω/2k_B T}/( 1 − e^{−ħω/k_B T} ) = 1/( e^{ħω/2k_B T} − e^{−ħω/2k_B T} )

Low-temperature limit: Z ≈ e^{−ħω/2k_B T}. High-temperature limit: Z ≈ k_B T/ħω.

Mean energy: U = −∂ ln Z/∂β. At low T, U ≈ ∂/∂β (βħω/2) = ħω/2: the “zero-temperature oscillations”. At high T, U ≈ ∂/∂β ln(ħωβ) = 1/β = k_B T: the “classical limit”, C_V = k_B.
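Both the geometric-progression sum and the two limits can be verified numerically (units with ħω = k_B = 1):

```python
# Quantum oscillator: truncated sum for Z = sum_n exp(-beta (n + 1/2))
# against the closed form Z = 1/(exp(beta/2) - exp(-beta/2)),
# plus the two limits of U = -d(lnZ)/d(beta).
import math

def Z_closed(b):
    return 1.0 / (math.exp(b / 2) - math.exp(-b / 2))

def Z_sum(b, nmax=2000):
    return sum(math.exp(-b * (n + 0.5)) for n in range(nmax))

b = 0.5                      # kB T = 2 hbar omega
print(Z_sum(b), Z_closed(b))   # agree once the tail is negligible

h = 1e-6
U = lambda b: -(math.log(Z_closed(b + h)) - math.log(Z_closed(b - h))) / (2 * h)
print(U(0.01))   # high T: close to kB T = 1/b = 100
print(U(50.0))   # low T: close to the zero-point energy 1/2
```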

Quantum oscillator
An oscillator (e.g. a vibrating molecular bond) has an infinite number of states (labelled by n), E_n = ħω(n + ½). Once we know the partition function, we know all the properties of our system…

Low-temperature limit: Z ≈ e^{−ħω/2k_B T}, so the free energy F = −k_B T ln Z ≈ ħω/2. Here F = U, and the entropy is zero.

High-temperature limit: Z ≈ k_B T/ħω, so F ≈ −k_B T ln( k_B T/ħω ). Find the entropy S(T) in this “classical limit”:

S = −∂F/∂T = k_B ln( k_B T/ħω ) + k_B T·(1/T) = k_B ln( e·k_B T/ħω ), valid for k_B T ≳ ħω.

Take a deep breath.

… Stretch.

… We move forward

Continuous systems
So far we have seen how to handle physical systems in which we can enumerate the different microstates (E_i and Ω_i) and find the partition function, which in turn makes accurate predictions about the average (most probable) state of such systems. But the simplest(?) and most common object of study in thermodynamics is the ideal gas, PV = N k_B T. Let’s look at it.

Actually, N particles is far too many: let’s start with just one! A particle of mass m and velocity v, in a volume V = L³:

Z = Σ_{all states} e^{−E_i/k_B T}

We have two (related) difficulties: (a) what is E, when the particle doesn’t interact with anything, and (b) how to count the “states”…

Phase space
Normally, by E we would want to mean the potential energy W(x), which would have a “minimum” and hence the higher Boltzmann factor. But here there is no P.E., only kinetic energy. Normally, you would describe your system (here, just 1 particle) by assigning it a position, say x(t), if we look at only 1 dimension. But this is clearly not a full description! Need the velocity too…

Introduce the “phase space”: the coordinate and the momentum along this axis, x(t) and p = mv. An element of phase space, at a time t, gives a full predictive description of where the particle will be at t + dt.

Continuous states
So we account for all the possible states of a particle by summing over all possible points in its phase space. Consider how to do this in a 1-dimensional example (along the x-axis):

Σ_{all states} = Σ_{p_i} Σ_{x_i} → ∫ (dx/∆x) ∫ (dp/∆p)

But note that the sum was non-dimensional, while ∫dx dp has the dimensionality of [kg·m²/s]! The formal conversion of a sum into an integral requires dividing by the elementary “step of discretisation”, here ∆x·∆p. What is the smallest possible value ∆p·∆x can take? 2πħ.

Σ_{all states} = ∫ dx dp/(2πħ) = ∫ dx dk/2π

This result is worth remembering: “the world is discrete in phase space”.

One particle in a box
We can now make progress: the single particle of mass m and momentum p, in a box of volume V = L³, has its statistical partition function

Z_1 = Σ_{all states} e^{−E/k_B T} = ∫ ( d³x d³p/(2πħ)³ ) e^{−p²/2m k_B T}

First of all, notice that nothing under the integral depends on x, i.e. there is no potential energy V(x): this is the ideal gas! Secondly, instead of doing complicated 3-dimensional integrals, note that p² = p_x² + p_y² + p_z² and d³p = dp_x dp_y dp_z, so the integral factorises:

Z_1 = V·∫ (d³p/(2πħ)³) e^{−p²/2m k_B T} = V·( ∫ (dp_x/2πħ) e^{−p_x²/2m k_B T} )³ = V/λ³

The single-axis integral is a very important expression; let’s call it 1/λ.

One particle in a box
The particle of mass m and momentum p has the partition function Z_1 = V/λ³: this is how many ways you can “pack” the particle into the box! So what is this length scale λ?

1/λ = ∫_{−∞}^{+∞} (dp/2πħ) e^{−p²/2m k_B T} = √(2π m k_B T)/(2πħ), so λ = √( 2πħ²/(m k_B T) )

Recall the de Broglie wavelength of the wave representation of a particle, with the mean thermal velocity from the Maxwell distribution, v ~ √(k_B T/m):

λ_dB = 2πħ/p = 2πħ/(m v) ~ 2πħ/√(m k_B T)

the same, up to a numerical factor. So we have managed to “count” the possible states a free classical particle can have in the box, and the result is simply Z_1 = V/λ³.
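The Gaussian integral defining 1/λ can be evaluated numerically and compared with the closed form; a helium-4 atom at room temperature is a natural example, giving a thermal wavelength of atomic size:

```python
# Thermal de Broglie wavelength: lambda = sqrt(2 pi hbar^2 / (m kB T))
# versus a direct numerical evaluation of
# 1/lambda = integral dp/(2 pi hbar) exp(-p^2 / (2 m kB T)).
import math

hbar = 1.054571817e-34   # J s
kB = 1.380649e-23        # J/K
m = 6.6464731e-27        # kg, helium-4 atom
T = 300.0

lam = math.sqrt(2 * math.pi * hbar**2 / (m * kB * T))

# midpoint-rule integral; the integrand is negligible beyond ~10 sigma
sigma = math.sqrt(m * kB * T)
P, n = 10 * sigma, 100_000
dp = 2 * P / n
inv_lam = sum(math.exp(-((-P + (i + 0.5) * dp)**2) / (2 * m * kB * T))
              for i in range(n)) * dp / (2 * math.pi * hbar)
print(lam, 1 / inv_lam)   # ~5e-11 m, comparable to the atomic size
```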

Classical ideal gas
One particle of mass m has the partition function

Z_1 = V/λ³ = V ( m k_B T/2πħ² )^{3/2}

What if we have N such particles, all independent of each other?

Z_N = (1/N!) Z_1^N

If all particles are exactly the same (indistinguishable), we could not tell the difference between many configurations: hence the 1/N!. We now know everything about the ideal gas.

Route 1: mean energy

U = −∂/∂β ln Z_N = −∂/∂β [ N ln Z_1 − ln N! ]

Using the factorising property of logarithms (only the β-dependent part survives):

U = −N ∂/∂β ln[ V (m/2πħ²)^{3/2} β^{−3/2} ] = −N ∂/∂β ln( β^{−3/2} ) = (3/2) N/β = (3/2) N k_B T, which is actually correct!

Classical ideal gas
The ideal gas of particles of mass m has Z_N = Z_1^N/N!, where Z_1 = V/λ³ = V( m k_B T/2πħ² )^{3/2}.

Route 2: free energy

F = −k_B T ln Z_N = −k_B T [ N ln Z_1 − N ln N + N ]

Using the factorising property of logarithms:

F = N k_B T [ ln( Nλ³/V ) − 1 ] = N k_B T ln( Nλ³/Ve )

Let’s find the pressure: P = −(∂F/∂V)_T = N k_B T/V

Note: we could not find the ideal gas law from the mean energy U, which was in the “wrong variables” (T, V)!

The entropy is a bit of work: S = −(∂F/∂T)_V = −N k_B ln( Nλ³/Ve ) + (3/2) N k_B
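Route 2 can be tested by differentiating F numerically. A sketch in reduced units (k_B = 1, with the constant prefactor of λ³ set to 1 so that only its T^{−3/2} dependence matters):

```python
# Ideal gas from the free energy F = N T [ln(N lambda^3 / V) - 1]:
# pressure p = -(dF/dV)_T against N T / V, and entropy S = -(dF/dT)_V
# against -N ln(N lambda^3 / (V e)) + (3/2) N.
import math

N, T = 1000.0, 2.0    # illustrative values

def F(V, T):
    lam3 = T**(-1.5)          # lambda^3 ~ T^(-3/2), prefactor set to 1
    return N * T * (math.log(N * lam3 / V) - 1)

V, h = 50.0, 1e-6
p_num = -(F(V + h, T) - F(V - h, T)) / (2 * h)
print(p_num, N * T / V)       # both equal 40

S_num = -(F(V, T + h) - F(V, T - h)) / (2 * h)
S_closed = -N * (math.log(N * T**(-1.5) / V) - 1) + 1.5 * N
print(S_num, S_closed)
```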

Grand partition function
In the same way, for a grand canonical ensemble, the probability for a given system to have energy E_i and a number of particles N_i (in contact with a reservoir that maintains T, µ) is:

P(i) = (1/Ξ)·exp( −(E_i − µN_i)/k_B T )

with the grand partition function

Ξ = Σ_{N_i=0}^{TOTAL} Σ_{all states i} exp( −(E_i − µN_i)/k_B T )

There is a corresponding thermodynamic potential that needs to be minimised in equilibrium. It is called the grand potential:

Φ = −k_B T ln Ξ, or Ξ = e^{−Φ/k_B T}

Note that its natural variables are (T, V, µ) and that it is obtained from F(T, V, N) by the Legendre transformation Φ = F − µN.

Grand potential
In the same way as we analysed the canonical partition function Z, by identifying the free energy E_i − T S_i of a macrostate, let us re-order the grand-canonical summation and rearrange the exponent:

Ξ = Σ_{N=0}^{TOTAL} e^{µN/k_B T} [ Σ_{all states i} e^{−E_i/k_B T} ] = Σ_{E_i, N_i} e^{−(E_i − k_B T ln Ω_i − µN_i)/k_B T} = Σ_{E_i} e^{−Φ_i/k_B T}

The first observation is that one can have a “grand partition function” and a grand potential for a given macrostate, if it can exchange particles with other macrostates.

Secondly, as in the canonical ensemble, the minimum of Φi is the most probable microstate, and at sufficiently low T is can be treated as the average , i.e. thermodynamic Φ( T,V, µ)= F−µN. 1  E − k T ln Ω − µN   i B i i  Probabilit y P(Ei , Ni ) = exp −  Ξ  kBT 

Michaelmas 2012 Part II Thermal & Statistical  Grand potential

You will have noticed:  Ξ = Σ_{N=0}^{TOTAL} Z(N) e^(μN/kB T)
For instance, for a classical ideal gas we know what Z(N) is, so
  Ξ = Σ_{N=0}^{TOTAL} e^(μN/kB T) Z1^N/N! = Σ_{N=0}^{TOTAL} (Z1 e^(μ/kB T))^N/N! = exp(Z1 e^(μ/kB T))

So if this ideal gas is in contact, and can exchange particles, with a (big) reservoir which maintains a chemical potential μ:
  Φ = −kB T ln Ξ = −kB T (V/λ³) e^(μ/kB T)
We have the pressure:  p = −(∂Φ/∂V)_{T,μ} = (kB T/λ³) e^(βμ)
Mean number of particles:  N̄ = −(∂Φ/∂μ)_{T,V} = (V/λ³) e^(μ/kB T),  so  Φ = −kB T N̄  and  Ξ = e^N̄

Probability:  P(N) = (1/Ξ)(e^(βμ) Z1)^N/N! = N̄^N e^(−N̄)/N!

Summary so far……

Partition function of the classical ideal gas (box V = L³):
  Z1 = V/λ³ = V (m kB T / 2πħ²)^(3/2),  Z_N = Z1^N / N!

Systems open to particle exchange: grand partition function
  Ξ = Σ_{Ni=0}^{TOTAL} Σ_{all states {i}} exp[−(Ei − μNi)/kB T]

Classical ideal gas:
  Ξ = exp(Z1 e^(μ/kB T))
  Φ = −kB T ln Ξ = F − μN = −kB T (V/λ³) e^(μ/kB T) = −kB T N̄

Michaelmas 2012 Part II Thermal & Statistical  p-T and p-T-μ ensembles

By analogy with constructing the grand partition function in an ensemble with particle exchange under controlled μ, for a system that exchanges volume (V₀ − V) with a reservoir at T, p₀ we should build a corresponding statistical sum:
  Ψ = Σ ∫_0^∞ dV e^(−p₀V/kB T) Z(T, V, N)
    = Σ_{microstates Ei} e^(−(Ei − kB T ln Ωi + p₀Vi)/kB T)
    = Σ_{Ei} e^(−Gi(T, p₀, N)/kB T)
The corresponding thermodynamic potential is G(T, p, N) = F + pV.

If now in addition we also open the system to exchange particles (N₀ − N, at μ imposed by the reservoir), then the corresponding thermodynamic potential would be
  Y(T, p, μ) = G − μN = F + pV − μN = Φ + pV
In fact, in such an ensemble there is no proper (extensive) thermodynamic potential of the system, that is, dY = 0.

Michaelmas 2012 Part II Thermal & Statistical  Chemical potential of ideal gas

There are many different ways to obtain this, but the properly systematic one is to obtain the canonical Z(N), then its F(T, N), and then μ:
  F = N kB T ln(Nλ³/Ve);  μ = ∂F/∂N = kB T [ln(Nλ³/Ve) + 1] = kB T ln(Nλ³/V)

Some additional factors may contribute to the single-particle Z1. For instance, a constant (adsorption) potential φ, or an internal (e.g. vibrational) degree of freedom Z_vib = kB T/ħω:
  F = N kB T ln(Nλ³/(Ve · e^(−βφ) Z_vib));  μ = kB T ln(Nλ³ e^(βφ)/(V Z_vib))

Alternatively, from  N̄ = −(∂Φ/∂μ)_{T,V} = (V/λ³) e^(μ/kB T).
Extras… let us examine what this factor implies.

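The relation μ = ∂F/∂N can be checked numerically. A sketch (all values arbitrary, units with λ = 1 and kB = 1): a finite difference of F(N) reproduces μ = kB T ln(Nλ³/V).

```python
import math

# Sketch (units lambda = k_B = 1, arbitrary values): check mu = dF/dN for
# the ideal-gas free energy F = N k_B T [ln(N lambda^3 / V) - 1].
T, V = 1.5, 50.0

def F(N):
    return N * T * (math.log(N / V) - 1.0)

def mu_numeric(N, h=1e-6):
    # central finite difference for mu = dF/dN
    return (F(N + h) - F(N - h)) / (2.0 * h)

N = 20.0
mu_exact = T * math.log(N / V)   # k_B T ln(N lambda^3 / V)
print(mu_numeric(N), mu_exact)   # the two values agree
```

Note the "+1" from differentiating N ln N cancels the "−1" inside the bracket, which is why the final μ has no Stirling remnant.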
Michaelmas 2012 Part II Thermal & Statistical  Mixtures

Let us consider the p-T controlled ensemble (natural on the laboratory benchtop) where the gas has several species, that is, N = Σi Ni. The partial (osmotic) pressure law gives
  p = Σi pi = Σi Ni kB T / V
and similarly for the entropy:
  S = Σi Si = Σi Ni kB ln(const · T^(5/2)/pi) = Σi [(5/2) Ni kB ln(T/T₀) − Ni kB ln(pi/p₀)]
So the change in entropy on mixing is:
  ΔS = −kB Σi Ni ln(ci)

In the same way, using the general form of the chemical potential of an ideal gas, we have for each species:
  μi = kB T ln(Ni λi³/V) = kB T ln(Nλi³/V) + kB T ln(ci) = μi(p,T) + kB T ln(ci)
The first term is the pure gas of i-species at the current p, T; the second is the addition due to the current ci = Ni/N.

Michaelmas 2012 Part II Thermal & Statistical  Chemical reactions

Consider a generic chemical reaction, say, A + 2B = 2C. Remaining in the p-T ensemble, we must work with the Gibbs potential G(p,T,N) = Σi Gi. In equilibrium dG = 0, so we obtain
  dG = −S dT + V dp + Σi μi dNi = μA dNA + μB dNB + μC dNC = 0
but  dNA = (1/2) dNB = −(1/2) dNC
  ⇒  μA + 2μB − 2μC = 0

For an arbitrary reaction in equilibrium:
  Σi νi μi = 0 = Σi νi μi(p,T) + kB T ln(Πi (ci)^νi)

Define the chemical equilibrium constant:
  Kc(p,T) = Πi (ci)^νi  ⇒  Kc = cA cB² / cC²
  ln Kc = −(1/kB T) Σi νi μi(p,T)

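The chain from μi(p,T) to Kc and back to the equilibrium condition can be sketched numerically. All the standard-state chemical potentials and concentrations below are made-up illustrative numbers (units kB T = 1), for A + 2B = 2C with νA = 1, νB = 2, νC = −2.

```python
import math

# Sketch with hypothetical standard chemical potentials mu_i(p,T), k_B T = 1,
# for A + 2B = 2C (nu_A = 1, nu_B = 2, nu_C = -2, so K_c = c_A c_B^2 / c_C^2).
kT = 1.0
mu0 = {"A": 0.3, "B": -0.1, "C": 0.5}   # assumed mu_i(p, T)
nu = {"A": 1, "B": 2, "C": -2}

# ln K_c = -(1/k_B T) sum_i nu_i mu_i(p, T)
Kc = math.exp(-sum(nu[s] * mu0[s] for s in nu) / kT)

# pick c_A, c_B and let c_C follow from K_c = c_A c_B^2 / c_C^2 ...
cA, cB = 0.2, 0.3
cC = math.sqrt(cA * cB**2 / Kc)

# ... then the equilibrium condition sum_i nu_i [mu_i(p,T) + k_B T ln c_i] = 0 holds
mu = {"A": mu0["A"] + kT * math.log(cA),
      "B": mu0["B"] + kT * math.log(cB),
      "C": mu0["C"] + kT * math.log(cC)}
residual = mu["A"] + 2.0 * mu["B"] - 2.0 * mu["C"]
print(Kc, residual)   # residual vanishes to rounding
```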
Michaelmas 2012 Part II Thermal & Statistical  Chemical reactions

Consider a fully general chemical reaction, Σi νi Ai = 0. Remaining in the p-T ensemble, with
  dG = Σi μi dNi = 0
Let us recall what we know about μi:
  μi = kB T ln[(Ni λi³/V) · e^(−βφ)]
hence  Σi νi μi = kB T Σi νi (ln Ni − ln Z1^(i)) = 0

Define an alternative chemical equilibrium constant:
  KN(p,T) = Πi (Ni)^νi
Then its value is given by:  KN = Πi (Z1^(i))^νi

For instance, for A + 2B = 2C (with a potential gain −φ on binding), we might get
  KN = (V/λA³)(V/λB³)² / (V/λC³)² >> 1,  but  KN = [(V/λA³)(V/λB³)²/(V/λC³)²] e^(−βφ) << 1

Summary so far……

A generic chemical reaction, Σi νi Ai = 0 (e.g. 2H + O = H₂O)

A "chemical" (empirical) version of the chemical equilibrium constant:
  Kc(p,T) = Πi (ci)^νi

A "statistical" version of the chemical equilibrium constant:
  KN(p,T) = Πi (Ni)^νi = Πi (Z1^(i))^νi
This is how you calculate it…

For instance, for 2H + O = H₂O, we get
  KN = (V/λO³)(V/λH³)² / (V/λH2O³) >> 1,  but  KN = [(V/λO³)(V/λH³)²/(V/λH2O³)] e^(−φ/kB T) << 1
(the bonding potential energy is −φ)

Michaelmas 2012 Part II Thermal & Statistical  Classical vs. Quantum

The ideal gas of particles of mass m (box V = L³) has
  Z_N = Z1^N/N!  or  F = N kB T ln(Nλ³/Ve)
Note how this factor turns up under the logarithm! What is the meaning of
  Nλ³/V = (N/V)(2πħ²/m kB T)^(3/2) ?

Nλ³/V << 1: classical physics, particles are localised and interact via forces.
Nλ³/V >> 1: quantum physics, particles interact and behave as waves.

Michaelmas 2012 Part II Thermal & Statistical  Classical vs. Quantum

We established an important operation, of changing the order of summation (in different ensembles), to end up summing over the microstates Ei. In the grand canonical ensemble:
  Ξ = Σ_{N=0}^{TOTAL} e^(μN/kB T) Z(N) = Σ_{microstates Ei, Ni} e^(−(Ei − kB T ln Ωi − μNi)/kB T) = Σ_{Ei} e^(−Φi(T,V,μ)/kB T)
In this way we identified the grand partition function of a given microstate, and the corresponding potential:
  Ξk = Σ_{n=0}^{TOTAL} e^(−[εk(n) − μn]/kB T);  Φk = −kB T ln Ξk

In the quantum regime, when the separate particles cannot be properly distinguished and the statistical sum over microstates could be very difficult, we have an interesting way forward, if it happens that:  εk(n) = n εk

Michaelmas 2012 Part II Thermal & Statistical  Classical vs. Quantum

When the energy of a microstate Ek factorises with the number of particles in this state, Ek(n) = nk εk (and remember, the entropy is always extensive too), then
  Ξ = Σ_{n1,n2,n3,…} e^(−(ε1−μ)n1/kB T) e^(−(ε2−μ)n2/kB T) e^(−(ε3−μ)n3/kB T) …
    = Πk [Σ_{nk=0}^{TOTAL} e^(−(εk−μ)nk/kB T)] = Πk Ξk

The corresponding full grand potential is just the sum over each energy state:
  Φ(T, μ) = −kB T Σk ln Ξk = Σk Φk

However, to find any average (e.g. the mean energy U), we need to use the probability:
  U(N) = Σ_{microstates {k}} εk P(εk, N)

Michaelmas 2012 Part II Thermal & Statistical  Fermi statistics

The Pauli exclusion principle prohibits more than one Fermi particle from occupying a given energy level εk:
  Ξk = Σ_{n=0,1} e^(−[εk−μ]n/kB T) = 1 + e^(−(εk−μ)/kB T)
  Φk = −kB T ln[1 + e^(−(εk−μ)/kB T)]
  n(εk) = −dΦk/dμ = e^(−(εk−μ)/kB T) / (1 + e^(−(εk−μ)/kB T)) = 1/(e^(β[εk−μ]) + 1)

This is the famous expression of the Fermi occupation number: it tells how many particles you can find having the energy E, at temperature T.
[Figure: n(E) at T = 0 is a step, n = 1 for E < μ and n = 0 for E > μ.] This picture makes it clear that μ at low temperature is the "Fermi energy" εF, the highest level up to which the particles have to pack.

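A quick numerical sketch (arbitrary energy units) of the Fermi occupation number's basic properties: n = 1/2 exactly at E = μ, particle-hole symmetry n(μ+x) + n(μ−x) = 1, and the sharp step n → 1 below μ as T → 0.

```python
import math

# Sketch of the Fermi occupation number n(E) = 1/(e^{(E-mu)/k_B T} + 1),
# in arbitrary energy units.
def n_fermi(E, mu, kT):
    return 1.0 / (math.exp((E - mu) / kT) + 1.0)

mu, kT = 1.0, 0.1
half = n_fermi(mu, mu, kT)                                      # exactly 1/2 at E = mu
sym = n_fermi(mu + 0.05, mu, kT) + n_fermi(mu - 0.05, mu, kT)   # particle-hole symmetry: = 1
step = n_fermi(0.5, mu, 1e-4)                                   # well below mu, tiny T: -> 1
print(half, sym, step)
```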
Michaelmas 2012 Part II Thermal & Statistical  Fermi energy

The Fermi energy is the chemical potential of Fermi particles at very low temperature, when the occupation number is a sharp step:
  n(E) = 1/(e^((E−εF)/kB T) + 1) → step function as T → 0

If we have a total of N particles, then the sum of all n(εk) has to equal N. This is actually very easy if you recall that E = p²/2m:
  N = Σ n(εk) = ∫ (d³x d³p/(2πħ)³) n(E) = V ∫_0^∞ (4πp² dp/(2πħ)³) n(E) = V ∫_0^∞ (m^(3/2)/√2 π²ħ³) √E n(E) dE
At T = 0, and including the spin-½ degeneracy 2:
  N/V = (√2 m^(3/2)/π²ħ³) ∫_0^{εF} √E dE = (1/3π²)(2m εF/ħ²)^(3/2)
  ⇒  εF = (ħ²/2m)(3π² N/V)^(2/3) ≈ 4.7 (ħ²/m)(N/V)^(2/3)

Note how the Fermi energy depends on the particle density (or pressure), i.e. it is increasingly hard to add more particles to the system.

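The inversion above can be checked numerically. A sketch in units ħ = m = 1, assuming the spin-½ degeneracy 2 used on the slide: pick a density, compute εF = (1/2)(3π² N/V)^(2/3), then re-count the particles by integrating the density of states up to εF.

```python
import math

# Sketch (hbar = m = 1, spin degeneracy 2): invert N/V = (1/3 pi^2)(2 eps_F)^{3/2},
# then recount particles by integrating g(E)/V = (sqrt(2)/pi^2) sqrt(E) up to eps_F.
n_density = 0.7                                       # chosen N/V, arbitrary
eps_F = 0.5 * (3.0 * math.pi**2 * n_density) ** (2.0 / 3.0)

steps = 200000
dE = eps_F / steps
recount = sum((math.sqrt(2.0) / math.pi**2) * math.sqrt((i + 0.5) * dE) * dE
              for i in range(steps))
print(eps_F, recount, n_density)   # recount reproduces the chosen density
```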
Michaelmas 2012 Part II Thermal & Statistical  Bose statistics

If particles do not have half-integer spin, they are not subject to the Pauli exclusion principle, and can all occupy the same level of energy. These are called Bose particles.
  Ξk = Σ_{n=0}^∞ e^(−[εk−μ]n/kB T) = 1/(1 − e^(−(εk−μ)/kB T))
  Φk = kB T ln[1 − e^(−(εk−μ)/kB T)]

As before, we evaluate the mean number of particles with energy E:
  n(εk) = −dΦk/dμ = e^(−(εk−μ)/kB T)/(1 − e^(−(εk−μ)/kB T)) = 1/(e^(β[εk−μ]) − 1)

This is the famous expression of the Bose occupation number: note that n(E) can easily be > 1. But we really need to know the value of the chemical potential μ in this case!

Michaelmas 2012 Part II Thermal & Statistical  Bose condensate

Fermi particles, due to the exclusion principle, must occupy increasingly high levels of energy. Bose particles do not! So, at very low temperatures they can all sit at the level E = 0, which is called the Bose condensate.

[Figure: n(E) for fermions, a step with μ = εF, next to n(E) for bosons, where μ = ? and NC particles sit at E = 0.] The picture for Bose particles already suggests the answer for μ.

A more sophisticated argument says: we have some number of particles (NC) in the E = 0 condensate, and some number (N − NC) excited. If the two subsystems are in equilibrium, then their chemical potentials are equal (particles can exchange). How do we find the optimal number NC? It is achieved when the corresponding free energy is minimised, that is, when
  dFC/dNC = 0
… which conveniently happens to be the definition of μC = 0.
Conclusion: as soon as the Bose condensate appears, μ ≈ 0.

Michaelmas 2012 Part II Thermal & Statistical  Bose condensate

The total (fixed) number of particles is determined by the familiar constraint:
  N = Σ n(εk) = ∫ (d³x d³p/(2πħ)³) n(E)
In 3D, we have for bosons:
  N = (V/4π²)(2m/ħ²)^(3/2) ∫_0^∞ √E / (e^(β[E−μ]) − 1) dE
The area under each curve should be equal to N, but at T → 0 the area can't be preserved… All particles go into the ground state:
  n(E) = N for E = 0,  hence  1/(e^(−βμ) − 1) = N
  e^(−βμ) = 1 + 1/N,  so  −βμ ≈ 1/N,  μ = −kB T/N → 0

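The final step can be made exact and checked numerically. A sketch (units kB = 1, arbitrary large N): solving the ground-state constraint exactly gives μ = −kB T ln(1 + 1/N), which tends to −kB T/N for large N.

```python
import math

# Sketch (k_B = 1): the ground-state occupation constraint 1/(e^{-mu/kT} - 1) = N
# gives mu = -kT ln(1 + 1/N), which approaches -kT/N for large N.
kT, N = 1.0, 1.0e6
mu = -kT * math.log(1.0 + 1.0 / N)

occupation = 1.0 / (math.exp(-mu / kT) - 1.0)   # recovers N
print(mu, -kT / N, occupation)
```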
Summary so far……

Quantum vs. classical:  Nλ³/V = (N/V)(2πħ²/m kB T)^(3/2)  (<< 1 classical, >> 1 quantum)

Fermi particles:
  n(E) = 1/(e^((E−εF)/kB T) + 1);  as T → 0 this is a step at εF ≈ 4.7 (ħ²/m)(N/V)^(2/3)

Bose particles:
  n(E) = 1/(e^((E−μ)/kB T) − 1);  condensate once (N/V)(2πħ²/m kB Tc)^(3/2) ≈ 1
  μ ≈ 0  (in fact μ ≈ −kB T/N)

Michaelmas 2012 Part II Thermal & Statistical  Chemical potential, again

In the classical regime, the ideal gas of N particles has μ = kB T ln(Nλ³/V), at Nλ³/V small.

At Nλ³/V large (≈ 1 and beyond) we are in the quantum regime, so Fermi and Bose systems are different:
  μF = εF ≈ 4.7 (ħ²/m)(N/V)^(2/3)
  μB ≈ 0, due to the Bose condensation into the state with E = 0

This defines the condensation temperature:
  kB TC ≈ (2πħ²/m)(N/V)^(2/3)
Can you estimate the number of particles in the condensate, NC, at any given T?

Michaelmas 2012 Part II Thermal & Statistical  Ideal Fermi gas at low-T

By doing the "grand partition function" trick (for additive ε(n) = nεk) we got the probability to occupy a level εk:
  n(εk) = 1/(e^(β[εk−μ]) + 1)
This is essentially the probability P(ε, N), to be used in place of our earlier forms of statistical probability, which was normalised by the corresponding partition function. Now we can find the averages, e.g. in 3D:
  U = Σ_{microstates {k}} εk n(εk) = ∫ (d³x d³p/(2πħ)³) E/(e^(β[E−εF]) + 1)

At T = 0:
  U = (V/4π²)(2m/ħ²)^(3/2) ∫_0^∞ E^(3/2)/(e^(β[E−μ]) + 1) dE = (V/4π²)(2m/ħ²)^(3/2) ∫_0^{εF} E^(3/2) dE = (V/4π²)(2m/ħ²)^(3/2) (2/5) εF^(5/2)

With εF ≈ (ħ²/2m)(6π² N/V)^(2/3) for this density of states, this gives
  U = (3/5) N εF,  so  U/V = (3/5)(6π²)^(2/3) (ħ²/2m)(N/V)^(5/3)

Michaelmas 2012 Part II Thermal & Statistical  Ideal Fermi gas at low-T

At low, but nonzero, temperature the problem is much more complicated:
  U = (V/4π²)(2m/ħ²)^(3/2) ∫_0^∞ E^(3/2)/(e^(β[E−μ]) + 1) dE

Low-T series expansion of "Fermi integrals":
  ∫_0^∞ f(E)/(e^(β[E−μ]) + 1) dE ≈ ∫_0^{εF} f(E) dE + (π²/6)(kB T)² f'(εF) + … {the next term is O((kB T)⁴ f'''(εF))}
Note that this works the same way for any g(ε), i.e. in any dimension.

For the mean internal energy, in 3 dimensions, f(E) = (V/4π²)(2m/ħ²)^(3/2) E^(3/2), so
  U = U₀ + (V/16)(2m/ħ²)^(3/2) εF^(1/2) (kB T)² ≡ U₀ + (π²/4) g(εF)(kB T)²
where g(ε) = (V/4π²)(2m/ħ²)^(3/2) √ε is the density of states.

This is U(T, V, N), so:
  CV = (∂U/∂T)_{V,N} = (π²/2) g(εF) kB² T

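The Sommerfeld-type expansion above is easy to verify numerically. A sketch (arbitrary units, μ = 1): brute-force quadrature of the Fermi integral with f(E) = E^(3/2) agrees with ∫_0^μ f dE + (π²/6)(kB T)² f'(μ) up to O((kB T/μ)⁴) corrections.

```python
import math

# Sketch: numerical check of the low-T "Fermi integral" expansion for f(E) = E^{3/2}:
# I(T) ≈ (2/5) mu^{5/2} + (pi^2/6)(kT)^2 * (3/2) mu^{1/2}.  Units are arbitrary.
mu, kT = 1.0, 0.02

def integrand(E):
    return E**1.5 / (math.exp((E - mu) / kT) + 1.0)

# brute-force midpoint rule, cut off well above mu where the integrand has died away
steps = 200000
Emax = mu + 40.0 * kT
dE = Emax / steps
exact = sum(integrand((i + 0.5) * dE) * dE for i in range(steps))

sommerfeld = 0.4 * mu**2.5 + (math.pi**2 / 6.0) * kT**2 * 1.5 * mu**0.5
print(exact, sommerfeld)   # agree to O((kT/mu)^4)
```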
Michaelmas 2012 Part II Thermal & Statistical  Ideal Fermi gas at low-T

But what if we need any other thermodynamic quantity, not just CV, which is all we can get from U(T, V, N)?

The full grand partition function is Ξ = Π_{microstates {k}} Ξk. Therefore the grand potential:
  Φ = Σk Φk = −kB T ∫ (d³x d³p/(2πħ)³) ln(1 + e^(−β[E−μ])) = −kB T ∫_0^∞ (V/4π²)(2m/ħ²)^(3/2) E^(1/2) ln(1 + e^(−β[E−μ])) dE
(the factor (V/4π²)(2m/ħ²)^(3/2) E^(1/2) is the density of states g(E))
Integration by parts →
  Φ = −(2/3)(V/4π²)(2m/ħ²)^(3/2) ∫_0^∞ E^(3/2)/(e^(β[E−μ]) + 1) dE ≡ −(2/3) U

Now we can legitimately differentiate the proper thermodynamic potential Φ(T, V, μ):
  Pressure:  p = −∂Φ/∂V = pF(T=0) + const · (2m/ħ²)(N/V)^(1/3) (kB T)²
  Entropy:  S = −∂Φ/∂T = const · (2m/ħ²)^(3/2) kB² T

Michaelmas 2012 Part II Thermal & Statistical  Ideal Bose gas at low-T

The principle is exactly the same: we either find the average energy U(T, N), from which the heat capacity follows, or go for the full grand potential Φ(T, μ) and the rest of thermodynamics. The good (or bad) news is that μ = 0.
  U = Σ_{microstates {k}} εk n(εk) = (V/4π²)(2m/ħ²)^(3/2) ∫_0^∞ E^(3/2)/(e^(βE) − 1) dE   in 3D
Non-dimensional substitution x = βE →
  U = (V/4π²)(2m/ħ²)^(3/2) (kB T)^(5/2) ∫_0^∞ x^(3/2)/(e^x − 1) dx = const · T^(5/2)

  Φ = Σk Φk = kB T ∫_0^∞ (V/4π²)(2m/ħ²)^(3/2) E^(1/2) ln(1 − e^(−βE)) dE
Integrate by parts →
  Φ = −(2/3)(V/4π²)(2m/ħ²)^(3/2) ∫_0^∞ E^(3/2)/(e^(βE) − 1) dE ≡ −(2/3) U

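The dimensionless integral that fixes the constant in U = const · T^(5/2) has a known closed form, Γ(5/2) ζ(5/2). A numerical sketch confirms it:

```python
import math

# Sketch: the dimensionless Bose integral ∫_0^∞ x^{3/2}/(e^x - 1) dx
# equals Gamma(5/2) * zeta(5/2), so U ∝ T^{5/2} with a known prefactor.
def bose_integral(power=1.5, steps=200000, xmax=60.0):
    dx = xmax / steps
    return sum(((i + 0.5) * dx) ** power / math.expm1((i + 0.5) * dx) * dx
               for i in range(steps))

# Gamma(5/2) * zeta(5/2), with zeta approximated by a partial sum
gamma_zeta = math.gamma(2.5) * sum(1.0 / k**2.5 for k in range(1, 2000))
print(bose_integral(), gamma_zeta)   # the two values agree
```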
Be fluent at……

Quantum regime:  Nλ³/V = (N/V)(2πħ²/m kB T)^(3/2) >> 1

Fermi particles at low (non-zero) temperature:
  n(E) = 1/(e^((E−εF)/kB T) + 1)
How to "do" Fermi integrals… (mean energy, Φ, pressure)

Bose particles at low temperature:
  n(E) = 1/(e^((E−μ)/kB T) − 1)
How to "do" Bose integrals (μ = 0)… (mean energy, etc.)

Michaelmas 2012 Part II Thermal & Statistical  Photons

Photons (E = ħω) are Bose particles of a special type. Since their mass = 0, their "number of particles" is not fixed, but varies with temperature. E.g. the E = 0 condensate has no particles at all.
  n(E) = 1/(e^(ħω/kB T) − 1)
Can we find the mean number of photons of a selected "colour", given by a fixed ω? Instead of E = p²/2m there is now a different relation between energy and momentum:  p = ħk = ħω/c
  N = Σ n(εk) = ∫ (d³x d³p/(2πħ)³) n(ħω) = V ∫ (2 · 4πω² dω/(2π)³c³) · 1/(e^(ħω/kB T) − 1)
(the factor 2 counts the two polarisations)
The spectral density n(ω) is the number of particles at a given ω:  N = ∫ n(ω) dω

Michaelmas 2012 Part II Thermal & Statistical  Stefan-Boltzmann law

Photons in thermal equilibrium (emitted by a hot body) have a very characteristic power spectrum, which originally led Planck to suggest E = ħω. Now we can understand why!
Let us find the mean energy of photons (with p = ħk = ħω/c, ω = 2πc/λ):
  U = V ∫ (2 · 4πω² dω/(2π)³c³) · ħω/(e^(ħω/kB T) − 1) = V ∫ u(ω) dω
The spectral density of energy is:
  u(ω) = (1/π²c³) · ħω³/(e^(ħω/kB T) − 1)
The full energy:
  U/V = (1/π²c³ħ³)(kB T)⁴ ∫_0^∞ x³/(e^x − 1) dx = (π²/15)(kB T)⁴/(ħc)³
Stefan-Boltzmann law: U = σT⁴

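The T⁴ law rests on the dimensionless integral ∫_0^∞ x³/(e^x − 1) dx = π⁴/15, which a short quadrature sketch confirms:

```python
import math

# Sketch: numerical check that ∫_0^∞ x^3/(e^x - 1) dx = pi^4/15,
# the dimensionless integral behind the Stefan-Boltzmann T^4 law.
steps, xmax = 200000, 60.0
dx = xmax / steps
integral = sum(((i + 0.5) * dx) ** 3 / math.expm1((i + 0.5) * dx) * dx
               for i in range(steps))
target = math.pi**4 / 15.0
print(integral, target)   # the two values agree
```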
Michaelmas 2012 Part II Thermal & Statistical  Other excitations

All elementary excitations are Bose particles with μ = 0, zero rest mass (so E = ħω) and a dispersion relation between the energy and the momentum, ω = ω(k).
  n(E) = 1/(e^(βħω) − 1)
  N = Σ n(εk) = ∫ (d³x d³p/(2πħ)³) n(ħω)
In each case there is a need to find the right form of the density of states g(E) dE, which is now g(ω) dω:
  U = Σ εk n(εk) = V ∫ (4πk² dk/(2π)³) · ħω/(e^(ħω/kB T) − 1)
[Figure: dispersion ω(k) rising to a maximum ωmax at the zone edge.]

Michaelmas 2012 Part II Thermal & Statistical  Phonons and Debye model

Normal modes of vibration in a lattice follow the dispersion relation (lowest energy mode)
  ω = 2ω₀ sin(ka/2)
where a is the lattice spacing and ω₀ the frequency of each bond. The Debye model preserves the cutoff, but ignores the slowing of the wave near the zone edge (ω ≈ ck up to ωD).

3 modes per atom:  3N = Σ_{all states} 1 = (3V/2π²c³) ∫_0^{ωD} ω² dω  ⇒  ωD³ = 6π²c³ N/V

low-T:  U = (3Vħ/2π²c³) ∫_0^∞ ω³ dω/(e^(βħω) − 1) = (3Vħ/2π²c³)(kB T/ħ)⁴ ∫_0^∞ y³ dy/(e^y − 1)  ⇒  CV ∝ T³

high-T:  U = (3Vħ/2π²c³) ∫_0^{ωD} ω³ dω/(e^(βħω) − 1) ≈ (3Vħ/2π²c³) ∫_0^{ωD} ω³ dω/(βħω + …) = (V/2π²c³) ωD³ · kB T

Michaelmas 2012 Part II Thermal & Statistical  Phonons and Debye model

At high T one can re-express the integral via the 3N constraint, giving the result:
  high-T:  U ≈ 3N kB T
This is the energy of 3N simple harmonic oscillators (see lecture 5). There we had:
  Z1 = 1/(2 sinh(βħω₀/2))
  U = (ħω₀/2) coth(βħω₀/2)
  CV = (ħω₀)²/(4kB T²) · 1/sinh²(βħω₀/2)
The low-T limit (quantum oscillator) is very different…

So at high T phonons behave exactly the same as classical oscillators, but at low T the major difference is due to the continuous spectrum (no gaps!):
  Debye "gas":  CV ∝ T³
  Einstein "energy gap":  CV ∝ e^(−ħω₀/kB T)

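The Einstein-oscillator heat capacity quoted above can be evaluated directly. A sketch (units kB = ħ = 1, bond frequency ω₀ = 1): CV per oscillator approaches the classical value 1 at high T and is exponentially small at low T, in contrast to the Debye T³ law.

```python
import math

# Sketch (k_B = hbar = 1, omega_0 = 1): Einstein heat capacity per oscillator,
# C_V = (hw0/2kT)^2 / sinh^2(hw0/2kT), writing x = omega_0/(2T).
def c_einstein(T, w0=1.0):
    x = w0 / (2.0 * T)
    return (x / math.sinh(x)) ** 2

print(c_einstein(100.0))   # high T: close to the classical value 1
print(c_einstein(0.02))    # low T: ~ e^{-w0/kT}, exponentially suppressed
```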