
Chemistry 223: Entropy Calculations ©David Ronis, McGill University

We have shown that the entropy, defined through its differential as
$$ dS = \frac{\bar{d}Q_{Reversible}}{T}, \qquad(1) $$
is an extensive (because $Q$ is) state function with units of energy/K; hence,
$$ \oint \frac{\bar{d}Q_{Reversible}}{T} = 0. \qquad(2) $$
Since entropy is a state function (according to Eq. (2)), we can use any reversible path to calculate it, and are guaranteed to get the same answer (something you demonstrated in homework).

How do you calculate entropy changes? Clearly, from Eq. (1),
$$ \Delta S_{A\to B} = \int_A^B \frac{\bar{d}Q_{Reversible}}{T}, \qquad(3) $$
but how do you use this? The first step is to find a reversible path connecting your initial and final states. In some cases, this is almost all you have to do. For example, suppose the system is at constant volume. We know that $\bar{d}Q_V = C_V\,dT$, which when used in Eq. (3) shows that

$$ \Delta S_{T_i\to T_f} = S(T_f, V, N) - S(T_i, V, N) = \int_{T_i}^{T_f} \frac{C_V(T, V, N)}{T}\,dT \approx C_V \ln\!\left(\frac{T_f}{T_i}\right), \qquad(4a) $$
where the last approximation follows by assuming that $C_V$ is independent of temperature (at least approximately).

Similarly, if instead the pressure is held constant, we have $\bar{d}Q_P = C_P\,dT$ and Eq. (3) becomes

$$ \Delta S_{T_i\to T_f} = S(T_f, P, N) - S(T_i, P, N) = \int_{T_i}^{T_f} \frac{C_P(T, P, N)}{T}\,dT \approx C_P \ln\!\left(\frac{T_f}{T_i}\right), \qquad(4b) $$

where we have again treated $C_P$ as approximately constant to get the last relation. Note that, in general, $\Delta S_{T_i\to T_f}$ is not the same in constant-volume and constant-pressure processes.
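To see Eqs. (4a) and (4b) in action, here is a minimal numerical sketch (not part of the original notes) for one mole of a monatomic ideal gas heated from 300 K to 600 K; the gas, the temperatures, and the constant heat capacities $C_V = \frac{3}{2}R$ and $C_P = \frac{5}{2}R$ are illustrative choices.

```python
# Sketch: Delta S for heating 1 mol of a monatomic ideal gas from 300 K to
# 600 K, using Eqs. (4a) and (4b) with constant heat capacities.
# The gas, the temperatures, and C_V = 3R/2, C_P = 5R/2 are illustrative
# choices, not values taken from the notes.
import math

R = 8.314          # J/(K mol), gas constant
T_i, T_f = 300.0, 600.0
C_V = 1.5 * R      # monatomic ideal gas, constant volume
C_P = 2.5 * R      # monatomic ideal gas, constant pressure

dS_V = C_V * math.log(T_f / T_i)   # Eq. (4a)
dS_P = C_P * math.log(T_f / T_i)   # Eq. (4b)

print(f"Constant V: dS = {dS_V:.2f} J/(K mol)")   # ~8.6 J/(K mol)
print(f"Constant P: dS = {dS_P:.2f} J/(K mol)")   # ~14.4 J/(K mol)
```

The constant-pressure change is larger because extra heat must be supplied to do the work of expansion.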

What happens if there is a phase change somewhere between $T_i$ and $T_f$, e.g., at $T_0$? At the phase transition, heat (e.g., the latent heat of fusion, sublimation, or vaporization) is added to the system, with no resulting change in temperature, until all the material is converted from one phase to the other. At constant pressure, the contribution to the entropy is just $\Delta H_{transition}/T_0$ (what is it for constant volume?) and we can write

$$ \Delta S_{T_i\to T_f} = \int_{T_i}^{T_0} \frac{C_{P,i}(T, P, N)}{T}\,dT + \frac{\Delta H_{transition}}{T_0} + \int_{T_0}^{T_f} \frac{C_{P,f}(T, P, N)}{T}\,dT, \qquad(5) $$

where $C_{P,i/f}$ is the heat capacity in the initial/final phase. For example, if we were to warm one mole of ice from −10°C (263 K) to 10°C (283 K) at 1 atm, Eq. (5) gives (treating the heat capacities as constants):


 273 ∆H  283 ∆S = C (ice)ln + fus + C (water)ln ,(6) P  263 273 P  273 where ∆H fus is the molar heat of fusion of water.Similarly,warming 1 mole of water from 90C (363K) to 110C (383K) gives  373 ∆H  383 ∆S = C (water)ln + vap + C (steam)ln ,(7) P  363 373 P  373

where $\Delta H_{vap}$ is the latent heat of vaporization for water. Note that there is an empirical relation, known as Trouton's rule, which asserts that for many liquids, $\Delta S_{vap} = \Delta H_{vap}/T_b \approx 90\ \mathrm{J\,K^{-1}\,mol^{-1}}$. There are, however, many examples where the rule fails (e.g., water, alcohols, amines). (As a review exercise, calculate the enthalpy changes for the two examples given in Eqs. (6) and (7).) Matters become somewhat more complicated if neither pressure nor volume is held constant, but we will soon have the tools needed to handle the general case.
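As a rough numerical check of Eqs. (6) and (7) (not worked out in the notes), the sketch below plugs in approximate literature values for the heat capacities and latent heats of water; consult a data table for accurate numbers.

```python
# Sketch: plugging approximate literature values into Eqs. (6) and (7).
# The heat capacities and latent heats below are rough textbook numbers,
# not values quoted in these notes; use tabulated data for real work.
import math

CP_ice, CP_water, CP_steam = 38.0, 75.3, 33.6   # J/(K mol), approximate
dH_fus, dH_vap = 6010.0, 40660.0                # J/mol, approximate

# Eq. (6): ice at 263 K -> water at 283 K, 1 atm
dS_melt = (CP_ice * math.log(273 / 263)
           + dH_fus / 273
           + CP_water * math.log(283 / 273))

# Eq. (7): water at 363 K -> steam at 383 K, 1 atm
dS_boil = (CP_water * math.log(373 / 363)
           + dH_vap / 373
           + CP_steam * math.log(383 / 373))

print(f"Eq. (6): dS ~ {dS_melt:.1f} J/(K mol)")   # roughly 26 J/(K mol)
print(f"Eq. (7): dS ~ {dS_boil:.1f} J/(K mol)")   # roughly 112 J/(K mol)

# Trouton's rule check for water: dS_vap = dH_vap / T_b ~ 109 J/(K mol),
# noticeably larger than the ~90 J/(K mol) rule of thumb noted above.
print(f"dS_vap = {dH_vap / 373:.0f} J/(K mol)")
```

Most of the entropy change in each case comes from the phase-transition term, and the resulting $\Delta H_{vap}/T_b \approx 109\ \mathrm{J\,K^{-1}\,mol^{-1}}$ shows water overshooting Trouton's rule, consistent with its listing among the exceptions above.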

Chemistry 223: The Third Law of Thermodynamics ©David Ronis, McGill University

The preceding section has shown how to compute entropy changes in much the same way as we did for enthalpy and energy changes. What happens in a chemical reaction? Since $S$ is a state function, we can imagine the reaction proceeding by way of the constituent elements in their standard states of aggregation, and, exactly as was done for the enthalpy,

$$ \Delta S = \Delta S_f(\mathrm{products}) - \Delta S_f(\mathrm{reactants}). \qquad(1) $$
How should we define the standard state? You might think that it should be defined exactly as it was for the enthalpy; i.e., pure elements at 1 atm in their standard states of aggregation are arbitrarily assigned zero entropy of formation. While there is nothing wrong with this, it turns out that experiments performed in the early 20th century suggest another choice. In 1902, T. W. Richards found, for a wide class of reactions, that the entropy of reaction approached zero as the temperature approached absolute zero. In 1906, using Richards' data, Nernst argued that this meant that all materials have the same entropy at absolute zero (which can arbitrarily be assigned to be zero). This was summarized by Planck in 1912 in what is now known as the Third Law of Thermodynamics: The entropy of all perfect crystalline solids at absolute zero is zero.

There is a good microscopic reason for this, albeit one that is beyond the level of this course. As many of you may have seen in your general chemistry course, there is a relationship between entropy and randomness; specifically, cf. the section on entropy of mixing in ideal systems, $S = k_B \ln \Omega$, where $\Omega$ is the number of ways of realizing the system. If there is only one way to realize the system, the entropy is zero, and this turns out to be the case for perfect crystals at absolute zero. The third law leads to the introduction of an absolute entropy scale, where all entropies (of perfect crystalline states) are zero at absolute zero; hence, at any finite temperature


$$ S(T, P, N) = \int_0^T \frac{\bar{d}Q_P}{T} = \int_0^T \frac{C_P}{T}\,dT, \qquad(2) $$

where a similar expression holds in terms of $C_V$ for constant-volume processes. The last expression must be modified slightly if phase transitions occur between 0 K and $T$ (as above). In practice, this means that entropies of formation of the pure elements at 298.15 K and 1 atm are NOT zero!

You might think that the preceding discussion is just an argument about some convention. In part you would be correct; either convention would give identical answers in $\Delta S$ calculations. However, the Third Law does have at least one important physical consequence; namely, that heat capacities must vanish as absolute zero is approached. If this weren't the case, then the last integral in Eq. (2) would diverge logarithmically, and the entropy would be infinite, not zero! This is indeed the case experimentally, although sometimes extraordinarily low temperatures must be attained to see the heat capacities vanish. One troubling result is our prediction from the kinetic theory of gases, which gave $C_V = \frac{3}{2}R$ and $C_P = \frac{5}{2}R$, independent of temperature and clearly nonzero. Again, the detailed answer lies beyond this course, but in short, the third law is intimately bound to quantum mechanics and energy quantization, something our simple kinetic theory model had completely ignored.
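To illustrate why a vanishing heat capacity matters for Eq. (2), here is a sketch (not from the notes) that integrates $C_P/T$ numerically, once with a Debye-like $C_P = aT^3$ and once with a constant $C_P$; the coefficient $a$ and the temperature range are made-up illustrative numbers.

```python
# Sketch: why C_P must vanish as T -> 0 for the integral in Eq. (2) to
# converge. Assumes a Debye-like C_P = a*T**3 at low temperature; the
# coefficient a and the temperatures are made-up illustrative numbers.
import math

a = 1.0e-3          # J/(K^4 mol), hypothetical Debye coefficient
T_top = 20.0        # K, integrate from (nearly) 0 K up to this temperature

def S_numeric(C_of_T, T_lo, T_hi, n=100000):
    """Trapezoidal estimate of the integral of C(T)/T from T_lo to T_hi."""
    h = (T_hi - T_lo) / n
    total = 0.5 * (C_of_T(T_lo) / T_lo + C_of_T(T_hi) / T_hi)
    for i in range(1, n):
        T = T_lo + i * h
        total += C_of_T(T) / T
    return total * h

# Debye-like heat capacity: the integral stays finite as the lower limit -> 0
for T_lo in (1.0, 0.1, 0.01):
    S = S_numeric(lambda T: a * T**3, T_lo, T_top)
    print(f"C = a*T^3, from {T_lo:5} K: S = {S:.4f} J/(K mol)")
# analytic answer: a*T_top**3 / 3 ~ 2.667 J/(K mol)

# Constant heat capacity: the integral grows like -ln(T_lo) and diverges
for T_lo in (1.0, 0.1, 0.01):
    S = S_numeric(lambda T: 12.47, T_lo, T_top)
    print(f"C = const,  from {T_lo:5} K: S = {S:.2f} J/(K mol)")
```

With $C_P \propto T^3$ the integral settles to $aT^3/3$ as the lower limit approaches zero, while a constant heat capacity makes it grow without bound, which is the logarithmic divergence mentioned above.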
