
What if entropy were dimensionless?

Harvey S. Leff
Department of Physics, California State Polytechnic University, Pomona, 3801 West Temple Avenue, Pomona, California 91768

(Received 23 July 1998; accepted 2 March 1999)

One of entropy's puzzling aspects is its dimensions of energy/temperature. A review of thermodynamics and statistical mechanics leads to six conclusions: (1) Entropy's dimensions are linked to the definition of the Kelvin temperature scale. (2) Entropy can be defined to be dimensionless when temperature T is defined as an energy (dubbed tempergy). (3) Dimensionless entropy per particle typically is between 0 and ~80. Its value facilitates comparisons among materials and estimates of the number of accessible states. (4) Using dimensionless entropy and tempergy, Boltzmann's constant k is unnecessary. (5) Tempergy, kT, does not generally represent a stored system energy. (6) When the (extensive) heat capacity C >> k, tempergy is the energy transfer required to increase the dimensionless entropy by unity. (c) 1999 American Association of Physics Teachers.

I. INTRODUCTION

Entropy has many faces. It has been interpreted as a measure of disorder,1 the unavailability of work,2 the extent of energy spreading,3 and the number of accessible microstates4 in macroscopic physical systems. While each of these interpretations has a logical basis, it is unclear why any of them should have the dimensions of energy per unit temperature. Indeed the common entropy unit, J K^-1, is as mysterious as entropy itself. The goal here is to answer the following questions: (i) Why does entropy have the dimensions of energy per unit temperature? (ii) Are these dimensions necessary and/or physically significant? (iii) How would thermal physics differ if entropy were defined to be dimensionless and temperature were defined as an energy? (iv) What range of values does dimensionless entropy per particle have? (v) If temperature is defined as an energy, called tempergy, does it have a meaningful physical interpretation? An examination of thermal physics books5-10 shows that although some authors define entropy as a dimensionless quantity and some define temperature as an energy, no in-depth discussion of questions (i)-(v) seems to exist. What follows is in part a review and synthesis of ideas that are scattered throughout textbooks, with the goal of filling this gap.

In Sec. II, we review common definitions of entropy in thermodynamics, statistical mechanics, and information theory. In Sec. III we observe that in each case one can replace S by a dimensionless entropy S_d and that, in a sense, entropy is inherently dimensionless. We show how handbook entropy values can be converted to dimensionless entropies per particle, assess the range and meaning of the latter numerical values, and use them to estimate the number of accessible states. Defining entropy as a dimensionless quantity leads naturally to a definition of temperature as an energy (tempergy). In Sec. IV, we argue that although in very special cases tempergy is a good indicator of the internal energy per particle, per degree of freedom, or per elementary excitation, no such property holds in general. Thus tempergy cannot be interpreted generally as an identifiable stored energy. In Sec. V we show that under typical laboratory conditions, tempergy is the energy transfer needed to increase a system's dimensionless entropy by unity. In addition, tempergy shares with temperature the property of determining the internal energy (quantum statistical average energy) and thus the energy spectrum region where the system spends most time executing its "entropy dance." Section VI contains a recapitulation and main conclusions.

II. DEFINITIONS OF ENTROPY AND THE "ENTROPY DANCE"

Clausius defined the entropy change dS of a system at temperature T in terms of an infinitesimal reversible heat process during which the system gains energy dQ_rev. His famous entropy algorithm is11

    dS = dQ_rev / T.   (1)

Extending this process along a finite path with a finite energy transfer, integration of (1) leads to an entropy difference S_final - S_initial. This procedure is commonly used as a calculation tool to obtain path-independent entropy differences. It is evident from Eq. (1) that S has the dimensions of energy per unit temperature (common units are J/K).

Notably, Clausius used the Kelvin temperature T in (1). This can be traced to the definition of the Kelvin temperature scale and the following property of dilute gases. If an N particle dilute gas has pressure p and volume V, then

    pV/N = tau = function of temperature.   (2)

The Kelvin temperature T is defined to be proportional to tau and to have the value T_3 = 273.16 K at the triple point of H2O. It follows that

    tau = kT,   (3)

where the constant of proportionality, k = 1.3807 x 10^-23 J K^-1, is Boltzmann's constant. Extrapolation of (2) to low pressures shows that T = t + 273.15, where t is the Celsius temperature. The above-mentioned specification of T_3 assures that the Kelvin and Celsius scales have equal degree sizes. An interpretation of Eq. (3) is that k is a conversion factor that links temperature and energy. The product kT, which is ubiquitous in thermal physics, supplies correct dimensions whenever an energy, heat capacity, or pressure is expressed as a function of T in kelvins.

1114 Am. J. Phys. 67 (12), December 1999 (c) 1999 American Association of Physics Teachers

Modern-day values of Boltzmann's constant are obtained from data on the universal gas constant R = 8.3145 J K^-1 mol^-1, evaluated from dilute gas data, and Avogadro's number, N_A.12-15 That is, combining (2) and (3) with the corresponding standard molar equation pV = nRT, where n is the mole number, gives k = R/(N/n) = R/N_A. We note that k and R are two versions of the same constant using different unit systems.

Given the definition of temperature above and the fact that all reversible Carnot cycles operating between fixed reservoir temperatures T_c and T_h have efficiency 1 - T_c/T_h, one finds that for any such Carnot cycle,

    Q_h/T_h + Q_c/T_c = 0,   (4)

where (Q_h, Q_c) are the energies gained by the working fluid at (T_h, T_c). In many textbooks, these properties are used to obtain the Clausius inequality, sum_i Q_i/T_i <= 0, for a cycle with segments i = 1, 2, ..., and this inequality is then used to obtain the entropy algorithm, Eq. (1).16,17

In Callen's modern formulation of classical thermodynamics,18 the entropy function S is postulated to exist for equilibrium thermodynamic states. It is specified to be extensive, additive over constituent subsystems, and to vanish in a state for which (dU/dS)_{V,N} = 0. Here U is the internal energy and the derivative is taken holding particle numbers and external parameters fixed. The dimensions of S come from the requirement that the latter derivative be identified as the Kelvin temperature; i.e.,

    (dU/dS)_{V,N} = T,   (5)

and this gives S the dimensions energy/temperature. Again the dimensions of S are linked to the definition of temperature.

In statistical physics, entropy is usually defined using the Boltzmann form,

    S = k ln W.   (6)

The factor k is Boltzmann's constant, which was actually introduced into Eq. (6) by Planck. Using radiation data, Planck obtained the first numerical value of k.19 In classical statistical mechanics, W is proportional to the total phase space volume accessible to the system. This volume is in the 6N-dimensional phase space consisting of the position and momentum coordinates for the N particles. If there are internal degrees of freedom, say, for diatomic molecules that can rotate and vibrate, the dimensionality of the phase space is increased accordingly. In the quantum framework, W represents the number of quantum states that are accessible to the system. Evaluation of W to obtain S is a main task of the microcanonical ensemble formalism of statistical mechanics.

Several things should be kept in mind. First, the term accessible means accessible under specified experimental conditions, e.g., the system's temperature is fixed, which implies that the system's quantum state is likely to be in a specific region of the energy spectrum (we discuss this further in Sec. V). Second, if states are accessible, this implies that the system can be in any of them. Classically, this means that the system's phase point, (r, p), which consists, say, of 6N position and momentum components, moves through the accessible phase space as time increases. Quantum mechanically, the system's state can make transitions between the accessible states. With this view, S has the interpretation of being a "spreading" function.3 This "spreading" can usefully be envisioned as a dance over the system's accessible states. When the dance involves more states (or phase space volume) the entropy is greater; thus we call it an entropy dance. Last, but not least, we observe that although the factor k in Eq. (6) gives S its conventional dimensions of energy/temperature, S/k contains all the essential physics. In this sense, entropy is inherently a dimensionless entity.

In information theory, the missing information, MI, is defined as

    MI = -c sum_i P_i ln P_i,   (7)

where P_i is the probability of finding the system in state i, with energy eigenvalue E_i, and c is an arbitrary multiplicative constant.20 Equilibrium statistical mechanics at constant temperature can be obtained by maximizing the function MI under the constraint that the average energy is fixed.21 This procedure leads to the conclusion that if one makes the identifications S = MI and c = k = Boltzmann's constant, then consistency with classical thermodynamics is obtained, and

    P_i = Z^-1 exp(-E_i/tau)  with  Z = sum_i exp(-E_i/tau).   (8)

Z is the canonical partition function of statistical mechanics. Notably, in both P_i and Z, temperature occurs only through the product tau = kT. As in the microcanonical formalism, entropy's dimensions come solely from the factor k. Here the essential physics is contained within the set {P_i}. In this sense, the canonical ensemble entropy is inherently dimensionless.

III. DIMENSIONLESS ENTROPY AND TEMPERGY

The arguments above suggest that the tempergy, tau = kT, has significance. Nevertheless, it is the temperature T, and not tau, that is commonly used as a thermodynamic variable.22 Recall that tau is traditionally written as kT in order to obtain an absolute temperature scale with a degree size equal to the Celsius degree. Had the temperature been defined as tau, then T_h and T_c in Eq. (4) would have been replaced by tau_h and tau_c, respectively. Similarly, the Clausius algorithm, (1), would have taken the form

    dS_d = dQ_rev/(kT) = dQ_rev/tau,   (9)

where S_d is the dimensionless entropy,

    S_d = S/k.   (10)

Use of (10) in Eq. (6) gives the dimensionless Boltzmann form

    S_d = ln W.   (11)

It might seem that using tau rather than T would upset thermodynamics as we know it, but this is not so because it is possible to view the kelvin as an energy unit,23 i.e., 1 K = 1.3807 x 10^-23 J. It should be kept in mind, however, that although tempergy and internal energy have the same dimensions, they are very different entities, and we show in Sec. IV that in general, tempergy cannot be related to a stored system energy. Had the previously described steps been followed historically, entropy would have been dimensionless by definition and we might never have encountered the Kelvin temperature scale or Boltzmann's constant.
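The canonical distribution of Eq. (8), and the dimensionless entropy it yields, can be illustrated with a toy two-level system. The point is that only the tempergy tau (an energy) ever enters; k and T never appear separately. The energies and code below are illustrative assumptions of ours, not from the paper:

```python
import math

def canonical_probs(energies, tau):
    """Eq. (8): P_i = exp(-E_i/tau)/Z with Z = sum_i exp(-E_i/tau).
    Temperature enters only through the tempergy tau (same units as E_i)."""
    weights = [math.exp(-E / tau) for E in energies]
    Z = sum(weights)
    return [w / Z for w in weights]

# Two-level system with energies 0 and epsilon, evaluated at tau = epsilon:
P = canonical_probs([0.0, 1.0], tau=1.0)
S_d = -sum(p * math.log(p) for p in P)  # dimensionless entropy S/k, per Eq. (10)
```

Note that S_d comes out as a pure number, with no unit conversion needed.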

Table I. Typical dimensionless entropy ranges for solids, liquids, and gases for T = 298.15 K and pressure = 0.101 MPa. Higher (but atypical) values exist.

    Type of matter                     Typical range for sigma
    Monatomic solids                   0.3-10
    Monatomic gases                    15-25
    Diatomic gases                     15-30
    Monatomic and polyatomic solids    10-60
    Polyatomic liquids                 10-80
    Polyatomic gases                   20-60

It is instructive to examine the intensive dimensionless entropy per particle,24,25

    sigma = S_d/N.   (12)

In what follows, unless stated otherwise, we focus on one-component, single-phase systems. What are typical values of sigma? Consider graphite, with entropy 5.7 J K^-1 mol^-1 at standard temperature and pressure. Equation (12) implies the dimensionless entropy per particle sigma = 0.68. Similarly, one finds that diamond has sigma = 0.29 while lead has sigma = 7.79. At T = 1 K, solid silver has sigma = 8.5 x 10^-5, which seems consistent with sigma approaching zero for T (or tau) -> 0.26 It turns out that the dimensionless entropies per particle of monatomic solids are typically < 10, as illustrated in Fig. 1(a).

For monatomic dilute gases in the classical domain, the Sackur-Tetrode formula for entropy27 implies that the dimensionless entropy per particle is

    sigma = -1.16 + 1.5 ln(M_a) + 2.5 ln(T)   (13)

at pressure 0.101 MPa, where M_a is the atomic mass in atomic mass units, and T is the Kelvin temperature. At T = 298.15 K, this implies sigma = 15.2 for helium gas and sigma = 21.2 for the relatively massive radon gas. Figure 1(a) shows that at standard temperature and pressure, 15 < sigma < 25 for most monatomic gases. This range is above that for monatomic solids because gas atoms are less constrained than solids and gases have more accessible states. Figure 1(a) also shows that 15 < sigma < 30 for most diatomic gases. For a given particle mass, diatomic gases have higher entropy because their entropy dance involves more states. Entropy tends to increase with particle mass, as in (13), because the density of states (number of energy states per unit energy) is greater for more massive particles.3

Figure 1(b) shows that entropy is larger for more complex molecules. Again entropy tends to increase with particle mass, but this tendency can be overshadowed by the sensitivity of sigma to the number of atoms per molecule. For example the solid C18H38 (octadecane), with molecular mass 254 u and 56 atoms per molecule, has considerably higher sigma than the solid Br2Hg2, with more than twice the molecular mass, 561 u, but only four atoms per molecule. The octadecane molecules have many more degrees of freedom and accessible states. Approximate ranges of sigma for gases, liquids, and solids are summarized in Table I.

Fig. 1. Dimensionless entropy per particle sigma vs particle mass, in mass units (u). (a) Monatomic solids (m-solids; gray squares), monatomic gases (m-gases; open circles), and diatomic gases (d-gases; gray diamonds). (b) Polyatomic solids (p-solids; gray squares), liquids (p-liquids; black circles), and gases (p-gases; open diamonds). Integers printed near data points are numbers of atoms per molecule, e.g., the polyatomic liquid H2O has (approximate) mass 18 u and 3 atoms per molecule, while the polyatomic liquid C16H32 has mass 224 u and 48 atoms per molecule. The polyatomic gases each have approximately the same number of atoms per molecule and their sequence shows that entropy increases with mass. Entropy-increasing effects of larger numbers of atoms per molecule are evident for the polyatomic liquids and solids. Lines connecting data points are included to help identify materials of a given type; they do not necessarily indicate the positions of other data points.

Overall, a helpful rule of thumb is that sigma is typically between 0 and (roughly) 80 at ordinary temperatures and pressures. We may gain a deeper understanding of the numerical information in Table I and Fig. 1 using Eqs. (11) and (12) to obtain

    sigma = ln(W^(1/N)),   (14)

or equivalently,28

    W = exp(N sigma).   (15)

We now estimate W when N = 2 x 10^19 (roughly the number of air molecules in 1 cm^3 of air) for small and large values of sigma. For solid silver's relatively small value sigma = 8.5 x 10^-5 at T = 1 K, (15) implies W = 10^(7.4 x 10^14). For the hydrocarbon gas icosane, with sigma = 112 under standard conditions,29 (15) implies W ~ 10^(10^21). A range of sigma and W values is given in Table II.

We close this section with two observations. The first is that use of tau leads naturally to a dimensionless heat capacity as well as dimensionless entropy. Notably, the American Institute of Physics Handbook contains tabular data for both dimensionless specific entropy and dimensionless specific heat capacities. Second, S_d and sigma satisfy the relations

    (dS_d/dU)_{V,N} = 1/tau,  (d sigma/dU)_{V,N} = 1/(N tau),   (16)

where the derivatives hold particle numbers and externally variable parameters fixed.
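The numerical estimates above, from Eqs. (13) and (15), are easy to reproduce. A minimal sketch (function names and the approximate atomic masses are ours); because W itself overflows any float, the script works with log10 W = N sigma / ln 10:

```python
import math

def sigma_sackur_tetrode(M_a, T):
    """Eq. (13): dimensionless entropy per particle of a monatomic dilute
    gas at pressure 0.101 MPa; M_a in atomic mass units, T in kelvins."""
    return -1.16 + 1.5 * math.log(M_a) + 2.5 * math.log(T)

sigma_He = sigma_sackur_tetrode(4.0026, 298.15)  # ~15.2, as quoted
sigma_Rn = sigma_sackur_tetrode(222.0, 298.15)   # ~21.2, as quoted

# Eq. (15): W = exp(N*sigma).  For N = 2e19 the exponent is astronomical,
# so compute log10(W) instead.  Silver at T = 1 K has sigma = 8.5e-5:
N = 2e19
log10_W_silver = N * 8.5e-5 / math.log(10)  # ~7.4e14, i.e., W ~ 10**(7.4e14)
```

The same one-liner with sigma = 112 (icosane) gives log10 W ~ 10^21, matching the text.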

Table II. Number of accessible states W for an N particle system for sigma = 10^-10, 0.1, 1, 10, and 100 when N = 2 x 10^19.

    sigma       W (order of magnitude)
    10^-10      10^(10^9)
    0.1         10^(10^18)
    1           10^(10^19)
    10          10^(10^20)
    100         10^(10^21)

IV. TEMPERGY IS NOT GENERALLY A STORED SYSTEM ENERGY

In general, tempergy tau cannot be linked to an energy per particle, per degree of freedom, or per excitation. Despite this fact, we enumerate some very special cases for which tau is so related. The equipartition theorem implies that for monatomic and diatomic ideal gases in the classical domain, namely at sufficiently high tau and/or low density, each degree of freedom (including rotations and/or vibrations when these modes are active) stores tau/2 of energy on average. A sufficient condition for equipartition in the classical domain is that each degree of freedom have an energy that is quadratic in its variable, e.g., momentum for translational kinetic energy and displacement for vibrational energy.30 Thus, even for classical nonideal gases with position-dependent forces, each translational degree of freedom stores energy tau/2. For the Einstein and Debye (quantum) models of solids, each degree of freedom stores tau/2 of energy on average for sufficiently high tau.

Another special case is an ideal Bose-Einstein gas below its critical temperature tau_c. The total number of particles can be written as N = N_0 + N_exc, where N_0 and N_exc are the numbers of particles in the ground state and in excited states, respectively. It can be shown that31

    N_exc = N - N_0 = 2.612 V/lambda^3,  with  lambda = (h^2/(2 pi m tau))^(1/2),   (17)

and

    U/N_exc = 2.012 tau (V/lambda^3) / [2.612 (V/lambda^3)] = 0.77 tau  for tau < tau_c.   (18)

Thus, below the critical temperature, the tempergy tau is within 30% of the average energy per excited particle. However, because the particles in the ground state contribute zero energy, tau is not a good indicator of U/N.

Our last special case is a gas of photons in an otherwise empty box of volume V, with wall temperature tau. For this photon gas, the total internal energy is known from statistical mechanics to be31

    U = [8 pi tau^4 V/(hc)^3] int_0^inf x^3/(e^x - 1) dx = 8 pi tau^4 V J_U/(hc)^3,   (19)

where J_U = pi^4/15 = 6.494 is the integral in (19). The average number of photons, which depends upon both the container volume V and the temperature tau, is

    N_photons = [8 pi tau^3 V/(hc)^3] int_0^inf x^2/(e^x - 1) dx = 8 pi tau^3 V J_N/(hc)^3,   (20)

where J_N = 2.404 is the integral in (20). The combination of Eqs. (19) and (20) gives

    U/N_photons = (J_U/J_N) tau = 2.7 tau.   (21)

The average energy per photon is seen to be 2.7 tau, despite the fact that U is proportional to tau^4. This is true because N_photons is proportional to tau^3. These results hold for all tau > 0.

In order to understand how very special the above-mentioned cases are, we take a closer look at the Debye model of a solid, which accounts for small oscillations of coupled atoms. Using normal coordinates, the model reduces to a collection of independent linear oscillators. Debye assumed that the distribution of the oscillators' frequencies coincides with the distribution of sound wave frequencies in the solid, and that there is a maximum frequency, nu_D. For tau >> h nu_D, the model predicts that the system energy (relative to the ground state) -> 3N tau. This reflects energy equipartition, i.e., average energy tau/2 for each of the 6N degrees of freedom (3N from kinetic and 3N from potential energies). In the low temperature limit, tau << h nu_D, quantum behavior prevents equipartition, and it is interesting to examine the average energy per excitation. These excitations, called phonons, are usefully viewed as independent, identical quasiparticles that obey Bose-Einstein statistics. Using the canonical ensemble, the average energy of the phonons can be written32

    U_phonons = [9N tau^4/(h nu_D)^3] int_0^(h nu_D/tau) x^3/(e^x - 1) dx = 9N tau^4 I_U/(h nu_D)^3,   (22)

and the average number of phonons is

    N_phonons = [9N tau^3/(h nu_D)^3] int_0^(h nu_D/tau) x^2/(e^x - 1) dx = 9N tau^3 I_N/(h nu_D)^3.   (23)

The quantities I_U and I_N are shorthand notations for the integrals that appear in Eqs. (22) and (23), respectively. Notice that the number of phonons depends on the temperature, which is reminiscent of the photon gas. Combining these equations gives

    U_phonons/N_phonons = (I_U/I_N) tau.   (24)

Numerical evaluations of the integrals I_U and I_N show that for values of tau between 0 and h nu_D, the right-hand side of (24) ranges from 2.7 tau to 0.64 tau, as illustrated in Fig. 2. Notice that the factor 2.7 arises in the zero temperature limit, for which I_U/I_N = J_U/J_N [see Eqs. (19)-(21) for the photon gas]. We conclude that for sufficiently low temperatures, the tempergy is a rough indicator of the average energy per phonon, but is a poor indicator of the energy per particle, U/N.
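The integrals J_U, J_N, I_U, and I_N above are easy to check numerically. A sketch using a simple midpoint rule (the grid sizes and the finite upper limit standing in for infinity are arbitrary choices of ours):

```python
import math

def bose_integral(power, x_max, steps=200000):
    """Midpoint-rule estimate of int_0^x_max x**power/(e^x - 1) dx.
    The integrand behaves like x**(power-1) near x = 0, so it is finite there."""
    h = x_max / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * h
        total += x**power / math.expm1(x)  # expm1 avoids cancellation at small x
    return total * h

# Photon gas, Eqs. (19)-(21): J_U = pi^4/15 ~ 6.494 and J_N ~ 2.404
# (upper limit 50 is a proxy for infinity; the tail is negligible).
J_U = bose_integral(3, 50.0)
J_N = bose_integral(2, 50.0)
ratio_low_T = J_U / J_N  # ~2.70, the tau -> 0 limit of Eq. (24)

# Debye model, Eq. (24), at tau = h*nu_D (upper limit x_D = 1):
ratio_xD_1 = bose_integral(3, 1.0) / bose_integral(2, 1.0)  # ~0.64
```

This reproduces the quoted range of Eq. (24), from 2.7 tau down to 0.64 tau.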

Fig. 2. Average energy per phonon vs tau/(h nu_D) for the Debye model.

Fig. 3. U/(N tau) and U/(N epsilon) as functions of tau/epsilon for a system with N particles, each with possible energies 0 and epsilon > 0. Because U/(N tau) is never near unity (its maximum is ~0.28), tau never represents the average internal energy per particle. Similarly, because U/(N epsilon) is never near unity, tau never represents the average internal energy per excitation.
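The maximum of ~0.28 quoted in the Fig. 3 caption can be checked directly. For N independent two-state particles with energies 0 and epsilon, standard Boltzmann statistics give U/N = epsilon/(e^(epsilon/tau) + 1), so with x = epsilon/tau, U/(N tau) = x/(e^x + 1). A sketch (the scan grid is an arbitrary choice of ours):

```python
import math

def u_over_N_tau(x):
    """Two-state model: U/(N*tau) as a function of x = epsilon/tau."""
    return x / (math.exp(x) + 1.0)

# Scan x = epsilon/tau over (0, 10) to locate the maximum of U/(N*tau):
max_val = max(u_over_N_tau(i * 1e-4) for i in range(1, 100000))  # ~0.278
```

The maximum occurs near x = 1.28 and never reaches 0.28, consistent with the caption's claim that tau never represents the average internal energy per particle here.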

In contrast, for tau >> h nu_D, N_phonons -> 9N tau/(2 h nu_D), while U_phonons -> 3N tau, so U_phonons/N_phonons -> 0.67 h nu_D << tau. That is, the average energy per phonon becomes a small fraction of tau. In this limit energy equipartition (tau/2 per degree of freedom) holds and U -> 6N(tau/2) = 3N tau. In the intermediate region, where quantum effects are more subtle, there is no known way to interpret tau as a stored energy.

Before closing this section, we consider another model for which tau cannot be related to a stored system energy. Consider a collection of independent particles, each with two accessible states, having energies 0 and epsilon > 0. Figure 3 shows graphs of U/(N epsilon) vs tau/epsilon and U/(N tau) vs tau/epsilon. For tau/epsilon < 1, relatively few particles are in the excited states and the average energy per particle U/N << 0.5 epsilon. For tau/epsilon >> 1, the fraction of particles in the excited state approaches 0.5. Thus, when tau increases without bound, U/N -> 0.5 epsilon while U/(N tau) -> 0. Figure 3 shows that U/N < 0.28 tau for all values of tau. Also, because U = N_exc epsilon, where N_exc is the average number of particles in the higher energy state, it follows that U/N_exc = epsilon, independent of tau. Thus, for this simple model of two-state independent particles, tau is neither a good indicator of the average energy per particle nor the average energy per excitation.

For the special cases where tempergy can be related to the average energy per particle, per degree of freedom, or per excitation, the total energy can be written as the sum of the energies of similar, independent particles or quasiparticles, or degrees of freedom. In some cases the special result follows because the particle energy is quadratic in a momentum or position variable and the temperature is sufficiently high. In other cases, the special result seems to be related to Bose-Einstein statistics. However, when such special conditions do not exist, we cannot relate tau to a stored system energy.

Except for special cases, tau is a poor indicator of the internal energy per particle, per degree of freedom, or per elementary excitation, and no universal interpretation of tempergy as a meaningful stored energy is known.

V. A PHYSICAL INTERPRETATION OF TEMPERGY

Given that tau generally cannot be associated with a meaningful stored energy, it is natural to ask if it has any general and useful physical interpretation. We know that temperature represents the degree of hotness for objects, and when two objects interact thermally, energy flows from hotter to colder.33,34 This very useful interpretation holds for tempergy as well as temperature but provides no particular insights for tau.

In search of such insights, we consider an energy transfer Q > 0 to a system by a heat process. In terms of the (extensive) heat capacity C(T),

    Q = int_T^(T+DT) C(T') dT' = Cbar * DT,   (25)

where the last step defines the average heat capacity Cbar for the interval (T, T+DT), with DT > 0. Cbar is well defined unless DT = 0 and Q != 0. This special case, which occurs for two coexisting phases, is treated separately later. If we make the special choice Q = kT, and also assume that Cbar >> k (because Cbar is extensive and k is a constant), it follows that

    DT/T = k/Cbar << 1.   (26)

The dimensionless entropy change of the system when its temperature increases from T to T+DT is

    DS_d = int_T^(T+DT) [C(T')/(kT')] dT'.   (27)

Because C(T') > 0, we may establish lower and upper bounds on DS_d by replacing 1/T' by 1/(T+DT) and 1/T, respectively, as follows:

    [1/(k(T+DT))] int_T^(T+DT) C(T') dT' <= DS_d <= [1/(kT)] int_T^(T+DT) C(T') dT'.   (28)

The equalities hold in the limit DT -> 0. Using Eq. (25) in (28), we obtain Q/[k(T+DT)] <= DS_d <= Q/(kT). For the special case Q = kT, this becomes T/(T+DT) <= DS_d <= 1. Application of Eq. (26) then gives

    1/(1 + k/Cbar) <= DS_d <= 1  when  Q = tau = kT.   (29)
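The bounds (29) are easy to probe in the special case of a temperature-independent heat capacity C, for which Eq. (27) integrates in closed form to DS_d = (C/k) ln(1 + DT/T); with Q = kT, Eq. (25) gives DT/T = k/C. This constant-C sketch (our simplification; the ratio C/k is the only parameter) shows DS_d -> 1 as C/k grows:

```python
import math

def delta_S_d_constant_C(C_over_k):
    """Constant-C special case of Eq. (27) with Q = kT:
    DT/T = k/C, so DS_d = (C/k) * ln(1 + k/C)."""
    return C_over_k * math.log1p(1.0 / C_over_k)  # log1p keeps precision for huge C/k

small_system = delta_S_d_constant_C(10.0)   # ~0.95: C comparable to k
macroscopic = delta_S_d_constant_C(1e20)    # indistinguishable from 1
```

Both values sit inside the bounds of Eq. (29), 1/(1 + k/Cbar) <= DS_d <= 1, and the macroscopic value illustrates the result DS_d = 1 derived next.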

Because we assume Cbar >> k, expansion of the left-hand side of (29) to order k/Cbar leads to the conclusion |DS_d - 1| < k/Cbar, and this brings us to our desired result,

    DS_d = 1  when  Q = tau = kT and Cbar >> k.   (30)

Equation (30) is very general, and is satisfied whenever Cbar is well defined. We now show that it holds for various real and model systems over a wide range of temperatures.

For an isothermal transfer Q = tau, (30) obviously holds because DS_d = Q/tau = 1. This is true even if the system consists of two coexisting phases at constant pressure, in which case Q != 0, DT = 0, and Cbar, defined in Eq. (25), does not exist. It is instructive to examine this interesting case. If L is the "heat of transformation" per mole, then Q = Dn L = kT when Dn moles get transformed. This gives Dn = k/(L/T) = k/Ds_mol, where Ds_mol is the entropy of transition per mole. In particle language this implies DN = R/Ds_mol = 1/(sigma_2 - sigma_1).25 Here sigma_2 and sigma_1 refer to the higher and lower entropy phases, respectively. To see the implications of this, consider a transition from liquid to vapor (assuming no dissociation upon vaporization). In this case sigma_2 - sigma_1 = Dsigma_vap, the dimensionless entropy of vaporization per particle. For many such liquids, it is known that Dsigma_vap ~ 10 at atmospheric pressure.35 Notably, this implies DN < 1, i.e., not even one molecule goes from liquid to vapor when Q = tau. In fact the condition DN > 1 requires that Dsigma_vap < 1. Although this condition is satisfied for any saturated liquid at pressures and temperatures sufficiently close to its critical point, the typical result DN < 1 for atmospheric pressure underscores the smallness of the energy transfer tau. If indeed DN > 1, the result DS_d = 1 is independent of both DN and the amount of liquid and vapor in the system because the process is isothermal. If DN < 1 there is no phase transition and the liquid-vapor system acts as a system with heat capacity C = C_liquid + C_vapor. If Cbar >> k our earlier argument implies DS_d = 1 for the almost-isothermal process in the two-phase system.

Another almost-isothermal process entails the heating of 1 g of copper, initially at T = 1 K and with constant-pressure heat capacity C = 1.2 x 10^-5 J/K. When energy Q = tau = kT is added to this system, Eq. (26) implies that Dtau/tau = DT/T = k/Cbar ~ 10^-18, and Eq. (30) is valid. For higher temperatures, C is larger and thus Dtau/tau is even smaller. Evidently for copper, Eq. (30) holds over a wide range of temperatures.

The derivation of Eq. (30) fails when Cbar is comparable with k. Because Cbar is proportional to the system's mass, this happens if the system is sufficiently small. It happens also if the system is sufficiently cold because C -> 0 for T -> 0. We now investigate how low T must get to make (30) invalid for a given sample size.

Consider a Debye solid with T << T_D, the Debye temperature. In this region, C(T) is well approximated by32 C(T) = bNk(T/T_D)^3, where b = 12 pi^4/5 = 234. Defining xi = DT/T, it follows that when T -> T + DT,

    Q = [bNkT^4/(4 T_D^3)] [(1+xi)^4 - 1].   (31)

Similarly,

    DS_d = [bNT^3/(3 T_D^3)] [(1+xi)^3 - 1].   (32)

Setting Q = kT, solving for (1+xi), and substituting the result in (32), we find that

    DS_d = [bNT^3/(3 T_D^3)] {[1 + (4/(bN))(T_D/T)^3]^(3/4) - 1}.   (33)

Expansion of the first term in the square brackets leads to

    DS_d = 1 - (T_D/T)^3/(2bN) + ... .   (34)

For any T << T_D, the second term on the right-hand side becomes negligible for sufficiently large N.36 If we demand that this term be no larger, say, than 0.001, the implied condition is

    NT^3 > T_D^3/0.47  when  |DS_d - 1| < 0.001.   (35)

This condition is most restrictive for small samples of solids with relatively high Debye temperatures. For diamond, with the exceptionally high Debye temperature T_D = 2200 K, (35) becomes NT^3 > 2.3 x 10^10. Choosing the relatively small sample size 10^-6 mol, or N = 6.02 x 10^17 and mass = 1.2 x 10^-8 kg, this is satisfied for T > 0.003 K. Given diamond's high Debye temperature, this example suggests that DS_d = 1 is a good approximation when Q = tau = kT for most macroscopic solids over all but the very lowest attainable temperatures.

In addition to the lattice vibrations, electrons contribute to the heat capacity in metals. Using the free electron model, at low temperatures the electronic heat capacity is proportional to T. Therefore the total low-temperature heat capacity has the form C = AT + BT^3, where the linear and cubic terms come from the electrons and lattice, respectively. Using data for metals,37 one can explicitly confirm whether C >> k for a sample of a given size. For example, at T = 10^-6 K, a calculation for 10^-6 mol (9 x 10^-9 kg) of beryllium shows that for Q = tau, DS_d = 1.000 (to four digits). Even more convincing results hold for other metals and also for larger sample sizes.

Yet another example is an ideal Bose-Einstein (BE) gas below its critical temperature for BE condensation. A cubic centimeter of material with the density of liquid helium has N = 2.2 x 10^22 atoms and |DS_d - 1| < 0.001 for T > 2 x 10^-13 K.

The above-mentioned examples indicate that DS_d = 1 for Q = tau typically holds for all but the most extreme low temperatures. Because a sufficient condition for Cbar >> k is C >> k in the interval (T, T+DT), the above findings can be summarized as follows.

For macroscopic systems with heat capacities C >> k, tempergy tau is the energy transfer required to increase the dimensionless entropy by unity, i.e., DS_d = 1. This holds to an excellent approximation for macroscopic systems over nearly all attainable temperatures.

1119 Am. J. Phys., Vol. 67, No. 12, December 1999 Harvey S. Leff

Because S_d is of order N, the number of system particles, the entropy change ΔS_d = 1 is typically a very small relative change for S_d. Examination of Eq. (11) shows that increasing S_d by unity corresponds to multiplying the total number of accessible states by the factor e = 2.718..., the base of natural logarithms (Ref. 38).

The fact that an added energy Q = τ induces ΔS_d = 1 is a consequence of Eq. (16), (∂S_d/∂U)_V = 1/τ, which relates tempergy to the rate of change of S_d with U. For a system whose dimensionless entropy S_d has continuous first and second derivatives with respect to U, and whose heat capacity C_v is finite, the concavity condition for entropy (Ref. 39) implies that (∂²S_d/∂U²)_V < 0. The latter condition assures that C_v > 0 and that Eq. (16) has a unique solution U* for any τ > 0 (Ref. 40).

For a one-component, single-phase system with fixed volume, whose entropy has continuous first and second derivatives, tempergy—like temperature—determines the internal energy. Quantum statistically, this pinpoints the average energy and thus the neighborhood of the system's energy spectrum where the entropy dance spends most time.

Using Eqs. (6), (10), and (16), we obtain

(∂W(U*)/∂U)_V = W(U*)/τ.    (36)

A graphical interpretation of Eq. (36) is given in Fig. 4, assuming that W(U) ∝ U^Y. This form satisfies the concavity condition above and, for the special case of a classical monatomic ideal gas, it is well known (Ref. 41) that Y ∝ N.

Fig. 4. A plot showing W(U) ∝ U^Y vs U, as described in the text, and a geometrical view of Eq. (36). The slopes of the two dotted lines represent the left- and right-hand sides of Eq. (36). The tempergy τ > 0 and the fixed external parameters, including volume V, determine the system's internal energy U*. To enable a graphical view of Eq. (36), this sketch is not drawn to scale. In real systems, τ typically is many orders of magnitude smaller than U.

VI. CONCLUSIONS

Entropy's conventional units, J/K, can be traced to the definition of the Kelvin temperature scale. The exercise of defining and examining dimensionless entropy and dimensionless entropy per particle inspires a healthy rethinking of the foundations of thermodynamics and statistical mechanics. It provides a natural setting in which to compare values of dimensionless entropy per particle for different materials, which can give a better sense of entropy's role as a measure of the temporal entropy dance of the system over its accessible states. Typically, the dimensionless entropy per particle σ lies between 0 and (roughly) 80. These findings imply that the number of accessible states is W ~ 10^(10^x) for macroscopic systems under standard temperature and pressure conditions, with 10 < x < 21. W is indeed impressively large.

The study of dimensionless entropy suggests that temperature can be replaced by tempergy, τ = kT, and this leads naturally to an investigation of the occurrence and significance of τ in thermal physics. Despite the existence of simple models for which tempergy is proportional to internal energy per particle, per degree of freedom, or per excitation, no such property holds generally. Two reasonably general energy-specific interpretations of tempergy are known. One entails an energy transfer, rather than a stored energy, namely: tempergy is the amount of energy needed to increase a system's dimensionless entropy by unity. This property holds for macroscopic systems that are not at ultralow temperatures, i.e., when C >> k. The second interpretation is for single-phase systems, for which W(U) has the shape in Fig. 4. In this case, τ—like temperature—determines the average energy and thus the region of the energy spectrum where the system spends most time doing its entropy dance over states. The fact that this dance occurs in thermodynamic equilibrium highlights the dynamic microscopic nature of macroscopic equilibrium.

It is possible to circumvent Boltzmann's constant by defining entropy as a dimensionless quantity and working with tempergy rather than temperature. This leads one to question whether k is really a fundamental constant of nature. We can compare it to the speed of light in vacuum, c, which is fundamental in that it applies to all electromagnetic radiation. We can also compare it with Planck's constant h, which is fundamental in that it links a particle property (energy) and a wave property (frequency) for all electromagnetic radiation and all de Broglie particle waves. Boltzmann's constant can be considered fundamental in the sense that it provides a linear link between tempergy and temperature, where temperature is a fundamental measure of hotness and tempergy represents a well-defined stored energy for all sufficiently dilute gases. It is important to realize, however, that despite this property of dilute gases, tempergy does not represent a known stored system energy for macroscopic matter in general.

Finally, we observe that although many people seem comfortable with their understanding of temperature, fewer seem comfortable with entropy. This is true in part because we can feel when objects have higher or lower temperatures, but we cannot usually sense entropy directly. To the extent that one's comfort with temperature stems from the incorrect belief that temperature is always proportional to the average kinetic energy per particle, it gives a false sense of security. Baierlein (Refs. 33, 34) has emphasized that temperature does not tell us how much energy a system stores, but rather reflects a system's tendency to transfer energy via a heat process.
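The estimate W ~ 10^(10^x) quoted in the conclusions follows from S_d = Nσ and W = exp(S_d). A rough sketch with illustrative values (σ = 10 and the two particle numbers are assumptions chosen to span samples from a micron-scale grain to a macroscopic piece of matter):

```python
import math

# Order-of-magnitude check of W ~ 10**(10**x), with 10 < x < 21.
# sigma = 10 per particle and the N values are illustrative assumptions.
sigma = 10.0
xs = []
for N in (1.0e10, 1.0e20):
    S_d = N * sigma                  # extensive dimensionless entropy
    log10_W = S_d / math.log(10.0)   # W = exp(S_d) = 10**(S_d / ln 10)
    x = math.log10(log10_W)
    xs.append(x)
    print(f"N = {N:.0e}, sigma = {sigma}: W ~ 10**(10**{x:.1f})")
```

Both values of x land inside the quoted range, and the double exponential makes clear why W is "indeed impressively large."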

The findings here agree with that assessment and, in addition, show that the specific energy transfer Q = τ induces the dimensionless entropy change ΔS_d = 1.

This article began as an attempt to clarify an aspect of entropy, but it has led inadvertently to a rethinking of temperature. Although entropy can be related very generally to the entropy dance described here, and dimensionless entropy can help us better understand that dance, no such system-independent picture emerges for temperature or tempergy. The general definition of T in thermodynamics is given by Eq. (15), which relates T to the rate of change of energy with entropy. This suggests that temperature is at least as elusive as entropy, i.e., at least as difficult to understand on microscopic grounds.

ACKNOWLEDGMENTS

I thank Ralph Baierlein, Robert Bauman, and Thomas Marcella for extremely helpful comments on both substance and style.

a) Electronic mail: [email protected]

1. J. S. Dugdale, Entropy and its Physical Meaning (Taylor & Francis, Bristol, PA, 1996), pp. 101–102.
2. T. V. Marcella, "Entropy production and the second law of thermodynamics: An introduction to second law analysis," Am. J. Phys. 60, 888–895 (1992).
3. H. S. Leff, "Thermodynamic entropy: The spreading and sharing of energy," Am. J. Phys. 64, 1261–1271 (1996).
4. R. Baierlein, "Entropy and the second law: A pedagogical alternative," Am. J. Phys. 62, 15–26 (1994).
5. H. Callen, Thermodynamics and an Introduction to Thermostatistics (Wiley, New York, 1985), pp. 46–47, 331.
6. L. D. Landau and E. M. Lifshitz, Statistical Physics (Addison–Wesley, Reading, MA, 1958), p. 34.
7. J. E. Mayer and M. G. Mayer, Statistical Mechanics (Wiley, New York, 1977), 2nd ed., p. 90.
8. C. Kittel and H. Kroemer, Thermal Physics (Freeman, New York, 1980), 2nd ed. See also the first edition (Wiley, New York, 1969), pp. 29, 132–133, and C. Kittel, Elementary Statistical Physics (Wiley, New York, 1958), p. 27.
9. B. K. Agrawal and M. Eisner, Statistical Mechanics (Wiley, New York, 1988), pp. 40–47.
10. See also B. H. Lavenda, Statistical Physics: A Probabilistic Approach (Wiley, New York, 1991), pp. 14–15. He defines the Boltzmann statistical entropy in the conventional way but then writes, "In most of the subsequent formulas, Boltzmann's constant will be omitted implying that the temperature will be measured in energy units. This introduces a greater symmetry in the formulas, especially in regard to dual Legendre functions."
11. Strictly speaking, the reversibility implied by dS = δQ_rev/T is sufficient, but not necessary. It is necessary only that the system in question undergo a quasistatic process, which can occur even if the system interacts irreversibly with its environment (see Ref. 2).
12. Planck calculated N_A in 1900 (see Ref. 19) using data on blackbody radiation. A "direct" determination of N_A, using Einstein's theory of Brownian motion, was published in J. B. Perrin, Ann. Chim. (Paris) 18, 1 (1909); an English translation is in F. Soddy, Brownian Movement and Molecular Reality (Taylor & Francis, London, 1910). See also A. Einstein, Investigations on the Theory of the Brownian Movement (Dover, New York, 1956), especially pp. 60–62.
13. A student laboratory experiment for measuring Boltzmann's constant was developed by M. Horne, P. Farago, and J. Oliver, "An experiment to measure Boltzmann's constant," Am. J. Phys. 41, 344–348 (1973). This paper also contains interesting historical information.
14. The modern value of N_A is known from measurements of the lattice constant, density, and molar mass of silicon. See R. D. Deslattes, "Recent estimates of the Avogadro constant," in Atomic Masses and Fundamental Constants, edited by J. H. Sanders and A. H. Wapstra (Plenum, New York, 1975), Vol. 5.
15. For a review of modern determinations of the gas constant, see B. W. Petley, The Fundamental Physical Constants and the Frontier of Measurement (Hilger, Bristol, 1985), pp. 94–99. The best modern values of Boltzmann's constant utilize k = R/N_A.
16. See, for example, M. W. Zemansky and R. H. Dittman, Heat and Thermodynamics (McGraw–Hill, New York, 1997), 7th ed., pp. 186–191.
17. Clausius used a similar, but less well known, method to obtain Eq. (1). His method also required the definition of absolute temperature given in the text. See R. J. E. Clausius, The Mechanical Theory of Heat (John Van Voorst, London, 1867), pp. 134–135. Interesting discussions of Clausius's method can be found in W. H. Cropper, "Rudolf Clausius and the road to entropy," Am. J. Phys. 54, 1068–1074 (1986) and P. M. C. Dias, S. P. Pinto, and D. H. Cassiano, "The conceptual import of Carnot's theorem to the discovery of entropy," Arch. Hist. Exact Sci. 49, 135–161 (1995).
18. See Ref. 5, pp. 27–32.
19. See M. Planck, The Theory of Heat Radiation (Dover, New York, 1991). This is an English translation by Morton Masius of Planck's second edition of Vorlesungen über die Theorie der Wärmestrahlung (1913). Planck wrote (p. 120), "... Boltzmann's equation lacks the factor k, which is due to the fact that Boltzmann always used gram-molecules, not the molecules themselves, in his calculations." Planck's numerical calculation of k along with Planck's constant is given on pp. 172–173. Notably, Planck also calculated Avogadro's number and the electron charge. References to the relevant original (1901) articles are on p. 216.
20. See, for example, R. Baierlein, Atoms and Information Theory (Freeman, San Francisco, 1971), Chap. 3.
21. The information theory development of statistical mechanics was pioneered by E. T. Jaynes, "Information theory and statistical mechanics," Phys. Rev. 106, 620–630 (1957); "Information theory and statistical mechanics. II," ibid. 108, 171–190 (1957). A good summary is in K. Ford, editor, Statistical Physics—1962 Brandeis Summer Institute Lectures in Theoretical Physics (Benjamin, New York, 1963), Vol. 3, pp. 181–218.
22. One exception is Ref. 8, in which τ is used throughout.
23. Callen (Ref. 5) and Landau and Lifshitz (Ref. 6) adopt this view. A contrasting view is taken in F. Reif, Fundamentals of Statistical and Thermal Physics (McGraw–Hill, New York, 1965), pp. 99, 136–137. Here, T is defined to be dimensionless, and the unit degree is taken to be "... a unit in the same sense as the degree of angular measure; it does not involve length, mass, or time."
24. In Refs. 6–9, the extensive dimensionless entropy is labeled σ, which is related to the Clausius (extensive) entropy S by S = kσ. In contrast, we use the lower case σ for the intensive dimensionless entropy per particle and S_d for the extensive dimensionless entropy.
25. σ can be expressed in terms of the entropy per mole, s_mol = S/n, and the entropy per unit mass, s_mass = S/M, as follows: σ = M s_mass/(Nk) = n s_mol/(Nk) = s_mol/R. The latter expression shows that if one knows the molar entropy in J K⁻¹ mol⁻¹, then σ = s_mol/8.314. If s_mol is in calories/mol, then σ = s_mol/2 to a good approximation.
26. The third law of thermodynamics implies that entropy, and thus dimensionless entropy, approaches the same value for all paths leading toward absolute zero temperature. By convention, this value is defined to be zero.
27. R. P. Bauman, Modern Thermodynamics with Statistical Mechanics (Macmillan, New York, 1992), p. 345. In this book, the Sackur–Tetrode equation is written S = R[1.5 ln M + 2.5 ln T − ln P] + 172.298, where M is the mass, in kg, of one mole (6.02×10²³ particles). For example, for neon, M = 0.020 18 kg. In contrast, in our Eq. (13), M = 20.18 mass units.
28. If an exact expression for W is used to obtain σ using Eq. (14), and small terms are dropped for large N, then use of the inversion formula (15) cannot recover the exact W. For example, if W = N^g q^N, where g and q are constants, then σ = ln q + (g/N) ln N, and the second term on the right is negligible for sufficiently large N. Taking antilogarithms of the first term, we recover W = q^N, but not the factor N^g. Thus, Eq. (15) is useful for obtaining the order of magnitude of W rather than its exact value.
29. Icosane (also called eicosane), with molecular formula C₂₀H₄₂, has 62 atoms per molecule. Although it is a solid at room temperature, Lange's Handbook of Chemistry lists an entropy value that implies σ = 112 for icosane gas under standard conditions.
30. R. C. Tolman, The Principles of Statistical Mechanics (Oxford U.P., Oxford, 1938; reprinted by Dover, New York), pp. 95–98. Tolman derives a general equipartition principle and shows that a form of equipartition holds even for a relativistic gas of noninteracting particles, whose energy is not quadratic in momentum. Specifically, if u is particle speed and m rest mass, then [(mu²)/(1 − u²/c²)^(1/2)]_average = 3kT. Equipartition holds for the kinetic energy per particle only in the nonrelativistic limit.
31. R. K. Pathria, Statistical Mechanics (Butterworth–Heinemann, Oxford, 1996), 2nd ed., Chap. 7.
32. K. Huang, Statistical Mechanics (Wiley, New York, 1987), 2nd ed., pp. 283–286.
33. R. Baierlein, "The meaning of temperature," Phys. Teach. 28, 94–96 (1990).
34. R. Baierlein, Thermal Physics (Cambridge U.P., New York, 1999), pp. 6–8, 177–178, 347–349.
35. This is known as Trouton's rule. See, for example, M. Sprackling, Thermal Physics (American Institute of Physics, New York, 1991), p. 242. The value Δs_mol ≈ 85 J mol⁻¹ K⁻¹ cited by Sprackling implies Δσ_vap ≈ 10.
36. The second term in (34) is much less than 1 if (T/T_D)³ >> 1/(2bN). Using the heat capacity C(T) = bNk(T/T_D)³, this implies C(T) >> k/2. The latter agrees (in an order-of-magnitude sense) with the condition C̄ >> k for Q = kT to induce the change ΔS_d = 1.
37. See Ref. 16, p. 355, Table 13.4.
38. In "bit language," the dimensionless entropy is S_d[bits] = S_d/ln 2 = S_d/0.693, and Eq. (30) becomes ΔS_d[bits] = 1.44 bits upon addition of energy Q = τ. This reflects the additional missing information associated with 2.718... times as many accessible states.
39. See Ref. 5, pp. 203–207.
40. A unique solution would not exist if (∂²S_d/∂U²)_{V,N} vanished over a finite U interval, which would imply infinite C_v for this range of U values. Empirically, for a single-phase system, C_v diverges when a critical point is approached.
41. See Ref. 32, pp. 138–140.

SEVERITY AND FLABBINESS

That is the severity of the criterion that science sets for itself. If we are to be honest, then we have to accept that science will be able to claim complete success only if it achieves what many might think impossible: accounting for the emergence of everything from absolutely nothing. Not almost nothing, not a subatomic dust-like speck, but absolutely nothing. Nothing at all. Not even empty space. How different this is from the soft flabbiness of a nonscientific argument, which typically lacks any external criterion of success except popular acclaim or the resignation of unthinking acceptance. One typical adipose arch-antireductionist argument may be that the world and its creatures were created (by something called a God), and that was that. Now, that may be true, and I cannot prove that it is not. However, it is no more than a paraphrase of the statement that 'the world exists'. Moreover, if you read into it an active role for the God, it is an exceptionally complex explanation, even though it sounds simple, for it implies that everything (or almost everything, even if the God supplied no more than electrons and quarks) had to be provided initially.

P. W. Atkins, "The Limitless Power of Science," in Nature's Imagination—The Frontiers of Scientific Vision, edited by John Cornwell (Oxford University Press, New York, 1995).
