Dimensional Equations of Entropy
Amelia Carolina Sparavigna
Department of Applied Science and Technology, Politecnico di Torino, Italy.

Abstract: Entropy is a quantity which is of great importance in physics and chemistry. The concept comes out of thermodynamics, as proposed by Rudolf Clausius in his analysis of the Carnot cycle, and was linked by Ludwig Boltzmann to the number of specific ways in which a physical system may be arranged. Any physics classroom, in its task of learning physics, has therefore to face this crucial concept. As we will show in this paper, the lectures can be enriched by discussing dimensional equations linked to the entropy of some physical systems.

Keywords: Physics Classroom, Entropy, Einstein Model, Blackbody Radiation, Bekenstein-Hawking Black Hole Entropy, Casimir Entropy, Bose-Einstein Condensation.

1. Introduction
In physics and engineering, dimensional analysis helps finding relationships between different physical quantities by determining some equations based on a few fundamental quantities. Usually, these fundamental quantities are length, time, mass, temperature and electric charge, represented by the symbols L, T, M, Θ and Q, respectively [1]. The dimensions of any physical quantity can be expressed as products of these basic quantities, each raised to a rational power [2]. Any physically meaningful equation requires the same dimensions on its left and right sides, a property known as "dimensional homogeneity". Checking this homogeneity is a common application of dimensional analysis, routinely used to verify the plausibility of calculations. In this analysis, equations are turned into "dimensional equations".

In this paper we discuss some dimensional equations related to the entropy of some models of physical systems. In thermodynamics, entropy (usual symbol S) is the physical quantity linked to the second law of thermodynamics, the law telling us that the entropy of an isolated system never decreases. Such a system spontaneously proceeds towards thermodynamic equilibrium, which is the configuration with maximum entropy.

Entropy is an extensive property. It has the dimension of energy divided by temperature, with a unit of joules per kelvin (J K⁻¹) in the International System of Units. However, the entropy of a substance can also be given as an intensive property: entropy per unit mass (SI unit: J K⁻¹ kg⁻¹) or entropy per unit amount of substance.

The change in entropy (ΔS) of a system was originally defined for a thermodynamically reversible process as ΔS = ∫ dQ_rev / T, where T is the absolute temperature of the system. This temperature divides an incremental reversible transfer of heat into the system (dQ_rev). This definition is sometimes called the macroscopic definition of entropy, because it is used without regard to any microscopic description of the thermodynamic system.

The most remarkable property of entropy is that of being a function of state. In thermodynamics, a state function is a property of the system which depends only on the state of the system, not on the manner in which the system acquired that state. Several state functions exist besides entropy, all able to describe quantitatively an equilibrium state of a system. For this reason, besides the change of entropy, an absolute entropy (S rather than ΔS) was defined. In this case, an approach using statistical mechanics is preferred. Let us remember that the concept of entropy, which came out of thermodynamics as proposed by Rudolf Clausius in his analysis of the Carnot cycle, was linked by Ludwig Boltzmann to the number of specific ways in which a physical system may be arranged, in a statistical mechanics approach. In fact, modern statistical mechanics was initiated in the 1870s by the works of Boltzmann, mainly collected and published in his 1896 Lectures on Gas Theory [3].
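As a small numerical illustration of the macroscopic definition ΔS = ∫ dQ_rev / T (a sketch added here, not part of the original paper): for a body of constant heat capacity C heated reversibly from T₁ to T₂, we have dQ_rev = C dT, so the integral gives ΔS = C ln(T₂/T₁). The function name and the numerical values below are illustrative choices; the heat capacity of water is the usual textbook figure.

```python
import math

def entropy_change(C, T1, T2):
    """Clausius entropy change for a reversible heating process:
    dQ_rev = C dT, so  ΔS = ∫ C dT / T = C ln(T2/T1),
    with C the (constant) heat capacity in J/K and T1, T2 in kelvin."""
    return C * math.log(T2 / T1)

# Illustrative example: ~1 kg of water (C ≈ 4186 J/K) heated from 300 K to 350 K.
dS = entropy_change(4186.0, 300.0, 350.0)
print(f"ΔS = {dS:.1f} J/K")  # about 645 J/K
```

Note that the result depends only on the initial and final temperatures, not on the heating path, which is exactly the state-function property discussed above.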
This article is published at: http://www.ijsciences.com/pub/issue/2015-08/ DOI: 10.18483/ijSci.811; Online ISSN: 2305-3925; Print ISSN: 2410-4477. Amelia Carolina Sparavigna (Correspondence): [email protected], +39-011-564-7360

Entropy is then a fundamental quantity for any physics classroom, in its tasks of learning and teaching physics. As we are showing in this paper, lectures on this subject can be enriched by a discussion of dimensional equations related to the entropy of some models of physical systems. We will see some examples based on the entropy of the blackbody radiation and of black holes, among others. Before the examples, let us shortly discuss the concepts of thermodynamic and statistical entropy.

2. Entropy and Carnot cycle
The physical concept of entropy arose from the studies of Rudolf Clausius on the Carnot cycle [4]. This cycle is a theoretical thermodynamic reversible cycle proposed by Nicolas Léonard Sadi Carnot in 1824. It is composed of an isothermal expansion, followed by an adiabatic expansion; then we have an isothermal compression, followed by an adiabatic compression. When the Carnot cycle is represented on a pressure-volume diagram (pV diagram), the isothermal stages are given by isotherm lines of the working fluid, and the adiabatic stages move between isotherms. The area bounded by the cycle represents the total work done during one cycle. A Carnot cycle is also represented by using a temperature-entropy diagram (TS diagram); in such diagrams, the adiabatic reversible stages are isentropic stages.

In a Carnot cycle, heat Q_H is absorbed at temperature T_H from a reservoir in the reversible isothermal expansion, and heat Q_C is given up to a reservoir at temperature T_C in the reversible isothermal compression, where T_H > T_C. Through the efforts of Clausius and Lord Kelvin (William Thomson, 1st Baron Kelvin, 1824-1907), it is now known that the maximum work a system can produce is the product of the Carnot efficiency and the heat absorbed from the hot reservoir:

W = Q_H (1 − T_C / T_H)    (1)

From the first law of thermodynamics, over the entire cycle we have W = Q_H − Q_C. Therefore:

Q_H / T_H = Q_C / T_C    (2)

This implies that there is a function of state which is conserved over the complete Carnot cycle. Clausius called this state function "entropy". In the Carnot cycle, we have two entropies:

S_H = Q_H / T_H ;  S_C = − Q_C / T_C    (3)

They have opposite signs and therefore, from (2), adding them we have zero. This result is generalized to generic reversible cycles as:

∮ dQ_rev / T = 0    (4)

3. Entropy and statistical mechanics
The statistical definition was given by Ludwig Boltzmann in the 1870s. Boltzmann showed that his entropy was equivalent, within a constant factor, the Boltzmann constant, to that coming from the Carnot cycle. In the Boltzmann approach, entropy is a measure of the number of ways in which a system may be arranged. This definition describes the entropy as being proportional to the natural logarithm of the number of possible microscopic configurations of the individual atoms and molecules of the system (microstates) which could give rise to the observed macroscopic state of the system:

S = − k Σ_i p_i ln p_i    (5)

The sum is over all the possible microstates of the system, and p_i is the probability that the system is in the i-th microstate [5]. The constant of proportionality k is the Boltzmann constant, a constant relating energy at the individual particle level with temperature. It is also the gas constant R divided by the Avogadro constant N_A.

The Boltzmann constant links macroscopic and microscopic physical quantities; for instance, for an ideal gas, the product of pressure p and volume V is proportional to the product of the amount of substance n (in moles) and the absolute temperature T: pV = nRT, where R is the abovementioned gas constant. Introducing the Boltzmann constant k transforms the ideal gas law into an alternative form, pV = NkT, where N is the number of molecules of gas. Therefore, the equation of the ideal gas can be given in a microscopic formalism:

pV = nRT = NkT    (6)

For a gas, the internal energy coming from the first law of thermodynamics is linked to the entropy by the second law, in the following equation:

dU = T dS − p dV    (7)

Let us remember that, for a fixed mass of an ideal gas, the internal energy is a function only of its temperature, whereas the entropy depends on temperature and volume.

4. Dimensions of entropy
The Boltzmann constant k has the dimensions of energy divided by temperature, the same as entropy. Its value in SI units is 1.380×10⁻²³ J/K. Any dimensionless quantity, multiplied by this constant, becomes an entropy.

Let us use the symbol E for the dimension [energy] and start from the Clausius entropy. Since the temperature has the dimension of energy divided by [k], the dimensional analysis gives:

[S] = [dQ_rev / T] = E / [T] = E / (E / [k]) = [k]    (8)

Let us note that dimensional equations are characterized by the square brackets. In the case we start our discussion from the statistical entropy, we have the presence of the logarithmic function too. The discussion proposed in this paper will be in the framework of Eq. (8).

We have a clear example of (8) in the entropy of an ideal Fermi-Dirac gas. In a Fermi-Dirac system of particles, only one particle may occupy each non-degenerate energy level. Let us consider first the absolute zero: if there are N particles, the lowest N energy states are occupied up to the level E_0.

For the lattice specific heat in the Debye model, with T_D the Debye temperature:

C_V = 9 N k (T / T_D)^3 ∫_0^{T_D/T} x^4 e^x / (e^x − 1)^2 dx    (12)

The lattice entropy is defined as:

S = ∫_0^T (C_V / T) dT ∝ N k (T / T_D)^3    (13)

In general:

S ∝ N k (T / T_D)^d    (14)

In (14), d is the dimension of the lattice; in the case of a two-dimensional layer, d = 2.
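The statistical definition in Eq. (5) is easy to explore numerically. The following sketch (added for illustration; the function name and example values are ours) computes S = − k Σ_i p_i ln p_i for a given set of microstate probabilities, and checks the familiar special case: for W equally probable microstates, Eq. (5) reduces to Boltzmann's S = k ln W. We use the exact SI value of the Boltzmann constant, k = 1.380649×10⁻²³ J/K, rather than the rounded figure quoted above.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def gibbs_entropy(probs, k=K_B):
    """Statistical entropy S = -k Σ_i p_i ln p_i (Eq. 5).
    Microstates with p_i = 0 contribute nothing, since p ln p → 0."""
    return -k * sum(p * math.log(p) for p in probs if p > 0.0)

# W equally probable microstates, p_i = 1/W, give S = k ln W:
W = 6
S_uniform = gibbs_entropy([1.0 / W] * W)
print(S_uniform, K_B * math.log(W))  # the two values coincide
```

Note also the dimensional content of Eq. (8) made visible here: the probabilities and the logarithm are dimensionless, so the entire dimension of S is carried by the prefactor k, i.e. [S] = [k].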