
Dimensional Equations of Entropy

Amelia Carolina Sparavigna 1

1 Department of Applied Science and Technology, Politecnico di Torino, Italy.

Abstract: Entropy is a quantity of great importance in physics and chemistry. The concept comes out of thermodynamics, where it was proposed by Rudolf Clausius in his analysis of the Carnot cycle, and it was linked by Ludwig Boltzmann to the number of specific ways in which a physical system may be arranged. Any physics classroom, in its task of teaching physics, has therefore to face this crucial concept. As we will show in this paper, the lectures can be enriched by discussing dimensional equations linked to the entropy of some physical systems.

Keywords: Physics Classroom, Entropy, Einstein Model, Blackbody Radiation, Bekenstein-Hawking Entropy, Casimir Entropy, Bose-Einstein Condensation.

1. Introduction

In physics and engineering, dimensional analysis helps finding relationships between different physical quantities by determining some equations based on a few fundamental quantities. Usually, these fundamental quantities are length, time, mass and temperature, represented by the symbols L, T, M and Θ, respectively [1]. The dimension of any physical quantity can be expressed as a product of these basic quantities, each raised to a rational power [2]. Any physically meaningful equation requires the same dimensions on its left and right sides, a property known as "dimensional homogeneity". Checking this homogeneity is a common application of dimensional analysis, routinely used to verify the plausibility of calculations. In this analysis, equations are turned into "dimensional equations".

In this paper we discuss some dimensional equations related to the entropy of some models of physical systems. In thermodynamics, entropy (usual symbol S) is the physical quantity linked to the second law of thermodynamics, the law which tells that the entropy of an isolated system never decreases. Such a system spontaneously proceeds towards thermodynamic equilibrium, which is the configuration with maximum entropy.

Entropy is an extensive property. It has the dimension of energy divided by temperature, and therefore a unit of joules per kelvin (J K⁻¹) in the International System of Units. However, the entropy of a substance can also be given as an intensive property: entropy per unit mass (SI unit: J K⁻¹ kg⁻¹) or entropy per unit amount of substance.

The change in entropy (ΔS) of a system was originally defined for a thermodynamically reversible process as

\Delta S = \int \frac{dQ_{rev}}{T}

where T is the absolute temperature of the system; this temperature divides an incremental reversible transfer of heat into the system (dQ_rev). This definition is sometimes called the macroscopic definition of entropy, because it is used without regard to any microscopic description of the thermodynamic system.

The most remarkable property of entropy is that of being a function of state. In thermodynamics, a state function is a property of the system which depends only on the state of the system, not on the manner in which the system acquired that state. Several state functions exist besides entropy, all able to describe quantitatively an equilibrium state of a system. For this reason, besides the change of entropy, an absolute entropy (S rather than ΔS) was defined. In this case, an approach using statistical mechanics is preferred. Let us remember that the concept of entropy, which came out of thermodynamics, as proposed by Rudolf Clausius in his analysis of the Carnot cycle, was linked by Ludwig Boltzmann to the number of specific ways in which a physical system may be arranged, in a statistical mechanics approach. In fact, modern statistical mechanics was initiated in the 1870s by the works of Boltzmann, mainly collected and published in his 1896 Lectures on Gas Theory [3].
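Such dimensional bookkeeping can also be automated for the classroom. The following Python sketch is a minimal toy (the class name and the choice of the base quantities M, L, T, Θ are illustrative choices, not a standard library): it represents a dimension as a tuple of integer exponents and verifies the homogeneity of the Clausius definition of entropy recalled above.

```python
# A toy bookkeeping of dimensions: a dimension is a tuple of integer
# exponents over the base quantities M, L, T (time) and Th (temperature).

from typing import NamedTuple

class Dim(NamedTuple):
    M: int = 0    # mass
    L: int = 0    # length
    T: int = 0    # time
    Th: int = 0   # temperature

    def __mul__(self, other):
        return Dim(*(a + b for a, b in zip(self, other)))

    def __truediv__(self, other):
        return Dim(*(a - b for a, b in zip(self, other)))

ENERGY = Dim(M=1, L=2, T=-2)   # [E] = M L^2 T^-2
TEMPERATURE = Dim(Th=1)

# Dimensional equation of the Clausius definition dS = dQ_rev / T:
ENTROPY = ENERGY / TEMPERATURE
print(ENTROPY)                           # Dim(M=1, L=2, T=-2, Th=-1)

# Dimensional homogeneity: both sides of dQ_rev = T dS must agree.
assert ENERGY == TEMPERATURE * ENTROPY
```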

This article is published at: http://www.ijsciences.com/pub/issue/2015-08/ DOI: 10.18483/ijSci.811; Online ISSN: 2305-3925; Print ISSN: 2410-4477

Amelia Carolina Sparavigna (Correspondence)  [email protected]  +39-011-564-7360

Entropy is then a fundamental quantity for any physics classroom, in its tasks of learning and teaching physics. As we are showing in this paper, lectures on this subject can be enriched by a discussion of dimensional equations related to the entropy of some models of physical systems. We will see some examples based on the entropy of the blackbody radiation and of the black holes, among others. Before the examples, let us discuss shortly the concepts of thermodynamic and statistical entropies.

2. Entropy and Carnot cycle

The physical concept of entropy arose from the studies of Rudolf Clausius on the Carnot cycle [4]. This cycle is a theoretical thermodynamic reversible cycle proposed by Nicolas Léonard Sadi Carnot in 1824. It is composed of an isothermal expansion, followed by an adiabatic expansion; then we have an isothermal compression, followed by an adiabatic compression. When the Carnot cycle is represented on a pressure-volume diagram (pV diagram), the isothermal stages are given by isotherm lines of the working fluid, and the adiabatic stages move between isotherms. The area bounded by the cycle represents the total work done during one cycle. A Carnot cycle is also represented by using a temperature-entropy diagram (TS diagram); in such diagrams, the adiabatic reversible stage is an isentropic stage.

In a Carnot cycle, heat Q_H is absorbed at temperature T_H from a reservoir in the reversible isothermal expansion, and given up as heat Q_C to a reservoir, with the reversible isothermal compression, at temperature T_C, where T_H > T_C.

Through the efforts of Clausius and Lord Kelvin (William Thomson, 1st Baron Kelvin, 1824-1907), it is now known that the maximum work that a system can produce is the product of the Carnot efficiency η and the heat absorbed from the hot reservoir:

W = \eta\, Q_H = \left(1 - \frac{T_C}{T_H}\right) Q_H    (1)

From the first law of thermodynamics, over the entire cycle, we have W = Q_H − Q_C. Therefore:

\frac{Q_H}{T_H} = \frac{Q_C}{T_C}    (2)

This implies that there is a function of state which is conserved over the complete Carnot cycle. Clausius called this state function "entropy". In the Carnot cycle, we have two entropies:

S_H = \frac{Q_H}{T_H}\, ; \qquad S_C = -\,\frac{Q_C}{T_C}    (3)

They have opposite signs (heat enters at T_H and leaves at T_C) and therefore, from (2), adding them we have zero. This result is generalized to generic reversible cycles as:

\oint \left(\frac{dQ}{T}\right)_{rev} = 0    (4)
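For the classroom, the bookkeeping of Eqs. (1)-(3) can be checked with numbers. In the following Python sketch, the reservoir temperatures and the absorbed heat are arbitrary illustrative values:

```python
# Carnot cycle bookkeeping with illustrative classroom values.
T_H, T_C = 500.0, 300.0   # reservoir temperatures (K)
Q_H = 1000.0              # heat absorbed at T_H (J)

eta = 1.0 - T_C / T_H     # Carnot efficiency, Eq. (1)
W = eta * Q_H             # maximum work (J)
Q_C = Q_H - W             # heat rejected at T_C (first law)

print(W, Q_C)                  # 400.0 J and 600.0 J
print(Q_H / T_H, Q_C / T_C)    # both 2.0 J/K: the equality of Eq. (2)
```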
3. Entropy and statistical mechanics

The statistical definition of entropy was given by Ludwig Boltzmann in the 1870s. Boltzmann showed that his entropy was equivalent to that coming from the Carnot cycle, within a constant factor, the Boltzmann constant. In the Boltzmann approach, entropy is a measure of the number of ways in which a system may be arranged. This definition describes the entropy as being proportional to the natural logarithm of the number of possible microscopic configurations of the individual atoms and molecules of the system (microstates) which could give rise to the observed macroscopic state of the system:

S = -k \sum_i p_i \ln p_i    (5)

The sum is over all the possible microstates of the system, and p_i is the probability that the system is in the i-th microstate [5]. The constant of proportionality k is the Boltzmann constant. It is a constant relating energy at the individual particle level with temperature. It is also the gas constant R divided by the Avogadro constant N_A.

The Boltzmann constant links macroscopic and microscopic physical quantities. For instance, for an ideal gas, the product of pressure p and volume V is proportional to the product of the amount of substance n (in moles) and the absolute temperature T: pV = nRT, where R is the abovementioned gas constant. Introducing the Boltzmann constant k transforms the ideal gas law into an alternative form, pV = NkT, where N is the number of molecules of gas. Therefore, the equation of the ideal gas is given in a microscopic formalism:

pV = nRT = NkT    (6)

For a gas, the internal energy coming from the first law of thermodynamics is linked to the entropy by the second law, in the following equation:

dU = T\,dS - p\,dV    (7)

Let us remember that, for a fixed mass of an ideal gas, the internal energy is a function only of its temperature, whereas the entropy depends on temperature and volume.
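Eq. (5) is easily evaluated numerically. A minimal Python sketch (the probability distribution is an illustrative choice) also recovers the familiar S = k ln W for W equally likely microstates:

```python
import math

k = 1.380649e-23  # Boltzmann constant (J/K)

def statistical_entropy(probs):
    """Boltzmann-Gibbs entropy S = -k * sum_i p_i ln p_i, Eq. (5)."""
    return -k * sum(p * math.log(p) for p in probs if p > 0.0)

# For W equally likely microstates, Eq. (5) reduces to S = k ln W.
W = 4
print(statistical_entropy([1.0 / W] * W))  # ~1.914e-23 J/K
print(k * math.log(W))                     # the same value
```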

4. Dimensions of entropy

The Boltzmann constant k has the dimensions of energy divided by temperature, the same as entropy then. Its value in SI units is 1.380×10⁻²³ J/K. Any dimensionless quantity, multiplied by this constant, becomes an entropy.

Let us use the symbol E for the dimension [energy] and Θ for the dimension [temperature], and start from the Clausius entropy. The dimensional analysis gives:

\Delta S = \int \frac{dQ_{rev}}{T} \;\Rightarrow\; [S] = \left[\frac{E}{\Theta}\right] = [k]\,; \qquad [S] = [k]\left[\frac{k\Theta}{E}\right] = [k]\left[\frac{E}{k\Theta}\right] = [k]    (8)

Let us note that dimensional equations are characterized by the square brackets. Since kΘ has the dimension of an energy, a ratio such as E/(kΘ) is dimensionless, and any dimensionless quantity of this kind, multiplied by k, has the dimension of an entropy.

In the case we start our discussion from the statistical entropy, we have the presence of the logarithmic function too. The discussion proposed in this paper will be in the framework of Eq. (8).

We have a clear example of (8) in the entropy of an ideal Fermi-Dirac gas. In a Fermi-Dirac system of particles, only one particle may occupy each non-degenerate energy level. Let us consider first the absolute zero. If there are N particles, the lowest N energy states are occupied up to the level E_o. At low temperatures, we have the entropy given by [6]:

S = Nk\, \frac{\pi^2}{2}\, \frac{kT}{E_o}    (9)

Here we can see explicitly one of the dimensional equations in (8):

S = Nk\, \frac{\pi^2}{2}\, \frac{kT}{E_o} \;\Rightarrow\; [S] = [k]\left[\frac{k\Theta}{E}\right]    (10)

Of course, we can also have different dimensional equations, such as the following:

[S] = [k]\left[\frac{E}{k\Theta}\right]^{\varepsilon} = [k]    (11)

In (11), ε is a given power. Here we have energies and temperatures; in fact, we can find several other dimensionless ratios too, for instance ratios of lengths, as we will see in this paper.

5. Entropy of the Debye model of solids

First of all, let us find an entropy which contains a ratio of temperatures. In the Debye model, a solid is treated as an isotropic elastic continuum in which the velocity of the sound is constant. For the longitudinal and transverse waves, Peter Debye (1884-1966) put a cut-off at the upper limit of the frequencies, to justify the fact that the solid is considered as an elastic continuum. The upper angular frequency of the waves is ω_D. The model is characterized by a temperature, the Debye temperature, which is given by Θ_D = ħω_D/k. Let us define the dimensionless variable x = ħω/kT. The heat capacity of the solid is [7]:

C_V = 9Nk \left(\frac{T}{\Theta_D}\right)^3 \int_0^{\Theta_D/T} \frac{x^4 e^x}{(e^x - 1)^2}\, dx    (12)

The lattice entropy is defined as:

S = \int_0^T \frac{C_V}{T'}\, dT' \;\propto\; Nk \left(\frac{T}{\Theta_D}\right)^3    (13)

In general:

S \propto Nk \left(\frac{T}{\Theta_D}\right)^d    (14)

In (14), d is the dimension of the lattice. In the case of a two-dimensional layer, d = 2; for a wire, d = 1. We can therefore have different powers, as in the dimensional equation (11).
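Eqs. (12)-(14) can be verified numerically for the classroom. In the following Python sketch (requiring SciPy), the number of atoms and the Debye temperature are illustrative values; the T³ law emerges when T ≪ Θ_D:

```python
import numpy as np
from scipy.integrate import quad

k_B = 1.380649e-23   # Boltzmann constant (J/K)
N = 6.022e23         # one mole of atoms (illustrative)
theta_D = 400.0      # a typical Debye temperature (K), illustrative

def c_V(T):
    """Debye heat capacity, Eq. (12)."""
    integrand = lambda x: x**4 * np.exp(x) / np.expm1(x)**2
    value, _ = quad(integrand, 0.0, theta_D / T)
    return 9.0 * N * k_B * (T / theta_D)**3 * value

def S_lattice(T):
    """Lattice entropy S = int_0^T (C_V/T') dT', Eq. (13)."""
    value, _ = quad(lambda t: c_V(t) / t, 1e-3, T)  # small cutoff near T' = 0
    return value

for T in (10.0, 20.0):
    print(T, S_lattice(T))
# Doubling T multiplies S by ~2^3 = 8: the d = 3 power law of Eqs. (13)-(14).
```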

6. Entropy and condensed matter

Let us consider two other examples from condensed matter physics: one concerning the vibrational entropy, the other the entropy of paramagnets.

In the previous section, we have discussed the entropy of the Debye model. Of course, we can have models considering not only the elastic waves having a constant velocity of the sound, but containing phonons and their true dispersions. A phonon is a collective excitation in a periodic arrangement of atoms or molecules, such as in crystalline solids. Phonons are obtained from the second quantization of the displacement field of solids. An assembly of phonons possesses an entropy given by:

S = -k \sum_i \ln\left(1 - e^{-\hbar\omega_i/kT}\right) + k \sum_i \frac{\hbar\omega_i/kT}{e^{\hbar\omega_i/kT} - 1}    (15)

The sum is over all the wave-vectors and polarizations of the system [8].

Einstein proposed in 1907 that a solid could be considered as an assembly of a large number of identical oscillators, all atoms oscillating with the same frequency. If we use the Einstein model of solids, and introduce the Einstein temperature

\hbar\omega_E = k\,\Theta_E    (16)

(ħ = h/2π is the reduced Planck constant), the entropy is:

S = 3Nk\, \frac{\Theta_E/T}{e^{\Theta_E/T} - 1} - 3Nk \ln\left(1 - e^{-\Theta_E/T}\right)    (17a)

And also:

S = 3Nk\, \frac{(\Theta_E/T)\, e^{-\Theta_E/T}}{1 - e^{-\Theta_E/T}} - 3Nk \ln\left(1 - e^{-\Theta_E/T}\right)    (17b)

Here we have a simple ratio of temperatures:

[S] = [k]\left[\frac{\Theta_E}{\Theta}\right] = [k]    (18)
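The equivalence of the two forms in Eqs. (17) is also easy to check numerically; in this Python sketch the number of oscillators and the Einstein temperature are illustrative values:

```python
import math

k = 1.380649e-23  # Boltzmann constant (J/K)

def S_17a(N, theta_E, T):
    """Einstein-model entropy, Eq. (17a)."""
    x = theta_E / T
    return 3 * N * k * (x / (math.exp(x) - 1.0) - math.log(1.0 - math.exp(-x)))

def S_17b(N, theta_E, T):
    """Eq. (17b): the same quantity, rearranged."""
    x = theta_E / T
    e = math.exp(-x)
    return 3 * N * k * (x * e / (1.0 - e) - math.log(1.0 - e))

N, theta_E = 6.022e23, 300.0     # illustrative values
print(S_17a(N, theta_E, 150.0))  # ~11.4 J/K
print(S_17b(N, theta_E, 150.0))  # identical, as Eqs. (17) require
```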

The vibrational entropy of Eqs. (17) appears also in the calculation of the entropy of diatomic gases. For these gases, we have the contribution of the rotational modes too. If the molecules have a moment of inertia I, at low temperatures we have that [9]:

S_{rot} = 3Nk \left(1 + \frac{\hbar^2}{IkT}\right) e^{-\hbar^2/IkT}    (19)

In (19) we have, as in (11), the following dimensional equation:

[S_{rot}] = [k]\left[\frac{\hbar^2}{Ik\Theta}\right]    (20)

Let us now discuss an example of the entropy of materials having a magnetisation, in particular of a spin-1/2 paramagnet, in terms of the temperature and of the applied magnetic field [6]:

S = Nk \ln\left(1 + e^{-2\mu B/kT}\right) + Nk\, \frac{2\mu B/kT}{e^{2\mu B/kT} + 1}    (21)

In (21), we have the magnetic field B and the magnetic moment μ. At low temperatures, the first term is negligible and then:

S \simeq Nk\, \frac{2\mu B}{kT}\, e^{-2\mu B/kT}\,; \qquad [S] = [k]\left[\frac{E}{E}\right]    (22)

Here we have a ratio as in Eq. (8).

7. Black-body radiation

A quite interesting example of dimensional equation comes from the entropy of the black-body radiation. Calculating the properties of the radiation from a black body was a major challenge in the theoretical physics of the late nineteenth century. The problem was solved in 1901 by Max Planck, in the approach which is known today as Planck's law of black-body radiation [10]. The thermodynamics of homogeneous and isotropic electromagnetic radiation in a cavity with given volume and temperature is analysed in [11]. In this reference we find that the entropy is:

S = \frac{4}{3}\, \frac{8\pi^5 k^4}{15\, h^3 c^3}\, V T^3    (23)

Besides the Planck constant h, we have also the speed of light c. The Planck constant is the quantum of action, introduced to describe the proportionality constant between the energy of a charged atomic oscillator in the wall of the black body and the frequency ν of its associated electromagnetic wave. Its relevance is now fundamental for quantum mechanics, describing the relationship between energy and frequency in the Planck-Einstein relation:

E = h\nu = \frac{h}{2\pi}\, 2\pi\nu = \hbar\omega    (24)

In (24), we have the angular frequency ω and the reduced Planck constant ħ. Action has the dimensions of [energy]·[time], and its SI unit is the joule-second. Therefore, the corresponding dimensional equation is (let us remember that, in the dimensional equations, T means time):

[S] = [k]\left[\frac{k^3 \Theta^3 L^3}{h^3 c^3}\right] = [k]\left[\frac{E^3 L^3}{E^3 T^3 \cdot L^3 T^{-3}}\right] = [k]\left[\frac{E^3}{E^3}\right] = [k]    (25)

8. An ideal Bose gas

In fact, we can write the last equation in (25) in a different form:

[S] = [k]\left[\frac{k^3 \Theta^3}{E^3}\, \frac{L^3}{c^3 T^3}\right] = [k]\left[\frac{L^3}{L^3}\right]    (26)

In (26), we used [cT] = L. From (26), we have that the entropy can be the Boltzmann constant multiplied by a ratio of given powers of lengths (in this specific case, the third power). Let us discuss an example, that of an ideal Bose gas.

An ideal Bose gas is the quantum-mechanical version of a classical ideal gas, composed of bosons. These particles have an integer value of spin and obey the Bose–Einstein statistics. This statistics, developed by Satyendra Nath Bose for photons, was extended by Albert Einstein to massive particles. In 1924 [12], Einstein deduced that an ideal gas of bosons can form a condensate at a low enough temperature. This condensate is known as the Bose–Einstein condensate: a state of matter in which separate atoms or subatomic particles, cooled to near 0 K, coalesce into a single quantum mechanical entity, described by a wave function. As discussed in [13], there is a first-order transition, having a critical temperature T_C.
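Before evaluating the entropy of this massive Bose gas, Eq. (23) itself offers a quick classroom computation; here one cubic metre of cavity radiation at room temperature (illustrative values), with CODATA constants from SciPy:

```python
import math
from scipy.constants import k, h, c  # CODATA values

def S_photon_gas(V, T):
    """Black-body radiation entropy, Eq. (23): S = (4/3) a V T^3,
    where a = 8 pi^5 k^4 / (15 h^3 c^3) is the radiation constant."""
    a = 8 * math.pi**5 * k**4 / (15 * h**3 * c**3)
    return (4.0 / 3.0) * a * V * T**3

print(S_photon_gas(1.0, 300.0))  # ~2.7e-8 J/K for 1 m^3 at 300 K
```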

The entropy of the ideal Bose gas is given by:

S = Nk\left[\frac{5}{2}\, \frac{v}{\lambda^3}\, g(z) - \ln z\right] \quad (T > T_C)\,; \qquad S = Nk\, \frac{5}{2}\, \frac{v}{\lambda^3}\, g(1) \quad (T < T_C)    (27)

In (27), v = V/N, where V is the volume of the gas and N the number of particles. We find also the thermal wavelength λ, which is given by:

\lambda^2 = \frac{2\pi \hbar^2}{m k T}    (28)

In (28), we find the mass m of the particle. The quantity z, which is named "fugacity", is dimensionless: it is determined by the ratio λ³/v, and its values range from 0 to 1. The function g(z) is given by:

g(z) = -\frac{4}{\sqrt{\pi}} \int_0^{\infty} dx\, x^2 \ln\left(1 - z\, e^{-x^2}\right) = \sum_{\ell=1}^{\infty} \frac{z^{\ell}}{\ell^{5/2}}    (29)

Since g(z) and ln z are dimensionless, from (27) and (29) we see clearly that the entropy of the Bose gas has the dimensions:

[S] = [k]\left[\frac{v}{\lambda^3}\right] = [k]\left[\frac{L^3}{L^3}\right]    (30)
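Eqs. (27)-(29) can be put to work numerically. The following Python sketch (illustrative particle mass, number and temperature, chosen so that the gas is below T_C) evaluates the series form of Eq. (29) and the condensed-phase entropy of Eq. (27):

```python
import math
from scipy.constants import k, hbar  # CODATA values

def g(z, terms=200):
    """Series form of Eq. (29): g(z) = sum_{l>=1} z^l / l^(5/2)."""
    return sum(z**l / l**2.5 for l in range(1, terms + 1))

def S_condensed(N, T, v, m):
    """Entropy below T_C, Eq. (27): S = Nk (5/2)(v/lambda^3) g(1)."""
    lam = math.sqrt(2 * math.pi * hbar**2 / (m * k * T))  # Eq. (28)
    return N * k * 2.5 * (v / lam**3) * g(1.0)

# Illustrative values: N atoms of rubidium-like mass in a small box at 100 nK.
m, N, V, T = 1.44e-25, 1.0e5, 1.0e-15, 100e-9
print(S_condensed(N, T, V / N, m))  # ~2e-19 J/K
```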

9. Bekenstein-Hawking entropy of black holes

Can we find entropies whose dimensional equations have a different power of the lengths? The answer is positive. We have it in Ref. [14], which gives the entropy of a black hole in a specific formula.

As proposed by Jacob Bekenstein, if a black hole were an object having no entropy, this fact would lead to a violation of the second law of thermodynamics [15]. In fact, when a hot gas with entropy enters a black hole, once it crosses the event horizon, its entropy would disappear. To save the second law, the black hole must be an object having an entropy, the increase of which is greater than the entropy carried by the gas. This entropy depends on the observable properties of the black hole: mass, electric charge and angular momentum. These three parameters enter only in a combination which represents the surface area of the black hole, as a consequence of the "area theorem" [16,17]. This theorem tells that the area of the event horizon of a black hole cannot decrease; it is reminiscent of the law concerning the thermodynamic entropy of closed systems. As a consequence, the black hole entropy is proposed as a monotonic function of the area: if A stands for the surface area of a black hole (the area of the event horizon), then the black hole entropy is given by:

S_{BH} = k\, \frac{A}{4 L_P^2} = k\, \frac{c^3 A}{4 G \hbar}    (31)

This entropy is known as the Bekenstein-Hawking entropy. In (31), L_P stands for the Planck length:

L_P^2 = \frac{G\hbar}{c^3}    (32)

while G, ħ, c denote, respectively, Newton's gravitational constant, the reduced Planck-Dirac constant and the speed of light. From this equation, it is clear that:

[S_{BH}] = [k]\left[\frac{L^2}{L^2}\right]    (33)

In this formula, the black hole entropy is identified, through a constant, with its surface area [14]. This fact was clear after it was discovered that a black hole emits radiation at a well-defined temperature:

kT = \frac{\hbar c^3}{8\pi G M}    (34)

This temperature is also known as the Hawking radiation temperature; M is the mass. The radius of a black hole is R = 2GM/c², which is the Schwarzschild radius. The surface area is then:

A = 4\pi R^2 = 16\pi\, \frac{G^2 M^2}{c^4}    (35)

The entropy is dS = dQ/T = c² dM / T, where the heat increment is identified with the energy equivalent of the in-falling mass [18]. After integration, we can obtain (31).

Let us continue our dimensional analysis:

S_{BH} = k\, \frac{c^3 A}{4 G \hbar} \;\Rightarrow\; [S_{BH}] = [k]\left[\frac{L^5 T^{-3}}{(F L^2 M^{-2})(F L T)}\right]    (36)

In (36), F means [force] and c a [speed]. Using F = MLT⁻², so that [G] = M⁻¹L³T⁻² and [ħ] = ML²T⁻¹, we have:

[S_{BH}] = [k]\left[\frac{L^5 T^{-3}}{(M^{-1} L^3 T^{-2})(M L^2 T^{-1})}\right] = [k]\left[\frac{L^5 T^{-3}}{L^5 T^{-3}}\right] = [k]\left[\frac{L^2 c^3}{L^2 c^3}\right] = [k]    (37)

Let us show that, starting from the dimensions of the classical Clausius entropy, we can arrive at the BH entropy. E is the energy and W the work, which have the same dimensions:

[S] = [k]\left[\frac{k\Theta}{E}\right] = [k]\left[\frac{k\Theta}{W}\right] = [k]\left[\frac{k\Theta}{pV}\right]    (38)

The volume V is the product of an area A and a length L; then:

[S] = [k]\left[\frac{k\Theta\, A}{F A L}\right] = [k]\left[\frac{k\Theta\, A L^2}{G M^2 A L}\right]    (39)

The force F has the same dimension of the gravitational force GM²/L². Considering that [kΘ] has the same dimension of the internal energy U of a perfect gas, which is dimensionally the same of a kinetic energy:

[S] = [k]\left[\frac{U\, A L^2}{G M^2 A L}\right]    (40)

Then, with U = ML²T⁻² and MAT⁻¹ = ML²T⁻¹ = [ħ]:

[S] = [k]\left[\frac{M L^2 T^{-2}\, A\, L^2}{G M^2 A L}\right] = [k]\left[\frac{L^3 T^{-3}\, A}{G\, M A T^{-1}}\right] = [k]\left[\frac{c^3 A}{G \hbar}\right]    (41)

And these are the dimensions of the Bekenstein-Hawking entropy k c³A/(4Għ).

Using again (35): The zero-point energy of quantum electrodynamics was

[S] = [k]\left[\frac{c^3 A}{G\hbar}\right] = [k]\left[\frac{c^3\, G^2 M^2}{G\, c^4 \hbar}\right] = [k]\left[\frac{G M^2 L^{-1}}{L^{-1}\, c\, \hbar}\right] = [k]\left[\frac{E}{E}\right]    (42)

In fact, GM²/L has the dimension of a (gravitational) energy, and ħc/L has the dimension of an energy too.

About the amount of entropy, in [14] an example is given. A one-solar-mass Schwarzschild black hole has a horizon area of the same order as the municipal area of Atlanta or Chicago. Its entropy is about 4×10⁷⁷ k, which is about twenty orders of magnitude larger than the thermodynamic entropy of the Sun [14].
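These magnitudes are easily reproduced. The following Python sketch evaluates Eq. (31) for one solar mass and also verifies, anticipating Eq. (43) of the next section, that the Bekenstein-Hawking entropy exactly saturates the Bekenstein bound:

```python
import math
from scipy.constants import k, hbar, G, c  # CODATA values

M_sun = 1.989e30  # solar mass (kg)

def S_BH(M):
    """Bekenstein-Hawking entropy, Eq. (31): S = k c^3 A / (4 G hbar)."""
    R = 2 * G * M / c**2       # Schwarzschild radius
    A = 4 * math.pi * R**2     # horizon area, Eq. (35)
    return k * c**3 * A / (4 * G * hbar)

S = S_BH(M_sun)
print(S, S / k)  # ~1.5e54 J/K, i.e. of order 1e77 in units of k

# Bekenstein bound, Eq. (43): S <= 2 pi k R E / (hbar c), with E = M c^2.
R, E = 2 * G * M_sun / c**2, M_sun * c**2
print((2 * math.pi * k * R * E / (hbar * c)) / S)  # 1.0: exact saturation
```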

10. The Bekenstein bound

The Bekenstein bound is the upper limit of the entropy S that can be contained within a given finite region of space which has a finite amount of energy. It is also the maximum amount of information required to perfectly describe a given physical system down to the quantum level [19]. The universal form of the bound was originally found by Jacob Bekenstein as the inequality [19]:

S \le \frac{2\pi k\, R E}{\hbar c}    (43)

where R is the radius of a sphere that can enclose the given system and E is the total mass-energy, including any rest mass. Note that the expression of the bound does not contain the gravitational constant G. The Bekenstein-Hawking entropy of black holes exactly saturates the bound. Let us consider the dimensional equation of this bound:

[S] = [k]\left[\frac{R E}{\hbar c}\right] = [k]\left[\frac{L E}{\hbar\, L T^{-1}}\right] = [k]\left[\frac{E}{E T \cdot T^{-1}}\right] = [k]\left[\frac{E}{E}\right]    (44)

The bound is closely associated with the black hole thermodynamics.

11. Entropy of vacuum

The vacuum has entropy too. In quantum mechanics and in quantum field theory, the vacuum is defined as the state of the considered system with the lowest possible energy. This energy is known as the zero-point energy. Since all quantum mechanical systems undergo fluctuations because of their wave-like nature, the vacuum is subjected to fluctuations too. The fluctuations are temporary changes in the amount of energy, as given by the Werner Heisenberg uncertainty principle.

The zero-point energy of quantum electrodynamics was an important result in the theory of quantized fields [20,21]. The Casimir effect deals with the modification of this energy. The original analysis, proposed in [22], provided the basic problem, by calculating the force between two conducting plates due to the modification, imposed by their presence, of the possible electromagnetic modes.

For an electromagnetic field, the vacuum may be considered as its equilibrium state in the limit of vanishing temperature; in [20], the authors have studied the extension to finite temperatures. Once the Casimir free energy had been calculated, the pressure on the plates can be obtained from it by the standard thermodynamic formulae [20]. In this reference, the radiation is supposed confined between two conducting plates. The edge size of both plates is L. The first is placed at z = 0 in the XY plane; the second plate is placed at z = a, parallel to the XY plane. Let us suppose L ≫ a. The entropy of the zero-point fluctuations of the fields is [20]:

S = \frac{3\,\zeta(3)}{4\pi}\, k\, \frac{L^2 a}{a^3}\,; \qquad [S] = [k]\left[\frac{L^3}{L^3}\right]    (45)

Eq. (45) is given in the limit of high temperature. As in (26), this entropy is a ratio of cubic powers of lengths. In (45), we have:

\zeta(n) = \sum_{m=1}^{\infty} \frac{1}{m^n}    (46)

In [20], it is stressed that, while the Casimir energy vanishes in the high-temperature limit, the Casimir free energy density does not. Thus, the resultant force of attraction between the plates is of entropic origin [20].
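A numerical illustration of Eq. (45) follows. Note that the numerical prefactor is taken from the reconstruction given above and should be checked against [20]; the plate geometry is an illustrative choice satisfying L ≫ a:

```python
import math

k = 1.380649e-23  # Boltzmann constant (J/K)
zeta3 = sum(1.0 / m**3 for m in range(1, 100001))  # zeta(3) via Eq. (46)

def S_casimir(L, a):
    """High-temperature Casimir entropy, Eq. (45), prefactor as reconstructed."""
    return (3.0 * zeta3 / (4.0 * math.pi)) * k * L**2 * a / a**3

print(S_casimir(1e-2, 1e-6))  # square plates, 1 cm edge, 1 um gap: ~4e-16 J/K
```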

12. Entropic forces

Let us conclude the paper with another quite attractive problem, that of a force which, like the Casimir force, has its origin in the entropy of the corresponding physical system.

An entropic force is a phenomenological force coming from the statistical tendency of increasing entropy, rather than from a particular underlying microscopic force [23]. The first entropic approach was proposed for the Brownian motion in Ref. [24].

To have a force from entropy, we can use the following dimensional equation:

[F] = \left[\frac{E}{L}\right] = \left[\frac{S\Theta}{L}\right] = \left[\frac{S\Theta\, L}{L^2}\right]    (47)

Let us determine the entropic force for a specific phenomenology, that of the elasticity of polymers. Polymers can be modelled as freely jointed chains, with one fixed end and one free end [25]. The length of a rigid segment of the chain is b; n is the number of segments of length b; r is the distance between the fixed and free ends; and L_c is the contour length, equal to nb. When the polymer chain oscillates, the distance r changes over time. The probability of finding the chain ends at a distance r apart is given by the following Gaussian distribution:

f = 4\pi r^2 \left(\frac{2\pi n b^2}{3}\right)^{-\gamma} e^{-\gamma\, r^2/(n b^2)}\, dr    (48)

In (48), γ = 3/2. By using the entropy and the Helmholtz free energy, we can obtain a force which is like that of the Hooke's law. Let us consider the entropy [25]:

S = k \ln f    (49)

The entropy is linked to the Helmholtz free energy, so that, apart from numerical factors and terms not depending on r:

A = -TS = kT\, \frac{r^2}{L_c b}    (50)

And then [25]:

F = -\frac{dA}{dr} = -\frac{kT}{L_c b}\, r    (51)

This is the law corresponding to the dimensional equation previously proposed in Eq. (47).
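The entropic spring of Eqs. (48)-(51) also lends itself to a quick numerical estimate; in this Python sketch the segment length, segment number and temperature are illustrative classroom values:

```python
k = 1.380649e-23  # Boltzmann constant (J/K)

def entropic_force(r, n, b, T):
    """Entropic restoring force of Eq. (51): F = -(kT / (L_c b)) r,
    with contour length L_c = n b (numerical factors as in the text)."""
    L_c = n * b
    return -(k * T / (L_c * b)) * r

n, b, T = 1000, 0.5e-9, 300.0    # 1000 segments of 0.5 nm at room temperature
stiffness = k * T / (n * b * b)  # effective spring constant kT/(L_c b) (N/m)
print(stiffness)                 # ~1.7e-5 N/m
print(entropic_force(10e-9, n, b, T))  # ~ -1.7e-13 N at r = 10 nm
```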

This example on the entropic force helps us to conclude the paper stressing the importance of dimensional analysis too. In fact, a guessed dimensional equation can suggest a new approach to solve a specific problem.

References
1. Langhaar, H.L. (1980). Dimensional Analysis and Theory of Models, Wiley. ISBN 9780882756820
2. Worstell, J. (2014). Dimensional Analysis: Practical Guides in Chemical Engineering, Butterworth-Heinemann. ISBN 9780128012369
3. Ebeling, W.; Sokolov, I.M. (2005). Statistical Thermodynamics and Stochastic Theory of Nonequilibrium Systems, World Scientific Publishing. ISBN 978-9027716743
4. Lavenda, B.H. (2009). A New Perspective on Thermodynamics, Springer. ISBN 9781441914309
5. Frigg, R.; Werndl, C. (2010). Entropy - A Guide for the Perplexed. In Probabilities in Physics; Beisbart, C. & Hartmann, S., Editors, Oxford University Press. ISBN 9780199577439
6. Dugdale, J.S. (1996). Entropy And Its Physical Meaning, CRC Press. ISBN 9780748405695
7. Srivastava, J.P. (2011). Elements of Solid State Physics, PHI Learning Pvt. Ltd. ISBN 9788120343245
8. Fultz, B. (2010). Vibrational Thermodynamics of Materials. Progress in Materials Science 55(4), 247-352. DOI 10.1016/j.pmatsci.2009.05.002
9. Landau, L.D.; Lifshitz, E.M. (1980). Statistical Physics, Elsevier Ltd. ISBN 978-0750633727
10. Planck, M. (1901). On the Distribution Law of Energy in the Normal Spectrum, Annalen der Physik IV(4), 553-563. DOI 10.1002/andp.19013090310
11. Kelly, R.E. (1981). Thermodynamics of Blackbody Radiation, Am. J. Phys. 49(8), 714-719. DOI 10.1119/1.12416
12. Bose-Einstein Condensate (BEC) (2015). Encyclopaedia Britannica Online. Retrieved August 19, 2015, from http://www.britannica.com/science/Bose-Einstein-condensate
13. Huang, K. (1963). Statistical Mechanics, Wiley. ISBN 9780471815181
14. Bekenstein, J.D. (2008). Bekenstein-Hawking Entropy. Scholarpedia 3(10), 7375. DOI 10.4249/scholarpedia.7375
15. Vv. Aa. (2015). Holographic Principle, Wikipedia.
16. Hawking, S.W. (1971). Gravitational Radiation from Colliding Black Holes, Phys. Rev. Letters 26, 1344-1346. DOI 10.1103/PhysRevLett.26.1344
17. Misner, C.W.; Thorne, K.S.; Wheeler, J.A. (1973). Gravitation, Freeman. ISBN 9780716703440
18. Bradford, R.A.W. (2012). Entropy in the. Retrieved August 19, 2015, from www.rickbradford.co.uk/ChoiceCutsCh58.pdf
19. Bekenstein, J.D. (1981). Universal Upper Bound on the Entropy-to-Energy Ratio for Bounded Systems, Phys. Rev. D 23(2), 287-298. DOI 10.1103/PhysRevD.23.287
20. Revzen, M.; Opher, R.; Opher, M.; Mann, A. (1997). Casimir's Entropy, Journal of Physics A: Mathematical and General 30, 7783-7789. DOI 10.1088/0305-4470/30/22/017
21. Itzykson, C.; Zuber, J.B. (2006). Quantum Field Theory, Dover Books on Physics. ISBN 978-0486445687
22. Casimir, H.B.G. (1948). On the Attraction between Two Perfectly Conducting Plates, Proc. K. Ned. Akad. Wet. B51, 793-795.
23. Vv. Aa. (2015). Entropic Force, Wikipedia.
24. Neumann, R.M. (1977). The Entropy of a Single Gaussian Macromolecule in a Noninteracting Solvent. The Journal of Chemical Physics 66(2), 870-871. DOI 10.1063/1.433923
25. Vv. Aa. (2015). Rubber Elasticity, Wikipedia.
