On Entropy, Information, and Conservation of Information

Total pages: 16

File type: PDF; size: 1020 KB

On Entropy, Information, and Conservation of Information

Yunus A. Çengel
Department of Mechanical Engineering, University of Nevada, Reno, NV 89557, USA; [email protected]

Abstract: The term entropy is used in different meanings in different contexts, sometimes in contradictory ways, resulting in misunderstandings and confusion. The root cause of the problem is the close resemblance of the defining mathematical expressions of entropy in statistical thermodynamics and information in the communications field, also called entropy, differing only by a constant factor with the unit 'J/K' in thermodynamics and 'bits' in information theory. The thermodynamic property entropy is closely associated with the physical quantities of thermal energy and temperature, while the entropy used in the communications field is a mathematical abstraction based on probabilities of messages. The terms information and entropy are often used interchangeably in several branches of science. This practice gives rise to the phrase conservation of entropy in the sense of conservation of information, which is in contradiction to the fundamental increase of entropy principle in thermodynamics as an expression of the second law. The aim of this paper is to clarify matters and eliminate confusion by putting things into their rightful places within their domains. The notion of conservation of information is also put into a proper perspective.

Keywords: entropy; information; conservation of information; creation of information; destruction of information; Boltzmann relation

Citation: Çengel, Y.A. On Entropy, Information, and Conservation of Information. Entropy 2021, 23, 779. https://doi.org/10.3390/e23060779

Academic Editors: Milivoje M. Kostic and Jean-Noël Jaubert
Received: 27 May 2021; Accepted: 17 June 2021; Published: 19 June 2021

Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Copyright: © 2021 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction

One needs to be cautious when dealing with information since it is defined differently in different fields such as physical sciences, biological sciences, communications, philosophy, and daily life. Generalization of a particular meaning into other fields results in incompatibilities and causes confusion. Therefore, a clear distinction with precise language and mathematics is needed to avoid misunderstandings.

The basic building block of modern electronics is the transistor circuit, which acts as a switch capable of staying in one of two states of on-or-off, or equivalently, closed-or-open, or 1-or-0 in the binary code, in response to an electrical signal input. A transistor functions in the same way as the light switch shown in Figure 1: the lamp turns on when the switch is moved to the on position and electric current is allowed to pass, and the lamp turns off when the switch is moved to the off position.

In electronics, a single switch or transistor constitutes the most elementary hardware. This basic case of one elementary component, n = 1, corresponds to two states of on and off, M = 2. In the area of communication via signal transmission, the information I associated with a single basic switch being on or off corresponds to a binary digit, and thus to a choice between two messages, and is taken as the unit of information called a bit. Therefore, 1 bit represents the amount of information that can be stored by a single switch with two stable states or positions of on/off, closed/open, or 1/0. The bit also represents the basic unit of entropy for a system that has only two states.

If one switch corresponds to one bit of information, we intuitively feel that information should be additive, and three switches should correspond to three bits of information. However, in the case of three switches, there are 2^3 = 8 possible choices or combinations of 111, 110, 101, 100, 001, 010, 011, and 000 for messages, the last choice corresponding to the case of all three switches being off.
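To make the counting concrete, here is a minimal Python sketch that enumerates the on/off combinations of n switches; the function name enumerate_messages is illustrative, not from the paper.

```python
from itertools import product

def enumerate_messages(n: int) -> list[str]:
    """Return all distinct on/off combinations (messages) for n switches."""
    return ["".join(bits) for bits in product("10", repeat=n)]

messages = enumerate_messages(3)
print(messages)       # ['111', '110', '101', '100', '011', '010', '001', '000']
print(len(messages))  # 8, i.e., 2**3 messages; the count doubles with each added switch
```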
Figure 1. An electric lamp switch has two possible states only: on and off (or 1 and 0). If the lamp is lit, we know that the switch is in the on position (Photos by Yunus Çengel).

Therefore, as the number of switches increases linearly, the number of possible messages, which is equivalent to the amount of information that can be conveyed, increases exponentially. To keep things simple and manageable, it is desirable to define information in such a way that the amount of information is linearly related, and thus directly proportional, to the amount of hardware, which in this case is the number of switches or transistors. After all, the amount of hardware is a definitive indicator of the amount of information. This way, the amount of hardware becomes a measure of information, which is very practical if we are to build communication systems with various numbers of transistors or switches.

A convenient and intuitive way of quantifying information is to relate it linearly to the number of switches so that information doubles when the number of switches is doubled. The simplest way of doing this is to equate one switch to one unit of information. Following
the suggestion of Hartley [1], Shannon [2] and Weaver [3] formulated this idea by defining the information I associated with n switches as the logarithm to the base 2 of the number of possible choices or messages N:

I = log2 N = log2 2^n = n log2 2 = n (bits) (1)

since log2 2 = 1, as shown in Figure 2. In the case of a single switch, n = 1 and thus the information is I = 1 bit. Then, as desired, the information associated with n = 3 switches becomes

I = log2 N = log2 8 = log2 2^3 = 3 log2 2 = 3 bits (2)

The unit bit may appear obscure and arbitrary at first. However, this is also the case for many familiar units such as the meter, the gram, and the second. Besides, it is common to use constant multiples of a unit in place of the unit itself, such as using the kilogram instead of the gram as a unit of mass and using the byte (equivalent to eight bits) or even the gigabyte (1 billion bytes) instead of the bit as a unit of information and storage. Additionally, note from Figure 2 that log2 x = 3.322 log10 x, and thus the logarithm of any base can be used in information calculations in practice, including the natural logarithm, by incorporating a suitable constant factor of multiplication.

Figure 2. Some mathematical relations regarding logarithms (no specified base indicates any base).

A more general and practical approach is to define information in terms of the probabilities p of messages instead of the number of choices for messages, as presented next. Noting that the probability of a choice or message in this equiprobable case with three switches is p = 1/N = 1/8,

I = log2 N = log2 (1/p) = −log2 p = −log2 (1/8) = log2 2^3 = 3 bits (3)

since again, log2 2 = 1.
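As a numerical check of Equations (1)-(3), the sketch below computes the information content both from the number of equally likely messages and from the probability of a single message; the helper names are mine, not the paper's.

```python
import math

def info_from_count(N: int) -> float:
    """Information in bits from the number of equally likely messages: I = log2(N)."""
    return math.log2(N)

def info_from_probability(p: float) -> float:
    """Information in bits from the probability of one message: I = -log2(p)."""
    return -math.log2(p)

n = 3                                # number of switches
N = 2**n                             # number of possible messages
print(info_from_count(N))            # 3.0 bits, matching Eq. (2)
print(info_from_probability(1 / N))  # 3.0 bits, matching Eq. (3)
print(math.log2(10))                 # ~3.322, the factor in log2 x = 3.322 log10 x
```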
Recommended publications
  • ENERGY, ENTROPY, AND INFORMATION (Jean Thoma, June 1977)
    ENERGY, ENTROPY, AND INFORMATION. Jean Thoma, June 1977. Research Memoranda are interim reports on research being conducted by the International Institute for Applied Systems Analysis, and as such receive only limited scientific review. Views or opinions contained herein do not necessarily represent those of the Institute or of the National Member Organizations supporting the Institute.

    PREFACE: This Research Memorandum contains the work done during the stay of Professor Dr. Sc. Jean Thoma, Zug, Switzerland, at IIASA in November 1976. It is based on extensive discussions with Professor Häfele and other members of the Energy Program. Although the content of this report is not yet very uniform because of the different starting points on the subject under consideration, its publication is considered a necessary step in fostering the related discussion at IIASA evolving around the problem of energy demand.

    ABSTRACT: Thermodynamic considerations of energy and entropy are pursued in order to arrive at a general starting point for relating entropy, negentropy, and information. Thus one hopes to ultimately arrive at a common denominator for quantities of a more general nature, including economic parameters. The report closes with the description of various heating applications and related efficiencies. Such considerations are important in order to understand in greater depth the nature and composition of energy demand. This may be highlighted by the observation that it is, of course, not the energy that is consumed or demanded but the information that goes along with it.

    TABLE OF CONTENTS: 1. Introduction; 2. Various Aspects of Entropy; 2.1 Phenomenological Entropy; ...
  • Practice Problems from Chapters 1-3
    Practice Problems from Chapters 1-3. Problem 1: One mole of a monatomic ideal gas goes through a quasistatic three-stage cycle (1-2, 2-3, 3-1) shown in the figure; T1 and T2 are given. (a) (10) Calculate the work done by the gas. Is it positive or negative? (b) (20) Using two methods (the Sackur-Tetrode equation and dQ/T), calculate the entropy change for each stage and for the whole cycle, ΔS_total. Did you get the expected result for ΔS_total? Explain. (c) (5) What is the heat capacity (in units of R) for each stage?

    Problem 1 (cont.) (a) Stage 1-2: V ∝ T, so P = const (isobaric process) and δW12 = P(V2 − V1) = R(T2 − T1) > 0. Stage 2-3: V = const (isochoric process), so δW23 = 0. Stage 3-1: T = const (isothermal process), so δW31 = ∫ P dV = R T1 ∫ dV/V = R T1 ln(V1/V2) = R T1 ln(T1/T2) < 0. In total, δW_total = δW12 + δW31 = R(T2 − T1) + R T1 ln(T1/T2) = R T1 [T2/T1 − 1 − ln(T2/T1)] > 0.

    (b) Sackur-Tetrode equation: S(U, V, N) = R ln(V/N) + (3/2) R ln(U/N) + kB ln f(N, m), which gives ΔS = R ln(Vf/Vi) + (3/2) R ln(Tf/Ti). Stage 1-2 (isobaric, V ∝ T): ΔS12 = (5/2) R ln(T2/T1). Stage 2-3 (isochoric): ΔS23 = (3/2) R ln(T1/T2) = −(3/2) R ln(T2/T1). Stage 3-1 (isothermal): ΔS31 = R ln(V1/V2) = −R ln(T2/T1). For the whole cycle, ΔS_cycle = (5/2) R ln(T2/T1) − (3/2) R ln(T2/T1) − R ln(T2/T1) = 0, as it should be for a quasistatic (reversible) cyclic process, because S is a state function.
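As a quick numerical check of the excerpt's result that the cycle's entropy change vanishes, the sketch below plugs in illustrative temperatures T1 = 300 K and T2 = 600 K (assumed values, not given in the problem):

```python
import math

R = 8.314              # gas constant, J/(mol*K)
T1, T2 = 300.0, 600.0  # assumed temperatures, K

dS12 = 2.5 * R * math.log(T2 / T1)   # isobaric stage: (5/2) R ln(T2/T1)
dS23 = -1.5 * R * math.log(T2 / T1)  # isochoric stage: -(3/2) R ln(T2/T1)
dS31 = -R * math.log(T2 / T1)        # isothermal stage: -R ln(T2/T1)

print(dS12 + dS23 + dS31)  # ~0.0 J/(mol*K), as expected for a reversible cycle
```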
  • Lecture 4: 09.16.05 Temperature, Heat, and Entropy
    3.012 Fundamentals of Materials Science, Fall 2005. Lecture 4: 09.16.05 Temperature, heat, and entropy. Today: LAST TIME: state functions; path-dependent variables: heat and work. DEFINING TEMPERATURE: the zeroth law of thermodynamics; the absolute temperature scale. CONSEQUENCES OF THE RELATION BETWEEN TEMPERATURE, HEAT, AND ENTROPY: HEAT CAPACITY: the difference between heat and temperature; defining heat capacity.
  • Entropy: Ideal Gas Processes
    Chapter 19: The Kinetic Theory of Gases. Thermodynamics is the macroscopic picture; the kinetic theory of gases connects the micro to the macro picture. One mole is the number of atoms in a 12 g sample of carbon-12: Avogadro's number NA = 6.02 × 10^23 mol^−1. Carbon-12 has 6 protons, 6 neutrons, and 6 electrons, i.e., 12 atomic units of mass, assuming mp = mn. Another way to do this is to know the mass of one molecule: the number of moles is then n = N/NA, where N = NA · m_sample/m_mole-mass.

    Ideal Gases, Ideal Gas Law: It was found experimentally that if 1 mole of any gas is placed in containers that have the same volume V and are kept at the same temperature T, approximately all have the same pressure p. The small differences in pressure disappear if lower gas densities are used. Further experiments showed that all low-density gases obey the equation pV = nRT. Here R = 8.31 J/(mol·K) and is known as the "gas constant." The equation itself is known as the "ideal gas law." The constant R can be expressed as R = k·NA, where k is called the Boltzmann constant and is equal to 1.38 × 10^−23 J/K. If we substitute R as well as n = N/NA in the ideal gas law, we get the equivalent form pV = NkT, where N is the number of molecules in the gas. The behavior of all real gases approaches that of an ideal gas at low enough densities. Low density means that the gas molecules are far enough apart that they do not interact with one another, but only with the walls of the gas container.
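A minimal numerical illustration of the two equivalent forms pV = nRT and pV = NkT, using assumed conditions of 1 mol at 300 K in a 0.025 m^3 container:

```python
R = 8.31      # gas constant, J/(mol*K)
k = 1.38e-23  # Boltzmann constant, J/K
NA = 6.02e23  # Avogadro's number, 1/mol

n, T, V = 1.0, 300.0, 0.025  # assumed: 1 mol at 300 K in 0.025 m^3

p_from_moles = n * R * T / V             # pV = nRT
p_from_molecules = (n * NA) * k * T / V  # pV = NkT with N = n*NA
print(p_from_moles, p_from_molecules)    # both ~1.0e5 Pa (about 1 atm); tiny gap from rounded constants
```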
  • Temperature Adaptation of Enzymes: Roles of the Free Energy, the Enthalpy, and the Entropy of Activation (Rabbit/Lobster/Chick/Tuna/Halibut/Cod)
    Proc. Nat. Acad. Sci. USA, Vol. 70, No. 2, pp. 430-432, February 1973. Temperature Adaptation of Enzymes: Roles of the Free Energy, the Enthalpy, and the Entropy of Activation (rabbit/lobster/chick/tuna/halibut/cod). PHILIP S. LOW, JEFFREY L. BADA, AND GEORGE N. SOMERO, Scripps Institution of Oceanography, University of California, La Jolla, Calif. 92037. Communicated by A. Baird Hastings, December 8, 1972.

    ABSTRACT: The enzymic reactions of ectothermic (cold-blooded) species differ from those of avian and mammalian species in terms of the magnitudes of the three thermodynamic activation parameters: the free energy of activation (ΔG*), the enthalpy of activation (ΔH*), and the entropy of activation (ΔS*). Ectothermic enzymes are more efficient than the homologous enzymes of birds and mammals in reducing the ΔG* "energy barrier" to a chemical reaction. Moreover, the relative importance of the enthalpic and entropic contributions to ΔG* differs between these two broad classes of organisms.

    ... function if they were capable of reducing the ΔG* characteristic of their reactions more than were the homologous enzymes of more warm-adapted species, i.e., birds or mammals. In this paper, we report that the values of ΔG* are indeed slightly lower for enzymic reactions catalyzed by enzymes of ectotherms, relative to the homologous reactions of birds and mammals. Moreover, the relative contributions of the enthalpies and entropies of activation to ΔG* differ markedly and, we feel, adaptively, between ectothermic and avian-mammalian enzymic reactions. Because all organisms conduct many of the same chemical transformations, certain functional classes of enzymes are present in virtually all species. METHODS ...
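The trade-off described here follows from ΔG* = ΔH* − TΔS*; the sketch below uses made-up parameter values purely to show how different enthalpy/entropy splits can produce the same activation barrier.

```python
def gibbs_activation(dH: float, T: float, dS: float) -> float:
    """Free energy of activation, dG* = dH* - T*dS*, in J/mol."""
    return dH - T * dS

# Illustrative (assumed) parameters: two different enthalpy/entropy
# combinations yielding the same dG* at T = 300 K.
print(gibbs_activation(dH=60_000.0, T=300.0, dS=-20.0))  # 66000.0 J/mol
print(gibbs_activation(dH=75_000.0, T=300.0, dS=30.0))   # 66000.0 J/mol
```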
  • What Is Quantum Thermodynamics?
    What is Quantum Thermodynamics? Gian Paolo Beretta, Università di Brescia, via Branze 38, 25123 Brescia, Italy. What is the physical significance of entropy? What is the physical origin of irreversibility? Do entropy and irreversibility exist only for complex and macroscopic systems? For everyday laboratory physics, the mathematical formalism of Statistical Mechanics (canonical and grand-canonical, Boltzmann, Bose-Einstein and Fermi-Dirac distributions) allows a successful description of the thermodynamic equilibrium properties of matter, including entropy values. However, as already recognized by Schrödinger in 1936, Statistical Mechanics is impaired by conceptual ambiguities and logical inconsistencies, both in its explanation of the meaning of entropy and in its implications on the concept of state of a system. An alternative theory has been developed by Gyftopoulos, Hatsopoulos and the present author to eliminate these stumbling conceptual blocks while maintaining the mathematical formalism of ordinary quantum theory, so successful in applications. To resolve both the problem of the meaning of entropy and that of the origin of irreversibility, we have built entropy and irreversibility into the laws of microscopic physics. The result is a theory that has all the necessary features to combine Mechanics and Thermodynamics, uniting all the successful results of both theories, eliminating the logical inconsistencies of Statistical Mechanics and the paradoxes on irreversibility, and providing an entirely new perspective on the microscopic origin of irreversibility, nonlinearity (therefore including chaotic behavior) and maximal-entropy-generation non-equilibrium dynamics. In this long introductory paper we discuss the background and formalism of Quantum Thermodynamics, including its nonlinear equation of motion and the main general results regarding the nonequilibrium irreversible dynamics it entails.
  • Lecture 6: Entropy
    Matthew Schwartz, Statistical Mechanics, Spring 2019. Lecture 6: Entropy.

    1 Introduction: In this lecture, we discuss many ways to think about entropy. The most important and most famous property of entropy is that it never decreases: ΔStot ≥ 0 (1). Here, ΔStot means the change in entropy of a system plus the change in entropy of the surroundings. This is the second law of thermodynamics that we met in the previous lecture. There's a great quote from Sir Arthur Eddington from 1927 summarizing the importance of the second law: "If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations, then so much the worse for Maxwell's equations. If it is found to be contradicted by observation, well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation." Another possibly relevant quote, from the introduction to the statistical mechanics book by David Goodstein: "Ludwig Boltzmann, who spent much of his life studying statistical mechanics, died in 1906, by his own hand. Paul Ehrenfest, carrying on the work, died similarly in 1933. Now it is our turn to study statistical mechanics." There are many ways to define entropy. All of them are equivalent, although it can be hard to see. In this lecture we will compare and contrast different definitions, building up intuition for how to think about entropy in different contexts. The original definition of entropy, due to Clausius, was thermodynamic.
  • Thermodynamic Processes: the Limits of Possible
    Thermodynamic Processes: The Limits of Possible. Thermodynamics puts severe restrictions on what processes involving a change of the thermodynamic state of a system (U, V, N, ...) are possible. In a quasi-static process the system moves from one equilibrium state to another via a series of other equilibrium states. All quasi-static processes fall into two main classes: reversible and irreversible processes. Processes with decreasing total entropy of a thermodynamic system and its environment are prohibited by Postulate II.

    Graphic representation of a single thermodynamic system: in the phase space of extensive coordinates, the fundamental relation S(1) = S(U(1), X(1)) of a thermodynamic system defines a hypersurface in the coordinate space, where S(1) is the entropy of system 1, U(1) is the energy of system 1, and X(1) = V(1), N1(1), ..., Nm(1) are the coordinates of system 1.

    Graphic representation of a composite thermodynamic system: the fundamental relation of a composite thermodynamic system (system 1 and system 2), S = S(1)(U(1), X(1)) + S(2)(U − U(1), X(2)), defines a hypersurface in the coordinate space of the composite system, where S is the entropy of the composite system, U is its energy, and X(1,2) = V, N1, ..., Nm are the coordinates of the subsystems.

    Irreversible and reversible processes: If we change constraints on some of the composite system coordinates, (e.g.
  • Similarity Between Quantum Mechanics and Thermodynamics: Entropy, Temperature, and Carnot Cycle
    Similarity between quantum mechanics and thermodynamics: Entropy, temperature, and Carnot cycle. Sumiyoshi Abe (1,2,3) and Shinji Okuyama (1). (1) Department of Physical Engineering, Mie University, Mie 514-8507, Japan; (2) Institut Supérieur des Matériaux et Mécaniques Avancés, 44 F. A. Bartholdi, 72000 Le Mans, France; (3) Inspire Institute Inc., Alexandria, Virginia 22303, USA.

    Abstract: Similarity between quantum mechanics and thermodynamics is discussed. It is found that if the Clausius equality is imposed on the Shannon entropy and the analogue of the quantity of heat, then the value of the Shannon entropy comes to formally coincide with that of the von Neumann entropy of the canonical density matrix, and pure-state quantum mechanics apparently transmutes into quantum thermodynamics. The corresponding quantum Carnot cycle of a simple two-state model of a particle confined in a one-dimensional infinite potential well is studied, and its efficiency is shown to be identical to the classical one. PACS number(s): 03.65.-w, 05.70.-a

    In their work [1], Bender, Brody, and Meister have developed an interesting discussion about a quantum-mechanical analogue of the Carnot engine. They have treated a two-state model of a single particle confined in a one-dimensional potential well with width L and have considered a reversible cycle. Using the "internal" energy E(L) = ⟨ψ|H|ψ⟩, they define the pressure (i.e., the force, because of the single dimensionality) as f = −dE(L)/dL, where H is the system Hamiltonian and ψ is a quantum state. Then, the analogue of the "isothermal" process is defined to be fL = const.
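Assuming the standard infinite-well spectrum E_n(L) = n^2 π^2 ħ^2 / (2 m L^2) that underlies the two-state model mentioned here, the force f = −dE/dL can be checked symbolically:

```python
import sympy as sp

n, L, hbar, m = sp.symbols("n L hbar m", positive=True)

E = n**2 * sp.pi**2 * hbar**2 / (2 * m * L**2)  # infinite-well energy level
f = -sp.diff(E, L)                              # force on the wall, f = -dE/dL
print(sp.simplify(f))              # n**2*pi**2*hbar**2/(L**3*m)
print(sp.simplify(f - 2 * E / L))  # 0, i.e., f = 2E/L for these states
```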
  • Where and How Is Entropy Generated in Solar Energy Conversion Systems?
    Where and How is Entropy Generated in Solar Energy Conversion Systems? Bolin Liao, Department of Mechanical Engineering, University of California, Santa Barbara, CA 93106, USA.

    Abstract: The hotness of the sun and the coldness of the outer space are inexhaustible thermodynamic resources for human beings. From a thermodynamic point of view, any energy conversion systems that receive energy from the sun and/or dissipate energy to the universe are heat engines with photons as the "working fluid" and can be analyzed using the concept of entropy. While entropy analysis provides a particularly convenient way to understand the efficiency limits, it is typically taught in the context of thermodynamic cycles among quasi-equilibrium states, and its generalization to solar energy conversion systems running in a continuous and non-equilibrium fashion is not straightforward. In this educational article, we present a few examples to illustrate how the concept of photon entropy, combined with the radiative transfer equation, can be used to analyze the local entropy generation processes and the efficiency limits of different solar energy conversion systems. We provide explicit calculations for the local and total entropy generation rates for simple emitters and absorbers, as well as photovoltaic cells, which can be readily reproduced by students. We further discuss the connection between the entropy generation and the device efficiency, particularly the exact spectral matching condition that is shared by infinite-junction photovoltaic cells and reversible thermoelectric materials to approach their theoretical efficiency limit. (* To whom correspondence should be addressed. Email: [email protected])

    I. INTRODUCTION: In the context of radiative energy transfer, the Sun can be approximated as a blackbody at a temperature of 6000 K [1], and the outer space is filled with the cosmic microwave background at an effective temperature of 2.7 K.
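These two temperatures set the scale for ideal heat-engine limits; as a rough illustration (a 300 K device temperature is assumed here, not taken from the article), the Carnot bounds are:

```python
def carnot_efficiency(T_hot: float, T_cold: float) -> float:
    """Upper bound on heat-engine efficiency between two reservoirs."""
    return 1.0 - T_cold / T_hot

print(carnot_efficiency(6000.0, 300.0))  # 0.95: sunlight driving a 300 K device
print(carnot_efficiency(300.0, 2.7))     # ~0.991: a 300 K device rejecting heat to deep space
```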
  • Energy Minimum Principle
    ESCI 341 – Atmospheric Thermodynamics. Lesson 12 – The Energy Minimum Principle. References: Thermodynamics and an Introduction to Thermostatistics (Callen); Physical Chemistry (Levine).

    THE ENTROPY MAXIMUM PRINCIPLE: We've seen that a reversible process achieves maximum thermodynamic efficiency. Imagine that I am adding heat to a system (dQin), converting some of it to work (dW), and discarding the remaining heat (dQout). We've already demonstrated that dQout cannot be zero (this would violate the second law). We also have shown that dQout will be a minimum for a reversible process, since a reversible process is the most efficient. This means that the net heat for the system (dQ = dQin − dQout) will be a maximum for a reversible process, dQrev ≥ dQirrev. The change in entropy of the system, dS, is defined in terms of reversible heat as dQrev/T. We can thus write the following inequality: dS = dQrev/T ≥ dQirrev/T (1). Notice that for Eq. (1), if the system is reversible and adiabatic, we get dS = 0 ≥ dQirrev/T, or dQirrev ≤ 0, which illustrates the following: if two states can be connected by a reversible adiabatic process, then any irreversible process connecting the two states must involve the removal of heat from the system. Eq. (1) can be written as dS ≥ dQ/T (2), where the equality holds if all processes in the system are reversible and the inequality holds if there are irreversible processes in the system. For an isolated system (dQ = 0) the inequality becomes dS ≥ 0 (isolated system), which is just a restatement of the second law of thermodynamics.
  • The Carnot Cycle, Reversibility and Entropy
    The Carnot Cycle, Reversibility and Entropy. David Sands, Department of Physics and Mathematics, University of Hull, Hull HU6 7RX, UK; [email protected].

    Abstract: The Carnot cycle and the attendant notions of reversibility and entropy are examined. It is shown how the modern view of these concepts still corresponds to the ideas Clausius laid down in the nineteenth century. As such, they reflect the outmoded idea, current at the time, that heat is motion. It is shown how this view of heat led Clausius to develop the entropy of a body based on the work that could be performed in a reversible process rather than the work that is actually performed in an irreversible process. In consequence, Clausius built into entropy a conflict with energy conservation, which is concerned with actual changes in energy. In this paper, reversibility and irreversibility are investigated by means of a macroscopic formulation of internal mechanisms of damping based on rate equations for the distribution of energy within a gas. It is shown that work processes involving a step change in external pressure, however small, are intrinsically irreversible. However, under idealised conditions of zero damping the gas inside a piston expands and traces out a trajectory through the space of equilibrium states. Therefore, the entropy change due to heat flow from the reservoir matches the entropy change of the equilibrium states. This trajectory can be traced out in reverse as the piston reverses direction, but if the external conditions are adjusted appropriately, the gas can be made to trace out a Carnot cycle in P-V space.