
Entropy and the Economic Process

Gabriel A. Lozada
University of Utah
2004

(A different version of this paper, including less bibliographic information and no footnotes, appears in Cutler J. Cleveland, ed., Encyclopedia of Energy, 2004, Volume 2, 471–478. Amsterdam: Elsevier.)

I. Entropy
II. The Laws of Thermodynamics
III. Economic Implications of Entropy to Production Theory
IV. Entropy and Neoclassical Economic Analysis
V. Interpretations and Misinterpretations of Entropy
VI. Conclusion

GLOSSARY

Entropy Change: The amount of heat absorbed by a material divided by the material’s absolute temperature.

Isolated System: A collection of materials which exchanges neither matter nor energy with its surroundings.

Macroeconomics: The study of the economy as an aggregated whole, generally ignoring the behavior of individual industries or consumers and instead concentrating on economy-wide magnitudes such as unemployment, inflation, interest rates, the trade deficit, and growth of gross national product.

Microeconomics: The study of the behavior of one or a few firms, industries, or groups of consumers in an economy.

Natural Resources: Exhaustible resources such as coal, renewable resources such as fish, and ecosystem capacities useful to humans, such as the ability to cleanse dirty water.

Neoclassical Economics: The school of economic thought which postulates that consumers act in a way that maximizes their well-being subject to their income and possibly other constraints, that firms act in a way that maximizes their profits subject to the technological constraints of production, and in general that economic agents act in a way that maximizes an objective function which can in principle be rigorously and completely mathematically described, subject to constraints which can also in principle be so described.

Second Law of Thermodynamics: Also known as the Entropy Law, it states that in an isolated system, entropy cannot decrease; it can either rise or remain constant.

Neoclassical Theory of Value: The theory that market prices are determined by the intersection of demand curves and supply curves, which in turn arise from the behavior of self-interested atomistic economic agents; these “curves” may be more complicated than simple functions over Euclidean space.

I. Entropy

In order to explain why a capitalist economy produces certain commodities, and at what price those commodities are exchanged, a description must be given of consumers’ desire and ability to buy the commodity, and of how firms produce that commodity. The description of how firms produce a commodity is in many ways a description of the engineering opportunities open to the firm, although describing the allocation and supervision of labor is also important, as is the firm’s access to diverse forms of financing. In describing the engineering opportunities open to the firm, economists usually have to work with a simplified version of all the potentially available technologies in order for the problem to be tractable. However, it is important for the simplifications to obey fundamental physical laws, or at least not flagrantly violate them, so economists studying production should have a basic knowledge of those laws. The writers who first emphasized the relevance of the laws of thermodynamics to production economics were Frederick Soddy (the 1921 Nobel laureate in Chemistry), Nicholas Georgescu-Roegen, and Herman E. Daly.
The Second Law of Thermodynamics, also called the Entropy Law, has been the focus of most of the work in this field. The first law of thermodynamics states that in an isolated system, energy is neither created nor destroyed. This law cannot explain why an ice cube placed on a sidewalk on a hot summer day always melts, because the first law would permit heat to flow from the ice cube as long as an equal amount of heat flowed to the air. Entropy is the concept used to explain why ice cubes on a hot sidewalk melt, and in general why heat always flows from hotter objects to colder ones and why spontaneous physical processes are spontaneous.

The definition of entropy is straightforward. Let T stand for absolute temperature, that is, temperature in degrees Kelvin, denoted K. (Kelvin temperature is 273.15 degrees higher than Celsius temperature.) Let Q stand for the flow of heat into a material (or system, which is a collection of materials). The metric unit of heat Q, as well as that of work, is the Joule, denoted “J”; the English unit is the calorie or the BTU. Then the change in the entropy “S” of the material (or system) is defined to be

   dS = dQ_rev/T,                                            (1)

where the subscript “rev” stands for a reversible process; physicists call a process “reversible” if it involves no dissipative effects such as friction, viscosity, inelasticity, electrical resistance, or magnetic hysteresis [Zemansky 1968 p. 193, p. 215; Mackowiak 1965 p. 59]. As Zemansky adds [ibid. p. 225], if a system undergoes an irreversible process between an initial equilibrium state i and a final equilibrium state f, the entropy change of the system is equal to the integral from i to f of dQ/T, taken over any reversible path from i to f. No integral is taken over the original irreversible path.

Suppose one has an isolated system containing two bodies, one hot and one cold, placed in thermal contact. When a given quantity of heat, Q_0, flows from the hotter body to the colder one, the change in entropy of the system is

   ∆S = −Q_0/T_hot + Q_0/T_cold.

Since T_hot > T_cold, ∆S > 0. If heat were to flow in the opposite direction, away from the colder body and toward the warmer body, then the Q_0 terms in the above equation would change sign and ∆S would be negative. Because it is the former and not the latter which is always observed, nineteenth-century physicists postulated that in an isolated system, entropy never decreases. This is one form of the second law of thermodynamics. (A numerical illustration of this calculation appears in Section III.A below.)

II. The Laws of Thermodynamics

The laws of thermodynamics are:

Zeroth Law: Two systems which are in thermal equilibrium with a third are in thermal equilibrium with each other [Dugdale 1996 p. 13].

First Law: During a process in which no heat is exchanged with the environment, the work done is only a function of the initial and final states of the system, not of the path. Furthermore, during any process, the heat flow Q is equal to U_f − U_i + W, where U_f and U_i are the final and initial internal energies of the system and W is the net work done by the system [op. cit. p. 20; Zemansky pp. 78–79].

Second Law, Kelvin-Planck statement: No process is possible whose sole result is the absorption of heat from a reservoir and the conversion of this heat into work [Zemansky 1968 p. 178].

Second Law, Clausius statement: No process is possible whose sole result is the transfer of heat from a cooler to a hotter body [op. cit. p. 184].

Second Law, entropy statement: In an isolated system, entropy is nondecreasing [op. cit. p. 234].
Third Law, Unattainability statement: It is impossible to reach absolute zero by any finite number of processes [op. cit. p. 498; Dugdale 1996 p. 177].

Third Law, Nernst-Simon statement: In the limit as temperature goes to zero degrees Kelvin, the entropy change of any reaction is zero [Zemansky 1968 p. 498; Rao 1985 p. 257; Dugdale 1996 pp. 160–161].

Proofs that the three statements of the second law are equivalent can be found in Zemansky’s text, as can a proof that the two statements of the third law are equivalent.

III. Economic Implications of Entropy to Production Theory

A. Heat Engines. The first application of entropy was to “heat engines”; these are devices, such as internal combustion engines and power-plant steam engines (in both conventional and nuclear power plants), which use temperature differences to do their work. (A car’s motor, for example, will not work in a room hotter than the spontaneous ignition temperature of gasoline.) The “efficiency” of a heat engine is defined to be its work output divided by its heat input. A surprising and important implication of the Second Law of Thermodynamics is that the maximum efficiency of any heat engine operating between the temperatures T_low and T_high is 1 − (T_low/T_high). This is less than 1 because T_low is strictly greater than zero due to the unattainability statement of the third law. A felicitous explanation of the maximum-efficiency equation appears in a book by Sir Oliver Lodge [Energy, 1929 pp. 39–40], who also explains what it has to do with entropy:

   If the quantity of heat energy Q is lost by a body at temperature T, and gained by a body at temperature T′, then the entropy lost is Q/T, and the [entropy] gained is Q/T′. The gain is greater than the loss, if T′ is less than T. But if some of the heat is utilised,—converted into mechanical or other forms of energy by means of some kind of engine,—so that only Q′ is imparted to the colder body (the work done corresponding to the difference Q − Q′) then it may be that Q/T = Q′/T′; and in that case the entropy of the system remains constant. This is the condition for a perfect engine conducting a reversible operation. Irreversible operations increase entropy or dissipate energy, for the lost availability cannot be recovered. The efficiency of an engine means the ratio between the portion of heat utilised and the heat taken from the source. It is (Q − Q′)/Q; and we see now that in a perfect engine this would be (T − T′)/T, because the Q’s and the T’s are then proportional.
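The entropy bookkeeping in Lodge’s passage is the same as in the two-body calculation of Section I. As a purely illustrative check, not part of the original article and using arbitrary assumed values for the heat transferred and the two temperatures, the short Python sketch below evaluates ∆S = −Q_0/T_hot + Q_0/T_cold and confirms that the observed hot-to-cold flow raises the entropy of the isolated system, while the reverse flow would lower it.

```python
# Entropy change when heat Q0 flows between two bodies in an isolated
# system (the two-body example of Section I). The temperatures and Q0
# below are arbitrary illustrative values, not data from the article.

def entropy_change(q0, t_hot, t_cold):
    """Total entropy change (J/K) when heat q0 (J) leaves the body at
    temperature t_hot (K) and enters the body at temperature t_cold (K)."""
    return -q0 / t_hot + q0 / t_cold

Q0 = 1000.0      # joules of heat transferred (assumed)
T_HOT = 373.15   # K, e.g. boiling water (assumed)
T_COLD = 273.15  # K, e.g. melting ice (assumed)

# Heat flowing from hot to cold: entropy of the isolated system rises.
print(entropy_change(Q0, T_HOT, T_COLD))   # about +0.98 J/K, so dS > 0

# The hypothetical reverse flow (cold to hot) would lower entropy,
# which the entropy statement of the Second Law rules out.
print(entropy_change(-Q0, T_HOT, T_COLD))  # about -0.98 J/K, never observed
```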
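Lodge’s claim that a perfect engine’s efficiency (Q − Q′)/Q equals (T − T′)/T, that is, 1 − (T_low/T_high), can be checked the same way. The sketch below again uses arbitrary assumed numbers; it imposes the reversible-engine condition Q/T_high = Q′/T_low and recovers the maximum-efficiency formula.

```python
# Maximum efficiency of a heat engine operating between T_high and T_low,
# and a check of Lodge's reversible-engine identity. The temperatures and
# heat input are arbitrary illustrative values.

def max_efficiency(t_low, t_high):
    """Upper bound on (work output)/(heat input) implied by the Second Law."""
    return 1.0 - t_low / t_high

T_HIGH = 600.0   # K, temperature of the heat source (assumed)
T_LOW = 300.0    # K, temperature of the heat sink (assumed)
Q_IN = 1000.0    # J of heat drawn from the source (assumed)

# For a perfect (reversible) engine, Q_in/T_high = Q_out/T_low,
# so the heat rejected to the colder body is:
q_out = Q_IN * T_LOW / T_HIGH

work = Q_IN - q_out        # work delivered, Q - Q' in Lodge's notation
efficiency = work / Q_IN   # (Q - Q')/Q

print(max_efficiency(T_LOW, T_HIGH))  # 0.5
print(efficiency)                     # 0.5, i.e. (T - T')/T = 1 - T_low/T_high
```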