Heat Capacity Calculations


Heat Capacity Calculations
Department of Chemical Engineering, University of California, Santa Barbara
ChE 110A

Heat capacities in enthalpy and entropy calculations

Enthalpy calculations

Consider adding a fixed amount of heat $Q$ to a closed system initially at temperature $T_1$, at constant pressure. We would like to know the final temperature $T_2$. Applying the first law, we find that:

$$dU^t = \delta Q + \delta W = \delta Q - P\,dV^t$$

Writing the total internal energy and volume in terms of molar quantities, $U^t = nU$ and $V^t = nV$, we can rearrange this equation:

$$\delta Q = n\,dU + nP\,dV$$

Substituting $U = H - PV$:

$$\delta Q = n\left[dH - P\,dV - V\,dP\right] + nP\,dV = n\,dH - nV\,dP$$

Since the pressure is constant, $dP = 0$, and this expression simplifies to $\delta Q = n\,dH$, which we can integrate:

$$Q = n\,\Delta H$$

By definition, $C_P \equiv \left(\partial H / \partial T\right)_P$, and so we can write:

$$Q = n\int_{T_1}^{T_2} C_P\,dT$$

If the heat capacity is constant, we find that $Q = nC_P(T_2 - T_1)$. In general, however, the heat capacity is temperature-dependent. A common temperature-dependent empirical form for the heat capacity of ideal gases and incompressible liquids is:

$$\frac{C_P}{R} = A + BT + CT^2 + DT^{-2}$$

where $A$, $B$, $C$, and $D$ are substance-dependent constants and $T$ is absolute temperature. Substituting this into the enthalpy integral and performing the integration, we arrive at:

$$Q = nR\left[A(T_2 - T_1) + \frac{B}{2}\left(T_2^2 - T_1^2\right) + \frac{C}{3}\left(T_2^3 - T_1^3\right) - D\left(T_2^{-1} - T_1^{-1}\right)\right]$$

At this point, we could plug in the known values of everything and solve for $T_2$, but we would need a fairly robust nonlinear equation solver, since there are several terms involving $T_2$. A better approach is to iterate, where we can provide a controlled initial guess. To set up the equation for iteration, we put it in the same form as the case of constant heat capacity.
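As a concrete check on the integrated expression, the enthalpy integral can be evaluated directly in a few lines of Python. This is a minimal sketch: the function name is our own, and the numeric values in the usage note are illustrative placeholders, not tabulated data for any real substance.

```python
# Evaluate Q = n * Integral from T1 to T2 of Cp dT for the empirical form
# Cp/R = A + B*T + C*T**2 + D*T**-2.

R = 8.314  # gas constant, J/(mol K)

def enthalpy_heat(n, T1, T2, A, B, C, D):
    """Heat (J) required to take n moles from T1 to T2 at constant pressure."""
    return n * R * (A * (T2 - T1)
                    + B / 2.0 * (T2**2 - T1**2)
                    + C / 3.0 * (T2**3 - T1**3)
                    - D * (1.0 / T2 - 1.0 / T1))
```

With $B = C = D = 0$ this reduces to $Q = nC_P(T_2 - T_1)$, e.g. `enthalpy_heat(1.0, 300.0, 400.0, 3.5, 0, 0, 0)` gives $1 \times R \times 3.5 \times 100$ J, which provides a quick sanity check of the implementation.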
To do this, we multiply and divide by $(T_2 - T_1)$:

$$Q = n\,\frac{R\left[A(T_2 - T_1) + \frac{B}{2}\left(T_2^2 - T_1^2\right) + \frac{C}{3}\left(T_2^3 - T_1^3\right) - D\left(T_2^{-1} - T_1^{-1}\right)\right]}{(T_2 - T_1)}\,(T_2 - T_1) = n\,\langle C_P\rangle_H\,(T_2 - T_1)$$

where we have defined the average heat capacity $\langle C_P\rangle_H$ using the expression:

$$\langle C_P\rangle_H \equiv \frac{R\left[A(T_2 - T_1) + \frac{B}{2}\left(T_2^2 - T_1^2\right) + \frac{C}{3}\left(T_2^3 - T_1^3\right) - D\left(T_2^{-1} - T_1^{-1}\right)\right]}{(T_2 - T_1)}$$

Here the subscript "H" indicates that the average heat capacity is computed with respect to enthalpy calculations. Note that $\langle C_P\rangle_H$ also depends on $T_2$, the temperature for which we are solving. Thus, we need to perform an iteration. Rearranging our main equation,

$$T_2 = T_1 + \frac{Q}{n\,\langle C_P\rangle_H}$$

To solve for $T_2$, we provide an initial guess, compute $\langle C_P\rangle_H$, and then compute a new value of $T_2$ using the equation above. We continue this process until $T_2$ converges.

For numerical reasons, it is often more convenient to express the average heat capacity using $T_1$ and the ratio of the temperatures $\tau \equiv T_2/T_1$, rather than $T_2$. Making the substitution $T_2 = \tau T_1$ in the above equation, $\langle C_P\rangle_H$ can also be written as:

$$\langle C_P\rangle_H = R\left[A + \frac{B}{2}T_1(\tau + 1) + \frac{C}{3}T_1^2\left(\tau^2 + \tau + 1\right) + \frac{D}{\tau T_1^2}\right]$$

Entropy calculations

Let's say we want to compute the entropy change in the same case as above, where the temperature of the system changes from the same $T_1$ to $T_2$ at constant pressure. Since entropy is a state function, it doesn't matter what process we conceptualize for this change, and we will pick a reversible process as the most convenient. Starting again with the first law, simplified as in the previous example:

$$\delta Q = n\,dH$$

Substituting $\delta Q_{\mathrm{rev}} = nT\,dS$:

$$dS = \frac{dH}{T} = \frac{C_P\,dT}{T}$$

where we have used the definition of $C_P$ as before.
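The fixed-point iteration described above — guess $T_2$, evaluate $\langle C_P\rangle_H$, update $T_2 = T_1 + Q/(n\langle C_P\rangle_H)$, and repeat — can be sketched in Python as follows. This is an illustrative implementation using the $\tau$-form of $\langle C_P\rangle_H$; the function names, the starting guess $T_2 = T_1$, and the tolerance are our own choices, not prescribed by the handout.

```python
R = 8.314  # gas constant, J/(mol K)

def cp_mean_H(T1, T2, A, B, C, D):
    """Average heat capacity <Cp>_H between T1 and T2 (enthalpy form)."""
    tau = T2 / T1
    return R * (A
                + B / 2.0 * T1 * (tau + 1.0)
                + C / 3.0 * T1**2 * (tau**2 + tau + 1.0)
                + D / (tau * T1**2))

def solve_T2(Q, n, T1, A, B, C, D, tol=1e-8, max_iter=100):
    """Iterate T2 = T1 + Q / (n <Cp>_H) until T2 stops changing."""
    T2 = T1  # controlled initial guess: <Cp>_H then reduces to Cp(T1)
    for _ in range(max_iter):
        T2_new = T1 + Q / (n * cp_mean_H(T1, T2, A, B, C, D))
        if abs(T2_new - T2) < tol:
            return T2_new
        T2 = T2_new
    raise RuntimeError("T2 iteration did not converge")
```

For a constant heat capacity ($B = C = D = 0$) the iteration converges on the second pass, reproducing $T_2 = T_1 + Q/(nC_P)$ exactly; for temperature-dependent coefficients, a handful of passes typically suffices because $\langle C_P\rangle_H$ varies slowly with $T_2$.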
Integrating,

$$\Delta S = \int_{T_1}^{T_2} \frac{C_P}{T}\,dT$$

If the heat capacity is constant, we arrive at:

$$\Delta S = C_P \ln\left(\frac{T_2}{T_1}\right)$$

On the other hand, if the heat capacity follows the same substance-dependent form as before, we can again substitute and integrate to get:

$$\Delta S = R\left[A\ln\left(\frac{T_2}{T_1}\right) + B(T_2 - T_1) + \frac{C}{2}\left(T_2^2 - T_1^2\right) - \frac{D}{2}\left(T_2^{-2} - T_1^{-2}\right)\right]$$

We would like to put this expression in the same form as the constant-heat-capacity case:

$$\Delta S = \frac{R\left[A\ln(T_2/T_1) + B(T_2 - T_1) + \frac{C}{2}\left(T_2^2 - T_1^2\right) - \frac{D}{2}\left(T_2^{-2} - T_1^{-2}\right)\right]}{\ln(T_2/T_1)}\,\ln\left(\frac{T_2}{T_1}\right) = \langle C_P\rangle_S\,\ln\left(\frac{T_2}{T_1}\right)$$

where we have defined a different average heat capacity $\langle C_P\rangle_S$ to be:

$$\langle C_P\rangle_S \equiv \frac{R\left[A\ln(T_2/T_1) + B(T_2 - T_1) + \frac{C}{2}\left(T_2^2 - T_1^2\right) - \frac{D}{2}\left(T_2^{-2} - T_1^{-2}\right)\right]}{\ln(T_2/T_1)} = R\left\{A + \left[BT_1 + \left(CT_1^2 + \frac{D}{\tau^2 T_1^2}\right)\frac{\tau + 1}{2}\right]\frac{\tau - 1}{\ln\tau}\right\}$$

with $\tau \equiv T_2/T_1$ as before. Note that the average heat capacity used in entropy calculations is different from that used for enthalpy calculations, as indicated by the subscripts "S" and "H", respectively. Also note that, if $T_2$ is not specified and $\Delta S$ is given, we would have to solve iteratively for $T_2$ using the rearranged expression:

$$T_2 = T_1 \exp\left(\frac{\Delta S}{\langle C_P\rangle_S}\right)$$
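The entropy-based iteration works the same way: guess $T_2$, evaluate $\langle C_P\rangle_S$, and update $T_2 = T_1\exp(\Delta S/\langle C_P\rangle_S)$. A sketch, assuming $\Delta S$ is given per mole and using the $\tau$-form of $\langle C_P\rangle_S$; the small offset in the initial guess (our choice) avoids the indeterminate $\ln\tau = 0$ at $\tau = 1$.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def cp_mean_S(T1, T2, A, B, C, D):
    """Average heat capacity <Cp>_S between T1 and T2 (entropy form)."""
    tau = T2 / T1
    return R * (A + (B * T1 + (C * T1**2 + D / (tau * T1)**2)
                     * (tau + 1.0) / 2.0) * (tau - 1.0) / math.log(tau))

def solve_T2_from_dS(dS, T1, A, B, C, D, tol=1e-8, max_iter=100):
    """Iterate T2 = T1 * exp(dS / <Cp>_S) for a given molar entropy change dS."""
    T2 = 1.001 * T1  # offset initial guess so that log(tau) != 0
    for _ in range(max_iter):
        T2_new = T1 * math.exp(dS / cp_mean_S(T1, T2, A, B, C, D))
        if abs(T2_new - T2) < tol:
            return T2_new
        T2 = T2_new
    raise RuntimeError("T2 iteration did not converge")
```

As a sanity check, for constant heat capacity ($B = C = D = 0$) the mean reduces to $RA$ for any $T_2$, and the solver inverts $\Delta S = RA\ln(T_2/T_1)$ in a single update.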