Chapter 6 Heat Capacity, Enthalpy, & Entropy


6.1 INTRODUCTION

In this lecture we examine the heat capacity as a function of temperature, compute the enthalpy, entropy, and Gibbs free energy as functions of temperature, and then begin to assess phase equilibria by constructing a phase diagram for a single-component (unary) system.

• By Eqs. (2.6) and (2.7), the heat capacities are

  $C_v = \left(\frac{\partial U}{\partial T}\right)_V$   (2.6)     $C_p = \left(\frac{\partial H}{\partial T}\right)_P$   (2.7)

  so that at constant volume $dU = C_v\,dT$ (2.6a) and at constant pressure $dH = C_p\,dT$ (2.7a).

• Integration of Eq. (2.7a) between the states $(T_1, P)$ and $(T_2, P)$ gives the difference between the molar enthalpies of the two states as

  $\Delta H = H(T_2, P) - H(T_1, P) = \int_{T_1}^{T_2} C_p\,dT$   (6.1)

6.2 THEORETICAL CALCULATION OF THE HEAT CAPACITY

• Empirical rule of Dulong and Petit (1819): $C_v \approx 3R$.
  (Classical theory: the average energy of a 1-D oscillator is $kT$, so $E = 3N_0 kT = 3RT$ per mole, giving $C_v = 3R$.)

• First calculation of $C_v$ of a solid element as a function of $T$ by the quantum theory: Einstein (1907).

• Einstein crystal: a crystal containing $n$ atoms, each of which behaves as a harmonic oscillator vibrating independently, with discrete energies

  $\varepsilon_n = \left(n + \tfrac{1}{2}\right)h\nu$   (6.2)

  The crystal is treated as a system of $3n$ linear harmonic oscillators (vibration in the x, y, and z directions), so the energy of the Einstein crystal is

  $U' = 3\sum_i n_i \varepsilon_i$   (6.3)

  where $n_i$ is the number of oscillators with energy $\varepsilon_i$.

• Substituting $\varepsilon_n = (n + \tfrac{1}{2})h\nu$ and the Boltzmann distribution (Eq. 4.13) into Eq. (6.3), and taking $x = e^{-h\nu/kT}$ so that the sums become geometric series, gives

  $U' = \frac{3nh\nu}{2} + \frac{3nh\nu}{e^{h\nu/kT} - 1}$   (6.4)

• Differentiation of this equation with respect to temperature at constant volume, and defining the Einstein characteristic temperature $\theta_E = h\nu/k$, gives (per mole, $n = N_0$)

  $C_v = 3R\left(\frac{\theta_E}{T}\right)^2 \frac{e^{\theta_E/T}}{\left(e^{\theta_E/T} - 1\right)^2}$   (6.5)

  As $T \to \infty$, $C_v \to 3R$; as $T \to 0$, $C_v \to 0$.

• Problem: although the Einstein equation adequately represents actual heat capacities at higher temperatures, the theoretical values approach zero more rapidly than do the actual values.

• This discrepancy is caused by the fact that the oscillators do not vibrate with a single frequency.
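The two limits of Eq. (6.5) are easy to confirm numerically. A minimal sketch in Python (the value of $\theta_E$ used here is an assumed, illustrative one; in practice it is fitted to experimental heat-capacity data):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def einstein_cv(T, theta_E):
    """Molar heat capacity from Eq. (6.5): Cv = 3R (thetaE/T)^2 e^(thetaE/T) / (e^(thetaE/T) - 1)^2."""
    x = theta_E / T
    return 3 * R * x**2 * math.exp(x) / (math.exp(x) - 1)**2

theta_E = 240.0  # assumed Einstein temperature, K (of the right order for many metals)

# High-T limit: Cv -> 3R (Dulong and Petit); low-T limit: Cv -> 0
print(einstein_cv(3000.0, theta_E) / (3 * R))  # close to 1
print(einstein_cv(10.0, theta_E) / (3 * R))    # close to 0
```

Note that the ratio approaches 1 from below: at any finite temperature the Einstein heat capacity is strictly less than the Dulong and Petit value.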
• In a crystal lattice treated as a set of harmonic oscillators, the energy is

  $\varepsilon_n = \left(n + \tfrac{1}{2}\right)h\nu \qquad (n = 0, 1, 2, \ldots)$

  Einstein assumed that $\nu$ is the same constant for all of the oscillators.

• Debye's assumption (1912): the range of frequencies of vibration available to the oscillators is the same as that available to the elastic vibrations in a continuous solid, up to a maximum frequency of vibration $\nu_D$.

• Integrating Einstein's expression over the range $0 \le \nu \le \nu_D$, which, with $x = h\nu/kT$, gives the heat capacity of the solid as

  $C_v = 9R\left(\frac{T}{\theta_D}\right)^3 \int_0^{\theta_D/T} \frac{x^4 e^x}{(e^x - 1)^2}\,dx$   (6.6)

• Defining $\theta_D = h\nu_D/k$: the Debye characteristic temperature; $\nu_D = k\theta_D/h$ is the Debye frequency.

• Debye's equation gives an excellent fit to the experimental data at lower temperatures.

• The value of the integral in Eq. (6.6) taken from 0 to infinity is 25.98, and thus for very low temperatures Eq. (6.6) becomes

  $C_v = 9R \times 25.98\left(\frac{T}{\theta_D}\right)^3 \approx 1944\left(\frac{T}{\theta_D}\right)^3 \text{ J/(mol·K)}$   (6.7)

  the Debye $T^3$ law for low-temperature heat capacities.

• Debye's theory takes no account of the contribution made to the heat capacity by the uptake of energy by electrons, which is proportional to the absolute temperature.

• At high temperatures, where the lattice contribution approaches the Dulong and Petit value, the molar $C_v$ should therefore vary with temperature as

  $C_v = 3R + bT$

  in which $bT$ is the electronic contribution.

6.3 THE EMPIRICAL REPRESENTATION OF HEAT CAPACITIES

• Experimentally measured heat capacities are normally fitted to an expression of the form

  $C_p = a + bT + cT^{-2}$

6.4 ENTHALPY AS A FUNCTION OF TEMPERATURE AND COMPOSITION

ⅰ) For a closed system of fixed composition, with a change in temperature from $T_1$ to $T_2$ at constant $P$:

  $\Delta H = H(T_2, P) - H(T_1, P) = \int_{T_1}^{T_2} C_p\,dT$   (6.1)

  $\Delta H$ is the area under a plot of $C_p$ versus $T$.

ⅱ) For a chemical reaction or phase change A + B = AB at constant $T$ and $P$:

  $\Delta H(T, P) = H_{AB}(T, P) - H_A(T, P) - H_B(T, P)$   (6.8): Hess' law

  $\Delta H < 0$: exothermic; $\Delta H > 0$: endothermic.

• Enthalpy change: consider the change of state $A(s, T_1) \to A(s, T_2)$, where $\Delta H$ is the heat required to increase the temperature of one mole of solid A from $T_1$ to $T_2$ at constant pressure.
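Eq. (6.6) has no closed form, but the integral is easy to evaluate numerically, and the low-temperature result should reproduce the $T^3$ law of Eq. (6.7). A sketch using a pure-Python midpoint rule (the Debye temperature used is an assumed, illustrative value of the right order for lead):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def debye_cv(T, theta_D, n=2000):
    """Molar heat capacity from Eq. (6.6), with the integral evaluated by a midpoint rule."""
    upper = theta_D / T
    h = upper / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += x**4 * math.exp(x) / (math.exp(x) - 1)**2 * h
    return 9 * R * (T / theta_D)**3 * total

theta_D = 105.0  # assumed Debye temperature, K

# At very low T the integral tends to 25.98, recovering the T^3 law of Eq. (6.7):
print(debye_cv(2.0, theta_D))                  # full Eq. (6.6)
print(9 * R * 25.98 * (2.0 / theta_D)**3)      # Debye T^3 law, Eq. (6.7)
```

At high temperatures ($T \gg \theta_D$) the same function approaches the Dulong and Petit value $3R$, as it must.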
• Referring enthalpies to 298 K:

  $\Delta H_T = \Delta H_{298} + \int_{298}^{T} \Delta C_p\,dT$   (6.9)

  where convention assigns the value of zero to the enthalpies of elements in their stable states at 298 K.

  ex) For M(s) + ½O₂(g) = MO(s) at 298 K,

  $\Delta H_{298} = H_{MO,298} - H_{M,298} - \tfrac{1}{2}H_{O_2,298} = H_{MO,298}$

  since $H_{M,298} = H_{O_2,298} = 0$ by convention.

• Fig. 6.7: for the oxidation Pb(s) + ½O₂(g) = PbO(s), with the enthalpy of ½ mole of O₂ gas and 1 mole of Pb(s) at 298 K equal to zero by convention:

  – ab: 298 ≤ T ≤ 600 K, where $H_{Pb(s)} = \int_{298}^{T} C_{p,Pb(s)}\,dT$
  – ac: 298 ≤ T ≤ 3000 K, where $H_{\frac{1}{2}O_2(g)} = \tfrac{1}{2}\int_{298}^{T} C_{p,O_2(g)}\,dT$
  – $\Delta H_{PbO(s),298\,K} = -219{,}000$ J
  – de: 298 ≤ T ≤ 1159 K, where $H_{PbO(s),T} = -219{,}000 + \int_{298}^{T} C_{p,PbO(s)}\,dT$ J

• With the enthalpy of ½ mole of O₂(g) and 1 mole of Pb(s) at 298 K equal to zero by convention:

  – f: H of ½ mole of O₂(g) and 1 mole of Pb(s) at T
  – g: H of 1 mole of PbO(s) at T

  Thus $\Delta H_T = H_g - H_f = \Delta H_{298} + \int_{298}^{T} \Delta C_p\,dT$, where $\Delta C_p = C_{p,PbO} - C_{p,Pb} - \tfrac{1}{2}C_{p,O_2}$.

• From the data in Table 6.1, $\Delta C_p$ is obtained for the range 298 to 600 K; with T = 500 K this gives $\Delta H_{500} = -217{,}800$ J.

• In Fig. 6.7a, h: the enthalpy of 1 mole of Pb(l) at the melting temperature of 600 K. From 600 to 1200 K the enthalpy of the liquid is given as

  $H_{Pb(l),T} = \int_{298}^{600} C_{p,Pb(s)}\,dT + \Delta H_{m,Pb} + \int_{600}^{T} C_{p,Pb(l)}\,dT$

• In Fig. 6.7b, ajkl: H of 1 mole of Pb and ½ mole of O₂(g), and hence $\Delta H'_T$ is calculated from the cycle

  $\Delta H'_T = \Delta H_{298} + \int_{298}^{T} C_{p,PbO}\,dT - \int_{298}^{600} C_{p,Pb(s)}\,dT - \Delta H_{m,Pb} - \int_{600}^{T} C_{p,Pb(l)}\,dT - \tfrac{1}{2}\int_{298}^{T} C_{p,O_2}\,dT$

  This gives $\Delta H'_{1000} = -216{,}700$ J at T = 1000 K.

• If the temperature of interest is higher than the melting temperatures of both the metal and its oxide, then both latent heats of melting must be considered.
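With $C_p$ fitted as $a + bT + cT^{-2}$, the integral in Eq. (6.9) can be evaluated analytically. A sketch of the calculation, using hypothetical $\Delta C_p$ coefficients for an oxidation M + ½O₂ = MO (these are illustrative numbers, not the Table 6.1 values for PbO):

```python
def delta_H(T, dH_298, da, db, dc):
    """Eq. (6.9): dH_298 plus the analytic integral of dCp = da + db*T + dc/T^2 from 298 K to T."""
    return (dH_298
            + da * (T - 298.0)
            + db / 2.0 * (T**2 - 298.0**2)
            - dc * (1.0 / T - 1.0 / 298.0))

# Hypothetical coefficients: dH_298 = -219,000 J, da = -3 J/K, db = 1e-3 J/K^2, dc = 2e5 J*K
print(delta_H(500.0, -219000.0, -3.0, 1.0e-3, 2.0e5))  # a small correction to dH_298
```

Because $\Delta C_p$ is small compared with $\Delta H_{298}$, the enthalpy of reaction varies only weakly with temperature, which is the point the Pb/PbO example makes.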
• If the system contains a low-temperature phase in equilibrium with a high-temperature phase at the equilibrium phase transition temperature, then introduction of heat to the system (the external influence) would be expected to increase the temperature of the system (the effect) by Le Chatelier's principle.

• However, the system undergoes an endothermic change, which absorbs the heat introduced at constant temperature, and hence nullifies the effect of the external influence. The endothermic process is the melting of some of the solid. A phase change from a low- to a high-temperature phase is always endothermic, and hence $\Delta H$ for the change is always a positive quantity; in particular, $\Delta H_m$ is always positive.

• The general form of Eq. (6.9) can be obtained as follows. Writing $dH = C_p\,dT$ for the products and for the reactants, subtraction gives

  $d(\Delta H) = \Delta C_p\,dT$, or $\left(\frac{\partial \Delta H}{\partial T}\right)_P = \Delta C_p$   (6.10)

  and integrating from state 1 to state 2 gives

  $\Delta H_{T_2} = \Delta H_{T_1} + \int_{T_1}^{T_2} \Delta C_p\,dT$   (6.11)

  Equations (6.10) and (6.11) are expressions of Kirchhoff's Law.

6.5 THE DEPENDENCE OF ENTROPY ON TEMPERATURE AND THE THIRD LAW OF THERMODYNAMICS

• The 3rd law of thermodynamics: the entropy of a homogeneous substance in a state of complete internal equilibrium is zero at 0 K.

• For a closed system undergoing a reversible process,

  $dS = \frac{\delta q_{rev}}{T}$   (3.8)

  At constant pressure, $\delta q_{rev} = C_p\,dT$, so as T is increased

  $dS = \frac{C_p}{T}\,dT$   (6.12)

  and the molar entropy of the system at any temperature T is given by

  $S_T = S_0 + \int_0^T \frac{C_p}{T}\,dT$   (6.13)

• T. W. Richards (1902) found experimentally that ΔS → 0 and ΔC_p → 0 as T → 0 (a clue to the 3rd law).

• Nernst (1906): $(\partial \Delta G/\partial T)_P \to 0$ as T → 0. Why? Differentiate Eq. (5.2), G = H − TS, with respect to T at constant P: from Eq.
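Eq. (6.13), with $S_0 = 0$ by the third law, can be evaluated numerically. At low temperatures, where $C_p$ follows the Debye $T^3$ law of Eq. (6.7), the integral is exact: $S(T) = C_v(T)/3$. A sketch checking this (the Debye temperature is an assumed, illustrative value):

```python
R = 8.314  # gas constant, J/(mol*K)

def cv_t3(T, theta_D):
    """Low-temperature Debye T^3 heat capacity, Eq. (6.7); 9R*25.98 is approximately 1944."""
    return 1944.0 * (T / theta_D)**3  # J/(mol*K)

def entropy(T, theta_D, n=10000):
    """Third-law entropy from Eq. (6.13): S(T) = integral of Cp/T' from 0 to T (midpoint rule)."""
    h = T / n
    return sum(cv_t3((i + 0.5) * h, theta_D) / ((i + 0.5) * h) * h for i in range(n))

theta_D = 105.0
# For a pure T^3 heat capacity the integral evaluates exactly to Cv(T)/3:
print(entropy(5.0, theta_D), cv_t3(5.0, theta_D) / 3.0)
```

The integrand $C_p/T \propto T^2$ vanishes at 0 K, which is why the integral in Eq. (6.13) converges; this is a direct consequence of $C_p \to 0$ as $T \to 0$.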
(5.12), $dG = -S\,dT + V\,dP$, so $\left(\frac{\partial G}{\partial T}\right)_P = -S$, and thus $\left(\frac{\partial \Delta G}{\partial T}\right)_P = -\Delta S \to 0$ as T → 0.

(i) $\Delta C_p = \sum_i \nu_i C_{p,i} \to 0$ means that each $C_{p,i} \to 0$, as required by Einstein and Debye (T → 0, $C_v \to 0$).

(ii) $\Delta S = \sum_i \nu_i S_i \to 0$ means that each $S_i \to 0$; thus $\Omega_{th} = \Omega_{conf} = 1$, i.e., every particle must be in its ground state at 0 K ($\Omega_{th} = 1$) and the concentration must be uniform ($\Omega_{conf} = 1$). Hence the substance must be at internal equilibrium: the Planck statement.

• If $(\partial \Delta G/\partial T)_P \to 0$ and $(\partial \Delta H/\partial T)_P \to 0$ as T → 0, then $\Delta S \to 0$ and $\Delta C_p \to 0$ as T → 0.

• Nernst's heat theorem states that "for all reactions involving substances in the condensed state, ΔS is zero at the absolute zero of temperature."

• Thus, for the general reaction A + B = AB,

  $\Delta S = S_{AB} - S_A - S_B = 0$ at 0 K

  and if $S_A$ and $S_B$ are assigned the value of zero at 0 K, then the compound AB also has zero entropy at 0 K.

• The incompleteness of Nernst's theorem was pointed out by Planck, who stated that "the entropy of any homogeneous substance, which is in complete internal equilibrium, may be taken to be zero at 0 K."

• The requirement that the substance be in complete internal equilibrium excludes:

  ① Glasses: noncrystalline, supercooled liquids whose liquid-like disordered atomic arrangement is frozen into the solid glassy state, and which are therefore metastable; $S_0 \neq 0$, depending on the degree of atomic order.

  ② Solutions: mixtures of atoms, ions, or molecules, which have an entropy of mixing. The atomic randomness of a mixture determines its degree of order, ranging from complete ordering (every A atom is coordinated only by B atoms, and vice versa) to complete randomness (50% of the neighbors of every atom are A atoms and 50% are B atoms).
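Between the two limits of ordering described above lies the ideal (completely random) solution, whose molar configurational entropy follows from the Boltzmann relation $S = k \ln \Omega_{conf}$. A sketch of the standard ideal-mixing result (quoted here, not derived in these notes):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def ideal_mixing_entropy(xa):
    """Molar configurational entropy of an ideal A-B mixture: -R*(xA ln xA + xB ln xB)."""
    xb = 1.0 - xa
    if xa in (0.0, 1.0):
        return 0.0  # pure component: Omega_conf = 1, so S_conf = 0
    return -R * (xa * math.log(xa) + xb * math.log(xb))

# Maximum randomness at the equimolar composition, where S = R ln 2:
print(ideal_mixing_entropy(0.5))  # approximately 5.76 J/(mol*K)
print(ideal_mixing_entropy(1.0))  # 0.0
```

This is why a solution cannot have zero entropy at 0 K unless it either completely orders or completely unmixes: any frozen-in compositional randomness leaves a residual $S_{conf} > 0$.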