Flory-Huggins Model for Polymer Solutions (February 5, 2001)


Chemical Engineering 160/260, Polymer Science and Engineering
Lecture 9: Flory-Huggins Model for Polymer Solutions, February 5, 2001
Reading: Sperling, Chapter 4

Objectives
- To develop the classical Flory-Huggins theory for the free energy of mixing of polymer solutions, based on a statistical approach on a regular lattice.
- To describe the criteria for phase stability and illustrate typical phase diagrams for polymer blends and solutions.

Outline
- Lattice theory for solutions of small molecules
  - Thermodynamic probability and the Boltzmann equation
  - Ideal solution
- Flory-Huggins theory of polymer solutions
  - Placement of a new polymer molecule on a partially filled lattice
  - Entropy of mixing
  - Enthalpy of mixing (for dispersive or dipole-dipole interactions)
  - Cohesive energy density and solubility parameter
  - Free energy of mixing

Lattice Theory for Solutions of Small Molecules

Assume that a solution may be formed by distributing the pure components on the sites of a regular lattice. Further assume that there are N_1 small molecules of Type 1 (e.g. solvent) and N_2 small molecules of Type 2 (e.g. solute), that molecules of the same type are indistinguishable from one another, and that the two types are identical in size and interaction energy.

Thermodynamic Probability

Place the molecules on the N = N_1 + N_2 sites of a three-dimensional lattice. The total number of different ways of arranging the molecules of Types 1 and 2 on the lattice is

    \Omega = \frac{N!}{N_1!\,N_2!}

since interchanging 1's with 1's or 2's with 2's makes no difference. \Omega is the thermodynamic probability, which counts the number of ways that a particular state can come about.

Boltzmann Equation

The thermodynamic probability (the number of ways that the system may come about) is related to the entropy of the system through a fundamental equation of statistical thermodynamics, the Boltzmann equation:

    S = k \ln \Omega

Configurational Entropy

Apply the Boltzmann equation to the mixing process:

    \Delta S = S_2 - S_1 = k \ln\frac{\Omega_2}{\Omega_1}

so \Omega_2 > \Omega_1 implies \Delta S > 0, and \Omega_2 < \Omega_1 implies \Delta S < 0.

Consider the entropy of the mixture:

    S_{mix} = k \ln\Omega = k\,(\ln N! - \ln N_1! - \ln N_2!)

Applying Stirling's approximation, \ln y! \approx y\ln y - y,

    S_{mix} = -k\left[ N_1 \ln\frac{N_1}{N} + N_2 \ln\frac{N_2}{N} \right]

Multiplying the right-hand side by N_A/N (i.e. taking the system to contain one mole of molecules) gives

    S_{mix} = -R\,( x_1 \ln x_1 + x_2 \ln x_2 )

The pure components on the lattice each have \Omega = 1, so S_1 = S_2 = 0 and

    \Delta S_m = S_{mix} - S_1 - S_2 = -R \sum_i x_i \ln x_i = S_{mix}

S_{mix} is that part of the total entropy of the mixture arising from the mixing process itself: the configurational entropy.

Ideal Solution of Small Molecules

What entropy effects can you envision other than the configurational entropy?

    \Delta S_m = -R \sum_i x_i \ln x_i

If the 1-1, 2-2, and 1-2 interactions are equal, then \Delta H_m = 0 (athermal mixing). If the solute and solvent molecules are the same size, \Delta V_m = 0. The thermodynamics of mixing is then governed by the Gibbs free energy of mixing:

    \Delta G_m = RT \sum_i x_i \ln x_i

Do you expect a polymer solution to be ideal?
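As a numerical aside (not part of the original lecture notes), the two results above can be checked against each other: direct counting with \Omega = N!/(N_1!\,N_2!) and S = k\ln\Omega approaches the Stirling-limit form -R(x_1\ln x_1 + x_2\ln x_2) as the lattice grows. A minimal Python sketch, with function names chosen here purely for illustration:

```python
import math

def s_mix_counting(N1, N2):
    """Configurational entropy per lattice site, in units of k: ln(Omega)/N with Omega = N!/(N1! N2!)."""
    ln_omega = math.lgamma(N1 + N2 + 1) - math.lgamma(N1 + 1) - math.lgamma(N2 + 1)
    return ln_omega / (N1 + N2)

def s_mix_ideal(x1):
    """Stirling-limit result per mole of molecules, in units of R: -(x1 ln x1 + x2 ln x2)."""
    x2 = 1.0 - x1
    return -(x1 * math.log(x1) + x2 * math.log(x2))

# Equimolar mixture: the exact count converges to ln 2 ~ 0.693 as N grows.
for N in (10, 100, 10_000):
    N1 = N // 2
    print(N, round(s_mix_counting(N1, N - N1), 4), round(s_mix_ideal(N1 / N), 4))
```

The agreement improves as N increases because Stirling's approximation becomes exact in that limit.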
Lattice Approach to Polymer Solutions

To place a macromolecule on a lattice, the polymer segments, which do not necessarily correspond to a single repeat unit, must be situated in a contiguous string. Small molecules of Type 1 (solvent) each occupy one site; a macromolecule of Type 2 (solute) occupies a connected string of sites.

Flory-Huggins Theory of Polymer Solutions

Assume (for now) that the polymer-solvent system shows athermal mixing. Let the system consist of N_1 solvent molecules, each occupying a single site, and N_2 polymer molecules, each occupying n lattice sites, so that

    N_1 + nN_2 = N

What assumption about the molecular weight distribution is implicit in the system chosen? (Every chain occupies the same number of sites n, i.e. the polymer is monodisperse.)

Placement of a New Polymer Molecule on a Partially Filled Lattice

Let i polymer molecules be placed initially on an empty lattice, and determine the number of ways in which the (i + 1)st polymer molecule can be placed. How can we get the (i + 1)st molecule to fit on the lattice?

Placement of Polymer Segments on a Lattice

First segment of polymer (i + 1): the number of ways to add segment 1 equals the number of remaining sites,

    N - ni

where ni is the number of sites occupied by the initial i polymer molecules.

Second segment: let Z be the coordination number of the lattice, i.e. the number of lattice sites adjacent to the first segment. The number of ways to add segment 2 is

    Z\,\frac{N - ni}{N}

where (N - ni)/N is the average fraction of vacant sites on the lattice as a whole. (When is this most valid?)

Third segment (and all subsequent segments): one site of the coordination sphere is already occupied by the preceding segment, so the number of ways to add segment 3 is

    (Z - 1)\,\frac{N - ni}{N}

Contributions to the average site vacancy from segments of molecule (i + 1) itself are ignored.

Thus the (i + 1)st polymer molecule may be placed on a lattice already containing i molecules in

    \omega_{i+1} = (N - ni)\left( Z\,\frac{N - ni}{N} \right)\left[ (Z - 1)\,\frac{N - ni}{N} \right]^{n-2} = Z (Z - 1)^{n-2}\,\frac{(N - ni)^n}{N^{\,n-1}}

ways; for the ith molecule,

    \omega_i = Z (Z - 1)^{n-2}\,\frac{\left[\,N - n(i-1)\,\right]^n}{N^{\,n-1}}

Total Number of Ways of Placing N_2 Polymer Molecules on a Lattice

    \Omega = \frac{\omega_1 \omega_2 \cdots \omega_{N_2}}{N_2!} = \frac{1}{N_2!}\prod_{i=1}^{N_2}\omega_i

Apply Boltzmann's equation:

    S_{mix} = k \ln\!\left( \frac{1}{N_2!}\prod_{i=1}^{N_2}\omega_i \right)

Substituting for \omega_i,

    \Omega = \frac{Z^{N_2}(Z-1)^{N_2(n-2)}}{N_2!\,N^{\,N_2(n-1)}} \prod_{i=1}^{N_2}\left[\,N - n(i-1)\,\right]^n

Entropy of the Mixture

Writing out several terms of the product shows that

    \prod_{i=1}^{N_2}\left[\,N - n(i-1)\,\right] = n^{N_2}\,\frac{N}{n}\left(\frac{N}{n}-1\right)\cdots\left(\frac{N}{n}-N_2+1\right) = \frac{n^{N_2}\,(N/n)!}{(N/n - N_2)!}

so that

    \Omega = \frac{Z^{N_2}(Z-1)^{N_2(n-2)}}{N_2!\,N^{\,N_2(n-1)}}\left[\frac{n^{N_2}\,(N/n)!}{(N/n - N_2)!}\right]^{n}

Applying Stirling's approximation,

    \frac{S_{mix}}{k} = -N_1\ln\frac{N_1}{N} - N_2\ln\frac{nN_2}{N} + N_2\left[\,\ln Z + (n-2)\ln(Z-1) + (1-n) + \ln n\,\right]

Flory-Huggins Entropy of Mixing

Calculate the entropy of the pure solvent and the pure polymer.

Pure solvent (N_2 = 0):

    S_1 = 0

Pure polymer (N_1 = 0): the entropy of the disordered polymer when it fills the lattice is

    S_2 = kN_2\left[\,\ln Z + (n-2)\ln(Z-1) + (1-n) + \ln n\,\right]

The entropy of mixing is therefore

    \Delta S_m = S_{mix} - S_1 - S_2 = -k\left[ N_1\ln\frac{N_1}{N} + N_2\ln\frac{nN_2}{N} \right]

Multiplying and dividing the right-hand side by N_1 + N_2 and taking N_1 + N_2 = N_A (one mole of molecules) gives

    \Delta S_m = -R\left[ x_1\ln\frac{N_1}{N} + x_2\ln\frac{nN_2}{N} \right], \qquad \varphi_1 = \frac{N_1}{N}, \quad \varphi_2 = \frac{nN_2}{N}

Flory-Huggins Theory for an Athermal Solution

Entropy of mixing:

    \Delta S_m = -R\left[ x_1\ln\varphi_1 + x_2\ln\varphi_2 \right]

Enthalpy of mixing:

    \Delta H_m = 0

Gibbs free energy of mixing:

    \Delta G_m = RT\left[ x_1\ln\varphi_1 + x_2\ln\varphi_2 \right]

Concentration Conversions

    \varphi_2 = \frac{nN_2}{N_1 + nN_2}, \qquad \varphi_1 = \frac{N_1}{N_1 + nN_2}, \qquad x_2 = \frac{N_2}{N_1 + N_2}

with x_1 + x_2 = 1 and \varphi_1 + \varphi_2 = 1. Since x_2/x_1 = N_2/N_1 = (1/n)(\varphi_2/\varphi_1), the two composition measures are related by

    x_2 = \frac{\varphi_2/n}{1 - \varphi_2\left(1 - \dfrac{1}{n}\right)}, \qquad \varphi_2 = \frac{n x_2}{1 + (n-1)x_2}
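To make the athermal result concrete, the short sketch below (my addition, not from the lecture; the function names are arbitrary) evaluates \Delta S_m = -R[x_1\ln\varphi_1 + x_2\ln\varphi_2] per mole of molecules, using the \varphi_2(x_2, n) conversion above.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def phi2_from_x2(x2, n):
    """Polymer volume fraction from its mole fraction: phi2 = n*x2 / (1 + (n - 1)*x2)."""
    return n * x2 / (1.0 + (n - 1.0) * x2)

def delta_S_mix(x2, n):
    """Flory-Huggins entropy of mixing per mole of molecules:
    Delta S_m = -R * (x1*ln(phi1) + x2*ln(phi2))."""
    x1 = 1.0 - x2
    phi2 = phi2_from_x2(x2, n)
    phi1 = 1.0 - phi2
    return -R * (x1 * math.log(phi1) + x2 * math.log(phi2))

# At the same solute mole fraction, compare a small molecule (n = 1) with a 1000-segment chain.
for n in (1, 1000):
    print(n, round(delta_S_mix(0.01, n), 3), "J/(mol K)")
```

For n = 1 the expression reduces to the ideal small-molecule result, since then \varphi_i = x_i.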
Flory-Huggins Enthalpy of Mixing

Use the same lattice model as for the entropy of mixing, and consider the quasi-chemical reaction

    (1,1) + (2,2) \rightarrow 2\,(1,2)

where 1 represents a solvent molecule and 2 represents a polymer repeat unit (segment). The change in interaction energy is

    \Delta w = 2w_{12} - w_{11} - w_{22}

so that \Delta w/2 is the change in interaction energy per (1,2) pair formed. Define the system to be a filled lattice with Z nearest neighbors per site. Each polymer segment is then surrounded, on average, by Z\varphi_2 polymer segments and Z\varphi_1 solvent molecules.

Contributions to the Interaction Energy

Contributions of polymer segments: interaction of a polymer segment with its neighbors yields Z\varphi_2 w_{22} + Z\varphi_1 w_{12}. Summed over the N\varphi_2 segments, with a factor of 1/2 to remove double counting, the total contribution is

    \frac{1}{2} Z N \varphi_2\left[ (1 - \varphi_1)\,w_{22} + \varphi_1 w_{12} \right]

Contributions of solvent molecules: each solvent molecule is surrounded by Z\varphi_2 polymer segments and Z\varphi_1 solvent molecules, so its interactions yield Z\varphi_2 w_{12} + Z\varphi_1 w_{11}. Summed over the N\varphi_1 solvent molecules, again removing double counting, the total contribution is

    \frac{1}{2} Z N \varphi_1\left[ \varphi_2 w_{12} + (1 - \varphi_2)\,w_{11} \right]

Subtracting the corresponding interaction energies of the pure components gives the enthalpy of mixing:

    \Delta H_m = \frac{1}{2} Z N\left( 2\varphi_1\varphi_2 w_{12} - \varphi_1\varphi_2 w_{11} - \varphi_1\varphi_2 w_{22} \right) = \frac{1}{2} Z N \varphi_1\varphi_2\,\Delta w

Define the Flory-Huggins interaction parameter (the "chi" parameter) by

    \frac{1}{2} Z\,\Delta w = \chi RT

so that \chi = 0 for athermal mixtures, \chi > 0 for endothermic mixing, and \chi < 0 for exothermic mixing. The enthalpy of mixing becomes

    \Delta H_m = N\varphi_1\varphi_2\,\chi RT = N_1\varphi_2\,\chi RT

Flory-Huggins Free Energy of Mixing: General Case

    \Delta G_m = \Delta H_m - T\Delta S_m

with

    \Delta H_m = N\varphi_1\varphi_2\,\chi RT, \qquad \Delta S_m = -R\left[ x_1\ln\varphi_1 + x_2\ln\varphi_2 \right]

gives

    \Delta G_m = RT\left[ N\varphi_1\varphi_2\,\chi + x_1\ln\varphi_1 + x_2\ln\varphi_2 \right]

The logarithmic (entropic) terms are always negative and favor mixing; the \chi term may be positive, zero, or negative, depending on the sign of \chi.
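As a closing illustration (my own addition, not part of the lecture notes), the free energy of mixing is often written per mole of lattice sites, \Delta G_m/(N_{sites}RT) = \varphi_1\ln\varphi_1 + (\varphi_2/n)\ln\varphi_2 + \chi\varphi_1\varphi_2, which follows from the expressions above on dividing by the total number of sites. The sketch below (function name arbitrary) shows how the always-negative entropic terms compete with a positive \chi term:

```python
import math

def dG_mix_per_site(phi2, n, chi):
    """Flory-Huggins free energy of mixing per mole of lattice sites, in units of RT:
    phi1*ln(phi1) + (phi2/n)*ln(phi2) + chi*phi1*phi2."""
    phi1 = 1.0 - phi2
    return phi1 * math.log(phi1) + (phi2 / n) * math.log(phi2) + chi * phi1 * phi2

# Composition scan for a 100-segment chain at athermal, endothermic, and exothermic chi values.
n = 100
for chi in (0.0, 1.0, -0.5):
    row = [round(dG_mix_per_site(phi2, n, chi), 4) for phi2 in (0.1, 0.3, 0.5, 0.7, 0.9)]
    print(f"chi = {chi:+.1f}:", row)
```

The curvature of \Delta G_m with composition, rather than its sign alone, is what sets the phase-stability criteria mentioned in the objectives.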