Correlation Between the Combinatory Entropy of Polymer and Ideal Liquid Solutions


Correlation between the combinatory entropy of polymer and ideal liquid solutions

S. Saeki
Department of Materials Science and Engineering, Fukui University, Fukui 910, Japan
(Received 28 April 1992; revised 20 July 1992)

A semiempirical equation of combinatory entropy in binary solutions of semiflexible or rod-like polymers in solvent has been derived, based on a correlation between the combinatory entropy in polymer solution derived by Flory, Huggins, Miller and Guggenheim and an ideal simple liquid solution. The expression for the partial entropy of mixing of solvent ΔS_M,1 obtained in this work is given by:

ΔS_M,1/k = −ln φ1 − φ2(1 − r⁻¹) + αφ1φ2 + ln{[φ1 + (φ2/r)]/[φ1 + (φ2/r) + φ1φ2]} + αφ2 ln{(r⁻¹ + φ1)/[φ1 + (φ2/r) + φ1φ2]}

where φi is the volume fraction of polymer (i = 2) and solvent (i = 1), r is the number of segments per polymer, k is the Boltzmann constant and α is a parameter characterizing the flexibility of the polymer chain, such that α = 0 for flexible polymers and α = (1 − r⁻¹)/φ1 for rigid-rod polymers. The chemical potential of solvent in the polymer solution is given by:

μ1 − μ1° = RT[−ΔS_M,1/k + (χ0 + α)φ2²]

where χ0 is the polymer-solvent interaction parameter in the Flory-Huggins theory.

(Keywords: combinatory entropy; polymer solution; ideal simple liquid solution; Flory-Huggins theory; semiflexible polymer; rod polymer)

INTRODUCTION

The Flory-Huggins theory is the simplest and most fundamental theory of flexible polymer solutions. The theory uses the mean field approximation that the concentration of the solution is uniform over the entire range of the solution, for calculating the number of configurations of polymer and solvent molecules and for evaluating the enthalpic term by a van Laar model¹. The division of the free energy into entropic and enthalpic terms made the Flory-Huggins theory even more simple and useful. Flory also stressed a division of the solution into intermolecular and intramolecular factors, and stated that the thermodynamic properties of a flexible polymer solution are independent of the structure and flexibility of the macromolecules, and are dependent on only one parameter, the number of segments per polymer, r. Guggenheim², Miller³ and Huggins⁴ derived essentially the same equations for the entropy of mixing in polymer solutions, where a parameter Z, the coordination number, is included in addition to the parameter r.

Investigation of the combinatory entropy for non-flexible polymers such as semiflexible and rod-like polymers has been made by Flory⁵,⁶, where new parameters are introduced: a fraction f of the bonds bent out of the co-linear direction of the preceding segment in the semiflexible polymer, and the angle of inclination of the particle to the domain axis in the rod-like polymer. Many different approaches to the calculation of combinatory entropy and free energy in polymer solutions have been made. The effects of orientation on combinatory entropy are discussed by DiMarzio⁷, while the combinatorial entropy correction for imperfect randomness is calculated by Huggins⁸. More rigorous calculation of combinatory entropy based on the self-avoiding and mutually avoiding walk has been done by Kurata⁹ using the pseudoconfiguration method, by Bawendi and Freed¹⁰⁻¹² in the lattice spin-field theory, and by Monte Carlo simulation¹³,¹⁴.

This work discusses the correlation between the combinatory entropy in polymer solution and the ideal liquid mixture based on the lattice theories of Flory, Huggins, Guggenheim and Miller, and a simple and useful equation is derived for the entropy of mixing and chemical potential in binary solutions of semiflexible and rod-like polymers in solvents. Division of the Gibbs free energy of mixing into the entropic and enthalpic terms in polymer solution is also discussed.

FUNDAMENTAL RELATION IN THE CALCULATION OF COMBINATORY ENTROPY

It is well known that the configurational partition function of ideal, simple liquid mixtures (SM) of N1 molecules of component 1 and N2 molecules of component 2 is given by:

Ω_SM = (N1 + N2)!/(N1!N2!)   (1)
     ≈ X1^(−N1) X2^(−N2)   (2)

where Xi = Ni/(N1 + N2) is the mole fraction of i and the Stirling approximation Ni! ≈ (Ni/e)^Ni is used. It is interesting to calculate the number of configurations of the mixture under the condition that n1 molecules of 1 are fixed at n1 different positions in the lattice, which is designated as Ω_F and is given by:

Ω_F = Ω_SM X1^(n1)   (3)

where X1^(n1) is the probability that n1 different positions are occupied by n1 molecules of 1 simultaneously, or in other words that n1 molecules are fixed at n1 different positions in the lattice. It is obvious that if N1 molecules of 1 and N2 molecules of 2 are fixed at positions of (N1 + N2), then the number of configurations is one, since Ω_SM X1^(N1) X2^(N2) = 1 from equation (2). It is also interesting to calculate the number of configurations when n1 molecules of 1 are removed or disappear from the mixture, which is designated as Ω_R and is given by:

Ω_R = (N1 + N2 − n1)!/[(N1 − n1)!N2!]   (4)
    = Ω_SM [N1(N1 − 1) … (N1 − n1 + 1)]/[(N1 + N2)(N1 + N2 − 1) … (N1 + N2 − n1 + 1)]   (5)

The value of Ω_R is equal to the number of configurations in which n1 molecules of 1 are combined with the rest of the molecules (N1 − n1) of 1 to make (N1 − n1) big or long molecules.

0032-3861/93 © 1993 Butterworth-Heinemann Ltd. POLYMER, 1993, Volume 34, Number 7, p. 1444. Study of combinatory entropy in solutions: S. Saeki

CORRELATION BETWEEN THE COMBINATORY ENTROPY IN POLYMER AND IDEAL LIQUID SOLUTIONS

It is instructive to express the combinatory partition function in the Flory-Huggins theory by using the previous relation. The Flory-Huggins equation was originally expressed by¹:

Ω_F-H = (N2!)⁻¹ [Z(Z − 1)^(r−2)]^N2 [Nt^((1−r)N2)] × r^(N2 r) [(Nt/r)!/((Nt/r) − N2)!]^r   (6)

where Ni is the number of solvent molecules (i = 1) and polymer (i = 2), r is the number of segments per polymer, Z is the coordination number and Nt = N1 + rN2 is the total number of lattice sites. Equation (6) reduces to:

Ω_F-H = q2^N2 {Nt!/[N1!(rN2)!]} φ2^(N2(r−1))   (7)

where

q2 = Z(Z − 1)^(r−2) r/e^(r−1)   (8)

and

φ2 = rN2/(N1 + rN2)   (9)

In the Flory-Huggins theory, Ω_F-H in equation (7) is divided into the intramolecular factor q2 for the polymer chain and the intermolecular factors, which are expressed by the ideal mixing term Nt!/[N1!(rN2)!] for the rN2 molecules of 2, disconnected and separated from the N2 polymers with r segments per polymer, and the N1 solvent molecules, and the probability term that N2(r − 1) separated molecules are fixed at N2(r − 1) positions in the lattice. In other words, equation (7) gives the number of configurations of ideal mixing of N1 of 1 and rN2 of 2 under the restriction that (r − 1)N2 separated molecules are fixed and only N2 molecules of 2 and N1 molecules of 1 are free to move in the lattice with the (r − 1)N2 fixed, and the partial entropy of solvent (1), ΔS_M,1, defined by ΔS_M,1 = k(∂Δln Ω/∂N1)_(T,P,N2) with Δln Ω = ln Ω_SM − ln Ω_(N1=0) − ln Ω_(N2=0), is given by:

ΔS_M,1/k = −ln φ1 − φ2   (rN2 fixed)   (11)

On the other hand, ΔS_M,1 for the Flory-Huggins theory is:

ΔS_M,1/k = −ln φ1 − φ2 + φ2/r   (12)

The term φ2/r in equation (12) comes from the freedom with which the N2 separated molecules of 2 can move over the restricted lattice. A formulation of the combinatory entropy in polymer solutions for various theories has been done by the same procedure as mentioned above, which is given by:

Ω_comb = q2^N2 Ω_ideal φ2^((r−1)N2) f(N1, N2, r, Z)   (13)

where f is a function of N1, N2, r and the coordination number Z, and is dependent on the model. It is derived that Ω_rod for a mixture of N2 parallel oriented rigid-rod polymers with axis ratio r and N1 solvent is equal to Ω_rod = (N1 + N2)!/(N1!N2!) and, using equation (4), is given by:

Ω_rod = Ω_ideal {[N2r(N2r − 1) … (N2 + 1)]/[Nt(Nt − 1) … (N1 + N2 + 1)]}   (14)

where

Ω_ideal = Nt!/[N1!(rN2)!]   (15)

The Ω_rod corresponds to the number of configurations of the ideal mixture under the restriction that N2(r − 1) separated molecules of 2 are removed from the system or combined with the rest of 2 to make N2 long rod molecules. The ΔS_M,1 for the rod polymer solution is expressed using the volume fraction in equation (9) by:

ΔS_M,1/k = −ln φ1 + ln[1 − (1 − r⁻¹)φ2]   (16)
          ≈ −ln φ1 − (1 − r⁻¹)φ2   (φ2 → 0)   (17)

where equation (17) shows that ΔS_M,1 for the rod mixture approaches that of the Flory-Huggins theory in the limit of φ2 = 0. The partial entropy of mixing of polymer in the Flory-Huggins theory is given by:

ΔS_M,2/k = φ1(r − 1) − ln φ2   (F-H)   (18)

and that for the rod polymer solution by:

ΔS_M,2/k = −ln φ2 + ln[1 + φ1(r − 1)]   (Rod)   (19)
          ≈ −ln φ2 + φ1(r − 1)   (φ1(r − 1) → 0)   (20)

Therefore ΔS_M,2 for the rod polymer solution in the limit of φ1 = 0 is equal to that of the Flory-Huggins theory for a flexible polymer solution.

DERIVATION OF COMBINATORY ENTROPY IN SOLUTION OF SEMIFLEXIBLE AND ROD-LIKE POLYMERS IN SOLVENT

The configurational partition function Ω_Mix for solutions of semiflexible and rod-like polymers in solvent, which can predict cases of both flexible polymer solutions and the parallel oriented rod solution, is given by:

Ω_Mix = q1^N1 q2^N2 Ω_ideal φ2^(N2(r−1)−x) × {[(N2 + x)(N2 + x − 1) …
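As a numeric sanity check on the limiting behaviour claimed for equations (12) and (16)-(20), the short sketch below (not part of the paper; it only assumes φ1 + φ2 = 1 and the partial-entropy expressions quoted above) evaluates the solvent and polymer partial entropies for the Flory-Huggins and rigid-rod cases:

```python
import math

# Partial entropies of mixing in units of k, with phi1 + phi2 = 1 assumed.

def dS1_FH(phi2, r):
    """Solvent, Flory-Huggins, eq. (12): -ln(phi1) - phi2 + phi2/r."""
    phi1 = 1.0 - phi2
    return -math.log(phi1) - phi2 + phi2 / r

def dS1_rod(phi2, r):
    """Solvent, rigid-rod solution, eq. (16)."""
    phi1 = 1.0 - phi2
    return -math.log(phi1) + math.log(1.0 - (1.0 - 1.0 / r) * phi2)

def dS2_FH(phi1, r):
    """Polymer, Flory-Huggins, eq. (18): phi1*(r-1) - ln(phi2)."""
    phi2 = 1.0 - phi1
    return phi1 * (r - 1) - math.log(phi2)

def dS2_rod(phi1, r):
    """Polymer, rigid-rod solution, eq. (19)."""
    phi2 = 1.0 - phi1
    return -math.log(phi2) + math.log(1.0 + phi1 * (r - 1))

r = 100
for phi2 in (0.5, 0.1, 0.01, 0.001):
    print(phi2, dS1_FH(phi2, r), dS1_rod(phi2, r))
# The two columns converge as phi2 -> 0, as equations (17) and (12) require;
# at finite phi2 the rod value lies below the Flory-Huggins value.
```

The same convergence holds for the polymer partial entropies (18) and (19) in the limit φ1 → 0, per equation (20).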
Recommended publications
  • Chapter 3. Second and Third Law of Thermodynamics
Chapter 3. Second and third law of thermodynamics

Important Concepts Review: Entropy; Gibbs Free Energy
• Entropy (S): definitions
• Entropy changes in reversible and irreversible processes
• Entropy of mixing of ideal gases
• 2nd law of thermodynamics
• 3rd law of thermodynamics
• Free energy
• Maxwell relations
• Dependence of free energy on P, V, T
• Thermodynamic functions of mixtures
• Partial molar quantities and chemical potential
• Law of Corresponding States (ch 1 notes): reduced pressures, temperatures, volumes

Math
• Numerical integration by computer (trapezoidal integration, https://en.wikipedia.org/wiki/Trapezoidal_rule)
• Properties of partial differential equations
• Rules for inequalities

Major Concept Review
• Sign convention for work and heat: w done on system, q supplied to system: +; w done by system, q removed from system: −
• State variables depend on final and initial state, not path
• Reversible change occurs in a series of equilibrium states
• Adiabatic: q = 0; isothermal: ΔT = 0
• Adiabats vs. isotherms: p1V1^γ = p2V2^γ with γ = Cp/CV, and V1T1^c = V2T2^c with c = CV,m/R
• Joule-Thomson expansion (ΔH = 0); Joule-Thomson coefficient (∂T/∂P)_H; inversion temperature
• Equations of state for enthalpy, H, and internal energy, U
• Formation reactions; enthalpies of reaction; Hess's Law and other changes: Δrxn H = Σi νi Δf Hi; Δrxn H(T) = Δrxn H(Tref) + ∫(Tref→T) Δrxn Cp dT
• Calorimetry

Spontaneous and Nonspontaneous Changes
First Law: when one form of energy is converted to another, the total energy in the universe is conserved. It does not give any other restriction on a process, but many processes have a natural direction. Examples:
• a gas expands into a vacuum; not the reverse
• paper can burn; paper can't unburn
• heat never flows spontaneously from cold to hot
These changes are called nonspontaneous changes.
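The "numerical integration by computer (trapezoidal integration)" item above can be made concrete with a short sketch: an entropy change from heat-capacity data, ΔS = ∫(Cp/T)dT, done with the trapezoidal rule. The Cp(T) values here are invented placeholders, not data from the handout:

```python
# Entropy change from heat-capacity data: dS = (Cp/T) dT,
# so S(T2) - S(T1) = integral of Cp/T, done here by trapezoids.
def delta_S(temps, cps):
    """Trapezoidal integral of Cp/T over the temperature grid."""
    total = 0.0
    for (T1, c1), (T2, c2) in zip(zip(temps, cps), zip(temps[1:], cps[1:])):
        total += 0.5 * (c1 / T1 + c2 / T2) * (T2 - T1)
    return total

temps = [100.0, 150.0, 200.0, 250.0, 298.15]   # K
cps   = [22.0, 25.0, 27.5, 29.0, 29.4]         # J K^-1 mol^-1 (made-up values)
print(delta_S(temps, cps))                      # J K^-1 mol^-1
```

A finer temperature grid brings the trapezoidal result closer to the exact integral.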
  • Calculating the Configurational Entropy of a Landscape Mosaic
Landscape Ecol (2016) 31:481–489. DOI 10.1007/s10980-015-0305-2

PERSPECTIVE

Calculating the configurational entropy of a landscape mosaic

Samuel A. Cushman

Received: 15 August 2014 / Accepted: 29 October 2015 / Published online: 7 November 2015. © Springer Science+Business Media Dordrecht (outside the USA) 2015

Abstract
Background: Applications of entropy and the second law of thermodynamics in landscape ecology are rare and poorly developed. This is a fundamental limitation given the centrally important role the second law plays in all physical and biological processes. A critical first step to exploring the utility of thermodynamics in landscape ecology is to define the configurational entropy of a landscape mosaic. In this paper I attempt to link landscape ecology to the second law of thermodynamics and the entropy concept by showing how the configurational entropy of a landscape mosaic may be calculated.
Result: I begin by drawing parallels between the configuration of a categorical landscape mosaic and the mixing of ideal gases. I propose that the idea of the thermodynamic microstate can be expressed as unique configurations of a landscape mosaic, and posit that the landscape metric Total Edge length is an effective measure of configuration for purposes of calculating the number of ways the classes and proportionality can be arranged (microstates) that produce the observed amount of total edge (macrostate).

Keywords: Entropy · Landscape · Configuration · Composition · Thermodynamics
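The microstate/macrostate idea in this abstract can be illustrated with a toy example (my own sketch, not code from the paper): enumerate every arrangement of a tiny two-class 3x3 mosaic with fixed composition, bin the arrangements by Total Edge, and take S = ln W for each edge-length macrostate:

```python
from collections import Counter
from itertools import product
from math import log

def total_edge(grid, n=3):
    """Count adjacent (4-neighbour) cell pairs with different classes."""
    e = 0
    for i in range(n):
        for j in range(n):
            if j + 1 < n and grid[i * n + j] != grid[i * n + j + 1]:
                e += 1
            if i + 1 < n and grid[i * n + j] != grid[(i + 1) * n + j]:
                e += 1
    return e

# Enumerate all 3x3 two-class mosaics with fixed composition (five cells of
# class 0, four of class 1) and bin them by total edge length (the macrostate).
counts = Counter(total_edge(g) for g in product((0, 1), repeat=9) if sum(g) == 4)
for edge, W in sorted(counts.items()):
    print(edge, W, log(W))  # macrostate, microstate count, entropy ln W
```

Compact clumped patterns (low total edge) and maximally dispersed ones (high total edge) both have few microstates, so intermediate edge lengths carry the highest configurational entropy, which is the qualitative point of the paper.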
  • Chemistry C3102-2006: Polymers Section Dr. Edie Sevick, Research School of Chemistry, ANU 5.0 Thermodynamics of Polymer Solution
Chemistry C3102-2006: Polymers Section. Dr. Edie Sevick, Research School of Chemistry, ANU

5.0 Thermodynamics of Polymer Solutions

In this section, we investigate the solubility of polymers in small molecule solvents. Solubility, whether a chain goes "into solution", i.e. is dissolved in solvent, is an important property. Full solubility is advantageous in processing of polymers; but it is also important for polymers to be fully insoluble - think of plastic shoe soles on a rainy day! So far, we have briefly touched upon the thermodynamic solubility of a single chain: a "good" solvent swells a chain, or mixes with the monomers, while a "poor" solvent "de-mixes" the chain, causing it to collapse upon itself. Whether two components mix to form a homogeneous solution or not is determined by minimisation of a free energy. Here we will express free energy in terms of the canonical variables {T, P, N}, i.e., temperature, pressure, and number (of moles) of molecules. The free energy expressed in these variables is the Gibbs free energy:

G ≡ G(T, P, N)   (1)

In previous sections, we referred to the Helmholtz free energy, F, the free energy in terms of the variables {T, V, N}. Let ΔGm denote the free energy change upon homogeneous mixing. For a 2-component system, i.e. a solute-solvent system, this is simply the difference in the free energies of the solute-solvent mixture and pure quantities of solute and solvent: ΔGm ≡ G(T, P, N1, N2) − (G0(T, P, N1) + G0(T, P, N2)), where the superscript 0 denotes the pure component.
  • Lecture 6: Entropy
Matthew Schwartz, Statistical Mechanics, Spring 2019

Lecture 6: Entropy

1 Introduction

In this lecture, we discuss many ways to think about entropy. The most important and most famous property of entropy is that it never decreases:

ΔStot ≥ 0   (1)

Here, ΔStot means the change in entropy of a system plus the change in entropy of the surroundings. This is the second law of thermodynamics that we met in the previous lecture. There's a great quote from Sir Arthur Eddington from 1927 summarizing the importance of the second law:

If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations, then so much the worse for Maxwell's equations. If it is found to be contradicted by observation, well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.

Another possibly relevant quote, from the introduction to the statistical mechanics book by David Goodstein:

Ludwig Boltzmann, who spent much of his life studying statistical mechanics, died in 1906, by his own hand. Paul Ehrenfest, carrying on the work, died similarly in 1933. Now it is our turn to study statistical mechanics.

There are many ways to define entropy. All of them are equivalent, although it can be hard to see. In this lecture we will compare and contrast different definitions, building up intuition for how to think about entropy in different contexts. The original definition of entropy, due to Clausius, was thermodynamic.
  • Solutions Mole Fraction of Component a = Xa Mass Fraction of Component a = Ma Volume Fraction of Component a = Φa Typically We
Solutions

Mole fraction of component A = xA
Mass fraction of component A = mA
Volume fraction of component A = fA

Typically we make a binary blend, A + B, with mass fraction mA, and want volume fraction fA or mole fraction xA:

fA = (mA/ρA)/((mA/ρA) + (mB/ρB))
xA = (mA/MWA)/((mA/MWA) + (mB/MWB))

Three ways to get entropy and free energy of mixing:
A) Isothermal free energy expression, pressure expression
B) Isothermal volume expansion approach, volume expression
C) From statistical thermodynamics

Mix two ideal gasses, A and B: p = pA + pB, where pA is the partial pressure, pA = xA p.

For a single component, molar G = µ, and µ0 is at p0,A = 1 bar. At pressure pA for a pure component:
µA = µ0,A + RT ln(p/p0,A) = µ0,A + RT ln(p)

For a mixture of A and B with a total pressure ptot = p0,A = 1 bar and pA = xA ptot, for component A in a binary mixture:
µA(xA) = µ0,A + RT ln(xA ptot/p0,A) = µ0,A + RT ln(xA)

Notice that xA must be less than or equal to 1, so ln xA must be negative or 0. So the chemical potential has to drop in the solution for a solution to exist. Ideal gasses only have entropy, so entropy drives mixing in this case. This can be written:
xA = exp((µA(xA) − µ0,A)/RT)
which indicates that xA is the Boltzmann probability of finding A.

Mix two real gasses, A and B: µA* = µ0,A if p = 1.

Ideal Gas Mixing
For isothermal mixing, dU = CV dT = 0 and Q = W. For an ideal gas, Q = W = nRT ln(Vf/Vi). With ΔS = Q/T, ΔS = nR ln(Vf/Vi). Consider a process of expansion of a gas from VA to Vtot. The change in entropy is ΔSA = nA R ln(Vtot/VA) = −nA R ln(VA/Vtot). Consider an isochoric mixing process of ideal gasses A and B.
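A minimal sketch of the two conversion formulas above (the density and molar-mass values are illustrative guesses, roughly polystyrene and toluene, not values from these notes):

```python
# Convert a binary blend's mass fractions to volume and mole fractions,
# following fA = (mA/rhoA)/((mA/rhoA) + (mB/rhoB)) and the analogous
# mole-fraction formula. mA + mB = 1 is assumed.
def volume_fraction(mA, rhoA, rhoB):
    mB = 1.0 - mA
    return (mA / rhoA) / (mA / rhoA + mB / rhoB)

def mole_fraction(mA, MWA, MWB):
    mB = 1.0 - mA
    return (mA / MWA) / (mA / MWA + mB / MWB)

mA = 0.30  # mass fraction of A
print(volume_fraction(mA, rhoA=1.05, rhoB=0.87))   # densities in g/cm^3
print(mole_fraction(mA, MWA=104.0, MWB=92.1))      # molar masses in g/mol
```

When the two components have equal densities (or equal molar masses), the volume (or mole) fraction reduces to the mass fraction, which is a quick way to check the formulas.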
  • Gibbs Paradox of Entropy of Mixing: Experimental Facts, Its Rejection and the Theoretical Consequences
ELECTRONIC JOURNAL OF THEORETICAL CHEMISTRY, VOL. 1, 135–150 (1996)

Gibbs paradox of entropy of mixing: experimental facts, its rejection and the theoretical consequences

SHU-KUN LIN
Molecular Diversity Preservation International (MDPI), Saengergasse 25, CH-4054 Basel, Switzerland and Institute for Organic Chemistry, University of Basel, CH-4056 Basel, Switzerland ([email protected])

SUMMARY
Gibbs paradox statement of entropy of mixing has been regarded as the theoretical foundation of statistical mechanics, quantum theory and biophysics. A large number of relevant chemical and physical observations show that the Gibbs paradox statement is false. We also disprove the Gibbs paradox statement through consideration of symmetry, similarity, entropy additivity and the defined property of ideal gas. A theory with its basic principles opposing the Gibbs paradox statement emerges: entropy of mixing increases continuously with the increase in similarity of the relevant properties. Many outstanding problems, such as the validity of Pauling's resonance theory and the biophysical problem of protein folding and the related hydrophobic effect, etc. can be considered on a new theoretical basis. A new energy transduction mechanism, the deformation, is also briefly discussed.

KEY WORDS: Gibbs paradox; entropy of mixing; mixing of quantum states; symmetry; similarity; distinguishability

1. INTRODUCTION
The Gibbs paradox of mixing says that the entropy of mixing decreases discontinuously with an increase in similarity, with a zero value for mixing of the most similar or indistinguishable subsystems: the isobaric and isothermal mixing of one mole of an ideal fluid A and one mole of a different ideal fluid B has the entropy increment

(ΔS)distinguishable = 2R ln 2 = 11.53 J K⁻¹   (1)

where R is the gas constant, while

(ΔS)indistinguishable = 0   (2)

for the mixing of indistinguishable fluids [1–10].
  • Chemistry 163B Absolute Entropies and Entropy of Mixing
Chemistry 163B, Winter 2012. Handouts for Third Law and Entropy of Mixing (ideal gas, distinguishable molecules)

Chemistry 163B Absolute Entropies and Entropy of Mixing

APPENDIX A: ΔHf, ΔGf, BUT S (no Δ, no "sub f"): H°f, G°f, S°

Third Law of Thermodynamics
The entropy of any perfect crystalline substance approaches 0 as T → 0 K. Since S = k ln W, for a perfectly ordered crystalline substance W → 1 as T → 0 K, so S → 0.

To calculate absolute entropy from measurements (E&R pp. 101-103, Figs 5.8-5.10), integrate over each phase range (A, B, C, D, E denote the solid III, solid II, solid I, liquid and gas ranges) and add the entropy of each phase transition:

ΔS(A, B, C, D, E) = ∫(T1→T2) (CP/T) dT and ΔS(transition) = ΔH(transition)/T(transition)

S°298.15 = S(0 K) + ΔSA(0 → 23.66 K) + ΔS(III→II at 23.66 K) + ΔSB(23.66 → 43.76 K) + ΔS(II→I at 43.76 K) + ΔSC(43.76 → 54.39 K) + ΔS(I→liq at 54.39 K) + ΔSD(54.39 → 90.20 K) + ΔS(liq→g at 90.20 K) + ΔSE(90.20 → 298.15 K)

Full calculation of S°298 for O2(g) (Example Problem 5.9, E&R pp. 103-104 [96-97 2nd]), values in J K⁻¹ mol⁻¹:
S(0 K): 0
ΔSA (solid III, 0 → 23.66 K): 8.182
ΔS(III → II at 23.66 K): 3.964
ΔSB (solid II, 23.66 → 43.76 K): 19.61
ΔS(II → I at 43.76 K): 16.98
ΔSC (solid I, 43.76 → 54.39 K): 10.13
ΔS(I → liquid at 54.39 K): 8.181
ΔSD (liquid, 54.39 → 90.20 K): 27.06
ΔS(liquid → gas at 90.20 K): 75.59
ΔSE (gas, 90.20 → 298.15 K): 35.27
Total: 204.9 J K⁻¹ mol⁻¹

ΔS(reaction) from absolute entropies: for nA A + nB B → nC C + nD D at 298 K,
ΔS°298(reaction) = nC S°298(C) + nD S°298(D) − nA S°298(A) − nB S°298(B) = Σi νi S°298(i)
where the S° are 3rd Law entropies (e.g.
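As a quick arithmetic check, the tabulated contributions for O2 can be re-summed (the numbers are taken from the handout's table as extracted here; the labels are mine):

```python
# Re-add the entropy contributions for O2(g), values in J K^-1 mol^-1.
contributions = {
    "S(0 K)": 0.0,
    "solid III, 0-23.66 K": 8.182,
    "III -> II at 23.66 K": 3.964,
    "solid II, 23.66-43.76 K": 19.61,
    "II -> I at 43.76 K": 16.98,
    "solid I, 43.76-54.39 K": 10.13,
    "fusion at 54.39 K": 8.181,
    "liquid, 54.39-90.20 K": 27.06,
    "vaporization at 90.20 K": 75.59,
    "gas, 90.20-298.15 K": 35.27,
}
total = sum(contributions.values())
print(round(total, 2))  # ~204.97, matching the handout's quoted total of 204.9 to rounding
```

The two largest contributions are vaporization and the low-temperature solid heating, which is typical: phase transitions with large ΔH/T dominate absolute entropies.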
  • SOLUTIONS of EXERCISES 301 Central Role of Gibbs Energy 301:1
SOLUTIONS OF EXERCISES

301 central role of Gibbs energy

301:1 characteristic. Numerical values in SI units: m.p. = 2131.3; ΔS = 7.95; ΔH = 16943; ΔCP = −11.47.

301:2 a coexistence problem. 1) f = M[T,P] − N[2 conditions] = 0; this excludes the first option; 2) by eliminating ln P from the two equilibrium equations in the ideal-gas approximation, the invariant point should follow from [60000 + 16(T/K) = 0]; this excludes the second option.

301:3 trigonometric excess Gibbs energy – spinodal and binodal. RTspin(X) = (4000 J mol⁻¹)πX(1−X)sin 2πX; limits at zero K are X = 0 (obviously) and X = 0.75; at 0.75 Tc the binodal mole fractions are about 0.11 and 0.52.

301:4 from experimental equal-G curve to phase diagram. ΔΩ = 1342 J mol⁻¹; Xsol = 0.635; Xliq = 0.435; see also van der Linde et al. 2002.

301:5 an elliptical stability field. The property ΔG*A, which invariably is negative, passes through a maximum when plotted against temperature. The maximum is reached for a temperature which is about 'in the middle of the ellipse'. The property ΔΩ is positive.

301:6 a metastable form stabilized by mixing. The stable diagram has three three-phase equilibria: αI + liquid + β at left (eutectic); liquid + β + αII at right (peritectic); αI + β + αII below the metastable eutectic αI + liquid + αII (eutectoid).

301:7 the equal-G curve and the region of demixing. For the left-hand case, where the EGC has a minimum, the stable phase diagram has a eutectic three-phase equilibrium.
  • The Lattice Model of Solutions
The lattice model of solutions

Marc R. Roussel
Department of Chemistry and Biochemistry, University of Lethbridge
February 22, 2009

1 Solubility and thermodynamics

Why do some substances dissolve in some solvents and not others? You probably have been exposed to a number of rules of thumb, but it's possible that you have never thought about this very important question in detail. We're going to apply a little thermodynamics here to help us think about the factors which are important in determining solubility (or miscibility, if we're talking about two liquids). Recall that a process is thermodynamically favored at constant temperature and pressure if

ΔG = ΔH − TΔS < 0.

A mixture always has more microstates than two separated components, so the entropy of solution, ΔsolS, is always positive. Accordingly, dissolving one substance in another is always entropically favored. Thus, the real question is whether ΔsolH is or is not too positive for ΔsolG to be negative, i.e. we will get a solution if

ΔsolH < T ΔsolS.

Of course, if ΔsolH is negative, then this inequality is automatically satisfied, and we will get a solution. If on the other hand ΔsolH > 0, then the inequality tells us that the enthalpy of solution can't be too large. What we need to do then is to figure out how large the entropy and enthalpy of solution are likely to be, or at least what factors affect their magnitudes. We're going to start by looking at solutions involving small molecules to get our feet wet. Our eventual objective is to discuss polymer solutions.
  • Part I : Preliminaries (Thermodynamics and Kinetics) Module 2: Gibbs Free Energy of a Binary Alloy
Part I: Preliminaries (Thermodynamics and Kinetics)
Module 2: Gibbs free energy of a binary alloy

3.1 Motivation

Pure lead melts at about 327.5°C and pure tin melts at 232°C. However, an alloy of lead and tin known as solder alloy (consisting of 61.9% of tin by weight) melts at 183°C. Why?

3.2 Composition dependence of free energy in a binary alloy

Consider the free energy G. In condensed systems (solids and liquids), G ≈ F: see the tutorial at the end of this module. Hence, in the rest of this module, we assume G ≈ F. Consider a binary alloy - an alloy consisting of two components. Let us consider a statistical model for the free energy of such an alloy (known as the regular solution model). Such a statistical picture, as we see later in this chapter, can explain many observed features of binary systems. Let the two components of the binary alloy be labeled as A and B. Let NA and NB be the number of A and B atoms, and let NA + NB = Nav, where Nav is the Avogadro number. Thus, the alloy composition is defined as x = NB/(NA + NB). Consider the free energy associated with the mixing of these two components. One can define the free energy of mixing ΔGmix = G − (xA GA + xB GB), where G is the overall free energy after mixing the A and B atoms, and GA and GB are the molar free energies of pure A and pure B respectively. Let us consider the calculation of these two quantities, namely, the internal energy of mixing and the entropy of mixing, as a function of composition.

3.2.1 Configurational entropy of mixing

We assume that the only contribution to entropy is the configurational one.
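The configurational (ideal) entropy of mixing that this module builds toward can be sketched numerically as follows (a generic illustration, not the module's own code; the temperature used is just the solder melting point quoted above, converted to kelvin):

```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1

def ideal_mixing(x, T):
    """Per mole of atoms: configurational entropy of mixing
    S_mix = -R[x ln x + (1-x) ln(1-x)] and the corresponding
    ideal free energy of mixing G_mix = -T * S_mix (no enthalpy term)."""
    S_mix = -R * (x * math.log(x) + (1.0 - x) * math.log(1.0 - x))
    G_mix = -T * S_mix
    return S_mix, G_mix

S, G = ideal_mixing(0.5, 456.15)  # x = 0.5 at 183 C = 456.15 K
print(S, G)  # S_mix peaks at x = 0.5; G_mix is negative, favouring mixing
```

Because S_mix is always positive for 0 < x < 1, the ideal G_mix is always negative, which is why even a small entropy gain can stabilize an alloy below the melting points of its pure components; the regular solution model then adds an enthalpy term on top of this.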
  • Translation of Ludwig Boltzmann's Paper
Entropy 2015, 17, 1971-2009; doi:10.3390/e17041971 OPEN ACCESS entropy ISSN 1099-4300 www.mdpi.com/journal/entropy

Article

Translation of Ludwig Boltzmann's Paper "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium" Sitzungberichte der Kaiserlichen Akademie der Wissenschaften. Mathematisch-Naturwissen Classe. Abt. II, LXXVI 1877, pp 373-435 (Wien. Ber. 1877, 76:373-435). Reprinted in Wiss. Abhandlungen, Vol. II, reprint 42, p. 164-223, Barth, Leipzig, 1909

Kim Sharp * and Franz Matschinsky

E. R. Johnson Foundation, Department of Biochemistry and Biophysics, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA 19104, USA; E-Mail: [email protected]

* Author to whom correspondence should be addressed; E-Mail: [email protected]; Tel.: +1-215-573-3506; Fax: +1-215-898-4217.

Academic Editor: Kevin H. Knuth

Received: 22 December 2014 / Accepted: 27 March 2015 / Published: 2 April 2015

Abstract: Translation of the seminal 1877 paper by Ludwig Boltzmann which for the first time established the probabilistic basis of entropy. Includes a scientific commentary.

Keywords: Ludwig Boltzmann; second law of thermodynamics; entropy

1. Introduction

Barely any of Boltzmann's original scientific work is available in translation. This is remarkable given his central role in the development of both equilibrium and non-equilibrium statistical mechanics, his statistical mechanical explanation of entropy, and our understanding of the Second Law of thermodynamics. What Boltzmann actually wrote on these subjects is rarely quoted directly, his methods are not fully appreciated, and key concepts have been misinterpreted.
  • Statistical Physics University of Cambridge Part II Mathematical Tripos
Preprint typeset in JHEP style - HYPER VERSION

Statistical Physics
University of Cambridge Part II Mathematical Tripos

David Tong
Department of Applied Mathematics and Theoretical Physics, Centre for Mathematical Sciences, Wilberforce Road, Cambridge, CB3 0BA, UK
http://www.damtp.cam.ac.uk/user/tong/statphys.html
[email protected]

Recommended Books and Resources

• Reif, Fundamentals of Statistical and Thermal Physics
A comprehensive and detailed account of the subject. It's solid. It's good. It isn't quirky.

• Kardar, Statistical Physics of Particles
A modern view on the subject which offers many insights. It's superbly written, if a little brief in places. A companion volume, "The Statistical Physics of Fields", covers aspects of critical phenomena. Both are available to download as lecture notes. Links are given on the course webpage.

• Landau and Lifshitz, Statistical Physics
Russian style: terse, encyclopedic, magnificent. Much of this book comes across as remarkably modern given that it was first published in 1958.

• Mandl, Statistical Physics
This is an easy going book with very clear explanations but doesn't go into as much detail as we will need for this course. If you're struggling to understand the basics, this is an excellent place to look. If you're after a detailed account of more advanced aspects, you should probably turn to one of the books above.

• Pippard, The Elements of Classical Thermodynamics
This beautiful little book walks you through the rather subtle logic of classical thermodynamics. It's very well done. If Arnold Sommerfeld had read this book, he would have understood thermodynamics the first time round.