Entropy According to Boltzmann
Kim Sharp, Department of Biochemistry and Biophysics, University of Pennsylvania, 2016
"The initial state in most cases is bound to be highly improbable and from it the system will always rapidly approach a more probable state until it finally reaches the most probable state, i.e. that of the heat equilibrium. If we apply this to the second basic theorem we will be able to identify that quantity which is usually called entropy with the probability of the particular state." (Boltzmann, 1877) (1)

Heat flows from a hot object to a cold one. Two gases placed into the same container will spontaneously mix. Stirring mixes two liquids; it does not un-mix them. Mechanical motion is irreversibly dissipated as heat due to friction. Scientists’ attempts to explain these commonplace observations spurred the development of thermodynamics and statistical mechanics and led to profound insights about the properties of matter, heat, and time’s arrow. Central to this is the thermodynamic concept of entropy. 150 years ago (in 1865), Clausius famously said (2) "The energy of the universe remains constant. The entropy of the universe tends to a maximum." This is one formulation of the 2nd Law of Thermodynamics. In 1877 Boltzmann for the first time explained what entropy is and why, according to the 2nd Law of Thermodynamics, entropy increases (3). Boltzmann also showed that there were three contributions to entropy: from the motion of atoms (heat), from the distribution of atoms in space (position) (3), and from radiation (photon entropy) (4). These were the only known forms of entropy until the discovery of black hole entropy by Hawking and Bekenstein in the 1970's. Boltzmann's own words, quoted above, are as concise and as accurate a statement on the topic as any made in the subsequent century and a half. Boltzmann's original papers, though almost unread today, are unsurpassed in clarity and depth of physical insight (see for example references (1, 5–7)).
It is likely that many inaccurate and misleading statements about entropy and the 2nd law of thermodynamics could have been avoided by simply reading or quoting what Boltzmann actually said, rather than what his commentators or scientific opponents said he said!

"How do I love thee? Let me count the ways." – E. B. Browning

Boltzmann's formulation of the 2nd Law can be paraphrased as: "What we observe is that state or change of states that is the most probable, where most probable means: can be realized the most ways." The 2nd Law, once expressed in this way, has the same retrospective obviousness and inevitability as Darwin's theory of evolution through natural selection and survival of the fittest. Incidentally, Boltzmann was a strong admirer of Darwin's work, and he correctly characterized the struggle of living things as "not for elements nor energy... rather, it is a struggle for entropy (more accurately: negative entropy)" (8). Boltzmann's statement of the 2nd Law in terms of probability is not simply a tautology but a statement of great explanatory power. This is apparent once the exact meaning of the terms observe, state, most probable, and ways is established.

Atoms, Motion and Probability

The fundamental definition of entropy and the explanation of the 2nd Law must both be grounded upon probability, Boltzmann realized, because of two simple considerations:
i) Atoms are in constant motion, interacting with other atoms, exchanging energy, changing positions.
ii) We can neither observe, nor calculate, the positions and velocities of every atom.
The macroscopic phenomena we seek to explain using the 2nd Law reflect the aggregate behavior of large numbers of atoms that can potentially adopt a combinatorially larger number of configurations and distributions of energies.
So we first consider the probability distribution for atomic positions in the simplest case, in order to define our terms and appreciate the consequences of averaging over the stupendously large numbers involved. Then, with just a little more work, the 'rule' for the probability distribution for the energies of atoms is established.

Distribution in space

To introduce the probability calculations, consider the distribution of identical molecules of an ideal gas between two equal volumes V, labeled L for left and R for right (Figure 1). From the postulate of equal a priori probabilities, each molecule has the same probability, ½, of being in either volume: the volumes are the same size, and each molecule is moving rapidly with a velocity determined by the temperature, undergoing elastic collisions with other molecules and the walls which change its direction, but otherwise making no interactions¹. So at any instant we might expect an equal number of molecules in each volume. Indeed, if there are two molecules, this is the most likely single 'state distribution', produced by 2 of the 4 arrangements, or complexions²: LR and RL both belong to the state distribution (NL=1, NR=1). The more extreme state distributions (NL=2, NR=0) and (NL=0, NR=2) are less likely, there being 1 complexion belonging to each: LL and RR. Still, with only two molecules these more unequal state distributions are quite probable, with p=1/4 each. Distributing 4 molecules, the possibilities are given in Table 1:

Table 1
State distribution    # of complexions    Probability
(NL=4, NR=0)          1                   (½)⁴ = 0.0625
(NL=3, NR=1)          4                   4/16 = 0.25
(NL=2, NR=2)          6                   6/16 = 0.375
(NL=1, NR=3)          4                   4/16 = 0.25
(NL=0, NR=4)          1                   (½)⁴ = 0.0625

Figure 1: two equal volumes, L and R, shown containing the state distribution (NL=2, NR=2).

The probabilities are obtained as follows: each molecule independently has a probability of ½ of being in L or R, so the probability of any particular complexion of 4 molecules is ½ × ½ × ½ × ½ = (½)⁴ = 1/16,
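The counting in Table 1 can be checked with a few lines of Python; this is just an illustrative sketch (the variable names are mine, not the author's), using the standard library's exact binomial coefficient:

```python
from math import comb

# Reproduce Table 1: N = 4 ideal-gas molecules distributed between two
# equal volumes L and R, each molecule independently in L or R with
# probability 1/2.
N = 4
for NL in range(N, -1, -1):
    NR = N - NL
    W = comb(N, NL)   # number of complexions giving the distribution (NL, NR)
    p = W / 2**N      # each individual complexion has probability (1/2)^N
    print(f"(NL={NL}, NR={NR}): {W} complexions, p = {p}")
```

Running this prints exactly the five rows of Table 1, and the probabilities sum to 1, as they must.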
and there is one such complexion, LLLL, that gives the distribution (NL=4, NR=0), four (RLLL, LRLL, etc.) that give the distribution (NL=3, NR=1), and so on. There are more ways (complexions) to obtain an equal distribution (6) than any other distribution, while the probability of the more extreme distributions (all molecules in the left or right volume) diminishes to p=1/16 each. More generally, for N molecules the total number of complexions is 2^N, and the number of these that have a given number of molecules NL in the first volume and NR = N − NL in the second is given by the binomial coefficient:

    W(NL, N) = N! / (NL! NR!)    (1)

The function W(NL, N) is peaked, with a maximum at NL = NR given by

    Wmax = W(NL=NR, N) = N! / ((N/2)!)²    (2)

As N increases the peak becomes ever sharper; in other words, an increasing fraction of the state distributions come to resemble ever more closely the most probable one, NL=NR. This is illustrated in the three panels of Figure 2 for 10 thousand, 1 million, and 100 million molecules.

Figure 2: the distribution W(NL, N) narrowing around NL=NR for N = 10 thousand, 1 million, and 100 million molecules.

With 10 thousand molecules, we find that the majority of possible state distributions lie within ±1% of the most probable one, NL=NR, although a small but significant fraction lie further away.

¹ Tolman (1936) has emphasized that the ultimate justification for the postulate of equal a priori probabilities is the empirical success of the resulting theory.
² Here LR means a 'complexion' where the first molecule is in the left hand volume, and the second one is in the right hand volume, and so on. The state distribution is defined as the number of molecules in each volume. More generally the state distribution describes the number of molecules/atoms lying within each small range of spatial positions and kinetic energies, exactly analogous to the partition function of post-quantum statistical mechanics.
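The sharpening of the peak of Eq. (1) can be verified directly for the N = 10,000 case by summing exact binomial probabilities; the sketch below is mine (the helper name `prob_within` is hypothetical, not from the original), and relies on Python's exact big-integer arithmetic:

```python
from math import comb

# Fraction of the total 2^N complexions whose state distribution has
# NL within +/- delta molecules of the most probable value N/2 (Eq. 1).
def prob_within(N, delta):
    half = N // 2
    W_sum = sum(comb(N, k) for k in range(half - delta, half + delta + 1))
    return W_sum / 2**N

# For N = 10,000 molecules, a +/-1% window around NL = N/2 is +/-50
# molecules; the exact sum confirms the majority of probability lies there.
print(f"{prob_within(10_000, 50):.4f}")
```

The same window, measured as a fraction of N/2, captures ever more of the probability as N grows, which is the narrowing peak of Figure 2 in numerical form.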
For 1 million molecules, we find that almost all the possible state distributions (99.92%) lie within ¼ of a percent of NL=NR. For 100 million molecules, most (99%) lie within two hundredths of a percent of NL=NR. In other words, for large numbers of molecules almost every state distribution is indistinguishable from the most probable one, NL=NR, the state distribution that has the most complexions, that can occur in the most ways.

Figure 3: one computer generated state distribution for each of six values of N, from 100 up to 10 million molecules.

Figure 3 shows one computer generated distribution for each of six values of N, starting with 100 and increasing by a factor of 10 each time up to 10 million molecules. Already for N=100,000 molecules, the total number of possible complexions is stupendous, greater than 1 followed by 30,000 zeroes(!), yet the single sample belongs to a state distribution that differs negligibly from NL=NR, exactly as one expects from the narrowing peak in Figure 2. For the larger values of N there are even more possible complexions, on the order of 10^(0.3N), yet the single sample belongs to a state distribution even more closely resembling the most probable one. The point of this example is twofold: 1) Even picking just one complexion, it is overwhelmingly likely that it corresponds to a state distribution with NL≈NR. In no way is it necessary to sample all the complexions (the so-called ergodic hypothesis), or even any significant fraction of them. 2) As the specific complexion changes due to thermal motion (as molecules cross between L and R), the gas is overwhelmingly likely to stay in state distributions where NL≈NR if it is already there; or, if it has a complexion where L and R are very different, it is overwhelmingly likely to move to complexions belonging to less unequal state distributions, because there are so many more of the latter than the former.
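The single-sample experiment of Figure 3 is easy to repeat: assign each molecule to L or R by a fair coin flip and inspect the resulting NL/N. The sketch below is my own reconstruction, not the author's code (the function name `sample_complexion` and the seed are arbitrary):

```python
import random

# Draw one complexion: each bit of an N-bit random integer assigns one
# molecule to L (bit = 1) or R (bit = 0), all assignments equally likely.
def sample_complexion(N, rng):
    NL = bin(rng.getrandbits(N)).count("1")
    return NL, N - NL

rng = random.Random(2016)  # fixed seed so the run is reproducible
for N in [100, 1_000, 10_000, 100_000, 1_000_000, 10_000_000]:
    NL, NR = sample_complexion(N, rng)
    print(f"N = {N:>10,}: NL/N = {NL/N:.5f}")
```

As in Figure 3, the printed ratios close in on 0.50000 as N grows: a single randomly chosen complexion almost certainly belongs to a state distribution with NL≈NR.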