
Outline of chapter

1. Boltzmann distribution
   a. Macrostate vs microstate
   b. Derivation of the Boltzmann distribution
   c. Definition of the partition function Q
   d. Example: barometric pressure
   e. Example: particle velocity distribution

2. Partition function
   a. Utility of the partition function
   b. Density of states
   c. Q for independent and dependent particles
   d. The power of Q: deriving thermodynamic quantities from first principles

3. Examples
   a. Schottky two-state model
   b. Curie's law of paramagnetism
   c. Quantum mechanical particle in a box
   d. Equipartition

======

1. Boltzmann distribution

a. Macrostate vs microstate

Key points:
- Probability = (# ways an event could happen) / (all possible ways)
  - Probability of pulling an Ace from a deck
  - "Guess what number I'm thinking, 1 to 10?"
- Microstate: a particular instance of the system (what you would see if you took a picture of the system)
- Macrostate: a collection of microstates with the same state variables (E, N, V, etc.)

[Figure: examples of macrostates and microstates]

- p1, p2, ..., p5 are the probabilities of the microstates.
  - If the energies were equal, each would be 1/5.
  - When the energies are not equal? Boltzmann.

Intuition on what we are trying to do:
- Describe the probabilities that states of different energy levels are occupied.
- What happens when T is low?
- What happens when T is high?

[Figure: occupation probabilities sketched at T = 1, T = 10, and T = 100]

What is the "entropy" of the system?
- Entropy originates in probability theory.
- It is a measure of the "flatness" of a distribution.
- Low entropy: the distribution is sharp; the system is "ordered", with few accessible states.
- High entropy: the distribution is broad; the system is "disordered", with many accessible states.

Goal: find the flattest distribution (highest entropy) that is physical, i.e. one that (see the sketch after this list)
- agrees with thermodynamics,
- has probabilities that sum to 1.
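To make the "flatness" idea concrete, here is a minimal Python sketch (an illustration added here, not part of the original notes) that computes the entropy S = -∑_j p_j ln p_j, in units of k, for a sharp and a flat five-state distribution; the uniform distribution maximizes the entropy.

```python
import numpy as np

def entropy(p):
    """Gibbs entropy S = -sum_j p_j ln p_j, in units of k."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0 * ln 0 -> 0 by convention
    return -np.sum(p * np.log(p))

sharp = [1.0, 0.0, 0.0, 0.0, 0.0]      # all probability in one state
flat  = [0.2, 0.2, 0.2, 0.2, 0.2]      # uniform over five states

print(entropy(sharp))                  # 0.0 -> "ordered", few accessible states
print(entropy(flat))                   # ln 5 ~ 1.609 -> maximum for five states
```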

b. Derivation of Boltzmann distribution

Derive the Boltzmann distribution p_j(E_j) for a system with:
- N particles in a box
- Energy levels E_j
- Constant T, V, and N (Helmholtz conditions)

Begin with the Helmholtz free energy,

F = U - TS

We must find the minimum of F subject to the constraint

∑_j p_j = 1.

Replace U and S with functions of p_j and E_j:
- We can then solve for p_j in terms of E_j to find the distribution relating energy to probability (Boltzmann).

For U,

U = ⟨E⟩ = ∑_j p_j E_j

For S,

S = -k ∑_j p_j ln p_j

Inserting into F,

F = ∑_j p_j E_j + kT ∑_j p_j ln p_j

F = ∑_j (p_j E_j + kT p_j ln p_j)

Now just minimize F subject to the constraint ∑_j p_j = 1 using a Lagrange multiplier α,

F* = F + α(∑_j p_j - 1) = ∑_j [(p_j E_j + kT p_j ln p_j) + α p_j] - α

Differentiating,

dF* = ∑_j [(E_j + kT ln p_j + kT) + α] dp_j

For dF* = 0, each j term must equal zero, since the variations dp_j are independent:

(E_j + kT ln p_j + kT) + α = 0

Solving for p_j in terms of E_j yields

p_j = e^{-1 - α/kT} e^{-E_j/kT} ∝ e^{-E_j/kT}

To remove α from the equation, plug p_j into the constraint ∑_j p_j = 1 and divide p_j by this sum (i.e., divide by 1); the constant e^{-1 - α/kT} cancels:

p_j = e^{-E_j/kT} / ∑_i e^{-E_i/kT}

providing the normalized Boltzmann distribution.

Conceptually, what we have done is find the flattest distribution that satisfies
- ⟨E⟩ = ∑_j p_j E_j = U (agreement with thermodynamics),
- ∑_j p_j = 1.

There is only one such distribution: the Boltzmann distribution.

c. Definition of Partition function Q

Q is defined as the partition function,

Q = ∑_j e^{-E_j/kT}

What does the partition function mean, conceptually?
- Loosely speaking, it is the number of "accessible" states.
- For low T, only the ground state is accessible.
- For high T, all states are accessible.

This is why it is necessary to normalize by Q in order to calculate a probability:

p_j = e^{-E_j/kT} / Q

- The numerator is (roughly) the number of states available at a particular E_j.
- The denominator Q is the total number of states accessible at that temperature.
- The probability of seeing that energy is thus the number of those states divided by all accessible states, as the sketch below illustrates.
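As a quick numerical illustration (an addition, using five hypothetical evenly spaced energy levels), the sketch below computes Q and the probabilities p_j at the three temperatures from the figures; Q grows from about 1 (only the ground state) toward the total number of states as T rises.

```python
import numpy as np

E = np.arange(5, dtype=float)          # hypothetical levels 0..4 (energy in units of k)
for T in (1.0, 10.0, 100.0):           # the T = 1, 10, 100 cases sketched above
    boltz = np.exp(-E / T)             # Boltzmann factors e^{-E_j/kT}, with k = 1
    Q = boltz.sum()                    # partition function: "accessible states"
    p = boltz / Q                      # normalized probabilities
    print(f"T = {T:5.0f}: Q = {Q:.3f}, p = {np.round(p, 3)}")
# Low T: Q -> 1, only the ground state is occupied.
# High T: Q -> 5, and the distribution is nearly flat.
```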

d. Example of barometric pressure

Use the Boltzmann distribution to write the pressure vs height in the atmosphere.

How do we think about this problem?
- Imagine a bunch of gremlins on the surface of the earth throwing balls straight up, where the velocity varies and the average is dictated by "temperature".
- The final height achieved depends on the initial kinetic energy, since kinetic energy is converted into gravitational potential energy.
- If you look at the density of balls, it will be denser at the surface and less dense high up.
- How do we calculate the density as a function of height?

The number of balls at a particular height depends on the number of balls with enough energy to reach that height:

N(0) => number of particles at height 0 in volume V

N(z) => number of particles at height z in volume V

Hence, the ratio of balls at height z to height 0 is

N(z)/N(0) = e^{-E(z)/kT} / e^{-E(0)/kT}

The energy of a particle in the atmosphere is the gravitational potential energy,

E(z) = mgz

Plugging in,

N(z)/N(0) = e^{-mgz/kT}

To find how the pressure varies, use the ideal gas law, pV = NkT:

p(z)/p(0) = N(z)/N(0) = e^{-mgz/kT}, assuming V and T constant.

- This shows that the pressure goes down exponentially with z.
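A small numerical sketch of this result (with assumed numbers, not from the notes: an N2-like molecule in an isothermal atmosphere at 300 K):

```python
import numpy as np

k = 1.380649e-23       # Boltzmann constant, J/K
g = 9.81               # gravitational acceleration, m/s^2
m = 4.65e-26           # mass of an N2 molecule, kg (assumed for illustration)
T = 300.0              # temperature, K (assumed constant with height)

def pressure_ratio(z):
    """p(z)/p(0) = exp(-m g z / kT) for an isothermal ideal-gas atmosphere."""
    return np.exp(-m * g * z / (k * T))

for z in (0.0, 1e3, 5e3, 1e4):         # heights in meters
    print(f"z = {z:7.0f} m   p/p0 = {pressure_ratio(z):.3f}")
```

At 10 km the ratio comes out near 0.33, roughly consistent with the thin air at cruising altitude.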

e. Example of particle velocity distribution

Box full of balls moving with random velocities.

The kinetic energy of each particle is given by

E = ½ m v^2 = ½ m (v_x^2 + v_y^2 + v_z^2)

To compute the Boltzmann factor, the partition function must sum over all states of the particle. Because the particle has three velocity components (v_x, v_y, v_z), we must sum over each of these when computing Q. An easy way to do this is to start by computing the Boltzmann distribution for just the x-axis velocities:

 ∫ ∫

 Where the integral is over all possible velocity states (x-velocities)  Remember, numerator is roughly number of states available at that energy.  Denominator is number of states accessible in total.  If , then distribution is flat.

This can be simplified to obtain

p(v_x) = √(m/2πkT) e^{-m v_x^2/2kT}

To find the probability distribution for the speed, similarly include all three velocity components; integrating over all directions at fixed speed contributes a factor of 4πv^2 (the spherical shell of velocity states), yielding

p(v) = 4πv^2 (m/2πkT)^{3/2} e^{-m v^2/2kT}

Armed with the probability distribution for v, we can calculate the "moments" of the distribution, ⟨v⟩ and ⟨v^2⟩. Recall that for a random variable the expected value is

⟨x⟩ = ∫ x p(x) dx

⟨v⟩ = ∫_0^∞ v p(v) dv = √(8kT/πm)

⟨v^2⟩ = ∫_0^∞ v^2 p(v) dv = 3kT/m

2. Partition function

a. Utility of the partition function

Q = ∑_j e^{-E_j/kT}

What does the partition function mean, conceptually?
- Loosely speaking, it is the number of "accessible" states.
- Low T: only the ground state is accessible.
- High T: all states are accessible.

b. Density of states

In

Q = ∑_j e^{-E_j/kT}

the sum is over all possible states. Sometimes it's convenient to instead sum over energies, E_l. For this, we must define the density of states, which is the number of states at a particular energy level,

W(E_l)

That is, the system may have multiple microstates in a particular macrostate energy level, and W(E_l) tells us how many such microstates are in each level. This allows us to change the sum so that we sum over energy levels and multiply each term by the number of states in that level,

Q = ∑_l W(E_l) e^{-E_l/kT}

For example, in the bead-chain problem from last time, Q could be calculated by summing over all states,

Q = ∑_j e^{-E_j/kT}

or by summing over energy levels using the density of states,

Q = ∑_l W(E_l) e^{-E_l/kT}

Using the density of states, you can often guess Q without having to sum a bunch of terms. For example, take the 6-bead chain problem: what is its partition function? (A sketch of the two equivalent sums follows.)
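Since the bead-chain energies are not reproduced here, the sketch below uses hypothetical degeneracies W(E_l) just to show that the two ways of summing agree:

```python
import numpy as np

kT = 1.0
levels = {0.0: 1, 1.0: 4, 2.0: 5}      # assumed {E_l: W(E_l)}, not the actual bead-chain values

# Sum over energy levels, weighting by the density of states:
Q_levels = sum(W * np.exp(-E / kT) for E, W in levels.items())

# Equivalent brute-force sum over every individual state:
states = [E for E, W in levels.items() for _ in range(W)]
Q_states = sum(np.exp(-E / kT) for E in states)

print(Q_levels, Q_states)              # identical by construction
```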

c. Q for independent and dependent particles

The partition function is a crank that allows us to go from simple things like the number of states, energy levels, etc., and calculate less obvious, but potentially more useful, things like the free energy, entropy, and so on.

Sometimes turning the crank requires summing over all possible states and, often, this is not possible, such as when you have 10^23 states.

In some instances, due to symmetries in the problem, it is possible to simplify the partition function for even a very large system so that it can be written down compactly.

For example, suppose you have a system of two independent particles, A and B, with energies ε_i^A and ε_m^B.

The energies come in quanta, indexed by i and m, such as the energy levels of an electron in the hydrogen atom.
- System A has energies ε_1^A, ε_2^A, ε_3^A, ...
- System B has energies ε_1^B, ε_2^B, ε_3^B, ...

These two particles could be two electrons in two different hydrogen atoms, one in San Francisco and the other in Beijing. These systems are completely independent from one another, but we can nevertheless write down the total energy of the "system" of two atoms,

E = ε_i^A + ε_m^B

Because the subsystems are independent, we can compute each of their partition functions separately:

q_A = ∑_i e^{-ε_i^A/kT}

q_B = ∑_m e^{-ε_m^B/kT}

What should the partition function for the combined system be?
- Remember, the partition function estimates the total number of accessible states.
- If one system has 10 states and the other system has 5 states,
- what is the combined number of states of the total system?

How do you derive this mathematically?

Q = ∑_i ∑_m e^{-(ε_i^A + ε_m^B)/kT}

- i.e., we must sum up all Boltzmann factors.

Because the systems are independent, the result of summing over ε_i^A is independent of the result of summing over ε_m^B:
- summing over ε_i^A gives a constant that moves out of the sum over ε_m^B, and vice versa.

This allows us to factor the sums,

Q = ∑_i e^{-ε_i^A/kT} ∑_m e^{-ε_m^B/kT} = q_A q_B

Intuition:
- This is like me saying: I give you a stack of cards with numbers 0-9,
- and another person a stack with numbers 0-4.
- How many number combinations can you create?
- If your selection is completely independent of the other person's, then you can select from 10 cards and the other person from 5 cards, so
  - 10 x 5 = 50 combinations.
- This would not be true if the systems were dependent:
  - e.g., every time you select 7, the other person must select 3.
  - In this case, to calculate the result of the sum for yourself, you must first know what the other person's card is, so you could not factor the sums. (See the numerical check after this list.)
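Here is a minimal numerical check of the factoring (with illustrative, assumed energy levels): the brute-force double sum over all combined states equals the product q_A q_B.

```python
import numpy as np

kT = 1.0
eA = np.array([0.0, 1.0, 2.5])         # hypothetical levels of system A
eB = np.array([0.0, 0.7])              # hypothetical levels of system B

# Brute-force double sum over all combined states (i, m):
Q_double = sum(np.exp(-(a + b) / kT) for a in eA for b in eB)

# Factored form, valid because the systems are independent:
qA = np.exp(-eA / kT).sum()
qB = np.exp(-eB / kT).sum()

print(Q_double, qA * qB)               # equal: the double sum factors
```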

What would happen if I add a third system?
- e.g., another hydrogen atom on the moon
- another person gets 7 cards?

Now the total number of combinations is

Q = q_A q_B q_C (e.g., 10 x 5 x 7 = 350 combinations)

In the special case that q_A = q_B = q_C = q,

Q = q^3

For N such systems,

Q = q^N

This is a very useful result. If you have N identical but distinguishable systems (atoms at different locations), then the total partition function can be written down very compactly, even if N is large.

What happens when the atoms are identical and indistinguishable?
- i.e., electrons in a gas that can change position?
- Electrons are indistinguishable (we can name them different things, but an instant later we may not have any idea who was who).

In this case, as far as the "system" is concerned, the state in which electron 1 has energy 27 and electron 2 has energy 56 is the same as electron 1 having 56 and electron 2 having 27.

That is, the double sum is the upper limit to the possible number of states.
- In the event that some of the states are the same when you switch the indices, you are over-counting.
- 1,2 = 2,1 -> so you should count it as 1, not 2.

For a system of 3 atoms,
- 1,2,3 = 3,1,2 = 2,3,1 = 1,3,2 = 2,1,3 = 3,2,1
- so you should count it as 1, not 6.

That is, we are over-counting by the number of permutations of these numbers.

How about 4, or 10, or 100?
- There is an easy way to calculate the number of permutations of a string of numbers:
- #permutations = N!

That is, for each set of summing indices i, j, k, ..., n, you need to normalize by N! to correct for the over-counting. Hence, the corrected partition function becomes

Q = q^N / N!

d. The power of Q: deriving thermodynamic quantities from first principles

Q turns out to be an amazingly useful quantity:
- It allows us to calculate macroscopic thermodynamic observables starting from probability.
- This should not be too surprising:
  - Q = Q(T, V, N)
  - i.e., Q tells us how the number of accessible states changes as we vary T, V, and N.
  - Intuitively, it makes sense that it should allow us to calculate things like entropy (log of the number of states), free energy, internal energy, heat capacity, etc.

Calculating U from Q

U = ⟨E⟩ = ∑_j p_j E_j

According to Boltzmann,

p_j = e^{-βE_j} / Q

where β = 1/kT.

Plugging in to U,

U = ∑_j E_j e^{-βE_j} / Q

U = (1/Q) ∑_j E_j e^{-βE_j}

To simplify, recognize that

∂Q/∂β = -∑_j E_j e^{-βE_j}

so that

∑_j E_j e^{-βE_j} = -∂Q/∂β

Substituting into the above,

U = -(1/Q)(∂Q/∂β) = -(∂ ln Q/∂β)

U = kT^2 (∂ ln Q/∂T)

Calculating S

The entropy is defined as

S = -k ∑_j p_j ln p_j

Substituting the Boltzmann distribution for p_j,

ln p_j = -E_j/kT - ln Q

S = -k ∑_j p_j [-E_j/kT - ln Q]

S = (1/T) ∑_j p_j E_j + k ln Q ∑_j p_j = U/T + k ln Q

Substituting for U,

S = kT (∂ ln Q/∂T) + k ln Q

3. Examples

a. Schottky two-state model

System:
- N particles
- 2 states
  - Energy levels 0 and ε_0

Partition function:

q = ∑_j e^{-E_j/kT} = 1 + e^{-ε_0/kT}

Q = q^N
- Why distinguishable?

What fraction is in state 1?

p_1 = 1 / (1 + e^{-ε_0/kT})

What fraction is in state 2?

p_2 = e^{-ε_0/kT} / (1 + e^{-ε_0/kT})

What can we calculate from this? (A short numerical sketch follows.)

⟨E⟩ = ∑_j E_j p_j

⟨E⟩ = ε_0 e^{-ε_0/kT} / (1 + e^{-ε_0/kT})

C_V = (∂⟨E⟩/∂T) = k (ε_0/kT)^2 e^{ε_0/kT} / (e^{ε_0/kT} + 1)^2

b. Curie's law of paramagnetism

System:
- Magnetic atom in a magnetic field B
- 2 states
  - Energy levels -μB and +μB (moment aligned or anti-aligned with the field)

What is the per-particle partition function?
- Remember, we are free to shift the zero of energy:
  - Ground state energy is zero
  - Excited state energy is 2μB

q = 1 + e^{-2μB/kT}

What is the fraction in each state?
- (# ways that energy state can happen) / (all possible energy states)

p_1 = 1 / (1 + e^{-2μB/kT}),  p_2 = e^{-2μB/kT} / (1 + e^{-2μB/kT})

What is the average magnetic moment?
- Remember,
  - ⟨x⟩ = ∑_j x_j p_j for a discrete random variable
  - ⟨x⟩ = ∫ x p(x) dx for a continuous random variable

⟨μ⟩ = ∑_j μ_j p_j, with the magnetic moment defined so that +μ points in the direction of B

⟨μ⟩ = μ (1 - e^{-2μB/kT}) / (1 + e^{-2μB/kT}) = μ tanh(μB/kT)

How would you generalize this to N atoms?

⟨M⟩ = N⟨μ⟩ = Nμ tanh(μB/kT)

For weak fields (μB << kT), tanh(x) ≈ x, so

⟨M⟩ ≈ Nμ^2 B/kT

i.e., the magnetization falls off as 1/T. This is Curie's law. (A numerical comparison follows.)

c. Quantum mechanical particle in a box

According to quantum mechanics, we cannot predict where a particle is; we can only provide probabilities for where it may be found.

To compute these probabilities, you must solve the Schrodinger equation,

Hψ = Eψ

to obtain ψ, which, when you square it, provides the probability that the particle is at the position x. That is,

p(x) = |ψ(x)|^2

H is the Hamiltonian and changes depending on the problem. For a particle moving through a potential V(x),

H = p^2/2m + V(x)

so that the Schrodinger equation becomes

[p^2/2m + V(x)] ψ = Eψ

where, in the x-basis,

p = -iħ (∂/∂x)

so that the Schrodinger equation becomes

-(ħ^2/2m)(∂^2 ψ/∂x^2) + V(x)ψ = Eψ

For x in the box, V(x) = 0, so that the equation simplifies to

-(ħ^2/2m)(∂^2 ψ/∂x^2) = Eψ

This is a simple, linear differential equation with solution

ψ(x) = A e^{ikx} + B e^{-ikx}, with k = √(2mE)/ħ

This is the general solution for a plane wave. To get the exact solution for the standing wave that applies to the box, apply the boundary conditions ψ(0) = 0 and ψ(L) = 0, yielding

ψ_n(x) = √(2/L) sin(nπx/L)

where k has been replaced using the original constants. This is a standing wave with different energy levels indexed by n,

E_n = n^2 h^2 / 8mL^2

That is, there are multiple solutions indexed by n:
- standing waves of different frequency/energy
- n = 1 -> 1 broad peak
- n = 100 -> 100 peaks/troughs

What does this mean?
- In QM you can't say where the particle is.
- You can only provide probabilities for where it may be found.
- To compute these probabilities, you solve the Schrodinger equation.
- In some sense, this is easier than solving problems in classical mechanics using Newton's laws, because you don't have to be very clever.

Now that you have the energy levels associated with translational motion (i.e., the particle "hops" from place to place according to wave peaks), you can easily calculate the translational partition function for the n-state system:

q = ∑_n e^{-E_n/kT}

q = ∑_n e^{-n^2 h^2 / 8mL^2 kT}

When the energy levels are infinitesimally close together (e.g., large T), the sum can be replaced by an integral over n, which can be evaluated analytically,

q = ∫_0^∞ e^{-n^2 h^2 / 8mL^2 kT} dn = (2πmkT/h^2)^{1/2} L

Now that you have the partition function for one particle, you could extend this to N particles easily (assuming independence) and calculate the usual things: S, P, C_V.

How do we extend this to 3D? Two options:
- First:
  - Solve the Schrodinger equation in 3D.
  - Obtain the energy levels.
  - Compute the partition function by summing/integrating Boltzmann factors.
- Second:
  - Solve the Schrodinger equation in 1D.
  - Obtain the energy levels.
  - Compute the partition function.
  - Recognize that x, y, z are independent and compute the total partition function as q = q_x q_y q_z.

Remember, q is the "number of accessible states". If the x DOFs are independent of the y DOFs, then the total number of combination states is the product of the two partition functions.

What about translations/rotations/vibrations?
- Again, independence.

Total partition function,

q = q_trans q_rot q_vib

d. Equipartition

Important and useful theorem:
- Each independent, quadratic degree of freedom gets ½kT of energy.

Where does this come from? Work it out:

⟨E⟩ = ∫ E(x) p(x) dx

⟨E⟩ = ∫ E(x) e^{-E(x)/kT} dx / ∫ e^{-E(x)/kT} dx

For the special case when

E(x) = c x^2

the integrals can be solved analytically,

⟨E⟩ = ∫ c x^2 e^{-c x^2/kT} dx / ∫ e^{-c x^2/kT} dx = ½kT

To recap (see the numerical check after this list):
- If you have a degree of freedom, x,
- and the energy is quadratic in x,
- then the average energy of that D.O.F. is ½kT.
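A numerical version of this average (an illustrative addition, in units where kT = 1): a Riemann-sum Boltzmann average of E = c x^2 gives kT/2 for any positive c.

```python
import numpy as np

kT, c = 1.0, 3.7                       # any positive "spring constant" c works

x = np.linspace(-20.0, 20.0, 200_001)  # wide grid so the Gaussian tails are negligible
E = c * x**2                           # a quadratic degree of freedom
w = np.exp(-E / kT)                    # Boltzmann weight at each x

E_avg = (E * w).sum() / w.sum()        # Riemann-sum Boltzmann average (dx cancels)
print(E_avg)                           # ~ 0.5 = kT/2, independent of c
```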

This would apply, for instance, to
- the harmonic oscillator: ⟨E⟩ = ⟨KE⟩ + ⟨PE⟩ = ½kT + ½kT = kT
- translational motion: ⟨E⟩ = ⟨E_x⟩ + ⟨E_y⟩ + ⟨E_z⟩ = (3/2)kT

There are other special cases where the average can be evaluated analytically, such as evenly spaced quantum levels,

E_n = nħω

In this case,

⟨E⟩ = ħω / (e^{ħω/kT} - 1)

- This describes quantum mechanical vibrations.

Important:
- Equipartition does not apply when T is low and the energy levels are quantized.
- The sums are no longer well approximated by integrals.
- As you lower the temperature, D.O.F.'s "freeze out". (See the illustration after this list.)
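The freeze-out is easy to see numerically (illustrative units: ħω = 1, k = 1): the quantum average vanishes when kT << ħω and approaches the equipartition value kT when kT >> ħω.

```python
import numpy as np

hw = 1.0                               # level spacing hbar*omega, with k = 1
T = np.array([0.1, 0.5, 1.0, 5.0, 50.0])

E_quantum = hw / (np.exp(hw / T) - 1)  # <E> for evenly spaced levels E_n = n*hbar*omega
E_classical = T                        # equipartition for an oscillator: kT

print(E_quantum)                       # ~ 0 for kT << hw: the vibration is "frozen out"
print(E_classical)                     # E_quantum -> kT as kT >> hw
```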