4. Classical phase space

4.1. Phase space and probability density
We consider a system of $N$ particles in a $d$-dimensional space. Canonical coordinates and momenta
\[
q = (q_1,\dots,q_{dN}), \qquad p = (p_1,\dots,p_{dN})
\]
determine exactly the microscopic state of the system. The phase space is the $2dN$-dimensional space $\{(p,q)\}$, whose every point $P = (p,q)$ corresponds to a possible state of the system.

A trajectory is the curve in phase space along which the point $P(t)$ moves as a function of time. Trajectories are determined by the classical equations of motion
\[
\frac{dq_i}{dt} = \frac{\partial H}{\partial p_i}, \qquad
\frac{dp_i}{dt} = -\frac{\partial H}{\partial q_i},
\]
where
\[
H = H(q_1,\dots,q_{dN},p_1,\dots,p_{dN},t) = H(q,p,t) = H(P,t)
\]
is the Hamiltonian function of the system.

The trajectory is stationary if $H$ does not depend on time: trajectories starting from the same initial point $P_0$ are identical. Trajectories cannot cross: if two trajectories meet at a point, they must be identical.

Let $F = F(q,p,t)$ be a property of the system. Then
\[
\frac{dF}{dt} = \frac{\partial F}{\partial t} + \{F,H\},
\]
where $\{F,G\}$ stands for the Poisson bracket
\[
\{F,G\} \equiv \sum_i \left( \frac{\partial F}{\partial q_i}\frac{\partial G}{\partial p_i}
- \frac{\partial G}{\partial q_i}\frac{\partial F}{\partial p_i} \right).
\]

The ensemble or statistical set consists, at a given moment, of all those phase space points which correspond to a given macroscopic system. Corresponding to a macro(scopic) state of the system there is thus a set of micro(scopic) states which belong to the ensemble with probability $\rho(P)\,d\Gamma$. Here $\rho(P)$ is the probability density, normalized to unity,
\[
\int d\Gamma\, \rho(P) = 1,
\]
and it gives the local density of points in $(q,p)$-space at time $t$.

The statistical average, or ensemble expectation value, of a measurable quantity $f = f(P)$ is
\[
\langle f \rangle = \int d\Gamma\, f(P)\,\rho(P).
\]

We associate with every phase space point the velocity field
\[
V = (\dot q, \dot p) = \left( \frac{\partial H}{\partial p}, -\frac{\partial H}{\partial q} \right).
\]
The probability current is then $V\rho$, and the probability weight of an element $\Gamma_0$ evolves like
\[
\frac{\partial}{\partial t} \int_{\Gamma_0} \rho\, d\Gamma = -\int_{\partial\Gamma_0} V\rho \cdot dS.
\]
(Figure: a phase space region $\Gamma_0$ with outward surface element $dS$.)

Because
\[
\int_{\partial\Gamma_0} V\rho \cdot dS = \int_{\Gamma_0} \nabla \cdot (V\rho)\, d\Gamma,
\]
we get in the limit $\Gamma_0 \to 0$ the continuity equation
\[
\frac{\partial \rho}{\partial t} + \nabla \cdot (V\rho) = 0.
\]
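As a concrete illustration of Hamilton's equations, the sketch below integrates a one-dimensional harmonic oscillator (an assumed example system with $H = p^2/2m + kq^2/2$ and $m = k = 1$, not taken from the notes) using the symplectic leapfrog scheme, and checks that the energy, whose Poisson bracket with itself vanishes, stays constant along the trajectory.

```python
# Hamiltonian of a 1D harmonic oscillator: H = p^2/(2m) + k q^2/2
# (an illustrative choice, not from the notes). Hamilton's equations:
#   dq/dt =  dH/dp = p/m,   dp/dt = -dH/dq = -k q
m, k = 1.0, 1.0

def energy(q, p):
    return p * p / (2 * m) + k * q * q / 2

def leapfrog(q, p, dt, steps):
    """Symplectic leapfrog integration of Hamilton's equations."""
    for _ in range(steps):
        p -= 0.5 * dt * k * q      # half kick: dp/dt = -dH/dq
        q += dt * p / m            # drift:     dq/dt =  dH/dp
        p -= 0.5 * dt * k * q      # half kick
    return q, p

q0, p0 = 1.0, 0.0
q1, p1 = leapfrog(q0, p0, dt=1e-3, steps=10_000)
# H is a constant of motion: the energy is (nearly) conserved
assert abs(energy(q1, p1) - energy(q0, p0)) < 1e-6
```

The leapfrog scheme is chosen because, like the exact Hamiltonian flow, it preserves phase space volume, so the energy error stays bounded instead of drifting.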
We define the volume measure of the phase space
\[
d\Gamma = \prod_{i=1}^{dN} \frac{dq_i\,dp_i}{h}
        = h^{-dN}\, dq_1 \cdots dq_{dN}\, dp_1 \cdots dp_{dN}.
\]
Here $h = 6.62608 \times 10^{-34}\,\mathrm{Js}$ is the Planck constant. (Often $d\Gamma$ carries a factor $1/N!$ in front to remove the degeneracy caused by identical particles.)

Note: $[dq\,dp] = \mathrm{Js}$, so $d\Gamma$ is dimensionless.

Note: In classical dynamics the normalization is irrelevant. In quantum mechanics it is natural to choose $h$ to be the Planck constant.

Note: $\Delta\Gamma_0 = 1$ corresponds to the smallest possible phase space volume element in which a point representing the system can be localized in accordance with the quantum mechanical uncertainty principle. The volume $\Delta\Gamma = \int d\Gamma$ is then roughly equal to the number of quantum states in the part of the space under consideration.

According to the equations of motion
\[
\dot q_i = \frac{\partial H}{\partial p_i}, \qquad
\dot p_i = -\frac{\partial H}{\partial q_i},
\]
we have
\[
\frac{\partial \dot q_i}{\partial q_i} + \frac{\partial \dot p_i}{\partial p_i} = 0,
\]
so we end up with the incompressibility (sourceless flow) condition
\[
\nabla \cdot V = \sum_i \left( \frac{\partial \dot q_i}{\partial q_i}
+ \frac{\partial \dot p_i}{\partial p_i} \right) = 0.
\]
From the continuity equation we then get
\[
0 = \frac{\partial \rho}{\partial t} + \nabla\cdot(V\rho)
  = \frac{\partial \rho}{\partial t} + \rho\,\nabla\cdot V + V\cdot\nabla\rho
  = \frac{\partial \rho}{\partial t} + V\cdot\nabla\rho.
\]
When we employ the convective time derivative
\[
\frac{d}{dt} = \frac{\partial}{\partial t} + V\cdot\nabla
  = \frac{\partial}{\partial t} + \sum_i \left( \dot q_i \frac{\partial}{\partial q_i}
  + \dot p_i \frac{\partial}{\partial p_i} \right),
\]
the continuity equation can be written in the form known as the Liouville theorem:
\[
\frac{d}{dt}\,\rho(P(t),t) = 0.
\]
Thus the points of phase space move like an incompressible fluid, and the local probability density $\rho$ remains constant during the evolution. Different points can naturally have different $\rho$.

4.2. Flow in phase space

The (constant) energy surface $\Gamma_E$ is the manifold determined by the equation
\[
H(q,p) = E.
\]
If the energy is a constant of motion, every phase point $P_i(t)$ moves on a certain energy surface $\Gamma_{E_i}$, of dimension $(2Nd - 1)$. The expectation value of the energy of the system,
\[
E = \langle H \rangle = \int d\Gamma\, H\rho,
\]
is then also a constant of motion.

The volume of the energy surface is
\[
\Sigma_E = \int d\Gamma_E = \int d\Gamma\, \delta(H(P) - E),
\]
and the volume of the phase space is
\[
\int d\Gamma = \int_{-\infty}^{\infty} dE\, \Sigma_E.
\]

Let us consider the time evolution of a surface element $\Delta\Gamma_E$ of an energy surface.

Non-ergodic flow: in the course of time the element $\Delta\Gamma_E$ traverses only a part of the whole energy surface $\Gamma_E$. Examples: periodic motion; the presence of other constants of motion.

Ergodic flow: almost all points of the surface $\Gamma_E$ come in the course of time arbitrarily close to any point in $\Delta\Gamma_E$. The flow is ergodic if, for every $f(P)$ that is "smooth enough",
\[
\bar f = \langle f \rangle_E
\]
holds. Here $\bar f$ is the time average
\[
\bar f = \lim_{T\to\infty} \frac{1}{T} \int_0^T dt\, f(P(t))
\]
and $\langle f \rangle_E$ the energy surface expectation value
\[
\langle f \rangle_E = \frac{1}{\Sigma_E} \int d\Gamma_E\, f(P).
\]

We define the microcanonical ensemble so that its density distribution is
\[
\rho_E(P) = \frac{1}{\Sigma_E}\, \delta(H(P) - E).
\]
Every point of the energy surface belongs to the microcanonical ensemble with the same probability. The microcanonical ensemble is stationary, i.e. $\partial\rho_E/\partial t = 0$, and the expectation values over it are constant in time.

A mixing flow is an ergodic flow in which the points of an energy surface element $d\Gamma_E$ disperse in the course of time all over the energy surface. If $\hat\rho_E(P,t)$ is an arbitrary non-stationary density distribution at the moment $t = t_0$, then
\[
\lim_{t\to\infty} \hat\rho_E(P,t) = \frac{1}{\Sigma_E}\,\delta(H(P) - E) = \rho_E(P)
\]
and
\[
\lim_{t\to\infty} \langle f \rangle
= \lim_{t\to\infty} \int d\Gamma\, \hat\rho_E(P,t)\, f(P)
= \int d\Gamma\, f(P)\,\rho_E(P)
= \langle f \rangle_E,
\]
i.e. a density describing an arbitrary (non-equilibrium) state evolves towards the microcanonical ensemble under a mixing flow. Combining the Liouville theorem (volume elements are conserved) with mixing: a phase space infinitesimal volume element becomes infinitely stretched and folded and is distributed over the full energy surface.

4.3. Microcanonical ensemble and entropy

If the total energy of a macroscopic system is known exactly and remains fixed, the equilibrium state can be described by a microcanonical ensemble. The corresponding probability density is
\[
\rho_E(P) = \frac{1}{\Sigma_E}\, \delta(H(P) - E).
\]
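For the one-dimensional harmonic oscillator (with $m = k = 1$, an illustrative choice not taken from the notes) the energy surface $H = E$ is a circle that the trajectory traverses uniformly, so the ergodicity condition $\bar f = \langle f \rangle_E$ can be checked directly. The sketch below compares the long-time average and the energy surface average of $f = q^2$; both should equal $E$.

```python
import math

# Energy surface of the 1D harmonic oscillator (m = k = 1, illustrative):
# q = A cos(theta), p = -A sin(theta) with A = sqrt(2E), and the flow moves
# uniformly in theta, so time averages equal energy surface averages.
E = 2.0
A = math.sqrt(2 * E)

def q_of_t(t):
    """Exact trajectory q(t) for initial condition q(0) = A, p(0) = 0."""
    return A * math.cos(t)

# Time average of f(P) = q^2 over a long trajectory
T, n = 1000.0, 200_000
time_avg = sum(q_of_t(i * T / n) ** 2 for i in range(n)) / n

# Microcanonical (energy surface) average: uniform in the angle theta
n_th = 100_000
surf_avg = sum((A * math.cos(2 * math.pi * j / n_th)) ** 2
               for j in range(n_th)) / n_th

assert abs(time_avg - E) < 0.02   # time average -> E
assert abs(surf_avg - E) < 1e-6   # surface average = E
```

Note that the single oscillator is ergodic only because its energy surface is one-dimensional; for interacting many-particle systems ergodicity is an assumption rather than a theorem.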
For convenience we allow the energy to have some "tolerance" and define
\[
\rho_{E,\Delta E}(P) = \frac{1}{Z_{E,\Delta E}}\,
\theta(E + \Delta E - H(P))\,\theta(H(P) - E).
\]
Here the normalization constant
\[
Z_{E,\Delta E} = \int d\Gamma\, \theta(E + \Delta E - H(P))\,\theta(H(P) - E)
\]
is the microcanonical state sum or partition function. $Z_{E,\Delta E}$ is the number of states contained in the energy slice $E < H < E + \Delta E$ (see the volume measure of the phase space). In the microcanonical ensemble the probability is distributed evenly over every allowed part of the phase space.

Entropy

We define the statistical entropy as a quantity (depending on $\rho$) which is a) maximized for a physical ensemble and b) extensive, in order to connect to the laws of thermodynamics. The maximization of the entropy determines the physical distribution $\rho$. The Gibbs entropy
\[
S = -k_B \int d\Gamma\, \rho(P) \ln \rho(P)
\]
has these properties, as we now show.

A) Let us show that the maximization of the Gibbs entropy gives the microcanonical ensemble when the energy of the system is restricted to $(E, E + \Delta E)$:
\[
\delta S = -k_B \int_{\Delta\Gamma_E} d\Gamma\, (\delta\rho \ln\rho + \rho\,\delta\ln\rho)
         = -k_B \int_{\Delta\Gamma_E} d\Gamma\, \delta\rho\,(\ln\rho + 1) = 0.
\]
Because $\delta 1 = \int d\Gamma\, \delta\rho = 0$, the above condition is satisfied if $\ln\rho = \text{const.}$, i.e. $\rho(P) = \text{const.}$, for $P \in \Delta\Gamma_E$. This extremum is indeed a maximum:
\[
\delta^2 S = -k_B \int_{\Delta\Gamma_E} d\Gamma\,
\frac{1}{2}\frac{\partial^2(\rho\ln\rho)}{\partial\rho^2}\,(\delta\rho)^2
= -k_B \int_{\Delta\Gamma_E} d\Gamma\, \frac{1}{2}\frac{1}{\rho}\,(\delta\rho)^2 \le 0.
\]

B) Additivity: if we consider two separate systems, the joint probability distribution is $\rho_{12}(P_1,P_2) = \rho_1(P_1)\rho_2(P_2)$ and $d\Gamma_{12} = d\Gamma_1\, d\Gamma_2$, so
\[
S_{12} = -k_B \int d\Gamma_{12}\, \rho_{12} \ln \rho_{12}
       = -k_B \int d\Gamma_1 d\Gamma_2\, \rho_1\rho_2\,(\ln\rho_1 + \ln\rho_2)
       = -k_B \left( \int d\Gamma_1\, \rho_1 \ln\rho_1
       + \int d\Gamma_2\, \rho_2 \ln\rho_2 \right) = S_1 + S_2.
\]

Let $\Delta\Gamma_i$ be the volume of the phase space element $i$ and $\rho_i$ the average probability density in $i$. The system is found in the element $i$ with probability
\[
p_i = \rho_i\, \Delta\Gamma_i, \qquad \sum_i p_i = 1.
\]
We choose the sizes of all elements to be the smallest possible, i.e. $\Delta\Gamma_i = 1$. Then
\[
S = -k_B \sum_i \Delta\Gamma_i\, \rho_i \ln\rho_i
  = -k_B \sum_i \rho_i \Delta\Gamma_i \ln(\rho_i \Delta\Gamma_i)
  = -k_B \sum_i p_i \ln p_i,
\]
since $\ln \Delta\Gamma_i = 0$. If $\rho$ is smooth in a range $\Delta\Gamma = W$, we have $\rho = 1/W$, so that
\[
S = -k_B \int d\Gamma\, \frac{1}{W} \ln \frac{1}{W}.
\]
We end up with the Boltzmann entropy
\[
S = k_B \ln W.
\]

Entropy and disorder

Maximizing the entropy $S$ means minimizing the information we have about the system. In a microcanonical ensemble complete lack of information (besides the total energy) means that all states with the same total energy are equally probable. Conversely, losing information (a non-trivial, spread-out probability distribution) maximizes the entropy: a maximum of entropy corresponds to a maximum of disorder.
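The variational and counting arguments above have a simple discrete analogue, sketched below with $k_B = 1$ and arbitrary illustrative state counts $W$ (these numbers are assumptions, not from the notes): the uniform distribution maximizes $-\sum_i p_i \ln p_i$, its entropy equals the Boltzmann value $\ln W$, and the entropy is additive for independent subsystems.

```python
import math
import random

# Discrete sketch of the entropy properties (k_B = 1; W values illustrative):
#  - the uniform distribution p_i = 1/W maximizes S = -sum p_i ln p_i,
#  - its entropy is the Boltzmann entropy S = ln W,
#  - S is additive for independent subsystems, since W12 = W1 * W2.

def entropy(p):
    """Gibbs entropy -sum p ln p of a discrete distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

W = 50
S_uniform = entropy([1.0 / W] * W)
assert abs(S_uniform - math.log(W)) < 1e-12    # Boltzmann: S = ln W

# No randomly chosen distribution over W states beats the uniform one
random.seed(1)
for _ in range(1000):
    weights = [random.random() for _ in range(W)]
    total = sum(weights)
    assert entropy([x / total for x in weights]) <= S_uniform + 1e-12

# Additivity: joint uniform distribution over W1*W2 product states
W1, W2 = 20, 30
S1 = entropy([1.0 / W1] * W1)
S2 = entropy([1.0 / W2] * W2)
S12 = entropy([1.0 / (W1 * W2)] * (W1 * W2))
assert abs(S12 - (S1 + S2)) < 1e-9             # ln(W1 W2) = ln W1 + ln W2
```

The random-sampling check is of course not a proof; it merely illustrates numerically what the variation $\delta S = 0$, $\delta^2 S \le 0$ establishes analytically.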