
Emergences in Statistical Physics
Roger Balian, Académie des Sciences
IPhT, Centre de Saclay, 91191 Gif-sur-Yvette Cedex

Statistical physics is one of the most active branches of contemporary theoretical physics. During the XXth century, the development of microphysics led to the unification of fundamental interactions and exhibited the formal simplicity of the laws at the microscopic scale. These discoveries have strengthened the reductionist attitude, based on the idea that in a system composed of elementary objects there is “nothing more” than these elements and their mutual relations. We therefore believe that the macroscopic laws, complex and specific, can be explained starting from the microscopic ones, simple and unified: it is only the large number of constituents which entails the appearance of new properties. Such a deduction is the very purpose of statistical physics, which is therefore the pre-eminent field of emergence.

Since the qualitative properties of a compound system may radically differ from those of its elementary parts, the idea that they could follow from them has been difficult to accept. In the middle of the XVIIIth century, Maupertuis raises the question of emergence in his Système de la Nature, essai sur la formation des corps organisés: “A uniform and blind attraction, prevalent in all parts of matter, could not explain how these parts manage to form even the simplest organised body. If they all have the same tendency, the same force to unite with one another, why would these produce an eye, those an ear? Why such a marvellous arrangement? And why wouldn’t they all unite in a jumble?” He concludes: “Never will the formation of an organised body be explained from the sole physical properties of matter.”

In the second half of the XIXth century, many properties of gases were explained and even predicted by the kinetic theory, the first branch of statistical physics, which was based on the assumption that a gas is an assembly of particles undergoing brief collisions. However, in spite of such successes, this theory met with much opposition until the direct experimental discovery of atoms and molecules. It seemed unthinkable that matter would not be continuous at the microscopic scale as it appears at our scale.

The field of statistical physics has not ceased to extend ever since. We acknowledge that, at the scale of 10⁻¹⁰ m, matter is simply constituted of atomic nuclei and electrons, characterised by their mass and charge, interacting pairwise through Coulomb forces (plus magnetic forces for some phenomena), and governed by the rules of quantum mechanics. Our aim is then to deduce, from these sole microscopic laws, which are simple and well established, the various properties at our scale of all kinds of materials, fluids, crystals, amorphous solids, plastics, whether these properties are mechanical, thermal, optical or electromagnetic. The whole of chemistry and even biology are supposed to derive from this same microphysics.

Such deductive understanding is also currently sought for other systems made of many elementary components. Techniques of statistical mechanics have thus been applied to optics, where the set of many components is the photon gas; to nuclei, treated as clusters of protons and neutrons; to biology, in order to explain for instance collective behaviour; and even to cosmology, where the Universe is regarded as an assembly of galaxies in gravitational interaction.
Even if we focus on the passage from the atomic scale to our scale, statistical physics has allowed us to understand the emergence of many phenomena, and has explained the existence of major qualitative differences between these two scales. A few of its successes are listed below. The reader may find more detail in the textbook by R. Balian, From Microphysics to Macrophysics: Methods and Applications of Statistical Physics (Volumes I and II, Springer, 2007).

1. Matter appears as continuous although its microscopic structure is discrete. The absence of contradiction between these two properties seems obvious today, as it did in the XVIIIth century. However, the progress made during the XIXth century in thermodynamics, electromagnetism, wave theory and fluid dynamics had imposed a continuous conception of matter, and atomism had been rejected.

2. Quantities such as energy undergo a metamorphosis from one scale to the other. Energy, a simple function of the microscopic variables that remains constant along the motion, takes at our scale various forms depending on the system and phenomenon considered: heat, work, kinetic, potential, chemical or electric energy, etc.; its conservation keeps track of its microscopic origin. These multiple aspects of energy at our scale explain why the concept has been so difficult to elaborate: the first Law of thermodynamics was recognised only in the middle of the XIXth century, whereas Carnot had stated the second Law in 1824; the word “energy” in its scientific meaning was introduced in 1852, and “énergie” appeared in French only in 1875.

3. At the microscopic scale, the physical quantities reduce to the positions and velocities of the constituent particles (plus their intrinsic angular momentum, termed “spin”, relevant for magnetic materials). The familiar physical quantities emerge in the passage to our scale and are numerous. Some appear as average values: density, pressure, hydrodynamic velocity, electric current, magnetization, concentration of a solute, etc. Others have a hidden statistical nature: entropy measures the microscopic disorder, temperature characterises the probability distribution which governs the system at equilibrium. Still others are derived from the previous ones and are associated with specific properties: heat capacity, compressibility, resistivity, viscosity, etc. All these quantities are collective; they have no microscopic equivalent and exist only owing to the large number of constituents.

4. Statistical physics requires probabilities for two reasons. On the one hand, quantum mechanics, which governs any object at the microscopic scale, is irreducibly probabilistic. (We shall return to this point.) On the other hand, we wish to study a complex object, and this in itself imposes the use of probabilities: describing such a system in full detail is unthinkable, so we look for the general properties of an ensemble of systems, all placed under the same macroscopic conditions but differing at the microscopic scale. Predictions are therefore tainted with uncertainties. However, at our scale, physics is deterministic. To resolve this contradiction, we must understand why the probabilistic character of our microscopic description disappears at the macroscopic scale. This follows again from the immensity of the number N of constituents (the Avogadro number is 6 × 10²³ particles per mole). The law of large numbers then implies that the relative statistical fluctuation of a collective quantity is of order 1/√N, hence negligible (a worked version of this estimate is sketched just after this list). Whereas the microscopic variables, the positions and velocities of the particles, fluctuate widely, the macroscopic quantities listed above take (or seem to take) well-defined values.

5. In the reductionist perspective of statistical physics, thermodynamics is dethroned. It is no longer a fundamental science, since its “Laws” appear as mere consequences of the principles of microphysics. The first Law relies on the interpretation of the various macroscopic forms of energy in terms of the microscopic energy. The second Law follows from Laplace’s “principle of insufficient reason”, according to which equal probabilities should be assigned to events on which no information is available. Indeed, it has been shown (R. Balian and N. Balazs, Ann. Phys. 179 (1987) 97) that the statistical entropy, a mathematical object which measures the uncertainty associated with the resulting probability distribution, can be identified with the entropy of thermodynamics, and that the latter should be a maximum at equilibrium: this is exactly Callen’s formulation of the second Law (a sketch of this maximum-entropy argument is also given after the list).

6. When applied not on general grounds but to specific materials or phenomena, at equilibrium or off equilibrium, statistical physics allows us to understand the microscopic origin of many phenomenological laws: equations of state, specific heats of gases or solids, Hooke’s law of elasticity, the 1/T magnetic susceptibility, the law of mass action, the heat equation, Ohm’s law, the Navier–Stokes equation, the laws of chemical kinetics, and so on. For sufficiently simple materials, one can even evaluate the empirical coefficients entering such laws, for instance viscosities, conductivities, thermoelectric or diffusion coefficients.

7. Macroscopic dynamical laws are most often nonlinear, whereas the equations of motion of microphysics are linear. Nonlinearity can thus emerge from linearity.

8. Another drastic qualitative difference, the emergence of irreversibility, has long been regarded as a paradox. At our scale, most evolutions exhibit an “arrow of time”: friction slows down a moving object, the temperatures of two bodies in contact tend to equalize, sugar dissolves in water, and the inverse spontaneous processes are forbidden. Nevertheless, microscopic dynamics is reversible: changing the direction of time does not affect the equations of motion of the constituent particles whatsoever. Here again, it is the large number of these particles which allows us to understand why the probability of observing aberrant evolutions, which would go back in time, vanishes in practice.
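As a minimal worked sketch of the 1/√N estimate invoked in point 4, assume N independent contributions x_i to an extensive quantity X = x_1 + … + x_N, with common mean μ and standard deviation σ (these symbols are introduced only for this illustration, not taken from the text):

```latex
% Sketch, assuming N independent, identically distributed contributions:
% <X> = N*mu and Var(X) = N*sigma^2, so the relative fluctuation of X
% decreases as 1/sqrt(N).
\begin{align}
  \frac{\Delta X}{\langle X\rangle}
    = \frac{\sqrt{N}\,\sigma}{N\mu}
    = \frac{\sigma}{\mu}\,\frac{1}{\sqrt{N}}
    \;\approx\; 10^{-12}
  \qquad \text{for } N \simeq 6\times 10^{23},\ \ \sigma/\mu \sim 1 .
\end{align}
```

This is why density, pressure or magnetization appear perfectly sharp at our scale even though each underlying microscopic variable fluctuates strongly.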
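A minimal sketch of the maximum-entropy argument behind point 5, under the standard assumption that only the normalization of the probabilities and the mean energy U are known: assigning to the microstates m the least-biased probabilities p_m, i.e. those maximizing the statistical entropy under these constraints, yields the canonical distribution, with the temperature emerging as a Lagrange multiplier.

```latex
% Sketch: maximize S = -sum_m p_m ln p_m subject to
%   sum_m p_m = 1   and   sum_m p_m E_m = U,
% using Lagrange multipliers alpha and beta.
\begin{align}
  \frac{\partial}{\partial p_m}
  \Big[-\sum_n p_n \ln p_n - \alpha \sum_n p_n - \beta \sum_n p_n E_n\Big] = 0
  \;\Longrightarrow\;
  p_m = \frac{e^{-\beta E_m}}{Z}, \qquad
  Z = \sum_m e^{-\beta E_m},
\end{align}
```

with β identified as 1/(k_B T); the thermodynamic entropy is then k_B S evaluated at this maximum. Temperature thus appears, as stated in point 3, as a parameter of the equilibrium probability distribution rather than as a microscopic quantity.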