Physics H190, Spring 2005
Notes 1: Timeline for Nineteenth Century Physics

In these notes we give a timeline of major events in 19th century physics, to orient ourselves to the way Planck and other physicists thought about physical problems in the late nineteenth century, and also to understand the open problems that existed at that time. We will start with thermal physics in the nineteenth century, which is most relevant to the story of black body radiation, although toward the end of the century thermal physics becomes mixed with electromagnetic theory. Indeed, the problem of black body radiation was fundamentally one of understanding the thermodynamics of the electromagnetic field.

At the beginning of the nineteenth century, the prevailing theory of thermal physics was the caloric theory, in which heat was conceived of as a fluid, called caloric, which presumably permeated all bodies. A given body had more or less caloric, depending on its temperature. The theory gave a satisfactory accounting of a certain range of experimental data, including quantitative issues. For example, one could define a certain quantity of heat (the calorie) as the amount of heat (caloric) needed to heat one gram of water by one degree Centigrade. If the water later cooled off by contact with colder bodies, one could say that the caloric had flowed out.

It is remarkable that Carnot in 1824 gave a formulation of what is now considered the second law of thermodynamics, considering that it was based on the (defective) caloric theory. Of course, Carnot did not call it the second law, since the first law of thermodynamics was not known yet. Carnot's formulation concerned the maximum efficiency of idealized heat engines that operate between two temperatures, in what we now call a Carnot cycle.

Caloric theory fell when it became recognized that heat is a form of energy, and that total energy, heat plus all its other forms, is conserved. This step was first made by Mayer and (independently) Joule in the 1840's. Helmholtz also made contributions. It is amazing that the law of conservation of energy, so central to our modern thinking about physics, took so long to be enunciated. In fact, restricted forms of the law were known to Newton much earlier, but without the understanding that heat was a form of energy, the general law could not be formulated. This is what we now call the first law of thermodynamics. This work also produced a numerical value for what is sometimes called the mechanical equivalent of heat, that is, the amount of energy (in energy units) in some unit of heat (like the calorie). The number is about 4 Joules/calorie.
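As a rough check of this number (the setup and the numbers here are my own illustration, not from the original notes): in a Joule-style paddle-wheel experiment, a mass m falling through a height h stirs M grams of water, and the mechanical work done reappears as heat,

m g h = J M ΔT,

where J is the mechanical equivalent of heat and ΔT is the temperature rise in degrees Centigrade. For example, a 1 kg mass falling about 4.3 m does m g h ≈ 1 × 9.8 × 4.3 ≈ 42 joules of work, which raises the temperature of M = 10 g of water by about 1 degree, consistent with J ≈ 4.186 joules per calorie.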

In the period 1850–54, Clausius and Kelvin reformulated Carnot's version of the second law of thermodynamics, in light of the newly understood first law. Their statements of the second law involved the impossibility of doing certain things with heat engines. From this followed the definition of entropy and the thermodynamic (absolute) definition of temperature. The latter are two entirely nontrivial consequences of the second law.

In 1859 Kirchhoff formulated his laws concerning radiative transfer between systems in thermal equilibrium. It was the first significant (and successful) attempt to combine the newly formed thermodynamics (first and second laws) with optics. The electromagnetic nature of light was not well understood at this time, so it is remarkable that Kirchhoff was able to do what he did. The notion of a black body dates from Kirchhoff's analysis, as does that of black body radiation. An important consequence of Kirchhoff's analysis was that the spectrum of black body radiation in a cavity had to be independent of the constitution of the walls. It was thus a physical property with some degree of universal validity, and over time this aspect of black body radiation attracted researchers who recognized that deeply fundamental physics must be involved in understanding the black body spectrum.

Maxwell, known primarily for his work on electromagnetic theory, also made significant contributions to the kinetic theory of gases, starting in about 1860. In contrast to thermodynamics, which makes no assumptions about the detailed microscopic nature of matter, the kinetic theory of gases models gases as large numbers of weakly interacting particles. Maxwell was able to use such models to calculate and predict physical properties of gases, such as the viscosity, in agreement with experiment. Many people at this time doubted the existence of atoms, which had not been directly observed, but Maxwell's researches added credence to atomic theories. Maxwell's work can be regarded as the beginning of what we now call statistical mechanics, that is, the prediction of the macroscopic properties of matter based on a microscopic model. But the system considered was a rather special (and simple) one, namely, a nearly ideal gas.

The kinetic theory of gases was further developed by Boltzmann up through the 1890's. One of Boltzmann's main contributions was to make a microscopic interpretation of the entropy, which was known from thermodynamics only as a macroscopic quantity. Boltzmann first wrote the formula,

S = k log W, (1.1)

where S is the entropy, k is Boltzmann's constant, and W is a certain probabilistic function connected with the distribution of positions and momenta of the gas molecules. The idea that entropy has a fundamentally probabilistic meaning at the microscopic level is due to Boltzmann, although, like Maxwell, he studied only special and especially simple systems, nearly ideal gases.
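A standard illustration (my example, not in the original notes) of how (1.1) connects counting to entropy: if a system consists of N independent parts, each of which can be in one of two equally likely states, then the number of configurations is W = 2^N, and

S = k log W = N k log 2,

so the entropy comes out additive over the parts, as thermodynamics requires. (Here log denotes the natural logarithm.)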

Boltzmann was so proud of his formula (1.1) that he had it engraved on his tombstone. Many physicists, including Planck (at first), did not like Boltzmann's probabilistic theories, and he did not exactly receive a warm welcome in the physics community. Perhaps for these reasons, he committed suicide (in 1906).

By the 1880's the problem of determining the spectrum of black body radiation was attracting increasing attention from both experimentalists and theorists. The Stefan-Boltzmann law (1884) states (in one version) that

u = aT^4, (1.2)

where u is the energy density of black body radiation, a is a constant, and T is the absolute temperature. Stefan guessed this on the basis of some incomplete experimental data, and Boltzmann showed that it followed from some electromagnetic theory plus the second law of thermodynamics. Boltzmann used an imaginary Carnot cycle, employing black body radiation as a working fluid.

Wien's displacement law (1894) gives a restricted functional form that the black body spectrum must satisfy, reducing it to an unknown function of a single variable. The history of this law is somewhat tormented, but it was a major advance in the quest for the black body spectrum, in that it narrowed the space of functions that one had to search in, and the thermodynamic reasoning that led to it was powerful and compelling.

Planck's final (successful) discovery of the correct formula for the black body spectrum took place in 1900, although there continued to be debate over its meaning and proper interpretation for several more years. This was the first time that what we now call Planck's constant, h or ℏ, appeared in physics.

In 1902 Gibbs, and independently in 1903 Einstein, presented the foundations of what is now called classical statistical mechanics. It was a general prescription for starting with a microscopic model of matter, not just simple systems like nearly ideal gases but in principle any system, and deducing from it macroscopic thermodynamic properties. One aspect of this was a reinterpretation of Boltzmann's formula (1.1), while keeping its probabilistic nature. After quantum mechanics became understood (1920's to 1930's), the generalization of classical statistical mechanics to quantum statistical mechanics was straightforward, and the latter is the theory now taught by books like Kittel and Kroemer. Classical statistical mechanics is still quite useful, however, in part because it is hard to solve quantum problems with a large number of particles.
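Before leaving the thermal timeline, it may help to record, in modern notation (my addition, not in the original notes), how the black-body statements above fit together. Wien's displacement law restricts the spectral energy density to the form

u(ν, T) = ν^3 f(ν/T),

for some unknown function f of the single variable ν/T. Planck's 1900 formula supplies that function,

u(ν, T) = (8πh ν^3/c^3) / (exp(hν/kT) − 1),

which is indeed of Wien's form, and integrating it over all frequencies ν reproduces the Stefan-Boltzmann law (1.2), u = aT^4.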

This concludes the timeline of thermal physics in the 19th century. It may be useful to add another timeline, this one concerning electromagnetic theory, which also developed tremendously in this period.

Roughly in the period 1800–15, Young, Fresnel and others showed that light is a wave phenomenon. This involved experiments demonstrating interference, as well as theoretical studies of diffraction and other effects, some of which were quite surprising in their conclusions (and therefore not believed at first). Previously the prevailing theory, going back to Newton, was that light consists of particles. The wave theory held sway throughout the nineteenth century, although of course quantum mechanics brought back the particle interpretation (light as photons), and now we think of light in a subtle way as presenting both particle and wave aspects, depending on the experiment being performed. The same duality has also been extended to other, material, particles, like the electron, which also present wave and particle aspects.

In the 1820's basic experiments on the production of magnetic fields by electric currents were carried out by Oersted, Biot and Ampere. These were important for demonstrating a connection between electricity and magnetism. This connection was not yet deeply understood, but it provided an object of fascination for other researchers, notably Faraday.

In the 1830's Faraday carried out a series of experiments showing that magnetic fields can produce electric currents, that is, electric fields that can drive currents. More exactly, Faraday found that time-varying magnetic fields produce electric fields. This is the phenomenon of induction, and, with the earlier results of Ampere and others, it showed that there was a kind of duality between electricity and magnetism.

Sometime in this period it was noticed that the numerical values of the electric and magnetic force constants, which can be determined by laboratory experiments, can be combined to create a quantity with the dimensions of velocity, and that this velocity was numerically equal to the velocity of light, to within experimental accuracy. This implied strongly that light was an electromagnetic phenomenon, although no one understood exactly how.

In the period 1862–64 Maxwell took the known experimental results of Faraday and others, expressed them in terms of a set of partial differential equations (field equations), and recognized that the resulting equations were inconsistent with the conservation of charge. He modified these equations by adding a new term, the "displacement current," which had no experimental justification, but which caused charge to be conserved. The new set of equations are what we now call Maxwell's equations. Maxwell showed that his equations predict the existence of electromagnetic waves that have two polarizations and propagate with the speed of light. Maxwell's detailed physical understanding of the nature of electromagnetic phenomena, however, was rather different from what we have today.
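In modern (SI) notation, which is my gloss rather than Maxwell's own, the last two points can be stated compactly. The combination of force constants with the dimensions of velocity is

c = 1/sqrt(ε₀ μ₀) ≈ 3 × 10^8 m/s,

and Maxwell's fix to the Ampere law is the displacement-current term:

∇ × B = μ₀ J + μ₀ ε₀ ∂E/∂t.

Taking the divergence of both sides (the left side vanishes identically) and using Gauss's law ∇ · E = ρ/ε₀ gives the continuity equation ∂ρ/∂t + ∇ · J = 0, that is, conservation of charge; dropping the added term breaks this.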

In particular, he invoked various pseudo-mechanical models for the dynamics of the electromagnetic field, and embraced the idea of the "ether," the substance that supposedly vibrates when a light wave passes. Maxwell's famous book, Treatise on Electricity and Magnetism, appeared in 1873.

In 1887 Hertz produced electromagnetic waves experimentally from electrical equipment, what we would now call radio waves. He demonstrated their states of polarization, as well as refraction, reflection, and interference of the waves.

In the 1890's, the electron, X-rays, and radioactivity were all discovered.

In 1905 Einstein published his special theory of relativity, a truly revolutionary development. The effect was to unify electromagnetic theory and mechanics, eliminating various discordances or inconsistencies between the two. In the process a unification of space and time, energy and mass, energy and momentum, and electric and magnetic fields was achieved.

Finally, I will mention a notable development in classical mechanics in the nineteenth century, namely, Hamilton's formulation of mechanics in 1834. There are basically three ways of formulating classical mechanics: one based on Newton's laws, F = ma, one based on Lagrangians, and one based on classical Hamiltonians. These three approaches all have the same physical content, but the Hamiltonian approach is considered the deepest. Students nowadays learn about quantum Hamiltonians in courses on quantum mechanics, but frequently do not learn about classical Hamiltonians at all. The physicists in the nineteenth century were, however, quite aware of Hamiltonian mechanics, and it played an important role in the development of statistical mechanics and quantum mechanics. We may say something about classical Hamiltonian mechanics in this course.
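For concreteness (my addition, in modern notation), Hamilton's formulation replaces F = ma by a pair of first-order equations for a coordinate q and its momentum p, both generated by a single function H(q, p):

dq/dt = ∂H/∂p,    dp/dt = −∂H/∂q.

For a particle of mass m in a potential V(q), taking H = p^2/2m + V(q) gives dq/dt = p/m and dp/dt = −V′(q), which together reproduce Newton's law m d²q/dt² = −V′(q).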