
Studies in History and Philosophy of Modern Physics 41 (2010) 208–217



The development of general circulation models of climate

Spencer Weart

American Institute of Physics, College Park, MD, USA

Article history: Received 30 October 2009; Received in revised form 19 May 2010; Accepted 25 June 2010

Keywords: Simulation; Model; Computer; Climate; Atmosphere; Circulation

Abstract: With the coming of digital computers in the 1950s, a small American team set out to model the weather, followed by attempts to represent the entire general circulation of the atmosphere. The work spread during the 1960s, and by the 1970s a few modelers had produced somewhat realistic looking models of the planet's regional climate pattern. The work took on wider interest when modelers tried increasing the level of greenhouse gases, and invariably found serious global warming. Skeptics pointed to dubious technical features, but by the late 1990s these problems were largely resolved—thanks to enormous increases in computer power, the number and size of the closely interacting teams that now comprised the international modeling community, and the crucial availability of field experiments and satellite data to set against the models' assumptions and outputs. By 2007 nearly all climate experts accepted that the climate simulations represented reality well enough to impel strong action to restrict gas emissions.

© 2010 Elsevier Ltd. All rights reserved.


1. Introduction

Climate is made by the general circulation of the atmosphere—the global pattern of air movements, with air masses rising in the tropics to descend farther north, semi-tropical trade winds, cyclonic storms that carry energy and moisture through middle latitudes, and so forth. Many meteorologists suspected that shifts in this pattern were a main cause of climate change. They could only guess about such shifts, for the general circulation was poorly mapped before the 1940s (even the jet streams remained to be discovered). The Second World War and its aftermath brought a phenomenal increase in observations from ground level up to the stratosphere, which finally revealed all the main features. Yet up to the 1960s, the general circulation was still only crudely known.¹

E-mail address: [email protected]
¹ This essay is mainly a condensation of Weart (2009), where many additional references may be found. It is partly based, by permission, on an unpublished essay by Paul Edwards; see Edwards (2000). For a more thorough account of the history of GCMs, including vital connections with data gathering and processing, see Edwards (2010). The context is summarized in Weart (2008).

This knowledge was strictly observational. From the 19th century forward, many scientists had attempted to explain the general pattern by applying the laws of the physics of gases to a heated, rotating planet. All their ingenious efforts failed to derive a realistic mathematical solution. The best mathematical physicists could only offer simple arguments for the character of the circulation, arguments which might seem plausible but in fact were mere hand-waving (see Lorenz, 1967, 59ff).

The solution would come by taking the problem from the other end. Instead of starting with grand equations for the planet as a whole, one might seek to find how the circulation pattern was built up from the local weather at thousands of points. But the physics of local weather was also a formidable problem.

2. Numerical weather prediction (1922–1965)

Early in the 20th century, Vilhelm Bjerknes developed a set of seven "primitive equations" describing the behavior of heat, air motion, and moisture. The solution of the set of equations would, in principle, describe and predict large-scale atmospheric motions (Nebeker, 1995; Friedman, 1989). Lewis Fry Richardson (1922) published a numerical system for weather prediction, using simplified versions of Bjerknes's equations. Richardson's idea was to divide up a territory into a grid of cells, each with its own set of numbers describing its air pressure, temperature, and the like, as measured at a given hour. He would then solve the equations that told how air behaved. He could calculate wind speed and direction, for example, from the difference in pressure between two adjacent cells. The number of computations was so great that Richardson scarcely hoped his idea could lead to practical weather forecasting. A practical test, which cost him
six weeks of pencil-work, ended in failure. Taking the warning to heart, meteorologists gave up any hope of numerical modeling (Nebeker, 1995, chap. 6).

The alternative to the failed numerical approach was to keep trying to find a solution in terms of mathematical functions—a few pages of equations that an expert might comprehend as easily as a musician reads music. Through the 1950s, some leading meteorologists worked with simplified forms of the physics equations, trying a variety of approaches to what we might call a mathematical simulation of the atmosphere. They were never able to convincingly show the features of the general circulation, not even something as simple and important as the trade winds.

What was hopeless with pencil and paper might possibly be made to work with the new digital computers, feverishly developed during the Second World War and leaping ahead in power as the Cold War demanded ever more calculations. In the lead, energetically devising ways to simulate nuclear weapons explosions, was the Princeton mathematician John von Neumann. Von Neumann saw parallels between his explosion simulations and weather prediction (both are problems of non-linear fluid dynamics). He assembled a small group of theoretical meteorologists at Princeton's Institute for Advanced Study to look into regional weather prediction; if that proved feasible, the group planned to move on to the extremely ambitious problem of modeling the entire global atmosphere. Von Neumann invited Jule Charney, an energetic and visionary meteorologist, to head the new group. Charney came from Carl-Gustaf Rossby's pioneering meteorology department at the University of Chicago, where the study of weather maps and fluids had developed a toolkit of sophisticated mathematical techniques and an intuitive grasp of basic weather processes.

Richardson's equations were the necessary starting-point, but Charney had to simplify them if he hoped to run large-scale calculations in weeks rather than centuries. He began with a set of simplified equations that described the flow of air along a narrow band of latitude. By 1949, his group had results that looked fairly realistic—sets of numbers that you could almost mistake for real weather diagrams, if you did not look too closely. In one characteristic experiment, they modeled the effects of a large mountain range on the air flow across a continent. Modeling was taking the first steps toward the computer games that would come a generation later, in which the player acts as a god: raise up a mountain range and see what happens! Soon the group proceeded to fully three-dimensional models for a region (Charney, 1949; Charney & Eliassen, 1949).²

To Charney (1949, pp. 371–372), all this was just an extension of normal theoretical analysis. "By reducing the mathematical difficulties involved in carrying a train of physical thought to its logical conclusion," he wrote, "the machine will give a greater scope to the making and testing of physical hypotheses". He did not imagine that computer models would be able to convey insights in a way that could not come from physics theory, nor a laboratory setup, nor the data on a weather map, but in an altogether new way. His challenge remained what it had been in the traditional style of physics theory: to combine and simplify equations until you got formulas that gave sensible results with a limited amount of computation. Developing usable combinations and approximations of meteorological variables took a rare combination of mathematical ingenuity and physical insight. And that was only the beginning.

To know when you were getting close to a realistic model, you had to compare your results with the actual atmosphere. For that you would need an unprecedented number of measurements of temperature, moisture, wind speed, and so forth. Largely because of military needs, during the war and afterward networks had been established to send up thousands of balloons that radioed back measurements of the upper air. For the first time the atmosphere was seen not as a single layer, as represented by a surface map, but in its full three dimensions. By the 1950s, the weather over continental areas, up to the lower stratosphere, was being mapped well enough for comparison with results from rudimentary models.³

The first serious weather simulation Charney's team completed was two-dimensional. Their model divided the atmosphere into a grid of cells, covering North America with 270 points about 700 km apart. Starting with real weather data for a particular day, the computer solved all the equations for how the air should respond to the differences in conditions between each pair of adjacent cells. Taking the outcome as a new set of weather data, it stepped forward in time (using a step of three hours) and computed all the cells again. Charney, Fjørtoft, and von Neumann (1950, p. 245) remarked that between each run it took them so long to print and sort punched cards that "the calculation time for a 24-hour forecast was about 24 hours, that is, we were just able to keep pace with the weather". The resulting forecasts were far from perfect, but they turned up enough features of what the weather had actually done on the chosen day to justify pushing forward (Platzman, 1979). The U.S. Weather Bureau and units of the armed forces established a Joint Numerical Weather Prediction Unit, which in May 1955 began issuing real-time forecasts in advance of the weather. The results were encouraging, although it would be well over a decade before the accuracy of computer forecasts began to reliably outstrip the subjective guesswork of experienced human forecasters (Cressman, 1996; Nebeker, 1995).

These early forecasting models were regional, not global in scale. It was not for practical reasons of weather prediction that a few meteorologists wanted to push on to model the entire general circulation of the global atmosphere. To be sure, Von Neumann and the military agencies that funded him speculated that understanding the climate might make it possible someday to manipulate it, as rain-makers claimed to do, perhaps even as a form of warfare. But for the foreseeable future, the scientists' true ambition was strictly theoretical, the hope of understanding at last how the climate system worked—why there are deserts in some places, wet zones in others, and so forth. That computation became a holy grail for theoretical meteorologists.

Norman Phillips in Princeton took up the challenge. He was encouraged by "dishpan" experiments carried out in Chicago: a pan of water, rotating on a turntable and heated at the edge (as the Earth is heated in the tropics)—a physical simulation of the atmosphere. The simulation produced something roughly like the convection cells of the atmosphere's general circulation and even tiny storms (Fultz, 1949, 1952). For Phillips (1956) this proved that "at least the gross features of the general circulation of the atmosphere can be predicted without having to specify the heating and cooling in great detail." If such an elementary laboratory system could model a hemisphere of the atmosphere, should not a computer be able to do as well? To be sure, the computer at Phillips's disposal was as primitive as the dishpan (its RAM held all of five kilobytes and its magnetic-drum memory held ten). So his model had to be extremely simple. By mid-1955 Phillips had developed improved equations for a two-layer atmosphere. To avoid mathematical complexities, his grid covered not a hemisphere but a cylinder, 17 cells high and 16 in circumference. He drove circulation by putting heat into the lower half, somewhat like the dishpan experimenters only with
numbers rather than an electrical coil. The calculations turned out a plausible jet stream and the evolution of a realistic-looking weather disturbance over as long as a month.

This settled an old controversy over what processes built the pattern of circulation. For the first time scientists could see, among other things, how giant eddies spinning through the atmosphere played a key role in moving energy and momentum from place to place. Phillips's model was quickly hailed as a "classic experiment"—the first true General Circulation Model, or "GCM" (Smagorinsky, 1963, p. 100).

Three-dimensional models were not the only way to approach problems of climate. Early models of the global atmosphere had compressed every process into one or two dimensions. The three-dimensional models were indeed made up of one-dimensional vertical columns, each exchanging air and energy with its neighbors; such columns continued to attract much research attention over subsequent decades. Meanwhile scientists would develop an ever richer variety of approaches, finding many ways to slice up the total number of arithmetic operations that a computer could run through in whatever time you could afford to pay for. This essay does not cover these developments, but concentrates on the three-dimensional GCMs, the countless descendants of Phillips's model.

Encouraged by the model's success, Von Neumann drummed up government funding for a long-term project. The effort got underway that same year, 1955, under the direction of Joseph Smagorinsky at the Weather Bureau near Washington, DC. Smagorinsky's goal was the one first envisaged by von Neumann and Charney, a general circulation model of the entire global atmosphere built directly from the primitive equations. In 1958, he invited the young Japanese meteorologist Syukuro Manabe to join the lab. Smagorinsky and Manabe put into their model the way rain fell on the surface and evaporated, they put in how radiation passing through the atmosphere interacted not only with water vapor but also with ozone and carbon dioxide gas (CO2), they put in how the air exchanged water and heat with simplified ocean, land, and ice surfaces, and much more. Manabe spent many hours in the library studying such esoteric topics as how various types of soil absorbed water. The huge complexities of the modeling required contributions from several others. "This venture has demonstrated to me," said Smagorinsky (1963, p. 151), "the value if not the necessity of a diverse, imaginative, and dedicated working group in large research undertakings". As decades passed this necessity would drive the community of researchers to grow by orders of magnitude without ceasing to collaborate closely.

By 1965 the group had a reasonably complete three-dimensional global model that solved the basic equations for an atmosphere divided into nine levels. This was still highly simplified, with no geography—land and ocean were blended into a single damp surface, which exchanged moisture with the air but could not take up heat. Nevertheless, the way the model moved water vapor around the planet looked gratifyingly realistic. The printouts showed a stratosphere, a zone of rising air near the equator (creating the doldrums, a windless zone that becalmed sailors), a subtropical band of deserts, and so forth. Many details came out wrong, however (Manabe, Smagorinsky, & Strickler, 1965; Smagorinsky, Manabe, & Holloway, 1965).

In the late 1950s, as computer power grew and the need for simplifying assumptions diminished, other scientists around the world began to experiment with fully three-dimensional, many-leveled models. An outstanding case was the work of Yale Mintz in the Department of Meteorology of the University of California, Los Angeles. Like Smagorinsky, Mintz had the rare vision and drive necessary to commit himself to a research program that must take decades to reach its goals. And like Smagorinsky, Mintz recruited a young Tokyo University graduate, Akio Arakawa, to help design the mathematical schemes for a general circulation model (Arakawa, Venkateswaran, & Wurtele, 1994).

By 1964 Mintz and Arakawa had produced a climate computed for an entire globe, with only a two-layer atmosphere but including realistic geography—the topography of mountain ranges was there, and a rudimentary treatment of oceans and ice cover. Although the results missed some features of the real world's climate, the basic wind patterns and other features came out more or less right (Arakawa, 1970; Mintz, 1965). The model, packed with useful techniques, had a powerful influence on other groups.

Arakawa was becoming especially interested in a problem that was emerging as a main barrier to progress—accounting for the effects of cumulus clouds. The smallest single cell in a global model that a computer can handle, even today, is far larger than an individual cloud. Thus the computer calculates none of the cloud's details. Models have had to get by with a "parameterization," a set of numbers (parameters) that represent the net behavior of all the clouds in a cell under given conditions. Through the decades, Arakawa and others would spend countless hours developing and exchanging ways to attack this critical problem.⁴

⁴ Arakawa & Schubert (1974) was a major step, and briefly reviews the history.

Modeling techniques and entire GCMs spread by a variety of means. In the early days, as Phillips recalled, modelers had been like "a secret code society."⁵ There were so many subtleties that a real grasp required an apprenticeship on a working model. Commonly, a new modeling group began with some version of another group's model. A post-doctoral student might take a job at another institution, bringing along his old team's computer code. The new team he assembled would start off working with the old code and then set to modifying it. Others built new models from scratch. Americans dominated the field during the first postwar decades, but through the 1960s and 1970s, important GCM groups also emerged at institutions from England to Australia.⁶

⁵ Norman Phillips, interview by T. Hollingsworth et al., October 1989, p. 23, copies at the National Center for Atmospheric Research, Boulder, CO, and Niels Bohr Library & Archives, American Institute of Physics, College Park, MD (henceforth: AIP).
⁶ For a "family tree" of models, see Edwards (2010, Fig. 7.4, p. 168).

3. Credible climate prediction (1965–1979)

Although the modelers of the 1950s and early 1960s got results good enough to encourage them to persevere, they were still a long way from reproducing the details of the Earth's actual climate zones. Even if the computers had been vastly faster, the simulations would still have been unreliable. For they were running up against that famous limitation of computers, "garbage in, garbage out." To diagnose the failings that kept GCMs from being more realistic, scientists needed an intensified effort to collect and analyze data showing the actual profiles of wind, heat, moisture, and so forth, at every level of the atmosphere and all around the globe. The data in hand were still deeply insufficient. For example, for the atmosphere's crucial water and energy balances, the expert Edward Lorenz (1967) estimated that the commonly used numbers might be off by as much as 50%. Smagorinsky (1970, p. 33) put the problem succinctly: "We are now getting to the point where the dispersion of simulation results is comparable to the uncertainty of establishing the actual atmospheric structure".

² The first experiment in a GCM (raising up the Himalayas) was Mintz (1965).
³ Nebeker (1995), Smagorinsky (1983), Kutzbach (1996, pp. 362–368).
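The grid-and-timestep recipe that runs from Richardson through Charney's 1950 forecast can be sketched in a few lines of modern code. The sketch below is a toy of my own construction, not a reconstruction of any historical program: one row of cells, an invented response coefficient `k`, and a single made-up rule that moves mass down the pressure gradient, standing in for the primitive equations.

```python
# Toy sketch of the grid-and-timestep scheme: a row of cells, each holding
# an air pressure; the "wind" between two adjacent cells is taken
# proportional to their pressure difference, and each step moves mass
# down-gradient.  The coefficient k and the field values are illustrative.

def step(pressures, k=0.1):
    """Advance the toy grid by one time step (think: one three-hour step)."""
    # wind between cell i and cell i+1, computed from the pressure difference
    flows = [k * (pressures[i] - pressures[i + 1])
             for i in range(len(pressures) - 1)]
    new = pressures[:]
    for i, f in enumerate(flows):
        new[i] -= f          # mass leaves the high-pressure side...
        new[i + 1] += f      # ...and enters the low-pressure side
    return new

state = [1010.0, 1000.0, 990.0, 1000.0]   # initial pressure field (hPa)
for _ in range(8):                        # eight 3-hour steps = 24 hours
    state = step(state)
print(state)                              # field smooths toward equilibrium
```

Feeding each step's output back in as the next step's input is the same loop Charney's team ran with punched cards, eight three-hour steps to a 24-hour forecast; the real models solved the primitive equations rather than this one-line flow rule.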
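The cumulus problem Arakawa faced can be made concrete with a deliberately crude sketch. Every number here (the 0.6 humidity threshold, the 6.5 K/km critical lapse rate, the rainfall scaling) is an invented illustration of what a "parameter" is; real schemes such as Arakawa and Schubert's involve far more physics.

```python
# A parameterization maps cell-mean conditions to the net effect of all the
# unresolved clouds in a grid cell.  All constants below are invented,
# illustrative "parameters" of the kind modelers tuned against observations.

def cumulus_parameterization(rel_humidity, lapse_rate_k_per_km):
    """Return (cloud_fraction, precip_mm_per_day) for one grid cell."""
    # clouds appear once the cell-mean humidity passes a threshold
    cloud_fraction = min(1.0, max(0.0, (rel_humidity - 0.6) / 0.4))
    # convection needs instability: a lapse rate above a critical value
    instability = max(0.0, lapse_rate_k_per_km - 6.5)
    precip = 2.0 * cloud_fraction * instability   # toy scaling
    return cloud_fraction, precip

# a moist, unstable cell rains; a dry, stable one does not
print(cumulus_parameterization(0.9, 8.0))
print(cumulus_parameterization(0.5, 5.0))
```

The point of the exercise is the interface, not the formula: the cell passes in averaged conditions and receives back the net behavior of thousands of clouds it cannot resolve.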

Lorenz (1967, p. 10) cautioned that "even the trade winds and the prevailing westerlies at sea level are not completely explained". He and a few others began to suspect that the problem was not merely difficult, but impossible in principle. They worried that climate was not a well-defined system, but only an average of the ever-changing jumble of daily storm fronts. Would computer modelers ever be able to say they had "explained" the general circulation? Many scientists looked askance at the new method of numerical simulation. People were attacking many kinds of scientific problems by taking a set of basic equations and running them through hundreds of thousands of computations. Their results were simply stacks of printout with rows of numbers. That was no "explanation" in the traditional sense of a model in words or diagrams or equations, something you could write down on a few pages, something your brain could grasp intuitively as a whole. The numerical approach "yields little insight," Lorenz (1967, p. 8) complained. "The computed numbers are not only processed like data but they look like data, and a study of them may be no more enlightening than a study of real meteorological observations".

Yet the computer scientist could "experiment" in a sense, by varying the parameters and features of a numerical model. You couldn't put a planet on a laboratory bench and vary the sunlight or the way clouds were formed, but wasn't playing with computer models functionally equivalent? Through many such trials you might eventually come to understand how the real world operated. Indeed you might be able to observe the planet more clearly in graphs printed out from a model than in the clutter of real-world observations, so woefully inaccurate and incomplete (Kellogg & Schneider, 1974, p. 1166).⁷

⁷ For general discussion of heuristic modeling see Dahan-Dalmedico (2001).

A new viewpoint was spreading along with digital computing. Climate was not changed by any single cause, the modelers said. It was the outcome of a staggeringly intricate complex of interactions—something that could only be comprehended in the working-through of the numbers. The printouts were all the explanation there was.

The growing community of climate modelers was strengthened by the progress of other teams that carried out detailed calculations on short time-scales for weather prediction. This progress required much work on parameterization—schemes for representing cloud formation, interactions between waves and winds, and so forth. Such studies accelerated as the 1970s began. The weather forecasting models also required data on conditions at every level of the atmosphere at thousands of points around the world. Such observations were now being provided by the balloons and sounding rockets of an international World Weather Watch, founded in the mid-1960s. The weather predictions became accurate enough—looking as far as three days ahead—to be economically important. That built support for the meteorological measurement networks and computer studies necessary for climate work.

An example of the crossover could be found at NASA's Goddard Institute for Space Studies in New York City. A group there under James Hansen had been developing a weather model as a practical application of its mission to study the atmospheres of planets. For one basic component of this model, Hansen developed a set of equations for the transfer of radiation through the atmosphere, based on work he had originally done for studies of the planet Venus. The same equations could be used for a climate model, by combining them with an elegant method for computing fluid dynamics that Arakawa had developed.

In the 1970s, Hansen assembled a team to work up a model that would be both fast-running and realistic. An example of the kind of detail they pursued was a simple equation they devised to represent the reflection of sunlight from snow. They included the age of the snow layer (as it gradually melted away) and the "masking" by vegetation (snowy forests are darker than snowy tundra). They managed to get a quite realistic-looking climate that ran an order of magnitude faster than some rival GCMs, permitting the group to experiment with multiple runs, varying one factor or another to see what changed. In such studies, the global climate was beginning to feel to researchers like a comprehensible physical system, akin to the systems of glassware and chemicals that experimental scientists manipulated on their laboratory benches (Hansen et al., 1983).

Groups continued to proliferate, borrowing ideas from earlier models and devising new techniques of their own. In their first decade or so of work the GCM modelers had treated climate as a given, a static condition. But in the 1960s a few modelers began to take an interest in global climate change as a problem over the long term. The observations by Keeling (1960, 1970) that the level of CO2 in the atmosphere was rising prompted hard thinking about greenhouse warming.

Manabe had a long-standing interest in the effects of CO2, not because he was worried about the future climate, but simply because the gas at its current level was a significant factor in the planet's heat balance. First, however, he had to deal with the most powerful greenhouse gas: H2O. Manabe and his colleagues were building the first model that took full account of water in all its forms, realistically including the feedback between the air's temperature and the amount of water vapor the air would hold.

In particular, Manabe's group calculated the way rising columns of moisture-laden air conveyed heat from the surface into the upper atmosphere. The required computations were so extensive that Manabe stripped down the model to a single one-dimensional column, which represented the atmosphere averaged over the globe. His aim was to get a system that could be used as a basic building-block for a full three-dimensional GCM.

In 1967, Manabe and a collaborator, Richard Wetherald, used the one-dimensional model to test what would happen if the level of CO2 changed. Their target was the climate's "sensitivity," often defined as the change in average global temperature if the CO2 level doubled. Their answer was that global temperature would rise roughly 2 °C (Manabe & Wetherald, 1967).

This was no more than a first baby step toward a realistic three-dimensional model of the changing climate. The next important step was also taken by Manabe's group, which in 1968 moved to Princeton. They now used a GCM with nine atmospheric levels, but it was still highly simplified. In place of actual land and ocean geography they pictured a geometrically neat planet, half damp surface (land) and half wet (a "swamp" ocean). They could not predict cloudiness but just held it unchanged at the present level when they calculated the warmer planet with doubled CO2. However, they did incorporate the movements of water, predicting changes in soil moisture and snow cover on land, and they calculated sea surface temperatures well enough to show the extent of sea ice. The results, published in 1975, looked quite realistic overall. As one might expect on basic physical grounds, the warmer model with increased CO2 held more moisture in the air, with an intensified hydrological cycle of evaporation and precipitation, as hot soil dried out in one region and more rain came down elsewhere. The model also showed greater warming in the Arctic than in the tropics; this too could be predicted from simple reasoning, for less snow and ice meant more absorption of sunlight by ground and sea. But it took a calculation to show that what sounded reasonable on elementary principles was indeed likely to happen in the real world—or at least this simulation of it.
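The "sensitivity" number that Manabe and Wetherald computed can be illustrated with a back-of-envelope relation. The sketch below is not their radiative-convective model: it assumes the standard modern approximation that CO2 forcing grows logarithmically with concentration, about 5.35 ln(C/C0) W/m² (an empirical fit published decades later), and tunes one feedback constant so that a single doubling reproduces their roughly 2 °C answer.

```python
import math

# NOT the 1967 radiative-convective model: a one-line stand-in using the
# standard logarithmic CO2 forcing, F = 5.35 * ln(C/C0) W/m^2 (a modern
# empirical fit), with the feedback parameter LAM tuned so that doubling
# CO2 gives the Manabe-Wetherald figure of roughly 2 degC.

F_DOUBLING = 5.35 * math.log(2.0)   # forcing per doubling, ~3.7 W/m^2
LAM = 2.0 / F_DOUBLING              # degC of warming per W/m^2 of forcing

def equilibrium_warming(co2_ratio):
    """Warming for a CO2 concentration co2_ratio times the baseline."""
    return LAM * 5.35 * math.log(co2_ratio)

print(round(equilibrium_warming(2.0), 2))   # one doubling: the sensitivity
print(round(equilibrium_warming(4.0), 2))   # two doublings: twice as much
```

The logarithmic form is why sensitivity is quoted per doubling: in this toy, going from two to four times the baseline CO2 adds the same warming again.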

Averaged over the entire planet, for doubled CO2 the computer predicted a warming of around 3.5 °C. It all looked plausible. The results made a considerable impact on scientists, and through them on policy-makers and the public.

Manabe and Wetherald (1975, p. 13) warned that "it is not advisable to take too seriously" the specific numbers they published. They singled out the way the model treated the oceans as a simple wet surface. On our actual planet, the oceans absorb large quantities of heat from the atmosphere, move it around, and release it elsewhere. And there remained the old vexing problem of clouds. As the planet got warmer the amounts of cloudiness would probably change at each level of the atmosphere in each zone of latitude, but change how? There was no reliable way to figure that out. Worse, the net effect depended on the types of cloud and how high they floated in the atmosphere. A better prediction of climate change would have to wait on general improvements.

Progress was steady, thanks to the headlong advance of electronic computers. From the mid-1950s to the mid-1970s, the power available to modelers increased by a factor of thousands. That meant modelers could put in more factors in more complex ways. The models no longer had gaping holes that required major innovations, and the work settled into a steady improvement of existing techniques. At the foundations, modelers devised increasingly more sophisticated and efficient schemes of computation. As input for the computations they worked endlessly to improve parameterizations. From around 1970 on, many journal articles appeared with ideas for dealing with convection, evaporation of moisture, reflection from ice, and so forth (Nebeker, 1989).

The most essential element for progress, however, was better data on the real world. Strong efforts were rapidly extending the observing systems. In 1969 NASA's Nimbus 3 satellite began to broadcast measurements which, although designed primarily for meteorology, provided a fundamental check on model results. A main reason Manabe's 1975 model planet was seen as successful was the reasonably good agreement between its reflection of sunlight at each latitude and the actual numbers measured by Nimbus 3.

Also encouraging was a 1972 model by Mintz and Arakawa (unpublished, like much of their work), which managed to simulate in a rough way the huge changes as the sunlight shifted from season to season. During the next few years, Manabe and collaborators published a model that produced entirely plausible seasonal variations. Seasons provided a convincing test of the models' validity. It was almost as if a single model worked for two quite different planets—the planets Summer and Winter. A 1975 review panel felt that with this success, realistic numerical climate models "may be considered to have begun" (GARP (National Academy of Sciences, United States Committee for the Global Atmospheric Research Program), 1975, p. 200, citing Mintz, Katayama, & Arakawa, 1972).⁸

⁸ Manabe had a rough seasonal simulation by 1970 and published a full seasonal variation in Manabe, Hahn, & Holloway (1974).

Yet basic problems such as predicting cloudiness remained unsolved, while new difficulties rose into view. In particular, scientists began to realize that the way clouds formed could be strongly affected by the haze of dust and chemical particles floating in the atmosphere. Little was known about how these aerosols helped or hindered the formation of different types of clouds. Such things had scarcely been explored through physics theory, nor had they been measured in the field.

Modelers felt driven to do better, for people had begun to demand much more than a crude reproduction of the present climate. Weather disasters and the energy crisis of the early 1970s had put greenhouse warming on the public agenda. It was now a matter of concern to citizens (or at least the more scientifically well-informed citizens) whether the computer models were correct in their predictions of how CO2 emissions would raise global temperatures. Newspapers reported disagreements among prominent scientists. Some experts suspected that factors overlooked in the models might keep the climate system from warming at all, or might even bring on cooling instead.

The problem was so vexing that the President's Science Adviser asked the National Academy of Sciences to study the issue. The Academy appointed a panel, chaired by Jule Charney and including other respected experts who had been distant from the recent climate debates. They convened at Woods Hole in the summer of 1979. Charney's group compared two independent GCMs, one constructed by Manabe and the other by Hansen—elaborate three-dimensional models that used different physical approaches and different computational methods for many features. The panel found differences in detail but solid agreement for the main point: the world would get warmer as CO2 levels rose.

The Charney panel announced that they had rather high confidence that as CO2 doubled in the atmosphere, the planet would warm up by about three degrees, plus or minus fifty percent: in other words, 1.5–4.5 °C (2.7–8 °F). They concluded dryly, "We have tried but have been unable to find any overlooked or underestimated physical effects" that could reduce the warming (National Academy of Sciences, 1979, pp. 2, 3).

4. Ocean circulation and real climates (1969–1988)

Several groups pressed ahead toward more realistic models. By the early 1980s they were using a reasonable facsimile of the Earth's actual geography, and replaced the wet "swamp" surface with an ocean that could exchange heat with the atmosphere. They were coming to believe that their work might be of more than academic interest. When they introduced a doubled CO2 level into their improved models, they consistently found the same few degrees of warming (Manabe & Stouffer, 1980).

Many scientists, especially ones outside the modeling fraternity, were skeptical. The treatment of clouds remained a central uncertainty. Who could say that as the world warmed, clouds would not spread, reflecting sunlight to prevent further warming? Another great unknown was the response of the oceans. Oceanographers were coming to realize that large amounts of energy were carried through the seas by a myriad of whorls of various types, from tiny convection swirls up to sluggish eddies a thousand kilometers wide. Calculating these whorls, like calculating all the world's individual clouds, was beyond the reach of the fastest computer. Again parameters had to be devised to summarize the main effects, only this time for entities that were far worse observed and understood than clouds.

Manabe was keenly aware that if the Earth's future climate were ever to be predicted, it was "essential to construct a realistic model of the joint ocean–atmosphere system" (Manabe, Bryan, & Spelman, 1979, p. 394). He shouldered the task in collaboration with Kirk Bryan, an oceanographer with meteorological training, who had been brought into the group back in 1961 to build a stand-alone numerical model of ocean circulation. Manabe's winds and rain would help drive Bryan's ocean currents, while in return Bryan's sea-surface temperatures and evaporation would help drive the circulation of Manabe's atmosphere. Bryan and Manabe not only incorporated both oceans and atmosphere, but added into the bargain feedbacks from changes in sea ice. Moreover, they included a detailed scheme that represented, region by region, how moisture built up in the soil, or evaporated, or ran off in rivers to the sea. They had enough confidence in their model to undertake a heroic computer run, some 1100 hours long (more than 12 full days of computer time devoted to the atmosphere and 33 to the ocean).

warming (Hansen et al., 1988). By 1988 Hansen had enough confidence to issue a strong public pronouncement, warning of an imminent threat. Meanwhile a major international scientific
They published the results in conference in Toronto issued a call for the world’s governments an unusually short paper (Manabe & Bryan, 1969). to restrict greenhouse gas emissions. The predictions of computer Bryan (1969, p. 822) wrote modestly that ‘‘in one sense they models had become a matter of concern to policymakers—and experiment is a failure’’. For even after a simulated century, the politicians. deep ocean circulation had not nearly reached equilibrium. It was The climate changes computed for doubled CO2 by different not clear what the final climate solution would look like. Yet it GCMs, noted reviewers Schlesinger and Mitchell (1987, p. 795), was a great success just to carry through a linked ocean– ‘‘show many quantitative and even qualitative differences; thus atmosphere computation that was at least starting to settle into we know that not all of these simulations can be correct, and equilibrium. The result looked like a real planet—not our Earth, perhaps all may be wrong’’. Skeptics pointed out that GCMs did for in place of geography there was only a radically simplified not really represent even the present climate successfully from geometrical sketch, but it had ocean currents, trade winds, first principles. Anything slightly unrealistic in the initial data or deserts, rain belts, and snow cover, all in roughly the right places. equations could be amplified a little at each step, and after Unlike our actual Earth, so poorly observed, in the simulation one thousands of steps the entire result usually veered off into could see every detail of how air, water, and energy moved about. something impossible. To get around this, the modelers had kept Following up, in 1975 Manabe and Bryan published results one eye over their shoulder at the real world. They adjusted from the first coupled ocean–atmosphere GCM that had a roughly various parameters, for example, numbers describing cloud Earth-like geography. 
For example, it showed the Sahara and the physics, running the models again and again, ‘‘tuning’’ them until American Southwest as deserts, but plenty of rain in the Pacific the output resembled the real climate. Some of these adjustments Northwest and Brazil. Manabe and Bryan had not shaped their were not calculated from physical principles, nor were they equations deliberately to bring forth such features. These were pinned down precisely by field studies. They were fiddled, within ‘‘emergent features,’’ emerging spontaneously out of the compu- the limits set by the known physics and data, until the model tations. The computer’s output looked roughly like the actual became realistic and stable. But if models were tuned to match climate only because the modelers had succeeded in roughly the current climate, the critics asked, how reliably could they representing the actual operations of the atmosphere upon the calculate a future, different state? Earth’s geography. By 1979, Manabe and Bryan had mobilized Debates over climate models also helped stimulate philoso- enough computer power to run their model through more than a phers of science, who explained that a computer model, like any millennium while incorporating seasons. (Bryan, Manabe, & other embodiment of a set of scientific hypotheses, could never be Pacanowski, 1975; Manabe, Bryan, & Spelman, 1975; Manabe ‘‘proved’’ in the absolute sense one could prove a mathematical et al., 1979). theorem. What models could do was help people sort through Meanwhile a team headed by Warren Washington at the countless ideas and possibilities, offering evidence on which were National Center for Atmospheric Research (NCAR) in Boulder, most plausible. Eventually the models, along with other evidence Colorado developed another ocean model, based on Bryan’s, and and other lines of reasoning, might converge on a representation coupled it to their own quite different GCM. 
It was gratifying that of climate that—if necessarily imperfect, like all human knowl- again the patterns of air temperature, ocean salinity and so forth edge—could be highly reliable (Norton and Suppe, 2001; Oreskes, came out roughly correct, albeit with noticeable deviations from Shrader-Frechette, & Belitz, 1994). the real planet, such as tropics that were too cold (Washington, Undeniably there remained points where the models stood on Semtner, Meehl, Knight, & Mayer, 1980). Through the 1980s, these shaky foundations. For some features, no calculation could be and other teams continued to refine coupled models, occasionally trusted until more observations were made. Even the actual checking how they reacted to increased levels of CO2. These were cloudiness of various regions of the world had been measured in not so much attempts to predict the real climate as experiments only a sketchy fashion. (Until satellite measurements became to work out methods for doing so. available later in the decade, most models used data from the The results, for all their limitations, said something about the 1950s that only gave averages by zones of latitude, and only for predictions of the atmosphere-only GCMs. Including a somewhat the Northern Hemisphere). It did not solve a problem if you realistic ocean did not turn up anything that would alter the basic happened to understand all the physics. As Manabe regretfully prediction of future warming. Evidently simple models were good explained, so much physics was involved in every raindrop that it enough to point in the right direction. On the other hand, even would never be possible to compute absolutely everything. ‘‘And very simple reasoning showed that the models were still even if you have a perfect model which mimics the climate inadequate for good predictions. In particular, nobody had system, you don’t know it, and you have no way of proving it.’’9 actually simulated a climate change. 
Manabe and others had run Reliable progress would require more work on fundamental their model twice to compute two equilibrium states, one with elements, to improve the sub-models that represented clouds, current conditions and one with doubled CO2. In the real world, snow, vegetation, and so forth. Modelers settled into a long grind the atmosphere would pass through a series of changes as the of piecemeal improvements. level of the gas rose. There were hints that a model could end up in different states depending on just what route it took. Hansen’s group and a few others therefore undertook 5. Results for policymakers (1988–2007) protracted computer runs to compute the entire ‘‘transient response’’ while the CO2 level rose. They plodded through a century or more simulating from one day to the next. Hansen’s ‘‘There has been little change over the last 20 years or so in the coupled ocean–atmosphere model, which incorporated not only approaches of the various modeling groups,’’ an observer remarked in 1989.10 Modelers felt they had a basic grasp of the the observed rise of CO2 but also the historical record of aerosols from volcanic explosions, turned out a fair approximation to the observed global temperature trend of the previous half century. 9 Manabe, interview by Weart, December 1989 (AIP). Pushed into the future, the model showed sustained global 10 Dickinson (1989, pp. 101–102). 214 S. Weart / Studies in History and Philosophy of Modern Physics 41 (2010) 208–217 main forces and variations in the atmosphere. Their interest was parameters within the GCMs had created some kind of reliable shifting from representing the current climate ever more precisely connection with reality, the actual planet. to studies of long-term climate change. The research front Incorporating aerosols into GCMs helped to answer a major accordingly moved from calculating stable systems to represent- criticism. 
Skeptics had pointed out that the actual rise in ing the transient response to changes in conditions. Running temperature over the past century had not been as large as models under different conditions, sometimes through simulated would be expected for the known rise of CO2 in the atmosphere, centuries, with rising confidence the teams drew rough sketches given the 3 1C sensitivity that typical models calculated. Try as of how climate could be altered by changes in greenhouse gases. they might, the modelers had not been able to tune their GCMs to Many were now reasonably sure that they knew enough to issue get the modest temperature rise observed. An answer came when stern warnings of future global warming. advanced models put in the history of a global rise of aerosol As GCMs incorporated ever more complexities, modelers pollution. The aerosols had increasingly exerted a cooling effect, needed to work ever more closely with one another and with which had tended to offset part of the greenhouse warming. In people in outside specialties. The clearest case centered around 1995, models at centers in the US, England and Germany all NCAR. It lived up to its name of a ‘‘National Center’’ (in fact an reproduced fairly well the overall trend of 20th-century tempera- international center) by developing what was explicitly a ture changes and even the observed geographical patterns ‘‘Community Climate Model.’’ In 1983 NCAR published all its (Mitchell, Johns, Gregory, & Tett, 1995). This reversed the computer source codes along with a ‘‘Users’ Guide’’ so that significance of the earlier inability of models to reproduce the outside groups could run the model on their own machines. The temperature trend. Apparently the models that had been tuned various outside experiments and modifications in return informed without aerosols had correctly represented a planet without the NCAR group. 
Subsequent versions incorporated many basic aerosols; they had been grounded solidly enough in reality to changes and additional features. NCAR had an exceptionally resist attempts to force them to give a false answer. strong institutional commitment to building a model that could The success in matching the historic temperature trend be run on a variety of computer platforms, but in other ways their powerfully influenced the Intergovernmental Panel on Climate work was not unusual. By now most models used contributions Change (IPCC), a collaboration of experts and officials appointed from so many different sources that they were all in a sense by the world’s governments. In its 1996 reports the IPCC noted ‘‘community’’ models (Kiehl, Hack, & Bonan, 1996, pp. 1–2). that the pattern of geographical and vertical distribution of An important example of massive collaboration was a 1989 atmospheric heating that the models computed for greenhouse study involving groups in the United States, Canada, England, warming was different from the pattern that would result from France, Germany, China, and Japan. Taking 14 models of varying other influences, for example changes in solar activity. The rough complexity, the groups fed each model the same external forces similarity between the computed greenhouse effect’s ‘‘signature’’ and compared the results. The simulated climates agreed well for and the actual record of recent decades backed up the panel’s clear skies. But ‘‘when cloud feedback was included, compatibility official conclusion: ‘‘The balance of evidence suggests that there is vanished.’’ The models varied by as much as a factor of three in a discernible human influence on global climate’’ (IPCC, 1996, p. their sensitivity to the external forces, disagreeing in particular on 22, see chap. 8).11 how far a given increase of CO2 would raise the temperature (Cess The weaselly wording reflected the fact that many people et al., 1989, 1990). 
A few respected meteorologists concluded that found reasons for skepticism. In particular, the models that the modelers’ representation of clouds was altogether useless. coupled atmospheric circulation to a full-scale ocean model, a Most experts nevertheless felt the GCMs were on the right track. type of model that now dominated attention, all tended to drift In the multi-model comparisons, all the results were at least in into unrealistic patterns. The only solution was to tune the models rough overall agreement with reality. to match real-world conditions by adjusting various parameters. These studies were helped greatly by a new capability to set The simplest method was to fiddle with the flux of heat at the their results against a uniform body of world-wide data. Specially interface between ocean and atmosphere in regions where the designed satellite instruments were at last monitoring incoming models gave bad results. Modelers would likewise force transfers and outgoing radiation, cloud cover, and other essential para- of water and so forth, formally violating basic laws of physics to meters. No less important, the sketchy parameterizations in the compensate for their models’ deficiencies. The little community of models were increasingly refined by field studies. Decade by modelers was divided, with some roundly criticizing flux adjust- decade the science community mounted ever larger fleets of ments as ‘‘fudge factors’’ that could bring whatever results a ships, aircraft and balloons to observe actual processes in clouds modeler sought (Dahan-Dalmedico, 2007, p. 142; Shackley, and other key features of the climate system. The instrumental Risbey, Stone, & Wynne, 1999). If the models were arbitrarily systems were increasingly oriented toward producing numbers forced to match the present climate, why believe they could tell meaningful to the models, and vice-versa; global data and global us anything at all about a different situation? 
But by 1999 two models were no longer distinct entities, but parts of a single computer groups managed to do away with flux adjustments system for representing the world (Edwards, 1999, 2010). while running their models through centuries. Their results were There was also progress in building aerosols into climate not severely different from the results of the earlier flux-adjusted models. When Mount Pinatubo erupted in the Philippines in June models. Evidently the tuning had not been a fatal cheat. Models 1991, sharply increasing the amount of sulfuric acid haze in the without flux adjustments soon became standard (Carson, 1999, stratosphere world-wide, Hansen’s group declared that ‘‘this pp. 13–17; Kerr, 1997). volcano will provide an acid test for global climate models.’’ Further raising confidence, in 2001 two groups using coupled Running their model with the new data, they predicted a models matched the rise of temperature that had been detected in noticeable cooling for the next couple of years (Hansen, Lacis, the upper layers of the world’s oceans. They got a good match Ruedy, & Sato, 1992, p. 218). By 1995 the correlation between only by putting in the rise of greenhouse gases. By 2005, computer their predictions and the actual temperatures at different levels of the atmosphere were seen to be ‘‘highly significant and very striking’’ (Carson, 1999, p. 10). The ability of modelers to not only 11 The ‘‘signature’’ or ‘‘fingerprint’’ method was pioneered by Klaus Hassel- reproduce but predict Pinatubo’s effects gave scientists a mann’s group at the Max Planck Institute, e.g., Hasselmann (1993). See also Santer particularly strong reason for believing that adjusting the et al. (1996). S. 
Weart / Studies in History and Philosophy of Modern Physics 41 (2010) 208–217 215 modelers had advanced far enough to declare that temperature powerful than the most expensive computers of earlier decades, it measurements in every ocean basin over the previous four was possible to explore thousands of combinations of parameters. decades gave a detailed, unequivocal ‘‘signature’’ of the green- But no matter how people fiddled with climate models, the house effect. Nothing but the rise of greenhouse gases could answer was the same. If your model could reproduce something produce such a warming pattern, not the observed changes in the resembling the present climate, and then you added some Sun’s radiation, emissions from volcanoes, or any other proposed greenhouse gases, the model showed serious global warming ‘‘natural’’ mechanism (Barnett et al., 2005). (Stainforth et al., 2005). Yet if modelers now understood how the climate system could By the dawn of the 21st century, climate models had become a change and even how it had changed, they were far from saying crucial source of information for policy-makers and the public. precisely how it would change in future. Never mind how the Struggling to provide better predictions, the community of mean global temperature would rise; citizens and policy-makers modelers grew still larger and better organized. Projects to wanted to know what heat waves, droughts or floods were likely compare the models devised by different groups became a major in their particular region. The attention of the community turned ongoing activity. By 2007, groups were exchanging so much to making predictions in ever more geographical detail. data that it would have taken years to transfer it on the internet, Such detail was unattainable without further refinements. For and they took to shipping it on terabyte hard drives. 
There were example, ocean–atmosphere GCMs had to be linked interactively about a dozen major teams now and a dozen more that could with models for changes in vegetation. Dark forests and bright make significant contributions. The pictures of overall global deserts not only responded to climate, but influenced it. Since the warming that their GCMs computed were converging. It early 1990s the more advanced numerical models, for weather was largely thanks to their work that, as the editor of Science prediction as well as climate, had incorporated descriptions of magazine announced, a ‘‘consensus as strong as the one that has such things as the way plants take up water through their roots developed around this topic is rare in the history of science’’ and evaporate it into the atmosphere. Changes that pollution (Kennedy, 2001). caused in the chemistry of the atmosphere also had to be Each computer modeling group normally worked in a cycle. incorporated, for these influenced cloud formation and more. When their model began to look outdated, and still more if they Over longer time scales, modelers would also need to consider managed to acquire a new supercomputer, they would go back to changes in ocean chemistry, ice sheets, ecosystems, and so forth. basics and spend a few years developing a new model. It was no When people talked now of a ‘‘GCM’’ they no longer meant a simple process, for introducing a new wrinkle (for example, a new ‘‘General Circulation Model,’’ built from the traditional equations way to calculate convection in the tropics) might introduce for weather. ‘‘GCM’’ now stood for ‘‘Global Climate Model’’ or even unexpected feedbacks that caused the entire model to crash. Once ‘‘Global Coupled Model,’’ incorporating many things besides the a team had persuaded their model to produce stable results that circulation of the atmosphere. 
These simulations strained the looked like the real world, they would spend the next year or two resources of the newest and biggest supercomputers, some of using it to analyze climate processes, gathering ideas for the next which were built with climate modeling primarily in mind. cycle. The range of modelers’ predictions of global warming for a After finishing their part of the IPCC’s 2001 report, the doubling of CO2 was still around 3 1C give or take 50%. If the real modeling community worked to synchronize the teams’ separate sensitivity was at the lower limit of what seemed reasonably cycles. By early 2004, nearly all the major models simultaneously possible, the world would have ample time to adjust its economy reached the analysis stage. That made it possible for 17 teams to to the gathering threat; if at the upper limit, only prompt and share and compare data in time to produce results for the 2007 vigorous efforts could ward off catastrophe. The ineradicable IPCC report (Schmidt, 2008). uncertainty was caused largely by ignorance of what would The work was grueling. After a group had invested so much of happen to clouds as the world warmed. Among other worries, their time, energy, and careers in their model, they could become when modelers tried to reproduce climates of the distant past, reluctant to admit its shortcomings to outsiders and perhaps even such as the Ice Ages, the results looked reasonably good to themselves. A frequent result was ‘‘prolonged and acrimonious overall—but failed miserably to get good matches for some fights in which model developers defended their models and regions. Evidently forces now unknown could come into play in engaged in serious conflicts with colleagues’’ over whose climates very different from the present one. approach was best (Lahsen, 2005, p. 916). 
Yet in the end they For a climate pretty much like the present, however, all the found common ground, working out a range of numbers that all significant mechanisms must have gotten incorporated somehow agreed were plausible. into the parameters. At worst, the models were all getting right The numbers got an independent check. Other scientists were results for wrong reasons—flaws that would only show up after studying various kinds of evidence of past climates back to the ice greenhouse gases pushed the climate beyond any conditions that ages, indeed back to the Cretaceous era and earlier still, lining up the models were designed to reproduce. If there were such deep- ancient temperatures with greenhouse gas levels and other set flaws, that did not mean, as some skeptics implied, that there influences such as volcanic aerosols. By 2006 they had arrived was no need to worry about global warming. If the models were at fairly consistent numbers for how the mean global temperature faulty, the future climate changes could be worse than they connected with the level of CO2. The numbers agreed comfortably predicted, not better. For all the millions of hours the modelers with the IPCC computer modelers’ sensitivity consensus of 3 1C, had devoted to their computations, in the end they could not say give or take a couple of degrees (Annan & Hargreaves, 2006; exactly how serious future global warming would be. They could Hegerl, Crowley, Hyde, & Frame, 2006). As one group that worked only say that it was very likely to be bad, and it just might be an with ancient climates explained, ‘‘one is more likely to accept appalling catastrophe. something as the truth when it emerges ‘as the intersection of Those who still denied any serious risk of climate change could independent lies’. Although ‘lies’ may sound a bit too harsh for the not reasonably dismiss computer modeling in general. 
That would models involved, both our approach and the large simulation throw away much of the past few decades’ work in many fields of models clearly have their shortcomings. Interpreting our results science and engineering, and even key business practices. The in this spirit, they enhance the credibility’’ of global warming challenge to them was to produce a simulation that did not show projections (Scheffer, Brovkin, & Cox, 2006, p. L10706, slightly global warming. Now that personal computers were far more misquoting Levins, 1966, p. 423). 216 S. Weart / Studies in History and Philosophy of Modern Physics 41 (2010) 208–217

As the charts of model outputs increasingly came to look like actual observed data, it became easy to forget which was which. As one modeler put it, "You start referring to your simulated ocean as 'the ocean'—you know, 'the ocean gets warm'… there is a tendency to forget that just because your model says x, y, or z doesn't mean that that's going to happen in the real world" (Lahsen, 2005, p. 909). But it was small wonder if modelers were giving the products of their labor a new kind of ontological status. The modelers had long since reproduced not only simple geographical and seasonal averages from December to July and back, but also the spectrum of random regional and annual fluctuations in the averages—indeed it was now a test of a good model that a series of runs showed a variability similar to the real weather. Modelers had matched the record of climate changes for the past century. Exploring unusual conditions, they had reproduced the effects of volcanic eruptions and the ice ages. Above all, since the 1980s modelers had been predicting that a rise of mean global temperature would emerge from the "noise" of random climate shifts around the start of the 21st century; the prediction was fulfilled by a rise ominously faster and higher than anything in the historical record.

The models unanimously predicted much greater warming to come. In IPCC meetings virtually all the world's climate experts and representatives of nearly all the world's governments hammered out a consensus. Our civilization would probably face serious trouble, they announced unanimously in 2001—and again, more emphatically, in 2007—if we did not promptly begin to reduce our emissions of greenhouse gases. It was a triumph of rationality utterly without historical precedent: a drive to severely revise the entire world economy, relying chiefly on simulations whose workings only a few hundred experts actually understood.

References

Annan, J. D., & Hargreaves, J. C. (2006). Using multiple observationally-based constraints to estimate climate sensitivity. Geophysical Research Letters, 33, L06704. doi:10.1029/2005GL025259.
Arakawa, A. (1970). Numerical simulation of large-scale atmospheric motions. In Numerical solution of field problems in continuum physics (SIAM-AMS conference) (Vol. 2, pp. 24–40). Providence, RI: American Mathematical Society.
Arakawa, A., Venkateswaran, S. V., & Wurtele, M. G. (1994). Yale Mintz, Atmospheric Sciences: Los Angeles. In University of California: In Memoriam, 1993. Oakland, CA: University of California Senate. Online at http://sunsite.berkeley.edu:2020/dynaweb/teiproj/uchist/inmemoriam/inmemoriam1993/.
Arakawa, A., & Schubert, W. H. (1974). Interaction of a cumulus cloud ensemble with the large-scale environment, Part I. Journal of Atmospheric Sciences, 31, 674–701.
Barnett, T. P., Pierce, D. W., Achuta Rao, K. M., Gleckler, P. J., Santer, B. D., et al. (2005). Penetration of human-induced warming into the world's oceans. Science, 309, 284–287. doi:10.1126/science.1112418.
Bryan, K. (1969). Climate and the ocean circulation. III. The ocean model. Monthly Weather Review, 97, 806–827.
Bryan, K., Manabe, S., & Pacanowski, R. C. (1975). A global ocean–atmosphere climate model. Part II. The oceanic circulation. Journal of Physical Oceanography, 5, 30–46.
Carson, D. J. (1999). Climate modeling: Achievement and prospects. Quarterly Journal of the Royal Meteorological Society, 125, 1–27.
Cess, R. D., Potter, G. L., Blanchet, J. P., Boer, G. J., & Ghan, S. J. (1989). Interpretation of cloud-climate feedback as produced by 14 atmospheric general circulation models. Science, 245, 513–516.
Cess, R. D., Potter, G. L., Blanchet, J. P., Boer, G. J., Del Genio, A. D., et al. (1990). Intercomparison and interpretation of climate feedback processes in 19 atmospheric general circulation models. Journal of Geophysical Research, 95, 16601–16615.
Charney, J. G. (1949). On a physical basis for numerical prediction of large-scale motions in the atmosphere. Journal of Meteorology, 6, 371–385.
Charney, J. G., & Eliassen, A. (1949). A numerical method for predicting the perturbations of the middle latitude westerlies. Tellus, 1, 38–54.
Charney, J. G., Fjørtoft, R., & von Neumann, J. (1950). Numerical integration of the barotropic vorticity equation. Tellus, 2, 237–254.
Dahan-Dalmedico, A. (2001). History and epistemology of models: Meteorology (1946–1963) as a case study. Archive for History of Exact Sciences, 55, 395–422.
Dahan-Dalmedico, A. (2007). Models and simulations in climate change. Historical, epistemological, anthropological and political aspects. In A. N. H. Creager, E. Lunbeck, & M. N. Wise (Eds.), Science without laws: Model systems, cases, exemplary narratives (p. 123). Durham, NC: Duke University Press.
Dickinson, R. E. (1989). Use of numerical models to project greenhouse gas-induced warming in polar regions. In National Research Council, Commission on Physical Sciences, Mathematics and Resources, Board on Atmospheric Sciences and Climate, Committee on Global Change (Ed.), Ozone depletion, greenhouse gases, and climate change: Proceedings of a joint symposium (pp. 98–102). Washington, DC: National Academy Press.
Edwards, P. N. (1999). Global climate science, uncertainty and politics: Data-laden models, model-filtered data. Science as Culture, 8, 437–472.
Edwards, P. N. (2000). A brief history of atmospheric general circulation modeling. In D. A. Randall (Ed.), General circulation model development (pp. 67–90). San Diego, CA: Academic Press.
Edwards, P. N. (2010). A vast machine: Computer models, climate data, and the politics of global warming. Cambridge, MA: MIT Press.
Friedman, R. M. (1989). Appropriating the weather: Vilhelm Bjerknes and the construction of a modern meteorology. Ithaca, NY: Cornell University Press.
Fultz, D. (1949). A preliminary report on experiments with thermally produced lateral mixing in a rotating hemispheric shell of liquid. Journal of Meteorology, 6, 17–33.
Fultz, D. (1952). On the possibility of experimental models of the polar-front wave. Journal of Meteorology, 9, 379–384.
GARP (National Academy of Sciences, United States Committee for the Global Atmospheric Research Program) (1975). Understanding climatic change: A program for action. Washington, DC; Detroit, MI: National Academy of Sciences; Grand River Books.
Hansen, J. E., Fung, I., Lacis, A., Rind, D., Lebedeff, S., et al. (1988). Global climate changes as forecast by Goddard Institute for Space Studies 3-dimensional model. Journal of Geophysical Research, 93, 9341–9364.
Hansen, J. E., Lacis, A., Ruedy, R. A., & Sato, M. (1992). Potential climate impact of Mount Pinatubo eruption. Geophysical Research Letters, 19, 215–218.
Hansen, J. E., Russell, G., Rind, D., Stone, P., Lacis, A., et al. (1983). Efficient three-dimensional global models for climate studies: Models I and II. Monthly Weather Review, 111, 609–662.
Hasselmann, K. (1993). Optimal fingerprints for the detection of time dependent climate change. Journal of Climate, 6, 1957–1971.
Hegerl, G., Crowley, T. J., Hyde, W. T., & Frame, D. J. (2006). Climate sensitivity constrained by temperature reconstructions over the past seven centuries. Nature, 440, 1029–1032. doi:10.1038/nature04679.
IPCC (Intergovernmental Panel on Climate Change) (1996). Climate change 1995: The science of climate change. Contribution of Working Group I to the second assessment report of the Intergovernmental Panel on Climate Change. Cambridge: Cambridge University Press. Online at http://www.ipcc.ch/pub/reports.htm.
Keeling, C. D. (1960). The concentration and isotopic abundances of carbon dioxide in the atmosphere. Tellus, 12, 200–203.
Keeling, C. D. (1970). Is carbon dioxide from fossil fuel changing man's environment? Proceedings of the American Philosophical Society, 114, 10–17.
Kellogg, W. W., & Schneider, S. H. (1974). Climate stabilization: For better or for worse? Science, 186, 1163–1172.
Kennedy, D. (2001). An unfortunate U-turn on carbon. Science, 291, 2515.
Kerr, R. A. (1997). Model gets it right—without fudge factors. Science, 276, 1041.
Kiehl, J. T., Hack, J. J., & Bonan, G. B. (1996). Description of the NCAR Community Climate Model (CCM3). NCAR Technical Note TN-420+STR. Boulder, CO: National Center for Atmospheric Research. Online at http://www.cgd.ucar.edu/cms/ccm3/history.shtml.
Kutzbach, J. E. (1996). Steps in the evolution of climatology: From descriptive to analytic. In J. R. Fleming (Ed.), Historical essays on meteorology 1919–1995 (pp. 353–377). Boston: American Meteorological Society.
Lahsen, M. (2005). Seductive simulations? Uncertainty distribution around climate models. Social Studies of Science, 35, 895–922. doi:10.1177/0306312705053049.
Levins, R. (1966). The strategy of model building in population biology. American Scientist, 54, 421–431.
Lorenz, E. N. (1967). The nature and theory of the general circulation of the atmosphere. Geneva: World Meteorological Organization.
Manabe, S., & Bryan, K. (1969). Climate calculations with a combined ocean–atmosphere model. Journal of Atmospheric Sciences, 26, 786–789.
Manabe, S., Bryan, K., & Spelman, M. J. (1975). A global ocean–atmosphere climate model: Part I. The atmospheric circulation. Journal of Physical Oceanography, 5, 3–29.
Manabe, S., Bryan, K., & Spelman, M. J. (1979). A global ocean–atmosphere climate model with seasonal variation for future studies of climate sensitivity. Dynamics of Atmospheres and Oceans, 3, 393–426.
Manabe, S., Hahn, D. G., & Holloway, J. L., Jr. (1974). The seasonal variation of the tropical circulation as simulated by a global model of the atmosphere. Journal of Atmospheric Sciences, 31, 43–83.
Manabe, S., Smagorinsky, J., & Strickler, R. F. (1965). Simulated climatology of general circulation with a hydrologic cycle. Monthly Weather Review, 93, 769–798.
Cressman, G. P. (1996). The origin and rise of numerical weather prediction. In J. R. Manabe, S., & Stouffer, R. J. (1980). Sensitivity of a global climate model to an

Fleming (Ed.), Historical essays on meteorology 1919–1995, 21–39). Boston: increase of CO2 concentration in the atmosphere. Journal of Geophysical American Meteorological Society. Research, 85, 5529–5554. S. Weart / Studies in History and Philosophy of Modern Physics 41 (2010) 208–217 217

Manabe, S., & Wetherald, R. T. (1967). Thermal equilibrium of the atmosphere with Richardson, L. F. (1922). Weather prediction by numerical process. Cambridge: a given distribution of relative humidity. Journal of Atmospheric Sciences, 24, Cambridge University Press. (reprinted NY, Dover, 1965). 241–259. Santer, B. D., Taylor, K. E., Wigley, T. M. L., Johns, T. C., Jones, P. D., et al. (1996).

Manabe, S., & Wetherald, R. T. (1975). The effects of doubling the CO2 A search for human influences on the thermal structure of the atmosphere. concentration on the climate of a general circulation model. Journal of Nature, 382, 39–46. Atmospheric Sciences, 32, 3–15. Scheffer, M., Brovkin, V., & Cox, P. (2006). Positive feedback between global warming Mintz, Y. (1965). Very long-term global integration of the primitive equations of and atmospheric CO2 concentration inferred from past climate change. atmospheric motion. In World Meteorological Organization (Ed.), WMO-IUGG Geophysical Research Letters, 3, L10702. doi:10.1029/12005GL025044,022006. symposium on research and development aspects of long-range forecasting, Schlesinger, M. E., & Mitchell, J. F. B. (1987). Climate model simulations of the Boulder, CO, 1964. (WMO Technical Note No. 66) (pp. 141–155). Geneva: World equilibrium response to increased carbon dioxide. Reviews of Geophysics, 25, Meteorological Organization. 760–798. / Mintz, Y., Katayama, A., & Arakawa, A. (1972). Numerical simulation of the seasonally Schmidt, G. (2008). The IPCC model simulation archive. Realclimate.org http:// S and inter-annually varying tropospheric circulation. In A. E. Barrington (Ed.), realclimate.org/index.php?p=520 , posted February 4. Climatic impact assessment program. Proceedings (pp. 194–216). Cambridge, MA: Shackley, S., Risbey, J., Stone, P., & Wynne, B. (1999). Adjusting to policy U.S. Department of Transportation. expectations in climate change modeling. An interdisciplinary study of flux adjustments in coupled atmosphere–ocean general circulation models. Mitchell, J. F. B., Johns, T. J., Gregory, J. M., & Tett, S. B. F. (1995). Climate response to Climatic Change, 43, 413–454. doi:10.1023/A:1005474102591. increasing levels of greenhouse gases and sulphate aerosols. Nature, 376, Smagorinsky, J. (1963). General circulation experiments with the primitive 501–504. equations. I. The basic experiment. 
Monthly Weather Review, 41, 99–164. National Academy of Sciences, Climate Research Board (1979). Carbon dioxide and Smagorinsky, J. (1970). Numerical simulation of the global circulation. In G. A. climate: A scientific assessment (Jule Charney, Chair). Washington, DC: National Corby (Ed.), Global circulation of the atmosphere (pp. 24–41). London: Royal Academy of Sciences. Meteorological Society. Nebeker, F. (1989). The 20th-century transformation of meteorology. Unpublished Smagorinsky, J. (1983). The beginnings of numerical weather prediction and general Ph.D. thesis, Princeton University, Princeton, NJ. circulation modeling: Early recollections. Advances in Geophysics, 25, 3–37. Nebeker, F. (1995). Calculating the weather: Meteorology in the 20th century. New Smagorinsky, J., Manabe, S., & Holloway, J. L., Jr. (1965). Numerical results from a York: Academic Press. nine-level general circulation model of the atmosphere. Monthly Weather Norton, S. D., & Suppe, F. (2001). Why atmospheric modeling is good science. In Review, 93, 727–768. C. A. Miller, & P. N. Edwards (Eds.), Changing the atmosphere. Expert knowledge Stainforth, D. A., Aina, T., Christensen, C., Collins, M., Faull, N., et al. (2005). and environmental governance (pp. 67–105). Cambridge, MA: MIT Press. Uncertainty in predictions of the climate response to rising levels of Oreskes, N., Shrader-Frechette, K., & Belitz, K. (1994). Verification, validation, greenhouse gases. Nature, 433, 403–406. and confirmation of numerical models in the earth sciences. Science, 263, Washington, W. M., Semtner, A. J., Jr., Meehl, G. A., Knight, D. J., & Mayer, T. A. 641–646. (1980). A general circulation experiment with a coupled atmosphere, ocean Phillips, N. (1956). The general circulation of the atmosphere: A numerical and sea ice model. Journal of Physical Oceanography, 10, 1887–1908. experiment. Quarterly Journal Royal Meteorological Society, 82, 123–164. Weart, S. R. (2008). 
The discovery of global warming ((2nd ed.). Cambridge, MA: Platzman, G. W. (1979). The ENIAC computations of 1950—Gateway to numerical Harvard University Press. weather prediction. Bulletin of the American Meteorological Society, 60, Weart, S. R. (2009). The discovery of global warming (rev. ed.) /http://www.aip. 302–312. org/history/climate/S.