S&TR December 2017

Smoke from the 2016 Soberanes Fire in Monterey County, California—the costliest wildfire up to that time—begins to block out the Milky Way. A warming climate makes droughts more likely and such fires more frequent and larger in scale. (Photo by Li Liu, M.D.)


The Atmosphere around Climate Models

Supercomputers, the laws of physics, and Lawrence Livermore’s nuclear weapons research all interact to advance atmospheric science and climate modeling.

Since the 1960s, computer models have been ensuring the safe return of astronauts from orbital and lunar missions by carefully predicting complicated spacecraft trajectories. A slight miscalculation could cause a craft to zoom past the Moon or Earth and become lost in space, or approach too steeply and face an equally disastrous outcome. Bruce Hendrickson, Livermore's associate director for Computation, points out, "In the 1960s, scientists and engineers put people on the Moon with less computing power than we carry in our pockets now, whereas today's advanced computers allow us to study phenomena vastly more complex than orbital dynamics." That computational power offers unprecedented insight into how the physical world works, providing details about phenomena that would be infeasible to study with physical experiments. At Lawrence Livermore, numerical models running on high-performance computers are a vital part of research in many programs, including stockpile stewardship and climate studies.

In fact, Livermore's climate models trace their origins to the Laboratory's initial development of codes to simulate nuclear weapons. Hendrickson states, "Our primary mission is nuclear weapons design, which has required us to create unique computational capabilities. These capabilities have also been applied to other national needs, including modeling the atmosphere and the rest of the climate system."


Over the years, advances in scientific understanding and increased computational power have resulted in higher fidelity climate models that are more representative of the real world. These computationally intense simulations have also helped shake down and benchmark subsequent generations of the Department of Energy's (DOE's) supercomputers before the machines transition to classified work. "Climate simulation is an application that can consume the whole machine and put it through its paces in a very demanding way," explains Hendrickson. "The simulations touch every part of the computer." (See S&TR, July/August 2015, pp. 3–14.)

On July 12, 2017, a 5,801-square-kilometer piece of the Larsen C ice shelf broke away from Antarctica, as shown in the satellite image. The Energy Exascale Earth System Model (E3SM) is one of the first to simulate the movement and evolution of glaciers and ice sheets. (Image courtesy of NASA.)

First Atmospheric Animation

From its inception, the Laboratory pursued numerical approaches to solving problems using cutting-edge computer systems. "Livermore went all-in with computers," says Glenn Fox, associate director for Physical and Life Sciences. "When the Laboratory's doors opened, the first big procurement was a state-of-the-art computer." The room-sized Univac-1 had 5,600 vacuum tubes and 9 kilobytes of memory and ran at a speed of 1,000 floating-point operations per second (flops). In 2018, Livermore's Sierra supercomputer will use more than 1 million microprocessors to achieve a speed of 150 petaflops—150 trillion times faster than the Univac-1.

Even in the Laboratory's early days, researchers understood that the same computational approaches for simulating nuclear weapons could be applied to better simulate evolution of the weather and for applications such as tracking releases of radioactive and other hazardous materials. In the late 1950s, Livermore scientist Cecil "Chuck" Leith developed one of Livermore's first-ever numerical models capable of simulating the hydrodynamic and radiative processes in a thermonuclear explosion. Recognizing fundamental similarities in the underlying equations and interested in demonstrating what could be achieved with more powerful computing, Leith turned his attention to creating more comprehensive weather system models. Michael MacCracken, a now-retired climate scientist who headed Livermore's atmospheric and geophysical sciences division from 1987 to 1993, came to the Laboratory as one of Leith's graduate students. MacCracken says, "Using the most advanced computers available in the early 1960s, Leith developed an atmospheric model that was way ahead of its time."

Leith's Livermore Atmospheric Model (LAM) divided the atmosphere into a three-dimensional (3D) mesh with six vertical layers and a horizontal grid with five-degree intervals in latitude and longitude. LAM was the world's first global atmospheric circulation model that calculated temperature, winds, humidity, clouds, precipitation, the day-and-night cycle, and weather systems around the globe, all starting from first-principles equations for the conservation of mass, momentum, energy, and water vapor. Leith also created the first animation of atmospheric modeling results by colorizing photographs of a black-and-white video screen and stitching them together into a film.
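Those figures lend themselves to a quick back-of-the-envelope check. The short Python sketch below is purely illustrative, using only the numbers quoted above (a five-degree, six-layer mesh and the quoted Univac-1 and Sierra speeds); it says nothing about how either machine or model actually organizes its data.

```python
# Back-of-the-envelope arithmetic using only figures quoted in the article.
# Illustrative approximations, not the models' actual data structures.

# LAM: horizontal grid at 5-degree intervals, six vertical layers.
lon_points = 360 // 5           # 72 grid points around each latitude circle
lat_points = 180 // 5 + 1       # 37 latitude rows from pole to pole
layers = 6
lam_cells = lon_points * lat_points * layers
print(f"LAM grid cells: about {lam_cells:,}")   # roughly 16,000 cells

# Univac-1 versus Sierra, as quoted: 1,000 flops versus 150 petaflops.
univac_flops = 1.0e3
sierra_flops = 150.0e15
print(f"Sierra/Univac-1 speedup: {sierra_flops / univac_flops:.1e}")  # 1.5e+14, i.e., 150 trillion
```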


Climate models divide Earth into a grid with vertical and horizontal intervals. The smaller the intervals, the finer the grid, and the better the resolution of the model—that is, the greater the detail the model can produce. (Image courtesy of the National Oceanic and Atmospheric Administration.)

Leith's atmospheric work also benefited other Livermore programs. For example, his study of atmospheric turbulence led to a better understanding of how to represent turbulence and turbulent flows. MacCracken adds, "Although simulations of astrophysics, plasma physics, and nuclear weapons address different temperatures, pressures, and timescales, it's all the same basic physics. So computational advances in one area benefit the others and vice-versa."

The Livermore Atmospheric Model (LAM), developed by Cecil "Chuck" Leith in the 1960s, was the first-ever global climate model to be animated. Shown are screen captures of runs for pressure and precipitation (top) and temperature (bottom).

Ozone and Nuclear Winter

As environmental awareness rose in the late 1960s, Laboratory programs began to address regional and global environmental problems. Derived from LAM, an early climate model developed by MacCracken was used to analyze hypotheses about the causes of ice age cycles, the effects of volcanic eruptions and changes in land cover, and the consequences of changes in atmospheric composition. The pioneering LAM would eventually lead to the global climate models that today also encompass interactive representations of the oceans, land surfaces, ice masses, and biological activity in the oceans and on land.

Parallel to these efforts, atmospheric chemistry models were also developed. The first such model contributed to a successful plan to limit rising concentrations of photochemical smog in the San Francisco Bay Area. The second model simulated stratospheric chemistry and was used to calculate the impact of a proposed fleet of supersonic transport aircraft on stratospheric ozone. This modeling also investigated the potential for ozone depletion from atmospheric nuclear testing back in the early 1960s and the much larger depletion that would result from a global nuclear war with megaton-yield nuclear weapons.

When chlorofluorocarbon (CFC) emissions from aerosol spray cans, refrigerators, and other sources came under scrutiny in the 1970s, the stratospheric chemistry model was applied to evaluate the ozone depletion potential and develop a metric for calculating depletion that was later used in the Montreal Protocol to regulate CFC emissions. After restrictions were put in place in 1987, growth of the continent-size hole in the ozone layer slowed, and after further international agreements, it eventually stopped growing and began to slowly shrink.

In the mid-1980s, famed astrophysicist Carl Sagan and others suggested the specter of a "nuclear winter"—that the blasts and fires from a global nuclear war could loft enough smoke and other matter into the atmosphere to obscure sunlight for months, causing a global vegetation die-off and a winterlike cooling of the entire planet that could kill billions of people. In 1945, before he cofounded Lawrence Livermore and later became its director from 1958 to 1960, Edward Teller had made the critical determination at Los Alamos that a nuclear explosion would not ignite the atmosphere. Now he questioned the severity of a nuclear winter. Climate scientist Curt Covey, who retired from Livermore in 2017, remembers Teller saying, "At Livermore, we have the best computers. Surely we can do the best job in simulations."


In response, Livermore used its modeling capabilities to investigate the global effects of nuclear winter and found that although significant cooling would occur depending on the amount of smoke lofted, the effects would be less severe than initially conjectured.

Building on these wide-ranging activities, Livermore was well positioned to simulate and better understand the effects on the climate of increasing concentrations of carbon dioxide (CO2) and other greenhouse gases. Fox notes, "Although assessing the impact of human activities on the environment was not part of the Laboratory's original charter, as our capabilities developed, we stepped into a broader program that has played an important role for the country."

Chernobyl and Tracking Releases

In 1986, the worst nuclear accident in history occurred at the Soviet Union's Chernobyl power plant in Ukraine. A partial meltdown of the reactor's core resulted in a massive explosion and open-air fires that belched radioactive material into the atmosphere for days. Livermore's National Atmospheric Release Advisory Center (NARAC) had been created by DOE nearly a decade earlier as an emergency response asset and was part of DOE's 1979 response to the radioactive release at the Three Mile Island reactor in Pennsylvania. Immediately after the Chernobyl accident, NARAC worked with subject-matter experts both inside and outside the Laboratory, quickly connecting its local and regional dispersion models to global meteorological models to estimate where the plume and fallout from Chernobyl would spread. The models, validated with measurements from different countries, helped to provide a better understanding of the impacts of the release and possible protective measures. This included analysis of the potential threats to the milk supply.

After Chernobyl, NARAC's responsibilities were expanded to a global scale to better safeguard the nation and the world. Responses to many national and international incidents notably include the 2011 power plant accident in Fukushima, Japan. Today, NARAC is expanding its high-resolution atmospheric and transport models to span spatial scales from the worldwide transport of radiological materials to dispersion down city streets from, say, a radiological dispersal device or an accident at a chemical plant. These models incorporate highly resolved terrain and meteorological information and are used to prepare for a wide range of release scenarios, including large fires or chemical spills, incidents involving weapons of mass destruction, and nuclear power plant failures. Lee Glascoe, program leader for NARAC, says, "When we are alerted to a hazardous release, we work quickly with DOE using NARAC's atmospheric modeling capabilities to provide decision makers with predictions of hazards associated with the plume dispersal to help protect workers and the public."
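NARAC's operational tools couple detailed terrain, buildings, and real-time meteorology, so the sketch below is only a conceptual stand-in: it evaluates the classic textbook Gaussian plume formula for a continuous point release, with hypothetical source and wind values, to show the kind of quantity a dispersion calculation produces. It is not NARAC's method or code.

```python
import math

def gaussian_plume(q_g_per_s, wind_m_per_s, x_m, y_m, z_m, stack_height_m):
    """Ground-reflected Gaussian plume concentration (g/m^3) for a steady point source.

    The spread parameters below are simple power laws of downwind distance with
    roughly neutral-stability magnitudes; real assessments use stability-class
    tables, terrain, and measured winds.
    """
    sigma_y = 0.08 * x_m / math.sqrt(1.0 + 0.0001 * x_m)   # crosswind spread, meters
    sigma_z = 0.06 * x_m / math.sqrt(1.0 + 0.0015 * x_m)   # vertical spread, meters
    lateral = math.exp(-y_m**2 / (2.0 * sigma_y**2))
    vertical = (math.exp(-(z_m - stack_height_m)**2 / (2.0 * sigma_z**2)) +
                math.exp(-(z_m + stack_height_m)**2 / (2.0 * sigma_z**2)))  # image term = ground reflection
    return q_g_per_s / (2.0 * math.pi * wind_m_per_s * sigma_y * sigma_z) * lateral * vertical

# Hypothetical release: 100 g/s from a 30-meter stack in a 4 m/s wind.
# Ground-level concentration 2 kilometers directly downwind:
print(gaussian_plume(100.0, 4.0, 2000.0, 0.0, 0.0, 30.0), "g/m^3")
```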
Climate Model Intercomparisons

Over the decades, computational advances have allowed more components of the climate system to be combined into a single model. Previously, combining components such as the atmosphere, land surfaces, oceans, sea ice, aerosols, and the carbon cycle into one model was far too complex. In 1989, Livermore took the lead in an international program designed to evaluate and learn from the increasing number of climate models being developed by leading scientific organizations around the world. This effort, the Program for Climate Model Diagnosis and Intercomparison (PCMDI), was announced in a press release by Bruce Tarter, then associate director for Physics and later Laboratory director from 1994 to 2002. Tarter stated, "The greenhouse effect is of tremendous global concern. It is essential that the policymakers in the U.S. and internationally have the necessary tools to address it. Our new program will enable us to improve the scientific tool that is of extraordinary value in this effort—the computer model." (See S&TR, June 2012, pp. 4–12.)

Climate models are grounded in the laws of physics, and their simulations of the historical climate are carefully compared to available global observations. This grounding allows researchers to assess many kinds of "what if" climate change scenarios.

An atmospheric chemistry model jointly developed by Livermore and NASA predicts recovery of the ozone hole between 2000 and 2029. The left-hand panels show the extent of the ozone hole (blue shows minimum ozone), while the right-hand panels show the amount of ozone-destroying chlorine monoxide (peaking in red). Actual data (not shown) confirmed that once steps to reduce ozone-depleting gases were taken, the hole in the ozone layer stopped growing and began shrinking.


Tom Phillips has worked as a PCMDI climate scientist for decades and recognizes that the complexity of climate models can make them difficult to understand, which in turn has led to criticism. For instance, climate scientists have been accused of tweaking the models to produce desired outcomes. Phillips denies the claim, explaining, "We look at the whole system as manifested in different aspects, such as the variation in global temperature, the hydrological cycle, and atmospheric circulation. Even if it were possible to tweak parameters to achieve specific results, the tweak would affect other results, making it very apparent that something was amiss." Parameters in the models are deeply embedded in the very equations describing physical processes, with values set according to the physics being represented. Modeling results—say, rainfall sensitivity to CO2—emerge only after the equations have been solved over the range of time covered by the models. Furthermore, results must be consistent with extensive observed data.

For almost three decades, PCMDI has been closely examining and comparing the results of climate models with observed changes in the climate system. If climate model results differ from observations or other models, scientists use this difference as an opportunity for learning. Ongoing testing and intercomparison can thus lead to improvement of all models. Although differences in modeling approaches will lead to some degree of variation among climate models and differences with normal variation of weather, multiple runs from multiple models usually reveal consistent trends when the results are combined into an ensemble.
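That ensemble idea is easy to make concrete. The sketch below uses made-up numbers standing in for model output: each simulated run shares the same underlying warming trend but carries its own year-to-year variability, and averaging across runs recovers the common signal.

```python
import random

random.seed(0)
years = list(range(2000, 2021))

def fake_run(trend_per_year=0.02, noise_std=0.1):
    """Hypothetical stand-in for one model run: a shared trend plus internal variability."""
    return [trend_per_year * (yr - years[0]) + random.gauss(0.0, noise_std) for yr in years]

ensemble = [fake_run() for _ in range(10)]

# Average across runs for each year; the noise partially cancels, the trend remains.
ensemble_mean = [sum(run[i] for run in ensemble) / len(ensemble) for i in range(len(years))]

slope = (ensemble_mean[-1] - ensemble_mean[0]) / (years[-1] - years[0])
print(f"Ensemble-mean warming rate: {slope:.3f} degrees per year (trend built in: 0.020)")
```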
Climate scientist Céline Bonfils, who focuses on hydrological effects such as aridity, confirms that climate models are performing quite well, adding, "The big picture is relatively well understood in terms of the global scale. For instance, the climate models already accurately simulate winter storm tracks in mid-latitudes, monsoon systems, and arid lands in subtropics. As the world warms, wet regions are tending to become wetter and dry regions drier. What we are doing now is trying to understand the details at much finer levels, on regional scales. People care about what's happening in their backyard. They also want to know to what extent human-induced climate change might have made Hurricanes Harvey, Irma, and Maria more destructive than previous hurricanes."

PCMDI has been a major contributor to all five assessment reports by the Intergovernmental Panel on Climate Change (IPCC). After the fourth assessment, more than 40 Livermore researchers were recognized when the IPCC was co-awarded the 2007 Nobel Peace Prize for its efforts to "build up and disseminate greater knowledge about man-made climate change, and to lay the foundations for the measures that are needed to counteract such change."

An atmospheric release simulation from Livermore's National Atmospheric Release Advisory Center shows how a hazardous plume could disperse through streets and flow around buildings. This sophisticated and validated atmospheric model can resolve to the meter scale. The system is being developed to launch on short notice in an emergency to provide responders with actionable information needed to protect people from plume exposure.

Models Meet Supercomputers

In the late 1980s, as the Cold War ended, Laboratory scientists and engineers looked for wider application of their expertise in nuclear weapons modeling and other simulations. Some researchers shifted their careers to climate modeling. Then, in 1996, when the Comprehensive Nuclear-Test-Ban Treaty ended underground testing, science-based stockpile stewardship was born. The modeling of weapons in great detail as they age is central to stockpile stewardship, and the need arose to push even harder on the boundaries of supercomputing, resulting in machines of incredible power. Behind these supercomputers is a small army of computer scientists who develop computer codes used to safeguard the nuclear weapons stockpile, as well as codes employed in climate research.


Computer scientist Dean Williams explains, "You cannot use the actual Earth as an experiment. You cannot double or triple the amount of greenhouse gases in the atmosphere to see what happens to the planet. That's an impossible option. The only way we can investigate what would happen is by using computer simulations. We simulate nuclear tests. We simulate how airplanes fly. We've shown this works for complex systems, with close agreement between simulations and observations. If we can do these things, why not climate? The same scientific approach and fundamental physics principles apply to all."

The principal investigator and chair of the Earth System Grid Federation (ESGF), Williams has spent almost 30 years working with climate data. ESGF is a massive data-management system that allows researchers from all over the world to securely store and share models, analyses, and results, along with observational data from satellites and other scientific sources. (See S&TR, January/February 2013, pp. 4–11.) Williams says, "As computer scientists, we interface with climate scientists. We also work with the hardware, networking, and other software application teams. When I'm talking about climate models and moving around petabytes of data, I'm dealing with ESGF. We interface with the modelers because they're the ones running the models. We really have to understand the terminology and the science behind the models so that we can code them and give the researchers useful results." This teamwork also helps spread knowledge about complex modeling approaches and lessons learned throughout the Laboratory's programs.

Connecting the Dots

A straightforward way to assess the fidelity of a model is to compare the model's results from a decade or two ago with actual observed measurements made after the model's results were published. Williams adds, "We did some simulations in 2007, and now, in 2017, we compared the results to observed changes 10 years later. We found that the models predicted the temperature changes we are seeing now. Just plot the data points on the graph." Another striking result was the human impact on climate. Scientists have demonstrated that when human influences such as increased concentrations of greenhouse gases are deliberately excluded from the models, the resulting simulation predicts much colder temperatures than what is observed today. In fact, no model based on careful representation of physical laws has been able to reproduce the actual observed increase in global temperatures over recent decades without including those human effects.
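The "plot the data points" check amounts to comparing trends. The sketch below is a minimal version with invented numbers; they stand in for a hypothetical 2007 projection and for anomalies observed over the following decade and are not an actual dataset.

```python
def linear_trend(years, values):
    """Ordinary least-squares slope of values against years."""
    n = len(years)
    mean_y = sum(years) / n
    mean_v = sum(values) / n
    numerator = sum((y - mean_y) * (v - mean_v) for y, v in zip(years, values))
    denominator = sum((y - mean_y) ** 2 for y in years)
    return numerator / denominator

years = list(range(2007, 2018))
# Invented values for illustration only: a hypothetical projection made in 2007
# and hypothetical observed temperature anomalies (degrees C) for 2007-2017.
projected = [0.60 + 0.018 * (y - 2007) for y in years]
observed = [0.61, 0.55, 0.65, 0.72, 0.62, 0.64, 0.68, 0.75, 0.87, 0.99, 0.90]

print(f"Projected trend: {linear_trend(years, projected):.3f} degrees C per year")
print(f"Observed trend:  {linear_trend(years, observed):.3f} degrees C per year")
```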
Each decade since the 1960s has been warmer than the previous decade. Of the 17 hottest years on record, 16 have occurred since 2001, and the years 2014, 2015, and 2016 have consecutively set record high temperatures. These records are independently confirmed by four world-leading science institutions—NASA, the National Oceanic and Atmospheric Administration, the Japan Meteorological Agency, and the United Kingdom's Met Office Hadley Centre. Climate models help quantitatively explain the extent of the human contribution to this warming. Readily observable, data-based cause-and-effect relationships also help to explain how human influences are driving other changes, such as melting glaciers and ice sheets, warming oceans, and rising sea levels. Accumulated climate changes are driving what the world is experiencing today—such as seawater regularly flooding streets in Miami Beach, Florida, and Newport News, Virginia, at high tide, and the poleward movement of fisheries—adding credence to projections of future trends. The next generation of climate models promises to help us be even better informed and prepared.

Warming near Earth's surface is shown by both (left) satellite observations and (right) climate models, specifically, temperature change in the lower troposphere from January 1979 to December 2016 (color scale: –0.45 to 0.45°C per decade). The average of historical simulations performed with 37 different climate models corresponds well with satellite temperature measurements made by Remote Sensing Systems.


Models of an Exascale Kind

The DOE Office of Science launched the Energy Exascale Earth System Model (E3SM) program in 2014, but E3SM actually dates back to a 2007 Grand Challenge award at Livermore, which provided researchers a large amount of time on the Atlas supercomputer. Using Atlas, the team ran a simulation using what was then one of the most detailed coupled models of global climate ever produced. Dave Bader, who heads E3SM, says, "We match the strengths of DOE computational science with existing research. DOE has a mission to understand the consequences of energy production and use, and obviously that includes emissions. This assessment requires an Earth system model, and E3SM is a DOE model, for DOE missions, running on DOE computers."

This high-resolution ocean simulation uses the Energy Exascale Earth System Model (E3SM), which divides the globe into a grid with intervals of only 15 kilometers. E3SM will run on the most advanced supercomputers and produce results of unprecedented resolution.

By adding the interactive biogeochemical process by which the climate is linked to plant life and other living organisms, global climate models have evolved into Earth system models. A multi-institutional program combining the efforts of six national laboratories and several other leading scientific organizations, E3SM will run simulations at resolutions of 15 kilometers (whereas Leith's first model had a horizontal resolution of about 500 kilometers). The model will also be able to "telescope" to a resolution as small as 1 kilometer to focus attention on towns or other small locales. E3SM will also incorporate additional Earth system components such as ice shelves and glaciers that can flow and fracture—processes that are critical to projecting future rates of sea level rise.

As part of the DOE Exascale Computing Project, supercomputer architecture is going through a radical transformation. E3SM will start running on pre-exascale supercomputers but is being designed to run on full exascale platforms. As early as December 2017, E3SM will have completed the first of its many simulations addressing important energy-related questions, such as how the availability of water resources changes over periods as short as decades, how changes in the hydrological cycle will affect energy production, and how changes in heating and cooling will affect the energy needs of infrastructure, business, and the public.
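Grid spacing drives cost, which is why the telescoping capability matters. As a rough illustration (simple area arithmetic, not E3SM's actual mesh definitions), the sketch below estimates how many horizontal grid columns quasi-uniform global meshes at the quoted spacings would need, and what a single locally refined region adds instead.

```python
import math

EARTH_SURFACE_KM2 = 4.0 * math.pi * 6371.0**2   # about 5.1e8 square kilometers

def columns_at(spacing_km):
    """Approximate horizontal column count for a quasi-uniform global mesh."""
    return EARTH_SURFACE_KM2 / spacing_km**2

for spacing in (500.0, 15.0, 1.0):
    print(f"{spacing:6.0f} km spacing -> about {columns_at(spacing):,.0f} columns")

# Telescoping: keep 15-km spacing globally and refine one hypothetical
# 500 km x 500 km region to 1-km spacing instead of refining the whole globe.
refined = columns_at(15.0) + (500.0 * 500.0) / 1.0**2
print(f"15 km globally plus one 1-km refined region -> about {refined:,.0f} columns")
```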
Looking Ahead

The tremendous capabilities Livermore has built in supercomputing have been applied successfully to sustain the nation's nuclear deterrent and address other national scientific challenges. Those computational capabilities have also advanced research in related mission-critical fields. Livermore's long, successful history of atmospheric modeling has helped identify and address a broad range of issues in Livermore's mission space. The improvement of modeling has relentlessly continued, leading to ever more realistic representations of the world. Today, the atmospheric release models used by NARAC deal with scales from local to global as climate models are looking at the world at finer and finer scales.

As with their climate model predecessors, Earth system models cannot be perfectly predictive because of the impossibility of simulating every single atom on Earth. Chaos theory also dictates that no model can predict the exact temperature and sky conditions at a given place and time of day even a day from now, let alone 10 or 100 years in the future. However, the emerging capabilities in Earth system modeling will soon provide extraordinary insights into global trends and climate statistics about Earth's past, present, and future, allowing society to explore what has passed and better predict and prepare for what is to come.

—Dan Linehan

Key Words: atmospheric model, climate change, climate model, Earth System Grid Federation (ESGF), Energy Exascale Earth System Model (E3SM), Intergovernmental Panel on Climate Change (IPCC), Livermore Atmospheric Model (LAM), National Atmospheric Release Advisory Center (NARAC), Nobel Peace Prize, nuclear weapon, Program for Climate Model Diagnosis and Intercomparison (PCMDI), stockpile stewardship, supercomputer.

For further information contact David Bader (925) 422-4843 ([email protected]).
