Mission, Tradition, and Data Revolution
SNAPSHOTS FROM THE HILL: COMPUTERS & COMPUTATION

Los Alamos has a rich legacy of leading computing revolutions, a legacy that began before computers even existed. The elaborate calculations underpinning the Laboratory's original mission often took months, so in the race against time, when every day mattered, new methods of streamlining were continually devised. Those efforts—both mechanical and mathematical—paid off and secured permanent places at the Laboratory for computers and computation, which have evolved in tandem over the decades.

Today the Lab is on the leading edge of a new revolution, born of opportunity. With myriad digital devices now cheaply available, mass quantities of data are being produced, and scientists realized that new ways of managing data are needed and new ways of utilizing data are possible. Thus the field of data science was born. Data science at the Lab falls into two broad categories: pattern recognition-based platforms, such as real-time traffic-navigation assistants or cyber-security software, that evaluate risks, rewards, and characteristic behavior; and physics-based platforms that match models and equations to empirical data, such as how fluids flow through fractures in the earth's subsurface during processes like fracking or underground nuclear detonation.

Presently, Los Alamos data scientists are making advances in machine learning, such that data itself can be the algorithm, instead of a human-coded algorithm. The data come from experiments, for example materials-science experiments geared toward building a better widget. First, the computer mines the data to figure out what characteristics comprise a better widget, then it explores avenues to arrive at the best widget possible. Human brains are still required to evaluate performance, but the goal is for even this to be automated.

On the other side of the Lab's computing coin lies simulation, a computing revolution born long ago from brute force and necessity. War and defense have long driven human innovation, and as the Lab transitioned from a temporary war effort to a permanent scientific institution, its first electronic computer, MANIAC I, was built to help model thermonuclear processes for new weapons designs. Built in 1952, MANIAC I used von Neumann architecture, an organization scheme envisioned by Manhattan Project scientist John von Neumann. Overseeing MANIAC I was Nicholas Metropolis, who, along with von Neumann and others at Los Alamos, devised the Monte Carlo method—a computational algorithm based on repeated random sampling rather than direct deterministic computation—which spawned a family of methods that remain essential to modern science.
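The Monte Carlo idea is easy to demonstrate, if not to master. Below is a minimal sketch in Python, using the textbook exercise of estimating pi from random samples; it is offered purely as an illustration of the method, not anything resembling the Lab's original weapons calculations:

```python
import random

def estimate_pi(n_samples: int) -> float:
    """Estimate pi by throwing random darts at the unit square and
    counting how many land inside the quarter circle of radius 1."""
    inside = 0
    for _ in range(n_samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # The fraction of hits converges to the ratio of areas, pi/4.
    return 4.0 * inside / n_samples

print(estimate_pi(1_000_000))  # typically prints something near 3.141
```

No integral is ever evaluated directly; the answer emerges from repeated random sampling, and its accuracy improves only with the square root of the number of samples, one reason such methods craved fast computers from the start.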
Contemporary with von Neumann and Metropolis were Enrico Fermi, John Pasta, Stanislaw Ulam, and Mary Tsingou, who together are credited with the birth, in 1955 at Los Alamos, of experimental mathematics and nonlinear science. The Fermi-Pasta-Ulam-Tsingou publication (Tsingou, the programmer who coded these first-ever numerical simulation experiments on MANIAC I, was initially excluded from the byline) describes a paradox in which complicated physical systems exhibit periodic behavior despite predictions to the contrary. The scientists initially thought the computer got it wrong, but then they realized it was their thinking that was off, not the computation. It was new physics. It was unexpected and non-intuitive, and it could not have been done without a computer.
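The experiment behind that publication is small by modern standards and easy to revisit. The following Python sketch is a loose reconstruction, not the 1955 code: the oscillator count, coupling strength, and velocity-Verlet integrator are illustrative assumptions. It puts all of the chain's energy into the lowest vibrational mode and then watches that energy drain into the other modes and, given a long enough run, flow back:

```python
import numpy as np

# Fermi-Pasta-Ulam-Tsingou "alpha" chain: unit masses joined by springs
# with a weak quadratic nonlinearity and fixed ends. Illustrative values.
N, alpha, dt, steps = 32, 0.25, 0.05, 200_000

j = np.arange(1, N + 1)
x = np.sin(np.pi * j / (N + 1))   # all energy starts in the lowest mode
v = np.zeros(N)

# Normal modes of the linearized chain, used to track where energy goes.
modes = np.sqrt(2.0 / (N + 1)) * np.sin(np.pi * np.outer(j, j) / (N + 1))
omega1 = 2.0 * np.sin(np.pi / (2 * (N + 1)))  # lowest-mode frequency

def accel(x):
    xp = np.concatenate(([0.0], x, [0.0]))               # fixed boundaries
    right, left = xp[2:] - xp[1:-1], xp[1:-1] - xp[:-2]  # spring stretches
    return (right - left) + alpha * (right**2 - left**2)

a = accel(x)
for step in range(steps + 1):
    if step % 20_000 == 0:
        q1, q1dot = modes[0] @ x, modes[0] @ v
        e1 = 0.5 * (q1dot**2 + (omega1 * q1)**2)
        print(f"t = {step * dt:8.1f}   mode-1 energy = {e1:.4f}")
    # One velocity-Verlet time step.
    x += v * dt + 0.5 * a * dt**2
    a_new = accel(x)
    v += 0.5 * (a + a_new) * dt
    a = a_new
```

Rather than thermalizing, the mode-1 energy ebbs and then returns almost completely, the recurrence the team at first mistook for a computer error.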
As long as supercomputers have existed, Los Alamos has been home to the latest and greatest among them. After MANIAC I came the IBM 701, IBM's first commercial scientific computer, followed by the faster IBM 704, then MANIAC II, then the IBM 7030, or "Stretch," which is often hailed as the first true supercomputer.

[Pull quote: Revolution in computing is a tradition at Los Alamos and is central to the Laboratory's mission.]

Continual innovation in supercomputers over the last six decades has enabled continual innovation in simulation, which, although it began with thermonuclear processes, is now at the heart of many different research efforts at the Lab. For example, numerical models used to predict long-term climate shifts as well as weather (e.g., hurricane trajectories) rely on high-performance computing capabilities. Thirty years ago, the best these simulations could do was to parse the weather geographically down to 200-kilometer squares; now they have gotten down to just 10 kilometers.

[Figure series: Improvement in computation capability over the past 25 years is illustrated by these climate simulations of an area around the Kuroshio current, off the coast of Japan. In the 1980s, prior to Los Alamos engaging in climate-simulation research, the best resolution was 2.0 degrees, or about 200-kilometer squares. In the early 1990s, the Connection Machine, a resident supercomputer, helped bring the resolution down to 0.28 degree, or about 30-kilometer squares; this simulation was presented to President Clinton during one of his visits to the Laboratory and also won a Computerworld Smithsonian award. In the 2000s, additional evolution of supercomputer hardware and architecture enabled the resolution to reach 0.1 degree, or 10-kilometer squares. Most recently, improvements have centered on incorporating new features and new physics that make the simulations more realistic; here, the inclusion of ice shelves around Antarctica—important for understanding climate change—makes use of the newest model capabilities. CREDIT: Mat Maltrud/LANL (Kuroshio); Phillip Wolfram, Matthew Hoffman, and Mark Petersen/LANL (Antarctica)]

Although data science and machine learning are the young new arrivals, supercomputing and simulation are the mainstays of the Laboratory's high-performance computing program, and all have their place at the table. By addressing the most complex processes in some of the hardest problems facing science, national labs like Los Alamos are pushing the frontier of science and contributing directly to national security and the global economy. The next milestone on that frontier is exascale computing, the ability to perform a quintillion (10^18) calculations per second. It's a tall order and a considerable leap from where we are now, but looking back on where we came from, there's every reason to have confidence that Los Alamos will have a leading role in this revolution as well.

—Eleanor Hutterer

The Laboratory has been home to many Nobel laureates. But in only one instance was the prize-winning work done during the winner's tenure at Los Alamos. That was in 1956, when Fred Reines and Clyde Cowan proved the existence of a new kind of subatomic particle, the neutrino. Since then, neutrino science has continued at the Lab and elsewhere, leading to three more Nobel Prizes. Now, new experiments at Los Alamos are poised on the brink of a new discovery, which looks to be just as exciting as any of them.

In 1930, theoretical physicist Wolfgang Pauli proposed that a new particle—invisible and uncharged—was needed to satisfy the law of conservation of energy during radioactive decay of atomic nuclei. Pauli used the name "neutron," which was the same name given to another, more massive particle. Pauli's contemporary Enrico Fermi, who would later join the war effort at Los Alamos, resolved the nomenclature problem by giving the less massive particle the Italian diminutive "-ino," and voilà! The neutrino.

Scientists now know that neutrinos are among the most abundant particles in the universe—hundreds of trillions of them stream unobtrusively through our bodies every second of every day. So far, three varieties are known: the electron neutrino, the muon neutrino, and the tau neutrino. Neutrinos are almost completely inert, interacting with other particles only by gravity and by the weak nuclear force. In fact, Fermi based his original postulation of the weak nuclear force on Pauli's proposed, and still hypothetical at the time, new particle.

In the early 1950s, as the Laboratory was expanding from a war-time weapons effort into a permanent scientific institution, Reines and Cowan, defying the consensus that it was impossible, set out to capture the elusive neutrino. Because neutrinos are so inert, the likelihood of one interacting with a detector is remote, so a tremendous number of neutrinos is needed to be able to observe just one. The duo initially intended to use an underground nuclear bomb test as the source of this tremendous number of neutrinos, but they quickly determined that a nuclear reactor would be better, so they took their detector—a rig about the size of a modern washing machine—to the reactor at Hanford, Washington.

After preliminary work at Hanford, the team decided to build a bigger and better detector at the brand-new reactor at Savannah River, South Carolina. It was there that they finally and conclusively observed the electron antineutrino—the antiparticle of the electron neutrino, whose very existence proved the existence of the other. Reines and Cowan sent a jubilant telegram to Pauli in Switzerland informing him of their success. Clyde Cowan died in 1974, and Fred Reines alone was awarded the Nobel Prize in 1995 for their work.

[Figure: Fred Reines (left) and Clyde Cowan inspect their neutrino detector in 1955, a predecessor to the one they used in 1956 to prove the existence of the elusive neutrino. Forty years later and 21 years after Cowan's death, Reines alone was awarded the 1995 Nobel Prize in Physics for their shared discovery.]

Nowadays most neutrino detectors are much, much larger. Usually they are international collaborations involving thousands of tons of liquid in enormous vessels thousands of feet below the surface of the earth. But the latest neutrino detector at Los Alamos, though larger than the first, is still quite small. Reines and Cowan relied on brute force and brilliance, but this latest Los Alamos neutrino detector has the benefit of serendipity as well. It turns out that the proton beam at the Los Alamos Neutron Science Center—established in 1972 to study the short-lived subatomic