Challenges for Understanding the Quantum Universe and the Role of Computing at the Extreme Scale
DISCLAIMER

This report was prepared as an account of a workshop sponsored by the U.S. Department of Energy. Neither the United States Government nor any agency thereof, nor any of their employees or officers, makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof. The views and opinions of document authors expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof. Copyrights to portions of this report (including graphics) are reserved by original copyright holders or their assignees and are used under the Government's license and by permission. Requests to use any images must be made to the provider identified in the image credits.

On the cover: The IBM Blue Gene/P supercomputer at the U.S. Department of Energy's Argonne National Laboratory. The computer, dubbed Intrepid, is the fastest supercomputer in the world available to open science and the third fastest among all supercomputers. Future reports in the Scientific Grand Challenges workshop series will feature different Office of Science computers on their covers.

SCIENTIFIC GRAND CHALLENGES: CHALLENGES FOR UNDERSTANDING THE QUANTUM UNIVERSE AND THE ROLE OF COMPUTING AT THE EXTREME SCALE

Report from the Workshop Held December 9-11, 2008
Sponsored by the U.S. Department of Energy, Office of High Energy Physics, and the Office of Advanced Scientific Computing Research

Chair: Roger Blandford, Stanford Linear Accelerator Center
Co-Chair: Norman Christ, Columbia University
Co-Chair: Young-Kee Kim, Fermi National Accelerator Laboratory
Breakout panel chair, Cosmology and Astrophysics Simulation: Michael Norman, University of California, San Diego
Breakout panel chair, Astrophysics Data Handling, Archiving, and Mining: Alex Szalay, Johns Hopkins University
Breakout panel chair, High Energy Theoretical Physics: Steve Sharpe, University of Washington
Breakout panel chair, Accelerator Simulation: Panagiotis Spentzouris, Fermi National Accelerator Laboratory
Breakout panel co-chair, Experimental Particle Physics: Frank Wuerthwein, University of California, San Diego
Breakout panel co-chair, Experimental Particle Physics: Jim Shank, Boston University
Representative, Office of High Energy Physics: John Kogut
Representative, Office of Advanced Scientific Computing Research: Lali Chatterjee

EXECUTIVE SUMMARY

A workshop titled "Scientific Challenges for Understanding the Quantum Universe" was held December 9-11, 2008, at the Kavli Institute for Particle Astrophysics and Cosmology at the Stanford Linear Accelerator Center-National Accelerator Laboratory.
The primary purpose of the meeting was to examine how computing at the extreme scale can contribute to meeting forefront scientific challenges in particle physics, particle astrophysics, and cosmology. The workshop was organized around five research areas with associated panels. Three of these, "High Energy Theoretical Physics," "Accelerator Simulation," and "Experimental Particle Physics," addressed research of the U.S. Department of Energy (DOE) Office of High Energy Physics' Energy and Intensity Frontiers, while the "Cosmology and Astrophysics Simulation" and "Astrophysics Data Handling, Archiving, and Mining" panels were associated with the Cosmic Frontier.

The scientific grand challenges for this field are described in several publications, including the High Energy Physics Advisory Panel's (HEPAP) Quantum Universe report and the P5 subcommittee's Long Range Plan for Particle Physics in the United States. From these, the following fundamental questions are posed:

What are the main features of physics Beyond the Standard Model? These can be sought by exploring physics at high energy using the Large Hadron Collider (LHC), by seeking small departures from predictions made assuming Standard Model physics for interactions at lower energy, or by exploring physics with neutrinos.

What is the identity of dark matter and how does large-scale structure develop in the expanding universe? The prime dark matter candidate is a supersymmetric particle, but less massive particles called axions may also be involved. The primordial fluctuations out of which contemporary large-scale structure grew may provide a window on fundamental processes occurring during the epoch of inflation.

What are the empirical properties of dark energy and what do they imply about its nature? Present evidence, involving observations of supernova explosions and the growth of perturbations in the expanding universe, is consistent with dark energy having the properties of Einstein's cosmological constant. Departures from this behavior may signify the presence of new physics operating over cosmological scales.

What are the reasons for and implications of neutrino masses? The measured properties of neutrinos provide an existence proof for important physics Beyond the Standard Model (BSM) but may also lead to an explanation of why there is a net preponderance of matter over anti-matter. Astrophysical arguments also contribute to the answer.

Are there extra dimensions beyond the three spatial and single temporal dimensions familiar from everyday experience? These may have tiny length scales, as suggested by string theory, or may have macroscopic scales that can lead to a weakening of the law of gravity.

How do Nature's particle accelerators operate in extreme environments? Ultra-high-energy cosmic rays are accelerated by cosmic sources, possibly associated with massive black holes, to energies as high as 1 ZeV. When these cosmic rays hit the Earth's atmosphere, they do so with center-of-momentum energies in excess of 100 TeV (a short kinematic estimate appears below).

Discovering the answers to these questions requires an effort of historic proportions. Many of the world's best scientists, using research instruments of unprecedented scale and complexity, are vigorously pursuing these matters. Critical to this research is massive computation.
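As a quick check of the 100 TeV figure quoted for cosmic-ray air showers, the short sketch below (an illustrative calculation, not taken from the report) applies the standard fixed-target relation sqrt(s) ~ sqrt(2 E_lab m_p c^2), using the proton rest energy; the 1 ZeV primary energy comes from the text, and the additional 10^20 eV value is included only for comparison.

# Illustrative kinematics check (not from the report): the center-of-momentum
# energy of a cosmic-ray proton striking a nucleon at rest in the atmosphere
# is sqrt(s) ~ sqrt(2 * E_lab * m_p c^2) when E_lab >> m_p c^2.
import math

PROTON_REST_ENERGY_EV = 0.938e9              # m_p c^2 in eV
for e_lab in (1e20, 1e21):                   # primary energies in eV (1 ZeV = 1e21 eV)
    sqrt_s = math.sqrt(2.0 * e_lab * PROTON_REST_ENERGY_EV)
    print(f"E_lab = {e_lab:.0e} eV  ->  sqrt(s) ~ {sqrt_s / 1e12:.0f} TeV")
# Prints roughly 430 TeV and 1400 TeV, i.e. well in excess of 100 TeV.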
Spectacular research tools such as the LHC in Europe, Fermilab's Tevatron, and the recently launched Fermi Gamma-ray Space Telescope produce data in such quantity and of such complexity that the fastest computers are essential to filter, analyze, and store petabytes of data. Our understanding of the sub-atomic world now permits first-principles calculations that test current theories and point the way to new ones. These calculations use the fastest petaflop computers available.

While we can observe cataclysmic events such as a supernova explosion or the collapse of a massive star into a black hole from a vast distance, these involve phenomena that can never be reproduced in a terrestrial laboratory. However, it is becoming increasingly possible to recreate these conditions in a digital universe where important questions can be asked and alternatives explored.

The planning and design of the International Linear Collider (ILC) is critical for the next step in high energy physics. How can we be certain that this expensive machine will work when it is turned on? Instead of building a series of increasingly costly prototypes, accelerator physicists turn with increasing sophistication to detailed numerical models. These provide proof of principle and also allow them to optimize a cost-effective design in which both a series of important small efficiencies and dramatic new approaches can be explored and validated.

Thus, large-scale computation provides the framework in which research into the mysteries of the quantum universe takes place. It enables the present successes in the field and represents some of the most important challenges that must be overcome if that research is to advance. As was repeatedly demonstrated during this workshop, the demands on computing are indeed extreme, both in their difficulty and in the transformational opportunities they offer. Some of the most important of these opportunities include:

- Compute the elements of the famous Cabibbo-Kobayashi-Maskawa (CKM) matrix with accuracies approaching 1% or better. This can reveal new BSM physics and will provide key tests of the theory of quantum chromodynamics, analogous to the development of atomic physics and quantum electrodynamics in the 20th century. (A brief numerical illustration of the precision involved follows this list.)

- Analyze the 100-petabyte datasets anticipated from future dark matter/dark energy optical surveys in order to reduce the level of systematic error. This will require intensive modeling of the Point Spread Function and the atmosphere (a toy sketch follows this list). It must be followed by intensive analysis of the derived galaxy-clustering datasets to extract the cosmological parameters that describe the properties of dark matter and dark energy.
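To give a sense of the precision goal in the first opportunity, the sketch below builds the CKM matrix in the standard Wolfenstein parameterization and checks how well the truncated form respects unitarity. The parameter values are approximate textbook inputs chosen for illustration, not results from this report; percent-level lattice-QCD determinations probe the matrix elements at a comparable level.

# Illustrative sketch (standard Wolfenstein parameterization; parameter values
# are approximate and not taken from the report): construct the CKM matrix to
# O(lambda^3) and measure how far V^dagger V departs from the identity.
import numpy as np

lam, A, rho, eta = 0.225, 0.81, 0.14, 0.35   # approximate Wolfenstein parameters
V = np.array([
    [1 - lam**2 / 2,                     lam,             A * lam**3 * (rho - 1j * eta)],
    [-lam,                               1 - lam**2 / 2,  A * lam**2],
    [A * lam**3 * (1 - rho - 1j * eta),  -A * lam**2,     1.0],
])
deviation = np.abs(V.conj().T @ V - np.eye(3)).max()
print(f"max deviation from unitarity: {deviation:.1e}")  # O(lambda^4), a few x 1e-3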
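The Point Spread Function modeling mentioned in the last opportunity can be illustrated with a toy example. The sketch below is a hypothetical postage-stamp fit, far simpler than any survey pipeline: it simulates a noisy star image with an assumed Gaussian PSF and recovers the PSF width by least-squares fitting.

# Hypothetical toy (not the surveys' actual pipeline): simulate a star blurred
# by a Gaussian PSF, add noise, and recover the PSF width with a 2-D fit.
import numpy as np
from scipy.optimize import curve_fit

def gaussian_psf(coords, amplitude, x0, y0, sigma):
    x, y = coords
    return amplitude * np.exp(-((x - x0)**2 + (y - y0)**2) / (2.0 * sigma**2))

# Simulated 32x32 pixel postage stamp: true width 1.8 pixels plus Gaussian noise.
rng = np.random.default_rng(0)
y, x = np.mgrid[0:32, 0:32]
truth = gaussian_psf((x, y), 1000.0, 16.3, 15.7, 1.8)
image = truth + rng.normal(0.0, 5.0, truth.shape)

popt, _ = curve_fit(lambda c, *p: gaussian_psf(c, *p).ravel(),
                    (x, y), image.ravel(), p0=(800.0, 16.0, 16.0, 2.0))
print(f"recovered PSF width: {abs(popt[3]):.2f} pixels (true value 1.8)")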