Overview of the NNSA/ASC Labs


High Performance Computing Workshop, Session VII: HPC Resources and Training Opportunities at the NNSA/ASC National Labs
Blaise Barney, Lawrence Livermore National Laboratory
DOE CSGF HPC Workshop

Overview
• Who Are the NNSA/ASC National Labs?
• Leadership in High Performance Computing
• HPC Science at the Labs
• Overview of HPC Platforms at LLNL, LANL and Sandia
• HPC Training Opportunities at the NNSA/ASC Labs
• ASC Academic Alliance Program
• Future Platforms

Who Are the NNSA/ASC National Labs?
• NNSA: the Department of Energy’s National Nuclear Security Administration.
  • Broad range of responsibilities including nuclear security, non-proliferation, naval reactors and defense programs. $24B budget.
  • National security and stockpile stewardship in a post-Cold War era are essential mission components.
• ASC: the Advanced Simulation and Computing Program within NNSA.
  • Established in 1995 to support NNSA’s shift in emphasis from test-based confidence to simulation-based confidence. $611M budget.
  • Under ASC, computer simulation capabilities are developed to analyze, predict and certify the performance, safety, and reliability of the nation’s nuclear stockpile.
  • Multi-disciplinary simulation science and implementation of the world’s most powerful computing resources.

Who Are the NNSA/ASC National Labs?
• The Department of Energy funds a wide range of national laboratories and related facilities: Ames, Argonne, Brookhaven, Fermi National Accelerator, Idaho, Lawrence Berkeley, Lawrence Livermore, Los Alamos, National Energy Technology, National Renewable Energy, Oak Ridge, Princeton Plasma Physics, Sandia, Savannah River, Stanford Linear Accelerator, and Thomas Jefferson National Accelerator Facility.
• Three of these national laboratories are designated as NNSA defense labs: Lawrence Livermore, Los Alamos and Sandia.
• The ASC program integrates the work of the NNSA defense labs to facilitate NNSA national security goals and mission.
• The ASC program also funds and integrates the work of academic Alliance research centers at 8 U.S. universities (discussed later): Caltech, U. Chicago, U. Illinois, U. Michigan, Purdue, Stanford, U. Texas and U. Utah.

Who Are the NNSA/ASC National Labs? (aka Tri-labs)
• Lawrence Livermore National Laboratory (LLNL)
  • Located in Livermore, CA; established in 1952
  • 7,800+ employees and contractors
  • Managed by Lawrence Livermore National Security, LLC (LLNS); budget ~$1.6 billion
• Los Alamos National Laboratory (LANL)
  • Located in Los Alamos, NM; origins in the Manhattan Project (~1943)
  • 9,500+ employees and contractors
  • Managed by Los Alamos National Security, LLC (LANS); budget ~$2.2 billion
• Sandia National Laboratories (SNL)
  • Albuquerque, NM (main) and Livermore, CA, with test facilities in Nevada and Hawaii; established in 1949
  • 8,400+ employees and contractors
  • Managed by Lockheed Martin Corporation; budget ~$2.3 billion

Leadership in High Performance Computing
The NNSA/ASC Labs have a long HPC history, beginning even before the term "HPC" and the NNSA/ASC were invented. For example, at LLNL: [timeline figure]

At LANL: [timeline figure]

At Sandia:
• 1987: 1024-node nCUBE 10
• 1990: two 1024-node nCUBE-2s
• 1993: Sandia fielded the first Intel Paragon; 1,850 nodes/3,900 CPUs; reached #1 on the TOP500
• 1997-2006: ASCI Red (Intel), the world’s first Tflop system; upgraded to 3 Tflops with 4,510 nodes/9,298 CPUs; the fastest computer on the TOP500 list for 3 years
• 1997: Cplant (DEC/Compaq) with a Myrinet network; achieved 997 Gflops in Nov 2003
• 2006-current: Thunderbird (Dell), a 65.4 Tflops Linux cluster; 4,480 nodes/8,960 CPUs; #6 on the Jun 2006 TOP500 list (see the peak-rate sketch after this list)
• 2006-current: Red Storm (Cray), 284 Tflops theoretical peak; 12,960 nodes/38,400 CPUs; #9 on the Nov 2009 TOP500 list
[Photos: nCUBE, Paragon, ASCI Red, Cplant, Red Storm]
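A back-of-the-envelope way to sanity-check figures like these: a cluster's theoretical peak is simply CPUs x clock rate x floating-point operations per cycle. The minimal sketch below applies this to Thunderbird; the node and CPU counts come from the slide above, while the 3.6 GHz clock and 2 flops/cycle are assumed, era-typical Xeon values, not figures from this deck.

    # Theoretical peak of a cluster: CPUs x clock x flops/cycle.
    def theoretical_peak_tflops(cpus: int, clock_ghz: float, flops_per_cycle: int) -> float:
        # cpus * clock_ghz = aggregate Gcycles/s; * flops_per_cycle = Gflops; /1000 = Tflops
        return cpus * clock_ghz * flops_per_cycle / 1000.0

    # Thunderbird: 4,480 nodes x 2 CPUs/node = 8,960 CPUs (counts from the slide).
    # The 3.6 GHz clock and 2 flops/cycle are assumptions about Xeons of that era.
    print(theoretical_peak_tflops(8960, 3.6, 2))  # -> 64.512, near the quoted 65.4 Tflops

Note that TOP500 entries distinguish this theoretical peak (Rpeak) from measured LINPACK performance (Rmax), which is always lower.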
Leadership in High Performance Computing
The NNSA/ASC Labs have dominated the TOP500 list (top500.org) for most of the 16 years since the list was created in 1993, holding the #1 position 22 times (as of 6/09).

Leadership in High Performance Computing
• Award-winning science and technology:
  • Gordon Bell Prize, recognizing outstanding achievements in HPC: won 16 times since the prize began in 1987
  • Many R&D 100 Awards
  • Very long lists of other awards from government, industry and scientific organizations:
    www.sandia.gov/news/corp/awards/
    www.llnl.gov/llnl/sciencetech/awards.jsp
    www.lanl.gov/science/awards/
• The science in the Labs’ state-of-the-art simulations is recognized in scientific publications around the world.

Leadership in High Performance Computing
• The NNSA/ASC Labs have been instrumental in revitalizing the U.S. HPC industry by working closely with vendors such as Cray and IBM.

From "Getting Up to Speed: The Future of Supercomputing," Susan L. Graham, Marc Snir, and Cynthia A. Patterson, editors, Committee on the Future of Supercomputing, National Research Council, 2004:

"Supercomputers play a significant and growing role in a variety of areas important to the nation. They are used to address challenging science and technology problems. In recent years, however, progress in supercomputing in the United States has slowed. The development of the Earth Simulator supercomputer by Japan showed that the United States could lose its competitive advantage and, more importantly, the national competence needed to achieve national goals. In the wake of this development, the Department of Energy asked the NRC to assess the state of U.S. supercomputing capabilities and relevant R&D. Subsequently, the Senate directed DOE in S. Rpt. 107-220 to ask the NRC to evaluate the Advanced Simulation and Computing program of the National Nuclear Security Administration at DOE in light of the development of the Earth Simulator. This report provides an assessment of the current status of supercomputing in the United States including a review of current demand and technology, infrastructure and institutions, and international activities."

Leadership in High Performance Computing
Things have changed, largely due to DOE/NNSA/ASC investments in HPC.

HPC Science at the Labs
• There is a very broad range of science being explored at the Labs, much of it depending upon HPC:
  • Physics: applied, nuclear, particle & accelerator, condensed matter, high pressure, fusion, photonics
  • Atmosphere, Earth, Environment and Energy
  • Biosciences and Biotechnology
  • Engineering: defense technologies, laser systems (NIF), mechanical…
  • Chemistry
  • Materials
  • Microelectronics
  • Pulsed power
  • Computer & Information Science, Mathematics
• Diverse community of scientists, researchers and collaborations: Tri-lab, other labs, universities, international…

HPC Science at the Labs: A Few Examples
Concurrently using 196,608 processors in a single run, high-fidelity simulations of a three-dimensional laser beam interacting with a target are critical to achieving fusion ignition on the National Ignition Facility. BlueGene/L is refining the design of the National Ignition Facility, scheduled to achieve fusion ignition in 2010. Obtaining controlled laboratory fusion is the holy grail of national energy independence.
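What it means to use 196,608 processors "in a single run": the simulation domain is split so that each processor owns one piece of the grid and only boundary data moves between neighbors each timestep. Below is a minimal sketch of that pattern using mpi4py; the slab decomposition, grid sizes, and periodic neighbor wiring are illustrative assumptions, not details of the actual LLNL laser-plasma codes.

    # Minimal 1-D slab decomposition of a 3-D grid across MPI ranks.
    # Illustrative only: production codes use 3-D decompositions, physical
    # boundary conditions, and far larger grids.
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    nranks = comm.Get_size()

    nz_global = 1536                           # global grid depth (assumes nranks divides it)
    nz_local = nz_global // nranks             # slab of z-planes owned by this rank
    field = np.zeros((nz_local + 2, 64, 64))   # +2 ghost planes for neighbor data

    up = (rank + 1) % nranks                   # periodic neighbors, for simplicity
    down = (rank - 1) % nranks
    for step in range(10):
        # Exchange ghost planes: send my top interior plane up, receive my
        # bottom ghost plane from below, and vice versa.
        comm.Sendrecv(np.ascontiguousarray(field[-2]), dest=up,
                      recvbuf=field[0], source=down)
        comm.Sendrecv(np.ascontiguousarray(field[1]), dest=down,
                      recvbuf=field[-1], source=up)
        # Local stencil update on the interior field[1:-1] would go here
        # (the physics is omitted in this sketch).

Launched with, e.g., mpirun -n 4 python sketch.py, each rank owns 384 of the 1,536 z-planes; the same pattern scales to hundreds of thousands of ranks because each rank talks only to its neighbors.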
HPC Science at the Labs: A Few Examples
[figure-only slides]

HPC Science at the Labs: A Few Examples
University of Texas: Center for Predictive Engineering and Computational Sciences (PECOS). The goal of the PECOS Center is the development of advanced computational methods for predictive simulation of multiscale, multiphysics phenomena, applied to the problem of reentry of vehicles into the atmosphere.

Overview of HPC Platforms at the Tri-labs

HPC @ LLNL: BG/L and BG/P
• Origins in 1999, with IBM exploring a novel new architecture: massively scalable parallelism with low power consumption and a small footprint.
• Targeted for protein folding research, but later became…