NVIDIA Powers Next-Generation Supercomputer at University of Edinburgh
Recommended publications
Durham Unlocks Cosmological Secrets
Unlocking cosmological secrets

Drawing on the power of high performance computing, Durham University and DiRAC scientists are expanding our understanding of the universe and its origins.

Scientific Research | United Kingdom

Business needs: Durham University and the DiRAC HPC facility need leading-edge high performance computing systems to power data- and compute-intensive cosmological research.

Solutions at a glance:
• Dell EMC PowerEdge C6525 servers
• AMD® EPYC™ 7H12 processors
• CoolIT® Systems Direct Liquid Cooling technology
• Mellanox® HDR 200 interconnect

Business results:
• Providing DiRAC researchers with world-class capabilities
• Enabling a much greater level of discovery
• Accelerating computational cosmology research
• Keeping Durham University at the forefront of cosmology research

The COSMA8 cluster has as many as 76,800 cores; the AMD EPYC processor used in COSMA8 has 64 cores per CPU.

Researching very big questions

In scientific circles, more computing power can lead to bigger discoveries in less time. This is the case at Durham University, whose researchers are unlocking insights into our universe with powerful high performance computing clusters from Dell Technologies.

What is the universe? What is it made of? What is dark matter? What is dark energy? These are the types of questions explored.

"This will primarily be used for testing, for getting code up to scratch," says Dr. Alastair Basden, technical manager for the COSMA high performance computing system at Durham University. "But then, coming very soon, we hope to place an order for another 600 nodes."

Deploying the full 600 nodes would provide a whopping 76,800 cores. "It's a big uplift," Dr. Basden notes. "COSMA7 currently is about 12,000 cores. So COSMA8 will have six times as many cores. It will have more than twice as much DRAM as previous systems, which means that we'll be able to more than double the size of the simulations."
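The core counts quoted in the case study follow from simple node arithmetic. A back-of-the-envelope sketch, assuming dual-socket C6525 nodes (the two-CPUs-per-node figure is an assumption consistent with the 76,800-core total, not stated explicitly above):

```python
# Back-of-the-envelope core counts for the COSMA8 deployment described above.
# Assumption: each PowerEdge C6525 node carries two 64-core AMD EPYC 7H12 CPUs.
CORES_PER_CPU = 64
CPUS_PER_NODE = 2
NODES = 600

cores_per_node = CORES_PER_CPU * CPUS_PER_NODE   # 128 cores per node
total_cores = cores_per_node * NODES
print(total_cores)                                # 76800, matching the article

# Uplift relative to COSMA7 (~12,000 cores, per Dr. Basden's quote)
COSMA7_CORES = 12_000
print(round(total_cores / COSMA7_CORES))          # ~6, "six times as many cores"
```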
Call for Applications for DiRAC Community Development Director
DiRAC Health Data Science and AI Placement Opportunity

DiRAC will award one Innovation Placement in 2021 in the area of Health Data Science and the application of AI. The nominal length is 6 months, and it has to be completed by 30 September 2021. In this scheme, a final year PhD student or an early career researcher can have a funded placement (up to £25k) with the Getting It Right First Time (GIRFT) programme. GIRFT is funded by the UK Department of Health and Social Care and is a collaboration between NHS England & NHS Improvement and the Royal National Orthopaedic Hospital NHS Trust. GIRFT uses comprehensive benchmarking data analysis to identify unwarranted variation in healthcare provision and outcomes in National Health Service (NHS) hospitals in England, and combines this with deep-dive visits to the hospitals by clinicians, with follow-up on agreed actions by an improvement team. The programme covers the majority of healthcare specialities.

You have to be working on research that falls within the STFC remit in order to qualify for the placement; however, you can be funded by organisations other than STFC, as long as the subject area is identifiable as being in Particle Physics, Astronomy & Cosmology, Solar Physics and Planetary Science, Astro-particle Physics, or Nuclear Physics. To check your eligibility, please contact Jeremy Yates ([email protected]) and Maria Marcha ([email protected]). You must get your supervisor's or PI's permission before applying for this placement. It is allowed under UKRI's rules, but only with your supervisor's or PI's consent.
School of Physics and Astronomy – Yearbook 2019
SCHOOL OF PHYSICS AND ASTRONOMY – YEARBOOK 2019

Introduction

Table of Contents
Introduction ........ 2
School Events & Activities ........ 4
Science News ........ 13
Space Park Leicester News ........ 31
Physicists Away from the Department ........ 34
Celebrating Success ........ 36
Physics Special Topics: Editors Pick ........ 44
Comings and Goings ........ 46

A very warm welcome to the first School of Physics and Astronomy Yearbook! We have moved away from the termly email news bulletins of the past, towards an online blog that celebrates all the successes of our School. This 2019 Yearbook brings together the highlights from the blog, and from our School's press releases, to provide a record of our activities over the past twelve months. In fact, this edition is packed full of even more goodies, spanning from October 2018 (the last newsletter) to December 2019. We expect that future editions of the Yearbook will focus on a single calendar year.

As we look forward to the dawn of a new decade, we can reflect on the continued growth and success of the School of Physics and Astronomy. Our Department has evolved to become a School. Our five existing research groups are exploring ways to break down any barriers between them, and to restructure the way our School works. The exciting dreams of Space Park Leicester are becoming a reality, with incredible new opportunities on the horizon. Our foyer has been thoroughly updated and modernised, to create a welcoming new shared space that represents our ambitions as a School. Our website has been dragged, kicking and screaming, into the 21st century (SiteCore) to showcase the rich portfolio of research, teaching, and
DiRAC-3 Technical Case
DiRAC-3 Technical Case

Contributors: Peter Boyle, Paul Calleja, Lydia Heck, Juha Jaykka, Adrian Jenkins, Stuart Rankin, Chris Rudge, Paul Shellard, Mark Wilkinson, Jeremy Yates, on behalf of the DiRAC project.

Table of Contents
1 Executive summary ........ 3
2 Technical case for support ........ 4
  1.1 Service description ........ 4
  1.2 Track record ........ 5
    1.2.1 Hosting capability ........ 6
  1.3 International competition ........ 6
  1.4 Risk Management ........ 7
  1.5 Impact ........ 7
    1.5.1 Industrial Collaboration ........
Press Release
Atos supercomputer to help unlock secrets of the Universe

Paris (France), London (UK), 1 June 2021 – Atos today announces it has been awarded a contract by the University of Edinburgh to deliver the BullSequana XH2000, the most energy-efficient supercomputing system on the market. This is the largest system dedicated to GPU computing deployed at a customer site in the UK. The new system will constitute the Extreme Scaling Service of the UK's DiRAC HPC Facility.

The state-of-the-art platform will allow scientists across the STFC theory community to drive forward world-leading research in particle physics, among other areas, using NVIDIA Graphics Processing Units (GPUs) and AMD processors. It represents a major boost to DiRAC's computing capacity, significantly increasing the power of the Extreme Scaling service.

DiRAC is a distributed facility with high performance computing resources hosted by the Universities of Edinburgh, Cambridge, Durham and Leicester. These systems support fundamental research in particle physics, astrophysics, nuclear physics and cosmology.

This agreement forms part of a £20 million investment by the UK Research and Innovation (UKRI) World Class Laboratories scheme, through the Science and Technology Facilities Council (STFC), to fund an upgrade of the DiRAC facility. The investment is delivering new systems which are up to four times more powerful than the existing DiRAC machines, providing computing capacity that can also be used to address immediate and emerging issues such as the COVID-19 pandemic. The upgraded DiRAC-3 facility will also be much more energy efficient than previous generations.
The BlueGene/Q Supercomputer
The BlueGene/Q Supercomputer

P A Boyle*
University of Edinburgh
E-mail: [email protected]

I introduce the BlueGene/Q supercomputer design, with an emphasis on features that enhance QCD performance, discuss the architecture's performance for optimised QCD codes, and compare these characteristics to other leading architectures.

The 30th International Symposium on Lattice Field Theory - Lattice 2012, June 24-29, 2012, Cairns, Australia

*Speaker.
© Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike Licence. http://pos.sissa.it/

1. QCD computer design challenge

The floating point operation count in the Monte Carlo evaluation of the QCD path integral is dominated by the solution of the discretised Euclidean Dirac equation. The solution of this equation is a classic sparse matrix inversion problem, for which one uses one of a number of iterative Krylov methods. The naive dimension of the (sparse) matrix is of O([10^9]^2), and hundreds of thousands of inversions must be performed in a serially dependent chain to importance sample the QCD path integral. The precision with which we can solve QCD numerically is computationally limited, and the development of faster computers is of great interest to the field.

Many processors are in principle faster than one, providing we can arrange for them to coordinate work effectively. QCD is easily parallelised with a geometrical decomposition spreading L^4 space-time points across multiple N^4 processing nodes, each containing
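The iterative Krylov inversion at the heart of this workload can be illustrated with a minimal conjugate gradient solver. This is a generic sketch on a small dense symmetric positive-definite system, not the actual lattice Dirac operator, which in production codes is applied matrix-free to a complex-valued field distributed across the 4-d lattice decomposition described above:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Minimal CG solver for A x = b, with A symmetric positive-definite.

    Lattice QCD codes run the same iteration, but with A as the
    (preconditioned) Dirac operator and the vectors distributed over nodes.
    """
    x = np.zeros_like(b)
    r = b - A @ x            # initial residual
    p = r.copy()             # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)     # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p # new A-conjugate search direction
        rs_old = rs_new
    return x

# Small SPD test system standing in for the Dirac normal equations
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)   # diagonal shift guarantees SPD
b = rng.standard_normal(50)
x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b, atol=1e-6))
```

In exact arithmetic CG converges in at most n steps for an n-dimensional system; the practical appeal for QCD is that each iteration needs only the operator applied to a vector, which parallelises over the geometric domain decomposition.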