Large-Scale First-Principles Molecular Dynamics Simulations on the BlueGene/L Platform Using the Qbox Code

François Gygi, Erik W. Draeger, Bronis R. de Supinski
Center for Applied Scientific Computing, Lawrence Livermore National Laboratory, Livermore, CA 94551

Robert K. Yates
Center for Applied Scientific Computing, Lawrence Livermore National Laboratory, Livermore, CA 94550

Franz Franchetti
Department of Electrical and Computer Engineering, Carnegie Mellon University, 5000 Forbes Ave, Pittsburgh, PA 15213

Stefan Kral, Juergen Lorenz, Christoph W. Ueberhuber
Institute for Analysis and Scientific Computing, Vienna University of Technology, Wiedner Hauptstrasse 8, A-1040 Vienna, Austria

John A. Gunnels, James C. Sexton
IBM T.J. Watson Research Center, Yorktown Heights, NY 10598

ABSTRACT
We demonstrate that the Qbox code supports unprecedented large-scale First-Principles Molecular Dynamics (FPMD) applications on the BlueGene/L supercomputer. Qbox is an FPMD implementation specifically designed for large-scale parallel platforms such as BlueGene/L. Strong scaling tests for a Materials Science application show an 86% scaling efficiency between 1024 and 32,768 CPUs. Measurements of performance by means of hardware counters show that 36% of the peak FPU performance can be attained.

© 2005 Association for Computing Machinery. ACM acknowledges that this contribution was authored or co-authored by a contractor or affiliate of the [U.S.] Government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.
SC|05 November 12-18, 2005, Seattle, Washington, USA. © 2005 ACM 1-59593-061-2/05/0011…$5.00

Categories and Subject Descriptors
J.2 [Physical Sciences and Engineering]: Chemistry, Physics.

General Terms
Algorithms, Measurement, Performance.

Keywords
Electronic structure, molecular dynamics, ab initio simulations, first-principles simulations, parallel computing, BlueGene/L.

1. INTRODUCTION
First-Principles Molecular Dynamics (FPMD) is an accurate atomistic simulation method that is routinely applied to a variety of areas including solid-state physics, chemistry, biochemistry and nanotechnology [1]. It combines a quantum mechanical description of electrons with a classical description of atomic nuclei. The Newton equations of motion for all nuclei are integrated in time in order to simulate the dynamical properties of physical systems at finite temperature. At each discrete time step of the trajectory, the forces acting on the nuclei are derived from a calculation of the electronic properties of the system.

The electronic structure calculation is the most time-consuming part of an FPMD simulation, and limits the size of tractable systems to a few hundred atoms on most currently available parallel computers. It consists of solving the Kohn-Sham (KS) equations [2], a set of non-linear, coupled integro-differential partial differential equations. Considerable efforts have been devoted over the past three decades to the development of efficient implementations of the electronic structure computation. In its most common implementation, the solution of the Kohn-Sham equations has O(N³) complexity, where N is the number of atoms.

In this paper, we report on FPMD simulations performed using the Qbox code on the BlueGene/L (BG/L) computer installed at Lawrence Livermore National Laboratory. Qbox is a parallel implementation of the FPMD method designed specifically for large parallel platforms, including BlueGene/L. Simulations were performed using up to 32,768 processors on systems of up to 1000 atoms. Performance measurements of strong scaling show that an 86% parallel efficiency is obtained between 1k and 32k CPUs. Floating-point operation counts measured with hardware performance counters show that 36% of peak performance is attained when using 4k CPUs.

2. PREDICTIVE SIMULATIONS OF MATERIALS PROPERTIES
Although the applicability of FPMD is not limited to a specific class of physical system, we focus in this paper on an important issue of Materials Science: the simulation of metals under extreme conditions. Predicting the properties of metals at high pressures and high temperatures has been a longstanding goal of Materials Science and high energy density physics. Experimental measurements of these properties can be extremely challenging because of the difficulty of achieving and maintaining extreme conditions of pressure and temperature. FPMD simulations offer the possibility of predicting these properties without relying on any empirical or adjusted parameters. The validation of the physical models used in FPMD simulations is an important step toward this goal, and requires large computational resources.

In this paper, we demonstrate the feasibility of FPMD simulations of large samples of transition metals. The sample chosen for the present performance study consists of 1000 molybdenum atoms (12,000 electrons) at ambient pressure conditions, and includes a highly accurate treatment of electron-ion interactions. Norm-conserving semi-local pseudopotentials were used to represent the electron-ion interactions. A total of 64 non-local projectors were used (8 radial quadrature points for the p and d channels) to represent the potential of each atom. A plane-wave energy cutoff of 44 Ry defines the basis set used to describe the electronic wave functions. In all our calculations, electronic wavefunctions are represented at a single k-point in the Brillouin zone (k=0). We choose to work at ambient pressure since this is the most CPU-intensive calculation: higher pressures involve smaller simulation volumes and are correspondingly less demanding.

This calculation is considerably larger than any previously feasible FPMD simulation. Our demonstration that BG/L's large computing power makes such large simulations feasible opens the way to accurate simulations of the properties of metals, including the calculation of melting temperatures using the two-phase simulation technique [3], and the calculation of defect energies and defect migration processes.

3. QBOX: A HIGHLY SCALABLE FPMD IMPLEMENTATION
Qbox is a C++ implementation of the FPMD method. It is developed at the Center for Applied Scientific Computing (CASC), Lawrence Livermore National Laboratory (LLNL). A detailed description of the code has been given elsewhere [4]. Qbox exploits parallelism using the MPI message-passing paradigm. The design of Qbox yields good load balance through an efficient data layout and careful management of the data flow during the most time-consuming operations. Qbox solves the Kohn-Sham (KS) equations within the pseudopotential, plane-wave formalism. The solution of the KS equations has been extensively discussed by various authors and will not be repeated here [1].

It should be noted that two commonly used variants of FPMD simulations lead to "embarrassingly parallel" implementations: 1) calculations of periodic systems involving multiple k-points in the Brillouin zone, and 2) Path-Integral Monte Carlo (PIMC) simulations in which the quantum behavior of nuclei (mostly in the case of hydrogen) is investigated. Both these methods require the solution of the KS equations for many independent configurations, and therefore lead to an easy parallelization by distributing the independent problems to subsets of a parallel computer. This allows one to trivially achieve excellent scaling. We emphasize that we do not use either of these approaches, and that our scaling results represent true strong scaling, i.e. the solution of the same problem (a single k-point calculation) using an increasing number of processors.

4. BLUEGENE/L: A SCALABLE ARCHITECTURE FOR SCIENTIFIC SIMULATIONS
BlueGene/L, as designed and implemented by a partnership between Lawrence Livermore National Laboratory (LLNL) and IBM, is a tightly integrated large-scale computing platform. Its compute node ASICs include all networking and processor functionality; in fact, a compute node uses only that ASIC and nine DRAM chips. This system-on-a-chip design results in extremely high power and space efficiency. The full […] available, Qbox currently only uses these instructions in the DGEMM and FFT implementations discussed later in this paper.

The integrated BG/L platform significantly increases the scale of high-end computing platforms. The final machine, expected in the fall of 2005, will have 65,536 compute nodes for a total peak performance of 360 TF; half of that machine is already installed and in use at LLNL. As shown in Figure 1, the final system will include 1024 I/O nodes that are nearly identical to the compute nodes. The difference is that they include gigabit ethernet (Gig-E) […]
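The 86% strong-scaling figure quoted above corresponds to the standard definition of parallel efficiency for a fixed-size problem: the measured speedup divided by the ideal speedup. As a minimal sketch (the wall-clock times below are hypothetical placeholders for illustration, not measurements from the paper; only the CPU counts and the 86% figure come from the text), the efficiency between a reference run and a larger run can be computed as:

```python
def strong_scaling_efficiency(p_ref, t_ref, p, t):
    """Parallel efficiency of a fixed-size (strong scaling) run.

    p_ref, t_ref: reference CPU count and wall-clock time
    p, t:         larger CPU count and its wall-clock time
    Returns measured speedup divided by ideal speedup.
    """
    speedup = t_ref / t      # how much faster the larger run actually is
    ideal = p / p_ref        # how much faster it would be with perfect scaling
    return speedup / ideal

# Hypothetical example: a fixed-size problem taking 1000 s on 1024 CPUs
# and about 36.3 s on 32,768 CPUs yields an efficiency of 0.86 (86%),
# consistent with the figure reported for Qbox.
eff = strong_scaling_efficiency(1024, 1000.0, 32768, 1000.0 / (32 * 0.86))
print(f"{eff:.2f}")  # 0.86
```

Because the problem size is held fixed as the CPU count grows by 32x, any efficiency below 1 reflects communication and load-imbalance overhead rather than a change in the amount of work per run.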
