

MAGNETARS, BLACK HOLE COLLISIONS FOR LIGO, AND A NEXT GENERATION NUMERICAL RELATIVITY CODE

PI: Christian D. Ott (1)
Co-PIs: Mark Scheel (1), Peter Diener (2)
Collaborators: Philipp Mösta (3), Luke Roberts (1), David Radice (1), Sherwood Richers (1), Roland Haas (5), Erik Schnetter (2), Lawrence Kidder (4), Saul Teukolsky (4), Scott Field (4), and Francois Foucart (3)

(1) California Institute of Technology
(2) Louisiana State University
(3) University of California, Berkeley
(4) Cornell University
(5) Albert Einstein Institute

EXECUTIVE SUMMARY

Neutron stars with an extremely strong magnetic field are called magnetars. An open question is how they obtain their magnetic field. We have carried out 3D simulations of a magnetized, differentially rotating proto-neutron star. These simulations show that the magnetorotational instability is active in such proto-neutron stars. It efficiently amplifies the magnetic field locally, and a dynamo process rearranges and orders the field, creating the large-scale global magnetic field needed for driving supernova explosions.

The National Science Foundation's (NSF) Laser Interferometer Gravitational Wave Observatory (LIGO) has made its first detection of gravitational waves. We responded with target-of-opportunity numerical relativity simulations on Blue Waters that are now helping LIGO to extract information more reliably from the waves.

To use large core counts on Blue Waters more efficiently, and in preparation for next-generation high-performance computing architectures, we have designed, implemented, and tested the new general-relativistic magnetohydrodynamics (MHD) code, SpECTRE. It employs the Discontinuous Galerkin method, is based on the Charm++ parallel programming system, and exhibits near-perfect strong scaling to 128,000 Blue Waters cores.

INTRODUCTION

Numerical general relativity simulations discretize Einstein's equations and are essential to study phenomena involving strong gravity, such as the collision of black holes or the formation of neutron stars in core-collapse supernova explosions. Magnetars, neutron stars with extremely strong magnetic fields (B ≥ 10^15 Gauss), are believed to power energetic supernova explosions (so-called "hypernovae") and cosmic gamma-ray bursts. MHD turbulence has been suggested as a mechanism for building up the extremely strong magnetar magnetic field. This MHD turbulence is driven by the magnetorotational instability (MRI), whose fastest-growing mode has a very short wavelength, making it difficult to resolve in simulations. An additional complication is that the MRI builds up the magnetic field strength locally, but what is needed for a magnetar is a large-scale (global) magnetic field that can force the high-density stellar plasma into coherent motion and thus drive a supernova explosion.

Gravitational waves (GWs) from a coalescing pair of black holes were observed by LIGO on September 14, 2015. This first direct detection of GWs opened up the new field of GW astronomy. Numerical relativity simulations of coalescing binary systems (and other GW sources) provide essential GW signal predictions for LIGO – these predictions are needed for comparison with GW observations to extract the astrophysical parameters of the sources (e.g., the masses and spins of the black holes) and for testing general relativity.

Parallel scaling and efficiency on heterogeneous architectures are key requirements for petascale (and future exascale) simulation codes. Many current computational astrophysics and numerical relativity codes use traditional data-parallel approaches and sequential execution models that suffer from load-balancing issues, allow no or only limited execution concurrency, and make it difficult to benefit from accelerators. High-order finite-difference and finite-volume methods require the communication of three or more subdomain boundary points in each direction, creating large communication overheads. The Discontinuous Galerkin (DG) method can drastically reduce the number of points that must be communicated. DG, in combination with a task-based parallelization strategy (which allows task concurrency, provides dynamic load balancing, and can hide latencies), may be the route forward for simulation codes.

METHODS & RESULTS

Making Magnetars

We employed our 3D general-relativistic MHD code Zelmani for our MHD turbulence simulations. Zelmani is open source, is based on the Cactus framework, and uses the Einstein Toolkit. We mapped our initial conditions from a full 3D adaptive mesh refinement simulation of rapidly rotating stellar collapse with an initial magnetic field of 10^10 Gauss. We then carried out simulations at four different resolutions (500 m, 200 m, 100 m, 50 m), where the two highest resolutions are sufficient to resolve the key driver of turbulence in this system, the MRI. The 50 m simulation was run on 132,768 Blue Waters cores, used 7.8 TB of runtime memory, and created 500 TB of output data.
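As a rough illustration of this resolution requirement (a back-of-the-envelope sketch; the field strength, density, rotation rate, and points-per-wavelength threshold below are illustrative assumptions, not values from our simulations), the wavelength of the fastest-growing MRI mode follows from the standard estimate lambda ~ 2*pi*v_A/Omega, and a grid resolves the mode only if several zones fit per wavelength:

import math

def mri_wavelength(B_gauss, rho_cgs, omega):
    """Fastest-growing MRI mode wavelength in cm (order-of-magnitude).

    Standard estimate: lambda ~ 2*pi*v_A/Omega with Alfven speed
    v_A = B/sqrt(4*pi*rho) in Gaussian cgs units.
    """
    v_alfven = B_gauss / math.sqrt(4.0 * math.pi * rho_cgs)
    return 2.0 * math.pi * v_alfven / omega

# Illustrative proto-neutron-star conditions (assumptions for this sketch):
B = 1.0e14       # locally amplified magnetic field [Gauss]
rho = 1.0e12     # density in the shear layer [g/cm^3]
omega = 1.0e3    # angular velocity [rad/s]

lam = mri_wavelength(B, rho, omega)
points_needed = 10   # rule-of-thumb zones per wavelength

for dx_m in (500.0, 200.0, 100.0, 50.0):   # the four grid spacings used
    dx_cm = dx_m * 100.0
    status = "resolved" if lam / dx_cm >= points_needed else "unresolved"
    print(f"dx = {dx_m:5.0f} m: lambda/dx = {lam / dx_cm:5.1f} -> {status}")

With these illustrative numbers the wavelength comes out near 1.8 km, so only the 100 m and 50 m grids place ten or more zones per MRI wavelength, consistent with the statement above that the two highest resolutions resolve the instability.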
Previous local simulations have shown that the instability creates small patches of magnetar-strength magnetic field, but they have not been able to address whether these patches can be turned into the global, dynamically relevant toroidal magnetic field structure needed to power an explosion.

Our simulations show that not only does the MRI produce local magnetar-strength magnetic field, but a subsequent dynamo process connects this local field into a global field that is able to support an explosion. This dynamo process, which is akin to local storm cells merging to form large-scale storm patterns like a hurricane, works extremely efficiently, and we predict the formation of a large-scale toroidal field independent of the initial magnetization of the progenitor star. Figure 1 shows an example magnetic field configuration obtained in our highest-resolution simulation.

FIGURE 1: Volume rendering of the toroidal magnetic field strength in a rapidly rotating core-collapse supernova. The simulation shows a forming magnetar as the turbulent engine of the explosion. Colors represent magnetic field strength: yellow and light blue indicate magnetar-strength positive/negative field, and red and blue indicate weaker field. Image credit: Robert R. Sisneros (NCSA) and Philipp Mösta (UC Berkeley).

Our simulations demonstrate that rapidly rotating stellar collapse is a viable formation channel for magnetars. Magnetars are abundantly observed in the universe and have recently gained a lot of attention as possible engines driving hypernovae and superluminous supernovae, some of which are also connected to long gamma-ray bursts.

Binary Black Hole Waveforms for Astronomy with LIGO

We simulate binary black hole (BBH) coalescence using the Spectral Einstein Code (SpEC). SpEC uses multi-domain pseudospectral methods to discretize Einstein's equations. Pseudospectral methods provide exponential convergence and are optimal for simulating gravitational fields, which are guaranteed to be smooth. The black hole interiors are excised from the computational grid to avoid singularities.
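The payoff of pseudospectral discretization for smooth fields can be seen in a generic one-dimensional example (a toy demonstration, not SpEC itself): differentiating a smooth periodic function with a Fourier collocation method converges exponentially with the number of grid points, while a second-order finite-difference stencil converges only algebraically:

import numpy as np

def fourier_derivative(f_vals):
    """Spectral derivative of periodic samples on [0, 2*pi)."""
    n = f_vals.size
    ik = 1j * np.fft.fftfreq(n, d=1.0 / n)   # i * integer wavenumbers
    return np.real(np.fft.ifft(ik * np.fft.fft(f_vals)))

def fd2_derivative(f_vals, dx):
    """Second-order centered finite difference with periodic wrap-around."""
    return (np.roll(f_vals, -1) - np.roll(f_vals, 1)) / (2.0 * dx)

f = lambda x: np.exp(np.sin(x))               # smooth periodic test function
df = lambda x: np.cos(x) * np.exp(np.sin(x))  # exact derivative

for n in (8, 16, 32, 64):
    x = 2.0 * np.pi * np.arange(n) / n
    err_spec = np.max(np.abs(fourier_derivative(f(x)) - df(x)))
    err_fd = np.max(np.abs(fd2_derivative(f(x), 2.0 * np.pi / n) - df(x)))
    print(f"n = {n:3d}: spectral error = {err_spec:.1e}, "
          f"2nd-order FD error = {err_fd:.1e}")

Doubling n reduces the finite-difference error by roughly a factor of four, while the spectral error collapses to round-off within a few doublings; this is why a pseudospectral code needs far fewer points per dimension for the smooth vacuum spacetimes of BBH simulations.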


An astrophysical binary black hole (BBH) system has seven parameters: the mass ratio MA/MB and two spin vectors with three components each. The total mass of the system is a scale factor that can be removed from theoretical predictions, but it can be inferred from the frequency of the observed waves. Using LIGO data and approximate analytic models for BBH gravitational waveforms, some of the BBH parameters have been measured for the BBH system that LIGO observed. To improve these measurements, numerical relativity waveforms can be used. However, Bayesian parameter estimation algorithms need millions of waveforms as they sample the parameter space near the detected signal. This is problematic because each numerical relativity simulation takes weeks to months to complete.

To address this problem, we have developed a technique to interpolate between numerical relativity waveforms that we did simulate to construct waveforms at parameters that we did not simulate. The interpolant waveform can be computed in a fraction of a second (compared to the tens to hundreds of thousands of central processing unit hours required to construct such a waveform using numerical relativity). Furthermore, the interpolant can be made indistinguishable from the results of numerical relativity to within a specifiable error tolerance. It can thus serve as a surrogate model that replaces additional numerical relativity simulations. This error tolerance dictates how many numerical relativity simulations are needed – typically many orders of magnitude fewer than the millions that are needed for parameter estimation – and a greedy algorithm determines where in parameter space these simulations must be performed.
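The essence of such a greedy selection can be sketched with a toy one-parameter waveform family (the function toy_waveform and all numbers below are invented for illustration; the real surrogate is built from SpEC waveforms in the full seven-dimensional parameter space): starting from one waveform, repeatedly add the parameter point whose waveform is worst represented by the current basis until the projection error drops below the chosen tolerance.

import numpy as np

def toy_waveform(q, t):
    """Stand-in 'waveform' parameterized by a single number q (illustrative)."""
    return np.sin((1.0 + 0.5 * q) * t) * np.exp(-0.1 * t)

def greedy_basis(params, t, tol):
    """Greedily select parameters until all waveforms are represented to tol."""
    train = np.array([toy_waveform(q, t) for q in params])
    basis = train[[0]] / np.linalg.norm(train[0])   # seed with first waveform
    chosen = [0]
    while True:
        coeffs = train @ basis.T                    # project onto current basis
        residual = train - coeffs @ basis
        errs = np.linalg.norm(residual, axis=1)     # per-waveform projection error
        worst = int(np.argmax(errs))
        if errs[worst] < tol:
            return chosen
        chosen.append(worst)                        # add worst-represented point
        new_vec = residual[worst] / errs[worst]     # Gram-Schmidt extension
        basis = np.vstack([basis, new_vec])

t = np.linspace(0.0, 30.0, 2000)
qs = np.linspace(1.0, 10.0, 200)                    # 200 candidate parameters
selected = greedy_basis(qs, t, tol=1e-6)
print(f"{len(selected)} of {len(qs)} waveforms suffice at this tolerance")

The greedy loop is what keeps the number of expensive simulations small: instead of sampling the parameter space densely, it requests a new waveform only where the current basis represents the family worst.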
Motivated by LIGO's detection, we have carried out target-of-opportunity SpEC BBH simulations using Blue Waters essentially as a high-throughput capacity computing facility to run hundreds of BBH simulations, each using only two or three nodes. Figure 2 shows the simulated points of the parameter space. In total, we simulated 286 BBH systems, and each was simulated with three different resolutions to estimate the truncation error. Using these waveforms, we built a surrogate model that outperforms previous analytic waveform models and enables more accurate parameter estimation.

FIGURE 2: Parameters of the 286 binary black hole systems simulated for LIGO. The location of each circle indicates the mass ratio and dimensionless spin magnitudes for a system. The arrows indicate the direction of spin of each black hole (blue arrows for hole A, the larger black hole; red arrows for hole B) and are scaled by the magnitude of the spin.

A next-generation simulation code: SpECTRE

SpECTRE is a new numerical relativity and general-relativistic hydrodynamics code that uses the DG method with a task-based parallelization strategy facilitated by the Charm++ parallel programming system. DG and task-based parallelization make a promising combination that will allow multi-physics applications to be treated both accurately and efficiently. On Blue Waters, SpECTRE now scales to more than 128,000 cores on a relativistic MHD test case simulating the interactions of shocks (Fig. 3). The Charm++ task-based parallelism library includes a dynamic runtime system to assign task ordering as well as communication optimizations tailored to Cray's Gemini network.

FIGURE 3: Strong scaling of the next-generation general-relativistic MHD code SpECTRE for an MHD test problem on Blue Waters.

Building on this success, we are now adding the remaining infrastructure and physics to simulate realistic astrophysical systems. To tackle the challenging multi-scale problems that naturally arise in such systems, we are adding local time-stepping techniques and adaptive mesh refinement strategies that either split the elements into smaller elements or increase the number of basis functions in an element.

WHY BLUE WATERS

Our MHD turbulence simulations that resolve the MRI would have been completely impossible without Blue Waters' capability. Blue Waters was essential for these simulations and directly facilitated the breakthrough in our understanding of how magnetar-strength magnetic field can be generated. Blue Waters' capacity is crucial for a quick turnaround in our target-of-opportunity numerical relativity BBH simulations. No other machine allows us to run hundreds of simulations concurrently to generate the theoretical predictions needed for LIGO. Finally, Blue Waters is a highly valuable development and testing resource for our next-generation code SpECTRE. We could not develop this code without being able to test it at scale on Blue Waters.

NEXT GENERATION WORK

We are working with high intensity on our new code SpECTRE, which we plan to deploy on a next-generation Track-1 system for more detailed, higher-resolution, and longer (in physical time) simulations of core-collapse supernovae, magnetars, and binary systems of black holes and neutron stars. Present simulations, even with Blue Waters, are limited in accuracy and, importantly, in the physical time they can track. Our magnetar MHD simulation, for example, could cover only tens of milliseconds of the magnetar's life, while we would need to simulate at least a hundred times longer to fully study its magnetic field evolution and its impact on the supernova explosion.

PUBLICATIONS AND DATA SETS

Mösta P., et al., A large-scale dynamo and magnetoturbulence in rapidly rotating core-collapse supernovae, Nature, 528:376 (2015), doi: 10.1038/nature15755.

Radice D., et al., Neutrino-driven Convection in Core-collapse Supernovae: High-resolution Simulations, Astrophys. J., 820:76 (2016), doi: 10.3847/0004-637X/820/1/76.

Radice D., et al., The One-Armed Spiral Instability in Neutron Star Mergers and its Detectability in Gravitational Waves, Phys. Rev. D, arXiv:1603.05726 (2016).

Radice D., et al., Dynamical Mass Ejection from Binary Neutron Star Mergers, Mon. Not. Roy. Ast. Soc., arXiv:1601.02426 (2016).

Bernuzzi S., et al., How Loud Are Neutron Star Mergers?, Mon. Not. Roy. Ast. Soc., arXiv:1512.06397 (2016).
