Indra: a Public Computationally Accessible Suite of Cosmological N-Body Simulations


Indra: a Public Computationally Accessible Suite of Cosmological N-Body Simulations

MNRAS 000, 1–12 (2021). Preprint 5 August 2021. Compiled using MNRAS LaTeX style file v3.0. © 2021 The Authors

Bridget Falck,1* Jie Wang,2,3 Adrian Jenkins,4 Gerard Lemson,1 Dmitry Medvedev,1 Mark C. Neyrinck,5,6 Alex S. Szalay1

1 Department of Physics and Astronomy, Johns Hopkins University, 3400 N Charles St, Baltimore, MD 21218, USA
2 National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100012, China
3 School of Astronomy and Space Science, University of Chinese Academy of Sciences, Beijing 100049, China
4 Institute for Computational Cosmology, Department of Physics, Durham University, Durham DH1 3LE, UK
5 Ikerbasque, the Basque Foundation for Science, 48009 Bilbao, Spain
6 Department of Physics, University of the Basque Country UPV/EHU, 48080 Bilbao, Spain

* E-mail: [email protected]

Accepted XXX. Received YYY; in original form ZZZ. Draft version 5 August 2021.

ABSTRACT

Indra is a suite of large-volume cosmological N-body simulations with the goal of providing excellent statistics of the large-scale features of the distribution of dark matter. Each of the 384 simulations is computed with the same cosmological parameters and different initial phases, with 1024³ dark matter particles in a box of length 1 h⁻¹ Gpc, 64 snapshots of particle data and halo catalogs, and 505 time steps of the Fourier modes of the density field, amounting to almost a petabyte of data. All of the Indra data are immediately available for analysis via the SciServer science platform, which provides interactive and batch computing modes, personal data storage, and other hosted data sets such as the Millennium simulations and many astronomical surveys.
We present the Indra simulations, describe the data products and how to access them, and measure ensemble averages, variances, and covariances of the matter power spectrum, the matter correlation function, and the halo mass function to demonstrate the types of computations that Indra enables. We hope that Indra will be both a resource for large-scale structure research and a demonstration of how to make very large datasets public and computationally accessible.

arXiv:2101.03631v2 [astro-ph.CO] 3 Aug 2021

Key words: dark matter – large-scale structure of Universe – methods: numerical

1 INTRODUCTION

Understanding the observations made by large-scale structure surveys such as Euclid, Roman, DESI, and the VRO requires input from theoretical predictions of structure formation in the form of numerical simulations. Cosmological N-body simulations solve for the nonlinear gravitational collapse of matter and are indispensable tools both for testing cosmological models and for planning for, and analyzing, observations. For example, simulations are necessary to predict the large-scale structure observables of dark energy and modified gravity theories in the non-linear regime (e.g. Joyce et al. 2016; Winther et al. 2015), which can be compared to simulations of the current ΛCDM (cold dark matter with a cosmological constant, Λ) cosmological paradigm. Simulations have also become an integral part of the analysis of measurements of baryon acoustic oscillations (BAO) from galaxy redshift surveys (e.g. Percival et al. 2014).

One of the pressing challenges when making inferences from large observational surveys is that we can only observe the Universe from one vantage point in a finite volume. This "cosmic variance" places limits on the precision with which we can measure statistical fluctuations on very large scales. This also presents a problem for simulations: we cannot simulate exactly the positions of the galaxies in this observational volume. We can, however, simulate an ensemble of universes, increasing the precision with which we can numerically predict the statistical properties (e.g., the covariance matrix) of the matter distribution on large scales for a given cosmological model.

Ideally, one would want to simulate the full volume and resolution that the next generation of galaxy surveys will attain, repeat this thousands of times for different realizations of the same cosmological model and parameters, run this ensemble for thousands of combinations of cosmological parameters in a given model, and repeat the entire process for any number of cosmological models one expects the observations to test. Such a scenario has obvious drawbacks: it would require a prohibitively expensive amount of computing power and produce an excessive amount of data which would be impossible to analyze without equally massive computing resources. Because of these immense technical challenges, workarounds are being devised: cosmological emulators (Heitmann et al. 2014, 2016; Garrison et al. 2018; DeRose et al. 2019; Nishimichi et al. 2019) attempt to span cosmological parameter space with a few full simulations and interpolate between them; low-resolution particle mesh simulations attempt to capture features only at large scales (Takahashi et al. 2009; Blot et al. 2015); approximate simulations combine numerical shortcuts with analytic solutions to produce fast and cheap ensembles of realizations with a limited range of accuracy (see Chuang et al. 2015; Lippich et al. 2019, and references therein); and analytic methods attempt to model the statistics in order to reduce the number of full realizations required to obtain the same precision (Pope & Szapudi 2008; Taylor et al. 2013; Dodelson & Schneider 2013; Heavens et al. 2017).

This paper aims to address the technical challenges head-on by producing a suite of cosmological N-body simulations called Indra¹, hosted on SciServer², a science platform that allows the community to perform analysis where the data are stored. This method of server-side analysis was pioneered in astronomy by the Sloan Digital Sky Survey (SDSS) SkyServer (Szalay et al. 2000), which enabled astronomers to interactively explore a multi-terabyte database by leveraging state-of-the-art archiving technologies. Modelled on the success of the SkyServer, the Millennium Simulation Database (Lemson & Virgo Consortium 2006) made available the post-processing outputs of the Millennium simulation (Springel et al. 2005), including halo catalogs, merger trees, and later, semi-analytic galaxy catalogs (see also Riebe et al. 2011; Loebman et al. 2014; Bernyk et al. 2016). The dark matter particle positions and velocities, however, were not in the database, as their large size (roughly 20 TB) and usage patterns (being point clouds instead of object catalogs) presented significant technical challenges.

Though database technology has seen significant advances in the past decade, our early efforts to build a relational database for Indra were motivated by reducing the number of rows in a table: the planned full suite of 512 simulations amounts to 35 trillion output particles. We considered storing particles in spatially-sorted array-like "chunks" (Dobos et al. 2012) and developed an inverted indexing scheme (Crankshaw et al. 2013) to access individual particles in these arrays (e.g. to track their movement across many snapshots). However, many technical advances opened up the capability to release data in their native binary file format, without needing to first load them into relational databases, by mounting the data to contained compute environments running on virtual machines. Indra takes advantage of the SciServer science platform, built at Johns Hopkins University where the data are stored, which provides interactive and batch mode server-side analysis in various environments, public and private file storage in both flat files and databases, and many collaboration tools (Taghizadeh-Popp et al. 2020).

[…] redshifts and merger histories of dark matter halos are able to be built. We describe the simulations and their data products in Section 2, including details on how the initial conditions were created and an issue that affected the first 128 simulations. In Section 3, we measure ensemble averages, variances, and covariances from the three Indra data products: the dark matter particles, the halo catalogs, and the Fourier modes of the density field. In the appendix we describe how the full suite of simulations can be accessed and analyzed by the community.

¹ Recent advances have seen some tera- and peta-scale cosmological simulation suites that provide their data for download, but with no […]

2 THE INDRA SIMULATIONS

2.1 Overview

The Indra suite of simulations were run using the code L-Gadget2 (Springel 2005), each with a box length of 1 h⁻¹ Gpc, 1024³ particles, and WMAP7 cosmological parameters (Ω_m = 0.272, Ω_Λ = 0.728, h = 0.704, σ_8 = 0.81, n_s = 0.967) (Komatsu et al. 2011). They are purely gravitational dark-matter-only cosmological simulations; each dark matter particle has a mass of m_p = 7.031 × 10¹⁰ h⁻¹ M⊙. The force softening length is 40 h⁻¹ kpc, and the parameters used for the force accuracy, time integration, and tree criteria are the same as used for the Millennium Run (Springel et al. 2005), which used the same lean version of Gadget. Initial conditions were generated using second-order Lagrangian perturbation theory (2LPT; Scoccimarro 1998; Jenkins 2010), with pre-initial conditions on a regular grid and a starting redshift of z = 127. 64 snapshots are saved down to z = 0 at the same redshifts as the Millennium Run. In general the box size, particle number, and other parameters of Indra are geared toward resolving halos more massive than that of the Milky Way and obtaining good statistics on BAO scales, while saving snapshots at a large number of redshifts and not exceeding a petabyte of data. Since Indra is intended to be a public resource, we hope it can serve a variety of cosmological applications.

The Indra simulations can be identified using three digits that range from 0 to 7, which we denote as x_y_z, corresponding to the coordinates of an 8 × 8 × 8 cube. Note however that each simulation is fully independent and obeys periodic boundary conditions in a (1 h⁻¹ Gpc)³ volume; see Section 2.3 for details about how the initial phases were set up.
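The quoted particle mass follows directly from the box size, particle count, and Ω_m via m_p = Ω_m ρ_crit L³ / N³. A minimal sketch of that arithmetic; the critical-density constant 2.775 × 10¹¹ h² M⊙ Mpc⁻³ is a standard textbook value, not taken from the paper:

```python
# Check that the Indra particle mass follows from the quoted box size,
# particle count, and Omega_m: m_p = Omega_m * rho_crit * L^3 / N^3.
# With rho_crit in h^2 Msun / Mpc^3 and L in Mpc/h, the h factors
# combine so that m_p comes out in Msun/h.

OMEGA_M = 0.272          # WMAP7 matter density parameter
RHO_CRIT = 2.775e11      # critical density in h^2 Msun / Mpc^3 (standard value)
L_BOX = 1000.0           # box length in Mpc/h (i.e. 1 Gpc/h)
N_SIDE = 1024            # particles per dimension

m_p = OMEGA_M * RHO_CRIT * L_BOX**3 / N_SIDE**3
print(f"particle mass: {m_p:.4g} Msun/h")  # ~7.03e10, matching the paper
```

The agreement with the quoted 7.031 × 10¹⁰ h⁻¹ M⊙ is a useful consistency check when adapting the suite's parameters to other box sizes or particle counts.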
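The x_y_z labels above are equivalent to a flat index over the 8 × 8 × 8 = 512 planned runs. A sketch of the conversion, assuming row-major ordering with x varying slowest; the paper does not specify an ordering, so these helper names and that convention are hypothetical:

```python
def run_to_xyz(run: int) -> tuple[int, int, int]:
    """Map a flat run index 0..511 to (x, y, z) digits, each 0..7.

    Hypothetical helper: assumes x varies slowest (row-major order).
    """
    if not 0 <= run < 512:
        raise ValueError("run index must be in 0..511")
    return run // 64, (run // 8) % 8, run % 8


def xyz_to_run(x: int, y: int, z: int) -> int:
    """Inverse mapping: (x, y, z) digits back to the flat index."""
    return 64 * x + 8 * y + z


# e.g. run 137 corresponds to simulation 2_1_1 under this ordering
print(run_to_xyz(137))  # (2, 1, 1)
```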
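The ensemble statistics the paper measures in Section 3 amount to averaging a statistic over the independent realizations and estimating its covariance across them. A minimal NumPy sketch on synthetic stand-in data; the array shapes and names are illustrative, not the Indra file format:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for measured power spectra: one row per realization, one
# column per k bin (Indra would supply 384 real measurements per bin).
n_runs, n_bins = 384, 50
pk = rng.normal(loc=1.0, scale=0.05, size=(n_runs, n_bins))

pk_mean = pk.mean(axis=0)          # ensemble average per k bin
pk_cov = np.cov(pk, rowvar=False)  # (n_bins, n_bins) covariance matrix
pk_var = np.diag(pk_cov)           # per-bin variance

# Correlation matrix, often plotted to show coupling between k bins
pk_corr = pk_cov / np.sqrt(np.outer(pk_var, pk_var))
```

With many independent boxes the off-diagonal covariance is estimated directly from the scatter between realizations, which is exactly what a single large simulation cannot provide.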