DISCOVER CYBERINFRASTRUCTURE
TeraShake: Simulating "The Big One" (COVER STORY)


San Diego Supercomputer Center
SPECIAL ISSUE FOR SC2004

TERASHAKE EARTHQUAKE SIMULATION
Ground surface velocity predicted for earthquake waves in the large-scale TeraShake simulation of a magnitude 7.7 earthquake on the southern San Andreas fault. The simulation ran for more than four days on SDSC's 10-teraflops IBM DataStar supercomputer, producing an unprecedented 47 terabytes of data. The figure displays detailed fault-parallel velocity, showing the intense back-and-forth shaking predicted, with blue indicating upward motion and red indicating downward motion. The north-to-south fault rupture (thick white line) produced especially intense ground motion in the sediment-filled basin of the Coachella Valley, which "shook like a bowl of jelly," reaching peak velocities of more than 2 meters per second. TeraShake research will advance basic earthquake science and eventually help design more earthquake-resistant structures. (Credit: SCEC/CME, Geoffrey Ely, UCSD)

COVER STORY
TeraShake: Simulating the "Big One" on the San Andreas Fault
SDSC cyberinfrastructure models a magnitude 7.7 earthquake in unprecedented detail.

THE SAN DIEGO SUPERCOMPUTER CENTER
Founded in 1985, the San Diego Supercomputer Center (SDSC) has a long history of enabling science and engineering discoveries. Continuing this legacy into the next generation, SDSC's mission is to "extend the reach" of researchers and educators by serving as a core resource for cyberinfrastructure—providing them with high-end hardware technologies, integrative software technologies, and deep interdisciplinary expertise. SDSC is an organized research unit of the University of California, San Diego and is primarily funded by the National Science Foundation (NSF). With a staff of more than 400 scientists, software developers, and support personnel, SDSC is an international leader in data management, grid computing, biosciences, geosciences, and visualization.

SDSC INFORMATION
Fran Berman, Director
Vijay Samalam, Executive Director
Greg Lund, Director of Communications
San Diego Supercomputer Center
University of California, San Diego
9500 Gilman Drive MC 0505
La Jolla, CA 92093-0505
Phone: 858-534-5000
Fax: 858-534-5152
[email protected]
www.sdsc.edu

ENVISION
enVision magazine is published by SDSC and presents leading-edge research in cyberinfrastructure and computational science. ISSN 1521-5334.
EDITOR: Greg Lund, [email protected], 858-534-8314
DESIGNER: Gail Bamber, [email protected], 858-534-5150
Any opinions, conclusions, or recommendations in this publication are those of the author(s) and do not necessarily reflect the views of NSF, other funding organizations, SDSC, or UC San Diego. All brand names and product names are trademarks or registered trademarks of their respective holders.

FROM THE DIRECTOR
Delivering Cyberinfrastructure, Enabling Discovery
In the science and engineering community, there's a lot of discussion about cyberinfrastructure these days.

FEATURES
A New Star for Cyberinfrastructure
SDSC's DataStar supercomputer tackles tough problems from molecules to galaxies.
SDSC Helps Students Make Beautiful Music
Storage Resource Broker harmonizes music education in the SDSU EdCenter project.
SDSC Technologies Power SIOExplorer Oceanography Digital Library
Cyberinfrastructure to rescue legacy data and harvest shipboard data in real time.
THE BACK COVER
GEON Tools Reveal Earth's Interior
Tools developed in the NSF GEON project "Cyberinfrastructure for the Geosciences" help geoscientists "see" into the Earth.

From the Director
Delivering Cyberinfrastructure, Enabling Discovery
by Francine Berman, Director, San Diego Supercomputer Center
(Photo: Alan Decker)

In the science and engineering community, there's a lot of discussion about cyberinfrastructure these days. For the record, cyberinfrastructure is the organized aggregate of technologies that enable us to access and integrate today's information technology resources—data and storage, computation, communication, visualization, networking, scientific instruments, expertise—to facilitate science and engineering goals. Cyberinfrastructure captures the culture of modern science and engineering research and provides the technological foundation for significant discovery, synthesis, and dissemination brought about by the Information Revolution.

It is widely believed in the science and engineering community, and compellingly described in the 2003 Report of the NSF Blue Ribbon Advisory Panel on Cyberinfrastructure (the "Atkins Report"), that achievement of the cyberinfrastructure vision has the power to transform science and engineering research and education. And the key to achievement of the cyberinfrastructure vision is the planning, delivery, and commitment to make cyberinfrastructure, and the cyberscience it will enable, a reality.

"Cyberinfrastructure ultimately will be evaluated not by the success of its vision, but by the success of the infrastructure that is delivered to the user and the enabling of cyberscience."

Last spring, SDSC responded to an NSF "Dear Colleague" letter for "Core" funding to evolve its activities from a PACI Leading Edge Site to a Cyberinfrastructure Center. The new core funding response gave SDSC an opportunity to think deeply about the challenges and opportunities of cyberinfrastructure and the enabling of cyberscience, and to develop a comprehensive and cohesive action plan to build and deliver enabling cyberinfrastructure. SDSC's Core funding response focused on three key themes critical to achievement of the cyberinfrastructure vision:

SERVICE
The "customers" and beneficiaries of cyberinfrastructure are its users, and the ultimate metric of success for cyberinfrastructure is their ability to focus on the challenges of the science rather than the challenges of using the tools driving new discovery. In the response, SDSC described a broad spectrum of user-oriented cyberinfrastructure services that will enable users to focus on the science. Particular efforts will be made to help users coordinate the technological tools at their disposal in their local "home" environment with the larger landscape of cyberinfrastructure resources.

INTEGRATION
Cyberinfrastructure involves the integration of a wide variety of human, software, and hardware systems to form a powerful platform for enabling discovery. The coordination, synthesis, and teaming required to provide cohesive support embodies both the promise and the challenge of the cyberinfrastructure vision. SDSC's response described thematic, organizational, and management venues for the integration and synthesis of science and engineering goals and cyberinfrastructure technologies. Integration activities included extensive user-focused software and services, community collaboration projects that bridge science and technology goals, and the creation of a joint SDSC/Cal-IT2 Synthesis Center for leveraging infrastructure across projects and disciplines.

LEADERSHIP
The achievement of the cyberinfrastructure vision will require broad-based community leadership as well as substantial commitment and effort. Building on a generation of extensive work in data technologies and data science, the SDSC Core response focused on a comprehensive and coordinated set of activities to build out the "data stack," forming an integrated end-to-end Data Cyberinfrastructure. The Core response allowed SDSC to develop a thoughtful plan of action to build and deliver key services, infrastructure, and innovations that will contribute to the ultimate success of cyberinfrastructure, and to focus in on the critical areas of applications, software and services, databases and data technologies, networking and grids, and high performance computing and storage.

We are also increasingly engaging the key communities of computer scientists and social scientists as cyberinfrastructure "process builders," and are intensifying traditional activities to provide a rich environment for cyberinfrastructure's "end users."

There is an important opportunity with cyberinfrastructure to leverage the innovations of the commercial sector and their experience with infrastructure at scale. This autumn, we are initiating the SDSC Cyberinfrastructure Forum with the purpose of developing deep partnerships that will facilitate integration of innovative academic and commercial efforts to design, develop, and build cyberinfrastructure.

Cyberinfrastructure ultimately will be evaluated not by the success of its vision, but by the success of the infrastructure that is delivered to the user and the enabling of cyberscience. Cyberinfrastructure is a long-term commitment for NSF, SDSC, NCSA, PSC, ETF, and a broad community of collaborators, end-users, process-builders, scientists, engineers, researchers, and educators who can benefit from the implementation of this transforming vision. It will not happen without strategic planning, coordinated effort, resources, and commitment. In this issue of enVision, we inaugurate SDSC's initial "official" activities as a Cyberinfrastructure Center. With these activities and other critical enablers, the cyberinfrastructure vision holds the promise to achieve the potential of the Information Revolution, and to transform science and engineering in our time.

SDSC cyberinfrastructure enables detailed molecular dynamics simulations of drug-resistant HIV protease, helping scientists identify a potential mechanism. (Credit: Alex Perryman, Jung-Hsin Lin, Andrew McCammon, UCSD)