In Focus: Computing

Total Pages: 16

File Type: pdf, Size: 1020 KB

CERN Courier In Focus: Computing – a retrospective of 60 years' coverage of computing technology. cerncourier.com, 2019.

Cover features: COMPUTERS, WHY? 1972: CERN's Lew Kowarski on how computers worked and why they were needed. COMPUTING IN HIGH ENERGY PHYSICS 1989: the CHEP conference considered the data challenges of the LEP era. WEAVING A BETTER TOMORROW 1999: a snapshot of the web as it was beginning to take off. THE CERN OPENLAB 2003: Grid computing was at the heart of CERN's new industry partnership.

FROM THE EDITOR

Welcome to this retrospective supplement devoted to computing – the fourth in a series celebrating CERN Courier's 60th anniversary. The Courier grew up during the computing revolution. CERN's first computer, the Ferranti Mercury, which had less computational ability than an average pocket calculator has today, was installed in 1958, and early issues of the magazine carried helpful articles explaining how computers work and why they are important. As the reproduced articles in this issue show, it did not take long for computing to become a pillar of high-energy physics. The field has inspired many advances, most famously the web and the Grid, and must now pursue the most advanced technology developed elsewhere to meet the demands of future experiments. In our special foreword, software specialist Graeme Stewart sets out the need for physicists to prepare for a deluge of data in a fast-evolving technological and commercial landscape.

All of this year's special supplements, including those devoted to detector, vacuum and magnet technology, can be read at cerncourier.com under the section "In Focus".

Matthew Chalmers

Big impact: an early microprocessor chip. Data shift: CERN's Computer Centre in 1998, with banks of computing arrays for the major experiments.

IN THIS ISSUE

SPECIAL FOREWORD 5
CERN computing guru Graeme Stewart describes the immense task of taming future data volumes in high-energy physics.

COMPUTERS AT CERN 9
1964: with CERN about to be equipped with the most powerful computer in Europe, the data handling division considered the challenges that it faced.

COMPUTERS, WHY? 13
1972: in the year Intel's 8008 processor was launched and the compact disc invented, CERN's Lew Kowarski explained why computers were here to stay.

WEAVING A BETTER TOMORROW 16
1999: at 10 years old, the web was only just beginning to fulfil its potential.

THE MICROPROCESSOR BOOM 19
1979: this prescient article described how the microprocessor would soon become a routine part of the high-energy physicist's toolkit.

COMPUTING IN HIGH ENERGY PHYSICS 21
1989: the Computing in High Energy Physics conference weighed up the challenges of analysing data from LEP.

A MAJOR SHIFT IN OUTLOOK 25
2001: Ben Segal described how distributed UNIX boxes took over from CERN's all-powerful IBM and Cray mainframe workhorses.

THE CERN OPENLAB 27
2003: Grid computing was the theme of this feature about a new model for partnership between CERN and industry.

CERN'S ULTIMATE ACT OF OPENNESS 30
2019: the seed that led CERN to relinquish ownership of the web in 1993 was planted when the laboratory formally came into being.
Editor: Matthew Chalmers. Associate editor: Mark Rayner. Advisory board: Philippe Bloch, Roger Forty, Mike Lamont, Matthew Mccullough, Peter Jenni, Christine Sutton, Claude Amsler. Head of Media: Jo Allen. Head of Media Business Development: Ed Jost. Advertising sales: Tom Houlden, Chris Thomas (Tel +44 (0)117 930 1026 or +44 (0)117 930 1164; e-mail sales@cerncourier.com). Advertisement production: Katie Graham. Art direction: Frédérique Swist. Production editor: Ruth Leopold. Marketing and circulation: Laura Gillham. Published by CERN, 1211 Geneva 23, Switzerland; Tel +41 (0)22 767 61 11; e-mail [email protected]; general distribution courrier-adressage@cern.ch. Produced for CERN by IOP Publishing Ltd, Temple Circus, Temple Way, Bristol BS1 6HG, UK; Tel +44 (0)117 929 7481. Printed by Warners (Midlands) plc, Bourne, Lincolnshire, UK. © 2019 CERN. December 2019.

SPECIAL FOREWORD

Adapting to exascale computing

CERN's Graeme Stewart surveys six decades of computing milestones in high-energy physics and describes the immense challenges in taming the data volumes from future experiments in a rapidly changing technological landscape.

It is impossible to envisage high-energy physics without its foundation of microprocessor technology, software and distributed computing. Almost as soon as CERN was founded, the first contract to provide a computer was signed, but it took manufacturer Ferranti more than two years to deliver "Mercury", our first valve-based behemoth, in 1958. So early did this machine arrive that the venerable FORTRAN language had yet to be invented! A team of about 10 people was required for operations, and the I/O system was already a bottleneck. It was not long before faster and more capable machines were available at the lab. By 1963, an IBM 7090 based on transistor technology was available, with a FORTRAN compiler and tape storage. This machine could analyse 300,000 frames of spark-chamber data – a big early success. By the 1970s, computers were important enough that CERN hosted its first Computing and Data Handling School. It was clear that computers were here to stay.

By the time of the LEP era in the late 1980s, CERN hosted multiple large computers, and every time transistors got smaller, clock speeds could be ramped up. It was a golden age where more advanced machines, running ever faster, gave us an exponential increase in computing power. Key also to the computing revolution, alongside the hardware, was the growth of open-source software. The GNU project had produced many utilities that could be used by hackers and coders on which to base their own software. With the start of the Linux project to provide a kernel, humble PCs became increasingly capable machines for scientific computing. Around the same time, Tim Berners-Lee's proposal for the World Wide Web, which began as a tool for…

…the Worldwide LHC Computing Grid (WLCG). Not only would information need to be transferred, but huge amounts of data and millions of computer jobs would need to be moved and executed, all with a reliability that would support the LHC's physics programme. A large investment in brand-new grid technologies was undertaken, and software engineers and physicists in the experiments had to develop, deploy and operate a new grid system utterly unlike anything that had gone before. Despite rapid progress in computing power, storage space and networking, it was extremely hard to make a reliable, working distributed system for particle physics out of these pieces. Yet we achieved this incredible task.

Ever-evolving: the CERN data centre, photographed in 2016 (Bennett/CERN). Graeme Stewart is a software specialist in the CERN EP-SFT group. He is a member of the ATLAS experiment and coordinator of the High-Energy Physics Software Foundation.
Recommended publications
  • CERN Courier–Digital Edition
    CERN Courier – digital edition, March/April 2021. cerncourier.com: reporting on international high-energy physics.

    WELCOME. Welcome to the digital edition of the March/April 2021 issue of CERN Courier. Hadron colliders have contributed to a golden era of discovery in high-energy physics, hosting experiments that have enabled physicists to unearth the cornerstones of the Standard Model. This success story began 50 years ago with CERN's Intersecting Storage Rings (featured on the cover of this issue) and culminated in the Large Hadron Collider (p38) – which has spawned thousands of papers in its first 10 years of operations alone (p47). It also bodes well for a potential future circular collider at CERN operating at a centre-of-mass energy of at least 100 TeV, a feasibility study for which is now in full swing. Even hadron colliders have their limits, however. To explore possible new physics at the highest energy scales, physicists are mounting a series of experiments to search for very weakly interacting "slim" particles that arise from extensions of the Standard Model (p25). Also celebrating a golden anniversary this year is the Institute for Nuclear Research in Moscow (p33), while, elsewhere in this issue: quantum sensors target gravitational waves (p10); X-rays go behind the scenes of supernova 1987A (p12); a high-performance computing collaboration forms to handle the big-physics data onslaught (p22); Steven Weinberg talks about his latest work (p51); and much more. To sign up to the new-issue alert, please visit: http://comms.iop.org/k/iop/cerncourier To subscribe to the magazine, please visit: https://cerncourier.com/p/about-cern-courier Editor: Matthew Chalmers, CERN. Digital edition created by IOP Publishing. Cover headlines: Hadron colliders – 50 years of discovery; ATLAS spots rare Higgs decay; Weinberg on effective field theory; Hunting for WISPs.
  • Technical Details of the Elliott 152 and 153
    Appendix 1: Technical Details of the Elliott 152 and 153. Introduction. The Elliott 152 computer was part of the Admiralty's MRS5 (medium range system 5) naval gunnery project, described in Chap. 2. The Elliott 153 computer, also known as the D/F (direction-finding) computer, was built for GCHQ and the Admiralty as described in Chap. 3. The information in this appendix is intended to supplement the overall descriptions of the machines as given in Chaps. 2 and 3. A1.1 The Elliott 152. Work on the MRS5 contract at Borehamwood began in October 1946 and was essentially finished in 1950. Novel target-tracking radar was at the heart of the project, the radar being synchronized to the computer's clock. In his enthusiasm for perfecting the radar technology, John Coales seems to have spent little time on what we would now call an overall systems design. When Harry Carpenter joined the staff of the Computing Division at Borehamwood on 1 January 1949, he recalls that nobody had yet defined the way in which the control program, running on the 152 computer, would interface with guns and radar. Furthermore, nobody yet appeared to be working on the computational algorithms necessary for three-dimensional trajectory prediction. As for the guns that the MRS5 system was intended to control, not even the basic ballistics parameters seemed to be known with any accuracy at Borehamwood [1, 2]. A1.1.1 Communication and Data-Rate. The physical separation, between radar in the Borehamwood car park and digital computer in the laboratory, necessitated an interconnecting cable of about 150 m in length.
  • Exploring Collaborative Learning Methods in Leadership Development Programs Mary F
    Walden University ScholarWorks, Walden Dissertations and Doctoral Studies Collection, 2018. Exploring Collaborative Learning Methods in Leadership Development Programs, Mary F. Woods, Walden University. Follow this and additional works at: https://scholarworks.waldenu.edu/dissertations Part of the Business Administration, Management, and Operations Commons; the Management Sciences and Quantitative Methods Commons; and the Organizational Behavior and Theory Commons. This dissertation is brought to you for free and open access by the Walden Dissertations and Doctoral Studies Collection at ScholarWorks. It has been accepted for inclusion in Walden Dissertations and Doctoral Studies by an authorized administrator of ScholarWorks. For more information, please contact [email protected].

    Walden University, College of Management and Technology. This is to certify that the doctoral dissertation by Mary F. Woods has been found to be complete and satisfactory in all respects, and that any and all revisions required by the review committee have been made. Review Committee: Dr. Richard Schuttler, Committee Chairperson, Management Faculty; Dr. Keri Heitner, Committee Member, Management Faculty; Dr. Patricia Fusch, University Reviewer, Management Faculty. Chief Academic Officer: Eric Riedel, Ph.D. Walden University, 2018.

    Abstract. Exploring Collaborative Learning Methods in Leadership Development Programs, by Mary F. Woods, MA, Oakland City University, 2006; BS, Oakland City University, 2004. Dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy, Management, Walden University, May 2018. Collaborative learning as it pertained to leadership development was an obscured method of learning. There was little research addressing the attributes contributing to collaborative learning for leadership development in leadership development programs.
  • F5 Data Solutions Compatibility Matrix
    F5 Data Solutions Compatibility Matrix. Last revised: January 25, 2016. This document provides interoperability information intended for use as a reference guide in qualifying customers' environments for use with the F5 ARX Data Management Operating System 6.4.0, 6.x.x, 5.3.x, v5.2.x, v5.1.x, v5.0.x, v4.1.x, v3.2.x and F5 Data Manager v3.1.x, v3.0.x and v2.6.1. Use this document for planning purposes only, because hardware and software versions vary and numerous combinations are possible. The content of this document may change at any time. Please contact F5 Networks to review specific site requirements and questions. For ARX and Data Manager versions not listed in this document, and for storage systems and versions not covered in this solution matrix, please contact your F5 Networks representative or submit a Customer Special Request (CSR) to F5 Networks Data Solutions Product Management. F5 ARX and NAS / File-Server Compatibility: this section lists common storage systems and file servers, as well as their respective software versions, which have been tested and qualified for interoperability with the F5 ARX. These servers were tested with the F5 ARX series intelligent file virtualization devices running the Data Management Operating System. Please contact F5 Networks to review any NAS or file server option not listed in this section. First matrix entry – Platform: Amazon S3 (Amazon Simple Storage Services). Supported protocols: CIFS, NFS. Network snapshots: No. Virtual presentation volumes: Yes. Comments: CIFS requires ARX Cloud Extender v1.10.1.3 and ARX v5.1.9 or greater; NFS requires ARX Cloud Extender v1.10.1 and ARX v6.0.0 or v6.1.0 – see the deployment guide "F5 ARX DG for Amazon S3" for details.
  • Information Technology 3 and 4/1998. Emerging
    OCCASION This publication has been made available to the public on the occasion of the 50th anniversary of the United Nations Industrial Development Organisation. DISCLAIMER This document has been produced without formal United Nations editing. The designations employed and the presentation of the material in this document do not imply the expression of any opinion whatsoever on the part of the Secretariat of the United Nations Industrial Development Organization (UNIDO) concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries, or its economic system or degree of development. Designations such as “developed”, “industrialized” and “developing” are intended for statistical convenience and do not necessarily express a judgment about the stage reached by a particular country or area in the development process. Mention of firm names or commercial products does not constitute an endorsement by UNIDO. FAIR USE POLICY Any part of this publication may be quoted and referenced for educational and research purposes without additional permission from UNIDO. However, those who make use of quoting and referencing this publication are requested to follow the Fair Use Policy of giving due credit to UNIDO. CONTACT Please contact [email protected] for further information concerning UNIDO publications. For more information about UNIDO, please visit us at www.unido.org UNITED NATIONS INDUSTRIAL DEVELOPMENT ORGANIZATION Vienna International Centre, P.O. 
    Box 300, 1400 Vienna, Austria. Tel: (+43-1) 26026-0 · www.unido.org · [email protected]. EMERGING TECHNOLOGY SERIES 3 and 4/1998: Information Technology. UNITED NATIONS INDUSTRIAL DEVELOPMENT ORGANIZATION, Vienna, 2000. TO OUR READERS: of special interest in this issue of the Emerging Technology Series: Information Technology is the focus on a meeting that took place from 2 to 4 November 1998, in Bangalore, India.
  • Sixtrack V and Runtime Environment
    International Journal of Modern Physics A, Vol. 34, No. 36 (2019) 1942035 (17 pages). © World Scientific Publishing Company. DOI: 10.1142/S0217751X19420351. SixTrack V and runtime environment. R. De Maria, Beam Department (BE-ABP-HSS), CERN, 1211 Geneva 23, Switzerland, [email protected]; J. Andersson, V. K. Berglyd Olsen, L. Field, M. Giovannozzi, P. D. Hermes, N. Høimyr, S. Kostoglou, G. Iadarola, E. Mcintosh, A. Mereghetti, J. Molson, D. Pellegrini, T. Persson and M. Schwinzerl, CERN, 1211 Geneva 23, Switzerland; E. H. Maclean, CERN and University of Malta, Msida, MSD 2080, Malta; K. N. Sjobak, CERN and University of Oslo, Boks 1072 Blindern, 0316 Oslo, Norway; I. Zacharov, EPFL, Rte de la Sorge, 1015 Lausanne, Switzerland; S. Singh, Indian Institute of Technology Madras, IIT P.O., Chennai 600 036, India. Received 28 February 2019; revised 5 December 2019; accepted 5 December 2019; published 17 February 2020. SixTrack is a single-particle tracking code for high-energy circular accelerators routinely used at CERN for the Large Hadron Collider (LHC), its luminosity upgrade (HL-LHC), the Future Circular Collider (FCC) and the Super Proton Synchrotron (SPS) simulations. The code is based on a 6D symplectic tracking engine, which is optimized for long-term tracking simulations and delivers fully reproducible results on several platforms. It also includes multiple scattering engines for beam–matter interaction studies,
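The abstract stresses that SixTrack's engine is symplectic and optimized for long-term tracking. A toy sketch of why that matters (this is not SixTrack's 6D physics model; the 1D harmonic "lattice", step size and turn count are invented for illustration): a drift–kick style symplectic map keeps the tracked particle's energy bounded over very many steps, whereas a naive integrator would let it drift.

```python
# Toy 1D symplectic tracker in the spirit of the drift-kick maps used by
# single-particle tracking codes (illustrative only, not SixTrack's model).

def kick_drift(x, p, k, dt, n):
    """Symplectic-Euler map: kick from a linear force -k*x, then drift."""
    for _ in range(n):
        p -= k * x * dt   # kick: momentum update from the field
        x += p * dt       # drift: position update with the new momentum
    return x, p

def energy(x, p, k):
    """Hamiltonian of the toy oscillator."""
    return 0.5 * p * p + 0.5 * k * x * x

x0, p0 = 1.0, 0.0
e0 = energy(x0, p0, 1.0)
x, p = kick_drift(x0, p0, k=1.0, dt=0.01, n=100_000)
# Long-term tracking: the symplectic map keeps the energy error bounded.
print(abs(energy(x, p, 1.0) - e0) < 1e-2)  # -> True
```

The design point mirrored here is that a symplectic map conserves a nearby "shadow" Hamiltonian exactly, which is what makes million-turn accelerator simulations trustworthy.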
  • CERN Openlab VI Project Agreement “Exploration of Google Technologies
    CERN openlab VI Project Agreement, "Exploration of Google technologies for HEP", between The European Organization for Nuclear Research (CERN) and Google Switzerland GmbH (Google). CERN K-Number. Agreement start date: 01/06/2019. Duration: 1 year. Page 1 of 26. THE EUROPEAN ORGANIZATION FOR NUCLEAR RESEARCH ("CERN"), an Intergovernmental Organization having its seat at Geneva, Switzerland, duly represented by Fabiola Gianotti, Director-General, and Google Switzerland GmbH (Google), Brandschenkestrasse 110, 8002 Zurich, Switzerland, duly represented by [LEGAL REPRESENTATIVE], hereinafter each a "Party" and collectively the "Parties", CONSIDERING THAT: The Parties have signed the CERN openlab VI Framework Agreement on November 1st, 2018 ("Framework Agreement"), which establishes the framework for collaboration between the Parties in CERN openlab phase VI ("openlab VI") from 1 January 2018 until 31 December 2020 and which sets out the principles for all collaborations under CERN openlab VI; Google Switzerland GmbH (Google) is an industrial Member of openlab VI in accordance with the Framework Agreement; Article 3 of the Framework Agreement establishes that all collaborations in CERN openlab VI shall be established in specific Projects on a bilateral or multilateral basis and in specific agreements (each a "Project Agreement"); The Parties wish to collaborate in the "exploration of applications of Google products and technologies to High Energy Physics ICT problems related to the collection, storage and analysis of the data coming from the Experiments" under CERN openlab VI (hereinafter "Exploration of Google technologies for HEP"); AGREE AS FOLLOWS: Article 1, Purpose and scope. 1. This Project Agreement establishes the collaboration of the Parties in Exploration of Google technologies for HEP (hereinafter the "Project").
  • Alan Turingturing –– Computercomputer Designerdesigner
    Alan Turing – Computer Designer. Brian E. Carpenter, with input from Robert W. Doran, The University of Auckland, May 2012. Turing, the theoretician: Turing is widely regarded as a pure mathematician. After all, he was a B-star Wrangler (in the same year as Maurice Wilkes). "It is possible to invent a single machine which can be used to compute any computable sequence. If this machine U is supplied with the tape on the beginning of which is written the string of quintuples separated by semicolons of some computing machine M, then U will compute the same sequence as M." (1937) So how was he able to write Proposals for Development in the Mathematics Division of an Automatic Computing Engine (ACE) by the end of 1945? Let's read that carefully: "It is possible to invent a single machine which can be used to compute any computable sequence. If this machine U is supplied with the tape on the beginning of which is written the string of quintuples separated by semicolons of some computing machine M, then U will compute the same sequence as M." The founding statement of computability theory was written in entirely physical terms. What would it take? A tape on which you can write, read and erase symbols: Poulsen demonstrated magnetic wire recording in 1898. A way of storing symbols and performing simple logic: Eccles & Jordan patented the multivibrator trigger circuit (flip-flop) in 1919, and Rossi invented the coincidence circuit (AND gate) in 1930. Building U in 1937 would have been only slightly more bizarre than building a differential analyser with Meccano.
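The quoted 1937 statement is the universal machine in a nutshell: U runs any machine M given only M's description on its tape. A minimal sketch of that idea (the table-driven encoding and the example "flipper" machine are invented for illustration; this is not Turing's quintuple notation):

```python
# Minimal Turing-machine interpreter: the "machine M" is plain data
# (a transition table), and this one interpreter runs any such M --
# the essence of Turing's universal machine U.

def run(machine, tape, state, steps=1000):
    """machine: dict mapping (state, symbol) -> (write, move, next_state)."""
    tape = dict(enumerate(tape))  # sparse tape; unwritten cells read as '_'
    head = 0
    for _ in range(steps):
        key = (state, tape.get(head, '_'))
        if key not in machine:        # no applicable rule: halt
            break
        write, move, state = machine[key]
        tape[head] = write
        head += {'R': 1, 'L': -1}[move]
    return ''.join(tape[i] for i in sorted(tape)).strip('_')

# Example M: flip every bit, moving right until the first blank.
flipper = {
    ('s', '0'): ('1', 'R', 's'),
    ('s', '1'): ('0', 'R', 's'),
}

print(run(flipper, '1011', 's'))  # -> 0100
```

The point the slides make in prose appears directly in the code: `run` plays the role of U, `flipper` is one encoded M, and swapping in a different table changes the computation without touching the interpreter.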
  • Harnessing Green It
    HARNESSING GREEN IT: PRINCIPLES AND PRACTICES. Editors: San Murugesan, BRITE Professional Services and University of Western Sydney, Australia; G.R. Gangadharan, Institute for Development and Research in Banking Technology, India. A John Wiley & Sons, Ltd., Publication. This edition first published 2012, © 2012 John Wiley and Sons Ltd. Registered office: John Wiley & Sons Ltd, The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, United Kingdom. For details of our global editorial offices, for customer services and for information about how to apply for permission to reuse the copyright material in this book please see our website at www.wiley.com. The right of the author to be identified as the author of this work has been asserted in accordance with the Copyright, Designs and Patents Act 1988. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by the UK Copyright, Designs and Patents Act 1988, without the prior permission of the publisher. Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic books. Designations used by companies to distinguish their products are often claimed as trademarks. All brand names and product names used in this book are trade names, service marks, trademarks or registered trademarks of their respective owners. The publisher is not associated with any product or vendor mentioned in this book.
  • Digital Computer Newsletter
    DIGITAL COMPUTER NEWSLETTER. Office of Naval Research, Mathematical Sciences Division. Vol. 11, No. 3, July 1959. Gordon D. Goldstein, Editor; Jean S. Campbell, Asst. Editor. This newsletter is a medium for the interchange, among interested persons, of information concerning recent developments in various digital computer projects; distribution is to government agencies and contributors.

    TABLE OF CONTENTS. Computers and Data Processing, North America: 1. Ferranti Electric, Inc., Sirius, Hempstead, L.I., New York (p1). 2. Librascope, Inc., Libratrol-500, Glendale, California (p1). 3. Monroe Calculating Machine Company, Distributape, Orange, New Jersey (p2). Computing Centers: 1. Air Force Cambridge Research Center, Computer and Mathematical Sciences Laboratory, L.G. Hanscom Field, Bedford, Massachusetts (p2). 2. Georgia Institute of Technology, Rich Electronic Computer Center, Atlanta, Georgia (p2). 3. Holloman Air Force Base, New Mexico, Data Assimilator (p3). 4. National Bureau of Standards, Computation Laboratory, Washington, D.C. (p4). 5. New York University, AEC Computing and Applied Mathematics Center, New York, N.Y. (p4). 6. RCA Service Company, FLAC I and IBM 709, Patrick Air Force Base, Florida (p5). 7. U.S. Naval Missile Center, 709 and RAYDAC Systems, Point Mugu, California (p5). 8. U.S. Naval Proving Ground, Naval Ordnance Computation Center, Dahlgren, Virginia (p6). 9.
  • Particle Tracking and Identification on Multi- and Many-Core Hardware
    PARTICLE TRACKING AND IDENTIFICATION ON MULTI- AND MANY-CORE HARDWARE PLATFORMS, by Plácido Fernández Declara. A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Computer Science and Technology, Universidad Carlos III de Madrid. Advisors: José Daniel García Sánchez, Niko Neufeld. September 2020. This thesis is distributed under the license "Creative Commons Attribution – Non Commercial – Non Derivatives". To my parents. A mis padres. "We have seen that computer programming is an art, because it applies accumulated knowledge to the world, because it requires skill and ingenuity, and especially because it produces objects of beauty." — Donald E. Knuth [87]. ACKNOWLEDGMENTS. Since I finished my master's degree and started working in the aerospace industry with GMV, I have been interested in pursuing a PhD. But I did not find the right challenge or topic I was looking for. One day I received a call from my professor from the Computer Science and Engineering degree, and director of the master's degree I would later study in Leganés. He let me know about an open position to do a PhD in High Performance Computing at CERN, in Switzerland. It was just what I was looking for: an incredible computing challenge, working on the Large Hadron Collider. I applied for it and, some interviews later, I got the job. I have to thank first and foremost my parents. It is thanks to them that I was able to pursue any dream and challenge I ever imagined, and I am ever thankful for their love and support. To my love and friend Afri, who helps me overcome any situation and makes any adventure an amazing journey to enjoy life.
  • Analysis and Optimization of the Offline Software of the ATLAS Experiment at CERN
    Technische Universität München, Fakultät für Informatik, Informatik 5 – Fachgebiet Hardware-nahe Algorithmik und Software für Höchstleistungsrechnen. ANALYSIS AND OPTIMIZATION OF THE OFFLINE SOFTWARE OF THE ATLAS EXPERIMENT AT CERN. Robert Johannes Langenberg. Complete reprint of the dissertation approved by the Fakultät für Informatik of the Technische Universität München for the award of the academic degree of Doktor der Naturwissenschaften (Dr. rer. nat.). Chair: Prof. Bernd Brügge, PhD. Examiners: 1. Prof. Dr. Michael Georg Bader, 2. Prof. Dr. Siegfried Bethke. The dissertation was submitted to the Technische Universität München on 03.07.2017 and accepted by the Fakultät für Informatik on 11.10.2017. ABSTRACT. The simulation and reconstruction of high-energy physics experiments at the Large Hadron Collider (LHC) use huge customized software suites that were developed during the last twenty years. The reconstruction software of the ATLAS experiment, in particular, is challenged by the increasing amount and complexity of the data that it processes, because it uses combinatorial algorithms. Extrapolations at the end of Run 1 of the LHC indicated the need for a factor 3 improvement in event processing rates in order to stay within the available resources for the workloads expected during Run 2. This thesis develops a systematic approach to analyze and optimize the ATLAS reconstruction software for modern hardware architectures. First, this thesis analyzes limitations of the software and how to improve it by using intrusive and non-intrusive techniques. This includes an analysis of the data flow graph and detailed performance measurements using performance counters and control-flow and function-level profilers.
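The methodology described in the abstract (function-level profiling to locate hotspots in combinatorial reconstruction code) can be sketched in miniature with Python's built-in profiler. The workload names below are invented stand-ins; the actual ATLAS software is C++ and is measured with performance counters and native profilers rather than this:

```python
import cProfile
import io
import pstats

# Toy stand-ins for a combinatorial hot loop and its caller
# (illustrative only; not ATLAS code).

def combinatorial_inner(hits):
    # Deliberately quadratic: the kind of hotspot a profiler exposes.
    return sum(1 for a in hits for b in hits if a < b)

def reconstruct(events):
    return [combinatorial_inner(ev) for ev in events]

events = [list(range(200)) for _ in range(20)]

pr = cProfile.Profile()
pr.enable()
reconstruct(events)
pr.disable()

# Function-level report, sorted by cumulative time: the quadratic
# helper dominates, telling us exactly where optimization effort pays off.
buf = io.StringIO()
pstats.Stats(pr, stream=buf).sort_stats("cumulative").print_stats(10)
report = buf.getvalue()
print("combinatorial_inner" in report)  # -> True
```

The same workflow scales up: profile first, find the function that owns the cumulative time, and only then decide between intrusive (algorithmic) and non-intrusive (build/hardware) fixes.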