Universitat de València
Departament de Física Teòrica

Distributed computing and farm management with application to the search for
heavy gauge bosons using the ATLAS experiment at the LHC (CERN)

CERN-THESIS-2008-056, 21/01/2008
Doctoral Thesis (Tesis de Doctorado)
Juan Antonio Lopez-Perez, 2007

CONTENTS

Preface
Prefacio

I  Introduction

1  CERN, LHC and ATLAS
   1.1  CERN
        1.1.1  Discoveries and research fields
   1.2  The LHC
   1.3  The ATLAS experiment
        1.3.1  ATLAS sub-detectors
        1.3.2  ATLAS performance
        1.3.3  ATLAS software

2  The Standard Model and Extra Dimensions
   2.1  The Standard Model
   2.2  Beyond the Standard Model
   2.3  Extra dimensions

3  Distributed Grid computing
   3.1  The Grid
        3.1.1  Definition
        3.1.2  Some key concepts
        3.1.3  Middleware components
        3.1.4  The Globus toolkit
        3.1.5  Overview of a Grid job
        3.1.6  Challenges and benefits of the Grid
        3.1.7  Different kinds of Grids
   3.2  LHC Computing Grid
        3.2.1  The LCG project
        3.2.2  The EGEE project
        3.2.3  The EDG project
        3.2.4  The Quattor software management tool
   3.3  The BOINC distributed computing environment
        3.3.1  Overview of a BOINC job
        3.3.2  BOINC credit system
        3.3.3  The future of BOINC

II  Data analysis

4  Search for Z* and W* decay modes
   4.1  Introduction
   4.2  b-tagging
   4.3  Search for Z* → bb̄
        4.3.1  Simulation
        4.3.2  Selection cuts
        4.3.3  Results
   4.4  Search for Z* → tt̄
        4.4.1  Simulation
        4.4.2  Selection cuts
        4.4.3  Results
   4.5  Search for W* → tb
        4.5.1  Simulation
        4.5.2  Selection cuts
        4.5.3  Results
   4.6  Mass dependence

III  Distributed Computing

5  The LCG project
   5.1  LCG technology and infrastructure
        5.1.1  Security
        5.1.2  Information service
        5.1.3  Workload Management System (WMS)
        5.1.4  Storage Element (SE)
        5.1.5  Compute Resource Services
        5.1.6  Data Management
        5.1.7  File Transfer Service (FTS)
   5.2  LCG Activity Areas
        5.2.1  Fabric Area
        5.2.2  Grid Deployment Area
        5.2.3  Distributed Analysis (ARDA)
        5.2.4  Applications Area
   5.3  Quattor and farm management

6  BOINC
   6.1  Middleware components
   6.2  BOINC technology and infrastructure
        6.2.1  Generating work
        6.2.2  Validation
        6.2.3  Result assimilation
        6.2.4  Server-side file deletion
        6.2.5  Database purging
   6.3  LHC@home
   6.4  Applications ported to BOINC
        6.4.1  PYTHIA
        6.4.2  Atlfast
        6.4.3  Geant4
        6.4.4  Garfield
   6.5  Installations performed
        6.5.1  lxboinc cluster
        6.5.2  Extremadura testbed

7  Comparing LCG and BOINC
   7.1  Technology and infrastructure
        7.1.1  Installation, configuration and management
        7.1.2  Security
        7.1.3  Information service
        7.1.4  Workload Management System (WMS)
        7.1.5  Storage Elements and Data Management
        7.1.6  Compute Resource Services
        7.1.7  File Transfer Service (FTS)
        7.1.8  User Interfaces (UIs)
   7.2  Summary of the comparison

IV  Conclusions

Conclusions
Conclusiones

V  Appendices

A  Quattor tasks performed at CERN
   A.1  Use case: implementation of Quattor on Solaris
        A.1.1  Porting
        A.1.2  Corrections
        A.1.3  Creation
   A.2  Quattor software management system

B  Porting of LCG software
   B.1  LCG External Software
   B.2  SEAL

C  Atlfast porting to BOINC
   C.1  Atlfast compilation and configuration

D  Geant4 porting to BOINC
   D.1  The Simulation
   D.2  Boincification
   D.3  Porting to Windows
   D.4  BOINC templates

E  Glossary

Acknowledgements (Agradecimientos)

References

LIST OF FIGURES

1.1  Scheme of the CERN accelerator complex
1.2  The 4 LHC experiments
1.3  The ALICE and LHCb detectors
1.4  Sketch of the ATLAS detector
1.5  The CMS detector
1.6  Photograph of the ATLAS detector under construction
1.7  A view of the ATLAS Inner Detector
1.8  Scheme including the calorimeters of the ATLAS detector
3.1  Overview of the life cycle of a Grid job
3.2  The different countries involved in the ATLAS collaboration
3.3  Overview of the life cycle of a BOINC job
3.4  Sketch describing how BOINC credits are granted
4.1  Simulated signal obtained in the study of Z* → bb̄
4.2  Signal and background obtained in the study of Z* → bb̄
4.3  Simulated signal obtained in the study of Z* → tt̄ with M(Z*) = 2 TeV
4.4  Signal and background obtained in the study of Z* → tt̄ with M(Z*) = 2 TeV
4.5  Simulated signal obtained in the study of W* → tb with M(W*) = 2 TeV
4.6  Signal and background obtained in the study of W* → tb with M(W*) = 2 TeV
4.7  Significance as a function of mass for the decay channel Z* → bb̄, assuming L = 3 × 10^5 pb^-1
4.8  Significance as a function of mass for the decay channel Z* → tt̄, assuming L = 3 × 10^5 pb^-1
4.9  Significance as a function of mass for the decay channel W* → tb, assuming L = 3 × 10^5 pb^-1
6.1  BOINC client-server infrastructure showing the different parts of each kind of component
6.2  BOINC daemons
6.3  Statistics of participants in the LHC@home project and total credits
A.1  Schema of the Quattor automated software management system

Preface

The Standard Model of particle physics (SM), presented in the introduction of this work, describes the strong, weak, and electromagnetic forces between the fundamental particles of ordinary matter. The SM describes with high precision the interactions at the energies reached by present accelerators.
However, it presents several problems, also discussed in this work, and some questions remain unanswered, so the SM cannot be considered a complete theory of fundamental interactions. Since the completion of the SM, many extensions have been proposed to address these problems. Some important recent extensions are the Extra Dimensions (ED) theories, which unify gravity with the other fundamental forces and solve the hierarchy problem. An introduction to the motivation, history and main ideas of ED theories is given in this work.

In the context of some models with ED of size about 1 TeV^-1, in particular in the ADD model with only fermions confined to a D-brane, heavy Kaluza-Klein (KK) excitations are expected, with the same properties as the SM gauge bosons but more massive. In this work, three hadronic decay modes of such massive gauge bosons, Z* and W*, are investigated using the ATLAS experiment at the Large Hadron Collider (LHC), presently under construction at CERN.

The LHC will start to operate in 2007 and collect data in 2008. It will produce roughly 15 Petabytes (15 million Gigabytes, i.e. more than 20 million CDs) of data per year. Access to these experimental data has to be provided for some 5,000 scientists working in 500 research institutes and universities. In addition, all data need to remain available over the estimated 15-year lifetime of the LHC. The analysis of the data, including comparison with theoretical simulations, requires a huge amount of computing power.

The computing challenges that scientists have to face are the huge amount of data, the calculations to perform, and the large number of collaborators. The Grid has been proposed as a solution to these challenges. In particular, given the high CPU requirements of LHC studies, one kind of Grid that has proved useful and has been considered in this work is BOINC.
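The data-volume figures quoted above can be reproduced with simple arithmetic. The sketch below is only an illustrative back-of-the-envelope check; the 700 MB CD capacity is an assumption, not a figure from the text.

```python
# Back-of-the-envelope check of the LHC data-volume figures.
# Assumption (not from the text): one standard CD holds about 700 MB.
PB = 10**15                  # bytes per petabyte (decimal units)
GB = 10**9                   # bytes per gigabyte
CD = 700 * 10**6             # bytes per 700 MB CD

yearly = 15 * PB             # data produced by the LHC per year
print(yearly // GB)          # 15000000 -> "15 million Gigabytes"
print(yearly // CD)          # 21428571 -> "more than 20 million CDs"
print(15 * yearly // PB)     # 225 PB over the estimated 15-year lifetime
```

The CD count comes out above 21 million, consistent with the "more than 20 million CDs" figure, and the 15-year total is of order a quarter of an exabyte.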
The feasibility of porting some simulation and reconstruction physics tools to BOINC has been studied as well.

This work is presented in five parts, comprising seven chapters and the appendices. In short, there are two main topics: one concerns data analysis and the other distributed computing. In addition, there are an introduction, conclusions and appendices.

In the first part of this work, an introduction to the physics and computing tools is presented. The first chapter of this part introduces CERN and its main collider, the LHC; the ATLAS detector and the main computing tools are also introduced.

In the second chapter, the Standard Model of particle physics is introduced and some of its extensions are discussed; in particular, an introduction to Extra Dimensions models is given.

In the third chapter, a general introduction to Grid computing is presented.