
IT Project Plan Template Version 0.3

LCG APPLICATIONS AREA

PLANS FOR PHASE 2

Organization: CERN – LCG Project

Document Revision #: 0.76

Date of Issue: 31 August 2005

Editor: Pere Mato

Approval Signatures

Approved by: Project Leader
Approved by: LCG Project Leader

Prepared by: Project Manager
Prepared by: LCG Project Manager

Reviewed by: Quality Assurance Manager

Document Change Control

This section provides control for the development and distribution of revisions to the Project Work Plan up to the point of approval.

Revision Number | Date of Issue | Author(s) | Brief Description of Change
0.1 | 14/06/2005 | P. Mato | Initial merge of the contributions from the project leaders
0.2 | 16/06/2005 | P. Mato | Added POOL contribution from Dirk. Added chapter "Major changes...". Modifications in ROOT chapter after discussion with Rene, plus resource table.
0.3 | 28/06/2005 | P. Mato | Changes in PROOF section by Fons. Updated POOL chapter by Dirk.
0.4 | 05/07/2005 | P. Mato | Updates in POOL chapter from Dirk
0.5 | 13/07/2005 | P. Mato | Updates in POOL chapter from Dirk and summary-level milestone table
0.6 | 01/08/2005 | P. Mato | Changes from Dirk. Added resource summary tables
0.7 | 31/08/2005 | L. Moneta | Added SEAL work package in ROOT project section

Table of Contents

1. Introduction
2. Applications Area Scope and Requirements
  2.1 High Level requirements
  2.2 Software Architecture
  2.3 OS Platforms
3. Applications Area Organization
4. Major Changes in the Applications Area for Phase 2
  4.1 SEAL and ROOT merge
5. Software Process and Infrastructure project (SPI)
  5.1 Purpose, Scope and Objectives
  5.2 Project Organization
  5.3 Services and Responsibilities
  5.4 Main project milestones for 2005
6. Core libraries and services project (ROOT)
  6.1 Project Organization
  6.2 The BASE work package
    6.2.1 Milestones
  6.3 The DICT work package
    6.3.1 Milestones
  6.4 The I/O and Trees work package
    6.4.1 Milestones
  6.5 The PROOF work package
    6.5.1 Milestones
  6.6 The MATH work package
    6.6.1 Milestones
  6.7 The GUI work package
    6.7.1 Ongoing work
    6.7.2 Milestones
  6.8 The Graphics work package
    6.8.1 Ongoing work
    6.8.2 Milestones
  6.9 The GEOM work package
    6.9.1 Ongoing work
  6.10 Resources
    6.10.1 Staffing
7. Persistency Framework Projects (POOL/COOL)
  7.1 Purpose, Scope and Objectives
  7.2 Project Organization
  7.3 Persistency Framework work packages
    7.3.1 Object Storage and References
    7.3.2 Collections and Meta Data
    7.3.3 Database Access and Distribution
    7.3.4 Catalog and Grid Integration
    7.3.5 Conditions Database
  7.4 Technical process
  7.5 Resources and Milestones
    7.5.1 Staffing
Simulation Project
  7.6 Purpose, Scope and Objectives
  7.7 Project Organization
  7.8 Simulation Framework
    7.8.1 WP1 - Geometry Description Markup Language
    7.8.2 WP2 - Geant4 geometry object persistency
    7.8.3 WP3 - Python interfaces to Geant4
    7.8.4 WP4 – Simulation Framework for physics validation
    7.8.5 WP5 - Monte Carlo truth handling
    7.8.6 Sub-project Milestones
  7.9 Physics Validation
    7.9.1 WP1 – Impact of simulation on LHC physics
    7.9.2 WP2 - Electromagnetic physics
    7.9.3 WP3a – Hadronic physics: calorimetry
    7.9.4 WP3b – Hadronic physics: inner detectors
    7.9.5 WP3c – Hadronic physics: background radiation
    7.9.6 WP4 – Validation of the simulation environment
    7.9.7 Sub-project Milestones
  7.10 Geant4
    7.10.1 WP1 - Geometry, Field and Transportation
    7.10.2 WP2 - Software Management
    7.10.3 WP3 - EM Physics and error propagation
    7.10.4 WP4 - Hadronic Physics
    7.10.5 WP5 - System Testing
    7.10.6 WP6 - Acceptance Suite
    7.10.7 WP7 - Coordination and Releases
    7.10.8 Sub-project milestones
  7.11 Fluka
    7.11.1 Summary of milestones
  7.12 Garfield
    7.12.1 WP1 – Electric field and Diffusion
    7.12.2 WP2 – Gas and Transport
    7.12.3 Sub-project milestones
  7.13 Generator Services
    7.13.1 WP1 - The Generator library (GENSER)
    7.13.2 WP2 - Storage, Event Interfaces and Particle Services
    7.13.3 WP3 - Public Event Files and Monte Carlo Database
    7.13.4 WP4: Monte Carlo Validation
    7.13.5 Sub-project milestones
  7.14 Technical process
  7.15 Resources
    7.15.1 Staffing
8. Applications Area Summary
  8.1 Resources
  8.2 Milestones
    8.2.1 Level-1 proposed milestones
    8.2.2 Level-2 proposed milestones
9. References

1. Introduction

The Applications Area of the LCG Project is concerned with developing, deploying and maintaining the part of the physics applications software, and associated supporting infrastructure software, that is common among the LHC experiments. This area is managed as a number of specific projects with well-defined policies for coordination between them and with the direct participation of the primary users of the software, the LHC experiments. It has been organized to focus on real experiment needs, and special attention has been given to maintaining open information flow and decision making. The experiments set requirements and monitor progress through participation in the bodies that manage the work programme (SC2, PEB and Architects Forum). Success of the project is gauged by successful use, validation and deployment of deliverables in the software systems of the experiments.

The Applications Area is responsible for building a project team among participants and collaborators; developing a work plan; designing and developing software that meets experiment requirements; assisting in integrating the software within the experiments; and providing support and maintenance.

The project started at the beginning of 2002 and recently completed the first phase of its programme of work. The scope and highlights of the activities in Phase 1 are:

- The establishment of the basic environment for software development, documentation, distribution and support. This includes the provision of software development tools, documentation tools, quality control and other tools integrated into a well-defined software process. The Savannah project portal and software service has become an accepted standard both inside and outside the project. A service to provide ~100 third-party software installations in the versions and platforms needed by LCG projects has also been developed.
- The development of general-purpose scientific libraries, C++ foundation libraries, and other standard libraries. A rather complete set of core functionality has already been made available in public releases by the SEAL and ROOT projects, and has been used successfully in both LCG and experiment codes.
- The development of tools for storing, managing and accessing data handled by physics applications, including calibration data, metadata describing events, event data, and analysis objects. The objective of a quickly developed hybrid system leveraging ROOT I/O and an RDBMS was fulfilled with the development of the POOL persistency framework. POOL was successfully used in large-scale production in the ATLAS, CMS and LHCb data challenges, in which >400 TB of data were produced.
- The adaptation and validation of common frameworks and toolkits provided by projects of broader scope than the LHC, such as PYTHIA, GEANT4 and FLUKA. Geant4 is now firmly established as the baseline simulation engine in successful ATLAS, CMS and LHCb production, following validation tests of physics processes and by proving to be extremely robust and stable.

The work of the Applications Area is conducted within projects. At present there are four active projects: software process and infrastructure (SPI), persistency framework


(POOL), core software common libraries and components (CORE), and simulation (SIMU). The purpose, scope and objectives of each of the projects are described in sections 4 to 7. Each project section includes the main expected deliverables for Phase 2, with more detail and emphasis on the next 12 months. In section 8 we summarize the resources and the main milestones for the complete Applications Area.


2. Applications Area Scope and Requirements

2.1 High Level requirements

A basic set of high-level requirements was established at the start of Phase 1 of the project, and there is no reason to change them for Phase 2. Here we recall those that have guided development work so far.

Software environments and optimal technology choices evolve over time, and LCG software design must therefore take account of the >10-year lifetime of the LHC. The LCG software itself must be able to evolve smoothly over this period. This requirement implies others on language evolution, modularity of components, use of interfaces, maintainability and documentation. At any given time the LCG should provide a functional set of software with implementations based on products that are the current best choice.

The standard language for physics applications software in all four LHC experiments is C++. The language choice may change in the future, and some experiments support multi-language environments today. LCG software should serve C++ environments well, and also support multi-language environments and the evolution of language choices.

LCG software must operate seamlessly in a highly distributed environment, with distributed operation enabled and controlled by components employing Grid middleware. All LCG software must take account of distributed operation in its design and must use the agreed standard services for distributed operation when the software uses distributed services directly. While the software must operate seamlessly in a distributed environment, it must also be functional and easily usable in 'disconnected' local environments.

LCG software should be constructed in a modular way based on components, where a software component provides a specific function via a well-defined public interface. Components interact with other components through their interfaces. It should be possible to replace a component with a different implementation respecting the same interfaces without perturbing the rest of the system. The interaction of users and other software components with a given component is entirely through its public interface. The component architecture and interface design should be such that different implementations of a given component can be easily interchanged, provided that they respect the established interfaces. Component and interface designs should not, in general, make assumptions about implementation technologies; they should be as implementation-neutral as possible.

A principal requirement of LCG software components is that they integrate well in a coherent software framework, and integrate well with experiment software and other tools. LCG software should include components and employ designs that facilitate this integration. Integration of the best of existing solutions as component implementations should be supported, in order to profit from existing tools and avoid duplication. Already existing implementations which provide the required functionality for a given component should be evaluated and the best of them used if possible (re-use). Use of existing software should be consistent with the LCG architecture.
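To make the component requirement above concrete, the following is a minimal, hypothetical C++ sketch (the interface and class names are invented for this illustration and do not belong to any LCG package): clients depend only on the public interface, so an implementation can be replaced without perturbing the rest of the system.

    #include <memory>
    #include <string>

    // Hypothetical public interface of a file-catalog component.
    class IFileCatalog {
    public:
        virtual ~IFileCatalog() {}
        // Map a logical file name to a physical location.
        virtual std::string lookup(const std::string& logicalName) = 0;
    };

    // One interchangeable implementation; an RDBMS- or Grid-based catalog
    // could replace it as long as it respects the same interface.
    class XmlFileCatalog : public IFileCatalog {
    public:
        std::string lookup(const std::string& logicalName) override {
            return "file:/data/" + logicalName;  // stand-in logic
        }
    };

    // Clients obtain the component through its interface only, so the
    // concrete type is a single, replaceable selection point.
    std::unique_ptr<IFileCatalog> makeCatalog() {
        return std::unique_ptr<IFileCatalog>(new XmlFileCatalog);
    }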


LCG software should be written in conformance to the language standard. Platform and OS dependencies should be confined to low-level system utilities. A number of hardware/OS/compiler combinations (platforms) will be supported for production and development work; these will be reviewed periodically to take account of market trends and usage by the wider community.

Although the Trigger and DAQ software applications are not part of the LCG scope, it is very likely that such applications will re-use some of the core LCG components. The LCG software must therefore be able to operate in a real-time environment, and it must be designed and developed accordingly, e.g. incorporating online requirements for time budgets and intolerance of memory leaks.

2.2 Software Architecture

Applications Area software must conform in its architecture to a coherent overall architectural vision; it must make consistent use of an identified set of core tools, libraries and services; and it must integrate and inter-operate well with other LCG software and experiment software. This vision was established in a high-level 'blueprint' for LCG software, which provided the guidance needed for individual projects to ensure that these criteria are met [1].

LCG software is designed to be modular, with the unit of modularity being the software component. A component internally consists of a number of collaborating classes. Its public interface expresses how the component is seen and used externally. The granularity of the component breakdown should be driven by the granularity at which replacement of individual components (e.g. with a new implementation) is foreseen. Components are grouped and classified according to the way in which they interact and cooperate to provide specific functionality. Each group corresponds to a domain of the overall architecture, and the development of each domain is typically managed by a small group of 5-10 people. The principal software domains for LCG Applications Area software are illustrated schematically in Figure 1. Software support services (management, packaging, distribution etc.) are not shown in this figure.

Figure 1: Physics applications software domain decomposition (the figure shows the Core, Data Management, Simulation and Distributed Analysis domains, with the experiment frameworks and the simulation, reconstruction and analysis programs built above them)


The Core Software Domain provides basic functionality needed by any application. At the lowest level we identify the foundation libraries, utilities and services, which are fairly independent class libraries (e.g. the STL, or a library providing a Lorentz vector). Above this are core services supporting the development of higher-level framework components and specializations, such as the plug-in manager and the object dictionary by which all parts of the system have knowledge of, and access to, the objects of the system. Other core software services include command-line environments for interactive and batch (script) access to the functionality of the system, as well as general graphics and GUI tools that can be used to build experiment-specific interfaces but which are not themselves experiment-specific. Histogramming, ntuples, fitting, statistical analysis, and data presentation tools also contribute to Core functionality. Above the Core software are a number of specialized frameworks that offer services specific to particular domains.

The Data Management Domain covers object persistency, file cataloguing, event-specific data management, and detector-conditions-specific data management. In general, the domain of expertise stays in the area of relational database application development.

Support and LCG-directed development of simulation toolkits such as Geant4 and Fluka, and the ancillary services and infrastructure surrounding them, are part of the Simulation Domain. Ancillary services surrounding event generators (e.g. standard event and particle data formats, persistency, configuration service), and support and distribution of event generator software, are also in the scope of common project activities.

The Distributed Analysis Domain is the area where the physicist and the physics application software interface to Grid middleware and services in order to support job configuration, submission and monitoring, distributed data management and Grid-enabled analysis. The scope of common activities in this area has still to be specified.

Experiment applications are built on top of specialized frameworks which are specific to the experiment and not in LCG scope.

2.3 OS Platforms

The LHC experiments and the computer centres of universities and laboratories need to run LCG software on a large variety of platforms and operating systems, in several flavours and versions. Therefore, in order to guarantee portability, the software must be written following the most common standards in terms of programming languages and operating systems. Applications Area software is routinely developed and run on a number of different compilers and operating systems, including Red Hat Linux, Microsoft Windows, and Apple Mac OSX, both with gcc and with their proprietary C++ compilers. This approach helps to ensure conformance to language standards and allows the project to manage dependencies on platform-specific features, on both 32-bit and 64-bit hardware architectures. Applications Area projects are involved in the certification and verification of new versions of compilers and operating systems at CERN. The "production" platforms currently supported are:

- Red Hat 7.3 with gcc 3.2 and gcc 3.2.3 - the Linux reference platform for the LHC experiments and for the main computer centres. Support for Red Hat 7.3 will be stopped by the end of 2005.
- Scientific Linux 3 with gcc 3.2.3, and in the near future also with gcc 3.4.3 - the new Linux reference platform for CERN and other large HEP laboratories. It is binary compatible with Red Hat Enterprise 3.

In addition, "development-only" platforms are supported; these have better development tools and are therefore used by many programmers and users to increase productivity and assure software quality:

- Microsoft Windows, with the Visual C++ 7.1 compiler and CygWin;
- Mac OSX 10.3 with gcc 3.3, and soon 10.4, probably with gcc 4.

Any changes to the list of supported platforms or compilers are discussed and approved at the Architects Forum, where all the LHC experiments are represented. When a new platform is a candidate for support, all LCG software and external packages are first re-compiled and re-built in order to assess the implications of, and the changes needed for, the new platform becoming fully supported. Platforms that will likely be supported in the near future are:

- SLC3 Linux on AMD 64-bit processors, as an additional production platform;
- the gcc 3.4.4 compiler on all Linux platforms, to take advantage of better performance;
- Mac OSX 10.4 as a development platform, to resolve issues related to the loading of dynamic libraries.


3. Applications Area Organization

Applications Area work in the various activity areas described above is organized into projects, each led by a Project Leader with overall responsibility for the management, design, development and execution of the work of the project. The Applications Area Manager has overall responsibility for the work of the Applications Area.

Figure 2: Applications Area organization

Work in the projects must be consistent and coherent with the architecture, infrastructure, processes, support and documentation functions that are agreed applications-area-wide. Larger projects may in turn be divided into sub-projects and further into work packages, with ~1-3 FTE of activity per work package.

An Architects Forum (AF), consisting of the applications area leader (chair), the architects of the four LHC experiments, and other invited members, provides for the formal participation of the experiments in the planning, decision making and architectural and technical direction of applications area activities. Architects represent the interests of their experiment and contribute their expertise. The AF decides the difficult issues that cannot be resolved in open forums such as the applications area meeting. The AF meets every two weeks or so. The Applications Area Meeting takes place fortnightly and provides a forum for information exchange between the project and the LHC experiments.

The Applications Area work breakdown structure, milestones and deliverables for all aspects of the project are documented at http://lcgapp.cern.ch/project/mgmt. The work breakdown structure maps directly onto the project breakdown of the Applications Area.


The schedule of milestones for the completion of deliverables is similarly organized. Milestones are organized at three levels:

- Level 1: the highest level. A small, select number of very important milestones are at this level. These milestones are monitored at the LHCC level.
- Level 2: the 'official milestones' level. Milestones at this level chart the progress of applications area activities in a comprehensive way. Each project has a small number of milestones per quarter at this level. These milestones are monitored at the LCG project level (PEB, SC2).
- Level 3: the internal milestones level. Milestones at this level are used for finer-grained charting of progress for internal applications area purposes. These milestones are monitored at the AA level.

Milestones include integration and validation milestones from the experiments to track the take-up of AA software in the experiments.


4. Major Changes in the Applications Area for Phase 2

While AA Phase 1 was mainly an activity of developing a number of new software products, in Phase 2 we need to establish the level of long-term support and maintenance required for these products, which are essential for the operation of the LHC experiments. To facilitate maintenance we need to minimize duplication and put into practice the re-use of software and infrastructure across projects. The ultimate goal is to facilitate as much as possible the maintenance of the AA software at the end of the LCG project, when the resources associated with the LCG will no longer be available. In addition, we will emphasise the physics analysis functionality during this phase.

A number of major changes in the project structure and objectives of the Applications Area for Phase 2 have been discussed among the project leaders and experiment representatives. The proposed changes were reviewed in depth in the AA internal review that took place at the beginning of April [5]. They are summarized as follows:

1. SEAL and ROOT projects merge. This is the major and most challenging change in the Applications Area for Phase 2. The motivations for this merge are: the optimization of resources by avoiding duplicate development (SEAL and ROOT were overlapping in some aspects); better coherency of the products developed and maintained by the AA vis-à-vis the LHC experiments; the full integration of the ROOT activity in the LCG organization (planning, milestones, reviews, resources, etc.); and easing the long-term maintenance and evolution of a single set of software products with the post-LCG era in mind. In practice this merge implies merging the SEAL and ROOT teams into a single team with a combined program of work delivering the union of the existing functionality in ROOT and SEAL. Initially, the SEAL libraries in use by the LHC experiments will continue to be maintained as long as they are still in use. The migration to the new merged libraries will be done adiabatically, in coordination with the experiments.

2. Small redefinition of the SPI role. The SPI project should continue to provide and run a number of services of common interest (Savannah project portals, external libraries service, etc.) for the LCG AA projects, services that are also used directly by the LHC experiments. In general, SPI should continue to help projects and experiments to provide and maintain the software development infrastructure by providing a number of common key roles for software production (librarian, webmaster, etc.). Some of the SPI activities will be distributed among the projects, and the people will directly participate in the software development projects (release managers, quality assurance, documentation, etc.).

3. Some adaptations of POOL will be required. The domain of expertise of the POOL project will continue to be data persistency, data management, deployment on the Grid and relational databases. There are no major changes proposed in the structure. The two major products, POOL (object persistency framework) and COOL (conditions database), will continue, but need to be adapted to the changes in the underlying software packages due to the merge of SEAL and ROOT.


4. PI project will be discontinued. The project as such will be discontinued and the existing software libraries in use will be absorbed by the client projects. In general, the idea is that, after an inventory of the existing libraries, the ones not used will be abandoned, the ones used by a single experiment or framework will be moved to that experiment or framework, and the remaining ones will be moved to the new SEAL+ROOT project.

5. Simulation project basically unchanged. The domain of expertise of the simulation project, with all its defined sub-projects, will be in event generators and detector simulation. No major changes in the structure are foreseen, besides the addition of a new sub-project, Garfield, for the simulation of gaseous detectors. During this phase, the simulation projects will be encouraged to use the core software and software development environment provided by the other AA projects, for example for implementing interactivity, persistency, analysis, etc.

4.1 SEAL and ROOT merge

The merge of the SEAL and ROOT projects is the action that will demand the most attention during the next 12 months. Being at the core of the software, its implications are very important. The current idea is to migrate adiabatically the functionality provided by SEAL to a new set of libraries in the ROOT project. It is essential to provide end-user compatibility at each step in the process and to involve the experiments in the definition of the new functionality and implementation details. The time the overall migration will take (until the SEAL libraries are no longer needed by the experiments) is estimated to be a couple of years. This will be proposed as a level-1 milestone. The AA internal review concluded positively on the merge proposal and encouraged continuing the discussion of the technical details of the planning in the Architects Forum (AF). Since then, we have established in the AF a process for addressing all the topics that require detailed plans. The process is as follows:

- Produce a list of topics (functionality) that need to be merged, or whose needs and implications have to be understood. This list is available at http://lcgapp.cern.ch/project/mgmt/coretopics2.html
- Experiments should prioritize the list.
- Each topic will be handled individually. Not all topics can be handled in parallel, but they do not need to be treated completely sequentially.
- AA projects and experiments will assign people to each of the topics, and informal discussions will be organized to gather the requirements, constraints, and design and implementation issues.
- A written proposal for each topic will be produced after a period of 2-3 weeks, specifying the agreed functionality, API, implementation details, impact on the experiments, time scale, and so on.
- An open AA meeting will be organized to present and discuss the topic.


- The AF will decide and give the green light for the implementation.

This process is being applied; the first decisions have been taken and are already reflected in the detailed plans for each of the projects. More recently, the AF has agreed to keep the existing libraries of the SEAL project available to the experiments for a much longer period of time, so that they are not forced to migrate their application software. In view of this, an additional work package has been created in the ROOT project to ensure the maintenance of the SEAL/PI libraries.


5. Software Process and Infrastructure project (SPI)

5.1 Purpose, Scope and Objectives

The AA software projects share a single development infrastructure; this infrastructure is provided by the SPI project. A set of basic services and support is provided for the various activities of software development. The definition of a single project managing the infrastructure for all the development projects is crucial in order to foster homogeneity and avoid duplication in the way the AA projects develop and manage their software. In its current project definition, the users of the deliverables of the project are all the LCG software projects and any other CERN experiment interested in using part of or the entire infrastructure. The first such users are the POOL, SEAL, PI, and Simulation projects, as well as any future LCG project. The project regularly interacts with all LHC experiments (Alice, Atlas, CMS, and LHCb) and with the main projects at the laboratory (Geant4, ROOT, etc.) in order to receive advice and feedback. Many services of the SPI project (Savannah, external software, testing frameworks, etc.) are used outside the LCG by the LHC experiments and several IT projects.

5.2 Project Organization

The project has now delivered the initial versions of all the foreseen tools and services and is in a steady-state mode of operation. Manpower is extremely tight at present: sufficient for essential operations, but impacting the responsiveness of the project to new requests for service and tool enhancements and extensions. Injection of experienced FTEs would improve the responsiveness and reduce the internal stresses arising from people being spread too thinly over the required tasks. EGEE will use the SPI services and add resources in order to expand the existing SPI services to address their needs (e.g. development in Java and other tools).

5.3 Services and Responsibilities

The following table lists the current services and tasks carried out by the SPI project, together with the names of the people responsible for them.

Service and Responsibilities SPI 2005

Service | On-going Activities and Tasks | Who
Software Librarian, build and release | Perform releases and pre-releases of the LCG AA software. Automate build and release functions. Work with the librarians of the AA projects and LHC experiments in order to provide what is needed by the users. | A. Pfeiffer (then also with SFT fellow)
External Software | Install and upgrade, on the LCG platforms, all external software needed by the AA projects and LHC experiments (via AF and LIM). Automation of the existing installations. | E. Poinsignon, Y. Patois (then S. Roiser and B. Bellenot)
Software Distribution | Basic release management (tarfiles only for the moment) to generate installation kits for the LCG software and provide a download site. Define, develop and support a release and packaging service. | E. Poinsignon, A. Pfeiffer, Y. Patois
LCG Software Configuration | Provide software configuration and build information for the software projects on all supported platforms (CMT and SCRAM files). | E. Poinsignon, A. Pfeiffer
Savannah Dev. Portal | User support. Functionality enhancements. Maintenance of the service and bug fixing. Work/merge with the Savannah open source project. | Y. Perrin
QA and Policies | Quality Assurance metrics and measurements, and verification of the policies in LCG projects. Generation of QA reports for all AA projects, concerning bug fixing and test coverage. | J. Benard (then also M. Gallas)
Testing Frameworks | User support, maintenance, configuration and development of the tools used for testing (CppUnit, PyUnit, Oval, QMTest). | J. Benard (then also M. Gallas)
Code Doc. and Web Applications | User support. Maintenance, configuration and development of the tools used for code documentation (Doxygen, LXR, ViewCVS). | A. Aimar
LCG Policies, Templates | Define new policies and templates for the existing activities; foster the definition of new standards. | A. Aimar
Web Management, Documentation, Wiki | Basic web templates for SPI and for other projects. Web definition, support and editing. Development of a general infrastructure for a common LCG documentation and workbook. | A. Aimar
AOB | Basic workbook development and documentation for internal use in SPI. | All SPI members

Table 1: List of services and tasks for SPI

5.4 Main project milestones for 2005

All SPI milestones are kept in one single list, grouped by service. Milestones are added and changed following the decisions and priorities defined by the Architects Forum of the LCG Applications Area.


The SPI services (see Section 5.3) have their own more detailed "to do" and task lists, each with specific features, improvements, installations, etc. The list below includes only the project-level major milestones, not the ongoing activities. The resources in the SPI project are sufficient for the continuation of the existing services in 2005 at the current level, and also to fulfil the milestones that follow.

ID | Milestone | Date | Who

Build system
BUIL01 | Toolbox for SCRAM V1 provided (manually) | Jun-05 | EP, AP
BUIL02 | gcc 3.4.3(4) on IA32 and AMD64 | Jun-05 | AP, EP, YPa
BUIL03 | Definition of products and partitions of the LCG software; individual partial releases and bug-fix releases | Jul-05 | AP, EP, YPa
BUIL04 | Mac OSX 10.4 support (gcc 4.0) | Jul-05 | AP, EP, YPa
BUIL05 | Productize the suite of release build tools and distribute for general use | Jul-05 | AP
BUIL06 | Definition of build information for projects in XML (dep. on BUIL03) | Aug-05 | YPa, AP, EP
BUIL07 | Independent build and release of partitions of the LCG software (dep. on BUIL03) | Oct-05 | YPa, AP, EP
BUIL08 | Generate CMT configuration information from XML configuration (dep. on BUIL03) | Nov-05 | YPa, AP
BUIL09 | Generate SCRAM V1 build information from XML configuration (dep. on BUIL03) | Nov-05 | YPa, AP
SPI01 | CMT and SCRAM configurations generated automatically from XML | Oct-05 |
SPI02 | Independent build and release of partitions of the LCG software | Nov-05 |

Distribution
DIST01 | Pacman cache available for binaries of current releases | Jun-05 | AP, EP, YPa
DIST02 | Pacman cache available for sources of current releases | Jul-05 | AP, EP, YPa
DIST03 | Automation of the generation of caches from XML dependency files | Sep-05 | YPa
DIST04 | Complete and document the XML format | Aug-05 | YPa
DIST05 | Selective download for parts of GENSER | Aug-05 | AP
DIST06 | Productize the distribution system | Oct-05 | AP, EP, YPa
SPI03 | Pacman caches for binaries and sources | Jul-05 |
SPI04 | Automated distribution system in place, generated from XML description | Oct-05 |

External Software
EXT01 | Automate all packages: write XML procedures for all packages of the latest configuration | Jun-05 | EP
EXT02 | Document the procedures for the selection of new packages and how to phase out existing packages | Nov-05 | EP
EXT03 | AMD64/IA32 gcc 3.4 and Mac OSX 10.4 support | Sep-05 | EP, YPa, AP
EXT04 | Procedure for non-AF packages: define how other people can contribute to the installation of new packages and their maintenance (to then propose to the AF) | Nov-05 | EP

Savannah
SAV01 | Provide documentation for end users and project administrators (series of how-tos, Wiki, etc.) | Aug-05 | YPe
SAV02 | Move service to SLC3 servers: get SLC3 and all products that Savannah relies on installed on the development system and lcgapp1; install the Savannah code and database and make sure it works; move the production system to lcgapp1, test, and then release to users | Jul-05 | YPe, AP
SAV03 | User forums: set up a user forum for discussions and self-help (maybe a general user forum system) | Dec-05 | YPe
SAV04 | Export feature: provide a mechanism to extract data from Savannah (define granularity, interface, format) | Oct-05 | YPe
SAV05 | Import feature: provide a mechanism to import/migrate external data into Savannah | Nov-05 | YPe
SPI07 | Import/export features available in the Savannah system | Nov-05 |
SPI08 | Savannah user forum set up; user and admin documentation available | Dec-05 |

Quality Assurance
QA01 | QA procedures for LCG projects: actively define the procedure for the LCG projects, discussing with them and proposing to the AF the policy to follow (starting from existing policies) | Sep-05 | JB, AA (AF meetings)
QA02 | Test coverage for non-LCG projects: define test coverage procedures and scripts for non-LCG projects; refine and customize reports | Jul-05 | JB
QA03 | Profiling tools: investigate and propose a solution for profiling the LCG software (gprof, Intel VTune, Valgrind, etc.) | Oct-05 | JB
QA04 | LCG metrics: define a set of consistent metrics for the LCG projects (check with the AF, given the reorganization) | Nov-05 | JB, AA (AF meetings)
SPI09 | QA reporting on Savannah and test coverage available and documented | Oct-05 |

Testing
TST01 | Usage of QMTest in SPI: define where QMTest could be used in SPI; add actions accordingly | Dec-05 | JB, all SPI
TST02 | Present outside LCG: testing frameworks of interest to the LHC experiments (AA review) | Dec-05 | JB
TST03 | Update all documentation and verify all testing tools on all platforms (including Windows and OSX) | Sep-05 | JB
TST04 | Install the latest version of the testing tools on AFS and update the configuration for the projects | Aug-05 | JB
SPI10 | Testing tools updated to the latest versions on the AFS area and available to LCG projects and outside | Sep-05 |

Documentation
DOC01 | LCG workbook: workbook for all users of the SPI infrastructure (project administrators, developers, users) | Sep-05 | AA
DOC02 | Doxygen and LXR documentation: automated generation of snapshots and tagged versions | Jun-05 | AA
DOC03 | External packages: generate Doxygen documentation for AIDA, CLHEP and ROOT | Jun-05 | AA
DOC04 | Cross-link Doxygen: link among all the generated Doxygen webs, taking into account the correct configuration/toolbox | Jul-05 | AA
DOC05 | SPI Wiki: sort out the structure and procedures for the SPI workbook (internal and external); document, track and coordinate services provided by the IT division, such as the build servers, the CVS service, etc. | Aug-05 | AA
SPI11 | Doxygen and LXR automated and documented for the LCG software and for the external packages (ROOT, AIDA, CLHEP) | Aug-05 |

Training
TRAI01 | Define training for LCG users: define the training needs of the LCG users; push authors to produce material and organize courses | Jun-05 | AA

System administration and maintenance
SYS01 | Move services from RH73 to SLC3: install SLC3 on lcgapp1 (and lcgapp3); move/merge services from lcgapp2 (RH73) to lcgapp1 and test, then release to users | Jul-05 | AP, YPe
SYS02 | Set up a redundant server: prepare lcgapp3 to be able to take over from lcgapp1 in case of emergency, and test | Sep-05 | AP

Others
OTH01 | EGEE support and new plan for EGEE phase 2 | Dec-05 | AA, AP
OTH02 | Testing and QA tools for distributed applications (EGEE request) | Dec-05 | JB


6. Core libraries and services project (ROOT)

6.1 Project Organization

Following the merge of the SEAL and ROOT projects (see Section 4.1), the ROOT project has been restructured into the following work packages (the work package responsible is also indicated):

- BASE (Fons Rademakers): system classes, CVS management, documentation and releases.
- DICT (Philippe Canal/FNAL): reflexion system, ROOT meta classes, CINT and Python interpreters.
- I/O (Philippe Canal): basic I/O, Trees, Tree queries.
- PROOF (Fons Rademakers): the PROOF and xrootd systems.
- MATH (Lorenzo Moneta): mathematical libraries, histogramming, fitting.
- GUI (Ilka Antcheva): graphical user interfaces and object editors.
- GRAPHICS (Olivier Couet): 2-D and 3-D graphics.
- GEOM (Andrei Gheata/Alice): the geometry system.

6.2 The BASE work package

This work package is responsible for the entire base infrastructure that is used by all the other work packages. The deliverables are:

- System and OS-dependent classes (implementations of the base class TSystem).
- The plug-in manager.
- The ACLiC system, an automatic interface between the CINT interpreter and the local compilers and linkers. We would like to reimplement the library dependency tracking mechanism. Currently it relies on linking the new library against all the libraries that have been loaded. To solve many issues inherent in this approach we need to use a different approach equivalent (or identical) to autotools' .la files, which keep track of dependencies in a formatted text file.
- The configuration system in CVS and the build procedures. Once the merge with SEAL is completed, the migration to a central CVS service will be investigated.
- The development (every month) and production (twice a year) releases.
- The documentation system: web pages, reference and user guides. Make the documentation in the code compatible with Doxygen.
- QA with the test suite, the code checker and monitoring facilities.
- The mailing lists, the ROOT Forum and the bug reporting system.
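As a brief illustration of two of these deliverables, the sketch below shows how a user typically invokes ACLiC from the ROOT prompt and how code can ask the plug-in manager for an implementation of an abstract base class. This is a sketch based on the standard ROOT API of the time; the macro name, function name and plug-in choice are invented for the example.

    // ACLiC: the trailing '+' asks ROOT to compile the macro with the local
    // compiler and load the resulting shared library.
    //   root [0] .L MyAnalysis.C+
    //   root [1] run()            // function defined in MyAnalysis.C

    #include "TROOT.h"
    #include "TPluginManager.h"

    // Ask the plug-in manager for a handler implementing TVirtualFitter
    // with the "Minuit" implementation (as configured in $ROOTSYS/etc).
    void loadFitter() {
        TPluginHandler* h =
            gROOT->GetPluginManager()->FindHandler("TVirtualFitter", "Minuit");
        if (h && h->LoadPlugin() != -1)
            h->ExecPlugin(0);  // instantiate the plug-in (no constructor args)
    }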


6.2.1 Milestones

ID | Delivery date | Description
BAS1 | 2005/08/01 | New updated plug-in manager
BAS2 | 2005/07/15 | A new edition of the ROOT Users Guide
BAS3 | 2005/07/15 | The ROOT bug tracking system moved to Savannah

6.3 The DICT work package

This work package is responsible for maintaining and developing the reflexion system, the ROOT meta classes, CINT and the interpreter interfaces. Masa Goto remains fully responsible for CINT and its evolution. The current reflexion system is entirely provided by CINT. We plan to update CINT to use the reflexion system provided by the Reflex package (a usage sketch follows the task list below). The CINT/Reflex workshop, held in early May 2005, produced a plan for evaluation and implementation.

The code interpretation part of CINT is currently being re-engineered by Masa Goto. The previous implementation had two different mechanisms for code interpretation: one for non-optimized code, and another using the generation of an internal byte-code for the optimized case. The new system will have only one mechanism, based on a new version of the byte-code generator. The new version of CINT will depend on Reflex, and the new CINT tar file will also contain Reflex, for consistency and to satisfy the requirements of CINT users not using the ROOT system. During the May workshop, it was decided to split the work into the following tasks:

- DIC1: Porting and adapting the SEAL Reflex classes to the ROOT environment (coding conventions and supported platforms) (Stefan Roiser)
- DIC2: PyRoot: improve the compliance to C++ (namespaces, templates, etc.) (Wim)
- DIC3: Adapting the rootcint header file parser and code generator to produce code consistent with the Reflex API (Stefan)
- DIC4: Modifying the bulk of CINT to use the Reflex API (Philippe and Stefan)
- DIC5: Work on the new byte-code generator (Masa)
- DIC6: Adapting the ROOT Meta classes to the new API
- DIC7: Address the known deficiencies that have not been solved by using the new reflexion system
- DIC8: PyRoot: adaptation to Reflex (Wim)

In addition, known issues in the Meta package and rootcint will be addressed.
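For orientation, the sketch below shows the style of client code that the Reflex reflexion API supports. The class name queried is arbitrary, and the namespace and headers are those of the SEAL Reflex package as we understand them at the time of writing, so details may change during the merge.

    #include "Reflex/Type.h"
    #include "Reflex/Member.h"
    #include <cstddef>
    #include <iostream>

    // List the data members of a class known to the Reflex dictionary.
    void inspect(const char* className) {
        ROOT::Reflex::Type t = ROOT::Reflex::Type::ByName(className);
        if (!t) {
            std::cout << "no dictionary for " << className << std::endl;
            return;
        }
        for (size_t i = 0; i < t.DataMemberSize(); ++i)
            std::cout << t.DataMemberAt(i).Name() << std::endl;
    }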


We expect to finish DIC1 and most of DIC2 for the June release. The upgrade of the code parser/generator (DIC3) should be available in September. Before all phases are completed, we will have to support the old rootcint as well as the new rootcint (probably using the existing Cintex interface). The evaluation phase for DIC4 has been estimated by Philippe to take about 3 months. Assuming that major changes or additions will not be required in Reflex, we expect to finish DIC4 for the December release. A progress report on this phase will be given at the ROOT workshop at the end of September. Masa has been working on DIC5 for a year; we expect to plug in this new algorithm (or one run in parallel) before the end of this year. The DIC6 step (expected to take about one month) and the PyRoot step (DIC8) will start only once steps DIC1, DIC3 and DIC4 have been completed. In parallel, we also need to provide ongoing support and try to solve any new issues.

6.3.1 Milestones

ID | Delivery date | Description
DIC1 | 2005/06/30 | Reflex and Cintex in the ROOT v5 development release
DIC2 | 2005/09/30 | rootcint header file parser and code generator adapted to produce code consistent with the Reflex API
DIC3 | 2005/09/30 | Assessment of the work to convert CINT to Reflex
DIC4 | 2005/12/31 | CINT fully converted to Reflex
DIC5 | 2006/03/31 | PyRoot exclusively based on Reflex

6.4 The I/O and Trees work package

This work package is responsible for maintaining and developing the core I/O system, TTrees and TTree queries. Following the merge, a lot of work has already been done in the object persistency mechanism; it is included in ROOT version 4.04, released at the end of April. The first development release of ROOT version 5 will include new optimisations in speed and support for additional C++ constructs. A corresponding optimisation has already been done in the Trees, in particular when using STL collections. The main developments with Trees are:

- IO1: Support for friend trees when trees are grouped into TChains.
- IO2: Support for bitmap indices (work in collaboration with LBL).
- IO3: Adaptation to PROOF.
- IO4: Speed-up of the Tree query mechanism.

Organisation Title/Subject Number CERN – LCG Project Applications Area Plans Phase 2 Owner Approved by Date 27/11/2017 Version Page 20 0.76 Plans for Phase 2 LCG Applications Area

- IO5: Creation of an interface between TTree and SQL databases.

The main developments related to TTree queries are:

- TRE1: Production release of the MakeProxy mechanism.
- TRE2: Implementation of the auto-dereferencing of TRefs and Pool::Ref.
- TRE3: Optimization of the TTreeFormula run-time by using the new TFormula optimization.
- TRE4: In addition, a few outstanding issues (support for virtual, private and protected inheritance, and private and protected typedefs) need to be addressed.
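The following sketch illustrates the kind of usage targeted by IO1 and the TTree query work: a chain over many files with a friend tree attached, queried with a selection that mixes branches from both. The file, tree and branch names are invented for the example.

    #include "TChain.h"

    void chainWithFriend() {
        TChain events("Events");                       // tree name in each file
        events.Add("run_*.root");                      // many files, one chain
        events.AddFriend("Calib", "calibration.root"); // friend tree (IO1)
        // TTree query mixing branches of the chain and of its friend:
        events.Draw("track_pt", "Calib.gain > 0.9");
    }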

6.4.1 Milestones

6.5 The PROOF work package

A major upgrade of the PROOF system was started in January. PROOF is evolving from a system that processes short, blocking interactive queries into a system that also supports long-running queries in a stateless client mode. It will be possible to reconnect from the same or another client session and to browse the set of running queries or retrieve the results of finished ones. To support these new features we plan to reuse the xrootd [ref] file server infrastructure and extend it with a protocol handler for PROOF. Xrootd offers an advanced, very efficient, multi-threaded connection and poller infrastructure and a protocol processor that is easily extensible. An early prototype has already shown the feasibility of this approach. We are also working on a set of authentication plug-ins for xrootd that will support the most popular authentication methods: Kerberos, Globus, SRP, ssh, uid/pwd and uid/gid. By reusing xrootd as the PROOF front-end daemon, PROOF will directly benefit from this work.

Further work is ongoing to make the PROOF cluster setup "zero-config" by using the Apple Bonjour protocol for service discovery. By using Bonjour, the PROOF server nodes will be able to discover the PROOF-config service and register themselves and their capabilities. When a user connects to the PROOF master, the master will discover and use the PROOF-config service to find the slave nodes.

In addition, a lot of work is being done in the area of usability and user friendliness of PROOF. Integration in the ROOT browser will allow the browsing of ongoing PROOF sessions and of the analysis products created in these sessions. Also, all chain, friend, eventlist and index operations available in standalone ROOT sessions will be supported in PROOF.

To run PROOF on the Grid we are working on interfaces to Grid middleware that will allow us to access file catalogs, job schedulers and storage elements from ROOT and PROOF. Via these interfaces we can find the desired file sets in the file catalog and the


storage elements containing the files, and partition the query in such a way that slaves are started on the compute elements close to the data. Except for the Grid extensions, we expect all the milestones to be achieved before the ROOT 2005 workshop. The Grid extensions depend on the Grid middleware currently under development in LCG/EGEE and the different LHC experiments, and the corresponding milestones are therefore more difficult to define. Once the new system has been demonstrated, further developments will largely be driven by user feedback.
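A session on the upgraded system would look roughly as follows. This is a sketch only: the master URL, data set and selector are placeholders, and the session API may still evolve as the upgrade proceeds.

    #include "TChain.h"
    #include "TProof.h"

    void runOnProof() {
        // Connect to the PROOF master (placeholder host name).
        TProof::Open("proofmaster.example.cern.ch");
        TChain events("Events");
        events.Add("root://se.example.cern.ch//data/run_*.root");
        events.SetProof();               // route Process() through PROOF
        events.Process("MySelector.C+"); // selector compiled on the workers
    }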

6.5.1 Milestones

ID | Delivery date | Description
PRF1 | 2005/07/15 | Finish xrootd authentication modules
PRF2 | 2005/07/15 | Add support for friends, eventlists and indices
PRF3 | 2005/08/01 | Switch to new xrootd-based front-ends, still blocking
PRF4 | 2005/08/15 | Add zero-config features
PRF5 | 2005/09/01 | First version of non-blocking queries
PRF6 | 2005/09/01 | Add session and result browsing
PRF7 | 2005/09/30 | Demonstration of the new PROOF system at ROOT 2005

6.6 The MATH work package

This work package is responsible for:

- MAT1: MathCore: a small set of mathematical functions, utilities, physics vectors and random number generators. MathCore is a self-consistent set of classes that will also be released as an independent library (a usage sketch follows this list). A first call for comments on the vector package has been issued. We expect to include a first version of this library, with the physics vector package, in the ROOT development release at the end of June. We will later migrate into the library some of the numerical algorithms currently existing in ROOT, in other classes like TF1. For the random numbers we will evaluate a prototype package compliant with the design proposed to the C++ standard, and eventually re-implement the current ROOT TRandom classes using this new package.
- MAT2: MathMore: a larger set of utilities and functions. The first implementation will consist of C++ wrappers calling functions automatically extracted from GSL. A first release, including the C++ wrappers developed in SEAL for special and statistical functions and some numerical algorithms like adaptive integration, will be available for the August release.
- MAT3: The histogramming classes. No major developments are foreseen in the near future.


- MAT4: Minimisation and fitting classes. The current abstract interface (TVirtualFitter) will be modified and extended to support the new linear and robust fitters and the new version of Minuit. A new proposal with a redesign of the classes is expected by the end of September, and the implementations by the end of the year.
- MAT5: The new version of C++ Minuit from SEAL requires some changes and simplification in the user interface (API). The minimization package developed by the Fermilab computing group will be evaluated for a possible merge with the new Minuit.
- MAT6: The RooFit package from BaBar is currently being introduced into CVS. Some more work to understand its impact on the other fitters and to achieve a complete and coherent integration with the other ROOT Math packages will have to be done later.
- MAT7: Continue a detailed evaluation of the ROOT linear algebra package, with performance tests in experiment applications such as track reconstruction, comparing with currently existing packages. Evaluate also the possibility of moving to classes templated on the scalar type (float, double).
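For illustration, the intended flavour of the MathCore classes is sketched below. The ROOT::Math names follow the MathCore design being discussed at the time of writing and should be read as an assumption about the final API rather than a released interface.

    #include "Math/Vector3D.h"
    #include "Math/Vector4D.h"
    #include <iostream>

    int main() {
        ROOT::Math::XYZVector p(1.0, 2.0, 3.0);               // cartesian 3-vector
        ROOT::Math::PtEtaPhiEVector mu(25.0, 0.5, 1.2, 30.0); // Lorentz vector
        // Magnitude of p and invariant mass of mu:
        std::cout << p.R() << " " << mu.M() << std::endl;
        return 0;
    }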

6.6.1 Milestones

ID | Delivery date | Description
MAT1 | 2005/06/30 | First version of the new MathCore library
MAT2 | 2005/08/31 | First version of the MathMore and random number generator library
MAT3 | 2005/09/30 | New C++ Minuit released in ROOT
MAT4 | 2005/12/31 | Finalization of the fitting and minimization interfaces
MAT5 | 2005/12/31 | RooFit fully integrated (in CVS by July)

6.7 The GUI work package

The graphical user interface sub-system of ROOT provides the following components:

- GUI1: Several implementations of the abstract interface TVirtualX used by the GUI and the 2-D graphics classes:
  o TGX11: the implementation used on Unix systems, based on X11.
  o TGWin32: the corresponding implementation for Windows, using the native win32 interface directly.
  o TGQt: an implementation based on the Qt package from TrollTech, developed in collaboration with Valeri Fine from BNL.

Organisation Title/Subject Number CERN – LCG Project Applications Area Plans Phase 2 Owner Approved by Date 27/11/2017 Version Page 23 0.76 Plans for Phase 2 LCG Applications Area

These 3 implementations provide the same look & feel and behaviour on all systems. The Qt interface supports several modes: o one mode where ROOT manages the graphics event loop. Qt events are forwarded to Qt. o one mode where Qt manages the graphics event loop. All non-Qt events are forwarded to ROOT. In both modes a ROOT graphics widget or a Qt native widget can be included in either a ROOT managed canvas or a Qt managed canvas. The Qt interface is currently under validation testing and is available only on Unix systems. - GUI2: A set of high level widgets like: o the canvas manager o the ROOT object browser that understands a complex hierarchy of C++ objects or files. o the Tree Viewer specialized to process interactive queries to Trees or Chains. It works also in the context of PROOF. - GUI3: The object editors. The system provides a protocol to automatically create interfaces for user-defined objects and also for the most currently used ROOT objects like histograms, graphs or basic graphics primitives. - GUI4: A GUI designer and code generator. The designer is a toolkit helping the interactive creation of a GUI. Once a new widget has been created, the system can automatically generate the corresponding C++ code, such than an iterative process calling the generated code via the interpreter (or compiled code via ACLIC) and using the drag and drop features of the editor speeds-up the building of the UI.

6.7.1 Ongoing work

For GUI1, consolidation of the various interfaces is ongoing. The Qt interface still requires work to become stable and it must be made available under Windows. We expect to have a more stable interface with the July/August development release. The port to Windows should be available next year (preliminarily planned for the release in December).
For GUI2, the ROOT object browser needs several upgrades to support the control of attributes in the nodes being visualized. This development is important when browsing detector geometries. Ideally we would like to have a solution this year, but with the departure of the developer it is not clear today who can implement these new features. Improvements of the TTreeViewer are planned for the release in December.
For GUI3, new editors are in the pipeline. A Style Editor will be implemented by a summer student. We expect to implement a Fit editor before March 2006.


For the GUI designer, originally implemented by V. Onuchin, all complex widgets still remain to be implemented in the toolkit. They will be implemented gradually as time permits. An important feature is the possibility to automatically generate the C++ code for the signal/slot connections.
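As an illustration of the signal/slot code generation, the fragment below shows the kind of wiring the designer is expected to emit, written here by hand against the existing ROOT mechanism (TQObject::Connect); the class and slot names are invented for the example:

    // Sketch of ROOT signal/slot wiring as the GUI builder would generate it.
    #include "TGClient.h"
    #include "TGFrame.h"
    #include "TGButton.h"
    #include "RQ_OBJECT.h"
    #include <cstdio>

    class MyMainFrame {
       RQ_OBJECT("MyMainFrame")   // enables signals/slots for this class
    public:
       MyMainFrame() {
          fMain = new TGMainFrame(gClient->GetRoot(), 200, 80);
          TGTextButton *run = new TGTextButton(fMain, "&Run");
          // String-based connection, resolved via the dictionary at run time.
          run->Connect("Clicked()", "MyMainFrame", this, "HandleRun()");
          fMain->AddFrame(run);
          fMain->MapSubwindows();
          fMain->Resize(fMain->GetDefaultSize());
          fMain->MapWindow();
       }
       void HandleRun() { std::printf("Run clicked\n"); }   // the slot
    private:
       TGMainFrame *fMain;
    };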

6.7.2 Milestones

ID  Delivery date  Description

6.8 The Graphics work package

This work package is responsible for:
- GRA1: basic graphics:
  o lines, text, polygons and markers drawing;
  o graphs to draw a set of points with or without errors (all dimensions);
  o histogram visualisation (all dimensions).
- GRA2: generation of the various output formats such as ps, eps, pdf, gif, jpg and bmp.
- GRA3: image processing classes. This work is done in collaboration with Sasha Vasko, the main developer of the open source system libAfterImage. The image processing classes are becoming more and more interesting in many aspects of visualisation. They provide more functionality and are more efficient when viewing large matrices of points. This set of classes is also used to generate output formats like gif and jpg when running in batch mode. This is required by automatic tools using ROOT to produce reports that can be visualized with external viewers on the web.
- GRA4: 3-D graphics, including basic 3-D primitives and detector geometries. An abstract interface to a 3-D viewer has been defined (TVirtualViewer3D) and implemented with the old X3D viewer (Unix only) and the GL viewers. A big effort is currently going on to develop a powerful GL viewer that can be used to visualize complex geometries and physics events, taking advantage of the modern 3-D graphics cards available on most laptops. The new GL viewer is designed such that it communicates with the client through only a basic set of classes (TVirtualViewer3D and TBuffer3D). The viewer is independent of the original producer (e.g. the TGeo classes, or g3d shapes). This viewer provides much more functionality than viewers based on legacy scene-graph libraries like OpenInventor or Coin3D. It is developed in collaboration with Timur Potchepstov from JINR Dubna. The following features are currently under implementation (a minimal invocation of the viewer is sketched after this list):
  o automatic reloading of the GL database from the original detector geometry database in real time; this is a major achievement compared to classical viewers, which can only show static extracts from the geometry database;
  o slicing of the detector with cuts by 1, 2 or 3 planes;
  o drawing of the outline of the shapes (as in CAD systems);
  o support for composite shapes;
  o optimisation of the GL database with clever caching to speed up the viewing pipeline when rotating or zooming the picture.
- GRA5: implement a version of TPad and TCanvas based on OpenGL, to allow mixing of high level 2-D and 3-D graphics in the same output.
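A minimal invocation of the new viewer through this abstraction uses the standard draw-option mechanism; the geometry file name below is invented for the example:

    // Hand a TGeo geometry to the OpenGL viewer via the "ogl" draw option;
    // the scene is passed through the TVirtualViewer3D/TBuffer3D interface.
    #include "TGeoManager.h"
    #include "TGeoVolume.h"

    void viewDetector() {
       TGeoManager::Import("detector.root");       // assumed geometry file
       gGeoManager->GetTopVolume()->Draw("ogl");   // opens the GL viewer
    }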

6.8.1 Ongoing work

The usual consolidation work for GRA1 is ongoing. We intend to adapt the viewing of histograms with the 3-D options (lego, surfaces) to the new GL viewer. This will happen gradually between September 2005 and June 2006.
For GRA2, we intend to remove several restrictions in the output formats (e.g. transparent pads) and implement support for new graphics patterns in formats like pdf.
For GRA3, we will have to follow the developments in libAfterImage.
For GRA4, we intend to demonstrate the new GL viewer at the ROOT workshop in September, with possible applications in event display packages. The viewing of dynamic particles will follow and could be available for the December release.
For GRA5, availability is expected for the ROOT workshop in September.

6.8.2 Milestones

ID    Delivery date  Description
GRA1  2005/09/30     Demonstrate the new GL viewer at the ROOT workshop

6.9 The GEOM work package

The ROOT geometry classes (TGeo) have been developed in collaboration with ALICE, mainly by Andrei and Mihaela Gheata. These classes support:
- description of complex detectors: all LHC detector geometries are currently supported, as well as virtually all existing detectors;
- automatic and efficient detection of volume overlaps or extrusions;
- navigation in these geometries, with efficient computation of distances to boundaries and of normals to surfaces;
- a powerful visualisation package.
These classes have been used in conjunction with the VMC (see below) and have interfaces to Geant3, Geant4 and Fluka. An interesting feature of the TGeo classes is that they can be used for detector simulation, reconstruction, event displays, online, etc. (a short navigation sketch follows).
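The sketch below uses the present-day TGeo method names, which may differ slightly from the 2005 interface; the geometry file name is invented:

    // Locate a point in a TGeo geometry and step to the next volume boundary.
    #include "TGeoManager.h"
    #include "TGeoNode.h"
    #include <cstdio>

    void navigate() {
       TGeoManager::Import("detector.root");    // assumed geometry file
       gGeoManager->InitTrack(0., 0., 0.,       // starting point (cm)
                              0., 0., 1.);      // direction cosines
       std::printf("start in %s\n", gGeoManager->GetCurrentNode()->GetName());
       gGeoManager->FindNextBoundaryAndStep();  // propagate to the boundary
       std::printf("step %g cm, now in %s\n", gGeoManager->GetStep(),
                   gGeoManager->GetCurrentNode()->GetName());
    }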

6.9.1 Ongoing work

A continuous optimisation process is going on to:
- optimize the performance;
- run quality assurance tests comparing the results with other geometry databases like Geant3, Geant4 and Fluka;
- add additional functions required by reconstruction programs, like the concept of logical assemblies and extrapolation to a virtual surface.
This work is being done mainly by ALICE in the context of their Monte Carlo validation process. The TGeo classes have so far been certified with Geant3 and Fluka. This continuous optimisation process is expected to continue in the coming months. A major worry is the continuation of this process once the main authors' contracts finish in the middle of 2006.

6.10 The SEAL work package

This work package is responsible for maintaining the SEAL software that is not going to be integrated in ROOT but is still required by the LHC experiments and other LCG projects such as POOL. No new developments for these packages are foreseen, and maintenance will be done only for the official platforms agreed by the LCG Applications Area. The SEAL packages that will be maintained in the long term, and that will not become part of ROOT, are the following:
- Foundation: SealBase, SealIOTools, SealUtil, SealZip and PluginManager
- Framework: SealKernel and SealServices

The existing dictionary packages Reflex and Cintex and the MathLib packages (Minuit and MathCore) will be maintained as long as their integration inside ROOT is not yet completed. The dictionary packages Reflection and ReflectionBuilder and the scripting packages PyLCGDict and PyLCGDict2, which are based on Reflection, will be maintained until Reflex is used in production by the experiments.


The current version of SEAL provides, in addition, the dictionary libraries generated using both Reflex and Reflection for some commonly used external packages like STL, CLHEP and ROOT. It is foreseen that the dictionaries for STL and ROOT will be released as part of ROOT. The dictionary for CLHEP will be maintained as long as the experiments have not completed the migration from CLHEP to the new ROOT Math classes.

The SPI group will have the responsibility of building the releases on all the supported platforms.

Milestones:

ID     Delivery date  Description
SEAL1  2005/09/30     Remove Reflex and Cintex from new SEAL releases
SEAL2  2005/09/30     Remove Minuit and MathCore from new SEAL releases
SEAL3  2005/12/31     Remove Reflection, ReflectionBuilder and the scripting packages (PyLCGDict and PyLCGDict2) from new SEAL releases

6.11 Resources

6.11.1 Staffing

BASE:  Fons (0.2), Ilka (0.3), Jose (0.2), Philippe (0.1), Bertrand (0.1), Axel (0.2)
DICT:  Stefan (0.66), Markus (0.1), Wim (0.2), Philippe (0.4), Masa (0.2)
I/O:   Markus (0.2), Philippe (0.5)
PROOF: Fons (0.3), Marek (0.7), Gerri (1.0), Guenter (0.4), Maarten (0.5)
MATH:  Lorenzo (1.0), Andras (1.0), Anna (0.5), Eddy (0.2)
GUI:   Ilka (0.7), Valeriy (0.5), Fons (0.1), Valeri (0.1), Bertrand (0.2)
GRAF:  Olivier (1.0), Richard (1.0), Andrei (0.1), Bruno (0.1), Timur (0.8)
GEOM:  Andrei (0.6), Mihaela (0.2)
SEAL:  Lorenzo (0.1), Stefan (0.1)

         CERN+LCG  LHC Exp.  Other  Total
BASE     0.5       0.2       0.4    1.1
DICT     0.6       0.3       0.6    1.5
I/O      0.2       0.5              0.7
PROOF    2.0       0.4       0.5    2.9
MATH     2.45      0.2              2.7
GRAF     2.0       0.2       0.8    3.0
GUI      1.3       0.3              1.6
GEOM     0.8                        0.8
SEAL     0.2                        0.2
Total    9.0       2.1       3.3    14.4

Table 2 Manpower currently available for the ROOT project by work package and funding source.

Organisation Title/Subject Number CERN – LCG Project Applications Area Plans Phase 2 Owner Approved by Date 27/11/2017 Version Page 29 0.76 Plans for Phase 2 LCG Applications Area

7. Persistency Framework Projects (POOL/COOL)

7.1 Purpose, Scope and Objectives

The persistency framework project provides a general persistency store for the LHC experiments to store event data and associated meta data. Following the proposal of the LCG Persistency RTAG 6, the project has adopted a hybrid approach for its implementation. The LCG persistency framework consists of two main packages: POOL and COOL.
The POOL project provides a generic object store with a consistent interface, independent of the back-end technology, to store C++ objects in any of the POOL implementation technologies (ROOT or RDBMS). The main concept used to store and retrieve objects is the POOL token, which is generated when an object is stored. This token fully specifies the location of an object in a distributed store consisting of many files and databases. The token can later be used to quickly retrieve any object from the store. In addition, the POOL object token can be persistently stored inside other objects to represent object associations, in a similar way as C++ pointers do for transient data (a usage sketch is given at the end of this section).
POOL has provided the main required functionality for file based object storage integrated with the grid cataloguing services. These core capabilities have been validated in several larger experiment data challenges up to the scale of 400 TB. More recently POOL has provided a general relational database abstraction package (RAL), which has successively replaced all direct (MySQL specific) database use throughout POOL and has also been picked up by experiments to implement their database applications. This package is expected to play a central role in the integration of physics database applications with the distributed database environment as prepared by the LCG 3D project.
As one of the central LCG data management packages, POOL is responsible for integrating the experiment offline frameworks with other core packages produced by the Applications Area such as SEAL and ROOT. At the same time POOL components connect the experiment applications with basic grid services such as file catalogs, security and distributed database services. For this reason POOL will need to evolve during LCG Phase 2 in close collaboration with both the Applications Area and the grid middleware/deployment areas.
The COOL subproject provides a more specialised software package to manage time series of user data such as detector conditions or calibration constants. The package provides a high level API, which allows the user to store arbitrary payload data (relational tuples or references to complex objects) together with the time interval of its validity (IOV). COOL supports organising a large number of different data types in a hierarchical folder structure, much as file systems organise files. COOL also optionally supports versioning of the payload data for a given quantity and validity time. This versioning is required to represent, e.g., different results for calibration constants that have been obtained from different algorithms. For conditions data, the time of validity, the conditions folder and the optional processing version uniquely define each payload data item.
The COOL implementation is built on top of the relational database abstraction (RAL) and will directly profit from its integration with grid-aware database services. COOL will allow using the POOL C++ object storage components to store more complex object payloads in databases or files; this choice will be left to the COOL users/experiments. COOL is a more recent development that still needs to be fully integrated into the experiment frameworks and validated in larger scale production activities. In particular, the distributed deployment of COOL as one of the first distributed database applications will need significant preparation and testing during Phase 2 of LCG.
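To make the token/Ref concept concrete, a highly simplified storage sketch is given below. The names follow the POOL user-level API of the period (pool::Ref, pool::Placement, markWrite) as far as they can be reconstructed here; header paths and exact signatures should be treated as assumptions, not as the normative interface:

    // Simplified sketch of writing an object through POOL and keeping a Ref.
    // Header paths and signatures are assumptions for illustration only.
    #include "PersistencySvc/Ref.h"        // assumed header for pool::Ref
    #include "PersistencySvc/Placement.h"  // assumed header for pool::Placement
    #include "MyEvent.h"                   // user class with a POOL dictionary

    void store(pool::IDataSvc* svc, const pool::Placement& where) {
       pool::Ref<MyEvent> ref(svc, new MyEvent);
       ref.markWrite(where);   // schedule the object for storage at 'where'
       // On transaction commit the Ref carries a token that locates the
       // object in the distributed store; the token (or the Ref itself,
       // embedded in another object) can be persisted like a C++ pointer.
    }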

7.2 Project Organization

The persistency project is structured into work packages, which closely follow the domain decomposition of the persistency framework. Each work package is coordinated by a work package leader and is responsible for the definition of its component interfaces in collaboration with the main clients. These clients may be other POOL components or end-users. Each work package is also responsible for implementing, if necessary, several different technology plugins for each component, to match the required operational environments or to follow technology evolution over time.

Work Package                           Coordination
Catalog and Grid Integration           Zhen Xie
Storage Manager and Object References  Markus Frank
Collections and Meta Data              David Malon
Infrastructure and Testing             Giacomo Govi
Database Access & Distribution         Ioannis Papadopoulos
Conditions Database                    Andrea Valassi

Roles
Project Leader      Dirk Duellmann
Release Management  Ioannis Papadopoulos (POOL), Andrea Valassi (COOL)

POOL Experiment Contacts
Experiment  Link Person
ATLAS       Ioannis Papadopoulos
CMS         Giacomo Govi / Zhen Xie
LHCb        Markus Frank

Organisation Title/Subject Number CERN – LCG Project Applications Area Plans Phase 2 Owner Approved by Date 27/11/2017 Version Page 31 0.76 Plans for Phase 2 LCG Applications Area

7.3 Persistency Framework work packages

7.3.1 Object Storage and References

This work package defines a technology independent API to store and retrieve arbitrary user objects. This includes the definition of object cross references (tokens) and the infrastructure to use them transparently from C++ (Refs). The API also defines the semantics for consistent transactions across file and database storage implementations. It provides implementations of this object store and an associated client side object cache based on ROOT I/O, as well as an object-relational mapping component.
The main milestones in this area include the Applications Area wide migration to the new LCG reflection package "Reflex", which will affect most POOL storage components. The implementation dependent storage managers for ROOT I/O and the relational object mapping will be upgraded to make use of the new package, as will the POOL references, which use reflection to implement safe type casting operations. The ROOT based storage manager will take advantage of the Cintex package. In parallel to the move to Reflex, POOL will provide a set of command line tools and APIs that allow storing user-provided object mappings for pre-existing relational data. These tools will support the experiment requirement to make, e.g., relational data written directly into a supported back-end database available in a way consistent with other POOL stored objects.
After an initial Reflex based release of POOL we expect a validation phase in which the experiments evaluate the new dictionary implementation, in particular with respect to compatibility with data written with older POOL releases. After successful validation POOL will release a Reflex based production version and will freeze developments based on the older LCG dictionary.
Another important improvement for the POOL deployment will be the introduction of schema loading on demand, based on the SEAL dictionary service (a lookup sketch is given at the end of this section). This task will likely require work on the ROOT side to allow for the necessary callouts to inform POOL of the attempt to use a C++ class for which no dictionary has been loaded so far. With the integration of Reflex, Cintex and on demand schema loading it will be possible to provide a seamless integration of POOL into ROOT as an analysis shell. Some clarification from the experiments about their detailed requirements (POOL plugin for ROOT or direct interpretation of POOL tokens inside ROOT) needs to be obtained to schedule the necessary developments on the POOL and ROOT side to allow transparent use of all POOL features from ROOT based analysis programs. A proposal will be produced by the project and presented to the experiments for discussion.
To allow consistent and secure access to file based and database based data, POOL will need to pick up the security infrastructure as defined by ROOT. This may involve passing LCG certificate information to a ROOT I/O version which supports SRM based files, and to RAL based components for secure database access.
Further discussions will be required for the integration with xrootd as file server backend, which may have an impact on the POOL file catalog and security support. This activity will need more consultation with the experiments planning to use xrootd, to come up with a common work plan proposal by the POOL and ROOT teams.
After the release of the repackaged RAL component, POOL will upgrade the object-relational mapping layer to the new version of that package. This will provide database lookup and client side database diagnostics also to applications storing complex C++ objects via POOL in relational databases.
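The dictionary lookup on which on-demand schema loading hinges can be sketched as follows; the Reflex namespace placement varied between releases (early versions lived in ROOT::Reflex), so the spelling below is indicative:

    // Check whether a dictionary for a class is already loaded. An invalid
    // Type is the point where POOL would be called back to load the schema.
    #include "Reflex/Type.h"
    #include <string>

    bool haveDictionary(const std::string& className) {
       ROOT::Reflex::Type t = ROOT::Reflex::Type::ByName(className);
       return bool(t);   // an invalid Type converts to false
    }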

Milestones:

ID      Delivery Date  Description
OSM01   2005/07/31     Validation release based on Reflex
OSM02   2005/09/30     Reflex production release (Reflex validation by experiments completed)
OSM03   2005/08/31     Command line tools to access pre-existing relational data
OSM04   2005/09/30     Support mapping of STL containers to BLOBs
OSM05   2005/09/30     On demand schema loading
OSM06   2005/10/31     Proposal for POOL refs and collections accessible from ROOT as analysis shell
OSM07   2005/11/30     Certificate based access to files and databases
OSM08   2005/09/30     Completed migration of Relational Storage Manager to the new RAL release
POOL01  2006/03/31     Alignment with new CORE plugin and foundations proposals
POOL02  2006/03/31     Overall performance study and validation with expected requirements

7.3.2 Collections and Meta Data

The collection work package provides a technology independent API to access and manage large scale collections of persistent objects. Both the collections and the objects themselves may reside in a POOL file or database. The collection API forms the developer level tools to construct queryable event collections, as used by experiment analysis and book keeping activities. The API is accompanied by a set of command line tools that allow maintaining collections, similar to the administration tools provided for the file and database catalogs.
The collection implementation is increasingly integrated with the POOL object storage services. It is therefore proposed to merge this work package with the Object Storage work package.
After the new RAL release, the database based collections will make use of the security, distribution and monitoring capabilities introduced there. All collection implementations will undergo a validation against the experiment requirements. For the database based collections this activity will also be used to define the database service requirements for the associated back-end services.

Milestones:

ID     Delivery Date  Description
COL01  2005/09/30     Collection implementation based on new RAL release
COL02  2005/09/30     Tier 0 Collection scalability test
COL03  2005/10/31     Relational collection distribution tests

7.3.3 Database Access and Distribution

The database access and distribution work package defines a vendor independent, high level relational database API (RAL) that is used as the foundation for the relational POOL components, and has also been picked up for relational database applications developed by the experiments. RAL implementations for Oracle, MySQL and SQLite are provided as part of POOL. Other back-ends may follow later as required by the experiment deployment models.
After the first production releases, POOL RAL went through a component review, and several interface changes and functionality extensions have been requested. In particular, the infrastructure for logical database service lookups (instead of hard-coded physical connect strings), consistent error and retry handling in case of connection problems, and client side query summaries (top queries and query latencies) will be required to support the distributed deployment of database applications. To allow the use of the database foundation layer without introducing unnecessary dependencies on higher-level POOL components (e.g. for online use), the software will be packaged such that it can be used also outside of the full POOL release. These requests will be addressed together as part of a first re-factoring release of RAL, aligned with the database distribution tests in the LCG 3D project. A sketch of the RAL access pattern is given at the end of this section.
An important development that has been agreed recently is a plug-in which will connect the POOL database access layer to a data caching infrastructure such as the FroNtier system. This will allow accessing read-only data via the http protocol instead of direct connections to a database server. This promises increased scalability and availability by replication of database data using standard web cache server implementations like squid. A proof-of-concept implementation of this plug-in will be provided and tested as part of the LCG 3D work plan against application workloads provided by the experiments. This database caching approach is expected to significantly simplify the deployment of replicated relational data to higher LCG tiers for those database applications for which the cache consistency issues can be controlled. Introducing this cache in the POOL database abstraction layer will make the new access method available to all higher-level components and applications, e.g. conditions data stored via COOL.
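The access pattern RAL provides can be sketched as follows. Since the exact 2005 RAL signatures are not reproduced here, the sketch uses the API of CORAL, the package RAL later evolved into, which preserves the same concepts (a logical connection string resolved by lookup, schema and table handles, and no SQL dialect in user code); the service and table names are invented:

    // Vendor-independent insertion of one row, CORAL-style.
    #include "RelationalAccess/ConnectionService.h"
    #include "RelationalAccess/ISessionProxy.h"
    #include "RelationalAccess/ITransaction.h"
    #include "RelationalAccess/ISchema.h"
    #include "RelationalAccess/ITable.h"
    #include "RelationalAccess/ITableDataEditor.h"
    #include "RelationalAccess/AccessMode.h"
    #include "CoralBase/AttributeList.h"

    void insertRow() {
       coral::ConnectionService svc;
       // Logical service name, resolved via lookup, instead of a hard-coded
       // physical connect string (one of the review requests listed above).
       coral::ISessionProxy* session =
          svc.connect("myapp/conditions", coral::Update);   // invented name
       session->transaction().start();
       coral::ITable& table = session->nominalSchema().tableHandle("T_CALIB");
       coral::AttributeList row;
       row.extend<int>("ID");
       row["ID"].data<int>() = 42;
       table.dataEditor().insertRow(row);
       session->transaction().commit();
       delete session;
    }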

Milestones:

ID     Delivery Date  Description
RAL01  2005/04/30     RAL component review
RAL02  2005/08/31     Complete support for database bulk operations (adding bulk update and delete required for COOL)
RAL03  2005/08/31     RAL validation release (implementing the RAL review outcome and separate packaging)
RAL04  2005/09/30     Proof-of-concept plug-in for proxy-cache database access
RAL05  2005/10/31     RAL production release
RAL06  2005/11/30     Production release of proxy-cache access plugin
RAL07  2005/12/31     Native MySQL plugin (instead of ODBC based implementation)

7.3.4 Catalog and Grid Integration

The catalog component is responsible for the management of consistent sets of POOL files and databases and their associated meta data. For grid disconnected use cases, native POOL implementations exist (based on XML or database access via RAL). For deployment in grid environments, POOL provides adaptors that connect POOL to the catalog services provided by the grid middleware (e.g. LFC, Fireman, Globus RLS). To achieve this, POOL has, with release V2.1, separated the catalog lookup interface from the now optional meta data interface. This has allowed the use also of those grid catalog services which do not support the definition of catalog meta data. This separation will also simplify a possible direct integration with experiment provided meta data systems.
After the new file catalog adaptors have been evaluated by the experiments, POOL will release production versions which take into account the experiment feedback and bug fixes from the middleware developers. At this point we would like to de-support the older EDG-RLS implementations, for which development has already been stopped in agreement with the experiments. After the implementation of the review comments to the POOL/RAL package, the POOL relational catalog will also be upgraded to make use of the new version of this package, and this catalog (with MySQL, Oracle and SQLite as back-ends) will become the production implementation for local/non-grid use. It is proposed to de-support the older direct implementation against MySQL at this point.

Milestones

ID     Delivery Date  Description
CAT01  2005/06/30     Separate lookup and meta data interfaces
CAT02  2005/08/31     Production release of LCG catalog adaptors (level 2)
CAT03  2005/09/30     Relational catalog upgrade to new RAL (including database lookup and security)

7.3.5 Conditions Database

The Conditions Database subproject is responsible for providing the software and tools to store, manage and distribute conditions data (detector data that vary with time and may exist in several versions, such as calibration and alignment data). Since Q4 2004, the subproject has focused on the development of COOL, a new software implementation designed to merge the two pre-existing Oracle and MySQL based packages and extend their functionality using a unified approach. COOL has been designed to avoid duplication of effort by making maximum use of the software and tools provided by the SEAL and POOL projects. COOL provides an abstract interface to store and retrieve conditions data using the same user code to access different back-ends (a storage sketch is given at the end of this section). Presently, only one relational implementation of this API exists, providing access to Oracle and MySQL relational databases via the POOL RAL component.
The focus on the COOL development side will stay on the completion of the functionality and performance needed by the experiments for their first deployment of COOL at CERN in 2005. A prioritised list of requirements has been gathered based on experiment input during the project meetings (http://savannah.cern.ch/task/?group=cool). In the short term, the main priorities for functional extensions are the implementation of a deployment model involving read-only and write-only accounts with no DDL privileges, the completion of user examples and documentation, the provision of a few missing methods for basic tag manipulation, the completion of support for the SQLite backend, and the possibility to store long strings as CLOBs in the database. Towards the end of 2005, the emphasis will move to the development of tools and APIs for data distribution to COOL deployments outside CERN. On the same timescale, COOL will, like the other relational components, migrate to the new RAL package soon after its release, to profit from service lookup, grid authentication, connection retry and client monitoring, and from many other functional and performance enhancements recommended by the COOL team during the RAL review.
In parallel with the ongoing development of COOL, the first production release is currently undergoing a series of performance and scalability tests stressing both the software and the back-end service. The tests have just started at the CERN Tier 0, applying reference workloads and functional requirements as defined by the experiments for different use cases, such as online conditions data population and offline conditions reading. A comprehensive multi-client COOL test suite will be developed for this purpose, which will be the basis for regression tests on new software releases and to validate service changes (e.g. new database server hardware/configuration or new database software releases). The first results from the tests have already pointed out the need for some high-priority performance enhancements in COOL and in RAL, such as the simultaneous bulk insertion/retrieval of IOVs into/from different channels in the same COOL folder, or an improved handling of Oracle cursors in RAL.
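The storage sketch below illustrates the folder/IOV model described above. The call names follow later public COOL releases (cool::Application, IFolder::storeObject) and may differ in detail from the 1.x interface under discussion; the connection string and folder path are invented:

    // Store a payload with its interval of validity in a COOL folder.
    // API names follow later COOL releases; details are illustrative.
    #include "CoolApplication/Application.h"
    #include "CoolKernel/IDatabase.h"
    #include "CoolKernel/IFolder.h"
    #include "CoolKernel/Record.h"

    void storeCalib(const cool::Record& payload) {
       cool::Application app;
       cool::IDatabasePtr db = app.databaseService().openDatabase(
          "oracle://devdb10;schema=COOL_OWNER;dbname=TESTCOOL",  // invented
          false /* open read-write */);
       cool::IFolderPtr folder = db->getFolder("/Detector/Calib"); // invented
       cool::ValidityKey since = 1000, until = 2000; // encoded run/time range
       cool::ChannelId channel = 0;
       folder->storeObject(since, until, payload, channel);
    }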

Milestones:

ID     Delivery Date  Description
CDB01  2005/04/30     First COOL production release
CDB02  2005/07/31     Define reference performance requirements for COOL software and COOL service (ATLAS online exists)
CDB03  2005/07/31     COOL 1.2.1 release with support for SQLite backend and Oracle tools to grant reader/writer/tagger privileges
CDB04  2005/07/31     COOL Tier 0 deployment starts for ATLAS
CDB05  2005/08/31     User examples and basic user guide, including documentation of current and foreseen tag functionality
CDB06  2005/08/31     COOL 1.3.0 release with support for large character objects (CLOBs)
CDB07  2005/09/30     COOL software review
CDB08  2005/10/15     COOL 1.4.0 release with bulk insertion and retrieval in different “channels” within the same “folder”
CDB09  2005/10/15     COOL passes all Tier 0 validation tests
CDB10  2005/10/15     COOL integration into ATLAS and LHCb offline frameworks finished
CDB11  2005/11/30     COOL 1.5.0 release based on new RAL release
CDB12  2005/12/31     First prototypes of API and command line tools for data extraction and cross-population between COOL databases

7.4 Technical process

POOL and COOL follow the LCG Applications Area software process, and POOL has contributed very actively to its definition. The POOL project uses a release procedure based on individual component tags, which get integrated into project-internal releases. These releases are picked up by the experiments for early validation of new functionality and to confirm bug fixes. Internal releases are produced relatively frequently, typically every 2 weeks. Production releases that are fully validated and supported are produced, based on the experiments' plans and the outcome of their validation, every 2-3 months. POOL has to maintain compatibility for older releases while providing validation releases with new, still incomplete functionality. This requires the project to regularly branch off the production release line from the development head (e.g. during the ROOT 4 and Reflex validation). The POOL developers provide documentation as part of the release procedure. Reference and user documentation are produced in html and pdf formats based on DocBook fragments that are tagged in CVS together with the software releases. POOL and COOL are coordinated via open developer meetings, which take place every two weeks for POOL and every week for COOL. Project plans and requests for experiment prioritisation are reported to the Applications Area Architects Forum. Project status and plans are regularly exposed to the larger community during experiment weeks and during persistency workshops organised at least once per year.

7.5 Resources and Milestones

7.5.1 Staffing

The following table lists active persistency framework developers, i.e. people who have recently committed code to the persistency framework repository. Where a commitment to continue participation in 2005 has been obtained, this is listed as well. The total committed manpower for the year 2005 sums up to 6 FTE, of which 2.3 FTE are contributions from LHC experiments.

Name                  FTE   Work package
CERN IT / LCG
Radovan Chytracek     0.7   RAL, MySQL plug-in, client monitoring
Dirk Duellmann        0.5   Persistency framework leader
Giacomo Govi          0.7   RAL, object mapping
Maria Girone          0.2   Catalogs
David Front                 Application & DB service validation (currently COOL performance testing)
Ioannis Papadopoulos  0.8   Persistency service, RAL, object relational storage manager, primary release coordination
Andrea Valassi        0.8   COOL subproject leader, COOL kernel
ATLAS
Kristo Karr           0.5?  Root and databases collections
Marcin Nowak          0.5   Storage manager, schema evolution, schema loading
Sven Schmidt          0.8?  COOL kernel
CMS
Zhen Xie              0.3   Catalogs, object relational storage
William Tanenbaum     0.1   Root storage manager
LHCb
Markus Frank          0.2   Storage manager, ROOT backend
Marco Clemencic       0.1   COOL, RAL


8. Simulation Project

8.1 Purpose, Scope and Objectives

The Simulation Project covers a range of simulation activities within the LCG Applications Area, encompassing common work among the LHC experiments. Its activities include the development of a simulation framework and infrastructure for physics validation studies, and the CERN and LHC participation in the Monte Carlo generator services and in the Geant4, Fluka and Garfield simulation engines. The Project coordinates a major effort among all the experiments and the LCG to validate the physics of the simulation, using and providing a simulation infrastructure and event generator services.

8.2 Project Organization

The LCG Simulation Project contributes both management and development effort to external projects like Geant4 and Garfield, and sees external participation from Fluka and Monte Carlo generator authors, as well as participation of the experiments' experts in the validation activity. The structure in subprojects is shown in Figure 3. The overall project is led by Gabriele Cosmo (CERN). The subprojects are led by John Apostolakis (Geant4), Paolo Bartalini (Generator Services), Alfredo Ferrari (Fluka), Witold Pokorski (Simulation Framework), Alberto Ribon (Physics Validation) and Rob Veenhof (Garfield).


Figure 3 - The overall structure of the Simulation Project. External participation in each subproject (blue circles) is also shown.


8.3 Simulation Framework

The general task of the Simulation Framework subproject is to provide a flexible infrastructure and tools for the development, validation and usage of Monte Carlo simulation applications. The purpose is to facilitate interaction between experiments and simulation toolkit developers, as well as to eliminate duplication of work and divergence.

8.3.1 WP1 - Geometry Description Markup Language

The Geometry Description Markup Language (GDML), initially developed in the context of an XML based geometry description for Geant4, has been adopted by the Simulation Project as the geometry exchange format. Its purpose is to allow interchanging detector geometries between different applications (simulation and/or analysis). The work package consists of the GDMLSchema part, which is a fully self-consistent definition of the GDML syntax, and the GDML I/O part, which provides the means for writing out and reading in GDML files (a reader sketch is given after this list). The GDML schema is already complete enough to support realistic geometries like the LHCb one. The implementation of some missing elements, like support for divisions and reflections (used in some more complex geometries), is on the list of milestones. GDML readers and writers for Geant4 supporting the present schema have been implemented in C++. In addition, GDML readers for Geant4 and Root have been implemented in Python.
- GDMLSchema extension to support divisions and reflections: first half of 2005
- Python implementation of GDML writers for Root: first half of 2005
- Introduction of modularization of GDML files allowing for selective importation of parts of the geometry: second half of 2005
- User support and new features implementation, such as refinement of GDMLSchema to support user extensions of elements: June 2006
- CPPGDML code revision and re-engineering to allow easier extensibility: December 2006
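For illustration, the reading side of the intended round trip can be sketched with the G4GDMLParser interface that later became part of Geant4 itself; the standalone CPPGDML reader of this period exposed its own processor classes, so take the names as indicative only:

    // Parse a GDML file and hand the resulting world volume to Geant4.
    #include "G4GDMLParser.hh"
    #include "G4VPhysicalVolume.hh"

    G4VPhysicalVolume* LoadGeometry() {
       G4GDMLParser parser;
       parser.Read("detector.gdml");     // assumed XML geometry description
       return parser.GetWorldVolume();   // ready for the run manager
    }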

8.3.2 WP2 - Geant4 geometry object persistency

In addition to the work on the geometry interchange format, some effort is also devoted to direct object persistency in Geant4. It is planned to perform a feasibility study of the usage of POOL/ROOT for that purpose. Such a mechanism would be useful for running detector simulation of complex detectors, as well as for storing Geant4 geometries which are constructed interactively.
- Feasibility study of POOL/ROOT based Geant4 geometry persistency: first half of 2005


8.3.3 WP3 - Python interfaces to Geant4

Python interfaces to C++ applications have already proven their usefulness in adding flexibility and configurability, as well as in facilitating the 'gluing' of different elements together. This technology also has clear benefits in the context of detector simulation. The effort undertaken was meant to demonstrate the usage of LCG tools, such as Reflex and its Python binding, for running a Geant4 simulation application from the Python prompt. Several examples have been implemented and documented on the Simulation Framework web page. Using the existing ROOT Python binding (PyROOT), an example has also been implemented demonstrating a Geant4 simulation interfaced to ROOT visualization, all in Python and using GDML as the geometry source.
- Working example of Python-driven Geant4 application (using Reflex): first part of 2005

8.3.4 WP4 – Simulation Framework for physics validation

The aim of this work package is to provide example and, whenever possible, reusable code for the Flugg application in the context of Fluka-Geant4 test beam validations. Flugg is an interface which allows running Fluka with a Geant4 geometry description. The activity within the Simulation Framework subproject concentrates on the application of Flugg in the context of physics validation, with a particular emphasis on the implementation of the generic interface to 'hits'.
- Example of Flugg extension applied to the test-beam validation: second part of 2005
- Completed evaluation of VMC for use in physics validation: first half of 2006

8.3.5 WP5 - Monte Carlo truth handling

Monte Carlo truth handling is a difficult task, especially for the large multiplicity events of the LHC. A large number of particles is produced, for instance, in the showers, and the selection criteria for filtering out unwanted particles are often complicated. All of the LHC experiments have come up with their own solutions, but further improvement of the performance and flexibility is possible. Some effort will be devoted within the Simulation Framework subproject to perform a feasibility study of a common mechanism for MCTruth handling.
- Proposal for universal handling of Monte Carlo truth: second half of 2005
- Tentative first release of a common framework for MCTruth handling: second half of 2006

8.3.6 Sub-project Milestones

ID     Delivery date  Description
05.05  2005/03/31     New release of GDML 2.1.0 with improvements in configuration scripts and kernel modules
05.06  2005/03/31     First prototype of a Python-driven Geant4 simulation using LCGDict and PyLCGDict
05.10  2005/06/15     New release of GDML including ‘autoconf’ based build system front-end
05.11  2005/06/15     Working example of POOL/ROOT-based geometry persistency mechanism for Geant4
05.15  2005/06/30     GDML-Schema extension to support divisions and reflections
05.16  2005/06/30     Feasibility study of POOL/ROOT based Geant4 geometry persistency
05.17  2005/06/30     Python implementation of GDML writers for Geant4 and ROOT
05.20  2005/07/15     First version of ROOT->GDML geometry writer in Python
05.30  2005/09/30     Flugg application to one of the LHC calorimeter test-beam simulations
05.37  2005/12/15     Introduction of modularization of GDML files
05.38  2005/12/15     Proposal for universal handling of Monte Carlo truth
05.39  2005/12/15     Evaluation of VirtualMC for usage in Geant4-Fluka test-beam validation studies
06.08  2006/06/30     Refinement to GDMLSchema to support user extensions of elements
06.09  2006/06/30     Completed evaluation of VMC for use in physics validation
06.13  2006/10/31     First release of a common framework for MCTruth handling
06.15  2006/12/15     CPPGDML code revision and re-engineering to allow easier extensibility

Table 3 Simulation Framework level-2 and level-3 (italic) milestones

8.4 Physics Validation

The goal of this subproject is to compare the main detector simulation engines for the LHC, Geant4 and Fluka, with experimental data, in order to understand whether they are suitable for LHC experiment applications. The basic strategy for validating these simulation programs is that the systematic effects of any major physics analysis should not be dominated by the uncertainties coming from simulation. This approach relies on the feedback provided by the physics groups of the LHC experiments to the developers of these simulation codes.


8.4.1 WP1 – Impact of simulation on LHC physics

In this work package, we aim to collect and revisit requirements on the simulation packages from the LHC experiments, by evaluating the impact of simulation uncertainties on physics observables. A document summarizing the major requirements has already been released, but it is expected to be updated in 2006, based on the analyses of the latest (2004) test-beams. Furthermore, the current effort in the CMS collaboration to prepare the Physics TDR could also provide important new feedback to the simulation developers.

8.4.2 WP2 - Electromagnetic physics

Electromagnetic physics is always the first physics component of any simulation code to be validated, because it also plays an important role in hadronic showers. The goal of this work package is precisely to test the electromagnetic physics of the simulation packages, via both test-beam data (using electrons and muons as beam particles) and simple benchmarks (i.e. thin target setups). The electromagnetic physics of Geant4 has already been validated at the percent level, and there is some activity in ATLAS to push its precision below that level, allowing the use of simulation for calibrating the calorimeter. Important new validation results for electromagnetic physics (and for hadronic physics as well, see WP3a and WP3b) will be produced by the analyses of the latest (2004) test-beams. Preliminary results are expected in the second half of 2005, whereas more complete analyses should be available in 2006.

8.4.3 WP3a – Hadronic physics: calorimetry

Hadronic physics is a very broad and complex field, due to the inability of the underlying theory, quantum chromodynamics (QCD), to make calculable, perturbative predictions for hadron-nucleus reactions in the kinematic region relevant for describing the passage of hadrons through a detector. The simulation therefore relies on different models, which need to be tested in various combinations of beam particle type, beam kinetic energy and target material. Given the variety of aspects of hadronic physics validation, we have subdivided WP3 into three work sub-packages. In the first one, described in this section, we check the hadronic physics simulation with calorimeter test-beam data. In this case, the physics observables are macroscopic properties of hadronic showers, which are the result of many interactions and effects. Simple benchmarks are also utilized to study single particle interactions, providing complementary information at the microscopic level.
The first round of hadronic physics validation has been concluded, with good results for the pion energy resolution and the e/pi ratio. For the longitudinal and transverse shower shapes, some discrepancies between the Geant4 QGSP physics list and the data are observed. Most of the effort of the LCG contribution to Physics Validation is devoted to addressing this issue.


With version 7.0 of Geant4, some changes have been introduced to improve the shower shapes. The experiments are repeating their analyses with the old test-beam data to check whether the problem is now fixed. Results should be ready by mid-2005.
The major analyses that have been produced so far with calorimeter test-beams have been carried out with the Geant4 physics lists LHEP and QGSP. It is quite interesting to compare at least one of them with Fluka. This work is also technically challenging for the Simulation Framework subproject (see WP4 below). Results are expected for the second half of 2005.
After a discussion with experiment and simulation experts, the third simple benchmark was chosen: rapidity and transverse momentum distributions of inclusive charged pions produced in 100 GeV hadron (pion, kaon, proton)-nucleus (Mg, Ag, Au) interactions. Results are expected for the second half of 2005. Some other possible simple benchmarks have been identified for the future.
Important new validation results on hadronic physics in calorimeter setups (as well as in trackers, see WP3b, and in general for electromagnetic physics, see WP2) will be produced by analysing the latest (2004) test-beam data. Preliminary results are expected in the second half of 2005, whereas more complete analyses should be available in 2006.

8.4.4 WP3b – Hadronic physics: inner detectors

The second work sub-package for hadronic physics validation is focused on tests in tracking detectors. Similarly to simple benchmarks, tracking setups offer the unique possibility to test hadronic models at the level of single interactions and, at the same time, allow estimating the reliability of the simulation for the occupancy of the tracker, and hence also for the properties of the reconstruction algorithms. Some of the activities of this work package have also been technically relevant for the Simulation Framework subproject (see WP4). New validation results on hadronic physics in inner detectors (as well as for calorimeters, see WP3a, and in general for electromagnetic physics, see WP2) will be produced by the analyses of the latest (2004) test-beams. Preliminary results are expected in the second half of 2005, whereas more complete analyses should be available in 2006.

8.4.5 WP3c – Hadronic physics: background radiation

The LHC caverns will be a very hostile environment, with high radiation levels. Shielding is therefore mandatory to protect some components of the detectors. The design of the shielding is based on the background radiation predicted by some simulation codes. In spite of the use of conservative “safety factors” to scale up the predicted radiation level, and of a continuous effort to benchmark Fluka with experimental data, it is nevertheless interesting to compare its results with those of other independent simulation packages, such as Geant4. Unlike its predecessor Geant3, Geant4 offers a precise treatment of low energy neutrons, which is essential for this kind of application. The purpose of this work package is therefore to compare the simulation results for the background radiation in the LHC experimental caverns between Fluka and Geant4.


Some preliminary results have already been presented at Physics Validation meetings by the LHCb and CMS collaborations. In the former case, the tools needed to obtain fluences and doses are in place, and the next step is to carefully use Geant4 with a set of “cuts” as close as possible to those used for Fluka, in order to have a consistent comparison between the two simulation engines. This is expected for the middle of 2005.

8.4.6 WP4 – Validation of the simulation environment

This last work package is somewhat less physics oriented, but still very important in practice in order to have effective simulation tools available for the LHC. There are two main working directions: (a) validate the shower parameterization packages (which are useful in many situations where a less accurate but significantly faster simulation is needed), and (b) validate the adequacy and usability of simulation infrastructures (e.g. simulation frameworks, geometry description exchange, interactivity, etc.). This second part is done in close collaboration with the Simulation Framework subproject.
As far as the validation of shower parameterization packages is concerned, the plan is to start at the beginning of 2006 by reviewing the current status of these packages and the tests that have been performed so far. Then, in close collaboration with all the LHC experiments, it will be decided how to proceed. Results are expected in the second half of 2006.
For the work on the Simulation Framework, an important first step has already been achieved in using Flugg to extend to Fluka the comparison with the ATLAS pixel 2001 test-beam data, starting from a Geant4 description of the setup. The next step is to achieve something similar for the much more complex case of a calorimeter test-beam setup. As described in WP3a, this work is ongoing and results are expected for the second half of 2005.

8.4.7 Sub-project Milestones

ID     Delivery date  Description
05.02  2005/02/28     Review/prioritization of simple benchmarks for physics validation
05.08  2005/05/15     First results of background radiation studies with Geant4
05.12  2005/06/30     New validation results on longitudinal shower shapes with Geant4 7.0 completed
05.22  2005/09/30     3rd simple benchmark for physics validation completed
05.24  2005/09/30     Validation of Fluka against one LHC calorimeter test-beam
05.32  2005/10/31     First results of ATLAS combined and 2004 test-beams data comparisons
06.12  2006/10/31     Validation of shower parameterization packages completed

Table 4 Physics Validation level-2 and level-3 (italic) milestones


8.5 Geant4

This sub-project contributes to the support, maintenance and development of Geant4 in a number of areas. The continuous response to problem reports, inquiries, requests to port to and test on new compilers, etc., from users applying Geant4 in production for the LHC and related areas, requires a significant and growing fraction of the current effort. In addition, development, verification and validation are continuing to address existing and new requirements from the LHC experiments. The work of the sub-project is sub-divided into work packages according to the work area in which support, maintenance and development are envisaged. In addition there are dedicated work packages for the software management aspects, for the system integration testing and for the coordination/release aspects.

8.5.1 WP1 - Geometry, Field and Transportation

This work package undertakes a leading role in the support, maintenance and further development of the corresponding Geant4 module for creating and navigating model detector geometries, for propagating charged tracks in the model geometries, and for communicating the results to the tracking module. Additional capabilities include the provision of event biasing, using Russian roulette and splitting, driven by volume importance values or space-energy weight windows. (A minimal geometry construction sketch is given after the task list.) The main tasks are:
- Geometry developments including improved parameterizations and a generalized twisted trapezoid shape. Scheduled by end of March 2005.
- A revision of the geometry optimization mechanism to handle re-optimization of local sub-trees for usage in dynamic geometry setups. First design iteration for allowing navigation on parallel geometries. Evaluation and tuning/benchmarking of performance for transportation in field. Scheduled by end of June 2005.
- First implementation of a new ellipsoid shape. Prototype implementation for allowing tuning of the precision/tolerance for points on the surface of solids. Scheduled by end of September 2005.
- An implementation review of the second order equation algorithms for tube and cone solids, in an attempt to increase precision in the calculation of intersections. Extensions to the optimisation mechanism for special handling of navigation in regular geometry structures. Scheduled by end 2005.
- Implementation of new tools to help in the identification of overlaps in the user’s geometry at the time of construction (i.e. placement of volumes in the geometry setup). Scheduled by end of March 2006.
- First implementation of parallel navigation to enable better radiation monitoring of charged particles at arbitrary locations. Scheduled by end of May 2006.
- Implementation of tuneable tolerances applied for the treatment of points on the solids’ surface, based on the implemented prototype. Scheduled by end 2006.
- First implementation of Boundary Represented Solids (BREPS) for experiment support structures. Scheduled by March 2007.
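As a reminder of the building blocks this work package maintains, a minimal world-volume construction in standard Geant4 is sketched below (modern header spellings; the 2005 releases obtained units and materials slightly differently):

    // Minimal Geant4 geometry: a solid, a logical volume and a placement.
    #include "G4Box.hh"
    #include "G4LogicalVolume.hh"
    #include "G4PVPlacement.hh"
    #include "G4NistManager.hh"
    #include "G4SystemOfUnits.hh"
    #include "G4ThreeVector.hh"

    G4VPhysicalVolume* BuildWorld() {
       G4Material* air =
          G4NistManager::Instance()->FindOrBuildMaterial("G4_AIR");
       G4Box* worldBox = new G4Box("World", 1.*m, 1.*m, 1.*m); // half-lengths
       G4LogicalVolume* worldLog = new G4LogicalVolume(worldBox, air, "World");
       // No rotation, placed at the origin; a null mother marks the world.
       return new G4PVPlacement(0, G4ThreeVector(), worldLog,
                                "World", 0, false, 0);
    }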


8.5.2 WP2 - Software Management

This work package undertakes the development and maintenance of the 'global' category in Geant4; it is responsible for release management and for the creation of release distributions, including binary distributions, and provides common services such as a web server and CVS service to the Geant4 collaboration. Most of these tasks are ongoing support activities for which long-term milestones are difficult to plan. The main tasks are:
- Move of the CVS repository to the CVS service provided by CERN/IT (Jan 2005).
- Move of the problem reporting system to Savannah, or assistance with the transfer of Bugzilla to another Geant4 site (Dec 2005).
- Investigation of streamlining the distributions to simplify building (Mar 2005).
- Code migrations to CLHEP 2.x, gcc 3.4 and 64-bit architectures, including AMD64 (Sep 2005), and to sstream (Dec 2005). Ongoing work for new compiler and OS versions as required.
- Support for Geant4 tools, services and servers: move to new servers for the Web and the MySQL database (Sep 2005), review of tools documentation and improvement of/additions to existing documentation (Sep 2005), review of tools with emphasis on improving security (Sep 2005).

8.5.3 WP3 - EM Physics and error propagation

The EM physics package in the Geant4 toolkit covers sub-packages for the simulation of electromagnetic interactions of particles with matter. These 'standard' EM processes cover electrons, positrons and gammas (above 1 keV), muons, x-rays (Cerenkov, transition radiation, etc.), and high-energy and optical processes. In addition, utility classes, a physics-list sub-package, examples and tests are provided, maintained and refined. Key parts are the processes and models for energy loss of charged particles, for multiple scattering, and for gamma and optical photon interactions. The work is carried out by the CERN group in collaboration with the other Standard EM developers. The main tasks are:
- Improvements and additions to the 'standard' EM processes, completing the set of corrections to the Bethe-Bloch model for ionization processes, and extending the modeling of transition radiation (Dec 2005).
- Provision of simulation of rare high-energy EM processes for the LHC, including radiative corrections to major EM processes, positron annihilation to hadrons, and pair production by protons and pions, for use in the simulation of hadronic and EM showers at the LHC (Dec 2006).
- Extensions and additions to the 'standard' EM processes to complete the set of EM processes relevant to LHC physics, including more precise modeling of synchrotron radiation and improvements to the LPM effect and the photo-effect in light materials (Jun 2007).
- Evaluation and improvement of the stability and precision of shower response in LHC calorimeters when changing production thresholds. This requires evaluation of the step dependence of observables due to the current methods of modeling multiple scattering and sampling fluctuations (Jun 2006).


- Enable simulation of EM interactions of potential exotic particles at the LHC (Jun 2005) and provide a set of specific models for exotics (Jun 2007).
- Enable configuration and steering of the EM 'package' in applications, by providing physics-list components and new capabilities to set a precise model only for a particular sub-detector or region (Dec 2005); a configuration sketch is given after this list.
- Enable interfacing of existing EM models for error propagation in reconstruction, to provide estimates of energy loss and multiple scattering effects (Mar 2005), and provide a set of components for forward/backward propagation of particles in reconstruction (Dec 2006).
A continuing activity will be to provide support, maintenance, validation, acceptance testing and CPU-usage optimization of the 'EM-standard' package.
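A minimal sketch of the physics-list component and per-region configuration style referred to above, assuming the G4EmStandardPhysics constructor and region-based production cuts of later Geant4 releases; the "Calorimeter" region name and the cut value are hypothetical.

    // Hedged sketch: register the standard EM component and tighten the
    // production threshold in one region only, leaving the rest untouched.
    #include "G4VModularPhysicsList.hh"
    #include "G4EmStandardPhysics.hh"
    #include "G4ProductionCuts.hh"
    #include "G4Region.hh"
    #include "G4RegionStore.hh"
    #include "G4SystemOfUnits.hh"

    class MyPhysicsList : public G4VModularPhysicsList {
    public:
      MyPhysicsList() { RegisterPhysics(new G4EmStandardPhysics()); }
      void SetCuts() override {
        SetCutsWithDefault();
        // Assumes a region named "Calorimeter" was defined during
        // detector construction.
        G4Region* calo = G4RegionStore::GetInstance()->GetRegion("Calorimeter");
        if (calo) {
          G4ProductionCuts* cuts = new G4ProductionCuts();
          cuts->SetProductionCut(50.*micrometer);  // applied to all particles
          calo->SetProductionCuts(cuts);
        }
      }
    };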

8.5.4 WP4 - Hadronic Physics

The hadronic physics work package contributes to the work of the hadronic physics working group in Geant4 and offers advice and support to the LHC experiments on topics related to the interactions of hadrons. Physics models currently supported by the work package include the QGS model, the Binary Cascade, the Chiral Invariant Phase Space (CHIPS) model, a model for elastic interactions of hadrons with matter, and the physics lists for the LHC experiments (a usage sketch follows the task list below). The main tasks foreseen include:
- Enable background and radiation studies through the optimization of a physics list for these studies (Jun 2005).
- Integrate the string fragmentation work in CHIPS with the parton string models to enable its trial use for physics performance in realistic scenarios (Jun 2005).
- Provide support for issues arising from LHC detector test-beam analyses, including an elaboration on the ATLAS calibration problem (Sep 2005).
- Participate in the effort to include simulation of exotic physics for ATLAS and CMS by providing assistance with the inclusion of processes for exotic particles from split supersymmetry, the gravitino, the lightest SUSY particle, etc. (Sep 2005).
- Finalise the paper on neutron transport and on deep inelastic scattering using CHIPS (Jul 2005).
- Investigate and resolve issues in the hadronization part of the QGS model concerning:
  o transverse momenta of the secondaries produced (Dec 2005);
  o masses of the resonances created (Mar 2006).
- Extend the Binary Cascade with a propagate interface to the string models (Sep 2005) and re-design parts of the Binary Cascade with the goal of encapsulating field transitions (Sep 2006).
- Provide a physics list based on the CHIPS model, first using QGSC for high energies (Sep 2005) and later replacing QGSC by a CHIPS quasmon-string model (2006).


- Improve regression testing of physics performance for the CHIPS model (Dec 2005).
- Study the need for, and provide, refined physics lists with newer model options and improvements for low-rate processes (Sep 2006).
- Include additional verification tests and simple benchmarks for hadronic interactions in the intermediate energy range (10-50 GeV) (Jun 2006).
- Benchmark Geant4 for use in shielding and radiation environment applications (Sep 2006).
- Improvement of the pre-compound and evaporation models (Mar 2007, pending additional manpower).
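For illustration of how the experiments consume the physics lists maintained here, the sketch below hands a packaged list to the run manager; the FTFP_BERT class name is from later Geant4 releases and stands in for the QGSP-style lists of this period.

    // Hedged sketch: a packaged physics list selected in one call.
    #include "G4RunManager.hh"
    #include "FTFP_BERT.hh"

    int main()
    {
      G4RunManager* runManager = new G4RunManager;
      // One object provides the full, validated EM + hadronic process set.
      runManager->SetUserInitialization(new FTFP_BERT);
      // ... detector construction, primary generator action, BeamOn(), ...
      delete runManager;
      return 0;
    }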

8.5.5 WP5 - System Testing

The system testing working group provides an integration-testing service for all development and maintenance undertaken on the Geant4 toolkit (including source code, data and physics lists). For this purpose it maintains a facility which includes tools and testing infrastructure. The tools enable developers to log their changes as directory-tree tags, classify them and potentially submit them for testing, monitor the testing phase and track the resulting acceptance or rejection. The regular suite of system tests includes integration tests and simple comparisons with restricted running times. This suite is exercised regularly on a list of supported platforms with several CPU/compiler/OS combinations, and must pass before tags are accepted. An extended version of these tests is also run on a regular, monthly basis to identify infrequent problems. In addition to the ongoing service, a number of development objectives regarding the tests and the testing tools are planned. Their milestones are:
- Prototyping of a 'standard notation' for results (output) (Jun 2005) and first deployment (Dec 2005).
- Clarification of the coverage map for tests/examples (Jun 2006).
- Automation for keeping results and tracking the performance of tests (Jun 2005).
- Full deployment of the 'standard notation' for test results (Dec 2006).

8.5.6 WP6 - Acceptance Suite

The goal of this work package is to provide a set of tools for the automatic comparison of key physics observables between different versions of Geant4. The validation tests are chosen to be simplified but representative of typical LHC use cases. Running these provides a level of assurance that the attained physics performance for typical test cases is maintained. The work package undertakes a role complementary to the verification testing made by the physics working groups on their separate developments, and to the system testing for integration.
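A minimal sketch of the kind of release-to-release comparison this suite performs, assuming the results of each release are stored as ROOT histograms; the file names, histogram name and p-value threshold are hypothetical.

    // Hedged sketch: compare one observable between an established reference
    // release and a candidate release using a Kolmogorov-Smirnov test.
    #include "TFile.h"
    #include "TH1.h"
    #include <cstdio>

    bool CompatibleWithReference(const char* refFile, const char* newFile,
                                 const char* histName, double pMin = 0.01)
    {
      TFile ref(refFile);
      TFile cur(newFile);
      TH1* hRef = static_cast<TH1*>(ref.Get(histName));
      TH1* hNew = static_cast<TH1*>(cur.Get(histName));
      if (!hRef || !hNew) return false;
      double p = hNew->KolmogorovTest(hRef);  // p-value of the comparison
      std::printf("%s: p = %g\n", histName, p);
      return p > pMin;  // below threshold: statistically significant change
    }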


The main application is the systematic regression testing of Geant4 public releases, to identify potential statistically significant discrepancies in a number of physics distributions between the new release and the established results of a chosen release. Given the large number of potential observables one could consider, it is essential to restrict as much as possible the number of distributions that need to be looked at by a person; a variety of powerful statistical tests is therefore necessary. The main deliverables are:
- Improvements in the deployment of the test suite on the Grid, and in the monitoring of the p-values and the CPU time (Mar 2005).
- Tools to summarize results with key statistics and plots (Jun 2005).
- Revised tests with additional observables, and extensions of the statistical tests (Sep 2005).
- Prototype for storing/retrieving previous results (n-tuples and plots) (Dec 2005).
- Extension of the tests to electron projectiles, and comparison with existing electromagnetic physics validation tests (Jun 2006).
- Addition of thin-target tests for hadronic physics (Dec 2006).

8.5.7 WP7 - Coordination and Releases

The objectives of this work package are to coordinate the work in SFT on Geant4, to collect and follow requirements from the LHC experiments, and to undertake the release management of Geant4 releases, patches and development tags. It also assists in integrating LHC requirements into the work plans for the Geant4 modules in which SFT is not involved.

8.5.8 Sub-project milestones

ID     Delivery date  Description
05.03  2005/03/25     Development release of Geant4 including fixes and improvements in geometry and physics: issues discovered in production; geometry developments including improved parameterizations and a generalized twisted trapezoid; and a prototype of the energy loss process with capability for backward extrapolation
05.18  2005/06/30     Release 7.1: refinements to ionization processes, additional string fragmentation and verification of proton reactions at high energies. Prototype of a specialized multiple scattering model for electrons; first implementation of gamma processes in the model interface and of a materials builder using NIST data; integration of CHIPS string fragmentation with the parton string models, enabling trial use for physics studies; verification of proton reactions at high energies
05.19  2005/07/15     Tutorial on Geant4 and move of the CVS repository to the CERN/IT service. This includes the move of the Geant4 source code repository to the IT service; the finalised paper on neutron transport and on deep inelastic scattering using CHIPS; revised tutorials and sections of the Geant4 User Documentation
05.23  2005/09/30     Development release: cascade interface to strings, stability study for EM observables and review of the LPM effect. Including studies on stability of EM quantities from sampling calorimeters against changes in cuts and max-step size; review of the LPM effect in EM processes; propagator interface from Binary Cascade to string models; prototype port to CLHEP 2.x
05.31  2005/10/31     Improved regression suite for release validation and testing infrastructure, with improvements in Bonsai enabling open viewing and authorization for changes, common automation for system tests and "acceptance suite" regression tests, and a study of the power of regression tests for shower shape
05.42  2005/12/20     New public scheduled release, including fixes, improvements and new developments such as positron annihilation and geometry voxelisation improvements
06.05  2006/03/31     Development release including a new tool for overlap detection at geometry construction and extensions to QGS
06.06  2006/05/31     Development release, including new features for parallel navigation enabling scoring of charged particles at arbitrary locations, improvements to stability of showering under changes in cuts, and additional verification tests for hadrons between 10 and 50 GeV, as part of the potential June 2006 public release of Geant4
06.11  2006/09/30     Geant4 development release, including redesign of the Binary Cascade's field transitions, additional benchmarking for radiation and shielding use cases and refinements to physics lists for low-rate processes
06.14  2006/12/01     Geant4 development release, including surface tolerances tuned to model geometry size
07.01  2007/06/01     Geant4 development release, including refined models for EM interactions of exotic particles and a first implementation of tessellated BREP solids

Table 5 Geant4 level-2 milestones

8.6 Fluka

The Fluka team participates as an external member of the Simulation Project; the collaboration is beneficial for the Simulation Framework and Physics Validation subprojects, where the Fluka simulation engine is interfaced and utilised.


The plan of work is reported in this document for completeness, although it is not included in the Simulation Project planning.

8.6.1 Summary of milestones

Listed below are the major achievements and activities planned in Fluka for the coming months:
- Completion of PEMF elimination and thorough testing of the new solution.
- Completion of the radioactive decay modules.
- Completion of the "name based" input transition.
- Final elimination of the intermediate-energy hadronic model (superseded by an extension of PEANUT).
- Final setup of the new low-energy neutron cross-section library (production use postponed until after the beta release).
- Cleanup and proper documentation/acknowledgements of the source code.
- Beta release (source included, limited to CERN and INFN).
- Release of the first report on FLUKA (a second one is foreseen by the end of the year or beginning of 2006).
- Thorough testing and cross-checks of the new beta version.
- Feedback to user questions/reports.
- Further development of advanced heavy-ion physics features as required by LHC-ION and NASA.
- Production use of the new low-energy neutron cross-section library.
- Production use of the "extended" PEANUT interaction model (it will include and replace the current high-energy interaction model).

8.7 Garfield

Garfield is a computer program for the detailed simulation of two- and three-dimensional chambers made of wires and planes, such as drift chambers, TPCs and multi-wire counters. It accepts two- and three-dimensional field maps computed by finite element programs such as Maxwell, Tosca and FEMLAB as a basis for its calculations. Work is ongoing to upgrade the interfaces to all these finite element programs. An interface to the Magboltz program is provided for the computation of electron transport properties in nearly arbitrary gas mixtures. Garfield also has an interface with the Heed program to simulate ionisation of gas molecules by particles traversing the chamber. New releases of both Heed and Magboltz are in the process of being interfaced. The integration of the new release of Heed will also mark a major change in the programming aspects of Garfield, since Heed is now written in C++.


Transport of particles, including diffusion, avalanches and current induction, is treated in three dimensions irrespective of the technique used to compute the fields. Currently, Monte Carlo simulations of drift with diffusion assume Gaussian spreads. This is not applicable in detectors such as GEMs, where the calculated diffusion spread depends on the step length (illustrated in the sketch below); correcting this is in progress. Negative-ion TPCs are being considered as detectors in the search for dark matter. Simulating these devices requires not only attachment processes, which are already available, but also dissociation processes; these are in the process of being written.
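The step-length issue can be seen in the standard Monte Carlo drift step, sketched generically below; this is an illustration of the Gaussian-spread assumption, not Garfield code, and the diffusion coefficient and units are arbitrary.

    // Generic illustration: the transverse spread grows as sqrt(step), which
    // is only step-size independent when the field (and hence the diffusion
    // coefficient) is uniform along the step -- the assumption that breaks
    // down in the strongly converging/diverging fields of a GEM.
    #include <cmath>
    #include <random>

    double DiffuseTransverse(double x, double step, double diffCoeff,
                             std::mt19937& rng)
    {
      // sigma = D * sqrt(L): one Gaussian kick per drift step of length L.
      std::normal_distribution<double> kick(0.0, diffCoeff * std::sqrt(step));
      return x + kick(rng);
    }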

8.7.1 WP1 – Electric field and Diffusion

Future developments include:
- Interfaces with Maxwell 2D (Ansoft) and FEMLAB, and work on the interface for Tosca, by February 2005.
- Diffusion in strongly converging and diverging fields in GEM and Micromegas detectors, by April 2005.

8.7.2 WP2 – Gas and Transport

Future developments include:
- Interface with a new version of Magboltz, by end of 2005.
- Interface with a new version of Heed, by end of 2005.

8.7.3 Sub-project milestones

ID     Delivery date  Description
05.01  2005/01/31     Interfaces with Maxwell 2D (Ansoft) and FEMLAB, work on the interface for Tosca
05.07  2005/03/31     Diffusion in strongly converging and diverging fields (GEM and Micromegas)
05.09  2005/05/16     Lecture series for graduate students in Cagliari
05.21  2005/09/10     Invited talk at TRD 2005 conference (Ostuni, Italy) on photon detection
05.40  2005/12/15     Interface with a new version of Magboltz
05.41  2005/12/15     Interface with a new version of Heed

Table 6 Garfield level-2 and level-3 (italic) milestones

8.8 Generator Services

The Generator Services project collaborates with Monte Carlo (MC) generator authors and with the LHC experiments in order to prepare validated, LCG-compliant code for both the theoretical and experimental communities at the LHC, sharing the user support duties, providing assistance for the development of the new object-oriented generators, and guaranteeing the maintenance of the older packages on the LCG supported platforms. Contact persons for most of the MC generator packages relevant for the LHC, and representatives for all the LHC experiments, have been agreed. Responsibilities are currently shared in the following way: coordination (0.30 FTE, University of Florida); liaison with experiments & Monte Carlo project (0.25 FTE, CERN); WP1-librarian (0.25 FTE, CERN); WP1-integration (0.75 FTE, LCG Russia); WP3-MCDB (0.75 FTE, LCG Russia); WP3-Production (0.5 FTE, University of Oviedo and University of Santander). On top of the activities developed internally, coordination with other external projects makes it possible to cover the entire spectrum of topics outlined in the work packages: WP4-validation (CEDAR/JETWEB), WP1-new object-oriented MC generators (THEPEG, PHENOGRID/HERWIG++).

8.8.1 WP1 - The Generator library (GENSER)

GENSER is the central code repository for MC generators and generator tools. It was the first CVS repository in the LCG Simulation Project. The sources and the binaries are installed on AFS and the tar-balls are made available by SPI. The size of GENSER is growing quickly, which is currently seen as a major problem for the distribution of main and bug-fix releases. The LHC experiments have submitted a specific request to allow for lighter GENSER distributions, giving access to the individual GENSER sub-packages. A strategy to address the problem is in place with the help of the LCG librarian; a corresponding milestone has been set for this to be achieved by September 2005. Some of the generators supported in GENSER are already installed in the LCG external area; however, test suites and examples are always available in GENSER.

8.8.2 WP2 - Storage, Event Interfaces and Particle Services

The goal of this work package is to contribute to the definition of the standards for generator interfaces and formats, collaborating in the development of the corresponding application program interfaces (an event-record sketch is given below). In order to favour the adoption of the new object-oriented MC generators in the experiment simulation frameworks, the Generator project will share some responsibilities for the development and maintenance of the Toolkit for High Energy Physics Event Generation (ThePEG). A common milestone with the PHENOGRID initiative is set for Q3 2005: the first test of ThePEG and EvtGenLHC integration in Herwig++.
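To make the interface standard concrete, the sketch below fills a minimal HepMC event record of the kind these APIs exchange; it assumes the HepMC2-era C++ API, and the particle content is an arbitrary example.

    // Hedged sketch: a one-vertex HepMC event record.
    #include "HepMC/GenEvent.h"
    #include "HepMC/GenVertex.h"
    #include "HepMC/GenParticle.h"

    HepMC::GenEvent* MakeToyEvent()
    {
      HepMC::GenEvent* evt = new HepMC::GenEvent();
      HepMC::GenVertex* v = new HepMC::GenVertex();
      // A single final-state pi+ (PDG 211, status 1), momentum in GeV.
      v->add_particle_out(
          new HepMC::GenParticle(HepMC::FourVector(1.0, 0.0, 2.0, 2.5), 211, 1));
      evt->add_vertex(v);
      return evt;
    }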

8.8.3 WP3 - Public Event Files and Monte Carlo Database

The goal of this work package is to produce "certified" public generator event files to be used for benchmarks, comparisons and combinations. The format and the structure of the files have to be accessible to the simulation frameworks of the LHC experiments. Three different activities have been started:
1) The development of a simple production and validation framework at generator level. A common software project between LCG and CMS has been defined, relying on HepMC (event interface), ROOT and POOL (event storage). A connection with the Monte Carlo database (see item 3) for configuration and bookkeeping is currently being evaluated. The beta version of the framework should be available by the end of Q2 2005.
2) A dedicated production centre integrated in the grid middleware, to provide the LHC experiments and the other end-users with transparent access to the public event files. This will be essential for those samples requiring a huge amount of CPU time and parallelization techniques.
3) The LCG Monte Carlo Database (MCDB) [hep-ph/0404241], which will be a public database for the configuration, bookkeeping and storage of generator-level event files. The generator events often need to be prepared and documented by Monte Carlo experts; MCDB aims at facilitating the communication between Monte Carlo experts and end-users. Its use can optionally be extended to the official event production of the LHC experiments. The LCG MCDB relies on the following software packages supported by LCG: MySQL, CASTOR (RFIO), CGI, Perl and Apache. A beta prototype is currently in production on a dedicated web server, http://mcdb.cern.ch; authentication is currently based on AFS tokens. Work is ongoing to improve the documentation and to support authentication through GRID certificates. The MCDB team is available to provide assistance for the development of the MCDB API in the experiment-dependent software.

8.8.4 WP4 - Monte Carlo Validation

The activity in this work package is currently concentrating on the functional validation of the generator packages inserted in GENSER. If JetWeb adopts GENSER, its possible usage for physics validation will be considered.

8.8.5 Sub-project milestones

ID     Delivery date  Description
05.04  2005/03/31     First C++ Monte Carlo (SHERPA) fully integrated in GENSER
05.13  2005/06/30     First test of ThePEG integration in Herwig++
05.14  2005/06/30     Generator level production framework beta version
05.25  2005/09/30     MCDB user document with definition of procedures to gain access through certificates
05.26  2005/09/30     Pythia 8: release of alpha version
05.27  2005/09/30     Procedure for light GENSER releases
05.28  2005/09/30     Definition of EvtGen development plans & policy: agreement on responsibilities for EvtGen development in GENSER
05.29  2005/09/30     First introduction of NRQCD Prompt Quarkonia Production models in Pythia 6.3
05.33  2005/09/30     Proposal for an LCG Generator production centre integrated in the grid middleware
05.34  2005/12/15     Integration of GENSER in JetWeb
05.35  2005/12/15     Generator level production framework: production quality release
05.36  2005/12/15     Evaluation of possible migration of HepMC to GENSER
06.01  2006/03/31     Porting of most GENSER Fortran packages to gcc 4.0
06.02  2006/03/31     Introduction of MCDB Grid certificates and management of large files
06.03  2006/03/31     Finalization of NRQCD Prompt Quarkonia Production models in Pythia 6.3
06.04  2006/03/31     MCDB integration, experiment-specific APIs
06.07  2006/06/30     Generator level validation framework beta version
06.10  2006/09/30     Pythia 8: release of beta version
06.16  2006/12/15     Fully operational LCG Generator production centre integrated in the grid middleware
07.01  2007/06/30     Generator level validation framework production version
07.02  2007/09/30     Pythia 8: release of production version

Table 7 Generator Services level-2 and level-3 (italic) milestones

8.9 Technical process

The various subprojects in the LCG Simulation Project make extensive use of the infrastructure provided by SPI. Software packages in Generator Services (GENSER) are released, distributed and installed following the tools and standards available through SPI. Some of the available Q/A tools are also used by the Geant4 project; the Geant4 CVS repository is planned to migrate to the central LCG CVS facility in 2005, and a proposal for migrating the existing bug-report system to the Savannah portal is under way. Validation benchmarks and code for the test-beam setups and Simulation Framework packages are progressively being made available in the LCG CVS repository, browsable through the tools provided by SPI.

8.10 Resources

8.10.1 Staffing

A table of the currently available and required (wished) manpower for the LCG Simulation Project is reported in Table 8.


Subproject                           Available  Required
Simulation Framework                 0.90       2.00
Geant4                               8.25       9.15
  Coordination – 1.20 (1.20)
  Geometry, Field, Biasing – 2.00 (2.00)
  Hadronic Physics – 2.45 (2.75)
  Electromagnetic Physics – 1.25 (1.50)
  Testing and Software Management – 1.35 (1.70)
Physics Validation                   1.30       2.50
Generator services                   2.80       3.05
  Management, documentation/release – 0.80
  GENSER development, validation, MCDB – 1.75
  Production Framework – 0.25
Garfield                             1.00       2.00
Management                           0.25       0.25

Table 8 - Currently available and required (wished) manpower for the Simulation Project. For Geant4 and Generator Services subprojects, the approximate current manpower in different areas is shown.


9. Applications Area Summary

We summarize here the resources and milestones for all the Applications Area projects.

9.1 Resources

(The column headings of the table below, naming the contributing organizations, were rotated in the original layout and are not legible in this extraction; each row lists the individual FTE contributions followed by the row total.)

Project  Sub-Project   Contributions (FTE)                      Total
MGR      MGR           0.9                                      0.9
MGR Total              0.9                                      0.9
POOL     Catalog       0.2  0.3                                 0.5
         Collections   0.5                                      0.5
         Cool          0.8  0.8  0.1                            1.7
         Mgr           0.5                                      0.5
         Ral           2.2                                      2.2
         StorageMgr    0.5  0.1  0.2                            0.8
POOL Total             3.7  1.8  0.4  0.3                       6.2
ROOT     Base          0.5  0.3  0.3                            1.1
         Dictionary    0.6  0.2  0.2  0.4                       1.4
         Geom          0.8                                      0.8
         Graf          2    0.8  0.2                            3
         Gui           1.3  0.2  0.1                            1.6
         I/O           0.5  0.3                                 0.8
         Math          2.5  0.2                                 2.7
         Mgr           1                                        1
         Proof         2    0.5  0.4                            2.9
ROOT Total             9.9  2.2  0.2  1.4  1.2  0.3  0.1        15.3
SIMU     Framework     0.9                                      0.9
         Garfield      1                                        1
         Geant4        7.7                                      7.7
         Genser        0.45 1.75 0.25 0.25                      2.7
         Mgr           0.25                                     0.25
         Validation    1.5                                      1.5
SIMU Total             11.8 1.75 0.25 0.25                      14.05
SPI                    3.95 2                                   5.95
SPI Total              3.95 2                                   5.95
Grand Total            30.25 2.2 2 2 1.75 1.4 1.2 0.65 0.6 0.25 0.1   42.4

9.2 Milestones

9.2.1 Level-1 proposed milestones

ID   Delivery date  Description
AA0  2005/03/31     Development, support and resource plan through 2008
AA1  2005/09/30     Phase 1 AA software complete and deployed
AA2  2005/09/30     ROOT workshop at CERN. Various prototypes addressing different topics of the SEAL+ROOT merge will be completed and detailed plans will be made available to the wider ROOT community.
AA3  2006/06/30     Completed migration to the merged SEAL and ROOT projects. End-users will no longer need any of the old SEAL libraries; all the required functionality will be available in ROOT.

9.2.2 Level-2 proposed milestones

ID      Project  Due         Summary
SPI01   SPI      31/10/2005  CMT and SCRAM configurations generated automatically from XML
SPI02   SPI      30/11/2005  Independent build and release of partitions of the LCG software
SPI03   SPI      31/07/2005  Pacman caches for binaries and sources
SPI04   SPI      31/10/2005  Automated distribution system in place, generated from the XML description of External Software
SPI07   SPI      30/11/2005  Import/export features available in the Savannah system; provide a mechanism to import/migrate external data into Savannah
SPI08   SPI      31/12/2005  Savannah user forum set up; user and admin documentation available
SPI09   SPI      31/10/2005  QA reporting on Savannah and test coverage available and documented
SPI10   SPI      30/09/2005  Testing tools updated to the latest versions on the AFS area and available to LCG projects and outside
SPI11   SPI      01/08/2005  Doxygen and LXR automated and documented for the LCG software and for the packages (ROOT, AIDA, CLHEP)
BAS1    ROOT     01/08/2005  New updated plugin manager
BAS2    ROOT     15/07/2005  A new edition of the ROOT Users Guide
BAS3    ROOT     15/07/2005  The ROOT bug tracking system moved to Savannah
DIC1    ROOT     30/06/2005  Reflex and Cintex in the ROOT v5 development release
DIC4    ROOT     31/12/2005  CINT fully converted to Reflex
DIC5    ROOT     31/03/2006  PyROOT exclusively based on Reflex
PRF7    ROOT     30/09/2005  Demonstration of the new PROOF system at ROOT 2005
MAT1    ROOT     30/06/2005  First version of the new MathCore library
MAT2    ROOT     31/08/2005  First version of the MathMore and random number generator library
MAT3    ROOT     30/09/2005  New C++ Minuit to be released in ROOT
MAT4    ROOT     31/12/2005  Finalization of the Fitting and Minimization interfaces
MAT5    ROOT     31/12/2005  RooFit fully integrated (in CVS by July)
GRA1    ROOT     30/09/2005  Demonstrate the new GL viewer at the ROOT workshop
SEAL1   ROOT     30/09/2005  Remove Reflex and Cintex from new SEAL releases
SEAL2   ROOT     30/09/2005  Remove Minuit and MathCore from new SEAL releases
SEAL2   ROOT     31/12/2005  Remove Reflection, ReflectionBuilder and the scripting packages (PyLCGDict and PyLCGDict2) from new SEAL releases
POOL01  POOL     31/03/2006  Alignment with new CORE plugin and foundations proposals
POOL02  POOL     31/03/2006  Overall performance study and validation against expected requirements
RAL03   POOL     31/08/2005  RAL validation release (implementing the RAL review outcome and separate packaging)
RAL05   POOL     31/10/2005  RAL production release
RAL06   POOL     30/11/2005  Production release of proxy-cache access plugin
CAT02   POOL     31/08/2005  Production release of LCG catalog adaptors (level 2)
CDB01   POOL     30/04/2005  First COOL production release
CDB08   POOL     15/10/2005  COOL 1.4.0 release with bulk insertion and retrieval in different "channels" within the same "folder"
CDB09   POOL     15/10/2005  COOL passes all Tier-0 validation tests
CDB12   POOL     31/12/2005  First prototypes of API and command line tools for data extraction and cross-population between COOL databases
5.05    SIMU     31/03/2005  New release of GDML 2.1.0 with improvements in configuration scripts and kernel modules
5.06    SIMU     31/03/2005  First prototype of a Python-driven Geant4 simulation using LCGDict and PyLCGDict
5.15    SIMU     30/06/2005  GDML-Schema extension to support divisions and reflections
5.16    SIMU     30/06/2005  Feasibility study of POOL/ROOT based Geant4 geometry persistency
5.17    SIMU     30/06/2005  Python implementation of GDML writers for Geant4 and ROOT
5.30    SIMU     30/09/2005  Flugg application to one LHC calorimeter test-beam simulation
5.37    SIMU     15/12/2005  Introduction of modularization of GDML files
5.38    SIMU     15/12/2005  Proposal for universal handling of Monte Carlo truth
5.39    SIMU     15/12/2005  Evaluation of VirtualMC for usage in Geant4-Fluka test-beam validation studies
6.08    SIMU     30/06/2006  Refinement of the GDML Schema to support user extensions of elements
6.09    SIMU     30/06/2006  Completed evaluation of VMC for use in physics validation
6.13    SIMU     31/10/2006  First release of a common framework for MCTruth handling
6.15    SIMU     15/12/2006  CPPGDML code revision and re-engineering to allow easier extensibility
5.02    SIMU     28/02/2005  Review/prioritization of simple benchmarks for physics validation
5.08    SIMU     15/05/2005  First results of background radiation studies with Geant4
5.12    SIMU     30/06/2005  New validation results on longitudinal shower shapes with Geant4 7.0 completed
5.22    SIMU     30/09/2005  3rd simple benchmark for physics validation completed
5.24    SIMU     30/09/2005  Validation of Fluka against one LHC calorimeter test-beam
5.32    SIMU     31/10/2005  First results of ATLAS combined and 2004 test-beams data comparisons
6.12    SIMU     31/10/2006  Validation of shower parameterization packages completed
5.03    SIMU     25/03/2005  Development release of Geant4 including fixes and improvements in geometry and physics: issues discovered in production; geometry developments including improved parameterizations and a generalized twisted trapezoid; and a prototype of the energy loss process with capability for backward extrapolation
5.18    SIMU     30/06/2005  Release 7.1: refinements to ionization processes, additional string fragmentation and verification of proton reactions at high energies. Prototype of a specialized multiple scattering model for electrons; first implementation of gamma processes in the model interface and of a materials builder using NIST data; integration of CHIPS string fragmentation with the parton string models, enabling trial use for physics studies
5.19    SIMU     15/07/2005  Tutorial on Geant4 and move of the CVS repository to the CERN/IT service. This includes the move of the Geant4 source code repository to the IT service; the finalised paper on neutron transport and on deep inelastic scattering using CHIPS; revised tutorials and sections of the Geant4 User Documentation
5.23    SIMU     30/09/2005  Development release: cascade interface to strings, stability study for EM observables and review of the LPM effect. Including studies on stability of EM quantities from sampling calorimeters against changes in cuts and max-step size; review of the LPM effect in EM processes; propagator interface from Binary Cascade to string models; prototype port to CLHEP 2.x
5.31    SIMU     31/10/2005  Improved regression suite for release validation and testing infrastructure, with improvements in Bonsai enabling open viewing and authorization for changes, common automation for system tests and "acceptance suite" regression tests, and a study of the power of regression tests for shower shape
5.42    SIMU     20/12/2005  New public scheduled release, including fixes, improvements and new developments such as positron annihilation and geometry voxelisation improvements
6.05    SIMU     31/03/2006  Development release including a new tool for overlap detection at geometry construction and extensions to QGS
6.06    SIMU     31/05/2006  Development release, including new features for parallel navigation enabling scoring of charged particles at arbitrary locations, improvements to stability of showering under changes in cuts, and additional verification tests for hadrons between 10 and 50 GeV, as part of the potential June 2006 public release of Geant4
6.11    SIMU     30/09/2006  Geant4 development release, including redesign of the Binary Cascade's field transitions, additional benchmarking for radiation and shielding use cases and refinements to physics lists for low-rate processes
6.14    SIMU     01/12/2006  Geant4 development release, including surface tolerances tuned to model geometry size
7.01    SIMU     01/06/2007  Geant4 development release, including refined models for EM interactions of exotic particles and a first implementation of tessellated BREP solids
5.01    SIMU     31/01/2005  Interfaces with Maxwell 2D (Ansoft) and FEMLAB, work on the interface for Tosca
5.07    SIMU     31/03/2005  Diffusion in strongly converging and diverging fields (GEM and Micromegas)
5.40    SIMU     15/12/2005  Interface with a new version of Magboltz
5.41    SIMU     15/12/2005  Interface with a new version of Heed
5.35    SIMU     15/12/2005  Generator level production framework: production quality release
6.02    SIMU     31/03/2006  Introduction of MCDB Grid certificates and management of large files
6.16    SIMU     15/12/2006  Fully operational LCG Generator production centre integrated in the grid middleware
7.01    SIMU     30/06/2007  Generator level validation framework production version
7.02    SIMU     30/09/2007  Pythia 8: release of production version

