
Deep Underground Neutrino Experiment (DUNE)

Far Detector Interim Design Report

Volume 4:

Chapter Breakout: software-computing.tex chapter-code-mgmt.tex

July 18, 2018

The DUNE Collaboration

Contents

List of Figures

1 Code management
  1.1 DUNE Software Stacks
    1.1.1 Beam Simulation
    1.1.2 Triggering and DAQ
    1.1.3 LArSoft
    1.1.4 LArSoft-Like Software Packages
    1.1.5 QScan
    1.1.6 External Software
  1.2 Software Repositories
    1.2.1 Build Environment
    1.2.2 Software Environment Setup
    1.2.3 Code Distribution
    1.2.4 Release Management
    1.2.5 The art Framework
    1.2.6 NTuple Analysis (CAFAna)

References

List of Figures

1.1 Software dependency tree for dunetpc, showing dependent software products



Chapter 1

Code management

The DUNE experiment consists of several large components – prototypes, a beam, a near detector, and a far detector – each with its own schedule and subset of the collaboration working on it. Computer simulation helps optimize the design of the components in the early phases and is an integral part of the analysis procedures during the extraction of physics results. Triggering, reconstruction, and event selection all require sophisticated software developed by teams over long periods of time. The large, international collaboration introduces its own challenges regarding software development, deployment, and maintenance. This section describes the software stacks in use on DUNE, as well as some expected in the future, and the technologies used to build, distribute, maintain, and archive the software.

1.1 DUNE Software Stacks

Commonality is sought among the several detector simulation and reconstruction efforts in order to maximize the value of the several detectors planned for DUNE. Liquid-argon time projection chambers (LArTPCs) are used in the 35-ton prototype, the 3x1x1 dual-phase prototype, the single-phase and dual-phase ProtoDUNE detectors, one component of the near detector (a pixel-based LArTPC), and the single-phase and dual-phase far detector (FD) modules. Several other LArTPCs have operated or will operate in the near future: ICARUS, ArgoNeuT, LArIAT, MicroBooNE, SBND, and Mini-CAPTAIN, as well as some smaller prototypes, such as LongBo, TallBo, CDDF, and ArgonCube. The LArSoft collaboration supports common software solutions for the many LArTPCs, and LArSoft constitutes a large portion of DUNE software. This section also describes some of the other software stacks in use on DUNE, such as the beam simulation and the triggering and DAQ software.


1.1.1 Beam Simulation

The beam simulation software, G4LBNF, is based on GEANT4 and private code that defines the baffle, target, horn, decay pipe, and shielding geometries and materials. The geometry is specified in C++ methods so that parameters can be adjusted algorithmically in order to optimize the beamline components. This simulation is used to provide standard flux predictions for downstream analyses, including near and far detector simulations, as well as sensitivity projections. It is also used to predict systematic uncertainties. The hadronic interaction uncertainties are estimated using the PPFX [2] package, and focusing uncertainties are estimated by adjusting beamline component locations, horn current fields, and beam size, position, and angle offsets. Additional uncertainties, such as the thickness of the cooling water layer, are included. The flux predictions are provided as TTree objects for simulated neutrinos and their progenitors, as well as aggregate prediction histograms and covariance matrices, which provide a convenient interface for those not requiring the full information in the TTrees. Separate flux predictions and uncertainties are provided for the near and far sites, due to geometrical effects.
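The geometry-in-code approach can be sketched as follows. This is an illustrative example only, not G4LBNF itself: the `HornParams` struct, its member names, and the numeric values are hypothetical stand-ins for the kind of tunable parameters an optimizer would adjust.

```cpp
#include <cassert>
#include <vector>

// Hypothetical sketch of algorithmic geometry: beamline element
// positions are computed from a handful of tunable parameters, so an
// optimization loop can adjust them without editing a geometry file.
struct HornParams {
  double horn1_z_cm;   // z position of the first horn
  double horn_gap_cm;  // separation between successive horns
  int    n_horns;      // number of focusing horns
};

// Derive the z position of every horn from the parameter set.
std::vector<double> hornPositions(const HornParams& p) {
  std::vector<double> z;
  for (int i = 0; i < p.n_horns; ++i)
    z.push_back(p.horn1_z_cm + i * p.horn_gap_cm);
  return z;
}
```

A beamline optimization can then scan `horn_gap_cm` (or any other field) and rebuild the geometry programmatically on each iteration, which is the advantage of C++ methods over a static geometry description.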

1.1.2 Triggering and DAQ

The ProtoDUNE-SP and 35-ton detector data acquisition systems' software is based on artdaq, a full-featured package that provides configuration control, interprocess communication, data ingestion, event building, logging, and monitoring. The artdaq software is written and maintained by a dedicated group at Fermilab, while experiment-specific plug-ins, such as the board readers, are maintained by the DUNE DAQ group. The triggering for ProtoDUNE-SP is accomplished with an FPGA on the Penn Trigger Board, which takes input from the beam instrumentation components.

Beam instrumentation data is stored in the DIP/DIM databases at CERN and extracted in near real-time by a dedicated process, which stores a persistent version of the data for downstream merging with the detector data.

For DUNE, the intention is to use the IFBEAM database to communicate beam parameters such as bunch current and arrival time at the target for later analysis.

1.1.3 LArSoft

LArSoft is a software toolkit based on the art event-processing framework. It is used, developed, and shared by many collaborations that use LArTPC detectors. This sharing speeds the development of simulation and reconstruction software and reduces maintenance costs: software tested on one experiment can be re-used on another, reducing the need for debugging and tuning, though some detector-specific tuning remains in many cases. LArSoft also provides definitions of commonly used data products, such as raw digits, deconvoluted waveforms, hits, clusters, tracks, showers, vertices, and identified particles.
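The idea of shared data products linked by associations can be pictured with simple structs. The following is a schematic illustration only; the type and member names are hypothetical and do not reproduce the actual LArSoft classes.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Schematic stand-ins for shared LArTPC data products (not the real
// LArSoft types): a hit is a fitted pulse on one readout channel, and
// a cluster refers to its hits by index, mimicking an association.
struct Hit {
  unsigned channel;   // readout channel number
  double   peakTime;  // pulse peak, in TDC ticks
  double   integral;  // summed ADC (a charge proxy)
};

struct Cluster {
  std::vector<std::size_t> hitIndices;  // indices into the hit list
};

// Sum the charge of the hits belonging to a cluster.
double clusterCharge(const Cluster& c, const std::vector<Hit>& hits) {
  double q = 0;
  for (std::size_t i : c.hitIndices) q += hits[i].integral;
  return q;
}
```

Because the product definitions are shared, an algorithm written against them (here, a trivial charge sum) works unchanged on any experiment that fills them, which is the maintenance saving described above.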


(Figure 1.1 is a dependency graph, rooted at dunetpc v06_64_00 and descending through larsoft, art, ROOT, GEANT4, GENIE, and many supporting products; the individual node labels are omitted here.)

Figure 1.1: Software dependency tree for dunetpc, showing dependent software products.

LArSoft is divided into several repositories, each of which contains the source code to build multiple shared object libraries, and associated configuration and parameterized data files. A user may check out one repository at a time and set up pre-built versions of all of the others. A dependency diagram is shown in Figure 1.1, headed by the DUNE-specific package dunetpc.

1.1.3.1 Detector Simulation

The detector simulation is based on GEANT4. Detector geometry is read in from GDML files. A common interface to the GEANT4 stepping process simulates the details of the interaction of particles with liquid argon. Once the ionization energy has been determined by GEANT4, the fraction that recombines and makes light is computed in LArSoft-specific routines (NEST or a simple parameterization). The drift, diffusion, and attachment of electrons on impurities are simulated in LArSoft code, as is the collection on the wires or pixels. Detector electronics response functions are provided by detector-specific parameterizations, as are noise, digitization, and compression. The wrapped induction-plane wires provide a DUNE-specific challenge: multiple wire segments are read out by any given induction-plane DAQ channel. LArSoft assumes that experiment-provided methods give the mapping between DAQ channels and wire segments, and DUNE provides these for the 35-ton prototype, ProtoDUNE-SP, and the FD. Parameters such as the electric fields, argon temperature, and electron lifetime are specifiable in FHiCL files, which allow easy reconfiguration of a program without recompiling.
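An override of such parameters in a FHiCL fragment might look like the sketch below. The service and key names here are illustrative assumptions, not the exact keys used by the LArSoft services.

```fhicl
# Illustrative FHiCL override; service and parameter names are hypothetical.
services.DetectorPropertiesService: {
  Efield:           [ 0.5, 0.666, 0.8 ]  # kV/cm, per drift region
  Temperature:      87.0                  # K
  ElectronLifetime: 3000.0                # microseconds
}
```

A job re-run with this fragment picks up the new values without any recompilation, which is the workflow the text describes.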


1.1.3.2 Reconstruction

Reconstruction proceeds in several steps. The raw digits are uncompressed, stuck bits are interpolated over, noise is filtered (coherent noise in space, frequency-based filtering in time), and the field response is deconvoluted, an important step for optimizing the signal analysis from induction-plane wires. Hits are fit as Gaussian pulses to the filtered, deconvoluted waveforms. A similar set of steps is used in the photon detector data preparation. Hits are grouped into clusters, which may be parts of tracks or showers. Several track-finding and shower-reconstruction algorithms have been developed; the ones in common use on DUNE are the Projection-Matching Algorithm (PMA) [3], Pandora [4, 5], and WireCell [6]. PMA tests 3D hypotheses by projecting them onto the 2D plane (wire vs. time) of the observed data and testing the consistency. Pandora builds up a 3D interpretation of each event by matching clusters in the 2D views, using a variety of algorithms tuned to specific topologies; it is in fact a framework into which simpler algorithms can be plugged. Pandora can be run in two modes: cosmic-ray identification and rejection, and neutrino-scatter reconstruction. These two modes were motivated by MicroBooNE's needs, as each MicroBooNE trigger contains several cosmic-ray tracks on average. The ProtoDUNE analyses will require similar steps in order to separate beam-particle interactions from those made by cosmic rays. A convolutional neural network (CNN) has been trained to identify hits as belonging to tracks or to showers based on neighboring pixels in the deconvoluted 2D data [7]. Photon detector hits are grouped into flashes, which are associated with TPC clusters for further analysis. The WireCell package forms a three-dimensional image of the event from the raw digits. It contains a two-dimensional deconvolution method that uses the extra information that neighboring wires provide to extract more about the signal on a wire. This 2D deconvolution has been shown to improve the identification and reconstruction of tracks that travel in the plane defined by the electric field and an induction-plane wire.
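The Gaussian hit-fitting step can be sketched with a toy estimator: rather than the actual LArSoft fitter, the function below characterizes a pulse in a deconvoluted waveform by its charge-weighted moments above a threshold, which for a clean Gaussian-like pulse recovers the peak position and an effective width.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Toy hit finder (illustrative, not the LArSoft algorithm): estimate
// the peak tick, width, and charge of a pulse from charge-weighted
// moments of the waveform samples above threshold.
struct GausHit { double peakTick, sigma, charge; };

GausHit findHit(const std::vector<double>& wf, double threshold) {
  double q = 0, m = 0, m2 = 0;
  for (std::size_t t = 0; t < wf.size(); ++t) {
    if (wf[t] < threshold) continue;
    q  += wf[t];
    m  += wf[t] * t;
    m2 += wf[t] * t * t;
  }
  double mean = m / q;
  return {mean, std::sqrt(m2 / q - mean * mean), q};
}
```

A production fitter instead performs a true least-squares Gaussian fit (and handles overlapping pulses), but the output quantities per hit are the same kind: peak time, width, and integral.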

The output files of the reconstruction contain data products made at all levels: raw and filtered/deconvoluted digits, hits, clusters, tracks and particles, and the associations between them. It is necessary to keep this information for reconstruction algorithm development and tuning, though production of large data sets may drop the input data from the output files in order to reduce storage requirements. An ntuple format called AnaTree has been defined, which provides convenient access to pertinent reconstructed information without having to read the much larger art-formatted files. The gallery package [8] also provides a simpler interface for reading art-formatted files from ROOT or Python scripts, or from user-compiled standalone programs.

1.1.4 LArSoft-Like Software Packages

Not all detector components on DUNE use LArSoft directly, though there is a strong motivation to use the same framework and similar data products from one detector technology to another, in order to reduce the overhead of learning new systems when collaborators switch projects within DUNE or come to DUNE from other experiments. For example, hisoft is a repository of code that simulates and reconstructs events in the CDR reference-design near detector, a straw-tube tracker in a magnetic field with a calorimeter and muon chambers. Like LArSoft, it is built on art, and it has ported many of LArSoft's features, but it does not use LArSoft directly.


1.1.5 QScan

QScan [9] is a software package developed initially for ICARUS but now in use for the dual-phase 3x1x1 prototype, ProtoDUNE-DP, and the proposed dual-phase FD module(s). It provides simulation and reconstruction functionality similar to that of LArSoft, and it is designed to run in the online monitor of ProtoDUNE-DP. Reconstruction of ProtoDUNE-DP data with LArSoft is planned.

1.1.6 External Software

DUNE encourages the exploration and use of publicly available external simulation and reconstruction packages. One notable example is the ALICE full software stack [10, 11], which is used in the gaseous argon TPC near detector simulation effort. Another is the re-use of HighLAND, an ntuple-analysis tool developed for T2K. Some software, such as GeGeDe and EdepSim, has been developed with DUNE in mind but is sufficiently general that it finds use beyond the collaboration and is stored in GitHub instead of a DUNE-controlled repository.

1.2 Software Repositories

Official (i.e., supported, maintained, version-controlled, and archived) software is stored in git repositories hosted at Fermilab. The Redmine [12, 13] web tool provides an HTML-based interface to each repository, allowing developers to view current and previous versions, branches, commit comments and differences, and annotated versions of the code indicating the person who checked in each line. Redmine also provides wiki support for associated documentation. Automated code documentation systems such as Doxygen [14] and LXR are in use. Software checked in to the repositories should contain Doxygen-friendly headers and specify the authors of the code. The repositories are publicly visible without credentials. In order to modify software in a repository, however, each developer must be granted permission by the DUNE software management team.

In addition to dunetpc, which contains LArSoft-related code, there are duneutil, for utility scripts and XML files used in production, and dune-raw-data (and lbne-raw-data), which contain DAQ interface software.

1.2.1 Build Environment

Most DUNE software is written in C++, which must be compiled and installed before it can be run. We use the Multi-Repository Build system MRB [15]. It is capable of managing a local release of many checked-out git repositories' worth of code. With MRB, the smallest unit of code that can be checked out and compiled is the git repository; other tools allow building smaller pieces. MRB produces UPS products [16] and installs them in a separate directory owned by the user. It also

can produce tarfiles containing the built code arranged as a UPS product, so it can be installed in a public location or transferred to a grid job.

The fact that the repository is the smallest unit of buildable code means that each repository must be tagged with a version, and if code in one repository depends on code in another, then the repository that requires the setup of the other must also contain a file with the appropriate version numbers of all dependent repositories. To minimize disk usage and reduce build times, the sizes of the repositories must be kept as small as practical; the maintenance of versions and the effort needed to factorize code that has grown in size provide pressure in the other direction.

1.2.2 Software Environment Setup

Code set up with UPS depends on the environment variable LD_LIBRARY_PATH (on MacOS, the corresponding variable is DYLD_LIBRARY_PATH), which is a search path for finding shared-object libraries. This variable is used by the image activator to load the desired pre-built code. In the common case that users would like their own version of a shared library, the local version is placed in a directory earlier in the search path than the publicly installed version, allowing convenient building and testing of small components without rebuilding the entire application. When UPS sets up a software product (for LArSoft, each git repository is built into a single UPS product), it also sets up the corresponding dependent products as a consistent set. GEANT4, GENIE, and ROOT are most often set up in this manner. Users logging in without setting up the DUNE environment have a "clean" system with just operating-system and user-defined variables, which can be very valuable when debugging a complicated situation that depends on a user's setup. A UPS product's setup table file also specifies additions to the search paths PATH, FW_SEARCH_PATH, and FHICL_FILE_PATH, which are used to find command-line executables, framework data files, and job configuration files, respectively. Any other variables required by the software are also set at this stage.
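The search-path semantics behind this shadowing can be illustrated with a small model: the first directory in a colon-separated path that contains the requested library wins, which is why prepending a local build area overrides a public install. The function below is a simplified model of the dynamic loader's lookup, not the loader itself, and the directory names in the usage are hypothetical.

```cpp
#include <cassert>
#include <map>
#include <set>
#include <sstream>
#include <string>

// Simplified model of LD_LIBRARY_PATH resolution: given a map from
// directory to the libraries it contains, return the first path entry
// (scanning left to right) that provides the requested library.
std::string resolve(const std::string& path,
                    const std::map<std::string, std::set<std::string>>& dirs,
                    const std::string& lib) {
  std::stringstream ss(path);
  std::string dir;
  while (std::getline(ss, dir, ':')) {
    auto it = dirs.find(dir);
    if (it != dirs.end() && it->second.count(lib)) return dir;
  }
  return "";  // not found anywhere on the path
}
```

With a local build area listed first, a locally rebuilt liblardata.so shadows the public copy, while libraries the user did not rebuild still resolve to the public install, so only the changed component needs recompiling.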

The UPS model assumes that all versions of software needed by any user of a system are already present in directories that can be searched for the appropriate version. Centrally managed systems with large shared disks for storing software have been used in the past, but more modern distributed tools scale better for DUNE's requirements. This software environment differs from commercial distribution systems (such as RPM, yum, and pip) because physics analyses must be able to reproduce their results. If an analysis is in its final stages and systematic uncertainties are being evaluated, users cannot upgrade to a new version of the software, even if it is better (has a better reconstruction, for example). Multiple users of a computer system may be in various stages of their analyses and require stable versions that differ from each other. To this end, every component of the software stack, including the compiler, is distributed as UPS products.

The dependence of code on (DY)LD_LIBRARY_PATH causes an issue with MacOS. The System Integrity Protection (SIP) feature introduced with MacOS version 10.11 prevents subprocesses from using the variable when it is defined in a parent process. The reasoning is that a malicious library placed earlier in a search path than the desired one could cause a security breach. While SIP can be disabled in MacOS versions 10.11 through 10.13, the LArSoft and art development teams are not confident

that this will persist indefinitely. Furthermore, if Linux distributions follow suit, the use of library search paths may become obsolete.

A future development in the build and setup systems is to use Spack [17], which builds code in a way that preserves the locations of dependent libraries; these locations are stored in the calling libraries' .so files at link time, so the loader can insist that the library used at link time is the one used at run time.

1.2.3 Code Distribution

While the source code of DUNE's software stacks is publicly available, pre-built versions are needed for running on grid computing resources and to speed development. Pre-built binaries are distributed along with the source code using CVMFS [18], a distributed file system designed for efficient distribution of code. The source and header files are needed for debugging and development, even for repositories that depend on a particular repository. Pre-built binaries are distributed for the major supported platforms: Scientific Linux (currently SL6 and SL7) and MacOS (currently 10.11 and 10.12). The Linux builds are made on nodes running Scientific Linux Fermi, which is compatible with Scientific Linux CERN (SLC) and CentOS with corresponding version numbers. Other Linux distributions are not currently supported.

Another distribution mechanism is a web server, http://scisoft.fnal.gov, containing the pre-built binary and source UPS products collected in tarfiles. Along with the tarfiles are manifest files listing the tarfiles that must be downloaded to install a consistent set.

Each version of the code is provided in both a debug build and an optimized build. Code compiled with the optimizer turned on is much faster than unoptimized code (some code runs four times faster or more when optimized), but it is much more difficult to use with a debugger, due to out-of-order execution of portions of statements, the use of CPU registers for temporary variables, and the elimination of some calculations entirely.

1.2.4 Release Management

LArSoft and dunetpc are large, distributed software efforts, and frequently an upgrade to one product requires a simultaneous upgrade to another. Developers working with the latest source code of one repository must then work with the latest source code of another. To minimize the occurrence of this, and to allow developers to use pre-built binaries whenever possible, releases of the LArSoft code stack are made frequently: there is a release each week, and additional ones are made as needed.

Because UPS sets up LArSoft when dunetpc is set up, and only the matching version can be set up, each week there is also a release of dunetpc with a version number matching the LArSoft release.


Version numbers contain three fields: major, minor, and patch numbers. DUNE-specific code does not have to wait for the weekly LArSoft release to make a DUNE release: adding additional identifying characters to a release tag allows for multiple dunetpc releases per LArSoft release.
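The tag convention can be made concrete with a small parser. The vMM_mm_pp pattern follows the tags visible in the dependency tree of Figure 1.1 (e.g. v06_64_00); the handling of a trailing qualifier character is an illustrative assumption about how the extra identifying characters order.

```cpp
#include <cassert>
#include <string>
#include <tuple>

// Parse a UPS-style version tag such as "v06_64_00", optionally with a
// trailing qualifier (e.g. "v06_64_00a"), into its numeric fields.
struct Version { int major, minor, patch; std::string extra; };

Version parseTag(const std::string& tag) {
  Version v;
  v.major = std::stoi(tag.substr(1, 2));
  v.minor = std::stoi(tag.substr(4, 2));
  v.patch = std::stoi(tag.substr(7, 2));
  v.extra = tag.size() > 9 ? tag.substr(9) : "";
  return v;
}

// Lexicographic ordering: major, then minor, then patch, then qualifier,
// so "v06_64_00a" sorts after "v06_64_00".
bool newer(const Version& a, const Version& b) {
  return std::tie(a.major, a.minor, a.patch, a.extra) >
         std::tie(b.major, b.minor, b.patch, b.extra);
}
```

The qualifier field is what lets DUNE cut several dunetpc releases against one LArSoft release without disturbing the matching major/minor/patch numbers.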

Changes in LArSoft frequently require corresponding changes to dunetpc, such as updating the directories in which header files are located when LArSoft undergoes a rearrangement. These changes are often scripted so that all experiments using LArSoft can upgrade easily. Developers of shared tools will often make git feature branches with the changes required by their tool, and these feature branches are merged by the DUNE software managers before tagging and building the weekly release.

Official releases are built on the Jenkins [19] build system at Fermilab, which automates the build process on all supported platforms and build types, including debug and optimized builds.

1.2.5 The art Framework

DUNE's near and far detector simulation and reconstruction software, as well as that used for the ProtoDUNE detectors, is built on LArSoft, which is in turn built on the art framework [20]. The art framework provides the event loop; I/O features not supported by ROOT, such as parameterizable output file naming; configuration using the Fermi Hierarchical Configuration Language (FHiCL); provenance storage, by saving all configuration information in the output file; message handling; and resource consumption (memory, CPU, I/O) accounting. It is supported by the art development team at Fermilab. It provides the basis both for analysis programs and for artdaq. Extensive documentation and tutorials are provided [21, 22]. The art developers hold weekly stakeholders' meetings at which experiment concerns and requests are aired.

1.2.6 NTuple Analysis (CAFAna)

The process of extracting oscillation parameters from DUNE's data requires a sophisticated fit to the near and far detector data samples, using muon, electron, and possibly neutral-current candidate events. This fit must include the effects of detector acceptance and resolution, and also the systematic uncertainties on these. The oscillation parameters affect predictions as functions of the true value of L/E, while DUNE can only measure an approximate energy for each event, so the full transfer function from true to measured energy must be incorporated in the fits. CAFAna [23, 24] is a tool developed for use on NOvA, which takes as input Monte Carlo and data ntuples, forms histograms in as many categories as the experiment defines, and performs these fits. It has been ported to DUNE to read similarly constructed ntuples built from fully simulated DUNE events.
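The L/E dependence referred to above is, in the simplest two-flavor approximation, P = sin²(2θ)·sin²(1.267·Δm²·L/E), with Δm² in eV², L in km, and E in GeV. The three-flavor fit performed in practice is far more involved (matter effects, CP phase), but a sketch of the two-flavor weight shows the structure of what the fit applies to each simulated event.

```cpp
#include <cassert>
#include <cmath>

// Two-flavor oscillation probability: dm2 in eV^2, L in km, E in GeV.
// The 1.267 converts the phase to these units (from dm^2 L / (4E)).
double oscProb(double sin2_2theta, double dm2, double L, double E) {
  double phase = 1.267 * dm2 * L / E;
  double s = std::sin(phase);
  return sin2_2theta * s * s;
}
```

A fitter weights each simulated event by such a probability at its true energy, fills histograms in measured energy, and compares them to data; this is why the true-to-measured transfer function must be carried through the fit.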



References

[1] DOE Office of High Energy Physics, “Mission Need Statement for a Long-Baseline Neutrino Experiment (LBNE),” tech. rep., DOE, 2009. LBNE-doc-6259.

[2] L. Aliaga Soplin, Neutrino Flux Prediction for the NuMI Beamline. PhD thesis, William-Mary Coll., 2016. http://lss.fnal.gov/archive/thesis/2000/fermilab-thesis-2016-03.pdf.

[3] R. Sulej and D. Stefan http://larsoft.org/single-record/?pdb=102.

[4] J. S. Marshall and M. A. Thomson, “The Pandora Software Development Kit for Pattern Recognition,” Eur. Phys. J. C75 no. 9, (2015) 439, arXiv:1506.05348 [physics.data-an].

[5] MicroBooNE Collaboration, R. Acciarri et al., “The Pandora multi-algorithm approach to automated pattern recognition of cosmic-ray muon and neutrino events in the MicroBooNE detector,” Eur. Phys. J. C78 no. 1, (2018) 82, arXiv:1708.03135 [hep-ex].

[6] X. Qian, B. Viren, and C. Zhang https://www.phy.bnl.gov/wire-cell/.

[7] R. Sulej http://larsoft.org/single-record/?pdb=138.

[8] M. Paterno et al. https://cdcvs.fnal.gov/redmine/projects/gallery.

[9] D. Lussi, Study of the response of the novel LAr LEM-TPC detector exposed to cosmic rays and a charged particle beam. PhD thesis, ETH Zürich, 2013. https://doi.org/10.3929/ethz-a-010008070.

[10] https://alice-offline.web.cern.ch/.

[11] ALICE Collaboration, B. Abelev et al., “Performance of the ALICE Experiment at the CERN LHC,” Int. J. Mod. Phys. A29 (2014) 1430044, arXiv:1402.4476 [nucl-ex].

[12] https://cdcvs.fnal.gov/redmine.

[13] http://www.redmine.org/projects/redmine.


[14] https://github.com/doxygen/doxygen.

[15] https://cdcvs.fnal.gov/redmine/projects/mrb.

[16] https://cdcvs.fnal.gov/redmine/projects/ups.

[17] https://spack.io/.

[18] https://cernvm.cern.ch/portal/filesystem.

[19] https://jenkins.io/.

[20] C. Green, J. Kowalkowski, M. Paterno, M. Fischler, L. Garren, et al., “The art framework,” J.Phys.Conf.Ser. 396 (2012) 022020.

[21] http://art.fnal.gov.

[22] https://indico.fnal.gov/event/9928/.

[23] https://cdcvs.fnal.gov/redmine/projects/novaart/wiki/CAFAna_overview.

[24] NOvA Collaboration, P. Adamson et al., “Measurement of the neutrino mixing angle θ23 in NOvA,” Phys. Rev. Lett. 118 no. 15, (2017) 151802, arXiv:1701.05891 [hep-ex].
