Nuclear Physics Network Requirements Review Report


Nuclear Physics Network Requirements Review Report
May 8–9, 2019

Office of Nuclear Physics, DOE Office of Science
Energy Sciences Network (ESnet)
Gaithersburg, Maryland

ESnet is funded by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research. Benjamin Brown is the ESnet Program Manager. ESnet is operated by Lawrence Berkeley National Laboratory, which is operated by the University of California for the U.S. Department of Energy under contract DE-AC02-05CH11231. This work was supported by the Directors of the Office of Science, Office of Advanced Scientific Computing Research, Facilities Division, and the Office of Nuclear Physics. This is a University of California, Publication Management System report number LBNL-2001281.

Disclaimer
This document was prepared as an account of work sponsored by the United States Government. While this document is believed to contain correct information, neither the United States Government nor any agency thereof, nor The Regents of the University of California, nor any of their employees, makes any warranty, express or implied, or assumes any legal responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by its trade name, trademark, manufacturer, or otherwise, does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof, or The Regents of the University of California. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof or The Regents of the University of California.

Cover Images: (top left) ALICE image, courtesy of CERN; (top right) JLab Hall image, courtesy of JLab; (bottom left) GRETA image, courtesy of Mario Cromaz, GRETA Project Team; (bottom right) PHENIX image, courtesy of Brookhaven National Laboratory.
Participants and Contributors
Whitney Armstrong, Argonne National Laboratory
John Arrington, Argonne National Laboratory
Ed Balas, Lawrence Berkeley National Laboratory, ESnet
Debbie Bard, Lawrence Berkeley National Laboratory, NERSC
Steve Beher, Michigan State University / Facility for Rare Isotope Beams
Laura Biven, Department of Energy Office of Science, ASCR
Vincent Bonafede, Brookhaven National Laboratory
Ben Brown, Department of Energy Office of Science, ASCR
Rich Carlson, Department of Energy Office of Science, ASCR
Mike Carpenter, Argonne National Laboratory
Taylor Childers, Argonne National Laboratory
Jody Crisp, ORISE
Mario Cromaz, Lawrence Berkeley National Laboratory
Eli Dart, Lawrence Berkeley National Laboratory, ESnet
Brent Draney, Lawrence Berkeley National Laboratory, NERSC
Robert Edwards, Thomas Jefferson National Accelerator Facility
Timothy Hallman, Department of Energy Office of Science, NP
Barbara Helland, Department of Energy Office of Science, ASCR
Bryan Hess, Thomas Jefferson National Accelerator Facility
Graham Heyes, Thomas Jefferson National Accelerator Facility
Susan Hicks, Oak Ridge National Laboratory
Scott Kampel, Princeton Plasma Physics Laboratory
Eric Lancon, Brookhaven National Laboratory
Jerome Lauret, Brookhaven National Laboratory
Carolyn Lauzon, Department of Energy Office of Science, ASCR
Steven Lee, Department of Energy Office of Science, ASCR
Sean Liddick, Michigan State University, Facility for Rare Isotope Beams
Paul Mantica, Michigan State University, Facility for Rare Isotope Beams
Andrew Melo, Vanderbilt University
Jonathan Menard, Princeton Plasma Physics Laboratory
Inder Monga, Lawrence Berkeley National Laboratory, ESnet
Brent Morris, Thomas Jefferson National Accelerator Facility
Thomas Ndousse-Fetter, Department of Energy Office of Science, ASCR
Chris Pinkenburg, Brookhaven National Laboratory
Robinson Pino, Department of Energy Office of Science, ASCR
Jeff Porter, Lawrence Berkeley National Laboratory, NERSC
Gulshan Rai, Department of Energy Office of Science, NP
Thomas Rockwell, Michigan State University, Facility for Rare Isotope Beams
Lauren Rotman, Lawrence Berkeley National Laboratory, ESnet
Arjun Shankar, Oak Ridge National Laboratory
Adam Slagell, Lawrence Berkeley National Laboratory, ESnet
James Sowinski, Department of Energy Office of Science, NP
Tom Uram, Argonne National Laboratory
Chip Watson, Thomas Jefferson National Accelerator Facility
Paul Wefel, Lawrence Berkeley National Laboratory, ESnet
Linda Winkler, Argonne National Laboratory
Jason Zurawski, Lawrence Berkeley National Laboratory, ESnet

Report Editors
Ben Brown, Department of Energy Office of Science: [email protected]
Eli Dart, ESnet: [email protected]
Gulshan Rai, Department of Energy Office of Science: [email protected]
Lauren Rotman, ESnet: [email protected]
Paul Wefel, ESnet: [email protected]
Jason Zurawski, ESnet: [email protected]

Table of Contents
1 Executive Summary
2 Review Findings
  2.1 Technology Support
  2.2 Scientific Workflow
  2.3 Data Set Growth and Management
  2.4 ESnet Specific
3 Review Action Items
4 Requirements Review Structure
  4.1 Background
  4.2 Case Study Methodology
5 Nuclear Physics Case Studies
  5.1 Thomas Jefferson National Accelerator Facility (JLab)
    5.1.1 Discussion Summary
    5.1.2 JLab — Facility Notes
    5.1.3 JLab — Measurement of a Lepton-Lepton Electroweak Reaction (MOLLER) Notes
    5.1.4 JLab — Solenoidal Large Intensity Device (SoLID) Notes
    5.1.5 JLab — Electron-Ion Collider (EIC) Notes
    5.1.6 JLab — Lattice Quantum Chromodynamics (LQCD) / Theory Notes
    5.1.7 Facility Case Study
      5.1.7.1 Background
      5.1.7.2 Collaborators
      5.1.7.3 Instruments and Facilities
        5.1.7.3.1 Hall A
        5.1.7.3.3 Hall C
        5.1.7.3.4 Hall D
        5.1.7.3.5 Compute and Storage
      5.1.7.4 Process of Science
        5.1.7.4.1 Calibration
        5.1.7.4.2 Reconstruction
        5.1.7.4.3 Analysis
        5.1.7.4.4 Simulation
      5.1.7.5 Remote Science Activities
      5.1.7.6 Software Infrastructure
      5.1.7.7 Network and Data Architecture
      5.1.7.8 Cloud Services
      5.1.7.9 Data-Related Resource Constraints
      5.1.7.10 Outstanding Issues
      5.1.7.11 Summary
      5.1.7.12 Case Study Contributors
    5.1.8 Measurement of a Lepton-Lepton Electroweak Reaction (MOLLER)
      5.1.8.1 Science Background
      5.1.8.2 Collaborators
      5.1.8.3 Instruments and Facilities
      5.1.8.4 Process of Science
      5.1.8.5 Remote Science Activities
      5.1.8.6 Software Infrastructure
      5.1.8.7 Network and Data Architecture
      5.1.8.8 Cloud Services
      5.1.8.9 Data-Related Resource Constraints
      5.1.8.10 Outstanding Issues
      5.1.8.11 Case Study Contributors
    5.1.9 Solenoidal Large Intensity Device (SoLID)
      5.1.9.1 Science Background
      5.1.9.2 Collaborators
      5.1.9.3 Instruments and Facilities
      5.1.9.4 Process of Science
      5.1.9.5 Remote Science Activities
      5.1.9.6 Software Infrastructure
      5.1.9.7 Network and Data Architecture
      5.1.9.8 Cloud Services
      5.1.9.9 Data-Related Resource Constraints
      5.1.9.10 Outstanding Issues
      5.1.9.11 Case Study Contributors
    5.1.10 Lattice Quantum Chromodynamics / Theory
      5.1.10.1 Science Background
      5.1.10.2 Collaborators
      5.1.10.3 Instruments and Facilities
      5.1.10.4 Process of Science
      5.1.10.5 Remote Science Activities
      5.1.10.6 Software Infrastructure
      5.1.10.7 Network and Data Architecture
      5.1.10.8 Cloud Services
      5.1.10.9 Data-Related Resource Constraints
      5.1.10.10 Outstanding Issues
      5.1.10.11 Case Study Contributors
    5.1.11 Electron-Ion Collider (EIC)
      5.1.11.1 Science Background
      5.1.11.2 Collaborators
      5.1.11.3 Instruments and Facilities
      5.1.11.4 Process of Science
      5.1.11.5 Remote Science Activities
      5.1.11.6 Software Infrastructure
      5.1.11.7 Network and Data Architecture
      5.1.11.8 Cloud Services
      5.1.11.9 Data-Related Resource Constraints
      5.1.11.10 Outstanding Issues
      5.1.11.11 Case Study Contributors
  5.2 Facility for Rare Isotope Beams (FRIB)
    5.2.1 Discussion Summary
    5.2.2 Science Background
    5.2.3 Collaborators
    5.2.4 Instruments and Facilities
    5.2.5 Process of Science
    5.2.6 Remote Science Activities
    5.2.7 Software Infrastructure
    5.2.8 Network and Data Architecture
    5.2.9 Cloud Services
    5.2.10 Data-Related Resource Constraints
    5.2.11 Outstanding Issues
    5.2.12 Case Study Contributors
  5.3 Gamma-Ray Energy Tracking Array (GRETA)
    5.3.1 Discussion Summary
    5.3.2 Science Background
    5.3.3 Collaborators
    5.3.4 Instruments and Facilities
    5.3.5 Process of Science
    5.3.6 Remote Science Activities
    5.3.7 Software Infrastructure
    5.3.8 Network and Data Architecture
    5.3.9 Cloud Services
    5.3.10 Data-Related Resource Constraints
    5.3.11 Outstanding Issues
    5.3.12 Case Study Contributors
  5.4 Argonne National Laboratory — Gammasphere / Argonne Tandem Linear Accelerator System (ATLAS)
    5.4.1 Discussion Summary
    5.4.2 Science Background
    5.4.3 Collaborators
    5.4.4 Instruments and Facilities
    5.4.5 Process of Science
    5.4.6 Remote Science Activities
    5.4.7 Software Infrastructure
    5.4.8 Network and Data Architecture
    5.4.9 Cloud Services
    5.4.10 Data-Related Resource Constraints
    5.4.11 Outstanding Issues
    5.4.12 Case Study Contributors
Recommended publications
  • The Quilt Circle 2020: A Letter From the President
    The Quilt Circle 2020: A Letter From the President
    This 2020 Quilt Circle edition commemorates the 20th Anniversary of The Quilt. The fabric of our research and education (R&E) networking community has never been stronger. While our Quilt community has evolved in new and exciting ways in the past two decades, we have also been faced with a number of challenges, which we take head-on and always with the spirit of collaboration. As we address the unprecedented challenges presented by the current global public health crisis due to the COVID-19 pandemic, the work of our members is more important than ever to the missions of their member communities. U.S. higher education institutions rely on R&E networks to give them a competitive edge in the most impactful scientific research initiatives, which is essential in this crisis. We connect the educational institutions that support university medical centers and their associated hospitals. R&E networks also connect tens of thousands of other community anchor institutions, including K-12 schools, public libraries, local/state government, research sites, cultural institutions, public safety, and tribal lands. Being responsive and providing vital networking infrastructure and resources right now to address immediate needs is who we are and what we do. R&E networks are part of our nation's critical infrastructure. This year's edition of The Quilt Circle showcases several examples of the key role of R&E network members in both providing and facilitating the use of network infrastructure to further scientific discovery and collaborations at higher education institutions of all sizes.
  • ESnet: Advanced Networking for Science
    ENERGY SCIENCES NETWORK
    ESnet: Advanced Networking for Science
    Researchers around the world using advanced computing for scientific discovery are connected via the DOE-operated Energy Sciences Network (ESnet). By providing a reliable, high-performance communications infrastructure, ESnet facilitates the large-scale, collaborative science endeavors fundamental to Office of Science missions.

    Energy Sciences Network
    In many ways, the dramatic achievements of 21st century scientific discovery—often involving enormous data handling and remote collaboration requirements—have been made possible by accompanying accomplishments in high-performance networking. As increasingly advanced supercomputers and experimental research facilities have provided researchers with powerful tools with unprecedented capabilities, advancements in networks connecting scientists to these tools have made these research facilities available to broader communities and helped build greater collaboration within these communities. The DOE Office of Science (SC) operates the Energy Sciences Network (ESnet). Established in 1985, ESnet currently connects tens of thousands of […]tive science. These include: sharing of massive amounts of data, supporting thousands of collaborators worldwide, distributed data processing and data management, distributed simulation, visualization, and computational steering, and collaboration with the U.S. and international research and education (R&E) communities. To ensure that ESnet continues to meet the requirements of the major science disciplines, a new approach and a new architecture are being developed. This new architecture includes elements supporting multiple, high-speed national backbones with different characteristics—redundancy, quality of service, and circuit-oriented services—all the while allowing interoperation of these elements with the other major national and international networks supporting science.
  • IPv6 in ESnet
    SLAC-PUB-8902
    IPv6 in ESnet
    Warren Matthews, Principal Network Specialist, Stanford Linear Accelerator Center (SLAC); Bob Fink, Energy Sciences Network (ESnet) Research Staff, Co-chair NGtrans WG, IETF, and Project Lead for the 6bone Project, IETF; Susan Hicks, Oak Ridge National Laboratory (ORNL); Vyto Grigaliunas, Network Analyst, Fermi National Accelerator Laboratory (FNAL)

    Abstract: The importance of the Internet to modern High Energy Physics collaborators is clearly immense, and understanding how new developments in network technology impact networks is critical to the future design of experiments. The next-generation Internet Protocol (IPv6) is being deployed on testbeds and production networks throughout the world. The protocol has been designed to solve today's Internet problems, and many of its features will be core Internet services in the future. In this talk the features of the protocol will be described. Details will be given on the deployment at sites important to High Energy Physics research and the network services operating at these sites. In particular, IPv6 deployment on the U.S. Energy Sciences Network (ESnet) will be reviewed. The connectivity and performance between High Energy Physics Laboratories, Universities and Institutes will be discussed. Keywords: Network, Technology, Internet, Protocol, IP, IPv6

    1 Introduction
    High Energy and Nuclear Physics (HENP) experiments around the world generate huge amounts of data, much of which is transferred across networks to collaborators for analysis. Network technology has become critical to the success of experiments. The requirements of HENP researchers often mean the networks they use become early implementors of new network technology. In this paper the next generation of the Internet Protocol (IP) is reviewed.
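    As a minimal illustration of the IPv6 connectivity this paper examines, the short Python sketch below opens a TCP connection to a host using only its IPv6 (AF_INET6) addresses and reports the connection time. It is not taken from the paper; the hostname and port are placeholders to be replaced with the endpoints of interest, and a socket error simply indicates that the name or path lacks working IPv6.

        # Minimal sketch: probe basic IPv6 reachability of a host (placeholder endpoint).
        import socket
        import time

        def ipv6_connect_time(host, port=443, timeout=5.0):
            """Return the TCP connect time in seconds to `host` over IPv6."""
            # Resolve only IPv6 (AF_INET6) addresses for the host.
            infos = socket.getaddrinfo(host, port, socket.AF_INET6, socket.SOCK_STREAM)
            family, socktype, proto, _, sockaddr = infos[0]
            start = time.monotonic()
            with socket.socket(family, socktype, proto) as s:
                s.settimeout(timeout)
                s.connect(sockaddr)  # completes the TCP handshake over IPv6
            return time.monotonic() - start

        if __name__ == "__main__":
            # Placeholder host; substitute the IPv6-enabled endpoint you want to probe.
            print("IPv6 connect time: %.3f s" % ipv6_connect_time("www.es.net"))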
  • Description of OmniPoP for Proposals
    Description of OmniPoP for Proposals
    Summary: The OmniPoP is a collaborative effort between 12 of the member universities of the Big Ten Academic Alliance. Together, these institutions have pooled their efforts to create a high-performance shared infrastructure based in the Chicago area. This infrastructure was designed to complement and augment the shared fiber infrastructure that the Big Ten Academic Alliance members had previously purchased. The OmniPoP operates a high-capacity switching infrastructure that supports 10 gigabit and 100 gigabit connections to its member institutions and equivalent high-capacity links to national research and education networks such as Internet2, ESnet, and Starlight. This allows OmniPoP connections to be leveraged to provide services to large data flows in support of multi-institutional cooperative research efforts. Efforts supported today include interconnections between the Large Hadron Collider (LHC) Tier 2 efforts at the member institutions and the Midwest Openflow Crossroads Initiative (MOXI) project, which links several midwest regional networks to the GENI backbone.
    OmniPoP Infrastructure and Peerings: The OmniPoP infrastructure consists of a redundant pair of 100 gigabit capable switches. These switches operate from geographically diverse co-location facilities within the Chicago metropolitan area. These facilities also serve as Points of Presence (PoPs) for other major networks such as Internet2, ESnet (Department of Energy's Energy Sciences Network), and Starlight (the international peering exchange), enabling seamless cross connections to the major national and international research and education networks that support much of the academic research community. An additional benefit to these facilities is that they offer the opportunity for Big Ten Academic Alliance members to co-locate additional network-related equipment in support of their own projects independent of the OmniPoP core infrastructure.
  • High Energy Physics and Nuclear Physics Network Requirements
    High Energy Physics and Nuclear Physics Network Requirements HEP and NP Network Requirements Review Final Report Conducted August 20-22, 2013 ESnet Energy Sciences Network DISCLAIMER This document was prepared as an account of work sponsored by the United States Government. While this document is believed to contain correct information, neither the United States Government nor any agency thereof, nor The Regents of the University of California, nor any of their employees, makes any warranty, express or implied, or assumes any legal responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by its trade name, trademark, manufacturer, or otherwise, does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof, or The Regents of the University of California. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof or The Regents of the University of California. High Energy Physics and Nuclear Physics Network Requirements Offices of High Energy Physics and Nuclear Physics, DOE Office of Science Energy Sciences Network Gaithersburg, Maryland — August 20–22, 2013 ESnet is funded by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research (ASCR). Vince Dattoria is the ESnet Program Manager. ESnet is operated by Lawrence Berkeley National Laboratory, which is operated by the University of California for the U.S.
  • Global Data Sharing for Earth Observation Through R&E Networks
    Global Data Sharing for Earth Observation through R&E Networks
    At a Glance – The Earth Observation Challenge

    From scientists to policy makers, meteorologists, forecasters, transport planners and insurers, a diverse mix of Earth Observation users are increasingly collaborating on a global scale. As satellite imaging, radar and seismic probes offer more detailed ways of recording information, increasingly sophisticated instruments, tools and models are needed to cope with this growing volume and complexity of data. Managing this data on a global scale represents a great challenge for the earth science community. The intergovernmental Group on Earth Observations (GEO) was established in response to this challenge and is leading a plan to globally interconnect the different user communities, facilitating collaboration and knowledge through the creation of the Global Earth Observation System of Systems (GEOSS).

    R&E networks can play a fundamental role in providing the Earth Observation community with the high-speed, high-capacity, secure data transmission capability it requires to collect, integrate, share and analyse environmental data. In particular, R&E networking can support Earth Scientists by facilitating:
    • Transfer of extremely large files such as satellite images over large distances in real time.
    • Real-time distribution of monitoring data to provide early warning of impending natural disasters, such as earthquakes or typhoons.
    • Parallel dissemination of satellite data to weather agencies using multicast.
    • Simulations via grid infrastructures requiring colossal amounts of computing power and bandwidth.
    • Dedicated private connectivity between two or more institutions for research projects with specific security or computing needs.
    • Seamless secure access to distributed data archives via single sign-on.
  • GÉANT Support for R&E during the Early Stages of the COVID-19 Pandemic
    Deliverable D3.9: GÉANT Support for R&E during the Early Stages of the COVID-19 Pandemic
    Contractual Date: 31-05-2020; Actual Date: 29-05-2020; Grant Agreement No.: 856726; Work Package: WP3; Task Item: Tasks 1, 2, 3 and 4; Nature of Deliverable: R (Report); Dissemination Level: PU (Public); Lead Partner: GÉANT Association; Document ID: GN4-3-20-41613A
    Authors: V. Capone, T. Fryer, A. Grant, B. Weber, D. Wustenberg (GÉANT Association), with input from NREN and RREN partners
    © GÉANT Association on behalf of the GN4-3 project. The research leading to these results has received funding from the European Union's Horizon 2020 research and innovation programme under Grant Agreement No. 856726 (GN4-3).
    Abstract: This document summarises the work GÉANT and its European and global partner networks have been doing to support research and education during the early stages of the COVID-19 pandemic, with particular focus on activities being supported by the EC.
    Table of Contents: Executive Summary; 1 Introduction; 2 EC-Supported Activities; 2.1 EMBL-EBI (2.1.1 Overview, 2.1.2 GÉANT Support, 2.1.3 NREN Support); 2.2 New RTD Projects on Coronavirus (2.2.1 Overview, 2.2.2 GÉANT Support); 2.3 RDA COVID-19 Working Group (2.3.1 Overview, 2.3.2 GÉANT Support); 2.4 Virus Outbreak Data Network (2.4.1 Overview, 2.4.2 GÉANT Support); 2.5 Data Together COVID-19 Appeal and Actions (2.5.1 Overview, 2.5.2 GÉANT Support); 3 GÉANT Activities; 3.1 Network; 3.2 Services (3.2.1 Up2U, 3.2.2 eduMEET, 3.2.3 Cloud Offerings)
  • Chapter 9: Enabling Capabilities for Science and Energy (September 2015)
    Quadrennial Technology Review: An Assessment of Energy Technologies and Research Opportunities
    Chapter 9: Enabling Capabilities for Science and Energy (September 2015)

    Tools for Scientific Discovery and Technology Development
    Investment in basic science research is expanding our understanding of how structure leads to function—from the atomic- and nanoscale to the mesoscale and beyond—in natural systems, and is enabling a transformation from observation to control and design of new systems with properties tailored to meet the requirements of the next generation of energy technologies. At the core of this new paradigm is a suite of experimental and computational tools that enable researchers to probe and manipulate matter at unprecedented resolution. The planning and development of these tools is rooted in basic science, but they are critically important for technology development, enabling discoveries that can lead to broad implementation. These tools are available through a user facility model that provides merit-based open access for nonproprietary research. Each year, thousands of users leverage the capabilities and staff expertise for their research, while the facilities leverage user expertise toward maintenance, development, and application of the tools in support of the broader community of users. The challenges in energy science and technology development increasingly necessitate interdisciplinary collaboration. The multidisciplinary and multi-institutional research centers supported by DOE are designed to integrate basic science and applied research to accelerate development of new and transformative energy technologies. Capabilities supported by DOE and the DOE Office of Science (SC) are enabling the following across science and technology:
    • The X-ray light and neutron sources provide unprecedented access to the structure and dynamics of materials and the molecular-scale basis of chemical reactions.
  • High-End Computing at NASA 2006
    For further information about NASA's High-End Computing Program, please contact:
    Dr. Tsengdar Lee, Scientific Computing Portfolio Manager, NASA Science Mission Directorate, NASA Headquarters, (202) 358-0860, [email protected]
    Dr. Rupak Biswas, HECC Project Manager, NASA Advanced Supercomputing Division, NASA Ames Research Center, (650) 604-4411, [email protected], www.nas.nasa.gov
    Dr. W. Phillip Webster, NCCS Project Manager, Computational and Information Sciences and Technology Office, NASA Goddard Space Flight Center, (301) 286-9535, [email protected], www.nccs.nasa.gov

    HIGH-END COMPUTING AT NASA 2006
    November 1, 2006
    Members of the Scientific Research and Engineering Community: We are extremely pleased to present the inaugural report from NASA's newly established High-End Computing (HEC) Program. This publication captures remarkable science and engineering accomplishments enabled by the HEC Program's shared high-end computing systems and services. For several decades, high-end computing has played an important role in supporting NASA's missions, with advancements as broad as preparing the Space Shuttle for Return to Flight, assimilating vast quantities of Earth observational data into climate models, and developing astrophysical calculations that help us better understand the origins of our universe. Today, our Program is committed to maintaining a stable, service-oriented computing environment for all four of the Agency's mission directorates—Aeronautics Research, Exploration Systems, Science, and Space Operations—as well as NASA's Engineering and Safety Center, external collaborators, and the nation. In this report, you will read about the technologies that help make NASA's HEC Program successful.
  • 2016 Annual Report for the Northern Tier Network
    Northern Tier Network – North Dakota, FY16
    North Dakota State University, University of North Dakota and North Dakota Information Technology Department
    Index: FY16 Annual Report; Key Elements; Looking Ahead – Critical Upgrades for Next Generation Applications; Overview; FY16 Financial Summary; 21st Century Collaborative Research and Academics; Local and Global Impact; Milestones; NTN-ND Partners; References

    The report will provide an update on the operations and maintenance of the Northern Tier Network – North Dakota (NTN-ND) for the fiscal year 2016 aligned to the intent of the network as described in the North Dakota Century Code. Contents of this report include a description of the functions of the network and its service to the research and education activities in North Dakota, along with its role in the larger regional, national and global research and education (R&E) network community. This report is intended to inform NTN-ND stakeholders across North Dakota, especially those beneficiaries of the exemplary resources that connection to this global community provides. These include students, faculty and staff located at higher education institutions and Tribal Colleges, and the K-12 community. An overview of NTN-ND and its role within the larger regional, national and global R&E network environment is followed by a current financial report, examples of 21st century collaborative research and academics, highlights of major milestones in the evolution of the R&E network community, and NTN-ND partners.

    Evaluation Procedures: Data gathered and analyzed for this document included a review of current financial records, annual operations, recent R&E network traffic summaries, and current trends in use cases of R&E networks.
  • Engagement and Performance Operations Center (EPOC): Year 2 Quarter 1 Report
    Engagement and Performance Operations Center (EPOC) (formerly known as ReSEC)
    NSF Award #1826994, Year 2 Quarter 1 Report, 1 April 2019 through 30 June 2019
    PIs: Jennifer Schopf (IU), Jason Zurawski (LBNL), Dave Jent (IU)

    Summary
    The goal of the EPOC project is to provide researchers and network engineers with a holistic set of tools and services needed to debug performance issues and enable reliable and robust data transfers. It supports five main activities: Roadside Assistance and Consulting, Application Deep Dives, Network Analysis using NetSage, the provision of Managed Services, and Training. In Year 2 Quarter 1, highlights included attending every summer meeting for our Regional Partners and giving EPOC overviews at each, completing nine consultation cases and continuing to work nine others, completing the interview portion for six Deep Dives, and leading five training sessions, including one with completely new material.

    1. EPOC Overview
    The Engagement and Performance Operations Center (EPOC) is a production platform for operations, applied training, monitoring, and research and education support. EPOC is a collaborative focal point for operational expertise and analysis and is jointly led by Indiana University (IU) and the Energy Sciences Network (ESnet). EPOC provides researchers and network engineers with a holistic set of tools and services needed to debug performance issues and enable reliable and robust data transfers. By considering the full end-to-end data movement pipeline, EPOC is uniquely able to support collaborative science, allowing researchers to make the most effective use of shared data, computing, and storage resources to accelerate the discovery process.
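    To make the notion of transfer-performance debugging concrete, the Python sketch below times a single HTTP download and reports the achieved throughput in Mbit/s, the basic quantity examined in engagements like those described above. It is illustrative only and not part of the EPOC reports; real engagements rely on dedicated measurement tools such as perfSONAR or iperf3, and the URL shown is a placeholder.

        # Minimal sketch: measure achieved throughput for one HTTP download (placeholder URL).
        import time
        import urllib.request

        def download_throughput_mbps(url, chunk_size=1 << 20):
            """Download `url`, discard the data, and return throughput in Mbit/s."""
            total_bytes = 0
            start = time.monotonic()
            with urllib.request.urlopen(url) as response:
                while True:
                    chunk = response.read(chunk_size)  # stream in 1 MiB chunks
                    if not chunk:
                        break
                    total_bytes += len(chunk)
            elapsed = time.monotonic() - start
            return (total_bytes * 8) / (elapsed * 1e6)  # bits transferred per second, in Mbit/s

        if __name__ == "__main__":
            # Placeholder URL; point this at a test file on the network path being examined.
            print("Throughput: %.1f Mbit/s" % download_throughput_mbps("https://example.org/testfile"))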
  • Engagement and Performance Operations Center (EPOC): Year 1 Annual Report, July 1, 2018 – March 31, 2019
    Engagement and Performance Operations Center (EPOC) (Formerly known as ReSEC) NSF Award #1826994 Year 1 Annual Report 1 July 2018 through 31 March 2019 PIs: Jennifer Schopf (IU), Jason Zurawski (LBNL), Dave Jent (IU) Summary The goal of the EPOC project is to provide researchers and network engineers with a holistic set of tools and services needed to debug performance issues and enable reliable and robust data transfers. It supports five main activities: Roadside Assistance and Consulting, Application Deep Dives, Network Analysis using NetSage, the provision of Managed Services, and Training. In Year 1, highlights included setting up the full Roadside Assistance and Consulting framework and working through multiple cases of each, two completed Deep Dive sessions and planning for seven others, NetSage prototype deployments for the Great Plains Network and iLight, initial work to define Managed Services and partner interest, and several engineering and engagement Training sessions. 1. EPOC Overview The Engagement and Performance Operations Center (EPOC) is a production platform for operations, applied training, monitoring, and research and education support. EPOC is a collaborative focal point for operational expertise and analysis and is jointly led by Indiana University (IU) and the Energy Sciences Network (ESnet). EPOC provides researchers and network engineers with a holistic set of tools and services needed to debug performance issues and enable reliable and robust data transfers. By considering the full end-to-end data movement pipeline, EPOC is uniquely able to support collaborative science, allowing researchers to make the most effective use of shared data, computing, and storage resources to accelerate the discovery process.