Cyberinfrastructure in the U.S.
Charlie Catlett
Director, NSF TeraGrid Initiative
Senior Fellow, Computation Institute, The University of Chicago and Argonne National Laboratory
Member, Board of Directors, Open Grid Forum
Workshop on e-Infrastructure, Helsinki, Finland, October 4-5, 2006

Cyberinfrastructure Vision (Source: Dan Atkins, NSF Office of Cyberinfrastructure)
• "Atkins report" - blue-ribbon panel chaired by Daniel E. Atkins
• Called for a national-level, integrated system of hardware, software, and data resources and services
• New infrastructure to enable new paradigms of science and engineering research and education with increased efficiency
• Report available at www.nsf.gov/od/oci/reports/toc.jsp

NSF Cyberinfrastructure (CI) Governance (adapted from Dan Atkins, NSF Office of Cyberinfrastructure)
• New NSF Office of Cyberinfrastructure (OCI) reports to the NSF Director
• OCI focuses on provisioning "production-quality" CI to accelerate 21st-century research and education
• Director: Dan Atkins (University of Michigan)
• Oversight, strategy, and governance
  – CI Council: NSF Associate Directors, chaired by the NSF Director
  – External CI Advisory Committee
  – Cyberinfrastructure User Advisory Committee (CUAC)

"NSF Cyberinfrastructure Vision for 21st Century Discovery" (from Draft 7.1 CI Plan at www.nsf.gov/oci/; source: Dan Atkins, NSF Office of Cyberinfrastructure)
• Four interrelated components:
  1. Distributed HPC, scalable up to petaFLOPS; includes networking, middleware, and "sophisticated" science application software
  2. Data, data analysis, and visualization; includes data systems and data to and from instruments
  3. Collaboratories, observatories, and virtual organizations
  4. Education and workforce
• Provide sustainable and evolving CI that is secure, efficient, reliable, accessible, usable, and interoperable
• Provide access to world-class tools and services
• $60M in FY2007

OCI Investment Highlights (FY2007) (adapted from Kevin Thompson, NSF Office of Cyberinfrastructure)
• High Performance Computing (HPC)
  – TeraGrid Facility ($30M in FY07; $67M/yr beginning FY08)
  – FY2006-2009: Midrange HPC Acquisitions ($30M/yr)
  – FY2007-2010: Leadership Class HPC System Acquisition ($200M total)
• Data- and Collaboration-Intensive Software and Services
  – International Research Network Connections
  – NSF Middleware Initiative (NMI)
  – Cyberinfrastructure for Environmental Observatories: Prototype Systems (CEO:P)
  – Total program in FY2007: $27M
• Learning and Workforce Development
  – FY2006-2007: CI-TEAM ($10M/yr)
  – Research Experience for Undergraduates (REU)

TeraGrid Mission
• TeraGrid will create integrated, persistent, and pioneering computational resources that will significantly improve our nation's ability and capacity to gain new insights into our most challenging research questions and societal problems.
  – Our vision requires an integrated approach to the scientific workflow, including obtaining access, application development and execution, data analysis, collaboration, and data management.
  – These capabilities must be accessible broadly to the science, engineering, and education community.
TeraGrid: A National Production Cyberinfrastructure Facility
• Phase I (2001-2004): design, deploy, expand ($90M)
• Phase II (2005-2010): operation and enhancement ($150M)
• Hardware, including operations ($390M)
• Resource provider sites: UW, NCAR, UC/ANL, PU, PSC, IU, NCSA, Caltech, ORNL, UNC, USC/ISI, SDSC, TACC
• 20+ distinct computing resources (150 TF today; 570 TF by August 2007)
• 100+ data collections
• Dedicated n x 10 Gb/s optical network

TeraGrid Usage
• 1,000 projects, 4,000 users

TeraGrid User Community
• Map of PI locations: blue, 10 or more PIs; red, 5-9 PIs; yellow, 2-4 PIs; green, 1 PI
• 1,000 projects, 4,000 users

Enabling Discovery via "Capability" Computing: Advanced Support for TeraGrid Applications (ASTA) Program
Project | PI (Institution) | NSF Div | Status | TeraGrid Staff
Arterial | Karniadakis (Brown U) | CTS | completed | O'Neal (GIG/PSC)
Vortonics | Boghosian (Tufts U) | CTS | completed | O'Neal (GIG/PSC)
SPICE | Coveney (UCL) | CHE | completed | O'Neal (GIG/PSC)
ENZO | Norman (UCSD) | AST | completed | Harkness (SDSC)
Injector | Heister (Purdue) | ASC | completed | Kim (NCSA)
MD-Data | Jakobsson (UIUC) | BIO | completed | Parker (NCSA)
NREL | Brady (Cornell) | BIO | completed | Chukkapali (SDSC)
SCEC | Olsen (SDSU) | GEO | in progress | Cui (SDSC), Reddy (GIG/PSC)
BIRN | Ellisman (SDSC) | BIO | in progress | Majumdar (SDSC)
CMS | Newman (Caltech) | PHY | in progress | Milfeld (TACC)
CIG | Gurnis (Caltech) | GEO | in progress | Gardner (PSC)
EarthScope | Pavlis (IU) | GEO | in progress | Sheppard (IU)
Crystal | Deem (Rice) | PHY | in progress | Walker (TACC)
Tstorms | Droegemeier (OU) | ATM | in progress | O'Neal (GIG/PSC)
Turbulence | Woodward (U Minn) | ASC | in progress | Reddy (PSC)
Nemo3D | Klimeck (Purdue) | ENG | proposed | Raymond (GIG/PSC)
Epidemiology | Barrett (VaTech), Cuticchia (Duke) | BCS | proposed | Marcusiu (NCSA)
Pulmonary Immunity | Benos (Pitt) | BIO | proposed | Raymond (GIG/PSC)
Demography | Lansing (U Arizona) | DBS | proposed | Majumdar (SDSC), Gome (PSC)
Multidimensional Microscope Imaging | Luby-Phelps (UT SW Med Ctr, Dallas) | BIO | proposed | Hempel (TACC)

Broader Engagement
• How can TeraGrid engage the broader science, engineering, and education community in leveraging and creating discovery?
• TeraGrid strategies for engagement:
  – Science Gateways: empower community-designed, community-led, and community-supported infrastructure
  – Education and training initiatives: create and support a community of educators and faculty
  – Campus partnerships: new partnership programs (TeraGrid, OSG, Internet2)

TeraGrid Science Gateways Initiative: Community Interface to Grids
(Diagram: a community gateway as a common interface to Grid-X, TeraGrid, and Grid-Y.)
Scientists and educators are doing their work through web browsers and desktop applications. TeraGrid is adapting its high-performance infrastructure to these user environments through standard Web Services and programmable interfaces that allow for community-specific customization. Partnership in design assures broad and global benefit.
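As a rough illustration of the gateway model described above, the sketch below shows a hypothetical portal front end packaging a community member's request and forwarding it to a grid job service over HTTP. The endpoint URL, payload fields, and community credential are assumptions made for illustration; the 2006-era gateways typically exposed SOAP-style Web Services rather than the simple JSON-over-HTTP call shown here.

    # Hypothetical sketch of a science-gateway front end forwarding a user's job
    # request to a grid job-submission web service. The endpoint, payload fields,
    # and community credential are illustrative assumptions, not an actual
    # TeraGrid gateway API.
    import json
    import urllib.request

    GATEWAY_JOB_SERVICE = "https://gateway.example.org/jobs"  # assumed endpoint

    def submit_job(application, input_files, community_credential):
        """Package a community user's request and POST it to the grid service."""
        payload = json.dumps({
            "application": application,         # a domain code the gateway curates
            "inputs": input_files,              # files already staged by the portal
            "credential": community_credential  # gateway-held community credential
        }).encode("utf-8")
        request = urllib.request.Request(
            GATEWAY_JOB_SERVICE,
            data=payload,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        # Returns, e.g., {"job_id": "...", "state": "queued"} from the service.
        with urllib.request.urlopen(request) as response:
            return json.loads(response.read())

A portal handler would call submit_job() on behalf of an authenticated community member and display the returned job state, hiding the grid middleware entirely behind the community's own web interface.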
Science Gateway Partners
• Network for Computational Nanotechnology and nanoHUB (Purdue)
• National Virtual Observatory (NVO, Caltech)
• Linked Environments for Atmospheric Discovery (LEAD, Indiana)
• Computational Chemistry Grid (GridChem, NCSA)
• Computational Science and Engineering Online (CSE-Online, Utah)
• GEON (GEOsciences Network) (GEON, SDSC)
• Network for Earthquake Engineering Simulation (NEES, SDSC)
• SCEC Earthworks Project (USC)
• Astrophysical Data Repository (Cornell)
• CCR ACDC Portal (Buffalo)
• GIScience Gateway (GISolve, Iowa)
• Biology and Biomedicine Science Gateway (UNC RENCI)
• Open Life Sciences Gateway (OLSG, UChicago)
• The Telescience Project (UCSD)
• Grid Analysis Environment (GAE, Caltech)
• Neutron Science Instrument Gateway (ORNL)
• TeraGrid Viz Gateway (ANL)
• BIRN (UCSD)
• Gridblast Bioinformatics Gateway (NCSA)
• Earth Systems Grid (NCAR)
• SID Grid (UChicago)
• Special PRiority and Urgent Computing Environment (SPRUCE, UChicago)
• Open Science Grid (OSG)

TeraGrid Science Gateway Partner Sites
• 21 Science Gateway partners (and growing), spanning over 100 partner institutions
• Contact: Nancy Wilkins-Diehr ([email protected])

Emerging Partnership Frameworks
• Currently under discussion, with joint leadership from TeraGrid, OSG, and Internet2
• Campus and agency partnership programs:
  – Integrated authorization: attribute-based authorization with local authentication (a sketch appears at the end of this section)
  – Integrated HPC and data management: frameworks to support "co-generation"
  – Integrated data collections and digital assets: frameworks for federating digital content and assets
  – Expanding CI beyond only the top R1 institutions: an affiliate program with mentoring institutions
  – Expanded CI education and training capabilities: develop and harvest best-of-breed education and training materials

OGF Grid Interoperation NOW (GIN)
• Multi-grid collaboration to pursue basic interoperation across four services in 2006
  – Information, job execution, authorization, and data movement
  – Driven by early-adopter applications
  – Initially nine production grid facilities (EGEE, NGS, APAC, DEISA, K*Grid, PRAGMA, OSG, TeraGrid, NAREGI); today over 20
• Initial meeting held at GGF in November 2005; workshops held 2/06, 5/06, and 9/06
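To make the interoperation goal concrete, here is a minimal sketch of the idea behind GIN, under the assumption that each participating grid wraps its own middleware behind a common interface for the four agreed services. The class and method names are illustrative only, not an OGF-defined API.

    # Sketch: an application codes against one abstract interface for the four
    # GIN services, and each grid (EGEE, OSG, TeraGrid, ...) supplies an adapter.
    # Names are illustrative assumptions, not a standardized interface.
    from abc import ABC, abstractmethod

    class GridAdapter(ABC):
        """The four services targeted for basic interoperation in 2006."""

        @abstractmethod
        def query_information(self, resource_type: str) -> list:
            """Information service: discover matching resources."""

        @abstractmethod
        def submit_job(self, job_description: dict) -> str:
            """Job execution service: return a job identifier."""

        @abstractmethod
        def authorize(self, credential: str) -> bool:
            """Authorization service: accept or reject a credential."""

        @abstractmethod
        def transfer(self, source_url: str, destination_url: str) -> None:
            """Data movement service: copy data between endpoints."""

    def run_everywhere(adapters: list, job_description: dict) -> dict:
        """Submit one job description to every grid whose adapter accepts it."""
        job_ids = {}
        for adapter in adapters:
            if adapter.authorize(job_description.get("credential", "")):
                job_ids[type(adapter).__name__] = adapter.submit_job(job_description)
        return job_ids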
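Returning to the Emerging Partnership Frameworks slide, the integrated-authorization item (attribute-based authorization with local authentication) can also be illustrated with a small sketch: the home campus authenticates the user and asserts attributes, and the resource provider decides access from those attributes rather than from a local account. The attribute names, policy fields, and identity-provider names below are illustrative assumptions, not TeraGrid's actual implementation.

    # Minimal sketch of attribute-based authorization with local authentication.
    # Attribute names and the policy are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class AttributeAssertion:
        issuer: str       # campus identity provider that authenticated the user
        subject: str      # opaque user identifier
        attributes: dict  # e.g. {"affiliation": "faculty", "project": "TG-ABC123"}

    RESOURCE_POLICY = {
        "trusted_issuers": {"idp.campus-a.edu", "idp.campus-b.edu"},
        "required_affiliations": {"faculty", "staff", "student"},
        "allowed_projects": {"TG-ABC123", "TG-XYZ789"},
    }

    def authorize(assertion: AttributeAssertion, policy: dict) -> bool:
        """Grant access only if the assertion comes from a trusted campus and
        its attributes satisfy the resource provider's policy."""
        if assertion.issuer not in policy["trusted_issuers"]:
            return False
        if assertion.attributes.get("affiliation") not in policy["required_affiliations"]:
            return False
        return assertion.attributes.get("project") in policy["allowed_projects"]

    # Example: a campus-authenticated faculty member on an allocated project.
    print(authorize(AttributeAssertion(
        issuer="idp.campus-a.edu",
        subject="user-42",
        attributes={"affiliation": "faculty", "project": "TG-ABC123"},
    ), RESOURCE_POLICY))  # prints True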