Cyberinfrastructure in the U.S.

Total Pages: 16

File Type: PDF, Size: 1020 KB

Cyberinfrastructure in the U.S.
Charlie Catlett
Director, NSF TeraGrid Initiative; Senior Fellow, Computation Institute, The University of Chicago and Argonne National Laboratory; Member, Board of Directors, Open Grid Forum
Workshop on e-Infrastructure, Helsinki, Finland, October 4-5, 2006

Cyberinfrastructure Vision (source: Dan Atkins, NSF Office of Cyberinfrastructure)
• The "Atkins report" – a blue-ribbon panel chaired by Daniel E. Atkins
• Called for a national-level, integrated system of hardware, software, and data resources and services
• New infrastructure to enable new paradigms of science and engineering research and education, with increased efficiency
www.nsf.gov/od/oci/reports/toc.jsp

NSF Cyberinfrastructure (CI) Governance (adapted from Dan Atkins, NSF Office of Cyberinfrastructure)
• New NSF Office of Cyberinfrastructure (OCI) reports to the NSF Director
• OCI focuses on provisioning "production-quality" CI to accelerate 21st-century research and education
• Director: Dan Atkins (University of Michigan)
• Oversight, strategy, and governance: the CI Council (NSF Associate Directors, chaired by the NSF Director), an external CI Advisory Committee, and the Cyberinfrastructure User Advisory Committee (CUAC)

"NSF Cyberinfrastructure Vision for 21st Century Discovery" (from Draft 7.1 of the CI Plan at www.nsf.gov/oci/) spans four areas:
1. Distributed HPC, scalable up to petaFLOPS; includes networking, middleware, and "sophisticated" science application software
2. Data, data analysis, and visualization; includes data to and from instrument systems
3. Collaboratories, observatories, and virtual organizations
4. Education and workforce
The vision commits NSF to:
• provide sustainable and evolving CI that is secure, efficient, reliable, accessible, usable, and interoperable
• provide access to world-class tools and services
Funding: $60M in FY2007.

OCI Investment Highlights, FY2007 (adapted from Kevin Thompson, NSF Office of Cyberinfrastructure)
• High Performance Computing (HPC)
– TeraGrid Facility ($30M in FY07; $67M/yr beginning FY08)
– FY2006-2009: Midrange HPC Acquisitions ($30M/yr)
– FY2007-2010: Leadership-Class HPC System Acquisition (total $200M)
• Data- and Collaboration-Intensive Software and Services (total program in FY2007: $27M)
– International Research Network Connections
– NSF Middleware Initiative (NMI)
– Cyberinfrastructure for Environmental Observatories: Prototype Systems (CEO:P)
• Learning and Workforce Development
– FY2006-2007: CI-TEAM ($10M/yr)
– Research Experience for Undergraduates (REU)

TeraGrid Mission
• TeraGrid will create integrated, persistent, and pioneering computational resources that will significantly improve our nation's ability and capacity to gain new insights into our most challenging research questions and societal problems.
– This vision requires an integrated approach to the scientific workflow, including obtaining access, application development and execution, data analysis, collaboration, and data management.
– These capabilities must be accessible broadly to the science, engineering, and education community.
TeraGrid: A National Production Cyberinfrastructure Facility
• Phase I (2001-2004): design, deploy, expand ($90M)
• Phase II (2005-2010): operation and enhancement ($150M); hardware, including operations ($390M)
• Resource provider sites: UW, NCAR, UC/ANL, PU, PSC, IU, NCSA, Caltech, ORNL, UNC, USC/ISI, SDSC, TACC
• 20+ distinct computing resources (150 TF today, growing to 570 TF by August 2007)
• 100+ data collections
• Dedicated n×10 Gb/s optical network

TeraGrid Usage and User Community (October 2006): 1,000 projects, 4,000 users. On the user-community map, blue marks sites with 10 or more PIs, red 5-9 PIs, yellow 2-4 PIs, and green 1 PI.

Enabling Discovery via "Capability" Computing: the Advanced Support for TeraGrid Applications (ASTA) Program

Project | PI (Institution) | NSF Div | Status | TeraGrid Staff
Arterial | Karniadakis (Brown U) | CTS | completed | O'Neal (GIG/PSC)
Vortonics | Boghosian (Tufts U) | CTS | completed | O'Neal (GIG/PSC)
SPICE | Coveney (UCL) | CHE | completed | O'Neal (GIG/PSC)
ENZO | Norman (UCSD) | AST | completed | Harkness (SDSC)
Injector | Heister (Purdue) | ASC | completed | Kim (NCSA)
MD-Data | Jakobsson (UIUC) | BIO | completed | Parker (NCSA)
NREL | Brady (Cornell) | BIO | completed | Chukkapali (SDSC)
SCEC | Olsen (SDSU) | GEO | in progress | Cul (SDSC), Reddy (GIG/PSC)
BIRN | Ellisman (SDSC) | BIO | in progress | Majumdar (SDSC)
CMS | Newman (Caltech) | PHY | in progress | Milfeld (TACC)
CIG | Gurnis (Caltech) | GEO | in progress | Gardner (PSC)
EarthScope | Pavlis (IU) | GEO | in progress | Sheppard (IU)
Crystal | Deem (Rice) | PHY | in progress | Walker (TACC)
Tstorms | Droegemeier (OU) | ATM | in progress | O'Neal (GIG/PSC)
Turbulence | Woodward (U Minn) | ASC | in progress | Reddy (PSC)
Nemo3D | Klimeck (Purdue) | ENG | proposed | Raymond (GIG/PSC)
Epidemiology | Barrett (VaTech), Cuticchia (Duke) | BCS | proposed | Marcusiu (NCSA)
Pulmonary Immunity | Benos (Pitt) | BIO | proposed | Raymond (GIG/PSC)
Demography | Lansing (U Arizona) | DBS | proposed | Majumdar (SDSC), Gome (PSC)
Multidimensional Microscope Imaging | Luby-Phelps (UT SW Med Ctr, Dallas) | BIO | proposed | Hempel (TACC)

Broader Engagement
• How can TeraGrid engage the broader science, engineering, and education community in leveraging and creating discovery?
• TeraGrid strategies for engagement:
– Science Gateways: empower community-designed, community-led, community-supported infrastructure
– Education and Training Initiatives: create and support a community of educators and faculty
– Campus Partnerships: new partnership programs (TeraGrid, OSG, Internet2)

TeraGrid Science Gateways Initiative: Community Interface to Grids
Scientists and educators are doing their work through web browsers and desktop applications. TeraGrid is adapting its high-performance infrastructure to these user environments through standard Web Services and programmable interfaces that allow for community-specific customization; a sketch of the idea appears below. Partnership in design assures broad and global benefit.
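To make the Web-Services pattern concrete, here is a minimal sketch of how a community gateway might submit a user's job to a TeraGrid-style resource. It is an illustration only, assuming a hypothetical REST endpoint, job-description fields, and community-credential header; it is not the actual TeraGrid gateway API.

# Minimal sketch of a science-gateway job submission.
# Hypothetical API: the endpoint, fields, and auth header are assumptions
# for illustration, not the actual TeraGrid Web-Services interfaces.
import json
import urllib.request

GATEWAY_ENDPOINT = "https://gateway.example.org/api/jobs"  # hypothetical URL

def submit_job(community_token, executable, input_files, cpu_hours):
    """Submit a community user's job; return the job ID the service assigns."""
    job_description = {
        "executable": executable,   # application preinstalled at the resource
        "inputs": input_files,      # data staged through the gateway
        "allocation": "community",  # gateways typically share one allocation
        "cpu_hours": cpu_hours,
    }
    request = urllib.request.Request(
        GATEWAY_ENDPOINT,
        data=json.dumps(job_description).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # The gateway holds the grid credential on behalf of its whole
            # community, so end users never manage grid accounts themselves.
            "Authorization": "Bearer " + community_token,
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["job_id"]

The design point worth noticing is the community credential: the portal authenticates once for its whole user community, which is what lets browser-based users reach high-performance resources without individually handling grid certificates.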
Science Gateway Partners
• Network for Computational Nanotechnology and nanoHUB (Purdue)
• GIScience Gateway (GISolve, Iowa)
• Biology and Biomedicine Science Gateway (UNC RENCI)
• National Virtual Observatory (NVO, Caltech)
• Open Life Sciences Gateway (OLSG, UChicago)
• Linked Environments for Atmospheric Discovery (LEAD, Indiana)
• The Telescience Project (UCSD)
• Computational Chemistry Grid (GridChem, NCSA)
• Grid Analysis Environment (GAE, Caltech)
• Neutron Sci. Instr. Gateway (ORNL)
• Computational Science and Engineering Online (CSE-Online, Utah)
• TeraGrid Viz Gateway (ANL)
• BIRN (UCSD)
• GEON (GEOsciences Network) (GEON, SDSC)
• Gridblast Bioinformatics Gateway (NCSA)
• Network for Earthquake Engineering Simulation (NEES, SDSC)
• Earth Systems Grid (NCAR)
• SCEC Earthworks Project (USC)
• SID Grid (UChicago)
• Astrophysical Data Repository (Cornell)
• Special PRiority and Urgent Computing Environment (SPRUCE, UChicago)
• CCR ACDC Portal (Buffalo)
• Open Science Grid (OSG)

TeraGrid Science Gateway Partner Sites: 21 Science Gateway Partners (and growing), with over 100 partner institutions. Contact: Nancy Wilkins-Diehr ([email protected]).

Emerging Partnership Frameworks (currently under discussion; joint leadership: TeraGrid, OSG, and Internet2). Campus and agency partnership programs:
– Integrated authorization: attribute-based authorization with local authentication
– Integrated HPC and data management: frameworks to support "co-generation"
– Integrated data collections and digital assets: frameworks for federating digital content and assets
– Expanding CI beyond only the top R1 institutions: an affiliate program with mentoring institutions
– Expanded CI education and training capabilities: develop and harvest best-of-breed education and training materials

OGF Grid Interoperation Now (GIN)
• A multi-grid collaboration to pursue basic interoperation across four services in 2006: information, job execution, authorization, and data movement (sketched below)
• Driven by early-adopter applications
• Initially nine production grid facilities (EGEE, NGS, APAC, DEISA, K*Grid, PRAGMA, OSG, TeraGrid, NAREGI); today over 20
• Initial meeting at GGF in November 2005; workshops held in February, May, and September 2006
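As a closing illustration, the four GIN services map naturally onto a small adapter interface that each participating grid could implement. The sketch below is hypothetical: the class and method names are invented for this example, and real interoperation ran through each grid's own middleware rather than a single Python interface, but it shows the shape of the agreement.

# Illustrative sketch of the four GIN interoperation services as a common
# adapter interface. Names are invented; each production grid (TeraGrid,
# EGEE, DEISA, ...) would back these methods with its own middleware.
from abc import ABC, abstractmethod

class GridAdapter(ABC):
    """One adapter per participating grid."""

    @abstractmethod
    def query_information(self, resource_type):
        """Information service: discover resources and their current state."""

    @abstractmethod
    def authorize(self, user_attributes, action):
        """Authorization: attribute-based decision; authentication stays local."""

    @abstractmethod
    def submit(self, job_description):
        """Job execution: accept a common job description, return a job ID."""

    @abstractmethod
    def transfer(self, source_url, dest_url):
        """Data movement: copy data between storage endpoints on two grids."""

def run_everywhere(grids, job_description, user_attributes):
    """Submit one job to every grid on which the user is authorized."""
    job_ids = []
    for grid in grids:
        if grid.authorize(user_attributes, action="submit"):
            job_ids.append(grid.submit(job_description))
    return job_ids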
Recommended publications
  • The Institutional Challenges of Cyberinfrastructure and E-Research
    E-Research and E-Scholarship: The Institutional Challenges of Cyberinfrastructure and E-Research. By Clifford Lynch. Scholarly practices across an astoundingly wide range of disciplines have become profoundly and irrevocably changed by the application of advanced information technology. This collection of new and emergent scholarly practices was first widely recognized in the science and engineering disciplines. In the late 1990s, the term e-science (or occasionally, particularly in Asia, cyber-science) began to be used as a shorthand for these new methods and approaches. The United Kingdom launched its formal e-science program in 2001.1 In the United States, a multi-year inquiry, having its roots in supercomputing support for the portfolio of science and engineering disciplines funded by the National Science Foundation, culminated in the production of the "Atkins Report" in 2003, though there was considerable delay before NSF began to act programmatically on the report.2 The quantitative social sciences—which are largely part of NSF's funding purview and which have long traditions of data curation and sharing, as well as the use of high-end statistical computation—received more detailed examination in a 2005 NSF report.3 Key leaders in the humanities and qualitative social sciences recognized that IT-driven innovation in those disciplines was also well advanced, though less uniformly adopted (and indeed sometimes controversial). In fact, the humanities continue to showcase some of the most creative and transformative examples of the use of information technology to create new scholarship.4 Based on this recognition of the immense disciplinary scope of the impact of information technology, the more inclusive term e-research (occasionally, e-scholarship) has come into common use, at least in North America and Europe.
  • TeraGrid Evaluation Report: Project Findings, August 2008
    Report from the TeraGrid Evaluation Study, Part 1: Project Findings. Ann Zimmerman and Thomas A. Finholt, August 2008. Collaboratory for Research on Electronic Work, School of Information, University of Michigan, 1075 Beal Avenue, Ann Arbor, MI 48109-2112. http://www.crew.umich.edu

    Table of Contents: Executive Summary, 1; 1. Introduction, 4; 2. Research Questions, 5; 3. Research Methods, 5; 3.1 Data Collection, 6; 3.1.1 User Workshop, 6; 3.1.2 Site Visits, 6; 3.1.3 Interviews, 6 (TeraGrid Users, 8; TeraGrid and Centers Personnel, 10; Non-TeraGrid
  • The Neutron Science TeraGrid Gateway, a TeraGrid Science Gateway to Support the Spallation Neutron Source
    The Neutron Science TeraGrid Gateway, a TeraGrid Science Gateway to Support the Spallation Neutron Source. John W. Cobb*, Al Geist*, James A. Kohl*, Stephen D. Miller‡, Peter F. Peterson‡, Gregory G. Pike*, Michael A. Reuter‡, Tom Swain†, Sudharshan S. Vazhkudai*, Nithya N. Vijayakumar$. SUMMARY: The National Science Foundation's (NSF's) Extensible Terascale Facility (ETF), or TeraGrid [1], is entering its operational phase. One ETF science gateway effort is the Neutron Science TeraGrid Gateway (NSTG). The Oak Ridge National Laboratory (ORNL) resource provider effort (ORNL-RP), during construction and now in operations, is bridging a large-scale experimental community and the TeraGrid as a large-scale national cyberinfrastructure. Of particular emphasis is collaboration with the Spallation Neutron Source (SNS) at ORNL. The U.S. Department of Energy's (USDOE's) SNS [2] at ORNL will be commissioned in spring of 2006 as the world's brightest source of neutrons. Neutron science users can run experiments; generate datasets; perform data reduction and analysis; visualize results; collaborate with remote users; and archive long-term data in repositories with curation services. The ORNL-RP and the SNS data analysis group have spent 18 months developing and exploring user requirements, including the creation of prototypical services such as facility portal, data, and application execution services. We describe results from these efforts and discuss implications for science gateway creation. Finally, we show how these results are incorporated into implementation planning for the NSTG and SNS architectures. The plan is for primarily portal-based user interaction, supported by a service-oriented architecture for functional implementation. KEY WORDS: Portal, Neutron Scattering, TeraGrid, Science Gateway, Service Architecture, Grid 1.
  • Cyberinfrastructure for High Energy Physics in Korea
    17th International Conference on Computing in High Energy and Nuclear Physics (CHEP09), IOP Publishing. Journal of Physics: Conference Series 219 (2010) 072032. doi:10.1088/1742-6596/219/7/072032. Cyberinfrastructure for High Energy Physics in Korea. Kihyeon Cho, Hyunwoo Kim and Minho Jeung, High Energy Physics Team, Korea Institute of Science and Technology Information, Daejeon 305-806, Korea. E-mail: [email protected]

    Abstract. We introduce the hierarchy of cyberinfrastructure, which consists of infrastructure (supercomputing and networks), Grid, e-Science, community, and physics, from the bottom layer to the top layer. KISTI is the national headquarters for supercomputing, networking, Grid, and e-Science in Korea, and is therefore the best place for high energy physicists to use cyberinfrastructure. We explain this concept through the CDF and ALICE experiments. The goal of e-Science is to study high energy physics anytime and anywhere, even when not on site at the accelerator laboratories. Its components are data production, data processing, and data analysis. Data production means taking both on-line and off-line shifts remotely. Data processing means running jobs anytime, anywhere using Grid farms. Data analysis means working together to publish papers using collaborative environments such as the EVO (Enabling Virtual Organization) system. We also present the global community activities of FKPPL (France-Korea Particle Physics Laboratory) and physics as the top layer. 1. Introduction. We report our experiences and results relating to the utilization of cyberinfrastructure in high energy physics. According to the Wiki webpage on "cyberinfrastructure," the term describes the new research environments that support advanced data acquisition, storage, management, integration, mining, and visualization, as well as other computing and information processing services over the Internet.
  • Boston University Cyberinfrastructure Plan
    Boston University Information Services & Technology, 111 Cummington Street, Boston, Massachusetts 02215. Boston University Cyberinfrastructure Plan. I. BU Research Infrastructure. • THE MASSACHUSETTS GREEN HIGH-PERFORMANCE COMPUTING CENTER (MGHPCC): BU recently partnered with government and regional institutions to establish the Massachusetts Green High Performance Computing Center (MGHPCC), operated by BU, Harvard, MIT, Northeastern, and the University of Massachusetts—five of the most research-intensive universities in Massachusetts—dedicated to serving our growing research computing needs more collaboratively, efficiently, ecologically, and cost-effectively. BU high-performance computational resources are housed at the MGHPCC, taking advantage of an abundant source of clean renewable energy from the Holyoke Gas and Electric hydroelectric power plant on the Connecticut River. This project created a world-class high-performance computing center with an emphasis on green, sustainable computing, offering unprecedented opportunities for collaboration among research, government, and industry in Massachusetts. MGHPCC is the first academic research computing center to achieve LEED Platinum certification, the highest level of green certification awarded by the Green Building Council. MGHPCC comprises 33,000 SF of space optimized for high-performance computing, a 19 MW power feed, and high-efficiency cooling to support up to 10 MW of load, with provisions for expansion to 30 MW. The MGHPCC owns the 8.6-acre site, providing substantial room for expansion. Network infrastructure includes a fiber loop passing through Boston, the Northern Crossroads (NoX), and New York City, connecting BU to the MGHPCC at 40 Gb/s. BU is also actively engaged in other national and regional consortia where we provide technical, strategic, and governance leadership.
  • David A. Lifka, 8-2019
    DAVID A. LIFKA. 234 Day Hall, Cornell University, Ithaca, NY 14853-3801. 607-254-8621; cell 607-227-5517; [email protected]

    Executive Profile: Cornell University Vice President and Chief Information Officer. IT leader and business strategist with strong leadership, team-building, and problem-solving skills. Develops, executes, and enhances a sustainable, cutting-edge IT organization that drives down costs, operates efficiently, and transforms education, research, and public service by opening new avenues for collaboration and innovation. Expert in developing and deploying optimal IT solutions defined by organizational priorities, strategic value, customer needs, and budget constraints. Over thirty years' experience in administrative and research computing as a center director and lead technologist at an Ivy League university and a national research laboratory. Invited industry speaker on emerging technologies and best business practices. Active member of the University Board of Trustees Cyber Security subcommittee of the Audit Committee, campus IT committees, advisory boards, and other leadership positions. Federal and industry grant awardee with publications and Computing and Information Science teaching experience.

    Skills: Analyzing organizational challenges and identifying opportunities; building relationships with key stakeholders in order to identify and define high-value needs; managing projects; motivating employees; defining and meeting fiscal goals; re-engineering IT functions; dealing with performance issues and celebrating successes; building mutually beneficial vendor relations; implementing commodity and emerging technologies to meet research and administrative computing needs in a timely and transparent manner. Technical skills include: solution architecture and implementation, interconnects, data management and analysis, cloud and grid computing, high performance computing, compilers, parallel and object-oriented programming languages, resource management, and high-speed networking.
  • Grid Computing and the TeraGrid GI Science Gateway
    Grid Computing and the TeraGrid GI Science Gateway. Kate Cowles, 22S:295 High Performance Computing Seminar, Nov. 29, 2007.

    Outline: 1. Example data; 2. Grid computing; 3. The TeraGrid; 4. TeraGrid Science Gateways; 5. GISolve.

    Example data: sulfur dioxide concentration data from the EPA Air Quality System database; n = 4711 observations from 870 sites in the years 1998-2005.

    Grid computing: several different definitions, all involving distributed computing; networking together computing clusters at different geographic locations to harness computing and storage resources.

    The TeraGrid: the world's largest, most comprehensive distributed cyberinfrastructure for open scientific research. It began in 2001 when NSF awarded $45 million to establish a Distributed Terascale Facility (DTF) to NCSA, SDSC, Argonne National Laboratory, and the Center for Advanced Computing Research (CACR) at the California Institute of Technology, coordinated through the Grid Infrastructure Group (GIG) at the University of Chicago. More than 250 teraflops of computing capability (May 2007), where a teraflop is a trillion (10^12) floating-point operations per second; more than 30 petabytes of online and archival data storage (May 2007), where a petabyte is a quadrillion (10^15) bytes; rapid access and retrieval over high-performance networks.
  • Placing Landmarks on the Genome
    What is XSEDE? The Extreme Science and Engineering Discovery Environment (XSEDE) is the most advanced, powerful, and robust collection of integrated digital resources and services in the world. It is a single virtual system that scientists can use to interactively share computing resources, data, and expertise. The five-year, $121 million project is supported by the National Science Foundation. It replaces and expands on the National Science Foundation TeraGrid project. More than 10,000 scientists used the TeraGrid to complete thousands of research projects, at no cost to the scientists. XSEDE continues that same sort of work—with an expanded scope, generating more knowledge, and improving our world in an even broader range of fields. XSEDE is led by the University of Illinois's National Center for Supercomputing Applications. The partnership includes:
    • Carnegie Mellon University/Pittsburgh Supercomputing Center - University of Pittsburgh
    • Center for Advanced Computing - Cornell University
    • Indiana University
    • Jülich Supercomputing Centre
    • National Center for Atmospheric Research
    • Ohio Supercomputer Center - The Ohio State University
    • Purdue University
    • Rice University
    • Shodor Education Foundation
    • Southeastern Universities Research Association
    • University of California Berkeley
    • San Diego Supercomputer Center - University of California San Diego
    • University of Chicago
    • National Center for Supercomputing Applications - University of Illinois at Urbana-Champaign
    • National Institute for Computational Sciences - University
  • Open CyberGIS Software for Geospatial Research and Education in the Big Data Era
    Available online at www.sciencedirect.com. ScienceDirect, SoftwareX 5 (2016) 1–5. www.elsevier.com/locate/softx. Open cyberGIS software for geospatial research and education in the big data era. Shaowen Wang, Yan Liu, and Anand Padmanabhan — CyberGIS Center for Advanced Digital and Spatial Studies; CyberInfrastructure and Geospatial Information Laboratory; Department of Geography and Geographic Information Science; Department of Urban and Regional Planning; Graduate School of Library and Information Science; and National Center for Supercomputing Applications, University of Illinois at Urbana–Champaign, Urbana, IL 61801, USA. Received 16 March 2015; received in revised form 3 July 2015; accepted 28 October 2015.

    Abstract: CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new-generation GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies – open access, open source, and open integration – to serve various research and education purposes of diverse geospatial communities.
  • The Seamless Cyberinfrastructure: The Challenges of Studying Users of Mass Digitization and Institutional Repositories
    The Seamless Cyberinfrastructure: The Challenges of Studying Users of Mass Digitization and Institutional Repositories. Dawn Schmitz, April 2008.

    Contents: 1 Introduction, 1; 1.1 Scope of this Report, 1; 1.2 Rationale for this Report, 2; 2 Background on Mass Digitization, 4; 2.1 Definition of Mass Digitization, 4; 2.1.1 Selection and Copyright, 6; 2.2 Anticipated Users of Mass Digitization, 7; 2.3 User Issues Related to Mass Digitization, 8; 3 Background on Institutional Repositories, 8; 3.1 Definition of Institutional Repository, 9; 3.2 Prevalence of Institutional Repositories, 11; 3.3 User Issues Related to Institutional Repositories, 11; 3.3.1 Who Is Depositing
  • What Is Cyberinfrastructure? (Craig A. Stewart et al.)
    What is Cyberinfrastructure? Craig A. Stewart* (812-855-4240), Stephen Simms (812-855-7211), Beth Plale (812-855-4373), Matthew Link (812-855-633), David Y. Hancock (812-855-4021), and Geoffrey C. Fox (812-856-7977); affiliations span the Pervasive Technology Institute, Research Technologies (UITS**), and the School of Informatics and Computing; [email protected]. *All authors at Indiana University, Bloomington, IN 47405. **UITS is University Information Technology Services.

    ABSTRACT: Cyberinfrastructure is a word commonly used but lacking a single, precise definition. One recognizes intuitively the analogy with infrastructure, and the use of cyber to refer to thinking or computing – but what exactly is cyberinfrastructure as opposed to information technology infrastructure? Indiana University has developed one of the more widely cited definitions of cyberinfrastructure: Cyberinfrastructure consists of computing systems, data storage systems, advanced instruments and data repositories, visualization environments, and people, all linked together by software and high performance networks to improve research productivity and enable breakthroughs not otherwise possible. A second definition, more inclusive of scholarship generally and...

    General Terms: Management, Performance, Design, Security, Human Factors, Standardization. Keywords: Cyberinfrastructure, e-Science, Infrastructure, Open Science Grid, TeraGrid.

    1. INTRODUCTION: The term 'cyberinfrastructure' was coined in the late 1990s, and its usage became widespread in 2003 with the publication of "Revolutionizing Science and Engineering Through Cyberinfrastructure: Report of the National Science Foundation Blue-Ribbon Advisory Panel on Cyberinfrastructure" by Atkins et al.
  • 1.2 OVERVIEW Campuses & Institutes
    1.2 OVERVIEW: Campuses & Institutes

    Carnegie Mellon has been a multifaceted institution throughout its existence. The original Carnegie Technical Schools, founded in 1900, comprised four colleges spanning the arts, sciences, and vocations. As it later evolved into the Carnegie Institute of Technology, it became a national leader in fields as disparate as engineering and drama. In 1967, it merged with the Mellon Institute for Industrial Research, resulting in the modern Carnegie Mellon University. Today, the school has branch campuses in California and the Middle East and degree programs in six countries across the world.

    Carnegie Mellon's main campus comprises 103 acres in Pittsburgh's vibrant Oakland neighborhood, where many of the city's major cultural and scientific institutions are located. Close to 11,000 students and 1,200 full-time faculty conduct their studies here, and Pittsburgh's long history as an American hub of industrial engineering, finance, and arts-oriented philanthropy provides an appropriate backdrop for the university's current innovative pursuits. The campus contains more than 50 academic, research, and administrative buildings and three libraries. Resource sharing between Carnegie Mellon and other local universities provides students with almost unlimited information resources. What's more, the campus is committed to environmental practices, building new LEED-certified dorms and buying electricity from a regional wind farm.

    MAIN CAMPUS: Pittsburgh, Pennsylvania. Founded 1900 by Andrew Carnegie; seven graduate & undergraduate schools spanning the sciences and fine arts. www.cmu.edu

    CARNEGIE MELLON SILICON VALLEY: Silicon Valley, California. Founded 2003. Master's degrees in software engineering and software management. http://sv.cmu.edu

    Carnegie Mellon West graduated its first class in August 2003, with 17 students receiving master's degrees in information technology with specializations in software engineering, e-business technology and the learning sciences.