DOCUMENT RESUME

ED 336 066 IR 015 073

TITLE High Performance Computing and Networking for Science--Background Paper.
INSTITUTION Congress of the U.S., Washington, D.C. Office of Technology Assessment.
REPORT NO OTA-BP-CIT-59
PUB DATE Sep 89
NOTE 51p.; For reports and hearings on the High Performance Computing Acts of 1989, 1990, and 1991, see ED 323 244, ED 329 226, and ED 332 693-694.
AVAILABLE FROM Superintendent of Documents, U.S. Government Printing Office, Washington, DC 20402-9325 (Stock No. 052-003-01164-6; $2.25).
PUB TYPE Information Analyses (070)

EDRS PRICE MF01/PC03 Plus Postage.
DESCRIPTORS *Computer Networks; Federal Government; Federal Legislation; *Information Networks; *Information Technology; International Programs; *National Programs; Public Policy; *Research and Development; Telecommunications
IDENTIFIERS *High Performance Computing; *National Research and Education Network

ABSTRACT The Office of Technology Assessment is conducting an assessment of the effects of new information technologies--including high performance computing, data networking, and mass data archiving--on research and development. This paper offers a view of the issues and their implications for current discussions about Federal initiatives and legislative initiatives concerning a national data communication network. The observations to date emphasize the critical importance of advanced information technology to research and development in the United States, the interconnection of these telecommunications technologies into a national system program, and the need for immediate and coordinated federal action to bring into being an advanced information technology infrastructure to support U.S. research, engineering, and education. High performance computers are discussed in detail using the Cornell Theory Center, the National Center for Supercomputing Applications, the Pittsburgh Supercomputing Center, the San Diego Supercomputer Center, and the John von Neumann National Supercomputer Center as examples. Several high performance computer facilities at the state level are also reviewed, as well as changes in the scientific computing environment, the review and renewal of the National Science Foundation (NSF) Centers, and international programs in Japan and Europe. A detailed discussion of the status of and policy issues surrounding data networking for science, focused on the proposed National Research and Education Network (NREN), concludes the document. A list of reviewers and the names and affiliations of the High Performance Computing and Networking for Science Advisory Panel are included. (DB)

HIGH PERFORMANCE COMPUTING & NETWORKING FOR SCIENCE



Office of Technology Assessment

Congressional Board of the 101st Congress

EDWARD M. KENNEDY, Massachusetts, Chairman
CLARENCE E. MILLER, Ohio, Vice Chairman

Senate
ERNEST F. HOLLINGS, South Carolina
CLAIBORNE PELL, Rhode Island
TED STEVENS, Alaska

House
MORRIS K. UDALL, Arizona
GEORGE E. BROWN, JR., California
JOHN D. DINGELL, Michigan

ORRIN G. HATCH, Utah
CHARLES E. GRASSLEY, Iowa

DON SUNDQUIST, Tennessee
AMO HOUGHTON, New York

JOHN H. GIBBONS (Nonvoting)

Advisory Council

DAVID S. POTTER, Chairman
CHASE N. PETERSON, Vice Chairman
CHARLES A. BOWSHER, General Accounting Office
MICHEL T. HALBOUTY, Michel T. Halbouty Energy Co.
NEIL E. HARL, Iowa State University
JAMES C. HUNT, University of Tennessee
HENRY KOFFLER, University of Arizona
JOSHUA LEDERBERG, Rockefeller University
WILLIAM J. PERRY
SALLY RIDE, California Space Institute
JOSEPH E. ROSS, Congressional Research Service
JOHN F.M. SIMS, Usibelli Coal Mine, Inc.

Director

JOHN H. GIBBONS

HIGH PERFORMANCE COMPUTING & NETWORKING FOR SCIENCE

BACKGROUND PAPER

CONGRESS OF THE UNITED STATES
OFFICE OF TECHNOLOGY ASSESSMENT

Recommended Citation: U.S. Congress, Office of Technology Assessment, High Performance Computing and Networking for Science--Background Paper, OTA-BP-CIT-59 (Washington, DC: U.S. Government Printing Office, September 1989).

Library of Congress Catalog Card Number 89-600758
For sale by the Superintendent of Documents, U.S. Government Printing Office, Washington, DC 20402-9325 (Order form can be found in the back of this report.)

Foreword

Information technology is fundamental to today's research and development: high performance computers for solving complex problems; high-speed data communication networks for exchanging scientific and engineering information; very large electronic archives for storing scientific and technical data; and new display technologies for visualizing the results of analyses. This background paper explores key issues concerning the Federal role in supporting national high performance computing facilities and in developing a national research and education network. It is the first publication from our assessment, Information Technology and Research, which was requested by the House Committee on Science and Technology and the Senate Committee on Commerce, Science, and Transportation. OTA gratefully acknowledges the contributions of the many experts, within and outside the government, who served as panelists, workshop participants, contractors, reviewers, detailees, and advisers for this document. As with all OTA reports, however, the content is solely the responsibility of OTA and does not necessarily constitute the consensus or endorsement of the advisory panel, workshop participants, or the Technology Assessment Board.

JOHN H. GIBBONS Director

High Performance Computing and Networking for Science Advisory Panel

John P. (Pat) Crecine, Chairman
President, Georgia Institute of Technology

Charles Bender, Director, Ohio Supercomputer Center
Charles DeLisi, Chairman, Department of Biomathematical Science, Mount Sinai School of Medicine
Deborah L. Estrin, Assistant Professor, Computer Science Department, University of Southern California
Robert Ewald, Vice President, Cray Research, Inc.
Kenneth Flamm, Senior Fellow, The Brookings Institution
Malcolm Getz, Associate Provost, Information Services & Technology, Vanderbilt University
Ira Goldstein, Vice President, Research, Open Software Foundation
Robert E. Kraut, Manager, Interpersonal Communications Group, Bell Communications Research
Lawrence Landweber, Chairman, Computer Science Department, University of Wisconsin-Madison
Carl Ledbetter, President/CEO, ETA Systems
Donald Marsh, Vice President, Technology, Contel Corp.
Michael J. McGill, Vice President, Technical Assessment & Development, OCLC Online Computer Library Center, Inc.
Kenneth W. Neves, Manager, Research & Development Program, Boeing Computer Services
Bernard O'Lear, Manager of Systems, National Center for Atmospheric Research
William Poduska, Chairman of the Board, Stellar Computer, Inc.
Elaine Rich, Director, Artificial Intelligence Lab, Microelectronics and Computer Technology Corp.
Sharon J. Rogers, University Librarian, Gelman Library, The George Washington University
William Schrader, President, NYSERNET
Kenneth Toy, Post-Graduate Research Geophysicist, Scripps Institution of Oceanography
Keith Uncapher, Vice President, Corporation for the National Research Initiatives
Al Weis, Vice President, Engineering & Scientific Computing, Data Systems Division, IBM Corp.

NOTE: OTA is grateful for the valuable assistance and thoughtful critiques provided by the advisory panel. The views expressed in this OTA background paper, however, are the sole responsibility of the Office of Technology Assessment.

OTA Project Staff--High Performance Computing

John Andelin, Assistant Director, OTA Science, Information, and Natural Resources Division

James W. Curlin, Program Manager
Communication and Information Technologies Program

Fred W. Weingarten, Project Director
Charles N. Brownstein, Senior Analyst1
Lisa Heinz, Analyst
Elizabeth I. Miller, Research Assistant

Administrative Staff
Elizabeth Emanuel, Administrative Assistant
Karolyn Swauger, Secretary
Jo Anne Price, Secretary

Other Contributors

Bill Bartelone, Legislative/Federal Program Manager, Cray Research, Inc.
Mervin Jones, Program Analyst, Defense Automation Resources Information Center
Timothy Lynagh, Supervisory Data and Program Analyst, General Services Administration

1 Detailee from NSF

List of Reviewers

Janice Abraham, Executive Director, Cornell Theory Center, Cornell University
Lee R. Alley, Assistant Vice President for Information Resources Management, Arizona State University
James Almond, Director, Center for High Performance Computing, Balcones Research Center
Julius Archibald, Chairman, Department of Computer Science, State University of New York College at Plattsburgh
J. Gary Augustson, Executive Director, Computer and Information Systems, Pennsylvania State University
Philip Austin, President, Colorado State University
Steven C. Beering, President, Purdue University
Jerry Berkman, Fortran Specialist, Central Computing Services, University of California at Berkeley
Kathleen Bernard, Director for Science Policy and Technology Programs, Cray Research, Inc.
Justin L. Bloom, President, Technology International, Inc.
Charles N. Brownstein, Executive Officer, Computing & Information Science & Engineering, National Science Foundation
Eloise E. Clark, Vice President, Academic Affairs, Bowling Green University
Paul Coleman, Professor, Institute of Geophysics and Space Physics, University of California
Michael R. Dingerson, Associate Vice Chancellor for Research and Dean of the Graduate School, University of Mississippi
Christopher Eoyang, Director, Institute for Supercomputing Research
David Farber, Professor, Computer & Information Science Department, University of Pennsylvania
Sidney Fernbach, Independent Consultant
Susan Fratkin, Director, Special Programs, NASULGC
Doug Gale, Director of Computing, Office of the Chancellor, University of Nebraska-Lincoln
Robert Gillespie, President, Gillespie, Folkner & Associates, Inc.
Eiichi Goto, Director, Computer Center, University of Tokyo
C.K. Gunsalus, Assistant Vice Chancellor for Research, University of Illinois at Urbana-Champaign
Judson M. Harper, Vice President of Research, Colorado State University
Gene Hemp, Senior Associate V.P. for Academic Affairs, University of Florida
Nobuaki Ieda, Senior Vice President, NTT America, Inc.
Hiroshi Inose, Director General, National Center for Science Information System
Heidi James, Executive Secretary, United States Activities Board, IEEE
Russell C. Jones, University Research Professor, University of Delaware
Brian Kahin, Esq., Research Affiliate on Communications Policy, Massachusetts Institute of Technology
Robert Kahn, President, Corporation for National Research Initiatives
Hisao Kanai, Executive Vice President, NEC Corporation
Hiroshi Kashiwagi, Deputy Director-General, Electrotechnical Laboratory
Lauren Kelly, Department of Commerce
Thomas Keyes, Professor of Chemistry, Boston University

(Continued on next page)

List of Reviewers (continued)

Doyle Knight, President, John von Neumann National Supercomputer Center, Consortium for Scientific Computing
Mike Levine, Co-director of the Pittsburgh Supercomputing Center, Carnegie Mellon University
George E. Lindamood, Program Director, Industry Service, Gartner Group, Inc.
M. Stuart Lynn, Vice President for Information Technologies, Cornell University
Ikuo Makino, Director, Electrical Machinery & Consumer Electronics Division, Ministry of International Trade and Industry
Richard Mandelbaum, Vice Provost for Computing, University of Rochester
Martin Massengale, Chancellor, University of Nebraska-Lincoln
Gerald W. May, President, University of New Mexico
Yoshiro Miki, Director, Policy Research Division, Science and Technology Policy Bureau, Science and Technology Agency
Takeo Miura, Senior Executive Managing Director, Hitachi, Ltd.
J. Gerald Morgan, Dean of Engineering, New Mexico State University
V. Rama Murthy, Vice Provost for Academic Affairs, University of Minnesota
Shoichi Ninomiya, Executive Director, Fujitsu Limited
Bernard O'Lear, Manager of Systems, National Center for Atmospheric Research
Ronald Orcutt, Executive Director, Project Athena, MIT
Tad Pinkerton, Director, Office of Information Technology, University of Wisconsin-Madison
Harold J. Raveche, President, Stevens Institute of Technology
Ann Redelfs, Manager of Information Services, Cornell Theory Center, Cornell University
Glenn Ricart, Director, Computer Science Center, University of Maryland at College Park
Ira Richer, Program Manager, DARPA/ISTO
John Riganati, Director of Systems Research, Supercomputer Research Center, Institute for Defense Analyses
Mike Roberts, Vice President, EDUCOM
David Roselle, President, University of Kentucky
Nora Sabelli, National Center for Supercomputing Applications, University of Illinois at Urbana-Champaign
Steven Sample, President, SUNY, Buffalo
John Sell, President, Minnesota Supercomputer Center
Hiroshi Shima, Deputy Director-General for Technology Affairs, Agency of Industrial Science and Technology, MITI
Yoshio Shimamoto, Senior Scientist (Retired), Applied Mathematics Department, Brookhaven National Laboratory
Charles Sorber, Dean, School of Engineering, University of Pittsburgh
Harvey Stone, Special Assistant to the President, University of Delaware
Dan Sulzbach, Manager, User Services, San Diego Supercomputer Center
Tatsuo Tanaka, Executive Director, Interoperability Technology Association for Information Processing, Japan
Ray Toland, President, Alabama Supercomputing Network Authority
Kenneth Tolo, Vice Provost, University of Texas at Austin
Kenneth Toy, Post-Graduate Research Geophysicist, Scripps Institution of Oceanography
August B. Turnbull, III, Provost & Vice President, Academic Affairs, Florida State University

Continued on next page

List of Reviewers (continued)

Gerald Turner, Chancellor, University of Mississippi
Douglas Van Houweling, Vice Provost for Information & Technology, University of Michigan
Anthony Villasenor, Program Manager, Science Networks, Office of Space Science and Applications, National Aeronautics and Space Administration
Hugh Walsh, Data Systems Division, IBM
Richard West, Assistant Vice President, IS&AS, University of California
Steve Wolff, Program Director for Networking, Computing & Information Science & Engineering, National Science Foundation
James Woodward, Chancellor, University of North Carolina at Charlotte
Akihiro Yoshikawa, Research Director, BRIE/IIS, University of California, Berkeley

NOTE: OTA is grateful for the valuable assistance and thoughtful critiques provided by the advisory panel. The views expressed in this OTA background paper, however, are the sole responsibility of the Office of Technology Assessment.

Contents

Chapter 1: Introduction and Overview .... 1
  Observations .... 1
  RESEARCH AND INFORMATION TECHNOLOGY--A FUTURE SCENARIO .... 1
  MAJOR ISSUES AND PROBLEMS .... 1
  NATIONAL IMPORTANCE--THE NEED FOR ACTION .... 3
    Economic Importance .... 3
    Scientific Importance .... 4
    Timing .... 4
    Users .... 5
    Collaborators .... 5
    Service Providers .... 5

Chapter 2: High Performance Computers .... 7
  WHAT IS A HIGH PERFORMANCE COMPUTER? .... 7
  HOW FAST IS FAST? .... 8
  THE NATIONAL SUPERCOMPUTER CENTERS .... 9
    The Cornell Theory Center .... 9
    The National Center for Supercomputing Applications .... 10
    Pittsburgh Supercomputing Center .... 10
    San Diego Supercomputer Center .... 10
    John von Neumann National Supercomputer Center .... 11
  OTHER HPC FACILITIES .... 11
    Minnesota Supercomputer Center .... 11
    The Ohio Supercomputer Center .... 12
    Center for High Performance Computing, Texas (CHPC) .... 12
    Alabama Supercomputer Network .... 13
    Commercial Labs .... 13
    Federal Centers .... 13
  CHANGING ENVIRONMENT .... 13
  REVIEW AND RENEWAL OF THE NSF CENTERS .... 15
  THE INTERNATIONAL ENVIRONMENT .... 16
    Japan .... 16
    Europe .... 17
    Other Nations .... 19

Chapter 3: Networks .... 21
  THE NATIONAL RESEARCH AND EDUCATION NETWORK (NREN) .... 22
    The Origins of Research Networking .... 22
    The Growing Demand for Capability and Connectivity .... 23
    The Present NREN .... 23
    Research Networking as a Strategic High Technology Infrastructure .... 25
    Federal Coordination of the Evolving Internet .... 25
    Players in the NREN .... 26
    The NREN in the International Telecommunications Environment .... 28
  Policy Issues .... 28
    Planning Amidst Uncertainty .... 29
    Network Scope and Access .... 29
    Policy and Management Structure .... 31
    Financing and Cost Recovery .... 32
    Network Use .... 33
    Longer-Term Science Policy Issues .... 33
    Technical Questions .... 34
    Federal Agency Plans: FCCSET/FRICC .... 34
    NREN Management Desiderata .... 35

Figures
  1-1. An Information Infrastructure for Research .... 2
  1-2. Distribution of Federal Supercomputers .... 14

Tables
  2-1. Some Key Academic High Performance Computer Installations .... 12
  3-1. Principal Policy Issues in Network Development .... 30
  3-2. Proposed NREN Budget .... 36

Chapter 1
Introduction and Overview

Observations

The Office of Technology Assessment is conducting an assessment of the effects of new information technologies--including high performance computing, data networking, and mass data archiving--on research and development. This background paper offers a midcourse view of the issues and discusses their implications for current discussions about Federal supercomputer initiatives and legislative initiatives concerning a national data communication network.

Our observations to date emphasize the critical importance of advanced information technology to research and development in the United States, the interconnection of these technologies into a national system (and, as a result, the tighter coupling of policy choices regarding them), and the need for immediate and coordinated Federal action to bring into being an advanced information technology infrastructure to support U.S. research, engineering, and education.

RESEARCH AND INFORMATION TECHNOLOGY--A FUTURE SCENARIO

Within the next decade, the desks and laboratory benches of most scientists and engineers will be entry points to a complex electronic web of information technologies, resources, and information services, connected together by high-speed data communication networks (see figure 1-1). These technologies will be critical to pursuing research in most fields. Through powerful computers on their desks, researchers will access a wide variety of resources, such as:

- an interconnected assortment of local campus, State and regional, national, and even international data communication networks that link users worldwide;
- specialized and general-purpose computers including supercomputers, minisupercomputers, mainframes, and a wide variety of special architectures tailored to specific applications;
- collections of application programs and software tools to help users find, modify, or develop programs to support their research;
- archival storage systems that contain specialized research databases;
- experimental apparatus--such as telescopes, environmental monitoring devices, seismographs, and so on--designed to be set up and operated remotely;
- services that support scientific communication, including electronic mail, computer conferencing systems, bulletin boards, and electronic journals;
- a "digital library" containing reference material, books, journals, pictures, sound recordings, films, software, and other types of information in electronic form; and
- specialized output facilities for displaying the results of experiments or calculations in more readily understandable and visualizable ways.

Many of these resources are already used in some form by some scientists. Thus, the scenario that is drawn is a straightforward extension of current usage. Its importance for the scientific community and for government policy stems from three trends: 1) the rapidly and continually increasing capability of the technologies; 2) the integration of these technologies into what we will refer to as an "information infrastructure"; and 3) the diffusion of information technology into the work of most scientific disciplines.

Few scientists would use all the resources and facilities listed, at least on a daily basis; and the particular choice of resources eventually made available on the network will depend on how the tastes and needs of research users evolve. However, the basic form--high-speed data networks connecting the user with a worldwide assortment of information technologies and services--is becoming a crucial foundation for scientific research in most disciplines.

MAJOR ISSUES AND PROBLEMS

Developing this system to its full potential will require considerable thought and effort on the part of government at all levels, industry, research institutions, and the scientific community itself. It will present policymakers with some difficult questions and decisions.


Figure 1-1--An Information Infrastructure for Research

[Figure: network diagram linking workstations and mainframes through data networks to supercomputers, special purpose computers, data archives, digital electronic libraries, on-line experiments, and associated services such as electronic mail, bulletin boards, and electronic journals.]

SOURCE: Office of Technology Assessment, 1989.

Scientific applications place heavy demands on technological capability. A substantial R&D component will need to accompany programs intended to advance R&D use of information technology. To realize the potential benefits of this new infrastructure, research users need advances in such areas as:

- more powerful computer designs;
- more powerful and efficient computational techniques and software;
- very high-speed switched data communications;
- improved technologies for visualizing data results and interacting with computers; and
- new methods for storing and accessing information from very large data archives.

An important characteristic of this system is that different parts of it will be funded and operated by different entities and made available to users in different ways. For example, databases could be operated by government agencies, professional societies, non-profit journals, or commercial firms. Computer facilities could similarly be operated by government, industry, or universities. The network itself already is an assemblage of pieces funded or operated by various agencies in the Federal Government; by States and regional authorities; and by local agencies, firms, and educational institutions. Keeping these components interconnected technologically and allowing users to move smoothly among the resources they need will present difficult management and policy problems.

Furthermore, the system will require significant capital investment to build and maintain, as well as specialized technical expertise to manage. How the various components are to be funded, how costs are to be allocated, and how the key components such as the network will be managed over the long term will be important questions.

Since this system as envisioned would be so widespread and fundamental to the process of research, access to it would be crucial to participation in science. Questions of access and participation are crucial to planning, management, and policymaking for the network and for many of the services attached to it.

Changes in information law brought about by the electronic revolution will create problems and conflicts for the scientific community and may influence how and by whom these technologies are used. The resolution of broader information issues such as security and privacy, intellectual property protection, access controls on sensitive information, and government dissemination practices could affect whether and how information technologies will be used by researchers and who may use them.

Finally, to the extent that, over the long run, modern information technology becomes fundamental to the research process, it will transform the very nature of that process and the institutions--libraries, laboratories, universities, and so on--that serve it. These basic changes in science would affect government both in the operation of its own laboratories and in its broader relationship as a supporter and consumer of research. Conflicts may also arise to the extent that government becomes centrally involved, both through funding and through management, with the traditionally independent and uncontrolled communication channels of science.

NATIONAL IMPORTANCE--THE NEED FOR ACTION

Over the last 5 years, Congress has become increasingly concerned about information technology and research. The National Science Foundation (NSF) has been authorized to establish supercomputer centers and a science network. Bills (S. 1067, H.R. 3131) are being considered in the Congress to authorize a major effort to plan and develop a national research and education network and to stimulate information technology use in science and education. Interest in the role information technology could play in research and education has stemmed, first, from the government's major role as a funder, user, and participant in research and, secondly, from concern for ensuring the strength and competitiveness of the U.S. economy.

Observation 1: The Federal Government needs to establish its commitment to the advanced information technology infrastructure necessary for furthering U.S. science and education. This need stems directly from the importance of science and technology to economic growth, the importance of information technology to research and development, and the critical timing for certain policy decisions.

Economic Importance

A strong national effort in science and technology is critical to the long-term economic competitiveness, national security, and social well-being of the United States. That, in the modern international economy, technological innovation is concomitant with social and economic growth is a basic assumption held in most political and economic systems in the world these days; and we will take it here as a basic premise. It has been a basic finding in many OTA studies.1 (This observation is not to suggest that technology is a panacea for all social problems, nor that serious policy problems are not often raised by its use.) Benefits from this infrastructure are expected to flow into the economy in three ways:

First, the information technology industry can benefit directly. Scientific use has always been a
1 For example, U.S. Congress, Office of Technology Assessment, Technology and the American Economic Transition, OTA-TET-283 (Washington, DC: U.S. Government Printing Office, May 1988) and Information Technology R&D: Critical Trends and Issues, OTA-CIT-268 (Washington, DC: U.S. Government Printing Office, February 1985).

major source of innovation in computers and communications technology. Packet-switched data communication, now a widely used commercial offering, was first developed by the Defense Advanced Research Projects Agency (DARPA) to support its research community. Department of Energy (DOE) national laboratories have, for many years, made contributions to supercomputer hardware and software. New initiatives to develop higher speed computers and a national science network could similarly feed new concepts back to the computer and communications industry as well as to providers of information services.

Secondly, by improving the tools and methodologies for R&D, the infrastructure will impact the research process in many critical high technology industries, such as pharmaceuticals, airframes, chemicals, consumer electronics, and many others. Innovation and, hence, international competitiveness in these key R&D-intensive sectors can be improved. The economy as a whole stands to benefit from increased technological capabilities of information systems and improved understanding of how to use them. A National Research and Education Network could be the precursor to a much broader high capacity network serving the United States, and many research applications developed for high performance computers result in techniques much more broadly applicable to commercial firms.

Scientific Importance

Research and development is, inherently, an information activity. Researchers generate, organize, and interpret information, build models, communicate, and archive results. Not surprisingly, then, they are now dependent on information technology to assist them in these tasks. Many major studies by many scientific and policy organizations over the years--as far back as the President's Science Advisory Committee (PSAC) in the middle 1960s, and as recently as a report by COSEPUP of the National Research Council published in 19882--have noted these trends and analyzed the implications for science support. The key points are as follows:

- Scientific and technical information is increasingly being generated, stored, and distributed in electronic form;
- Computer-based communications and data handling are becoming essential for accessing, manipulating, analyzing, and communicating data and research results; and,
- In many computationally intensive R&D areas, from climate research to groundwater modeling to airframe design, major advances will depend upon pushing the state of the art in high performance computing, very large databases, visualization, and other related information technologies. Some of these applications have been labeled "Grand Challenges." These projects hold promise of great social benefit, such as designing new vaccines and drugs, understanding global warming, or modeling the world economy. However, for that promise to be realized in those fields, researchers require major advances in available computational power.
- Many proposed and ongoing "big science" projects, from particle accelerators and large array radio telescopes to the NASA EOS satellite project, will create vast streams of new data that must be captured, analyzed, archived, and made available to the research community. These new demands could well overtax the capability of currently available resources.

Timing

Government decisions being made now and in the near future will shape the long-term utility and effectiveness of the information technology infrastructure for science. For example:

- NSF is renewing its multi-year commitments to all or most of the existing National Supercomputing Centers.
- Executive agencies, under the informal auspices of the Federal Research Internet Coordinating Committee (FRICC), are developing a national "backbone" network for science. Decisions made now will have long term influence on the nature of the network, its technical characteristics, its cost, its management, serv-

2 Panel on Information Technology and the Conduct of Research, Committee on Science, Engineering, and Public Policy, Information Technology and the Conduct of Research: The User's View (Washington, DC: National Academy Press, 1989).


ices available on it, access, and the information policies that will govern its use.

- The basic communications industry is in flux, as are the policies and rules by which government regulates it.
- Congress and the Executive Branch are currently considering, and in some cases have started, several new major scientific projects, including a space station, the Earth Orbiting System, the Hubble space telescope, the superconducting supercollider, human genome mapping, and so on. Technologies and policies are needed to deal with these "firehoses of data." In addition, upgrading the information infrastructure could open these projects and data streams to broad access by the research community.

Observation 2: Federal policy in this area needs to be more broadly based than has been traditional with Federal science efforts. Planning, building, and managing the information technology infrastructure requires cutting across agency programs and the discipline and mission-oriented approach of science support. In addition, many parties outside the research establishment will have important roles to play and stakes in the outcome of the effort.

The key information technologies--high performance computing centers, data communication networks, large data archives, along with a wide range of supporting software--are used in all research disciplines and support several different agency missions. In many cases, economies of scale and scope dictate that some of these technologies (e.g., supercomputers) be treated as common resources. Some, such as communication networks, are most efficiently used if shared or interconnected in some way.

There are additional scientific reasons to treat information resources as a broadly used infrastructure: fostering communication among scientists across disciplines, sharing resources and techniques, and expanding access to databases and software, for instance. However, there are very few models from the history of Federal science support for creating and maintaining infrastructure-like resources for science and technology across agency and disciplinary boundaries. Furthermore, since the networks, computer systems, databases, and so on interconnect and users must move smoothly among them, the system requires a high degree of coordination rather than being treated as simply a conglomeration of independent facilities.

However, if information technology resources for science are treated as infrastructure, a major policy issue is one of boundaries. Who is it to serve; who are its beneficiaries? Who should participate in designing it, building and operating it, providing services over it, and using it? The answers to these questions will also indicate to Congress who should be part of the policymaking and planning process; they will govern the long term scale, scope, and technological characteristics of the infrastructure itself; and they will affect the patterns of support for the facilities. Potentially interested parties include the following:

Users

Potential users might include academic and industrial researchers, teachers, graduate, undergraduate, and high school students, as well as others such as the press or public interest groups who need access to and make use of scientific information. Institutions, such as universities and colleges, libraries, and schools, also have user interests. Furthermore, foreign scientists working as part of international research teams or in firms that operate internationally will wish access to the U.S. system, which, in turn, will need to be connected with other nations' research infrastructures.

Collaborators

Another group of interested parties includes State and local governments and parts of the information industry. We have identified them with the term "collaborators" because they will be participating in funding, building, and operating the infrastructure. States are establishing State supercomputer centers and supporting local and regional networking, some computer companies participate in the NSF National Supercomputer Centers, and some telecommunication firms are involved in parts of the science network.

Service Providers

Finally, to the extent that the infrastructure serves as a basic tool for most of the research and development community, information service pro-
Planning, building, and managing the information technology infra- Potential users might include academic and indus- structure requires cutting across agency pro- trial researchers, teachers, graduate, undergraduate, grams and the discipline and mission-orionted and high school students, as well as others such as approach of science support. In addition, many the press or public interest groups who need access parties outside the research establishment will to and make use of scientific information. Institu- have important roles to play and stakes in the tions, such as universities and colleges, libraries, and outcome of the effort. schools also have user interests. Furthermore, for- eign scientists working as part of international The key information technologieshigh per- research teams or 'n firms that operate internation- formance computing centers, data communication ally will wish access to the U.S. system, which, in networks, large data archives, along with a wide turn, will need to be connected with other nation's range of supporting softwareare usedinall research infrastructures. research disciplines and support several different agency missions. In many cases, economies of scale Collaborators and scope dictate that some of these technologies Another group of interested parties include State (e.g., supercomputers) be treated as common re- end local governments and parts of the information sources. Some, such as communicationnetworks, industry. We have identified them with the term are most efficiently used if shared orinterconnected "collaborators" because they will be participating in in some way. funding, building, and operating the infrastructure. 
There are additional scientific reasons to treat States are establishing State supercomputer centers information resources as a broadly used infrastruc- and supporting local and regional networking, some computer companies participate in the NSF National ture:fostering communication among scientists between disciplines, sharing resources and tech- Supercomputer Centers, and some telecommunica- niques, and expanding access to databases and tion firms are involved in parts of the science software, for instance. However, there are very few network. models from the history of Federal science support Service Providers for creating and maintaining infrastructure-like re- sources for science and technology across agency Finally, to the extent that the infrastructure serves and disciplinary boundaries. Furthermore, since the as a basic toolfor most of the research and networks, computer systems, databases, and so on development community, information service pro-

1 "I 6 viders will require access to make their products network while protecting privacy and valuable available to scientific users. The service providers resources will require careful balancing of legal and may include government agencies (which provide technological controls. access to government scientific databases, for exam- ple),libraries and library utilities, journal and Intellectual property protection in an electronic text-book publishers, professional societies, and environment may pose difficult problems. Providers private software and dataLas,e providers. will be concerned that electronic databases. soft- ware, and even electmnic formats of printed journals Observation 3: Several irtformation policy issues and other writings will not be adequately protected. will be raised in managing and using the network. In some cases, the product, itself, may not be well Depending on how they are resolved, they could protected under existing law. In other cases elec- sharply restrict the utility and scope of network tronic formats coupled with a communications use in thl scientific community. network erode the ability to control restrictions on Security and privRcy have already become of copying and disseminating. major cc ncem and will pose a problem. In general, Access controls may be called for on material that users will want the network and the services on it to is deemed to be sensitive (although unclassified) for be as open as possible; however, they will also want reasons of national security or economic competi- the networks and services to be as robust and tiveness.Yet, the networks will be accessibie dependable as possiblefree from deliberate or worldwide and the ability to identify and control accidental disruption. Furthermore, different re- users may be limited. sources will require different levels of security. 
Some bulletin boards and electronic mail services The above observations have been broad, looking may want to be as open and public as possible; others at the overall collection of information technology may require a high level of privacy. Soni ,12tabases resources for science as an integrated system and at may be unique and vital resources that will need a the questions raised by it. The remaining portion of very high level of protection, others may not be so this paper will deal specifically with high perform- critical. Maintaining an open, easily accessible ance computers and networking.

Chapter 2

High Performance Computers


An important set of issues has been raised during the last 5 years around the topic of high performance computing (HPC). These issues stem from a growing concern in both the executive branch and in Congress that U.S. science is impeded significantly by lack of access to HPC,1 and by concerns over the competitiveness implications of new foreign technology initiatives, such as the Japanese "Fifth Generation Project." In response to these concerns, policies have been developed and promoted with three goals in mind:

1. To advance vital research applications currently hampered by lack of access to very high speed computers.
2. To accelerate the development of new HPC technology, providing enhanced tools for research and stimulating the competitiveness of the U.S. computer industry.
3. To improve software tools and techniques for using HPC, thereby enhancing their contribution to general U.S. economic competitiveness.

In 1984, the National Science Foundation (NSF) initiated a group of programs intended to improve the availability and use of high performance computers in scientific research. As the centerpiece of its initiative, after an initial phase of buying and distributing time at existing supercomputer centers, NSF established five National Supercomputer Centers.

Over the course of this and the next year, the initial multiyear contracts with the National Centers are coming to an end, which has provoked a debate about whether and, if so, in what form they should be renewed. NSF undertook an elaborate review and renewal process and announced that, depending on agency funding, it is prepared to proceed with renewing at least four of the centers.2 In thinking about the next steps in the evolution of the advanced computing program, the science agencies and Congress have asked some basic questions. Have our perceptions of the needs of research for HPC changed since the centers were started? If so, how? Have we learned anything about the effectiveness of the National Centers approach? Should the goals of the Advanced Scientific Computing (ASC) and other related Federal programs be refined or redefined? Should alternative approaches be considered, either to replace or to supplement the contributions of the centers?

OTA is presently engaged in a broad assessment of the impacts of information technology on research and, as part of that inquiry, is examining the question of scientific computational resources. It has been asked by the requesting committees for an interim paper that might help shed some light on the above questions. The full assessment will not be completed for several months, however, so this paper must confine itself to some tentative observations.

WHAT IS A HIGH PERFORMANCE COMPUTER?

The term "supercomputer" is commonly used in the press, but it is not necessarily useful for policy. In the first place, the definition of power in a computer is highly inexact and depends on many factors, including processor speed, memory size, and so on. Secondly, there is not a clear lower boundary of supercomputer power. IBM 3090 computers come in a wide range of configurations, some of the largest of which are the basis of supercomputer centers at institutions such as Cornell and the Universities of Utah and Kentucky. Finally, technology is changing rapidly, and with it our conceptions of the power and capability of various types of machines. We use the more general term, "high performance computers," a term that includes a variety of machine types.

One class of HPC consists of very large, powerful machines, principally designed for very large numerical applications such as those encountered in science. These computers are the ones often referred to as "supercomputers." They are expensive, costing up to several million dollars each.

1Peter D. Lax, Report of the Panel on Large-Scale Computing in Science and Engineering (Washington, DC: National Science Foundation, 1982).
2One of the five centers, the John von Neumann National Supercomputer Center, has been based on ETA-10 technology. The Center has been asked to resubmit a proposal showing revised plans in reaction to the withdrawal of that machine from the market.

A large-scale computer's power comes from a combination of very high-speed electronic components and specialized architecture (a term used by computer designers to describe the overall logical arrangement of the computer). Most designs use a combination of "vector processing" and "parallelism." A vector processor is an arithmetic unit of the computer that produces a series of similar calculations in an overlapping, assembly-line fashion. (Many scientific calculations can be set up in this way.)

Parallelism uses several processors, assuming that a problem can be broken into large independent pieces that can be computed on separate processors. Currently, large mainframe HPCs, such as those offered by Cray and IBM, are only modestly parallel, having as few as two and up to as many as eight processors.3 The trend is toward more parallel processors on these large systems. Some experts anticipate machines with as many as 512 processors appearing in the near future. The key problem to date has been to understand how problems can be set up to take advantage of the potential speed advantage of larger scale parallelism.

3To distinguish between this modest level and the larger scale parallelism found on some more experimental machines, some experts refer to this limited parallelism as "multiprocessing."

Several machines are now on the market that are based on the structure and logic of a large supercomputer, but use cheaper, slower electronic components. These systems make some sacrifice in speed, but cost much less to manufacture. Thus, an application that is demanding, but that does not necessarily require the resources of a full-size supercomputer, may be much more cost effective to run on such a "minisuper."

Other types of specialized systems have also appeared on the market and in the research laboratory. These machines represent attempts to obtain major gains in computation speed by means of fundamentally different architectures. They are known by colorful names such as "Hypercubes," "Connection Machines," "Dataflow Processors," "Butterfly Machines," "Neural Nets," or "Fuzzy Logic Computers." Although they differ in detail, many of these systems are based on large-scale parallelism. That is, their designers attempt to get increases in processing speed by hooking together in some way a large number (hundreds or even thousands) of simpler, slower and, hence, cheaper processors. The problem is that computational mathematicians have not yet developed a good theoretical or experiential framework for understanding, in general, how to arrange applications to take full advantage of these massively parallel systems. Hence, they are still, by and large, experimental, even though some are now on the market and users have already developed applications software for them. Experimental as these systems may seem now, many experts think that any significantly large increase in computational power eventually must grow out of experimental systems such as these or from some other form of massively parallel architecture.

Finally, "workstations," the descendants of personal desktop computers, are increasing in power; new chips now in development will offer computing power nearly equivalent to a Cray 1 supercomputer of the late 1970s. Thus, although top-end HPCs will be correspondingly more powerful, scientists who wish to do serious computing will have a much wider selection of options in the near future.

A few policy-related conclusions flow from this discussion:

- The term "supercomputer" is a fluid one, potentially covering a wide variety of machine types, and the "supercomputer industry" is similarly increasingly difficult to identify clearly.
- Scientists need access to a wide range of high performance computers, ranging from desktop workstations to full-scale supercomputers, and they need to move smoothly among these machines as their research needs dictate. Hence, government policy needs to be flexible and broadly based, not overly focused on narrowly defined classes of machines.
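The "large independent pieces" strategy behind parallelism can be illustrated with a short sketch. (Python is used here purely for illustration; the production HPC codes of this era were typically written in Fortran or C, and the chunk sizes, worker count, and kernel below are hypothetical choices for the example, not drawn from any particular machine.) One large summation is split into independent pieces, each piece is computed separately, and the partial results are combined, which is the same decomposition a parallel supercomputer applies across its processors:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker handles one independent piece of the problem.
    total = 0.0
    for x in chunk:
        total += x * x  # a simple arithmetic kernel, applied element by element
    return total

def parallel_sum_of_squares(data, workers=4):
    # Break the problem into large independent pieces...
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # ...compute the pieces concurrently on separate workers...
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(partial_sum, chunks)
    # ...and combine the partial results.
    return sum(partials)

if __name__ == "__main__":
    data = [float(i) for i in range(1000)]
    assert parallel_sum_of_squares(data) == sum(x * x for x in data)
```

The hard part, as the text notes, is not the combining step shown here but deciding how to carve a real application into pieces large and independent enough that hundreds or thousands of processors stay busy.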

HOW FAST IS FAST?

Popular comparisons of supercomputer speeds are usually based on processing speed, the measure being "FLOPS," or "floating point operations per second." The term "floating point" refers to a particular format for numbers within the computer that is used for scientific calculation, and a floating point "operation" refers to a single arithmetic step, such as adding two numbers, using the floating point format. Thus, FLOPS measure the speed of the arithmetic processor. Currently, the largest supercomputers have processing speeds ranging up to several billion FLOPS.

However, pure processing speed is not by itself a useful measure of the relative power of computers. To see why, consider an analogy. At a supermarket checkout counter, the calculation speed of the register does not, by itself, determine how fast customers can purchase their groceries and get out of the store. Rather, the speed of checkout is also affected by the rate at which each purchase can be entered into the register and the overall time it takes to complete a transaction with a customer and start a new one. Of course, ultimately, the length of time the customer must wait in line to get to the clerk may be the biggest determinant of all.

Similarly, in a computer, how fast calculations can be set up and presented to the processor, and how fast new jobs and their associated data can be moved in and completed work moved out, determine how much of the processor's speed can actually be harnessed. (Some users refer to this as "solution speed.") Those speeds are determined by a wide variety of hardware and software characteristics. And, similar to the store checkout, as a fast machine becomes busy, users may have to wait a significant time to get their turn. From a user's perspective, then, a theoretically fast computer can look very slow.

In order to fully test a machine's speed, experts use what are called "benchmark programs," sample programs that reproduce the actual workload. Since workloads vary, there are several different benchmark programs, and they are constantly being refined and revised. Measuring a supercomputer's speed is, itself, a complex and important area of research. It lends insight not only into what type of computer currently on the market is best to use for particular applications; carefully structured measurements can also show where bottlenecks occur and, hence, where hardware and software improvements need to be made.

One can draw a few policy implications from these observations on speed:

- Since overall speed improvement is closely linked with how their machines are actually programmed and used, computer designers are critically dependent on feedback from that part of the user community which is pushing their machines to the limit.
- There is no "fastest" machine. The speed of a high performance computer is too dependent on the skill with which it is used and programmed, and on the particular type of job it is being asked to perform.
- Until machines are available in the market and have been tested for overall performance, policymakers should be skeptical of announcements, based purely on processor speeds, that some company or country is producing "faster machines."
- Federal R&D programs for improving high performance computing need to stress software and computational mathematics as well as research on machine architecture.
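The gap between raw processing speed and delivered "solution speed" can be seen even in a toy, benchmark-style measurement. The sketch below (Python, purely illustrative; the loop length and arithmetic kernel are arbitrary choices for the example) counts the floating point operations a loop performs and divides by wall-clock time. Run under an interpreter, nearly all of the elapsed time is setup overhead rather than arithmetic, which is precisely the report's point: the rate a user actually observes can sit far below the hardware's theoretical peak.

```python
import time

def sustained_flops(n=100_000):
    """Crude benchmark: time a loop that performs 2 floating point
    operations (one multiply, one add) per pass, then report the
    delivered rate in operations per second."""
    acc = 0.0
    start = time.perf_counter()
    for i in range(n):
        acc += 0.5 * i  # 1 multiply + 1 add per iteration
    elapsed = time.perf_counter() - start
    flop_count = 2 * n
    return flop_count / elapsed  # sustained, not peak, FLOPS

if __name__ == "__main__":
    print(f"delivered rate: {sustained_flops():,.0f} FLOPS")
```

Real benchmark suites apply the same idea to representative workloads rather than a single loop, which is why, as noted above, they must be constantly refined as workloads change.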
THE NATIONAL SUPERCOMPUTER CENTERS

In February of 1985, NSF selected four sites to establish national supercomputing centers: the University of California at San Diego, the University of Illinois at Urbana-Champaign, Cornell University, and the John von Neumann Center in Princeton. A fifth site, Pittsburgh, was added in early 1986. The five NSF centers are described briefly below.

The Cornell Theory Center

The Cornell Theory Center is located on the campus of Cornell University. Over 1,900 users from 125 institutions access the center. Although Cornell does not have a center-oriented network, 55 academic institutions are able to utilize the resources at Cornell through special nodes. A 14-member Corporate Research Institute works within the center in a variety of university-industry cost sharing projects.

In November of 1985, Cornell received a 3084 computer from IBM, which was upgraded to a four-processor 3090/400VF a year later. The 3090/400VF was replaced by a six-processor 3090/600E in May 1987. In October 1988, a second 3090/600E was added. The Cornell center also operates several other smaller parallel systems, including an Intel iPSC/2, a Transtech NT 1000, and a Topologix T1000. Some 50 percent of the resources of the Northeast Parallel Architecture Center, which include two Connection Machines, an Encore, and an Alliant FX/80, are accessed by the Cornell facility.

Until October of 1988, all IBM computers were "on loan" to Cornell for as long as Cornell retained its NSF funding. The second IBM 3090/600, procured in October, will be paid for by an NSF grant. Over the past 4 years, corporate support for the Cornell facility accounted for 48 percent of the operating costs. During those same years, NSF and New York State accounted for 37 percent and 5 percent, respectively, of the facility's budget. This funding has allowed the center to maintain a staff of about 100.

Pittsburgh Supercomputing Center

The Pittsburgh Supercomputing Center (PSC) is run jointly by the University of Pittsburgh, Carnegie-Mellon University, and Westinghouse Electric Corp. More than 1,400 users from 44 States utilize the center. Twenty-seven universities are affiliated with PSC.

The center received a Cray X-MP/48 in March of 1986. In December of 1988, PSC became the first non-Federal laboratory to possess a Cray Y-MP. Both machines were used simultaneously for a short time; however, the center has since phased out the Cray X-MP. The center's graphics hardware includes a Pixar image computer, an Ardent Titan, and a Silicon Graphics IRIS workstation.

The operating projection at PSC for fiscal year 1990, a "typical year," has NSF supporting 58 percent of the center's budget, while industry and vendors account for 25 percent of the costs. The Commonwealth of Pennsylvania and the National Institutes of Health both support PSC, accounting for 5 percent and 4 percent of the budget, respectively. Excluding working students, the center has a staff of around 65.

The National Center for Supercomputing Applications

The National Center for Supercomputing Applications (NCSA) is operated by the University of Illinois at Urbana-Champaign. The center has over 2,500 academic users from about 82 academic affiliates. Each affiliate receives a block grant of time on the Cray X-MP/48, training for the Cray, and help using the network to access the Cray.

The NCSA received its Cray X-MP/24 in October 1985. That machine was upgraded to a Cray X-MP/48 in 1987. In October 1988, a Cray-2s/4-128 was installed, giving the center two Cray machines. This computer is the only Cray-2 now at an NSF national center. The center also houses a Connection Machine 2, an Alliant FX/80 and FX/8, and over 30 graphics workstations.

In addition to NSF funding, NCSA has solicited industrial support. Amoco, Eastman Kodak, Eli Lilly, FMC Corp., Dow Chemical, and Motorola have each contributed around $3 million over a 3-year period to the NCSA. In fiscal year 1989, corporate support amounted to 11 percent of NCSA's funding. About 32 percent of NCSA's budget came from NSF, while the State of Illinois and the University of Illinois accounted for the remaining 27 percent of the center's $21.5 million budget. The center has a full-time staff of 198.

San Diego Supercomputer Center

The San Diego Supercomputer Center (SDSC) is located on the campus of the University of California at San Diego and is operated by General Atomics. SDSC is linked to 25 consortium members but has a user base in 44 States. At the end of 1988, over 2,700 users were accessing the center. SDSC has 48 industrial partners who use the facility's hardware, software, and support staff.

A Cray X-MP/48 was installed in December 1985. SDSC's first upgrade, a Y-MP8/864, is planned for December 1989. In addition to the Cray, SDSC has 5 Sun workstations, two IRIS workstations, an Evans and Sutherland terminal, 5 Apollo workstations, a Pixar, an Ardent Titan, an SCS-40 minisupercomputer, a Supertek S-1 minisupercomputer, and two Symbolics machines.

The University of California at San Diego spends more than $250,000 a year on utilities and services for SDSC. For fiscal year 1990, the SDSC believes NSF will account for 47 percent of the center's operating budget. The State of California currently provides $1.25 million per year to the center and, in 1988, approved funding of $6 million over 3 years to SDSC for research in scientific visualization. For fiscal year 1990, the State is projected to support 10 percent of the center's costs. Industrial support, which has given the center $12.6 million in donations and in-kind services, is projected to provide 15 percent of the total costs of SDSC in fiscal year 1990.

John von Neumann National Supercomputer Center

The John von Neumann National Supercomputer Center (JvNC), located in Princeton, New Jersey, is managed by the Consortium for Scientific Computing Inc., an organization of 13 institutions from New Jersey, Pennsylvania, Massachusetts, New York, Rhode Island, Colorado, and Arizona. Currently there are over 1,400 researchers from 100 institutions accessing the center. Eight industrial corporations utilize the JvNC facilities.

At present, two Cyber 205s and two ETA-10s are in use at the JvNC. The first ETA-10 was installed, after a 1-year delay, in March of 1988. In addition to these machines, there are a Pixar II, two Silicon Graphics IRIS workstations, and video animation capabilities.

When the center was established in 1985 by NSF, the New Jersey Commission on Science and Technology committed $12.1 million to the center over a 5-year period. An additional $13.1 million has been set aside for the center by the New Jersey Commission for fiscal years 1991-1995. Direct funding from the State of New Jersey and university sources constitutes 15 percent of the center's budget for fiscal years 1991-1995; NSF will account for 60 percent of the budget; and projected industry revenue and cost sharing account for 25 percent of costs.

Since the announcement by CDC that it will close its ETA subsidiary, the future of JvNC is uncertain. Plans have been proposed to NSF by JvNC to purchase a Cray Research Y-MP, eventually upgrading to a C-90. NSF is reviewing the plan, and a decision on renewal is expected in October of 1989.

OTHER HPC FACILITIES

Before 1984, only three universities operated supercomputers: Purdue University, the University of Minnesota, and Colorado State University. The NSF supercomputing initiative established five new supercomputer centers that were nationally accessible. States and universities began funding their own supercomputer centers, both in response to growing needs on campus and to an increased feeling on the part of State leaders that supercomputer facilities could be important stimuli to local R&D and, therefore, to economic development. Now, many State and university centers offer access to high performance computers,4 and the NSF centers are only part of a much larger HPC environment that includes nearly 70 Federal installations (see table 2-1).

4The number cannot be estimated exactly. First, it depends on the definition of supercomputer one uses. Secondly, the number keeps changing as States announce new plans for centers and as large research universities purchase their own HPCs.

Table 2-1. Federal Unclassified Supercomputer Installations

Laboratory                                        Number of machines
Department of Energy
  Los Alamos National Lab ........................ 6
  Livermore National Lab, NMFECC ................. 4
  Livermore National Lab ......................... 7
  Sandia National Lab, Livermore ................. 3
  Sandia National Lab, Albuquerque ............... 2
  Oak Ridge National Lab ......................... 1
  Idaho Falls National Engineering ............... 1
  Argonne National Lab ........................... 1
  Knolls Atomic Power Lab ........................ 1
  Bettis Atomic Power Lab ........................ 1
  Savannah/DOE ................................... 1
  Richland/DOE ................................... 1
  Schenectady Naval Reactors/DOE ................. 2
  Pittsburgh Naval Reactors/DOE .................. 2
Department of Defense
  Naval Research Lab ............................. 1
  Naval Ship R&D Center .......................... 1
  Fleet Numerical Oceanography ................... 1
  Naval Underwater System Command ................ 1
  Naval Weapons Center ........................... 1
  Martin Marietta/NTB ............................ 1
  Air Force Weapons Lab .......................... 2
  Air Force Global Weather ....................... 1
  Arnold Engineering and Development ............. 1
  Wright Patterson AFB ........................... 1
  Aerospace Corp. ................................ 1
  Army Ballistic Research Lab .................... 2
  Army/Tacom ..................................... 1
  Army/Huntsville ................................ 1
  Army/Kwajalein ................................. 1
  Army/WES (on order) ............................ 1
  Army/Wenn ...................................... 1
  Defense Nuclear Agency ......................... 1
NASA
  Ames ........................................... 5
  Goddard ........................................ 2
  Lewis .......................................... 1
  Langley ........................................ 1
  Marshall ....................................... 1
Department of Commerce
  National Inst. of Standards and Technology ..... 1
  National Oceanic & Atmospheric Administration .. 4
Environmental Protection Agency
  Raleigh, North Carolina ........................ 1
Department of Health and Human Services
  National Institutes of Health .................. 1
  National Cancer Institute ...................... 1

SOURCE: Office of Technology Assessment estimate.

Supercomputer center operators perceive their roles in different ways. Some want to be a proactive force in the research community, leading the way by helping develop new applications, training users, and so on. Others are content to follow in the path that the NSF National Centers create. These differences in goals and missions lead to varied services and computer systems. Some centers are "cycle shops," offering computing time but minimal support staff. Other centers maintain a large support staff and offer consulting, training sessions, and even assistance with software development. Four representative centers are described below.

Minnesota Supercomputer Center

The Minnesota Supercomputer Center, originally part of the University of Minnesota, is a for-profit computer center owned by the University of Minnesota. Currently, several thousand researchers use the center, over 700 of whom are from the University of Minnesota. The Minnesota Supercomputing Institute, an academic unit of the University, channels university usage by providing grants to students through a peer review process.

The Minnesota Supercomputer Center received its first machine, a Cray 1A, in September 1981. In mid-1985, it installed a Cyber 205; and in the latter part of that year, two Cray 2 computers were installed within 3 months of each other. Minnesota bought its third Cray 2, the only one in use now, at the end of 1988, just after it installed its ETA-10. The ETA-10 has recently been decommissioned due to the closure of ETA. A Cray X-MP has been added, giving the center a total of two supercomputers. The Minnesota Supercomputer Center has acquired more supercomputers than anyone outside the Federal Government.

The Minnesota State Legislature provides funds to the University for the purchase of supercomputer time. Although the University buys a substantial portion of supercomputing time, the center has many industrial clients. Their identities are proprietary, but they include representatives of the auto, aerospace, petroleum, and electronics industries. They are charged a fee for the use of the facility.

The Ohio Supercomputer Center

The Ohio Supercomputer Center (OSC) originated from a coalition of scientists in the State. The center's offices, located on Ohio State University's campus, are connected to 20 other Ohio universities via the Ohio Academic Research Network (OARNET). As of January 1989, three private firms were using the center's resources.

In August 1987, OSC installed a Cray X-MP/24, which was upgraded to a Cray X-MP/28 a year later. The center replaced the X-MP in August 1989 with a Cray Research Y-MP. In addition to Cray hardware, there are 40 Sun graphics workstations, a Pixar II, a Stellar graphics machine, a Silicon Graphics workstation, and an Abekas Still Store machine. The center maintains a staff of about 35 people.

The Ohio General Assembly began funding the center in the summer of 1987, appropriating $7.5 million. In March of 1988, the Assembly allocated $22 million for the acquisition of a Cray Y-MP. Ohio State University has pledged $8.2 million to augment the center's budget. As of February 1989, the State had spent $37.7 million in funding.5 OSC's annual budget is around $6 million (not including the purchase or leasing of their Cray).

5Ware, "Ohioans: Bluing Computer," Ohio, February 1989, p. 12.

Center for High Performance Computing at Texas (CHPC)

The Center for High Performance Computing is located at The University of Texas at Austin. CHPC serves all 14 institutions, 8 academic institutions and 6 health-related organizations, in the University of Texas System.

The University of Texas installed a Cray X-MP/24 in March 1986, and a Cray 14se in November of 1988. The X-MP is used primarily for research. For the time being, the Cray 14se is being used as a vehicle for the conversion of users to the Unix system. About 40 people staff the center.

Original funding for the center and the Cray X-MP came from bonds and endowments from both the University of Texas System and the University of Texas at Austin. The annual budget of CHPC is about $3 million. About 95 percent of the center's operating budget comes from State funding and endowments; the remaining 5 percent of the costs is recovered by selling CPU time.

Alabama Supercomputer Network

The George C. Wallace Supercomputer Center, located in Huntsville, Alabama, serves the needs of researchers throughout Alabama. Through the Alabama Supercomputer Network, 13 Alabama institutions, university and government sites, are connected to the center. Under contract to the State, Boeing Computer Services provides the support and technical skills to operate the center. Support staff are located at each of the nodes to help facilitate the use of the supercomputer from remote sites.

A Cray X-MP/24 arrived in 1987 and became operational in early 1988. In 1987, the State of Alabama agreed to finance the center, allocating $2.2 million for the center and $38 million to Boeing Services for the initial 5 years. The average yearly budget is $7 million. The center has a support staff of about 25.

Alabama universities are guaranteed 60 percent of the available time at no cost, while commercial researchers are charged a user fee. The impetus for the State to create a supercomputer center has been stated as the technical superiority a supercomputer would bring, which would draw high-tech industry to the State, enhance interaction between industry and the universities, and promote research and the associated educational programs within the university.

Commercial Labs

A few corporations, such as the Boeing Computer Corp., have been selling high performance computer time for a while. Boeing operates a Cray X-MP/24. Other commercial sellers of high performance computing time include the Houston Area Research Center (HARC), which operates the only Japanese supercomputer in America, the NEC SX-2; the center offers remote services.

Computer Sciences Corp. (CSC), located in Falls Church, Virginia, has a 16-processor FLEX/32 from Flexible Computer Corp., a Convex 120 from Convex Computer Corp., and a DAP 210 from Active Memory Technology. Federal agencies comprise two-thirds of CSC's customers.6 Power Computing Co., located in Dallas, Texas, offers time on a Cray X-MP/24. Situated in Houston, Texas, Supercomputing Technology sells time on its Cray X-MP/28. Opticom Corp., of San Jose, California, offers time on a Cray X-MP/24, a Cray 1-M, a Convex C220, and a C1 XP.

Federal Centers

In an informal poll of Federal agencies, OTA staff identified 70 unclassified installations that operate supercomputers, confirming the commonly expressed view that the Federal Government still represents a major part of the market for HPC in the United States (see figure 2-1). Many of these centers serve the research needs of government scientists and engineers and are, thus, part of the total research computing environment. Some are available to non-Federal scientists; others are closed.

CHANGING ENVIRONMENT

The scientific computing environment has changed in important ways during the few years that NSF's Advanced Scientific Computing programs have existed. Some of these changes are as follows:

The ASC programs, themselves, have not evolved as originally planned. The original NSF planning document for the ASC program proposed to establish 10 supercomputer centers over a 3-year period; only 5 were funded. Center managers have also expressed the strong opinion that NSF has not met many of its original commitments for

6 Norris Parker Smith, "More Than Just Buying Cycles," Supercomputer Review, April 1989.

Figure 2-1: Distribution of Federal Supercomputers

[Bar chart showing the number of unclassified Federal supercomputers by agency: DOE, 33; DoD, 19; NASA, 10; Commerce, 5; HHS, 2; EPA, 1.]

SOURCE: Office of Technology Assessment, 1989.

Technology has changed. There has been a burst of innovation in the HPC industry. At the top of the line, Cray Research developed two lines of machines, the Cray 2 and the Cray X-MP (and its successor, the Y-MP), that are much more powerful than the Cray 1, which was considered the leading edge of supercomputing for several years by the mid-1980s. IBM has delivered several 3090s equipped with multiple vector processors and has also become a partner in a project to develop a new supercomputer in a joint venture with SSI, a firm started by Steve Chen, a noted supercomputer architect previously with Cray Research.

More recently, major changes have occurred in the industry. Control Data has closed down ETA, its supercomputer operation. Cray Research has been broken into two parts: Cray Computer Corp. and Cray Research. Each will develop and market a different line of supercomputers. Cray Research will, initially at least, concentrate on the Y-MP models, the upcoming C-90 machines, and their longer term successors. Cray Computer Corp., under the leadership of Seymour Cray, will concentrate on development of the Cray 3, a machine based on gallium arsenide electronics.

At the middle and lower end, the HPC industry has introduced several new so-called "minisupercomputers," many of them based on radically different system concepts, such as massive parallelism, and many designed for specific applications, such as high-speed graphics. New chips promise very high-speed desktop workstations in the near future.

Finally, three Japanese manufacturers, NEC, Fujitsu, and Hitachi, have been successfully building and marketing supercomputers that are reportedly competitive in performance with U.S. machines.7 While these machines have, as yet, not penetrated the U.S. computer market, they indicate the potential competitiveness of the Japanese computer industry in international HPC markets, and raise questions for U.S. policy.

Many universities and State systems have established "supercomputer centers" to serve the needs of their researchers.8 Many of these centers have only recently been formed, and some have not yet installed their systems, so their operational experience is, at best, limited to date. Furthermore, some other centers operate systems that, while very powerful scientific machines, are not considered by all experts to be supercomputers. Nevertheless, these centers provide high performance scientific computing to the research community, and create new demands for Federal support for computer time.

Individual scientists and research teams are also getting Federal and private support from their sponsors to buy their own "minisupercomputers." In some cases, these systems are used to develop and check out software eventually destined to run on larger machines; in other cases, researchers seem to find these machines adequate for their needs. In either mode of use, these departmental or laboratory systems expand the range of possible sources researchers can turn to for high performance computing. Soon, desktop workstations will have performance equivalent to that of supercomputers of a decade ago at a significantly lower cost.

7 As shown above, comparing the power and performance of supercomputers is a complex and arcane field; OTA will refrain from comparing or ranking systems in any absolute sense.
8 See National Association of State Universities and Land-Grant Colleges, Supercomputing for the 1990's: A Shared Responsibility (Washington, DC: 1989).

Finally, some important changes have occurred in national objectives or perceptions of issues. For example, the development of a very high capacity national science network (or "internet") has taken on a much greater significance. Originally conceived of in the narrow context of tying together supercomputer centers and providing regional access to them, the science network has now come to be thought of by its proponents as a basic infrastructure, potentially extending throughout (and, perhaps, even beyond) the entire scientific, technical, and educational community.

Science policy is also changing, as important new and costly projects have been started or are being seriously considered. Projects such as the supercollider, the space station, NASA's Earth Observing System (EOS) program, and the human genome mapping may seem at first glance to compete for funding with science networks and supercomputers. However, they will create formidable new demands for computation, data communications, and data storage facilities; and, hence, constitute additional arguments for investments in an information technology infrastructure.

Finally, some of the research areas in the so-called "Grand Challenges"9 have attained even greater social importance, such as fluid flow modeling, which will help the design of faster and more fuel efficient planes and ships; climate modeling, to help understand long term weather patterns; and the structural analysis of proteins, to help understand diseases and design vaccines and drugs to fight them.

REVIEW AND RENEWAL OF THE NSF CENTERS

Based on the recent review, NSF has concluded that the centers, by and large, have been successful and are operating smoothly. That is, their systems are being fully used, they have trained many new users, and they are producing good science. In light of that conclusion, NSF has tentatively agreed to renewal for the three Cray-based centers and the IBM-based Cornell Center. The John von Neumann Center in Princeton has been based on ETA-10 computers. Since ETA was closed down, NSF put the review of the JvNC on hold pending review of a revised plan that has now been submitted. A decision is expected soon.

Due to the environmental changes noted above, if the centers are to continue in their present status as special NSF-sponsored facilities, the National Supercomputer Centers will need to sharply define their roles in terms of: 1) the users they intend to serve, 2) the types of applications they serve, and 3) the appropriate balance between service, education, and research.

The NSF centers are only a few of a growing number of facilities that provide access to HPC resources. Assuming that NSF's basic objective is to assure researchers access to the most appropriate computing for their work, it will be under increasing pressure to justify dedicating limited funds to one group of facilities. Five years ago, few U.S. academic supercomputer centers existed. When scientific demand was less, managerial attention was focused on the immediate problem of getting equipment installed and of developing an experienced user community. Under those circumstances, some ambiguity of purpose may have been acceptable and understandable. However, in light of the proliferation of alternative technologies and centers, as well as growing demand by researchers, unless the purposes of the National Centers are more clearly delineated, the facilities are at risk of being asked to serve too many roles and, as a result, serving none well.

Some examples of possible choices are as follows:

1. Provide Access to HPC
- Provide access to the most powerful, leading edge supercomputers available.
- Serve the HPC requirements of research projects of critical importance to the Federal Government, for example, the "Grand Challenge" topics.
- Serve the needs of all NSF-funded researchers for HPC.
- Serve the needs of the (academic, educational, and/or industrial) scientific community for HPC.

9 "Grand Challenge" research topics are questions of major social importance that require for progress substantially greater computing resources than are currently available. The term was first coined by Nobel Laureate physicist Kenneth Wilson.

2. Educate and Train
- Provide facilities and programs to teach scientists and students how to use high performance computing in their research.

3. Advance the State of HPC Use in Research
- Develop applications and system software.
- Serve as centers for research in computational science.
- Work with vendors as test sites for advanced HPC systems.

As the use of HPC expands into more fields and among more researchers, what are the policies for providing access to the necessary computing resources? The Federal Government needs to develop a comprehensive analysis of the requirements of scientific researchers for high performance computing, Federal policies of support for scientific computing, and the variety of Federal and State/private computing facilities available for research. We expect that OTA's final report will contribute to this analysis from a congressional perspective. However, the executive branch, including both lead agencies and OSTP, also needs to participate actively in this policy and planning process.

THE INTERNATIONAL ENVIRONMENT

Since some of the policy debate over HPC has involved comparison with foreign programs, this section will conclude with a brief description of the status of HPC in some other nations.

Japan

The Ministry of International Trade and Industry (MITI), in October of 1981, announced the undertaking of two computing projects, one on artificial intelligence, the Fifth Generation Computer Project, and one on supercomputing, the National Superspeed Computer Project. The publicity surrounding MITI's announcement focused on fifth generation computers, but brought the more general subject of supercomputing to the public attention. (The term "Fifth Generation" refers to computers specially designed for artificial intelligence applications, especially those that involve logical inference or "reasoning.")

Although in the eyes of many scientists the Fifth Generation project has fallen short of its original goals, eight years later it has produced some accomplishments in hardware architecture and artificial intelligence software. MITI's second project, dealing with supercomputers, has been more successful. Since 1981, when no supercomputers were manufactured by the Japanese, three companies have designed and produced supercomputers.

The Japanese manufacturers followed the Americans into the supercomputer market, yet in the short time since their entrance, late 1983 for Hitachi and Fujitsu, they have rapidly gained ground in HPC hardware. One company, NEC, has recently announced a supercomputer with processor speeds up to eight times faster than the present fastest American machine.10 Outside of the United States, Japan is the single biggest market for and supplier of supercomputers, although American supercomputer companies account for less than one-fifth of all supercomputers sold in Japan.11

In the present generation of supercomputers, U.S. machines have some advantages. One of the American manufacturers' major advantages is the availability of scientific applications software. The Japanese lag behind the Americans in software development, although resources are being devoted to research in software by the Japanese manufacturers and government, and there is no reason to think they will not be successful.

Another area in which American firms differ from the Japanese has been in their use of multiprocessor architecture (although this picture is now changing). For several years, American supercomputer companies have been designing machines with multiple processors to obtain speed. The only Japanese supercomputer that utilizes multiprocessors is the NEC system, which will not be available until the fall of 1990.
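Multiprocessor designs obtain speed only to the extent that a program's work can actually be spread across processors. A minimal sketch of Amdahl's law (an illustration added here, not a formula from the report) makes the tradeoff concrete:

```python
def amdahl_speedup(p, n):
    """Theoretical speedup on n processors when a fraction p of the
    program's work can run in parallel (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a 95-percent-parallel program tops out below a 20x speedup,
# no matter how many processors are added.
for n in (2, 8, 1024):
    print(n, round(amdahl_speedup(0.95, n), 2))
```

This bound helps explain why the very fast single-vector processors of the Japanese machines and the multiprocessor approach of the American vendors were, at the time, competing routes to the same goal.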

10 The NEC machine is not scheduled for delivery until 1990, at which time faster Cray computers may well be on the market also. See also the comments above about computer speed.
11 Marjorie Sun, "A Global Supercomputer Race for High Stakes," Science, February 1989, vol. 243, pp. 1004-1006.


American firms have been active in the Japanese market, with mixed success.

Since 1979 Cray has sold 16 machines in Japan. Of the 16 machines, 6 went to automobile manufacturers, 2 to NTT, 2 to Recruit, 1 to MITI, 1 to Toshiba, 1 to Aichi Institute of Technology, and 1 to Mitsubishi Electric. None have gone to public universities or to government agencies.

IBM offers its 3090 with attached vector facilities. IBM does not make public its customers, but reports that it has sold around 70 vector processor computers to Japanese clients. Some owners, or soon-to-be owners, include Nissan, NTT, Mazda, Waseda University, Nippon Steel, and Mitsubishi Electric.

ETA sold two supercomputers in Japan. The first was to the Tokyo Institute of Technology (TIT). The sale was important because it was the first sale of a CDC/ETA supercomputer to the Japanese as well as the first purchase of an American supercomputer by a Japanese national university. This machine was delivered late (it arrived in May of 1988) and had many operating problems, partially due to its being the first installation of an eight-processor ETA 10-E. The second machine was purchased (not delivered) on February 9, 1989 by the University of Meiji. How CDC will deal with the ETA 10 at TIT in light of the closure of ETA is unknown at this time.

Hitachi, Fujitsu, and NEC, the three Japanese manufacturers of supercomputers, are among the largest computer/electronics companies in Japan, and they produce their own semiconductors. Their size allows them to absorb the high initial costs of designing a new supercomputer, as well as provide large discounts to customers. Japan's technological lead is in its very fast single-vector processors. Little is known, as of yet, about what is happening with parallel processing in Japan, although NEC's recent product announcement for the SX-X states that the machine will have multiprocessors.

Hitachi's supercomputer architecture is loosely based on its IBM compatible mainframe. Hitachi entered the market in November of 1983. Unlike its domestic rivals, Hitachi has not entered the international market. All 29 of its ordered/installed supercomputers are located in Japan.

NEC's current supercomputer architecture is not based on its mainframe computer and it is not IBM compatible. NEC entered the supercomputer market later than Hitachi and Fujitsu. Three NEC supercomputers have been sold/installed in foreign markets: one in the United States, an SX-2 machine at the Houston Area Research Consortium; one at the Laboratory of Aerospace Research in the Netherlands; and an SX-1 recently sold in Singapore. Its domestic users include five universities.

On April 10, 1989, in a joint venture with Honeywell Inc., NEC announced a new line of supercomputers, the SX-X. The most powerful machine is reported to be up to eight times faster than the Cray X-MP machine. The SX-X reportedly will run Unix-based software and will have multiprocessors. This machine is due to be shipped in the fall of 1990.

Fujitsu's supercomputer, like Hitachi's, is based on its IBM compatible mainframes. Its first machine was delivered in late 1983. Fujitsu had sold 80 supercomputers in Japan by mid-1989. An estimated 17 machines have been sold to foreign customers. An Amdahl VP-200 is used at the Western Geophysical Institute in London. In the United States, the Norwegian company GECO, located in Houston, has a VP-200 and two VP-100s. The most recent sale was to the Australian National University, a VP-100.

Europe

European countries that have (or have ordered) supercomputers include West Germany, France, England, Denmark, Spain, Norway, the Netherlands, Italy, Finland, Switzerland, and Belgium. Europe is catching up quickly with America and Japan in understanding the importance of high performance computing for science and industry. The computer industry is helping to stimulate European interest. For example, IBM has pledged $40 million towards a supercomputer initiative in Europe over the 2-year period between 1987-89. It is creating a large base of followers in the European academic community by participating in such programs as the European Academic Supercomputing Initiative (EASI) and the Numerically Intensive Computing Enterprise (NICE). Cray Research also has a solid base in
academic Europe, supplying over 14 supercomputers to European universities.

The United Kingdom began implementing a high performance computing plan in 1985. The Joint Working Party on Advanced Research Computing's report of June 1985, "Future Facilities for Advanced Research Computing," recommended a national facility for advanced research computing. This center would have the most powerful supercomputer available; upgrade the United Kingdom's networking system, JANET, to ensure communications to remote users; and house a national organization for advanced research computing to promote collaboration with foreign countries and within industry, ensuring the effective use of these resources.12 Following this report, a Cray X-MP/48 was installed at the Atlas Computer Center in Rutherford. A Cray 1s was installed at the University of London. Between 1986 and 1989, some $11.5 million was spent on upgrading and enhancing JANET.13

Alvey was the United Kingdom's key information technology R&D program. The program promoted projects in information technology undertaken jointly by industry and academics. The United Kingdom began funding the Alvey program in 1983. During the first 5 years, 350 million pounds were allocated to the Alvey program. The program was eliminated at the end of 1988. Some research was picked up by other agencies, and many of the projects that were sponsored by Alvey are now submitting proposals to Esprit (see below).

The European Community began funding the European Strategic Programme for Research in Information Technology (Esprit) in 1984, partly as a reaction to the poor performance of the European Economic Community in the information technology market and partly as a response to MITI's 1981 computer programs. The program, funded by the European Community (EC), intends to "provide the European IT industry with the key components of technology it needs to be competitive on the world markets within a decade."14 The EC has designed a program that forces collaboration between nations, develops recognizable standards in the information technology industry, and promotes pre-competitive R&D. The R&D focuses on five main areas: microelectronics, software development, office systems, computer integrated manufacturing, and advanced information processing.

Phase I of Esprit, the first 5 years, received $3.88 billion in funding.15 The funding was split 50-50 by the EC and its participants. This was considered the catch-up phase. Emphasis was placed on basic research, realizing that marketable goods will follow. Many of the companies that participated in Phase I were small experimental companies.

Phase II, which begins in late 1989, is called commercialization. Marketable goods will be the major emphasis of Phase II. This implies that the larger firms will be the main industrial participants, since they have the capital needed to put a product on the market. The amount of funds for Phase II will be determined by the world environment in information technology and the results of Phase I, but has been estimated at around $4.14 billion.16

Almost all of the high performance computer technologies emerging from Europe have been based on massively parallel architectures. Some of Europe's parallel machines incorporate the transputer. Transputer technology (basically a computer on a chip) is based on high density VLSI (very large-scale integration) chips. The T800, Inmos's transputer, has the same power as Intel's 80386/80387 chip pair, the difference being in size and price: the transputer is about one-third the size and price of Intel's chips.17 The transputer, created by the Inmos company, had its initial R&D funded by the British government. Eventually Thorn EMI bought Inmos and the rights to the transputer. Thorn EMI recently sold Inmos to a French-Italian joint venture company, SGS-Thomson, just as it was beginning to be profitable.

12 "Future Facilities for Advanced Research Computing," the report of a Joint Working Party on Advanced Research Computing, United Kingdom, July 1985.
13 Discussion paper on "Supercomputers in Australia," Department of Industry, Technology and Commerce, April 1988, pp. 14-15.
14 "Esprit," Commission of the European Communities, p. 5.
15 "Esprit," Commission of the European Communities, p. 21.
16 Simon Perry, "European Team Effort Breaks Ground in Software Standards," Electronic Business, Aug. 15, 1988, pp. 90-91.
17 Graham K. Ellis, "Transputers Advance Parallel Processing," Research and Development, March 1989, p. 50.

Some of the more notable high performance computer products and R&D efforts in Europe include:

T.Node, formerly called Supernode P1085, is one of the more successful endeavors of the Esprit program. T.Node is a massively parallel machine that exploits the Inmos T800 transputer. A single node is composed of 16 transputers connected by two NEC VLSI chips and two additional transputers. The participants in the project are the University of Southampton, the Royal Signals and Radar Establishment, Thorn-EMI (all British), and the French firm Telmat. The prototype of the French T.Node, Marie, a massively parallel MIMD (multiple instruction, multiple data) computer, was delivered in April of 1988. The product is now being marketed in America.

Project 415 is also funded by Esprit. Its project leader is Philips, the Dutch electronics group. This project, which consists of six groups, focuses on symbolic computation, artificial intelligence (AI), rather than the "number crunching" (mathematical operations) of conventional supercomputers. Using parallel architecture, the project is developing operating systems and languages that they hope will be available in 5 years for the office environment.18

The Flagship project, originally sponsored by the Alvey program, has created a prototype parallel machine using 15 processors. Its original participants were ICL, Imperial College, and the University of Manchester. Other Alvey projects worked with the Flagship project in designing operating systems and languages for the computer. By 1992 the project hopes to have a marketable product. Since cancellation of the Alvey program, Flagship has gained sponsorship from the Esprit program.

The Suprenum project of West Germany, with the help of the French Isis program, is currently creating machinery with massively parallel architecture. The parallelism, based on Intel's 80386 microprocessors, is one of Esprit's more controversial and ambitious projects. Originally the project was sponsored by the West German government in its supercomputing program. A computer prototype was recently shown at the industry fair in Hanover. It will be marketed in Germany by the end of the year for around $14 million.

The Supercluster, produced and manufactured by Parsytec GmbH, a small private company, exemplifies Silicon Valley-style initiative occurring in West Germany. Parsytec has received some financial backing from the West German government for its venture. This start-up firm sells a massively parallel machine that rivals superminicomputers or low-end supercomputers. The Supercluster architecture exploits the 32-bit transputer from Inmos, the T800. Sixteen transputer-based processors in clusters of four are linked together. This architecture is less costly than conventional machines, costing between $230,000 and $320,000.19 Parsytec has just begun to market its product in America.

Other Nations

The Australian National University recently purchased a Fujitsu VP-100. A private service bureau in Australia, Leading Edge, possesses a Cray Research computer. At least two sites in India have supercomputers, one at the Indian Meteorological Centre and one at ISC University. Two Middle Eastern petroleum companies house supercomputers, and Korea and Singapore both have research institutes with supercomputers.

Over half a dozen Canadian universities have high performance computers from CDC, Cray Research, or IBM. Canada's private sector has also invested in supercomputers. Around 10 firms possess high performance computers. The Alberta government, aside from purchasing a supercomputer and supporting associated services, has helped finance Myrias Computer Corp. A wholly owned U.S. subsidiary, Myrias Research Corp., manufactures the SP-2, a minisupercomputer.

One newly industrialized country is reported to be developing a minisupercomputer of its own.

"Julia Vowier, Sivercomputing Review, "European Transpines-based Projects issueChallenge to U.S. Superoomputing Supennacy," November/Dec=1w 1988, pp. 8-9, 19John Gosh, "A New Transputer Design From West GermanStartup," Electronks, Mar, 3, 1988, pp. 71-72.

The first Brazilian minisupercomputer, claimed to be capable of 150 mips, is planned to be available by the end of 1989. The prototype is a parallel machine with 64 processors, each with 32-bit capacity. The machine will sell for $2.5 million. The Funding Authority of Studies and Projects (FINEP) financed the project, with annual investment around $1 million.

Chapter 3

Networks

IMIMIMMINIEMP 411=11,

Information is the lifeblood of science; communication of that information is crucial to the advance of research and its applications. Data communication networks enable scientists to talk with each other, access unique experimental data, share results and publications, and run models on remote supercomputers, all with a speed, capacity, and ease that make possible the posing of new questions and the prospect of new answers. Networks ease research collaboration by removing geographic barriers. They have become an invaluable research tool, opening up new channels of communication and increasing access to research equipment and facilities. Most important, networking is becoming the indispensable foundation for all other use of information technology in research.

Research networking is also pushing the frontiers of data communications and network technologies. Like electric power, highways, and the telephone, data communications is an infrastructure that will be crucial to all sectors of the economy. Businesses demand on-line transaction processing, and financial markets run on globally networked electronic trading. The evolution of telephony to digital technology allows the merging of voice, data, and information services networking, although voice circuits still dominate the deployment of the technology. Promoting scientific research networking, which deals with data-intense outputs like satellite imaging and supercomputer modeling, should push networking technology that will find application far outside of science.

Policy action is needed if Congress wishes to see the evolution of a full-scale national research and education network. The existing "internet" of scientific networks is a fledgling. As this conglomeration of networks evolves from an R&D enterprise to an operational network, users will demand round-the-clock, high-quality service. Academics, policymakers, and researchers around the world agree on the pressing need to transform it into a permanent infrastructure. This will entail grappling with difficult issues of public and private roles in funding, management, pricing/cost recovery, access, security, and international coordination, as well as assuring adequate funding to carry out initiatives that are set by Congress.

Research networking faces two particular policy complications. First, since the network in its broadest form serves most disciplines, agencies, and many different groups of users, it has no obvious lead champion. As a common resource, its potential sponsors may each be pleased to use it but unlikely to give it the priority and funding required to bring it to its full potential. There is a need for clear central leadership, as well as coordination of governments, the private sector, and universities. A second complication is a mismatch between the concept of a national research network and the traditionally decentralized, subsidized, mixed public-private nature of higher education and science. The processes and priorities of mission agency-based Federal support may need some redesigning, as they are oriented towards supporting ongoing mission-oriented and basic research, and may work less well at fostering large-scale scientific facilities and infrastructure that cut across disciplines and agency missions.

In the near term, the most important step is getting a widely connected, operational network in place. But such "bare bones" networks are a small part of the picture. Information that flows over the network, and the scientific resources and data available through the network, are the important payoffs. Key long-term issues for the research community will be those that affect the sort of information available over the network, who has access to it, and how much it costs. The main issue areas for scientific data networking are outlined below:

- research: to develop the technology required to transmit and switch data at very high rates;
- private sector participation: the role of the common carriers and telecommunication companies in developing and managing the network, and of private information firms in offering services;
- scope: who the network is designed to serve will drive its structure and management;
- access: balancing open use against security and information control, and determining who will be able to gain access to the network for what purpose;

- standards: the role of government, industry, users, and international organizations in setting and maintaining technical standards;
- management: public and private roles; the degree of decentralization;
- funding: an operational network will require significant, stable, continuing investment; the financial responsibilities demarcated must reflect the interests of the various players, from individual colleges through the States and the Federal Government, in their stake in network operations and policies;
- economics: pricing and cost recovery for network use, central to the evolution and management of any infrastructure. Economics will drive the use of the network;
- information services: who will decide what types of services are to be allowed over the network; who is allowed to offer them; and who will resolve information issues such as privacy, intellectual property, fair competition, and security;
- long-term science policy issues: the networks' impacts on the process of science, and on access to and dissemination of valuable scientific and technical information.

THE NATIONAL RESEARCH AND EDUCATION NETWORK (NREN)

"A universal communications network connected to national and international networks enables electronic communication among scholars anywhere in the world, as well as access to worldwide information sources, special experimental instruments, and computing resources. The network has sufficient bandwidth for scholarly resources to appear to be attached to a world local area network." EDUCOM, 1988.

"..a national research network to provide a distributed computing capability that links the government, industry, and higher education communities." OSTP, 1987.

"The goal of the National Research and Education Network is to enhance national competitiveness and productivity through a high-speed, high-quality network infrastructure which supports a broad set of applications and network services for the research and instructional community." EDUCOM/NTTF, March 1989.

"The NREN will provide high-speed communication access to over 1300 institutions across the United States within five years. It will offer sufficient capacity, performance, and functionality so that the physical distance between institutions is no longer a barrier to effective collaboration. It will support access to high-performance computing facilities and services...and advanced information sharing and exchange, including national file systems and online libraries....the NREN will evolve toward fully supported commercial facilities that support a broad range of applications and services." FRICC, Program Plan for the NREN, May 23, 1989.

This chapter of the background paper reviews the status of and issues surrounding data networking for science, in particular the proposed NREN. It describes current Federal activities and plans, and identifies issues to be examined in the full report, to be completed in summer 1990.

The existing array of scientific networks consists of a hierarchy of local, regional, and national networks, linked into a whole. In this paper, "NREN" will be used to describe the next generation of the national "backbone" that ties them together. The term "Internet" is used to describe a more specific set of interconnected major networks, all of which use the same data transmission protocols. The most important are NSFNET and its major regional subnetworks, ARPANET, and several other federally initiated networks such as ESNET and NASNET. The term internet is used fairly loosely. At its broadest, the more generic term internet can be used to describe the international conglomeration of networks, with a variety of protocols and capabilities, which have a gateway into the Internet; this could include such things as BITNET and MCI Mail.

The Origins of Research Networking

Research users were among the first to link computers into networks, to share information and broaden remote access to computing resources. DARPA created ARPANET in the 1960s for two purposes: to advance networking and data communications R&D, and to develop a robust communications network that would support the data-rich conversations of computer scientists. Building on


the resulting packet-switched network technology, other agencies developed specialized networks for their research communities (e.g., ESNET, CSNET, NSFNET). The telecommunications and electronics industries provided technology and capacity for these networks, but they were not policy leaders or innovators of new systems. Meanwhile, other research-oriented networks, such as BITNET and Usenet, were developed in parallel by academic and industry users who, not being grantees or contractors of Federal agencies, were not served by the agency-sponsored networks. These university and lab-based networks serve a relatively small number of specialized scientific users, a market that has been ignored by the traditional telecommunications industry. The networks sprang from the efforts of users--academic and other research scientists--and the Federal managers who were supporting them.1

The Growing Demand for Capability and Connectivity

Today there are thousands of computer networks in the United States. These networks range from temporary linkages between modem-equipped2 desktop computers linked via common carriers, to institution-wide area networks, to regional and national networks. Network traffic moves through different media, including copper wire and optical cables, signal processors and switches, satellites, and the vast common carrier system developed for voice communication. Much of this hodgepodge of networks has been linked (at least in terms of ability to interconnect) into the internet. The ability of any two systems to interconnect depends on their ability to recognize and deal with the form information flows take in each. These "protocols" are sets of technical standards that, in a sense, are the "languages" of communication systems. Networks with different protocols can often be linked together by computer-based "gateways" that translate the protocols between the networks.

National networks have partially coalesced, where technology allows cost savings without losing connectivity. Over the past years, several agencies have pooled funds and plans to support a shared national backbone. The primary driver for this interconnecting and coalescing of networks has been the need for connectivity among users. The power of the whole is vastly greater than the sum of the pieces. Substantial costs are saved by extending connectivity while reducing duplication of network coverage. The real payoff is in connecting people, information, and resources. Linking brings users in reach of each other. Just as telephones would be of little use if only a few people had them, a research and education network's connectivity is central to its usefulness; this connectivity comes both from the ability of each network to reach the desks, labs, and homes of its users and from the extent to which the various networks are, themselves, interconnected.

The Present NREN

The national research and education network can be viewed as four levels of increasingly complex and flexible capability:

- physical wire/fiber optic common carrier "highways";
- user-defined, packet-switched networks;
- basic network operations and services; and
- research, education, database, and information services accessible to network users.

In a fully developed NREN, all of these levels of service must be integrated. Each level involves different technologies, services, research opportunities, engineering requirements, clientele, providers, regulators, and policy issues. A more detailed look at the policy problems can be drawn by separating the NREN into its major components.

Level 1: Physical wire/fiber optic common carrier highways

The foundation of the network is the physical conduits that carry digital signals. These telephone wires, optical fibers, microwave links, and satellites are the physical highways and byways of data transit. They are invisible to the network user. To provide the physical skeleton for the internet, government, industry, and university network
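The protocol-translating role of a gateway, described above, can be sketched in miniature. Both wire formats and all names below are invented for illustration; they are not drawn from any real network protocol of the period:

```python
# A toy "gateway" between two incompatible message formats. The formats
# and field names are hypothetical, standing in for real protocols.

def encode_a(message: dict) -> str:
    """Wire format of imaginary network A: semicolon-separated fields."""
    return f"TO:{message['to']};FROM:{message['from']};BODY:{message['body']}"

def encode_b(message: dict) -> str:
    """Wire format of imaginary network B: a header line, then the body."""
    return f"{message['from']} > {message['to']}\n{message['body']}"

def gateway_a_to_b(wire_a: str) -> str:
    """Parse an A-format message and re-emit it in B's format, as a
    gateway translates traffic between networks with different protocols."""
    fields = dict(part.split(":", 1) for part in wire_a.split(";"))
    message = {"to": fields["TO"], "from": fields["FROM"], "body": fields["BODY"]}
    return encode_b(message)

wire = encode_a({"to": "user@site1", "from": "user@site2", "body": "data follows"})
print(gateway_a_to_b(wire))
```

Real gateways of the era performed exactly this kind of translation (for example, between BITNET and Internet mail), though with far more bookkeeping for addressing, routing, and error handling.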

1John S. Quarterman and Josiah C. Hoskins, "Notable Computer Networks," Communications of the ACM, vol. 29, No. 10, October 1986, pp. 932-971; John S. Quarterman, The Matrix: Networks Around the World, Digital Press, August 1989.
2A "modem" converts information in a computer to a form that a communication system can carry, and vice versa. It also automates some simple functions, such as dialing and answering the phone, and detecting and correcting transmission errors.


managers lease circuits from public switched common carriers, such as AT&T, MCI, GTE, and NTN. In doing so they take advantage of the large system of circuits already laid in place by the telecommunications common carriers for other telephony and data markets. A key issue at this level is to what extent broader Federal agency and national telecommunications policies will promote, discourage, or divert the evolution of a research-oriented data network.

Level 2: User-defined subnetworks

The internet is a conglomeration of smaller foreign, regional, State, local, topical, private, government, and agency networks. Generally, these separately managed networks, such as SURANET, BARRNET, BITNET, and EARN, evolved along naturally occurring geographic, topical, or user lines, or mission agency needs. Most of these logical networks emerged from Federal research agency (including the Department of Defense) initiatives. In addition, there are more and more commercial, State and private, regional, and university networks (such as Accunet, Telenet, and Usenet), at the same time specialized and interlinked. Many have since linked through the Internet, while keeping to some extent their own technical and socioeconomic identity. This division into small, focused networks offers the advantage of keeping network management close to its users, but demands standardization and some central coordination to realize the benefits of interconnection.

Networks at this level of operations are distinguished by independent management and technical boundaries. Networks often have different standards and protocols, hardware, and software. They carry information of different sensitivity and value. The diversity of these logical subnetworks matters to institutional subscribers (who must choose among network offerings), to regional and national network managers (who must manage and coordinate these networks into an internet), and to users (who can find the variety of alternatives confusing and difficult to deal with). A key issue is the management relationship among these diverse networks: to what extent is standardization and centralization desirable?

Level 3: Basic network operations and services

A small number of basic maintenance tools keeps the network running and accessible by diverse, distributed users. These basic services are software-based, provided for the users by network operators and computer manufacturers in operating systems. They include software for password recognition, electronic mail, and file transfer. These are core services necessary to the operation of any network, yet they are not consistent across the current range of computers used by researchers. A key issue is to what extent these services should be standardized, and as important, who should make those decisions.

Level 4: Value-added superstructure: links to research, education, and information services

The utility of the network lies in the information, services, and people that the user can access through the network. These value-added services provide specialized tools, information, and data for research and education. Today they include specialized computers and software, library catalogs and publication databases, archives of research data, conferencing systems, and electronic bulletin boards and publishing services that provide access to colleagues in the United States and abroad. These information resources are provided by volunteer scientists and by non-profit, for-profit, international, and government organizations. Some are amateur, poorly maintained bulletin boards; others are mature information organizations with well-developed services. Some are "free"; others recover costs through user charges.

Core policy issues are the appropriate roles for various information providers on the network. If the network is viewed as public infrastructure, what is "fair" use of this infrastructure? If the network eases access to sensitive scientific data (whether raw research data or government regulatory databases), how will this stress the policies that govern the relationships of industry, regulators, lobbyists, and experts? Should profit-seeking companies be allowed to market their services? How can we ensure that technologies needed for network maintenance, cost accounting, and monitoring will not be used inappropriately or intrusively? Who should set prices for various users and services? How will intellectual property rights be structured for electronically available information? Who is responsible
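The four levels can be pictured as successive wrappers around a user's request, each level supplying what the level below carries. The sketch below is purely illustrative: the service names, packet size, and JSON encoding are modern conveniences for the sake of a runnable example, not anything specified in the NREN plans.

```python
import json

# Level 4: a value-added service request (here, an imaginary catalog query).
def level4_request(query: str) -> dict:
    return {"service": "library-catalog", "query": query}

# Level 3: basic operations and services -- attach an identified user,
# standing in for password recognition, mail, and file-transfer software.
def level3_session(request: dict, user: str) -> dict:
    return {"user": user, "request": request}

# Level 2: a packet-switched subnetwork splits the session into packets.
def level2_packets(session: dict, size: int = 16) -> list:
    data = json.dumps(session)
    return [data[i:i + size] for i in range(0, len(data), size)]

# Level 1: the physical circuit just carries the bits, here in order.
def level1_transmit(packets: list) -> str:
    return "".join(packets)

packets = level2_packets(level3_session(level4_request("supernova data"), "researcher1"))
print(f"{len(packets)} packets sent; reassembled: {level1_transmit(packets)}")
```

The point of the layering is the same one the text makes: each level can change independently (new circuits, new packet protocols, new services) so long as the interfaces between levels hold still.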


for the quality and integrity of the data provided and used by researchers on the network?

Research Networking as a Strategic High Technology Infrastructure

Research networking has dual roles. First, networking is a strategic, high technology infrastructure for science. More broadly applied, data networking enables research, education, business, and manufacturing, and improves the Nation's knowledge competitiveness. Second, networking technologies and applications are themselves a substantial growth area, meriting focused R&D.

Knowledge is the commerce of education and research. Today networks are the highways for information and ideas. They expand access to computing, data, instruments, the research community, and the knowledge they create. Data are expensive (relative to computing hardware) and are increasingly created in many widely distributed locations, by specialized instruments and enterprises, and then shared among many separate users. The more effectively that research information is disseminated to other researchers and to industry, the more effective is scientific progress and the social application of technological knowledge. An internet of networks has become a strategic infrastructure for research.

The research networks are also a testbed for data communications technology. Technologies developed through the research networks are likely to enhance productivity of all economic sectors, not just university research. The federally supported Internet has not only sponsored frontier-breaking network research, but has pulled data-networking technology with it. ARPANET catalyzed the development of packet-switching technology, which has expanded rapidly from R&D networking to multibillion-dollar data handling for business and financial transactions. The generic technologies developed for the Internet--hardware (such as high-speed switches) and software for network management, routing, and user interfaces--will transfer readily into general data-networking applications. Government support for applied research can catalyze and integrate R&D, decrease risk, create markets for network technologies and services, transcend economic and regulatory barriers, and accelerate early technology development and deployment. This would not only bolster U.S. science and education, but would fuel industry R&D and help support the market and competitiveness of the U.S. network and information services industry.

Governments and private industries the world over are developing research networks, to enhance R&D productivity and to create testbeds for highly advanced communications services and technologies. Federal involvement in infrastructure is motivated by the need for coordination and nationally oriented investment, to spread financial burdens, and to promote social policy goals (such as furthering basic research).3 Nations that develop markets in network-based technologies and services will create information industry-based productivity growth.

Federal Coordination of the Evolving Internet

NREN plans have evolved rapidly. Congressional interest has grown; in 1986, Congress requested the Office of Science and Technology Policy (OSTP) to report on options for networking for research and supercomputing.4 The resulting report, completed in 1987 by the interagency Federal Coordinating Council for Science, Engineering, and Technology (FCCSET), called for a new Federal program to create an advanced national research network by the year 2000.5 This vision incorporated two objectives: 1) providing vital computer-communications network services for the Nation's academic research community, and 2) stimulating networking and communications R&D which would fuel U.S. industrial technology and commerce in the growing global data communications market.

The 1987 FCCSET report, building on ongoing Federal activities, addressed near-term questions over the national network's scope, purposes, agency authority, performance targets, and budget. It did not resolve issues surrounding the long-term operation of a network, the role of commercial services in

3Congressional Budget Office, New Directions for the Nation's Public Works, September 1988, p. xiii; CBO, Federal Policies for Infrastructure Management, June 1986.
4P.L. 99-383, Aug. 21, 1986.
5OSTP, A Research and Development Strategy for High Performance Computing, Nov. 20, 1987.

providing network operations and services, or interface with broader telecommunications policies.

A 1988 National Research Council report praised ongoing activities, and emphasized the need for coordination, stable funding, broadened goals and design criteria, integrated management, and increased private sector involvement.6

FCCSET's Subcommittee on Networking has since issued a plan to upgrade and expand the network.7 In developing this plan, agencies have worked together to improve and interconnect several existing networks. Most regional networks were joint creations of NSF and regional consortia, and have been part of the NSFNET world since their inception. Other quasi-private, State, and regional networks (such as CICNET, Inc., and CERFNET) have been started.

Recently, legislation has been reintroduced to authorize and coordinate a national research network.8 As now proposed, a National Research and Education Network would link universities, national laboratories, non-profit institutions and government research organizations, private companies doing government-supported research and education, and facilities--such as supercomputers, experimental instruments, databases, and research libraries. Network research, as a joint endeavor with industry, would create and transfer technology for eventual commercial exploitation, and serve the data-networking needs of research and higher education into the next century.

Players in the NREN

The current Internet has been created by Federal leadership and funding, pulling together a wide base of university commitment, national lab and academic expertise, and industry interest and technology. The NREN involves many public and private actors, and their roles must be better delineated for effective policy. Each of these actors has vested interests and spheres of capabilities. Key players are:

- universities, which house most end users;
- the networking industry--the telecommunications, data communications, computer, and information service companies that provide networking technologies and services;
- State enterprises devoted to economic development, research, and education;
- industrial R&D labs (network users); and
- the Federal Government, primarily the national labs and research-funding agencies.

Federal funding and policy have stimulated the development of the Internet. Federal initiatives have been well complemented by States (through funding State networking and State universities' institutional and regional networking), universities (by funding campus networking), and industry (by contributing networking technology and physical circuits at sharply reduced rates). End users have experienced a highly subsidized service during this "experimental" stage. As the network moves to a bigger, more expensive, more established operation, how might these relative roles change?

Universities

Academic institutions house teachers, researchers, and students in all fields. Over the past few decades universities have invested heavily in libraries, local computing, campus networks, and regional network consortia. The money invested in campus networking far outweighs the investment in the NSFNET backbone. In general, academics view the NREN as fulfillment of a longstanding ambition to build a national system for the transport of information for research and education. EDUCOM has long labored from the "bottom" up, bringing together researchers and educators who used networks (or believed they could use them) for both research and teaching.

Networking Industry

There is no simple unified view of the NREN in the fragmented telecommunications "industry." The long-distance telecommunications common carriers generally see the academic market as too specialized and risky to offer much of a profit opportunity.

6National Research Council, Toward a National Research Network (Washington, DC: National Academy Press, 1988), especially pp. 25-37.
7FCCSET, or Federal Coordinating Council for Science, Engineering, and Technology, The Federal High Performance Computing Program, Washington, DC: OSTP, Sept. 8, 1989.
8S. 1067, "The National High-Performance Computer Technology Act of 1989," May 1989, introduced by Mr. Gore. Hearings were held on June 21, 1989. H.R. 3131, "The National High-Performance Computer Technology Act of 1989," introduced by Mr. Walgren.


However, companies have gained early experience with new technologies and applications by participating in university R&D; it is for this reason that industry has jointly funded the creation and development of NSFNET.

Various specialized value-added common carriers offer packet-switched services. They could in principle provide some of the same services that the NREN would provide, such as electronic mail. They are not, however, designed to meet the capacity requirements of researchers, such as transferring vast files of supercomputer-generated visualizations of weather systems, simulated airplane test flights, or econometric models. Nor can the common carriers provide the "reach" to all researchers.

States

The interests of States in research, education, and economic development parallel Federal concerns. Some States have also invested in information infrastructure development. Many States have invested heavily in education and research networking, usually based in the State university system and encompassing, to varying degrees, private universities, State government, and industry. The State is a "natural" political boundary for network financing. In some States, such as Alabama, New York, North Carolina, and Texas, special initiatives have helped create statewide networks.

Industry Users

There are relatively few industry users of the internet; most are very large R&D-intensive companies such as IBM and DEC, or small high-technology companies. Many large companies have internal business and research networks which link their offices and laboratories within the United States and overseas; many also subscribe to commercial services such as MCI Mail. However, these proprietary and commercial networks do not provide the internet's connectivity to scientists or the high bandwidth and services so useful for research communications. Like universities and national labs, companies are a part of the Nation's R&D endeavor, and being part of the research community today includes being "on" the internet. Appropriate industry use of the NREN should encourage interaction of industry, university, and government researchers, and foster technology transfer. Industry internet users bring with them their own set of concerns, such as cost accounting, proper network use, and information security. Other non-R&D companies, such as business analysts, also are likely to seek direct network connectivity to universities, government laboratories, and R&D-intensive companies.

Federal

Three strong rationales--support of mission and basic science, coordination of a strategic national infrastructure, and promotion of data-networking technology and industrial productivity--drive a substantial, albeit changing, Federal involvement. Another more modest goal is to rationalize duplication of effort by integrating, extending, and modernizing existing research networks. That is in itself quite important in the present Federal budgetary environment. The international nature of the network also demands a coherent national voice in international telecommunications standardization. The Internet's integration with foreign networks also justifies Federal concern over the international flow of militarily or economically sensitive technical information. The same university-government-industry linkages on a domestic scale drive Federal interests in the flow of information.

Federal R&D agencies' interest in research networking is to enhance their external research support missions. (Research networking is a small, specialized part of agency telecommunications. It is designed to meet the needs of the research community, rather than the agency operations and administrative telecommunications that are addressed in FTS 2000.) The hardware and software communications technologies involved should be of broad commercial importance. The NREN plans reflect national interest in bolstering a serious R&D base and a competitive industry in advanced computer communications.

The dominance of the Federal Government in network development means that Federal agency interests have strongly influenced its form and shape. Policies can reflect Federal biases; for instance, the limitation of access to the early ARPANET to ARPA contractors left out many academics, who consequently created their own grass-roots, lower-capability BITNET.


International actors are also important. As with the telephone system, the internet is inherently international. These links require coordination, for example for connectivity standards, higher level network management, and security. This requirement implies the need for Federal-level management and policy.

The NREN in the International Telecommunications Environment

The nature and economics of an NREN will depend on the international telecommunications context in which it develops. Research networks are a leading edge of digital network technologies, but are only a tiny part of the communications and information services markets.

The 1990s will be a predominantly digital world; historically different computing, telephony, and business communications technologies are evolving into new information-intensive systems. Digital technologies are promoting systems and market integration. Telecommunications in the 1990s will revolve around flexible, powerful, "intelligent" networks. However, regulatory change and uncertainty, market turbulence, international competition, the explosion in information services, and significant changes in foreign telecommunications policies are all making telecommunications services more turbulent. This will cloud the research network's long-term planning.

High-bandwidth, packet-switched networking is at present a young market in comparison to commercial telecommunications. Voice overwhelmingly dominates other services (e.g., fax, e-mail, on-line data retrieval). While flexible, hybrid voice-data services are being introduced in response to business demand for data services, the technology base is optimized for voice telephony.

Voice communications brings to the world of computer telecommunications complex regulatory and economic baggage. Divestiture of the AT&T regulated monopoly opened the telecommunications market to new entrants, who have slowly gained long-haul market share and offered new technologies and information services. In general, however, the post-divestiture telecommunications industry remains dominated by the descendants of the old AT&T, and most of the impetus for service innovations comes from the voice market. One reason is uncertainty about the legal limits, for providing information services, imposed on the newly divested companies. (In comparison, the computer industry has been unregulated. With the infancy of the technology, and open markets, computer R&D has been exceptionally productive.) A crucial concern for long-range NREN planning is that scientific and educational needs might be ignored among the regulations, technology priorities, and economics of a telecommunications market geared toward the vast telephone customer base.

POLICY ISSUES

The goal is clear, but the environment is complex, and the details will be debated as the network evolves.

There is substantial agreement in the scientific and higher education community about the pressing national need for a broad-reaching, broad-bandwidth, state-of-the-art research network. The existing Internet provides vital communication, research, and information services, in addition to its concomitant role in pushing networking and data handling technology. Increasing demand on network capacity has quickly saturated each network upgrade. In addition, the fast-growing demand is overburdening the current informal administrative arrangements for running the Internet. Expanded capability and connectivity will require substantial budget increases. The current network is adequate for broad e-mail service and for more restricted file transfer, remote logon, and other sophisticated uses. Moving to gigabit bandwidth, with appropriate network services, will demand substantial technological innovation as well as investment.

There are areas of disagreement and even broader areas of uncertainty in planning the future national research network. There are several reasons for this: the immaturity of data network technology, services, and markets; the Internet's nature as strategic infrastructure for diverse users and institutions; and the uncertainties and complexities of overriding telecommunications policy and economics.

First, the current Internet is, to an extent, an experiment in progress, similar to the early days of the telephone system. Technologies, uses, and potential markets for network services are still nascent.


Patterns of use are still evolving, and a reliable network has reached barely half of the research community. Future uses of the network are difficult to identify; each upgrade over the past 15 years has brought increased value and use as improved network capacity and access have made new applications feasible.

The Internet is a conglomeration of networks that grew up ad hoc. Some, such as ARPANET, CSNET, and MFENET, were high-quality national networks supported by substantial Federal funding. Other smaller networks were built and maintained by the late-night labors of graduate students and computer center operators. One of these, BITNET, has become a far-reaching and widely used university network, through the coordination of EDUCOM and the support of IBM. The Internet has since become a more coherent whole, under Federal coordination led by NSF and DARPA and advised by the Internet Activities Board. Improvements in service and connectivity have been astounding. Yet the patchwork nature of the Internet still dominates; some campus and regional networks are high-quality and well maintained; others are lower speed, less reliable, and reach only a few institutions in their region. Some small networks are gatewayed into the Internet; others are not. This patchwork nature limits the effectiveness of the Internet, and argues for better planning and stronger coordination.

Second, the network is a strategic infrastructure, with all the difficulties in capitalizing, planning, financing, and maintaining that seem to attend any infrastructure.9 Infrastructures tend to suffer from a "commons" problem, leading to continuing underinvestment and conflict over centralized policy. By its nature the internet has many diverse users, with diverse interests in and demands on the network. The network's value is in linking and balancing the needs of these many users, whether they want advanced supercomputer services or merely e-mail. Some users are network-sophisticated, while many want simple, user-friendly communications. This diversity of users complicates network planning and management. The scope and offerings of the network must be at least sketched out before a management structure appropriate to the desired mission is established.

Third, the network is part of the telecommunications world, rampant with policy and economic confusion. The research community is small, with specialized data needs that are subsidiary to larger markets. It is not clear that science's particular networking needs will be met.

Planning Amidst Uncertainty

Given these three large uncertainties, there is no straightforward or well-accepted model for the "best" way to design, manage, and upgrade the future national research network. Future network use will depend on cost recovery and charging practices, about which very little is understood. These uncertainties should be accommodated in the design of network management as well as of the network itself.

One way to clarify NREN options might be to look at experiences with other infrastructures (e.g., waterways, telephones, highways) for lessons about how different financing and charging policies affect who develops and deploys technology, how fast technology develops, and who has access to the infrastructure. Additionally, some universities are beginning trials in charging for network services; these should provide experience in how various charging practices affect usage, technology deployment and upgrading, and how network use policies affect research and education at the level of the institution.

Table 3-1 lists the major areas of agreement and disagreement in various "models" of the proper form of network evolution.

Network Scope and Access

Scope

Where should an NREN reach: beyond research-intensive government laboratories and universities to all institutions of higher education? High schools? Nonprofit and corporate labs? Many believe that eventually--perhaps in 20 years--de facto data networking will provide universal linkage, akin to a sophisticated phone system.

9 Congressional Budget Office, New Directions for the Nation's Public Works, September 1988; National Council on Public Works Improvement, Fragile Foundations: A Report on America's Public Works, Washington, DC, February 1988.


Table 3-1--Principal Policy Issues in Network Development

Major areas of agreement | Major areas of disagreement and uncertainty

Scope and access
1. The national need for a broad state-of-the-art research network that links basic research, government, and higher education.
1a. The exact scope of the NREN; whether and how to control domestic and foreign access.
1b. Hierarchy of network capability. Cost and effort limit the reach of state-of-the-art networking; an "appropriate networking" scenario would have the most intensive users on a leading edge network and less demanding users on a lower-cost network that suffices for their needs. Where should those lines be drawn, and who should draw them? How can the Federal Government ensure that the gap between leading-edge and casual users is not too large, and that access is appropriate and equitable?

Policy and management structure
2. The need for a more formal mechanism for planning and operating the NREN, to supersede and better coordinate informal interagency cooperation and ad hoc university and State participation, and for international coordination.
2a. The form and function of an NREN policy and management authority; the extent of centralization, particularly the role of the Federal Government; the extent of participation of industry users, networking industry, common carriers, and universities in policy and operations; mechanisms for standard setting.

Financing and cost recovery
3. The desirability of moving from the current "market-establishing" environment of Federal and State grants and subsidies, with services "free" to users, to more formal cost recovery, shifting more of the cost burden and financial incentives to end users.
3a. How the transition to commercial operations and charging can and should be made; more generally, Federal-private sector roles in network policy and pricing; how pricing practices will shape access, use, and demand.

Network use
4. The desirability of realizing the potential of a network; the need for standards and policies to link to information services, databases, and nonresearch networks.
4a. Who should be able to use the network, for what purposes, and at what entry cost; the process of guiding the economic structure of services, subsidies, and pricing for multi-product services; intellectual property policies.

SOURCE: Office of Technology Assessment, 1989.

The appropriate breadth of the network is unlikely to be fully resolved until more user communities gain more experience with networking, and a better understanding is gained of the risks and benefits of various degrees of network coverage. A balance must be struck in network scope, which provides a small network optimized for special users (such as scientists doing full-time, computationally intensive research) and also a broader network serving more diverse users. The scope of the Internet, and the capabilities of the networks encompassed in the Internet, will need to balance the needs of specialized users without diluting the value for top-end and low-end users. NREN plans, standards, and technology should take into account the possibility of later expansion and integration with other networks and communities currently not linked up. After-the-fact technical patches are usually inefficient and expensive. This may require more government participation in standard-setting to make it feasible for currently separated communities, such as high schools and universities, to interconnect later on.

Industry-academic boundaries are of particular concern. Interconnection generally promotes research and innovation. Companies are dealing with the risk of proprietary information release by maintaining independent corporate networks and by restricting access to open networks. How can funding and pricing be structured to ensure that for-profit companies bear an appropriate burden of network costs?

Access

Is it desirable to restrict access to the Internet? Who should control access? Open access is desired by many, but there are privacy, security, and other commercial arguments for restricting access. Restricting access is difficult, and is determined more by access controls (e.g., passwords and monitoring)

on the computers that attach users to the network than by the network itself. Study is needed on whether and how access can be controlled by technical fixes within the network, by computer centers attached to the network, informal codes of behavior, or laws.

Another approach is not to limit access, but to minimize the vulnerability of the network--and its information resources and users--to accidents or malice. In comparison, essentially anyone who has a modest amount of money can install a phone, or use a public phone, or use a friend's phone, and access the national phone system. However, criminal, fraudulent, and harassing uses of the phone system are illegal. Access is unrestricted, but use is governed.

Controlling International Linkages

Science, business, and industry are international; their networks are inherently international. It is difficult to block private telecommunications links with foreign entities, and public telecommunications is already international. However, there is a fundamental conflict between the desire to capture information for national or corporate economic gain, and the inherent openness of a network. Scientists generally argue that open network access fosters scientifically valuable knowledge exchange, which in turn leads to commercially valuable innovation.

Hierarchy of Network Capability

Investment in expanded network access must be balanced continually with the upgrading of network performance. As the network is a significant competitive advantage in research and higher education, access to the "best" network possible is important. There are also technological considerations in linking networks of various performance levels and various architectures. There is already a consensus that there should be a separate testbed or research network for developing and testing new network technologies and services, which will truly be at the cutting edge (and therefore also have the weaknesses of cutting edge technology, particularly unreliability and difficulty of use).

Policy and Management Structure

Possible management models include: federally chartered nonprofit corporations, single lead agencies, interagency consortium, government-owned contractor operations, commercial operations; and the Tennessee Valley Authority, Atomic Energy Commission, the NSF Antarctic Program, and Fannie Mae. What are the implications of various scenarios for the nature of traffic and users?

Degree of Centralization

What is the value of centralized, federally accountable management for network access control, traffic management and monitoring, and security, compared to the value of decentralized operations, open access and traffic? There are two key technical questions here: to what extent does network technology limit the amount of control that can be exerted over access and traffic content? To what extent does technology affect the strengths and weaknesses of centralized and decentralized management?

Mechanisms for Interagency Coordination

Interagency coordination has worked well so far, but with the scaling up of the network, more formal mechanisms are needed to deal with larger budgets and to more tightly coordinate further development.

Coordination With Other Networks

National-level resource allocation and planning must coordinate with interdependent institutional and mid-level networking (the other two legs of networking).

Mechanisms for Standard Setting

Who should set standards, when should they be set, and how overarching should they be? Standards at some common-denominator level are absolutely necessary to make networks work. But excessive standardization may deter innovation in network technology, applications and services, and other standards.

Any one set of standards usually is optimal for some applications or users, but not for others. There are well-established international mechanisms for formal standards-setting, as well as strong international involvement in more informal standards


development. These mechanisms have worked well, albeit slowly. Early standard-setting by agencies and their advisers accelerated the development of U.S. networks. In many cases the early established standards have become, with some modification, de facto national and even international standards. This is proving the case with ARPANET's protocol suite, TCP/IP. However, many have complained that agencies' relatively precipitous and closed standards determination has resulted in less-than-satisfactory standards. NREN policy should embrace standards-setting. Should it, however, encourage wider participation, especially by industry, than has been the case? U.S. policy must balance the need for international compatibility with the furthering of national interests.

Financing and Cost Recovery

How can the capital and operating costs of the NREN be met? Issues include subsidies, user or access charges, cost recovery policies, and cost accounting. As an infrastructure that spans disciplines and sectors, the NREN is outside the traditional grant mechanisms of science policy. How might NREN economics be structured to meet costs and achieve various policy goals, such as encouraging widespread yet efficient use, ensuring equity of access, pushing technological development while maintaining needed standards, protecting intellectual property and sensitive information while encouraging open communication, and attracting U.S. commercial involvement and third-party information services?

Creating a Market

One of the key issues centers around the extent to which deliberate creation of a market should be built into network policy, and into the surrounding science policy system. There are those who believe that it is important that the delivery of network access and services to academics eventually become a commercial operation, and that the current Federal subsidy and apparently "free" services will get academics so used to free services that there will never be a market. How do you gradually create an information market, for networks, or for network-accessible value-added services?

Funding and Charge Structures

Financing issues are akin to ones in more traditional infrastructures, such as highways and waterways. These issues, which continue to dominate infrastructure debates, are Federal-private sector roles and the structure of Federal subsidies and incentives (usually to restructure payments and access to infrastructure services). Is there a continuing role for Federal subsidies? How can university accounting, OMB Circular A-21, and cost recovery practices be accommodated?

User fees for network access are currently charged as membership/access fees to institutions. End users generally are not charged. In the future, user fees may combine access/connectivity fees and use-related fees. They may be secured via a trust fund (as is the case with national highways, inland waterways, and airports), or be returned directly to operating authorities. A few regional networks (e.g., CICNET, Inc.) have set membership/connectivity fees to recover full costs. Many fear that user fees are not adequate for full funding/cost recovery.

Industry Participation

Industry has had a substantial financial role in network development. Industry participation has been motivated by a desire to stay abreast of data-networking technology as well as a desire to develop a niche in potential markets for research networking. It is thus desirable to have significant industry participation in the development of the NREN. Industry participation does several things: industry cost sharing makes the projects financially feasible; industry has the installed long-haul telecommunications base to build on; and industry involvement in R&D should foster technology transfer and, generally, the competitiveness of the U.S. telecommunications industry. Industry in-kind contributions to NSFNET, primarily from MCI and IBM, are estimated at $40 million to $50 million, compared to NSF's 5-year, $14 million budget.10 It is anticipated that the value of industry cost sharing (e.g., donated switches, lines, or software) for NREN would be on the order of hundreds of millions of dollars.
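The contrast drawn above between membership/access fees and use-related fees can be sketched numerically. Everything in the sketch below is a hypothetical illustration; the institutions, traffic volumes, and cost figure are invented and do not come from the report.

```python
# Hypothetical sketch: how flat membership fees versus usage-based fees
# allocate a network's annual cost across institutions whose traffic
# volumes differ widely. All numbers are invented for illustration.

ANNUAL_COST = 1_000_000  # dollars per year to run the network (invented)

# Annual traffic per institution, in gigabytes (invented)
usage_gb = {
    "national laboratory": 900,
    "mid-size university": 90,
    "small college": 10,
}

flat_fee = ANNUAL_COST / len(usage_gb)          # everyone pays the same
total_gb = sum(usage_gb.values())
usage_fee = {site: ANNUAL_COST * gb / total_gb  # pay in proportion to use
             for site, gb in usage_gb.items()}

for site in usage_gb:
    print(f"{site}: flat ${flat_fee:,.0f} vs usage-based ${usage_fee[site]:,.0f}")
```

Under the flat fee the small college effectively subsidizes the heavy user; under the usage fee the heavy user bears nearly the whole cost. That shift in who pays is precisely the policy tension this section describes.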

10 Eliot Marshall, "NSF Opens High-Speed Computer Network," Science, p. 22.

Network Use

Network service offerings (e.g., databases and database searching services, news, publication, and software) will need some policy treatment. There need to be incentives to encourage development of and access to network services, yet not unduly subsidize such services, or compete with private business, while maintaining quality control. Many network services used by scientists have been "free" to the end user.

Economic and legal policies will need to be clarified for reference services, commercial information industry, Federal data banks, university data resources, libraries, publishers, and generally all potential services offered over the network.11 These policies should be designed to encourage use of services, while allowing developers to capture the potential benefits of network services and ensure legal and economic incentives to develop and market network services.

Longer Term Science Policy Issues

The near-term technical implementation of the NREN is well laid out. However, longer-term policy issues will arise as the national network affects more deeply the conduct of science, such as:

- patterns of collaboration, communication and information transfer, education, and apprenticeship;
- the dissemination of scientific information, from raw data, research results, and drafts of papers through finished research reports and reviews; might some scientific journals be replaced by electronic reports?
- the "productivity" of research and attempts to measure it;
- communication among scientists, particularly across disciplines and between university, government, and industry scientists;
- potential economic and national security risks of international scientific networking, collaboration, and scientific communication;
- equity of access to scientific resources, such as facilities, equipment, databases, research grants, conferences, and other scientists. (Will a fully implemented NREN change the concentration of academic science and Federal funding in a limited number of departments and research universities, and of corporate science in a few large, rich corporations? What might be the impacts of networks on traditional routes to scientific priority and prestige?)
- controlling scientific information flow. What technologies and authority exist to control network-resident scientific information? How might these controls affect misconduct, quality control, economic and corporate proprietary protection, national security, and preliminary release of tentative or confidential research information that is scientifically or medically sensitive?
- cost and capitalization of doing research; to what extent might networking reduce the need for facilities or equipment?
- oversight and regulation of science, such as quality control, investigations of misconduct, research monitoring, awarding and auditing of government grants and contracts, data collection, accountability, and regulation of research procedures.12 Might national networking enable or encourage new oversight roles for governments?
- the access of various publics to scientists and research information;
- intellectual property, the value and ownership of information;
- export control of scientific information;
- publishing of research results;
- legal issues, data privacy, ownership of data, copyright.

How might national networking interact with trends already underway in the scientific enterprise, such as changes in the nature of collaboration, sharing of data, and impacts of commercial potential on scientific research? Academic science traditionally has emphasized open and early communication, but some argue that pressures from competition for research grants and increasing potential for commercial value from basic research have

11 OMB, Circular A-130, 50 Federal Register 52730 (Dec. 24, 1985); H.R. 2381, The Information Policy Act of 1989, which restates the role of OMB and policies on government information dissemination.
12 U.S. Congress, Office of Technology Assessment, The Regulatory Environment for Science, OTA-TM-SET-34 (Washington, DC: U.S. Government Printing Office, February 1986).

dampened free communication. Might networks counter, or strengthen, this trend?

Technical Questions

Several unresolved technical challenges are important to policy because they will help determine who has access to the network for what purposes. Such technical challenges include:

- standards for networks and network-accessible information services;
- requirements for interfaces to common carriers (local through international);
- requirements for interoperability across many different computers;
- improving user interfaces;
- reliability and bandwidth requirements;
- methods for measuring access and usage, to charge users, that will determine who is most likely to pay for network operating costs; and
- methods to promote security, which will affect the balance between network and information vulnerability, privacy, and open access.

Federal Agency Plans: FCCSET/FRICC

A recently released plan by the Federal Research Internet Coordinating Committee (FRICC) outlines a technical and management plan for NREN.13 This plan has been incorporated into the broader FCCSET implementation plan. The technical plan is well thought through and represents further refinement of the NREN concept. The key stages are:

Stage 1: upgrade and interconnect existing agency networks into a jointly funded and managed T1 (1.5 Mb/s) National Networking Testbed.14
Stage 2: integrate national networks into a T3 (45 Mb/s) backbone by 1993.
Stage 3: push a technological leap to a multigigabit NREN starting in the mid-1990s.

The proposal identifies two parts of an NREN, an operational network and networking R&D. A service network would connect about 1,500 labs and universities by 1995, providing reliable service and rapid transfer of very large data streams, such as are found in interactive computer graphics, in apparent real time. The currently operating agency networks would be integrated under this proposal, to create a shared 45 Mb/s service net by 1992. The second part of the NREN would be R&D on a gigabit network, to be deployed in the latter 1990s. The first part is primarily an organizational and financial initiative, requiring little new technology. The second involves major new research activity in government and industry.

The "service" initiative extends present activities of Federal agencies, adding a governance structure which includes the non-Federal participants (regional and local networking institutions and industry), in a national networking council. It formalizes what are now ad hoc arrangements of the FRICC, and expands its scale and scope. Under this effort, virtually all of the Nation's research and higher education communities will be interconnected. Traffic and traffic congestion will be managed via priority routing, with service for participating agencies guaranteed via "policy" routing techniques. The benefits will be in improving productivity for researchers and educators, and in creating and demonstrating the demand for networks and network services to the computing, telecommunications, and information industries.

The research initiative (called stage 3 in the FCCSET reports) is more ambitious, seeking support for new research on communications technologies capable of supporting a network that is at least a thousand times faster than the 45 Mb/s net. Such a net could use the currently unused capabilities of optical fibers to vastly increase effective capability and capacity, which are congested by today's technology for switching and routing, and support the next generation of computers and communications applications. This effort would require a substantial Federal investment, but could invigorate the national communication technology base, and boost the long-term economic competitiveness of

13 FRICC, Program Plan for the National Research and Education Network, May 23, 1989. FRICC has members from DHHS, DOE, DARPA, USGS, NASA, NSF, NOAA, and observers from the Internet Activities Board. FRICC is an informal committee that grew out of agencies' shared interest in coordinating related network activities and avoiding duplication of resources. As the de facto interagency coordination forum, FRICC was asked by NSF to prepare the NREN program plan.
14 See also NYSERNET NOTE, vol. 1, No. 1, Feb. 6, 1989. NYSERNET has been awarded a multimillion-dollar contract from DARPA to develop the National Networking Testbed.


the telecommunications and computing industries. The gigabit network demonstration can be considered similar to the Apollo project for communications technologies, albeit on a smaller and less spectacular scale. Technical research needed would involve media, switches, network design and control software, operating systems in connected computers, and applications.

There are several areas where the FRICC management plan--and other plans--is unclear. It calls for, but does not detail, any transition to commercial operations. It does not outline potential structures for long-term financing or cost recovery. And the national network council's formal area of responsibility is limited to Federal agency operations. While this scope is appropriate for a Federal entity, and the private sector has participated influentially in past FRICC plans, the proposed council does not encompass all the policy actors that need to participate in a coordinated national network. The growth of non-Federal networks demonstrates that some interests--such as smaller universities on the fringe of Federal-supported R&D--have not been served.

The FRICC/FCCSET implementation plan for networking research focuses on the more near-term management problems of coordinated planning and management of the NREN. It does not deal with two extremely important and complex interfaces. At the most fundamental level, the common carriers: the network is part of the larger telecommunications labyrinth with all its attendant regulations, vested interests, and powerful policy combatants. At the top level, the network is a gateway into a global information supermarket. This marketplace of information services is immensely complex as well as potentially immensely profitable, and policy and regulation has not kept up with the many new opportunities created by technology.

The importance of institutional and mid-level networking to the performance of a national network, and the continuing fragmentation and regulatory and economic uncertainty of lower-level networking, signals a need for significant policy attention to coordinating and advancing lower-level networking. While there is a formal advisory role for universities, industry, and other users in the FRICC plan, it is difficult to say how and how well their interests would be represented in practice. It is not clear what form this may take, or whether it will necessitate some formal policy authority, but there is need to accommodate the interests of universities (or some set of universities), industry research labs, and States in parallel to a Federal effort. The concerns of universities and the private sector about their role in the national network are reflected in EDUCOM's proposal for an overarching Federal-private nonprofit corporation, and to a lesser extent in NRI's vision. The FRICC plan does not exclude such a broader policy-setting body, but the current plan stops with Federal agency coordination.

Funding for the FRICC NREN, based on the analysis that went into the FCCSET report, is proposed at $400 million over 5 years, as shown below. This includes all national backbone Federal spending on hardware, software, and research, which would be funneled through DARPA and NSF and overseen by an interagency council. It includes some continued support for mid-level or institutional networking, but not the value of any cost sharing by industry, or specialized network R&D by various agencies. This budget is generally regarded as reasonable and, if anything, modest considering the potential benefits (see table 3-2).15

NREN Management Desiderata

All proposed initiatives share the policy goal of increasing the nation's research productivity and creating new opportunities for scientific collaboration. As a technological catalyst, an explicit national NREN initiative would reduce unacceptably high levels of risk for industry and help create new markets for advanced computer-communications services and technologies. What is needed now is a sustained Federal commitment to consolidate and fortify agency plans, and to catalyze broader national involvement. The relationship between science-oriented data networking and the broader telecommunications world will need to be better sorted out before the NREN can be made into a partly or fully commercial operation. As the engineering challenge of building a fully national data network is surmounted, management and user issues of economics, access, and control of scientific information will rise in importance.

15 For example, National Research Council, Toward a National Research Network (Washington, DC: National Academy Press, 1988), pp. 28-31.
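The three NREN stages differ by orders of magnitude in line rate. As a rough back-of-the-envelope illustration (not from the report; the figures below are nominal line rates and ignore protocol, switching, and end-system overhead), consider the time needed to move a 1-gigabyte dataset, typical of the large data streams the service net is meant to carry, at each stage's speed:

```python
# Rough illustration: time to move a 1-gigabyte dataset at the nominal
# line rate of each proposed NREN stage. Ignores protocol overhead,
# congestion, and end-system limits, so real transfers would be slower.

DATASET_BITS = 1e9 * 8  # 1 gigabyte expressed in bits

STAGES_BPS = {
    "Stage 1 (T1, 1.5 Mb/s)": 1.5e6,
    "Stage 2 (T3, 45 Mb/s)": 45e6,
    "Stage 3 (gigabit, 1 Gb/s)": 1e9,
}

for name, bps in STAGES_BPS.items():
    seconds = DATASET_BITS / bps
    print(f"{name}: about {seconds:,.0f} seconds")
```

At these nominal rates the transfer takes roughly an hour and a half, three minutes, and eight seconds, respectively. That qualitative jump, from overnight batch transfer toward the "apparent real time" interaction described above, is what the gigabit research initiative is after.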


Table 3-2--Proposed NREN Budget ($ millions)

                                     FY90   FY91   FY92   FY93   FY94
FCCSET Stage 1 & 2 (upgrade; NSF)      14     23     55     50     50
FCCSET Stage 3 (gigabit+; DARPA)       16     27     40     55     60
Total                                  30     50     95     95    110
S. 1067 authorization                  50     50    100    100    100
H.R. 3131 authorization                50     50    100    100    100

SOURCE: Office of Technology Assessment, 1989.

The NREN is a strategic, complex infrastructure which requires long-term planning. Consequently, network management should be stable (insulated from too much politics and budget vagaries), yet allow for accountability, feedback, and course correction. It should be able to leverage funding, maximize cost efficiency, and create incentives for commercial networks. Currently, there is no single entity that is big enough, risk-protected enough, and regulatory-free enough to make a proper national network happen. While there is a need to formalize current policy and management, there is concern that setting a strong federally focused structure in place might prevent a move to a more desirable, effective, appropriate management system in the long run.

There is need for greater stability in NREN policy. The primary vehicle has been a voluntary coordinating group, the FRICC, consisting of program officers from research-oriented agencies, working within agency missions with loose policy guidance from the FCCSET. The remarkable cooperation and progress made so far depends on a complex set of agency priorities and budget fortunes, and continued progress must be considered uncertain.

The pace of the resolution of these issues will be controlled initially by the Federal budget of each participating agency. While the bulk of the overall investment rests with mid-level and campus networks, it cannot be integrated without strong central coordination, given present national telecommunications policies and market conditions for the required network technology. The relatively modest investment proposed by the initiative can have major impact by providing a forum for public-private cooperation for the creation of new knowledge, and a robust and willing experimental market to test new ideas and technologies.

For the short term there is a clear need to maintain the Federal initiative, to sustain the present momentum, to improve the technology, and coordinate the expanding networks. The initiative should accelerate the aggregation of a sustainable domestic market for new information technologies and services. These goals are consistent with a primary purpose of improving the data communications infrastructure for U.S. science and engineering.


Office of Technology Assessment

The Office of Technology Assessment (OTA) was created in 1972 as an analytical arm of Congress. OTA's basic function is to help legislative policymakers anticipate and plan for the consequences of technological changes and to examine the many ways, expected and unexpected, in which technology affects people's lives. The assessment of technology calls for exploration of the physical, biological, economic, social, and political impacts that can result from applications of scientific knowledge. OTA provides Congress with independent and timely information about the potential effects--both beneficial and harmful--of technological applications.

Requests for studies are made by chairmen of standing committees of the House of Representatives or Senate; by the Technology Assessment Board, the governing body of OTA; or by the Director of OTA in consultation with the Board. The Technology Assessment Board is composed of six members of the House, six members of the Senate, and the OTA Director, who is a non-voting member.

OTA has studies under way in nine program areas: energy and materials; industry, technology, and employment; international security and commerce; biological applications; food and renewable resources; health; communication and information technologies; oceans and environment; and science, education, and transportation.


CONGRESS OF THE UNITED STATES
OFFICE OF TECHNOLOGY ASSESSMENT


OTA-BP-CIT-59 SEPTEMBER 1989