Supercomputers: Government Plans and Policies

March 1986

NTIS order #PB86-205218

Recommended Citation: U.S. Congress, Office of Technology Assessment, Supercomputers: Government Plans and Policies–A Background Paper, OTA-BP-CIT-31 (Washington, DC: U.S. Government Printing Office, March 1986).

Library of Congress Catalog Card Number 86-600508

For sale by the Superintendent of Documents, U.S. Government Printing Office, Washington, DC 20402

Foreword

Supercomputers: Government Plans and Policies presents a review of the Federal Government's large-scale computing programs and examines the networking and software programs within selected agencies. Certain management and institutional questions pertinent to the Federal efforts are also raised and discussed. This background paper was requested by the House Committee on Science and Technology. Within the past 2 years, there has been a notable expansion in the Federal programs, and this increase prompted the committee's request for a review of issues of resource management, networking, and the role of supercomputers in basic research. OTA gratefully acknowledges the contributions of the many experts, within and outside the government, who served as workshop participants, contractors, and reviewers of this document. As with all OTA reports, however, the content is the responsibility of OTA and does not necessarily constitute the consensus or endorsement of the workshop participants or the Technology Assessment Board.

Director

OTA Project Staff—Supercomputers: Government Plans and Policies

John Andelin, Assistant Director, OTA Science, Information, and Natural Resources Division

Frederick W. Weingarten, Communication and Information Technologies Program Manager

Project Staff Prudence S. Adler, Study Director

Contractor Susan P. Walton, Editor

Administrative Staff: Elizabeth A. Emanuel, Shirley Gayheart,* Patricia M. Keville, Audrey Newman

*Deceased, Dec. 11, 1985.

Supercomputers Workshop Participants

Kent K. Curtis, Director, Division of Computer Research, National Science Foundation
James Decker, Deputy Director, U.S. Department of Energy
Sidney Fernbach, Consultant, Control Data Corp.
Robert Gillespie, Gillespie, Folkner & Associates
Paul Schneck, Director, Supercomputing Research Center, Institute for Defense Analyses
Jacob Schwartz, Director, Division of Computer Sciences, Courant Institute of Mathematical Sciences, New York University
Joe Wyatt, Chancellor, Vanderbilt University

Contractors

Ira Fuchs, Educom
Robert Gillespie, Gillespie, Folkner & Associates

Reviewers

Albert Brenner, Director, Consortium for Scientific Computing
John Connolly, Director, Office of Advanced Scientific Computing, National Science Foundation
Kent K. Curtis, Director, Division of Computer Research, National Science Foundation
James Decker, Deputy Director, U.S. Department of Energy
Earl Dowdy, Research Analyst
David J. Farber, Department of Electrical Engineering, University of Delaware
Sidney Fernbach, Consultant, Control Data Corp.
Craig Fields, Deputy Director, Engineering Applications Office, Defense Advanced Research Projects Agency
Robert Gillespie, Gillespie, Folkner & Associates
Randy Graves, Deputy Director, Aerodynamics Division, National Aeronautics and Space Administration
Dennis Jennings, Program Director for Networking, National Science Foundation
Sid Karin, Director, Advanced Scientific Computer Center
Lawrence Landweber, Department of Computer Science, University of Wisconsin, Madison
Lewis Peach, Numerical Aerodynamics Simulation Program, National Aeronautics and Space Administration
Paul Schneck, Director, Supercomputing Research Center, Institute for Defense Analyses
Jacob Schwartz, Director, Division of Computer Science, Courant Institute of Mathematical Sciences, New York University
Joe Wyatt, Chancellor, Vanderbilt University

Contents

Page
INTRODUCTION ...... 1
PRINCIPAL FINDINGS ...... 4
NATIONAL POLICY ...... 6
NATIONAL GOALS ...... 8
CURRENT FEDERAL PROGRAMS ...... 11
  National Science Foundation ...... 11
  National Aeronautics and Space Administration ...... 13
  Department of Energy ...... 14
  Supercomputing Research Center, National Security Agency ...... 15
  Defense Advanced Research Projects Agency ...... 16
NETWORKS ...... 17
  National Science Foundation ...... 17
  Department of Energy ...... 18
  National Aeronautics and Space Administration ...... 19
SOFTWARE DESIGN AND DEVELOPMENT ...... 20
  National Science Foundation ...... 21
  National Aeronautics and Space Administration ...... 21
  Supercomputing Research Center ...... 21
  Department of Energy ...... 21
ISSUES: MANAGEMENT AND INSTITUTIONAL QUESTIONS ...... 22
  Coordination ...... 22
  Center Management ...... 23
  Problems and Prospects for Software Development ...... 25
  Network Design and Development ...... 27

Tables

Table No. Page
1. Partial List of Problems/Applications That Will Benefit From Use of Large-Scale Facilities ...... 9
2. NSF/OASC Budget ...... 11
3. NAS Development Budget ...... 14
4. DOE Budget ...... 15
5. Defense Advanced Research Projects Agency Budget ...... 16
6. DOE Network Budget ...... 19
7. NASA Network NAS Budget ...... 19

Figures

Figure No. Page
1. Current and Projected Supercomputers, 1960-90 ...... 2
2. Range of Federal Policies Possible With a Supercomputer 200 Times the Current Capabilities ...... 10

INTRODUCTION

The Office of Technology Assessment (OTA) recently completed a report entitled Information Technology R&D: Critical Trends and Issues. This report explored the structure and orientation of selected foreign programs, issues of manpower, institutional change, new research organizations developing out of Bell Laboratories, and trends in science and technology policy. Four specific areas of research were also examined: advanced computer architecture, artificial intelligence, fiber optics, and software engineering.1 To supplement this earlier work, the House Committee on Science and Technology requested that OTA examine issues of resource management, networking, and the role of supercomputers in basic research. This background paper will explore issues raised in the earlier R&D assessment and examine new and ongoing Federal programs in large-scale computer research.

Supercomputer is the term applied to the class of the most powerful computers available at any particular time. The cost/performance ratio of all computers, from the largest to the smallest, continues to decrease rapidly, and today's desk-top computer has the power that years ago was available only in mainframes. Speed is gained both by improving the logical design of the computer and by making the electronic components of the machine operate faster. Hence, each generation of supercomputers has tested many new design ideas and component technologies that were later introduced in smaller, less expensive machines.

Since the 1950s most large computers have shared an architecture named for John von Neumann, a prominent mathematician who played a major role in the invention and development of the digital computer. In the von Neumann architecture, data and program instructions both reside in memory, and instructions are acted on one by one, sequentially, by the "processor" (the other parts are the "control" and the "memory"). Many of the supercomputers now popular, such as the Cray 1 and the Cyber 205, are still based on variations of the von Neumann design. Called "vector" machines, they gain their speed by breaking up computational tasks (such as addition and multiplication) into separate "pipelines," which allows certain problems to be executed far faster. (See figure 1.)

Most computer scientists have concluded that the sequential von Neumann design can no longer sustain the rapid growth to which we have become accustomed (though component speeds will continue to improve). They are looking elsewhere for new design ideas, and their interest has turned to parallelism. In a parallel machine, rather than one processor working sequentially on the steps of solving a problem, many processors work simultaneously on the computation. This interest in parallel design is based on three propositions:

1. the parallel computer will theoretically be far more powerful than the current von Neumann design;
2. the parallel multiprocessor could be less costly for a given task, especially when utilizing mass production technologies; and
3. parallel architectures will achieve higher computational speeds.

As the Federal Government sponsors more and more research in parallel computation, it is important to recognize this new design direction as a key component of the government's computer research effort. At the same time, it must be recognized that computer scientists and mathematicians are only beginning to understand how to use optimally the types of highly parallel designs that computer architects are exploring. Because of the growing importance of parallel computation, the terms "large-scale computing" and "advanced scientific computing" refer in this background paper to both current vector supercomputers that employ von Neumann architecture and systems based on multiprocessor technologies.

1. U.S. Congress, Office of Technology Assessment, Microelectronics Research and Development–A Background Paper, OTA-BP-CIT-40 (Washington, DC: U.S. Government Printing Office, March 1986).
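The contrast between sequential (von Neumann), vector, and parallel execution described above can be sketched in modern terms. The following is an illustrative analogy only, not code from this report: Python functions stand in for hardware execution styles, and a thread pool stands in for multiple physical processors.

```python
# Illustrative sketch (assumption: modern Python as an analogy for
# 1980s machine architectures; not drawn from the OTA report).
from concurrent.futures import ThreadPoolExecutor


def sequential_add(a, b):
    # Von Neumann style: one processor handles each element pair
    # in turn, one instruction at a time.
    out = []
    for i in range(len(a)):
        out.append(a[i] + b[i])
    return out


def vector_add(a, b):
    # Vector ("pipelined") style: a single add operation is applied
    # across whole arrays, streaming operand pairs through one pipeline.
    return [x + y for x, y in zip(a, b)]


def parallel_add(a, b, workers=4):
    # Parallel style: the computation is split into chunks and many
    # workers (separate processors, on a real machine) run at once.
    chunk = (len(a) + workers - 1) // workers
    a_parts = [a[i:i + chunk] for i in range(0, len(a), chunk)]
    b_parts = [b[i:i + chunk] for i in range(0, len(b), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(vector_add, a_parts, b_parts)  # order preserved
    return [x for part in results for x in part]
```

All three compute the same result; the three propositions above concern which style reaches it fastest and at what hardware cost.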

Figure 1.—Current and Projected Supercomputers, 1960-90

[Figure: log-scale chart of computational speed versus approximate year of introduction, 1960-1990; machines plotted include the Cray 1, Cray X-MP/48, and Denelcor HEP-2.]

SOURCE: Sidney Fernbach.

Federal interest in large-scale computing devices is based on many concerns, including:

● the ability of selected Federal agencies to fulfill their mission requirements in national defense, space technologies, energy technologies, and other areas;
● the viability of the U.S. supercomputer industry, particularly in light of increasing foreign competition;
● the research and development that is performed in hopes of increasing computational speed and the capabilities of these machines; and
● the availability of these machines to members of the scientific, research, and industrial communities to perform new research in a variety of fields.

These and other needs have led the Federal Government to expand its program in ways designed to give scientists and researchers greater access to large-scale computing facilities. This access will also foster the development of new architectures, and will lead to new generations of information technologies and the design and development of new software.

PRINCIPAL FINDINGS

● Several interagency panels were established in 1983 by the Federal Coordinating Council on Science, Engineering, and Technology (FCCSET) as a forum for discussion on specific supercomputer issues. These panels have succeeded in this role, and they remain as a forum where national interests, goals, and programs can be fully considered. At the same time, FCCSET panels hold limited authority to alter or implement government policy based on their findings.

● No single government agency holds lead authority in advanced computer research and access. Each agency's programs represent its own mission requirements. Though it is clear that there is a need for this diversity of government programs, there is also a need for enhanced coordination of these efforts to ensure that national goals are realized. This may be especially true as greater fiscal constraints are placed on the funding agencies.

● Federal efforts have grown substantially in the past 2 years in response to a series of reports that noted shortcomings in the national supercomputer program. With the diversity of programs, program goals, and mission requirements now underway throughout the government, it may be advisable to assess the Federal efforts to ensure that the original concerns noted in the reports (e.g., the need for more research in computational mathematics, software, and algorithms) are still valid or have been replaced by new, more pressing concerns. If it is to be effective, such a reexamination should:
—involve scientific and research users, members of the private sector, and pertinent agency administrators; and
—include a broader examination of the role of new information technologies and the conduct of scientific research.

● It is difficult to accurately assess the Federal investment in large-scale computing programs as the agencies employ a variety of terms to describe comparable or similar efforts.

● At least over the short term, limited human resources will be a critical factor in the success of the supercomputer programs. The opening of seven new centers by the National Science Foundation (NSF), the Department of Energy (DOE) and Florida State University, and the National Aeronautics and Space Administration (NASA) in the next fiscal year will generate a large demand for expert personnel to manage and operate the centers, but relatively few are available. Demand will be particularly heavy in the areas of applications software design and development, and this can only worsen as significantly different architectures proliferate.

● Software is an important determinant of the efficiency of the machines and the types of problems that can be tackled on them. It also influences the design of the next generation of machine. Therefore an investment in algorithm and software development is essential and integral to any large-scale computation program.

● Research in parallel computation has become a key component of the Federal Government's computer research effort, and one result has been a proliferation of significantly different architectures. Most scientists and researchers consider these experimental designs necessary and fundamental to the efforts of advancing computational speed.

● Our current understanding of software and algorithm development is inadequate to fully realize the benefits of the new architectures. Resources need to be directed to:
—develop an understanding of the new architectures;
—define the research necessary to move software and algorithms from current generation supercomputers to other supercomputers and architectures; and
—develop software and algorithms for the new architectures.

● Advanced data communication networks are important to the conduct of research in science and technology, because they provide nationwide (and sometimes international) access to resources and information. Networks can:
—expand interconnections between research communities,
—encourage greater joint or collaborative efforts, and
—broaden access to a variety and number of resources.

● Government and private networks are proliferating, many employing a variety of technologies, standards, and protocols. This diversity may merit concern in the near future if it makes use of the networks difficult for users.

● NSF is establishing NSFnet, a network that will link researchers with large-scale computing resources. It is also NSF's intention that NSFnet will be the basis of a national research network for scientists, researchers, and interested members of the industrial community. The coupling of NSFnet and a national research network could have far-reaching implications, and merits an in-depth and detailed study by an organization such as the National Academy of Sciences. In the interim, there are key issues regarding technology development and management operations of NSFnet that need to be considered by NSF:
—NSFnet is developing quickly, and choices made today, of pilot projects for example, may affect the future configuration of a national research network; and
—the industrial community has not been included in plans for NSFnet, which may restrict private researchers' access to resources and NSFnet users' access to industrial resources.

NATIONAL POLICY

Over the past few years, a combination of events has broadened awareness of and interest in Federal programs for the design and use of large-scale computing facilities. Prompted by a decline in the computational facilities and services at American universities and colleges, particularly scientific researchers' lack of adequate access to large-scale computing facilities, NSF convened a panel of scientists to review the situation in 1982. The Department of Defense (DOD), DOE, and NASA joined NSF in sponsoring the panel, which was chaired by Peter Lax of New York University. Unlike previous studies, which explored the needs of specific segments of the research community, this panel (referred to as the "Lax panel") examined the large-scale computing needs of the entire U.S. research community. The panel noted two key problems: access to supercomputer facilities was limited, and R&D on new architectures was insufficient to meet the perceived need for more sophisticated computers. The panel recommended four actions:

1. provide the scientific research community with increased access to supercomputing facilities and experimental computers through high bandwidth networks;
2. increase research in computational mathematics, software, and algorithms;
3. train personnel to use these facilities; and
4. conduct R&D of large-scale computing systems.2

A second report sponsored by NSF, the Bardon-Curtis report, outlined how NSF could respond to the problems noted by the Lax panel. The Bardon-Curtis report laid the groundwork for the new NSF supercomputing centers. The report recommended that NSF take six steps:

1. enhance coordination between Federal and private programs and supercomputer research projects;
2. increase support for local scientific computing facilities;
3. elicit proposals for supercomputer research centers, and support up to 10 centers within 3 years;
4. support networks to link universities and laboratories with each other and with supercomputer centers, thus providing access to facilities, file transfer capability, and scientific communication;
5. create an advisory committee to assist and oversee NSF's decisions concerning computer services and networks; and
6. support academic research and training programs in the areas of advanced computer systems design, computational mathematics, software, and algorithms.3

While the Lax panel was studying large-scale computing needs in the United States, the Japanese Government was working intensively to develop two programs, the National Super Speed Computer Project and the Fifth-Generation Computer Project. These programs, both designed to meet Japan's domestic supercomputer needs, also give entry into the international marketplace. The National Super Speed Computer Project is a 10-year program that seeks to develop a machine one thousand times faster than a current supercomputer. The Fifth-Generation Computer Project is focusing on development of a machine with artificial intelligence applications. Both projects are supported by the Ministry of International Trade and Industry and private companies. Recently, three Japanese companies, Fujitsu, Nippon Electric Corp., and Hitachi, announced supercomputers that appeared to be faster than U.S. machines.

In 1983, the British Government also announced a research effort in this area based on the recommendations of the Alvey Committee, the committee that formulated the British response to the Japanese Fifth-Generation Program. The British effort is focused on artificial intelligence, large-scale integrated circuits, software engineering, and man/machine interfaces. The European Economic Community has initiated the ESPRIT project, which will fund research in advanced microelectronics, software, office automation, advanced information processing, and computer-integrated manufacturing.

2. National Science Foundation, Report of the Panel on Large Scale Computing in Science and Engineering, 1982.
3. M. Bardon and K. Curtis, A National Computing Environment for Academic Research, National Science Foundation, July 1983.

NATIONAL GOALS

Together, foreign competition and pressure from the academic community have heightened concern over what the U.S. Government role should be in the development and use of advanced scientific computers and large-scale facilities. Several themes have emerged from the various reports that describe and elucidate the national role and goals with respect to large-scale computing machines.

The advancement of science is one of the most commonly cited goals. "Perhaps the most significant applications of scientific computing lie not in the solution of old problems but in the discovery of new phenomena through numerical experimentation. They [supercomputers] permit the solution of previously intractable problems, and motivate scientists and engineers to explore and formulate new areas of investigation."4 Integral to achieving this goal are education and access. Because computers and computing have become essential tools in scientific and engineering research, many trained and knowledgeable personnel are needed to operate and to use them. With access to these machines and facilities, researchers can be trained in large-scale computing and also conduct research using high-performance machines.

The reports also stress that the United States' economic strength and ability to compete internationally both now and in the future are dependent on the continuing development of and access to large-scale computing machines. Supercomputers are now integral in the design of aerospace, automotive, chemical, pharmaceutical, and microelectronic products. Over the last two decades, the United States has led the industrialized world in computer technology; each succeeding generation of supercomputer has led to new and innovative applications and designs. The relationship between a large-scale computing program to advance the state of the art in numerous fields and the U.S. position in the international marketplace is quite clear and is discussed in depth in two OTA reports: Information Technology R&D: Critical Trends and Issues and International Competitiveness in Electronics.5 As the number of supercomputers available to the scientific and research communities increases, more and more applications will be employed, expanding the commercial and economic benefits to the United States. Clearly, the potential benefits from employing large-scale computing machines within the defense, industrial, and scientific communities are enormous.

The development of supercomputers in support of national defense and national security programs is a critical goal of a national program. High-performance computing is needed to verify arms control, analyze intelligence data and information, and protect national security secrets. Supercomputers have always been essential in defense programs for military preparedness, and the design of weapon systems and ballistic trajectories.

Large-scale computing facilities are also central to R&D in such fields as atmospheric sciences, aeronautics, nuclear reactor theory and design, and geophysics. "Research at the forefront of contemporary and future science and technology will demand adequate access to supercomputer power."6 With the help of supercomputers, scientists can now tackle problems and investigations in areas not possible before. Problems such as simulating the circulation of the oceans and the atmosphere, and the effects of carbon dioxide buildup in the atmosphere, are examples of questions previously too large and too time-consuming to attempt solution on computing facilities. "The story of numerical weather prediction and climate modeling is one of a never-ending fight for faster computers, because they raise the level of realism we can put into our models and hence the level of realism we can expect in our results."7 (See table 1.)

Table 1.—Partial List of Problems/Applications That Will Benefit From Use of Large-Scale Facilities

Use of supercomputers in engineering: automobile design; aircraft design; digital simulation; nuclear power system safety; stress/strain analysis of structures.
Use of supercomputers in manufacturing: robotics; assembly; system control; plant design; computer-aided design.
Use of supercomputers in support of national security missions: command and control; embedded systems; weapons systems design; mapping; intelligence collection and analysis.
Use of supercomputers in exploring environmental questions: weather modeling and climate; satellite imaging; chemical flow models.
Use of supercomputers in medicine and health-related issues and problems: diagnosis; biochemical processes; design of drugs; genetic research.
Use of supercomputers in economics research: real-time models of world economic relationships; prediction effects of phenomena.
Use of supercomputers in chemistry: design of catalysts; design of membranes; design of new materials and chemicals.
Use of supercomputers in energy research and development: petroleum exploration; reservoir modeling; power grid modeling; fusion engineering; exploration support.
Use of supercomputers in support of new computer design and development: simulation of new computer architectures; supercomputer design; computer graphics.
Examples of scientific research employing supercomputers: cell growth in biological systems; lattice quantum chromodynamics; quantum calculations of molecular energy surfaces; molecular dynamics simulations on genetically engineered proteins; general circulation models of the atmosphere to study the climatic effects of aerosols; simulation and condensed matter physics; free oscillations of the earth; determination of the structure of an animal virus, human rhinovirus-14; simulations of DNA molecules in an aqueous environment; simulations of the origins of the universe; neural modeling of the frog retina; simulations of interactions between enzymes and inhibitor molecules.

SOURCE: Office of Technology Assessment.

As increased availability of large-scale computing facilities has extended the range of problems to be investigated by the scientific and research communities in the United States, it has also sharpened interest in facilities on either side of the supercomputing spectrum: those not nearly as powerful as the Cray 2 or Cyber 205, dubbed the minisupercomputers, and those machines not yet available, the next generation. The minisupercomputers, such as the Intel Personal Supercomputer, the iPSC, or that of Convex Computer Corp., the C-1, present the scientist with a cost-effective alternative to a supercomputer, with hands-on availability and large number-crunching capabilities. But there are problems still intractable on current generation supercomputers in fields such as hydrodynamics, fusion, and plasma physics that drive scientists to design new architectures with increased computational capabilities. (See figure 2.)

4. National Science Foundation, Report of the Panel on Large Scale Computing in Science and Engineering, 1982, p. 6.
5. U.S. Congress, Office of Technology Assessment, Information Technology R&D: Critical Trends and Issues, OTA-CIT-268 (Washington, DC: U.S. Government Printing Office, February 1985); and U.S. Congress, Office of Technology Assessment, International Competitiveness in Electronics, OTA-ISC-200 (Washington, DC: U.S. Government Printing Office, November 1983).
6. U.S. Department of Energy, The Role of Supercomputers in Energy Research Programs, February 1985, p. 4.
7. Merry Maisel, "Questions for Supercomputers," Gather Scatter, San Diego Supercomputer Center, January 1986, p. 5.

Figure 2.—Range of Federal Policies Possible With a Supercomputer 200 Times the Current Capabilities

[Figure: chart on a logarithmic scale (10², 10¹, 1, 0.1); recoverable labels include a "state-of-the-art benchmark" line and run-time bands for large production (10 hours to 1 hour) and small production (down to about 6 minutes).]

CURRENT FEDERAL PROGRAMS

This section will briefly review the Federal large-scale computing programs in the pertinent agencies to illustrate the scope and nature of the U.S. Government's investment in large-scale computing programs and facilities. The OTA review will describe five major programs at: NSF, NASA, DOE, the Supercomputing Research Center, and the Defense Advanced Research Projects Agency (DARPA). The programs vary in nature, ranging from those of DOE and NSF, which combine R&D funding and access to facilities, to DARPA, which is focused on R&D in advanced computer research.

National Science Foundation

Within the past 2 years, NSF has broadened its large-scale computing program efforts in direct response to the Lax and Bardon-Curtis reports and the pressures and problems they cited. To rectify two commonly cited problems, NSF established an Office of Advanced Scientific Computing (OASC). The new office is designed to provide U.S. researchers with access to supercomputers or advanced computing services, and to encourage the growth and development of advanced scientific computing in the United States.

NSF is providing researchers with increased access to advanced computing services in several ways. In July 1984, NSF funded three operating computer centers (phase I), at Purdue University, the University of Minnesota, and Boeing Computer Services in Seattle. This action presented qualified researchers with immediate opportunities to conduct research at these computing facilities and at the same time become familiar with four supercomputers: the Cyber 205, the Cray 2, the Cray 1A, and the Cray 1S. The majority of users of the three centers are current NSF grantees, who also comprise the bulk of new proposals soliciting computer center time. Since those original grants were made, three other facilities were funded. Colorado State houses a Cyber 205; Digital/Vector Productions has a Cray X-MP; and AT&T Bell Laboratories now has a Cray X-MP. The OASC allocated 22,000 hours of supercomputer time to NSF researchers in fiscal year 1985; 5,000 hours had been used by the end of fiscal year 1985. At the beginning of fiscal year 1986, time is being used at a rate of over 1,000 hours per month.

The new supercomputer centers represent a major new Federal investment. Over the next 5 years, NSF will invest approximately $200 million in five more centers (phase II) at Cornell University, the Consortium for Scientific Computing near Princeton (a consortium of 13 universities), the University of California at San Diego (a consortium of 19 universities and research institutions), the University of Illinois, and the Pittsburgh Center (a consortium of 10 to 20 universities). (See table 2.) NSF funds serve also as "seed" money, and have already generated interest and support in other sectors. In addition to Federal funding, the centers receive some money from State governments and industry. For example, the State of New Jersey, private industry, and consortium members have already committed $39.3 million to the John von Neumann Center for Scientific Computing near Princeton, New Jersey. Private industry is also expected to fund specific research projects at these five centers.

Table 2.—NSF/OASC Budget (in millions)

                              Fiscal year  Fiscal year  Fiscal year
                                 1985         1986         1987
Centers:
  Phase I ................      $ 9.7        $ 3.3        $ 2.3
  Phase II ...............       19.3         25.1         34.5
Training .................        0.5          1.0          1.0
Networks:
  NSFnet .................        3.7          5.0          6.5
  Local access ...........        2.2          1.8          0.5
New technologies:
  Cornell Center .........        4.9          5.2          6.2
  Other experimental access       0.0          2.3          1.0
Software .................        0.9          1.5          1.6
Total ....................      $41.4        $45.2        $53.6

SOURCE: John Connolly, Director, Office of Advanced Scientific Computing, National Science Foundation.

Each of the five new supercomputer centers is expected to develop a different research emphasis. The Center for Theory and Simulation in Science and Engineering, located at Cornell University, has been designated an experimental center where research will focus on parallel processing and software productivity. Researchers there use an IBM 3084 QX mainframe computer attached to FPS 164 and 264 scientific processors. IBM has donated both equipment and personnel to support this center. An important aspect of the Cornell program is the plan to bring in interdisciplinary teams of scientists to develop new algorithms. Unlike the other center programs, this program focuses on experimental equipment, and this configuration means that it will serve a few users with large needs, rather than a large number of users in need of computing cycles.

The John von Neumann Center, located near Princeton, will be managed by the Consortium for Scientific Computing, which represents 13 universities.8 At first, the von Neumann Center will use a Cyber 205, then later will upgrade the facilities to include an ETA-10 by Scientific Computer Systems. At this center, the focus will be on providing research time on the supercomputer facilities. Members of the consortium believe that the center will develop strengths in particular disciplines, such as microbiology.

The fourth center, the National Center for Supercomputing Applications, is located at the University of Illinois at Urbana-Champaign. Like the San Diego Center, it will use a Cray X-MP/24 and upgrade to a Cray X-MP/48. The Illinois center will be closely affiliated with the Center for Supercomputer Research and Development, a program supported by DOE and NSF. The Illinois center will provide computing cycles to the research community, and through a visiting scholar program, it will also focus on the development of new architectures and algorithms. The Illinois center has received extensive support from the State of Illinois.

The Pittsburgh Center for Advanced Com-
The puting will not be funded at the same level as center was established to provide researchers the other phase II centers, although NSF is with access to the facilities for scientific re- committed to its long-term operation. A Cray search and to develop new architectures and 1S donated by NASA prompted the establish- algorithms. ment of this new center, which will be dedi- The San Diego Supercomputer Center, lo- cated to providing time on the Cray facilities. cated at the University of California at San A Cray X-MP/48 and SSP will be delivered in Diego, is managed by GA Technologies and April 1986 to update the center’s facilities. The supported by a consortium of 19 universities University of Pittsburgh Center will be man- and research institutions. 9 The State of Cali- aged by Carnegie-Mellon University with par- fornia is committing $1 million per year to this ticipation by Westinghouse Electric Corp. center. The San Diego Supercomputer Center The funding cycles of these centers vary. will use a Cray X-MP/48 and plans to use a The phase I centers will be funded for 2 years, Cray-compatible pledged through 1986, after which phase II centers will “Members are: University of Arizona; Brown University; Co- begin full operation. ’” Funding for each phase lumbia University; University of Colorado; Harvard Univer- II center will be approximately $40 million per sity; Institute for Advanced Study-Princeton, iNJ; Massachu- setts Institute of Technology; New York University; University center over a 5-year period. Prototype centers of Pennsylvania; Pennsylvania State University; Princeton (e.g., Cornell University), will be funded for 3 University; University of Rochester; and Rutgers University. years at $20 million. 
NSF projects that the ‘The members are: Agouron Institute, San Diego, CA; Cali- fornia Institute of Technology; National Optical Astronomy Ob- program will require between $300 million and servatories; Research Institute of Scripps Clinic; Salk Institute $500 million within 5 years. This will cover the for Biological Studies; San Diego State University; Southwest costs of network development and of establish- Fisheries Center; Stanford University; University of Califor- nia at Berkeley; University of California at Los Angeles; Univer- ing 11 to 13 supercomputer centers nation- sity of California at San Diego; Scripps Institution of Ocean- wide, with two systems per center. This esti- ography; University of California at San Franciso; University mate is based on an analysis of projected of Hawaii; University of Maryland; University of Michigan; University of Utah; University of Washington; and the Univer- ‘“Phase I centers are now going through year two funding sity of Wisconsin. cycles. ———.—

CURRENT FEDERAL PROGRAMS

This section will briefly review the Federal has a Cray X-MP. The OASC allocated 22,000 large-scale computing programs in the perti- hours of supercomputer time to NSF research- nent agencies to illustrate the scope and na- ers in fiscal year 1985; 5,000 hours had been ture of the U.S. Government’s investment in used by the end of fiscal year 1985. At the be- large-scale computing programs and facilities. ginning of fiscal year 1986, time is being used The OTA review will describe five major pro- at a rate of over 1,000 hours per month. grams at: NSF, NASA, DOE, the Supercom- The new supercomputer centers represent puting Research Center, and the Defense Ad- a major new Federal investment. Over the vanced Research Projects Agency (DARPA). next 5 years, NSF will invest approximately The programs vary in nature, ranging from those of DOE and NSF, which combine R&D $200 million in five more centers (phase II) at funding and access to facilities, to DARPA, Cornell University, the Consortium for Scien- which is focused on R&D of advanced com- tific Computing near Princeton (a consortium of 13 universities), the University of Califor- puter research. nia at San Diego (a consortium of 19 univer- sities and research institutions), the Univer- National Science Foundation sity of Illinois, and the Pittsburgh Center (a Within the past 2 years, NSF has broadened consortium of 10 to 20 universities). (See ta- its large-scale computing program efforts in ble 2.) NSF funds serve also as “seed” money, direct response to the Lax and 13ardon-Curtis and have already generated interest and sup- reports and the pressures and problems they port in other sectors. In addition to Federal cited. To rectify two commonly cited prob- funding, the centers receive some money from lems, NSF established an Office of Advanced State governments and industry. For exam- Scientific Computing (OASC). The new office ple, the State of New Jersey, private industry, is designed to provide U.S. 
researchers with and consortium members have already com- access to supercornputers or advanced com- mitted $39.3 million to the John von Neumann puting services; and encourage the growth and Center for Scientific Computing near Prince- development of advanced scientific computing ton, New Jersey. Private industry is also ex- in the United States. pected LO fund specific research projects at these five centers. NSF is providing researchers with increased access to advanced computing services in sev- Each of the five new supercomputer centers eral ways. In July 1984, NSF funded three are expected to develop a different research operating computer centers (phase 1), at Pur- due University, University of Minnesota, and Table 2.– NSF/OASC Budget (in millions) Boeing Computer Services in Seattle. This ac- Fiscal Fiscal Fiscal tion presented qualified researchers with im- year year year mediate opportunities to conduct research at 1985 1986 1987 these computing facilities and at the same Centers: time become familiar with four supercom- Phase I ...... $ 9.7 $ 3.3 $ 2.3 Phase II ... ., ~ ~ “ 19.3 251 34.5 puters: the Cyber 205, the Cray 2, the Cray Training ...... 0.5 1.0 1.0 1A, and the Cray 1 S. The majority of users Networks: of the three centers are current NSF grantees, NSFnet ...... , . . ... 3,7 5.0 6.5 Local access . . . . 2.2 1.8 .5 who also comprise the bulk of new proposals New Technologies soliciting computer center time. Since those Cornell Center ...... 4,9 5,2 6.2 original grants were made, three other facil- Other experimental access ., . 0 2.3 1.0 ities were funded. Colorado State houses a Software ...... ,9 1.5 1.6 Cyber 205; Digital/Vector Productions has a Total ...... $41,4— $45.2 $53.6 SOURCE John Connolly Dlrec-tor Off Ice of Advanced Sclentlf!c ComputlngI Na Cray X-MP; and AT&T Bell Laboratories now tlonal Sc(ence Foundation

11 12 emphasis. The Center for Theory and Simu- by Scientific Computer Systems. At this cen- lation in Science and Engineering, located at ter, the focus will be on providing research Cornell University, has been designated an ex- time on the supercomputer facilities. Members perimental center where research will focus on of the consortium believe that the center will parallel processing and software productivity. develop strengths in particular disciplines, Researchers there use an IBM 3084 QX main- such as microbiology. frame computer attached to FPS 164 and 264 The fourth center, the National Center for scientific processors. IBM has donated both Supercomputing Applications is located at the equipment and personnel to support this cen- University of Illinois at Urbana-Champaign. ter. An important aspect of the Cornell pro- Like the San Diego Center, it will use a Cray gram is the plan to bring in interdisciplinary X-MP/24 and upgrade to a Cray X-MP/48. The teams of scientists to develop new algorithms. Illinois center will be closely affiliated with the Unlike the other center programs, this pro- Center for Supercomputer Research and De- gram focuses on experimental equipment, and velopment, a program supported by DOE and this configuration means that it will serve a NSF. The Illinois center will provide comput- few users with large needs, rather than a large ing cycles to the research community, and number of users in need of computing cycles. through a visiting scholar program, it will also The John von Neumann Center, located near focus on the development of new architectures Princeton, will be managed by the Consortium and algorithms. The Illinois center has re- for Scientific Computing, which represents 13 ceived extensive support from the State of universities.8 At first, the von Neumann Cen- Illinois. ter will use a Cyber 205, then later will up- The Pittsburgh Center for Advanced Com- grade the facilities to include an ETA-10. 
The puting will not be funded at the same level as center was established to provide researchers the other phase II centers, although NSF is with access to the facilities for scientific re- committed to its long-term operation. A Cray search and to develop new architectures and 1S donated by NASA prompted the establish- algorithms. ment of this new center, which will be dedi- The San Diego Supercomputer Center, lo- cated to providing time on the Cray facilities. cated at the University of California at San A Cray X-MP/48 and SSP will be delivered in Diego, is managed by GA Technologies and April 1986 to update the center’s facilities. The supported by a consortium of 19 universities University of Pittsburgh Center will be man- and research institutions. 9 The State of Cali- aged by Carnegie-Mellon University with par- fornia is committing $1 million per year to this ticipation by Westinghouse Electric Corp. center. The San Diego Supercomputer Center The funding cycles of these centers vary. will use a Cray X-MP/48 and plans to use a The phase I centers will be funded for 2 years, Cray-compatible minisupercomputer pledged through 1986, after which phase II centers will ‘Members are: University of Arizona; Brown University; Co- begin full operation. Funding for each phase lumbia University; University of Colorado; Harvard Univer- II center will be approximately $40 million per sity; Institute for Advanced Study-Princeton, NJ; Massachu- setts Institute of Technology: New York University; University center over a 5-year period. Prototype centers of Pennsylvania; Pennsylvania State University; Princeton (e.g., Cornell University), will be funded for 3 University; University of Rochester; and Rutgers University. years at $20 million. 
NSF projects that the The members are: Agouron Institute, San Diego, CA; Cali- fornia Institute of Technology; National Optical Astronomy Ob- program will require between $300 million and servatories; Research Institute of Scripps Clinic; Salk Institute $500 million within 5 years. This will cover the for Biological Studies; San Diego State University; Southwest costs of network development and of establish- Fisheries Center; Stanford University; University of Califor- nia at Berkeley; University of California at Los Angeles; Univer- ing 11 to 13 supercomputer centers nation- sity of California at San Diego; Scripps Institution of Ocean- wide, with two systems per center. This esti- ography; University of California at San Franciso; University mate is based on an analysis of projected of Hawaii; University of Maryland; University of Michigan: University of Utah; University of Washington; and the Univer- ‘[’Phase I centers are now going through year two funding sity of Wisconsin. cycles. 13 — — needs of the 20 disciplines funded by NSF. By have focused on algorithm development and 1990, one-third of these disciplines will require numerical techniques. In fiscal year 1986, $1.5 two or more large-scale computing facilities; million is earmarked for software proposals. one-third will require one facility; and the Other divisions of NSF also support re- remaining one-third less than one-half of a search on large-scale facilities, networking, facility. This totals 22 to 24 state-of-the-art software engineering, and related areas. large-scale facilities and minisupercomput- Projects funded by the Division of Computer ers. ” These projections were made prior to the Research over the past 20 years are now the passage of the Gramm-Rudman-Hollings leg- basis for many information technologies in use islation. 
In a recent briefing on the NSF pro- or have lead to prototype development else- gram, program staff stated that no more cen- where (many of the DARPA projects de- ters would be established. scribed originated within the Division of Com- An important facet of the establishment of puter Research). Projects may be directly these centers is the creation of a new network related to the work planned at the new centers to allow users to communicate with one other, and in OASC, but the focus is varied and dis- both at the centers and around the country. tinct from that of OASC. Unlike OASC, the The stated goal of the NSF networking pro- other divisions place a greater emphasis on gram is to provide the research community funding projects that use multiprocessor tech- with universal access to the large-scale re- nologies. For example, two NSF projects with search facilities. This network is intended to this emphasis, now in the prototype stage, are be the basis for a national research network the Texas Reconfigurable Array Computer for the academic community and eventually project at the University of Texas, Austin, and will connect with international networks. The the Cedar project at the University of Illinois. NSF strategy appears to be twofold: 1) imme- diate access through existing networks, such National Aeronautics and Space as ARPANET (the first long-haul computer Administration network developed under contract by DARPA); 2) followed by development of high- NASA has used supercomputers at various speed networks that will be the backbone of locations around the country for several years the new network, connecting all of the data in support of a number of mission programs. centers. To carry out this strategy, the pro- Until recently, however, NASA has funded gram will soon begin funding pilot network- very little advanced scientific supercomputer ing projects. 
research designed to create or develop new ar- chitectures or machines, although the agency OASC also intends to fund projects in five did fund the development of the Massively areas of software productivity and computa- Parallel Processor (MPP), now being used for tional mathematics: computer science research S image processing. I (See table 3.) on programming environments, development of software tools, numerical analysis, algo- NASA’s three research centers have super- rithm development, and increasing research computers: Ames, Langley, and Lewis. The effectiveness in using advanced computers. ‘z Goddard Space Flight Center also has super- The fiscal year 1985 budget for this program computer facilities (Cyber 205) though it is not was almost $1 million, but only $500,000 was designated as a research center. ” committed to new grants, some funded jointly with the Divisions of Mathematical Sciences and Computer Research. The funded proposals “MPP, a limited application computer, has o~rer 16,000 proc- essors operating simultaneously’ and a custom integrated cir- cuit containing eight complete processors. “Ames, Cyber 205, X-hi P, I.angley, modified Cyber 205: ‘‘ I,arrJ’ I,ee, National %ience Foundation, personal commu- I,ew’is, Cra~’ 1 S and awaiting an X-hi P: and hlarshall Space nication. ,June 7, 1985, Flight center has an RFP out to purchase a supercomputer in 1’Ibid., ,June 14, 19x5. the near future. ((wnf~nued (JII n(’xr pago 14

Table 3.—NAS Development Budget (in millions) In fiscal year 1984, anticipating the arrival of the Cray 2, NASA began software devel- Fiscal Fiscal year year opment projects. This work has been carried 1986 1987 out both at Cray Research and on a Cray X- MP at Ames. Early in the process, NASA chose a UNIX operating system. In conjunction with the new facilities, NASA plans to establish a network to link all

SOURCE: Randy Graves, National Aeronautics and Space Administration. of NASA communications including computer facilities. High-speed communications be- tween the four large-scale computing centers, One of NASA’s major supercomputer pro- an integral part of the network, will be facili- grams is the Numerical Aerodynamic Simu- tated through satellite and terrestrial links. lation (NAS) Program. NAS is designed to The NAS center will be included in this net- solve problems of aerodynamic and fluid dy- work and in ARPANET and MILNET (the namics, but it is also intended to: Defense Data Network) for access by the pri- . . . act as the pathfinder in advanced large- vate sector and university researchers. scale computer system capability through systematic incorporation of state-of-the-art Department of Energy improvements in computer hardware and soft- ware technologies.15 DOE has a long history of using supercom- When NAS becomes operational in fiscal year puters and supporting architecture develop- 1986, it will be available to interested individ- ment for them. Since the 1950s, a DOE labor- uals from NASA, DOD, other government atory has acquired the first or one of the first agencies, industry, and universities. The NAS manufactured units of nearly every largescale processing system will employ state-of-the-art computer. DOE’s National Laboratories still high-speed processors (designated HSP 1 and hold the greatest concentration of users of su- 2), a mass storage system, a support-process- percomputers; approximately 35 percent of ing subsystem, a subsystem, a the supercomputers in use in the United graphics subsystem, and a long-haul commu- States are located in these laboratories. nications subsystem. HSP-1 will be a Cray 2 DOE uses supercomputers to support a va- supercomputer with four processors configu- riety of its missions. The nuclear weapons pr~ ration and 256 million words of memory. 
The gram relies on large-scale computers to aid in HSP-2, to be developed as NAS becomes oper- highly complex computations in the design ational, is expected to achieve four times the process. The Magnetic Fusion Energy and the computational capabilities of a Cray 2 and will Inertial Confinement Fusion Programs are include upgraded subsystems and graphics ca- heavily dependent on supercomputers as well. pabilities and expanded wideband communi- The machines are required to model the com- cations. 16 plex behavior of hot plasmas, including the ef- fects of electric and magnetic fields, atomic physics, the interaction with intense radiation, (continued from previous page) Ames Research Center utilizes the Cray IS–upgraded to X- and various boundary conditions. To this end, MP for computational aerodynamic research; Lewis Research DOE hopes by fiscal year 1986 to have an in- Center uses the Cray 1S for internal computational fluid me chanics, and for thermal and structural performance analysis stalled base of 26 supercomputers throughout 7 of propulsion system components; Goddard Space Flight Cen- the agency and its laboratories.1 ter utilizes their Cyber 205 for atmospheric science investi- gations. 17 Briefing, DOE, J~u~ 1985; 18 in support of defense Pro MNASA ~gt,imony, House Science and Technology Committ- grams, 3 in support of magnetic fusion energy, 2 in support ee, June 25, 1985. of the energy sciences, 2 in support of the naval reactors pre ‘6Lewis Peach and Randy Graves, NASA, personal commu- gram, and finally, 1 in support of the uranium enrichment nication, February 1985. program. 15

More recently, DOE has started using large- cated at Lawrence Livermore National Lab- scale computers to support other programs in oratory, the network includes one Cray 2, one the Office of Energy Research (OER) in addi- Cray X-MP, and two Cray 1 computers that tion to Fusion Energy Research, known also provide large-scale computing capability for as the Supercomputer Access Program. In the program. The four largest fusion contrac- February 1983, noting that various disciplines tors are linked by satellite communications, in OER needed supercomputing time, DOE and others have access through telephone set aside 5 percent of the National Magnetic lines. Fusion Energy Computer Center (NMFECC) DOE and Florida State University, in a co- facility at the Lawrence Livermore Laboratory operative agreement, established the Florida for the energy research community through State University Supercomputer Computa- the Energy Sciences Advanced Computation tional Research Institute in 1985. The new in- Program. This supercomputer time, first avail- stitute was established to explore aspects of able in June 1983, was immediately filled, and energy-related computational problems, al- the hours requested exceeded the availability gorithms, and architectural research. DOE by an order of magnitude. ’8 and the State of Florida will both contribute In addition to providing access to largescale to the development of the institute. The State computational facilities for research purposes, will provide 10 faculty positions and the com- DOE also funds supercomputer research to puter facility; DOE will provide 69.5 percent accelerate the development of systems de- of the funds to establish the institute; and signed and built in the United States. Histori- Control Data Corp. will contribute equipment cally, DOE laboratories needed the fastest discounts and some personnel. (See table 4.) computers available to fulfill their missions. 
As a result, the laboratories have traditionally Supercomputing Research Center, played a key role in applying each succeeding National Security Agency generation of supercomputers, and developing software, since initially most machines have Recognizing a need for research in super- arrived without usable software. The DOE Ap- computing, the Institute for Defense Analy- plied Mathematical Sciences Research Pro- ses has established the Supercomputing Re- gram within OER supports new computer ar- search Center (SRC) for the National Security chitectures and also mathematical and Agency. The new “center of excellence” in par- computer science research. Recently, program allel processing will focus its development ef- emphasis has shifted to new parallel mul- forts on algorithms and systems, and also con- tiprocessors. DOE has supported numerous duct research on national security programs. projects and prototypes, such as the Hyper- Still relatively new, the research agenda for cube machine, a California Institute of Tech- the center has not been set, although basic re- nology project; the Ultracomputer at NYU; and the Cedar project at the University of Il- linois. Table 4 .—DOE Budget (in millions) The Magnetic Fusion Energy Computer Fiscal Fiscal Fiscal year year year Network, established in 1974, provides access 1985 1986 1987 to these supercomputer facilities, while also Energy research ...... $17.8 $20.55 $22.1 satisfying other communication needs. Lo- Florida State University . . 7.0 8.5 0 SCRI ——— Additional funds in support of “U.S. Department of Energy, The Role of Supercornputers supercomputer operations at: in L’nergy Research Programs, February 1985, p. 1. Two Cray Oak Ridge. Sandia(2), 1 computers at the National Magnetic Fusion Energy Computer Los Alamos Scientific Center (N MFECC) and a Cray X-MP will be used in fiscal year Laboratory(2), Livermore, 1985 as well to help meet the Class VI needs of the energy re- MFE Center: ...... 170.0 search community. 
SOURCE Jam;s Decker Deputy Director Dep~rtment of Energy 16 — search in parallel algorithms, operating sys- DARPA funds several stages of R&D, from tems, languages and compilers, and computer simulation to prototype construction and fi- architectures (including in-house construction) nally to benchmarking, a procedure using a set are the areas that will be investigated.lg of programs and files designed to evaluate the performance of the hardware and software of The center will perform both classified and a computer in a given configuration. In fiscal unclassified research. It is recognized that par- year 1985, DARPA expanded the multiproces- ticipation by the academic and industrial com- sor program and funded a greater number of munities is essential and some exchange of in- research projects. DARPA’s efforts represent formation will be allowed. the bulk of government R&D in multiproces- Budget figures are not available though sor research, surpassing programs of DOE and staffing levels and facility data are available: NSF. Multiprocessor projects supported by 100 professionals; 70 support staff; 100,000 DOD include the Butterfly Multiprocessor, square feet/permanent facility including cur- 20 which can accommodate up to 256 commer- rent supercomputer and other equipment. cially available , the Connec- tion Machine, contains 64,000 processors, and Defense Advanced Research the NONVON machine, with up to 8,000 large Projects Agency and small processors. DARPA has also par- ticipated in joint funding of projects with DARPA supports a large number of re- other agencies, such as the Cosmic Cube search projects that seek to advance the state project at California Institute of Technology. of the art in multiprocessor system architec- (See table 5.) tures. 
Unlike other programs, DARPA does not use supercomputers to fulfill agency mis- sion requirements, but rather funds promis- Table 5.— Defense Advanced Research Projects ing research projects that may advance the Agency Budget (in millions) knowledge and design of current computer ar- chitectures. Support is directed towards ful- Fiscal Fiscal Fiscal year year year filling the goals of the Strategic Computing 1985 1986 1987 Program, which seeks to create a new genera- 21 13Baslc Research and Exploratory tion of “machine intelligence technology." Development for Advanced Computer The program is partly focused on symbolic Research (this figure includes funding processing for artificial intelligence appli- figures for other areas such as machine Intelligence and robotics) $1241 $125 $125 cations. Machine Architecture 204 40 40 Distributed Computing and Software ‘g’’ Parallel Processor Programs in the Federal Government, ” Systems 17.8 18 18 draft report, 1985, p. 1. Network and Research Facilities 26.7 27 27 ‘“Paul Schneck, Director, Supercomputing Research Center, Total $1890 $21o $210 personal communication, December 1985. SOURCES FCCSET Panel on Advanced Computer Research (n the Federa/ *’Defense Advanced Research Projects Agency, Strategic Government, summer 1985, and Cra!g Fields, personal communica- Computing, 1983, p. 1. tion February 1985 NETWORKS

A number of networks are operating or being created to support government computational programs by linking the users and the computational resources. Networks allow large, diverse, and geographically dispersed research communities to share resources (in this case, large-scale computing facilities), exchange information, and share software.

As most commonly used, the term "network" refers to a communications system designed to provide links between two or more of a large collection of users. As used in the computer community, the term applies more specifically to a system that provides data transmission links among a set of computers, called "hosts," and among users of those computers, who link to the network by means of a "terminal." The term "network" commonly is applied to the entire assemblage of computers, communication lines, and user terminals.

A wide variety of networks already exist. They extend in scale from so-called "local area networks" that connect desk-top computers within an organization, to national and international networks, such as ARPANET, that connect thousands of users with several very powerful machines. Networks use many different data transmission speeds, encoding techniques, and protocols (basic sequences of messages that tell the network what is to be done) that attune them to particular types of data communication and use. Because of these variations, different networks can be very difficult to interconnect, even when they use the same fundamental communications technology (e.g., satellites or fiber optics) and the same type of computers. Because such incompatibility can be a barrier to the development and use of networks, some standards organizations are developing common descriptive models and interconnection standards at both the national and international levels. The combination of activities by the standards organizations and the numerous networking activities will eventually lead to common standards and protocols that will satisfy the differing needs of the individual networks.

The scientific and research communities are already using several networks. ARPANET, developed and operated by DOD, hosts over 200 computers at nearly 100 universities, government laboratories, and private sector research companies. CSNET is a data communications network linking computer scientists and engineers at over 120 university, government, and commercial sites throughout the United States and Canada, with gateways to Europe and the Far East. BITNET is a network of more than 350 computers at over 100 higher education and research institutions, with direct links to counterparts in Canada and Europe. Commercial value-added networks, such as TELENET, TYMNET, and UNINET, provide users with low-speed terminal access and moderate-speed host-to-host access.

The concept of a network as it is used by most government agencies goes beyond the notion of supercomputer access for remote users. Rather, networks are viewed as effective means to make the fullest possible use of the resources. Three networking programs are described below: NSF, DOE, and NASA.

The first recommendation of the Lax panel called for "increased access for the scientific and engineering research community through high bandwidth networks to adequate and regularly updated supercomputer facilities and experimental computers."22 The development of NSFnet is designed to meet this need, as is the new NASA network in part. DOE already has a highly successful and efficient network, MFENET. With the substantial activity in this area it is important to be cognizant of the plans of the Federal programs.

National Science Foundation

A major new effort within OASC is the creation of a network to link researchers with the large-scale computing facilities. The new network, called NSFnet, is intended to be the basis for a national research network. The NSF concept for the new network is to "leverage" existing resources and networks with a new national network that is limited in both funding and authority. NSFnet would then try to take advantage of existing and new campus, community, State, and consortium networks, forming a network of networks following the DARPA internet model. To achieve this "internet" environment, NSF has adopted interim and long-term standards (initially the DOD internet protocol suite, TCP/IP plus existing applications, with eventual migration to ISO/OSI protocols). Future plans for NSFnet are uncertain because the network design and structure will be based on knowledge gained during phase I. Conceptually, the network will be designed to link end users with end resources.

Early efforts of NSFnet have focused on providing interim services, i.e., linking the researcher to the resource. To this end, phase I efforts provided funds to universities to buy equipment such as ; to arrange links between local networks, phase I centers, and other networks; and to fund consortium networks and pilot projects.24 The goal of phase I is to provide the top 50 to 60 campuses with shared or dedicated 56,000 bits per second circuits. NSF allocated $5.9 million to the network program in fiscal year 1985, $2.2 million of which was for local-access projects in fiscal year 1985 only.

NSF is also trying to link three existing networks: ARPANET, BITNET, and CSNET. OASC and DOD have signed a Memorandum of Understanding to expand the current ARPANET by one-third, with NSF funding 25 percent of the total costs of the expanded network. It is anticipated that ARPANET will be expanded by the time the phase II centers become operational.

There has also been some discussion of enhancing BITNET with NSF standard protocols, because BITNET has extensive links with the American academic community and international connections to the European Academic Research Network and Japan. The addition of CSNET to NSFnet would greatly expand the access base for users.

NSFnet will also be linked with each of the four centers described above, which together encompass a large percentage of the U.S. academic community. Of the four centers, only the San Diego Center will be MFENET based, with migration to the NSF internet standards planned. Discussions are underway to include regional networks, such as MERIT and the Colorado State Network, in NSFnet.

Pilot projects proposed for phase I of NSFnet's development take advantage of available technologies in an effort to enhance the communications between the user and the resources. OASC is now considering three projects: Vitalink Translan,23 DARPA wideband, and workstation projects. Funding for local-access projects has been allocated for fiscal year 1985, and will be used to connect campus networks and the NSF internet and for local computing facilities for supercomputer users.

The NSF strategy is to favor service organizations on campus for handling the users' concerns, rather than placing that burden on the future administrators and managers of NSFnet.

22National Science Foundation, Report of the Panel on Large Scale Computing in Science and Engineering, 1982, p. 10.
23Vitalink Translan is designed to interconnect several Ethernet local area networks in order for those networks to appear as a single large network. This would employ satellite and terrestrial lines.
24Based on discussions and briefings by D. Jennings, National Science Foundation, June, September, and November 1985.

Department of Energy

The Magnetic Fusion Energy Network (MFENET) is an integral part of NMFECC. This network links NMFECC, located at the Lawrence Livermore National Laboratory, with computer centers at major fusion laboratories and other facilities nationwide. Over 4,000 users in 100 separate locations use MFENET. MFENET interconnects all computers at the center, and links remote user service centers that support local computing, experimental data acquisition, printers, terminals, remote user service stations, ARPANET (via the user service center at NMFECC), and dial-up terminals (via TYMNET and commercial telephone lines).25

The DOE fusion laboratories are linked by dual 56,000 bits per second (bps) satellite links to NMFECC. Many of the other users are connected to these centers by 4,800 or 9,600 bps leased telephone lines. Other users gain access to the centers through TYMNET, ARPANET, direct commercial dial, or the Federal Telephone System. Over 125 user service stations located at national laboratories, universities, or elsewhere provide local computing capabilities and, through the connection to NMFECC, can function as remote output and job-entry stations. Those users not within the local dialing area of a user service station may dial access to the main computers and other network hosts.26

DOE and the scientific communities have discussed expanding the resources available through MFECC and MFENET. However, budgetary restraints preclude this expansion in the near future.27 (See table 6.)

National Aeronautics and Space Administration

Boeing Computer Services is currently developing the Program Support Communications Network (PSCN) for NASA. This network will serve NASA and selected user sites by providing wideband and other transmission services. The Numerical Aerodynamic Simulation (NAS) Program (see above) will not develop a separate network to support its program and users, but instead will employ PSCN. PSCN will be operating by April 1986, with completion expected in the fall of 1986. Interim service for NAS and a limited number of users will be provided by a Long Haul Communication Prototype, which will examine and evaluate communication requirements of the NAS network.

PSCN is composed of several levels of service. Eight NASA centers and facilities will be linked by satellite: Ames, Dryden, Johnson, Kennedy, Marshall, Langley, Lewis, and Goddard. Eleven centers and facilities will be linked via 1.5 Mbps (megabits per second) terrestrial backbone links: Ames, Western Launch Operations, Jet Propulsion Laboratory, Dryden, Johnson, Michoud Assembly, Slidell Computer Complex, National Space Technology Laboratories, Kennedy, Langley, NASA Headquarters, Wallops, and Lewis. Four centers will be connected by 6.3 Mbps links: Ames, Langley, Goddard, and Lewis Research Center. NASA's three research centers (Ames, Langley, and Lewis) will be linked through the Subsystem (CNS).

CNS, a subsystem of PSCN, is a high-speed computer network linking selected NASA centers. Like PSCN, CNS will send jobs and files between these centers, while also maintaining updated information on the files and jobs within the network. This subsystem is designed for high-volume data transfer; it will use a T1 link for inter-site transfers and a 9,600 bps link for control information. (See table 7.)
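The circuit speeds cited in this section span three orders of magnitude, from 4,800 bps leased lines to NASA's 6.3 Mbps links. A rough, present-day calculation (a hypothetical illustration; only the line rates are taken from the text above) shows what that range means for moving a research dataset:

```python
# Idealized transfer time for a dataset over the link speeds mentioned
# in this section. Real throughput would be lower once protocol
# overhead, line contention, and retransmission are accounted for.

LINK_SPEEDS_BPS = {
    "4,800 bps leased line": 4_800,
    "9,600 bps leased line": 9_600,
    "56,000 bps circuit": 56_000,
    "1.5 Mbps T1 backbone": 1_500_000,
    "6.3 Mbps link": 6_300_000,
}


def transfer_seconds(size_bytes: int, speed_bps: int) -> float:
    """Payload bits divided by the raw line rate."""
    return size_bytes * 8 / speed_bps


if __name__ == "__main__":
    size = 10_000_000  # a hypothetical 10-megabyte simulation output file
    for name, bps in sorted(LINK_SPEEDS_BPS.items(), key=lambda kv: kv[1]):
        print(f"{name:>22}: {transfer_seconds(size, bps) / 3600:8.2f} hours")
```

At 4,800 bps the hypothetical 10-megabyte file ties up the line for more than four and a half hours; over a 6.3 Mbps link it moves in under 13 seconds, which is the gap the backbone upgrades described above were meant to close.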

25D. Fuss and C. Tull, "Centralized Supercomputer Support for Magnetic Fusion Energy Research," Proceedings of the IEEE, January 1984, p. 36.
26Ibid.
27Don Austin, Department of Energy, personal communication, Aug. 14, 1985; and John Cavalini, Department of Energy, personal communication, June 5, 1985, and December 1985.

Table 6.—DOE Network Budget (in millions)

                                 Fiscal year  Fiscal year  Fiscal year
                                    1985         1986         1987
Energy research . . . . . . . .     $2.8         $3.0         $3.3

SOURCE: James Decker, Deputy Director, Department of Energy.

Table 7.—NASA Network NAS Budget (in millions)

                                 Fiscal year  Fiscal year
                                    1986         1987
PSCN/NAS contribution . . . . .     $1.6         $2.0

SOURCE: Randy Graves, National Aeronautics and Space Administration.

SOFTWARE DESIGN AND DEVELOPMENT

A wide variety of software is necessary to run any large computer system. Some broad categories of software are as follows:

● Operating System: The operating system software manages the flow of work through the machine. It has responsibilities such as assigning and controlling physical devices attached to the computer, cataloging and keeping track of data in the computer memory, and controlling the input of programs and data and the output of results. It also provides a user with a set of basic tools for accomplishing certain basic tasks common to most computer applications.

● Programming Tools and Languages: Users developing programs can make use of several types of automated aids. Higher level languages allow the user to express a program in a form that is simpler, more readable and understandable by a human, and more closely related to the technical language in which users communicate their problems, than is the basic computer "machine language." Programs written in these languages are easier to develop, more easily understandable by others, and often are more easily transferred between different machines. In addition, the compiler, which translates the higher level language program into machine language, can assist a user in taking advantage of particular characteristics of a specific computer by restructuring the program during translation. This latter advantage is particularly important for supercomputers, which have unusual features that must be employed efficiently to obtain the very high computation speeds. Software engineers are developing many other types of tools to help users develop software, both in the programming and in the diagnostic and testing phases. The OTA report Information Technology R&D has additional discussion of the state of software engineering.

● Applications Programs: Applications programs are the software developed to solve specific problems. They range in scale and purpose from small, relatively simple programs designed to solve a specific problem once, to large programs usable by many researchers to solve a variety of related problems.

Underlying these levels of software is a rapidly developing body of computational theory flowing from mathematics and computer science. Computational mathematics, for example, by examining how basic methods for calculating solutions to equations behave in terms of efficiency and accuracy, helps to develop improved methods. In particular, computer scientists and mathematicians are only beginning to understand how to use optimally the types of highly parallel designs computer architects are exploring. As such knowledge is developed, it will lead to more efficient use of existing supercomputers, and will help computer architects design even better future machines.

Theoretical computer scientists are developing an understanding of how large, complex computational systems behave. For example, they study techniques for scheduling and allocating resources, they study the structure of higher order languages, and they explore the theoretical basis for determining whether a program is correct.

When OTA examined the field of software engineering in its report Information Technology R&D, it found that: "The lack of applications software for supercomputers has been a significant barrier to their adoption and use."28 Concern over the availability and use of software of all types, not only applications software, for large-scale computing continues to be a critical issue, and "will grow worse in the near future because of the proliferation of significantly different architectures."29 Applications and systems software is available for the current generation of vector machines, although in some cases, "it does not fully use the capabilities of the machine."30 Also, the amount and breadth of systems and applications software appears insufficient to satisfy both current and projected demands generated by the expanded Federal programs.31 The software programs of NSF, NASA, SRC, and DOE are described below.

28U.S. Congress, Office of Technology Assessment, Information Technology R&D: Critical Trends and Issues, OTA-CIT-268 (Washington, DC: U.S. Government Printing Office, February 1985), p. 64.

National Science Foundation

NSF funds academic and some corporate research in software engineering, software development, computational mathematics, and related areas. Together, the Division of Computer Research and the OASC provide over $3 million for software development, although not all of this research is directly applicable to the needs of advanced scientific computing. Within the Division of Computer Research, the Software Engineering, Software Systems Science, Computer Systems Design, Theoretical Computer Science, and Special Projects programs each fund a variety of research projects.

The OASC will be funding projects in direct support of the supercomputing centers described above. These efforts will focus on software productivity and computational mathematics. Research will be concentrated in the following areas: computer science research on programming environments, development of software tools, numerical analysis and algorithm development, and increasing the effectiveness of advanced computers in research.

National Aeronautics and Space Administration

Since the mid-1970s, NASA research centers have supported algorithm development for supercomputers (Illiac and Star-100). Agency officials noted that earlier advanced scientific computing efforts were hindered by insufficient or inadequate software, and so designed the NAS program schedule to avoid this problem. At both Cray and the Ames Research Center, development work has been underway since 1984 in the areas of operating system modifications, network communication, distributed file systems, batch queuing systems, and common graphics services. The NAS Project developed a common user environment, based on a UNIX operating system, spanning a network of computers from multiple vendors. Except for specific user service projects, these software projects are for the NAS facility, not "research."

Supercomputing Research Center

The SRC software efforts concentrate on systems software development, which can be divided into three research areas: operating systems and compilers, language, and performance measurement. The research in these fields will be conducted at SRC.

Department of Energy

To support its varied research programs, DOE is involved in numerous software development efforts that address systems, applications, and tools. These efforts, however, are closely tied to other DOE programs, so their budget figures cannot be broken out. For example, at least some of the $7.8 million allocated in fiscal year 1985 for the Department's analytical and numerical methods program was devoted to computational software. Similarly, some of the $10.4 million budgeted for advanced computer concepts was allocated to software engineering technologies. This past year, OER requested that $2 million be set aside specifically for software tool development, but the Office of Management and Budget did not approve this request.32
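The compiler restructuring described under "Programming Tools and Languages" above is easiest to see in miniature. The sketch below is a modern Python illustration, not code from any of the agency programs described (the function names and data are hypothetical): a vectorizing compiler's job is, in effect, to turn the first, element-by-element form into the second, whole-vector form, which vector hardware can pipeline efficiently.

```python
# Scalar loop versus the whole-vector form a vectorizing compiler would
# target for vector hardware. Pure-Python sketch; both functions compute
# a*x + y, the classic "saxpy" operation of scientific computing.

def saxpy_scalar(a, x, y):
    """Element-by-element loop: one multiply-add per iteration."""
    result = []
    for i in range(len(x)):
        result.append(a * x[i] + y[i])
    return result


def saxpy_vector(a, x, y):
    """Whole-vector form: the shape a vectorizing compiler produces,
    letting the hardware stream the multiply-adds through its pipeline."""
    return [a * xi + yi for xi, yi in zip(x, y)]


if __name__ == "__main__":
    x, y = [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]
    # Both forms give the same answer; only the execution shape differs.
    assert saxpy_scalar(2.0, x, y) == saxpy_vector(2.0, x, y)
```

On a vector machine of the kind discussed in this report, the difference between these two shapes, applied across an entire application, is a large part of what the software efforts above were created to capture.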

29Office of Technology Assessment Workshop, Apr. 29, 1985.
30Ibid.
31Ibid.
32Don Austin, Department of Energy, personal communication, Aug. 14, 1985.

ISSUES: MANAGEMENT AND INSTITUTIONAL QUESTIONS

As the Federal programs in large-scale computing continue to grow in fiscal year 1986, several management and institutional questions arise concerning coordination, network development, and software. These questions may merit greater attention than they now receive, given their importance and the enormity of the task facing the agencies. Such issues are particularly important because no single agency dominates large-scale computer research or policy. Topics discussed below are: coordination between agencies, including issues of center management, resource allocation, and manpower and training; the Federal role in software design and development; and network development.

Coordination

ISSUE:
With the growing Federal efforts in large-scale computer research and access, coordination among agencies is necessary to the success of the overall national program. Are the current coordinating mechanisms, the FCCSET panels, succeeding at promoting "efficiency" in national programs while also allowing individual agencies to accomplish their missions?

The Federal investment in large-scale computing programs is already large and is growing. Coordination among programs and government agencies is one way to realize the best return on this investment. Coordination among agencies is also one way to ensure that national goals and interests are achieved, despite agencies' disparate goals. To date, FCCSET has been the coordinating mechanism for all Federal programs, and has involved the private and academic communities in its deliberations and actions. The FCCSET panels successfully brought about discussions between agencies, and established a forum for discussion.

In 1983, FCCSET established the Panel on Supercomputers, which was charged with exploring what the U.S. Government could do to advance the development and use of large-scale computers. Subsequently, three subpanels were formed focusing on issues of procurement, access, and research coordination. Representatives from DOE, NSF, NASA, NSA, DOD, the Department of Commerce, and the Central Intelligence Agency are members of pertinent FCCSET subpanels.

Deliberations of the procurement subpanel, which was chaired by a DOE representative, focused on the question, "what should the government do to ensure that the United States retains its lead in supercomputers?" Specific recommendations included "guaranteed" buying of supercomputers by the government, increasing government purchasing of machines, and designing a supercomputer of 200 times the current capabilities. After issuing a report in late 1983, this group merged with the access panel.33

The second subpanel, on access, also chaired by a representative of DOE, published a report that called for upgrading current programs in a variety of ways and also recommended steps to meet the long-term objectives of providing scientists and engineers with access to supercomputers.34

The research coordination subpanel, chaired by a representative of DOD, seeks to coordinate government agency programs that fund research contributing to the U.S. technology base. A recent report of this subpanel outlines present and proposed federally sponsored work in very-high-performance computer research, summarizes agency funding in this area, including budgets for fiscal years 1983 to 1985, and presents findings and recommendations for future action.35

33Report to the Federal Coordinating Council on Science, Engineering and Technology Supercomputing Panel on Recommended Government Actions To Retain U.S. Leadership in Supercomputers, n.d.
34Report of the Federal Coordinating Council on Science, Engineering, and Technology Supercomputer Panel on Recommended Government Actions To Provide Access to Supercomputers, n.d.
35Report of the Federal Coordinating Council on Science, Engineering and Technology Panel on Advanced Computer Research in the Federal Government, draft report, August 1985.


More recently, an ad hoc Office of Science and Technology Policy (OSTP) panel was established to explore avenues to improve cooperation between Federal agencies and the private and academic sectors. Although the committee, headed by Jim Browne of the University of Texas, Austin, has not yet published its findings, Browne in a recent briefing recommended a higher level of interagency coordination, either through FCCSET or a new mechanism. Browne also suggested the formation of a "continuing policy review committee."36

The call for a higher level of interagency coordination reflects the limited abilities of the FCCSET committees to substantially alter or establish government policy. Although members of the FCCSET committees are themselves intimately familiar with their agency's programs, they lack authority to implement recommendations made through the FCCSET subpanels. Implementation and policy directives must come from either the OSTP, the administrators of each agency, or both. This remains a potentially serious gap in the overall coordination of government programs. OSTP may wish to consider creating a panel whose membership would include pertinent agency administrators, scientific and research users, and private sector participants. Such a panel could serve several purposes:

● Evaluate the Federal efforts underway to ensure that they are addressing the original concerns noted in the various reports (e.g., Lax, Bardon, Press, and others). Such a reexamination could be a conduit for recommending "mid-course" corrections that may be necessary within the Federal program.
● Provide a high level of authority to implement suggestions and recommendations made by the current FCCSET panels.
● Review the Federal efforts in large-scale computing periodically. Such a panel could review the overall health and direction of the Federal programs; identify new scientific opportunities made possible by the government programs; target specific areas of concern for FCCSET subpanels to address; and recommend and authorize actions that cut across more than one agency program.

The government emphasis on funding research, rather than prototype development, is one such issue that crosscuts most agency boundaries, has been the focus of a FCCSET subpanel, and remains a concern of scientists, researchers, and government officials. Some steps are being taken toward prototype development. DARPA's Strategic Computing Program is actively pushing technologies from the research stage into the prototype phase. A much smaller effort is underway in NSF's Division of Computer Research and in the DOE program. In this case, DARPA's efforts are significantly greater than the others, which is cause for concern, since the research is focused on a specific mission and cannot have as broad a perspective in experimental technologies as can NSF's. Also, DARPA's funding in this area greatly exceeds that of other agencies. At present, a FCCSET subpanel is examining current Federal efforts in the funding of very-high-performance computer R&D. In this instance, an OSTP panel could assist by:

● determining if the current Federal efforts are sufficient to develop and "support" new experimental technologies;
● identifying the "next" steps that would facilitate the transfer of research results to utilization; and
● examining the balance of agency programs in the funding of experimental technologies.

36Briefing of the NSF Advisory Committee, June 1985.

Center Management

ISSUE:
The FCCSET subpanels on access, procurement, and research coordination have acted as forums for information exchange between Federal programs in large-scale computer research. Are comparable FCCSET bodies needed to discuss center management issues, such as resource policies and funding, and manpower and training?

As the number of large-scale computing sites proliferates, the need continues for FCCSET to coordinate supercomputer issues. FCCSET can remain a single forum to consider in full national interests, goals, and programs; the centers are also viewed here as national resources whose significance extends beyond the mission needs of the funding agency.

The DOE supercomputing programs, for example, are designed to achieve specific missions. Nevertheless, careful planning, an appreciation of user needs, and the establishment of a network have allowed these supercomputing programs to become more national in scope. NSF has considered a comparable effort: a strong program plan and well-defined research agenda to provide the framework for success in the NSF centers' program. A program plan will provide the basis for decisions on how time should be allocated at the facilities, and in turn will define the type of research performed at the sites. Soon after the centers begin operation, research trends of users will be evident, allowing one to see both ongoing research and also new research opportunities that have been created. This is especially important since the NSF centers' program has no single constituency, but relies on the support of 20 disciplines within the agency.

FCCSET also operates as a forum for agencies and supercomputer center managers to exchange information about experiences in operating the centers, software, network development, and general lessons learned. For instance, the experiences gleaned by DOE in establishing its facilities and network will be invaluable to NSF and NASA, despite the agencies' very different missions. DOE experience may be particularly germane over the next few years, as NSF judges the success of its funding formula or "seed money philosophy." Unlike other mission agencies that fully fund their large-scale facilities programs, NSF has required the individual centers to raise additional funds to supplement the NSF grants. Some scientists have claimed that this policy has hindered the long-term program by streamlining center staff.

Resource Policies

Federal supercomputer programs operate under a variety of resource or allocation policies. The NSF policies for allocation of time on computing facilities have undergone some changes since the phase I centers were established. Originally, NSF allocated time in batches of 10 hours or less, with no peer review. Currently, in addition to those who request 10 hours or less, one can submit a request for time to NSF with no maximum time limit set, and program directors there distribute hours at their discretion. If the requestor is not an NSF grantee, an application for time will undergo normal NSF peer review. Under the new policy, 60 percent of the centers' service units will be distributed by NSF, with the remaining 40 percent allocated by the individual centers. Up to 25 percent of each center's allocation may be available for proprietary research. Also, if use of facilities is sold to for-profit institutions, the organization will be charged the full cost of using the service. Each center is expected to create a review panel of scientists from multiple disciplines to evaluate each research proposal and to adopt standard NSF peer review principles.

At DOE, time allocation in the Supercomputer Access Program is based on scientific merit and need. Investigators make requests to the Office of Energy Research program directors at the beginning of each year, and the program directors rank the proposals. These determinations are not always final; program administrators shift allocations during the year to allow for immediate program needs. This "flexible" program policy highlights the distinction between the allocation plan of a research agency like NSF and a mission-directed allocation plan, such as DOE's.

NASA is in the process of establishing NAS usage policy and guidelines for user allocation. Allocation of time on the NAS system will be made on the basis of the uniqueness and suitability of the proposal to the NAS facilities. An "announcement of opportunities" was released recently by NASA, alerting interested researchers that the NAS system will be available for use in fiscal year 1986. Once NASA receives responses to this announcement, the agency will finalize its allocation policies. Generally, program managers foresee a rough breakdown of 55 percent NASA-related research, 5 percent university research (non-NASA sponsored), 25 percent DOD and other government agency sponsored research, and the remaining 15 percent, proprietary research.

Because each agency has a different mission, each requires its own allocation policies. FCCSET has established a mechanism for allocating time on facilities for scientists who are funded by more than one agency. After the NSF centers and the NAS have been operating for a while, it may be advantageous to examine allocation policies to ensure that Federal goals, such as the advancement of science, are attained. At present, though, a FCCSET panel could contribute little else to the issue of allocation.

Manpower and Training

Training and manpower concerns are twofold: how to use the available computer resources effectively, and how to make the best use of available personnel at each site. To promote the understanding of the facilities that leads to effective use, as well as to foster new use of the large-scale computing resources, NSF and DOD sponsored three summer institutes to train researchers in the development of codes, vector techniques, and networks for remote access. These institutes also gave researchers a forum to present their work and discuss the role of supercomputers in performing it.

DOE regularly sponsors conferences and tutorials on the supercomputer operating system (DOE's Compatible Time-Sharing System) and on network access, both at the Lawrence Livermore Laboratory and elsewhere around the country. Special conferences are also held that explore specific problems and issues pertinent to members of DOE research communities.

In anticipation of the NAS starting operation, NASA's training efforts include extensive "onboard" support, a user interface group consisting of 40 organizations across the country, and workshops to familiarize users with the system.

These combined efforts will help educate professionals at universities who are not computer scientists on how to use the facilities. Generally, the pool of qualified individuals available to support the supercomputing facilities is limited. With the opening of five new NSF centers, the Florida State University Center, and one NASA center in fiscal year 1987, the demand for talented personnel will grow, both for people to run the facilities and to work through problems with research scientists. And too few personnel are available to meet the current and projected software demands, since too few have had access to experimental computing facilities and the number of graduate students in computational science and mathematics is small. Two key concerns of scientists are: the training of new personnel to help operate the new centers and make them effective and well utilized; and preventing "raids" of experienced personnel from an existing center to a planned center, to avoid hurting an ongoing program. The human resource questions are of great importance in determining the success of the supercomputer programs. It may be worthwhile for a FCCSET panel to consider a review of agency training efforts and mechanisms to ensure that greater training efforts be undertaken, particularly in light of the development of new architectures.

Problems and Prospects for Software Development

ISSUE:
The availability of software for large-scale computing continues to be a critical concern. What Federal efforts are necessary to tackle this software problem?

The greater availability of time on supercomputers, as a result of the Federal investments and programs described above, can both exacerbate and ease the current software demands. As more and more users turn to supercomputers to solve their scientific problems, new software will be generated that will increase the usefulness and capabilities of the machines. Once a network is in place, those who use it will share software, as well. At the same time, until more software is developed, users will inevitably be frustrated by their inability to access the machines easily and the lack of standard "libraries" of applications.37

Until more scientists use large-scale computers for their research, the lack of applications software will remain a problem. Traditionally, manufacturers have not developed a substantial base of software (including applications software) for their machines. In fact, it has been stated that:

    . . . it has taken over 5 years to develop software to efficiently support and utilize vector technology.38

Also, the ability to move software from machine to machine and site to site is needed to encourage users to experiment with the advanced scientific computing facilities. Transportability of software will be very important in building a base of new users: "We should be in a position where we can make it very easy to move from one machine to another."39 Lacking a comfortable bridge to move code/software from machines in use (e.g., a Vax) to a supercomputer or parallel processor machine, the scientific community may withhold its support, because of delays in performing research and the complexity of undertaking the research on a supercomputer. As noted in the FCCSET report, this demands ". . . rethinking of problems and a rather fundamental rethinking of strategies,"41 in the creation and development of algorithms.

At a meeting sponsored by NSF in December 1984, researchers concluded that a new type of research activity was necessary.42 They recommended interdisciplinary teams be formed, consisting of mathematicians, computer scientists, and scientists from disciplines with problems to be solved, to tackle the software needed for the new highly parallel architectures. Recently, NSF convened an ad hoc panel to consider the establishment of a National Supercomputer Software Institute. Although the panel recommended against such a move, it did encourage NSF to:

    mount a program to encourage research on parallel software by increasing the level of university access to experimental parallel computers and by funding promising research projects at sufficient scale to allow major software developments to be undertaken either by university groups or consortia or by joint university-industry teams.43

Software is an important determinant of the efficiency of these machines and the types of problems that can be tackled with them. It is also an influence on the design of the next generation of machines. For these reasons, an investment in algorithms and software development is integral to any large-scale computation program.

Finally, as noted earlier, the shortage of personnel needed to meet the current and projected software demands will grow worse when coupled with the introduction of radically different and new architectures.

The issues of supercomputer software re- . . .
Researchers may not use this important quire better definition and resolution. To this scientific tool if its use proves difficult to end, several ideas were proposed at the OTA learn, frustrating and generally inconvenient.’” Workshop to address these issues: The nurturing of new architectures will also c A follow-on study to the Lax report that bring a host of software problems. The new specifically addresses software issues, de- architectures demand “a rather fundamental fines the problems, and explores possible

370 ffice of Technology Assessment Workshop, Apr. 29, 1985. “D. Fuss and C. Tull, “Supercomputer Support for Magnetic 410 ffice of Technology Assessment Workshop, Apr. 29, 1985. Fusion Research, ” Proc4”ngs of the IEEE, January 1984, p. “Ibid. 41. ‘sJacob Schwartz, Chairman, Ad Hoc Panel on a National Su- 3Wffice of Technology Assessment Workshop, Apr. 29, 1985. percomputer Institute, letter to Erick Block, Director, National ‘“FCCSET, op. cit., Access, Section III. Science Foundation, Nov. 4, 1985. 27

solutions or actions. This study should be sponsored by one or more agencies, such as NSF, DOE, NASA, or DOD. The Lax report recommended: "Increased research in computational mathematics, software and algorithms necessary to the effective and efficient use of supercomputer systems."44
● Establishment of a software development/engineering institute separate from each agency but supported by funding from each of the agencies. The institute's charge would be to develop software and algorithms for use on all types of large-scale computing machines and to advance the state of the art.
● The formation of a new subpanel within FCCSET to address issues of software and algorithm development. This group would act as a coordinating body within the government to keep track of agency efforts and suggest needed improvements and directions.

Network Design and Development

ISSUE:
The National Science Foundation has stated that NSFnet will be the basis for a national research network. Should NSFnet be the basis for a national research network? And until this network is in place, how should NSFnet be administered and managed?

Networks permit users easier access to resources and facilitate the exchange of information between and among them. "The ability of a network to knit together the members of a sprawling community has proved to be the most powerful way of fostering scientific advancement yet discovered."45 Networks make information and resources available to the researcher regardless of geography, thus expanding scientists' research base. Networks may also provide uniform access to their resources and users without requiring the user to know the physical location of resources or other users.

As an increasing number of networks are designed, developed, and created to support Federal programs in large-scale computation, their importance will become more evident. Networks are essential to the success of these programs because they provide access to resources and information on a national scale.

The proliferation of computer networks, combined with the expansion of Federal large-scale computing programs and the identified need for access to these computational resources, has generated interest in the creation of a national research network. The research needs and goals of the NSF program require a network that will serve a diverse, geographically dispersed group of scientists and engineers. This broad-based approach also suits the development of a national research network, a long-term goal of certain segments of the research community. The national research network would be a broad-based telecommunications network designed to serve the complex and diverse requirements of the national research community and to address the broader issue of providing scientific information to this community. Such a network could provide researchers with information and a means to exchange it. Services such as file transfer, computer conferencing, electronic mail, and bulletin boards could be available. It could also stimulate the formation of major new databases and new scientific opportunities, and facilitate access to remote resources, including large-scale computing resources.

The proposed linking of NSFnet with a national research network raises several questions and issues:
● What are the goals of a national research network?
● Is NSFnet the appropriate base for a national research network?
● Are the goals of NSFnet and a national research network compatible?
● Is the proposed design of a national research network the most feasible approach?
● What is the Federal role in the development of this national network?
● Is NSF the appropriate lead agency for this endeavor?

Some of these questions are more easily answered than others, but all merit and need thorough discussion. What these questions illuminate is the need, in any discussion of a national network, to keep sight of the issues of advanced scientific computing and access to these resources. NSFnet is developing quickly, and choices made today for pilot projects and other elements will affect the future configuration of a national research network.

As NSFnet develops and is used more and more by the user communities, NSF will be called on to be a network manager. But, with one exception, the agency has not managed the day-to-day operations of a large-scale project following the project's early stages. In the near future the agency will need to decide what role it will play as NSFnet evolves. NSF-OASC must soon decide which of several courses of action it should take in the management of NSFnet. Three are outlined below:

1. Retain all management operations within the OASC, including day-to-day operations (network operations), user services, financial services, etc. Under this option, OASC will be responsible for all management aspects of the network, including interactions with all users and other networks.
2. Identify and select a private firm with previous experience in the management of a network to manage daily operations of NSFnet, including interactions with users and other networks. Under this arrangement, NSF would retain overall policy development responsibilities.
3. Create a Board of Users that includes representatives from government agencies with participating networks such as DOD-DARPA-ARPANET, individuals from other participating networks such as CSNET, and users, including the center consortiums of NSFnet, to assist NSF with policy development/direction. NSF would retain lead authority and be the "chairman" of this Board of Users. A qualified outside contractor could also be made responsible for network management, user services, etc.

Regardless of the style of management chosen by NSF for NSFnet, several other issues need to be addressed in the near future by the agency. For NSFnet to be successful as it is currently designed, networks wishing to join NSFnet must either employ interim internetworking standards now, or agree to change to these standards in the future. This shift to compatible protocols and standards must be tightly managed and centrally coordinated for the strategy to succeed.

The proliferation of government and private networks, many of which use a variety of technologies, standards, and protocols, may also prompt concern in the near future. To address the issue, FCCSET has recently formed a networking subcommittee. It is charged with examining both individual agency networks and the community networks that are necessary to fulfill agency mission requirements and goals. The committee will also examine the feasibility of a single computer network infrastructure for supercomputer access and research collaboration in the United States. The creation of NSFnet is seen as a unique opportunity for coordinating with other Federal agency networks.

Finally, the industrial community as such has not been included in the NSFnet plans. This may cause two problems: first, users within this community may lack easy access to facilities, including the network; and second, NSFnet communities may not be able to tap private sector institutions with large-scale resources.

44 National Science Foundation, Report of the Panel on Large Scale Computing in Science and Engineering, 1982.
45 Peter Denning, "The Science of Computing: Computer Networks," American Scientist, March-April 1985, p. 127.