
Defending Secrets, Sharing Data: New Locks and Keys for Electronic Information

October 1987

NTIS order #PB88-143185

Recommended Citation: U.S. Congress, Office of Technology Assessment, Defending Secrets, Sharing Data: New Locks and Keys for Electronic Information, OTA-CIT-310 (Washington, DC: U.S. Government Printing Office, October 1987).

Library of Congress Catalog Card Number 87-619856

For sale by the Superintendent of Documents, U.S. Government Printing Office, Washington, DC 20402-9325 (order form can be found in the back of this report)

Foreword

Government agencies, private sector organizations, and individual citizens are increasingly using sophisticated communications and computer technology to store, process, and transmit valuable information. The need to protect the confidentiality and integrity of such data has become vital. This report examines Federal policies directed at protecting information, particularly in electronic communications systems. Controversy has been growing over the appropriate role of the government in serving private sector needs for standards development and, particularly, over the appropriate balance of responsibilities between defense/intelligence agencies and civilian agencies in carrying out this role. In defining these roles and striking an appropriate balance, both private sector needs, rights, and responsibilities, on one hand, and national security interests, on the other, need to be carefully considered.

This report examines the vulnerability of communications and computer systems, and the trends in technology for safeguarding information in these systems. It reviews the primary activities and motivations of stakeholders such as banks, government agencies, vendors, and standards developers to generate and use safeguards. It focuses on issues stemming from possible conflicts among Federal policy goals and addresses important trends taking place in the private sector.

OTA prepared the report at the request of the House Committee on Government Operations and the Subcommittee on Civil and Constitutional Rights of the House Committee on the Judiciary. It is the second component of OTA's assessment of new communications technologies. The first component, The Electronic Supervisor: New Technologies, New Tensions, was published in September 1987.
In preparing this report, OTA drew upon studies conducted by OTA project staff, contractor reports and consultants, a technical workshop, and interviews with Federal and private sector representatives and those involved in research, manufacturing, financial services, consulting, and technical standards development. Drafts of this report were reviewed by the OTA advisory panel; by officials of the Department of Defense, the National Bureau of Standards, the General Services Administration, the Department of the Treasury, and other government agencies; and by interested individuals in standards setting organizations, trade associations, and professional and technical associations. OTA appreciates the participation and contributions of these and many other experts. The report itself, however, is solely the responsibility of OTA.

JOHN H. GIBBONS Director

Defending Secrets, Sharing Data: New Locks and Keys for Electronic Information

Advisory Panel

Granger Morgan, Chairman Carnegie Mellon University

Peter Arment, Division of Telecommunications, State of New York
Robert R. Belair, Kirkpatrick & Lockhart
Jerry J. Berman, Chief Legislative Counsel, American Civil Liberties Union
H.W. William Caming, Consultant
Robert H. Courtney, Jr., President, RCI
Harry B. DeMaio,* Director, Data Security Program, IBM Corp.
John Harris, Special Assistant to the President, American Federation of Government Employees (AFL-CIO)
James M. Kasson, Vice President, Rolm Corp.
Steven Lipner, Digital Equipment Corp.
Gary T. Marx, Professor of Sociology, Massachusetts Institute of Technology
Robert Morris,** Chief Scientist, National Computer Security Center, National Security Agency
Susan L. Quinones, Condello, Ryan, Piscitelli
Virginia du Rivage, Research Director, 9 to 5 National Association of Working Women
Forrest Smoker, Director, Corporate Telecommunications, North American Phillips Corp.
Willis H. Ware, Corporate Research Staff, The Rand Corp.
Lawrence Wills, Director, Data Security Program, IBM Corp.
Fred H. Wynbrandt, Assistant Director, Criminal Identification and Information Branch, State of California

**Ex officio.

NOTE: OTA appreciates and is grateful for the valuable assistance and thoughtful critiques provided by the ad- visory panel members. The panel does not, however, necessarily approve, disapprove, or endorse this report. OTA assumes full responsibility for the report and the accuracy of its contents.

OTA Project Staff

Defending Secrets, Sharing Data: New Locks and Keys for Electronic Information

John Andelin, Assistant Director, OTA Science, Information, and Natural Resources Division

Fred W. Weingarten, Program Manager Communication and Information Technologies Program

Project Staff
Charles K. Wilk, Project Director
Karen G. Bandy, Analyst
Jim Dray, Research Analyst
Robert Kost, Analyst
Mary Ann Madison, Analyst
Joan Winston, Analyst
Michael McConathy, Intern
Paul Phelps, Editor
Jeffrey P. Cohn, Editor

Administrative Staff
Elizabeth A. Emanuel, Administrative Assistant
Audrey D. Newman, Administrative Secretary
Sandra Holland, Secretary
Rebecca A. Battle, Secretary

Contents

Page
Chapter 1. Executive Summary ...... 3

Chapter 2. Introduction ...... 13

Chapter 3. The Vulnerabilities of Electronic Information Systems ...... 23

Chapter 4. Security Safeguards and Practices ...... 51

Chapter 5. Improving Information Security ...... 95

Chapter 6. Major Trends in Policy Development ...... 131

Chapter 7. Federal Policy Issues and Options ...... 151
Appendix A. Requesting Letters ...... 163
Appendix B. National Policy on Protection of Sensitive, but Unclassified Information in Federal Government Telecommunications and Automated Information Systems ...... 165

Appendix C. The Data Encryption Standard ...... 168

Appendix D. Message Authentication, Public-Key Cryptography, and Digital Signatures ...... 174
Appendix E. Acronyms ...... 182
Appendix F. Contributors and Reviewers ...... 183

Chapter 1
Executive Summary

CONTENTS

Page
The Need for Information Security ...... 4
Safeguard Technology ...... 5
Users' Needs and Actions ...... 6
The Role of the Federal Government ...... 7
Policy Alternatives ...... 9

Chapter 1
Executive Summary

As society becomes more dependent on computer and communications systems for the conduct of business, government, and personal affairs, it becomes more reliant on the confidentiality and integrity of the information these systems process. Information security has become especially important for applications where accuracy, authentication, or secrecy are essential.

Today's needs for information security are part of a centuries' long continuum that shifts in emphasis with changing technology and societal values. Modern electronic information systems are expanding the need for both familiar and new forms of information security.

Developing adequate information security technology is a challenging task. This task is further complicated since some of these evolving needs can only be satisfied with technology that must itself be kept secret, according to Department of Defense sources, because revealing it could be damaging to U.S. intelligence operations.1 This situation raises the practical question of whether safeguards designed for use by defense and intelligence agencies can meet the needs of commercial users without jeopardizing U.S. intelligence objectives, i.e., whether the National Security Agency (NSA) can reconcile its traditional secret posture with the openness needed to solve nondefense problems. It also raises the broader issues of the appropriate role of defense and intelligence agencies in civilian matters, and how openness and free market forces can coexist with secret operations and controls on sensitive information.

Policy for information security, long dominated by national security concerns, is now being reexamined because of its broadening effects on nondefense interests. At the center of the current controversy is the appropriate role of the Federal Government in information security. The immediate policy questions focus on whether NSA, primarily an intelligence agency, or the National Bureau of Standards (NBS), a civilian agency, should be responsible for developing information security for nondefense applications. A fundamental issue is how to resolve conflicts involving the boundary between the authority of the legislative and executive branches to make policy when national security is a consideration; a topic with implications extending well beyond the narrow confines of information security policy.

A separate, but related dimension to policymaking involves recent efforts to provide additional Government controls on unclassified information in computer databases, some Federal, some commercial. Proponents of greater Government controls argue that these databases make information so readily available to foreign governments, competitors, and those having criminal intent, that uncontrolled access to them is a threat to national security.

Congress is responding to these issues by examining alternative Federal roles in information security. Each of the three options for providing leadership—through NSA, NBS, or greater reliance on the private sector—has its own particular drawbacks and none is likely to completely satisfy all national objectives.

There are a number of national interests to be accommodated by policy makers. An optimum outcome would maximize the ability of free market forces to develop and apply technology to meet users' diverse and unfolding needs for information safeguards, while avoiding unnecessary restrictions on trade, innovation, and the free flow of information as well as compromises to the Nation's security.

1The terms intelligence and intelligence operations are used throughout this assessment to refer to signals intelligence.

THE NEED FOR INFORMATION SECURITY

The need for information security has existed for thousands of years, but the advent of electronic information systems—telegraph and telephone, sound and image recording, computers and databases—has reemphasized the need for traditional safeguards and created a need for new ones. Early concerns tended to focus on controlling access to information and protecting its confidentiality.

Modern computer and communications systems are being used in ways that often require those using them to authenticate the accuracy of data, verify the identity of senders and receivers, reconstruct the details of transactions, and control access to sensitive or private data. As the use of these systems increases, the vulnerabilities, threats, and risks of misuse have become clearer, and information security has become a prominent issue for many Government agencies and private users.

The computer and communications technologies on which these information systems are built, however, were not developed originally with information security in mind. They were designed for efficient and reliable service in the presence of accidental error, rather than intentional misuse, and little attention was given to protecting confidentiality. As one result, the public communications network has always been vulnerable to exploitation by those with appropriate resources (see below).

Technology can increase or decrease the vulnerability of communications to misuse. Microwave radio and cellular telephones have both increased vulnerability; optical fibers have decreased it. Still greater changes may be ahead as digital communications come into wider use.

Increases in computing power and decentralization of computing functions have increased the vulnerability of computer and communications systems to unauthorized use. Two types of misuse should be distinguished: misuse by those not authorized to use or access systems and misuse by authorized users. For many public and private organizations, the latter problem is of greater concern.

The level of effort, expense, and technical sophistication needed to gain unauthorized access to computer or communications systems, even when the system being attacked employs no special safeguards, can vary widely. Some forms of covert access, such as wiretaps, intercepting mobile telephone conversations, or logging into computers with easily guessed passwords, can be achieved with very limited resources. Others, such as those intended for targeted and consistently successful unauthorized access, can require greater resources due to inherent barriers in the design of these systems. Systems protected by appropriate safeguards can deny access even to dedicated foreign intelligence agencies.

Users of computer and communications systems have widely different perceptions of the threats against which protection is needed.
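The "easily guessed passwords" described above remain one of the cheapest avenues of covert access. A minimal sketch of the idea, in which the word list and function name are illustrative assumptions rather than anything drawn from the report:

```python
# Illustrative only: test a candidate password against a short list of
# commonly chosen passwords, the kind of low-cost guessing the report
# describes. A real checker would use a far larger dictionary and also
# try variants (appended digits, substituted characters, and so on).
COMMON_PASSWORDS = {"password", "123456", "letmein", "qwerty", "admin"}

def is_easily_guessed(password: str) -> bool:
    # Case differences alone do not defeat a dictionary attack,
    # so normalize before comparing.
    return password.lower() in COMMON_PASSWORDS

print(is_easily_guessed("LetMeIn"))   # trivial variant of a common word
print(is_easily_guessed("k7#pV9!x"))  # random strings resist list attacks
```

The same check serves either side: an attacker uses it to enumerate cheap guesses, while a defender uses it to reject weak passwords before they are set.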

Some users protect their systems only against unintentional error or amateur computer hackers. Others guard against misuse by their own employees, outsiders, or the sophisticated intelligence agencies of foreign countries.

There are few publicized cases of communications interception and most of these deal with the interception of government communications by foreign intelligence agencies. Not surprisingly, most commercial and private users, under ordinary circumstances, are not greatly concerned about their communications, particularly within the United States, being intercepted by foreign governments or others. Indeed, many businesses are concerned primarily with the integrity of certain of their business information and, in other cases, with the confidentiality of their sensitive information.

Early computer systems were designed to be used by trained operators in reasonably controlled work environments; therefore, only local access to the systems was of concern. Today's systems, in contrast, are often designed to be used by, almost literally, anyone from anywhere. With this ease of access to computers, new problems have emerged, both from hackers and other unauthorized users, and from employees authorized to use the systems. Available data suggest that the damage done by computer hackers to poorly safeguarded systems is less severe than originally thought, and that actual and potential misuse from employees who are authorized to use the systems is far more significant.

On the other hand, NSA is concerned with foreign intelligence gathering, a concern that has motivated it to launch programs to improve the security of nondefense computer and communications systems.

Thus, even though virtually all users have concern for some combination of confidentiality, integrity, and continuity of service, the business community and the Government agencies that deal with it often have a very different outlook and need than defense and intelligence agencies when it comes to safeguarding information in computer and communications systems. This difference is one reason why some of the business community has been reluctant to accept safeguard technologies based on NSA's assessment of needs or that are tightly controlled by NSA.

Safeguard Technology

The private sector is developing a number of ways to safeguard information in computer and communications systems. These include technologies to encrypt data to make it confidential and to control access to computer systems (such as with personal identification techniques), as well as to audit system activity and other administrative procedures. In many cases, commercial safeguards for these systems are still evolving, as are users' understanding of their needs for them.

The use of information safeguards, properly applied, can vastly increase the level of resources required for potential adversaries to successfully gain access to protected systems. Some safeguards require two or more persons, often trusted employees, to collude in order to gain unauthorized access, while others leave audit trails to identify how the system was misused and by whom. But technical safeguards alone cannot protect information systems completely; effective management policies and administrative procedures are also needed.

Safeguard products are based both on adaptations of existing technology and on innovations. Some of the approaches to controlling access, for example, rely on the use of passwords or hand-geometry measurements. Techniques for authenticating messages include those that make use of newly developed mathematical techniques called public-key cryptography and electronic procedures for providing "digital signatures" to verify the identity of the sender of a message.

As is already becoming clear with cryptography, innovation is especially important for the evolution of new applications of information security. The capabilities now evolving will allow advances in the way many electronic transactions take place, from digital signatures and legally enforceable electronic contracts to improved individual and corporate accountability and assured confidentiality of transactions. The potential of cryptography and related mathematical techniques for transforming the ways in which automated transactions are accomplished has only begun to be explored for applications in finance, commerce, law, and government.

Users' Needs and Actions

Commercial and other users want greater information security to reduce fraud, embezzlement, and errors; cut the costs of operations; and protect proprietary and private data. Users have begun to incorporate information safeguards in a gradually expanding range of applications. For example, information security is being applied in the banking industry to reduce errors and opportunities for fraud, and in other industries as part of an increasing reliance on electronic, rather than paper-based, transactions. These electronic transactions allow businesses to simplify paper work and reduce inventory costs.

Although there are significant differences in the needs for information security even among users within the same industry, civilian users often focus on data integrity. They also tend to be especially sensitive to the importance of the ease of use and cost-effectiveness of safeguards. Many defense needs, too, resemble those of civilian users, but in addition, some defense functions, especially intelligence activities, have a primary need for confidentiality. These latter needs must be ensured, even if they entail higher cost or lowered ease of use.

Business users have tended to consolidate their requirements for common information safeguards through voluntary participation in the activities of U.S. and international organizations that develop open public standards. In contrast, NSA sets its own standards in a process that is sometimes open to the public (as is typical for computer security) and sometimes not (as is typical for communications security). These and other differences raise the question of whether information safeguards designed by and for the defense and intelligence agencies are well suited to the needs of commercial and other users.

THE ROLE OF THE FEDERAL GOVERNMENT

The Federal Government has played an active role in the development of information safeguards. NSA was established to unify U.S. signals intelligence operations against foreign communications and to protect U.S. military, intelligence, and diplomatic communications against foreign government intelligence gathering efforts. As NSA's concerns expanded to include computer security, the agency has begun to provide technological leadership for civilian uses of information safeguards, presumably in ways that minimize the impact on its foreign intelligence operations.

Federal policy for information security has long been dominated by national security interests and controlled by DoD and NSA; the Government's activities in providing standards and specifications, certifying equipment, and developing secret cryptographic algorithms have made it influential in the decisions of some industries about their use of information safeguards.

In addition, the National Bureau of Standards has played a central role in setting information security standards for civilian Government agencies and certifying commercial products. NBS's role stems from the Brooks Act of 1965, which authorized it to set standards for computers used by Government agencies.

NBS, with the active technical support of NSA, spearheaded the development of a national standard for cryptography, the Data Encryption Standard (DES). DES, which was adopted by NBS in 1977, has become the basis for many private cryptographic standards. It is also the standard in use by other civilian Government agencies. In addition, both NBS and NSA have facilitated the entry of cryptographic-based safeguards into the market by certifying and endorsing commercial products and developing guidelines for their use.

A civilian agency, NBS, has thus been active in the development of computer security standards since the mid-1970s. Recent policy directives, however, have shifted control back to DoD and NSA, raising questions of the boundary between civilian and military authorities.

In the mid-1980s, changing Government policies provided new direction for the Federal role in, and leadership for, information security. National Security Decision Directive 145 (NSDD-145), issued in 1984, expanded Federal concerns to include "safeguarding systems which process or communicate sensitive information from hostile exploitation," established a high-level interagency group to implement the new policy, and assigned key responsibilities to the Department of Defense and NSA.

One result of NSDD-145 was to authorize NSA to develop information safeguards for Government agencies to protect unclassified information. In effect, this meant that responsibility for certifying DES as a national standard and other safeguard technologies was transferred from NBS to NSA. In a major shift in policy, NSA announced in 1986 that it would no longer certify DES-based products for Government use beginning in 1988. Instead, NSA said it will supply its own, secret cryptographic designs for use by U.S. companies and civilian Government agencies—a move that has raised some industry concerns because it might result in restrictions on the use of equipment embodying these designs and it might also allow NSA itself to eavesdrop on corporate communications. There has also been controversy about restrictions on the export of cryptographic equipment embodying classified technology.

This shift of responsibilities from NBS to NSA raised several other questions. One involves the efficacy of NSA-developed standards and guidelines for users outside the national security community. Another question concerns the scope of NSA's activities in light of NBS's legislated responsibilities under the Brooks Act.

In a later directive2 intended to implement NSDD-145, the National Security Council placed new controls on what it called unclassified, but sensitive information in various Government information systems and commercial databases. These efforts raised such a protest from scientific and civil liberties organizations and the business community that the directive was rescinded during the course of congressional hearings in 1987 and NSDD-145 itself was put under review.

These changes in Federal policies on information security indicate an expanding sphere of "national security" concerns—a concept whose definition is subject to interpretation and change. The changes point out clearly that Federal policy for information security, until recently a topic of little concern beyond the Government's defense and intelligence communities, now has significant impact on much broader areas of national interest, including commerce, innovation, free flow of information, and civil liberties. They also indicate that tensions are likely to recur as the use of automated information systems continues to expand.

2National Policy on Protection of Sensitive, but Unclassified Information in Federal Government Telecommunications and Automated Information Systems, National Security Council, Oct. 29, 1986.
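DES itself is a 16-round Feistel cipher operating on 64-bit blocks under a 56-bit key. The sketch below shows only the Feistel structure, on 32-bit blocks with 4 rounds and a made-up round function (the real DES round function, S-boxes, and key schedule are omitted); it illustrates why the same routine, run with the subkeys in reverse order, decrypts:

```python
# Toy Feistel network: 32-bit blocks, 4 rounds, invented round function.
# Real DES uses 64-bit blocks, 16 rounds, S-boxes, and a 56-bit key schedule.
MASK16 = 0xFFFF

def round_fn(half: int, subkey: int) -> int:
    # Hypothetical mixing step standing in for DES's f-function.
    return (half * 31 + subkey) & MASK16

def feistel(block: int, subkeys) -> int:
    left, right = block >> 16, block & MASK16
    for k in subkeys:
        # Classic Feistel round: halves swap, one half is XOR-masked.
        left, right = right, left ^ round_fn(right, k)
    # Swap halves on output; this is what lets one routine both
    # encrypt (subkeys forward) and decrypt (subkeys reversed).
    return (right << 16) | left

def feistel_encrypt(block, subkeys):
    return feistel(block, list(subkeys))

def feistel_decrypt(block, subkeys):
    return feistel(block, list(reversed(subkeys)))

subkeys = [0x1F, 0x2E, 0x3D, 0x4C]   # arbitrary illustrative subkeys
ciphertext = feistel_encrypt(0x12345678, subkeys)
assert feistel_decrypt(ciphertext, subkeys) == 0x12345678
```

Note that the round function never needs to be invertible; the XOR-and-swap structure guarantees that reversing the subkey order undoes encryption, which is one reason the Feistel design was attractive for hardware of the period.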

POLICY ALTERNATIVES

Federal policy for the security of information in computer and communications systems seeks to achieve a number of objectives ranging from protecting national security to fostering development of private sector competence to meet its own needs. Policy might also seek to establish a structure within the Government that can provide leadership and standards both for defense and intelligence purposes and for the business community. Although there are often strong differences of opinion on the merits of specific Federal policies, there seems to be broad agreement on the types of goals that such policies might aim to achieve. Some of these goals are to:

• foster the ability of the private sector to meet the evolving needs of businesses and civilian agencies for information safeguards;
• minimize risks to intelligence capabilities resulting from independent, private sector developments;
• clarify the roles of Federal agencies concerning safeguard technology, particularly those of NSA and NBS;
• promote competition, innovation, and trade;
• separate, where practical, defense and intelligence agencies' missions from those of the private sector and civilian agencies; and
• minimize or reduce the tensions between Federal policies and private sector activities.

The basic alternatives for policy center around the relative roles of NBS, NSA, and the private sector in providing leadership in the technological development and use of safeguards for unclassified electronic information. The options are:

Option 1. Centralize Federal activities relating to safeguarding unclassified information in Government electronic systems under the National Security Agency.

Option 2. Continue the current practice of de facto NSA leadership for communications and computer security, with support from the National Bureau of Standards.

Option 3. Separate the responsibilities of NSA and NBS for safeguard development along the lines of defense and nondefense requirements.

The bill currently being considered by Congress (HR 145) is a variation of option 3 and is an attempt to resolve, by legislative means, policymaking for information security. One of its principal results is that it would clarify the roles of NBS and NSA, and tend to separate civilian and defense interests. Among its main shortcomings is the absence of a capability to support unclassified research in safeguard technology. This capability, perhaps more than any other single factor, would strengthen the ability of the private sector to satisfy its own needs for information security and reduce dependence on the Government.

In option 3, additional choices can be made:

A. Provide Federal support to the private sector to specify, develop, and certify safeguards for business and civilian agencies. NBS would be the focal point for all safeguard standards for unclassified information; NSA would remain the focal point for classified information.

B. Allow free market forces to develop safeguards for nondefense needs, with NBS acting as the focal point for Government needs for safeguards for unclassified information. NSA would satisfy the requirements of Department of Defense agencies and their contractors, and provide technical advice for other users.

Each of the three broad options has shortcomings. Essentially, the choice depends on whether policymakers prefer to tolerate greater tensions, a blurred division between defense-intelligence and civilian matters, and more constrained private sector technical capabilities, or to take larger risks that intelligence capabilities will be damaged by proliferation abroad of U.S. safeguard technology.

OTA's evaluation indicates that centralizing authority in NSA for developing safeguards for unclassified information in Government systems (option 1) or maintaining the current, blurred relationship between NBS and NSA (option 2) would be the least effective in minimizing tensions and in separating defense and intelligence missions from civilian matters. On the other hand, U.S. foreign signals intelligence gathering operations may be poorly served if NSA is not party to all safeguard development (option 3).

Independent of institutional arrangements in the United States, however, there are also risks to our intelligence that stem from sources outside the control of U.S. policy, such as the policies of foreign governments, actions taken by international business interests, and the effects of foreign innovation.

There are inherent tensions between U.S. intelligence interests and evolving nondefense needs for information security technology. In addition, there are enduring conflicts involved in balancing national security and broader national interests. Potential conflicts also exist between the tendency to restrict access to unclassified, but sensitive information, and concern for the free flow of information and constitutional rights. Perhaps the optimum result that legislation should be expected to achieve is to provide a clear policy basis against which to measure future imbalances.

In addition, any option that raises the cost of safeguards, impairs user operating efficiency, or results in incompatible standards for defense and nondefense users, will discourage the development and use of commercial products.

There are no options for Federal policy that clearly and simultaneously foster all national objectives without costs to others. The alternatives for implementing policy differ mainly in the source of national leadership for the development and nondefense use of safeguard technology, the level of Federal encouragement or control of private sector innovation, and in flexibility to adjust to changing needs of commerce and society. For policies to meet the evolving needs of the Nation, they will have to be flexible and balance various national interests.

Three main observations result from OTA's analysis:

1. Excessive accommodation of either commercial or defense and intelligence concerns could prove damaging to overall U.S. interests.
2. Policies that are inflexible, based primarily on defense and intelligence interests or on Government control of technological advances in the private sector, are likely to create substantial tensions with the widening range of other national and international interests affected by them.
3. A process for weighing competing national interests is needed. Centering policymaking in the Department of Defense alone and, in particular, NSA would make that difficult.

Chapter 2
Introduction

CONTENTS

Page
Society's Changing Needs for Information Security ...... 13
Information Security and Government Policy ...... 15
Importance of Information Security Technology and Policies ...... 16
Business Interests in Information Security ...... 19
Conclusions ...... 19

Chapter 2
Introduction

On a day nearly 4,000 years ago, in a town called Menet Khufu bordering the thin ribbon of the Nile, a master scribe sketched out the hieroglyphics that told the story of his lord's life and in so doing he opened the recorded history of cryptology.
–David Kahn, The Codebreakers: The Story of Secret Writing

Information technology is revolutionizing society as profoundly as mechanical technology did in creating the industrial revolution. As a result, we are increasingly dependent for society's everyday functioning on electronic ways to gather, store, manipulate, retrieve, transmit, and use information. By all accounts, the importance of automated information systems and the communications systems that link them will continue to increase and transform the way we conduct our government, business, scientific, and even personal affairs.

This increasing dependence on information technology is creating a need to improve the confidentiality and integrity of electronic information, i.e., its security, so that computer and communications systems are less vulnerable to intentional and accidental error or misuse. This will allow us to use the new systems with confidence in a widening range of applications, such as electronic contract negotiations, with assurance that private, proprietary, or intellectual information entrusted to them will be properly protected.

Progress is being made in developing techniques for satisfying these needs. However, both the pace and direction of this progress will be affected by two factors:
● the traditional use of Federal information security policy, often as a means of implementing national security goals; and
● the need to accommodate the variety of national interests that are affected by Federal policy on information security.

To put the topic in perspective, just as information security is a small, but vital part of the larger framework of information technology, Federal policy on information security is a reflection of broad national interests, rather than that of national security alone.

SOCIETY'S CHANGING NEEDS FOR INFORMATION SECURITY

The need for information security is not new. It dates back hundreds, even thousands, of years. Methods for conveying confidential messages were used in ancient Greece and much of the Western world by kings, generals, diplomats, and lovers. Today, the governments of most developed nations make extensive use of encoding techniques to keep their sensitive electronic communications secret.

Technology itself has long played a leading role in causing certain attributes of information security to become highlighted. The introduction of the telegraph brought concern about eavesdropping. Inexpensive sound and video recording capabilities raised concerns about unauthorized reproduction. And the proliferation of electronic storage quickly brought questions of how to prevent misuse of electronic data. Indeed, most of the attributes of information that are of concern today—confidentiality, accuracy, accountability—have long existed. Technological advances have not only modified their importance but have also introduced fundamentally new issues.

Today's technology provides new capabilities that raise both familiar and new concerns for security. High on the list of current concerns are the need for controls on capabilities for accessing, altering, and duplicating electronic data, and the ease of retrievability and searchability of databases. Still other concerns include ensuring the accuracy of messages and verifying their origin, and providing means for auditing or reconstructing transactions. These concerns have arisen both in Government agencies and in businesses worldwide because traditional physical security measures are limited in their ability to prevent misuse of information in today's automated world.

In addition, as information technology increasingly substitutes for paper-based systems, it is important to retain familiar capabilities of the older technology. In fact, many of the developments in security are attempts to imbue in modern information systems parallels to the more familiar safeguards and procedures of paper-based and face-to-face forms of business transactions that we have become accustomed to using—as discussed later.

Some security techniques are adaptations of earlier ones, while others are genuine innovations. Modern equivalents of such traditional security tools as passwords, notary public seals, codebooks, physical identification, separation of authority, and auditable bookkeeping procedures are all being used or considered today, separately or in combination, to contain misuse of electronic information.

Prominent among the recent innovations are public-key cryptography and the "zero knowledge" proof. The former may be used to establish private communications between previously unacquainted parties, as well as to provide the electronic equivalent of a personal signature. The latter can be used to demonstrate that a person knows a piece of information without revealing the information in the process. For example, it could be used to demonstrate knowledge of a solution to a "hard problem" without revealing anything about the specific solution method. Each of these innovations has broad implications for new applications of information technology.

Such encryption-based safeguards provide a basis for today's sophisticated information security technology and an expanding range of commercial applications. Banks are beginning to use these technologies to safeguard electronic fund transfers. Similarly, some companies are beginning to use them to protect the confidentiality of electronic information and to replace paper-based business transactions with less expensive electronic equivalents. Expanding these capabilities to include proof of message receipt and acceptance, and protection of the anonymity of those taking part in transactions, is likely to require further innovation.
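The two innovations just described can be made concrete with toy numbers. The Python sketch below is illustrative only and not part of the original report: the first half shows a textbook RSA-style signature (the "electronic equivalent of a personal signature" mentioned above), and the second shows one round of a Schnorr-style identification protocol, a later construction in the zero-knowledge family. All parameters are far too small to be secure.

```python
import random

# --- Electronic signature via public-key cryptography (textbook RSA, toy primes) ---
p, q = 61, 53                        # secret primes; real keys use primes hundreds of digits long
n = p * q                            # public modulus (3233)
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent, derivable only with p and q

def sign(m: int) -> int:
    # In practice m would be a hash of the document; here it is a small number < n.
    return pow(m, d, n)              # only the private-key holder can compute this

def verify(m: int, signature: int) -> bool:
    return pow(signature, e, n) == m  # anyone holding the public key (n, e) can check

sig = sign(65)
assert verify(65, sig)               # genuine signature accepted
assert not verify(66, sig)           # altered message rejected

# --- Zero-knowledge proof of knowledge (Schnorr-style identification, toy group) ---
P, G = 2267, 2                       # small prime modulus and generator; real groups are huge
x = 1234                             # prover's secret (the "solution to a hard problem")
y = pow(G, x, P)                     # public value; recovering x from y is the discrete-log problem

r = random.randrange(1, P - 1)
t = pow(G, r, P)                     # 1. prover sends a commitment
c = random.randrange(0, 128)         # 2. verifier replies with a random challenge
s = (r + c * x) % (P - 1)            # 3. prover responds; s reveals nothing about x by itself
assert pow(G, s, P) == (t * pow(y, c, P)) % P   # 4. verifier accepts without learning x
```

In both cases the check can be performed by anyone, but producing a passing value requires the secret, which is the property the report's discussion of signatures and zero-knowledge proofs turns on.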

INFORMATION SECURITY AND GOVERNMENT POLICY

Federal policy for the security of electronic information was, until recently, an obscure topic having little public interest. In the first place, virtually all such policy was related to the secrecy of military, intelligence, and diplomatic information. Second, the authority and expertise for keeping information secure rested with defense and intelligence agencies that normally do not engage in open policymaking. Moreover, except for defense contractors, Federal policies had little effect on the public or on private businesses.

The Government's national security focus created an incentive to control the proliferation abroad of communications safeguard products and, in fact, to control the technology itself. The purpose was to deny foreign adversaries access to valuable U.S. technology and to protect the viability of U.S. foreign intelligence operations.

During the 1970s, the National Bureau of Standards (NBS) began to develop computer security standards for use by Government agencies based on its authorities stemming from the Brooks Act of 1965. In 1977, with technical assistance from the National Security Agency (NSA), NBS adopted the Data Encryption Standard (DES) as the national standard for cryptography. For the first time, a published cryptographic standard became available for civilian agencies, and it quickly was adopted by business users and the American National Standards Institute as the basis for many industry standards. NBS also began to validate commercial products implementing DES, thereby increasing users' confidence in the products' conformance with the Federal standard. As a consequence, DES is gradually becoming used for many applications.

Interest in information security is now worldwide and an active area of research and development in western European countries and Japan. DES has been considered as an international standard during recent years in forums composed of representatives from international businesses and governments (see ch. 5).

The proliferation of information technology has made more sensitive data accessible to more users, thereby creating another form of new vulnerability to misuse. In order to limit potential damage to U.S. interests, particularly from foreign intelligence agencies, the executive branch has sought to control access to unclassified information that it deemed sensitive. Although the definition of such information has been open to considerable debate that is still unresolved, it may include proprietary information filed with defense agencies and the Environmental Protection Agency, economic data collected by the Commerce and Treasury Departments, and personal data kept by the Department of Health and Human Services.

Policy directives issued by the executive branch in 1984 and 1986, and ensuing congressional hearings in early 1987, have significantly increased public concern over Federal information security policy. The expanding pattern of defense-intelligence interests as a central focus in the formulation of policy is seen as competing with other major national interests and has become the subject of public debate. The focus of the debate has been on the potential impact of these policies on some fundamental tenets of American government: the separation of and appropriate balance between defense and civilian authority, constitutional rights, open science, and Government controls on public access to information. The debate also raises the question of how to resolve conflicts involving the boundary between the authorities of the legislative and executive branches in making policy when national security is a consideration. On a more practical level, there are also serious misgivings about the applicability of the security approach taken by the Department of Defense (DoD) to the needs of the private sector.

At the same time that these Federal policies and their effects have been unfolding, trends are visible that may significantly influence commerce and other private sector interests.
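DES, adopted as the Federal standard in 1977, is a 16-round Feistel block cipher operating on 64-bit blocks under a 56-bit key. The Python sketch below is an illustration, not DES itself: it keeps only the Feistel skeleton that makes such ciphers invertible, and replaces DES's actual round function, S-boxes, permutations, and key schedule with a hash-based stand-in.

```python
import hashlib

def round_fn(half: int, subkey: int) -> int:
    # Stand-in for DES's f-function (expansion, S-box substitution, permutation).
    data = (half ^ subkey).to_bytes(4, "big")
    return int.from_bytes(hashlib.sha256(data).digest()[:4], "big")

def feistel_encrypt(block: int, subkeys) -> int:
    left, right = block >> 32, block & 0xFFFFFFFF   # split the 64-bit block in half
    for k in subkeys:                               # DES runs 16 such rounds
        left, right = right, left ^ round_fn(right, k)
    return (right << 32) | left                     # final half-swap, as in DES

def feistel_decrypt(block: int, subkeys) -> int:
    # The same skeleton with the subkeys reversed inverts the cipher exactly,
    # even though round_fn itself is not invertible.
    return feistel_encrypt(block, subkeys[::-1])

subkeys = list(range(1, 17))    # toy key schedule; DES derives 16 subkeys from one 56-bit key
ciphertext = feistel_encrypt(0x0123456789ABCDEF, subkeys)
assert feistel_decrypt(ciphertext, subkeys) == 0x0123456789ABCDEF
```

The notable design property, which DES shares, is that decryption reuses the encryption circuitry with the round keys applied in reverse order.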

IMPORTANCE OF INFORMATION SECURITY TECHNOLOGY AND POLICIES

Interest in information security technology now clearly extends beyond the Federal Government to the private sector as well. Its importance to business and society cannot be gauged adequately by the dollar amount of sales of products, but by the range of applications that the technology makes possible.

Safeguard technology is likely to become a mainstay for facilitating tomorrow's automated world of finance, commerce, and law, much as automated message authentication and verification are now becoming essential for the banking industry worldwide. These technologies are used to authorize transactions, authenticate users, verify the correctness of messages and documents, certify that legitimate transactions have occurred and identify the participants, and protect individual and corporate privacy.

Such applications are likely to be used to establish a legally valid electronic equivalent of the centuries-old, paper-based systems for authorizing access to information, identifying parties to agreements, authenticating letters and contracts, ensuring privacy, and certifying value. In this sense, they will replace such traditional safeguards as letters of introduction, signatures, and seals, and assume an importance difficult to foresee from the limited applications of today.

Both Government and industry are interested in improving the security of information they own or are entrusted with. Two major trends reflect these interests and are bringing attention to the direction of Federal policy. One concerns the Federal Government's need to keep an increasing amount of unclassified information confidential while, at the same time, gathering intelligence from other countries. The question of what information ought to be kept confidential, or have access to it controlled, is not well defined, but subject to judgments concerning potential damage to the Nation's security; examples of such information might include corporate proprietary data that could benefit foreign competitors or data useful to terrorists. The other trend is the evolving and growing need of the private sector to safeguard certain of its information and information resources from theft, destruction, or other misuse.

Federal policy has been formulated both by the executive and legislative branches, sometimes with similar purposes. Policy in information security has often been set by the President, based on national defense needs. This has invariably led to a major role for DoD. Legislation, on the other hand, has also been used to establish policy for information security. The latter has often been based on other national interests, such as the privacy of telephone communications and of data in Government computer systems. Such laws typically have involved civilian agencies in their implementation.

Society's needs and the new demands stimulated by technology are causing these separate policy paths to converge.
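The automated message authentication that banks are adopting can be sketched with a modern keyed-hash construction. The example below uses HMAC, which was standardized after this report was written; the banking scenario, key, and function names are hypothetical illustrations, not any real system's interface.

```python
import hashlib
import hmac

SHARED_KEY = b"key installed at both bank branches"  # hypothetical shared secret

def authenticate(order: bytes) -> bytes:
    # The sender computes a Message Authentication Code from the order and the
    # shared key; without the key, a valid code cannot feasibly be forged.
    return hmac.new(SHARED_KEY, order, hashlib.sha256).digest()

def verify(order: bytes, mac: bytes) -> bool:
    # The receiver recomputes the code and compares in constant time.
    return hmac.compare_digest(authenticate(order), mac)

mac = authenticate(b"transfer $500 to account 1234")
assert verify(b"transfer $500 to account 1234", mac)       # genuine order accepted
assert not verify(b"transfer $9500 to account 1234", mac)  # altered order detected
```

Any alteration of the order in transit changes the code the receiver computes, so tampering is detected even though the message itself is not encrypted, which is exactly the "verify the correctness of messages" capability described above.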

With this convergence, major stresses are becoming visible in the balancing of competing national interests and with the process by which policy is developed.

The focus of information security policy on the military, intelligence, and diplomatic interests of Government has particular significance for the issues of today for two interrelated reasons. First, responsibility for protecting the security of Government electronic information is consolidated within the defense and intelligence communities, where NSA has been given the lead responsibility. The second concerns the broadening scope of executive branch actions taken for reasons of national security. There is a tendency for this concern to include unclassified, but sensitive information.

NSA was created in 1952 as an agency of DoD by secret Executive Order. For decades its existence was not made public, and the only extensive public description of its operations was provided in the book, The Puzzle Palace, which the agency tried to prevent from being published.¹ NSA has been the subject of considerable controversy during the past decade due to its secret operations.²

NSA's functions are a consolidation of missions previously performed by each of the military departments. One of its two main missions is foreign signals intelligence, i.e., gathering information principally by intercepting and decoding electronic communications. It also protects U.S. military and diplomatic communications by enciphering them or making them less accessible to interception by the intelligence agencies of other countries. Such work is classified. NSA's work in developing encryption techniques, however, has made it the undisputed technical leader in the United States. More recently, the agency has widened its scope to include computer security.

Photo credit: Courtesy of NBC News Video Archive
Roof of Soviet embassy in Washington, D.C., showing antennas

Since the 1970s, DoD has become increasingly concerned about the vulnerability of U.S. communications to foreign intelligence activities. As a result, NSA has launched several programs to better safeguard the Government's electronic communications. NSA has also encouraged domestic common carriers to provide tariffed "confidential" communications services for customers and has briefed dozens of U.S. companies on the vulnerability of communications systems to interception.

Some of these Government efforts to reduce vulnerabilities from unauthorized access to communications systems are also creating tensions with other defense and intelligence interests. To the extent that methods to reduce unauthorized access to these systems enter the public domain, they can be used by other countries, thereby damaging NSA's ability to gather intelligence.

Second, where "national security" has generally been used to control classified military and certain diplomatic electronic information, executive branch directives of 1984 and 1986 extend this rationale to encompass unclassified information considered to be sensitive.

A current debate concerns the appropriate agency for Federal leadership for developing security standards for civilian computer systems—NSA or the Department of Commerce's NBS. However, the core issue is more basic. It goes to the question of whether or not a defense agency should control matters that are central to civilian interests, such as commerce and the free market, constitutional rights, and principles of open science. It also involves questions about executive branch authority under the Constitution to set policy based on national security. Yet a third dimension involves society's evolving needs for information security and the appropriate Federal role in accommodating those needs.

The event that triggered the current examination of Federal policy was National Security Decision Directive 145 (NSDD-145), dated September 17, 1984. That executive branch directive established as Federal policy the safeguarding of unclassified, but sensitive information in communications and computer systems that could otherwise be accessed by foreign intelligence services and result in "serious damage to the U.S. and its national security interests."

NSDD-145 also created an interagency management structure to implement the policy. It gave leading roles to the National Security Council, DoD, and NSA. These roles include defining what information to protect, deciding on the appropriate technology for safeguarding unclassified information, developing technical standards, and assisting civilian agencies in determining the vulnerabilities of systems to misuse.

NSDD-145 raised numerous questions from critics in other Government agencies as well as from civilian sources, some of which relate to the broader issues mentioned above. They include concern for:
● intermingling defense and civilian matters;
● public access to Government information;
● the legislated responsibility of NBS to develop computer standards for the Federal Government under the Brooks Act of 1965, as amended;
● private sector development and use of safeguard technology; and
● expanding the responsibilities of NSA in civilian matters, particularly in light of the conflict of interest between its intelligence mission and commercial needs, and its lack of direct public accountability.

The level of public concern was elevated further with the release by the National Security Council in October 1986 of a policy statement defining what information is sensitive and therefore possibly in need of safeguarding.³ The release coincided with well-publicized Government activities aimed at identifying and possibly restricting access by selected foreign governments to unclassified, but sensitive data in Government and commercial automated information systems. As a result, the issue of Government restrictions on public access to unclassified information, whether or not in Government systems, has become a public concern. The statement, though rescinded in early 1987, caused public alarm that illustrated the extent of sensitivities among diverse organizations concerning controls on unclassified information.

Perhaps the major effect of these executive branch policies to date has been to encourage an examination by Congress of the effects of such defense-oriented policies on civilian matters. Legislation has been proposed to reestablish civilian control over the security of unclassified information systems. In the short term, many of the currently prominent issues related to information security policy are likely to be addressed by congressional debate over the proposed legislation, including the respective roles of NBS and NSA in setting standards and the measures to be taken, if any, to control access to unclassified information.

For the longer term, however, the vulnerabilities to misuse of information systems will depend on the development and widespread use of technical, administrative, and related safeguards. The availability of high-quality information safeguards worldwide, especially cryptographic-based systems, on the other hand, will make intelligence gathering more difficult for the United States.

¹James Bamford, The Puzzle Palace (New York, NY: Penguin Books, 1983).
²Ibid.
³National Policy on Protection of Sensitive, but Unclassified Information in Federal Government Telecommunications and Automated Information Systems, National Security Council, Oct. 29, 1986.

BUSINESS INTERESTS IN INFORMATION SECURITY

The question of the extent to which information systems should be protected depends on the various perceptions of threats to those systems. Simply put, U.S. defense and intelligence agencies are concerned about unauthorized access to commercial communications and computer systems by the intelligence organizations of foreign countries, particularly the Soviet Union. However, U.S. businesses or civilian agencies generally do not consider their main risk to be from such sophisticated adversaries.

The range of threats to business information systems is not as broad as that faced by defense and intelligence agencies. Businesses' concerns center mainly on misuse by insiders, competitors, and, to a limited extent, hackers.

Nevertheless, there are many similarities between the various defense and nondefense, as well as between Government and private sector, requirements for information security, although their requirements vary widely. Both need to control access to databases, restrict unauthorized activities, provide audit capabilities, safeguard sensitive data and transactions, and, generally, maintain the integrity of data and continuity of service. Thus, Federal policies that affect the longstanding NBS and NSA roles in developing technology to safeguard information systems will also affect private sector security programs.

For their part, businesses need safeguards that do not unduly slow down or otherwise impair normal business operations; that is, in order to be useful, security measures must be practical and efficient. U.S. firms engaged in international commerce and banking, to be able to use these systems, also must be able to export them to their subsidiaries in other countries.

Concern for cost is an area in which contrasts between defense and intelligence agencies and business interests are even more apparent. Private businesses must remain profitable and competitive, and, therefore, they resist safeguards unless they are cost-effective. Defense and intelligence agencies, because of their missions, are more tolerant of higher costs or of operational impediments that might result from adopting security measures. One of their most important goals is to prevent valuable information from falling into the wrong hands, even if significant trade-offs are involved.

Companies that safeguard their communications seem either to have business interests at risk (e.g., banking and oil exploration firms concerned about unauthorized interception) or are required by the Government to use prescribed safeguards (e.g., Federal Reserve banks and defense contractors). A number of businesses are finding additional reasons to provide some safeguards for information in computers and communications systems. These reasons include prudent management of resources and methods of improving efficiency, as well as preventing the loss of proprietary information or theft of funds. Private businesses, in addition, often are more concerned with information integrity than with confidentiality.

CONCLUSIONS

The need for information security has existed for a long time. The particular attributes of security perceived to be important tend to change emphasis with time and technology, often in ways that are difficult to predict with confidence. Society's ability to satisfy its changing needs for improved security depends on its ability to adapt existing technologies and techniques as well as to innovate (see ch. 4). Government policy can be an important determinant of how, when, and by whom these needs are satisfied.

These conclusions imply that policies predicated solely on solving current security problems are not likely to endure because needs for information security are not static. Further, those based on controlling or restricting private sector actions are likely to damage other societal needs. In other words, flexibility and balance are important objectives of any policy intended to accommodate a wide range of users' needs on a continuing basis. Moreover, it seems apparent that U.S. policies that cannot effectively be enforced internationally risk being overcome by events in other countries.

Further, information security policy has a significance that is colored by different interests. One view sees its significance as relating mainly to the potential for foreign government intelligence gathering via U.S. communications and computer databases and other threats to national security. From a different viewpoint, however, the significance of information security involves even more diverse interests. These include basic democratic principles and civil liberties, as well as commercial business interests.

In addition to these interests, each of which has its advocates, there is at least one other that has no clear advocate—the evolving needs of society for information security. Society's needs for information security have a long history that is continually evolving. Federal policy also has an influence on advances in the technology underlying information security applications, especially when the technology itself is controlled for national security purposes.

Regardless of the viewpoint taken, information technology poses a challenge to Government, industry, and society. Modern information systems and the data within them are vulnerable—they can easily be misused. The challenge is to find ways to reduce the risks to acceptable levels while preserving traditional democratic values and remaining flexible to accommodate diverse and changing needs.

The remainder of this report examines some of the technological foundations for information security and the main policy issues that are now evolving. In order to focus attention on the issues facing Congress, many topics have been treated in a limited way. The report is not about potential disruptions to or recovery from disasters, for example, nor is it about physical security or safeguarding classified information or constitutional rights. Its purpose, instead, is to describe the conflicting national interests that are shaping U.S. information security policies, the special role of cryptography and NSA's intelligence mission, and the potential courses of action.

Chapter 3
The Vulnerabilities of Electronic Information Systems

CONTENTS

Page
Findings ...... 23
Introduction ...... 23
Vulnerabilities of Communications Systems ...... 24
Background ...... 24
Spectrum of Adversaries' Resource Requirements ...... 27
Networks ...... 28
Transmission Systems ...... 29
Other System Components ...... 37
Commercial Availability of Interception Equipment ...... 38
Vulnerabilities of Computer Systems ...... 39
Background ...... 39
Large-Scale Computers ...... 39
Microcomputers ...... 41
Software ...... 42
The Extent of Computer Misuse ...... 44
Typical Vulnerabilities of Information Systems ...... 44

Boxes
Box Page
A. Examples of Historical Concerns for Misuses of Telecommunications Systems ...... 25

Figures
Figure No. Page
1. Spectrum of Adversaries' Resource Requirements v. Technologies ...... 28
2. The Communications Network ...... 30
3. Example of Antenna Directivity Pattern ...... 32
4. Examples of Commercial Equipment for Interception of Microwave Radio Signals ...... 33
5. Typical Fiber Optic System ...... 35
6. Mainframe Computers in Federal Agencies ...... 40
7. Computer Terminals in Federal Agencies ...... 41
8. Trends in Component Density, Silicon Production, and Gallium Arsenide Announcements, 1960 to 1990 ...... 42
9. Microcomputers in Federal Agencies ...... 43
10. Typical Vulnerabilities of Computer Systems ...... 45
11. Technical Safeguards for Computer Systems ...... 47

Tables
Table No. Page
1. Bell System Circuit Miles of Carrier Systems Using Different Transmission Media ...... 29
2. Telephone Company Fiber Applications ...... 34
3. Sales of Large-Scale Host Computers in the United States ...... 40
4. Sales of Personal Computers in the United States ...... 42

Chapter 3
The Vulnerabilities of Electronic Information Systems

FINDINGS

● Today's public communication network is, for the most part, at least as easy to exploit as at any time in the history of telecommunications. The design of the public switched network is such that some parts of it are vulnerable to relatively easy exploitation (wiretaps on cable, over-the-air interception), while others (e.g., fiber optic cable) present greater inherent barriers to exploitation.
● There are, and will likely remain, opportunities for casual, generally untargeted eavesdropping on communications. However, targeted and consistently successful unauthorized access requires greater resources. For systems with sophisticated safeguards, the resource requirements may frustrate even the efforts of national intelligence agencies. However, adversaries with sufficient resources can eventually defeat all barriers except, perhaps, those based on high-quality encryption.
● Users of communications systems face a spectrum of vulnerabilities ranging from those that can be exploited by unsophisticated, low-budget adversaries to those that can be exploited only by adversaries with exceptionally large resources.
● Technological advances may increase the capabilities of adversaries to misuse computer and communications systems, but these same advances can also be used to enhance security.
● Increases in computing power and decentralization of functions have increased exposure to some threats. Two types are important: abuse by intruders who are not authorized to use or access the system, and misuse by authorized users. For many organizations, the latter problem is of most concern.

INTRODUCTION

Unauthorized disclosure, alteration, or destruction of information in computer and communications systems can result from technical failure, human error, or penetration. While each of these is important to users, this chapter focuses on malicious or deliberate unauthorized access and alteration, principally because it is in these areas that the impact of Federal policies is greatest.

Widely different levels of time, money, and technical sophistication are needed to gain unauthorized access to different parts of communications and computer networks. Some forms of covert access—placing wiretaps, intercepting mobile telephone calls, hacking into poorly safeguarded computers—require few resources. Others, such as targeted and consistently successful unauthorized access, require greater resources because of the inherent barriers posed by the complex designs of these systems. For systems with sophisticated safeguards, the resource requirements may frustrate even the efforts of national intelligence agencies.

Today's communications networks make use of diverse technology. This diversity is accompanied by an uneven ease of unauthorized access to different parts of the system. Although security has not been a consideration in network design, today's systems provide some inherent barriers to easy exploitation. However, adversaries with sufficient resources can eventually defeat all barriers, except perhaps high-quality encryption.1 Information in computers also has been vulnerable to malicious disclosure or alteration by various methods of penetration, including misuse by both authorized and unauthorized users. However, significant advances are being made in the technology available for safeguarding computer and communications systems, as discussed in chapter 4.

Many users of communications and computer systems remain unaware or unconvinced of significant threats due to such vulnerabilities. Most users are not now adding safeguards, despite the growing volume and value of information being stored in or transmitted across these systems.

Computer and communications systems are becoming increasingly closely intertwined. Consequently, information security is affected by the operation of all segments of these systems. Communication networks pose one set of vulnerabilities to misuse that centers around unauthorized disclosure of information and modification of data. When computers are linked by communications networks and are remotely accessible, the potential for misuse increases.

This chapter focuses primarily on the vulnerabilities to misuse of communications systems and methods to safeguard against them. It is intended to raise awareness and understanding of some of the technical vulnerabilities of these systems without providing a cookbook for prospective exploiters.2 Because communications and computer designs and applications vary widely, their vulnerabilities are described in general ways in the sections that follow.
1 For the purposes of this report, high-quality encryption techniques, for which there are no known and significant weaknesses or deciphering shortcuts, are considered to be fully secure, in spite of the fact that trial-and-error attacks will yield the unenciphered text (plaintext) with a sufficient number of trials.

2 Although not the subject of this report, it should be noted that simpler and often less expensive ways than electronic eavesdropping can be used to gain access to sensitive information, i.e., by bribing employees or by using spies. Thus, users considering adopting electronic security measures must weigh all sources of potential losses, including those from human and other errors, as well as from dishonest employees.
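The caveat in footnote 1, that a trial-and-error attack succeeds given enough trials, can be given a rough scale. The sketch below is illustrative only: the 56-bit figure is the key length of the 1977 Data Encryption Standard, while the 128-bit key and the rate of one million trials per second are assumptions chosen for the example.

```python
# Rough scale of a trial-and-error (exhaustive key search) attack.
# On average, an attacker must try half of all possible keys before
# hitting the right one. Figures below are purely illustrative.

def expected_trials(key_bits: int) -> float:
    """Average number of keys tried before success: half the keyspace."""
    return 2 ** key_bits / 2

def years_to_search(key_bits: int, trials_per_second: float) -> float:
    """Expected search time, in years, at a given trial rate."""
    seconds_per_year = 365.25 * 24 * 3600
    return expected_trials(key_bits) / trials_per_second / seconds_per_year

# A 56-bit key (the 1977 Data Encryption Standard key size) versus a
# hypothetical 128-bit key, at an assumed one million trials per second:
for bits in (56, 128):
    print(bits, years_to_search(bits, 1e6))
# The 56-bit search takes on the order of a thousand years at this rate;
# the 128-bit search takes on the order of 10**24 years.
```

The footnote's point survives the arithmetic: against a well-chosen key length, exhaustive search is possible in principle but useless in practice.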

VULNERABILITIES OF COMMUNICATIONS SYSTEMS

Background

The developed world has become increasingly dependent on communications systems to operate businesses and governments at all levels. This can be seen in the revenue growth of communications services. The operating revenues from the domestic services of common carriers, for example, grew from $8.4 billion in 1960 to $166.5 billion in 1985. Similarly, revenues for domestic satellite services rose from near zero in 1975 to $17 billion in 1985. And Intelsat's revenues from international services went from near zero to $475 million in the past two decades.

The growth in revenues reflects the fact that communications networks have become vital for many purposes, ranging from making interbank and government fund transfers to running national electric power grids and the world's airlines. There is every indication that this dependence will increase with continued advances and new applications. Both the volume of information communicated and its importance will continue to grow.

There have been occasional concerns about the vulnerability of telecommunications systems to misuse. Illustrations of some historical concerns and examples of misuse during the past century are shown in box A. Although today's systems are likewise vulnerable to misuse, commercial demand for improved security (e.g., message confidentiality and integrity) has been slow to materialize. Nevertheless, message authentication and digital signature capabilities are becoming important for a number of industries (see ch. 5).

Little has been done to improve the security of public communications systems themselves. Generally, commercial systems have been designed for efficiency and reliability rather than security. Also, their inherent barriers to misuse adequately serve most users' needs for confidentiality. Where additional safeguards are deemed necessary, "add-on" measures are taken either by the user directly (adding encryption or message authentication capabilities), through the special service options offered by some communications carriers (see chs. 4 and 5), or by a combination of administrative procedures and the use of a protected private communications network.

The regulatory climate has also influenced the confidentiality of systems. The communications industry now faces an increasingly deregulated environment created, in part, by the divestiture of the American Telephone &

Box A.—Examples of Historical Concerns for Misuses of Telecommunications Systems

● In 1845, only one year after Samuel F. B. Morse's famous telegraph message "What hath God wrought," a commercial encryption, or encipherment, code was published as a means of ensuring secrecy.1

● The first voice scrambler patent application was dated within 5 years of the first demonstration of the telephone, in 1881.

● During the Civil War, "the first concerted efforts at codebreaking and communications system penetration, or telegraph line tapping, were undertaken."2

● Soon after radio communications came into use in 1895, they were used for intercepting others' messages, particularly before and during World War I.3 In the 1920s, the British surreptitiously eavesdropped on international cable traffic.4

● The diversion of an undertaker's business by an eavesdropping switchboard operator resulted in a patent grant for design of the first automatic switch in 1891, eventually eliminating the need for switchboard operators.5

● During the 1920s, pervasive Government and criminal use of telephone wiretaps triggered congressional hearings and antiwiretap legislation.

● Interception of telecommunications signals played a key role in the course of World War II. It continues to be a source of foreign intelligence gathering by major governments.6

● In recent years, there has been concern about the ease of misuse of a variety of telecommunications signals, ranging from the pirating and even malicious jamming of subscription television signals transmitted over satellite communications systems to the ease of interception of cellular radio and mobile radiotelephone signals.7

1 David Kahn, The Codebreakers: The Story of Secret Writing (New York, NY: MacMillan, 1967), p. 189.
2 Supplementary Reports on Intelligence Activities, Book VI, Final Report of the Select Committee to Study Government Operations with respect to Intelligence Activities, U.S. Senate, Apr. 23, 1976, p. 51.
3 Kahn, op. cit., pp. 298-299.
4 James Bamford, The Puzzle Palace (New York, NY: Penguin Books, 1983), pp. 29-30.
5 John Brooks, Telephone: The First Hundred Years (New York, NY: Harper & Row, 1976).
6 Kahn, op. cit.; Bamford, op. cit.; Peter Wright, Spycatcher (New York, NY: Viking Penguin, Inc., 1987); also see Glen Zorpette (ed.), "Breaking the Enemy's Code," IEEE Spectrum, September 1977, pp. 47-51.
7 "HBO Piracy Incident Stuns Other Satellite Users," New York Times, Apr. 29, 1986, p. C17; "Look! Up in the Sky!" Washington Post, Apr. 29, 1986, p. C1; "Mystery Broadcast Overpowers HBO," The Institute, vol. 10, No. 10, October 1986, p. 1; "Uplinks and High Jinks: Satellites Are the Hackers' Next Frontier," Newsweek, Sept. 29, 1986, pp. 56-57.

Telegraph Co. (AT&T) in January 1984. As a result, cost competitiveness has become an important consideration for communications carriers. It discourages them from providing cost-incurring safeguards for which there is no significant demand. A 1980 survey found that, with few exceptions, the Nation's 10 largest common-carrier systems were not designed for securing messages against interception. On the other hand, at least one carrier did offer an add-on encryption service.3 A 1986 OTA review of six carrier systems indicates that a combination of protective services are becoming available, including encrypting radio signals or routing selected calls over cable transmission facilities.4

Technology plays an important but uneven role in the security of communications systems. The rapid proliferation of ground-based or terrestrial microwave radio since the 1940s and satellite communications since the 1960s have made interception easier by making signals available over wide geographic areas.

Local area networks (LANs), which already have wide use in the United States and abroad for linking computer-based systems, represent another area of information technology in which security has received little attention. The same technologies that make possible continued improvements in computer and communications systems can also provide the means for sorting rapidly through a multitude of signals in search of specific telephone numbers, spoken words, or even voices.7

At the same time, technology and engineering can complicate the interceptor's work. Fiber optics, which is rapidly being installed in the United States to carry telephone and other communications, requires far more sophistication for successful interception because of the physical medium that carries the message.8 However, most customers will continue to have copper wires linking their offices and residences with the local telephone company's office for the long term.
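The "sorting rapidly through a multitude of signals" capability noted above reduces, once traffic is in machine-readable form, to a simple matching problem. The sketch below is a minimal, hypothetical illustration; the target numbers, keywords, and messages are all invented.

```python
# Minimal illustration of selector matching: once intercepted traffic is
# machine readable, filtering it for target telephone numbers or keywords
# is computationally trivial. All data below is invented.

TARGET_NUMBERS = {"555-0142"}
TARGET_WORDS = {"transfer", "routing"}

def select(messages):
    """Return only messages mentioning a targeted number or keyword."""
    hits = []
    for msg in messages:
        words = set(msg.lower().replace(",", " ").split())
        if words & TARGET_WORDS or any(num in msg for num in TARGET_NUMBERS):
            hits.append(msg)
    return hits

intercepted = [
    "weather report for tuesday",
    "wire transfer approved, call 555-0142",
    "lunch at noon?",
]
print(select(intercepted))  # only the second message matches
```

The hard part of interception, as the surrounding text notes, is acquiring and decoding the signals; once that is done, selection is cheap.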
Other technical designs have also made interception easier. Private lines (dedicated channels) and cordless telephones, for example, can be intercepted because of the former's fixed position in the electromagnetic spectrum and the latter's complete dependence on radio waves.5 Telephone lines can also be tapped relatively easily from wire closets on a user's premises.6

AT&T's modern electronic switching network, on the other hand, encrypts signaling information (the numbers of the called and calling parties) prior to transmission, thus denying potential interceptors the opportunity to target specific users' messages. Many other engineering design features, such as signal compression, spread-spectrum techniques, channel demand assignment techniques, and packet switching, also complicate any interceptor's work. They do so typically as a byproduct of other objectives. And, within the next few years, as end-to-end digital networks become more commonplace, encryption services are likely to become available if demand is adequate. Of course, adversaries with significant resources, such as a national intelligence organization, can be expected to readily surmount most of these obstacles.9

A number of recent developments may also be making it more difficult to intercept, alter, or misuse signals. These include the advent of commercial encryption services and products, the emergence of new technical standards for safeguarding communicated and stored messages, recent Federal Government policies that influence the safeguarding of sensitive information, and congressional legislation concerning unauthorized access to information in some systems (see chs. 4 through 6).

Still another barrier exists to the misuse of data obtained from passively monitoring or intercepting automated information systems—the problem of obtaining, unambiguously, the information of direct interest. This is readily illustrated with an example of data in the form of passively intercepted communications signals. Even if an adversary is reasonably assured that the intercepted signals contain useful data among them, the adversary must select from what may be a wealth of transmitted data in the hope of finding the target information in a timely, complete, and understandable context. Although these barriers are not likely to prove overwhelming to a determined, sophisticated adversary, they do not exist for an adversary who has the cooperation of a knowledgeable inside employee with the ability to select exactly the information of direct interest and with a full understanding of its context and limitations.

Spectrum of Adversaries' Resource Requirements

Telecommunication systems are vulnerable to unauthorized access in many ways, but the ease of such access varies widely depending on the resources available to potential adversaries. Targeted, unauthorized access to a specific user's communications over the public switched network, with a few exceptions, considerably increases the need for technical expertise, sophisticated equipment, and money. The complexity of these systems can prevent unsophisticated adversaries who lack the necessary resources from gaining access to information, but would not stop those who have adequate resources from readily surmounting the barriers.

Figure 1 illustrates the spectrum of vulnerabilities and adversaries' resource requirements. On one extreme are readily exploitable services (cordless telephones) and facilities (copper wire in local loops and wire closets) that require very limited resources for successful, targeted exploitation. Some cordless telephone conversations can be monitored using ordinary FM radios. Cellular radiotelephone conversations can be monitored using tunable ultra high frequency (UHF) television receivers. Further, wiretapping equipment can be purchased for as little as $12. At the other end of the spectrum are applications and facilities, such as fiber optic communications and technologies, particularly those using high-quality encryption and other safeguards (see ch. 4), that make unauthorized access much more difficult.

On the other hand, technology may simplify targeted interception through such means as computer-based data matching, word recognition, and voice identification. For most users, concern about unauthorized access is more likely to focus not on potential high- or low-resource adversaries, such as wiretappers or Government intelligence agencies, but on those in between.

The situation is far from a static one. The spectrum of vulnerabilities shifts as technological advances change the nature of communications systems and the resources available to potential adversaries. Technological advances, other than those associated with information security, tend to increase the capabilities of adversaries, especially those of high- and middle-level resources. Perhaps the most

3 U.S. Department of Commerce, National Telecommunications and Information Administration, "Identification of Events Impacting Future Carrier System Protections Against Vulnerabilities to Passive Interception," 1980.
4 Information Security, Inc., "Vulnerabilities of Public Telecommunications Systems," OTA contract report, 1986.
5 For a description of the vulnerabilities of commercial telecommunications systems to unauthorized use, see the MITRE Corp., Study of Vulnerability of Electronic Communication Systems to Electronic Interception, vols. 1 and 2, January 1977; and the MITRE Corp., Selected Examples of Possible Approaches to Electronic Communication Interception Operations, January 1977. Also, Ross Engineering Associates, "Telephone Taps," OTA contract report, November 1986.
6 Technical course material from a seminar on communication and information security, conducted regularly by Ross Engineering Associates, Adamstown, MD, and other firms. For additional information on wiretaps, surveillance, and related topics, see: "Electronic Eavesdropping Techniques and Equipment," Law Enforcement Assistance Administration, National Institute of Law Enforcement and Criminal Justice, republished by Ross Engineering Associates. Also, Robert L. Barnard, Intrusion Detection Systems: Principles of Operation and Application (Stoneham, MA: Butterworth Publishers, 1981).
7 Whitfield Diffie, "Communications Security and National Security: Business, Technology, and Politics," Proceedings of the National Communications Forum, Chicago, IL, 1986, vol. 40, Book 2, pp. 733-751.
8 Bellcore, "Evolving Technologies: Impact on Information Security," OTA contract report, Apr. 18, 1986. Also, see U.S. Congress, Office of Technology Assessment, Information Technology R&D: Critical Trends and Issues, Case Study 2: Fiber Optic Communications (Springfield, VA: NTIS #PB 85-245660/AS, February 1985), pp. 67-75.
9 David Kahn, The Codebreakers: The Story of Secret Writing (New York, NY: MacMillan, 1967); James Bamford, The Puzzle Palace (New York, NY: Penguin Books, 1983); "Soviets Take the High Ground, New Embassy on Mount Alto is a Prime Watching and Listening Post," Washington Post, June 16, 1985, p. B1; and The Soviet-Cuban Connection in Central America and the Caribbean, released by the Departments of State and Defense, March 1985, Washington, DC, pp. 3-5.

Figure 1.—Spectrum of Adversaries' Resource Requirements v. Technologies

SOURCE: Office of Technology Assessment, 1987.

visible illustration of this is that of increasingly powerful personal computers, which make unauthorized access to communications and data easier.

But the question remains: Should a business that communicates valuable, sensitive, personal, or proprietary information be concerned about unauthorized access to messages transmitted over the public switched network? If history serves as a guide, we can expect few immediate changes in the confidentiality of private communications over public networks except where user demand is adequate to justify investment in safeguards. However, some corporate and Government users who face considerable risk in the event of such accesses (e.g., for communications deemed sensitive for national security purposes or for electronic fund transfers) are taking steps to improve the confidentiality and integrity of their communications (see ch. 5).

Networks

Early communications networks began as relatively simple point-to-point transmission systems. At first, telegraph and later voice-

modulated electronic signals were transmitted exclusively over copper wires, and switching was accomplished manually. Such networks were vulnerable to eavesdropping by wiretapping. Today's networks, by contrast, consist of cables (copper wire, coaxial, and fiber optic), radio links (terrestrial and satellite), and other equipment providing a complex of services (voice, data, graphics, text, and video) through a variety of specialized interconnected networks. Figure 2 illustrates the complexity of modern networks. In spite of the added complexity, however, vulnerabilities remain.

Communication networks have also vastly expanded the ability of users, whether from an office building or home personal computer, to gain access to computers nationwide and even worldwide. The current movement toward a worldwide digital network is aimed precisely at increasing the accessibility of network capabilities, enhancing the variety of services available, and lowering the costs of services.

Transmission Systems

Communications systems use two types of media to transmit signals: over-the-air systems, such as radio transmissions; and conductors, such as copper wire and coaxial or fiber optic cables. In general, over-the-air systems (e.g., cordless telephones) can be intercepted and systems that use conductors can be tapped. Some conductor-based transmission systems (e.g., fiber optic cable) require sophisticated resources to tap, while others require minimal resources (taps of copper wires from wire closets). Whatever the form of transmission, it is not necessarily easy to render intercepted or monitored signals intelligible.

Microwave Radio Systems

Microwave radio systems, totaling 740 million circuit miles, carry most of the long-distance communications messages transmitted within the United States (table 1).10 Systems operating at frequencies mostly between 2 and 11 GHz (gigahertz, or billion cycles per second), for example, provide high-capacity circuits that carry about two-thirds of all telephone toll calls today. They often use highly directional antennas to transmit signals between stations, typically spaced from 10 to 35 miles apart. Microwave systems are designed for a wide range of capacities, from as few as 24 voice grade circuits to as many as 2,400 circuits per radio channel. In addition, there are multiple radio channels in each of the many frequency bands in which these systems operate. For example, in the 6 GHz common carrier band, there are eight radiofrequency channels, each capable of carrying 2,400 voice grade circuits.

Interception of point-to-point microwave transmissions is relatively easy if the interceptor has technical information about the transmitter. Most such information is made available to the public by the Federal Communications Commission (FCC). Interception of signals, however, is only part of an eavesdropper's job. The signals must be demodulated and demultiplexed, which can be done using the same type of equipment as used by common carriers. But then an eavesdropper must also be able to sort through individual messages and select those of interest.

Table 1.—Bell System Circuit Miles of Carrier Systems Using Different Transmission Media

                              Circuit miles at year end (in millions)
Media                                 1975          1982a
Analog:
  Paired wire ................          42           140
  Coaxial cable ..............         142           221
  Radio ......................         399           737
Digital:
  Paired wire ................          58           138
  Coaxial cable ..............           —             2
  Radio ......................           —             6 (62% installed in 1982)
  Fiber optic ................           —             4 (98% installed in 1982)
  Subscriber .................           —             8 (54% installed in 1982)

a This list shows major categories of transmission media. Although analog systems were still predominant, the growth of digital systems was almost three times faster from 1975 to 1982. Almost no new analog systems were added during this period. Comparable information is not available after 1982, but a significant commitment is being made to glass fiber systems, especially by AT&T, MCI, and GTE.
SOURCE: Bellcore.

10 Bellcore, Evolving Technologies: Impact on Information Security, Apr. 18, 1986.

Figure 2.—The Communications Network
[Diagram of interconnected network elements: local area networks, coaxial and fiber optic cable, network nodes, metropolitan area networks, smart buildings, international and domestic satellites, local Telco offices, interexchange carriers, earth stations, radio links, base stations, mobile and cordless telephones, work stations, video teleconference facilities, and communications terminals.]
SOURCE: Office of Technology Assessment, 1987.
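The channel figures quoted above imply the aggregate capacity at stake in a single microwave band. A quick check of the arithmetic, using only the numbers given in the text:

```python
# Arithmetic on the microwave capacities quoted in the text: the 6 GHz
# common carrier band holds 8 radiofrequency channels, each carrying up
# to 2,400 voice grade circuits.

CHANNELS_PER_BAND = 8
CIRCUITS_PER_CHANNEL = 2_400
LOW_END_CIRCUITS = 24  # smallest per-channel capacity quoted

band_capacity = CHANNELS_PER_BAND * CIRCUITS_PER_CHANNEL
print(band_capacity)  # 19200 voice grade circuits in one band

# The quoted range of capacities spans two orders of magnitude:
print(CIRCUITS_PER_CHANNEL // LOW_END_CIRCUITS)  # 100
```

A single intercepted radiofrequency channel in this band can thus expose up to 2,400 simultaneous circuits, and the band as a whole nearly 20,000, which is why the sorting problem described in the text matters as much as the interception itself.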

Point-to-point systems are vulnerable to interception wherever there is sufficient radiated signal strength. The geographic area in which adequate signals can be received is generally very large. It can cover dozens of miles in the paths of the antenna's radiation and in the vicinity of either the transmitter or receiver (figure 3). Custom-built receivers may be designed with greater sensitivity than those used by the common carriers in order to broaden the area of reception.

Modern digital systems complicate interception by unsophisticated adversaries, but they simplify the work of those that are more sophisticated. Signals are transmitted as virtually indistinguishable series of ones and zeroes, typically mixed together (multiplexed) with many other signals and encoded prior to transmission to reduce the total number of bits transmitted and thereby reduce bandwidth requirements. Many systems also route different segments of the same transmitted signal over different paths. For adversaries with few resources, these conditions alone would represent significant obstacles, particularly for targeted interception.11 Other obstacles include the need to record a large volume of data and process it to extract the message content.

For adversaries with considerable resources, such as very powerful computer processing capabilities and the equivalent of the switching and transmission facilities used by common carriers, targeted interception would not represent a severe challenge. Indeed, these adversaries can sort messages to select those of interest and undo the various types of signal processing to recover the message content. To consistently intercept preselected targets in switched systems, potential adversaries must be able to carry out functions equivalent to those performed by the carriers' equipment. In addition, they need the ability to select those messages of interest and to operate covertly. This level of sophistication and investment is assumed to be beyond the means of any adversaries except those with considerable resources and motivation, principally because of the cost of such an operation.

At the other extreme, there is inexpensive commercial equipment that can be used to intercept radio signals.12 Figure 4 illustrates the types of equipment needed and current prices, based on catalog advertisements. The equipment includes an antenna that can be pointed, a low-noise amplifier, a receiver, and equipment to extract audio (and video) signals (i.e., demultiplex and demodulate them). Depending on the particular equipment selected, the total price would range from $1,000 to $50,000. People skilled in the design of communications equipment could undoubtedly build their own units for less, however.

Specialized Microwave Systems

The Multipoint Distribution Service (MDS), authorized by the FCC in the early 1970s, uses broadcast microwaves to distribute video and other one-way communications locally. MDS transmitters use omnidirectional antennas to broadcast signals that are received by small parabolic (directional) antennas. MDS systems are typically used as a radio version of cable television and, infrequently, to provide one-way business or educational communications.

A Digital Termination System (DTS) is another specialized microwave radio service that is similar to MDS in terms of its broadcast of microwave signals. However, unlike MDS, DTS is designed for full two-way communications between stations. The mechanics of intercepting DTS transmissions are similar to those involved with MDS or point-to-point microwave systems. Interception might be considered easier with DTS because of the clearer identification of the user's dedicated communications channel.

Dedicated Lines

Users who need to communicate extensively between two points often use dedicated or pri-

11 The MITRE Corp., Study of Vulnerability of Electronic Communication Systems to Electronic Interception, vol. 1, January 1977, p. 96.
12 Information Security, Inc., "Vulnerabilities of Public Telecommunications Systems," OTA contractor report, 1986.
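The multiplexing obstacle described above, that trunk signals are mixed together and must be undone exactly as the carrier's equipment would undo them, can be sketched in miniature. The toy below uses simple byte-interleaved time-division multiplexing; real carrier framing formats are far more elaborate, and the example data is invented.

```python
# Toy byte-interleaved time-division multiplexing (TDM). An eavesdropper
# holding the raw trunk stream recovers nothing intelligible without
# reproducing the demultiplexing step the carrier's equipment performs.

def tdm_mux(channels):
    """Interleave equal-length channels, one byte per time slot."""
    return bytes(b for frame in zip(*channels) for b in frame)

def tdm_demux(trunk, n_channels):
    """Recover each channel by taking every n-th byte of the trunk."""
    return [trunk[i::n_channels] for i in range(n_channels)]

a, b, c = b"hello", b"world", b"!!!!!"
trunk = tdm_mux([a, b, c])
print(trunk)                # interleaved bytes, unreadable as a stream
print(tdm_demux(trunk, 3))  # the three original messages
```

Even this toy shows the asymmetry the text describes: demultiplexing is trivial for anyone who knows the framing, and a meaningless jumble for anyone who does not.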

Figure 3.—Example of Antenna Directivity Pattern


SOURCE: MITRE Corp., Study of Vulnerability of Electronic Communication Systems to Electronic Interception, vol. 1, January 1977.
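The wide reception areas implied by patterns like figure 3 follow from free-space propagation: received power falls with the square of distance, so modest gains in receiver sensitivity buy large gains in interception range. The sketch below applies the standard free-space (Friis) path-loss formula; the transmitter power, antenna gains, and distances are invented for illustration.

```python
import math

# Free-space link arithmetic behind the interception-area discussion.
# Received power falls 6 dB for every doubling of distance, so a more
# sensitive custom-built receiver extends the usable reception area.
# All link parameters below are invented for illustration.

def received_dbm(pt_dbm, gt_dbi, gr_dbi, freq_ghz, dist_km):
    """Friis free-space received power, in dBm."""
    # Free-space path loss in dB, for distance in km and frequency in GHz.
    fspl_db = 20 * math.log10(dist_km) + 20 * math.log10(freq_ghz) + 92.45
    return pt_dbm + gt_dbi + gr_dbi - fspl_db

# A hypothetical 6 GHz hop: +30 dBm transmitter, 35 dBi dish at each end.
p10 = received_dbm(30, 35, 35, 6.0, 10)
p20 = received_dbm(30, 35, 35, 6.0, 20)
print(round(p10 - p20, 2))  # 6.02 dB lost by doubling the distance
```

Equivalently, a receiver 6 dB more sensitive than the carrier's own equipment can, in free space, hear the same signal at twice the distance, which is the "broaden the area of reception" point made in the text.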

Figure 4.— Examples of Commercial Equipment for Interception of Microwave Radio Signals

[Diagram: signal path from a steerable antenna, low-noise amplifier, and motor drive, through a receiver, to demodulation/down conversion. Types of multiplexing and modulation handled: FDM/FM, TDM, FDM/AM, SCPC. Types of demodulation equipment needed: HF AM/SSB receivers, digital test sets, and VHF/UHF scanners.]

Commercial equipment                                        Approximate price
Examples of receiving equipment (antenna, low-noise amplifier, motor drive):
  RAYDX 10.5 antenna, LUX or 990C receiver (good quality reception) .. $2,200
  ALCOA 6.0 antenna, Uniden 2000 receiver (minimum quality reception) .. $695
Examples of demodulation equipment:
  HF-AM/SSB: ICOM R7000 .............................................. $969
  HF-AM/SSB: BEARCAT DX100C .......................................... $285
  VHF-UHF scanner: Regency MX5000 .................................... $330
Total system cost: between $1,000 and $3,200

SOURCE: Information Security, Inc., using catalog prices from SATCOM and SCANNER WORLD USA magazines, 1986.

vate line services. These are fixed circuit paths through some combination of terrestrial or satellite microwave radio or wire transmission facilities. Dedicated lines are commonly used to link corporations' main switching centers with one another, to link interactive computer systems, and to link computer systems with remote terminals. Whereas ordinary dial-up calls might be routed along any of a number of paths depending on traffic loading conditions, dedicated circuits remain in place on the same transmission path. This simplifies the interceptor's burden considerably, since the location of the user's dedicated line need be found just once. As an example of a part of a dedicated circuit, the local loop connecting the subscriber's premise with the local telephone company's nearest office also provides a fixed path that is relatively easy to identify.

Fiber Optic Communications

Fiber optic cable is being installed rapidly by communications carriers in the United States, primarily for heavy-traffic, long-distance routes, but also for many local uses. Local telephone companies installed more than 62,500 miles of fiber in 1984 and 100,000 miles in 1985 for their local loops (connecting telephone offices with subscribers' premises). Another 285,000 miles of fiber were installed by the same companies during those years for interoffice trunking (table 2).

Fiber optics is attractive, in part, because much higher data rates can be transmitted—about 1 gigabit per second currently—than using copper wire. One small cable containing two glass fibers can carry more than 15,000 two-way voice telephone conversations, or the
One small cable containing pany’s nearest office also provides a fixed path two glass fibers can carry more than 15,000 that is relatively easy to identify. two-way voice telephone conversations, or the 34 ● Defending Secrets, Sharing Data: New Locks and Keys for Electronic Information

equivalent in data signals, for up to 25 miles without requiring repeaters (amplifiers). Conventional telephone cables composed of 24 gauge copper wires, in contrast, would require 1,250 pairs of wires and repeaters spaced about 1 mile apart in order to carry the same traffic. Further advances in fiber technology are widely expected. Fiber optics also is less expensive than conventional cable, does not radiate energy under normal operating conditions, and is not as readily subject to passive or active interception as are radio signals or signals on copper wire. Figure 5 illustrates a typical fiber optic system.

Table 2.—Telephone Company Fiber Applications (fiber miles in thousands)

                             1984                      1985
Company              Long distance    Loop     Long distance    Loop
NYNEX ............        25           15           50           25
Bell Atlantic ....        20           15           30           25
BellSouth ........        20           35           60           30
Ameritech ........        25           15           40           35
SW Bell ..........        10            5           50           15
U.S. West ........        15            5           30           10
Pacific Telephone         20           15           40           25
Independents .....        10            —           10            5
Total ............       145          105          310          170

SOURCE: Annual Reports/Internal Siecor Estimates.

Figure 5.—Typical Fiber Optic System
[Block diagram: electronic input signals (typically digital bit streams) pass through a multiplexer to an electronic-to-optical converter (laser or light-emitting diode), travel over optical fiber at typically 1-1,000 Mb/s, and are converted back by an optical-to-electronic converter (photodiode and preamplifier) and demultiplexed into electronic output signals.]
SOURCE: Office of Technology Assessment, 1987.

Satellite Communications Systems

Satellites were first used for commercial telecommunications in the mid-1970s. Today, there are more than 100 satellites worldwide, with some two dozen providing domestic services for the United States. Still, satellites account for only a small portion of domestic transmission capacity. In terms of domestic nonbroadcast channel capacity, they are being outpaced rapidly by fiber optic cable. International communications services have, for the past decade, been provided about equally by satellite and cable facilities. This balance is likely to shift sharply with the planned use of fiber optic cable for trans-Atlantic service beginning in 1988 and for trans-Pacific service in 1989.

Satellite communications systems operate in much the same manner as microwave relay systems, except that the repeater or amplifier is in geostationary orbit 22,300 miles above the equator. Satellites accept signals from transmitting earth stations (the uplink), translate the signal to a different frequency band, and retransmit it at suitable power levels to receiving earth stations (the downlink).

Some satellite networks are widely used for one-way distribution services, including cable and network television, and a variety of data services, such as financial information and weather reports. Two-way distribution services include point-of-sale transactions, database inquiries, and inventory control. Most of these applications use digital transmission and various techniques to share the satellite bandwidth among the users.

The satellite "footprint," defined by the beamwidth of the spacecraft antenna, may be contoured to the shape of the intended coverage area, but is nevertheless likely to be thousands of miles across. The satellite channel is "visible" to all points within the coverage area and, therefore, readily interceptable within that area. The signals from many satellites may be received from locations beyond the borders of the contiguous United States. The key to targeted interception, consequently, is determining which satellite and transponder channel frequencies are of interest.

One of the simplest methods for intercepting subscription satellite signals was advertised until recently in an amateur radio publication and sold for less than $100. The device used a short piece of wire, cut to the proper length for reception at the selected broadcasting frequency, and mounted in an ordinary metal coffee can. This apparatus was connected, through the printed circuit card provided, to the lead-in wires of a television set. All that remained was to point the coffee can at the desired satellite, adjusting by trial and error until an adequate signal was received.

These characteristics make communications satellites vulnerable to several different types of misuse. The uplinks can be overpowered by unauthorized users, whether intentionally or not, who transmit stronger signals than those used on the authorized uplink. This was the case in both of the April 1986 takeovers of the Home Box Office (HBO) channel in which a part-time satellite uplink operator and retailer of home receiving dishes overpowered the HBO uplink transmitter signal with an unauthorized one and put his own message on the screens of some 8 million viewers.13

the survivability of satellite communications systems in times of national emergency.16

Mobile Radio and Cordless Telephone Systems

Land mobile telephone service typically provides two-way, voice-grade communications between a base station and mobile units or between two mobile units. The mobile unit is most commonly a car phone, although some "briefcase phones" have recently appeared on the market. The use of these systems is growing by about 20 percent annually. In addition, one-way paging services have become very popular recently.

The antennas for these units transmit omnidirectionally. Cellular mobile systems use a base station in each cell to communicate with all mobile units within that cell. A relatively small number of frequencies are needed for each cell. Inexpensive scanners can be used to

(Typically digital (Laser or light (Photodiode and blt streams) emitting diode) preamplifier) SOURCE Off Ice of Technology Assessment. 1987 36 ● Defending Secrets, Sharing Data: New Locks and Keys for Electronic Information length for reception at the selected broadcast- the survivability of satellite communications ing frequency, and mounted in an ordinary systems in times of national emergency.16 metal coffee can. This apparatus was con- nected, through the printed circuit card pro- Mobile Radio and Cordless Telephone Systems vided, to the lead-in wires of a television set. Land mobile telephone service typically pro- All that remained was to point the coffee can vides two-way, voice-grade communications at the desired satellite, adjusting by trial and between abase station and mobile units or be- error until an adequate signal was received. tween two mobile units. The mobile unit is These characteristics make communications most commonly a car phone, although some satellites vulnerable to several different types “briefcase phones” have recently appeared on of misuse. The uplinks can be overpowered by the market. The use of these systems is grow- unauthorized users, whether intentionally or ing by about 20 percent annually. In addition, not, who transmit stronger signals than those one-way paging services have become very used on the authorized uplink. This was the popular recently. case in both of the April 1986 takeovers of the The antennas for these units transmit omni- Home Box Office (HBO) channel in which a directionally. Cellular mobile systems use a part-time satellite uplink operator and retailer base station in each cell to communicate with of home receiving dishes overpowered the HBO all mobile units within that cell. A relatively uplink transmitter signal with an unauthorized small number of frequencies are needed for one and put his own message on the screens 13 each cell. Inexpensive scanners can be used to of some 8 million viewers. 
In addition, the monitor for mobile call signals and to tune in downlinks can be jammed by bogus earth to the next call made. Each call transmitted transmitters. from the base station is addressed to a par- Other vulnerable parts of communications ticular mobile unit within the cell, making tar- satellites include the transponders, whose life- geted, passive eavesdropping simple as long times can be severely shortened by excessive as the eavesdropper knows the telephone num- received signal strength and unprotected ber of interest and the cell the target is in. telemetry systems, which might be manipu- Cordless telephones substitute a duplex lated to move the satellite out of its intended 14 (two-way), low-power radio link for what other- orbit. Broadcasters and communications wise would be a very long extension cord. Their carriers are especially concerned about jam- growing popularity and the relatively small ming since the former could lose millions of number of channels available have created dollars if advertisements are interfered with problems for some users. A cordless phone al- and the latter could lose many tens of millions ways uses the same channel in the same small of dollars if a satellite’s lifetime is prematurely area; thus, these phones are much easier to tar- shortened. Both groups, therefore, want to get by eavesdroppers. Nearby users with the shift to a less vulnerable transmission system, same frequency channel pair can listen to their such as fiber optic cable, if capacity expands 15 neighbors’ calls simply by listening with their sufficiently. There are also concerns about own cordless units. In addition, some people

‘]For a detailed review, see Donald Goldberg, “Captain Mid- IGsW “commercial Satellite Communications Survivability night, HBO, and World War 3,” Mother Jones, October 1986, Report, ” prepared for the National Security Telecommunica- p. 26. tion Advisory Committee (NSTAC), May 20, 1983. This report l~The range of ~lnerabfities of commercial COrnmUfiCatiOnS notes that: satellites were discussed in some detail by representatives from . . . commercial satellite communications systems are vulnerable HBO, CBS Technology Center, and MA-COM, Inc., at a semi- to hostile actions which would deny service in emergency situa- tions, particularly actions by a relatively unsophisticated nar at the Massachusetts Institute of Technology on Oct. 16, antagonist—the so called “cheap shot” attack. For example, to- 1986. Also discussed were a number of safeguards that are be- day’s satellite command links provide only modest protection ing considered for current and, especially, future satellites. against electronic intrusion. Also, in nuclear war, some of the ‘“Ibid. control facilities of satellite systems would become unusable. Ch. 3—The Vulnerabilities of Electronic /formation Systems ● 37 —. have intentionally used their remote units to cally from remote, centrally located mainte- initiate calls by triggering other parties’ base nance sites. From these sites, however, access units, thus avoiding having to pay for the call can be gained to an even greater number of since the related bill (usually for along-distance communications channels. This is an impor- call) is sent to the base unit owner. This “theft tant reason for controlling both physical and of dial tone” is possible when the base unit does remote access to these nodes. 
A special con- not have appropriate security features.17~ cern is electronic access to the processor used to control these switching systems: A knowl- Mobile and cordless radio have much in com- edgeable individual, for example, could sabo- mon, but two main differences involve signal tage or manipulate the switching system (e.g., range and the ease with which an adversary by rerouting calls destined for one person to may target on particular users. Although cord- another) without physical access to the switch- less phones are easy to target, their range is ing systems. typically no more than 1,000 feet, while con- ventional mobile radio signals may be received Switching systems are equipped with cir- at a distance of 20 to 30 miles. Newer cellular cuitry designed to permit operators to verify mobile phones have a smaller range and use that busy lines are actually in use and, in an a variety of channels and base stations as they emergency, to interrupt ongoing conversa- move from cell to cell, making them slightly tions. By the very nature of the circuitry’s de- harder to pinpoint. sign, it would permit monitoring of conversa- tions if protections were not incorporated. In fact, current versions of this circuitry have Other System Components scramblers built in and, if interruption is re- quired, periodic audible tone bursts are used Switching Systems to alert the users that a third party has joined In addition to the transmission paths that their conversation. connect end users and network nodes, and the network nodes to each other, switching sys- Signaling Elements tems located at the network nodes provide op- Signaling is another element of communica- portunities for misuse. Thousands of commu- tions systems that may provide opportunities nications lines are concentrated at these nodes. for abuse. 
Signaling is normally used to send With the use of telephone company records, the destination address data between switch- individual circuits assigned to particular cus- ing network nodes. There are signaling meth- tomers can be identified. In order to reduce op- ods that use either slowly pulsed direct current portunities for potential misuse of these rec- or voice band tones that are in predominant ords, the operating companies must carefully use between customers’ premises and the lo- limit both physical and remote access to these cal telephone office.18 These are used for voice nodes. The necessary precautions are the same and a substantial number of data communica- as those described below in connection with tions. Both of these signaling methods can the security of computer systems. be monitored, using methods described above Most electromechanical switching systems for monitoring communications, allowing an require frequent maintenance, particularly eavesdropper to intercept not only message those that serve large numbers of customers. content but also its destination. On the other hand, stored, program-controlled Carriers use pen registers and modern dialed switching systems of comparable size require number recorders to monitor destination ad- less frequent onsite maintenance since many dress signals. A new type of digitally coded of their functions can be controlled electroni- “’Some data communications only use these signaling meth- ods to direct a call through the telephone network to a packet ‘T%e Federal Communications Commission, “Further Notice switched network and thus information about the destination of Proposed Rulemaking, ’ Docket 83-32,5, released May 23, 1984. address is of limited value to an adversary. 38 . 
Defending Secrets, Sharing Data: New Locks and Keys for Electronic Informationl address signaling is now used in connection of the HBO satellite broadcast in April 1986 with some data communications networks, shows that some of these systems are no bet- such as packet networks. This type of signal- ter protected against such attacks than against ing is also used for internode signaling in passive interception. This may be changing as AT&T’s common channel interoffice signaling a result of the HBO experience. The satellite system (CCIS) and some other networks. Sep- transponder cannot distinguish between the aration of signaling information from message legitimate signal and a bogus one—it simply content increases the confidentiality of mes- selects the stronger signal.19 In the HBO case sages provided the eavesdropper does not have noted earlier, the “adversary’ or hacker was access to the signaling data. The CCIS en- sophisticated technically and had access to crypts the signaling information sent between commercially available and relatively inexpen- nodes. sive transmitter equipment, as well as infor- mation in the public domain concerning the sat- Operational Support Systems ellite’s location and transponder frequency. In addition to the transmission, switching, In a completely different part of the network, and signaling components, communications wiretapping of telephone lines remains one of systems include supporting equipment for test- the simplest forms of eavesdropping, as long ing, repairing, and maintaining customer as physical access to wire closets and other in- records. The information stored in these sys- terconnection points are generally accessi- tems could be valuable to a person seeking to ble.20 Certain types of wiretaps cannot be de- intercept communications. 
Access to this in- tected by electronic means, and some wiretaps formation can be limited by time of day, ter- can be performed using equipment costing as minal location or function, physical access, log- little as $12.21 A wiretap is sufficiently easy on identification passwords, authorization ver- to install that even a 9-year-old can do it.” ification, and audit trail records. Rooftop terminal junction boxes and residen- A testing system is another type of opera- tial junction boxes are often readily accessi- tional support system. Testing systems can ble to potential wiretappers. In contrast, when gain access to specific trunk and line circuits, fiber optic cable is used to connect the user’s and thus provide an opportunity to either mon- premises to the carrier’s facility (the local loop), itor communications or obtain information re- tapping the fiber cable requires more sophis- ticated and expensive equipment and skill. garding communication links. These systems are often protected by many of the same tech- niques described above. ‘(’’’Mystery Broadcast Overpowers HBO, ” Institute for Elec- trical and Electronics Engineers, THE ZNSTITUTE, vol. 10, Commercial Availability of No. 10, October 1986, p, 1. ~(’In one of the relatively few examples in which telephone Interception Equipment taps are uncovered, a tap and electronic bugging equipment were recently reported discovered in the office of the governor The very technologies that make possible of New Mexico. “Capitol Bug Found, ” Washington Post, Jan. 10, 1987, p. A8. continued improvements in communications “ROSS Engineering Associates, “Telephone Taps, ” OTA ccm- and computer processing also lend themselves tractor report, November 1986. to illicit purposes. The successful disruption 221bid. Ch. 3— The Vulnerabilities of Electronic Informatlon Systems ● 39

Photo credits: Courtesy of Ross Engineering, Adamstown, MD

[Photographs: Rooftop Terminal Junction Box; Residential Junction Box]
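On customer loops, the voice band tone method discussed under Signaling Elements is the familiar push-button (DTMF) scheme, in which each dialed digit is sent as one fixed pair of audio frequencies; a monitor that measures the two tones therefore recovers the destination number directly. A minimal sketch, assuming the standard DTMF keypad layout (the code is illustrative only, not an actual monitoring device):

```python
# Illustrative sketch: "Touch-Tone" (DTMF) address signaling sends each
# dialed digit as one row tone plus one column tone, so an eavesdropper
# who measures the two frequencies on a tapped pair can read off the
# destination number directly.

ROW = (697, 770, 852, 941)   # standard DTMF row frequencies, Hz
COL = (1209, 1336, 1477)     # column frequencies, Hz (1633 Hz column omitted)
KEYPAD = ["123", "456", "789", "*0#"]

def digit_to_tones(d):
    """Return the (row_hz, col_hz) tone pair that signals keypad symbol d."""
    for r, row in enumerate(KEYPAD):
        if d in row:
            return ROW[r], COL[row.index(d)]
    raise ValueError(f"not a keypad symbol: {d!r}")

def tones_to_digit(row_hz, col_hz):
    """What a passive monitor recovers from one observed tone pair."""
    return KEYPAD[ROW.index(row_hz)][COL.index(col_hz)]

dialed = "5551234"
observed = [digit_to_tones(d) for d in dialed]
recovered = "".join(tones_to_digit(r, c) for r, c in observed)
assert recovered == dialed
```

Because the mapping from digits to tone pairs is fixed and public, protecting dialing information requires protecting the channel itself, which is the point of the signaling-separation and encryption measures noted above.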

VULNERABILITIES OF COMPUTER SYSTEMS23

Background

In a simplified form, computer security is the ability to ensure that people use information systems only as they are supposed to. This involves protecting:

● the system itself against failures, whether caused by deliberate misuse, human error, or technical problems; and
● the information in the system, to ensure that it is seen and used only by those who are authorized to do so and that it is not accidentally or maliciously disclosed or modified.

Computer "hackers" aside, it is even more important to recognize that information security is much broader than just protection against those who would penetrate information systems from the outside. People within organizations are perhaps even more likely to misuse information systems, including unauthorized actions by those who are authorized to use the system. In addition, technical failures can be caused by natural disasters.

The rapid evolution of computer technology, and society's growing dependence on it, have important implications for information security. Three distinct kinds of technical trends can be identified that have security implications—the growth of large-scale computers, the evolution of microcomputers, and changes in computer software.

Large-Scale Computers24

Advances in large-scale computing have dramatically lowered the cost of computation. The power of machines relative to their cost and size has been increasing during the last 30 years by more than a factor of 10 per decade and is likely to continue increasing for the foreseeable future. These changes have been complemented by magnetic (and more recently optical) disks that can hold greater and greater amounts of information on each disk. Communication between computers has also become considerably more pervasive and efficient—line speeds are higher, protocols and technical standards have been established, and communications systems are generally evolving from analog links to digital technology.

These increasingly powerful machines have also become much more pervasive in society. Figure 6 shows that the number of mainframe computers operated by the Federal Government increased from about 11,000 in 1980 to 27,000 in 1985, with most of the increase coming in the Department of Defense. Perhaps more important, figure 7 shows that the points of access to Federal computers have increased geometrically in recent years, from roughly 36,000 terminals in 1980 to 173,000 terminals in 1985. Table 3 indicates similar trends in sales of large-scale host computers in the United States, from 36 units in 1965 to more than 1,600 in 1985.

Table 3.—Sales of Large-Scale Host Computers in the United States

Year                     Units      Value
1965 ................       36      $200 million
1976 ................      514      $2.2 billion
1977 ................      611      $2.4 billion
1978 ................    1,009      $3.9 billion
1979 ................    1,461      $5.3 billion
1980 ................    1,328      $4.8 billion
1981 ................      887      $3.9 billion
1982 ................    1,448      $6.8 billion
1983 ................    1,836      $8.1 billion
1984 ................    2,420      $9.0 billion
1985 ................    1,617      $9.3 billion
1986 (estimated) ....    1,800      $9.7 billion
1987 (estimated) ....    2,090      $10.2 billion
1988 (estimated) ....    2,070      $10.6 billion
1989 (estimated) ....    2,200      $11.5 billion
1990 (estimated) ....    2,300      $12.1 billion

NOTE: Large-scale host computers are those machines serving more than 128 users in a normal commercial environment. This definition is not necessarily the same as that used in figure 6.
SOURCE: International Data Corp.

These trends—increased power and use of large-scale computers—have strong implications for security. First of all, the changes have resulted in increased dependence on information technology generally. That means that virtually all Government agencies and private organizations are more susceptible to technical sabotage or failure of their computers. But it also means that there is more information stored in computers, that this information is often accessible to more people, and that this information is accessible at a distance via telecommunications linkages.

Footnotes:

23. Because this report emphasizes telecommunications security, the treatment of computer security is brief. For more information on computer security, see U.S. Congress, Office of Technology Assessment, Federal Government Information Technology: Management, Security, and Government Oversight, OTA-CIT-297 (Washington, DC: U.S. Government Printing Office, February 1986).
24. In this analysis, a "large-scale" computer generally means a machine that is intended to serve multiple users performing different tasks at the same time, that is generally not considered to be a "desk-top" or "personal" computer, and that stores data on large-scale magnetic (or optical) disks rather than floppy disks or small hard disk drives.
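The growth figure cited above, a factor of 10 per decade in cost-performance sustained over 30 years, compounds to roughly a thousandfold improvement. A quick arithmetic check:

```python
# Arithmetic check of the trend cited above: a factor of 10 per decade,
# sustained over 30 years, compounds to 10**3, i.e. 1,000-fold.
decades = 30 / 10
gain = 10 ** decades
assert gain == 1000

# The same trend expressed as an annual rate: about 26 percent per year.
annual = 10 ** (1 / 10)            # one decade's factor spread over 10 years
assert round(annual ** 30) == 1000
```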

Figure 6.—Mainframe Computers in Federal Agencies

Reported numbers of mainframe computers in Federal agencies (1980: total reported = 11,305; 1985: total reported = 26,682):

Agency                    1980       1985
USDA                        13         23
DOC                        967        915
DoD                      3,765     17,565
Education                    3          6
DOE                      3,718      2,781
DHHS                        48        106
HUD                          3          7
DOI                         81        250
Justice                     28         75
DOL                         11         15
State                        5          6
DOT                         16         15
Treasury                   205      1,030
20 small agencies        2,443      3,888
Total                   11,305     26,682

NOTE: Consistency in definitions of "mainframe" central processing units cannot be assured because of different interpretations of the term. Definitions may not agree with the definition of large host computers in table 3.
SOURCE: OTA Federal Agency Data Request.

Figure 7.—Computer Terminals in Federal Agencies

[Bar chart of terminals by agency, 1980 and 1985; recoverable values include DoD, 67,872 terminals, and 20 small agencies, 28,848 terminals. Horizontal axis: number of terminals, 0 to 10,000.]
On the other hand, the increased power and sophistication of large-scale computers also means that more sophisticated safeguards are more practical than they were with smaller computers. These safeguards include "audit programs" that log the actions of each user and more powerful access controls. These and other safeguards are discussed in chapter 4.

Microcomputers

Changes in smaller desk-top or personal computers have been even more striking and rapid than those in large-scale machines. Since the first microcomputer was commercially produced in the mid-1970s, these devices have progressed to a point where their speed and power nearly equal those of mainframe computers a decade ago. These improvements are largely due to the increasing number of circuits that manufacturers can put on a single microprocessor chip. Figure 8 shows the geometric increases in the complexity of these chips, which has led to a declining cost per unit of computing power.

Figure 8.—Trends in Component Density, Silicon Production, and Gallium Arsenide Announcements, 1960 to 1990

[Chart of component density per chip, 1960 to 1990, approaching a noted physical limit.]

SOURCE: AT&T Bell Telephone Laboratories.

Microcomputers have also changed from an obscure hobbyist item to a standard and necessary piece of equipment in many homes, businesses, and Government offices. Figure 9 shows that the number of microcomputers in the Federal Government rose from only a few thousand in 1980 to about 100,000 in 1985. Table 4 shows comparable trends in the Nation as a whole, with sales of personal computers rising from 380,000 in 1980 to almost 6 million in 1985.

Table 4.—Sales of Personal Computers in the United States

                            Units        Value (billions
Year                     (thousands)       of dollars)
1980 ................        379               1.1
1981 ................        644               1.9
1982 ................      2,884               4.2
1983 ................      5,872               8.7
1984 ................      6,586              13.0
1985 ................      5,689              13.3
1986 (estimated) ....      6,633              14.6
1987 (estimated) ....      7,414              15.9
1988 (estimated) ....      8,262              17.7
1989 (estimated) ....      9,317              19.8
1990 (estimated) ....     10,120              21.6

SOURCE: International Data Corp.

Microcomputers are computers in their own right and thus require appropriate administrative and technical security measures to safeguard the information they process. Also, microcomputers can be networked and/or function as "smart" terminals to larger computer systems. Thus, the rapid proliferation of microcomputers not only can place computing power in the hands of an increasing number of computer-literate individuals, but also can decentralize data processing capabilities. For example, employees of many firms or Government agencies are able to collect and manipulate information on their own desk-top microcomputers. Additionally, they are able to use these microcomputers to copy or "download" large amounts of information from the organization's central computer and also to "upload" information they have collected or manipulated into the central computer files.

The expanding use of microcomputers can have an adverse effect on security if the appropriate security policies, procedures, and practices are not in effect. For instance, in a computer system that is not organized so as to control and monitor users' access to data files and constrain the data transactions that authorized users are permitted to perform, users can copy or manipulate data in an essentially uncontrolled fashion. There is a growing array of add-on microcomputer security products that address security problems of this sort, as well as an increasing awareness of the importance of using these safeguards. Also, the capability (at the mainframe computer) to control the downloading/uploading of data files is well within current technology. However, actual practice often falls short of the ideal, particularly in firms that do not recognize the value of electronic information in microcomputers and the importance of safeguarding it.

Using microcomputers instead of "dumb" terminals, on the other hand, can help if good security practices control the downloading and uploading of data files and if the system is configured properly. For example, data integrity can be improved by procedures requiring authorized users to make additions, corrections, or other modifications to large data files on a microcomputer. If the modified data are checked, and only then uploaded to the main computer files, then the probability of accidental or malicious deletions of main files, for instance, can be reduced.
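The check-then-upload procedure described above can be sketched as follows. This is a hypothetical illustration: the record format and validation rule are invented, and a modern hash function stands in for whatever checksum a real system would use.

```python
import hashlib

# Hypothetical sketch of the check-then-upload discipline described in
# the text: a data file modified on a microcomputer is validated first,
# and only if it passes is it allowed to replace the central copy.

def checksum(data: bytes) -> str:
    """Fingerprint stored alongside the central copy for later integrity checks."""
    return hashlib.sha256(data).hexdigest()

def validate(records):
    # Toy validation rule: every record must be "name,amount" with a
    # numeric amount. A real system would apply its own business rules.
    for rec in records:
        parts = rec.split(",")
        if len(parts) != 2 or not parts[1].strip().lstrip("-").isdigit():
            return False
    return True

def upload(central, filename, records):
    """Replace the central file only after the edited copy validates."""
    if not validate(records):
        return False                    # reject; central copy is untouched
    data = "\n".join(records).encode()
    central[filename] = (data, checksum(data))
    return True

central_files = {}
ok = upload(central_files, "payroll.dat", ["ada,1000", "bob,950"])
bad = upload(central_files, "payroll.dat", ["ada,1000", "bob,oops"])
assert ok and not bad                   # the bad edit never reaches the mainframe
```

The rejected upload leaves the central file and its stored checksum unchanged, which is the property the report's integrity procedure is after.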

Figure 9.—Microcomputers in Federal Agencies

[Bar chart of microcomputers by agency, 1980 and 1985; recoverable values include 16,115 (DHHS) and 16,842. Horizontal axis: number of microcomputers, 0 to 10,000.]

*20 independent agencies selected by OTA to receive the data request.

Software

Technical sophistication in software and in databases has progressed more slowly than hardware advances. In fact, many people now recognize that software is the bottleneck for many prospective applications of information technology. Nevertheless, the past two decades have seen significant increases in the size of databases that can be reasonably accommodated by software and in the sophistication of the software itself. This means, for example, that software can link disparate pieces of information in a database more readily and that users can make inquiries of databases using more natural commands.

These changes give more people direct access to computerized data, and the databases contain far more information that is subject to both authorized and unauthorized use. Further, although not a subject of significant concern to many users, some security experts consider that the inferential ability to link pieces of information in a database or from different databases can have subtle but important implications for security. To date, most attention to this type of problem has come from the defense and intelligence communities, but the problem can be more general. Even when the most sensitive information is unavailable, an adversary can infer critical data by combining pieces of apparently innocuous information (e.g., determining information about the design of a company's product from its orders for raw materials).

The Extent of Computer Misuse

A variety of recent studies have indicated substantial increases in computer misuse. However, information available about the extent of computer misuse is spotty. Moreover, these studies suffer from serious shortcomings that make generalizations difficult (large successful frauds are often not reported, let alone prosecuted).

The most significant studies and findings include:

● The American Bar Association's 1984 "Report on Computer Crime." In a survey of 283 public and private sector organizations, ABA found that 25 percent of the respondents reported "known and verifiable losses due to computer crime during the last 12 months."
● The American Institute of Certified Public Accountants' 1984 "Report on the Study of EDP-Related Fraud in the Banking and Insurance Industries." AICPA surveyed 5,127 banks and 1,232 insurance companies. Two percent of the banks and 3 percent of the insurance companies said they had experienced at least one case of fraud related to electronic data processing. Sixteen percent of the frauds were reported to involve more than $10,000, although that figure does not reflect funds that were recovered.
● The President's Council on Integrity and Efficiency issued "Computer-Related Fraud in Government Agencies: Perpetrator Interviews" in May 1985. The Council surveyed Federal agencies and found a total of 172 relevant cases of computer fraud or abuse. The losses in fraud cases ranged from zero to $177,383, with the highest proportion in the $10,000 to $100,000 range.
● The Department of Justice's Bureau of Justice Statistics 1986 report "Electronic Fund Transfer System Fraud." This reported a study of fraud related to the transfer of electronic funds in key banks. The study estimated that banks nationwide lost $70 million to $100 million annually from automatic teller fraud. It also examined losses from wire transfers, although there were insufficient data to estimate national loss levels. Twelve banks reported 139 wire transfer fraud incidents within the preceding 5 years, with an average net loss (after recovery efforts) per incident of $18,861. However, the loss exposure, or the potential loss per wire transfer incident, averaged nearly $1 million.
● Security magazine and the Information Systems Security Association surveyed their subscribers and members in 1985 and 1986. Eighteen percent of the 1986 respondents reported that their company had detected a computer crime in the last 5 years, compared with 13 percent reported by the 1985 respondents. The respondents rated the threats to computers, in descending order: unauthorized use by employees, routine errors and omissions, carelessness with printouts, theft of computers, fire damage, use/misuse by outsiders, and vandalism.

While these studies are far from conclusive, it is apparent that deliberate misuse of computers is a significant and growing problem.
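The inference problem described earlier, in which an adversary combines apparently innocuous records to deduce sensitive facts, can be made concrete with a toy example; all of the data and the inference rule below are invented:

```python
# Toy illustration of the inference (aggregation) problem: neither
# dataset is sensitive on its own, but linking them by company suggests
# what that company is building. All records and the rule are invented.

purchase_orders = [
    ("Acme", "lithium cells", 50_000),
    ("Acme", "pressure hulls", 12),
    ("Beta", "office chairs", 300),
]
shipping_logs = [
    ("Acme", "naval test range"),
    ("Beta", "headquarters"),
]

def infer(company):
    """Link two innocuous record sets and draw an unstated conclusion."""
    materials = {m for c, m, _ in purchase_orders if c == company}
    destinations = {d for c, d in shipping_logs if c == company}
    if "pressure hulls" in materials and "naval test range" in destinations:
        return "likely building submersibles"   # inferred, never stated anywhere
    return None

assert infer("Acme") == "likely building submersibles"
assert infer("Beta") is None
```

The point is the one the text makes: access controls on the sensitive conclusion alone do not help if the contributing records remain individually accessible.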

TYPICAL VULNERABILITIES OF INFORMATION SYSTEMS

The combined advances in communications and computer technologies have resulted in information systems that are an order of magnitude more complex than those of 10 or 20 years ago. Not only is computing power greatly increased, but it is also more decentralized—and communication between computers and interconnected devices has become far more pervasive. In some ways, this has improved security in that, for example, information is no longer stored in just one large computer, which could result in chaos if it failed. On the other hand, information systems are much more extensively linked and interdependent, and the number of points from which technical failure, deliberate misuse, or accidental errors could cause serious problems has increased geometrically.

To illustrate the numerous vulnerabilities of computer systems, figure 10 presents a schematic diagram of a typical information system. The circled letters in the figure correspond to certain types of vulnerabilities (discussed below) that encompass the vast majority of potential problems caused by deliberate misuse of computer systems. Some problems are more important in some systems than in others, and potential adversaries may be more or less sophisticated. As will be seen in chapter 4, good security practices would require security officials to perform an analysis of each system to determine which vulnerabilities and threats are most significant and what protective measures would be most appropriate and cost-effective.

The first two kinds of vulnerabilities do not require direct on-line access to data and, generally, an adversary needs relatively few resources to exploit them. The first (a) is theft of storage media that contain valuable data. Theft (or copying) of personal computer diskettes, for example, can be particularly easy because personal computer users often do not lock up their diskettes. Similarly, theft of printouts (b), especially discarded ones, is typically quite easy and has been the source of a significant amount of computer abuse. The printouts may contain valuable competitive information or account and password information that allows an unauthorized person to later gain electronic access to the system.

The next type of vulnerability is misuse of computer systems by those who are authorized to use them (c). The misuse can consist, for example, of stealing corporate secrets, changing personnel information, causing falsified checks to be written, or damaging databases. This type of misuse typically requires only a moderate level of sophistication on the part of the perpetrator, although in some cases (e.g., falsified disbursements) it requires collusion between two or more workers.

Moving up one step further in level of adversary, outsiders who gain unauthorized access to a computer system can perpetrate the same kinds of misuse. This usually requires covertly obtaining an account name and password by, for example, looking over the shoulder of an authorized user, finding a discarded computer printout, using codes written on cards or pieces of paper taped to the terminal, or simply guessing. Such an outsider could either seek to gain access to an authorized user's local terminal or personal computer (c) or could try to access the system from a remote location via phone lines (d). Access via phone lines is inherently less risky for the perpetrator since it is often less protected by security measures than local access. The dangers of hobbyists prowling in computers via phone lines are often overstated compared with misuse by those authorized to use computer systems. However, it is likely that long-distance computer abuse will continue to grow and that more serious adversaries (e.g., technically adept criminals, including organized crime) will be involved.

Computer system operators (e), such as programmers and managers, sometimes have access to user passwords. Although this is becoming less common, they still generally have access to stored files unavailable to other users. In particular, programmers have the technical expertise to perpetrate sophisticated sabotage and misuse, including such exotic attacks as "logic bombs" that render a system unusable after a specified period of time or at a specific time (often after the disgruntled programmer is no longer employed at the site).

The last two vulnerabilities, eavesdropping on computer transmissions through either telecommunications links (f) or local connections (g), are discussed in previous sections of this chapter. Chapter 4 describes some of the safeguards that have been developed to address these vulnerabilities and prevent such crimes in the future. Figure 11 shows how these safeguards can be used in computer networks.

Figure 10.—Typical Vulnerabilities of Computer Systems

SOURCE: Office of Technology Assessment, 1987.

Figure 11.—Technical Safeguards for Computer Systems

Chapter 4

Security Safeguards and Practices

CONTENTS

Findings
Introduction
Encryption
  Encryption Algorithms
  Message Authentication
  Public-Key Ciphers
  Digital Signatures
  New Technologies for Private and Secure Transactions
  Key Management
  Voice and Data Communications Encryption Devices
Personal Identification and User Verification
  Background
  Conventional Access Controls
  Biometric and Behavioral Identification Systems
Access Control Software and Audit Trails
  Host Access Control Software
  Audit Trails
Administrative and Procedural Measures
Computer Architectures
Communications Linkage Safeguards
  Port-Protection Devices
  Satellite Safeguards
  Fiber Optic Communications
  Common Carrier Protected Services

Figures
12. Common Administrative, Physical, and Technical Information Security Measures
13. DES Encryption in Electronic Codebook Mode
14. Federal Standard for Authentication
15. Public-Key Ciphers
16. Digital Signatures Using a Public Key
17. A Description of the Past Network Environment
18. A Description of the Current/Future Network Development
19. The Mechanics of See-Through Security
20. Biometric Identification Configuration Alternatives: Host-Based v. Stand-Alone
21. Example Reports From Audit Trail Software

Tables
5. Major Characteristics of Automated Biometric Identification Types
6. Configurations and Applications of Biometric Devices

Boxes
B. An Example of DES Encryption
C. Application of Message Authentication to Electronic Funds Transfer
D. Host Access Control Software

Chapter 4

Security Safeguards and Practices

FINDINGS

● Technical safeguards for computer and communications systems are still evolving, as is users' understanding of their needs for them. Products and systems are available for controlling access and auditing use and for encrypting data.

● Technical safeguards alone cannot protect information systems completely. Ef- fective information security requires an integrated set of safeguard technol- ogies, management policies, and administrative procedures.

● Information security hinges on the security of each segment of the increasingly intertwined computer and communications network.

● A number of important techniques are emerging to verify the identities of the senders of messages, authenticate their accuracy, and ensure confidentiality. Mathematical techniques using cryptography can not only provide improved information security, but also broaden the applicability of electronic transactions in commerce.

● The Federal Government has played an important role in promoting technical standards for selected information safeguards, particularly for cryptography. Yet the public position of the Government in general, and the National Security Agency in particular, has been inconsistent. This inconsistency is especially apparent in providing Federal leadership for the development of information security standards; e.g., in NSA's reversal of endorsements of an open encryption algorithm and of dependence on consensus agreement in developing encryption-based security standards.

● Questions are being raised about the efficacy of the NSA’s developing unified sets of standards and guidelines for government-wide and private nondefense use.

INTRODUCTION

Like the range of threats and vulnerabilities that afflict different information systems, there is a wide range of safeguards that can help protect them. Although administrative and procedural measures are also fundamentally important to good overall security, this chapter concentrates primarily on technical safeguards.¹ These include the following:

● Encryption, which can be used to encode data prior to transmission or while stored in computers, to provide an electronic "signature," and to verify that a message has not been tampered with.
● Personal identification and user verification techniques, which can help ensure that the person using a communications or computer system is the one authorized to do so and, in conjunction with access control systems and other security procedures, that authorized users can be held accountable for their actions.
● Access control software and audit trails, which protect information systems from unauthorized access and keep track of each user's activities.
● Computer architectures that have been specifically designed to enhance security.
● Communications linkage safeguards, which hamper unauthorized access to computers through phone lines.

¹This section examines safeguards for both computers and communications since many of the measures discussed apply to both.

Technology that can help promote information security can be divided into administrative, physical, and technical measures. Figure 12, which shows examples of each of these categories, demonstrates the diversity of safeguard applications and the range of approaches to improved safeguards.

Figure 12.—Common Administrative, Physical, and Technical Information Security Measures

Administrative security measures:
● Background checks for key computer employees.
● Requiring authority of two employees for disbursements.
● Requiring that employees change passwords every few months, do not use the names of relatives or friends, and do not post their passwords in their offices.
● Removing the passwords of terminated employees quickly.
● Providing security training and awareness programs.
● Establishing backup and contingency plans for disasters, loss of telecommunications support, etc.
● Storing copies of critical data off-site.
● Designating security officers for information systems.
● Developing a security policy, including criteria for sensitivity of data.
● Providing visible upper management support for security.

Physical security measures:
● Locking up diskettes and/or the room in which microcomputers are located.
● Key locks for microcomputers, especially those with hard disk drives.
● Requiring special badges for entry to the computer room.
● Protecting computer rooms from fire, water leakage, and power outages.
● Not locating major computer systems near airports, loading docks, flood or earthquake zones.

Technical security measures:
● Audit programs that log activity on computer systems.
● Access control systems that allow different layers of access for different sensitivities of data.
● Encrypting data when it is stored or transmitted, or using an encryption code to authenticate electronic transactions.
● Techniques for user identification, ranging from simple ones such as magnetic stripe cards to more esoteric "biometric" techniques, which rely on hand or eye scanners (just beginning to be used).
● "Kernel"-based operating systems, which have a central core of software that is tamperproof and controls access within the system.*
● "Tempest" shielding that prevents eavesdroppers from picking up and deciphering the signals given off by electronic equipment.*

*Generally used only in military or other national security applications in the United States.

SOURCE: U.S. Congress, Office of Technology Assessment, Federal Government Information Technology: Management, Security, and Congressional Oversight, OTA-CIT-297 (Washington, DC: U.S. Government Printing Office, February 1986), p. 61.

The systems of safeguards that are being developed fall into categories that control access to data or monitor user activities and others that protect the integrity of data, e.g., verify its accuracy. Technology is paving the way for further improvements in these and still other categories. Systems that will combine improved message integrity with prevention of unauthorized activity are beginning to set the stage for major new applications with broad commercial applications.

Security is never just a "black box" of technical safeguards that can be purchased and added to computer or communications systems. Moreover, technical measures would be fruitless unless accompanied by suitable policies and administrative procedures. For security measures to be effective, they must be planned for and managed throughout the design and operation of computer and communications systems. This chapter, however, mainly discusses the technology of safeguarding information systems.

In addition, for many types of users, the combination of reasonable effectiveness and convenience is more important than extremely high security. Determining which safeguards are appropriate for a particular computer or communications system requires an analysis of such factors as the value of the information at risk, the value of the system's reliability, and the cost of particular safeguards. Security experts disagree about how this "risk analysis" ought to be conducted and about the problems with, and validity of, risk analyses. But some form of risk analysis—whether formal or informal, quantitative or qualitative—remains the chief means by which managers can assess their needs and evaluate the costs and benefits of various security measures.
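A common quantitative form of the risk analysis described above reduces each threat to an annualized loss expectancy (ALE): the loss per incident times the expected number of incidents per year. The threat list and dollar figures in the sketch below are invented for illustration, not data from this report:

```python
# Illustrative annualized loss expectancy (ALE) calculation, a simple
# quantitative form of risk analysis: expected annual loss per threat
# = (loss per incident) x (expected incidents per year). A safeguard
# is arguably worth buying when it costs less than the loss it averts.

threats = {
    # threat name: (loss per incident, expected incidents per year)
    "theft of diskettes":      (5_000,  0.5),
    "falsified disbursements": (50_000, 0.1),
    "phone-line intrusion":    (20_000, 0.25),
}

def ale(loss_per_incident, annual_rate):
    """Annualized loss expectancy for a single threat."""
    return loss_per_incident * annual_rate

total = sum(ale(loss, rate) for loss, rate in threats.values())

for name, (loss, rate) in threats.items():
    print(f"{name:26s} ALE = ${ale(loss, rate):,.0f}")
print(f"{'total':26s} ALE = ${total:,.0f}")
```

Comparing a safeguard's annual cost against the ALE it reduces gives exactly the cost-benefit comparison the text describes.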

The National Bureau of Standards (NBS) has played an important role in developing computer security standards. This role has become complicated by the recent entry of the NSA into the standards arena and by NSA efforts to develop comprehensive standards suitable for all users' needs.

There are four driving forces behind the emergence of the new safeguard technologies:
1. developments in microelectronics and information processing, such as smart cards and other hardware implementing encryption algorithms;
2. developments in cryptography, such as asymmetric and public-key ciphers;
3. developments in the mathematics underlying signal processing and cryptography; and
4. developments in software, particularly for processing biometric personal identification data.

A number of technologies exist that can verify that individual users are who they claim to be. Similarly, technologies exist to authenticate the integrity of a message and to ensure its confidentiality. These developments are being applied mainly to solve some of today's problems concerning information security. Technologies for user verification, often intended for use in conjunction with other access control systems, include: hand-held password generators, "smart" key cards with embedded microprocessors, and a number of personal identification systems based either on biometric measurements or other individual characteristics. Message authentication techniques rely on combinations of encrypting and/or "hashing" schemes to create a code authenticating the accuracy of a message's content. A variation of this technique can provide a "digital signature" that not only authenticates the message, but also establishes the identity of its sender. Encryption methods are widely available to protect against unauthorized disclosure of messages.

What is becoming increasingly apparent, however, is that some of this same technology has far greater potential uses. One of the central observations of this chapter is that measures, particularly technical measures, are beginning to be developed that provide some of the tools likely to prove important in the long term for more secure operation of electronic information systems in uncontrolled, even hostile environments. These include environments, such as the public switched telephone network, for example, where sensitive data is unavoidably exposed to risks of being improperly accessed, modified, or substituted, or where errors can be introduced by the system itself, as from normal electronic noise in communications systems. Information security technology shows promise for greatly expanding the range of applications of computer and communications systems for commerce and society. It will accomplish this by reducing the cost of many of today's paper-based business transactions, by providing legally binding contracts executed electronically, and by protecting intellectual property and the privacy of personal data stored in electronic form. (See ch. 5.)

To achieve most of the above, cryptography is critically important. There are no close substitutes for cryptography available today. Cryptography, however, is a technology in which the Government has acted somewhat inconsistently by controlling private sector activity in some ways, while occasionally stimulating it in others. Thus, the technology that is important to future applications of information security is coupled to Federal policies that can encourage or inhibit its advancement. Options for the future role of Federal policies in influencing technological developments are discussed in chapter 7.

There are two principal uncertainties in the future development of safeguards. The first is the extent to which users of computer and communications systems will, in fact, buy and use the safeguards that are available. Some of the key factors that will influence users' actions include their evolving awareness of threats and vulnerabilities, the practices of their insurance companies, the evolution of "standards of due care" related to security practices, the Federal role as a leader and shaper of the field, and news media attention to incidents of misuse. Information and communication system risk analyses, based on historical threat and vulnerability profiles, will influence the marketplace for safeguards. If the demand for safeguards increases, then the market will no doubt respond with more products and techniques. On the other hand, if many users' interest in security levels off, there may be a shakeout in the market for safeguard devices, perhaps leaving mainly those products developed for Government agencies.

The second major uncertainty is the extent to which vendors of these safeguards, in collaboration with users, will be able to develop systems that use multiple safeguards in a simple, integrated fashion. If demand for safeguards becomes a significant fraction of the overall computer and communications system market, the resulting products are more likely to be well integrated, easy to use, and low cost. For someone who needs to gain access to his or her company's mainframe computer from home, for example, appropriate safeguards might include the functions of a hand-held personal identification device, encryption of the telecommunications link, passwords, dial-back modems, and audit logs at both the microcomputer and the host computer. Using such a combination would be tremendously cumbersome at present, requiring multiple pieces of hardware, software, and passwords. Thus, a major challenge for the industry is to develop systems that allow the various safeguards to work together and to become virtually invisible to the user, as well as cost-effective.

ENCRYPTION

Encryption is the most important technique for improving communications security. It is also one of several key tools for improving computer security. Good-quality encryption is the only relatively sure way to prevent many kinds of deliberate misuse in increasingly complex communications and computer systems with many access points. Of course, encryption is not a panacea for information security problems. It must be used in concert with other technical and administrative measures, as described below. In particular, effective key management is crucial.

Encryption Algorithms

The various techniques for encrypting messages, based on mathematical algorithms, vary widely in their degree of security. The choice of algorithms and the method of their development have, in fact, been among the most controversial issues in communications and computer security. (See ch. 6.) The various algorithms currently available differ along the following dimensions:

● The mathematical sophistication and computational complexity of the algorithm itself.—More complex algorithms may be (though not necessarily) harder for an adversary to decrypt or break.
● Whether the algorithm is for a symmetric cipher or an asymmetric one.—Symmetric ciphers use the same key for encryption and decryption, while asymmetric ciphers use different but related keys.
● The length of the key used to encrypt and decrypt the message.—Each algorithm uses a series of numbers known as a key that can change with each message, with each user, or according to a fixed schedule. Generally, for an algorithm of a given complexity, longer keys are more secure. One of the important factors in selecting keys is to make sure that they cannot be easily guessed (e.g., using a phone number) and that they are as random as possible (so that an adversary cannot determine a pattern linking all the keys if one is discovered).
● Whether the algorithm is implemented in software (programming) or hardware (built into an integrated circuit chip).—Hardware tends to be much faster than software, although less versatile and portable from one machine to another.
● Whether the algorithm is open to public scrutiny.—Some nongovernment experts argue that users have more confidence in an algorithm if it is publicly known and subject to testing. NSA and others, on the other hand, assert that the algorithm is one of three essential pieces of information an adversary must have to decrypt a message (along with the key and access to the message itself) and that secret algorithms are thus more secure.² A related argument is that if an algorithm is publicly known, standardized, and widely used, it becomes a more attractive target for cracking than algorithms that are seldom used. The Data Encryption Standard (DES, see below) is one of the few working algorithms that is open to public scrutiny. Most of the other privately developed, and all of the NSA-developed, algorithms currently in use have been kept secret.

DES is probably the most widely known modern encryption algorithm. (See app. C for background on its development.) Based on an algorithm developed by IBM, DES was issued as a Federal standard in 1977. Although it has been publicly known and subject to scrutiny for more than 10 years, most experts are confident that it is secure from virtually any adversary except a foreign government. The level of security is gradually weakening, however, because of the decreasing cost of computer power and the possibility of using many computing devices in parallel to crack the algorithm.³

DES has four approved modes of operation, specified in FIPS Publication 81 ("DES Modes of Operation," Dec. 2, 1980). The modes vary in their characteristics and properties. The four modes are the electronic codebook (ECB), cipher block chaining (CBC), cipher feedback (CFB), and output feedback (OFB) modes. (See app. C.) The CBC and CFB modes can be used for message authentication. The ECB mode, the simplest to understand, is illustrated in figure 13 and box B. One property of this mode, however, is that the same plaintext will always produce identical ciphertext for a given encryption key. This characteristic makes the ECB mode less desirable, especially for repetitive messages or messages with common content (e.g., routing headers or logon identifications), because a known-plaintext cryptographic attack, i.e., one where both the encrypted and unencrypted text are available to the cryptanalyst, is more easily mounted.

DES is a "private key" cryptographic algorithm, which means that the confidentiality of the message, under normal conditions, is based on keeping the key secret between the sender and receiver of the message. (See the section on key distribution, below.) The other principal form of algorithm is called a "public key" system, which uses two keys that are mathematically related—one that each user publishes and one that he keeps secret. Using a public key system, many people can encrypt messages sent to a particular correspondent (using his or her public key), but only that correspondent can decrypt the messages, because the decryption key is (in principle) kept secret. These algorithms are discussed in more detail below, and also in appendixes C and D.

The development of encryption algorithms has been a rather idiosyncratic, scattered process, and is likely to continue to be. The academic community of cryptographic researchers is a growing and active one, although its numbers are relatively small compared to some other scientific fields. Only a handful of people in the United States outside NSA have attempted seriously to create, validate, and implement new high-quality encryption algorithms. Most algorithms currently in use can be traced to the work of a few individuals. Cryptographic research requires a high level of ability in specialized areas of mathematics and/or computer science. Different skills are required to develop operational safeguards than for theoretical research.
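The "public key" idea described above can be illustrated with a toy version of the RSA algorithm, one well-known public-key scheme. The tiny primes and the numeric message below are purely illustrative; real systems use keys hundreds of digits long:

```python
# Toy sketch of a public-key ("two key") cipher using the RSA scheme.
# Anyone may encrypt with the published pair (e, n); only the holder
# of the secret exponent d can decrypt. Illustrative numbers only.

p, q = 61, 53                  # secret primes
n = p * q                      # modulus (published)
phi = (p - 1) * (q - 1)        # kept secret
e = 17                         # public exponent (published)
d = pow(e, -1, phi)            # secret exponent, modular inverse (Python 3.8+)

def encrypt(m, public_exponent, modulus):
    # uses only published information
    return pow(m, public_exponent, modulus)

def decrypt(c, secret_exponent, modulus):
    # requires the secret key
    return pow(c, secret_exponent, modulus)

message = 65                   # a message encoded as a number smaller than n
cipher = encrypt(message, e, n)
assert cipher != message
assert decrypt(cipher, d, n) == message
```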

²"Why Not DES?" Computers and Security, vol. 5, March 1986, pp. 24-27.
³R. Rivest, Massachusetts Institute of Technology, personal communication with OTA staff, Feb. 4, 1987.
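The key-length and parallel-search points above reduce to simple arithmetic, sketched below. The trial rates are assumptions for illustration, not figures from this report:

```python
# Back-of-envelope arithmetic behind the key-length argument: a DES
# key has 56 independent bits, so there are 2**56 possible keys, and
# an exhaustive search tries half of them on average. Parallel
# hardware multiplies the trial rate, which is exactly why longer
# keys buy security.

SECONDS_PER_YEAR = 365 * 24 * 3600

def average_search_years(key_bits, keys_per_second):
    expected_trials = 2 ** (key_bits - 1)   # half the keyspace
    return expected_trials / keys_per_second / SECONDS_PER_YEAR

print(f"56-bit keyspace: {2 ** 56:,} keys")
for rate in (1e6, 1e9, 1e12):               # assumed keys tried per second
    years = average_search_years(56, rate)
    print(f"  at {rate:.0e} keys/second: about {years:,.4f} years on average")
```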

Figure 13.—DES Encryption in Electronic Codebook Mode


SOURCE: NBS FIPS Publication 74, Apr. 1, 1981, pp. 21-23.

Despite the relatively small size of the scientific community, cryptography has been a controversial science. For example, there have been controversies concerning attempts by NSA to control Federal research funding as well as the publication and patenting of private sector and academic results in cryptographic research during the past decade for reasons of national security.⁴ NSA does not at present have the legislated authority to require prepublication review of independent, nongovernment research.

However, following the controversy sparked in part by secrecy orders imposed in 1978 on two patent applications for cryptographic inventions, NSA, in concert with some academic researchers, instituted a voluntary review for cryptography manuscripts.⁵ Through this process, researchers may submit manuscripts to NSA prior to their publication, giving NSA the opportunity to request suppression of sensitive material. Although many researchers and research institutions take part in this voluntary process, others do not, considering it a threat to the free exchange of scientific ideas.⁶

The voluntary review service is similar to the one proposed by the Public Cryptography Study Group of the American Council on Education (ACE), which was assembled in 1980 at the request of NSA. The group accepted the premise that "some information contained in cryptology manuscripts could be inimical to the national security of the United States." It recommended a voluntary rather than statutory solution to this problem.⁷ However, some researchers, including one member of the ACE group, felt that even voluntary restraints would affect the quality and direction of basic research in computer science, engineering, and mathematics.⁸

⁴Tom Ferguson, "Private Locks, Public Keys, and State Secrets: New Problems in Guarding Information with Cryptography," Harvard University Center for Information Policy Research, Program on Information Resources Policy, April 1982.
⁵U.S. Congress, House Committee on Government Operations, "The Government's Classification of Private Ideas," Thirty-Fourth Report (House Report No. 96-1540), 96th Cong., 2d Sess., Dec. 22, 1980.
⁶"Brief U.S. Suppression of Proof Stirs Anger," The New York Times, Feb. 17, 1987, p. C3.
⁷"Report of the Public Cryptography Study Group," Academe, vol. 67, December 1981, pp. 372-382.
⁸"The Case Against Restraints on Nongovernmental Research in Cryptography: A Minority Report by Professor George I. Davida," Academe, December 1981, pp. 379-382.

Box B.—An Example of DES Encryption

The Electronic Codebook (ECB) mode is a basic, block, cryptographic method which transforms 64 bits of input to 64 bits of output as specified in FIPS PUB 46. The analogy to a codebook arises because the same plaintext block always produces the same ciphertext block for a given cryptographic key. Thus a list (or codebook) of plaintext blocks and corresponding ciphertext blocks theoretically could be constructed for any given key. In electronic implementation the codebook entries are calculated each time for the plaintext to be encrypted and, inversely, for the ciphertext to be decrypted.

Since each bit of an ECB output block is a complex function of all 64 bits of the input block and all 56 independent (non-parity) bits of the cryptographic key, a single bit error in either a ciphertext block or the non-parity key bits used for decryption will cause the decrypted plaintext block to have an average error rate of 50 percent. However, an error in one ECB ciphertext block will not affect the decryption of other blocks, i.e., there is no error extension between ECB blocks.

If block boundaries are lost between encryption and decryption (e.g., a bit slip), then synchronization between the encryption and decryption operations will be lost until correct block boundaries are reestablished. The results of all decryption operations will be incorrect until this occurs.

Since the ECB mode uses a 64-bit block, an ECB device must encrypt data in integral multiples of 64 bits. If a user has less than 64 bits to encrypt, then the least significant bits of the unused portion of the input data block must be padded, e.g., filled with random or pseudo-random bits, prior to ECB encryption. The corresponding decrypting device must then discard these padding bits after decryption of the ciphertext block.

The same input block always produces the same output block under a fixed key in ECB mode. If this is undesirable in a particular application, the CBC, CFB, or OFB modes should be used. An example of the ECB mode is given in table B1.
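As a concrete check on the 64-bit block structure, the 24-character ASCII message used in table B1 splits into three 8-byte blocks whose hexadecimal values can be verified with a few lines of code. Only the ASCII-to-hexadecimal correspondence is checked here; the DES transformation itself would require a full DES implementation and is not sketched:

```python
# The plaintext side of table B1, checked directly: the 24-character
# ASCII message fills exactly three 64-bit (8-byte) blocks, and each
# block's hexadecimal value matches a DES input block in the table.

message = b"Now is the time for all "      # 24 bytes, trailing space
blocks = [message[i:i + 8] for i in range(0, len(message), 8)]

expected_hex = [
    "4e6f772069732074",   # "Now is t"
    "68652074696d6520",   # "he time "
    "666f7220616c6c20",   # "for all "
]

assert [b.hex() for b in blocks] == expected_hex
assert all(bytes.fromhex(h) == b for h, b in zip(expected_hex, blocks))
print("table B1 plaintext blocks:", [b.hex() for b in blocks])
```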

Table B1.—An Example of the Electronic Codebook (ECB) Mode

The ECB mode in the encrypt state has been selected.
Cryptographic key = 0123456789abcdef
The plaintext is the ASCII code for "Now is the time for all." These seven-bit characters are written in hexadecimal notation (0, b7, b6, ..., b1).

Time  Plaintext          DES input block    DES output block   Ciphertext
1     4e6f772069732074   4e6f772069732074   3fa40e8a984d4815   3fa40e8a984d4815
2     68652074696d6520   68652074696d6520   6a271787ab8883f9   6a271787ab8883f9
3     666f7220616c6c20   666f7220616c6c20   893d51ec4b563b53   893d51ec4b563b53

The ECB mode in the decrypt state has been selected.

Time  Ciphertext         DES input block    DES output block   Plaintext
1     3fa40e8a984d4815   3fa40e8a984d4815   4e6f772069732074   4e6f772069732074
2     6a271787ab8883f9   6a271787ab8883f9   68652074696d6520   68652074696d6520
3     893d51ec4b563b53   893d51ec4b563b53   666f7220616c6c20   666f7220616c6c20

SOURCE: NBS, FIPS Publication 81, Dec. 2, 1980, pp. 12-13.

Currently, although some researchers feel that tensions between NSA and the research community have eased, others still consider that the prospect of NSA controls may discourage researchers, particularly academics, from pursuing problems related to cryptography. The issue continues to simmer, particularly because cryptography presents some interesting mathematical problems. For example, a controversy recently arose when the U.S. Patent and Trademark Office, at the request of the U.S.
Army, placed a secrecy order—which the Army later requested be rescinded—on a patent application filed by Israel's Weizmann Institute. The patent application dealt with an area of mathematics called "zero-knowledge proof," pioneered by Silvio Micali and colleagues at the Massachusetts Institute of Technology, that is considered to hold great promise for identification procedures ranging from credit card verification to military "friend or foe" recognition signals.⁹

Another controversy concerns NSA's decision not to recertify DES when it comes up for its 5-year review in 1987. NSA announced in 1986 that it will continue to endorse cryptographic products using DES until January 1, 1988, but not certify the DES algorithm or new DES products after that date, except for electronic funds transfer applications. However, DES equipment and products endorsed prior to January 1, 1988, may be sold and used after that date. In justifying this decision, NSA argues that DES has become too popular and widespread in use, and thus too attractive a target for adversaries seeking to crack it. Some observers have expressed concern that the NSA decision implies that DES is no longer secure. However, NSA has stated that there are no known security problems or risks involved with the continued use of DES equipment.¹⁰

Instead of recertifying DES, NSA plans to provide and certify three classified algorithms. The new algorithms will use tamper-protected, integrated circuit modules directly in the products of qualified vendors. This decision officially affects only U.S. Government agencies and contractors, but it may discourage others from using DES except for electronic financial transactions.¹¹ The NSA plans affect safeguard vendors in two major ways: first, only selected U.S. vendors will be allowed to purchase the modules for incorporation into their products, and second, classified information (and the need to handle and protect such information) will be introduced into the product design process.¹² Also, some industry sources have expressed concern that the new secret algorithms are of uncertain reliability and will likely allow NSA itself to eavesdrop on their communications.¹³

In any case, industry has certain needs, most notably for easily exportable encryption devices and software-based encryption, that the new algorithms are unlikely to meet. Many experts consider software-based encryption less secure than hardware-based encryption, in part because the key might be exposed during encryption. Also, encryption using software is much slower than that using hardware or firmware devices. Nevertheless, some private sector users prefer software because it is inexpensive and compatible with their existing equipment and operations. For instance, reader surveys conducted by Security magazine in 1985 and 1986 found that about half of the respondents stated that they used encryption software.¹⁴

To date, there are no Federal software encryption standards, and NSA has stated that it will not endorse software encryption products. Also, the new encryption modules are not exportable. NSA has not yet announced whether it will provide exportable modules for use by the private sector. Thus, the NSA decision not to recertify DES has cast doubt on the reliability of the algorithm without providing a replacement that can meet the full range of users' needs. Chapter 6 discusses Federal policy in more detail.

OTA's analysis suggests that there are certain kinds of algorithms not widely available that would substantially increase the range of applications for which encryption would be useful. These include algorithms that are very fast (require little processing time), secure enough to ensure confidentiality for relatively short periods (e.g., days or months for financial transactions, as opposed to years or decades for defense and intelligence information), and easily implemented in software, especially software for microcomputers. In addition, because of the widespread acceptance of DES for unclassified information, some experts argue that it would be fruitful to develop an improved version of that algorithm that would lengthen the key while using the same essential scheme. However, the commercial market for cryptographic safeguards is still new and small, and it has thus far been dominated by DES. Although a number of firms—mostly NSA contractors or spinoffs of these—are reportedly working on new encryption algorithms and products for the commercial market, as of early 1987 public-key systems are the only area of encryption algorithm development in which substantial nongovernment research and development is evident.

⁹IEEE Subcommittee on Privacy, meeting at OTA, July 8, 1986. The invention was made by Adi Shamir, Amos Fiat, and Uriel Feige. According to press accounts, the research had previously been cleared by NSA's voluntary review process, and NSA intervened to have the secrecy order reversed. The New York Times, Feb. 17, 1987: "A New Approach to Protecting Secrets Is Discovered," p. C1; and "Brief U.S. Suppression of Proof Stirs Anger," p. C3.
¹⁰Harold E. Daniels, Jr., National Security Agency, letter N/2338 to DataPro Research Corp., Dec. 23, 1985.
¹¹The Treasury Department has embarked on a major plan using DES to authenticate electronic funds transfers. For these applications, Treasury will certify the DES and DES-based equipment. See ch. 5.
¹²S. Lipner, Digital Equipment Corp., personal communication with OTA staff, Dec. 24, 1986. See ch. 5 for a description of vendor eligibility requirements.
¹⁴These data were reported in: Kerrigan Lyndon, "Protecting the Corporate Computer," Security World, October 1985, pp. 35-56; and Susan A. Whitehurst, "How Business Battles Computer Crime," Security, October 1986, pp. 54-60. Of the 1985 survey respondents, 48 percent reported using data encryption software compared to only 19 percent reporting use of data encryption hardware. Of the 1986 respondents, 47 percent reported using encryption software; the percentage using encryption hardware was not reported.

Message Authentication

An "authentic" message is one that is not a replay of a previous message, has arrived exactly as it was sent (without errors or alterations), and comes from the stated source (not forged or falsified by an imposter or fraudulently altered by the recipient).¹⁷ Encryption in itself does not automatically authenticate a message. It protects against passive eavesdropping automatically, but does not protect against some forms of active attack.¹⁸ Encryption can be used to authenticate messages, however, and the DES algorithm is the most widely used cryptographic basis for message authentication.

As the use of electronic media for financial and business transactions has proliferated, message authentication techniques have evolved from simple pencil-and-paper calculations to sophisticated, dedicated hardware processors capable of handling hundreds of messages a minute. In general, the various techniques can be grouped together according to whether they are based on public or, at least in part, on secret knowledge.

Public techniques share a common weakness: they check against errors, but not against malicious modifications. Therefore, fraudulent messages might be accepted as genuine ones because they are accompanied by "proper" authentication parameters, based on information that is not secret. Using secret parameters, however, message authentication cannot be forged unless the secret parameters are compromised.
A different secret parameter is usu- may take anywhere from 5 to 20 person-years, so many firms—except, perhaps, large firms ‘ ‘For a thorough discussion of message authentication and that ordinarily devote such substantial re- the various techniques used to authen~icate messages, see Da- sources to long-term research and develop- vies & Price, Securit.v for Computer AJ’et works: An Introduc- ment—may hesitate to invest in a new cryp- tion to Data Securit.v in Teleprocessing and F;lectronic Funds Transfers, Ch. 5, (New York, NY: J. Mrile~, 1984}. The descrip- tographic product for a market that, so far, tions of authentication techniques in this section follow Da\ries has been shaky. ” & Price closely. 1“’Passive attack” is described as ea~’esdropping and “ac- tive attack” as the falsification of data and transactions through such means as: 1 ) alteration, deletion, or addition; 2) changing the apparent origin of the message; 3) changing the actual des- tination of the message; 4) altering the sequence of blocks of ‘ “S. Lipner, Digital Equipment Corp., personal communica- data or items in the message: 5) replaying previously transmitted tion with OTA staff. Dec. 24, 1986. or stored data to create a new false message; or 6) falsifying “’Peter Schweitzer and t$’hitfield Diffie, personal communi- an acknowled~ment for a genuine message. See Davies & Price, cations with OTA staff, June 2, 1986. pp. 119-120, 60 ● Defending Secrets, Sharing Data: New Locks and Keys for Electronic Information ally required for each sender-receiver pair. The authentication code and encryption do not safe logistics for distributing this secret informa- guard against replay of messages or malice on tion to the correct parties is analogous to key the part of one of the corresponding parties, distribution for encryption (see below). 
so various message sequence numbers, date and time stamps, and other features are usu- If privacy as well as authentication is re- ally incorporated into the text of the message. quired, one scheme for encrypting and authen- Box C discusses the use of message authenti- ticating a message involves sequential use of cation in financial transactions. Figure 14 DES with two different secret keys: one to cal- shows a data authentication code (synonymous culate the authenticator (called the message with message authentication code) based on authentication code or MAC) and one to en- the DES algorithm. crypt the message. Even the use of a message
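Box C and figure 14 describe this DES-based chaining in the banking context. As an illustration of the chaining structure only, the Python sketch below substitutes a toy keyed mixing function for DES (which is not assumed to be available here); the function names and the toy cipher are ours, not part of any standard.

```python
# Sketch of the chaining used by a DES-style message authentication
# code (the CBC-MAC structure of the FIPS 113 DAA).
# CAUTION: toy_block_cipher is a stand-in for DES, for illustration only.

MASK64 = 0xFFFFFFFFFFFFFFFF

def toy_block_cipher(key: int, block: int) -> int:
    """A keyed, invertible 64-bit mixing function standing in for DES."""
    x = (block ^ key) & MASK64
    for _ in range(4):                     # a few rounds of reversible mixing
        x = (x * 0x9E3779B97F4A7C15 + key) & MASK64
        x ^= x >> 29
    return x

def compute_dac(key: int, message: bytes) -> int:
    """O1 = e(D1); Oi = e(Di XOR Oi-1); the DAC is taken from On."""
    # Pad the final partial block with zero bits, as in FIPS 113.
    padded = message + b"\x00" * (-len(message) % 8)
    output = 0
    for i in range(0, len(padded), 8):
        block = int.from_bytes(padded[i:i + 8], "big")
        output = toy_block_cipher(key, block ^ output)
    return output                          # devices select the leftmost M bits

key = 0x0123456789ABCDEF
msg = b"PAY JOHN DOE $100.00"
dac = compute_dac(key, msg)
# The receiver, sharing the secret key, recomputes the code and compares:
assert compute_dac(key, msg) == dac
# Any altered block changes the chained result:
assert compute_dac(key, b"PAY JOHN DOE $900.00") != dac
```

Because the stand-in cipher is a bijection for a fixed key, any change to any block propagates through the chain to the final output; with real DES in place of the toy function, the same chaining yields the DAC of figure 14.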

Box C. Application of Message Authentication to Electronic Funds Transfer

Developments in the banking industry provide a good example of how important information should be safeguarded, both because of the large amounts of money involved and because of the early use of new safeguard technology by segments of this industry.1 Roughly $668 billion per day was transferred over the FedWire and Clearing House Interbank Payment System (CHIPS) networks alone in 1984, representing a 48 percent increase over 1980.2 The fully automated, online FedWire system handled 49.5 million domestic transactions in 1986, with an average value of $2.5 million each, for a total of $124.4 trillion. In the same year, CHIPS handled $125 trillion in domestic and international payments for its member banks.3

During recent decades, the financial community has made increasing use of computer and communications systems to automate these fund transfers and other transactions. Typically, the computer systems of these financial institutions are interconnected with local and long distance public and private communications networks, over which the bankers have only limited control, exposing them to potential fraud, theft, unauthorized monitoring, and other misuse. Their customers have an expectation of privacy, and banks have the obligation to restrict details of financial transactions to those who need to know.

Wholesale and retail banking systems have somewhat different requirements for safeguarding funds transferred electronically. Wholesale bankers' requirements include message authentication and verification, as well as confidentiality of some communications; retail banking requirements additionally include authentication of individual automatic teller machines, confidentiality of customers' personal identification numbers, and communications security between the automatic tellers and the host computer. These needs are in sharp contrast with those of the defense-intelligence establishment, where confidentiality is the primary concern.

During the past decades, various technical methods have been adopted to reduce errors and to prevent criminal abuse relating to electronic fund transfers. Among these are parity checks, checksums, testwords, and pattern checks.4 Some of these methods are widely used in various banking networks to verify that user inputs are correct and to detect errors, rather than to protect against criminal activity.

1 Wholesale banking transactions are characterized by large dollar amounts per average transaction (e.g., about $3 million) and daily volumes of transactions that number in the thousands or tens of thousands. Retail banking transactions might average $50 and number in the hundreds of thousands.
2 "Electronic Funds Transfer Systems Fraud," U.S. Department of Justice, Bureau of Justice Statistics, NCJ-100461, April 1986.
3 Information on FedWire and CHIPS from F. Young, Division of Federal Reserve Bank Operations, personal communication with OTA staff, Feb. 12, 1987.
4 For a brief description of testwords (or test keys) in banking transactions, see M. Blake Greenlee, "Requirements for Key Management Protocols in the Wholesale Financial Services Industry," IEEE Communications Magazine, vol. 23, No. 9, September 1985.

One of the major, traditional drawbacks of encryption systems is key distribution. Each pair of communicating locations generally requires a matched, unique set of keys or codes, which have to be delivered in some way, usually by a trusted courier, to these users each time the keys are changed. (An alternative is to use a prearranged code book, which can be compromised, as has been well publicized in recent spy trials.) The key distribution problem rapidly becomes onerous as the number of communicators increases.5 The discovery of the public-key algorithm, noted earlier, may alleviate some of the key distribution problems, for example, by distributing the secret keys to large networks of users.

In the late 1970s, the financial community was quick to realize the potential of the new cryptographically based message authentication codes as a replacement for testwords. These codes allow major improvements in safeguards against both errors and intentional abuse, and accommodate future transaction growth. Thus, this community has pioneered industrywide technical standards, both in the United States and worldwide.

The message authentication code is a cryptographically derived checksum based on processing the electronic fund transfer message with the DES algorithm (called the Data Encryption Algorithm in the financial services community) and a secret key.6 The sender calculates the code and appends it to the message. The receiver independently calculates a code based on the same message, algorithm, and secret key. Most new bank authentication systems in use or in planning use DES to calculate the codes. If the code calculated by the receiver is identical to that sent with the message, then there is a high level of assurance that the originator is authentic and that the content of the received message is identical to that transmitted by the sender, with no alterations of any kind. Also, some banks authenticate and encrypt their wholesale electronic fund transfers whenever practical and in countries where encryption is legally permissible.7

5 The number of pairs of separate keys needed in a network of n communicators, each pair of which requires unique keys, is n(n-1)/2. Thus, a network of 5 communicators requires 10 separate pairs of keys, while a network of 100 communicators requires 4,950 pairs of keys. These numbers pale when considering that 10,000 banks send fund transfers worldwide, the largest of which have thousands of keying relationships.
6 For a thorough discussion of the properties of message authentication techniques, see R.R. Jueneman, S.M. Matyas, and C.H. Meyer, "Message Authentication," IEEE Communications Magazine, vol. 23, No. 9, September 1985.
7 C. Helsing, Bank of America, personal communication with OTA staff, December 1986.
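The combinatorial burden described in footnote 5 of box C is easy to verify directly (a minimal sketch; the function name is ours):

```python
# Unique pairwise keys needed when every pair of communicators
# shares its own secret key: n(n-1)/2.

def pairwise_keys(n: int) -> int:
    return n * (n - 1) // 2

assert pairwise_keys(5) == 10                # 5 communicators -> 10 key pairs
assert pairwise_keys(100) == 4950            # 100 communicators -> 4,950 pairs
assert pairwise_keys(10_000) == 49_995_000   # the scale of worldwide banking
```

The quadratic growth is what makes courier-based key distribution unworkable for large networks, and what motivates the public-key approach discussed below.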

Figure 14. Federal Standard for Authentication

The DAA Authentication Process

A cryptographic Data Authentication Algorithm (DAA) can protect against both accidental and intentional, but unauthorized, data modification. A Data Authentication Code (DAC) is generated by applying the DAA to the data as described below. The DAC, which is a mathematical function of both the data and a cryptographic key, may then be stored or transmitted with the data. When the integrity of the data is to be verified, the DAC is generated on the current data and compared with the previously generated DAC. If the two values are equal, the integrity (i.e., authenticity) of the data is verified. The DAA detects data modifications that occur between the initial generation of the DAC and the validation of the received DAC. It does not detect errors that occur before the DAC is originally generated.

Generation of the DAC

The Data Authentication Algorithm makes use of the Data Encryption Standard (DES) cryptographic algorithm specified in FIPS PUB 46. The DES algorithm transforms (or encrypts) 64-bit input vectors into 64-bit output vectors using a cryptographic key. Let D be any 64-bit input vector and assume a key has been selected. The 64-bit vector O, which is the output of the DES algorithm when DES is applied to D using the enciphering operation, is represented as:

O = e(D)

The data (e.g., record, file, message, or program) to be authenticated is grouped into contiguous 64-bit blocks: D1, D2, ..., Dn. If the number of data bits is not a multiple of 64, then the final input block will be a partial block of data, left justified, with zeros appended to form a full 64-bit block. The calculation of the DAC is given by the following equations, where ⊕ represents the exclusive-OR of two vectors:

O1 = e(D1)
O2 = e(D2 ⊕ O1)
O3 = e(D3 ⊕ O2)
...
On = e(Dn ⊕ On-1)

The DAC is selected from On: devices that implement the DAA shall be capable of selecting the leftmost M bits of On.

SOURCE: NBS FIPS Publication 113, May 30, 1985, pp. 3-6.

Public-Key Ciphers

A symmetric cipher is an encryption method using one key, known to both the sender and receiver of a message, that is used both to encrypt and decrypt the message. Obviously, the strength of a symmetric cipher depends on both parties keeping the key secret from others. With DES, for example, the algorithm is known, so revealing the encryption key permits the message to be read by any third party.

An asymmetric cipher is an encryption scheme using a pair of keys, one to encrypt and a second to decrypt a message.19 A special class of asymmetric ciphers are public-key ciphers, in which the encrypting key need not be kept secret to ensure a private communication.20 Rather, Party A can publicly announce his or her encrypting key, PKA, allowing anyone who wishes to communicate privately with him or her to use it to encrypt a message. Party A's decrypting key (SKA) is kept secret, so that only A, or someone else who has obtained his or her decrypting key, can easily convert messages encrypted with PKA back into plaintext.21 Knowing the public encrypting key, even when the encrypted message is also available, does not make computing the secret decrypting key easy, so that in practice only the authorized holder of the secret key can read the encrypted message.

If the encrypting key is publicly known, however, a properly encrypted message can come from any source. There is no guarantee of its authenticity. It is thus crucial that the public encrypting key be authentic. An imposter could publish his or her own public key, PKI, and pretend it came from A in order to read messages intended for A, which he or she could intercept and then read using his or her own SKI. Therefore, the strength of the public-key cipher rests on the authenticity of the public key. A variant of the system allows a sender to authenticate messages by "signing" them using an encrypting key that (supposedly) is known only to him or her. This very strong means of authentication is discussed further in the section on digital signatures below.

The RSA public-key cipher is one patented system, available for licensing from RSA Data Security, Inc. It permits the use of digital signatures to resolve disputes between a sender and receiver. The RSA system is based on the relative difficulty of finding two large prime numbers, given their product. The recipient of the message (and originator of the key pair) first randomly selects two large prime numbers, called p and q, which are kept secret. The recipient then chooses another (odd) integer e, which must pass a special mathematical test based on the values of p and q. The product n of p times q, together with the value of e, is announced as the public encryption key. Even though their product is announced publicly, the prime factors p and q are not readily obtained from n. Therefore, revealing the product of p and q does not compromise the secret key, which is computed from the individual values of p and q.22 Current implementations of the cipher use keys with 200 or more decimal digits in the published number n. A more complete description of the RSA system, including a discussion of its computational security, is given in appendix D.

Figure 15 shows a simple illustrative example of a public-key cipher based on the RSA algorithm. This simplified example is based on small prime numbers and decimal representations of the alphabet. It is important to bear in mind, however, that operational RSA systems use much larger primes.

The RSA system was invented at the Massachusetts Institute of Technology (MIT) in 1978 by Ronald Rivest, Adi Shamir, and Leonard Adleman. The three inventors formed RSA Data Security, Inc. in 1982 and obtained an exclusive license for their invention from MIT, which owns the patent. The firm has developed proprietary software packages implementing the RSA cipher on personal computer networks. These packages, sold commercially, provide software-based communication safeguards, including message authentication, digital signatures, key management, and encryption. The firm also sells safeguards for data files and spreadsheets transmitted between workstations, electronic mail networks, and locally stored files. The software will encrypt American Standard Code for Information Interchange (ASCII), binary, or other files on IBM personal computers or compatible machines, and runs on an IBM PC/AT at an encryption rate of 3,500 bytes per second.

A number of public-key ciphers have been devised by other industry and academic researchers. Stanford University, for instance, holds four cryptographic patents, potentially covering a broad range of cryptographic and digital signature applications. Some of these patents have been licensed to various companies for use in their products.23

Digital Signatures

Encryption or message authentication alone can only safeguard a communication or transaction against the actions of third parties. They cannot fully protect one of the communicating parties from fraudulent actions by the other, such as forgery or repudiation of a message or transaction. Nor can they resolve contractual disputes between the two parties. Paper-based systems have long depended on letters of introduction for identification of the parties, signatures for authenticating a letter or contract, and sealed envelopes for privacy. The contractual value of paper documents hinges on the recognized legal validity of the signature and the laws against forgery.

19 See Davies & Price, ch. 8, for a more complete discussion of asymmetric and public-key ciphers. A discussion of the underlying principles of public-key ciphers, including examples of the RSA and knapsack algorithms, is given in Martin E. Hellman, "The Mathematics of Public-Key Cryptography," Scientific American, vol. 241, No. 2, August 1979, pp. 146-157. A pictorial example of the RSA public-key method can be found in Understanding Computers: Computer Security (Alexandria, VA: Time-Life Books, 1986), pp. 112-117.
20 The public-key concept was first proposed by Whitfield Diffie and Martin Hellman in their pathbreaking paper, "New Directions in Cryptography," IEEE Transactions on Information Theory, vol. IT-22, No. 6, November 1976, pp. 644-654.
21 For A and B to have private two-way communication, two pairs of keys are required: the "public" encryption keys PKA and PKB, and the secret decryption keys SKA and SKB.
22 Certain special values of (p)(q) can be factored easily, when p and q are nearly equal, for instance. These special cases need to be avoided in selecting suitable keys. Furthermore, it is important to remember that this cipher system is no more secure than the secrecy of the private key.
23 The companies include the Harris Corp., Northern Telecom, VISA, Public Key Systems, and Cylink. Lisa Kuuttila, Stanford Office of Technology Licensing, personal communication with OTA staff, Sept. 29, 1986.

Figure 15. A Simplified Example of a Public-Key Cipher Based on the RSA Algorithm

A. Blocking the message: Before a message can be encrypted by the public-key method, it must be blocked and each block assigned a numerical value. Blocks may vary in size, from one character to several, and numerical values may be assigned in many ways, within constraints imposed by the system. In the example used here, each character is treated as a block, and a simple number-assigning system is used: A = 1, B = 2, C = 3, D = 4, and so on.

B. Creating user A's keys.

1. Each user has a public and a private key, and each key has two parts. To create user A's keys, two prime numbers, customarily designated P and Q, are generated by an operator at a central computer or key generation center. (To qualify, a prime number must pass a special mathematical test.) Here, P is 7 and Q is 17.
2. In this simplified example, the two primes are multiplied: 7 × 17 = 119. The result, N, will be the first part of both keys.
3. Next, an odd number, E, is chosen, in this case 5. (This number must also pass a special mathematical test.) It forms the second part of the public key, PKA.
4. To create the second part of the private key, three numbers are multiplied: P minus 1 (6, in this case), times Q minus 1 (16), times E minus 1 (4): 6 × 16 × 4 = 384.
5. Next, 1 is added to the result of the previous step, yielding 385.
6. The sum is divided by E (5). The result of the division, 77 (designated D), is the second part of SKA. At the end of the procedure, user A has a public key (119, 5) and a private key (119, 77). In reality, these numbers would be many digits long.

C. The arithmetic of locking and unlocking: the sender, user B, uses PKA to encrypt a message to user A.

The number 19, assigned to the letter S, is raised to the fifth power, as dictated by the second part of PKA (5). The result, 2,476,099, is divided by the first part of PKA, the number 119. The division yields 20,807 and a remainder of 66. Only the remainder is important: it is the value of the encrypted letter S.

The next letter of the message, E, has the assigned value 5. Using the second part of PKA, this number is raised to the fifth power. The result, 3,125, is divided by the other part of PKA, 119. The division yields 26 and a remainder of 31. Again, only the remainder is significant: it is the value of the encrypted letter E.

D. The recipient, user A, uses his private key, SKA, to decrypt the message.

Decryption, using SKA, follows the same steps. First, 66, the encrypted S, is raised to the 77th power, as dictated by the second part of SKA. The result is divided by 119, the first part of SKA, which is identical to the first part of user A's public key. The remainder resulting from the division is 19, the original number assigned to the letter S. Thus, the decryption of the first one-letter block of the message is complete.

The number 31, the encrypted letter E, is raised to the 77th power, as dictated by the second part of the private key, SKA, and the result is divided by 119, the other part of SKA. The remainder from the division is 5, the original value assigned to the letter E. Each letter block is decrypted in the same way.

SOURCE: Adapted from Computer Security (Alexandria, VA: Time-Life Books, 1986), pp. 112-115.
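The arithmetic of figure 15 is modular exponentiation and can be checked directly. The sketch below reproduces the figure's toy keys (P = 7, Q = 17, E = 5, D = 77); the variable and function names are ours, and the figure's recipe for D happens to divide evenly in this example, whereas operational systems compute D as a modular inverse.

```python
# Reproduce the toy RSA example of figure 15 (illustrative key sizes only).
P, Q = 7, 17
N = P * Q                                    # 119, first part of both keys
E = 5                                        # second part of the public key PKA
D = ((P - 1) * (Q - 1) * (E - 1) + 1) // E   # (6 * 16 * 4 + 1) / 5 = 77
assert D == 77

def encrypt(m: int) -> int:
    """Raise to the Eth power and keep the remainder mod N (public key)."""
    return pow(m, E, N)

def decrypt(c: int) -> int:
    """Raise to the Dth power and keep the remainder mod N (private key)."""
    return pow(c, D, N)

# Letters are numbered A = 1, B = 2, ...; S is 19 and the letter E is 5.
assert encrypt(19) == 66                     # the encrypted S of the figure
assert encrypt(5) == 31                      # the encrypted E
assert decrypt(66) == 19 and decrypt(31) == 5
```

Python's three-argument pow() performs the "raise to a power, then keep the remainder" operation of the figure in one step, which is how practical implementations avoid ever forming numbers like 66 to the 77th power in full.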

Figure 16. Digital Signatures Using a Public-Key Cipher

This example uses the same key pair (PKA, SKA) generated for user A in figure 15. The sender (user B) uses his private key, SKB = (55, 27), to "sign" a message intended for user A and then "seals" it by encrypting the signed message with user A's public key, PKA = (119, 5). When user A receives the signed and sealed message, he uses his SKA to unseal the message and the sender's public key, PKB = (55, 3), to unsign it.

Signing. To begin the encryption technique called signing, the value of the letter S (19) is raised to the 27th power, as dictated by the second part of SKB. The result is divided by 55, the first part of SKB. The division yields a very large number, which is disregarded, and a remainder of 24. This completes the signing process for the letter S; only user B's public key, PKB, can decrypt it.

Sealing. To seal the message for secrecy, the result of the first encryption, 24 in this case, is raised to the fifth power, as dictated by the second part of the receiver's public key, PKA. The result is divided by 119, the other part of PKA. The division yields a number (disregarded) and a remainder of 96, the twice-encrypted S. It will be sent when the rest of the message has undergone the same double encryption.

Unsealing. To decrypt a signed-and-sealed message, user A raises the number 96, the double-encrypted S, to the 77th power, as dictated by one part of his private key, SKA. The result is divided by 119, the other part of SKA. The division yields a very large number (disregarded) and a remainder of 24, the cipher imposed on the letter S by the sender's private key, SKB.

Unsigning. To decrypt this digital signature, the number 24 is raised to the third power, as dictated by one part of the sender's public key, PKB. The result, 13,824, is divided by 55, the other part of PKB. The division yields a number (disregarded) and a remainder of 19, the numerical equivalent assigned to the letter S by the system. Performing the same steps on the rest of the transmission reveals the plaintext.

SOURCE: Adapted from Computer Security (Alexandria, VA: Time-Life Books, 1986), pp. 116-117.
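Figure 16's sign-then-seal sequence can be checked the same way. The sketch below uses the figure's toy keys for user A (PKA = (119, 5), SKA = (119, 77)) and user B (PKB = (55, 3), SKB = (55, 27)); the function names are ours.

```python
# Toy sign-then-seal from figure 16 (illustration only; keys far too small).
N_A, E_A, D_A = 119, 5, 77     # user A: public (N_A, E_A), private (N_A, D_A)
N_B, E_B, D_B = 55, 3, 27      # user B: public (N_B, E_B), private (N_B, D_B)

def sign(m: int) -> int:       # user B signs with his private key, SKB
    return pow(m, D_B, N_B)

def seal(s: int) -> int:       # ...then seals with user A's public key, PKA
    return pow(s, E_A, N_A)

def unseal(c: int) -> int:     # user A unseals with his private key, SKA
    return pow(c, D_A, N_A)

def unsign(s: int) -> int:     # ...and unsigns with user B's public key, PKB
    return pow(s, E_B, N_B)

S = 19                         # the letter S, as in the figure
assert sign(S) == 24           # the signed S
assert seal(sign(S)) == 96     # the twice-encrypted S that is transmitted
assert unseal(96) == 24        # sealing undone with SKA
assert unsign(unseal(seal(sign(S)))) == S
```

Only the holder of SKB could have produced a value that unsigns correctly under PKB, which is what makes the signature probative; anyone could have performed the sealing step, which provides only privacy.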

Equivalent functions for electronic documents can be provided by using an asymmetric cipher, such as the RSA cipher, to create a digital signature for a document.24 This can both authenticate a document's contents and prove who sent it, because only one party is presumed to know the secret information used to create the signature. If privacy is required, encryption can be used in addition to the digital signature. However, the "proof" of the signature hinges on the presumption that only one party knows the secret signing key. If this secret information is compromised, then the proof fails.

The equivalent of a letter of introduction is still necessary to verify that the correct public key was used to check the digital signature; an adversary might try to spoof the signature system by substituting his or her own public key and signature for the real author's. This letter of introduction could be accomplished by several means. The system offered by RSA Data Security, Inc., provides "signed key server certificates" by attaching the corporation's own digital signature to its customers' public keys. Thus, customers can attach their certified public keys to the messages they sign. Note that although a public-key cipher system is used to set up the digital signature system, the actual text of the message can be sent in plaintext, if desired, or it can be encrypted using DES or the public-key cipher.25

Figure 16 continues the simplified example in figure 15 to illustrate the digital signature technique.

24 Other public-key ciphers using different one-way functions could provide the mechanism for a form of digital signature; however, none are commercially available at present. Also, it is possible to use a symmetric cipher such as DES in an asymmetric fashion (at least two signature functions of this type have been described), but these functions are more inconvenient to use than the RSA method and require more administrative effort. See Davies & Price, ch. 9, for a general treatment of digital signatures and alternative methods.
25 For example, if the author wishes to keep the text of the message private, so that only the intended recipient can read it, he or she can encrypt the signed message using the recipient's public key. Then the recipient first uses his or her own secret key to decrypt the signed message, and then uses the sender's public key to check the signature. In practice, the RSA digital signature system is used to transmit a DES key for use in encrypting the text of a message, because DES can be implemented in hardware and is much faster than using the RSA algorithm to encrypt text in software.

NEW TECHNOLOGIES FOR PRIVATE AND SECURE TRANSACTIONS The public-key and digital signature systems which would correspond to digital pseudonyms described above have important uses for key that could differ for each type of trans- exchange and management, for authenticat- action) .2’ That is, transactions could be made ing messages and transactions, and for permit- without revealing the identity of the individ- ting enforceable “electronic contracts” to be ual, yet at the same time making certain that made, including electronic purchase orders and each transaction is completed accurately and other routine business transactions. Digital properly. signatures might also be used in equally se- cure transaction systems that preserve the privacy of individuals. This would be accom- ‘(’See, for example, David chaum, 4’SeCUrity without 1den- tification: Transactions Systems To Make Big Brother Obso- plished by permitting transactions to be made lete, ” Communications of the ACM, vol. 28, No. 10, October pseudonymously (using digital signatures, 1985. Ch. 4—Security Safeguards and Practices ● 69

Digital signatures could prevent authorities from cross-matching data from different types of transactions or using computer profiling to identify individuals who have a particular pattern of transactions. Database matching is a technique that uses a computer to compare two or more databases to identify individuals in common (e.g., Federal employees who have defaulted on student loans). Computer profiling uses inductive logic to determine indicators of characteristics and/or behavior patterns that are related to the occurrence of certain behavior (e.g., developing a set of personal and transactional criteria that make up the profile of a drug courier).27

Public-key systems make it possible to establish a new type of transaction system that protects individual privacy while maintaining the security of transactions made by individuals and organizations. This new system would create a security relationship between individuals and organizations in which an organization and the individuals it serves cooperatively provide mutual protection, allowing the parties to protect their own interests.

For example, instead of individuals using the same identification (e.g., Social Security numbers, which are now commonly used on drivers' licenses, insurance forms, employment records, tax and banking records, etc.), they would use a different account number or digital pseudonym with each organization they do business with. Individuals could create their pseudonyms, rather than have them issued by a central authority. A one-time pseudonym might even be created for certain types of transactions, such as retail purchases. Although individuals would be able to authenticate ownership of their pseudonyms and would be accountable for their use, the pseudonyms could not be traced by computer database matching.28 On the other hand, the use of numerous digital pseudonyms might make it more complicated for individuals to check or review all their records.29

A second difference is the ownership of the "tokens" used to make transactions. Currently, individuals are issued credentials, such as paper documents or magnetic stripe cards, to use in transactions with organizations. Moreover, the information contained on the electronic credentials is usually not directly reviewable or modifiable by the individual who uses it. In the scheme described above, individuals would own the transaction token and would control the information on it.

This system illustrates how technological developments and organizational changes can be used to mitigate potential erosions of privacy that could result from the widespread use of multi-purpose smart cards and computer profiling. However, while the technology and organizational infrastructures for the latter, at least, are already fairly well developed, the practical development of privacy systems is just beginning.30

27 For a further discussion of the implications of computer database matching and profiling, see Office of Technology Assessment, Federal Government Information Technology: Electronic Record Systems and Individual Privacy, OTA-CIT-296 (Washington, DC: U.S. Government Printing Office, June 1986).

28 A formal description of a "credential" mechanism for pseudonyms is given in David Chaum and Jan-Hendrik Evertse, "A Secure and Privacy-Protecting Protocol for Transmitting Personal Information Between Organizations," Advances in Cryptology: Proceedings of Crypto 86, A.M. Odlyzko (ed.), Springer-Verlag Lecture Notes in Computer Science, forthcoming, summer 1987.

29 Chaum suggests using a card computer to manage this complexity while maintaining a convenient user interface. Personal communication with OTA staff, February 1987.

30 The Center for Mathematics and Computer Science in Amsterdam has recently demonstrated a payment system and is working with European groups to develop trial systems. David Chaum, personal communication with OTA staff, February 1987.
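Database matching as described above reduces, in its simplest form, to a set intersection on a shared identifier; per-organization pseudonyms remove the shared key on which the intersection depends. The record sets and identifiers below are invented examples.

```python
# Minimal sketch of computer database matching, and why per-organization
# pseudonyms defeat it. All identifiers below are invented examples.

federal_employees = {"123-45-6789", "234-56-7890", "345-67-8901"}
loan_defaulters   = {"234-56-7890", "999-99-9999"}

# With a shared identifier (e.g., a Social Security number), matching is
# a simple set intersection between the two databases:
flagged = federal_employees & loan_defaulters

# With a different pseudonym per organization, the same individual appears
# under unlinkable identifiers, and the intersection is empty:
agency_records = {"pseud-A1"}   # pseudonym the person uses with the agency
lender_records = {"pseud-B7"}   # pseudonym the same person uses with the lender
unlinkable = agency_records & lender_records
```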

KEY MANAGEMENT

Key management is fundamental and crucial to encryption-based communication and information safeguards. As an analogy, one might say that:

The safety of valuables in a locked box depends as much or more on the care with which the keys are treated than on the quality of the lock. It is useless to lock up valuables if the key is left lying around. The key may be stolen, or worse, it may be secretly duplicated and used at the thief's pleasure.31

Key management encompasses the generation of encrypting and decrypting keys as well as their storage, distribution, cataloging, and eventual destruction. These functions may be handled centrally, distributed among users, or by some combination of central and local key management. Also, key distribution can be handled through various techniques: by using couriers to distribute data-encrypting keys or master (key-encrypting) keys, for instance, or by distributing keys electronically using a public-key cipher. The relative merits of each mode of key management are subject to some debate.

For example, some technical experts, including those at NSA, argue that centralized key generation and distribution, perhaps performed electronically, efficiently ensures interoperability among different users and that relatively unsophisticated users do not inadvertently use any weak keys that may exist. NSA has stated that, for reasons of security and interoperability, it plans to control key generation for the new STU-III secure telephones (see ch. 5), including those purchased by private sector users. It is also likely that NSA will control key generation for equipment using its new encryption modules.

On the other hand, the National Bureau of Standards (NBS) operates on the assumption that each user organization should generate its own keys and manage its own key distribution center. In the United States, Federal standards for protecting unclassified information in Government computer systems have been developed by NBS,32 which has also worked cooperatively with private organizations such as the American Bankers Association (ABA) and the American National Standards Institute (ANSI). Additionally, ABA and ANSI have developed voluntary standards related to cryptography for data privacy and integrity, including key management. The International Organization for Standardization (ISO) has been developing international standards, often based on those of NBS and/or ANSI.33 Standards of these types are intended to specify performance requirements (accountability for keys, assignment of liability) and interoperability requirements for communications among users.

According to some experts, it is technically possible to handle centralized key distribution so that the key-generating center cannot read users' messages. If this were done, it would provide efficient and authenticated key distribution without the potential for misuse by a centralized authority. However, whether NSA plans to use these techniques has not been made public.

Some critics of this plan are concerned that NSA might be required—by secret court order, perhaps—to selectively retain certain users' keys in order to monitor their communications. Others express concerns that keying material may be exposed to potentially unreliable employees of NSA contractors. At the very least, the prospect of centralized NSA key generation has generated some public controversy.

In any event, a key distribution center of some sort is the most prominent feature of key management for multi-user applications. Such a center is needed to establish users' identities and supply them with the keys to be used for communications—usually, with "seed" keys used to establish individual session keys.

31 This analogy is from Lee Neuwirth, "A Comparison of Four Key Distribution Methods," Telecommunications (Technical Note), July 1986, pp. 110-115. For a detailed discussion of key distribution and key management schemes, also see ch. 6 of Davies & Price.

32 See, for example, Federal Information Processing Standards (FIPS) Publications FIPS PUB 81, 74, and 113, published by NBS.

33 D. Branstad, Institute for Computer Science and Technology, National Bureau of Standards; information about NBS and standards development from personal communication with OTA staff, Aug. 6, 1986. For a general discussion of security standards based on cryptography, see Dennis K. Branstad and Miles E. Smid, "Integrity and Security Standards Based on Cryptography," Computers and Security, vol. 1, 1982, pp. 255-260.
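The key hierarchy described above (a master key-encrypting key distributed once, for example by courier, and then used to protect data-encrypting session keys sent electronically) can be sketched as follows. XOR stands in for a real cipher such as DES; the sketch is illustrative only.

```python
# Sketch of a two-level key-management hierarchy: a master (key-encrypting)
# key shared out-of-band once, wrapping fresh data-encrypting session keys
# for electronic distribution. XOR is a stand-in for a real cipher.
import secrets

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

master_key = secrets.token_bytes(8)      # distributed once, e.g., by courier

def wrap(session_key, kek):
    # "encrypt" the session key under the key-encrypting key (KEK)
    return xor_bytes(session_key, kek)

def unwrap(wrapped, kek):
    return xor_bytes(wrapped, kek)

session_key = secrets.token_bytes(8)     # fresh data-encrypting key per session
wrapped = wrap(session_key, master_key)  # this form may travel over the network
recovered = unwrap(wrapped, master_key)  # receiving end recovers the session key
```

Only the wrapped form ever crosses the network, so session keys can be changed frequently without further courier trips.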

Even in a public-key system, the initial secret keys must be computed or distributed. NBS has developed a key notarization system that provides for authenticated distribution of keys and other key management functions.34 NBS had initiated a process for developing standards for public-key systems35 but is no longer pursuing this activity.

The traditional means of key distribution—through couriers—is a time-consuming and expensive process that places the integrity of the keys, and hence the security of the cipher system, in the hands of the courier(s). Courier-based key distribution is especially awkward when keys need to be changed frequently. Recently, public-key systems for key distribution have become available, allowing encryption keys (e.g., DES keys) to be securely transmitted over public networks—between personal computers over the public-switched telephone network, for example. There continue to be new developments in public-key cryptography research.36

To date, the best-known commercial offering of a public-key system to secure key distribution (or other electronic mail or data transfers) is by RSA Data Security, Inc. Other public-key systems have been developed, some earlier than RSA's, but to date none has gained wide commercial acceptance. Although RSA initially attempted to implement its algorithm in hardware, its first successful commercial offerings, introduced in 1986, use software encryption. The Lotus Development Corp., one of the largest independent software companies, has licensed the RSA patent for use in future products. RSA Data Security has also licensed the patent to numerous large and small firms and to universities engaged in research, as well as to some Federal agencies, including the U.S. Navy and the Department of Labor.37 A new hardware implementation of several public-key ciphers (including RSA and the SEEK cipher) was offered commercially in 1986. The chip, developed by Cylink, Inc., will be used in Cylink's own data encryption products and is available to other vendors who wish to use it.38

34 Branstad and Smid, op. cit., p. 258.

35 Ibid., p. 259.

37 Letter to OTA staff from
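Public-key key distribution of the kind described above can be sketched in severely reduced form: a sender encrypts a DES-style session key under the recipient's public key and transmits it over an open network. The tiny parameters below are assumptions for illustration and bear no relation to any actual commercial product.

```python
# Toy sketch of public-key key transport. Parameters are tiny and
# illustrative only; real systems use keys hundreds of digits long.

p, q, e = 61, 53, 17
n = p * q                           # recipient's public modulus
d = pow(e, -1, (p - 1) * (q - 1))   # recipient's private exponent

session_key = 56                    # stand-in for a DES session key
ciphertext = pow(session_key, e, n) # sender encrypts with the public key (n, e)
recovered = pow(ciphertext, d, n)   # only the private-key holder can decrypt
```

Because only the public key is needed to encrypt, the session key can cross the public-switched telephone network without a prior shared secret.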

VOICE AND DATA COMMUNICATIONS ENCRYPTION DEVICES

A number of commercial products, in the form of hardware devices or software packages, are available to encrypt voice and data communications. Software-based encryption is slower than hardware encryption, and many security experts consider it to be relatively insecure (because, among other reasons, the encryption keys may be "exposed" in the computer operations). Still, some commercial users prefer software encryption because it is relatively inexpensive, does not require additional hardware to be integrated into their operations, and is compatible with their existing equipment and operations. However, this section will deal only with hardware products, in large part because only hardware products have been certified for Government use.

Since 1977, NBS has validated 28 different hardware implementations of the DES algorithm (in semiconductor chips or firmware), but NBS does not validate vendors' software implementations of the algorithm. In 1982, the General Services Administration (GSA) issued Federal Standard 1027, "Telecommunications: Interoperability and Security Requirements for Use of the DES in the Physical Layer of Data Communications." At present, equipment purchased by Federal agencies to protect unclassified information must meet FS 1027 specifications; vendors may submit products built using validated DES chips or firmware to NSA for FS 1027 certification. NSA has a DES endorsement program to certify products for government use, but plans to discontinue this program on January 1, 1988. As stated earlier in this chapter, DES products endorsed prior to this date can be used indefinitely.39

Hardware encryption products use special semiconductor or firmware devices to implement one or more encryption algorithms. On-line encryption (in which data is encrypted as it is transmitted and decrypted as it is received, as opposed to off-line encryption, in which plaintext is first encrypted and then stored for later transmission) can be implemented in two ways. In the first method, called end-to-end encryption, synchronized encryption/decryption devices at the source and destination operate so that the transmitted information is encrypted and remains in its encrypted form throughout the entire communications path. In the second method, called link encryption, the transmitted information is also encrypted at the source, but is decrypted and then reencrypted at each intermediate communications node between the source and the ultimate destination. Thus, the information is encrypted, decrypted, and reencrypted as it traverses each link along its communications path.

By late 1986, the market research firm DataPro listed about 30 vendors that were marketing commercial encryption equipment, using the DES and/or proprietary algorithms, and operating at low or high data rates (depending on the product and vendor, encryption data rates can range from about 100 bits per second up to 7 million bits per second). These vendors offer 40 or more commercial products or families of products, mostly for data encryption, although a few vendors offer products for voice encryption. Some vendors specialize in encryption-only products, while others are data communications service (turnkey) providers offering encryption products complementing the rest of their product line. Published prices range from $500 to several thousand dollars per unit, depending on data rate and other features.

39 Harold E. Daniels, Jr., Deputy Director for Information Security, NSA, enclosure 3, p. 4, in letter S-0033-87 to OTA, Feb. 12, 1987.
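The contrast between end-to-end and link encryption described above can be simulated in a few lines. The byte-shift "cipher," keys, and node layout below are invented stand-ins for a real cipher such as DES.

```python
# Sketch contrasting the two on-line encryption methods. The shift cipher
# is a toy stand-in and is NOT secure.

def enc(text, key):
    return "".join(chr((ord(c) + key) % 256) for c in text)

def dec(text, key):
    return "".join(chr((ord(c) - key) % 256) for c in text)

message = "TRANSFER $500"

# End-to-end: one key shared by source and destination; intermediate
# nodes relay the ciphertext and never see plaintext.
e2e_key = 42
e2e_ct = enc(message, e2e_key)          # stays encrypted across every node
e2e_delivered = dec(e2e_ct, e2e_key)    # decrypted only at the destination

# Link encryption: each hop has its own key, so every intermediate node
# decrypts and reencrypts, briefly exposing plaintext inside the node.
link_keys = [7, 13, 99]                 # source->A, A->B, B->destination
ct = enc(message, link_keys[0])
for k_in, k_out in zip(link_keys, link_keys[1:]):
    plaintext_at_node = dec(ct, k_in)   # visible inside each relay node
    ct = enc(plaintext_at_node, k_out)
link_delivered = dec(ct, link_keys[-1])
```

Both methods deliver the same plaintext; the difference is where plaintext is ever exposed along the path.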

PERSONAL IDENTIFICATION AND USER VERIFICATION

Background

User verification measures aim to ensure that those who gain access to a computer or network are authorized to use that computer or network. Personal identification techniques are used to strengthen user verification by increasing the assurance that the person is actually the authorized user.40

User verification techniques typically employ a combination of (usually two) criteria, such as something an individual has, knows, or is. Until recently, the "has" has tended to be a coded card or token, which could be lost, stolen, or given away and used by an unauthorized individual; the "knows" a memorized password or personal identification number, which could be forgotten, stolen, or divulged to another; and the "is" a photo badge or signature, which could be forged. Cards and tokens also face the problem of counterfeiting.

Now, new technologies and microelectronics, which are harder to counterfeit, are emerging to overcome the shortcomings of the earlier user verification methods. At the same time, these new techniques are merging the has, knows, or is criteria, so that one, two, or all three of these can be used as the situation dictates. Microelectronics can make the new user verification methods compact and portable. Electronic smart cards, for example, now carry prerecorded, usually encrypted, access control information that must be compared with data that the proper authorized user is required to provide, such as a memorized personal identification number or biometric data like a fingerprint or retinal scan.

40 Experts note that the "personal identification" systems in common use do not actually identify a person; rather, they recognize a user based on pre-enrolled characteristics. The term "identification" is commonly used in the industry, however.

Merging the criteria serves to authenticate the individual to his or her card or token and only then to the protected computer or network. This can increase security since, for example, one's biometric characteristics cannot easily be given away, lost, or stolen. Moreover, biometrics permit automation of the personal identification/user verification process.

While false acceptances and false rejections can occur with any identification method, each technique has its own range of capabilities and attributes: accuracy, reliability, throughput rate, user acceptance, and cost. As with other security technologies, selecting an appropriate system often involves trade-offs. For one thing, elaborate, very accurate technical safeguards are ineffective if users resist them or if they impede business functions. The cost and perceived intrusiveness of a retina scanner might be acceptable in a high-security defense facility, for example, but a relatively low-security site like a college cafeteria might sacrifice high reliability for the lower cost, higher throughput rate, and higher user acceptance of a hand geometry reader. In banking, where user acceptance is extremely important, signature dynamics might be the technology of choice. In retail sales, a high throughput rate is extremely important and slower devices would not be acceptable.

Access control technologies will evolve for niche markets. Successful commercial products for the defense and civilian niches will look very different. As of early 1987, there were no specific performance standards for most of these user verification technologies, but it is likely that these will be developed. One incentive for the development of access control standards, at least for the Government market, is the access control objectives specified in the so-called "Orange Book."41 The development of user verification technologies, however, is being driven significantly by commercial needs. In the area of biometrics, vendors have formed an industry association. The International Biometrics Association is beginning to address industry issues, including performance and interface standards and testing, and has a standing committee on standards and technical support.

In short, the new access control technologies are moving toward the ideal of absolute personal accountability for users by irrefutably tying access and transactions to a particular individual. Some enthusiasts and industry experts foresee great and pervasive applications for some of the access control technologies, even to their evolution into nonsecurity applications, such as multiple-application smart cards (see above). However, a given set of access control technologies cannot, in themselves, fix security problems "once and for all." Changes in information and communication system infrastructures can eventually undermine previously effective safeguards. Therefore, safeguards have a life cycle. It is the combination of attributes of the safeguard technique and of the system it seeks to protect that determines the useful life of a safeguard.

Conventional Access Controls

Password-Based Access Controls

The earliest and most common forms of user verification are password-based access controls. The problem is that passwords can be stolen, compromised, or intentionally disclosed to unauthorized parties. In addition, trivial passwords can easily be guessed, and even nontrivial ones can be broken by repeated attack.42 Once stolen or compromised, passwords can be disclosed widely or even posted on electronic bulletin boards, resulting in broad exposure of a system to unauthorized access. If security is poor, one user who unilaterally compromises his or her own password can compromise the whole system. An even more serious weakness is that, because there may be no tangible evidence of a security breach, a compromised password can be misused over and over until either the password is routinely changed, its compromise is discovered, or other events occur (e.g., data are lost or fraudulently changed). To avoid some of these problems, many modern systems use special procedures to frustrate repeated incorrect attempts to log on.

Until the last decade or so, all access points to computer systems could be physically identified, which simplified the system administrator's job of controlling access from them. In addition, users could be easily defined and their terminals had limited capabilities. A network of this type is shown in figure 17.

Now, new network configurations have emerged, characterized by personal computers linked to local area networks and connected by fixed and/or public switched telephone lines, as shown in figure 18. Users can readily extend the network by connecting modems to personal computers for pass-through access. As a result, it is no longer possible to identify all access points. Communication nodes are no longer controlled exclusively by the organization when, for example, authorized users need to gain access from remote locations. While pass-through techniques facilitate access by authorized users, they can also be misused. For example, under some circumstances they can be used to defeat even such security techniques as call-back modems. With the increased number of network access points, the intrinsic weaknesses of the password further exacerbate the system's vulnerabilities.

Token-Based Access Controls

Network evolution, therefore, has made user identification and authentication even more critical. Some of the new access-control technologies can see through the communications network to the end user to authenticate him or her—at least as the "holder" of the proper

41 Department of Defense Trusted Computer System Evaluation Criteria, Department of Defense Standard DOD 5200.28-STD, December 1985. Section 7.4 of the Orange Book specifies that individual accountability must be ensured whenever classified or sensitive information is processed. This objective encompasses the use of user verification, access control software, and audit trails.

42 Common password misuses include sharing one's password with other users (including friends or co-workers), writing down the "secret" series of letters or numbers for reference and storing it in an unsecure place (examples of this abound, including writing passwords or identification numbers on the terminals themselves or on desk blotters, calendars, etc., or storing them in wallets or desk drawers), and permitting others to see the log-on/authorization code being keyed in at the terminal. Some password schemes allow users to select their own passwords; while this increases the secrecy of the passwords because they are known only by the users, trivial password choices can reduce security if the passwords are easy to guess (examples of trivial passwords would be a pet name, a birthdate, or a license plate number).
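The special procedures mentioned above for frustrating repeated incorrect log-on attempts can be sketched as a simple lockout counter. The account data, password, and attempt limit below are invented for illustration.

```python
# Sketch of a log-on lockout: the account is disabled after a fixed
# number of consecutive failed attempts, frustrating repeated guessing.

MAX_ATTEMPTS = 3
accounts = {"alice": {"password": "correct-horse", "failures": 0, "locked": False}}

def log_on(user, password):
    acct = accounts[user]
    if acct["locked"]:
        return "locked out"
    if password == acct["password"]:
        acct["failures"] = 0            # success resets the counter
        return "access granted"
    acct["failures"] += 1
    if acct["failures"] >= MAX_ATTEMPTS:
        acct["locked"] = True           # requires administrator reset
    return "access denied"

# A repeated guessing attack (trivial passwords, as in footnote 42)
# trips the lockout after three failures:
for guess in ("pet-name", "birthdate", "license-plate"):
    log_on("alice", guess)
```

After the lockout trips, even the correct password is refused until an administrator intervenes, which also leaves tangible evidence that an attack occurred.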

Figure 17.—A Description of the Past Network Environment

[Diagram: fixed-function terminals (T) direct-connected to a mainframe within the scope of MIS control.]

In the past network environment, control of all network resources resided with systems professionals. Typically, fixed-function terminals were direct-connected to the mainframe or a terminal controller. The communications parameters were specified through tables in the network control program (NCP), also under the direction of the systems group. As a result, the network was totally under the custodianship of systems professionals.

SOURCE: Ernst & Whinney, prepared under contract to OTA, November 1986.

Figure 18.—A Description of the Current/Future Network Environment

In the current/future network environment, systems professionals still control direct connection to the mainframe. Through the network control program (NCP), they maintain the communications parameters that control access through the devices directly connected to the mainframe. However, the nature of these devices is changing dramatically. Instead of fixed-function terminals, they now consist of departmental minicomputers, local area network (LAN) gateways, and personal computers. All of these devices have the capability to expand the network beyond the scope of mainframe control. This environment invalidates many of the premises upon which conventional access control mechanisms, such as passwords and call-back modems, were based.

SOURCE: Ernst & Whinney, prepared under contract to OTA, November 1986.

token—regardless of his or her physical location. Within the limitations of current technology, token-based systems are best used in combination with a memorized password or personal identification number identifying the user to the token.

In contrast to the password, token-based systems offer significantly greater resistance to a number of threats against the password system. Many token-based systems are commercially available. By December 1986, two of these had been evaluated by NSA's National Computer Security Center (NCSC) and approved for use with the access control software packages on NCSC's Evaluated Products List. (See ch. 5 for a discussion of NSA's programs.)

Token-based systems do much to eliminate the threat of external hackers. Under the token-based system, the password has become a one-time numeric response to a random challenge. The individual's memorized personal identification number or password to the token itself

may be trivial, but the external hacker will ordinarily not have physical access to the device, which is usually designed to be tamper-resistant and difficult to counterfeit.

Hackers also have been known to make repeated tries at guessing passwords, or to make use of overseen, stolen, or borrowed passwords. Repeated attack of the password to the host is also thwarted because this password is a random number and/or an encryption-based, one-time response from the token. The one-time nature of the host password also eliminates its compromise through observation, open display, or any form of electronic monitoring. As soon as a response is used, it becomes invalid. A subsequent access request will result in a different challenge from the host and a different required response from the token. An individual user can still unilaterally compromise the authentication process by giving away his or her token and memorized identification number. However, in this case, that individual no longer has access. In this way, the loss of a token serves as a warning that authentication may be compromised.

The see-through token (figure 19), used with a password, is an active device requiring complementary user action. Systems of this type currently on the market do not physically connect to a terminal, but instead provide a one-time user password for each access session. Tamper-proof electronics safeguard against reverse engineering or lost or stolen tokens. Some versions of these devices can challenge the host, effectively countering attempts at spoofing.

Two types of see-through tokens are currently available from several vendors: automatic password generators, synchronized with the host, and challenge/response devices, using numerical key pads or optical character readers. According to some security consultants, these see-through techniques will be commonplace by the 1990s.43

Figure 19.—The Mechanics of See-Through Security

[Diagram: user terminal, host, and hand-held authentication device with keypad.]

Typical flow of events in a see-through security authentication session:
1. User requests access to host through terminal or PC; enters user ID.
2. Host calculates random number (challenge) and transmits it to terminal.
3. User identifies himself to authentication device by entering a Personal Identification Number (PIN), or through biometric identification.
4. User enters challenge from host into authentication device. Device uses the security algorithm and the user seed (both in device memory and inaccessible to the user) to calculate a numeric response.
5. User sends numeric response to host via the terminal.
6. Host calculates a response using the same challenge number, the security algorithm, and the user's seed from the security database. Host compares its response to user response, and grants or denies access.

SOURCE: Ernst & Whinney, prepared under contract to OTA, November 1986.

Incorporating biometrics into these techniques will produce powerful safeguards, but there are associated risks. If biometric templates or data streams containing biometric information are compromised, the implications can be quite serious for the affected individuals because the particular measurements become invalid as identifiers. These risks can be minimized by properly designing the system so that biometric data are not stored in a central file or transmitted during the user verification procedure (as they would be in a host-based lookup mode).

In addition, there are currently three classes of physiological-behavioral identification systems based on voice identification, keystroke rhythm, and signature dynamics. Most systems incorporate adaptive algorithms to track slow variations in users' physical or behavioral characteristics. Although these adaptive features reduce the rate of false rejections, some can be exploited by imposters. Most systems also allow the preset factory threshold levels for acceptance and rejection to be adjusted by the user. Tables 5 and 6 illustrate some of the characteristics of biometric and behavioral technologies.
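The flow of events in figure 19 can be sketched as follows. The keyed hash standing in for the "security algorithm," the seed values, and the user name are assumptions for illustration; the report does not specify the algorithms commercial tokens actually use.

```python
# Sketch of the see-through challenge/response session of figure 19.
import hashlib
import secrets

def security_algorithm(seed, challenge):
    # one-time numeric response derived from the shared seed and challenge
    digest = hashlib.sha256(f"{seed}:{challenge}".encode()).hexdigest()
    return int(digest[:8], 16)

host_seeds = {"alice": "seed-1234"}     # host's per-user seed database
token_seed = "seed-1234"                # same seed sealed inside the token

# Steps 1-2: host issues a random challenge for this session.
challenge = secrets.randbelow(10**6)
# Steps 3-4: user unlocks the token (PIN entry not shown) and keys in
# the challenge; the token computes the one-time response.
response = security_algorithm(token_seed, challenge)
# Steps 5-6: host computes the expected response and compares.
expected = security_algorithm(host_seeds["alice"], challenge)
access_granted = (response == expected)
```

Because the next session uses a fresh challenge, a response observed in transit is useless for replay, matching the one-time property described in the text.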
For many, therefore, the preferred operation for biometrics would be in a stand-alone mode, with the user carrying a biometric template in a token (like a smart card). However, tokens can be lost or stolen, and placing the biometric template on the token removes it from direct control by system security personnel. For these reasons, some installations, especially very high-security facilities using secure computer operating systems, may prefer host-based modes of operation. Figure 20 illustrates the differences between host-based and stand-alone modes for biometrics.

Biometric and Behavioral Identification Systems

There are three major classes of biometric-based identification systems that are commercially available for user verification and access control. Since each of these systems is based on a different biometric principle, they vary widely in their technologies, operation, accuracy, and potential range of applications. The three classes are based on scans of retinal blood vessels in the eye,44 hand geometry, and fingerprint identification.

Biometrics is currently in a state of flux: technologies are advancing rapidly, firms are entering and leaving the marketplace, and new products are being tested and introduced. These technologies are being developed and marketed by a relatively large group of firms—28 at the end of 1986—some are backed by venture capital, and some are divisions of large multinational corporations. Many other companies were doing preliminary work in biometric or behavioral techniques. Therefore, these tables and the following discussions of biometric identification systems represent only a snapshot of the field.

There is evidence of growing interest in biometrics on the part of some Federal agencies. According to Personal Identification News, defense and intelligence agencies conducted more than 10 biometric product evaluations in 1986.45

Retina Blood Vessels

Retina-scanning technology for personal identification is based on the fact that the pattern of blood vessels in the retina is unique for each individual. No two people, not even identical twins, have exactly the same retinal vascular patterns. These patterns are very stable personal characteristics, altered only by serious physical injury or a small number of diseases, and are thus quite reliable for biometric identification. Factors such as dust, grease,

43 Robert G. Anderson, David C. Clark, and David R. Wilson, "See-Through Security," MIS Week, Apr. 7, 1986.

44 According to Personal Identification News, February 1987, a patent has been issued for another type of eye system based on measurements of the iris and pupil (Leonard Flom and Aran Safir, U.S. Patent 4,641,349, Feb. 3, 1987).

45 Personal Identification News, January 1987, p. 2.

Figure 20 (host-based biometrics panel)—Pros: Devices are associated with terminals instead of users; an organization may require fewer devices in this mode, and the devices do not need to be portable. Cons: The biometric information can be compromised in transmission or storage. Encrypted information can be diverted and attacked cryptologically.

STAND-ALONE BIOMETRICS

HOST

1I I I

PATH OF BIOMETRIC INFORMATION

Description of authentication session The user requests host access through the terminal, and enters his user ID. The host calculates a random challenge and sends the challenge to the user terminal. The user identifies himself to the biometric see-through device through biometric input. The user then enters the random challenge into the device. The device calculates a response based on the algorithm and the user’s algorithm seed. The user enters the response into the terminal for transmission to the host. The host performs the same calculations, obtaining the user’s algorithm seed from the algorithm seed data file, and compares the responses. Access is granted or denied. Pros: No transmission or remote storage of the biometric information is required; the information is only maintained locally in the device itself. Also, the device does not need to be designed for connection to any particular terminal. Cons: Individual biometric devices are needed for each user, and the devices must be portable. This could result in an expensive implementation. Also, administrative issues may be more difficult to resolve in the stand-alone configuration. For example, a device malfunction may result in access denied to a user; in the host-based configuration, the user would gain access through an alternate device.

SOURCE: Ernst & Whinney, prepared under contract to OTA, November 1986.

Table 5.—Major Characteristics of Automated Biometric Identification Types

Column order throughout: Eye retinal / Fingerprint / Hand geometry / Voice / Keystroke / Signature.

Stability of measure (period): Life / Life / Years / Years / Variable / Variable
Claimed odds of accepting an imposter (technically achievable without a high rate of false rejections): 1 in billions / 1 in millions / 1 in thousands / 1 in thousands / 1 in a thousand / 1 in hundreds
Ease of physical damage (sources of environmentally caused false rejects for device): Difficult, a few diseases / Happens: cuts, dirt, burns / Happens: rings, swollen fingers or joints, sprains / Happens: colds, allergies, stress / Happens: emotions, fatigue, learning curve / Happens: stress, position of device
Perceived intrusiveness of measure: Extreme to a small portion of population / Somewhat / Modest / Modest / None / Modest
Privacy concerns; surreptitious use of measure: Not feasible to do a scan surreptitiously / Data base can be compared to law enforcement files / Not a problem / Measurement can be transparent to user / Measurement can be transparent to user / Behavior is already recognized as an ID function
Intrapersonal variation (chance of a false rejection, given training and experience in use): Low / Low / Low / Moderate / Moderate / Moderate
Size of data template on current units: 35 bytes / Several hundred to several thousand bytes / 18 bytes to several hundred bytes / Several hundred bytes / Several hundred bytes / 50 bytes to several hundred bytes
Throughput time (note: level of security affects processing time): 2 to 3 seconds / 4 to 5 seconds / 3 to 4 seconds / 3 to 5 seconds / Continuous process / 2 to 5 seconds
Cost range of products on the market (depends on configuration): $6,000 to $10,000 / $3,500 to $10,000 / $2,800 to $8,000 / $1,500 to $5,000 per door and up / $250 per terminal / $850 to $3,500
Development goal for cost per workstation (by 1990): $2,000 / $2,000 / $500 to $1,000 / $100 to $250 / $100 to $750 / $300 to $500
Approximate number of patents outstanding: Less than 10 / 50 / 30 / 20 plus / Less than 10 / 100
Approximate number of firms in market with products or prototypes as of summer 1986 (number with prototypes in parentheses): 1 (0) / 3 (5) / 2 (4) / 2 (4) / 1 (1) / 3 (4)

NOTE: Other biometric PIDs under development include wrist vein, full-face, brainwave, skin oil, and weight/gait devices.
SOURCE: Benjamin Miller, prepared under contract to OTA, October 1986. Data updated as of April 1987.
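The "claimed odds" row above states a per-attempt false-accept rate, and even very long odds erode as the number of imposter attempts grows. The short sketch below makes that concrete; the rates echo the table's row, while the calculation itself is ordinary probability, not OTA data.

```python
# Probability that at least one imposter attempt succeeds, given a
# per-attempt false-accept rate. The rates echo the "claimed odds" row
# of Table 5; the arithmetic is standard probability, not OTA data.
claimed_odds = {
    "eye retinal": 1e-9,   # "1 in billions"
    "fingerprint": 1e-6,   # "1 in millions"
    "voice": 1e-3,         # "1 in thousands"
    "signature": 1e-2,     # "1 in hundreds"
}

def p_breach(false_accept_rate: float, attempts: int) -> float:
    """P(at least one false accept in `attempts` independent tries)."""
    return 1.0 - (1.0 - false_accept_rate) ** attempts

for method, rate in claimed_odds.items():
    print(f"{method}: {p_breach(rate, 10_000):.6f} after 10,000 attempts")
```

The comparison shows why the table pairs imposter odds with false-rejection rates: a device tuned for very low false accepts is only usable if its false-reject rate stays tolerable.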

Table 6.—Configurations and Applications of Biometric Devices

Off-line configurations store reference templates in the device, on a magnetic stripe card, or on an integrated circuit (I.C.) card; the on-line configuration stores them in a host data base. Application categories are physical security, computer security, law enforcement, and financial transactions. Key: U = in regular use by industry or Government; B = in beta test use by industry or Government; D = in development; — = none. Entries for each device type, as printed, in the column order in device / on mag stripe / on I.C. card / host data base / physical security / computer security / law enforcement / financial transaction:

Eye/retina: U, —, B, U, U, B, B, —
Fingerprint: D, —, U, U, U, U, D
Hand geometry: B, U, —, U, U, —, —
Voice: D, —, D, U, U, B, —, D
Keystroke dynamics: B, —, B, B, —, D
Signature: D, U, B, U, U, U, —, B

SOURCE: Benjamin Miller, prepared under contract to OTA, October 1986. Data updated as of April 1987.

Ch. 4—Security Safeguards and Practices

...and perspiration that can make fingerprint techniques difficult do not affect retinal scanning, and injuries to the hand or fingers are more common than severe eye injuries.

At present, only one firm produces a retina-scanning identification device. One of its current models, mainly used for physical access control, was introduced in September 1984. Subjects look into an eyepiece, focus on a visual alignment target, and push a button to initiate the scan (done using low-intensity infrared light). The retinal pattern scanned is compared with a stored template, and identification is based on a score that can range from –1 to +1, depending on the degree of match. A new, low-cost version, introduced at the end of 1986, uses a hand-held unit (the size of a large paperback book). It is intended for controlling access to computer terminals.

Potential applications are varied, but early purchasers are using the system for a range of uses, from physical access control to employee time-and-attendance reporting. Installations for physical access control have included a national laboratory, banks, a state prison, office buildings, and hospital pharmacy centers. According to the trade press, 300 units of the system had been shipped to end-users, original equipment manufacturers, and dealers by early 1986. Some overseas users are also beginning to order the systems.

While retina scanning is fast, accurate, and easy to use, anecdotal reports suggest that the technique is perceived as being personally more intrusive than other biometric methods. Nevertheless, at the end of 1986, retinal technology accounted for the largest installed base of biometric units.47

Hand Geometry

Several techniques for personal identification using aspects of hand geometry were under development or in production as of early 1986. First developed in the 1970s, more than 200 hand geometry devices are in use nationwide.

The oldest hand geometry technique was based on the length of fingers and the thickness and curvature of the webbing between them. Other techniques use the size and proportions of the hand or the distances between the joints of the fingers, infrared hand topography, palm print and crease geometry, or transverse hand geometry (viewing the sides of the fingers to measure hand thickness as well as shape). Some of these techniques combine the biometric measurement with a personal identification number. The biggest measurement problems with these devices involve people who wear rings on their fingers or whose fingers are stubbed or swollen.46

The use of hand geometry systems was limited initially to high-security installations because of the cost and physical size of the equipment. However, technological advances have lowered equipment cost and size, thus extending the market to medium-security facilities, such as banks, private vaults, university food services, and military paycheck disbursing. According to vendors, users include insurance companies, a jai alai facility, engineering firms, and corporate offices. At the same time, more sophisticated systems being developed for high-security areas, such as military and weapons facilities, use a television camera to scan the top and side of the hand.

Fingerprints

Fingerprints have been used to identify individuals since the mid-1800s.48 Manual fingerprint identification systems were based on classifying prints according to general characteristics, such as predominant patterns of loops, whorls, or arches in the tiny fingerprint ridges, plus patterns of branches and terminations of the ridges (called minutiae). Fingerprint file data were obtained by using special ink and a ten-print card; fingerprint cross-checking with local and national records was done manually. The cross-checking process began to be automated in the late 1960s, and by 1983 the Federal Bureau of Investigation (FBI) had converted all criminal fingerprint searches from manual to automated operations.49 Some State and local law enforcement agencies are also beginning to automate their fingerprint records at the point of booking.

Several firms sell fingerprint-based systems for physical access control or for use in electronic transactions. The systems generally operate by reading the fingerprint ridges and generating an electronic record, either of the locations of minutia points or as a three-dimensional, terrain-like image. The scanned live print is compared with a template of the user's prerecorded print. The user is verified if the recorded and live prints match within a predetermined tolerance. Alternative modes of operation use an individual password, identification number, or a smart card carrying the template fingerprint data. Costs vary according to the system configuration, but they are expected to fall rapidly as more systems are sold and as very large scale integrated (VLSI) technology is used.

By mid-1986, about 100 fingerprint-based systems had been installed, mostly in high-security facilities where physical access or sensitive databases must be reliably controlled. Some units, however, have been installed in health clubs, banks, and securities firms, either to control access or for attendance reporting. Also, firms are beginning to find overseas markets receptive. Potential applications will be wider as the price and size of the systems decrease. The bulk of near-term applications are expected to be mainly for physical access control, but work station devices are progressing.

Voice Identification

Subjective techniques of voice identification—listening to speakers and identifying them through familiarity with their voices—have been admissible evidence in courts of law for hundreds of years.50 More recently, technical developments in electronics, speech processing, and computer technology are making possible objective, automatic voice identification, with several potential security applications and important legal implications.51 The sound produced by the vocal tract is an acoustic signal with a phonetic and linguistic pattern that varies not only with the speaker's language and dialect, but also with personal features that can be used to identify a particular speaker.

Voice recognition technology has been around for some time,52 but personal identification systems using it are just beginning to reach the market, mainly because of the formerly high cost and relatively high error rates.53 Some large electronics and communications firms have experimented with voice recognition systems for many years, but are just now developing systems to market.54

An important distinction should be made here between technologies to understand words as spoken by different individuals (speech recognition) and technologies to understand words only as they are spoken by a single individual (speech verification). Voice identification systems are based on speech verification. They operate by comparing a user's live speech pattern for a preselected word or words with a pre-enrolled template. If the live pattern and template match within a set limit, the identity of the speaker is verified. Personal identification numbers are used to limit searching in the matching process. According to manufacturers and industry analysts, potential applications include access control for computer terminals, computer and data-processing facilities, bank vaults, security systems for buildings, credit card authorization, and automatic teller machines.

Signature Dynamics

A person's signature is a familiar, almost universally accepted personal verifier with well-established legal standing. However, the problem of forgery—duplicating the appearance of another person's signature—raises substantial barriers to the use of static signatures (i.e., recognizing the appearance of the signed name) as a secure means of personal identification.

Newer signature-based techniques use dynamic signature data that capture the way the signature is written, rather than (or in addition to) its static appearance, as the basis for verification. The dynamics include the timing, pressure, and speed with which various segments of the signature are written, the points at which the pen is raised and lowered from the writing surface, and the sequence in which actions like dotting an "i" or crossing a "t" are performed. These actions are very idiosyncratic and relatively constant for each individual, and are very difficult to imitate.55

A number of companies have researched signature dynamics over the past 10 years and several have produced systems for the market. The systems consist of a specially instrumented pen and/or a sensitive writing surface. Data are captured electronically and processed using proprietary mathematical algorithms to produce a profile that is compared with the user's enrolled reference profile or template. The systems work with an identification number or smart card identifying the profile and template to be matched.

Prices for these systems are relatively low compared with some other identification technologies. Combined with the general user acceptability of signatures (as opposed, say, to fingerprinting or retinal scans), this is expected to make signature dynamics suitable for a wide range of applications.56 Potential financial applications include credit card transactions at the point of sale, banking, automatic teller machines, and electronic fund transfers. Systems are currently being tested in banking (check cashing) and credit card applications, where they might eventually replace dial-up customer verification systems.57 Systems connected to a host computer could also provide access control as well as accountability and/or authorization for financial transactions and controlled materials, among other uses.

Keyboard Rhythm

Early work, beginning in the 1970s, on user verification through typing dynamics was done by SRI International and, with National Science Foundation (NSF) funding, the Rand Corp.58 In 1986, two firms were developing commercial personal identification systems based on keyboard rhythms for use in controlling access to computer terminals or microcomputers, including large mainframe computers and computer networks. One of the firms acquired the keystroke dynamics technology from SRI International in 1984 and contracted with SRI to develop a product line. In 1986, the firm reported that it was developing 11 products configured on plug-in printed circuit boards and that it planned to test these products in several large corporations and Government agencies in 1987. By mid-1987, the firm had contracts with over a dozen Fortune 500 corporations and five Government agencies to test its products.59 A researcher in the second firm, who had received an NSF grant in 1982 to investigate typists' "electronic signatures," formed a venture corporation in 1983 to commercialize an access control device based on the technique. He was awarded a patent in late 1986.

Keyboard-rhythm devices for user verification and access control are based on the premise that the speed and timing with which a person types on a keyboard contain elements of a neurophysiological pattern or signature that can be used for personal identification.60 The stored "user signature" could be developed explicitly or so that it would be transparent to the user—perhaps based on between 50 and 100 recorded log-on accesses or 15 to 45 minutes of typing samples if done openly and explicitly, or based on several days of normal keyboard work if done transparently (or surreptitiously). The stored signature could be updated periodically to account for normal drifts in keyboard rhythms. These types of devices might be used only at log-on, to control access to selected critical functions, or to prevent shared sessions from occurring under one user log-on. The prices of these systems depend on their configuration: current estimates range from $1,000 for a card insert for a host computer capable of supporting several work stations to $10,000 for a base system that could store 2,000 user signature patterns and support four channels that communicate simultaneously.

46 Personal Identification News, April 1986.
47 Personal Identification News, January 1987, p. 3.
48 For a complete discussion of fingerprint identification techniques, see: "Fingerprint Identification," U.S. Department of Justice, Federal Bureau of Investigation (n.d.); and The Science of Fingerprints, U.S. Department of Justice, Federal Bureau of Investigation (Washington, DC: U.S. Government Printing Office, Rev. 12/84).
49 Charles D. Neudorfer, "Fingerprint Automation: Progress in the FBI's Identification Division," FBI Law Enforcement Bulletin, March 1986, p. 2.
50 Historical and theoretical discussion of voice identification and its legal applications can be found in: Oscar Tosi, Voice Identification: Theory and Legal Applications (Baltimore, MD: University Park Press, 1979).
51 Although courts in several jurisdictions have ruled that voiceprints are scientifically unreliable, courts in some States, including Maine, Massachusetts, and Rhode Island, consider them to be reliable evidence. A recent ruling by the Rhode Island Supreme Court allowed a jury to consider evidence of voiceprint comparisons and to decide itself on the reliability of that evidence, noting that, "The basic scientific theory involved is that every human voice is unique and that the qualities of uniqueness can be electronically reduced . . ." (State v. Wheeler, 84-86-C.A., July 29, 1985). Source: Privacy Journal, August 1985.
52 The roots of most voice systems can be traced to work over the past 20 years at AT&T Bell Laboratories. Personal Identification News, October 1985.
53 See Tosi, "Fingerprint Identification," U.S. Department of Justice, Federal Bureau of Investigation (n.d.); and The Science of Fingerprints, U.S. Department of Justice, Federal Bureau of Investigation (Washington, DC: U.S. Government Printing Office, Rev. 12/84), ch. 2.
54 Personal Identification News, January 1986.
55 Several signature dynamics systems have adaptive features that can allow a person's signature to vary slowly over time; enrollment procedures require several signatures to set the reference signature profile, and users are permitted more than one (usually two) signature attempts for identification.
56 George Warfel, "Signature Dynamics: The Coming ID Method," Data Processing and Communications Security, vol. 8, No. 1 (n.d.).
57 Ibid.
58 R. Stockton Gaines, William Lisowski, S. James Press, and Norman Shapiro, "Authentication by Keystroke Timing: Some Preliminary Results," R-2526-NSF, The RAND Corp., Santa Monica, CA, May 1980.
59 Rob Hammon, International Bioaccess System Corp., personal communications with OTA staff, Aug. 4, 1987.
60 Some speculate that this method would only be effective for experienced typists, rather than erratic "hunt and peck" novices, but at least one of the firms claims that the method can be implemented for use by slow or erratic typists as well.

ACCESS CONTROL SOFTWARE AND AUDIT TRAILS

Once the identity of a user has been verified, it is still necessary to ensure that he or she has access only to the resources and data that he or she is authorized to access. For host computers, these functions are performed by access control software. Records of users' accesses and online activities are maintained as audit trails by audit software.

Host Access Control Software

To provide security for computer systems, networks, and databases, user identifications and passwords are commonly employed with any of a number of commercially available add-on software packages for host access control. Some have been available since the mid-to-late

Box D.—Host Access Control Software

A number of host access control software packages are commercially available that work with a computer's operating system to secure data and the computing resources themselves. Access control software is designed to offer an orderly method of verifying the access authority of users. As such, it can protect the system, its data, and terminals from unauthorized users, and can also protect the system and its resources from unauthorized access to sensitive data and from errors or abuses that could harm data integrity.

Access control software intercepts and checks requests for system or database access; the various commercial packages vary in the ways they check requirements for authorized access. Most require both user identification and a password to allow access; the user's identification determines his or her level of access to the system hardware and files. Passwords may be changed from time to time, or even (in some systems) encrypted. To prevent unauthorized users from guessing passwords, most of these systems limit the number of incorrect access attempts before logging the user off and sending a security alert message (including the user's identification number). Some packages generate their own passwords; these tend to be more difficult for intruders to guess, but also are more difficult for authorized users to remember.

The data files containing user identification numbers and passwords are critical to system security because knowledge of correct identification number and password combinations would allow anyone access to the system and its most sensitive files. Therefore, some access control packages do not allow even security administrators to know user passwords—users set up their own, or the system generates the passwords, which may change frequently. The structure of system-generated passwords is being studied to make them easier to remember.

Access control software packages allow for audit features that record unauthorized access attempts, send violation warning messages to security, and/or log the violator off the system. Other audit features include keeping a log of users' work activities on a daily basis, printing reports of use and violation attempts, and allowing security officers to monitor users' screens. These packages can also be used in conjunction with special facility-specific security access controls implementing other restrictions (time-of-day, database, file, read-only, and location/terminal) written in custom code to fit the application environment.

Versions of access control software packages are currently available to protect a variety of manufacturers' mainframe operating systems and minicomputers. Development of software for commercial host access control began in the early 1970s. Currently, there are more than 24 software packages from different vendors. These packages are designed to work with a variety of host configurations (CPU, operating system, storage space, interfaces to other system software).

SOURCE: DataPro Research Corp., "All About Host Access Control Software," 1S5'2-001, June 1985.
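The core behaviors Box D describes, checking ID and password, tying access level to the user's identification, limiting incorrect attempts, and raising a security alert that names the violator, can be sketched as follows. The names, the three-attempt limit, and the hashed password store are illustrative assumptions, not the design of any of the packages discussed.

```python
# Sketch of the access-control behaviors described in Box D: verify ID
# and password, return the ID's access level, limit bad attempts, and
# raise an alert naming the user. Names and limits are assumptions.
import hashlib

MAX_ATTEMPTS = 3
# Passwords stored as salted hashes; Box D notes some packages encrypt them.
USERS = {"u1001": {"salt": b"x9", "pw_hash": None, "level": "read-only"}}
USERS["u1001"]["pw_hash"] = hashlib.sha256(b"x9" + b"correct-horse").hexdigest()

failures: dict[str, int] = {}
alerts: list[str] = []

def log_on(user_id: str, password: str):
    record = USERS.get(user_id)
    ok = record and hashlib.sha256(
        record["salt"] + password.encode()).hexdigest() == record["pw_hash"]
    if ok:
        failures[user_id] = 0
        return record["level"]  # the ID determines the level of access
    failures[user_id] = failures.get(user_id, 0) + 1
    if failures[user_id] >= MAX_ATTEMPTS:
        alerts.append(f"security alert: repeated failures for {user_id}")
    return None  # access denied / user logged off

print(log_on("u1001", "correct-horse"))  # 'read-only'
for _ in range(3):
    log_on("u1001", "guess")
print(alerts)  # the alert message includes the user's identification
```

Keeping the alert outside the normal return path mirrors the packages' behavior of notifying a security officer rather than the would-be intruder.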

1970s. (See box D.) As of 1986, three access control software packages were market leaders: RACF, with some 1,500 installations since 1976; ACF2, developed by SKK, Inc., and marketed by the Cambridge Systems Group, with more than 2,000 installations since 1978; and Top Secret, marketed by the CGA Software Products Group, with more than 1,000 packages installed since 1981.61

In all, more than two dozen software packages are being marketed, some for classified applications. These packages vary widely in their range of capabilities and applications, and are usually either licensed with a one-time fee or leased on a monthly or yearly basis. Fees and maintenance can range from several hundred dollars up to $50,000 per year.

Instead of the "add-on" software packages mentioned above, the operating systems of many computers include some level of access control built into the basic system software. Most of the built-in systems offer features comparable to the add-on systems designed for commercial use.62 The number of new computer operating systems incorporating access control and other security features is expected to increase.

Commercial access control software packages commonly rely on users memorizing their identification numbers or passwords and keying them into the terminal. Thus, they tend to rely on the "something known" criterion for security. They also tend to permit a single individual—in principle, the security officer—access to the central files containing users' authorization levels and, although less prevalent in newer systems, their users' passwords. A characteristic of the higher security packages is that they are designed for applications in which users with varying levels of authorization are using a system containing information with varying degrees of sensitivity. An example is a system containing classified information, where some is classified "confidential" and some "secret."

NSA's National Computer Security Center (NCSC) has provided Federal agencies with criteria to evaluate the security capabilities of trusted computer systems. According to the NCSC definition, a trusted computer system is one that employs sufficient hardware and software integrity measures to allow its use for processing simultaneously a range of sensitive or classified information. The trusted system criteria contained in the so-called "Orange Book,"63 developed by NSA, define four classes of security protection, ranging from Division D (minimal protection) up through Class A1 of Division A (verified protection). NCSC also evaluates access control software products submitted by vendors and rates them according to the Orange Book categories. The evaluations are published in the Evaluated Products List, which is made available by NCSC to civilian agencies and the public. As of May 1987, eight products had received NCSC ratings and more than 20 others were being evaluated.

Despite their importance to host computer security, particularly for classified applications, a detailed look at trusted operating systems is beyond the scope of this OTA assessment. A number of computer security experts, including those at NSA, consider trusted operating systems to be crucial to securing unclassified, as well as classified, information. They consider access controls to be of limited value without secure operating systems, and the NCSC criteria, at least at the B and C levels, to be of significant value in both classified and commercial applications.64 However, other computer security experts have questioned whether design criteria appropriate for classified applications can or should be applied to commercial applications or even to many unclassified Government applications. (See ch. 5.)

The recent debate over the applicability of what some term the "military" model to commercial computer security65 had progressed to the point where plans were made for an invitational workshop on this topic to be held in the fall of 1987.66 This specific area of concern illustrates the issue of whether or not it is in the Nation's best interests to assign to one agency—namely, NSA—the task of meeting all the needs of the Government civilian agencies and the private sector while continuing to carry out its other missions. These concerns will be raised again and explored in chapters 5 and 7.

Audit Trails

Another major component of computer security, usually part of a host access control system, is the ability to maintain an ongoing record of who is using the system and what major actions are performed. The system's operators can then review this "audit trail" to determine unusual patterns of activity (e.g., someone consistently using the system after office hours) or to reconstruct the events leading to a major error or system failure.

In the past few years, software has begun to combine auditing with personal identification. An audit log can record each time a user seeks access to a new set of data. Figure 21 shows a sample audit log. Audit trail information is routinely recorded on most mainframe computers that have many users. Such software is available but seldom used on similar minicomputers, in part because it slows down the performance of the system, and is only rarely available for microcomputers.

Audit trails are among the most straightforward and potentially most effective forms of computer security for larger computers and multi-user minicomputers. However, the fact that they are easily available for these machines does not mean that they are effectively used. Many system managers either do not use the audit trails or rarely if ever review the logs once generated. For example, OTA found that only 58 percent of 142 Federal agencies surveyed use audit software for computers containing unclassified, but sensitive, information. Only 22 percent use audit software for all of their unclassified, but sensitive, systems.67 Similarly, a 1985 General Accounting Office (GAO) study that examined 25 major computer installations found that only 10 of them met GAO's criteria for use of audit trails.68

Part of the reason why audit trails are not more widely and effectively used is that they tend to create voluminous information that is tedious to examine and difficult to use. Technical developments can ease this problem by providing tools to analyze the audit trail information and call specified types or patterns of activities to the attention of system security officers. Thus, it would not be necessary, except in case of a disaster, to review the entire system log.

61 DataPro, reported in Government Computer News, Dec. 5, 1986, p. 40.
62 S. Lipner, personal communication with OTA staff, Dec. 24, 1986.
63 Department of Defense Trusted Computer System Evaluation Criteria, Department of Defense Standard DoD 5200.28-STD, December 1985. Two companion DoD documents ("Yellow Books") summarize the technical rationale behind the computer security requirements and offer guidance in applying the standard to specific Federal applications: Computer Security Requirements—Guidance for Applying the Department of Defense Trusted Computer System Evaluation Criteria in Specific Environments, CSC-STD-003-85, June 25, 1985; and Technical Rationale Behind CSC-STD-003-85: Computer Security Requirements, CSC-STD-004-85, June 25, 1985.
64 Harold E. Daniels, Jr., NSA S-0022-87, Jan. 21, 1987. Safeguards currently used by the private and civil sectors have received B- and C-level ratings.
65 See, for example, David D. Clark and David R. Wilson, "A Comparison of Commercial and Military Computer Security Policies," Proceedings, 1987 IEEE Symposium on Security and Privacy (Oakland, CA: Institute for Electrical and Electronic Engineers, Apr. 27-29, 1987).
66 The Workshop on Integrity Policy for Computer Information Systems will be held at Bentley College, Waltham, MA, in the fall, 1987. It is being organized by Ernst & Whinney, and is co-sponsored by the Association for Computing Machinery, the Institute for Electrical and Electronic Engineers, the National Bureau of Standards, and the National Computer Security Center.
67 Information Security, Inc., "Vulnerabilities of Public Telecommunications Systems To Unauthorized Access," OTA contractor report, November 1986.
68 William S. Franklin, General Accounting Office, statement on Automated Information System Security in Federal Civilian Agencies, before House Committee on Science and Technology, Subcommittee on Transportation, Aviation, and Materials, 99th Cong., 1st sess., Oct. 29, 1985.
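The kind of analysis tool anticipated above, one that scans the audit trail for a specified pattern instead of forcing a security officer to read the entire log, can be sketched as follows. The record format, the 18:00 to 06:00 "after hours" window, and the threshold are illustrative assumptions.

```python
# Sketch of audit-trail analysis: flag users repeatedly active after
# hours (the example pattern given in the text) rather than reviewing
# the whole log. Record format and time window are assumptions.
from collections import Counter

audit_log = [
    ("u1001", "1987-06-01 09:14", "read payroll"),
    ("u1042", "1987-06-01 23:55", "copy file"),
    ("u1042", "1987-06-02 02:10", "copy file"),
    ("u1042", "1987-06-03 01:42", "read payroll"),
    ("u1001", "1987-06-03 10:05", "update record"),
]

def after_hours(timestamp: str) -> bool:
    """True for activity between 18:00 and 06:00 (assumed office hours)."""
    hour = int(timestamp.split()[1].split(":")[0])
    return hour >= 18 or hour < 6

def flag_after_hours_users(log, threshold=2):
    """Report users seen after hours at least `threshold` times."""
    counts = Counter(user for user, ts, _ in log if after_hours(ts))
    return {user: n for user, n in counts.items() if n >= threshold}

print(flag_after_hours_users(audit_log))  # {'u1042': 3}
```

Only the flagged summary, not the voluminous log itself, would need a security officer's attention, which is precisely the labor problem the paragraph above identifies.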

ADMINISTRATIVE AND PROCEDURAL MEASURES

Important as technical safeguard measures like the ones that have been described above can be, administrative and procedural measures can be even more important to overall security. For example, encryption-based communications safeguards can be rendered useless by improper management of "secret" encryption or decryption keys (see below). In the field of computer security, technical measures of the types mentioned above are almost useless if they are not administered effectively. While they can only be raised briefly here, some

Figure 21.—Example Reports From Audit Trail Software

PRIVILEGES

a 9 0 1

of the most important aspects of computer security administration include:69

● Maintaining a Written Security Policy and Assigning Responsibilities for Security. Many organizations simply do not have a policy regarding computer security, or the policy is unavailable to computer users, or the policy is not followed. Computer security experts report that one of the most important factors in encouraging good computer security is for users to know that management is indeed committed to it. Also, it is important that each individual in the organization be aware that protecting information assets is part of his or her responsibility.
● Password Management. Password-based access control systems are much less effective if computer users write their passwords on the wall next to their terminal, if they choose their birthday or spouse's name as their password, or if passwords are never changed. Thus, policies to encourage reasonable practices in password systems are not only essential, but are probably one of the simplest and most neglected ways to enhance security.
● Reviewing Audit Trails. Similarly, audit software is of little value unless the logs created by its use are reviewed.
● Training and Awareness. Relatively simple programs can help users understand what kind of security problems the organization faces and their role in enhancing security.
● Periodic Risk Analyses. Such an analysis involves examining each computer system, the sensitivity of the data it handles, and the measures that are in use or should be considered to protect the system.
● Personnel Checks. Organizations may wish to avoid putting employees with certain kinds of criminal records or financial problems in jobs with access to sensitive information. It may be difficult, however, to perform such checks without raising concerns about employee privacy.
● Maintaining Backup Plans and Facilities. Many organizations do not have any policy or plans for what to do in the event of a major disaster involving an essential computer system. For example, in 1985 only 57 percent of Federal agencies had (or were in the process of developing) backup plans for their mainframe computer systems.70

69 For a more complete discussion of administrative procedures for computer security, see U.S. Congress, Office of Technology Assessment, Federal Government Information Technology: Management, Security, and Congressional Oversight, OTA-CIT-297 (Washington, DC: U.S. Government Printing Office, February 1986). Additionally, the General Accounting Office (GAO) has issued many reports over the last decade identifying major information security problems and surveying information security practices in Federal agencies (see tables 4-5 in the February 1986 OTA report for a selected list of some of these GAO reports).
70 Data from OTA's Federal Agency Data Request given in ch. 4 of Federal Government Information Technology: Management, Security, and Congressional Oversight, OTA-CIT-297 (Washington, DC: U.S. Government Printing Office, February 1986).
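Parts of the password-management practice described above can be automated. The sketch below is a minimal, hypothetical illustration, not a procedure from this report: the record layout, the 90-day age threshold, and the user names are all invented for the example. It flags the specific weak practices mentioned (a birthday or spouse's name used as a password, and passwords never changed).

```python
from datetime import date

MAX_PASSWORD_AGE_DAYS = 90  # assumed policy threshold, not from the report

def weak_password_findings(user, today=None):
    """Return a list of policy violations for one user record.

    `user` is a dict with keys: name, password, birthday (date),
    spouse_name (str or None), last_changed (date or None).
    """
    today = today or date.today()
    findings = []
    pw = user["password"].lower()
    # Easily guessed choices the report singles out.
    if pw == user["birthday"].strftime("%m%d%y"):
        findings.append("password is the user's birthday")
    if user.get("spouse_name") and pw == user["spouse_name"].lower():
        findings.append("password is the spouse's name")
    # Passwords that are never changed.
    if user["last_changed"] is None:
        findings.append("password has never been changed")
    elif (today - user["last_changed"]).days > MAX_PASSWORD_AGE_DAYS:
        findings.append("password is older than policy allows")
    return findings

user = {"name": "jdoe", "password": "pat", "birthday": date(1950, 3, 14),
        "spouse_name": "Pat", "last_changed": None}
print(weak_password_findings(user, today=date(1987, 10, 1)))
```

A real checker would also test dictionary words and enforce minimum length; the point here is only that such policy checks are simple to mechanize.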

COMPUTER ARCHITECTURES

The computer itself has to be designed to facilitate good security, particularly for advanced security needs. For example, it should monitor its own activities in a reliable way, prevent users from gaining access to data they are not authorized to see, and be secure from sophisticated tampering or sabotage. The national security community, especially NSA, has actively encouraged computer manufacturers to design more secure systems. In particular, NCSC has provided guidelines for secure systems and has begun to test and evaluate products submitted by manufacturers, rating them according to the four security divisions discussed above. A more thorough discussion of secure computing bases is beyond the scope of this assessment.

While changes in computer architecture will gradually improve security, particularly for larger computer users, more sophisticated architecture is not the primary need of the vast majority of current users outside of the national security community. Good user verification coupled with effective access controls, including controls on database management systems, are the more urgent needs for most users.

COMMUNICATIONS LINKAGE SAFEGUARDS

In the past few years it has become increasingly clear that computers are vulnerable to misuse through the ports that link them to telecommunications lines, as well as through taps on the lines themselves. Although taps and dial-up misuses by hackers may not be as big a problem as commonly perceived, such problems may grow in severity as computers are increasingly linked through telecommunications systems. Similarly, computer and other communications using satellite transmissions motivate users to protect these links.

Port-Protection Devices

For some computer applications, misuse via dial-up lines can be dramatically reduced by the use of dial-back port protection devices used as a buffer between telecommunications lines and the computer. The market for these is fairly new, but maturing. Some products are stand-alone, dial-back units used for single-line protection; others are rack-mounted, multi-line protection units that can be hooked up to modems, telephones, or computer terminals. Some 40 different models of commercial dial-back systems were being sold in 1986, with prices ranging from several hundred to several thousand dollars (on the order of $500 per incoming line), depending on the configuration, features, and number of lines protected. Some, but not all, models offer data encryption as a feature, using DES and/or proprietary algorithms.

In addition to these dial-back systems, security modems can be used to protect data communications ports. These security modems are microprocessor-based devices that combine features of a modem with network security features, such as passwords, dial-back, and/or encryption. Security modems featuring encryption must be used in pairs, one at each end, with the correct encryption key and algorithm to encrypt and decrypt communicated data and instructions. About 20 different models of commercial security modems were available in 1986, with various combinations of features, such as password protection, auditing, dial-back, and/or encryption. Security modems featuring encryption offer the DES and/or proprietary encryption algorithms.

According to DataPro Research Corp., the market for security modems has been in a period of rapid change since the early 1980s: new and advanced products have been introduced, more users have adopted remotely accessible data operations, and prices have continued to fall. Prices for security modems range from less than $500 to almost $2,000, depending on the features included.

An example of the use of this type of port protection follows: When a remote user wants to log on to the machine, the security modem is programmed to answer the call, ask for his or her log-on identification and password, and then (if the identification and password are proper) call back the computer user at the location at which he or she is authorized to have a terminal. There may be some inconvenience in using the device, however, if authorized computer users frequently call from different phone numbers. In addition, there are ways to thwart dial-back modems, such as using "call-forwarding" at the authorized user's phone to route the computer transmission elsewhere to an unauthorized phone or user.

Dial-back devices are generally considered too inconvenient to use for one very important application: large-scale database applications, such as commercial credit reporting services.
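The dial-back procedure described in this section reduces to a small piece of control logic. The sketch below is a hypothetical illustration of that logic only; the user ID, password, and phone number are invented, and the two callbacks stand in for real modem-control hardware.

```python
# Hypothetical authorized-user directory: ID -> (password, callback number).
# A real dial-back unit holds this table in the device itself.
DIRECTORY = {
    "jdoe": ("sesame", "202-555-0143"),
}

def handle_incoming_call(user_id, password, hang_up, dial_out):
    """Answer a call, verify credentials, then call the user back.

    `hang_up` and `dial_out` stand in for modem-control operations.
    Returns True only if a callback to the registered number was placed.
    """
    entry = DIRECTORY.get(user_id)
    if entry is None or entry[0] != password:
        hang_up()          # unknown user or bad password: just disconnect
        return False
    hang_up()              # always disconnect first, then...
    dial_out(entry[1])     # ...call back only the pre-registered number
    return True

events = []
handle_incoming_call("jdoe", "sesame",
                     hang_up=lambda: events.append("hang-up"),
                     dial_out=lambda n: events.append("dial " + n))
print(events)  # ['hang-up', 'dial 202-555-0143']
```

The key design point is that the callback number comes from the unit's own directory, never from the caller, so a stolen password alone is not enough; as the report notes, however, call-forwarding at the authorized number can still defeat the scheme.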
These services can receive thousands of calls a day from terminals in banks and credit bureaus seeking to verify a person's credit worthiness, often prior to a loan or establishment of a line of credit. The use of dial-back devices for such an application is time-consuming and costly, and is difficult to administer given the number of terminals that would have to be connected to the devices. Thus, those who illegally obtain passwords to access these systems can now use them relatively easily.

Other technical measures may be useful for large public database systems, however. For example, remote terminals in retail stores could be equipped to perform a coded "handshake" with the host computer before they can gain access to the database. Or, as the telecommunications network evolves toward wider use of digital signaling equipment, it will increasingly be possible for host computers to know the phone number of the person trying to gain access and thus to check that phone number against its list of authorized customers.

Satellite Safeguards

In the military, highly directional antennas, spread-spectrum modulation, and laser communication are among the measures used or contemplated to protect satellite signals from unauthorized reception. Other methods range from analog scrambling to full digital encryption. For encryption, equipment costs and operational complexity tend to inhibit the widespread deployment of elaborate encryption techniques. This is particularly true for point-to-multipoint networks, where the expense of providing a large number of end users with decryption equipment may not be worth the cost.

The current trend is toward the implementation of security by some service providers. For example, the video industry, one of the largest users of satellite capacity, has begun to use analog scrambling techniques to discourage casual theft of service. Methods for encrypting video signals range in complexity from line-by-line intensity reversal to individual pixel scrambling. Decryption keys may be broadcast in the vertical blanking interval. In some systems, individual subscribers can be addressed, providing selective access to the programming. Scrambling techniques are also being used by some providers of point-to-multipoint satellite data networks. Since these transmissions are typically digital, more effective encryption systems can be used. In some cases, a device using the Data Encryption Standard is provided in the subscribers' receiver equipment and key distribution is accomplished in real time to selected end users (i.e., to those who have paid to receive the broadcast).71

The Department of Defense has had continuing concerns for the vulnerability of satellites to interception and other misuse. The Senate Committee on Appropriations approved funds in 1986 for the first year of a 5-year plan developed by NSA that would enable DoD to reimburse satellite carriers for installing encryption equipment to protect their transmissions.72

Fiber Optic Communications

Fiber optic communications links provide an important barrier to misuse, because more sophisticated means are required to eavesdrop. Further, means are available to detect some forms of misuse.

Common Carrier Protected Services

Several common carriers encrypt their microwave links in selected geographic areas as well as their satellite links that carry sensitive Government communications. These protected services are largely the result of NSA and GSA procurements beginning in the 1970s. Much of the following discussion is excerpted from the OTA contractor report, "Vulnerabilities of Public Telecommunications Systems to Unauthorized Access," prepared by Information Security Incorporated, November 10, 1986.

71 Note that the Electronic Communications Privacy Act of 1986 (Public Law 99-508) made the private use of "backyard" earth stations legal for the purpose of receiving certain satellite transmissions, unless it is for the purpose of direct or indirect commercial advantage or for private gain.
72 See U.S. Senate, Committee on Armed Services, National Defense Authorization Act for Fiscal Year 1987, Report 99-331 to accompany S. 2638, 99th Cong., 2d sess., July 8, 1986, p. 295.
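The coded "handshake" suggested earlier in this section for retail terminals is not specified further in the report. One way to realize such a handshake is a challenge-response exchange built on a keyed hash; the sketch below uses HMAC, a construction standardized well after 1987, as an illustrative stand-in rather than anything the report prescribes. The terminal ID and key are invented.

```python
import hashlib
import hmac
import secrets

# Hypothetical per-terminal secret; in practice each retail terminal
# would hold its own key, installed out of band at deployment time.
TERMINAL_KEYS = {"store-17": b"k3y-installed-at-deployment"}

def issue_challenge() -> bytes:
    """Host side: generate a fresh random challenge for this session."""
    return secrets.token_bytes(16)

def terminal_response(terminal_id: str, challenge: bytes) -> bytes:
    """Terminal side: prove knowledge of the key without transmitting it."""
    key = TERMINAL_KEYS[terminal_id]
    return hmac.new(key, challenge, hashlib.sha256).digest()

def host_verifies(terminal_id: str, challenge: bytes, response: bytes) -> bool:
    """Host side: recompute the expected response and compare in constant time."""
    expected = hmac.new(TERMINAL_KEYS[terminal_id], challenge,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = issue_challenge()
response = terminal_response("store-17", challenge)
print(host_verifies("store-17", challenge, response))  # True
```

Because a fresh random challenge is issued for each session, a wiretapper who records one exchange cannot replay it later, which is the advantage of a handshake over a static password.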

The latest transmission technology using fiber optics is difficult to intercept because the information signal is a modulated light beam confined within a glass cable. NSA judges both cable and fiber media to provide adequate protection for unclassified national security-related information.

American Telephone & Telegraph Co. (AT&T) protects its microwave links in Washington, D.C., New York, and San Francisco. Major routes are being expanded with fiber optics. Protected service is available in areas designated by NSA and private line service can be offered over selected fiber and cable routes. In addition, customized encryption can be installed on selected microwave and satellite circuits for particular customers.73

MCI offers protected terrestrial microwave services in those areas specified by NSA. In addition, MCI offers customers the option of protected service in many other major metropolitan areas. These customers can order protected communications throughout the MCI portion of the circuit, using the MCI fiber optic system, encrypted terrestrial microwave, and the MCI-encrypted satellite network.74

U.S. Sprint, which reached 2.5 million customers or about 4 percent of all long-distance customers in 1986, intends to create an all-fiber network by the end of 1987 that the company expects will carry more than 95 percent of its voice and data traffic.75 This means any call or circuit carried via the Sprint network would be harder to intercept than unprotected microwave transmissions. Currently, Sprint has protected microwave radio in the NSA-designated areas.76

International Telephone & Telegraph Co. (ITT) offers protected service in the NSA-designated zones, consisting of protected microwave circuits. The service is available now on a private-line basis to commercial or business customers.77

The American Satellite Co. offers two types of protected carrier services. One uses an encrypted satellite service that has been approved by NSA for protecting unclassified, but sensitive information. The second service uses protected terrestrial microwave in the NSA-designated areas. This also is available on a private line basis in the service areas.78

Pacific Bell plans to have a complete fiber and cable network between all its central offices within 10 years. These plans include most of San Francisco, Los Angeles, and San Diego; at present, two fiber rings in San Francisco are routed past all major office buildings. Pacific Bell can offer customers in the San Francisco area fiber optic routes throughout most of their operating region. In Los Angeles, the company has 27 locations used in the 1984 Olympics linked by fiber optic facilities and is extending its network. These offerings can be augmented with new fiber spurs to a customer's location. All of these services are filed with the California Public Utility Commission as special service engineering and are not tariffed by the FCC.79

Bell Communications Research (Bellcore) is developing a service that would be implemented by the Bell Operating Companies. The service would provide special handling and routing over protected or less-interceptable (i.e., fiber or cable) lines. The initial goal is to use as much as possible the inherent security features of the existing network. This service is being designed to meet NSA requirements for protecting unclassified government information so that costs (for Government contractors) will be reimbursable under National COMSEC Instruction 6002 and Department of Defense Instruction 5210.74. Bellcore anticipates that this service will also be available to other commercial customers.80

73 AT&T Communications Security, marketing literature, 1986.
74 MCI Communication Protection Capabilities, marketing literature, 1986.
75 "Why U.S. Sprint Is Building the First Coast-to-Coast Fiber Optic Network and What's in It for You," U.S. Sprint marketing literature, 1986.
76 U.S. Sprint, "Clearline," Vol. 2, Issue 5, Kansas City, MO, spring 1987.
77 ITT Private Line Service-Security, marketing literature, 1986.
78 Protected Communications Services, marketing literature, 1986.
79 OTA Federal Agency Data Request, op. cit.
80 Ibid., ref. 27.

Chapter 5
Improving Information Security

CONTENTS

Findings
Introduction
National Security Objectives and Programs
  Background
  Federal Telecommunications Protection Programs
  Government Procurements
  Carrier Protection Services
  DoD Programs Under NSDD-145
  Implications of Merging Defense, Civilian Agency, and Private Sector Requirements
Objectives and Programs Unrelated to National Security
  Background
  Private Sector Motivations
  Linkages in and Contrasts Between Defense Intelligence and Other Needs
  Technical Standards Development
  Inherent Diversity of Users' Needs

Box
E. Indicators of Private Sector Interest in Safeguards

Tables
7. Overall Ranking of Importance as an Adversary
8. Top-Priority Computer and Information Security Concerns Mentioned by Respondents
9. Perceived Impacts From NSDD-145
10. Selected Civilian Technical Standards for Safeguarding Information Systems

Chapter 5
Improving Information Security

FINDINGS

● The needs of institutional users are changing, expanding gradually and incrementally, as technology makes practical a broader range of applications of information safeguards. The current trend in user activities is toward controlling access to systems, linking transactions with particular individuals and authorizations, and verifying message accuracy.
● Users in civil agencies and the private sector have diverse needs to safeguard their computer and communications systems, even within any one Federal agency or industry. Organizations differ in their needs, perceptions, and attitudes towards information security, and see different incentives or mandates to secure information systems. Differences in their concerns for vulnerabilities, risks, and adversaries are probably greatest between Government intelligence agencies and other users.
● It is unclear whether any one agency can specify and design one or a few safeguards for a wide range of users; this is particularly questionable for the National Security Agency, given its propensity for secretiveness and its focus on protecting against foreign intelligence adversaries.
● Cryptography underlies some powerful safeguards that have broad application, not just for national security needs, but also for an expanding number of commercial needs, such as to ensure the integrity of electronic information and reduce the costs of routine business transactions. Advances in cryptography have stimulated new nondefense applications of the technology.
● Federal standards and guidelines have a leveraging effect on the private sector, especially in areas related to cryptography.
● It is not clear how motivated the nondefense private sector will be to use some safeguards, such as secure telephones or trusted computers, particularly if these are not easy to use and cost-effective in business applications.

INTRODUCTION

The preceding chapters illustrate the various vulnerabilities of computer and communications systems and the range of technologies that are becoming available to safeguard information in these systems. They also introduce the notion of a spectrum of adversaries, differing widely in available resources (time, money, equipment, and specialized knowledge), against whom these systems may need to be protected. This chapter examines the perceived needs of various users—defense and civilian agencies of the Federal Government, financial and other private sector users—as indicated by the actions they are taking to safeguard their domestic and international operations. It also points out some of the diversity in their perceived needs for safeguards, both among users in the private sector and, particularly, between users in intelligence agencies and others.


The level of users' activity toward safeguarding electronic information is growing. Various factors are contributing to this interest. These factors range from wanting to improve business operations, including the reduction of potential theft and human errors, to streamlining business transactions and adhering to industry standards of due care and, in some cases, to requirements imposed by emerging Federal policies. Federal policies, for example, will influence the actions of some banks and defense contractors. No individual factor is recognized as singularly prominent in driving the use of safeguards.

Instead, business uses of electronic safeguards are in a transition phase as users continue to define their needs, as technical standards are developed, and as Federal policies and agency roles stabilize further. A number of factors have complicated the situation, however. Among these is the question of the influence of the National Bureau of Standards or the National Security Agency in setting standards for information security safeguards, and users' perceptions of the prospective reach of Federal policies requiring safeguards for unclassified information. (See ch. 6.)

One important turning point appears to have been reached in that users are now better able to distinguish between the protections provided, or not provided, by different forms of safeguards and their alignment with specific needs. Users tend to be concerned with one or more of three main objectives in seeking information safeguards: preventing unauthorized disclosure; maintaining the integrity of electronic information; and ensuring continuity of service. The needs of different communities of users vary widely and these needs are often critical for one of these objectives and less important, or nonexistent, for others. For some users there is concern for all three objectives.

In spite of the difficulty in distinguishing between users according to their objectives for information security, some cautious observations can be made. One of these is that a critical need for some users, such as intelligence agencies, is to prevent unauthorized disclosure. Most businesses and civilian agencies are particularly dependent on the integrity of certain of their electronic information, and many of these are also concerned about unauthorized disclosure. And, for some users, such as those responsible for public safety (air traffic control) and many financial services, there is an important, if not critical, need for continuity of service. Observations concerning users' objectives are important because Federal policy that is misaligned with users' needs can create significant tensions.

Government agencies' and private sector needs for information security include capabilities for authenticating the origin and integrity of messages, and for verifying the identities and authorizations of system users. The Department of the Treasury and the Federal Reserve System, for example, electronically transfer huge amounts of money every working day and, with commercial banks, are providing leadership in developing and using safeguards with these types of capabilities.

Users' needs for safeguards are by no means confined to the financial community. The use of safeguards for securing electronic information is being adopted by users in industries ranging from automobile manufacturing to grocery businesses. However, private sector needs and Government national security concerns are not identical. They differ in their perceptions of the levels of adversaries, the consequences of exploitation, and their organizational motivations and decision rules for protecting information and investing in safeguard technology.1

In addition, private sector demand for safeguards is growing, as is its ability to produce them, as noted in chapter 4. Users tend to make selected use of a broader range of new technologies for safeguarding information that prove cost-effective or are otherwise important for business reasons. Interestingly, many of the emerging commercial uses of message integrity (authentication) techniques, e.g., for cost-reduction purposes, make use of the same cryptographic techniques used to improve the confidentiality of electronic information. Often, however, the commercial motivations for employing these techniques are unconcerned with preventing unauthorized information disclosure or protecting national security.

What emerges is a sense that although generalizations of aggregate users' needs are useful, individual users tend to have significant diversity among them. Even within one user community, such as the banking industry, there can be considerable diversity of needs, depending on size, location, operations, clients, and numbers of branches and correspondents.

This diversity of needs raises questions with regard to the proper role of the Federal Government in meeting private sector needs and the extent to which any one Federal agency can reasonably be expected to meet the safeguard needs of all users. Such a task would require an agency to interact openly and continually with a diverse public. The intensity and openness of interaction would require significant adaptation in the operations of an agency such as DoD's National Security Agency (NSA).2 Without a full appreciation of users' needs, there is significant risk of premature or "off-target" technology standardization or imposing DoD restrictions that are unacceptable to users. At the same time, safeguards that do not meet users' needs—even those that are federally imposed—are not likely to be applied widely and may distort market forces.

The users themselves are also likely to be important in shaping information safeguards. The influence of major international business users on information security standards is only beginning to be felt, but is likely to be significant in the long term. These users can be expected to demand safeguards that integrate well into their business operations in terms, for example, of being inexpensive, exportable, interoperable, and politically acceptable in the many countries in which the firms do business. Their influence is already beginning to be felt through communities of industry users, such as international banking, transportation, and manufacturing.

OTA analyzed survey data to gain insights into the influence of Federal policies and standards on users' and vendors' actions. Although the effects of National Security Decision Directive 145 (NSDD-145), issued in 1984, were still evolving, there were indications, as of late 1986, that the impact of this policy had not been widely felt on nongovernment users' actions. For example, about three-fourths of the nongovernment respondents to an OTA survey question, and 46 percent of the nongovernment respondents to a separate Ernst & Whinney survey, indicated that this policy had no impact on their organizations' actions toward safeguarding unclassified information.3

Moreover, OTA's research has found that some large firms feel that, in general, Federal guidelines and assistance programs have not significantly or directly contributed to their information security efforts.4 Data from Ernst & Whinney's computer security survey in 1986 shows that, of 474 respondents, two-thirds said that none of their organization's information and computer security expertise came directly from Government-sponsored assistance programs, conferences, or training programs. On average, according to estimates by both government and nongovernment respondents, only 7 percent of their organizations' information and computer security expertise came directly from government programs.5

Vendors of information security products are especially, and understandably, sensitive to Government policies and standards that influence the use and choice of safeguards among Government agencies and businesses. The relatively small markets for many types of safeguards make any influences on consumption of these products particularly important.

The following sections examine the range of users' motivations for using safeguard technologies to protect unclassified information and spotlight what users are doing to meet their objectives. They illustrate some of the main objectives of users for safeguarding electronic information, ranging from national security to economic self-interest and the need to comply with established business practices.

For the purposes of this report, user objectives and actions are grouped into two categories:
1. those related to national security, which include a number of Federal agency actions; and
2. other Government and private sector actions not directly related to national security.

The latter category includes Federal agency actions to protect financial transactions. Attention often focuses on cryptography because it is central to many powerful safeguard techniques and because the course of technological development in cryptography-based safeguards has been so tightly meshed with Federal policies.

1 Administrative and technical safeguards, as well as organizational policies for information safeguards, are also important for safeguarding electronic information, as noted in ch. 4.
2 See, for example, "'Advice Most Needed . . .' The Assessment and Advice Effort," Deborah M. Claxton, DoD. Presented at the Ninth National Computer Security Conference, Gaithersburg, MD, Sept. 18, 1986.
3 Of 26 computer audit directors from Fortune 100 firms surveyed for OTA in October 1986, Ernst & Whinney found that 17 individuals (74 percent of the 23 answering this question) said that NSDD-145 had had "no" impact on their firms' safeguarding of unclassified information, four said NSDD-145 had had "very little" impact, and two said the directive had had "some" impact. Results are reported in OTA contractor report, "OTA Computer Security Survey," Ernst & Whinney, Nov. 7, 1986. Ernst & Whinney included many questions from the OTA survey in a survey it conducted at the Computer Security Institute Conference in November 1986. The raw data from this Ernst & Whinney survey indicated that, of 364 nongovernment respondents, 46% said that NSDD-145 had had "no" impact, 27% "very little" impact, 21% "some" impact, and 6% "great" impact (see table 9). Ernst & Whinney has permitted OTA to use the raw data from this survey.
4 OTA survey, October 1986, op. cit.
5 This data is from Ernst & Whinney's survey administered at the Computer Security Institute Conference on Nov. 17-20, 1986.
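The dual use of cryptography noted in this chapter, with the same techniques serving both confidentiality and message integrity, can be made concrete with a message authentication code appended to a funds-transfer message. The sketch below is an illustrative modern stand-in: it uses HMAC over SHA-256, primitives that postdate this report (banks of the era typically used DES-based MACs under ANSI X9.9), and the key and message text are invented.

```python
import hashlib
import hmac

# Invented key for illustration; real keys are distributed and managed
# under formal procedures, not embedded in software.
SHARED_KEY = b"illustrative-key"

def authenticate(message: bytes) -> bytes:
    """Sender appends a MAC so the receiver can verify origin and integrity."""
    tag = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    return message + tag

def verify(transmission: bytes) -> bytes:
    """Receiver recomputes the MAC; returns the message or raises on tampering."""
    message, tag = transmission[:-32], transmission[-32:]
    expected = hmac.new(SHARED_KEY, message, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("message failed authentication")
    return message

wire = authenticate(b"PAY 1000.00 TO ACCT 4471")
assert verify(wire) == b"PAY 1000.00 TO ACCT 4471"

tampered = wire.replace(b"1000.00", b"9000.00")
# verify(tampered) raises ValueError: the altered amount no longer matches the MAC
```

Note that the message itself travels in the clear; the MAC protects integrity and origin, not confidentiality, which matches the report's observation that commercial users often want authentication without secrecy.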

Ch. 5—Improving Information Security

NATIONAL SECURITY OBJECTIVES AND PROGRAMS

Background

Traditionally, national security objectives have guided the development and use of effective information security techniques. DoD has been responsible for safeguarding classified information transmitted, stored, or processed in communications and computer systems. Recently, through NSDD-145, DoD's authority has been expanded to include protecting systems containing certain unclassified information in civilian agencies and the private sector. (See ch. 6.) This includes Government and Government-derived economic, human, financial, technological, and law enforcement information, as well as personal or proprietary information provided to the Federal Government.

Federal Telecommunications Protection Programs

Most Federal agencies have adopted some policy to protect the security of the information they collect. Issues relating to the security of Federal information systems were examined in an earlier OTA report, Federal Government Information Technology: Management, Security, and Congressional Oversight.6 This section describes selected programs to protect information systems.7

Commercial Carrier Protection Program.—This program, begun prior to the issuance of Presidential Directive/National Security Council 24 (PD/NSC-24), involves the Nation's major telecommunications carriers. In late 1977, The New York Times, among other newspapers, reported that President Carter had approved a broad protection program that included routing nearly all Government telephone messages in three cities (Washington, D.C., New York, and San Francisco) through underground cable rather than over more vulnerable radio circuits.8 At the same time, research was accelerated to improve telephone security with the long-haul, terrestrial commercial carriers. As a result, entire radio channels are now protected between switching stations in the three cities. After the technology was developed to protect the microwave radio systems, the Government began to require protected service in civil and defense agencies' communications procurements. (See ch. 6 for a description of the evolution of these communications security programs.)

Currently, 450 microwave radio channels carrying more than 1 million voice and data circuits are protected. More than 1 million sensitive telephone calls are protected each day, and NSA expects that almost 2 million circuits will be protected in 1988. Although this program was prompted by defense concerns for safeguarding DoD contractor communications, defense and non-defense protection requirements were aggregated for efficient bulk or network-level protection.9

Secure Voice Programs.—As reported by The New York Times in late 1977, the Executive Secure Voice Network program was initiated to provide 100 selected Government executives and surveillance targets10 with a total of 250 secure voice terminals at a cost of $35,000 each. The equipment, intended to secure classified information up to Top Secret Compartmented, used narrowband, dial-up telephone lines. It had a mode for automatic keying based on secure distribution of the classified cryptographic key from a secure (electronic) key distribution center. NSA funded deployment of the network.11

A successor, the Secure Telephone Unit II (STU-II), was developed by NSA in the early 1980s for protecting classified information up to Top Secret Compartmented, depending on the classification of the cryptographic key. The STU-II program also implemented a secure key distribution center.12 STU-II phones, which cost about $12,000 each, operate over ordinary telephone circuits and could be purchased until December 1986. The General Services Administration (GSA) was made system manager to support the purchase, operation, and maintenance of more than 3,000 STU-II phones by civilian agencies, according to NSA.13

The new STU-III program was announced by NSA in March 1985, subsequent to NSDD-145. STU-III units will be produced for use by Federal agencies, Government contractors, and certain other private sector firms. NSA, which will manage the cryptographic keys, plans to produce 500,000 phones at $2,000 each. As of late 1986, orders for 49,640 units (to be delivered in late 1987) had been placed, with options for additional units. The average unit price was $3,827. As of January 1987, 37,116 of the initial orders were for defense agencies and 9,675 for nondefense agencies. About 200 STU-III phones had been ordered by Government contractors.14 The STU-III program is discussed in more detail later in this chapter.

6 OTA-CIT-297, February 1986. Chapter 4 of this report surveys the security of unclassified information systems within the Federal Government.
7 Part of this section is based on material taken from chapter IV of OTA contractor report, "Vulnerabilities of Public Telecommunications Systems to Unauthorized Access," Information Security, Inc., November 1986.
8 "Carter Approves Plan to Combat Phones by Other Nations," New York Times, Nov. 20, 1977, p. 34.
9 Harold E. Daniels, NSA S-0033, Feb. 12, 1987, p. 2 of Enclosure 3.
10 Information Security, Inc., "Vulnerabilities of Public Telecommunications Systems to Unauthorized Access," OTA contractor report, reference 12, November 1986.
11 NSA S-0033, op. cit., p. 2 of Enclosure 3.
12 In the STU-II program, key distribution for the civil agencies is handled by GSA Key Distribution Centers. GSA is the overall Government manager for the Federal Secure Telephone System (STU-II phones), serving some 65 to 70 agencies and managing their STU-II installations, maintenance, system management, and procurement. In the successor STU-III program, NSA will do all keying through the NSA Key Management Center. Source: Discussion between OTA staff and GSA Special Programs Division and Electronic Services Division staff, Oct. 8, 1986. The STU-III phones will be procured commercially; plans for maintenance and servicing have not yet been announced. Under the FSTS Systems Manager charter from NSA, GSA supports FSTS operations governmentwide, including operating the FSTS Key Distribution Centers (KDCs). It serves users in the defense and civil agencies, as well as some private consultants to the Government. Source: Harold E. Daniels, Jr., NSA S-0033-87, Feb. 12, 1987, p. 3 of Enclosure 3.
13 NSA continues to provide a portion of the cost to sustain GSA's systems manager responsibilities.
14 NSA S-0033-87, op. cit.
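The "secure (electronic) key distribution center" model described above can be sketched in miniature: a center shares a long-term key with each enrolled terminal and, for each call, generates a fresh session key that it delivers to each party wrapped under that party's long-term key. The sketch below is purely illustrative (the class and function names are hypothetical, and the HMAC-SHA256-based wrapping is a modern stand-in), not the classified mechanism actually fielded for STU-II.

```python
import hashlib
import hmac
import secrets

def kek_stream(master_key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a wrapping keystream from a terminal's long-term key and a fresh nonce."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(master_key, nonce + counter.to_bytes(4, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

class KeyDistributionCenter:
    """Holds one long-term key per enrolled terminal; hands out wrapped session keys."""

    def __init__(self):
        self.terminals = {}

    def enroll(self, terminal_id: str) -> bytes:
        # The long-term key would be delivered out of band, e.g., by courier.
        key = secrets.token_bytes(32)
        self.terminals[terminal_id] = key
        return key

    def issue_session_key(self, a: str, b: str) -> dict:
        """Generate one session key and wrap it separately for each of the two terminals."""
        session_key = secrets.token_bytes(32)
        packages = {}
        for tid in (a, b):
            nonce = secrets.token_bytes(16)
            stream = kek_stream(self.terminals[tid], nonce, len(session_key))
            packages[tid] = (nonce, xor_bytes(session_key, stream))
        return packages

def unwrap(long_term_key: bytes, nonce: bytes, wrapped: bytes) -> bytes:
    """A terminal recovers the session key using only its own long-term key."""
    return xor_bytes(wrapped, kek_stream(long_term_key, nonce, len(wrapped)))
```

Both terminals end up holding the same session key, yet the key never travels in the clear and each party needed only its own pre-placed long-term key.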

Government Procurements

GSA issued the first public competitive procurement for private line protected service between Washington, D.C. and San Francisco in 1980. This set the precedent for numerous subsequent procurements, particularly in having the carriers provide protection. A turnkey system was provided by RCA American with integrated protection for about a 5-percent cost premium over the unprotected service. The 5-year contract cost about $15 million to protect 312 circuits.

More recently, the Defense Communications Agency (DCA) awarded a major contract to AT&T for a nationwide, all-digital service called the Defense Commercial Telecommunications Network (DCTN). The 10-year, $1-billion program provides optional encrypted service among 161 locations, with link encryptors integrated into the carrier's earth stations. DCTN is designed to be flexible enough to allow for changes in technology and in customer requirements over the 10-year period. It also permits the use of video teleconferencing, switched voice, Autovon, and a wide range of data modes. DCA has also awarded a $100-million contract to Hawaii Telephone for a secure turnkey network called the OAHU Telephone System.

The largest program to date is for GSA's Federal Telecommunications Service-2000 (FTS-2000), a commercial communications service for Federal agencies.15 FTS-2000 will eventually replace GSA's current long-distance telephone system, which has some 1.3 million subscribers who total 1.5 billion call-minutes per year.

FTS-2000 differs from the current system in that it will procure telecommunications services rather than leased facilities. FTS-2000 includes contractor-provided security features. GSA expects to award a contract by late 1987, with services to begin in 1988 at an expected first-year cost of $350 million. FTS-2000 is intended to be compatible with the evolving all-digital systems, generally referred to as the Integrated Services Digital Network (ISDN).

In its request for proposal, GSA required four specific security features for FTS-2000.16 The system has to:
1. protect terrestrial radio systems in certain geographic areas and the communications links of any satellite system used to provide services;
2. provide protection from loss, degradation, or alteration by intrusion for the portion of those databases and information processing systems that are critical for continued reliable operation;
3. protect common channel signaling paths by NSA-endorsed encryption equipment or by other approved, nonencrypted forms of protection (e.g., fiber, cable); and
4. provide the capability to encrypt the command and control link of any spacecraft launched after June 17, 1990.

FTS-2000 is expected to significantly affect communications security in the private sector, according to National Security Agency officials. It is expected to stimulate the development of link encryptors, protected services, signaling channel protection, and command-and-control encryption for satellites, thereby making these features more readily available to the private sector and at lower prices.

Carrier Protection Services

Microwave radio systems began to be used to augment the existing AT&T cable infrastructure in the 1950s. By the 1960s they had become the dominant long-distance transmission medium. New companies providing communications services in the 1970s typically installed microwave circuits or used new communication satellite technology. In the 1980s, optic fiber has become the favored medium for new point-to-point circuits, while satellite is still preferred for many broadcast applications.

15 Information Security, Inc., OTA contractor report, "Vulnerabilities of Public Telecommunications Systems to Unauthorized Access," November 1986, and OTA staff discussions with GSA officials, August 1986.
16 Information Security, Inc., OTA contractor report, "Vulnerabilities of Public Telecommunications Systems to Unauthorized Access," November 1986, reference 19.
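For a sense of scale, the procurement figures cited above imply a rough unit cost. Using only the numbers given in the text for the 1980 RCA American contract:

```python
# Figures from the text: a 5-year, ~$15 million contract protecting 312 circuits.
contract_cost = 15_000_000  # dollars
years = 5
circuits = 312

per_circuit_per_year = contract_cost / (years * circuits)
print(round(per_circuit_per_year))  # about $9,615 per protected circuit per year
```

This back-of-the-envelope figure is an illustration only; the actual contract pricing structure is not detailed in the report.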


(See ch. 3 for a discussion of the vulnerabilities of these systems.)

The protected services offered by the communications common carriers stem in large part from Government efforts in the 1970s to develop and install safeguards for microwave circuits. Satellite carriers also developed various means of encrypting transmissions relayed by their geostationary satellites. These efforts were sparked by Government encryption requirements and, in one instance, by anticipated commercial demand. Several major carriers are developing various additional services, including protected private-line services, microwave and satellite link encryption, and all-fiber networks.18

At present, the interexchange carriers have announced no plans to directly protect the proposed Integrated Services Digital Network. Standards for this future network have not been decided. Nor has it been determined whether U.S. or European designs will be used.19

A large number of switch and PBX manufacturers are committed to providing ISDN-compatible interfaces to their customers. Users wishing to secure ISDN service can follow one of two strategies: demand protection from each carrier for the portion of the circuit provided by that carrier (link protection) or encrypt their own communications from end to end.17 End-to-end encryption would be under the user's control, with the encryption taking place in the user's PBX, in the carrier's Centrex service, or at the ISDN interface.

DoD Programs Under NSDD-145

DoD Outreach Programs.—According to National Security Decision Directive 145 (NSDD-145), the Secretary of Defense is the executive agent for telecommunications and information systems security, with the national manager being the Director of the National Security Agency (NSA), as discussed in chapter 6. Therefore, most programs initiated under NSDD-145 are under the auspices of the National Telecommunications and Information Systems Security Committee (NTISSC), which is chaired by an assistant secretary of defense. According to NSA, the approach being taken is to focus on the national interest in addressing information security, and to develop integrated and coordinated safeguards for classified and unclassified information rather than to segregate information security concerns into defense and civilian needs. By developing integrated standards for defense and civilian agencies and for private sector use, NSA hopes to lower the cost of safeguard products and, thereby, increase their use. OTA was unable to obtain an unclassified summary of all programs initiated by DoD under NSDD-145. The following summarizes selected DoD programs under NSDD-145 that affect civil agencies and the private sector. It is based on materials provided by NTISSC.

● Civil Agency Customer Support: A branch within the National Computer Security Center (NCSC) was organized in 1986 to provide services to civil agencies and departments, including:
—onsite security enhancement reviews to identify threats and vulnerabilities, and provide recommendations for improvements;
—technical consultations and/or one-time review visits (less detailed reviews);
—assistance in preparing proposals for trusted computer system procurements;
—assistance in drafting security policies; and
—briefings on computer security, NCSC, and other related topics.
● Trusted Computer System Training: NSA issued the Department of Defense Trusted Computer Systems Evaluation Criteria, also known as the "Orange Book," to all Federal agencies and departments in November 1985 for consideration as a national standard. To aid this review, NCSC presented briefings and tutorials to more than 70 Federal agencies.
● Special Assistant for Civil and Private Sector Programs: To fulfill its obligations under NSDD-145, NCSC, in the summer of 1986, created a senior-level position for a person to help define future directions and strategies for NCSC interactions with the civilian agencies and the private sector.
● Computer Security Training for Civil Agencies: NCSC has organized and is giving courses in computer security to Federal employees of the civilian agencies. The one-week courses are given twice a year and are open to all Federal agencies. Also, NCSC has initiated an annual computer security training seminar to allow computer security trainers throughout the Federal Government to exchange information on effective methods.

Data Encryption Standard (DES) Endorsement Program.—Launched by NSA in October 1982 (before NSDD-145), this program is designed to test and endorse equipment using DES to protect national security-related telecommunications in compliance with Federal Standard 1027.20 Under the program, vendors wishing to supply endorsed cryptographic products for unclassified use by Government agencies and contractors submit their DES components (electronic devices) to the National Bureau of Standards (NBS), which validates the component's correct implementation of the DES algorithm. NSA then determines whether the product meets all other Federal requirements for endorsement and certification.

By October 1986, 32 families of equipment (17 voice, 14 data, and 1 file encryptor), totaling some 400 models, had been formally endorsed.21 These products are available to protect unclassified Government information and all levels of sensitive private sector information.

NSA announced in 1986 that it would terminate the DES Endorsement Program in 1988 in favor of the Commercial Communications Security Endorsement Program (see below).22 According to NSA, the change was a result of several factors, including the fact that DES has been a widely applied public algorithm for 15 years and, as such, a worthwhile target for adversaries. Therefore, NSA considers it prudent for DES to be phased out over time.23

The announcement has led some users to infer that DES is now unsound and, reportedly, to delay adopting safeguards because of confusion over the longevity of DES and the roles of NSA and NBS in setting standards for cryptographic algorithms.24 In particular, the American Bankers Association, which says that the U.S. banking industry had already invested years of work and several million dollars in DES-based equipment, spent 16 months (from October 1985 to February 1987) educating NSA about their business needs. ABA spokesmen have said that, "Our industry has lost momentum in adopting improved security technology, and it remains to be seen if we can overcome the damage that has been done to the perceived security of DES-based techniques."25

Commercial Communications Security (COMSEC) Development Programs.—One of NSA's stated goals is to "make high-quality, low-cost cryptography available to qualified communications manufacturers for embedding in their products."26 According to NSA, "qualified" manufacturers of such products must meet four basic criteria.27 These are:
1. The firm must not be under foreign ownership, control, or influence, as prescribed by the Defense Investigative Service (DIS).
2. The firm must have or obtain a DIS facility clearance because the cryptographic design information is classified even though the resultant products are not.
3. The product host in which the firm proposes to embed cryptography must, in NSA's estimation, make obvious market sense.
4. The company must demonstrate that it can produce products that meet or exceed NSA's minimum standards of quality and reliability.

NSA has established two programs to achieve its goal: one to develop the host products and the other to develop the embeddable cryptographic modules. The first, called the Commercial Communications Security Endorsement Program (CCEP), is a "business method" partnership between NSA and U.S. firms to develop a variety of secure products, such as personal computers, radios, and local area networks. The approach pairs NSA's cryptographic expertise, as embodied in embeddable modules that implement secret NSA cryptographic algorithms, with vendors' investments to develop host products that incorporate the modules. According to NSA, the industry partner then sells a "value-added" product. As of November 1986, NSA had about 40 such partnerships arranged through memoranda of understanding.28 The first CCEP secure system was available in 1986.29

The second program is another joint NSA/industry venture called the Development Center for Embedded COMSEC Products (DCECP). Eleven large U.S. corporations—Harris, Motorola, RCA, Rockwell International, Hughes Aircraft, GTE, AT&T Technologies, IBM, Xerox, Intel, and Honeywell—have joined with NSA to produce modules for use in products to be developed for the commercial COMSEC program. According to NSA, these corporations were chosen based on their expertise in making selected telecommunication products. Each firm will manufacture one or more types of the NSA modules after NSA has evaluated and approved them. Each manufacturer may embed its modules within its own host equipment, a personal computer or a secure telephone, for example, and/or sell the modules to other "qualified" host equipment manufacturers. Commercial divisions in each corporation are assisting in the design and review of the standard modules to ensure that they can be used in a wide variety of commercial equipment.30

In addition to the list of endorsed DES products mentioned above, NSA also maintains lists of endorsed information security products and potential products. The information security products on these lists have been evaluated and endorsed by NSA as having met standards or requirements for use by the Government and its contractors to protect classified or unclassified, but sensitive information.

17 As of late 1986, DoD appeared to be favoring a link encryption strategy. Commercial users, who do not have control over the circuit infrastructure, may be more likely to choose end-to-end encryption.
18 Harold E. Daniels, Jr., NSA S-0040-87, Feb. 20, 1987, Enclosure A.
19 Letter from Donald C. Latham to OTA, NTISSC-089186, Nov. 7, 1986.
20 "Telecommunications: General Security Requirements for Equipment Using the Data Encryption Standard," Apr. 14, 1982.
21 Information Security, Inc., "Vulnerabilities of Public Telecommunications Systems to Unauthorized Access," OTA contractor report, November 1986, ref. 14.
22 According to NSA, DES products endorsed prior to Jan. 1, 1988 can be used indefinitely. Harold E. Daniels, NSA S-0033-87, Feb. 12, 1987, p. 4 of Enclosure 3.
23 Harold E. Daniels, NSA S-0033-87, Feb. 12, 1987, p. 4 of Enclosure 3.
24 Peter Hager, "NSA Plan to Replace DES Draws Criticism," Government Computer News, May 9, 1986. Cheryl W. Helsing, Testimony on Behalf of the American Bankers Association before the House Committee on Science, Space and Technology, Feb. 26, 1987.
25 Ibid., Cheryl W. Helsing.
26 NSA Press Release for Development Center for Embedded COMSEC Products, Jan. 10, 1986 (enclosure in letter from D. Latham to OTA, Nov. 7, 1986).
27 Letter from Harry Daniels to OTA, Feb. 12, 1987, p. 5 of Enclosure 3. According to NSA, these criteria are prudent and not overly burdensome to potential participants. However, the requirements for security clearances from the Defense Investigative Service might be seen as burdensome by some firms, especially smaller firms that do not ordinarily need them for their personnel.
28 "Commercial COMSEC Endorsement Program," enclosure in letter to OTA from Donald Latham, Nov. 7, 1986.
29 Information Security, Inc., "Vulnerabilities of Public Telecommunications Systems to Unauthorized Access," OTA contractor report, November 1986, p. 38.
30 Ibid., and NSA S-0033-87, p. 6 of Enclosure 3.
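The NBS validation step described above is essentially known-answer testing: a candidate implementation is fed published test inputs and must reproduce the published outputs bit for bit before endorsement can proceed. A minimal software sketch of the idea follows; since DES is not in the Python standard library, SHA-256 and its published NIST example values stand in for DES ciphertext vectors.

```python
import hashlib

# Published known-answer vectors: (input, expected output in hex).
# NIST's SHA-256 example values stand in here for DES test vectors.
KNOWN_ANSWERS = [
    (b"", "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"),
    (b"abc", "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"),
]

def validate(implementation) -> bool:
    """Pass the candidate only if it reproduces every published vector exactly."""
    return all(implementation(msg).hex() == expected for msg, expected in KNOWN_ANSWERS)

def candidate(msg: bytes) -> bytes:
    """A candidate implementation submitted for validation."""
    return hashlib.sha256(msg).digest()
```

An implementation with even a single bit wrong fails the whole suite, which is the point: the validator need not inspect the internals, only the answers.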
The endorsement certifies cryptographic systems as having met NSA security specifications for a specified level of security. Items on the potential list are under development. As of December 1, 1986, 14 firms and some 30 cryptographic products were on the endorsed list; about 30 firms and products were on the potential list.

Further, NSA lists computer systems, software, or components that have been evaluated according to DoD's evaluation criteria for trusted computer systems. NSA also lists companies that provide communications encryption services and equipment evaluated according to the National TEMPEST Standard (NACSIM 5100A).

Standard NSA Product Line of Cryptographic Modules.—The "modules" being developed under the DCECP are sets of integrated circuits or printed wiring boards incorporating these "chip sets." According to NSA, each module is a general-purpose cryptographic device for digital data. The standard modules are designed to be transparent to the user, with a flexible, microprocessor-compatible interface and control structure.31 The standard module approach is intended by NSA to foster development of interoperable secure systems, using well-defined interfaces and common design features throughout the family of standard modules.

In its announcement for the standard Type 1 product line intended for classified digital information, NSA noted such additional features as tamper resistance, electronic and/or over-the-air re-keying, and enhanced transmission-error detection. There are four Type 1 modules, for classified applications in three general bandwidths. There also will be three Type 2 modules, intended for unclassified, but sensitive applications.

Names, specifications, and applications of the Type 1 modules are as follows:
● WINDSTER: Data rate up to 200 kb/s; 9 cryptographic modes; suitable for hand-held radios, pocket pagers, and telephones. (Note: A lower performance module called INDICTOR is also available.)
● TEPACHE: Data rate up to 10 Mb/s; 6 cryptographic modes; suitable for minicomputers, modems, local area networks, and word processors.
● FORESEE: Data rate up to 20 Mb/s; 7 cryptographic modes; suitable for satellite links, microwave links, fiber optic links, and mainframe computers.

Type 2 modules, which will be available at an unspecified future date, have been given the names EDGE SHOT (same data rate as WINDSTER), BULLETPROOF (same data rate as TEPACHE), and BRUSHSTROKE (same data rate as FORESEE). Type 1 and Type 2 modules are intended to be interoperable within each bandwidth.32 NSA plans to key Type 1 modules through a secure key management system. It is not clear whether private firms that choose to use Type 2 modules will be able to control key generation independently of NSA.

NSA notes that the modules are designed to perform more system security functions than if they contained just a "naked" key generator chip, and to leave fewer security functions for the host vendor to add on. However, to accommodate a wider range of commercial host products, NSA has an alternative commercial Type 2 "naked" key generator chip available to potential host vendors. Type 2 modules will be made available to qualified firms that have a memorandum of understanding with NSA, to firms under contract with NSA or other Government agencies to develop a cryptographic product, to Government agencies doing cryptographic development, and to certain other firms approved on a case-by-case basis.33

Some users have expressed concerns that the embedded cryptography will not be readily compatible with their existing equipment and operations, and others note that the change is damaging to manufacturers of DES equipment. To ease the transition, NSA had offered to work with manufacturers of the Data Encryption Standard (DES) components and develop pin-for-pin replaceable circuits using the new NSA algorithms, so that equipment manufacturers' investments in product designs would not be lost. According to NSA, none of the DES component manufacturers expressed interest in this plan.34

STU-III Program.—NSA initiated the Secure Telephone Unit III (STU-III) program in 1984 to develop a new generation of secure telephone equipment using classified NSA algorithms (but not the standard modules being developed under the DCECP program). NSA intends that the STU-III program serve all Government agencies and private companies that require telephone security. NSA-sponsored studies have estimated a market for 2 million units, with DoD being the largest single buyer. Market studies by vendors also indicate potential sales of 1 million to 2 million units to the private sector, although these conclusions are admitted to be soft.35 According to NSA, the STU-III program will feature the capability for multilevel security, availability of Type 2 units to the private sector, and interoperability among all STU-III users. This will make the units attractive to a broad range of Government and private sector users.

The first production contracts were awarded in July 1986 to three vendors—AT&T, RCA, and Motorola. They are authorized to market their Type 2 product directly to the private sector. The 2-year, fixed-price contracts totaled about $190 million for 49,640 units. (See section above on Secure Voice Programs.)

NSA reports that the STU-III vendors consider the government-contractor and other segments of the private sector market to be "embryonic," in that customers have expressed interest but are waiting to see the product. Sample Type 2 units will be available in 1987, at which time vendors are expected to begin more active marketing efforts. According to NSA, Type 2 units could be delivered to private sector customers beginning in January 1988. The production contracts contain an add-on option allowing additional STU-IIIs (see above) to be produced at a reduced unit cost, in the $2,400 to $2,600 range.36

Almost all of the current order was for Type 1 units intended for classified uses, but 300 Type 2 units for unclassified, but sensitive information were also included in the initial contract. NSA will be the source of all cryptographic keys for the STU-III phones, including those purchased by private sector users. For the Type 2 phones, users will be able to establish their own internal procedures for key management, except key generation. Type 2 users within the Government will obtain their keys directly from NSA; private sector users will order keys from NSA via their STU-III vendors.37

The Secure Data Network System.—The Secure Data Network System (SDNS) project seeks to design an architecture for secure computer networks. The project will provide a security architecture design for networks that transmit digital data between computers. The project, certain aspects of which are currently classified, is sponsored by NSA and includes participation by NBS, the Defense Communications Agency, and about a dozen computer and communications vendors.

SDNS is intended to support both classified and unclassified applications. The system will provide confidentiality, data integrity, message authentication, and access control services. The services and standards for them are being designed to be compatible with those being developed by the International Organization for Standardization (ISO). Currently, the project is in the prototype development stage. Hardware is being developed and tested for performance, interoperability, and conformance with ISO standards.

31 "Off the Shelf Information Security Products: A Family of User-Friendly Modules for Embedding Within a Wide Range of Telecommunication Systems," NSA; enclosure to letter from D. Latham to OTA, Nov. 7, 1986.
32 Information on Type 1 and Type 2 modules was provided by NSA at a meeting of the IEEE Subcommittee on Privacy, June 18, 1986.
33 Harold E. Daniels, Jr., NSA S-0033-87, Feb. 12, 1987, pp. 8-9 of Enclosure 3.
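SDNS's algorithms are classified, so nothing of the real design can be shown. Purely to illustrate two of the services named above, confidentiality plus data integrity/message authentication, the sketch below encrypts under one key and then appends an authentication tag under a second key (encrypt-then-authenticate). All names, and the HMAC-based keystream, are illustrative assumptions, not SDNS mechanisms.

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Expand key and nonce into a pad (an illustrative stream construction, not a vetted cipher)."""
    out = b""
    block = 0
    while len(out) < length:
        out += hmac.new(key, nonce + block.to_bytes(4, "big"), hashlib.sha256).digest()
        block += 1
    return out[:length]

def seal(enc_key: bytes, mac_key: bytes, plaintext: bytes) -> bytes:
    """Confidentiality via keystream XOR, then integrity via an HMAC tag over nonce and ciphertext."""
    nonce = secrets.token_bytes(16)
    ct = bytes(p ^ s for p, s in zip(plaintext, keystream(enc_key, nonce, len(plaintext))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def open_sealed(enc_key: bytes, mac_key: bytes, frame: bytes) -> bytes:
    """Reject any altered frame before decrypting; return the plaintext otherwise."""
    nonce, ct, tag = frame[:16], frame[16:-32], frame[-32:]
    if not hmac.compare_digest(tag, hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("message failed authentication")
    return bytes(c ^ s for c, s in zip(ct, keystream(enc_key, nonce, len(ct))))
```

Checking the tag before decryption means a receiver detects alteration in transit rather than silently processing corrupted data, which is the distinction the SDNS service list draws between confidentiality and integrity.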

34 Harold E. Daniels, NSA S-0033-87, Feb. 12, 1987, p. 4 of Enclosure 3.
35 "STU-III Program Status," enclosure in letter from D. Latham to OTA, Nov. 7, 1986.
36 NSA response to OTA questions on STU-III: NSA S-0033-87, Enclosure 1, Feb. 12, 1987.
37 Ibid.

Encryption capabilities will be provided with two different NSA-supplied algorithms, both of which will remain classified. A Type 1 algorithm will be used for encrypting classified information and a Type 2 will be used for unclassified but sensitive information.

Raising Private Sector Awareness.—The Federal Communications Commission (FCC) is taking steps to alert the private sector to the vulnerabilities of communications systems. The FCC recently issued for NSA a public notice advising licensees and users that "the Nation's telecommunications systems, particularly those involving terrestrial microwave transmission media and satellites, are extremely vulnerable to unauthorized access."38 This notice, which also applies to telecommunications services or equipment that bypass public-switched services, encourages concerned users to seek assistance from NSA in "identifying approved devices for the protection of sensitive, but unclassified, national security-related communications (Government or nongovernment)."39

Implications of Merging Defense, Civilian Agency, and Private Sector Requirements

Advocates of combining security standards for unclassified information and guidelines for Government agencies with those for the private sector argue that aggregating markets will permit manufacturers to enjoy production economies and result in lower prices for safeguard products. Moreover, some feel that the current markets for computer and communications safeguards, particularly for trusted operating systems and cryptographic products, are "fragile." They argue that one coordinated set of Federal standards is needed to encourage and strengthen these markets. Critics of the present approach of National Security Agency (NSA) standards development and product certification see these as not fully responsive to current and evolving defense, civilian, and business needs.

There is some early evidence that NSA has already begun to encounter difficulty in satisfying the diverse needs of the private sector, beginning with the banking industry. (See ch. 6.) Moreover, NSA's controlling role may raise barriers to market entry by new vendors. At a more fundamental level, NSA's national security and signals intelligence interests in controlling encryption technology appear in tension with its new role in developing and disseminating safeguard technologies and products. (See below and ch. 7.)

Possible Barriers to Market Entry.—Only "qualified" manufacturers meeting the NSA criteria noted earlier will have access to NSA-designed and endorsed standard cryptographic modules. Moreover, there will be accountability requirements for all modules and, even though the hardware modules themselves will be both unclassified and tamperproof to prevent reverse engineering, NSA may place restrictions on their export. (See below.)

The embeddable modules are being produced by the 11 large electronics firms mentioned above, NSA's "industry partners." Because of the limited number of these firms and because they will most likely also produce host products incorporating the modules (for the Commercial Communications Security Endorsement Program), some prospective entrants into the host product market have expressed concern that competition in this potentially lucrative market will be essentially limited to firms already participating in the module program. Faced with the prospect of purchasing the embeddable modules from large, vertically integrated competitors, some prospective entrants fear that NSA's tight controls on its commercial programs will limit competition.

NSA, on the other hand, does not consider the qualification criteria particularly burdensome, but, rather, reasonable. For instance, NSA notes that there are over 13,000 Defense Investigative Service cleared facilities in the United States and that cryptographic design information is classified with access limited to U.S. entities in accordance with prudent overall security considerations. Similarly, NSA considers that decisions about the quality and market criteria will be fairly executed, with ample opportunity for vendors and potential vendors to present their cases. According to NSA, host vendor participation in the CCEP program has already exceeded participation in the DES Endorsement Program.40

As to competition in the host product market, NSA's stated intent is to make the Development Center for Embedded Communications Security Products (DCECP) modules competitively available to host manufacturers. All 11 of the DCECP module vendors have access to both Type 1 and Type 2 design documentation and, according to NSA, it is a vendor decision as to which module(s) to fabricate and produce. The Government owns the designs, and NSA has stated that, should a particular module not be chosen by any of the 11 manufacturers for fabrication and production, or should there not be competitive sources for a given module, then the agency will seek additional sources for the modules. NSA also notes that, in order to achieve scale economies, competitors may sell to each other—a practice that is common in the electronics industry.41

DoD Control of Encryption Technology.—NSA sees its signals intelligence mission to be at risk if effective cryptography were available worldwide. As a result, NSA faces tensions between its missions of encouraging domestic use of effective encryption and other safeguards while controlling the transfer of encryption technology overseas. Thus, its strategies to improve the availability of safeguards for use by U.S. nondefense Government agencies and businesses also include controls on the dissemination of such products and technical data, some of which have already begun to cause new tensions with the private sector.

Cryptographic hardware and software are controlled by bilateral agreements and by patent and export control legislation and regulations, including the Export Administration Regulations, the Invention Secrecy Act (35 U.S.C. 181 et seq.), and the International Traffic in Arms Regulations (ITAR),42 as discussed in chapter 6. All equipment and systems based on DES, including those for automatic data processing file security and message authentication for electronic fund transfers, are included on the ITAR Munitions List and fall under the jurisdiction of the Department of State's Office of Munitions Control (OMC). OMC licensing agreements are coordinated with NSA.43

The exportability of cryptographic safeguards is an important consideration for many businesses that have overseas correspondents or subsidiaries. Prominent among these is the banking industry, which has spent some years developing techniques and standards for transaction authentication and confidentiality. These are based on DES, which can be licensed for export and use abroad. When NSA announced its planned replacement of DES with secret (CCEP) algorithms, bankers and the American Bankers Association (ABA) became concerned that the CCEP algorithms and modules could not be used by the financial industry as a substitute for DES. For one thing, reliance on one or a few algorithms would be unacceptable for use in some foreign countries or banks, even if NSA would permit their use abroad. Also, according to the initial NSA announcement, the (Type 2) modules may not be used internationally or placed in equipment for use by non-U.S. entities.

Finally, the bankers found the prospect of NSA retaining control of the cryptographic keys to be an unacceptable transfer of bank responsibility to a Government agency. As of mid-1987, NSA and ABA were still discussing whether NSA would provide an acceptable exportable module for use overseas to authenticate financial transactions. In mid-February 1987, NSA and ABA reached agreement that NSA would continue to support the financial industry's use of DES-based technology until an acceptable replacement is available.44

NSA appears to be reconsidering the exportability issue for Type 2 modules. In February 1987, in response to a question from OTA, NSA officials stated that:

The NSA desires that host products employing Type 2 modules be usable by U.S. entities outside the U.S. For example, a U.S. firm oper-

In contrast, the DES standard as promulgated is public information, not limited to specific manufacturers and vendors, and provides more visibility into the algorithm itself. The fact that the algorithm was published made possible independent evaluations of its robustness, as well as (unvalidated) software implementations, thereby contributing to private sector capabilities in commercially useful cryptography.

On the other hand, NSA believes that assertions to the effect that current policies and the DCECP and CCEP programs limit competition and stifle private sector innovations and development are unsupported.

38 Federal Communications Commission, Security and Privacy of Telecommunications, Public Notice 6970, Sept. 17, 1986.
39 FCC Public Notice 6970, Sept. 17, 1986.
40 Harold E. Daniels, Jr., NSA S-0033-87, Feb. 12, 1987, pp. 8-9 of Enclosure 2; p. 5 of Enclosure 3.
41 Ibid., pp. X-10 of Enclosure 2.
42 Multilaterally agreed upon export controls are determined through an international coordinating committee (COCOM) whose membership includes representatives of the United States and 13 U.S. allies.
43 J. Smaldone, Office of Munitions Control, personal communication with OTA staff, Sept. 24, 1986.
According to ating in Europe should be able to purchase and NSA officials, the agency is actively encourag- use a Type 2 product, or a foreign subsidiary ing private sector innovation and the devel- should be able to use a Type 2 product as long opment of information safeguards for business as ownership was maintained by a U.S. entity. Use by foreign firms or individuals, when it needs. For example, NSA cites the CCEP pro- is in the U.S. interests for interoperability is gram, in which prospective host product ven- possible, depending on the country involved dors determine which products to produce and inter-country agreements. 45 based on their assessments of market needs. Moreover, part of the rationale for NSA’s The various NSA outreach and industry approach is to use interfirm competition to partnership activities seem tailored to the drive down the cost of information security agency’s dual missions of encouraging the use products like the STU-III phones. NSA and of safeguards while controlling the spread of the rest of DoD have been concerned that rela- cryptographic and cryptanalytic expertise. For tively high costs have limited their use within the former, NSA uses site visits, briefings, ex- DoD and elsewhere. The resulting small mar- changes of personnel and information, and ket was not attractive to producers. By mak- product evaluation and endorsement in addi- ing information security products more afford- tion to written standards and guidelines. For able, NSA hopes to increase their availability the latter, NSA makes cryptographic hardware and use. In achieving this, according to NSA, and interface specifications generally available “technological competitiveness is the goal in to host equipment vendors and users, without driving costs down versus cryptographic com- broadly transferring expertise in crypto- petitiveness which does nothing for cost and graphic design and cryptanalysts. 
For in- can have a deleterious effect on national stance, it is unclear whether even the 11 mod- security. ’46 ule manufacturers know all the cryptologic criteria used by NSA in developing the al- Technology Development and Dissemina- gorithms, although NSA gives them the de- tion.–After a number of DoD-sponsored sign information and expertise needed to man- studies and demonstration projects during the ufacture the hardware that implements the 1970s to address technical problems associated algorithms. with controlling the flow of classified and other information in multiuser computer systems, the DoD Computer Security Initiative was ~iCheryl W. Helsing, Testimony on Behalf of the American Banking Association before the House Committee on Science, Space, and Technology, Feb. 26, 1987. “Op. cit., Harold E. Daniels, Jr., NSA S-0033-87, p. 10 of En- IGHarold E, Daniels, Jr., NSA S-0040-87, Feb. 20, 1987, En- closure 2. closures D and E. Ch. 5—Improving Information Security ● 109 started in 1977. Concurrently, the National Bu- and B (mandatory protection), to the most com- reau of Standards (NBS) began to define the prehensive Division A (verified protection). construction, evaluation, and auditing of se- Each division represents an improvement in cure computer systems. As an outgrowth of the overall confidence that can be placed in the recommendations from a 1978 NBS workshop system to protect information. Within divi- paper on criteria for evaluating technical com- sions C and B, security classes such as Cl, C2 puter security effectiveness, and in support of or Bl, B2, and B3 correspond to progressively the DoD Computer Security Initiative, the stronger security features. MITRE Corp. began to develop a set of cri- teria for assessing the level of trust that could NSA produces a number of computer secu- be placed in a computer system to protect clas- rity documents ranging from trusted operating sified data. 
systems (the “Orange” and “Yellow Books’ to forthcoming criteria for trusted computer In 1981, the DoD Computer Security Evalu- networks and data bases.49 Some users ap- ation Center was established to continue the parently have reported difficulties in interpret- work started under the DoD Computer Secu- ing the Orange Book criteria at the higher pro- rity Initiative. The center, located within NSA, tection levels; as one response to this, NSA was renamed the National Computer Security has developed a rules-based expert system Center after its responsibilities were expanded available to guide users through the Yellow by National Security Decision Directive 145 Books. (NSDD-145). The Orange Book criteria have been adopted The National Computer Security Center as a DoD standard (DoD 5200.28 -STD, Decem- (NCSC) developed the “Orange Book” criteria ber 1985), and therefore these security require- for evaluating multilevel security in commer- ments must be included in specifications for cial computer systems. The original criteria new systems being developed by DoD. How- were published as the Department of Defense ever, the question of whether the Orange Book Trusted Computer System Evaluation Criteria criteria and evaluated products program will (CSC-STD-001-83, August 15, 1983). A deriva- best serve the unclassified, but sensitive in- tive but slightly different document was later formation security needs of civil agencies and published as DoD 5200.28-STD in December the private sector is being debated within the 1985. The Orange Book criteria evolved from computer-security community, especially out- the earlier NBS and MITRE work.” NCSC side NSA. (See the section below on differences has also released “Yellow Books” that help between military and commercial models of users apply the comprehensive Orange Book 4s security.) As of May 1987, the NCSC’s Evalu- criteria to specific computer facilities. 
ated Products List reported security class rat- The criteria specify four divisions, ranging ings according to the Orange Book criteria for from Division D (minimal protection) up 8 products, and about 20 more products were through Divisions C (discretionary protection) being evaluated.so

4TFrom information on the history of the Orange Book cri- teria contained in DoD 5200.28 -STD, which provides a more detailed history and rationale for the trusted computer system 4gPresentation by P. Gallagher of NSA, at an IEEE Subcom- evaluation criteria. mittee on Privacy meeting at George Washington University

‘“DoD Computer Security Center: “Computer Security Re- in Washington, D. C,, Nov. 13, 1986. quirements: Guidance for Applying the DoD Trusted Computer ‘(’Some information on the evaluated products program was System Evaluation Criteria in Specific Environments (CSC-STD- contained in a letter from Harold E. Daniels, Jr., NSA S-O033- 003-85 ),” June 25, 1985; and “Technical Rationale Behind CSC- 87, Feb. 12, 1987, p. 7 of Enclosure 2. See also: National Com- STD-003-85: Computer Security Requirements (CSC-STD-O04- puter Security Center, Evaluated Products L“st for Trusted Comp- 85), ” June 25, 1985. uter Systems, Dec. 1, 1986 (updated May 31, 1987). 110 ● Defending Secrets, Sharing Data: New Locks and Keys for Electronic Information — —
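The divisions and classes described above form a single ordered scale of increasing assurance, which is what allows an evaluated product to be matched against a procurement requirement. The short sketch below is a reader's aid, not part of the report or the evaluation criteria; the class names follow the text above, and the comparison helper is hypothetical.

```python
# Illustrative sketch: the Orange Book divisions and classes named above,
# ordered from the minimal Division D up to the most comprehensive
# Division A (verified protection).

RATINGS = ["D", "C1", "C2", "B1", "B2", "B3", "A"]  # lowest to highest

def meets_requirement(product_class: str, required_class: str) -> bool:
    """True if a product's evaluated class is at least the required class."""
    return RATINGS.index(product_class) >= RATINGS.index(required_class)

print(meets_requirement("B2", "C2"))  # a B2-evaluated system covers a C2 need
print(meets_requirement("C1", "B1"))  # a C1 system does not meet a B1 requirement
```

Because each division and class strictly subsumes the requirements below it, a single index comparison captures the "progressively stronger security features" the criteria describe.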

OBJECTIVES AND PROGRAMS UNRELATED TO NATIONAL SECURITY

Background

As part of this study, OTA surveyed the data and information security procedures, policies, and practices of large U.S. corporations. The survey also tried to determine the extent to which these firms are aware of Government-sponsored assistance and whether they have been affected by National Security Decision Directive 145.

The survey was self-administered at an October 1986 meeting of Palmer Associates, a group of computer audit directors of Fortune 100 companies. Questionnaires were completed by all 26 people present, a sample that is far too small to be representative of U.S. industry at large or for statistical generalizations.

Nevertheless, the results are of value for two major reasons. First, they illustrate the perceptions of some knowledgeable corporate leaders about security needs and practices. Second, the vast majority of the respondents were from nondefense companies (92 percent, with 42 percent from banking alone), while most of NSA's experience with the private sector has been with defense contractors. The survey results may shed some welcome light on the desirability and feasibility of NSA's plans to meet aggregated users' needs with one set of standards, guidelines, and technologies, and can provide a context for the section below on differences between military and commercial models of information security.

Also, the consulting firm of Ernst & Whinney included some of the same questions in a separate survey that was self-administered by attendees of the Computer Security Institute's 13th Annual Conference held in November 1986 in Atlanta, Georgia. A total of 562 completed questionnaires (a 12 percent response rate) were returned on site or by mail; 141 responses (25 percent) were from Government employees and the remainder came from a broad spectrum of business and industry. Of the respondents, another 18 percent were from manufacturing, 15 percent from financial services, 9 percent from insurance, and 8 percent from communications firms. Only 3 percent of the respondents identified themselves as from the defense industry. With Ernst & Whinney's permission, some of their survey data are used in this chapter, in addition to the OTA survey data.

Private Sector Motivations

Private industry and civilian agencies want information safeguards to:

● protect corporate proprietary or sensitive information from unauthorized disclosure or access and ensure the integrity of data and its processing;
● reduce losses from fraud and errors in electronic funds transfers and other financial transactions, limit associated increases in insurance premiums, and limit exposure to legal liabilities for preventable losses; and
● take advantage of new opportunities to reduce costs.

Box E provides several indicators of increased private sector interest in electronic safeguards.

Protection of valuable corporate electronic information from disclosure (confidentiality) is important to many firms, but this need is not necessarily a firm's major concern for information security. The OTA survey found that the 26 respondents placed roughly equal importance on integrity, confidentiality, and

Box E.—Indicators of Private Sector Interest in Safeguards

Even though many industry spokesmen consider the market for many advanced safeguards fragile and emerging, OTA has noted a number of indications of growing private sector interest in improved safeguards, including:

● Rapid growth in the number of computer-communications security conferences during recent years and in their attendance levels.–Attendance at the National Computer Security Conference, sponsored jointly by the National Security Agency (NSA) and the National Bureau of Standards (NBS), increased three-fold in the last 7 years, from about 350 in 1980 to more than 1,000 in 1986. Capacity constraints at NBS conference locations have forced sponsors to limit attendance. Some other conferences, such as the Institute of Electrical and Electronics Engineers (IEEE) Symposium on Security and Privacy, are also limited by space constraints. Attendance at the Computer Security Institute's annual Computer Security Conference/Exhibition doubled, from 600 to 1,200, between 1981 and 1985, and the American Society for Industrial Security Seminar and Exhibition has expanded to include computer security, biometrics, and access control. In addition, many new conferences and workshops given by security consultants and user groups have sprung up over the past 3 years. Among the latter are conferences and workshops for users of the Top Secret and RACF access control software packages. Other annual conferences include CRYPTO in the United States and EUROCRYPT, both sponsored by the International Association of Cryptologic Research.

● Increases in the level of sales of safeguard equipment and software.–According to market reports, installations of two of the most popular commercial access control software packages, ACF2 and Top Secret, have grown by more than a factor of 10 over the past 6 or so years.

● The rise in the number of computer and communications security consultants and in the number of organizations for security professionals.–The number of security consultants listed in directories has increased, and new professional groups are being formed, such as the Information Systems Security Association (ISSA). Consulting firms are expanding their information security practices and new services organizations are being established, such as the International Information Integrity Institute (SRI International).

● The increasing number of technical articles being published on topics related to computer and communications security.–OTA staff did a word search using the abstracts of articles published in the ABI/INFORM journal set, a collection of more than 650 U.S. and foreign business publications covering such areas as accounting, banking, data processing, economics, finance, insurance, and telecommunications. The 200-word abstracts for the years 1971, 1976, 1981, and 1985 were searched for 5 selected phrases (computer security, communications security, encryption, data integrity, and personal identification) in order to determine whether the relative frequencies of these had increased. OTA found that the number of abstracts including these phrases had grown in real as well as nominal terms; in particular, the phrase "computer security" occurred in only one out of 1,737 abstracts in 1971, but occurred in 268 out of 38,375 in 1985, a 10-fold increase in relative frequency (none of the other 4 phrases occurred in any of the 1,737 abstracts in 1971). The phrases "data integrity" and "encryption" occurred in only two and eight out of 14,356 abstracts, respectively, in 1976. By 1985, they occurred in 45 and 85 out of 38,375 abstracts, respectively, a three-fold and ten-fold increase in relative frequency. The phrases "personal identification" and "communications security" occurred infrequently and did not show significant increases in relative frequency.
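The relative-frequency arithmetic behind that last indicator can be checked directly from the counts quoted in the box. A minimal sketch, using only the "computer security" figures given above (the exact ratio comes out near 12, which the text reports as a round 10-fold increase):

```python
# Relative frequency = occurrences of the phrase / total abstracts searched
# that year, using the counts quoted in Box E for "computer security".

occurrences = {1971: (1, 1737), 1985: (268, 38375)}  # (hits, total abstracts)

freq = {year: hits / total for year, (hits, total) in occurrences.items()}
fold = freq[1985] / freq[1971]

print(f"1971: {freq[1971]:.2e}  1985: {freq[1985]:.2e}  "
      f"about {fold:.0f}-fold increase in relative frequency")
```

The same two-line calculation applies to the "data integrity" and "encryption" counts quoted above.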
reliability/continuity of service as components of their organization's information security, with integrity being rated slightly more important overall. The larger Ernst & Whinney survey found similar results, with both Government and nongovernment respondents rating integrity slightly higher than confidentiality and reliability/continuity. Interestingly, Government respondents rated confidentiality slightly higher than continuity of service, while the opposite was the case for nongovernment respondents.

Encryption or access control technologies can protect valuable proprietary information from disclosure, but they can also preserve its integrity and protect it from accidental or malicious modifications or deletions. This can be particularly important where large databases are a major revenue-producing asset. The regional Bell operating companies, for example, safeguard their on-line database for their Yellow Pages to preserve the integrity of the data and to prevent unauthorized use, not to prevent disclosure. In that sense, a recent news story reported that a disgruntled employee had attempted to rewrite parts of the 1988 edition of the Encyclopedia Britannica. The sabotage attempt failed, according to a company spokesman, because of safeguards that prevented unauthorized changes to the computer database.51

51 "Britannica Sabotage Thwarted," Washington Post, Sept. 6, 1986, p. D3.

Most of the OTA survey respondents and almost 90 percent of the Ernst & Whinney survey respondents judged information security as being of "fair" or "extreme" importance to their organizations. Of the Ernst & Whinney respondents, Government respondents assigned slightly more importance overall to information security than did the nongovernment respondents.

All the OTA survey respondents noted an increase in the importance of data and information security to their firms over the past 2 years. About one-third reported "significant information or data security problems" during the past 2 years, mostly in the form of unauthorized access and loss of integrity (in one case, engineering data was destroyed). In only one instance was loss of confidentiality cited, resulting in invalid competitive bids, which may be an indication of the difficulty of detecting some misuses, rather than their absence. Only 2 percent of the information handled by these firms is classified for reasons of national security, according to respondents to the OTA survey.

The majority of Ernst & Whinney survey respondents considered that the security risks faced by their organizations have increased over the past 5 years, and about one-third of the business and one-fourth of the government respondents considered that these risks were not adequately met. Half of the respondents reported financial losses as a result of security problems or downtime, mostly under $50,000, although a few losses were reported to be in excess of $1 million (note that this question included losses due to downtime, which the OTA survey did not include). About one-third of the respondents reported non-financial losses, mostly in the form of unauthorized access by employees and hackers. For Government respondents, about 31 percent of the information mix handled by their organizations was classified for purposes of national security, versus only 4 percent for nongovernment respondents.

Reducing EFT Fraud and Other Losses.–U.S. banks transferred some $167 trillion in 60 million separate transactions in 1984. The actual amount of wire transfer fraud experienced by banks is unknown. One estimate by the Bureau of Justice Statistics suggests aggregated electronic fund transfer (EFT) and automated teller machine (ATM) losses of $70 million to $100 million a year during the early 1980s, but a large fraction of this figure is due to ATM losses from fraud (by "con men," etc.) against the owners of the bank cards. Another Bureau of Justice Statistics report examined some 139 problem wire transfers. It found an average potential loss per transaction of $800,000, although some potential losses were significantly larger.52

52 See U.S. Congress, Office of Technology Assessment, "Ch. 5, Computer Crime," Federal Government Information Technology: Management, Security, and Oversight, OTA-CIT-297 (Washington, DC: U.S. Government Printing Office, February 1986), for an overview of the scope of computer-related crime and losses from electronic fund transfers and automated data processing.

Similarly, an American Bar Association (ABA) survey of private and public sector organizations found that one-quarter (72) of those responding reported "known and verifiable" losses due to computer crime in the last 12 months. Losses reported by respondents overall ranged from a few thousand dollars to more than $100 million. Most losses reported by the (anonymous) respondents were less than $100,000.53

53 Report on Computer Crime, Task Force on Computer Crime, Section on Criminal Justice, ABA, 1984.

A large survey by the American Institute of Certified Public Accountants revealed that 2 percent (105) and 3 percent (40), respectively, of the banks and insurance companies surveyed had experienced at least 1 case of fraud related to electronic data processing (EDP). Most perpetrators were employees. More than 80 percent of the frauds involved amounts under $100,000.54

54 American Institute of Certified Public Accountants, EDP Fraud Review Task Force, Report on the Study of EDP-Related Fraud in the Banking and Insurance Industries, 1984.

The Department of Justice Bureau of Justice Statistics (BJS) recently examined the scope of EFT fraud, based on extrapolations from a limited sample of 16 banks. The BJS study suggested annual losses nationwide in the $70-$100 million range for automatic teller machine fraud. Twelve of the banks reported 139 wire transfer fraud incidents within the preceding five years, with an average exposure to loss (before recovery efforts) of some $880,000 per loss and an average net loss (after recovery efforts) of about $19,000 per incident.55

55 Bureau of Justice Statistics Report NCJ-100461, Electronic Fund Transfer Systems Fraud, April 1986.

Whatever the actual amount of the losses, there is another indirect indicator that this is a serious problem: insurance premiums are rising for protection against fraud and other types of losses related to electronic transfers of funds.56 During the past year, financial institutions' motivations to safeguard value-bearing transactions (EFTs, letters of credit, and securities transfers) have been strengthened by actions of their insurers, some of which are raising premiums and/or requiring the use of message authentication methods approved by the American National Standards Institute (ANSI). As industry applies safeguards more widely and as the use of certified safeguards becomes more commonplace, expectations for responsible corporate behavior will be raised. A new standard, and perhaps a legal criterion, appears to be evolving for gauging responsible corporate behavior, or "due care," in businesses where firms are expected to provide reasonable safeguards for information whose loss could do significant harm.

56 The experience of a west coast bank illustrates the magnitude of the changes in coverage being offered by insurers for EFT loss claims. Until recently, the bank's insurance coverage cost about $1 million annually, and provided protection of up to $50 million per electronic transfer claim, with a $1 million deductible. The policy premium in mid-1986 rose to $5 million, with a $10 million deductible, and an upper limit of $100 million for total annual claims. [Source: OTA staff discussion with bank officials, May 1986.]

The wholesale banking industry is leading this trend, prompted by liability and "due care" considerations, by the recommendations of internal and external auditors, and by Treasury Department policies. Treasury has issued policy directives requiring all Federal electronic fund transactions to be authenticated by June 1988. Dated August 16, 1984, TD-81.80 specified that Federal EFT transactions be authenticated using measures based on DES and conforming to ANSI standards. This action is expected to have widespread effects throughout the banking industry because of the large number of systems and communications links that will use the system, and because some standards set by Treasury and the Federal Reserve System (which serves as the interface between Treasury and the wholesale banks) become de facto industry standards. As certified hardware for authentication becomes more widely used, economies of scale will lower prices for authentication hardware. As prices fall, additional end users are likely to adopt techniques and hardware to safeguard other business functions, creating a ripple effect throughout the private sector.

Thus, an early and important exception to the non-recertification of DES was made by NSA in the area of electronic fund transfers. Through a memorandum of understanding, the Treasury Department will certify commercial data security devices for securing fund transfers, with technical guidance and support from NSA and NBS. DES will remain the encryption algorithm for EFT transactions while authentication measures will be specified by ANSI standards adopted by the wholesale banking community.57 More recently, NSA agreed to support use of the DES for bank message authentication until an acceptable replacement became available. Widespread use of DES to authenticate electronic fund transfers will increase demand for DES-based hardware. That could lower its price and encourage its adoption for other applications in wholesale and retail banking and elsewhere. As an example of a retail banking application, the DES is used to encrypt customers' personal identification numbers in interbank automatic teller machine networks in the United States and Canada.58

57 Memorandum of Understanding #S52-99-84-018, Parts IV and V. This memorandum may be renewed in 1987, according to NBS staff.
58 Eddie Zeitler, Security Pacific National Bank, and Nancy Floyd, Citicorp/Quadstar, personal communications with OTA staff, Feb. 18, 1987.

A superseding Treasury Directive, TD-16.02 (dated October 2, 1986), extended the authentication requirement to securities transfers and stated that equipment designed and used to authenticate Federal EFTs must comply with Federal Standard 1027, which specifies security requirements to be satisfied in implementing DES (FIPS Pub. 46). Keying material used in DES authentication must be generated and processed in accordance with ANSI Standard X9.9. The broader requirement is expected to speed the dissemination of authentication techniques throughout the private sector.

A number of private financial institutions are taking aggressive steps to prevent certain types of misuse. Citibank, for instance, now has more than 4,000 encrypted links overseas. Similarly, the private Clearing House Interbank Payments System (CHIPS), whose $240 billion in daily settlements is second in size only to the Federal Reserve System, uses ANSI-approved standards to authenticate its transactions.59 Large U.S. banks have also been among the most active participants in the development of technical standards through ANSI (see below).

59 Authentication in CHIPS, New York Clearing House, Jan. 17, 1985. In 1986, CHIPS transactions amounted to $125 trillion, compared with $124.4 trillion in domestic transactions handled by the FEDWIRE system. The FEDWIRE system handles a greater volume of transactions than CHIPS, and has many more on-line correspondents (7,000 depository institutions compared to the 121 CHIPS member banks). [Source: Florence Young, Division of Federal Bank Operations, Federal Reserve System, personal communication with OTA staff, Feb. 13, 1987.]

Reducing Costs.–Companies can also reduce the costs of routine business transactions by conducting them via computer-to-computer communications that make use of cryptographic-based authentication techniques. These inter-organization transactions use standardized formats for the electronic interchange of business data between independently organized, owned, and/or operated computer and communication systems. This is accomplished by each corporate participant assembling its transaction data in predefined sequences, called "transaction sets."
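The message authentication discussed above works, in outline, by chaining block-cipher encryptions over the blocks of a message so that the final output becomes a short Message Authentication Code (MAC) that only a holder of the secret key can compute or verify. The sketch below shows only that chaining structure: the toy keyed function stands in for DES (which is not in the Python standard library), and the block size, zero-padding, and untruncated tag are simplifying assumptions of this illustration, not the ANSI X9.9 specification.

```python
import hashlib

BLOCK = 8  # bytes; DES operates on 64-bit blocks

def toy_cipher(key: bytes, block: bytes) -> bytes:
    # Stand-in for DES encryption of one block under a secret key.
    # NOT secure and NOT DES; it only gives the demo a keyed transformation.
    return hashlib.sha256(key + block).digest()[:BLOCK]

def mac(key: bytes, message: bytes) -> bytes:
    """CBC-MAC-style chaining: each block is XORed into the running state
    before encryption; the final state serves as the authentication tag."""
    padded = message + b"\x00" * (-len(message) % BLOCK)  # zero-pad to full blocks
    state = b"\x00" * BLOCK
    for i in range(0, len(padded), BLOCK):
        block = padded[i:i + BLOCK]
        state = toy_cipher(key, bytes(a ^ b for a, b in zip(state, block)))
    return state

key = b"examplek"                       # secret shared by sender and receiver
tag = mac(key, b"PAY $100 TO ACCT 42")  # sender appends this tag to the message

# The receiver recomputes the tag; any alteration of the message changes it.
assert mac(key, b"PAY $100 TO ACCT 42") == tag
assert mac(key, b"PAY $900 TO ACCT 42") != tag
```

In the banking systems described above, the block cipher is DES as specified in FIPS Pub. 46, key handling follows ANSI Standard X9.9, and the equipment meets Federal Standard 1027; the chaining idea, however, is the same.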

Several industry-specific interchange standards have previously been developed, including transaction sets for air, motor, ocean, and rail transportation, as well as for public warehousing and for the grocery industry. Development of an American National Standard for electronic data interchange is under way, intended to replace the many paper and special-purpose business methods by 1988. One of the long-term goals of this standard is the realization of paperless trade transactions and transportation arrangements. Standards for this purpose are being developed by the ANSI X12 Committee, which was chartered in 1978. The first set of X12 standards for electronic business data interchange was approved by ANSI in 1983, and more were published in 1986. The national standards are intended to be broad enough to encompass all forms of business transactions amenable to standardization, including inter-industry transactions. The electronic transactions, referred to as Electronic Data Interchange (EDI) or Electronic Business Data Interchange (EBDI), are intended to reduce business costs by speeding up the purchase order cycle, reducing the inventory buffers firms must carry, and streamlining cash flow. Dozens of common transactions will be integrated using these standards, including purchase orders, invoices, shipping notices, check payment vouchers, requests for quotations, and marketing information.60 These transactions amount to billions of dollars annually. An estimated $38 million worth of them were handled electronically in 1985; by 1990, electronic business transactions are expected to amount to more than $1 billion.61 These standards, or some compatible form of them, may also be adopted worldwide, thereby facilitating international transactions in different currencies. For this reason, any standards required for business data interchange will have to be eligible for use in other countries. The current ANSI authentication standard, based on the DES, is exportable, but its replacement may not be.

The original focus area for electronic data interchange was transportation, beginning in 1968.62 The Transportation Data Coordinating Committee (TDCC) worked with representatives of the rail, motor, ocean, and air transport industries to develop EDI transaction sets for these modes. The first successful data interchange transmission occurred with railway bills, in 1975. Around the same time, TDCC organized a group of computer and communications experts to develop specific business applications of this type of electronic transaction. Among the outcomes of this group's activity were the development of purchase order and invoice transaction sets and movement toward generic transaction sets for industry.

In the early 1970s, large corporations such as Sears, JC Penney, and K-Mart had started transmitting purchase orders electronically, with specialized formats. This was feasible in part because these retailers were often their suppliers' sole or largest customer. However, benefits due to improved transaction accuracy and timeliness accrued to both parties, increasing interest in electronic transactions.

Movement toward further development of generic transaction sets was formalized in 1978, when the ANSI X12 Committee was formed. TDCC and the Credit Research Foundation provided technical support to the new committee, and TDCC is the current X12 Secretariat. In 1979, the grocery industry began its industry-specific Uniform Communication Standard (UCS), which is compatible with the EDI architecture developed by TDCC for the transportation standards.
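The transaction sets described above replace paper documents with structured, delimited records. The sketch below is a rough illustration in the general flavor of X12 syntax; the segment tags (ST, BEG, PO1, SE) echo real X12 usage, but the layout and values are simplified inventions for illustration, not a conforming X12 purchase order.

```python
# Illustrative sketch of how an EDI transaction set replaces a paper
# purchase order with delimited, machine-readable segments. The segment
# tags loosely follow the flavor of ANSI X12 (850 = purchase order), but
# the layout here is simplified and is NOT conforming X12.

def build_purchase_order(po_number, line_items):
    """Serialize a purchase order as delimited segments joined by '~'."""
    segments = [
        ["ST", "850", "0001"],          # transaction set header
        ["BEG", "00", "NE", po_number], # beginning segment: original order
    ]
    for qty, unit, price, part in line_items:
        segments.append(["PO1", str(qty), unit, f"{price:.2f}", part])
    # SE trailer carries a segment count so the receiver can check completeness
    segments.append(["SE", str(len(segments) + 1), "0001"])
    return "~".join("*".join(seg) for seg in segments)

order = build_purchase_order("PO-1987-042", [(12, "EA", 9.95, "WIDGET-7")])
print(order)
```

The trailer's segment count is a simple completeness check: a receiver that counts the segments it parsed and compares the total against the SE value can detect a truncated transmission.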
Subsequently, standards for public warehousing applications (Warehouse Information Network Standards, or WINS) were developed, also compatible with the EDI architecture. These standards include various security features.

The ANSI X12 Committee is developing generic standards for electronic business data interchange. In November 1986, industry representatives agreed on a common data dictionary for the ANSI X12 standards, the WINS and UCS standards, and the TDCC EDI standard.63 The ANSI X12 Security Structures Taskgroup is developing transaction security standards under the auspices of the X12 Finance Project Team, and the X12 Committee has joined with the ANSI X9 Committee to deal with encryption and encryption-related business requirements. According to the X12 Secretariat, the latter include: electronic signatures ("telex signature"); data integrity "hash controls" (digests); message authentication and sender verification; confidentiality of business data; error detection; end-to-end security; and protection against replay, spoofing, modification, or impersonation.

Benefits from electronic transactions are expected to be substantial for diverse user groups, and some are already being realized. In 1980, a report prepared for the American Grocery Industry projected $300 million in profits for the industry as a result of implementing standardized electronic transactions. The grocers' UCS standards were completed in 1981, and the resulting industry gains have reportedly exceeded the projections.64

The Automotive Industry Action Group, composed of the major U.S. automobile manufacturers and about 300 of their largest suppliers, began their movement toward standardization of electronic business transactions in 1981. According to some estimates, General Motors and Ford expect to realize a $200-per-car savings, or some $1 billion a year, on a typical production volume of 5 million cars per year, through use of electronic business data interchange.65 Caterpillar Tractor Co. has instituted an electronic transaction system linking some 400 sites.66

Because of the automobile industry's large number of suppliers, contractors, and distributors, their use of the new data interchange standards is expected to accelerate the spread of these standards to other industries. These include metals, plastics, and rubber, as well as chemicals, transportation, electronics, aerospace, banking, and retail sales.67 The movement toward electronic business transactions is giving rise to new, network-based "electronic clearinghouses," with market entrants such as IBM, GTE Telenet, GEISCO, Tymshare, and GM's Electronic Data Systems.68

Potential savings to the Federal Government from electronic purchasing alone have been estimated to be $20 billion/year or more.69 The DoD, for instance, has begun to use electronic data interchange to reduce the time required to get supplies to overseas commissaries, and expects to shorten the 75-day purchase cycle by 5 or 6 days immediately, thereby reducing inventory requirements. Other commissary and procurement paperwork-reduction projects have been under way within DoD for a few years.70

60. See: Jack Shaw, "Electronic Business Data Interchange: A New Strategy for Transacting Business," MSA Update, Management Science America, Inc., March/April 1985; "Detroit Tries to Level a Mountain of Paperwork," Business Week, Aug. 26, 1985, pp. 94-96.
61. Management Information Systems Week, Jan. 20, 1986. Estimates provided by Jack Shaw at the ANSI ASC X12 meeting on June 9, 1986 are over $3 billion by 1990.
62. Information on the evolution of electronic data interchange standards was provided by Paul Lemme, Transportation Data Coordinating Committee, ANSI X12 Secretariat. Personal communication with OTA staff, December 1986.
63. Elizabeth Horwitt, "Move to EDI Gathers Steam as Standards Clear, Benefits Grow," ComputerWorld, Dec. 15, 1986, p. 5.
64. Paul Lemme, TDCC. Personal communication with OTA staff, December 1986.
65. Ford's estimate is from "GEISCO Plans To Move Rockville Jobs in Bid to Get Edge in Global Markets," Washington Post, Sept. 29, 1986, Business Section, p. 4. The cost savings for GM is from a presentation by Jack Shaw at the ANSI X12 ASC meeting, June 9, 1986. This estimate does not include other potential savings from EDI facilitating just-in-time manufacturing with reduced supply inventories. Shaw also reported that implementation of EDI enabled one large Eastern railroad to halve its purchasing data processing staff and is expected to cut another railroad's purchase order lead time from 10 days to 3.
66. Irwin Greenstein, "Caterpillar Erects Paperless Network," MIS Week, Jan. 20, 1986.
67. Business Week, op. cit., Aug. 26, 1985.
68. "GEISCO plans . . . ," Washington Post, op. cit., Sept. 29, 1986.
69. Jack Shaw, ANSI X12 meeting on June 9, 1986.
70. Brad Bass, "Moving Data Electronically Expedites Supply Delivery," Government Computer News, Jan. 30, 1987, p. 22.

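Among the security requirements listed above are "hash controls" (digests). Appending a digest of the message lets the receiver detect whether any byte changed in transit. A minimal sketch, using a modern hash function from Python's standard library as a stand-in for whatever algorithm a given interchange agreement would actually specify:

```python
# Minimal sketch of a "hash control" (digest) on a business transaction:
# the sender appends a digest of the message, and the receiver recomputes
# it, rejecting the message if any byte changed in transit. SHA-256 from
# the standard library stands in for an agreement-specified algorithm.
import hashlib

def append_digest(message: bytes) -> bytes:
    return message + b"|" + hashlib.sha256(message).hexdigest().encode()

def verify_digest(wire: bytes) -> bool:
    message, _, digest = wire.rpartition(b"|")
    return hashlib.sha256(message).hexdigest().encode() == digest

wire = append_digest(b"PAY*INVOICE-114*USD*1050.00")
assert verify_digest(wire)                      # unaltered message verifies
tampered = wire.replace(b"1050.00", b"9050.00")
assert not verify_digest(tampered)              # altered amount is detected
```

A bare digest only detects accidental alteration: an adversary who can modify the message can recompute the digest as well. Detecting deliberate modification requires a keyed mechanism such as message authentication, which also appears on the X12 requirements list.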
Linkages in and Contrasts Between Defense-Intelligence and Other Needs

Some Linkages Between Private Sector Activities and Federal Policy.—Private industry's and civilian Government agencies' interests in safeguarding their computer and communications information are becoming intertwined with Government policies, even though these interests are increasingly independent of national security. The linkages between private users and the Government, and between the civil agencies and NSA, tend to blur this independence. These linkages are especially influential where NSA's technical expertise or Government certification is important, or where Government agencies, as major purchasers, tend to drive commercial equipment designs.

Although NSA's technical knowledge in high-quality cryptography and cryptanalysis is acknowledged to be the cornerstone of U.S. capabilities, very little of it is unclassified. Because of this, private users depend on NSA's willingness to provide information and advice, which currently takes place, in part, in the form of NSA-certified commercial products.

Understandably, private sector users place a high value on certified, validated, and standardized safeguard products. This dependence has required considerable involvement by NBS and NSA in the absence of private sector institutions fully competent to independently develop standards and certification processes. However, NSA's plans to replace DES in 1988 with hardware modules that use secret algorithms will tend to deepen and perpetuate private sector users' dependency on NSA expertise as long as these users have no independent alternative for developing a certified, non-secret, and exportable successor to DES.

Government agencies represent a large market for some information security products; therefore, their choice of standards has a significant influence on manufacturers. According to estimates from a study conducted by the Electronic Industries Association (EIA) in cooperation with NSA, Federal and private sector budgets for information security totaled some $3 billion, split evenly between communications security and computer security.71

Other important linkages between Government policies and the private sector, and between defense and civilian agencies, are in the areas of security awareness, education, and assistance. During the past few years, there has been mounting confusion concerning the distinction between the roles of NBS and NSA in these areas. In addition to its Federal standards development, NBS, under its authority in the Brooks Act, as amended, participates in the voluntary activities of standards organizations and works with the private sector and civilian agencies to develop computer and computer network safeguard techniques, including security components for the open system interconnection (OSI) architecture. However, NSA, under the auspices of NSDD-145, has expanded its relationships with civil agencies, providing threat assessments, awareness briefings, and advice in selecting cost-effective and appropriate safeguards. NSA reports that it has provided assistance to 36 different civil agencies and departments, plus the U.S. Senate, for diverse application areas including trade and finance, drug interdiction, law enforcement, health, agriculture, immigration, and aviation and national security,72 as well as to Government contractors and other firms.73

71. Of the $1.5 billion budgeted for communications security in 1986, 66 percent was budgeted by DoD, about 7 percent by other Federal agencies, and 27 percent by the private sector (including defense firms). Of the $1.5 billion for computer security, however, DoD and other Federal agencies only accounted for 13 and 11 percent, respectively, while the private sector accounted for about 75 percent. Electronic Industries Association, "COMSEC and COMPUSEC Market Study," Jan. 14, 1987.
72. Agencies and departments that have been assisted by NSA include the United States Trade Representative, International Trade Administration, Securities and Exchange Commission, Federal Reserve Board, Department of Labor, National Narcotics Border Interdiction System, Immigration and Naturalization Service, Drug Enforcement Administration, Center for Disease Control, National Institutes of Health, Department of Agriculture, and Federal Aviation Administration. Harold E. Daniels, Jr., NSA S-0040-87, Feb. 20, 1987, Attachment 2 to Enclosure D.
73. Ibid., Attachment 1 to Enclosure D.

Vendors of safeguard technologies and private-sector defense contractors are also closely linked to Federal information security policies and programs, such as NSDD-145. Because the new, NSA-certified encryption modules are expected to have a large, stable market among Federal agencies, vendors are unlikely to attempt development of riskier, uncertified, encryption-based safeguards. Private sector users, therefore, may be faced with limited new options if the supply of encryption-based safeguards is determined by "technology push" (from NSA) rather than "demand pull" (from unconstrained market forces).

Emerging Differences.—What is open to question is the extent to which the concerns, priorities, and needs of the defense- and national security-oriented user communities are generalizable to civilian agencies and the bulk of the private sector.

One interesting set of findings from the OTA and Ernst & Whinney surveys,74 mentioned earlier, is based on the respondents' perceptions of who their organizations' adversaries are, and illustrates an important difference between perceived Government and private sector information security needs: who the most significant adversaries are, and what level of resources they possess. Table 7 summarizes responses to a question in each survey that asked respondents to rank categories of adversaries according to how relatively important it is to protect their organizations' significant (unclassified) "company confidential" or proprietary information from them. For example, the group of 26 nongovernment individuals surveyed for OTA, predominantly nondefense Fortune 100 executives, rated foreign governments as their least important adversary.75

Table 7.—Overall Ranking of Importance as an Adversary (Highest = 7)

OTA survey responses (a):
  Category of adversary             Mean ranking    Fraction ranking #1 or #2
  Your competition                      6.7               92%
  Some of your internal employees       4.8               31%
  Foreign governments                   3.1                4%
  Your suppliers                        4.1               15%
  Your customers                        4.9               27%
  Public interest groups                4.0               19%
(a) All respondents were non-government.

Ernst & Whinney survey, non-government responses (b):
  Category of adversary             Mean ranking    Fraction ranking #1 or #2
  Your competition                      6.5               89%
  Some of your internal employees       4.9               43%
  Foreign governments                   3.9               30%
  Your suppliers                        4.1               11%
  Your customers                        4.7               35%
  Public interest groups                3.9               15%
(b) Between 200 and 3-- out of a total of 421 non-government respondents ranked each category of adversary; the rest did not rank that category.

Ernst & Whinney survey, government responses (c):
  Category of adversary             Mean ranking    Fraction ranking #1 or #2
  Your competition                      4.1               35%
  Some of your internal employees       5.3               53%
  Foreign governments                   6.1               74%
  Your suppliers                        4.3               24%
  Your customers                        4.5               34%
  Public interest groups                5.0               48%
(c) Between 26 and 49 out of a total of 141 government respondents ranked each category of adversary; the remainder did not rank that category.

Similarly, the larger sample of non-Government respondents surveyed by Ernst & Whinney ranked foreign government adversaries lowest overall. Instead, both non-Government groups considered their competition the most important single adversary, followed by customers and some internal employees, and then by suppliers, public interest groups, and foreign governments. The Government respondents surveyed by Ernst & Whinney considered foreign governments (perhaps analogous to "your competition" for businesses) to be the most important adversary, followed by some internal employees, public interest groups, and the other categories. An important difference between business competitors and foreign government adversaries is, obviously, the level of resources that each type could deploy to gain access to information.

The Electronic Industries Association market study mentioned earlier also found "widely different perceptions of the threat to information systems and this results in different and often conflicting and competing security requirements . . ." The study notes a national security perspective that focuses on external threats, while others perceive internal sources as the principal threat.76 It also notes that businesses and civilian agencies attached considerable importance to the cost of safeguards and their effect on operations.

Other differences (and similarities) between current Government and private sector information security priorities are suggested by a survey question asking respondents to list their organizations' "top-priority" computer-security and information-security concerns. These responses are summarized in table 8. Although the same types of concerns are mentioned by Government and private sector respondents, their relative priorities are different.

An important effect of these perceptions and priorities is on users' decisions concerning the use and choice of safeguards.

Another interesting finding from both the OTA and Ernst & Whinney surveys was the relatively low level of perceived impact (as of Fall 1986) from NSDD-145 on non-Government organizations' safeguarding of unclassified information. Table 9 summarizes responses to a survey question about the impacts of NSDD-145. Almost three-quarters of the respondents (all non-Government) to the OTA survey and almost half of the non-Government Ernst & Whinney survey respondents felt that NSDD-145 had had no impact on their organizations' safeguarding of unclassified information. Moreover, fewer than 10 percent of the respondents to the OTA survey and fewer than 30 percent of the non-Government respondents to the Ernst & Whinney survey considered that the directive had impacted their firms' security practices for unclassified information "somewhat" or "greatly."77 By contrast, the Government respondents in the Ernst & Whinney survey reported much higher levels of impact overall, with only one-quarter reporting no impact from NSDD-145 on unclassified information security and almost 60 percent reporting that the directive had impacted their organizations' unclassified information security at least somewhat.

More than two-thirds of both the OTA survey respondents and the non-Government respondents to the Ernst & Whinney survey felt that their firms' information and data security measures were at least fairly adequate to meet their needs. What is somewhat surprising is the relatively low percentage of these firms' total information and computer security expertise attributed to Government-sponsored assistance programs, conferences, and training programs. Only 2 of the 26 OTA survey respondents indicated that even a small percentage of their firms' information and data security expertise came directly from Government assistance programs. This low percentage is likely due to the composition of the Palmer Associates group surveyed, and is in marked contrast to what one might expect

74. The OTA computer security survey was conducted in October 1986, at a meeting of Palmer Associates. The 26 respondents to the questionnaire were data processing audit vice-presidents and data processing audit directors of Fortune 500 companies. Ernst & Whinney, "OTA Computer Security Survey," OTA contractor report, Nov. 7, 1986. Ernst & Whinney conducted a separate survey in November 1986 at the 13th annual conference of the Computer Security Institute. About 500 attendees responded to this self-administered survey, most of whom had responsibility for computer security functions. The data were made available to OTA in February 1987.
75. One of the OTA survey respondents noted that his firm was most concerned with protecting information from foreign governments; another was concerned with protecting confidential customer information from the U.S. Government.
76. Electronic Industries Association, "COMSEC and COMPUSEC Market Study," Jan. 14, 1987. This study was based on 75 interviews, 64 of which were with Federal agencies, including 39 having defense and intelligence missions.
77. Two firms in the OTA survey indicated that they had implemented encryption or scrambling to protect sensitive communications in response to NSDD-145, and one of these firms also implemented access control software and passwords, and acquired special communications channels.

from an alternative group composed of defense contractors, computer firms, or firms producing security products for the Government market. In fact, only two of the respondents to the OTA survey indicated awareness of any specific Government-sponsored information and assistance programs. Of the 22 individuals responding to a question concerning their perceptions of the helpfulness of Government guidelines, 17 answered "not at all," while 5 said these had been "somewhat" helpful to their organizations. The respondents who did find Government guidelines helpful cited the NBS Federal Information Processing Standards (FIPS), including DES, as well as guidelines for protecting privacy-related and classified information.

Table 8.—Top-Priority Computer and Information Security Concerns Mentioned by Respondents

OTA survey group (non-government):
  Data security/data integrity; network security; contingency planning; training; quality security throughout firm; telecommunications links; internal hacking.
Ernst & Whinney non-government group:
  Network security; data/information classification and security; micro/PC security; dial-up security/communications.
Ernst & Whinney government group:
  Contingency planning/disaster recovery; data/information classification and security; network security; micro/PC security.

SOURCE: Data from surveys conducted by Ernst & Whinney in October and November 1986.

Table 9.—Perceived Impacts From NSDD-145 (Fall 1986)

Question: "On September 17, 1984, President Reagan signed National Security Decision Directive 145 (NSDD-145), the National Policy on Telecommunications and Automated Information Systems Security. This policy has led to much more active involvement by the National Security Agency and the National Computer Security Center in providing advice to business and industry. How has NSDD-145 impacted your organization in safeguarding information that is not classified for purposes of national security?"

Survey responses:
               OTA survey total        Ernst & Whinney     Ernst & Whinney     Ernst & Whinney
               (all non-government)    total responses     non-government      government
  Response     (23)                    (486)               (364)               (122)
  Not at all       74%                     41%                 46%                 25%
  Very little      17                      25                  27                  18
  Somewhat          9                      24                  21                  33
  Greatly           0                      11                   6                  24

SOURCE: Data from surveys conducted by Ernst & Whinney in October and November 1986.

Differences Between Military and Civilian Computer Security Models.—The debate about how well NSA's Orange Book computer security standards and evaluated products program will serve the needs of civilian agencies and private businesses is receiving increased attention within the computer security community. One of the most crucial aspects of the debate concerns the security policy underlying the Orange Book criteria, the mechanisms needed to enforce security policy, and how well these match the security policies (and associated mechanisms) that are common in commercial practice. According to computer security experts at NSA, for example, the National Computer Security Center (NCSC) has worked, and continues to work, "hand in glove" with the civilian agencies to understand their needs and provide appropriate computer security solutions78 and, moreover, products that have been evaluated by NSA and that have received

78. Harold E. Daniels, Jr., NSA S-0033-87, Feb. 12, 1987, p. 7 of Enclosure 2.

B- and C-level ratings are being used in the private sector (some of these products, such as RACF, ACF2, and Top Secret, were developed well before the Orange Book was published but have been modified to meet Orange Book standards). Other experts disagree with this position, and argue that the security policy and mechanisms specified in the Orange Book do not meet important needs in commercial data processing.

Among the latter group are David D. Clark (MIT Laboratory for Computer Science) and David R. Wilson (Ernst & Whinney). In their paper, "A Comparison of Commercial and Military Computer Security Policies,"79 they present a security model based on commercial data processing practices and compare the mechanisms needed to enforce this model's rules with those needed to enforce the (lattice) model of security embodied in the NSA criteria. Other experts have also offered criticisms of the Orange Book's applicability to business needs. A brief summary of the Clark and Wilson paper, offered here as an example, points out some of the main lines of criticism.

According to Clark and Wilson, the "military" (NSA/DoD) security policy is really a set of policies designed to protect classified information from unauthorized disclosure or declassification. Mechanisms used to enforce this security policy include mandatory labeling of documents or data items, assignment of user access categories based on security clearances, generation of audit information, etc. The higher-level security policies include mandatory checks on all read and write transactions; these mandatory controls constrain the user so that any action taken is consistent with the security policy. In addition to these mandatory controls, discretionary controls can be used to further restrict data accessibility (e.g., "need to know" controls), but, say Clark and Wilson, these cannot increase the scope of security controls in a manner inconsistent with the underlying multi-level classification concept.

By contrast, Clark and Wilson assert that what underlies commercial data processing security practices is the prevention of fraud and error and, therefore, that a "commercial" security policy should address integrity rather than disclosure. Some of the mechanisms to enforce this type of policy are shared with the military model (for example, user authentication), while others are very different. Among these others, Clark and Wilson identify two principal mechanisms: the well-formed transaction (in which a user can manipulate or record data only in constrained ways that preserve or ensure the integrity of the data, analogous to a paper-and-ink accounts book in which correction entries, rather than erasures, are made); and separation of duty among employees (in which the user permitted to create or certify a well-formed transaction may not be permitted to execute it, analogous to double-entry bookkeeping in which a check for payment must be balanced against a matching entry in the accounts-payable column). Separation of duty is a fundamental principle of commercial data integrity control, and is considered effective except in the case of collusion among employees.

In their paper, Clark and Wilson conclude that the integrity mechanisms inherent in the commercial security model differ from the mandatory controls in the military (nondisclosure) security model in important ways, and that controls based on the military model are not sufficient to enforce the commercial (integrity) model. They then introduce a more formal model for data integrity in computer systems, based on the use of constrained data items and transformation procedures for enforcing an integrity policy. Comparing this model with other integrity models, Clark and Wilson argue that their model, unlike the Orange Book standard, is applicable to a wide range of integrity policies.

By early 1987, debate on the general applicability of the Orange Book criteria and development of alternative models of computer and information security had developed to the extent that plans were made for an invitational Workshop on Integrity Policy for Computer

79. David D. Clark and David R. Wilson, "A Comparison of Commercial and Military Computer Security Policies," Proceedings of the 1987 IEEE Symposium on Security and Privacy, Oakland, CA, Apr. 27-29, 1987.

Information Systems, organized by Ernst & Whinney and cosponsored by the Institute of Electrical and Electronics Engineers, the Association for Computing Machinery, NBS, and NSA's National Computer Security Center (NCSC), to address military versus commercial security policy issues. The workshop is scheduled to be held in late 1987.80

Civilian Agency Actions.—In addition to the NBS activities described earlier, related to DES, FIPS publications, and voluntary standards development, there are other civilian agency activities related to safeguarding electronic information. (An earlier OTA report surveys civilian agency programs for computer security.81) The Treasury Department, for example, requires the use of safeguards for information systems that handle sensitive, as well as classified, information.82 All Federal electronic fund and securities transfer systems must also have safeguards in place by June 1988. The requirement applies to all Federal agencies (except DoD, which has its own policy) and to wholesale banks that do business with Treasury and use the Federal Reserve System as the interface.83 Treasury Department Order TO 106-09 requires that authentication measures conform to the American National Standards Institute (ANSI) X9.9 standard "or equivalent authentication technique."84 According to Department of the Treasury officials, the DES "is and will remain fundamental to the Department's security strategy for the foreseeable future."85 Treasury has announced that technology to secure Federal electronic fund transfers (EFTs) must be compatible with systems used by the Federal Reserve System and the commercial banking community. Specifically:

● Treasury will continue to support and implement ANSI financial standards as the common method for securing Federal EFTs, and will transition from the current (DES-based) ANSI standards to any new ANSI standard (not based on DES) only if the transition is based on "sound business decisions and security needs."
● Treasury will rely on NSA's commitment of November 12, 1985, that DES will be supported indefinitely for the financial community.
● Treasury will rely on NBS to continue to validate DES chips.
● Treasury will continue to certify equipment and techniques for Federal use to provide authentication/encryption and automated key management for EFTs.
● Treasury will continue to develop, in conjunction with NBS, automated test beds/bulletin boards so that NBS can validate successful hardware and software implementations of ANSI financial standards.86

The Federal Reserve System publicly expressed its commitment to electronic data security in early 1985, when it announced specific plans to enhance its electronic payment services in order to increase their security. The Federal Reserve is a highly visible participant in the Nation's electronic payments system, both as an operator (performing electronic fund and securities transactions, serving as an automated clearinghouse, etc.) and as a regulator. In its role as an operator, the Federal Reserve must protect its value transactions; as a regulator, the Federal Reserve intends that its security and reliability standards serve as models for depository institutions to emulate in securing their own electronic payments operations.

80. Information on the workshop is from David Wilson and Jenny Sobrasky (Ernst & Whinney), private communications with OTA staff, May 5-6, 1987, and from an IEEE press release (May 1987).
81. OTA-CIT-297, op. cit.
82. Department of the Treasury, Directives Manual, Information Systems Security, Ch. TD 81, Section 40, Apr. 2, 1985.
83. Department of the Treasury, Directives Manual, "Electronic Funds and Transfer Policy—Message Authentication," TD 81, Section 80, Aug. 16, 1984. Superseded by: Department of the Treasury, "Electronic Funds and Securities Transfer Policy—Message Authentication and Enhanced Security," TD 16-02, Oct. 3, 1986. TD 16-02 is authorized by Treasury Order 106-09, Oct. 2, 1986.
84. Department of the Treasury Order #106-09, "Electronic Funds and Securities Transfer Policy—Message Authentication and Enhanced Security," Oct. 2, 1986.
85. J. Martin Ferris, Security Programs, Department of the Treasury, Washington, DC, letter to OTA staff, Dec. 16, 1986.
86. Ibid.

The Federal Reserve's plans include encryption of depository-institution connections; as of late 1986, over 60 percent of these were encrypted, and the Federal Reserve plans to have almost 100 percent of them encrypted by the end of 1987. In addition, the Federal Reserve is currently testing the use of message authentication within the Federal Reserve environment.[87] The National Bureau of Standards is providing technical support to the Federal Reserve.

Technical Standards Development

Technical standards are important for a number of reasons. Among other things, they help to aggregate markets by improving the uniformity, interoperability, and compatibility of vendors' products.

Federal Agency Participation.—NSA and NBS activities in the development of standards have been noted earlier. Other agencies involved in the development and promulgation of regulations and standards include the Office of Management and Budget, the General Services Administration (GSA), and DoD's National Communications System (NCS). GSA promulgates Federal procurement regulations generally, including telecommunications, and has delegated its responsibilities for producing and coordinating communications standards to NCS, which has issued DES-related standards for telecommunications security and interoperability.

NBS has had considerable success during the past decade in developing a variety of standards for information security, as well as by publishing dozens of guidelines. Known as Federal Information Processing Standards (FIPS), NBS standards apply to civilian agencies. Several have also become the basis for standards developed or adopted by NSA and by private standards-setting organizations such as ANSI, the ABA, and the International Organization for Standardization (ISO).

One of the earliest of these national standards, DES (FIPS 46, released in 1977), is discussed in appendix C. DES, which is now produced in hardware and software both in the United States and overseas,[88] has been adopted by ANSI in a number of its technical standards, and was considered for use as an international standard by an ISO technical committee in 1986, as discussed later.

Private Sector Participation.—Active participation in the development of technical standards for information safeguards is another indication of the current and future needs of business users. ANSI has had active participation from several dozen major corporations, including banks, equipment vendors, and (more recently) other manufacturers. For example, several large U.S. banks and the American Bankers Association (ABA), the Canadian Bankers Association, and about 30 vendors are among the participants in developing standards of interest to the banking community, in addition to NBS, the Treasury Department, and NSA.

Suppliers and users of sophisticated safeguards such as biometrics and other technologies not based on cryptography have acted more independently of the Federal Government, sometimes in the absence of technical standards. Defense agencies are major consumers of these products, but the Federal Government does not enjoy the near monopoly in technical expertise that it has in cryptography. In the area of biometrics, the International Biometric Association was formed in 1986 to address industry issues, including establishing a testing and standards program.

Most large corporations have developed or are developing their own information safeguard policies. For example, the Chemical Bank of New York, which has more than 250 branches, has developed its own policies and a security training program for bank employees.[89] The bank's policies, published in 1985, define security and custodianship responsibilities in the bank's distributed operating environment and govern the transfer of information in hard copy and electronic forms to protect the bank's information service and data assets. The bank has developed a software package that it uses to train branch officers to perform risk assessments for their local offices and to implement the corporate security standards. By late 1986, the software package had been used in at least 30 Chemical Bank locations.[90]

The Small Business Computer Security and Education Act (Public Law 98-362) provided another mechanism for private sector participation in developing information security standards and guidelines. Passed in July 1984, the act set up a 10-member Small Business Computer Security Advisory Council to advise small businesses on the vulnerabilities to misuse of computer technologies (especially in distributed network environments) and on the effectiveness of technological and management techniques to reduce these vulnerabilities. It also develops guidelines and information to assist small businesses and plans to distribute written materials, including a small business guide to computer security (to be published by NBS) in mid-1987.[91] A report to Congress will be issued by December 1987.

The Applied Information Technologies Research Center (AITRC) represents yet another private sector approach to meeting information safeguard needs. A consortium of scientific, technological, and business organizations based in Columbus, Ohio, AITRC is part of a State-supported program and was supported by an initial State grant of $1.4 million. Its industrial members include leaders in online information services, and one AITRC project is developing techniques for secure access to private and subscription databases. In the fall of 1986, AITRC was licensing a low-cost, credit-card device for remote user identification.[92]

Technical Standards Bodies.—Another indication of the variety of users' needs and demands is provided by the activities of the technical standards-making bodies. Users and vendors in the banking and information processing communities, and in civilian Government agencies, have been working with considerable success for the past decade to develop standards to meet their needs for improved information safeguards. These groups recognize that standards establish common levels of cryptographic-based security and interoperability for communications and data storage systems.[93]

The leading information standards-making organizations in the United States have been the Institute for Computer Sciences and Technology at NBS, the American National Standards Institute (ANSI), and the American Bankers Association (ABA), as noted earlier. The International Organization for Standardization (ISO) develops voluntary standards for international use. Through these bodies, users and vendors are setting the stage for improving the integrity and security of computer and communications systems worldwide.

The American National Standards Institute (ANSI) serves as a national coordinator and clearinghouse for information on U.S. and international standards. It is the central nongovernment institution in the United States for developing computer, communications, and other technical standards for industry.

[87] Jack Dennis, Assistant Director of Federal Reserve Bank Operations, Washington, DC, personal communication with OTA staff, Aug. 26, 1986, and letter, Dec. 17, 1987.
[88] Federal Government certification applies only to implementations of DES in electronic devices.
[89] "Corporate Data Security Standards," Chemical Bank (Chemical New York Corp.), 1985; also presentation by Joan Reynolds (Chemical Bank), panelist in "Guidelines and Standards Panel," Ninth National Computer Security Conference, Gaithersburg, MD, Sept. 16, 1986.
[90] Personal communication between OTA staff and Denise Ulmer, Chemical Bank of New York, Sept. 25, 1986. The software package, RiskPac(TM), is also being marketed commercially through Chemical Bank Information Products and Profile Analysis Corporation, Ridgefield, Connecticut. Personal communication between OTA staff and Peter S. Brown, Profile Analysis Corp., Sept. 25, 1986.
[91] Information provided by Peter S. Brown, chairman, Small Business Computer Security Advisory Council, Sept. 25, 1986.
[92] Sources: Information Hotline, July-August 1986, pp. 6-7; and personal communication between OTA staff and Richard Bowers, AITRC, Sept. 8, 1986.
[93] D. Branstad and M. Smid, "Integrity and Security Standards Based on Cryptography," Computers & Security 1 (1982), North Holland Publishing Co. Also see Organization for Economic Co-operation and Development, Committee for Information, Computers, and Communications Policy, "Standards and Standard-Setting in Information Technology—Stakes, Strategies, and International Implications," Sept. 5, 1985.

ANSI members represent a broad range of industries and technical disciplines. NBS is a member of many ANSI committees, including those dealing with message authentication and encryption; other Federal agencies, including Treasury and NSA, also have memberships. ANSI serves as the U.S. representative to the International Organization for Standardization (ISO).

These organizations are structured internally into committees, technical committees, and working groups to accommodate the special interests of their members and to provide a narrow focus, where needed, for developing particular standards and guidelines. Among the structures related to information security are:

● ANSI X3 (Information Processing Systems) Committee, which includes the encryption technical committee; and ANSI X9 (Financial Services) Committee, which includes the financial institution message authentication working group, the financial institution key management committee, and the bank card security working group (focusing on personal identification number management and security);
● ABA, which focuses on financial transaction safeguards, including encryption and message authentication; and
● ISO's Technical Committee 97 (TC97) and its various subcommittees and working groups, which are responsible for developing standards for information processing systems; and Technical Committee 68, which has similar responsibilities for the financial community.

These bodies make extensive use of one another's work, often adopting the other's standard intact or with modifications. Table 10 shows the progress being made in the development of standards and guidelines, as well as many of the contributions of different civilian institutions.

The interests of many developed countries in establishing an international standard for cryptography have recently culminated more than 5 years of deliberation in the ISO. In December 1985, an ISO technical subcommittee recommended that DES be adopted as an international standard.[94] Any standard adopted by the ISO would likely be used throughout much of the developed world to safeguard communication and computer systems. Disagreements within the U.S. delegation (between NSA and the business community members of ANSI) led the U.S. delegation to abstain during the ISO vote on DES.[95] ANSI, in mid-1986, recommended to ISO that cryptographic algorithms not be the subject of international standardization. This change from ANSI's previous position probably came in response to NSA suggestions.[96] Several months later, the ISO Technical Committee TC97 announced the withdrawal of the proposed DEA-1 standard.[97]

Some of the other nations involved in the ISO deliberations have proposed their own algorithms as alternatives to DES.[98] This proposal may give credence to what many believe, i.e., that not only can other nations offer encryption algorithms for international use, but that future encryption services will be decided based on international commercial needs.

[94] Vincent McClellan, "The Pentagon Couldn't Defeat IBM in Battle Over DES Standard," Information Week, Feb. 24, 1986, pp. 24-27.
[95] Ibid., pp. 24-27.
[96] During a meeting with NSA officials in June 1986, OTA staff were advised that since most private sector foreign representatives to the ISO have close ties with their governments, the final ISO decision on whether to adopt the DES could be decided prior to ISO voting through private negotiations among governments. Furthermore, NSA officials have stated that NSA is not in favor of DES (or any one algorithm) being used as an international encryption standard. Harold E. Daniels, Jr., NSA S-0033-87, Feb. 12, 1987, p. 2 of Enclosure 2. Critics of NSA are sometimes inconsistent. For example, there was speculation that the real reason that NSA opposes DES, or any other algorithm, as an international standard is that it would damage NSA signals intelligence operations or benefit criminal elements. On the other hand, others speculate that DES is easy for a government intelligence agency to decipher. However, according to one NSA executive, there is no evidence that anyone has yet found a way to break the DES. But, because DES has come into such widespread use, it may become an attractive target for just such attempts. OTA staff meeting with Harold E. Daniels, Jr., NSA, Aug. 13, 1986.
[97] Vincent McClellan, "The Pentagon Couldn't Defeat IBM in Battle Over DES Standard," Information Week, Feb. 24, 1986, pp. 24-27.
[98] Ibid.
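The financial institution key management work mentioned above (published as ANSI X9.17; see table 10) addresses a practical problem these committees all face: how two institutions that encrypt or authenticate traffic can share fresh cryptographic keys without a courier for every session. X9.17's answer is a key hierarchy, in which a long-lived key-encrypting key (KEK), exchanged once out of band, protects short-lived data keys sent over the network. The sketch below shows only that layering; the XOR-with-derived-pad "cipher" is a stand-in for the DES-based wrapping the standard actually specifies, and is not secure.

```python
import hashlib, os, secrets

def wrap(kek: bytes, data_key: bytes) -> bytes:
    # Encrypt a data key under the KEK for transport. An XOR against a
    # KEK-derived pad stands in for DES-based wrapping (illustration only).
    nonce = os.urandom(8)
    pad = hashlib.sha256(kek + nonce).digest()[:len(data_key)]
    return nonce + bytes(a ^ b for a, b in zip(data_key, pad))

def unwrap(kek: bytes, blob: bytes) -> bytes:
    # Recompute the same pad from the KEK and nonce, then undo the XOR.
    nonce, body = blob[:8], blob[8:]
    pad = hashlib.sha256(kek + nonce).digest()[:len(body)]
    return bytes(a ^ b for a, b in zip(body, pad))

# The long-lived KEK is exchanged once, out of band (by courier, in
# X9.17 practice); fresh per-session data keys then travel wrapped.
kek = secrets.token_bytes(16)
session_key = secrets.token_bytes(16)
assert unwrap(kek, wrap(kek, session_key)) == session_key
```

The design choice the hierarchy captures is economic as much as cryptographic: only the rarely changed KEK needs expensive manual distribution, while the keys that actually protect traffic can be rotated as often as each session requires.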

Table 10.—Selected Civilian Technical Standards for Safeguarding Information Systems

Standard/guideline | Developer/year | Principal and other users/uses
Data Encryption Standard (DES) (FIPS PUB 46) | NBS (1977) | U.S. Government (computer and communication security); increasing use in private sector
DES Modes of Operation (FIPS PUB 81) | NBS (1980) | U.S. Government (key management, character transmission, packet transmission, voice)
Key Notarization System (U.S. patent 4,386,233) | NBS (1980) | U.S. Government (notarized identification of originator and receiver of secure message or data file); also used in banks
Guidelines for Implementing the DES (FIPS PUB 74) | NBS (1981) | U.S. Government (general DES user information)
Computer Data Authentication (FIPS PUB 113) | NBS (1985) | U.S. Government (authentication code for data integrity in ADP systems and networks); some use in private sector
Password Usage Standard (FIPS 112) | NBS (1985) | U.S. Government (identifies ten security factors for a password system)
General Security Requirements for Equipment Using DES (FS-1027) | GSA (1982) | U.S. Government (physical and electrical security of DES devices)
Interoperability and Security Requirements of the DES in the Physical Layer of Data Communications (FS-1026) | GSA (1983) | U.S. Government
Data Encryption Algorithm (DEA) | ANSI X3.92 (1981) | U.S. industry (voluntary standard; DEA is ANSI terminology for the DES)
Data Link Encryption Standard | ANSI X3.105 (1983) | U.S. industry
DEA Modes of Operation | ANSI X3.106 (1983) | U.S. industry
Financial Institution Message Authentication (Wholesale) | ANSI X9.9 (1983) | Wholesale banks (message authentication); industry (electronic procurement message authentication)
Personal Identification Number (PIN) Management and Security | ANSI X9.8 (1982) | Retail banks (DEA encryption of PINs); retailers (computer access control)
Financial Institution Key Management | ANSI X9.17 (1985) | Wholesale banks and industry (cryptographic keys for encryption and message authentication)
Financial Institution Message Authentication (Retail) | ANSI X9.19 (1986) | Retail banks (message authentication using DEA)
Financial Institution Encryption of Wholesale Financial Messages | ANSI X9.23 (draft) | Wholesale banks and industry
Management and Use of PINs | ABA (1979) | Banks (general guidance)
Protection of PINs in Interchange | ABA (1979) | Banks (general guidance)
Key Management Standard, Doc. 43 | ABA (1980) | Banks (general guidance)
Data Encryption Algorithm (DEA-1) | ISO (1986) | Proposed international version of DES (FIPS 46); withdrawn by ISO Technical Committee TC97
Modes of Operation of DEA-1 | ISO/DIS 8372 | Draft international standard has been approved (title may change due to withdrawal of proposed DEA-1 standard)
Data Link Enciphering Standard | ISO/DIS 9160 | Draft international standard, version of ANSI X3.105
Message Authentication | ISO/DIS 8730 | Draft international standard for message authentication; Part 1 specifies the DEA-1 algorithm, Part 2 specifies the MAA algorithm
Public Key Encryption Algorithm and Systems | ISO/DP 9307 | Draft proposal for standards (may be stricken)
Banking: Key Management (Wholesale) | ISO/DIS 8732 | Draft international standard for wholesale banks

SOURCE: Office of Technology Assessment, 1987.
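Several entries in table 10 (FIPS PUB 81 and its ANSI and ISO counterparts) standardize the modes of operation, that is, the ways a 64-bit block cipher such as DES is applied to messages of arbitrary length. One of the four FIPS 81 modes, output feedback (OFB), turns the block cipher into a keystream generator, which suited character-at-a-time and voice links because transmission errors do not propagate. The sketch below shows the OFB structure only; a hash-based placeholder again stands in for DES, so the ciphertexts are illustrative, not standard-conformant.

```python
import hashlib

BLOCK = 8  # DES block size in bytes

def _toy_block_cipher(key: bytes, block: bytes) -> bytes:
    # Stand-in for the DES encryption function. OFB only ever runs the
    # cipher in the forward direction. NOT DES, not secure.
    return hashlib.sha256(key + block).digest()[:BLOCK]

def ofb_xcrypt(key: bytes, iv: bytes, data: bytes) -> bytes:
    # FIPS 81 output feedback: repeatedly encrypt the feedback register,
    # starting from the IV, and XOR the resulting keystream with the data.
    # Because only the XOR touches the data, the same call also decrypts.
    keystream, state = b"", iv
    while len(keystream) < len(data):
        state = _toy_block_cipher(key, state)
        keystream += state
    return bytes(a ^ b for a, b in zip(data, keystream))

iv = b"\x01" * BLOCK
ciphertext = ofb_xcrypt(b"link key", iv, b"wire transfer: 123.45")
assert ofb_xcrypt(b"link key", iv, ciphertext) == b"wire transfer: 123.45"
```

A flipped bit in an OFB ciphertext flips only the corresponding plaintext bit on decryption, which is why FIPS 81 recommends it for noisy channels; the chaining modes (CBC, CFB) trade that property for the error propagation that makes tampering detectable.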

The trend toward the standardization of encryption-based safeguards, principally for improving message integrity (virtually all of which are currently based on DES, often in conjunction with public-key cryptography), suggests that within a few years major segments of the world's businesses will have standardized information safeguards where needed.

Second, these trends indicate that the role of the U.S. Government is shifting from that of the principal developer of safeguard standards in the early 1970s to a more limited role of one participant among many, although with continuing and important responsibilities.

Inherent Diversity of User Needs

Decisions on arcane technical standards, originally based on national security concerns, have already begun to be influenced by various, growing nondefense interests in the United States and worldwide. If safeguard products meeting Federal standards for certification do not fully meet commercial needs, then users are likely to seek greater independence from the Federal Government. Some movement in this direction is already taking place, as evidenced by: unpublicized plans in 1987 of the U.S. banking community to bypass NSA's secret algorithms; growing commercial interest in proprietary public-key algorithms, which have no Federal standard but meet users' needs for electronic key distribution and digital signatures; and the workshop on Integrity Policy for Computer Information Systems, planned for late 1987, which will focus on military v. commercial security models.

The foregoing description of various users' needs, and actions that Government and private sector groups have undertaken to meet them, serves to point out the inherent diversity and heterogeneity of users' needs for information safeguards. Within the Federal Government itself, for example, different requirements exist among defense and civilian agencies, and even between classified and unclassified applications (such as food service or routine procurement) within DoD. The private sector is no more uniform in its needs, attitudes, and perceptions. In order to understand the differences in each user's requirements, priorities, and perceived risks and threats, information such as the following must be evaluated by each user:

● What are the user's major concerns? For example, what is the relative priority for various types of information for integrity versus confidentiality, versus reliability and continuity of service?
● What sensitive information may warrant better safeguards than are now provided?
● Who are the adversaries that need to be protected against (employees, competitors, foreign governments), and what resources are they likely to use?
● What are the likely consequences (financial, embarrassment, privacy) of different types of losses? What has been the loss experience to date?
● What are the decision criteria (costs and benefits for bolstering safeguards, requirements of law, risk aversion)?

Responses to these and other questions help to define the user's needs for safeguards and are likely to be different from one user to another, even when the users are in the same general business. A defense contractor bound by DoD policies and regulations for safeguarding classified information from foreign adversaries, for example, can recover the costs of safeguards from the Government. This is a very different situation from that of a large retailer who needs to authenticate thousands of transactions per day, with emphasis on service delivery, costs, data integrity, and protection against dishonest employees and customers. And the retailer's needs bear little resemblance to the bank manager's requirement to show that he has exercised due care in safeguarding the bank's assets.

Chapter 6 focuses on some of the major laws and policy directives concerning information security. In tracing the development of public policy, it seeks to provide insights into the question: "How did we get where we are today?"

Chapter 6
Major Trends in Policy Development
CONTENTS

Page
Findings ...... 131
Introduction ...... 131
The Evolution of Federal Policy for Safeguarding Unclassified Information in Communications and Computer Systems ...... 135
Executive Branch Activities in Information Security ...... 135
Computer Security ...... 136
Communications Security ...... 137
Definition of Sensitive Information ...... 139
Policy Development in Congress ...... 140
Government Controls on Unclassified Information ...... 141
Controls Through Legislation ...... 141
Executive Branch Directives and Other Restrictions ...... 143
The Environment for Policy Development ...... 145
The Early Environment ...... 145
The Changing Environment and Federal Policies ...... 145
The Current and Future Environment ...... 146
Current Congressional Interest ...... 147

Tables
Table No. Page
11. Selected Government Policies Related to Controls on Information Flows: A Context for Electronic Information Security ...... 132
12. Government Actions Affecting the Security of Information in Computer and Communications Systems ...... 134
13. Committees Guiding the Implementation of NSDD 145 ...... 139

Chapter 6
Major Trends in Policy Development

FINDINGS

● Federal policy limiting the disclosure of information has expanded over the last decade to include growing concern for protecting unclassified, but sensitive, information, such as that in commercial and Government databases. As part of this process, the role of the defense and intelligence communities has also expanded, and "national security," as a criterion for non-disclosure, is being interpreted more broadly.
● Federal policies on information security are creating tensions with broad national interests and, in contrast with earlier times, can no longer be isolated from them.
● Most recent Federal policies on information security are based principally on national security concerns. Now that information security is becoming important to commerce, more broadly based policies will be more appropriate.
● The National Security Agency (NSA), in carrying out its role under National Security Decision Directive 145 (NSDD-145) to develop computer and communications security standards for use by Government and industry, is involved in two policy conflicts. One conflict involves responsibilities for developing security standards, with the National Bureau of Standards (NBS) charged by the Brooks Act of 1965, as amended, and NSA having overlapping responsibilities under NSDD-145. The second is a continuing, inherent conflict between NSA's mission to perform signals intelligence and its efforts to develop computer and communications safeguards for widespread nondefense use.

INTRODUCTION

Policy for the security of electronic information has developed in recent years in a setting of diverse interests. These interests have included national security and the separation of powers for governmental policymaking, as well as civil liberties, including personal privacy, and commercial needs for improved information safeguards. The current tensions in information security policy reflect all of these influences. To a large extent, these tensions have their basis in different views within Government of overall national interests and the central historical role of the Government, particularly the Department of Defense (DoD), in developing technology and setting policies for safeguarding electronic information.

This chapter provides a brief review of two of these influences:

● the context of Government controls on unclassified information that has evolved during the past few decades; and
● the progression of prior policies concerning the privacy and security of electronic information that have led to today's policies.

Policies designed to keep electronic information secure developed historically largely in the context of protecting national security. One of the important ways that has been used to limit potential damage to the nation's security is through controls on the dissemination

of information. Federal limitations, dating to before the turn of the century, sought to prevent the disclosure and distribution of militarily sensitive, Government-owned or -controlled information.

Traditionally, information protected for national security reasons has been limited to military and diplomatic categories. Since the 1940s, a number of laws have been passed and presidential directives issued that have gradually expanded the range of information deemed vital to U.S. national security. Controls have been placed on data relating to, for example, atomic energy, space programs, and a variety of other technologies. (See table 11.) Similarly, efforts have been made to keep intelligence sources and methods secret,[1] and there have been discussions on whether controls might be warranted for satellite imagery gathered for the news media.[2]

At the same time, the medium of information that is to be controlled—i.e., oral, print, photographic, or electronic—has also expanded. The setting for the transfer of controlled information has become irrelevant, whether through the export of products or services, sales presentations, university laboratories and classrooms, or scientific or trade conferences.

Against this backdrop, computer and communications systems are among the media for controlling the transfer of such sensitive information. Concern for their vulnerability to penetration, particularly by foreign intelligence entities, has resulted in pressure to increase the security of these systems.

Table 11.—Selected Government Policies Related to Controls on Information Flows: A Context for Electronic Information Security

1940s:
● Atomic Energy Act (a)
● Export Control Act (b)
● National Security Act (c) — establishes the Central Intelligence Agency
1950s:
● Invention Secrecy Act (d)
1960s:
● Export Administration Act of 1969 (e)
1970s:
● Arms Export Control Act of 1976 (f)
● PD/NSC-24 (g) — safeguard sensitive Government information in communications systems
1980s:
● Defense Authorization Act, 1984 (h) — controls on military and space technical data
● NSDD 189 (i) — clarify controls on basic research data
● NSDD 145 (j) — safeguard sensitive information in computer and communications systems
Recent reports:
● Air Force study of foreign access to commercial databases (k)
● Soviet acquisition of Western technology (l)
● Senate report on counterintelligence (m)

(a) Atomic Energy Act of 1946 (60 Stat. 755).
(b) Export Control Act of 1949 (63 Stat. 7).
(c) National Security Act of 1947 (50 U.S.C. 403). This Act also provides standards for classifying and safeguarding information for the protection of national security, notably intelligence sources and methods.
(d) Invention Secrecy Act of 1951 (35 U.S.C. 181-188).
(e) Export Administration Act of 1969 (50 App. U.S.C. 2401-2413), as amended 1979, 1981, 1985.
(f) Arms Export Control Act of 1976 (22 U.S.C. 2751 et seq.).
(g) Presidential Directive/National Security Council-24 (PD/NSC-24), Telecommunications Protection Policy (unclassified excerpts dated Feb. 9, 1979), Nov. 16, 1977 (classified).
(h) Department of Defense Authorization Act, 1984, Public Law 98-94, Sept. 24, 1983, Section 1217, Authority to Withhold from Disclosure Certain Technical Data (10 U.S.C. 140c).
(i) NSDD 189, National Policy on the Transfer of Scientific, Technical, and Engineering Information, Sept. 21, 1985.
(j) National Security Decision Directive 145 (NSDD 145), Policy on Telecommunications and Automated Information Systems Security, Sept. 17, 1984.
(k) "The Exploitation of Western Data Bases," Report of the Air Force Management Analysis Group (Secret), June 30, 1986.
(l) "Soviet Acquisition of Militarily Significant Western Technology: An Update," Department of Defense, September 1985.
(m) "Meeting the Espionage Challenge," Senate Select Committee on Intelligence, Report No. 99-522, Oct. 3, 1986.

A second context that affects Government controls on information concerns the respective roles of, and occasional conflicts between, the executive and legislative branches in setting policy when national security is at stake.[3] The history of this controversy has its origins in the drafting of the Constitution, and it continues to raise complex issues for both branches. Since the beginning of the Cold War in the mid-1940s, the debate over the roles of the two branches has included such topics as atomic energy, satellite communications, the funding of research in fields such as electronics and supercomputers, and the roles of military v. civilian agencies.

The controversy over policymaking responsibilities within the Federal Government has a direct bearing on Federal policy in information security primarily because it influences the scope of national interests to be embraced in such policies and, in that process, the priorities emphasized. For example, one view of national interests places priority on military advantage and defense capability, with national security often being promoted through reliance on secrecy and Government controls. Advocates of this view accept the idea of Government control of access to information in the greater interest of national security. The other viewpoint focuses on the United States as a free and open society in which access to information, for realizing scientific, economic, and intellectual achievement, should be subject to only minimal Government control when there is clear justification.

In addition, the process by which policy is developed is becoming increasingly important as the range of national interests affected expands beyond national security concerns and, consequently, as tensions among competing objectives are created. Policymaking in Congress tends to be an open process, in contrast with the often closed process underlying past executive branch policies concerning communications and computer security.

Federal policy on electronic information security has also been shaped by concerns for privacy and civil liberties. Laws have been passed limiting warrantless Government wiretaps and prohibiting eavesdropping on others' private communications or gaining unauthorized access to computer systems. This path of Federal policymaking, which has its origins with the Communications Act of 1934, has gained momentum during the past two decades independent of concerns for foreign intelligence gathering.

As a consequence of these various influences, most of which have ramifications that extend well beyond information security, policy formulation has followed at least two interdependent paths, at times initiated by Congress and at other times by the executive branch. The resulting policies are highlighted in table 12. In this process, however, there has been a growing influence of defense and intelligence interests in shaping policy for the security of unclassified electronic information.

Until recently, Federal policies on electronic information security, whatever their objectives, have not raised tensions. What is different about the policies of the 1980s, however, is that some of them have begun to affect segments of the private sector more significantly. In contrast with earlier policies, which had negligible influence on nondefense businesses or private citizens, recent policies have tended to impose added burdens on some businesses, to raise concerns for new restrictions on private sector access to unclassified, but sensitive, information, and to interject an intelligence agency in normal business operations. (See ch. 5.)

Some of the key questions that arise are: Where is policy for the security of electronic information leading? Can the current issues be resolved? What new issues might arise? The review of the evolution of policy in the remainder of this chapter provides limited insights into the answers to these questions. For example, there is little indication that any permanent change is about to occur to reconcile the different views of the national interest and how these should be addressed in policy on the security of electronic information. It is more likely, given the complexity of the issues, that the narrower ones will be addressed, such as the extent of controls on information flows v. the ease of public access to Federal information intended by the Freedom of Information Act.

[1] "The Evolution and Organization of the Federal Intelligence Function: A Brief Overview (1776-1975)," Supplementary Reports on Intelligence Activities, Book 6, Senate Select Committee to Study Government Operations, Report 94-755, Apr. 23, 1976.
[2] U.S. Congress, Office of Technology Assessment, Commercial Newsgathering From Space—Technical Memorandum, OTA-TM-ISC-40 (Washington, DC: U.S. Government Printing Office, May 1987).
[3] Harold C. Relyea, "National Security and Information," Government Information Quarterly, vol. 4, No. 1, 1987, pp. 11-28.

Table 12.—Government Actions Affecting the Security of Information in Computer and Communications Systems

World War I ...... War Department a [Executive branch]
Post World War I .. American Black Chamber b [Key report]
1934 ...... Communications Act c [Legislative branch]
1952 ...... NSA created d [Executive branch]
1965 ...... Brooks Act e [Legislative branch]
1968 ...... Omnibus Crime Control and Safe Streets Act f [Legislative branch]
1976 ...... Senate report on Federal intelligence functions g [Key report]
1977 ...... NBS establishes DES as U.S. standard [Executive branch]; MITRE reports on communications security h [Key report]
1978 ...... Foreign Intelligence Surveillance Act i [Legislative branch]
1979 ...... Policy on protection of government communications, PD/NSC-24 j [Executive branch]; RAND report on computer security k [Key report]
1980 ...... NTIA White Paper l [Key report]
1981 ...... Executive Order 12333 m [Executive branch]
1984 ...... Policy on protection of government computer and communications systems, NSDD 145 n [Executive branch]
1985 ...... House hearings on computer security policy o [Legislative branch]
1986 ...... Policy on protection of sensitive information q and NSA decision to replace DES [Executive branch]; HR 145, Computer Security Act p; Computer Fraud and Abuse Act r; Electronic Communications Privacy Act s [Legislative branch]
1987 ...... Planned review of NSDD 145 q [Executive branch]; House hearings on HR 145 t and House report on computer security u [Legislative branch]

a Responsibilities of the War Department included safeguarding classified and diplomatic messages and signals intelligence operations.
b H.O. Yardley, The American Black Chamber, Bobbs-Merrill Co., Indianapolis, 1931. As reported in David Kahn, The Codebreakers, pp. 360-361.
c The Communications Act of 1934, Section 605 (now Section 705), as amended.
d The National Security Agency was created by a still-classified presidential memorandum in 1952. NSA’s responsibilities include safeguarding Government classified and diplomatic communications and foreign signals intelligence operations.
e Public Law 89-306.
f Title III of the Omnibus Crime Control and Safe Streets Act of 1968 protects the privacy of wire and oral communications and delineates conditions under which interception of wire and oral communications may be authorized.
g “The Evolution and Organization of the Federal Intelligence Function: A Brief Overview (1776-1975),” Supplementary Reports on Intelligence Activities, Book VI, Senate Select Committee to Study Government Operations, Report 94-755, Apr. 23, 1976.
h “Study of the Vulnerability of Electronic Communications Systems to Electronic Interception,” Volumes 1 & 2, the MITRE Corp., January 1977; “Selected Examples of Possible Approaches to Electronic Communications Interception Operations,” the MITRE Corp., January 1977. These reports were prepared under contract to the Office of Telecommunications Policy, Executive Office of the President.
i Public Law 95-511 establishes standards and procedures for the use of electronic surveillance for Government intelligence collection within the United States, including wiretaps and radio interception.
j Presidential Directive/National Security Council-24 (PD/NSC-24), Telecommunications Protection Policy (unclassified excerpts, dated Feb. 9, 1979), Nov. 16, 1977 (classified).
k “Security Controls for Computer Systems,” Report of the Defense Science Board Task Force on Computer Security. Originally published as a classified document (R-609), February 1970. Republished as R-609-1 by the RAND Corp., October 1979 (unclassified).
l “Analysis of National Policy Options for Cryptography,” National Telecommunications and Information Administration, Department of Commerce, Oct. 29, 1980.
m Executive Order 12333, United States Intelligence Activities, Dec. 4, 1981. The order includes a description of certain authorities of NSA for communications security safeguards.
n NSDD 145, Policy on Telecommunications and Automated Information System Security, Sept. 17, 1984, assigns responsibility for computer and communications security to a single executive agent, the Secretary of Defense, and a single national manager, the Director of NSA.
o Hearings on computer security policies, House Subcommittee on Transportation, Aviation, and Materials, Committee on Science and Technology, June 27, 1985.
p The Computer Security Act of 1986 (now 1987), HR 145.
q Policy on Protection of Sensitive, but Unclassified Information in Federal Government Telecommunications and Automated Systems, NTISSP No. 2, Oct. 29, 1986. This policy provides a definition of such sensitive information and notes the responsibilities of department heads for deciding when safeguards are warranted. It was rescinded in March 1987 by Frank Carlucci, Chairman, National Security Council, and at the same time a review of NSDD 145 was ordered.
r Public Law 99-474 provides penalties for unauthorized access to certain financial records in computer systems and for trespassing on Federal computers.
s Public Law 99-508 amends Title III of the Omnibus Crime Control and Safe Streets Act of 1968; it protects against the unauthorized interception of electronic communications.
t Hearings on HR 145 of the House Subcommittee on Legislation and National Security, Feb. 25-26 and Mar. 17, 1987, and joint hearings of the House Subcommittee on Transportation, Aviation and Materials, Feb. 26, 1987.
u House Report 100-153, Parts 1 and 2, June 11, 1987, 100th Cong., 1st Sess.

Ch. 6—Major Trends in Policy Development ● 135

Act, and the appropriate roles of NSA v. NBS in providing safeguard standards for nondefense use.

Finding an appropriate balance between these different views is not easy. Both are embodied in laws and policies, and both have strong advocates within and outside Government. Moreover, they have an existence that transcends the current debate over information security. Still, the issues raised by the debate demand attention now because of the implications of information security for the conduct of government, business, science, and our personal lives.

Two important shifts appear to be occurring, however. The first is a wider recognition of the impacts of policies on users of information security products and on providers of information services, particularly where the public does not understand or agree with the need for controls, or where impacts fall unevenly. The second major shift, one that is still being deliberated, is a reluctance by Congress to accept executive branch policies on information security when they require subordinating other important national interests. These two trends suggest that future policies for national security will have to be integrated with other interests, or alternative means found for satisfying them, such as through the technological and administrative safeguard measures noted in chapters 4 and 5.

THE EVOLUTION OF FEDERAL POLICY FOR SAFEGUARDING UNCLASSIFIED INFORMATION IN COMMUNICATIONS AND COMPUTER SYSTEMS

Executive Branch Activities in Information Security

Government policies have focused on the confidentiality of electronic communications since before World War I.4 These policies provided the means for protecting classified defense and diplomatic messages transmitted over Government and commercial communications systems. For most of this century, U.S. policies included both communications security and signals intelligence operations against foreign governments.5 These functions became dispersed within each of the military departments, but were consolidated with the creation of the National Security Agency (NSA) within DoD in 1952.6

While U.S. defense agencies have long had an interest in preventing Soviet acquisition of various militarily useful equipment produced in this country or by our allies, they have also begun of late to urge export protection of technical information that could be used for military or commercial purposes. Consequently, in some policy circles, the concept of national security, which in times past was very similar to our understanding of “national defense and foreign policy,” has taken on a broader meaning, one encompassing a wide range of economic, technical, scientific, and business information.

At the same time, two other concerns have arisen. One is over Soviet and other countries’ electronic intelligence gathering in the United States. A second involves an increase in the range of potential international adversaries. No longer are they perceived as limited to military and diplomatic opponents, but include economic rivals as well as terrorists, drug traffickers, and organized crime.

From such considerations has come an increasing interest in protecting information that, although not classified, is nevertheless important or sensitive enough alone or in combination with other unclassified information to warrant special precautions. As a consequence, a new category of unclassified, but sensitive information has developed.

Computer Security

Executive branch interest in computer security began with the establishment of a task force in 1967 to recommend safeguards to protect classified information in multi-access, resource-sharing computer systems. The work of the task force, which was sponsored by DoD’s Defense Advanced Research Projects Agency, resulted in a classified report issued by the Defense Science Board in 1970, a declassified version of which was published in 1979.7

At the same time, NSA, which was concerned about the vulnerability of the U.S. banking system, began encouraging NBS to become involved in computer security. Based on the authority of the Automatic Data Processing Equipment Act (widely known as the Brooks Act) of 1965,8 NBS was already developing performance standards for computers used by the Federal Government. As a result, NBS and the Association for Computing Machinery cosponsored a conference in 1972 on computer security. Following the conference, NBS initiated a program in computer and communications security in 1973 based on the Brooks Act. This program led to the adoption in 1977 of the Data Encryption Standard (DES) as a national standard for cryptography. (See ch. 4.)

Since then, NBS has published dozens of Federal Information Processing Standards and guidelines, validated commercial encryption devices, participated in voluntary standards groups, assisted other civilian agencies, and, with NSA, cosponsored annual conferences on computer security. NBS also works with users and vendors in developing many of their products. Recently, the agency has contributed to the development of standards for network security as part of the “open system interconnection” network.

The 1970 task force report also prompted DoD to improve the security of classified information in computer systems. Research and development undertaken by the Air Force, Defense Advanced Research Projects Agency, and other defense agencies in the early- and mid-1970s demonstrated approaches to technical problems associated with controlling shared-use computer systems.9

As a result of these activities, DoD launched the Computer Security Initiative in 1978, a program largely transferred to NSA in 1981, to address the department’s computer security needs. The program became the National Computer Security Center (NCSC) in 1984 with the issuance of NSDD-145. NCSC develops standards and guidelines, evaluates computer hardware and software security properties, undertakes research and development, and trains users. According to NCSC literature, the center addresses the Nation’s computer security problems rather than just those associated with classified information or defense agency requirements.

Many of NCSC’s activities affect civilian agencies and the private sector. Among these are the development of criteria for evaluating the security of trusted computers, known as the “Orange Book.”10 NCSC, or other parts of NSA, rate commercial products based on the Orange Book criteria, train people in computer security, evaluate commercial DES products and other cryptographic devices,11 design cryptographic modules for vendor manufacture, and develop secure telephone equipment. NCSC also publishes standards and guidelines for computer security and participates in voluntary standards activities with industry. (See ch. 5.)

Communications Security

PD/NSC-24

Increasing concern during the mid-1970s about Soviet interception of unclassified U.S. domestic communications led to a change in executive branch policy. Presidential Directive/National Security Council-24 (PD/NSC-24) was signed by President Jimmy Carter in 1977. It expanded the authority of DoD and, in a more limited way, the Department of Commerce, for safeguarding unclassified, but sensitive communications that “would be useful to an adversary.”12 PD/NSC-24 directed Federal department heads to protect unclassified, but sensitive communications. It assigned responsibility to DoD for the security of classified communications and for unclassified, but sensitive communications related to national security. It also assigned responsibility to the Department of Commerce for raising users’ awareness of the vulnerability of communications systems to interception. In addition, PD/NSC-24 charged the Defense and Commerce Departments with developing a joint proposal for a national policy on cryptography. DoD’s responsibilities were carried out by NSA and Commerce’s by the National Telecommunications and Information Administration (NTIA).

Several DoD directives were issued to implement PD/NSC-24. The first, National Communications Security Council Policy-10 (NCSC-10),13 called for the protection of sensitive information transmitted by the Government or DoD contractors over satellite links. It was followed by NCSC-11,14 which broadened NCSC-10 to protect all transmission systems carrying sensitive information from Government and DoD contractors. Neither NCSC-10 nor NCSC-11 included a funding mechanism, but NSA issued National Communications Security Instruction 6002 in 1984.15 It authorized Federal agencies and Government contractors to purchase approved equipment and services to protect unclassified, but sensitive information. (See ch. 5.) For its part, NTIA conducted seminars on communications vulnerabilities for more than 1,500 Federal employees.

DoD and Commerce were not able to develop a joint proposal for a national policy on cryptography, however, because of disagreements over compromises concerning national security, trade, innovation, and First Amendment rights. Instead, DoD and Commerce submitted separate proposals. Essentially, the DoD proposal called for a continuation of various Government controls on cryptography, such as on patents and the export of equipment and technical data, while Commerce proposed minimizing these controls and argued for greater sensitivity to the negative effects they have on broader national interests.16

The NTIA effort under PD/NSC-24 was hindered significantly by the absence of definitions of the terms “sensitive information” and “useful to a foreign adversary” that could serve as practical guides to department heads. This shortcoming is significant because the broad definition provided in NSDD-145 later had to be withdrawn due to public apprehension about its potentially wide applicability.

4 For an account of early Government intelligence operations, including wiretapping, codemaking, and codebreaking, see: Supplementary Reports on Intelligence Activities, Book VI, Final Report of the Select Committee to Study Government Operations with Respect to Intelligence Activities, U.S. Senate, Report No. 94-755, Apr. 23, 1976.
5 James Bamford, The Puzzle Palace (New York, NY: Penguin Books, 1983), p. 206. The Army’s cryptologic capability dates at least to World War I.
6 Ibid., p. 81. NSA was created by a top secret presidential order signed by President Harry S Truman on Oct. 24, 1952.
7 “Security Controls for Computer Systems,” Report of the Defense Science Board Task Force on Computer Security, Office of the Secretary of Defense. Originally published as a classified document (R-609), February 1970; republished as an unclassified document (R-609-1), October 1979, by the RAND Corp., Willis Ware, editor.
8 Public Law 89-306, Automatic Data Processing Equipment Act of 1965.
9 J.P. Anderson, “Computer Security Technology Planning Study,” ESD-TR-73-51, vol. I, AD-758 206, ESD/AFSC, October 1972.
10 Department of Defense Trusted Computer System Evaluation Criteria, DoD 5200.28-STD, December 1985. See also CSC-STD-001-83, Aug. 15, 1983.
11 Under the Commercial Communications Security Endorsement Program.
12 Presidential Directive/National Security Council-24 (PD/NSC-24), Telecommunications Protection Policy (unclassified excerpts, dated Feb. 9, 1979), Nov. 16, 1977 (classified).
13 National Policy for the Protection of U.S. National Security Related Information Transmitted over Satellite Systems, NCSC-10, Apr. 26, 1982. The National Communications Security Council was a predecessor organization to that established under NSDD-145.
14 National Policy for Protection of Telecommunication Systems Handling Unclassified National Security Related Information (NCSC-11), May 3, 1982.
15 Protection of Government Contractor Telecommunications, National Communications Security Instruction 6002 (NACSI-6002), June 1984.
16 This assessment stems from OTA staff interviews in April 1987 with former NTIA officials involved in developing the Department of Commerce proposal for a national policy on cryptography. Also see “White Paper: Analysis of National Policy Options for Cryptography,” National Telecommunications and Information Administration, U.S. Department of Commerce, Oct. 29, 1980.

PD/NSC-24 had at least two notable effects: it pioneered an experiment by assigning some limited responsibility for safeguarding Government communications to a civilian agency, the Commerce Department, and it provided authority for NSA to protect unclassified communications. The assignment to NSA was the beginning of a trend toward consolidating and broadening responsibilities for the security of unclassified electronic information within DoD.

The joint Defense and Commerce programs, begun in 1978 under PD/NSC-24, were short-lived. They ended when NTIA’s involvement was discontinued in 1982 because of general agency budget reductions. Further, PD/NSC-24 itself was superseded by NSDD-145 in 1984. Many of the activities initiated under PD/NSC-24 now come under the authority of NSDD-145.

NSDD-145

The current national charter for information security is provided by Executive Order 12333,17 and National Security Decision Directive 145 (NSDD-145). Executive Order 12333 assigns to the Secretary of Defense responsibility for making Government communications secure. NSDD-145:

● is the current fundamental policy for communications and computer security;
● recognizes the merging of communications and computer technology and is intended to direct a coordinated approach to securing both types of systems;
● continues the emphasis on protecting unclassified, but sensitive information begun under PD/NSC-24;
● assigns responsibility for computer and communications security solely to a single executive agent, the Secretary of Defense, and a single national manager, the Director of the National Security Agency; and
● establishes a specific responsibility for major Government resources to be used to “encourage, advise, and if appropriate assist the private sector to protect against exploitation of communications and automated information systems.”

NSDD-145 states that telecommunications and automated information systems “are highly susceptible to interception, unauthorized electronic access, and related forms of technical exploitation, as well as other forms of hostile intelligence threat.” It recognizes that exploitation can occur from terrorist groups and criminal elements, and that private or proprietary information can become targets for foreign exploitation. NSDD-145 focuses on unclassified, but sensitive electronic “Government and Government-derived information, the loss of which could adversely affect the national security interest.”

The directive establishes an interagency organization that includes virtually all Federal defense, intelligence, and law enforcement agencies, as well as some civilian agencies. The leadership of the interagency group is also responsible for the security of classified information.

The organizational structure is shown in table 13. The key points to note are:

● The Systems Security Steering Group oversees the implementation of NSDD-145. It is composed of the secretaries of State, Treasury, and Defense, the Attorney General, the director of the Office of Management and Budget, and the director of the Central Intelligence Agency, and was chaired by the President’s advisor for National Security Affairs as recently as 1987.
● Working under the steering group’s guidance is the National Telecommunications and Information Systems Security Committee (NTISSC), which develops operating policies and provides security guidance to Government agencies. NTISSC is composed of representatives of Government agencies and departments having principal or major missions in military, intelligence, and law enforcement, among others. It is chaired by the assistant secretary of defense (for command, control, communications, and intelligence).
● The interagency group’s executive agent for telecommunications and information systems security is the Secretary of Defense, who approves standards and doctrine, and reviews the security budgets of other departments and agencies.
● The national manager for telecommunications and automated information systems security is the director of NSA, who serves as the Government focal point for cryptography, telecommunications, and automated information systems security, conducts R&D for security, and approves all standards, techniques, systems, and equipments for the security of these systems.

Table 13.—Committees Guiding the Implementation of NSDD 145

Systems Security Steering Group:
1. Secretary of State
2. Secretary of the Treasury
3. Secretary of Defense a
4. Attorney General
5. Director of OMB
6. Director of Central Intelligence a
7. Assistant to the President for National Security Affairs, chair a

National Telecommunications and Information Systems Security Committee:
Consists of a voting representative of each of the above, plus a representative designated by each of the following:
8. Secretary of Commerce
9. Secretary of Transportation
10. Secretary of Energy
11. Chairman, Joint Chiefs of Staff a
12. Administrator, GSA
13. Director, FBI
14. Director, Federal Emergency Management Agency
15. Chief of Staff, Army a
16. Chief of Naval Operations a
17. Chief of Staff, Air Force a
18. Commandant, Marine Corps a
19. Director, Defense Intelligence Agency a
20. Director, National Security Agency a
21. Manager, National Communications System a
22. Assistant Secretary of Defense for Command, Control, Communications, and Intelligence, chair a

a Denotes a representative closely associated with the defense/national security community.
SOURCE: Donald C. Latham, Assistant Secretary of Defense, Command, Control, Communications and Intelligence, testimony before the House Subcommittee on Transportation, Aviation, and Materials and Subcommittee on Science, Research, and Technology, Feb. 26, 1987. See also NSDD 145, Sept. 17, 1984.

Critics of NSDD-145 have charged that the organization is dominated by defense and intelligence interests and that the National Security Council, as chair of the steering group, acts in a decisionmaking capacity rather than as an advisor to the President. They also charge that NSDD-145 raises a conflict by giving authority to NSA to develop standards for computer security, authority that was previously given to NBS under the Brooks Act. The conflict has caused manufacturers and business users of information security products to question which Government agency has leadership for standards development, equipment endorsement, and related functions, and has raised the issues of the appropriate division of responsibility between civilian and military agencies, as well as the secrecy and absence of open accountability of NSA.

Definition of Sensitive Information

Finally, there has been considerable concern over public access to unclassified, but sensitive information (see below). One main reason was the definition of the term as information whose loss, misuse, alteration, or destruction “could adversely affect national security or other Federal interests.” These national security interests were defined as:

. . . matters that relate to the national defense or the foreign relations of the U.S. Government. Other Government interests are those related, but not limited to the wide range of Government or Government-derived economic, human, financial, industrial, agricultural, technological, and law enforcement information, as well as the privacy or confidentiality of personal or commercial proprietary information provided to the U.S. Government by its citizens.18

17 Executive Order 12333, United States Intelligence Activities, Dec. 4, 1981.
18 National Policy on Protection of Sensitive, but Unclassified Information in Federal Government Telecommunications and Automated Systems, NTISSP No. 2, Oct. 29, 1986.

Shortly after this definition was issued, Diane Fountaine, director of information systems for the Office of the Assistant Secretary of Defense, Command, Control, Communications and Intelligence, spoke before the Information Industry Association in New York City on November 11, 1986. This official was widely quoted as saying:

I don’t believe that the issue is whether or not we [DoD] are going to protect information. I really believe that the issue is what information we are going to protect, both from the Federal Government, both within DoD and also within industry.19

The overall statement was apparently intended to assure listeners that the restrictions would apply to Soviet access to U.S. databases and not to the U.S. scientific and technical community. Nevertheless, it was generally seen as foreboding by those who fear further Federal restrictions on unclassified information.20

At about the same time, two other related events were publicized that reinforced concerns for Government restrictions on unclassified information. One involved reports of a classified Air Force study on foreign access to databases in the United States and other Western countries, and what can be done to limit such access.21 The other involved well-publicized visits to commercial database firms by representatives from the Federal Bureau of Investigation, Central Intelligence Agency, and NSA asking how controls might be placed on subscribers to their systems. These visits received considerable publicity by the news media.

Policy Development in Congress

While passing legislation that provided the legal basis for some Government controls on information, Congress has also sought to protect the confidentiality of electronic communications and computer information as well as individuals’ rights and privacy. The laws identified below illustrate this trend, which has been occurring simultaneously and parallel to executive branch directives aimed at national security concerns. Still other laws, not shown, protect the privacy of individuals, such as the Privacy Act and the Fair Credit Reporting Act.

The Communications Act of 1934, Section 605 (now Section 705), as amended, provides that “No person not authorized by the sender shall intercept any communications and divulge . . . the content.” Notwithstanding this legislation and the 1938 Supreme Court interpretation (Nardone v. United States, 302 U.S. 379) that Section 605 prohibited all telephone wiretapping, even when done by Federal Government officers, Government wiretapping continued.22

Title III of the Omnibus Crime Control and Safe Streets Act of 1968 includes sections that protect the privacy of wire and oral communications, and delineate on a uniform basis the circumstances and conditions under which the interception of wire and oral communications may be authorized.23

The Foreign Intelligence Surveillance Act of 1978 (Public Law 95-511) establishes legal standards and procedures for the use of electronic surveillance in collecting foreign intelligence and counterintelligence within the United States. Electronic surveillance is defined to include wiretaps, radio intercepts, and other forms of surveillance.24

The Electronic Communications Privacy Act of 1986 (Public Law 99-508) protects against the unauthorized interception of electronic communications. It amends Title III of the Omnibus Crime Control and Safe Streets Act of 1968. The Electronic Communications Privacy Act addresses three limitations in Title III protection that had developed as a result of technological changes.25

19 Draft transcript of Diane Fountaine’s presentation at the Information Industry Association Annual Convention, Nov. 11, 1986. Transcript provided by the Information Industry Association. See also “Pentagon Weighs Data Bank Curbs,” New York Times, Nov. 11, 1986.
20 Op. cit., Fountaine statement.
21 “Pentagon Weighs Data Bank Curbs,” New York Times, Nov. 12, 1986; “Are Data Bases a Threat to National Security?” Business Week, Dec. 1, 1986.
22 U.S. Congress, Office of Technology Assessment, Federal Government Information Technology: Electronic Surveillance and Civil Liberties, OTA-CIT-293 (Washington, DC: U.S. Government Printing Office, October 1985), p. 18.
23 Ibid., pp. 18-21.
24 Ibid., pp. 20-21.
25 For a more thorough discussion of technological changes and the legal protections for the privacy of communications, see: Federal Government Information Technology: Electronic Surveillance and Civil Liberties, OTA-CIT-293, op. cit., October 1985.

The limitations concern the “aural acquisition” of oral communications (in contrast with the acquisition

of digital communications), communications over nonwire facilities, and communications over systems other than public telephone systems.

The Electronic Communications Privacy Act of 1986 extends legal protection in each of these areas. It prohibits unauthorized interception of video and data communications. It defines “electronic communication” to include “any transfer of signs, signals, writing, images, sounds, data, or intelligence of any nature.” Exceptions to this include the radio portion of a cordless telephone communication, any communication made through a tone-only paging device, and any communication made through a tracking device, such as is used for electronic surveillance. The 1986 act also extends protection to communications transmitted “in whole or in part by a wire, radio, electromagnetic, photoelectronic or photooptical system.”

Communications also are protected against intentional interception regardless of the means by which they are transmitted. But the inadvertent reception of satellite transmissions or radio communications is not penalized. The Electronic Communications Privacy Act also protects against the disclosure of stored wire and electronic communications (e.g., electronic mail records) and provides legal standards for access to the transactional records of communications providers. These extended protections address some of the vulnerabilities of communication systems identified in chapter 3.

The Computer Fraud and Abuse Act of 1986 (Public Law 99-474) provides penalties for unauthorized access to certain financial records in computer systems, including a 5-year felony provision for unauthorized access to a “Federal interest computer” with an intent to defraud. It also provides for a penalty for intentional trespassing on Federal computers. The act establishes a felony provision for malicious damage to a Federal interest computer and a misdemeanor provision for posting passwords on “pirate bulletin boards.”26

26 Public Law 99-474, The Computer Fraud and Abuse Act of 1986, signed into law Oct. 16, 1986. Computer Crime and Security, Issue Brief, Congressional Research Service, IB85155, Mar. 10, 1987.

GOVERNMENT CONTROLS ON UNCLASSIFIED INFORMATION

Controls Through Legislation

At the same time that it sought to protect individual rights and privacy from Government and other intrusions, Congress also gave the executive branch authority to limit public access to certain kinds of information, both classified and unclassified. A series of laws were enacted that gave the President and certain department and agency heads power to withhold information to protect its secrecy and to restrict access to it.

The Atomic Energy Act of 1946 (60 Stat. 755). One of the Federal Government’s oldest mechanisms for controlling scientific communications, the Atomic Energy Act of 1946, had its origins in the rigid secrecy surrounding the World War II Manhattan Project and the Government monopoly on atomic energy research and development.²⁷ The Atomic Energy Act created the category of Restricted Data, which it defined as “all data concerning (1) design, manufacture or utilization of atomic weapons; (2) the production of special nuclear material; or (3) the use of special nuclear material in the production of energy.” A revised version, enacted as the Atomic Energy Act of 1954 (68 Stat. 919; 42 U.S.C. 2011-2296), permitted access to and retention of some Restricted Data by private firms engaged under license in industrial applications of nuclear power, provided that they obtained the necessary security clearances and abided by the required information controls.

²⁷U.S. Congress, House Committee on Government Operations, “The Government’s Classification of Private Ideas,” House Report 96-1540, 96th Cong., 2d sess., Dec. 22, 1980.

Without explicitly using the phrase “born classified,” the Atomic Energy Act provides that Restricted Data is subject to secrecy from the moment of its creation, even though the creator may be a private individual. The Government has taken legal action against private parties, most notably The Progressive magazine, which was planning to publish an article (based on declassified, publicly available information) on the workings of a hydrogen bomb. The Government sought to restrain the magazine from printing the story. A court preliminary injunction was later vacated after similar information was published in a newspaper.²⁸

The act’s scope was broadened in 1981 to permit the Secretary of Energy to prohibit dissemination of certain unclassified information if dissemination could reasonably be expected to have a significant adverse effect on the health and safety of the public or the national defense and security by significantly increasing the likelihood of illegal weapons production or theft, diversion, or sabotage of nuclear materials, equipment, or facilities. Declassification alone may not release certain types of information from statutory control of its dissemination.²⁹

The Export Administration Act and Arms Export Control Act. Both the Export Administration Act (50 U.S.C. App. 2401-2420) and the Arms Export Control Act (22 U.S.C. 2751-2794) provide authority to control the dissemination to foreign nationals of scientific and technical data related to items requiring export licenses according to the Export Administration Regulations (EAR) or the International Traffic in Arms Regulations (ITAR). The implementing regulations are administered by the Department of Commerce, which licenses items subject to EAR, and by the Department of State, which licenses items subject to ITAR. The export of communications and computer security products and technical data is controlled through EAR and ITAR. The Defense Department plays an advisory role regarding the application of these regulations to technical data.

The term “technical data” is defined broadly to restrict the domestic dissemination of scientific and technical information to foreigners, including the presentation of papers at open scientific meetings.³⁰ This broad definition of “export” and the extent to which much scientific research can be (at least indirectly) related to items subject to controls have aroused much controversy during the past 7 years. The controversy pits the research and academic communities against the Departments of Commerce, State, and Defense.

Specific issues have included prepublication review clauses and other contract restraints on unclassified Government-sponsored university research, controls on foreign visitors, inquiries into and restrictions on foreign student activities (including access to supercomputer and advanced materials research), and DoD controls on the content of scientific communications at normally open professional meetings. An example of the latter was the meeting held by the Society of Photo-Optical Engineers in 1982, at which DoD forced the withdrawal of about 100 unclassified technical papers.³¹

²⁸Harold C. Relyea, “National Security Controls and Scientific Information,” CRS Issue Brief IB82083, June 17, 1986, p. 7.
²⁹By contrast, uncontrolled dissemination of declassified documents (through NTIS, for example) has been criticized as being a continuing and important source of U.S. technology for the Soviet Union. See, for example: Soviet Acquisition of Militarily Significant Western Technology: An Update, DoD, 1985; and “Baldridge Claims U.S. Agencies Give Technology to Soviets,” Research and Development, April 1985, p. 54.
³⁰Relyea, op. cit., CRS IB82083, p. 8.
³¹See, for example: “Federal Restrictions on the Free Flow of Academic Information and Ideas,” Government Information Quarterly, vol. 3, No. 1, 1986; Mitchel B. Wallerstein, “Scientific Communication and National Security in 1984,” Science, vol. 224, pp. 460-466; Paul Mann, “Strictures on Non-Secret Data Concern Scientific Community,” Aviation Week and Space Technology, Nov. 19, 1984, pp. 24-25; James K. Gordon, “Universities Resisting Potential Supercomputer Access Restrictions,” Aviation Week and Space Technology, Aug. 26, 1985, pp. 59-62. One of the outcomes of these controversies was the establishment of an ad hoc National Academy of Sciences Panel on Scientific Communication and National Security, chaired by Cornell University president-emeritus Dale Corson. The Corson panel report concluded that national policies of “security through secrecy” would ultimately weaken U.S. technological capabilities and recommended that contract controls be used for the (few) “gray” unclassified areas that could not reasonably be completely open.

The Invention Secrecy Act. The Invention Secrecy Act of 1951 (35 U.S.C. 181-188) provides that whenever the publication or disclosure of an invention by the grant of a patent—whether or not the Government has a property interest—might, in the opinion of the Secretary of Energy or the head of any designated defense agency (and the Department of Justice), be detrimental to national security, that agency head can request the Commissioner of Patents and Trademarks to order that the invention be kept secret and to withhold granting a patent. A patent secrecy order is issued for one year, but may be extended.

In addition to domestic patent secrecy orders, the Invention Secrecy Act provides that a license must be obtained from the Commissioner of Patents and Trademarks before filing any foreign patent application or registering any such design or model with a foreign patent office or agency for an invention made in the United States (35 U.S.C. 184).

Although the number of secrecy orders on cryptography patent inventions is small now, that was not always the case. According to a former director, NSA rescinded 62 of them in one year alone and sponsored 260 secrecy orders over a period of time.³² It is not clear how much of a chilling effect prospective secrecy orders have on inventors.

The Defense Authorization Act of 1984. The Defense Authorization Act of 1984 provides authority to the Secretary of Defense to withhold from public disclosure certain technical data with military or space applications. The data must be in the possession or under the control of DoD and must fall within the scope of U.S. export control regulations (i.e., the data must be already subject to export controls).³³

Executive Branch Directives and Other Restrictions

As a compromise response to the controversy concerning restraints on the communication of scientific research and to the National Academy of Sciences’ Corson Panel Report, President Ronald Reagan issued a directive on the transfer of scientific, technical, and engineering information on September 21, 1985. Known as National Security Decision Directive 189 (NSDD-189), the directive sought to minimize controls on fundamental research and to use classified procedures where controls are needed.

Specifically, NSDD-189 states:

. . . to the maximum extent possible, the products of fundamental research remain unrestricted. It is also the policy of this Administration that, where the national security requires control, the mechanism for control of information generated during Federally funded fundamental research in science, technology, and engineering at colleges, universities, and laboratories is classification. . . . No restriction may be placed on the conduct or reporting of Federally funded fundamental research that has not received national security classification, except as provided in applicable U.S. statutes.

NSDD-189 made Federal agencies sponsoring research responsible for determining, before the award of a research contract or grant, whether classification is appropriate, and for periodically reviewing grants and contracts for potential classification.

The directive did not quell all controversy, however, because it left “applicable U.S. statutes,” such as the export control laws, available as an alternative method of controlling federally sponsored, unclassified research results. Since export controls on scientific information had been a cause of the original controversy, NSDD-189 thus failed to resolve the issue.³⁴

³²Testimony of NSA Director Admiral Bobby R. Inman before the House Subcommittee on Government Information, Mar. 20, 1980. Also see “White Paper: Analysis of National Policy Options for Cryptography,” National Telecommunications and Information Administration, Department of Commerce, Oct. 29, 1980.
³³Department of Defense Authorization Act, 1984, Public Law 98-94, Sept. 24, 1983, Sec. 1217, Authority to Withhold from Public Disclosure Certain Technical Data; 10 U.S.C. 140c.
³⁴See: Relyea, op. cit., CRS IB82083, pp. 12-13; and “Reagan Issues Order on Science Secrecy: Will It Be Obeyed?” Physics Today, November 1985, pp. 55-58.

Meanwhile, the National Aeronautics and Space Administration (NASA) limits its distribution of some unclassified scientific and technical information, including that pertaining to dual-use technologies such as the space station, satellites, experimental aircraft, or transatmospheric vehicles. Such data can be restricted from dissemination to foreigners through export control laws, particularly through ITAR, or through other means (see below), if they have significant potential domestic benefit. Some NASA officials, however, feel a need for stronger protection against Freedom of Information Act requests from citizens of foreign countries. NASA officials try to screen such requests for unclassified reports listed in the RECON database, which contains abstracts and briefs from NASA technical reports. Foreign requesters are referred to the Department of State for licensing if the material is subject to ITAR.

NASA’s charter calls for the agency to disseminate information in an “appropriate” manner. This can include “early domestic dissemination” of data that is subject to limited distribution, in which case the data is made available to U.S. industry with the proviso that it not be published or disseminated abroad for a period of time. In some cases, “appropriate” dissemination may be determined by consideration of U.S. economic competitiveness as well as by national security concerns.³⁵

NASA does not make the services and documents in its technical utilization program available to foreign requesters or to their domestic U.S. representatives. For many years the NASA Scientific and Technical Information Facility has screened all requests for subscription to NASA Tech Briefs, technical support packages, and other documentation.³⁶ This practice, apparently motivated by concerns for national security and/or economic competitiveness and inferred from the export control laws, resulted in the NASA “No-No” list often being cited in the controversies surrounding National Telecommunications and Information Systems Security Policy Number 2 (NTISSP No. 2)³⁷ and the prospect of Government controls on commercial databases.

NTISSP No. 2 was formally adopted as national policy on October 29, 1986. It defines unclassified, but sensitive information to be used in accordance with the telecommunications and automated information systems security policy set out in NSDD-145. NTISSP No. 2 extended Federal concerns for safeguarding information beyond national security interests to concerns for broader national interests as described above. Federal agency and department heads were directed to identify unclassified, but sensitive information that might warrant protection in telecommunications or information processing systems, to determine in coordination with the National Security Agency (NSA) the threats to and vulnerabilities of these systems, and to implement appropriate security measures consistent with Office of Management and Budget Circulars A-123 and A-130. (See ch. 5.)

NTISSP No. 2’s broad definition of unclassified, but sensitive information and its implied extension of NSDD-145 into such a wide range of public and private sector information systems caused considerable controversy and outcry, as noted earlier, particularly because of implications for controls on scientific and financial information and commercial databases.³⁸ NTISSP No. 2 was rescinded in March 1987.

NSA does not have statutory authority to require prepublication review of independent, nongovernment research in cryptography. Nevertheless, the agency has attempted during the past decade to control publication and research funding in cryptography, efforts that have caused controversy. In the mid- and late 1970s, NSA attempted to assume the responsibilities of the National Science Foundation for funding unclassified cryptographic research, including reviewing research proposals and results.³⁹

NSA has also requested patent secrecy orders on applications for cryptographic equipment and algorithms under authority of the Invention Secrecy Act. Controversy concerning two secrecy orders led NSA to request the American Council on Education (ACE) to form a study group on cryptography. The ACE group was assembled in 1980 and issued its report the next year. It recommended the establishment of a voluntary prepublication review arrangement between NSA and academic researchers.⁴⁰

As a result, NSA established a voluntary review process for cryptographic manuscripts, with simultaneous review by NSA officials and professional journals, and with an appeals committee. Although the merits of such a process are still subject to some debate, some participants consider that it works in a reasonably satisfactory manner. According to Science magazine, about 200 papers had been submitted to NSA for review by 1984. According to NSA, of that number, nine papers were challenged. Six of these were modified and three withdrawn.⁴¹

³⁵OTA telephone interview with G. T. McCoy, NASA Office of the General Counsel, Patent Counsel Section, Mar. 31, 1987; comments from R. F. Kempf, Associate General Counsel for Intellectual Property Law, NASA, received May 8, 1987.
³⁶Walter Heiland in NASA memo, “The So-Called No-No List,” dated Sept. 30, 1986.
³⁷Op. cit., National Policy on Protection of Sensitive, but Unclassified Information in Federal Government Telecommunications and Automated Information Systems, Oct. 29, 1986.
³⁸“Making Waves: Poindexter Sails Into Scientific Databases,” Physics Today, January 1987, pp. 51-52.

³⁹See ch. 4. See also Tom Ferguson, “Private Locks, Public Keys and State Secrets: New Problems in Guarding Information with Cryptography,” Center for Information Policy Research, Harvard University, April 1982, and “The Government’s Classification of Private Ideas,” House Committee on Government Operations (op. cit.).
⁴⁰“Report of the Public Cryptography Study Group,” Academe, vol. 67, December 1981.
⁴¹Mitchel B. Wallerstein, “Scientific Communication and National Security in 1984,” Science, vol. 224, pp. 460-466.

THE ENVIRONMENT FOR POLICY DEVELOPMENT

The Early Environment

Cryptography has long been the principal method for protecting the confidentiality of communications. Since World War I, and increasingly during the past four decades, the Federal Government has been the Nation’s main source of expertise in developing cryptographic techniques. With exceptions, these developments, including cryptographic algorithms, have been kept secret, as has similar work in other nations.

Prior to the mid-1970s, there were relatively few external complications to communications policies based exclusively on national security concerns. NSA, and DoD generally, had responsibility for communications security, and the private sector had little interest in cryptographic technology. The Government could protect its interests by classifying R&D, controlling patent grants and exports, and monopolizing talent in the field. Any negative effects of this secrecy and of controls on private sector activities, presumably, have been relatively minor, with the possible exception of restrictions on patents and exports of cryptographic equipment and technical data.

The Changing Environment and Federal Policies

During the past decade, a number of events have changed the external environment, changes that are still taking place. The first of these has to do with shifts in Federal policy; the second concerns the changing external environment for policymaking.

Federal policy took a sharp turn in the 1970s when a nondefense agency, NBS, became involved in cryptography for the first time. The result of NBS’ efforts, which NSA assisted, was adoption of the Data Encryption Standard (DES) as a national standard for cryptography, the inner workings of which were published in the open literature.

The change in policy direction from secrecy to openness appears to have signaled increasing interest in the defense and intelligence communities in finding ways to thwart the ability of the Soviet Union and others to gain access to unclassified, unprotected U.S. communications. The policy shift is widely known to have triggered debate within NSA as to the tradeoffs between potential gains in securing communications at the expense of losses to the agency’s signals intelligence mission. Debates outside of the agency questioned whether NSA, in view of its signals intelligence mission, would permit a high-quality cryptographic algorithm to be published in its entirety.⁴²

Then, in the late 1970s, heightened Federal concern for foreign interception of U.S. Government and private sector communications resulted in the issuance of Presidential Directive/National Security Council 24 (PD/NSC-24), as noted earlier. PD/NSC-24 called for raising public awareness of the vulnerability of communications systems to interception. Thus, cryptography, the central means for safeguarding communications that are easy to intercept, was destined to play a role in the security of nondefense communications.

As these Federal policies evolved, important changes were also taking place outside the Government as private sector interests and competence in cryptography and other safeguard technologies began to grow. These changes were stimulated by the almost simultaneous invention of DES and the public-key algorithm by researchers from industry and academia. (See ch. 4.) This was followed in the late 1970s and early 1980s by private sector users recognizing new applications for these technologies. The result was a new set of stakeholders with an interest in Federal policies in this area. (See ch. 5.) In addition, business interest in cryptography became international.

These events contribute to an environment that contrasts sharply with the relatively tranquil one in which earlier U.S. policies were established.

The Current and Future Environment

The current external environment continues to evolve in a number of ways, some of which are an extrapolation of the past decade. For example:

● The private sector and civilian Government agencies are increasingly interested in improved safeguards for automated information systems, particularly for computer systems and for computer-communications networks. Computer safeguards are developing rapidly using a number of technologies, only a few of which are based on cryptography.
● Business applications for cryptography are still growing, both in the United States and overseas. Uses include improved confidentiality of data, message authentication and verification, and user identification. These new applications often take unpredictable forms, such as streamlining routine paper transactions in automobile manufacturing and reducing inventory costs in the grocery industry.
● There is an expanding, although by no means comprehensive, technical competence in the private sector to develop cryptographic-based and other safeguard technologies.

In this setting, defense policymaking has resulted in two recent changes. First, NSA sees its current role as the focal point for all computer and communications security for the Federal Government and private industry, including the protection of unclassified, but sensitive information.⁴³

Secondly, NSA changed the Federal Government’s practice of openly publishing cryptographic algorithms. The agency announced in 1986 that it would not recertify DES-based products after January 1988. Previously endorsed DES products may continue to be used, in general, and DES also may continue to be

⁴²Richard I. Polis, “European Needs and Attitudes Toward Information Security,” unpublished paper prepared for the Fifteenth Annual Telecommunications Policy Research Conference, Airlie, VA, Sept. 27-30, 1987.
⁴³NSA announcement, April 1986.


used for Government electronic fund transfers.⁴⁴ In place of DES, NSA announced that it would offer a family of NSA-designed and NSA-certified algorithms embedded in tamper-proof modules to protect unclassified information.

Thus, the prior Federal Government policy of providing certified, published algorithms, developed as consensual standards under NBS stewardship, has in fact shifted to NSA-provided, secret algorithms as a means of providing improved protection against the misuse of unclassified electronic information.

⁴⁴Letter to OTA from Michael C. Gidos, Chief, INFOSEC Policy, COMSEC Doctrine, and Liaison Staff, NSA, dated July 23, 1986.

CURRENT CONGRESSIONAL INTEREST

The various interests, concerns, and policy trends described in this chapter provide a background for a set of policy issues reflected in proposed legislation and hearings on computer and communications security in Congress during 1986 and 1987. Issues and concerns that previously were spoken of privately were now said in public and for the record. The result may be a vehicle for resolving, at least in the short run, some of the conflicting interests and views of national security as they pertain to the security of computer and communications information.

The Computer Security Act of 1987 (HR 145) was introduced in the House of Representatives in 1987.⁴⁵ It would establish a Government-wide program to ensure the security of sensitive information in computer and communications systems. Specifically, the bill:

● assigns to NBS responsibility for assessing the vulnerability of Federal Government computer and communications systems, and for developing appropriate security standards and guidelines, as well as providing technical assistance to other agencies;
● requires NBS to develop guidelines for use in training Federal personnel in computer security;
● defines unclassified, but sensitive information broadly to include information “the loss, misuse, or unauthorized access to, or modification of, which could adversely affect the national interest or the conduct of Federal programs, or the privacy to which individuals are entitled under . . . the Privacy Act . . .”; and
● provides an advisory role for NSA to NBS concerning safeguard technology, but does not affect NSA’s responsibilities for safeguarding classified information.

HR 145 establishes agency responsibilities for the development and standardization of safeguards to protect sensitive information against loss and unauthorized modification or disclosure, and to prevent computer-related fraud and misuse. As part of its role, NBS would develop standards and validation procedures for safeguards, provide liaison with other Government agencies and private organizations, and assist Federal agencies and the private sector in applying NBS-developed standards and guidelines. An advisory board would be established to assist NBS, which would include NSA representation.

Congressional hearings were conducted on HR 145 and NSDD-145 on February 25 and 26 and on March 17, 1987, by the Subcommittee on Legislation and National Security of the House Committee on Government Operations. Joint hearings were also held on February 26, 1987, by the Subcommittee on Science, Research, and Technology, and the Subcommittee on Transportation, Aviation, and Materials of the House Committee on Science, Space, and Technology.

The hearings were significant because they allowed representatives of important scientific, professional, and trade groups to publicly express their concerns. Witnesses at the hearings commenting on their experiences or views on NSDD-145 were generally negative or apprehensive. Their comments tended to focus on three main points:

1. NSA’s expanding role in civilian agency and private sector computer security;
2. the “disruptive,” “counterproductive” effects of NSA’s restrictions on U.S. banks’ use of NSA-provided cryptographic algorithms; and
3. apprehension regarding potential DoD controls on unclassified information.

Many witnesses at these hearings were concerned that, under NSDD-145, the Government would restrict access to information in public libraries, engineering and scientific publications, and Government and commercial on-line databases. Challenges were raised as to the authority of the Government to withhold unclassified information from the public, the effect on First Amendment protections, and potential damage to the free flow of information in society and to the principle of open government.⁴⁶

In response, DoD officials assured the subcommittees that NSDD-145 would not extend the authority of DoD or NSA to control access to unclassified, but sensitive information, nor would it apply to information in the private sector or to Government information subject to release under the Freedom of Information Act. In commenting about the purposes of NSDD-145, these officials pointed out that the Government needs to prevent invasions of citizens’ privacy, the obtaining of unfair advantage in business dealings, and the avoidance of law enforcement efforts,⁴⁷ once again addressing the question of the scope of national security interests.

During the course of these hearings, the definition of unclassified, but sensitive information provided in NTISSP No. 2 was rescinded, and the National Security Council initiated a review of NSDD-145 aimed at reducing or eliminating its operational role.⁴⁸ At about the same time, civilian agency participation in NTISSC was expanded.⁴⁹

These current congressional activities are the latest attempt to grapple with the diverse issues surrounding information security policy, many of which are of long standing. Regardless of the outcome of HR 145, the fundamental issues—such as the separation of powers, the role of the Government, and the boundaries between military and civilian agency responsibilities—will require reexamination to determine the appropriate balance of national interests.

⁴⁵H.R. 145, The Computer Security Act of 1987, Jan. 6, 1987, and Report 100-153, Parts 1 and 2, June 11, 1987.
⁴⁶See, for example, testimony of the Information Industry Association, the Institute of Electrical and Electronics Engineers, the American Library Association, the Association of Research Libraries, the American Physical Society, David Kahn, and the American Civil Liberties Union.
⁴⁷Op. cit., Latham testimony, Feb. 26, 1987.
⁴⁸Letters from Frank Carlucci, Assistant to the President for National Security Affairs, to Congressman Jack Brooks, Chairman, Committee on Government Operations, U.S. House of Representatives, Mar. 12 and 17, 1987; letter from Howard H. Baker, Chief of Staff to the President, to Congressman Jack Brooks, Mar. 16, 1987.
⁴⁹From material provided by NSA staff to OTA, Dec. 22, 1986.

Chapter 7
Federal Policy Issues and Options

CONTENTS

Page
Introduction ...... 151
The Influence of Federal Policies ...... 151
Factors Influencing Information Safeguard Developments ...... 151
The National Security Influence in Policy Formulation ...... 152
Interrelated Federal Policies and Changing Concerns ...... 152
Policy Analysis ...... 153
Important Trends for Policy ...... 153
National Values and Objectives ...... 154
Levers for Implementing Policy ...... 156
Alternative Policy Options ...... 156
Evaluation of Options ...... 158
Policy Observations ...... 160

Table
Table No. Page
14. Policy Options ...... 157

Chapter 7
Federal Policy Issues and Options

INTRODUCTION

Policy formulation in the area of information security is important today and will become more so in the coming decade and beyond. Its importance stems from the broad impact of electronic information on society and the potentially major applications of safeguard technology for commerce and government.

Federal policies require adjustments over time as the external environment changes. During an earlier era when protecting Government-classified communications from foreign exploitation was virtually the only objective, policies shaped exclusively by this need went unchallenged and tensions with other national objectives were nonexistent or minimal. Now, however, the objectives of Federal policy are increasingly expanding to include nondefense interests, such as the prevention of embezzlement of electronic funds transfers, the disruption of public services (e.g., air traffic control and Social Security transfer payments), and the theft of proprietary information from U.S.-owned firms by foreign competitors. At the same time, the expansion of earlier policies centered on national security and Government controls is creating tensions with other national interests. Thus, new objectives are becoming important and a different balance for Federal policy may be more appropriate.

In the defense and intelligence communities, however, Government controls are seen as vital to U.S. signals intelligence interests. Technical and other controls on access to unclassified, but sensitive information in automated systems are being considered as a means for regulating the export of valuable information to foreign interests.

As discussed earlier in this report, applications of information safeguard technology are already being adopted to improve the efficiency, integrity, and control of business and Government automated transactions, and to improve their confidentiality as well. Much larger and more pervasive applications for commerce and society are foreseen, further stimulated by continued advances in this technology.

The Influence of Federal Policies

Federal policies can have a strong influence on the development and use of information safeguards. Policies may encourage private investment in safeguard technologies or, on the other hand, can discourage such activities. Chapters 5 and 6 provide a number of examples of policies and programs that have a combination of these effects. For example, the Government can stimulate the use of safeguards by setting technical standards, requiring specific message authentication and verification procedures for certain applications, and issuing performance guidelines and specifications. On the other hand, secret Government designs for safeguards may result in high-quality, federally endorsed commercial safeguards, but discourage independent innovation in the private sector. Secret designs also foster private sector dependence on the Federal Government for equipment validation and certification, and for the provision of replacement designs.

Factors Influencing Information Safeguard Developments

How and when society fully realizes the potential benefits of information safeguard technology will be determined by a number of factors. One is the aggregate need of users. We can anticipate two effects from those needs: 1) that private sector users will increasingly set the pace in new applications of safeguard technology; and 2) that market forces will respond to user demand for new products, absent Government-imposed constraints.
At the vestment in safeguard technologies or, on the same time, the expansion of earlier policies cen- other hand, can discourage such activities. tered on national security and Government Chapters 5 and 6 provide a number of exam- controls is creating tensions with other na- ples of policies and programs that have a com- tional interests. Thus, new objectives are be- bination of these effects. For example, the Gov- coming important and a different balance for ernment can stimulate the use of safeguards Federal policy may be more appropriate. by setting technical standards, requiring spe- cific message authentication and verification procedures for certain applications, and issu- Factors Influencing Information ing performance guidelines and specifications. Safeguard Developments On the other hand, secret Government de- How and when society fully realizes the po- signs for safeguards may result in high-qualit y, tential benefits of information safeguard tech- federally endorsed commercial safeguards, but nology will be determined by a number of fac- discourage independent innovation in the pri- tors. One is the aggregate need of users. We vate sector. Secret designs also foster private can anticipate two effects from those needs: sector dependence on the Federal Government 1) that private sector users will increasingly for equipment validation and certification, and set the pace in new applications of safeguard the provision of replacement designs. technology; and 2) that market forces will re- In the defense and intelligence communities, spond to user demand for new products, ab- however, Government controls are seen as vi- sent Government-imposed constraints.


A second factor concerns the net effect of Federal policies on stimulating private sector developments. Federal policies to date have helped some developments in computer and communications security technology and hindered others. (See chs. 4 and 5.)

A third influence concerns private sector innovation itself. The occurrence and rate of innovations is unpredictable. Important advances, such as public-key cryptography, have occurred without Federal encouragement. Yet, Federal policies can affect the climate for creativity by stimulating research or, alternatively, creating a chilling effect.

In this view, Federal policies have a significant, but not the sole, influence on private sector developments. Nevertheless, they are particularly important because today's technology is still immature and market demand limited and, in some cases, fragile. Therefore, the policies of nations that are at the forefront of technological development and innovative applications, such as the United States, will have a major impact on the pace and direction of private sector advances in information security.

The National Security Influence in Policy Formulation

An analysis of information security issues in a report based entirely on unclassified data is hindered by a number of factors, one of which is the strong influence of classified information in shaping policy development. Neither Presidential Directive/National Security Council 24 (PD/NSC-24) nor National Security Decision Directive 145 (NSDD-145), for example, was debated openly. In fact, they were classified while being developed, although eventually unclassified versions were issued. Thus, the process of policy development, at least within the executive branch, has been a relatively closed one.

Secrecy is also an important factor in policies concerning the development of cryptography. Unlike other safeguard technologies useful in computer security, cryptography is the mainstay for providing confidentiality and integrity of information that is unprotected by physical or hardware/software security measures (as when such information is in transit on a network). Cryptography allows the United States to safeguard its classified defense and diplomatic communications. The absence of high-quality encryption in foreign communications makes possible some U.S. signals intelligence operations. Because of these defense and intelligence community interests and the general lack of nondefense interests in earlier times, public policy concerning cryptography has tended to be shaped and controlled by the Department of Defense (DoD).

Until recently, policy directions based exclusively on national security concerns adequately served the Nation's needs, with little visible impact on the rest of society. That situation is changing, spurred in large part by new opportunities and challenges created by technological change, continued pressure to improve business and government operations, and the emerging internationalization of applications of this underpinning technology. This changing environment also is likely to bring further challenges to policymakers as the needs of society continue to change both in the United States and abroad.

Interrelated Federal Policies and Changing Concerns

National security interests clearly have an important and continuing place in Federal information security policies. The prospect of worldwide use of high-quality information security safeguards threatens U.S. signals intelligence operations, as does the dissemination abroad of critical technical data on information security. As technology continues to advance and as safeguards for computers and communications systems come into wider use worldwide, the effectiveness of U.S. signals intelligence may become more limited and its priority lowered among national objectives.

Another policy involves control of access by foreign governments to commercial databases in the United States that contain unclassified, but sensitive information. On-line databases allow rapid access and sorting through a wealth of information. Defense and intelligence agencies seek to prevent foreign intelligence agencies or businesses from acquiring valuable technical data that can help other countries compete with the United States militarily or economically.

Government concerns about communications and computer security, signals intelligence, and controls on foreign access to unclassified information change with time. PD/NSC-24, for example, elevated attention about the vulnerability to misuse of communications systems to Federal policy status. Now, there are a number of DoD programs, some of which are classified, to reduce those vulnerabilities. Similarly, computer security was just being identified as an area warranting Federal concern in the early 1970s. Today, substantial resources are being applied to bolster computer security. Now, concern is extending to encompass access to Government databases, such as the Defense Technical Information Center and the National Technical Information Service.

Still another change is foreseeable. For example, the proliferation of information security technology could further broaden the scope of national security concerns. Within a decade, good quality, inexpensive, easy-to-use computer and communications safeguards may be used worldwide for many applications. That could shift attention away from the central national security issues of today, possibly toward countering the use of secret transactions for conducting illegal or subversive business.

Almost at the same time that these changes in Federal concerns have been taking place, the trend in private sector users' needs for information security can now be seen as overlapping some of the Government's applications requiring message authentication, user verification, auditing of transactions, confirmation of authorizations, and confidentiality.
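The overlapping applications listed above can be illustrated with a small sketch of message authentication, the safeguard that lets a receiver confirm a transaction was not altered in transit. The sketch below uses Python's standard `hmac` and `hashlib` modules — a present-day keyed-hash construction, not the DES-based authentication techniques of the report's era — and the key and message values are invented for illustration only.

```python
import hmac
import hashlib

# Shared secret key, known only to sender and receiver (hypothetical value).
KEY = b"shared-secret-key"

def authenticate(message: bytes) -> bytes:
    """Compute a keyed message authentication code (MAC) over a message."""
    return hmac.new(KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Check a received message against its MAC, using a constant-time compare."""
    return hmac.compare_digest(authenticate(message), tag)

# The sender transmits the message together with its tag; the receiver
# recomputes the tag and accepts the message only if the two match.
msg = b"TRANSFER $100 TO ACCOUNT 12345"
tag = authenticate(msg)

assert verify(msg, tag)                                # authentic message accepted
assert not verify(b"TRANSFER $900 TO ACCOUNT 12345", tag)  # altered message rejected
```

A MAC of this kind serves the integrity and authorization interests the report describes; the confidentiality interest would additionally require encrypting the message itself.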

POLICY ANALYSIS

The preceding sections and chapters raise questions as to the appropriate overall objectives of Federal policies, the direction in which current policies may lead, and whether or not other alternatives might better serve the Nation's interests. Based on the needs of the different stakeholders, e.g., businesses, scientific organizations, and civil, defense, and intelligence agencies, it is clear that each would provide a considerably different perspective to an analysis of policy options.

Important Trends for Policy

Chapter 6 described some of the Government efforts during the past few decades to solve particular problems through controls on unclassified information. In recent years, Government efforts have included restrictions, for example, on the dissemination of unclassified technical reports from the National Aeronautics and Space Administration and potential restrictions on access to Government information of the National Technical Information Service and the Defense Technical Information Center, as well as access to information in commercial database services. Thus, there has been a tendency in Federal policy toward greater control of selected information and, recently, of access to information in certain types of systems. Some of these policies have not recognized the needs of the public.

Because computers, information systems, and communications networks are changing so rapidly, policies based only on current needs are likely to become outdated quickly. Policies are needed that are flexible and anticipate the changing needs of industry and society. The factors discussed below are among those that are changing. They will significantly influence future policy deliberations, either because of changes in the policy environment or because of the public's attitude about Federal policies that affect business operations and the free flow of information. Each provides insights into future directions for policy.

● Although some important improvements are foreseeable in the confidentiality of public communications systems, these are likely to be uneven. Many segments of these systems will remain vulnerable to exploitation by those with appropriate resources.
  To the extent that DoD programs depend on encouraging businesses to pay independently for reducing the vulnerabilities of their communications against Soviet or other foreign government interception, failure is likely since business profits are not perceived to be affected. There are strong indications, however, that some nondefense users will have business reasons to protect the integrity of certain of their information in computer and communications systems and the confidentiality of selected communications. Both interests can be served with cryptographic-based safeguards.
● A broad range of techniques for safeguarding unclassified information in computer systems and networks are available or are being developed. Private sector capabilities to develop these safeguards to meet their own needs are significant and expanding.
● Academic researchers and businesses have begun to demonstrate a level of expertise in developing certain types of cryptographic-based safeguards (e.g., the Data Encryption Standard and two-key systems). Further developments in this field are unpredictable. However, based on recent experience, Federal support for private innovation through unclassified research could yield promising results. Any additional major advances may also result in still more valuable new applications.
  These trends highlight a serious dilemma for Government policymakers: how to maintain effective signals intelligence while simultaneously encouraging the development and use of more secure systems for communications and computer systems. For example, encouraging unfettered private sector innovation in cryptography increases the chance of major technological advances that benefit commerce and society. But perhaps another country will use the same technology to protect its own electronic information from U.S. intelligence operations. On the other hand, if the National Security Agency (NSA) provides nondefense users with safeguard technology, the foreign interception and access threat may be reduced earlier, but the ready availability of "adequate" solutions from NSA may act as a disincentive for the private sector to develop solutions better tailored to its unique requirements.
● Although the current trends are not yet altogether clear, there are indications that businesses have diverse and specialized needs for cryptographic-based systems and other safeguards for a variety of nondefense applications.
  Almost certainly, no Federal agency will be able to satisfy the diverse needs of many of these users with Government-designed systems, especially if significant constraints must be placed on users. Private sector capabilities for developing computer and communications safeguards can meet most of the demand of Government agencies and other users. For the procurement of other commercial products, the typical practice among Federal agencies would be to provide their specific performance requirements and to purchase competitively. The arguments favoring a central role for DoD/NSA in carrying out these responsibilities are becoming less convincing, although there is a clearer need for NSA technical assistance in selected areas, such as cryptanalysis and equipment evaluation.
● Flexible Federal policies with minimal restraints are likely to have a better chance of success than others. The banking industry's experience with NSA's planned restrictions indicates that Government-provided safeguards, with rigid restraints associated with their use, are not likely to satisfy the needs of business users. (See ch. 5.)
● There is international demand for improved safeguards and foreign capabilities for developing them. (See ch. 5.)¹
● DoD efforts to restrain or monitor foreign access to commercial on-line databases have already raised public concerns. (See ch. 6.) Further, these services are becoming a significant industry in the United States and a source of U.S. exports.² Government efforts to control access to commercial databases are likely to continue to be resisted by this rapidly growing, competitive industry.

Today, NSA appears to be attempting to retain as much control or influence as is practical in these matters. The controls are exercised mainly through authority provided under NSDD-145 and various NSA programs, including those that stimulate the availability of commercial safeguard products. Yet, the above trends suggest that Federal policies concerning the development and use of safeguard technology, and access to unclassified, but sensitive information in commercial databases, will have to be carefully aligned with changing and more intensive domestic and international business interests and with congressional and other institutions.

Some businesses are unaffected by DoD initiatives, such as those that improve the confidentiality of common carrier communications systems or that require Government-reimbursed voice protection equipment to be used by defense contractors when discussing unclassified, but sensitive information by telephone. Still other businesses are likely to support Government initiatives that enhance their operational needs, such as Federal endorsement of data encryption algorithms and certification of commercial safeguard equipment. But others are likely to oppose any Federal policies that detract from trade, innovation, open science, and civil liberties.

Finally, there are questions raised about which branch of Government should make policy on information security. Both the executive and the legislative branches have adopted policies that show few signs of coordination. The executive branch has been most active in recent years, notably with NSDD-145, and the defense and intelligence communities, specifically NSA, have been the principal implementers. Executive branch policies have been based primarily on national security considerations.

¹Richard I. Polis, "European Needs and Attitudes Toward Information Security" (unpublished), Telecommunications Policy and Research Conference, Airlie, VA, Sept. 30, 1987.
²These companies had revenues of $3.65 billion in 1984. Christopher Burns and Patricia Martin, "The Economics of Information," 1985, OTA contractor report No. 433-9520. The industry had 486 companies by 1986. The number of database producers worldwide increased from 221 in 1979 to 1,500 in 1986, while the number of databases increased from 400 to 3,200 during that same period. OTA staff interview with Kenneth Allen, Information Industry Association, February 1987.

National Values and Objectives

Because there are important stakes at risk for the Nation in formulating policy for safeguarding information, Congress has to carefully consider what the Government's broad goals are that these policies seek to protect or encourage. Although there are often strong differences of opinion on the merits of specific Federal policies, there seems to be broad agreement on the types of goals that such policies might aim to achieve. Some of these goals are to:

● foster the ability of the private sector to meet the evolving needs of businesses and civil agencies for safeguard technology,
● minimize risks to U.S. signals intelligence from private sector developments, and
● clarify the roles of Federal agencies concerning unclassified information and the development and use of technology to protect it.

At the same time, achievement of the following, more general goals may also be desirable:

● promote competition, innovation, and trade;
● separate, where practical, defense and intelligence agencies' responsibilities from those of the private sector and civilian agencies;
● retain a free flow of information and an open society, while encouraging privacy; and
● minimize or reduce the tensions between Federal policies and private sector activities.

Levers for Implementing Policy

A number of incentives and constraints can be used to implement policies regarding safeguard technologies. These include programs to certify vendors' equipment, transfer technology, standardize designs, procure devices, and encourage the development and use of improved safeguards. Controls on exports and patents are clear examples of constraints. The funding of research by the Government can be either a constraint (e.g., by keeping the results classified) or an incentive.

Depending on how some of these levers are actually used, they could simultaneously promote and restrain private sector activities. Current Government practices in transferring cryptographic technology to the private sector appear to accomplish both. They also illustrate how policy levers can be used. For example, providing a few manufacturers with high-quality, inexpensive, tamper-proof, Government-certified cryptographic devices whose design is secret may meet the immediate needs of private sector users and vendors for certified systems. Simultaneously, national security objectives are served by encouraging the use of improved safeguards. In addition, the Federal Government can control the export of these products, in part because the underlying technology is produced by a limited number of U.S. companies for NSA. At the same time, however, this approach discourages further private sector innovation since it is unlikely that many users will want or that manufacturers will produce competing products that lack NSA certification and have limited demand.

Also, some policies may encourage continued private sector dependence on the Federal Government while others are more likely to lead toward an independent technical competence in the private sector for meeting its own needs. These effects are treated in more detail in the subsequent section that evaluates alternative policy options. The focus of decisionmaking, however, is on the respective roles of NBS and NSA, and implementing policy around these roles.

Alternative Policy Options

Several options exist for national policy. They can be distinguished mainly by the degree of centralization within the Federal Government, the level of involvement in or control of private sector activities exercised by the Government, the separation of defense and nondefense interests, the importance of national security, and the flexibility of the private sector in developing information technology safeguards to meet its needs. Table 14 illustrates the options in their main division of responsibilities between the National Bureau of Standards (NBS) and NSA.

Table 14.—Policy Options

                        Option 1        Option 2          Option 3           Option 3A          Option 3B
                        Centralize      Continue          Separate defense   Support private    Market forces for
Responsibilities        under NSA       current practice  and nondefense     standards          developing standards
                                                          needs              development        for unclassified needs
All classified .......  NSA             NSA               NSA                NSA                NSA
Unclassified
 Communications:
  Defense ............  NSA             NSA               NSA                NBS                NBS
  Nondefense .........  NSA             NSA               NBS(a)             NBS(a)             NBS(a)
 Computer:
  Defense ............  NSA             NSA               NSA                NBS                NBS
  Nondefense .........  NSA             NSA               NBS(b)             NBS(b)             NBS(b)
Key distinctions .....  Centralization, NSA de facto      Mixed technical    Commonality with   Commonality with
                        NSA leadership  leadership        leadership         nongovernment      nongovernment
                                                                             safeguards         safeguards; private
                                                                                                sector leadership

(a) Refers to NBS's communications security standards responsibilities affiliated with computer security.
(b) Refers to NBS's standards responsibilities under the Brooks Act (Public Law 89-306).
SOURCE: Office of Technology Assessment, 1987.

Option 1: Centralize Federal activities relating to safeguarding unclassified information in Government electronic systems under the National Security Agency.

Option 2: Continue the current practice of de facto NSA leadership for communications and computer security, with support from the National Bureau of Standards.

Option 3: Separate the responsibilities of NSA and NBS for safeguard development along the lines of defense and nondefense requirements.

In Option 3, additional choices can be made.

A: Provide Federal support to specify, develop, and certify safeguards for businesses and civilian Government agencies. NBS would be the focal point for all safeguard standards for unclassified information. This option most closely resembles H.R. 145.

B: Allow free market forces to develop safeguards for nondefense needs, with NBS acting as the focal point for Government needs for safeguards for unclassified information. NSA specifies the requirements of DoD and defense contractors and provides technical advice for other users.

The discussion of these policy options assumes that NSA would retain responsibility for matters relating to classified information in computer and communications systems under all options and that complementary NBS and NSA activities would be coordinated as necessary.

Options 1 and 3 would clarify the present confusion concerning the roles of NSA and NBS. Option 1 would provide one focal point in the Federal Government for efforts to develop safeguard technology for unclassified information in Government systems. This option would make use of NSA's technical expertise in cryptology and would concentrate the focus of U.S. policy toward national security objectives. The role of NBS in safeguard development would either be terminated or reduced to those civilian agency requirements that support NSA's role.

Option 2 would continue the current conflicting authorities assigned to NBS and NSA. It would also continue the current practice of NSA having de facto leadership in developing communications and computer security standards for the Nation, including increasing dominance over the development of cryptography. NBS would retain its current modest role in developing occasional, consensual technical guidelines and standards for civilian agency use.

Option 3 would assign to NBS responsibility for developing safeguards for all Government agencies' needs other than those specifically assigned to NSA. NSA would provide technical assistance to NBS, as needed. Under this option, NSA would be responsible for only those safeguard standards and developments required exclusively by defense agencies.

Option 3A would look to a nongovernment group or organization to take a lead role in developing consensual guidelines and standards for safeguarding unclassified information in private sector and civilian agency systems. Both NBS and NSA would actively support these private sector activities. NBS would serve as the focal point for civilian and defense agency standards for safeguarding unclassified information. As in Option 3, NSA would be responsible for providing advice to the nongovernment standards group.

Option 3B is similar to Option 3A, except that the Federal role would be diminished further. It would abandon Federal responsibilities for developing safeguards for unclassified information and, instead, would look to the marketplace to meet both private sector and civilian agency requirements. NBS would serve as the Government focal point for the needs of Government agencies for safeguards for unclassified information.

Evaluation of Options

The national values and objectives described earlier provide a useful starting point for comparing the policy options. It is apparent that:

The ability of the private sector to meet its own needs is fostered as the Government increasingly allows the marketplace to satisfy agencies' needs. In computer security, where industry and the private sector have historically led, NSA's trusted computer security program has benefited from significant manufacturer input. In cryptography, the commercial communication security endorsement program has limited the scope of manufacturer innovation of encryption algorithms, reflecting the historical NSA domination of this technology. In the area of network protocols, the interface between computer security and cryptography, there has been significant "give and take" between NSA and the private sector parties directly involved in the development of standards.

On the other hand, U.S. signals intelligence capabilities would be better protected if control of private sector developments in (cryptography-based) safeguards is centralized under NSA. In the extreme case of relatively unfettered free market forces, there is a risk that signals intelligence will suffer as foreign intelligence targets benefit from safeguard products or designs developed by U.S. industry. Other factors that will affect the transfer of technology abroad include the effectiveness of U.S. export control regulations and the availability of comparable technology from foreign sources.

The current situation, which has produced considerable controversy and confusion, is essentially Option 2. Almost any option would represent an improvement in clarifying the roles of NBS and NSA. This is true whether responsibilities are centralized in one agency or divided according to divisions such as classified and unclassified information, defense and nondefense, or almost any other scheme.

Diminishing NSA's role is likely to reduce tensions between Federal policies and private sector activities in safeguard development and use. Similarly, such tensions are likely to decline as defense and intelligence interests are separated from nondefense interests.

Each of these options has other advantages and disadvantages that distinguish them. None offers a completely favorable assessment based on the objectives against which they are being evaluated. For example:

Option 1:
Pros: The key advantage that distinguishes Option 1, in addition to clarifying the responsibility of the National Security Agency, is the ability to maximize NSA's control over private sector activity in safeguard development, particularly those based on cryptography. That will allow it to minimize the risks to U.S. signals intelligence from independent private sector developments. Option 1 would be preferred if signals intelligence were the only or even the predominant policy consideration.
Cons: The main disadvantages are the likely effects of blurring defense and intelligence and civilian interests, and raising tensions due to differences in needs. Option 1 would probably have a stultifying effect on private sector innovation. The latter problem is most likely to occur in cases where new developments of value to society are detrimental to intelligence operations. The absence of a Federal standard for public-key cryptography, in spite of its obvious need, is an example of the effect of such a conflict.

Option 2:
Pros: This option retains most of the advantages of Option 1 while retaining a civilian agency to interact with private sector users, vendors, and standards organizations. In this role, NBS would maintain an awareness and perhaps advocacy of the needs of civilian users.
Cons: Perhaps the most prominent shortcoming is the lack of clarity between the roles of NBS and NSA concerning information security. In the current situation, NBS has statutory responsibility for the development of computer security standards and for serving as the Government's representative in technical standards organizations. At the same time, NSDD-145 has assigned similar responsibilities to NSA, which is charged with reviewing and approving all standards, techniques, systems, and equipment for telecommunications and automated information systems security. This option also suffers from the problems of Option 1.

Option 3:
Pros: The division of responsibilities clarifies the roles of NBS and NSA, and provides for separation between defense and nondefense needs. This option also affords an opportunity to consolidate the Government nondefense needs with comparable needs of the private sector and to reduce tensions between defense and intelligence interests and those of the private sector.
Cons: The main shortcoming of this option concerns a lessening of NSA control of private sector innovation and its potential for damage to U.S. signals intelligence capabilities. This option also risks diluting a market that is already fragile by encouraging the adoption of different standards for defense and nondefense applications.

Option 3A:
Pros: Option 3A also would promote competition and private sector competence to meet its own needs and reduce tensions through increased Government dependence on and alignment with industry standards.
Cons: The main shortcoming, once again, concerns the potential damage to U.S. signals intelligence capabilities.

Option 3B:
Pros: The advantages are similar to those of Option 3A, but Option 3B further frees market forces and makes the Government dependent on the private sector rather than the other way around.
Cons: As in Option 3A, the main shortcoming is in potential damage to U.S. signals intelligence operations.

There are other factors for Congress to consider in evaluating the options. These include the resources required to carry out agency responsibilities under the various options, the need to carry out extensive coordination with commercial users and others in the development of standards, the ability to engender the trust of users, vendors, scientists, and others, and the ability to carry out needed research to benefit users generally.

It should also be recognized that NSA's technical expertise will be an important part of any of the options, e.g., evaluating safeguard techniques and equipment, especially those employing cryptographic methods.

As a practical matter, the resources available to NBS and NSA have not been comparable. NBS's budget for computer-related security standards has been about $10 million or less during recent years, with a staff of about 10 professionals, while NSA's National Computer Security Center alone employs some 300 people. (NSA's budget is classified.) For options in which NBS or NSA have a significant role in standards development, their efforts need to be coordinated with the needs and activities of the private sector. Although this study has not attempted to estimate the resource requirements under any of the options, some options would require changes in the funding levels of either or both NBS and NSA. In addition, it can be anticipated that any significant increase in responsibilities for the development of information safeguard technology will suffer from start-up problems, such as maintaining a high level of staff expertise, as has been the experience at NSA's National Computer Security Center.

There are a number of assumptions implicit in some of the options. One is that public acceptance of NBS standards would be based on the open scrutiny and consensual decisions that usually accompany the workings of civilian agencies. This assumption may not apply to NSA in a comparable standards-setting role, given the secretive way the agency normally operates and its unilateral decision to replace DES with a secretly developed algorithm.

None of the options makes allowance for conducting research. Yet, OTA's analysis indicates that society's evolving information needs depend on continuing innovations in safeguard technology. Based on observations of the rapid acceptance of DES and public-key cryptography for business applications, it seems clear that there are ready applications for innovations but a limited supply of them. For now, NSA is the main source of innovation in the Federal Government. However, its signals intelligence mission is likely to prevent the dissemination abroad of U.S. innovations. Because of this constraint, innovations generated by NSA may not be made available to the public at all.

Generally, there has been little motivation for industry to sponsor long-term research from which it cannot benefit on a proprietary basis. However, the quality of proprietary cryptography tends to be suspect by some U.S. critics.³ In this situation, the Government may decide to undertake research into selected safeguard technologies. Research into cryptographic technology is likely to raise concerns for national security if undertaken openly by NBS and concerns about public trust if undertaken secretly by NSA.

There is also the practical question of how effective restrictions imposed by the United States on its citizens might be if foreign innovations, publications, and product manufacture and export are not subject to comparable restraints.

Policy Observations

There are no options for Federal policy that clearly and simultaneously foster all national goals without harming some. The alternatives differ mainly in which Government agency leads in the development of safeguard technology, the level of Federal encouragement or control of private sector innovation, and in flexibility to adjust to changing needs of businesses and society.

Three main observations result from OTA's analysis:

1. None of the policy options simultaneously satisfies all objectives.
2. Excessive accommodation of either business or defense and intelligence concerns could damage overall U.S. interests.
3. A process for weighing competing national interests is needed. Centering policymaking in the Department of Defense alone, and in particular NSA, would make that difficult.

3 There are, however, indications that many Western European businesses find proprietary cryptography acceptable, according to consultant Cipher Deavours. OTA staff communications, May 1987.
4 Richard I. Polis, "European Needs and Attitudes Toward Information Security" (unpublished), Telecommunications Policy and Research Conference, Airlie, VA, Sept. 30, 1987.

Appendixes

Appendix A
Requesting Letters

NINETY-NINTH CONGRESS

As we all know, it has been possible for decades to record telephone conversations surreptitiously. Today's technology, however, makes possible data collection and monitoring with an ease and on a scale never before realized. Tracing incoming calls; automatically recording the numbers, times, and durations of all incoming and outgoing calls; and recording prearranged conference calls are now features of many telephone systems. Other information systems used by the Federal Government, such as personal computer work stations as well as mainframes and centralized computer service systems, provide even more avenues for surreptitious data collection.

Although modern information technology is an essential ingredient in today's environment, its misuse would pose grave threats to our individual freedoms. As a result, I request that OTA undertake a review to determine the potential for abuse of Federal telecommunications and related information systems, considering both current and expected future technology, and weigh the prospects for limiting such abuses through technology and/or legislation. I appreciate your prompt attention to this request.

June 3, 1985

Dr. John H. Gibbons
Director
Office of Technology Assessment
U.S. Congress
Washington, D.C. 20510

Dear Dr. Gibbons:

The Judiciary Committee's Subcommittee on Civil and Constitutional Rights, which I chair here in the House of Representatives, has long been concerned with the need to preserve traditional civil liberties in the context of advancing communications and computer technology.

It would be most helpful to have from your office a study of current and expected developments in information technology and the prospects for protecting privacy and other civil liberties through legislation or through the technology itself.

I look forward to receiving the benefits of your office’s expertise in this important area.

Sincerely,

Don Edwards
Chairman
Subcommittee on Civil and Constitutional Rights

Appendix B
National Policy on Protection of Sensitive, but Unclassified Information in Federal Government Telecommunications and Automated Information Systems

National Telecommunications and Information Systems Security Policy
NTISSP No. 2, Oct. 29, 1986

SECTION I - POLICY

Federal telecommunications and automated information systems handling sensitive, but unclassified information will protect such information commensurate with the level of risk and the magnitude of loss or harm that could result from disclosure, loss, misuse, alteration, or destruction.

SECTION II - DEFINITION

Sensitive, but unclassified information is information the disclosure, loss, misuse, alteration, or destruction of which could adversely affect national security or other Federal Government interests. National security interests are those unclassified matters that relate to the national defense or the foreign relations of the U.S. Government. Other government interests are those related, but not limited to the wide range of government or government-derived economic, human, financial, industrial, agricultural, technological, and law enforcement information, as well as the privacy or confidentiality of personal or commercial proprietary information provided to the U.S. Government by its citizens.

SECTION III - APPLICABILITY

This policy applies to all Federal Executive Branch Departments and Agencies, and entities, including their contractors, which electronically transfer, store, process, or communicate sensitive, but unclassified information.

SECTION IV - RESPONSIBILITIES

This policy assigns to the heads of Federal Government Departments and Agencies the responsibility to determine what information is sensitive, but unclassified and to provide systems protection of such information which is electronically communicated, transferred, processed, or stored on telecommunications and automated information systems. The Director of Central Intelligence shall, in addition, be responsible for identifying sensitive, but unclassified information bearing on intelligence sources and methods and for establishing the systems security handling procedures and the protection required for such information.

Federal Government Department and Agency heads shall:

a. Determine which of their department's or agency's information is sensitive, but unclassified and may warrant protection as sensitive during communications or processing via telecommunications or automated information systems. This determination should be based on the department's or agency's responsibilities, policies, and experience, and those requirements imposed by Federal statutes, as well as National Manager guidance on areas that potential adversaries have targeted;

b. Identify the systems which electronically process, store, transfer, or communicate sensitive, but unclassified information requiring protection;

c. Determine, in coordination with the National Manager, as appropriate, the threat to and the vulnerability of those identified systems; and

d. Develop, fund and implement telecommunications and automated information systems security to the extent consistent with their mission responsibilities and in coordination with the National Manager, as appropriate, to satisfy their security or protection requirements.

e. Ensure implementation of telecommunications and automated information systems security consistent with the procedures and safeguards set forth in OMB Circulars A-123 and A-130.

The National Manager shall, when requested, assist Federal Government Departments and Agencies to assess the threat to and vulnerability of targeted systems, to identify and document their telecommunications and automated information systems protection needs, and to develop the necessary security architectures.

Appendix C
The Data Encryption Standard

Background on Encryption

The algorithms currently in use to encrypt (or encipher) messages and data are based on sophisticated mathematics and are usually implemented using computers or dedicated microprocessors. Nevertheless, their underlying objective is quite simple and can be traced back to antiquity:1 to transform a message (or data) into a form that cannot be understood by anyone who does not possess special knowledge—the "key"—that unlocks the cipher and reveals the message.

Encryption takes a plaintext message and transforms it into a ciphertext (or encrypted) message using an encryption procedure and an encryption key. Thus, if P is the plaintext, E is the encryption procedure, and Ke is the encryption key, then the ciphertext, C, can be expressed mathematically as:

C = E(Ke, P).

The inverse process, decryption, given by D, transforms the ciphertext back into plaintext using the decryption key, Kd:

P = D(Kd, C).

In many encryption algorithms, the encryption and decryption keys are identical (Kd = Ke) and can be represented simply by K.2 The algorithm that is used in the Data Encryption Standard (DES) uses one key, K, which is called a "private key" because the key is kept secret to ensure that outsiders cannot use it to read enciphered messages.3

The strength of an encryption algorithm (or cipher) can be measured by its "work factor"—the amount of effort (number of steps and time) required to "break" the cipher and read any encrypted message without the key. An algorithm's strength can be described in terms of the kinds of "attacks" (attempts to break the cipher) it can withstand. The most difficult type of attack to withstand is called the "chosen plaintext attack." In this type of attack, an adversary is able to submit any amount of plaintext to the encryption algorithm and obtain the corresponding ciphertext. The (P,C) pairs can then be used to try to determine the secret key and break the cipher.

An encryption scheme that is used as a standard should be able to withstand chosen plaintext attacks, especially if the algorithms E and D are published as part of the standard. The strength of an encryption scheme is determined by the algorithm itself and by the complexity of the secret information (in the case of modern encryption schemes, by the length of the key). In general, longer keys (i.e., more digits or binary bits) correspond to a stronger cipher, but this is not necessarily the case: for a given algorithm, a shorter key weakens the cipher, but for different algorithms, one using a shorter key may be stronger overall than one using a longer key length.

The strength of any encryption scheme rests fundamentally on the integrity of the key(s) used. Therefore, proper key management is fundamental to the security provided by any encryption scheme or cipher.4

Evolution of the Data Encryption Standard

The Solicitation for a Standard

No single event or act of Congress led the Federal Government to adopt a published encryption standard for Federal agencies to protect their unclassified computer data and communications. Instead, a number of developments and concerns came together in the 1960s and 1970s that caused many people in and out of Government to conclude that a common means of protecting the Government's electronic information was needed.

One of these developments was the Brooks Act of 1965 (Public Law 89-306), which authorizes the National Bureau of Standards (NBS) to develop standards governing the purchase and use of computers by the Federal Government, to do research supporting the development of these standards, and to assist Federal agencies in implementing them. At the same time there was an increasing interest in ensuring the confidentiality and security of the Federal Government's computer files containing data on individual citizens.5 Additionally, electronic transactions, such as fund transfers, were beginning to proliferate both within the Federal Government and in the private sector. These trends gave impetus to growing concerns for the security of Federal electronic information and transactions. A consensus developed among computer security researchers at NBS and the National Security Agency (NSA) that a technical means should be developed for safeguarding them against accidental error as well as from assaults by organized crime. At the time, they anticipated that the useful lifetime of this safeguard technology would be about 30 years—until the late 1990s.6

NBS initiated a study in 1968 to evaluate the Federal Government's computer security needs. As a result, NBS decided in 1972 to develop a governmentwide standard for encrypting unclassified Government data using an encryption algorithm to be published as a public standard. NBS initiated a computer security program within its Institute for Computer Sciences and Technology (ICST) in mid-1972. In early 1973, NBS and NSA staff met to discuss the encryption project. Throughout the development of the standard, NBS made use of NSA's recognized expertise, including the evaluation of algorithms proposed for the standard. Also, some technical personnel left NSA and joined NBS during the early 1970s to staff the latter's new computer security program. A chronology of DES development, provided by NBS, is shown in table 15.

Table 15.—Chronology of DES Development (major Federal agency events)

Event — Date
- NBS identifies need for computer security standards — August 1971
- NBS initiates program in computer security — July 1972
- NBS meets with NSA on encryption project — February 1973
- NBS publishes request for encryption algorithms — May 1973
- NSA reports no suitable algorithms were submitted — December 1973
- NBS publishes second request for algorithms — August 1974
- NSA reports one submitted algorithm is acceptable — October 1974
- NSA approves publication of proposed algorithm — January 1975
- DOJ approves publication of proposed algorithm — February 1975
- NBS publishes proposed algorithm for comment — March 1975
- NBS publishes proposed DES for comment — August 1975
- NBS briefs DOJ on competition issues — February 1976
- NBS holds workshop on technology concerning DES — August 1976
- NBS holds workshop on mathematical foundation of DES — September 1976
- DOC approves DES as a FIPS — November 1976
- NBS publishes DES as FIPS PUB 46 — January 1977

SOURCE: National Bureau of Standards, circa 1978.

On May 15, 1973, NBS issued a solicitation through the Federal Register for interested parties to submit algorithms for possible consideration as a data encryption standard. There were few responses; none were considered suitable. A second solicitation was issued on August 27, 1974. IBM responded to the second solicitation; its algorithm eventually became the Data Encryption Standard (DES).

IBM had already done considerable work developing encryption algorithms. Prior to the solicitation for DES, IBM had developed and patented a 64-bit Cash Issuing Algorithm for safeguarding financial transactions and a 128-bit encryption algorithm called Lucifer.7 As part of the patenting process, IBM's algorithms were submitted to NSA for review to determine whether or not the algorithms should be classified. NSA chose not to classify the algorithms and suggested to IBM that one of them, with some modification, should be submitted to NBS.

This step in the process has given rise to a great deal of controversy over the years. Although the algorithm that IBM submitted to NBS was exactly that which was published later as the Data Encryption Standard, this algorithm differed from the original IBM algorithm in a couple of fundamental ways. These changes were made by IBM on the advice of NSA, which later led to questions as to whether NSA had "tampered" with the algorithm or weakened it in some way, perhaps creating a "trapdoor" that NSA could spring. First, the key length was shortened to 56 bits. Second, changes were made in the internal structure of the substitution functions—often referred to as the "S-boxes"—contained within the algorithm.8

In response to these concerns, NSA publicly stated that the reduced key size was sufficient for use in unclassified applications and, furthermore, that the IBM algorithm proposed for the data encryption standard was "to the best of their knowledge, free of any statistical or mathematical weakness."9 However, it was difficult for individuals outside of NBS, NSA, or IBM to independently substantiate (or refute) these statements. At the request of NSA, IBM had not disclosed all of the design criteria used in the creation of the candidate algorithm—in particular, those resulting from NSA's testing and evaluation of the original algorithm and the criteria that had been used to select the modified S-boxes and shorter key length. Thus, although the proposed DES was published for comment, not all of the evaluative criteria that had been used in developing the algorithm were made public.

Comments on the Proposed Standard

Comments on the proposed standard were solicited in the Federal Register on March 17 and August 1, 1975, and in an August 1, 1975 letter sent to all Federal Information Processing Standards points of contact in Federal agencies.10 NBS prepared an analysis of the comments from the three solicitations.11 According to NBS, "all responses have been carefully considered and changes made to the standard where appropriate. However, no changes have been made to the algorithm itself and no substantive changes have been made to the standard which would warrant further solicitation for comments."12 (See box F.)

Box F.—Data Encryption Standard: Summary of General Concerns

The following is a summary of the substantive general concerns about the proposed Data Encryption Standard stated in the comments received by NBS:
1. Computer equipment and related data processing equipment not based on a 64-bit architecture will be placed at a competitive disadvantage (A2, A3, A4, A5, A9, A10, B2, B5, B7, B10).
2. Certain types of communication systems may be degraded to a significant degree (A1, A2, A3, A4, A5, A10, A11, B2, B4, B5, B9).
3. The proposed algorithm is too complex, especially when implemented in software (A1, A3, A4, A8, A10, A11, B4, B9).
4. Applicability of the algorithm, including when and where to use it, is not specified (A11, B2, C2, C6).
5. The proposed standard does not contain information on electrical, mechanical and functional interfaces to devices implementing the standard (A2, A7, A9, B2, B5, B7, C2, C6).
6. Administrative procedures for validation, procurement and testing have not been described (A1, A4, A7, B2, C2, C6).
7. Policy for exporting devices implementing the proposed DES has not been made (B2).
8. The algorithm does not provide an adequate level of security (A2, A3, A4, A6, A8, B7, B9, B10).

SOURCE: "Analysis of Comments on the Data Encryption Standard," unpublished data available for public review at NBS.

One of the specific recommendations contained in the comments was that only hardware implementations should be considered. In response, NBS stated that "hardware is the only recommended implementation."13 Nevertheless, several software implementations of DES have been developed by vendors for use by the private sector; the first software simulation of DES (by Computation Planning, Inc.) was announced in November 1975.

The previously mentioned controversy and debate concerning the strength of the proposed standard and NSA's role in its development continued through the 1970s. To address some of these concerns, NBS sponsored two workshops on DES and also briefed the Department of Justice concerning possible competition issues involving the proposed standard. The first workshop, held in August 1976, addressed the technical and economic feasibility of constructing a special-purpose computer to attack DES through computational brute force. The second workshop, held in September 1976, addressed the mathematical foundations of the DES algorithm. Although the outcome of these workshops was to allay most fears that DES was not sound or could be inexpensively broken by brute force before the 1990s, participants expressed concerns that it had not been possible to assess all of the design characteristics of DES because some had not been made public.14

Also, in late 1975, Congressman Jack Brooks (D-Tex.), writing in response to the solicitation for comments, asked whether NSA had put undue influence on NBS in setting the security level of DES and what the NBS role had been in DES development and key generation. Prompted by Brooks' inquiry, the Senate Select Committee on Intelligence staff ultimately responded, after a classified inquiry, that there had been no undue NSA influence.15 At the time, NBS stated that, although it would provide guidance and good techniques for individual Federal agencies to generate their own DES keys in accordance with Federal Information Processing Standards (FIPS), no Government agency should generate keys for other agencies or for the private sector.

Promulgation of the Standard

The Department of Commerce approved DES as a standard in November 1976. NBS published it as FIPS PUB 46 in January 1977, with the provision that DES would be reviewed for continued suitability at 5-year intervals and would be recertified (or not) every 5 years by NSA. DES was last recertified in 1982.

The administrative and technical workloads associated with the development and promulgation of DES were substantial—for NBS; other Federal agencies; the private sector (including vendors, the banking community, university researchers, and others); and for Congress, its staff, and support agencies. According to NBS, DES consumed some 3 man-years of effort for DES-related interactions alone by 1978, exclusive of IBM technical development of the algorithm. Although exact statistics were not compiled, these interactions included a conference at NBS and some 2,000 technical and policy meetings, telephone discussions, and mail contacts. A 3-year projection of continued interactions more than tripled the man-year estimates. Developing the standards to support DES—for use in communications, data storage, message authentication, user/terminal authentication, physical security, magnetic stripe encryption, and key management—consumed an estimated 6½ man-years at NBS and another 34 man-years elsewhere between 1977 and 1980.16

One estimate of the total (administrative, technical, test, and validation) DES-related costs through 1977 amounted to about $515,000 for NBS, some $6 to $10 million for IBM, about $460,000 for NSA, and around $1.5 million for other users and vendors. The estimated NBS support cost for DES during the period 1978-80 was more than $800,000.

As of January 1987, about 20 industry vendors had produced one or more versions of hardware or firmware devices (chips) implementing the DES algorithm, for use in their own products or for sale to other manufacturers. And, as of that date, NBS had validated 28 implementations of the DES algorithm in hardware or firmware, produced by 11 vendors.

NBS, which takes the position that software implementations of DES would not comply with the Federal standard, only validates electronic devices (hardware or firmware) implementing the DES algorithm. The rationale is that hardware implementations are faster than software and that they are thought to be more reliable and harder for an adversary to modify "behind the user's back."17 Software implementations of DES are being marketed, but are not validated or certified for Government use. Also, some vendors choose not to submit their hardware or firmware DES products for validation or certification for Government use. According to NBS staff, the Department of Defense is one of the largest single Federal customers for DES-based devices. Figure 22 shows the roles of NBS and GSA in DES-based product validation and procurement.

Figure 22.—DES Validation and Procurement
[Flowchart showing NBS validation requests, pass/fail certification, and GSA approval for agency procurement; diagram not reproducible in this text version. SOURCE: National Bureau of Standards/Institute for Computer Sciences and Technology.]

Description of DES

A short technical summary of the encryption algorithm used in DES is given in figure 13 and box B of chapter 4. Complete technical descriptions of the four DES modes of operation, including initialization and error propagation properties and use for message authentication, may be found in FIPS Publications 74, 81, and 113, issued by NBS.18 Diagrams of DES modes of operation, taken from NBS publications, are given in figure 23.

1 See, for example, David Kahn, The Codebreakers: The Story of Secret Writing (New York, NY: The MacMillan Co., 1967).
2 These are called symmetric encryption algorithms. Asymmetric ciphers also exist, such as the "public-key" algorithms. (See the discussions in ch. 4 and app. D.)
3 See R.C. Summers, "An Overview of Computer Security," IBM Systems Journal, vol. 23, No. 4, 1984, pp. 309-325.
4 See the discussion of key management in ch. 4.
5 These concerns were addressed in the Privacy Act of 1974, for example. For more background on DES, see: U.S. Senate Select Committee on Intelligence, "Unclassified Summary: Involvement of NSA in the Development of the Data Encryption Standard" (Staff Report), 95th Cong., 2d sess., April 1978.
6 D. Branstad, NBS ICST, private communication with OTA staff, Aug. 6, 1986.
7 For a discussion of Lucifer and a description of the algorithm, see: Horst Feistel, "Cryptography and Computer Privacy," Scientific American, vol. 228, No. 5, May 1973, pp. 15-23.
8 These were some of a set of allegations to the effect that NSA was improperly involved in the development of DES and was attempting to exert undue influence on university and private-sector cryptological research. The Senate Select Committee on Intelligence conducted a classified investigation of these allegations. Among its findings was that: "NSA did not tamper with the design of the algorithm in any way. IBM invented and designed the algorithm, made all pertinent decisions regarding it, and concurred that the agreed upon key size was more than adequate for all commercial applications for which the DES was intended." U.S. Senate, Select Committee on Intelligence, "Unclassified Summary: Involvement of NSA in the Development of the Data Encryption Standard" (Staff Report), 95th Cong., 2d sess., April 1978, p. 4. Others contend that the modifications that were made to the S-boxes improved them and also were, at least in part, intended to minimize their logic to permit a smaller chip size when DES was implemented in hardware.
9 U.S. Senate Select Committee on Intelligence, April 1978, op. cit.
10 The Department of Commerce/NBS published the proposed Data Encryption Algorithm in the Federal Register on Mar. 17, 1975 (vol. 40, No. 52, p. 12134 et seq.) and solicited comments to be submitted to NBS by May 16, 1975.
11 "Analysis of Comments on the Data Encryption Standard," NBS/ICST (n.d., circa 1978). In total, 18 industry, 10 Federal, and 1 congressional source responded. Copies of all comments received by NBS and NBS' responses are available for public review at NBS.
12 Ibid.
13 Ibid.
14 "Computer Encryption and the National Security Agency Connection," Science, vol. 197, July 29, 1977, pp. 438-440.
15 U.S. Senate Select Committee on Intelligence, April 1978, op. cit.
16 Source: Unpublished estimates developed at NBS in the late 1970s.
17 At the same time, it is worth noting that software implementations of high-quality encryption are much more difficult to control in terms of their dissemination and exportability. Because the DES algorithm is published, almost anyone with the requisite technical skills can produce software versions of it; producing microprocessor-based implementations is more difficult. The new NSA secret algorithms are easier to control because they are not published.
18 U.S. Department of Commerce, National Bureau of Standards: "Guidelines for Implementing and Using the NBS Data Encryption Standard," FIPS PUB 74, Apr. 1, 1981; "DES Modes of Operation," FIPS PUB 81, Dec. 2, 1980; and "Computer Data Authentication," FIPS PUB 113, May 30, 1985.
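The symmetric-key relationships described above, C = E(Ke, P) and P = D(Kd, C) with Kd = Ke, can be illustrated in a few lines of code. The sketch below is a deliberately insecure toy (a repeating-key XOR transformation, not DES or any real cipher); the function and key names are hypothetical. Its only purpose is to show that a single shared secret key performs both encryption and decryption.

```python
# Toy illustration of C = E(K, P) and P = D(K, C) for a symmetric cipher.
# NOT DES and NOT secure: a repeating-key XOR stands in for the algorithm.

def keystream(key: bytes, length: int) -> bytes:
    """Repeat the key until it covers the message (illustrative only)."""
    reps = length // len(key) + 1
    return (key * reps)[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """C = E(K, P): combine each plaintext byte with a key byte via XOR."""
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    """P = D(K, C): XOR is its own inverse, so the same key undoes E."""
    return encrypt(key, ciphertext)

if __name__ == "__main__":
    K = b"shared secret key"            # hypothetical key held by both parties
    P = b"ELECTRONIC FUNDS TRANSFER"
    C = encrypt(K, P)
    assert C != P                       # ciphertext is unintelligible without K
    assert decrypt(K, C) == P           # the same key recovers the plaintext
```

Because encryption and decryption share one secret, anyone holding K can both read and forge messages, which is exactly the limitation the report raises later when contrasting symmetric ciphers with public-key systems.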

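The "work factor" discussed above can also be made concrete. The sketch below mounts a known-plaintext exhaustive key search against a hypothetical 16-bit toy cipher (again XOR-based, not DES), using a single (P, C) pair to recover the key. The point is the arithmetic: a 16-bit keyspace has 2^16 = 65,536 keys and falls instantly, while DES's 56-bit keyspace has 2^56 (about 7.2 × 10^16) keys, which is why the shortening of the key from 128 to 56 bits was so contentious.

```python
# Known-plaintext brute-force key search against a toy 16-bit "cipher."
# Hypothetical illustration only; the real DES keyspace is 2**56 keys.

def encrypt16(key: int, block: int) -> int:
    """Toy cipher (NOT DES): XOR a 16-bit block with a 16-bit key."""
    return block ^ key

def brute_force(p: int, c: int):
    """Try every possible key until E(K, p) == c; return the matching key."""
    for k in range(2**16):
        if encrypt16(k, p) == c:
            return k
    return None

if __name__ == "__main__":
    secret_key = 0xBEEF                      # unknown to the attacker
    p = 0x1234
    c = encrypt16(secret_key, p)             # attacker observes the (P, C) pair
    assert brute_force(p, c) == secret_key   # 2**16 trials suffice here
    print("toy keyspace:", 2**16, "DES keyspace:", 2**56)
```

Note that for this toy cipher the search is trivial; a well-designed algorithm forces the attacker to pay the full keyspace cost, which is the sense in which key length bounds, but does not guarantee, an algorithm's work factor.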
Figure 23.—DES Modes of Operation
[Block diagrams of the DES modes of operation—electronic codebook (ECB) encryption and decryption, cipher block chaining, and cipher feedback with K-bit shift, select/discard, and feedback paths—are not reproducible in this text version; see the NBS publications cited above.]
Appendix D
Message Authentication, Public-Key Ciphers, and Digital Signatures

Message Authentication

An "authentic" message is one that has arrived exactly as it was sent (without errors or alterations), is not a replay of a previous message, and comes from the stated source (not forged or falsified by an imposter or fraudulently altered by the recipient).[1] Encipherment in itself does not automatically authenticate a message: it protects against passive eavesdropping automatically, but does not protect against some forms of active attack.[2]

Encipherment algorithms can be used to authenticate messages, however, and the algorithm used in the Data Encryption Standard (DES) is the most widely used cryptographic basis for message authentication. As discussed in more detail later, there are profound differences between using a symmetric cipher, such as the current DES algorithm, and an asymmetric one like the RSA algorithm, named after its inventors: Ronald Rivest, Adi Shamir, and Leonard Adelman. Use of a symmetric cipher for message authentication can only protect against third parties and not against fraud by either the sender or receiver (both of whom know the secret key), while an asymmetric algorithm can be used to resolve disputes between the sender-receiver pair.

As the uses of electronic media for financial and business transactions have proliferated, message authentication techniques have become increasingly important and have evolved from simple pencil-and-paper calculations to sophisticated, high-speed hardware processors.

In general, the various message authentication schemes that are used can be grouped according to whether they are based on public knowledge or, at least in part, on secret knowledge. Among the former are message authentication using check fields,[3] parity checks,[4] and cyclic redundancy checks.[5] These share a common weakness: unauthorized or fraudulent modifications may go undetected because they are accompanied by matching, yet fraudulent, authentication parameters that can be calculated by unauthorized parties because the authentication parameters are not secret.

[1] For a thorough discussion of message authentication and the various techniques used to authenticate messages, see Davies & Price, Security for Computer Networks: An Introduction to Data Security in Teleprocessing and Electronic Funds Transfer (New York, NY: J. Wiley & Sons, 1984), ch. 5. The descriptions of authentication techniques in this section follow Davies & Price closely.

[2] Davies & Price describe passive attack as eavesdropping and active attack as the falsification of data and transactions through such means as: 1) alteration, deletion, or addition; 2) changing the apparent origin of the message; 3) changing the actual destination of the message; 4) altering the sequence of blocks of data or items in the message; 5) replaying previously transmitted or stored data to create a new false message; or 6) falsifying an acknowledgment for a genuine message. (See Davies & Price, op. cit., pp. 119-120.)

[3] Check-field techniques are designed to ensure that stored or transmitted information has been prepared correctly. A check field is a simple function of the numerical characters in the important fields of the message that makes it highly likely that the most common types of mistakes and errors will be detected. The use of check fields does not safeguard against deliberate fraud by data-entry operators or others; because the check sum function and the data used to create it are not secret, a data-entry operator could calculate the "correct" check sum and transmit it along with a fraudulently altered message. Also, it is possible to generate false messages that have the same calculated check sum value as the original message. (See Davies & Price, op. cit., pp. 121-122.)

[4] Parity checks can be used to detect accidental errors during transmission, usually either by using an eighth "parity" bit with each seven-bit message word or by providing longitudinal parity checks using modulo-2 addition on successive words. Parity checks are weak in that multiple errors and/or missing blocks can sometimes go undetected. (See Davies & Price, op. cit., p. 122.)

[5] According to Davies & Price, cyclic redundancy checks are the best-known error detection method and offer strong protection against accidental error. However, the procedure for creating the check data via a predetermined polynomial is public knowledge; therefore, the checks do not provide strong protection against an active attack. In this form of message authentication, the cyclic check operation calculates the check data by dividing the polynomial formed by a block of message bits by the predetermined check polynomial and using the "remainder" from the polynomial division as a check field. The check field is appended to the message block and transmitted with the message. Upon its receipt, the recipient performs the same polynomial division operation on the message and compares the remainder with the transmitted check field to authenticate the message. The cyclic check does not protect against active attack because the means of creating the check data (the polynomial division operation) is not secret. An active wiretapper can divert the message, alter it, calculate a new check field using the correct predetermined polynomial, and retransmit the altered message with the new check field appended. The message will appear to be authentic when the recipient compares the check fields. (See Davies & Price, op. cit., pp. 122-123.)
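The weakness shared by these public check schemes can be sketched in a few lines of Python. This is a toy illustration only: the check function (a digit sum modulo an arbitrary constant) and the messages are invented for the example, but the point is the one made in the text: because the check function is public, an active attacker can recompute a matching check value for an altered message.

```python
# Toy public check-field scheme: detects accidental errors, not fraud.
def check_field(message: str) -> int:
    """Sum of the numeric characters, modulo 97 (an arbitrary toy choice)."""
    return sum(int(c) for c in message if c.isdigit()) % 97

def verify(message: str, check: int) -> bool:
    return check_field(message) == check

original = "PAY 0100 DOLLARS TO ACCT 4421"
sent = (original, check_field(original))

# An active attacker alters the amount and simply recomputes the check
# field, because nothing in the scheme is secret.
forged = "PAY 9100 DOLLARS TO ACCT 4421"
intercepted = (forged, check_field(forged))

print(verify(*sent))         # True: genuine message verifies
print(verify(*intercepted))  # True: forged message also verifies
```

The recipient cannot distinguish the forged transfer from the genuine one, which is why the schemes that follow introduce secret parameters.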

Using secret authentication parameters known only to the sender and receiver permits a stronger form of message authentication, because the check field data cannot be forged by a third party unless the secret parameters are compromised. A different secret parameter is usually required for each sender-receiver pair. The logistics of distributing this secret information to the correct parties are analogous to key distribution for encryption. Compromise of the secret parameters invalidates the integrity of the safeguarding function because it could then be forged. (See ch. 4.)

In the most general sense, an authentication function based on secret knowledge can be constructed from a public authenticator algorithm and a secret authentication key.[6] Examples of authenticators based on secret keys include the Decimal Shift and Add (DSA) algorithm,[7] and the proprietary S.W.I.F.T. (Society for Worldwide Interbank Financial Telecommunications) and Data Seal algorithms, which use binary arithmetic.[8] Although authentication can be based on encryption (the DES algorithm, for example, is widely used for message authentication), the strict requirements for an authenticator algorithm differ from those for an encryption algorithm, because authentication does not require the existence of an inverse function (i.e., decryption). It is also possible to base message authentication on secret numbers used in conjunction with special one-way functions.[9]

Encryption alone is not always sufficient to completely authenticate a message. If decryption of an encrypted message yields "sensible" plaintext, without garbled portions, then there is reasonable certainty that the message was originated by the other "authorized" holder of the secret key. However, some types of message alterations can go undetected.[10] A more robust authentication method (than DES encryption alone, for example) is to use DES hardware in an appropriate mode of operation in order to create a message authentication code.[11]

When DES hardware is used for authentication, it is operated in either the cipher block chaining (CBC) or cipher feedback (CFB) mode; the chaining or feedback operation ensures that the authenticator, selected from the last state of the DES hardware output register, is a function of the entire message stream input to the DES device.[12] The authenticator is appended to the message and transmitted along with it. The recipient removes the authenticator from the received message and calculates his or her own value of the authenticator using the secret key and initialization vector shared with the sender. If the two authenticator values are the same, then there is increased assurance that the message is authentic.

The message can be transmitted in plaintext without compromising its authenticity. If the message is altered by a third party who does not use the secret DES key to calculate a forged authenticator to append to the altered message, then the authenticator calculated by the receiver will not have the same value as the one transmitted with the message. However, because both sender and receiver know the secret parameters, either could fraudulently alter the message and deny having done so. This type of dispute can be resolved through use of an asymmetric cipher, as discussed below in the sections on public-key ciphers and digital signatures.

If privacy as well as authentication is required, then one scheme for encrypting and authenticating a message involves sequential use of DES with two different secret keys: one to calculate the authenticator (called the message authentication code, or MAC), and one to encrypt the message. These operations can be performed in either order; the ANSI X9.23 standard requires that the MAC be calculated before encryption.[13]

[6] For mathematical and functional descriptions of authenticator functions, see: Davies & Price, op. cit., pp. 123-135; and R.R. Jueneman, S.M. Matyas, and C.H. Meyer, "Message Authentication," IEEE Communications Magazine, vol. 23, No. 9, September 1985.

[7] DSA is based on parallel computations performed by the sender and receiver; the starting point for the computations are two secret 10-digit decimal numbers. The message to be authenticated is treated as a string of decimal digits; thus DSA requires that alphabetic characters be encoded into numeric form, although the numeric content of a message (e.g., the value of a financial transaction) does not require any conversion. According to Davies & Price (see pp. 127-130 for an example of DSA), the algorithm can be implemented using a programmable decimal calculator.

[8] Because these are proprietary, they are not available for use as a published standard.

[9] A one-way function has the special property that, although the function itself is relatively easy to compute, its inverse is quite difficult to compute; i.e., even if the values of authenticators for many messages are known, it is almost impossible to recover the text of a message given only the value of its authenticator. Some, but not all, "hash functions" (functions that appear to generate random outputs from nonrandom inputs) are suitable for message authentication.

[11] Strictly speaking, any block encryption algorithm could be used. However, in practice, the cipher used is DES because the algorithm is readily available in hardware form. The DES algorithm is relatively slow in software form, which makes the hardware form much more convenient for data transmission.

[12] In the CBC mode, the authenticator is the most significant n bits from the last block output by the device. In the CFB mode, the DES device is operated one additional time after the last message block is input, and the authenticator is selected as the most significant n bits of the final output block. The length of the authenticator (usually 32 bits for EFT authentication, according to the standard) is determined jointly by the sender and receiver.
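The CBC-style authenticator described above can be sketched as follows. DES itself is not in the Python standard library, so a keyed 64-bit mixing function stands in for the DES hardware; the structure (chaining each plaintext block into the next encryption, then selecting the most significant bits of the final output as the MAC) follows the text, but the stand-in cipher is an assumption for illustration only and is not secure.

```python
# Sketch of a CBC-mode message authentication code, with a toy 64-bit
# keyed permutation standing in for one DES encryption operation.
MASK64 = 0xFFFFFFFFFFFFFFFF

def toy_block_cipher(block: int, key: int) -> int:
    """A keyed, invertible 64-bit mixing function (NOT DES, NOT secure)."""
    x = (block ^ key) & MASK64
    for _ in range(4):
        x = (x * 6364136223846793005 + key) & MASK64
        x ^= x >> 29
    return x

def cbc_mac(message: bytes, key: int, iv: int = 0, mac_bits: int = 32) -> int:
    """Chain all 64-bit blocks; keep the most significant mac_bits bits."""
    message += b"\x00" * (-len(message) % 8)  # pad to a whole block
    state = iv
    for i in range(0, len(message), 8):
        block = int.from_bytes(message[i:i + 8], "big")
        state = toy_block_cipher(block ^ state, key)  # CBC chaining step
    return state >> (64 - mac_bits)

key = 0x0123456789ABCDEF
msg = b"TRANSFER 100 TO ACCT 42"
mac = cbc_mac(msg, key)

# The receiver, sharing the key and IV, recomputes and compares.
print(cbc_mac(msg, key) == mac)   # True: authentic
# An altered message almost certainly yields a different authenticator:
print(cbc_mac(b"TRANSFER 900 TO ACCT 42", key) == mac)
```

As the text notes, this protects only against third parties: anyone who holds the shared key can compute a valid MAC for any message.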
Even the MAC and encryption do not safeguard against replay of messages (e.g., electronic fund transfers). Therefore, various message sequence numbers or date and time stamps are usually incorporated into the text of the message. The ANSI X9.9 standard requires a message identifier field to prevent replay.

Public-Key Ciphers

A symmetric cipher is an encryption method using one key, known to both the sender and receiver of a message, that is used both to encrypt and to decrypt a message. Obviously, the strength of a symmetric cipher depends on both parties keeping the key secret from others. With the DES cipher, for example, for which the algorithm is known, revealing the encryption key permits the message to be read by any third party.

An asymmetric cipher is an encryption scheme using a pair of keys, one to encrypt and a second to decrypt a message.[14] A special class of asymmetric ciphers are public-key ciphers, for which the encrypting key need not be kept secret to ensure private communication.[15] Rather, Party A can publicly announce his encrypting key, PKA, allowing anyone who wishes to communicate privately with him to use it to encrypt a message. Party A's decrypting key, SKA, is kept secret, so that only A (or someone else who has obtained the secret decrypting key) can easily convert messages encrypted with PKA back into plaintext.[16]

Knowing the public encryption key, even when the encrypted message is also available, does not make computing the secret decrypting key easy, so that in practice only the authorized holder of the secret key can read the encrypted message.[17] However, with the encrypting key being publicly known, a properly encrypted message can come from any source, so there is no guarantee of its authenticity.

It is also crucial that the public encrypting key be authentic. An imposter could publish his own key, PKI, and, for example, pretend it came from A in order to read messages to A, which he could intercept and then read using his own SKI. Therefore, the strength of the public-key cipher rests on the authenticity of the public key and the secrecy of the private key. A variant of a public-key system allows a sender to authenticate messages by "signing" them using an encrypting key which (supposedly) is known only to him. This is a very strong means of authentication and is discussed further in the following section on digital signatures.

Davies and Price[18] review and illustrate the functional requirements for a general public-key cryptosystem. A brief overview follows here, but a detailed description of the underlying mathematics is beyond the scope of this appendix.

If encipherment is performed by some function E(Ke) and decipherment by another, D(Kd), then in order to make the decipherment function the inverse of encipherment, the encrypting key, Ke, and the decrypting key, Kd, must be related somehow. Suppose both keys are derived from a randomly selected starting key, or seed key, Ks, such that Ke = F(Ks) and Kd = G(Ks), where the functions F, G, D, and E defined above are published. Party A would then select a Ks (which would be kept secret), use it to calculate Ke and Kd, and publish Ke as his public key, PKA, while keeping Kd secret as the secret key, SKA.

If P is the plaintext message and C is the encrypted message, then C = E(P) and P = D(E(P)); that is, D(E(P)) must be the inverse of E. However, because E is really the function E(Ke) and is public, the function E must not be readily invertible, or else an opponent can readily calculate P given C and E.

[10] See Davies & Price, op. cit., pp. 134-135, for examples. Jueneman, et al., also discuss strengths and weaknesses of various authentication and manipulation detection techniques.

[13] If the MAC checks the ciphertext, then an adversary is able to mount a known plaintext attack on the key used for authentication. If, however, the MAC checks the plaintext, then an adversary must break both the MAC key and the encryption key in order to send fraudulent messages.

[14] See Davies & Price, op. cit., ch. 8, for a more complete discussion of asymmetric and public-key ciphers. A discussion of the underlying principles of public-key ciphers, including examples of the RSA and knapsack algorithms, is given in: Martin E. Hellman, "The Mathematics of Public-Key Cryptography," Scientific American, vol. 241, No. 2, August 1979, pp. 146-157. A pictorial example of the RSA public-key method can be found in Computer Security (one of the Understanding Computers series) (Alexandria, VA: Time-Life Books, 1986), pp. 112-117.

[15] The public-key concept was first proposed by Whitfield Diffie and Martin Hellman in "New Directions in Cryptography," IEEE Trans. Information Theory, IT-22, No. 6, November 1976, pp. 644-654. Diffie and Hellman also described how such a public-key cryptosystem could be used to "sign" individual messages and to simplify the distribution of secret keys. Their work was the basis for Rivest, Shamir, and Adelman's practical implementation of such a system in 1978, called the RSA cipher. Some ciphers proposed for public-key systems have subsequently been broken. For example, the Diffie-Hellman "discrete exponential" cipher was broken several years later by Donald Coppersmith of IBM (G. Kolata, "Another Promising Code Falls," Science, vol. 226, Dec. 16, 1983, p. 1224). The "trap-door knapsack" cipher, another public-key cipher proposed in 1976 by Hellman and Ralph Merkle, was broken by Shamir and Adelman in 1982. (See Computer Security, op. cit., pp. 100-101; and Hellman's 1979 article in Scientific American.)

[16] This section uses the notation PK for "public key" (usually, the encrypting key) and SK for "secret key" (usually, the decrypting key). For A and B to have two-way communication, two pairs of keys are required: the "public" encryption keys PKA and PKB, and the secret decryption keys SKA and SKB.

[17] Use of the two keys might also be used to separate access to "read" and "write" data functions. For example, by controlling dissemination of the encryption key, one might control write access; by controlling dissemination of the decryption key, read access.

[18] Davies & Price, op. cit., ch. 8.
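The "easy to compute, hard to invert" property can be illustrated on a toy scale with the modular exponential a^x mod p: the forward direction is fast even for enormous numbers, while the only generic way to invert it (recover x, the discrete logarithm) is exhaustive search. The numbers below are invented for illustration and are far too small for any real use.

```python
# One-way function illustration: modular exponentiation vs. its inverse.
p, a = 100003, 2    # a small prime modulus and base (toy values)
x = 54321           # the "secret" exponent
y = pow(a, x, p)    # easy direction: fast even for very large p and x

def discrete_log_by_search(a, y, p):
    """Invert the exponential the only generic way: try every exponent."""
    value = 1
    for candidate in range(p):
        if value == y:
            return candidate
        value = (value * a) % p
    return None

recovered = discrete_log_by_search(a, y, p)
# The search succeeds, but only after stepping through tens of thousands
# of candidate exponents; the forward pow() call above is near-instant.
print(pow(a, recovered, p) == y)  # True
```

For a cryptographic-size modulus (hundreds of digits), the forward computation remains fast while this search becomes utterly infeasible, which is exactly the asymmetry the text requires of E.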

This property is described as making E (and also the function F, which generates Ke from Ks) one-way functions that cannot readily be inverted.

The requirements that E be a noninvertible, one-way function and that D(E(P)) be its inverse are reconcilable when E is not invertible without knowledge of Ke, but the inverse of E is readily obtained using the secret key Kd. Thus, knowledge of the secret key is a "trapdoor" which makes the inverse of E simple to implement. E(P), where E = E(Ke), is a one-way function with a trapdoor D(E(P)), which allows it to be inverted. Knowledge of Kd springs the trapdoor.[19]

A public-key cryptosystem consists of encryption and decryption functions, together with methods for generating pairs of keys from the random seed values. The one-way property of a "one-way function," such as E, is really only a matter of computational complexity. The encryption function should be relatively easy to carry out, given Ke and E, but given the ciphertext C = E(P), the plaintext P = D(E(P)) should be very hard to calculate and should require a very large number of steps, unless Kd is known.

In principle, it should be possible to calculate values of C for many values of P and then to sort and tabulate the pairs (Pi, Ci) to obtain an explicit inversion of E. Because this type of exhaustive search process requires a large number of computational steps and a large computer memory, both of which grow exponentially with the key size, E is effectively a one-way function if the explicit inversion requires a very large number of (P, C) pairs.

Like all of modern cryptography, public-key cryptosystems rely heavily on mathematics and, in particular, on number theory. The RSA cipher is based on modular arithmetic,[20] and the trapdoor knapsack cipher is based on combinatorial mathematics as well. The mathematical problems underlying the RSA cipher and the knapsack public-key cipher belong to a class of problems called "nondeterministic, polynomial-time problems," or NP problems. The computational burden of finding a solution to the hardest NP problems, using published methods, grows very rapidly as the size of the problem increases. It is strongly believed (but not proved) that the burden must grow very rapidly, no matter what method of solution is used. However, once the solution is found, it can be checked very easily.[21] Even so, it is possible that advances in mathematics and computer science may undermine public-key cryptosystems based on "one-way" functions. One instance of this was the "breaking" of the trapdoor knapsack cipher. Box G describes this cipher.

The knapsack cipher system was thought to be effectively unbreakable (computationally but not unconditionally secure), and Merkle issued an open challenge to cryptologists in 1976 to break it. In 1982, Adi Shamir at the Massachusetts Institute of Technology (MIT) broke the cipher analytically. Soon afterward, Leonard Adelman, a former colleague of Shamir, used Shamir's method and an Apple II computer to break an example of the knapsack cipher.

Another public-key system, called the RSA system, was announced in 1978. The RSA system is computationally more complex to implement than the trapdoor knapsack cipher, and it has not yet been broken. Also, the RSA system does not expand the plaintext message the way the knapsack cipher does. Message expansion occurs with the knapsack cipher because the sum of the "hard" knapsack vector used in the knapsack public key is larger than the sum of the "easy" vector used in the secret key. Therefore, more binary bits are required to represent the ciphertext than to represent the plaintext.

The RSA Public-Key System

The RSA public-key system is thought to be the most computationally secure, commercially available public-key system.

[19] See Davies & Price, op. cit., ch. 8, for a more thorough explanation. Diffie and Hellman introduced the concept of trapdoor one-way functions in their 1976 paper (op. cit.), but did not present any examples. In their 1978 paper, Rivest, Shamir, and Adelman presented their implementation of a public-key system using a one-way trapdoor function. See also R.L. Rivest, A. Shamir, and L. Adelman, "A Method for Obtaining Digital Signatures and Public-Key Cryptosystems," Communications of the ACM, vol. 21, No. 2, February 1978; and Hellman's article in the August 1979 issue of Scientific American, op. cit.

[20] Finite arithmetic with modulus m (modular arithmetic) has the operations of addition, multiplication, subtraction, and division defined. In modulus 10 arithmetic, for example, the number 57 would be represented by its remainder, 7: 57/10 = 5, with a remainder of 7. More generally, the remainder always has a value between 0 and (m-1), where m is the modulus. Thus, in modulus 3 arithmetic, 57 would be represented by the remainder of 0; in modulus 11 arithmetic, by the remainder of 2; etc. A prime number (e.g., 3, 5, or 7 in modulo 10 arithmetic) has no factors other than 1 and itself. Finite arithmetic with a prime modulus p has the additional property that multiplication always has an inverse. This property is crucial for cryptography. The exponential function a^x in modulus p arithmetic is valuable as a one-way function: calculating the exponential y = a^x is easy, but calculating its inverse, x = log_a(y), is difficult for large p.

[21] See Hellman's article in the August 1979 Scientific American, op. cit.
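The public-key pattern described above, announce PKA and keep SKA, can be made concrete with a toy RSA-style key pair. The numbers below (a modulus built from the primes 61 and 53) are invented for illustration and are many orders of magnitude too small for real security.

```python
# Party A's keys (toy values): n = 61 * 53 = 3233, e = 17, d = 2753,
# where e * d = 1 modulo (61-1)*(53-1) = 3120.
PK_A = (3233, 17)    # published: anyone may encrypt with this
SK_A = (3233, 2753)  # kept secret: only A can decrypt

def encrypt(pk, plaintext_block: int) -> int:
    n, e = pk
    return pow(plaintext_block, e, n)   # C = E(P)

def decrypt(sk, ciphertext_block: int) -> int:
    n, d = sk
    return pow(ciphertext_block, d, n)  # P = D(C), the inverse of E

P = 1234                 # a plaintext block, an integer in [0, n-1]
C = encrypt(PK_A, P)     # any sender can do this using only PK_A
print(decrypt(SK_A, C))  # prints 1234; only the holder of SK_A can do this
```

Note that, as the text warns, nothing here authenticates the sender: anyone who knows PK_A can produce a well-formed ciphertext for A.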

Box G. The Trapdoor Knapsack Cipher

The trapdoor knapsack system was proposed by Ralph Merkle and Martin Hellman in 1976.[1] The "knapsack puzzle" or "knapsack problem" is well-known in mathematics and can be summarized briefly as the following problem: Suppose you are given a set of N weights of assorted (and known) integer sizes. You want to use them to balance a knapsack that holds an unknown combination of the same weights. You are given the values of the set of integer weights (called the knapsack vector) and the weight to be matched (called the knapsack total). Find the subset of the N weights that will exactly balance the knapsack!

Although it is possible to find examples of this problem that are fairly easy to solve by exhaustive search (when N is a small number, or when each weight is heavier than the sum of the preceding weights, for example), all of the known methods of solving the general knapsack problem have a computational requirement that grows exponentially in the key size and, therefore, are impossible to implement for reasonably large key sizes.

An exhaustive search is quite lengthy for large N. Suppose that N is 5. Then the knapsack vector has five components (one corresponding to each weight), each of which could be equal to 1 (the weight is used to try to balance the knapsack) or 0 (the weight is not used in this try). There are 2^5, or 32, possible vectors to be tried in an exhaustive search. If N were 10, up to 1,024 tries would be covered in an exhaustive search. If N were 1,000, an exhaustive search would clearly be infeasible.

Sometimes the problem is posed differently, as a cylindrical knapsack of fixed length and a set of rods of different lengths, with the problem being to find the subset of rods that will completely fill the knapsack. In either case, the problem is an example of the general class of NP problems.

The "trapdoor" knapsack problem is a special case, which is not computationally difficult to solve provided that one has special information that enables the problem to be solved more easily than in the general case. Here the "trapdoor" enables the intended recipient, who knows the secret key (the trapdoor information), to solve the knapsack problem and reveal the plaintext message without having to do an exhaustive search.

The intended receiver and originator of the public and secret keys builds a secret structure into the knapsack problem. The receiver generates the keys by first generating an "easy" knapsack vector, called a super-increasing vector, in which each weight is larger than the sum of the preceding weights. The sum of the super-increasing knapsack vector is the heaviest possible knapsack. The receiver then chooses a modulus m larger than this maximum weight and a multiplier a such that m and a are relatively prime. The "hard" knapsack vector is constructed by multiplying the "easy" vector by a, using modulus m arithmetic. The "hard" knapsack vector, arranged in order of increasing weight, forms the public (encryption) key. The "easy" vector, along with the values used for m and a, are kept as the secret key.

Merkle and Hellman's public-key system was based on special key pairs that were used to encrypt and decrypt plaintext. Briefly (see Computer Security, pp. 100-101), each character in the plaintext was assigned a numerical value and all the numbers were then summed together. The secret key enabled the individual numbers to be recovered and, from them, the plaintext.
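The key construction Box G describes can be worked through at toy scale. The vector and parameters below are invented for illustration (real keys used hundreds of much larger weights): an "easy" super-increasing vector is disguised by modular multiplication to form the "hard" public vector, and the trapdoor values (a, m, and the easy vector) let the receiver decrypt with a greedy pass instead of an exhaustive search.

```python
# Toy Merkle-Hellman-style knapsack key pair and one encrypted block.
easy = [2, 3, 7, 14, 30]            # super-increasing: each > sum of earlier
m = 57                              # modulus, larger than sum(easy) = 56
a = 17                              # multiplier, relatively prime to m
hard = [(a * w) % m for w in easy]  # public key vector: [34, 51, 5, 10, 54]

def encrypt(bits, public_vector):
    """Ciphertext is the knapsack total of the selected public weights."""
    return sum(w for bit, w in zip(bits, public_vector) if bit)

def decrypt(total, easy_vector, a, m):
    a_inv = pow(a, -1, m)            # modular inverse of the multiplier
    t = (total * a_inv) % m          # map total back to the easy knapsack
    bits = []
    for w in reversed(easy_vector):  # greedy solve, largest weight first:
        if w <= t:                   # super-increasing weights make this
            bits.append(1)           # choice unambiguous
            t -= w
        else:
            bits.append(0)
    return list(reversed(bits))

plaintext_bits = [1, 0, 1, 0, 1]     # one 5-bit block of plaintext
c = encrypt(plaintext_bits, hard)    # 34 + 5 + 54 = 93
print(decrypt(c, easy, a, m))        # prints [1, 0, 1, 0, 1]
```

An eavesdropper sees only the hard vector and the total 93, and faces the general knapsack problem; the receiver's modular inverse reduces it to the easy, super-increasing case.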

[Box G, footnote 1] For the history of the trapdoor knapsack system, see Computer Security, one of the Understanding Computers series (Alexandria, VA: Time-Life Books, 1986), pp. 100-101; Davies & Price, Security for Computer Networks: An Introduction to Data Security in Teleprocessing and Electronic Funds Transfer (New York, NY: J. Wiley & Sons, 1984), p. 251; and Martin E. Hellman, "The Mathematics of Public-Key Cryptography," Scientific American, vol. 241, No. 2, August 1979, p. 148.

The RSA system also enables the problem of disputes between sender and receiver to be resolved through the method of digital signatures.[22]

The RSA system is based on a problem that is even older than the knapsack problem: that of "factoring" a large number (finding all the prime numbers that divide it evenly).[23] This is another computationally "one-way" problem, in that factoring a large number takes much longer (by hand or by computer) than does verifying that two or more numbers are prime factors of the same large number.

The proprietary RSA system is thought to be based on the relative difficulty of finding two large prime numbers, given their product. The recipient (and originator of the key pair) randomly selects two large prime numbers, called p and q. These prime numbers are kept secret. Another (odd) integer e is chosen, which must pass a special mathematical test based on the values of p and q. The product, n, of p times q, and the value of e, are announced as the public encryption key. Even though their product is announced publicly, the prime factors p and q are not readily obtained from n. Therefore, revealing the product of p and q does not compromise the secret key, which is computed from the individual values of p and q.[24]

The RSA public (encrypting) key consists of two integers, n and e, where n is the product n = (p)(q). If each block of the plaintext message is represented as an integer between 0 and (n-1), then encryption is accomplished by raising the plaintext to the eth power, modulus n.

The secret (signing and/or decrypting) key, d, is computed from p, q, and e as the "multiplicative inverse" of e, modulus (p-1)(q-1); that is, the product of d and e is 1, modulus (p-1)(q-1). Thus, individual knowledge of p and q is thought to be necessary to create the secret key. Decryption is accomplished by raising the ciphertext to the dth power, modulus n.

It is possible to break the RSA cipher if the prime factors p and q can be determined by factoring the value of n that was given in the public key. Many factoring algorithms exist, some based on the work of Fermat, Legendre, and Gauss. Others have been developed more recently to take advantage of computers and special processors to do fast factorization.

Depending on the factorization method used, it is possible to estimate the number of computational steps required to factor a number, n. The number of steps and the speed with which they can be performed determine the time required to factor n. The number of steps required to factor n (thus, the work and time required to "break" the RSA cipher by the factorization approach, finding p and q) increases rapidly as the number of digits in n increases.[25]

[22] Other public-key systems have been developed, some earlier than RSA, but they have not gained as wide an acceptance in commercial markets. There continue to be new developments in public-key cryptography (see, e.g., S. Goldwasser, S. Micali, and R. Rivest, MIT Laboratory for Computer Science, "A Digital Signature Scheme Secure Against Adaptive Chosen Message Attack," Rev. Apr. 23, 1986), but some of these are of more academic interest than immediate practicability for safeguarding communications.

[23] For discussions of the underlying mathematics, see Davies & Price, op. cit., ch. 8; Rivest, Shamir, and Adelman in the February 1978 Communications of the ACM; and Hellman in the August 1979 Scientific American, op. cit. Computer Security, op. cit., pp. 112-115, has an illustrated example. The relationship between the RSA exponential functions used to encipher and decipher follows from an identity due to Euler and Fermat which demonstrates the properties that e and d must have, given p and q.

[24] Certain special values of (p)(q) can be factored easily (when p and q are nearly equal, for instance). These special cases need to be avoided in selecting suitable keys.

[25] According to Davies & Price, op. cit., pp. 242-244, theory shows that, for the better-known factorization algorithms, the relationship between the number of steps and n is exponential. In the early 1980s, experimental work doing fast factorization using a number of techniques, including special processors, pointed to a "limit" of 70 to 80 decimal digits for factoring in one day. Advances in theory and in microprocessor and computer technologies can serve to make estimates of this type obsolete. For example, Rivest, Shamir, and Adelman's 1978 article in the Communications of the ACM (February 1978, p. 125) provided a table estimating the number of operations required to factor n using the fastest-known method then. Assuming that a computer was used and that each operation took one microsecond, the authors estimated that a 50-decimal-digit value of n could be factored in under 4 hours, a 75-digit value in 104 days, a 100-digit value in 74 years, and that a 200-decimal-digit n would require almost 4 billion years to factor.

RSA cipher is that the desired level of security (measured by the work required to break the cipher, or its "work factor") can be tailored across a wide range of levels simply by varying the number of digits in the key.

Davies and Price report on a new factorization method using parallel computation by special, large-scale integration (LSI) devices. Assuming that the parallel LSI devices use 128-bit arithmetic and run off a 10-MHz clock, a 100-decimal-digit value of n would take a little over 2 years to factor, a 150-digit n would take 6,300 years, and a 200-digit n would take 860,000 years.26 Current implementations of the cipher use keys of 200 digits or longer; that is, the number n has 200 or more decimal digits.

Rivest, Shamir, and Adleman formed RSA Data Security, Inc., in 1982 and obtained an exclusive license for their invention from MIT, which owned the patent.27 RSA Data Security has developed proprietary software packages implementing the RSA cipher on personal computer networks. These packages, being marketed commercially, provide software-based communication safeguards, including message authentication, digital signatures, key management, and encryption. Another offering is designed to safeguard data files and spreadsheets transmitted among intelligent workstations and over electronic mail networks, as well as files stored locally. The RSA Data Security package that safeguards electronic mail and spreadsheets sells for about $250, including one copy of the program, a key-generating program, and a registered and authenticated user identification number.

Digital Signatures

Encryption or message authentication alone can only safeguard a communication or transaction against the actions of third parties. They cannot fully protect one of the communicating parties from fraudulent actions by the other (forgery or repudiation of a message or transaction, for example) and cannot in themselves resolve contractual disputes between the two parties. Paper-based systems have long relied on mechanisms like letters of introduction for identification of the parties, signatures for authenticating a letter or contract, and sealing a (physical) envelope for privacy. The contractual value of paper documents hinges on the recognized legal validity of the signature and on the laws against forgery.

It is possible to provide equivalent functions for electronic documents by using a digital signature to authenticate the contents and also prove who originated the document (because only one party knows the secret information used to create the signature). A digital signature can be created using a symmetric cipher (such as DES), but in general asymmetric ciphers provide for more efficient operations. The digital signature method in most common use commercially is based on the RSA cipher.28 The digital signature can be used with encipherment if privacy is required.

The equivalent of a "letter of introduction" is still necessary to verify that the correct public key is used to check the digital signature, since an adversary might try to spoof the signature system by substituting his or her own public key and signature for the real author's. This letter of introduction could be accomplished by several means. The RSA Data Security system provides "signed key server certificates" by attaching the corporation's own digital signature to the users' public keys, so that users can attach their certified public signature keys to their signed messages. Note that although a public-key cipher system is used to set up the digital signature system, the actual text of the message can be sent in plaintext, if desired, or it can be encrypted using DES or the public-key cipher.29

App. D—Message Authentication, Public-Key Ciphers, and Digital Signatures

The RSA Data Security digital signature system works as follows. First, a cryptographic "hashing" algorithm creates a shorter, 128-bit "digest" of the message. The message digest, similar to a checksum, is virtually unique30 to each text. If a single bit of the plaintext message is altered, the message digest will change substantially. A one-way hashing function is used to prevent the document from being reconstructed from the digest.

Next, the message digest is encrypted with the author's secret key.31 In the RSA system, each key is the inverse of the other; that is, each key can decipher text enciphered with the other key. Therefore, using Party A's public key to decipher a message into sensible plaintext proves that Party A's secret key was used to encipher the message. The integrity of this system hinges on preventing the secret key from being compromised and ensuring that an imposter does not post his own public key and pretend that it is the real Party A's.

The enciphered message digest is attached to the text and both are sent to the intended recipient. The recipient removes the appended message digest and runs the text of the message through the same hashing function to produce his own copy of the message digest. Then, the recipient deciphers the message digest that was sent along with the message, using the supposed author's public key. If the two message digests are identical, then the message did indeed come from the supposed author and the contents of the text were received exactly as sent, unless someone has learned the author's secret key and used it to forge a message digest for a message of his own or one that he has altered.

If the author wants to keep the text of the message private, so that only the intended recipient can read it, he or she can encrypt the signed message using the recipient's public key. Then, the recipient first uses his or her own secret key to decrypt the signed message before going through the procedure described above.

26 See Davies & Price, op. cit., pp. 243-244.
27 Other university research in cryptography has also been patented and licensed. For instance, Stanford University has four cryptography patents available for licensing on a non-exclusive basis, for a wide range of potential applications (including protection of tape and disk drives; time-shared computers; satellite, microwave, and mobile radio communications equipment; computer terminals; and electronic banking). Stanford University considers that one of these patents (Public Key Cryptographic Apparatus and Method, U.S. Patent #4,218,582, Aug. 19, 1980, Martin E. Hellman and Ralph C. Merkle) covers any public-key system in any implementation. Source: Letter dated 9/29/86 to OTA from Lisa Kuuttila, Stanford Office of Technology Licensing, Re: Stanford Dockets S77-012, -015, -048; S78-080, "Encryption Technology."
28 See Davies & Price, op. cit., ch. 9, for a general treatment of digital signatures and alternative methods.
29 For example, if the RSA digital signature is used to sign and encrypt, the sender's secret key is used to sign the message and the intended recipient's public key is used to encrypt the message. The recipient uses his secret key to decrypt the message and the sender's public key to check the signature. In practice, the RSA digital signature system is used to transmit a DES key for use in encrypting the text of a message, because DES can be implemented in hardware and is much faster than using the RSA algorithm to encrypt text in software.
30 According to the vendor, the probability of two different plaintexts having the same message digest is on the order of one in a trillion.
31 Note that for ordinary encryption to preserve privacy, the recipient's public key is the one used to encrypt.
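The sign-and-verify procedure described above can be sketched in a few lines of Python. Everything here is an illustrative stand-in, not the commercial system: the RSA parameters are deliberately tiny, and a simple rolling checksum plays the role of the 128-bit cryptographic message digest.

```python
# Toy author key pair (real moduli have 200 or more decimal digits).
p, q = 61, 53
n = p * q                            # 3233
e = 17                               # author's public key
d = pow(e, -1, (p - 1) * (q - 1))    # author's secret key (Python 3.8+)

def digest(text: bytes) -> int:
    # Stand-in for the one-way hashing function: a short "message digest".
    h = 0
    for byte in text:
        h = (h * 31 + byte) % n
    return h

def sign(text: bytes) -> int:
    # Encipher the digest with the author's secret key.
    return pow(digest(text), d, n)

def verify(text: bytes, signature: int) -> bool:
    # Recompute the digest, decipher the signature with the author's
    # public key, and compare the two digests.
    return pow(signature, e, n) == digest(text)

msg = b"Pay Party B $100"
sig = sign(msg)
assert verify(msg, sig)                      # digests match: accepted
assert not verify(b"Pay Party B $900", sig)  # altered text: rejected
```

If privacy is also wanted, the signed message would additionally be enciphered under the recipient's public key, and verification would follow decryption; in practice, as the text notes, the public-key system often just carries a DES key that encrypts the message body.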
Appendix E

Acronyms

ABA –American Bankers Association
ACE –American Council on Education
AICPA –American Institute of Certified Public Accountants
AITRC –Applied Information Technologies Research Center
ANSI –American National Standards Institute
ATM –Automatic Teller Machine
AT&T –American Telephone & Telegraph Co.
BAI –Bank Administration Institute
BJS –Bureau of Justice Statistics
CBC –Cipher Block Chaining
CCEP –Commercial Communications Endorsement Program
CFB –Cipher Feedback
CHIPS –Clearing House Interbank Payments System
CIA –Central Intelligence Agency
COMSEC –Communications Security
DCA –Defense Communications Agency
DCTN –Defense Commercial Telecommunications Network
DES –Data Encryption Standard
DIS –Defense Investigative Service
DoD –Department of Defense
DSA –Decimal Shift and Add [Algorithm]
DTS –Digital Termination System
EAR –Export Administration Regulations
EBDI –Electronic Business Data Interchange
EDI –Electronic Data Interchange
EDP –Electronic Data Processing
EFT –Electronic Fund Transfer
EIA –Electronic Industries Association
ESVN –Executive Secure Voice Network
FBI –Federal Bureau of Investigation
FCC –Federal Communications Commission
FIPS –Federal Information Processing Standards
FTS –Federal Telecommunications Service
GAO –General Accounting Office
GSA –General Services Administration
HBO –Home Box Office
ICST –Institute for Computer Sciences and Technology
IEEE –Institute of Electrical and Electronics Engineers
ISDN –Integrated Services Digital Network
ISO –International Organization for Standardization
ISSA –Information Systems Security Association
ITAR –International Traffic in Arms Regulation
LAN –Local Area Network
LSI –Large Scale Integration
MAC –Message Authentication Code
MAN –Metropolitan Area Network
NASA –National Aeronautics and Space Administration
NBS –National Bureau of Standards
NCSC –National Computer Security Center
NSA –National Security Agency
NSC –National Security Council
NSDD –National Security Decision Directive
NSF –National Science Foundation
NTIA –National Telecommunications and Information Administration
NTISSC –National Telecommunications and Information Systems Security Committee
OMC –Office of Munitions Control
OSI –Open System Interconnection
OTA –Office of Technology Assessment
PC –Personal Computer
PIN –Personal Identification Number
SDNS –Secure Data Network System
STU-III –Secure Telephone Unit III
S.W.I.F.T. –Society for Worldwide Interbank Financial Telecommunications
TDCC –Transportation Data Coordinating Committee
UCS –Uniform Communication Standard
VLSI –Very Large Scale Integrated Circuits
WINS –Warehouse Information Network Standards

Appendix F

Contributors and Reviewers

CONTRIBUTORS

Kenneth Allen, Information Industry Association
Peter S. Brown, Profile Analysis Corp.
Thomas Burke, Security Magazine, Cahners Publishing Co.
Lawrence D. Dietz, The Alec Group
Robert C. Elander, Citicorp/Quadstar
Nancy Floyd, Citicorp/Quadstar
George F. Flynn, Jr., Director, Special Programs Division, General Services Administration
Cathi Kachurik, Director, X3 Secretariat, CBEMA
Eleanor Kask, Time Life Books
Lisa Kuuttila, Stanford University Office of Technology Licensing
Christine Martin, Atalla Corp.
G.T. McCoy, Office of the General Counsel, NASA Headquarters
Terri McGuire, Frost & Sullivan, Inc.
Glenn McLaughlin, Congressional Research Service, Library of Congress
Charles Miller, Director of Public Affairs, AT&T
Donn B. Parker, Senior Management Consultant, SRI International
Stuart Personick, Bell Communications Research
David Peyton, Information Industry Association
Jeremy Ross, Time Life Books
Joseph Smaldone, Chief, Licensing Division, Office of Munitions Control, Department of State
Jon P. Stairs, Director, Electronic Services, Information Resources Management Service, General Services Administration
Michael Thompson, Chief, Technical Services, General Services Administration
Denise Ulmer, Chemical Bank
Marjolijn Van der Velde, Bank Administration Institute
Steven Weiland, Bank Administration Institute
Florence Young, Advisor, Division of Federal Reserve Bank Operations, Board of Governors of the Federal Reserve System

EXTERNAL REVIEWERS

Lara H. Baker,* Los Alamos National Laboratory
Jim Bidzos, RSA Data Security, Inc.
Dennis Branstad, National Bureau of Standards
Randolph W. Bricker, Jr., Principal, Booz-Allen & Hamilton, Inc.
Richard Case, AFIPS
David Chaum, Centre for Mathematics & Computer Science
Morey J. Chick, Manager, U.S. General Accounting Office
John Clement, AFIPS
Michael Corby, AFIPS
Prentice Cress, General Services Administration
Harold E. Daniels, Jr., NSA
Cipher A. Deavours
Jack Dennis, Assistant Director, Division of Federal Reserve Bank Operations, Board of Governors of the Federal Reserve System
Whitfield Diffie, Bell Northern Research

*Retired. Replaced by Lawrence Wills. **Ex officio.


Jim Dray, Digital Equipment Corp.
Martin Ferris, Program Analyst, U.S. Treasury
Diane Fountaine
Mark Frankel, AAAS
Henry Geller, Consultant
Steven Gould, AAAS
Edward Hall, American Bankers Association
Martin Hellman, Stanford University
Cheryl W. Helsing, Vice President, Bank of America
Robert Jueneman, Computer Science Corp.
David Kahn, Consultant
Stuart W. Katzke, Chief, System Components Division, National Bureau of Standards
Ron Keelan, Data Security Program, IBM Corp.
Richard Kemmerer, AFIPS
Robert F. Kempf, Associate General Counsel for Intellectual Property Law, NASA Headquarters
Stanley Kurzban, IBM Corp.
Donald C. Latham, Assistant Secretary for Command, Control, Communications and Intelligence, Department of Defense
Paul Lemme, Transportation Data Coordinating Committee
Robert Massey (Retired), National Telecommunications and Information Administration, and Army Intelligence & Security Command
Vincent McLellan, InfoWeek
Robert Meadows, American Bankers Association
Charles Miller, Director of Public Affairs, AT&T
William Murray, Consultant
David B. Newman, Jr., George Washington University
Robert Park, American Physical Society
David Peyton, Information Industry Association
Jon Postel, AFIPS
Charles A. Pulfrey, Director, Program Development, Unisys
Harold Relyea, Government Division, Congressional Research Service, Library of Congress
John M. Richardson, IEEE CCIP
Ron Rivest, MIT Laboratory for Computer Science
Peter Schweitzer, Consultant
Heinz Seipel, Science Counselor, Embassy of the Federal Republic of Germany
Miles Smid, National Bureau of Standards
Edward Springer, OMB OIRA
Dennis Steinauer, NBS
Rein Turn, AFIPS
Mitchel B. Wallerstein, Associate Executive Director, Office of International Affairs, NAS
Eddie L. Zeitler, Vice President, Information Systems Security Division, Security Pacific National Bank

CONTRACTORS

Bell Communications Research: Ernst Brickell, Daniel Collins, Daniel Hochvert, Kenneth Hopper, R.J. Keevers, Bruce Leary, Stewart Personick, Philip Porter, Marc Schare, Howard Stearns, Richard Wolff
Ernst & Whinney: Robert Linnemanstons, Catherine Travis, David R. Wilson
Information Security, Inc.: Noel Matchett
Personal Identification News: Benjamin L. Miller
Ross Engineering, Inc.: James A. Ross
Consultant: Peter Schweitzer


OTA STAFF REVIEWERS

Michael Callaham, International Security and Commerce Program
Vary Coates, Communication and Information Technologies Program
Priscilla Regan, Communication and Information Technologies Program
Fred Wood, Communication and Information Technologies Program