The Mathematics Community and the NSA

POST-PUBLICATION EDITOR'S NOTE: This article is a part of the ongoing series "Mathematicians Discuss the Snowden Revelations". At the time of the writing of this piece Michael Wertheimer was the Director of Research at the NSA; he recently retired from that position. He can be reached at [email protected].

This is the latest installment in the Notices discussion of the National Security Agency (NSA). Previous Notices pieces on this topic are: "AMS Should Sever Ties with the NSA" (Letter to the Editor), by Alexander Beilinson (December 2013); "Dear NSA: Long-Term Security Depends on Freedom", by Stefan Forcey (January 2014); "The NSA Backdoor to NIST", by Thomas C. Hales (February 2014); "The NSA: A Betrayal of Trust", by Keith Devlin (June/July 2014); "The Mathematical Community and the National Security Agency", by Andrew Odlyzko (June/July 2014); "NSA and the Snowden Issues", by Richard George (August 2014); "The Danger of Success", by William Binney (September 2014); "Opposing an NSA Boycott" (Letter to the Editor), by Roger Schlafly (November 2014). See also the Letters to the Editor in this issue.

Unsolicited submissions on this topic are welcome. Inquiries and submissions may be sent to [email protected]. Articles of 800 words or less are preferred. Those of 400 words or less can be considered as Letters to the Editor and should be sent to [email protected].

— Allyn Jackson
Notices Deputy Editor
[email protected]

Encryption and the NSA Role in International Standards

Michael Wertheimer

Michael Wertheimer is Director of Research at the National Security Agency. His email address is [email protected]. DOI: http://dx.doi.org/10.1090/noti1213

Over the past several months a discussion about the role of mathematics, mathematicians, and the activities of the National Security Agency has been hosted on the pages of the Notices. As an NSA mathematician I would like to provide some context to what has been reported in the press and share with the American Mathematical Society important facts and information. In particular I would like to address two hot-button issues shaping this conversation: "weakening" Internet encryption and impacts of data on privacy.

The US National Institute for Standards and Technology (NIST), the American National Standards Institute (ANSI), the Internet Engineering Task Force (IETF), and the International Standards Organization (ISO) are the four main bodies with which the NSA participates in developing standards for cryptography. NSA has worked with each of these for over twenty-five years. We value and are committed to the important work of these groups in producing secure cryptographic standards that protect global communications. NSA has a long and documented record of providing security enhancements to openly published international standards. However, recently our work has been questioned in several standards that are elliptic curve based, the most significant of which is an NIST-proposed random number generator that I discuss below.

NSA mathematicians remain steadfast in advocating secure international standards. Nevertheless, we are mindful that there has been considerable discussion regarding NIST publication SP 800-90A. This publication is entitled "Recommendation for Random Number Generation Using Deterministic Random Bit Generators" and contains specifications for four pseudorandom number generators for use in cryptographic applications. One of these describes a particular random number generator associated with NSA: the Dual Elliptic Curve Deterministic Random Bit Generator (Dual_EC_DRBG). The discussion centers on NSA's role in the design and advocacy for this algorithm despite a mathematical demonstration of the potential for a trapdoor.

A trapdoor, simply put, is information that allows the inverse of a seemingly one-way function to be computed easily; in other words, it allows one to compute x from f(x). In cryptographic applications, functions f are specifically designed to make the x to f(x) computation very fast but the inverse computation intractable (hence the term one-way). If an attacker knows "secret" information about f that allows an inverse to be calculated, the attacker might be able to decrypt messages or, in the case of the Dual_EC_DRBG, predict future outputs.
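To make the trapdoor concern concrete, here is a minimal sketch in Python of a generator with the same two-parameter structure as the Dual_EC_DRBG. It is an analogue, not the real algorithm: elliptic-curve point multiplication is replaced by modular exponentiation, the output truncation of the actual specification is omitted, and the modulus p, the parameters P and Q, the secret exponent e, and the seed are made-up toy values. It illustrates the point above: whoever knows the hidden relation between the two public parameters can turn a single observed output into every future output.

    # Toy analogue of the Dual_EC_DRBG trapdoor concern (illustration only):
    # state update s -> P^s and output Q^s in the multiplicative group mod a
    # prime, standing in for scalar multiplication on an elliptic curve.
    p = 2**61 - 1        # small prime modulus, far too small for real use
    Q = 3                # public parameter used to produce outputs
    e = 123456789        # hypothetical trapdoor relating the two parameters
    P = pow(Q, e, p)     # public parameter used to update the state: P = Q^e mod p

    def generate(state, n):
        """Run the toy generator for n steps and return its outputs."""
        outputs = []
        for _ in range(n):
            state = pow(P, state, p)            # state update (stands in for s*P)
            outputs.append(pow(Q, state, p))    # output (stands in for x(s*Q))
        return outputs

    def predict(observed, k):
        """Knowing e, one full output Q^s reveals the next state (Q^s)^e = P^s,
        after which every later output can be reproduced."""
        state = pow(observed, e, p)
        predicted = []
        for _ in range(k):
            predicted.append(pow(Q, state, p))  # output for the recovered state
            state = pow(P, state, p)            # then advance it
        return predicted

    outs = generate(987654321, 5)
    assert predict(outs[0], 4) == outs[1:]      # all later outputs recovered

In the real algorithm the recovery works with curve points and must also brute-force the truncated bits, but the shape of the concern is the same. It is also, roughly, what the verifiable random point generation discussed below is meant to rule out: parameters derived from a published seed by hashing leave no party in a position to know such a relation.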
During the development of the ANSI standard based on the NIST publication, members of X9F1 (the ANSI-approved working group responsible for cryptographic tools) raised concerns about the potential that elliptic curve points used as parameters for the Dual_EC_DRBG could harbor a trapdoor secret known only to, and exploitable only by, the person who generated the points. As a result, the X9F1 committee expanded the standard to include verifiable random point generation. Since the NSA was using the algorithm at the time and had generated elliptic curve points for protecting Department of Defense users, the NSA-generated points were included in the standard. In other words, any implementation that used the NSA-generated points would be deemed compliant. Shortly thereafter, NIST negotiated with ANSI to use the ANSI Random Number Generation Standard as the basis for an NIST Random Number Generation Standard. ANSI also approved submitting a version of this standard to the ISO.

In 2007 several Microsoft researchers, including Microsoft's representative to the ANSI X9F1 committee that created the ANSI version of the standard, raised concerns in a talk at a cryptographic conference about the trapdoor potential in the Dual_EC_DRBG. These concerns were picked up by the media and widely disseminated. NIST and ANSI reviewed this information and elected to retain both the verifiable point generation scheme and the NSA-generated points.

In 2013 the same concerns were again raised and promulgated by the media. This time NSA's actions were portrayed as a subversion of standards. However, the facts remain:

• The Dual_EC_DRBG was one of four random number generators in the NIST standard; it is neither required nor the default.

• The NSA-generated elliptic curve points were necessary for accreditation of the Dual_EC_DRBG but only had to be implemented for actual use in certain DoD applications.

• The trapdoor concerns were openly studied by ANSI X9F1, NIST, and by the public in 2007.

With hindsight, NSA should have ceased supporting the Dual_EC_DRBG algorithm immediately after security researchers discovered the potential for a trapdoor. In truth, I can think of no better way to describe our failure to drop support for the Dual_EC_DRBG algorithm than as regrettable. The costs to the Defense Department to deploy a new algorithm were not an adequate reason to sustain our support for a questionable algorithm. Indeed, we support NIST's April 2014 decision to remove the algorithm. Furthermore, we realize that our advocacy for the Dual_EC_DRBG casts suspicion on the broader body of work NSA has done to promote secure standards. Indeed, some colleagues have extrapolated this single action to allege that NSA has a broader agenda to "undermine Internet encryption." A fair reading of our track record speaks otherwise. Nevertheless, we understand that NSA must be much more transparent in its standards work and act according to that transparency. That effort can begin with the AMS now.

NSA strongly endorses the NIST outline for cryptographic standards development, which can be found at csrc.nist.gov/groups/ST/crypto-review/process.html. One significant, and correct, change is that all NSA comments will be in writing and published for review. In other words, we will be open and transparent about our cryptographic contributions to standards. In addition, we will publish algorithms before they are considered for standardization to allow more time for public scrutiny (as we did recently with the new SIMON and SPECK algorithms, eprint.iacr.org/2013/404.pdf). With these measures in place, even those not disposed to trust NSA's motives can determine for themselves the appropriateness of our submissions, and we will continue to advocate for better security in open-source software, such as Security Enhancements for Linux and Security Enhancements for Android (selinuxproject.org).

We hope this open affirmation and our adherence to it will chart a course that all mathematicians will agree is appropriate and correct.

Data and Privacy

NSA mathematicians carry on a long and storied tradition of making and breaking codes and ciphers. Perhaps most celebrated are the feats that our forebears, American and Allied, made in breaking German and Japanese ciphers in World War II. Ironically, less than 5 percent of the encrypted material collected during that war was successfully decrypted, and of that amount only a scant fraction contributed to any sort of measurable action. Such is the nature of intelligence.

Today's communications environment makes 5 percent appear staggeringly large. The simple act of using a particular encryption algorithm no longer identifies the sender or receiver (as the ENIGMA cipher did in World War II); the variety of protocols, products, and services for secure communications numbers in the thousands; and the ease and frequency of changing identifiable features is unprecedented. To achieve our foreign intelligence mission lawfully and effectively, NSA mathematicians lead efforts that determine how we "filter," "select," and "process" data while continuously verifying that our processes and pro- […] by a complex set of laws, policies, and implementing rules. This type of data, lawfully obtained and properly evaluated, helps us to avoid surprise. It is used to discover new threats, refine both our filters and selectors, and ultimately create a rising tide that lifts our intelligence insights and privacy protections. Mathematicians are leading the way to design and implement the algorithms that create this rising tide.