Public Key Cryptography's Impact on Society: How Diffie and Hellman Changed the World


Paul C. van Oorschot∗

Abstract. In 1975 and 1976, Whitfield Diffie and Martin Hellman conceived and introduced fundamental new methods that changed how communications are secured. Their landmark paper “New Directions in Cryptography” explained both public key cryptography and what would become known as Diffie-Hellman key exchange. These ideas, influenced and augmented by a few souls within a small community, set the world on a new course by establishing novel cryptographic techniques for protecting information transmitted over untrusted channels. Our aim herein is to consider how public key cryptography has changed the world, and in particular its impact on society. We review the original contributions of Diffie and Hellman, and provide context to relate these to pre-existing and subsequent cryptographic techniques. Aided by this understanding, we connect their contributions to resulting major changes in society. To retain accessibility for non-specialists, our treatment largely avoids mathematical details, while selectively introducing technical terms to maintain technical accuracy.

∗14 April 2020. A version of this paper is to appear in a forthcoming ACM book.

1 Security background

We begin with some basic concepts and terminology to develop a working vocabulary. When information is transmitted over a physical channel (physical line) such as a traditional phone line, cable, or optical fibre, the line may be physically shielded or isolated to reduce the risk of unauthorized access, such as by a physical wiretap. If such a communication channel is accessible to unintended parties, it is called an open or untrusted channel. In general, ordinary information (plaintext) sent over untrusted channels is at risk of interception. For example, plaintext sent over a radio channel is accessible to anyone with a suitable wireless receiver.

A common defense is to convert plaintext characters into a related sequence of characters (ciphertext) that are not meaningful even if intercepted. To do so, at the sender’s end a sequence of instructions (called an encryption algorithm) is used to convert plaintext to ciphertext, which is then transmitted. To recover the plaintext, the operation is reversed at the receiver’s end by a decryption algorithm.

In this way, encryption provides a confidentiality property, whereby the meaningful content is available only to authorized parties. Unauthorized parties cannot recover the plaintext because the encryption and decryption algorithms require a secret number, which may be viewed as a random string of 0s and 1s; 128 of these would be called a 128-bit cryptographic key. The aim is that only the sender and recipient (i.e., their computing devices) share this secret key.1 Historically, decryption requires the same key as used for encryption; in this case we use the terms symmetric-key algorithms and symmetric keys.

1Viewing the encryption process as an algorithm, the key is a required input parameter. For a given plaintext message, a different key results in a different ciphertext.
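To make the symmetric-key model concrete, consider a minimal sketch in Python. The choice of cipher here (the Fernet recipe from the third-party cryptography package) is purely illustrative and is not discussed in the paper; any symmetric cipher would make the same point, namely that one shared key both encrypts and decrypts.

```python
# A minimal sketch of symmetric-key encryption, assuming the third-party
# "cryptography" package is installed (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # the shared secret: sender and recipient must both hold this

ciphertext = Fernet(key).encrypt(b"meet at noon")  # sender's side
plaintext = Fernet(key).decrypt(ciphertext)        # receiver's side, same key
assert plaintext == b"meet at noon"
```

Without the key, the ciphertext is unintelligible; with it, decryption is trivial. That asymmetry of knowledge is what the confidentiality property rests on.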
Distinct from confidentiality or secrecy is the concept of authentication. The ability to recognize individuals (entity authentication) is taken for granted in human-to-human interactions, but is more challenging in written communications. (Can you be sure that the postcard you have received is legitimately from your sister?) To provide a modest degree of authentication beyond relying solely on context and semantic content, we conventionally end handwritten letters with personal signatures. Historically, various alternatives have been used to provide arguably stronger assurances. One example is wax seals created using recognizable signet rings, in some cases affixed alongside a handwritten signature, and in other cases on an exterior envelope, thereby also providing a signal in case the envelope has been opened and/or the content altered. Such means serve to prevent or detect alterations, also called integrity violations.

Mechanisms that convey assurances regarding the source of transmitted content are said to provide data origin authentication; in practice, such mechanisms also provide integrity protection, i.e., are able to detect if content has been altered. In dealing with computer data, it is helpful to define data authentication as the combination of data origin authentication and data integrity. We then try to avoid the ambiguous term authentication, instead using entity authentication or data authentication.

2 Context: motivation and environment

The invention of public key cryptography resulted from a combination of elements. These included the inventors’ personal perspectives and motivation, and an environment that led to key management and data authentication being recognized as important open problems.

The need for digital signatures was a seed planted in 1970. While working on artificial intelligence research for John McCarthy at Stanford, Diffie was exposed to McCarthy’s early interest in “buying and selling through home terminals” [68]. What we now call electronic commerce (e-commerce) was a new idea in this age before personal computers, when even home-based terminal connections to workplaces were rare. This led Diffie to recognize an open question [18, 88]: How would one authenticate digital transactions with remote parties, in lieu of assurances provided by handwritten signatures on paper contracts? Known techniques for authentication at that time [18], using a symmetric key shared between buyer and seller, would fail if one party originated a message but then denied this, claiming that the other party had done so. Indeed, parties in business and consumer transactions often have competing interests, while business partners on one project may compete on the next.
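The repudiation problem arises precisely because a shared symmetric key gives both parties identical capabilities: either could have produced the message. Digital signatures, whose invention the paper goes on to describe, break this symmetry. As a forward-looking sketch of what such a scheme provides, here is an example using Ed25519, a modern signature scheme chosen purely for illustration (it far postdates 1976), again assuming the cryptography package:

```python
# Sketch of the non-repudiation property a digital signature provides,
# using the modern Ed25519 scheme as an illustrative stand-in.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

signing_key = Ed25519PrivateKey.generate()  # known only to the buyer
verify_key = signing_key.public_key()       # shareable with everyone, including the seller

order = b"buy 100 shares"
signature = signing_key.sign(order)

try:
    verify_key.verify(signature, order)  # raises if message or signature was altered
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```

Because the seller never holds the buyer's signing key, the seller could not have forged the order, so the buyer cannot plausibly deny having sent it. A shared symmetric key cannot offer this.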
Moving now to encrypted communication between two remote parties, this conventionally required that a symmetric key be shared between them prior to the beginning of the secure communication. The key is not sent over the same channel as the message—if that channel itself provided confidentiality, encryption would not be needed for the message. Thus conventionally, it was necessary to (1) know, ahead of time, who you planned to communicate with securely; and (2) pre-distribute a secret key to them over a trusted channel—thus the stereotypical briefcase chained to a courier’s wrist. This is of course costly in both time and money. However, in specific environments such as the military and some government organizations, it is feasible due to, e.g., centralized organization with hierarchical reporting trees, and relatively static, well-known chains of command.

Nevertheless, such pre-distribution of keys is particularly challenging at scale; according to Diffie, at one time the US National Security Agency (NSA) reportedly had “the biggest printing press in the world at Fort Meade” [17], used at least in part for printing keying material. In May 1973, the US government’s National Bureau of Standards (NBS, the precursor of NIST) solicited proposals for “algorithms for the encryption of computer data to ensure their protection during transmission over exposed communications facilities or while recorded on media in transport or in storage” [60]. Following a second call in August 1974 [61], IBM submitted an algorithm that would become the Data Encryption Standard (DES), after its formal proposal by NBS in March 1975 [62]. In November 1976, DES was approved as a Federal Information Processing Standard (FIPS), FIPS 46. This DES-related activity opened, for the first time, the possibility of broad use of encryption by the general public.

In parallel, Hellman and Diffie, who had met in September 1974 [68], recognized key management as a barrier [18] to broad adoption of encrypted communications. They noted: if there are n users, for each to share a distinct key with each of n − 1 others would require n(n − 1)/2 (on the order of n²) keys across the system. This number grows quickly as n increases; for example, n = 1,000 users would already require 499,500 pairwise keys. Pre-distribution of this many keys, e.g., by physical courier, would be impractical across the general public—due to both the number of users, and the absence of the simplifying structures noted above.

As a separate issue, centralized key distribution was not an appealing solution for individuals who preferred not to rely on, or place trust in, governments and large organizations. Such individuals included Diffie, “who grew up in a libertarian and leftist tradition” [84]. In Diffie’s words, “I had a very counter-cultural viewpoint and it was very anti-central authority” [88]. His interest in key management dated back to 1965, when he mistakenly2 came to believe that for within-facility calls, NSA phone lines were encrypted using keys distributed by a central facility [88]. He didn’t see the point of such an architecture—his view was that cryptography should reduce the number of parties that one had to trust. Thus two communicating parties should use keying material known to them alone. A key management solution that required trusting a central facility missed the mark, leaving key management as an open problem in his mind.

In the fall of 1974, Diffie also first learned of the NBS plan to standardize encryption, from Butler Lampson [88]. Believing that the US government might not standardize an algorithm unless they themselves could break it, Diffie began to ponder whether somehow a “trap-door” might be built into the algorithm, such that a party with special knowledge of the trap-door (but no one else) could easily break the algorithm—an idea that he credits [88] for inspiring public key cryptography. Thus the DES activity may be seen to have motivated the inventive contributions of Diffie and Hellman both through its heightening the need for key management, and by triggering the idea of trap-door cryptosystems. As we explain next, public key encryption and Diffie-Hellman key exchange provided two different solutions to the key management problem. Public key cryptography also provided a solution to the open problem of digital-world signatures.
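As a preview of the key exchange just mentioned, here is a minimal sketch of the Diffie-Hellman idea in Python. The numbers are toy values chosen for readability only; real deployments use standardized primes thousands of bits long. The point is that the shared secret is never transmitted.

```python
# A minimal sketch of Diffie-Hellman key exchange with toy parameters.
# Real systems use large standardized primes, not p = 23.
import secrets

p, g = 23, 5  # public prime modulus and generator, agreed over the open channel

a = secrets.randbelow(p - 2) + 1  # Alice's secret exponent, never sent
b = secrets.randbelow(p - 2) + 1  # Bob's secret exponent, never sent

A = pow(g, a, p)  # Alice sends A over the untrusted channel
B = pow(g, b, p)  # Bob sends B over the untrusted channel

# Each side combines the other's public value with its own secret:
# B^a = (g^b)^a = g^(ab) = (g^a)^b = A^b (mod p)
k_alice = pow(B, a, p)
k_bob = pow(A, b, p)
assert k_alice == k_bob  # the shared secret key
```

An eavesdropper sees p, g, A, and B, but recovering a or b from those values is the discrete logarithm problem, believed intractable at realistic parameter sizes. This is how two strangers can establish a symmetric key without a courier or a trusted central facility.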