Draft Lecture Notes


CS-282A / MATH-209A: Foundations of Cryptography
Winter 2010
Rafail Ostrovsky, UCLA
Copyright © Rafail Ostrovsky 2003-2010

Acknowledgements: These lecture notes were produced while I was teaching this graduate course at UCLA. After each lecture, I asked students to read the old notes and update them according to the lecture. I am grateful to all of the students who participated in this massive endeavor throughout the years, for asking questions, taking notes, and improving these lecture notes each time.

Table of Contents

PART 1: Overview, Complexity classes, Weak and Strong One-way Functions, Number Theory Background.
PART 2: Hard-Core Bits.
PART 3: Private-Key Encryption, Perfectly Secure Encryption and its limitations, Semantic Security, Pseudo-Random Generators.
PART 4: Implications of Pseudo-Random Generators, Pseudo-Random Functions and their applications.
PART 5: Digital Signatures.
PART 6: Trapdoor Permutations, Public-Key Encryption and its definitions, Goldwasser-Micali, El-Gamal and Cramer-Shoup cryptosystems.
PART 7: Interactive Proofs and Zero-Knowledge.
PART 8: Bit Commitment Protocols in different settings.
PART 9: Non-Interactive Zero Knowledge (NIZK).
PART 10: CCA1 and CCA2 Public-Key Encryption from general assumptions: Naor-Yung, DDN.
PART 11: Non-Malleable Commitments, Non-Malleable NIZK.
PART 12: Multi-Server PIR.
PART 13: Single-Server PIR, OT, 1-2-OT, MPC, Program Obfuscation.
PART 14: More on MPC, including the GMW and BGW protocols in the honest-but-curious setting.
PART 15: Yao's garbled circuits, Efficient ZK Arguments, Non-Black-Box Zero Knowledge.
PART 16: Remote Secure Data Storage (in a Cloud), Oblivious RAMs.

CS 282A/MATH 209A: Foundations of Cryptography, © 2006-2010 Prof. Rafail Ostrovsky

Part 1

1 Overview of Cryptography

This section gives an overview of the various branches of cryptography.

1.1 Brief History

While cryptography has been in use since the times of the Romans, it has only recently been formalized into a science.

• Claude Shannon (1940s)
  – Formalized "Perfect Security" while working on the ENIGMA project
  – Defined the private-key communication model
• Whitfield Diffie and Martin Hellman (1970s)
  – Laid foundations for cryptographic techniques based on theoretical complexity assumptions; if a technique is cracked, this directly implies a solution to a long-standing problem in mathematics
  – Defined public-key cryptography for secure transmissions over an insecure channel
• Rivest, Shamir, and Adleman (1978)
  – RSA public-key cryptosystem
• Blum, Micali and Yao (early 1980s)
  – Developed rigorous theoretical foundations of pseudo-random generators
• Goldwasser and Micali (1980s)
  – Provided the first formal and satisfactory definition of encryption
  – Developed the definition of probabilistic encryption: the encrypted message should not reveal any partial information about the unencoded message

1.2 Example Problem Spaces in Cryptography

Secrecy in Communication

Two parties (Alice and Bob) want to communicate with each other in such a way that an adversary (Eve) intercepting their communication cannot learn any information about the message being sent. One simple protocol is as follows:

• Alice locks her message with lock LA and sends it to Bob.
• Bob locks the message with lock LB and sends it back to Alice.
• Alice removes her lock LA and sends the message back again.
• Bob removes his lock LB and reads the message.
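The locked-box exchange works because the two locks commute: the order in which they are applied and removed does not matter. Below is a minimal sketch, not part of these notes, of the same idea instantiated with modular exponentiation, where "locking" means raising to a secret exponent modulo a prime (the three-pass idea usually credited to Shamir); the prime, exponent ranges and message encoding are illustrative assumptions only.

```python
# Toy "commuting locks" via modular exponentiation (three-pass idea).
# Illustrative only: a real instantiation needs a large, carefully chosen
# prime and constant-time arithmetic.  Requires Python 3.8+ for pow(e, -1, m).
import random
from math import gcd

p = 2**127 - 1  # a Mersenne prime, used here purely as a toy modulus

def make_lock():
    """Pick a random exponent e coprime to p-1, together with its inverse d."""
    while True:
        e = random.randrange(3, p - 1)
        if gcd(e, p - 1) == 1:
            return e, pow(e, -1, p - 1)

def lock(x, e):
    return pow(x, e, p)

def unlock(x, d):
    return pow(x, d, p)

# Alice and Bob each generate their own lock; no key is ever transmitted.
eA, dA = make_lock()
eB, dB = make_lock()

m = 123456789                 # Alice's message, encoded as an integer < p
c1 = lock(m, eA)              # Alice -> Bob: message under Alice's lock
c2 = lock(c1, eB)             # Bob -> Alice: now under both locks
c3 = unlock(c2, dA)           # Alice -> Bob: Alice's lock removed
assert unlock(c3, dB) == m    # Bob removes his lock and recovers m
```

The locks commute because (m^eA)^eB = (m^eB)^eA mod p, and eA·dA ≡ 1 (mod p-1) guarantees that applying dA exactly cancels eA.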
Non-Malleability

A non-malleable encryption scheme has the property that it is difficult to transform an encrypted message into an encryption of a related message. As an example, consider a blind auction in which the auction-goers Gi send their bids to an auctioneer A. Without any encryption, A can collude with G1 so that G1 can send in a bid equal to max(Gi) + 1, i > 1. Simply using a commitment protocol f(bid) is not sufficient, as it may be possible for A to compute f(max(Gi) + 1), i > 1 without ever learning the value max(Gi), i > 1. A non-malleable encryption scheme would prevent this kind of collusion.

Authentication/Data Integrity

In this problem Alice would like to send Bob a message in such a way that Eve cannot replace the message with her own message without Bob's knowledge. A common form of authentication is a digital signature, which is a piece of data that certifies a document so that the recipient can be sure it was issued by the correct sender.

1.3 Cryptographic Primitives

Cryptographic primitives are used as building blocks for more advanced security protocols and cryptosystems.

Interactive/Zero-Knowledge Proofs

In an interactive proof (due to GMR) there are two parties, a prover P and a verifier V. P has a statement that he wishes to prove to V. P and V run a protocol at the end of which V is convinced that the statement is true (to within any desired degree of certainty) if it is indeed true, while P (no matter how computationally powerful) is unable to convince V of a false statement. In a zero-knowledge proof, V should additionally gain no knowledge beyond the validity of the statement; that is, V accepts the statement as true but learns nothing about why it is true. Interactive proofs are far more powerful than conventional proofs: they can establish any true statement in PSPACE, whereas conventional proofs can only establish statements in NP.

Coin-Flipping Protocols

A coin-flipping protocol is one in which Alice and Bob, who do not trust each other, run a protocol at the end of which they agree on a fair coin flip such that neither of them can cheat.

Commitment Protocols

A commitment protocol is a protocol in which both parties commit to a decision in a way that appears simultaneous. This requires that once one party's decision is committed, that party is unable to modify it, and also that the other party is unable to make an unfair decision based on the committed decision. Commitment protocols can form the basis for coin-flipping protocols.

Secure Two-Party Computation

In this protocol Alice and Bob have inputs xa and xb and they wish to compute some function f(xa, xb). At the end of the protocol, Alice and Bob should both learn the value f(xa, xb) without learning anything about the other party's input.

Private Information Retrieval

Here there is a database D of bits, and a user wants to find out the value of bit i without revealing the index i to the database. While this can be done trivially by sending the user the entire database, there are more efficient ways to provide the appropriate data without revealing the index i.
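To make the privacy requirement concrete, the following is a minimal sketch, not from these notes, of the classical two-server idea: the user XOR-splits its query between two servers that each hold a copy of the database and are assumed not to collude. Each server individually sees a uniformly random subset of indices and therefore learns nothing about i. This toy version does not save communication over the trivial solution; it only illustrates the privacy mechanism.

```python
# Two-server PIR sketch: XOR-split queries hide the index from either server.
import secrets

def make_queries(n, i):
    """Split index i into subsets s1, s2 with s1 XOR s2 = indicator of {i}."""
    s1 = [secrets.randbelow(2) for _ in range(n)]   # uniformly random subset
    s2 = s1[:]
    s2[i] ^= 1                                      # differs only at position i
    return s1, s2

def server_answer(db, s):
    """A server's reply: the XOR (parity) of the database bits selected by s."""
    return sum(bit & sel for bit, sel in zip(db, s)) % 2

db = [1, 0, 1, 1, 0, 0, 1, 0]        # example database of bits
i = 5
s1, s2 = make_queries(len(db), i)
a1, a2 = server_answer(db, s1), server_answer(db, s2)
assert a1 ^ a2 == db[i]              # everything cancels except position i
```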
2 Background in Complexity Theory

2.1 Basing Cryptography on Complexity Theory

Modern cryptography is based on complexity-theoretic assumptions. One of these assumptions is the existence of a one-way function. Informally, a one-way function is a function that is easy to compute on any input, but hard to invert given only the output. A computation is computationally "easy" if it takes polynomial time in the size of the input, and computationally "hard" if it takes super-polynomial time in the size of the input. It is easy to see that if P = NP then no one-way function can exist. Thus we must make the assumption that P ≠ NP. While there is no proof that these complexity assumptions are true, many researchers believe them to be true.

Once we have made some basic complexity assumptions, we can build protocols on top of them by showing that if the protocol is broken, then the underlying complexity assumption is broken as well. In other words, if there is a program that can break the protocol, we can use this program as a subroutine for breaking the underlying complexity assumption. Thus the protocol is at least as strong as the underlying complexity assumption.

For example, one complexity assumption is that it is hard to factor the product of two large primes. We can show that if factoring is hard, then there exists a one-way function. In turn, this implies that there exists a pseudo-random number generator, which implies that there exists a secure commitment scheme. Finally, the existence of a secure commitment scheme implies that there exists a coin-flipping protocol. Thus, if there exists an adversary that can break the coin-flipping protocol, we can use this adversary to factor large primes, which would violate the complexity assumption.

2.2 Complexity Classes

A language is a set of strings over some alphabet. An algorithm for a language is a procedure that decides whether a given input is in the language.

The class P is the set of all languages that can be recognized in deterministic polynomial time.

The class NP is the set of all languages that can be recognized in non-deterministic polynomial time.

A probabilistic Turing machine is a non-deterministic Turing machine which randomly chooses transitions based on some underlying probability distribution. A probabilistic Turing machine may give its output with certain error probabilities, so we can define complexity classes based on these error probabilities.

A function g(n): N → R is negligible if ∀c > 0, ∃Nc > 0 such that ∀n > Nc, |g(n)| < 1/n^c. Otherwise, g(n) is said to be non-negligible.

Randomized Polytime (RP) is the set of all languages that can be recognized in probabilistic polynomial time with one-sided error. Any RP algorithm satisfies the following properties:

1. it always runs in polynomial time as a function of the input size;
2. if the input is in the language, it accepts with probability at least 1/2;
3. if the input is not in the language, it always rejects.
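As a concrete example of a language in RP, consider COMPOSITES, the set of composite numbers. The sketch below, not taken from these notes, runs a single round of the Miller-Rabin test: it always runs in polynomial time, it never declares a prime composite, and on a composite input it finds a witness with probability at least 3/4, which satisfies the one-sided error condition above.

```python
# One round of the Miller-Rabin test: an RP-style algorithm for COMPOSITES.
# Assumes n >= 4.  Never wrong on primes; on composites it answers
# "composite" with probability at least 3/4.
import random

def declares_composite(n):
    if n % 2 == 0:
        return True                     # any even n >= 4 is composite
    d, r = n - 1, 0
    while d % 2 == 0:                   # write n - 1 = d * 2^r with d odd
        d //= 2
        r += 1
    a = random.randrange(2, n - 1)      # random base
    x = pow(a, d, n)
    if x == 1 or x == n - 1:
        return False                    # no witness found: "probably prime"
    for _ in range(r - 1):
        x = pow(x, 2, n)
        if x == n - 1:
            return False                # no witness found: "probably prime"
    return True                         # a is a witness that n is composite

print(declares_composite(221))          # 221 = 13 * 17: True with prob >= 3/4
print(declares_composite(101))          # 101 is prime: always False
```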