GPU Implementation of Quantum Secure ABC Cryptosystem on CUDA
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
- Four Results of Jon Kleinberg: A Talk for the St. Petersburg Mathematical Society
Yury Lifshits, Steklov Institute of Mathematics at St. Petersburg, May 2007. Outline: 1. Nevanlinna Prize for Jon Kleinberg (history of the Nevanlinna Prize; who is Jon Kleinberg); 2. Hubs and Authorities; 3. Nearest Neighbors: Faster Than Brute Force; 4. Navigation in a Small World; 5. Bursty Structure in Streams. Part I covers the history of the Nevanlinna Prize and the career of Jon Kleinberg. The Rolf Nevanlinna Prize is awarded once every 4 years at the International Congress of Mathematicians, for outstanding contributions in Mathematical Aspects of Information Sciences, including all mathematical aspects of computer science: complexity theory, logic of programming languages, analysis of algorithms, cryptography, computer vision, pattern recognition, information processing and modelling of intelligence.
- An Archeology of Cryptography: Rewriting Plaintext, Encryption, and Ciphertext
By Isaac Quinn DuPont. A thesis submitted in conformity with the requirements for the degree of Doctor of Philosophy, Faculty of Information, University of Toronto, 2017. Abstract: This dissertation is an archeological study of cryptography. It questions the validity of thinking about cryptography in familiar, instrumentalist terms, and instead reveals the ways that cryptography can be understood as writing, media, and computation. In this dissertation, I offer a critique of the prevailing views of cryptography by tracing a number of long-overlooked themes in its history, including the development of artificial languages, machine translation, media, code, notation, silence, and order. Using an archeological method, I detail historical conditions of possibility and the technical a priori of cryptography. The conditions of possibility are explored in three parts, where I rhetorically rewrite the conventional terms of art, namely plaintext, encryption, and ciphertext. I argue that plaintext has historically been understood as a kind of inscription or form of writing, has been associated with the development of artificial languages, and has been used to analyze and investigate the natural world. I argue that the technical a priori of plaintext, encryption, and ciphertext is constitutive of the syntactic and semantic properties detailed in Nelson Goodman's theory of notation, as described in his Languages of Art. I argue that encryption (and its reverse, decryption) are deterministic modes of transcription, which have historically been thought of as the medium between plaintext and ciphertext.
- Navigability of Small World Networks
Pierre Fraigniaud, CNRS and University Paris Sud, http://www.lri.fr/~pierre. Slides presented at HiPC'06, Dec. 19, 2006. Introduction: interaction networks include communication networks (the Internet, ad hoc and sensor networks), societal networks (the Web, unstructured P2P networks), social networks (acquaintance, mail exchanges), and networks from biology (the interactome), linguistics, etc. Common statistical properties: low density; "small world" properties, meaning the average distance between two nodes is small, typically O(log n), and the probability p that two distinct neighbors u1 and u2 of a same node v are themselves neighbors is large (p is the clustering coefficient); and "scale free" properties, i.e. heavy-tailed probability distributions (e.g., of the degrees). Gaussian vs. heavy tail: human sizes cluster around a mean µ, whereas salaries follow a heavy tail. Power law: prob{X = k} ≈ k^(-α), i.e. a straight line of slope -α when log prob{X = k} is plotted against log k. Random graphs vs. interaction networks: random graphs with prob{e exists} ≈ log(n)/n have a low clustering coefficient and a Gaussian distribution of the degrees, whereas interaction networks have a high clustering coefficient and a heavy-tailed distribution of the degrees. New problematic: why do these networks share these properties? What model supports performance analysis of these networks and algorithm design for them? What is the impact of the measures? This lecture addresses navigability. Navigability and the Milgram experiment: a source person s (e.g., in Wichita) must forward a letter to a target person t (e.g., in Cambridge), identified only by name, professional occupation, city of living, etc.; the letter is transmitted via a chain of individuals related on a personal basis. Result: "six degrees of separation".
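The contrast drawn in these slides between sparse random graphs and clustered interaction networks can be checked empirically. Below is a minimal sketch, assuming the Python networkx library is available; the Watts-Strogatz generator is used only as a stand-in for a clustered "small world" network and is not taken from Fraigniaud's slides.

```python
# Sketch: compare clustering and average distance of a sparse random graph
# versus a small-world graph of similar density (assumes networkx is installed).
import math
import networkx as nx

n = 1000
p = math.log(n) / n                                   # edge probability ~ log(n)/n, as in the slides

er = nx.erdos_renyi_graph(n, p, seed=1)               # Erdos-Renyi random graph
ws = nx.watts_strogatz_graph(n, k=8, p=0.1, seed=1)   # clustered "small world" stand-in

for name, g in [("random G(n, log n / n)", er), ("small world (Watts-Strogatz)", ws)]:
    giant = g.subgraph(max(nx.connected_components(g), key=len))  # largest connected component
    print(name,
          "clustering =", round(nx.average_clustering(g), 3),
          "avg distance =", round(nx.average_shortest_path_length(giant), 2))
```

On typical runs the G(n, log n/n) graph shows a clustering coefficient close to its edge probability, while the small-world graph keeps a high clustering coefficient together with a short average distance, matching the statistical properties listed above.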
- A Decade of Lattice Cryptography
Chris Peikert, Computer Science and Engineering, University of Michigan, United States. Full text available at http://dx.doi.org/10.1561/0400000074. Foundations and Trends® in Theoretical Computer Science, published, sold and distributed by now Publishers Inc., PO Box 1024, Hanover, MA 02339, United States (outside North America: now Publishers Inc., PO Box 179, 2600 AD Delft, The Netherlands). The preferred citation for this publication is: C. Peikert, A Decade of Lattice Cryptography, Foundations and Trends® in Theoretical Computer Science, vol. 10, no. 4, pp. 283–424, 2014. This Foundations and Trends issue was typeset in LaTeX using a class file designed by Neal Parikh. Printed on acid-free paper. ISBN: 978-1-68083-113-9. © 2016 C. Peikert. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, mechanical, photocopying, recording or otherwise, without prior written permission of the publishers. In the USA, authorization to photocopy items for internal or personal use is granted by now Publishers Inc. for users registered with the Copyright Clearance Center (CCC), 222 Rosewood Drive, Danvers, MA 01923; for organizations that have been granted a photocopy license, a separate system of payment has been arranged.
- Constraint Based Dimension Correlation and Distance
Preface: The papers in this volume were presented at the Fourteenth Annual IEEE Conference on Computational Complexity, held May 4-6, 1999 in Atlanta, Georgia, in conjunction with the Federated Computing Research Conference. This conference was sponsored by the IEEE Computer Society Technical Committee on Mathematical Foundations of Computing, in cooperation with ACM SIGACT (the special interest group on Algorithms and Complexity Theory) and EATCS (the European Association for Theoretical Computer Science). The call for papers sought original research papers in all areas of computational complexity. A total of 70 papers were submitted for consideration, of which 28 were accepted for the conference and for inclusion in these proceedings. Six of these papers were accepted to a joint STOC/Complexity session; for these papers the full conference paper appears in the STOC proceedings and a one-page summary appears in these proceedings. The program committee invited two distinguished researchers in computational complexity, Avi Wigderson and Jin-Yi Cai, to present invited talks, and these proceedings contain survey articles based on their talks. The program committee thanks Pradyut Shah and Marcus Schaefer for their organizational and computer help, Steve Tate and the SIGACT Electronic Publishing Board for the use and help of the electronic submissions server, Peter Shor and Mike Saks for the electronic conference meeting software, and Danielle Martin of the IEEE for editing this volume. The committee would also like to thank the following people for their help in reviewing the papers: E. Allender, V. Arvind, M. Ajtai, A. Ambainis, G. Barequet, S. Baumer, A. Berthiaume, S.
- Inter Actions: Department of Physics, 2015
The Department of Physics has a very accomplished family of alumni. In this issue we begin to recognize just a few of them with stories submitted by graduates from each of the decades since 1950. We want to invite all of you to renew and strengthen your ties to our department. We would love to hear from you, telling us what you are doing and commenting on how the department has helped in your career and life. Alumna returns to the Physics Department to implement new MCS Core Education: When I heard that the Department of Physics needed a hand implementing the new MCS Core Curriculum, I couldn't imagine a better fit. Not only did it provide an opportunity to return to my hometown, but Carnegie Mellon itself had been a fixture in my life for many years, from taking classes as a high school student to teaching after graduation. I was thrilled to have the chance to return. I graduated from Carnegie Mellon in 2008, with a B.S. in physics and a B.A. in Japanese. Afterwards, I continued to teach at CMARC's Summer Academy for Math and Science during the summers while studying down the street at the University of Pittsburgh. In 2010, I earned my M.A. in East Asian Studies there based on my research into how cultural differences between Japan and America have helped shape their respective robotics industries. After receiving my master's degree and spending a summer studying in Japan, I taught for Kaplan as graduate faculty for a year before joining the Department of Physics at Cornell University.
- Modified Alternating Step Generators with Non-Linear Scrambler
Robert Wicik, Tomasz Rachwalik, Rafał Gliwa; Military Communication Institute, Cryptology Department, Zegrze, Poland. Annales UMCS Informatica AI XIV, 1 (2014) 61–74, DOI: 10.2478/umcsinfo-2014-0003. Abstract: Pseudorandom generators, which produce keystreams for stream ciphers by the exclusive-or sum of outputs of alternately clocked linear feedback shift registers, are vulnerable to cryptanalysis. In order to increase their resistance to attacks, we introduce a non-linear scrambler at the output of these generators; a non-linear feedback shift register plays the role of the scrambler. In addition, we propose the Modified Alternating Step Generator with a non-linear scrambler (MASG1S), built with a non-linear feedback shift register and regularly or irregularly clocked linear feedback shift registers with non-linear filtering functions. 1 Introduction. Pseudorandom generators of a keystream composed of linear feedback shift registers (LFSRs) are basic components of classical stream ciphers. An LFSR with properly selected feedback gives a sequence of maximal period and good statistical properties but has low linear complexity. It is vulnerable to the Berlekamp-Massey algorithm [1] and can be easily reconstructed from a short output segment. Stop-and-go or alternating clocking of shift registers are two of the methods to increase the linear complexity of the keystream. Other techniques introduce non-linearity to the feedback or to the output of the shift register.
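As an illustration of the construction the abstract describes, here is a minimal Python sketch of an alternating step generator: a control LFSR decides which of two other LFSRs is clocked, and the keystream bit is the exclusive-or of their current outputs. The register lengths, feedback taps, and seeds below are arbitrary toy values, not the MASG1S parameters from the paper, and no non-linear scrambler is included.

```python
# Toy alternating step generator: a control LFSR selects which of two LFSRs
# advances each step; the keystream bit is the XOR of their output bits.
# Register lengths, taps and seeds are arbitrary illustrations.

def step_lfsr(state, taps):
    """One Fibonacci-style LFSR step. state: list of bits (index 0 = newest).
    Returns (output_bit, new_state)."""
    out = state[-1]                      # the oldest bit is the output
    fb = 0
    for t in taps:                       # feedback = XOR of the tapped bits
        fb ^= state[t]
    return out, [fb] + state[:-1]

def alternating_step_keystream(n_bits):
    ctrl, ctrl_taps = [1, 0, 1, 1, 0], (0, 2)
    a,    a_taps    = [1, 1, 0, 1, 0, 1], (0, 1)
    b,    b_taps    = [0, 1, 1, 1, 0, 0, 1], (0, 3)
    stream = []
    for _ in range(n_bits):
        c, ctrl = step_lfsr(ctrl, ctrl_taps)   # the control register is always clocked
        if c:
            _, a = step_lfsr(a, a_taps)        # clock A when the control bit is 1
        else:
            _, b = step_lfsr(b, b_taps)        # clock B when the control bit is 0
        stream.append(a[-1] ^ b[-1])           # keystream = XOR of the current outputs
    return stream

print("".join(map(str, alternating_step_keystream(64))))
```

The paper's point is that even this irregular clocking leaves exploitable linear structure, which is why MASG1S additionally passes the output through a non-linear feedback shift register acting as a scrambler.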
- The Probability Advantages of Two Linear Expressions in Symmetric Ciphers
Haina Zhang, Shaohui Wang, Xiaoyun Wang (Shandong University; Xiaoyun Wang also with Tsinghua University, China). Abstract: In this paper, we prove the probability advantages of two linear expressions which are summarized from the ABC stream cipher submitted to the ECRYPT eSTREAM project. Two linear expressions with probability advantages reflect the linear correlations among Modular Addition equations. Corresponding to each linear expression and its advantage, a large number of weak keys are derived, under which all the ABC main keys can be retrieved successively. The first linear expression is a generic bit linear correlation between two Modular Addition equations. The second is a linear correlation of bit carries derived from three Modular Addition equations and the linear equation of the LFSR in ABC. It is remarked that the second was found by Wu and Preneel, and has been used to find 2^96 weak keys. In the cryptanalysis of ABC, Wu and Preneel only utilized its estimated probability advantage, which was concluded from experimental data, and they did not give a strict proof. Modular Addition and XOR operations are widely used in designing symmetric ciphers. We believe that these types of linear expressions with probability advantages not only can be used to analyze some other symmetric ciphers, but also are important criteria in designing secure symmetric ciphers. Key words: cryptanalysis, probability advantage, Modular Addition, ABC stream cipher. 1 Introduction. For a secure cryptographic algorithm, there should not exist any mathematical rule with substantial probability advantage.
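The kind of "generic bit linear correlation" between Modular Addition equations mentioned in the abstract can be illustrated with a standard fact (this is a generic sketch, not the paper's specific expressions): for z = (x + y) mod 2^n with uniform random x and y, bit i of z equals x_i XOR y_i exactly when the carry into bit i is zero, which happens with probability 1/2 + 2^-(i+1).

```python
# Monte Carlo check of a generic linear correlation in modular addition:
# for z = (x + y) mod 2^n, Pr[z_i = x_i XOR y_i] = 1/2 + 2^-(i+1) over random x, y.
import random

def bit(v, i):
    return (v >> i) & 1

n, trials = 16, 200_000
for i in (0, 1, 2, 3, 7):
    hits = 0
    for _ in range(trials):
        x = random.getrandbits(n)
        y = random.getrandbits(n)
        z = (x + y) % (1 << n)
        hits += bit(z, i) == (bit(x, i) ^ bit(y, i))   # count agreements with the XOR approximation
    print(f"bit {i}: empirical {hits / trials:.4f}  predicted {0.5 + 2**-(i + 1):.4f}")
```

Biases of this kind, combined with the linear equation of the LFSR, are what give rise to the weak-key classes discussed in the abstract.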
- Loads of Codes – Cryptography Activities for the Classroom
Paul Kelley, Anoka High School, Anoka, Minnesota. In the next 90 minutes, we'll look at cryptosystems: Caesar cipher, St. Cyr cipher, tie-ins with algebra, frequency distribution, Vigenere cipher. Cryptosystem – an algorithm (or series of algorithms) needed to implement encryption and decryption. For our purposes, the words encrypt and encipher will be used interchangeably, as will decrypt and decipher. The idea behind all this is that you want some message to get somewhere in a secure fashion, without being intercepted by "the bad guys." Code – a substitution at the level of words or phrases. Cipher – a substitution at the level of letters or symbols. However, I think "Loads of Codes" sounds much cooler than "Loads of Ciphers." A fill-in exercise assigns code words to terms such as Blackmail, King, Today, Capture, Prince, Tonight, Protect, Minister, and Tomorrow, and then encodes the message "Capture King Tomorrow." Plaintext: the letter before encryption. Ciphertext: the letter after encryption. Rail fence cipher – an example of a "transposition cipher," one which doesn't change any letters when enciphered. Example: encipher "DO NOT DELAY IN ESCAPING" using a rail fence cipher; you would send: DNTEAIECPN OODLYNSAIG. Null cipher – not the entire message is meaningful. Example: "My aunt is not supposed to read every epistle tonight." Another puzzle from the handout: BXMT SSESSBW POE ILTWQS RIA QBTNMAAD OPMNIKQT RMI MNDLJ ALNN BRIGH PIG ORHD LLTYQ. Anagram – use the letters of one word, phrase or sentence to form a different one. Example: "Meet behind the castle" becomes "These belched a mitten." Substitution cipher – one in which the letters change during encryption.
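The rail fence example above uses two rails: the plaintext letters are written alternately on a top and a bottom row and then read off row by row. Here is a minimal Python sketch reproducing the worked example (the function name is ours; only the two-rail case from the handout is implemented).

```python
# Two-rail "rail fence" cipher, reproducing the worked example from the handout:
# "DO NOT DELAY IN ESCAPING" -> "DNTEAIECPN OODLYNSAIG"
def rail_fence_2(plaintext: str) -> str:
    letters = [c for c in plaintext.upper() if c.isalpha()]  # drop spaces and punctuation
    top = letters[0::2]      # letters written on the top rail
    bottom = letters[1::2]   # letters written on the bottom rail
    return "".join(top) + " " + "".join(bottom)

print(rail_fence_2("DO NOT DELAY IN ESCAPING"))  # DNTEAIECPN OODLYNSAIG
```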
- The BBVA Foundation Recognizes Charles H. Bennett, Gilles Brassard and Peter Shor
www.frontiersofknowledgeawards-fbbva.es, press release, 3 March 2020. The BBVA Foundation recognizes Charles H. Bennett, Gilles Brassard and Peter Shor for their fundamental role in the development of quantum computation and cryptography. In the 1980s, chemical physicist Bennett and computer scientist Brassard invented quantum cryptography, a technology that, in the words of the award committee, uses the laws of quantum physics to protect data communications in a way that makes them unreadable to eavesdroppers. The importance of this technology was demonstrated ten years later, when mathematician Shor discovered that a hypothetical quantum computer would drive a hole through the conventional cryptographic systems that underpin the security and privacy of Internet communications. Their landmark contributions have boosted the development of quantum computers, which promise to perform calculations at a far greater speed and scale than conventional machines, and enabled cryptographic systems that guarantee the inviolability of communications. The BBVA Foundation Frontiers of Knowledge Award in Basic Sciences has gone in this twelfth edition to Charles Bennett, Gilles Brassard and Peter Shor for their outstanding contributions to the field of quantum computation and communication. Bennett and Brassard, a chemical physicist and computer scientist respectively, invented quantum cryptography in the 1980s to ensure the physical inviolability of data communications. The importance of their work became apparent ten years later, when mathematician Peter Shor discovered that a hypothetical quantum computer would render those conventional cryptographic systems effectively useless. The award committee, chaired by Nobel Physics laureate Theodor Hänsch with quantum physicist Ignacio Cirac acting as its secretary, remarked on the leap forward in quantum technologies witnessed in these last few years, an advance which draws heavily on the new...
- Quantum-Proofing the Blockchain
Vlad Gheorghiu, Sergey Gorbunov, Michele Mosca, and Bill Munson, University of Waterloo, November 2017. A Blockchain Research Institute big idea whitepaper. Realizing the new promise of the digital economy: in 1994, Don Tapscott coined the phrase "the digital economy" with his book of that title, which discussed how the Web and the Internet of information would bring important changes in business and society. Today the Internet of value creates profound new possibilities. In 2017, Don and Alex Tapscott launched the Blockchain Research Institute to help realize the new promise of the digital economy. We research the strategic implications of blockchain technology and produce practical insights to contribute global blockchain knowledge and help our members navigate this revolution. Our findings, conclusions, and recommendations are initially proprietary to our members and ultimately released to the public in support of our mission. To find out more, please visit www.blockchainresearchinstitute.org. Blockchain Research Institute, 2018. Except where otherwise noted, this work is copyrighted 2018 by the Blockchain Research Institute and licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License. To view a copy of this license, send a letter to Creative Commons, PO Box 1866, Mountain View, CA 94042, USA, or visit creativecommons.org/licenses/by-nc-nd/4.0/legalcode. This document represents the views of its author(s), not necessarily those of the Blockchain Research Institute or the Tapscott Group. This material is for informational purposes only; it is neither investment advice nor managerial consulting. Use of this material does not create or constitute any kind of business relationship with the Blockchain Research Institute or the Tapscott Group, and neither the Blockchain Research Institute nor the Tapscott Group is liable for the actions of persons or organizations relying on this material.
- Quantum Error Correction
Peter Young (dated March 6, 2020). I. INTRODUCTION. Quantum error correction has developed into a huge topic, so here we will only be able to describe the main ideas. Error correction is essential for quantum computing, but appeared at first to be impossible, for reasons that we shall see. The field was transformed in 1995 by Shor [1] and Steane [2], who showed that quantum error correction is feasible. Before Shor and Steane, the goal of building a useful quantum computer seemed clearly unattainable. After those two papers, while building a quantum computer obviously posed enormous challenges, it was not necessarily impossible. Some general references on quantum error correction are Refs. [3–6]. Let us start by giving a simple discussion of classical error correction, which will motivate our study of quantum error correction. In classical computers error correction is not necessary. This is because the hardware for one bit is huge on an atomic scale, and the states 0 and 1 are so different that the probability of an unwanted flip is tiny. However, error correction is needed classically for transmitting a signal over large distances, where it attenuates and can be corrupted by noise. To do error correction one needs to have redundancy. One simple way of doing classical error correction is to encode each logical bit by three physical bits, i.e. |0⟩ → |0̄⟩ ≡ |0⟩|0⟩|0⟩ ≡ |000⟩ (1a) and |1⟩ → |1̄⟩ ≡ |1⟩|1⟩|1⟩ ≡ |111⟩ (1b). The sets of three bits, |000⟩ and |111⟩, are called codewords. One monitors the codewords to look for errors. If the bits in a codeword are not all the same, one uses "majority rule" to correct.
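The classical three-bit repetition code in Eqs. (1a)-(1b) is easy to exercise directly. The sketch below is our own Python illustration, not code from the notes: it encodes a logical bit into three physical bits, flips at most one of them, and recovers the logical bit by majority rule.

```python
# Classical 3-bit repetition code: encode, inject a single bit flip, decode by majority rule.
import random

def encode(bit: int) -> list[int]:
    return [bit, bit, bit]                   # |0> -> |000>, |1> -> |111>  (the codewords)

def flip_random_bit(codeword: list[int]) -> list[int]:
    i = random.randrange(3)                  # at most one physical bit is corrupted
    corrupted = codeword.copy()
    corrupted[i] ^= 1
    return corrupted

def decode(codeword: list[int]) -> int:
    return 1 if sum(codeword) >= 2 else 0    # majority rule corrects any single flip

for logical in (0, 1):
    received = flip_random_bit(encode(logical))
    assert decode(received) == logical       # a single error is always corrected
print("single bit flips corrected by majority vote")
```

The quantum codes of Shor and Steane build on the same idea of redundancy, while working around the obstacles that made quantum error correction look impossible at first.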