Symmetric Cryptography
Total pages: 16
File type: PDF, size: 1020 KB
Recommended publications
Cryptanalysis of Feistel Networks with Secret Round Functions
Alex Biryukov (University of Luxembourg), Gaëtan Leurent (Inria, France), Léo Perrin (SnT, University of Luxembourg). Selected Areas in Cryptography - SAC 2015, Aug 2015, Sackville, Canada. HAL Id: hal-01243130, https://hal.inria.fr/hal-01243130, submitted on 14 Dec 2015.
Abstract. Generic distinguishers against Feistel Networks with up to 5 rounds exist in the regular setting, and up to 6 rounds in a multi-key setting. We present new cryptanalyses against Feistel Networks with 5, 6 and 7 rounds which are not merely distinguishers but completely recover the unknown Feistel functions. When an exclusive-or is used to combine the output of the round function with the other branch, we use the so-called yoyo game, which we improve using a heuristic based on particular cycle structures.
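As a point of reference, here is a minimal sketch (not taken from the paper) of the balanced Feistel construction these attacks target, with the secret round functions modelled as random 8-bit lookup tables purely for illustration:

```python
import os

def feistel_encrypt(left, right, round_functions):
    """Balanced Feistel network: each round XORs a (secret) round function
    of one branch into the other branch, then swaps the branches."""
    for f in round_functions:
        left, right = right, left ^ f(right)
    return left, right

# Toy instance: five independent secret round functions on 8-bit branches,
# modelled here as random lookup tables (an assumption made for illustration;
# the paper treats the round functions as arbitrary unknown functions).
secret_tables = [list(os.urandom(256)) for _ in range(5)]
round_functions = [lambda x, t=t: t[x] for t in secret_tables]

print(feistel_encrypt(0x3A, 0x7C, round_functions))
```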
Second Preimage Attacks on Dithered Hash Functions
Elena Andreeva (SCD-COSIC, Katholieke Universiteit Leuven), Charles Bouillaguet, Pierre-Alain Fouque, Sébastien Zimmer (École normale supérieure, CNRS, INRIA), Jonathan J. Hoch, Adi Shamir (Weizmann Institute of Science), John Kelsey (National Institute of Standards and Technology).
Abstract. We develop a new generic long-message second preimage attack, based on combining the techniques in the second preimage attacks of Dean [8] and Kelsey and Schneier [16] with the herding attack of Kelsey and Kohno [15]. We show that these generic attacks apply to hash functions using the Merkle-Damgård construction with only slightly more work than the previously known attack, but allow enormously more control of the contents of the second preimage found. Additionally, we show that our new attack applies to several hash function constructions which are not vulnerable to the previously known attack, including the dithered hash proposal of Rivest [25], Shoup's UOWHF [26] and the ROX hash construction [2]. We analyze the properties of the dithering sequence used in [25], and develop a time-memory tradeoff which allows us to apply our second preimage attack to a wide range of dithering sequences, including sequences which are much stronger than those in Rivest's proposals. Finally, we show that both the existing second preimage attacks [8, 16] and our new attack can be applied even more efficiently to multiple target messages; in general, given a set of many target messages with a total of 2^R message blocks, these second preimage attacks can find a second preimage for one of those target messages with no more work than would be necessary to find a second preimage for a single target message of 2^R message blocks.
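The attack setting relies on the iterated structure of Merkle-Damgård hashing. The sketch below is a toy construction (SHA-256 standing in for the compression function, simplified padding without the length field) showing the chain of intermediate chaining values that long-message second preimage attacks collide against:

```python
import hashlib

BLOCK_SIZE = 64  # bytes per message block (an assumption for this sketch)

def toy_compress(chaining_value, block):
    """Stand-in compression function (not a real design): maps the previous
    chaining value and one message block to the next chaining value."""
    return hashlib.sha256(chaining_value + block).digest()

def merkle_damgard(message, iv=bytes(32)):
    """Toy Merkle-Damgard iteration with simplified padding (real MD padding
    also encodes the message length)."""
    padded = message + b"\x80"
    padded += bytes(-len(padded) % BLOCK_SIZE)
    h = iv
    chaining_values = [h]
    for i in range(0, len(padded), BLOCK_SIZE):
        h = toy_compress(h, padded[i:i + BLOCK_SIZE])
        chaining_values.append(h)
    # A long message exposes many intermediate chaining values; generic
    # long-message second preimage attacks look for a collision with any of them.
    return h, chaining_values

digest, cvs = merkle_damgard(b"example message " * 100)
print(digest.hex(), len(cvs))
```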
Zero Correlation Linear Cryptanalysis on LEA Family Ciphers
Journal of Communications, Vol. 11, No. 7, July 2016. Kai Zhang, Jie Guan, and Bin Hu, Information Science and Technology Institute, Zhengzhou 450000, China.
Abstract. In the past two years, zero correlation linear cryptanalysis has shown great potential and has proven to be effective against many ciphers. LEA is a block cipher proposed by Deukjo Hong, the designer of the ISO standard block cipher HIGHT. This paper evaluates the security level of the LEA family of ciphers against zero correlation linear cryptanalysis. First, we identify some 9-round zero correlation linear hulls for LEA. Accordingly, we propose a distinguishing attack on all variants of 9-round LEA family ciphers. We then propose the first zero correlation linear cryptanalysis of 13-round LEA-192 and 14-round LEA-256. For 13-round LEA-192, we propose a key recovery attack with a time complexity of 2^131.30 13-round LEA encryptions.
Introduction. Zero correlation linear cryptanalysis was first proposed by Andrey Bogdanov and Vincent Rijmen in 2011 [2], [3]. Generally speaking, this cryptanalytic method can be summarized as "use linear approximations of probability 1/2 to eliminate the wrong key candidates". However, in this basic model of zero correlation linear cryptanalysis, the data complexity is about half of the full code book, which greatly limits the applicability of the method. At FSE 2012, multiple zero correlation linear cryptanalysis [4] was proposed, which uses multiple zero correlation linear approximations to reduce the data complexity.
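To illustrate what a zero correlation linear approximation is, the sketch below computes the correlation of input/output mask pairs over a small S-box (the 4-bit PRESENT S-box, used purely as a toy example; LEA itself is an ARX design without S-boxes). Mask pairs with correlation exactly zero correspond to approximations that hold with probability exactly 1/2:

```python
def correlation(sbox, in_mask, out_mask):
    """Correlation of the linear approximation <in_mask, x> = <out_mask, S(x)>.
    Correlation zero means the approximation holds with probability exactly 1/2."""
    count = 0
    for x in range(len(sbox)):
        parity_in = bin(x & in_mask).count("1") & 1
        parity_out = bin(sbox[x] & out_mask).count("1") & 1
        count += 1 if parity_in == parity_out else -1
    return count / len(sbox)

# Toy 4-bit S-box (the PRESENT S-box, used here only as an example).
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

zero_masks = [(a, b) for a in range(1, 16) for b in range(1, 16)
              if correlation(SBOX, a, b) == 0.0]
print(zero_masks)
```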
A Full Key Recovery Attack on HMAC-AURORA-512
Yu Sasaki, NTT Information Sharing Platform Laboratories, NTT Corporation, 3-9-11 Midori-cho, Musashino-shi, Tokyo, 180-8585 Japan.
Abstract. In this note, we present a full key recovery attack on HMAC-AURORA-512 when 512-bit secret keys are used and the MAC length is 512 bits. Our attack requires 2^257 queries and the off-line complexity is 2^259 AURORA-512 operations, which is significantly less than the complexity of exhaustive search for a 512-bit key. The attack can be carried out with a negligible amount of memory. Our attack can also recover the inner key of HMAC-AURORA-384 with almost the same complexity as in HMAC-AURORA-512. This attack does not recover the outer key of HMAC-AURORA-384, but universal forgery is possible by combining the inner-key recovery and 2nd-preimage attacks. Our attack exploits some weaknesses in the mode of operation.
Keywords: AURORA, DMMD, HMAC, Key recovery attack
1 Description
1.1 Mode of operation for AURORA-512
We briefly describe the specification of AURORA-512; please refer to Ref. [2] for details. An input message is padded to a multiple of 512 bits by the standard MD message padding, and the padded message is divided into 512-bit message blocks (M_0, M_1, ..., M_{N-1}). In AURORA-512, two compression functions F_k : {0,1}^256 × {0,1}^512 → {0,1}^256 and G_k : {0,1}^256 × {0,1}^512 → {0,1}^256, two functions MF : {0,1}^512 → {0,1}^512 and MFF : {0,1}^512 → {0,1}^512, and two initial 256-bit chaining values H_0^U and H_0^D are defined.
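A minimal sketch of the message preparation step described above, assuming standard MD padding with a 64-bit length field (the exact AURORA-512 padding details are in Ref. [2] and may differ):

```python
def md_pad_and_split(message: bytes, block_bits: int = 512) -> list:
    """MD-style padding (a sketch, assuming a 64-bit length field): append a
    single 1 bit, then zeros, then the message length in bits, so that the
    result is a multiple of the block size; then split into blocks."""
    block_bytes = block_bits // 8
    bit_length = 8 * len(message)
    padded = message + b"\x80"
    padded += bytes(-(len(padded) + 8) % block_bytes)
    padded += bit_length.to_bytes(8, "big")
    return [padded[i:i + block_bytes] for i in range(0, len(padded), block_bytes)]

blocks = md_pad_and_split(b"abc")
print(len(blocks), len(blocks[0]))  # one 64-byte block M_0
```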
3GPP TR 55.919 V6.1.0 (2002-12) Technical Report
3GPP TR 55.919 V6.1.0 (2002-12), Technical Report. 3rd Generation Partnership Project; Technical Specification Group Services and System Aspects; 3G Security; Specification of the A5/3 Encryption Algorithms for GSM and ECSD, and the GEA3 Encryption Algorithm for GPRS; Document 4: Design and evaluation report (Release 6). Keywords: GSM, GPRS, security, algorithm.
The present document has been developed within the 3rd Generation Partnership Project (3GPP) and may be further elaborated for the purposes of 3GPP. It has not been subject to any approval process by the 3GPP Organizational Partners and shall not be implemented; it is provided for future development work within 3GPP only. Specifications and reports for implementation of the 3GPP system should be obtained via the 3GPP Organizational Partners' Publications Offices.
ISO/IEC JTC 1/SC 27/WG 2 N 1804
ISO/IEC JTC 1/SC 27/WG 2, Cryptography and security mechanisms. Convenorship: JISC (Japan). Document type: Officer's Contribution. Title: A Memo on Kuznyechik S-Box. Date of document: 2018-09-27. Source: Project Editor (Vasily Shishkin), Project Co-editor (Grigory Marshalko). Expected action: ACT. Action due date: 2018-10-04. No. of pages: 1 + 4. Committee URL: https://isotc.iso.org/livelink/livelink/open/jtc1sc27wg2
A Memo on Kuznyechik S-Box. During the ballot on 18033-3 DAmd1 (finished in June 2018), comments were received from some National Bodies concerning the design rationale of Kuznyechik and the origin of its parameters. The goal of this memo is to provide all relevant information known to the designers of the algorithm. Kuznyechik has a transparent and well-studied design. Its design rationale was introduced to WG2 experts in 2016 in presentation (WG2) N1740. Kuznyechik is based on well-examined constructions, and each transformation provides certain cryptographic properties. All transformations used as building blocks have a transparent origin. The only transformation used in Kuznyechik whose origin may be questioned is the S-box π. This S-box was taken from the Streebog hash function and was synthesized in 2007. Note that through many years of cryptanalysis no weakness of this S-box has been found. The S-box π was obtained by pseudo-random search, and the following properties were taken into account. 1. The differential characteristic: p_π = max_{α,β ∈ V_8 \ {0}} p^π_{α,β}, where p^π_{α,β} = P{π(x ⊕ α) ⊕ π(x) = β}.
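The quantity p_π can be computed directly from an S-box table. The sketch below does so for a pseudo-random 8-bit permutation as a baseline (an illustration only; the actual Kuznyechik/Streebog table π is not reproduced here):

```python
import random

def max_differential_probability(sbox):
    """p_pi = max over nonzero (alpha, beta) of Pr[pi(x ^ alpha) ^ pi(x) = beta],
    the probability being taken over all inputs x."""
    n = len(sbox)
    best = 0
    for alpha in range(1, n):
        counts = [0] * n
        for x in range(n):
            counts[sbox[x ^ alpha] ^ sbox[x]] += 1
        best = max(best, max(counts[1:]))  # beta ranges over nonzero values
    return best / n

# Baseline: a pseudo-random 8-bit permutation, as a stand-in for the real table.
random.seed(0)
pi = random.sample(range(256), 256)
print(max_differential_probability(pi))
```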
The Missing Difference Problem, and Its Applications to Counter Mode
Gaëtan Leurent and Ferdinand Sibleyras, Inria, France.
Abstract. The counter mode (CTR) is a simple, efficient and widely used encryption mode using a block cipher. It comes with a security proof that guarantees no attacks up to the birthday bound (i.e. as long as the number of encrypted blocks σ satisfies σ ≪ 2^(n/2)), and a matching attack that can distinguish plaintext/ciphertext pairs from random using about 2^(n/2) blocks of data. The main goal of this paper is to study attacks against the counter mode beyond this simple distinguisher. We focus on message recovery attacks, with realistic assumptions about the capabilities of an adversary, and evaluate the full time complexity of the attacks rather than just the query complexity. Our main result is an attack to recover a block of message with complexity Õ(2^(n/2)). This shows that the actual security of CTR is similar to that of CBC, where collision attacks are well known to reveal information about the message. To achieve this result, we study a simple algorithmic problem related to the security of the CTR mode: the missing difference problem. We give efficient algorithms for this problem in two practically relevant cases: when the missing difference is known to be in some linear subspace, and when the amount of data is higher than strictly required. As a further application, we show that the second algorithm can also be used to break some polynomial MACs such as GMAC and Poly1305, with a universal forgery attack of complexity Õ(2^(2n/3)).
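For context, a minimal sketch of the CTR mode structure the paper analyzes, with a hash-based stand-in for the block cipher (an illustration only, not a secure implementation). Because counter blocks are all distinct, keystream blocks never collide, and this absence of collisions is what the missing-difference attack exploits:

```python
import hashlib

def block_encrypt(key, counter_block):
    """Stand-in for a 128-bit block cipher call E_K(counter block); a hash is
    used only to keep the sketch self-contained, this is not a real cipher."""
    return hashlib.sha256(key + counter_block).digest()[:16]

def ctr_encrypt(key, nonce, plaintext):
    """CTR mode: ciphertext block i = plaintext block i XOR E_K(nonce || i).
    All counter blocks are distinct, so keystream blocks never repeat."""
    out = bytearray()
    for i in range(0, len(plaintext), 16):
        counter_block = nonce + (i // 16).to_bytes(8, "big")
        keystream = block_encrypt(key, counter_block)
        out.extend(p ^ k for p, k in zip(plaintext[i:i + 16], keystream))
    return bytes(out)

ciphertext = ctr_encrypt(b"0123456789abcdef", b"8bytenon", b"attack at dawn!!" * 2)
print(ciphertext.hex())
```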
Balanced Permutations Even–Mansour Ciphers
Article. Shoni Gilboa (Department of Mathematics and Computer Science, The Open University of Israel), Shay Gueron (University of Haifa; Intel Corporation, Israel Development Center), and Mridul Nandi (Indian Statistical Institute, Kolkata). Academic Editor: Kwangjo Kim. Received: 2 February 2016; Accepted: 30 March 2016; Published: 1 April 2016.
Abstract. The r-round Even–Mansour block cipher is a generalization of the well-known Even–Mansour block cipher to r iterations. Attacks on this construction were described by Nikolić et al. and Dinur et al. for r = 2, 3. These attacks are only marginally better than brute force but are based on an interesting observation (due to Nikolić et al.): for a "typical" permutation P, the distribution of P(x) ⊕ x is not uniform. This naturally raises the following question. Let us call "balanced" the permutations for which the distribution of P(x) ⊕ x is uniform: is there a sufficiently large family of balanced permutations, and what is the security of the resulting Even–Mansour block cipher? We show how to generate families of balanced permutations from the Luby–Rackoff construction and use them to define a 2n-bit block cipher from the 2-round Even–Mansour scheme. We prove that this cipher is indistinguishable from a random permutation of {0,1}^(2n), for any adversary who has oracle access to the public permutations and to an encryption/decryption oracle, as long as the number of queries is o(2^(n/2)).
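A small sketch of the "balanced" property as defined above: a permutation P is balanced when x ↦ P(x) ⊕ x takes every value equally often (over a full domain, exactly once):

```python
import random
from collections import Counter

def is_balanced(perm):
    """A permutation P of {0, ..., n-1} is balanced when the map x -> P(x) XOR x
    takes every value equally often, i.e. its distribution is uniform."""
    counts = Counter(perm[x] ^ x for x in range(len(perm)))
    return len(counts) == len(perm) and len(set(counts.values())) == 1

# The identity permutation is maximally unbalanced (P(x) XOR x is always 0),
# and a random permutation is typically not balanced either -- the observation
# behind the distinguishers of Nikolic et al. mentioned above.
n = 16
identity = list(range(n))
random.seed(1)
random_perm = random.sample(range(n), n)
print(is_balanced(identity), is_balanced(random_perm))
```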
State of the Art in Lightweight Symmetric Cryptography
Alex Biryukov (SnT, CSC, University of Luxembourg) and Léo Perrin (SnT, University of Luxembourg).
Abstract. Lightweight cryptography has been one of the "hot topics" in symmetric cryptography in recent years. A huge number of lightweight algorithms have been published, standardized and/or used in commercial products. In this paper, we discuss the different implementation constraints that a "lightweight" algorithm is usually designed to satisfy. We also present an extensive survey of all lightweight symmetric primitives we are aware of. It covers designs from the academic community, from government agencies, and proprietary algorithms which were reverse-engineered or leaked. Relevant national (NIST...) and international (ISO/IEC...) standards are listed. We then discuss some trends we identified in the design of lightweight algorithms, namely the designers' preference for ARX-based and bitsliced-S-box-based designs and simple key schedules. Finally, we argue that lightweight cryptography is too large a field and that it should be split into two related but distinct areas: ultra-lightweight and IoT cryptography. The former deals only with the smallest of devices, for which a lower security level may be justified by the very harsh design constraints. The latter corresponds to low-power embedded processors for which the AES and modern hash functions are costly but which have to provide a high level of security due to their greater connectivity.
Keywords: Lightweight cryptography · Ultra-Lightweight · IoT · Internet of Things · SoK · Survey · Standards · Industry
1 Introduction
The Internet of Things (IoT) is one of the foremost buzzwords in computer science and information technology at the time of writing.
Security of VSH in the Real World
Markku-Juhani O. Saarinen, Information Security Group, Royal Holloway, University of London, Egham, Surrey TW20 0EX, UK.
Abstract. In Eurocrypt 2006, Contini, Lenstra, and Steinfeld proposed a new hash function primitive, VSH, the very smooth hash. In this brief paper we offer commentary on the resistance of VSH against some standard cryptanalytic attacks, including preimage attacks and collision search for a truncated VSH. Although the authors of VSH claim only collision resistance, we show why one must be very careful when using VSH in cryptographic engineering, where additional security properties are often required.
1 Introduction
Many existing cryptographic hash functions were originally designed to be message digests for use in digital signature schemes. However, they are also often used as building blocks for other cryptographic primitives, such as pseudorandom number generators (PRNGs), message authentication codes, password security schemes, and for deriving keying material in cryptographic protocols such as SSL, TLS, and IPSec. These applications may use truncated versions of the hashes with an implicit assumption that the security of such a variant against attacks is directly proportional to the amount of entropy (bits) used from the hash result. An example of this is the HMAC-n construction in IPSec [1]. Some signature schemes also use truncated hashes. Hence we are driven to the following slightly nonstandard definition of security goals for a hash function usable in practice:
1. Preimage resistance. For essentially all pre-specified outputs X, it is difficult to find a message Y such that H(Y) = X.
Optimizations in Algebraic and Differential Cryptanalysis (Thesis Submitted for the Degree of Doctor of Philosophy)
Optimizations in Algebraic and Differential Cryptanalysis. Theodosis Mourouzis, Department of Computer Science, University College London. A thesis submitted for the degree of Doctor of Philosophy, January 2015. Supervisor: Nicolas T. Courtois (Department of Computer Science, University College London). Committee members: Professor Kenny Paterson and Dr Christophe Petit.
Abstract. In this thesis, we study how to enhance current cryptanalytic techniques, especially in Differential Cryptanalysis (DC) and to some degree in Algebraic Cryptanalysis (AC), by considering and solving some underlying optimization problems based on the general structure of the algorithm. In the first part, we study techniques for optimizing arbitrary algebraic computations in the general non-commutative setting with respect to several metrics [42, 44].
Data Encryption Standard
Data Encryption Standard
[Infobox: Designers: IBM. First published: 1975 (Federal Register); standardized in January 1977. Derived from: Lucifer. Successors: Triple DES, G-DES, DES-X, LOKI89, ICE. Figure: The Feistel function (F function) of DES.]
The Data Encryption Standard (DES /ˌdiːˌiːˈɛs, dɛz/) is a symmetric-key algorithm for the encryption of electronic data. Although insecure, it was highly influential in the advancement of modern cryptography. Developed in the early 1970s at IBM and based on an earlier design by Horst Feistel, the algorithm was submitted to the National Bureau of Standards (NBS) following the agency's invitation to propose a candidate for the protection of sensitive, unclassified electronic government data. In 1976, after consultation with the National Security Agency (NSA), the NBS eventually selected a slightly modified version (strengthened against differential cryptanalysis, but weakened against brute-force attacks), which was published as an official Federal Information Processing Standard (FIPS) for the United States in 1977. The publication of an NSA-approved encryption standard simultaneously resulted in its quick international adoption and widespread academic scrutiny. Controversies arose out of classified design elements, a relatively short key length of the symmetric-key block cipher design, and the involvement of the NSA, nourishing suspicions about a backdoor. Today it is known that the S-boxes that had raised those suspicions were in fact designed by the NSA to actually remove a backdoor they secretly knew of (differential cryptanalysis). However, the NSA also ensured that the key size was drastically reduced such that they could break it by brute-force attack.[2] The intense academic scrutiny the algorithm received over time led to the modern understanding of block ciphers and their cryptanalysis.