SHA-1 and the Strict Avalanche Criterion


Yusuf Motara, Rhodes University, Grahamstown 6140, South Africa (Email: [email protected])
Barry Irwin, Rhodes University, Grahamstown 6140, South Africa (Email: [email protected])

arXiv:1609.00616v1 [cs.CR] 2 Sep 2016

Abstract—The Strict Avalanche Criterion (SAC) is a measure of both confusion and diffusion, which are key properties of a cryptographic hash function. This work provides a working definition of the SAC, describes an experimental methodology that can be used to statistically evaluate whether a cryptographic hash meets the SAC, and uses this to investigate the degree to which the compression function of the SHA-1 hash meets the SAC. The results (P < 0.01) are heartening: SHA-1 closely tracks the SAC after the first 24 rounds, and demonstrates excellent properties of confusion and diffusion throughout.

I. INTRODUCTION

Many computer scientists know little about the inner workings of cryptographic hashes, though they may know something about their properties. One of these properties is the "avalanche effect", by analogy with the idea of a small stone causing a large avalanche of changes. The "avalanche effect" explains how a small change in the input data can result in a large change in the output hash. However, many questions around the effect are unanswered. For example, how large is the effect? After how many "rounds" of a compression function can it be seen? Do all inputs result in such an effect? Little experimental work has been done to answer these questions for any hash function, and this paper contributes experimental results that help in this regard.

A boolean n-bit hash function H is the transform Z_2^m → Z_2^n. A cryptographic hash function attempts to obscure the relationship between the input and output of H, and the degree to which this is accomplished is directly related to the (second-)preimage resistance of the hash function. This implies that two similar inputs should have very different outputs.

The Strict Avalanche Criterion (SAC) ([1], [2]) formalizes this notion by measuring the amount of change introduced in the output by a small change in the input. It builds on the definition of completeness, which means that each bit of the output depends on all the bits of the input in a way that is cryptographically relevant. Using the definition of H as above, an output H(x) = y is obtained for an input x. The initial bit of x is now flipped, giving H(x_0) = y_0. This process is repeated for x_{1..n}, resulting in y_{1..n}. The SAC is met when the Hamming distance between y and y_{0..n} is, on average, n/2.
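
As a concrete illustration of this bit-flipping procedure, the sketch below measures the average Hamming distance for the full SHA-1 hash using Python's hashlib. It is an illustrative approximation only: it exercises the complete (padded, chunked) hash rather than the bare compression function studied in this paper, and the input length and sample count are arbitrary choices.

```python
import hashlib
import os

def hamming(a: bytes, b: bytes) -> int:
    """Number of differing bits between two equal-length byte strings."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def avalanche_distances(message: bytes) -> list[int]:
    """Flip each input bit in turn and record the Hamming distance
    between H(x) and H(x with bit i flipped)."""
    baseline = hashlib.sha1(message).digest()
    distances = []
    for i in range(len(message) * 8):
        flipped = bytearray(message)
        flipped[i // 8] ^= 1 << (7 - (i % 8))
        distances.append(hamming(baseline, hashlib.sha1(bytes(flipped)).digest()))
    return distances

if __name__ == "__main__":
    # Average over a few random 32-byte inputs; the SAC predicts a mean
    # Hamming distance of n/2 = 80 bits for a 160-bit output.
    samples = [os.urandom(32) for _ in range(100)]
    all_d = [d for m in samples for d in avalanche_distances(m)]
    print("mean Hamming distance:", sum(all_d) / len(all_d))
```

Over many random inputs the mean settles near 80 bits, i.e. half of the 160-bit output, which is the behaviour the SAC predicts.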

There are three contributions that this paper makes to the existing body of research:

1) A definition of what the SAC is;
2) Experimental SAC results for a particular cryptographic hash (SHA-1);
3) An exploration of intermediate results.

Section 2 of this paper examines related work and argues that the SAC as proposed by Webster & Tavares [1] has been misunderstood in much of the contemporaneous critical literature. Section 3 introduces salient points of a well-known cryptographic hash (SHA-1) which is assumed to exhibit the SAC, and describes an experimental design to test its SAC-compliance. Section 4 presents experimental results, and some discussion follows.

II. RELATED WORK

The original definition [1] of the SAC is:

    Consider X and X_i, two n-bit, binary plaintext vectors, such that X and X_i differ only in bit i, 1 ≤ i ≤ n. Let V_i = Y ⊕ Y_i, where Y = f(X), Y_i = f(X_i) and f is the cryptographic transformation under consideration. If f is to meet the strict avalanche criterion, the probability that each bit in V_i is equal to 1 should be one half over the set of all possible plaintext vectors X and X_i. This should be true for all values of i.

Forré [2] expresses this as:

    Let x and x_i denote two n-bit vectors, such that x and x_i differ only in bit i, 1 ≤ i ≤ n. Z_2^n denotes the n-dimensional vector space over {0,1}. The function f(x) = z, z ∈ {0,1}, fulfills the SAC if and only if

        Σ_{x ∈ Z_2^n} f(x) ⊕ f(x_i) = 2^{n-1}   for all i with 1 ≤ i ≤ n.

Similarly, Lloyd [3] understands the SAC as:

    Let f : Z_2^n → Z_2^m be a cryptographic transformation. Then f satisfies the strict avalanche criterion if and only if

        Σ_{x ∈ Z_2^n} f(x) ⊕ f(x ⊕ c_i) = (2^{n-1}, ..., 2^{n-1})   for all i, 1 ≤ i ≤ n,

    where ⊕ denotes bitwise exclusive or and c_i is a vector of length n with a 1 in the ith position and 0 elsewhere.

Other works ([4], [5], [6], [7]) follow in the same vein. However, these definitions calculate the sum over all possible inputs as leading to the fulfillment of the SAC, which is contrary to the original definition. The original definition separates a baseline value from the avalanche vectors, and states that the SAC holds true when "the probability that each bit [in the avalanche vectors] is equal to 1 should be one half over the set of all possible plaintext vectors" [1]. Therefore, a better test of whether f : Z_2^n → Z_2 fulfills the SAC would use a universal quantifier,

    ∀x ∈ Z_2^n: P(f(x) = f(x_i)) = 0.5

for all x_i which differ from x in bit i, 1 ≤ i ≤ n.

A simple example clarifies the difference. Babbage [6] uses Lloyd's [3] definition of the SAC and defines a SAC-compliant function: define f : Z_2^n → Z_2 by

    f(x_1, ..., x_n) = 0                  if x_1 = 0
    f(x_1, ..., x_n) = x_2 ⊕ ... ⊕ x_n    if x_1 = 1

The simplest function of this nature is f(x) = x_0 ∧ x_1. Then, taking g(x) = f(x) ⊕ f(x ⊕ 01) and h(x) = f(x) ⊕ f(x ⊕ 10):

    x     f(x)   g(x)   h(x)   P(f(x) = f(x_i))
    00    0      0      0      1.0
    01    0      0      1      0.5
    10    0      1      0      0.5
    11    1      1      1      0.0
    Sum:         2      2

Note that the sum of each of the third and fourth columns is 2^{n-1}, as predicted, and that this function fulfills the summed definition of the SAC. However, the first and last rows do not fulfill the original definition of the SAC at all: the probability of change, given the baseline values 00 and 11, is 0.0 and 1.0 respectively, rather than one half. It is therefore more reasonable to regard the row probability as important. This understanding is also in accordance with the original text that defined the term. Under this definition, x_0 ∧ x_1 is not SAC-compliant.
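
This distinction can be checked mechanically. The following sketch (an illustration, not taken from the original paper) enumerates f(x) = x_0 ∧ x_1, reproduces the g and h columns, and computes both the column sums used by the summed definition and the per-baseline probability used by the original definition:

```python
from itertools import product

def f(x0: int, x1: int) -> int:
    """The simplest Babbage-style function: f(x) = x0 AND x1."""
    return x0 & x1

rows = []
for x0, x1 in product((0, 1), repeat=2):
    g = f(x0, x1) ^ f(x0, x1 ^ 1)        # flip the low bit  (x XOR 01)
    h = f(x0, x1) ^ f(x0 ^ 1, x1)        # flip the high bit (x XOR 10)
    p_equal = 1 - (g + h) / 2            # per-baseline P(f(x) = f(x_i))
    rows.append((f"{x0}{x1}", f(x0, x1), g, h, p_equal))
    print(rows[-1])

# Summed (Lloyd-style) definition: each avalanche column sums to 2^(n-1) = 2.
print("sum g =", sum(r[2] for r in rows), " sum h =", sum(r[3] for r in rows))
```

The column sums equal 2^{n-1} = 2, so the summed criterion is met, while the per-baseline probabilities for 00 and 11 are 1.0 and 0.0 rather than 0.5, so the original criterion is not.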

It is worth noting that the original definition, as per Webster & Tavares [1], is slightly ambiguous. They state that "the probability that each bit in V_i is equal to 1 should be one half over the set of all possible plaintext vectors X and X_i"; however, they also state that "to satisfy the strict avalanche criterion, every element must have a value close to one half". It seems to be more useful (and more in line with the original definition) to understand how far a particular sample diverges from the SAC. Therefore, this paper regards the SAC as a continuum but takes Lloyd's formulation as the definition of what it means to "meet" the SAC.

Preneel [4] suggests a generalisation of the SAC called the propagation criterion (PC), defined as:

    Let f be a Boolean function of n variables. Then f satisfies the propagation criterion of degree k, PC(k) (1 ≤ k ≤ n), if f(x) changes with a probability of 1/2 whenever i (1 ≤ i ≤ k) bits of x are complemented.

It can be seen that the SAC is equivalent to PC(1). The same work defines an extended propagation criterion which regards the SAC as a continuum. Much of the subsequent work ([8], [9], [10], [11], [12], [13]) in this area has more closely examined the relationship between PC and nonlinearity characteristics. Many of these extend the PC in interesting ways and examine ways of constructing functions which satisfy PC(n), but experimental research that targets existing algorithms is scarce.

Although there are proven theoretical ways to construct a function which satisfies the SAC [7], there is no way (apart from exhaustive testing) to verify that an existing function satisfies the SAC. By contrast, useful cryptographic properties such as non-degeneracy [14] or bentness [15] are verifiable without having to resort to exhaustive testing. However, the SAC metric is no worse in this regard than the correlation immunity [16] and balance [17] metrics, which also require exhaustive testing.

III. EXPERIMENTAL DESIGN

The SHA-1 hash [18] is a well-known cryptographic hash function which generates a 160-bit hash value. It is the successor to the equally well-known MD5 cryptographic hash function, which generated a 128-bit hash value. SHA-1 was designed by the National Security Agency of the United States of America and published in 1995 as National Institute of Standards and Technology (NIST) Federal Information Processing Standard 180-1.

A. Hash details

The SHA-1 hash is constructed using the Merkle-Damgård paradigm ([19], [20]), which means that it consists of padding, chunking, and compression stages. These stages are necessary for the hash algorithm to be able to handle inputs which are greater than 447 bits in length; however, they are unnecessary to consider in an examination of the compression function itself, since the strength of the Merkle-Damgård paradigm is predicated on the characteristics of the compression function. This paper examines only the compression function itself, and does not concern itself with padding, chunking, or Davies-Meyer strengthening [21].
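
For concreteness, the sketch below implements the standard FIPS 180-1 compression rounds directly and records the 160-bit working state after every round, so that the round-by-round Hamming distance between a random block and a one-bit-flipped copy can be observed. The measurement harness (fixed IV, random blocks, sample count, reported rounds) is an assumption made for illustration and is not the authors' actual experimental code.

```python
import os
import random

MASK = 0xFFFFFFFF

def rotl(x: int, n: int) -> int:
    return ((x << n) | (x >> (32 - n))) & MASK

def sha1_round_states(block: bytes,
                      iv=(0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0)):
    """Run the 80-round SHA-1 compression function on one 512-bit block and
    return the (a, b, c, d, e) working state after every round.
    The Davies-Meyer feed-forward is omitted, since only the rounds are examined."""
    assert len(block) == 64
    w = [int.from_bytes(block[i:i + 4], "big") for i in range(0, 64, 4)]
    for t in range(16, 80):
        w.append(rotl(w[t - 3] ^ w[t - 8] ^ w[t - 14] ^ w[t - 16], 1))
    a, b, c, d, e = iv
    states = []
    for t in range(80):
        if t < 20:
            f, k = (b & c) | (~b & d), 0x5A827999
        elif t < 40:
            f, k = b ^ c ^ d, 0x6ED9EBA1
        elif t < 60:
            f, k = (b & c) | (b & d) | (c & d), 0x8F1BBCDC
        else:
            f, k = b ^ c ^ d, 0xCA62C1D6
        a, b, c, d, e = (rotl(a, 5) + f + e + k + w[t]) & MASK, a, rotl(b, 30), c, d
        states.append((a, b, c, d, e))
    return states

def hamming160(s1, s2) -> int:
    return sum(bin(x ^ y).count("1") for x, y in zip(s1, s2))

if __name__ == "__main__":
    # For each round, average the Hamming distance between the states reached
    # from a random block and from the same block with one input bit flipped.
    # The SAC predicts an average of 80 bits once the avalanche has developed.
    totals, samples = [0] * 80, 200
    for _ in range(samples):
        block = os.urandom(64)
        bit = random.randrange(512)
        flipped = bytearray(block)
        flipped[bit // 8] ^= 1 << (7 - bit % 8)
        for r, (s1, s2) in enumerate(zip(sha1_round_states(block),
                                         sha1_round_states(bytes(flipped)))):
            totals[r] += hamming160(s1, s2)
    for r in (1, 8, 16, 24, 40, 80):
        print(f"round {r:2d}: mean Hamming distance {totals[r - 1] / samples:.1f}")
```

Printing a few rounds shows the avalanche developing: early rounds move only a handful of state bits on average, while later rounds should hover around the 80-bit average that the SAC predicts for a 160-bit state.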