
A Quantitative Study of Advanced Encryption Standard Performance as it Relates to Cryptographic Attack Feasibility

01 Jan 2018
About: The article was published on 2018-01-01 and is currently open access. It has received 3 citations to date. The article focuses on the topics: Advanced Encryption Standard & Cryptography.


Citations
DOI
13 Jan 2011
TL;DR: This Recommendation (SP 800-131A) provides more specific guidance for transitions to the use of stronger cryptographic keys and more robust algorithms.
Abstract: The National Institute of Standards and Technology (NIST) provides cryptographic key management guidance for defining and implementing appropriate key management procedures, using algorithms that adequately protect sensitive information, and planning ahead for possible changes in the use of cryptography because of algorithm breaks or the availability of more powerful computing techniques. NIST Special Publication (SP) 800-57 Part 1 includes a general approach for transitioning from one algorithm or key length to another. This Recommendation (SP 800-131A Revision 1) provides more specific guidance for transitions to the use of stronger cryptographic keys and more robust algorithms.

158 citations

Proceedings ArticleDOI
22 Sep 2020
TL;DR: In this article, the performance of a Raspberry Pi cluster is compared with that of a power-efficient next unit of computing (NUC) and a midrange desktop (MRD) on three leading cryptographic algorithms (AES, Twofish, and Serpent).
Abstract: ARM-based single board computers (SBCs) such as the Raspberry Pi capture the imaginations of hobbyists and scientists due to their low cost and versatility. With the deluge of data produced in edge environments, SBCs and SBC clusters have emerged as low-cost platform for data collection and analysis. Simultaneously, security is a growing concern as new regulations require secure communication for data collected from the edge. In this paper, we compare the performance of a Raspberry Pi cluster to a power-efficient next unit of computing (NUC) and a midrange desktop (MRD) on three leading cryptographic algorithms (AES, Twofish, and Serpent) and assess the general-purpose performance of the three systems using the HPL benchmark. Our results suggest that hardware-level instruction sets for all three cryptographic algorithms should be implemented on single board computers to aid with secure data transfer on the edge.
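The hardware-acceleration point above can be checked informally with a short micro-benchmark. The sketch below is our illustration, not the paper's benchmark harness: it assumes the third-party Python `cryptography` package and measures raw AES-256-CTR throughput over a fixed buffer, which is usually enough to reveal whether hardware AES instructions (AES-NI, ARMv8 crypto extensions) are in play.

    # Minimal AES-256-CTR throughput sketch (assumes the `cryptography` package).
    # Illustrative micro-benchmark only; not the harness used in the paper.
    import os
    import time

    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(32)                 # 256-bit key
    nonce = os.urandom(16)               # initial counter block for CTR mode
    buf = os.urandom(16 * 1024 * 1024)   # 16 MiB of random plaintext

    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()

    start = time.perf_counter()
    encryptor.update(buf)
    encryptor.finalize()
    elapsed = time.perf_counter() - start

    print(f"AES-256-CTR: {len(buf) / elapsed / 1e6:.1f} MB/s")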

6 citations

Proceedings ArticleDOI
22 Apr 2020
TL;DR: Variable Word Length is a proposed quantum-proof symmetric cryptography algorithm that is low-labor, low-latency, and low-power, with a keyspace of 10^511 in its simplest form (for comparison, AES-256 is 2^256).
Abstract: Quantum computing is coming; and, with it, the security of modern cryptography will be compromised. Technologies such as industrial control systems, banking, and smartphones depend on cryptography to ensure the confidentiality and integrity of operations. Any data sent over the internet relies on cryptography. In some cases, information is so sensitive that if the encryption is broken years in the future the results could be catastrophic. A post-quantum cryptographic solution is required today. Variable Word Length is a proposed quantum-proof symmetric cryptography algorithm. It is low-labor, low-latency, and low-power, with a keyspace of 10^511 in its simplest form (for comparison, AES-256 is 2^256). Variable Word Length performs bit manipulation to send “words” of varied length. Without the key, an attacker would need to brute force all possible combinations. Due to the design of the algorithm, some of these combinations will decrypt to real plaintext that was not the encrypted message. Additionally, the keyspace can be increased without an exponential tax on computing resources.
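To put the quoted keyspace figures on a common scale, both can be converted to bits; the back-of-the-envelope arithmetic below is ours, added for context, and simply restates the abstract's numbers.

    # Convert the claimed Variable Word Length keyspace (10^511) and the AES-256
    # keyspace (2^256) to bits for a like-for-like comparison. Illustrative only.
    import math

    vwl_bits = 511 * math.log2(10)   # log2(10^511) ≈ 1697.5 bits
    aes256_bits = 256                # log2(2^256)

    print(f"Variable Word Length keyspace ≈ 2^{vwl_bits:.1f}")
    print(f"AES-256 keyspace = 2^{aes256_bits}")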

1 citation


Cites methods from "A Quantitative Study of Advanced En..."

  • ...A quantum computer that has implemented Grover’s algorithm would reduce the key-space to 2^64, which is easily breakable by modern standards [16]....


  • ...The Biclique technique is the only known general attack against AES, and only reduces the AES-256 key size to 2^254.4 [16]....

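The Grover speed-up cited in the first excerpt is the usual square-root reduction of brute-force search: an n-bit symmetric key retains roughly n/2 bits of security against a quantum adversary. A minimal illustration of that arithmetic (ours, added for context):

    # Grover's algorithm searches an unstructured keyspace of size 2^n in roughly
    # 2^(n/2) iterations, halving the effective key length. Illustrative only.
    for key_bits in (128, 192, 256):
        grover_bits = key_bits // 2
        print(f"AES-{key_bits}: classical 2^{key_bits} -> quantum ~2^{grover_bits}")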

References
Journal ArticleDOI
01 Jan 1998
TL;DR: Integrated circuits will lead to such wonders as home computers or at least terminals connected to a central computer, automatic controls for automobiles, and personal portable communications equipment as mentioned in this paper. But the biggest potential lies in the production of large systems.
Abstract: The future of integrated electronics is the future of electronics itself. The advantages of integration will bring about a proliferation of electronics, pushing this science into many new areas. Integrated circuits will lead to such wonders as home computers—or at least terminals connected to a central computer—automatic controls for automobiles, and personal portable communications equipment. The electronic wristwatch needs only a display to be feasible today. But the biggest potential lies in the production of large systems. In telephone communications, integrated circuits in digital filters will separate channels on multiplex equipment. Integrated circuits will also switch telephone circuits and perform data processing. Computers will be more powerful, and will be organized in completely different ways. For example, memories built of integrated electronics may be distributed throughout the machine instead of being concentrated in a central unit. In addition, the improved reliability made possible by integrated circuits will allow the construction of larger processing units. Machines similar to those in existence today will be built at lower costs and with faster turnaround.

9,647 citations

Book
01 Jan 1993
TL;DR: This book introduces a new cryptographic method, called differential cryptanalysis, which can be applied to analyze cryptosystems, and describes the cryptanalysis of DES, deals with the influence of its building blocks on security, and analyzes modified variants.
Abstract: DES, the Data Encryption Standard, is one of several cryptographic standards. The authors of this text detail their cryptanalytic "attack" upon DES and several other systems, using creative and novel tactics to demonstrate how they broke DES up into 16 rounds of coding. The methodology used offers valuable insights to cryptographers and cryptanalysts alike in creating new encryption standards, strengthening current ones, and exploring new ways to test important data protection schemes. This book introduces a new cryptographic method, called differential cryptanalysis, which can be applied to analyze cryptosystems. It describes the cryptanalysis of DES, deals with the influence of its building blocks on security, and analyzes modified variants. The differential cryptanalysis of "Feal" and several other cryptosystems is also described. This method can also be used to cryptanalyze hash functions, as is exemplified by the cryptanalysis of "Snefru".
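The central object in differential cryptanalysis is the difference distribution table of an S-box: for every input XOR difference, how often each output difference occurs. The sketch below builds that table for an arbitrary toy 4-bit S-box; the S-box values are made up for illustration and are not taken from DES or any cited cipher.

    # Difference distribution table (DDT) for a toy 4-bit S-box. A large count for
    # some (input_diff, output_diff) pair is exactly the bias that differential
    # cryptanalysis exploits. The S-box below is an arbitrary example permutation.
    SBOX = [0x6, 0x4, 0xC, 0x5, 0x0, 0x7, 0x2, 0xE,
            0x1, 0xF, 0x3, 0xD, 0x8, 0xA, 0x9, 0xB]

    ddt = [[0] * 16 for _ in range(16)]
    for x in range(16):
        for dx in range(16):
            dy = SBOX[x] ^ SBOX[x ^ dx]
            ddt[dx][dy] += 1

    # Best non-trivial differential (dx != 0): the higher the count, the weaker the S-box.
    count, dx, dy = max((ddt[dx][dy], dx, dy) for dx in range(1, 16) for dy in range(16))
    print(f"max count {count}/16 at input diff {dx:#x} -> output diff {dy:#x}")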

1,009 citations


"A Quantitative Study of Advanced En..." refers background in this paper

  • ...Biham and Shamir (1993) discovered and published the new technique, which became known as differential cryptanalysis....


  • ...By 1993, however, the concern of the key length and the potential attack complexity reduction of differential cryptanalysis entered the literature (Biham & Shamir, 1993)....


  • ...Since these attacks were known even prior to attacks becoming feasible against DES, they were appropriately considered during the AES selection process (Baudron et al., 1999; Biham & Shamir, 1993)....


  • ...A month before publication, Coppersmith, one of the members of the design team, announced that they were aware of differential cryptanalysis in 1974 and took design precautions to partially mitigate the method but kept it a secret for the sake of national security (Biham & Shamir, 1993)....


Journal ArticleDOI
TL;DR: A comprehensive review of recent research in the field of model-based performance prediction at software development time is presented in order to assess the maturity of the field and point out promising research directions.
Abstract: Over the last decade, a lot of research has been directed toward integrating performance analysis into the software development process. Traditional software development methods focus on software correctness, introducing performance issues later in the development process. This approach does not take into account the fact that performance problems may require considerable changes in design, for example, at the software architecture level, or even worse at the requirement analysis level. Several approaches were proposed in order to address early software performance analysis. Although some of them have been successfully applied, we are still far from seeing performance analysis integrated into ordinary software development. In this paper, we present a comprehensive review of recent research in the field of model-based performance prediction at software development time in order to assess the maturity of the field and point out promising research directions.

803 citations

OtherDOI
29 Sep 2014
TL;DR: The Belmont Report summarizes the basic ethical principles and guidelines identified by the commission to assist in the protection of human subjects in research and outlines general recommendations regarding obtaining informed consent, the assessment of risk and benefit, and the recruitment of participants.
Abstract: The Belmont Report summarizes the basic ethical principles and guidelines identified by the commission to assist in the protection of human subjects in research. The report focuses on three main areas, (1) the boundaries between research and practice, (2) the basic ethical principles that should underlie the conduct of research (autonomy, beneficence, justice) and the protection of human subjects, and (3) the application of these principles into practice. The report outlines general recommendations regarding obtaining informed consent, the assessment of risk and benefit, and the recruitment of participants. The Belmont Report lays out many of the ethical considerations that, to this day, are used in formulating regulations and in ensuring the protection of human study volunteers. Keywords: belmont report; research ethics; human subjects; biomedical research

711 citations

Journal ArticleDOI
Don Coppersmith
TL;DR: Some of the safeguards against differential cryptanalysis that were built into the DES system from the beginning are shown, with the result that more than 10^15 bytes of chosen plaintext are required for this attack to succeed.
Abstract: The Data Encryption Standard (DES) was developed by an IBM team around 1974 and adopted as a national standard in 1977. Since that time, many cryptanalysts have attempted to find shortcuts for breaking the system. In this paper, we examine one such attempt, the method of differential cryptanalysis, published by Biham and Shamir. We show some of the safeguards against differential cryptanalysis that were built into the system from the beginning, with the result that more than 10^15 bytes of chosen plaintext are required for this attack to succeed.
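The 10^15-byte figure is consistent with the widely cited data requirement of roughly 2^47 chosen plaintexts for the differential attack on full 16-round DES; the quick consistency check below is our illustration, not part of the abstract.

    # Rough check: ~2^47 chosen 8-byte DES blocks is on the order of 10^15 bytes.
    chosen_plaintexts = 2 ** 47   # widely cited estimate for full 16-round DES (assumption)
    block_bytes = 8               # DES block size in bytes
    total_bytes = chosen_plaintexts * block_bytes
    print(f"{total_bytes:.3e} bytes")   # ≈ 1.126e+15, i.e. more than 10^15 bytes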

560 citations