Showing papers on "Differential cryptanalysis published in 2012"


Book ChapterDOI
15 Apr 2012
TL;DR: It is shown that the original two-key Even-Mansour scheme is not minimal, in the sense that it can be simplified into a single-key scheme with half as many key bits that provides exactly the same security and can be argued to be the simplest conceivable provably secure block cipher.
Abstract: In this paper we consider the following fundamental problem: What is the simplest possible construction of a block cipher which is provably secure in some formal sense? This problem motivated Even and Mansour to develop their scheme in 1991, but its exact security remained open for more than 20 years in the sense that the lower bound proof considered known plaintexts, whereas the best published attack (which was based on differential cryptanalysis) required chosen plaintexts. In this paper we solve this open problem by describing the new Slidex attack which matches the T = Ω(2^n/D) lower bound on the time T for any number of known plaintexts D. Once we obtain this tight bound, we can show that the original two-key Even-Mansour scheme is not minimal in the sense that it can be simplified into a single key scheme with half as many key bits which provides exactly the same security, and which can be argued to be the simplest conceivable provably secure block cipher. We then show that there can be no comparable lower bound on the memory requirements of such attacks, by developing a new memoryless attack which can be applied with the same time complexity but only in the special case of D = 2^{n/2}. In the last part of the paper we analyze the security of several other variants of the Even-Mansour scheme, showing that some of them provide the same level of security while in others the lower bound proof fails for very delicate reasons.
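
For illustration, here is a minimal sketch of the two-key Even-Mansour construction and the single-key simplification discussed above, E_K(P) = F(P ⊕ K) ⊕ K. The byte-oriented toy permutation F and the 64-bit block size are placeholders chosen for readability, not parameters from the paper.

```python
import os

N_BYTES = 8  # toy 64-bit block; a stand-in, not a parameter from the paper

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# A fixed, publicly known permutation F. Here: a toy byte rotation followed by an
# affine byte map (bijective since gcd(5, 256) = 1); the real scheme uses a strong
# public permutation.
def F(block: bytes) -> bytes:
    rotated = block[1:] + block[:1]
    return bytes(((b * 5 + 17) & 0xFF) for b in rotated)

# Original two-key Even-Mansour: E_{K1,K2}(P) = F(P xor K1) xor K2
def em_encrypt(p: bytes, k1: bytes, k2: bytes) -> bytes:
    return xor(F(xor(p, k1)), k2)

# Single-key variant discussed in the abstract: set K1 = K2 = K,
# i.e. E_K(P) = F(P xor K) xor K, using half as many key bits.
def sem_encrypt(p: bytes, k: bytes) -> bytes:
    return xor(F(xor(p, k)), k)

if __name__ == "__main__":
    p = os.urandom(N_BYTES)
    k = os.urandom(N_BYTES)
    # The single-key scheme is the two-key scheme with both keys equal.
    print(em_encrypt(p, k, k) == sem_encrypt(p, k))  # True by construction
```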

156 citations


Journal ArticleDOI
TL;DR: This work proposes the use of a nonlinear functional chaos-based substitution process employing a continuous-time Lorenz system, which eliminates the need for independent round keys in a substitution-permutation network.
Abstract: In cryptographic systems, the encryption process relies on the nonlinear mapping of original data or plaintext to the secure data. The mapping of data is facilitated by the application of the substitution process embedded in the cipher. It is desirable to have resistance against differential cryptanalysis, which assists in providing clues about the composition of keys, and against linear cryptanalysis, in which a simple approximation is created to emulate the original cipher characteristics. In this work, we propose the use of a nonlinear functional chaos-based substitution process which employs a continuous-time Lorenz system. The proposed substitution system eliminates the need for independent round keys in a substitution-permutation network. The performance of the new substitution box is evaluated by nonlinearity analysis, strict avalanche criterion, bit independence criterion, linear approximation probability, and differential approximation probability.
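
As a hedged illustration of the general idea (deriving a substitution box from a chaotic trajectory), the sketch below integrates the Lorenz system with a simple Euler step and ranks 256 sampled x-values to obtain an 8-bit permutation. The step size, transient skip, sampling rule and parameter values are illustrative assumptions, not the construction from the paper.

```python
import numpy as np

def lorenz_trajectory(n, x0=0.1, y0=0.0, z0=0.0,
                      sigma=10.0, rho=28.0, beta=8.0 / 3.0,
                      dt=0.01, skip=1000):
    """Sample n x-values from the Lorenz system using a simple Euler integration.
    The initial condition plays the role of (part of) the secret parameters."""
    x, y, z = x0, y0, z0
    samples = []
    for i in range(skip + n):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        if i >= skip:
            samples.append(x)
    return np.array(samples)

def chaos_sbox(x0=0.1):
    """Toy 8-bit S-box: rank 256 chaotic samples to obtain a permutation of 0..255."""
    xs = lorenz_trajectory(256, x0=x0)
    return np.argsort(xs).astype(np.uint8)  # always a bijection on {0,...,255}

if __name__ == "__main__":
    sbox = chaos_sbox()
    assert sorted(sbox.tolist()) == list(range(256))  # indeed a permutation
    print(sbox[:16])
```

In a real design the resulting S-box would then be scored with the criteria listed in the abstract (nonlinearity, SAC, BIC, linear and differential approximation probabilities) before being accepted.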

134 citations


Book ChapterDOI
19 Mar 2012
TL;DR: This paper introduces the concept of a biclique as a tool for preimage attacks, which employs many powerful techniques from differential cryptanalysis of block ciphers and hash functions.
Abstract: We present a new concept of biclique as a tool for preimage attacks, which employs many powerful techniques from differential cryptanalysis of block ciphers and hash functions. The new tool has proved to be widely applicable by inspiring many authors to publish new results on the full versions of AES, KASUMI, IDEA, and Square. In this paper, we show how our concept leads to the first cryptanalysis of the round-reduced Skein hash function, and describe an attack on the SHA-2 hash function with more rounds than before.

128 citations


Book ChapterDOI
19 Mar 2012
TL;DR: In this paper, a statistical technique was proposed to reduce the data complexity of zero correlation linear cryptanalysis (ZCLC) by using the high number of linear approximations available.
Abstract: Zero correlation linear cryptanalysis is a novel key recovery technique for block ciphers proposed in [5]. It is based on linear approximations with probability of exactly 1/2 (which corresponds to the zero correlation). Some block ciphers turn out to have multiple linear approximations with correlation zero for each key over a considerable number of rounds. Zero correlation linear cryptanalysis is the counterpart of impossible differential cryptanalysis in the domain of linear cryptanalysis, though having many technical distinctions and sometimes resulting in stronger attacks. In this paper, we propose a statistical technique to significantly reduce the data complexity using the high number of zero correlation linear approximations available. We also identify zero correlation linear approximations for 14 and 15 rounds of TEA and XTEA. Those result in key-recovery attacks for 21-round TEA and 25-round XTEA, while requiring less data than the full code book. In the single secret key setting, these are structural attacks breaking the highest number of rounds for both ciphers. The findings of this paper demonstrate that the prohibitive data complexity requirements are not inherent in the zero correlation linear cryptanalysis and can be overcome. Moreover, our results suggest that zero correlation linear cryptanalysis can actually break more rounds than the best known impossible differential cryptanalysis does for relevant block ciphers. This might make a security re-evaluation of some ciphers necessary in the view of the new attack.
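
To make the notion of correlation exactly zero concrete, the sketch below computes the correlation c(a, b) = 2·Pr[a·x = b·S(x)] − 1 for all nonzero mask pairs of a 4-bit S-box and lists the pairs whose correlation is exactly zero. The PRESENT S-box is used only as a familiar example; the zero-correlation approximations for TEA/XTEA in the paper are derived by structural arguments over many rounds, not from a single S-box.

```python
# Correlation of the linear approximation a.x = b.S(x) over a 4-bit S-box.
# Zero correlation linear cryptanalysis relies on approximations whose
# correlation is exactly 0 (probability exactly 1/2) over several rounds;
# this only illustrates the single-S-box statistic.
PRESENT_SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
                0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def parity(v: int) -> int:
    return bin(v).count("1") & 1

def correlation(a: int, b: int, sbox) -> float:
    match = sum(1 for x in range(16)
                if parity(a & x) == parity(b & sbox[x]))
    return 2 * match / 16 - 1  # in [-1, 1]; 0 means probability exactly 1/2

if __name__ == "__main__":
    zero_pairs = [(a, b) for a in range(1, 16) for b in range(1, 16)
                  if correlation(a, b, PRESENT_SBOX) == 0]
    print(f"{len(zero_pairs)} mask pairs with correlation exactly 0")
    print(zero_pairs[:8])
```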

93 citations


Book ChapterDOI
15 Apr 2012
TL;DR: For the first time, an approach is described to noticeably speed up key recovery for the full 8.5-round IDEA, and it is shown that the biclique approach to block cipher cryptanalysis not only obtains results on more rounds, but also improves time and data complexities over existing attacks.
Abstract: We apply and extend the recently introduced biclique framework to IDEA and for the first time describe an approach to noticeably speed up key recovery for the full 8.5-round IDEA. We also show that the biclique approach to block cipher cryptanalysis not only obtains results on more rounds, but also improves time and data complexities over existing attacks. We consider the first 7.5 rounds of IDEA and demonstrate a variant of the approach that works with practical data complexity. The conceptual contribution is the narrow-bicliques technique: the recently introduced independent-biclique approach extended with ways to allow for a significantly reduced data complexity with everything else being equal. For this we use available degrees of freedom as known from hash cryptanalysis to narrow the relevant differential trails. Our cryptanalysis is of high computational complexity, and does not threaten the practical use of IDEA in any way, yet the techniques are practically verified to a large extent.

84 citations


Journal ArticleDOI
TL;DR: This study analyzes the security weaknesses of the “C.

74 citations


Book ChapterDOI
15 Aug 2012
TL;DR: This paper revisits the design strategy of the PHOTON lightweight hash family and the work of FSE 2012, in which perfect diffusion layers are constructed from one bundle-based LFSR, and investigates new strategies to construct perfect diffusion layers using more than one bundle-based LFSR.
Abstract: Diffusion layers with maximum branch numbers are widely used in block ciphers and hash functions. In this paper, we construct recursive diffusion layers using Linear Feedback Shift Registers (LFSRs). Unlike the MDS matrix used in AES, whose elements are limited to a finite field, a diffusion layer in this paper is a square matrix composed of linear transformations over a vector space. Perfect diffusion layers with branch numbers from 5 to 9 are constructed. On the one hand, we revisit the design strategy of the PHOTON lightweight hash family and the work of FSE 2012, in which perfect diffusion layers are constructed from one bundle-based LFSR. We get better results, and they can be used to replace those of PHOTON to obtain smaller hardware implementations. On the other hand, we investigate new strategies to construct perfect diffusion layers using more than one bundle-based LFSR. Finally, we construct perfect diffusion layers by increasing the number of iterations and using bit-level LFSRs. Since most of our proposals have lightweight examples corresponding to 4-bit and 8-bit Sboxes, we expect that they will be useful in designing (lightweight) block ciphers and (lightweight) hash functions.
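
As a sketch of how such recursive (LFSR/serial) diffusion layers are typically verified, the code below builds a serial matrix over GF(2^4) with modulus x^4 + x + 1, raises it to the fourth power, and brute-forces the differential branch number of the result. The coefficients (4, 1, 2, 2) are those commonly cited for the serial matrix of LED (a design following the same strategy) and are an illustrative assumption here, not one of the new matrices proposed in this paper.

```python
from itertools import product

MOD = 0b10011  # GF(2^4) modulus x^4 + x + 1

def gf_mul(a, b):
    """Multiplication in GF(2^4)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x10:
            a ^= MOD
        b >>= 1
    return r

def dot(row, col):
    acc = 0
    for a, b in zip(row, col):
        acc ^= gf_mul(a, b)
    return acc

def mat_mul(A, B):
    n = len(A)
    return [[dot(A[i], [B[k][j] for k in range(n)]) for j in range(n)]
            for i in range(n)]

def apply(M, v):
    return [dot(row, v) for row in M]

def weight(v):
    return sum(1 for x in v if x != 0)

if __name__ == "__main__":
    # Companion ("serial") matrix of a bundle-based LFSR: one clock shifts the
    # four nibbles and feeds back 4*z0 + 1*z1 + 2*z2 + 2*z3.
    A = [[0, 1, 0, 0],
         [0, 0, 1, 0],
         [0, 0, 0, 1],
         [4, 1, 2, 2]]
    M = A
    for _ in range(3):      # M = A^4: four LFSR clocks give the full diffusion layer
        M = mat_mul(M, A)
    branch = min(weight(v) + weight(apply(M, v))
                 for v in product(range(16), repeat=4) if any(v))
    print("branch number:", branch)  # expected 5, the maximum for 4 bundles (perfect/MDS)
```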

71 citations


Journal ArticleDOI
TL;DR: This paper presents attacks on up to four rounds of AES that require at most three known/chosen plaintexts, and applies these attacks to cryptanalyze an AES-based stream cipher, and to mount the best known plaintext attack on six-round AES.
Abstract: The majority of current attacks on reduced-round variants of block ciphers seeks to maximize the number of rounds that can be broken, using less data than the entire codebook and less time than exhaustive key search. In this paper, we pursue a different approach, restricting the data available to the adversary to a few plaintext/ciphertext pairs. We argue that consideration of such attacks (which received little attention in recent years) improves our understanding of the security of block ciphers and of other cryptographic primitives based on block ciphers. In particular, these attacks can be leveraged to more complex attacks, either on the block cipher itself or on other primitives (e.g., stream ciphers, MACs, or hash functions) that use a small number of rounds of the block cipher as one of their components. As a case study, we consider the Advanced Encryption Standard (AES)-the most widely used block cipher. The AES round function is used in many cryptographic primitives, such as the hash functions Lane, SHAvite-3, and Vortex or the message authentication codes ALPHA-MAC, Pelican, and Marvin. We present attacks on up to four rounds of AES that require at most three known/chosen plaintexts. We then apply these attacks to cryptanalyze an AES-based stream cipher (which follows the leak extraction methodology), and to mount the best known plaintext attack on six-round AES.

66 citations


Journal ArticleDOI
TL;DR: In this article, a chaos-based image encryption algorithm with an alternate structure (IEAS) was proposed; applying differential cryptanalysis to the IEAS shows that some of its properties favor a differential attack that can recover an equivalent secret key with only a small number of chosen plain-images.

59 citations


Book ChapterDOI
15 Aug 2012
TL;DR: A new attack is proposed that implicitly mounts several standard, truncated, impossible, improbable and possible future variants of differential attacks in parallel, and hence allows us to significantly improve upon known differential attacks using the same input difference.
Abstract: We present a framework that unifies several standard differential techniques. This unified view allows us to consider many, potentially all, output differences for a given input difference and to combine the information derived from them in an optimal way. We then propose a new attack that implicitly mounts several standard, truncated, impossible, improbable and possible future variants of differential attacks in parallel, and hence allows us to significantly improve upon known differential attacks using the same input difference. To demonstrate the viability of our techniques, we apply them to KATAN-32. In particular, our attack allows us to break 115 rounds of KATAN-32. For this, our attack exploits the non-uniformity of the difference distribution after 91 rounds, which is 20 rounds more than the previously best known differential characteristic.

56 citations


Book ChapterDOI
09 Apr 2012
TL;DR: By analyzing the distribution of the subkeys, this work presents a biclique cryptanalysis of full-round Piccolo-80 without postwhitening keys and of 28-round Piccolo-128 without prewhitening keys.
Abstract: Piccolo is a lightweight block cipher, with a fixed 64-bit block size and variable key length of 80 or 128 bits, which was proposed at CHES 2011. The iterative structure of Piccolo is a variant of the Generalized Feistel Network. The transformation utilizing a different-size-word based permutation improves the diffusion property of Piccolo, and the simple key schedule algorithm reduces hardware costs. By analyzing the distribution of the subkeys, we present a biclique cryptanalysis of full-round Piccolo-80 without postwhitening keys and of 28-round Piccolo-128 without prewhitening keys. The attacks have data complexities of 2^48 and 2^24 chosen ciphertexts, and time complexities of 2^78.95 and 2^126.79 encryptions, respectively.

Journal ArticleDOI
TL;DR: In this article, the authors investigated compression of data encrypted with block ciphers, such as the Advanced Encryption Standard (AES), and showed that such data can be feasibly compressed without knowledge of the secret key.
Abstract: This paper investigates compression of data encrypted with block ciphers, such as the Advanced Encryption Standard. It is shown that such data can be feasibly compressed without knowledge of the secret key. Block ciphers operating in various chaining modes are considered and it is shown how compression can be achieved without compromising security of the encryption scheme. Further, it is shown that there exists a fundamental limitation to the practical compressibility of block ciphers when no chaining is used between blocks. Some performance results for practical code constructions used to compress binary sources are presented.

Journal ArticleDOI
TL;DR: The proposed scheme is simple, fast and sensitive to the secret key, and the experimental results show that the proposed encryption technique is efficient and has high security features.
Abstract: In this paper, a new image encryption scheme using a secret key of 144 bits is proposed. In the substitution process of the scheme, the image is divided into blocks and subsequently into color components. Each color component is modified by performing a bitwise operation which depends on the secret key as well as a few most significant bits of its previous and next color components. Three rounds are taken to complete the substitution process. To make the cipher more robust, a feedback mechanism is also applied by modifying the used secret key after encrypting each block. Further, the resultant image is partitioned into several key-based dynamic sub-images. Each sub-image passes through the scrambling process, where the pixels of the sub-image are reshuffled within itself by using a generated magic square matrix. Five rounds are taken for the scrambling process. The proposed scheme is simple, fast and sensitive to the secret key. Due to the high order of substitution and permutation, common attacks like linear and differential cryptanalysis are infeasible. The experimental results show that the proposed encryption technique is efficient and has high security features.

Journal ArticleDOI
TL;DR: Security definitions for on-line ciphers are provided, together with two constructions, HCBC1 and HCBC2, based on a given block cipher E and a family of computationally AXU functions; HCBC1 is proven secure against chosen-plaintext attacks and HCBC2 against chosen-ciphertext attacks.
Abstract: We initiate a study of on-line ciphers. These are ciphers that can take input plaintexts of large and varying lengths and will output the i-th block of the ciphertext after having processed only the first i blocks of the plaintext. Such ciphers permit length-preserving encryption of a data stream with only a single pass through the data. We provide security definitions for this primitive and study its basic properties. We then provide attacks on some possible candidates, including CBC with fixed IV. We then provide two constructions, HCBC1 and HCBC2, based on a given block cipher E and a family of computationally AXU functions. HCBC1 is proven secure against chosen-plaintext attacks assuming that E is a PRP secure against chosen-plaintext attacks, while HCBC2 is proven secure against chosen-ciphertext attacks assuming that E is a PRP secure against chosen-ciphertext attacks.
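
A minimal sketch of an HCBC1-style on-line cipher as described above: C_i = E_K(P_i ⊕ H(C_{i−1})) with a fixed C_0, so the i-th ciphertext block depends only on the first i plaintext blocks. AES (via the pycryptodome package) stands in for E, and a second independently keyed AES instance stands in for the computationally AXU family H; both instantiations and the handling of C_0 are illustrative assumptions rather than the paper's exact definitions.

```python
# pip install pycryptodome
from Crypto.Cipher import AES

BLOCK = 16

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

class HCBC1Sketch:
    """Sketch of an HCBC1-style on-line cipher: C_i = E_K(P_i xor H(C_{i-1})), C_0 = 0^n.
    H is instantiated here with AES under an independent key purely as a placeholder
    for the computationally AXU family used in the paper."""

    def __init__(self, k1: bytes, k2: bytes):
        self.E = AES.new(k1, AES.MODE_ECB)
        self.H = AES.new(k2, AES.MODE_ECB)

    def encrypt(self, plaintext: bytes) -> bytes:
        assert len(plaintext) % BLOCK == 0
        prev, out = bytes(BLOCK), []
        for i in range(0, len(plaintext), BLOCK):
            c = self.E.encrypt(xor(plaintext[i:i + BLOCK], self.H.encrypt(prev)))
            out.append(c)
            prev = c          # only already-output blocks influence later ones
        return b"".join(out)

    def decrypt(self, ciphertext: bytes) -> bytes:
        prev, out = bytes(BLOCK), []
        for i in range(0, len(ciphertext), BLOCK):
            c = ciphertext[i:i + BLOCK]
            out.append(xor(self.E.decrypt(c), self.H.encrypt(prev)))
            prev = c
        return b"".join(out)

if __name__ == "__main__":
    oc = HCBC1Sketch(bytes(16), bytes(range(16)))
    msg = b"online ciphers process streams!!"   # 32 bytes = 2 blocks
    assert oc.decrypt(oc.encrypt(msg)) == msg
    print(oc.encrypt(msg).hex())
```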

Book ChapterDOI
12 Nov 2012
TL;DR: A known-plaintext attack based on neural networks is applied practically, and successfully, to DES and Triple-DES, training a neural network to retrieve plaintext from ciphertext without retrieving the key used in encryption.
Abstract: In this paper, we apply a new cryptanalytic attack on DES and Triple-DES. The implemented attack is a known-plaintext attack based on neural networks. In this attack we trained a neural network to retrieve plaintext from ciphertext without retrieving the key used in encryption. The attack was practically, and successfully, applied on DES and Triple-DES. This attack required an average of 2^11 plaintext-ciphertext pairs to perform cryptanalysis of DES in an average duration of 51 minutes. For the cryptanalysis of Triple-DES, an average of only 2^12 plaintext-ciphertext pairs was required in an average duration of 72 minutes. Compared to other attacks, this attack is an improvement in terms of the number of known plaintexts required, as well as the time required to perform the complete attack.
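
The following sketch only reproduces the experimental setup described above (train a regressor on known plaintext/ciphertext pairs under one fixed key, then predict plaintext bits for new ciphertexts); the paper's network architecture, training regime and reported success rates are not reproduced, and the hyperparameters below are placeholders. It assumes the pycryptodome and scikit-learn packages.

```python
# pip install pycryptodome scikit-learn numpy
import os
import numpy as np
from Crypto.Cipher import DES
from sklearn.neural_network import MLPRegressor

def to_bits(data: bytes) -> np.ndarray:
    return np.unpackbits(np.frombuffer(data, dtype=np.uint8))

def make_pairs(n: int, key: bytes):
    """Known plaintext/ciphertext pairs under one fixed DES key (single-block ECB)."""
    cipher = DES.new(key, DES.MODE_ECB)
    X, Y = [], []
    for _ in range(n):
        p = os.urandom(8)
        c = cipher.encrypt(p)
        X.append(to_bits(c))   # input: 64 ciphertext bits
        Y.append(to_bits(p))   # target: 64 plaintext bits
    return np.array(X, dtype=float), np.array(Y, dtype=float)

if __name__ == "__main__":
    key = os.urandom(8)
    X_train, Y_train = make_pairs(2 ** 11, key)   # data budget reported in the abstract
    X_test, Y_test = make_pairs(256, key)

    # Placeholder architecture, not the one used in the paper.
    model = MLPRegressor(hidden_layer_sizes=(128, 256, 128),
                         activation="relu", max_iter=500)
    model.fit(X_train, Y_train)

    pred_bits = (model.predict(X_test) > 0.5).astype(int)
    bit_acc = (pred_bits == Y_test.astype(int)).mean()
    print(f"per-bit accuracy on held-out pairs: {bit_acc:.3f}")
```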

Book ChapterDOI
20 Jun 2012
TL;DR: This paper improves the impossible differential attack on 20-round LBlock given in the design paper of the LBlock cipher by using relations between the round keys, and uses the same 14-round impossible differential characteristic observed by the designers to attack 21 rounds.
Abstract: In this paper, we improve the impossible differential attack on 20-round LBlock given in the design paper of the LBlock cipher. Using relations between the round keys, we attack 21-round and 22-round LBlock with complexities of 2^69.5 and 2^79.28 encryptions, respectively. We use the same 14-round impossible differential characteristic observed by the designers to attack 21 rounds, and another 14-round impossible differential characteristic to attack 22 rounds of LBlock.

Journal ArticleDOI
TL;DR: This work is the first known cryptanalytic result on LED-64, a 64-bit block cipher suitable for efficient implementation in constrained hardware environments such as WSNs.

Journal ArticleDOI
TL;DR: The objective is to survey what ciphers are suitable for security in Radio Frequency Identification (RFID) and other security applications with demanding area restrictions.

Book ChapterDOI
12 Dec 2012
TL;DR: The comparison of symbolic expressions suggests that Grain-128a is immune against dynamic cube attacks, and also immune against differential attacks, as the best attack that could be found results in a bias at round 189 out of 256.
Abstract: Grain-128a is a new version of the stream cipher Grain-128. To analyse the security of the cipher, we study the monomial structure and use high order differential attacks on both the new and old versions. The comparison of symbolic expressions suggests that Grain-128a is immune against dynamic cube attacks. Additionally, we find that it is also immune against differential attacks as the best attack we could find results in a bias at round 189 out of 256.

Journal ArticleDOI
TL;DR: A flaw in the approach used to choose plaintexts or ciphertexts in certain previously published square-like cryptanalytic results for Camellia is described and two possible approaches to correct them are given.
Abstract: The Camellia block cipher has a 128-bit block length, a user key of 128, 192 or 256 bits, and a total of 18 rounds for a 128-bit key and 24 rounds for a 192- or 256-bit key. It is a Japanese CRYPTREC-recommended e-government cipher, a NESSIE (New European Schemes for Signatures, Integrity and Encryption) selected cipher and an ISO international standard. In this study, the authors describe a flaw in the approach used to choose plaintexts or ciphertexts in certain previously published square-like cryptanalytic results for Camellia and give two possible approaches to correct them. Finally, by taking advantage of the early abort technique and a few observations on the key schedule of Camellia, the authors present impossible differential attacks on 10-round Camellia with the FL/FL−1 functions under 128 key bits, 11-round Camellia with the FL/FL−1 functions under 192 key bits, 14-round Camellia without the FL/FL−1 functions under 192 key bits and 16-round Camellia without the FL/FL−1 functions under 256 key bits.

Journal ArticleDOI
TL;DR: The strength of this cipher against related-key impossible differential cryptanalysis is investigated, and two 6-round related-key impossible differentials for mCrypton-96 and mCrypton-128 are constructed.
Abstract: mCrypton is a 64-bit lightweight block cipher designed for use in low-cost and resource-constrained applications such as RFID tags and sensors in wireless sensor networks. In this paper, we investigate the strength of this cipher against related-key impossible differential cryptanalysis. First, we construct two 6-round related-key impossible differentials for mCrypton-96 and mCrypton-128. Then, using these distinguishers, we present 9-round related-key impossible differential attacks on these two versions. The attack on mCrypton-96 requires 2^59.9 chosen plaintexts, and has a time complexity of about 2^74.9 encryptions. The data and time complexities for the attack on mCrypton-128 are 2^59.7 chosen plaintexts and 2^66.7 encryptions, respectively.

Journal ArticleDOI
TL;DR: This paper describes in detail how to apply cube attacks to stream ciphers in various settings with different assumptions on the target stream cipher and on the data available to the attacker.
Abstract: Cube attacks were introduced in Dinur and Shamir (2009) as a cryptanalytic technique that requires only black box access to the underlying cryptosystem. The attack exploits the existence of low degree polynomial representation of a single output bit (as a function of the key and plaintext bits) in order to recover the secret key. Although cube attacks can be applied in principle to almost any cryptosystem, most block ciphers iteratively apply a highly non-linear round function (based on Sboxes or arithmetic operations) a large number of times which makes them resistant to cube attacks. On the other hand, many stream ciphers (such as Trivium (De Canniere and Preneel 2008)), are built using linear or low degree components and are natural targets for cube attacks. In this paper, we describe in detail how to apply cube attacks to stream ciphers in various settings with different assumptions on the target stream cipher and on the data available to the attacker.
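
A toy illustration of the core cube-attack step described above: summing (XORing) an output bit over all assignments of a chosen cube of public IV variables isolates the superpoly of that cube, ideally a low-degree function of the secret key. The black-box polynomial, cube and variable sizes below are made up for illustration and are unrelated to Trivium or any real stream cipher.

```python
from itertools import product

# Toy "black box": one output bit of a secret low-degree polynomial in
# 4 secret key bits k[0..3] and 4 public IV bits v[0..3] over GF(2).
def output_bit(k, v):
    return ((v[0] & v[1] & (k[2] ^ k[3] ^ v[2])) ^
            (v[2] & k[0]) ^ (k[1] & v[3]) ^
            (v[0] & k[0] & k[1]) ^ v[1] ^ k[0])

def cube_sum(k, cube, fixed_v):
    """XOR the output over all assignments of the cube variables, with the
    remaining IV bits held at fixed_v. The result is the superpoly of the
    cube evaluated at the secret key."""
    acc = 0
    for bits in product((0, 1), repeat=len(cube)):
        v = list(fixed_v)
        for idx, b in zip(cube, bits):
            v[idx] = b
        acc ^= output_bit(k, v)
    return acc

if __name__ == "__main__":
    cube = (0, 1)            # tweak IV bits v0, v1
    fixed = [0, 0, 0, 0]     # remaining IV bits set to constants
    # For this toy polynomial the superpoly of cube {v0, v1} is k2 xor k3,
    # so the chosen-IV cube sums leak a linear relation on the secret key.
    for key in product((0, 1), repeat=4):
        assert cube_sum(key, cube, fixed) == key[2] ^ key[3]
    print("cube sum equals k2 xor k3 for every key")
```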

Journal ArticleDOI
TL;DR: A robust secure scan structure design for crypto cores is proposed as a countermeasure against scan-based attacks, maintaining high security without compromising testability.
Abstract: Scan technology carries the potential risk of being misused as a “side channel” to leak out the secrets of crypto cores. Existing scan-based attacks can be viewed as a kind of differential cryptanalysis, which takes advantage of scan chains to observe the bit changes between pairs of chosen plaintexts so as to identify the secret keys. To address this design/test challenge, this paper proposes a robust secure scan structure design for crypto cores as a countermeasure against scan-based attacks, maintaining high security without compromising testability.

Book ChapterDOI
19 Mar 2012
TL;DR: A general model and complexity analysis for structure attacks in differential cryptanalysis is given, and it is shown how to choose the set of differentials to minimize the time and data complexities.
Abstract: As a classic cryptanalytic method for block ciphers, hash functions and stream ciphers, many extensions and refinements of differential cryptanalysis have been developed. In this paper, we focus on the use of so-called structures in differential attacks, i.e. the use of multiple input and one output difference. We give a general model and complexity analysis for structure attacks and show how to choose the set of differentials to minimize the time and data complexities. Being a subclass of multiple differential attacks in general, structure attacks can also be analyzed in the model of Blondeau et al. from FSE 2011. In this very general model, a restrictive condition on the set of input differences is required for the complexity analysis. We demonstrate that in our dedicated model for structure attacks, this condition can be relaxed, which allows us to consider a wider range of differentials. Finally, we point out an inconsistency in the FSE 2011 attack on 18 rounds of the block cipher PRESENT and use our model for structure attacks to attack 18-round PRESENT and improve the previous structure attacks on 7-round and 8-round Serpent. To the best of our knowledge, those attacks are the best known differential attacks on these two block ciphers.
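
To illustrate why structures are attractive, the sketch below builds a structure of 2^8 chosen plaintexts that vary in a single byte and counts how many pairs realize each input difference from a chosen set: a small amount of chosen data yields many pairs covering many input differences simultaneously. The 4-byte toy block and the difference set are illustrative assumptions.

```python
from itertools import combinations

# A structure: 2^8 plaintexts identical except in byte 0.
base = bytes([0x00, 0x11, 0x22, 0x33])
structure = [bytes([b]) + base[1:] for b in range(256)]

# Input differences of interest: all nonzero differences confined to byte 0.
wanted = {bytes([d, 0, 0, 0]) for d in range(1, 256)}

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

pairs_per_diff = {}
for p1, p2 in combinations(structure, 2):
    d = xor(p1, p2)
    if d in wanted:
        pairs_per_diff[d] = pairs_per_diff.get(d, 0) + 1

total_pairs = sum(pairs_per_diff.values())
print(f"{len(structure)} plaintexts -> {total_pairs} pairs "
      f"over {len(pairs_per_diff)} distinct input differences")
# 256 chosen plaintexts yield 256*255/2 = 32640 pairs, so many input
# differences are exploited at once from a single small structure.
```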

Posted Content
TL;DR: This paper introduces new frameworks for full disclosure attacks on ultralightweight authentication protocols based on the new concepts of recursive linear and recursive differential cryptanalysis, and applies them to some well-known ultralightweight protocols.
Abstract: Privacy is faced with serious challenges in the ubiquitous computing world. In order to handle this problem, some researchers in recent years have focused on design and analysis of privacy-friendly ultralightweight authentication protocols. Although the majority of these schemes have been broken to a greater or lesser extent, most of these attacks are based on ad-hoc methods that are not extensible to a large class of ultralightweight protocols. So this research area still suffers from the lack of structured cryptanalysis and evaluation methods. In this paper, we introduce new frameworks for full disclosure attacks on ultralightweight authentication protocols based on new concepts of recursive linear and recursive differential cryptanalysis. The recursive linear attack is passive, deterministic, and requires only a single authentication session, if it can be applied successfully. The recursive differential attack is more powerful and can be applied to the protocols on which the linear attack may not work. This attack is probabilistic, active in the sense that the attacker suffices only to block some specific messages, and requires a few authentication sessions. Having introduced these frameworks in a general view, we apply them on some well-known ultralightweight protocols. The first attack can retrieve all the secret data of Yeh and SLMAP authentication protocols and the second one can retrieve all the secret data of LMAP++, SASI, and David-Prasad authentication protocols.

Journal ArticleDOI
TL;DR: The proposed approach for cryptanalysis primarily depends on the order of normality of the Boolean function employed in Grain-128, and the results are evidence of the cryptographic significance of the normality criterion for Boolean functions.
Abstract: This paper considers security implications of k-normal Boolean functions when they are employed in certain stream ciphers. A generic algorithm is proposed for cryptanalysis of the considered class of stream ciphers based on a security weakness of k-normal Boolean functions. The proposed algorithm yields a framework for mounting cryptanalysis against particular stream ciphers within the considered class. The proposed algorithm also implies certain design guidelines for avoiding weak stream cipher constructions. A particular objective of this paper is the security evaluation of the stream cipher Grain-128 employing the developed generic algorithm. Contrary to the best known attacks against Grain-128, which provide a complexity of secret key recovery lower than exhaustive search only over a subset of secret keys that is just a fraction (up to 5%) of all possible secret keys, the cryptanalysis proposed in this paper provides complexity significantly lower than exhaustive search for any secret key. The proposed approach for cryptanalysis primarily depends on the order of normality of the Boolean function employed in Grain-128. Accordingly, in addition to the security evaluation insights on Grain-128, the results of this paper are also evidence of the cryptographic significance of the normality criterion for Boolean functions.
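
As a rough illustration of the normality notion underlying the attack, the sketch below searches for a k-dimensional "coordinate flat" (k inputs left free, the rest fixed to constants) on which a small Boolean function is constant. True k-normality quantifies over all affine subspaces, so this coordinate-only search is a strictly weaker, purely illustrative check, and the toy function is not related to the Grain-128 filter function.

```python
from itertools import combinations, product

def constant_on_coordinate_flat(f_table, n, k):
    """Partial k-normality check: look for k free coordinates and an assignment
    of the remaining n-k coordinates on which f is constant. This is only a
    sufficient condition; k-normality allows arbitrary affine subspaces."""
    for free in combinations(range(n), k):
        fixed = [i for i in range(n) if i not in free]
        for assignment in product((0, 1), repeat=n - k):
            values = set()
            for free_bits in product((0, 1), repeat=k):
                x = 0
                for i, b in zip(free, free_bits):
                    x |= b << i
                for i, b in zip(fixed, assignment):
                    x |= b << i
                values.add(f_table[x])
                if len(values) > 1:
                    break
            if len(values) == 1:
                return True, free, assignment
    return False, None, None

if __name__ == "__main__":
    n, k = 6, 3
    # Toy quadratic function x0*x1 ^ x2*x3 ^ x4*x5 (illustration only).
    f_table = [((x & 1) & ((x >> 1) & 1)) ^ (((x >> 2) & 1) & ((x >> 3) & 1))
               ^ (((x >> 4) & 1) & ((x >> 5) & 1)) for x in range(1 << n)]
    ok, free, assignment = constant_on_coordinate_flat(f_table, n, k)
    print("constant on a 3-dimensional coordinate flat:", ok, free, assignment)
```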

Book ChapterDOI
19 Aug 2012
TL;DR: Improved preimage attacks against reduced SHA-1 up to 57 steps come out of a differential view on the meet-in-the-middle technique originally developed by Aoki and Sasaki, which turns out to be particularly useful for hash functions with linear message expansion and weak diffusion properties.
Abstract: This paper shows preimage attacks against reduced SHA-1 up to 57 steps. The best previous attack has been presented at CRYPTO 2009 and was for 48 steps finding a two-block preimage with incorrect padding at the cost of $$2^{159.3}$$ evaluations of the compression function. For the same variant our attacks find a one-block preimage at $$2^{150.6}$$ and a correctly padded two-block preimage at $$2^{151.1}$$ evaluations of the compression function. The improved results come out of a differential view on the meet-in-the-middle technique originally developed by Aoki and Sasaki. The new framework closely relates meet-in-the-middle attacks to differential cryptanalysis which turns out to be particularly useful for hash functions with linear message expansion and weak diffusion properties.

Book ChapterDOI
05 Sep 2012
TL;DR: In this article, the authors proposed a general model for understanding multiple differential cryptanalysis and proposed new attacks based on tools used in multidimensional linear cryptanalysis (namely LLR and χ^2 statistical tests).
Abstract: Recent block ciphers have been designed to be resistant against differential cryptanalysis. Nevertheless it has been shown that such resistance claims may not be as accurate as wished due to recent advances in this field. One of the main improvements to differential cryptanalysis is the use of many differentials to reduce the data complexity. In this paper we propose a general model for understanding multiple differential cryptanalysis and propose new attacks based on tools used in multidimensional linear cryptanalysis (namely LLR and χ^2 statistical tests). Practical cases to evaluate different approaches for selecting and combining differentials are considered on a reduced version of the cipher PRESENT. We also consider the accuracy of the theoretical estimates corresponding to these attacks.
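
As a small worked example of the two statistics named above, the code below evaluates the standard LLR and χ^2 test statistics for a vector of observed counts of output differences, comparing the distribution predicted for the cipher against the one expected from a random permutation. The probabilities and counts are made-up illustrative numbers, not differentials of PRESENT, and the formulas are the standard forms of these statistics rather than a claim about the paper's exact estimators.

```python
import math

def llr_statistic(counts, p_cipher, p_random):
    """Standard log-likelihood ratio statistic sum_i N_i * log(p_i / q_i),
    comparing the 'cipher' distribution p against the 'random' distribution q."""
    return sum(n * math.log(p / q)
               for n, p, q in zip(counts, p_cipher, p_random) if n > 0)

def chi2_statistic(counts, p_random, n_pairs):
    """Standard chi-squared statistic of the observed counts against the
    counts expected under the random-permutation model."""
    return sum((n - n_pairs * q) ** 2 / (n_pairs * q)
               for n, q in zip(counts, p_random))

if __name__ == "__main__":
    n_pairs = 10_000
    # Hypothetical probabilities of three output differences for the attacked
    # cipher versus a random permutation (illustrative numbers only).
    p_cipher = [0.0020, 0.0012, 0.0008]
    p_random = [0.0010, 0.0010, 0.0010]
    counts   = [23, 13, 8]              # observed counts for one key guess

    # Complete every vector with a "no tracked difference" bucket so that the
    # probabilities sum to 1 and all n_pairs samples are accounted for.
    p_c = p_cipher + [1.0 - sum(p_cipher)]
    p_r = p_random + [1.0 - sum(p_random)]
    cnt = counts + [n_pairs - sum(counts)]

    print("LLR  :", round(llr_statistic(cnt, p_c, p_r), 3))
    print("chi^2:", round(chi2_statistic(cnt, p_r, n_pairs), 3))
    # Key candidates are ranked by these statistics; the right key should make
    # the observed counts look like the cipher's predicted distribution.
```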