scispace - formally typeset
Topic

Collision attack

About: Collision attack is a research topic. Over the lifetime, 1093 publications have been published within this topic receiving 28389 citations.
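The generic form of a collision attack is the birthday search: hash many inputs until two distinct ones produce the same digest, which takes roughly 2^(n/2) evaluations for an n-bit hash. A minimal sketch on a deliberately truncated SHA-256 (a toy target, not any specific attack from the papers below):

```python
import hashlib
from itertools import count

def truncated_hash(data: bytes, nbytes: int = 3) -> bytes:
    """First `nbytes` bytes of SHA-256 -- a weakened toy target."""
    return hashlib.sha256(data).digest()[:nbytes]

def find_collision(nbytes: int = 3):
    """Birthday-style search: store seen digests until two distinct
    inputs map to the same value. Expected work is about
    2^(4 * nbytes) hash evaluations (the birthday bound)."""
    seen = {}
    for i in count():
        msg = i.to_bytes(8, "big")
        h = truncated_hash(msg, nbytes)
        if h in seen:
            return seen[h], msg  # two distinct messages, same digest
        seen[h] = msg

a, b = find_collision()
assert a != b and truncated_hash(a) == truncated_hash(b)
```

For a 24-bit target this finds a collision in a few thousand hashes; the same search against full SHA-256 would need about 2^128 work, which is why dedicated cryptanalytic collision attacks (like those below) aim to beat the generic bound.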


Papers
Book ChapterDOI
04 Nov 2009
TL;DR: A semi-free-start collision attack on reduced-round LANE-256-(3,3) is found with complexity of 2^62 compression function evaluations and 2^69 memory, and the technique can be applied to LANE-512-(3,4) to get a semi-free-start collision attack with the same complexity of 2^62 and 2^69 memory.
Abstract: The LANE[4] hash function is designed by Sebastiaan Indesteege and Bart Preneel. It is now a first round candidate of NIST's SHA-3 competition. The LANE hash function contains four concrete designs with different digest lengths of 224, 256, 384 and 512. The LANE hash function uses two permutations P and Q, which consist of different numbers of AES[1]-like rounds. LANE-224/256 uses 6-round P and 3-round Q. LANE-384/512 uses 8-round P and 4-round Q. We will use LANE-n-(a,b) to denote a variant of LANE with a-round P, b-round Q and a digest length n. We have found a semi-free-start collision attack on reduced-round LANE-256-(3,3) with complexity of 2^62 compression function evaluations and 2^69 memory. This technique can be applied to LANE-512-(3,4) to get a semi-free-start collision attack with the same complexity of 2^62 and 2^69 memory. We also propose a collision attack on LANE-512-(3,4) with complexity of 2^94 and 2^133 memory.

17 citations

Book ChapterDOI
17 Oct 2008
TL;DR: The different variants of, and attacks against, the fast syndrome-based hash function are reviewed so as to clearly point out which choices are secure and which are not.
Abstract: Hash functions are a hot topic at the moment in cryptography. Many proposals are going to be made for SHA-3, and among them, some provably collision resistant hash functions might also be proposed. These do not really compete with "standard" designs as they are usually much slower and not well suited for constrained environments. However, they present an interesting alternative when speed is not the main objective. As always when dealing with provable security, hard problems are involved, and the fast syndrome-based cryptographic hash function proposed by Augot, Finiasz and Sendrier at Mycrypt 2005 relies on the problem of Syndrome Decoding, a well known "Post Quantum" problem from coding theory. In this article we review the different variants and attacks against it so as to clearly point out which choices are secure and which are not.

17 citations

Patent
28 Sep 2012
TL;DR: A first hash value is obtained by applying a first hash function to a first input, such as an implicit certificate, a message to be signed, a message to be verified, or other suitable information, as discussed in this paper.
Abstract: Methods, systems, and computer programs for producing hash values are disclosed. A first hash value is obtained by applying a first hash function to a first input. The first input can be based on an implicit certificate, a message to be signed, a message to be verified, or other suitable information. A second hash value is obtained by applying a second hash function to a second input. The second input is based on the first hash value. The second hash value is used in a cryptographic scheme. In some instances, a public key or a private key is generated based on the second hash value. In some instances, a digital signature is generated based on the second hash value, or a digital signature is verified based on the second hash value, as appropriate.
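The core construction described above is a two-stage hash: the second hash function is applied to an input derived from the first hash value. A minimal sketch, assuming concrete hash choices (SHA-256 then SHA-512) purely for illustration; the patent leaves the functions and the derivation of the second input abstract:

```python
import hashlib

def chained_hash(first_input: bytes) -> bytes:
    """Two-stage hashing as described in the patent abstract:
    a second hash function is applied to an input based on the
    first hash value. Hash algorithm choices here are assumptions."""
    first = hashlib.sha256(first_input).digest()   # first hash function
    second = hashlib.sha512(first).digest()        # second hash function
    return second

# e.g. a value later fed into key generation or signature verification
digest = chained_hash(b"implicit-certificate-bytes")
```

In the scheme, the resulting second hash value is what enters the cryptographic step (key generation, signing, or verification), so a collision in either stage would have to survive the composition.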

17 citations

Book ChapterDOI
26 Mar 2007
TL;DR: The main insight of this work comes from the fact that, by using randomized message preprocessing via a short random salt p, one can use the "hash then encrypt" paradigm with suboptimal "practical" ε-universal hash functions, and still improve its exact security to optimal O(q^2/2^k).
Abstract: "Hash then encrypt" is an approach to message authentication, where first the message is hashed down using an ε-universal hash function, and then the resulting k-bit value is encrypted, say with a block-cipher. The security of this scheme is proportional to εq^2, where q is the number of MACs the adversary can request. As ε is at least 2^-k, the best one can hope for is O(q^2/2^k) security. Unfortunately, such small ε is not achieved by simple hash functions used in practice, such as the polynomial evaluation or the Merkle-Damgård construction, where ε grows with the message length L. The main insight of this work comes from the fact that, by using randomized message preprocessing via a short random salt p (which must then be sent as part of the authentication tag), we can use the "hash then encrypt" paradigm with suboptimal "practical" ε-universal hash functions, and still improve its exact security to optimal O(q^2/2^k). Specifically, by using at most an O(log L)-bit salt p, one can always regain the optimal exact security O(q^2/2^k), even in situations where ε grows polynomially with L. We also give very simple preprocessing maps for popular "suboptimal" hash functions, namely polynomial evaluation and the Merkle-Damgård construction. Our results come from a general extension of the classical Carter-Wegman paradigm, which we believe is of independent interest. On a high level, it shows that public randomization allows one to use the potentially much smaller "average-case" collision probability in place of the "worst-case" collision probability ε.
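A rough sketch of the shape of such a scheme, with several loudly-labeled assumptions: the preprocessing here (prepending the salt to the message) is only a plausible illustration, not the paper's actual maps, and an HMAC-based PRF stands in for the block cipher:

```python
import hashlib
import hmac
import secrets

PRIME = 2**127 - 1  # Mersenne-prime field for the polynomial hash (assumption)

def poly_hash(key: int, msg: bytes) -> int:
    """epsilon-universal polynomial-evaluation hash over GF(p):
    interpret msg as 15-byte coefficients and evaluate at `key`.
    The length is folded in to separate different block counts."""
    acc = len(msg)
    for i in range(0, len(msg), 15):
        acc = (acc * key + int.from_bytes(msg[i:i + 15], "big")) % PRIME
    return acc

def mac(hash_key: int, prf_key: bytes, msg: bytes) -> tuple[bytes, bytes]:
    salt = secrets.token_bytes(4)        # short random salt, sent with the tag
    h = poly_hash(hash_key, salt + msg)  # randomized preprocessing (assumption)
    # "encrypt" the hash value -- an HMAC PRF stands in for the block cipher
    tag = hmac.new(prf_key, h.to_bytes(16, "big"), hashlib.sha256).digest()[:16]
    return salt, tag

def verify(hash_key: int, prf_key: bytes, msg: bytes,
           salt: bytes, tag: bytes) -> bool:
    h = poly_hash(hash_key, salt + msg)
    expect = hmac.new(prf_key, h.to_bytes(16, "big"), hashlib.sha256).digest()[:16]
    return hmac.compare_digest(expect, tag)
```

The point the paper makes is visible in the structure: without the salt, an adversary could target worst-case collisions in `poly_hash`, whose probability grows with message length; the public salt forces collisions to hold on average over the randomization.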

17 citations

Book ChapterDOI
17 Dec 2013
TL;DR: Reducing the capacity to the output size of the SHA-3 standard slightly improves attacks, while reducing the permutation size degrades attacks on Keccak.
Abstract: In October 2012, NIST announced Keccak as the winner of the SHA-3 cryptographic hash function competition. Recently, at CT-RSA 2013, NIST brought up the idea to standardize Keccak variants with different parameters than those submitted to the SHA-3 competition. In particular, NIST considers reducing the capacity to the output size of the SHA-3 standard and, additionally, standardizing a Keccak variant with a permutation size of 800 instead of 1600 bits. However, these variants have not been analyzed very well during the SHA-3 competition. Especially for the variant using an 800-bit permutation, no analysis on the hash function has been published so far. In this work, we analyze these newly proposed Keccak variants and provide practical collisions for up to 4 rounds for all output sizes by constructing internal collisions. Our attacks are based on standard differential cryptanalysis, contrary to the recent attacks by Dinur et al. from FSE 2013. We use a non-linear low-probability path for the first two rounds and use methods from coding theory to find a high-probability path for the last two rounds. The low-probability path as well as the conforming message pair is found using an automatic differential path search tool. Our results indicate that reducing the capacity slightly improves attacks, while reducing the permutation size degrades attacks on Keccak.
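The capacity discussion above is easiest to see through the sponge's generic security bounds (against ideal-permutation attackers, not the differential attacks in the paper): collision resistance is about min(c/2, n/2) bits and preimage resistance about min(c/2, n) bits for capacity c and output size n. A small sketch of the arithmetic:

```python
def sponge_security(capacity: int, n: int) -> dict:
    """Generic security levels, in bits, of a sponge-based hash with
    capacity `capacity` and output size `n`:
    collision ~ min(c/2, n/2), (second) preimage ~ min(c/2, n)."""
    return {
        "collision": min(capacity // 2, n // 2),
        "preimage": min(capacity // 2, n),
    }

# Keccak as submitted to the competition used c = 2n, e.g. n = 256:
submitted = sponge_security(512, 256)
# The NIST-considered variant reduces the capacity to the output size:
reduced = sponge_security(256, 256)
```

With c = 2n both variants keep the same generic collision level, but reducing the capacity to n halves the generic preimage level; the paper's contribution is orthogonal, showing how concrete differential attacks fare on the reduced-capacity and 800-bit-permutation variants.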

17 citations


Network Information
Related Topics (5)
Cryptography
37.3K papers, 854.5K citations
88% related
Public-key cryptography
27.2K papers, 547.7K citations
87% related
Hash function
31.5K papers, 538.5K citations
85% related
Encryption
98.3K papers, 1.4M citations
85% related
Computer security model
18.1K papers, 352.9K citations
82% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    11
2022    24
2021    15
2020    13
2019    19
2018    15