
Showing papers on "Collision resistance published in 2016"


Journal ArticleDOI
TL;DR: A new chaotic system is proposed and employed to design a secure and fast hash function that has a dynamic random array of functions and can be implemented by a parallel architecture; cryptanalysis supports the security of the proposed function.
Abstract: Hash functions play an important role in the information security era. Although there are different methods to design these functions, in recent years chaos theory has emerged as a strong solution in this area. Chaotic hash functions use one-dimensional maps such as the logistic and tent maps, or employ complex multi-dimensional maps which are typically insecure or slow, and most of them have been successfully attacked. In this paper, we propose a new chaotic system and employ it to design a secure and fast hash function. The improved security factor has its roots in the hypersensitivity of the proposed chaotic map, while properties like speed and security can be parameterized. On the other hand, the proposed hash function has a dynamic random array of functions and can be implemented by a parallel architecture. This data-level parallel architecture makes it fast to generate the hash value. Statistical simulations show the success of the proposed hashing scheme. Cryptanalysis of the proposed function, covering key sensitivity, meet-in-the-middle attacks, collision and preimage resistance, and high-level attacks, supports the security of the proposed function.
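As a concrete illustration of the general recipe such chaos-based designs follow (a chaotic map whose trajectory is perturbed by message bytes and then quantized into a digest), here is a minimal toy sketch using the classic logistic map. It is not the authors' proposed system; the map, parameters, digest size and padding are placeholders chosen only to show the structure.

```python
# Toy chaos-based hash in the general style described above (logistic map,
# placeholder parameters and padding); it is NOT the authors' proposed system
# and is meant only to show how message bytes perturb a chaotic trajectory
# that is then quantized into a digest.

def chaotic_toy_hash(message: bytes, rounds: int = 8) -> bytes:
    x = 0.3141592653589793          # initial condition, playing the role of a key/IV
    r = 3.99                        # logistic-map parameter deep in the chaotic regime
    digest = bytearray(16)
    for i, byte in enumerate(message + b"\x80"):        # minimal padding byte
        x = (x + byte / 255.0) % 1.0                     # perturb the state with the message
        x = min(max(x, 1e-12), 1.0 - 1e-12)              # keep the trajectory away from fixed points
        for _ in range(rounds):                          # iterate the chaotic map
            x = r * x * (1.0 - x)
        digest[i % 16] ^= int(x * 255) & 0xFF            # quantize and mix into the digest
    return bytes(digest)

print(chaotic_toy_hash(b"hello").hex())
```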

48 citations


Journal ArticleDOI
TL;DR: A lightweight hash function with reduced hardware-implementation complexity, capable of achieving standard security, is proposed; it uses a sponge construction whose permutation function updates two non-linear feedback shift registers.
Abstract: The increased demand for lightweight applications has triggered the need for appropriate security mechanisms in them. Lightweight cryptographic hash functions are among the major responses to such a requirement. Several such hash functions, including QUARK, PHOTON, SPONGENT and GLUON, have already been introduced. The cryptanalysis of these hash functions is crucial for analysing their strength and often calls for improvements in their designs. Their performance also has to be taken care of, in terms of both software and hardware implementations. Here, the authors propose a lightweight hash function with reduced complexity in terms of hardware implementation, capable of achieving standard security. It uses a sponge construction with a permutation function involving the update of two non-linear feedback shift registers. In terms of sponge capacity, it provides at least 80-bit security against generic attacks, which is currently acceptable.
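For readers unfamiliar with the sponge mode referenced above, the following minimal sketch shows the absorb/squeeze structure. The "permutation" is a stand-in (a truncated SHA-256 call, which is not even a true permutation) used only to make the mode runnable, and the toy rate/capacity sizes are arbitrary; the paper's design instead builds the permutation from two non-linear feedback shift registers.

```python
# Minimal sponge-mode sketch: absorb rate-sized blocks into a state, then
# squeeze the digest. The "permutation" below is a truncated SHA-256 call used
# only to make the mode runnable (it is not a true permutation), and the toy
# RATE/CAPACITY values are arbitrary.
import hashlib

RATE, CAPACITY = 8, 16            # bytes; the capacity bounds generic security

def permutation(state: bytes) -> bytes:
    return hashlib.sha256(state).digest()[:RATE + CAPACITY]

def sponge_hash(message: bytes, out_len: int = 16) -> bytes:
    state = bytes(RATE + CAPACITY)
    # Simplified 10*-style padding to a multiple of RATE.
    message += b"\x80" + b"\x00" * ((-len(message) - 1) % RATE)
    for i in range(0, len(message), RATE):               # absorbing phase
        block = message[i:i + RATE] + bytes(CAPACITY)
        state = permutation(bytes(a ^ b for a, b in zip(state, block)))
    out = b""
    while len(out) < out_len:                             # squeezing phase
        out += state[:RATE]
        state = permutation(state)
    return out[:out_len]

print(sponge_hash(b"lightweight").hex())
```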

42 citations


Journal ArticleDOI
TL;DR: This work surveys known attacks and hardness results, discusses different flavors of hardness (one-wayness, pseudorandomness, collision resistance, public-key encryption), and mentions applications to other problems in cryptography and computational complexity, with the hope of developing a systematic study of the cryptographic hardness of local functions.
Abstract: Constant parallel-time cryptography makes it possible to perform complex cryptographic tasks at an ultimate level of parallelism, namely by local functions, in which each output bit depends on a constant number of input bits. A natural way to obtain local cryptographic constructions is to use random local functions, in which each output bit is computed by applying some fixed d-ary predicate P to a randomly chosen d-size subset of the input bits. In this work, we study the cryptographic hardness of random local functions. In particular, we survey known attacks and hardness results, discuss different flavors of hardness (one-wayness, pseudorandomness, collision resistance, public-key encryption), and mention applications to other problems in cryptography and computational complexity. We also present some open questions, with the hope of developing a systematic study of the cryptographic hardness of local functions.
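A minimal sketch of the object just described: sample, for each output bit, a random d-subset of input positions and apply one fixed d-ary predicate. The specific predicate below (XOR of one bit with an AND of two others) is only an illustrative choice, not one singled out by the survey.

```python
# Sketch of a random local function: every output bit applies one fixed d-ary
# predicate P to a randomly chosen d-subset of the input bits. The subsets are
# public; only the input is secret when the function is used as a OWF/PRG.
import random

def sample_local_function(n: int, m: int, d: int = 3, seed: int = 0):
    rng = random.Random(seed)
    subsets = [rng.sample(range(n), d) for _ in range(m)]   # public d-subsets

    def P(bits):                                            # fixed d-ary predicate (d = 3), illustrative choice
        return bits[0] ^ (bits[1] & bits[2])

    def F(x):                                               # the local function itself
        return [P([x[i] for i in s]) for s in subsets]

    return F

F = sample_local_function(n=32, m=48)
x = [random.getrandbits(1) for _ in range(32)]              # random input
print(F(x))                                                 # 48 output bits, each depending on 3 inputs
```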

41 citations


Book ChapterDOI
26 Sep 2016
TL;DR: In this paper, the authors define four security properties of logging schemes such as certificate transparency that can be assured via cryptographic means, and show that certificate transparency does achieve these security properties, including security against a malicious logger attempting to present different views of the log to different parties or at different points in time.
Abstract: Since hundreds of certificate authorities (CAs) can issue browser-trusted certificates, it can be difficult for domain owners to detect certificates that have been fraudulently issued for their domain. Certificate Transparency (CT) is a recent standard by the Internet Engineering Task Force (IETF) that aims to construct public logs of all certificates issued by CAs, making it easier for domain owners to monitor for fraudulently issued certificates. To avoid relying on trusted log servers, CT includes mechanisms by which monitors and auditors can check whether logs are behaving honestly or not; these mechanisms are primarily based on Merkle tree hashing and authentication proofs. Given that CT is now being deployed, it is important to verify that it achieves its security goals. In this work, we define four security properties of logging schemes such as CT that can be assured via cryptographic means, and show that CT does achieve these security properties. We consider two classes of security goals: those involving security against a malicious logger attempting to present different views of the log to different parties or at different points in time, and those involving security against malicious monitors who attempt to frame an honest log for failing to include a certificate in the log. We show that Certificate Transparency satisfies these security properties under various assumptions on Merkle trees all of which reduce to collision resistance of the underlying hash function (and in one case with the additional assumption of unforgeable signatures).
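The Merkle-tree mechanism mentioned above can be made concrete with a small sketch of audit-path verification, the proof that a certificate is included under a signed tree head. The 0x00/0x01 leaf/node prefixes follow the domain-separation idea of RFC 6962, but the encoding here is illustrative rather than the exact CT wire format.

```python
# Sketch of Merkle audit-path verification: a log entry plus its authentication
# path recomputes the tree root. Illustrative encoding, not the CT wire format.
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def leaf_hash(entry: bytes) -> bytes:
    return H(b"\x00" + entry)

def node_hash(left: bytes, right: bytes) -> bytes:
    return H(b"\x01" + left + right)

def verify_audit_path(entry: bytes, path, root: bytes) -> bool:
    """path: list of (sibling_hash, sibling_is_on_the_right) pairs, leaf to root."""
    h = leaf_hash(entry)
    for sibling, is_right in path:
        h = node_hash(h, sibling) if is_right else node_hash(sibling, h)
    return h == root

# Two-leaf example: prove "cert-A" is in the tree whose root commits to both leaves.
leaves = [leaf_hash(b"cert-A"), leaf_hash(b"cert-B")]
root = node_hash(leaves[0], leaves[1])
print(verify_audit_path(b"cert-A", [(leaves[1], True)], root))   # True
```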

38 citations


Proceedings ArticleDOI
24 Oct 2016
TL;DR: In this paper, the authors improved the efficiency of non-malleable codes in the split state model by constructing a code with codeword length (roughly), where |s| is the length of the message, and k is the security parameter.
Abstract: In this work, we significantly improve the efficiency of non-malleable codes in the split state model, by constructing a code with codeword length (roughly), where |s| is the length of the message, and k is the security parameter. This is a substantial improvement over previous constructions, both asymptotically and concretely. Our construction relies on a new primitive which we define and study, called l-more extractable hash functions. This notion, which may be of independent interest, is strictly stronger than the previous notion of extractable hash by Goldwasser et al. (Eprint '11) and Bitansky et al. (ITCS '12, Eprint '14), yet we can instantiate it under the same assumption used for the previous extractable hash function (a variant of the Knowledge of Exponent Assumption).

33 citations


Book ChapterDOI
04 Dec 2016
TL;DR: This work presents generic CRF constructions for several widely used cryptographic protocols based on a new notion named malleable smooth projective hash function.
Abstract: Motivated by the revelations of Edward Snowden, post-Snowden cryptography has become a prominent research direction in recent years. In Eurocrypt 2015, Mironov and Stephens-Davidowitz proposed a novel concept named cryptographic reverse firewall (CRF), which can resist exfiltration of secret information from an arbitrarily compromised machine. In this work, we continue this line of research and present generic CRF constructions for several widely used cryptographic protocols based on a new notion named malleable smooth projective hash function. Our contributions can be summarized as follows.

29 citations


Journal ArticleDOI
TL;DR: Compared with existing chaotic hash algorithms, the proposed chaotic hash algorithm shows moderate statistical performance and better speed, randomness-test results, and flexibility, and the results demonstrate that the proposed algorithm has strong security.
Abstract: We propose a chaotic hash algorithm based on circular shifts with variable parameters in this paper. We exploit a piecewise linear chaotic map and a one-way coupled map lattice to produce initial values and variable parameters. Circular shifts are introduced to improve the randomness of hash values. We evaluate the proposed hash algorithm in terms of distribution of the hash value, sensitivity of the hash value to slight modifications of the original message and secret keys, confusion and diffusion properties, robustness against birthday and meet-in-the-middle attacks, collision tests, speed, randomness tests, flexibility, and computational complexity; the results demonstrate that the proposed algorithm has strong security. Compared with existing chaotic hash algorithms, our algorithm shows moderate statistical performance and better speed, randomness-test results, and flexibility.

25 citations


Proceedings ArticleDOI
28 Jun 2016
TL;DR: It is shown that the currently proposed parameters, combined with the use of efficient hashing hardware, can lead to a feasible attack with significant collision probability, and possible modifications are suggested to make TESLA-based NMA more robust to such attacks.
Abstract: In the proposals for Global Navigation Satellite Systems (GNSS) Navigation Message Authentication (NMA) that are based on adapting the Timed Efficient Stream Loss-Tolerant Authentication (TESLA) protocol, the length of the one-time keys is limited (e.g. to 80 bits) by the low transmission rate. As a consequence, the hash function that is used to build the one-way key chain is constructed from a longer, secure hash function (e.g. SHA-256), preceded by a time-varying yet deterministic padding of the input and followed by a truncation of the output. We evaluate the impact of this construction on the collision resistance of the resulting hash function and of the whole chain, and show that the currently proposed parameters, combined with the use of efficient hashing hardware, can lead to a feasible attack with significant collision probability. The collision can be leveraged to mount a long-lasting spoofing attack, in which the victim receiver accepts all the one-time keys and the navigation messages transmitted by the attacker as authentic. We conclude by suggesting possible modifications to make TESLA-based NMA more robust to such attacks.
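The chain construction described above (pad, hash with SHA-256, truncate to the short key length) can be sketched as follows. The padding layout, epoch encoding and chain length below are placeholders, not the GNSS NMA specification; the sketch only shows why a receiver holding an anchored root key can verify each later-disclosed key with one truncated hash per chain step.

```python
# Sketch of a TESLA-style one-way key chain built from a padded, truncated
# SHA-256. Padding layout and epoch encoding are illustrative placeholders.
import hashlib

KEY_BITS = 80                        # short one-time keys, as in the proposals

def chain_step(key: bytes, epoch: int) -> bytes:
    # Time-varying deterministic padding (illustrative), then hash, then truncate.
    padded = key + epoch.to_bytes(4, "big") + b"\x00" * 8
    return hashlib.sha256(padded).digest()[:KEY_BITS // 8]

def build_chain(seed: bytes, length: int):
    keys = [seed[:KEY_BITS // 8]]
    for epoch in range(length):
        keys.append(chain_step(keys[-1], epoch))
    return list(reversed(keys))      # most-hashed key first: that one is the anchored root

chain = build_chain(b"\x01" * 10, length=5)
# Verify the first disclosed key against the root with a single chain step.
print(chain_step(chain[1], epoch=4) == chain[0])   # True
```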

22 citations


Journal ArticleDOI
01 Jan 2016
TL;DR: This paper presents an explicit quantum hash function which is "balanced" one-way resistant and collision resistant and demonstrates how to build a large family of balanced quantum hash functions.
Abstract: In this paper we define the notion of a resistant quantum hash function, which combines the notion of pre-image (one-way) resistance with the notion of collision resistance. In the quantum setting, the one-way resistance and collision resistance properties are correlated: the "more" a quantum function is one-way resistant, the "less" it is collision resistant, and vice versa. We present an explicit quantum hash function which is "balanced" one-way resistant and collision resistant, and demonstrate how to build a large family of balanced quantum hash functions.

22 citations


Posted Content
TL;DR: Using tools from graph theory and additive number theory, several open problems and conjectures concerning bounds and constructions for separating hash families are solved, and a bridge between perfect hash families and hypergraph Turán problems is established.
Abstract: Separating hash families are useful combinatorial structures which are generalizations of many well-studied objects in combinatorics, cryptography and coding theory. In this paper, using tools from graph theory and additive number theory, we solve several open problems and conjectures concerning bounds and constructions for separating hash families. Firstly, we discover that the cardinality of a separating hash family satisfies a Johnson-type inequality. As a result, we obtain a new upper bound, which is superior to all previous ones. Secondly, we present a construction for an infinite class of perfect hash families. It is based on the Hamming graphs in coding theory and generalizes many constructions that appeared before. It provides an affirmative answer to both Bazrafshan-Trung's open problem on separating hash families and Alon-Stav's conjecture on parent-identifying codes. Thirdly, let $p_t(N,q)$ denote the maximal cardinality of a $t$-perfect hash family of length $N$ over an alphabet of size $q$. Walker II and Colbourn conjectured that $p_3(3,q)=o(q^2)$. We verify this conjecture by proving $q^{2-o(1)}

16 citations


Journal ArticleDOI
Yantao Li1
01 May 2016-Optik
TL;DR: This paper utilizes message extension to enhance the correlation of plaintexts in the message and an aggregation operation to improve the relation between sequences of message blocks, which significantly increases the sensitivity between the message and hash values, thereby greatly resisting collisions.

Posted Content
TL;DR: The hash function inversion problem for fixed targets is reduced to the satisfiability problem for Boolean logic, and MapleCrypt, a SAT solver-based cryptanalysis tool for inverting hash functions, is presented.
Abstract: SAT solvers are increasingly being used for cryptanalysis of hash functions and symmetric encryption schemes. Inspired by this trend, we present MapleCrypt which is a SAT solver-based cryptanalysis tool for inverting hash functions. We reduce the hash function inversion problem for fixed targets into the satisfiability problem for Boolean logic, and use MapleCrypt to construct preimages for these targets. MapleCrypt has two key features, namely, a multi-armed bandit based adaptive restart (MABR) policy and a counterexample-guided abstraction refinement (CEGAR) technique. The MABR technique uses reinforcement learning to adaptively choose between different restart policies during the run of the solver. The CEGAR technique abstracts away certain steps of the input hash function, replacing them with the identity function, and verifies whether the solution constructed by MapleCrypt indeed hashes to the previously fixed targets. If it is determined that the solution produced is spurious, the abstraction is refined until a correct inversion to the input hash target is produced. We show that the resultant system is faster for inverting the SHA-1 hash function than state-of-the-art inversion tools.

Proceedings ArticleDOI
09 Jun 2016
TL;DR: This paper presents a new structure of Hash function that integrates a strong Chaotic generator into neurons instead of using simple Chaotic maps and demonstrates the efficiency of the implemented structure in terms of strong Collision Resistance and High Message Sensitivity compared to SHA-2 and some Chaos-based Hash functions.
Abstract: The Secure Hash Algorithm (SHA) is the most popular standard for cryptographic hash functions. Several security protocols use SHA to provide message integrity, authentication and digital signatures. Nowadays, a new technology based on chaotic neural networks is used to design hash functions, due to the following important properties of chaos and neural networks: non-linearity, compression, confusion and diffusion. Compared to existing hash functions based on chaotic neural networks, the proposed structure integrates a strong chaotic generator into the neurons instead of using simple chaotic maps. In fact, simple chaotic maps are not very robust, even against some statistical attacks (uniformity and NIST tests). To reduce the complexity of the hash function proposed at the ICITST conference (2015) while maintaining its strength, we present in this paper a new structure of hash function. The theoretical analysis and the obtained experimental performance demonstrate the efficiency of the implemented structure in terms of strong collision resistance and high message sensitivity compared to SHA-2 and some chaos-based hash functions.

Journal ArticleDOI
TL;DR: Two new constructions of quantum hash functions are presented, one based on expander graphs and one based on extractor functions, together with an estimate of the amount of randomness needed to construct them.
Abstract: We present two new constructions of quantum hash functions: the first based on expander graphs and the second based on extractor functions and estimate the amount of randomness that is needed to construct them. We also propose a keyed quantum hash function based on extractor function that can be used in quantum message authentication codes and assess its security in a limited attacker model.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed a new cryptanalysis method for double-branch hash functions and applied it on the standard RIPEMD-128, greatly improving over previously known results on this algorithm.
Abstract: In this article we propose a new cryptanalysis method for double-branch hash functions and apply it to the standard RIPEMD-128, greatly improving over previously known results on this algorithm. Namely, we are able to build a very good differential path by placing one nonlinear differential part in each computation branch of the RIPEMD-128 compression function, but not necessarily in the early steps. In order to handle the low differential probability induced by the nonlinear part located in later steps, we propose a new method for using the available degrees of freedom, by attacking each branch separately and then merging them with free message blocks. Overall, we present the first collision attack on the full RIPEMD-128 compression function as well as the first distinguisher on the full RIPEMD-128 hash function. Experiments on a reduced number of rounds were conducted, confirming our reasoning and complexity analysis. Our results show that the 16-year-old RIPEMD-128, one of the last unbroken primitives belonging to the MD-SHA family, might not be as secure as originally thought.

Journal ArticleDOI
TL;DR: The Elliptic Curve Multiset Hash (ECMH) is introduced, which combines a usual bit string-valued hash function like BLAKE2 with an efficient encoding into binary elliptic curves to overcome both difficulties.
Abstract: A homomorphic, or incremental, multiset hash function, associates a hash value to arbitrary collections of objects (with possible repetitions) in such a way that the hash of the union of two collections is easy to compute from the hashes of the two collections themselves: it is simply their sum under a suitable group operation. In particular, hash values of large collections can be computed incrementally and/or in parallel. Homomorphic hashing is thus a very useful primitive with applications ranging from database integrity verification to streaming set/multiset comparison and network coding. Unfortunately, constructions of homomorphic hash functions in the literature are hampered by two main drawbacks: they tend to be much longer than usual hash functions at the same security level (e.g. to achieve a collision resistance of 2^128, they are several thousand bits long, as opposed to 256 bits for usual hash functions), and they are also quite slow. In this paper, we introduce the Elliptic Curve Multiset Hash (ECMH), which combines a usual bit string-valued hash function like BLAKE2 with an efficient encoding into binary elliptic curves to overcome both difficulties. On the one hand, the size of ECMH digests is essentially optimal: 2m-bit hash values provide O(2^m) collision resistance. On the other hand, we demonstrate a highly-efficient software implementation of ECMH, which our thorough empirical evaluation shows to be capable of processing over 3 million set elements per second on a 4 GHz Intel Haswell machine at the 128-bit security level---many times faster than previous practical methods.
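The homomorphic multiset-hash interface described above is easy to see in a few lines: map each element into a group and take the group sum, so the hash of a union is computable from the hashes of the parts. The sketch below uses addition modulo a prime purely to show the interface; that group is not collision resistant, which is exactly why ECMH encodes into an elliptic-curve group instead. BLAKE2 is used as the element hash because the abstract names it.

```python
# Sketch of the homomorphic multiset-hash interface (NOT the ECMH construction):
# per-element group encodings summed in a stand-in group (integers mod a prime).
import hashlib

P = 2**255 - 19                      # stand-in group: integers mod a prime, under addition

def element_to_group(x: bytes) -> int:
    return int.from_bytes(hashlib.blake2b(x, digest_size=32).digest(), "big") % P

def multiset_hash(elements) -> int:
    return sum(element_to_group(e) for e in elements) % P

A, B = [b"a", b"b"], [b"b", b"c"]
combined = multiset_hash(A + B)                             # hash of the multiset union
incremental = (multiset_hash(A) + multiset_hash(B)) % P     # computed from the parts
print(combined == incremental)                              # True: the homomorphic property
```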

Journal ArticleDOI
TL;DR: A generalization of quantum hash functions to arbitrary groups is considered; it is shown that a quantum hash function exists for an arbitrary abelian group, and some restrictions on the Hilbert space dimension and the group are proved.
Abstract: In this paper we consider a generalization of quantum hash functions to arbitrary groups. We show that a quantum hash function exists for an arbitrary abelian group. We construct a set of "good" automorphisms, a key component of the quantum hash function. We prove some restrictions on the Hilbert space dimension and the group used in the quantum hash function.

Journal ArticleDOI
15 Jul 2016
TL;DR: This paper investigates the use of hash value truncation in preserving ID anonymity in WSNs and the impact of hash value truncation on four criteria attributes (security against brute force attacks, probability of pseudonym collisions, energy trade-off and end-to-end packet delivery delay), and reports the possible impacts of other factors.
Abstract: Hash functions have been used to address security requirements such as integrity, message authentication and non-repudiation. In WSNs, these functions are also used to preserve sensor nodes' identity (ID) anonymity, i.e., they are used to generate and verify dynamic pseudonyms that are used to identify sensor nodes in a communication session. In this latter application, there is an open issue as to how long the output of a hash function (i.e. hash value) we should use in pseudonym generation. The longer the hash value, the longer is the pseudonym, thus the harder it is to guess a pseudonym that is generated by using a hash function. On the other hand, the use of a longer hash value also means that the bandwidth and energy costs in transmitting the pseudonym will be higher. As sensor nodes typically have limited resources and are battery powered, the balance between the protection level of ID anonymity and performance and energy costs incurred in providing such a protection is an open issue. This paper investigates the use of hash value truncation in preserving ID anonymity in WSNs and the impact of hash value truncation on four criteria attributes (security against brute force attacks, probability of pseudonym collisions, energy trade-off and end-to-end packet delivery delay). It reports the possible impacts of other factors including the type and usage of hash functions, sensor node capabilities, adversary capabilities, ability to resolve pseudonym collisions, network density and data collection rate. The results show that the impacts of these factors may be contradictory. Therefore, the determination of an optimal level of hash value truncation should consider all trade-offs brought by these factors.
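The trade-off the paper studies between truncation length and pseudonym-collision risk can be quantified with the standard birthday approximation. The snippet below is a back-of-the-envelope aid under that generic model, not a calculation taken from the paper.

```python
# Birthday approximation for truncated pseudonyms: with b-bit pseudonyms and N
# of them in use, P[collision] ~ 1 - exp(-N*(N-1) / 2^(b+1)).
import math

def pseudonym_collision_prob(n_pseudonyms: int, trunc_bits: int) -> float:
    return 1.0 - math.exp(-n_pseudonyms * (n_pseudonyms - 1) / 2.0 ** (trunc_bits + 1))

for bits in (16, 24, 32, 48):
    print(bits, f"{pseudonym_collision_prob(1000, bits):.6f}")   # longer truncation, lower risk
```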

Journal ArticleDOI
TL;DR: A timestamp-defined hash algorithm is proposed in the present work for secure data dissemination among vehicles; it fulfils the basic properties of a one-way unkeyed hash function, such as preimage resistance and collision resistance.

Journal ArticleDOI
TL;DR: Compared with the Chaum–Heijst–Pfitzmann hash based on a discrete logarithm problem, the new hash is lightweight, and thus it opens the door to convenient use of lightweight digital signing schemes.

Book ChapterDOI
01 Jan 2016
TL;DR: A cryptographic hash function H is a function which takes arbitrary length bit strings as input and produces a fixed-length bit string as output; the output is often called a digest, hashcode or hash value.
Abstract: A cryptographic hash function H is a function which takes arbitrary length bit strings as input and produces a fixed-length bit string as output; the output is often called a digest, hashcode or hash value. Hash functions are used a lot in computer science, but the crucial difference between a standard hash function and a cryptographic hash function is that a cryptographic hash function should at least have the property of being one-way.
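The defining input/output behaviour just described (arbitrary-length input, fixed-length digest) is easy to see with any standard hash function; SHA-256 is used below purely as a familiar example.

```python
# Fixed-length output for arbitrary-length input, shown with SHA-256.
import hashlib

for msg in (b"", b"abc", b"a" * 10_000):
    print(len(msg), hashlib.sha256(msg).hexdigest())        # always a 256-bit digest

# Changing a single character yields an unrelated-looking digest.
print(hashlib.sha256(b"abc").hexdigest() == hashlib.sha256(b"abd").hexdigest())  # False
```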

Book ChapterDOI
20 Mar 2016
TL;DR: If the universal hash functions in the schemes are replaced with the corresponding constructions, the problems with related-key attacks can be solved for some RKD sets.
Abstract: Universal hash functions (UHFs) have been extensively used in the design of cryptographic schemes. If we consider the related-key attack (RKA) against these UHF-based schemes, some of them may not be secure, especially those using the key of the UHF as a part of the whole key of the scheme, due to the weakness of UHFs in the RKA setting. In order to solve this issue, we propose a new concept of related-key almost universal hash function, which is a natural extension of almost universal hash functions to the RKA setting. We define related-key almost universal (RKA-AU) hash functions and related-key almost XOR universal (RKA-AXU) hash functions. However, almost all the existing UHFs do not satisfy the new definitions. We construct one fixed-input-length universal hash function named RH1 and two variable-input-length universal hash functions named RH2 and RH3. We show that RH1 and RH2 are both RKA-AXU, and RH3 is RKA-AU, for the RKD set Φ⊕. Furthermore, RH1, RH2 and RH3 are nearly as efficient as previous similar constructions. RKA-AU and RKA-AXU hash functions can be used as components in related-key secure cryptographic schemes. If we replace the universal hash functions in those schemes with our corresponding constructions, the problems with related-key attacks can be solved for some RKD sets. More specifically, we give four concrete applications of RKA-AU and RKA-AXU in related-key secure message authentication codes and tweakable block ciphers.
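To fix ideas, here is a standard polynomial-evaluation universal hash over a prime field, together with the Φ⊕ related-key derivation (the key XORed with an adversarially chosen difference). This is generic background for the objects discussed above, not the paper's RH1/RH2/RH3 constructions, and the plain polynomial hash is not claimed to be secure in the RKA setting.

```python
# Polynomial-evaluation universal hash plus the Phi^xor related-key derivation.
# Generic illustration only; not RH1/RH2/RH3 and not RKA-secure as written.
PRIME = 2**61 - 1                    # field size governs the collision bound

def poly_hash(key: int, blocks) -> int:
    acc = 0
    for b in blocks:                 # Horner evaluation of the message polynomial at `key`
        acc = (acc + b) * key % PRIME
    return acc

def related_key(key: int, delta: int) -> int:
    return key ^ delta               # the Phi^xor RKD set: k -> k XOR delta

k = 0x1234567890ABCDE % PRIME
msg = [3, 1, 4, 1, 5]
print(poly_hash(k, msg), poly_hash(related_key(k, 0xFF), msg))
```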

Book ChapterDOI
10 Aug 2016
TL;DR: Full-round collision attacks on the proposed Simpira-4 Davies-Meyer hash construction are proposed, which violate the designers' security claims that there are no structural distinguishers with complexity below 2^128.
Abstract: Simpira v1 is a recently proposed family of permutations, based on the AES round function. The design includes recommendations for using the Simpira permutations in block ciphers, hash functions, or authenticated ciphers. The designers' security analysis is based on computer-aided bounds for the minimum number of active S-boxes. We show that the underlying assumptions of independence, and thus the derived bounds, are incorrect. For family member Simpira-4, we provide differential trails with only 40 (instead of 75) active S-boxes for the recommended 15 rounds. Based on these trails, we propose full-round collision attacks on the proposed Simpira-4 Davies-Meyer hash construction, with complexity 2^82.62 for the recommended full 15 rounds and a truncated 256-bit hash value, and complexity 2^110.16 for 16 rounds and the full 512-bit hash value. These attacks violate the designers' security claims that there are no structural distinguishers with complexity below 2^128.

DOI
24 May 2016
TL;DR: A modern cryptographic hash function can compute the hash value for any initial message as discussed by the authors, and one of the basic hash function properties is the difficulty of finding two different messages with the same hash value (collision resistance).
Abstract: A modern cryptographic hash function can compute the hash value for any initial message. One of the basic hash function properties is the difficulty of finding two different messages with the same hash value (collision resistance). Due to this property, hash functions are widely used in popular security protocols like the Transport Layer Security or the Internet Protocol Security.

Book ChapterDOI
08 May 2016
TL;DR: The concatenation combiner of hash functions with an n-bit internal state does not offer better collision and preimage resistance compared to a single strong n-bit hash function, and the problem of devising second preimage attacks faster than 2^n against this combiner had remained open since 2005, when Kelsey and Schneier showed that a single Merkle-Damgård hash function does not offer optimal second preimage resistance for long messages.
Abstract: We study the security of the concatenation combiner H1(M) || H2(M) for two independent iterated hash functions with n-bit outputs that are built using the Merkle-Damgård construction. In 2004 Joux showed that the concatenation combiner of hash functions with an n-bit internal state does not offer better collision and preimage resistance compared to a single strong n-bit hash function. On the other hand, the problem of devising second preimage attacks faster than 2^n against this combiner has remained open since 2005, when Kelsey and Schneier showed that a single Merkle-Damgård hash function does not offer optimal second preimage resistance for long messages. In this paper, we develop new algorithms for cryptanalysis of hash combiners and use them to devise the first second preimage attack on the concatenation combiner. The attack finds second preimages faster than 2^n for messages longer than 2^(2n/7) and has optimal complexity of 2^(3n/4). This shows that the concatenation of two Merkle-Damgård hash functions is not as strong as a single ideal hash function. Our methods are also applicable to other well-studied combiners, and we use them to devise a new preimage attack with complexity of 2^(2n/3) on the XOR combiner H1(M) ⊕ H2(M) of two Merkle-Damgård hash functions. This improves upon the attack by Leurent and Wang presented at Eurocrypt 2015, whose complexity is 2^(5n/6) but which, unlike our attack, is also applicable to HAIFA hash functions. Our algorithms exploit properties of random mappings generated by fixing the message block input to the compression functions of H1 and H2. Such random mappings have been widely used in cryptanalysis, but we exploit them in new ways to attack hash function combiners.
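The two combiners discussed above are simple to state in code. The instantiation below uses off-the-shelf hash functions purely for illustration; the paper's attacks target the Merkle-Damgård structure of the component functions, not these specific primitives.

```python
# Concatenation and XOR combiners of two hash functions (illustrative instantiation).
import hashlib

def concat_combiner(m: bytes) -> bytes:
    return hashlib.sha256(m).digest() + hashlib.sha3_256(m).digest()

def xor_combiner(m: bytes) -> bytes:
    a, b = hashlib.sha256(m).digest(), hashlib.sha3_256(m).digest()
    return bytes(x ^ y for x, y in zip(a, b))

m = b"example message"
print(concat_combiner(m).hex())   # 512-bit concatenation combiner output
print(xor_combiner(m).hex())      # 256-bit XOR combiner output
```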

Journal ArticleDOI
TL;DR: A compact and effective chaos-based keyed hash function implemented by a cross-coupled topology of chaotic maps, which employs absolute-value of sinusoidal nonlinearity, and offers robust chaotic regions over broad parameter spaces with high degree of randomness through chaoticity measurements using the Lyapunov exponent.
Abstract: This paper presents a compact and effective chaos-based keyed hash function implemented by a cross-coupled topology of chaotic maps, which employs an absolute value of a sinusoidal nonlinearity and offers robust chaotic regions over broad parameter spaces with a high degree of randomness, as confirmed through chaoticity measurements using the Lyapunov exponent. Hash function operations involve an initial stage, when the chaotic map accepts initial conditions, and a hashing stage, which accepts input messages and generates the alterable-length hash values. Hashing performance is evaluated in terms of original message condition changes, statistical analyses, and collision analyses. The results show that the mean changed probabilities are very close to 50%, and the mean number of bit changes is also close to half of the hash value length. The collision tests reveal that the mean absolute difference of character values for hash values of 128, 160 and 256 bits is close to the ideal value of 85.43. The proposed keyed hash function enhances collision resistance compared to MD5 and SHA-1, and to other, more complicated chaos-based approaches. An implementation of the hash function as an Android application is demonstrated.
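The collision statistic quoted above (the mean absolute difference between corresponding character values of two digests) takes only a few lines to compute. SHA-256 is used below as a stand-in, since the paper's keyed chaotic hash is not reproduced here; for independent uniform bytes this statistic averages close to 85.

```python
# Mean absolute difference between corresponding bytes of two digests,
# the collision-test statistic referenced in the abstract (SHA-256 stand-in).
import hashlib

def mean_abs_diff(h1: bytes, h2: bytes) -> float:
    return sum(abs(a - b) for a, b in zip(h1, h2)) / len(h1)

d1 = hashlib.sha256(b"The quick brown fox").digest()
d2 = hashlib.sha256(b"The quick brown fox.").digest()   # one-character change
print(mean_abs_diff(d1, d2))
```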

Book ChapterDOI
20 Mar 2016
TL;DR: It is shown that it is actually possible to mount rebound attacks despite the presence of modular constant additions in the hash function Kupyna, and that the rebound attack can be used to create collisions for the round-reduced hash function itself.
Abstract: The hash function Kupyna was recently published as the Ukrainian standard DSTU 7564:2014. It is structurally very similar to the SHA-3 finalist Grøstl, but differs in details of the round transformations. Most notably, some of the round constants are added with a modular addition, rather than bitwise XOR. This change prevents a straightforward application of some recent attacks, in particular of the rebound attacks on the compression function of similar AES-like hash constructions. However, we show that it is actually possible to mount rebound attacks, despite the presence of modular constant additions. More specifically, we describe collision attacks on the compression function for 6 out of 10 rounds of Kupyna-256 with an attack complexity of 2^70, and for 7 rounds with complexity 2^125.8. In addition, we can use the rebound attack for creating collisions for the round-reduced hash function itself. This is possible for 4 rounds of Kupyna-256 with complexity 2^67 and for 5 rounds with complexity 2^120.

Proceedings ArticleDOI
01 Sep 2016
TL;DR: A method for designing a one-way cryptographic hash function and a block ciphering scheme based on the proposed hash codes is presented, and the experimental outcomes confirm the strong performance of the proposed chaotic hash method.
Abstract: Secure hashes have an indispensable role to play in modern multimedia image encryption. Traditional block ciphering techniques are quite complex, demand considerable processing time for key generation, and are sometimes a source of redundancy. This paper proposes a method for designing a one-way cryptographic hash function and a block ciphering scheme based on the proposed hash codes. In the proposed work, we divide the message into blocks, with each block individually processed by chaotic systems. The transitional hashes are created utilizing advanced control and input parameters. The two hash codes are utilized to create a final hash. The experimental outcomes confirm the strong performance of the proposed chaotic hash method. Moreover, the generated hash code is applied to realize an image block ciphering technique. The encryption process is plain-image dependent and thereby exhibits a satisfactory encryption effect suitable for practical applications.

Journal ArticleDOI
TL;DR: It is proved that S^r achieves asymptotically optimal collision security against semi-adaptive adversaries up to almost 2^(n/2) queries, and that it can be made preimage secure up to 2^n queries using a simple tweak.
Abstract: A well-established method of constructing hash functions is to base them on non-compressing primitives, such as one-way functions or permutations. In this work, we present S^r, an rn-to-n-bit compression function (for r ≥ 1) making 2r-1 calls to n-to-n-bit primitives (random functions or permutations). S^r compresses its inputs at a rate (the amount of message blocks per primitive call) of up to almost 1/2, and it outperforms all existing schemes with respect to rate and/or the size of the underlying primitives. For instance, instantiated with the 1600-bit permutation of NIST's SHA-3 hash function standard, it offers about 800-bit security at a rate of almost 1/2, while SHA-3-512 itself achieves only 512-bit security at a rate of about 1/3. We prove that S^r achieves asymptotically optimal collision security against semi-adaptive adversaries up to almost 2^(n/2) queries and that it can be made preimage secure up to 2^n queries using a simple tweak.
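A quick, hedged sanity check of the rate figures quoted above. It relies on an assumption not stated explicitly in the abstract, namely that when S^r is iterated one of its r input blocks carries the chaining value, so r-1 message blocks are absorbed per 2r-1 primitive calls; the SHA-3-512 figure follows from its 1600-bit state and 2·512-bit capacity.

```latex
% Assumption: r-1 message blocks per 2r-1 primitive calls when S^r is iterated.
\mathrm{rate}(S^r) = \frac{r-1}{2r-1} \longrightarrow \frac{1}{2} \quad (r \to \infty),
\qquad
\mathrm{rate}(\mathrm{SHA}\text{-}3\text{-}512) = \frac{1600 - 2 \cdot 512}{1600} = \frac{576}{1600} = 0.36 \approx \frac{1}{3}.
```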