
Showing papers on "Sponge function" published in 2010


Book ChapterDOI
17 Aug 2010
TL;DR: A novel way to use a sponge function that inputs and outputs blocks in a continuous fashion, allowing the feed of seeding material to be interleaved with the fetch of pseudo-random numbers without latency.
Abstract: This paper proposes a new construction for the generation of pseudo-random numbers. The construction is based on sponge functions and is suitable for embedded security devices as it requires few resources. We propose a model for such generators and explain how to define one on top of a sponge function. The construction is a novel way to use a sponge function: it inputs and outputs blocks in a continuous fashion, making it possible to interleave the feed of seeding material with the fetch of pseudo-random numbers without latency. We describe the consequences of the sponge indifferentiability results for this construction and study its resistance against generic state recovery attacks. Finally, we propose a concrete example based on a member of the KECCAK family with small width.
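As a rough illustration of the feed/fetch interleaving described above, the sketch below implements a sponge-style pseudo-random generator in Python. The rate/capacity split, the padding of seed blocks, and the mixing function are all placeholder assumptions (truncated SHA-256 stands in for a small-width permutation such as Keccak-f); only the pattern of XORing seed material into the outer state and reading output blocks between permutation calls follows the construction described in the abstract.

```python
import hashlib

RATE = 16       # outer (exposed) bytes per block; assumed value
CAPACITY = 16   # inner (hidden) bytes; assumed value
WIDTH = RATE + CAPACITY

def permute(state: bytes) -> bytes:
    # Stand-in for a fixed small-width permutation such as Keccak-f.
    # Truncated SHA-256 is not a permutation; it is used here only to make
    # the sketch runnable.
    return hashlib.sha256(state).digest()[:WIDTH]

class SpongePRNG:
    """Sponge-style generator: seeding and output can be interleaved freely."""

    def __init__(self):
        self.state = bytes(WIDTH)

    def feed(self, seed_block: bytes) -> None:
        # Absorb up to RATE bytes of seeding material into the outer state.
        block = seed_block[:RATE].ljust(RATE, b"\x00")
        outer = bytes(a ^ b for a, b in zip(self.state[:RATE], block))
        self.state = permute(outer + self.state[RATE:])

    def fetch(self) -> bytes:
        # Squeeze one RATE-byte block of pseudo-random output, then permute.
        out = self.state[:RATE]
        self.state = permute(self.state)
        return out

prng = SpongePRNG()
prng.feed(b"initial seed material")
r1 = prng.fetch()
prng.feed(b"more seeding material later")  # reseeding without latency
r2 = prng.fetch()
```

The point of the construction is visible in the usage: feed() and fetch() calls can alternate in any order, each costing a single permutation call, which is what allows seeding and output generation to be interleaved without latency.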

118 citations


Journal ArticleDOI
TL;DR: A technique is described that combines a collision-resistant hash function with a protocol for Authenticated Encryption (AE); it is both simple and generic and does not require any additional key material beyond that of the AE protocol.
Abstract: We revisit the problem of constructing a protocol for performing Authenticated Encryption with Associated Data (AEAD). A technique is described which combines a collision-resistant hash function with a protocol for Authenticated Encryption (AE). The technique is both simple and generic and does not require any additional key material beyond that of the AE protocol. Concrete instantiations are shown where a 256-bit hash function is combined with some known single-pass AE protocols employing either 128-bit or 256-bit block ciphers. This can yield an efficiency improvement in the processing of the header.
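The composition can be pictured as follows: the arbitrary-length associated data is first compressed with the collision-resistant hash, and the fixed-length digest is what the single-pass AE protocol then authenticates alongside the message. The sketch below is only a schematic of that idea; the ae_encrypt callable is a hypothetical stand-in for a single-pass AE scheme, and the exact way the digest is bound to the AE input in the paper is not reproduced here.

```python
import hashlib
from typing import Callable, Tuple

# Hypothetical single-pass AE interface (not a real library API):
#   ae_encrypt(key, nonce, header, message) -> (ciphertext, tag)
AEEncrypt = Callable[[bytes, bytes, bytes, bytes], Tuple[bytes, bytes]]

def aead_encrypt(ae_encrypt: AEEncrypt, key: bytes, nonce: bytes,
                 header: bytes, message: bytes) -> Tuple[bytes, bytes]:
    """Schematic AEAD from AE plus a collision-resistant hash.

    The arbitrary-length header is compressed to a 256-bit digest, so the
    underlying AE only ever processes a short, fixed-length header.  The
    hash is unkeyed, so no key material beyond the AE key is needed;
    collision resistance carries the authenticity guarantee back to the
    original header.
    """
    header_digest = hashlib.sha256(header).digest()
    return ae_encrypt(key, nonce, header_digest, message)
```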

108 citations


Book ChapterDOI
08 Jun 2010
TL;DR: This paper presents a lightweight implementation of the permutations Keccak-f[200] and Keccak-f[400] of the SHA-3 candidate hash function Keccak; it is also the first lightweight implementation of a sponge function, which differentiates it from previous works.
Abstract: In this paper, we present a lightweight implementation of the permutations Keccak-f[200] and Keccak-f[400] of the SHA-3 candidate hash function Keccak. Our design is well suited for radio-frequency identification (RFID) applications that have limited resources and demand lightweight cryptographic hardware. Besides its low area and low power consumption, our design delivers decent throughput. To the best of our knowledge, it is also the first lightweight implementation of a sponge function, which differentiates it from previous works. By implementing the new hash algorithm Keccak, we have utilized the unique advantages of the sponge construction. Although the implementation is targeted at Application Specific Integrated Circuit (ASIC) platforms, it is also suitable for Field Programmable Gate Arrays (FPGAs). To obtain a compact design, serialized data processing principles are exploited together with algorithm-specific optimizations. The design requires only 2.52K gates and achieves a throughput of 8 kbps at a 100 kHz system clock, based on a 0.13-µm CMOS standard cell library.
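For reference, the sponge construction that such a serialized Keccak datapath realizes can be written in a few lines of software. The sketch below uses assumed rate/width values and a toy mixing function in place of Keccak-f[200]; it only illustrates the absorb-then-squeeze data flow, not the actual permutation or padding rule.

```python
import hashlib

RATE = 8    # r bytes absorbed or squeezed per permutation call; assumed value
WIDTH = 25  # b = 200 bits = 25 bytes, the width of Keccak-f[200]

def permute(state: bytes) -> bytes:
    # Placeholder for Keccak-f[200]; truncated SHA-256 is used only so the
    # sketch runs and carries no security claim.
    return hashlib.sha256(state).digest()[:WIDTH]

def sponge_hash(message: bytes, out_len: int) -> bytes:
    # Simplified 10*1-style padding up to a multiple of the rate.
    padded = message + b"\x01" + b"\x00" * ((-len(message) - 2) % RATE) + b"\x80"
    state = bytes(WIDTH)
    # Absorbing phase: XOR each r-byte block into the outer state, then permute.
    for i in range(0, len(padded), RATE):
        block = padded[i:i + RATE]
        outer = bytes(a ^ b for a, b in zip(state[:RATE], block))
        state = permute(outer + state[RATE:])
    # Squeezing phase: read r output bytes per permutation call.
    out = b""
    while len(out) < out_len:
        out += state[:RATE]
        state = permute(state)
    return out[:out_len]

digest = sponge_hash(b"abc", 20)
```

A serialized hardware implementation computes the same loop but processes the state a few bits or lanes per clock cycle, which is how the low gate count is obtained at the cost of throughput.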

90 citations


Proceedings ArticleDOI
26 Feb 2010
TL;DR: This paper analyzes a popular and cryptographically significant class of non-linear Boolean functions for their resistance to algebraic attacks.
Abstract: This paper analyzes and describes design issues for stream ciphers in network security, as stream ciphers are widely used to protect the privacy of digital information. A variety of attacks against stream ciphers exist (algebraic attacks among others), and they have been very successful against many designs. This paper therefore studies the design and analysis of stream ciphers. The main contribution is the design of new stream ciphers through analysis of the algebraic immunity of Boolean functions and S-boxes, using the cryptographic properties of non-linear transformations. Many LFSR (linear feedback shift register) based stream ciphers use a non-linear Boolean function to destroy the linearity of the LFSR output, and many of these designs have been broken by algebraic attacks. Here we analyze a popular and cryptographically significant class of non-linear Boolean functions for their resistance to algebraic attacks.
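The algebraic immunity analyzed here can be computed by brute force for small functions: it is the smallest degree d such that f or f+1 admits a nonzero annihilator of degree at most d. The following sketch is a straightforward exponential-time check, suitable only for toy numbers of variables, and is offered as an illustration of the notion rather than as the paper's method.

```python
from itertools import combinations

def algebraic_immunity(f, n):
    """Smallest d such that f or f+1 has a nonzero annihilator of degree <= d.

    f: callable taking an n-tuple of bits and returning 0 or 1.
    Exponential in n; intended only for small n (say n <= 8).
    """
    points = [tuple((x >> i) & 1 for i in range(n)) for x in range(2 ** n)]

    def has_annihilator(support, d):
        # Monomials of degree <= d, each given by its set of variable indices.
        mons = [m for k in range(d + 1) for m in combinations(range(n), k)]
        # One GF(2) row per point of the support: evaluations of every
        # monomial at that point, packed into an integer bitmask.
        rows = []
        for p in support:
            bits = 0
            for j, mon in enumerate(mons):
                if all(p[i] for i in mon):
                    bits |= 1 << j
            rows.append(bits)
        # Gaussian elimination over GF(2): a nonzero g vanishing on the whole
        # support exists iff rank < number of monomials.
        rank = 0
        for col in range(len(mons)):
            pivot = next((r for r in rows if (r >> col) & 1), None)
            if pivot is None:
                continue
            rows.remove(pivot)
            rows = [r ^ pivot if (r >> col) & 1 else r for r in rows]
            rank += 1
        return rank < len(mons)

    support = [p for p in points if f(p)]
    complement = [p for p in points if not f(p)]
    for d in range(n + 1):
        if has_annihilator(support, d) or has_annihilator(complement, d):
            return d

# Example: the 3-variable majority function has algebraic immunity 2.
maj3 = lambda p: int(sum(p) >= 2)
assert algebraic_immunity(maj3, 3) == 2
```

A filter function with low algebraic immunity yields low-degree equations relating keystream bits to the LFSR state, which is exactly what an algebraic attack exploits; this is why the property is used as a design criterion.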

28 citations


Book ChapterDOI
08 Sep 2010
TL;DR: In this paper, the authors discuss the state of the art of cryptographic algorithms as deployed for securing computing networks and argue that the design of efficient cryptographic algorithms is the easy part of securing a large scale network, but very often security problems are identified in algorithms and their implementations.
Abstract: This article discusses the state of the art of cryptographic algorithms as deployed for securing computing networks. While it has been argued that the design of efficient cryptographic algorithms is the "easy" part of securing a large scale network, it seems that very often security problems are identified in algorithms and their implementations.

7 citations


Journal ArticleDOI
TL;DR: A new threshold authenticated encryption scheme using a labor-division signature is proposed, with no redundancy added to the message blocks; it is secure against chosen-ciphertext attacks and existentially unforgeable under chosen-message attacks in the random oracle model.
Abstract: This paper shows several security weaknesses of a threshold authenticated encryption scheme. A new threshold authenticated encryption scheme using a labor-division signature is then proposed, with no redundancy added to the message blocks. Under the EDDH assumptions, the proposed scheme is secure against chosen-ciphertext attacks and existentially unforgeable under chosen-message attacks in the random oracle model.

2 citations


Journal ArticleDOI
TL;DR: The paper demonstrates a design defect: the threshold authenticated signature scheme cannot resist an insider attack and is not robust. An improved authenticated encryption scheme based on an elliptic curve cryptosystem is then proposed.
Abstract: An authenticated encryption scheme allows one signer to generate an authenticated ciphertext so that no one except the designated verifier can recover and verify the message. In a (t, n) threshold authenticated encryption scheme, any t or more signers can generate an authenticated encryption for a message and send it to the designated verifier. Compared with conventional encryption-then-signature schemes, threshold authenticated encryption schemes can meet more security requirements, including robustness, confidentiality, unforgeability, integrity, authenticity and non-repudiation. Based on Tseng and Jan's authenticated encryption scheme and an elliptic curve cryptosystem, Chung et al. [2] recently proposed an efficient (t, n) threshold authenticated encryption scheme which reduces the load on the signers by applying a division-of-labor signature technique. However, this paper demonstrates a design defect: the threshold authenticated signature scheme cannot resist an insider attack and is not robust. An improved authenticated encryption scheme based on an elliptic curve cryptosystem is then proposed; it removes the above-mentioned weaknesses.

2 citations


Proceedings ArticleDOI
13 Jun 2010
TL;DR: In this article, the authors focus on symmetric ciphers for database encryption, since they are the only type of cipher with acceptable performance for most applications, and point out that stream ciphers are the adequate type of encryption scheme.
Abstract: Protecting confidentiality in large databases without degrading their performance is a challenging problem, especially when encryption and decryption must be performed at the database level or at the application level. We here focus on symmetric ciphers for database encryption, since they are the only type of cipher with acceptable performance for most applications. We point out that stream ciphers are the adequate type of encryption scheme. We present an attack on a dedicated stream cipher proposed by Ge and Zdonic in 2007.

1 citation


Proceedings ArticleDOI
26 May 2010
TL;DR: An improved checksum scheme is proposed to efficiently avoid the existing collision attacks; a random element, requiring no additional complicated calculation, is introduced to translate the segment plaintexts in the checksum.
Abstract: An efficient scheme for authenticated encryption with associated data is obtained by combining a collision-resistant hash function with an authenticated encryption scheme. The hash function is used to compress an arbitrary-length header to a fixed-length nonce. The authenticated encryption scheme is an improvement of the OCB mode of operation. OCB is believed to provide extremely strong protection, combining encryption and message authentication in a highly efficient way. However, when the OCB mode of operation is used to handle a large amount of data, it is easy to find a collision, so the mode loses its authenticity capability with probability one. An improved checksum scheme is proposed to efficiently avoid the existing collision attacks. A random element, requiring no additional complicated calculation, is introduced to translate the segment plaintexts in the checksum. Finally, the security of the resulting scheme for authenticated encryption with associated data is analyzed.
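As background for the collision concern, the checksum in OCB-style modes is essentially the XOR of the plaintext blocks, and an XOR sum is invariant under any reordering of the blocks; the toy snippet below only demonstrates that algebraic property. It does not reproduce the actual collision attacks on OCB, nor the paper's fix, whose exact translation of the segment plaintexts by a random element is not detailed in the abstract.

```python
import os
from functools import reduce

BLOCK = 16  # 128-bit blocks, as in an AES-based instantiation (assumed)

def xor_checksum(blocks):
    # XOR of all plaintext blocks: the shape of the OCB-style checksum
    # before any per-position masking is applied.
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)),
                  blocks, bytes(BLOCK))

m1 = [os.urandom(BLOCK) for _ in range(4)]
m2 = [m1[2], m1[3], m1[0], m1[1]]  # same blocks, different order

# The unkeyed XOR sum cannot tell the two orderings apart.
assert xor_checksum(m1) == xor_checksum(m2)
```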

1 citation


Posted Content
TL;DR: Reduced-round variants of CubeHash, in which the adversary controls the full 1024-bit input and can observe the full output, are analyzed, and it is shown that linear approximations with high biases exist in these reduced-round variants.
Abstract: Recent developments in the field of cryptanalysis of hash functions have inspired NIST to announce a competition for selecting a new cryptographic hash function to join the SHA family of standards. One of the 14 second-round candidates is CubeHash, designed by Daniel J. Bernstein. CubeHash is a unique hash function in the sense that it does not iterate a common compression function, and it offers a structure which resembles a sponge function, even though it is not exactly a sponge function. In this paper we analyze reduced-round variants of CubeHash where the adversary controls the full 1024-bit input to reduced-round CubeHash and can observe its full output. We show that linear approximations with high biases exist in reduced-round variants. For example, we present an 11-round linear approximation with a bias of 2, which allows distinguishing 11-round CubeHash using about 2 queries. We also discuss the extension of this distinguisher to 12 rounds using message modification techniques. Finally, we present a linear distinguisher for 14-round CubeHash which uses about 2 queries.
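The biases underlying such a linear distinguisher can in principle be estimated empirically: sample random inputs, apply the reduced-round function, and count how often a fixed input/output parity relation holds. The sketch below shows that generic procedure; since reduced-round CubeHash itself is not implemented here, a toy function stands in for it, and the masks are arbitrary illustrative choices.

```python
import os
import hashlib

def parity(x: int) -> int:
    return bin(x).count("1") & 1

def toy_function(x: int) -> int:
    # Placeholder for a reduced-round primitive such as CubeHash's round
    # function, chosen only so the sketch runs end to end.
    return int.from_bytes(hashlib.sha256(x.to_bytes(128, "big")).digest(), "big")

def estimate_bias(func, in_mask: int, out_mask: int, samples: int = 1 << 14) -> float:
    """Empirical bias of the approximation <in_mask, x> = <out_mask, func(x)>.

    The bias is |Pr[the parity relation holds] - 1/2|; a bias that stays
    large over many samples signals a usable linear distinguisher, and the
    number of queries needed grows roughly as the inverse square of the bias.
    """
    hits = 0
    for _ in range(samples):
        x = int.from_bytes(os.urandom(128), "big")  # 1024-bit input
        if parity(x & in_mask) == parity(func(x) & out_mask):
            hits += 1
    return abs(hits / samples - 0.5)

print(estimate_bias(toy_function, in_mask=1, out_mask=1))
```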

1 citation


Proceedings ArticleDOI
13 Jun 2010
TL;DR: Hardware architectures for the hash algorithm Luffa, a candidate for the next-generation hash standard SHA-3, exhibit the advantage of flexible implementation ranging from high-speed to compact circuits.
Abstract: This paper presents hardware architectures for the hash algorithm Luffa, which is a candidate for the next-generation hash standard SHA-3. The architectures were implemented by using a 90-nm CMOS standard cell library. A high throughput of 35 Gbps for a high-speed architecture and a gate count of 14.7 kgate for a compact architecture were obtained. In comparison with Keccak, other SHA-3 candidate in the sponge function category, as well as with the current hash standard SHA-256, Luffa exhibited the advantage of flexible implementation ranging from high-speed to compact circuits.