Book ChapterDOI

The State of Cryptographic Hash Functions

01 Jan 1999-Lecture Notes in Computer Science (Springer-Verlag)-Vol. 1561, pp 158-182
TL;DR: The state of the art for cryptographic hash functions is described, different definitions are compared, and the few theoretical results on hash functions are discussed.
Abstract: This paper describes the state of the art for cryptographic hash functions. Different definitions are compared, and the few theoretical results on hash functions are discussed. A brief overview is presented of the most important constructions, and some open problems are presented.

Summary (7 min read)

1 INTRODUCTION

  • During the last decades, the nature of telecommunications has changed completely.
  • Recent developments in mobile telecommunications like the GSM system [122] make it possible to reach a person anywhere in Europe, independent of whether he is at home, in his office, or on the road.
  • Electronic mail has become the preferred way of communication between researchers all over the world, and many companies have introduced this service.
  • Section 2 situates hash functions within the wider area of cryptology and discusses the application of hash functions to several problems.
  • Section 4 summarizes the main theoretical results on hash functions. (The author is an NFWO postdoctoral researcher, sponsored by the National Fund for Scientific Research.)

2 AUTHENTICATION AND PRIVACY

  • This section discusses the basic concepts of cryptography, and clarifies the importance of hash functions in the protection of information authentication.
  • At the end of this section, other applications of hash functions are presented.

2.1 Privacy protection with symmetric cryptology

  • Cryptology has been used for thousands of years to protect communications of kings, soldiers, and diplomats.
  • In the encryption operation, the sender transforms the message to be sent, which is called the plaintext, into the ciphertext.
  • In the old days the authenticity was protected by the intrinsic properties of the communication channel.
  • The publication in 1977 of the Data Encryption Standard (DES) by the U.S. National Bureau of Standards [39] was without any doubt an important milestone.
  • The first idea to solve this problem was to add a simple form of redundancy to the plaintext before encryption, namely the sum modulo 2 of all plaintext blocks [61].
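The weakness of this early idea is easy to demonstrate. The sketch below (in Python, with a hypothetical `xor_redundancy` helper; not from the paper) computes the sum modulo 2 of the plaintext blocks and shows why it fails as an integrity check:

```python
def xor_redundancy(blocks):
    # Sum modulo 2 (bitwise XOR) of all plaintext blocks.
    check = 0
    for b in blocks:
        check ^= int.from_bytes(b, "big")
    return check

msg = [b"BLOCK_01", b"BLOCK_02", b"BLOCK_03"]
swapped = [msg[2], msg[0], msg[1]]
# The mod-2 sum is invariant under any reordering of the blocks, so an
# attacker who permutes (encrypted) blocks leaves the redundancy intact.
assert xor_redundancy(msg) == xor_redundancy(swapped)
```

Because XOR is commutative, any permutation of the blocks passes the check, which is one reason this simple redundancy proved insufficient.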

2.2 Authentication with symmetric cryptology

  • In the military world it was known for some time that modern telecommunication channels like radio require additional protection of the authenticity.
  • One of the techniques applied was to append a secret key to the plaintext before encryption [117].
  • In the banking environment, there is a strong requirement for protecting the authenticity of transactions.
  • The basic idea of the protection of the integrity is to add redundancy to the information.
  • If the hash function uses no secret key, it is called a Manipulation Detection Code or MDC; in this case it is necessary to encrypt the hashcode and/or the information with a secret key.

2.3 Asymmetric or public-key cryptology

  • From a scientific viewpoint, the most important breakthrough of the last decades is certainly the invention of public-key cryptology in the mid-seventies by W. Diffie and M. Hellman [38], and independently by R. Merkle [78].
  • Protection of the authenticity of the information is possible, but this requires a second independent operation.
  • In the first years after the invention of public-key cryptosystems, serious doubts have been raised about their security.
  • A good example is the rise and fall of the knapsack-based schemes.
  • Soon it was realized that one could have the best of both worlds, i.e., more flexibility, a less cumbersome key management, and a high performance, by using hybrid schemes.

2.4 Other applications of hash functions

  • Hash functions have been designed in the first place to protect the authenticity of information.
  • When efficient and secure hash functions became available, it was realized that under certain assumptions they can be used for many other applications.
  • This implies that there is no correlation between input and output bits, no correlation between output bits, etc.
  • The most important applications are the following: Protection of pass-phrases: passphrases are passwords of arbitrary length.
  • Appeared in Proceedings of the 3rd Symposium on State and Progress of Research in Cryptography, W. Wolfowicz (ed.), Fondazione Ugo Bordoni, pp. 161–171, 1993.
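The pass-phrase application can be sketched in a few lines (Python; `hashlib.sha256` is a modern stand-in for the one-way hash functions discussed in the paper, and the function names are illustrative):

```python
import hashlib

def store_passphrase(passphrase: str) -> bytes:
    # Only the hashcode is stored; the passphrase itself need not be
    # kept, so a stolen password file does not directly reveal it.
    return hashlib.sha256(passphrase.encode()).digest()

def check_passphrase(passphrase: str, stored: bytes) -> bool:
    return hashlib.sha256(passphrase.encode()).digest() == stored

record = store_passphrase("correct horse battery staple")
assert check_passphrase("correct horse battery staple", record)
assert not check_passphrase("incorrect guess", record)
```

The one-wayness of the hash is what protects the stored value: recovering the pass-phrase from the hashcode should require a preimage attack.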

3 DEFINITIONS

  • In Section 2 two classes of hash functions have been introduced, namely Message Authentication Codes or MAC’s (which use a secret key), and Manipulation Detection Codes or MDC’s, which do not make use of a secret key.
  • According to their properties, the class of MDC’s will be further divided into one-way hash functions (OWHF) and collision resistant hash functions (CRHF).
  • In the following the hash function will be denoted with h, and its argument, i.e., the information to be protected, with X.
  • The general requirements are that the computation of the hashcode is “easy” if all arguments are known.
  • Moreover it is assumed that the description of the hash function is public; for MAC’s the only secret information lies in the secret key.

3.1 One-way hash function (OWHF)

  • A one-way hash function is a function h satisfying the following conditions: 1. The argument X can be of arbitrary length and the result h(X) has a fixed length of n bits (with n ≥ 64).
  • The first part of the second condition corresponds to the intuitive concept of one-wayness, namely that it is “hard” to find a preimage of a given value in the range.
  • In the case of permutations or injective functions only this concept is relevant.
  • The meaning of “hard” still has to be specified.
  • In the case of “ideal security”, introduced by X. Lai and J. Massey [68], producing a preimage requires 2^n operations.
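To make the 2^n figure concrete, here is a toy sketch (Python; `toy_hash` and its parameters are illustrative, not from the paper). The hash is truncated to a tiny n so that an exhaustive preimage search actually terminates; a real OWHF uses n ≥ 64 precisely so that it does not:

```python
import hashlib
from itertools import count

def toy_hash(data: bytes, n_bits: int = 16) -> int:
    # Truncated SHA-256 standing in for an n-bit hash; n is kept tiny
    # here so the 2^n search finishes, whereas a real OWHF uses n >= 64.
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest, "big") >> (256 - n_bits)

target = toy_hash(b"some protected information")
# Brute-force preimage search: about 2^n trials on average.
for trial in count():
    if toy_hash(str(trial).encode()) == target:
        break
assert toy_hash(str(trial).encode()) == target
```

Doubling n doubles the output length but squares the expected work of this search, which is why the choice of n dominates the security level.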

3.2 Collision resistant hash function (CRHF)

  • The first formal definition of a CRHF was given by I. Damg̊ard [26]; an informal definition was given by R. Merkle [80].
  • The hash function must be collision resistant: this means that it is “hard” to find two distinct messages that hash to the same result.
  • Again several options are available to specify the word “hard”.
  • In the case of “ideal security” [68], producing a preimage requires 2^n operations and producing a collision requires O(2^(n/2)) operations (cf. Section 4.3.2).
  • On the other hand, it should be noted that designing an OWHF is easier, and that the storage for the hashcode can be halved (64 bits instead of 128 bits).
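The O(2^(n/2)) collision cost comes from the birthday paradox, which the following toy search illustrates (Python; `toy_hash` and its small n are illustrative assumptions, not from the paper):

```python
import hashlib

def toy_hash(data: bytes, n_bits: int = 24) -> int:
    # Truncated SHA-256 standing in for an n-bit hash (tiny n for speed).
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest, "big") >> (256 - n_bits)

# Birthday search: hash messages until some value repeats; by the
# birthday paradox this takes on the order of 2^(n/2) trials, here
# roughly 2^12 instead of the 2^24 a preimage search would need.
seen = {}
trial = 0
while True:
    msg = b"message %d" % trial
    h = toy_hash(msg)
    if h in seen:
        break
    seen[h] = msg
    trial += 1

assert seen[h] != msg and toy_hash(seen[h]) == toy_hash(msg)
```

This gap between 2^n and 2^(n/2) is exactly why a CRHF needs a hashcode twice as long as an OWHF for the same security level.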

3.3 Message Authentication Code (MAC)

  • Message Authentication Codes have developed from the test keys in the banking community.
  • These algorithms did not satisfy this strong definition.
  • This last attack is called an adaptive chosen text attack.
  • Note that this last property implies that the MAC should be both one-way and collision resistant for someone who does not know the secret key K.

4 THREE APPROACHES TO HASH FUNCTIONS

  • In this section a taxonomy for cryptographic hash functions will be presented.
  • The author’s taxonomy deviates from the approach by G. Simmons [117], and is based on the taxonomy for stream ciphers of R. Rueppel [112].
  • A first method is based on information theory, and it offers unconditional security, i.e., security independent of the computing power of an adversary.
  • The complexity theoretic approach starts from an abstract model for computation, and assumes that the opponent has limited computing power.
  • The system based approach tries to produce practical solutions, and the security estimates are based on the best algorithm known to break the system and on realistic estimates of the necessary computing power or dedicated hardware to carry out the algorithm.

4.1 Information theoretic approach

  • This approach results in a characterization of unconditionally secure solutions, which implies that the security of the system is independent of the computing power of the opponent.
  • Eve can choose freely between both strategies (deception attack).
  • For a perfect authentication code where P_i and P_s meet the combinatorial bound, the number of key bits per message bit is at least n/m, which is much larger than 1 if m is small.
  • In [118], D. Stinson has given an overview of characterizations of this type of authentication codes.
  • If stronger conditions are imposed on Ps, it seems to be more difficult to find efficient constructions.

4.2 Complexity theoretic approach

  • The approach taken here is to define a model of computation, like a Turing machine [1] or a Boolean circuit.
  • This research program has been initiated in 1982 by A.C. Yao [125] and tries to base cryptographic primitives on general assumptions.
  • The complexity theoretic approach yields only results on the worst case or average case problems in a general class of problems.
  • Cryptographers studying the security of a scheme are more interested in the subset of problems that is easy.
  • M. Naor and M. Yung have introduced the concept of a Universal One-Way Hash Function [91].

4.3 System based or practical approach

  • Paying special attention to the efficiency of software and hardware implementations.
  • The goal of this approach is to ensure that breaking a cryptosystem is a difficult problem for the cryptanalyst.
  • Typical examples are statistical attacks and meet-in-the-middle attacks.
  • Thirdly, the assembly of basic building blocks to design a cryptographic hash function can be based on theorems.
  • Results of this type are often formulated and proven in a complexity theoretic setting, but can easily be adopted for a more practical definition of “hardness” that is useful in a system based approach.

4.3.1 General model

  • The general model allows for a compact description of specific constructions.
  • This has been called an “iterated” hash function in [68].
  • The IV should be considered as part of the description of the hash function.
  • The first result is by X. Lai and J. Massey [68] and gives necessary and sufficient conditions for f in order to obtain an “ideally secure” hash function h.
  • Assume that the padding contains the length of the input string, and that the message X (without padding) contains at least 2 blocks.
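The general iterated model can be sketched as follows (Python; the compression function f is a stand-in, and the padding rule follows the assumption above that the length of the input string is included in the padding):

```python
import hashlib

BLOCK = 16  # bytes per message block X_i (illustrative choice)

def f(h_prev: bytes, x: bytes) -> bytes:
    # Stand-in compression function f(H_{i-1}, X_i); any round function
    # with this interface fits the iterated model.
    return hashlib.sha256(h_prev + x).digest()[:8]

def iterated_hash(msg: bytes, iv: bytes = b"\x00" * 8) -> bytes:
    # The IV is considered part of the description of the hash function.
    # The padding contains the length of the input (MD-strengthening).
    padded = msg + b"\x80"
    padded += b"\x00" * (-(len(padded) + 8) % BLOCK)
    padded += len(msg).to_bytes(8, "big")
    h = iv
    for i in range(0, len(padded), BLOCK):
        h = f(h, padded[i:i + BLOCK])
    return h

assert len(iterated_hash(b"x" * 1000)) == 8
assert iterated_hash(b"abc") != iterated_hash(b"abd")
```

The hashcode of an arbitrary-length message is thus reduced to repeated applications of a fixed-size round function f, which is why the conditions on f determine the security of h.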

4.3.2 Attacks on hash functions

  • The discussion of attacks will be restricted to attacks which depend only on the size of the external parameters (size of hashcode and possibly size of key); they are thus independent of the nature of the algorithm.
  • In order to assess the feasibility of these attacks, it is important to know that for the time being 2^56 operations is considered to be on the edge of feasibility.
  • In case of a good hash function, his probability of success equals 1/2^n, with n the number of bits of the hashcode.
  • Note that in case of a MAC the opponent is unable to generate the MAC of a message.
  • In the case of digital signatures, a sender can attack his own signature, or the receiver or a third party could offer the signer a message he is willing to sign and later replace it with a bogus message.
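The numbers behind these generic attacks are plain arithmetic, sketched here for an assumed hashcode length n = 64 (the choice of n is illustrative):

```python
import math

n = 64  # hashcode length in bits (illustrative)

# Guessing the hashcode of a message at random succeeds with
# probability 1/2^n per attempt.
p_guess = 2.0 ** -n

# Birthday attack: with r1 bogus and r2 genuine variations, the success
# probability is 1 - exp(-r1*r2 / 2^n); at r1 = r2 = 2^(n/2) this is
# 1 - 1/e, about 63%.
r = 2.0 ** (n / 2)
p_birthday = 1.0 - math.exp(-(r * r) / 2.0 ** n)

assert p_guess == 1 / 2 ** 64
assert 0.63 < p_birthday < 0.64
```

The contrast between 2^-n guessing and the 63% birthday success rate at 2^(n/2) trials is what drives the size recommendations for OWHFs versus CRHFs.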

5 AN OVERVIEW OF MDC PROPOSALS

  • In this section the authors will attempt to summarize the large number of proposals for practical MDC’s and to discuss their status.
  • The hash functions have been divided into four classes: hash functions based on a block cipher, hash functions based on modular arithmetic, hash functions based on a knapsack, and dedicated hash functions.
  • For a more detailed discussion, the reader is referred to [99].

5.1 Hash functions based on a block cipher

  • Two arguments can be indicated for designers of hash functions to base their schemes on existing encryption algorithms.
  • The first argument is the minimization of the design and implementation effort: hash functions and block ciphers that are both efficient and secure are hard to design, and many examples to support this view can be found in the literature.
  • The major advantage, however, is that the trust in existing encryption algorithms can be transferred to a hash function.
  • The probabilistic linear relations between plaintext, ciphertext, and key bits of the DES which have been found by M. Matsui [74] form a more serious problem.
  • The size of the plaintext and ciphertext or the block length will be denoted with r, while the key size will be denoted with k.

5.1.1 Size of hashcode equal to the block length

  • From Definition 2 it follows that in this case the hash function can only be collision resistant if the block length r is at least 128 bits.
  • Many schemes have been proposed in this class, but the first secure schemes were only proposed after several years.
  • The strength of these schemes is based on the feedforward of the plaintext, which makes the round function hard to invert.
  • D. Davies has confirmed in a personal communication to the author that he did not propose the scheme.
  • It was also shown by the author that the security level of these hash functions is limited by min(k, r), even if the size of some internal variables is equal to max(k, r).
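One well-known layout in this class chains H_i = E_{H_{i-1}}(X_i) XOR X_i, with the XOR feedforward of the plaintext block making the round function hard to invert. The sketch below uses a toy stand-in for the block cipher E (an assumption; a real scheme would use DES or another cipher with block length r and key size k):

```python
import hashlib

def E(key: bytes, block: bytes) -> bytes:
    # Toy 8-byte "block cipher" E_K(P); a stand-in for a real cipher.
    return hashlib.sha256(key + block).digest()[:8]

def block_cipher_hash(blocks, iv: bytes = b"\x00" * 8) -> bytes:
    h = iv
    for x in blocks:
        c = E(h, x)  # the chaining variable acts as the key
        h = bytes(a ^ b for a, b in zip(c, x))  # feedforward of X_i
    return h

m1 = [b"AAAAAAAA", b"BBBBBBBB"]
m2 = [b"BBBBBBBB", b"AAAAAAAA"]
assert block_cipher_hash(m1) != block_cipher_hash(m2)
```

Without the XOR feedforward, anyone could run the cipher backwards and invert each round, which is why the feedforward is the load-bearing part of the design.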

5.1.2 Size of hashcode equal to twice the block length

  • This type of functions has been proposed to construct a collision resistant hash function based on a block cipher with a block length of 64 bits like the DES.
  • A security “proof” was given under the assumption that the DES has sufficient random behavior.
  • Here H1^0 and H2^0 are initialized with IV1 and IV2 respectively, and the hashcode is equal to H1^t ‖ H2^t.
  • The analysis of all these schemes is rather involved.
  • It was shown that with respect to attacks on the round function (in contrast to attacks on the hash function), this scheme has the same security level as MDC-2.
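A double-length scheme of this kind can be sketched as two parallel chains whose halves are exchanged each round, in the spirit of MDC-2 (the toy cipher E and the exact half-swap are illustrative assumptions, not the actual MDC-2 specification):

```python
import hashlib

def E(key: bytes, block: bytes) -> bytes:
    # Toy stand-in for a 64-bit block cipher such as DES.
    return hashlib.sha256(key + block).digest()[:8]

def double_length_hash(blocks, iv1: bytes = b"\x01" * 8,
                       iv2: bytes = b"\x02" * 8) -> bytes:
    # Two parallel chains H1 and H2, initialized with IV1 and IV2; the
    # hashcode is the concatenation H1^t || H2^t, twice the block length.
    h1, h2 = iv1, iv2
    for x in blocks:
        t1 = bytes(a ^ b for a, b in zip(E(h1, x), x))
        t2 = bytes(a ^ b for a, b in zip(E(h2, x), x))
        # Swapping half of each intermediate result between the chains
        # keeps them from evolving independently.
        h1, h2 = t1[:4] + t2[4:], t2[:4] + t1[4:]
    return h1 + h2

assert len(double_length_hash([b"ABCDEFGH"])) == 16
```

The half-swap is essential: two fully independent chains could be attacked separately, each at the cost of a single-length hash.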

5.1.3 Size of key equal to twice the block length

  • Some block ciphers have been proposed for which the key size is approximately twice the block length.
  • Size of hashcode equal to the block length A scheme in this class was proposed by R. Merkle in [78].
  • These constructions can only yield a CRHF if the block length is larger than 128 bits (R. Merkle suggested 100 bits in 1979), and if the key size is sufficiently large.
  • The security depends strongly on the key scheduling of the cipher.
  • Both try to extend the Davies-Meyer scheme.

5.1.4 Schemes with a fixed key

  • All previous schemes (except for the ARDFP schemes of Section 5.1.2) modify the key of the block cipher during the iteration phase.
  • The key scheduling process is generally slower than the encryption.
  • Moreover many attacks exploit the fact that the key can be manipulated (e.g., attacks based on weak keys).
  • The more efficient schemes with a security level of more than 60 bits have a rate slightly higher than 4 and need a larger internal memory.
  • The design principles in this paper could be exploited to increase the security level of other hash functions like MDC-2.

5.2 Hash functions based on modular arithmetic

  • These hash functions are designed to use the modular arithmetic hardware that is required to produce digital signatures.
  • If the hash function is combined with a multiplicative signature scheme like RSA [107], one can exploit this attack to forge signatures [21].
  • Stronger schemes have been proposed that require more operations.
  • This allows the redundancy to be simplified [46].
  • Both the factoring and the discrete logarithm problem are believed to be difficult number theoretic problems.
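One assumed round shape for this class is H_i = (H_{i-1} XOR X_i)^2 mod N, reusing the modular arithmetic hardware needed for signatures (the layout, modulus, and IV below are illustrative; the actual proposals in the paper differ in detail):

```python
# Illustrative modulus; real proposals use an RSA-like modulus N = p*q
# whose factorization is kept secret or is hard to compute.
N = (2 ** 61 - 1) * (2 ** 89 - 1)

def mod_square_hash(blocks, iv: int = 1) -> int:
    # Assumed round shape: H_i = (H_{i-1} XOR X_i)^2 mod N.
    h = iv
    for x in blocks:
        h = pow(h ^ x, 2, N)
    return h

assert mod_square_hash([3, 5]) != mod_square_hash([5, 3])
```

Such schemes inherit their (hoped-for) strength from the difficulty of extracting modular square roots without knowing the factorization of N; the multiplicative structure is also what the forgery attack mentioned above exploits.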

5.3 Hash functions based on a knapsack

  • The knapsack problem was used in 1978 by R. Merkle and M. Hellman to construct the first public key encryption system [77].
  • The problem is so attractive because both hardware and software implementations are very fast compared to schemes based on number theoretic problems.
  • Also, one has the problem of trapdoors: the person who chooses the knapsack can easily generate it such that he knows collisions.
  • The first multiplicative knapsack proposed by J. Bosset [11] was broken by P. Camion [15].
  • For the suggested parameters it can be shown that two messages with the same hashcode will differ in at least 215 bits.
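The basic knapsack compression is simple to state: the hashcode is the sum of the weights a_i selected by the message bits x_i, reduced modulo M. The toy parameters below are illustrative (128 random 64-bit weights), not the suggested parameters from the paper:

```python
import random

# Toy parameters. Whoever chooses the weights could embed a trapdoor
# (known collisions), one of the concerns noted above.
random.seed(1993)
M = 2 ** 64
WEIGHTS = [random.randrange(M) for _ in range(128)]

def knapsack_hash(bits) -> int:
    # Sum of the weights a_i selected by the message bits x_i, mod M.
    return sum(a for a, x in zip(WEIGHTS, bits) if x) % M

msg = [i % 2 for i in range(128)]
assert 0 <= knapsack_hash(msg) < M
assert knapsack_hash([0] * 128) == 0
```

The attraction is speed: each round is only additions, with no cipher rounds or modular exponentiations, which is exactly why the repeated breaks of knapsack schemes were so disappointing.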

5.4 Dedicated hash functions

  • Dedicated hash functions are algorithms that were especially designed for hashing operations.
  • This resulted in a strengthened version of MD4, namely MD5 [110].
  • The first version was broken independently by J. Daemen, A. Bosselaers, R. Govaerts, and J. Vandewalle [24] and by T. Baritaud, H. Gilbert, and M. Girault [4].
  • These measures increase the size of the substitution tables and decrease the performance.
  • In the same paper these authors have proposed Cellhash, a new hash function based on a cellular automaton [23].

6 AN OVERVIEW OF MAC PROPOSALS

  • The general model for an iterated MAC is similar to the model for an MDC.
  • The Message Authentication Algorithm (MAA) is a dedicated MAC.
  • The security can certainly be improved if the round function is made key dependent too.
  • Weaknesses of this algorithm have been identified in [48, 99].
  • The scheme by F. Cohen [19] and its improvement by Y. Huang and F. Cohen [51] proved susceptible to an adaptive chosen message attack [96].
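A widespread way to build a MAC from a block cipher is the CBC mode, where the MAC is the final ciphertext block. The sketch below uses a toy stand-in for the cipher E (an assumption; in practice DES in CBC or CFB mode was the common choice):

```python
import hashlib

def E(key: bytes, block: bytes) -> bytes:
    # Toy 8-byte "block cipher" standing in for DES under key K.
    return hashlib.sha256(key + block).digest()[:8]

def cbc_mac(key: bytes, msg: bytes) -> bytes:
    # CBC mode with a zero IV; the MAC is the final ciphertext block,
    # so without K an opponent can neither compute nor verify it.
    msg = msg + b"\x00" * (-len(msg) % 8)
    h = b"\x00" * 8
    for i in range(0, len(msg), 8):
        block = bytes(a ^ b for a, b in zip(h, msg[i:i + 8]))
        h = E(key, block)
    return h

tag = cbc_mac(b"shared key", b"transfer 100 to account 42")
assert tag == cbc_mac(b"shared key", b"transfer 100 to account 42")
assert tag != cbc_mac(b"other key", b"transfer 100 to account 42")
```

Making the round function itself key-dependent, as suggested above for MAA, strengthens this layout because the opponent can then not even evaluate a single round.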

7 PERFORMANCE OF HASH FUNCTIONS

  • In order to compare the performance of software implementations of hash functions, an overview has been compiled in Table 1.
  • The implementations were written by A. Bosselaers.
  • The table has been completed with the speed of the DES, a modular squaring and a modular exponentiation.
  • From these figures it can be derived that MDC-2 will run at about 100 Kbit/sec.
  • On this type of processor, SHA is only about 15% slower than MD5.

8 CONCLUSIONS

  • The importance of hash functions for protecting the authenticity of information has been demonstrated.
  • In addition, hash functions are becoming an important basic tool to solve other security problems.
  • For the time being only a limited number of provably secure constructions exist, and these are rather slow or require a large number of key bits.
  • Therefore it is important that new proposals are evaluated thoroughly by several independent researchers and that they are not implemented too quickly.
  • Moreover implementations should be modular such that upgrading of the algorithm is feasible.


Citations
Journal ArticleDOI
TL;DR: In this paper, the authors present a survey of the current state of the art in grid resource discovery, a resource taxonomy with focus on the computational grid paradigm, a P2P taxonomy, and a detailed survey of existing work that can support d-dimensional grid resource queries.
Abstract: An efficient resource discovery mechanism is one of the fundamental requirements for grid computing systems, as it aids in resource management and scheduling of applications. Resource discovery activity involves searching for the appropriate resource types that match the user's application requirements. Various kinds of solutions to grid resource discovery have been suggested, including centralized and hierarchical information server approaches. However, both of these approaches have serious limitations in regard to scalability, fault tolerance, and network congestion. To overcome these limitations, indexing resource information using a decentralized (e.g., peer-to-peer (P2P)) network model has been actively proposed in the past few years. This article investigates various decentralized resource discovery techniques primarily driven by the P2P network model. To summarize, this article presents: a summary of the current state of the art in grid resource discovery, a resource taxonomy with focus on the computational grid paradigm, a P2P taxonomy with a focus on extending the current structured systems (e.g., distributed hash tables) for indexing d-dimensional grid resource queries, a detailed survey of existing work that can support d-dimensional grid resource queries, and a classification of the surveyed approaches based on the proposed P2P taxonomy. We believe that this taxonomy and its mapping to relevant systems would be useful for academic and industry-based researchers who are engaged in the design of scalable grid and P2P systems.

122 citations

Journal ArticleDOI
TL;DR: In this article, the security of various problems motivated by the notion of a secure hash function is analyzed in the random oracle model, and it is shown that the obvious trivial algorithms are optimal.
Abstract: In this paper, we study issues related to the notion of "secure" hash functions. Several necessary conditions are considered, as well as a popular sufficient condition (the so-called random oracle model). We study the security of various problems that are motivated by the notion of a secure hash function. These problems are analyzed in the random oracle model, and we prove that the obvious trivial algorithms are optimal. As well, we look closely at reductions between various problems. In particular, we consider the important question "does collision resistance imply preimage resistance?". We provide partial answers to this question --- both positive and negative! --- based on uniformity properties of the hash function under consideration.

116 citations

Journal ArticleDOI
TL;DR: In this paper, the state of the art of privacy-preserving schemes for ad hoc social networks including mobile social networks (MSNs) and vehicular social network (VSNs) is reviewed.
Abstract: We review the state of the art of privacy-preserving schemes for ad hoc social networks including mobile social networks (MSNs) and vehicular social networks (VSNs). Specifically, we select and examine in-detail 33 privacy-preserving schemes developed for or applied in the context of ad hoc social networks. Based on novel schemes published between 2008 and 2016, we survey privacy preservation models including location privacy, identity privacy, anonymity, traceability, interest privacy, backward privacy, and content oriented privacy. Recent significant attacks of leaking privacy, countermeasures, and game theoretic approaches in VSNs and MSNs are summarized in the form of tables. In addition, an overview of recommendations for further research is provided. With this survey, readers can acquire a thorough understanding of research trends in privacy-preserving schemes for ad hoc social networks.

112 citations

Book ChapterDOI
28 Sep 2005
TL;DR: This article presents a family of secure hash functions, whose security is directly related to the syndrome decoding problem from the theory of error-correcting codes, and proposes a few sets of parameters giving a good security and either a faster hashing or a shorter description for the function.
Abstract: Recently, some collisions have been exposed for a variety of cryptographic hash functions [20,21] including some of the most widely used today. Many other hash functions using similar constructions can however still be considered secure. Nevertheless, this has drawn attention on the need for new hash function designs. In this article is presented a family of secure hash functions, whose security is directly related to the syndrome decoding problem from the theory of error-correcting codes. Taking into account the analysis by Coron and Joux [4] based on Wagner’s generalized birthday algorithm [19] we study the asymptotical security of our functions. We demonstrate that this attack is always exponential in terms of the length of the hash value. We also study the work-factor of this attack, along with other attacks from coding theory, for non asymptotic range, i.e. for practical values. Accordingly, we propose a few sets of parameters giving a good security and either a faster hashing or a shorter description for the function.

103 citations

Journal ArticleDOI
TL;DR: It is proved the sufficiency of certain conditions to ensure the Elliptic Curve Digital Signature Algorithm (ECDSA) existentially unforgeable by adaptive chosen-message attacks.
Abstract: Proved here is the sufficiency of certain conditions to ensure the Elliptic Curve Digital Signature Algorithm (ECDSA) existentially unforgeable by adaptive chosen-message attacks. The sufficient conditions include (i) a uniformity property and collision-resistance for the underlying hash function, (ii) pseudorandomness in the private key space for the ephemeral private key generator, (iii) generic treatment of the underlying group, and (iv) a further condition on how the ephemeral public keys are mapped into the private key space. For completeness, a brief survey of necessary security conditions is also given. Some of the necessary conditions are weaker than the corresponding sufficient conditions used in the security proofs here, but others are identical. Despite the similarity between DSA and ECDSA, the main result is not appropriate for DSA, because the fourth condition above seems to fail for DSA. (The corresponding necessary condition is plausible for DSA, but is not proved here nor is the security of DSA proved assuming this weaker condition.) Brickell et al. [Vol. 1751 of Lecture Notes in computer Science, pp. 276--292], Jakobsson et al. [Vol. 1976 of Lecture Notes in computer Science, pp. 73--89] and Pointcheval et al. [Vol. 13 of Journal of Cryptology, pp. 361--396] only consider signature schemes that include the ephemeral public key in the hash input, which ECDSA does not do, and moreover, assume a condition on the hash function stronger than the first condition above. This work seems to be the first advance in the provable security of ECDSA.

98 citations


Cites background from "The State of Cryptographic Hash Fun..."

  • ...[40], Preneel [46], Simon [51] and Stinson [52, 53]....


References
Book
01 Jan 1990
TL;DR: The updated new edition of the classic Introduction to Algorithms is intended primarily for use in undergraduate or graduate courses in algorithms or data structures and presents a rich variety of algorithms and covers them in considerable depth while making their design and analysis accessible to all levels of readers.
Abstract: From the Publisher: The updated new edition of the classic Introduction to Algorithms is intended primarily for use in undergraduate or graduate courses in algorithms or data structures. Like the first edition,this text can also be used for self-study by technical professionals since it discusses engineering issues in algorithm design as well as the mathematical aspects. In its new edition,Introduction to Algorithms continues to provide a comprehensive introduction to the modern study of algorithms. The revision has been updated to reflect changes in the years since the book's original publication. New chapters on the role of algorithms in computing and on probabilistic analysis and randomized algorithms have been included. Sections throughout the book have been rewritten for increased clarity,and material has been added wherever a fuller explanation has seemed useful or new information warrants expanded coverage. As in the classic first edition,this new edition of Introduction to Algorithms presents a rich variety of algorithms and covers them in considerable depth while making their design and analysis accessible to all levels of readers. Further,the algorithms are presented in pseudocode to make the book easily accessible to students from all programming language backgrounds. Each chapter presents an algorithm,a design technique,an application area,or a related topic. The chapters are not dependent on one another,so the instructor can organize his or her use of the book in the way that best suits the course's needs. Additionally,the new edition offers a 25% increase over the first edition in the number of problems,giving the book 155 problems and over 900 exercises thatreinforcethe concepts the students are learning.

21,651 citations

Journal ArticleDOI
TL;DR: This paper suggests ways to solve currently open problems in cryptography, and discusses how the theories of communication and computation are beginning to provide the tools to solve cryptographic problems of long standing.
Abstract: Two kinds of contemporary developments in cryptography are examined. Widening applications of teleprocessing have given rise to a need for new types of cryptographic systems, which minimize the need for secure key distribution channels and supply the equivalent of a written signature. This paper suggests ways to solve these currently open problems. It also discusses how the theories of communication and computation are beginning to provide the tools to solve cryptographic problems of long standing.

14,980 citations

Journal ArticleDOI
TL;DR: An encryption method is presented with the novel property that publicly revealing an encryption key does not thereby reveal the corresponding decryption key.
Abstract: An encryption method is presented with the novel property that publicly revealing an encryption key does not thereby reveal the corresponding decryption key. This has two important consequences: (1) Couriers or other secure means are not needed to transmit keys, since a message can be enciphered using an encryption key publicly revealed by the intented recipient. Only he can decipher the message, since only he knows the corresponding decryption key. (2) A message can be “signed” using a privately held decryption key. Anyone can verify this signature using the corresponding publicly revealed encryption key. Signatures cannot be forged, and a signer cannot later deny the validity of his signature. This has obvious applications in “electronic mail” and “electronic funds transfer” systems. A message is encrypted by representing it as a number M, raising M to a publicly specified power e, and then taking the remainder when the result is divided by the publicly specified product, n, of two large secret primer numbers p and q. Decryption is similar; only a different, secret, power d is used, where e * d ≡ 1(mod (p - 1) * (q - 1)). The security of the system rests in part on the difficulty of factoring the published divisor, n.

14,659 citations

Book
01 Jan 1996
TL;DR: A valuable reference for the novice as well as for the expert who needs a wider scope of coverage within the area of cryptography, this book provides easy and rapid access of information and includes more than 200 algorithms and protocols.
Abstract: From the Publisher: A valuable reference for the novice as well as for the expert who needs a wider scope of coverage within the area of cryptography, this book provides easy and rapid access of information and includes more than 200 algorithms and protocols; more than 200 tables and figures; more than 1,000 numbered definitions, facts, examples, notes, and remarks; and over 1,250 significant references, including brief comments on each paper.

13,597 citations


"The State of Cryptographic Hash Fun..." refers background in this paper

  • ...The relation between preimage resistance and second preimage resistance is discussed in [ 58 ,67]....


Proceedings ArticleDOI
Mihir Bellare1, Phillip Rogaway1
01 Dec 1993
TL;DR: It is argued that the random oracles model—where all parties have access to a public random oracle—provides a bridge between cryptographic theory and cryptographic practice, and yields protocols much more efficient than standard ones while retaining many of the advantages of provable security.
Abstract: We argue that the random oracle model—where all parties have access to a public random oracle—provides a bridge between cryptographic theory and cryptographic practice. In the paradigm we suggest, a practical protocol P is produced by first devising and proving correct a protocol PR for the random oracle model, and then replacing oracle accesses by the computation of an “appropriately chosen” function h. This paradigm yields protocols much more efficient than standard ones while retaining many of the advantages of provable security. We illustrate these gains for problems including encryption, signatures, and zero-knowledge proofs.

5,313 citations

Frequently Asked Questions (15)
Q1. What have the authors contributed in "Cryptographic hash functions" ?

This paper sketches the history of the concept, discusses the applications of hash functions, and presents the approaches which have been followed to construct hash functions. An overview of practical constructions and their performance is given and some attacks are discussed. 

Examples of problems that have been intensively used are the factoring of a product of two large primes and the discrete logarithm problem modulo a prime and modulo a composite that is the product of two large primes. 

The most widespread methods to compute a MAC are the Cipher Block Chaining (CBC) and Cipher FeedBack (CFB) modes of the DES [3, 41, 53, 55, 82]. 

The advent of electronic computers and telecommunication networks created the need for a widespread commercial encryption algorithm. 

One uses public key techniques for key establishment, and subsequently a conventional algorithm like DES or triple-DES to encipher large quantities of data. 

In view of the fact that the speed of computers is multiplied by four every three years, 2^64 operations is sufficient for the next 10 years, but it will be only marginally secure within 20 years. 

The probability of finding a bogus message and a genuine message that hash to the same result is given by 1 − exp(−r1 · r2 / 2^n), which is about 63% when r1 = r2 = 2^(n/2). 

It was also shown by the author that the security level of these hash functions is limited by min(k, r), even if the size of some internal variables is equal to max(k, r). 

The disadvantage is that the complexity theoretic approach has only a limited impact on practical implementations, due to limitations that are inherently present in the models. 

Examples of general assumptions to which these primitives can be reduced are the existence of one-way functions, injections, or permutations, and the existence of trapdoor one-way permutations. 

The problem is so attractive because both hardware and software implementations are very fast compared to schemes based on number theoretic problems. 

These constructions can only yield a CRHF if the block length is larger than 128 bits (R. Merkle suggested 100 bits in 1979), and if the key size is sufficiently large. 

Construction of efficient digital signature schemes: this comprises the construction of efficient signature schemes based on hash functions only [79], as well as the construction of digital signature schemes from zero-knowledge protocols. 

Both schemes have a rate equal to 2, and are claimed to be ideally secure, i.e., finding a pseudo-preimage takes 2^(2n) operations and finding a collision takes 2^n operations. 

If these schemes are used in a practical setting, it remains a disadvantage that a single key can be used to authenticate only one message; this can be avoided by encrypting the MAC with the Vernam scheme, which means that n additional key bits per message are required.