
Showing papers in "Journal of Cryptology in 2008"


Journal ArticleDOI
TL;DR: In this paper, the authors consider two possible notions of authenticity for authenticated encryption schemes, namely integrity of plaintexts and integrity of ciphertexts, and relate them, when coupled with IND-CPA (indistinguishability under chosen-plaintext attack), to the standard notions of privacy IND-CCA and NM-CPA, providing proofs for the cases where the answer is “yes” and counter-examples for the cases where the answer is “no.”
Abstract: An authenticated encryption scheme is a symmetric encryption scheme whose goal is to provide both privacy and integrity. We consider two possible notions of authenticity for such schemes, namely integrity of plaintexts and integrity of ciphertexts, and relate them, when coupled with IND-CPA (indistinguishability under chosen-plaintext attack), to the standard notions of privacy IND-CCA and NM-CPA (indistinguishability under chosen-ciphertext attack and nonmalleability under chosen-plaintext attack) by presenting implications and separations between all notions considered. We then analyze the security of authenticated encryption schemes designed by “generic composition,” meaning making black-box use of a given symmetric encryption scheme and a given MAC. Three composition methods are considered, namely Encrypt-and-MAC, MAC-then-encrypt, and Encrypt-then-MAC. For each of these and for each notion of security, we indicate whether or not the resulting scheme meets the notion in question assuming that the given symmetric encryption scheme is secure against chosen-plaintext attack and the given MAC is unforgeable under chosen-message attack. We provide proofs for the cases where the answer is “yes” and counter-examples for the cases where the answer is “no.”
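Of the three generic compositions analyzed in the abstract, Encrypt-then-MAC is the one shown to achieve the strongest guarantees. The following is a minimal sketch of that composition, under loud assumptions: the SHA-256-counter keystream here is a toy stand-in for any IND-CPA-secure symmetric encryption scheme, and HMAC-SHA256 stands in for any MAC unforgeable under chosen-message attack; this is not a vetted construction.

```python
# Sketch of Encrypt-then-MAC: encrypt first, then MAC the ciphertext.
# ASSUMPTIONS: the keystream below is a toy stand-in for an IND-CPA-secure
# cipher; HMAC-SHA256 stands in for an unforgeable MAC. Illustration only.
import hashlib
import hmac
import os


def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode (stand-in, not a real cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def encrypt_then_mac(enc_key: bytes, mac_key: bytes, plaintext: bytes):
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, _keystream(enc_key, nonce, len(plaintext))))
    # The MAC is computed over the ciphertext (and nonce), never the plaintext.
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return nonce, ct, tag


def decrypt(enc_key: bytes, mac_key: bytes, nonce: bytes, ct: bytes, tag: bytes) -> bytes:
    # Verify the tag before decrypting; reject forged ciphertexts outright.
    expected = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("invalid tag")
    return bytes(c ^ k for c, k in zip(ct, _keystream(enc_key, nonce, len(ct))))
```

Note the design point the paper's "yes" proof rests on: because the tag covers the ciphertext, any modification of the ciphertext is caught before decryption, which is what rules out the chosen-ciphertext attacks that break the other two compositions.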

586 citations


Journal ArticleDOI
TL;DR: In this article, the authors describe a short signature scheme that is strongly existentially unforgeable under an adaptive chosen message attack in the standard security model, and give a tight reduction proving that their scheme is secure in any group in which the Strong Diffie-Hellman (SDH) assumption holds, without relying on the random oracle model.
Abstract: We describe a short signature scheme that is strongly existentially unforgeable under an adaptive chosen message attack in the standard security model. Our construction works in groups equipped with an efficient bilinear map, or, more generally, an algorithm for the Decision Diffie-Hellman problem. The security of our scheme depends on a new intractability assumption we call Strong Diffie-Hellman (SDH), by analogy to the Strong RSA assumption with which it shares many properties. Signature generation in our system is fast and the resulting signatures are as short as DSA signatures for comparable security. We give a tight reduction proving that our scheme is secure in any group in which the SDH assumption holds, without relying on the random oracle model.

577 citations


Journal ArticleDOI
TL;DR: This work identifies and fills some gaps with regard to consistency (the extent to which false positives are produced) for public-key encryption with keyword search (PEKS) and defines computational and statistical relaxations of the existing notion of perfect consistency.
Abstract: We identify and fill some gaps with regard to consistency (the extent to which false positives are produced) for public-key encryption with keyword search (PEKS). We define computational and statistical relaxations of the existing notion of perfect consistency, show that the scheme of Boneh et al. (Advances in Cryptology—EUROCRYPT 2004, ed. by C. Cachin, J. Camenisch, pp. 506–522, 2004) is computationally consistent, and provide a new scheme that is statistically consistent. We also provide a transform of an anonymous identity-based encryption (IBE) scheme to a secure PEKS scheme that, unlike the previous one, guarantees consistency. Finally, we suggest three extensions of the basic notions considered here, namely anonymous hierarchical identity-based encryption, public-key encryption with temporary keyword search, and identity-based encryption with keyword search.

463 citations


Journal ArticleDOI
TL;DR: In this paper, a hash function is constructed from one of Pizer's Ramanujan graphs (the set of supersingular elliptic curves in characteristic p with l-isogenies, l a prime different from p).
Abstract: We propose constructing provable collision resistant hash functions from expander graphs in which finding cycles is hard. As examples, we investigate two specific families of optimal expander graphs for provable collision resistant hash function constructions: the families of Ramanujan graphs constructed by Lubotzky-Phillips-Sarnak and Pizer respectively. When the hash function is constructed from one of Pizer's Ramanujan graphs (the set of supersingular elliptic curves in characteristic p with l-isogenies, l a prime different from p), then collision resistance follows from the hardness of computing isogenies between supersingular elliptic curves. For the LPS graphs, the underlying hard problem is a representation problem in group theory. Constructing our hash functions from optimal expander graphs implies that the outputs closely approximate the uniform distribution. This property is useful for arguing that the output is indistinguishable from random sequences of bits. We estimate the cost per bit to compute these hash functions, and we implement our hash function for several members of the Pizer and LPS graph families and give actual timings.

283 citations


Journal ArticleDOI
TL;DR: A framework that on the one hand helps explain how these schemes are derived and on the other hand enables modular security analyses, thereby helping to understand, simplify, and unify previous work is provided.
Abstract: This paper provides either security proofs or attacks for a large number of identity-based identification and signature schemes defined either explicitly or implicitly in existing literature. Underlying these is a framework that on the one hand helps explain how these schemes are derived and on the other hand enables modular security analyses, thereby helping to understand, simplify, and unify previous work. We also analyze a generic folklore construction that in particular yields identity-based identification and signature schemes without random oracles.

206 citations


Journal ArticleDOI
TL;DR: An analytical calculation of the success probability of linear and differential cryptanalytic attacks is presented; the results apply to an extended sense of the term “success” where the correct key is found not necessarily as the highest-ranking candidate but within a set of high-ranking candidates.
Abstract: Despite their widespread usage in block cipher security, linear and differential cryptanalysis still lack a robust treatment of their success probability, and the success chances of these attacks have commonly been estimated in a rather ad hoc fashion. In this paper, we present an analytical calculation of the success probability of linear and differential cryptanalytic attacks. The results apply to an extended sense of the term “success” where the correct key is found not necessarily as the highest-ranking candidate but within a set of high-ranking candidates. Experimental results show that the analysis provides accurate results in most cases, especially in linear cryptanalysis. In cases where the results are less accurate, as in certain cases of differential cryptanalysis, the results are useful to provide approximate estimates of the success probability and the necessary plaintext requirement. The analysis also reveals that the attacked key length in differential cryptanalysis is one of the factors that affect the success probability directly besides the signal-to-noise ratio and the available plaintext amount.
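The "top-r candidates" notion of success described above can be illustrated with a small Monte Carlo experiment. This is a hypothetical Gaussian model chosen purely for illustration, not the paper's actual formula: the correct key's attack statistic is drawn from N(mu, 1) and each wrong key's from N(0, 1), and the attack counts as a success when the correct key ranks within the top r candidates.

```python
# Illustrative Monte Carlo for the ranking-based notion of success.
# ASSUMPTION: a simplified Gaussian model (correct key ~ N(mu, 1), wrong
# keys ~ N(0, 1)); this is not the paper's analytical formula.
import random


def success_rate(mu: float, n_wrong: int, r: int, trials: int = 2000) -> float:
    """Fraction of trials in which the correct key ranks in the top r
    out of n_wrong + 1 candidates."""
    random.seed(1)
    wins = 0
    for _ in range(trials):
        correct = random.gauss(mu, 1)
        beaten_by = sum(random.gauss(0, 1) > correct for _ in range(n_wrong))
        if beaten_by < r:  # fewer than r wrong keys outrank the correct one
            wins += 1
    return wins / trials
```

Even this crude model exhibits the qualitative effect the abstract describes: with a strong signal (large mu) the correct key almost always lands in the top r, while with no signal the success rate collapses to roughly r divided by the number of key candidates.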

199 citations


Journal ArticleDOI
TL;DR: The Gabidulin version of the McEliece cryptosystem (GPT) and its variants are examined, with the result that no secure parameter sets are left for GPT variants that one would like to use in practice.
Abstract: In this paper we look at the Gabidulin version of the McEliece cryptosystem (GPT) and its variants. We give an overview of the existing structural attacks on the basic scheme, and show how to combine them to get an effective attack for every GPT variant. As a consequence, there are no secure parameter sets left for GPT variants that one would like to use in practice.

149 citations


Journal ArticleDOI
TL;DR: In this paper, the concept of key encapsulation was extended to the primitives of identity-based and certificateless encryption, and it was shown that the natural combination of ID-KEMs or CL-Kems with data encapsulation mechanisms results in encryption schemes that are secure in a strong sense.
Abstract: We extend the concept of key encapsulation to the primitives of identity-based and certificateless encryption. We show that the natural combination of ID-KEMs or CL-KEMs with data encapsulation mechanisms results in encryption schemes that are secure in a strong sense. In addition, we give generic constructions of ID-KEMs and CL-KEMs that are provably secure in the random oracle model.

135 citations


Journal ArticleDOI
TL;DR: In this paper, a ciphertext-only cryptanalysis of GSM (Global System for Mobile communications) encrypted communication is presented, and various active attacks on the GSM protocols are discussed.
Abstract: In this paper we present a very practical ciphertext-only cryptanalysis of GSM (Global System for Mobile communications) encrypted communication, and various active attacks on the GSM protocols. These attacks can even break into GSM networks that use “unbreakable” ciphers. We first describe a ciphertext-only attack on A5/2 that requires a few dozen milliseconds of encrypted off-the-air cellular conversation and finds the correct key in less than a second on a personal computer. We extend this attack to a (more complex) ciphertext-only attack on A5/1. We then describe new (active) attacks on the protocols of networks that use A5/1, A5/3, or even GPRS (General Packet Radio Service). These attacks exploit flaws in the GSM protocols, and they work whenever the mobile phone supports a weak cipher such as A5/2. We emphasize that these attacks are on the protocols, and are thus applicable whenever the cellular phone supports a weak cipher, for example, they are also applicable for attacking A5/3 networks using the cryptanalysis of A5/1. Unlike previous attacks on GSM that require unrealistic information, like long known-plaintext periods, our attacks are very practical and do not require any knowledge of the content of the conversation. Furthermore, we describe how to fortify the attacks to withstand reception errors. As a result, our attacks allow attackers to tap conversations and decrypt them either in real-time, or at any later time. We present several attack scenarios such as call hijacking, altering of data messages and call theft.

130 citations


Journal ArticleDOI
TL;DR: This paper proposes new definitions of anonymity and unforgeability which address these threats, and gives the first constructions of ring signature schemes in the standard model, one of which satisfies the strongest definitions of security.
Abstract: Ring signatures, first introduced by Rivest, Shamir, and Tauman, enable a user to sign a message so that a ring of possible signers (of which the user is a member) is identified, without revealing exactly which member of that ring actually generated the signature. In contrast to group signatures, ring signatures are completely “ad-hoc” and do not require any central authority or coordination among the various users (indeed, users do not even need to be aware of each other); furthermore, ring signature schemes grant users fine-grained control over the level of anonymity associated with any particular signature. This paper has two main areas of focus. First, we examine previous definitions of security for ring signature schemes and suggest that most of these prior definitions are too weak, in the sense that they do not take into account certain realistic attacks. We propose new definitions of anonymity and unforgeability which address these threats, and give separation results proving that our new notions are strictly stronger than previous ones. Second, we show the first constructions of ring signature schemes in the standard model. One scheme is based on generic assumptions and satisfies our strongest definitions of security. Two additional schemes are more efficient, but achieve weaker security guarantees and more limited functionality.

101 citations


Journal ArticleDOI
Charanjit S. Jutla1
TL;DR: The Integrity Aware Parallelizable Mode (IAPM) requires a total of m+1 block cipher evaluations on a plaintext of length m blocks, compared with roughly 2m+1 evaluations for CBC encryption followed by a CBC-MAC.
Abstract: We define a new mode of operation for block ciphers which, in addition to providing confidentiality, also ensures message integrity. In contrast, previously for message integrity a separate pass was required to compute a cryptographic message authentication code (MAC). The new mode of operation, called Integrity Aware Parallelizable Mode (IAPM), requires a total of m+1 block cipher evaluations on a plain-text of length m blocks. For comparison, the well-known CBC (cipher block chaining) encryption mode requires m block cipher evaluations, and the second pass of computing the CBC-MAC essentially requires additional m+1 block cipher evaluations. As the name suggests, the new mode is also highly parallelizable.
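The cost comparison in the abstract can be made concrete with a trivial helper, using exactly the evaluation counts stated there: m+1 block-cipher calls for one-pass IAPM, versus m for CBC encryption plus an additional m+1 for the separate CBC-MAC pass.

```python
# Block-cipher evaluation counts for authenticated encryption of m blocks,
# per the figures stated in the abstract (illustration only).
def block_cipher_calls(m: int) -> dict:
    return {
        "IAPM (one pass)": m + 1,
        "CBC encrypt + CBC-MAC (two passes)": m + (m + 1),
    }
```

For long messages the two-pass approach therefore costs nearly twice as many block-cipher invocations, which is the saving (on top of parallelizability) that motivates the mode.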

Journal ArticleDOI
TL;DR: In this paper, the generic construction of hybrid encryption schemes is presented, which produces more efficient schemes than the ones known before, and it allows immediate conversion from a class of threshold public-key encryption to a threshold hybrid one without considerable overhead.
Abstract: This paper presents a novel framework for the generic construction of hybrid encryption schemes which produces more efficient schemes than the ones known before. A previous framework introduced by Shoup combines a key encapsulation mechanism (KEM) and a data encryption mechanism (DEM). While it is sufficient to require both components to be secure against chosen ciphertext attacks (CCA-secure), Kurosawa and Desmedt showed a particular example of KEM that is not CCA-secure but can be securely combined with a specific type of CCA-secure DEM to obtain a more efficient, CCA-secure hybrid encryption scheme. There are also many other efficient hybrid encryption schemes in the literature that do not fit into Shoup’s framework. These facts serve as motivation to seek another framework. The framework we propose yields more efficient hybrid schemes, and in addition provides insightful explanations about existing schemes that do not fit into the previous framework. Moreover, it allows immediate conversion from a class of threshold public-key encryption to a threshold hybrid one without considerable overhead, which may not be possible in the previous approach.

Journal ArticleDOI
TL;DR: A new approach to designing public-key cryptosystems based on covers and logarithmic signatures of non-abelian finite groups is presented, and the proposed underlying group, represented as a matrix group, affords significant space and time efficiency.
Abstract: We present a new approach to designing public-key cryptosystems based on covers and logarithmic signatures of non-abelian finite groups. Initially, we describe a generic version of the system for a large class of groups. We then propose a class of 2-groups and argue heuristically about the system’s security. The system is scalable, and the proposed underlying group, represented as a matrix group, affords significant space and time efficiency.

Journal ArticleDOI
TL;DR: A heuristic analysis of the algorithm is presented which indicates that the DLP in degree 0 class groups of non-hyperelliptic curves of genus 3 can be solved in an expected time of $\tilde{O}(q)$ .
Abstract: We study an index calculus algorithm to solve the discrete logarithm problem (DLP) in degree 0 class groups of non-hyperelliptic curves of genus 3 over finite fields. We present a heuristic analysis of the algorithm which indicates that the DLP in degree 0 class groups of non-hyperelliptic curves of genus 3 can be solved in an expected time of $\tilde{O}(q)$. This heuristic result relies on one heuristic assumption which is studied experimentally. We also present experimental data which show that a variant of the algorithm is faster than the Rho method even for small group sizes, and we address practical limitations of the algorithm.

Journal ArticleDOI
TL;DR: The first proof that parallel repetition with thresholds improves the security of challenge-response protocols is given, via a very general result about an attacker’s ability to solve a large fraction of many independent instances of a hard problem.
Abstract: Consider a challenge-response protocol where the probability of a correct response is at least α for a legitimate user and at most β<α for an attacker. One example is a CAPTCHA challenge, where a human should have a significantly higher chance of answering a single challenge (e.g., uncovering a distorted letter) than an attacker; another example is an argument system without perfect completeness. A natural approach to boost the gap between legitimate users and attackers is to issue many challenges and accept if the response is correct for more than a threshold fraction, for the threshold chosen between α and β. We give the first proof that parallel repetition with thresholds improves the security of such protocols. We do this with a very general result about an attacker’s ability to solve a large fraction of many independent instances of a hard problem, showing a Chernoff-like convergence of the fraction solved incorrectly to the probability of failure for a single instance.
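The threshold-repetition idea above is easy to demonstrate numerically. The following sketch, with hypothetical example numbers (alpha = 0.9 for a legitimate user, beta = 0.5 for an attacker, a 70% acceptance threshold over 100 challenges), estimates each party's acceptance probability by simulation and compares the attacker's chance against a standard Chernoff-style tail bound; it illustrates the gap amplification, not the paper's proof.

```python
# Monte Carlo illustration of parallel repetition with a threshold.
# ASSUMPTIONS: the rates alpha=0.9, beta=0.5 and threshold=0.7 are
# hypothetical example numbers, not taken from the paper.
import math
import random


def pass_probability(p: float, n: int, threshold: float, trials: int = 5000) -> float:
    """Estimate the chance that a responder with per-challenge success
    probability p answers more than a threshold fraction of n independent
    challenges correctly."""
    random.seed(0)
    hits = 0
    for _ in range(trials):
        correct = sum(random.random() < p for _ in range(n))
        if correct > threshold * n:
            hits += 1
    return hits / trials


# Standard Chernoff-Hoeffding tail bound on the attacker's pass probability:
# exp(-2 * n * (threshold - beta)^2), here exp(-8) ~ 3.4e-4.
attacker_bound = math.exp(-2 * 100 * (0.7 - 0.5) ** 2)
```

With these numbers the legitimate user passes essentially always while the attacker essentially never does, matching the Chernoff-like convergence the abstract describes.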

Journal ArticleDOI
TL;DR: In this paper, the authors prove lower bounds and impossibility results for secure protocols in the setting of concurrent self composition, where a single protocol is executed many times concurrently in a network.
Abstract: In the setting of concurrent self composition, a single protocol is executed many times concurrently in a network. In this paper, we prove lower bounds and impossibility results for secure protocols in this setting. First and foremost, we prove that there exist large classes of functionalities that cannot be securely computed under concurrent self composition, by any protocol. We also prove a communication complexity lower bound on protocols that securely compute a large class of functionalities in this setting. Specifically, we show that any protocol that computes a functionality from this class and remains secure for m concurrent executions, must have bandwidth of at least m bits. The above results are unconditional and hold for any type of simulation (i.e., even for non-black-box simulation). In addition, we prove a severe lower bound on protocols that are proven secure using black-box simulation. Specifically, we show that any protocol that computes the blind signature or oblivious transfer functionalities and remains secure for m concurrent executions, where security is proven via black-box simulation, must have at least m rounds of communication. Our results hold for the plain model, where no trusted setup phase is assumed. While proving our impossibility results, we also show that for many functionalities, security under concurrent self composition (where a single secure protocol is run many times) is actually equivalent to the seemingly more stringent requirement of security under concurrent general composition (where a secure protocol is run concurrently with other arbitrary protocols). This observation has significance beyond the impossibility results that are derived by it for concurrent self composition.

Journal ArticleDOI
TL;DR: In this paper, a general computational framework, called Sequential Probabilistic Process Calculus (SPPC), is used to clarify the relationships between the simulation-based security conditions; many of the proofs are carried out based on a small set of equivalence principles involving processes and distributed systems.
Abstract: Several compositional forms of simulation-based security have been proposed in the literature, including Universal Composability, Black-Box Simulatability, and variants thereof. These relations between a protocol and an ideal functionality are similar enough that they can be ordered from strongest to weakest according to the logical form of their definitions. However, determining whether two relations are in fact identical depends on some subtle features that have not been brought out in previous studies. We identify two main factors: the position of a “master process” in the distributed system and some limitations on transparent message forwarding within computational complexity bounds. Using a general computational framework, called Sequential Probabilistic Process Calculus (SPPC), we clarify the relationships between the simulation-based security conditions. Many of the proofs are carried out based on a small set of equivalence principles involving processes and distributed systems. These equivalences exhibit the essential properties needed to prove relationships between security notions and allow us to carry over our results to those computational models which satisfy these equivalences.

Journal ArticleDOI
TL;DR: An algorithm based on the Fast Walsh Transform (FWT) is devised to solve the MLD problem for any linear code with dimension L and length n within time O(n + L·2^L).
Abstract: In this paper, we study an E0-like combiner with memory as the keystream generator. First, we formulate a systematic and simple method to compute correlations of the FSM output sequences (up to certain bits). An upper bound of the correlations is given, which is useful to the designer. Second, we show how to build either a uni-bias-based or multi-bias-based distinguisher to distinguish the keystream produced by the combiner from a truly random sequence, once correlations are found. The data complexity of both distinguishers is carefully analyzed for performance comparison. We show that the multi-bias-based distinguisher outperforms the uni-bias-based distinguisher only when the patterns of the largest biases are linearly dependent. The keystream distinguisher is then upgraded for use in the key-recovery attack. The latter actually reduces to the well-known Maximum Likelihood Decoding (MLD) problem given a long enough keystream. We devise an algorithm based on the Fast Walsh Transform (FWT) to solve the MLD problem for any linear code with dimension L and length n within time O(n + L·2^L). Meanwhile, we summarize a design criterion for our E0-like combiner with memory to resist the proposed attacks.
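The transform at the heart of the O(n + L·2^L) decoding step is the standard fast Walsh-Hadamard transform, which computes all 2^L correlations with L·2^L additions instead of the naive 2^L · 2^L. A minimal in-place sketch of that standard transform (not the paper's full decoding algorithm):

```python
# Fast Walsh-Hadamard transform: the butterfly structure gives L * 2^L
# additions for an input of length 2^L (L = code dimension).
def fwht(a):
    """Return the Walsh-Hadamard transform of a sequence whose length is a
    power of two."""
    a = list(a)
    h = 1
    while h < len(a):
        for i in range(0, len(a), h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y  # butterfly: sum and difference
        h *= 2
    return a
```

A useful sanity check is the involution property: applying the transform twice multiplies the input by its length.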

Journal ArticleDOI
TL;DR: In this paper, the reliability and security of information transmission in networks are studied within the framework of Franklin and Wright (J. Cryptol. 13(1):9–30, 2000): multicast communication and a Byzantine adversary.
Abstract: This paper studies reliability and security of information transmission in networks. We consider the framework of Franklin and Wright (J. Cryptol. 13(1):9–30, 2000): multicast communication and a Byzantine adversary. Franklin and Wright studied particular neighbor graphs with neighbor-disjoint paths. The aim of the present work is to drop this assumption and to give necessary and sufficient conditions on the neighbor graph allowing reliable and secure information transmission.

Journal ArticleDOI
TL;DR: Goldreich and Lindell (CRYPTO ’01) presented the first protocol for password-authenticated key exchange in the standard model; this paper presents a simplification of the GL protocol for the special case when the dictionary is of the form $\mathcal{D}=\{0,1\}^{d}$, i.e., the password is a short string chosen uniformly at random (in the spirit of an ATM PIN).
Abstract: Goldreich and Lindell (CRYPTO ’01) recently presented the first protocol for password-authenticated key exchange in the standard model (with no common reference string or set-up assumptions other than the shared password). However, their protocol uses several heavy tools and has a complicated analysis. We present a simplification of the Goldreich–Lindell (GL) protocol and analysis for the special case when the dictionary is of the form $\mathcal{D}=\{0,1\}^{d}$, i.e., the password is a short string chosen uniformly at random (in the spirit of an ATM PIN). The security bound achieved by our protocol is somewhat worse than the GL protocol. Roughly speaking, our protocol guarantees that the adversary can “break” the scheme with probability at most $O(\mathrm{poly}(n)/|\mathcal{D}|)^{\Omega(1)}$, whereas the GL protocol guarantees a bound of $O(1/|\mathcal{D}|)$. We also present an alternative, more natural definition of security than the “augmented definition” of Goldreich and Lindell, and prove that the two definitions are equivalent.

Journal ArticleDOI
TL;DR: Two different attacks against the ISO/IEC 9796-1 signature standard for RSA and Rabin are described: a variant of Desmedt and Odlyzko's attack and a more powerful attack that requires only three signatures.
Abstract: We describe two different attacks against the ISO/IEC 9796-1 signature standard for RSA and Rabin. Both attacks consist in an existential forgery under a chosen-message attack: the attacker asks for the signature of some messages of his choice, and is then able to produce the signature of a message that was never signed by the legitimate signer. The first attack is a variant of Desmedt and Odlyzko’s attack and requires a few hundreds of signatures. The second attack is more powerful and requires only three signatures.


Journal ArticleDOI
TL;DR: It is shown that an eavesdropper can always efficiently recover the private key of one of the two parties in the public-key cryptography protocol introduced by Shpilrain and Ushakov, making the protocol insecure.
Abstract: This paper shows that an eavesdropper can always efficiently recover the private key of one of the two parties in the public-key cryptography protocol introduced by Shpilrain and Ushakov (ACNS 2005, Lecture Notes in Comput. Sci., vol. 3531, pp. 151–163, 2005). Thus an eavesdropper can always recover the shared secret key, making the protocol insecure.

Journal ArticleDOI
TL;DR: New trade-offs in the scenario of a single execution of a quantum string commitment scheme (QSC) protocol are presented, which immediately imply the trade-off shown by Buhrman et al. in the asymptotic regime.
Abstract: String commitment schemes are similar to the well-studied bit commitment schemes in cryptography, with the difference that the committing party, say $\mathsf{Alice}$, is supposed to commit a long string instead of a single bit to another party, say $\mathsf{Bob}$. Similar to bit commitment schemes, such schemes are supposed to be binding, i.e., $\mathsf{Alice}$ cannot change her choice after committing, and concealing, i.e., $\mathsf{Bob}$ cannot find $\mathsf{Alice}$’s committed string before $\mathsf{Alice}$ reveals it. Ideal commitment schemes are known to be impossible. Even if some degree of cheating is allowed, Buhrman et al. (quant-ph/0504078, Nov 2007) have recently shown that there are some binding-concealing trade-offs that any quantum string commitment scheme ($\mathsf{QSC}$) must follow. They showed trade-offs both in the scenario of a single execution of the protocol and in the asymptotic regime of a sufficiently large number of parallel executions of the protocol. We present here new trade-offs in the scenario of a single execution of a $\mathsf{QSC}$ protocol. Our trade-offs also immediately imply the trade-off shown by Buhrman et al. in the asymptotic regime. We show our results by making central use of an important information-theoretic tool called the substate theorem due to Jain et al. (Proceedings of the 43rd Annual IEEE Symposium on Foundations of Computer Science, pp. 429–438, 2002). Our techniques are quite different from those of Buhrman et al. and may be of independent interest.

Journal ArticleDOI
TL;DR: In this paper, the authors develop techniques for dealing with expected polynomial-time adversaries in simulation-based security proofs; expected polynomial-time simulation has been shown to be essential for achieving constant-round black-box zero-knowledge protocols.
Abstract: The standard class of adversaries considered in cryptography is that of strict polynomial-time probabilistic machines. However, expected polynomial-time machines are often also considered. For example, there are many zero-knowledge protocols for which the only known simulation techniques run in expected (and not strict) polynomial time. In addition, it has been shown that expected polynomial-time simulation is essential for achieving constant-round black-box zero-knowledge protocols. This reliance on expected polynomial-time simulation introduces a number of conceptual and technical difficulties. In this paper, we develop techniques for dealing with expected polynomial-time adversaries in simulation-based security proofs.

Journal ArticleDOI
TL;DR: A careful, fixed-size parameter analysis of a standard way to construct pseudo-random generators from one-way functions is given; the analysis is effective even for security parameters/key-sizes supported by typical block ciphers and hash functions, enabling very practical pseudo-random generators with strong properties based on plausible assumptions.
Abstract: We give a careful, fixed-size parameter analysis of a standard (Blum and Micali in SIAM J. Comput. 13(4):850–864, 1984; Goldreich and Levin in Proceedings of 21st ACM Symposium on Theory of Computing, pp. 25–32, 1989) way to form a pseudo-random generator from a one-way function and then pseudo-random functions from said generator (Goldreich et al. in J. Assoc. Comput. Mach. 33(4):792–807, 1986). While the analysis is done in the model of exact security, we improve known bounds also asymptotically when many bits are output each round, and we find all auxiliary parameters efficiently, giving a uniform result. These optimizations make the analysis effective even for security parameters/key-sizes supported by typical block ciphers and hash functions. This enables us to construct very practical pseudo-random generators with strong properties based on plausible assumptions.