
Showing papers on "Cryptography published in 1998"


Book ChapterDOI
31 May 1998
TL;DR: A definition of protocol divertibility is given that applies to arbitrary 2-party protocols and is compatible with Okamoto and Ohta's definition in the case of interactive zero-knowledge proofs; a sufficiency criterion for divertibility is also proposed that, surprisingly, generalizes to cover several protocols not normally associated with divertibility.
Abstract: First, we introduce the notion of divertibility as a protocol property as opposed to the existing notion as a language property (see Okamoto, Ohta [OO90]). We give a definition of protocol divertibility that applies to arbitrary 2-party protocols and is compatible with Okamoto and Ohta's definition in the case of interactive zero-knowledge proofs. Other important examples falling under the new definition are blind signature protocols. We propose a sufficiency criterion for divertibility that is satisfied by many existing protocols and which, surprisingly, generalizes to cover several protocols not normally associated with divertibility (e.g., Diffie-Hellman key exchange). Next, we introduce atomic proxy cryptography, in which an atomic proxy function, in conjunction with a public proxy key, converts ciphertexts (messages or signatures) for one key into ciphertexts for another. Proxy keys, once generated, may be made public and proxy functions applied in untrusted environments. We present atomic proxy functions for discrete-log-based encryption, identification, and signature schemes. It is not clear whether atomic proxy functions exist in general for all public-key cryptosystems. Finally, we discuss the relationship between divertibility and proxy cryptography.
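
The discrete-log encryption case can be made concrete. Below is a toy Python sketch of an ElGamal-style atomic proxy function (our own illustration with tiny parameters, not production code): a public proxy key pi = a/b mod q lets an untrusted party convert ciphertexts under key a into ciphertexts under key b without any intermediate decryption.

```python
# Sketch of an ElGamal-style atomic proxy function (toy parameters;
# a real instantiation uses a large prime-order group).
import random

p, q, g = 467, 233, 4          # p = 2q + 1; g generates the order-q subgroup

def keygen():
    a = random.randrange(1, q)
    return a, pow(g, a, p)      # secret exponent, public key g^a

def encrypt(pub, m):            # m must lie in the order-q subgroup
    k = random.randrange(1, q)
    return pow(g, k, p), m * pow(pub, k, p) % p

def decrypt(sec, ct):
    c1, c2 = ct
    return c2 * pow(c1, q - sec, p) % p     # c2 / c1^sec

def proxy_key(a, b):
    return a * pow(b, -1, q) % q            # pi = a/b mod q; may be public

def reencrypt(pi, ct):
    c1, c2 = ct                             # atomic: no plaintext appears
    return pow(c1, pi, p), c2

a, ya = keygen()
b, yb = keygen()
m = pow(g, 42, p)                           # a message in the subgroup
ct_b = reencrypt(proxy_key(a, b), encrypt(ya, m))
assert decrypt(b, ct_b) == m                # b can decrypt what was sent to a
```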

1,533 citations


Journal ArticleDOI
TL;DR: It is shown that public key information hiding systems exist and are not necessarily constrained to the case where the warden is passive; the use of parity checks to amplify covertness and provide public key steganography is also described.
Abstract: In this paper, we clarify what steganography is and what it can do. We contrast it with the related disciplines of cryptography and traffic security, present a unified terminology agreed at the first international workshop on the subject, and outline a number of approaches-many of them developed to hide encrypted copyright marks or serial numbers in digital audio or video. We then present a number of attacks, some new, on such information hiding schemes. This leads to a discussion of the formidable obstacles that lie in the way of a general theory of information hiding systems (in the sense that Shannon gave us a general theory of secrecy systems). However, theoretical considerations lead to ideas of practical value, such as the use of parity checks to amplify covertness and provide public key steganography. Finally, we show that public key information hiding systems exist, and are not necessarily constrained to the case where the warden is passive.
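
The parity-check idea mentioned above is simple to sketch: if a message bit is embedded as the XOR-parity of a whole group of cover bits rather than in one fixed position, the sender may flip whichever bit of the group is least conspicuous. A minimal sketch, with hypothetical cover LSBs:

```python
import random

def embed_bit(group_lsbs, msg_bit):
    """Embed msg_bit as the XOR-parity of a group of cover LSBs."""
    if sum(group_lsbs) % 2 != msg_bit:
        i = random.randrange(len(group_lsbs))   # in practice: pick the least
        group_lsbs[i] ^= 1                      # detectable position to flip
    return group_lsbs

def extract_bit(group_lsbs):
    return sum(group_lsbs) % 2

group = [0, 1, 1, 0, 1, 0, 1, 1]    # LSBs of 8 cover samples (hypothetical)
assert extract_bit(embed_bit(group, 1)) == 1
```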

1,270 citations


Journal Article
Ronald Cramer1, Victor Shoup2
TL;DR: In this article, a new public key cryptosystem is proposed and analyzed which is provably secure against adaptive chosen ciphertext attack under standard intractability assumptions. The scheme is also quite practical.
Abstract: A new public key cryptosystem is proposed and analyzed. The scheme is quite practical, and is provably secure against adaptive chosen ciphertext attack under standard intractability assumptions. There appears to be no previous cryptosystem in the literature that enjoys both of these properties simultaneously.
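
The scheme is compact enough to sketch end to end. A toy Python rendering of Cramer-Shoup key generation, encryption, and decryption follows (small illustrative parameters; a real instantiation needs a large prime-order group and a collision-resistant hash):

```python
# Toy sketch of the Cramer-Shoup cryptosystem (illustrative parameters only).
import hashlib, random

p, q = 467, 233                 # p = 2q + 1
g1, g2 = 4, 9                   # generators of the order-q subgroup

def H(u1, u2, e):               # hash the ciphertext prefix into Z_q
    return int.from_bytes(
        hashlib.sha256(f"{u1},{u2},{e}".encode()).digest(), "big") % q

def keygen():
    x1, x2, y1, y2, z = (random.randrange(q) for _ in range(5))
    c = pow(g1, x1, p) * pow(g2, x2, p) % p
    d = pow(g1, y1, p) * pow(g2, y2, p) % p
    h = pow(g1, z, p)
    return (x1, x2, y1, y2, z), (c, d, h)

def encrypt(pk, m):             # m must lie in the order-q subgroup
    c, d, h = pk
    r = random.randrange(1, q)
    u1, u2 = pow(g1, r, p), pow(g2, r, p)
    e = m * pow(h, r, p) % p
    a = H(u1, u2, e)
    v = pow(c, r, p) * pow(d, r * a % q, p) % p   # binds the ciphertext parts
    return u1, u2, e, v

def decrypt(sk, ct):
    x1, x2, y1, y2, z = sk
    u1, u2, e, v = ct
    a = H(u1, u2, e)
    if pow(u1, (x1 + y1 * a) % q, p) * pow(u2, (x2 + y2 * a) % q, p) % p != v:
        raise ValueError("ciphertext rejected")   # foils chosen-ciphertext
    return e * pow(u1, (q - z) % q, p) % p        # attacks; then e / u1^z

sk, pk = keygen()
m = pow(g1, 99, p)
assert decrypt(sk, encrypt(pk, m)) == m
```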

1,228 citations


Book ChapterDOI
14 Apr 1998
TL;DR: An information-theoretic model for steganography with passive adversaries is proposed and several secure steganographic schemes are presented; one of them is a universal information hiding scheme based on universal data compression techniques that requires no knowledge of the covertext statistics.
Abstract: An information-theoretic model for steganography with passive adversaries is proposed. The adversary’s task of distinguishing between an innocent cover message C and a modified message S containing a secret part is interpreted as a hypothesis testing problem. The security of a steganographic system is quantified in terms of the relative entropy (or discrimination) between P_C and P_S. Several secure steganographic schemes are presented in this model; one of them is a universal information hiding scheme based on universal data compression techniques that requires no knowledge of the covertext statistics.
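
Concretely, a stegosystem is called epsilon-secure in this model when D(P_C || P_S) <= epsilon, and perfectly secure when the relative entropy is zero; standard hypothesis-testing bounds then limit the warden's detection performance. A small sketch of the quantity being bounded, over hypothetical empirical distributions:

```python
import math

def relative_entropy(P, Q):
    """D(P || Q) in bits, for distributions on the same finite alphabet."""
    return sum(p * math.log2(p / q) for p, q in zip(P, Q) if p > 0)

P_C = [0.25, 0.25, 0.25, 0.25]      # covertext statistics (hypothetical)
P_S = [0.24, 0.26, 0.25, 0.25]      # stegotext statistics (hypothetical)
print(f"D(P_C || P_S) = {relative_entropy(P_C, P_S):.6f} bits")
```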

882 citations


Journal ArticleDOI
TL;DR: It is argued that steganography by itself does not ensure secrecy, but neither does simple encryption; if these methods are combined, however, stronger protection results.
Abstract: Steganography is the art of hiding information in ways that prevent the detection of hidden messages. It includes a vast array of secret communications methods that conceal the message's very existence. These methods include invisible inks, microdots, character arrangement, digital signatures, covert channels, and spread spectrum communications. Steganography and cryptography are cousins in the spycraft family: cryptography scrambles a message so it cannot be understood while steganography hides the message so it cannot be seen. In this article the authors discuss image files and how to hide information in them, and discuss results obtained from evaluating available steganographic software. They argue that steganography by itself does not ensure secrecy, but neither does simple encryption. If these methods are combined, however, stronger encryption methods result. If an encrypted message is intercepted, the interceptor knows the text is an encrypted message. But with steganography, the interceptor may not know that a hidden message even exists. For a brief look at how steganography evolved, there is included a sidebar titled "Steganography: Some History."
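
The most common technique the article examines, least-significant-bit (LSB) embedding, fits in a few lines. A minimal sketch over a hypothetical array of grayscale pixel values (real tools additionally scatter the embedding positions with a key):

```python
def lsb_embed(pixels, bits):
    out = pixels[:]
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit     # overwrite the least significant bit
    return out

def lsb_extract(pixels, n_bits):
    return [px & 1 for px in pixels[:n_bits]]

cover = [137, 200, 55, 18, 246, 91, 64, 173]   # toy grayscale pixels
bits = [1, 0, 1, 1, 0, 0, 1, 0]
assert lsb_extract(lsb_embed(cover, bits), len(bits)) == bits
```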

644 citations


Proceedings ArticleDOI
01 Jun 1998
TL;DR: A very simple Verifiable Secret Sharing protocol is presented which is based on fast cryptographic primitives and avoids altogether the need for expensive zero-knowledge proofs, together with a highly simplified protocol to compute multiplications over shared secrets.
Abstract: The goal of this paper is to introduce a simple verifiable secret sharing scheme, to improve the efficiency of known secure multiparty protocols and, by employing these techniques, to improve the efficiency of applications which use these protocols. First we present a very simple Verifiable Secret Sharing protocol which is based on fast cryptographic primitives and avoids altogether the need for expensive zero-knowledge proofs. This is followed by a highly simplified protocol to compute multiplications over shared secrets. This is a major component in secure multiparty computation protocols and accounts for much of the complexity of proposed solutions. Using our protocol as a plug-in unit in known protocols reduces their complexity. We show how to achieve efficient multiparty computations in the computational model, through the application of homomorphic commitments. Finally, we present fast-track multiparty computation protocols. In a model in which malicious faults are rare we show that it is possible to carry out a simpler and more efficient protocol which does not perform all the expensive checks needed to combat a malicious adversary from foiling the computation. Yet, the protocol still enables detection of faults and recovers the computation when faults occur without giving any information advantage to the adversary. This results in protocols which are much more efficient under normal operation of the system, i.e., when there are no faults. As an example of the practical impact of our work we show how our techniques can be used to greatly improve the speed and the fault-tolerance of existing threshold cryptography protocols.
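
The flavor of VSS from homomorphic commitments can be sketched (a rough Pedersen-style illustration in the same spirit, not the paper's exact protocol): the dealer publishes hiding commitments to the coefficients of the sharing polynomial, and each player checks its share against them, with no zero-knowledge proofs anywhere.

```python
# Rough Pedersen-style VSS illustration (toy parameters, our own sketch).
import random

p, q = 467, 233                 # p = 2q + 1
g, h = 4, 9                     # independent generators of the subgroup

def deal(secret, t, n):
    f  = [secret] + [random.randrange(q) for _ in range(t)]  # sharing poly
    fb = [random.randrange(q) for _ in range(t + 1)]         # blinding poly
    commits = [pow(g, a, p) * pow(h, b, p) % p for a, b in zip(f, fb)]
    shares = {i: (sum(a * i**j for j, a in enumerate(f)) % q,
                  sum(b * i**j for j, b in enumerate(fb)) % q)
              for i in range(1, n + 1)}
    return shares, commits      # commits are broadcast; shares sent privately

def verify(i, s, sb, commits):  # player i checks its share locally
    lhs = pow(g, s, p) * pow(h, sb, p) % p
    rhs = 1
    for j, C in enumerate(commits):
        rhs = rhs * pow(C, pow(i, j, q), p) % p
    return lhs == rhs

shares, commits = deal(secret=123, t=2, n=5)
assert all(verify(i, s, sb, commits) for i, (s, sb) in shares.items())
```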

529 citations


Proceedings ArticleDOI
03 May 1998
TL;DR: This paper studies secure off-line authenticated user identification schemes based on a biometric system that can measure a user's biometrics accurately (up to some Hamming distance) and investigates a new technology which allows a user's biometric data to facilitate cryptographic mechanisms.
Abstract: In developing secure applications and systems, designers must often incorporate secure user identification in the design specification. In this paper, we study secure off-line authenticated user identification schemes based on a biometric system that can measure a user's biometrics accurately (up to some Hamming distance). The presented schemes enhance identification and authorization in secure applications by binding a biometric template with authorization information on a token such as a magnetic strip. Also developed are schemes specifically designed to minimize the compromising of a user's private biometrics data, encapsulated in the authorization information, without requiring secure hardware tokens. We also study the feasibility of biometrics performing as an enabling technology for secure systems and applications design. We investigate a new technology which allows a user's biometrics to facilitate cryptographic mechanisms.
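
One way to bind a secret to a noisy biometric, in the spirit of such schemes (a simplified fuzzy-commitment-style sketch of our own; a crude repetition code stands in for a real error-correcting code): only the offset between the template and a codeword is stored, and any rescan within the code's correction radius releases the bound secret.

```python
import secrets

R = 5                          # repetition factor: corrects < R/2 flips/bit

def encode(bits):              # simplistic stand-in for a real ECC
    return [b for bit in bits for b in [bit] * R]

def decode(bits):              # majority vote per repetition group
    return [int(sum(bits[i*R:(i+1)*R]) > R // 2)
            for i in range(len(bits) // R)]

def bind(template, key_bits):
    """Store only the offset between the biometric and a codeword."""
    return [b ^ c for b, c in zip(template, encode(key_bits))]

def release(offset, rescan):
    return decode([b ^ o for b, o in zip(rescan, offset)])

key = [1, 0, 1, 1]
template = [secrets.randbelow(2) for _ in range(len(key) * R)]
offset = bind(template, key)          # stored token data
rescan = template[:]
rescan[3] ^= 1; rescan[11] ^= 1       # measurement noise within tolerance
assert release(offset, rescan) == key
```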

520 citations


Journal Article
TL;DR: A new coordinate system and a new mixed coordinates strategy are proposed, which significantly improves on the number of basic operations needed for elliptic curve exponentiation.
Abstract: Elliptic curve cryptosystems, proposed by Koblitz ([12]) and Miller ([16]), can be constructed over a smaller field of definition than the ElGamal cryptosystems ([6]) or the RSA cryptosystems ([20]). This is why elliptic curve cryptosystems have begun to attract notice. In this paper, we investigate efficient elliptic curve exponentiation. We propose a new coordinate system and a new mixed coordinates strategy, which significantly improves on the number of basic operations needed for elliptic curve exponentiation.
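
The motivation for alternative coordinate systems is that each affine group operation needs a field inversion, which is far costlier than a multiplication; projective-style coordinates eliminate it, and mixed coordinates then pick the cheapest representation at each step. A sketch of doubling in Jacobian coordinates, where (X, Y, Z) represents the affine point (X/Z^2, Y/Z^3) (toy curve, our own illustration):

```python
# Point doubling on y^2 = x^3 + a*x + b in Jacobian coordinates:
# no field inversion is needed, unlike the affine doubling formula.
p, a = 97, 2                     # toy curve y^2 = x^3 + 2x + 3 over F_97

def double_jacobian(X, Y, Z):
    if Y == 0 or Z == 0:
        return 0, 1, 0                       # point at infinity
    S = 4 * X * Y * Y % p
    M = (3 * X * X + a * pow(Z, 4, p)) % p
    X3 = (M * M - 2 * S) % p
    Y3 = (M * (S - X3) - 8 * pow(Y, 4, p)) % p
    Z3 = 2 * Y * Z % p
    return X3, Y3, Z3

def to_affine(X, Y, Z):                      # one inversion, at the very end
    zi = pow(Z, -1, p)
    return X * zi * zi % p, Y * pow(zi, 3, p) % p

X3, Y3, Z3 = double_jacobian(3, 6, 1)        # P = (3, 6) is on the curve
assert to_affine(X3, Y3, Z3) == (80, 10)     # matches the affine result 2P
```

Mixed-coordinate strategies go further by, for example, keeping precomputed points in affine form and the accumulator in Jacobian form, choosing the cheapest addition formula for each pair of representations.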

487 citations


Journal ArticleDOI
Carlo Blundo, Alfredo De Santis, Ugo Vaccaro, Amir Herzberg1, Shay Kutten1, Moti Yung
TL;DR: This paper considers the model where interaction is allowed in the common key computation phase and shows a gap between the models by exhibiting a one-round interactive scheme in which the user's information is only k + t − 1 times the size of the common key.
Abstract: In this paper we analyze perfectly secure key distribution schemes for dynamic conferences. In this setting, any member of a group of t users can compute a common key using only his private initial piece of information and the identities of the other t − 1 users in the group. Keys are secure against coalitions of up to k users; that is, even if k users pool together their pieces they cannot compute anything about a key of any conference comprised of t other users. First we consider a noninteractive model where users compute the common key without any interaction. We prove a tight bound of C(k+t−1, t−1) times the size of the common key on the size of each user's piece of information. Then, we consider the model where interaction is allowed in the common key computation phase and show a gap between the models by exhibiting a one-round interactive scheme in which the user's information is only k + t − 1 times the size of the common key. Finally, we present its adaptation to network topologies with neighbourhood constraints and to asymmetric (e.g., client-server) communication models.
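
For t = 2 (pairwise keys), the noninteractive scheme can be realized with a symmetric bivariate polynomial over a finite field: user u receives the univariate polynomial f(u, ·), and users u and v independently compute the same key f(u, v) = f(v, u); coalitions of up to k other users learn nothing because f has degree k in each variable. A toy Python sketch:

```python
# Toy sketch: noninteractive pairwise keys (t = 2) from a symmetric
# bivariate polynomial, secure against coalitions of up to k users.
import random

q, k = 2_147_483_647, 3          # prime modulus; collusion threshold k

A = [[0] * (k + 1) for _ in range(k + 1)]    # dealer: symmetric coefficients
for i in range(k + 1):
    for j in range(i, k + 1):
        A[i][j] = A[j][i] = random.randrange(q)

def user_share(u):               # coefficients of g_u(y) = f(u, y)
    return [sum(A[i][j] * pow(u, i, q) for i in range(k + 1)) % q
            for j in range(k + 1)]

def common_key(share, other_id): # evaluate g_u at the other user's identity
    return sum(c * pow(other_id, j, q) for j, c in enumerate(share)) % q

alice, bob = 17, 42
assert common_key(user_share(alice), bob) == common_key(user_share(bob), alice)
```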

473 citations


Book ChapterDOI
31 May 1998
TL;DR: This work considers verification of a basic operation like modular exponentiation in some group (given (g, x, y), check that g^x = y), which is typically done by re-computing g^x, and shows how to do it differently, and faster.
Abstract: Many tasks in cryptography (e.g., digital signature verification) call for verification of a basic operation like modular exponentiation in some group: given (g, x, y) check that g^x = y. This is typically done by re-computing g^x and checking we get y. We would like to do it differently, and faster.
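
One technique from this line of work is batch verification with random small exponents: to check many claimed pairs g^{x_i} = y_i at once, take random l-bit multipliers s_i and test a single combined equation; a batch containing any bad pair passes with probability at most 2^{-l}. A toy sketch (our own illustration, with 2^l kept below the group order):

```python
# Toy sketch of small-exponents batch verification (real parameters would
# use a large q and ELL around 60-80; here 2^ELL <= q must hold).
import random

p, q, g = 467, 233, 4
ELL = 7                            # a bad batch passes with prob <= 2^-ELL

def batch_verify(pairs):
    """Check g^x == y for every (x, y) with one combined equation."""
    s = [random.getrandbits(ELL) for _ in pairs]
    lhs = pow(g, sum(si * x for si, (x, _) in zip(s, pairs)) % q, p)
    rhs = 1
    for si, (_, y) in zip(s, pairs):
        rhs = rhs * pow(y, si, p) % p
    return lhs == rhs

pairs = [(x, pow(g, x, p)) for x in (5, 77, 123, 200)]
assert batch_verify(pairs)                   # an honest batch always passes
pairs[2] = (123, pow(g, 124, p))             # inject one forged pair
assert not all(batch_verify(pairs) for _ in range(3))   # caught w.h.p.
```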

470 citations


Book ChapterDOI
05 Feb 1998
TL;DR: It is shown directly that the decision Diffie-Hellman assumption implies the security of the original ElGamal encryption scheme (with messages from a subgroup) without modification.
Abstract: The ElGamal encryption scheme was proposed several years ago and is one of the few probabilistic encryption schemes. However, its security has never been concretely proven based on clearly understood and accepted primitives. Here we show directly that the decision Diffie-Hellman assumption implies the security of the original ElGamal encryption scheme (with messages from a subgroup) without modification. In addition, we show that the opposite direction holds, i.e., the semantic security of the ElGamal encryption is actually equivalent to the decision Diffie-Hellman problem. We also present an exact analysis of the efficiency of the reduction.
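
The caveat "messages from a subgroup" is what the sketch below makes explicit: the DDH assumption is made in the prime-order subgroup of quadratic residues modulo a safe prime, so plaintexts must first be encoded into that subgroup. A toy illustration (our own, tiny parameters):

```python
# ElGamal with messages encoded into the order-q subgroup QR_p,
# the setting in which DDH is assumed (p = 2q + 1, p = 3 mod 4).
import random

p, q, g = 467, 233, 4            # g generates the quadratic residues mod p

def encode(m):                   # map 1 <= m <= q into the QR subgroup
    return m if pow(m, q, p) == 1 else -m % p

def decode(c):                   # invert the encoding
    return c if c <= q else -c % p

x = random.randrange(1, q)       # key pair
h = pow(g, x, p)

def encrypt(m):
    r = random.randrange(1, q)
    return pow(g, r, p), encode(m) * pow(h, r, p) % p

def decrypt(ct):
    c1, c2 = ct
    return decode(c2 * pow(c1, q - x, p) % p)

assert decrypt(encrypt(101)) == 101
```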

Proceedings Article
01 Jan 1998
TL;DR: In this article, the authors take a critical look at the relationship between the security of cryptographic schemes in the Random Oracle Model, and the schemes that result from implementing the random oracle by so called "cryptographic hash functions".
Abstract: We take a critical look at the relationship between the security of cryptographic schemes in the Random Oracle Model, and the security of the schemes that result from implementing the random oracle by so called "cryptographic hash functions". The main result of this paper is a negative one: There exist signature and encryption schemes that are secure in the Random Oracle Model, but for which any implementation of the random oracle results in insecure schemes. In the process of devising the above schemes, we consider possible definitions for the notion of a "good implementation" of a random oracle, pointing out limitations and challenges.

Book ChapterDOI
14 Apr 1998
TL;DR: Techniques that enable the software on a computer to control the electromagnetic radiation it transmits are discussed; for defence, a trusted screen driver can display sensitive information using fonts which minimise the energy of these emissions.
Abstract: It is well known that eavesdroppers can reconstruct video screen content from radio frequency emanations. We discuss techniques that enable the software on a computer to control the electromagnetic radiation it transmits. This can be used for both attack and defence. To attack a system, malicious code can encode stolen information in the machine’s RF emissions and optimise them for some combination of reception range, receiver cost and covertness. To defend a system, a trusted screen driver can display sensitive information using fonts which minimise the energy of these emissions. There is also an interesting potential application to software copyright protection.

Journal ArticleDOI
TL;DR: The CR capacity is shown to be achievable robustly, by common randomness of nearly uniform distribution no matter what the unknown parameters are; the results are relevant for the problem of identification capacity and also yield a new result on the regular (transmission) capacity of arbitrarily varying channels with feedback.
Abstract: For pt.I see ibid., vol.39, p.1121, 1993. The common randomness (CR) capacity of a two-terminal model is defined as the maximum rate of common randomness that the terminals can generate using resources specified by the given model. We determine CR capacity for several models, including those whose statistics depend on unknown parameters. The CR capacity is shown to be achievable robustly, by common randomness of nearly uniform distribution no matter what the unknown parameters are. Our CR capacity results are relevant for the problem of identification capacity, and also yield a new result on the regular (transmission) capacity of arbitrarily varying channels with feedback.

Book ChapterDOI
Daniel R. Simon1
31 May 1998
TL;DR: It is proved that there exists an oracle relative to which there exist several well-known cryptographic primitives, including one-way permutations, but no (for a suitably strong definition) collision-intractable hash functions.
Abstract: We prove the existence of an oracle relative to which there exist several well-known cryptographic primitives, including one-way permutations, but excluding (for a suitably strong definition) collision-intractable hash functions. Thus any proof that such functions can be derived from these weaker primitives is necessarily non-relativizing; in particular, no provable construction of a collision-intractable hash function can exist based solely on a “black box” one-way permutation. This result can be viewed as a partial justification for the common practice of treating the collision-intractable hash function as a cryptographic primitive, rather than attempting to derive it from a weaker primitive (such as a one-way permutation).

Book ChapterDOI
14 Sep 1998
TL;DR: In this paper, the authors propose several improvements on Kocher's ideas, leading to a practical implementation that is able to break a 512-bit key in a few hours, provided they are able to collect 300,000 timing measurements (128-bit keys can be recovered in a few seconds using a personal computer and fewer than 10,000 samples).
Abstract: When the running time of a cryptographic algorithm is non-constant, timing measurements can leak information about the secret key. This idea, first publicly introduced by Kocher, is developed here to attack an earlier version of the CASCADE smart card. We propose several improvements on Kocher’s ideas, leading to a practical implementation that is able to break a 512-bit key in a few hours, provided we are able to collect 300,000 timing measurements (128-bit keys can be recovered in a few seconds using a personal computer and fewer than 10,000 samples). We therefore show that the timing attack represents an important threat against cryptosystems, which must be taken into account very seriously.

Book
24 Nov 1998
TL;DR: This book focuses on cryptography along with two related areas: the study of probabilistic proof systems, and the theory of computational pseudorandomness, following a common theme that explores the interplay between randomness and computation.
Abstract: From the Publisher: This book focuses on cryptography along with two related areas: the study of probabilistic proof systems, and the theory of computational pseudorandomness. Following a common theme that explores the interplay between randomness and computation, the important notions in each field are covered, as well as novel ideas and insights.

Proceedings ArticleDOI
03 May 1998
TL;DR: This paper conceptualizes the specific cryptographic problems posed by mobile code, provides a solution for some of these problems, and presents techniques to achieve "non-interactive evaluation with encrypted functions" in certain cases.
Abstract: Mobile code technology has become a driving force for recent advances in distributed systems. The concept of the mobility of executable code raises major security problems. In this paper, we deal with the protection of mobile code from possibly malicious hosts. We conceptualize the specific cryptographic problems posed by mobile code, and we are able to provide a solution for some of these problems. We present techniques to achieve "non-interactive evaluation with encrypted functions" in certain cases and give a complete solution for this problem in important instances. We further present a way in which an agent might securely perform a cryptographic primitive (digital signing) in an untrusted execution environment. Our results are based on the use of homomorphic encryption schemes and function composition techniques.
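
The core idea of evaluating an encrypted function non-interactively can be sketched with an additively homomorphic scheme (exponential ElGamal here; an illustration of the general approach, not the paper's exact construction): the originator encrypts the coefficients of a polynomial, and the untrusted host evaluates the encrypted polynomial at its own input without learning the coefficients.

```python
# Sketch: non-interactive evaluation of an encrypted polynomial via an
# additively homomorphic scheme (toy parameters, small plaintext space).
import random

p, q, g = 467, 233, 4
sk = random.randrange(1, q)
h = pow(g, sk, p)                      # originator's public key

def enc(a):                            # Enc(a) = (g^r, g^a * h^r)
    r = random.randrange(1, q)
    return pow(g, r, p), pow(g, a, p) * pow(h, r, p) % p

def hom_add(c, d):                     # Enc(a) * Enc(b) -> Enc(a + b)
    return c[0] * d[0] % p, c[1] * d[1] % p

def hom_scale(c, k):                   # Enc(a)^k -> Enc(a * k)
    return pow(c[0], k, p), pow(c[1], k, p)

def dec_small(c, bound=q):             # recover small a from g^a by search
    ga = c[1] * pow(c[0], q - sk, p) % p
    return next(a for a in range(bound) if pow(g, a, p) == ga)

coeffs = [3, 0, 5]                     # originator's secret f(x) = 3 + 5x^2
enc_f = [enc(a) for a in coeffs]       # this is what the mobile agent carries

x = 4                                  # untrusted host's own input
acc = enc(0)
for j, c in enumerate(enc_f):          # host computes Enc(f(x)) blindly
    acc = hom_add(acc, hom_scale(c, pow(x, j, q)))

assert dec_small(acc) == (3 + 5 * x * x) % q    # originator recovers f(x)
```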

Book ChapterDOI
Victor Shoup1, Rosario Gennaro1
31 May 1998
TL;DR: The contribution of this paper is to present two very practical threshold cryptosystems and to prove that they are secure against chosen ciphertext attack in the random hash function model.
Abstract: For the most compelling applications of threshold cryptosystems, security against chosen ciphertext attack seems to be a requirement. However, there appear to be no practical threshold cryptosystems in the literature that are provably chosen-ciphertext secure, even in the idealized random hash function model. The contribution of this paper is to present two very practical threshold cryptosystems, and to prove that they are secure against chosen ciphertext attack in the random hash function model.
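
The threshold-decryption skeleton underneath such schemes is easy to sketch (a toy illustration of threshold ElGamal with a trusted dealer; the paper's schemes add checkable validity proofs on ciphertexts and decryption shares to reach chosen-ciphertext security): the secret key is Shamir-shared, each server exposes only c1 raised to its share, and the shares are combined by Lagrange interpolation in the exponent.

```python
# Toy threshold ElGamal decryption skeleton (trusted dealer, no proofs).
import random

p, q, g = 467, 233, 4
t, n = 2, 5                            # any t+1 of the n servers can decrypt

x = random.randrange(1, q)             # dealer: Shamir-share the secret key x
poly = [x] + [random.randrange(q) for _ in range(t)]
xshare = {i: sum(c * i**j for j, c in enumerate(poly)) % q
          for i in range(1, n + 1)}
h = pow(g, x, p)                       # public key

def encrypt(m):
    r = random.randrange(1, q)
    return pow(g, r, p), m * pow(h, r, p) % p

def decryption_share(i, c1):
    return pow(c1, xshare[i], p)       # server i never reveals xshare[i]

def lagrange0(i, S):                   # Lagrange coefficient at 0, mod q
    num = den = 1
    for j in S:
        if j != i:
            num = num * -j % q
            den = den * (i - j) % q
    return num * pow(den, -1, q) % q

def combine(c2, shares, S):            # c1^x = prod_i share_i^{lambda_i}
    c1x = 1
    for i in S:
        c1x = c1x * pow(shares[i], lagrange0(i, S), p) % p
    return c2 * pow(c1x, q - 1, p) % p     # multiply by (c1^x)^-1

m = pow(g, 7, p)
c1, c2 = encrypt(m)
S = [1, 3, 5]
assert combine(c2, {i: decryption_share(i, c1) for i in S}, S) == m
```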

Journal Article
TL;DR: In this paper, the authors present a method for finding collisions in SHA-0 which is related to differential cryptanalysis of block ciphers, and obtain a theoretical attack on the compression function of SHA-0 with complexity 2^61, which is thus better than the birthday paradox attack.
Abstract: In this paper we present a method for finding collisions in SHA-0 which is related to differential cryptanalysis of block ciphers. Using this method, we obtain a theoretical attack on the compression function of SHA-0 with complexity 2^61, which is thus better than the birthday paradox attack. In the case of SHA-1, this method is unable to find collisions faster than the birthday paradox. This is strong evidence that the transition to version 1 indeed raised the level of security of SHA.

Book ChapterDOI
Tal Rabin1
23 Aug 1998
TL;DR: The signing key, in the solution, is shared at all times in additive form, which allows for simple signing and for a particularly efficient and straightforward refreshing process for proactivization.
Abstract: We present a solution to both the robust threshold RSA and proactive RSA problems. Our solutions are conceptually simple, and allow for an easy design of the system. The signing key, in our solution, is shared at all times in additive form, which allows for simple signing and for a particularly efficient and straightforward refreshing process for proactivization. The key size is (up to a very small constant) the size of the RSA modulus, and the protocol runs in constant time, even when faults occur, unlike previous protocols where either the size of the key has a linear blow-up (at best) in the number of players or the run time of the protocol is linear in the number of faults. The protocol is optimal in its resilience as it can tolerate a minority of faulty players. Furthermore, unlike previous solutions, the existence and availability of the key throughout the lifetime of the system, is guaranteed without probability of error.
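
Why additive sharing makes signing simple: if the servers hold d_1, ..., d_n with d = d_1 + ... + d_n, each outputs the partial signature m^{d_i} mod N, and the product of the partials is the ordinary RSA signature m^d. A toy sketch with a trusted dealer (our own illustration; the actual protocol adds the robustness checks and the proactive machinery):

```python
# Toy sketch of RSA signing under additive key sharing (trusted dealer).
import random
from math import prod

p_, q_ = 1009, 1013                    # toy RSA primes (real: 512+ bits each)
N, e = p_ * q_, 17
phi = (p_ - 1) * (q_ - 1)
d = pow(e, -1, phi)                    # signing exponent

n = 4                                  # dealer: d = d_1 + ... + d_n (mod phi)
shares = [random.randrange(phi) for _ in range(n - 1)]
shares.append((d - sum(shares)) % phi)

def partial_sign(m, d_i):              # each server computes m^{d_i} locally
    return pow(m, d_i, N)

m = 4242
sig = prod(partial_sign(m, d_i) for d_i in shares) % N
assert pow(sig, e, N) == m             # verifies as an ordinary RSA signature

# Proactive refresh is just as simple: the servers jointly add random
# offsets that sum to zero, yielding a fresh sharing of the same d.
```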

Book ChapterDOI
16 Sep 1998
TL;DR: The notion of side-channel cryptanalysis, cryptanalysis using implementation data, is introduced; side-channel attacks against three product ciphers are demonstrated and then generalized to other cryptosystems.
Abstract: Building on the work of Kocher [Koc96], we introduce the notion of side-channel cryptanalysis: cryptanalysis using implementation data. We discuss the notion of side-channel attacks and the vulnerabilities they introduce, demonstrate side-channel attacks against three product ciphers (a timing attack against IDEA, a processor-flag attack against RC5, and a Hamming-weight attack against DES), and then generalize our research to other cryptosystems.
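
A simulated illustration of the Hamming-weight channel (a hypothetical leakage model of our own: the device leaks HW(p XOR k) for each plaintext byte p and key byte k; real attacks must extract this from noisy power or timing data):

```python
# Simulated Hamming-weight side channel against a single key byte.
import random

HW = [bin(x).count("1") for x in range(256)]

def leak(p, k):
    return HW[p ^ k]               # what the side channel reveals

secret_k = 0xA7
traces = [(p, leak(p, secret_k)) for p in random.sample(range(256), 32)]

# Attacker: keep only the key guesses consistent with every observed leak;
# with this many traces the surviving guess is unique w.h.p.
candidates = [k for k in range(256)
              if all(HW[p ^ k] == w for p, w in traces)]
assert candidates == [secret_k]
```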

Journal ArticleDOI
TL;DR: The paper aims to develop a specific theory appropriate to the analysis of authentication protocols, built on top of the general CSP semantic framework.
Abstract: This paper presents a general approach for analysis and verification of authentication properties using the theory of Communicating Sequential Processes (CSP). The paper aims to develop a specific theory appropriate to the analysis of authentication protocols, built on top of the general CSP semantic framework. This approach aims to combine the ability to express such protocols in a natural and precise way with the ability to reason formally about the properties they exhibit. The theory is illustrated by an examination of the Needham-Schroeder (1978) public key protocol. The protocol is first examined with respect to a single run and then more generally with respect to multiple concurrent runs.

Journal ArticleDOI
TL;DR: This analysis shows that when compared with signature-then-encryption on elliptic curves, signcryption on the curves represents a 58% saving in computational cost and a 40% saving in communication overhead.

Book ChapterDOI
01 Jan 1998
TL;DR: This paper presents a mechanism based on execution tracing and cryptography that allows one to detect attacks against code, state, and execution flow of mobile software components.
Abstract: Mobile code systems are technologies that allow applications to move their code, and possibly the corresponding state, among the nodes of a wide-area network. Code mobility is a flexible and powerful mechanism that can be exploited to build distributed applications on an Internet scale. At the same time, the ability to move code to and from remote hosts introduces serious security issues. These issues include authentication of the parties involved and protection of the hosts from malicious code. However, the most difficult task is to protect mobile code against attacks coming from hosts. This paper presents a mechanism based on execution tracing and cryptography that allows one to detect attacks against code, state, and execution flow of mobile software components.

Book ChapterDOI
05 Feb 1998
TL;DR: Zheng's scheme is modified so that the recipient's private key is no longer needed in signature verification; the computational cost of the modified scheme is higher than that of Zheng's scheme but lower than that of the signature-then-encryption approach.
Abstract: Signcryption, first proposed by Zheng [4, 5], is a cryptographic primitive which combines both the functions of digital signature and public key encryption in a logical single step, and with a computational cost significantly lower than that needed by the traditional signature-then-encryption approach. In Zheng's scheme, the signature verification can be done either by the recipient directly (using his private key) or by engaging in a zero-knowledge interactive protocol with a third party, without disclosing the recipient's private key. In this note, we modify Zheng's scheme so that the recipient's private key is no longer needed in signature verification. The computational cost of the modified scheme is higher than that of Zheng's scheme but lower than that of the signature-then-encryption approach.

Book ChapterDOI
23 Aug 1998
TL;DR: In this article, a 3-round zero-knowledge protocol for any NP language is proposed, which is based on a non-black-box simulation technique, based on the Diffie-Hellman problem.
Abstract: In this paper, we construct a 3-round zero-knowledge protocol for any NP language. Goldreich and Krawczyk proved that a 3-round black-box simulation zero-knowledge protocol exists only for BPP languages. However, there is no contradiction here. That is, our proposed protocol achieves a weaker notion of zero-knowledge: auxiliary-input non-uniform zero-knowledge. Since this notion has not been investigated in the literature, we classify several zero-knowledge notions including it and discuss the relationships among them. Our main contribution is to provide a non-black-box simulation technique. It is based on a novel computational assumption related to the Diffie-Hellman problem. Although this assumption is strong and non-standard, its non-standard nature seems essential for our simulation technique.

Proceedings ArticleDOI
23 May 1998
TL;DR: The disclosed method can be combined with proactive function sharing techniques to establish the first efficient, optimal-resilience, robust and proactively-secure RSA-based distributed trust services where the key is never entrusted to a single entity.
Abstract: The invention provides for robust efficient distributed generation of RSA keys. An efficient protocol is one which is independent of the primality test “circuit size”, while a robust protocol allows correct completion even in the presence of a minority of arbitrarily misbehaving malicious parties. The disclosed protocol is secure against any minority of malicious parties (which is optimal). The disclosed method is useful in establishing sensitive distributed cryptographic function sharing services (certification authorities, signature schemes with distributed trust, and key escrow authorities), as well as other applications besides RSA (namely: composite ElGamal, identification schemes, simultaneous bit exchange, etc.). The disclosed method can be combined with proactive function sharing techniques to establish the first efficient, optimal-resilience, robust and proactively-secure RSA-based distributed trust services where the key is never entrusted to a single entity (i.e., distributed trust totally “from scratch”). The disclosed method involves new efficient “robustness assurance techniques” which guarantee “correct computations” by mutually distrusting parties with malicious minority.

Book ChapterDOI
23 Feb 1998
TL;DR: This paper constructs a practical group blind signature scheme, an extension of Camenisch and Stadler's group signature scheme that adds the blindness property, and shows how to use it to construct an electronic cash system in which multiple banks can securely distribute anonymous and untraceable e-cash.
Abstract: In this paper we construct a practical group blind signature scheme. Our scheme combines the already existing notions of blind signatures and group signatures. It is an extension of Camenisch and Stadler's Group Signature Scheme [5] that adds the blindness property. We show how to use our group blind signatures to construct an electronic cash system in which multiple banks can securely distribute anonymous and untraceable e-cash. Moreover, the identity of the e-cash issuing bank is concealed, which is conceptually novel. The space, time, and communication complexities of the relevant parameters and operations are independent of the group size.

Book ChapterDOI
31 May 1998
TL;DR: Lower bounds on the sizes of keys and ciphertexts are derived, and they are tight because an optimum one-time use scheme is presented; a multiple-use scheme which approximately meets the bounds is proven to be secure as well as much more efficient than the schemes of Chor, Fiat and Naor.
Abstract: A traceability scheme is a broadcast encryption scheme such that a data supplier T can trace malicious authorized users (traitors) who gave a decryption key to an unauthorized user (pirate). This paper first derives lower bounds on the sizes of keys and ciphertexts. These bounds are all tight because an optimum one-time use scheme is also presented. We then propose a multiple-use scheme which approximately meets our bounds. This scheme is proven to be secure as well as much more efficient than the schemes by Chor, Fiat and Naor. Finally, practical types of asymmetric schemes with arbiter are discussed in which T cannot frame any authorized user as a traitor.