Showing papers on "Cryptography published in 1993"


Proceedings ArticleDOI
Mihir Bellare, Phillip Rogaway
01 Dec 1993
TL;DR: It is argued that the random oracle model—where all parties have access to a public random oracle—provides a bridge between cryptographic theory and cryptographic practice, and yields protocols much more efficient than standard ones while retaining many of the advantages of provable security.
Abstract: We argue that the random oracle model—where all parties have access to a public random oracle—provides a bridge between cryptographic theory and cryptographic practice. In the paradigm we suggest, a practical protocol P is produced by first devising and proving correct a protocol PR for the random oracle model, and then replacing oracle accesses by the computation of an “appropriately chosen” function h. This paradigm yields protocols much more efficient than standard ones while retaining many of the advantages of provable security. We illustrate these gains for problems including encryption, signatures, and zero-knowledge proofs.
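
The paradigm is easy to picture in code. Below is a minimal sketch, not one of the paper's constructions: a toy commitment scheme in which oracle access is replaced by an "appropriately chosen" function, here SHA-256; the commit/verify interface is my own framing.

```python
# Minimal sketch of the random-oracle paradigm (not a construction from the
# paper): a toy commitment scheme whose "oracle" is instantiated by SHA-256.
import hashlib
import secrets

def H(data: bytes) -> bytes:
    """Stand-in for the public random oracle."""
    return hashlib.sha256(data).digest()

def commit(message: bytes):
    """Commit to a message; returns (commitment, opening randomness)."""
    r = secrets.token_bytes(32)
    return H(r + message), r

def verify(commitment: bytes, message: bytes, r: bytes) -> bool:
    return H(r + message) == commitment

c, r = commit(b"attack at dawn")
assert verify(c, b"attack at dawn", r)
assert not verify(c, b"attack at dusk", r)
```

The security argument is carried out as if H were a truly random function; the heuristic step in the paradigm is exactly the substitution of the concrete hash.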

5,313 citations


Book
10 Nov 1993
TL;DR: A comprehensive survey of cryptographic protocols, techniques, and algorithms, together with their real-world implementation, the surrounding politics, and source code.
Abstract: CRYPTOGRAPHIC PROTOCOLS. Protocol Building Blocks. Basic Protocols. Intermediate Protocols. Advanced Protocols. Esoteric Protocols. CRYPTOGRAPHIC TECHNIQUES. Key Length. Key Management. Algorithm Types and Modes. Using Algorithms. CRYPTOGRAPHIC ALGORITHMS. Data Encryption Standard (DES). Other Block Ciphers. Other Stream Ciphers and Real Random-Sequence Generators. Public-Key Algorithms. Special Algorithms for Protocols. THE REAL WORLD. Example Implementations. Politics. SOURCE CODE. Source Code. References.

3,432 citations


Journal ArticleDOI
TL;DR: As the first part of a study of problems involving common randomness at distant locations, information-theoretic models of secret sharing (generating a common random key at two terminals, without letting an eavesdropper obtain information about this key) are considered.
Abstract: As the first part of a study of problems involving common randomness at distant locations, information-theoretic models of secret sharing (generating a common random key at two terminals, without letting an eavesdropper obtain information about this key) are considered. The concept of key-capacity is defined. Single-letter formulas of key-capacity are obtained for several models, and bounds to key-capacity are derived for other models.

1,471 citations


Book ChapterDOI
22 Aug 1993
TL;DR: This paper proposes a new identification scheme, based on error-correcting codes, which is zero-knowledge and is of practical value, and describes several variants, including one which has an identity based character.
Abstract: Zero-knowledge proofs were introduced in 1985, in a paper by Goldwasser, Micali and Rackoff ([6]). Their practical significance was soon demonstrated in the work of Fiat and Shamir ([4]), who turned zero-knowledge proofs of quadratic residuosity into efficient means of establishing user identities. Still, as is almost always the case in public-key cryptography, the Fiat-Shamir scheme relied on arithmetic operations on large numbers. In 1989, there were two attempts to build identification protocols that only use simple operations (see [11, 10]). One appeared in the EUROCRYPT proceedings and relies on the intractability of some coding problems, the other was presented at the CRYPTO rump session and depends on the so-called Permuted Kernel problem (PKP). Unfortunately, the first of the schemes was not really practical. In the present paper, we propose a new identification scheme, based on error-correcting codes, which is zero-knowledge and is of practical value. Furthermore, we describe several variants, including one which has an identity based character. The security of our scheme depends on the hardness of decoding a word of given syndrome w.r.t. some binary linear error-correcting code.
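
A minimal sketch of the underlying hard problem, syndrome decoding, may help; it is not the identification protocol itself, and the toy parameters below are far too small for security.

```python
# Sketch of the underlying hard problem (syndrome decoding), not the
# three-pass identification protocol. Toy parameters for illustration only.
import random

n, k, w = 32, 16, 4          # code length, dimension, error weight (toy values)
random.seed(1)

# Random binary parity-check matrix H, stored as rows of bits.
H = [[random.randrange(2) for _ in range(n)] for _ in range(n - k)]

# The prover's secret: a random word e of low Hamming weight w.
e = [0] * n
for pos in random.sample(range(n), w):
    e[pos] = 1

# The public syndrome s = H * e^T over GF(2).
s = [sum(H[i][j] * e[j] for j in range(n)) % 2 for i in range(n - k)]

# Security rests on the hardness of recovering a weight-w word from (H, s);
# checking a candidate, by contrast, is easy:
def matches(candidate):
    return all(sum(H[i][j] * candidate[j] for j in range(n)) % 2 == s[i]
               for i in range(n - k))

assert matches(e)
```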

385 citations


Proceedings ArticleDOI
01 Dec 1993
TL;DR: It turns out that the threat model commonly used by cryptosystem designers was wrong: most frauds were not caused by cryptanalysis or other technical attacks, but by implementation errors and management failures, suggesting that a paradigm shift is overdue in computer security.
Abstract: Designers of cryptographic systems are at a disadvantage to most other engineers, in that information on how their systems fail is hard to get: their major users have traditionally been government agencies, which are very secretive about their mistakes. In this article, we present the results of a survey of the failure modes of retail banking systems, which constitute the next largest application of cryptology. It turns out that the threat model commonly used by cryptosystem designers was wrong: most frauds were not caused by cryptanalysis or other technical attacks, but by implementation errors and management failures. This suggests that a paradigm shift is overdue in computer security; we look at some of the alternatives, and see some signs that this shift may be getting under way.

307 citations


Journal ArticleDOI
TL;DR: The authors describe a VLSI Galois field processor and how it can be applied to the implementation of elliptic curve groups and demonstrate the feasibility of constructing very fast, and very secure, public key systems with a relatively simple device.
Abstract: The authors describe a VLSI Galois field processor and how it can be applied to the implementation of elliptic curve groups. They demonstrate the feasibility of constructing very fast, and very secure, public key systems with a relatively simple device, and the possibility of putting such a system on a smart card. The registers necessary to implement the elliptic curve system will require less than 1 mm² (or less than 4%) of the area available on the card.
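
The abstract does not specify the field or basis the processor uses; as a rough illustration of the shift-and-reduce arithmetic such Galois field hardware performs, here is a toy polynomial-basis multiplication in GF(2^8) with reduction polynomial x^8 + x^4 + x^3 + x + 1 (the field and polynomial are illustrative, not the paper's).

```python
# Toy polynomial-basis multiplication in GF(2^8). The processor in the paper
# targets a much larger field and a different basis; this only shows the
# shift-and-reduce structure that Galois field multipliers implement.
def gf_mul(a: int, b: int, poly: int = 0x11B, width: int = 8) -> int:
    result = 0
    for _ in range(width):
        if b & 1:
            result ^= a          # add (XOR) the current multiple of a
        b >>= 1
        a <<= 1                  # multiply a by x
        if a >> width:           # reduce modulo the field polynomial
            a ^= poly
    return result

# Sanity checks: 0x53 and 0xCA are inverses in this field, and x * x^7 = 0x1B.
assert gf_mul(0x53, 0xCA) == 0x01
assert gf_mul(0x02, 0x80) == 0x1B
```

Elliptic curve point addition over GF(2^m) reduces to a handful of such field multiplications plus an inversion, which is why a fast field multiplier dominates the design.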

261 citations


Proceedings ArticleDOI
29 Jun 1993
TL;DR: The authors detail and analyze the critical techniques that may be combined in the design of fast hardware for RSA cryptography: Chinese remainders, star chains, Hensel's odd division, carry-save representation, quotient pipelining, and asynchronous carry completion adders.
Abstract: The authors detail and analyze the critical techniques that may be combined in the design of fast hardware for RSA cryptography: Chinese remainders, star chains, Hensel's odd division (also known as Montgomery modular reduction), carry-save representation, quotient pipelining, and asynchronous carry completion adders. A fully operational PAM (programmable active memory) implementation of RSA that combines all of the techniques presented here delivers an RSA secret decryption rate over 600 kb/s for 512-bit keys, and 165 kb/s for 1-kbit keys. This is an order of magnitude faster than any previously reported running implementation. While the implementation makes full use of the PAM's reconfigurability, it is possible to derive from the (multiple PAM designs) implementation a (single) gate-array specification with estimated size under 100 K gates and speed over 1 Mb/s for RSA 512-bit keys. Matching gains in software performance are also analyzed.
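
Of the listed techniques, the Chinese remainder speed-up is the easiest to sketch in software. The toy example below uses tiny primes and Garner's recombination; the hardware described additionally layers on Montgomery (Hensel) reduction, carry-save arithmetic, and quotient pipelining.

```python
# Sketch of RSA decryption with the Chinese Remainder Theorem, one of the
# speed-up techniques the paper combines. Tiny textbook primes; real keys
# are 512 bits and up.
p, q = 61, 53
n = p * q
e = 17
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

m = 65
c = pow(m, e, n)                    # encryption

# Plain decryption: one full-size exponentiation mod n.
assert pow(c, d, n) == m

# CRT decryption: two half-size exponentiations plus Garner's recombination,
# roughly a 4x speed-up for balanced primes.
dp, dq = d % (p - 1), d % (q - 1)
q_inv = pow(q, -1, p)
m_p = pow(c % p, dp, p)
m_q = pow(c % q, dq, q)
h = (q_inv * (m_p - m_q)) % p
assert (m_q + h * q) % n == m
```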

245 citations



Journal ArticleDOI
TL;DR: Public-key/private-key hybrid key agreements and authentication protocols which maintain privacy of conversation and location information, and deter usage fraud, are presented, and the tradeoffs are discussed.
Abstract: Public-key/private-key hybrid key agreements and authentication protocols which maintain privacy of conversation and location information, and deter usage fraud, are presented. These protocols are optimized for low complexity in the portable unit and network infrastructure. The basic cryptographic techniques are described, and some complexity information obtained from laboratory experiments and from other sources is presented. The three public-key protocols described have differing levels of security and complexity, and the tradeoffs are discussed. Because of the complexity concerns mentioned above, the public-key protocols are compared to a representative private-key approach in the areas of both security and computational complexity.

206 citations


Book
19 Jan 1993

199 citations


Journal Article
TL;DR: Highly nonlinear permutations are important in the design of block ciphers, hash functions and stream ciphers; the substitution boxes of DES are small enough to be found by testing randomly chosen functions against the required design criteria, but substitution transformations of higher dimension call for analytic construction methods.
Abstract: Highly nonlinear permutations play an important role in the design of cryptographic transformations such as block ciphers, hash functions and stream ciphers. The substitution boxes of DES are relatively small in dimension and they can be generated by testing randomly chosen functions for required design criteria. Security may be increased by the use of substitution transformations of higher dimensions. But when the dimensions grow larger, analytic construction methods become necessary.
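
The design criterion at stake, nonlinearity, can be computed exhaustively for small dimensions, which is why random search suffices there. Below is a sketch using the Walsh transform on a standard 4-bit S-box (the PRESENT S-box, chosen purely as a convenient example; DES S-boxes map 6 bits to 4 and are handled analogously).

```python
# Sketch: measuring the nonlinearity of a small S-box via the Walsh transform.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]   # PRESENT S-box, as an example
n = 4                                             # input/output width in bits

def dot(a, b):
    """GF(2) inner product: parity of the bitwise AND."""
    return bin(a & b).count("1") & 1

def nonlinearity(sbox, n):
    worst = 0
    for b in range(1, 1 << n):              # every nonzero output mask
        for a in range(1 << n):             # every input mask
            walsh = sum((-1) ** (dot(b, sbox[x]) ^ dot(a, x))
                        for x in range(1 << n))
            worst = max(worst, abs(walsh))
    return (1 << (n - 1)) - worst // 2

print(nonlinearity(SBOX, n))                # 4, the optimum for 4-bit S-boxes
```

The double loop covers roughly 2^(n+m) mask pairs, each summed over all 2^n inputs, which is exactly the kind of exhaustive testing that becomes infeasible as the dimensions grow and analytic constructions take over.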

Journal ArticleDOI
TL;DR: A methodology for systematically building and testing the security of a family of cryptographic two-way authentication protocols that are as simple as possible yet resistant to a wide class of attacks, efficient, easy to implement and use, and amenable to many different networking environments is described.
Abstract: Most existing designs for two-way cryptographic authentication protocols suffer from one or more limitations. Among other things, they require synchronization of local clocks, they are subject to export restrictions because of the way they use cryptographic functions, and they are not amenable to use in lower layers of network protocols because of the size and complexity of messages they use. Designing suitable cryptographic protocols that cater to large and dynamic network communities but do not suffer from these problems presents substantial problems. It is shown how a few simple protocols, including one proposed by ISO, can easily be broken, and properties that authentication protocols should exhibit are derived. A methodology for systematically building and testing the security of a family of cryptographic two-way authentication protocols that are as simple as possible yet resistant to a wide class of attacks, efficient, easy to implement and use, and amenable to many different networking environments is described. Examples of protocols of that family that present various advantages in specific distributed system scenarios are discussed.
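
In the spirit of the protocols discussed (not the paper's exact message formats), a minimal nonce-based two-way exchange over a shared key can be sketched as follows; HMAC-SHA-256 stands in for whatever keyed cryptographic function an implementation would use.

```python
# Sketch of nonce-based two-way authentication with a shared key K.
# Message formats and field ordering are illustrative, not from the paper.
import hashlib, hmac, secrets

K = secrets.token_bytes(32)               # pre-shared key
A, B = b"alice", b"bob"

def mac(key, *parts):
    return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

# Message 1, A -> B: A, Na
Na = secrets.token_bytes(16)

# Message 2, B -> A: Nb, MAC_K(Na, Nb, B)   (B proves knowledge of K to A)
Nb = secrets.token_bytes(16)
tag_b = mac(K, Na, Nb, B)
assert hmac.compare_digest(tag_b, mac(K, Na, Nb, B))   # A's verification

# Message 3, A -> B: MAC_K(Nb, Na, A)       (A proves knowledge of K to B)
tag_a = mac(K, Nb, Na, A)
assert hmac.compare_digest(tag_a, mac(K, Nb, Na, A))   # B's verification
```

The use of fresh nonces rather than synchronized clocks, and the small fixed-size messages, reflect exactly the properties the paper argues such protocols should have.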


Journal ArticleDOI
TL;DR: It is shown that uniform variants of the two definitions of security, presented in the pioneering work of Goldwasser and Micali, are in fact equivalent, and how to construct such zero-knowledge proof systems for every language in NP, using only a uniform complexity assumption.
Abstract: We provide a treatment of encryption and zero-knowledge in terms of uniform complexity measures. This treatment is appropriate for cryptographic settings modeled by probabilistic polynomial-time machines. Our uniform treatment allows the construction of secure encryption schemes and zero-knowledge proof systems (for all NP) using only uniform complexity assumptions. We show that uniform variants of the two definitions of security, presented in the pioneering work of Goldwasser and Micali, are in fact equivalent. Such a result was known before only for nonuniform formalization. Nonuniformity is implicit in all previous treatments of zero-knowledge in the sense that a zero-knowledge proof is required to "leak no knowledge" on all instances. For practical purposes, it suffices to require that it is infeasible to find instances on which a zero-knowledge proof "leaks knowledge." We show how to construct such zero-knowledge proof systems for every language in NP, using only a uniform complexity assumption. Properties of uniformly zero-knowledge proofs are investigated and their utility is demonstrated.

Patent
Edward Andrew Zuk
30 Mar 1993
TL;DR: In this paper, a method for loading secret data, such as an application key, on a smart card (6) was proposed, which involves storing a random key on the card, encrypting the random key on the basis of a public key, and providing the encrypted random key to a central processing station.
Abstract: A method for loading secret data, such as an application key, on a smart card (6), which involves storing a random key on the card (6), encrypting the random key on the basis of a public key, and providing the encrypted random key to a central processing station (4). The encrypted random key is decrypted at the central station on the basis of a secret key, and the station (4) encrypts data on the basis of the random key and transmits it to the smart card (6). The smart card decrypts the encrypted data on the basis of the random key. The random key can be generated internally and stored on read protected memory (23) of the card (6). The public key encrypting and secret key decrypting steps may be based on the RSA algorithm, using a small encryption exponent.
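
A rough sketch of that flow follows, with toy RSA parameters and a hash-based stream cipher standing in for the unspecified symmetric algorithm; all sizes and helper names are illustrative, not taken from the patent.

```python
# Sketch of the key-loading flow: card-generated random key, RSA transport
# with a small exponent, then symmetric delivery of the application key.
import hashlib, secrets

# Central station's RSA key pair (toy primes; e = 3 is the small exponent).
p, q = 999983, 1000037
n, e = p * q, 3
d = pow(e, -1, (p - 1) * (q - 1))

def keystream(key: bytes, length: int) -> bytes:
    """Hash-counter keystream, standing in for the patent's symmetric cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# 1. Card generates a random key R and sends R^e mod n to the station.
R = secrets.randbelow(n)
card_to_station = pow(R, e, n)

# 2. Station recovers R with its secret exponent d.
assert pow(card_to_station, d, n) == R

# 3. Station encrypts the application key under R and returns it to the card.
app_key = secrets.token_bytes(16)
r_bytes = R.to_bytes(8, "big")
wire = xor(app_key, keystream(r_bytes, len(app_key)))

# 4. Card decrypts with its stored copy of R.
assert xor(wire, keystream(r_bytes, len(wire))) == app_key
```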

Patent
28 May 1993
TL;DR: In this article, the authors propose a split key scheme for the secure communication of a message from a transmitting user to a receiving user, in which each user's cryptographic engine generates a key component consisting of a pseudorandom sequence of bits with an appended error detection field.
Abstract: A system for the secure communication of a message from a transmitting user to a receiving user using a split key scheme. Each user generates a key component using a cryptographic engine. The key component is a pseudorandom sequence of bits with an appended error detection field which is mathematically calculated based on the pseudorandom sequence. This key component is then sent out on a communications channel from the transmitting user to the receiving user. The receiving user also sends its key component to the transmitting user. Each location performs a mathematical check on the key component received from the other location. If the key component checks pass at both locations, the transmit key component and the receive key component, including the error detection fields, are combined at both locations, forming identical complete keys at both locations. The identical complete keys are then used to initiate the cryptographic engines at both locations for subsequent encryption and decryption of messages between the two locations.
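
A compact sketch of the idea, with CRC-32 standing in for the error detection field and XOR as one plausible way to combine the two components (the abstract fixes neither choice):

```python
# Sketch of the split-key scheme: each side contributes a random component
# with an appended error-detection field; after exchange and checking, both
# sides combine the components into the same complete key.
import secrets, zlib

def make_component() -> bytes:
    rand = secrets.token_bytes(16)
    crc = zlib.crc32(rand).to_bytes(4, "big")   # appended error-detection field
    return rand + crc

def check(component: bytes) -> bool:
    rand, crc = component[:-4], component[-4:]
    return zlib.crc32(rand).to_bytes(4, "big") == crc

def combine(local: bytes, remote: bytes) -> bytes:
    # Combine both components, error-detection fields included.
    return bytes(a ^ b for a, b in zip(local, remote))

tx = make_component()                  # transmitting user's key component
rx = make_component()                  # receiving user's key component
assert check(tx) and check(rx)         # each side checks what it received
assert combine(tx, rx) == combine(rx, tx)   # identical complete keys
```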

Book ChapterDOI
22 Aug 1993
TL;DR: Novel approaches to secret-key agreement are described, not based on public-key cryptography or number theory, which are extremely efficient when implemented in software or make use of very simple, inexpensive hardware.
Abstract: In this paper, we describe novel approaches to secret-key agreement. Our schemes are not based on public-key cryptography nor number theory. They are extremely efficient when implemented in software or make use of very simple, inexpensive hardware. Our technology is particularly well-suited for use in cryptographic scenarios like those of the Clipper Chip, the recent encryption proposal put forward by the Clinton Administration.

Proceedings ArticleDOI
01 Dec 1993
TL;DR: Extensions to BAN-like logics are proposed which facilitate, for the first time, examination of public-key based authenticated key establishment protocols in which both parties contribute to the derived key (i.e. key agreement protocols).
Abstract: The authentication logic of Burrows, Abadi and Needham (BAN) provided an important step towards rigorous analysis of authentication protocols, and has motivated several subsequent refinements. We propose extensions to BAN-like logics which facilitate, for the first time, examination of public-key based authenticated key establishment protocols in which both parties contribute to the derived key (i.e. key agreement protocols). Attention is focused on six distinct generic goals for authenticated key establishment protocols. The extended logic is used to analyze three Diffie-Hellman based key agreement protocols, facilitating direct comparison of these protocols with respect to formal goals reached and formal assumptions required.
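
The common core of the three analyzed protocols is a Diffie-Hellman exchange in which both parties contribute to the derived key. A bare, unauthenticated sketch with a toy 64-bit modulus is given below; it is exactly the added authentication of the exchanged values that the extended logic is designed to reason about.

```python
# Sketch of unauthenticated Diffie-Hellman key agreement (toy 64-bit prime;
# real use requires a large, carefully chosen group).
import hashlib, secrets

p = 0xFFFFFFFFFFFFFFC5        # 2**64 - 59, a prime; toy-sized modulus
g = 2

a = secrets.randbelow(p - 2) + 1          # one party's secret contribution
b = secrets.randbelow(p - 2) + 1          # the other party's secret contribution

A = pow(g, a, p)                          # sent in one direction
B = pow(g, b, p)                          # sent in the other

k_one = pow(B, a, p)
k_two = pow(A, b, p)
assert k_one == k_two                     # both parties influenced the key

session_key = hashlib.sha256(k_one.to_bytes(8, "big")).digest()
```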

Book ChapterDOI
01 Jan 1993
TL;DR: Here some previous efforts to engineer cryptosystems based on dynamical systems are reviewed, leading up to a detailed proposal for a cellular automaton cryptosystem, which would enable encryption of messages to preserve their secrecy.
Abstract: Dynamical systems are often described as “unpredictable” or “complex” as aspects of their behavior may bear a cryptic relationship with the simple evolution laws which define them. Some theorists work to quantify this complexity in various ways. Others try to turn the cryptic nature of dynamical systems to a practical end: encryption of messages to preserve their secrecy. Here some previous efforts to engineer cryptosystems based on dynamical systems are reviewed, leading up to a detailed proposal for a cellular automaton cryptosystem.
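
One of the earlier proposals in this line, Wolfram's rule-30 generator, is easy to sketch as a stream cipher: the secret key seeds the automaton, the centre cell is tapped as keystream, and the plaintext is XORed against it. The chapter's own cellular automaton cryptosystem differs in detail; the sketch below only conveys the flavour of the approach.

```python
# Sketch of a cellular-automaton stream cipher in the style of Wolfram's
# rule-30 generator (illustrative; not the chapter's proposal).
def rule30_keystream(seed_bits, nbits):
    cells = list(seed_bits)               # the secret key is the initial state
    n = len(cells)
    out = []
    for _ in range(nbits):
        out.append(cells[n // 2])         # tap the centre cell
        cells = [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
                 for i in range(n)]       # rule 30: left XOR (centre OR right)
    return out

def xor_bits(xs, ys):
    return [x ^ y for x, y in zip(xs, ys)]

key = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 1]
plaintext = [int(b) for b in "0110100001101001"]          # "hi" in ASCII bits
ciphertext = xor_bits(plaintext, rule30_keystream(key, len(plaintext)))
recovered = xor_bits(ciphertext, rule30_keystream(key, len(plaintext)))
assert recovered == plaintext
```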

Journal ArticleDOI
TL;DR: It is shown that GAs can greatly facilitate cryptanalysis by efficiently searching large keyspaces, and their use is demonstrated with GENALYST, an order-based GA for breaking a classic cryptographic system.
Abstract: We consider the use of genetic algorithms (GAs) as powerful tools in the breaking of cryptographic systems. We show that GAs can greatly facilitate cryptanalysis by efficiently searching large keyspaces, and demonstrate their use with GENALYST, an order-based GA for breaking a classic cryptographic system.
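
A simplified stand-in for such an attack (not GENALYST itself) is sketched below: an order-based GA whose chromosomes are permutations of the alphabet, aimed at a monoalphabetic substitution cipher with a single-letter-frequency fitness. Practical attacks would score bigrams or trigrams instead, but the GA machinery (order-preserving crossover, swap mutation, truncation selection) is the same.

```python
# Sketch of an order-based GA for a monoalphabetic substitution cipher.
# Assumes lowercase alphabetic ciphertext; fitness uses single-letter
# frequencies only, which is deliberately simplistic.
import random, string

ENGLISH = {'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7,
           's': 6.3, 'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8,
           'u': 2.8, 'm': 2.4, 'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0,
           'p': 1.9, 'b': 1.5, 'v': 1.0, 'k': 0.8, 'j': 0.2, 'x': 0.2,
           'q': 0.1, 'z': 0.1}
ALPHA = string.ascii_lowercase

def decrypt(ciphertext, key):
    # key[i] is the plaintext letter assigned to ciphertext letter ALPHA[i]
    return ciphertext.translate(str.maketrans(ALPHA, "".join(key)))

def fitness(ciphertext, key):
    text = decrypt(ciphertext, key)
    n = max(len(text), 1)
    # negative chi-square distance to English single-letter frequencies
    return -sum((100 * text.count(c) / n - ENGLISH[c]) ** 2 for c in ALPHA)

def mutate(key):
    i, j = random.sample(range(26), 2)
    key = key[:]
    key[i], key[j] = key[j], key[i]       # swap two assignments
    return key

def crossover(p1, p2):
    i, j = sorted(random.sample(range(26), 2))
    segment = p1[i:j]
    rest = [c for c in p2 if c not in segment]
    return rest[:i] + segment + rest[i:]  # order-preserving recombination

def evolve(ciphertext, generations=200, pop_size=60):
    pop = [random.sample(ALPHA, 26) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda k: fitness(ciphertext, k), reverse=True)
        parents = pop[:pop_size // 2]                  # truncation selection
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=lambda k: fitness(ciphertext, k))
```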

Journal ArticleDOI
TL;DR: This work approaches the problem of key management in a modular and hierarchical manner and discusses key management security requirements, deals with generic key management concepts and design criteria, and describes key management services and building blocks, as well as key management facilities, key management units, and their interrelationship.
Abstract: Security services based on cryptographic mechanisms assume keys to be distributed prior to secure communications. The secure management of these keys is one of the most critical elements when integrating cryptographic functions into a system, since any security concept will be ineffective if the key management is weak. This work approaches the problem of key management in a modular and hierarchical manner. It discusses key management security requirements, deals with generic key management concepts and design criteria, describes key management services and building blocks, as well as key management facilities, key management units, and their interrelationship.

Journal ArticleDOI
TL;DR: A new public-key protocol has been developed for key agreement and authentication that provides security comparable to the well known RSA public-key protocol, but with two orders of magnitude less on-line computation required on one side of the protocol.
Abstract: A new public-key protocol has been developed for key agreement and authentication. This protocol provides security comparable to the well known RSA public-key protocol, but with two orders of magnitude less on-line computation required on one side of the protocol. This advance can make public-key security economical for applications where one side of the interaction is a low-cost customer device, e.g. portable telephones, home banking terminals, or ‘smart cards’.
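
The paper's protocol is not reproduced here, but the kind of computational asymmetry it exploits can be illustrated with a Rabin-style scheme, in which the low-cost side's public-key operation is a single modular squaring while the expensive square-root extraction lands on the well-resourced side. This is a generic illustration with toy primes, not the protocol proposed in the paper.

```python
# Illustration of asymmetric computational cost: Rabin-style encryption is one
# modular squaring; decryption does the heavy CRT square-root work.
import secrets

p, q = 10007, 10079              # toy primes, both congruent to 3 mod 4
n = p * q

# Low-cost device: encrypting a value costs one modular multiplication.
m = secrets.randbelow(n)
c = (m * m) % n

# Well-resourced side: recover the four square roots of c via CRT.
r_p = pow(c, (p + 1) // 4, p)
r_q = pow(c, (q + 1) // 4, q)
roots = set()
for sp in (r_p, p - r_p):
    for sq in (r_q, q - r_q):
        x = (sp * q * pow(q, -1, p) + sq * p * pow(p, -1, q)) % n
        roots.add(x)
assert m in roots     # in practice redundancy in m singles out the right root
```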

Proceedings Article
01 Jan 1993
TL;DR: CryptoLib is a very portable and efficient library of routines necessary for the aforementioned cryptosystems, written entirely in C and running under UNIX.
Abstract: With the capacity of communications channels increasing at the current rates and with the kinds of electronic services becoming more varied and multidimensional, there is also coming a greater tendency to store and forward sensitive, semi-private or very private information. With this tendency there is an increasing interest in protecting the privacy and security of communications channels and sensitive information. Many protocols have been devised which provide varying levels of security and which have overhead associated with them roughly proportional to the level of that security. These protocols make use of cryptosystems which may be divided roughly into two groups: private key and public key. It is generally assumed that efficient implementations of these systems are possible only in hardware or at least coded in very nonportable assembly language for particular processors. CryptoLib is a very portable and efficient library of routines necessary for the aforementioned cryptosystems. It is written entirely in C and runs under UNIX.

Book ChapterDOI
05 Jul 1993
TL;DR: It is shown that off-line coin schemes can be implemented securely and efficiently, where security is proven based on the hardness of the discrete log function and a pre-processing stage, and where efficiency is measured in a new sense: the communication complexity is independent of the computational complexity of the participants.
Abstract: No off-line electronic coin scheme has yet been proposed which is both provably secure with respect to natural cryptographic assumptions and efficient with respect to reasonable measures. We show that off-line coin schemes can be implemented securely and efficiently, where security is proven based on the hardness of the discrete log function and a pre-processing stage, and where efficiency is in a new sense that we put forth in this work: “a protocol is efficient if its communication complexity is independent of the computational complexity of its participants” (and thus the communication length and number of encryption operations is only a low-degree polynomial of the input).

Journal ArticleDOI
Stephen T. Kent
TL;DR: Privacy Enhanced Mail (PEM) consists of extensions to existing message processing software plus a key management infrastructure, which combine to provide users with a facility in which message confidentiality, authenticity, and integrity can be effected.
Abstract: Privacy Enhanced Mail (PEM) consists of extensions to existing message processing software plus a key management infrastructure. These combine to provide users with a facility in which message confidentiality, authenticity, and integrity can be effected. PEM is compatible with RFC 822 message processing conventions and is transparent to SMTP mail relays. PEM uses symmetric cryptography — for example, the Data Encryption Standard (DES) — to provide (optional) encryption of messages. Although the RFCs permit the use of either symmetric or asymmetric (public key) cryptography (for instance, the RSA cryptosystem) to distribute symmetric keys, the RFCs strongly recommend the use of asymmetric cryptography for this purpose and to generate and validate digital signatures for messages and certificates. Public key management in PEM is based on the use of certificates as defined by the CCITT Directory Authentication Framework [CCIT88c]. A public key certification hierarchy for PEM is being established by the Internet Society. This certification hierarchy supports universal authentication of PEM users, under various policies, without the need for prior bilateral agreements among users or organizations with which the users may be affiliated.

01 Feb 1993
TL;DR: In this paper, the authors provide definitions, formats, references, and citations for cryptographic algorithms, usage modes, and associated identifiers and parameters used in support of Privacy Enhanced Mail (PEM) in the Internet community.
Abstract: This document provides definitions, formats, references, and citations for cryptographic algorithms, usage modes, and associated identifiers and parameters used in support of Privacy Enhanced Mail (PEM) in the Internet community. It is intended to become one member of the set of related PEM RFCs. This document is organized into four primary sections, dealing with message encryption algorithms, message integrity check algorithms, symmetric key management algorithms, and asymmetric key management algorithms (including both asymmetric encryption and asymmetric signature algorithms).

Book
01 Oct 1993
TL;DR: Covering the latest developments in cryptography for all data communication professionals who need an understanding of cryptographic technology, this book explains the Data Encryption Standard, stream ciphers, public-key cryptosystems, arithmetic operating circuits, and important classes of BCH and Reed-Solomon codes for multiple-error correction.
Abstract: From the Publisher: This book provides a practical introduction to cryptographic principles and algorithms for communication security and data privacy, both commercial and military, written by one of the world's leading authorities on encryption and coding. Covering the latest developments in cryptography for all data communication professionals who need an understanding of cryptographic technology, the book explains the Data Encryption Standard, stream ciphers, public-key cryptosystems, arithmetic operating circuits, important classes of BCH and Reed-Solomon codes for multiple-error correction, ciphertext protection against illegal deletion or injection of information, practical cryptographic applications, and more.

Journal ArticleDOI
TL;DR: Three methods for strengthening public key cryptosystems in such a way that they become secure against adaptively chosen ciphertext attacks are presented and security of the three example cryptosystems is formally proved.
Abstract: Three methods for strengthening public key cryptosystems in such a way that they become secure against adaptively chosen ciphertext attacks are presented. In an adaptively chosen ciphertext attack, an attacker can query the deciphering algorithm with any ciphertext except for the exact object ciphertext to be cryptanalyzed. The first strengthening method is based on the use of one-way hash functions, the second on the use of universal hash functions, and the third on the use of digital signature schemes. Each method is illustrated by an example of a public key cryptosystem based on the intractability of computing discrete logarithms in finite fields. Security of the three example cryptosystems is formally proved. Two other issues, namely, applications of the methods to public key cryptosystems based on other intractable problems and enhancement of information authentication capability to the cryptosystems, are also discussed.
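
As a loose illustration of the first (one-way hash) idea only: a hash of the message and the shared randomness is bound into the ciphertext, and the decryptor rejects anything that fails the check. The toy ElGamal-style code below omits details the paper requires for its actual proofs, and its parameters are purely illustrative.

```python
# Toy sketch: reject ciphertexts whose one-way-hash check fails on decryption.
# Not the paper's construction; parameters are illustrative only.
import hashlib, secrets

p = 2 ** 127 - 1                  # a Mersenne prime used as a toy modulus
g = 3
x = secrets.randbelow(p - 2) + 1  # receiver's secret key
y = pow(g, x, p)                  # receiver's public key

def tag(m, k):
    return hashlib.sha256(m.to_bytes(16, "big") + k.to_bytes(16, "big")).digest()

def encrypt(m):
    r = secrets.randbelow(p - 2) + 1
    k = pow(y, r, p)                          # shared mask
    return pow(g, r, p), (m * k) % p, tag(m, k)

def decrypt(c1, c2, t):
    k = pow(c1, x, p)
    m = (c2 * pow(k, -1, p)) % p
    if tag(m, k) != t:                        # reject malformed ciphertexts
        raise ValueError("ciphertext rejected")
    return m

m = 123456789
assert decrypt(*encrypt(m)) == m
```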

Journal Article
TL;DR: In this paper, a general attack on nonlinearly filtered linear (over Z2) systems is presented and further refined to efficiently cryptanalyze a linear system with a multiplexer as output function.
Abstract: In some applications for synchronous stream ciphers, the risk of loss of synchronization cannot be eliminated completely. In these cases frequent resynchronization or resynchronization upon request may be necessary. In the paper it is shown that this can lead to significant deterioration of the cryptographic security. A powerful general attack on nonlinearly filtered linear (over Z2) systems is presented. This attack is further refined to efficiently cryptanalyze a linear system with a multiplexer as output function.
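
For concreteness, the class of generator targeted is a nonlinearly filtered LFSR; a toy version is sketched below. The tap positions follow a standard maximal-length 16-bit register, and the filter function is an illustrative choice, not the one analyzed in the paper.

```python
# Sketch of a nonlinearly filtered LFSR keystream generator, the kind of
# system the attack targets (illustrative taps and filter).
def filtered_lfsr_keystream(state, nbits):
    # 16-bit Fibonacci LFSR, feedback polynomial x^16 + x^14 + x^13 + x^11 + 1
    state &= 0xFFFF
    out = []
    for _ in range(nbits):
        # nonlinear filter: combine a few state bits with AND and XOR
        b = [(state >> i) & 1 for i in (0, 3, 7, 12)]
        out.append(b[0] ^ (b[1] & b[2]) ^ (b[2] & b[3]))
        fb = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        state = (state >> 1) | (fb << 15)
    return out

# Frequent resynchronization reuses the same secret with varying public
# initial values; observing keystream from many related initial states is
# what the paper exploits.
print(filtered_lfsr_keystream(0xACE1, 32))
```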

Journal ArticleDOI
TL;DR: Four efficient server-aided computation protocols for the modular exponentiation operation are proposed and shown, for typical parameters, to be the most efficient yet proposed at the highest security level.
Abstract: Four efficient server-aided computation protocols for the modular exponentiation operation are proposed. The server-aided computation protocol is a two-party protocol between the client and the server. This protocol has two objectives. The first is to allow the client to borrow the computational power from the server to reduce the computation time. Note that the server is powerful, but restricted to polynomial time. The second objective is to keep the client's exponent secret from the server. Efficient and secure protocols which disclose no knowledge about the secret exponent are proposed. The protocols are based on efficient exponentiation algorithms. The computation time depends on the server's power and the speed of the channel between the client and the server. The normalized computation time is introduced and used to evaluate the protocols. It is shown that, for typical parameters, the protocol is the most efficient one ever proposed to provide the highest security level. >