
Showing papers on "Ciphertext published in 1988"


Proceedings ArticleDOI
01 Jan 1988
TL;DR: The authors showed that interaction in any zero-knowledge proof can be replaced by sharing a common, short, random string and used this result to construct the first public-key cryptosystem secure against chosen ciphertext attack.
Abstract: We show that interaction in any zero-knowledge proof can be replaced by sharing a common, short, random string. We use this result to construct the first public-key cryptosystem secure against chosen ciphertext attack.

879 citations


Book ChapterDOI
01 Apr 1988
TL;DR: A systematic method of checking is suggested, and a generalized version of the cryptanalytic attack that reduces the work factor significantly is described; these cryptanalytic algorithms can be viewed as generalized probabilistic decoding algorithms for any linear error-correcting code.
Abstract: The best known cryptanalytic attack on McEliece's public-key cryptosystem based on algebraic coding theory is to repeatedly select k bits at random from an n-bit ciphertext vector, which is corrupted by at most t errors, in the hope that none of the selected k bits are in error, until the cryptanalyst recovers the correct message. The method of determining whether the recovered message is the correct one has not been thoroughly investigated. In this paper, we suggest a systematic method of checking, and describe a generalized version of the cryptanalytic attack which reduces the work factor significantly (a factor of 2^11 for the commonly used n = 1024 Goppa-code example). Some further improvements are also given. We also note that these cryptanalytic algorithms can be viewed as generalized probabilistic decoding algorithms for any linear error-correcting code.
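The random-selection attack described above admits a quick work-factor estimate: the probability that k randomly chosen positions all avoid the t error positions is C(n−t, k)/C(n, k), so the expected number of selections is its reciprocal. A minimal sketch, assuming the usual (n, k, t) = (1024, 524, 50) McEliece example values (t = 50 is an assumption; the abstract states only n = 1024):

```python
from math import comb, log2

# Expected number of random k-position selections before all k chosen
# positions of the n-bit ciphertext avoid the t error positions.
# (1024, 524, 50) are illustrative McEliece example parameters.
def expected_selections(n: int, k: int, t: int) -> float:
    p_clean = comb(n - t, k) / comb(n, k)  # Pr[all k picks are error-free]
    return 1.0 / p_clean

work = expected_selections(1024, 524, 50)
print(f"about 2^{log2(work):.1f} selections")
```

This counts only the selections themselves; each selection additionally costs a Gaussian elimination on a k x k matrix, which the work-factor comparisons in the paper fold in.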

321 citations


Book ChapterDOI
21 Aug 1988
TL;DR: This paper shows that interaction in any zero-knowledge proof can be replaced by sharing a common, short, random string, which finds immediate application in the construction of the first public-key cryptosystem secure against chosen ciphertext attack.
Abstract: The relevance of zero knowledge to cryptography has become apparent in recent years. In this paper we advance this theory by showing that interaction in any zero-knowledge proof can be replaced by sharing a common, short, random string. This advance finds immediate application in the construction of the first public-key cryptosystem secure against chosen ciphertext attack. Our solution, though not yet practical, is of theoretical significance, since the existence of cryptosystems secure against chosen ciphertext attack has been a famous long-standing open problem in the field.

46 citations


Journal ArticleDOI
TL;DR: The design of the fast data encipherment algorithm (FEAL) is discussed; FEAL is a block cipher that produces a 64-bit ciphertext from a 64-bit plaintext, using a 64-bit key.
Abstract: This paper discusses the design of the fast data encipherment algorithm (FEAL). FEAL is a conventional encipherment algorithm using the same key for enciphering and deciphering. It is a block cipher that produces a 64-bit ciphertext from a 64-bit plaintext, using a 64-bit key. The main feature of the cipher processing in FEAL is that it is based on 8-bit data manipulations such as addition modulo 256 and 1-byte data rotation and transfer operations. The software performance of FEAL is greatly improved compared with that of DES. The security of FEAL rests on the characteristic value representing the trace of the plaintext and enciphering key that remains in the ciphertext, as well as on the algorithm structure.
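The 8-bit operations the abstract mentions (addition modulo 256 and byte rotation) combine in FEAL's S-function, which the open literature gives as S_d(a, b) = rot2((a + b + d) mod 256) with d in {0, 1}, where rot2 is a 2-bit left rotation of a byte. A minimal sketch of that building block:

```python
# FEAL's core 8-bit S-function as described in the open literature:
# S_d(a, b) = rot2((a + b + d) mod 256), d in {0, 1}.
def rot2(x: int) -> int:
    # Rotate an 8-bit value left by 2 positions.
    return ((x << 2) | (x >> 6)) & 0xFF

def s_function(a: int, b: int, d: int) -> int:
    return rot2((a + b + d) % 256)
```

Because every step is byte addition or rotation, the whole round function maps directly onto 8-bit instructions, which is the source of the software speed advantage over DES that the abstract claims.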

26 citations


Journal Article
TL;DR: A blockcipher maps each pair of plaintext and key onto a ciphertext in such a way that for every fixed key, the relationship between plaintexts and ciphertexts is one-to-one.
Abstract: A blockcipher maps each pair of plaintext and key onto a ciphertext in such a way that for every fixed key, the relationship between plaintexts and ciphertexts is one-to-one. It is assumed that plaintexts and ciphertexts belong to a message space comprising all bit-strings (sequences of zeros and ones) of a given length; keys are taken from a key space made up of all bit-strings of a possibly different given length. A well-known blockcipher is the NBS Data Encryption Standard (DES) [6], which is the iteration of sixteen essentially equal "rounds".
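The one-to-one property described here can be checked exhaustively for a toy cipher; the 4-bit keyed map below (XOR with the key followed by a rotation) is an illustrative stand-in, not DES:

```python
# A blockcipher is, for every fixed key, a bijection on the message
# space. Exhaustive check on 4-bit blocks with a toy keyed map.
def toy_cipher(key: int, m: int) -> int:
    x = (m ^ key) & 0xF
    return ((x << 1) | (x >> 3)) & 0xF  # 1-bit left rotation of a 4-bit value

for key in range(16):
    images = {toy_cipher(key, m) for m in range(16)}
    assert len(images) == 16  # all 16 plaintexts map to distinct ciphertexts
```

Each component (key XOR, rotation) is itself invertible, so their composition is a permutation of the message space for every fixed key, which is exactly the one-to-one requirement.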

20 citations


Journal ArticleDOI
TL;DR: Here it is shown that Hellman's result holds with no restrictions on the distribution of keys and messages, and the results are obtained through very simple purely information theoretic arguments, with no need for (explicit) counting arguments.
Abstract: In his landmark 1977 paper [2], Hellman extends the Shannon theory approach to cryptography [3]. In particular, he shows that the expected number of spurious key decipherments on length-n messages is at least 2^(H(K)−nD) − 1 for any uniquely encipherable, uniquely decipherable cipher, as long as each key is equally likely and the set of meaningful cleartext messages follows a uniform distribution (where H(K) is the key entropy and D is the redundancy of the source language). Here we show that Hellman's result holds with no restrictions on the distribution of keys and messages. We also bound from above and below the key equivocation upon seeing the ciphertext. The results are obtained through very simple, purely information-theoretic arguments, with no need for (explicit) counting arguments.
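The bound in the abstract is easy to evaluate numerically. In the sketch below, the 56-bit key entropy and per-character redundancy of 3.2 bits are illustrative assumptions (roughly a DES-sized key and English text), not values from the paper:

```python
# Hellman's lower bound on the expected number of spurious key
# decipherments for length-n messages: 2^(H(K) - n*D) - 1.
# H(K) = 56 bits and D = 3.2 bits/character are assumed example values.
def spurious_key_bound(h_key: float, n: int, d: float) -> float:
    return 2.0 ** (h_key - n * d) - 1.0

# The bound becomes vacuous near the unicity distance n ~ H(K)/D = 17.5:
for n in (5, 10, 17, 18, 25):
    print(n, spurious_key_bound(56.0, n, 3.2))
```

For short messages the bound guarantees exponentially many spurious keys; once n exceeds H(K)/D it drops below zero, matching the classical unicity-distance intuition.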

18 citations


Journal ArticleDOI
TL;DR: This paper proposes a new public-key cryptosystem based on the difficulty of solving a system of nonlinear equations with rational functions, and studies the computational complexity of encryption and decryption, the description volume of public and secret keys, and the possibility of digital signature.
Abstract: This paper proposes a new public-key cryptosystem based on the difficulty of solving a system of nonlinear equations. The proposed cryptosystem has the following features: 1) The public key is a nonlinear transform from a plaintext to a ciphertext in the form of rational functions. 2) The complexity of both encryption and decryption is O(m^2), where m is the plaintext length. 3) Digital signature is possible. The two previously proposed systems based on the matrix decomposition and the squared matrix are special cases of the proposed system. The reliability of the cryptosystem when nonlinearity is limited to the polynomial form is discussed. Next, a public-key cryptosystem based on the difficulty of solving a system of nonlinear equations with rational functions is proposed, its decryption algorithm is studied, and the conditions for this cryptosystem to ensure reliability are derived. Finally, the computational complexity of encryption and decryption, the description volume of public and secret keys, and the possibility of digital signature are studied.

16 citations


Patent
27 Jan 1988
TL;DR: In this paper, the synchronization information for synchronizing encrypting and decrypting key generators (12 and 16) in secure communications link (10) is transmitted without exacting a bandwidth penalty, where pointer comparators at the transmission and reception ends of the link monitor the cipher text transmitted between the generators to determine whether it includes a predetermined naturally occurring sequence of bits, referred to as a pointer sequence.
Abstract: Synchronization information for synchronizing encrypting and decrypting key generators (12 and 16) in a secure communications link (10) is transmitted without exacting a bandwidth penalty. Pointer comparators (17 and 18) at the transmission and reception ends of the link monitor the cipher text transmitted between the generators to determine whether it includes a predetermined naturally occurring sequence of bits, referred to as a "pointer" sequence. Upon the occurrence of the pointer sequence, the pointer comparators (17 and 18) trigger synchronization circuits (20 and 22) to read a sequence in the cipher text that occurs a predetermined period of time after the occurrence of the pointer. Accordingly, both synchronization circuits (20 and 22) read the same cipher-text sequence, which is a naturally occurring part of the cipher text. In response, one of the synchronization circuits (20) places the encrypting key generator (12) into a state designated by the naturally occurring sequence that that synchronization circuit (20) has read, while the other synchronization circuit (22) places the decrypting key generator (16) into the corresponding state. In this way, the encryption and decryption ends of the data link are synchronized without any bandwidth penalty.

13 citations


Book ChapterDOI
01 Nov 1988
TL;DR: This chapter gives an exposition of a new information theory along this line and examines its applications in cryptography.
Abstract: What is information? In a fundamental sense, Shannon’s definition of entropy captures the notion of information in situations where unlimited computing power is always available. As a result, in applications such as cryptography, where computational cost plays a central role, the classical information theory does not provide a totally satisfactory framework. In recent years, after Diffie and Hellman proposed the use of trapdoor functions as the cornerstone for a new genre of cryptography, this deficiency has become particularly apparent: a ciphertext contains all the Shannon information about the plaintext, yet this information is ‘inaccessible’, i.e., it cannot be efficiently computed. This begs the challenging question ‘what is accessible information?’ Can we combine two very successful theories, namely, Information Theory and Computational Complexity Theory, to capture the notion of accessible information? In this chapter, we will give an exposition of a new information theory along this line and examine its applications in cryptography.

11 citations


Journal ArticleDOI
TL;DR: A new technique of random code chaining (RCC) is proposed to make block ciphers more secure in resisting ciphertext searching and authenticity threats and is very suitable for file encryption.
Abstract: A new technique of random code chaining (RCC) is proposed to make block ciphers more secure in resisting ciphertext searching and authenticity threats. Furthermore, the technique is very suitable for file encryption. With this technique, it is possible to make a modification in encrypted files by re-enciphering only one block message.
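The abstract does not spell out the RCC construction itself. The sketch below shows one generic way to obtain the stated properties (a fresh random value chained into each block, so identical plaintext blocks produce different ciphertexts and a single block can be re-enciphered in place); the toy_encipher stand-in and all names are illustrative assumptions, not the paper's algorithm:

```python
import secrets

BLOCK = 8  # toy 8-byte block size

def toy_encipher(key: bytes, block: bytes) -> bytes:
    # Stand-in for a real block cipher; XOR with the key is NOT secure
    # and is used only to keep the sketch self-contained and invertible.
    return bytes(b ^ k for b, k in zip(block, key))

def rcc_encrypt_block(key: bytes, plaintext_block: bytes) -> tuple[bytes, bytes]:
    r = secrets.token_bytes(BLOCK)               # fresh random chaining code
    masked = bytes(p ^ x for p, x in zip(plaintext_block, r))
    return r, toy_encipher(key, masked)          # store (r, ciphertext) per block

def rcc_decrypt_block(key: bytes, r: bytes, ct: bytes) -> bytes:
    masked = toy_encipher(key, ct)
    return bytes(m ^ x for m, x in zip(masked, r))
```

Because each block carries its own random chaining value, equal plaintext blocks no longer yield equal ciphertexts (defeating ciphertext searching), and editing one block of an encrypted file requires re-enciphering only that block, unlike CBC, where a change propagates to all following blocks.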

2 citations