
Showing papers on "Cryptography published in 1987"


Proceedings ArticleDOI
12 Oct 1987
TL;DR: This paper presents an extremely efficient, non-interactive protocol for verifiable secret sharing, which provides asynchronous networks with a constant-round simulation of simultaneous broadcast networks whenever even a bare majority of processors are good.
Abstract: This paper presents an extremely efficient, non-interactive protocol for verifiable secret sharing. Verifiable secret sharing (VSS) is a way of bequeathing information to a set of processors such that a quorum of processors is needed to access the information. VSS is a fundamental tool of cryptography and distributed computing. Seemingly difficult problems such as secret bidding, fair voting, leader election, and flipping a fair coin have simple one-round reductions to VSS. There is a constant-round reduction from Byzantine Agreement to non-interactive VSS. Non-interactive VSS provides asynchronous networks with a constant-round simulation of simultaneous broadcast networks whenever even a bare majority of processors are good. VSS is constantly repeated in the simulation of fault-free protocols by faulty systems. As verifiable secret sharing is a bottleneck for so many results, it is essential to find efficient solutions.

1,202 citations
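
A minimal sketch of the non-interactive, discrete-logarithm-flavoured approach to verifiable secret sharing described above: the dealer hands out Shamir shares of a secret together with one broadcast of public commitments to the polynomial coefficients, and each shareholder checks its share against those commitments without any further interaction. The tiny group parameters, the variable names, and the exact commitment form are illustrative assumptions, not the paper's construction.

# Toy non-interactive verifiable secret sharing (discrete-log commitment sketch).
# Parameters are tiny and insecure; they only illustrate the mechanism.
import random

p = 23          # prime modulus
q = 11          # prime order of the subgroup (11 divides 23 - 1)
g = 2           # generator of the order-11 subgroup mod 23 (2**11 % 23 == 1)

def share(secret, t, n):
    """Deal shares of `secret` with threshold t out of n, plus public commitments."""
    coeffs = [secret % q] + [random.randrange(q) for _ in range(t - 1)]
    commitments = [pow(g, a, p) for a in coeffs]          # broadcast once, non-interactively
    shares = []
    for i in range(1, n + 1):
        y = sum(a * pow(i, k, q) for k, a in enumerate(coeffs)) % q
        shares.append((i, y))
    return shares, commitments

def verify(i, y, commitments):
    """Shareholder i checks g^y == prod_k C_k^(i^k) (mod p)."""
    rhs = 1
    for k, c in enumerate(commitments):
        rhs = (rhs * pow(c, pow(i, k, q), p)) % p
    return pow(g, y, p) == rhs

shares, commitments = share(secret=7, t=3, n=5)
assert all(verify(i, y, commitments) for i, y in shares)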


Book
01 Jan 1987
TL;DR: This book covers topics in elementary number theory, finite fields and quadratic residues, cryptography, public key systems, primality and factoring, and elliptic curves.
Abstract: 1: Some Topics in Elementary Number Theory. 2: Finite Fields and Quadratic Residues. 3: Cryptography. 4: Public Key. 5: Primality and Factoring. 6: Elliptic Curves.

1,085 citations


Book ChapterDOI
16 Aug 1987
TL;DR: In this paper, the authors explain why conventional and public-key cryptosystems are poorly suited to messages that are intended for (or originate from) a group of people.
Abstract: Messages are frequently addressed to a group of people, e.g., a board of directors. Conventional and public key systems (in the sense of Diffie and Hellman [4]) are not adapted to messages intended for a group instead of an individual. To understand why these cryptosystems fall short when messages are intended for (or originate from) a group of people, let us nevertheless attempt to use them. When conventional and public key systems are used to protect privacy, the legitimate receiver(s) has (have) to know the secret key to decrypt. A first solution, then, is to send the message to all members of the group, e.g., using their public keys. A second is that the secret key is known to all members and the message is sent only once. All other solutions using a conventional or public key system are combinations of these two. We explain briefly why these two obvious solutions do not meet the security needs specific to the protection of information intended for groups.

315 citations
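
The two obvious solutions dismissed above can be made concrete with a short sketch: either the sender encrypts the same message once per member under each member's public key, or all members hold one shared secret key and the message is encrypted once. The `pk_encrypt` and `sym_encrypt` functions below are placeholders standing in for any conventional or public-key cipher; they are assumptions for illustration only.

# Illustrative only: `pk_encrypt` and `sym_encrypt` stand in for real ciphers.

def pk_encrypt(public_key, message):      # placeholder public-key encryption
    return ("pk-ciphertext", public_key, message)

def sym_encrypt(shared_key, message):     # placeholder conventional encryption
    return ("sym-ciphertext", shared_key, message)

members = {"alice": "pk_A", "bob": "pk_B", "carol": "pk_C"}
message = "board meeting moved to Friday"

# Solution 1: one ciphertext per member -- n encryptions, no shared group secret.
per_member = {name: pk_encrypt(pk, message) for name, pk in members.items()}

# Solution 2: one ciphertext for everyone -- but every member must hold the group
# key, so any single member can leak it or impersonate the whole group.
group_key = "k_group"
broadcast = sym_encrypt(group_key, message)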


Patent
Stephen M. Matyas1, Jonathan Oseas1
03 Feb 1987
TL;DR: In this article, a cryptographic method for discouraging the copying and sharing of purchased software programs allows an encrypted program to be run on only a designated computer or, alternatively, to run on any computer but only by the user possessing a designated smart card.
Abstract: A cryptographic method for discouraging the copying and sharing of purchased software programs allows an encrypted program to be run on only a designated computer or, alternatively, to be run on any computer but only by the user possessing a designated smart card. Each program offering sold by the software vendor is encrypted with a unique file key and then written on a diskette. A user who purchases a diskette having written thereon an encrypted program must first obtain a secret password from the software vendor. This password will allow the encrypted program to be recovered at a prescribed, designated computer having a properly implemented and initialized encryption feature. The encryption feature decrypts the file key of the program from the password, and when the encrypted program is loaded at the proper computer, the program or a portion of it is automatically decrypted and written into a protected memory from which it can only be executed and not accessed for non-execution purposes. In alternative embodiments, the user is not confined to a prescribed, designated computer but may use the program on other, different computers with a smart card provided the computers have a properly implemented and initialized encryption feature that accepts the smart card. As a further modification, the cryptographic facility may support operations that enable the user to encrypt and decrypt user generated files and/or user generated programs.

219 citations
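
A toy sketch of the password-to-file-key flow the patent describes: the designated computer combines the vendor-supplied password with its own identity to recover the file key, then decrypts the program image into protected memory. The hashlib-based key derivation and the XOR keystream are stand-ins invented for illustration; they are not the patent's cryptographic feature and are not secure.

# Toy illustration of the password -> file key -> program decryption flow.
import hashlib

def derive_file_key(vendor_password: str, machine_id: str) -> bytes:
    # Combining the password with the machine identity makes the password
    # useless on a different computer (illustrative assumption).
    return hashlib.sha256((vendor_password + machine_id).encode()).digest()

def xor_keystream(key: bytes, data: bytes) -> bytes:
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

program = b"MOV AX, 1 ..."                        # plaintext program image
file_key = derive_file_key("vendor-password", "machine-42")
diskette_copy = xor_keystream(file_key, program)  # what ships on the diskette

# On the designated computer: recover the key, decrypt into protected memory.
protected_memory = xor_keystream(derive_file_key("vendor-password", "machine-42"),
                                 diskette_copy)
assert protected_memory == program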


Patent
28 Aug 1987
TL;DR: In analyzing the proposed methods, the authors distinguish between two types of attackers: insiders, who have access to the system, and outsiders, who do not.
Abstract: A cryptographic method and apparatus are disclosed which transform a message of arbitrary length into a fixed-length (128-bit) block called a modification detection code (MDC). Because the MDC is a many-to-one function of its input, a large number of messages result in the same MDC; it must nevertheless be practically infeasible for an opponent to find them. In analyzing the methods, a distinction is made between two types of attacks, i.e., by insiders (who have access to the system) and outsiders (who do not). The first method employs four encryption steps per DEA block and provides the higher degree of security. Coupling between the different DEA operations is provided by using the input keys also as data in two of the four encryption steps. In addition, there is cross coupling by interchanging half of the internal keys. Although this second coupling operation does not add to security in this scheme, it is mandatory in the second method, which employs only two encryption steps per DEA block to trade off security for performance. By providing key cross coupling in both schemes, an identical kernel is established for both methods. This has an implementation advantage, since the first method can be achieved by applying the second method twice. The MDC, when loaded into a secure device, authorizes one and only one data set to be authenticated by the MDC, whereas methods based on message authentication codes or digital signatures involving a public key algorithm authorize a plurality of data sets to be authenticated. The MDC therefore provides for greater security control.

181 citations
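
A structural sketch of the cheaper two-encryptions-per-block idea described above, with a placeholder block cipher in place of the DEA and the cross coupling modelled as swapping half of each result before it feeds the next pair of internal keys. The half sizes, initial values, and `toy_block_cipher` are illustrative assumptions, not the patented construction.

# Structural sketch of a cross-coupled, two-encryptions-per-block MDC.
# `toy_block_cipher` replaces the DEA; everything here is illustrative only.
import hashlib

def toy_block_cipher(key: bytes, block: bytes) -> bytes:
    return hashlib.sha256(key + block).digest()[:8]       # 64-bit "ciphertext"

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def mdc_two_step(message: bytes) -> bytes:
    h1, h2 = b"\x52" * 8, b"\x25" * 8                     # two 64-bit chaining values
    blocks = [message[i:i + 8].ljust(8, b"\x00") for i in range(0, len(message), 8)]
    for m in blocks:
        a = xor(toy_block_cipher(h1, m), m)               # first encryption step
        b = xor(toy_block_cipher(h2, m), m)               # second encryption step
        # cross coupling: exchange half of each result before it becomes the next key
        h1, h2 = a[:4] + b[4:], b[:4] + a[4:]
    return h1 + h2                                        # 128-bit MDC

print(mdc_two_step(b"an arbitrary-length message").hex())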


Book ChapterDOI
13 Apr 1987
TL;DR: FEAL (Fast data Encipherment ALgorithm) fills the need for an encipherment algorithm that has safety equal to DES and is suitable for software as well as hardware implementation.
Abstract: In data communications and information processing systems, cryptography is the most effective way to secure communications and stored data. The most commonly used cryptographic algorithm is DES (1). However, it is generally implemented in hardware, and the cost is prohibitive for small scale systems such as personal computer communications. Accordingly, an encipherment algorithm that has safety equal to DES and is suitable for software as well as hardware implementation is needed. FEAL (Fast data Encipherment ALgorithm) fills this need.

171 citations
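
FEAL, like DES, is a Feistel cipher, which is one reason a software-friendly design is possible: encryption and decryption share the same structure and need only cheap word operations. The sketch below shows the generic Feistel skeleton with an invented round function and key schedule; it is not FEAL itself.

# Generic Feistel skeleton with a placeholder round function -- not FEAL itself.

def round_function(half: int, subkey: int) -> int:
    # stand-in F-function on 32-bit halves
    return ((half + subkey) & 0xFFFFFFFF) ^ ((half << 3 | half >> 29) & 0xFFFFFFFF)

def feistel_encrypt(block64: int, subkeys):
    left, right = block64 >> 32, block64 & 0xFFFFFFFF
    for k in subkeys:
        left, right = right, left ^ round_function(right, k)
    return (left << 32) | right

def feistel_decrypt(block64: int, subkeys):
    left, right = block64 >> 32, block64 & 0xFFFFFFFF
    for k in reversed(subkeys):
        right, left = left, right ^ round_function(left, k)
    return (left << 32) | right

keys = [0x0123, 0x4567, 0x89AB, 0xCDEF]            # toy key schedule
c = feistel_encrypt(0x0011223344556677, keys)
assert feistel_decrypt(c, keys) == 0x0011223344556677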


01 Jan 1987
TL;DR: This talk focuses on the RSA Public Key Cryptosystem, a cryptographic system in which each encryption process is governed by not one but two keys, which allows one of the keys to be made public while its inverse is kept secret, giving the systems their name.
Abstract: We are going to devote most of our attention in this talk to the RSA Public Key Cryptosystem, because it not only remains unbroken but has some other useful features for digital signatures and authentication. We will briefly mention some other methods which have been compromised to some degree, and one, McEliece's, which has not; these are still valid when both keys are kept secret, and some have other features which may be useful. (Publication details: Seberry, J., "Public key cryptography", Secure Data Communications Workshop, Digest of Papers, IEEE, Melbourne, 1987, pp. 1-17; available at Research Online: http://ro.uow.edu.au/infopapers/1030.)
PUBLIC KEY SYSTEMS. A public key cryptosystem is a cryptographic system in which each encryption process is governed by not one but two keys. The two keys are inverses of each other; that is, anything encrypted with one can be decrypted with the other and vice versa. The important additional property of a public key cryptosystem is that, given one of the keys, it is extremely difficult to find the other. This allows one of the keys to be made public while its inverse is kept secret, giving the systems their name. Public key cryptosystems have two very important properties. Because it is not necessary to keep both keys secret, one can be made readily available, published in a phonebook for example. Anyone wanting to transmit a confidential message can encrypt it in the public key of the addressee with assurance that only the addressee will be able to read it. Just as a message encrypted in a public key can be produced by anyone but read only by the holder of the corresponding secret key, a message encrypted in a secret key can be read by anyone using the corresponding public key, but could only have been produced by the holder of the secret key. This gives it the fundamental property of a signature. Use is made of modular arithmetic. Mathematicians write a ≡ b (mod m) (a is congruent to b modulo m) to denote the fact that the integer m divides exactly the difference of the integers a and b. For example, 32 ≡ -4 (mod 12). Note that if the remainder on dividing a by m is b, then a ≡ b (mod m); hence 5124491 ≡ 12172 (mod 21753). In fact, the remainder on dividing a by m is the only number b congruent to a modulo m such that 0 ≤ b < m. One very important consequence of the definition of congruence is that if p(x) is any polynomial function of x with integer coefficients, then p(a) ≡ p(b) (mod m) whenever a ≡ b (mod m).
PUBLIC KEY DISTRIBUTION SYSTEM. A public key distribution system is a mechanism which allows two people who have never had any prior secure contact to establish a secure channel "out of thin air". Public key distribution systems do not provide any signature mechanism but, at present, some are faster and more compact than public key cryptosystems, which makes them better for many applications. The first practical public key distribution system makes use of the apparent difficulty of computing logarithms over a finite (Galois) field GF(q) with a prime number q of elements (the numbers 0, 1, ..., q-1 under arithmetic mod q). Let Y = a^x mod q, for 1 ≤ x ≤ q-1.

161 citations
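
A small sketch of the two ideas in the excerpt above: congruence arithmetic, and an exponentiation-based public key distribution scheme of the Diffie-Hellman type, where each party publishes Y = a^x mod q and both sides arrive at the same shared value. The tiny prime and the variable names are illustrative assumptions; real systems use very large q.

# Congruence arithmetic and a toy Diffie-Hellman-style key distribution.

assert (32 - (-4)) % 12 == 0          # 32 is congruent to -4 modulo 12

q = 101                               # small prime field GF(q), illustration only
a = 2                                 # public base

x_alice, x_bob = 37, 53               # private exponents, never transmitted
Y_alice = pow(a, x_alice, q)          # public values Y = a^x mod q
Y_bob = pow(a, x_bob, q)

# Each side exponentiates the other's public value with its own secret:
shared_alice = pow(Y_bob, x_alice, q)
shared_bob = pow(Y_alice, x_bob, q)
assert shared_alice == shared_bob     # both hold a^(x_alice * x_bob) mod q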


Proceedings Article
16 Aug 1987
TL;DR: In the public-key model, each user has a single validated public key; the model is appropriate when generation and validation of new keys is very costly or otherwise limited, and procedures proposed for it must preserve the security of those keys.
Abstract: An important area of research in cryptography is the design of protocols for carrying on certain transactions in a communications network, such as playing poker or holding an election. Many of the protocols proposed in this area have required the expensive on-line generation of a large number of new keys. On the other hand, fundamental research in the traditional problems of cryptography, such as encryption and authentication, has developed the public-key model, in which each user has a single validated public key. This model is appropriate to those situations in which generation and validation of new keys is very costly or is otherwise limited. Procedures proposed for this model must preserve the security of the keys. An important question is whether flexible protocol design for a wide variety of problems is possible within the public-key model, so that the expense of generating new keys can be minimized.

136 citations


Journal ArticleDOI
Fred Cohen1
TL;DR: This paper describes a cryptographic checksum technique for verifying the integrity of information in computer systems that have no built-in protection; it is based on repeated encryption using an RSA cryptosystem as a pseudo-random number generator.

82 citations


Journal ArticleDOI
TL;DR: The heuristic backtrack algorithm typically visits only a few hundred among the 26! possible nodes on sample texts ranging from 100 to 600 words, thereby opening up the possibility of automatic translation of a scanned document into a standard character code, such as ASCII.
Abstract: A substitution cipher consists of a block of natural language text where each letter of the alphabet has been replaced by a distinct symbol. As a problem in cryptography, the substitution cipher is of limited interest, but it has an important application in optical character recognition. Recent advances render it quite feasible to scan documents with a fairly complex layout and to classify (cluster) the printed characters into distinct groups according to their shape. However, given the immense variety of type styles and forms in current use, it is not possible to assign alphabetical identities to characters of arbitrary size and typeface. This gap can be bridged by solving the equivalent of a substitution cipher problem, thereby opening up the possibility of automatic translation of a scanned document into a standard character code, such as ASCII. Earlier methods relying on letter n-gram frequencies require a substantial amount of ciphertext for accurate n-gram estimates. A dictionary-based approach solves the problem using relatively small ciphertext samples and a dictionary of fewer than 500 words. Our heuristic backtrack algorithm typically visits only a few hundred among the 26! possible nodes on sample texts ranging from 100 to 600 words.

70 citations
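
A compact sketch of the dictionary-based backtracking idea: assign alphabet identities to cipher symbols one word at a time, keep only assignments consistent with some dictionary word, and backtrack when a word cannot be matched. The word list, pruning rules, and absence of any scoring are simplifications of the paper's heuristic algorithm.

# Dictionary-guided backtracking for a simple substitution cipher (sketch).

DICTIONARY = {"the", "quick", "brown", "fox", "jumps", "over", "lazy", "dog"}

def consistent(cipher_word, plain_word, mapping):
    """Can cipher_word decode to plain_word without contradicting `mapping`?"""
    if len(cipher_word) != len(plain_word):
        return None
    new_map = dict(mapping)
    used = set(new_map.values())
    for c, p in zip(cipher_word, plain_word):
        if c in new_map:
            if new_map[c] != p:
                return None
        elif p in used:                      # keep the substitution one-to-one
            return None
        else:
            new_map[c] = p
            used.add(p)
    return new_map

def solve(cipher_words, mapping=None):
    mapping = mapping or {}
    if not cipher_words:
        return mapping
    first, rest = cipher_words[0], cipher_words[1:]
    for candidate in DICTIONARY:             # try each dictionary word, backtrack on failure
        extended = consistent(first, candidate, mapping)
        if extended is not None:
            result = solve(rest, extended)
            if result is not None:
                return result
    return None

key = solve(["uif", "mbaz", "eph"])          # 'uif mbaz eph' is a shift-by-one cipher
decoded = " ".join("".join(key[c] for c in w) for w in ["uif", "mbaz", "eph"])
print(decoded)   # one dictionary-consistent reading; longer texts pin the answer down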


Book ChapterDOI
13 Apr 1987
TL;DR: This paper explains the need to protect critical cryptographic variables (particularly keys and, in some cases, algorithms) in a secure environment within cryptographic equipment, particularly equipment used for high-value funds transfer transactions.
Abstract: With the growth of user awareness of the need to protect sensitive computer data by cryptographic means, this paper explains the need to protect critical cryptographic variables (particularly keys and, in some cases, algorithms) in a secure environment within cryptographic equipment, particularly equipment used for high-value funds transfer transactions. Design principles are outlined, leading to the concept of tamper-resistant, as opposed to tamper-proof, devices to protect key data, whether the data be retained within physically large devices or on small portable tokens. Criteria for the detection of attempts to gain access to sensitive data, rather than attack prevention, are outlined, together with two types of attack scenario: invasive and non-invasive. The risks of attack on cryptographic devices are surveyed and intruder attack objectives are outlined, together with some typical scenarios. The available counter-measures are discussed and several discreet mechanisms are described. Typical detection mechanisms and sensor systems are discussed, plus the design trade-offs that must be made in implementation, in particular manufacturing and maintenance costs versus the scope of attack protection. Once an attack is detected, various data destruction mechanisms may be employed. The desirability of active data destruction by "intelligent" means is proposed, together with a discussion of alternative techniques with particular reference to the data storage device characteristics. Some experiences of tamper-resistance research and development highlight the potential manufacturing problems, particularly in respect of quality assurance, product fault analysis and life-testing. The desirability of tamper-resistance standards and independent assessment facilities is expressed, and the applicability of such standards and large-scale protection methods to intelligent tokens, in particular smart cards and personal authenticators, is discussed.

Book ChapterDOI
16 Aug 1987
TL;DR: This work shows that essentially any multiparty protocol problem can be solved, relying on the so-called key-safeguarding or secret-sharing schemes proposed by Blakley and Shamir as basic building blocks to achieve the optimal result.
Abstract: It has been shown previously how almost any multiparty protocol problem can be solved. All the constructions suggested so far rely on trapdoor one-way functions, and therefore must assume essentially that public key cryptography is possible. It has also been shown that unconditional protection of a single designated participant is all that can be achieved under that model. Assuming only authenticated secrecy channels between pairs of participants, we show that essentially any multiparty protocol problem can be solved. Such a model actually implies the further requirement that less than one third of the participants deviate from the protocol. The techniques presented do not, however, rely on any cryptographic assumptions; they achieve the optimal result and provide security as good as the secrecy and authentication of the channels used. Moreover, the constructions have a built-in fault tolerance: once the participants have sent messages committing themselves to the secrets they will use in the protocol, there is no way less than a third of them can stop the protocol from completing correctly. Our technique relies on the so-called key-safeguarding or secret-sharing schemes proposed by Blakley and Shamir as basic building blocks. The usefulness of their homomorphic structure was observed by Benaloh, who proposed techniques very similar to ours.
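
A minimal sketch of the building block the abstract names: Shamir secret sharing over a prime field, together with the homomorphic property that makes it useful for multiparty protocols, namely that adding shares point-wise yields shares of the sum of the secrets. The prime, threshold, and evaluation points are illustrative choices, not the paper's protocol.

# Shamir secret sharing over GF(P) and its additive homomorphism (sketch).
import random

P = 2**127 - 1                      # a Mersenne prime; the field for the shares

def share(secret, t, n):
    coeffs = [secret % P] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(points):
    """Lagrange interpolation of the polynomial at zero."""
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

a_shares = share(1234, t=3, n=5)
b_shares = share(5555, t=3, n=5)

# Each participant adds its two shares locally; no one learns 1234 or 5555,
# yet any three participants can reconstruct the sum.
sum_shares = [(x, (ya + yb) % P) for (x, ya), (_, yb) in zip(a_shares, b_shares)]
assert reconstruct(sum_shares[:3]) == 1234 + 5555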

Journal ArticleDOI
TL;DR: This article surveys cryptology, from Caesar ciphers to public-key cryptosystems.
Abstract: (1987). Cryptology: From Caesar Ciphers to Public-key Cryptosystems. The College Mathematics Journal: Vol. 18, No. 1, pp. 2-17.

Book ChapterDOI
16 Aug 1987
TL;DR: By applying the complexity-theoretic approach to knowledge, this work is able to measure and control the computational knowledge released to the various users, as well as its temporal availability.
Abstract: We give a general procedure for designing correct, secure, and fault-tolerant cryptographic protocols for many parties, thus enlarging the domain of tasks that can be performed efficiently by cryptographic means. We model the most general sort of feasible adversarial behavior, and describe fault-recovery procedures that can tolerate it. Our constructions minimize the use of cryptographic resources. By applying the complexity-theoretic approach to knowledge, we are able to measure and control the computational knowledge released to the various users, as well as its temporal availability.

Book
01 Oct 1987
TL;DR: This thoroughly enhanced third edition is an essential text for everyone involved with the operation and security of the computer complexes that are the heart of today's businesses.
Abstract: From the Publisher: Computer Security, Third Edition contains the best ideas on recent advances in computer hardware and the spread of personal computer technology. It includes a complete and comprehensive introduction to computer security, as well as coverage of computer crime, systems security, and cryptology. Convinced that there is no such thing as computer security, only various degrees of insecurity, John Carroll presents the best concepts that high technology, classical security practice, and common sense have to offer to help reduce insecurity to the lowest possible level. This thoroughly enhanced third edition is an essential text for everyone involved with the operation and security of the computer complexes that are the heart of today's businesses. In addition to completely updating the original matter, Computer Security, Third Edition includes new information on: computer crime and the law; physical security; communications; surveillance; and risk management.

Book ChapterDOI
13 Apr 1987
TL;DR: In commercial applications, a minimum ciphering rate of 64 Kbit/sec, the transmission rate of public digital networks, is required; a single-chip implementation of the RSA algorithm appears to be the only way to reach it.
Abstract: In commercial applications, a minimum ciphering rate of 64 Kbit/sec is required, which will be the transmission rate of public digital networks. In contrast, the RSA method has a very slow ciphering rate, particularly when software implementations of the algorithm are used. The solution to this problem is a hardware implementation of the RSA algorithm. A cryptography processor consisting of standard chips such as bit-slice processors, however, still does not achieve the necessary speed. Moreover, in a multi-chip processor the security of the key management system cannot be guaranteed. Therefore, a single-chip implementation of the RSA algorithm seems to be the only solution. Such a solution is presented as an RSA Cryptography Processor (CP).
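
The ciphering operation such a chip must accelerate is modular exponentiation, which reduces to a chain of modular multiplications driven by the exponent bits (square-and-multiply). The sketch below mirrors that arithmetic in software only; it says nothing about the chip's architecture, and the toy RSA numbers are illustrative.

# Left-to-right square-and-multiply: the modular exponentiation at the heart of
# RSA ciphering. A hardware implementation pipelines exactly these modular
# multiplications; this sketch only mirrors the arithmetic.

def mod_exp(base, exponent, modulus):
    result = 1
    for bit in bin(exponent)[2:]:                 # exponent bits, most significant first
        result = (result * result) % modulus      # square every step
        if bit == "1":
            result = (result * base) % modulus    # multiply when the bit is set
    return result

n, e, d = 55, 3, 7                                # toy RSA parameters (n = 5 * 11)
message = 8
ciphertext = mod_exp(message, e, n)
assert mod_exp(ciphertext, d, n) == message
assert ciphertext == pow(message, e, n)           # matches Python's built-in pow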

Book ChapterDOI
16 Aug 1987
TL;DR: The optimal values for the parameters of the McEliece public key cryptosystem are computed and it is shown that the likelihood of the existence of more than one trapdoor in the system is very small.
Abstract: The optimal values for the parameters of the McEliece public key cryptosystem are computed. Using these values improves the cryptanalytic complexity of the system and decreases its data expansion. Secondly it is shown that the likelihood of the existence of more than one trapdoor in the system is very small.

Book
28 Jun 1987
TL;DR: The author includes not only information about the most important advances in the field of cryptology of the past decade, such as the Data Encryption Standard (DES), public-key cryptology, and the RSA algorithm, but also the research results of the last three years.
Abstract: The author includes not only information about the most important advances in the field of cryptology of the past decade, such as the Data Encryption Standard (DES), public-key cryptology, and the RSA algorithm, but also the research results of the last three years: the Shamir, the Lagarias-Odlyzko, and the Brickell attacks on the knapsack methods; the new knapsack method using Galois fields by Chor and Rivest; and the recent analysis by Kaliski, Rivest, and Sherman of group-theoretic properties of the Data Encryption Standard (DES).

Book ChapterDOI
01 Jan 1987
TL;DR: A revised 128-bit MDC algorithm is presented which overcomes the so-called Triple Birthday Attack introduced by Coppersmith and makes use of the Intel 8087/80287 Numeric Data Processor coprocessor chip for the IBM PC/XT/AT and similar microcomputers.
Abstract: Manipulation Detection Codes (MDC) are defined as a class of checksum algorithms which can detect both accidental and malicious modifications of an electronic message or document. Although the MDC result must be protected by encryption to prevent an attacker from succeeding in substituting his own Manipulation Detection Code (MDC) along with the modified text, MDC algorithms do not require the use of secret information such as a cryptographic key. Such techniques are therefore highly useful in allowing encryption and message authentication to be implemented in different protocol layers in a communication system without key management difficulties, as well as in implementing digital signature schemes. It is shown that cryptographic checksums that are intended to detect fraudulent messages should be on the order of 128 bits in length, and the ANSI X9.9-1986 Message Authentication Standard is criticized on that basis. A revised 128-bit MDC algorithm is presented which overcomes the so-called Triple Birthday Attack introduced by Coppersmith. A fast, efficient implementation is discussed which makes use of the Intel 8087/80287 Numeric Data Processor coprocessor chip for the IBM PC/XT/AT and similar microcomputers.
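
The 128-bit recommendation follows from the birthday paradox: collisions in an n-bit checksum are expected after roughly 2^(n/2) messages, so a 64-bit code can be defeated with about 2^32 trial messages while a 128-bit code pushes the same attack to about 2^64 work. A short calculation using the standard approximation is sketched below; it is a back-of-the-envelope illustration, not the paper's analysis.

# Birthday-bound estimate: probability of at least one collision among k random
# n-bit checksums is about 1 - exp(-k*(k-1) / 2^(n+1)).
import math

def collision_probability(k_messages, n_bits):
    x = k_messages * (k_messages - 1) / 2.0 ** (n_bits + 1)
    return -math.expm1(-x)                 # accurate 1 - exp(-x), even for tiny x

print(collision_probability(2**32, 64))    # ~0.39: a 64-bit MDC falls to ~4 billion messages
print(collision_probability(2**32, 128))   # ~3e-20: the same effort is hopeless at 128 bits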

Book
01 Jan 1987
TL;DR: A zero-knowledge Poker protocol that achieves confidentiality of the players' strategy or How to achieve an electronic Poker face is proposed.
Abstract: Data Encryption Standard.- Structure in the S-Boxes of the DES (extended abstract).- Cycle Structure of the DES with Weak and Semi-Weak Keys.- Public-Key Cryptography.- Private-Key Algebraic-Coded Cryptosystems.- Some Variations on RSA Signatures & their Security.- Breaking the Cade Cipher.- A Modification of a Broken Public-Key Cipher.- A Pseudo-Random Bit Generator Based on Elliptic Logarithms.- Two Remarks Concerning the Goldwasser-Micali-Rivest Signature Scheme.- Public-key Systems Based on the Difficulty of Tampering (Is there a difference between DES and RSA?).- A Secure and Privacy-Protecting Protocol for Transmitting Personal Information Between Organizations.- Cryptographic Protocols And Zero-Knowledge Proofs.- How to Prove All NP Statements in Zero-Knowledge and a Methodology of Cryptographic Protocol Design (Extended Abstract).- How To Prove Yourself: Practical Solutions to Identification and Signature Problems.- Demonstrating that a Public Predicate can be Satisfied Without Revealing Any Information About How.- Demonstrating Possession of a Discrete Logarithm Without Revealing it.- Cryptographic Capsules: A Disjunctive Primitive for Interactive Protocols.- Zero-Knowledge Simulation of Boolean Circuits.- All-or-Nothing Disclosure of Secrets.- A zero-knowledge Poker protocol that achieves confidentiality of the players' strategy or How to achieve an electronic Poker face.- Secret-Sharing Methods.- Secret Sharing Homomorphisms: Keeping Shares of a Secret Secret (Extended Abstract).- How to Share a Secret with Cheaters.- Smallest Possible Message Expansion in Threshold Schemes.- Hardware Systems.- VLSI implementation of public-key encryption algorithms.- Architectures for exponentiation in GF(2n).- Implementing the Rivest Shamir and Adleman Public Key Encryption Algorithm on a Standard Digital Signal Processor.- Software Systems.- A High Speed Manipulation Detection Code.- Electronic Funds Transfer Point of Sale in Australia.- Software Protection, Probabilistic Methods, and Other Topics.- The Notion of Security for Probabilistic Cryptosystems (Extended Abstract).- Large-Scale Randomization Techniques.- On the Linear Span of Binary Sequences Obtained from Finite Geometries.- Some Constructions and Bounds for Authentication Codes.- Towards a Theory of Software Protection (Extended Abstract).- Informal Contributions.- Two Observations on Probabilistic Primality Testing.- Public Key Registration.- Is there an ultimate use of cryptography? (Extended Abstract).- Smart Card a Highly Reliable and Portable Security Device.- Thomas - A Complete Single Chip RSA Device.

Journal ArticleDOI
TL;DR: It is shown that cryptographic checksums that are intended to detect fraudulent messages must be on the order of 128 bits in length, and the ANSI X9.9-1986 message authentication standard is criticized on that basis.
Abstract: Digital signature techniques such as the Rivest-Shamir-Adleman (RSA) scheme can be used to establish both the authenticity of a document and the identity of its originator. However, because of the computationally-intensive nature of the RSA algorithm, most digital signature schemes make use of a checksum technique to summarize or represent the document, and then digitally sign the checksum. Message authentication codes (MACs), based on the Data Encryption Standard (DES), are often used for this purpose. It is shown that cryptographic checksums that are intended to detect fraudulent messages must be on the order of 128 bits in length, and the ANSI X9.9-1986 message authentication standard is criticized on that basis. In addition, architectural arguments are advanced to illustrate the advantages of a checksum algorithm that is not based on the use of cryptography and does not require the use of a secret key. Manipulation detection codes (MDC) are defined as a class of checksum algorithms that can detect both accidental and malicious modifications of an electronic message or document, without requiring the use of a cryptographic key.

Journal ArticleDOI
TL;DR: The letter demonstrates that, for n-bit arguments, at most 2n carry-save additions are required, followed by at most two carry-propagate additions for final assimilation, using components no more than n + 3 bits wide.
Abstract: The computation of A*B modulo N is an important arithmetic operation in security cryptosystems. Since the word length n involved is large, speed-up techniques are important. The letter demonstrates that, for n-bit arguments, at most 2n carry-save additions are required, followed by at most two carry-propagate additions for final assimilation, using components no more than n + 3 bits wide.
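
The shift-and-add structure behind such hardware is easy to state in software: scan the bits of A from the most significant end, doubling and conditionally adding B, and reduce modulo N as you go so intermediate values stay about n bits wide. The sketch below shows the arithmetic only; the letter's point is that these additions can be performed in redundant carry-save form, which this model does not capture.

# Interleaved modular multiplication: computes A*B mod N with one shift-and-add
# pass over the bits of A, reducing after every step so values stay ~n bits wide.
# (The hardware trick -- doing the additions in carry-save form -- is not modelled.)

def mod_mul(a, b, n):
    result = 0
    for bit in bin(a)[2:]:                 # most significant bit of A first
        result = (result << 1) % n         # double (a left shift), then reduce
        if bit == "1":
            result = (result + b) % n      # conditionally add B, then reduce
    return result

a, b, n = 123456789, 987654321, 1000000007
assert mod_mul(a, b, n) == (a * b) % n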

Proceedings ArticleDOI
27 Apr 1987
TL;DR: This paper presents an approach to analyzing encryption protocols using machine-aided formal verification techniques; a formal specification of an example system is presented, and a weakness revealed by testing the formal specification is discussed.
Abstract: This paper presents an approach to analyzing encryption protocols using machine-aided formal verification techniques. The desirable properties that a protocol is to preserve are expressed as state invariants, and the theorems that need to be proved to guarantee that the cryptographic facility satisfies the invariants are automatically generated by the verification system. A formal specification of an example system is presented, and a weakness that was revealed by testing the formal specification is discussed.
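
The flavour of the invariant-based approach can be illustrated with a toy state-exploration check: enumerate the reachable states of a small key-handling model and assert an invariant (here, "no key ever leaves the cryptographic facility in the clear") in every state. The model, its state fields, and its transition rules are invented for illustration and are unrelated to the system verified in the paper, which uses theorem proving rather than enumeration.

# Toy invariant check over the reachable states of an invented key-handling model.
from collections import deque

def transitions(state):
    inside, outside = state                   # keys inside the facility / on the channel
    for key in inside:
        yield (inside, outside | {("encrypted", key)})   # export only in encrypted form
        yield (inside | {key + "'"}, outside)            # derive a new internal key
        # a buggy rule would be: yield (inside, outside | {("clear", key)})

def invariant(state):
    _, outside = state
    return all(tag == "encrypted" for tag, _ in outside)  # no clear keys leave the facility

start = (frozenset({"master"}), frozenset())
seen, queue = {start}, deque([start])
while queue:
    state = queue.popleft()
    assert invariant(state), f"invariant violated in {state}"
    for nxt in transitions(state):
        nxt = (frozenset(nxt[0]), frozenset(nxt[1]))
        if nxt not in seen and len(nxt[0]) <= 3:           # bound the exploration
            seen.add(nxt)
            queue.append(nxt)
print(f"checked {len(seen)} reachable states")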

Book ChapterDOI
13 Apr 1987
TL;DR: The aim of this contribution is to show how cryptographic applications demand both high security and high speed, and how the two can be combined.
Abstract: This paper describes the impact of cryptographic requirements on the design of a new, highly performant DES chip implementation. Current cryptographic applications demand both high security and high speed. It is the aim of this contribution to show how the two can be combined.

Book
01 Oct 1987
TL;DR: Character sets and substitutions Transposition ciphers and transposed alphabets Cryptographic security Mathematically-based cryptology Congruencies and modular arithmetic Shifts, inverses and polyalphabets
Abstract: Character sets and substitutions Transposition ciphers and transposed alphabets Cryptographic security Mathematically-based cryptology Congruencies and modular arithmetic Shifts, inverses and polyalphabets Prime numbers and multiplication inverses Logarithms and exponents Public key cryptosystems Polyalphabetical and auto-key ciphers Polygraphic ciphers Matrix ciphers Matrices and simultaneous equations The inherent flaw in linear ciphers Binary ciphers The data encryption standard Idiosyncratic cryptology.

Book ChapterDOI
Walter Fumy1
16 Aug 1987
TL;DR: The effects of several straightforward modifications of FEAL's F-function are discussed, and the characteristics of the F-function are compared to those of the Data Encryption Standard (DES).
Abstract: The cryptographic strength of a Feistel cipher depends strongly on the properties of its F-function. Certain characteristics of the F-function of the Fast Data Encipherment Algorithm (FEAL) are investigated and compared to characteristics of the F-function of the Data Encryption Standard (DES). The effects of several straightforward modifications of FEAL's F-function are discussed.
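
Unlike DES, whose F-function is built from table-lookup S-boxes, FEAL's nonlinearity comes from byte addition and rotation, which is part of what makes it software-friendly. The sketch below shows only that byte-level S-function primitive; the full F-function wiring (how the data half and the subkey bytes are combined) is deliberately not reproduced and should not be read as FEAL's specification.

# FEAL's byte-level S-function: add two bytes (plus a constant of 0 or 1) modulo 256,
# then rotate the result left by two bits. The complete F-function is not shown.

def rot2(byte):
    return ((byte << 2) | (byte >> 6)) & 0xFF

def S(a, b, delta):                  # delta is 0 (S0) or 1 (S1)
    return rot2((a + b + delta) & 0xFF)

print(hex(S(0x12, 0x34, 0)), hex(S(0x12, 0x34, 1)))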

Journal ArticleDOI
TL;DR: Performance results of several implementations are given, which show that the RSA algorithm is acceptably fast for a large number of applications.

Book ChapterDOI
16 Aug 1987
TL;DR: This paper deals with the protocol aspects of integrating cryptography in ISDN, which have been briefly discussed earlier in the literature.
Abstract: Security services in ISDN have been briefly discussed earlier in the literature [25, 26 and 27]. This paper deals with the protocol aspects of integrating cryptography in ISDN.

Proceedings Article
01 Jan 1987
TL;DR: An attack on computer security based on a combination of software viruses and hardware trapdoors is presented, and the complexity of finding such an attack is discussed.
Abstract: Cryptography can increase the security of computers and modern telecommunication systems. Software viruses and hardware trapdoors are aspects of computer security. Based on a combination of these two aspects, an attack on computer security is presented. The complexity of finding such an attack is discussed. A new open problem is: can cryptography prevent such an attack?

ReportDOI
01 Feb 1987
TL;DR: The results of a trace-driven measurement study of ADP performance show that its throughput and latency are acceptable even within the limitations of today's technology, provided single-key encryption-decryption can be done in hardware.
Abstract: A mechanism for secure communication in large distributed systems is proposed. The mechanism, called Authenticated Datagram Protocol (ADP), provides message authentication and, optionally, privacy of data. ADP is a host-to-host datagram protocol, positioned below the transport layer; it uses public-key encryption to establish secure channels between hosts and to authenticate owners, and single-key encryption for communication over a channel and to ensure privacy of the messages. ADP is shown to satisfy the main security requirements of large distributed systems, with end-to-end security mechanisms provided at a higher level. The results of a trace-driven measurement study of ADP performance show that its throughput and latency are acceptable even within the limitations of today's technology, provided single-key encryption-decryption can be done in hardware.
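
The layering the report describes, expensive public-key operations only to set up and authenticate a host-to-host channel and cheap single-key operations per datagram, is the familiar hybrid pattern sketched below. The handshake, the key derivation, and the per-datagram protection here are placeholders; ADP's actual formats and algorithms are not reproduced.

# Hybrid channel in the ADP spirit: public-key steps once per host pair,
# single-key protection per datagram. All primitives below are placeholders.
import hashlib, hmac, os

class SecureChannel:
    def __init__(self, local_host, remote_host):
        # Placeholder for the public-key handshake that authenticates the two
        # hosts and agrees on a channel key (done once, amortized over traffic).
        self.channel_key = hashlib.sha256(
            (local_host + remote_host).encode() + os.urandom(16)).digest()

    def send(self, datagram: bytes) -> bytes:
        # Single-key protection of each datagram: here only an authentication tag;
        # a real channel would also encrypt when privacy is requested.
        tag = hmac.new(self.channel_key, datagram, hashlib.sha256).digest()
        return tag + datagram

    def receive(self, packet: bytes) -> bytes:
        tag, datagram = packet[:32], packet[32:]
        expected = hmac.new(self.channel_key, datagram, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            raise ValueError("authentication failed")
        return datagram

channel = SecureChannel("host-a", "host-b")     # in reality, both hosts derive this key
packet = channel.send(b"remote procedure call payload")
assert channel.receive(packet) == b"remote procedure call payload"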