
Showing papers in "Communications of The ACM in 1981"


Journal ArticleDOI
TL;DR: New results are derived on the minimum number of landmarks needed to obtain a solution, and algorithms are presented for computing these minimum-landmark solutions in closed form, providing the basis for an automatic system that can solve the Location Determination Problem under difficult viewing and analysis conditions.
Abstract: A new paradigm, Random Sample Consensus (RANSAC), for fitting a model to experimental data is introduced. RANSAC is capable of interpreting/smoothing data containing a significant percentage of gross errors, and is thus ideally suited for applications in automated image analysis where interpretation is based on the data provided by error-prone feature detectors. A major portion of this paper describes the application of RANSAC to the Location Determination Problem (LDP): Given an image depicting a set of landmarks with known locations, determine that point in space from which the image was obtained. In response to a RANSAC requirement, new results are derived on the minimum number of landmarks needed to obtain a solution, and algorithms are presented for computing these minimum-landmark solutions in closed form. These results provide the basis for an automatic system that can solve the LDP under difficult viewing and analysis conditions.

23,396 citations
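
To make the RANSAC loop concrete, here is a minimal Python sketch that fits a 2-D line to data containing gross errors. The iteration count, tolerance, and synthetic data are arbitrary choices for illustration; the paper's LDP (camera pose) application is not reproduced here.

```python
import random

def fit_line(p, q):
    """Line y = m*x + b through two points (assumes distinct x)."""
    m = (q[1] - p[1]) / (q[0] - p[0])
    return m, p[1] - m * p[0]

def ransac_line(points, n_iters=200, tol=0.5):
    """Minimal RANSAC sketch: repeatedly fit a model to a random minimal
    sample and keep the model with the largest consensus (inlier) set."""
    best_model, best_inliers = None, []
    for _ in range(n_iters):
        p, q = random.sample(points, 2)          # minimal sample for a line
        if p[0] == q[0]:
            continue
        m, b = fit_line(p, q)
        inliers = [(x, y) for x, y in points if abs(y - (m * x + b)) < tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (m, b), inliers
    # in practice one would refit ("smooth") using all inliers of the best model
    return best_model, best_inliers

if __name__ == "__main__":
    good = [(x, 2 * x + 1 + random.gauss(0, 0.1)) for x in range(30)]
    gross = [(random.uniform(0, 30), random.uniform(-50, 50)) for _ in range(10)]
    model, inliers = ransac_line(good + gross)
    print("estimated (m, b):", model, "inliers:", len(inliers))
```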


Journal ArticleDOI
TL;DR: A technique based on public key cryptography is presented that allows an electronic mail system to hide who a participant communicates with as well as the content of the communication - in spite of an unsecured underlying telecommunication system.
Abstract: A technique based on public key cryptography is presented that allows an electronic mail system to hide who a participant communicates with as well as the content of the communication - in spite of an unsecured underlying telecommunication system. The technique does not require a universally trusted authority. One correspondent can remain anonymous to a second, while allowing the second to respond via an untraceable return address. The technique can also be used to form rosters of untraceable digital pseudonyms from selected applications. Applicants retain the exclusive ability to form digital signatures corresponding to their pseudonyms. Elections in which any interested party can verify that the ballots have been properly counted are possible if anonymously mailed ballots are signed with pseudonyms from a roster of registered voters. Another use allows an individual to correspond with a record-keeping organization under a unique pseudonym, which appears in a roster of acceptable clients.

4,075 citations
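
The layering at the heart of the mix is easy to sketch. In the toy code below, seal/unseal merely tag a payload with the mix that may open it, standing in for sealing with that mix's public key; the mix and recipient names are invented. Only the nested-wrapping and hop-by-hop forwarding structure is illustrated, not untraceable return addresses, pseudonym rosters, or any real cryptography.

```python
def seal(mix_name, payload):
    """Stand-in for encrypting `payload` with mix_name's public key (toy only)."""
    return {"sealed_for": mix_name, "payload": payload}

def unseal(mix_name, packet):
    assert packet["sealed_for"] == mix_name, "only the addressed mix can open this layer"
    return packet["payload"]

def build_onion(message, recipient, mixes):
    """Nest the message so that each mix learns only the identity of the next hop."""
    packet = seal(mixes[-1], {"next_hop": recipient, "inner": {"body": message}})
    for i in range(len(mixes) - 2, -1, -1):
        packet = seal(mixes[i], {"next_hop": mixes[i + 1], "inner": packet})
    return packet

def run_mix(mix_name, packet):
    layer = unseal(mix_name, packet)
    return layer["next_hop"], layer["inner"]     # forward the inner packet onward

onion = build_onion("hello", recipient="alice", mixes=["mix1", "mix2", "mix3"])
hop, packet = "mix1", onion
while hop != "alice":
    nxt, packet = run_mix(hop, packet)
    print(hop, "forwards to", nxt)               # each mix sees only its neighbours
    hop = nxt
print("delivered:", packet["body"])
```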


Journal ArticleDOI
Leslie Lamport
TL;DR: A method of user password authentication is described which is secure even if an intruder can read the system's data, and can tamper with or eavesdrop on the communication between the user and the system.
Abstract: A method of user password authentication is described which is secure even if an intruder can read the system's data, and can tamper with or eavesdrop on the communication between the user and the system. The method assumes a secure one-way encryption function and can be implemented with a microcomputer in the user's terminal.

2,874 citations
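
Lamport's scheme is commonly realized as a hash chain: the user keeps a secret, the host stores the one-way function iterated n times over it, and each login reveals the next-earlier element of the chain. A minimal sketch, with SHA-256 standing in for the paper's unspecified one-way function:

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def h_iter(x: bytes, n: int) -> bytes:
    for _ in range(n):
        x = h(x)
    return x

# Setup: the user keeps `secret`; the host stores only H^n(secret).
secret, n = b"correct horse battery staple", 1000
host_stored = h_iter(secret, n)

def login(one_time_password: bytes) -> bool:
    """Accept if hashing the submitted value once yields the stored value,
    then roll the stored value forward so the password cannot be replayed."""
    global host_stored
    if h(one_time_password) == host_stored:
        host_stored = one_time_password
        return True
    return False

# The i-th login sends H^(n-i)(secret); an eavesdropper who sees it cannot
# invert H to compute the next password in the sequence.
print(login(h_iter(secret, n - 1)))   # True
print(login(h_iter(secret, n - 1)))   # False: replay rejected
print(login(h_iter(secret, n - 2)))   # True
```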


Journal ArticleDOI
TL;DR: This work describes in detail how to program the cube-connected cycles for efficiently solving a large class of problems that include Fast Fourier transform, sorting, permutations, and derived algorithms.
Abstract: An interconnection pattern of processing elements, the cube-connected cycles (CCC), is introduced which can be used as a general purpose parallel processor. Because its design complies with present technological constraints, the CCC can also be used in the layout of many specialized large scale integrated circuits (VLSI). By combining the principles of parallelism and pipelining, the CCC can emulate the cube-connected machine and the shuffle-exchange network with no significant degradation of performance but with a more compact structure. We describe in detail how to program the CCC for efficiently solving a large class of problems that include Fast Fourier transform, sorting, permutations, and derived algorithms.

1,046 citations
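
The CCC replaces each vertex of an n-dimensional hypercube with a cycle of n processors, so each processor has two cycle neighbours plus one hypercube neighbour. A short sketch that builds this interconnection (construction only; the FFT, sorting, and permutation programs described in the paper are not shown):

```python
def cube_connected_cycles(n):
    """Return the adjacency of the CCC of dimension n.
    Nodes are (x, i): x is a hypercube vertex (0 <= x < 2**n),
    i is the position within that vertex's cycle (0 <= i < n)."""
    adj = {}
    for x in range(2 ** n):
        for i in range(n):
            adj[(x, i)] = [
                (x, (i + 1) % n),        # next node on the local cycle
                (x, (i - 1) % n),        # previous node on the local cycle
                (x ^ (1 << i), i),       # cube edge across dimension i
            ]
    return adj

if __name__ == "__main__":
    ccc = cube_connected_cycles(3)            # 3 * 2^3 = 24 processors
    print(len(ccc), "nodes, degree =", len(ccc[(0, 0)]))
```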


Journal ArticleDOI
TL;DR: It is shown that key distribution protocols with timestamps prevent replays of compromised keys and have the additional benefit of replacing a two-step handshake.
Abstract: The distribution of keys in a computer network using single key or public key encryption is discussed. We consider the possibility that communication keys may be compromised, and show that key distribution protocols with timestamps prevent replays of compromised keys. The timestamps have the additional benefit of replacing a two-step handshake.

787 citations
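
On the receiving side, the timestamp check the paper advocates reduces to a freshness test: accept a key-distribution message only if its timestamp falls within the expected clock-skew plus network-delay window, so a replay carrying an old (possibly compromised) key is rejected. A minimal sketch; the tolerance values are invented placeholders, not figures from the paper:

```python
import time

CLOCK_SKEW = 5.0      # assumed bound on clock difference between machines (seconds)
NETWORK_DELAY = 2.0   # assumed bound on message transit time (seconds)

def is_fresh(message_timestamp: float, local_clock=None) -> bool:
    """Accept a key-distribution message only if its timestamp is recent."""
    if local_clock is None:
        local_clock = time.time()
    return abs(local_clock - message_timestamp) < CLOCK_SKEW + NETWORK_DELAY

print(is_fresh(time.time()))            # fresh message: accepted
print(is_fresh(time.time() - 3600))     # hour-old replay: rejected
```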


Journal ArticleDOI
TL;DR: The Cornell Program Synthesizer demands a structural perspective at all stages of program development and its separate features are unified by a common foundation: a grammar for the programming language.
Abstract: Programs are not text; they are hierarchical compositions of computational structures and should be edited, executed, and debugged in an environment that consistently acknowledges and reinforces this viewpoint. The Cornell Program Synthesizer demands a structural perspective at all stages of program development. Its separate features are unified by a common foundation: a grammar for the programming language. Its full-screen derivation-tree editor and syntax-directed diagnostic interpreter combine to make the Synthesizer a powerful and responsive interactive programming tool.

715 citations


Journal ArticleDOI
TL;DR: An algorithm is proposed that creates mutual exclusion in a computer network whose nodes communicate only by messages and do not share memory, and it is shown that the number can be contained in a fixed amount of memory by storing it as the residue of a modulus.
Abstract: An algorithm is proposed that creates mutual exclusion in a computer network whose nodes communicate only by messages and do not share memory. The algorithm sends only 2*(N - 1) messages, where N is the number of nodes in the network, per critical section invocation. This number of messages is at a minimum if parallel, distributed, symmetric control is used; hence, the algorithm is optimal in this respect. The time needed to achieve mutual exclusion is also minimal under some general assumptions. As in Lamport's "bakery algorithm," unbounded sequence numbers are used to provide first-come, first-served priority into the critical section. It is shown that the number can be contained in a fixed amount of memory by storing it as the residue of a modulus. The number of messages required to implement the exclusion can be reduced by using sequential node-by-node processing, by using broadcast message techniques, or by sending information through timing channels. The "readers and writers" problem is solved by a simple modification of the algorithm, and the modifications necessary to make the algorithm robust are described.

702 citations
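
The core of the algorithm is the rule applied when a REQUEST arrives: reply at once unless this node has an outstanding request with an older (sequence number, node id) pair, in which case the reply is deferred until the critical section is released. A simplified sketch of that rule, with message transport and the full N-node exchange omitted:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: int
    requesting: bool = False          # waiting for, or inside, the critical section
    my_seq: int = 0                   # sequence number of our outstanding request
    highest_seq: int = 0              # highest sequence number seen so far
    deferred: list = field(default_factory=list)

    def request_cs(self):
        """Start a request: 2*(N-1) messages total, N-1 REQUESTs and N-1 REPLYs."""
        self.requesting = True
        self.my_seq = self.highest_seq + 1
        return ("REQUEST", self.my_seq, self.node_id)   # broadcast to all other nodes

    def on_request(self, req_seq, req_id):
        """Reply now, or defer until we leave the critical section."""
        self.highest_seq = max(self.highest_seq, req_seq)
        we_have_priority = self.requesting and (self.my_seq, self.node_id) < (req_seq, req_id)
        if we_have_priority:
            self.deferred.append(req_id)                 # reply later, on release
            return None
        return ("REPLY", self.node_id, req_id)

    def release_cs(self):
        self.requesting = False
        replies, self.deferred = self.deferred, []
        return [("REPLY", self.node_id, r) for r in replies]

# Tie-break example: two simultaneous requests with equal sequence numbers.
a, b = Node(1), Node(2)
a.request_cs(); b.request_cs()
print(a.on_request(b.my_seq, b.node_id))  # None: node 1 defers (its own request wins the tie)
print(b.on_request(a.my_seq, a.node_id))  # ('REPLY', 2, 1): node 2 yields to node 1
```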


Journal ArticleDOI
TL;DR: An approach to carrying out asynchronous, distributed simulation on multiprocessor message-passing architectures is presented; the amount of memory required by all processors together is bounded and is no more than the amount required in sequential simulation.
Abstract: An approach to carrying out asynchronous, distributed simulation on multiprocessor message-passing architectures is presented. This scheme differs from other distributed simulation schemes because (1) the amount of memory required by all processors together is bounded and is no more than the amount required in sequential simulation and (2) the multiprocessor network is allowed to deadlock, the deadlock is detected, and then the deadlock is broken. Proofs for the correctness of this approach are outlined.

686 citations


Journal ArticleDOI
TL;DR: Decoding algorithms for Reed-Solomon codes provide extensions and generalizations of Shamir's method, which is closely related to Reed-Solomon coding schemes.
Abstract: Shamir's scheme for sharing secrets is closely related to Reed-Solomon coding schemes. Decoding algorithms for Reed-Solomon codes provide extensions and generalizations of Shamir's method.

671 citations
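
In the Reed-Solomon view, the secret is the constant term of a random polynomial over a finite field and the shares are evaluations of that polynomial; any k shares recover the secret by interpolation. A minimal sketch (a small prime is chosen for readability; the error-correcting decoders the paper generalizes from are not shown):

```python
import random

P = 2_147_483_647          # a prime field modulus; in practice much larger

def make_shares(secret, k, n):
    """Split `secret` into n shares, any k of which suffice to recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(secret=123456789, k=3, n=5)
print(recover(shares[:3]))                   # any 3 shares recover the secret
print(recover(random.sample(shares, 3)))
```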


Journal ArticleDOI
Philip Heidelberger, Peter D. Welch
TL;DR: A method for placing confidence limits on the steady state mean of an output sequence generated by a discrete event simulation is discussed and a run length control procedure is developed that uses the relative width of the generated confidence interval as a stopping criterion.
Abstract: This paper discusses a method for placing confidence limits on the steady state mean of an output sequence generated by a discrete event simulation. An estimate of the variance is obtained by estimating the spectral density at zero frequency. This estimation is accomplished through a regression analysis of the logarithm of the averaged periodogram. By batching the output sequence the storage and computational requirements of the method remain low. A run length control procedure is developed that uses the relative width of the generated confidence interval as a stopping criterion. Experimental results for several queueing models of an interactive computer system are reported.

442 citations
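
Roughly, the method rests on Var(sample mean) ≈ p(0)/n, where p(0) is the spectral density of the output sequence at zero frequency, which the paper estimates by regressing the logarithm of an averaged periodogram of batched output. The sketch below is a deliberately cruder stand-in: it simply averages the lowest-frequency periodogram ordinates of the batch means and omits the paper's regression and run-length control.

```python
import numpy as np

def spectral_ci(output, batch_size=64, low_freqs=8, z=1.96):
    """Confidence interval for the steady-state mean of a simulation output
    sequence, from a crude spectral estimate at frequency zero.
    (Simplified stand-in for the log-periodogram regression of the paper.)"""
    x = np.asarray(output, dtype=float)
    m = len(x) // batch_size
    batch_means = x[: m * batch_size].reshape(m, batch_size).mean(axis=1)

    # Periodogram of the batch-means series; ordinates near zero frequency
    # estimate the spectral density p(0), and Var(overall mean) ~ p(0)/m.
    centered = batch_means - batch_means.mean()
    periodogram = np.abs(np.fft.rfft(centered)) ** 2 / m
    p0_hat = periodogram[1 : 1 + low_freqs].mean()

    half_width = z * np.sqrt(p0_hat / m)
    mean = batch_means.mean()
    return mean - half_width, mean + half_width

# Example: an AR(1) output sequence with known steady-state mean 10.
rng = np.random.default_rng(0)
x, data = 10.0, []
for _ in range(200_000):
    x = 10.0 + 0.9 * (x - 10.0) + rng.normal()
    data.append(x)
print(spectral_ci(data))
```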


Journal ArticleDOI
TL;DR: The 1980 ACM Turing Award was presented to Charles Antony Richard Hoare, Professor of Computation at the University of Oxford, England, by Walter Carlson, Chairman of the Awards Committee, at the ACM Annual Conference in Nashville, Tennessee, October 27, 1980.
Abstract: The 1980 ACM Turing Award was presented to Charles Antony Richard Hoare, Professor of Computation at the University of Oxford, England, by Walter Carlson, Chairman of the Awards Committee, at the ACM Annual Conference in Nashville, Tennessee, October 27, 1980. Professor Hoare was selected by the General Technical Achievement Award Committee for his fundamental contributions to the definition and design of programming languages. His work is characterized by an unusual combination of insight, originality, elegance, and impact. He is best known for his work on axiomatic definitions of programming languages through the use of techniques popularly referred to as axiomatic semantics. He developed ingenious algorithms such as Quicksort and was responsible for inventing and promulgating advanced data structuring techniques in scientific programming languages. He has also made important contributions to operating systems through the study of monitors. His most recent work is on communicating sequential processes. Prior to his appointment to the University of Oxford in 1977, Professor Hoare was Professor of Computer Science at The Queen's University in Belfast, Ireland, from 1968 to 1977 and was a Visiting Professor at Stanford University in 1973. From 1960 to 1968 he held a number of positions with Elliott Brothers, Ltd., England.

Journal ArticleDOI
TL;DR: In this article, several operating system services are examined with a view toward their applicability to support of database management functions, including buffer pool management, file system, scheduling, process management, and interprocess communication.
Abstract: Several operating system services are examined with a view toward their applicability to support of database management functions. These services include buffer pool management; the file system; scheduling, process management, and interprocess communication; and consistency control.

Journal ArticleDOI
TL;DR: Strip trees are a linear interpolation scheme which realizes an important space savings by not representing all the points explicitly; even when the overhead of the tree indexing is added, the storage requirement is comparable to raster representations, which do represent most of the points explicitly.
Abstract: The use of curves to represent two-dimensional structures is an important part of many scientific investigations. For example, geographers use curves extensively to represent map features such as contour lines, roads, and rivers. Circuit layout designers use curves to specify the wiring between circuits. Because of the very large amount of data involved and the need to perform operations on this data efficiently, the representation of such curves is a crucial issue. A hierarchical representation consisting of binary trees with a special datum at each node is described. This datum is called a strip and the tree that contains such data is called a strip tree. Lower levels in the tree correspond to finer resolution representations of the curve. The strip tree structure is a direct consequence of using a special method for digitizing lines and retaining all intermediate steps. This gives several desirable properties. For curves that are well-behaved, intersection and point-membership (for closed curves) calculations can be solved in O(log n) where n is the number of points describing the curve. The curves can be efficiently encoded and displayed at various resolutions. The representation is closed under intersection and union and these operations can be carried out at different resolutions. All these properties depend on the hierarchical tree structure which allows primitive operations to be performed at the lowest possible resolution with great computational time savings. Strip trees are a linear interpolation scheme which realizes an important space savings by not representing all the points explicitly. This means that even when the overhead of the tree indexing is added, the storage requirement is comparable to raster representations which do represent most of the points explicitly.
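
A strip-tree node stores the chord from the first to the last point of its subcurve plus the strip widths on either side; children are obtained by splitting at the point farthest from the chord. A simplified construction sketch for open curves (closed curves and the intersection/union operations of the paper are not handled):

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple
import math

Point = Tuple[float, float]

@dataclass
class StripNode:
    first: Point
    last: Point
    w_left: float                 # strip extent on either side of the chord
    w_right: float
    left: Optional["StripNode"] = None
    right: Optional["StripNode"] = None

def signed_dist(p: Point, a: Point, b: Point) -> float:
    """Signed perpendicular distance of p from the chord a -> b."""
    (ax, ay), (bx, by), (px, py) = a, b, p
    length = math.hypot(bx - ax, by - ay) or 1e-12
    return ((bx - ax) * (py - ay) - (by - ay) * (px - ax)) / length

def build_strip_tree(pts: List[Point], tol: float = 0.0) -> StripNode:
    a, b = pts[0], pts[-1]
    dists = [signed_dist(p, a, b) for p in pts]
    node = StripNode(a, b, w_left=max(0.0, max(dists)), w_right=max(0.0, -min(dists)))
    if len(pts) > 2 and max(node.w_left, node.w_right) > tol:
        split = max(range(1, len(pts) - 1), key=lambda i: abs(dists[i]))
        node.left = build_strip_tree(pts[: split + 1], tol)
        node.right = build_strip_tree(pts[split:], tol)
    return node

curve = [(0, 0), (1, 2), (2, 1.5), (3, 3), (4, 0)]
root = build_strip_tree(curve)
print(root.first, root.last, round(root.w_left, 2), round(root.w_right, 2))
```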

Journal ArticleDOI
TL;DR: System R is an experimental database system designed to demonstrate that the usability advantages of the relational data model can be realized in a system with the complete function and high performance required for everyday production use.
Abstract: System R, an experimental database system, was constructed to demonstrate that the usability advantages of the relational data model can be realized in a system with the complete function and high performance required for everyday production use. This paper describes the three principal phases of the System R project and discusses some of the lessons learned from System R about the design of relational systems and database systems in general.

Journal ArticleDOI
TL;DR: It is shown that although either technique significantly improves security over single encryption, the new technique does not significantly increase security over simple double encryption.
Abstract: Double encryption has been suggested to strengthen the Federal Data Encryption Standard (DES). A recent proposal suggests that using two 56-bit keys but enciphering 3 times (encrypt with a first key, decrypt with a second key, then encrypt with the first key again) increases security over simple double encryption. This paper shows that although either technique significantly improves security over single encryption, the new technique does not significantly increase security over simple double encryption. Cryptanalysis of the 112-bit key requires about 2^56 operations and words of memory, using a chosen plaintext attack. While DES is used as an example, the technique is applicable to any similar cipher.
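
The attack underlying this result is the meet-in-the-middle: encrypt the known plaintext under every candidate first key, store the results, then decrypt the ciphertext under every candidate second key and look for a match, so breaking double encryption costs on the order of two single-key searches rather than their product. A toy demonstration with an invented 8-bit-key, 16-bit-block cipher (nothing below is DES):

```python
def toy_encrypt(key, block):
    """Invented toy cipher: 8-bit key, 16-bit block. Illustrative only."""
    x = block
    for r in range(4):
        x = (x + key * 257 + r) & 0xFFFF
        x = ((x << 3) | (x >> 13)) & 0xFFFF
        x ^= key
    return x

def toy_decrypt(key, block):
    x = block
    for r in reversed(range(4)):
        x ^= key
        x = ((x >> 3) | (x << 13)) & 0xFFFF
        x = (x - key * 257 - r) & 0xFFFF
    return x

def meet_in_the_middle(pairs):
    """Recover (k1, k2) from known (plaintext, ciphertext) pairs where
    c = E(k2, E(k1, m)). Cost ~ 2 * 2^8 trial encryptions, not 2^16."""
    m0, c0 = pairs[0]
    table = {}
    for k1 in range(256):                                   # forward half
        table.setdefault(toy_encrypt(k1, m0), []).append(k1)
    for k2 in range(256):                                   # backward half
        for k1 in table.get(toy_decrypt(k2, c0), []):
            # confirm on the remaining pairs to weed out accidental matches
            if all(toy_encrypt(k2, toy_encrypt(k1, m)) == c for m, c in pairs[1:]):
                return k1, k2
    return None

k1, k2 = 0x3C, 0xA7
pairs = [(m, toy_encrypt(k2, toy_encrypt(k1, m))) for m in (0x0123, 0x4567, 0x89AB)]
print(meet_in_the_middle(pairs), "expected:", (k1, k2))
```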

Journal ArticleDOI
TL;DR: Data from 18-month operational trials of the EIES system indicate that the range of features considered valuable in a computer-based communication system increases with the amount of experience gained by using this medium of communication.
Abstract: Data from 18-month operational trials of the EIES system indicate that the range of features considered valuable in a computer-based communication system increases with the amount of experience gained by using this medium of communication. Simple message systems alone are not likely to satisfy the communications needs of long term, regular users of computerized communications systems. Among the capabilities which long term, regular users find valuable are group conferences, notebooks for text composition, and self-defined commands.

Journal ArticleDOI
TL;DR: Factor analysis resulted in the identification of six problem factors: user knowledge, programmer effectiveness, product quality, programmer time availability, machine requirements, and system reliability, providing new evidence of the importance of the user relationship for system success or failure.
Abstract: The problems of application software maintenance in 487 data processing organizations were surveyed. Factor analysis resulted in the identification of six problem factors: user knowledge, programmer effectiveness, product quality, programmer time availability, machine requirements, and system reliability. User knowledge accounted for about 60 percent of the common problem variance, providing new evidence of the importance of the user relationship for system success or failure. Problems of programmer effectiveness and product quality were greater for older and larger systems and where more effort was spent in corrective maintenance. Larger scale data processing environments were significantly associated with greater problems of programmer effectiveness, but with no other problem factor. Product quality was seen as a lesser problem when certain productivity techniques were used in development.

Journal ArticleDOI
TL;DR: Findings suggest that CIS is a flexible tool that is compatible with a variety of organizational design options and not a cause of design per se.
Abstract: A study of Computer Information Systems and Management (CISM) is described and selected results relating to changes in organizational structure in eight organizations are presented. In five of the organizations no changes in formal structure accompanied the introduction of CIS. Where organizational changes did occur, the existing structure of the organization was usually reinforced. These findings suggest that CIS is a flexible tool that is compatible with a variety of organizational design options and not a cause of design per se.


Journal ArticleDOI
Richard E. Nance
TL;DR: This paper focuses on the coordination of the time and state concepts using "object" as the link and relates the concept of "state-sequenced simulation" to the variations in time flow mechanisms.
Abstract: Time and state descriptions form the core of a simulation model representation. The historical influence of initial application areas and the exigencies of language implementations have created a muddled view of the time and state relationships. As a consequence, users of simulation programming languages work in relative isolation; model development, simulation application, model portability, and the communication of results are inhibited and simulation practice fails to contribute to the recognition of an underlying foundation or integrating structure. A model representation structure has been forged from a small set of basic definitions which carefully distinguish the state and time relationships. This paper focuses on the coordination of the time and state concepts using "object" as the link. In addition to clarifying the relationships, the structure relates the concept of "state-sequenced simulation" to the variations in time flow mechanisms. In conclusion, some speculations are offered regarding alternative algorithms for time flow mechanisms.

Journal ArticleDOI
TL;DR: A methodology is presented for constructing the relationships among model user's risk, model builder's risk, acceptable validity range, sample sizes, and cost of data collection when statistical hypothesis testing is used for validating a simulation model of a real, observable system.
Abstract: A methodology is presented for constructing the relationships among model user's risk, model builder's risk, acceptable validity range, sample sizes, and cost of data collection when statistical hypothesis testing is used for validating a simulation model of a real, observable system. The use of the methodology is illustrated for the use of Hotelling's two-sample T² test in testing the validity of a multivariate stationary response simulation model.
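
For reference, the test at the center of this methodology compares the mean response vectors of the real system and the simulation model with Hotelling's two-sample T² statistic. A minimal sketch using numpy/scipy (the data here are fabricated solely to exercise the function):

```python
import numpy as np
from scipy import stats

def hotelling_two_sample(X, Y):
    """Two-sample Hotelling T^2 test that the mean vectors of X and Y are equal.
    X: n1 x p observations of the real system; Y: n2 x p observations of the model."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n1, p = X.shape
    n2 = Y.shape[0]
    diff = X.mean(axis=0) - Y.mean(axis=0)
    pooled = ((n1 - 1) * np.cov(X, rowvar=False) +
              (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2 / (n1 + n2)) * diff @ np.linalg.solve(pooled, diff)
    f_stat = t2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))
    p_value = stats.f.sf(f_stat, p, n1 + n2 - p - 1)
    return t2, p_value

rng = np.random.default_rng(1)
real = rng.multivariate_normal([5.0, 2.0], [[1.0, 0.3], [0.3, 0.5]], size=40)
model = rng.multivariate_normal([5.1, 2.0], [[1.0, 0.3], [0.3, 0.5]], size=40)
t2, p_value = hotelling_two_sample(real, model)
print(f"T^2 = {t2:.2f}, p = {p_value:.3f}")   # a large p-value gives no evidence of invalidity
```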

Journal ArticleDOI
TL;DR: The acceptability of the components of a simulation study is discussed with respect to the goal of the simulation study, the structure and data of the real system, the parametric model, the model parameter set, the specification of the experimentation, and the existing or conceivable norms of modeling methodology, experimentation technique, simulation methodology, and software engineering.
Abstract: The existing trend of application of computerized simulation studies to large and complex systems necessitates the development of an assessment methodology for simulation studies. The basic concepts and criteria necessary for such an assessment methodology are presented in a systematic way. The proposed framework permits discussion of the concepts and criteria related to the acceptability of the following components of a simulation study: Simulation results, real world and simulated data, parametric model and the values of the model parameters, specification of the experimentation, representation and execution of the computer program, and modeling, experimentation, simulation, and programming methodologies or techniques used. The acceptability of the components of a simulation study is discussed with respect to the goal of the simulation study, the structure and data of the real system, the parametric model, the model parameter set, the specification of the experimentation, and the existing or conceivable norms of modeling methodology, experimentation technique, simulation methodology, and software engineering.

Journal ArticleDOI
TL;DR: Results indicate that both user and interface characteristics influence the use of the system options and the request for information in the problem-solving task.
Abstract: An exploratory study was conducted to analyze whether interface and user characteristics affect decision effectiveness and subject behavior in an interactive human/computer problem-solving environment. The dependent variables were performance and the use of the system options. Two of the independent variables examined, experience and cognitive style, were user characteristics; the other three, dialogue, command, and default types, were interface characteristics. Results indicate that both user and interface characteristics influence the use of the system options and the request for information in the problem-solving task.

Journal ArticleDOI
TL;DR: An algorithm is presented that finds all the straight lines that pierce each data range, and it is shown that the half-plane intersection algorithm of Shamos and Hoey can be improved in this special case to O(n).
Abstract: Applications often require fitting straight lines to data that is input incrementally. The case considered is where a data range [a_k, b_k] is received at each t_k; an algorithm is presented that finds all the straight lines that pierce each data range, and it is shown that the half-plane intersection algorithm of Shamos and Hoey can be improved in this special case to O(n).
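
Each range contributes two linear constraints on the slope m and intercept c of a candidate line (a_k ≤ m·t_k + c ≤ b_k), so the piercing lines form a convex set in (m, c) space. The paper constructs that entire set; the sketch below only checks whether it is non-empty, via a linear-programming feasibility test, which is enough to show the formulation (scipy is assumed; the data are invented):

```python
import numpy as np
from scipy.optimize import linprog

def piercing_line_exists(t, lower, upper):
    """Is there a line y = m*t + c with lower[k] <= m*t[k] + c <= upper[k] for all k?
    Feasibility check only; the paper's algorithm computes the full solution set."""
    t = np.asarray(t, float)
    # Constraints A @ [m, c] <= b:  m*t_k + c <= upper_k  and  -(m*t_k + c) <= -lower_k
    A = np.vstack([np.column_stack([t, np.ones_like(t)]),
                   np.column_stack([-t, -np.ones_like(t)])])
    b = np.concatenate([np.asarray(upper, float), -np.asarray(lower, float)])
    res = linprog(c=[0.0, 0.0], A_ub=A, b_ub=b,
                  bounds=[(None, None), (None, None)], method="highs")
    return res.success

t = [1, 2, 3, 4]
print(piercing_line_exists(t, lower=[0.8, 1.8, 2.7, 3.9], upper=[1.2, 2.2, 3.3, 4.1]))  # True (y ~ t)
print(piercing_line_exists(t, lower=[0.0, 5.0, 0.0, 5.0], upper=[0.1, 5.1, 0.1, 5.1]))  # False
```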

Journal ArticleDOI
TL;DR: A method is presented for building minimal perfect hash functions, i.e., functions which allow single probe retrieval from minimally sized tables of identifier sets, and a proof of existence for minimalperfect hash functions of a special type (reciprocal type) is given.
Abstract: A method is presented for building minimal perfect hash functions, i.e., functions which allow single probe retrieval from minimally sized tables of identifier sets. A proof of existence for minimal perfect hash functions of a special type (reciprocal type) is given. Two algorithms for determining hash functions of reciprocal type are presented and their practical limitations are discussed. Further, some application results are given and compared with those of earlier approaches for perfect hashing.
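
To make "minimal perfect" concrete: n keys must map one-to-one onto table slots 0..n-1. The sketch below brute-forces a constant C for a reciprocal-style function h(k) = (C // k) mod n over a small set of integer keys; it illustrates the property being sought, not the paper's algorithms, and the search is only workable for very small key sets.

```python
def find_reciprocal_constant(keys, max_c=2_000_000):
    """Search for C such that h(k) = (C // k) % n maps the keys one-to-one
    onto 0..n-1 (a minimal perfect hash). Brute force; small key sets only."""
    n = len(keys)
    for c in range(1, max_c):
        if len({(c // k) % n for k in keys}) == n:
            return c
    return None

keys = [3, 7, 11, 19, 24, 35]          # e.g., small integer identifiers
c = find_reciprocal_constant(keys)
if c is None:
    print("no constant found in range")
else:
    print("C =", c)
    print({k: (c // k) % len(keys) for k in keys})   # each key gets a distinct slot in 0..n-1
```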

Journal ArticleDOI
TL;DR: This work develops and evaluates (replacement) algorithms for the selection of files to be moved from disk to mass storage and finds that algorithms based on both the file size and the time since the file was last used work well.
Abstract: The steady increase in the power and complexity of modern computer systems has encouraged the implementation of automatic file migration systems which move files dynamically between mass storage devices and disk in response to user reference patterns. Using information describing 13 months of user disk data set file references, we develop and evaluate (replacement) algorithms for the selection of files to be moved from disk to mass storage. Our approach is general and demonstrates a general methodology for this type of problem. We find that algorithms based on both the file size and the time since the file was last used work well. The best realizable algorithms tested condition on the empirical distribution of the times between file references. Acceptable results are also obtained by selecting for replacement that file whose size times time to most recent reference is maximal. Comparisons are made with a number of standard algorithms developed for paging, such as Working Set, VMIN, and GOPT. Sufficient information (parameter values, fitted equations) is provided so that our algorithms may be easily implemented on other systems.
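
One of the well-performing rules reported above is simple to implement: when space is needed, migrate the file whose product of size and time since last reference is largest. A small sketch of that selection rule (the file metadata is invented for the example):

```python
import time
from dataclasses import dataclass

@dataclass
class FileInfo:
    name: str
    size_kb: int
    last_used: float    # seconds since the epoch

def pick_migration_victims(files, space_needed_kb, now=None):
    """Choose files to move from disk to mass storage, largest
    size * time-since-last-reference ('space-time product') first."""
    now = now or time.time()
    ranked = sorted(files, key=lambda f: f.size_kb * (now - f.last_used), reverse=True)
    victims, freed = [], 0
    for f in ranked:
        if freed >= space_needed_kb:
            break
        victims.append(f)
        freed += f.size_kb
    return victims

now = time.time()
catalog = [
    FileInfo("old_big.dat",   size_kb=9000, last_used=now - 90 * 86400),
    FileInfo("old_small.txt", size_kb=20,   last_used=now - 200 * 86400),
    FileInfo("fresh_big.db",  size_kb=8000, last_used=now - 1 * 86400),
]
print([f.name for f in pick_migration_victims(catalog, space_needed_kb=9000)])
# ['old_big.dat'] -- large, long-unreferenced files are migrated first
```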

Journal ArticleDOI
TL;DR: New analytical and empirical results for the performance of future event set algorithms in discrete event simulation are presented, providing clear insight into the factors affecting algorithm performance, permitting evaluation of the hold model, and determining the best algorithm(s) to use.
Abstract: New analytical and empirical results for the performance of future event set algorithms in discrete event simulation are presented. These results provide clear insight into the factors affecting algorithm performance, permit evaluation of the hold model, and determine the best algorithm(s) to use. The analytical results include a classification of distributions for efficient insertion scanning of a linear structure. In addition, it is shown that when more than one distribution is present, there is generally an increase in the probability that new insertions will have smaller times than those in the future event set. Twelve algorithms, including most of those recently proposed, were empirically evaluated using primarily simulation models. Of the twelve tested, four performed well, three performed fairly, and five performed poorly.
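
The hold model drives a future event set with repeated hold operations: remove the earliest event, then insert a new one at a random future time. A small sketch contrasting a sorted-list implementation with a binary heap under that workload (the two structures and all parameters are generic illustrations, not the twelve algorithms evaluated in the paper):

```python
import bisect, heapq, random, time

def hold_sorted_list(n_events, n_holds, rng):
    """Future event set as a sorted list: O(n) work per hold."""
    events = sorted(rng.random() for _ in range(n_events))
    for _ in range(n_holds):
        t = events.pop(0)                                    # remove earliest event
        bisect.insort(events, t + rng.expovariate(1.0))      # schedule a new one
    return events[0]

def hold_heap(n_events, n_holds, rng):
    """Future event set as a binary heap: O(log n) work per hold."""
    events = [rng.random() for _ in range(n_events)]
    heapq.heapify(events)
    for _ in range(n_holds):
        t = heapq.heappop(events)
        heapq.heappush(events, t + rng.expovariate(1.0))
    return events[0]

for impl in (hold_sorted_list, hold_heap):
    rng = random.Random(42)
    start = time.perf_counter()
    impl(n_events=2000, n_holds=10000, rng=rng)
    print(impl.__name__, f"{time.perf_counter() - start:.3f}s")
```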

Journal ArticleDOI
TL;DR: A simplification of flow expressions, an extension of the regular expressions designed to model concurrency, is examined, and grammars for all shuffle languages are generated and shown to be context-sensitive.
Abstract: Flow expressions have been proposed as an extension of the regular expressions designed to model concurrency. We examine a simplification of these flow expressions which we call shuffle expressions. We introduce two types of machines to aid in recognizing shuffle languages and show that one such machine may be equivalent to a Petri Net. In addition, closure and containment properties of the related language classes are investigated, and we show that one machine type recognizes at least a restricted class of shuffle languages. Finally, grammars for all shuffle languages are generated, and the shuffle languages are shown to be context-sensitive.
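
The shuffle operation these expressions build on is the set of all interleavings of two strings that preserve the internal order of each. A small enumeration sketch:

```python
from functools import lru_cache

def shuffle(u: str, v: str) -> set:
    """All interleavings of u and v that preserve the order of each string."""
    @lru_cache(maxsize=None)
    def go(i: int, j: int) -> frozenset:
        if i == len(u) and j == len(v):
            return frozenset({""})
        out = set()
        if i < len(u):
            out |= {u[i] + rest for rest in go(i + 1, j)}
        if j < len(v):
            out |= {v[j] + rest for rest in go(i, j + 1)}
        return frozenset(out)
    return set(go(0, 0))

print(sorted(shuffle("ab", "cd")))
# ['abcd', 'acbd', 'acdb', 'cabd', 'cadb', 'cdab'] -- 'a' stays before 'b', 'c' before 'd'
```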

Journal ArticleDOI
TL;DR: A third project organization which lies between the other two in its communication patterns and dissemination of decision-making authority is presented and recommendations are given for selecting one of the three team organizations depending on the task to be performed.
Abstract: The literature recognizes two group structures for managing programming projects: Baker's chief programmer team and Weinberg's egoless team. Although each structure's success in project management can be demonstrated, this success is clearly dependent on the type of programming task undertaken. Here, for the purposes of comparison, a third project organization which lies between the other two in its communication patterns and dissemination of decision-making authority is presented. Recommendations are given for selecting one of the three team organizations depending on the task to be performed.

Journal ArticleDOI
Per Galle
TL;DR: An algorithm which generates all possible rectangular plans on modular grids with congruent cells, subject to constraints on total area, room areas, wall lengths, room adjacencies, and room orientations is described.
Abstract: The combinatorial complexity of most floor plan design problems makes it practically impossible to obtain a systematic knowledge of possible solutions using pencil and paper. The objective of this paper is to contribute to the development of computer methods providing such knowledge for the designer. The paper describes an algorithm which generates all possible rectangular plans on modular grids with congruent cells, subject to constraints on total area, room areas, wall lengths, room adjacencies, and room orientations. To make room sizes regular and limit the solution set only such grids are used which minimize the number of cells in the smallest room. The description is sufficiently detailed to serve as a basis for programming. Test results for a Pascal implementation of the algorithm are reported. Realistic problems of up to ten rooms have been solved in modest lengths of computer time. The results indicate that the approach of exhaustive generation may prove to be more fruitful than generally assumed.