
Showing papers by "Carnegie Mellon University published in 1986"


Journal ArticleDOI
TL;DR: In this paper, the authors present a data structure for representing Boolean functions and an associated set of manipulation algorithms, which have time complexity proportional to the sizes of the graphs being operated on, and hence are quite efficient as long as the graphs do not grow too large.
Abstract: In this paper we present a new data structure for representing Boolean functions and an associated set of manipulation algorithms. Functions are represented by directed, acyclic graphs in a manner similar to the representations introduced by Lee [1] and Akers [2], but with further restrictions on the ordering of decision variables in the graph. Although a function requires, in the worst case, a graph of size exponential in the number of arguments, many of the functions encountered in typical applications have a more reasonable representation. Our algorithms have time complexity proportional to the sizes of the graphs being operated on, and hence are quite efficient as long as the graphs do not grow too large. We present experimental results from applying these algorithms to problems in logic design verification that demonstrate the practicality of our approach.
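The abstract's two key ideas, canonical shared subgraphs and operations bounded by graph size, fit in a small sketch. The following is a toy reduced, ordered BDD in Python (hash-consed nodes plus a memoized apply), assumed for illustration; Bryant's actual data structure and algorithms differ in detail.

```python
# Toy ROBDD sketch: variables are tested in increasing index order; node()
# removes redundant tests and shares identical subgraphs, so each function
# has one canonical graph. apply() is memoized on node pairs, giving time
# proportional to the product of the two graph sizes.

class BDD:
    def __init__(self):
        self.table = {}   # (var, lo, hi) -> canonical node

    def node(self, var, lo, hi):
        if lo == hi:      # both branches equal: the test is redundant
            return lo
        return self.table.setdefault((var, lo, hi), (var, lo, hi))

    def var(self, i):     # the function that is just variable i
        return self.node(i, False, True)

    def apply(self, op, f, g, memo=None):
        """Combine two BDDs with a boolean operator, e.g. lambda a, b: a and b."""
        if memo is None:
            memo = {}
        if isinstance(f, bool) and isinstance(g, bool):
            return op(f, g)
        if (f, g) in memo:
            return memo[f, g]
        fv = f[0] if not isinstance(f, bool) else float("inf")
        gv = g[0] if not isinstance(g, bool) else float("inf")
        v = min(fv, gv)   # branch on the earliest variable tested by either
        f0, f1 = (f[1], f[2]) if fv == v else (f, f)
        g0, g1 = (g[1], g[2]) if gv == v else (g, g)
        memo[f, g] = self.node(v, self.apply(op, f0, g0, memo),
                                  self.apply(op, f1, g1, memo))
        return memo[f, g]

bdd = BDD()
x1, x2, x3 = bdd.var(1), bdd.var(2), bdd.var(3)
f = bdd.apply(lambda a, b: a and b, x1, x2)   # x1 AND x2
g = bdd.apply(lambda a, b: a or b, f, x3)     # (x1 AND x2) OR x3
print(g)   # (1, (3, False, True), (2, (3, False, True), True))
```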

9,021 citations


Journal ArticleDOI
TL;DR: It is argued that this technique can provide a practical alternative to manual proof construction or use of a mechanical theorem prover for verifying many finite-state concurrent systems.
Abstract: We give an efficient procedure for verifying that a finite-state concurrent system meets a specification expressed in a (propositional, branching-time) temporal logic. Our algorithm has complexity linear in both the size of the specification and the size of the global state graph for the concurrent system. We also show how this approach can be adapted to handle fairness. We argue that our technique can provide a practical alternative to manual proof construction or use of a mechanical theorem prover for verifying many finite-state concurrent systems. Experimental results show that state machines with several hundred states can be checked in a matter of seconds.
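The linear-time bound is easiest to see on a single operator. Below is a toy sketch of the labeling step for EF p (backward reachability over an explicit state graph); the paper's procedure handles the full branching-time logic and fairness, so this illustrates the flavor rather than the algorithm itself, and all names are assumed.

```python
# Label every state satisfying EF p: start from the states where p holds
# and repeatedly mark predecessors until a fixed point. Each transition is
# examined a constant number of times, so the pass is linear in the size
# of the state graph, in the spirit of the paper's complexity claim.
from collections import deque

def check_EF(states, transitions, p_states):
    """transitions: list of (s, t) edges; p_states: states where p holds."""
    preds = {s: [] for s in states}
    for s, t in transitions:
        preds[t].append(s)
    labeled = set(p_states)          # states known to satisfy EF p
    worklist = deque(p_states)
    while worklist:
        t = worklist.popleft()
        for s in preds[t]:
            if s not in labeled:
                labeled.add(s)
                worklist.append(s)
    return labeled

# Example: only state 2 satisfies p; EF p holds wherever state 2 is reachable.
print(check_EF([0, 1, 2, 3], [(0, 1), (1, 2), (2, 2), (3, 3)], {2}))  # {0, 1, 2}
```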

3,335 citations


Journal ArticleDOI
TL;DR: The TRACE model, described in detail elsewhere, deals with short segments of real speech, and suggests a mechanism for coping with the fact that the cues to the identity of phonemes vary as a function of context.

2,663 citations


Journal ArticleDOI
TL;DR: It is argued that electronic mail does not simply speed up the exchange of information but leads to the exchange of new information as well, and much of the information conveyed through electronic mail was information that would not have been conveyed through another medium.
Abstract: This paper examines electronic mail in organizational communication. Based on ideas about how social context cues within a communication setting affect information exchange, it argues that electronic mail does not simply speed up the exchange of information but leads to the exchange of new information as well. In a field study in a Fortune 500 company, we used questionnaire data and actual messages to examine electronic mail communication at all levels of the organization. Based on hypotheses from research on social communication, we explored effects of electronic communication related to self-absorption, status equalization, and uninhibited behavior. Consistent with experimental studies, we found that decreasing social context cues has substantial deregulating effects on communication. And we found that much of the information conveyed through electronic mail was information that would not have been conveyed through another medium.

2,452 citations


Journal ArticleDOI
TL;DR: This article describes approximations to the posterior means and variances of positive functions of a real or vector-valued parameter, and to the marginal posterior densities of arbitrary parameters; the approximations can also be used to compute approximate predictive densities.
Abstract: This article describes approximations to the posterior means and variances of positive functions of a real or vector-valued parameter, and to the marginal posterior densities of arbitrary (i.e., not necessarily positive) parameters. These approximations can also be used to compute approximate predictive densities. To apply the proposed method, one only needs to be able to maximize slightly modified likelihood functions and to evaluate the observed information at the maxima. Nevertheless, the resulting approximations are generally as accurate and in some cases more accurate than approximations based on third-order expansions of the likelihood and requiring the evaluation of third derivatives. The approximate marginal posterior densities behave very much like saddle-point approximations for sampling distributions. The principal regularity condition required is that the likelihood times prior be unimodal.
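The central formula is compact enough to state here. The notation below is the standard way this approximation is usually written and is assumed rather than quoted from the article.

```latex
% Sketch of the approximation for a positive function g(\theta). Define
%   n\ell(\theta)   = \log\{\,p(y \mid \theta)\,\pi(\theta)\,\},
%   n\ell^*(\theta) = n\ell(\theta) + \log g(\theta),
% let \hat\theta and \hat\theta^* maximize \ell and \ell^*, and let
% \hat\sigma^2, \hat\sigma^{*2} be the inverses of the negative second
% derivatives (the observed information) at those maxima. Then
\[
  E\{g(\theta) \mid y\} \;\approx\;
  \frac{\hat\sigma^{*}}{\hat\sigma}\,
  \exp\!\bigl( n\,[\,\ell^{*}(\hat\theta^{*}) - \ell(\hat\theta)\,] \bigr),
\]
% which uses only two maximizations and the observed information at the
% maxima, exactly the ingredients the abstract says are required.
```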

2,081 citations


Book
26 Mar 1986
TL;DR: Legged Robots that Balance as discussed by the authors describes the study of physical machines that run and balance on just one leg, including analysis, computer simulation, and laboratory experiments, and reveals that control of such machines is not particularly difficult.
Abstract: This book, by a leading authority on legged locomotion, presents exciting engineering and science, along with fascinating implications for theories of human motor control. It lays fundamental groundwork in legged locomotion, one of the least developed areas of robotics, addressing the possibility of building useful legged robots that run and balance. The book describes the study of physical machines that run and balance on just one leg, including analysis, computer simulation, and laboratory experiments. Contrary to expectations, it reveals that control of such machines is not particularly difficult. It describes how the principles of locomotion discovered with one leg can be extended to systems with several legs and reports preliminary experiments with a quadruped machine that runs using these principles. Raibert's work is unique in its emphasis on dynamics and active balance, aspects of the problem that have played a minor role in most previous work. His studies focus on the central issues of balance and dynamic control, while avoiding several problems that have dominated previous research on legged machines. Marc Raibert is Associate Professor of Computer Science and Robotics at Carnegie-Mellon University and on the editorial board of the MIT Press journal "Robotics Research." "Legged Robots That Balance" is the fifteenth volume in the Artificial Intelligence Series, edited by Patrick Winston and Michael Brady.

2,044 citations



Journal ArticleDOI
TL;DR: An outer-approximation algorithm is presented for solving mixed-integer nonlinear programming problems of a particular class and a theoretical comparison with generalized Benders decomposition is presented on the lower bounds predicted by the relaxed master programs.
Abstract: An outer-approximation algorithm is presented for solving mixed-integer nonlinear programming problems of a particular class. Linearity of the integer (or discrete) variables, and convexity of the nonlinear functions involving continuous variables are the main features in the underlying mathematical structure. Based on principles of decomposition, outer-approximation and relaxation, the proposed algorithm effectively exploits the structure of the problems, and consists of solving an alternating finite sequence of nonlinear programming subproblems and relaxed versions of a mixed-integer linear master program. Convergence and optimality properties of the algorithm are presented, as well as a general discussion on its implementation. Numerical results are reported for several example problems to illustrate the potential of the proposed algorithm for programs in the class addressed in this paper. Finally, a theoretical comparison with generalized Benders decomposition is presented on the lower bounds predicted by the relaxed master programs.
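The structure the abstract appeals to can be sketched in two displays; the notation is the standard way outer approximation is written and is assumed here, not copied from the paper.

```latex
% The MINLP class: f and g convex and differentiable, the binary variables
% y entering linearly (standard OA notation, assumed).
\[
  \min_{x,\,y}\; c^{\top}y + f(x)
  \quad\text{s.t.}\quad g(x) + By \le 0,\quad x \in X,\quad y \in \{0,1\}^{m}.
\]
% Given solutions x^{1},\dots,x^{K} of NLP subproblems (each solved with y
% fixed), the relaxed master is a MILP built from supporting hyperplanes:
\[
  \min_{x,\,y,\,\eta}\; c^{\top}y + \eta
  \quad\text{s.t.}\quad
  \begin{aligned}[t]
    &\eta \ge f(x^{k}) + \nabla f(x^{k})^{\top}(x - x^{k}),\\
    &g(x^{k}) + \nabla g(x^{k})^{\top}(x - x^{k}) + By \le 0,
      \qquad k = 1,\dots,K.
  \end{aligned}
\]
% By convexity each linearization underestimates, so the master's optimal
% value is a valid, nondecreasing lower bound; the alternation stops when
% it meets the best subproblem value found so far.
```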

1,258 citations


Journal ArticleDOI
TL;DR: This article explored the effects of computer-mediated communication on communication efficiency, participation, interpersonal behavior, and group choice, and found that when groups were linked by computer, group members made fewer remarks than they did face-to-face and took longer to make their group decisions.

1,241 citations


Journal ArticleDOI
TL;DR: Examining how optimists differ from pessimists in the kinds of coping strategies that they use revealed modest but reliable positive correlations between optimism and problem-focused coping, seeking of social support, and emphasizing positive aspects of the stressful situation.
Abstract: Previous research has shown that dispositional optimism is a prospective predictor of successful adaptation to stressful encounters. In this research we attempted to identify possible mechanisms underlying these effects by examining how optimists differ from pessimists in the kinds of coping strategies that they use. The results of two separate studies revealed modest but reliable positive correlations between optimism and problem-focused coping, seeking of social support, and emphasizing positive aspects of the stressful situation. Pessimism was associated with denial and distancing (Study 1), with focusing on stressful feelings, and with disengagement from the goal with which the stressor was interfering (Study 2). Study 1 also found a positive association between optimism and acceptance/resignation, but only when the event was construed as uncontrollable. Discussion centers on the implications of these findings for understanding the meaning of people's coping efforts in stressful circumstances.

1,222 citations


Journal ArticleDOI
TL;DR: This work evaluates, by means of mathematical programming formulations, the relative technical and scale efficiencies of decision making units (DMUs) when some of the inputs or outputs are exogenously fixed and beyond the discretionary control of DMU managers.
Abstract: We evaluate, by means of mathematical programming formulations, the relative technical and scale efficiencies of decision making units (DMUs) when some of the inputs or outputs are exogenously fixed and beyond the discretionary control of DMU managers. This approach further develops the work on efficiency evaluation and on estimation of efficient production frontiers known as data envelopment analysis (DEA). We also employ the model to provide efficient input and output targets for DMU managers in a way that specifically accounts for the fixed nature of some of the inputs or outputs. We illustrate the approach, using real data, for a network of fast food restaurants.
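The modification is easy to see in the envelopment form. The display below uses standard DEA notation and sketches the general idea; it is assumed rather than taken verbatim from the paper.

```latex
% Input-oriented evaluation of unit 0 among units j = 1,...,n, with
% discretionary inputs i \in D and exogenously fixed inputs i \in N:
\[
  \min_{\theta,\,\lambda}\; \theta
  \quad\text{s.t.}\quad
  \begin{aligned}[t]
    &\textstyle\sum_{j}\lambda_j x_{ij} \le \theta\,x_{i0}, && i \in D,\\
    &\textstyle\sum_{j}\lambda_j x_{ij} \le x_{i0},          && i \in N,\\
    &\textstyle\sum_{j}\lambda_j y_{rj} \ge y_{r0},          && \text{all outputs } r,\\
    &\textstyle\sum_{j}\lambda_j = 1, \qquad \lambda_j \ge 0.
  \end{aligned}
\]
% Only the discretionary inputs are scaled by \theta, so the efficiency
% score never asks managers to contract inputs they cannot control.
```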



Journal ArticleDOI
TL;DR: This work proposes the paradigm of recognizing objects while locating them, realized as a prediction and verification scheme that makes efficient use of the shape representation; both the representation and the matching algorithm are general and can be used for other types of data, such as ultrasound, stereo, and tactile.
Abstract: The problem of recognizing and locating rigid objects in 3-D space is important for applications of robotics and navigation. We analyze the task requirements in terms of what information needs to be represented, how to represent it, what kind of paradigms can be used to process it, and how to implement the paradigms. We describe shape surfaces by curves and patches, which we represent by linear primitives, such as points, lines, and planes. Next we describe algorithms to construct this representation from range data. We then propose the paradigm of recognizing objects while locating them. We analyze the basic constraint of rigidity that can be exploited, which we implement as a prediction and verification scheme that makes efficient use of the representation. Results are presented for data obtained from a laser range finder, but both the shape representation and the matching algorithm are general and can be used for other types of data, such as ultrasound, stereo, and tactile.
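The rigidity constraint the abstract mentions has a simple concrete reading: a rigid motion preserves pairwise distances between features, so candidate correspondences can be pruned before any pose is computed. The toy check below illustrates that idea only; it is not the paper's matcher, and all names are assumptions.

```python
# Prune a correspondence hypothesis by rigidity: if two matched feature
# pairs imply different model/scene distances, no rigid transform can
# explain both, so the hypothesis is rejected without estimating a pose.
import math

def rigidity_consistent(model_pts, scene_pts, pairs, tol=1e-6):
    """pairs: list of (model_index, scene_index) candidate matches."""
    for a in range(len(pairs)):
        for b in range(a + 1, len(pairs)):
            (mi, si), (mj, sj) = pairs[a], pairs[b]
            if abs(math.dist(model_pts[mi], model_pts[mj])
                   - math.dist(scene_pts[si], scene_pts[sj])) > tol:
                return False
    return True

model = [(0, 0), (1, 0), (0, 2)]
scene = [(5, 5), (5, 6), (3, 5)]   # the model rotated 90 degrees, translated
print(rigidity_consistent(model, scene, [(0, 0), (1, 1), (2, 2)]))  # True
print(rigidity_consistent(model, scene, [(0, 0), (1, 2), (2, 1)]))  # False
```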

Proceedings ArticleDOI
01 Nov 1986
TL;DR: This paper develops simple, systematic, and efficient techniques for making linked data structures persistent, and uses them to devise persistent forms of binary search trees with logarithmic access, insertion, and deletion times and O(1) space bounds for insertion and deletion.
Abstract: This paper is a study of persistence in data structures. Ordinary data structures are ephemeral in the sense that a change to the structure destroys the old version, leaving only the new version available for use. In contrast, a persistent structure allows access to any version, old or new, at any time. We develop simple, systematic, and efficient techniques for making linked data structures persistent. We use our techniques to devise persistent forms of binary search trees with logarithmic access, insertion, and deletion times and O(1) space bounds for insertion and deletion.
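Path copying is the simplest way to make the abstract's idea concrete: copying only the search path of each update leaves every old root pointing at an intact old version. The sketch below is that simpler technique, with O(log n) space per insertion; the paper's node-copying method is what brings the space bound down to O(1).

```python
# Persistent (functional) binary search tree via path copying: insert()
# returns a new root and never mutates existing nodes, so every previous
# version of the tree remains fully usable.

class Node:
    __slots__ = ("key", "left", "right")
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def insert(root, key):
    """Return the root of a new version; the old version stays intact."""
    if root is None:
        return Node(key)
    if key < root.key:
        return Node(root.key, insert(root.left, key), root.right)
    if key > root.key:
        return Node(root.key, root.left, insert(root.right, key))
    return root                       # key already present: share as-is

def contains(root, key):
    while root is not None:
        if key == root.key:
            return True
        root = root.left if key < root.key else root.right
    return False

# Every version remains queryable after later updates:
v0 = None
v1 = insert(v0, 2)
v2 = insert(v1, 1)
v3 = insert(v2, 3)
print(contains(v1, 1), contains(v2, 1), contains(v3, 3))  # False True True
```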

Journal ArticleDOI
TL;DR: The simple assembly line balancing problem (SALBP) as discussed by the authors is a deterministic optimization problem where all input parameters are assumed to be known with certainty and all the algorithms discussed are exact.
Abstract: In this survey paper we discuss the development of the simple assembly line balancing problem (SALBP); modifications and generalizations over time; present alternate 0-1 programming formulations and a general integer programming formulation of the problem; discuss other well-known problems related to SALBP; describe and comment on a number of exact (i.e., optimum-seeking) methods; and present a summary of the reported computational experiences. All models discussed here are deterministic (i.e., all input parameters are assumed to be known with certainty) and all the algorithms discussed are exact. The problem is termed "simple" in the sense that no "mixed-models," "subassembly lines," "zoning restrictions," etc. are considered. Due to the richness of the literature, we exclude from discussion here (a) the inexact (i.e., heuristic/approximate) algorithms for SALBP and (b) the algorithms for the general assembly line balancing problem, including the stochastic models.
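For concreteness, here is one common 0-1 formulation of the basic version the survey covers (minimize the number of stations for a given cycle time). The notation is generic and assumed; it is not necessarily any of the paper's formulations.

```latex
% x_{ik} = 1 if task i (duration t_i) is assigned to station k;
% y_k = 1 if station k is opened; c is the cycle time; i \to j denotes
% a precedence relation.
\[
  \min \sum_{k} y_k
  \quad\text{s.t.}\quad
  \begin{aligned}[t]
    &\textstyle\sum_{k} x_{ik} = 1 && \text{for each task } i,\\
    &\textstyle\sum_{i} t_i\,x_{ik} \le c\,y_k && \text{for each station } k,\\
    &\textstyle\sum_{k} k\,x_{ik} \le \textstyle\sum_{k} k\,x_{jk} && \text{for each precedence } i \to j,\\
    &x_{ik},\,y_k \in \{0,1\}.
  \end{aligned}
\]
```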

Journal ArticleDOI
TL;DR: These economic figures provide a lower-bound estimate of the full economic burden of major depression and further emphasize the need for timely recognition and treatment to potentially minimize the negative impact of the illness on society.

Proceedings Article
11 Aug 1986
TL;DR: A compact representation of all possible assembly plans of a given product using AND/OR graphs is presented, which forms the basis for efficient planning algorithms which enable an increase in assembly system flexibility by allowing an intelligent robot to pick a course of action according to instantaneous conditions.
Abstract: This paper presents a compact representation of all possible assembly plans of a given product using AND/OR graphs. Such a representation forms the basis for efficient planning algorithms which enable an increase in assembly system flexibility by allowing an intelligent robot to pick a course of action according to instantaneous conditions. Two applications are discussed: the selection of the best assembly plan (off-line planning), and opportunistic scheduling (on-line planning). An example of an assembly with four parts illustrates the use of the AND/OR graph representation to find the best assembly plan based on weighing of operations according to complexity of manipulation and stability of subassemblies. In practice, a generic search algorithm, such as AO*, may be used to find this plan. The scheduling efficiency using this representation is compared to fixed sequence and precedence graph representations. The AND/OR graph consistently reduces the average number of operations.
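The representation and the best-plan search can be sketched very compactly. The example below is a hypothetical four-part product with made-up operation costs, and the search is plain dynamic programming rather than the AO* algorithm the abstract mentions; everything here is an illustrative assumption.

```python
# AND/OR graph of assembly plans: each assembly maps to the alternative
# ways of producing it, each alternative being an AND arc that joins two
# subassemblies at some operation cost. The cheapest plan for the product
# minimizes cost recursively over the alternatives.
from functools import lru_cache

AND_OR = {                              # hypothetical product "abcd"
    "abcd": [("ab", "cd", 2.0), ("abc", "d", 1.5)],
    "abc":  [("ab", "c", 1.0)],
    "ab":   [("a", "b", 1.0)],
    "cd":   [("c", "d", 3.0)],
}

@lru_cache(maxsize=None)
def best_cost(assembly):
    if assembly not in AND_OR:          # a single part: no operation needed
        return 0.0
    return min(best_cost(l) + best_cost(r) + cost
               for l, r, cost in AND_OR[assembly])

print(best_cost("abcd"))   # 3.5: build "ab", join "c", then add "d"
```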

Journal ArticleDOI
TL;DR: The origins of Andrew are traced, its goals and strategies are discussed, and an overview of the current status of its implementation and usage is given.
Abstract: The Information Technology Center (ITC), a collaborative effort between IBM and Carnegie-Mellon University, is in the process of creating Andrew, a prototype computing and communication system for universities. This article traces the origins of Andrew, discusses its goals and strategies, and gives an overview of the current status of its implementation and usage.


Book
01 May 1986
TL;DR: The last three chapters of the book, as mentioned in this paper, provide an introduction to type theory (higher-order logic) and show how various mathematical concepts can be formalized in this very expressive formal language.
Abstract: This introduction to mathematical logic starts with propositional calculus and first-order logic. Topics covered include syntax, semantics, soundness, completeness, independence, normal forms, vertical paths through negation normal formulas, compactness, Smullyan's Unifying Principle, natural deduction, cut-elimination, semantic tableaux, Skolemization, Herbrand's Theorem, unification, duality, interpolation, and definability. The last three chapters of the book provide an introduction to type theory (higher-order logic). It is shown how various mathematical concepts can be formalized in this very expressive formal language. This expressive notation facilitates proofs of the classical incompleteness and undecidability theorems which are very elegant and easy to understand. The discussion of semantics makes clear the distinction between standard and nonstandard models, which is so important in understanding puzzling phenomena such as the incompleteness theorems and Skolem's Paradox about countable models of set theory. Some of the numerous exercises require giving formal proofs. A computer program called ETPS, which is available from the web, facilitates doing and checking such exercises. Audience: This volume will be of interest to mathematicians, computer scientists, and philosophers in universities, as well as to computer scientists in industry who wish to use higher-order logic for hardware and software specification and verification.

Journal ArticleDOI
TL;DR: Kiesler et al. as mentioned in this paper conducted an experimental sample survey on health attitudes, behaviors, and personal traits using two forms of administration: electronic and paper mail.
Abstract: This report examines the electronic survey as a research tool. In an electronic survey, respondents use a text processing program to self-administer a computer-based questionnaire. As more people have access to computers, electronic surveys may become widespread. The electronic survey can reduce processing costs because it automates the transformation of raw data into computer-readable form. It can combine advantages of interviews (e.g., prompts, complex branching) with those of paper mail surveys (e.g., standardization, anonymity). An important issue is how the electronic survey affects the responses of people who use it. We conducted an experimental sample survey on health attitudes, behaviors, and personal traits using two forms of administration: electronic and paper mail. Closed-end responses in the electronic survey were less socially desirable and tended to be more extreme than were responses in the paper survey. Open-ended responses that could be edited by respondents were relatively long and disclosing. These findings are consistent with other research on computer-mediated communication, raising general issues about using computers to collect self-report data.

Journal ArticleDOI
TL;DR: The paper treats the cases in which the categorical variable is either controllable or uncontrollable by the manager, for both technical and scale inefficiency, and the approach is illustrated using real data.
Abstract: Data Envelopment Analysis has now been extensively applied in a range of empirical settings to identify relative inefficiencies and provide targets for improvements. It accomplishes this by developing peer groups for each unit being operated. The use of categorical variables is an important extension which can improve the peer group construction process and incorporate "on-off" characteristics (e.g., the presence or absence of a drive-in window in a banking network). It relaxes the stringent need for factors to display piecewise constant marginal productivities. In so doing, it substantially strengthens the credibility of the insights obtained. The paper treats the cases in which the categorical variable is controllable or uncontrollable by the manager, for both technical and scale inefficiency. The approach is illustrated using real data.

Journal ArticleDOI
TL;DR: In this article, a theoretical exploration of the mechanics of pushing is presented and applied to the analysis and synthesis of robotic manipulator operations.
Abstract: Pushing is an essential component of many manipulator operations. This paper presents a theoretical exploration of the mechanics of pushing and demonstrates application of the theory to analysis and synthesis of robotic manipulator operations.

Journal ArticleDOI
TL;DR: These results on increased permeability and increased diffusivity in tumors provide a rational basis for the use of large-molecular-weight agents in the detection and treatment of solid tumors.

Journal ArticleDOI
TL;DR: This paper presents a unifying procedure, called Facet, for the automated synthesis of data paths at the register-transfer level that minimizes the number of storage elements, data operators, and interconnection units.
Abstract: This paper presents a unifying procedure, called Facet, for the automated synthesis of data paths at the register-transfer level. The procedure minimizes the number of storage elements, data operators, and interconnection units. A design generator named Emerald, based on Facet, was developed and implemented to facilitate extensive experiments with the methodology. The input to the design generator is a behavioral description which is viewed as a code sequence. Emerald provides mechanisms for interactively manipulating the code sequence. Different forms of the code sequence are mapped into data paths of different cost and speed. Data paths for the behavioral descriptions of the AM2910, the AM2901, and the IBM System/370 were produced and analyzed. Designs for the AM2910 and the AM2901 are compared with commercial designs. Overall, the total number of gates required for Emerald's designs is about 15 percent more than the commercial designs. The design space spanned by the behavioral specification of the AM2901 is extensively explored.
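The abstract does not spell out how storage minimization is done, so the sketch below shows one classic technique from this problem area rather than Facet's own procedure: values whose lifetimes do not overlap can share a register, and a greedy left-edge pass over sorted lifetimes uses the minimum number of registers for interval lifetimes. All names are assumptions.

```python
# Greedy register sharing over value lifetimes (left-edge style): sort
# values by the control step where they are written, reuse the earliest-
# freed register when possible, otherwise open a new one.
import heapq

def allocate_registers(lifetimes):
    """lifetimes: (start, end) control steps per value -> register index."""
    order = sorted(range(len(lifetimes)), key=lambda i: lifetimes[i][0])
    free = []                        # heap of (step when freed, register)
    next_reg = 0
    assignment = [None] * len(lifetimes)
    for i in order:
        start, end = lifetimes[i]
        if free and free[0][0] <= start:     # a register is free by now
            _, reg = heapq.heappop(free)
        else:                                # must open a new register
            reg, next_reg = next_reg, next_reg + 1
        assignment[i] = reg
        heapq.heappush(free, (end, reg))
    return assignment

# Values 0 and 1 overlap; value 2 starts as value 0 ends and reuses it.
print(allocate_registers([(0, 2), (1, 3), (2, 4)]))  # [0, 1, 0]
```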

Journal ArticleDOI
TL;DR: In this article, the idea of Tukey's one degree of freedom for nonadditivity test is generalized to the time series setting and the case of concurrent nonlinearity is discussed in detail.
Abstract: This paper considers two nonlinearity tests for stationary time series. The idea of Tukey's one degree of freedom for nonadditivity test is generalized to the time series setting. The case of concurrent nonlinearity is discussed in detail. Simulation results show that the proposed tests are more powerful than that of Keenan (1985).
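The general shape of such a test is worth recording, though the display below follows the usual textbook presentation and is assumed rather than quoted from the paper (in particular, the exact degrees of freedom may differ in detail).

```latex
% Fit an AR(p) model to y_t and keep its residuals \hat e_t. Let Z_t
% collect the m = p(p+1)/2 cross products y_{t-i} y_{t-j},
% 1 \le i \le j \le p, and let \hat\epsilon_t be the residuals from
% regressing \hat e_t on the part of Z_t orthogonal to the linear
% regressors. A Tukey-style one-degree-of-freedom idea, generalized to m
% directions, compares
\[
  \hat F \;=\;
  \frac{\bigl(\sum_t \hat e_t^{\,2} - \sum_t \hat\epsilon_t^{\,2}\bigr)/m}
       {\sum_t \hat\epsilon_t^{\,2}\,/\,(n - p - m - 1)}
\]
% with an F(m,\; n-p-m-1) reference distribution; a large value signals
% nonlinearity. Keenan's (1985) test corresponds to a single constructed
% direction (m = 1) based on the squared fitted values.
```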

Journal ArticleDOI
TL;DR: Researchers display confirmation bias when they persevere by revising procedures until obtaining a theory-predicted result, which produces findings that are overgeneralized in avoidable ways, and this in turn hinders successful applications.
Abstract: Researchers display confirmation bias when they persevere by revising procedures until obtaining a theory-predicted result. This strategy produces findings that are overgeneralized in avoidable ways, and this in turn hinders successful applications. (The 40-year history of an attitude-change phenomenon, the sleeper effect, stands as a case in point.) Confirmation bias is an expectable product of theory-centered research strategies, including both the puzzle-solving activity of T. S. Kuhn's "normal science" and, more surprisingly, K. R. Popper's recommended method of falsification seeking. The alternative strategies of condition seeking (identifying limiting conditions for a known finding) and design (discovering conditions that can produce a previously unobtained result) are result-centered; they are directed at producing specified patterns of data rather than at the logically impossible goals of establishing either the truth or falsity of a theory. Result-centered methods are by no means atheoretical. Rather, they oblige resourcefulness in using existing theory and can stimulate novel development of theory.


Journal ArticleDOI
01 Apr 1986
TL;DR: An automatic planner is described that constructs a tilting program, using a simple model of the mechanics of sliding, and it is observed that sensorless motion strategies perform conditional actions using mechanical decisions in place of environmental inquiries.
Abstract: An autonomous robotic manipulator can reduce uncertainty in the locations of objects in either of two ways: by sensing, or by motion strategies. This paper explores the use of motion strategies to eliminate uncertainty, without the use of sensors. The approach is demonstrated within the context of a simple method to orient planar objects. A randomly oriented object is dropped into a tray. When the tray is tilted, the object can slide into walls, along walls, and into corners, sometimes with the effect of reducing the number of possible orientations. For some objects a sequence of tilting operations exists that leaves the object's orientation completely determined. The paper describes an automatic planner that constructs such a tilting program, using a simple model of the mechanics of sliding. The planner has been implemented, the resulting programs have been executed using a tray attached to an industrial manipulator, and sometimes the programs work. The paper also explores the issue of sensorless manipulation, tray-tilting in particular, within the context of a formal framework first described by Lozano-Perez, Mason, and Taylor [1984]. It is observed that sensorless motion strategies perform conditional actions using mechanical decisions in place of environmental inquiries.
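The planner's core idea separates cleanly from the sliding mechanics: track the set of orientations the object might be in, apply each tilt as a deterministic map on that set, and search for a sequence whose final set is a singleton. The sketch below uses a made-up action model over four resting orientations; it illustrates the belief-set search only, not the paper's mechanics-based planner.

```python
# Sensorless orienting as search over orientation sets: a plan succeeds
# when the set of possible orientations collapses to one element, so the
# final orientation is known without any sensing.
from collections import deque

ACTIONS = {                     # hypothetical effect of each tilt
    "tilt_left":  {0: 0, 1: 0, 2: 3, 3: 3},
    "tilt_right": {0: 1, 1: 1, 2: 1, 3: 2},
}

def plan(start=frozenset({0, 1, 2, 3})):
    queue, seen = deque([(start, [])]), {start}
    while queue:
        belief, seq = queue.popleft()
        if len(belief) == 1:
            return seq                       # orientation fully determined
        for name, effect in ACTIONS.items():
            nxt = frozenset(effect[o] for o in belief)
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, seq + [name]))
    return None                              # no sensorless plan exists

print(plan())   # ['tilt_right', 'tilt_right'] for this action model
```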