
Showing papers published by Carnegie Mellon University in 1992


Journal ArticleDOI
TL;DR: The Weighted Histogram Analysis Method (WHAM) as mentioned in this paper is an extension of Ferrenberg and Swendsen's multiple histogram technique for complex biomolecular Hamiltonians.
Abstract: The Weighted Histogram Analysis Method (WHAM), an extension of Ferrenberg and Swendsen's Multiple Histogram Technique, has been applied for the first time on complex biomolecular Hamiltonians. The method is presented here as an extension of the Umbrella Sampling method for free-energy and Potential of Mean Force calculations. This algorithm possesses the following advantages over methods that are currently employed: (1) It provides a built-in estimate of sampling errors thereby yielding objective estimates of the optimal location and length of additional simulations needed to achieve a desired level of precision; (2) it yields the “best” value of free energies by taking into account all the simulations so as to minimize the statistical errors; (3) in addition to optimizing the links between simulations, it also allows multiple overlaps of probability distributions for obtaining better estimates of the free-energy differences. By recasting the Ferrenberg–Swendsen Multiple Histogram equations in a form suitable for molecular mechanics type Hamiltonians, we have demonstrated the feasibility and robustness of this method by applying it to a test problem of the generation of the Potential of Mean Force profile of the pseudorotation phase angle of the sugar ring in deoxyadenosine. © 1992 by John Wiley & Sons, Inc.
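
For readers who want to see the shape of the self-consistent WHAM equations described above, here is a minimal sketch in Python. The array layout, variable names, and convergence criterion are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def wham(hist, bias, n_samples, beta, n_iter=10000, tol=1e-10):
    """Self-consistent WHAM iteration (illustrative sketch).

    hist      : (K, M) histogram counts of the reaction coordinate in each
                of K biased (umbrella) windows over M bins.
    bias      : (K, M) biasing potential w_k evaluated at the bin centers.
    n_samples : (K,) total number of samples collected in each window.
    beta      : 1 / (k_B * T).
    Returns the unbiased bin probabilities, the window free energies f_k,
    and the potential of mean force (up to an additive constant).
    """
    K, M = hist.shape
    f = np.zeros(K)                          # free-energy shift of each window
    counts = hist.sum(axis=0)                # total counts per bin
    for _ in range(n_iter):
        # WHAM estimate: P_j = sum_k h_k(j) / sum_k N_k exp(-beta (w_k(j) - f_k))
        denom = np.sum(n_samples[:, None] * np.exp(beta * (f[:, None] - bias)), axis=0)
        p = counts / denom
        p /= p.sum()
        # update f_k from exp(-beta f_k) = sum_j P_j exp(-beta w_k(j))
        f_new = -np.log(np.sum(p[None, :] * np.exp(-beta * bias), axis=1)) / beta
        f_new -= f_new[0]                    # fix the arbitrary zero of free energy
        if np.max(np.abs(f_new - f)) < tol:
            f = f_new
            break
        f = f_new
    pmf = -np.log(p) / beta                  # potential of mean force profile
    return p, f, pmf
```

Combining the windows this way is what lets the method use all simulations at once and weight them to minimize the statistical error, as the abstract notes.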

5,784 citations


Journal ArticleDOI
TL;DR: A theory of the way working memory capacity constrains comprehension is proposed, which proposes that both processing and storage are mediated by activation and that the total amount of activation available in working memory varies among individuals.
Abstract: A theory of the way working memory capacity constrains comprehension is proposed. The theory proposes that both processing and storage are mediated by activation and that the total amount of activation available in working memory varies among individuals. Individual differences in working memory capacity for language can account for qualitative and quantitative differences among college-age adults in several aspects of language comprehension. One aspect is syntactic modularity: The larger capacity of some individuals permits interaction among syntactic and pragmatic information, so that their syntactic processes are not informationally encapsulated. Another aspect is syntactic ambiguity: The larger capacity of some individuals permits them to maintain multiple interpretations. The theory is instantiated as a production system model in which the amount of activation available to the model affects how it adapts to the transient computational and storage demands that occur in comprehension.
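
The capacity constraint at the heart of the theory can be illustrated with a toy allocation rule; the proportional scaling used below is an assumption made for this sketch and is not the production system model itself.

```python
def allocate_activation(demands, capacity):
    """Toy version of a working-memory capacity constraint: if the total
    activation demanded by concurrent processing and storage exceeds an
    individual's capacity, all demands are scaled back proportionally, so
    computation and maintenance degrade together."""
    total = sum(demands.values())
    scale = min(1.0, capacity / total)
    return {task: demand * scale for task, demand in demands.items()}

demands = {"syntactic_parsing": 3.0, "store_prior_clause": 2.0, "pragmatic_inference": 1.0}
print(allocate_activation(demands, capacity=8.0))  # high capacity: nothing is cut back
print(allocate_activation(demands, capacity=4.0))  # low capacity: everything scaled by 2/3
```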

4,000 citations


Journal ArticleDOI
TL;DR: In this paper, the singular value decomposition (SVD) technique is used to factor the measurement matrix into two matrices which represent object shape and camera rotation respectively, and two of the three translation components are computed in a preprocessing stage.
Abstract: Inferring scene geometry and camera motion from a stream of images is possible in principle, but is an ill-conditioned problem when the objects are distant with respect to their size. We have developed a factorization method that can overcome this difficulty by recovering shape and motion under orthography without computing depth as an intermediate step. An image stream can be represented by the 2FxP measurement matrix of the image coordinates of P points tracked through F frames. We show that under orthographic projection this matrix is of rank 3. Based on this observation, the factorization method uses the singular-value decomposition technique to factor the measurement matrix into two matrices which represent object shape and camera rotation respectively. Two of the three translation components are computed in a preprocessing stage. The method can also handle and obtain a full solution from a partially filled-in measurement matrix that may result from occlusions or tracking failures. The method gives accurate results, and does not introduce smoothing in either shape or motion. We demonstrate this with a series of experiments on laboratory and outdoor image streams, with and without occlusions.
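
The core rank-3 factorization step can be written in a few lines of NumPy. This sketch assumes the image coordinates have already been registered (centroid subtracted, which accounts for two of the three translation components) and omits the metric upgrade that enforces orthonormal camera axes, as well as the handling of missing entries.

```python
import numpy as np

def factor_measurement_matrix(W):
    """Factor a 2F x P registered measurement matrix into motion and shape.

    W stacks the x and y image coordinates of P points tracked over F frames.
    Under orthography W has rank 3, so its best rank-3 approximation splits
    into a 2F x 3 motion (camera rotation) factor and a 3 x P shape factor,
    determined up to an affine ambiguity.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    U3, s3, Vt3 = U[:, :3], s[:3], Vt[:3, :]      # keep the rank-3 part only
    M = U3 * np.sqrt(s3)                          # motion factor (camera rows)
    S = np.sqrt(s3)[:, None] * Vt3                # shape factor (3-D points)
    return M, S
```

In the full method a subsequent step solves for the linear transform that makes the rows of the motion factor behave like orthonormal camera axes.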

2,696 citations


Journal ArticleDOI
TL;DR: In this paper, the authors enumerate a set of discounted utility anomalies analogous to the EU anomalies and propose a model that accounts for the anomalies, as well as other intertemporal choice phenomena incompatible with DU.
Abstract: Research on decision making under uncertainty has been strongly influenced by the documentation of numerous expected utility (EU) anomalies—behaviors that violate the expected utility axioms. The relative lack of progress on the closely related topic of intertemporal choice is partly due to the absence of an analogous set of discounted utility (DU) anomalies. We enumerate a set of DU anomalies analogous to the EU anomalies and propose a model that accounts for the anomalies, as well as other intertemporal choice phenomena incompatible with DU. We discuss implications for savings behavior, estimation of discount rates, and choice framing effects.

2,208 citations


Journal ArticleDOI
TL;DR: The OBDD data structure is described and a number of applications that have been solved by OBDD-based symbolic analysis are surveyed.
Abstract: Ordered Binary-Decision Diagrams (OBDDs) represent Boolean functions as directed acyclic graphs. They form a canonical representation, making testing of functional properties such as satisfiability and equivalence straightforward. A number of operations on Boolean functions can be implemented as graph algorithms on OBDD data structures. Using OBDDs, a wide variety of problems can be solved through symbolic analysis. First, the possible variations in system parameters and operating conditions are encoded with Boolean variables. Then the system is evaluated for all variations by a sequence of OBDD operations. Researchers have thus solved a number of problems in digital-system design, finite-state system analysis, artificial intelligence, and mathematical logic. This paper describes the OBDD data structure and surveys a number of applications that have been solved by OBDD-based symbolic analysis.
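
A stripped-down OBDD illustrates the canonicity and graph-algorithm points made above. The representation below (interned tuples, Python booleans as terminals, a recursive apply without memoization) is a teaching sketch, not an industrial BDD package.

```python
_unique = {}                          # unique table: equivalent functions share a node

def node(var, lo, hi):
    """Create (or reuse) the reduced node testing variable `var`."""
    if lo is hi:                      # redundant test: both branches identical
        return lo
    key = (var, id(lo), id(hi))
    if key not in _unique:
        _unique[key] = (var, lo, hi)
    return _unique[key]

def bdd_var(i):
    """OBDD for the single Boolean variable x_i."""
    return node(i, False, True)

def apply_op(op, f, g):
    """Combine two OBDDs with a Boolean operator via Shannon expansion."""
    if isinstance(f, bool) and isinstance(g, bool):
        return op(f, g)
    fv = f[0] if not isinstance(f, bool) else float("inf")
    gv = g[0] if not isinstance(g, bool) else float("inf")
    v = min(fv, gv)                   # split on the earliest variable in the order
    f0, f1 = (f[1], f[2]) if fv == v else (f, f)
    g0, g1 = (g[1], g[2]) if gv == v else (g, g)
    return node(v, apply_op(op, f0, g0), apply_op(op, f1, g1))

AND = lambda a, b: a and b

x0, x1 = bdd_var(0), bdd_var(1)
f = apply_op(AND, x0, x1)
g = apply_op(AND, x1, x0)
print(f is g)       # True: canonical form makes equivalence a pointer comparison
print(f is False)   # False: f is not the constant-false function, so it is satisfiable
```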

2,196 citations


Journal ArticleDOI
TL;DR: In this article, a review of the beneficial effects of optimism on psychological and physical well-being is presented, focusing on how optimism may lead a person to cope more adaptively with stress.
Abstract: The primary purpose of this paper is to review recent research examining the beneficial effects of optimism on psychological and physical well-being. The review focuses on research that is longitudinal or prospective in design. Potential mechanisms are also identified whereby the beneficial effects of optimism are produced, focusing in particular on how optimism may lead a person to cope more adaptively with stress. The paper closes with a brief consideration of the similarities and differences between our own theoretical approach and several related approaches that have been taken by others.

2,095 citations


Journal ArticleDOI
TL;DR: This paper compares eight reinforcement learning frameworks: adaptive heuristic critic (AHC) learning due to Sutton, Q-learning due to Watkins, and three extensions to each of the basic methods for speeding up learning; the three extensions are experience replay, learning action models for planning, and teaching.
Abstract: To date, reinforcement learning has mostly been studied in the context of solving simple learning tasks. Reinforcement learning methods that have been studied so far typically converge slowly. The purpose of this work is thus two-fold: 1) to investigate the utility of reinforcement learning in solving much more complicated learning tasks than previously studied, and 2) to investigate methods that will speed up reinforcement learning. This paper compares eight reinforcement learning frameworks: adaptive heuristic critic (AHC) learning due to Sutton, Q-learning due to Watkins, and three extensions to both basic methods for speeding up learning. The three extensions are experience replay, learning action models for planning, and teaching. The frameworks were investigated using connectionism as an approach to generalization. To evaluate the performance of different frameworks, a dynamic environment was used as a testbed. The environment is moderately complex and nondeterministic. This paper describes these frameworks and algorithms in detail and presents an empirical evaluation of the frameworks.
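
As a concrete, much simplified illustration of one of the frameworks compared here, Q-learning with experience replay, the sketch below uses a lookup table rather than the connectionist function approximation used in the paper; the env interface (reset, step, actions) is an assumption of the sketch.

```python
import random
from collections import defaultdict, deque

def q_learning_with_replay(env, episodes=500, alpha=0.1, gamma=0.99,
                           epsilon=0.1, batch=32, buffer_len=10_000):
    """Tabular Q-learning augmented with experience replay (sketch).

    Assumes env.reset() -> state, env.step(action) -> (next_state, reward, done),
    and a list of discrete actions in env.actions.
    """
    Q = defaultdict(float)                    # Q[(state, action)]
    buffer = deque(maxlen=buffer_len)         # stored transitions for replay

    def greedy(s):
        return max(env.actions, key=lambda a: Q[(s, a)])

    for _ in range(episodes):
        s, done = env.reset(), False
        while not done:
            a = random.choice(env.actions) if random.random() < epsilon else greedy(s)
            s2, r, done = env.step(a)
            buffer.append((s, a, r, s2, done))
            # replay a batch of remembered transitions in addition to the new one
            for bs, ba, br, bs2, bdone in random.sample(list(buffer), min(batch, len(buffer))):
                target = br if bdone else br + gamma * max(Q[(bs2, x)] for x in env.actions)
                Q[(bs, ba)] += alpha * (target - Q[(bs, ba)])
            s = s2
    return Q
```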

1,691 citations


Journal ArticleDOI
TL;DR: Perfusion images of a freeze-injured rat brain have been obtained, demonstrating the technique's ability to detect regional abnormalities in perfusion.
Abstract: A technique has been developed for proton magnetic resonance imaging (MRI) of perfusion, using water as a freely diffusable tracer, and its application to the measurement of cerebral blood flow (CBF) in the rat is demonstrated. The method involves labeling the inflowing water proton spins in the arterial blood by inverting them continuously at the neck region and observing the effects of inversion on the intensity of brain MRI. Solution to the Bloch equations, modified to include the effects of flow, allows regional perfusion rates to be measured from an image with spin inversion, a control image, and a T1 image. Continuous spin inversion labeling the arterial blood water was accomplished, using principles of adiabatic fast passage by applying continuous-wave radiofrequency power in the presence of a magnetic field gradient in the direction of arterial flow. In the detection slice used to measure perfusion, whole brain CBF averaged 1.39 +/- 0.19 ml.g-1.min-1 (mean +/- SEM, n = 5). The technique's sensitivity to changes in CBF was measured by using graded hypercarbia, a condition that is known to increase brain perfusion. CBF vs. pCO2 data yield a best-fit straight line described by CBF (ml.g-1.min-1) = 0.052pCO2 (mm Hg) - 0.173, in excellent agreement with values in the literature. Finally, perfusion images of a freeze-injured rat brain have been obtained, demonstrating the technique's ability to detect regional abnormalities in perfusion.
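
As a hedged sketch of the modelling step the abstract refers to (the paper's exact notation and constants may differ), the Bloch equation for the brain water magnetization M_b is commonly written with flow terms as

\[
\frac{dM_b}{dt} \;=\; \frac{M_b^{0} - M_b}{T_1} \;+\; f\,M_a \;-\; \frac{f}{\lambda}\,M_b ,
\]

where f is perfusion (CBF), M_a is the arterial magnetization (inverted in the labeled image, at equilibrium in the control image), and λ is the blood–brain partition coefficient of water; solving this at steady state for the labeled image, the control image, and a T1 map yields regional f. The reported calibration line can be read off directly: at pCO2 = 50 mm Hg, for example, it predicts CBF ≈ 0.052 × 50 − 0.173 ≈ 2.43 ml·g⁻¹·min⁻¹.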

1,500 citations


Journal ArticleDOI
TL;DR: A disturbance in the internal representation of contextual information can provide a common explanation for schizophrenic deficits in several attention- and language-related tasks and shows that these behavioral deficits may arise from a disturbance in a model parameter corresponding to the neuromodulatory effects of dopamine.
Abstract: Connectionist models are used to explore the relationship between cognitive deficits and biological abnormalities in schizophrenia. Schizophrenic deficits in tasks that tap attention and language processing are reviewed, as are biological disturbances involving prefrontal cortex and the mesocortical dopamine system. Three computer models are then presented that simulate normal and schizophrenic performance in the Stroop task, the continuous performance test, and a lexical disambiguation task. They demonstrate that a disturbance in the internal representation of contextual information can provide a common explanation for schizophrenic deficits in several attention- and language-related tasks. The models also show that these behavioral deficits may arise from a disturbance in a model parameter (gain) corresponding to the neuromodulatory effects of dopamine, in a model component corresponding to the function of prefrontal cortex.
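
A tiny sketch of the gain manipulation mentioned above: in models of this kind, units typically pass their net input through a logistic function whose slope is scaled by a gain parameter, and the dopamine disturbance is simulated by lowering that gain. The specific function and values below are illustrative assumptions.

```python
import numpy as np

def logistic_activation(net_input, gain=1.0, bias=0.0):
    """Logistic unit with a multiplicative gain on its net input."""
    return 1.0 / (1.0 + np.exp(-(gain * net_input + bias)))

x = np.linspace(-4, 4, 9)
print(np.round(logistic_activation(x, gain=1.0), 2))  # normal gain: sharp response
print(np.round(logistic_activation(x, gain=0.5), 2))  # reduced gain: flattened response,
                                                      # so context exerts weaker influence
```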

1,467 citations


Journal ArticleDOI
TL;DR: This paper shows that disconnected operation is feasible, efficient, and usable by describing its design and implementation in the Coda File System; the central idea is that caching of data, now widely used for performance, can also be exploited to improve availability.
Abstract: Disconnected operation is a mode of operation that enables a client to continue accessing critical data during temporary failures of a shared data repository. An important, though not exclusive, application of disconnected operation is in supporting portable computers. In this paper, we show that disconnected operation is feasible, efficient and usable by describing its design and implementation in the Coda File System. The central idea behind our work is that caching of data, now widely used for performance, can also be exploited to improve availability.
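
The "cache as availability mechanism" idea can be caricatured in a few lines. This is only a toy: Coda itself adds hoard priorities, a persistent replay log, and conflict detection during reintegration.

```python
class DisconnectedClient:
    """Toy client that keeps serving cached files while the server is down."""

    def __init__(self, server):
        self.server = server         # assumed to expose read(name) and write(name, data)
        self.cache = {}              # name -> data, filled while connected
        self.replay_log = []         # updates made while disconnected
        self.connected = True

    def read(self, name):
        if self.connected:
            self.cache[name] = self.server.read(name)   # refresh cache from server
        return self.cache[name]                         # served locally when disconnected

    def write(self, name, data):
        self.cache[name] = data
        if self.connected:
            self.server.write(name, data)
        else:
            self.replay_log.append((name, data))        # remember for reintegration

    def reconnect(self):
        """Reintegrate: replay the logged updates (no conflict handling here)."""
        self.connected = True
        for name, data in self.replay_log:
            self.server.write(name, data)
        self.replay_log.clear()
```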

1,214 citations


01 Jan 1992
TL;DR: The symbolic model checking technique revealed subtle errors in this protocol, resulting from complex execution sequences that would occur with very low probability in random simulation runs, and an alternative method is developed for avoiding the state explosion in the case of asynchronous control circuits.
Abstract: Finite state models of concurrent systems grow exponentially as the number of components of the system increases. This is known widely as the state explosion problem in automatic verification, and has limited finite state verification methods to small systems. To avoid this problem, a method called symbolic model checking is proposed and studied. This method avoids building a state graph by using Boolean formulas to represent sets and relations. A variety of properties characterized by least and greatest fixed points can be verified purely by manipulations of these formulas using Ordered Binary Decision Diagrams. Theoretically, a structural class of sequential circuits is demonstrated whose transition relations can be represented by polynomial space OBDDs, though the number of states is exponential. This result is borne out by experimental results on example circuits and systems. The most complex of these is the cache consistency protocol of a commercial distributed multiprocessor. The symbolic model checking technique revealed subtle errors in this protocol, resulting from complex execution sequences that would occur with very low probability in random simulation runs. In order to model the cache protocol, a language was developed for describing sequential circuits and protocols at various levels of abstraction. This language has a synchronous dataflow semantics, but allows nondeterminism and supports interleaving processes with shared variables. A system called SMV can automatically verify programs in this language with respect to temporal logic formulas, using the symbolic model checking technique. A technique for proving properties of inductively generated classes of finite state systems is also developed. The proof is checked automatically, but requires a user-supplied process called a process invariant to act as an inductive hypothesis. An invariant is developed for the distributed cache protocol, allowing properties of systems with an arbitrary number of processors to be proved. Finally, an alternative method is developed for avoiding the state explosion in the case of asynchronous control circuits. This technique is based on the unfolding of Petri nets, and is used to check for hazards in a distributed mutual exclusion circuit.
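
The fixed-point computations mentioned above have a simple skeleton. In real symbolic model checking the state sets are characteristic functions stored as OBDDs and the image operation is a relational product; the sketch below uses explicit Python sets purely to expose the least-fixed-point structure.

```python
def reachable_states(initial, transitions):
    """Least fixed point R = initial ∪ post(R), computed iteratively."""
    def post(states):                      # image of a set under the transition relation
        return {t for s in states for t in transitions.get(s, ())}

    reached = set(initial)
    frontier = set(initial)
    while frontier:                        # iterate until no new states are found
        frontier = post(frontier) - reached
        reached |= frontier
    return reached

# Tiny example: three states with a cycle between s1 and s2.
print(reachable_states({"s0"}, {"s0": ["s1"], "s1": ["s2"], "s2": ["s1"]}))
```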

Journal ArticleDOI
TL;DR: In this paper, the unconditional exact likelihood function is derived to estimate the parameters of a stationary univariate fractionally integrated time series, which allows the simultaneous estimation of all parameters of the model by exact maximum likelihood.

Book
01 Sep 1992
TL;DR: The Logic of Typed Feature Structures as discussed by the authors is a monograph that brings all the main theoretical ideas into one place where they can be related and compared in a unified setting.
Abstract: For those of us who belonged to the "Bay Area (Computational) Linguistics Community," the early eighties were a heady time. Local researchers working on linguistics, computational linguistics, and logic programming were investigating notions of category, type, feature, term, and partial specification that appeared to converge to a powerful new approach for describing (linguistic) objects and their relationships by monotonic accumulation of constraints between their features. The seed notions had almost independently arisen in generalized phrase structure grammar (GPSG) (Gazdar et al. 1985), lexical-functional grammar (LFG) (Bresnan and Kaplan 1982), functional-unification grammar (FUG) (Kay 1985), logic programming (Colmerauer 1978, Pereira and Warren 1980), and terminological reasoning systems (Ait-Kaci 1984). It took, however, a lot of experimental and theoretical work to identify precisely what the core notions were, how particular systems related to the core notions, and what were the most illuminating mathematical accounts of that core. The development of the unification-based formalism PATR-II (Shieber 1984) was an early step toward the definition of the core, but its mathematical analysis, and the clarification of the connections between the various systems, are only now coming to a reasonable closure. The Logic of Typed Feature Structures is the first monograph that brings all the main theoretical ideas into one place where they can be related and compared in a unified setting. Carpenter's book touches most of the crucial questions of the developments during the decade, provides proofs for central results, and reaches right up to the edge of current research in the field. These contributions alone make it an indispensable compendium for the researcher or graduate student working on constraint-based grammatical formalisms, and they also make it a very useful reference work for researchers in object-oriented databases and logic programming. Having discharged the main obligation of the reviewer of saying who should read the book under review and why, I will now survey each of the book's four parts while raising some more general questions impinging on the whole book as they arise from the discussion of each part.
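
The monotonic accumulation of constraints that the review describes is, at its core, unification of feature structures. The sketch below models untyped feature structures as nested dicts; real typed feature structures add a type hierarchy, structure sharing, and appropriateness conditions that this illustration omits.

```python
def unify(f, g):
    """Unify two simple feature structures, or return None on failure."""
    if isinstance(f, dict) and isinstance(g, dict):
        out = dict(f)
        for feature, value in g.items():
            if feature in out:
                merged = unify(out[feature], value)
                if merged is None:
                    return None              # feature clash: no consistent combination
                out[feature] = merged
            else:
                out[feature] = value         # monotonically add new information
        return out
    return f if f == g else None             # atomic values must agree exactly

# Two compatible partial descriptions merge into one more specific structure.
print(unify({"agr": {"num": "sg"}}, {"agr": {"per": 3}, "cat": "np"}))
# {'agr': {'num': 'sg', 'per': 3}, 'cat': 'np'}
```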

Journal ArticleDOI
TL;DR: In this paper, the performance of a hybrid of density functional theory and Hartree-Fock theory, the B-LYP/HF procedure, has been examined with a variety of basis sets.

Journal ArticleDOI
TL;DR: According to the theory of industrial districts, a new wave of economic growth is being led in a number of regions in Europe, North America and East Asia by spatially concentrated networks of mostly small and medium sized enterprises, often using flexible production technology and characterized by extensive local interfirm linkages as discussed by the authors.
Abstract: HARRISON B. (1992) Industrial districts: old wine in new bottles?, Reg. Studies 26, 469–483. According to the theory of industrial districts, a new wave of economic growth is being led in a number of regions in Europe, North America and East Asia by spatially concentrated networks of mostly small and medium sized enterprises, often using flexible production technology and characterized by extensive local interfirm linkages. Does this amount to a re-emergence of the dominance of what urban and regional economists call ‘agglomeration economies’ over the well-known pressures on business to spatially disperse its operations? Neoclassical economic theorizing from Marshall to Perroux provides one perspective on the contemporary industrial district phenomenon. Another is afforded by Granovetter's more recent elaboration of the ideas of ‘embedding’, ‘under-’ and ‘over-socialization’. Confronting each of these theoretical approaches with the other leads me to conclude that the industrial district prototypes involv...

Journal ArticleDOI
TL;DR: In this paper, the authors show that, compared with a face-to-face meeting, a computer-mediated discussion leads to delays; more explicit and outspoken advocacy; "flaming"; more equal participation among group members; and more extreme, unconventional, or risky decisions.

Journal ArticleDOI
TL;DR: In this paper, a partition of the optimal frontier into three parts corresponding to increasing, constant, and decreasing returns to scale is proposed, characterized in terms of optimal primal and dual solutions for both the original Charnes-Cooper-Rhodes model (1978) and the later Banker-Charnes-Cooper model (1984), relying on concepts developed by R.D. Banker and R.M. Thrall.

Journal ArticleDOI
TL;DR: A review of the book "Groups That Work (And Those That Don't)," by Richard Hackman, is presented in this article.
Abstract: This article presents a review of the book “Groups That Work (And Those That Don't),” by Richard Hackman.

Journal ArticleDOI
TL;DR: The CLP(ℛ) programming language is defined, its underlying philosophy and programming methodology are discussed, important implementation issues are explored in detail, and finally, a prototype interpreter is described.
Abstract: The CLP(ℛ) programming language is defined, its underlying philosophy and programming methodology are discussed, important implementation issues are explored in detail, and finally, a prototype interpreter is described. CLP(ℛ) is designed to be an instance of the Constraint Logic Programming Scheme, a family of rule-based constraint programming languages defined by Jaffar and Lassez. The domain of computation ℛ of this particular instance is the algebraic structure consisting of uninterpreted functors over real numbers. An important property of CLP(ℛ) is that the constraints are treated uniformly in the sense that they are used to specify the input parameters to a program, they are the only primitives used in the execution of a program, and they are used to describe the output of a program. Implementation of a CLP language, and of CLP(ℛ) in particular, raises new problems in the design of a constraint-solver. For example, the constraint solver must be incremental in the sense that solving additional constraints must not entail the resolving of old constraints. In our system, constraints are filtered through an inference engine, an engine/solver interface, an equation solver and an inequality solver. This sequence of modules reflects a classification and prioritization of the classes of constraints. Modules solving higher priority constraints are isolated from the complexities of modules solving lower priority constraints. This multiple-phase solving of constraints, together with a set of associated algorithms, gives rise to a practical system.
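
To make the constraint-store idea concrete, here is a deliberately naive sketch of accepting linear equality constraints one at a time and rejecting any addition that is inconsistent with the store. Unlike CLP(ℛ)'s real solver, it keeps no solved form and simply re-solves the accumulated system, so it illustrates the interface rather than the incremental algorithms discussed above.

```python
import numpy as np

class NaiveConstraintStore:
    """Accepts linear equality constraints c·x = b over n_vars real variables."""

    def __init__(self, n_vars):
        self.A = np.zeros((0, n_vars))
        self.b = np.zeros(0)

    def add(self, coeffs, rhs):
        """Try to add one constraint; return False (keeping the old store) if inconsistent."""
        A = np.vstack([self.A, coeffs])
        b = np.append(self.b, rhs)
        x, *_ = np.linalg.lstsq(A, b, rcond=None)
        if not np.allclose(A @ x, b):
            return False
        self.A, self.b = A, b
        return True

store = NaiveConstraintStore(2)
print(store.add(np.array([1.0, 1.0]), 3.0))    # x + y = 3   -> True
print(store.add(np.array([1.0, -1.0]), 1.0))   # x - y = 1   -> True (forces x=2, y=1)
print(store.add(np.array([2.0, 0.0]), 5.0))    # 2x = 5      -> False, contradicts x=2
```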

Journal ArticleDOI
TL;DR: The SPHINX-II speech recognition system is reviewed and recent efforts on improved speech recognition are summarized.

Journal ArticleDOI
TL;DR: A methodology for representing mental models as maps, extracting these maps from texts, and analyzing and comparing the extracted maps is described, supporting both qualitative and quantitative comparisons of the resulting representations.
Abstract: When making decisions or talking to others, people use mental models of the world to evaluate choices and frame discussions. This paper describes a methodology for representing mental models as maps, extracting these maps from texts, and analyzing and comparing the extracted maps. The methodology employs a set of computer-based tools to analyze written and spoken texts. These tools support textual comparison both in terms of what concepts are present and in terms of what structures of information are present. The methodology supports both qualitative and quantitative comparisons of the resulting representations. This approach is illustrated using data drawn from a larger study of students learning to write, where it is possible to compare the students' mental models with that of the instructor.
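
A miniature version of the text-to-map step might look like the following; the paper's tools involve explicit coding choices about what counts as a concept and a relationship, whereas this sketch simply treats word co-occurrence within a small window as a link.

```python
import re
from collections import Counter

def extract_map(text, window=5):
    """Return a Counter of (concept, concept) links found in the text."""
    tokens = re.findall(r"[a-z]+", text.lower())
    links = Counter()
    for i, w in enumerate(tokens):
        for v in tokens[i + 1:i + window]:   # concepts co-occurring nearby are linked
            if v != w:
                links[tuple(sorted((w, v)))] += 1
    return links

student = extract_map("revision improves the draft; feedback guides revision")
instructor = extract_map("feedback on a draft guides revision")
print(set(student) & set(instructor))        # structural overlap between the two maps
```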

Journal ArticleDOI
TL;DR: In this paper, the authors study the limiting behavior of solutions to appropriately rescaled versions of the Allen-Cahn equation, a simplified model for dynamic phase transitions, and rigorously establish the existence in the limit of a phase-antiphase interface evolving according to mean curvature motion.
Abstract: We study the limiting behavior of solutions to appropriately rescaled versions of the Allen-Cahn equation, a simplified model for dynamic phase transitions. We rigorously establish the existence in the limit of a phase-antiphase interface evolving according to mean curvature motion. This assertion is valid for all positive time, the motion interpreted in the generalized sense of Evans-Spruck and Chen-Giga-Goto after the onset of geometric singularities.
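
For orientation, a commonly used rescaled form of the equation (written here with the standard double-well nonlinearity; the paper's precise scaling may differ) is

\[
\partial_t u^{\varepsilon} \;=\; \Delta u^{\varepsilon} \;-\; \frac{1}{\varepsilon^{2}}\,W'(u^{\varepsilon}),
\qquad W(u) = \tfrac{1}{4}\,\bigl(1 - u^{2}\bigr)^{2},
\]

and as ε → 0 the transition layer where u^ε crosses zero converges to an interface whose normal velocity equals its mean curvature, interpreted in the generalized (level-set) sense once singularities develop.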

Book ChapterDOI
TL;DR: In this article, the pioneering work of Borch and Arrow, the derivation of the optimal insurance contract form from the model, is synthesized and extended.
Abstract: Almost every phase of economic behavior is affected by uncertainty. The economic system has adapted to uncertainty by developing methods that facilitate the reallocation of risk among individuals and firms. The most apparent and familiar form for shifting risks is the ordinary insurance policy. Previous insurance decision analyses can be divided into those in which the insurance policy was exogenously specified (see John Gould, Jan Mossin, and Vernon Smith), and those in which it was not (see Karl Borch, 1960, and Kenneth Arrow, 1971, 1973). In this paper, the pioneering work of Borch and Arrow—the derivation of the optimal insurance contract form from the model—is synthesized and extended.

Journal ArticleDOI
TL;DR: In this article, the authors explored the use of the self-organization and learning capabilities of neural networks in structural damage assessment: a neural network trained to recognize the behavior of the undamaged structure, as well as the behavior of the structure with various possible damage states, should be able to detect any existing damage when presented with measurements of the structural response.
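
A minimal sketch of the train-on-known-states idea, using made-up response vectors and an off-the-shelf multilayer perceptron; the network architecture, features, and data here are stand-ins, not the networks studied in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Hypothetical measurements (e.g. normalized modal quantities) for the intact
# structure and for one simulated damage state that shifts the response.
undamaged = rng.normal(1.00, 0.02, size=(50, 4))
damaged = rng.normal(0.90, 0.02, size=(50, 4))
X = np.vstack([undamaged, damaged])
y = np.array([0] * 50 + [1] * 50)            # 0 = undamaged, 1 = damage state A

net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
new_measurement = rng.normal(0.90, 0.02, size=(1, 4))
print(net.predict(new_measurement))          # expected: [1], i.e. damage detected
```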

Journal ArticleDOI
TL;DR: A tutorial survey is presented of the many composite filter designs proposed for distortion-invariant optical pattern recognition and remarks are made regarding areas for further investigation.
Abstract: A tutorial survey is presented of the many composite filter designs proposed for distortion-invariant optical pattern recognition. Remarks are made throughout regarding areas for further investigation.
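
The simplest member of the composite-filter family surveyed here, the equal-correlation-peak synthetic discriminant function (SDF), can be written down directly; the designs covered in the survey (MACE, MVSDF, and others) modify this construction, and the real-valued toy data below is only for illustration.

```python
import numpy as np

def sdf_filter(training_images, peak_values):
    """Equal-correlation-peak SDF: h = X (X^T X)^{-1} u.

    training_images : (d, N) array, each column a vectorized training view.
    peak_values     : (N,) desired correlation values at the origin.
    """
    X = training_images
    return X @ np.linalg.solve(X.T @ X, peak_values)

X = np.random.default_rng(1).normal(size=(64, 3))   # three distorted training views
h = sdf_filter(X, np.ones(3))
print(np.round(X.T @ h, 6))                         # filter reproduces the unit peaks: [1. 1. 1.]
```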

Journal ArticleDOI
TL;DR: This article found that negotiators' judgments of fair outcomes were biased in an egocentric direction and that the magnitude of the parties' biases strongly predicted the length of strikes; the role of situational complexity was also examined.

01 Nov 1992
TL;DR: In this article, a class of phase-field models for crystallization of a pure substance from its melt are presented, which are based on an entropy functional, and are therefore thermodynamically consistent inasmuch as they guarantee spatially local positive entropy production.
Abstract: In an effort to unify the various phase-field models that have been used to study solidification, we have developed a class of phase-field models for crystallization of a pure substance from its melt. These models are based on an entropy functional, as in the treatment of Penrose and Fife, and are therefore thermodynamically consistent inasmuch as they guarantee spatially local positive entropy production. General conditions are developed to ensure that the phase field takes on constant values in the bulk phases. Specific forms of a phase-field function are chosen to produce two models that bear strong resemblances to the models proposed by Langer and Kobayashi. Our models contain additional nonlinear functions of the phase field that are necessary to guarantee thermodynamic consistency.
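
As a generic sketch of the construction (the notation here is chosen for illustration and may not match the paper), entropy-based phase-field models start from a functional of the internal energy density e and the phase field φ,

\[
\mathcal{S}[e,\varphi] \;=\; \int_{\Omega} \Bigl( s(e,\varphi) \;-\; \tfrac{\varepsilon^{2}}{2}\,|\nabla \varphi|^{2} \Bigr)\, dV ,
\]

and evolve the non-conserved phase field by a gradient flow, τ ∂φ/∂t = δ𝒮/δφ, coupled to an energy balance for e; with suitable choices of the coupling functions, the local entropy production is guaranteed to be non-negative, which is the thermodynamic consistency referred to in the abstract.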

Journal ArticleDOI
TL;DR: This paper reports two experiments exploring how people know whether they have an answer to a question before they actually find it in their memory, in which Ss were trained on relatively novel 2-digit×2-digit arithmetic problems (e.g., 23×27).
Abstract: How do people know whether they have an answer to a question before they actually find it in their memory? We conducted 2 experiments exploring this question, in which Ss were trained on relatively novel 2-digit×2-digit arithmetic problems (e.g., 23×27). Before answering each problem, Ss made a quick feeling-of-knowing judgment as to whether they could directly retrieve the answer from memory or had to compute it.

Journal ArticleDOI
TL;DR: In this article, the authors propose that changes affecting the resource fit between organizations exchanging resources provide an impetus for the dissolution of their relationships, whereas the individual an individual is not affected.
Abstract: In this study, we propose that changes affecting the resource fit between organizations exchanging resources provide an impetus for the dissolution of their relationships, whereas the individual an...

Journal ArticleDOI
TL;DR: A model that explains how the working-memory capacity of a comprehender can constrain syntactic parsing and thereby affect the processing of syntactic ambiguities is proposed.