
Showing papers by "Carnegie Mellon University" published in 1980


Journal ArticleDOI
TL;DR: In this article, a contracted Gaussian basis set (6-311G**) was developed by optimizing exponents and coefficients at the Møller–Plesset (MP) second-order level for the ground states of first-row atoms.
Abstract: A contracted Gaussian basis set (6-311G**) is developed by optimizing exponents and coefficients at the Møller–Plesset (MP) second-order level for the ground states of first-row atoms. This has a triple split in the valence s and p shells together with a single set of uncontracted polarization functions on each atom. The basis is tested by computing structures and energies for some simple molecules at various levels of MP theory and comparing with experiment.

14,120 citations


Journal ArticleDOI
TL;DR: The reading span, the number of final words recalled, varied from two to five for 20 college students and was correlated with three reading comprehension measures, including verbal SAT and tests involving fact retrieval and pronominal reference.

6,041 citations



Journal ArticleDOI
TL;DR: A model of reading comprehension that accounts for the allocation of eye fixations of college students reading scientific passages is presented, embedded in a theoretical framework capable of accommodating the flexibility of reading.
Abstract: This article presents a model of reading comprehension that accounts for the allocation of eye fixations of college students reading scientific passages. The model deals with processing at the level of words, clauses, and text units. Readers make longer pauses at points where processing loads are greater. Greater loads occur while readers are accessing infrequent words, integrating information from important clauses, and making inferences at the ends of sentences. The model accounts for the gaze duration on each word of text as a function of the involvement of the various levels of processing. The model is embedded in a theoretical framework capable of accommodating the flexibility of reading.

3,444 citations


Journal ArticleDOI
20 Jun 1980-Science
TL;DR: Although a sizable body of knowledge is prerequisite to expert skill, that knowledge must be indexed by large numbers of patterns that, on recognition, guide the expert in a fraction of a second to relevant parts of the knowledge store.
Abstract: Although a sizable body of knowledge is prerequisite to expert skill, that knowledge must be indexed by large numbers of patterns that, on recognition, guide the expert in a fraction of a second to relevant parts of the knowledge store. The knowledge forms complex schemata that can guide a problem's interpretation and solution and that constitute a large part of what we call physical intuition.

2,038 citations


Journal ArticleDOI
TL;DR: In this article, the authors examined several common modes of crystal growth and identified a few new theoretical ideas and a larger number of outstanding problems, including sidebranching and tip-splitting instabilities.
Abstract: Several common modes of crystal growth provide particularly simple and elegant examples of spontaneous pattern formation in nature. Phenomena of interest here are those in which an advancing nonfaceted solidification front suffers an instability and subsequently reorganizes itself into a more complex mode of behavior. The purpose of this essay is to examine several such situations and, in doing this, to identify a few new theoretical ideas and a larger number of outstanding problems. The systems studied are those in which solidification is controlled entirely by a single diffusion process, either the flow of latent heat away from a moving interface or the analogous redistribution of chemical constituents. Convective effects are ignored, as are most effects of crystalline anisotropy. The linear theory of the Mullins-Sekerka instability is reviewed for simple planar and spherical cases and also for a special model of directional solidification. These techniques are then extended to the case of a freely growing dendrite, and it is shown how this analysis leads to an understanding of sidebranching and tip-splitting instabilities. A marginal-stability hypothesis is introduced; and it is argued that this intrinsically nonlinear theory, if valid, permits one to use results of linear-stability analysis to predict dendritic growth rates. The review concludes with a discussion of nonlinear effects in directional solidification. The nonplanar, cellular interfaces which emerge in this situation have much in common with convection patterns in hydrodynamics. The cellular stability problem is discussed briefly, and some preliminary attempts to do calculations in the strongly nonlinear regime are summarized.

1,969 citations


Journal ArticleDOI
TL;DR: In this article, Anderson's information integration model was used to evaluate the informativeness of person attributes and found that informative attributes attract selective attention at input and also carry extra weight in the final impression.
Abstract: Social perceivers process selected aspects of the stimulus array presented by another person. This article argues that such selectivity is based on the informativeness of person attributes. Properties of the attribute itself—its evaluative extremity (distance from the scale midpoint) and its evaluative valence (positive or negative)—can make it informative. Informative attributes attract selective attention at input and also carry extra weight in the final impression. In the present research, negativity and extremity were manipulated across two separate behavioral dimensions, sociability and civic activism, and over 16 stimulus persons. Perceivers saw two prescaled behavior photographs for each stimulus person and controlled a slide changer switch—this provided a measure of attention as looking time. Perceivers also rated each stimulus person's likability, providing a measure of relative weight for each slide. Weights were derived from Anderson's information integration model. Perceivers preferentially weighted behaviors that were extreme or negative, and the behavioral measure of attention (looking time) replicated the predicted pattern. The results are discussed in light of controversies over the rationality of social information processing.

1,442 citations


Journal ArticleDOI
TL;DR: The characteristics of the speech problem in particular, the special kinds of problem-solving uncertainty in that domain, the structure of the Hearsay-II system developed to cope with that uncertainty, and the relationship between Hearsay-II's structure and those of other speech-understanding systems are discussed.
Abstract: The Hearsay-II system, developed during the DARPA-sponsored five-year speech-understanding research program, represents both a specific solution to the speech-understanding problem and a general framework for coordinating independent processes to achieve cooperative problem-solving behavior. As a computational problem, speech understanding reflects a large number of intrinsically interesting issues. Spoken sounds are achieved by a long chain of successive transformations, from intentions, through semantic and syntactic structuring, to the eventually resulting audible acoustic waves. As a consequence, interpreting speech means effectively inverting these transformations to recover the speaker's intention from the sound. At each step in the interpretive process, ambiguity and uncertainty arise. The Hearsay-II problem-solving framework reconstructs an intention from hypothetical interpretations formulated at various levels of abstraction. In addition, it allocates limited processing resources first to the most promising incremental actions. The final configuration of the Hearsay-II system comprises problem-solving components to generate and evaluate speech hypotheses, and a focus-of-control mechanism to identify potential actions of greatest value. Many of these specific procedures reveal novel approaches to speech problems. Most important, the system successfully integrates and coordinates all of these independent activities to resolve uncertainty and control combinatorics. Several adaptations of the Hearsay-II framework have already been undertaken in other problem domains, and it is anticipated that this trend will continue; many future systems necessarily will integrate diverse sources of knowledge to solve complex problems cooperatively. 
Discussed in this paper are the characteristics of the speech problem in particular, the special kinds of problem-solving uncertainty in that domain, the structure of the Hearsay-II system developed to cope with that uncertainty, and the relationship between Hearsay-II's structure and those of other speech-understanding systems. The paper is intended for the general computer science audience and presupposes no speech or artificial intelligence background.

1,422 citations


Journal ArticleDOI
TL;DR: In this paper, the nature of physical symbol systems is laid out in ways familiar, but not thereby useless, to review the basis of common understanding between the various disciplines.

1,343 citations


Journal ArticleDOI
TL;DR: In this article, a systematic investigation of all simple possibilities of having massive neutrinos in SU(2) models of electroweak interactions, without the ad hoc imposition of lepton number conservation, is presented.
Abstract: We make a systematic investigation of all simple possibilities of having massive neutrinos in SU(2)×U(1) models of electroweak interactions, without the ad hoc imposition of lepton-number conservation. The minimal standard model is enlarged with triplet or singly or doubly charged singlet scalars as well as fermions in singlet and doublet representations. We find that in all cases the neutrino mass eigenstates are Majorana fields. This is so even though right-handed neutrino fields are added to the model. When mass terms of the Dirac type are also present (and if the ν_R's also have small masses) neutrinos will oscillate into antineutrinos (which, we argue, are most likely "sterile"). General fermion mass terms of both Dirac and Majorana types are studied and the results are included in the Appendix.

920 citations


Journal ArticleDOI
TL;DR: In this paper, the authors describe methods for conveniently formulating and estimating dynamic linear econometric models under the hypothesis of rational expectations and derive an econometrically convenient formula for the cross-equation rational expectations restrictions.

Journal ArticleDOI
TL;DR: In this paper, the authors compared short-term self-interest and longstanding symbolic attitudes as determinants of voters' attitudes toward government policy on four controversial issues (unemployment, national health insurance, busing, and law and order), and issue voting concerning those policy areas.
Abstract: This article contrasts short-term self-interest and longstanding symbolic attitudes as determinants of (1) voters' attitudes toward government policy on four controversial issues (unemployment, national health insurance, busing, and law and order), and (2) issue voting concerning those policy areas. In general, we found the various self-interest measures to have very little effect in determining either policy preferences or voting behavior. In contrast, symbolic attitudes (liberal or conservative ideology, party identification, and racial prejudice) had major effects. Nor did self-interest play much of a role in creating “issue publics” that were particularly attentive to, informed about, or constrained in their attitudes about these specific policy issues. Conditions that might facilitate more self-interested political attitudes, specifically having privatistic (rather than public-regarding) personal values, perceiving the policy area as a major national problem, being high in political sophistication, perceiving the government as responsive, or having a sense of political efficacy, were also explored, but had no effect. The possibility that some long-term self-interest might be reflected in either group membership or in symbolic attitudes themselves is examined. While such possibilities cannot be definitively rejected, problems with interpreting standard demographic findings as self-interest effects are discussed.

Journal ArticleDOI
TL;DR: Multidimensional divide-and-conquer is discussed, an algorithmic paradigm that can be instantiated in many different ways to yield a number of algorithms and data structures for multidimensional problems.
Abstract: Most results in the field of algorithm design are single algorithms that solve single problems. In this paper we discuss multidimensional divide-and-conquer, an algorithmic paradigm that can be instantiated in many different ways to yield a number of algorithms and data structures for multidimensional problems. We use this paradigm to give best-known solutions to such problems as the ECDF, maxima, range searching, closest pair, and all nearest neighbor problems. The contributions of the paper are on two levels. On the first level are the particular algorithms and data structures given by applying the paradigm. On the second level is the more novel contribution of this paper: a detailed study of an algorithmic paradigm that is specific enough to be described precisely yet general enough to solve a wide variety of problems.
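The paradigm described above can be made concrete with one of its standard instantiations. The following is a minimal sketch (not the paper's implementation) of multidimensional divide-and-conquer applied to the planar maxima problem: split on one coordinate, recurse on each half, then resolve the left half against the best value seen in the right half.

```python
def maxima(points):
    """Maximal (undominated) points of a 2D point set via
    divide-and-conquer: sort by x, split in half, recurse, then keep
    a left-half candidate only if its y exceeds every right-half y."""
    pts = sorted(set(points))  # sort by x, then y; drop duplicates

    def solve(p):
        if len(p) <= 1:
            return list(p)
        mid = len(p) // 2
        left, right = solve(p[:mid]), solve(p[mid:])
        best_y = max(y for _, y in right)  # right half dominates in x
        return [q for q in left if q[1] > best_y] + right

    return solve(pts)
```

The same split-recurse-marry structure generalizes to the ECDF and range-searching problems the abstract mentions, with the "marry" step growing more elaborate in higher dimensions.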

Journal ArticleDOI
TL;DR: In this paper, the authors evaluate the quality of analysts' forecasts as surrogates for the market expectation of earnings, comparing it with that of prediction models commonly used in research. Prediction errors of analysts are more closely associated with security price movements, suggesting that analysts' forecasts provide a better surrogate for market expectations than forecasts generated by time-series models.

Journal ArticleDOI
TL;DR: A set of two computer-implemented models that solve physics problems in ways characteristic of more and less competent human solvers are described, providing a good account of the order in which principles are applied by human solvers working problems in kinematics and dynamics.

Journal ArticleDOI
06 Jun 1980-Science
TL;DR: After more than 230 hours of practice in the laboratory, a subject was able to increase his memory span from 7 to 79 digits, and his performance on other memory tests with digits equaled that of memory experts with lifelong training.
Abstract: After more than 230 hours of practice in the laboratory, a subject was able to increase his memory span from 7 to 79 digits. His performance on other memory tests with digits equaled that of memory experts with lifelong training. With an appropriate mnemonic system, there is seemingly no limit to memory performance with practice.


Journal ArticleDOI
TL;DR: Skeptics argue that the social schema concept is imprecise, irrelevant to real social phenomena, and old wine in a new bottle; this article concludes that the strengths of the schema framework more than outweigh these liabilities.
Abstract: Recent enthusiasm for the social schema concept has been accompanied by a wave of criticism. Skeptics argue that the concept is imprecise and nonfalsifiable, irrelevant to real social phenomena, and simply old wine in a new bottle. While finding elements of truth in each criticism, this article concludes that the strengths of the schema framework more than outweigh the liabilities associated with these criticisms.

Journal ArticleDOI
TL;DR: A technique for multiclass optical pattern recognition of different perspective views of an object is described and a single averaged matched spatial filter is produced from a weighted linear combination of these functions.
Abstract: A technique for multiclass optical pattern recognition of different perspective views of an object is described. Each multiclass representation of an object is described as an orthonormal basis function expansion, and a single averaged matched spatial filter is then produced from a weighted linear combination of these functions. The technique is demonstrated for a terminal missile guidance application using IR tank imagery.
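The filter-construction step described above (an orthonormal basis expansion followed by a weighted linear combination) can be sketched roughly as follows. This is illustrative only: the function names are hypothetical, and obtaining the basis via QR decomposition is an assumption, not the paper's procedure.

```python
import numpy as np

def averaged_filter(views, weights):
    """Sketch: build a single averaged matched spatial filter from
    several training views of an object. Each view is flattened, an
    orthonormal basis for the views is obtained (here via QR), and the
    filter is a weighted linear combination of the basis functions."""
    V = np.stack([v.ravel() for v in views], axis=1)  # columns = views
    Q, _ = np.linalg.qr(V)                            # orthonormal basis
    h = Q @ np.asarray(weights, dtype=float)          # weighted combination
    return h.reshape(views[0].shape)

def respond(patch, h):
    """Matched-filter response: inner product of filter and scene patch."""
    return float(np.vdot(h, patch))
```

In the optical setting the combination would be realized as a single matched spatial filter in the correlator, so one correlation covers all the training perspectives at once.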

Journal ArticleDOI
TL;DR: This paper found that more attributions were made after unexpected, as opposed to expected, outcomes, and that there was a tendency for relatively more stable attributions to be given after expected outcomes.
Abstract: The present investigation extended the generality of attribution research by exploring several important issues in a highly involving real-world setting in which attributions naturally occur: athletic competition. Newspaper accounts of baseball and football games were coded for attributional content. These data supported a motivational or self-enhancement explanation for the tendency to make internal attributions for success and external attributions for failure. No support was found for Miller and Ross's contention that this tendency is mediated by expectancies. It was also found that more attributions were made after unexpected, as opposed to expected, outcomes. And in accordance with Weiner's attribution model, there was a tendency for relatively more stable attributions to be given after expected outcomes. The advantages and disadvantages of studying attributions in archival data and the possibility of attributions justifying behavior rather than explaining behavior are discussed.

Journal ArticleDOI
TL;DR: Three classes of transformations, each based on a different counting scheme for representing the integers, are exhibited, and a combinatorial model is used to show the optimality of many of the transformations.

Journal ArticleDOI
TL;DR: In this article, a set of rate equations is proposed to describe nucleation and growth of droplets in metastable, near-critical fluids, and these equations are used in conjunction with steady-state nucleation theory to compute completion times for phase separation in binary mixtures.
Abstract: A simple set of rate equations is proposed to describe nucleation and growth of droplets in metastable, near-critical fluids. These equations are used in conjunction with steady-state nucleation theory to compute completion times for phase separation in binary mixtures. Reexamination of available experimental data provides little if any evidence for the major failure of conventional nucleation theory that has been postulated on the basis of these data.

Journal ArticleDOI
TL;DR: In this paper, the authors developed a recursive method to determine the constraint set and the value function of an optimal taxation plan. But the problem of time inconsistency of the optimal tax plan is not addressed.

Book ChapterDOI
01 Jan 1980
TL;DR: The results show the algorithm to be more reliable and efficient than earlier procedures on large, sparse set covering problems.
Abstract: We report on the implementation and computational testing of several versions of a set covering algorithm, based on the family of cutting planes from conditional bounds discussed in the companion paper [2]. The algorithm uses a set of heuristics to find prime covers, another set of heuristics to find feasible solutions to the dual linear program which are needed to generate cuts, and subgradient optimization to find lower bounds. It also uses implicit enumeration with some new branching rules. Each of the ingredients was implemented and tested in several versions. The variant of the algorithm that emerged as best was run on 55 randomly generated test problems (20 of them from the literature), with up to 200 constraints and 2000 variables. The results show the algorithm to be more reliable and efficient than earlier procedures on large, sparse set covering problems.
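A heuristic for finding prime covers, in the spirit of the heuristics the abstract mentions, can be sketched as follows. The greedy cost-per-new-element rule and the redundancy-removal pass are generic textbook choices, not the paper's specific heuristics.

```python
def greedy_prime_cover(universe, subsets, cost):
    """Greedy heuristic for a prime cover: repeatedly pick the column
    with the lowest cost per newly covered element, then drop redundant
    columns so that no proper subset of the cover still covers.
    `subsets` maps column name -> set of covered elements; assumes a
    feasible instance (the union of all subsets covers the universe)."""
    uncovered, cover = set(universe), []
    while uncovered:
        j = min((j for j in subsets if subsets[j] & uncovered),
                key=lambda j: cost[j] / len(subsets[j] & uncovered))
        cover.append(j)
        uncovered -= subsets[j]
    # Make the cover prime: remove any column whose elements are
    # already covered by the remaining columns.
    for j in list(cover):
        rest = set().union(*(subsets[k] for k in cover if k != j))
        if set(universe) <= rest:
            cover.remove(j)
    return cover
```

Prime covers matter here because they correspond to the strongest members of their cutting-plane family; a non-prime cover yields a dominated inequality.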

Journal ArticleDOI
TL;DR: In this article, the authors describe the mathematical theory underlying an interactive computer program for eliciting the hyperparameters of a subjective conjugate distribution for the multiple linear regression model with the usual normal error structure.
Abstract: This article describes the mathematical theory underlying an interactive computer program for eliciting the hyperparameters of a subjective conjugate distribution for the multiple linear regression model with the usual normal error structure. Although the methods are heuristic, they are shown to produce hyperparameter estimates satisfying the constraints satisfied by the hyperparameters themselves. An application is given to the problem of predicting the time to fatigue failure of an asphalt-concrete road as a function of several design variables concerning the road.

Journal ArticleDOI
TL;DR: In this paper, the onset of convective and constitutional interfacial instabilities during the directional solidification of a single-phase binary alloy at constant velocity vertically upwards (positive z-direction) is treated by a linear stability analysis.

Journal ArticleDOI
TL;DR: In this paper, an information-processing model is presented that describes how a person uses an interactive computer text-editing system to make modifications to a manuscript and assesses the predictions of such a model with respect to predicting user behavior sequences, predicting the time required to do particular modifications, and determining the effect on accuracy of the detail with which the modeling is done (the model's "grain size".

Journal ArticleDOI
TL;DR: This paper investigates the problem of reporting all intersecting pairs in a set of n rectilinearly oriented rectangles in the plane and describes an algorithm that solves this problem in worst case time proportional to n lg n + k, where k is the number of intersecting pairs found.
Abstract: In this paper we investigate the problem of reporting all intersecting pairs in a set of n rectilinearly oriented rectangles in the plane. This problem arises in applications such as design rule checking of very large-scale integrated (VLSI) circuits and architectural databases. We describe an algorithm that solves this problem in worst case time proportional to n lg n + k, where k is the number of intersecting pairs found. This algorithm is optimal to within a constant factor. As an intermediate step of this algorithm, we solve a problem related to the range searching problem that arises in database applications. Although the algorithms that we describe are primarily theoretical devices (being very difficult to code), they suggest other algorithms that are quite practical.
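The plane-sweep idea behind such algorithms can be sketched as follows. This simplified version keeps the active set in a plain Python set, so it does not attain the paper's n lg n + k bound; that requires an interval-tree-like structure for the active y-intervals.

```python
def report_intersections(rects):
    """Plane-sweep sketch. Rectangles are (x1, y1, x2, y2) with
    x1 < x2 and y1 < y2. Sweep left to right; on entering a rectangle,
    report it against every active rectangle whose y-interval overlaps."""
    events = []
    for i, (x1, y1, x2, y2) in enumerate(rects):
        events.append((x1, 0, i))  # 0 = enter
        events.append((x2, 1, i))  # 1 = leave
    # Enter events sort before leave events at equal x, so rectangles
    # sharing only an edge count as intersecting under this convention.
    events.sort()
    active, pairs = set(), []
    for _, kind, i in events:
        if kind == 0:
            _, y1, _, y2 = rects[i]
            for j in active:
                if y1 <= rects[j][3] and rects[j][1] <= y2:
                    pairs.append(tuple(sorted((i, j))))
            active.add(i)
        else:
            active.discard(i)
    return pairs
```

Replacing the inner loop's linear scan with a query over a balanced structure of active y-intervals is exactly the range-searching subproblem the abstract refers to.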

Journal ArticleDOI
TL;DR: Structured VLSI design proceeds from algorithm to logic cell to cell array to special-purpose chip, yielding cheap, powerful, and modular hardware that will permanently alter the systems landscape of the 80's.
Abstract: Structured VLSI design proceeds from algorithm to logic cell to cell array to special-purpose chip, yielding cheap, powerful, and modular hardware that will permanently alter the systems landscape of the 80's.

Journal ArticleDOI
TL;DR: The concurrency control techniques introduced in the paper include the use of special nodes and pointers to redirect searches, and the use of copies of sections of the tree to introduce many changes simultaneously and therefore avoid unpredictable interleaving.
Abstract: The concurrent manipulation of a binary search tree is considered in this paper. The systems presented can support any number of concurrent processes which perform searching, insertion, deletion, and rotation (reorganization) on the tree, but allow any process to lock only a constant number of nodes at any time. Also, in the systems, searches are essentially never blocked. The concurrency control techniques introduced in the paper include the use of special nodes and pointers to redirect searches, and the use of copies of sections of the tree to introduce many changes simultaneously and therefore avoid unpredictable interleaving. Methods developed in this paper may provide new insights into other problems in the area of concurrent database manipulation.
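The constant-locks-per-process property can be illustrated with a generic lock-coupling (hand-over-hand) sketch. This is not the paper's copy-and-redirect scheme, just a simple example of fine-grained tree locking in which each operation holds at most two node locks at any time; the class and method names are illustrative.

```python
import threading

class Node:
    __slots__ = ("key", "left", "right", "lock")
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None
        self.lock = threading.Lock()

class ConcurrentBST:
    """Hand-over-hand locking: acquire the child's lock before releasing
    the parent's, so each process holds at most two node locks."""
    def __init__(self):
        self.root_lock = threading.Lock()
        self.root = None

    def insert(self, key):
        self.root_lock.acquire()
        if self.root is None:
            self.root = Node(key)
            self.root_lock.release()
            return
        node = self.root
        node.lock.acquire()
        self.root_lock.release()
        while True:
            if key == node.key:          # duplicate: nothing to do
                node.lock.release()
                return
            side = "left" if key < node.key else "right"
            child = getattr(node, side)
            if child is None:
                setattr(node, side, Node(key))
                node.lock.release()
                return
            child.lock.acquire()         # lock child, then release parent
            node.lock.release()
            node = child

    def contains(self, key):
        self.root_lock.acquire()
        node = self.root
        if node is None:
            self.root_lock.release()
            return False
        node.lock.acquire()
        self.root_lock.release()
        while True:
            if key == node.key:
                node.lock.release()
                return True
            nxt = node.left if key < node.key else node.right
            if nxt is None:
                node.lock.release()
                return False
            nxt.lock.acquire()
            node.lock.release()
            node = nxt
```

Note the contrast with the paper's approach: lock coupling briefly blocks searches behind writers at each node, whereas the copy-and-redirect scheme described in the abstract lets searches proceed essentially without blocking.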