
Showing papers in "Psychological Review in 1996"


Journal ArticleDOI
TL;DR: A theory is proposed that increased age in adulthood is associated with a decrease in the speed with which many processing operations can be executed and that this reduction in speed leads to impairments in cognitive functioning because of what are termed the limited time mechanism and the simultaneity mechanism.
Abstract: A theory is proposed to account for some of the age-related differences reported in measures of Type A or fluid cognition. The central hypothesis in the theory is that increased age in adulthood is associated with a decrease in the speed with which many processing operations can be executed and that this reduction in speed leads to impairments in cognitive functioning because of what are termed the limited time mechanism and the simultaneity mechanism. That is, cognitive performance is degraded when processing is slow because relevant operations cannot be successfully executed (limited time) and because the products of early processing may no longer be available when later processing is complete (simultaneity). Several types of evidence, such as the discovery of considerable shared age-related variance across various measures of speed and large attenuation of the age-related influences on cognitive measures after statistical control of measures of speed, are consistent with this theory.

5,094 citations


Journal ArticleDOI
TL;DR: The authors have proposed a family of algorithms based on a simple psychological mechanism: one-reason decision making, and found that these fast and frugal algorithms violate fundamental tenets of classical rationality: they neither look up nor integrate all information.
Abstract: Humans and animals make inferences about the world under limited time and knowledge. In contrast, many models of rational inference treat the mind as a Laplacean Demon, equipped with unlimited time, knowledge, and computational might. Following H. Simon's notion of satisficing, the authors have proposed a family of algorithms based on a simple psychological mechanism: one-reason decision making. These fast and frugal algorithms violate fundamental tenets of classical rationality: They neither look up nor integrate all information. By computer simulation, the authors held a competition between the satisficing "Take The Best" algorithm and various "rational" inference procedures (e.g., multiple regression). The Take The Best algorithm matched or outperformed all competitors in inferential speed and accuracy. This result is an existence proof that cognitive mechanisms capable of successful performance in the real world do not need to satisfy the classical norms of rational inference.

3,112 citations
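The abstract above describes Take The Best only in outline. As a rough illustration, here is a minimal Python sketch of one-reason decision making under lexicographic cue search: cues are tried in order of validity, and the first cue that discriminates decides. The cue names, validities, and objects are hypothetical, and the sketch omits parts of the published algorithm (e.g., the recognition heuristic).

```python
# A minimal sketch of one-reason decision making in the spirit of the
# "Take The Best" algorithm described above. Cue names, validities, and
# objects are illustrative assumptions, not the authors' materials.

def take_the_best(obj_a, obj_b, cues):
    """Infer which of two objects scores higher on a criterion.

    cues: list of (cue_name, validity) pairs; obj_a/obj_b map cue names
    to 1 (positive), 0 (negative), or None (unknown).
    """
    # Search cues in order of validity, best first ...
    for name, _validity in sorted(cues, key=lambda c: c[1], reverse=True):
        a, b = obj_a.get(name), obj_b.get(name)
        # ... and stop at the first cue that discriminates between the objects.
        if a == 1 and b in (0, None):
            return "A"
        if b == 1 and a in (0, None):
            return "B"
    return "guess"  # no cue discriminates: choose at random


# Hypothetical example: which of two cities has the larger population?
cues = [("exposition_site", 0.91), ("soccer_team", 0.87), ("state_capital", 0.77)]
city_a = {"soccer_team": 1, "state_capital": 0, "exposition_site": None}
city_b = {"soccer_team": 0, "state_capital": 1, "exposition_site": 0}
print(take_the_best(city_a, city_b, cues))  # -> "A"
```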


Journal ArticleDOI
TL;DR: Analysis of the ability of networks to reproduce data on acquired surface dyslexia support a view of the reading system that incorporates a graded division of labor between semantic and phonological processes, and contrasts in important ways with the standard dual-route account.
Abstract: A connectionist approach to processing in quasi-regular domains, as exemplified by English word reading, is developed. Networks using appropriately structured orthographic and phonological representations were trained to read both regular and exception words, and yet were also able to read pronounceable nonwords as well as skilled readers. A mathematical analysis of a simplified system clarifies the close relationship of word frequency and spelling-sound consistency in influencing naming latencies. These insights were verified in subsequent simulations, including an attractor network that accounted for latency data directly in its time to settle on a response. Further analyses of the ability of networks to reproduce data on acquired surface dyslexia support a view of the reading system that incorporates a graded division of labor between semantic and phonological processes, and contrasts in important ways with the standard dual-route account.

2,600 citations


Journal ArticleDOI
TL;DR: An interdisciplinary review of evidence about aggression, crime, and violence contradicted the view that low self-esteem is an important cause of violence, finding that violence appears to be most commonly a result of threatened egotism.
Abstract: Conventional wisdom has regarded low self-esteem as an important cause of violence, but the opposite view is theoretically viable. An interdisciplinary review of evidence about aggression, crime, and violence contradicted the view that low self-esteem is an important cause. Instead, violence appears to be most commonly a result of threatened egotism; that is, highly favorable views of self that are disputed by some person or circumstance. Inflated, unstable, or tentative beliefs in the self's superiority may be most prone to encountering threats and hence to causing violence. The mediating process may involve directing anger outward as a way of avoiding a downward revision of the self-concept. Only a minority of human violence can be understood as rational, instrumental behavior aimed at securing or protecting material rewards. The pragmatic futility of most violence has been widely recognized: Wars harm both sides, most crimes yield little financial gain, terrorism and assassination almost never bring about the desired political changes, most rapes fail to bring sexual pleasure, torture rarely elicits accurate or useful information, and most murderers soon regret their actions as pointless and self-defeating (Ford, 1985; Gottfredson & Hirschi, 1990; Groth, 1979; Keegan, 1993; Sampson & Laub, 1993; Scarry, 1985). What drives people to commit violent and oppressive actions that so often are tangential or even contrary to the rational pursuit of material self-interest? This article reviews literature relevant to the hypothesis that one main source of such violence is threatened egotism, particularly when it consists of favorable self-appraisals that may be inflated or ill-founded and that are confronted with an external evaluation that disputes them. The focus on egotism (i.e., favorable self-appraisals) as one cause of violent aggression runs contrary to an entrenched body of wisdom that has long pointed to low self-esteem as the root of violence and other antisocial behavior. We shall examine the arguments for the low self-esteem view and treat it as a rival hypothesis to our emphasis on high self-esteem. Clearly, there

2,215 citations


Journal ArticleDOI
TL;DR: Empirical evidence attests to diverse need for closure effects on fundamental social psychological phenomena, including impression formation, stereotyping, attribution, persuasion, group decision making, and language use in intergroup contexts.
Abstract: A theoretical framework is outlined in which the key construct is the need for (nonspecific) cognitive closure. The need for closure is a desire for definite knowledge on some issue. It represents a dimension of stable individual differences as well as a situationally evocable state. The need for closure has widely ramifying consequences for social-cognitive phenomena at the intrapersonal, interpersonal, and group levels of analysis. Those consequences derive from 2 general tendencies, those of urgency and permanence. The urgency tendency represents an individual's inclination to attain closure as soon as possible, and the permanence tendency represents an individual's inclination to maintain it for as long as possible. Empirical evidence for present theory attests to diverse need for closure effects on fundamental social psychological phenomena, including impression formation, stereotyping, attribution, persuasion, group decision making, and language use in intergroup contexts. The construction of new knowledge is a pervasive human pursuit for both individuals and collectives. From relatively simple activities such as crossing a busy road to highly complex endeavors such as launching a space shuttle, new knowledge is indispensable for secure decisions and reasoned actions. The knowledge-construction process is often involved and intricate. It draws on background notions activated from memory and local information from the immediate context. It entails the extensive testing of hypotheses and the piecing of isolated cognitive bits into coherent wholes. It integrates inchoate sensations with articulate thoughts, detects meaningful signals in seas of ambient noise, and more. Two aspects of knowledge construction are of present interest: its motivated nature and its social character. That knowledge construction has a motivational base should come as no particular surprise. The host of effortful activities it comprises pose considerable demands on resource allocation; hence, it may well require motivation to get under way. Specifically, individuals may desire knowledge on some topics and not others, and they may delimit their constructive endeavors to those particular domains. But what kind of a motivational variable is the "desire for knowledge"? At least two answers readily suggest themselves: Knowledge could be desired because it conveys welcome news in regard to a given concern or because it conveys any definite news (whether welcome or unwelcome) in instances in which such information is required for some purpose. For instance, a mother may desire to know that

1,928 citations


Journal ArticleDOI
TL;DR: A cognitive theory of posttraumatic stress disorder (PTSD) is proposed that assumes traumas experienced after early childhood give rise to 2 sorts of memory, 1 verbally accessible and 1 automatically accessible through appropriate situational cues.
Abstract: A cognitive theory of posttraumatic stress disorder (PTSD) is proposed that assumes traumas experienced after early childhood give rise to 2 sorts of memory, 1 verbally accessible and 1 automatically accessible through appropriate situational cues. These different types of memory are used to explain the complex phenomenology of PTSD, including the experiences of reliving the traumatic event and of emotionally processing the trauma. The theory considers 3 possible outcomes of the emotional processing of trauma: successful completion, chronic processing, and premature inhibition of processing. We discuss the implications of the theory for research design, clinical practice, and resolving contradictions in the empirical data.

1,533 citations


Journal ArticleDOI
TL;DR: The article responds to Gigerenzer's critique and shows that it misrepresents the authors' theoretical position and ignores critical evidence.
Abstract: The study of heuristics and biases in judgement has been criticized in several publications by G. Gigerenzer, who argues that "biases are not biases" and "heuristics are meant to explain what does not exist" (1991, p. 102). The article responds to Gigerenzer's critique and shows that it misrepresents the authors' theoretical position and ignores critical evidence. Contrary to Gigerenzer's central empirical claim, judgments of frequency--not only subjective probabilities--are susceptible to large and systematic biases. A postscript responds to Gigerenzer's (1996) reply.

1,180 citations


Journal ArticleDOI
TL;DR: A model of orthographic processing is described that postulates read-out from different information dimensions, determined by variable response criteria set on these dimensions, that unifies results obtained in response-limited and data-limited paradigms and helps resolve a number of inconsistencies in the experimental literature.
Abstract: A model of orthographic processing is described that postulates read-out from different information dimensions, determined by variable response criteria set on these dimensions. Performance in a perceptual identification task is simulated as the percentage of trials on which a noisy criterion set on the dimension of single word detector activity is reached. Two additional criteria set on the dimensions of total lexical activity and time from stimulus onset are hypothesized to be operational in the lexical decision task. These additional criteria flexibly adjust to changes in stimulus material and task demands, thus accounting for strategic influences on performance in this task. The model unifies results obtained in response-limited and data-limited paradigms and helps resolve a number of inconsistencies in the experimental literature that cannot be accommodated by other current models of visual word recognition.

1,062 citations


Journal ArticleDOI
TL;DR: In this paper, a theoretical framework addressing the strategic regulation of memory reporting is put forward that delineates the mediating role of metamemorial monitoring and control processes, and a general methodology is proposed that incorporates these processes into the assessment of memory-accuracy and memory-quantity performance.
Abstract: When people are allowed freedom to volunteer or withhold information, they can enhance the accuracy of their memory reports substantially relative to forced-report performance. A theoretical framework addressing the strategic regulation of memory reporting is put forward that delineates the mediating role of metamemorial monitoring and control processes. Although the enhancement of memory accuracy is generally accompanied by a reduction in memory quantity, experimental and simulation results indicate that both of these effects depend critically on (a) accuracy incentive and (b) monitoring effectiveness. The results are discussed with regard to the contribution of meta-memory processes to memory performance, and a general methodology is proposed that incorporates these processes into the assessment of memory-accuracy and memory-quantity performance.

925 citations
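As a rough illustration of the accuracy-quantity trade-off described above, the following Python sketch simulates free versus forced report: a noisy monitoring assessment is compared against a report criterion, and an answer is volunteered only when the assessment exceeds it. All numbers (base accuracy, monitoring noise, criterion) are illustrative assumptions, not parameters from the article.

```python
# A toy simulation of the report-option idea: volunteer an answer only if
# its assessed probability of being correct exceeds a response criterion.
# All numeric settings are illustrative assumptions.

import random

def simulate(n_items=10000, p_correct=0.6, monitor_sd=0.15, criterion=0.75):
    random.seed(1)
    volunteered = correct_volunteered = 0
    for _ in range(n_items):
        is_correct = random.random() < p_correct           # best candidate answer
        # Monitoring: noisy assessment of the probability the answer is correct.
        assessed = (0.8 if is_correct else 0.4) + random.gauss(0, monitor_sd)
        if assessed >= criterion:                           # control: volunteer or withhold
            volunteered += 1
            correct_volunteered += is_correct
    accuracy = correct_volunteered / volunteered            # of what is reported, how much is right
    quantity = correct_volunteered / n_items                # how much of the material is reported correctly
    return accuracy, quantity

print("free report  :", simulate(criterion=0.75))   # higher accuracy, lower quantity
print("forced report:", simulate(criterion=-1.0))   # report everything
```

With these toy numbers, raising the criterion pushes report accuracy toward 1 while cutting the number of correct answers reported, and the size of both effects depends on how well monitoring separates correct from incorrect candidate answers.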


Journal ArticleDOI
TL;DR: Gigerenzer and Murray as mentioned in this paper argue that at issue are the imposition of unnecessarily narrow norms of sound reasoning that are used to diagnose so-called cognitive illusions and the continuing reliance on vague heuristics that explain everything and nothing.
Abstract: This reply clarifies what G. Gigerenzer's (e.g., 1991, 1994; Gigerenzer & Murray, 1987) critique of the heuristics-and-biases approach to statistical reasoning is and is not about. At issue is the imposition of unnecessarily narrow norms of sound reasoning that are used to diagnose so-called cognitive illusions and the continuing reliance on vague heuristics that explain everything and nothing. D. Kahneman and A. Tversky (1996) incorrectly asserted that Gigerenzer simply claimed that frequency formats make all cognitive illusions disappear. In contrast, Gigerenzer has proposed and tested models that actually predict when frequency judgments are valid and when they are not. The issue is not whether or not, or how often, cognitive illusions disappear. The focus should rather be the construction of detailed models of cognitive processes that explain when and why they disappear. A postscript responds to Kahneman and Tversky's (1996) postscript.

871 citations


Journal ArticleDOI
TL;DR: The model emphasizes the role of differing expectancies of unity and coherence in individual and group targets, which in turn engage different mechanisms for processing information and making judgments.
Abstract: This article analyzes the similarities and differences in forming impressions of individuals and in developing conceptions of groups. In both cases, the perceiver develops a mental conception of the target (individual or group) on the basis of available information and uses that information to make judgments about that person or group. However, a review of existing evidence reveals differences in the outcomes of impressions formed of individual and group targets, even when those impressions are based on the very same behavioral information. A model is proposed to account for these differences. The model emphasizes the role of differing expectancies of unity and coherence in individual and group targets, which in turn engage different mechanisms for processing information and making judgments. Implications of the model are discussed.


Journal ArticleDOI
TL;DR: This article asks whether there is one retention function that can describe all of memory, or perhaps a different function for each of a small number of different kinds of memory, and argues that a proper balance between theory and description requires more description.
Abstract: A sample of 210 published data sets were assembled that (a) plotted amount remembered versus time, (b) had 5 or more points, and (c) were smooth enough to fit at least 1 of the functions tested with a correlation coefficient of .90 or greater. Each was fit to 105 different 2-parameter functions. The best fits were to the logarithmic function, the power function, the exponential in the square root of time, and the hyperbola in the square root of time. It is difficult to distinguish among these 4 functions with the available data, but the same set of 4 functions fit most data sets, with autobiographical memory being the exception. Theoretical motivations for the best fitting functions are offered. The methodological problems of evaluating functions and the advantages of searching existing data for regularities before formulating theories are considered. At the simplest level, this article is a search for regularities. We ask whether there is one retention function that can describe all of memory, or perhaps a different function for each of a small number of different kinds of memory. At a more abstract level, it is about the role of theory and data in psychological research. Can we most rapidly advance psychology as a science by developing theories at the level that commonly fills psychological journals such as this one, or should we first try to describe phenomena that could constrain theories by establishing robust, preferably quantitative, regularities (Rubin, 1985, 1989, 1995)? A balance between these alternatives is needed, and here we argue that to obtain such a balance more description is needed. Retention offers the ideal topic to make this abstract, philo
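The four best-fitting functions named above are all simple 2-parameter forms, so the fitting exercise is easy to sketch. The following Python example uses common parameterizations and invented retention data, both assumptions made for illustration; the article itself fit 105 candidate functions to 210 published data sets.

```python
# Fit four 2-parameter retention functions (logarithmic, power,
# exponential in sqrt(t), hyperbola in sqrt(t)) to toy retention data.
# Parameterizations and data are illustrative assumptions.

import numpy as np
from scipy.optimize import curve_fit

functions = {
    "logarithmic": lambda t, a, b: a - b * np.log(t),
    "power":       lambda t, a, b: a * t ** (-b),
    "exp_sqrt":    lambda t, a, b: a * np.exp(-b * np.sqrt(t)),
    "hyper_sqrt":  lambda t, a, b: 1.0 / (a + b * np.sqrt(t)),
}

# Hypothetical data: proportion recalled at each delay (in days).
t = np.array([1, 2, 7, 14, 30, 60, 120], dtype=float)
y = np.array([0.82, 0.74, 0.61, 0.55, 0.47, 0.41, 0.36])

for name, f in functions.items():
    params, _ = curve_fit(f, t, y, p0=(1.0, 0.1), maxfev=10000)
    r = np.corrcoef(y, f(t, *params))[0, 1]   # correlation between data and fit
    print(f"{name:12s} params={params.round(3)}  r={r:.3f}")
```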

Journal ArticleDOI
TL;DR: The relative susceptibility of individuals and groups to systematic judgmental biases is considered in this article, where a theoretical analysis employing J. H. Davis's social decision scheme (SDS) model reveals that the relative magnitude of individual and group bias depends upon several factors, including group size, initial individual judgment, the magnitude of bias among individuals, the type of bias, and most of all, the group-judgment process.
Abstract: The relative susceptibility of individuals and groups to systematic judgmental biases is considered. An overview of the relevant empirical literature reveals no clear or general pattern. However, a theoretical analysis employing J. H. Davis's (1973) social decision scheme (SDS) model reveals that the relative magnitude of individual and group bias depends upon several factors, including group size, initial individual judgment, the magnitude of bias among individuals, the type of bias, and most of all, the group-judgment process. It is concluded that there can be no simple answer to the question, "Which are more biased, individuals or groups?," but the SDS model offers a framework for specifying some of the conditions under which individuals are both more and less biased than groups.
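A small Python sketch may help make the social decision scheme logic concrete: individual judgments are aggregated into group compositions binomially, and a decision scheme maps each composition onto a group choice. The group size, individual bias rate, and majority-wins scheme below are illustrative assumptions, not Davis's or the authors' specific analyses.

```python
# A minimal sketch of social-decision-scheme aggregation for a
# 2-alternative judgment. Numbers and the scheme are illustrative.

from math import comb

def group_choice_prob(p_individual, group_size, scheme):
    """Probability that the group adopts alternative A.

    scheme(k, r) -> probability the group chooses A when k of its
    r members individually favor A.
    """
    r, p = group_size, p_individual
    return sum(
        comb(r, k) * p**k * (1 - p) ** (r - k) * scheme(k, r)
        for k in range(r + 1)
    )

def majority_wins(k, r):
    # Simple majority; ties broken at random.
    if 2 * k > r:
        return 1.0
    if 2 * k == r:
        return 0.5
    return 0.0

# If 60% of individuals make the biased judgment A, 5-person groups
# using majority rule are more biased than individuals:
print(group_choice_prob(0.60, 5, majority_wins))  # ~0.683
```

Swapping in a different scheme (e.g., a "truth wins" rule under which a single unbiased member carries the group) reverses the conclusion, which is the article's central point: whether groups amplify or attenuate individual bias depends on the group-judgment process.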


Journal ArticleDOI
Daryl J. Bem
TL;DR: A developmental theory of erotic/romantic attraction is presented in this article that provides the same basic account for opposite-sex and same-sex desire in both men and women, and proposes that biological variables such as genes, prenatal hormones, and brain neuroanatomy do not code for sexual orientation per se but for childhood temperaments that influence a child's preferences for sex-typical or sex-atypical activities and peers.
Abstract: A developmental theory of erotic/romantic attraction is presented that provides the same basic account for opposite-sex and same-sex desire in both men and women. It proposes that biological variables, such as genes, prenatal hormones, and brain neuroanatomy, do not code for sexual orientation per se but for childhood temperaments that influence a child's preferences for sex-typical or sex-atypical activities and peers. These preferences lead children to feel different from opposite- or same-sex peers--to perceive them as dissimilar, unfamiliar, and exotic. This, in turn, produces heightened nonspecific autonomic arousal that subsequently gets eroticized to that same class of dissimilar peers: Exotic becomes erotic. Specific mechanisms for effecting this transformation are proposed. The theory claims to accommodate both the empirical evidence of the biological essentialists and the cultural relativism of the social constructionists.

Journal ArticleDOI
TL;DR: The article outlines an alternative to the capacity theory, according to which the unconscious, obligatory operations involved in assigning the syntactic structure of a sentence do not use the same working memory resource as that required for conscious, controlled verbally mediated processes.
Abstract: The authors review M.A. Just and P.A. Carpenter's (1992) "capacity" theory of sentence comprehension and argue that the data cited by Just and Carpenter in support of the theory are unconvincing and that the theory is insufficiently developed to explain or predict observed patterns of results. The article outlines an alternative to the capacity theory, according to which the unconscious, obligatory operations involved in assigning the syntactic structure of a sentence do not use the same working memory resource as that required for conscious, controlled verbally mediated processes.

Journal ArticleDOI
TL;DR: The developmental theory of Jean Piaget has been criticized on the grounds that it is conceptually limited, empirically false, or philosophically and epistemologically untenable as mentioned in this paper.
Abstract: The developmental theory of Jean Piaget has been criticized on the grounds that it is conceptually limited, empirically false, or philosophically and epistemologically untenable. This study attempts to rebut these criticisms by showing that most of them (a) derive from widespread misinterpretations of the work of Piaget; (b) fail to appreciate the 2 central issues of his thinking—how new forms of thinking emerge during ontogenesis and how they become psychologically necessary; (c) incorrectly assume that many controversies concerning his theory can be settled empirically or methodologically before they are clarified conceptually; (d) ignore various modifications of Piagetian theory, particularly those advanced after 1970; and (e) forget the dialectical, constructivist, and developmental nature of Piaget's unique approach to human development. Although the authors do not claim there is a "true" Piaget to be discovered, or that the problems with his theory vanish when it is better understood, they do claim that important aspects of Piaget's work have not been assimilated by developmental psychologists.

Journal ArticleDOI
TL;DR: A theory that integrates space-based and object-based approaches to visual attention and provides a quantitative account of the effects of grouping by proximity and distance between items on reaction time and accuracy data in 7 empirical situations that shaped the current literature on visual spatial attention.
Abstract: This article presents a theory that integrates space-based and object-based approaches to visual attention. The theory puts together M.P. van Oeffelen and P.G. Vos's (1982, 1983) COntour DEtector (CODE) theory of perceptual grouping by proximity with C. Bundesen's (1990) theory of visual attention (TVA). CODE provides input to TVA, accounting for spatially based between-object selection, and TVA converts the input to output, accounting for feature- and category-based within-object selection. CODE clusters nearby items into perceptual groups that are both perceptual objects and regions of space, thereby integrating object-based and space-based approaches to attention. The combined theory provides a quantitative account of the effects of grouping by proximity and distance between items on reaction time and accuracy data in 7 empirical situations that shaped the current literature on visual spatial attention.

Journal ArticleDOI
TL;DR: The success of the consonance model underscores important, unforeseen similarities between what had been formerly regarded as the rather exotic process of dissonance reduction and a variety of other, more mundane psychological processes.
Abstract: A constraint satisfaction neural network model (the consonance model) simulated data from the two major cognitive dissonance paradigms of insufficient justification and free choice. In several cases, the model fit the human data better than did cognitive dissonance theory. Superior fits were due to the inclusion of constraints that were not part of dissonance theory and to the increased precision inherent to this computational approach. Predictions generated by the model for a free choice between undesirable alternatives were confirmed in a new psychological experiment. The success of the consonance model underscores important, unforeseen similarities between what had been formerly regarded as the rather exotic process of dissonance reduction and a variety of other, more mundane psychological processes. Many of these processes can be understood as the progressive application of constraints supplied by beliefs and attitudes.
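The consonance model is a constraint-satisfaction network, and the general mechanism is easy to sketch. In the toy Python example below, units stand for cognitions, signed weights for consonant and dissonant relations, and unclamped activations are nudged by their net input so that overall consonance (the sum of w_ij * a_i * a_j) increases. The cognitions, weights, and update rule are illustrative assumptions, not the published model's architecture or parameters.

```python
# A generic constraint-satisfaction settling sketch in the spirit of the
# consonance model described above. All values are illustrative.

import numpy as np

def settle(weights, activations, clamped, rate=0.1, steps=200):
    """Update unclamped units until the network settles."""
    a = activations.astype(float).copy()
    for _ in range(steps):
        net = weights @ a                      # net support for each cognition
        free = ~clamped
        a[free] += rate * net[free]            # move with the net input ...
        a = np.clip(a, -1.0, 1.0)              # ... within activation bounds
    return a

# Hypothetical free-choice setup: "I chose option X" (clamped on) is
# consonant with X's good features and dissonant with the rejected
# option's good features.
w = np.array([[ 0.0,  0.5, -0.5],
              [ 0.5,  0.0,  0.0],
              [-0.5,  0.0,  0.0]])
a0 = np.array([1.0, 0.2, 0.2])                 # chose X; both options seemed OK
clamped = np.array([True, False, False])
print(settle(w, a0, clamped))                  # chosen option rises, rejected option falls
```

Starting from a choice between two roughly equally attractive options, the settled state pulls the chosen option's evaluation up and the rejected option's down, the classic spreading-of-alternatives effect in the free-choice paradigm.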

Journal ArticleDOI
TL;DR: Oaksford and Chater as mentioned in this paper presented the first quantitative model of P. C. Wason's (1966) selection task in which performance is rational and concluded that O&C's model remains the most compelling and comprehensive account of the selection task.
Abstract: M. Oaksford and N. Chater (O&C, see record 1995-08271-001) presented the first quantitative model of P. C. Wason's (1966) selection task in which performance is rational. J. St. B. T. Evans and D. E. Over (see record 83:25190) reply that O&C's account is normatively incorrect and cannot model K. N. Kirby's (see record 1995-04302-001) or P. Pollard and J. St. B. T. Evans's (see record 1984-30572-001) data. It is argued that an equivalent measure satisfies their normative concerns and that a modification of O&C's model accounts for their empirical concerns. D. Laming (see record 83:25220) argues that O&C made unjustifiable psychological assumptions and that a "correct" Bayesian analysis agrees with logic. It is argued that O&C's model makes normative and psychological sense and that Laming's analysis is not Bayesian. A. Almor and S. A. Sloman (see record 83:25168) argue that O&C cannot explain their data. It is argued that Almor and Sloman's data do not bear on O&C's model because they alter the nature of the task. It is concluded that O&C's model remains the most compelling and comprehensive account of the selection task.

Journal ArticleDOI
TL;DR: The capacity theory as discussed by the authors adopts the basic premise that thinking is resource limited and proposes a specific model of how the constraint is applied within a particular cognitive architecture and examines the implications in the domain of language comprehension.
Abstract: This article has the dual goals of refuting some of Waters and Caplan's (1996a) incorrect descriptions concerning the empirical support for capacity theory and of pointing out the theoretical and empirical difficulties with Waters and Caplan's alternative hypothesis. This article has three sections: (a) a critique of Waters and Caplan's hypothesis, (b) a new functional magnetic resonance imaging (fMRI) study of the reading-span task that supports the capacity theory and not Waters and Caplan's alternative hypothesis, and (c) a reply to some of Waters and Caplan's inaccuracies concerning the empirical support in Just and Carpenter (1992). The capacity theory adopts the basic premise that thinking is resource limited (Kahneman, 1973) but proposes a specific model of how the constraint is applied within a particular cognitive architecture and examines the implications in the domain of language comprehension. The theory deals with the resources used to support language comprehension computations, not the phonological buffer/articulatory loop of Baddeley's (1992) theory. Critique of Waters and Caplan's (1996a) hypothesis: Two problems with Waters and Caplan's (1996a) proposed division of working-memory resources are that it conflates two

Journal ArticleDOI
TL;DR: This model proposes that complex human behaviors may be guided by multiple overlapping neural mechanisms.
Abstract: Behavioral impairments in autism are theorized to result from abnormal neuronal organization in brain development generating 4 systemically related neurofunctional impairments: (a) canalesthesia, wherein abnormal hippocampal system function "canalizes" sensory records, disrupting integration of information; (b) impaired assignment of the affective significance of stimuli, wherein abnormal amygdaloid system function disrupts affect association; (c) asociality, wherein impaired oxytocin system function flattens social bonding and affiliativeness; and (d) extended selective attention, wherein abnormal organization of temporal and parietal polysensory regions yields aberrant overprocessing of primary representations. This model proposes that complex human behaviors may be guided by multiple overlapping neural mechanisms.

Journal ArticleDOI
TL;DR: Drawing on mathematical results in A. N. Kolmogorov's complexity theory, the author argues that simplicity and likelihood are not in competition, but are identical.
Abstract: Two principles of perceptual organization have been proposed. The likelihood principle, following H. L. F. von Helmholtz (1910/1962), proposes that perceptual organization is chosen to correspond to the most likely distal layout. The simplicity principle, following Gestalt psychology, suggests that perceptual organization is chosen to be as simple as possible. The debate between these two views has been a central topic in the study of perceptual organization. Drawing on mathematical results in A. N. Kolmogorov's (1965) complexity theory, the author argues that simplicity and likelihood are not in competition, but are identical. Various implications for the theory of perceptual organization and psychology more generally are outlined.
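A hedged sketch of the formal bridge the abstract alludes to, in the notation of algorithmic information theory (the symbols K for Kolmogorov complexity, H for a candidate perceptual organization, and D for the proximal stimulus are labels chosen here, not the article's): under the universal prior, probability falls off exponentially with description length, so choosing the most probable organization and choosing the shortest-description organization coincide up to a constant.

```latex
\[
P(H) \propto 2^{-K(H)}, \qquad P(D \mid H) \propto 2^{-K(D \mid H)}
\quad\Longrightarrow\quad
\arg\max_H P(H \mid D) \;=\; \arg\min_H \bigl[\, K(H) + K(D \mid H) \,\bigr].
\]
```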

Journal ArticleDOI
TL;DR: This theoretical study presents a new analysis, based on the recently developed concept of holographic regularity, that applies to the intrinsic character of regularity and specifies the unique formal status of perceptually relevant regularities.
Abstract: Until recently, the transformational approach provided the only available formal analysis of visual regularities like repetition and mirror symmetry. This theoretical study presents a new analysis, based on the recently developed concept of holographic regularity. This concept applies to the intrinsic character of regularity and specifies the unique formal status of perceptually relevant regularities. The crucial point is that the two analyses imply the same structure for repetition but a different structure for mirror symmetry. Transformationally, mirror symmetry is an all-or-nothing property, whereas holographically, it is a graded property. This difference pervades the understanding of both perfect regularities and perturbed regularities. Whereas the transformational approach explains hardly any goodness phenomenon, the holographic approach explains a wide variety of goodness phenomena in a coherent way that is ecologically plausible as well.


Journal ArticleDOI
TL;DR: A model explaining when the manual modality will assume grammatical properties and when it will not is proposed, arguing that two grammatical features, segmentation and hierarchical combination, appear in all settings in which one human communicates symbolically with another.
Abstract: Grammatical properties are found in conventional sign languages of the deaf and in unconventional gesture systems created by deaf children lacking language models. However, they do not arise in spontaneous gestures produced along with speech. The authors propose a model explaining when the manual modality will assume grammatical properties and when it will not. The model argues that two grammatical features, segmentation and hierarchical combination, appear in all settings in which one human communicates symbolically with another. These properties are preferentially assumed by speech whenever words are spoken, constraining the manual modality to a global form. However, when the manual modality must carry the full burden of communication, it is freed from the global form it assumes when integrated with speech--only to be constrained by the task of symbolic communication to take on the grammatical properties of segmentation and hierarchical combination.

Journal ArticleDOI
TL;DR: A reexamination of 2 previous studies provided new insights into the role of attention and location information in object perception and a reinterpretation of the deficits in patients who exhibit attentional disorders.
Abstract: Visual objects are perceived correctly only if their features are identified and then bound together. Illusory conjunctions result when feature identification is correct but an error occurs during feature binding. A new model is proposed that assumes feature binding errors occur because of uncertainty about the location of visual features. This model accounted for data from 2 new experiments better than a model derived from A. M. Treisman and H. Schmidt's (1982) feature integration theory. The traditional method for detecting the occurrence of true illusory conjunctions is shown to be fundamentally flawed. A reexamination of 2 previous studies provided new insights into the role of attention and location information in object perception and a reinterpretation of the deficits in patients who exhibit attentional disorders.