
Showing papers in "Psychological Review in 2009"


Journal ArticleDOI
TL;DR: It is proposed that the success of infants and nonhuman animals on some belief reasoning tasks may be best explained by a cognitively efficient but inflexible capacity for tracking belief-like states.
Abstract: The lack of consensus on how to characterize humans' capacity for belief reasoning has been brought into sharp focus by recent research. Children fail critical tests of belief reasoning before 3 to 4 years of age (H. Wellman, D. Cross, & J. Watson, 2001; H. Wimmer & J. Perner, 1983), yet infants apparently pass false-belief tasks at 13 or 15 months (K. H. Onishi & R. Baillargeon, 2005; L. Surian, S. Caldi, & D. Sperber, 2007). Nonhuman animals also fail critical tests of belief reasoning but can show very complex social behavior (e.g., J. Call & M. Tomasello, 2005). Fluent social interaction in adult humans implies efficient processing of beliefs, yet direct tests suggest that belief reasoning is cognitively demanding, even for adults (e.g., I. A. Apperly, D. Samson, & G. W. Humphreys, 2009). The authors interpret these findings by drawing an analogy with the domain of number cognition, where similarly contrasting results have been observed. They propose that the success of infants and nonhuman animals on some belief reasoning tasks may be best explained by a cognitively efficient but inflexible capacity for tracking belief-like states. In humans, this capacity persists in parallel with a later-developing, more flexible but more cognitively demanding theory-of-mind ability.

838 citations


Journal ArticleDOI
TL;DR: A new model is described that provides a framework for understanding people's reactions to threats to social acceptance and belonging as they occur in the context of diverse phenomena such as rejection, discrimination, ostracism, betrayal, and stigmatization.
Abstract: This article describes a new model that provides a framework for understanding people's reactions to threats to social acceptance and belonging as they occur in the context of diverse phenomena such as rejection, discrimination, ostracism, betrayal, and stigmatization. People's immediate reactions are quite similar across different forms of rejection in terms of negative affect and lowered self-esteem. However, following these immediate responses, people's reactions are influenced by construals of the rejection experience that predict 3 distinct motives for prosocial, antisocial, and socially avoidant behavioral responses. The authors describe the relational, contextual, and dispositional factors that affect which motives determine people's reactions to a rejection experience and the ways in which these 3 motives may work at cross-purposes. The multimotive model accounts for the myriad ways in which responses to rejection unfold over time and offers a basis for the next generation of research on interpersonal rejection.

704 citations


Journal ArticleDOI
TL;DR: The authors present the context maintenance and retrieval (CMR) model of memory search, a generalized version of the temporal context model of M. W. Howard and M. J. Kahana (2002a), which proposes that memory search is driven by an internally maintained context representation composed of stimulus-related and source-related features.
Abstract: The authors present the context maintenance and retrieval (CMR) model of memory search, a generalized version of the temporal context model of M. W. Howard and M. J. Kahana (2002a), which proposes that memory search is driven by an internally maintained context representation composed of stimulus-related and source-related features. In the CMR model, organizational effects (the tendency for related items to cluster during the recall sequence) arise as a consequence of associations between active context elements and features of the studied material. Semantic clustering is due to longstanding context-to-item associations, whereas temporal clustering and source clustering are both due to associations formed during the study episode. A behavioral investigation of the three forms of organization provides data to constrain the CMR model, revealing interactions between the organizational factors. Finally, the authors discuss the implications of CMR for their understanding of a broad class of episodic memory phenomena and suggest ways in which this theory may guide exploration of the neural correlates of memory search.

564 citations


Journal ArticleDOI
TL;DR: The analytical rumination hypothesis proposes that depression is an evolved response to complex problems, whose function is to minimize disruption and sustain analysis of those problems by giving the triggering problem prioritized access to processing resources.
Abstract: Depression ranks as the primary emotional problem for which help is sought. Depressed people often have severe, complex problems, and rumination is a common feature. Depressed people often believe that their ruminations give them insight into their problems, but clinicians often view depressive rumination as pathological because it is difficult to disrupt and interferes with the ability to concentrate on other things. Abundant evidence indicates that depressive rumination involves the analysis of episode-related problems. Because analysis is time consuming and requires sustained processing, disruption would interfere with problem-solving. The analytical rumination (AR) hypothesis proposes that depression is an adaptation that evolved as a response to complex problems and whose function is to minimize disruption of rumination and sustain analysis of complex problems. It accomplishes this by giving episode-related problems priority access to limited processing resources, by reducing the desire to engage in distracting activities (anhedonia), and by producing psychomotor changes that reduce exposure to distracting stimuli. Because processing resources are limited, the inability to concentrate on other things is a tradeoff that must be made to sustain analysis of the triggering problem. The AR hypothesis is supported by evidence from many levels, including genes, neurotransmitters and their receptors, neurophysiology, neuroanatomy, neuroenergetics, pharmacology, cognition and behavior, and the efficacy of treatments. In addition, we address and provide explanations for puzzling findings in the cognitive and behavioral genetics literatures on depression. In the process, we challenge the belief that serotonin transmission is low in depression. Finally, we discuss implications of the hypothesis for understanding and treating depression.

445 citations


Journal ArticleDOI
TL;DR: It is argued that self-specificity characterizes the subjective perspective, which is not intrinsically self-evaluative but rather relates any represented object to the representing subject and is anchored to the sensorimotor integration of efference with reafference.
Abstract: The authors propose a paradigm shift in the investigation of the self. Synthesizing neuroimaging results from studies investigating the self, the authors first demonstrate that self-relatedness evaluation involves a wide cerebral network, labeled E-network, comprising the medial prefrontal cortex, precuneus, temporoparietal junction, and temporal poles. They further show that this E-network is also recruited during resting state, others' mind reading, memory recall, and reasoning. According to these data, (a) the profile of activation of the E-network demonstrates no preference for the self, and (b) the authors suggest that activity in this network can be explained by the involvement of cognitive processes common to all the tasks recruiting it: inferential processing and memory recall. On this basis, they conclude that standard ways to tackle the self by considering self-evaluation do not target the self in its specificity. Instead, they argue that self-specificity characterizes the subjective perspective, which is not intrinsically self-evaluative but rather relates any represented object to the representing subject. They further propose that such self-specific subject-object relation is anchored to the sensorimotor integration of efference with reafference (i.e., the motor command of the subject's action and its sensory consequence in the external world).

403 citations


Journal ArticleDOI
TL;DR: Using a Bayesian probabilistic model, the authors demonstrate how word meanings can be learned by treating experiential and distributional data as a single joint distribution and learning the statistical structure that underlies it.
Abstract: The authors identify 2 major types of statistical data from which semantic representations can be learned. These are denoted as experiential data and distributional data. Experiential data are derived by way of experience with the physical world and comprise the sensory-motor data obtained through sense receptors. Distributional data, by contrast, describe the statistical distribution of words across spoken and written language. The authors claim that experiential and distributional data represent distinct data types and that each is a nontrivial source of semantic information. Their theoretical proposal is that human semantic representations are derived from an optimal statistical combination of these 2 data types. Using a Bayesian probabilistic model, they demonstrate how word meanings can be learned by treating experiential and distributional data as a single joint distribution and learning the statistical structure that underlies it. The semantic representations that are learned in this manner are measurably more realistic, as verified by comparison to a set of human-based measures of semantic representation, than those available from either data type individually or from both sources independently. This is not a result of merely using quantitatively more data, but rather it is because experiential and distributional data are qualitatively distinct, yet intercorrelated, types of data. The semantic representations that are learned are based on statistical structures that exist both within and between the experiential and distributional data types.

391 citations


Journal ArticleDOI
TL;DR: The simple computations and the intuitive graphic representation of the analysis are illustrated by the analysis of diverse examples from the current literature.
Abstract: Null hypotheses are simple, precise, and theoretically important. Conventional statistical analysis cannot support them; Bayesian analysis can. The challenge in a Bayesian analysis is to formulate a suitably vague alternative, because the vaguer the alternative is (the more it spreads out the unit mass of prior probability), the more the null is favored. A general solution is a sensitivity analysis: Compute the odds for or against the null as a function of the limit(s) on the vagueness of the alternative. If the odds on the null approach 1 from above as the hypothesized maximum size of the possible effect approaches 0, then the data favor the null over any vaguer alternative to it. The simple computations and the intuitive graphic representation of the analysis are illustrated by the analysis of diverse examples from the current literature. They pose 3 common experimental questions: (a) Are 2 means the same? (b) Is performance at chance? (c) Are factors additive?

361 citations
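The sensitivity analysis described in this abstract can be sketched numerically. In the snippet below, the function name `bayes_factor_null` and the sample mean, standard error, and uniform alternative prior are illustrative assumptions of mine, not the article's worked examples; the idea is simply to compute the odds favoring the null as a function of the hypothesized maximum effect size.

```python
import numpy as np
from scipy import stats, integrate

def bayes_factor_null(xbar, se, a_max):
    """Odds for H0 (true effect = 0) against a vague alternative that spreads
    its prior mass uniformly over effect sizes in [0, a_max]."""
    like_null = stats.norm.pdf(xbar, loc=0.0, scale=se)
    like_alt, _ = integrate.quad(
        lambda mu: stats.norm.pdf(xbar, loc=mu, scale=se) / a_max, 0.0, a_max)
    return like_null / like_alt

# Sensitivity analysis: sweep the assumed maximum size of the possible effect.
odds = {a: bayes_factor_null(xbar=0.05, se=0.2, a_max=a)
        for a in [0.1, 0.5, 1.0, 2.0]}
```

As the alternative grows vaguer (larger `a_max`), the odds increasingly favor the null, which is the article's central point about vague alternatives.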


Journal ArticleDOI
TL;DR: In this paper, causal induction is the product of domain-general statistical inference guided by domain-specific prior knowledge, in the form of an abstract causal theory, which is defined as the ontology of entities, properties, and relations that organize a domain; the plausibility of specific causal relationships; and the functional form of those relationships.
Abstract: Inducing causal relationships from observations is a classic problem in scientific inference, statistics, and machine learning. It is also a central part of human learning, and a task that people perform remarkably well given its notorious difficulties. People can learn causal structure in various settings, from diverse forms of data: observations of the co-occurrence frequencies between causes and effects, interactions between physical objects, or patterns of spatial or temporal coincidence. These different modes of learning are typically thought of as distinct psychological processes and are rarely studied together, but at heart they present the same inductive challenge: identifying the unobservable mechanisms that generate observable relations between variables, objects, or events, given only sparse and limited data. We present a computational-level analysis of this inductive problem and a framework for its solution, which allows us to model all these forms of causal learning in a common language. In this framework, causal induction is the product of domain-general statistical inference guided by domain-specific prior knowledge, in the form of an abstract causal theory. We identify 3 key aspects of abstract prior knowledge (the ontology of entities, properties, and relations that organizes a domain; the plausibility of specific causal relationships; and the functional form of those relationships) and show how they provide the constraints that people need to induce useful causal models from sparse data.

322 citations


Journal ArticleDOI
TL;DR: A computational theory of performance in this task is described, which provides a detailed, quantitative account of attentional effects in spatial cuing tasks at the level of response accuracy and the response time distributions.
Abstract: The simplest attentional task, detecting a cued stimulus in an otherwise empty visual field, produces complex patterns of performance. Attentional cues interact with backward masks and with spatial uncertainty, and there is a dissociation in the effects of these variables on accuracy and on response time. A computational theory of performance in this task is described. The theory links visual encoding, masking, spatial attention, visual short-term memory (VSTM), and perceptual decision making in an integrated dynamic framework. The theory assumes that decisions are made by a diffusion process driven by a neurally plausible, shunting VSTM. The VSTM trace encodes the transient outputs of early visual filters in a durable form that is preserved for the time needed to make a decision. Attention increases the efficiency of VSTM encoding, either by increasing the rate of trace formation or by reducing the delay before trace formation begins. The theory provides a detailed, quantitative account of attentional effects in spatial cuing tasks at the level of response accuracy and the response time distributions.

271 citations


Journal ArticleDOI
TL;DR: A Bayesian framework is presented that shows how statistical inference can operate over structured background knowledge, and the authors argue that this interaction between structure and statistics is critical for explaining the power and flexibility of human reasoning.
Abstract: Everyday inductive inferences are often guided by rich background knowledge. Formal models of induction should aim to incorporate this knowledge and should explain how different kinds of knowledge lead to the distinctive patterns of reasoning found in different inductive contexts. This article presents a Bayesian framework that attempts to meet both goals and describes 4 applications of the framework: a taxonomic model, a spatial model, a threshold model, and a causal model. Each model makes probabilistic inferences about the extensions of novel properties, but the priors for the 4 models are defined over different kinds of structures that capture different relationships between the categories in a domain. The framework therefore shows how statistical inference can operate over structured background knowledge, and the authors argue that this interaction between structure and statistics is critical for explaining the power and flexibility of human reasoning.

244 citations


Journal ArticleDOI
TL;DR: A Bayesian model is presented to explain the perceptual magnet effect, in which discriminability between vowels is reduced near prototypical vowel sounds, and provides a framework for exploring categorical effects in other domains.
Abstract: A variety of studies have demonstrated that organizing stimuli into categories can affect the way the stimuli are perceived. We explore the influence of categories on perception through one such phenomenon, the perceptual magnet effect, in which discriminability between vowels is reduced near prototypical vowel sounds. We present a Bayesian model to explain why this reduced discriminability might occur: It arises as a consequence of optimally solving the statistical problem of perception in noise. In the optimal solution to this problem, listeners’ perception is biased toward phonetic category means because they use knowledge of these categories to guide their inferences about speakers’ target productions. Simulations show that model predictions closely correspond to previously published human data, and novel experimental results provide evidence for the predicted link between perceptual warping and noise. The model unifies several previous accounts of the perceptual magnet effect and provides a framework for exploring categorical effects in other domains.
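A minimal sketch of the core shrinkage computation: assuming a single Gaussian phonetic category and Gaussian speech-signal noise (a simplification of the full multi-category model), the optimal percept is a precision-weighted average of the stimulus and the category mean, so equally spaced stimuli are perceived as compressed toward the prototype. Function names and parameter values here are illustrative.

```python
def perceived(stimulus, mu_c, var_c, var_noise):
    """Posterior-mean estimate of the speaker's target production, assuming
    target ~ N(mu_c, var_c) and observed signal ~ N(target, var_noise)."""
    w = var_c / (var_c + var_noise)  # weight on the stimulus itself
    return w * stimulus + (1 - w) * mu_c

# Two stimuli 0.2 apart near the prototype (mu_c = 0) are perceived as
# closer together, because both percepts shrink toward the category mean.
a = perceived(0.9, mu_c=0.0, var_c=1.0, var_noise=1.0)
b = perceived(1.1, mu_c=0.0, var_c=1.0, var_noise=1.0)
```

With equal category and noise variances the perceived separation is half the physical one, illustrating the reduced discriminability near the prototype.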

Journal ArticleDOI
TL;DR: The model and data show that the standard signal detection interpretation of z-transformed receiver operating characteristic (z-ROC) functions is wrong; the model also explains sequential effects in which the slope of the z-ROC function changes by about 10% as a function of the prior response in the test list.
Abstract: A new model for confidence judgments in recognition memory is presented. In the model, the match between a single test item and memory produces a distribution of evidence, with better matches corresponding to distributions with higher means. On this match dimension, confidence criteria are placed, and the areas between the criteria under the distribution are used as drift rates to drive racing Ornstein-Uhlenbeck diffusion processes. The model is fit to confidence judgments and quantile response times from two recognition memory experiments that manipulated word frequency and speed versus accuracy emphasis. The model and data show that the standard signal detection interpretation of z-transformed receiver operating characteristic (z-ROC) functions is wrong. The model also explains sequential effects in which the slope of the z-ROC function changes by about 10% as a function of the prior response in the test list.

Journal ArticleDOI
TL;DR: An overview of experimental paradigms used to study habituation is provided, a theoretical approach to habituation to food based on memory and associative conditioning models are integrated, and research on factors that influence habituation are reviewed.
Abstract: Research has shown that animals and humans habituate on a variety of behavioral and physiological responses to repeated presentations of food cues, and habituation is related to amount of food consumed and cessation of eating. The purpose of this article is to provide an overview of experimental paradigms used to study habituation, integrate a theoretical approach to habituation to food based on memory and associative conditioning models, and review research on factors that influence habituation. Individual differences in habituation as they relate to obesity and eating disorders are reviewed, along with research on how individual differences in memory can influence habituation. Other associative conditioning approaches to ingestive behavior are reviewed, as well as how habituation provides novel approaches to preventing or treating obesity. Finally, new directions for habituation research are presented. Habituation provides a novel theoretical framework from which to understand factors that regulate ingestive behavior.

Journal ArticleDOI
TL;DR: The author concludes that the localist representations embedded in theories of perception and cognition are consistent with neuroscience; biology only calls into question the distributed representations often learned in PDP models.
Abstract: A fundamental claim associated with parallel distributed processing (PDP) theories of cognition is that knowledge is coded in a distributed manner in mind and brain. This approach rejects the claim that knowledge is coded in a localist fashion, with words, objects, and simple concepts (e.g., "dog") each coded with their own dedicated representations. One of the putative advantages of this approach is that the theories are biologically plausible. Indeed, advocates of the PDP approach often highlight the close parallels between distributed representations learned in connectionist models and neural coding in the brain and often dismiss localist (grandmother cell) theories as biologically implausible. The author reviews a range of data that strongly challenge this claim and shows that localist models provide a better account of single-cell recording studies. The author also contrasts local and alternative distributed coding schemes (sparse and coarse coding) and argues that the common rejection of grandmother cell theories in neuroscience is due to a misunderstanding about how localist models behave. The author concludes that the localist representations embedded in theories of perception and cognition are consistent with neuroscience; biology only calls into question the distributed representations often learned in PDP models.

Journal ArticleDOI
TL;DR: Simulated distributions of pronunciation times are described in a further test for multiplicative interactions and interdependence in order to establish interaction dominant dynamics, in contrast with component dominant dynamics, as a likely mechanism for cognitive activity.
Abstract: Trial-to-trial variation in word-pronunciation times exhibits 1/f scaling. One explanation is that human performances are consequent on multiplicative interactions among interdependent processes-interaction dominant dynamics. This article describes simulated distributions of pronunciation times in a further test for multiplicative interactions and interdependence. Individual participant distributions of approximately 1,100 word-pronunciation times were successfully mimicked for each participant in combinations of lognormal and power-law behavior. Successful hazard function simulations generalized these results to establish interaction dominant dynamics, in contrast with component dominant dynamics, as a likely mechanism for cognitive activity.
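One hedged way to mimic the lognormal/power-law combination the abstract describes (parameter values below are arbitrary choices of mine, not the values fitted to the participants' data) is to draw most trials from a lognormal body and a minority from a heavy Pareto tail.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_rts(n, p_tail=0.1, mu=6.3, sigma=0.2, alpha=2.5, xmin=800.0):
    """Mixture of a lognormal body and a Pareto (power-law) tail, one simple
    way to mimic combined lognormal/power-law pronunciation-time behavior."""
    tail = rng.random(n) < p_tail                   # which trials are tail draws
    body = rng.lognormal(mean=mu, sigma=sigma, size=n)
    heavy = xmin * (1 + rng.pareto(alpha, size=n))  # classical Pareto, min xmin
    return np.where(tail, heavy, body)

rts = simulate_rts(1100)  # roughly the per-participant trial count reported
```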

Journal ArticleDOI
TL;DR: This article provides a framework that resolves several anomalies of intertemporal choice by developing discount functions from marginal utilities, and returns conjointly measured determinations of monetary utility and temporal distance functions.
Abstract: Goods remote in temporal, spatial, or social distance, or in likelihood, exert less control over our behavior than those more proximate. The decay of influence with distance, of perennial interest to behavioral economists, has had a renaissance in the study of delay discounting. By developing discount functions from marginal utilities, this article provides a framework that resolves several anomalies of intertemporal choice. Utilities are inferred to be power functions of monetary value, delay, and probability. Utility, not value, is discounted, with decisions made by adding the utility of a good to the disutility of a delay or contingency. The theory reduces to standard treatments, such as exponential, hyperbolic and hyperboloid, and exponential-power; naturally predicts magnitude effects and other asymmetries; is consistent with subadditivity, immediacy, and certainty effects; returns conjointly measured determinations of monetary utility and temporal distance functions; and is extensible to other dimensions of distance.
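The additive scheme described above can be illustrated with a toy sketch, assuming power-function utilities for money and delay; the function name and parameter values are illustrative stand-ins, not the article's fitted values.

```python
def net_utility(amount, delay, a=0.8, b=0.1, c=0.9):
    """Additive utility: power-function utility of the good plus a (negative)
    power-function disutility of the delay. Parameters are illustrative."""
    return amount ** a - b * delay ** c

# With these parameters, a smaller immediate reward can beat a larger
# reward delayed by a year, the signature of delay discounting.
prefer_small_now = net_utility(50, 0) > net_utility(100, 365)
```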

Journal ArticleDOI
TL;DR: A new approach to modeling and understanding cognition, cognitively bounded rational analysis, is described that sharpens the predictive acuity of general, integrated theories of cognition and action and narrows the space of predicted behaviors through analysis of the payoff achieved by alternative strategies, rather than through fitting strategies and theoretical parameters to data.
Abstract: The authors assume that individuals adapt rationally to a utility function given constraints imposed by their cognitive architecture and the local task environment. This assumption underlies a new approach to modeling and understanding cognition—cognitively bounded rational analysis—that sharpens the predictive acuity of general, integrated theories of cognition and action. Such theories provide the necessary computational means to explain the flexible nature of human behavior but in doing so introduce extreme degrees of freedom in accounting for data. The new approach narrows the space of predicted behaviors through analysis of the payoff achieved by alternative strategies, rather than through fitting strategies and theoretical parameters to data. It extends and complements established approaches, including computational cognitive architectures, rational analysis, optimal motor control, bounded rationality, and signal detection theory. The authors illustrate the approach with a reanalysis of an existing account of psychological refractory period (PRP) dual-task performance and the development and analysis of a new theory of ordered dual-task responses. These analyses yield several novel results, including a new understanding of the role of strategic variation in existing accounts of PRP and the first predictive, quantitative account showing how the details of ordered dual-task phenomena emerge from the rational control of a cognitive system subject to the combined constraints of internal variance, motor interference, and a response selection bottleneck.

Journal ArticleDOI
TL;DR: A system of procedural or "if... then" rules that foster mutuality in responsiveness by informing and motivating trust and commitment are described and it is argued that tuning rule accessibility and enactment to match the situations encountered in a specific relationship shapes its personality.
Abstract: A model of mutual responsiveness in adult romantic relationships is proposed. Behaving responsively in conflict-of-interest situations requires one partner to resist the temptation to be selfish and the other partner to resist the temptation to protect against exploitation. Managing risk and the attendant temptations of self-interest require the interpersonal mind to function in ways that coordinate trust and commitment across partners. The authors describe a system of procedural or "if... then" rules that foster mutuality in responsiveness by informing and motivating trust and commitment. The authors further argue that tuning rule accessibility and enactment to match the situations encountered in a specific relationship shapes its personality. By imposing a procedural structure on the interdependent mind, the proposed model of mutual responsiveness reframes interdependence theory and generates important research questions for the future.

Journal ArticleDOI
TL;DR: It is argued here that an integration of the theories and findings of mainstream social psychology and of cultural evolutionary theory can be mutually beneficial.
Abstract: Cultural evolutionary theory is an interdisciplinary field in which human culture is viewed as a Darwinian process of variation, competition, and inheritance, and the tools, methods, and theories developed by evolutionary biologists to study genetic evolution are adapted to study cultural change. It is argued here that an integration of the theories and findings of mainstream social psychology and of cultural evolutionary theory can be mutually beneficial. Social psychology provides cultural evolution with a set of empirically verified microevolutionary cultural processes, such as conformity, model-based biases, and content biases, that are responsible for specific patterns of cultural change. Cultural evolutionary theory provides social psychology with ultimate explanations for, and an understanding of the population-level consequences of, many social psychological phenomena, such as social learning, conformity, social comparison, and intergroup processes, as well as linking social psychology with other social science disciplines such as cultural anthropology, archaeology, and sociology.

Journal ArticleDOI
TL;DR: A social psychological model of prospective memory and habit development based on relevant research literature and computer simulations points to a new understanding of the role of habits in supporting the performance of repeated behaviors through remembering.
Abstract: This article presents a social psychological model of prospective memory and habit development. The model is based on relevant research literature, and its dynamics were investigated by computer simulations. Time-series data from a behavior-change campaign in Cuba were used for calibration and validation of the model. The model scored well in several system-analytical tests, including the replication of the data and the forecast of later developments based on earlier data. Additionally, the calibrated parameter values indicate that the accessibilities of intentions decay at the same rate as retrospective memories. However, the accessibilities may stay high due to a reminder, the effectiveness of which depends on a person's commitment to performing the behavior. Furthermore, the effect of the reminder decays over time. This decay is much slower than the development of habits, which, after about a month, were nearly fully developed if the person had executed the behavior sufficiently often. Finally, over time, habits were shown to replace the reminding effect of the external memory aid. This article points to a new understanding of the role of habits in supporting the performance of repeated behaviors through remembering.

Journal ArticleDOI
TL;DR: The authors describe a theoretical model of processes of distributed social cognition that takes account of 3 levels: the individual perceiver, the interacting dyad, and the social network in which they are embedded and results of a multiagent simulation of a subset of these processes are presented.
Abstract: Research on person perception typically emphasizes cognitive processes of information selection and interpretation within the individual perceiver and the nature of the resulting mental representations. The authors focus instead on the ways person perception processes create, and are influenced by, the patterns of impressions that are socially constructed, transmitted, and filtered through social networks. As the socially situated cognition perspective (E. R. Smith & G. R. Semin, 2004) suggests, it is necessary to supplement consideration of intra-individual cognitive processes with an examination of the social context. The authors describe a theoretical model of processes of distributed social cognition that takes account of 3 levels: the individual perceiver, the interacting dyad, and the social network in which they are embedded. The authors' model assumes that perceivers elicit or create as well as interpret impression-relevant information in dyadic interaction and that perceivers obtain information from 3rd-party sources who are linked to perceivers and targets in social networks. The authors also present results of a multiagent simulation of a subset of these processes. Implications of the theoretical model are discussed, for the possibility of correcting biases in person perception and for the nature of underlying mental representations of persons.

Journal ArticleDOI
TL;DR: ND-TSD poses novel, theoretically meaningful constraints on theories of recognition and decision making more generally, and provides a mechanism for rapprochement between theories of decision making that employ deterministic response rules and those that postulate probabilistic response rules.
Abstract: A tacit but fundamental assumption of the theory of signal detection is that criterion placement is a noise-free process. This article challenges that assumption on theoretical and empirical grounds and presents the noisy decision theory of signal detection (ND-TSD). Generalized equations for the isosensitivity function and for measures of discrimination incorporating criterion variability are derived, and the model's relationship with extant models of decision making in discrimination tasks is examined. An experiment evaluating recognition memory for ensembles of word stimuli revealed that criterion noise is not trivial in magnitude and contributes substantially to variance in the slope of the isosensitivity function. The authors discuss how ND-TSD can help explain a number of current and historical puzzles in recognition memory, including the inconsistent relationship between manipulations of learning and the isosensitivity function's slope, the lack of invariance of the slope with manipulations of bias or payoffs, the effects of aging on the decision-making process in recognition, and the nature of responding in remember-know decision tasks. ND-TSD poses novel, theoretically meaningful constraints on theories of recognition and decision making more generally, and provides a mechanism for rapprochement between theories of decision making that employ deterministic response rules and those that postulate probabilistic response rules.
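ND-TSD's core claim, that trial-to-trial criterion variability distorts measured sensitivity, can be illustrated with a small simulation. This is a generic equal-variance sketch, not the paper's model: the true d' of 1.5, the criterion placement at d'/2, and the criterion noise of 0.8 are illustrative assumptions.

```python
import random
from statistics import NormalDist

def empirical_dprime(criterion_sd, d_true=1.5, n=150_000, seed=1):
    """Equal-variance detection with trial-to-trial Gaussian criterion noise."""
    rng = random.Random(seed)
    z = NormalDist().inv_cdf
    hits = sum(rng.gauss(d_true, 1.0) > rng.gauss(d_true / 2, criterion_sd)
               for _ in range(n))
    fas = sum(rng.gauss(0.0, 1.0) > rng.gauss(d_true / 2, criterion_sd)
              for _ in range(n))
    return z(hits / n) - z(fas / n)

d_noiseless = empirical_dprime(0.0)   # classical TSD: criterion is fixed
d_noisy = empirical_dprime(0.8)       # ND-TSD-style criterion variability
```

Because the noisy criterion adds variance to every decision, the measured d' shrinks toward d'/sqrt(1 + sigma_c^2) even though true sensitivity is unchanged, which is one way criterion noise can masquerade as a change in discrimination.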

Journal ArticleDOI
TL;DR: It is demonstrated that design optimization has the potential to increase the informativeness of the experimental method; the quality of the optimal design is compared with that of designs used in the literature.
Abstract: Models of a psychological process can be difficult to discriminate experimentally because it is not easy to determine the values of the critical design variables (e.g., presentation schedule, stimulus structure) that will be most informative in differentiating them. Recent developments in sampling-based search methods in statistics make it possible to determine these values and thereby identify an optimal experimental design. After describing the method, it is demonstrated in 2 content areas in cognitive psychology in which models are highly competitive: retention (i.e., forgetting) and categorization. The quality of the optimal design is compared with that of designs used in the literature. The findings demonstrate that design optimization has the potential to increase the informativeness of the experimental method.
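The retention example can be caricatured in a few lines. The sketch below is a toy stand-in for the paper's sampling-based optimization: the two forgetting models, their fixed parameters, the candidate lag range of 1 to 40, and the simple divergence-based utility are all assumptions made for illustration, and brute-force search replaces the sampling methods that the real approach needs for large design spaces.

```python
import itertools
import math

# Two competing retention models with illustrative (assumed) parameters.
def power_model(t):
    return 0.9 * (t + 1) ** -0.6

def exp_model(t):
    return 0.9 * math.exp(-0.2 * t)

def design_score(lags):
    # Crude utility: a design is informative where the models' predictions
    # diverge, since data collected there can tell the models apart.
    return sum(abs(power_model(t) - exp_model(t)) for t in lags)

# Exhaustive search over all 3-lag designs; the paper's sampling-based
# search scales to design spaces where this brute force would not.
best_design = max(itertools.combinations(range(1, 41), 3), key=design_score)
```

The point survives the caricature: the best set of test lags is a property of the competing models, not something an experimenter can reliably guess, and an arbitrary design such as (5, 10, 15) can never score higher than the optimized one.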

Journal ArticleDOI
TL;DR: It is proposed that once the likely nature of people's actual experience of such processes is taken into account, these "errors" and "biases" actually emerge as apt reflections of the probabilistic characteristics of sequences of random events.
Abstract: A long tradition of psychological research has lamented the systematic errors and biases in people's perception of the characteristics of sequences generated by a random mechanism such as a coin toss. It is proposed that once the likely nature of people's actual experience of such processes is taken into account, these "errors" and "biases" actually emerge as apt reflections of the probabilistic characteristics of sequences of random events. Specifically, seeming biases reflect the subjective experience of a finite data stream for an agent with a limited short-term memory capacity. Consequently, these biases seem testimony not to the limitations of people's intuitive statistics but rather to the extent to which the human cognitive system is finely attuned to the statistics of the environment.
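The finite-data-stream argument is easy to check numerically. A minimal sketch, with an assumed window length of 8 flips: in short windows, a self-overlapping streak such as HHH genuinely appears less often than an alternating pattern of the same length such as HTH, so an observer who experiences randomness through a limited window is tracking a real asymmetry.

```python
import random

def occurrence_prob(pattern, window=8, trials=100_000, seed=3):
    """Estimate P(pattern appears at least once in a window-flip stream)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        flips = ''.join(rng.choice('HT') for _ in range(window))
        hits += pattern in flips  # substring match on the observed window
    return hits / trials

p_hhh = occurrence_prob('HHH')  # self-overlapping streak
p_hth = occurrence_prob('HTH')  # alternating pattern of the same length
```

The exact probability that HHH occurs in 8 fair flips is 107/256, about 0.42, versus about 0.55 for HTH; expecting streaks to be rarer than alternations in short observed sequences is therefore calibrated, not biased.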

Journal ArticleDOI
TL;DR: The authors proposed and tested the alternative hypothesis that the influence of spatial geometry on both behavioral and neuronal levels can be explained by the properties of visual features that constitute local views of the environment, and suggested that the pattern of diagonal errors observed in reorientation tasks can be understood by the analysis of sensory information processing that underlies the navigation strategy employed to solve the task.
Abstract: Modern psychological theories of spatial cognition postulate the existence of a geometric module for reorientation. This concept is derived from experimental data showing that in rectangular arenas with distinct landmarks in the corners, disoriented rats often make diagonal errors, suggesting their preference for the geometric (arena shape) over the nongeometric (landmarks) cues. Moreover, sensitivity of hippocampal cell firing to changes in the environment layout was taken in support of the geometric module hypothesis. Using a computational model of rat navigation, the authors proposed and tested the alternative hypothesis that the influence of spatial geometry on both behavioral and neuronal levels can be explained by the properties of visual features that constitute local views of the environment. Their modeling results suggest that the pattern of diagonal errors observed in reorientation tasks can be understood by the analysis of sensory information processing that underlies the navigation strategy employed to solve the task. In particular, 2 navigation strategies were considered: (a) a place-based locale strategy that relies on a model of grid and place cells and (b) a stimulus–response taxon strategy that involves direct association of local views with action choices. The authors showed that the application of the 2 strategies in the reorientation tasks results in different patterns of diagonal errors, consistent with behavioral data. These results argue against the geometric module hypothesis by providing a simpler and biologically more plausible explanation for the related experimental data. Moreover, the same model also describes behavioral results in different types of water-maze tasks.

Journal ArticleDOI
TL;DR: Although Thomson's model has been largely forgotten, the authors show that it merits further consideration because it can compete, statistically and biologically, on equal terms with Spearman's model, and it is shown that it is impossible to distinguish statistically between the 2 models.
Abstract: Modern factor analysis is the outgrowth of Spearman's original "2-factor" model of intelligence, according to which a mental test score is regarded as the sum of a general factor and a specific factor. As early as 1914, Godfrey Thomson realized that the data did not require this interpretation and he demonstrated this by proposing what became known as his "bonds" model of intelligence. Van der Maas et al. (2006) have recently drawn attention to what they perceive as difficulties with both models and have proposed a 3rd model. Neither alternative requires the general factor that was at the core of Spearman's idea. Although Thomson's model has been largely forgotten, the authors show that it merits further consideration because it can compete, statistically and biologically, on equal terms with Spearman's model. In particular, they show that it is impossible to distinguish statistically between the 2 models. There are also lessons to be learnt from the way in which Thomson arrived at his model and from the subsequent debate between Spearman and Thomson. The extent to which the recent proposal by van der Maas et al. may offer any advantage over Spearman's and Thomson's models is unclear and requires further investigation.
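Thomson's bonds idea is concrete enough to simulate directly. In the sketch below, every quantity (50 bonds, 6 tests, a 0.3 sampling probability, 2,000 simulated people) is an illustrative assumption; each test score is the sum of a random subset of independent "bonds", with no general factor built in anywhere.

```python
import random
from itertools import combinations

rng = random.Random(4)
n_people, n_tests, n_bonds, p = 2000, 6, 50, 0.3

# Each test draws on a fixed random subset of the pool of bonds.
subsets = [[b for b in range(n_bonds) if rng.random() < p]
           for _ in range(n_tests)]

scores = []
for _ in range(n_people):
    bonds = [rng.gauss(0, 1) for _ in range(n_bonds)]  # independent abilities
    scores.append([sum(bonds[b] for b in s) for s in subsets])

def corr(i, j):
    xs = [row[i] for row in scores]
    ys = [row[j] for row in scores]
    mx, my = sum(xs) / n_people, sum(ys) / n_people
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

pairwise = [corr(i, j) for i, j in combinations(range(n_tests), 2)]
mean_corr = sum(pairwise) / len(pairwise)
```

Because any two tests share some bonds, all tests correlate positively, reproducing the positive manifold that Spearman explained with a single general factor; this is the statistical indistinguishability the article describes.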

Journal ArticleDOI
TL;DR: It is proposed that in natural environments people have little opportunity or incentive to induce the normative rules of probability theory and, given their cognitive constraints, linear additive integration may often offer superior bounded rationality.
Abstract: Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive integration, in part, at least, because of well-known capacity constraints on controlled thought. In this article, the authors show with computer simulations that when based on approximate knowledge of probabilities, as is routinely the case in natural environments, linear additive integration can yield estimates as accurate, and average decision returns as good, as those based on probability theory. It is proposed that in natural environments people have little opportunity or incentive to induce the normative rules of probability theory and, given their cognitive constraints, linear additive integration may often offer superior bounded rationality.
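The additive-versus-multiplicative contrast can be illustrated with a two-cue classification toy. This is not the authors' simulation: the cue likelihoods, the 0.15 noise on the reasoner's probability knowledge, and the two-class setup are assumptions chosen for a minimal sketch of the same logic.

```python
import random

rng = random.Random(5)
P1 = [0.8, 0.7]  # assumed P(cue_i = 1 | class A)
P0 = [0.3, 0.4]  # assumed P(cue_i = 1 | class B)

def noisy(prob, sd=0.15):
    # Probabilities are known only approximately, as in natural environments.
    return min(0.99, max(0.01, prob + rng.gauss(0, sd)))

def accuracy(rule, trials=40_000):
    correct = 0
    for _ in range(trials):
        is_a = rng.random() < 0.5
        cues = [rng.random() < (P1[i] if is_a else P0[i]) for i in range(2)]
        lik_a = [noisy(P1[i]) if cues[i] else 1 - noisy(P1[i]) for i in range(2)]
        lik_b = [noisy(P0[i]) if cues[i] else 1 - noisy(P0[i]) for i in range(2)]
        if rule == "multiplicative":  # normative Bayes on the noisy estimates
            guess_a = lik_a[0] * lik_a[1] > lik_b[0] * lik_b[1]
        else:  # linear additive integration of the same noisy estimates
            guess_a = lik_a[0] + lik_a[1] > lik_b[0] + lik_b[1]
        correct += guess_a == is_a
    return correct / trials

acc_mult = accuracy("multiplicative")
acc_add = accuracy("additive")
```

With exact probabilities the two rules pick the same class for every cue pattern here; once the probability estimates are noisy, the additive rule comes out about as accurate as, and in this toy setup slightly more accurate than, the multiplicative one, echoing the article's bounded-rationality point.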

Journal ArticleDOI
TL;DR: A trichotomous theory of recall is formulated that combines the traditional dual processes of recollection and familiarity with a reconstruction process and is embedded in a hidden Markov model that measures all 3 processes with low-burden tasks that are appropriate for even young children.
Abstract: One of the most extensively investigated topics in the adult memory literature, dual memory processes, has had virtually no impact on the study of early memory development. The authors remove the key obstacles to such research by formulating a trichotomous theory of recall that combines the traditional dual processes of recollection and familiarity with a reconstruction process. The theory is then embedded in a hidden Markov model that measures all 3 processes with low-burden tasks that are appropriate for even young children. These techniques are applied to a large corpus of developmental studies of recall, yielding stable findings about the emergence of dual memory processes between childhood and young adulthood and generating tests of many theoretical predictions. The techniques are extended to the study of healthy aging and to the memory sequelae of common forms of neurocognitive impairment, resulting in a theoretical framework that is unified over 4 major domains of memory research: early development, mainstream adult research, aging, and neurocognitive impairment. The techniques are also extended to recognition, creating a unified dual process framework for recall and recognition.

Journal ArticleDOI
TL;DR: The results of 3 experiments in which people viewed simple animations of objects colliding and made judgments of force and resistance supported several predictions made by this account.
Abstract: Impressions of force are commonplace in the visual perception of objects interacting. It is proposed that these impressions have their source in haptically mediated experiences of exertion of force in actions on objects. Visual impressions of force in interactions between objects occur by a kind of generalization of the proprioceptive impression of force to interactions between objects on the basis of matching to stored representations of actions on objects carried out by the perceiver. Such experiences give rise to a distinctive perceptual interpretation of interactions between objects as involving force exerted by one object acting against resistance offered by the other object. Active, moving objects are seen as exerting force; inactive objects are seen as offering varying degrees of resistance and not as exerting force unless there is reason to think that they acted back on the active object. The results of 3 experiments in which people viewed simple animations of objects colliding and made judgments of force and resistance supported several predictions made by this account.

Journal ArticleDOI
TL;DR: It is argued that the distinction between perception and decision making is unnecessary and that it is possible to give a unified account of both lexical processing and decision making, using fewer parameters than the diffusion model.
Abstract: R. Ratcliff, P. Gomez, and G. McKoon (2004) suggested that much of what goes on in lexical decision is attributable to decision processes and may not be particularly informative about word recognition. They proposed that lexical decision should be characterized by a decision process, taking the form of a drift-diffusion model (R. Ratcliff, 1978), that operates on the output of a lexical model. The present article argues that the distinction between perception and decision making is unnecessary and that it is possible to give a unified account of both lexical processing and decision making. This claim is supported by formal arguments and reinforced by simulations showing how the Bayesian Reader model (D. Norris, 2006) can be extended to fit the data on reaction time distributions collected by Ratcliff, Gomez, and McKoon simply by adding extra sources of noise. The Bayesian Reader gives an integrated explanation of both word recognition and decision making, using fewer parameters than the diffusion model. It can be thought of as a Bayesian diffusion model, which subsumes Ratcliff's drift-diffusion model as a special case.
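The drift-diffusion decision process at issue here can be sketched as a discrete-time random walk between two response boundaries. This is a generic Euler approximation with assumed parameters (drift 0.2, boundary 0.6, diffusion noise 0.3, arbitrary time units), not the fitted model from either paper.

```python
import random

def diffusion_trial(rng, drift=0.2, bound=0.6, noise_sd=0.3, dt=0.01):
    """One two-boundary drift-diffusion trial (Euler random-walk approximation)."""
    x, t = 0.0, 0.0
    while abs(x) < bound:
        x += drift * dt + rng.gauss(0, noise_sd * dt ** 0.5)
        t += dt
    return (x > 0, t)  # (responded "word", decision time)

rng = random.Random(6)
trials = [diffusion_trial(rng) for _ in range(3000)]
accuracy = sum(word for word, _ in trials) / len(trials)
rts = sorted(t for _, t in trials)
mean_rt = sum(rts) / len(rts)
median_rt = rts[len(rts) // 2]
```

Even this bare walk reproduces the two signatures the debate turns on: most walks reach the drift-favored boundary, and the first-passage times are right-skewed (mean above median), which is the reaction-time distribution shape both the diffusion model and the extended Bayesian Reader must fit.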