
Showing papers in "Psychological Review in 1993"


Journal ArticleDOI


TL;DR: It is suggested that delinquency conceals 2 distinct categories of individuals, each with a unique natural history and etiology: a small group engages in antisocial behavior of 1 sort or another at every life stage, whereas a larger group is antisocial only during adolescence.
Abstract: This chapter suggests that delinquency conceals two distinct categories of individuals, each with a unique natural history and etiology: A small group engages in antisocial behavior of one sort or another at every life stage, whereas a larger group is antisocial only during adolescence. According to the theory of life-course-persistent antisocial behavior, children's neuropsychological problems interact cumulatively with their criminogenic environments across development, culminating in a pathological personality. According to the theory of adolescence-limited antisocial behavior, a contemporary maturity gap encourages teens to mimic antisocial behavior in ways that are normative and adjustive. There are marked individual differences in the stability of antisocial behavior. The chapter reviews the mysterious relationship between age and antisocial behavior. Some youths who refrain from antisocial behavior may, for some reason, not sense the maturity gap and therefore lack the hypothesized motivation for experimenting with crime.

8,999 citations


Journal ArticleDOI


TL;DR: A theoretical framework is proposed that explains expert performance in terms of acquired characteristics resulting from extended deliberate practice and that limits the role of innate (inherited) characteristics to general levels of activity and emotionality.
Abstract: because observed behavior is the result of interactions between environmental factors and genes during the extended period of development. Therefore, to better understand expert and exceptional performance, we must require that the account specify the different environmental factors that could selectively promote and facilitate the achievement of such performance. In addition, recent research on expert performance and expertise (Chi, Glaser, & Farr, 1988; Ericsson & Smith, 1991a) has shown that important characteristics of experts' superior performance are acquired through experience and that the effect of practice on performance is larger than earlier believed possible. For this reason, an account of exceptional performance must specify the environmental circumstances, such as the duration and structure of activities, and necessary minimal biological attributes that lead to the acquisition of such characteristics and a corresponding level of performance. An account that explains how a majority of individuals can attain a given level of expert performance might seem inherently unable to explain the exceptional performance of only a small number of individuals. However, if such an account could be empirically supported, then the extreme characteristics of experts could be viewed as having been acquired through learning and adaptation, and studies of expert performance could provide unique insights into the possibilities and limits of change in cognitive capacities and bodily functions. In this article we propose a theoretical framework that explains expert performance in terms of acquired characteristics resulting from extended deliberate practice and that limits the role of innate (inherited) characteristics to general levels of activity and emotionality. We provide empirical support from two new studies and from already published evidence on expert performance in many different domains.

7,293 citations


Journal ArticleDOI


TL;DR: A contextual-evolutionary theory of human mating strategies is proposed in which men and women are hypothesized to have evolved distinct psychological mechanisms underlying short-term and long-term mating strategies.
Abstract: This article proposes a contextual-evolutionary theory of human mating strategies. Both men and women are hypothesized to have evolved distinct psychological mechanisms that underlie short-term and long-term strategies. Men and women confront different adaptive problems in short-term as opposed to long-term mating contexts. Consequently, different mate preferences become activated from their strategic repertoires. Nine key hypotheses and 22 predictions from Sexual Strategies Theory are outlined and tested empirically. Adaptive problems sensitive to context include sexual accessibility, fertility assessment, commitment seeking and avoidance, immediate and enduring resource procurement, paternity certainty, assessment of mate value, and parental investment. Discussion summarizes 6 additional sources of behavioral data, outlines adaptive problems common to both sexes, and suggests additional contexts likely to cause shifts in mating strategy.

3,432 citations


Journal ArticleDOI


TL;DR: Decision field theory provides a mathematical foundation for a dynamic, stochastic theory of decision behavior in an uncertain environment; the theory is compared with 4 other theories of decision making under uncertainty.
Abstract: Decision field theory provides a mathematical foundation for a dynamic, stochastic theory of decision behavior in an uncertain environment. This theory is used to explain (a) violations of stochastic dominance, (b) violations of strong stochastic transitivity, (c) violations of independence between alternatives, (d) serial position effects on preference, (e) speed-accuracy trade-off effects in decision making, (f) the inverse relation between choice probability and decision time, (g) changes in the direction of preference under time pressure, (h) slower decision times for avoidance as compared with approach conflicts, and (i) preference reversals between choice and selling price measures of preference. The proposed theory is compared with 4 other theories of decision making under uncertainty.

1,678 citations
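
The dynamic, stochastic decision process described here can be sketched as a sequential-sampling simulation: a preference state drifts under noisy valence input, with some decay, until it crosses a threshold, yielding both a choice and a decision time. The update rule and all parameter values below are illustrative assumptions for this sketch, not taken from the article.

```python
import random

def dft_choice(mean_valence=0.1, sd=1.0, threshold=10.0, decay=0.05,
               max_steps=10_000, rng=None):
    """One trial: the preference state p accumulates a noisy valence
    difference (option A minus option B) with decay until it crosses
    +threshold (choose A) or -threshold (choose B)."""
    rng = rng or random.Random()
    p, t = 0.0, 0
    while abs(p) < threshold and t < max_steps:
        p = (1 - decay) * p + rng.gauss(mean_valence, sd)
        t += 1
    return ("A" if p > 0 else "B"), t

def simulate(n=2000, seed=42, **kw):
    """Choice probability for A and mean decision time over n trials."""
    rng = random.Random(seed)
    trials = [dft_choice(rng=rng, **kw) for _ in range(n)]
    p_a = sum(choice == "A" for choice, _ in trials) / n
    mean_t = sum(t for _, t in trials) / n
    return p_a, mean_t
```

Because the same accumulation process produces both the choice and its latency, lowering the threshold trades accuracy for speed, which is how sequential-sampling models of this family capture speed-accuracy trade-offs and time-pressure effects.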


Journal ArticleDOI


TL;DR: The authors identify 6 major facts about skilled reading, show that the single-route model of Seidenberg and McClelland accounts for only the first of them, and describe the dual-route cascaded model, a computational version of the dual-route architecture that explains all 6.
Abstract: It has often been argued that various facts about skilled reading aloud cannot be explained by any model unless that model possesses a dual-route architecture (lexical and nonlexical routes from print to speech). This broad claim has been challenged by Seidenberg and McClelland (1989, 1990). Their model has but a single route from print to speech, yet, they contend, it can account for major facts about reading that have hitherto been claimed to require a dual-route architecture. The authors identify 6 of these major facts about reading. The 1-route model proposed by Seidenberg and McClelland can account for the first of these but not the remaining 5. Because models with dual-route architectures can explain all 6 of these basic facts about reading, the authors suggest that this remains the viable architecture for any tenable model of skilled reading and learning to read. The dual-route cascaded model, a computational version of the dual-route model, is described.

1,590 citations
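
The dual-route architecture at issue can be caricatured in a few lines: a lexical route that looks known words up whole, and a nonlexical route that assembles a pronunciation from grapheme-phoneme correspondence rules. The toy lexicon, rule set, and slash notation below are invented for illustration; the actual dual-route cascaded model is far more elaborate.

```python
# Hypothetical mini-lexicon: known spellings mapped to whole-word pronunciations.
LEXICON = {"pint": "/paInt/", "mint": "/mInt/"}

# Hypothetical grapheme-phoneme correspondence rules, longest graphemes first.
GPC_RULES = [("int", "Int/"), ("sl", "/sl"), ("m", "/m"), ("p", "/p")]

def nonlexical_route(word):
    """Assemble a pronunciation from spelling-sound rules alone: this
    handles regular words and nonwords, but regularizes exception
    words such as 'pint'."""
    out, i = "", 0
    while i < len(word):
        for grapheme, phoneme in GPC_RULES:
            if word.startswith(grapheme, i):
                out += phoneme
                i += len(grapheme)
                break
        else:                     # no rule applies: pass the letter through
            out += word[i]
            i += 1
    return out

def read_aloud(word):
    """Dual-route reading: the lexical route answers for known words;
    otherwise the nonlexical route assembles a candidate."""
    return LEXICON.get(word) or nonlexical_route(word)
```

The division of labor shows why a single route struggles: the exception word "pint" is read correctly only via the lexicon (the rule route would regularize it to rhyme with "mint"), while a nonword like "slint" can be read only by rule.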


Journal ArticleDOI


TL;DR: In this article, it is argued that an important source of constraints derives from the similarity comparison process itself, and that respects are determined by processes internal to comparisons, rather than hard-wired perceptual processes.
Abstract: This article reviews the status of similarity as an explanatory construct with a focus on similarity judgments. For similarity to be a useful construct, one must be able to specify the ways or respects in which two things are similar. One solution to this problem is to restrict the notion of similarity to hard-wired perceptual processes. It is argued that this view is too narrow and limiting. Instead, it is proposed that an important source of constraints derives from the similarity comparison process itself. Both new experiments and other evidence are described that support the idea that respects are determined by processes internal to comparisons.

1,068 citations


Journal ArticleDOI


TL;DR: The present model describes 4 types of emotion-activating systems, 3 of which involve noncognitive information processing; the hierarchical organization of the systems for generating emotions provides an adaptive advantage.
Abstract: The significant role of emotions in evolution and adaptation suggests that there must be more than 1 mechanism for generating them. Nevertheless, much of current emotion theory focuses on cognitive processes (appraisal, attribution, and construal) as the sole, or primary, means of eliciting emotions. As an alternative to this position, the present model describes 4 types of emotion-activating systems, 3 of which involve noncognitive information processing. From an evolutionary-developmental perspective, the systems may be viewed as a loosely organized hierarchical arrangement, with neural systems, the simplest and most rapid, at the base and cognitive systems, the most complex and versatile, at the top. The emotion-activating systems operate under a number of constraints, including genetically influenced individual differences. The hierarchical organization of the systems for generating emotions provides an adaptive advantage.

747 citations


Journal ArticleDOI


Asher Koriat
TL;DR: The author examines how feeling-of-knowing (FOK) judgments are determined, how accurate they are, and how the processes underlying FOK account for its accuracy, all within a unified model aimed at demystifying the FOK phenomenon; the computation of FOK is shown to be parasitic on the processes involved in attempting to retrieve the target, relying on the accessibility of pertinent information.
Abstract: Even when Ss fail to recall a solicited target, they can provide feeling-of-knowing (FOK) judgments about its availability in memory. Most previous studies addressed the question of FOK accuracy, only a few examined how FOK itself is determined, and none asked how the processes assumed to underlie FOK also account for its accuracy. The present work examined all 3 questions within a unified model, with the aim of demystifying the FOK phenomenon. The model postulates that the computation of FOK is parasitic on the processes involved in attempting to retrieve the target, relying on the accessibility of pertinent information. It specifies the links between memory strength, accessibility of correct and incorrect information about the target, FOK judgments, and recognition memory. Evidence from 3 experiments is presented. The results challenge the view that FOK is based on a direct, privileged access to an internal monitor.

745 citations


Journal ArticleDOI


TL;DR: The authors show that, when a decision about how to schedule a set of outcomes is framed as a choice between sequences, people prefer utility levels that improve over time and seek to spread good outcomes evenly over time.
Abstract: Existing models of intertemporal choice normally assume that people are impatient, preferring valuable outcomes sooner rather than later, and that preferences satisfy the formal condition of independence, or separability, which states that the value of a sequence of outcomes equals the sum of the values of its component parts. The authors present empirical results that show both of these assumptions to be false when choices are framed as being between explicitly defined sequences of outcomes. Without a proper sequential context, people may discount isolated outcomes in the conventional manner, but when the sequence context is highlighted, they claim to prefer utility levels that improve over time. The observed violations of additive separability follow, at least in part, from a desire to spread good outcomes evenly over time. Decisions of importance have delayed consequences. The choice of education, work, spending and saving, exercise, diet, as well as the timing of life events, such as schooling, marriage, and childbearing, all produce costs and benefits that endure over time. Therefore, it is not surprising that the problem of choosing between temporally distributed outcomes has attracted attention in a variety of disciplinary settings, including behavioral psychology, social psychology, decision theory, and economics. In spite of this disciplinary diversity, empirical research on intertemporal choice has traditionally had a narrow focus. Until a few years ago, virtually all studies of intertemporal choice were concerned with how people evaluate simple prospects consisting of a single outcome obtained at a point in time. The goal was to estimate equations that express the basic relationship between the atemporal value of an outcome and its value when delayed. Although the estimated functional forms would differ from investigation to investigation, there was general agreement on one point: that delayed outcomes are valued less.
In economics, this is referred to as "positive time discounting." Although plausible at first glance, the uniform imposition of positive discounting on all of one's choices has some disturbing and counterintuitive implications. It implies, for instance, that when faced with a decision about how to schedule a set of outcomes, a person should invariably start with the best outcome, followed by the second best outcome, and so on until the worst outcome is reached at the end. Because nothing restricts the generality of this principle, one should find people preferring a declining rather than an increasing standard of living, deteriorating rather than improving health (again, holding lifetime health constant), and so on. In the last few years, several studies have independently fo

647 citations
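
The independence (additive separability) assumption the authors test can be stated in a few lines: under positive time discounting, reordering the same outcomes so the best comes first always raises a sequence's value, which is exactly the "declining standard of living" implication that subjects' stated sequence preferences contradict. The utility numbers and discount factor below are illustrative.

```python
def discounted_value(sequence, delta=0.9):
    """Additively separable discounted utility: the value of a sequence
    equals the sum of its per-period utilities, each weighted by
    delta**t (positive time discounting, 0 < delta < 1)."""
    return sum(u * delta**t for t, u in enumerate(sequence))

improving = [1, 2, 3]   # e.g. a rising standard of living
declining = [3, 2, 1]   # the same outcomes, best first
```

With delta = 0.9 the declining order is worth 5.61 against 5.23 for the improving order, so a strict positive discounter should always schedule the best outcome first; the framed-sequence data reviewed in the article show people claiming the opposite preference.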


Journal ArticleDOI


TL;DR: The authors propose a psychological mechanism that highlights relations among disinhibition, reflection, and failures to learn from aversive feedback; the mechanism has implications for disinhibited individuals' impulsivity and provides a point of departure for studying the factors responsible for similarities and differences among these syndromes.
Abstract: Gorenstein and Newman (1980) proposed that poorly modulated responding for reward is the common diathesis underlying disinhibited behavior in several traditionally distinct person categories: psychopathy, hysteria, early onset alcoholism, childhood hyperactivity, and nonpathological impulsivity (e.g., extraversion). The authors extend this proposal by theorizing a psychological mechanism that highlights relations among disinhibition, reflection, and failures to learn from aversive feedback. The hypothesized mechanism is presented as 4 generic stages of response modulation: the dominant response set, the reaction to an aversive event, the subsequent behavioral adaptation, and the immediate and long-term consequences of reflection, or the lack thereof. The mechanism has implications for disinhibited individuals' impulsivity and provides a point of departure to study factors responsible for similarities and differences among these syndromes.

628 citations


Journal ArticleDOI


TL;DR: A hierarchical model of animal spatial cognitive maps is provided and it is suggested that the hippocampal formation and the posterior parietal cortex would act differently by handling topological and metric information, respectively.
Abstract: This article provides a hierarchical model of animal spatial cognitive maps. Such maps include both topological information, which affords loose, yet operational, representations of the connectivity of space and its overall arrangement, and metric information, which provides information about angles and distances. The model holds that maps can be initially described as a set of location-dependent reference frameworks providing directional information about other locations. The addition of an overall directional reference allows for the buildup of more complete (allocentric) representations. A survey of recent neurobiological data provides some hints about the brain structures involved in these processes and suggests that the hippocampal formation and the posterior parietal cortex would act differently by handling topological and metric information, respectively.

Journal ArticleDOI


TL;DR: Three forms of stereotype inaccuracy are identified: stereotypic inaccuracy, valence inaccuracy, and dispersion inaccuracy; the implications of each form are discussed, along with how each can be assessed using a full-accuracy design.
Abstract: A perennial issue in the study of social stereotypes concerns their accuracy. Yet, there is no clear concept of the various ways in which stereotypes may be accurate or inaccurate and how one would assess their accuracy. This article is designed to rectify this situation. Three forms of stereotype inaccuracy are identified: stereotypic inaccuracy, valence inaccuracy, and dispersion inaccuracy. The implications of each form are discussed, along with how each can be assessed using a full-accuracy design. Past research that has attempted to examine stereotype accuracy is reviewed, and new data on the issue are presented. Although of perennial interest, the theoretical and methodological difficulties of assessing stereotype accuracy are substantial. The goal in this article is to alert the researcher to these difficulties and point toward their solution.

Journal ArticleDOI


TL;DR: A model of the sources and consequences of gesture-speech mismatches and their role during transitional periods in the acquisition of concepts is proposed and makes 2 major claims.
Abstract: Thoughts conveyed through gesture often differ from thoughts conveyed through speech. In this article, a model of the sources and consequences of such gesture-speech mismatches and their role during transitional periods in the acquisition of concepts is proposed. The model makes 2 major claims: (a) The transitional state is the source of gesture-speech mismatch. In gesture-speech mismatch, 2 beliefs are simultaneously expressed on the same problem--one in gesture and another in speech. This simultaneous activation of multiple beliefs characterizes the transitional knowledge state and creates gesture-speech mismatch. (b) Gesture-speech mismatch signals to the social world that a child is in a transitional state and is ready to learn. The child's spontaneous gestures index the zone of proximal development, thus providing a mechanism by which adults can calibrate their input to that child's level of understanding.

Journal ArticleDOI


TL;DR: After identifying weaknesses in the linguistic category model of Au (1986), Brown and Fish (1983), and Fiedler and Semin (1988) and in the conversational model of Turnbull and Slugoski (1988) and Hilton (1990), the authors propose a Discursive Action Model for investigating everyday causal attribution.
Abstract: Everyday explanations of human actions have been studied as event perception, with language as part of the method, used by experimenters for describing events and obtaining causal judgments from Ss. Recently, language has acquired theoretical importance as the medium of causal thinking. Two developments are the linguistic category model of Au (1986), Brown and Fish (1983), and Fiedler and Semin (1988) and the conversational model of Turnbull and Slugoski (1988) and Hilton (1990). Three areas of weakness are identified: the relation between linguistic and psychological analysis, the nature of ordinary discourse, and the action orientation of event descriptions. A Discursive Action Model is proposed for investigating everyday causal attribution. Although a cognitive psychology of discursive attribution is considered feasible, this must follow a reconceptualization of language as social action.

Journal ArticleDOI


TL;DR: In this article, a simple model of face recognition is lesioned in the parts of the model corresponding to visual processing, and the model demonstrates covert recognition in three qualitatively different tasks.
Abstract: Covert recognition of faces in prosopagnosia, in which patients cannot overtly recognize faces but nevertheless manifest recognition when tested in certain indirect ways, has been interpreted as the functioning of an intact visual face recognition system deprived of access to other brain systems necessary for consciousness. The authors propose an alternative hypothesis: that the visual face recognition system is damaged but not obliterated in these patients and that damaged neural networks will manifest their residual knowledge in just the kinds of tasks used to measure covert recognition. To test this, a simple model of face recognition is lesioned in the parts of the model corresponding to visual processing. The model demonstrates covert recognition in 3 qualitatively different tasks. Implications for the nature of prosopagnosia, and for other types of dissociations between conscious and unconscious perception, are discussed.

Journal ArticleDOI


TL;DR: The author shows that attitudes higher in heritability are responded to more quickly, are more resistant to change, and are more consequential in the attitude similarity attraction relationship, and argues that the possibility that a response may have a high heritability is too often ignored in psychological theorizing.
Abstract: It is argued that differences in response heritability may have important implications for the testing of general psychological theories, that is, responses that differ in heritability may function differently. For example, attitudes higher in heritability are shown to be responded to more quickly, to be more resistant to change, and to be more consequential in the attitude similarity attraction relationship. The substantive results are interpreted in terms of attitude strength and niche building. More generally, the implications of heritability for the generality and typicality of treatment effects are also discussed. Although psychologists clearly recognize the impact of genetics on behavior, their theories rarely reflect this knowledge. Most theories assume that behavior is relatively plastic and is shaped almost entirely by situational parameters. The possibility that a response may have a high heritability is often ignored. I argue here that ignoring this possibility is consequential. The vehicle used in this article is attitudes. This vehicle was chosen because it is a domain with which I have some familiarity; it is a domain that has a number of minitheories, and it is a domain in which the notion of heritability itself is suspect so that a demonstration of the effects of heritability should be particularly evocative.

Journal ArticleDOI


TL;DR: Three regularities in recognition memory are described with supporting data: the mirror effect, the order of receiver operating characteristic slopes, and the symmetry of movement of underlying distributions; their derivation from attention/likelihood theory is demonstrated.
Abstract: Three regularities in recognition memory are described with supporting data: the mirror effect, the order of receiver operating characteristic slopes, and the symmetry of movement of underlying distributions. The derivation of these regularities from attention/likelihood theory is demonstrated. The theory's central concept, which distinguishes it from other theories, is the following: Ss make recognition decisions by combining information about new and old items, the combination made in the form of likelihood ratios. The central role of the likelihood ratios extends the implications of signal detection theory for recognition memory. Attention/likelihood theory is fitted to data of 2 series of experiments. One series involves yes-no tests and confidence ratings, the other forced-choice experiments. It is argued that the regularities require a revision of most current theories of recognition memory.
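
The theory's central move, basing the recognition decision on a likelihood ratio rather than on raw familiarity, can be sketched with equal-variance Gaussian old and new distributions (an assumption of this sketch, not necessarily of attention/likelihood theory): respond "old" when the observed familiarity value is more likely under the old-item distribution than under the new-item one.

```python
import math

def gauss_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(x, mu_old=1.0, mu_new=0.0, sigma=1.0):
    """How much more likely the familiarity value x is under the
    old-item distribution than under the new-item distribution."""
    return gauss_pdf(x, mu_old, sigma) / gauss_pdf(x, mu_new, sigma)

def recognize(x, criterion=1.0, **kw):
    """Say 'old' when the likelihood ratio exceeds the criterion."""
    return "old" if likelihood_ratio(x, **kw) > criterion else "new"
```

With these illustrative parameters the equal-likelihood point sits midway between the two means, and shifting the criterion moves hit and false-alarm rates together, which is the signal-detection machinery the likelihood-ratio account builds on.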

Journal ArticleDOI


TL;DR: The author demonstrates that, instead, lengthening and pausing reflect a distinctly prosodic representation, in which phonological constituents are arranged in a hierarchical, nonrecursive structure.
Abstract: Phrase-final words tend to be lengthened and followed by a pause. The dominant view of prosodic production is that word lengthening and pausing reflect the syntax of a sentence. The author demonstrates that, instead, lengthening and pausing reflect a distinctly prosodic representation, in which phonological constituents are arranged in a hierarchical, nonrecursive structure. Prosodic structure is created without knowledge of words' phonemic content. As a result, within a single sentential position, greater word lengthening necessitates shorter pauses, but across positions, word and pause durations show a positive correlation. The author presents a model of prosodic production that describes the process of prosodic encoding and provides a quantitative specification of the relation between word lengthening and pausing. This model has implications for studies of language production, comprehension, and development.

Journal ArticleDOI


TL;DR: TODAM2 extends the chunking version of the convolution-correlation memory model TODAM, which uses multiple convolutions, n-grams, and chunks to account for chunking and serial organization, into a general model of item, associative, and serial-order information.
Abstract: This article presents an extended version of the convolution-correlation memory model TODAM (theory of distributed associative memory) that not only eliminates some of the inadequacies of previous versions but also provides a unified treatment of item, associative, and serial-order information. The chunking model extended the basic convolution-correlation formalism by using multiple convolutions, n-grams (multiple autoassociations of sums of item vectors), and chunks (sums of n-grams) to account for chunking and serial organization. TODAM2 extends the chunking model by including rn-grams (reduced n-grams), labels, and "lebals" (the involution or mirror image of a label) to provide a general model for episodic memory. For paired associates, it is assumed that subjects store only labeled n-grams and lebaled rn-grams. It is shown that the model is broadly consistent with a number of major empirical paired-associate and serial-order effects.
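
The convolution-correlation formalism at the heart of TODAM can be sketched in a few lines: circular convolution binds two item vectors into a single trace, and circular correlation with one item as a cue retrieves a noisy copy of its associate. The vector length and the N(0, 1/n) element distribution are standard choices for such distributed models, used here purely for illustration.

```python
import random

def circ_conv(a, b):
    """Circular convolution: bind two item vectors into one trace."""
    n = len(a)
    return [sum(a[k] * b[(i - k) % n] for k in range(n)) for i in range(n)]

def circ_corr(a, t):
    """Circular correlation: probe trace t with cue a to retrieve a
    noisy approximation of the item that was convolved with a."""
    n = len(a)
    return [sum(a[k] * t[(i + k) % n] for k in range(n)) for i in range(n)]

def random_item(n, rng):
    """Item vector with elements drawn from N(0, 1/n)."""
    return [rng.gauss(0, (1 / n) ** 0.5) for _ in range(n)]

def cosine(u, v):
    """Similarity used to compare a retrieved vector with candidates."""
    dot = sum(x * y for x, y in zip(u, v))
    return dot / (sum(x * x for x in u) ** 0.5 * sum(y * y for y in v) ** 0.5)
```

Probing the trace of a pair (f, g) with f yields a vector far more similar to g than to anything else, which is the basic paired-associate retrieval operation that TODAM2's n-grams, labels, and "lebals" elaborate.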

Journal ArticleDOI


TL;DR: A model of test strategies is developed, beginning with the premise that cognitive processes are adapted to reducing particularly costly errors rather than to detecting "truth"; the model clarifies the adaptiveness of certain confirmatory preferences, identifies conditions under which such preferences should diminish, and outlines how error minimization goals might produce data preferences coincidentally consistent with normative prescriptions.
Abstract: A broad empirical literature demonstrates what has been termed a confirmation bias or positive test strategy heuristic in reasoning (Klayman & Ha, 1987), a potentially maladaptive pattern of data preferences that coexists with more normative preferences for highly diagnostic information (Skov & Sherman, 1986). A model is developed to account for these variations in test strategies, beginning with the premise that cognitive processes are adapted to reducing particularly costly errors rather than to detecting "truth" (Cosmides & Tooby, 1992). By specifying the information required to minimize various errors of primary concern, the model clarifies the adaptiveness of certain confirmatory preferences, identifies conditions under which such preferences should diminish, and outlines how error minimization goals might produce data preferences coincidentally consistent with normative prescriptions.

Journal ArticleDOI


Janet Metcalfe
TL;DR: A prestorage novelty-familiarity monitor and a simple control procedure are proposed to solve a technical problem in composite-trace distributed models of human memory, particularly the Composite Holographic Associative Recall Memory (CHARM) model; output from the monitor is argued to underlie feeling-of-knowing judgments.
Abstract: This article stems from a technical problem in composite-trace distributed models of human memory and particularly in the Composite Holographic Associative Recall Memory (CHARM) model. Briefly, the composite trace--used as a central construct in such models--can become catastrophically out of control. To solve the problem, a prestorage novelty-familiarity monitor and a simple control procedure need to be implemented. Eight lines of experimental evidence converge on the idea that output from such a novelty-familiarity monitor underlies people's metacognitive judgments of feeling of knowing. Breakdown of the monitoring-control mechanism produces Korsakoff-like symptoms in the model. Impairments in feeling-of-knowing judgments and the failure to release from proactive inhibition, both characteristic of Korsakoff amnesia, are thus attributed to a monitoring-control failure rather than to deficits in the basic memory system.

Journal ArticleDOI


TL;DR: Fuzzy-trace theory explains this memory-independence effect on the grounds that reasoning operations do not directly access verbatim traces of critical background information but, rather, process gist that was retrieved and edited in parallel with the encoding of such information.
Abstract: Recent experiments have established the surprising fact that age improvements in reasoning are often dissociated from improvements in memory for determinative informational inputs. Fuzzy-trace theory explains this memory-independence effect on the grounds that reasoning operations do not directly access verbatim traces of critical background information but, rather, process gist that was retrieved and edited in parallel with the encoding of such information. This explanation also envisions 2 ways in which children's memory and reasoning might be mutually interfering: (a) memory-to-reasoning interference, a tendency to process verbatim traces of background inputs on both memory probes and reasoning problems that simultaneously improves memory performance and impairs reasoning, and (b) reasoning-to-memory interference, a tendency for reasoning activities that produce problem solutions to erase or reduce the distinctiveness of verbatim traces of background inputs. Both forms of interference were detected in studies of children's story inferences.

Journal ArticleDOI


TL;DR: The metrics and mapping framework is proposed to account for how heuristics, domain-specific reasoning, and intuitive statistical induction are integrated to generate estimates, and for predicting when people emphasize heuristics and when they emphasize domain-specific knowledge.
Abstract: Estimation is influenced by a variety of processes: application of heuristics, domain-specific reasoning, and intuitive statistical induction, among them. In this article, we propose the metrics and mapping framework to account for how these processes are integrated to generate estimates. This framework identifies 2 types of information as critical: knowledge of distributional properties (metric knowledge) and knowledge of relative status of individual entities within the distribution (mapping knowledge). Heuristics and domain-specific knowledge are both viewed as cues that contribute to mapping knowledge; intuitive statistical induction is viewed as providing cues to metric properties. Results of 4 experiments illustrate the framework's usefulness for integrating these types of information and for predicting when people emphasize heuristics and when they emphasize domain-specific knowledge.

Journal ArticleDOI


T. L. Davidson
TL;DR: A research strategy is described that confirms that food deprivation states produce salient interoceptive stimuli in rats; implications for the physiological origins of energy state signals, the brain structures involved in processing energy state information, and the manner in which signals of energy need influence feeding are considered.
Abstract: The idea that different states of energy need give rise to distinct interoceptive sensations has been basic to many accounts of the physiological and the learned controls of feeding. Yet, a number of difficulties have complicated attempts to provide direct evidence for this view. The present article describes a research strategy that confirms that food deprivation states produce salient interoceptive stimuli in rats. The implications of this research for the physiological origins of energy state signals, the brain structures involved with processing energy state information, and the manner in which signals of energy need influence feeding were considered. The possibility that food deprivation cues influence feeding by modulating the activation of associations involving external events and their postingestive aftereffects was discussed with reference to earlier associative accounts of the function of hunger signals.

Journal ArticleDOI

[...]

TL;DR: The authors show that behavior in different types of discrimination-reversal experiments and in extinction is not explained by 2 versions of a popular local model and that the nonlocal cumulative-effects model is consistent with matching and that it can duplicate the major properties of recurrent choice in a set of discrimination and extinction experiments.
Abstract: Recurrent choice has been studied for many years. A static law, matching, has been established, but there is no consensus on the underlying dynamic process. The authors distinguish between dynamic models in which the model state is identified with directly measurable behavioral properties (performance models) and models in which the relation between behavior and state is indirect (state models). Most popular dynamic choice models are local performance models. The authors show that behavior in different types of discrimination-reversal experiments and in extinction is not explained by 2 versions of a popular local model and that the nonlocal cumulative-effects model is consistent with matching and can duplicate the major properties of recurrent choice in a set of discrimination-reversal experiments. The model can also duplicate results from several other experiments on extinction after complex discrimination training.
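The static matching law referred to in this abstract has a simple quantitative form: the proportion of responses allocated to an alternative equals the proportion of reinforcers obtained from it. A minimal sketch, with hypothetical reinforcement rates not taken from the article:

```python
def matching_share(r1, r2):
    """Relative response rate predicted by strict matching:
    B1 / (B1 + B2) = R1 / (R1 + R2), where R_i are the
    reinforcement rates on the two alternatives."""
    return r1 / (r1 + r2)

# A schedule delivering 60 vs. 20 reinforcers per hour predicts
# that three quarters of responses go to the richer alternative.
share = matching_share(60, 20)
```

Dynamic models of the kind the article compares are attempts to explain how moment-to-moment behavior converges on this static relation.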

Journal ArticleDOI

[...]

TL;DR: In this paper, a theory of social contagion is defined to explain how social influence affects sexual development, which can, with some transition rate or probability, result in an increase in level of sexual experience.
Abstract: Epidemic Models of the Onset of Social Activities (EMOSA models) describe the spread of adolescent transition behaviors (e.g., sexuality, smoking, and drinking) through an interacting adolescent network. A theory of social contagion is defined to explain how social influence affects sexual development. Contacts within a network can, with some transition rate or probability, result in an increase in level of sexual experience. Five stages of sexual development are posited. One submodel proposes a systematic progression through these stages; a competing submodel treats each as an independent process. These models are represented in sets of dynamically interacting recursive equations, which are fit to empirical prevalence data to estimate parameters. Model adjustments are substantively interpretable and can be used to test for and better understand social interaction processes that affect adolescent sexual behavior.
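The recursive equations described here have the general shape of an epidemic (logistic) update, in which inexperienced adolescents transition after contact with experienced peers. A one-stage sketch with a hypothetical transition rate; the article's models track five stages and are fit to empirical prevalence data:

```python
def emosa_prevalence(p0, tau, years):
    """One-stage contagion update: each year, inexperienced
    adolescents (1 - p) who contact experienced peers (p)
    transition with rate tau, so p grows logistically toward 1.
    This is a sketch, not the article's five-stage model."""
    p = p0
    trajectory = [p]
    for _ in range(years):
        p = p + tau * p * (1 - p)
        trajectory.append(p)
    return trajectory

# Starting from 5% prevalence with a hypothetical rate of 0.8,
# prevalence rises along an S-shaped curve over 10 years.
traj = emosa_prevalence(0.05, 0.8, 10)
```

Fitting tau to observed prevalence curves is the kind of parameter estimation the abstract describes.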

Journal ArticleDOI

[...]

TL;DR: The authors consider problems of confirming the null hypothesis, power of the statistical test, Simpson's paradox, and between-subjects and within-subject correlations and conclude that formal models are necessary if findings of (in)dependence are to be interpreted meaningfully in terms of underlying theoretical processes.
Abstract: The authors provide a critical evaluation of the use of stochastic independence in psychological research. Specifically, they consider problems of confirming the null hypothesis, power of the statistical test, Simpson's paradox, and between-subjects and within-subject correlations. These problems are discussed in the context of research on theories of memory and cognitive development and illustrated with data on reasoning-remembering relationships. The authors conclude that demonstrations of response independence do not, by themselves, provide sufficient grounds for deciding whether a single process or multiple processes are necessary to account for performance. Instead, they argue that formal models are necessary if findings of (in)dependence are to be interpreted meaningfully in terms of underlying theoretical processes.

Journal ArticleDOI

[...]

TL;DR: In this article, a quantitative model of adaptation-level (AL) effects on stimulus generalization is presented, which integrates results from single stimulus, go-no-go, and choice discrimination training paradigms.
Abstract: This article presents a quantitative model of adaptation-level (AL) effects on stimulus generalization and integrates results from single stimulus, go-no-go, and choice discrimination training paradigms. The model accurately predicts (a) the gradualness of the shift in responding during the course of asymmetrical generalization testing, (b) the relation between the degree of asymmetry and the amount of shift, (c) the effect of overrepresenting certain stimuli during testing, and (d) the effect of varying the amount of training. With the discrimination training paradigms the effects of the degree of separation between the training stimuli (TSs) and of the relative frequency of their presentation during training and subsequent generalization testing are consistent with an extension of the basic model. Finally, new research is described affirming the applicability of the AL model to several infrahuman species.
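The adaptation-level mechanism can be sketched as a running average that drifts toward the stimuli actually experienced, which is what produces the gradual shift in responding during asymmetrical testing. The update weight and stimulus values below are hypothetical, not taken from the article:

```python
def adaptation_level(stimuli, al0, w=0.1):
    """Running adaptation level: after each stimulus, AL moves a
    fraction w of the way toward that stimulus. With an asymmetric
    test series (stimuli mostly above the training value), AL drifts
    gradually toward the test-series mean, shifting responding."""
    al = al0
    track = [al]
    for s in stimuli:
        al = (1 - w) * al + w * s
        track.append(al)
    return track

# Trained at 500, then tested on an asymmetric series above 500:
# AL climbs gradually from 500 toward the test-series mean (600).
track = adaptation_level([550, 600, 650] * 10, al0=500.0)
```

The gradualness of the climb (rather than an immediate jump to the new mean) is the qualitative signature the model predicts for asymmetrical generalization testing.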

Journal ArticleDOI

[...]

TL;DR: In this article, the perceptual protocols that underlie discrimination between fractals and between other types of random contours and fractals are examined, and it is shown that people have a natural disposition to view contour in terms of signal and noise.
Abstract: The observation that natural curves and surfaces are often fractal suggests that people may be sensitive to their statistical properties. The perceptual protocols that underlie discrimination between fractals and between other types of random contour and fractals are examined. Discrimination algorithms that have precisely the same sensitivities as human observers are constructed. These algorithms do not recognize the integrated scale hierarchy intrinsic to fractal form and operate by imposing a metatheory of structure that is based on a signal-noise distinction. The success of the algorithms implies that (a) self-affinity in random fractals is not perceptually recovered and (b) people have a natural disposition to view contour in terms of signal and noise. The authors propose that this disposition be understood as a principle of perceptual organization. The environment that we live in has essentially two architectural components: One is carpentered, designed, and built by people; the other is everything else, the material form of nature. If one observes carpentered structures with an unjaded eye, it is difficult not to be struck by the smoothness of the surfaces and the cleanness with which the lines are cut. Even the crudest and least adorned structures have these properties. The things that people make are at least minimally designed, and the primitives of design are lines and planes. This is as true of primitive structures and implements as it is of the things that are built today. An inspection of natural structures reveals an entirely different order. The boundaries that form natural surfaces and contours are often not smooth. Natural form—landscapes, mountain ranges, coastlines, stream paths, clouds, tree lines, vegetation cover—is irregular and rough in appearance. The apparent transparency of this observation belies the subtlety that is required to fully appreciate its import. 
Geometric descriptions of natural structures required the development of a new set of elements that differ radically from those that comprise Euclidean geometry, as well as new modes of analysis that depart from the smoothness assumptions on which differential geometry rests. Real analysis, as developed by Cauchy, Weierstrass, and Bolzano, treats structures that have specific properties under magnification.
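A standard tool in the fractal analyses discussed here is the box-counting dimension: count the boxes a curve occupies at successively smaller box sizes and fit the log-log slope. A smooth Euclidean line yields a dimension near 1, whereas rough natural contours exceed it. A minimal sketch (these are not the authors' discrimination algorithms):

```python
import math

def box_count_dim(points, scales):
    """Estimate box-counting dimension of a 2-D point set: count
    occupied boxes N(s) at each box size s, then fit log N(s)
    against log(1/s) by least squares; the slope is the dimension."""
    logs, logn = [], []
    for s in scales:
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        logs.append(math.log(1 / s))
        logn.append(math.log(len(boxes)))
    n = len(scales)
    mx, my = sum(logs) / n, sum(logn) / n
    num = sum((a - mx) * (b - my) for a, b in zip(logs, logn))
    den = sum((a - mx) ** 2 for a in logs)
    return num / den

# A smooth diagonal line sampled at 1000 points: the estimated
# dimension comes out very close to the Euclidean value of 1.
line = [(t / 1000.0, t / 1000.0) for t in range(1000)]
d = box_count_dim(line, [0.1, 0.05, 0.025, 0.0125])
```

Applied to a coastline-like contour, the same estimator returns a non-integer value between 1 and 2, which is the statistical regularity whose perceptual recovery the article questions.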

Journal ArticleDOI

[...]

TL;DR: A new reaction time model is presented that includes both sequential- stage (discrete) and overlapping-stage (continuous-flow) models as special cases, and observations of factor additivity support discrete-stage models.
Abstract: This article presents a new reaction time model that includes both sequential-stage (discrete) and overlapping-stage (continuous-flow) models as special cases. In the new model, task performance is carried out by a series of distinct processing stages, each of which functions as a queue. A stimulus conveys 1 or more distinct components of information (e.g., features), and each stage can begin processing as soon as it receives 1 component from its predecessor. If a stimulus activates only 1 component, successive stages operate in strict sequence; if it activates multiple components, successive stages operate with temporal overlap. Within this class of models, experimental factors affecting different processing stages always have additive effects on reaction time with sequential stages but rarely do so with overlapping stages. Within this class of models, then, observations of factor additivity support discrete-stage models.
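The queueing scheme described in this abstract can be sketched with the standard tandem-queue recursion: a stage starts on a component as soon as its predecessor has delivered that component and it has finished the previous component. With one component the stage times simply add; with several components the stages overlap and a factor's effect on reaction time is no longer its effect on a single stage. The stage durations below are hypothetical:

```python
def reaction_time(stage_times, n_components):
    """Tandem-queue stages. finish[i][k], the time stage i finishes
    component k, obeys:
        finish[i][k] = max(finish[i-1][k], finish[i][k-1]) + t[i]
    Reaction time is when the last stage finishes the last component."""
    finish = [[0.0] * n_components for _ in stage_times]
    for i, t in enumerate(stage_times):
        for k in range(n_components):
            prev_stage = finish[i - 1][k] if i > 0 else 0.0
            prev_comp = finish[i][k - 1] if k > 0 else 0.0
            finish[i][k] = max(prev_stage, prev_comp) + t
    return finish[-1][-1]

# One component: stages run in strict sequence, so RT is the sum
# of the stage times (100 + 150 + 120 = 370).
rt_serial = reaction_time([100, 150, 120], 1)

# Three components: stages overlap, so lengthening one stage can
# change RT by more than the amount added to that stage.
rt_overlap = reaction_time([100, 150, 120], 3)
```

With one component, adding 50 ms to the second stage adds exactly 50 ms to reaction time (additivity); with three components the same manipulation adds 150 ms, illustrating why factor additivity diagnoses discrete, non-overlapping stages in this class of models.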