Showing papers by "Alan D. Baddeley" published in 2001


Journal ArticleDOI
TL;DR: The current state of A. D. Baddeley and G. J. Hitch's (1974) multicomponent working memory model is reviewed, along with a proposed clarification in which the central executive is assumed to be a limited capacity attentional system, aided by a newly postulated fourth system, the episodic buffer.
Abstract: The current state of A. D. Baddeley and G. J. Hitch's (1974) multicomponent working memory model is reviewed. The phonological and visuospatial subsystems have been extensively investigated, leading both to challenges over interpretation of individual phenomena and to more detailed attempts to model the processes underlying the subsystems. Analysis of the controlling central executive has proved more challenging, leading to a proposed clarification in which the executive is assumed to be a limited capacity attentional system, aided by a newly postulated fourth system, the episodic buffer. Current interest focuses most strongly on the link between working memory and long-term memory and on the processes allowing the integration of information from the component subsystems. The model has proved valuable in accounting for data from a wide range of participant groups under a rich array of task conditions. Working memory does still appear to be working. The term working memory appears to have been first proposed by Miller, Galanter, and Pribram (1960) in their classic book Plans and the Structure of Behavior. The term has subsequently been used in computational modeling approaches (Newell & Simon, 1972) and in animal learning.

702 citations


Journal ArticleDOI
TL;DR: A series of 7 experiments used dual-task methodology to investigate the role of working memory in the operation of a simple action-control plan or program involving regular switching between addition and subtraction, finding that lists requiring switching were slower than blocked lists and showed 2 concurrent-task effects.
Abstract: A series of 7 experiments used dual-task methodology to investigate the role of working memory in the operation of a simple action-control plan or program involving regular switching between addition and subtraction. Lists requiring switching were slower than blocked lists and showed 2 concurrent task effects. Demanding executive tasks impaired performance on both blocked and switched lists, whereas articulatory suppression impaired principally the switched condition. Implications for models of task switching and working memory and for the Vygotskian concept of verbal control of action are discussed.

458 citations


Journal ArticleDOI
01 Aug 2001-Brain
TL;DR: The results suggest a need to fractionate executive processes, and reinforce earlier evidence for a specific dual-task processing deficit in Alzheimer's disease.
Abstract: Attentional control of executive function declines during the early stages of Alzheimer's disease. Controversy exists as to whether this decline results from a single global deficit or whether attentional control can be fractionated, with some aspects being more vulnerable than others. We investigated three proposed domains of attention, namely (i) focal attention, based on simple and choice reaction times; (ii) the capacity to resist distraction in a visual search task; and (iii) the capacity to divide attention between two simultaneous tasks. For each domain, two levels of difficulty were used to study Alzheimer's disease patients, who were compared with elderly and young control subjects. The unitary attentional hypothesis predicted that the impacts of level of difficulty, age and disease would be qualitatively similar across the three attentional domains. In fact we observed different patterns for each domain. We obtained no differential impairment for patients in the focal attentional task, whereas patients were somewhat more susceptible than control subjects to the similarity of the distractor items in visual search. Finally, we observed marked impairment in the capacity of Alzheimer's disease patients to combine performance on two simultaneous tasks, in contrast to preserved dual-task performance in the normal elderly group. These results suggest a need to fractionate executive processes, and reinforce earlier evidence for a specific dual-task processing deficit in Alzheimer's disease.

428 citations


Journal ArticleDOI
TL;DR: Jon's recall of previously unfamiliar newsreel events was impaired but gained substantially from repetition over a 2-day period, consistent with the hypothesis that the recollective process of episodic memory is not necessary either for recognition or for the acquisition of semantic knowledge.
Abstract: We report the performance on recognition memory tests of Jon, who, despite amnesia from early childhood, has developed normal levels of performance on tests of intelligence, language, and general knowledge. Despite impaired recall, he performed within the normal range on each of six recognition tests, but he appears to lack the recollective phenomenological experience normally associated with episodic memory. His recall of previously unfamiliar newsreel events was impaired, but gained substantially from repetition over a 2-day period. Our results are consistent with the hypothesis that the recollective process of episodic memory is not necessary either for recognition or for the acquisition of semantic knowledge.

245 citations


Journal ArticleDOI
TL;DR: Over the last half century, the experimental study of human memory has departed from the earlier concept of a unitary faculty, with the increase in knowledge leading to differentiation between subsystems of memory, often based on the study of neuropsychological patients.
Abstract: Over the last half century, the experimental study of human memory has departed from the earlier concept of a unitary faculty, with the increase in knowledge leading to differentiation between subsystems of memory, often based on the study of neuropsychological patients. Although foreshadowed by the classic work of William James (1890), the current approach to the fractionation of memory probably began with Hebb's (1949) proposal of a distinction between short-term memory (STM), based on temporary electrical activity within the brain, and long-term memory (LTM), based on the development of more permanent neurochemical changes. He even proposed a learning mechanism, a concept that continues to be influential in neurobiological theorizing (see Burgess et al. 2001). Experimental evidence for a distinction between STM and LTM began to appear a decade later with the demonstration by Brown (1958) and Peterson & Peterson (1959) of the rapid forgetting of small amounts of material when ongoing rehearsal was prevented. They proposed that this forgetting reflected the decay of a short-term trace, a process they distinguished from long-term forgetting, which was attributed to interference among long-term memory representations. This view was resisted, with the counterclaim made that all forgetting could be interpreted within a single stimulus-response association framework (Melton 1963). The question of whether short-term forgetting reflects trace decay or interference remains unresolved (Cowan et al. 2000; Service 1998). During the 1960s, however, experimental evidence from a range of sources seemed to point increasingly strongly to the need to distinguish between STM and LTM on grounds other than type of forgetting. Neuropsychological evidence was particularly influential, with patients suffering from the classic amnesic syndrome showing grossly impaired LTM, coupled with total preservation of performance on a range of tasks associated with STM (Baddeley & Warrington 1970). Anatomically, the amnesic syndrome has most strongly been associated with damage to the hippocampus (Milner 1966), although it could result from damage to a series of structures that broadly make up the Papez circuit (see Aggleton & Pearce 2001). The STM-LTM distinction was further supported by patients showing the opposite dissociation, with STM performance impaired and LTM preserved (Shallice & Warrington 1970). By the late 1960s, a range of two-component models was being proposed, of which the most influential was that of Atkinson & Shiffrin (1968). In this model, information was assumed to come in from the environment, be processed by a short-term storage system and then fed into LTM. Probability of learning was assumed to depend on time held within the short-term store.

161 citations


Journal ArticleDOI
TL;DR: The evidence suggests that Down syndrome is associated with a specific memory problem, which is linked to a potential deficit in the functioning of the 'phonological loop' of Baddeley's (1986) model of working memory.
Abstract: This paper is divided into three sections. The first reviews the evidence for a verbal short-term memory deficit in Down syndrome. Existing research suggests that short-term memory for verbal information tends to be impaired in Down syndrome, in contrast to short-term memory for visual and spatial material. In addition, problems of hearing or speech do not appear to be a major cause of difficulties on tests of verbal short-term memory. This suggests that Down syndrome is associated with a specific memory problem, which we link to a potential deficit in the functioning of the 'phonological loop' of Baddeley's (1986) model of working memory. The second section considers the implications of a phonological loop problem. Because a reasonable amount is known about the normal functioning of the phonological loop, and of its role in language acquisition in typical development, we can make firm predictions as to the likely nature of the short-term memory problem in Down syndrome, and its consequences for language learning. However, we note that the existing evidence from studies with individuals with Down syndrome does not fit well with these predictions. This leads to the third section of the paper, in which we consider key questions to be addressed in future research. We suggest that there are two questions to be answered, which follow directly from the contradictory results outlined in the previous section. These are 'What is the precise nature of the verbal short-term memory deficit in Down syndrome?' and 'What are the consequences of this deficit for learning?' We discuss ways in which these questions might be addressed in future work.

139 citations


Journal ArticleDOI
TL;DR: It is suggested that Cowan's framing of his model as a contrast to the working memory model of Baddeley (1986) reflects a misinterpretation of that model, resulting in a danger of focusing attention on pseudo-problems rather than genuine disparities between the two approaches.

Abstract: Cowan's revisiting of the magic number is very timely, and the case he makes for a more moderate number than seven is persuasive. It is also appropriate to frame his case within a theoretical context, since this will influence what evidence to include and how to interpret it. He presents his model, however, as a contrast to the working memory model of Baddeley (1986). I suggest that this reflects a misinterpretation of our model, resulting in a danger of focusing attention on pseudo-problems rather than genuine disparities between his approach and my own.

75 citations


Journal ArticleDOI
01 Jan 2001-Cortex
TL;DR: The authors presented data from a series of follow-up assessments which examined the development of vocabulary and pattern construction abilities in 15 of the original sample of 16 individuals, over a 40-month period.

70 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present normative data for the Speed and Capacity of Language Processing (SCOLP) test from an older American sample, which comprises two subtests: Spot-the-Word, a lexical decision task, providing an estimate of premorbid intelligence, and Speed of Comprehension, providing a measure of information processing speed.
Abstract: This study presents normative data for the Speed and Capacity of Language Processing (SCOLP) test from an older American sample. The SCOLP comprises 2 subtests: Spot-the-Word, a lexical decision task, providing an estimate of premorbid intelligence, and Speed of Comprehension, providing a measure of information processing speed. Slowed performance may result from normal aging, brain damage (e.g., head injury), or dementing disorders, or may represent the intact performance of someone who always performed at the low end of normal. The SCOLP enables the clinician to differentiate between these possibilities. Adequate age-appropriate norms to differentiate dementia from normal aging do not exist. We present data from 424 older community-dwelling Americans (75-94 years old). The results confirm that information processing speed slows with increasing age. By contrast, increasing age has little effect on lexical decision. Thus, our data suggest that the SCOLP shows promise as a tool to help distinguish between normal aging and dementia.

39 citations


Journal ArticleDOI
01 Jan 2001-Cortex
TL;DR: Bibliometric measures tend to bias publication towards US journals, where the scientific community is largest; this in turn creates overload problems for those journals and fosters a growing preoccupation with where a paper is published rather than what it says.

6 citations