Robert A. Bjork
Other affiliations: National Research Council, University of California, University of California, Berkeley
Bio: Robert A. Bjork is an academic researcher from University of California, Los Angeles. The author has contributed to research in topics: Recall & Forgetting. The author has an h-index of 72, co-authored 168 publications receiving 21670 citations. Previous affiliations of Robert A. Bjork include National Research Council & University of California.
Papers published on a yearly basis
TL;DR: It is concluded that, at present, there is no adequate evidence base to justify incorporating learning-styles assessments into general educational practice, and that limited education resources would be better devoted to adopting other educational practices that have a strong evidence base.
Abstract: The term “learning styles” refers to the concept that individuals differ in regard to what mode of instruction or study is most effective for them. Proponents of learning-style assessment contend that optimal instruction requires diagnosing individuals' learning style and tailoring instruction accordingly. Assessments of learning style typically ask people to evaluate what sort of information presentation they prefer (e.g., words versus pictures versus speech) and/or what kind of mental activity they find most engaging or congenial (e.g., analysis versus listening), although assessment instruments are extremely diverse. The most common—but not the only—hypothesis about the instructional relevance of learning styles is the meshing hypothesis, according to which instruction is best provided in a format that matches the preferences of the learner (e.g., for a “visual learner,” emphasizing visual presentation of information). The learning-styles view has acquired great influence within the education field, and...
TL;DR: The authors argue that typical training procedures are far from optimal and that the goal of training in real-world settings is to support two aspects of post-training performance: (a) the level of performance in the long term and (b) the capability to transfer that training to related tasks and altered contexts.
Abstract: We argue herein that typical training procedures are far from optimal. The goal of training in real-world settings is, or should be, to support two aspects of posttraining performance: (a) the level of performance in the long term and (b) the capability to transfer that training to related tasks and altered contexts. The implicit or explicit assumption of those persons responsible for training is that the procedures that enhance performance and speed improvement during training will necessarily achieve these two goals. However, a variety of experiments on motor and verbal learning indicate that this assumption is often incorrect. Manipulations that maximize performance during training can be detrimental in the long term; conversely, manipulations that degrade the speed of acquisition can support the long-term goals of training. The fact that there are parallel findings in the motor and verbal domains suggests that principles of considerable generality can be deduced to upgrade training procedures.
TL;DR: The findings suggest a critical role for suppression in models of retrieval inhibition, and retrieval-induced forgetting implicates the retrieval process itself in everyday forgetting.
Abstract: Three studies show that the retrieval process itself causes long-lasting forgetting. Ss studied 8 categories (e.g., Fruit). Half the members of half the categories were then repeatedly practiced through retrieval tests (e.g., Fruit Or___). Category-cued recall of unpracticed members of practiced categories was impaired on a delayed test. Experiments 2 and 3 identified 2 significant features of this retrieval-induced forgetting: The impairment remains when output interference is controlled, suggesting a retrieval-based suppression that endures for 20 min or more, and the impairment appears restricted to high-frequency members. Low-frequency members show little impairment, even in the presence of strong, practiced competitors that might be expected to block access to those items. These findings suggest a critical role for suppression in models of retrieval inhibition and implicate the retrieval process itself in everyday forgetting. A striking implication of current memory theory is that the very act of remembering may cause forgetting. It is not that the remembered item itself becomes more susceptible to forgetting; in fact, recalling an item increases the likelihood that it will be recallable again at a later time. Rather, it is other items—items that are associated to the same cue or cues guiding retrieval—that may be put in greater jeopardy of being forgotten. Impaired recall of such related items may arise if access to them is blocked by the newly acquired strength of their successfully retrieved competitors (Blaxton & Neely, 1983; Brown, 1981; Brown, Whiteman, Cattoi, & Bradley, 1985; Roediger, 1974, 1978; Roediger & Schmidt, 1980; Rundus, 1973).
This implication follows from three assumptions underlying what we herein refer to as strength-dependent competition models of interference: (a) the competition assumption—that memories associated to a common cue compete for access to conscious recall when that cue is presented; (b) the strength-dependence assumption—that the cued recall of an item will decrease as a function of increases in the strengths of its...
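The competition and strength-dependence assumptions can be made concrete with a toy "ratio rule" model, a common instantiation of strength-dependent competition (this sketch is illustrative, not code from the paper): the probability of recalling an item to a shared cue is its strength divided by the summed strengths of all items associated with that cue, so strengthening one competitor necessarily depresses recall of the others.

```python
# Illustrative ratio-rule sketch of strength-dependent competition.
# Item names and strength values below are invented for the example.

def recall_probability(strengths: dict[str, float], target: str) -> float:
    """P(recall target | shared cue) = target strength / summed strengths."""
    return strengths[target] / sum(strengths.values())

# Category "Fruit" before and after retrieval practice of "orange".
before = {"orange": 1.0, "banana": 1.0, "kiwi": 1.0}
after = {"orange": 3.0, "banana": 1.0, "kiwi": 1.0}  # practice strengthens "orange"

# Strengthening the practiced competitor lowers recall of the unpracticed
# items, even though their own strengths are unchanged.
print(recall_probability(before, "banana"))  # 1/3
print(recall_probability(after, "banana"))   # 1/5
```

Under this rule, retrieval-induced impairment of unpracticed items falls out of the arithmetic alone, which is why the paper's controlled findings (impairment surviving output-interference controls) are needed to argue for suppression beyond mere blocking.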
TL;DR: In neuroscience, modularity of systems is the rule rather than the exception: if one accepts a straightforward relationship between brain systems and cognitive systems, the hypothesis of multiple memory systems is a logical extension of current knowledge, as discussed by the authors.
Abstract: Abstractionist Positions. Abstractionist positions view implicit memory as reflecting modification of the state of abstract lexical, semantic, or procedural knowledge structures; by contrast, explicit memory is assumed to depend on formation and retrieval of memory traces representing specific experiences. Abstractionists are often neuroscientifically oriented, using brain lesion data to constrain their theories. Of particular interest are findings that amnesic patients are selectively impaired on direct tests of memory but show normal learning as measured by some indirect tests (reviewed below). These deficits are ascribed to an impairment of the system responsible for memory of specific experiences. In neuroscience, modularity of systems is the rule rather than the exception: If one accepts a straightforward relationship between brain systems and cognitive systems, the hypothesis of multiple memory systems is a logical extension of current knowledge (Cohen 1984; Oakley 1983; for discussion, see Cohen 1985; Olton 1985; Schacter 1986). Tulving's (1972) heuristic distinction between episodic and semantic forms of memory was later developed into a multiple-system theory (Schacter & Tulving 1982a,b; Tulving 1983, 1984a). Episodic memory "deals with unique, concrete, personal, temporally dated events," while semantic memory "involves general, abstract, timeless knowledge that a person shares with others" (1986, p. 307). In recent versions of the theory (Tulving 1984b,c; 1985a,b,c; 1986) episodic memory is viewed as a specialized subsystem of semantic memory, with both systems embedded within a procedural memory, an arrangement Tulving terms monohierarchical.
This position was designed to facilitate conceptualization of Tulving's (1983) hypothesis of the phylogenetic evolution of episodic from semantic memory, and to account for recent arguments that whereas indirect memory measures reveal evidence of memory early in human ontogeny, the capacity to perform direct memory tasks first emerges only at 8-9 months of age (Schacter & Moscovitch 1984). There are a number of other multiple-system formulations that are somewhat analogous to the episodic-semantic distinction (e.g. Halgren 1984; Johnson 1983; Oakley 1983; O'Keefe & Nadel 1978; Olton et al 1979; Schacter & Moscovitch 1984; Warrington & Weiskrantz 1982). One of these deserves special note: Morton's (1969, 1970, 1979, 1981) multisystem theory differs from Tulving's in that conceptual and factual knowledge, as well as personal or episodic memories, are dealt with by the same system (the cognitive system); a separate system contains abstract representations for words (logogens) that are responsible for lexical access. Other theorists regard the distinction between procedural and declarative (or propositional) memories (accepted by Tulving) as sufficient to explain the observed dissociations between direct and indirect measures (e.g. Baddeley 1984; Cohen 1984; McKoon et al 1986; Squire & Cohen 1984). The procedural/declarative distinction was originally formulated by workers in artificial intelligence as a distinction between types of knowledge (e.g. Barr & Feigenbaum 1981; Winograd 1975), but it has been extended into a multiple-system viewpoint (Cohen 1984; Squire & Cohen 1984). Procedural memory involves "reorganization or other modification of existing processing structures or procedures," whereas declarative memory "represents explicitly new data structures derived from the operation of any process or procedure" (Cohen 1984, pp. 96-97).
Although procedural memory can be revealed only when a task reengages prior processing operations, it is abstract in the sense that it does not record the specific prior events that caused those processing operations to be modified. The declarative system is considered to be responsible for conscious access to facts and past experiences; it is necessary for performance of direct memory tests, and is impaired in amnesia. An approach analogous to the procedural/declarative distinction is proposed by Mishkin (Mishkin et al 1984; Mishkin & Petri 1984), who distinguishes between a memory system and a habit system. The distinction between activation and elaboration (Graf & Mandler 1984; Mandler 1980; Mandler et al 1986) is a process-oriented viewpoint; it differs from other abstractionist positions in being neutral with respect to the issue of memory systems. Activation of a preexisting mental representation "strengthens the relations among its components and increases its accessibility" (Graf & Mandler 1984, p. 553); elaborative processing is necessary in order to retain new relationships and relate stimuli to the context in which they were presented. Activation alone is sufficient to result in processing facilitation that is revealed in indirect memory tests, whereas elaboration is necessary for direct tests of memory. A similar concept of trace activation has been proposed by Diamond & Rozin (1984; see also Mortensen 1980). Nonabstractionist Positions. Nonabstractionists are unified by their disagreement with the necessity to distinguish abstract representations from memory traces that preserve information from specific experiences. They are typically mainstream cognitive psychologists who concern themselves primarily with the behavior of normal human subjects. Kolers (e.g.
1979, 1985; Kolers & Roediger 1984; Kolers & Smythe 1984) has attacked the distinction between procedural and declarative (or propositional) knowledge, arguing that "statements or declarations . . . do not fail of procedural representation" (Kolers & Roediger 1984, p. 437). When a subject displays knowledge, he/she is assumed to be engaging in a form of skilled performance. Knowledge is regarded as being specific to the processes by which that knowledge is acquired: Rather than offering a "unitary" theory in opposition to multisystem approaches, Kolers suggests that mentation consists of a multiplicity of processes whose properties are poorly correlated. Memory is revealed to the extent that processing operations at study and test overlap (the principle of transfer-appropriate processing, Bransford et al 1979). Dissociations between direct and indirect measures of retention are to be expected when members of the two classes of task make different processing requirements, and not otherwise [see also Moscovitch et al (1986) for a similar viewpoint]. Jacoby (1983b), Blaxton (1985), and Roediger & Blaxton (1987a,b) have used the terms conceptually driven and data-driven as a taxonomy of the processing demands of memory tests. Direct memory tests typically involve more conceptually driven than data-driven processing because the subject uses associative information to reconstruct the study episode mentally; indirect memory tests typically involve more data-driven than conceptually driven processes because the subject focusses on external stimuli (e.g. a fragment of a word) at test. Dissociations between data-driven and conceptually driven memory tests would be expected as a function of the type of information (semantic-associative vs perceptual) encoded in a prior episode.
Jacoby (1982, 1983a,b, 1984, 1987; Jacoby & Brooks 1984; Jacoby & Dallas 1981; Jacoby & Witherspoon 1982) argues that implicit and explicit memory are reflections of "different aspects of memory for whole prior processing episodes" (1983a, p. 21). The aware and unaware aspects of memory for episodes are assumed to result from differences in information provided by test cues, and possibly accompanying differences in retrieval processes (see Whittlesea 1987 for a similar position).
TL;DR: The authors discuss what learners need to understand in order to become effective stewards of their own learning, and the societal assumptions and attitudes that can be counterproductive to individuals becoming maximally effective learners.
Abstract: Knowing how to manage one's own learning has become increasingly important in recent years, as both the need and the opportunities for individuals to learn on their own outside of formal classroom settings have grown. During that same period, however, research on learning, memory, and metacognitive processes has provided evidence that people often have a faulty mental model of how they learn and remember, making them prone to both misassessing and mismanaging their own learning. After a discussion of what learners need to understand in order to become effective stewards of their own learning, we first review research on what people believe about how they learn and then review research on how people's ongoing assessments of their own learning are influenced by current performance and the subjective sense of fluency. We conclude with a discussion of societal assumptions and attitudes that can be counterproductive in terms of individuals becoming maximally effective learners.
TL;DR: In this article, the authors analyze the internal stickiness of knowledge transfer and test the resulting model using canonical correlation analysis of a data set consisting of 271 observations of 122 best-practice transfers in eight companies.
Abstract: The ability to transfer best practices internally is critical to a firm's ability to build competitive advantage through the appropriation of rents from scarce internal knowledge. Just as a firm's distinctive competencies might be difficult for other firms to imitate, its best practices could be difficult to imitate internally. Yet, little systematic attention has been paid to such internal stickiness. The author analyzes internal stickiness of knowledge transfer and tests the resulting model using canonical correlation analysis of a data set consisting of 271 observations of 122 best-practice transfers in eight companies. Contrary to conventional wisdom that blames primarily motivational factors, the study findings show the major barriers to internal knowledge transfer to be knowledge-related factors such as the recipient's lack of absorptive capacity, causal ambiguity, and an arduous relationship between the source and the recipient. The identification and transfer of best practices is emerging as one of the most important and widespread practical management issues of the latter half of the 1990s. Armed with meaningful, detailed performance data, firms that use fact-based management methods such as TQM, benchmarking, and process reengineering can regularly compare the performance of their units along operational dimensions. Sparse but unequivocal evidence suggests that such comparisons often reveal surprising performance differences between units, indicating a need to improve knowledge utilization within the firm (e.g., Chew, Bresnahan, ...). ...cally are hindered less by confidentiality and legal obstacles than external transfers, they could be faster and initially less complicated, all other things being equal. For those reasons, in an era when continuous organizational learning and relentless performance improvement are needed to remain competitive, companies must increasingly resort to the internal transfer of capabilities. Yet, experience shows that transferring capabilities within a firm is far from easy. General Motors had great difficulty in transferring manufacturing practices between divisions (Kerwin and Woodruff, 1992: 74) and IBM had limited suc...
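Canonical correlation analysis, the method the abstract names, finds linear combinations of two variable sets that correlate maximally. A minimal numpy sketch on synthetic data (the predictor and outcome variables here are invented stand-ins, not the study's 271 observations):

```python
import numpy as np

def first_canonical_correlation(X: np.ndarray, Y: np.ndarray) -> float:
    """Largest canonical correlation between variable sets X and Y.

    Whitens each block, then takes the top singular value of the
    cross-covariance of the whitened blocks (the standard CCA derivation).
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)

    def inv_sqrt(S: np.ndarray) -> np.ndarray:
        w, V = np.linalg.eigh(S)  # S is symmetric positive definite here
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    M = inv_sqrt(Xc.T @ Xc) @ (Xc.T @ Yc) @ inv_sqrt(Yc.T @ Yc)
    return float(np.linalg.svd(M, compute_uv=False)[0])

# Synthetic example: "barrier" predictors X and "transfer outcome" block Y,
# where one outcome column is driven by a combination of the predictors.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
Y = np.column_stack([
    X @ np.array([1.0, 0.5, -0.2]) + 0.1 * rng.standard_normal(200),
    rng.standard_normal(200),  # a pure-noise outcome column
])
r = first_canonical_correlation(X, Y)  # close to 1 for the signal pair
```

CCA suits this kind of study because both the barriers (causal ambiguity, absorptive capacity, relationship quality) and the outcomes (difficulty at successive transfer stages) are multivariate, and the method asks how strongly the two sets relate overall rather than testing one regression at a time.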
01 Jan 1964
TL;DR: This book presents experimental studies of perceiving, imaging, and remembering, develops a theory of remembering, and then treats remembering as a study in social psychology, including a discussion of the notion of a collective unconscious.
Abstract: Part I. Experimental Studies: 2. Experiment in psychology; 3. Experiments on perceiving; Experiments on imaging; 4-8. Experiments on remembering: (a) The method of description, (b) The method of repeated reproduction, (c) The method of picture writing, (d) The method of serial reproduction, (e) The method of serial reproduction: picture material; 9. Perceiving, recognizing, remembering; 10. A theory of remembering; 11. Images and their functions; 12. Meaning. Part II. Remembering as a Study in Social Psychology: 13. Social psychology; 14. Social psychology and the matter of recall; 15. Social psychology and the manner of recall; 16. Conventionalism; 17. The notion of a collective unconscious; 18. The basis of social recall; 19. A summary and some conclusions.
TL;DR: The present conclusion--that attitudes, self-esteem, and stereotypes have important implicit modes of operation--extends both the construct validity and predictive usefulness of these major theoretical constructs of social psychology.
Abstract: Social behavior is ordinarily treated as being under conscious (if not always thoughtful) control. However, considerable evidence now supports the view that social behavior often operates in an implicit or unconscious fashion. The identifying feature of implicit cognition is that past experience influences judgment in a fashion not introspectively known by the actor. The present conclusion--that attitudes, self-esteem, and stereotypes have important implicit modes of operation--extends both the construct validity and predictive usefulness of these major theoretical constructs of social psychology. Methodologically, this review calls for increased use of indirect measures--which are imperative in studies of implicit cognition. The theorized ordinariness of implicit stereotyping is consistent with recent findings of discrimination by people who explicitly disavow prejudice. The finding that implicit cognitive effects are often reduced by focusing judges' attention on their judgment task provides a basis for evaluating applications (such as affirmative action) aimed at reducing such unintended discrimination.
TL;DR: The article brings together a wide variety of data on capacity limits suggesting that the smaller capacity limit in short-term memory tasks is real, and proposes a capacity limit for the focus of attention.
Abstract: Miller (1956) summarized evidence that people can remember about seven chunks in short-term memory (STM) tasks. However, that number was meant more as a rough estimate and a rhetorical device than as a real capacity limit. Others have since suggested that there is a more precise capacity limit, but that it is only three to five chunks. The present target article brings together a wide variety of data on capacity limits suggesting that the smaller capacity limit is real. Capacity limits will be useful in analyses of information processing only if the boundary conditions for observing them can be carefully described. Four basic conditions in which chunks can be identified and capacity limits can accordingly be observed are: (1) when information overload limits chunks to individual stimulus items, (2) when other steps are taken specifically to block the recoding of stimulus items into larger chunks, (3) in performance discontinuities caused by the capacity limit, and (4) in various indirect effects of the capacity limit. Under these conditions, rehearsal and long-term memory cannot be used to combine stimulus items into chunks of an unknown size; nor can storage mechanisms that are not capacity-limited, such as sensory memory, allow the capacity-limited storage mechanism to be refilled during recall. A single, central capacity limit averaging about four chunks is implicated along with other, noncapacity-limited sources. The pure STM capacity limit expressed in chunks is distinguished from compound STM limits obtained when the number of separately held chunks is unclear. Reasons why pure capacity estimates fall within a narrow range are discussed and a capacity limit for the focus of attention is proposed.
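The boundary conditions above all come down to counting chunks rather than raw items. A toy sketch (a hypothetical example, not material from the article) of how recoding items into familiar chunks changes whether a list fits a roughly four-chunk limit:

```python
FOCUS_LIMIT = 4  # Cowan's estimated average capacity of the focus of attention, in chunks

def chunk_count(seq, known_chunks):
    """Greedily parse a sequence into the longest known chunks; any item not
    covered by a known chunk counts as its own single-item chunk."""
    count, i = 0, 0
    while i < len(seq):
        for size in range(len(seq) - i, 0, -1):
            if size == 1 or tuple(seq[i:i + size]) in known_chunks:
                count += 1
                i += size
                break
    return count

digits = [1, 7, 7, 6, 2, 0, 2, 5]
familiar = {(1, 7, 7, 6), (2, 0, 2, 5)}  # e.g., two well-known years

print(chunk_count(digits, set()))      # 8 chunks: exceeds the ~4-chunk limit
print(chunk_count(digits, familiar))   # 2 chunks: well within it
```

This is why the article's conditions (1) and (2) matter: capacity in chunks is only observable when recoding into larger chunks is blocked, so that the chunk count equals the item count.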