Journal Article

Problems for clinical judgement: introducing cognitive psychology as one more basic science

06 Feb 2001-Canadian Medical Association Journal (Canadian Medical Association)-Vol. 164, Iss: 3, pp 358-360
TL;DR: Medical practice is not easy because of its inherent widespread uncertainty: some questions are settled eventually through clinical trials, whereas others are impossible to resolve.
Abstract: Medical practice is not easy because of its inherent widespread uncertainty. Some questions are settled eventually through clinical trials, whereas others are impossible to resolve. Between these 2 extremes rests a large grey area where physicians must exercise their judgement. Good clinical


Citations
Journal ArticleDOI
TL;DR: This paper focuses on the subset of diagnostic errors driven by cognitive bias, which is usually associated with 'System 1' (non-analytic, pattern recognition) thinking.
Abstract: CONTEXT There is a growing literature on diagnostic errors. The consensus of this literature is that most errors are cognitive and result from the application of one or more cognitive biases. Such biased reasoning is usually associated with 'System 1' (non-analytic, pattern recognition) thinking. METHODS We review this literature and bring in evidence from two other fields: research on clinical reasoning, and research in psychology on 'dual-process' models of thinking. We then synthesise the evidence from these fields exploring possible causes of error and potential solutions. RESULTS We identify that, in fact, there is very little evidence to associate diagnostic errors with System 1 (non-analytical) reasoning. By contrast, studies of dual processing show that experts are as likely to commit errors when they are attempting to be systematic and analytical. We then examine the effectiveness of various approaches to reducing errors. We point out that educational strategies aimed at explaining cognitive biases are unlikely to succeed because of limited transfer. Conversely, there is an accumulation of evidence that interventions directed at specifically encouraging both analytical and non-analytical reasoning have been shown to result in small, but consistent, improvements in accuracy. CONCLUSIONS Diagnostic errors are not simply a consequence of cognitive biases or over-reliance on one kind of thinking. They result from multiple causes and are associated with both analytical and non-analytical reasoning. Limited evidence suggests that strategies directed at encouraging both kinds of reasoning will lead to limited gains in accuracy.

382 citations

Journal ArticleDOI
TL;DR: The adoption of this method provides a systematic approach to cognitive root-cause analysis in the avoidance of adverse outcomes associated with delayed or missed diagnoses and with the clinical management of specific cases.

342 citations

Journal ArticleDOI
TL;DR: The authors review the medical literature to answer two substantial questions that arise from this work: to what extent do diagnostic errors originate in Type 1 (intuitive) processes versus in Type 2 (analytical) processes, and to what extent are errors a consequence of cognitive biases versus knowledge deficits?
Abstract: Contemporary theories of clinical reasoning espouse a dual processing model, which consists of a rapid, intuitive component (Type 1) and a slower, logical and analytical component (Type 2). Although the general consensus is that this dual processing model is a valid representation of clinical reasoning, the causes of diagnostic errors remain unclear. Cognitive theories about human memory propose that such errors may arise from both Type 1 and Type 2 reasoning. Errors in Type 1 reasoning may be a consequence of the associative nature of memory, which can lead to cognitive biases. However, the literature indicates that, with increasing expertise (and knowledge), the likelihood of errors decreases. Errors in Type 2 reasoning may result from the limited capacity of working memory, which constrains computational processes. In this article, the authors review the medical literature to answer two substantial questions that arise from this work: (1) To what extent do diagnostic errors originate in Type 1 (intuitive) processes versus in Type 2 (analytical) processes? (2) To what extent are errors a consequence of cognitive biases versus a consequence of knowledge deficits? The literature suggests that both Type 1 and Type 2 processes contribute to errors. Although it is possible to experimentally induce cognitive biases, particularly availability bias, the extent to which these biases actually contribute to diagnostic errors is not well established. Educational strategies directed at the recognition of biases are ineffective in reducing errors; conversely, strategies focused on the reorganization of knowledge to reduce errors have small but consistent benefits.

332 citations

Journal ArticleDOI
TL;DR: The authors propose a new model of expert judgment that is described as a process of slowing down when you should, using efficient nonanalytic processes for many tasks, but transitioning to more effortful analytic processing when necessary.
Abstract: The study of expertise in medical education has tended to follow a tradition of trying to describe the analytic processes and/or nonanalytic resources that experts acquire with experience. However, the authors argue that a critical function of expertise is the judgment required to coordinate these resources, using efficient nonanalytic processes for many tasks, but transitioning to more effortful analytic processing when necessary. Attempts to appreciate the nature of this transition, when it happens, and how it happens, can be informed by the evaluation of other literatures that are addressing these and related problems. The authors review the literatures on educational expertise, attention and effort, situational awareness, and human factors to examine the conceptual frameworks of expertise arising from these domains and the research methodologies that inform their practice. The authors propose a new model of expert judgment that they describe as a process of slowing down when you should.

292 citations

Journal ArticleDOI
TL;DR: This case begins with an unexpected and dramatic finding: positive blood cultures in a patient who received an initial diagnosis of viral pharyngitis. It exemplifies that misdiagnoses can occur and can be corrected with follow-up.
Abstract: Cognitive psychology is the science that examines how people reason, formulate judgments, and make decisions. This case involves a patient given a diagnosis of pharyngitis, whose ultimate diagnosis of osteomyelitis was missed through a series of cognitive shortcuts. These errors include the availability heuristic (in which people judge likelihood by how easily examples spring to mind), the anchoring heuristic (in which people stick with initial impressions), framing effects (in which people make different decisions depending on how information is presented), blind obedience (in which people stop thinking when confronted with authority), and premature closure (in which several alternatives are not pursued). Rather than trying to completely eliminate cognitive shortcuts (which often serve clinicians well), becoming aware of common errors might lead to sustained improvement in patient care.

240 citations

References
Journal Article
01 Jan 1974-Science

4,413 citations

Book
01 Jan 2001
TL;DR: Without a way of critically appraising the information they receive, clinicians are relatively helpless in deciding what new information to learn and how to modify their practice.
Abstract: Medical practice is constantly changing. The rate of change is accelerating, and physicians can be forgiven if they often find it dizzying. How can physicians learn about new information and innovations, and decide how (if at all) they should modify their practice? Possible sources include summaries from the medical literature (review articles, practice guidelines, consensus statements, editorials, and summary articles in "throwaway" journals); consultation with colleagues who have special expertise; lectures; seminars; advertisements in medical journals; conversations with representatives from pharmaceutical companies; and original articles in journals and journal supplements. Each of these sources of information might be valuable, though each is subject to its own particular biases. 1,2 Problems arise when, as is often the case, these sources of information provide different suggestions about patient care. See also p 2093. Without a way of critically appraising the information they receive, clinicians are relatively helpless in deciding what new information

3,305 citations