Journal ArticleDOI

Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review

01 Jan 2005-Medical Teacher (Taylor & Francis)-Vol. 27, Iss: 1, pp 10-28
TL;DR: While research in this field needs improvement in terms of rigor and quality, high-fidelity medical simulations are educationally effective and simulation-based education complements medical education in patient care settings.
Abstract: SUMMARY Review date: 1969 to 2003, 34 years. Background and context: Simulations are now in widespread use in medical education and medical personnel evaluation. Outcomes research on the use and effectiveness of simulation technology in medical education is scattered, inconsistent and varies widely in methodological rigor and substantive focus. Objectives: Review and synthesize existing evidence in educational science that addresses the question, ‘What are the features and uses of high-fidelity medical simulations that lead to most effective learning?’. Search strategy: The search covered five literature databases (ERIC, MEDLINE, PsycINFO, Web of Science and Timelit) and employed 91 single search terms and concepts and their Boolean combinations. Hand searching, Internet searches and attention to the ‘grey literature’ were also used. The aim was to perform the most thorough literature search possible of peer-reviewed publications and reports in the unpublished literature that have been judged for academic quality. Inclusion and exclusion criteria: Four screening criteria were used to reduce the initial pool of 670 journal articles to a focused set of 109 studies: (a) elimination of review articles in favor of empirical studies; (b) use of a simulator as an educational assessment or intervention with learner outcomes measured quantitatively; (c) comparative research, either experimental or quasi-experimental; and (d) research that involves simulation as an educational intervention. Data extraction: Data were extracted systematically from the 109 eligible journal articles by independent coders. Each coder used a standardized data extraction protocol. Data synthesis: Qualitative data synthesis and tabular presentation of research methods and outcomes were used. Heterogeneity of research designs, educational interventions, outcome measures and timeframe precluded data synthesis using meta-analysis. Headline results: Coding accuracy for features of the journal articles is high. The extant quality of the published research is generally weak. The weight of the best available evidence suggests that high-fidelity medical simulations facilitate learning under the right conditions. These include the following:
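Because data extraction was performed by independent coders applying a standardized protocol, the headline claim of high coding accuracy rests on some measure of inter-coder agreement. The review does not state the statistic in this abstract, so the sketch below only illustrates two common options, percent agreement and Cohen's kappa, on a hypothetical binary feature coded by two raters.

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Fraction of articles on which two independent coders assigned the same code."""
    matches = sum(1 for a, b in zip(coder_a, coder_b) if a == b)
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement (Cohen's kappa) between two coders."""
    n = len(coder_a)
    p_observed = percent_agreement(coder_a, coder_b)
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_expected = sum(freq_a[c] * freq_b[c] for c in set(coder_a) | set(coder_b)) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical codes for one binary feature (e.g. 'feedback reported') across ten articles
coder_1 = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no", "yes", "yes"]
coder_2 = ["yes", "yes", "no", "yes", "yes", "yes", "yes", "no", "yes", "no"]
print(percent_agreement(coder_1, coder_2), cohens_kappa(coder_1, coder_2))
```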


Citations
Journal ArticleDOI
TL;DR: This article reviews and critically evaluates historical and contemporary research on simulation‐based medical education (SBME) and presents and discusses 12 features and best practices that teachers should know in order to use medical simulation technology to maximum educational benefit.
Abstract: Objectives This article reviews and critically evaluates historical and contemporary research on simulation-based medical education (SBME). It also presents and discusses 12 features and best practices of SBME that teachers should know in order to use medical simulation technology to maximum educational benefit. Methods This qualitative synthesis of SBME research and scholarship was carried out in two stages. Firstly, we summarised the results of three SBME research reviews covering the years 1969–2003. Secondly, we performed a selective, critical review of SBME research and scholarship published during 2003–2009. Results The historical and contemporary research synthesis is reported to inform the medical education community about 12 features and best practices of SBME: (i) feedback; (ii) deliberate practice; (iii) curriculum integration; (iv) outcome measurement; (v) simulation fidelity; (vi) skill acquisition and maintenance; (vii) mastery learning; (viii) transfer to practice; (ix) team training; (x) high-stakes testing; (xi) instructor training, and (xii) educational and professional context. Each of these is discussed in the light of available evidence. The scientific quality of contemporary SBME research is much improved compared with the historical record. Conclusions Development of and research into SBME have grown and matured over the past 40 years on substantive and methodological grounds. We believe the impact and educational utility of SBME are likely to increase in the future. More thematic programmes of research are needed. Simulation-based medical education is a complex service intervention that needs to be planned and practised with attention to organisational contexts. Medical Education 2010: 44: 50–63

1,459 citations

Journal ArticleDOI
07 Sep 2011-JAMA
TL;DR: In comparison with no intervention, technology-enhanced simulation training in health professions education is consistently associated with large effects for outcomes of knowledge, skills, and behaviors and moderate effects for patient-related outcomes.
Abstract: Context Although technology-enhanced simulation has widespread appeal, its effectiveness remains uncertain. A comprehensive synthesis of evidence may inform the use of simulation in health professions education. Objective To summarize the outcomes of technology-enhanced simulation training for health professions learners in comparison with no intervention. Data Sources Systematic search of MEDLINE, EMBASE, CINAHL, ERIC, PsycINFO, Scopus, key journals, and previous review bibliographies through May 2011. Study Selection Original research in any language evaluating simulation compared with no intervention for training practicing and student physicians, nurses, dentists, and other health care professionals. Data Extraction Reviewers working in duplicate evaluated quality and abstracted information on learners, instructional design (curricular integration, distributing training over multiple days, feedback, mastery learning, and repetitive practice), and outcomes. We coded skills (performance in a test setting) separately for time, process, and product measures, and similarly classified patient care behaviors. Data Synthesis From a pool of 10 903 articles, we identified 609 eligible studies enrolling 35 226 trainees. Of these, 137 were randomized studies, 67 were nonrandomized studies with 2 or more groups, and 405 used a single-group pretest-posttest design. We pooled effect sizes using random effects. Heterogeneity was large (I² > 50%) in all main analyses. In comparison with no intervention, pooled effect sizes were 1.20 (95% CI, 1.04-1.35) for knowledge outcomes (n = 118 studies), 1.14 (95% CI, 1.03-1.25) for time skills (n = 210), 1.09 (95% CI, 1.03-1.16) for process skills (n = 426), 1.18 (95% CI, 0.98-1.37) for product skills (n = 54), 0.79 (95% CI, 0.47-1.10) for time behaviors (n = 20), 0.81 (95% CI, 0.66-0.96) for other behaviors (n = 50), and 0.50 (95% CI, 0.34-0.66) for direct effects on patients (n = 32). Subgroup analyses revealed no consistent statistically significant interactions between simulation training and instructional design features or study quality. Conclusion In comparison with no intervention, technology-enhanced simulation training in health professions education is consistently associated with large effects for outcomes of knowledge, skills, and behaviors and moderate effects for patient-related outcomes.
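The data synthesis described above pools study-level effect sizes under a random-effects model and reports I² as the heterogeneity statistic. The review does not publish its computational code, so the sketch below is only an illustration; the choice of the DerSimonian-Laird estimator for the between-study variance is an assumption, and the effect sizes and variances are made up.

```python
import numpy as np

def random_effects_pool(effects, variances):
    """Pool effect sizes with a random-effects model (DerSimonian-Laird tau^2).

    Returns the pooled effect, its 95% CI, and the I^2 heterogeneity statistic.
    """
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    k = len(effects)

    # Inverse-variance (fixed-effect) weights and Cochran's Q
    w = 1.0 / variances
    pooled_fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - pooled_fixed) ** 2)

    # Between-study variance tau^2 and the I^2 statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    i2 = 100.0 * max(0.0, (q - (k - 1)) / q) if q > 0 else 0.0

    # Random-effects weights, pooled estimate, and 95% confidence interval
    w_re = 1.0 / (variances + tau2)
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# Hypothetical standardized mean differences and variances from five studies
print(random_effects_pool([1.10, 0.95, 1.40, 0.80, 1.25],
                          [0.04, 0.06, 0.09, 0.05, 0.07]))
```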

1,420 citations


Cites background or methods from "Features and uses of high-fidelity ..."

  • ...Criterion B was fulfilled if (1) a randomized study concealed allocation or (2) an observational study controlled for another baseline learner characteristic....


  • ...We sought to answer 2 questions: (1) To what extent are simulation technologies for training health care professionals associated with improved outcomes in comparison with no intervention? and (2) How do outcomes vary for different simulation instructional designs? Based on the strength of the theoretical foundations and currency in the field, we selected 5 instructional design features(2,9) (curricular integration, distributed practice, feedback, mastery learning, and range of difficulty) for subgroup analyses (see eBox for definitions; available at http://www ....



  • ...When authors reported multiple measures of a single outcome (eg, multiple measures of efficiency), we selected in decreasing order of priority (1) the author-defined primary outcome; (2) a global or summary measure of effect; (3) the most clinically relevant measure; or (4) the mean of the measures reported....


  • ...Comparability of cohorts criterion A was fulfilled if the study (1) was randomized or (2) controlled for a baseline learning outcome....

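The prioritized selection rule quoted above (author-defined primary outcome, then a global or summary measure, then the most clinically relevant measure, then the mean of all reported measures) maps directly onto a small decision function. The sketch below is only an illustration; the dictionary keys are hypothetical and not taken from the review's extraction forms.

```python
def select_outcome_measure(measures):
    """Pick one measure per outcome using the priority rule quoted above.

    `measures` is a dict with hypothetical keys: 'primary', 'global',
    'clinical', and 'values' (all reported measures for this outcome).
    """
    if measures.get("primary") is not None:      # 1. author-defined primary outcome
        return measures["primary"]
    if measures.get("global") is not None:       # 2. global or summary measure of effect
        return measures["global"]
    if measures.get("clinical") is not None:     # 3. most clinically relevant measure
        return measures["clinical"]
    values = measures["values"]                  # 4. mean of the measures reported
    return sum(values) / len(values)

print(select_outcome_measure({"primary": None, "global": 0.84, "clinical": 0.61, "values": [0.84, 0.61]}))  # 0.84
```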

Journal ArticleDOI
TL;DR: The aim of this paper is to critically review what is felt to be important about the role of debriefing in the field of simulation-based learning, how it has come about and developed over time, and the different styles or approaches that are used and how effective the process is.
Abstract: The aim of this paper is to critically review what is felt to be important about the role of debriefing in the field of simulation-based learning, how it has come about and developed over time, and the different styles or approaches that are used and how effective the process is. A recent systematic

1,351 citations

Journal ArticleDOI
TL;DR: Although the number of reports analyzed in this meta-analysis is small, these results show that SBME with DP is superior to traditional clinical medical education in achieving specific clinical skill acquisition goals.
Abstract: Purpose This article presents a comparison of the effectiveness of traditional clinical education toward skill acquisition goals versus simulation-based medical education (SBME) with deliberate practice (DP). Method This is a quantitative meta-analysis that spans 20 years, 1990 to 2010. A search strategy involving three literature databases, 12 search terms, and four inclusion criteria was used. Four authors independently retrieved and reviewed articles. Main outcome measures were extracted to calculate effect sizes.

1,311 citations

Journal ArticleDOI
10 Sep 2008-JAMA
TL;DR: Internet-based learning is associated with large positive effects compared with no intervention, whereas effects compared with non-Internet instructional methods are heterogeneous and generally small, suggesting effectiveness similar to traditional methods.
Abstract: Context The increasing use of Internet-based learning in health professions education may be informed by a timely, comprehensive synthesis of evidence of effectiveness. Objectives To summarize the effect of Internet-based instruction for health professions learners compared with no intervention and with non-Internet interventions. Data Sources Systematic search of MEDLINE, Scopus, CINAHL, EMBASE, ERIC, TimeLit, Web of Science, Dissertation Abstracts, and the University of Toronto Research and Development Resource Base from 1990 through 2007. Study Selection Studies in any language quantifying the association of Internet-based instruction and educational outcomes for practicing and student physicians, nurses, pharmacists, dentists, and other health care professionals compared with a no-intervention or non-Internet control group or a preintervention assessment. Data Extraction Two reviewers independently evaluated study quality and abstracted information including characteristics of learners, learning setting, and intervention (including level of interactivity, practice exercises, online discussion, and duration). Data Synthesis There were 201 eligible studies. Heterogeneity in results across studies was large (I² ≥ 79%) in all analyses. Effect sizes were pooled using a random effects model. The pooled effect size in comparison to no intervention favored Internet-based interventions and was 1.00 (95% confidence interval [CI], 0.90-1.10; P < .001). Conclusions Internet-based learning is associated with large positive effects compared with no intervention. In contrast, effects compared with non-Internet instructional methods are heterogeneous and generally small, suggesting effectiveness similar to traditional methods. Future research should directly compare different Internet-based interventions.
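The effect sizes pooled in this review are standardized differences comparing Internet-based instruction with a control condition or preintervention assessment. As a rough illustration of how a per-study effect size can be derived from reported group means and standard deviations, the sketch below computes Hedges' g with its small-sample correction; the numbers are hypothetical and this is not the authors' extraction code.

```python
import math

def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference (Hedges' g) and its approximate variance."""
    # Pooled standard deviation across intervention and control groups
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp
    # Small-sample bias correction
    j = 1.0 - 3.0 / (4.0 * (n_t + n_c) - 9.0)
    g = j * d
    # Approximate variance, usable as input to random-effects pooling
    var_g = (n_t + n_c) / (n_t * n_c) + g**2 / (2.0 * (n_t + n_c))
    return g, var_g

# Hypothetical knowledge-test scores: Internet-based group vs no-intervention control
print(hedges_g(78.0, 10.0, 30, 68.0, 12.0, 30))
```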

1,241 citations

References
Journal ArticleDOI
TL;DR: In this article, the authors examined the performance of anaesthetists while managing simulated anaesthetic crises and assessed whether performance was improved by reviewing their own performances recorded on videotape.
Abstract: The aim of this study was to examine the performance of anaesthetists while managing simulated anaesthetic crises and to see whether their performance was improved by reviewing their own performances recorded on videotape. Thirty-two subjects from four hospitals were allocated randomly to one of two groups, with each subject completing five simulations in a single session. Individuals in the first group completed five simulations with only a short discussion between each simulation. Those in the second group were allowed to review their own performance on videotape between each of the simulations. Performance was measured by both ‘time to solve the problem’ and mental workload, using anaesthetic chart error as a secondary task. Those trainees exposed to videotape feedback had a shorter median ‘time to solve’ and a smaller decrease in chart error when compared to those not exposed to video feedback. However, the differences were not statistically significant, confirming the difficulties encountered by other groups in designing valid tests of the performance of anaesthetists.
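The study compares median 'time to solve the problem' between the video-feedback and discussion-only groups and finds no statistically significant difference. The abstract does not name the statistical test used, so the sketch below simply illustrates one conventional way to compare skewed timing data between two independent groups, a Mann-Whitney U test; the timing values are invented.

```python
from scipy.stats import mannwhitneyu

# Hypothetical 'time to solve' values in seconds for the two randomized groups
video_feedback  = [95, 110, 80, 130, 105, 90, 120, 85]
discussion_only = [120, 140, 115, 160, 125, 135, 110, 150]

stat, p = mannwhitneyu(video_feedback, discussion_only, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")
```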

89 citations

Journal ArticleDOI
TL;DR: Paramedics trained in endotracheal intubation using a systematic manikin-only teaching program can attain acceptable individual success rates in the actual field setting.

87 citations

Journal ArticleDOI
TL;DR: Written examinations measure acquisition of knowledge but fail to predict if students can apply knowledge to problem solving, whereas both the objective structured clinical examination and the computer-controlled patient simulator can be used as effective performance evaluation tools.
Abstract: Objective To compare three different evaluative instruments and determine which is able to measure different aspects of medical student learning. Design Student learning was evaluated using written examinations, an objective structured clinical examination, and a patient simulator that used two clinical

85 citations

Journal ArticleDOI
TL;DR: Both expert and referent surgeons regard Xitact as an important and useful tool in the laparoscopic teaching setting, although further studies are needed to justify its use in the surgical curriculum.
Abstract: Background: This study was undertaken to establish the face, expert, and referent validity of the Xitact LS500, a virtual reality laparoscopic cholecystectomy simulator. Methods: A four-page, 20-item structured questionnaire was presented to 120 surgeons attending a surgical convention. Participants received an instructed hands-on "tour" on the Xitact simulator. Data were analyzed according to the level of experience of the surgeon, resulting in an "expert group opinion" of 87 surgeons and a "referent group opinion" of 33 surgeons. Results: The majority of respondents believe Xitact has the potential to become a useful tool for teaching (93.1%) and for performance assessment (79.3%) in laparoscopic cholecystectomy. Expert- and referent-group opinion does not differ significantly on any of the presented statements. The opinion regarding the realism of the virtual laparoscopic cholecystectomy environment is favorable among both groups, although it is considered not yet perfect. The "haptic feedback" sensation of the Xitact is a parameter that needs further development. Conclusions: Both expert and referent surgeons consider Xitact an important and useful tool for the laparoscopic teaching setting. Further studies need to be performed to establish the construct validity of the simulator (e.g., to what extent the simulator is logically encompassed within a theoretical framework of acquiring the skills needed for laparoscopic cholecystectomy), to measure shortening of learning curves on the laparoscopic cholecystectomy procedure, and ultimately to justify its use in the surgical curriculum.

84 citations

01 Jan 2003
TL;DR: Novices demonstrated improved skill acquisition using simulation; these results support the notion of self-directed skills training and could have significant implications for residency training programs.
Abstract: Background: Simulation-based training provides minimal feedback and relies heavily on self-assessment. Research has shown medical trainees are poor self-assessors. The purpose of this study was to examine trainees’ ability to self-assess technical skills using a simulation-trainer. Methods: Twenty-one medical students performed 10 repetitions of a simulated task. After each repetition they estimated their time and errors made. These were compared with the simulator data. Results: Task time (P < 0.0001) and errors made (P < 0.0001) improved with repetition. Both self-assessment curves reflected their actual performance curves (P < 0.0001). Self-assessment of time did not improve in accuracy (P = 0.26) but error estimation did (P = 0.01) when compared with actual performance. Conclusions: Novices demonstrated improved skill acquisition using simulation. Their estimates of performance and accuracy of error estimation improved with repetition. Clearly, practice enhances technical skill self-assessment. These results support the notion of self-directed skills training and could have significant implications for residency training programs.
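The study's central comparison is between self-estimated and simulator-recorded performance across repetitions. One simple way to quantify whether self-assessment accuracy improves with practice is to track the mean absolute gap between estimated and actual error counts at each repetition, as in the sketch below; the data are hypothetical (the study itself used 21 trainees and 10 repetitions) and this is not the authors' analysis.

```python
import numpy as np

# Hypothetical error counts for 3 trainees over 5 repetitions:
# simulator-recorded ('actual') versus trainee self-estimates ('estimated')
actual = np.array([[6, 5, 4, 3, 2],
                   [7, 5, 5, 3, 3],
                   [5, 4, 3, 2, 2]])
estimated = np.array([[9, 7, 5, 4, 2],
                      [4, 6, 6, 3, 3],
                      [8, 6, 4, 2, 2]])

# Mean absolute gap between estimated and actual errors at each repetition;
# a shrinking gap indicates that self-assessment accuracy improves with practice
gap_by_repetition = np.abs(estimated - actual).mean(axis=0)
print(gap_by_repetition)
```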

84 citations