Author

Denise M. Dupras

Bio: Denise M. Dupras is an academic researcher from Mayo Clinic. The author has contributed to research in the topics of graduate medical education and curriculum, has an h-index of 21, and has co-authored 42 publications receiving 3,317 citations.

Papers
Journal ArticleDOI
10 Sep 2008-JAMA
TL;DR: Internet-based learning is associated with large positive effects compared with no intervention; effects compared with non-Internet instructional methods are heterogeneous and generally small, suggesting effectiveness similar to traditional methods.
Abstract: Context The increasing use of Internet-based learning in health professions education may be informed by a timely, comprehensive synthesis of evidence of effectiveness. Objectives To summarize the effect of Internet-based instruction for health professions learners compared with no intervention and with non-Internet interventions. Data Sources Systematic search of MEDLINE, Scopus, CINAHL, EMBASE, ERIC, TimeLit, Web of Science, Dissertation Abstracts, and the University of Toronto Research and Development Resource Base from 1990 through 2007. Study Selection Studies in any language quantifying the association of Internet-based instruction and educational outcomes for practicing and student physicians, nurses, pharmacists, dentists, and other health care professionals compared with a no-intervention or non-Internet control group or a preintervention assessment. Data Extraction Two reviewers independently evaluated study quality and abstracted information including characteristics of learners, learning setting, and intervention (including level of interactivity, practice exercises, online discussion, and duration). Data Synthesis There were 201 eligible studies. Heterogeneity in results across studies was large (I2 ≥ 79%) in all analyses. Effect sizes were pooled using a random effects model. The pooled effect size in comparison to no intervention favored Internet-based interventions and was 1.00 (95% confidence interval [CI], 0.90-1.10; P …). Conclusions Internet-based learning is associated with large positive effects compared with no intervention. In contrast, effects compared with non-Internet instructional methods are heterogeneous and generally small, suggesting effectiveness similar to traditional methods. Future research should directly compare different Internet-based interventions.

1,241 citations

Journal ArticleDOI
TL;DR: Interactivity, practice exercises, repetition, and feedback seem to be associated with improved learning outcomes, although inconsistency across studies tempers conclusions.
Abstract: Purpose A recent systematic review (2008) described the effectiveness of Internet-based learning (IBL) in health professions education. A comprehensive synthesis of research investigating how to improve IBL is needed. This systematic review sought to provide such a synthesis. Method The authors …

468 citations

Journal ArticleDOI
TL;DR: Teaching on the Web involves more than putting together a colorful webpage and by consistently employing principles of effective learning, educators will unlock the full potential of Web-based medical education.
Abstract: OBJECTIVE: Online learning has changed medical education, but many “educational” websites do not employ principles of effective learning. This article will assist readers in developing effective educational websites by integrating principles of active learning with the unique features of the Web. DESIGN: Narrative review. RESULTS: The key steps in developing an effective educational website are: Perform a needs analysis and specify goals and objectives; determine technical resources and needs; evaluate preexisting software and use it if it fully meets your needs; secure commitment from all participants and identify and address potential barriers to implementation; develop content in close coordination with website design (appropriately use multimedia, hyperlinks, and online communication) and follow a timeline; encourage active learning (self-assessment, reflection, self-directed learning, problem-based learning, learner interaction, and feedback); facilitate and plan to encourage use by the learner (make website accessible and user-friendly, provide time for learning, and motivate learners); evaluate learners and course; pilot the website before full implementation; and plan to monitor online communication and maintain the site by resolving technical problems, periodically verifying hyperlinks, and regularly updating content. CONCLUSION: Teaching on the Web involves more than putting together a colorful webpage. By consistently employing principles of effective learning, educators will unlock the full potential of Web-based medical education.

292 citations

Journal ArticleDOI
TL;DR: The most common instructional methods (each used in nearly 50% of courses) were patient cases, self-assessment questions, and feedback; most courses (89%) used written text and many (55%) used multimedia.
Abstract: OBJECTIVES Educators often speak of web-based learning (WBL) as a single entity or a cluster of similar activities with homogeneous effects. Yet a recent systematic review demonstrated large heterogeneity among results from individual studies. Our purpose is to describe the variation in configurations, instructional methods and presentation formats in WBL. METHODS We systematically searched MEDLINE, EMBASE, ERIC, CINAHL and other databases (last search November 2008) for studies comparing a WBL intervention with no intervention or another educational activity. From eligible studies we abstracted information on course participants, topic, configuration and instructional methods. We summarised this information and then purposively selected and described several WBL interventions that illustrate specific technologies and design features. RESULTS We identified 266 eligible studies. Nearly all courses (89%) used written text and most (55%) used multimedia. A total of 32% used online communication via e-mail, threaded discussion, chat or videoconferencing, and 9% implemented synchronous components. Overall, 24% blended web-based and non-computer-based instruction. Most web-based courses (77%) employed specific instructional methods, other than text alone, to enhance the learning process. The most common instructional methods (each used in nearly 50% of courses) were patient cases, self-assessment questions and feedback. We describe several studies to illustrate the range of instructional designs. CONCLUSIONS Educators and researchers cannot treat WBL as a single entity. Many different configurations and instructional methods are available for WBL instructors. Researchers should study when to use specific WBL designs and how to use them effectively.

222 citations

Journal ArticleDOI
TL;DR: Rater training did not improve interrater reliability or accuracy of mini-CEX scores, although rater confidence improved for the entire cohort.
Abstract: Background Mini-CEX scores assess resident competence. Rater training might improve mini-CEX score interrater reliability, but evidence is lacking.

178 citations


Cited by
01 Jan 2006
TL;DR: The Standards provide a framework that points to the effectiveness of quality instruments in those situations in which their use is supported by validation data.
Abstract: Educational and psychological testing and assessment are among the most important contributions of behavioral science to our society, offering fundamental and significant improvements over earlier practices. Although it cannot be claimed that all tests are sufficiently refined or that all testing is prudent and useful, a large body of information points to the effectiveness of quality instruments in those situations in which their use is supported by validation data. The proper use of tests can lead to better decisions about individuals and programs than would be made without them, and can also point the way toward broader and fairer access to education and employment. Poor use of tests, however, can cause considerable harm to test takers and to other participants in the process of making decisions from test data. The aim of the Standards is to promote the sound and ethical use of tests and to establish a basis for evaluating the quality of testing practices. The purpose of publishing the Standards is to establish criteria for the evaluation of tests, testing practices, and the effects of test use. Although the evaluation of the suitability of a test or its application should depend primarily on professional judgment, the Standards provide a frame of reference to ensure that all relevant issues are addressed. It would be desirable for all authors, sponsors, publishers, and users of professional tests to adopt the Standards and to encourage others to do so as well.

3,905 citations

01 May 2009
TL;DR: The meta-analysis of empirical studies of online learning found that, on average, students in online learning conditions performed better than those receiving face-to-face instruction, and suggests that the positive effects associated with blended learning should not be attributed to the media, per se.
Abstract: A systematic search of the research literature from 1996 through July 2008 identified more than a thousand empirical studies of online learning. Analysts screened these studies to find those that (a) contrasted an online to a face-to-face condition, (b) measured student learning outcomes, (c) used a rigorous research design, and (d) provided adequate information to calculate an effect size. As a result of this screening, 51 independent effects were identified that could be subjected to meta-analysis. The meta-analysis found that, on average, students in online learning conditions performed better than those receiving face-to-face instruction. The difference between student outcomes for online and face-to-face classes—measured as the difference between treatment and control means, divided by the pooled standard deviation—was larger in those studies contrasting conditions that blended elements of online and face-to-face instruction with conditions taught entirely face-to-face. Analysts noted that these blended conditions often included additional learning time and instructional elements not received by students in control conditions. This finding suggests that the positive effects associated with blended learning should not be attributed to the media, per se. An unexpected finding was the small number of rigorous published studies contrasting online and face-to-face learning conditions for K–12 students. In light of this small corpus, caution is required in generalizing to the K–12 population because the results are derived for the most part from studies in other settings (e.g., medical training, higher education).
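The effect size defined in the abstract above — the difference between treatment and control means divided by the pooled standard deviation — is simple to compute. A minimal sketch with hypothetical exam scores (the function names and numbers are illustrative, not from the report):

```python
# Standardized mean difference: (treatment mean - control mean) / pooled SD.
# All values below are invented for illustration.
import math

def pooled_sd(sd_t: float, n_t: int, sd_c: float, n_c: int) -> float:
    """Pooled standard deviation of two independent groups."""
    return math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))

def smd(mean_t: float, sd_t: float, n_t: int,
        mean_c: float, sd_c: float, n_c: int) -> float:
    """Standardized mean difference (Cohen's d with pooled SD)."""
    return (mean_t - mean_c) / pooled_sd(sd_t, n_t, sd_c, n_c)

# Hypothetical course-exam scores: online group vs. face-to-face group.
d = smd(mean_t=78.0, sd_t=10.0, n_t=40, mean_c=73.0, sd_c=10.0, n_c=40)
print(round(d, 2))  # 0.5
```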

3,114 citations

Journal ArticleDOI
07 Sep 2011-JAMA
TL;DR: In comparison with no intervention, technology-enhanced simulation training in health professions education is consistently associated with large effects for outcomes of knowledge, skills, and behaviors and moderate effects for patient-related outcomes.
Abstract: Context Although technology-enhanced simulation has widespread appeal, its effectiveness remains uncertain. A comprehensive synthesis of evidence may inform the use of simulation in health professions education. Objective To summarize the outcomes of technology-enhanced simulation training for health professions learners in comparison with no intervention. Data Source Systematic search of MEDLINE, EMBASE, CINAHL, ERIC, PsychINFO, Scopus, key journals, and previous review bibliographies through May 2011. Study Selection Original research in any language evaluating simulation compared with no intervention for training practicing and student physicians, nurses, dentists, and other health care professionals. Data Extraction Reviewers working in duplicate evaluated quality and abstracted information on learners, instructional design (curricular integration, distributing training over multiple days, feedback, mastery learning, and repetitive practice), and outcomes. We coded skills (performance in a test setting) separately for time, process, and product measures, and similarly classified patient care behaviors. Data Synthesis From a pool of 10 903 articles, we identified 609 eligible studies enrolling 35 226 trainees. Of these, 137 were randomized studies, 67 were nonrandomized studies with 2 or more groups, and 405 used a single-group pretest-posttest design. We pooled effect sizes using random effects. Heterogeneity was large (I2>50%) in all main analyses. In comparison with no intervention, pooled effect sizes were 1.20 (95% CI, 1.04-1.35) for knowledge outcomes (n = 118 studies), 1.14 (95% CI, 1.03-1.25) for time skills (n = 210), 1.09 (95% CI, 1.03-1.16) for process skills (n = 426), 1.18 (95% CI, 0.98-1.37) for product skills (n = 54), 0.79 (95% CI, 0.47-1.10) for time behaviors (n = 20), 0.81 (95% CI, 0.66-0.96) for other behaviors (n = 50), and 0.50 (95% CI, 0.34-0.66) for direct effects on patients (n = 32). 
Subgroup analyses revealed no consistent statistically significant interactions between simulation training and instructional design features or study quality. Conclusion In comparison with no intervention, technology-enhanced simulation training in health professions education is consistently associated with large effects for outcomes of knowledge, skills, and behaviors and moderate effects for patient-related outcomes.
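The pooled effect sizes and I² heterogeneity statistics reported above come from a random-effects model. The reviews do not publish their code; the sketch below assumes the classic DerSimonian-Laird estimator and uses invented per-study data:

```python
# DerSimonian-Laird random-effects pooling sketch (illustrative only).
def random_effects_pool(effects, variances):
    """Pool per-study effect sizes; return (pooled effect, I^2 as a percentage)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q: weighted squared deviations from the fixed-effect mean.
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    # Between-study variance tau^2, truncated at zero.
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight each study with tau^2 added to its within-study variance.
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, i2

# Four hypothetical studies with widely varying effects -> high I^2.
pooled, i2 = random_effects_pool([1.2, 0.4, 1.6, 0.2], [0.05, 0.04, 0.06, 0.05])
```

With heterogeneous inputs like these, I² lands well above the 50% threshold the abstract flags; with identical per-study effects it drops to zero.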

1,420 citations

Journal ArticleDOI
TL;DR: The results of this review suggest that when used alone and compared to no intervention, PEMs may have a small beneficial effect on professional practice outcomes.
Abstract: Background Printed educational materials are widely used passive dissemination strategies to improve the quality of clinical practice and patient outcomes. Traditionally they are presented in paper formats such as monographs, publication in peer-reviewed journals and clinical guidelines. Objectives To assess the effect of printed educational materials on the practice of healthcare professionals and patient health outcomes. To explore the influence of some of the characteristics of the printed educational materials (e.g. source, content, format) on their effect on professional practice and patient outcomes. Search methods For this update, search strategies were rewritten and substantially changed from those published in the original review in order to refocus the search from published material to printed material and to expand terminology describing printed materials. Given the significant changes, all databases were searched from start date to June 2011. We searched: MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials (CENTRAL), HealthStar, CINAHL, ERIC, CAB Abstracts, Global Health, and the EPOC Register. Selection criteria We included randomised controlled trials (RCTs), quasi-randomised trials, controlled before and after studies (CBAs) and interrupted time series (ITS) analyses that evaluated the impact of printed educational materials (PEMs) on healthcare professionals' practice or patient outcomes, or both. We included three types of comparisons: (1) PEM versus no intervention, (2) PEM versus single intervention, (3) multifaceted intervention where PEM is included versus multifaceted intervention without PEM. There was no language restriction. Any objective measure of professional practice (e.g. number of tests ordered, prescriptions for a particular drug), or patient health outcomes (e.g. blood pressure) were included. 
Data collection and analysis Two review authors undertook data extraction independently, and any disagreement was resolved by discussion among the review authors. For analyses, the included studies were grouped according to study design, type of outcome (professional practice or patient outcome, continuous or dichotomous) and type of comparison. For controlled trials, we reported the median effect size for each outcome within each study, the median effect size across outcomes for each study and the median of these effect sizes across studies. Where the data were available, we re-analysed the ITS studies and reported median differences in slope and in level for each outcome, across outcomes for each study, and then across studies. We categorised each PEM according to potential effects modifiers related to the source of the PEMs, the channel used for their delivery, their content, and their format. Main results The review includes 45 studies: 14 RCTs and 31 ITS studies. Almost all the included studies (44/45) compared the effectiveness of PEM to no intervention. One single study compared paper-based PEM to the same document delivered on CD-ROM. Based on seven RCTs and 54 outcomes, the median absolute risk difference in categorical practice outcomes was 0.02 when PEMs were compared to no intervention (range from 0 to +0.11). Based on three RCTs and eight outcomes, the median improvement in standardised mean difference for continuous professional practice outcomes was 0.13 when PEMs were compared to no intervention (range from -0.16 to +0.36). Only two RCTs and two ITS studies reported patient outcomes. In addition, we re-analysed 54 outcomes from 25 ITS studies, using time series regression and observed statistically significant improvement in level or in slope in 27 outcomes. From the ITS studies, we calculated improvements in professional practice outcomes across studies after PEM dissemination (standardised median change in level = 1.69).
From the data gathered, we could not comment on which PEM characteristic influenced their effectiveness. Authors' conclusions The results of this review suggest that when used alone and compared to no intervention, PEMs may have a small beneficial effect on professional practice outcomes. There is insufficient information to reliably estimate the effect of PEMs on patient outcomes, and clinical significance of the observed effect sizes is not known. The effectiveness of PEMs compared to other interventions, or of PEMs as part of a multifaceted intervention, is uncertain.
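The "median absolute risk difference" summarized above is plain arithmetic over dichotomous outcomes: subtract the control group's event proportion from the intervention group's, then take the median across outcomes. A sketch with invented adherence proportions (the data are hypothetical, not from the review):

```python
# Median absolute risk difference across dichotomous outcomes (illustration).
from statistics import median

def risk_difference(p_pem: float, p_control: float) -> float:
    """Absolute risk difference for one dichotomous practice outcome."""
    return p_pem - p_control

# Hypothetical (PEM-group proportion, control-group proportion) pairs.
outcomes = [(0.62, 0.60), (0.45, 0.44), (0.80, 0.71), (0.30, 0.30)]
diffs = [risk_difference(p, c) for p, c in outcomes]
print(round(median(diffs), 3))  # 0.015
```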

886 citations

Journal Article
TL;DR: Do I use effective communication strategies?
Abstract: 3. Do I use effective communication strategies? Is there always “time just to talk”? Do I seek family observations/information in assessment? In monitoring? Do family members have opportunities to ask questions or seek clarification? Do I present information at a time and in a format preferred by the family members? Do I keep my work with family members respectful, yet informal, and free of professionally precious jargon?

725 citations