Author

Anthony J Levinson

Bio: Anthony J Levinson is an academic researcher at McMaster University. He has contributed to research on topics including usability and the Internet. He has an h-index of 16 and has co-authored 38 publications receiving 2,896 citations. His previous affiliations include McMaster University Medical Centre and the Mayo Clinic.

Papers
Journal ArticleDOI
10 Sep 2008-JAMA
TL;DR: Internet-based learning is associated with large positive effects compared with no intervention, whereas effects compared with non-Internet instructional methods are heterogeneous and generally small, suggesting effectiveness similar to traditional methods.
Abstract: Context The increasing use of Internet-based learning in health professions education may be informed by a timely, comprehensive synthesis of evidence of effectiveness. Objectives To summarize the effect of Internet-based instruction for health professions learners compared with no intervention and with non-Internet interventions. Data Sources Systematic search of MEDLINE, Scopus, CINAHL, EMBASE, ERIC, TimeLit, Web of Science, Dissertation Abstracts, and the University of Toronto Research and Development Resource Base from 1990 through 2007. Study Selection Studies in any language quantifying the association of Internet-based instruction and educational outcomes for practicing and student physicians, nurses, pharmacists, dentists, and other health care professionals compared with a no-intervention or non-Internet control group or a preintervention assessment. Data Extraction Two reviewers independently evaluated study quality and abstracted information including characteristics of learners, learning setting, and intervention (including level of interactivity, practice exercises, online discussion, and duration). Data Synthesis There were 201 eligible studies. Heterogeneity in results across studies was large (I² ≥ 79%) in all analyses. Effect sizes were pooled using a random effects model. The pooled effect size in comparison to no intervention favored Internet-based interventions and was 1.00 (95% confidence interval [CI], 0.90-1.10). Conclusions Internet-based learning is associated with large positive effects compared with no intervention. In contrast, effects compared with non-Internet instructional methods are heterogeneous and generally small, suggesting effectiveness similar to traditional methods. Future research should directly compare different Internet-based interventions.

1,241 citations
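The data synthesis above pools per-study effect sizes with a random-effects model and reports the I² heterogeneity statistic. As a minimal sketch of how such pooling works, assuming the common DerSimonian-Laird estimator (the abstract does not specify which estimator was used) and hypothetical study values:

```python
# Minimal sketch of random-effects pooling (DerSimonian-Laird estimator).
# The review does not state which estimator it used, and the per-study
# effect sizes and variances below are hypothetical, not data from the paper.
import numpy as np

def random_effects_pool(effects, variances):
    """Pool per-study standardized effect sizes under a random-effects model."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = 1.0 / variances                        # inverse-variance (fixed-effect) weights
    fixed = np.sum(w * effects) / np.sum(w)    # fixed-effect pooled estimate
    q = np.sum(w * (effects - fixed) ** 2)     # Cochran's Q statistic
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)              # between-study variance (tau^2)
    i2 = 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0  # I^2 in percent
    w_star = 1.0 / (variances + tau2)          # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# Hypothetical standardized mean differences from five studies.
pooled, ci, i2 = random_effects_pool(
    effects=[1.2, 0.8, 1.1, 0.6, 1.4],
    variances=[0.05, 0.08, 0.04, 0.10, 0.06],
)
print(f"pooled ES = {pooled:.2f}, 95% CI ({ci[0]:.2f} to {ci[1]:.2f}), I2 = {i2:.0f}%")
```

The same computation yields I², the share of total variability attributable to between-study differences rather than chance; values at or above 79%, as reported in the abstract, indicate substantial heterogeneity.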

Journal ArticleDOI
TL;DR: Interactivity, practice exercises, repetition, and feedback seem to be associated with improved learning outcomes, although inconsistency across studies tempers conclusions.
Abstract: Purpose A recent systematic review (2008) described the effectiveness of Internet-based learning (IBL) in health professions education. A comprehensive synthesis of research investigating how to improve IBL is needed. This systematic review sought to provide such a synthesis. Method The authors …

468 citations

Journal ArticleDOI
TL;DR: The most common instructional methods (each used in nearly 50% of courses) were patient cases, self-assessment questions and feedback; nearly all courses (89%) used written text and most (55%) used multimedia.
Abstract: OBJECTIVES Educators often speak of web-based learning (WBL) as a single entity or a cluster of similar activities with homogeneous effects. Yet a recent systematic review demonstrated large heterogeneity among results from individual studies. Our purpose is to describe the variation in configurations, instructional methods and presentation formats in WBL. METHODS We systematically searched MEDLINE, EMBASE, ERIC, CINAHL and other databases (last search November 2008) for studies comparing a WBL intervention with no intervention or another educational activity. From eligible studies we abstracted information on course participants, topic, configuration and instructional methods. We summarised this information and then purposively selected and described several WBL interventions that illustrate specific technologies and design features. RESULTS We identified 266 eligible studies. Nearly all courses (89%) used written text and most (55%) used multimedia. A total of 32% used online communication via e-mail, threaded discussion, chat or videoconferencing, and 9% implemented synchronous components. Overall, 24% blended web-based and non-computer-based instruction. Most web-based courses (77%) employed specific instructional methods, other than text alone, to enhance the learning process. The most common instructional methods (each used in nearly 50% of courses) were patient cases, self-assessment questions and feedback. We describe several studies to illustrate the range of instructional designs. CONCLUSIONS Educators and researchers cannot treat WBL as a single entity. Many different configurations and instructional methods are available for WBL instructors. Researchers should study when to use specific WBL designs and how to use them effectively.

222 citations

Journal ArticleDOI
TL;DR: Multiple views of the brain may impede learning of surface anatomy, particularly for learners with poor spatial ability, and high degrees of learner control may reduce learning effectiveness.
Abstract: CONTEXT Computer-aided instruction is used increasingly in medical education and anatomy instruction with limited research evidence to guide its design and deployment. OBJECTIVES To determine the effects of (a) learner control over the e-learning environment and (b) key views of the brain versus multiple views in the learning of brain surface anatomy. DESIGN Randomised trial with 2 phases of study. PARTICIPANTS Volunteer sample of 1st-year psychology students (phase 1, n = 120; phase 2, n = 120). INTERVENTIONS Phase 1: computer-based instruction in brain surface anatomy with 4 conditions: (1) learner control/multiple views (LMV); (2) learner control/key views (LKV); (3) programme control/multiple views (PMV); (4) programme control/key views (PKV). Phase 2: 2 conditions: low learner control/key views (PKV) versus no learner control/key views (SKV). All participants performed a pre-test, post-test and test of visuospatial ability. MAIN OUTCOME MEASURES A 30-item post-test of brain surface anatomy structure identification. RESULTS The PKV group attained the best post-test score (57.7%) and the PMV group the worst (42.2%), with the 2 high learner control groups performing in between. For students with low spatial ability, estimated scores are 20% lower for those who saw multiple views during learning. In phase 2, students in the most static condition with no learner control (SKV) performed similarly to students in the PKV group. CONCLUSIONS Multiple views may impede learning, particularly for those with relatively poor spatial ability. High degrees of learner control may reduce effectiveness of learning. KEYWORDS randomised controlled trial; education, distance; neurology/education; brain/anatomy; humans; teaching/methods; computer-assisted instruction. Medical Education 2007: 41: 495-501

175 citations

Journal ArticleDOI
TL;DR: On average, Internet-based instruction and non-computer instruction require similar time; instructional strategies that enhance feedback and interactivity typically prolong learning time but in many cases also enhance learning outcomes.
Abstract: Authors have claimed that Internet-based instruction promotes greater learning efficiency than non-computer methods. Objectives Determine, through a systematic synthesis of evidence in health professions education, how Internet-based instruction compares with non-computer instruction in time spent learning, and what features of Internet-based instruction are associated with improved learning efficiency. Data sources We searched databases including MEDLINE, CINAHL, EMBASE, and ERIC from 1990 through November 2008. Study selection and data abstraction We included all studies quantifying learning time for Internet-based instruction for health professionals, compared with other instruction. Reviewers worked independently, in duplicate, to abstract information on interventions, outcomes, and study design. Results We identified 20 eligible studies. Random effects meta-analysis of 8 studies comparing Internet-based with non-Internet instruction (positive numbers indicating Internet-based instruction took longer) revealed a pooled effect size (ES) for time of −0.10 (p = 0.63). Among comparisons of two Internet-based interventions, providing feedback adds time (ES 0.67, p = 0.003, two studies), and greater interactivity generally takes longer (ES 0.25, p = 0.089, five studies). One study demonstrated that adapting to learner prior knowledge saves time without significantly affecting knowledge scores. Other studies revealed that audio narration, video clips, interactive models, and animations increase learning time but also facilitate higher knowledge and/or satisfaction. Across all studies, time correlated positively with knowledge outcomes (r = 0.53, p = 0.021). Conclusions On average, Internet-based instruction and non-computer instruction require similar time. Instructional strategies to enhance feedback and interactivity typically prolong learning time, but in many cases also enhance learning outcomes. Isolated examples suggest potential for improving efficiency in Internet-based instruction.

147 citations
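The time comparisons in this review are expressed as standardized effect sizes. A minimal sketch of one common way to compute such an effect size from two study arms, using Hedges' small-sample correction; the estimator choice and the group statistics are illustrative assumptions, not values from the review:

```python
# Minimal sketch of a standardized mean difference for learning time with
# Hedges' small-sample correction. The estimator choice and the group
# statistics are illustrative assumptions, not values from the review.
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference (Hedges' g) between two study arms."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                          / (n1 + n2 - 2))
    d = (mean1 - mean2) / pooled_sd          # Cohen's d
    j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)  # small-sample correction factor
    return j * d

# Hypothetical minutes of learning time: Internet-based arm vs. non-computer
# arm. Per the sign convention in the abstract, a positive value would mean
# the Internet-based arm took longer.
g = hedges_g(mean1=52.0, sd1=12.0, n1=30, mean2=55.0, sd2=14.0, n2=30)
print(f"Hedges' g for learning time = {g:.2f}")
```

Per-study values computed this way are what a meta-analysis like the one above then pools into a single time estimate such as the reported −0.10.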


Cited by
Journal ArticleDOI
TL;DR: A comprehensive survey of factor analytic studies of human cognitive abilities.
Abstract: (1998). Human cognitive abilities: A survey of factor analytic studies. Gifted and Talented International, Vol. 13, No. 2, pp. 97-98.

2,388 citations

Journal ArticleDOI
07 Sep 2011-JAMA
TL;DR: In comparison with no intervention, technology-enhanced simulation training in health professions education is consistently associated with large effects for outcomes of knowledge, skills, and behaviors and moderate effects for patient-related outcomes.
Abstract: Context Although technology-enhanced simulation has widespread appeal, its effectiveness remains uncertain. A comprehensive synthesis of evidence may inform the use of simulation in health professions education. Objective To summarize the outcomes of technology-enhanced simulation training for health professions learners in comparison with no intervention. Data Source Systematic search of MEDLINE, EMBASE, CINAHL, ERIC, PsycINFO, Scopus, key journals, and previous review bibliographies through May 2011. Study Selection Original research in any language evaluating simulation compared with no intervention for training practicing and student physicians, nurses, dentists, and other health care professionals. Data Extraction Reviewers working in duplicate evaluated quality and abstracted information on learners, instructional design (curricular integration, distributing training over multiple days, feedback, mastery learning, and repetitive practice), and outcomes. We coded skills (performance in a test setting) separately for time, process, and product measures, and similarly classified patient care behaviors. Data Synthesis From a pool of 10 903 articles, we identified 609 eligible studies enrolling 35 226 trainees. Of these, 137 were randomized studies, 67 were nonrandomized studies with 2 or more groups, and 405 used a single-group pretest-posttest design. We pooled effect sizes using random effects. Heterogeneity was large (I² > 50%) in all main analyses. In comparison with no intervention, pooled effect sizes were 1.20 (95% CI, 1.04-1.35) for knowledge outcomes (n = 118 studies), 1.14 (95% CI, 1.03-1.25) for time skills (n = 210), 1.09 (95% CI, 1.03-1.16) for process skills (n = 426), 1.18 (95% CI, 0.98-1.37) for product skills (n = 54), 0.79 (95% CI, 0.47-1.10) for time behaviors (n = 20), 0.81 (95% CI, 0.66-0.96) for other behaviors (n = 50), and 0.50 (95% CI, 0.34-0.66) for direct effects on patients (n = 32). Subgroup analyses revealed no consistent statistically significant interactions between simulation training and instructional design features or study quality. Conclusion In comparison with no intervention, technology-enhanced simulation training in health professions education is consistently associated with large effects for outcomes of knowledge, skills, and behaviors and moderate effects for patient-related outcomes.

1,420 citations
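One hedged way to interpret pooled standardized effect sizes like those above (large, such as 1.20 for knowledge; moderate, such as 0.50 for direct effects on patients) is as a probability of superiority. The sketch below assumes normally distributed outcomes with equal variances; this conversion is an interpretive aid, not part of the original analysis:

```python
# Minimal sketch converting a pooled standardized effect size into a
# "probability of superiority": the chance that a randomly chosen trained
# learner outscores a randomly chosen untrained one. Assumes normally
# distributed outcomes with equal variances; not part of the original analysis.
from statistics import NormalDist

def probability_of_superiority(effect_size: float) -> float:
    # P(X_trained > X_control) = Phi(d / sqrt(2)) for equal-variance normals
    return NormalDist().cdf(effect_size / 2 ** 0.5)

# Pooled effect sizes quoted from the abstract above.
for label, es in [("knowledge", 1.20), ("direct effects on patients", 0.50)]:
    print(f"{label}: ES {es:.2f} -> P(superiority) = {probability_of_superiority(es):.2f}")
```

Under these assumptions, an effect size of 1.20 implies roughly an 80% chance that a randomly chosen trained learner outperforms an untrained one, while 0.50 implies roughly 64%.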

Journal ArticleDOI
TL;DR: The authors describe five basic elements needed to build expertise: effortful exertion to improve performance, intrinsic motivation to engage in the task, carefully tailored practice tasks that focus on areas of weakness, feedback that provides knowledge of results, and continued repetition over a number of years.
Abstract: Practice is a necessary but not sufficient condition to reach high levels of competence. Deliberate practice, which includes five basic elements, is needed to build expertise. Those elements include: (1) effortful exertion to improve performance; (2) intrinsic motivation to engage in the task; (3) carefully tailored practice tasks that focus on areas of weakness; (4) feedback that provides knowledge of results; and (5) continued repetition over a number of years (p. 256)

673 citations