Programmatic assessment of competency-based workplace learning: when theory meets practice
Harold G. J. Bok, Pim W. Teunissen, Robert P. Favier, Nancy J. Rietbroek, L. F. H. Theyse, Harold Brommer, Jan C. M. Haarhuis, Peter van Beukelen, Cees P. M. van der Vleuten, Debbie Jaarsma, et al.
TL;DR: This work investigated the implementation of a theory-based programme of assessment in which low-stakes assessments simultaneously provide formative feedback and input for summative decisions, and found it not easy to implement.
Abstract
Background: In competency-based medical education, emphasis has shifted towards outcomes, capabilities, and learner-centeredness. Together with a focus on sustained evidence of professional competence, this calls for new methods of teaching and assessment. Recently, medical educators have advocated a holistic, programmatic approach to assessment. Besides maximising the facilitation of learning, such an approach should improve the validity and reliability of measurements and the documentation of competence development. We explored how, in a competency-based curriculum, current theories on programmatic assessment interacted with educational practice.
Methods: In a development study including evaluation, we investigated the implementation of a theory-based programme of assessment. Between April 2011 and May 2012, quantitative evaluation data were collected and used to guide group interviews that explored the experiences of students and clinical supervisors with the assessment programme. We coded the transcripts, and emerging topics were organised into a list of lessons learned.
Results: The programme mainly focuses on the integration of learning and assessment by motivating and supporting students to seek and accumulate feedback. The assessment instruments were aligned to cover predefined competencies, enabling aggregation of information in a structured and meaningful way. Assessments designed as formative learning experiences were increasingly perceived as summative by students. Peer feedback was experienced as a valuable method of formative feedback. Social interaction and external guidance appeared to be of crucial importance in scaffolding self-directed learning. Aggregating data from individual assessments into a holistic portfolio judgement required expertise and extensive training and supervision of judges.
Conclusions: A programme of assessment in which low-stakes assessments simultaneously provide formative feedback and input for summative decisions proved not easy to implement. Careful preparation and guidance of the implementation process were crucial. Assessment for learning requires meaningful feedback with each assessment, and special attention should be paid to the quality of feedback at individual assessment moments. Comprehensive attention to faculty development and training for students is essential for the successful implementation of an assessment programme.
Citations
Journal Article
Guidelines: the do’s, don’ts and don’t knows of feedback for clinical education
TL;DR: Feedback is not easy to get right, but it is essential to learning in medicine, and there is a wealth of evidence supporting the Do’s and warning against the Don’ts.
Journal Article
Twelve Tips for programmatic assessment
C. P. M. van der Vleuten, Lambert Schuwirth, Erik W. Driessen, Marjan J. B. Govaerts, Sylvia Heeneman, et al.
TL;DR: This paper provides concrete recommendations for implementation of programmatic assessment, an integral approach to the design of an assessment program with the intent to optimise its learning function, its decision-making function and its curriculum quality-assurance function.
Journal Article
Assessment, feedback and the alchemy of learning.
TL;DR: Reconciling the tension between assessment's focus on judgement and decision making and feedback's focus on growth and development represents a critical challenge for researchers and educators.
Journal Article
Entrustability Scales: Outlining Their Usefulness for Competency-Based Clinical Assessment
TL;DR: This Perspective outlines how “entrustability scales” may help bridge the gap between the assessment judgments of clinical supervisors and WBA instruments.
Journal Article
The impact of programmatic assessment on student learning: theory versus practice
TL;DR: ‘Programmatic assessment’ is intended to optimise both learning functions and decision functions at the programme level of assessment, rather than according to individual methods of assessment.
References
Journal Article
Defining and Assessing Professional Competence
TL;DR: An inclusive definition of competence is generated: the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and the community being served.
Journal Article
Design Research: Theoretical and Methodological Issues
TL;DR: In this paper, the authors outline the goals of design research and how it is related to other methodologies, and provide guidelines for how design research can best be carried out in the future.
Journal Article
Competency-based medical education: theory to practice.
Jason R. Frank, Linda Snell, Olle ten Cate, Eric S. Holmboe, Carol Carraccio, Susan R. Swing, Peter Harris, Nicholas Glasgow, Craig Campbell, Deepak Dath, Ronald M. Harden, William Iobst, Donlin M. Long, Rani Mungroo, Denyse Richardson, Jonathan Sherbino, Ivan Silver, Sarah Taber, Martin Talbot, Kenneth A. Harris, et al.
TL;DR: The evolution of CBME from the outcomes movement in the 20th century to a renewed approach that, focused on accountability and curricular outcomes and organized around competencies, promotes greater learner-centredness and de-emphasizes time-based curricular design is described.
Journal Article
The assessment of professional competence: Developments, research and practical implications.
Journal Article
Assessing professional competence: from methods to programmes
TL;DR: In this paper, the authors use a utility model to illustrate that selecting an assessment method involves context-dependent compromises, and that assessment is not a measurement problem but an instructional design problem, comprising educational, implementation and resource aspects.