Open Access · Journal Article · DOI

Integrating implementation science in clinical research to maximize public health impact: a call for the reporting and alignment of implementation strategy use with implementation outcomes in clinical research

TLDR
It is argued that the lack of comprehensive reporting of implementation strategy use, and of the alignment of those strategies with implementation outcomes within clinical research, is a missed opportunity to efficiently narrow research-to-practice gaps, and it is proposed that revisions to frequently used reporting guidelines in clinical research are needed.
Abstract
Although comprehensive reporting guidelines for implementation strategy use within implementation research exist, they are rarely used by clinical (i.e., efficacy and effectiveness) researchers. In this debate, we argue that the lack of comprehensive reporting of implementation strategy use and alignment of those strategies with implementation outcomes within clinical research is a missed opportunity to efficiently narrow research-to-practice gaps. We review ways that comprehensively specifying implementation strategy use can advance science, including enhancing replicability of clinical trials and reducing the time from clinical research to public health impact. We then propose that revisions to frequently used reporting guidelines in clinical research (e.g., CONSORT, TIDieR) are needed, review current methods for reporting implementation strategy use (e.g., utilizing StaRI), provide pragmatic suggestions on how to both prospectively and retrospectively specify implementation strategy use and align these strategies with implementation outcomes within clinical research, and offer a case study of using these methods. The approaches recommended in this article will not only contribute to shared knowledge and language among clinical and implementation researchers but also facilitate the replication of efficacy and effectiveness research. Ultimately, we hope to accelerate translation from clinical to implementation research in order to expedite improvements in public health.



Citations
Journal Article

Nudge: Improving Decisions about Health, Wealth, and Happiness

TL;DR: Thaler and Sunstein offer a general explanation of, and advocacy for, libertarian paternalism, a term coined by the authors in earlier publications, as a general approach to how leaders, systems, organizations, and governments can nudge people to do the things the nudgers want and need done for the betterment of the nudgees, or of society.
Journal ArticleDOI

Forms and functions of bridging factors: specifying the dynamic links between outer and inner contexts during implementation and sustainment

TL;DR: In this article, the authors propose five function-based and three form-based dimensions of bridging factors for evidence-based practice (EBP) implementation and sustainment, based on a case study design.
References
Journal ArticleDOI

CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials

TL;DR: The CONSORT 2010 Statement has been used worldwide to improve the reporting of randomised controlled trials; it was updated by Schulz et al. in 2010 on the basis of new methodological evidence and accumulating experience.
Journal ArticleDOI

Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science

TL;DR: The CFIR provides a pragmatic structure for approaching complex, interacting, multi-level, and transient states of constructs in the real world by embracing, consolidating, and unifying key constructs from published implementation theories.
Journal ArticleDOI

CONSORT 2010 Explanation and Elaboration: updated guidelines for reporting parallel group randomised trials

TL;DR: This update of the CONSORT statement improves the wording and clarity of the previous checklist and incorporates recommendations related to topics that have only recently received recognition, such as selective outcome reporting bias.
Journal ArticleDOI

Outcomes for Implementation Research: Conceptual Distinctions, Measurement Challenges, and Research Agenda

TL;DR: A heuristic, working “taxonomy” of eight conceptually distinct implementation outcomes—acceptability, adoption, appropriateness, feasibility, fidelity, implementation cost, penetration, and sustainability—along with their nominal definitions is proposed.