Author

Rosalind E. Keith

Bio: Rosalind E. Keith is an academic researcher from Mathematica Policy Research. The author has contributed to research on health care and Medicaid, has an h-index of 7, and has co-authored 16 publications that have received 6,142 citations.

Papers
Journal ArticleDOI
TL;DR: The CFIR provides a pragmatic structure for approaching complex, interacting, multi-level, and transient states of constructs in the real world by embracing, consolidating, and unifying key constructs from published implementation theories.
Abstract: Many interventions found to be effective in health services research studies fail to translate into meaningful patient care outcomes across multiple contexts. Health services researchers recognize the need to evaluate not only summative outcomes but also formative outcomes to assess the extent to which implementation is effective in a specific setting, prolongs sustainability, and promotes dissemination into other settings. Many implementation theories have been published to help promote effective implementation. However, they overlap considerably in the constructs included in individual theories, and a comparison of theories reveals that each is missing important constructs included in other theories. In addition, terminology and definitions are not consistent across theories. We describe the Consolidated Framework For Implementation Research (CFIR) that offers an overarching typology to promote implementation theory development and verification about what works where and why across multiple contexts. We used a snowball sampling approach to identify published theories that were evaluated to identify constructs based on strength of conceptual or empirical support for influence on implementation, consistency in definitions, alignment with our own findings, and potential for measurement. We combined constructs across published theories that had different labels but were redundant or overlapping in definition, and we parsed apart constructs that conflated underlying concepts. The CFIR is composed of five major domains: intervention characteristics, outer setting, inner setting, characteristics of the individuals involved, and the process of implementation. Eight constructs were identified related to the intervention (e.g., evidence strength and quality), four constructs were identified related to outer setting (e.g., patient needs and resources), 12 constructs were identified related to inner setting (e.g., culture, leadership engagement), five constructs were identified related to individual characteristics, and eight constructs were identified related to process (e.g., plan, evaluate, and reflect). We present explicit definitions for each construct. The CFIR provides a pragmatic structure for approaching complex, interacting, multi-level, and transient states of constructs in the real world by embracing, consolidating, and unifying key constructs from published implementation theories. It can be used to guide formative evaluations and build the implementation knowledge base across multiple studies and settings.
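
As a reading aid for the domain structure described above, here is a minimal sketch (in Python; not part of the CFIR publication) of a codebook keyed by the five domains. Only the example constructs named in the abstract are listed, and the lookup helper is added purely for illustration:

```python
# Illustrative CFIR-style codebook: the five domains come from the abstract above;
# only the example constructs it names are listed (each domain has more in the full framework).
cfir_codebook = {
    "intervention characteristics": ["evidence strength and quality"],   # 8 constructs in total
    "outer setting": ["patient needs and resources"],                     # 4 constructs in total
    "inner setting": ["culture", "leadership engagement"],                # 12 constructs in total
    "characteristics of individuals": [],                                 # 5 constructs in total
    "process": ["plan", "evaluate", "reflect"],                           # 8 constructs in total
}

def domain_of(construct):
    """Return the CFIR domain a construct is listed under, or None if not in the codebook."""
    for domain, constructs in cfir_codebook.items():
        if construct in constructs:
            return domain
    return None

print(domain_of("leadership engagement"))  # -> "inner setting"
```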

8,080 citations

Journal ArticleDOI
TL;DR: An approach to using the Consolidated Framework for Implementation Research (CFIR) to guide systematic research that supports rapid-cycle evaluation of the implementation of health care delivery interventions and produces actionable evaluation findings intended to improve implementation in a timely manner is presented.
Abstract: Much research does not address the practical needs of stakeholders responsible for introducing health care delivery interventions into organizations working to achieve better outcomes. In this article, we present an approach to using the Consolidated Framework for Implementation Research (CFIR) to guide systematic research that supports rapid-cycle evaluation of the implementation of health care delivery interventions and produces actionable evaluation findings intended to improve implementation in a timely manner. To present our approach, we describe a formative cross-case qualitative investigation of 21 primary care practices participating in the Comprehensive Primary Care (CPC) initiative, a multi-payer-supported primary care practice transformation intervention led by the Centers for Medicare and Medicaid Services. Qualitative data include observational field notes and semi-structured interviews with primary care practice leadership, clinicians, and administrative and medical support staff. We use intervention-specific codes and CFIR constructs to reduce and organize the data to support cross-case analysis of patterns of barriers and facilitators relating to different CPC components. Using the CFIR to guide data collection, coding, analysis, and reporting of findings supported a systematic, comprehensive, and timely understanding of barriers and facilitators to practice transformation. Our approach to using the CFIR produced actionable findings for improving implementation effectiveness during this initiative and for identifying improvements to implementation strategies for future practice transformation efforts. The CFIR is a useful tool for guiding rapid-cycle evaluation of the implementation of practice transformation initiatives. Using the approach described here, we systematically identified where adjustments and refinements to the intervention could be made in the second year of the 4-year intervention. We think the approach we describe has broad application and encourage others to use the CFIR, along with intervention-specific codes, to guide the efficient and rigorous analysis of rich qualitative data. Trial registration: NCT02318108.
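
To make the coding-and-reduction step concrete, the sketch below tallies barriers and facilitators by CPC component and CFIR construct. It is illustrative only: the excerpts, component labels, and valence field are invented here and do not reproduce the authors' analysis procedure:

```python
from collections import Counter

# Fabricated example excerpts: each is coded with a CFIR construct, an
# intervention-specific (CPC component) code, and a barrier/facilitator valence.
coded_excerpts = [
    {"cfir": "leadership engagement", "component": "risk-stratified care management", "valence": "facilitator"},
    {"cfir": "available resources", "component": "risk-stratified care management", "valence": "barrier"},
    {"cfir": "patient needs and resources", "component": "patient and caregiver engagement", "valence": "facilitator"},
]

# Cross-case reduction: count barriers and facilitators per (component, construct) pair.
tally = Counter((e["component"], e["cfir"], e["valence"]) for e in coded_excerpts)
for (component, construct, valence), n in sorted(tally.items()):
    print(f"{component} | {construct} | {valence}: {n}")
```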

372 citations

Journal ArticleDOI
TL;DR: A proposed methodology for measuring fidelity of implementation (FOI) offers a systematic means for understanding organizational members' use of distinct intervention components, assessing the reasons for variation in use across components and organizations, and evaluating the impact of FOI on intervention effectiveness.
Abstract: Along with the increasing prevalence of chronic illness has been an increase in interventions, such as nurse case management programs, to improve outcomes for patients with chronic illness. Evidence supports the effectiveness of such interventions in reducing patient morbidity, mortality, and resource utilization, but other studies have produced equivocal results. Often, little is known about how implementation of an intervention actually occurs in clinical practice. While studies often assume that interventions are used in clinical practice exactly as originally designed, this may not be the case. Thus, fidelity of an intervention's implementation reflects how an intervention is, or is not, used in clinical practice and is an important factor in understanding intervention effectiveness and in replicating the intervention in dissemination efforts. The purpose of this paper is to contribute to the understanding of implementation science by (a) proposing a methodology for measuring fidelity of implementation (FOI) and (b) testing the measure by examining the association between FOI and intervention effectiveness. We define and measure FOI based on organizational members' level of commitment to using the distinct components that make up an intervention as they were designed. Semi-structured interviews were conducted among 18 organizational members in four medical centers, and the interviews were analyzed qualitatively to assess three dimensions of commitment to use (satisfaction, consistency, and quality) and to develop an overall rating of FOI. Mixed methods were used to explore the association between FOI and intervention effectiveness (inpatient resource utilization and mortality). Predictive validity of the FOI measure was supported based on the statistical significance of FOI as a predictor of intervention effectiveness. The strongest relationship between FOI and intervention effectiveness was found when an alternative measure of FOI was utilized based on individual intervention components that had the greatest variation across medical centers. In addition to contextual factors, implementation research needs to consider FOI as an important factor in influencing intervention effectiveness. Our proposed methodology offers a systematic means for understanding organizational members' use of distinct intervention components, assessing the reasons for variation in use across components and organizations, and evaluating the impact of FOI on intervention effectiveness.
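
One way to picture the FOI measure is as per-component ratings on the three commitment-to-use dimensions rolled up into an overall score. The component names, the 1-5 scale, and the simple averaging below are assumptions for illustration, not the authors' instrument:

```python
from statistics import mean

# Hypothetical 1-5 ratings of commitment to use for two invented intervention
# components, on the three dimensions named in the abstract.
ratings = {
    "patient assessment protocol": {"satisfaction": 4, "consistency": 3, "quality": 4},
    "care plan documentation": {"satisfaction": 2, "consistency": 2, "quality": 3},
}

def component_foi(dims):
    """Average the satisfaction, consistency, and quality ratings for one component."""
    return mean(dims.values())

# Overall FOI: here simply the unweighted mean across components.
overall_foi = mean(component_foi(d) for d in ratings.values())
print(f"Overall FOI rating: {overall_foi:.2f}")
```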

73 citations

Posted Content
TL;DR: This first annual report to CMS describes the implementation and impacts of the Comprehensive Primary Care initiative over its first year.
Abstract: In October 2012, the Center for Medicare & Medicaid Innovation of the Centers for Medicare & Medicaid Services (CMS), in a unique collaboration between public and private health care payers, launched the Comprehensive Primary Care (CPC) initiative to improve primary care delivery in seven regions across the United States. This first annual report to CMS describes the implementation and impacts of CPC over its first year.

69 citations

Journal ArticleDOI
TL;DR: Improving providers’ experiences with and uptake of CCM will require addressing several challenges, including the upfront investment for CCM set-up and the time required to provide CCM to more complex patients.
Abstract: Support for ongoing care management and coordination between office visits for patients with multiple chronic conditions has been inadequate. In January 2015, Medicare introduced the Chronic Care Management (CCM) payment policy, which reimburses providers for CCM activities delivered to Medicare beneficiaries outside of office visits. This study explores the experiences, facilitators, and challenges of practices providing CCM services, and their implications going forward. It draws on semi-structured telephone interviews conducted from January to April 2016 with 71 respondents: 60 billing and non-billing providers and practice staff knowledgeable about their practices’ CCM services, and 11 professional society representatives. Practice respondents noted that most patients expressed positive views of CCM services. Practice respondents also perceived several patient benefits, including improved adherence to treatment, access to care team members, satisfaction, care continuity, and care coordination. Facilitators of CCM provision included having an in-practice care manager, patient-centered medical home recognition, experience developing care plans, patient trust in their provider, and supplemental insurance to cover CCM copayments. Most billing practices reported few problems obtaining patients’ consent for CCM, though providers felt that CMS could better facilitate consent by marketing CCM’s goals to beneficiaries. Barriers reported by professional society representatives and by billing and non-billing providers included inadequacy of CCM payments to cover upfront investments for staffing, workflow modification, and time needed to manage complex patients. Other barriers included inadequate infrastructure for health information exchange with other providers and limited electronic health record capabilities for documenting and updating care plans. Practices owned by hospital systems and large medical groups faced greater bureaucracy in implementing CCM than did smaller, independent practices. Improving providers’ experiences with and uptake of CCM will require addressing several challenges, including the upfront investment for CCM set-up and the time required to provide CCM to more complex patients.

41 citations


Cited by
Journal ArticleDOI
TL;DR: Rubin’s Multiple Imputation for Nonresponse in Surveys sets out multiple imputation as a principled approach to handling missing data arising from survey nonresponse.
Abstract: 25. Multiple Imputation for Nonresponse in Surveys. By D. B. Rubin. ISBN 0 471 08705 X. Wiley, Chichester, 1987. 258 pp. £30.25.

3,216 citations

Journal ArticleDOI
Per Nilsen
TL;DR: A taxonomy that distinguishes between different categories of theories, models and frameworks in implementation science is proposed to facilitate appropriate selection and application of relevant approaches in implementation research and practice and to foster cross-disciplinary dialogue among implementation researchers.
Abstract: Implementation science has progressed towards increased use of theoretical approaches to provide better understanding and explanation of how and why implementation succeeds or fails. The aim of this article is to propose a taxonomy that distinguishes between different categories of theories, models and frameworks in implementation science, to facilitate appropriate selection and application of relevant approaches in implementation research and practice and to foster cross-disciplinary dialogue among implementation researchers. Theoretical approaches used in implementation science have three overarching aims: describing and/or guiding the process of translating research into practice (process models); understanding and/or explaining what influences implementation outcomes (determinant frameworks, classic theories, implementation theories); and evaluating implementation (evaluation frameworks). This article proposes five categories of theoretical approaches to achieve three overarching aims. These categories are not always recognized as separate types of approaches in the literature. While there is overlap between some of the theories, models and frameworks, awareness of the differences is important to facilitate the selection of relevant approaches. Most determinant frameworks provide limited “how-to” support for carrying out implementation endeavours since the determinants usually are too generic to provide sufficient detail for guiding an implementation process. And while the relevance of addressing barriers and enablers to translating research into practice is mentioned in many process models, these models do not identify or systematically structure specific determinants associated with implementation success. Furthermore, process models recognize a temporal sequence of implementation endeavours, whereas determinant frameworks do not explicitly take a process perspective of implementation.
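
As a compact view of the taxonomy, the three overarching aims and the five categories named in the abstract can be laid out as a simple mapping (a reading aid only, not part of the article):

```python
# The three overarching aims mapped to the five categories of theoretical
# approaches, exactly as named in the abstract above.
taxonomy = {
    "describing and/or guiding the translation of research into practice": ["process models"],
    "understanding and/or explaining what influences implementation outcomes": [
        "determinant frameworks", "classic theories", "implementation theories"],
    "evaluating implementation": ["evaluation frameworks"],
}

for aim, categories in taxonomy.items():
    print(f"{aim}: {', '.join(categories)}")
```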

2,392 citations

Journal ArticleDOI
TL;DR: Although traditional clinical effectiveness and implementation trials are likely to remain the most common approach to moving a clinical intervention through from efficacy research to public health impact, judicious use of the proposed hybrid designs could speed the translation of research findings into routine practice.
Abstract: Objectives: This study proposes methods for blending design components of clinical effectiveness and implementation research. Such blending can provide benefits over pursuing these lines of research independently; for example, more rapid translational gains, more effective implementation strategies, ...

2,126 citations

Journal ArticleDOI
TL;DR: The ERIC study aimed to refine a published compilation of implementation strategy terms and definitions by systematically gathering input from a wide range of stakeholders with expertise in implementation science and clinical practice to generate consensus on implementation strategies and definitions.
Abstract: Identifying, developing, and testing implementation strategies are important goals of implementation science. However, these efforts have been complicated by the use of inconsistent language and inadequate descriptions of implementation strategies in the literature. The Expert Recommendations for Implementing Change (ERIC) study aimed to refine a published compilation of implementation strategy terms and definitions by systematically gathering input from a wide range of stakeholders with expertise in implementation science and clinical practice. Purposive sampling was used to recruit a panel of experts in implementation and clinical practice who engaged in three rounds of a modified Delphi process to generate consensus on implementation strategies and definitions. The first and second rounds involved Web-based surveys soliciting comments on implementation strategy terms and definitions. After each round, iterative refinements were made based upon participant feedback. The third round involved a live polling and consensus process via a Web-based platform and conference call. Participants identified substantial concerns with 31% of the terms and/or definitions and suggested five additional strategies. Seventy-five percent of definitions from the originally published compilation of strategies were retained after voting. Ultimately, the expert panel reached consensus on a final compilation of 73 implementation strategies. This research advances the field by improving the conceptual clarity, relevance, and comprehensiveness of implementation strategies that can be used in isolation or combination in implementation research and practice. Future phases of ERIC will focus on developing conceptually distinct categories of strategies as well as ratings for each strategy’s importance and feasibility. Next, the expert panel will recommend multifaceted strategies for hypothetical yet real-world scenarios that vary by sites’ endorsement of evidence-based programs and practices and the strength of contextual supports that surround the effort.
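
Below is a toy sketch of the kind of retention rule the abstract describes, in which a strategy definition is kept when enough panelists endorse it. The strategy names, ballots, and the use of a 75% per-item cutoff are illustrative assumptions, not the ERIC data or its exact voting procedure:

```python
# A definition is retained if the share of panelists endorsing it meets a threshold.
votes = {
    "audit and provide feedback": [True, True, True, False],
    "identify and prepare champions": [True, True, False, False],
}

RETENTION_THRESHOLD = 0.75  # assumed per-item cutoff for this sketch

for strategy, ballots in votes.items():
    share = sum(ballots) / len(ballots)
    status = "retained" if share >= RETENTION_THRESHOLD else "revise"
    print(f"{strategy}: {share:.0%} endorsed -> {status}")
```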

2,028 citations