Journal ArticleDOI

Training in evidence-based practice.

TL;DR: Broad strategies are needed to overcome the lethargy in behavioral health education and training programs to make them more relevant to contemporary clinical practice and to ensure practice environments that support and reinforce, rather than thwart, the practice of evidence-based treatment.
About: This article was published in Psychiatric Clinics of North America on 2003-12-01 and has received 34 citations to date. It focuses on the topics: Evidence-based practice & Evidence-based medicine.
Citations
Journal ArticleDOI
TL;DR: In this paper, the authors draw from an interdisciplinary literature (including medical training, adult education, and teacher training) to identify useful training and support approaches as well as important conceptual frameworks that may be applied to training in mental health.
Abstract: Strategies specifically designed to facilitate the training of mental health practitioners in evidence-based practices (EBPs) have lagged behind the development of the interventions themselves. The current paper draws from an interdisciplinary literature (including medical training, adult education, and teacher training) to identify useful training and support approaches as well as important conceptual frameworks that may be applied to training in mental health. Theory and research findings are reviewed, which highlight the importance of continued consultation/support following training workshops, congruence between the training content and practitioner experience, and focus on motivational issues. In addition, six individual approaches are presented with careful attention to their empirical foundations and potential applications. Common techniques are highlighted and applications and future directions for mental health workforce training and research are discussed.

173 citations

Journal ArticleDOI
TL;DR: In this paper, the authors investigated domains of implementation activities and correlated them with implementation success during a large national evidence-based practice implementation project, finding that active leaders should focus on redesigning the flow of work to support implementation and on reinforcing program improvements.
Abstract: Implementation research has examined practice prioritization, implementation leadership, workforce development, workflow re-engineering, and practice reinforcement, but has not addressed their relative importance as implementation drivers. This study investigated domains of implementation activities and correlated them with implementation success during a large national evidence-based practice implementation project. Implementation success was correlated with active leadership strategically devoted to redesigning the flow of work and reinforcing implementation through measurement and feedback. Relative attention to workforce development was negatively correlated with implementation. Active leaders should focus on redesigning the flow of work to support implementation and on reinforcing program improvements.

110 citations


Cites background from "Training in evidence-based practice..."

  • ...Successful implementation requires development of a workforce with the knowledge, attitudes, and skills to do the job (Hoge et al. 2003; Schoenwald et al. 2010b; Siskind and Wiley-Exley 2009)....


Journal ArticleDOI
TL;DR: It was found that ACT was generally more successfully implemented than IDDT throughout the state, and that this difference could be traced in large part to state-level factors relating to historical preparation for the practice, establishment of standards, formation of a technical assistance center, and funding.
Abstract: Objective: As part of this national project, we examined barriers and strategies to implementation of two evidence-based practices (EBPs) in Indiana. Background: Despite many advances in the knowledge base regarding mental health treatment, the implementation of EBPs in real-world settings remains poorly understood. The National EBP Project is a multi-state study of factors influencing implementation of EBPs. Methods: Over a 15-month period, we observed eight assertive community treatment (ACT) programs and six integrated dual disorders treatment (IDDT) programs and noted pertinent actions taken by the state mental health agency influencing implementation. We created a database containing summaries of monthly visits to each program and interviews with key leaders. Using this database and clinical impressions, we rated barriers and strategies at each site on seven factors: Attitudes, Mastery, Leadership, Staffing, Policies, Workflow, and Program Monitoring. Results: At the site level, the most frequently observed barriers were in the areas of leadership, staffing, and policies for ACT, and mastery and leadership for IDDT. Overall, barriers were more evident for IDDT than for ACT. Strategies were less frequently noted but generally paralleled the areas noted for barriers. However, our central finding was that ACT was generally more successfully implemented than IDDT throughout the state, and that this difference could be traced in large part to state-level factors relating to historical preparation for the practice, establishment of standards, formation of a technical assistance center, and funding. Conclusion: In this case study, both state-level and site-specific factors influenced the success of EBP implementation. To address these factors, the field needs systematic strategies to anticipate and overcome these barriers if full implementation is to be realized.

75 citations

Journal ArticleDOI
TL;DR: Data on minimum state requirements for drug and alcohol counselors and mental health counselors in all 50 states and Washington, DC, suggest that training as a mental health counselor is primarily structured through formal education, whereas training as a substance abuse counselor resembles an apprentice model.

68 citations


Cites background from "Training in evidence-based practice..."

  • ...In comparison to substance abuse treatment, the mental health field, through the efforts of psychology and psychiatry, has led the way in identifying empirically supported treatments and in requiring the training of evidence-based practice in clinical academic and training programs (Hoge, Tondora, & Stuart, 2003)....


Journal ArticleDOI
TL;DR: The authors first review several of the definitions, criteria, and strategies that have been used to define scientific evidence and then offer suggestions for further consideration in the process of synthesizing evidence for clinicians.

66 citations


Cites background from "Training in evidence-based practice..."

  • ...Another important distinction is between the use of standardized procedures for finding and evaluating the scientific evidence (evidence-based medicine) and using specific practices that are supported by scientific evidence (evidence-based practice) [8]....


References
Journal ArticleDOI
17 Nov 2001-BMJ
TL;DR: Analyzing health care organizations as complex systems, Crossing the Quality Chasm also documents the causes of the quality gap, identifies current practices that impede quality care, and explores how systems approaches can be used to implement change.
Abstract: Crossing the Quality Chasm identifies and recommends improvements in six dimensions of health care in the U.S.: patient safety, care effectiveness, patient-centeredness, timeliness, care efficiency, and equity. Safety looks at reducing the likelihood that patients are harmed by medical errors. Effectiveness describes avoiding over- and underuse of resources and services. Patient-centeredness relates both to customer service and to considering and accommodating individual patient needs when making care decisions. Timeliness emphasizes reducing wait times. Efficiency focuses on reducing waste and, as a result, total cost of care. Equity looks at closing racial and income gaps in health care.

15,046 citations

Book
14 Mar 2000
TL;DR: This book discusses how to ask clinical questions you can answer and how to critically appraise the evidence in evidence-based medicine, and includes 7 Rapid Reference Cards for use in clinical practice.
Abstract: Introduction: On the Need for Evidence-Based Medicine. 1. How to Ask Clinical Questions You Can Answer. 2. Searching for the Best Evidence. 3. Critically Appraising the Evidence. 4. Can You Apply This Valid, Important Evidence in Caring for Your Patient? 5. Evaluation. Appendix: Confidence Intervals. Also included are 7 Rapid Reference Cards.

6,019 citations

Journal ArticleDOI
04 Nov 1992-JAMA
TL;DR: An important goal of the medical residency program is to educate physicians in the practice of evidence-based medicine, and strategies include a weekly, formal academic half-day for residents devoted to learning the necessary skills.
Abstract: A new paradigm for medical practice is emerging. Evidence-based medicine de-emphasizes intuition, unsystematic clinical experience, and pathophysiologic rationale as sufficient grounds for clinical decision making and stresses the examination of evidence from clinical research. Evidence-based medicine requires new skills of the physician, including efficient literature searching and the application of formal rules of evidence in evaluating the clinical literature. An important goal of our medical residency program is to educate physicians in the practice of evidence-based medicine. Strategies include a weekly, formal academic half-day for residents, devoted to learning the necessary skills; recruitment into teaching roles of physicians who practice evidence-based medicine; sharing among faculty of approaches to teaching evidence-based medicine; and providing faculty with feedback on their performance as role models and teachers of evidence-based medicine. The influence of evidence-based medicine on clinical practice and medical education is increasing.

3,906 citations

Book
01 Jan 2001
TL;DR: Without a way of critically appraising the information they receive, clinicians are relatively helpless in deciding what new information to learn and how to modify their practice.
Abstract: Medical practice is constantly changing. The rate of change is accelerating, and physicians can be forgiven if they often find it dizzying. How can physicians learn about new information and innovations, and decide how (if at all) they should modify their practice? Possible sources include summaries from the medical literature (review articles, practice guidelines, consensus statements, editorials, and summary articles in "throwaway" journals); consultation with colleagues who have special expertise; lectures; seminars; advertisements in medical journals; conversations with representatives from pharmaceutical companies; and original articles in journals and journal supplements. Each of these sources of information might be valuable, though each is subject to its own particular biases [1,2]. Problems arise when, as is often the case, these sources of information provide different suggestions about patient care. Without a way of critically appraising the information they receive, clinicians are relatively helpless in deciding what new information to learn and how to modify their practice.

3,305 citations


"Training in evidence-based practice..." refers background in this paper

  • ...Sackett et al [2] and Guyatt and Rennie [11] have published detailed pocket guides to EBM that can be used for self-study or as texts for formal courses....


Journal ArticleDOI
06 Sep 1995-JAMA
TL;DR: Widely used CME delivery methods such as conferences have little direct impact on improving professional practice, and more effective methods, such as systematic practice-based interventions and outreach visits, are seldom used by CME providers.
Abstract: Objective: To review the literature relating to the effectiveness of education strategies designed to change physician performance and health care outcomes. Data Sources: We searched MEDLINE, ERIC, NTIS, the Research and Development Resource Base in Continuing Medical Education, and other relevant data sources from 1975 to 1994, using continuing medical education (CME) and related terms as keywords. We manually searched journals and the bibliographies of other review articles and called on the opinions of recognized experts. Study Selection: We reviewed studies that met the following criteria: randomized controlled trials of education strategies or interventions that objectively assessed physician performance and/or health care outcomes. These intervention strategies included (alone and in combination) educational materials, formal CME activities, outreach visits such as academic detailing, opinion leaders, patient-mediated strategies, audit with feedback, and reminders. Studies were selected only if more than 50% of the subjects were either practicing physicians or medical residents. Data Extraction: We extracted the specialty of the physicians targeted by the interventions and the clinical domain and setting of the trial. We also determined the details of the educational intervention, the extent to which needs or barriers to change had been ascertained prior to the intervention, and the main outcome measure(s). Data Synthesis: We found 99 trials, containing 160 interventions, that met our criteria. Almost two thirds of the interventions (101 of 160) displayed an improvement in at least one major outcome measure: 70% demonstrated a change in physician performance, and 48% of interventions aimed at health care outcomes produced a positive change. Effective change strategies included reminders, patient-mediated interventions, outreach visits, opinion leaders, and multifaceted activities. Audit with feedback and educational materials were less effective, and formal CME conferences or activities, without enabling or practice-reinforcing strategies, had relatively little impact. Conclusion: Widely used CME delivery methods such as conferences have little direct impact on improving professional practice. More effective methods such as systematic practice-based interventions and outreach visits are seldom used by CME providers. (JAMA. 1995;274:700-705)

2,857 citations


"Training in evidence-based practice..." refers background in this paper

  • ...For example, in a review of systematic studies of continuing education, Davis et al [20] reported that 64% of interventions using any two of these educational techniques produced positive changes in provider behavior, while those interventions combining three or more techniques produced a change rate of 79%....


  • ...The general findings from this body of research are detailed in a series of articles and reviews published over the past decade [20–24]....


  • ...Further, changes in provider behavior are generally small, only occasionally moderate in nature, and seldom large [20], leading Oxman et al [24] to conclude that there are no magic bullets for teaching providers in a manner that achieves behavior change....

