scispace - formally typeset
Author

Peter Craig

Bio: Peter Craig is an academic researcher at the University of Glasgow. He has contributed to research in the topics of public health and medicine, has an h-index of 25, and has co-authored 91 publications receiving 13,587 citations. His previous affiliations include the University of Bristol and the Scottish Government.


Papers
Journal ArticleDOI
29 Sep 2008-BMJ
TL;DR: The Medical Research Council's evaluation framework (2000) brought welcome clarity to the task, and the council has now updated its guidance.
Abstract: Evaluating complex interventions is complicated. The Medical Research Council's evaluation framework (2000) brought welcome clarity to the task. Now the council has updated its guidance.

8,896 citations

31 May 2006
Abstract: Peter Craig, MRC Population Health Sciences Research Network; Paul Dieppe, Nuffield Department of Orthopaedic Surgery, University of Oxford; Sally Macintyre, MRC Social and Public Health Sciences Unit; Susan Michie, Centre for Outcomes Research and Effectiveness, University College London; Irwin Nazareth, MRC General Practice Research Framework; Mark Petticrew, Department of Public Health and Policy, London School of Hygiene and Tropical Medicine

1,995 citations

Journal ArticleDOI
30 Sep 2021-BMJ
Abstract: The UK Medical Research Council’s widely used guidance for developing and evaluating complex interventions has been replaced by a new framework, commissioned jointly by the Medical Research Council and the National Institute for Health Research, which takes account of recent developments in theory and methods and the need to maximise the efficiency, use, and impact of research.

1,080 citations

Journal ArticleDOI
TL;DR: The Medical Research Council's guidance emphasises that natural experiments can provide convincing evidence of impact even when effects are small or take time to appear, and that careful choice and combination of methods, testing of assumptions, and transparent reporting are vital.
Abstract: Natural experimental studies are often recommended as a way of understanding the health impact of policies and other large scale interventions. Although they have certain advantages over planned experiments, and may be the only option when it is impossible to manipulate exposure to the intervention, natural experimental studies are more susceptible to bias. This paper introduces new guidance from the Medical Research Council to help researchers and users, funders and publishers of research evidence make the best use of natural experimental approaches to evaluating population health interventions. The guidance emphasises that natural experiments can provide convincing evidence of impact even when effects are small or take time to appear. However, a good understanding is needed of the process determining exposure to the intervention, and careful choice and combination of methods, testing of assumptions and transparent reporting is vital. More could be learnt from natural experiments in future as experience of promising but lesser used methods accumulates.

671 citations

01 Jan 2010
TL;DR: New guidance from the Medical Research Council is introduced to help researchers and users, funders and publishers of research evidence make the best use of natural experimental approaches to evaluating population health interventions.
Abstract: Craig P, Cooper C, Gunnell D, Haw S, Lawson K, Macintyre S, Ogilvie D, Petticrew M, Reeves B, Sutton M, Thompson S

644 citations


Cited by
Journal ArticleDOI
TL;DR: A reporting guideline is described, the Preferred Reporting Items for Systematic reviews and Meta-Analyses for Protocols 2015 (PRISMA-P 2015), which consists of a 17-item checklist intended to facilitate the preparation and reporting of a robust protocol for the systematic review.
Abstract: Systematic reviews should build on a protocol that describes the rationale, hypothesis, and planned methods of the review; few reviews report whether a protocol exists. Detailed, well-described protocols can facilitate the understanding and appraisal of the review methods, as well as the detection of modifications to methods and selective reporting in completed reviews. We describe the development of a reporting guideline, the Preferred Reporting Items for Systematic reviews and Meta-Analyses for Protocols 2015 (PRISMA-P 2015). PRISMA-P consists of a 17-item checklist intended to facilitate the preparation and reporting of a robust protocol for the systematic review. Funders and those commissioning reviews might consider mandating the use of the checklist to facilitate the submission of relevant protocol information in funding applications. Similarly, peer reviewers and editors can use the guidance to gauge the completeness and transparency of a systematic review protocol submitted for publication in a journal or other medium.

14,708 citations

Journal ArticleDOI
07 Mar 2014-BMJ
TL;DR: The TIDieR checklist and guide should improve the reporting of interventions and make it easier for authors to structure accounts of their interventions, reviewers and editors to assess the descriptions, and readers to use the information.
Abstract: Without a complete published description of interventions, clinicians and patients cannot reliably implement interventions that are shown to be useful, and other researchers cannot replicate or build on research findings. The quality of description of interventions in publications, however, is remarkably poor. To improve the completeness of reporting, and ultimately the replicability, of interventions, an international group of experts and stakeholders developed the Template for Intervention Description and Replication (TIDieR) checklist and guide. The process involved a literature review for relevant checklists and research, a Delphi survey of an international panel of experts to guide item selection, and a face to face panel meeting. The resultant 12 item TIDieR checklist (brief name, why, what (materials), what (procedure), who provided, how, where, when and how much, tailoring, modifications, how well (planned), how well (actual)) is an extension of the CONSORT 2010 statement (item 5) and the SPIRIT 2013 statement (item 11). While the emphasis of the checklist is on trials, the guidance is intended to apply across all evaluative study designs. This paper presents the TIDieR checklist and guide, with an explanation and elaboration for each item, and examples of good reporting. The TIDieR checklist and guide should improve the reporting of interventions and make it easier for authors to structure accounts of their interventions, reviewers and editors to assess the descriptions, and readers to use the information.

5,237 citations

Journal ArticleDOI
TL;DR: “BCT taxonomy v1,” an extensive taxonomy of 93 consensually agreed, distinct BCTs, offers a step change as a method for specifying interventions, but the authors anticipate further development and evaluation based on international, interdisciplinary consensus.
Abstract: CONSORT guidelines call for precise reporting of behavior change interventions: we need rigorous methods of characterizing active content of interventions with precision and specificity. The objective of this study is to develop an extensive, consensually agreed hierarchically structured taxonomy of techniques [behavior change techniques (BCTs)] used in behavior change interventions. In a Delphi-type exercise, 14 experts rated labels and definitions of 124 BCTs from six published classification systems. Another 18 experts grouped BCTs according to similarity of active ingredients in an open-sort task. Inter-rater agreement amongst six researchers coding 85 intervention descriptions by BCTs was assessed. This resulted in 93 BCTs clustered into 16 groups. Of the 26 BCTs occurring at least five times, 23 had adjusted kappas of 0.60 or above. “BCT taxonomy v1,” an extensive taxonomy of 93 consensually agreed, distinct BCTs, offers a step change as a method for specifying interventions, but we anticipate further development and evaluation based on international, interdisciplinary consensus.

4,568 citations

Journal ArticleDOI
19 Mar 2015-BMJ
TL;DR: New MRC guidance provides a framework for conducting and reporting process evaluation studies that will help improve the quality of decision-making in the design and testing of complex interventions.
Abstract: Process evaluation is an essential part of designing and testing complex interventions. New MRC guidance provides a framework for conducting and reporting process evaluation studies

3,662 citations