Assessing Markers of Reproducibility and Transparency in Smoking Behaviour Change Intervention Evaluations
TLDR
Open data, materials, analysis, and replications are rare in smoking behaviour change interventions, whereas funding source and conflict of interest declarations are common.
Abstract
Introduction. Activities promoting research reproducibility and transparency are crucial for generating trustworthy evidence. Evaluation of smoking interventions is one area where vested interests may motivate reduced reproducibility and transparency. Aims. To assess markers of transparency and reproducibility in smoking behaviour change intervention evaluation reports. Methods. One hundred evaluation reports of smoking behaviour change intervention randomised controlled trials published in 2018-2019 were identified. Reproducibility markers of pre-registration; protocol sharing; data, material, and analysis script sharing; replication of a previous study; and open access publication were coded in the identified reports. Transparency markers of funding and conflict of interest declarations were also coded. Coding was performed by two researchers, with inter-rater reliability calculated using Krippendorff's alpha. Results. Seventy-one percent of reports were open access, and 73% were pre-registered. However, only 13% provided accessible materials, 7% accessible data, and 1% accessible analysis scripts. No reports were replication studies. Ninety-four percent of reports provided a funding source statement, and 88% provided a conflict of interest statement. Conclusions. Open data, materials, analysis, and replications are rare in smoking behaviour change interventions, whereas funding source and conflict of interest declarations are common. Future smoking research should be more reproducible to enable knowledge accumulation. This study was pre-registered: https://osf.io/yqj5p.
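The abstract reports that two coders rated each marker and that inter-rater reliability was computed with Krippendorff's alpha. The paper's pre-registration specifies the exact procedure; purely as an illustration, a minimal pure-Python sketch of Krippendorff's alpha for nominal codes with no missing values (the function name and example codes are hypothetical) might look like:

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal codes.

    `units` is a list of per-item rating tuples (one code per coder,
    no missing values), e.g. [("yes", "yes"), ("yes", "no"), ...].
    Returns 1.0 for perfect agreement; values near 0 indicate
    chance-level agreement.
    """
    coincidence = Counter()  # ordered pairs of codes within each unit
    for ratings in units:
        m = len(ratings)
        for i, j in permutations(range(m), 2):
            coincidence[(ratings[i], ratings[j])] += 1 / (m - 1)
    totals = Counter()       # n_c: marginal frequency of each code
    for (c, _k), count in coincidence.items():
        totals[c] += count
    n = sum(totals.values())
    # Observed disagreement: mass off the diagonal of the coincidence matrix
    observed = sum(count for (c, k), count in coincidence.items() if c != k)
    # Expected disagreement under chance pairing of the marginals
    expected = sum(totals[c] * totals[k] / (n - 1)
                   for c in totals for k in totals if c != k)
    return 1.0 if expected == 0 else 1.0 - observed / expected
```

For example, two coders rating "accessible data present?" across four reports as `[("y", "y"), ("y", "y"), ("n", "n"), ("y", "n")]` agree on three of four items, but alpha discounts that agreement by what chance alone would produce given the skewed marginals.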
Citations
Proceedings Article
HBCP Corpus: A New Resource for the Analysis of Behavioural Change Intervention Reports.
Francesca Bonin, Martin Gleize, Ailbhe Finnerty, Candice Moore, Charles Jochim, Emma Norris, Yufang Hou, Alison J. Wright, Debasis Ganguly, Emily Hayes, Silje Zink, Alessandra Pascale, Pol Mac Aonghusa, Susan Michie +13 more
TL;DR: Describes the construction of a corpus of published behaviour change intervention evaluation reports aimed at smoking cessation, and the annotation of 57 entities that can be used as an off-the-shelf data resource for tasks such as entity recognition.
Journal ArticleDOI
Assessing Open Science practices in physical activity behaviour change intervention evaluations
TL;DR: The extent to which physical activity interventions embed Open Science practices is currently unknown; the authors identified 100 reports of recent physical activity randomised controlled trial behaviour change interventions to estimate the prevalence of Open Science practices.
Journal ArticleDOI
Open Science Practices in Gambling Research Publications (2016–2019): A Scoping Review
TL;DR: In this article, the authors conducted a scoping review of 500 recent studies focused on gambling and problem gambling to examine the use of open science and transparent research practices, including pre-registration, open data, open access, and avoiding methods that can lead to publication bias and low replication rates.
Journal ArticleDOI
Open Science in Health Psychology and Behavioral Medicine: A Statement From the Behavioral Medicine Research Council
Suzanne C. Segerstrom, Michael A. Diefenbach, Kyra Hamilton, Daryl B. O'Connor, A. Janet Tomiyama +4 more
TL;DR: Open Science practices include some combination of registering and publishing study protocols (including hypotheses, primary and secondary outcome variables, and analysis plans) and making available preprints of manuscripts, study materials, de-identified data sets, and analytic code.
Journal ArticleDOI
Establishing open science research priorities in health psychology: a research prioritisation Delphi exercise.
Emma Norris, Amy Prescott, Chris Noone, James A. Green, James Reynolds, S. P. Grant, Elaine Toomey +6 more
TL;DR: A meta-research study that aimed to identify research question priorities and obtain consensus on the Top 5 prioritised research questions for Open Science in Health Psychology, an area where research on open science practices has been lacking.
References
Journal ArticleDOI
The behaviour change wheel: a new method for characterising and designing behaviour change interventions.
TL;DR: Interventions and policies to change behaviour can be usefully characterised by means of a BCW comprising a 'behaviour system' at the hub, encircled by intervention functions and then by policy categories; this new framework aims to overcome the limitations of existing frameworks.
Journal ArticleDOI
Estimating the reproducibility of psychological science
Open Science Collaboration (Alexander A. Aarts, Joanna E. Anderson, Christopher J. Anderson, Brian A. Nosek and colleagues) +290 more
TL;DR: A large-scale assessment suggests that experimental reproducibility in psychology leaves a lot to be desired, and correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.
Why Most Published Research Findings Are False
TL;DR: In this paper, the authors discuss the implications of these problems for the conduct and interpretation of research and suggest that claimed research findings may often be simply accurate measures of the prevailing bias.
Journal ArticleDOI
The Behavior Change Technique Taxonomy (v1) of 93 Hierarchically Clustered Techniques: Building an International Consensus for the Reporting of Behavior Change Interventions
Susan Michie, Michelle Richardson, Marie Johnston, Charles Abraham, Jill J. Francis, Wendy Hardeman, Martin P. Eccles, James E. Cane, Caroline E. Wood +9 more
TL;DR: “BCT taxonomy v1,” an extensive taxonomy of 93 consensually agreed, distinct BCTs, offers a step change as a method for specifying interventions, but the authors anticipate further development and evaluation based on international, interdisciplinary consensus.
Journal ArticleDOI
Answering the Call for a Standard Reliability Measure for Coding Data
TL;DR: This work proposes Krippendorff's alpha as the standard reliability measure; it is general in that it can be used regardless of the number of observers, levels of measurement, sample sizes, and presence or absence of missing data.