Open Access Journal Article (DOI)

Assessing Open Science practices in physical activity behaviour change intervention evaluations

Evi Sugiatni
Vol. 8, Iss. 2, e001282
TLDR
The extent to which physical activity interventions embed Open Science practices is currently unknown; the authors randomly sampled 100 reports of recent physical activity randomised controlled trial behaviour change interventions to estimate the prevalence of Open Science practices.
Abstract
Concerns about the lack of reproducibility and transparency in science have led to a range of research practice reforms, broadly referred to as 'Open Science'. The extent to which physical activity interventions embed Open Science practices is currently unknown. In this study, we randomly sampled 100 reports of recent physical activity randomised controlled trial behaviour change interventions to estimate the prevalence of Open Science practices.

One hundred reports of randomised controlled trial physical activity behaviour change interventions published between 2018 and 2021 were identified, as used within the Human Behaviour-Change Project. Open Science practices were coded in the identified reports, including: study pre-registration, protocol sharing, data, materials and analysis script sharing, replication of a previous study, open access publication, funding sources and conflict of interest statements. Coding was performed by two independent researchers, with inter-rater reliability calculated using Krippendorff's alpha.

78 of the 100 reports provided details of study pre-registration and 41% provided evidence of a published protocol. 4% provided accessible open data, 8% provided open materials and 1% provided open analysis scripts. 73% of reports were published as open access and no studies were described as replication attempts. 93% of reports declared their sources of funding and 88% provided conflict of interest statements. A Krippendorff's alpha of 0.73 was obtained across all coding.

Open data, materials, analysis scripts and replication attempts are currently rare in physical activity behaviour change intervention reports, whereas funding source and conflict of interest declarations are common. Future physical activity research should increase the reproducibility of its methods and results by incorporating more Open Science practices.
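The inter-rater reliability statistic used in the study can be illustrated with a short sketch. Below is a minimal Python implementation of Krippendorff's alpha for the simplest case matching the study's design — two independent coders, nominal categories, no missing values. The function name and the example coding decisions are hypothetical, not data from the study.

```python
from collections import Counter, defaultdict

def krippendorff_alpha_nominal(coder_a, coder_b):
    """Krippendorff's alpha for two coders, nominal data, no missing values.

    A minimal sketch: builds the coincidence matrix and applies
    alpha = 1 - D_o / D_e (observed vs expected disagreement).
    """
    assert len(coder_a) == len(coder_b)
    # Coincidence matrix: each unit has m = 2 raters, so each value pair
    # contributes 1 / (m - 1) = 1 in both orders.
    o = defaultdict(float)
    for a, b in zip(coder_a, coder_b):
        o[(a, b)] += 1.0
        o[(b, a)] += 1.0
    # Marginal totals per category (row sums of the coincidence matrix).
    n_c = Counter()
    for (c, _), v in o.items():
        n_c[c] += v
    n = sum(n_c.values())
    # Observed disagreement: off-diagonal mass of the coincidence matrix.
    d_o = sum(v for (c, k), v in o.items() if c != k) / n
    # Expected disagreement under chance, nominal metric.
    d_e = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n * (n - 1))
    return 1.0 - d_o / d_e

# Hypothetical coding of 10 report features by two independent coders
# (1 = practice present, 0 = absent); the coders disagree on two units.
coder1 = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
coder2 = [1, 1, 1, 1, 0, 0, 0, 0, 0, 1]
print(round(krippendorff_alpha_nominal(coder1, coder2), 2))  # → 0.62
```

Note that alpha penalises the two disagreements more than raw percentage agreement (80%) would suggest, because it corrects for the agreement expected by chance.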



Citations
Posted Content DOI

Rates and predictors of data and code sharing in the medical and health sciences: A systematic review with meta-analysis of individual participant data.

TL;DR: In this article, a systematic review with meta-analysis of individual participant data (IPD) from meta-research studies was conducted to provide an accurate picture of how common data and code sharing is, how this frequency has changed over time, and what factors are associated with sharing.
Journal Article DOI

Measuring the growing impact of BOSEM: halfway there or living on a prayer?

TL;DR: The BOSEM Editorial Board explains why, and measures the growing impact of BOSEM via the authors' own 'unofficial' JIF calculation, reflecting on the use of metrics to compare the quality and impact of scholarly journals.
Journal Article DOI

Rethinking how and when to report descriptions of behavior change content within interventions: a case study of an ongoing physical activity trial (Ready Steady 3.0)

TL;DR: In this paper, the authors defined behavior change (BC) types as behavior change techniques (BCTs) and behavioral prescriptions, and analyzed the types and dosages of the smallest potentially active BC ingredients and associated behavioral prescriptions intended to be delivered in an ongoing physical activity optimization trial for older adults.
Journal Article DOI

Open Science Standards at Journals that Inform Evidence-Based Policy.

TL;DR: In this paper, the authors evaluate the impact of open science standards on evidence-based policy making and programmatic decisions in 339 peer-reviewed journals, and find that each of the ten open science guidelines in TOP (Transparency and Openness Promotion) was not implemented in most journals' policies (instructions to authors), procedures (manuscript submission systems), or practices (published articles).
References
Journal Article DOI

Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation.

TL;DR: The PRISMA-P checklist provides 17 items considered to be essential and minimum components of a systematic review or meta-analysis protocol, along with a model example from an existing published protocol.
Journal Article DOI

The behaviour change wheel: a new method for characterising and designing behaviour change interventions.

TL;DR: Interventions and policies to change behaviour can be usefully characterised by means of a Behaviour Change Wheel (BCW) comprising a 'behaviour system' at the hub, encircled by intervention functions and then by policy categories; this new framework aims to overcome the limitations of existing frameworks.
Journal Article DOI

False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant

TL;DR: It is shown that despite empirical psychologists' nominal endorsement of a low rate of false-positive findings, flexibility in data collection, analysis, and reporting dramatically increases actual false-positive rates, and a simple, low-cost, and straightforwardly effective disclosure-based solution is suggested.
Journal Article DOI

Answering the Call for a Standard Reliability Measure for Coding Data

TL;DR: This work proposes Krippendorff's alpha as the standard reliability measure, general in that it can be used regardless of the number of observers, levels of measurement, sample sizes, and presence or absence of missing data.
Journal Article DOI

A manifesto for reproducible science

TL;DR: This work argues for the adoption of measures to optimize key elements of the scientific process: methods, reporting and dissemination, reproducibility, evaluation and incentives, in the hope that this will facilitate action toward improving the transparency, reproducibility and efficiency of scientific research.