Journal ArticleDOI

Knowledge and Attitudes Among Life Scientists Toward Reproducibility Within Journal Articles: A Research Survey.

01 Jan 2021-Frontiers in Research Metrics and Analytics (Frontiers Media SA)-Vol. 6, pp 678554-678554
TL;DR: In this paper, the authors survey how authors and scientists view the issues around reproducibility, focusing on interactive elements such as interactive figures embedded within online publications as a solution for enabling the reproducibility of experiments.
Abstract: We constructed a survey to understand how authors and scientists view the issues around reproducibility, focusing on interactive elements such as interactive figures embedded within online publications, as a solution for enabling the reproducibility of experiments. We report the views of 251 researchers, comprising authors who have published in eLife, and those who work at the Norwich Biosciences Institutes (NBI). The survey also outlines to what extent researchers are occupied with reproducing experiments themselves. Currently, there is an increasing range of tools that attempt to address the production of reproducible research by making code, data, and analyses available to the community for reuse. We wanted to collect information about attitudes around the consumer end of the spectrum, where life scientists interact with research outputs to interpret scientific results. Static plots and figures within articles are a central part of this interpretation, and therefore we asked respondents to consider various features for an interactive figure within a research article that would allow them to better understand and reproduce a published analysis. The majority (91%) of respondents reported that when authors describe their research methodology (methods and analyses) in detail, published research can become more reproducible. The respondents believe that having interactive figures in published papers is beneficial to themselves, to the papers they read, as well as to their own readers. Whilst interactive figures are one potential solution for consuming the results of research more effectively to enable reproducibility, we also review the equally pressing technical and cultural demands on researchers that need to be addressed to achieve greater success in reproducibility in the life sciences.
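As a rough illustration of what such an interactive figure could look like in practice, the sketch below exports a figure as a self-contained HTML file that readers can zoom, pan and hover over. It is not taken from the paper; the use of Plotly, the synthetic data and the file name are all assumptions made for the example.

```python
# Minimal sketch of an "interactive figure" as a standalone HTML artefact.
# The data, labels and output file name are hypothetical placeholders.
import numpy as np
import plotly.express as px

rng = np.random.default_rng(seed=42)           # fixed seed keeps the figure reproducible
x = rng.normal(loc=0.0, scale=1.0, size=200)   # hypothetical measurement
y = 2.0 * x + rng.normal(scale=0.5, size=200)  # hypothetical response

fig = px.scatter(
    x=x,
    y=y,
    labels={"x": "measurement", "y": "response"},
    title="Hypothetical interactive scatter plot",
)
# A standalone HTML export embeds the plotted data, so readers can hover,
# zoom and pan without access to the original analysis environment.
fig.write_html("interactive_figure.html", include_plotlyjs="cdn")
```

The point of writing the figure to standalone HTML is that the underlying data points travel with the figure, so a reader can inspect them without rebuilding the authors' analysis setup.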


Citations
Journal ArticleDOI
22 Aug 2022-PeerJ
TL;DR: The average researcher, according to the results, is unwilling to incur the additional costs that are currently needed to use code sharing tools alongside a publication; the authors infer that this means different models are needed for funding and producing interactive or executable research outputs if they are to reach a large number of researchers.
Abstract: This research aimed to understand the needs and habits of researchers in relation to code sharing and reuse; gather feedback on prototype code notebooks created by NeuroLibre; and help determine strategies that publishers could use to increase code sharing. We surveyed 188 researchers in computational biology. Respondents were asked about how often and why they look at code, which methods of accessing code they find useful and why, what aspects of code sharing are important to them, and how satisfied they are with their ability to complete these tasks. Respondents were asked to look at a prototype code notebook and give feedback on its features. Respondents were also asked how much time they spent preparing code and if they would be willing to increase this to use a code sharing tool, such as a notebook. As readers of research articles, respondents' most common reason (70%) for looking at code was to gain a better understanding of the article. The most commonly encountered method for code sharing, linking articles to a code repository, was also the most useful method of accessing code from the reader's perspective. As authors, the respondents were largely satisfied with their ability to carry out tasks related to code sharing. The most important of these tasks were ensuring that the code was running in the correct environment, and sharing code with good documentation. The average researcher, according to our results, is unwilling to incur additional costs (in time, effort or expenditure) that are currently needed to use code sharing tools alongside a publication. We infer this means we need different models for funding and producing interactive or executable research outputs if they are to reach a large number of researchers. For the purpose of increasing the amount of code shared by authors, PLOS Computational Biology is, as a result, focusing on policy rather than tools.
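Since respondents rated "ensuring that the code was running in the correct environment" as one of the most important code sharing tasks, the following sketch shows one lightweight way an author might record that environment next to shared analysis code. It is not based on the NeuroLibre prototypes; the function name and output file are hypothetical.

```python
# A minimal sketch: write the interpreter version, platform and every installed
# package version to a plain-text report that can be shared with the code.
import platform
import sys
from importlib import metadata

def write_environment_report(path="environment_report.txt"):
    """Write the Python version, platform and installed package versions to a file."""
    lines = [
        f"python {platform.python_version()} on {platform.platform()}",
        f"interpreter: {sys.executable}",
        "",
        "installed packages:",
    ]
    # importlib.metadata lists every distribution visible to this interpreter.
    packages = sorted(
        (dist.metadata["Name"], dist.version)
        for dist in metadata.distributions()
        if dist.metadata["Name"]  # skip distributions with broken metadata
    )
    lines += [f"  {name}=={version}" for name, version in packages]
    with open(path, "w", encoding="utf-8") as handle:
        handle.write("\n".join(lines) + "\n")

if __name__ == "__main__":
    write_environment_report()
```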

4 citations

Book ChapterDOI
07 Jul 2022
TL;DR: It has been discovered that some results published in studies may not be correct because different researchers using the same dataset and analytical methods were unable to reproduce them; this chapter outlines how this reproducibility crisis relates to the field of autism spectrum research.
Abstract: It has been discovered that some results published in studies may not be correct because different researchers using the same dataset and analytical methods were unable to create the same results. This dilemma is called the reproducibility crisis. Currently, there has not been a comprehensive examination of the possible existence of this crisis in the field of autism spectrum research. This chapter does not answer the question, 'Is there a reproducibility crisis occurring in the field of autism spectrum research?' Rather, it contains an outline of this crisis, explains some of the most influential factors that have contributed to its development, and describes how scholars who study the autism spectrum can change their research practices so that this crisis does not develop. The original contribution that this chapter makes to autism spectrum research is to explain how some solutions to the reproducibility crisis can be implemented in the field of autism spectrum research.
Journal ArticleDOI
TL;DR: The TIER2 project as mentioned in this paper is a new international project funded by the European Commission under its Horizon Europe programme, covering three broad research areas (social, life and computer sciences) and two stakeholder groups (research publishers and funders) to systematically investigate reproducibility across contexts.
Abstract: Lack of reproducibility of research results has become a major theme in recent years. As we emerge from the COVID-19 pandemic, economic pressures and exposed consequences of lack of societal trust in science make addressing reproducibility of urgent importance. TIER2 is a new international project funded by the European Commission under their Horizon Europe programme. Covering three broad research areas (social, life and computer sciences) and two cross-disciplinary stakeholder groups (research publishers and funders) to systematically investigate reproducibility across contexts, TIER2 will significantly boost knowledge on reproducibility, create tools, engage communities, implement interventions and policy across different contexts to increase re-use and overall quality of research results in the European Research Area and global R&I, and consequently increase trust, integrity and efficiency in research.
References
Journal ArticleDOI
TL;DR: The FAIR Data Principles as mentioned in this paper are a set of data reuse principles that focus on enhancing the ability of machines to automatically find and use the data, in addition to supporting its reuse by individuals.
Abstract: There is an urgent need to improve the infrastructure supporting the reuse of scholarly data. A diverse set of stakeholders—representing academia, industry, funding agencies, and scholarly publishers—have come together to design and jointly endorse a concise and measurable set of principles that we refer to as the FAIR Data Principles. The intent is that these may act as a guideline for those wishing to enhance the reusability of their data holdings. Distinct from peer initiatives that focus on the human scholar, the FAIR Principles put specific emphasis on enhancing the ability of machines to automatically find and use the data, in addition to supporting its reuse by individuals. This Comment is the first formal publication of the FAIR Principles, and includes the rationale behind them, and some exemplar implementations in the community.
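The emphasis on machine-actionability can be illustrated with a small, hedged example: resolving a DOI through content negotiation returns structured metadata that software can parse, rather than a human-oriented landing page. The snippet is not part of the FAIR paper itself and assumes the `requests` library is installed; the DOI used is that of the FAIR Principles article.

```python
# Sketch of machine-actionable metadata retrieval via DOI content negotiation.
import requests

def fetch_doi_metadata(doi: str) -> dict:
    """Return CSL JSON metadata for a DOI via content negotiation on doi.org."""
    response = requests.get(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/vnd.citationstyles.csl+json"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # DOI of the FAIR Data Principles article itself.
    record = fetch_doi_metadata("10.1038/sdata.2016.18")
    print(record.get("title"), "-", record.get("container-title"))
```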

7,602 citations

Journal ArticleDOI
28 Aug 2015-Science
TL;DR: A large-scale assessment suggests that experimental reproducibility in psychology leaves a lot to be desired, and correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.
Abstract: Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.
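One of the reported criteria, whether an original effect size falls inside the 95% confidence interval of the replication effect size, can be made concrete with a short sketch. The numbers below are hypothetical and not taken from the project; the calculation is shown for correlation coefficients using the Fisher z transformation.

```python
# Illustrative coverage check for correlation effect sizes (hypothetical values).
import math

def replication_ci(r_replication: float, n_replication: int, z_crit: float = 1.96):
    """95% confidence interval for a correlation, via the Fisher z transformation."""
    z = math.atanh(r_replication)
    se = 1.0 / math.sqrt(n_replication - 3)
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

def original_within_replication_ci(r_original, r_replication, n_replication):
    """Coverage criterion: does the original effect lie inside the replication's CI?"""
    lower, upper = replication_ci(r_replication, n_replication)
    return lower <= r_original <= upper

# Hypothetical example: original r = 0.45, replication r = 0.21 with n = 120.
print(replication_ci(0.21, 120))                        # roughly (0.03, 0.38)
print(original_within_replication_ci(0.45, 0.21, 120))  # False: original falls outside
```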

5,532 citations

Journal ArticleDOI
TL;DR: In this paper, the authors formulate and define standards for reporting qualitative research while preserving the requisite flexibility for the broad spectrum of qualitative research, and present a set of guidelines for reporting such research.
Abstract: Purpose: Standards for reporting exist for many types of quantitative research, but currently none exist for the broad spectrum of qualitative research. The purpose of the present study was to formulate and define standards for reporting qualitative research while preserving the requisite flexibility for the broad spectrum of qualitative research.

4,506 citations

Journal ArticleDOI
01 May 2017
TL;DR: ROZA was developed under the umbrella of LTER-France (Long Term Ecological Research) in order to facilitate the re-use of data and samples, and will favor the use of paleodata by non-paleodata scientists, in particular ecologists.
Abstract: Managing paleoscience data is highly challenging due to the multiplicity of actors in play, types of sampling, analysis, post-analysis treatments, statistics, etc. However, well-structured curation of data would permit innovative developments based on data and/or sample re-use, such as meta-analyses or the development of new proxies on previously studied cores. In this paper, we present two recent initiatives that allowed us to tackle this objective at the French national level: the "National Cyber Core Repository" (NCCR) and the "LTER-France retro-observatory" (ROZA). NCCR was developed under the umbrella of the French National Center for Coring and Drilling (C2FN) thanks to the national excellence equipment project CLIMCOR. It aims at gathering on a single website the locations and metadata of any scientific coring or drilling performed by French teams or using French facilities, whatever the type of archive (lake/marine sediment, ice, etc.). It uses international standards, notably IGSN (for samples), ORCID (for persons) and DOI (for campaigns), and follows the INSPIRE ISO 19115 standard to catalogue the data. For continental sediment, NCCR may be fed directly in the field through a specifically developed mobile application. Based on NCCR, further initiatives may be led. In particular, under the umbrella of LTER-France (Long Term Ecological Research), we developed ROZA in order to facilitate the re-use of data and samples. Here the idea is to capitalise the knowledge on a given lake from which several sediment cores can be taken through time. To that end, we selected at least one lake from each of the 13 areas composing the LTER-France network. To enter the database, a set of mandatory data must be provided in a pre-determined format. The insertion of ROZA within the LTER network will thus favor the use of paleodata by non-paleodata scientists, in particular ecologists.
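To make the cataloguing idea concrete, the sketch below shows a hypothetical structured record combining the identifiers the abstract mentions (IGSN for samples, ORCID for persons, DOI for campaigns). It is not the actual NCCR/ROZA schema; every field name and value is an illustrative placeholder.

```python
# Hypothetical core-sample metadata record; field names are placeholders only.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class CoreSampleRecord:
    igsn: str                # sample identifier (IGSN)
    campaign_doi: str        # campaign identifier (DOI)
    operator_orcid: str      # person identifier (ORCID)
    archive_type: str        # e.g. "lake sediment", "marine sediment", "ice"
    latitude: float
    longitude: float
    keywords: list = field(default_factory=list)

record = CoreSampleRecord(
    igsn="IGSN:XXXXXXXXX",               # placeholder, not a registered identifier
    campaign_doi="10.xxxx/placeholder",  # placeholder DOI
    operator_orcid="0000-0000-0000-0000",
    archive_type="lake sediment",
    latitude=45.0,
    longitude=6.0,
    keywords=["paleoclimate", "LTER-France"],
)
print(json.dumps(asdict(record), indent=2))
```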

3,648 citations

Journal ArticleDOI
26 May 2016-Nature

2,609 citations