Knowledge and Attitudes Among Life Scientists Toward Reproducibility Within Journal Articles: A Research Survey.
TLDR
The authors surveyed how researchers view the issues around reproducibility, focusing on interactive elements such as interactive figures embedded within online publications as a solution for enabling the reproducibility of experiments.

Abstract
We constructed a survey to understand how authors and scientists view the issues around reproducibility, focusing on interactive elements such as interactive figures embedded within online publications, as a solution for enabling the reproducibility of experiments. We report the views of 251 researchers, comprising authors who have published in eLife and those who work at the Norwich Biosciences Institutes (NBI). The survey also outlines to what extent researchers are occupied with reproducing experiments themselves. Currently, there is an increasing range of tools that attempt to address the production of reproducible research by making code, data, and analyses available to the community for reuse. We wanted to collect information about attitudes at the consumer end of the spectrum, where life scientists interact with research outputs to interpret scientific results. Static plots and figures within articles are a central part of this interpretation, and we therefore asked respondents to consider various features for an interactive figure within a research article that would allow them to better understand and reproduce a published analysis. The majority (91%) of respondents reported that published research can become more reproducible when authors describe their research methodology (methods and analyses) in detail. The respondents believe that having interactive figures in published papers is beneficial to themselves as authors, to the papers they read, and to their readers. Whilst interactive figures are one potential solution for consuming the results of research more effectively to enable reproducibility, we also review the equally pressing technical and cultural demands on researchers that need to be addressed to achieve greater success in reproducibility in the life sciences.
Citations
Journal Article
A survey of researchers’ code sharing and code reuse practices, and assessment of interactive notebook prototypes
TL;DR: According to the results, the average researcher is unwilling to incur the additional costs currently needed to use code-sharing tools alongside a publication; the authors infer that different models for funding and producing interactive or executable research outputs are needed if they are to reach a large number of researchers.
Book Chapter
The Reproducibility Crisis and Autism Spectrum Research
TL;DR: In the field of autism spectrum research, it has been discovered that some published results may not be correct, because different researchers using the same dataset and analytical methods were unable to reproduce the same results.
Journal Article
TIER2: enhancing Trust, Integrity and Efficiency in Research through next-level Reproducibility
Tony Ross-Hellauer, Thomas Klebel, Alexandra Bannach-Brown, Serge Horbach, Hajira Jabeen, Natalia Manola, Teodor Metodiev, Harris Papageorgiou, Martin Reck, Susanna-Assunta Sansone, Jesper W. Schneider, Joeri K. Tijdink, Thanasis Vergoulis +12 more
TL;DR: The TIER2 project is a new international project funded by the European Commission under the Horizon Europe programme, covering three broad research areas (social, life, and computer sciences) and two stakeholder groups (research publishers and funders) to systematically investigate reproducibility across contexts.
References
Journal Article
The iPlant Collaborative: Cyberinfrastructure for Plant Biology.
Stephen A. Goff, Matthew W. Vaughn, Sheldon J. McKay, Eric Lyons, Ann E. Stapleton, Damian D. G. Gessler, Naim Matasci, Liya Wang, Matthew R. Hanlon, Andrew Lenards, Andy Muir, Nirav Merchant, Sonya Lowry, Stephen Mock, Matthew Helmke, Adam Kubach, Martha L. Narro, Nicole Hopkins, David A. Micklos, Uwe Hilgert, Michael Gonzales, Chris Jordan, Edwin Skidmore, Rion Dooley, John Cazes, Robert McLay, Zhenyuan Lu, Shiran Pasternak, Lars Koesterke, William H. Piel, Ruth Grene, Christos Noutsos, Karla C Gendler, X. Feng, Chunlao Tang, Monica Lent, Seung Jin Kim, Kristian Kvilekval, B.S. Manjunath, Val Tannen, Alexandros Stamatakis, Michael J. Sanderson, Stephen Welch, Karen Cranston, Pamela S. Soltis, D. E. Soltis, Brian C. O'Meara, Cécile Ané, Thomas P. Brutnell, Daniel J. Kleibenstein, Jeffery W. White, Jim Leebens-Mack, Michael J. Donoghue, Edgar P. Spalding, Christopher R. Myers, David K. Lowenthal, Brian J. Enquist, Brad Boyle, Ali Akoglu, Greg Andrews, Sudha Ram, Doreen Ware, Lincoln Stein, Dan Stanzione +63 more
TL;DR: These workshops teach researchers how to add bioinformatics tools and/or datasets to the iPlant cyberinfrastructure, enabling plant scientists to perform complex analyses on large datasets without needing to master the command line or high-performance computing services.
Journal Article
The Availability of Research Data Declines Rapidly with Article Age
Timothy H. Vines, Arianne Albert, Rose L. Andrew, Florence Débarre, Dan G. Bock, Michelle T. Franklin, Kimberly J. Gilbert, Jean-Sébastien Moore, Sébastien Renaut, Diana J. Rennison +12 more
TL;DR: The results reinforce the notion that, in the long term, research data cannot be reliably preserved by individual researchers, and further demonstrate the urgent need for policies mandating data sharing via public archives.
Journal Article
The Power of Bias in Economics Research
TL;DR: The authors survey 159 empirical economics literatures that draw upon 64,076 estimates of economic parameters reported in more than 6,700 empirical studies to investigate two critical dimensions of the credibility of empirical economics research: statistical power and bias.
Journal Article
Making big data open: Data sharing in neuroimaging
TL;DR: The state of data sharing for task-based functional MRI (fMRI) data is outlined, with a focus on various forms of data and their relative utility for subsequent analyses.
Journal Article
Badges to Acknowledge Open Practices: A Simple, Low-Cost, Effective Method for Increasing Transparency
Mallory C. Kidwell, Ljiljana B. Lazarević, Erica Baranski, Tom E. Hardwicke, Sarah Piechowski, Lina-Sophia Falkenberg, Curtis Kennett, Agnieszka Slowik, Carina Sonnleitner, Chelsey L. Hess-Holden, Timothy M. Errington, Susann Fiedler, Brian A. Nosek +13 more
TL;DR: In the first half of 2015, when Psychological Science gave authors the opportunity to signal open data and materials by qualifying for badges that accompanied published articles, reported open practices increased by more than an order of magnitude from baseline.
Related Papers (5)
Who Wrote the Web? Revisiting Influential Author Identification Research Applicable to Information Retrieval
Martin Potthast, Sarah Braun, Tolga Buz, Fabian Duffhauss, Florian Friedrich, Jörg Marvin Gülzow, Jakob Köhler, Winfried Lötzsch, Fabian Müller, Maike Elisa Müller, Robert Paßmann, Bernhard Reinke, Lucas Rettenmeier, Thomas Rometsch, Timo Sommer, Michael Träger, Sebastian Wilhelm, Benno Stein, Efstathios Stamatatos, Matthias Hagen +19 more