Institution

Center for Open Science

Nonprofit · Charlottesville, Virginia, United States
About: Center for Open Science is a nonprofit organization based in Charlottesville, Virginia, United States. It is known for its research contributions in the topics of Replication (statistics) and Open science. The organization has 68 authors who have published 123 publications receiving 12,899 citations. It is also known as COS and cos.io.


Papers
Journal Article
19 Jan 2017 · eLife
Abstract: The first results from the Reproducibility Project: Cancer Biology suggest that there is scope for improving reproducibility in pre-clinical cancer research.

165 citations

Journal Article
TL;DR: Replication is proposed to be a study for which any outcome would be considered diagnostic evidence about a claim from prior research; this definition reduces the emphasis on a study's operational characteristics and increases the emphasis on the interpretation of possible outcomes.
Abstract: Credibility of scientific claims is established with evidence for their replicability using new data. According to common understanding, replication is repeating a study’s procedure and observing whether the prior finding recurs. This definition is intuitive, easy to apply, and incorrect. We propose that replication is a study for which any outcome would be considered diagnostic evidence about a claim from prior research. This definition reduces emphasis on operational characteristics of the study and increases emphasis on the interpretation of possible outcomes. The purpose of replication is to advance theory by confronting existing understanding with new evidence. Ironically, the value of replication may be strongest when existing understanding is weakest. Successful replication provides evidence of generalizability across the conditions that inevitably differ from the original study; unsuccessful replication indicates that the reliability of the finding may be more constrained than recognized previously. Defining replication as a confrontation of current theoretical expectations clarifies its important, exciting, and generative role in scientific progress.

147 citations

Journal Article
04 Mar 2016 · Science
TL;DR: Gilbert et al.'s conclusion that the Open Science Collaboration’s Reproducibility Project: Psychology indicates high reproducibility is limited by statistical misconceptions; both optimistic and pessimistic conclusions are possible, and neither is yet warranted.
Abstract: Gilbert et al. conclude that evidence from the Open Science Collaboration’s Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences from selectively interpreted, correlational data. Using the Reproducibility Project: Psychology data, both optimistic and pessimistic conclusions about reproducibility are possible, and neither is yet warranted.

129 citations

Journal Article
TL;DR: Effective decision making, as a means towards more resilient and sustainable communities, can be strengthened by leveraging the power of place in citizen science.

127 citations

Journal Article
TL;DR: The goal of this study was to gauge and understand which parts of preregistration templates qualitative researchers would find helpful and informative, while ensuring that the strength of qualitative research, its flexibility to adapt, adjust, and respond, is not lost in preregistration.
Abstract: Preregistrations—records made a priori about study designs and analysis plans and placed in open repositories—are thought to strengthen the credibility and transparency of research. Different authors have put forth arguments in favor of introducing this practice in qualitative research and made suggestions for what to include in a qualitative preregistration form. The goal of this study was to gauge and understand what parts of preregistration templates qualitative researchers would find helpful and informative. We used an online Delphi study design consisting of two rounds with feedback reports in between. In total, 48 researchers participated (response rate: 16%). In round 1, panelists considered 14 proposed items relevant to include in the preregistration form, but two items had relevance scores just below our predefined criterion (68%) with mixed arguments and were put forth again. We combined items where possible, leading to 11 revised items. In round 2, panelists agreed on including the two remaining items. Panelists also converged on suggested terminology and elaborations, except for two terms for which they provided clear arguments. The result is an agreement-based form for the preregistration of qualitative studies that consists of 13 items. The form will be made available as a registration option on Open Science Framework (osf.io). We believe it is important to ensure that the strength of qualitative research, which is its flexibility to adapt, adjust, and respond, is not lost in preregistration. The preregistration should provide a systematic starting point.

125 citations



Network Information
Related Institutions (5)

Institution | Papers | Citations | Relatedness
Broad Institute | 11.6K | 1.5M | 82%
Swiss Institute of Bioinformatics | 4.9K | 407.6K | 82%
Howard Hughes Medical Institute | 34.6K | 5.2M | 81%
European Bioinformatics Institute | 10.5K | 999.6K | 81%
Wellcome Trust Sanger Institute | 9.6K | 1.2M | 80%

Performance
Metrics
No. of papers from the Institution in previous years

Year | Papers
2022 | 2
2021 | 14
2020 | 12
2019 | 14
2018 | 18
2017 | 14