Institution
Center for Open Science
Nonprofit • Charlottesville, Virginia, United States
About: The Center for Open Science is a nonprofit organization based in Charlottesville, Virginia, United States. It is known for research contributions in the topics of Replication (statistics) and Open science. The organization has 68 authors who have published 123 publications receiving 12,899 citations. The organization is also known as COS and cos.io.
Topics: Replication (statistics), Open science, Transparency (behavior), Reproducibility Project, Credibility
Papers
Abstract: The first results from the Reproducibility Project: Cancer Biology suggest that there is scope for improving reproducibility in pre-clinical cancer research.
165 citations
TL;DR: It is proposed that replication is a study for which any outcome would be considered diagnostic evidence about a claim from prior research, which reduces emphasis on operational characteristics of the study and increases emphasis on the interpretation of possible outcomes.
Abstract: Credibility of scientific claims is established with evidence for their replicability using new data. According to common understanding, replication is repeating a study’s procedure and observing whether the prior finding recurs. This definition is intuitive, easy to apply, and incorrect. We propose that replication is a study for which any outcome would be considered diagnostic evidence about a claim from prior research. This definition reduces emphasis on operational characteristics of the study and increases emphasis on the interpretation of possible outcomes. The purpose of replication is to advance theory by confronting existing understanding with new evidence. Ironically, the value of replication may be strongest when existing understanding is weakest. Successful replication provides evidence of generalizability across the conditions that inevitably differ from the original study; unsuccessful replication indicates that the reliability of the finding may be more constrained than recognized previously. Defining replication as a confrontation of current theoretical expectations clarifies its important, exciting, and generative role in scientific progress.
147 citations
Author affiliations: Russell Sage College, University of Würzburg, University of Waterloo, Virginia Commonwealth University, University of Michigan, Mathematica Policy Research, Ashland University, Michigan State University, Southern Oregon University, University of Göttingen, University of Southern California, Australian National University, Braunschweig University of Technology, Queen's University, Stanford University, Keele University, Boston College, Radboud University Nijmegen, University of Koblenz and Landau, Erasmus University Rotterdam, University of Amsterdam, Harvard University, Occidental College, Willamette University, Arcadia University, University of Potsdam, University of Bristol, VU University Amsterdam, Karolinska Institutet, Center for Open Science, University of Virginia, University of Baltimore, University of California, Riverside, Wesleyan University, University of Konstanz, Yale University, Coventry University, Tilburg University, Utrecht University, Katholieke Universiteit Leuven, University of Padua, University of Vienna, Adams State University
TL;DR: Gilbert et al. conclude that evidence from the Open Science Collaboration’s Reproducibility Project: Psychology indicates high reproducibility given the study methodology; using the same data, both optimistic and pessimistic conclusions are possible, and neither is yet warranted.
Abstract: Gilbert et al. conclude that evidence from the Open Science Collaboration’s Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences from selectively interpreted, correlational data. Using the Reproducibility Project: Psychology data, both optimistic and pessimistic conclusions about reproducibility are possible, and neither are yet warranted.
129 citations
TL;DR: The study concludes that leveraging the power of place in citizen science can strengthen effective decision making, supporting more resilient and sustainable communities.
127 citations
TL;DR: The goal of this study was to gauge and understand which parts of preregistration templates qualitative researchers would find helpful and informative, and to ensure that the strength of qualitative research, its flexibility to adapt, adjust, and respond, is not lost in preregistration.
Abstract: Preregistrations—records made a priori about study designs and analysis plans and placed in open repositories—are thought to strengthen the credibility and transparency of research. Different authors have put forth arguments in favor of introducing this practice in qualitative research and made suggestions for what to include in a qualitative preregistration form. The goal of this study was to gauge and understand which parts of preregistration templates qualitative researchers would find helpful and informative. We used an online Delphi study design consisting of two rounds with feedback reports in between. In total, 48 researchers participated (response rate: 16%). In round 1, panelists considered 14 proposed items relevant to include in the preregistration form, but two items had relevance scores just below our predefined criterion (68%) with mixed arguments and were put forth again. We combined items where possible, leading to 11 revised items. In round 2, panelists agreed on including the two remaining items. Panelists also converged on suggested terminology and elaborations, except for two terms for which they provided clear arguments. The result is an agreement-based form for the preregistration of qualitative studies that consists of 13 items. The form will be made available as a registration option on the Open Science Framework (osf.io). We believe it is important to ensure that the strength of qualitative research, its flexibility to adapt, adjust, and respond, is not lost in preregistration. The preregistration should provide a systematic starting point.
125 citations
Authors
| Name | H-index | Papers | Citations |
| --- | --- | --- | --- |
| Brian A. Nosek | 84 | 230 | 54,152 |
| Santo Fortunato | 57 | 178 | 42,502 |
| Evan Mayo-Wilson | 39 | 114 | 8,254 |
| Michael Barnett-Cowan | 21 | 77 | 5,920 |
| Colin Tucker Smith | 18 | 49 | 2,021 |
| Christopher J. Lemons | 17 | 59 | 923 |
| Svetlana Lorenzano | 16 | 48 | 1,297 |
| Jeffrey R. Spies | 16 | 24 | 8,804 |
| Jennifer A. Joy-Gaba | 16 | 31 | 7,285 |
| Jordan Axt | 14 | 48 | 6,725 |
| Amy Koshoffer | 14 | 23 | 750 |
| Samuel Field | 14 | 31 | 1,066 |
| David Thomas Mellor | 14 | 47 | 1,353 |
| Timothy M. Errington | 13 | 34 | 6,079 |
| Nick Buttrick | 10 | 30 | 868 |