Author

Donald P. Green

Other affiliations: Yale University, Stanford University
Bio: Donald P. Green is an academic researcher at Columbia University. His research focuses on voter turnout and voting behavior. He has an h-index of 75 and has co-authored 238 publications receiving 27,344 citations. Previous affiliations of Donald P. Green include Yale University and Stanford University.


Papers
Journal ArticleDOI
Daniel J. Benjamin, James O. Berger, Magnus Johannesson, Brian A. Nosek, Eric-Jan Wagenmakers, Richard A. Berk, Kenneth A. Bollen, Björn Brembs, Lawrence D. Brown, Colin F. Camerer, David Cesarini, Christopher D. Chambers, Merlise A. Clyde, Thomas D. Cook, Paul De Boeck, Zoltan Dienes, Anna Dreber, Kenny Easwaran, Charles Efferson, Ernst Fehr, Fiona Fidler, Andy P. Field, Malcolm R. Forster, Edward I. George, Richard Gonzalez, Steven N. Goodman, Edwin J. Green, Donald P. Green, Anthony G. Greenwald, Jarrod D. Hadfield, Larry V. Hedges, Leonhard Held, Teck-Hua Ho, Herbert Hoijtink, Daniel J. Hruschka, Kosuke Imai, Guido W. Imbens, John P. A. Ioannidis, Minjeong Jeon, James Holland Jones, Michael Kirchler, David Laibson, John A. List, Roderick J. A. Little, Arthur Lupia, Edouard Machery, Scott E. Maxwell, Michael A. McCarthy, Don A. Moore, Stephen L. Morgan, Marcus R. Munafò, Shinichi Nakagawa, Brendan Nyhan, Timothy H. Parker, Luis R. Pericchi, Marco Perugini, Jeffrey N. Rouder, Judith Rousseau, Victoria Savalei, Felix D. Schönbrodt, Thomas Sellke, Betsy Sinclair, Dustin Tingley, Trisha Van Zandt, Simine Vazire, Duncan J. Watts, Christopher Winship, Robert L. Wolpert, Yu Xie, Cristobal Young, Jonathan Zinman, Valen E. Johnson
Affiliations: University of Southern California, Duke University, Stockholm School of Economics, University of Virginia, Center for Open Science, University of Amsterdam, University of Pennsylvania, University of North Carolina at Chapel Hill, University of Regensburg, California Institute of Technology, Research Institute of Industrial Economics, New York University, Cardiff University, Northwestern University, Mathematica Policy Research, Ohio State University, University of Sussex, Texas A&M University, Royal Holloway, University of London, University of Zurich, University of Melbourne, University of Wisconsin-Madison, University of Michigan, Stanford University, Rutgers University, Columbia University, University of Washington, University of Edinburgh, National University of Singapore, Utrecht University, Arizona State University, Princeton University, University of California, Los Angeles, Imperial College London, University of Innsbruck, Harvard University, University of Chicago, University of Pittsburgh, University of Notre Dame, University of California, Berkeley, Johns Hopkins University, University of Bristol, University of New South Wales, Dartmouth College, Whitman College, University of Puerto Rico, University of Milan, University of California, Irvine, Paris Dauphine University, University of British Columbia, Ludwig Maximilian University of Munich, Purdue University, Washington University in St. Louis, University of California, Davis, Microsoft
TL;DR: The authors propose changing the default P-value threshold for statistical significance from 0.05 to 0.005 for claims of new discoveries, in order to reduce the rate of false-positive findings.
Abstract: We propose to change the default P-value threshold for statistical significance from 0.05 to 0.005 for claims of new discoveries.
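The proposed relabeling is easy to express in code. A minimal sketch, assuming the paper's two-tier scheme (p < 0.005 marks a "significant" new discovery, while 0.005 ≤ p < 0.05 is merely "suggestive"); the `classify` helper and the p-values are our own illustration, not from the paper:

```python
def classify(p, alpha_new=0.005, alpha_old=0.05):
    """Label a result under both the conventional and proposed thresholds."""
    if p < alpha_new:
        return "significant"      # clears the stricter 0.005 bar
    if p < alpha_old:
        return "suggestive"       # significant at 0.05 but not at 0.005
    return "not significant"

for p in [0.001, 0.02, 0.2]:
    print(p, classify(p))
# 0.001 significant
# 0.02 suggestive
# 0.2 not significant
```

Note that many findings currently reported as significant would fall into the intermediate "suggestive" band under the proposal.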

1,586 citations

Journal ArticleDOI
26 Jun 2015-Science
TL;DR: Transparency, openness, and reproducibility are widely recognized as vital features of science, and most scientists embrace them as disciplinary norms and values when asked; yet a growing body of evidence suggests these practices are not routine in daily research.
Abstract: Transparency, openness, and reproducibility are readily recognized as vital features of science (1, 2). When asked, most scientists embrace these features as disciplinary norms and values (3). Therefore, one might expect that these valued features would be routine in daily practice. Yet, a growing body of evidence suggests that this is not the case (4–6).

1,576 citations

Posted Content
TL;DR: The authors propose changing the default P-value threshold for statistical significance for claims of new discoveries from 0.05 to 0.005.
Abstract: We propose to change the default P-value threshold for statistical significance for claims of new discoveries from 0.05 to 0.005.

1,415 citations

Book
01 Sep 2002
TL;DR: Partisan Hearts and Minds is an authoritative study demonstrating that identification with political parties powerfully determines how citizens look at politics and cast their ballots; its grounding of partisanship in social identities has been called the most important theoretical contribution to the study of partisanship in two decades.
Abstract: In this authoritative study, three political scientists demonstrate that identification with political parties powerfully determines how citizens look at politics and cast their ballots. "Partisan Hearts and Minds is a profound breakthrough in our understanding of partisan loyalties and makes a major contribution to the study of political attitudes and voting behavior."-Paul Abramson, Michigan State University "This book will be influential the moment it appears. It will be the starting point for all further treatments of the topic."-Richard Johnston, University of British Columbia "The grounding of partisanship in social identities is the most important theoretical contribution to the study of partisanship in the last two decades."-Morris P. Fiorina, Stanford University

1,155 citations

Journal ArticleDOI
TL;DR: A randomized field experiment involving approximately 30,000 registered voters in New Haven, Connecticut found that voter turnout was increased substantially by personal canvassing, slightly by direct mail, and not at all by telephone calls.
Abstract: We report the results of a randomized field experiment involving approximately 30,000 registered voters in New Haven, Connecticut. Nonpartisan get-out-the-vote messages were conveyed through personal canvassing, direct mail, and telephone calls shortly before the November 1998 election. A variety of substantive messages were used. Voter turnout was increased substantially by personal canvassing, slightly by direct mail, and not at all by telephone calls. These findings support our hypothesis that the long-term retrenchment in voter turnout is partly attributable to the decline in face-to-face political mobilization.

1,097 citations


Cited by
Journal ArticleDOI
29 Mar 2021-BMJ
TL;DR: The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement was designed to help systematic reviewers transparently report why the review was done, what the authors did, and what they found; PRISMA 2020 updates the 2009 guideline to reflect advances in systematic review methodology.
Abstract: The Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement, published in 2009, was designed to help systematic reviewers transparently report why the review was done, what the authors did, and what they found. Over the past decade, advances in systematic review methodology and terminology have necessitated an update to the guideline. The PRISMA 2020 statement replaces the 2009 statement and includes new reporting guidance that reflects advances in methods to identify, select, appraise, and synthesise studies. The structure and presentation of the items have been modified to facilitate implementation. In this article, we present the PRISMA 2020 27-item checklist, an expanded checklist that details reporting recommendations for each item, the PRISMA 2020 abstract checklist, and the revised flow diagrams for original and updated reviews.

16,613 citations

Book
01 Jan 2009

8,216 citations

Journal ArticleDOI
TL;DR: The authors discuss theoretical principles, practical issues, and pragmatic decisions to help developers maximize the construct validity of scales and subscales, arguing that factor analysis plays a crucial role in ensuring unidimensionality and discriminant validity.
Abstract: A primary goal of scale development is to create a valid measure of an underlying construct. We discuss theoretical principles, practical issues, and pragmatic decisions to help developers maximize the construct validity of scales and subscales. First, it is essential to begin with a clear conceptualization of the target construct. Moreover, the content of the initial item pool should be overinclusive and item wording needs careful attention. Next, the item pool should be tested, along with variables that assess closely related constructs, on a heterogeneous sample representing the entire range of the target population. Finally, in selecting scale items, the goal is unidimensionality rather than internal consistency; this means that virtually all interitem correlations should be moderate in magnitude. Factor analysis can play a crucial role in ensuring the unidimensionality and discriminant validity of scales.
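The interitem-correlation guidance above can be turned into a crude screening step. A minimal sketch with made-up questionnaire data; the "moderate" band of 0.15–0.50 and the `screen_items` helper are our own illustration, not the authors' procedure:

```python
def pearson(x, y):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def screen_items(items, lo=0.15, hi=0.50):
    """Flag item pairs whose correlation falls outside the 'moderate' band."""
    flagged = {}
    names = list(items)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = pearson(items[a], items[b])
            if not (lo <= r <= hi):
                flagged[(a, b)] = round(r, 2)
    return flagged

# Hypothetical responses to three questionnaire items:
responses = {
    "q1": [1, 2, 3, 4, 5],
    "q2": [2, 4, 6, 8, 10],   # perfectly redundant with q1 -> gets flagged
    "q3": [3, 1, 4, 2, 5],
}
print(screen_items(responses))  # {('q1', 'q2'): 1.0}
```

A pair correlating near 1.0 suggests redundant items, while a near-zero pair suggests an item measuring something other than the target construct.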

5,867 citations

Journal ArticleDOI
TL;DR: A theoretical model of psychological well-being that encompasses 6 distinct dimensions of wellness (Autonomy, Environmental Mastery, Personal Growth, Positive Relations with Others, Purpose in Life, Self-Acceptance) was tested with data from a nationally representative sample of adults (N = 1,108), aged 25 and older, who participated in telephone interviews.
Abstract: A theoretical model of psychological well-being that encompasses 6 distinct dimensions of wellness (Autonomy, Environmental Mastery, Personal Growth, Positive Relations with Others, Purpose in Life, Self-Acceptance) was tested with data from a nationally representative sample of adults (N = 1,108), aged 25 and older, who participated in telephone interviews. Confirmatory factor analyses provided support for the proposed 6-factor model, with a single second-order super factor. The model was superior in fit over single-factor and other artifactual models. Age and sex differences on the various well-being dimensions replicated prior findings. Comparisons with other frequently used indicators (positive and negative affect, life satisfaction) demonstrated that the latter neglect key aspects of positive functioning emphasized in theories of health and well-being.

5,610 citations

Journal ArticleDOI
28 Aug 2015-Science
TL;DR: A large-scale assessment of 100 replications suggests that experimental reproducibility in psychology leaves much to be desired; correlational tests suggest that replication success was better predicted by the strength of the original evidence than by characteristics of the original and replication teams.
Abstract: Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.
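One of the replication criteria described above, whether the original effect size lies within the 95% confidence interval of the replication effect size, can be sketched as follows. The function name, the normal-approximation interval, and the numbers are illustrative assumptions, not taken from the study:

```python
def ci_covers_original(orig_effect, rep_effect, rep_se, z=1.96):
    """True if the original effect size lies within the replication's
    approximate 95% confidence interval (normal approximation)."""
    lo = rep_effect - z * rep_se
    hi = rep_effect + z * rep_se
    return lo <= orig_effect <= hi

# Made-up numbers: original effect 0.40, replication effect 0.18 with SE 0.10.
print(ci_covers_original(0.40, 0.18, 0.10))  # False: 0.40 lies above 0.376
```

Under this criterion a replication can "fail" even with an effect in the same direction, simply because the original estimate was much larger, which matches the reported pattern of replication effects being about half the original magnitude.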

5,532 citations