Recursive partitioning for heterogeneous causal effects
Susan Athey, Guido W. Imbens
TL;DR: This paper provides a data-driven approach to partition the data into subpopulations that differ in the magnitude of their treatment effects, and proposes an "honest" approach to estimation, whereby one sample is used to construct the partition and another to estimate treatment effects for each subpopulation.

Abstract:
In this paper we propose methods for estimating heterogeneity in causal effects in experimental and observational studies and for conducting hypothesis tests about the magnitude of differences in treatment effects across subsets of the population. We provide a data-driven approach to partition the data into subpopulations that differ in the magnitude of their treatment effects. The approach enables the construction of valid confidence intervals for treatment effects, even with many covariates relative to the sample size, and without "sparsity" assumptions. We propose an "honest" approach to estimation, whereby one sample is used to construct the partition and another to estimate treatment effects for each subpopulation. Our approach builds on regression tree methods, modified to optimize for goodness of fit in treatment effects and to account for honest estimation. Our model selection criterion anticipates that bias will be eliminated by honest estimation and also accounts for the effect of making additional splits on the variance of treatment effect estimates within each subpopulation. We address the challenge that the "ground truth" for a causal effect is not observed for any individual unit, so that standard approaches to cross-validation must be modified. Through a simulation study, we show that for our preferred method honest estimation results in nominal coverage for 90% confidence intervals, whereas coverage ranges between 74% and 84% for nonhonest approaches. Honest estimation requires estimating the model with a smaller sample size; the cost in terms of mean squared error of treatment effects for our preferred method ranges between 7–22%.
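The sample-splitting idea at the core of honest estimation can be sketched in a few lines: one half of the sample chooses the partition, and the held-out half estimates the treatment effect within each leaf. This is a minimal illustration, not the authors' implementation — it uses a single data-driven split on one covariate rather than a full causal tree, and the simulated design is an assumption made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated randomized experiment: true effect is +2 when x > 0, else 0
n = 4000
x = rng.normal(size=n)
w = rng.integers(0, 2, size=n)       # random treatment assignment
tau = np.where(x > 0, 2.0, 0.0)      # true heterogeneous effect
y = x + w * tau + rng.normal(size=n)

# Honest split: one half builds the partition, the other estimates effects
half = n // 2
tr, est = np.arange(half), np.arange(half, n)

def ate(y_sub, w_sub):
    """Difference-in-means treatment effect estimate."""
    return y_sub[w_sub == 1].mean() - y_sub[w_sub == 0].mean()

# Build a one-split partition on the training half: choose the threshold
# that maximizes the squared difference in estimated effects between leaves
candidates = np.quantile(x[tr], np.linspace(0.1, 0.9, 17))

def split_score(c):
    left, right = tr[x[tr] <= c], tr[x[tr] > c]
    return (ate(y[left], w[left]) - ate(y[right], w[right])) ** 2

best = max(candidates, key=split_score)

# Honest estimation: leaf-level treatment effects come from the held-out half,
# so the adaptive choice of the split does not bias them
left, right = est[x[est] <= best], est[x[est] > best]
tau_left, tau_right = ate(y[left], w[left]), ate(y[right], w[right])
print(f"split at x = {best:.2f}: tau_left = {tau_left:.2f}, tau_right = {tau_right:.2f}")
```

Because the leaf effects are computed on data that played no role in choosing the split, standard difference-in-means confidence intervals within each leaf remain valid, which is the point of the honest approach.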
Citations
Posted Content
Efficient Balanced Treatment Assignments for Experimentation
TL;DR: This work reframes the problem of balanced treatment assignment as optimization of a two-sample test between treatment and control units, and provides an assignment algorithm that is optimal with respect to the minimum spanning tree test of Friedman and Rafsky (1979).
Posted Content
Heterogeneous Treatment Effects in Regression Discontinuity Designs
TL;DR: In this paper, a supervised machine learning algorithm is proposed to uncover treatment effect heterogeneity in classical regression discontinuity (RD) designs, where each leaf of the tree contains the RD estimate of a treatment (assigned by a common cutoff rule) conditional on the values of some pre-treatment covariates.
Journal Article
Ten Rules for Conducting Retrospective Pharmacoepidemiological Analyses: Example COVID-19 Study
Michael Powell, Allison Koenecke, James Brian Byrd, Akihiko Nishimura, Maximilian F. Konig, Ruoxuan Xiong, Sadiqa Mahmood, Vera Mucaj, Chetan Bettegowda, Liam Rose, Suzanne Tamang, Adam Sacarny, Brian Caffo, Susan Athey, Elizabeth A. Stuart, Joshua T. Vogelstein, et al.
TL;DR: In this paper, the authors present 10 rules that serve as an end-to-end introduction to retrospective pharmacoepidemiological analyses of observational health care data using a running example of a hypothetical COVID-19 study.
Grow the pie or have it? Using machine learning for impact heterogeneity in the Ultra-poor Graduation Model
TL;DR: Chowdhury et al. found significant variation in impact on assets: the top quintile of gainers saw an impact of 3.44 on log assets, compared with 1.92 for the bottom quintile.
Journal Article
Augmented direct learning for conditional average treatment effect estimation with double robustness
Haomiao Meng, Xingye Qiao, et al.
TL;DR: The authors propose robust direct learning (RD-Learning) to augment D-learning, yielding doubly robust estimators of the treatment effect; the method applies in both the binary and the multi-arm settings and is general enough to allow different function spaces and to incorporate generic learning algorithms.
References
Journal Article
Random Forests
TL;DR: Internal estimates monitor error, strength, and correlation; these are used to show the response to increasing the number of features used in the forest, and the method is also applicable to regression.
Journal Article
Regression Shrinkage and Selection via the Lasso
TL;DR: A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.
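The constrained problem in the TL;DR above — least squares subject to an L1 bound on the coefficients — is usually solved in its equivalent penalized form. A minimal cyclic coordinate-descent sketch (an illustrative solver, not Tibshirani's original implementation; the simulated design is an assumption) is:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the scalar solution of the lasso subproblem."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent on the penalized objective
    (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    r = y - X @ beta                      # current residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]        # remove coordinate j's contribution
            rho = X[:, j] @ r / n
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * beta[j]        # add updated contribution back
    return beta

# Sparse ground truth: only the first three coefficients are nonzero
rng = np.random.default_rng(2)
n, p = 200, 10
X = rng.normal(size=(n, p))
true = np.zeros(p)
true[:3] = [3.0, -2.0, 1.5]
y = X @ true + 0.1 * rng.normal(size=n)
beta = lasso_cd(X, y, lam=0.1)
```

The soft-thresholding step is what sets small coefficients exactly to zero, giving the variable-selection behavior described in the TL;DR.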
Book
The Nature of Statistical Learning Theory
TL;DR: Topics covered include the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Statistical learning theory
TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and more.
Journal Article
The central role of the propensity score in observational studies for causal effects
TL;DR: The authors discuss the central role of propensity scores and balancing scores in the analysis of observational studies and show that adjustment for the scalar propensity score is sufficient to remove bias due to all observed covariates.
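A minimal illustration of that result: subclassifying on an estimated propensity score removes most of the bias from an observed confounder. The simulated design, the logistic fit, and the five-stratum choice are illustrative assumptions for the sketch, not the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
x = rng.normal(size=n)
p_true = 1 / (1 + np.exp(-1.5 * x))     # true propensity depends on x
w = rng.binomial(1, p_true)             # treatment more likely when x is large
y = x + 1.0 * w + rng.normal(size=n)    # true effect = 1; x confounds

# Naive difference in means is biased upward because x drives both w and y
naive = y[w == 1].mean() - y[w == 0].mean()

# Estimate the propensity score with logistic regression (Newton's method)
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(20):
    mu = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (w - mu)
    hess = (X * (mu * (1 - mu))[:, None]).T @ X
    beta += np.linalg.solve(hess, grad)
e = 1 / (1 + np.exp(-X @ beta))

# Subclassify on quintiles of the estimated score and average within-stratum
# difference-in-means estimates
edges = np.quantile(e, [0.2, 0.4, 0.6, 0.8])
strata = np.digitize(e, edges)
effects = [y[(strata == s) & (w == 1)].mean() - y[(strata == s) & (w == 0)].mean()
           for s in range(5)]
adjusted = float(np.mean(effects))
print(f"naive = {naive:.2f}, propensity-adjusted = {adjusted:.2f}")
```

Within each stratum the treated and control units have similar propensity scores, so comparing them approximates a randomized comparison; averaging across strata recovers an estimate close to the true effect of 1.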
Related Papers (5)
Estimation and Inference of Heterogeneous Treatment Effects using Random Forests
Stefan Wager, Susan Athey