
Showing papers by "Donald B. Rubin published in 1997"


Journal ArticleDOI
TL;DR: Propensity score methods generalize subclassification in the presence of many confounding covariates, such as age, region of the country, and sex, in a study of smoking and mortality.
Abstract: The aim of many analyses of large databases is to draw causal inferences about the effects of actions, treatments, or interventions. Examples include the effects of various options available to a p...
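As an illustrative sketch (not code from the paper), subclassification on an estimated propensity score can be demonstrated on simulated data: fit a logistic regression for treatment given confounders, cut the fitted scores into quintiles, and average the within-stratum treated-minus-control differences. All data and parameter values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Hypothetical confounders standing in for age, region, and sex.
x = rng.normal(size=(n, 3))

# Treatment ("smoking") depends on the confounders.
logit = 0.8 * x[:, 0] - 0.5 * x[:, 1] + 0.3 * x[:, 2]
z = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Outcome with a true treatment effect of 1.0 plus confounding.
y = 1.0 * z + x @ np.array([1.0, -1.0, 0.5]) + rng.normal(size=n)

# Fit the propensity score e(x) = P(z=1 | x) by logistic regression
# (a few Newton-Raphson steps; no external ML library needed).
X = np.column_stack([np.ones(n), x])
beta = np.zeros(X.shape[1])
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (z - p)
    hess = (X * (p * (1 - p))[:, None]).T @ X
    beta += np.linalg.solve(hess, grad)
e_hat = 1 / (1 + np.exp(-X @ beta))

# Subclassify on propensity-score quintiles and average the
# within-stratum treated-minus-control differences.
edges = np.quantile(e_hat, [0.2, 0.4, 0.6, 0.8])
stratum = np.digitize(e_hat, edges)
effects, weights = [], []
for s in range(5):
    m = stratum == s
    if z[m].min() == z[m].max():
        continue  # stratum lacks both treated and controls; skip
    effects.append(y[m & (z == 1)].mean() - y[m & (z == 0)].mean())
    weights.append(m.sum())
ate_hat = np.average(effects, weights=weights)

naive = y[z == 1].mean() - y[z == 0].mean()
print(f"naive difference: {naive:.2f}, subclassified: {ate_hat:.2f}")
```

The naive treated-minus-control difference is badly confounded, while the subclassified estimate lands close to the true effect of 1.0, which is the point of balancing on the score rather than on many covariates separately.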

2,902 citations


Journal ArticleDOI
TL;DR: In this article, Bayesian inferential methods for causal estimands in the presence of noncompliance are presented, where the binary treatment assignment is random and hence ignorable, but the treatment received is not ignorable.
Abstract: For most of this century, randomization has been a cornerstone of scientific experimentation, especially when dealing with humans as experimental units. In practice, however, noncompliance is relatively common with human subjects, complicating traditional theories of inference that require adherence to the random treatment assignment. In this paper we present Bayesian inferential methods for causal estimands in the presence of noncompliance, when the binary treatment assignment is random and hence ignorable, but the binary treatment received is not ignorable. We assume that both the treatment assigned and the treatment received are observed. We describe posterior estimation using EM and data augmentation algorithms. Also, we investigate the role of two assumptions often made in econometric instrumental variables analyses, the exclusion restriction and the monotonicity assumption, without which the likelihood functions generally have substantial regions of maxima. We apply our procedures to real and artificial data, thereby demonstrating the technology and showing that our new methods can yield valid inferences that differ in practically important ways from those based on previous methods for analysis in the presence of noncompliance, including intention-to-treat analyses and analyses based on econometric instrumental variables techniques. Finally, we perform a simulation to investigate the operating characteristics of the competing procedures in a simple setting, which indicates relatively dramatic improvements in frequency operating characteristics attainable using our Bayesian procedures.
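A minimal simulation can make the estimand concrete. This is not the paper's Bayesian machinery (no EM or data augmentation here); it is the simpler method-of-moments analogue, the ratio of intention-to-treat effects, which under the exclusion restriction and monotonicity identifies the complier average causal effect. The compliance-type shares and effect sizes below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Latent compliance types: compliers, never-takers, always-takers;
# monotonicity rules out defiers. Shares are illustrative.
ptype = rng.choice(["c", "n", "a"], size=n, p=[0.6, 0.3, 0.1])
z = rng.binomial(1, 0.5, size=n)              # randomized assignment
d = np.where(ptype == "a", 1,
             np.where(ptype == "n", 0, z))    # treatment received

# Exclusion restriction: assignment affects outcomes only through d.
# True complier average causal effect (CACE) set to 2.0.
effect = np.where(ptype == "c", 2.0, 0.5)
y = rng.normal(size=n) + effect * d

# Ratio of intention-to-treat effects on Y and on D.
itt_y = y[z == 1].mean() - y[z == 0].mean()
itt_d = d[z == 1].mean() - d[z == 0].mean()
cace_hat = itt_y / itt_d
print(f"ITT on Y: {itt_y:.3f}, ITT on D: {itt_d:.3f}, CACE: {cace_hat:.3f}")
```

The ITT effect on Y dilutes the complier effect by the compliance rate (here 0.6), and dividing by the ITT effect on D recovers the CACE of 2.0; the Bayesian procedures in the paper target the same estimand while propagating uncertainty about the latent compliance types.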

542 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show that the standard instrumental variables estimates implicitly estimate the outcome distributions to be negative over a substantial range, and that the estimates of the local average treatment effect change considerably when they impose non-negativity in any of a variety of ways.
Abstract: In Imbens and Angrist (1994), Angrist, Imbens and Rubin (1996) and Imbens and Rubin (1997), assumptions have been outlined under which instrumental variables estimands can be given a causal interpretation as a local average treatment effect without requiring functional form or constant treatment effect assumptions. We extend these results by showing that under these assumptions one can estimate more from the data than the average causal effect for the subpopulation of compliers; one can, in principle, estimate the entire marginal distribution of the outcome under different treatments for this subpopulation. These distributions might be useful for a policy maker who wishes to take into account more than differences in average earnings when contemplating the merits of one job training programme vs. another. We also show that the standard instrumental variables estimator implicitly estimates these underlying outcome distributions without imposing the required nonnegativity on these implicit density estimates, and that imposing nonnegativity can substantially alter the estimates of the local average treatment effect. We illustrate these points by presenting an analysis of the returns to a high school education using quarter of birth as an instrument. We show that the standard instrumental variables estimates implicitly estimate the outcome distributions to be negative over a substantial range, and that the estimates of the local average treatment effect change considerably when we impose nonnegativity in any of a variety of ways.
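A toy version of the key construction can be sketched on simulated data: the IV-implied (sub)density of the treated outcome for compliers is a scaled difference of two estimable subdensities, and nothing forces that difference to be nonnegative bin by bin. The population shares, histogram grid, and outcome model below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5_000

# Compliers, never-takers, and always-takers; monotonicity rules out
# defiers. Shares are illustrative assumptions.
ptype = rng.choice(["c", "n", "a"], size=n, p=[0.5, 0.3, 0.2])
z = rng.binomial(1, 0.5, size=n)
d = ((ptype == "a") | ((ptype == "c") & (z == 1))).astype(int)
# Exclusion restriction: y depends on z only through d.
y = rng.normal(loc=1.0 + 1.0 * d, scale=1.0, size=n)

# IV-implied density of Y(1) for compliers on a histogram grid:
#   g1(y) = [ f(y, d=1 | z=1) - f(y, d=1 | z=0) ] / pi_c
bins = np.linspace(-3, 6, 40)
w = np.diff(bins)[0]
mids = (bins[:-1] + bins[1:]) / 2
pi_c = d[z == 1].mean() - d[z == 0].mean()  # estimated complier share

def subdensity(zval):
    m = z == zval
    counts, _ = np.histogram(y[m & (d == 1)], bins=bins)
    return counts / m.sum() / w

g1 = (subdensity(1) - subdensity(0)) / pi_c

# The raw IV-implied estimate need not be a proper density: sampling
# noise can push individual bins negative. Imposing nonnegativity
# (here: clip at zero, then renormalize) can shift downstream
# quantities such as the estimated complier mean.
g1_pos = np.clip(g1, 0.0, None)
g1_pos = g1_pos / (g1_pos.sum() * w)

mean_raw = (mids * g1 * w).sum()
mean_pos = (mids * g1_pos * w).sum()
print(f"complier Y(1) mean: raw {mean_raw:.2f}, nonneg {mean_pos:.2f}")
```

Clip-and-renormalize is only one of the "variety of ways" of imposing nonnegativity that the abstract mentions; the point of the sketch is that the raw difference-of-subdensities estimator is not constrained to be a density.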

435 citations


Journal ArticleDOI
TL;DR: A novel and general mixture component model, the features of which include a hierarchical structure with random effects, mixture components characterized by ANOVA-like linear regressions, and mixing mechanisms governed by logistic regressions is proposed.
Abstract: This article proposes a novel and general mixture component model, the features of which include a hierarchical structure with random effects, mixture components characterized by ANOVA-like linear regressions, and mixing mechanisms governed by logistic regressions. The model was developed as a consequence of attending to long-standing psychological theory about schizophrenic behavior. Scientifically revealing results are obtained by fitting the model to a data set concerning nonschizophrenic and schizophrenic eye-tracking behavior under different conditions. Included are descriptions of the algorithms for model fitting, specifically the ECM/SECM algorithms for large sample modal inference, and the Gibbs sampler for simulating the posterior distribution. For guidance on model comparison and selection, we use posterior predictive check distributions to obtain posterior predictive p-values for likelihood ratio statistics, which do not have asymptotic χ² reference distributions. These posterior predictive p-values suggest that all the mixture components in our model are necessary. The final model is selected using a combination of scientific parsimony, the posterior predictive p-values, and the posterior distributions of relevant parameters.
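The posterior predictive p-value idea can be sketched in a much simpler setting than the paper's mixture model: draw parameters from the posterior, simulate replicate data sets, and compare a discrepancy statistic on the replicates to its observed value. The normal model, flat prior, and skewness discrepancy below are illustrative stand-ins for the likelihood-ratio discrepancies used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Observed data: generated from the assumed model here, so the check
# should not flag misfit (p-value far from 0 and 1).
n = 200
y_obs = rng.normal(loc=5.0, scale=2.0, size=n)

sigma = 2.0  # treat the variance as known for simplicity
# Flat prior on the mean => posterior mu | y ~ N(ybar, sigma^2 / n).
post_mu = rng.normal(y_obs.mean(), sigma / np.sqrt(n), size=2000)

def discrepancy(y):
    """Sample skewness: a feature a normal model may fail to capture."""
    c = y - y.mean()
    return (c**3).mean() / (c**2).mean() ** 1.5

t_obs = discrepancy(y_obs)
# One replicate data set per posterior draw, scored by the discrepancy.
t_rep = np.array([
    discrepancy(rng.normal(mu, sigma, size=n)) for mu in post_mu
])
ppp = (t_rep >= t_obs).mean()
print(f"posterior predictive p-value: {ppp:.3f}")
```

Unlike a classical p-value, this reference distribution is simulated rather than taken from an asymptotic χ² result, which is exactly why the approach works for the paper's likelihood ratio statistics that lack such a reference distribution.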

31 citations


Proceedings ArticleDOI
12 May 1997
TL;DR: In this article, a very high luminosity electron-positron collider operating in the ϒ(4S) energy range is described; the existing CESR storage ring can remain in service as an accumulator or dedicated synchrotron light source.
Abstract: We describe a very high luminosity electron-positron collider that operates in the ϒ(4S) energy range. Trajectories intersect with a small horizontal crossing angle so that closely spaced bunches collide only at the interaction point. An electrostatic deflector steers the counter-rotating beams into side-by-side vacuum chambers that share a common dipole guide field. The beams also share superconducting RF accelerating cavities. Independent focusing and chromaticity correction is provided by dual-aperture superconducting quadrupoles and sextupoles. The machine is installed above the synchrotron injector in the CESR tunnel. The existing storage ring can remain for service as an accumulator or dedicated synchrotron light source. With 3 A/beam in 180 bunches, βv* = 7 mm, beam-beam tune shift parameter ξ = 0.06 and beam energy of 5.3 GeV, we anticipate luminosity of 3×10³⁴ cm⁻²s⁻¹.
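A back-of-envelope check of the quoted parameters can be done from the abstract's numbers plus one assumption: the CESR circumference (~768 m, not stated in the abstract). The standard Gaussian-beam luminosity formula below ignores the crossing-angle reduction factor, so the result is only an order-of-magnitude consistency check.

```python
import math

# Assumed machine parameter (not from the abstract):
C = 768.0                  # m, approximate CESR circumference
c = 2.998e8                # m/s, speed of light
f_rev = c / C              # revolution frequency, ~0.39 MHz

# Parameters quoted in the abstract:
I = 3.0                    # A per beam
n_b = 180                  # bunches
L_target = 3e34            # cm^-2 s^-1, anticipated luminosity

e = 1.602e-19              # C, elementary charge
N = I / (e * f_rev * n_b)  # particles per bunch
print(f"particles per bunch: {N:.2e}")

# Gaussian-beam luminosity, ignoring the crossing-angle factor:
#   L = n_b * N^2 * f_rev / (4 * pi * sigma_x * sigma_y)
# Solve for the beam-size product needed to reach L_target.
sigma_product_cm2 = n_b * N**2 * f_rev / (4 * math.pi * L_target)
print(f"required sigma_x * sigma_y: {sigma_product_cm2:.2e} cm^2")
```

The implied bunch population is a few times 10¹¹ and the required beam-size product is of order 10⁻⁵ cm², i.e. transverse sizes of roughly hundreds of microns horizontally by microns vertically, which is a plausible regime for a flat-beam collider with βv* = 7 mm.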

10 citations


Proceedings ArticleDOI
12 May 1997
TL;DR: In this article, the continued development of a superconducting RF system for the CESR luminosity upgrade at the Laboratory of Nuclear Studies, Cornell University, is described, along with recent results.
Abstract: After the successful CESR beam test of August 1994 the continued development of a superconducting RF system for the CESR luminosity upgrade is in progress at the Laboratory of Nuclear Studies, Cornell University. The system description as well as recent results are presented.

8 citations