Journal ArticleDOI

On the Large-Sample Minimal Coverage Probability of Confidence Intervals After Model Selection

Paul Kabaila, +1 more
- 01 Jun 2006
- Vol. 101, Iss: 474, pp 619-629
Abstract
We give a large-sample analysis of the minimal coverage probability of the usual confidence intervals for regression parameters when the underlying model is chosen by a “conservative” (or “overconsistent”) model selection procedure. We derive an upper bound for the large-sample limit minimal coverage probability of such intervals that applies to a large class of model selection procedures including the Akaike information criterion as well as various pretesting procedures. This upper bound can be used as a safeguard to identify situations where the actual coverage probability can be far below the nominal level. We illustrate that the (asymptotic) upper bound can be statistically meaningful even in rather small samples.
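The phenomenon described in the abstract can be illustrated with a small Monte Carlo sketch. This is a hypothetical setup, not taken from the paper: a naive 95% interval for beta1 is built from whichever model a pretest on beta2 selects, with the error variance known for simplicity; near "local" alternatives the empirical coverage falls well below the nominal level.

```python
import numpy as np

# Hypothetical simulation (illustrative only): coverage of the "usual"
# 95% CI for beta1 when the model is first chosen by a pretest on beta2.
rng = np.random.default_rng(0)
n, reps = 50, 2000
z95 = 1.959964          # standard normal 0.975 quantile; sigma = 1 known

# Correlated regressors, so dropping x2 biases the estimate of beta1
x1 = rng.standard_normal(n)
x2 = 0.8 * x1 + 0.6 * rng.standard_normal(n)
X = np.column_stack([x1, x2])
XtX_inv = np.linalg.inv(X.T @ X)

beta1 = 1.0
beta2 = 1.5 / np.sqrt(n)  # "local" alternative, where pretests hurt most

cover = 0
for _ in range(reps):
    y = beta1 * x1 + beta2 * x2 + rng.standard_normal(n)
    bhat = XtX_inv @ X.T @ y
    if abs(bhat[1]) > z95 * np.sqrt(XtX_inv[1, 1]):
        # Pretest rejects beta2 = 0: keep the full model
        est, se = bhat[0], np.sqrt(XtX_inv[0, 0])
    else:
        # Pretest accepts: drop x2 and refit by least squares on x1 alone
        est, se = (x1 @ y) / (x1 @ x1), 1.0 / np.sqrt(x1 @ x1)
    cover += abs(est - beta1) <= z95 * se  # naive CI ignores the selection

print(f"empirical coverage: {cover / reps:.3f} (nominal 0.95)")
```

The parameter values (correlation 0.8, beta2 of order 1/sqrt(n)) are chosen to make the effect visible; the paper's upper bound characterizes how bad such coverage can get over the whole parameter space.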


Citations
Journal ArticleDOI

Model selection and inference: facts and fiction

TL;DR: Debunks several myths about model selection, in particular the myth that consistent model selection has no effect on subsequent inference asymptotically, and presents an "impossibility" result regarding the estimation of the finite-sample distribution of post-model-selection estimators.
Journal ArticleDOI

Valid post-selection inference

TL;DR: In this paper, the problem of post-selection inference is reduced to one of simultaneous inference, and the authors propose to use simultaneous inference for all linear functions that arise as coefficient estimates in all submodels.
Book

Formalized data snooping based on generalized error rates

TL;DR: Reviews a number of recent proposals from the statistical literature on generalized error rates and discusses how these procedures apply to the general problem of model selection and to deciding which hypotheses to reject.
Journal ArticleDOI

Model selection and model averaging after multiple imputation

TL;DR: In this article, a framework for model selection and model averaging in the context of missing data is proposed. The focus lies on multiple imputation as a strategy to deal with the missingness; combining it with model averaging aims to incorporate both the uncertainty associated with the model selection process and that associated with the imputation process.
References
Journal ArticleDOI

Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties

TL;DR: In this article, penalized likelihood approaches are proposed to handle variable selection problems, and it is shown that the newly proposed estimators perform as well as the oracle procedure in variable selection; namely, they work as well as if the correct submodel were known.
Book

Theory of point estimation

TL;DR: This book develops the theory of point estimation, covering optimality criteria for estimators such as average-risk optimality, minimaxity, and admissibility.
Book

Regression Analysis by Example

TL;DR: Covers simple and multiple linear regression, regression diagnostics (detection of model violations), qualitative variables as predictors, transformation of variables, weighted least squares, the problem of correlated errors, analysis of collinear data, biased estimation of regression coefficients, variable selection procedures, and logistic regression.
Book

Applied Regression Analysis: A Research Tool

TL;DR: This book introduces simple regression and least squares in matrix notation, discusses problem areas that arise in least squares fitting, and treats models that are nonlinear in the parameters.