Journal ArticleDOI

A solution to the problem of separation in logistic regression

Georg Heinze, Michael Schemper
- 30 Aug 2002 - Statistics in Medicine, Vol. 21, Iss. 16, pp. 2409-2419
TLDR
A procedure by Firth, originally developed to reduce the bias of maximum likelihood estimates, is shown to provide an ideal solution to separation, producing finite parameter estimates by means of penalized maximum likelihood estimation.
Abstract
The phenomenon of separation or monotone likelihood is observed in the fitting process of a logistic model if the likelihood converges while at least one parameter estimate diverges to ±infinity. Separation primarily occurs in small samples with several unbalanced and highly predictive risk factors. A procedure by Firth, originally developed to reduce the bias of maximum likelihood estimates, is shown to provide an ideal solution to separation. It produces finite parameter estimates by means of penalized maximum likelihood estimation. Corresponding Wald tests and confidence intervals are available, but it is shown that penalized likelihood ratio tests and profile penalized likelihood confidence intervals are often preferable. The clear advantage of the procedure over previous options of analysis is impressively demonstrated by the statistical analysis of two cancer studies.
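To make the approach concrete, here is a minimal numpy sketch of a Firth-type penalized logistic fit, using the modified score in which each observation's response is adjusted by its hat-matrix diagonal. It is only an illustration of the idea under stated assumptions, not the authors' implementation; in practice one would use maintained software such as the R package logistf.

```python
import numpy as np

def firth_logistic(X, y, max_iter=100, tol=1e-8):
    """Firth-type penalized logistic regression via Newton steps on the
    modified score U*(b) = X'(y - p + h*(0.5 - p)), where h holds the
    hat-matrix diagonals. A sketch of the idea, not production code."""
    k = X.shape[1]
    beta = np.zeros(k)
    for _ in range(max_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))
        W = p * (1.0 - p)                                  # weight-matrix diagonal
        info = X.T @ (W[:, None] * X)                      # Fisher information X'WX
        info_inv = np.linalg.inv(info)
        h = np.einsum('ij,jk,ik->i', X, info_inv, X) * W   # leverages
        score = X.T @ (y - p + h * (0.5 - p))              # Firth-modified score
        step = info_inv @ score
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# Quasi-separated toy data: ordinary ML pushes the slope towards infinity,
# while the penalized estimate stays finite.
X = np.column_stack([np.ones(8), [0, 0, 0, 0, 1, 1, 1, 1]])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0])
print(firth_logistic(X, y))
```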


Citations
Journal ArticleDOI

Regression Diagnostics: Identifying Influential Data and Sources of Collinearity

TL;DR: This work discusses methods for detecting influential observations and outliers, diagnostics for assessing collinearity, and their applications in medicine and science.
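As an informal illustration of the kinds of diagnostics involved, the following numpy sketch computes leverage values (hat-matrix diagonals) for flagging influential observations and condition indices of the column-scaled design matrix for flagging collinearity; the data are simulated and the functions are generic standard diagnostics, not the book's own procedures.

```python
import numpy as np

def hat_diagonals(X):
    """Leverage values: the diagonal of the hat matrix X (X'X)^-1 X'."""
    Q, _ = np.linalg.qr(X)
    return np.sum(Q**2, axis=1)

def condition_indices(X):
    """Condition indices of the column-equilibrated design matrix
    (each column scaled to unit length): largest singular value / each one."""
    Xs = X / np.linalg.norm(X, axis=0)
    s = np.linalg.svd(Xs, compute_uv=False)
    return s[0] / s

# Simulated design with two nearly collinear columns for illustration.
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = x1 + rng.normal(scale=0.01, size=50)
X = np.column_stack([np.ones(50), x1, x2])
print(hat_diagonals(X).max())     # unusually large values flag high leverage
print(condition_indices(X))       # a very large index signals collinearity
```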
Book

Flexible Imputation of Missing Data

TL;DR: Covers the problem of missing data; the concepts of MCAR, MAR and MNAR; simple solutions that do not (always) work; multiple imputation in a nutshell; and some dangers, some do's and some don'ts.
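A hedged sketch of the multiple-imputation workflow described in the book (impute the data several times, analyse each completed data set, pool the results): it uses scikit-learn's IterativeImputer as a stand-in chained-equations imputer rather than the mice software itself, and the data and the pooled quantity are purely illustrative.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Simulated data with 20% of one column missing.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
X[:, 2] += X[:, 0]                        # make the columns related
X[rng.random(100) < 0.2, 2] = np.nan

m = 5                                      # number of imputations
estimates = []
for i in range(m):
    imp = IterativeImputer(sample_posterior=True, random_state=i)
    Xc = imp.fit_transform(X)
    estimates.append(Xc[:, 2].mean())      # toy "analysis": a column mean

est = np.array(estimates)
print(est.mean(), est.var(ddof=1))         # pooled estimate, between-imputation variance
```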
Journal ArticleDOI

A weakly informative default prior distribution for logistic and other regression models

TL;DR: In this paper, the authors propose a new weakly informative prior distribution for logistic regression models, a Cauchy prior constructed by first scaling all nonbinary variables to have mean 0 and standard deviation 0.5 and then placing independent Student-t prior distributions on the coefficients.
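The sketch below illustrates that recipe under stated assumptions: rescale non-binary predictors to standard deviation 0.5, then find the posterior mode of a logistic regression with independent Cauchy(0, 2.5) priors on the coefficients and a wider Cauchy(0, 10) prior on the intercept. It uses a generic optimizer rather than the paper's EM/IWLS algorithm, and the example data are simulated.

```python
import numpy as np
from scipy.optimize import minimize

def cauchy_map_logit(X, y, scale=2.5, intercept_scale=10.0):
    """Posterior mode of a logistic regression with independent Cauchy priors:
    nonbinary columns are rescaled to sd 0.5, coefficients get Cauchy(0, 2.5)
    priors and the intercept a wider Cauchy(0, 10) prior."""
    Xs = np.asarray(X, dtype=float).copy()
    nonbinary = np.array([np.unique(c).size > 2 for c in Xs.T])
    mu, sd = Xs.mean(axis=0), Xs.std(axis=0)
    Xs[:, nonbinary] = (Xs[:, nonbinary] - mu[nonbinary]) / (2.0 * sd[nonbinary])
    Z = np.column_stack([np.ones(len(y)), Xs])
    scales = np.concatenate([[intercept_scale], np.full(Xs.shape[1], scale)])

    def neg_log_posterior(beta):
        eta = Z @ beta
        loglik = np.sum(y * eta - np.logaddexp(0.0, eta))        # Bernoulli log-likelihood
        logprior = -np.sum(np.log1p((beta / scales) ** 2))       # Cauchy kernel
        return -(loglik + logprior)

    return minimize(neg_log_posterior, np.zeros(Z.shape[1]), method="BFGS").x

# Simulated illustration.
rng = np.random.default_rng(5)
X = rng.normal(size=(40, 2))
y = (rng.random(40) < 0.5).astype(float)
print(cauchy_map_logit(X, y))
```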
Journal ArticleDOI

Methods for Detecting Associations with Rare Variants for Common Diseases: Application to Analysis of Sequence Data

TL;DR: It is shown that the collapsing method, which involves collapsing genotypes across variants and applying a univariate test, is powerful for analyzing rare variants, whereas multivariate analysis is robust against inclusion of noncausal variants.
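A minimal sketch of the collapsing idea, assuming a genotype matrix of rare-variant allele counts and a binary phenotype: samples are collapsed to a single "carries any rare variant" indicator and a univariate 2x2 test is applied. This illustrates the general approach, not the authors' implementation.

```python
import numpy as np
from scipy.stats import fisher_exact

def collapse_and_test(G, y):
    """Collapse rare-variant genotypes into a single carrier indicator and
    apply a univariate 2x2 test against case/control status.
    G: (n_samples, n_variants) minor-allele counts; y: 0/1 phenotype."""
    carrier = (G > 0).any(axis=1).astype(int)
    table = np.array([
        [np.sum((carrier == 1) & (y == 1)), np.sum((carrier == 1) & (y == 0))],
        [np.sum((carrier == 0) & (y == 1)), np.sum((carrier == 0) & (y == 0))],
    ])
    odds_ratio, p_value = fisher_exact(table)
    return odds_ratio, p_value

# Simulated data for illustration only: sparse rare variants, random phenotype.
rng = np.random.default_rng(2)
G = (rng.random((200, 30)) < 0.01).astype(int)
y = rng.integers(0, 2, size=200)
print(collapse_and_test(G, y))
```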
Journal ArticleDOI

Back to the Future: Modeling Time Dependence in Binary Data

TL;DR: Monte Carlo analysis demonstrates that, for the types of hazards one often sees in substantive research, the polynomial approximation always outperforms time dummies and generally performs as well as splines or even more flexible autosmoothing procedures.
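As an illustration of the kind of specification this summary refers to, the sketch below adds a cubic polynomial in duration (t, t^2, t^3) to a logit fitted with statsmodels; the data are simulated and the variable names are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

# Simulated illustration: t stands for "time since the last event" and the
# cubic terms provide smooth duration dependence (higher-order terms rescaled
# only to keep the design well conditioned).
rng = np.random.default_rng(3)
n = 500
x = rng.normal(size=n)                           # an ordinary covariate
t = rng.integers(1, 13, size=n).astype(float)    # duration since last event
eta = -2.0 + 0.8 * x - 0.15 * t + 0.01 * t**2    # true model used to simulate
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))

X = sm.add_constant(np.column_stack([x, t, t**2 / 10.0, t**3 / 100.0]))
fit = sm.Logit(y, X).fit(disp=0)
print(fit.params)                                # const, x, t, t^2/10, t^3/100
```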
References
Journal ArticleDOI

Infinite Parameter Estimates in Logistic Regression, with Application to Approximate Conditional Inference

TL;DR: In this article, the authors discuss recovery of information regarding logistic regression parameters in cases when maximum likelihood estimates of some parameters are infinite and present an algorithm for detecting such cases and characterizing the divergence of the parameter estimates.
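One standard way to detect such cases, sketched below under the assumption of a plain design matrix and 0/1 outcomes, is a homogeneous linear program that is unbounded exactly when a separating direction exists; this is a generic check, not the algorithm from the paper above.

```python
import numpy as np
from scipy.optimize import linprog

def has_separation(X, y):
    """Detect (quasi-)complete separation with a homogeneous linear program:
    maximise sum_i z_i x_i'b subject to z_i x_i'b >= 0, where z_i = 2y_i - 1.
    The LP is unbounded exactly when some direction b separates the data,
    i.e. when ordinary ML estimates drift to +/- infinity."""
    Z = (2 * y - 1)[:, None] * X                     # rows z_i * x_i
    res = linprog(c=-Z.sum(axis=0),                  # minimise -(sum_i z_i x_i)'b
                  A_ub=-Z, b_ub=np.zeros(len(y)),    # enforce Z b >= 0
                  bounds=[(None, None)] * X.shape[1],
                  method="highs")
    return res.status == 3                           # status 3: unbounded

# Completely separated toy data: y switches from 0 to 1 as x crosses 3.5.
X = np.column_stack([np.ones(6), [1, 2, 3, 4, 5, 6]])
y = np.array([0, 0, 0, 1, 1, 1])
print(has_separation(X, y))                          # True
```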
Book ChapterDOI

Generalized Linear Models and Jeffreys Priors: An Iterative Weighted Least-Squares Approach

TL;DR: A new algorithm for calculation of the posterior mode of the Jeffreys invariant prior in generalized linear models makes use of the iterative weighted least-squares method commonly used for maximum likelihood calculations, so that implementation is possible in standard regression software, such as GLIM.

The application of Firth's procedure to Cox and logistic regression

M. Schemper
TL;DR: In this paper, the authors present the results of an extensive simulation study exploring the properties of Firth's procedure in logistic and Cox regression; the empirical bias of Firth-type parameter estimates is compared to that resulting from ordinary maximum likelihood estimation in logistic and Cox regression.
Journal ArticleDOI

Jackknife bias reduction for polychotomous logistic regression.

TL;DR: Presents a Monte Carlo comparison of jackknife and Taylor-series estimates in moderate sample sizes in a general logistic regression setting, investigating dichotomous and trichotomous responses and a mixture of correlated and uncorrelated binary and normal covariates.
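A generic sketch of the leave-one-out jackknife bias correction that underlies such comparisons, with a deliberately simple toy estimator; it is not the paper's polychotomous logistic regression implementation.

```python
import numpy as np

def jackknife_bias_corrected(estimator, X, y):
    """Leave-one-out jackknife bias correction:
    theta_jack = n * theta_hat - (n - 1) * mean_i theta_(-i).
    `estimator` maps (X, y) to a parameter vector."""
    n = len(y)
    theta_full = estimator(X, y)
    loo = np.array([estimator(np.delete(X, i, axis=0), np.delete(y, i))
                    for i in range(n)])
    return n * theta_full - (n - 1) * loo.mean(axis=0)

# Toy check with a deliberately biased estimator (the ML variance estimate);
# the correction essentially recovers the unbiased sample variance.
rng = np.random.default_rng(4)
y = rng.normal(size=30)
X = np.ones((30, 1))                               # unused by this toy estimator
ml_var = lambda X, y: np.array([y.var()])
print(ml_var(X, y), jackknife_bias_corrected(ml_var, X, y))
```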