Open Access · Journal Article

Dimension reduction in a semiparametric regression model with errors in covariates

Raymond J. Carroll, +2 more
01 Feb 1995 · Vol. 23, Iss. 1, pp. 161-181
TL;DR
In this paper, the authors consider a semiparametric estimation method for general regression models when some of the predictors are measured with error, and they show that the usual theory is essentially as good as one can do with this technique.
Abstract
We consider a semiparametric estimation method for general regression models when some of the predictors are measured with error. The technique relies on a kernel regression of the "true" covariate on all the observed covariates and surrogates. This requires a nonparametric regression in as many dimensions as there are covariates and surrogates. The usual theory copes with such higher-dimensional problems by using higher-order kernels, but this is unrealistic for most problems. We show that the usual theory is essentially as good as one can do with this technique. Instead of regression with higher-order kernels, we propose the use of dimension reduction techniques. We assume that the "true" covariate depends only on a linear combination of the observed covariates and surrogates. If this linear combination were known, we could apply the one-dimensional versions of the semiparametric problem, for which standard kernels are applicable. We show that if one can estimate the linear directions at the root-n rate, then asymptotically the resulting estimator of the parameters in the main regression model behaves as if the linear combination were known. Simulations lend some credence to the asymptotic results.
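The abstract's recipe can be illustrated concretely: rather than a full-dimensional kernel regression of the unobserved covariate on all observed covariates and surrogates, estimate a single linear index at the root-n rate and run a one-dimensional kernel smoother on that index. The sketch below is only an illustration under simplifying assumptions, not the authors' estimator: it presumes a validation subsample in which the true covariate X is observed, estimates the index direction by ordinary least squares (any root-n-consistent direction estimate, such as sliced inverse regression, would serve), and plugs the calibrated covariate into a least-squares fit of a linear main model. All function names are hypothetical.

import numpy as np

def estimate_direction(x_valid, zw_valid):
    """Root-n estimate of the index direction: least squares of the true
    covariate on the stacked observed covariates and surrogates (Z, W)."""
    design = np.column_stack([np.ones(len(zw_valid)), zw_valid])
    coef, *_ = np.linalg.lstsq(design, x_valid, rcond=None)
    alpha = coef[1:]
    return alpha / np.linalg.norm(alpha)

def nw_smooth(t_train, x_train, t_eval, h):
    """One-dimensional Nadaraya-Watson regression with a Gaussian kernel."""
    d = (t_eval[:, None] - t_train[None, :]) / h
    w = np.exp(-0.5 * d ** 2)
    return (w @ x_train) / w.sum(axis=1)

def calibrated_fit(y, zw, x_valid, zw_valid, h=0.5):
    """Fit y = b0 + b1 * X + error using a calibrated estimate of X."""
    # 1. Estimate the linear index from the validation subsample.
    alpha = estimate_direction(x_valid, zw_valid)
    # 2. One-dimensional kernel regression of X on the estimated index.
    x_hat = nw_smooth(zw_valid @ alpha, x_valid, zw @ alpha, h)
    # 3. Use the calibrated covariate in place of the unobserved X.
    design = np.column_stack([np.ones(len(y)), x_hat])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return beta

Under the paper's asymptotics, using an estimated direction in step 1 behaves as if the true linear combination were known, which is what makes the one-dimensional smoother in step 2 adequate with standard (non-higher-order) kernels.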



Citations
Journal Article

Semiparametric Regression for Clustered Data Using Generalized Estimating Equations

TL;DR: In this article, the authors consider estimation in a semiparametric generalized linear model for clustered data using estimating equations and show that the conventional profile-kernel method often fails to yield a √n-consistent estimator of β along with appropriate inference unless working independence is assumed or θ(t) is artificially undersmoothed, in which case asymptotic inference is possible.

Can SIR be as popular as multiple linear regression?

TL;DR: This question is studied for one of the recently proposed dimension-reduction methods, sliced inverse regression (SIR), and it is shown how to enhance the SIR analysis so that the familiar features of multiple linear regression (MLR) can be maintained.
Journal Article

Weighted Semiparametric Estimation in Regression Analysis with Missing Covariate Data

TL;DR: In this article, the performance of a Horvitz and Thompson (1952)-type weighted estimator was investigated by using different estimates of the selection probabilities, which may be treated as nuisance parameters (or a nuisance function).
Journal Article

Estimation in Partially Linear Models With Missing Covariates

TL;DR: In this article, the authors considered a partially linear model in which the covariate X is sometimes missing, with missingness probability π depending on (Y, Z), and developed new methods for estimating the parametric component β and the nonparametric component ν(·).
Journal Article

Failure time regression with continuous covariates measured with error

TL;DR: In this paper, failure time regression analysis with an auxiliary variable in the presence of a validation sample is considered; the induced relative risk function is estimated with a kernel smoother, and the selection probability of the validation set is allowed to depend on the observed covariates.
References
Book

Measurement Error Models

TL;DR: In this paper, the authors provide a complete treatment of an important and frequently ignored topic, namely measurement error models, including regression models with errors in the variables, latent variable models, and factor models.
Journal Article

Projection Pursuit Regression

TL;DR: In this article, a new method for nonparametric multiple regression, projection pursuit regression (PPR), is presented; it models the regression surface as a sum of general smooth functions of linear combinations of the predictor variables, built up in an iterative manner.
Journal Article

Sliced Inverse Regression for Dimension Reduction

TL;DR: In this article, sliced inverse regression (SIR) is proposed to reduce the dimension of the input variable without going through any parametric or nonparametric model-fitting process.
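Since the linear-index assumption in the main paper is exactly the kind of structure SIR is designed to recover, a compact sketch of the basic SIR procedure may help; the slice count and other details below are illustrative choices made under the usual linearity condition on the predictors, not prescriptions from the cited paper.

import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=1):
    """Basic sliced inverse regression: estimate dimension-reduction
    directions from the within-slice means of the standardized predictors."""
    n, p = X.shape
    # Standardize the predictors so their sample covariance is the identity.
    Xc = X - X.mean(axis=0)
    L = np.linalg.cholesky(np.cov(X, rowvar=False))
    A = np.linalg.inv(L).T          # Xc @ A has identity sample covariance
    Z = Xc @ A
    # Slice the response into roughly equal-count slices.
    slices = np.array_split(np.argsort(y), n_slices)
    # Weighted covariance of the within-slice means of Z.
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original predictor scale.
    _, eigvecs = np.linalg.eigh(M)
    top = eigvecs[:, ::-1][:, :n_directions]
    beta = A @ top
    return beta / np.linalg.norm(beta, axis=0)

Under mild conditions the estimated directions converge at the root-n rate, which is the rate the main paper requires for the plug-in argument to go through.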
Book

Transformation and Weighting in Regression

TL;DR: The transform-both-sides methodology described in this book combines transformations and weighting for least-squares estimation and inference for variance functions, and is applied to generalized least squares and the analysis of heteroscedasticity.
Journal Article

Correction of logistic regression relative risk estimates and confidence intervals for systematic within-person measurement error

TL;DR: Two methods are provided to correct relative risk estimates obtained from logistic regression models for measurement errors in continuous exposures within cohort studies, where the errors may be due to either random (unbiased) within-person variation or systematic errors for individual subjects.