Open Access Journal ArticleDOI

Comparing Nonparametric Versus Parametric Regression Fits

Wolfgang Karl Härdle, Enno Mammen
- 01 Dec 1993 - 
- Vol. 21, Iss: 4, pp 1926-1947
TLDR
In this paper, the wild bootstrap is used to assess a parametric regression model against a nonparametric curve estimate via the integrated squared difference between the two fits; the standard way of bootstrapping this statistic is shown to fail, and the method is applied to fitting Engel curves in expenditure data analysis.
Abstract
In general, there will be visible differences between a parametric and a nonparametric curve estimate. It is therefore quite natural to compare these in order to decide whether the parametric model could be justified. An asymptotic quantification is the distribution of the integrated squared difference between these curves. We show that the standard way of bootstrapping this statistic fails. We use and analyse a different form of bootstrapping for this task. We call this method the wild bootstrap and apply it to fitting Engel curves in expenditure data analysis.


Citations
Book

Local Regression and Likelihood

Clive Loader
TL;DR: Covers the origins of local regression, fitting with LOCFIT, and optimizing local regression methods.
Journal ArticleDOI

On the Failure of the Bootstrap for Matching Estimators

TL;DR: In this article, the authors show that the standard bootstrap is not valid for matching estimators, even in the simple case with a single continuous covariate where the estimator is root-N consistent and asymptotically normally distributed with zero asymptotic bias.
Journal ArticleDOI

Bootstrap and Wild Bootstrap for High Dimensional Linear Models

Enno Mammen
- 01 Mar 1993 - 
TL;DR: In this article, two bootstrap procedures are considered for the estimation of the distribution of linear contrasts and of F-test statistics in high dimensional linear models, where the dimension p of the model may increase as the sample size n tends to infinity.
Journal ArticleDOI

Generalized likelihood ratio statistics and Wilks phenomenon

TL;DR: The generalized likelihood ratio statistics are shown to be general and powerful for nonparametric testing problems based on function estimation and can even be adaptively optimal in the sense of Spokoiny by using a simple choice of adaptive smoothing parameter.
Book

Partially Linear Models

TL;DR: The emphasis of this monograph is on methodologies rather than on the theory, with a particular focus on applications of partially linear regression techniques to various statistical problems, including least squares regression, asymptotically efficient estimation, bootstrap resampling, censored data analysis and nonlinear and nonparametric time series models.
References
Posted Content

Applied nonparametric methods

TL;DR: In this paper, different approaches to nonparametric density and regression estimation are reviewed, and different kernel estimators are compared to k-NN estimators, orthogonal series and splines.
Journal ArticleDOI

On the use of nonparametric regression for model checking

TL;DR: In this article, the authors explored the use of nonparametric regression to check the fit of a parametric regression model and developed a pseudo likelihood ratio test to provide a global assessment of fit and simulation bands to indicate the nature of departures from the model.
Posted Content

Semiparametric comparison of regression curves

TL;DR: In this paper, a comparison of nonparametric regression curves is considered, where it is assumed that there are parametric (possibly nonlinear) transformations of the axes which map one curve into the other.
Journal ArticleDOI

A smoothing spline based test of model adequacy in polynomial regression

TL;DR: In this article, it is shown that no uniformly (in b) most powerful test exists, but a locally (at b=0) most powerful test does exist. The test statistic is derived and computed using smoothing spline theory.
Journal ArticleDOI

On the convergence rate of maximal deviation distribution for kernel regression estimates

TL;DR: In this paper, it was shown that the distribution of the maximal deviation converges to a double-exponential limit at a logarithmic rate, and that this rate cannot be improved.