Open Access Journal Article (DOI)

Data-Driven Bandwidth Selection in Local Polynomial Fitting: Variable Bandwidth and Spatial Adaptation

TL;DR
The authors propose a data-driven bandwidth selection procedure for local polynomial fitting that can select both constant and variable bandwidths; it is based on a residual squares criterion together with a good approximation of the bias and variance of the estimator.
Abstract
When estimating a mean regression function and its derivatives, locally weighted least squares regression has proven to be a very attractive technique. The present paper focuses on the important issue of how to select the smoothing parameter or bandwidth. In the case of estimating curves with a complicated structure, a variable bandwidth is desirable. Furthermore, the bandwidth should be indicated by the data themselves. Recent developments in nonparametric smoothing techniques inspired us to propose such a data-driven bandwidth selection procedure, which can be used to select both constant and variable bandwidths. The idea is based on a residual squares criterion along with a good approximation of the bias and variance of the estimator. The procedure can be applied to select bandwidths not only for estimating the regression curve but also for estimating its derivatives. The resulting estimation procedure has the necessary flexibility for capturing complicated shapes of curves. This is illustrated via a large variety of testing examples, including examples with a large spatial variability. The results are also compared with wavelet thresholding techniques, and they appear to be at least comparable: local polynomial regression using our data-driven variable bandwidth has spatial adaptation properties similar to those of wavelets.
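For readers who want a concrete picture, the sketch below (not the authors' exact procedure) fits a local polynomial by weighted least squares and picks a constant bandwidth by scanning a grid with a crude penalised residual-sum-of-squares score. The function names, the Gaussian kernel, and the penalty term are illustrative assumptions only, standing in for the paper's residual squares criterion with its bias and variance approximations.

```python
import numpy as np

def local_poly_fit(x, y, x0, h, degree=1):
    """Weighted least squares fit of a degree-p polynomial centred at x0.
    beta[0] estimates m(x0); beta[j] estimates m^(j)(x0)/j! for j >= 1.
    The Gaussian kernel is an arbitrary choice for this sketch."""
    u = (x - x0) / h
    w = np.exp(-0.5 * u ** 2)                              # kernel weights
    X = np.vander(x - x0, degree + 1, increasing=True)     # [1, (x-x0), (x-x0)^2, ...]
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

def constant_bandwidth_by_rss(x, y, candidates, degree=1):
    """Toy selector: scan a bandwidth grid and keep the bandwidth with the
    smallest penalised residual sum of squares.  The penalty below is an
    ad hoc guard against undersmoothing, NOT the paper's residual squares
    criterion with its bias/variance approximations."""
    best_h, best_score = None, np.inf
    for h in candidates:
        fitted = np.array([local_poly_fit(x, y, xi, h, degree)[0] for xi in x])
        rss = np.mean((y - fitted) ** 2)
        score = rss * (1.0 + 2.0 * (degree + 1) / (len(x) * h))
        if score < best_score:
            best_h, best_score = h, score
    return best_h
```

With a selected bandwidth, the same weighted fit evaluated over a grid of points gives the curve estimate, and the higher-order coefficients give (scaled) derivative estimates; a variable-bandwidth version would repeat the selection locally.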

Citations
Journal Article (DOI)

Adapting to Unknown Smoothness via Wavelet Shrinkage

TL;DR: The authors propose a smoothness-adaptive thresholding procedure, SureShrink, which chooses its threshold by minimizing Stein's unbiased estimate of risk (SURE) and is near minimax simultaneously over a whole interval of the Besov scale; the size of this interval depends on the choice of mother wavelet.
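As a minimal sketch of the SURE idea for soft thresholding (assuming unit-variance Gaussian noise on the empirical wavelet coefficients; the function names are made up for this example):

```python
import numpy as np

def sure_soft(t, coeffs):
    """Stein's unbiased risk estimate for soft thresholding at t, assuming
    unit-variance Gaussian noise on the empirical wavelet coefficients:
    SURE(t) = n - 2 * #{|d_i| <= t} + sum_i min(|d_i|, t)^2."""
    a = np.abs(coeffs)
    return len(coeffs) - 2 * np.sum(a <= t) + np.sum(np.minimum(a, t) ** 2)

def sure_threshold(coeffs):
    """Choose the threshold minimising SURE; searching over the observed
    |coefficients| suffices because SURE only changes shape at those values."""
    candidates = np.sort(np.abs(coeffs))
    return candidates[int(np.argmin([sure_soft(t, coeffs) for t in candidates]))]
```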
Journal Article (DOI)

Flexible smoothing with B-splines and penalties

TL;DR: The authors propose using a relatively large number of knots together with a difference penalty on the coefficients of adjacent B-splines, and show connections to the familiar spline penalty on the integral of the squared second derivative.
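A hedged sketch of the penalised least squares step this describes, assuming a B-spline basis matrix B is already available (e.g. built with a standard spline library); the helper name is illustrative:

```python
import numpy as np

def pspline_fit(B, y, lam, order=2):
    """P-spline coefficients: penalised least squares with a difference
    penalty of the given order on adjacent B-spline coefficients.
    B is an (n x k) B-spline basis matrix evaluated at the design points."""
    k = B.shape[1]
    D = np.diff(np.eye(k), n=order, axis=0)      # order-th difference matrix
    return np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
```

Fitted values are then B @ a, with lam playing the role of the smoothing parameter, chosen for instance by cross-validation.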
Journal Article (DOI)

Locally Weighted Learning

TL;DR: The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, and applications of locally weighted learning.
Book

Local Regression and Likelihood

Clive Loader
TL;DR: The book covers the origins of local regression, fitting with LOCFIT, and optimizing local regression methods.
Journal Article (DOI)

An Effective Bandwidth Selector for Local Least Squares Regression

TL;DR: The authors apply the idea of plug-in bandwidth selection to develop strategies for choosing the smoothing parameter of local least squares kernel estimators; the approach is applicable to odd-degree local polynomial fits and can be extended to other settings, such as derivative estimation and multiple nonparametric regression.
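For context, the plug-in approach substitutes pilot estimates into an asymptotically optimal bandwidth formula. The sketch below uses one common form of that formula for local linear regression with a Gaussian kernel under homoscedastic errors; the function name and the way the pilot quantities are obtained are assumptions, not the paper's exact selector.

```python
import numpy as np

def plug_in_bandwidth(sigma2_hat, theta22_hat, n, a, b):
    """One common plug-in formula for the constant bandwidth of a local
    linear fit with a Gaussian kernel and homoscedastic errors:
        h = [ R(K) * sigma^2 * (b - a) / (mu2(K)^2 * theta22 * n) ]^(1/5),
    where theta22 estimates the integral of m''(x)^2 weighted by the design
    density over [a, b].  sigma2_hat and theta22_hat come from pilot fits."""
    RK = 1.0 / (2.0 * np.sqrt(np.pi))    # roughness of the Gaussian kernel
    mu2 = 1.0                            # its second moment
    return (RK * sigma2_hat * (b - a) / (mu2 ** 2 * theta22_hat * n)) ** 0.2
```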
References
Book (DOI)

Density estimation for statistics and data analysis

TL;DR: The book covers the kernel method for multivariate data, three important methods, and density estimation in action.
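As a small, self-contained illustration of the kernel method the book treats, here is a Gaussian kernel density estimate using Silverman's rule-of-thumb bandwidth; the helper name is invented for this sketch.

```python
import numpy as np

def gaussian_kde(data, grid):
    """Gaussian kernel density estimate with Silverman's rule-of-thumb
    bandwidth h = 0.9 * min(std, IQR / 1.34) * n^(-1/5)."""
    data, grid = np.asarray(data, float), np.asarray(grid, float)
    n = len(data)
    iqr = np.subtract(*np.percentile(data, [75, 25]))
    h = 0.9 * min(np.std(data, ddof=1), iqr / 1.34) * n ** (-0.2)
    u = (grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (n * h * np.sqrt(2.0 * np.pi))
```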
Journal Article (DOI)

Robust Locally Weighted Regression and Smoothing Scatterplots

TL;DR: Robust locally weighted regression is a method for smoothing a scatterplot in which the fitted value at x_k is the value of a polynomial fitted to the data by weighted least squares, where the weight for (x_i, y_i) is large if x_i is close to x_k and small if it is not.
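A compact sketch in the spirit of that procedure, with tricube neighbourhood weights and bisquare robustness weights; span handling and edge cases are simplified, and the function names are illustrative rather than the paper's.

```python
import numpy as np

def lowess_point(x, y, x0, span=0.5, robust_w=None, degree=1):
    """Local polynomial fit at x0 using tricube weights on the nearest
    span * n design points, optionally multiplied by robustness weights."""
    n = len(x)
    k = max(int(np.ceil(span * n)), degree + 2)
    d = np.abs(x - x0)
    dmax = np.sort(d)[k - 1] + 1e-12
    w = np.clip(1.0 - (d / dmax) ** 3, 0.0, None) ** 3     # tricube weights
    if robust_w is not None:
        w = w * robust_w
    X = np.vander(x - x0, degree + 1, increasing=True)
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)[0]

def robust_lowess(x, y, span=0.5, iters=2):
    """Alternate between local fits and bisquare downweighting of points
    with large residuals, as in iteratively reweighted smoothing."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    robust_w = np.ones(len(x))
    for _ in range(iters + 1):
        fitted = np.array([lowess_point(x, y, xi, span, robust_w) for xi in x])
        resid = y - fitted
        s = 6.0 * np.median(np.abs(resid)) + 1e-12
        robust_w = np.clip(1.0 - (resid / s) ** 2, 0.0, None) ** 2     # bisquare
    return fitted
```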
Journal Article (DOI)

Ideal spatial adaptation by wavelet shrinkage

TL;DR: The authors develop a spatially adaptive method, RiskShrink, which works by shrinkage of empirical wavelet coefficients and attains performance within a factor of log^2 n of the ideal performance of piecewise polynomial and variable-knot spline methods.
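For reference, the simplest estimator in this family soft-thresholds the empirical wavelet coefficients; the sketch below uses the universal threshold sigma * sqrt(2 log n) rather than RiskShrink's minimax threshold, and leaves the wavelet transform itself abstract.

```python
import numpy as np

def soft_threshold(coeffs, sigma):
    """Soft-threshold noisy wavelet coefficients at the universal threshold
    sigma * sqrt(2 log n); RiskShrink itself uses a minimax threshold instead."""
    t = sigma * np.sqrt(2.0 * np.log(len(coeffs)))
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)
```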
Journal Article (DOI)

Locally Weighted Regression: An Approach to Regression Analysis by Local Fitting

TL;DR: Locally weighted regression estimates a regression surface through a multivariate smoothing procedure, fitting a function of the independent variables locally and in a moving fashion, analogous to how a moving average is computed for a time series.
Journal Article (DOI)

Smoothing Noisy Data with Spline Functions: Estimating the Correct Degree of Smoothing by the Method of Generalized Cross-Validation

Peter Craven, Grace Wahba
TL;DR: The authors present a method, based on smoothing splines, for estimating the optimum amount of smoothing from the data; smoothing splines are well known to provide smooth curves that fit discrete, noisy data.
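A hedged sketch of the generalized cross-validation criterion for a linear smoother with hat matrix A(lambda): GCV(lambda) = n * ||(I - A) y||^2 / [tr(I - A)]^2, minimised over lambda. The smoothing-spline construction itself is omitted; any routine returning the smoother's hat matrix could be plugged in, and the function names below are assumptions.

```python
import numpy as np

def gcv_score(y, A):
    """Generalized cross-validation score for a linear smoother with hat
    matrix A:  GCV = n * ||(I - A) y||^2 / (trace(I - A))^2."""
    n = len(y)
    resid = y - A @ y
    return n * float(resid @ resid) / (n - np.trace(A)) ** 2

def choose_smoothing(y, hat_matrix_for, lambdas):
    """Pick the smoothing parameter minimising GCV; hat_matrix_for is any
    callable returning the smoother's hat matrix for a given lambda."""
    scores = [gcv_score(y, hat_matrix_for(lam)) for lam in lambdas]
    return lambdas[int(np.argmin(scores))]
```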