Journal Article

Asymptotics and smoothing parameter selection for penalized spline regression with various loss functions

Takuma Yoshida
01 Jan 2016 · Vol. 70, Iss. 4, pp. 278-303
TLDR
In this article, the asymptotic bias and variance of the penalized spline estimator with general convex loss functions were analyzed, and smoothing parameter selection for minimizing the mean integrated squared error was discussed.
Abstract
Penalized splines are used in various types of regression analyses, including non-parametric quantile, robust and the usual mean regression. In this paper, we focus on the penalized spline estimator with general convex loss functions. By specifying the loss function, we can obtain the mean estimator, quantile estimator and robust estimator. We first study the asymptotic properties of penalized splines. Specifically, we show the asymptotic bias and variance as well as the asymptotic normality of the estimator. Next, we discuss smoothing parameter selection for the minimization of the mean integrated squared error. The new smoothing parameter can be expressed uniquely in terms of the asymptotic bias and variance of the penalized spline estimator. To validate the new smoothing parameter selection method, we provide a simulation study. The results confirm the consistency of the estimator under the proposed smoothing parameter selection method and show that the proposed estimator behaves better than the estimator based on generalized approximate cross-validation. A real data example is also presented.
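
The abstract describes the class of estimators under study but gives no code. As a rough illustration only, the sketch below fits a penalized B-spline (P-spline) regression with a convex Huber loss and a second-order difference penalty, which is one instance of the "robust estimator" obtained by choosing a particular loss function. The knot count, spline degree, Huber threshold and smoothing parameter are arbitrary assumptions for demonstration; in particular, the smoothing parameter is fixed by hand here rather than selected by the paper's MISE-based rule.

```python
# Illustrative sketch (not the paper's implementation): penalized B-spline
# regression with a convex loss (Huber) and a difference penalty.
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize
from scipy.special import huber

rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(n)   # toy data

# Cubic B-spline basis on equally spaced interior knots.
degree, n_interior = 3, 20                                  # arbitrary choices
knots = np.concatenate((np.repeat(0.0, degree + 1),
                        np.linspace(0.0, 1.0, n_interior + 2)[1:-1],
                        np.repeat(1.0, degree + 1)))
K = len(knots) - degree - 1                                 # number of basis functions
B = np.column_stack([BSpline(knots, np.eye(K)[j], degree)(x) for j in range(K)])

# Second-order difference penalty matrix (P-spline style).
D = np.diff(np.eye(K), n=2, axis=0)

def objective(b, lam=1.0, delta=0.5):
    """Convex loss (Huber) plus quadratic difference penalty on coefficients."""
    resid = y - B @ b
    return huber(delta, resid).sum() + lam * np.sum((D @ b) ** 2)

# Smoothing parameter lam is fixed for illustration; the paper instead selects
# it by minimizing an estimate of the mean integrated squared error.
fit = minimize(objective, x0=np.zeros(K), method="BFGS")
fitted_curve = B @ fit.x
```

Replacing the Huber loss with the squared loss or the quantile check function in `objective` yields the mean or quantile estimator mentioned in the abstract; the penalty term and the role of the smoothing parameter stay the same.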

