Group selection in high-dimensional partially linear additive models
TLDR
In this article, the adaptive group Lasso is applied to select the important groups, using spline bases to approximate the nonparametric components and the group Lasso to obtain an initial consistent estimator.
Abstract
We consider the problem of simultaneous variable selection and estimation in partially linear additive models with a large number of grouped variables in the linear part and a large number of nonparametric components. In our problem, the number of grouped variables may be larger than the sample size, but the number of important groups is "small" relative to the sample size. We apply the adaptive group Lasso to select the important groups, using spline bases to approximate the nonparametric components and the group Lasso to obtain an initial consistent estimator. Under appropriate conditions, it is shown that the group Lasso selects a number of groups comparable with the number of underlying important groups and is estimation consistent, and that the adaptive group Lasso selects the correct important groups with probability converging to one as the sample size increases and is selection consistent. Simulation studies show that the adaptive group Lasso procedure works well with samples of moderate size. A real example illustrates the application of the proposed penalized method.
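The group Lasso stage described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's exact algorithm: it assumes a quadratic loss, the conventional sqrt(group size) penalty weights, and a plain proximal gradient (ISTA) solver; all variable names are illustrative.

```python
import numpy as np

def group_lasso(X, y, groups, lam, n_iter=500):
    """Group Lasso via proximal gradient descent (ISTA).

    Minimizes (1/2n)||y - X b||^2 + lam * sum_g sqrt(|g|) * ||b_g||_2,
    where `groups` is a list of index arrays, one per group.
    """
    n, p = X.shape
    # step size 1/L, with L the Lipschitz constant of the smooth gradient
    step = n / np.linalg.norm(X, 2) ** 2
    beta = np.zeros(p)
    for _ in range(n_iter):
        b = beta - step * (X.T @ (X @ beta - y) / n)
        # proximal step: block soft-thresholding shrinks each group's norm,
        # zeroing whole groups at once (this is what selects groups)
        for g in groups:
            norm = np.linalg.norm(b[g])
            thr = lam * step * np.sqrt(len(g))
            b[g] = 0.0 if norm <= thr else b[g] * (1.0 - thr / norm)
        beta = b
    return beta
```

The adaptive step the paper builds on top of this would rescale each group's threshold by a weight such as 1/||beta_init_g|| computed from this initial estimate, so groups that look large in the first pass are penalized less in the second.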
Citations
Journal Article
Variable selection in high-dimensional partially linear additive models for composite quantile regression
TL;DR: A new estimation procedure based on composite quantile regression is proposed for semiparametric additive partial linear models, in which the nonparametric components are approximated by polynomial splines; it is empirically shown to be much more efficient than the popular least-squares-based estimation method for non-normal random errors.
Journal Article
On the oracle property of adaptive group Lasso in high-dimensional linear models
Caiya Zhang, Yanbiao Xiang, et al.
TL;DR: Under appropriate conditions, the consistency and asymptotic normality are established, which means that the adaptive group Lasso shares the oracle property in high-dimensional linear regression when the number of group variables diverges with the sample size.
Journal Article
Robust group non-convex estimations for high-dimensional partially linear models
Mingqiu Wang, Guo-Liang Tian, et al.
TL;DR: In this article, robust group selection for partially linear models is considered when the number of covariates can be larger than the sample size; a non-convex penalty function is applied to achieve variable selection and estimation simultaneously, and polynomial splines are used to estimate the nonparametric component.
Journal Article
Robust estimation and variable selection in censored partially linear additive models
Huilan Liu, Hu Yang, Xiaochao Xia, et al.
TL;DR: In this article, the nonparametric components are approximated by polynomial splines, and a regularization procedure based on the adaptive Lasso is proposed to perform estimation and variable selection simultaneously.
Journal Article
Nonconvex penalized ridge estimations for partially linear additive models in ultrahigh dimension
TL;DR: In this paper, the authors combine the strengths of nonconvex penalties and ridge regression (abbreviated as NPR) to study the oracle selection property of the NPR estimator for high-dimensional additive models with highly correlated predictors.
References
Journal Article
Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Jianqing Fan, Runze Li, et al.
TL;DR: In this article, penalized likelihood approaches are proposed to handle variable selection problems, and it is shown that the newly proposed estimators perform as well as the oracle procedure in variable selection; namely, they perform as if the correct submodel were known.
Book
Spline Functions: Basic Theory
TL;DR: The material covered provides the reader with the necessary tools for understanding the many applications of splines in such diverse areas as approximation theory, computer-aided geometric design, curve and surface design and fitting, image processing, numerical solution of differential equations, and increasingly in business and the biosciences.
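The spline bases that recur throughout this literature can be made concrete with a short construction sketch. This is a generic Cox-de Boor recursion, assuming clamped (repeated) boundary knots; the knot placement and cubic degree in the example are illustrative choices, not prescriptions from the book.

```python
import numpy as np

def bspline_basis(x, knots, degree=3):
    """Evaluate a B-spline basis matrix via the Cox-de Boor recursion.

    `knots` must include the boundary knots; they are repeated `degree`
    times at each end so the basis is clamped at the boundaries.
    """
    knots = np.asarray(knots, float)
    t = np.concatenate([[knots[0]] * degree, knots, [knots[-1]] * degree])
    x = np.asarray(x, float)
    # degree-0 bases: indicators of the half-open knot intervals
    B = np.zeros((len(x), len(t) - 1))
    for j in range(len(t) - 1):
        B[:, j] = (t[j] <= x) & (x < t[j + 1])
    # include the right boundary point in the last nonempty interval
    last = np.max(np.nonzero(t < t[-1]))
    B[x == t[-1], last] = 1.0
    # raise the degree one step at a time
    for k in range(1, degree + 1):
        Bk = np.zeros((len(x), len(t) - k - 1))
        for j in range(len(t) - k - 1):
            d1, d2 = t[j + k] - t[j], t[j + k + 1] - t[j + 1]
            left = (x - t[j]) / d1 * B[:, j] if d1 > 0 else 0.0
            right = (t[j + k + 1] - x) / d2 * B[:, j + 1] if d2 > 0 else 0.0
            Bk[:, j] = left + right
        B = Bk
    return B  # columns are nonnegative and sum to 1 on [knots[0], knots[-1]]
```

In the partially linear additive setting, each nonparametric component f_j is approximated by such a basis matrix times a coefficient vector, which turns the component into one more "group" of regression coefficients.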
Journal Article
Asymptotics for lasso-type estimators
Keith Knight, Wenjiang J. Fu, et al.
TL;DR: In this paper, the authors consider the asymptotic behavior of regression estimators that minimize the residual sum of squares plus a penalty proportional to the sum of the absolute values of the coefficients raised to a power γ > 0, and show that the limiting distributions can have positive probability mass at 0 under appropriate conditions.
Journal Article
Consistency of the Group Lasso and Multiple Kernel Learning
TL;DR: This paper derives necessary and sufficient conditions for the consistency of the group Lasso under practical assumptions, and proposes an adaptive scheme that yields a consistent model estimate even when the necessary condition for the non-adaptive scheme is not satisfied.
Journal Article
Spline Smoothing in a Partly Linear Model
TL;DR: Estimators are given that minimize the residual sum of squares plus a roughness penalty, as discussed by the authors.
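The "residual sum of squares plus a roughness penalty" criterion for a partly linear model can be sketched concretely. The truncated-power basis and ridge-type penalty on the knot coefficients below are a simple stand-in for the smoothing-spline machinery; the function name, knot grid, and penalty weight are illustrative assumptions.

```python
import numpy as np

def fit_partly_linear(X, z, y, knots, lam=1.0):
    """Penalized least squares for y = X @ beta + f(z) + noise.

    f is represented in a piecewise-linear truncated-power basis, and the
    roughness penalty is a ridge penalty on the knot coefficients, so the
    criterion is ||y - X beta - f(z)||^2 + lam * ||knot coefficients||^2.
    """
    n, p = X.shape
    T = np.maximum(z[:, None] - np.asarray(knots)[None, :], 0.0)  # (z - k)_+
    C = np.column_stack([X, np.ones(n), z, T])   # linear part + spline part
    pen = np.zeros(C.shape[1])
    pen[p + 2:] = lam                            # penalize only the knot terms
    theta = np.linalg.solve(C.T @ C + np.diag(pen), C.T @ y)
    beta, fhat = theta[:p], C[:, p:] @ theta[p:]
    return beta, fhat
```

Because the intercept and linear trend in z are left unpenalized, the estimate of beta stays nearly unbiased while lam controls only the wiggliness of the fitted f.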