Rejoinder to "Least angle regression" by Efron et al.
TLDR
The authors' rejoinder to "Least angle regression" by Efron et al. [math.ST/0406456] is presented.
Citations
Journal ArticleDOI
Feature selection in machine learning: A new perspective
TL;DR: This study discusses several frequently-used evaluation measures for feature selection, and surveys supervised, unsupervised, and semi-supervised feature selection methods, which are widely applied in machine learning problems, such as classification and clustering.
Journal ArticleDOI
A high-bias, low-variance introduction to Machine Learning for physicists
Pankaj Mehta, Marin Bukov, Ching-Hao Wang, Alexandre G. R. Day, Charles C. Richardson, Charles K. Fisher, David J. Schwab +6 more
TL;DR: The review begins by covering fundamental concepts in ML and modern statistics such as the bias-variance tradeoff, overfitting, regularization, generalization, and gradient descent before moving on to more advanced topics in both supervised and unsupervised learning.
Journal ArticleDOI
History and trends in solar irradiance and PV power forecasting: A preliminary assessment and review using text mining
TL;DR: This paper presents a preliminary study on how to review solar irradiance and photovoltaic power forecasting using text mining, which serves as the first part of a forthcoming series of text mining applications in solar forecasting.
Journal ArticleDOI
Statistical predictions with glmnet.
TL;DR: In this paper, the authors provide guidelines on how to obtain parsimonious models with low mean squared error, and include easy-to-follow walk-through examples for each step in R.
Journal ArticleDOI
Robust Wasserstein Profile Inference and Applications to Machine Learning
TL;DR: In this article, the authors show that several machine learning estimators, including square-root LASSO (Least Absolute Shrinkage and Selection Operator) and regularized logistic regression, can be represented as solutions to distributionally robust optimization problems.
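The LASSO estimator recurs throughout the citing works above. As a point of reference only (not the method of any paper listed here), a minimal sketch of L1-penalized regression using scikit-learn's `Lasso`, on synthetic data chosen for illustration:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Illustrative data: only the first two of ten features carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=100)

# The L1 penalty shrinks irrelevant coefficients to exactly zero,
# so the fitted model performs variable selection as a by-product.
model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_)
print(selected)  # indices of the features kept by the penalty
```

Larger values of `alpha` strengthen the penalty and yield sparser models; `alpha` is typically chosen by cross-validation (e.g. `LassoCV`).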
References
Journal ArticleDOI
Gaussian model selection
Lucien Birgé, Pascal Massart +1 more
TL;DR: The purpose of this paper is to provide a general approach to model selection via penalization for Gaussian regression and to develop the authors' point of view on this subject.
Journal ArticleDOI
Calibration and empirical Bayes variable selection
Edward I. George, Dean P. Foster +1 more
TL;DR: In this article, the authors propose empirical Bayes selection criteria for variable selection in the normal linear model that use hyperparameter estimates in place of fixed choices; their performance is seen to adaptively approximate that of the best fixed-penalty criterion across a variety of orthogonal and nonorthogonal setups, including wavelet regression.
Journal ArticleDOI
Adapting to unknown sparsity by controlling the false discovery rate
TL;DR: This work provides a new perspective on a class of model selection rules recently introduced by several authors, and exhibits their close connection with procedures that stringently control the false discovery rate.
Posted Content
Adapting to Unknown Sparsity by controlling the False Discovery Rate
TL;DR: In this article, a data-adaptive thresholding scheme is proposed to recover an n-dimensional vector observed in white noise, where the vector is known to be sparse, but the degree of sparsity is unknown.
Posted Content
An Information Theoretic Comparison of Model Selection Criteria
Dean P. Foster, Robert A. Stine +1 more
TL;DR: When the model that minimizes the total message length is selected, the information-theoretic representations of numerous model selection criteria reduce to their more familiar definitions.