Open Access Journal Article (DOI)

Principal component-guided sparse regression

About
This article was published in the Canadian Journal of Statistics (Revue Canadienne de Statistique) on 2021-04-16 and is currently open access. It has received 5 citations to date. The article focuses on the topics: Lasso (statistics) and Principal component analysis.


Citations
Journal Article

Functional-Hybrid modeling through automated adaptive symbolic regression for interpretable mathematical expressions

TL;DR: The Functional-Hybrid model uses ranked domain-specific functional beliefs together with symbolic regression to develop dynamic models of (bio)chemical processes, focusing on applying chemical reaction kinetic principles to classical chemical reactions, biochemistry, ecology, physiology, and a bioreactor.
Journal Article

OUP accepted manuscript

TL;DR: SuffPCR first estimates sparse principal components and then fits a linear model on the recovered subspace, yielding improved predictions in high-dimensional tasks, including regression and classification, especially in the typical omics setting of correlated features.
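The two-stage idea in the summary above (sparse principal components first, then a linear model on the recovered subspace) can be sketched in NumPy. This is an illustrative toy version that hard-thresholds the loadings of the leading principal component; it is not the SuffPCR algorithm itself, and the data, threshold, and variable names are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 100, 30
z = rng.standard_normal(n)                       # latent factor driving both X and y
X = np.outer(z, np.r_[np.ones(5), np.zeros(p - 5)]) + 0.3 * rng.standard_normal((n, p))
y = 2.0 * z + 0.1 * rng.standard_normal(n)

# Step 1: leading principal component, sparsified by hard-thresholding small loadings
U, s, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
v = Vt[0]
v_sparse = np.where(np.abs(v) > 0.1, v, 0.0)     # keep only large loadings
v_sparse /= np.linalg.norm(v_sparse)

# Step 2: regress y on the score along the sparse direction
score = X @ v_sparse
coef = (score @ y) / (score @ score)
```

Because only the first five features carry the latent factor, the thresholded loading vector concentrates on them, and the one-dimensional regression on the score recovers most of the signal.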
Journal Article

Dualize, Split, Randomize: Toward Fast Nonsmooth Optimization Algorithms

TL;DR: In this paper, the authors propose a primal-dual algorithm for minimizing the sum of three convex functions, where the first, F, is smooth, the second is nonsmooth but proximable, and the third is the composition of a nonsmooth proximable function with a linear operator L. This problem has many applications, for instance in image processing and machine learning.
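For context, the classical (non-randomized) Condat-Vu primal-dual iteration handles exactly this three-function template, min over x of F(x) + G(x) + H(Lx). The sketch below is a generic illustration on a toy problem with G = 0 and L = I, chosen so the minimizer is known in closed form (soft-thresholding); it is not the randomized algorithm proposed in the paper:

```python
import numpy as np

def condat_vu(a, L, lam, tau=0.5, sigma=0.5, n_iter=1000):
    """Primal-dual iteration for min_x F(x) + G(x) + H(Lx), with
    F(x) = 0.5*||x - a||^2 (smooth), G = 0 (trivially proximable),
    and H = lam*||.||_1 composed with the linear operator L.
    The prox of H* is projection onto the l-infinity ball of radius lam."""
    x = np.zeros_like(a)
    u = np.zeros(L.shape[0])
    for _ in range(n_iter):
        x_new = x - tau * ((x - a) + L.T @ u)            # prox of G=0 is the identity
        u = np.clip(u + sigma * (L @ (2 * x_new - x)), -lam, lam)
        x = x_new
    return x

a = np.array([2.0, -0.5, 1.2, 0.05])
x_star = condat_vu(a, np.eye(4), lam=1.0)
# With L = I the minimizer is the soft-thresholding of a at level lam
```

The step sizes satisfy the usual condition tau * (beta/2 + sigma * ||L||^2) <= 1 with beta = 1, so the iteration converges for this toy problem.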
Journal Article

An analytical shrinkage estimator for linear regression

TL;DR: This article derives an analytical solution for the optimal shrinkage of OLS regression coefficients toward a constant target, valid under any first two moments of the predictors. The resulting estimator closely mimics the prediction performance of the ridge penalty, which admits no general analytical solution.
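The shrinkage family in question interpolates between the OLS fit and a constant target. The paper derives the optimal interpolation weight analytically; the sketch below instead picks the weight on a held-out split, purely to illustrate the estimator family (the data, split, and variable names are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 60, 20
X = rng.standard_normal((n, p))
beta = 0.5 + 0.1 * rng.standard_normal(p)        # true coefficients clustered near 0.5
y = X @ beta + rng.standard_normal(n)
X_tr, y_tr, X_va, y_va = X[:40], y[:40], X[40:], y[40:]

b_ols, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)
c = b_ols.mean()                                 # constant shrinkage target

def val_mse(w):
    """Validation MSE of the estimator w * b_ols + (1 - w) * c."""
    b = w * b_ols + (1 - w) * c                  # shrink OLS toward the constant
    return np.mean((y_va - X_va @ b) ** 2)

ws = np.linspace(0, 1, 101)
w_best = ws[np.argmin([val_mse(w) for w in ws])]
```

With p comparable to n, the unshrunk OLS fit (w = 1) is noisy, so the selected weight typically pulls the coefficients noticeably toward the constant target, mimicking what ridge regularization achieves.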
References
Journal Article

Regression Shrinkage and Selection via the Lasso

TL;DR: A new method for estimation in linear models, called the lasso, is proposed; it minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant.
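The constrained problem described above is equivalent to penalizing the residual sum of squares with an L1 term, which can be solved by cyclic coordinate descent with soft-thresholding. A minimal NumPy sketch on synthetic data (illustrative, not code from the paper):

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Minimize (1/2n)||y - Xb||^2 + lam*||b||_1 by cyclic coordinate descent."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]     # partial residual excluding feature j
            rho = X[:, j] @ r_j / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
beta = np.zeros(10); beta[:3] = [3.0, -2.0, 1.5]
y = X @ beta + 0.1 * rng.standard_normal(100)
b_hat = lasso_cd(X, y, lam=0.1)
```

The soft-thresholding step is what sets small coefficients exactly to zero, producing the sparse solutions the lasso is known for.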
Journal Article

Regularization and variable selection via the elastic net

TL;DR: It is shown that the elastic net often outperforms the lasso while enjoying a similar sparsity of representation, and an algorithm called LARS-EN is proposed for computing elastic net regularization paths efficiently, much like algorithm LARS does for the lasso.
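The elastic net adds a ridge term to the lasso penalty, which keeps groups of correlated predictors in the model together. A minimal coordinate-descent sketch in NumPy on illustrative data (this shows the penalized form, not the LARS-EN path algorithm):

```python
import numpy as np

def enet_cd(X, y, lam, alpha, n_iter=200):
    """Elastic net by cyclic coordinate descent:
    minimize (1/2n)||y - Xb||^2 + lam*(alpha*||b||_1 + (1-alpha)/2*||b||_2^2)."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]         # partial residual excluding j
            rho = X[:, j] @ r_j / n
            z = np.sign(rho) * max(abs(rho) - lam * alpha, 0.0)  # soft-threshold
            b[j] = z / (col_sq[j] + lam * (1 - alpha))           # ridge in denominator
    return b

rng = np.random.default_rng(0)
n = 100
x1 = rng.standard_normal(n)
x2 = x1 + 0.01 * rng.standard_normal(n)              # nearly identical predictor
X = np.column_stack([x1, x2, rng.standard_normal((n, 3))])
y = x1 + x2 + 0.1 * rng.standard_normal(n)
b_enet = enet_cd(X, y, lam=0.1, alpha=0.5)
```

With two nearly identical predictors, the ridge term splits the coefficient between them instead of arbitrarily dropping one, which is the grouped-selection behavior the summary refers to.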
Journal Article

Regularization Paths for Generalized Linear Models via Coordinate Descent

TL;DR: In comparative timings, the new algorithms are considerably faster than competing methods and can handle large problems and can also deal efficiently with sparse features.
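A key speed ingredient of the pathwise coordinate-descent approach is fitting along a decreasing grid of penalties and warm-starting each fit from the previous solution. A minimal NumPy sketch of that strategy (illustrative, not the glmnet implementation):

```python
import numpy as np

def lasso_cd(X, y, lam, b0, n_iter=100):
    """One lasso fit by cyclic coordinate descent, started from b0 (warm start)."""
    n, p = X.shape
    b = b0.copy()
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]           # partial residual without j
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

def lasso_path(X, y, n_lam=20):
    """Fit along a decreasing lambda grid, warm-starting each fit
    from the previous solution (the pathwise strategy)."""
    n, p = X.shape
    lam_max = np.max(np.abs(X.T @ y)) / n            # smallest lam giving an all-zero fit
    lams = lam_max * np.logspace(0, -2, n_lam)
    b, path = np.zeros(p), []
    for lam in lams:
        b = lasso_cd(X, y, lam, b)                   # warm start from previous lam
        path.append(b.copy())
    return lams, np.array(path)

rng = np.random.default_rng(0)
X = rng.standard_normal((80, 15))
beta = np.zeros(15); beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + 0.1 * rng.standard_normal(80)
lams, path = lasso_path(X, y)
```

Because consecutive solutions on the grid are close, each warm-started fit needs only a few sweeps, which is a large part of why the pathwise approach is fast in practice.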
Journal Article

Model selection and estimation in regression with grouped variables

TL;DR: In this paper, instead of selecting factors by stepwise backward elimination, the authors focus on the accuracy of estimation and consider extensions of the lasso, the LARS algorithm and the non-negative garrotte for factor selection.
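The group lasso behind this work penalizes the Euclidean norm of each factor's coefficient block, so an entire factor enters or leaves the model together. Its proximal step, groupwise soft-thresholding, can be sketched in NumPy (the group structure and values are illustrative):

```python
import numpy as np

def group_soft_threshold(b, groups, t):
    """Groupwise soft-thresholding: each group's coefficient block is
    shrunk toward zero, and zeroed out entirely if its norm is below t,
    so whole factors are selected or dropped together."""
    out = b.copy()
    for g in groups:
        norm = np.linalg.norm(b[g])
        out[g] = 0.0 if norm <= t else (1 - t / norm) * b[g]
    return out

b = np.array([3.0, 4.0, 0.1, -0.1, 2.0])
groups = [[0, 1], [2, 3], [4]]                       # e.g. dummy blocks for factors
shrunk = group_soft_threshold(b, groups, t=1.0)
```

Here the second group's norm is below the threshold, so both of its coefficients are set exactly to zero, while the other groups are shrunk but kept, which is exactly the factor-level selection described in the summary.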