Topic

Principal component regression

About: Principal component regression is a research topic. Over its lifetime, 3,432 publications have been published on this topic, receiving 122,255 citations.


Papers
Journal ArticleDOI

[...]

TL;DR: A new method for performing a nonlinear form of principal component analysis by the use of integral operator kernel functions is proposed, and experimental results on polynomial feature extraction for pattern recognition are presented.
Abstract: A new method for performing a nonlinear form of principal component analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map—for instance, the space of all possible five-pixel products in 16 × 16 images. We give the derivation of the method and present experimental results on polynomial feature extraction for pattern recognition.

7,611 citations
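The abstract above describes kernel PCA: principal components in an implicit high-dimensional feature space are obtained by eigendecomposing a centered kernel matrix rather than by mapping the data explicitly. Below is a minimal NumPy sketch of that computation with a polynomial kernel, assuming a data matrix X of shape (n_samples, n_features); the function name and defaults are illustrative, not taken from the paper.

    import numpy as np

    def kernel_pca(X, n_components=2, degree=5):
        # Polynomial kernel matrix: K[i, j] = (x_i . x_j) ** degree, an inner
        # product in the implicit space of all degree-`degree` feature products.
        K = (X @ X.T) ** degree
        # Center the kernel matrix in the implicit feature space.
        n = K.shape[0]
        J = np.ones((n, n)) / n
        Kc = K - J @ K - K @ J + J @ K @ J
        # Eigenvectors of the centered kernel matrix give the expansion
        # coefficients of the nonlinear principal components.
        eigvals, eigvecs = np.linalg.eigh(Kc)
        order = np.argsort(eigvals)[::-1][:n_components]
        alphas, lambdas = eigvecs[:, order], eigvals[order]
        # Projections of the training points onto the leading components.
        return alphas * np.sqrt(lambdas)

scikit-learn's sklearn.decomposition.KernelPCA offers an off-the-shelf version of the same decomposition.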

Journal ArticleDOI

[...]

4,885 citations

Journal ArticleDOI

[...]

Naomi Altman
TL;DR: Kernel and nearest-neighbor regression estimators are local versions of univariate location estimators, and so they can readily be introduced to beginning students and consulting clients who are familiar with such summaries as the sample mean and median.
Abstract: Nonparametric regression is a set of techniques for estimating a regression curve without making strong assumptions about the shape of the true regression function. These techniques are therefore useful for building and checking parametric models, as well as for data description. Kernel and nearest-neighbor regression estimators are local versions of univariate location estimators, and so they can readily be introduced to beginning students and consulting clients who are familiar with such summaries as the sample mean and median.

3,505 citations
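Both estimators in the abstract above are local averages of the observed responses. A minimal sketch for one-dimensional inputs, assuming NumPy arrays x_train, y_train, and x_query; the function names and the bandwidth/k defaults are illustrative:

    import numpy as np

    def kernel_regression(x_train, y_train, x_query, bandwidth=0.5):
        # Nadaraya-Watson estimator: the fit at each query point is a
        # Gaussian-weighted mean of nearby responses, i.e. a local sample mean.
        w = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / bandwidth) ** 2)
        return (w @ y_train) / w.sum(axis=1)

    def knn_regression(x_train, y_train, x_query, k=5):
        # Nearest-neighbor estimator: average the responses of the k closest x's.
        dist = np.abs(x_query[:, None] - x_train[None, :])
        nearest = np.argsort(dist, axis=1)[:, :k]
        return y_train[nearest].mean(axis=1)

Both functions are local versions of the sample mean; replacing the mean with a median gives the analogous local median estimator mentioned in the abstract.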

Proceedings Article

[...]

03 Dec 1996
TL;DR: This work compares support vector regression (SVR) with a committee regression technique (bagging of regression trees) and with ridge regression done in feature space, and expects SVR to have advantages in high-dimensional settings because its optimization does not depend on the dimensionality of the input space.
Abstract: A new regression technique based on Vapnik's concept of support vectors is introduced. We compare support vector regression (SVR) with a committee regression technique (bagging) based on regression trees and ridge regression done in feature space. On the basis of these experiments, it is expected that SVR will have advantages in high dimensionality space because SVR optimization does not depend on the dimensionality of the input space.

3,498 citations
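A hedged sketch of the kind of comparison the abstract describes, using scikit-learn's SVR, a bagged ensemble of regression trees, and kernel ridge regression as a stand-in for ridge regression done in feature space; the synthetic data, hyperparameters, and scoring choice are illustrative and not taken from the original experiments.

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.ensemble import BaggingRegressor
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.kernel_ridge import KernelRidge
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 30))                       # higher-dimensional inputs
    y = X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=200)

    models = {
        "SVR (RBF kernel)": SVR(kernel="rbf", C=10.0, epsilon=0.1),
        "bagged regression trees": BaggingRegressor(DecisionTreeRegressor(),
                                                    n_estimators=50, random_state=0),
        "kernel ridge (feature-space ridge)": KernelRidge(kernel="rbf", alpha=1.0),
    }
    for name, model in models.items():
        # Cross-validated R^2 as a simple basis for comparing the three regressors.
        r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
        print(f"{name}: mean CV R^2 = {r2:.3f}")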

Journal ArticleDOI

[...]

TL;DR: In this article, the use of partial least squares (PLS) for handling collinearities among the independent variables X in multiple regression is discussed, and successive estimates are obtained using the residuals from the previous rank as a new dependent variable y.
Abstract: The use of partial least squares (PLS) for handling collinearities among the independent variables X in multiple regression is discussed. Consecutive estimates (rank 1, 2, ...) are obtained using the residuals from the previous rank as a new dependent variable y. The PLS method is equivalent to the conjugate gradient method used in Numerical Analysis for related problems. To estimate the "optimal" rank, cross validation is used. Jackknife estimates of the standard errors are thereby obtained with no extra computation. The PLS method is compared with ridge regression and principal components regression on a chemical example of modelling the relation between the measured biological activity and variables describing the chemical structure of a set of substituted phenethylamines.

2,129 citations
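A minimal sketch, assuming scikit-learn, of the three-way comparison in the abstract above: PLS, ridge regression, and principal components regression fitted to the same collinear predictors. The synthetic data, component counts, and regularization strength are illustrative, not those of the chemical example.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression, Ridge
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    latent = rng.normal(size=(100, 3))                   # a few underlying factors
    X = latent @ rng.normal(size=(3, 20))                # 20 highly collinear predictors
    X += 0.05 * rng.normal(size=X.shape)
    y = latent[:, 0] + 0.1 * rng.normal(size=100)

    models = {
        "PLS (rank 2)": PLSRegression(n_components=2),
        "ridge": Ridge(alpha=1.0),
        "PCR (2 components)": make_pipeline(PCA(n_components=2), LinearRegression()),
    }
    for name, model in models.items():
        # Cross-validation stands in for the paper's rank selection by cross validation.
        r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
        print(f"{name}: mean CV R^2 = {r2:.3f}")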


Network Information
Related Topics (5)
Support vector machine: 73.6K papers, 1.7M citations, 77% related
Regression analysis: 31K papers, 1.7M citations, 76% related
Artificial neural network: 207K papers, 4.5M citations, 75% related
Estimator: 97.3K papers, 2.6M citations, 73% related
Robustness (computer science): 94.7K papers, 1.6M citations, 73% related
Performance Metrics
No. of papers in the topic in previous years

Year    Papers
2023    43
2022    79
2021    75
2020    87
2019    83
2018    90