Journal ArticleDOI

The Collinearity Problem in Linear Regression. The Partial Least Squares (PLS) Approach to Generalized Inverses

Svante Wold, +3 more
- 01 Sep 1984
- Vol. 5, Iss. 3, pp. 735-743
TL;DR
In this article, the use of Partial Least Squares (PLS) for handling collinearities among the independent variables X in multiple regression is discussed, and successive estimates are obtained using the residuals from the previous rank as a new dependent variable y.
Abstract
The use of partial least squares (PLS) for handling collinearities among the independent variables X in multiple regression is discussed. Consecutive estimates (rank $1, 2, \cdots$) are obtained using the residuals from the previous rank as a new dependent variable y. The PLS method is equivalent to the conjugate gradient method used in numerical analysis for related problems. To estimate the “optimal” rank, cross-validation is used; jackknife estimates of the standard errors are thereby obtained with no extra computation. The PLS method is compared with ridge regression and principal components regression on a chemical example of modelling the relation between the measured biological activity and variables describing the chemical structure of a set of substituted phenethylamines.
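For illustration only (not the authors' original code), the following is a minimal PLS1-style sketch in Python/NumPy of the idea described above: components are extracted one at a time, and X and y are deflated so that the residual y from the previous rank serves as the new dependent variable. The function name pls1_fit and the synthetic data are my own.

    # Minimal PLS1 sketch (single response y). Each component uses the residual y
    # from the previous rank as the new dependent variable. Illustrative only.
    import numpy as np

    def pls1_fit(X, y, n_components):
        # Returns regression coefficients for centered X predicting centered y.
        X = X - X.mean(axis=0)
        y = y - y.mean()
        p = X.shape[1]
        W = np.zeros((p, n_components))   # weight vectors
        P = np.zeros((p, n_components))   # X loadings
        q = np.zeros(n_components)        # y loadings
        for a in range(n_components):
            w = X.T @ y                   # direction of max covariance with residual y
            w /= np.linalg.norm(w)
            t = X @ w                     # score vector
            tt = t @ t
            p_a = X.T @ t / tt            # X loading
            q_a = (y @ t) / tt            # y loading
            X = X - np.outer(t, p_a)      # deflate X to its rank-a residuals
            y = y - t * q_a               # residual y becomes the next dependent variable
            W[:, a], P[:, a], q[a] = w, p_a, q_a
        return W @ np.linalg.solve(P.T @ W, q)   # coefficients for the centered X

    # Synthetic example with a near-collinear column
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 6))
    X[:, 5] = X[:, 0] + 0.01 * rng.normal(size=40)
    y = X @ np.array([1.0, 0.5, 0.0, 0.0, 0.0, 1.0]) + 0.1 * rng.normal(size=40)
    print(pls1_fit(X, y, n_components=3))

In the paper the number of components (the rank) would be chosen by cross-validation; here it is fixed at 3 purely for brevity.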


Citations
Journal ArticleDOI

Principal component analysis

TL;DR: Principal Component Analysis is a multivariate exploratory analysis method useful for separating systematic variation from noise and for defining a space of reduced dimensionality that preserves the relevant information.
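As a side note, one common way to obtain such a reduced-dimension representation (not necessarily the procedure of the cited paper) is PCA via the singular value decomposition; the following sketch and its names are my own.

    # PCA sketch via SVD: project centered data onto the first k loading vectors,
    # treating the retained components as the systematic part. Illustrative only.
    import numpy as np

    def pca_scores(X, k):
        Xc = X - X.mean(axis=0)                        # center the data
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:k].T                           # scores in the reduced space

    X = np.random.default_rng(1).normal(size=(30, 5))
    print(pca_scores(X, 2).shape)                      # -> (30, 2)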
Journal ArticleDOI

PLS-regression: a basic tool of chemometrics

TL;DR: PLS-regression (PLSR), the PLS approach in its simplest and, in chemistry and technology, most used form (two-block predictive PLS), is a method for relating two data matrices, X and Y, by a linear multivariate model.
Journal ArticleDOI

Partial least-squares regression: a tutorial

TL;DR: In this paper, a tutorial on the Partial Least Squares (PLS) regression method is provided, and an algorithm for a predictive PLS and some practical hints for its use are given.
Journal ArticleDOI

A Statistical View of Some Chemometrics Regression Tools

TL;DR: In this article, the authors examined partial least squares and principal components regression from a statistical perspective and compared them with other statistical methods intended for those situations, such as variable subset selection and ridge regression.
Journal ArticleDOI

A Review of Process Fault Detection and Diagnosis Part I : Quantitative Model-Based Methods

TL;DR: This three-part series of papers provides a systematic and comparative study of various diagnostic methods from different perspectives; it broadly classifies fault diagnosis methods into three general categories and reviews them in three parts.
References
Journal ArticleDOI

Ridge regression: biased estimation for nonorthogonal problems

TL;DR: In this paper, an estimation procedure based on adding small positive quantities to the diagonal of X′X is proposed, along with a graphical method (the ridge trace) for showing in two dimensions the effects of nonorthogonality.
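A minimal sketch of that idea (my own illustration, not code from the cited paper): the ridge estimate adds a small positive constant k to the diagonal of X′X before solving the normal equations.

    # Ridge estimate: beta = (X'X + k*I)^{-1} X'y; k > 0 stabilizes the solution
    # when X'X is near-singular because of collinear predictors. Illustrative only.
    import numpy as np

    def ridge(X, y, k):
        p = X.shape[1]
        return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

    rng = np.random.default_rng(2)
    X = rng.normal(size=(25, 4))
    X[:, 3] = X[:, 0] + 1e-3 * rng.normal(size=25)     # near-collinear column
    y = X @ np.array([1.0, 0.0, 0.0, 1.0]) + 0.1 * rng.normal(size=25)
    print(ridge(X, y, k=0.1))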
Journal ArticleDOI

Methods of Conjugate Gradients for Solving Linear Systems

TL;DR: An iterative algorithm is given for solving a system Ax = k of n linear equations in n unknowns, and it is shown that this method is a special case of a very general method which also includes Gaussian elimination.
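For reference, a minimal sketch of the conjugate gradient iteration for a symmetric positive definite system (the paper's Ax = k, written here as Ax = b); this is an illustration, not the paper's original presentation.

    # Conjugate gradient for Ax = b, with A symmetric positive definite. Each step
    # minimizes along a direction conjugate (A-orthogonal) to the previous ones.
    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10):
        x = np.zeros(len(b))
        r = b - A @ x                      # residual
        d = r.copy()                       # search direction
        for _ in range(len(b)):            # in exact arithmetic, at most n steps
            Ad = A @ d
            alpha = (r @ r) / (d @ Ad)
            x = x + alpha * d
            r_new = r - alpha * Ad
            if np.linalg.norm(r_new) < tol:
                break
            d = r_new + ((r_new @ r_new) / (r @ r)) * d
            r = r_new
        return x

    M = np.random.default_rng(3).normal(size=(5, 5))
    A = M @ M.T + 5 * np.eye(5)            # symmetric positive definite test matrix
    b = np.ones(5)
    print(np.allclose(A @ conjugate_gradient(A, b), b))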
Journal ArticleDOI

Cross-Validatory Choice and Assessment of Statistical Predictions

TL;DR: In this article, a generalized form of the cross-validation criterion is applied to the choice and assessment of prediction using the data-analytic concept of a prescription, and examples used to illustrate the application are drawn from the problem areas of univariate estimation, linear regression and analysis of variance.
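In a similar spirit (a hypothetical illustration, not the criterion defined in the cited paper), leave-one-out cross-validation can be sketched as follows; here it is used to compare ridge constants, and all names are my own.

    # Leave-one-out cross-validation: predict each observation from a model fitted
    # to the remaining data and accumulate the squared prediction errors (PRESS).
    import numpy as np

    def loo_press(X, y, fit, predict):
        press = 0.0
        for i in range(len(y)):
            keep = np.arange(len(y)) != i               # drop observation i
            beta = fit(X[keep], y[keep])
            press += (y[i] - predict(beta, X[i])) ** 2
        return press

    # Example: comparing ridge constants k by their PRESS values
    rng = np.random.default_rng(4)
    X = rng.normal(size=(30, 3))
    y = X @ np.array([1.0, -1.0, 0.5]) + 0.2 * rng.normal(size=30)
    ridge_fit = lambda k: (lambda X_, y_: np.linalg.solve(X_.T @ X_ + k * np.eye(X_.shape[1]), X_.T @ y_))
    for k in (0.01, 0.1, 1.0):
        print(k, round(loo_press(X, y, ridge_fit(k), lambda beta, x: x @ beta), 3))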
Journal ArticleDOI

LSQR: An Algorithm for Sparse Linear Equations and Sparse Least Squares

TL;DR: Numerical tests are described comparing LSQR with several other conjugate-gradient algorithms, indicating that LSQR is the most reliable algorithm when A is ill-conditioned.