Journal ArticleDOI

Singular value decomposition and least squares solutions

TLDR
The decomposition A = UΣVᵀ is called the singular value decomposition (SVD); the diagonal elements of Σ are the non-negative square roots of the eigenvalues of AᵀA and are called singular values.
Abstract
Let $A$ be a real $m \times n$ matrix with $m \geqq n$. It is well known (cf. [4]) that
$$A = U \Sigma V^T \qquad (1)$$
where
$$U^T U = V^T V = V V^T = I_n \quad\text{and}\quad \Sigma = \operatorname{diag}(\sigma_1, \ldots, \sigma_n).$$
The matrix $U$ consists of $n$ orthonormalized eigenvectors associated with the $n$ largest eigenvalues of $AA^T$, and the matrix $V$ consists of the orthonormalized eigenvectors of $A^T A$. The diagonal elements of $\Sigma$ are the non-negative square roots of the eigenvalues of $A^T A$; they are called singular values. We shall assume that
$$\sigma_1 \geqq \sigma_2 \geqq \cdots \geqq \sigma_n \geqq 0.$$
Thus if $\operatorname{rank}(A) = r$, then $\sigma_{r+1} = \sigma_{r+2} = \cdots = \sigma_n = 0$. The decomposition (1) is called the singular value decomposition (SVD).
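As an illustration (not part of the original paper), here is a minimal numpy sketch that checks these identities numerically; `numpy.linalg.svd` returns the thin factorization of (1) directly:

```python
import numpy as np

# Random real m x n matrix with m >= n, as in the paper's setting.
rng = np.random.default_rng(0)
m, n = 6, 4
A = rng.standard_normal((m, n))

# Thin SVD: U (m x n) has orthonormal columns, V (n x n) is orthogonal.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Identity (1): A = U diag(sigma) V^T, up to rounding error.
assert np.allclose(A, U @ np.diag(s) @ Vt)
assert np.allclose(U.T @ U, np.eye(n)) and np.allclose(Vt @ Vt.T, np.eye(n))

# The singular values are the non-negative square roots of the
# eigenvalues of A^T A, sorted in decreasing order.
eigvals = np.linalg.eigvalsh(A.T @ A)[::-1]
assert np.allclose(s, np.sqrt(np.clip(eigvals, 0.0, None)))
```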


Citations
Journal ArticleDOI

Principal component analysis in linear systems: Controllability, observability, and model reduction

TL;DR: In this paper, it is shown that principal component analysis (PCA) is a powerful tool for coping with structural instability in dynamic systems, and it is proposed that the first step in model reduction is to apply the mechanics of minimal realization using the working subspaces identified by PCA.
Journal ArticleDOI

Linear prediction: A tutorial review

TL;DR: This paper gives an exposition of linear prediction in the analysis of discrete signals, in which a signal is modeled as a linear combination of its own past values and of the present and past values of a hypothetical input to a system whose output is the given signal.
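As a hedged illustration of the all-pole special case described in that summary (my sketch, not code from the tutorial), the predictor coefficients can be fit by ordinary least squares on past samples:

```python
import numpy as np

def lpc_coefficients(x: np.ndarray, p: int) -> np.ndarray:
    """Fit an order-p all-pole predictor x[t] ~ sum_k a[k] * x[t-k-1]
    by least squares (covariance method)."""
    # Each row of X holds the p samples preceding one target sample.
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a

# Example: a lightly noisy sinusoid is well predicted at order p = 2.
rng = np.random.default_rng(1)
t = np.arange(200)
x = np.sin(0.3 * t) + 0.01 * rng.standard_normal(200)
a = lpc_coefficients(x, p=2)
```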
Journal ArticleDOI

Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter

TL;DR: The generalized cross-validation (GCV) method, as discussed by the authors, is a generalization of Allen's PRESS that can be used in subset selection and singular value truncation, and even to choose from among mixtures of these methods.
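A minimal sketch (my illustration, not the authors' code) of how GCV can select a ridge parameter, assuming the standard formulation min ‖y − Xb‖² + λ‖b‖²; the shrinkage factors of the hat matrix come from the SVD of X, which ties the method back to the present paper:

```python
import numpy as np

def gcv_score(X: np.ndarray, y: np.ndarray, lam: float) -> float:
    """GCV criterion V(lam) for ridge regression, computed via the SVD."""
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    d = s**2 / (s**2 + lam)            # shrinkage factors of the hat matrix
    Uty = U.T @ y
    # ||y - yhat||^2, including the part of y outside the column space.
    resid_sq = np.sum(((1 - d) * Uty) ** 2) + (y @ y - Uty @ Uty)
    m = len(y)
    return (resid_sq / m) / ((1 - d.sum() / m) ** 2)

# Pick the ridge parameter minimizing GCV over a grid (illustration).
rng = np.random.default_rng(2)
X = rng.standard_normal((50, 8))
y = X @ rng.standard_normal(8) + 0.5 * rng.standard_normal(50)
lams = np.logspace(-4, 2, 61)
best_lam = min(lams, key=lambda lam: gcv_score(X, y, lam))
```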
Journal ArticleDOI

Orthogonal least squares learning algorithm for radial basis function networks

TL;DR: The authors propose an alternative learning procedure based on the orthogonal least-squares method, which provides a simple and efficient means for fitting radial basis function networks.
Journal ArticleDOI

PRIMUS: a Windows PC-based system for small-angle scattering data analysis

TL;DR: A program suite for one-dimensional small-angle scattering data processing, running on IBM-compatible PCs under Windows 9x/NT/2000/XP, is presented; PRIMUS enables model-independent singular value decomposition, or linear fitting if the scattering from the components is known.
References
Journal ArticleDOI

Calculating the Singular Values and Pseudo-Inverse of a Matrix

TL;DR: In this article, a numerically stable and fairly fast scheme is described to compute the unitary matrices U and V which transform a given matrix A into diagonal form Σ = U^* A V, thus exhibiting A's singular values on Σ's diagonal.
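A short sketch of the pseudo-inverse obtained from the SVD, as exhibited in that reference; the helper name `pinv_via_svd` and the tolerance handling are mine, not the paper's algorithm:

```python
import numpy as np

def pinv_via_svd(A: np.ndarray, rtol: float = 1e-12) -> np.ndarray:
    """Moore-Penrose pseudo-inverse A+ = V diag(1/sigma) U^T, inverting
    only singular values above a relative tolerance."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.zeros_like(s)
    keep = s > rtol * s[0]             # s is sorted, s[0] is the largest
    s_inv[keep] = 1.0 / s[keep]
    return Vt.T @ np.diag(s_inv) @ U.T

# Agrees with numpy's own pseudo-inverse on a random matrix.
rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))
assert np.allclose(pinv_via_svd(A), np.linalg.pinv(A))
```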
Journal ArticleDOI

Linear least squares solutions by householder transformations

TL;DR: Because the Euclidean norm is unitarily invariant, ‖b − Ax‖ = ‖c − QAx‖ for an orthogonal Q with c = Qb; a vector x minimizing this norm is determined by reducing A to triangular form via Householder transformations and solving against the first n components of c.
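For illustration, a compact numpy sketch of least squares via Householder reflections in the spirit of that reference; the function name `householder_lstsq` is mine, and a production implementation would use column pivoting as the paper does:

```python
import numpy as np

def householder_lstsq(A: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Solve min ||b - Ax||_2 by Householder triangularization."""
    m, n = A.shape
    R = A.astype(float).copy()
    c = b.astype(float).copy()
    for k in range(n):
        # Householder vector that zeroes R[k+1:, k].
        x = R[k:, k]
        v = x.copy()
        v[0] += np.sign(x[0] if x[0] != 0 else 1.0) * np.linalg.norm(x)
        v /= np.linalg.norm(v)
        # Apply the reflection I - 2 v v^T to trailing columns and to c.
        R[k:, k:] -= 2.0 * np.outer(v, v @ R[k:, k:])
        c[k:] -= 2.0 * v * (v @ c[k:])
    # Back-substitution on the leading n x n triangle against the
    # first n components of the transformed right-hand side.
    return np.linalg.solve(np.triu(R[:n, :n]), c[:n])

# Matches numpy's least-squares solution on a random full-rank system.
rng = np.random.default_rng(4)
A = rng.standard_normal((8, 3))
b = rng.standard_normal(8)
assert np.allclose(householder_lstsq(A, b),
                   np.linalg.lstsq(A, b, rcond=None)[0])
```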