
Showing papers on "Singular value decomposition published in 1976"


Journal ArticleDOI
TL;DR: Two generalizations of the singular value decomposition are given in this article; these generalizations provide a unified way of regarding certain matrix problems and the numerical techniques which are used to solve them.
Abstract: Two generalizations of the singular value decomposition are given. These generalizations provide a unified way of regarding certain matrix problems and the numerical techniques which are used to solve them.

600 citations
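The ordinary decomposition that the paper above generalizes can be sketched in a few lines (a minimal NumPy illustration of A = UΣVᵀ, not the paper's generalized algorithms):

```python
import numpy as np

# Any real m x n matrix A factors as A = U @ diag(s) @ Vt, with U and Vt
# having orthonormal columns/rows and s the non-negative singular values.
A = np.array([[3.0, 1.0], [1.0, 3.0], [0.0, 2.0]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# The factors reconstruct A to machine precision,
# and the singular values come out sorted in descending order.
print(np.allclose(A, U @ np.diag(s) @ Vt))  # True
print(np.all(s[:-1] >= s[1:]))              # True
```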


Journal ArticleDOI
TL;DR: In this article, singular value decomposition (SVD) and pseudoinverse techniques are used to restore images degraded by space-variant point spread functions (SVPSFs).
Abstract: The use of singular value decomposition (SVD) techniques in digital image processing is of considerable interest for those facilities with large computing power and stringent imaging requirements. The SVD methods are useful for image as well as quite general point spread function (impulse response) representations. The methods represent simple extensions of the theory of linear filtering. Image enhancement examples will be developed illustrating these principles. The most interesting cases of image restoration are those which involve space-variant imaging systems. The SVD, combined with pseudoinverse techniques, provides insight into these types of restorations. Illustrations of large-scale N² × N² point spread function matrix representations are discussed along with separable space-variant N² × N² point spread function matrix examples. Finally, analysis and methods for obtaining a pseudoinverse of separable space-variant point spread functions (SVPSFs) are presented with a variety of object and imaging system degradations.

362 citations
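The restoration idea above can be sketched in one dimension: treat the blur as a general matrix H with no convolution structure (a hypothetical toy system, not the paper's N² × N² examples) and invert it through a tolerance-truncated SVD pseudoinverse.

```python
import numpy as np

# Hypothetical 1-D "space-variant blur": each row of H is a different kernel,
# so H has no convolution structure and must be inverted as a general matrix.
rng = np.random.default_rng(0)
n = 8
H = np.eye(n) + 0.3 * rng.random((n, n))   # diagonally dominant toy blur matrix
x_true = rng.random(n)                     # unknown object
y = H @ x_true                             # observed (noise-free) image

# Pseudoinverse via the SVD, zeroing reciprocals of singular values below a
# tolerance so the inversion stays stable on rank-deficient problems.
U, s, Vt = np.linalg.svd(H)
tol = 1e-10
s_inv = np.where(s > tol, 1.0 / s, 0.0)
H_pinv = Vt.T @ np.diag(s_inv) @ U.T

x_rec = H_pinv @ y
print(np.allclose(x_rec, x_true))  # True for this well-conditioned example
```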


Journal ArticleDOI
TL;DR: A new transform method is presented in which the singular values and singular vectors of an image are computed and transmitted instead of transform coefficients, together with a self-adaptive set of experimental results.
Abstract: The numerical techniques of transform image coding are well known in the image bandwidth compression literature. This concise paper presents a new transform method in which the singular values and singular vectors of an image are computed and transmitted instead of transform coefficients. The singular value decomposition (SVD) method is known to be the deterministically optimal transform for energy compaction [2]. A systems implementation is hypothesized, and a variety of coding strategies is developed. Statistical properties of the SVD are discussed and a self-adaptive set of experimental results is presented. Imagery compressed to 1, 1.5, and 2.5 bits per pixel with less than 1.6, 1, and 1/3 percent mean-square error, respectively, is displayed. Finally, additional image coding scenarios are postulated for further consideration.

300 citations
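The coding idea reduces to keeping only the k largest singular triplets. A minimal sketch on a synthetic 16×16 "image" (not the paper's transmission scheme or bit-allocation strategy):

```python
import numpy as np

# Synthetic image: a smooth ramp (highly compressible) plus noise.
rng = np.random.default_rng(1)
img = rng.random((16, 16)) + np.outer(np.arange(16), np.arange(16)) / 16.0

U, s, Vt = np.linalg.svd(img, full_matrices=False)
k = 4
img_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]   # best rank-k approximation

# Values transmitted: k singular values + k left and k right vectors.
sent = k * (1 + img.shape[0] + img.shape[1])
mse = np.mean((img - img_k) ** 2)
print(sent, img.size)                            # 132 256: fewer numbers than raw
print(mse < np.mean((img - img.mean()) ** 2))    # beats the constant-mean baseline: True
```

By the Eckart–Young theorem the rank-k truncation is the optimal rank-k approximation in mean-square error, which is the "deterministically optimal energy compaction" property the abstract cites.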


Journal ArticleDOI
TL;DR: This paper is intended as a tutorial review of certain digital image processing transform techniques utilizing the notion of outer product expansions, together with an implementation of the singular value decomposition (SVD) of large images.
Abstract: This paper is intended as a tutorial review of certain digital image processing transform techniques utilizing the notion of outer product expansions. Examples from Fourier, Walsh, Haar, and other well known transforms are reviewed in the notation of matrix-vector outer products, and an implementation of the singular value decomposition (SVD) of large images is presented. The use of the SVD as an aid in image restoration utilizing the pseudoinverse is presented. Conditions on the point spread matrix are investigated in the light of singular value decomposition, Kronecker products, and general imaging conditions.

51 citations
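The outer-product notation the tutorial uses can be made concrete: the SVD writes a matrix as a sum of rank-one outer products, A = Σᵢ sᵢ uᵢ vᵢᵀ (a NumPy sketch of that expansion):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.random((5, 7))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rebuild A one rank-one outer product at a time.
A_sum = np.zeros_like(A)
for i in range(len(s)):
    A_sum += s[i] * np.outer(U[:, i], Vt[i, :])

print(np.allclose(A_sum, A))  # True: the expansion is exact
```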


Journal ArticleDOI
TL;DR: The singular value decomposition of a matrix is used to derive systematically the Moore–Penrose inverse for a matrix bordered by a row and a column, in addition to the Moore–Penrose inverses for the associated principal Schur complements.
Abstract: The singular value decomposition of a matrix is used to derive systematically the Moore–Penrose inverse for a matrix bordered by a row and a column, in addition to the Moore–Penrose inverse for the associated principal Schur complements.

32 citations
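The SVD route to the Moore–Penrose inverse can be sketched directly and checked against the four Penrose conditions (a generic sketch; the paper's bordered-matrix and Schur-complement recursion is not reproduced here):

```python
import numpy as np

# A deliberately rank-deficient matrix (rank 1).
A = np.array([[1.0, 2.0], [2.0, 4.0], [0.0, 0.0]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Invert only the nonzero singular values.
s_inv = np.where(s > 1e-12, 1.0 / s, 0.0)
A_plus = Vt.T @ np.diag(s_inv) @ U.T

# The four Penrose conditions characterize A_plus uniquely.
print(np.allclose(A @ A_plus @ A, A))             # A X A = A
print(np.allclose(A_plus @ A @ A_plus, A_plus))   # X A X = X
print(np.allclose((A @ A_plus).T, A @ A_plus))    # A X symmetric
print(np.allclose((A_plus @ A).T, A_plus @ A))    # X A symmetric
```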


Journal ArticleDOI
TL;DR: A compact program for performing a variety of regression and principal component computations is described in this article, where a singular value decomposition of the data matrix is used which permits calculations involving rank deficient data to be handled satisfactorily.
Abstract: A compact program for performing a variety of regression and principal component computations is described. A singular value decomposition of the data matrix is used which permits calculations involving rank deficient data to be handled satisfactorily. The importance of avoiding the calculation of a sum of squares and cross‐products matrix is demonstrated by an example.

12 citations
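The two points the abstract makes, handling rank-deficient data and avoiding the cross-products matrix, show up in a few lines (a generic NumPy sketch, not the paper's program):

```python
import numpy as np

# Rank-deficient design: the second column is an exact multiple of the first,
# so X'X is singular and the normal equations cannot be solved by inversion.
x1 = np.array([1.0, 2.0, 3.0, 4.0])
X = np.column_stack([x1, 2.0 * x1])
y = np.array([2.0, 4.0, 6.0, 8.0])

# SVD-based least squares (what np.linalg.lstsq uses) still returns the
# minimum-norm solution and fits the data exactly.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(X @ beta, y))       # True

# The cross-products matrix X'X has rank 1, confirming it is not invertible.
print(np.linalg.matrix_rank(X.T @ X))  # 1
```

Forming X'X also squares the condition number, so even when it is technically invertible, the SVD route loses far less precision.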


Journal ArticleDOI
TL;DR: In this article, the authors make use of Householder transformations and recursive triangulation solutions in presenting numerical algorithms for the computation of 3SLS and k-class estimates; the singular value decomposition is valuable in providing additional information in k-class estimation.

11 citations
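The Householder-transformation approach to least squares can be sketched through an orthogonal QR factorization and back-substitution (a generic sketch; the paper's 3SLS and k-class estimators are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.random((10, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true                      # noise-free, so recovery should be exact

# NumPy's qr is backed by LAPACK's Householder-based factorization.
Q, R = np.linalg.qr(X)
beta = np.linalg.solve(R, Q.T @ y)     # back-substitute R @ beta = Q' y

print(np.allclose(beta, beta_true))    # True
```

Because Q is orthogonal, this never forms X'X, which is exactly the numerical advantage over the normal-equations route.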


Journal ArticleDOI
TL;DR: If the data are such that the eigenvectors are orthogonal functions of time and they have some recognizable non-random structure permitting predictability in time, then the observed response at time t can be used with the extrapolated forcing function to predict some physical quantity.
Abstract: The theorem of singular value decomposition is used to represent a data matrix X as the product of a system with a response R to a forcing function F. Algebraically, R is the matrix of principal components and F the transpose of the matrix of eigenvectors of X′X. If the data are such that the eigenvectors are orthogonal functions of time and they have some recognizable non-random structure permitting predictability in time, then the observed response at time t can be used with the extrapolated forcing function to predict some physical quantity (e.g., temperature, pressure). This method is called the time extrapolated eigenvector prediction (TEEP). An example is given to illustrate the method with a known forcing function, the annual solar heating cycle. We have access to efficient computer routines which will facilitate an extension to much larger data sets.

11 citations
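The factorization underlying TEEP, X = RF with F the transpose of the eigenvector matrix of X′X and R the principal components, can be verified directly with the SVD (a sketch of the decomposition only; the time-extrapolation step is not shown):

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.random((12, 3))   # 12 "times" x 3 "stations"

U, s, Vt = np.linalg.svd(X, full_matrices=False)
F = Vt                    # rows of F are the eigenvectors of X'X
R = X @ Vt.T              # principal-component scores: the "response"

print(np.allclose(R @ F, X))                               # X = R F: True
print(np.allclose(X.T @ X @ Vt.T, Vt.T @ np.diag(s**2)))   # X'X V = V S^2: True
```

The second check confirms the abstract's identification: the columns of V are the eigenvectors of X′X, with eigenvalues equal to the squared singular values.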


Journal ArticleDOI
TL;DR: In this paper, a generalized definition of the sth-order interactions is given for multiway arrays, from which properties similar to those of the ANOVA model are derived, and the problem of maximizing the highest-order correlation function is solved.
Abstract: A generalized definition of the sth-order interactions is given for multiway arrays, from which properties similar to those of the ANOVA model are derived. The problem of maximizing the highest-order correlation function is solved. Moreover, an expression decomposing multiway arrays is shown as a recurrent algorithm that is an extension of the singular value decomposition of matrices. Applications of the decomposition to multiway contingency tables and to multiway classified data are outlined. This research develops Iwatsubo (1974) and Yoshizawa (1975), and relates to Bahadur's expression and Lancaster's definition of no interaction for contingency tables. The definition and expression given here are more general and constructive than theirs.

4 citations
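One common way the matrix SVD extends to multiway arrays is via SVDs of the mode-unfoldings (a HOSVD-style sketch in modern conventions, not the paper's recurrent algorithm):

```python
import numpy as np

rng = np.random.default_rng(5)
T = rng.random((3, 4, 5))   # a 3-way array

for mode in range(3):
    # Unfold: bring `mode` to the front, flatten the remaining modes.
    M = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    print(mode, np.allclose(M, U @ np.diag(s) @ Vt))  # each unfolding factors exactly
```

Each unfolding is an ordinary matrix, so the full machinery of the matrix SVD applies mode by mode; the extension consists in relating the three factorizations to one another.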