
Showing papers on "QR decomposition published in 1977"


Journal ArticleDOI
TL;DR: In this paper, bounds on $\| W \|$ and $\| F \|$ in terms of $\| E \|$ are given, along with perturbation bounds for the closely related Cholesky factorization of a positive definite matrix B into the product of a lower triangular matrix and its transpose.
Abstract: Let A be an $m \times n$ matrix of rank n. The $QR$ factorization of A decomposes A into the product of an $m \times n$ matrix Q with orthonormal columns and a nonsingular upper triangular matrix R. The decomposition is essentially unique, Q being determined up to the signs of its columns and R up to the signs of its rows. If E is an $m \times n$ matrix such that $A + E$ is of rank n, then $A + E$ has an essentially unique factorization $(Q + W)(R + F)$. In this paper bounds on $\| W \|$ and $\| F \|$ in terms of $\| E \|$ are given. In addition perturbation bounds are given for the closely related Cholesky factorization of a positive definite matrix B into the product $R^T R$ of a lower triangular matrix and its transpose.

105 citations
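As a quick numerical illustration of the setting (our own sketch with hypothetical data, not code from the paper), the snippet below perturbs a full-rank A by a small E, computes both QR factorizations with NumPy, and resolves the sign ambiguity noted in the abstract so that the factor perturbations W and F can be compared directly with E:

```python
import numpy as np

# Hypothetical illustration (ours, not the paper's): perturb a full-rank A
# by a small E and compare the QR-factor perturbations W and F with E.
rng = np.random.default_rng(0)
m, n = 8, 4
A = rng.standard_normal((m, n))
E = 1e-8 * rng.standard_normal((m, n))

Q, R = np.linalg.qr(A)        # thin QR: Q is m-by-n with orthonormal columns
Qp, Rp = np.linalg.qr(A + E)  # factors Q + W and R + F of the perturbed matrix

# Resolve the sign ambiguity noted in the abstract: flip signs so that both
# R factors have a positive diagonal, making the factorizations comparable.
for Qk, Rk in ((Q, R), (Qp, Rp)):
    s = np.where(np.diag(Rk) < 0, -1.0, 1.0)
    Rk *= s[:, None]   # flip rows of R
    Qk *= s[None, :]   # flip the matching columns of Q

W, F = Qp - Q, Rp - R
print(np.linalg.norm(W), np.linalg.norm(F), np.linalg.norm(E))
# All three norms come out comparably tiny, consistent with
# perturbation bounds of the form ||W||, ||F|| = O(||E||).
```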


Journal ArticleDOI
TL;DR: An algorithm for solving the matrix equation $X = FXF^T + S$, which is important in control system design, is presented in this paper and is based on the QR algorithm for finding the eigenvalues of a matrix.

Abstract: A new algorithm for solving the matrix equation $X = FXF^T + S$, which is important in control system design, is presented in this paper. The algorithm is based on the QR algorithm for finding the eigenvalues of a matrix and works efficiently for large-dimensional problems. A simple example is given to illustrate the algorithm. The method is also applicable to other types of equations, such as the Lyapunov equation $A^T X + XA + B = 0$.

57 citations
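For readers who want to experiment: $X = FXF^T + S$ is what is now usually called a discrete Lyapunov (Stein) equation, and SciPy ships a solver for it. A minimal numerical check (our sketch, not the paper's implementation):

```python
import numpy as np
from scipy import linalg

# Minimal sketch (ours): solve X = F X F^T + S with SciPy's
# discrete Lyapunov solver and verify the solution.
rng = np.random.default_rng(1)
n = 5
F = rng.standard_normal((n, n))
F /= 2.0 * np.max(np.abs(np.linalg.eigvals(F)))  # spectral radius < 1 => unique solution
S = rng.standard_normal((n, n))
S = S @ S.T                                       # symmetric right-hand side

X = linalg.solve_discrete_lyapunov(F, S)          # solves F X F^T - X + S = 0
print(np.allclose(X, F @ X @ F.T + S))            # True: X satisfies the equation
```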


Journal ArticleDOI
TL;DR: Methods for obtaining the estimates of coefficients and other statistics in the usual least squares regression model are discussed which proceed by forming a version of the Cholesky decomposition of the correlation matrix, and are well adapted to updating as predictor variables are added to or deleted from the regression.

Abstract: Methods for obtaining the estimates of coefficients and other statistics in the usual least squares regression model are discussed which proceed by forming a version of the Cholesky decomposition of the correlation matrix, and are well adapted to updating as predictor variables are added to or deleted from the regression. Connections with Jordan elimination and with methods based on Householder decompositions are explored. Brief comments are made on the relevance of the methods to the computations of multivariate analysis.

11 citations
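A simplified sketch of the basic idea (ours, without the incremental updating the paper emphasizes): standardize the data, Cholesky-factor the predictor correlation matrix, and solve two triangular systems for the standardized coefficients.

```python
import numpy as np
from scipy.linalg import solve_triangular

# Simplified sketch (ours); the paper's methods also support adding or
# deleting predictors without refactoring from scratch.
def regress_via_cholesky(X, y):
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)    # standardized predictors
    ys = (y - y.mean()) / y.std()                # standardized response
    n = len(y)
    Rxx = Xs.T @ Xs / n                          # predictor correlation matrix
    rxy = Xs.T @ ys / n                          # predictor-response correlations
    L = np.linalg.cholesky(Rxx)                  # Rxx = L L^T, L lower triangular
    z = solve_triangular(L, rxy, lower=True)     # forward substitution
    beta_std = solve_triangular(L.T, z)          # back substitution
    beta = beta_std * y.std() / X.std(axis=0)    # coefficients in original units
    intercept = y.mean() - X.mean(axis=0) @ beta
    return beta, intercept

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(100)
print(regress_via_cholesky(X, y))                # recovers roughly (1, -2, 0.5)
```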


Journal ArticleDOI
TL;DR: Important classes of algorithms for unconstrained minimization, when applied to a quadratic with Hessian A, may be regarded as alternative ways to effect certain basic matrix factorizations of, or with respect to, A.

3 citations
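To make the statement concrete (our illustration, under the standard reading that conjugate-direction methods are one such class): running conjugate gradients on $f(x) = \frac{1}{2} x^T A x - b^T x$ produces A-conjugate search directions, so the matrix P collecting them satisfies $P^T A P = D$ with D diagonal; the minimizer implicitly effects a factorization with respect to A.

```python
import numpy as np

# Illustration (ours, not from the paper): conjugate gradients on a quadratic
# with SPD Hessian A generates A-conjugate directions p_1..p_n, so the matrix
# P = [p_1 ... p_n] satisfies P^T A P = diagonal.
rng = np.random.default_rng(3)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)      # symmetric positive definite Hessian
b = rng.standard_normal(n)

x = np.zeros(n)
r = b - A @ x                    # residual = negative gradient of the quadratic
p = r.copy()
dirs = []
for _ in range(n):
    dirs.append(p.copy())
    Ap = A @ p
    alpha = (r @ r) / (p @ Ap)   # exact line search along p
    x += alpha * p
    r_new = r - alpha * Ap
    beta = (r_new @ r_new) / (r @ r)
    p = r_new + beta * p
    r = r_new

P = np.column_stack(dirs)
D = P.T @ A @ P
off = D - np.diag(np.diag(D))
print(np.max(np.abs(off)))       # ~ machine precision: P^T A P is diagonal
```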