Topic
QR decomposition
About: QR decomposition is a research topic. In total, 3504 publications have been published within this topic, receiving 100599 citations. The topic is also known as: QR factorization.
Papers
TL;DR: It is shown that essentially all standard linear algebra operations, including LU decomposition, QR decomposition, linear equation solving, matrix inversion, solving least squares problems, (generalized) eigenvalue problems and the singular value decomposition, can also be done stably (in a normwise sense) in O(n^(ω+η)) operations.
Abstract: In Demmel et al. (Numer. Math. 106(2), 199–224, 2007) we showed that a large class of fast recursive matrix multiplication algorithms is stable in a normwise sense, and that in fact if multiplication of n-by-n matrices can be done by any algorithm in O(n^(ω+η)) operations for any η > 0, then it can be done stably in O(n^(ω+η)) operations for any η > 0. Here we extend this result to show that essentially all standard linear algebra operations, including LU decomposition, QR decomposition, linear equation solving, matrix inversion, solving least squares problems, (generalized) eigenvalue problems and the singular value decomposition, can also be done stably (in a normwise sense) in O(n^(ω+η)) operations.
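The normwise stability claimed above is easy to observe numerically. The sketch below (illustrative only, not the paper's fast algorithm) uses NumPy's Householder-based QR and checks the two quantities that normwise backward stability controls: the relative residual ||A − QR|| / ||A|| and the orthogonality defect ||QᵀQ − I||, both of which should be a small multiple of machine epsilon.

```python
import numpy as np

# Illustrative stability check: LAPACK's Householder QR (as wrapped by
# NumPy) is normwise backward stable, so the residual and the loss of
# orthogonality in Q are both on the order of machine epsilon (~2.2e-16).
rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))

Q, R = np.linalg.qr(A)

residual = np.linalg.norm(A - Q @ R) / np.linalg.norm(A)
orth_defect = np.linalg.norm(Q.T @ Q - np.eye(n))

print(f"relative residual:    {residual:.2e}")
print(f"orthogonality defect: {orth_defect:.2e}")
```

Both printed values stay near machine precision even as n grows, which is exactly the normwise sense of stability the paper works in.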
196 citations
TL;DR: Sufficient conditions for the existence of smooth orthonormal decompositions of smooth time-varying matrices, and their block analogues, are given, and differential equations for the factors are derived.
Abstract: In this paper we consider smooth orthonormal decompositions of smooth time-varying matrices. Among others, we consider QR, Schur, and singular value decompositions, and their block analogues. Sufficient conditions for the existence of such decompositions are given and differential equations for the factors are derived. Generic smoothness of these factors is also discussed.
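The existence statement can be illustrated numerically (a sketch under my own assumptions, not the paper's construction): for a smooth, everywhere full-rank curve A(t), the QR factorization with the sign convention diag(R) > 0 is unique, and that normalization selects a branch of factors that varies continuously with t.

```python
import numpy as np

def qr_positive(A):
    """QR with the convention diag(R) > 0, which makes the factors unique."""
    Q, R = np.linalg.qr(A)
    signs = np.sign(np.diag(R))
    signs[signs == 0] = 1.0
    return Q * signs, signs[:, None] * R

def A_of_t(t):
    # A smooth, clearly full-rank curve of 3x3 matrices (illustrative choice).
    return 2.0 * np.eye(3) + 0.2 * np.array([
        [np.sin(t),         np.cos(t),     0.0],
        [0.0,               np.sin(2 * t), np.cos(t)],
        [np.cos(2 * t),     0.0,           np.sin(t)],
    ])

ts = np.linspace(0.0, 1.0, 1000)
Qs = [qr_positive(A_of_t(t))[0] for t in ts]

# With the sign normalization, successive Q(t) samples differ by O(dt):
# no sign flips or jumps, i.e. a continuous branch of the factor.
max_jump = max(np.linalg.norm(Q1 - Q0) for Q0, Q1 in zip(Qs, Qs[1:]))
print(f"largest step-to-step change in Q: {max_jump:.2e}")
```

Without the sign normalization, individual calls to a QR routine may flip the signs of columns between nearby values of t, which is precisely the discontinuity the smooth-decomposition theory rules out.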
194 citations
TL;DR: In this paper, a method for structural analysis of multivariate data is proposed that combines features of regression analysis and principal component analysis, which is based on the generalized singular value decomposition of a matrix with certain metric matrices.
Abstract: A method for structural analysis of multivariate data is proposed that combines features of regression analysis and principal component analysis. In this method, the original data are first decomposed into several components according to external information. The components are then subjected to principal component analysis to explore structures within the components. It is shown that this requires the generalized singular value decomposition of a matrix with certain metric matrices. The numerical method based on the QR decomposition is described, which simplifies the computation considerably. The proposed method includes a number of interesting special cases, whose relations to existing methods are discussed. Examples are given to demonstrate practical uses of the method.
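A rough numerical sketch of the idea (the names and setup are illustrative, not the paper's notation): split a data matrix Y into the part explained by an external design matrix G and an orthogonal residual, then run PCA on each component. The thin QR of G supplies an orthonormal basis for col(G), which is what makes the projection, and hence the whole computation, simple.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 100, 6, 2
G = rng.standard_normal((n, k))                  # external information
Y = G @ rng.standard_normal((k, p)) + 0.1 * rng.standard_normal((n, p))

Q, _ = np.linalg.qr(G)                           # thin QR: Q spans col(G)
Y_explained = Q @ (Q.T @ Y)                      # component explained by G
Y_residual = Y - Y_explained                     # component orthogonal to col(G)

def pca_variances(X):
    """PCA of a component via the SVD of the centered matrix."""
    Xc = X - X.mean(axis=0)
    s = np.linalg.svd(Xc, compute_uv=False)
    return s**2 / (X.shape[0] - 1)

print("explained-part PC variances:", pca_variances(Y_explained)[:3])
print("residual-part PC variances: ", pca_variances(Y_residual)[:3])
```

The two components sum back to Y exactly, and the residual is orthogonal to col(G) by construction, so structure found within each component cannot leak into the other.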
188 citations
TL;DR: In this paper, a constructive proof of the existence of the rank-revealing QR factorization (RRQR) of any matrix A of size m × n with numerical rank r is given. It is not otherwise obvious how to find an RRQR of A when A is numerically rank-deficient.
Abstract: T. Chan has noted that, even when the singular value decomposition of a matrix A is known, it is still not obvious how to find a rank-revealing QR factorization (RRQR) of A if A has numerical rank deficiency. This paper offers a constructive proof of the existence of the RRQR factorization of any matrix A of size m × n with numerical rank r. The bounds derived in this paper that guarantee the existence of the RRQR are all of order √n, in comparison with Chan's O(2^(n-r)). It has been known for some time that if A is only numerically rank-one deficient, then the column permutation Π of A that guarantees a small r_nn in the QR factorization of AΠ can be obtained by inspecting the size of the elements of the right singular vector of A corresponding to the smallest singular value of A. To some extent, our paper generalizes this well-known result. We consider the interplay between two important matrix decompositions: the singular value decomposition and the QR factorization of a matrix A. In particular, we are interested in the case when A is singular or nearly singular. It is well known that for any A ∈ R^(m×n) (a real matrix with m rows and n columns, where without loss of generality we assume m ≥ n) there are orthogonal matrices U and V such that A = U Σ V^T, where Σ is a diagonal matrix with nonnegative diagonal elements σ_1 ≥ σ_2 ≥ … ≥ σ_n ≥ 0. This decomposition is the singular value decomposition (SVD) of A, and the σ_i are the singular values of A. The columns of V are the right singular vectors of A, and the columns of U are the left singular vectors of A.
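The rank-revealing behavior described above can be seen with LAPACK's column-pivoted QR, exposed through SciPy. This is the classical Businger–Golub pivoting heuristic, not the provably bounded RRQR constructed in the paper, but on typical matrices it reveals the numerical rank the same way: for A of numerical rank r, the pivoted factorization AΠ = QR concentrates the small part of A in the trailing block of R, so the diagonal of R drops sharply after position r.

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(2)
m, n, r = 60, 40, 10

# A matrix of numerical rank r: an exact rank-r product plus tiny noise.
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
A += 1e-10 * rng.standard_normal((m, n))

# Column-pivoted QR (Businger-Golub); mode='r' returns only the R factor.
R = qr(A, mode='r', pivoting=True)[0]
diag = np.abs(np.diag(R))
print("leading |r_ii|: ", diag[:r])
print("trailing |r_ii|:", diag[r:r + 3])   # tiny: reveals numerical rank r
```

On adversarial examples (e.g. Kahan matrices) this heuristic can fail to reveal the rank, which is exactly why existence proofs with polynomial bounds, as in this paper, matter.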
185 citations
TL;DR: The versatility of the SDF framework is demonstrated by means of four diverse applications, which are all solved entirely within Tensorlab's DSL.
Abstract: We present structured data fusion (SDF) as a framework for the rapid prototyping of knowledge discovery in one or more possibly incomplete data sets. In SDF, each data set—stored as a dense, sparse, or incomplete tensor—is factorized with a matrix or tensor decomposition. Factorizations can be coupled, or fused, with each other by indicating which factors should be shared between data sets. At the same time, factors may be imposed to have any type of structure that can be constructed as an explicit function of some underlying variables. With the right choice of decomposition type and factor structure, even well-known matrix factorizations such as the eigenvalue decomposition, singular value decomposition and QR factorization can be computed with SDF. A domain specific language (DSL) for SDF is implemented as part of the software package Tensorlab, with which we offer a library of tensor decompositions and factor structures to choose from. The versatility of the SDF framework is demonstrated by means of four diverse applications, which are all solved entirely within Tensorlab’s DSL.
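Tensorlab itself is a MATLAB package, so the sketch below only mimics the core "fusion" idea in NumPy on the simplest possible case: two matrices X and Y are jointly factorized as X ≈ A Bᵀ and Y ≈ A Cᵀ with a shared (coupled) factor A, fitted by alternating least squares. All names and the update scheme here are my own illustration, not Tensorlab's DSL or algorithms.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m1, m2, r = 50, 30, 20, 4

# Two data sets generated from one shared factor A_true.
A_true = rng.standard_normal((n, r))
X = A_true @ rng.standard_normal((m1, r)).T
Y = A_true @ rng.standard_normal((m2, r)).T

A = rng.standard_normal((n, r))          # random initialization
for _ in range(50):
    # With A fixed, B and C solve independent least-squares problems.
    B = np.linalg.lstsq(A, X, rcond=None)[0].T
    C = np.linalg.lstsq(A, Y, rcond=None)[0].T
    # With B and C fixed, the shared A couples both data sets:
    # minimize ||X - A B^T||^2 + ||Y - A C^T||^2 over A.
    M = np.vstack([B, C])                # (m1 + m2) x r
    Z = np.hstack([X, Y])                # n x (m1 + m2)
    A = np.linalg.lstsq(M, Z.T, rcond=None)[0].T

err = np.linalg.norm(X - A @ B.T) + np.linalg.norm(Y - A @ C.T)
print(f"total residual after coupled ALS: {err:.2e}")
```

The only "fusion" ingredient is the last update: because A must explain both X and Y at once, its least-squares problem stacks the two data sets, which is the matrix analogue of sharing a factor between coupled decompositions in SDF.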
185 citations