Open Access Journal Article (DOI)

Efficient algorithms for computing a strong rank-revealing QR factorization

Ming Gu, Stanley C. Eisenstat
01 Jul 1996
SIAM Journal on Scientific Computing, Vol. 17, Iss. 4, pp. 848-869
TLDR
Two algorithms are presented for computing rank-revealing QR factorizations that are nearly as efficient as QR with column pivoting for most problems and take O(mn²) floating-point operations in the worst case.
Abstract
Given an m × n matrix M with m ≥ n, it is shown that there exists a permutation Π and an integer k such that the QR factorization MΠ = Q (A_k B_k; 0 C_k) reveals the numerical rank of M: the k × k upper-triangular matrix A_k is well conditioned, ‖C_k‖₂ is small, and B_k is linearly dependent on A_k with coefficients bounded by a low-degree polynomial in n. Existing rank-revealing QR (RRQR) algorithms are related to such factorizations, and two algorithms are presented for computing them. The new algorithms are nearly as efficient as QR with column pivoting for most problems and take O(mn²) floating-point operations in the worst case.
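As an illustration of the block structure above, here is a minimal sketch using SciPy's column-pivoted QR (the cheaper, weaker factorization that the paper's algorithms improve upon, not the strong RRQR algorithm itself); the tolerance and the test matrix are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import qr

def rank_revealing_blocks(M, tol=1e-10):
    """Column-pivoted QR: M[:, piv] = Q @ R.  Split R into A_k (k x k,
    well conditioned), B_k (k x (n-k)), and the small trailing block C_k,
    choosing k from the decay of R's diagonal."""
    Q, R, piv = qr(M, mode="economic", pivoting=True)
    d = np.abs(np.diag(R))
    k = int(np.sum(d > tol * d[0]))   # numerical rank estimate
    A_k = R[:k, :k]                   # well-conditioned leading block
    B_k = R[:k, k:]                   # for a *strong* RRQR, A_k^{-1} B_k also stays modest
    C_k = R[k:, k:]                   # trailing block with small norm
    return Q, A_k, B_k, C_k, piv, k

# toy example: a 100 x 40 matrix of numerical rank about 15
rng = np.random.default_rng(0)
M = rng.standard_normal((100, 15)) @ rng.standard_normal((15, 40))
Q, A_k, B_k, C_k, piv, k = rank_revealing_blocks(M)
print(k, np.linalg.norm(C_k, 2), np.linalg.cond(A_k))
```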



Citations
Journal Article (DOI)

Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions

TL;DR: This work surveys and extends recent research which demonstrates that randomization offers a powerful tool for performing low-rank matrix approximation, and presents a modular framework for constructing randomized algorithms that compute partial matrix decompositions.
Posted Content

Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions

TL;DR: In this article, a modular framework for constructing randomized algorithms that compute partial matrix decompositions is presented, which uses random sampling to identify a subspace that captures most of the action of a matrix and then the input matrix is compressed to this subspace, and the reduced matrix is manipulated deterministically to obtain the desired low-rank factorization.
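A hedged sketch of the sample-then-compress scheme this TL;DR describes: random sampling identifies a subspace capturing most of the action of the matrix, the matrix is compressed to that subspace, and the small compressed matrix is factored deterministically. The Gaussian test matrix, oversampling amount, and the use of an SVD on the compressed matrix are illustrative choices, not a definitive implementation.

```python
import numpy as np

def randomized_low_rank(A, rank, oversample=10, rng=None):
    """Stage 1: random sampling finds an orthonormal basis Q whose range
    captures most of the action of A.  Stage 2: compress A to that subspace
    (B = Q^T A) and factor the small matrix deterministically."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    Omega = rng.standard_normal((n, rank + oversample))  # Gaussian test matrix
    Y = A @ Omega                                        # sample the range of A
    Q, _ = np.linalg.qr(Y)                               # orthonormal basis for the sample
    B = Q.T @ A                                          # compressed (rank+p) x n matrix
    U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_small[:, :rank]
    return U, s[:rank], Vt[:rank]

# usage: approximate a 500 x 300 matrix of low numerical rank
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 20)) @ rng.standard_normal((20, 300))
U, s, Vt = randomized_low_rank(A, rank=20)
print(np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))
```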
Posted Content

Randomized algorithms for matrices and data

TL;DR: This monograph will provide a detailed overview of recent work on the theory of randomized matrix algorithms as well as the application of those ideas to the solution of practical problems in large-scale data analysis.
Journal Article (DOI)

Randomized algorithms for the low-rank approximation of matrices

TL;DR: Two recently proposed randomized algorithms for the construction of low-rank approximations to matrices are described and shown to be considerably more efficient and reliable than the classical (deterministic) ones; they also parallelize naturally.
Journal Article (DOI)

Fast Direct Methods for Gaussian Processes

TL;DR: In this paper, the authors show that for the most commonly used covariance functions, the covariance matrix C can be hierarchically factored into a product of block low-rank updates of the identity matrix, yielding an O(n log² n) algorithm for inversion.
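The full hierarchical factorization is beyond a short example, but its elementary building block, applying the inverse of a low-rank update of the identity via the Woodbury identity, can be sketched as follows; the sizes and data are arbitrary assumptions, and this is not the paper's algorithm itself.

```python
import numpy as np

def apply_inverse(U, V, b):
    """Solve (I + U @ V.T) x = b with the Woodbury identity:
    x = b - U (I_k + V^T U)^{-1} (V^T b), costing O(n k^2) instead of O(n^3)."""
    k = U.shape[1]
    small = np.eye(k) + V.T @ U            # k x k "capacitance" matrix
    return b - U @ np.linalg.solve(small, V.T @ b)

# check against a dense solve on a small example
rng = np.random.default_rng(2)
n, k = 200, 5
U = rng.standard_normal((n, k))
V = rng.standard_normal((n, k))
b = rng.standard_normal(n)
x = apply_inverse(U, V, b)
print(np.linalg.norm((np.eye(n) + U @ V.T) @ x - b))
```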
References
Book

Topics in Matrix Analysis

TL;DR: The topics covered include the field of values, singular value inequalities, matrix equations and Kronecker products, Hadamard products, and matrices and functions.
Book

Solving least squares problems

TL;DR: Since the lm function provides many features and is rather complicated, the simpler function lsfit, which computes only the coefficient estimates and the residuals, is used as a model instead.
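For comparison, a minimal Python analogue of that restricted computation (coefficient estimates and residuals only), assuming NumPy's lstsq as a stand-in for lsfit; the toy data are illustrative.

```python
import numpy as np

# toy data: y ≈ 1 + 2x with noise
rng = np.random.default_rng(3)
x = rng.standard_normal(50)
y = 1.0 + 2.0 * x + 0.1 * rng.standard_normal(50)

# design matrix with an intercept column; solve min ||A c - y||_2
A = np.column_stack([np.ones_like(x), x])
coef, _, _, _ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - A @ coef          # coefficient estimates and residuals, nothing more
print(coef, np.linalg.norm(residuals))
```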
Journal Article (DOI)

The Collinearity Problem in Linear Regression. The Partial Least Squares (PLS) Approach to Generalized Inverses

TL;DR: In this article, the use of Partial Least Squares (PLS) for handling collinearities among the independent variables X in multiple regression is discussed; successive estimates are obtained by using the residuals from the previous rank as a new dependent variable y.
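A brief usage sketch, assuming scikit-learn's PLSRegression as a stand-in for the PLS approach described here rather than Wold's original formulation; the number of components and the collinear toy data are illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# toy data with two nearly identical (collinear) predictors
rng = np.random.default_rng(4)
z = rng.standard_normal((100, 2))
X = np.column_stack([z[:, 0],
                     z[:, 0] + 1e-3 * rng.standard_normal(100),  # collinear copy
                     z[:, 1]])
y = X[:, 0] - 2.0 * X[:, 2] + 0.05 * rng.standard_normal(100)

# a couple of PLS components handle the collinearity that would make
# the ordinary least-squares normal equations nearly singular
pls = PLSRegression(n_components=2)
pls.fit(X, y)
print(np.asarray(pls.predict(X[:5])).ravel())
```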
Book

Introduction to matrix computations

G. W. Stewart
TL;DR: Rounding-Error Analysis of Solution of Triangular Systems and of Gaussian Elimination.
Journal Article (DOI)

Numerical methods for solving linear least squares problems

TL;DR: This paper considers stable numerical methods for handling linear least squares problems, which frequently involve large quantities of data and are ill-conditioned by their very nature.