scispace - formally typeset
Topic

QR decomposition

About: QR decomposition is a research topic. Over its lifetime, 3,504 publications have been published within this topic, receiving 100,599 citations. The topic is also known as: QR factorization.


Papers
Journal Article
TL;DR: A low-rank format called Block Low-Rank (BLR) is proposed, and it is explained how it can be used to reduce the memory footprint and the complexity of direct solvers for sparse matrices based on the multifrontal method.
Abstract: Matrices coming from elliptic Partial Differential Equations (PDEs) have been shown to have a low-rank property: well defined off-diagonal blocks of their Schur complements can be approximated by low-rank products. Given a suitable ordering of the matrix which gives to the blocks a geometrical meaning, such approximations can be computed using an SVD or a rank-revealing QR factorization. The resulting representation offers a substantial reduction of the memory requirement and gives efficient ways to perform many of the basic dense algebra operations. Several strategies have been proposed to exploit this property. We propose a low-rank format called Block Low-Rank (BLR), and explain how it can be used to reduce the memory footprint and the complexity of direct solvers for sparse matrices based on the multifrontal method. We present experimental results that show how the BLR format delivers gains that are comparable to those obtained with hierarchical formats such as Hierarchical matrices (H matrices) and Hierarchically Semi-Separable (HSS matrices) but provides much greater flexibility and ease of use which are essential in the context of a general purpose, algebraic solver.

170 citations
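The BLR compression described above rests on one primitive: approximating a well-defined off-diagonal block by a low-rank product computed from an SVD or a rank-revealing QR factorization. A minimal NumPy sketch of that primitive, using a made-up block with artificially fast singular-value decay (the matrix, tolerance, and decay profile are illustrative assumptions, not data from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical off-diagonal block with rapidly decaying singular values,
# mimicking the Schur-complement blocks of an elliptic PDE matrix.
n = 200
U = rng.standard_normal((n, n))
V = rng.standard_normal((n, n))
decay = np.exp(-0.5 * np.arange(n))        # fast singular-value decay
B = (U * decay) @ V                        # dense block, numerically low rank

# Truncated SVD: keep the smallest rank k meeting a relative tolerance.
tol = 1e-8
Uf, s, Vt = np.linalg.svd(B, full_matrices=False)
k = int(np.searchsorted(-s, -tol * s[0]))  # first index with s[i] <= tol * s[0]
X, Y = Uf[:, :k] * s[:k], Vt[:k, :]        # B is approximated by the product X @ Y

err = np.linalg.norm(B - X @ Y) / np.linalg.norm(B)
mem_ratio = (X.size + Y.size) / B.size
print(f"rank {k}, relative error {err:.2e}, memory ratio {mem_ratio:.2f}")
```

A rank-revealing pivoted QR (`scipy.linalg.qr(B, pivoting=True)`) could replace the SVD here; the abstract treats both as interchangeable ways to compute the approximation.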

Book
26 Feb 2001
TL;DR: In this book, the authors develop a theory of spectral decomposition and spectral approximation for operators, with matrix-computation chapters covering QR factorization, convergence of a sequence of subspaces, QR methods, inverse iteration, and error analysis.
Abstract (contents): SPECTRAL DECOMPOSITION: General Notions; Decompositions; Spectral Sets of Finite Type; Adjoint and Product Spaces. SPECTRAL APPROXIMATION: Convergence of Operators; Property U; Property L; Error Estimates. IMPROVEMENT OF ACCURACY: Iterative Refinement; Acceleration. FINITE RANK APPROXIMATIONS: Approximations Based on Projection; Approximations of Integral Operators; A Posteriori Error Estimates. MATRIX FORMULATIONS: Finite Rank Operators; Iterative Refinement; Acceleration; Numerical Examples. MATRIX COMPUTATIONS: QR Factorization; Convergence of a Sequence of Subspaces; QR Methods and Inverse Iteration; Error Analysis. References. Index. Each chapter also includes exercises.

170 citations

Journal Article
TL;DR: In this article, it was shown that a large class of fast recursive matrix multiplication algorithms is stable in a norm-wise sense, and that essentially all standard linear algebra operations, including LU decomposition, QR decomposition, linear equation solving, matrix inversion, least-squares problems, eigenvalue problems, and the singular value decomposition, can also be done stably.
Abstract: In an earlier paper, we showed that a large class of fast recursive matrix multiplication algorithms is stable in a normwise sense, and that in fact if multiplication of $n$-by-$n$ matrices can be done by any algorithm in $O(n^{\omega + \eta})$ operations for any $\eta > 0$, then it can be done stably in $O(n^{\omega + \eta})$ operations for any $\eta > 0$. Here we extend this result to show that essentially all standard linear algebra operations, including LU decomposition, QR decomposition, linear equation solving, matrix inversion, solving least squares problems, (generalized) eigenvalue problems and the singular value decomposition can also be done stably (in a normwise sense) in $O(n^{\omega + \eta})$ operations.

168 citations
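As a concrete instance of two of the operations listed above, a QR factorization gives a backward-stable way to solve a least-squares problem without forming the ill-conditioned normal equations. A short sketch (the matrix sizes and random data are arbitrary illustrative choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 5))   # overdetermined system: 50 equations, 5 unknowns
b = rng.standard_normal(50)

# QR-based least squares: factor A = Q R, then solve the small
# triangular system R x = Q^T b.
Q, R = np.linalg.qr(A)             # reduced QR: Q is 50x5, R is 5x5
x_qr = np.linalg.solve(R, Q.T @ b)

# Reference solution from LAPACK's least-squares driver.
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(x_qr, x_ref))    # both minimize ||Ax - b||_2
```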

Proceedings Article
01 Nov 2005
TL;DR: This paper presents a novel architecture for matrix inversion that generalizes the QR-decomposition-based recursive least-squares (RLS) algorithm; the use of squared Givens rotations and a folded systolic array makes the architecture well suited to FPGA implementation.
Abstract: This paper presents a novel architecture for matrix inversion by generalizing the QR decomposition-based recursive least-squares (RLS) algorithm. The use of squared Givens rotations and a folded systolic array makes this architecture very suitable for FPGA implementation. The input is a 4 × 4 matrix of complex, floating-point values. The matrix inversion design achieves a throughput of 0.13M updates per second on a state-of-the-art Xilinx Virtex-4 FPGA running at 115 MHz. Due to the modular partitioning and interfacing between multiple Boundary and Internal processing units, this architecture is easily extendable to other matrix sizes.

167 citations
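A plain (unsquared) Givens-rotation QR, sketched below in NumPy rather than hardware, shows the row-pair structure that a systolic array can exploit: each rotation touches only two rows. The squared Givens variant used in the paper additionally avoids the square roots that `np.hypot` performs here; this sketch is the textbook algorithm, not the paper's architecture.

```python
import numpy as np

def givens_qr(A):
    """QR factorization by Givens rotations: zero out subdiagonal entries
    one at a time, bottom-up within each column. Each 2x2 rotation acts
    on a single pair of adjacent rows."""
    R = A.astype(float).copy()
    m, n = R.shape
    Q = np.eye(m)
    for j in range(n):
        for i in range(m - 1, j, -1):
            a, b = R[i - 1, j], R[i, j]
            r = np.hypot(a, b)                 # sqrt(a^2 + b^2), computed safely
            if r == 0.0:
                continue
            c, s = a / r, b / r
            G = np.array([[c, s], [-s, c]])    # rotation zeroing R[i, j]
            R[[i - 1, i], :] = G @ R[[i - 1, i], :]
            Q[:, [i - 1, i]] = Q[:, [i - 1, i]] @ G.T   # accumulate Q = G1^T ... Gk^T
    return Q, R

A = np.random.default_rng(2).standard_normal((4, 4))
Q, R = givens_qr(A)
print(np.allclose(Q @ R, A), np.allclose(np.tril(R, -1), 0))
```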

01 Jan 2008
TL;DR: The Fortran subroutine BVLS (bounded-variable least squares) solves linear least-squares problems with upper and lower bounds on the variables using an active set strategy; it can also be used to solve minimum l1 and l∞ fitting problems.
Abstract: The Fortran subroutine BVLS (bounded variable least-squares) solves linear least-squares problems with upper and lower bounds on the variables, using an active set strategy. The unconstrained least-squares problems for each candidate set of free variables are solved using the QR decomposition. BVLS has a “warm-start” feature permitting some of the variables to be initialized at their upper or lower bounds, which speeds the solution of a sequence of related problems. Such sequences of problems arise, for example, when BVLS is used to find bounds on linear functionals of a model constrained to satisfy, in an approximate lp-norm sense, a set of linear equality constraints in addition to upper and lower bounds. We show how to use BVLS to solve that problem when p = 1, 2, or ∞, and to solve minimum l1 and l∞ fitting problems. FORTRAN 77 code implementing BVLS is available from the statlib gopher at Carnegie Mellon University.

163 citations
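The bounded-variable least-squares problem that BVLS solves can be reproduced today with SciPy, whose `lsq_linear` offers a BVLS-style active-set method (a Python implementation in the spirit of the Fortran routine, not the original code). The matrix, right-hand side, and box bounds below are made-up illustration data:

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(3)
A = rng.standard_normal((20, 4))   # made-up design matrix
b = rng.standard_normal(20)        # made-up right-hand side

# Bounded-variable least squares:
#   minimize ||Ax - b||_2  subject to  lb <= x <= ub
lb, ub = np.zeros(4), np.full(4, 0.5)
res = lsq_linear(A, b, bounds=(lb, ub), method="bvls")

print(res.x)   # every component lies inside the box [0, 0.5]
```

The "warm-start" feature the abstract mentions (initializing variables at their bounds) belongs to the Fortran routine; `lsq_linear` instead exposes an iteration limit and tolerance.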


Network Information
Related Topics (5)
Optimization problem: 96.4K papers, 2.1M citations, 85% related
Network packet: 159.7K papers, 2.2M citations, 84% related
Robustness (computer science): 94.7K papers, 1.6M citations, 83% related
Wireless network: 122.5K papers, 2.1M citations, 83% related
Wireless sensor network: 142K papers, 2.4M citations, 82% related
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    31
2022    73
2021    90
2020    132
2019    126
2018    139