Topic

QR decomposition

About: QR decomposition is a research topic. Over its lifetime, 3,504 publications have been published within this topic, receiving 100,599 citations. The topic is also known as: QR factorization.


Papers
01 Jan 1999
TL;DR: The SPOOLES software package provides a choice of three sparse matrix orderings (minimum degree, nested dissection, and multisection), supports pivoting for numerical stability, can compute direct or drop-tolerance factorizations, and bases its computations on BLAS3 numerical kernels to take advantage of high-performance computing architectures.
Abstract: Solving sparse linear systems of equations is a common and important component of a multitude of scientific and engineering applications. The SPOOLES software package provides this functionality with a collection of software objects and methods. The package provides a choice of three sparse matrix orderings (minimum degree, nested dissection and multisection), supports pivoting for numerical stability (when required), can compute direct or drop-tolerance factorizations, and the computations are based on BLAS3 numerical kernels to take advantage of high-performance computing architectures. The factorizations and solves are supported in serial, multithreaded (using POSIX threads) and MPI environments. The first step to solving a linear system AX = B is to construct "objects" to hold the entries and structure of A, and the entries of X and B. SPOOLES provides a flexible set of methods to assemble a sparse matrix. The "input matrix" object allows a choice of coordinate systems (by rows, by columns, and other ways), flexible input (input by single entries, (partial) rows or columns, dense submatrices, or any combination), resizes itself as necessary, and assembles, sorts and permutes its entries. It is also a distributed object for MPI environments. Matrix entries can be created and assembled on different processors, and methods exist to assemble and redistribute the matrix entries as necessary. There are three methods to order a sparse matrix: minimum degree, generalized nested dissection and multisection. The latter two orderings depend on a domain/separator tree that is constructed using a graph partitioning method. Domain decomposition is used to find an initial separator, and a sequence of network flow problems is solved to smooth the separator. The quality of our nested dissection and multisection orderings is comparable to that of other state-of-the-art packages. Factorizations of square matrices have the form $A = PLDUQ$ and $A = PLDL^{T}P^{T}$, where P and Q are permutation matrices. Square systems of the form A + σB may also be factored and solved (as found in shift-and-invert eigensolvers), as well as full-rank overdetermined linear systems, where a QR factorization is computed and the solution is found by solving the semi-normal equations.
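The semi-normal equations mentioned at the end of the abstract can be illustrated in a few lines of dense NumPy (a rough sketch only; SPOOLES itself works on sparse matrices with BLAS3 kernels): factor A = QR, keep only R, solve $R^{T}R x = A^{T}b$, and optionally apply one step of iterative refinement.

```python
import numpy as np

def seminormal_lstsq(A, b):
    """Solve min ||Ax - b||_2 using only the R factor of A = QR."""
    R = np.linalg.qr(A, mode="r")          # upper-triangular R; Q is never stored
    # Semi-normal equations: R^T R x = A^T b, solved by two triangular solves.
    x = np.linalg.solve(R, np.linalg.solve(R.T, A.T @ b))
    # One step of iterative refinement, the usual companion of this approach.
    r = b - A @ x
    x += np.linalg.solve(R, np.linalg.solve(R.T, A.T @ r))
    return x

rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 8)), rng.standard_normal(50)
print(np.allclose(seminormal_lstsq(A, b), np.linalg.lstsq(A, b, rcond=None)[0]))
```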

124 citations

Proceedings ArticleDOI
18 Mar 2004
TL;DR: It is demonstrated that applying the sorted QR decomposition (SQRD) as an initialization step can significantly reduce the computational effort associated with lattice-reduction.
Abstract: Recently the use of lattice-reduction for signal detection in multiple antenna systems has been proposed. In this paper, we adapt these lattice-reduction-aided schemes to the MMSE criterion. We show that an obvious way to do this is suboptimal and propose an alternative method based on an extended system model. In conjunction with simple successive interference cancellation, this scheme almost reaches the performance of maximum-likelihood detection. Furthermore, we demonstrate that applying the sorted QR decomposition (SQRD) as an initialization step can significantly reduce the computational effort associated with lattice-reduction. Thus, the new algorithm clearly outperforms existing methods of comparable complexity.
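As a hedged illustration of the sorted QR decomposition (SQRD) used here as an initialization step, the sketch below runs modified Gram-Schmidt and, at each step, processes the remaining column of smallest norm; the permutation it returns is the detection order for successive interference cancellation. It is real-valued for simplicity and is not the authors' implementation.

```python
import numpy as np

def sorted_qr(H):
    """Return Q, R, perm with H[:, perm] = Q @ R, columns processed by ascending norm."""
    H = H.astype(float).copy()
    m, n = H.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    perm = np.arange(n)
    for i in range(n):
        # Pick the not-yet-processed column with the smallest residual norm.
        k = i + np.argmin(np.linalg.norm(H[:, i:], axis=0))
        H[:, [i, k]] = H[:, [k, i]]
        R[:, [i, k]] = R[:, [k, i]]
        perm[[i, k]] = perm[[k, i]]
        R[i, i] = np.linalg.norm(H[:, i])
        Q[:, i] = H[:, i] / R[i, i]
        # Orthogonalize the remaining columns against the new Q column.
        R[i, i + 1:] = Q[:, i] @ H[:, i + 1:]
        H[:, i + 1:] -= np.outer(Q[:, i], R[i, i + 1:])
    return Q, R, perm

H = np.random.default_rng(1).standard_normal((4, 4))
Q, R, perm = sorted_qr(H)
print(np.allclose(H[:, perm], Q @ R))
```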

123 citations

Journal ArticleDOI
TL;DR: The generalized singular value decomposition is used to analyze the problem of minimizing $\|Ax - b\|_{2}$ subject to the constraint Bx = d; a byproduct of the analysis is a new iterative procedure that can be used to improve an approximate solution obtained via the method of weights.
Abstract: The generalized singular value decomposition is used to analyze the problem of minimizing $\|Ax - b\|_{2}$ subject to the constraint Bx = d. A byproduct of the analysis is a new iterative procedure that can be used to improve an approximate solution obtained via the method of weights. All that is required to implement the procedure is a single QR factorization. These developments turn out to be of interest when A and B are sparse and when systolic architectures are used to carry out the computations.
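The method of weights referred to above replaces the equality-constrained problem, minimizing $\|Ax - b\|_{2}$ subject to Bx = d, by an ordinary least-squares problem in which a heavily weighted copy of the constraint is stacked on top of A, so a single QR factorization suffices. A minimal sketch, with the weight mu chosen here purely for illustration:

```python
import numpy as np

def weighted_lse(A, b, B, d, mu=1e6):
    """Approximate the solution of min ||Ax - b||_2 subject to Bx = d."""
    C = np.vstack([mu * B, A])             # weighted constraint stacked on top of A
    f = np.concatenate([mu * d, b])        # correspondingly stacked right-hand side
    Q, R = np.linalg.qr(C)                 # a single QR factorization
    return np.linalg.solve(R, Q.T @ f)     # back-substitution for the LS solution

rng = np.random.default_rng(2)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
B, d = rng.standard_normal((2, 5)), rng.standard_normal(2)
x = weighted_lse(A, b, B, d)
print(np.linalg.norm(B @ x - d))           # near zero: the constraint is nearly met
```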

123 citations

Book
01 Jan 2004
TL;DR: This textbook covers nonlinear equations, linear systems (including the LU, Cholesky, QR, and singular value decompositions), iterative methods, polynomial interpolation, numerical integration, differential equations, nonlinear optimization, and approximation methods, together with the effects of finite precision arithmetic.
Abstract: 1. Nonlinear Equations. Bisection and Inverse Linear Interpolation. Newton's Method. The Fixed Point Theorem. Quadratic Convergence of Newton's Method. Variants of Newton's Method. Brent's Method. Effects of Finite Precision Arithmetic. Newton's Method for Systems. Broyden's Method. 2. Linear Systems. Gaussian Elimination with Partial Pivoting. The LU Decomposition. The LU Decomposition with Pivoting. The Cholesky Decomposition. Condition Numbers. The QR Decomposition. Householder Triangularization and the QR Decomposition. Gram-Schmidt Orthogonalization and the QR Decomposition. The Singular Value Decomposition. 3. Iterative Methods. Jacobi and Gauss-Seidel Iteration. Sparsity. Iterative Refinement. Preconditioning. Krylov Space Methods. Numerical Eigenproblems. 4. Polynomial Interpolation. Lagrange Interpolating Polynomials. Piecewise Linear Interpolation. Cubic Splines. Computation of the Cubic Spline Coefficients. 5. Numerical Integration. Closed Newton-Cotes Formulas. Open Newton-Cotes Formulas and Undetermined Coefficients. Gaussian Quadrature. Gauss-Chebyshev Quadrature. Radau and Lobatto Quadrature. Adaptivity and Automatic Integration. Romberg Integration. 6. Differential Equations. Numerical Differentiation. Euler's Method. Improved Euler's Method. Analysis of Explicit One-Step Methods. Taylor and Runge-Kutta Methods. Adaptivity and Stiffness. Multi-Step Methods. 7. Nonlinear Optimization. One-Dimensional Searches. The Method of Steepest Descent. Newton Methods for Nonlinear Optimization. Multiple Random Start Methods. Direct Search Methods. The Nelder-Mead Method. Conjugate Direction Methods. 8. Approximation Methods. Linear and Nonlinear Least Squares. The Best Approximation Problem. Best Uniform Approximation. Applications of the Chebyshev Polynomials. Afterword. Bibliography. Answers. Index.
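Because the table of contents includes Householder triangularization as one construction of the QR decomposition, a compact, generic textbook-style sketch is added below; it is not code taken from the book.

```python
import numpy as np

def householder_qr(A):
    """Factor A (m x n, m >= n) as Q @ R with Q orthogonal and R upper triangular."""
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for k in range(n):
        x = R[k:, k]
        v = x.copy()
        v[0] += (1.0 if x[0] >= 0 else -1.0) * np.linalg.norm(x)
        vnorm = np.linalg.norm(v)
        if vnorm == 0.0:                               # column already zero below the diagonal
            continue
        v /= vnorm
        R[k:, :] -= 2.0 * np.outer(v, v @ R[k:, :])    # apply the reflector to R
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)    # accumulate the reflector into Q
    return Q, R

A = np.random.default_rng(3).standard_normal((6, 4))
Q, R = householder_qr(A)
print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(6)))
```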

122 citations

Journal ArticleDOI
TL;DR: The main idea is to interleave compositions of x and n - x objects and resort to a lexicographic generation of compositions.
Abstract: procedure Ising (n, x, t, S); integer n, x, t; integer array S; comment Ising generates n-sequences (S_1, ..., S_n) of zeros and ones where $x = \sum_{i=1}^{n} S_i$ and $t = \sum_{i=1}^{n-1} |S_{i+1} - S_i|$ are given. The main idea is to interleave compositions of x and n - x objects and resort to a lexicographic generation of compositions. We call these sequences Ising configurations since we believe they first appeared in the study of the so-called Ising problem (see Hill [1], Ising [2]). The number R(n, x, t) of distinct configurations with fixed n, x, t is well known [1, 2].
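To make the counted objects concrete, the brute-force sketch below enumerates the same Ising configurations, i.e. 0/1 sequences of length n with exactly x ones and exactly t adjacent changes. It is not the paper's algorithm, which generates the configurations lexicographically by interleaving compositions rather than filtering all subsets.

```python
from itertools import combinations

def ising_configurations(n, x, t):
    """Yield all 0/1 sequences S of length n with sum(S) == x and
    sum(|S[i+1] - S[i]|) == t (the Ising configurations)."""
    for ones in combinations(range(n), x):
        S = [0] * n
        for i in ones:
            S[i] = 1
        if sum(abs(S[i + 1] - S[i]) for i in range(n - 1)) == t:
            yield tuple(S)

# R(n, x, t): the number of distinct configurations with fixed n, x, t.
print(sum(1 for _ in ising_configurations(6, 3, 2)))
```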

121 citations


Network Information
Related Topics (5)
Optimization problem: 96.4K papers, 2.1M citations, 85% related
Network packet: 159.7K papers, 2.2M citations, 84% related
Robustness (computer science): 94.7K papers, 1.6M citations, 83% related
Wireless network: 122.5K papers, 2.1M citations, 83% related
Wireless sensor network: 142K papers, 2.4M citations, 82% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    31
2022    73
2021    90
2020    132
2019    126
2018    139