Topic

QR decomposition

About: QR decomposition is a research topic. Over its lifetime, 3,504 publications have been published within this topic, receiving 100,599 citations. The topic is also known as QR factorization.


Papers
Journal ArticleDOI
TL;DR: A numerically robust solver for least-squares problems with bounded variables (BVLS) is presented for applications including, but not limited to, model predictive control (MPC), and its computational efficiency is compared against state-of-the-art quadratic programming solvers.
Abstract: In this paper, a numerically robust solver for least-squares problems with bounded variables (BVLS) is presented for applications including, but not limited to, model predictive control (MPC). The proposed BVLS algorithm solves the problem efficiently by employing a recursive QR factorization method based on Gram–Schmidt orthogonalization. A reorthogonalization procedure that iteratively refines the QR factors provides numerical robustness for the described primal active-set method, which solves a system of linear equations in each of its iterations via recursive updates. The performance of the proposed BVLS solver, which is implemented in C without external software libraries, is compared in terms of computational efficiency against state-of-the-art quadratic programming solvers for small- to medium-sized random BVLS problems and a typical example of embedded linear MPC application. The numerical tests demonstrate that the solver performs very well even when solving ill-conditioned problems in single-precision floating-point arithmetic.
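
The kind of recursive QR update the abstract refers to can be sketched as follows. This is a generic Gram–Schmidt column-insertion update with one reorthogonalization pass, not the paper's C implementation; all names and the test problem below are illustrative.

```python
# Minimal sketch: when an active-set method adds a column to the current basis,
# the thin QR factors are extended by Gram-Schmidt with a reorthogonalization
# pass ("twice is enough") to keep Q numerically orthonormal.
import numpy as np

def qr_insert_column(Q, R, a, reorth_passes=1):
    """Extend a thin QR factorization A = Q @ R by one new column a.

    Q: (m, k) with orthonormal columns, R: (k, k) upper triangular.
    Returns Q_new (m, k+1), R_new (k+1, k+1).
    """
    m, k = Q.shape
    v = a.astype(float).copy()
    r = np.zeros(k)
    # Classical Gram-Schmidt step plus iterative reorthogonalization.
    for _ in range(1 + reorth_passes):
        c = Q.T @ v          # components of v along existing columns
        v -= Q @ c           # remove them
        r += c               # accumulate into the new column of R
    rho = np.linalg.norm(v)  # note: a rank-deficient column (rho ~ 0) is not handled here
    Q_new = np.hstack([Q, (v / rho).reshape(m, 1)])
    R_new = np.zeros((k + 1, k + 1))
    R_new[:k, :k] = R
    R_new[:k, k] = r
    R_new[k, k] = rho
    return Q_new, R_new

# Usage: grow a QR factorization column by column and check the residuals.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))
Q, R = np.linalg.qr(A[:, :1])          # start from the first column
for j in range(1, 5):
    Q, R = qr_insert_column(Q, R, A[:, j])
print(np.linalg.norm(A - Q @ R), np.linalg.norm(Q.T @ Q - np.eye(5)))
```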

17 citations

Proceedings ArticleDOI
04 Jul 2011
TL;DR: A procedure, the Slack Reduction Algorithm (SRA), is considered for optimizing the execution frequency of a collection of tasks (into which many dense linear algebra algorithms can be decomposed) on multi-core architectures; its results are evaluated with an energy-aware simulator.
Abstract: This paper addresses the efficient exploitation of task-level parallelism, present in many dense linear algebra operations, from the point of view of both computational performance and energy consumption. In particular, we consider a procedure, the Slack Reduction Algorithm (SRA), to optimize the execution frequency of a collection of tasks (into which many dense linear algebra algorithms can be decomposed) on multi-core architectures. The results from this procedure are evaluated by an energy-aware simulator, which is in charge of scheduling/mapping the execution of these tasks to the cores, leveraging the dynamic voltage and frequency scaling featured by current technology. Simultaneously, the simulator evaluates the performance benefits of the solution. Experiments with these tools show significant energy gains for two key dense linear algebra operations: the Cholesky and QR factorizations.
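
The slack-based reasoning behind such frequency scaling can be illustrated with a small sketch: in a task DAG, tasks off the critical path have slack, so their frequency can be lowered without lengthening the makespan. The DAG, the timings, and the linear time-frequency model below are illustrative assumptions only, not the paper's SRA.

```python
# Toy slack computation on a task DAG (forward/backward pass), followed by a
# naive frequency reduction that stretches each task into its available slack.
from collections import defaultdict

# DAG of a toy blocked factorization: deps[t] lists the predecessors of t.
tasks = {"T1": 4.0, "T2": 2.0, "T3": 3.0, "T4": 1.0}        # time at top frequency
deps  = {"T2": ["T1"], "T3": ["T1"], "T4": ["T2", "T3"]}     # precedence constraints

def schedule_bounds(tasks, deps):
    succs = defaultdict(list)
    for t, ps in deps.items():
        for p in ps:
            succs[p].append(t)
    order = list(tasks)                       # already topologically ordered here
    earliest_finish = {}
    for t in order:                           # forward pass: earliest finish times
        start = max((earliest_finish[p] for p in deps.get(t, [])), default=0.0)
        earliest_finish[t] = start + tasks[t]
    makespan = max(earliest_finish.values())
    latest_finish = {}
    for t in reversed(order):                 # backward pass: latest finish times
        lf = min((latest_finish[s] - tasks[s] for s in succs[t]), default=makespan)
        latest_finish[t] = lf
    return {t: latest_finish[t] - earliest_finish[t] for t in order}, makespan

slack, makespan = schedule_bounds(tasks, deps)
for t, s in slack.items():
    # Stretch the task into its slack: lower frequency proportionally
    # (assuming run time scales as 1/frequency, a common simplification).
    freq_scale = tasks[t] / (tasks[t] + s)
    print(f"{t}: slack={s:.1f}, run at {100*freq_scale:.0f}% of nominal frequency")
print("makespan:", makespan)
```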

17 citations

Journal ArticleDOI
TL;DR: In this paper, a sparse direct solution algorithm for discrete representations of boundary integral operators is proposed; it relies on an expansion of the unknown surface currents in a numerically determined basis of functions that are localized to small regions on a larger target while simultaneously satisfying global boundary conditions.
Abstract: A sparse direct solution algorithm is reported for discrete representations of boundary integral operators. The algorithm relies on an expansion of the unknown surface currents in a numerically determined basis of functions that are simultaneously localized to small regions on a larger target while also satisfying global boundary conditions. It is shown that the QR factorization of the impedance matrix is sparse in this basis at low to moderate frequencies.
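
The basis dependence of QR fill-in that the abstract exploits can be illustrated with a generic toy experiment (not the paper's solver or its numerically determined basis): the same operator expressed in a localized (banded) basis keeps a sparse R factor, while a dense change of basis destroys that sparsity.

```python
# Illustrative only: compare the fill of the R factor for a banded ("localized")
# matrix versus the same operator conjugated by a dense orthogonal change of basis.
import numpy as np

rng = np.random.default_rng(1)
n, bw = 200, 5

# "Good" basis: the operator is banded (localized interactions only).
A_local = np.zeros((n, n))
for i in range(n):
    lo, hi = max(0, i - bw), min(n, i + bw + 1)
    A_local[i, lo:hi] = rng.standard_normal(hi - lo)

# "Poor" basis: the same operator conjugated by a dense random orthogonal matrix.
Q_basis, _ = np.linalg.qr(rng.standard_normal((n, n)))
A_dense = Q_basis @ A_local @ Q_basis.T

def r_fill(A, tol=1e-10):
    """Fraction of entries of the R factor above a small threshold."""
    _, R = np.linalg.qr(A)
    return np.count_nonzero(np.abs(R) > tol) / R.size

print("fill of R, localized basis:", r_fill(A_local))   # stays banded, low fill
print("fill of R, dense basis:    ", r_fill(A_dense))   # essentially full upper triangle
```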

17 citations

Journal ArticleDOI
TL;DR: The aim of this paper is to discuss the class of leap-frog-type neural learning algorithms that have the unitary group of matrices as their parameter space, along with projection methods and related implementation issues.
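
One standard ingredient in such unitary-constrained learning schemes is a QR-based retraction that maps an updated weight matrix back onto the unitary group. The sketch below shows that generic step only; it is not the paper's leap-frog algorithm, and the nearest-point projection would instead use the polar factor.

```python
# Generic QR-based retraction onto the unitary group: after an unconstrained
# update the weight matrix drifts off the group; taking the (phase-normalized)
# Q factor maps it back. This is a retraction, not the Frobenius-nearest point.
import numpy as np

def qr_retraction_to_unitary(W):
    """Map an (approximately unitary) matrix back onto the unitary group via QR."""
    Q, R = np.linalg.qr(W)
    # Remove the phase ambiguity of the factorization: scale Q's columns so that
    # the corresponding diagonal of R would become real and positive.
    phases = np.diag(R) / np.abs(np.diag(R))
    return Q * phases            # scales column j of Q by the unit-modulus phases[j]

rng = np.random.default_rng(2)
n = 6
U, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
grad_step = 0.05 * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

W = U - grad_step                          # unconstrained update: leaves the group
W_proj = qr_retraction_to_unitary(W)       # QR-based retraction back onto it
print(np.linalg.norm(W.conj().T @ W - np.eye(n)))              # noticeably nonzero
print(np.linalg.norm(W_proj.conj().T @ W_proj - np.eye(n)))    # near machine precision
```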

17 citations

Journal ArticleDOI
TL;DR: The most efficient program for finding all the eigenvalues of a symmetric matrix is a combination of the Householder tridiagonalization and the QR algorithm, and by avoiding square roots the efficiency of this algorithm can be further increased.
Abstract: The most efficient program for finding all the eigenvalues of a symmetric matrix is a combination of the Householder tridiagonalization and the QR algorithm. The latter, if carried out in a natural way, requires 4n additions, 10n multiplications, 2n divisions, and n square roots per iteration (n the order of the matrix). In 1963, Ortega and Kaiser showed that the process can be carried out using no square roots (and saving 7n multiplications). However, their algorithm is unstable and several modifications were suggested to increase its accuracy. We, too, want to give such a modification together with some examples demonstrating the achieved accuracy.

1. Introduction. In 1961 Francis (4) proposed the QR transformation, an offspring of Rutishauser's LR transformation (8), for the computation of the eigenvalues of a general matrix. He considered his method to be inefficient for Hermitian matrices but, fortunately, it soon turned out that, contrary to his original opinion, the method is especially efficient for this class of matrices, provided the given matrix is first reduced by Householder's method to real tridiagonal form and provided that shifts are used to accelerate the rate of convergence. (A description of this technique can be found in (10); for tested ALGOL programs see (6), (3), (2); the properties of the now generally adopted shift are described in (11).) Ortega and Kaiser (7) pointed out that by avoiding square roots the efficiency of this algorithm can be further increased (though if all eigenvalues are to be computed, it is already superior to all other known methods). The algorithm which they proposed, however, was unstable and several modifications were suggested (e.g., (9), (5) and others, not published). We, too, want to give such a modification here, together with some examples demonstrating the achieved accuracy.
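
For reference, the baseline the paper starts from, shifted QR iteration on a symmetric tridiagonal matrix, can be sketched as follows. This sketch uses explicit QR steps with square roots, i.e. it is not the square-root-free modification the paper proposes, and it favors clarity over the O(n)-per-step implicit implementation.

```python
# Shifted QR iteration with Wilkinson shift and bottom-up deflation on a
# symmetric tridiagonal matrix; cross-checked against numpy.linalg.eigvalsh.
import numpy as np

def tridiag_qr_eigvals(d, e, tol=1e-12, max_iter=500):
    """Eigenvalues of the symmetric tridiagonal matrix with diagonal d, off-diagonal e."""
    T = np.diag(d.astype(float)) + np.diag(e.astype(float), 1) + np.diag(e.astype(float), -1)
    n = T.shape[0]
    eigs = []
    while n > 1:
        for _ in range(max_iter):
            if abs(T[n - 1, n - 2]) < tol * (abs(T[n - 1, n - 1]) + abs(T[n - 2, n - 2])):
                break
            # Wilkinson shift: eigenvalue of the trailing 2x2 block closest to T[n-1, n-1].
            a, b, c = T[n - 2, n - 2], T[n - 1, n - 2], T[n - 1, n - 1]
            delta = (a - c) / 2.0
            mu = c - np.sign(delta if delta != 0 else 1.0) * b * b / (
                abs(delta) + np.hypot(delta, b))
            Q, R = np.linalg.qr(T[:n, :n] - mu * np.eye(n))   # explicit shifted QR step
            T[:n, :n] = R @ Q + mu * np.eye(n)                # orthogonal similarity transform
        eigs.append(T[n - 1, n - 1])                          # deflate converged eigenvalue
        n -= 1
    eigs.append(T[0, 0])
    return np.sort(np.array(eigs))

# Quick check against LAPACK on a random symmetric tridiagonal matrix.
rng = np.random.default_rng(3)
d, e = rng.standard_normal(8), rng.standard_normal(7)
ref = np.linalg.eigvalsh(np.diag(d) + np.diag(e, 1) + np.diag(e, -1))
print(np.max(np.abs(tridiag_qr_eigvals(d, e) - ref)))   # max deviation, expected to be tiny
```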

17 citations


Network Information
Related Topics (5)
Optimization problem
96.4K papers, 2.1M citations
85% related
Network packet
159.7K papers, 2.2M citations
84% related
Robustness (computer science)
94.7K papers, 1.6M citations
83% related
Wireless network
122.5K papers, 2.1M citations
83% related
Wireless sensor network
142K papers, 2.4M citations
82% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    31
2022    73
2021    90
2020    132
2019    126
2018    139