Topic

QR decomposition

About: QR decomposition is a research topic. Over its lifetime, 3504 publications have been published within this topic, receiving 100599 citations. The topic is also known as: QR factorization.
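For orientation, a minimal sketch of the factorization itself: any real matrix A can be written as A = QR, where Q has orthonormal columns and R is upper triangular. The NumPy example below (with arbitrary illustrative data) verifies these properties:

```python
import numpy as np

# QR decomposition: factor A into Q (orthonormal columns) and R (upper triangular).
A = np.array([[2.0, 1.0],
              [1.0, 3.0],
              [0.0, 1.0]])

Q, R = np.linalg.qr(A)                    # reduced QR: Q is 3x2, R is 2x2

print(np.allclose(Q @ R, A))              # True: A = Q R
print(np.allclose(Q.T @ Q, np.eye(2)))    # True: columns of Q are orthonormal
print(np.allclose(R, np.triu(R)))         # True: R is upper triangular
```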


Papers
Journal Article (DOI)
TL;DR: A preliminary study of the incorporation of the subspace method into a subband framework proves to be efficient, although some problems remain open.
Abstract: A novel approach for multimicrophone speech dereverberation is presented. The method is based on the construction of the null subspace of the data matrix in the presence of colored noise, using the generalized singular-value decomposition (GSVD) technique or the generalized eigenvalue decomposition (GEVD) of the respective correlation matrices. The special Sylvester structure of the filtering matrix related to this subspace is exploited to derive a total least squares (TLS) estimate of the acoustical transfer functions (ATFs). Less robust but computationally more efficient methods are derived from the same structure and the QR decomposition (QRD). A preliminary study of incorporating the subspace method into a subband framework proves efficient, although some problems remain open. Speech reconstruction is achieved by means of the matched filter beamformer (MFBF). An experimental study supports the potential of the proposed methods.

127 citations
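For background on the TLS step mentioned in the abstract, the sketch below shows the standard way a total least squares estimate is read off the null-space direction of an augmented data matrix via the SVD. It illustrates the generic technique only, not the paper's GSVD-based null-subspace construction or its Sylvester-structured filtering matrix; all names and data are illustrative:

```python
import numpy as np

def tls_solve(A, b):
    """Generic total least squares: solve A x ~= b allowing errors in both A and b.

    The TLS solution is read off the right singular vector of [A | b]
    associated with the smallest singular value (the null-space direction).
    """
    m, n = A.shape
    C = np.hstack([A, b.reshape(-1, 1)])   # augmented data matrix [A | b]
    _, _, Vt = np.linalg.svd(C)
    v = Vt[-1]                             # direction of the smallest singular value
    return -v[:n] / v[n]                   # scale so the last component equals -1

# Toy example: a noisy overdetermined system.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.01 * rng.standard_normal(50)
print(tls_solve(A, b))                     # close to x_true
```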

Journal Article (DOI)
TL;DR: This paper proposes an LDA-based incremental dimension reduction algorithm, called IDR/QR, which applies QR decomposition rather than SVD and does not require the whole data matrix in main memory, a property that is desirable for large data sets.
Abstract: Dimension reduction is a critical data preprocessing step for many database and data mining applications, such as efficient storage and retrieval of high-dimensional data. In the literature, a well-known dimension reduction algorithm is linear discriminant analysis (LDA). The common aspect of previously proposed LDA-based algorithms is the use of singular value decomposition (SVD). Due to the difficulty of designing an incremental solution for the eigenvalue problem on the product of scatter matrices in LDA, there has been little work on designing incremental LDA algorithms that can efficiently incorporate new data items as they become available. In this paper, we propose an LDA-based incremental dimension reduction algorithm, called IDR/QR, which applies QR decomposition rather than SVD. Unlike other LDA-based algorithms, this algorithm does not require the whole data matrix in main memory. This is desirable for large data sets. More importantly, with the insertion of new data items, the IDR/QR algorithm can constrain the computational cost by applying efficient QR-updating techniques. Finally, we evaluate the effectiveness of the IDR/QR algorithm in terms of classification error rate on the reduced dimensional space. Our experiments on several real-world data sets reveal that the classification error rate achieved by the IDR/QR algorithm is very close to the best possible one achieved by other LDA-based algorithms. However, the IDR/QR algorithm has much less computational cost, especially when new data items are inserted dynamically.

127 citations
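The QR-updating idea that keeps the incremental step cheap can be illustrated generically: when a new column is appended to a matrix whose reduced QR factorization is known, one orthogonalization step extends the factorization instead of recomputing it. This is a minimal sketch of the standard column-append update, not the IDR/QR algorithm itself:

```python
import numpy as np

def qr_append_column(Q, R, a):
    """Given a reduced QR of A (A = Q R), return the reduced QR of [A | a].

    Cost is O(mk) for an m x k factor, versus O(mk^2) for refactoring from scratch.
    """
    r = Q.T @ a                 # coefficients of a in the current orthonormal basis
    q = a - Q @ r               # component of a orthogonal to range(Q)
    rho = np.linalg.norm(q)
    Q_new = np.hstack([Q, (q / rho).reshape(-1, 1)])
    R_new = np.block([[R, r.reshape(-1, 1)],
                      [np.zeros((1, R.shape[1])), np.array([[rho]])]])
    return Q_new, R_new

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))
Q, R = np.linalg.qr(A)
a = rng.standard_normal(6)
Q2, R2 = qr_append_column(Q, R, a)
print(np.allclose(Q2 @ R2, np.hstack([A, a.reshape(-1, 1)])))  # True
```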

Book
01 Jan 1988
TL;DR: Part 1, Fundamentals of parallel computation: general principles of parallel computing; parallel techniques and algorithms; parallel sorting algorithms; and future trends in algorithm development.
Abstract: Part 1, Fundamentals of parallel computation: general principles of parallel computing; parallel techniques and algorithms; parallel sorting algorithms. Part 2, Numerical linear algebra: solution of a system of linear algebraic equations; the symmetric eigenvalue problem (Jacobi method); QR factorization; singular-value decomposition and related problems; future trends in algorithm development.

126 citations
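Of the numerical linear algebra topics listed, QR factorization by Givens rotations is a natural match for parallel machines, because rotations acting on disjoint row pairs are independent. The sketch below is a plain sequential illustration of Givens-based QR, not code from the book:

```python
import numpy as np

def givens_qr(A):
    """QR factorization by Givens rotations.

    Each rotation zeroes one subdiagonal entry; rotations touching
    disjoint row pairs are independent, which is what makes this
    scheme attractive for parallel execution.
    """
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for j in range(n):
        for i in range(m - 1, j, -1):        # zero column j below the diagonal, bottom-up
            a, b = R[i - 1, j], R[i, j]
            rad = np.hypot(a, b)
            if rad == 0.0:
                continue
            c, s = a / rad, b / rad
            G = np.array([[c, s], [-s, c]])  # 2x2 rotation acting on rows i-1, i
            R[[i - 1, i], :] = G @ R[[i - 1, i], :]
            Q[:, [i - 1, i]] = Q[:, [i - 1, i]] @ G.T
    return Q, R

A = np.array([[4.0, 1.0], [2.0, 3.0], [1.0, 2.0]])
Q, R = givens_qr(A)
print(np.allclose(Q @ R, A), np.allclose(np.tril(R, -1), 0.0))  # True True
```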

Journal Article (DOI)
TL;DR: This work investigates implementations of algorithms for the hyper-parameter estimation problem that can handle both large data sets and possibly ill-conditioned computations, and proposes a QR-factorization-based, matrix-inversion-free algorithm that evaluates the cost function efficiently and accurately.

126 citations
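The general trick of evaluating a least-squares-type cost through a QR factorization, with no matrix inverse ever formed, can be sketched as follows. This is a generic illustration of the technique on a ridge-regularized problem, not the algorithm from the paper; lam and the data are illustrative:

```python
import numpy as np

def ls_cost_via_qr(A, b, lam):
    """Evaluate min_x ||A x - b||^2 + lam ||x||^2 without any matrix inverse.

    Fold the regularizer into an augmented least squares problem and
    triangularize [A_aug | b_aug] with QR; the minimal cost is the squared
    last diagonal entry of the triangular factor.
    """
    m, n = A.shape
    A_aug = np.vstack([A, np.sqrt(lam) * np.eye(n)])       # (m+n) x n
    b_aug = np.concatenate([b, np.zeros(n)])
    R = np.linalg.qr(np.column_stack([A_aug, b_aug]), mode='r')  # (n+1) x (n+1)
    return R[n, n] ** 2                                    # squared residual norm

# Cross-check against a direct (inversion-based) evaluation on toy data.
rng = np.random.default_rng(2)
A = rng.standard_normal((40, 4))
b = rng.standard_normal(40)
lam = 0.1
x = np.linalg.solve(A.T @ A + lam * np.eye(4), A.T @ b)
direct = np.sum((A @ x - b) ** 2) + lam * np.sum(x ** 2)
print(np.isclose(ls_cost_via_qr(A, b, lam), direct))       # True
```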

Book
01 Jan 1997
TL;DR: A course of lectures on matrix analysis and numerical linear algebra, covering norms and the scalar product, unitary matrices, the singular value decomposition and unitarily invariant norms, perturbation theory, floating-point arithmetic, and the LU, QR, and eigenvalue algorithms.
Abstract: Lecture 1: metric space; some useful definitions; nested balls; normed space; popular vector norms; matrix norms; equivalent norms; operator norms. Lecture 2: scalar product; length of a vector; isometric matrices; preservation of length and unitary matrices; Schur theorem; normal matrices; positive definite matrices; the singular value decomposition; unitarily invariant norms; a short way to the SVD; approximations of a lower rank; smoothness and ranks. Lecture 3: perturbation theory; condition of a matrix; convergent matrices and series; the simplest iteration method; inverses and series; condition of a linear system; consistency of matrix and right-hand side; eigenvalue perturbations; continuity of the polynomial roots. Lecture 4: diagonal dominance; Gerschgorin disks; small perturbations of eigenvalues and vectors; condition of a simple eigenvalue; analytic perturbations. Lecture 5: spectral distances; "symmetric" theorems; Hoffman-Wielandt theorem; permutation vector of a matrix; "unnormal" extension; eigenvalues of Hermitian matrices; interlacing properties; what are clusters?; singular value clusters; eigenvalue clusters. Lecture 6: floating-point numbers; computer arithmetic axioms; round-off errors for the scalar product; forward and backward analysis; some philosophy; an example of a "bad" operation; one more example; ideal and machine tests; up or down; solving the triangular systems. Lecture 7: direct methods for linear systems; theory of the LU decomposition; round-off errors for the LU decomposition; growth of matrix entries and pivoting; complete pivoting; the Cholesky method; triangular decompositions and linear systems solution; how to refine the solution. Lecture 8: the QR decomposition of a square matrix; the QR decomposition of a rectangular matrix; Householder matrices; elimination of elements by reflections; Givens matrices; elimination of elements by rotations; computer realizations of reflections and rotations; orthogonalization method; loss of orthogonality; modified Gram-Schmidt algorithm; bidiagonalization; unitary similarity reduction to the Hessenberg form. Lecture 9: the eigenvalue problem; the power method; subspace iterations; distances between subspaces; subspaces and orthoprojectors; distances and orthoprojectors; subspaces of equal dimension; the CS decomposition; convergence of subspace iterations for the block diagonal matrix; convergence of subspace iterations in the general case. Lecture 10: the QR algorithm; generalised QR algorithm; basic formulas; the QR iteration lemma; convergence of the QR iterations; pessimistic and optimistic; Bruhat decomposition; what if the inverse matrix is not strongly regular; the QR iterations and the subspace iterations. Lecture 11: quadratic convergence; cubic convergence; what makes the QR algorithm efficient; implicit QR iterations; arrangement of computations; how to find the singular value decomposition. Lecture 12: function approximation (partial contents).

125 citations
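The core construction of Lecture 8, elimination of elements by Householder reflections, fits in a few lines. This is a textbook-style sketch for the real case, not code from the book:

```python
import numpy as np

def householder_qr(A):
    """QR factorization by Householder reflections (real case).

    Column j is mapped onto a multiple of e_1 by the reflection
    H = I - 2 v v^T (with ||v|| = 1), zeroing the entries below the diagonal.
    """
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for j in range(min(m - 1, n)):
        x = R[j:, j]
        v = x.copy()
        # Numerically stable sign choice avoids cancellation in v[0].
        v[0] += np.sign(x[0] if x[0] != 0 else 1.0) * np.linalg.norm(x)
        if np.linalg.norm(v) == 0.0:
            continue
        v /= np.linalg.norm(v)
        R[j:, :] -= 2.0 * np.outer(v, v @ R[j:, :])   # apply H from the left
        Q[:, j:] -= 2.0 * np.outer(Q[:, j:] @ v, v)   # accumulate Q = H_1 ... H_k
    return Q, R

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
Q, R = householder_qr(A)
print(np.allclose(Q @ R, A), np.allclose(np.tril(R, -1), 0.0))  # True True
```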


Network Information
Related Topics (5)
Optimization problem: 96.4K papers, 2.1M citations, 85% related
Network packet: 159.7K papers, 2.2M citations, 84% related
Robustness (computer science): 94.7K papers, 1.6M citations, 83% related
Wireless network: 122.5K papers, 2.1M citations, 83% related
Wireless sensor network: 142K papers, 2.4M citations, 82% related
Performance Metrics
No. of papers in the topic in previous years:
Year: Papers
2023: 31
2022: 73
2021: 90
2020: 132
2019: 126
2018: 139