Topic
QR decomposition
About: QR decomposition is a research topic. Over its lifetime, 3,504 publications have been published on this topic, receiving 100,599 citations. The topic is also known as: QR factorization.
Papers published on a yearly basis
Papers
TL;DR: In this article, a dual-domain algorithm based on matrix rank reduction was developed for separating simultaneous-source seismic data; the method operates on 3D common-receiver gathers or offset-midpoint gathers.
Abstract: We have developed a fast dual-domain algorithm based on matrix rank reduction for separating simultaneous-source seismic data. Our algorithm operates on 3D common-receiver gathers or offset-midpoint gathers. At a given monochromatic frequency slice in the ω-x-y domain, the spatial data of the ideal unblended common-receiver or offset-midpoint gather could be represented by a low-rank matrix. The interferences from the randomly and closely fired shots increased the rank of that matrix. Therefore, we could minimize the misfit between the blended observation and the predicted blended data subject to a low-rank constraint applied to the data in the ω-x-y domain. The low-rank constraint could be implemented via the classic truncated singular value decomposition (SVD) or via a randomized QR decomposition (rQRd). The rQRd yielded nearly an order of magnitude improvement in processing time with respect to the truncated SVD. We have also discovered that the rQRd was less stringent on the select...
49 citations
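As a rough, hedged sketch of the low-rank step described in the abstract above (not the authors' implementation), the snippet below compares a rank-k approximation of a complex "frequency slice" matrix computed with a truncated SVD against one computed with a randomized QR-based projection. The matrix D, the rank k, and the oversampling parameter are made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder "frequency slice": a complex rank-k matrix plus noise
# (in the paper this would be one omega slice of a common-receiver gather).
nx, ny, k = 60, 50, 5
D = (rng.standard_normal((nx, k)) + 1j * rng.standard_normal((nx, k))) @ \
    (rng.standard_normal((k, ny)) + 1j * rng.standard_normal((k, ny)))
D += 0.01 * (rng.standard_normal((nx, ny)) + 1j * rng.standard_normal((nx, ny)))

# Rank-k approximation via truncated SVD.
U, s, Vh = np.linalg.svd(D, full_matrices=False)
D_svd = U[:, :k] @ np.diag(s[:k]) @ Vh[:k, :]

# Rank reduction via a randomized QR projection: sketch the column space with
# a random test matrix, orthonormalize with QR, and project D onto that basis.
# The result has rank at most k + p (p is a small oversampling parameter).
p = 5
Omega = rng.standard_normal((ny, k + p))
Q, _ = np.linalg.qr(D @ Omega)          # orthonormal basis for the sampled range
D_rqr = Q @ (Q.conj().T @ D)

print("truncated SVD error :", np.linalg.norm(D - D_svd) / np.linalg.norm(D))
print("randomized QR error :", np.linalg.norm(D - D_rqr) / np.linalg.norm(D))
```

The appeal of the randomized variant is that the QR of a tall thin sketch is much cheaper than a full SVD of the slice, which is the source of the speedup reported in the abstract.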
TL;DR: Cyclic pivoting, introduced in this paper, is a generalization of Golub's column pivoting and Stewart's reverse column pivoting; under this strategy, the QR factorization can give tight estimates of any two a priori chosen consecutive singular values of a matrix.
Abstract: We introduce a pair of dual concepts: pivoted blocks and reverse pivoted blocks. These blocks are the outcome of a special column pivoting strategy in QR factorization. Our main result is that under such a column pivoting strategy, the QR factorization of a given matrix can give tight estimates of any two a priori-chosen consecutive singular values of that matrix. In particular, a rank-revealing QR factorization is guaranteed when the two chosen consecutive singular values straddle a gap in the singular value spectrum that gives rise to the rank degeneracy of the given matrix. The pivoting strategy, called cyclic pivoting, can be viewed as a generalization of Golub's column pivoting and Stewart's reverse column pivoting. Numerical experiments confirm the tight estimates that our theory asserts.
49 citations
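As a rough illustration of the rank-revealing idea in the entry above (using SciPy's standard column-pivoted QR rather than the cyclic pivoting strategy the paper introduces), the diagonal of R from a pivoted QR tends to track the singular value spectrum, so a large drop between consecutive entries flags the numerical rank. The matrix sizes and tolerance below are arbitrary.

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(1)

# Build a 100 x 80 matrix of numerical rank 30 (arbitrary sizes for illustration).
m, n, r = 100, 80, 30
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
A += 1e-10 * rng.standard_normal((m, n))   # tiny noise below the "rank gap"

# Column-pivoted QR: A[:, piv] = Q @ R, with |R[0,0]| >= |R[1,1]| >= ...
Q, R, piv = qr(A, mode='economic', pivoting=True)
rdiag = np.abs(np.diag(R))

# The magnitudes of the R diagonal roughly follow the singular values,
# so counting entries above a tolerance estimates the numerical rank.
svals = np.linalg.svd(A, compute_uv=False)
est_rank = int(np.sum(rdiag > 1e-8 * rdiag[0]))
print("estimated rank from pivoted QR:", est_rank)
print("rank from SVD                 :", int(np.sum(svals > 1e-8 * svals[0])))
```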
TL;DR: In this paper, randomized sampling is used to approximate the column-norm updates that dominate communication in QR factorization with column pivoting, and a truncated QR factorization with column pivoting that avoids trailing matrix updates is proposed for computing low-rank approximations of large matrices.
Abstract: The dominant contribution to communication complexity in factorizing a matrix using QR with column pivoting is due to column-norm updates that are required to process pivot decisions. We use randomized sampling to approximate this process, which dramatically reduces communication in column selection. We also introduce a sample update formula to reduce the cost of sampling trailing matrices. Using our column selection mechanism, we observe results that are comparable in quality to those obtained from the QRCP algorithm, but with performance near unpivoted QR. We also demonstrate strong parallel scalability on shared-memory multicore systems using an implementation in Fortran with OpenMP. This work immediately extends to produce low-rank truncated approximations of large matrices. We propose a truncated QR factorization with column pivoting that avoids the trailing matrix updates used in current implementations of level-3 BLAS QR and QRCP. Provided the truncation rank is small, avoiding trailing ma...
49 citations
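A hedged sketch of the randomized column-selection idea (not the authors' Fortran/OpenMP code): compress the matrix with a small Gaussian sketch, run column-pivoted QR on the sketch to pick pivot columns cheaply, and then form a truncated approximation from the selected columns only. The sizes and the sketch dimension below are placeholders.

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(2)

m, n, k = 2000, 500, 20          # arbitrary sizes; k is the truncation rank
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n)) \
    + 1e-6 * rng.standard_normal((m, n))

# Sketch: compress the rows with a small Gaussian matrix so that pivoting
# decisions are made on a (k + oversampling) x n matrix instead of m x n.
ell = k + 8
B = rng.standard_normal((ell, m)) @ A

# Column-pivoted QR on the sketch selects the pivot columns cheaply.
_, _, piv = qr(B, mode='economic', pivoting=True)
cols = piv[:k]

# Truncated approximation built from the selected columns of A:
# A ~= Q_k @ (Q_k^T @ A), with Q_k an orthonormal basis for those columns.
Q_k, _ = np.linalg.qr(A[:, cols])
A_k = Q_k @ (Q_k.T @ A)

print("relative error of truncated approximation:",
      np.linalg.norm(A - A_k) / np.linalg.norm(A))
```

The point of the sketch is that the pivot decisions, and hence the communication-heavy column-norm updates, happen on a matrix with only k + oversampling rows.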
TL;DR: It is shown that the Partial Least-Squares (PLS) algorithm for univariate data is equivalent to using a truncated Cayley-Hamilton polynomial expression of degree a ≤ r for the matrix inverse that is used to compute the least-squares (LS) solution.
48 citations
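The equivalence claimed above can be probed numerically. The sketch below is an illustrative check under my own assumptions, not the paper's derivation: it runs a bare-bones PLS1 (NIPALS) for a components and verifies that the resulting coefficient vector lies in the Krylov subspace spanned by X^T y, (X^T X) X^T y, ..., which is what "a truncated polynomial in X^T X applied to X^T y" amounts to. Data sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_features, a = 200, 10, 4     # a = number of PLS components

X = rng.standard_normal((n_samples, n_features))
y = X @ rng.standard_normal(n_features) + 0.1 * rng.standard_normal(n_samples)

# Bare-bones PLS1 (NIPALS) without centering, kept minimal for illustration.
Xd, yd = X.copy(), y.copy()
W, P, q = [], [], []
for _ in range(a):
    w = Xd.T @ yd
    w /= np.linalg.norm(w)
    t = Xd @ w
    tt = t @ t
    p = Xd.T @ t / tt
    qi = (yd @ t) / tt
    Xd -= np.outer(t, p)
    yd -= qi * t
    W.append(w); P.append(p); q.append(qi)
W, P, q = np.array(W).T, np.array(P).T, np.array(q)

# PLS regression coefficients: beta = W (P^T W)^{-1} q
beta_pls = W @ np.linalg.solve(P.T @ W, q)

# Krylov basis {X^T y, (X^T X) X^T y, ...}: any truncated polynomial in X^T X
# applied to X^T y lies in the span of these vectors.
G, b = X.T @ X, X.T @ y
K = np.column_stack([np.linalg.matrix_power(G, j) @ b for j in range(a)])

# Project beta_pls onto the Krylov subspace; a tiny residual supports the claim.
Qk, _ = np.linalg.qr(K)
resid = beta_pls - Qk @ (Qk.T @ beta_pls)
print("distance of PLS solution from Krylov subspace:", np.linalg.norm(resid))
```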
14 Nov 2008
TL;DR: The problem of updating the QR factorization is treated, with applications to the least squares problem, and algorithms are presented that compute the factorization A1 = Q1 R1, where A1 is the matrix A = QR after it has had a number of rows or columns added or deleted.
Abstract: In this paper we treat the problem of updating the QR factorization, with applications to the least squares problem. Algorithms are presented that compute the factorization A1 = Q1 R1, where A1 is the matrix A = QR after it has had a number of rows or columns added or deleted. This is achieved by updating the factors Q and R, and we show this can be much faster than computing the factorization of A1 from scratch. We consider algorithms that exploit the Level 3 BLAS where possible and place no restriction on the dimensions of A or the number of rows and columns added or deleted. For some of our algorithms we present Fortran 77 LAPACK-style code and show the backward error of our updated factors is comparable to the error bounds of the QR factorization of A1.
48 citations
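Updating routines in the same spirit are available in SciPy (scipy.linalg.qr_insert and qr_delete), though whether they correspond to the blocked Level 3 BLAS variants described in this abstract is not something the listing confirms. A minimal sketch, under those assumptions, comparing updated factors against the modified matrix:

```python
import numpy as np
from scipy.linalg import qr, qr_insert, qr_delete

rng = np.random.default_rng(4)
A = rng.standard_normal((8, 5))
Q, R = qr(A)                      # full QR so rows can be appended

# Insert two new rows before row 3 and update the factors instead of refactorizing.
new_rows = rng.standard_normal((2, 5))
Q1, R1 = qr_insert(Q, R, new_rows, 3, which='row')
A1 = np.vstack([A[:3], new_rows, A[3:]])
print("row insert ok :", np.allclose(Q1 @ R1, A1))

# Delete column 2 and update the factors.
Q2, R2 = qr_delete(Q, R, 2, which='col')
A2 = np.delete(A, 2, axis=1)
print("col delete ok :", np.allclose(Q2 @ R2, A2))
```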