
Singular value

About: Singular value is a research topic. Over the lifetime, 6,243 publications have been published within this topic, receiving 131,506 citations.


Papers
Journal ArticleDOI

[...]

TL;DR: This paper develops a simple first-order and easy-to-implement algorithm that is extremely efficient at addressing problems in which the optimal solution has low rank, and develops a framework in which one can understand these algorithms in terms of well-known Lagrange multiplier algorithms.
Abstract: This paper introduces a novel algorithm to approximate the matrix with minimum nuclear norm among all matrices obeying a set of convex constraints. This problem may be understood as the convex relaxation of a rank minimization problem and arises in many important applications as in the task of recovering a large matrix from a small subset of its entries (the famous Netflix problem). Off-the-shelf algorithms such as interior point methods are not directly amenable to large problems of this kind with over a million unknown entries. This paper develops a simple first-order and easy-to-implement algorithm that is extremely efficient at addressing problems in which the optimal solution has low rank. The algorithm is iterative, produces a sequence of matrices $\{\boldsymbol{X}^k,\boldsymbol{Y}^k\}$, and at each step mainly performs a soft-thresholding operation on the singular values of the matrix $\boldsymbol{Y}^k$. There are two remarkable features making this attractive for low-rank matrix completion problems. The first is that the soft-thresholding operation is applied to a sparse matrix; the second is that the rank of the iterates $\{\boldsymbol{X}^k\}$ is empirically nondecreasing. Both these facts allow the algorithm to make use of very minimal storage space and keep the computational cost of each iteration low. On the theoretical side, we provide a convergence analysis showing that the sequence of iterates converges. On the practical side, we provide numerical examples in which $1,000\times1,000$ matrices are recovered in less than a minute on a modest desktop computer. We also demonstrate that our approach is amenable to very large scale problems by recovering matrices of rank about 10 with nearly a billion unknowns from just about 0.4% of their sampled entries. Our methods are connected with the recent literature on linearized Bregman iterations for $\ell_1$ minimization, and we develop a framework in which one can understand these algorithms in terms of well-known Lagrange multiplier algorithms.
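The core of the method described above is the soft-thresholding step applied to the singular values of the dual iterate $\boldsymbol{Y}^k$. The following is a minimal NumPy sketch of that iteration, not the authors' code: the threshold tau, step size delta, stopping rule, and test matrix are illustrative assumptions.

```python
import numpy as np

def svt_complete(M, mask, tau=5.0, delta=1.2, max_iters=200, tol=1e-4):
    """Minimal singular value thresholding (SVT) sketch for matrix completion.

    M    : matrix of observed entries (values outside `mask` are ignored)
    mask : boolean array, True where an entry of M is observed
    tau  : soft-threshold applied to the singular values (illustrative choice)
    delta: step size of the dual update (illustrative choice)
    """
    Y = np.zeros_like(M, dtype=float)
    X = np.zeros_like(M, dtype=float)
    for _ in range(max_iters):
        # Soft-threshold the singular values of the dual iterate Y.
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        s_shrunk = np.maximum(s - tau, 0.0)
        X = (U * s_shrunk) @ Vt
        # Gradient-style update on the observed entries only.
        residual = mask * (M - X)
        Y += delta * residual
        if np.linalg.norm(residual) <= tol * np.linalg.norm(mask * M):
            break
    return X

# Tiny usage example: a rank-1 matrix observed on roughly 60% of its entries.
rng = np.random.default_rng(0)
A = np.outer(rng.standard_normal(20), rng.standard_normal(20))
mask = rng.random(A.shape) < 0.6
A_hat = svt_complete(mask * A, mask, tau=2.0)
```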

4,762 citations

Journal ArticleDOI

[...]

TL;DR: The generalized cross-validation (GCV) method as discussed by the authors is a rotation-invariant version of Allen's PRESS for choosing the ridge parameter, which can also be used in subset selection and singular value truncation, and even to choose from among mixtures of these methods.
Abstract: Consider the ridge estimate $\hat{\beta}(\lambda)$ for $\beta$ in the model $y = X\beta + \epsilon$, with the noise variance $\sigma^2$ unknown, where $\hat{\beta}(\lambda) = (X^T X + n\lambda I)^{-1} X^T y$. We study the method of generalized cross-validation (GCV) for choosing a good value for $\lambda$ from the data. The estimate is the minimizer of $V(\lambda)$ given by $V(\lambda) = \frac{\frac{1}{n}\|(I - A(\lambda))y\|^2}{[\frac{1}{n}\operatorname{Tr}(I - A(\lambda))]^2}$, where $A(\lambda) = X(X^T X + n\lambda I)^{-1} X^T$. This estimate is a rotation-invariant version of Allen's PRESS, or ordinary cross-validation. This estimate behaves like a risk improvement estimator, but does not require an estimate of $\sigma^2$, so it can be used when $n - p$ is small, or even when $p \geq n$ in certain cases. The GCV method can also be used in subset selection and singular value truncation methods for regression, and even to choose from among mixtures of these methods.
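As a small illustration of the criterion above, the sketch below (not from the paper) evaluates $V(\lambda)$ directly from its definition with NumPy and picks $\lambda$ on a grid; the simulated data and the grid are assumptions made purely for the example.

```python
import numpy as np

def gcv_score(X, y, lam):
    """Generalized cross-validation score V(lambda) for ridge regression,
    with A(lambda) = X (X^T X + n*lambda*I)^{-1} X^T as in the abstract."""
    n, p = X.shape
    A = X @ np.linalg.solve(X.T @ X + n * lam * np.eye(p), X.T)
    resid = (np.eye(n) - A) @ y
    return (resid @ resid / n) / (np.trace(np.eye(n) - A) / n) ** 2

# Choose lambda by minimizing V(lambda) over a grid (illustrative data).
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 5))
beta = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ beta + 0.5 * rng.standard_normal(50)
grid = np.logspace(-4, 1, 30)
lam_best = min(grid, key=lambda lam: gcv_score(X, y, lam))
```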

3,365 citations

DOI

[...]

John Doyle
01 Nov 1982
TL;DR: In this article, a general approach for analysing linear systems with structured uncertainty based on a new generalised spectral theory for matrices is introduced, which naturally extends techniques based on singular values and eliminates their most serious difficulties.
Abstract: The paper introduces a general approach for analysing linear systems with structured uncertainty based on a new generalised spectral theory for matrices. The results of the paper naturally extend techniques based on singular values and eliminate their most serious difficulties.

1,958 citations

Journal ArticleDOI

[...]

TL;DR: In this article, a numerically stable and fairly fast scheme is described to compute the unitary matrices U and V which transform a given matrix A into a diagonal form $\Sigma = U^*AV$, thus exhibiting A's singular values on $\Sigma$'s diagonal.
Abstract: A numerically stable and fairly fast scheme is described to compute the unitary matrices U and V which transform a given matrix A into a diagonal form $\Sigma = U^*AV$, thus exhibiting A's singular values on $\Sigma$'s diagonal. The scheme first transforms A to a bidiagonal matrix J, then diagonalizes J. The scheme described here is complicated but does not suffer from the computational difficulties which occasionally afflict some previously known methods. Some applications are mentioned, in particular the use of the pseudo-inverse $A^I = V\Sigma^I U^*$ to solve least squares problems in a way which dampens spurious oscillation and cancellation.
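As a rough illustration of that last application, the sketch below uses a library SVD routine (which internally follows the same bidiagonalize-then-diagonalize strategy) to form a truncated pseudo-inverse solution of a least-squares problem; the truncation threshold rcond and the test matrix are illustrative assumptions, not values from the paper.

```python
import numpy as np

def lstsq_truncated_svd(A, b, rcond=1e-10):
    """Solve min ||Ax - b|| via the SVD A = U diag(s) V^T, discarding
    singular values below rcond * max(s) to suppress spurious cancellation."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > rcond * s.max()
    # Apply the truncated pseudo-inverse V diag(1/s) U^T to b.
    return Vt[keep].T @ ((U[:, keep].T @ b) / s[keep])

# Usage on a nearly rank-deficient matrix (illustrative data).
A = np.array([[1.0, 1.0], [1.0, 1.0 + 1e-12], [2.0, 2.0]])
b = np.array([2.0, 2.0, 4.0])
x = lstsq_truncated_svd(A, b, rcond=1e-8)
```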

1,629 citations

Journal ArticleDOI

[...]

TL;DR: A tutorial introduction to the complex structured singular value (μ) is presented, with an emphasis on the mathematical aspects of μ.
Abstract: A tutorial introduction to the complex structured singular value (μ) is presented, with an emphasis on the mathematical aspects of μ. The μ-based methods discussed here have been useful for analysing the performance and robustness properties of linear feedback systems. Several tests for robust stability and performance with computable bounds for transfer functions and their state space realizations are compared, and a simple synthesis problem is studied. Uncertain systems are represented using Linear Fractional Transformations (LFTs) which naturally unify the frequency-domain and state space methods.
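One of the standard computable bounds referred to above is the scaled-singular-value upper bound $\mu(M) \le \inf_D \bar{\sigma}(D M D^{-1})$, where $D$ ranges over positive scalings that commute with the uncertainty structure. The sketch below assumes a structure of scalar complex blocks (so the scalings are positive diagonal matrices) and uses a general-purpose optimizer; it is an illustrative upper-bound computation, not the paper's analysis or synthesis machinery.

```python
import numpy as np
from scipy.optimize import minimize

def mu_upper_bound(M):
    """Upper bound on the structured singular value for a structure of
    scalar complex blocks: minimize the largest singular value of
    D M D^{-1} over positive diagonal scalings D = diag(exp(d))."""
    n = M.shape[0]

    def scaled_sigma_max(d):
        D = np.diag(np.exp(d))
        Dinv = np.diag(np.exp(-d))
        return np.linalg.svd(D @ M @ Dinv, compute_uv=False)[0]

    res = minimize(scaled_sigma_max, np.zeros(n), method="Nelder-Mead")
    return res.fun

# The unscaled maximum singular value is itself a (generally looser) bound:
# for this test matrix it equals 4, while the scaled bound is about 1.
M = np.array([[0.0, 4.0], [0.25, 0.0]])
print(np.linalg.svd(M, compute_uv=False)[0], mu_upper_bound(M))
```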

1,452 citations


Network Information
Related Topics (5)
Linear system: 59.5K papers, 1.4M citations, 83% related
Matrix (mathematics): 105.5K papers, 1.9M citations, 82% related
Optimization problem: 96.4K papers, 2.1M citations, 82% related
Robustness (computer science): 94.7K papers, 1.6M citations, 81% related
Differential equation: 88K papers, 2M citations, 80% related
Performance Metrics
No. of papers in the topic in previous years:
Year    Papers
2023    58
2022    110
2021    270
2020    333
2019    352
2018    313