Open Access · Posted Content

Asymptotic Log-Det Sum-of-Ranks Minimization via Tensor (Alternating) Iteratively Reweighted Least Squares.

TL;DR
In this paper, iteratively reweighted least squares with weight strength $p = 0$ is shown to remain a viable method for affine sum-of-ranks minimization.
Abstract
Affine sum-of-ranks minimization (ASRM) generalizes the affine rank minimization (ARM) problem from matrices to tensors. Here, the interest lies in the ranks of a family $\mathcal{K}$ of different matricizations. Transferring our previously discussed results on asymptotic log-det rank minimization, we show that iteratively reweighted least squares with weight strength $p = 0$ remains a particularly viable method, both theoretically and practically, denoted as $\mathrm{IRLS}$-$0\mathcal{K}$. As in the matrix case, we prove global convergence of asymptotic minimizers of the log-det sum-of-ranks function to desired solutions. Further, we show local convergence of $\mathrm{IRLS}$-$0\mathcal{K}$ depending on the rate of decline of the regularization parameter $\gamma \searrow 0$ appearing therein. For hierarchical families $\mathcal{K}$, we show how an alternating version ($\mathrm{AIRLS}$-$0\mathcal{K}$, related to prior work under the name $\mathrm{SALSA}$) can be evaluated solely through tensor tree network based operations. The method avoids exponential computational complexity and can thereby be applied in high dimensions. Further, the otherwise crucial rank adaptation process becomes essentially superfluous, even for completion problems. In numerical experiments, we show that the subspace restrictions and the relaxation of the affine constraint required for this cause only a marginal loss of approximation quality. On the other hand, we demonstrate that $\mathrm{IRLS}$-$0\mathcal{K}$ makes it possible to observe the theoretical phase transition for generic tensor recoverability in practice as well. Finally, we apply $\mathrm{AIRLS}$-$0\mathcal{K}$ to larger-scale problems.
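To make the reweighting idea concrete, below is a minimal sketch of the matrix-case IRLS-0 iteration that the paper generalizes to families of matricizations. It targets matrix completion, where the affine constraint fixes observed entries: the log-det surrogate $\log\det(XX^T + \gamma I)$ is attacked by alternating the weight update $W = (XX^T + \gamma I)^{-1}$ with a weighted least-squares solve while $\gamma$ is driven toward $0$. The function name `irls0_completion` and the geometric decay schedule for $\gamma$ are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def irls0_completion(Y, mask, gamma0=1.0, gamma_min=1e-8, decay=0.9, n_iter=200):
    """Hypothetical matrix-case IRLS-0 sketch: fill in the unobserved entries
    of Y by minimizing the surrogate log det(X X^T + gamma I)."""
    m, n = Y.shape
    X = np.where(mask, Y, 0.0)
    gamma = gamma0
    for _ in range(n_iter):
        # reweighting step: W = (X X^T + gamma I)^{-1}
        W = np.linalg.inv(X @ X.T + gamma * np.eye(m))
        # weighted least squares: min tr(W X X^T) subject to the observed
        # entries; the objective decouples over the columns of X
        for j in range(n):
            obs, free = mask[:, j], ~mask[:, j]
            if free.any():
                # stationarity of x^T W x in the free entries:
                # W_ff x_f = -W_fo x_o
                Wff = W[np.ix_(free, free)]
                Wfo = W[np.ix_(free, obs)]
                X[free, j] = np.linalg.solve(Wff, -Wfo @ Y[obs, j])
            X[obs, j] = Y[obs, j]
        gamma = max(decay * gamma, gamma_min)  # drive gamma toward 0
    return X

# usage on a synthetic rank-2 completion problem
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 20))
mask = rng.random(A.shape) < 0.6
X = irls0_completion(A, mask)
print(np.linalg.norm(X - A) / np.linalg.norm(A))  # relative error
```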


Citations
Posted Content

Algebraic compressed sensing.

TL;DR: In this paper, a broad subclass of algebraic compressed sensing problems is introduced, in which structured signals are modeled either explicitly or implicitly via polynomials; this subclass includes low-rank matrix and tensor recovery.
References
Journal Article

Tensor Decompositions and Applications

TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Journal Article

Enhancing Sparsity by Reweighted ℓ1 Minimization

TL;DR: A novel method for sparse signal recovery is proposed that in many situations outperforms ℓ1 minimization, in the sense that substantially fewer measurements are needed for exact recovery.
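Since IRLS-0 is the least-squares analogue of this reweighting idea, a minimal sketch may help: the scheme from this reference alternates a weighted ℓ1 solve (a linear program) with the weight update w_i = 1/(|x_i| + ε), so small entries receive large weights. The function name `reweighted_l1`, the LP encoding via SciPy's linprog, and the fixed iteration count are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def reweighted_l1(A, b, n_iter=5, eps=1e-3):
    """Hypothetical sketch of reweighted l1 minimization: each iteration
    solves min sum_i w_i |x_i| s.t. Ax = b, then updates the weights."""
    m, n = A.shape
    w = np.ones(n)
    # LP variables z = [x, t] with |x_i| <= t_i encoded as two inequalities
    A_ub = np.block([[np.eye(n), -np.eye(n)], [-np.eye(n), -np.eye(n)]])
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([A, np.zeros((m, n))])
    bounds = [(None, None)] * n + [(0, None)] * n
    for _ in range(n_iter):
        c = np.concatenate([np.zeros(n), w])  # objective: sum_i w_i t_i
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b, bounds=bounds)
        x = res.x[:n]
        w = 1.0 / (np.abs(x) + eps)  # small entries receive large weights
    return x

# usage on a synthetic sparse recovery problem
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[rng.choice(100, 8, replace=False)] = rng.standard_normal(8)
x_hat = reweighted_l1(A, A @ x_true)
print(np.linalg.norm(x_hat - x_true))
```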
Journal Article

A Multilinear Singular Value Decomposition

TL;DR: There is a strong analogy between several properties of the matrix and the higher-order tensor decompositions; uniqueness, the link with the matrix eigenvalue decomposition, first-order perturbation effects, etc., are analyzed.
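A minimal sketch of the decomposition analyzed in this reference: the multilinear SVD computes one matrix SVD per mode-k unfolding and contracts the transposed factors into a core tensor. The C-order unfolding convention and the function name `hosvd` are implementation assumptions.

```python
import numpy as np

def hosvd(T):
    """Sketch of the higher-order SVD: one matrix SVD per mode."""
    U = []
    for k in range(T.ndim):
        # mode-k matricization (C-order): mode k indexes the rows
        Tk = np.moveaxis(T, k, 0).reshape(T.shape[k], -1)
        u, _, _ = np.linalg.svd(Tk, full_matrices=False)
        U.append(u)
    # core tensor S = T x_1 U1^T x_2 U2^T ... (multilinear product)
    S = T
    for k, u in enumerate(U):
        S = np.moveaxis(np.tensordot(u.T, np.moveaxis(S, k, 0), axes=1), 0, k)
    return S, U

# usage: the factors are orthogonal, so contracting them back recovers T
T = np.random.default_rng(0).standard_normal((4, 5, 6))
S, U = hosvd(T)
R = S
for k, u in enumerate(U):
    R = np.moveaxis(np.tensordot(u, np.moveaxis(R, k, 0), axes=1), 0, k)
print(np.allclose(R, T))  # True
```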
Journal Article

Some mathematical notes on three-mode factor analysis

TL;DR: The model for three-mode factor analysis is discussed in terms of newer applications of mathematical processes including a type of matrix process termed the Kronecker product and the definition of combination variables.
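The Kronecker-product formulation alluded to in this note can be verified numerically: with C-order reshapes, the mode-1 unfolding of a three-mode (Tucker) model factors as T_(1) = U1 G_(1) (U2 ⊗ U3)^T. The sizes below are arbitrary, and texts using Fortran-order unfoldings state the identity with U3 ⊗ U2 instead.

```python
import numpy as np

rng = np.random.default_rng(1)
# illustrative three-mode model: core G and factor matrices U1, U2, U3
U1, U2, U3 = (rng.standard_normal(s) for s in [(4, 2), (5, 3), (6, 2)])
G = rng.standard_normal((2, 3, 2))
# full tensor T = G x_1 U1 x_2 U2 x_3 U3
T = np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)
# mode-1 unfolding identity (C-order): T_(1) = U1 @ G_(1) @ kron(U2, U3).T
T1 = T.reshape(4, -1)
G1 = G.reshape(2, -1)
print(np.allclose(T1, U1 @ G1 @ np.kron(U2, U3).T))  # True
```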
Journal Article

Tensor-Train Decomposition

TL;DR: The new form gives a clear and convenient way to implement all basic operations efficiently, and the efficiency is demonstrated by the computation of the smallest eigenvalue of a 19-dimensional operator.
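A minimal sketch of the construction behind this reference (sequential SVDs, often called TT-SVD): each step splits one TT core off a reshaped remainder. The relative singular-value threshold `eps` is a simplification of the paper's error-controlled truncation.

```python
import numpy as np

def tt_svd(T, eps=1e-10):
    """Sketch of TT-SVD: cores G_k of shape (r_{k-1}, n_k, r_k)."""
    d, shape = T.ndim, T.shape
    cores, r, C = [], 1, T
    for k in range(d - 1):
        C = C.reshape(r * shape[k], -1)
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        keep = max(1, int(np.sum(s > eps * s[0])))  # simple rank truncation
        cores.append(U[:, :keep].reshape(r, shape[k], keep))
        C = s[:keep, None] * Vt[:keep]  # remainder absorbs singular values
        r = keep
    cores.append(C.reshape(r, shape[-1], 1))
    return cores

# usage: decompose a rank-1 tensor and rebuild it by contracting the cores
rng = np.random.default_rng(0)
T = np.einsum('i,j,k,l->ijkl', *(rng.standard_normal(4) for _ in range(4)))
cores = tt_svd(T)
R = cores[0]
for G in cores[1:]:
    R = np.tensordot(R, G, axes=(R.ndim - 1, 0))
print(np.allclose(R.squeeze((0, -1)), T), [G.shape for G in cores])
```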