Open Access · Posted Content
Estimation of low-rank tensors via convex optimization
TLDR
Three approaches for the estimation of the Tucker decomposition of multi-way arrays (tensors) from partial observations are proposed, and it is shown that the proposed convex-optimization-based approaches are more accurate in predictive performance, faster, and more reliable in recovering a known multilinear structure than conventional approaches.
Abstract
In this paper, we propose three approaches for the estimation of the Tucker decomposition of multi-way arrays (tensors) from partial observations. All approaches are formulated as convex minimization problems; therefore, the minimum is guaranteed to be unique. The proposed approaches can automatically estimate the number of factors (rank) through the optimization, so there is no need to specify the rank beforehand. The key technique we employ is trace norm regularization, a popular approach for the estimation of low-rank matrices. In addition, we propose a simple heuristic to improve the interpretability of the obtained factorization. The advantages and disadvantages of the three proposed approaches are demonstrated through numerical experiments on both synthetic and real-world datasets. We show that the proposed convex-optimization-based approaches are more accurate in predictive performance, faster, and more reliable in recovering a known multilinear structure than conventional approaches.
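A central ingredient in trace-norm-regularized low-rank estimation of this kind is the proximal operator of the matrix trace (nuclear) norm, which soft-thresholds the singular values. A minimal numpy sketch of that operator (the matrix sizes, threshold value, and synthetic data below are illustrative, not taken from the paper):

```python
import numpy as np

def nuclear_prox(X, tau):
    """Proximal operator of the trace (nuclear) norm:
    soft-thresholds the singular values of X by tau."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Example: shrinking a noisy low-rank matrix back toward low rank.
rng = np.random.default_rng(0)
L = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 15))  # rank 3
X = L + 0.01 * rng.standard_normal((20, 15))                     # small noise
Y = nuclear_prox(X, tau=1.0)  # noise-level singular values are zeroed out
```

Because soft-thresholding sets small singular values exactly to zero, iterating this operator inside a proximal-gradient or ADMM loop is what lets such methods estimate the rank automatically rather than requiring it up front.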
Citations
Journal Article
Tensor completion for estimating missing values in visual data
TL;DR: The contribution of this paper is to extend the matrix case to the tensor case by proposing the first definition of the trace norm for tensors and building a working algorithm to estimate missing values in tensors of visual data.
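One common way to make such a tensor trace norm concrete is the "overlapped" definition: a weighted sum of the matrix nuclear norms of the tensor's mode-k unfoldings. A small numpy sketch under that definition (the helper names and the uniform 1/N weighting are illustrative choices):

```python
import numpy as np

def unfold(T, mode):
    """Mode-k unfolding: move axis `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def tensor_trace_norm(T, weights=None):
    """Sum of matrix nuclear norms of the mode-k unfoldings
    (the 'overlapped' trace norm; weights default to 1/N per mode)."""
    N = T.ndim
    if weights is None:
        weights = [1.0 / N] * N
    return sum(w * np.linalg.norm(unfold(T, k), ord='nuc')
               for k, w in zip(range(N), weights))

# A rank-1 tensor: every unfolding has rank 1, so each nuclear norm
# equals the tensor's Frobenius norm, sqrt(2*3*4).
T = np.einsum('i,j,k->ijk', np.ones(2), np.ones(3), np.ones(4))
val = tensor_trace_norm(T)
```

Since each unfolding's nuclear norm is convex in the tensor entries, their weighted sum is convex as well, which is what allows completion of visual data to be posed as a convex program.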
Journal Article
Tensor Robust Principal Component Analysis with a New Tensor Nuclear Norm
TL;DR: This paper proposes a tensor robust principal component analysis (TRPCA) model based on the tensor-tensor product (t-product) to recover the low-rank and sparse components from their sum.
Proceedings Article
Low-Rank Tensor Constrained Multiview Subspace Clustering
TL;DR: A low-rank tensor constraint is introduced to explore the complementary information from multiple views and, accordingly, a novel method called Low-rank Tensor constrained Multiview Subspace Clustering (LT-MSC) is established.
Proceedings Article
Tensor Robust Principal Component Analysis: Exact Recovery of Corrupted Low-Rank Tensors via Convex Optimization
TL;DR: This work proves that under certain suitable assumptions, it can recover both the low-rank and the sparse components exactly by simply solving a convex program whose objective is a weighted combination of the tensor nuclear norm and the l1-norm.
Journal Article
Robust Low-Rank Tensor Recovery: Models and Algorithms
Donald Goldfarb, Zhiwei Qin, et al.
TL;DR: This paper proposes tailored optimization algorithms with global convergence guarantees for solving both the constrained and the Lagrangian formulations of the problem and proposes a nonconvex model that can often improve the recovery results from the convex models.
References
Journal Article
Regression Shrinkage and Selection via the Lasso
TL;DR: A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.
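The lasso objective described above can be minimized with proximal gradient descent (ISTA): each iteration takes a gradient step on the squared loss and then soft-thresholds the coefficients, the ℓ1 analogue of the singular-value thresholding used for trace norms. A minimal sketch (the problem sizes, regularization weight, and synthetic data are illustrative):

```python
import numpy as np

def lasso_ista(X, y, lam, n_iter=500):
    """Lasso via proximal gradient (ISTA): minimize
    0.5*||y - X w||^2 + lam*||w||_1 by gradient steps on the
    quadratic loss followed by coordinate-wise soft-thresholding."""
    w = np.zeros(X.shape[1])
    step = 1.0 / np.linalg.norm(X, ord=2) ** 2  # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)
        z = w - step * grad
        w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return w

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]                      # sparse ground truth
y = X @ w_true + 0.01 * rng.standard_normal(100)
w_hat = lasso_ista(X, y, lam=1.0)
```

As with the trace norm, the soft-thresholding step sets small coefficients exactly to zero, which is why the lasso performs variable selection rather than mere shrinkage.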
Book
Convex Optimization
Stephen Boyd, Lieven Vandenberghe, et al.
TL;DR: This book gives a comprehensive introduction to convex optimization, focusing on recognizing convex optimization problems and then finding the most appropriate technique for solving them.
Book
Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
TL;DR: It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.
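For the lasso, for instance, ADMM splits the smooth loss and the ℓ1 penalty into separate subproblems linked by a consensus constraint w = z. A hedged sketch of the standard scaled-form updates (the choices of ρ, λ, iteration count, and the synthetic data are illustrative):

```python
import numpy as np

def lasso_admm(X, y, lam, rho=1.0, n_iter=200):
    """Lasso by ADMM: split min 0.5||y - Xw||^2 + lam||z||_1 s.t. w = z.
    Alternates a ridge-like w-update, a soft-thresholding z-update,
    and a dual ascent step on the scaled dual variable u."""
    n, d = X.shape
    Xty = X.T @ y
    A = X.T @ X + rho * np.eye(d)  # reused by every w-update
    z = np.zeros(d)
    u = np.zeros(d)
    for _ in range(n_iter):
        w = np.linalg.solve(A, Xty + rho * (z - u))                       # w-update
        z = np.sign(w + u) * np.maximum(np.abs(w + u) - lam / rho, 0.0)   # z-update
        u = u + w - z                                                     # dual update
    return z  # the sparse consensus iterate

rng = np.random.default_rng(2)
X = rng.standard_normal((80, 10))
w_true = np.zeros(10)
w_true[:2] = [1.5, -2.0]
y = X @ w_true + 0.01 * rng.standard_normal(80)
w_hat = lasso_admm(X, y, lam=0.5)
```

The appeal for large-scale and distributed settings is that each subproblem is simple (a linear solve and an elementwise threshold) and the data-dependent pieces can be split across machines, with only the consensus variables exchanged.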
Book
Numerical Optimization
Jorge Nocedal, Stephen J. Wright, et al.
TL;DR: Numerical Optimization presents a comprehensive and up-to-date description of the most effective methods in continuous optimization, responding to the growing interest in optimization in engineering, science, and business by focusing on the methods that are best suited to practical problems.