Open Access Book Chapter

Tensor Completion in Hierarchical Tensor Representations

TL;DR: This book chapter considers versions of iterative hard thresholding schemes adapted to hierarchical tensor formats and provides first partial convergence results based on a tensor version of the restricted isometry property (TRIP) of the measurement map.
Abstract
Compressed sensing extends from the recovery of sparse vectors from undersampled measurements via efficient algorithms to the recovery of low-rank matrices from incomplete information. Here we consider a further extension to the reconstruction of tensors of low multi-linear rank in recently introduced hierarchical tensor formats from a small number of measurements. Hierarchical tensors are a flexible generalization of the well-known Tucker representation and have the advantage that the number of degrees of freedom of a low-rank tensor does not scale exponentially with the order of the tensor. While corresponding tensor decompositions can be computed efficiently via successive applications of (matrix) singular value decompositions, some important properties of the singular value decomposition do not extend from the matrix to the tensor case. This results in major computational and theoretical difficulties in designing and analyzing algorithms for low-rank tensor recovery. For instance, a canonical analogue of the tensor nuclear norm is NP-hard to compute in general, in stark contrast to the matrix case. In this book chapter we consider versions of iterative hard thresholding schemes adapted to hierarchical tensor formats. One variant builds on methods from Riemannian optimization and uses a retraction mapping from the tangent space of the manifold of low-rank tensors back to this manifold. We provide first partial convergence results based on a tensor version of the restricted isometry property (TRIP) of the measurement map. Moreover, we provide an estimate of the number of measurements that ensures the TRIP for a given tensor rank with high probability for Gaussian measurement maps.
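To make the recovery scheme concrete, the following is a minimal sketch of tensor iterative hard thresholding in Python, assuming a Gaussian measurement map stored as a dense matrix acting on vectorized tensors. The chapter's algorithms operate in hierarchical tensor formats; this sketch substitutes the simpler truncated higher-order SVD (projection onto low multilinear, i.e. Tucker, rank via successive matrix singular value decompositions, as described above) for the hard-thresholding step. All names and parameters (unfold, mode_mult, hosvd_truncate, tensor_iht, the unit step size) are illustrative assumptions, not the authors' implementation.

import numpy as np

def unfold(T, mode):
    """Mode-`mode` matricization: rows indexed by the chosen mode."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_mult(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    return np.moveaxis(np.tensordot(M, T, axes=(1, mode)), 0, mode)

def hosvd_truncate(T, ranks):
    """Quasi-optimal projection onto multilinear rank <= `ranks` via
    truncated HOSVD (successive matrix SVDs of the unfoldings)."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for m, U in enumerate(factors):
        core = mode_mult(core, U.T, m)   # project onto dominant subspaces
    for m, U in enumerate(factors):
        core = mode_mult(core, U, m)     # map back to the full-size tensor
    return core

def tensor_iht(A, y, shape, ranks, n_iter=300, step=1.0):
    """Iterate X_{k+1} = H_r(X_k + step * A^T (y - A X_k)), where the HOSVD
    truncation H_r plays the role of hard thresholding."""
    X = np.zeros(shape)
    for _ in range(n_iter):
        grad = A.T @ (y - A @ X.ravel())  # gradient of 0.5 * ||y - A x||^2
        X = hosvd_truncate(X + step * grad.reshape(shape), ranks)
    return X

# Toy usage: recover a random multilinear rank-(2, 2, 2) tensor from
# m Gaussian measurements, m well below the ambient dimension 10^3.
rng = np.random.default_rng(0)
shape, ranks = (10, 10, 10), (2, 2, 2)
X_true = rng.standard_normal(ranks)
for m, (n, r) in enumerate(zip(shape, ranks)):
    X_true = mode_mult(X_true, rng.standard_normal((n, r)), m)
m_meas = 300
A = rng.standard_normal((m_meas, X_true.size)) / np.sqrt(m_meas)
y = A @ X_true.ravel()
X_hat = tensor_iht(A, y, shape, ranks)
print("relative error:", np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))

Under the TRIP, the gradient step acts almost isometrically on tensors of low rank, which is what the partial convergence results exploit; in the toy run above, recovery succeeds because the number of measurements (300) comfortably exceeds the roughly 70 degrees of freedom of a rank-(2, 2, 2) tensor of size 10 x 10 x 10.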


Citations
Journal Article

Model Compression and Hardware Acceleration for Neural Networks: A Comprehensive Survey

TL;DR: This article reviews mainstream compression approaches such as compact models, tensor decomposition, data quantization, and network sparsification, shows how to leverage these methods in the design of neural network accelerators, and presents state-of-the-art hardware architectures.
Monograph

Low-rank methods for high-dimensional approximation and model order reduction

TL;DR: The problem of best approximation in subsets of low-rank tensors is analyzed and its connection with the problem of optimal model reduction in low-dimensional reduced spaces is discussed.
Journal Article

Efficient Tensor Completion for Color Image and Video Recovery: Low-Rank Tensor Train

TL;DR: This paper proposes a novel tensor completion approach based on the tensor train (TT) rank, which is able to capture hidden information in tensors thanks to its definition via a well-balanced matricization scheme.
Journal Article

Tensor Completion Algorithms in Big Data Analytics

TL;DR: Provides a modern overview of recent advances in tensor completion algorithms from the perspective of big data analytics, characterized by diverse variety, large volume, and high velocity.
References
Journal Article

Tensor Decompositions and Applications

TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Journal Article

Density matrix formulation for quantum renormalization groups

TL;DR: A generalization of the numerical renormalization-group procedure used first by Wilson for the Kondo problem is presented and it is shown that this formulation is optimal in a certain sense.
Journal Article

Exact Matrix Completion via Convex Optimization

TL;DR: It is proved that one can perfectly recover most low-rank matrices from what appears to be an incomplete set of entries, and that objects other than signals and images can be perfectly reconstructed from very limited information.
Journal Article

A Multilinear Singular Value Decomposition

TL;DR: There is a strong analogy between several properties of the matrix and the higher-order tensor decomposition; uniqueness, link with the matrix eigenvalue decomposition, first-order perturbation effects, etc., are analyzed.