Open Access · Posted Content

Quasi-Newton methods on Grassmannians and multilinear approximations of tensors

TL;DR
In this article, the authors propose quasi-Newton and limited-memory (L-BFGS) methods for objective functions defined on a Grassmannian or a product of Grassmannians, and prove that, when local coordinates are used, these methods share the same optimality property as the usual BFGS updates on Euclidean spaces.
Abstract
In this paper we propose quasi-Newton and limited-memory quasi-Newton methods for objective functions defined on Grassmannians or a product of Grassmannians. Specifically, we define BFGS and L-BFGS updates in local and global coordinates on Grassmannians or products thereof. We prove that, when local coordinates are used, our BFGS updates on Grassmannians share the same optimality property as the usual BFGS updates on Euclidean spaces. When applied to the best multilinear rank approximation problem for general and symmetric tensors, our approach yields fast, robust, and accurate algorithms that exploit the special Grassmannian structure of the respective problems and work on tensors of large dimension and arbitrarily high order. Extensive numerical experiments are included to substantiate our claims.
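The BFGS machinery the abstract refers to can be illustrated with a short sketch. Below is the standard Euclidean inverse-Hessian BFGS update (the baseline whose optimality property the paper's Grassmannian updates are shown to share), together with a QR-based retraction, which is one common way to map a tangent step back onto a Grassmannian represented by orthonormal matrices. The function names and the retraction choice are my own illustration, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Standard Euclidean inverse-Hessian BFGS update:
# s = x_new - x_old, y = grad_new - grad_old, H approximates the inverse Hessian.
def bfgs_update(H, s, y):
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# A point on the Grassmannian Gr(n, p) can be represented by an n-by-p matrix
# with orthonormal columns; a QR-based retraction (one common choice, not
# necessarily the paper's) maps a tangent step back onto the manifold.
def qr_retract(Y, step):
    Q, R = np.linalg.qr(Y + step)
    return Q * np.sign(np.diag(R))  # fix column signs for continuity

n, p = 6, 2
s, y = rng.standard_normal(n), rng.standard_normal(n)
H_new = bfgs_update(np.eye(n), s, y)  # satisfies the secant condition H_new @ y == s

Y0, _ = np.linalg.qr(rng.standard_normal((n, p)))
Y1 = qr_retract(Y0, 0.1 * rng.standard_normal((n, p)))  # columns stay orthonormal
```

The secant condition `H_new @ y == s` holds by construction of the update; in the paper's setting the analogous update is carried out in local or global coordinates on the manifold rather than in a fixed Euclidean chart.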


Citations
Journal Article

Tensor Decomposition for Signal Processing and Machine Learning

TL;DR: The material covered includes tensor rank and rank decomposition; basic tensor factorization models and their relationships and properties; broad coverage of algorithms ranging from alternating optimization to stochastic gradient; statistical performance analysis; and applications ranging from source separation to collaborative filtering, mixture and topic modeling, classification, and multilinear subspace learning.
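The alternating-optimization family this survey covers can be made concrete with a minimal CP decomposition by alternating least squares: fix all factors but one and solve a linear least-squares problem for the remaining factor. The function names, unfolding convention, and dimensions below are my own illustrative choices, not code from the survey.

```python
import numpy as np

rng = np.random.default_rng(0)

def unfold(T, mode):
    """Mode-k unfolding: move mode k to the front, flatten the rest (C order)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker product of two factor matrices."""
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def cp_als(T, R, iters=500):
    """Rank-R CP model of a 3-way tensor by alternating least squares."""
    A, B, C = (rng.standard_normal((n, R)) for n in T.shape)
    for _ in range(iters):
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C

# Recover an exactly rank-2 tensor built from known factors.
Agt, Bgt, Cgt = (rng.standard_normal((n, 2)) for n in (4, 5, 6))
T = np.einsum('ir,jr,kr->ijk', Agt, Bgt, Cgt)
A, B, C = cp_als(T, 2)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(T_hat - T) / np.linalg.norm(T)
```

For an exactly low-rank tensor the ALS iteration typically drives the relative reconstruction error to numerical precision; the individual factors are recovered only up to the usual permutation and scaling indeterminacies of the CP model.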
Journal Article

Tensor Decompositions for Signal Processing Applications: From two-way to multiway component analysis

TL;DR: Benefiting from the power of multilinear algebra as their mathematical backbone, data analysis techniques using tensor decompositions are shown to have great flexibility in the choice of constraints which match data properties and extract more general latent components in the data than matrix-based methods.
Journal Article

A literature survey of low-rank tensor approximation techniques

TL;DR: This survey attempts to give a literature overview of current developments in low-rank tensor approximation, with an emphasis on function-related tensors.
Journal Article

On manifolds of tensors of fixed TT-rank

TL;DR: This paper proves that the TT (or compression) ranks ri of a tensor U are unique and equal to the respective separation ranks of U if the components of the TT decomposition are required to fulfil a certain maximal rank condition.
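The statement that TT ranks are unique and equal to separation ranks can be illustrated with the standard TT-SVD construction: cores are computed by sequential SVDs, and the rank found at step k is exactly the matrix rank of the corresponding unfolding. The function name and tolerance below are my own choices for a sketch, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def tt_svd(U, eps=1e-10):
    """TT decomposition by sequential SVDs; core k has shape (r_{k-1}, n_k, r_k).
    The rank r_k equals the matrix rank of the k-th sequential unfolding of U."""
    dims, d = U.shape, U.ndim
    cores, r_prev = [], 1
    C = U.reshape(dims[0], -1)
    for k in range(d - 1):
        C = C.reshape(r_prev * dims[k], -1)
        W, sv, Vt = np.linalg.svd(C, full_matrices=False)
        r = int(np.sum(sv > eps * sv[0]))  # numerical rank = TT rank r_k
        cores.append(W[:, :r].reshape(r_prev, dims[k], r))
        C = sv[:r, None] * Vt[:r]
        r_prev = r
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores

T = rng.standard_normal((3, 4, 5))
cores = tt_svd(T)

# Contract the train back into a full tensor; without truncation this is exact.
full = cores[0]
for G in cores[1:]:
    full = np.tensordot(full, G, axes=1)
full = full.reshape(T.shape)
```

For a generic 3 x 4 x 5 tensor the TT ranks come out as (3, 5), i.e. the full matrix ranks of the two sequential unfoldings, matching the uniqueness result summarized above.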
Journal Article

Comparing, optimizing, and benchmarking quantum-control algorithms in a unifying programming framework

TL;DR: A unifying algorithmic framework, DYNAMO (dynamic optimization platform), is introduced to provide the quantum-technology community with a convenient MATLAB-based toolset for optimal control, and to give researchers in optimal-control techniques a framework for benchmarking and comparing newly proposed algorithms against the state of the art.
References
Journal Article

Tensor Decompositions and Applications

TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Book

Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Classics in Applied Mathematics, 16)

TL;DR: Dennis and Schnabel present a modular system of algorithms for unconstrained minimization and nonlinear equations, built on Newton's method for solving one equation in one unknown and on the convergence of sequences of real numbers.
Book

Numerical methods for unconstrained optimization and nonlinear equations

TL;DR: Covers Newton's method for nonlinear equations and unconstrained minimization, as well as methods for solving nonlinear least-squares problems with special structure.
Journal Article

A Multilinear Singular Value Decomposition

TL;DR: There is a strong analogy between several properties of the matrix SVD and this higher-order tensor decomposition; uniqueness, the link with the matrix eigenvalue decomposition, first-order perturbation effects, etc., are analyzed.
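The multilinear SVD summarized here admits a compact sketch, assuming a 3-way array and NumPy conventions of my own choosing: each factor matrix is taken from the left singular vectors of a mode unfolding, and the core tensor is obtained by multiplying the transposed factors into the tensor. Truncating the factors gives the standard quasi-optimal starting point for the best multilinear rank approximation problem treated in the main paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def unfold(T, mode):
    """Mode-k unfolding: mode-k fibers become the columns."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_mult(S, M, k):
    """Mode-k product: multiply the matrix M into mode k of the tensor S."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(S, k, 0), axes=1), 0, k)

def hosvd(T):
    """Higher-order SVD: U_k = left singular vectors of the mode-k unfolding;
    the core is S = T x_1 U1^T x_2 U2^T x_3 U3^T (all-orthogonal, ordered)."""
    Us = [np.linalg.svd(unfold(T, k), full_matrices=False)[0]
          for k in range(T.ndim)]
    S = T
    for k, U in enumerate(Us):
        S = mode_mult(S, U.T, k)
    return S, Us

T = rng.standard_normal((3, 4, 5))
S, Us = hosvd(T)

# Multiplying the factors back in recovers T exactly (no truncation here).
T_rec = S
for k, U in enumerate(Us):
    T_rec = mode_mult(T_rec, U, k)
```

Keeping only the leading r_k columns of each `U_k` yields a rank-(r1, r2, r3) approximation; unlike the matrix case this truncation is not optimal, which is precisely why iterative refinement such as the quasi-Newton methods of the main paper is used.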