Journal Article

Tensor-Train Decomposition

Ivan V. Oseledets
01 Sep 2011 · SIAM Journal on Scientific Computing, Vol. 33, Iss. 5, pp. 2295-2317
TL;DR
The new form gives a clear and convenient way to implement all basic operations efficiently, and the efficiency is demonstrated by the computation of the smallest eigenvalue of a 19-dimensional operator.
Abstract
A simple nonrecursive form of the tensor decomposition in $d$ dimensions is presented. It does not inherently suffer from the curse of dimensionality and has asymptotically the same number of parameters as the canonical decomposition, yet it is stable: its computation is based on low-rank approximation of auxiliary unfolding matrices. The new form gives a clear and convenient way to implement all basic operations efficiently. A fast rounding procedure is presented, as well as basic linear algebra operations. Examples showing the benefits of the decomposition are given, and the efficiency is demonstrated by the computation of the smallest eigenvalue of a 19-dimensional operator.
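
For readers skimming the abstract, the decomposition it describes can be written compactly. The following uses the standard notation from the TT literature (the core names $G_k$ and the rank bounds are conventional, not quoted from this page):

```latex
% Tensor-train (TT) format: every entry of a d-dimensional tensor A
% is a product of small matrices ("cores").
\[
  A(i_1, i_2, \ldots, i_d) \;=\; G_1(i_1)\, G_2(i_2) \cdots G_d(i_d),
\]
% Here G_k(i_k) is an r_{k-1} x r_k matrix for each fixed index i_k,
% with boundary ranks r_0 = r_d = 1 so that the product is a scalar.
% With mode sizes n_k <= n and TT ranks r_k <= r, storage costs
% O(d n r^2) parameters instead of the O(n^d) of the full tensor.
```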

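The abstract's remark that the computation "is based on low-rank approximation of auxiliary unfolding matrices" refers to the TT-SVD construction: sweep over the dimensions, at each step taking a truncated SVD of the current unfolding. Below is a minimal NumPy sketch of that idea; it truncates each SVD to a fixed maximum rank for brevity (the paper's rounding procedure instead uses an error-based threshold), and the function name `tt_svd` is illustrative, not a reference implementation.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Sketch of TT decomposition: sweep over dimensions, taking a
    truncated SVD of each auxiliary unfolding matrix.
    Returns cores G_k with shape (r_{k-1}, n_k, r_k)."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    r_prev = 1
    mat = tensor.reshape(shape[0], -1)      # first unfolding: n_1 x (n_2 ... n_d)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))           # fixed-rank truncation (simplification)
        cores.append(u[:, :r].reshape(r_prev, shape[k], r))
        # Fold the remainder into the next unfolding: (r * n_{k+1}) x rest.
        mat = (s[:r, None] * vt[:r]).reshape(r * shape[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, shape[-1], 1))
    return cores

# Example: decompose a 4-dimensional tensor, then rebuild it core by core.
A = np.random.rand(5, 6, 7, 8)
cores = tt_svd(A, max_rank=4)
B = cores[0]
for G in cores[1:]:
    B = np.tensordot(B, G, axes=([B.ndim - 1], [0]))  # contract the shared rank
B = B.reshape(A.shape)                                # drop boundary ranks r_0 = r_d = 1
print(np.linalg.norm(A - B) / np.linalg.norm(A))      # relative reconstruction error
```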

Citations
Journal Article

Recent advances in convolutional neural networks

TL;DR: This article provides a broad survey of recent advances in convolutional neural networks, discussing improvements in layer design, activation functions, loss functions, regularization, optimization, and fast computation.
Journal Article

Deep convolutional neural networks for image classification: A comprehensive review

TL;DR: This review, which focuses on the application of CNNs to image classification tasks, covers their development, from their predecessors up to recent state-of-the-art deep learning systems.
Journal Article

Machine learning and the physical sciences

TL;DR: This article selectively reviews recent research at the interface between machine learning and the physical sciences, including conceptual developments in ML motivated by physical insights, applications of machine learning techniques to several domains in physics, and cross-fertilization between the two fields.
Posted Content

Recent Advances in Convolutional Neural Networks

TL;DR: This paper details improvements to CNNs in several aspects, including layer design, activation functions, loss functions, regularization, optimization, and fast computation, and introduces various applications of convolutional neural networks in computer vision, speech, and natural language processing.
Journal Article

Tensor Decomposition for Signal Processing and Machine Learning

TL;DR: The material covered includes tensor rank and rank decomposition; basic tensor factorization models and their relationships and properties; broad coverage of algorithms ranging from alternating optimization to stochastic gradient; statistical performance analysis; and applications ranging from source separation to collaborative filtering, mixture and topic modeling, classification, and multilinear subspace learning.
References
Journal Article

Tensor Decompositions and Applications

TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Journal Article

Analysis of individual differences in multidimensional scaling via an N-way generalization of "Eckart-Young" decomposition

TL;DR: This paper outlines an individual-differences model for multidimensional scaling in which individuals are assumed to weight the several dimensions of a common "psychological space" differentially, and proposes a corresponding method for analyzing similarities data that generalizes Eckart-Young analysis to the decomposition of three-way (or higher-way) tables.
Journal Article

A Multilinear Singular Value Decomposition

TL;DR: A strong analogy is shown between several properties of the matrix SVD and the higher-order tensor decomposition; uniqueness, the link with the matrix eigenvalue decomposition, first-order perturbation effects, etc., are analyzed.
Journal Article

Some mathematical notes on three-mode factor analysis

TL;DR: The model for three-mode factor analysis is discussed in terms of newer applications of mathematical processes including a type of matrix process termed the Kronecker product and the definition of combination variables.

Foundations of the PARAFAC procedure: Models and conditions for an "explanatory" multi-modal factor analysis

TL;DR: It is shown that an extension of Cattell's principle of rotation to Proportional Profiles (PP) offers a basis for determining explanatory factors for three-way or higher-order multi-mode data.