Open Access Journal Article (DOI)

Tensor Decomposition for Signal Processing and Machine Learning

TL;DR: The material covered includes tensor rank and rank decomposition; basic tensor factorization models and their relationships and properties; broad coverage of algorithms ranging from alternating optimization to stochastic gradient; statistical performance analysis; and applications ranging from source separation to collaborative filtering, mixture and topic modeling, classification, and multilinear subspace learning.
Abstract
Tensors or multiway arrays are functions of three or more indices $(i,j,k,\ldots)$ —similar to matrices (two-way arrays), which are functions of two indices $(r,c)$ for (row, column). Tensors have a rich history, stretching over almost a century, and touching upon numerous disciplines; but they have only recently become ubiquitous in signal and data analytics at the confluence of signal processing, statistics, data mining, and machine learning. This overview article aims to provide a good starting point for researchers and practitioners interested in learning about and working with tensors. As such, it focuses on fundamentals and motivation (using various application examples), aiming to strike an appropriate balance of breadth and depth that will enable someone having taken first graduate courses in matrix algebra and probability to get started doing research and/or developing tensor algorithms and software. Some background in applied optimization is useful but not strictly required. The material covered includes tensor rank and rank decomposition; basic tensor factorization models and their relationships and properties (including fairly good coverage of identifiability); broad coverage of algorithms ranging from alternating optimization to stochastic gradient; statistical performance analysis; and applications ranging from source separation to collaborative filtering, mixture and topic modeling, classification, and multilinear subspace learning.
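The rank decomposition mentioned in the abstract expresses a three-way tensor as a sum of rank-1 terms, each an outer product of three vectors (the CP model). A minimal NumPy sketch of this synthesis, with arbitrary illustrative dimensions and rank:

```python
import numpy as np

# A rank-1 three-way tensor is the outer product of three vectors:
# X[i, j, k] = a[i] * b[j] * c[k].  A rank-R CP model sums R such terms.
rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 3, 2
A = rng.standard_normal((I, R))  # factor matrices, one column per rank-1 term
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# CP synthesis: X = sum_r A[:, r] outer B[:, r] outer C[:, r]
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Check one entry against the elementwise definition
i, j, k = 1, 2, 0
entry = sum(A[i, r] * B[j, r] * C[k, r] for r in range(R))
assert np.isclose(X[i, j, k], entry)
print(X.shape)  # (4, 5, 3)
```

Recovering the factor matrices A, B, C from X alone is the rank-decomposition problem that the article's identifiability and algorithm sections address.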


Citations
Posted Content

The commutation matrix: Some properties and applications

TL;DR: In this article, properties of the commutation matrix (a square matrix containing only zeros and ones) are derived, along with applications such as computing the expectation and covariance matrix of the Wishart distribution.
Journal Article (DOI)

Low-Rank Tensor Networks for Dimensionality Reduction and Large-Scale Optimization Problems: Perspectives and Challenges PART 1.

TL;DR: In this paper, the authors provide mathematical and graphical representations and interpretation of tensor networks, with the main focus on the Tucker and Tensor Train (TT) decompositions and their extensions or generalizations.
Journal Article (DOI)

Nonnegative Matrix Factorization for Signal and Data Analytics: Identifiability, Algorithms, and Applications

TL;DR: Nonnegative matrix factorization (NMF) aims to factor a data matrix into low-rank latent factor matrices with nonnegativity constraints.
Posted Content

TensorLy: Tensor Learning in Python

TL;DR: TensorLy is a Python library that provides a high-level API for tensor methods and deep tensorized neural networks and aims to follow the same standards adopted by the main projects of the Python scientific community, and to seamlessly integrate with them.
Posted Content

Introduction to Tensor Decompositions and their Applications in Machine Learning.

TL;DR: Basic tensor concepts are introduced, why tensors can be considered more rigid than matrices with respect to the uniqueness of their decomposition, the most important factorization algorithms and their properties are explained, and concrete examples of tensor decomposition applications in machine learning are provided.
References
Book

Convex Optimization

TL;DR: This book gives a comprehensive introduction to convex optimization, focusing on recognizing convex optimization problems and then finding the most appropriate technique for solving them.
Book

Distributed Optimization and Statistical Learning Via the Alternating Direction Method of Multipliers

TL;DR: It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.
Journal Article (DOI)

Fundamentals of statistical signal processing: estimation theory

TL;DR: This book presents the fundamentals of estimation theory for statistical signal processing, covering minimum variance unbiased estimation, the Cramér-Rao lower bound, and maximum likelihood, least squares, and Bayesian estimators.
Journal Article (DOI)

Tensor Decompositions and Applications

TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Journal Article (DOI)

Analysis of individual differences in multidimensional scaling via an N-way generalization of 'Eckart-Young' decomposition

TL;DR: In this paper, an individual differences model for multidimensional scaling is outlined in which individuals are assumed to differentially weight the several dimensions of a common "psychological space." A corresponding method of analyzing similarities data is proposed, involving a generalization of Eckart-Young analysis to decomposition of three-way (or higher-way) tables.