Journal ArticleDOI
Tensor Decompositions and Applications
Tamara G. Kolda, Brett W. Bader
TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Abstract: This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or $N$-way array. Decompositions of higher-order tensors (i.e., $N$-way arrays with $N \geq 3$) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, and elsewhere. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal component analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2, as well as nonnegative variants of all of the above. The N-way Toolbox, Tensor Toolbox, and Multilinear Engine are examples of software packages for working with tensors.
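The CP model the abstract describes, a tensor written as a sum of rank-one tensors, can be sketched in a few lines of NumPy. This is an illustrative toy, not code from the survey; the factor matrices are random data and the dimensions are arbitrary:

```python
import numpy as np

# A rank-R CP model represents a 3-way tensor X as a sum of R rank-one
# tensors, X[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r].
# A, B, C below are random illustrative factor matrices.
I, J, K, R = 4, 5, 6, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# Assemble the full tensor from its CP factors in one einsum.
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Equivalent construction as an explicit sum of rank-one outer products.
X_alt = sum(np.multiply.outer(np.multiply.outer(A[:, r], B[:, r]), C[:, r])
            for r in range(R))
```

Fitting CP to data (rather than assembling a tensor from known factors, as here) is typically done with alternating least squares, as implemented in the software packages the survey lists.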
Citations
Book ChapterDOI
Modeling Relational Data with Graph Convolutional Networks
Michael Sejr Schlichtkrull, Thomas Kipf, Peter Bloem, Rianne van den Berg, Ivan Titov, Max Welling
TL;DR: It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
Journal ArticleDOI
From frequency to meaning: vector space models of semantics
Peter D. Turney, Patrick Pantel
TL;DR: The goal in this survey is to show the breadth of applications of VSMs for semantics, to provide a new perspective on VSMs, and to provide pointers into the literature for those who are less familiar with the field.
Journal ArticleDOI
The structure and dynamics of multilayer networks
Stefano Boccaletti, Ginestra Bianconi, Regino Criado, C.I. del Genio, Jesús Gómez-Gardeñes, Miguel Romance, Irene Sendiña-Nadal, Zhen Wang, Massimiliano Zanin
TL;DR: This work offers a comprehensive review on both structural and dynamical organization of graphs made of diverse relationships (layers) between its constituents, and cover several relevant issues, from a full redefinition of the basic structural measures, to understanding how the multilayer nature of the network affects processes and dynamics.
Book
Nonnegative Matrix and Tensor Factorizations: Applications to Exploratory Multi-way Data Analysis and Blind Source Separation
TL;DR: This book provides a broad survey of models and efficient algorithms for Nonnegative Matrix Factorization (NMF), including NMF's various extensions and modifications, especially Nonnegative Tensor Factorizations (NTF) and Nonnegative Tucker Decompositions (NTD).
Journal ArticleDOI
Tensor-Train Decomposition
TL;DR: The new form gives a clear and convenient way to implement all basic operations efficiently, and the efficiency is demonstrated by the computation of the smallest eigenvalue of a 19-dimensional operator.
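The "new form" referred to is the tensor-train format, which represents a $d$-way tensor as a chain of 3-way cores. A minimal sketch of the TT-SVD construction via successive truncated SVDs is below; this is an illustrative reimplementation under stated assumptions (a small random rank-2 test tensor), not the paper's reference code:

```python
import numpy as np

# TT-SVD sketch: factor a d-way tensor into a "train" of 3-way cores
# G_1, ..., G_d by successive truncated SVDs of matrix unfoldings.
def tt_svd(X, tol=1e-10):
    dims = X.shape
    cores = []
    r_prev = 1
    M = X.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        r = max(1, int(np.sum(s > tol * s[0])))  # numerical rank truncation
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        M = (s[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(M.reshape(r_prev, dims[-1], 1))
    return cores

def tt_full(cores):
    # Contract the cores back into a full tensor.
    out = cores[0]
    for G in cores[1:]:
        out = np.tensordot(out, G, axes=(-1, 0))
    return out.squeeze(axis=(0, -1))

# A rank-2 test tensor: all its unfoldings have rank at most 2,
# so the TT ranks found by tt_svd stay small.
rng = np.random.default_rng(0)
a = rng.standard_normal((4, 2))
b = rng.standard_normal((5, 2))
c = rng.standard_normal((6, 2))
X = np.einsum('ir,jr,kr->ijk', a, b, c)
cores = tt_svd(X)
```

Because the cores are small when the TT ranks are low, basic operations (addition, contraction, matrix-by-vector products) can be carried out on the cores without ever forming the full tensor, which is the efficiency the TL;DR highlights.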
References
Book
The Art of Computer Programming
Journal ArticleDOI
Indexing by Latent Semantic Analysis
TL;DR: A new method for automatic indexing and retrieval is described that takes advantage of implicit higher-order structure in the association of terms with documents ("semantic structure") in order to improve the detection of relevant documents on the basis of terms found in queries.
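The core of latent semantic analysis is a truncated SVD of a term-document matrix, which maps documents into a low-dimensional latent space where shared vocabulary induces similarity. The tiny term-document counts below are made-up illustrative data, not from the paper:

```python
import numpy as np

# Toy term-document count matrix (rows: terms, columns: documents).
# Documents 0 and 2 share vocabulary; documents 1 and 3 share different terms.
td = np.array([
    [2, 0, 1, 0],   # "tensor"
    [1, 0, 2, 0],   # "decomposition"
    [0, 3, 0, 1],   # "graph"
    [0, 1, 0, 2],   # "network"
], dtype=float)

k = 2  # number of latent dimensions retained
U, s, Vt = np.linalg.svd(td, full_matrices=False)
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # documents in the latent space

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Documents with overlapping vocabulary land close together in latent space.
sim_related = cos(doc_vecs[0], doc_vecs[2])
sim_unrelated = cos(doc_vecs[0], doc_vecs[1])
```

In a real LSA pipeline the counts would typically be reweighted (e.g. tf-idf) before the SVD, and queries would be folded into the same latent space for retrieval.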
Journal ArticleDOI
Learning the parts of objects by non-negative matrix factorization
TL;DR: An algorithm for non-negative matrix factorization is demonstrated that is able to learn parts of faces and semantic features of text and is in contrast to other methods that learn holistic, not parts-based, representations.
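The NMF algorithm the TL;DR refers to can be sketched with the well-known multiplicative update rules for the Frobenius objective. This is a minimal toy illustration on synthetic random data, not the paper's experimental setup:

```python
import numpy as np

# Factor a nonnegative matrix V (m x n) as V ~= W @ H with W, H >= 0,
# using multiplicative updates, which preserve nonnegativity.
rng = np.random.default_rng(1)
m, n, r = 20, 30, 5
V = rng.random((m, r)) @ rng.random((r, n))  # synthetic data of exact rank r

W = rng.random((m, r))
H = rng.random((r, n))
eps = 1e-9  # guard against division by zero

for _ in range(300):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because the factors stay elementwise nonnegative, the columns of W tend to act as additive "parts" of the data, which is the parts-based behavior contrasted with holistic methods like PCA.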
Related Papers (5)
Analysis of individual differences in multidimensional scaling via an N-way generalization of "Eckart-Young" decomposition
J. Douglas Carroll, Jih-Jie Chang