Open Access · Posted Content

Riemannian Gaussian Distributions on the Space of Symmetric Positive Definite Matrices

TLDR
The paper derives and implements an expectation-maximisation algorithm for estimating mixtures of Riemannian Gaussian distributions, and applies it to the problem of texture classification in computer vision, where it yields significantly better performance than recent approaches.
Abstract
Data which lie in the space $\mathcal{P}_m$ of $m \times m$ symmetric positive definite matrices (sometimes called tensor data) play a fundamental role in applications including medical imaging, computer vision, and radar signal processing. An open challenge for these applications is to find a class of probability distributions able to capture the statistical properties of data in $\mathcal{P}_m$ as they arise in real-world situations. The present paper meets this challenge by introducing Riemannian Gaussian distributions on $\mathcal{P}_m$. Distributions of this kind were first considered by Pennec in 2006. However, the present paper gives an exact expression of their probability density function for the first time in the literature. This leads to two original contributions. First, a detailed study of statistical inference for Riemannian Gaussian distributions, uncovering the connection between maximum likelihood estimation and the concept of Riemannian centre of mass, widely used in applications. Second, the derivation and implementation of an expectation-maximisation algorithm for the estimation of mixtures of Riemannian Gaussian distributions. The paper applies this new algorithm to the classification of data in $\mathcal{P}_m$ (concretely, to the problem of texture classification in computer vision), showing that it yields significantly better performance in comparison to recent approaches.
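For concreteness, the Riemannian Gaussian density on $\mathcal{P}_m$ studied in the paper takes the form $p(Y \mid \bar{Y}, \sigma) = \frac{1}{\zeta(\sigma)} \exp\!\left(-\frac{d^2(Y, \bar{Y})}{2\sigma^2}\right)$, where $d$ is the affine-invariant Riemannian distance on $\mathcal{P}_m$ and the normalising factor $\zeta(\sigma)$ depends only on $\sigma$. The following is a minimal sketch of the distance and the unnormalised density, assuming the affine-invariant metric; function names are illustrative, not from the paper's code.

```python
import numpy as np
from scipy.linalg import eigh

def spd_distance(Y, Z):
    """Affine-invariant Riemannian distance on P_m:
    d(Y, Z) = ||log(Y^{-1/2} Z Y^{-1/2})||_F, computed via the
    generalized eigenvalues of the pencil (Z, Y)."""
    lam = eigh(Z, Y, eigvals_only=True)
    return np.sqrt(np.sum(np.log(lam) ** 2))

def unnormalised_gaussian_density(Y, Y_bar, sigma):
    """exp(-d^2(Y, Y_bar) / (2 sigma^2)); the normalising factor
    zeta(sigma), which depends only on sigma, is omitted here."""
    d = spd_distance(Y, Y_bar)
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))
```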


Citations
Journal ArticleDOI

Riemannian Procrustes Analysis: Transfer Learning for Brain–Computer Interfaces

TL;DR: A simple yet powerful method for matching the statistical distributions of two datasets, paving the way for BCI systems capable of reusing data from previous sessions and avoiding the need for a calibration procedure.
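As a rough illustration of the distribution-matching idea, Riemannian Procrustes Analysis includes a re-centering step that maps the reference mean of each dataset's covariance matrices to the identity; the full method also rescales dispersions and applies a rotation. The sketch below shows only that single step and assumes the reference mean has already been estimated (e.g. as a Riemannian mean); it is not the complete method.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

def recenter(covs, mean):
    """Map a set of SPD covariance matrices so that the given reference
    mean is sent to the identity: C -> M^{-1/2} C M^{-1/2}."""
    m_inv_sqrt = fractional_matrix_power(mean, -0.5)
    return np.array([m_inv_sqrt @ C @ m_inv_sqrt for C in covs])
```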
Journal ArticleDOI

Riemannian Gaussian Distributions on the Space of Symmetric Positive Definite Matrices

TL;DR: Riemannian Gaussian distributions are proposed for the classification of data in the space of symmetric positive definite matrices; earlier constructions of this kind did not come with an exact expression of the probability density function, which this work supplies.
Journal ArticleDOI

Gaussian Distributions on Riemannian Symmetric Spaces: Statistical Learning With Structured Covariance Matrices

TL;DR: In this paper, a new class of probability distributions, Gaussian distributions on Riemannian symmetric spaces, is introduced; these are Riemannian analogues of the Gaussian distribution and are applied to statistical learning with structured covariance matrices.
Posted Content

Poincaré Wasserstein Autoencoder

TL;DR: This work presents a reformulation of the recently proposed Wasserstein autoencoder framework on a non-Euclidean manifold, the Poincaré ball model of hyperbolic space, whose intrinsic hierarchical structure can be used to impose structure on the learned latent space representations.
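For reference, the latent geometry mentioned here is the Poincaré ball, whose geodesic distance has a simple closed form. The snippet below is a generic sketch of that distance, not of the autoencoder itself.

```python
import numpy as np

def poincare_distance(x, y, eps=1e-9):
    """Geodesic distance in the Poincare ball model of hyperbolic space:
    d(x, y) = arcosh(1 + 2||x - y||^2 / ((1 - ||x||^2)(1 - ||y||^2)))."""
    num = 2.0 * np.sum((x - y) ** 2)
    den = (1.0 - np.sum(x ** 2)) * (1.0 - np.sum(y ** 2))
    return np.arccosh(1.0 + num / max(den, eps))
```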
Proceedings ArticleDOI

A fixed-point algorithm for estimating power means of positive definite matrices

TL;DR: A general fixed-point algorithm (MPM) is provided, and it is shown that its convergence rate for p = ±0.5 deteriorates very little with the number and dimension of the input points, in contrast to the gradient descent algorithm usually employed to estimate the geometric mean.
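For context, the power mean of order $p \in (0, 1]$ of SPD matrices $C_1, \dots, C_K$ with weights $w_k$ can be characterised as the unique fixed point of $X = \sum_k w_k \, (X \,\#_p\, C_k)$, where $\#_p$ is the weighted geometric mean (Lim and Pálfia); negative orders are handled by inverting the inputs and the result. The sketch below uses plain Picard iteration on this fixed-point equation, which converges much more slowly than the MPM algorithm of the paper but shows the object being computed; it is an illustration, not the paper's algorithm.

```python
import numpy as np
from scipy.linalg import fractional_matrix_power as powm

def geodesic(A, B, t):
    """Weighted geometric mean A #_t B = A^{1/2} (A^{-1/2} B A^{-1/2})^t A^{1/2}."""
    A_half, A_inv_half = powm(A, 0.5), powm(A, -0.5)
    return A_half @ powm(A_inv_half @ B @ A_inv_half, t) @ A_half

def power_mean(covs, p, weights=None, n_iter=50):
    """Power mean of SPD matrices for p in (0, 1], i.e. the fixed point
    of X = sum_k w_k (X #_p C_k), found by plain Picard iteration."""
    w = np.full(len(covs), 1.0 / len(covs)) if weights is None else np.asarray(weights)
    X = sum(wk * C for wk, C in zip(w, covs))  # arithmetic mean as starting point
    for _ in range(n_iter):
        X = sum(wk * geodesic(X, C, p) for wk, C in zip(w, covs))
    return X
```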
References

Estimating the dimension of a model

TL;DR: In this paper, the problem of selecting one of a number of models of different dimensions is treated by finding its Bayes solution, and evaluating the leading terms of its asymptotic expansion.
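The leading terms of that asymptotic expansion give the familiar Bayesian information criterion, $\mathrm{BIC} = k \ln n - 2 \ln \hat{L}$, which is the standard tool for choosing, for example, the number of components in a mixture model; a minimal helper (names illustrative):

```python
import numpy as np

def bic(log_likelihood, n_params, n_samples):
    """Schwarz's Bayesian information criterion: k * ln(n) - 2 * ln(L_hat).
    The candidate model with the smallest BIC is preferred."""
    return n_params * np.log(n_samples) - 2.0 * log_likelihood
```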
Journal ArticleDOI

Ten Lectures on Wavelets

TL;DR: The regularity of compactly supported wavelets and the symmetry of wavelet bases are discussed, with the focus on orthonormal wavelet bases rather than the continuous wavelet transform.
Journal ArticleDOI

The Elements of Statistical Learning: Data Mining, Inference, and Prediction

TL;DR: The Elements of Statistical Learning is a widely used reference for machine learning, covering data mining, inference, and prediction.
Book

Differential Geometry, Lie Groups, and Symmetric Spaces

TL;DR: In this book, the structure of semisimple Lie groups and Lie algebras is studied, together with the classification of simple Lie algebras and of symmetric spaces.