Open Access
Posted Content

Metrics for Probabilistic Geometries

TL;DR
The geometric structure of probabilistic generative dimensionality reduction models is investigated using the tools of Riemannian geometry, and it is shown that distances respecting the expected metric lead to more appropriate generation of new data.
Abstract
We investigate the geometrical structure of probabilistic generative dimensionality reduction models using the tools of Riemannian geometry. We explicitly define a distribution over the natural metric given by the models. We provide the necessary algorithms to compute expected metric tensors where the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold, and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and to measure distances between latent points. We show how distances that respect the expected metric lead to more appropriate generation of new data.
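The expected metric has a convenient closed form in this setting: when the rows of the Jacobian J of the stochastic mapping are i.i.d. Gaussian across the D output dimensions, as under the Gaussian process model, E[J^T J] = E[J]^T E[J] + D Σ_J. The sketch below evaluates this expectation and uses it to approximate curve length in latent space; the function names and the NumPy implementation are illustrative assumptions (the mean Jacobian and its covariance would come from the GP's predictive derivatives), not the authors' code.

```python
import numpy as np

def expected_metric(mu_J, Sigma_J):
    """Expected Riemannian metric E[G] = E[J]^T E[J] + D * Sigma_J.

    mu_J    : (D, q) mean Jacobian of the stochastic mapping at a latent point.
    Sigma_J : (q, q) covariance of each (i.i.d.) Jacobian row, shared
              across the D output dimensions.
    """
    D = mu_J.shape[0]
    return mu_J.T @ mu_J + D * Sigma_J

def curve_length(latents, metric_fn):
    """Length of a discretized latent curve under a position-dependent
    metric: sum_t sqrt(dz_t^T G dz_t), with G evaluated at segment midpoints."""
    length = 0.0
    for t in range(len(latents) - 1):
        dz = latents[t + 1] - latents[t]
        G = metric_fn(0.5 * (latents[t] + latents[t + 1]))
        length += float(np.sqrt(dz @ G @ dz))
    return length
```

Minimizing this length over discretized curves is one way to obtain the interpolating paths the abstract refers to.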

Citations
Proceedings Article

Latent space oddity: On the curvature of deep generative models

TL;DR: This work shows that the nonlinearity of the generator implies that the latent space gives a distorted view of the input space, characterizes this distortion with a stochastic Riemannian metric, and demonstrates that distances and interpolants are significantly improved under this metric.
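As a concrete, simplified illustration of that distortion: for a deterministic generator g: R^q -> R^D, the latent space inherits the pull-back metric G(z) = J(z)^T J(z). The finite-difference sketch below is an assumption for illustration (the cited paper works with a stochastic metric for generators with variance networks), not its implementation.

```python
import numpy as np

def pullback_metric(generator, z, eps=1e-4):
    """Pull-back metric G(z) = J(z)^T J(z) of a generator g: R^q -> R^D,
    with the Jacobian J estimated by central finite differences."""
    q = z.shape[0]
    J = np.stack(
        [(generator(z + eps * e) - generator(z - eps * e)) / (2 * eps)
         for e in np.eye(q)],
        axis=1,
    )  # columns are partial derivatives, so J has shape (D, q)
    return J.T @ J
```

Measuring curve lengths under G(z) rather than under the Euclidean metric of the latent space yields the improved distances and interpolants the TL;DR mentions.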
Posted Content

Geodesic Exponential Kernels: When Curvature and Linearity Conflict

TL;DR: In this paper, it was shown that the common Gaussian kernel can be generalized to a positive definite kernel on a geodesic metric space only if the space is flat; in the Riemannian case, only if the manifold is Euclidean.
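A small self-contained check of this conflict, assuming great-circle distance on the unit sphere as the curved geodesic metric space: the geodesic Gaussian Gram matrix is generally not positive semi-definite there, so its smallest eigenvalue can dip below zero. The bandwidth and sample size are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random points on the unit sphere S^2, a curved geodesic metric space.
X = rng.normal(size=(40, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)

# Great-circle (geodesic) distances and the geodesic Gaussian kernel.
D = np.arccos(np.clip(X @ X.T, -1.0, 1.0))
K = np.exp(-D**2 / (2 * 0.5**2))

# On a flat space this Gram matrix would be positive semi-definite;
# on the sphere the smallest eigenvalue is typically negative.
print(np.linalg.eigvalsh(K).min())
```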
Posted Content

Metrics for Deep Generative Models

TL;DR: The method yields a principled distance measure, a tool for visual inspection of deep generative models, and an alternative to linear interpolation in latent space, and can be applied to robot movement generalization using previously learned skills.
Posted Content

Fisher GAN

Youssef Mroueh, +1 more
26 May 2017
TL;DR: Fisher GAN, as discussed by the authors, defines a critic with a data-dependent constraint on its second-order moments, which allows stable and time-efficient training that does not compromise the capacity of the critic and does not need data-independent constraints such as weight clipping.
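A hedged PyTorch sketch of the constraint described above: the critic maximizes the mean discrepancy E_P[f] - E_Q[f] subject to the data-dependent second-moment constraint 0.5*(E_P[f^2] + E_Q[f^2]) = 1, enforced with an augmented Lagrangian. The function name, penalty weight, and multiplier update are illustrative choices, not the authors' released code.

```python
import torch

def fisher_critic_loss(critic, real, fake, lam, rho=1e-6):
    """Fisher IPM critic objective (returned as a loss to minimize):
    maximize E_P[f] - E_Q[f] subject to 0.5*(E_P[f^2] + E_Q[f^2]) = 1,
    with Lagrange multiplier `lam` and quadratic penalty weight `rho`."""
    f_real, f_fake = critic(real), critic(fake)
    ipm = f_real.mean() - f_fake.mean()
    constraint = 1.0 - 0.5 * (f_real.pow(2).mean() + f_fake.pow(2).mean())
    # After each critic step the multiplier is updated by a gradient
    # step on the Lagrangian: lam <- lam - rho * constraint.
    loss = -(ipm + lam * constraint - 0.5 * rho * constraint.pow(2))
    return loss, constraint
```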
Posted Content

What is a meaningful representation of protein sequences?

TL;DR: Two key contexts in which representations naturally arise, transfer learning and interpretable learning, are explored, and it is demonstrated that taking representation geometry into account significantly improves interpretability and lets models reveal biological information that is otherwise obscured.
References
Book

Introduction to Algorithms

TL;DR: The updated edition of the classic Introduction to Algorithms is intended primarily for undergraduate or graduate courses in algorithms or data structures; it presents a rich variety of algorithms in considerable depth while keeping their design and analysis accessible to readers at all levels.
Journal Article

A global geometric framework for nonlinear dimensionality reduction.

TL;DR: An approach to dimensionality reduction that uses easily measured local metric information to learn the underlying global geometry of a data set; it efficiently computes a globally optimal solution and is guaranteed to converge asymptotically to the true structure.
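For concreteness, a minimal sketch of the recipe this TL;DR summarizes (the Isomap algorithm): geodesic distances from shortest paths on a k-nearest-neighbor graph, followed by classical multidimensional scaling. It assumes the neighborhood graph is connected and uses standard SciPy/scikit-learn calls; it is an illustration, not the authors' implementation.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from sklearn.neighbors import kneighbors_graph

def isomap(X, n_neighbors=10, n_components=2):
    """Minimal Isomap: kNN graph -> geodesic distances -> classical MDS.
    Assumes the kNN graph is connected (otherwise distances are infinite)."""
    graph = kneighbors_graph(X, n_neighbors, mode="distance")
    D = shortest_path(graph, directed=False)      # geodesic distance matrix
    n = len(X)
    H = np.eye(n) - np.ones((n, n)) / n           # centering matrix
    B = -0.5 * H @ (D ** 2) @ H                   # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    top = np.argsort(w)[::-1][:n_components]      # largest eigenvalues
    return V[:, top] * np.sqrt(w[top])
```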
Book

Ordinary differential equations

TL;DR: This book covers existence theory for ordinary differential equations, linear differential equations, the Poincaré-Bendixson theory, and the use of implicit function and fixed-point theorems.
Book Chapter

Introduction to Algorithms

Xin-She Yang
TL;DR: This chapter provides an overview of the fundamentals of algorithms and their links to self-organization, exploration, and exploitation.