
Showing papers on "Information geometry published in 2023"


Journal ArticleDOI
TL;DR: In this paper, the fractal dimension emerges as the power exponent of a power law in natural phenomena and represents the complexity of an object, and it is linked to the magnitude of non-metricity (the Amari-Chentsov tensor) on a statistical manifold.


Journal ArticleDOI
01 May 2023-Entropy
TL;DR: In this paper, a Riemannian metric for the one-sided truncated exponential family (oTEF) is proposed, which is based on the asymptotic properties of maximum likelihood estimators.
Abstract: In information geometry, there has been extensive research on the deep connections between differential geometric structures, such as the Fisher metric and the α-connection, and the statistical theory for statistical models satisfying regularity conditions. However, the study of information geometry for non-regular statistical models is insufficient, and a one-sided truncated exponential family (oTEF) is one example of these models. In this paper, based on the asymptotic properties of maximum likelihood estimators, we provide a Riemannian metric for the oTEF. Furthermore, we demonstrate that the oTEF has an α = 1 parallel prior distribution and that the scalar curvature of a certain submodel, including the Pareto family, is a negative constant.
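For concreteness (a standard example consistent with the abstract, not a quotation from the paper): the Pareto family is the canonical oTEF, with density

$$p(x;\theta,\gamma) = \theta\,\gamma^{\theta}\,x^{-(\theta+1)}, \qquad x \ge \gamma,\ \theta > 0,$$

whose support depends on the truncation parameter $$\gamma$$. This is exactly what breaks the usual regularity conditions: the maximum likelihood estimator of $$\gamma$$ is the sample minimum and converges at rate $$n^{-1}$$ rather than $$n^{-1/2}$$, so the classical Fisher metric cannot be used in that direction.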

Journal ArticleDOI
TL;DR: For a complete connected Riemannian manifold M, this article derives inequalities for probability measures on M linking relative entropy, Fisher information, Stein discrepancy, and Wasserstein distance. These strengthen, in particular, the famous log-Sobolev and transportation-cost inequalities and extend the entropy/Stein-discrepancy/information (HSI) inequality established by Ledoux, Nourdin, and Peccati (2015) for the standard Gaussian measure on Euclidean space to the setting of Riemannian manifolds.

Posted ContentDOI
07 Mar 2023
TL;DR: In this paper, the authors discuss how these concepts apply to quantum field theories in the Euclidean domain, which can also be seen as statistical field theories, and propose a new generating functional that is a functional generalization of the Kullback-Leibler divergence.
Abstract: Information geometry provides differential geometric concepts like a Riemannian metric, connections and covariant derivatives on spaces of probability distributions. We discuss here how these concepts apply to quantum field theories in the Euclidean domain which can also be seen as statistical field theories. The geometry has a dual affine structure corresponding to sources and field expectation values seen as coordinates. A key concept is a new generating functional, which is a functional generalization of the Kullback-Leibler divergence. From its functional derivatives one can obtain connected as well as one-particle irreducible correlation functions. It also encodes directly the geometric structure, i.e. the Fisher information metric and the two dual connections, and it determines asymptotic probabilities for field configurations through Sanov's theorem.
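As background (standard Euclidean field theory definitions, stated here for orientation and not taken from the paper): the generating functional of connected correlators is the Schwinger functional $$W[J] = \log Z[J]$$, and the Kullback-Leibler divergence that the new functional generalizes is

$$D_{\mathrm{KL}}(p\,\|\,q) = \int \mathcal{D}\varphi\; p[\varphi]\,\log\frac{p[\varphi]}{q[\varphi]}, \qquad \langle \varphi(x_1)\cdots\varphi(x_n)\rangle_{c} = \frac{\delta^{n} W[J]}{\delta J(x_1)\cdots\delta J(x_n)}\bigg|_{J=0}.$$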


Posted ContentDOI
11 Feb 2023
TL;DR: In this paper, the Bregman-Wasserstein divergence is used to extend the weak Riemannian structure of the Wasserstein space to statistical manifolds, and primal and dual connections on the space of probability measures are defined.
Abstract: Consider the Monge-Kantorovich optimal transport problem where the cost function is given by a Bregman divergence. The associated transport cost, which we call the Bregman-Wasserstein divergence, presents a natural asymmetric extension of the squared $2$-Wasserstein metric and has recently found applications in statistics and machine learning. On the other hand, Bregman divergence is a fundamental object in information geometry and induces a dually flat geometry on the underlying manifold. Using the Bregman-Wasserstein divergence, we lift this dualistic geometry to the space of probability measures, thus extending Otto's weak Riemannian structure of the Wasserstein space to statistical manifolds. We do so by generalizing Lott's formal geometric computations on the Wasserstein space. In particular, we define primal and dual connections on the space of probability measures and show that they are conjugate with respect to Otto's metric. We also define primal and dual displacement interpolations which satisfy the corresponding geodesic equations. As applications, we study displacement convexity and the Bregman-Wasserstein barycenter.
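For reference, the two ingredients combined in this work have textbook definitions (the notation below is generic, not the paper's): the Bregman divergence of a convex potential $$\phi$$ and the optimal transport cost it induces,

$$D_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla\phi(y),\, x - y\rangle, \qquad \mathcal{B}_\phi(\mu,\nu) = \inf_{\pi \in \Pi(\mu,\nu)} \int D_\phi(x,y)\, \mathrm{d}\pi(x,y).$$

Taking $$\phi(x) = \tfrac{1}{2}\lVert x\rVert^2$$ gives $$D_\phi(x,y) = \tfrac{1}{2}\lVert x-y\rVert^2$$, so the transport cost reduces to half the squared 2-Wasserstein distance, which is the sense in which the Bregman-Wasserstein divergence is an asymmetric extension of it.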

Posted ContentDOI
01 Jun 2023
TL;DR: In this article, an unsupervised geometric deep learning framework for representing non-linear dynamical systems based on statistical distributions of local dynamical features is introduced, providing geometry-aware or geometry-agnostic representations for robustly comparing dynamical systems based on sparse measurements.
Abstract: The dynamics of neuron populations during diverse behaviours evolve on low-dimensional manifolds. However, it remains challenging to disentangle the role of manifold geometry and dynamics in encoding task variables. Here, we introduce an unsupervised geometric deep learning framework for representing non-linear dynamical systems based on statistical distributions of local dynamical features. Our method provides geometry-aware or geometry-agnostic representations for robustly comparing dynamical systems based on sparse measurements. Our representations are generalisable to compare computations across systems, interpretable to discover a geometric correspondence between neural dynamics and kinematics in a primate reaching task, and intrinsically encode temporal information to give rise to a decoding algorithm with state-of-the-art accuracy. Our results suggest that using the manifold structure over temporal information is important to develop better decoding algorithms and assimilate data across experiments.

Posted ContentDOI
29 May 2023
TL;DR: In this article, a bi-Connection Theory of Gravity is proposed whose gravitational action consists of a recently defined mutual curvature scalar, and the geometry of the resulting theory is that of a statistical manifold.
Abstract: We formulate a bi-Connection Theory of Gravity whose gravitational action consists of a recently defined mutual curvature scalar. Namely, we build a gravitational theory consisting of one metric and two affine connections, in a Metric-Affine Gravity setup. Consequently, coupling the two connections on an equal footing with matter, we show that the geometry of the resulting theory is, quite intriguingly, that of a statistical manifold. This ultimately indicates a remarkable mathematical correspondence between Gravity and Information Geometry.

Posted ContentDOI
02 Jun 2023
TL;DR: In this article, the most practical aspects of the Fisher geometry for this fundamental distribution family are presented for general statisticians, using an intuitive understanding of the covariance-induced curvature of this manifold to unify the special cases with known closed-form solutions and to review approximate solutions for the general case.
Abstract: Choosing the Fisher information as the metric tensor for a Riemannian manifold provides a powerful yet fundamental way to understand statistical distribution families. Distances along this manifold become a compelling measure of statistical distance, and paths of shorter distance improve sampling techniques that leverage a sequence of distributions in their operation. Unfortunately, even for a distribution as generally tractable as the multivariate normal distribution, this information geometry proves unwieldy enough that closed-form solutions for shortest-distance paths or their lengths remain unavailable outside of limited special cases. In this review we present for general statisticians the most practical aspects of the Fisher geometry for this fundamental distribution family. Rather than a differential geometric treatment, we use an intuitive understanding of the covariance-induced curvature of this manifold to unify the special cases with known closed-form solution and review approximate solutions for the general case. We also use the multivariate normal information geometry to better understand the paths or distances commonly used in statistics (annealing, Wasserstein). Given the unavailability of a general solution, we also discuss the methods used for numerically obtaining geodesics in the space of multivariate normals, identifying remaining challenges and suggesting methodological improvements.
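The metric in question has a well-known closed form (a standard result, stated here for orientation rather than taken from the review): for $$\mathcal{N}(\mu, \Sigma)$$ the Fisher line element is

$$\mathrm{d}s^2 = \mathrm{d}\mu^{\top}\Sigma^{-1}\,\mathrm{d}\mu + \tfrac{1}{2}\,\mathrm{tr}\!\left[(\Sigma^{-1}\,\mathrm{d}\Sigma)^2\right],$$

and one special case with a known closed form is the fixed-mean submanifold, where the geodesic distance is $$d(\Sigma_1,\Sigma_2) = \sqrt{\tfrac{1}{2}\sum_i (\log \lambda_i)^2}$$ with $$\lambda_i$$ the eigenvalues of $$\Sigma_1^{-1}\Sigma_2$$.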

Journal ArticleDOI
TL;DR: In this paper, an overview of information geometry for underwater acoustics models for environmental inversion is presented, focusing on transmission loss (TL) in range-independent normal mode models.
Abstract: Environmental inversions in ocean acoustics require the simultaneous estimation of many unknown parameters. Broad studies of multi-parameter models from diverse fields have shown that such inference problems are universally sloppy. Sloppiness is a phenomenon in which the predictions of a model are insensitive to all but a few key combinations of parameters. Sloppy model analysis is based on information geometry, an application of differential geometry to statistics. In this talk, we give an overview of information geometry for underwater acoustics models for environmental inversion, focusing on transmission loss (TL) in range-independent normal mode models. We demonstrate that these models are sloppy and how the geometry directly relates the information content of acoustic data to the relevance of environmental parameters. In particular, the model manifold quantifies what environmental information is encoded in ocean sounds and how unidentifiable parameters can be removed from the model to give a simplified, identifiable ocean acoustics model of comparable accuracy. We summarize physical insights revealed by this information geometry analysis for ocean inversion. [Work supported by Office of Naval Research]
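For readers outside the sloppy-models literature, the diagnostic object is the Fisher information matrix (standard definition, not specific to the acoustics models discussed here):

$$g_{ij}(\theta) = \mathbb{E}\!\left[\frac{\partial \log p(x\mid\theta)}{\partial \theta_i}\,\frac{\partial \log p(x\mid\theta)}{\partial \theta_j}\right];$$

a model is called sloppy when the eigenvalues of $$g$$ span many orders of magnitude roughly evenly on a log scale, so that data constrain only a few stiff parameter combinations while the remaining directions are practically unidentifiable.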

Posted ContentDOI
28 Mar 2023
TL;DR: In this article, a collection of manifold-metric pairs, probabilistic notions, simple topology, and Einstein-Boltzmann equations is presented, and these combinations result in different flavours of an expanding sub-manifold and metric systems describing simpler dynamics of space around massive objects.
Abstract: Context: In this study, we embrace information by presenting important concepts of abstract information field theory, probabilities, and probabilistic dimensions, in view of functors of action theories and other abstract theories. Methodology: The methods used are the presentation of a collection of manifold-metric pairs, probabilistic notions, simple topology, and Einstein-Boltzmann equations, and the combination of this collection. Results: These combinations result in different flavours of an expanding sub-manifold and metric systems describing simpler dynamics of space around massive objects. Furthermore, we derive the equation of motion of a simplified gravity model in a probabilistic expanding Universe. We further introduce the notion of probabilistic actions and concepts of novel categories of abstract field-particles, such as probablons and informatons. Conclusion: We conclude that the derived equations are first steps towards a concrete description of probabilistic gravity and of a probabilistic expanding Universe.

Posted ContentDOI
01 Jun 2023
TL;DR: Neural FIM, as mentioned in this paper, is a method for computing the Fisher information metric (FIM) from point cloud data, allowing for a continuous manifold model for the data.
Abstract: Although data diffusion embeddings are ubiquitous in unsupervised learning and have proven to be a viable technique for uncovering the underlying intrinsic geometry of data, diffusion embeddings are inherently limited due to their discrete nature. To this end, we propose neural FIM, a method for computing the Fisher information metric (FIM) from point cloud data, allowing for a continuous manifold model for the data. Neural FIM creates an extensible metric space from discrete point cloud data such that information from the metric can inform us of manifold characteristics such as volume and geodesics. We demonstrate neural FIM's utility in selecting parameters for the PHATE visualization method, as well as its ability to obtain information pertaining to local volume, illuminating branching points and cluster centers in embeddings of a toy dataset and two single-cell datasets of iPSC reprogramming and PBMCs (immune cells).
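The paper's neural estimator is not reproduced here, but the quantity it targets can be sketched directly: below is a minimal Monte Carlo estimate of the FIM for a parametric density, assuming access to samples and to the score function. The names fisher_information_mc and score_fn are illustrative, not from the paper.

import numpy as np

def fisher_information_mc(score_fn, samples):
    """Estimate the Fisher information matrix as the expected
    outer product of the score: g = E[s(x) s(x)^T], where
    s(x) = d/dtheta log p(x | theta)."""
    scores = np.stack([score_fn(x) for x in samples])  # shape (n, d)
    return scores.T @ scores / len(samples)

# Sanity check on N(mu, 1): the score in mu is (x - mu),
# so the true Fisher information is exactly 1.
rng = np.random.default_rng(0)
mu = 2.0
xs = rng.normal(mu, 1.0, size=10_000)
print(fisher_information_mc(lambda x: np.array([x - mu]), xs))  # ~[[1.0]]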


Posted ContentDOI
28 Mar 2023
TL;DR: GeoTMI, as discussed by the authors, is a new training framework that employs a denoising process to predict properties accurately using easy-to-obtain geometries (corrupted versions of correct geometries, such as those obtained from low-level calculations).
Abstract: As quantum chemical properties depend on molecular geometries, graph neural networks (GNNs) using 3D geometric information have achieved high prediction accuracy in many tasks. However, they often require 3D geometries obtained from high-level quantum mechanical calculations, which are practically infeasible, limiting their applicability to real-world problems. To tackle this, we propose a new training framework, GeoTMI, that employs a denoising process to predict properties accurately using easy-to-obtain geometries (corrupted versions of correct geometries, such as those obtained from low-level calculations). Our starting point was the idea that the correct geometry is the best description of the target property. Hence, to incorporate information about the correct geometry, GeoTMI aims to maximize the mutual information between three variables: the correct geometry, the corrupted geometry, and the property. GeoTMI also explicitly updates the corrupted input to approach the correct geometry as it passes through the GNN layers, contributing to more effective denoising. We investigated the performance of the proposed method using 3D GNNs for three prediction tasks: molecular properties, a chemical reaction property, and relaxed energy in a heterogeneous catalytic system. Our results showed consistent improvements in accuracy across various tasks, demonstrating the effectiveness and robustness of GeoTMI.
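For context, the textbook two-variable case behind this objective (the three-variable version is the paper's contribution and is not reproduced here) defines mutual information as the Kullback-Leibler divergence between the joint distribution and the product of marginals:

$$I(X;Y) = D_{\mathrm{KL}}\big(p(x,y)\,\|\,p(x)\,p(y)\big),$$

so maximizing it pushes the representation of the corrupted geometry to retain whatever the correct geometry conveys about the target property.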

Journal ArticleDOI
TL;DR: In this article, it is shown that synchronization in the Kuramoto model can be treated in terms of information geometry, and that the Fisher information is sensitive to the synchronization transition; specifically, components of the Fisher metric diverge at the critical point.
Abstract: We discuss how the synchronization in the Kuramoto model can be treated in terms of information geometry. We argue that the Fisher information is sensitive to synchronization transition; specifically, components of the Fisher metric diverge at the critical point. Our approach is based on the recently proposed relation between the Kuramoto model and geodesics in hyperbolic space.
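For reference, the model in question (standard form, not the paper's notation) couples $$N$$ phase oscillators through a mean field:

$$\dot{\theta}_i = \omega_i + \frac{K}{N}\sum_{j=1}^{N}\sin(\theta_j - \theta_i), \qquad r\,e^{i\psi} = \frac{1}{N}\sum_{j=1}^{N} e^{i\theta_j},$$

where the order parameter $$r$$ becomes nonzero as the coupling $$K$$ crosses a critical value $$K_c$$; the divergence of Fisher-metric components at that point is the information-geometric signature of the transition.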


Journal ArticleDOI
TL;DR: In this article, a geodesic distance (GD) is proposed as a measure of Riemannian similarity on a multivariate generalized gamma distribution manifold (MGΓD) for color-textured image retrieval.

Posted ContentDOI
12 May 2023
TL;DR: In this article, hyperbolic embeddings of probabilistic models onto a hierarchy of model manifolds are used to encode how model behaviors change as a function of their parameters, giving a quantitative notion of "distances" between model behaviors.
Abstract: The space of possible behaviors complex biological systems may exhibit is unimaginably vast, and these systems often appear to be stochastic, whether due to variable noisy environmental inputs or intrinsically generated chaos. The brain is a prominent example of a biological system with complex behaviors. The number of possible patterns of spikes emitted by a local brain circuit is combinatorially large, though the brain may not make use of all of them. Understanding which of these possible patterns are actually used by the brain, and how those sets of patterns change as properties of neural circuitry change is a major goal in neuroscience. Recently, tools from information geometry have been used to study embeddings of probabilistic models onto a hierarchy of model manifolds that encode how model behaviors change as a function of their parameters, giving a quantitative notion of "distances" between model behaviors. We apply this method to a network model of excitatory and inhibitory neural populations to understand how the competition between membrane and synaptic response timescales shapes the network's information geometry. The hyperbolic embedding allows us to identify the statistical parameters to which the model behavior is most sensitive, and demonstrate how the ranking of these coordinates changes with the balance of excitation and inhibition in the network.

Journal ArticleDOI
Kamal Donko
TL;DR: In this paper, it is shown that a Sobolev inequality exists between a Riemannian metric and its distance function in the sub-critical case $$p < \frac{m}{2}$$.
Abstract: If one thinks of a Riemannian metric, $$g_1$$ , analogously as the gradient of the corresponding distance function, $$d_1$$ , with respect to a background Riemannian metric, $$g_0$$ , then a natural question arises as to whether a corresponding theory of Sobolev inequalities exists between the Riemannian metric and its distance function. In this paper, we study the sub-critical case $$p < \frac{m}{2}$$ where we show a Sobolev inequality exists between a Riemannian metric and its distance function. In particular, we show that an $$L^{\frac{p}{2}}$$ bound on a Riemannian metric implies an $$L^q$$ bound on its corresponding distance function. We then use this result to state a convergence theorem and show how this theorem can be useful to prove geometric stability results by proving a version of Gromov’s conjecture for tori with almost non-negative scalar curvature in the conformal case. Examples are given to show that the hypotheses of the main theorems are necessary.

Posted ContentDOI
Hassan Alshal
30 Jan 2023
TL;DR: In this paper, an emergent spacetime is constructed by virtue of the geometric language of statistical information manifolds, using the corrected form of the entropy-area law and the von Neumann entropy of quantum matter.
Abstract: Motivated by the corrected form of the entropy-area law, and with the help of the von Neumann entropy of quantum matter, we construct an emergent spacetime by virtue of the geometric language of statistical information manifolds. We discuss the link between the Wald--Jacobson approaches to thermodynamic/gravity correspondence and the Fisher pseudo-Riemannian metric of the information manifold. We derive in detail Einstein's field equations in statistical information geometric form. This results in finding a quantum origin of a positive cosmological constant that is founded on the Fisher metric. This cosmological constant resembles those found in Lovelock's theories in a de Sitter background as a result of using the complex extension of spacetime and the Gaussian exponential families of probability distributions, and we find a time-varying dynamical gravitational constant as a function of the Fisher metric, together with the corresponding Ryu-Takayanagi formula of such a system. Consequently, we obtain a dynamical equation for the entropy in the information manifold using the Liouville-von Neumann equation for the Hamiltonian of the system. This Hamiltonian is suggested to be non-Hermitian, which corroborates the approaches that relate non-unitary conformal field theories to information manifolds. This provides some insights into resolving "the problem of time".

Posted ContentDOI
17 May 2023-bioRxiv
TL;DR: In this paper, a stochastic encoding model of a population of salamander retinal ganglion cells is proposed, based on a three-layer convolutional neural network model.
Abstract: The ability to discriminate visual stimuli is constrained by their retinal representations. Previous studies of visual discriminability were limited to either low-dimensional artificial stimuli or theoretical considerations without a realistic model. Here we propose a novel framework for understanding stimulus discriminability achieved by retinal representations of naturalistic stimuli with the method of information geometry. To model the joint probability distribution of neural responses conditioned on the stimulus, we created a stochastic encoding model of a population of salamander retinal ganglion cells based on a three-layer convolutional neural network model. This model not only accurately captured the mean response to natural scenes but also a variety of second-order statistics. With the model and the proposed theory, we are able to compute the Fisher information metric over stimuli and study the most discriminable stimulus directions. We found that the most discriminable stimulus varied substantially, allowing an examination of the relationship between the most discriminable stimulus and the current stimulus. We found that the most discriminative response mode is often aligned with the most stochastic mode. This finding carries the important implication that under natural scenes noise correlations in the retina are information-limiting rather than aiding in increasing information transmission as has been previously speculated. We observed that sensitivity saturates less in the population than for single cells and also that Fisher information varies less than sensitivity as a function of firing rate. We conclude that under natural scenes, population coding benefits from complementary coding and helps to equalize the information carried by different firing rates, which may facilitate decoding of the stimulus under principles of information maximization.
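The central quantity here is the Fisher information metric over stimulus space (standard definition; the CNN-based response model used to evaluate it is the paper's): for population responses $$r$$ with conditional density $$p(r\mid s)$$,

$$J_{ab}(s) = \mathbb{E}_{r\sim p(r\mid s)}\!\left[\frac{\partial \log p(r\mid s)}{\partial s_a}\,\frac{\partial \log p(r\mid s)}{\partial s_b}\right],$$

whose leading eigenvectors give the most discriminable stimulus directions referred to in the abstract.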

Journal ArticleDOI
TL;DR: In this article, a generic spatiotemporal framework is proposed to analyze manifold-valued measurements, which enables an intrinsic and computationally efficient Riemannian hierarchical model.


Journal ArticleDOI
11 May 2023-Entropy
TL;DR: In this paper, the convergence rate condition for degenerate stochastic differential equations (SDEs) is derived by generalized Gamma calculus, and the generalized Bochner's formula is shown to follow from a generalized second-order calculus of the Kullback-Leibler divergence in density space embedded with a sub-Riemannian-type optimal transport metric.
Abstract: We study the dynamical behaviors of degenerate stochastic differential equations (SDEs). We select an auxiliary Fisher information functional as the Lyapunov functional. Using generalized Fisher information, we conduct the Lyapunov exponential convergence analysis of degenerate SDEs. We derive the convergence rate condition by generalized Gamma calculus. Examples of the generalized Bochner's formula are provided in the Heisenberg group, displacement group, and Martinet sub-Riemannian structure. We show that the generalized Bochner's formula follows from a generalized second-order calculus of the Kullback-Leibler divergence in density space embedded with a sub-Riemannian-type optimal transport metric.
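As orientation (the non-degenerate prototype of this argument, a standard entropy-dissipation identity for overdamped Langevin dynamics; the degenerate, sub-Riemannian generalization is the paper's contribution): with invariant measure $$\pi$$, the relative Fisher information

$$\mathrm{I}(\rho\mid\pi) = \int \left|\nabla \log\frac{\rho}{\pi}\right|^{2}\rho\,\mathrm{d}x, \qquad \frac{\mathrm{d}}{\mathrm{d}t}\,D_{\mathrm{KL}}(\rho_t\,\|\,\pi) = -\,\mathrm{I}(\rho_t\mid\pi),$$

drives the Kullback-Leibler divergence to zero, and exponential convergence follows when a Bakry-Émery-type curvature condition controls the dissipation of $$\mathrm{I}$$ itself.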