Topic

Metric (mathematics)

About: Metric (mathematics) is a research topic. Over its lifetime, 42,617 publications have been published within this topic, receiving 836,571 citations. The topic is also known as: distance function & metric.


Papers
Journal Article (DOI)
TL;DR: In this paper, it was shown that in all dimensions D ≥ 4 there exist discrete symmetries that involve inverting a rotation parameter through the AdS radius, implying that Kerr–NUT–AdS metrics with over-rotating parameters are equivalent to under-rotating metrics.
Abstract: The Kerr–AdS metric in dimension D has cohomogeneity [D/2]; the metric components depend on the radial coordinate r and [D/2] latitude variables μi that are subject to the constraint ∑i μi² = 1. We find a coordinate reparametrization in which the μi variables are replaced by [D/2] − 1 unconstrained coordinates yα, and having the remarkable property that the Kerr–AdS metric becomes diagonal in the coordinate differentials dyα. The coordinates r and yα now appear in a very symmetrical way in the metric, leading to an immediate generalization in which we can introduce [D/2] − 1 NUT parameters. We find that (D − 5)/2 are non-trivial in odd dimensions whilst (D − 2)/2 are non-trivial in even dimensions. This gives the most general Kerr–NUT–AdS metric in D dimensions. We find that in all dimensions D ≥ 4, there exist discrete symmetries that involve inverting a rotation parameter through the AdS radius. These symmetries imply that Kerr–NUT–AdS metrics with over-rotating parameters are equivalent to under-rotating metrics. We also consider the BPS limit of the Kerr–NUT–AdS metrics, and thereby obtain, in odd dimensions and after Euclideanization, new families of Einstein–Sasaki metrics.
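A note for orientation: the constraint quoted in this abstract removes one degree of freedom from the n = ⌊D/2⌋ latitude variables, which is why n − 1 unconstrained coordinates suffice. The LaTeX sketch below restates the constraint and shows one obvious, purely illustrative way of solving it for the last variable; the paper's actual coordinates yα are chosen differently, namely so that the metric becomes diagonal in the differentials dyα.

\[
  \sum_{i=1}^{n} \mu_i^{2} = 1,
  \qquad n = \Big\lfloor \tfrac{D}{2} \Big\rfloor,
  \qquad\Longrightarrow\qquad
  \mu_n^{2} = 1 - \sum_{\alpha=1}^{n-1} \mu_\alpha^{2}.
\]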

272 citations

Journal Article (DOI)
TL;DR: The derivation of accurate and efficient numerical schemes to estimate statistical parameters of the space of multivariate normal distributions with zero mean vector is extensively addressed.
Abstract: This paper is dedicated to the statistical analysis of the space of multivariate normal distributions with an application to the processing of Diffusion Tensor Images (DTI). It relies on the differential geometrical properties of the underlying parameters space, endowed with a Riemannian metric, as well as on recent works that led to the generalization of the normal law on Riemannian manifolds. We review the geometrical properties of the space of multivariate normal distributions with zero mean vector and focus on an original characterization of the mean, covariance matrix and generalized normal law on that manifold. We extensively address the derivation of accurate and efficient numerical schemes to estimate these statistical parameters. A major application of the present work is related to the analysis and processing of DTI datasets and we show promising results on synthetic and real examples.
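A note that may help in reading this abstract: zero-mean multivariate normal distributions are parametrized by their covariance matrices, so the underlying parameter space is the cone of symmetric positive-definite (SPD) matrices, and the Fisher information metric on that space agrees, up to a constant factor, with the familiar affine-invariant SPD metric. The sketch below is a minimal NumPy/SciPy illustration of two of the quantities the paper works with: the geodesic distance between two tensors and a Karcher (Fréchet) mean computed by fixed-point iteration. The function names, the unit scale factor, and the iteration count are illustrative choices and are not taken from the paper.

import numpy as np
from scipy.linalg import eigh, logm, expm, sqrtm, inv

def spd_distance(S1, S2):
    """Affine-invariant geodesic distance between SPD matrices:
    d(S1, S2) = ||log(S1^{-1/2} S2 S1^{-1/2})||_F = sqrt(sum_i log(l_i)^2),
    where l_i are the generalized eigenvalues of the pencil (S2, S1)."""
    lam = eigh(S2, S1, eigvals_only=True)   # solves S2 v = lam * S1 v
    return np.sqrt(np.sum(np.log(lam) ** 2))

def spd_karcher_mean(tensors, iters=20):
    """Karcher (Frechet) mean under the affine-invariant metric, computed by
    averaging the log-maps at the current estimate and exp-mapping back."""
    M = np.mean(tensors, axis=0)            # start from the arithmetic mean
    for _ in range(iters):
        R = sqrtm(M)
        Ri = inv(R)
        T = np.mean([logm(Ri @ S @ Ri) for S in tensors], axis=0)
        M = R @ expm(T) @ R
        M = (M + M.T) / 2                   # re-symmetrize against round-off
    return M

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    def random_tensor(d=3):
        A = rng.normal(size=(d, d))
        return A @ A.T + d * np.eye(d)      # well-conditioned SPD matrix
    D1, D2 = random_tensor(), random_tensor()
    print("geodesic distance:", spd_distance(D1, D2))
    print("Karcher mean diagonal:", np.diag(spd_karcher_mean([D1, D2])))

The update M ← M^(1/2) exp(mean_i log(M^(-1/2) S_i M^(-1/2))) M^(1/2) is the standard geodesic-averaging step; the paper's estimators may differ in weighting, step size, and stopping criteria.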

272 citations

Book
26 Jun 2015
TL;DR: The duality between cotangent and tangent spaces and the definition of the Sobolev classes are discussed, alongside differentials and gradients, the Laplacian, and comparison estimates.
Abstract (contents): Introduction; Preliminaries; Differentials and gradients; Laplacian; Comparison estimates; Appendix A. On the duality between cotangent and tangent spaces; Appendix B. Remarks about the definition of the Sobolev classes; References.

271 citations

Posted Content
15 Mar 2020
TL;DR: This paper adopts the Earth Mover's Distance (EMD) as a metric to compute a structural distance between dense image representations and determine image relevance, and designs a cross-reference mechanism that effectively minimizes the impact of cluttered backgrounds and large intra-class appearance variations.
Abstract: In this paper, we address the few-shot classification task from a new perspective of optimal matching between image regions. We adopt the Earth Mover's Distance (EMD) as a metric to compute a structural distance between dense image representations to determine image relevance. The EMD generates the optimal matching flows between structural elements that have the minimum matching cost, which is used to represent the image distance for classification. To generate the important weights of elements in the EMD formulation, we design a cross-reference mechanism, which can effectively minimize the impact caused by the cluttered background and large intra-class appearance variations. To handle k-shot classification, we propose to learn a structured fully connected layer that can directly classify dense image representations with the EMD. Based on the implicit function theorem, the EMD can be inserted as a layer into the network for end-to-end training. We conduct comprehensive experiments to validate our algorithm and we set new state-of-the-art performance on four popular few-shot classification benchmarks, namely miniImageNet, tieredImageNet, Fewshot-CIFAR100 (FC100) and Caltech-UCSD Birds-200-2011 (CUB).
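For readers unfamiliar with the EMD itself, the sketch below shows the generic computation the paper builds on: the distance between two sets of local features is the optimal value of a small transportation linear program. It is solved here with scipy.optimize.linprog over a cosine-based ground cost as a plain, non-differentiable illustration; it does not reproduce the paper's cross-reference weighting, the implicit-function-theorem layer used for end-to-end training, or the structured fully connected layer, and the function and variable names are illustrative.

import numpy as np
from scipy.optimize import linprog

def emd_between_feature_sets(F_a, F_b, w_a=None, w_b=None):
    """Earth Mover's Distance between two sets of local feature vectors.

    F_a: (m, d) array, F_b: (n, d) array of local embeddings.
    w_a, w_b: non-negative weights summing to 1 (uniform if None).
    Solves the transportation linear program with scipy.optimize.linprog."""
    m, n = len(F_a), len(F_b)
    w_a = np.full(m, 1.0 / m) if w_a is None else w_a / w_a.sum()
    w_b = np.full(n, 1.0 / n) if w_b is None else w_b / w_b.sum()

    # ground cost: 1 - cosine similarity between every pair of local features
    A = F_a / np.linalg.norm(F_a, axis=1, keepdims=True)
    B = F_b / np.linalg.norm(F_b, axis=1, keepdims=True)
    cost = 1.0 - A @ B.T                      # shape (m, n)

    # equality constraints: row sums of the flow equal w_a, column sums equal w_b
    A_eq = np.zeros((m + n, m * n))
    for i in range(m):
        A_eq[i, i * n:(i + 1) * n] = 1.0      # total flow out of source i
    for j in range(n):
        A_eq[m + j, j::n] = 1.0               # total flow into sink j
    b_eq = np.concatenate([w_a, w_b])

    res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    return res.fun                            # optimal matching cost = EMD

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    query   = rng.normal(size=(25, 64))       # e.g. a 5x5 grid of 64-d local features
    support = rng.normal(size=(25, 64))
    print("EMD(query, support) =", emd_between_feature_sets(query, support))

With uniform weights and 25 local features per image (e.g. a 5 × 5 feature map), the program has 625 variables and 50 equality constraints, small enough for an off-the-shelf LP solver.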

271 citations


Network Information
Related Topics (5)
Cluster analysis: 146.5K papers, 2.9M citations, 83% related
Optimization problem: 96.4K papers, 2.1M citations, 83% related
Fuzzy logic: 151.2K papers, 2.3M citations, 83% related
Robustness (computer science): 94.7K papers, 1.6M citations, 83% related
Support vector machine: 73.6K papers, 1.7M citations, 82% related
Performance Metrics
No. of papers in the topic in previous years:
Year    Papers
2022    53
2021    3,191
2020    3,141
2019    2,843
2018    2,731
2017    2,341