
Metric (mathematics)

About: Metric (mathematics) is a research topic. Over the lifetime, 42,617 publications have been published within this topic, receiving 836,571 citations. The topic is also known as: distance function & metric.


Papers
Posted Content
TL;DR: This paper describes three natural properties of probability divergences that reflect requirements from machine learning, namely sum invariance, scale sensitivity, and unbiased sample gradients, and proposes the Cramer distance as an alternative to the Wasserstein metric that possesses all three desired properties.
Abstract: The Wasserstein probability metric has received much attention from the machine learning community. Unlike the Kullback-Leibler divergence, which strictly measures change in probability, the Wasserstein metric reflects the underlying geometry between outcomes. The value of being sensitive to this geometry has been demonstrated, among others, in ordinal regression and generative modelling. In this paper we describe three natural properties of probability divergences that reflect requirements from machine learning: sum invariance, scale sensitivity, and unbiased sample gradients. The Wasserstein metric possesses the first two properties but, unlike the Kullback-Leibler divergence, does not possess the third. We provide empirical evidence suggesting that this is a serious issue in practice. Leveraging insights from probabilistic forecasting we propose an alternative to the Wasserstein metric, the Cramer distance. We show that the Cramer distance possesses all three desired properties, combining the best of the Wasserstein and Kullback-Leibler divergences. To illustrate the relevance of the Cramer distance in practice we design a new algorithm, the Cramer Generative Adversarial Network (GAN), and show that it performs significantly better than the related Wasserstein GAN.

289 citations
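To make the comparison concrete, here is a minimal sketch (our own illustration, not code from the paper) of the two divergences for one-dimensional empirical samples: the 1-Wasserstein distance integrates the absolute difference between the empirical CDFs, while the Cramer distance integrates its square and takes a square root. The function names and toy data are assumptions for illustration.

```python
# Sketch: empirical 1-Wasserstein and Cramer distances between 1-D samples,
# both written as integrals of the difference between the empirical CDFs.
import numpy as np

def empirical_cdfs(x, y):
    """Evaluate the empirical CDFs of samples x and y on their pooled support."""
    grid = np.sort(np.concatenate([x, y]))
    fx = np.searchsorted(np.sort(x), grid, side="right") / len(x)
    fy = np.searchsorted(np.sort(y), grid, side="right") / len(y)
    return grid, fx, fy

def wasserstein_1(x, y):
    # W1(P, Q) = integral of |F_P(t) - F_Q(t)| dt
    grid, fx, fy = empirical_cdfs(x, y)
    return np.sum(np.abs(fx[:-1] - fy[:-1]) * np.diff(grid))

def cramer(x, y):
    # Cramer distance l2(P, Q) = sqrt( integral of (F_P(t) - F_Q(t))^2 dt )
    grid, fx, fy = empirical_cdfs(x, y)
    return np.sqrt(np.sum((fx[:-1] - fy[:-1]) ** 2 * np.diff(grid)))

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 1000)   # toy samples
b = rng.normal(0.5, 1.0, 1000)
print(wasserstein_1(a, b), cramer(a, b))
```

The sketch only shows how each metric compares the two CDFs; the sample-gradient behaviour analysed in the paper is a separate property not visible from these point estimates.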

Proceedings ArticleDOI
01 Apr 2001
TL;DR: This paper examines the average page quality over time of pages downloaded during a web crawl of 328 million unique pages and uses the connectivity-based metric PageRank to measure the quality of a page.
Abstract: This paper examines the average page quality over time of pages downloaded during a web crawl of 328 million unique pages. We use the connectivity-based metric PageRank to measure the quality of a page. We show that traversing the web graph in breadth-first search order is a good crawling strategy, as it tends to discover high-quality pages early on in the crawl.

289 citations
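As a reference point for the quality metric used in the crawl study above, here is a minimal power-iteration PageRank sketch. This is a textbook formulation, not the crawler's implementation; the adjacency-list interface, damping factor, and iteration count are our assumptions.

```python
# Sketch: PageRank by power iteration on a small adjacency list.
import numpy as np

def pagerank(links, damping=0.85, iters=100):
    """links: dict mapping each node to its list of outgoing neighbours.
    Assumes every node appears as a key of the dict."""
    nodes = sorted(links)
    idx = {node: i for i, node in enumerate(nodes)}
    n = len(nodes)
    rank = np.full(n, 1.0 / n)
    for _ in range(iters):
        new = np.full(n, (1.0 - damping) / n)   # teleportation term
        for src, outs in links.items():
            if outs:                             # distribute rank along out-links
                share = damping * rank[idx[src]] / len(outs)
                for dst in outs:
                    new[idx[dst]] += share
            else:                                # dangling node: spread uniformly
                new += damping * rank[idx[src]] / n
        rank = new
    return dict(zip(nodes, rank))

print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]}))
```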

Proceedings ArticleDOI
Ravi Kumar, Sergei Vassilvitskii
26 Apr 2010
TL;DR: This work extends Spearman's footrule and Kendall's tau to versions with position and element weights, and shows that a variant of the Diaconis-Graham inequality still holds: the two generalized measures remain within a constant factor of each other for all permutations.
Abstract: Spearman's footrule and Kendall's tau are two well established distances between rankings. They, however, fail to take into account concepts crucial to evaluating a result set in information retrieval: element relevance and positional information. That is, changing the rank of a highly-relevant document should result in a higher penalty than changing the rank of an irrelevant document; a similar logic holds for the top versus the bottom of the result ordering. In this work, we extend both of these metrics to those with position and element weights, and show that a variant of the Diaconis-Graham inequality still holds - the two generalized measures remain within a constant factor of each other for all permutations. We continue by extending the element weights into a distance metric between elements. For example, in search evaluation, swapping the order of two nearly duplicate results should result in little penalty, even if these two are highly relevant and appear at the top of the list. We extend the distance measures to this more general case and show that they remain within a constant factor of each other. We conclude by conducting simple experiments on web search data with the proposed measures. Our experiments show that the weighted generalizations are more robust and consistent with each other than their unweighted counterparts.

288 citations
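For orientation, the sketch below implements the classical, unweighted versions of the two rank distances this paper generalizes; the paper's weighted variants add element and position weights on top of these definitions. Representing a ranking as a sequence of items, most relevant first, is our assumption.

```python
# Sketch: classical Spearman's footrule and Kendall's tau distances.
def spearman_footrule(r1, r2):
    """Sum over items of the absolute difference between their ranks."""
    pos2 = {item: i for i, item in enumerate(r2)}
    return sum(abs(i - pos2[item]) for i, item in enumerate(r1))

def kendall_tau(r1, r2):
    """Number of item pairs that the two rankings order differently."""
    pos2 = {item: i for i, item in enumerate(r2)}
    items = list(r1)
    discordant = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            # r1 puts items[i] before items[j]; discordant if r2 reverses them
            if pos2[items[i]] > pos2[items[j]]:
                discordant += 1
    return discordant

print(spearman_footrule("abcd", "badc"), kendall_tau("abcd", "badc"))
```

For the toy rankings above the footrule is 4 and the Kendall distance is 2, consistent with the Diaconis-Graham inequality K <= F <= 2K that the paper extends to the weighted case.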

Journal ArticleDOI
TL;DR: The parsimony principle is applied to the reconstruction of the evolution of homologous sequences where recombinations or horizontal transfer can occur, and a dynamic programming algorithm is presented that finds the most parsimonious history that fits a given set of sequences.
Abstract: The parsimony principle states that a history of a set of sequences that minimizes the amount of evolution is a good approximation to the real evolutionary history of the sequences. This principle is applied to the reconstruction of the evolution of homologous sequences where recombinations or horizontal transfer can occur. First it is demonstrated that the appropriate structure to represent the evolution of sequences with recombinations is a family of trees each describing the evolution of a segment of the sequence. Two trees for neighboring segments will differ by exactly the transfer of a subtree within the whole tree. This leads to a metric between trees based on the smallest number of such operations needed to convert one tree into the other. An algorithm is presented that calculates this metric. This metric is used to formulate a dynamic programming algorithm that finds the most parsimonious history that fits a given set of sequences. The algorithm is potentially very practical, since many groups of sequences defy analysis by methods that ignore recombinations. These methods give ambiguous or contradictory results because the sequence history cannot be described by one phylogeny, but only a family of phylogenies that each describe the history of a segment of the sequences. The generalization of the algorithm to reconstruct gene conversions and the possibility for heuristic versions of the algorithm for larger data sets are discussed.

288 citations
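The following is a rough sketch of the segment-wise dynamic programme the abstract describes, under strong simplifying assumptions of ours: each segment can be explained by one of a fixed set of candidate trees, seg_cost[s][t] is a precomputed parsimony cost of tree t on segment s, and tree_dist(u, t) abstracts the subtree-transfer distance charged when neighbouring segments switch trees. The real algorithm builds these ingredients from the sequences themselves.

```python
# Sketch: choose one tree per segment so that substitution cost plus
# recombination (tree-change) cost is minimal, via dynamic programming.
def most_parsimonious_history(seg_cost, tree_dist, recomb_penalty=1.0):
    n_seg, n_tree = len(seg_cost), len(seg_cost[0])
    # best[s][t]: minimal cost of segments 0..s with segment s assigned tree t
    best = [list(seg_cost[0])] + [[0.0] * n_tree for _ in range(n_seg - 1)]
    back = [[0] * n_tree for _ in range(n_seg)]
    for s in range(1, n_seg):
        for t in range(n_tree):
            cands = [best[s - 1][u] + recomb_penalty * tree_dist(u, t)
                     for u in range(n_tree)]
            back[s][t] = min(range(n_tree), key=cands.__getitem__)
            best[s][t] = seg_cost[s][t] + cands[back[s][t]]
    # trace back the cheapest assignment of trees to segments
    t = min(range(n_tree), key=best[-1].__getitem__)
    history = [t]
    for s in range(n_seg - 1, 0, -1):
        t = back[s][t]
        history.append(t)
    return list(reversed(history)), min(best[-1])

costs = [[2, 5], [4, 1], [3, 1]]          # toy costs: 3 segments x 2 candidate trees
dist = lambda u, t: 0 if u == t else 1    # toy subtree-transfer distance
print(most_parsimonious_history(costs, dist))
```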

Journal ArticleDOI
TL;DR: In this paper, the authors give a characterization of locally conformally Kähler (l.c.K.) metrics with parallel Lee form on compact complex surfaces and, using the Kodaira classification, classify the compact complex surfaces admitting such structures.
Abstract: We give a characterization of a locally conformally Kähler (l.c.K.) metric with parallel Lee form on a compact complex surface. Using the Kodaira classification of surfaces, we classify the compact complex surfaces admitting such structures. This gives a classification of Sasakian structures on compact three-manifolds. A weak version of the above mentioned characterization leads to an explicit construction of l.c.K. metrics on all Hopf surfaces. We characterize the locally homogeneous l.c.K. metrics on geometric complex surfaces, and we prove that some Inoue surfaces do not admit any l.c.K. metric.

288 citations
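For background, the standard definition underlying the paper above (textbook material, not the paper's own statement) can be written as follows: a Hermitian metric with fundamental 2-form omega and Lee form theta is locally conformally Kähler when it is locally conformal to a Kähler metric, equivalently

```latex
% Standard l.c.K. and parallel-Lee-form conditions (assumed background):
\[
  d\omega = \theta \wedge \omega, \qquad d\theta = 0
  \quad\text{(l.c.K.)}, \qquad
  \nabla\theta = 0 \quad\text{(parallel Lee form)},
\]
% where \nabla is the Levi-Civita connection of the metric.
```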


Network Information
Related Topics (5)
Cluster analysis: 146.5K papers, 2.9M citations, 83% related
Optimization problem: 96.4K papers, 2.1M citations, 83% related
Fuzzy logic: 151.2K papers, 2.3M citations, 83% related
Robustness (computer science): 94.7K papers, 1.6M citations, 83% related
Support vector machine: 73.6K papers, 1.7M citations, 82% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2022    53
2021    3,191
2020    3,141
2019    2,843
2018    2,731
2017    2,341