scispace - formally typeset
Topic

Metric (mathematics)

About: Metric (mathematics) is a research topic. Over the lifetime, 42,617 publications have been published within this topic, receiving 836,571 citations. The topic is also known as: distance function & metric.


Papers
Journal ArticleDOI
TL;DR: In this article, the authors give a physical explanation of the Kontsevich-Soibelman wall-crossing formula for the BPS spectrum in Seiberg-Witten theories.
Abstract: We give a physical explanation of the Kontsevich-Soibelman wall-crossing formula for the BPS spectrum in Seiberg-Witten theories. In the process we give an exact description of the BPS instanton corrections to the hyperkähler metric of the moduli space of the theory on R^3 x S^1. The wall-crossing formula reduces to the statement that this metric is continuous. Our construction of the metric uses a four-dimensional analogue of the two-dimensional tt* equations.

483 citations

Journal ArticleDOI
TL;DR: A new metric on linear, time-invariant systems is defined that is no greater than the gap metric, and is in fact the smallest metric for which a certain robust stabilization result holds.
Abstract: A new metric on linear, time-invariant systems is defined. This metric is no greater than the gap metric, and is in fact the smallest metric for which a certain robust stabilization result holds. Unlike other known metrics which induce the graph topology, it has a clear frequency response interpretation. This allows questions regarding robustness in the face of parametric uncertainty to be considered in terms of this metric.
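For intuition, the SISO case of such a frequency-response metric can be sketched numerically: when a winding-number condition holds (ignored in this sketch), the Vinnicombe nu-gap between two scalar plants reduces to the worst-case chordal distance between their frequency responses on the Riemann sphere. The two plants below are hypothetical examples, not taken from the paper.

```python
import numpy as np

# Chordal distance between two complex frequency-response points:
# the pointwise metric underlying the SISO nu-gap.
def chordal(p1, p2):
    return abs(p1 - p2) / (np.sqrt(1 + abs(p1) ** 2) * np.sqrt(1 + abs(p2) ** 2))

# Worst-case chordal distance over a frequency grid (winding-number
# condition assumed satisfied; this is a sketch, not a full nu-gap solver).
def nu_gap_siso(P1, P2, omegas):
    return max(chordal(P1(1j * w), P2(1j * w)) for w in omegas)

# Two first-order lags with nearby poles (hypothetical example systems):
P1 = lambda s: 1.0 / (s + 1.0)
P2 = lambda s: 1.0 / (s + 1.2)
omegas = np.logspace(-3, 3, 2000)
print(nu_gap_siso(P1, P2, omegas))  # small gap: the plants are robustly similar
```

A small value here is consistent with the abstract's point: plants close in this metric can be stabilized by the same controller with similar robustness margins.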

482 citations

Proceedings Article
21 Aug 2003
TL;DR: It is empirically demonstrated that learning a distance metric using the RCA algorithm significantly improves clustering performance, comparably to the alternative algorithm.
Abstract: We address the problem of learning distance metrics using side-information in the form of groups of "similar" points. We propose to use the RCA algorithm, which is a simple and efficient algorithm for learning a full-rank Mahalanobis metric (Shental et al., 2002). We first show that RCA obtains the solution to an interesting optimization problem, founded on an information theoretic basis. If the Mahalanobis matrix is allowed to be singular, we show that Fisher's linear discriminant followed by RCA is the optimal dimensionality reduction algorithm under the same criterion. We then show how this optimization problem is related to the criterion optimized by another recent algorithm for metric learning (Xing et al., 2002), which uses the same kind of side information. We empirically demonstrate that learning a distance metric using the RCA algorithm significantly improves clustering performance, comparably to the alternative algorithm. Since the RCA algorithm is much more efficient and cost effective than the alternative, as it only uses closed form expressions of the data, it seems like a preferable choice for the learning of full-rank Mahalanobis distances.
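As a rough illustration of the idea (not the authors' code), RCA estimates the within-chunklet covariance from groups of points known to be similar and uses its inverse as a full-rank Mahalanobis matrix. A minimal numpy sketch with made-up chunklets:

```python
import numpy as np

# RCA-style metric learning sketch: center each chunklet of "similar"
# points, pool the centered points, and invert their covariance to get
# a Mahalanobis matrix that shrinks within-chunklet directions.
def rca_metric(chunklets):
    centered = np.vstack([c - c.mean(axis=0) for c in chunklets])
    n = sum(len(c) for c in chunklets)
    cov = centered.T @ centered / n   # within-chunklet covariance
    return np.linalg.inv(cov)         # Mahalanobis matrix A

def mahalanobis(x, y, A):
    d = x - y
    return float(np.sqrt(d @ A @ d))

# Hypothetical side-information: two chunklets of similar points.
chunklets = [np.array([[0.0, 0.0], [0.2, 0.1]]),
             np.array([[5.0, 5.0], [5.1, 4.8]])]
A = rca_metric(chunklets)
print(mahalanobis(np.array([0.0, 0.0]), np.array([1.0, 1.0]), A))
```

Note the closed-form character the abstract emphasizes: the whole "learning" step is one covariance estimate and one matrix inverse, with no iterative optimization.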

481 citations

Journal ArticleDOI
Charles H. Bennett, Peter Gacs, Ming Li, Paul M. B. Vitányi, Wojciech H. Zurek
TL;DR: It is shown that the information distance is a universal cognitive similarity distance and investigated the maximal correlation of the shortest programs involved, the maximal uncorrelation of programs, and the density properties of the discrete metric spaces induced by the information distances.
Abstract: While Kolmogorov (1965) complexity is the accepted absolute measure of information content in an individual finite object, a similarly absolute notion is needed for the information distance between two individual objects, for example, two pictures. We give several natural definitions of a universal information metric, based on length of shortest programs for either ordinary computations or reversible (dissipationless) computations. It turns out that these definitions are equivalent up to an additive logarithmic term. We show that the information distance is a universal cognitive similarity distance. We investigate the maximal correlation of the shortest programs involved, the maximal uncorrelation of programs (a generalization of the Slepian-Wolf theorem of classical information theory), and the density properties of the discrete metric spaces induced by the information distances. A related distance measures the amount of nonreversibility of a computation. Using the physical theory of reversible computation, we give an appropriate (universal, antisymmetric, and transitive) measure of the thermodynamic work required to transform one object into another by the most efficient process. Information distance between individual objects is needed in pattern recognition, where one wants to express effective notions of "pattern similarity" or "cognitive similarity" between individual objects, and in thermodynamics of computation, where one wants to analyze the energy dissipation of a computation from a particular input to a particular output.
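Since Kolmogorov complexity is uncomputable, practical work in this line approximates shortest-program length by compressed length. A minimal sketch of that compression-based proxy (the normalized compression distance, used here as an illustration rather than the paper's exact construction), using Python's zlib:

```python
import zlib

# Approximate K(x) by the length of a zlib-compressed encoding.
def C(data: bytes) -> int:
    return len(zlib.compress(data, 9))

# Normalized compression distance: small when one object's regularities
# help compress the other, approaching 1 for unrelated objects.
def ncd(x: bytes, y: bytes) -> float:
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog" * 20
b = b"the quick brown fox jumps over the lazy cat" * 20
c = bytes(range(256)) * 4
print(ncd(a, b))  # similar strings: small distance
print(ncd(a, c))  # dissimilar strings: larger distance
```

The proxy inherits the metric's spirit: the distance depends on how much of one object must be described from scratch given the other.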

479 citations

Proceedings ArticleDOI
18 Oct 1998
TL;DR: This work presents a natural extension of the original error metric that can account for a wide range of vertex attributes and can rapidly produce high quality approximations of complex polygonal surface models.
Abstract: There are a variety of application areas in which there is a need for simplifying complex polygonal surface models. These models often have material properties such as colors, textures, and surface normals. Our surface simplification algorithm, based on iterative edge contraction and quadric error metrics, can rapidly produce high quality approximations of such models. We present a natural extension of our original error metric that can account for a wide range of vertex attributes.
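The core of the quadric error metric can be sketched in a few lines: each plane p = (a, b, c, d) with unit normal incident to a vertex contributes a 4x4 quadric Q = p p^T, and the accumulated error at v = (x, y, z, 1) is v^T Q v, the sum of squared distances to those planes. A minimal sketch (illustrative, not the paper's implementation):

```python
import numpy as np

# Quadric for the plane ax + by + cz + d = 0 (a^2 + b^2 + c^2 = 1).
def plane_quadric(a, b, c, d):
    p = np.array([a, b, c, d], dtype=float)
    return np.outer(p, p)

# Error of placing a vertex at (x, y, z) under an accumulated quadric Q:
# the sum of squared distances to all planes folded into Q.
def vertex_error(Q, x, y, z):
    v = np.array([x, y, z, 1.0])
    return float(v @ Q @ v)

# A vertex shared by the planes z = 0 and x = 0:
Q = plane_quadric(0, 0, 1, 0) + plane_quadric(1, 0, 0, 0)
print(vertex_error(Q, 0.0, 5.0, 0.0))  # on both planes: error 0.0
print(vertex_error(Q, 3.0, 0.0, 4.0))  # off both planes: 3^2 + 4^2 = 25.0
```

Because quadrics add, an edge contraction simply sums the two endpoint quadrics and minimizes v^T Q v for the merged vertex, which is what makes the method fast; the paper's extension augments v with vertex attributes such as color and texture coordinates.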

478 citations


Network Information
Related Topics (5)
Cluster analysis
146.5K papers, 2.9M citations
83% related
Optimization problem
96.4K papers, 2.1M citations
83% related
Fuzzy logic
151.2K papers, 2.3M citations
83% related
Robustness (computer science)
94.7K papers, 1.6M citations
83% related
Support vector machine
73.6K papers, 1.7M citations
82% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2022    53
2021    3,191
2020    3,141
2019    2,843
2018    2,731
2017    2,341