
Metric (mathematics)

About: Metric (mathematics) is a research topic. Over its lifetime, 42,617 publications have been published within this topic, receiving 836,571 citations. The topic is also known as: distance function & metric.


Papers
Proceedings ArticleDOI
14 Oct 1996
TL;DR: It is proved that any metric space can be probabilistically approximated by hierarchically well-separated trees (HST) with a polylogarithmic distortion.
Abstract: This paper provides a novel technique for the analysis of randomized algorithms for optimization problems on metric spaces, by relating the randomized performance ratio for any metric space to the randomized performance ratio for a set of "simple" metric spaces. We define a notion of a set of metric spaces that probabilistically approximates another metric space. We prove that any metric space can be probabilistically approximated by hierarchically well-separated trees (HST) with a polylogarithmic distortion. These metric spaces are "simple" in that they are (1) tree metrics and (2) natural for applying a divide-and-conquer algorithmic approach. The technique presented is of particular interest in the context of on-line computation. A large number of on-line algorithmic problems, including metrical task systems, server problems, distributed paging, and dynamic storage rearrangement, are defined in terms of some metric space. Typically for these problems, there are linear lower bounds on the competitive ratio of deterministic algorithms. Although randomization against an oblivious adversary has the potential to overcome these high ratios, very little progress has been made in the analysis. We demonstrate the use of our technique by obtaining substantially improved results for two different on-line problems.

797 citations
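The central definition in this abstract (a set of "simple" metrics that probabilistically approximates a metric space with some distortion) can be stated as two checkable conditions: every metric in the support must dominate the original distances, and the expected distance over the distribution must exceed the original by at most the distortion factor. The following Python sketch checks those two conditions for a given distribution; the function name and the dictionary-based metric representation are illustrative, not from the paper.

```python
def probabilistically_approximates(d, simple_metrics, alpha):
    """Check whether a distribution over 'simple' metrics probabilistically
    approximates the base metric d with distortion alpha.

    d              : dict mapping a point pair (frozenset) -> base distance
    simple_metrics : list of (probability, metric) pairs, each metric a dict
                     with the same keys as d
    alpha          : claimed distortion (e.g. polylogarithmic in the number
                     of points for HST embeddings)
    """
    pairs = list(d.keys())
    # Condition 1: domination -- no simple metric may shrink any distance.
    for _, t in simple_metrics:
        if any(t[p] < d[p] for p in pairs):
            return False
    # Condition 2: expected distance within a factor alpha of the original.
    for p in pairs:
        expected = sum(prob * t[p] for prob, t in simple_metrics)
        if expected > alpha * d[p]:
            return False
    return True
```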

Book ChapterDOI
31 Aug 2004
TL;DR: A new distance function, ERP, which marries the L1-norm and the edit distance, can support local time shifting, is a metric, and dominates all existing strategies.
Abstract: Existing studies on time series are based on two categories of distance functions. The first category consists of the Lp-norms. They are metric distance functions but cannot support local time shifting. The second category consists of distance functions which are capable of handling local time shifting but are nonmetric. The first contribution of this paper is the proposal of a new distance function, which we call ERP ("Edit distance with Real Penalty"). Representing a marriage of the L1-norm and the edit distance, ERP can support local time shifting, and is a metric. The second contribution of the paper is the development of pruning strategies for large time series databases. Given that ERP is a metric, one way to prune is to apply the triangle inequality. Another way to prune is to develop a lower bound on the ERP distance. We propose such a lower bound, which has the nice computational property that it can be efficiently indexed with a standard B+-tree. Moreover, we show that these two ways of pruning can be used simultaneously for ERP distances. Specifically, the false positives obtained from the B+-tree can be further minimized by applying the triangle inequality. Based on extensive experimentation with existing benchmarks and techniques, we show that this combination delivers superb pruning power and search time performance, and dominates all existing strategies.

790 citations
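As a rough illustration of the distance described above, the sketch below computes an ERP-style cost by dynamic programming, using a constant gap value g as the "real penalty" (g = 0 is the usual choice). The function name and the plain-list representation are my own; consult the paper for the exact definition and the associated lower bound used for indexing.

```python
def erp_distance(r, s, g=0.0):
    """Dynamic-programming sketch of an ERP-style distance between two
    numeric sequences r and s, with gap (penalty) value g."""
    n, m = len(r), len(s)
    # dp[i][j] = cost of aligning the first i points of r with the first j of s
    dp = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):          # align r[:i] against an empty sequence
        dp[i][0] = dp[i - 1][0] + abs(r[i - 1] - g)
    for j in range(1, m + 1):          # align s[:j] against an empty sequence
        dp[0][j] = dp[0][j - 1] + abs(s[j - 1] - g)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            dp[i][j] = min(
                dp[i - 1][j - 1] + abs(r[i - 1] - s[j - 1]),  # match
                dp[i - 1][j] + abs(r[i - 1] - g),             # gap in s
                dp[i][j - 1] + abs(s[j - 1] - g),             # gap in r
            )
    return dp[n][m]

# Example: identical series are at distance 0.
assert erp_distance([1, 2, 3], [1, 2, 3]) == 0.0
```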

Proceedings ArticleDOI
24 Nov 2003
TL;DR: Three variants of a new quality metric for image fusion, based on an image quality index recently introduced by Wang and Bovik, are presented; the metrics are consistent with subjective evaluations and can therefore be used to compare different image fusion methods or to find the best parameters for a given fusion algorithm.
Abstract: We present three variants of a new quality metric for image fusion. The interest of our metrics, which are based on an image quality index recently introduced by Wang and Bovik in [Z. Wang et al., March 2002], lies in the fact that they do not require a ground-truth or reference image. We perform several simulations which show that our metrics are compliant with subjective evaluations and can therefore be used to compare different image fusion methods or to find the best parameters for a given fusion algorithm.

782 citations
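The Wang-Bovik universal image quality index that these fusion metrics build on is well documented: Q = 4·cov(x, y)·mean(x)·mean(y) / ((var(x) + var(y))·(mean(x)² + mean(y)²)). Below is a simplified, whole-image NumPy sketch of Q plus a naive way of turning it into a reference-free fusion score by weighting Q(source_a, fused) and Q(source_b, fused) by the sources' variances. The paper's actual metrics operate over local sliding windows, so this is only an assumption-laden illustration.

```python
import numpy as np

def wang_bovik_q(x, y):
    """Universal image quality index Q of Wang & Bovik for two arrays,
    computed globally here (the published metrics use local windows)."""
    x = np.asarray(x, dtype=float).ravel()
    y = np.asarray(y, dtype=float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return 4 * cov * mx * my / ((vx + vy) * (mx**2 + my**2))

def fusion_quality(a, b, fused):
    """Naive reference-free fusion score: weight Q(a, fused) and Q(b, fused)
    by the relative saliency (variance) of each source image.
    Illustrative simplification, not the paper's exact metric."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    lam = a.var() / (a.var() + b.var())
    return lam * wang_bovik_q(a, fused) + (1 - lam) * wang_bovik_q(b, fused)
```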

Journal Article
TL;DR: The Manopt toolbox, as discussed by the authors, is a user-friendly, documented piece of software dedicated to simplifying experimentation with state-of-the-art Riemannian optimization algorithms.
Abstract: Optimization on manifolds is a rapidly developing branch of nonlinear optimization. Its focus is on problems where the smooth geometry of the search space can be leveraged to design efficient numerical algorithms. In particular, optimization on manifolds is well-suited to deal with rank and orthogonality constraints. Such structured constraints appear pervasively in machine learning applications, including low-rank matrix completion, sensor network localization, camera network registration, independent component analysis, metric learning, dimensionality reduction and so on. The Manopt toolbox, available at www.manopt.org, is a user-friendly, documented piece of software dedicated to simplifying experimentation with state-of-the-art Riemannian optimization algorithms. By dealing internally with most of the differential geometry, the package aims particularly at lowering the entrance barrier.

775 citations
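Manopt itself is a MATLAB toolbox, but the core pattern of Riemannian optimization (take a gradient step in the tangent space of the constraint manifold, then retract back onto it) can be sketched in a few lines of NumPy. The example below minimizes x^T A x over the unit sphere, whose minimizer is the eigenvector of the smallest eigenvalue; all names are illustrative and unrelated to Manopt's API.

```python
import numpy as np

def sphere_gradient_descent(A, steps=500, lr=0.1, seed=0):
    """Riemannian gradient descent for min_x x^T A x subject to ||x|| = 1.
    Illustrates the manifold-optimization pattern: project the Euclidean
    gradient onto the tangent space, step, then retract onto the sphere."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    x /= np.linalg.norm(x)                  # start on the manifold
    for _ in range(steps):
        egrad = 2 * A @ x                   # Euclidean gradient of x^T A x
        rgrad = egrad - (egrad @ x) * x     # project onto the tangent space at x
        x = x - lr * rgrad                  # gradient step in the tangent space
        x /= np.linalg.norm(x)              # retraction: renormalize onto the sphere
    return x

# Example: the minimizer is the eigenvector for A's smallest eigenvalue.
A = np.diag([3.0, 2.0, 0.5])
x = sphere_gradient_descent(A)
assert abs(abs(x[2]) - 1.0) < 1e-3
```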

Journal ArticleDOI
TL;DR: In this paper, a new class of empty space metrics is obtained, one member of this class being a natural generalization of the Schwarzschild metric, which contains one arbitrary parameter in addition to the mass.
Abstract: A new class of empty‐space metrics is obtained, one member of this class being a natural generalization of the Schwarzschild metric. This latter metric contains one arbitrary parameter in addition to the mass. The entire class is the set of metrics which are algebraically specialized (contain multiple principal null vectors) such that the propagation vector is not proportional to a gradient. These metrics belong to the degenerate Petrov type I class.

773 citations
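For context, the Schwarzschild metric that this class generalizes can be written (in the (-,+,+,+) signature, with G = c = 1) as below; the paper's one-parameter generalization beyond the mass m is not reproduced here.

```latex
ds^2 = -\left(1 - \frac{2m}{r}\right) dt^2
       + \left(1 - \frac{2m}{r}\right)^{-1} dr^2
       + r^2 \left( d\theta^2 + \sin^2\theta \, d\varphi^2 \right)
```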


Network Information
Related Topics (5)
Cluster analysis: 146.5K papers, 2.9M citations, 83% related
Optimization problem: 96.4K papers, 2.1M citations, 83% related
Fuzzy logic: 151.2K papers, 2.3M citations, 83% related
Robustness (computer science): 94.7K papers, 1.6M citations, 83% related
Support vector machine: 73.6K papers, 1.7M citations, 82% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2022    53
2021    3,191
2020    3,141
2019    2,843
2018    2,731
2017    2,341