scispace - formally typeset
Topic

Metric (mathematics)

About: Metric (mathematics) is a research topic. Over the lifetime, 42,617 publications have been published within this topic, receiving 836,571 citations. The topic is also known as: distance function & metric.
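The defining axioms of a distance function can be made concrete with a short check on a finite sample of points. This is a generic sketch (the `is_metric` helper and the sample points are illustrative, not drawn from any paper below):

```python
import math

def is_metric(d, points, tol=1e-12):
    """Check the metric axioms for a distance function d on a finite sample.

    A metric must satisfy, for all x, y, z:
      1. d(x, y) >= 0, and d(x, y) == 0 iff x == y  (identity of indiscernibles)
      2. d(x, y) == d(y, x)                          (symmetry)
      3. d(x, z) <= d(x, y) + d(y, z)                (triangle inequality)
    """
    for x in points:
        for y in points:
            if d(x, y) < -tol or abs(d(x, y) - d(y, x)) > tol:
                return False
            if (d(x, y) <= tol) != (x == y):
                return False
            for z in points:
                if d(x, z) > d(x, y) + d(y, z) + tol:
                    return False
    return True

euclidean = lambda x, y: math.dist(x, y)
pts = [(0.0, 0.0), (3.0, 4.0), (1.0, 1.0)]
print(is_metric(euclidean, pts))  # True
```

Squared Euclidean distance fails the same check, because it violates the triangle inequality; this is why it is a "divergence" rather than a metric.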


Papers
Journal ArticleDOI
TL;DR: A new approach is proposed that combines canonical space transformation (CST), based on canonical analysis (CA), with eigenspace transformation (EST) for feature extraction; it simultaneously reduces data dimensionality and optimises the separability of different gait classes.

167 citations
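The canonical-analysis idea behind CST can be sketched as a Fisher-style discriminant projection: find directions that maximize between-class scatter relative to within-class scatter. This is a generic illustration of that principle under stated assumptions, not the paper's exact pipeline (the name `canonical_transform` and the toy data are hypothetical):

```python
import numpy as np

def canonical_transform(X, y, n_components=1):
    """Fisher-style canonical analysis: project features onto directions
    maximizing between-class scatter relative to within-class scatter."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all)[:, None]
        Sb += len(Xc) * diff @ diff.T
    # Solve Sw^{-1} Sb v = lambda v; regularize Sw for numerical stability.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(-evals.real)
    return evecs.real[:, order[:n_components]]

# Two well-separated classes in 2-D collapse to one discriminative axis.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(3, 0.1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
W = canonical_transform(X, y)
print(W.shape)  # (2, 1)
```

Projecting the data through `W` keeps the classes well separated along a single axis, which is the sense in which dimensionality reduction and class separability are optimised together.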

Journal ArticleDOI
TL;DR: A new distance metric learning algorithm, namely weakly-supervised deep metric learning (WDML), under the deep learning framework is proposed, which utilizes a progressive learning manner to discover knowledge by jointly exploiting the heterogeneous data structures from visual contents and user-provided tags of social images.
Abstract: Recent years have witnessed the explosive growth of community-contributed images with rich context information, which is beneficial to the task of image retrieval. It can help us to learn a suitable metric to alleviate the semantic gap. In this paper, we propose a new distance metric learning algorithm, namely weakly-supervised deep metric learning (WDML), under the deep learning framework. It utilizes a progressive learning manner to discover knowledge by jointly exploiting the heterogeneous data structures from visual contents and user-provided tags of social images. The semantic structure in the textual space is expected to be well preserved, while the problem of noisy, incomplete, or subjective tags is addressed by leveraging the visual structure in the original visual space. In addition, a sparse model with the $\ell_{2,1}$ mixed norm is imposed on the transformation matrix of the first layer in the deep architecture to compress the noisy or redundant visual features. The proposed problem is formulated as an optimization problem with a well-defined objective function, and a simple yet efficient iterative algorithm is proposed to solve it. Extensive experiments on real-world social image datasets are conducted to verify the effectiveness of the proposed method for image retrieval. Encouraging experimental results are achieved compared with several representative metric learning methods.

167 citations
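The $\ell_{2,1}$ mixed norm used above sums the Euclidean norms of a matrix's rows; penalizing it drives whole rows to zero, which is what compresses noisy or redundant features. A minimal sketch of the norm itself (the matrix `W` is illustrative):

```python
import numpy as np

def l21_norm(W):
    """Mixed l_{2,1} norm: sum over rows of the row-wise l2 norms.
    Penalizing it zeroes out entire rows, so the corresponding input
    features are discarded altogether."""
    return np.sum(np.sqrt(np.sum(W ** 2, axis=1)))

W = np.array([[3.0, 4.0],   # row norm 5
              [0.0, 0.0],   # a pruned (zeroed-out) feature
              [1.0, 0.0]])  # row norm 1
print(l21_norm(W))  # 6.0
```

Contrast with the entrywise $\ell_1$ norm, which encourages scattered zeros rather than structured, row-level sparsity.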

Proceedings ArticleDOI
17 Jul 2006
TL;DR: This work proposes training log-linear combinations of models for dependency parsing and for machine translation, and describes techniques for optimizing nonlinear functions such as precision or the BLEU metric.
Abstract: When training the parameters for a natural language system, one would prefer to minimize 1-best loss (error) on an evaluation set. Since the error surface for many natural language problems is piecewise constant and riddled with local minima, many systems instead optimize log-likelihood, which is conveniently differentiable and convex. We propose training instead to minimize the expected loss, or risk. We define this expectation using a probability distribution over hypotheses that we gradually sharpen (anneal) to focus on the 1-best hypothesis. Besides the linear loss functions used in previous work, we also describe techniques for optimizing nonlinear functions such as precision or the BLEU metric. We present experiments training log-linear combinations of models for dependency parsing and for machine translation. In machine translation, annealed minimum risk training achieves significant improvements in BLEU over standard minimum error training. We also show improvements in labeled dependency parsing.

167 citations
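The annealing idea above can be sketched in a few lines: define a Gibbs distribution over hypotheses from the model scores, compute the expected task loss under it, and lower the temperature so the distribution sharpens onto the 1-best hypothesis. This is a toy illustration of the expected-risk objective, not the paper's training code (the scores and losses are made up):

```python
import numpy as np

def expected_risk(scores, losses, temperature):
    """Expected loss under p(h) proportional to exp(score(h) / T).
    As T -> 0 the distribution anneals onto the 1-best hypothesis,
    so the expected risk approaches the 1-best loss."""
    z = scores / temperature
    z -= z.max()              # numerical stability before exponentiating
    p = np.exp(z)
    p /= p.sum()
    return float(p @ losses)

scores = np.array([2.0, 1.0, 0.5])  # model scores for three hypotheses
losses = np.array([0.1, 0.4, 0.9])  # task loss per hypothesis, e.g. 1 - BLEU
for T in (10.0, 1.0, 0.01):
    print(T, expected_risk(scores, losses, T))
```

Unlike 1-best error, this expectation is differentiable in the scores at any fixed temperature, which is what makes gradient-based training possible on piecewise-constant losses such as BLEU.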

Journal ArticleDOI
TL;DR: In this paper, a new and direct construction of metrics with negative Ricci curvature is developed, in contrast to earlier three-dimensional results that were obtained by smoothing singular hyperbolic metrics.
Abstract: One of the most natural and important topics in Riemannian geometry is the relation between curvature and the global structure of the underlying manifold. For instance, complete manifolds of negative sectional curvature are always aspherical, and in the compact case their fundamental group can only contain abelian subgroups which are infinite cyclic. Furthermore, it seemed to be a natural principle that a (closed) manifold cannot carry two metrics of differently signed curvature, as it is a basic fact that this is true for sectional curvature. But it turned out to be wrong (much later, and by a strongly analytic argument) for the scalar curvature S, since each manifold M^n, n ≥ 3, admits a complete metric with S ≤ -1 (cf. Aubin [A] and Bland, Kalka [BlK]). Hence the situation for the Ricci curvature Ric, lying between sectional and scalar curvature, seemed to be quite delicate. Up to now, the most general results concerning Ric < 0 were proved by Gao, Yau [GY] and Brooks [Br] using Thurston's theory of hyperbolic three-manifolds, viz.: each closed three-manifold admits a metric with Ric < 0. This is obtained from the fact that these manifolds carry hyperbolic metrics with certain singularities; Gao and Yau (resp. Brooks) smoothed these singularities to get a regular metric with Ric < 0. These methods extend to three-manifolds of finite type and to certain hyperbolic orbifolds. In any case, the arguments rely on exploiting some extraordinary metric structures whose existence is neither obvious nor conceptually related to the Ricci curvature problem. Indeed, the existence depends on the assumption that the manifold is three-dimensional and compact. Moreover, this approach does not provide insight into the typical behaviour of metrics with Ric < 0, since one is led to very special metrics. In this article we approach negative Ricci curvature using a completely different and new concept (which will become even more significant in [L2]), as we deliberately produce Ric < 0.
Actually we will prove the following results; in these notes Ric(g), resp. r(g), denotes the Ricci tensor, resp. the curvature, of a smooth metric g:

167 citations
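For reference, the relation between the curvatures named in the abstract can be written in standard textbook notation (this is the conventional definition, not notation taken from the paper itself): the Ricci tensor is a trace of the Riemann curvature tensor R, and the scalar curvature S is in turn the metric trace of Ric, taken over an orthonormal frame $\{e_i\}$:

```latex
\operatorname{Ric}(X, Y) \;=\; \operatorname{tr}\bigl(Z \mapsto R(Z, X)Y\bigr),
\qquad
S \;=\; \operatorname{tr}_g \operatorname{Ric} \;=\; \sum_i \operatorname{Ric}(e_i, e_i).
```

This double contraction is why Ricci curvature sits "between" sectional curvature (no contraction) and scalar curvature (full contraction), as the abstract emphasizes.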

Journal ArticleDOI
TL;DR: This work presents an efficient method to conformally parameterize 3D mesh data sets to the plane by concentrating all the 3D curvature at a small number of select mesh vertices, called cone singularities, and cutting the mesh through those singular vertices to obtain disk topology.
Abstract: We present an efficient method to conformally parameterize 3D mesh data sets to the plane. The idea behind our method is to concentrate all the 3D curvature at a small number of select mesh vertices, called cone singularities, and then cut the mesh through those singular vertices to obtain disk topology. The singular vertices are chosen automatically. As opposed to most previous methods, our flattening process involves only the solution of linear Poisson systems, and is thus very efficient. Our method is shown to be faster than existing methods, yet generates parameterizations having comparable quasi-conformal distortion.

166 citations
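The efficiency claim rests on the flattening step reducing to linear Poisson solves, i.e. systems of the form L u = b for a fixed Laplacian matrix L. Below is a deliberately tiny stand-in on a 1-D path graph (the real method assembles the cotangent Laplacian of the mesh; `solve_poisson_chain` and its input are purely illustrative):

```python
import numpy as np

def solve_poisson_chain(rhs):
    """Solve a discrete Poisson equation L u = b on a path graph,
    a toy analogue of the per-mesh Poisson solves in conformal flattening."""
    n = len(rhs)
    L = np.zeros((n, n))
    for i in range(n):  # standard 1-D graph Laplacian stencil [-1, 2, -1]
        L[i, i] = 2.0
        if i > 0:
            L[i, i - 1] = -1.0
        if i + 1 < n:
            L[i, i + 1] = -1.0
    # Pin the first vertex to remove the constant null direction,
    # analogous to fixing one point of the parameterization.
    L[0, :] = 0.0
    L[0, 0] = 1.0
    b = np.asarray(rhs, dtype=float)
    b[0] = 0.0
    return np.linalg.solve(L, b)

u = solve_poisson_chain([0.0, 1.0, 0.0, 0.0])
print(u.round(3))
```

Because L depends only on the mesh connectivity and geometry, it can be factored once and reused for every right-hand side, which is the usual reason Poisson-based flattening is fast in practice.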


Network Information
Related Topics (5)

Cluster analysis: 146.5K papers, 2.9M citations (83% related)
Optimization problem: 96.4K papers, 2.1M citations (83% related)
Fuzzy logic: 151.2K papers, 2.3M citations (83% related)
Robustness (computer science): 94.7K papers, 1.6M citations (83% related)
Support vector machine: 73.6K papers, 1.7M citations (82% related)
Performance
Metrics
No. of papers in the topic in previous years:

Year    Papers
2022    53
2021    3,191
2020    3,141
2019    2,843
2018    2,731
2017    2,341