scispace - formally typeset

Metric (mathematics)

About: Metric (mathematics) is a research topic. Over the lifetime, 42,617 publications have been published within this topic, receiving 836,571 citations. The topic is also known as: distance function & metric.
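For reference, a metric (distance function) d on a set must satisfy non-negativity, identity of indiscernibles, symmetry, and the triangle inequality. A minimal Python sketch checking these axioms on a finite point sample (illustrative only; the function names here are made up for the example):

```python
import math

def euclidean(p, q):
    """Euclidean distance: the canonical example of a metric."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def satisfies_metric_axioms(d, points):
    """Check the four metric axioms on a finite sample of points."""
    for x in points:
        for y in points:
            if d(x, y) < 0:                      # non-negativity
                return False
            if (d(x, y) == 0) != (x == y):       # identity of indiscernibles
                return False
            if d(x, y) != d(y, x):               # symmetry
                return False
            for z in points:                     # triangle inequality
                if d(x, z) > d(x, y) + d(y, z) + 1e-12:
                    return False
    return True

pts = [(0.0, 0.0), (3.0, 4.0), (1.0, 1.0)]
print(satisfies_metric_axioms(euclidean, pts))  # True
```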


Papers
Journal ArticleDOI
TL;DR: A novel multidimensional projection technique based on least-squares approximations that is faster and more accurate than existing high-quality methods, particularly for mapping text sets, the application on which it was most extensively tested.
Abstract: The problem of projecting multidimensional data into lower dimensions has been pursued by many researchers due to its potential application to data analyses of various kinds. This paper presents a novel multidimensional projection technique based on least square approximations. The approximations compute the coordinates of a set of projected points based on the coordinates of a reduced number of control points with defined geometry. We name the technique least square projections (LSP). From an initial projection of the control points, LSP defines the positioning of their neighboring points through a numerical solution that aims at preserving a similarity relationship between the points given by a metric in mD. In order to perform the projection, a small number of distance calculations are necessary, and no repositioning of the points is required to obtain a final solution with satisfactory precision. The results show the capability of the technique to form groups of points by degree of similarity in 2D. We illustrate that capability through its application to mapping collections of textual documents from varied sources, a strategic yet difficult application. LSP is faster and more accurate than other existing high-quality methods, particularly where it was mostly tested, that is, for mapping text sets.
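The linear step at the heart of the abstract above — solving for free points so each sits near the mean of its neighbors while control points stay pinned — can be caricatured in a few lines. This is a toy Laplacian-style sketch with assumed uniform neighbor weights and soft constraints, not the authors' LSP implementation:

```python
import numpy as np

# Toy sketch of an LSP-style solve (assumed uniform neighbor weights;
# not the authors' implementation). Each free point is asked to sit at
# the mean of its neighbors' 2D positions; control points are pinned
# by extra soft-constraint rows.
n = 5
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [3]}
controls = {0: np.array([0.0, 0.0]), 4: np.array([1.0, 1.0])}

rows, rhs = [], []
for i in range(n):
    row = np.zeros(n)
    row[i] = 1.0
    for j in neighbors[i]:
        row[j] = -1.0 / len(neighbors[i])    # neighborhood-mean row
    rows.append(row)
    rhs.append(np.zeros(2))
for i, pos in controls.items():              # pin control points
    row = np.zeros(n)
    row[i] = 1.0
    rows.append(row)
    rhs.append(pos)

A, b = np.array(rows), np.array(rhs)
Y, *_ = np.linalg.lstsq(A, b, rcond=None)    # one least-squares solve
print(Y)  # 2D coordinates; free points interpolate between the controls
```

A single least-squares solve places all points at once, which mirrors the abstract's claim that no iterative repositioning is needed to reach a satisfactory solution.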

285 citations

Journal ArticleDOI
TL;DR: For elliptic K3 surfaces, the Ricci-flat Kähler metric is approximated by gluing a semi-flat metric on the smooth part of the fibration to Ooguri-Vafa metrics near the singular fibres, with error O(e−C/∊) for some constant C > 0; this proves the Gromov-Hausdorff limit conjecture for Calabi-Yau n-folds in the K3 case.
Abstract: Motivated by the picture of mirror symmetry suggested by Strominger, Yau and Zaslow, we make a conjecture concerning the Gromov-Hausdorff limits of Calabi-Yau n-folds (with Ricci-flat Kähler metric) as one approaches a large complex structure limit point in moduli; a similar conjecture was made independently by Kontsevich, Soibelman and Todorov. Roughly stated, the conjecture says that, if the metrics are normalized to have constant diameter, then this limit is the base of the conjectural special Lagrangian torus fibrations associated with the large complex structure limit, namely an n-sphere, and that the metric on this Sn is induced from a standard (singular) Riemannian metric on the base, the singularities of the metric corresponding to the limit discriminant locus of the fibrations. This conjecture is trivially true for elliptic curves; in this paper we prove it in the case of K3 surfaces. Using the standard description of mirror symmetry for K3 surfaces and the hyperkähler rotation trick, we reduce the problem to that of studying Kähler degenerations of elliptic K3 surfaces, with the Kähler class approaching the wall of the Kähler cone corresponding to the fibration and the volume normalized to be one. Here we are able to write down a remarkably accurate approximation to the Ricci-flat metric: if the elliptic fibres are of area ∊ > 0, then the error is O(e−C/∊) for some constant C > 0. This metric is obtained by gluing together a semi-flat metric on the smooth part of the fibration with suitable Ooguri-Vafa metrics near the singular fibres. For small ∊, this is a sufficiently good approximation that the above conjecture is then an easy consequence.

284 citations

Journal ArticleDOI
TL;DR: In this paper, the authors reformulate the Hamiltonian form of bosonic eleven-dimensional supergravity in terms of an object that unifies the three-form and the metric.
Abstract: We reformulate the Hamiltonian form of bosonic eleven-dimensional supergravity in terms of an object that unifies the three-form and the metric. For the case of four spatial dimensions, the duality group is manifest and the metric and C-field are on an equal footing, even though no dimensional reduction is required for our results to hold. One may also describe our results using the generalized geometry that emerges from membrane duality. The relationship between the twisted Courant algebra and the gauge symmetries of eleven-dimensional supergravity is described in detail.

284 citations

Proceedings ArticleDOI
09 Oct 2006
TL;DR: A calibration method for eye-in-hand systems that estimates the hand-eye and robot-world transformations via a parametrized stochastic model, using a novel metric on SE(3) for nonlinear optimization.
Abstract: This paper presents a calibration method for eye-in-hand systems in order to estimate the hand-eye and the robot-world transformations. The estimation takes place in terms of a parametrization of a stochastic model. In order to perform optimally, a metric on the group of the rigid transformations SE(3) and the corresponding error model are proposed for nonlinear optimization. This novel metric works well with both common formulations AX=XB and AX=ZB, and makes use of them in accordance with the nature of the problem. The metric also adapts itself to the system precision characteristics. The method is compared in performance to earlier approaches.
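A common baseline for a metric on the rigid transformations SE(3) — not the paper's adaptive metric, whose weighting tracks the system's precision characteristics — combines the rotation geodesic angle with the translation gap. A hedged sketch, with hand-picked weights:

```python
import numpy as np

def rotation_angle(R):
    """Geodesic angle of a 3x3 rotation matrix, from its trace."""
    c = (np.trace(R) - 1.0) / 2.0
    return np.arccos(np.clip(c, -1.0, 1.0))

def se3_distance(T1, T2, w_rot=1.0, w_trans=1.0):
    """Generic weighted distance between 4x4 homogeneous transforms:
    rotation geodesic plus translation gap. Illustrative only; the
    paper's metric adapts its weights to the system's precision."""
    R = T1[:3, :3].T @ T2[:3, :3]          # relative rotation
    dt = T1[:3, 3] - T2[:3, 3]             # translation difference
    return np.sqrt(w_rot * rotation_angle(R) ** 2 + w_trans * dt @ dt)

# identity vs. a 90-degree rotation about z, shifted by (1, 0, 0)
T_a = np.eye(4)
T_b = np.eye(4)
T_b[:3, :3] = np.array([[0.0, -1.0, 0.0],
                        [1.0,  0.0, 0.0],
                        [0.0,  0.0, 1.0]])
T_b[0, 3] = 1.0
print(se3_distance(T_a, T_b))  # sqrt((pi/2)^2 + 1)
```

Such a metric lets residuals of the AX=XB and AX=ZB formulations be compared on a single scale during nonlinear optimization.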

284 citations

Proceedings Article
03 Jul 2018
TL;DR: In this article, a task-specific learner of an MT-net performs gradient descent with respect to a meta-learned distance metric, which warps the activation space to be more sensitive to task identity.
Abstract: Gradient-based meta-learning methods leverage gradient descent to learn the commonalities among various tasks. While previous such methods have been successful in meta-learning tasks, they resort to simple gradient descent during meta-testing. Our primary contribution is the MT-net, which enables the meta-learner to learn on each layer's activation space a subspace that the task-specific learner performs gradient descent on. Additionally, a task-specific learner of an MT-net performs gradient descent with respect to a meta-learned distance metric, which warps the activation space to be more sensitive to task identity. We demonstrate that the dimension of this learned subspace reflects the complexity of the task-specific learner's adaptation task, and also that our model is less sensitive to the choice of initial learning rates than previous gradient-based meta-learning methods. Our method achieves state-of-the-art or comparable performance on few-shot classification and regression tasks.
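The warped-gradient idea can be illustrated as a preconditioned update θ ← θ − α A ∇L(θ), where in MT-net the warp A would be meta-learned; in this toy sketch A is hand-picked (an assumption for illustration, not the paper's architecture):

```python
import numpy as np

def warped_gd(grad_fn, theta, A, lr=0.1, steps=100):
    """Gradient descent preconditioned by a (here fixed) warp matrix A.
    In MT-net-style methods, A would be meta-learned so that descent is
    taken w.r.t. a learned distance metric on activation space."""
    for _ in range(steps):
        theta = theta - lr * A @ grad_fn(theta)   # metric-warped step
    return theta

loss_grad = lambda th: 2.0 * th                   # gradient of ||theta||^2
A = np.array([[2.0, 0.0],
              [0.0, 0.5]])                        # hand-picked warp (assumption)
theta = warped_gd(loss_grad, np.array([1.0, 1.0]), A)
print(theta)  # both coordinates shrink toward 0, at rates set by A
```

The warp rescales the descent directions: the first coordinate converges much faster than the second, showing how a learned metric can make some directions of parameter space more sensitive than others.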

283 citations


Network Information
Related Topics (5)
- Cluster analysis: 146.5K papers, 2.9M citations (83% related)
- Optimization problem: 96.4K papers, 2.1M citations (83% related)
- Fuzzy logic: 151.2K papers, 2.3M citations (83% related)
- Robustness (computer science): 94.7K papers, 1.6M citations (83% related)
- Support vector machine: 73.6K papers, 1.7M citations (82% related)
Performance Metrics
No. of papers in the topic in previous years:
- 2022: 53
- 2021: 3,191
- 2020: 3,141
- 2019: 2,843
- 2018: 2,731
- 2017: 2,341