Open Access Proceedings Article

Exact computation of a manifold metric, via lipschitz embeddings and shortest paths on a graph

TLDR
This paper gives the first exact algorithm for computing a data-sensitive metric called the nearest neighbor metric, and proves the surprising result that a previously published 3-approximation is an exact algorithm.
Abstract
Data-sensitive metrics adapt distances locally based on the density of data points, with the goal of aligning distances and some notion of similarity. In this paper, we give the first exact algorithm for computing a data-sensitive metric called the nearest neighbor metric. In fact, we prove the surprising result that a previously published 3-approximation is an exact algorithm. The nearest neighbor metric can be viewed as a special case of a density-based distance used in machine learning, or it can be seen as an example of a manifold metric. Previous computational research on such metrics despaired of computing exact distances on account of the apparent difficulty of minimizing over all continuous paths between a pair of points. We leverage the exact computation of the nearest neighbor metric to compute sparse spanners and persistent homology. We also explore the behavior of the metric built from point sets drawn from an underlying distribution and consider the more general case of inputs that are finite collections of path-connected compact sets. The main results connect several classical theories such as the conformal change of Riemannian metrics, the theory of positive definite functions of Schoenberg, and the screw function theory of Schoenberg and von Neumann. We also develop some novel proof techniques based on the combination of screw functions and Lipschitz extensions that may be of independent interest.
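
The exactness theorem makes the metric directly computable with standard shortest-path machinery: build the complete graph on the sample with squared-Euclidean edge weights and run Dijkstra. The sketch below is ours, not the authors' code; the function name is hypothetical, and the 1/4 normalization is our assumption, matching the definition d_N(x, y) = inf_γ ∫ r(γ(t)) |γ'(t)| dt, where r(z) is the distance from z to the nearest sample point (the paper's convention may differ by a constant factor).

```python
# Minimal sketch (not the authors' implementation) of computing the
# nearest neighbor metric between sample points, using the paper's
# result that it equals a graph shortest-path metric with
# squared-distance edge weights. The 1/4 factor is an assumption,
# tied to the normalization of d_N described above.
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import cdist

def nearest_neighbor_metric(P: np.ndarray) -> np.ndarray:
    """All-pairs nearest neighbor metric distances for an (n, d) sample P."""
    # Complete graph on the sample; edge {p, q} weighs ||p - q||^2 / 4.
    weights = cdist(P, P, metric="sqeuclidean") / 4.0
    # Exact per the paper's theorem; previously known only as a
    # 3-approximation. Dijkstra on the dense weight matrix.
    return shortest_path(weights, method="D", directed=False)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    P = rng.standard_normal((200, 2))
    D = nearest_neighbor_metric(P)
    print(D.shape, D[0, 1])
```

Sparse spanners, which the paper leverages for persistent homology, would replace the dense complete graph here with a subgraph that approximately preserves these shortest-path distances.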



Citations
Posted Content

Algorithms and Hardness for Linear Algebra on Geometric Graphs

TL;DR: The study of when efficient spectral graph theory is possible on K graphs is initiated, and it is shown that, assuming SETH, the exponential dependence on the dimension d in the celebrated fast multipole method of Greengard and Rokhlin cannot be improved for a broad class of functions.
Posted Content

Balancing Geometry and Density: Path Distances on High-Dimensional Data.

TL;DR: New geometric and computational analyses of power-weighted shortest-path distances are presented, illuminating how these metrics balance density and geometry in the underlying data, clarifying their key parameters, and discussing how they may be chosen in practice.
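
For reference (our notation, not quoted from that paper): the power-weighted shortest-path distance with exponent p ≥ 1 between samples x and y is typically defined over sequences of sample points x = x_0, …, x_k = y as

\[
d_p(x, y) \;=\; \min_{x = x_0, \dots, x_k = y} \Bigl( \sum_{i=0}^{k-1} \lVert x_{i+1} - x_i \rVert^{p} \Bigr)^{1/p}.
\]

Larger p penalizes long hops through low-density regions more heavily; p = 2 recovers, up to normalization, the edge-squared metric that the headline paper proves equal to the nearest neighbor metric.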
Posted Content

Intrinsic persistent homology via density-based metric learning

TL;DR: It is proved that the metric space defined by the sample, endowed with a computable metric known as the sample Fermat distance, converges a.s. in the Gromov-Hausdorff sense; this result is applied to obtain sample persistence diagrams that converge towards an intrinsic persistence diagram.
Journal Article

Adaptive Metrics for Adaptive Samples

TL;DR: In this paper, the authors generalize the local feature size definition of adaptive sampling used in surface reconstruction to relate it to an alternative metric on Euclidean space, making it simpler both to give adaptive sampling versions of homological inference results and to prove topological guarantees using the critical point theory of distance functions.
Proceedings Article

Algorithms and Hardness for Linear Algebra on Geometric Graphs

TL;DR: In this paper, it is shown that, assuming the Strong Exponential Time Hypothesis (SETH), the exponential dependence on the dimension of the K-graph cannot be improved for a broad class of functions, including the Gaussian kernel, neural tangent kernels, and more.