Journal ArticleDOI

Gromov–Wasserstein Distances and the Metric Approach to Object Matching

Facundo Mémoli
01 Aug 2011, Vol. 11, Iss. 4, pp. 417–487
TLDR
This paper modifies ideas surrounding the Gromov–Hausdorff distance in order to model and tackle the practical problems of object matching and comparison, and it proves explicit lower bounds for the proposed distance that involve many of the invariants previously reported by researchers.
Abstract
This paper discusses certain modifications of the ideas concerning the Gromov–Hausdorff distance which have the goal of modeling and tackling the practical problems of object matching and comparison. Objects are viewed as metric measure spaces, and based on ideas from mass transportation, a Gromov–Wasserstein type of distance between objects is defined. This reformulation yields a distance between objects which is more amenable to practical computations but retains all the desirable theoretical underpinnings. The theoretical properties of this new notion of distance are studied, and it is established that it provides a strict metric on the collection of isomorphism classes of metric measure spaces. Furthermore, the topology generated by this metric is studied, and sufficient conditions for the pre-compactness of families of metric measure spaces are identified. A second goal of this paper is to establish links to several other practical methods proposed in the literature for comparing/matching shapes in precise terms. This is done by proving explicit lower bounds for the proposed distance that involve many of the invariants previously reported by researchers. These lower bounds can be computed in polynomial time. The numerical implementations of the ideas are discussed and computational examples are presented.
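For concreteness, the Gromov–Wasserstein distance of order p between two metric measure spaces (X, d_X, mu_X) and (Y, d_Y, mu_Y) discussed in the abstract can be written as the following mass-transportation problem; the notation here is a plausible rendering and may differ in detail from the paper:

\[
  \mathcal{D}_p(X,Y) \;=\; \tfrac{1}{2}\,\inf_{\mu \in \mathcal{M}(\mu_X,\mu_Y)}
  \left( \int_{X\times Y}\int_{X\times Y} \bigl| d_X(x,x') - d_Y(y,y') \bigr|^p \, d\mu(x,y)\, d\mu(x',y') \right)^{1/p},
\]

where \(\mathcal{M}(\mu_X,\mu_Y)\) denotes the set of couplings, i.e. probability measures on X × Y whose marginals are mu_X and mu_Y. The lower bounds mentioned in the abstract arise by replacing the full coupling problem with simpler transportation problems between invariants of X and Y (for example, their distributions of pairwise distances).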


Citations
Book ChapterDOI

Metric structures on datasets: stability and classification of algorithms

TL;DR: It is described how this formalism leads to an axiomatic description of many clustering algorithms, both flat and hierarchical, as well as to several computational techniques that operate in the context of data/shape matching under invariances.
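As a concrete instance of the hierarchical clustering algorithms that such a metric formalism is used to characterize axiomatically, here is a minimal single-linkage sketch in Python (using scipy; the toy data and the cut-off scale are arbitrary choices, not taken from the chapter):

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

points = np.random.rand(20, 2)                 # toy point cloud in the plane
dists = pdist(points)                          # condensed matrix of pairwise distances
merge_tree = linkage(dists, method='single')   # single-linkage dendrogram
labels = fcluster(merge_tree, t=0.3, criterion='distance')  # flat clustering at scale 0.3
print(labels)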
Proceedings ArticleDOI

Efficient and Robust Shape Correspondence via Sparsity-Enforced Quadratic Assignment

TL;DR: This work introduces a novel local pairwise descriptor and develops a simple, effective iterative method that solves the resulting quadratic assignment through sparsity control, yielding shape correspondences between two approximately isometric surfaces.
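To make the quadratic-assignment formulation concrete, the following numpy sketch evaluates the standard objective that measures how well a coupling P aligns two intra-shape distance matrices; the random matrices stand in for the paper's actual descriptors, which are not reproduced here:

import numpy as np

def qap_objective(P, C1, C2):
    # ||C1 P - P C2||_F^2: small when P maps points of shape 1 to points of shape 2
    # with nearly the same pairwise distances (i.e., an approximate isometry).
    return np.linalg.norm(C1 @ P - P @ C2, ord='fro') ** 2

n = 50
C1 = np.random.rand(n, n); C1 = (C1 + C1.T) / 2   # stand-in intra-shape distances, shape 1
C2 = np.random.rand(n, n); C2 = (C2 + C2.T) / 2   # stand-in intra-shape distances, shape 2
P0 = np.full((n, n), 1.0 / n)                     # uninformative initial coupling
print(qap_objective(P0, C1, C2))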
Proceedings ArticleDOI

Geometry-aware domain adaptation for unsupervised alignment of word embeddings.

TL;DR: The authors propose a manifold-based geometric approach for learning unsupervised alignment of word embeddings between a source and a target language, formulating the alignment learning problem as a domain adaptation problem over the manifold of doubly stochastic matrices.
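The central object in this formulation is a doubly stochastic matrix (all rows and columns sum to 1). A minimal sketch of producing one by Sinkhorn-style normalization is shown below; the similarity matrix M is a random stand-in, not the authors' actual construction:

import numpy as np

def sinkhorn_normalize(M, n_iters=100):
    K = np.exp(M)                              # entrywise positive matrix
    for _ in range(n_iters):
        K = K / K.sum(axis=1, keepdims=True)   # normalize rows
        K = K / K.sum(axis=0, keepdims=True)   # normalize columns
    return K

M = np.random.randn(5, 5)                      # stand-in similarity matrix
P = sinkhorn_normalize(M)
print(P.sum(axis=0), P.sum(axis=1))            # both vectors are close to all ones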
Posted Content

Partial Gromov-Wasserstein with Applications on Positive-Unlabeled Learning.

TL;DR: This paper addresses the partial Gromov-Wasserstein problem and proposes an algorithm to solve it. It highlights that partial Wasserstein-based metrics prove effective in usual PU learning settings and demonstrates that partial Gromov-Wasserstein metrics are efficient in scenarios where point clouds come from different domains or have different features.
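A minimal sketch of the partial matching idea, assuming the Python Optimal Transport (POT) package and its ot.partial.partial_gromov_wasserstein solver (the exact API may differ across versions, and this is not the authors' own code): only a fraction m of the mass is transported between two point clouds living in different feature spaces.

import numpy as np
import ot

xs = np.random.rand(30, 2)            # source point cloud (2 features)
xt = np.random.rand(40, 3)            # target point cloud (3 features, different domain)
C1 = ot.dist(xs, xs)                  # intra-domain distance matrices
C2 = ot.dist(xt, xt)
p, q = ot.unif(30), ot.unif(40)       # uniform weights on both clouds
T = ot.partial.partial_gromov_wasserstein(C1, C2, p, q, m=0.7)  # transport 70% of the mass
print(T.sum())                        # total transported mass, close to 0.7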
Posted Content

Gromov-Wasserstein Factorization Models for Graph Clustering

TL;DR: In this paper, a nonlinear factorization model based on the Gromov-Wasserstein (GW) discrepancy is proposed, which estimates observed graphs as GW barycenters constructed from a set of atoms with different weights.
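The building block of such factorization models is the GW barycenter. Here is a minimal sketch, assuming POT's ot.gromov.gromov_barycenters (the signature may vary by version; the toy graphs below are random stand-ins, not the paper's learned atoms):

import numpy as np
import ot

C1 = ot.dist(np.random.rand(10, 2), metric='euclidean')   # distance matrix of toy graph 1
C2 = ot.dist(np.random.rand(15, 2), metric='euclidean')   # distance matrix of toy graph 2
ps = [ot.unif(10), ot.unif(15)]                            # node weights of the two graphs
N = 8                                                      # number of nodes in the barycenter
Cb = ot.gromov.gromov_barycenters(N, [C1, C2], ps, ot.unif(N),
                                  [0.5, 0.5], 'square_loss', max_iter=100)
print(Cb.shape)                                            # (8, 8) barycenter structure matrix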
References
Proceedings ArticleDOI

YALMIP : a toolbox for modeling and optimization in MATLAB

TL;DR: The free MATLAB toolbox YALMIP is introduced; developed initially to model SDPs and solve them by interfacing external solvers, it makes development of optimization problems in general, and control-oriented SDP problems in particular, extremely simple.
Journal ArticleDOI

Combinatorial optimization: algorithms and complexity

TL;DR: This clearly written, mathematically rigorous text includes a novel algorithmic exposition of the simplex method and also discusses the Soviet ellipsoid algorithm for linear programming; efficient algorithms for network flow, matching, spanning trees, and matroids; the theory of NP-complete problems; approximation algorithms; local search heuristics for NP-complete problems; and more.
Journal ArticleDOI

Shape matching and object recognition using shape contexts

TL;DR: This paper presents work on computing shape models that are computationally fast and invariant to basic transformations like translation, scaling, and rotation, and proposes shape detection using a feature called the shape context, which is descriptive of the shape of the object.
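Here is a minimal sketch of a shape-context style descriptor: a log-polar histogram of where the remaining contour points fall relative to a reference point. This is a simplified illustration (no rotation normalization or matching step), not the paper's exact implementation:

import numpy as np

def shape_context(points, ref_idx, n_r=5, n_theta=12):
    rel = np.delete(points, ref_idx, axis=0) - points[ref_idx]   # offsets to the other points
    r = np.log1p(np.hypot(rel[:, 0], rel[:, 1]))                 # log-radial coordinate
    theta = np.arctan2(rel[:, 1], rel[:, 0])                     # angular coordinate
    hist, _, _ = np.histogram2d(r, theta, bins=[n_r, n_theta],
                                range=[[0.0, r.max() + 1e-9], [-np.pi, np.pi]])
    return hist / hist.sum()                                     # normalized log-polar histogram

pts = np.random.rand(100, 2)                   # stand-in for points sampled on a contour
print(shape_context(pts, ref_idx=0).shape)     # (5, 12) descriptor for the first point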
Book

Probability and Measure

TL;DR: This book develops measure-theoretic probability, covering random variables and expected values, conditional probability, and the convergence of distributions.
Book

Linear and nonlinear programming

TL;DR: This book covers linear programming and unconstrained nonlinear optimization, including convergence analysis for both standard and accelerated steepest descent methods.
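As an illustration of the steepest-descent iteration whose convergence such texts analyze, here is a minimal sketch on a quadratic objective (the matrix, step size, and iteration count are arbitrary choices for the example):

import numpy as np

A = np.array([[3.0, 0.5], [0.5, 1.0]])         # positive-definite quadratic form
b = np.array([1.0, -2.0])
grad = lambda x: A @ x - b                      # gradient of f(x) = 0.5 x^T A x - b^T x

x = np.zeros(2)
for _ in range(200):
    x = x - 0.1 * grad(x)                       # fixed step size; a line search converges faster
print(x, np.linalg.solve(A, b))                 # iterate vs. exact minimizer A^{-1} b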