Journal ArticleDOI
Gromov–Wasserstein Distances and the Metric Approach to Object Matching
TL;DR
This paper discusses modifications of the ideas behind the Gromov–Hausdorff distance aimed at modeling and tackling the practical problems of object matching and comparison, and proves explicit lower bounds for the proposed distance that involve many of the invariants previously reported by researchers.
Abstract
This paper discusses certain modifications of the ideas concerning the Gromov–Hausdorff distance which have the goal of modeling and tackling the practical problems of object matching and comparison. Objects are viewed as metric measure spaces, and, based on ideas from mass transportation, a Gromov–Wasserstein type of distance between objects is defined. This reformulation yields a distance between objects which is more amenable to practical computations but retains all the desirable theoretical underpinnings. The theoretical properties of this new notion of distance are studied, and it is established that it provides a strict metric on the collection of isomorphism classes of metric measure spaces. Furthermore, the topology generated by this metric is studied, and sufficient conditions for the pre-compactness of families of metric measure spaces are identified. A second goal of this paper is to establish, in precise terms, links to several other practical methods proposed in the literature for comparing/matching shapes. This is done by proving explicit lower bounds for the proposed distance that involve many of the invariants previously reported by researchers. These lower bounds can be computed in polynomial time. The numerical implementations of the ideas are discussed and computational examples are presented.
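One of the polynomial-time lower bounds mentioned above compares the distributions of pairwise distances of the two spaces. A minimal sketch of that idea, assuming Euclidean point clouds of equal size with uniform measures (the function name `distance_distribution_lower_bound` is a hypothetical label, not from the paper):

```python
import numpy as np

def pairwise_dists(X):
    """Euclidean distance matrix of a point cloud X with shape (n, d)."""
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def distance_distribution_lower_bound(X, Y):
    """Compare the empirical distributions of pairwise distances of two
    equally sized point clouds via the exact 1D Wasserstein-1 distance
    (quantile matching of sorted samples).  Matching distance
    distributions is a necessary condition for isomorphism of the
    underlying metric measure spaces, so a discrepancy here witnesses a
    positive Gromov-Wasserstein-type distance."""
    dX = np.sort(pairwise_dists(X).ravel())
    dY = np.sort(pairwise_dists(Y).ravel())
    return np.abs(dX - dY).mean()
```

Since rigid motions preserve all pairwise distances, the bound vanishes for a rotated or translated copy of the same cloud, while rescaling the cloud makes it strictly positive.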
Citations
Posted Content
Generalized Spectral Clustering via Gromov-Wasserstein Learning
Samir Chowdhury, Tom Needham, et al.
TL;DR: This work establishes a bridge between spectral clustering and Gromov-Wasserstein Learning (GWL), a recent optimal transport-based approach to graph partitioning, and shows that when comparing against a two-node template graph using the heat kernel at the infinite time limit, the resulting partition agrees with the partition produced by the Fiedler vector.
Journal ArticleDOI
Modelling Convex Shape Priors and Matching Based on the Gromov-Wasserstein Distance
TL;DR: A convex shape prior functional is presented which takes the form of a modified transport problem and inherits the ability to incorporate vast classes of geometric invariances beyond rigid isometries and can be minimized by standard linear programming methods.
Posted Content
Minibatch optimal transport distances; analysis and applications
TL;DR: It is argued that the minibatch strategy comes with appealing properties such as unbiased estimators, gradients, and a concentration bound around the expectation, but also with limits: the minibatch OT is not a distance.
Proceedings Article
Learning Autoencoders with Relational Regularization
TL;DR: The relational regularized autoencoder (RAE) outperforms existing methods and helps co-training of multiple autoencoders even if they have heterogeneous architectures and incomparable latent spaces.
Proceedings Article
Variance-Minimizing Transport Plans for Inter-surface Mapping
TL;DR: In this article, an efficient computational method for generating dense and low-distortion maps between two arbitrary surfaces of the same genus is introduced. Rather than relying on semantic correspondences or surface parameterization, the method directly optimizes a variance-minimizing transport plan between the two input surfaces that defines an as-conformal-as-possible inter-surface map satisfying a user-prescribed bound on area distortion.
References
Proceedings ArticleDOI
YALMIP : a toolbox for modeling and optimization in MATLAB
TL;DR: The free MATLAB toolbox YALMIP is introduced. Initially developed to model SDPs and solve them by interfacing external solvers, it makes the development of optimization problems in general, and control-oriented SDP problems in particular, extremely simple.
Journal ArticleDOI
Combinatorial optimization: algorithms and complexity
TL;DR: This clearly written, mathematically rigorous text includes a novel algorithmic exposition of the simplex method and also discusses the Soviet ellipsoid algorithm for linear programming; efficient algorithms for network flow, matching, spanning trees, and matroids; the theory of NP-complete problems; approximation algorithms; local search heuristics for NP-complete problems; and more.
Journal ArticleDOI
Shape matching and object recognition using shape contexts
TL;DR: This paper presents work on computing shape models that are computationally fast and invariant to basic transformations such as translation, scaling, and rotation, and proposes shape detection using a feature called the shape context, which is descriptive of the shape of the object.
Book
Probability and Measure
TL;DR: This book treats measure-theoretic probability, covering random variables and expected values, conditional probability, and the convergence of distributions.
Book
Linear and nonlinear programming
David G. Luenberger, Yinyu Ye, et al.
TL;DR: This text covers linear and nonlinear programming, including unconstrained optimization, and establishes convergence properties for both standard and accelerated steepest descent methods.