Journal ArticleDOI
Gromov–Wasserstein Distances and the Metric Approach to Object Matching
TL;DR: This paper modifies ideas concerning the Gromov–Hausdorff distance to model and tackle the practical problems of object matching and comparison, proving explicit lower bounds for the proposed distance that involve many of the invariants previously reported by researchers.

Abstract:
This paper discusses certain modifications of the ideas concerning the Gromov–Hausdorff distance which have the goal of modeling and tackling the practical problems of object matching and comparison. Objects are viewed as metric measure spaces, and based on ideas from mass transportation, a Gromov–Wasserstein type of distance between objects is defined. This reformulation yields a distance between objects which is more amenable to practical computations but retains all the desirable theoretical underpinnings. The theoretical properties of this new notion of distance are studied, and it is established that it provides a strict metric on the collection of isomorphism classes of metric measure spaces. Furthermore, the topology generated by this metric is studied, and sufficient conditions for the pre-compactness of families of metric measure spaces are identified. A second goal of this paper is to establish links to several other practical methods proposed in the literature for comparing/matching shapes in precise terms. This is done by proving explicit lower bounds for the proposed distance that involve many of the invariants previously reported by researchers. These lower bounds can be computed in polynomial time. The numerical implementations of the ideas are discussed and computational examples are presented.
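One family of polynomial-time lower bounds of the kind the abstract mentions compares simple invariants of the two spaces, such as the distribution of eccentricities (average distance from each point to the rest of its space). Below is a minimal pure-Python sketch of this idea, assuming uniform probability measures and equal point counts; the function names, the sorting-based closed form for the one-dimensional Wasserstein-1 distance, and the 1/2 normalization are illustrative choices, not the paper's exact definitions.

```python
def eccentricities(D):
    """Average distance from each point to all points (uniform measure)."""
    n = len(D)
    return [sum(row) / n for row in D]

def eccentricity_lower_bound(DX, DY):
    """Compare the eccentricity distributions of two finite metric spaces
    via the 1-D Wasserstein-1 distance, which for uniform measures and
    equal point counts reduces to sorting both lists and averaging the
    absolute differences. Runs in polynomial time."""
    sX = sorted(eccentricities(DX))
    sY = sorted(eccentricities(DY))
    n = len(sX)
    return 0.5 * sum(abs(a - b) for a, b in zip(sX, sY)) / n

# Toy comparison: three points on a line vs. an equilateral triangle.
DX = [[0.0, 1.0, 2.0],
      [1.0, 0.0, 1.0],
      [2.0, 1.0, 0.0]]
DY = [[0.0, 1.0, 1.0],
      [1.0, 0.0, 1.0],
      [1.0, 1.0, 0.0]]
print(eccentricity_lower_bound(DX, DY))  # positive, so the spaces differ
```

A strictly positive value certifies that the two spaces are not isomorphic as metric measure spaces; a zero value is inconclusive, which is exactly the trade-off of using a cheap lower bound in place of the full distance.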
Citations
Journal ArticleDOI
Formal Conceptual Views in Neural Networks
Johannes Hirth,Tom Hanika +1 more
TL;DR: Two notions of conceptual views of a neural network are introduced, a many-valued and a symbolic view, which provide novel analysis methods that enable a human AI analyst to gain deeper insight into the knowledge captured by the neurons of a network.
Journal ArticleDOI
Dataset Similarity to Assess Semisupervised Learning Under Distribution Mismatch Between the Labeled and Unlabeled Datasets
TL;DR: In this paper, a quantitative unlabeled-dataset selection heuristic based on dataset dissimilarity measures is proposed to evaluate the impact of distribution mismatch between the labeled and unlabeled datasets.
Posted Content
LSMI-Sinkhorn: Semi-supervised Mutual Information Estimation with Optimal Transport
TL;DR: In this article, a semi-supervised squared-loss mutual information (SMI) estimation method is proposed that uses a small number of paired samples together with the available unpaired ones.
Dissertation
Analysis and control of diffusion processes in networks
TL;DR: This work enables rigorous analysis of a network's characteristics as it converges, in a structural sense, to a given metric space, and could open the way to applying control strategies to networks using spatial and macroscopic information about the contact network in a given population.
Journal ArticleDOI
Simplexwise Distance Distributions for finite spaces with metrics and measures
TL;DR: Simplexwise distance distributions (SDDs) are proposed as invariants that distinguish all known pairs of non-equivalent spaces that simpler invariants could not, together with Lipschitz-continuous metrics on SDDs that admit exact computations whose parametrised complexities are polynomial in the number of given points.
References
Proceedings ArticleDOI
YALMIP : a toolbox for modeling and optimization in MATLAB
TL;DR: The free MATLAB toolbox YALMIP is introduced; developed initially to model SDPs and solve them by interfacing external solvers, it makes development of optimization problems in general, and control-oriented SDP problems in particular, extremely simple.
Journal ArticleDOI
Combinatorial optimization: algorithms and complexity
TL;DR: This clearly written, mathematically rigorous text includes a novel algorithmic exposition of the simplex method and also discusses the Soviet ellipsoid algorithm for linear programming; efficient algorithms for network flow, matching, spanning trees, and matroids; the theory of NP-complete problems; approximation algorithms; local search heuristics for NP-complete problems; and more.
Journal ArticleDOI
Shape matching and object recognition using shape contexts
TL;DR: This paper presents work on computing shape models that are computationally fast and invariant to basic transformations such as translation, scaling, and rotation, and proposes shape detection using a feature called the shape context, which is descriptive of the shape of the object.
Book
Probability and Measure
TL;DR: In this paper, the convergence of distributions is treated alongside the foundations of conditional probability, random variables, and expected values, including conditions under which a sequence of distributions converges to a given limit.
Book
Linear and nonlinear programming
David G. Luenberger,Yinyu Ye +1 more
TL;DR: This book develops linear and nonlinear programming, introducing unconstrained optimization as a complement to linear programming, and establishes convergence properties for both standard and accelerated steepest-descent methods.