
Showing papers by "Richard C. Wilson published in 1997"


Journal ArticleDOI
TL;DR: The main conclusion of the study is that the active process of graph-editing outperforms the alternatives in terms of its ability to effectively control a large population of contaminating clutter.
Abstract: This paper describes a Bayesian framework for performing relational graph matching by discrete relaxation. Our basic aim is to draw on this framework to provide a comparative evaluation of a number of contrasting approaches to relational matching. Broadly speaking there are two main aspects to this study. Firstly we focus on the issue of how relational inexactness may be quantified. We illustrate that several popular relational distance measures can be recovered as specific limiting cases of the Bayesian consistency measure. The second aspect of our comparison concerns the way in which structural inexactness is controlled. We investigate three different realizations of the matching process which draw on contrasting control models. The main conclusion of our study is that the active process of graph-editing outperforms the alternatives in terms of its ability to effectively control a large population of contaminating clutter.
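As a rough illustration of how a Bayesian consistency measure of this kind can be quantified, the sketch below scores a candidate match as an exponential function of its Hamming distance to each entry of a dictionary of consistent labellings. This is a simplified stand-in, not the authors' exact formulation; the function names and the error probability are illustrative.

```python
import math

def consistency(assignment, dictionary_item, p_error):
    """Consistency of one assignment against one dictionary entry,
    exponential in the Hamming distance between them (sketch only)."""
    h = sum(a != d for a, d in zip(assignment, dictionary_item))
    n = len(assignment)
    k = math.log((1 - p_error) / p_error)  # exponential decay constant
    return ((1 - p_error) ** n) * math.exp(-k * h)

def match_probability(assignment, dictionary, p_error=0.1):
    """Average consistency over all feasible dictionary entries."""
    return sum(consistency(assignment, d, p_error)
               for d in dictionary) / len(dictionary)
```

Under this model an exact match to a dictionary entry scores highest, and the score decays geometrically with each label error, which is the behaviour the limiting-case analysis in the abstract turns on.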

337 citations


Journal ArticleDOI
TL;DR: This paper casts the optimisation process into a Bayesian framework by exploiting the recently reported global consistency measure of Wilson and Hancock as a fitness measure, and demonstrates empirically that the method possesses polynomial convergence time and that the convergence rate is more rapid than simulated annealing.
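No abstract is shown for this entry, but the TL;DR points at a population-based search driven by the consistency measure as a fitness function. A generic genetic-search sketch under that reading follows; the population size, mutation rate, and all names here are illustrative, not the reported algorithm.

```python
import random

def genetic_match(candidates, fitness, pop_size=20, gens=40, p_mut=0.1, seed=0):
    """Minimal genetic search over discrete label assignments, with a
    user-supplied consistency-style fitness function (sketch only)."""
    rng = random.Random(seed)
    nodes = list(candidates)

    def random_solution():
        return {n: rng.choice(candidates[n]) for n in nodes}

    pop = [random_solution() for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            # uniform crossover of the two parent assignments
            child = {n: (a if rng.random() < 0.5 else b)[n] for n in nodes}
            if rng.random() < p_mut:            # occasional point mutation
                n = rng.choice(nodes)
                child[n] = rng.choice(candidates[n])
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```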

157 citations


Journal ArticleDOI
TL;DR: The Bayesian framework is used to construct an inference matrix which can be used to gauge the mutual consistency of multiple graph-matches and is realised as an iterative discrete relaxation process which aims to maximise the elements of the inference matrix.

63 citations


Journal ArticleDOI
TL;DR: A Bayesian framework for matching Delaunay triangulations using a model of the compatibility between faces of the data and model graphs and a particularly simple compatibility model that is entirely devoid of free parameters is presented.

46 citations


Proceedings ArticleDOI
17 Jun 1997
TL;DR: A new class of adaptive mesh is described that uses both split and merge operations to adapt itself to the structure of volumetric data-points and is regulated by the curvature of the underlying surface.
Abstract: The main contribution of the paper is to describe a new class of adaptive mesh. The mesh uses both split and merge operations to adapt itself to the structure of volumetric data-points. The adaptive behaviour is controlled by the variance of the data-point positions about maximum-likelihood quadric patches. The authors show that the density of control points on the mesh is regulated by the curvature of the underlying surface. Finally, they illustrate the effectiveness of the method on both real-world and simulated data-sets.
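A minimal sketch of the variance criterion described above, assuming a least-squares quadric fit; the threshold values and function names are illustrative, not taken from the paper.

```python
import numpy as np

def quadric_residual_variance(points):
    """Fit z = ax^2 + bxy + cy^2 + dx + ey + f by least squares and
    return the variance of the residuals about the fitted patch."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    residuals = z - A @ coeffs
    return residuals.var()

def split_or_merge(points, split_thresh=1e-2, merge_thresh=1e-4):
    """Decide the mesh operation from the residual variance (sketch)."""
    v = quadric_residual_variance(points)
    if v > split_thresh:
        return "split"   # patch fits poorly: refine the mesh here
    if v < merge_thresh:
        return "merge"   # patch fits well: control points can be culled
    return "keep"
```

The effect matches the abstract's claim: high-curvature or noisy regions that a single quadric cannot explain trigger splits, while flat, well-explained regions are merged, concentrating control points where the surface varies.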

8 citations


Book ChapterDOI
10 Sep 1997
TL;DR: The Bayesian framework is used to construct an inference matrix which can be used to gauge the mutual consistency of multiple graph-matches and is realised as an iterative discrete relaxation process which aims to maximise the elements of the inference matrix.
Abstract: This paper describes the development of a Bayesian framework for multiple graph matching. The study is motivated by the plethora of multi-sensor fusion problems which can be abstracted as multiple graph matching tasks. The study uses as its starting point the Bayesian consistency measure recently developed by Wilson and Hancock. Hitherto, the consistency measure has been used exclusively in the matching of graph-pairs. In the multiple graph matching study reported in this paper, we use the Bayesian framework to construct an inference matrix which can be used to gauge the mutual consistency of multiple graph-matches. The multiple graph-matching process is realised as an iterative discrete relaxation process which aims to maximise the elements of the inference matrix. We experiment with our multiple graph matching process using an application vehicle furnished by the matching of aerial imagery. Here we are concerned with the simultaneous fusion of optical, infra-red and synthetic aperture radar images in the presence of digital map data.
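One ingredient of mutual consistency between multiple graph-matches can be sketched as a transitivity check on the pairwise correspondences: if node v of graph A matches w in graph B, then the A-to-C and B-to-C matches should agree on where v ends up. This is a simplified stand-in for an inference-matrix entry, not the authors' exact construction.

```python
def transitive_support(match_ab, match_bc, match_ac):
    """Fraction of nodes whose pairwise matches compose consistently,
    i.e. match_ac(v) == match_bc(match_ab(v)) (sketch only)."""
    agree = 0
    total = 0
    for v, w in match_ab.items():
        if w in match_bc and v in match_ac:
            total += 1
            if match_bc[w] == match_ac[v]:
                agree += 1
    return agree / total if total else 0.0
```

An iterative relaxation scheme in this spirit would re-assign individual matches so as to drive such support values upward across all graph pairs.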

8 citations


Book ChapterDOI
21 May 1997
TL;DR: A comparative study of various deterministic discrete search-strategies for graph-matching shows that although more computationally intensive, the global gradient method offers significant performance advantages in terms of accuracy of match.
Abstract: This paper describes a comparative study of various deterministic discrete search-strategies for graph-matching. The framework for our study is provided by the Bayesian consistency measure recently reported by Wilson and Hancock [47–49]. We investigate two classes of update process. The first of these aims to exploit discrete gradient ascent methods. We investigate the effect of searching in the direction of both the local and global gradient maximum. An experimental study demonstrates that although more computationally intensive, the global gradient method offers significant performance advantages in terms of accuracy of match. Our second search strategy is based on tabu search. In order to develop this method we introduce memory into the search procedure by defining context-dependent search paths. We illustrate that although it is more efficient than the global gradient method, tabu search delivers almost comparable performance.

5 citations


Book ChapterDOI
17 Sep 1997
TL;DR: A dictionary of feasible surface-label configurations is developed, together with a statistical model which allows the scheme to be initialised by estimating the probabilities of the different H-K labels from surface normal information.
Abstract: Our main contributions in this paper are twofold. In the first instance, we demonstrate how H-K surface labelling can be realised using dictionary-based probabilistic relaxation. To facilitate this implementation we have developed a dictionary of feasible surface-label configurations. These configurations observe certain constraints on the contiguity of elliptic and hyperbolic regions, and on the continuity and thinness of parabolic lines. The second contribution is to develop a statistical model which allows the scheme to be initialised by estimating the probabilities of the different H-K labels from surface normal information.
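For context, the underlying H-K classification is the standard sign test on mean curvature H and Gaussian curvature K; the paper's contributions are the dictionary and the probabilistic initialisation, not this table. A minimal version of the sign test:

```python
def hk_label(H, K, eps=1e-6):
    """Classify a surface point from mean (H) and Gaussian (K)
    curvature signs: the standard textbook H-K scheme."""
    if K > eps:
        return "elliptic"     # locally dome- or bowl-shaped
    if K < -eps:
        return "hyperbolic"   # saddle-shaped
    if abs(H) > eps:
        return "parabolic"    # K ~ 0 but curved: ridge or valley line
    return "planar"           # both curvatures vanish
```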

3 citations


Book ChapterDOI
21 May 1997
TL;DR: This paper describes how relational graph matching can be effected using the EM algorithm as a two-step iterative process and evaluates the noise sensitivity of the matching method on synthetically generated graphs.
Abstract: This paper describes how relational graph matching can be effected using the EM algorithm. The matching process is realised as a two-step iterative process. Firstly, updated symbolic matches are located so as to minimise a Kullback-Leibler divergence between the model and data graphs. Secondly, with the updated matches to hand, probabilities describing the affinity between nodes in the model and data graphs may be computed. The probability distributions underpinning this study are computed using a simple model of uniform matching errors. As a result the Kullback-Leibler divergence is defined over a family of exponential distributions of Hamming distance. We evaluate the noise sensitivity of our matching method on synthetically generated graphs. Finally, we offer comparison with both mean-field annealing and quadratic assignment.
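A toy sketch of one pass of the two-step process under the uniform-error model: node affinities follow an exponential distribution of Hamming distance, and the assignment step picks the most probable model node for each data node. The real method alternates these steps to convergence; everything named here is illustrative.

```python
import math
from collections import defaultdict

def em_match_step(hamming, k=1.0):
    """One E-step/M-step pass (sketch). `hamming` maps (data_node,
    model_node) pairs to the Hamming distance of their neighbourhoods."""
    # E-step: affinities proportional to exp(-k * Hamming distance),
    # normalised per data node.
    probs = defaultdict(dict)
    for (u, v), h in hamming.items():
        probs[u][v] = math.exp(-k * h)
    for u in probs:
        z = sum(probs[u].values())
        for v in probs[u]:
            probs[u][v] /= z
    # M-step: assign each data node its most probable model node.
    return {u: max(probs[u], key=probs[u].get) for u in probs}
```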

2 citations


Proceedings Article
01 Dec 1997
TL;DR: In this article, a Bayesian framework for matching hierarchical relational models is developed to make discrete label assignments so as to optimise a global cost function that draws information concerning the consistency of match from different levels of the hierarchy.
Abstract: Our aim in this paper is to develop a Bayesian framework for matching hierarchical relational models. The goal is to make discrete label assignments so as to optimise a global cost function that draws information concerning the consistency of match from different levels of the hierarchy. Our Bayesian development naturally distinguishes between intra-level and inter-level constraints. This allows the impact of reassigning a match to be assessed not only at its own (or peer) level of representation, but also upon its parents and children in the hierarchy.
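The blend of intra-level and inter-level constraints can be sketched as a global cost that scores each node's match both against its peers and against its parent's match. The weighting and the two cost functions here are hypothetical placeholders, not the paper's actual terms.

```python
def global_cost(matches, peer_cost, parent_of, parent_cost, alpha=0.5):
    """Hierarchical matching cost (sketch): a weighted sum of a
    peer-level term and a parent-child consistency term per node."""
    total = 0.0
    for node, label in matches.items():
        total += alpha * peer_cost(node, label)          # intra-level
        p = parent_of.get(node)
        if p is not None:                                # inter-level
            total += (1 - alpha) * parent_cost(node, label, matches[p])
    return total
```

Because each node's contribution references its parent's current match, reassigning one match perturbs the cost at its own level and at the levels above and below it, which is exactly the coupling the abstract describes.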

2 citations


01 Jan 1997
TL;DR: A Bayesian framework for matching Delaunay triangulations using a model of the compatibility between faces of the data and model graphs and a particularly simple compatibility model that is entirely devoid of free parameters is presented.
Abstract: This paper describes a Bayesian framework for matching Delaunay triangulations. Relational structures of this sort are ubiquitous in intermediate level computer vision, being used to represent both Voronoi tessellations of the image plane and volumetric surface data. Our matching process is realised in terms of probabilistic relaxation. The novelty of our method stems from its use of a support function specified in terms of face-units of the graphs under match. In this way, we draw on more expressive constraints than is possible at the level of edge-units alone. In order to apply this new relaxation process to the matching of realistic imagery requires a model of the compatibility between faces of the data and model graphs. We present a particularly simple compatibility model that is entirely devoid of free parameters. It requires only knowledge of the numbers of nodes, edges and faces in the model graph. The resulting matching scheme is evaluated on radar images and compared with its edge-based counterpart. We establish the operational limits and noise sensitivity on the matching of random-dot patterns. Copyright © 1996 Pattern Recognition Society. Published by Elsevier Science Ltd.
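The idea of drawing support from face-units rather than edge-units can be sketched as follows: a candidate assignment is supported by the fraction of data-graph triangles whose mapped nodes also form a face of the model graph. This is a simplified illustration of a face-based support function, not the paper's parameter-free compatibility model.

```python
def face_support(assignment, data_faces, model_faces):
    """Fraction of data-graph triangles mapped onto model-graph
    triangles under the given node assignment (sketch only)."""
    model_set = {frozenset(f) for f in model_faces}
    mapped = 0
    for face in data_faces:
        image = frozenset(assignment.get(n) for n in face)
        # the image must be three distinct, assigned nodes forming
        # a face of the model graph
        if None not in image and image in model_set:
            mapped += 1
    return mapped / len(data_faces) if data_faces else 0.0
```

Because a triangle constrains three mutually adjacent matches at once, agreement at the face level is a stronger (more expressive) condition than agreement over its three edges taken separately.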