scispace - formally typeset
Author

Neil Gordon

Bio: Neil Gordon is an academic researcher from Aston University. The author has contributed to research in topics: Particle filter & Negative luminescence. The author has an h-index of 37 and has co-authored 181 publications receiving 37,011 citations. Previous affiliations of Neil Gordon include QinetiQ & University of Cambridge.


Papers
Proceedings Article
26 Sep 2008
TL;DR: A statistical analysis of vessel motion patterns in ports and waterways using AIS ship self-reporting data; motion patterns extracted from historic data are used to construct anomaly detectors in the framework of adaptive kernel density estimation.
Abstract: The paper is devoted to statistical analysis of vessel motion patterns in the ports and waterways using AIS ship self-reporting data. From the real historic AIS data we extract motion patterns which are then used to construct the corresponding motion anomaly detectors. This is carried out in the framework of adaptive kernel density estimation. The anomaly detector is then sequentially applied to the real incoming AIS data for the purpose of anomaly detection. Under the null hypothesis (no anomaly), using the historic motion pattern data, we predict the motion of vessels using the Gaussian sum tracking filter.
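The KDE-based anomaly detection idea in the abstract can be sketched as follows. This is an illustrative simplification: the paper uses adaptive kernel density estimation on real AIS data, while this sketch uses SciPy's fixed-bandwidth `gaussian_kde` on synthetic data, and the state vector `(x, y, vx, vy)`, noise levels, and percentile threshold are all assumptions for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical historic vessel tracks: rows of (x, y, vx, vy) from AIS reports.
rng = np.random.default_rng(0)
historic = rng.normal(loc=[0.0, 0.0, 5.0, 0.0],
                      scale=[1.0, 1.0, 0.5, 0.5], size=(500, 4))

# Fit a kernel density estimate of the "normal" motion pattern.
# (Fixed bandwidth here; the paper uses an adaptive-bandwidth KDE.)
kde = gaussian_kde(historic.T)

# Flag an incoming report as anomalous if its density under the historic
# pattern falls below a threshold, e.g. the 1st percentile of historic densities.
threshold = np.percentile(kde(historic.T), 1)

def is_anomalous(report):
    return kde(report.reshape(4, 1))[0] < threshold

normal_report = np.array([0.1, -0.2, 5.1, 0.1])   # consistent with the pattern
odd_report = np.array([8.0, 8.0, -5.0, 3.0])      # far from any historic motion
print(is_anomalous(normal_report), is_anomalous(odd_report))
```

The threshold choice trades false alarms against missed anomalies; the paper selects it in a sequential hypothesis-testing framework rather than by a fixed percentile.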

226 citations

Journal ArticleDOI
26 Feb 2014-Sensors
TL;DR: An analysis of current scientific literature, mainly covering the last decade, examining trends in the development of electronic, acoustic and optical-fiber humidity sensors; the findings indicate that a new generation of sensor technology based on optical fibers is emerging.
Abstract: This review offers new perspectives on the subject and highlights an area in need of further research. It includes an analysis of current scientific literature mainly covering the last decade and examines the trends in the development of electronic, acoustic and optical-fiber humidity sensors over this period. The major findings indicate that a new generation of sensor technology based on optical fibers is emerging. The current trends suggest that electronic humidity sensors could soon be replaced by sensors that are based on photonic structures. Recent scientific advances are expected to allow dedicated systems to avoid the relatively high price of interrogation modules that is currently a major disadvantage of fiber-based sensors.

212 citations

Journal ArticleDOI
TL;DR: A Monte Carlo simulation example of a bearings-only tracking problem is presented, and the performance of the bootstrap filter is compared with a standard Cartesian extended Kalman filter (EKF), a modified gain EKF, and a hybrid filter.
Abstract: The bootstrap filter is an algorithm for implementing recursive Bayesian filters. The required density of the state vector is represented as a set of random samples that are updated and propagated by the algorithm. The method is not restricted by assumptions of linearity or Gaussian noise: It may be applied to any state transition or measurement model. A Monte Carlo simulation example of a bearings-only tracking problem is presented, and the performance of the bootstrap filter is compared with a standard Cartesian extended Kalman filter (EKF), a modified gain EKF, and a hybrid filter. A preliminary investigation of an application of the bootstrap filter to an exoatmospheric engagement with non-Gaussian measurement errors is also given.
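The bootstrap filter described in the abstract can be sketched in a few lines. The scalar random-walk state model, the noise levels, and the particle count below are illustrative assumptions, not from the paper; the point is the propagate–weight–resample cycle, which places no linearity or Gaussianity restrictions on the models.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 2000                 # number of particles (random samples of the state)
q, r = 0.1, 0.5          # assumed process / measurement noise std devs
T = 50

# Simulate a 1-D random-walk state observed in Gaussian noise.
x_true = np.cumsum(rng.normal(0, q, T))
z = x_true + rng.normal(0, r, T)

particles = rng.normal(0, 1, N)
estimates = []
for t in range(T):
    # Propagate each sample through the state transition model.
    particles = particles + rng.normal(0, q, N)
    # Weight each sample by the measurement likelihood.
    w = np.exp(-0.5 * ((z[t] - particles) / r) ** 2)
    w /= w.sum()
    estimates.append(np.sum(w * particles))      # posterior mean estimate
    # Resample (the "bootstrap" step) to avoid weight degeneracy.
    particles = rng.choice(particles, size=N, p=w)

rmse = np.sqrt(np.mean((np.array(estimates) - x_true) ** 2))
print(f"RMSE: {rmse:.3f}")
```

Because the density is carried by samples rather than a Gaussian parameterization, swapping in a nonlinear transition or a non-Gaussian likelihood only changes the propagation and weighting lines.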

168 citations

Journal ArticleDOI
17 Oct 2005
TL;DR: A particle-based track-before-detect filtering algorithm that incorporates the Swerling family of target amplitude fluctuation models in order to capture the effect of radar cross-section changes that a target would present to a sensor over time is presented.
Abstract: A particle-based track-before-detect filtering algorithm is presented. This algorithm incorporates the Swerling family of target amplitude fluctuation models in order to capture the effect of radar cross-section changes that a target would present to a sensor over time. The filter is designed with an existence variable, to determine the presence of a target in the data, and an efficient method of incorporating this variable in a particle filter scheme is developed. Results of the algorithm on simulated data show a significant gain in detection performance through accurately modelling the target amplitude fluctuations.
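The existence variable mentioned in the abstract can be illustrated in isolation: each particle carries a binary flag that evolves as a two-state Markov chain alongside the kinematic state. The birth/death probabilities and step count below are assumptions for illustration, and this sketch omits the measurement-likelihood weighting (and the Swerling amplitude models) that the actual filter applies at each step.

```python
import numpy as np

rng = np.random.default_rng(2)

N = 1000                       # particles
p_birth, p_death = 0.05, 0.05  # assumed existence transition probabilities

exist = np.zeros(N, dtype=bool)          # start with no target present
for t in range(20):
    u = rng.random(N)
    # Markov transition of the existence flag: existing targets survive
    # with probability 1 - p_death; absent targets are born with p_birth.
    exist = np.where(exist, u > p_death, u < p_birth)

# The detection decision compares the fraction of "existing" particles
# to a threshold (here we just report the estimated probability).
prob_exist = exist.mean()
print(f"estimated existence probability: {prob_exist:.2f}")
```

In the full track-before-detect filter, the weighting step pulls this probability up or down depending on whether the raw sensor data supports the presence of a target.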

140 citations

Journal ArticleDOI
TL;DR: The problem of tracking multiple targets with multiple sensors in the presence of interfering measurements is considered and a new hybrid bootstrap filter is proposed, an approach where random samples are used to represent the target posterior distributions.
Abstract: The problem of tracking multiple targets with multiple sensors in the presence of interfering measurements is considered. A new hybrid bootstrap filter is proposed. The bootstrap filter is an approach where random samples are used to represent the target posterior distributions. By using this approach, we circumvent the usual problem of an exponentially increasing number of association hypotheses as well as allowing the use of any nonlinear/non-Gaussian system and/or measurement models.

135 citations


Cited by
Journal ArticleDOI
TL;DR: Both optimal and suboptimal Bayesian algorithms for nonlinear/non-Gaussian tracking problems, with a focus on particle filters, are reviewed.
Abstract: Increasingly, for many application areas, it is becoming important to include elements of nonlinearity and non-Gaussianity in order to model accurately the underlying dynamics of a physical system. Moreover, it is typically crucial to process data on-line as it arrives, both from the point of view of storage costs as well as for rapid adaptation to changing signal characteristics. In this paper, we review both optimal and suboptimal Bayesian algorithms for nonlinear/non-Gaussian tracking problems, with a focus on particle filters. Particle filters are sequential Monte Carlo methods based on point mass (or "particle") representations of probability densities, which can be applied to any state-space model and which generalize the traditional Kalman filtering methods. Several variants of the particle filter such as SIR, ASIR, and RPF are introduced within a generic framework of the sequential importance sampling (SIS) algorithm. These are discussed and compared with the standard EKF through an illustrative example.
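The SIR variant mentioned in the abstract hinges on the resampling step. A common low-variance implementation is systematic resampling, sketched below; the weight vector is a made-up example, and this is one of several valid resampling schemes rather than the one any particular paper mandates.

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Return indices of particles to keep, given normalized weights.

    Uses a single uniform draw stratified across n evenly spaced positions,
    which gives lower resampling variance than independent multinomial draws.
    """
    rng = rng or np.random.default_rng()
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n  # one draw, n strata
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                           # guard against round-off
    return np.searchsorted(cumulative, positions)

w = np.array([0.1, 0.2, 0.3, 0.4])                 # illustrative weights
idx = systematic_resample(w)
print(idx)   # heavier-weighted particles tend to appear more often
```

After resampling, all particles are reset to equal weight; variants such as ASIR and RPF differ in how (and when) this step is applied, not in its basic purpose of combating weight degeneracy.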

11,409 citations

Christopher M. Bishop
01 Jan 2006
TL;DR: A textbook covering probability distributions, linear models for regression and classification, neural networks, kernel methods, graphical models, mixture models and EM, approximate inference, sampling methods, continuous latent variables, sequential data, and combining models.
Abstract: Probability Distributions.- Linear Models for Regression.- Linear Models for Classification.- Neural Networks.- Kernel Methods.- Sparse Kernel Machines.- Graphical Models.- Mixture Models and EM.- Approximate Inference.- Sampling Methods.- Continuous Latent Variables.- Sequential Data.- Combining Models.

10,141 citations

Book
24 Aug 2012
TL;DR: This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach, and is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.
Abstract: Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic methods, the book stresses a principled model-based approach, often using the language of graphical models to specify models in a concise and intuitive way. Almost all the models described have been implemented in a MATLAB software package--PMTK (probabilistic modeling toolkit)--that is freely available online. The book is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.

8,059 citations

BookDOI
01 Jan 2001
TL;DR: This book presents the first comprehensive treatment of Monte Carlo techniques, including convergence results and applications to tracking, guidance, automated target recognition, aircraft navigation, robot navigation, econometrics, financial modeling, neural networks, optimal control, optimal filtering, communications, reinforcement learning, signal enhancement, model averaging and selection.
Abstract: Monte Carlo methods are revolutionizing the on-line analysis of data in fields as diverse as financial modeling, target tracking and computer vision. These methods, appearing under the names of bootstrap filters, condensation, optimal Monte Carlo filters, particle filters and survival of the fittest, have made it possible to solve numerically many complex, non-standard problems that were previously intractable. This book presents the first comprehensive treatment of these techniques, including convergence results and applications to tracking, guidance, automated target recognition, aircraft navigation, robot navigation, econometrics, financial modeling, neural networks, optimal control, optimal filtering, communications, reinforcement learning, signal enhancement, model averaging and selection, computer vision, semiconductor design, population biology, dynamic Bayesian networks, and time series analysis. This will be of great value to students, researchers and practitioners, who have some basic knowledge of probability. Arnaud Doucet received the Ph.D. degree from the University of Paris-XI Orsay in 1997. From 1998 to 2000, he conducted research at the Signal Processing Group of Cambridge University, UK. He is currently an assistant professor at the Department of Electrical Engineering of Melbourne University, Australia. His research interests include Bayesian statistics, dynamic models and Monte Carlo methods. Nando de Freitas obtained a Ph.D. degree in information engineering from Cambridge University in 1999. He is presently a research associate with the artificial intelligence group of the University of California at Berkeley. His main research interests are in Bayesian statistics and the application of on-line and batch Monte Carlo methods to machine learning. Neil Gordon obtained a Ph.D. in Statistics from Imperial College, University of London in 1993. He is with the Pattern and Information Processing group at the Defence Evaluation and Research Agency in the United Kingdom. His research interests are in time series, statistical data analysis, and pattern recognition with a particular emphasis on target tracking and missile guidance.

6,574 citations

Book
01 Jan 2005
TL;DR: This book presents planning and navigation algorithms that exploit statistics gleaned from uncertain, imperfect real-world environments to guide robots toward their goals and around obstacles.
Abstract: Planning and navigation algorithms exploit statistics gleaned from uncertain, imperfect real-world environments to guide robots toward their goals and around obstacles.

6,425 citations