Author

Simon Maskell

Bio: Simon Maskell is an academic researcher from the University of Liverpool. The author has contributed to research on topics including particle filters and Kalman filters. The author has an h-index of 27 and has co-authored 128 publications receiving 14,367 citations. Previous affiliations of Simon Maskell include QinetiQ and the University of Cambridge.


Papers
Proceedings ArticleDOI
23 Mar 2004
TL;DR: The paper focuses on the design of a proposal distribution that uses an extended Kalman filter in radar co-ordinates, which improves on the performance achievable with a particle filter using other choices of proposal distribution.
Abstract: Particle filters are the state-of-the-art solution to difficult tracking problems. The crucial step in the efficient application of particle filters to specific problems is the design of the proposal distribution. This proposal distribution defines how the particles are propagated from one time step to the next. If the proposal is not well designed, then one typically needs a huge number of particles to achieve good performance. The paper focuses on the design of a proposal distribution that is well suited to the specific problem being considered; the work is motivated by the need to use particle filters to track using multiple radars observing multiple closely spaced targets at high range. The targets can be resolved in radar co-ordinates, but not easily tracked in Cartesian co-ordinates. In the general case, it often happens that the proposal needs to interpolate between both the information derived using the dynamic model for the target behaviour and the information inherent in the measurement. Schemes to conduct this interpolation are often based on extended or unscented Kalman filters. When using such filters in non-linear environments, such as when tracking using radars, a pertinent choice of co-ordinate frame can improve performance. Based on this idea, a proposal distribution is described that uses an extended Kalman filter in radar co-ordinates. This results in a skewed proposal in Cartesian co-ordinates that, in this specific problem, improves on the performance possible using a particle filter with other choices of proposal distribution.
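
The central idea, performing the Kalman update in the co-ordinate frame where the measurement is linear and mapping the result back to Cartesian space, can be sketched in a few lines. The following Python sketch is illustrative rather than the authors' implementation: it assumes a 2-D position-only state, random-walk dynamics, and made-up noise levels.

```python
import numpy as np

def to_polar(xy):
    x, y = xy
    return np.array([np.hypot(x, y), np.arctan2(y, x)])

def to_cartesian(rb):
    r, b = rb
    return np.array([r * np.cos(b), r * np.sin(b)])

def propose_particle(particle, z_polar, Q, R, rng):
    """Sample a new particle from an EKF-style proposal built in radar
    (range, bearing) co-ordinates, where the measurement model is linear."""
    # Predict with (assumed) random-walk dynamics in Cartesian space.
    pred = particle + rng.multivariate_normal(np.zeros(2), Q)
    m = to_polar(pred)
    # Jacobian of the Cartesian-to-polar map, used to push Q into polar space.
    x, y = pred
    r2 = x * x + y * y
    J = np.array([[x / np.sqrt(r2), y / np.sqrt(r2)],
                  [-y / r2, x / r2]])
    P = J @ Q @ J.T                 # linearized prior covariance in polar co-ordinates
    K = P @ np.linalg.inv(P + R)    # Kalman gain with H = I in polar co-ordinates
    mean = m + K @ (z_polar - m)
    cov = (np.eye(2) - K) @ P
    # A Gaussian in polar co-ordinates is a skewed density in Cartesian space.
    return to_cartesian(rng.multivariate_normal(mean, cov))

rng = np.random.default_rng(0)
Q = np.diag([25.0, 25.0])           # process noise (m^2), illustrative
R = np.diag([1.0, 1e-4])            # range (m^2) and bearing (rad^2) noise
z = np.array([1000.0, 0.5])         # observed range (m) and bearing (rad)
print(propose_particle(np.array([995.0, 540.0]), z, Q, R, rng))
```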

1 citation

Journal ArticleDOI
TL;DR: In this paper, the authors propose a novel parallel redistribution for Distributed Memory architectures that achieves an O(log₂ N) time complexity, improving on the O((log₂ N)²) state of the art.
Abstract: Resampling is a well-known statistical algorithm that is commonly applied in the context of Particle Filters (PFs) in order to perform state estimation for non-linear non-Gaussian dynamic models. As the models become more complex and accurate, the run-time of PF applications becomes increasingly slow. Parallel computing can help to address this. However, resampling (and, hence, PFs as well) necessarily involves a bottleneck, the redistribution step, which is notoriously challenging to parallelize if using textbook parallel computing techniques. A state-of-the-art redistribution takes O((log₂ N)²) computations on Distributed Memory (DM) architectures, which most supercomputers adopt, whereas redistribution can be performed in O(log₂ N) on Shared Memory (SM) architectures, such as GPUs or mainstream CPUs. In this paper, we propose a novel parallel redistribution for DM that achieves an O(log₂ N) time complexity. We also present empirical results that indicate that our novel approach outperforms the O((log₂ N)²) approach.
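
Why prefix sums matter here can be seen in a single-process sketch of redistribution: each particle's output offset is an exclusive prefix sum of the copy counts, and the prefix sum is exactly the primitive that parallel hardware evaluates in O(log₂ N). The sketch below is sequential and illustrative, not the paper's distributed-memory algorithm.

```python
import numpy as np

def redistribute(particles, ncopies):
    """Copy each particle i exactly ncopies[i] times into the output array."""
    assert ncopies.sum() == len(particles), "resampling preserves N"
    offsets = np.cumsum(ncopies) - ncopies   # exclusive prefix sum of copy counts
    out = np.empty_like(particles)
    for i, (off, c) in enumerate(zip(offsets, ncopies)):
        out[off:off + c] = particles[i]      # write particle i into its slot
    return out

particles = np.array([10.0, 20.0, 30.0, 40.0])
ncopies = np.array([0, 2, 1, 1])             # e.g. from systematic resampling
print(redistribute(particles, ncopies))      # [20. 20. 30. 40.]
```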

1 citation

Proceedings ArticleDOI
10 Jul 2018
TL;DR: This paper considers the problem of tracking a manoeuvring target when some of the measurements are delayed by the time taken to propagate through some medium, and uses a particle filter, which handles out-of-sequence measurements by storing a history of hypothesised target states and measurement emission times for each particle.
Abstract: This paper considers the problem of tracking a manoeuvring target when some of the measurements are delayed by the time taken to propagate through some medium. We are especially interested in bearing-only measurements, since it is possible to extract range information by fusing measurements which have negligible propagation delay (such as from electro-optical sensors) and measurements which have a propagation delay proportional to the range to the target (such as from acoustic sensors). This requires us to handle measurements which appear out of sequence, and with emission times unknown to the tracker. Unlike previous approaches, a particle filter is used, which handles out-of-sequence measurements by storing a history of hypothesised target states and measurement emission times for each particle. This allows new target states and times to be inserted into the trajectory of each target by interpolating between adjacent states in the history.
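
The per-particle history mechanism can be illustrated with a minimal sketch (not the authors' implementation): each particle stores a (time, state) trajectory, a delayed bearing measurement is evaluated at its hypothesised emission time by interpolating that trajectory, and the particle is reweighted. Linear interpolation, a sensor at the origin, and the Gaussian bearing likelihood are simplifying assumptions.

```python
import numpy as np
from bisect import bisect_left

def state_at(times, states, t):
    """Linearly interpolate a particle's stored trajectory at time t."""
    j = bisect_left(times, t)
    if j == 0:
        return states[0]
    if j == len(times):
        return states[-1]
    t0, t1 = times[j - 1], times[j]
    a = (t - t0) / (t1 - t0)
    return (1 - a) * states[j - 1] + a * states[j]

def oosm_update(histories, weights, z_bearing, t_emit, sigma=0.05):
    """Reweight particles against a delayed bearing measurement emitted at t_emit."""
    new_w = weights.copy()
    for k, (times, states) in enumerate(histories):
        s = state_at(times, states, t_emit)   # hypothesised past state
        predicted = np.arctan2(s[1], s[0])    # bearing from a sensor at the origin
        new_w[k] *= np.exp(-0.5 * ((z_bearing - predicted) / sigma) ** 2)
    return new_w / new_w.sum()

# Two particles, each storing a short (time, state) history.
histories = [
    ([0.0, 1.0, 2.0], np.array([[100.0, 0.0], [110.0, 5.0], [120.0, 10.0]])),
    ([0.0, 1.0, 2.0], np.array([[100.0, 0.0], [105.0, 20.0], [110.0, 40.0]])),
]
w = np.array([0.5, 0.5])
print(oosm_update(histories, w, z_bearing=0.05, t_emit=1.5))
```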

1 citation

Journal ArticleDOI
TL;DR: GroundsWell will use interdisciplinary, problem-solving approaches to accelerate and optimise community collaborations among citizens, users, implementers, policymakers and researchers to impact research, policy, practice and active citizenship.
Abstract: Natural environments, such as parks, woodlands and lakes, have positive impacts on health and wellbeing. Urban Green and Blue Spaces (UGBS), and the activities that take place in them, can significantly influence the health outcomes of all communities, and reduce health inequalities. Improving access and quality of UGBS needs understanding of the range of systems (e.g. planning, transport, environment, community) in which UGBS are located. UGBS offers an ideal exemplar for testing systems innovations as it reflects place-based and whole-society processes, with potential to reduce non-communicable disease (NCD) risk and associated social inequalities in health. UGBS can impact multiple behavioural and environmental aetiological pathways. However, the systems which desire, design, develop, and deliver UGBS are fragmented and siloed, with ineffective mechanisms for data generation, knowledge exchange and mobilisation. Further, UGBS need to be co-designed with and by those whose health could benefit most from them, so they are appropriate, accessible, valued and used well. This paper describes a major new prevention research programme and partnership, GroundsWell, which aims to transform UGBS-related systems by improving how we plan, design, evaluate and manage UGBS so that it benefits all communities, especially those who are in poorest health. We use a broad definition of health to include physical, mental, social wellbeing and quality of life. Our objectives are to transform systems so that UGBS are planned, developed, implemented, maintained and evaluated with our communities and data systems to enhance health and reduce inequalities. GroundsWell will use interdisciplinary, problem-solving approaches to accelerate and optimise community collaborations among citizens, users, implementers, policymakers and researchers to impact research, policy, practice and active citizenship. GroundsWell will be shaped and developed in three pioneer cities (Belfast, Edinburgh, Liverpool) and their regional contexts, with embedded translational mechanisms to ensure that outputs and impact have UK-wide and international application.

1 citation

Posted Content
TL;DR: In this paper, the No-U-Turn Sampler (NUTS) is incorporated into an SMC sampler to improve the efficiency of the exploration of the target space, and the SMC sampler can be optimized using both a near-optimal L-kernel and a Hamiltonian proposal.
Abstract: Markov Chain Monte Carlo (MCMC) is a powerful method for drawing samples from non-standard probability distributions and is utilized across many fields and disciplines. Methods such as Metropolis-Adjusted Langevin (MALA) and Hamiltonian Monte Carlo (HMC), which use gradient information to explore the target distribution, are popular variants of MCMC. The Sequential Monte Carlo (SMC) sampler is an alternative sampling method which, unlike MCMC, can readily utilise parallel computing architectures and also has tuning parameters not available to MCMC. One such parameter is the L-kernel which can be used to minimise the variance of the estimates from an SMC sampler. In this letter, we show how the proposal used in the No-U-Turn Sampler (NUTS), an advanced variant of HMC, can be incorporated into an SMC sampler to improve the efficiency of the exploration of the target space. We also show how the SMC sampler can be optimized using both a near-optimal L-kernel and a Hamiltonian proposal.
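
To see how a Hamiltonian proposal slots into an SMC sampler, consider the following minimal sketch: each particle is moved by one leapfrog trajectory with a Metropolis correction, standing in for the full NUTS proposal of the letter, and the near-optimal L-kernel is omitted (weights are left unchanged under the MCMC move). The Gaussian target, step size, and trajectory length are illustrative assumptions.

```python
import numpy as np

def log_target(x):
    return -0.5 * np.dot(x, x)               # standard Gaussian target

def grad_log_target(x):
    return -x

def leapfrog(x, p, eps=0.2, n_steps=10):
    """One Hamiltonian trajectory with the leapfrog integrator."""
    p = p + 0.5 * eps * grad_log_target(x)
    for _ in range(n_steps):
        x = x + eps * p
        p = p + eps * grad_log_target(x)
    p = p - 0.5 * eps * grad_log_target(x)   # undo the extra half step
    return x, p

def smc_hmc_step(particles, log_w, rng):
    """Move every particle with one HMC proposal plus Metropolis correction."""
    new_particles = np.empty_like(particles)
    for k, x in enumerate(particles):
        p0 = rng.standard_normal(x.shape)    # fresh momentum per particle
        x_new, p_new = leapfrog(x, p0)
        log_a = (log_target(x_new) - 0.5 * p_new @ p_new) \
              - (log_target(x) - 0.5 * p0 @ p0)
        new_particles[k] = x_new if np.log(rng.uniform()) < log_a else x
    return new_particles, log_w              # weights unchanged by the MCMC move

rng = np.random.default_rng(1)
particles = rng.normal(5.0, 1.0, size=(100, 2))   # start far from the target
log_w = np.full(100, -np.log(100))
for _ in range(20):
    particles, log_w = smc_hmc_step(particles, log_w, rng)
print(particles.mean(axis=0))                # should approach [0, 0]
```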

1 citation


Cited by
Book
24 Aug 2012
TL;DR: This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach, and is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.
Abstract: Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic methods, the book stresses a principled model-based approach, often using the language of graphical models to specify models in a concise and intuitive way. Almost all the models described have been implemented in a MATLAB software package--PMTK (probabilistic modeling toolkit)--that is freely available online. The book is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.

8,059 citations

MonographDOI
01 Jan 2006
TL;DR: This coherent and comprehensive book unifies material from several sources, including robotics, control theory, artificial intelligence, and algorithms, and extends to planning under the differential constraints that arise when automating the motions of virtually any mechanical system.
Abstract: Planning algorithms are impacting technical disciplines and industries around the world, including robotics, computer-aided design, manufacturing, computer graphics, aerospace applications, drug design, and protein folding. This coherent and comprehensive book unifies material from several sources, including robotics, control theory, artificial intelligence, and algorithms. The treatment is centered on robot motion planning but integrates material on planning in discrete spaces. A major part of the book is devoted to planning under uncertainty, including decision theory, Markov decision processes, and information spaces, which are the “configuration spaces” of all sensor-based planning problems. The last part of the book delves into planning under differential constraints that arise when automating the motions of virtually any mechanical system. Developed from courses taught by the author, the book is intended for students, engineers, and researchers in robotics, artificial intelligence, and control theory as well as computer graphics, algorithms, and computational biology.

6,340 citations

Journal ArticleDOI
TL;DR: A new approach toward target representation and localization, the central component in visual tracking of nonrigid objects, is proposed, which employs a metric derived from the Bhattacharyya coefficient as similarity measure, and uses the mean shift procedure to perform the optimization.
Abstract: A new approach toward target representation and localization, the central component in visual tracking of nonrigid objects, is proposed. The feature histogram-based target representations are regularized by spatial masking with an isotropic kernel. The masking induces spatially-smooth similarity functions suitable for gradient-based optimization, hence, the target localization problem can be formulated using the basin of attraction of the local maxima. We employ a metric derived from the Bhattacharyya coefficient as similarity measure, and use the mean shift procedure to perform the optimization. In the presented tracking examples, the new method successfully coped with camera motion, partial occlusions, clutter, and target scale variations. Integration with motion filters and data association techniques is also discussed. We describe only a few of the potential applications: exploitation of background information, Kalman tracking using motion models, and face tracking.
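
The two computational ingredients, the Bhattacharyya-coefficient similarity between histograms and the mean-shift relocation of the search window, can be sketched compactly. The sketch below is a simplified illustration on a grayscale image: the paper's isotropic spatial kernel and scale adaptation are replaced by a uniform window, and all sizes and bin counts are assumptions.

```python
import numpy as np

NBINS = 16

def histogram(patch):
    """Normalized intensity histogram: the target/candidate representation."""
    h, _ = np.histogram(patch, bins=NBINS, range=(0, 256))
    return h / h.sum()

def bhattacharyya(p, q):
    return np.sum(np.sqrt(p * q))            # similarity in [0, 1]

def mean_shift_step(image, center, half, q):
    """Move the window toward pixels whose bins are under-represented in the
    candidate relative to the target model q (mean-shift weights sqrt(q/p))."""
    cy, cx = center
    patch = image[cy - half:cy + half, cx - half:cx + half]
    p = histogram(patch)
    bins = np.minimum(patch.astype(int) * NBINS // 256, NBINS - 1)
    w = np.sqrt(q[bins] / np.maximum(p[bins], 1e-12))
    if w.sum() == 0.0:                       # no target-coloured pixels in view
        return center
    ys, xs = np.mgrid[cy - half:cy + half, cx - half:cx + half]
    return (int(round((w * ys).sum() / w.sum())),
            int(round((w * xs).sum() / w.sum())))

rng = np.random.default_rng(2)
image = rng.integers(0, 60, size=(100, 100)).astype(float)
image[40:60, 55:75] = 200.0                  # bright "target", offset from the start
q = histogram(image[40:60, 55:75])           # target model
center = (50, 50)
for _ in range(5):
    center = mean_shift_step(image, center, half=10, q=q)
print(center, bhattacharyya(histogram(
    image[center[0] - 10:center[0] + 10, center[1] - 10:center[1] + 10]), q))
```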

4,996 citations

Posted Content
TL;DR: In this paper, the authors provide a unified and comprehensive theory of structural time series models, including a detailed treatment of the Kalman filter for modeling economic and social time series, and address the special problems which the treatment of such series poses.
Abstract: In this book, Andrew Harvey sets out to provide a unified and comprehensive theory of structural time series models. Unlike the traditional ARIMA models, structural time series models consist explicitly of unobserved components, such as trends and seasonals, which have a direct interpretation. As a result the model selection methodology associated with structural models is much closer to econometric methodology. The link with econometrics is made even closer by the natural way in which the models can be extended to include explanatory variables and to cope with multivariate time series. From the technical point of view, state space models and the Kalman filter play a key role in the statistical treatment of structural time series models. The book includes a detailed treatment of the Kalman filter. This technique was originally developed in control engineering, but is becoming increasingly important in fields such as economics and operations research. This book is concerned primarily with modelling economic and social time series, and with addressing the special problems which the treatment of such series poses. The properties of the models and the methodological techniques used to select them are illustrated with various applications. These range from the modelling of trends and cycles in US macroeconomic time series to an evaluation of the effects of seat belt legislation in the UK.
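
The Kalman filter's role is easiest to see on the simplest structural model, the local level model, where the observation is a random-walk level plus noise: y_t = mu_t + eps_t, mu_t = mu_{t-1} + eta_t. The following minimal sketch uses illustrative variances and a diffuse initial state; it is a generic textbook recursion, not code from the book.

```python
import numpy as np

def local_level_filter(y, var_eps=1.0, var_eta=0.1, mu0=0.0, p0=1e6):
    """Filtered estimates of the unobserved level mu_t given observations y."""
    mu, p = mu0, p0                           # state mean and variance (diffuse prior)
    filtered = []
    for obs in y:
        p = p + var_eta                       # predict: the level is a random walk
        k = p / (p + var_eps)                 # Kalman gain weighs data vs. prediction
        mu = mu + k * (obs - mu)              # update the level estimate
        p = (1 - k) * p
        filtered.append(mu)
    return np.array(filtered)

rng = np.random.default_rng(3)
level = np.cumsum(rng.normal(0, 0.3, 200))    # slowly drifting true level
y = level + rng.normal(0, 1.0, 200)           # noisy observations
est = local_level_filter(y, var_eps=1.0, var_eta=0.09)
print(float(np.mean((est - level) ** 2)))     # well below the raw noise variance
```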

4,252 citations

Proceedings ArticleDOI
26 Apr 2010
TL;DR: This paper investigates the real-time interaction of events such as earthquakes in Twitter, proposes an algorithm to monitor tweets and detect a target event, and produces a probabilistic spatiotemporal model for the target event that can find the center and the trajectory of the event location.
Abstract: Twitter, a popular microblogging service, has received much attention recently. An important characteristic of Twitter is its real-time nature. For example, when an earthquake occurs, people make many Twitter posts (tweets) related to the earthquake, which enables detection of earthquake occurrence promptly, simply by observing the tweets. As described in this paper, we investigate the real-time interaction of events such as earthquakes in Twitter and propose an algorithm to monitor tweets and to detect a target event. To detect a target event, we devise a classifier of tweets based on features such as the keywords in a tweet, the number of words, and their context. Subsequently, we produce a probabilistic spatiotemporal model for the target event that can find the center and the trajectory of the event location. We consider each Twitter user as a sensor and apply Kalman filtering and particle filtering, which are widely used for location estimation in ubiquitous/pervasive computing. The particle filter works better than other comparable methods for estimating the centers of earthquakes and the trajectories of typhoons. As an application, we construct an earthquake reporting system in Japan. Because of the numerous earthquakes and the large number of Twitter users throughout the country, we can detect an earthquake with high probability (96% of earthquakes of Japan Meteorological Agency (JMA) seismic intensity scale 3 or more are detected) merely by monitoring tweets. Our system detects earthquakes promptly and sends e-mails to registered users. Notification is delivered much faster than the announcements that are broadcast by the JMA.
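
The "each user is a sensor" idea can be sketched with a toy particle filter that estimates a static event centre from noisy per-user location reports. This is only an illustration of the filtering step: the paper's tweet classifier, temporal dynamics, and real geographic coordinates are all omitted, and every constant below is an assumption.

```python
import numpy as np

def estimate_center(reports, n_particles=2000, report_sigma=0.5, rng=None):
    """Posterior-mean estimate of a static event centre from noisy reports."""
    if rng is None:
        rng = np.random.default_rng()
    particles = rng.uniform(-10, 10, size=(n_particles, 2))  # candidate centres
    w = np.full(n_particles, 1.0 / n_particles)
    for z in reports:                 # each tweet location is one "sensor reading"
        d2 = np.sum((particles - z) ** 2, axis=1)
        w *= np.exp(-0.5 * d2 / report_sigma ** 2)            # Gaussian likelihood
        w /= w.sum()
        if 1.0 / np.sum(w ** 2) < n_particles / 2:            # ESS collapsed
            # Systematic resampling, then small roughening jitter so the
            # static-state particle cloud keeps some diversity.
            u = (rng.uniform() + np.arange(n_particles)) / n_particles
            idx = np.minimum(np.searchsorted(np.cumsum(w), u), n_particles - 1)
            particles = particles[idx] + rng.normal(0, 0.05, size=particles.shape)
            w = np.full(n_particles, 1.0 / n_particles)
    return (w[:, None] * particles).sum(axis=0)

rng = np.random.default_rng(4)
true_center = np.array([3.0, -2.0])
reports = true_center + rng.normal(0, 0.5, size=(50, 2))
print(estimate_center(reports, rng=rng))      # should land close to [3, -2]
```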

3,976 citations