Journal ArticleDOI

Novel approach to nonlinear/non-Gaussian Bayesian state estimation

01 Apr 1993-Vol. 140, Iss: 2, pp 107-113
TL;DR: An algorithm, the bootstrap filter, is proposed for implementing recursive Bayesian filters: the required density of the state vector is represented as a set of random samples, which are updated and propagated by the algorithm.
Abstract: An algorithm, the bootstrap filter, is proposed for implementing recursive Bayesian filters. The required density of the state vector is represented as a set of random samples, which are updated and propagated by the algorithm. The method is not restricted by assumptions of linearity or Gaussian noise: it may be applied to any state transition or measurement model. A simulation example of the bearings-only tracking problem is presented. This simulation includes schemes for improving the efficiency of the basic algorithm. For this example, the performance of the bootstrap filter is greatly superior to the standard extended Kalman filter.
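The loop the abstract describes — represent the filtering density by random samples, propagate them through the state transition, weight them by the measurement likelihood, and resample — can be sketched as follows. This is a minimal illustration on a simple scalar nonlinear benchmark model, not the paper's bearings-only example; the model, noise variances, and particle count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_filter(ys, n_particles=500, q=1.0, r=1.0):
    """Bootstrap filter for the scalar benchmark model
        x_k = 0.5 x_{k-1} + 25 x_{k-1}/(1 + x_{k-1}^2) + 8 cos(1.2 k) + w_k
        y_k = x_k^2 / 20 + v_k
    with w_k ~ N(0, q) and v_k ~ N(0, r).  Returns the posterior-mean
    state estimate at each time step."""
    particles = rng.normal(0.0, np.sqrt(q), n_particles)  # initial sample set
    estimates = []
    for k, y in enumerate(ys, start=1):
        # 1. Prediction: propagate every sample through the state transition.
        particles = (0.5 * particles + 25 * particles / (1 + particles**2)
                     + 8 * np.cos(1.2 * k)
                     + rng.normal(0.0, np.sqrt(q), n_particles))
        # 2. Update: weight each sample by the Gaussian measurement likelihood.
        w = np.exp(-0.5 * (y - particles**2 / 20) ** 2 / r)
        w = w + 1e-300            # guard against an all-zero weight vector
        w /= w.sum()
        # 3. Resample with replacement in proportion to the weights.
        particles = rng.choice(particles, size=n_particles, p=w)
        estimates.append(particles.mean())
    return np.array(estimates)
```

Resampling at every step, as here, follows the original scheme; later work resamples only when the sample set degenerates.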


Citations
Journal ArticleDOI
TL;DR: Both optimal and suboptimal Bayesian algorithms for nonlinear/non-Gaussian tracking problems, with a focus on particle filters are reviewed.
Abstract: Increasingly, for many application areas, it is becoming important to include elements of nonlinearity and non-Gaussianity in order to model accurately the underlying dynamics of a physical system. Moreover, it is typically crucial to process data on-line as it arrives, both from the point of view of storage costs as well as for rapid adaptation to changing signal characteristics. In this paper, we review both optimal and suboptimal Bayesian algorithms for nonlinear/non-Gaussian tracking problems, with a focus on particle filters. Particle filters are sequential Monte Carlo methods based on point mass (or "particle") representations of probability densities, which can be applied to any state-space model and which generalize the traditional Kalman filtering methods. Several variants of the particle filter such as SIR, ASIR, and RPF are introduced within a generic framework of the sequential importance sampling (SIS) algorithm. These are discussed and compared with the standard EKF through an illustrative example.
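The sequential importance sampling (SIS) framework this survey describes differs from plain SIR mainly in when resampling happens. A common criterion (an assumption here, not spelled out in the abstract) is to resample only when the effective sample size N_eff = 1/Σ w_i² falls below a threshold; a minimal sketch:

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum(w_i^2) for normalized weights: ranges from 1
    (all mass on one particle, degenerate) up to N (uniform weights)."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w**2)

def maybe_resample(particles, weights, rng, threshold=0.5):
    """Resample a particle ndarray only when N_eff < threshold * N.
    (Pure SIR resamples every step; gating on N_eff reduces the extra
    Monte Carlo variance that resampling introduces.)"""
    n = len(particles)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    if effective_sample_size(w) < threshold * n:
        idx = rng.choice(n, size=n, p=w)        # draw ancestor indices
        return particles[idx], np.full(n, 1.0 / n)  # weights reset uniform
    return particles, w
```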

11,409 citations


Cites methods or result from "Novel approach to nonlinear/non-Gau..."

  • ...[14]. This sequential MC (SMC) approach is known variously as bootstrap filtering [ 17 ], the condensation algorithm [29], particle filtering [6], interacting particle approximations [10], [11], and survival of the fittest [24]....


  • ...1) Sampling Importance Resampling Filter: The SIR filter proposed in [ 17 ] is an MC method that can be applied to recursive Bayesian filtering problems....


  • ...We use and . This example has been analyzed before in many publications [5], [ 17 ], [25]....


Christopher M. Bishop
01 Jan 2006
TL;DR: This book covers probability distributions, linear models for regression and classification, neural networks, kernel methods, graphical models, mixture models and EM, approximate inference, sampling methods, and combining models in the context of machine learning.
Abstract: Probability Distributions.- Linear Models for Regression.- Linear Models for Classification.- Neural Networks.- Kernel Methods.- Sparse Kernel Machines.- Graphical Models.- Mixture Models and EM.- Approximate Inference.- Sampling Methods.- Continuous Latent Variables.- Sequential Data.- Combining Models.

10,141 citations

Book
24 Aug 2012
TL;DR: This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach, and is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.
Abstract: Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic methods, the book stresses a principled model-based approach, often using the language of graphical models to specify models in a concise and intuitive way. Almost all the models described have been implemented in a MATLAB software package--PMTK (probabilistic modeling toolkit)--that is freely available online. The book is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.

8,059 citations


Cites background from "Novel approach to nonlinear/non-Gau..."

  • ...(The original bootstrap filter (Gordon 1993) resampled at every step, but this is suboptimal....


BookDOI
01 Jan 2001
TL;DR: This book presents the first comprehensive treatment of Monte Carlo techniques, including convergence results and applications to tracking, guidance, automated target recognition, aircraft navigation, robot navigation, econometrics, financial modeling, neural networks, optimal control, optimal filtering, communications, reinforcement learning, signal enhancement, model averaging and selection.
Abstract: Monte Carlo methods are revolutionizing the on-line analysis of data in fields as diverse as financial modeling, target tracking and computer vision. These methods, appearing under the names of bootstrap filters, condensation, optimal Monte Carlo filters, particle filters and survival of the fittest, have made it possible to solve numerically many complex, non-standard problems that were previously intractable. This book presents the first comprehensive treatment of these techniques, including convergence results and applications to tracking, guidance, automated target recognition, aircraft navigation, robot navigation, econometrics, financial modeling, neural networks, optimal control, optimal filtering, communications, reinforcement learning, signal enhancement, model averaging and selection, computer vision, semiconductor design, population biology, dynamic Bayesian networks, and time series analysis. This will be of great value to students, researchers and practitioners who have some basic knowledge of probability. Arnaud Doucet received the Ph.D. degree from the University of Paris-XI Orsay in 1997. From 1998 to 2000, he conducted research at the Signal Processing Group of Cambridge University, UK. He is currently an assistant professor at the Department of Electrical Engineering of Melbourne University, Australia. His research interests include Bayesian statistics, dynamic models and Monte Carlo methods. Nando de Freitas obtained a Ph.D. degree in information engineering from Cambridge University in 1999. He is presently a research associate with the artificial intelligence group of the University of California at Berkeley. His main research interests are in Bayesian statistics and the application of on-line and batch Monte Carlo methods to machine learning. Neil Gordon obtained a Ph.D. in Statistics from Imperial College, University of London in 1993. He is with the Pattern and Information Processing group at the Defence Evaluation and Research Agency in the United Kingdom. His research interests are in time series, statistical data analysis, and pattern recognition with a particular emphasis on target tracking and missile guidance.

6,574 citations


Cites background or methods from "Novel approach to nonlinear/non-Gau..."

  • ...This includes the multinomial sampling originally proposed in (Gordon et al. 1993), residual resampling (Higuchi 1997, Liu and Chen 1998) and minimum variance sampling (Carpenter, Clifford and Fearnhead 1999a, Crisan and Lyons 1999, Kitagawa 1996)....


  • ...In (Del Moral and Guionnet 1998b), the algorithm under study is the standard bootstrap filter (Gordon et al. 1993), which is based on an exploration of the space according to the natural dynamics of the system...


  • ...In particular, we show that the auxiliary particle filtering methods of (Pitt and Shephard: this volume) fall into the same general class of algorithms as the standard bootstrap filter of (Gordon et al. 1993)....


  • ...The introduction of this key step in (Gordon et al. 1993) led to the first operational SMC method....


  • ...It is possible to carry out sample regeneration using a mixture approximation (Gordon et al. 1993), see also Liu and West (2001: this volume)....

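The multinomial and residual resampling schemes contrasted in the excerpts above can be sketched as follows (function names are illustrative choices, not from the cited text):

```python
import numpy as np

def multinomial_resample(weights, rng):
    """Draw N ancestor indices i.i.d. from the normalized weight
    distribution (the scheme used in the original bootstrap filter)."""
    n = len(weights)
    return rng.choice(n, size=n, p=weights)

def residual_resample(weights, rng):
    """Keep floor(N * w_i) copies of particle i deterministically, then
    fill the remaining slots multinomially from the residual weights;
    this has lower variance than pure multinomial resampling."""
    n = len(weights)
    w = np.asarray(weights, dtype=float)
    counts = np.floor(n * w).astype(int)          # deterministic copies
    idx = np.repeat(np.arange(n), counts)
    n_rest = n - counts.sum()                     # slots left to fill
    if n_rest > 0:
        residual = n * w - counts                 # fractional parts
        residual /= residual.sum()
        idx = np.concatenate([idx, rng.choice(n, size=n_rest, p=residual)])
    return idx
```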

Journal ArticleDOI
TL;DR: The Condensation algorithm uses “factored sampling”, previously applied to the interpretation of static images, in which the probability distribution of possible interpretations is represented by a randomly generated set.
Abstract: The problem of tracking curves in dense visual clutter is challenging. Kalman filtering is inadequate because it is based on Gaussian densities which, being unimodal, cannot represent simultaneous alternative hypotheses. The Condensation algorithm uses “factored sampling”, previously applied to the interpretation of static images, in which the probability distribution of possible interpretations is represented by a randomly generated set. Condensation uses learned dynamical models, together with visual observations, to propagate the random set over time. The result is highly robust tracking of agile motion. Notwithstanding the use of stochastic methods, the algorithm runs in near real-time.

5,804 citations


Cites background from "Novel approach to nonlinear/non-Gau..."

  • ...…modelling (Blake et al., 1993, 1995), object dynamics are modelled as a second order process, conveniently represented in discrete time t as a second order linear difference equation: x_t − x̄ = A(x_{t−1} − x̄) + B w_t (10), where w_t are independent vectors of independent standard normal variables, the…...


References
Journal ArticleDOI
TL;DR: In this article, two approaches to the non-Gaussian filtering problem are presented that retain the computationally attractive recursive structure of the Kalman filter and approximate well the exact minimum variance filter when either 1) the state noise is Gaussian or its variance is small compared to the observation noise variance, or 2) the observation noise is Gaussian and the system is one step observable.
Abstract: Two approaches to the non-Gaussian filtering problem are presented. The proposed filters retain the computationally attractive recursive structure of the Kalman filter and they approximate well the exact minimum variance filter in cases where either 1) the state noise is Gaussian or its variance small in comparison to the observation noise variance, or 2) the observation noise is Gaussian and the system is one step observable. In both cases, the state estimate is formed as a linear prediction corrected by a nonlinear function of past and present observations. Some simulation results are presented.

373 citations

Journal ArticleDOI
TL;DR: The numerical solution proposed here is obtained by modifying the recursion and using a simple piece-wise constant approximation to the density functions, yielding a bound on the maximum error growth, and a characterization of the situations with potential for large errors.

146 citations