
Showing papers on "Particle filter" published in 1997


Book
Jun Liu, Rong Chen
01 Jan 1997
TL;DR: A general framework for using Monte Carlo methods in dynamic systems is provided, and a general use of Rao-Blackwellization is proposed to improve performance and to compare different Monte Carlo procedures.
Abstract: We provide a general framework for using Monte Carlo methods in dynamic systems and discuss its wide applications. Under this framework, several currently available techniques are studied and generalized to accommodate more complex features. All of these methods are partial combinations of three ingredients: importance sampling and resampling, rejection sampling, and Markov chain iterations. We provide guidelines on how they should be used and under what circumstance each method is most suitable. Through the analysis of differences and connections, we consolidate these methods into a generic algorithm by combining desirable features. In addition, we propose a general use of Rao-Blackwellization to improve performance. Examples from econometrics and engineering are presented to demonstrate the importance of Rao-Blackwellization and to compare different Monte Carlo procedures.

2,150 citations
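
The three ingredients named in the abstract combine naturally in the sequential importance resampling (SIR) filter, the basic particle filter. The following is a minimal sketch, not code from the paper: the state-space model is a common nonlinear benchmark chosen purely for illustration, and resampling is triggered by the usual effective-sample-size heuristic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar state-space model (a standard nonlinear benchmark,
# used here only for illustration):
#   x_t = 0.5 x_{t-1} + 25 x_{t-1} / (1 + x_{t-1}^2) + 8 cos(1.2 t) + v_t
#   y_t = x_t^2 / 20 + w_t
def propagate(x, t):
    return 0.5*x + 25*x/(1 + x**2) + 8*np.cos(1.2*t) + rng.normal(0, np.sqrt(10), x.shape)

def likelihood(y, x):
    return np.exp(-0.5*(y - x**2/20)**2)   # unit-variance observation noise

def sir_filter(ys, n=500):
    x = rng.normal(0, 2, n)                # initial particle cloud
    w = np.full(n, 1.0/n)
    means = []
    for t, y in enumerate(ys, start=1):
        x = propagate(x, t)                # importance sampling from the prior kernel
        w *= likelihood(y, x)              # reweight by the likelihood
        w /= w.sum()
        if 1.0/np.sum(w**2) < n/2:         # resample when effective sample size drops
            x = rng.choice(x, size=n, p=w)
            w = np.full(n, 1.0/n)
        means.append(np.dot(w, x))         # filtered posterior-mean estimate
    return np.array(means)

# Example usage on synthetic data from the same model:
x_true, ys = np.array([0.1]), []
for t in range(1, 101):
    x_true = propagate(x_true, t)
    ys.append(float(x_true[0]**2/20 + rng.normal()))
estimates = sir_filter(np.array(ys))
```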


Journal ArticleDOI
TL;DR: The application of optimal nonlinear/non-Gaussian filtering to the problem of INS/GPS integration in critical situations is described, and particle filtering theory is introduced and GPS/INS integration simulation results are discussed.
Abstract: The application of optimal nonlinear/non-Gaussian filtering to the problem of INS/GPS integration in critical situations is described. This approach is made possible by a new technique called particle filtering, and exhibits superior performance when compared with classical suboptimal techniques such as extended Kalman filtering. Particle filtering theory is introduced and GPS/INS integration simulation results are discussed.

157 citations


Journal ArticleDOI
01 Mar 1997
TL;DR: The reliability of composite generation and transmission systems with time-varying loads at each bus can be effectively assessed using sequential Monte Carlo simulation, which also provides information on the distributions of the adequacy indices, something that is not possible with nonsequential approaches.
Abstract: The reliability of composite generation and transmission systems with time-varying loads at each bus can be effectively assessed using the sequential Monte Carlo simulation method. This method can also provide information on the distributions of the adequacy indices, which is not possible with nonsequential approaches. The method presented uses an annual chronological load curve at each load bus together with sequential Monte Carlo simulation for composite system reliability assessment. The paper presents the basic sequential simulation technique and illustrates its application to obtain distributions of the various composite system indices for the IEEE reliability test system.

50 citations
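
A minimal sketch of the sequential (chronological) simulation idea, under loudly stated assumptions: a hypothetical three-unit system with made-up capacities and failure data, and a toy sinusoidal curve standing in for the annual chronological bus loads (the paper itself uses the IEEE reliability test system). Repeating the yearly simulation yields not just the mean adequacy index but its distribution, which is the advantage claimed over nonsequential methods.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3-unit system (illustrative numbers, not the IEEE RTS data):
capacity = np.array([200.0, 150.0, 100.0])   # MW
mttf     = np.array([1000.0, 800.0, 600.0])  # mean time to failure, hours
mttr     = np.array([50.0, 40.0, 30.0])      # mean time to repair, hours

hours = 8760
# Toy chronological load curve standing in for the annual bus loads.
load = 250 + 80*np.sin(2*np.pi*np.arange(hours)/24) + rng.normal(0, 10, hours)

def simulate_year():
    """One chronological year: sample up/down cycles per unit, count loss-of-load hours."""
    up = np.ones((len(capacity), hours), dtype=bool)
    for i in range(len(capacity)):
        t, state = 0.0, True
        while t < hours:
            dur = rng.exponential(mttf[i] if state else mttr[i])
            if not state:
                up[i, int(t):min(int(t + dur), hours)] = False
            t += dur
            state = not state
    avail = capacity @ up                 # available capacity in each hour
    return np.sum(avail < load)           # loss-of-load hours this year

samples = np.array([simulate_year() for _ in range(200)])
print("LOLE estimate:", samples.mean(), "h/yr;", "index distribution from `samples`")
```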


Dissertation
01 Jan 1997
TL;DR: This thesis presents an alternative general theory for uncertainty analysis, based on the use of stochastic process models, in a Bayesian context, indicating that it has the potential to provide more accurate uncertainty analyses for the parameters of computationally expensive algorithms.
Abstract: In the field of radiation protection, complex computationally expensive algorithms are used to predict radiation doses to organs in the human body from exposure to internally deposited radionuclides. These algorithms contain many inputs, the true values of which are uncertain. Current methods for assessing the effects of the input uncertainties on the output of the algorithms are based on Monte Carlo analyses, i.e. sampling from subjective prior distributions that represent the uncertainty on each input, evaluating the output of the model and calculating sample statistics. For complex computationally expensive algorithms, it is often not possible to obtain a large enough sample for a meaningful uncertainty analysis. This thesis presents an alternative general theory for uncertainty analysis, based on the use of stochastic process models, in a Bayesian context. The measures provided by the Monte Carlo analysis are obtained, plus additional, more informative measures, using a far smaller sample. The theory is initially developed in a general form and then specifically for algorithms with inputs whose uncertainty can be characterised by independent normal distributions. The Monte Carlo and Bayesian methodologies are then compared using two practical examples. The first example is based on a simple model developed to calculate doses due to radioactive iodine. This model has two normally distributed uncertain parameters, and due to its simplicity an independent measurement of the true uncertainty on the output is available for comparison. This exercise appears to show that the Bayesian methodology is superior in this simple case. The purpose of the second example is to determine whether the methodology is practical in a 'real-life' situation and to compare it with a Monte Carlo analysis. A model for calculating doses due to plutonium contamination is used. This model is computationally expensive and has fourteen uncertain inputs. The Bayesian analysis compared favourably with the Monte Carlo analysis, indicating that it has the potential to provide more accurate uncertainty analyses for the parameters of computationally expensive algorithms.

28 citations
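
The stochastic process model underlying this approach is, in essence, a Gaussian process emulator: condition it on a small number of expensive model runs, then propagate the input uncertainty through the cheap surrogate. A minimal numpy sketch follows, with a toy stand-in for the dose algorithm, independent standard-normal inputs, and a fixed kernel length-scale (a real analysis would estimate the kernel parameters):

```python
import numpy as np

rng = np.random.default_rng(2)

def expensive_model(x):
    """Toy stand-in for the costly dose algorithm (purely hypothetical)."""
    return np.sin(3*x[..., 0]) + 0.5*x[..., 1]**2

def rbf(a, b, length=1.0):
    """Squared-exponential covariance between two sets of input points."""
    d2 = np.sum((a[:, None, :] - b[None, :, :])**2, axis=-1)
    return np.exp(-0.5*d2/length**2)

# Step 1: a small number of expensive model runs at design points.
X = rng.normal(size=(15, 2))
y = expensive_model(X)

# Step 2: condition the Gaussian process on those runs.
K = rbf(X, X) + 1e-8*np.eye(len(X))    # jitter for numerical stability
alpha = np.linalg.solve(K, y)

def emulate(Xs):
    return rbf(Xs, X) @ alpha          # posterior mean of the emulator

# Step 3: propagate the independent-normal input uncertainty through the
# cheap emulator instead of the expensive model.
Xmc = rng.normal(size=(100_000, 2))
out = emulate(Xmc)
print("output mean ~", out.mean(), " output variance ~", out.var())
```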


Journal ArticleDOI
Merrilee Hurn
TL;DR: The results suggest that adapting the auxiliary variables to the specific application is beneficial, however the form of adaptation needed and the extent of the resulting benefits are not always clear-cut.
Abstract: Markov chain Monte Carlo (MCMC) methods are now widely used in a diverse range of application areas to tackle previously intractable problems. Difficult questions remain, however, in designing MCMC samplers for problems exhibiting severe multimodality, where standard methods may exhibit prohibitively slow movement around the state space. Auxiliary variable methods, sometimes together with multigrid ideas, have been proposed as one possible way forward. Initial disappointing experiments have led to data-driven modifications of the methods. In this paper, these suggestions are investigated for lattice data such as is found in imaging and some spatial applications. The results suggest that adapting the auxiliary variables to the specific application is beneficial. However, the form of adaptation needed and the extent of the resulting benefits are not always clear-cut.

21 citations
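
The canonical auxiliary variable method for lattice models is Swendsen-Wang cluster updating: open bonds between equal neighbouring spins, then flip whole clusters at once, sidestepping the slow single-site movement between modes. The sketch below is the generic textbook version for a plain Ising model, not the data-driven adaptations investigated in the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

def swendsen_wang_sweep(spins, beta):
    """One Swendsen-Wang update for a 2-D Ising model with coupling J = 1."""
    n, m = spins.shape
    parent = np.arange(n*m)               # union-find forest over lattice sites

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    p = 1 - np.exp(-2*beta)               # bond-opening probability for equal spins
    for i in range(n):
        for j in range(m):
            for di, dj in ((1, 0), (0, 1)):          # right and down neighbours
                i2, j2 = (i + di) % n, (j + dj) % m  # periodic boundary
                if spins[i, j] == spins[i2, j2] and rng.random() < p:
                    parent[find(i*m + j)] = find(i2*m + j2)  # auxiliary bond = open

    # Flip each bond cluster as a block with probability 1/2.
    roots = np.array([find(k) for k in range(n*m)])
    flip = rng.random(n*m) < 0.5          # one fair coin per cluster root
    s = spins.ravel().copy()
    s[flip[roots]] *= -1
    return s.reshape(n, m)

spins = rng.choice([-1, 1], size=(32, 32))
for _ in range(100):
    spins = swendsen_wang_sweep(spins, beta=0.44)   # near the critical coupling
```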


Journal ArticleDOI
TL;DR: Versions of the Gibbs sampler are derived for the analysis of data from the hidden Markov mesh random fields sometimes used in image analysis, providing a numerical approach to the otherwise intractable Bayesian analysis of these problems.
Abstract: Versions of the Gibbs sampler are derived for the analysis of data from the hidden Markov mesh random fields sometimes used in image analysis. This provides a numerical approach to the otherwise intractable Bayesian analysis of these problems. Detailed formulation is provided for particular examples based on Devijver's Markov mesh model (1988), and the BUGS package is used to do the computations. Theoretical aspects are discussed and a numerical study, based on image analysis, is reported.

17 citations
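
To illustrate the flavour of such samplers, here is a single-site Gibbs sweep for a hidden binary field observed through Gaussian noise. Note the simplifications: this uses a symmetric-neighbourhood Ising prior as a stand-in for Devijver's causal Markov mesh model, and hand-coded full conditionals rather than the BUGS package used in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def gibbs_sweep(x, y, beta=0.8, sigma=1.0):
    """One Gibbs sweep over a hidden binary field x in {-1,+1}, observed as y = x + noise."""
    n, m = x.shape
    for i in range(n):
        for j in range(m):
            nb = x[(i-1) % n, j] + x[(i+1) % n, j] + x[i, (j-1) % m] + x[i, (j+1) % m]
            # Full conditional combines the Ising prior with the Gaussian likelihood:
            # log p(+1)/p(-1) = 2*beta*nb + 2*y/sigma^2.
            log_odds = 2*beta*nb + 2*y[i, j]/sigma**2
            x[i, j] = 1 if rng.random() < 1/(1 + np.exp(-log_odds)) else -1
    return x

# Noisy observation of a hypothetical binary image.
truth = np.where(np.add.outer(np.arange(24), np.arange(24)) % 24 < 12, 1, -1)
y = truth + rng.normal(0, 1.0, truth.shape)

x = np.where(y > 0, 1, -1)            # initialise at the thresholded data
for _ in range(50):
    x = gibbs_sweep(x, y)             # successive draws approach the posterior p(x | y)
```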


Journal ArticleDOI
TL;DR: In this paper, the authors describe the application of pseudo-sequential Monte Carlo simulation to the composite generation-transmission reliability evaluation of the Brazilian South-Southeast system using chronological load modeling.

14 citations


Journal ArticleDOI
01 Jul 1997
TL;DR: This article explains how to reformulate the basic Metropolis algorithm so as to avoid the do-nothing steps and reduce the running time, while also keeping track of the simulated time as determined by the Metropolis algorithm.
Abstract: N. Metropolis's (1953) algorithm has often been used for simulating physical systems that pass among a set of states, with the probabilities of the system being in such states distributed like the Boltzmann function. There are thousands of applications in the physical sciences and elsewhere. In this article, we explain how to reformulate the basic Metropolis algorithm so as to avoid the do-nothing steps and reduce the running time, while also keeping track of the simulated time as determined by the Metropolis algorithm. By the simulated time, we mean the number of Monte Carlo steps that would have been taken if the basic Metropolis algorithm had been used. This approach has already proved successful when used for parallel simulations of molecular beam epitaxy. We show an example.

9 citations
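
The reformulation amounts to two steps per event: sample how many do-nothing steps the basic algorithm would have taken (a geometric waiting time), then choose a move with probability proportional to its acceptance rate. A minimal sketch on a 1-D Ising chain, a toy stand-in for the molecular-beam-epitaxy systems the article mentions:

```python
import numpy as np

rng = np.random.default_rng(5)

n, beta = 100, 1.0
spins = rng.choice([-1, 1], size=n)

def accept_probs(spins):
    """Metropolis acceptance probability of flipping each spin (J = 1 chain)."""
    dE = 2*spins*(np.roll(spins, 1) + np.roll(spins, -1))
    return np.minimum(1.0, np.exp(-beta*dE))

sim_time = 0
for _ in range(10_000):
    p = accept_probs(spins)
    total = p.sum()
    # Number of basic Metropolis steps that would have elapsed before some
    # flip is accepted: geometric with success probability total/n.
    sim_time += rng.geometric(total/n)
    k = rng.choice(n, p=p/total)      # pick the move, weighted by acceptance rate
    spins[k] *= -1                    # every iteration performs a real flip
print("simulated Metropolis steps:", sim_time)
```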


Proceedings ArticleDOI
02 Nov 1997
TL;DR: Stochastic algorithms are proposed to perform statistical estimation for filtered point processes in a Bayesian framework; they rely on Markov chain Monte Carlo methods, which are powerful stochastic simulation methods.
Abstract: Filtered point processes model a wide range of physical phenomena. In practice, usually only noisy observations are available. From these data, one would like to estimate the parameters of the filtered point process. This is a complex problem which in general does not admit any closed-form solution. In this paper, we propose stochastic algorithms to perform statistical estimation for such processes in a Bayesian framework. These algorithms rely on Markov chain Monte Carlo methods, which are powerful stochastic simulation methods.

6 citations
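
As a concrete, heavily simplified instance, the sketch below runs a random-walk Metropolis chain over the pulse decay parameter of a shot-noise (filtered Poisson) process observed in Gaussian noise. It conditions on known arrival times to stay short; unknown arrival times, the situation the paper targets, need the fuller MCMC machinery proposed there. All model choices here (exponential pulses, flat prior, noise level) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulate a filtered Poisson (shot-noise) process observed in Gaussian noise.
T, dt, rate, tau_true = 50.0, 0.1, 0.5, 2.0
events = np.sort(rng.uniform(0, T, rng.poisson(rate*T)))   # arrival times
t = np.arange(0, T, dt)

def signal(tau):
    """Superposition of exponential pulses, one per event arrival."""
    lags = t[:, None] - events[None, :]
    return np.sum(np.exp(-np.clip(lags, 0, None)/tau) * (lags >= 0), axis=1)

y = signal(tau_true) + rng.normal(0, 0.3, t.shape)

def log_post(tau):
    if tau <= 0:
        return -np.inf                  # flat prior on tau > 0
    return -0.5*np.sum((y - signal(tau))**2)/0.3**2

# Random-walk Metropolis over the pulse decay parameter.
tau, lp = 1.0, log_post(1.0)
draws = []
for _ in range(5000):
    prop = tau + rng.normal(0, 0.2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        tau, lp = prop, lp_prop
    draws.append(tau)
print("posterior mean of tau ~", np.mean(draws[1000:]))
```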


Journal ArticleDOI
TL;DR: Simulations suggest that a current implementation of Markov chain Monte Carlo (MCMC) methods offers modest advantages over a current approximation method in the context of the mixed model, but that distinguishing polygenic from oligogenic effects may require substantially more complex models, for which MCMC methods are needed to obtain a practical solution.
Abstract: Model selection and parameter estimation are an integral part of genetic analyses leading up to gene identification. However, exact computation of likelihoods for complex models on large pedigrees is not possible. Markov chain Monte Carlo (MCMC) methods provide a computationally feasible way of estimating these likelihoods and associated parameters. The practical utility of these methods depends on their performance relative both to existing approximation methods and to absolute measures. Simulations suggest that a current implementation of MCMC methods offers modest advantages over a current approximation method in the context of the mixed model, but that distinguishing polygenic from oligogenic effects may require substantially more complex models, for which MCMC methods are needed to obtain a practical solution.

Journal ArticleDOI
TL;DR: A grid-based prior on unimodal distribution functions is described that enables implementation of Monte Carlo methods providing a full Bayesian solution.
Abstract: In this paper, we set out to estimate unimodal distribution functions from a Bayesian perspective. A grid-based prior on such distributions is described that enables implementation of Monte Carlo methods providing a full Bayesian solution.