Author

Pierre Del Moral

Bio: Pierre Del Moral is an academic researcher at the French Institute for Research in Computer Science and Automation. He has contributed to research topics including particle filters and Markov chain Monte Carlo, has an h-index of 34, and has co-authored 215 publications receiving 8,023 citations. His previous affiliations include the Centre national de la recherche scientifique and the University of Nice Sophia Antipolis.


Papers
Journal ArticleDOI
TL;DR: In this paper, the authors propose a methodology to sample sequentially from a sequence of probability distributions that are defined on a common space, each distribution being known up to a normalizing constant.
Abstract: We propose a methodology to sample sequentially from a sequence of probability distributions that are defined on a common space, each distribution being known up to a normalizing constant. These probability distributions are approximated by a cloud of weighted random samples which are propagated over time by using sequential Monte Carlo methods. This methodology allows us to derive simple algorithms to make parallel Markov chain Monte Carlo algorithms interact to perform global optimization and sequential Bayesian estimation and to compute ratios of normalizing constants. We illustrate these algorithms for various integration tasks arising in the context of Bayesian inference.
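
A minimal sketch, in Python, of the kind of SMC sampler described above: a cloud of weighted particles is pushed through a geometric tempering path between a broad Gaussian and a bimodal target, with resampling and one Metropolis move per step, while a normalizing-constant ratio is accumulated. The target, tempering schedule, and tuning constants are illustrative assumptions, not taken from the paper.

```python
# Sketch of a sequential Monte Carlo sampler with geometric tempering.
# All densities and tuning constants below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized log-density of the final target: a 1-D Gaussian mixture.
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def log_prior(x):
    # Initial distribution: a broad zero-mean Gaussian (log-density up to a constant).
    return -0.5 * (x / 5.0) ** 2

def smc_sampler(n_particles=2000, betas=np.linspace(0.0, 1.0, 21), step=1.0):
    x = rng.normal(0.0, 5.0, size=n_particles)   # draw from the initial distribution
    logw = np.zeros(n_particles)                 # log importance weights
    log_Z = 0.0                                  # running log normalizing-constant ratio
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Incremental weight: ratio of successive tempered targets at the current particles.
        logw += (b - b_prev) * (log_target(x) - log_prior(x))
        # Accumulate the normalizing-constant estimate, then resample to equal weights.
        log_Z += np.log(np.mean(np.exp(logw - logw.max()))) + logw.max()
        w = np.exp(logw - logw.max()); w /= w.sum()
        x = rng.choice(x, size=n_particles, p=w)
        logw[:] = 0.0
        # One random-walk Metropolis move per particle, invariant for the current tempered target.
        prop = x + step * rng.normal(size=n_particles)
        log_acc = (b * log_target(prop) + (1 - b) * log_prior(prop)
                   - b * log_target(x) - (1 - b) * log_prior(x))
        accept = np.log(rng.uniform(size=n_particles)) < log_acc
        x = np.where(accept, prop, x)
    return x, log_Z

samples, log_Z = smc_sampler()
print("estimated log normalizing-constant ratio:", round(log_Z, 3))
```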

1,684 citations

Book
01 Jan 2004
TL;DR: In this book, the origins of Feynman-Kac and particle models are discussed, an overview of the evolution of these models is given, and some of their main properties are examined.
Abstract: 1 Introduction - 1.1 On the Origins of Feynman-Kac and Particle Models - 1.2 Notation and Conventions - 1.3 Feynman-Kac Path Models - 1.3.1 Path-Space and Marginal Models - 1.3.2 Nonlinear Equations - 1.4 Motivating Examples - 1.4.1 Engineering Science - 1.4.2 Bayesian Methodology - 1.4.3 Particle and Statistical Physics - 1.4.4 Biology - 1.4.5 Applied Probability and Statistics - 1.5 Interacting Particle Systems - 1.5.1 Discrete Time Models - 1.5.2 Continuous Time Models - 1.6 Sequential Monte Carlo Methodology - 1.7 Particle Interpretations - 1.8 A Contents Guide for the Reader
2 Feynman-Kac Formulae - 2.1 Introduction - 2.2 An Introduction to Markov Chains - 2.2.1 Canonical Probability Spaces - 2.2.2 Path-Space Markov Models - 2.2.3 Stopped Markov Chains - 2.2.4 Examples - 2.3 Description of the Models - 2.4 Structural Stability Properties - 2.4.1 Path Space and Marginal Models - 2.4.2 Change of Reference Probability Measures - 2.4.3 Updated and Prediction Flow Models - 2.5 Distribution Flows Models - 2.5.1 Killing Interpretation - 2.5.2 Interacting Process Interpretation - 2.5.3 McKean Models - 2.5.4 Kalman-Bucy Filters - 2.6 Feynman-Kac Models in Random Media - 2.6.1 Quenched and Annealed Feynman-Kac Flows - 2.6.2 Feynman-Kac Models in Distribution Space - 2.7 Feynman-Kac Semigroups - 2.7.1 Prediction Semigroups - 2.7.2 Updated Semigroups
3 Genealogical and Interacting Particle Models - 3.1 Introduction - 3.2 Interacting Particle Interpretations - 3.3 Particle Models with Degenerate Potential - 3.4 Historical and Genealogical Tree Models - 3.4.1 Introduction - 3.4.2 A Rigorous Approach and Related Transport Problems - 3.4.3 Complete Genealogical Tree Models - 3.5 Particle Approximation Measures - 3.5.1 Some Convergence Results - 3.5.2 Regularity Conditions
4 Stability of Feynman-Kac Semigroups - 4.1 Introduction - 4.2 Contraction Properties of Markov Kernels - 4.2.1 h-relative Entropy - 4.2.2 Lipschitz Contractions - 4.3 Contraction Properties of Feynman-Kac Semigroups - 4.3.1 Functional Entropy Inequalities - 4.3.2 Contraction Coefficients - 4.3.3 Strong Contraction Estimates - 4.3.4 Weak Regularity Properties - 4.4 Updated Feynman-Kac Models - 4.5 A Class of Stochastic Semigroups
5 Invariant Measures and Related Topics - 5.1 Introduction - 5.2 Existence and Uniqueness - 5.3 Invariant Measures and Feynman-Kac Modeling - 5.4 Feynman-Kac and Metropolis-Hastings Models - 5.5 Feynman-Kac-Metropolis Models - 5.5.1 Introduction - 5.5.2 The Genealogical Metropolis Particle Model - 5.5.3 Path Space Models and Restricted Markov Chains - 5.5.4 Stability Properties
6 Annealing Properties - 6.1 Introduction - 6.2 Feynman-Kac-Metropolis Models - 6.2.1 Description of the Model - 6.2.2 Regularity Properties - 6.2.3 Asymptotic Behavior - 6.3 Feynman-Kac Trapping Models - 6.3.1 Description of the Model - 6.3.2 Regularity Properties - 6.3.3 Asymptotic Behavior - 6.3.4 Large-Deviation Analysis - 6.3.5 Concentration Levels
7 Asymptotic Behavior - 7.1 Introduction - 7.2 Some Preliminaries - 7.2.1 McKean Interpretations - 7.2.2 Vanishing Potentials - 7.3 Inequalities for Independent Random Variables - 7.3.1 Lp and Exponential Inequalities - 7.3.2 Empirical Processes - 7.4 Strong Law of Large Numbers - 7.4.1 Extinction Probabilities - 7.4.2 Convergence of Empirical Processes - 7.4.3 Time-Uniform Estimates
8 Propagation of Chaos - 8.1 Introduction - 8.2 Some Preliminaries - 8.3 Outline of Results - 8.4 Weak Propagation of Chaos - 8.5 Relative Entropy Estimates - 8.6 A Combinatorial Transport Equation - 8.7 Asymptotic Properties of Boltzmann-Gibbs Distributions - 8.8 Feynman-Kac Semigroups - 8.8.1 Marginal Models - 8.8.2 Path-Space Models - 8.9 Total Variation Estimates
9 Central Limit Theorems - 9.1 Introduction - 9.2 Some Preliminaries - 9.3 Some Local Fluctuation Results - 9.4 Particle Density Profiles - 9.4.1 Unnormalized Measures - 9.4.2 Normalized Measures - 9.4.3 Killing Interpretations and Related Comparisons - 9.5 A Berry-Esseen Type Theorem - 9.6 A Donsker Type Theorem - 9.7 Path-Space Models - 9.8 Covariance Functions
10 Large-Deviation Principles - 10.1 Introduction - 10.2 Some Preliminary Results - 10.2.1 Topological Properties - 10.2.2 Idempotent Analysis - 10.2.3 Some Regularity Properties - 10.3 Cramér's Method - 10.4 Laplace-Varadhan's Integral Techniques - 10.5 Dawson-Gärtner Projective Limits Techniques - 10.6 Sanov's Theorem - 10.6.1 Introduction - 10.6.2 Topological Preliminaries - 10.6.3 Sanov's Theorem in the τ-Topology - 10.7 Path-Space and Interacting Particle Models - 10.7.1 Proof of Theorem 10.1.1 - 10.7.2 Sufficient Conditions - 10.8 Particle Density Profile Models - 10.8.1 Introduction - 10.8.2 Strong Large-Deviation Principles
11 Feynman-Kac and Interacting Particle Recipes - 11.1 Introduction - 11.2 Interacting Metropolis Models - 11.2.1 Introduction - 11.2.2 Feynman-Kac-Metropolis and Particle Models - 11.2.3 Interacting Metropolis and Gibbs Samplers - 11.3 An Overview of Some General Principles - 11.4 Descendant and Ancestral Genealogies - 11.5 Conditional Explorations - 11.6 State-Space Enlargements and Path-Particle Models - 11.7 Conditional Excursion Particle Models - 11.8 Branching Selection Variants - 11.8.1 Introduction - 11.8.2 Description of the Models - 11.8.3 Some Branching Selection Rules - 11.8.4 Some L2-mean Error Estimates - 11.8.5 Long Time Behavior - 11.8.6 Conditional Branching Models - 11.9 Exercises
12 Applications - 12.1 Introduction - 12.2 Random Excursion Models - 12.2.1 Introduction - 12.2.2 Dirichlet Problems with Boundary Conditions - 12.2.3 Multilevel Feynman-Kac Formulae - 12.2.4 Dirichlet Problems with Hard Boundary Conditions - 12.2.5 Rare Event Analysis - 12.2.6 Asymptotic Particle Analysis of Rare Events - 12.2.7 Fluctuation Results and Some Comparisons - 12.2.8 Exercises - 12.3 Change of Reference Measures - 12.3.1 Introduction - 12.3.2 Importance Sampling - 12.3.3 Sequential Analysis of Probability Ratio Tests - 12.3.4 A Multisplitting Particle Approach - 12.3.5 Exercises - 12.4 Spectral Analysis of Feynman-Kac-Schrödinger Semigroups - 12.4.1 Lyapunov Exponents and Spectral Radii - 12.4.2 Feynman-Kac Asymptotic Models - 12.4.3 Particle Lyapunov Exponents - 12.4.4 Hard, Soft and Repulsive Obstacles - 12.4.5 Related Spectral Quantities - 12.4.6 Exercises - 12.5 Directed Polymers Simulation - 12.5.1 Feynman-Kac and Boltzmann-Gibbs Models - 12.5.2 Evolutionary Particle Simulation Methods - 12.5.3 Repulsive Interaction and Self-Avoiding Markov Chains - 12.5.4 Attractive Interaction and Reinforced Markov Chains - 12.5.5 Particle Polymerization Techniques - 12.5.6 Exercises - 12.6 Filtering/Smoothing and Path Estimation - 12.6.1 Introduction - 12.6.2 Motivating Examples - 12.6.3 Feynman-Kac Representations - 12.6.4 Stability Properties of the Filtering Equations - 12.6.5 Asymptotic Properties of Log-likelihood Functions - 12.6.6 Particle Approximation Measures - 12.6.7 A Partially Linear/Gaussian Filtering Model - 12.6.8 Exercises - References

1,079 citations

Journal ArticleDOI
TL;DR: An adaptive SMC algorithm is proposed which admits a computational complexity that is linear in the number of samples and adaptively determines the simulation parameters.
Abstract: Approximate Bayesian computation (ABC) is a popular approach to address inference problems where the likelihood function is intractable, or expensive to calculate. To improve over Markov chain Monte Carlo (MCMC) implementations of ABC, the use of sequential Monte Carlo (SMC) methods has recently been suggested. Most effective SMC algorithms that are currently available for ABC have a computational complexity that is quadratic in the number of Monte Carlo samples (Beaumont et al., Biometrika 86:983-990, 2009; Peters et al., Technical report, 2008; Toni et al., J Roy Soc Interface 6:187-202, 2009) and require the careful choice of simulation parameters. In this article an adaptive SMC algorithm is proposed which admits a computational complexity that is linear in the number of samples and adaptively determines the simulation parameters. We demonstrate our algorithm on a toy example and on a birth-death-mutation model arising in epidemiology.
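
A simplified, non-adaptive sketch of SMC for ABC on a toy Gaussian-mean problem: particles are filtered through a fixed schedule of shrinking tolerances, with resampling and one ABC-MCMC move per stage. The tolerance schedule, prior, forward model, and move kernel are illustrative assumptions; the paper's contribution is, in addition, to determine such simulation parameters adaptively.

```python
# Sketch of a non-adaptive SMC-ABC scheme with a uniform ABC kernel.
# The toy model, prior, and fixed tolerance schedule are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
y_obs = 1.5                                    # observed summary statistic (a sample mean)

def simulate(theta):
    # Forward model: mean of 20 noisy observations centered at theta.
    return rng.normal(theta, 1.0, size=20).mean()

def abc_smc(n=1000, tolerances=(2.0, 1.0, 0.5, 0.25, 0.1), step=0.5):
    theta = rng.normal(0.0, 3.0, size=n)        # prior draws, N(0, 3^2)
    x = np.array([simulate(t) for t in theta])  # pseudo-data attached to each particle
    for eps in tolerances:
        # Reweight with a uniform ABC kernel and resample the surviving particles.
        alive = np.abs(x - y_obs) < eps
        if not alive.any():
            raise RuntimeError("tolerance too tight for this particle set")
        idx = rng.choice(np.flatnonzero(alive), size=n)
        theta, x = theta[idx], x[idx]
        # One ABC-MCMC move per particle: accept if the prior ratio passes a
        # Metropolis test AND the fresh pseudo-data lands within eps of y_obs.
        prop = theta + step * rng.normal(size=n)
        x_prop = np.array([simulate(t) for t in prop])
        log_prior_ratio = -0.5 * (prop / 3.0) ** 2 + 0.5 * (theta / 3.0) ** 2
        accept = (np.log(rng.uniform(size=n)) < log_prior_ratio) & (np.abs(x_prop - y_obs) < eps)
        theta = np.where(accept, prop, theta)
        x = np.where(accept, x_prop, x)
    return theta

posterior = abc_smc()
print("ABC posterior mean ~", round(posterior.mean(), 2))
```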

530 citations

Book
30 Mar 2004

350 citations

Book ChapterDOI
01 Jan 2000
TL;DR: In this article, the authors focus on interacting particle systems methods for solving numerically a class of Feynman-Kac formulae arising in the study of certain parabolic differential equations, physics, biology, evolutionary computing, nonlinear filtering and elsewhere.
Abstract: This paper focuses on interacting particle systems methods for solving numerically a class of Feynman-Kac formulae arising in the study of certain parabolic differential equations, physics, biology, evolutionary computing, nonlinear filtering and elsewhere. We have tried to give an “exposé” of the mathematical theory that is useful for analyzing the convergence of such genetic-type and particle approximating models including law of large numbers, large deviations principles, fluctuations and empirical process theory as well as semigroup techniques and limit theorems for processes.
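
A minimal sketch of the genetic-type (mutation/selection) particle approximation behind such Feynman-Kac formulae: particles evolve under a free Markov motion, are weighted by a non-negative potential, and are resampled in proportion to it, yielding an estimate of the unnormalized quantity E[prod_p G(X_p)]. The Gaussian random-walk dynamics, the soft obstacle potential, and the settings below are illustrative assumptions.

```python
# Sketch of a mutation/selection particle approximation of a discrete-time
# Feynman-Kac formula. Dynamics, potential, and sizes are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)

def mutate(x):
    # Mutation step: free Markov motion, here a Gaussian random walk.
    return x + rng.normal(0.0, 1.0, size=x.shape)

def potential(x):
    # Non-negative potential G(x); small far from the origin, so trajectories
    # that wander off are progressively discounted.
    return np.exp(-0.5 * (x / 2.0) ** 2)

def feynman_kac_particles(n=5000, horizon=20):
    x = np.zeros(n)             # all particles start at the origin
    log_gamma = 0.0             # log of the unnormalized measure gamma_n(1)
    for _ in range(horizon):
        x = mutate(x)
        g = potential(x)
        # The product of empirical mean potentials estimates E[prod_p G(X_p)].
        log_gamma += np.log(g.mean())
        # Selection: resample particles proportionally to their potential.
        x = rng.choice(x, size=n, p=g / g.sum())
    return x, log_gamma

x_final, log_gamma = feynman_kac_particles()
print("particle estimate of log E[prod_p G(X_p)]:", round(log_gamma, 3))
```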

327 citations


Cited by
BookDOI
01 Jan 2001
TL;DR: This book presents the first comprehensive treatment of sequential Monte Carlo (particle filtering) techniques, including convergence results and applications to tracking, guidance, automated target recognition, aircraft navigation, robot navigation, econometrics, financial modeling, neural networks, optimal control, optimal filtering, communications, reinforcement learning, signal enhancement, and model averaging and selection.
Abstract: Monte Carlo methods are revolutionizing the on-line analysis of data in fields as diverse as financial modeling, target tracking and computer vision. These methods, appearing under the names of bootstrap filters, condensation, optimal Monte Carlo filters, particle filters and survival of the fittest, have made it possible to solve numerically many complex, non-standard problems that were previously intractable. This book presents the first comprehensive treatment of these techniques, including convergence results and applications to tracking, guidance, automated target recognition, aircraft navigation, robot navigation, econometrics, financial modeling, neural networks, optimal control, optimal filtering, communications, reinforcement learning, signal enhancement, model averaging and selection, computer vision, semiconductor design, population biology, dynamic Bayesian networks, and time series analysis. This will be of great value to students, researchers and practitioners, who have some basic knowledge of probability. Arnaud Doucet received the Ph. D. degree from the University of Paris-XI Orsay in 1997. From 1998 to 2000, he conducted research at the Signal Processing Group of Cambridge University, UK. He is currently an assistant professor at the Department of Electrical Engineering of Melbourne University, Australia. His research interests include Bayesian statistics, dynamic models and Monte Carlo methods. Nando de Freitas obtained a Ph.D. degree in information engineering from Cambridge University in 1999. He is presently a research associate with the artificial intelligence group of the University of California at Berkeley. His main research interests are in Bayesian statistics and the application of on-line and batch Monte Carlo methods to machine learning. Neil Gordon obtained a Ph.D. in Statistics from Imperial College, University of London in 1993. He is with the Pattern and Information Processing group at the Defence Evaluation and Research Agency in the United Kingdom. His research interests are in time series, statistical data analysis, and pattern recognition with a particular emphasis on target tracking and missile guidance.

6,574 citations

01 Apr 2003
TL;DR: This paper provides a comprehensive presentation and interpretation of the Ensemble Kalman Filter (EnKF), which has a large user group and numerous publications discussing its applications and theoretical aspects, and also presents new ideas and alternative interpretations which further explain the success of the EnKF.
Abstract: The purpose of this paper is to provide a comprehensive presentation and interpretation of the Ensemble Kalman Filter (EnKF) and its numerical implementation. The EnKF has a large user group, and numerous publications have discussed applications and theoretical aspects of it. This paper reviews the important results from these studies and also presents new ideas and alternative interpretations which further explain the success of the EnKF. In addition to providing the theoretical framework needed for using the EnKF, there is also a focus on the algorithmic formulation and optimal numerical implementation. A program listing is given for some of the key subroutines. The paper also touches upon specific issues such as the use of nonlinear measurements, in situ profiles of temperature and salinity, and data which are available with high frequency in time. An ensemble based optimal interpolation (EnOI) scheme is presented as a cost-effective approach which may serve as an alternative to the EnKF in some applications. A fairly extensive discussion is devoted to the use of time correlated model errors and the estimation of model bias.
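
A minimal sketch of the stochastic (perturbed-observation) EnKF analysis step for a linear observation operator: the sample covariance of the forecast ensemble defines a Kalman gain, and each member assimilates an independently perturbed observation. The toy state dimension, observation operator, and noise level are illustrative assumptions, not configurations discussed in the paper.

```python
# Sketch of one stochastic EnKF analysis step for a linear observation operator H.
# Shapes, the toy prior ensemble, and the observation error are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)

def enkf_analysis(ensemble, y_obs, H, obs_std):
    """ensemble: (n_members, n_state); y_obs: (n_obs,); H: (n_obs, n_state)."""
    n_members = ensemble.shape[0]
    # Ensemble mean and anomalies define the sample forecast covariance.
    mean = ensemble.mean(axis=0)
    A = ensemble - mean                                  # (n_members, n_state)
    Pf = A.T @ A / (n_members - 1)                       # sample forecast covariance
    R = (obs_std ** 2) * np.eye(len(y_obs))              # observation-error covariance
    # Kalman gain K = Pf H^T (H Pf H^T + R)^{-1}.
    S = H @ Pf @ H.T + R
    K = Pf @ H.T @ np.linalg.inv(S)
    # Each member assimilates an independently perturbed observation.
    y_pert = y_obs + obs_std * rng.normal(size=(n_members, len(y_obs)))
    innovations = y_pert - ensemble @ H.T
    return ensemble + innovations @ K.T

# Toy usage: 3-dimensional state, 2 observed components, 50 members.
H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
prior = rng.normal(0.0, 2.0, size=(50, 3))
posterior = enkf_analysis(prior, y_obs=np.array([1.0, -0.5]), H=H, obs_std=0.3)
print("prior mean:", prior.mean(axis=0).round(2), "posterior mean:", posterior.mean(axis=0).round(2))
```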

2,975 citations

BookDOI
10 May 2011
TL;DR: A handbook of Markov chain Monte Carlo covering foundations, methodology, and implementation, with application chapters ranging from a multilevel model for functional MRI data to environmental epidemiology, educational research, and fisheries science.
Abstract: Foreword, Stephen P. Brooks, Andrew Gelman, Galin L. Jones, and Xiao-Li Meng; Introduction to MCMC, Charles J. Geyer; A short history of Markov chain Monte Carlo: Subjective recollections from incomplete data, Christian Robert and George Casella; Reversible jump Markov chain Monte Carlo, Yanan Fan and Scott A. Sisson; Optimal proposal distributions and adaptive MCMC, Jeffrey S. Rosenthal; MCMC using Hamiltonian dynamics, Radford M. Neal; Inference and Monitoring Convergence, Andrew Gelman and Kenneth Shirley; Implementing MCMC: Estimating with confidence, James M. Flegal and Galin L. Jones; Perfection within reach: Exact MCMC sampling, Radu V. Craiu and Xiao-Li Meng; Spatial point processes, Mark Huber; The data augmentation algorithm: Theory and methodology, James P. Hobert; Importance sampling, simulated tempering and umbrella sampling, Charles J. Geyer; Likelihood-free Markov chain Monte Carlo, Scott A. Sisson and Yanan Fan; MCMC in the analysis of genetic data on related individuals, Elizabeth Thompson; A Markov chain Monte Carlo based analysis of a multilevel model for functional MRI data, Brian Caffo, DuBois Bowman, Lynn Eberly, and Susan Spear Bassett; Partially collapsed Gibbs sampling & path-adaptive Metropolis-Hastings in high-energy astrophysics, David van Dyk and Taeyoung Park; Posterior exploration for computationally intensive forward models, Dave Higdon, C. Shane Reese, J. David Moulton, Jasper A. Vrugt, and Colin Fox; Statistical ecology, Ruth King; Gaussian random field models for spatial data, Murali Haran; Modeling preference changes via a hidden Markov item response theory model, Jong Hee Park; Parallel Bayesian MCMC imputation for multiple distributed lag models: A case study in environmental epidemiology, Brian Caffo, Roger Peng, Francesca Dominici, Thomas A. Louis, and Scott Zeger; MCMC for state space models, Paul Fearnhead; MCMC in educational research, Roy Levy, Robert J. Mislevy, and John T. Behrens; Applications of MCMC in fisheries science, Russell B. Millar; Model comparison and simulation for hierarchical models: analyzing rural-urban migration in Thailand, Filiz Garip and Bruce Western
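
A minimal sketch of the basic building block surveyed throughout the handbook, a random-walk Metropolis sampler targeting an unnormalized density; the banana-shaped target and the step size are illustrative assumptions.

```python
# Sketch of random-walk Metropolis sampling on an illustrative 2-D target.
import numpy as np

rng = np.random.default_rng(4)

def log_target(xy):
    # Unnormalized log-density of a 2-D "banana"-shaped distribution.
    x, y = xy
    return -0.5 * (x ** 2 / 4.0 + (y - 0.5 * x ** 2) ** 2)

def random_walk_metropolis(n_iter=20000, step=0.8):
    chain = np.empty((n_iter, 2))
    current = np.zeros(2)
    current_logp = log_target(current)
    n_accept = 0
    for i in range(n_iter):
        proposal = current + step * rng.normal(size=2)
        proposal_logp = log_target(proposal)
        # Accept with probability min(1, pi(proposal) / pi(current)).
        if np.log(rng.uniform()) < proposal_logp - current_logp:
            current, current_logp = proposal, proposal_logp
            n_accept += 1
        chain[i] = current
    return chain, n_accept / n_iter

chain, acc_rate = random_walk_metropolis()
print("acceptance rate:", round(acc_rate, 2), "posterior mean:", chain[10000:].mean(axis=0).round(2))
```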

2,415 citations

Journal ArticleDOI
TL;DR: It is shown how efficient high-dimensional proposal distributions can be built by using sequential Monte Carlo methods, which makes it possible not only to improve over standard Markov chain Monte Carlo schemes but also to make Bayesian inference feasible for a large class of statistical models where this was not previously so.
Abstract: Markov chain Monte Carlo and sequential Monte Carlo methods have emerged as the two main tools to sample from high dimensional probability distributions. Although asymptotic convergence of Markov chain Monte Carlo algorithms is ensured under weak assumptions, the performance of these algorithms is unreliable when the proposal distributions that are used to explore the space are poorly chosen and/or if highly correlated variables are updated independently. We show here how it is possible to build efficient high dimensional proposal distributions by using sequential Monte Carlo methods. This allows us not only to improve over standard Markov chain Monte Carlo schemes but also to make Bayesian inference feasible for a large class of statistical models where this was not previously so. We demonstrate these algorithms on a non-linear state space model and a Lévy-driven stochastic volatility model.
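
A minimal sketch of one way SMC proposals can be embedded in MCMC, a particle marginal Metropolis-Hastings scheme for a toy linear-Gaussian state-space model: a bootstrap particle filter supplies a likelihood estimate, which drives a Metropolis-Hastings chain over the state-noise standard deviation under a flat prior. The model, prior, and tuning constants are illustrative assumptions, not the examples used in the paper.

```python
# Sketch of particle marginal Metropolis-Hastings on a toy state-space model.
# Model, flat prior on sigma, and tuning constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)

# Simulate toy data from x_t = 0.9 x_{t-1} + N(0, 0.5^2), y_t = x_t + N(0, 1).
T, true_sigma = 100, 0.5
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + true_sigma * rng.normal()
y = x + rng.normal(size=T)

def log_likelihood_estimate(sigma, n_particles=200):
    # Bootstrap particle filter estimate of log p(y_{1:T} | sigma), up to a constant.
    particles = np.zeros(n_particles)
    loglik = 0.0
    for t in range(T):
        particles = 0.9 * particles + sigma * rng.normal(size=n_particles)
        logw = -0.5 * (y[t] - particles) ** 2          # N(0, 1) observation density, up to a constant
        loglik += np.log(np.mean(np.exp(logw - logw.max()))) + logw.max()
        w = np.exp(logw - logw.max()); w /= w.sum()
        particles = rng.choice(particles, size=n_particles, p=w)
    return loglik

def pmmh(n_iter=500, step=0.1):
    sigma = 1.0
    loglik = log_likelihood_estimate(sigma)
    chain = []
    for _ in range(n_iter):
        prop = abs(sigma + step * rng.normal())        # reflected random walk keeps sigma > 0
        prop_loglik = log_likelihood_estimate(prop)
        if np.log(rng.uniform()) < prop_loglik - loglik:   # flat prior on sigma assumed
            sigma, loglik = prop, prop_loglik
        chain.append(sigma)
    return np.array(chain)

chain = pmmh()
print("posterior mean of sigma ~", round(chain[100:].mean(), 2), "(true value", true_sigma, ")")
```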

1,869 citations

Book Chapter
01 Jan 2008
TL;DR: A complete, up-to-date survey of particle filtering methods as of 2008, including basic and advanced particle methods for filtering as well as smoothing.
Abstract: Optimal estimation problems for non-linear non-Gaussian state-space models do not typically admit analytic solutions. Since their introduction in 1993, particle filtering methods have become a very popular class of algorithms to solve these estimation problems numerically in an online manner, i.e. recursively as observations become available, and are now routinely used in fields as diverse as computer vision, econometrics, robotics and navigation. The objective of this tutorial is to provide a complete, up-to-date survey of this field as of 2008. Basic and advanced particle methods for filtering as well as smoothing are presented.
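
A minimal sketch of the bootstrap particle filter applied to the classic nonlinear benchmark state-space model from the particle filtering literature; the noise levels and particle count are illustrative assumptions.

```python
# Sketch of a bootstrap particle filter on a standard nonlinear benchmark model.
# Noise levels and particle count are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(6)

def transition(x, t):
    return 0.5 * x + 25.0 * x / (1.0 + x ** 2) + 8.0 * np.cos(1.2 * t)

# Simulate a synthetic trajectory and noisy observations y_t = x_t^2 / 20 + noise.
T = 50
x_true = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x_true[t] = transition(x_true[t - 1], t) + np.sqrt(10.0) * rng.normal()
    y[t] = x_true[t] ** 2 / 20.0 + rng.normal()

def bootstrap_filter(y, n_particles=1000):
    particles = np.sqrt(10.0) * rng.normal(size=n_particles)
    filtered_means = np.zeros(len(y))
    for t in range(1, len(y)):
        # Propagate through the state equation (the "mutation" step).
        particles = transition(particles, t) + np.sqrt(10.0) * rng.normal(size=n_particles)
        # Weight by the observation likelihood, then resample (the "selection" step).
        logw = -0.5 * (y[t] - particles ** 2 / 20.0) ** 2
        w = np.exp(logw - logw.max()); w /= w.sum()
        filtered_means[t] = np.sum(w * particles)
        particles = rng.choice(particles, size=n_particles, p=w)
    return filtered_means

means = bootstrap_filter(y)
print("RMSE of filtered mean vs. true state:", round(np.sqrt(np.mean((means[1:] - x_true[1:]) ** 2)), 2))
```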

1,860 citations