Arnaud Doucet
Researcher at University of Oxford
Publications - 431
Citations - 46995
Arnaud Doucet is an academic researcher at the University of Oxford. He has contributed to research topics including particle filters and Markov chain Monte Carlo, has an h-index of 75, and has co-authored 386 publications receiving 43,388 citations. His previous affiliations include the University of British Columbia and the École nationale supérieure de l'électronique et de ses applications.
Papers
Proceedings ArticleDOI
Marginal MAP estimation using Markov chain Monte Carlo
TL;DR: Presents a simple and novel MCMC strategy, state augmentation for marginal estimation (SAME), that allows marginal maximum a posteriori (MMAP) estimates to be obtained for Bayesian models.
Proceedings ArticleDOI
A Distributed Recursive Maximum Likelihood Implementation for Sensor Registration
TL;DR: Describes how a completely decentralised version of recursive maximum likelihood (RML) can be implemented in dynamic graphical models by propagating suitable messages between neighbouring nodes of the graph, solving the sensor registration and localisation problem for sensor networks.
Proceedings ArticleDOI
Particle filtering for multi-target tracking using jump Markov systems
TL;DR: The proposed method applies particle filtering techniques to a jump Markov system that models the multi-target dynamics; simulation results using this particle method are presented.
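The paper's jump-Markov multi-target method is only summarised above; as a minimal illustration of the underlying sequential Monte Carlo machinery, here is a sketch of a bootstrap particle filter for a hypothetical 1-D random-walk state observed in Gaussian noise (all model parameters are assumptions for the toy example, not taken from the paper):

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=500, q=0.1, r=0.5, seed=0):
    """Bootstrap particle filter for the toy model
        x_t = x_{t-1} + N(0, q^2),   y_t = x_t + N(0, r^2).
    Returns the filtering-mean estimate of the state at each time step."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)  # initial particle cloud
    estimates = []
    for y in observations:
        # propagate particles through the transition model
        particles = particles + rng.normal(0.0, q, n_particles)
        # weight particles by the observation likelihood (log-space for stability)
        log_w = -0.5 * ((y - particles) / r) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        estimates.append(float(np.sum(w * particles)))
        # multinomial resampling to counter weight degeneracy
        particles = rng.choice(particles, size=n_particles, p=w)
    return estimates

# toy demo: track a slowly drifting latent state from noisy measurements
rng = np.random.default_rng(1)
x_true = np.cumsum(rng.normal(0.0, 0.1, 50))   # latent random walk
obs = x_true + rng.normal(0.0, 0.5, 50)        # noisy observations
estimates = bootstrap_particle_filter(obs)
```

Resampling at every step is the simplest scheme; practical implementations often resample only when the effective sample size drops below a threshold.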
A New Class of Soft MIMO
TL;DR: A new class of soft-input soft-output demodulation schemes for multiple-input multiple-output (MIMO) channels, based on the sequential Monte Carlo (SMC) framework under both stochastic and deterministic settings, offering error performance comparable to that of the sphere decoding algorithm without an attendant increase in complexity.
Journal ArticleDOI
From Denoising Diffusions to Denoising Markov Models
TL;DR: Denoising diffusions are state-of-the-art generative models exhibiting remarkable empirical performance; they work by diffusing the data distribution into a Gaussian distribution and then learning to reverse this noising process to obtain synthetic data points.
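The forward noising process mentioned in the summary can be sampled in closed form. The sketch below is a generic illustration of that standard Gaussian diffusion (the linear beta schedule and toy data are assumptions for the demo, not details from the paper, which generalises diffusions to other Markov models):

```python
import numpy as np

def forward_noising(x0, t, betas, rng):
    """Sample x_t from the forward (noising) process q(x_t | x_0) in closed form:
        x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps,
    where alpha_bar_t is the cumulative product of (1 - beta_s) up to step t."""
    alpha_bar = np.cumprod(1.0 - betas)[t]
    eps = rng.normal(size=x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps

rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, 1000)    # assumed linear noise schedule
x0 = rng.normal(3.0, 0.1, size=10_000)   # toy "data" distribution
xT = forward_noising(x0, 999, betas, rng)
# after the full schedule, the samples are close to a standard Gaussian,
# which is what the learned reverse process starts from
```

Generation then amounts to running a learned reversal of this process from Gaussian noise back to data; that denoising network is the part the paper generalises and is not sketched here.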