Bayesian inverse problems with Gaussian priors
TLDR
The posterior distribution in a nonparametric inverse problem is shown to contract to the true parameter at a rate that depends on the smoothness of the parameter and the smoothness and scale of the prior.

Abstract
The posterior distribution in a nonparametric inverse problem is shown to contract to the true parameter at a rate that depends on the smoothness of the parameter, and the smoothness and scale of the prior. Correct combinations of these characteristics lead to the minimax rate. The frequentist coverage of credible sets is shown to depend on the combination of prior and true parameter, with smoother priors leading to zero coverage and rougher priors to conservative coverage. In the latter case credible sets are of the correct order of magnitude. The results are numerically illustrated by the problem of recovering a function from observation of a noisy version of its primitive.
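The setting in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's exact construction: a diagonal Gaussian sequence model in the SVD coordinates of the forward operator, with hypothetical singular values kappa_i = 1/i, an assumed prior smoothness alpha, and an illustrative "true" parameter of smoothness beta. The conjugate Gaussian posterior is then available coordinatewise in closed form.

```python
import numpy as np

# Sketch under stated assumptions (not the paper's exact setup):
# observations Y_i = kappa_i * theta_i + Z_i / sqrt(n), with
# hypothetical singular values kappa_i = i^{-1} (mildly ill-posed)
# and Gaussian prior theta_i ~ N(0, i^{-1-2*alpha}).

rng = np.random.default_rng(0)
n = 1000          # noise level is 1/sqrt(n)
N = 200           # number of retained SVD coefficients
alpha = 1.0       # assumed smoothness of the prior
beta = 1.5        # assumed smoothness of the "true" parameter

i = np.arange(1, N + 1)
kappa = i ** -1.0                  # decaying singular values of the operator
theta_true = i ** (-0.5 - beta)    # an illustrative beta-smooth truth
lam = i ** (-1.0 - 2 * alpha)      # prior variances

# Simulated observations in SVD coordinates
Y = kappa * theta_true + rng.standard_normal(N) / np.sqrt(n)

# Conjugate Gaussian posterior, computed coordinate by coordinate
post_var = lam / (n * kappa ** 2 * lam + 1)
post_mean = n * kappa * lam * Y / (n * kappa ** 2 * lam + 1)

err = np.sqrt(np.sum((post_mean - theta_true) ** 2))
print(f"L2 error of posterior mean: {err:.4f}")
```

Varying alpha relative to beta in this toy model mimics the phenomenon the abstract describes: matching the prior smoothness to the truth speeds up contraction, while a mismatch slows it down or distorts credible-set coverage.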
Citations
Book Chapter
The Bayesian Approach to Inverse Problems
Masoumeh Dashti, Andrew M. Stuart +1 more
TL;DR: In this paper, the authors highlight the mathematical and computational structure relating to the formulation of, and development of algorithms for, the Bayesian approach to inverse problems in differential equations, and describe measure-preserving dynamics on the underlying infinite dimensional space.
Journal Article
Solving inverse problems using data-driven models
TL;DR: This survey paper aims to give an account of some of the main contributions in data-driven inverse problems.
Journal Article
MAP estimators and their consistency in Bayesian nonparametric inverse problems
TL;DR: In this paper, a Bayesian approach is adopted to the inverse problem of estimating an unknown function u from noisy measurements y of a known, possibly nonlinear, map applied to u. The prior measure is specified as a Gaussian random field μ0.
Journal Article
Frequentist Consistency of Variational Bayes
Yixin Wang, David M. Blei +1 more
TL;DR: It is proved that the VB posterior converges to the Kullback–Leibler (KL) minimizer of a normal distribution centered at the truth, and that the corresponding variational expectation of the parameter is consistent and asymptotically normal.
Journal Article
Importance Sampling: Intrinsic Dimension and Computational Cost
TL;DR: In this article, the authors present a general theory of importance sampling in Bayesian inverse problems and filtering, relating its computational cost to a notion of intrinsic dimension.
References
Journal Article
Inverse problems: A Bayesian perspective
TL;DR: The Bayesian approach to regularization is reviewed, developing a function space viewpoint on the subject, which allows for a full characterization of all possible solutions, and their relative probabilities, whilst simultaneously forcing significant modelling issues to be addressed in a clear and precise fashion.