
Showing papers by "Arnaud Guyader published in 2015"


Journal ArticleDOI
TL;DR: In this article, the authors analyze approximate Bayesian computations from the point of view of k-nearest neighbor theory and explore the statistical properties of its outputs, in particular some asymptotic features of the genuine conditional density estimate associated with ABC, which is an interesting hybrid between a kNN and a kernel method.
Abstract: Approximate Bayesian Computation (ABC for short) is a family of computational techniques which offer an almost automated solution in situations where evaluation of the posterior likelihood is computationally prohibitive, or whenever suitable likelihoods are not available. In the present paper, we analyze the procedure from the point of view of k-nearest neighbor theory and explore the statistical properties of its outputs. We discuss in particular some asymptotic features of the genuine conditional density estimate associated with ABC, which is an interesting hybrid between a k-nearest neighbor and a kernel method.
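The k-nearest neighbor view of ABC described above can be illustrated with a minimal rejection sketch: simulate parameter/data pairs from the prior and keep the k draws whose simulated summaries fall closest to the observed one. This is a toy illustration only (a conjugate Gaussian model with an assumed observed value, not an example from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: theta ~ N(0, 1) prior, y | theta ~ N(theta, 1).
# Observed summary statistic (assumed value for illustration):
y_obs = 1.5

n_sim = 100_000   # number of prior simulations
k = 500           # number of nearest neighbors to keep

theta = rng.normal(0.0, 1.0, n_sim)   # draw parameters from the prior
y_sim = rng.normal(theta, 1.0)        # simulate pseudo-data for each draw

# k-NN acceptance step: keep the k parameters whose simulated
# summaries are closest to the observed summary.
idx = np.argsort(np.abs(y_sim - y_obs))[:k]
posterior_sample = theta[idx]

# In this conjugate model the exact posterior is N(y_obs / 2, 1/2),
# so the ABC sample mean should land near 0.75.
print(posterior_sample.mean())
```

Smoothing the accepted draws with a kernel weighted by the distance to the observed summary yields the hybrid k-NN/kernel conditional density estimate the abstract refers to.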

113 citations


Journal ArticleDOI
TL;DR: The so-called SQMC particle method for computing the Bayesian posterior distribution and a nonparametric analysis of the widespread ABC method are described, from both an applied and a theoretical point of view.
Abstract: This paper proposes to review some recent developments in Bayesian statistics for high dimensional data. After giving some brief motivations in a short introduction, we describe new advances in the understanding of Bayes posterior computation as well as theoretical contributions in nonparametric and high dimensional Bayesian approaches. From an applied point of view, we describe the so-called SQMC particle method to compute the posterior Bayesian law, and provide a nonparametric analysis of the widespread ABC method. On the theoretical side, we describe some recent advances in Bayesian consistency for a nonparametric hidden Markov model as well as new PAC-Bayesian results for different models of high dimensional regression.

14 citations


Journal ArticleDOI
TL;DR: A new nonparametric method for estimating a univariate regression function of bounded variation that exploits the Jordan decomposition and generalizes the well-known consistency property of isotonic regression to the framework of a non-monotone regression function.
Abstract: This article introduces a new nonparametric method for estimating a univariate regression function of bounded variation. The method exploits the Jordan decomposition which states that a function of bounded variation can be decomposed as the sum of a non-decreasing function and a non-increasing function. This suggests combining the backfitting algorithm for estimating additive functions with isotonic regression for estimating monotone functions. The resulting iterative algorithm is called Iterative Isotonic Regression (I.I.R.). The main technical result in this paper is the consistency of the proposed estimator when the number of iterations $k_n$ grows appropriately with the sample size $n$. The proof requires two auxiliary results that are of interest in and by themselves: firstly, we generalize the well-known consistency property of isotonic regression to the framework of a non-monotone regression function, and secondly, we relate the backfitting algorithm to Von Neumann's algorithm in convex analysis.
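The backfitting-plus-isotonic idea in the abstract can be sketched in a few lines: alternately fit a non-decreasing component to the residuals of a non-increasing one, and vice versa, so their sum approximates a function of bounded variation. This is a minimal sketch using scikit-learn's `IsotonicRegression`, not the paper's exact I.I.R. algorithm, and the unimodal test function is an assumed toy example:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression


def iterative_isotonic_regression(x, y, n_iter=5):
    """Backfitting between a non-decreasing and a non-increasing
    component (a sketch of the I.I.R. idea; details may differ
    from the paper's algorithm)."""
    up = np.zeros_like(y, dtype=float)    # non-decreasing part
    down = np.zeros_like(y, dtype=float)  # non-increasing part
    iso_up = IsotonicRegression(increasing=True)
    iso_down = IsotonicRegression(increasing=False)
    for _ in range(n_iter):
        # Fit each monotone component to the partial residuals
        # left by the other, as in a two-term backfitting loop.
        up = iso_up.fit_transform(x, y - down)
        down = iso_down.fit_transform(x, y - up)
    return up + down


# Toy example: a non-monotone (unimodal) regression function with noise.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(np.pi * x) + rng.normal(0.0, 0.1, x.size)
fit = iterative_isotonic_regression(x, y)
```

Note that stopping after a small number of iterations matters: since any function of bounded variation is a sum of monotone pieces, letting the loop run indefinitely would interpolate the noise, which is why the consistency result lets $k_n$ grow with $n$ at a controlled rate.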