scispace - formally typeset

Gaussian process

About: Gaussian process is a research topic. Over the lifetime, 18,944 publications have been published within this topic, receiving 486,645 citations. The topic is also known as: Gaussian stochastic process.
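As an illustration of the topic, here is a minimal sketch of drawing sample functions from a zero-mean Gaussian process prior. The squared-exponential (RBF) kernel and all parameter values are illustrative choices, not specifics from this page:

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance between two sets of 1-D inputs.
    sqdist = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sqdist / length_scale ** 2)

def sample_gp_prior(x, n_samples=3, jitter=1e-8, seed=0):
    # Draw function values from a zero-mean GP prior evaluated at inputs x.
    K = rbf_kernel(x, x) + jitter * np.eye(len(x))  # jitter for numerical stability
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(np.zeros(len(x)), K, size=n_samples)

x = np.linspace(0.0, 5.0, 50)
samples = sample_gp_prior(x)
print(samples.shape)  # (3, 50): three sampled functions at 50 inputs
```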


Papers
Journal ArticleDOI
TL;DR: It is demonstrated that the new network can lead to a parsimonious model with much better generalization properties than traditional single-width RBF networks.

109 citations
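The contrast with single-width RBF networks drawn in the TL;DR above can be illustrated with a hedged sketch: a Gaussian RBF network whose hidden units each carry their own width (the architecture, names, and values below are illustrative, not the paper's exact design):

```python
import numpy as np

def rbf_network(x, centers, widths, weights):
    # Gaussian RBF network forward pass with a separate width per hidden
    # unit; a single-width network would use the same sigma for every center.
    d2 = (x[:, None] - centers[None, :]) ** 2          # squared distances
    phi = np.exp(-d2 / (2.0 * widths[None, :] ** 2))   # per-unit activations
    return phi @ weights                               # linear output layer

x = np.linspace(-1.0, 1.0, 5)
centers = np.array([-0.5, 0.0, 0.5])
widths = np.array([0.2, 0.5, 1.0])   # per-unit widths
weights = np.array([1.0, -1.0, 0.5])
print(rbf_network(x, centers, widths, weights).shape)  # (5,)
```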

Proceedings ArticleDOI
15 Jun 2019
TL;DR: In this article, the authors show that the deep image prior is asymptotically equivalent to a stationary Gaussian process prior in the limit as the number of channels in each layer goes to infinity, and derive the corresponding kernel.
Abstract: The deep image prior was recently introduced as a prior for natural images. It represents images as the output of a convolutional network with random inputs. For “inference”, gradient descent is performed to adjust network parameters to make the output match observations. This approach yields good performance on a range of image reconstruction tasks. We show that the deep image prior is asymptotically equivalent to a stationary Gaussian process prior in the limit as the number of channels in each layer of the network goes to infinity, and derive the corresponding kernel. This informs a Bayesian approach to inference. We show that by conducting posterior inference using stochastic gradient Langevin dynamics we avoid the need for early stopping, which is a drawback of the current approach, and improve results for denoising and inpainting tasks. We illustrate these intuitions on a number of 1D and 2D signal reconstruction tasks.

109 citations
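The posterior inference step named in the abstract above, stochastic gradient Langevin dynamics, can be sketched generically: a gradient step on the log posterior plus Gaussian noise whose variance matches the step size. The quadratic log posterior below is a toy stand-in, not the paper's network model:

```python
import numpy as np

def sgld_step(theta, grad_log_post, step_size, rng):
    # One SGLD update: half-step gradient ascent on the log posterior
    # plus injected Gaussian noise of variance step_size.
    noise = rng.normal(scale=np.sqrt(step_size), size=theta.shape)
    return theta + 0.5 * step_size * grad_log_post(theta) + noise

# Toy target: standard normal posterior, so samples concentrate near 0.
grad_log_post = lambda theta: -theta

rng = np.random.default_rng(0)
theta = np.full(10, 5.0)
samples = []
for t in range(5000):
    theta = sgld_step(theta, grad_log_post, step_size=1e-2, rng=rng)
    if t > 1000:                 # discard burn-in
        samples.append(theta.copy())
mean_est = np.mean(samples)
print(mean_est)                  # close to 0 for this toy target
```

Because the chain keeps sampling rather than converging to a point estimate, there is no early-stopping decision to make, which is the advantage the abstract highlights.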

Journal Article
Antti Honkela, Tapani Raiko, Mikael Kuusela, Matti Tornio, Juha Karhunen
TL;DR: An efficient algorithm for applying VB to more general models, based on specifying the functional form of the approximation (e.g., multivariate Gaussian), that outperforms alternative gradient-based methods by a significant margin.
Abstract: Variational Bayesian (VB) methods are typically only applied to models in the conjugate-exponential family using the variational Bayesian expectation maximisation (VB EM) algorithm or one of its variants. In this paper we present an efficient algorithm for applying VB to more general models. The method is based on specifying the functional form of the approximation, such as multivariate Gaussian. The parameters of the approximation are optimised using a conjugate gradient algorithm that utilises the Riemannian geometry of the space of the approximations. This leads to a very efficient algorithm for suitably structured approximations. It is shown empirically that the proposed method is comparable or superior in efficiency to the VB EM in a case where both are applicable. We also apply the algorithm to learning a nonlinear state-space model and a nonlinear factor analysis model for which the VB EM is not applicable. For these models, the proposed algorithm outperforms alternative gradient-based methods by a significant margin.

108 citations
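The general idea above can be sketched minimally: fix the approximation to a Gaussian and optimize its parameters by plain gradient ascent on the ELBO. This is not the paper's Riemannian conjugate-gradient method, and the closed-form Gaussian target below is purely illustrative:

```python
import numpy as np

# Target posterior (unnormalized): N(mu=2, sigma=0.5).
mu, sigma = 2.0, 0.5

def elbo_grads(m, log_s):
    # Closed-form ELBO gradients for a Gaussian target N(mu, sigma^2)
    # and approximation q = N(m, s^2) with s = exp(log_s):
    #   ELBO = E_q[log p(theta)] + H[q]
    #        = -((m - mu)^2 + s^2) / (2 sigma^2) + log_s + const
    s = np.exp(log_s)
    g_m = -(m - mu) / sigma**2
    g_log_s = 1.0 - s**2 / sigma**2   # derivative w.r.t. log_s
    return g_m, g_log_s

m, log_s = 0.0, 0.0
lr = 0.05
for _ in range(2000):
    g_m, g_log_s = elbo_grads(m, log_s)
    m += lr * g_m
    log_s += lr * g_log_s

print(round(m, 2), round(np.exp(log_s), 2))  # recovers approx. 2.0 and 0.5
```

The paper's contribution is precisely to make this kind of free-form gradient optimization efficient by exploiting the Riemannian geometry of the approximation family rather than the plain Euclidean gradient used here.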

Journal ArticleDOI
TL;DR: In this article, several well-known linear and nonlinear image restoration methods are written as recursive algorithms and some new recursive algorithms are developed; the nonlinear algorithms are based on the assumption that the noise is either a Poisson or a Gaussian process.
Abstract: Linear and nonlinear image restoration methods have been studied in depth but have always been treated separately. In this paper several well-known linear and nonlinear restoration methods are written as recursive algorithms, and some new recursive algorithms are developed. The nonlinear restoration algorithms are based on the assumption that the noise is either a Poisson or a Gaussian process. The linear algorithms are shown to be related to the nonlinear methods through the partial derivative, with respect to the object, of a Poisson or a Gaussian likelihood function. A table of results is given, along with applications to real imagery.

108 citations
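Under the Poisson noise assumption, recursive restoration of this kind is classically realized by multiplicative Richardson-Lucy updates. A 1-D sketch with an assumed blur kernel follows; it illustrates the family of algorithms, not the paper's exact derivations:

```python
import numpy as np

def richardson_lucy(observed, psf, n_iter=50):
    # Recursive restoration under a Poisson noise model: multiplicative
    # Richardson-Lucy updates of the object estimate.
    psf_flipped = psf[::-1]
    estimate = np.full_like(observed, observed.mean())
    for _ in range(n_iter):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)  # avoid divide-by-zero
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Toy example: a two-spike object blurred by a small symmetric PSF.
psf = np.array([0.25, 0.5, 0.25])
obj = np.zeros(32); obj[10] = 4.0; obj[20] = 2.0
observed = np.convolve(obj, psf, mode="same")
restored = richardson_lucy(observed, psf)
print(int(np.argmax(restored)))  # brightest peak recovered near index 10
```

The multiplicative form keeps the estimate nonnegative at every iteration, which is why it pairs naturally with a Poisson likelihood.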

Journal ArticleDOI
01 May 2012
TL;DR: This paper proposes the fuzzy local GMM (FLGMM) algorithm, which estimates the segmentation result that maximizes the posterior probability by minimizing an objective energy function, in which a truncated Gaussian kernel function is used to impose the spatial constraint and fuzzy memberships are employed to balance the contribution of each GMM.
Abstract: Accurate brain tissue segmentation from magnetic resonance (MR) images is an essential step in quantitative brain image analysis. However, due to the existence of noise and intensity inhomogeneity in brain MR images, many segmentation algorithms suffer from limited accuracy. In this paper, we assume that the local image data within each voxel's neighborhood satisfy the Gaussian mixture model (GMM), and thus propose the fuzzy local GMM (FLGMM) algorithm for automated brain MR image segmentation. This algorithm estimates the segmentation result that maximizes the posterior probability by minimizing an objective energy function, in which a truncated Gaussian kernel function is used to impose the spatial constraint and fuzzy memberships are employed to balance the contribution of each GMM. We compared our algorithm to state-of-the-art segmentation approaches in both synthetic and clinical data. Our results show that the proposed algorithm can largely overcome the difficulties raised by noise, low contrast, and bias field, and substantially improve the accuracy of brain MR image segmentation.

108 citations
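The truncated Gaussian kernel used as a spatial constraint in the abstract above can be sketched as a local weighting window; the radius and sigma values are illustrative, not the paper's settings:

```python
import numpy as np

def truncated_gaussian_kernel(radius, sigma):
    # 2-D Gaussian weights truncated to zero outside the given radius and
    # renormalized to sum to 1, usable as a local spatial weighting window.
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    w = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    w[x**2 + y**2 > radius**2] = 0.0   # truncate outside the radius
    return w / w.sum()

w = truncated_gaussian_kernel(radius=3, sigma=1.5)
print(w.shape, round(w.sum(), 6))  # (7, 7) 1.0
```

Weighting each voxel's neighbors this way lets nearby intensities influence the membership estimate while cutting off distant ones, which is how the spatial constraint counteracts noise.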


Network Information

Related Topics (5)

Topic                      Papers   Citations   Related
Estimator                  97.3K    2.6M        87%
Optimization problem       96.4K    2.1M        85%
Artificial neural network  207K     4.5M        84%
Support vector machine     73.6K    1.7M        82%
Deep learning              79.8K    2.1M        82%
Performance Metrics

No. of papers in the topic in previous years:

Year   Papers
2023   502
2022   1,181
2021   1,132
2020   1,220
2019   1,119
2018   978