Topic

Gaussian process

About: Gaussian process is a research topic. Over its lifetime, 18,944 publications have been published on this topic, receiving 486,645 citations in total. The topic is also known as: Gaussian stochastic process.


Papers
Journal Article
TL;DR: It is shown that a proper initialization of the recursive procedure leads to an adaptive NMF with the constant false alarm rate (CFAR) property, and that the resulting detector is very effective in heterogeneous environments of practical interest.
Abstract: Adaptive detection of signals embedded in Gaussian or non-Gaussian noise is a problem of primary concern among radar engineers. We propose a recursive algorithm to estimate the structure of the covariance matrix of either a set of Gaussian vectors that share the spectral properties up to a multiplicative factor or a set of spherically invariant random vectors (SIRVs) with the same covariance matrix and possibly correlated texture components. We also assess the performance of an adaptive implementation of the normalized matched filter (NMF), relying on the newly introduced estimate, in the presence of compound-Gaussian, clutter-dominated disturbance. In particular, it is shown that a proper initialization of the recursive procedure leads to an adaptive NMF with the constant false alarm rate (CFAR) property and that it is very effective in heterogeneous environments of relevant practical interest.

291 citations
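
As a rough illustration of the ideas in the abstract above (not the authors' exact algorithm), the following NumPy sketch computes a recursive, fixed-point-style estimate of the disturbance covariance from secondary data and plugs it into the normalized matched filter statistic; the data matrix X, steering vector s, and iteration count are hypothetical placeholders.

import numpy as np

def recursive_covariance(X, n_iter=5):
    # X holds K secondary data vectors of length N as columns (N x K, complex).
    # Each iteration re-normalizes every vector by its current quadratic form,
    # then averages the outer products; the initialization matters for CFAR behaviour.
    N, K = X.shape
    M = np.eye(N, dtype=complex)
    for _ in range(n_iter):
        Minv = np.linalg.inv(M)
        q = np.real(np.einsum('ik,ij,jk->k', X.conj(), Minv, X))
        M = (N / K) * (X / q) @ X.conj().T
    return M

def nmf_statistic(x, s, M):
    # Normalized matched filter statistic for primary data x and steering vector s.
    Minv = np.linalg.inv(M)
    num = np.abs(s.conj() @ Minv @ x) ** 2
    den = np.real(s.conj() @ Minv @ s) * np.real(x.conj() @ Minv @ x)
    return num / den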

Proceedings Article
11 Mar 2007
TL;DR: This paper develops a new sparse GP approximation which is a combination of both the global and local approaches, and shows that it is derived as a natural extension of the framework developed by Quinonero Candela and Rasmussen for sparse GP approximations.
Abstract: Gaussian process (GP) models are flexible probabilistic nonparametric models for regression, classification and other tasks. Unfortunately they suffer from computational intractability for large data sets. Over the past decade there have been many different approximations developed to reduce this cost. Most of these can be termed global approximations, in that they try to summarize all the training data via a small set of support points. A different approach is that of local regression, where many local experts account for their own part of space. In this paper we start by investigating the regimes in which these different approaches work well or fail. We then proceed to develop a new sparse GP approximation which is a combination of both the global and local approaches. Theoretically we show that it is derived as a natural extension of the framework developed by Quinonero Candela and Rasmussen [2005] for sparse GP approximations. We demonstrate the benefits of the combined approximation on some 1D examples for illustration, and on some large real-world data sets.

290 citations
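
To make the "small set of support points" idea from the abstract above concrete, here is a minimal NumPy sketch of a global inducing-point predictive mean in the subset-of-regressors/DTC form; the RBF kernel, inducing inputs Z, and noise level are illustrative choices, and this is not the combined local-global approximation proposed in the paper.

import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between rows of A (n x d) and B (m x d).
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sparse_gp_mean(Xtest, X, y, Z, noise=0.1):
    # Predictive mean of a global inducing-point approximation: the m inducing
    # inputs Z summarize the n training points, so the cost is O(n m^2) rather
    # than the O(n^3) of exact GP regression.
    Kzz = rbf(Z, Z)
    Kzx = rbf(Z, X)
    Ksz = rbf(Xtest, Z)
    Sigma = np.linalg.inv(Kzz + Kzx @ Kzx.T / noise**2 + 1e-8 * np.eye(len(Z)))
    return Ksz @ Sigma @ Kzx @ y / noise**2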

Journal Article
TL;DR: In this paper, a Bayesian inferential framework, implemented via the Markov chain Monte Carlo method, is used to solve the prediction problem for non-linear functionals of S(x), making a proper allowance for the uncertainty in the estimation of any model parameters.
Abstract: Conventional geostatistical methodology solves the problem of predicting the realized value of a linear functional of a Gaussian spatial stochastic process S(x) based on observations Y_i = S(x_i) + Z_i at sampling locations x_i, where the Z_i are mutually independent, zero-mean Gaussian random variables. We describe two spatial applications for which Gaussian distributional assumptions are clearly inappropriate. The first concerns the assessment of residual contamination from nuclear weapons testing on a South Pacific island, in which the sampling method generates spatially indexed Poisson counts conditional on an unobserved spatially varying intensity of radioactivity; we conclude that a conventional geostatistical analysis oversmooths the data and underestimates the spatial extremes of the intensity. The second application provides a description of spatial variation in the risk of campylobacter infections relative to other enteric infections in part of north Lancashire and south Cumbria. For this application, we treat the data as binomial counts at unit postcode locations, conditionally on an unobserved relative risk surface which we estimate. The theoretical framework for our extension of geostatistical methods is that, conditionally on the unobserved process S(x), observations at sample locations x_i form a generalized linear model with the corresponding values of S(x_i) appearing as an offset term in the linear predictor. We use a Bayesian inferential framework, implemented via the Markov chain Monte Carlo method, to solve the prediction problem for non-linear functionals of S(x), making a proper allowance for the uncertainty in the estimation of any model parameters.

288 citations
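
A minimal NumPy sketch of the model structure described in the abstract above, simulating Poisson counts conditional on a latent Gaussian field with a log link; the locations, covariance parameters, and intercept are illustrative, and the MCMC inference step is not shown.

import numpy as np

rng = np.random.default_rng(0)

# hypothetical sampling locations in the unit square
n = 50
X = rng.uniform(size=(n, 2))

# latent Gaussian field S(x) with a squared-exponential covariance
d2 = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)
C = np.exp(-0.5 * d2 / 0.2**2)
S = rng.multivariate_normal(np.zeros(n), C + 1e-8 * np.eye(n))

# Poisson counts conditional on the field: S(x_i) enters the linear predictor
beta0 = 1.0   # illustrative intercept
counts = rng.poisson(np.exp(beta0 + S))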

01 Dec 2008
TL;DR: In this paper, a Bayesian approach to nonlinear inverse problems in which the unknown quantity is a spatial or temporal field, endowed with a hierarchical Gaussian process prior, is proposed, where truncated Karhunen-Loeve expansions are introduced to efficiently parameterize the unknown field and specify a stochastic forward problem whose solution captures that of the deterministic forward model over the support of the prior.
Abstract: We consider a Bayesian approach to nonlinear inverse problems in which the unknown quantity is a spatial or temporal field, endowed with a hierarchical Gaussian process prior. Computational challenges in this construction arise from the need for repeated evaluations of the forward model (e.g., in the context of Markov chain Monte Carlo) and are compounded by high dimensionality of the posterior. We address these challenges by introducing truncated Karhunen-Loeve expansions, based on the prior distribution, to efficiently parameterize the unknown field and to specify a stochastic forward problem whose solution captures that of the deterministic forward model over the support of the prior. We seek a solution of this problem using Galerkin projection on a polynomial chaos basis, and use the solution to construct a reduced-dimensionality surrogate posterior density that is inexpensive to evaluate. We demonstrate the formulation on a transient diffusion equation with prescribed source terms, inferring the spatially-varying diffusivity of the medium from limited and noisy data.

288 citations
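
As a small illustration of the parameterization step discussed above (not the full surrogate construction), the following NumPy sketch builds a truncated Karhunen-Loeve expansion of a Gaussian field on a discretized 1-D domain; the grid, covariance function, and truncation level are illustrative assumptions.

import numpy as np

# hypothetical 1-D grid and prior covariance for the unknown field
x = np.linspace(0.0, 1.0, 200)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.3)

# truncated Karhunen-Loeve expansion: keep only the leading n_kl eigenmodes,
# so the field is parameterized by a low-dimensional coefficient vector xi
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
n_kl = 10
lam, phi = eigvals[order[:n_kl]], eigvecs[:, order[:n_kl]]

def field(xi):
    # realization of the discretized field for KL coordinates xi (length n_kl)
    return phi @ (np.sqrt(lam) * xi)

sample = field(np.random.default_rng(0).standard_normal(n_kl))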

Posted Content
TL;DR: In this paper, the authors introduce scalable deep kernels, which combine the structural properties of deep learning architectures with the non-parametric flexibility of kernel methods, and jointly learn the properties of these kernels through the marginal likelihood of a Gaussian process.
Abstract: We introduce scalable deep kernels, which combine the structural properties of deep learning architectures with the non-parametric flexibility of kernel methods. Specifically, we transform the inputs of a spectral mixture base kernel with a deep architecture, using local kernel interpolation, inducing points, and structure exploiting (Kronecker and Toeplitz) algebra for a scalable kernel representation. These closed-form kernels can be used as drop-in replacements for standard kernels, with benefits in expressive power and scalability. We jointly learn the properties of these kernels through the marginal likelihood of a Gaussian process. Inference and learning cost $O(n)$ for $n$ training points, and predictions cost $O(1)$ per test point. On a large and diverse collection of applications, including a dataset with 2 million examples, we show improved performance over scalable Gaussian processes with flexible kernel learning models, and stand-alone deep architectures.

288 citations
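
A toy NumPy sketch of the deep-kernel idea described above, applying a base kernel to inputs warped by a small network; here the weights are random rather than learned, an RBF base kernel stands in for the spectral mixture kernel, and the structured interpolation that gives the paper its scalability is omitted.

import numpy as np

rng = np.random.default_rng(0)

# hypothetical two-layer feature map g(x); in the paper the network weights are
# learned jointly with the kernel hyperparameters via the GP marginal likelihood
W1, b1 = rng.standard_normal((8, 2)), np.zeros(8)
W2, b2 = rng.standard_normal((2, 8)), np.zeros(2)

def g(X):
    return np.tanh(X @ W1.T + b1) @ W2.T + b2

def deep_kernel(A, B, lengthscale=1.0):
    # k(x, x') = k_base(g(x), g(x')) with an RBF base kernel
    GA, GB = g(A), g(B)
    d2 = np.sum(GA**2, 1)[:, None] + np.sum(GB**2, 1)[None, :] - 2 * GA @ GB.T
    return np.exp(-0.5 * d2 / lengthscale**2)

K = deep_kernel(rng.uniform(size=(5, 2)), rng.uniform(size=(5, 2)))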


Network Information
Related Topics (5)
Estimator: 97.3K papers, 2.6M citations, 87% related
Optimization problem: 96.4K papers, 2.1M citations, 85% related
Artificial neural network: 207K papers, 4.5M citations, 84% related
Support vector machine: 73.6K papers, 1.7M citations, 82% related
Deep learning: 79.8K papers, 2.1M citations, 82% related
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    502
2022    1,181
2021    1,132
2020    1,220
2019    1,119
2018    978