scispace - formally typeset

Gaussian process

About: Gaussian process is a research topic. Over its lifetime, 18,944 publications have been published within this topic, receiving 486,645 citations. The topic is also known as: Gaussian stochastic process.


Papers
Journal ArticleDOI
TL;DR: Empirical evidence shows that, in the case of functions having many local optima, the performance of the proposed algorithm was better than that of classical evolutionary programming using Gaussian mutation.
Abstract: This paper studies evolutionary programming with mutations based on the Lévy probability distribution. The Lévy probability distribution has an infinite second moment and is therefore more likely than the commonly employed Gaussian mutation to generate an offspring that is farther away from its parent. This likelihood depends on a parameter α in the Lévy distribution. We propose an evolutionary programming algorithm using adaptive as well as nonadaptive Lévy mutations. The proposed algorithm was applied to multivariate functional optimization. Empirical evidence shows that, in the case of functions having many local optima, the performance of the proposed algorithm was better than that of classical evolutionary programming using Gaussian mutation.

478 citations
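The heavy-tailed mutation described above can be sketched in a few lines. Below is a minimal illustration, not the paper's implementation: it draws symmetric α-stable noise via the Chambers-Mallows-Stuck method, and the function names and `scale` parameter are made up for the example. Smaller α gives heavier tails, so occasional offspring land far from the parent.

```python
import numpy as np

def levy_stable_sample(alpha, size, rng):
    """Draw symmetric alpha-stable samples (beta = 0) via the
    Chambers-Mallows-Stuck method; valid for alpha in (0, 2), alpha != 1."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)
    w = rng.exponential(1.0, size)
    return (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha))

def levy_mutate(parent, alpha=1.5, scale=0.1, rng=None):
    """Mutate a real-valued parent vector with heavy-tailed Levy noise.
    With alpha = 2 this would reduce to (scaled) Gaussian mutation."""
    rng = rng or np.random.default_rng()
    return parent + scale * levy_stable_sample(alpha, parent.shape, rng)

rng = np.random.default_rng(0)
parent = np.zeros(5)
child = levy_mutate(parent, alpha=1.3, rng=rng)
```

An adaptive variant, as in the paper, would additionally select α per individual rather than fixing it globally.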

Journal ArticleDOI
TL;DR: In this paper, it was shown that the asymptotic null distribution of the maximally selected rank statistic is the distribution of the supremum of the absolute value of a standardized Gaussian process on an interval.
Abstract: A common statistical problem is the assessment of the predictive power of a quantitative variable for some dependent variable. A maximally selected rank statistic regarding the quantitative variable provides a test and implicitly an estimate of a cutpoint as a simple classification rule. Restricting the selection to an arbitrary given inner part of the support of the quantitative variable, we show that the asymptotic null distribution of the maximally selected rank statistic is the distribution of the supremum of the absolute value of a standardized Gaussian process on an interval. The asymptotic argument holds also in the case of tied or censored observations. We compare Monte Carlo results with an approximation of the asymptotic distribution under the null hypothesis. In addition, we investigate the behaviour of the test procedure and of the familiar Spearman rank test for independence, under some alternatives. Moreover, we discuss some aspects of the problem of estimating an underlying cutpoint.

478 citations
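The statistic in the abstract can be computed by brute force over candidate cutpoints. The sketch below is illustrative, not the paper's procedure: it standardizes a Wilcoxon rank-sum statistic at each cutpoint in an inner quantile range of the quantitative variable and returns the maximum absolute value (the variance formula shown assumes no ties; `lo`, `hi`, and the function name are assumptions).

```python
import numpy as np
from scipy.stats import rankdata

def max_selected_rank_stat(x, y, lo=0.1, hi=0.9):
    """Maximally selected standardized rank statistic, with cutpoints
    restricted to the inner (lo, hi) quantile range of x."""
    n = len(x)
    r = rankdata(y)                      # midranks of the dependent variable
    cuts = np.unique(x)
    cuts = cuts[(cuts >= np.quantile(x, lo)) & (cuts <= np.quantile(x, hi))]
    best_stat, best_cut = 0.0, None
    for mu in cuts:
        left = x <= mu                   # candidate classification rule
        m = left.sum()
        if m == 0 or m == n:
            continue
        s = r[left].sum()                # rank-sum of the left group
        mean = m * (n + 1) / 2.0         # null mean (no ties)
        var = m * (n - m) * (n + 1) / 12.0   # null variance (no ties)
        z = abs(s - mean) / np.sqrt(var)
        if z > best_stat:
            best_stat, best_cut = z, mu
    return best_stat, best_cut

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 200)
y = np.where(x > 0.5, 1.0, 0.0) + rng.normal(0.0, 0.2, 200)
stat, cut = max_selected_rank_stat(x, y)
```

Because the maximum is taken over many correlated statistics, its null distribution is not standard normal; the paper's point is that it converges to the supremum of a standardized Gaussian process.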

Journal Article
TL;DR: A probabilistic kernel approach to ordinal regression based on Gaussian processes is presented, where a threshold model that generalizes the probit function is used as the likelihood function for ordinal variables.
Abstract: We present a probabilistic kernel approach to ordinal regression based on Gaussian processes. A threshold model that generalizes the probit function is used as the likelihood function for ordinal variables. Two inference techniques, based on the Laplace approximation and the expectation propagation algorithm respectively, are derived for hyperparameter learning and model selection. We compare these two Gaussian process approaches with a previous ordinal regression method based on support vector machines on some benchmark and real-world data sets, including applications of ordinal regression to collaborative filtering and gene expression analysis. Experimental results on these data sets verify the usefulness of our approach.

475 citations
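The threshold likelihood that generalizes the probit function can be written down compactly. A minimal sketch, assuming ordinal labels y ∈ {1, …, J} and interior thresholds b_1 < … < b_{J-1} (the function name and parameter values are illustrative, not from the paper):

```python
import numpy as np
from scipy.stats import norm

def ordinal_probit_likelihood(f, y, thresholds, sigma=1.0):
    """Threshold (cumulative probit) likelihood for ordinal labels:
    P(y = j | f) = Phi((b_j - f)/sigma) - Phi((b_{j-1} - f)/sigma),
    with implicit b_0 = -inf and b_J = +inf."""
    b = np.concatenate(([-np.inf], thresholds, [np.inf]))
    upper = norm.cdf((b[y] - f) / sigma)
    lower = norm.cdf((b[y - 1] - f) / sigma)
    return upper - lower

# Three ordinal classes separated by thresholds b_1 = -1, b_2 = 1:
thr = np.array([-1.0, 1.0])
p = ordinal_probit_likelihood(np.array([0.0]), np.array([2]), thr)
```

In the GP setting, f is the latent function value at an input; the Laplace approximation or expectation propagation is then used to integrate this non-Gaussian likelihood against the GP prior.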

Journal ArticleDOI
TL;DR: In this paper, the authors combine Malliavin calculus with Stein's method to derive explicit bounds in the Gaussian and Gamma approximations of random variables in a fixed Wiener chaos of a general Gaussian process.
Abstract: We combine Malliavin calculus with Stein's method, in order to derive explicit bounds in the Gaussian and Gamma approximations of random variables in a fixed Wiener chaos of a general Gaussian process. Our approach generalizes, refines and unifies the central and non-central limit theorems for multiple Wiener–Itô integrals recently proved (in several papers, from 2005 to 2007) by Nourdin, Nualart, Ortiz-Latorre, Peccati and Tudor. We apply our techniques to prove Berry–Esseen bounds in the Breuer–Major CLT for subordinated functionals of fractional Brownian motion. By using the well-known Mehler's formula for Ornstein–Uhlenbeck semigroups, we also recover a technical result recently proved by Chatterjee, concerning the Gaussian approximation of functionals of finite-dimensional Gaussian vectors.

473 citations

Journal ArticleDOI
TL;DR: The Gaussian approximation potentials (GAP) framework is described, along with a variety of descriptors, how to train the model on total energies and derivatives, and the simultaneous use of multiple models of different complexity.
Abstract: We present a swift walk-through of our recent work that uses machine learning to fit interatomic potentials based on quantum mechanical data. We describe our Gaussian approximation potentials (GAP) framework, discuss a variety of descriptors, how to train the model on total energies and derivatives, and the simultaneous use of multiple models of different complexity. We also show a small example using QUIP, the software sandbox implementation of GAP that is available for noncommercial use. © 2015 Wiley Periodicals, Inc.

470 citations


Network Information
Related Topics (5)

- Estimator: 97.3K papers, 2.6M citations (87% related)
- Optimization problem: 96.4K papers, 2.1M citations (85% related)
- Artificial neural network: 207K papers, 4.5M citations (84% related)
- Support vector machine: 73.6K papers, 1.7M citations (82% related)
- Deep learning: 79.8K papers, 2.1M citations (82% related)
Performance Metrics
No. of papers in the topic in previous years:

Year   Papers
2023   502
2022   1,181
2021   1,132
2020   1,220
2019   1,119
2018   978