scispace - formally typeset
Topic

Gaussian process

About: Gaussian process is a research topic. Over the lifetime, 18944 publications have been published within this topic receiving 486645 citations. The topic is also known as: Gaussian stochastic process.


Papers
Posted Content
TL;DR: In this article, the previously identified equivalence between wide fully connected neural networks (FCNs) and Gaussian processes (GPs) is extended to multi-layer convolutional neural networks (CNNs), both with and without pooling layers, achieving state-of-the-art results on CIFAR10 for GPs without trainable kernels.
Abstract: There is a previously identified equivalence between wide fully connected neural networks (FCNs) and Gaussian processes (GPs). This equivalence enables, for instance, test set predictions that would have resulted from a fully Bayesian, infinitely wide trained FCN to be computed without ever instantiating the FCN, but by instead evaluating the corresponding GP. In this work, we derive an analogous equivalence for multi-layer convolutional neural networks (CNNs) both with and without pooling layers, and achieve state-of-the-art results on CIFAR10 for GPs without trainable kernels. We also introduce a Monte Carlo method to estimate the GP corresponding to a given neural network architecture, even in cases where the analytic form has too many terms to be computationally feasible. Surprisingly, in the absence of pooling layers, the GPs corresponding to CNNs with and without weight sharing are identical. As a consequence, translation equivariance, beneficial in finite channel CNNs trained with stochastic gradient descent (SGD), is guaranteed to play no role in the Bayesian treatment of the infinite channel limit - a qualitative difference between the two regimes that is not present in the FCN case. We confirm experimentally that, while in some scenarios the performance of SGD-trained finite CNNs approaches that of the corresponding GPs as the channel count increases, with careful tuning SGD-trained CNNs can significantly outperform their corresponding GPs, suggesting advantages from SGD training compared to fully Bayesian parameter estimation.
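The Monte Carlo kernel estimation mentioned in the abstract can be illustrated in miniature: estimate the NNGP kernel entry K(x, x') by averaging output covariances over random weight draws. The sketch below is an assumption-laden toy (hypothetical names, a one-hidden-layer ReLU network rather than the paper's CNNs) using standard NNGP weight scaling, not the authors' implementation:

```python
import numpy as np

def mc_nngp_kernel(x1, x2, width=512, draws=2000, seed=0):
    """Monte Carlo estimate of the GP kernel entry K(x1, x2) for a
    random one-hidden-layer ReLU network with NNGP weight scaling."""
    rng = np.random.default_rng(seed)
    d = x1.shape[0]
    total = 0.0
    for _ in range(draws):
        # First-layer weights ~ N(0, 1/d); hidden pre-activations.
        W = rng.normal(0.0, 1.0 / np.sqrt(d), size=(width, d))
        h1 = np.maximum(W @ x1, 0.0)   # ReLU activations for x1
        h2 = np.maximum(W @ x2, 0.0)   # ReLU activations for x2
        # With readout weights ~ N(0, 1/width), the expected output
        # covariance is the inner product of activations / width.
        total += h1 @ h2 / width
    return total / draws

x = np.array([1.0, 0.0])
k_xx = mc_nngp_kernel(x, x)
# Analytic ReLU NNGP value here: K(x, x) = ||x||^2 / (2 d) = 0.25
```

For a deep CNN the same recipe applies: draw random parameters, evaluate the network on both inputs, and average the product of outputs, which is exactly what makes the method usable when the analytic kernel has too many terms.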

110 citations

Journal ArticleDOI
TL;DR: Experiments performed on linear Gaussian, linear non-Gaussian, and nonlinear systems with varying in-band noise levels, data lengths, and kernel sizes confirm that correntropy can be employed as a discriminative measure for detecting nonlinear characteristics in time series.
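Correntropy itself is a simple kernel-based similarity statistic; with a Gaussian kernel it is V_sigma(X, Y) = E[exp(-(X - Y)^2 / (2 sigma^2))]. A minimal sample estimator (hypothetical names; the paper's detection procedure involves more than this single statistic) might look like:

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample correntropy: the mean of a Gaussian kernel evaluated
    at the pairwise differences x_i - y_i."""
    d = np.asarray(x) - np.asarray(y)
    return float(np.mean(np.exp(-d**2 / (2.0 * sigma**2))))

rng = np.random.default_rng(1)
g = rng.normal(size=5000)      # a linear Gaussian series
s = np.tanh(3.0 * g)           # a nonlinear transformation of it
# Correntropy of each series against its one-step lagged copy:
v_lin = correntropy(g[1:], g[:-1])
v_nl = correntropy(s[1:], s[:-1])
```

Because the Gaussian kernel implicitly involves all even moments of the difference, comparing such statistics across kernel sizes can expose non-Gaussian and nonlinear structure that second-order measures miss.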

110 citations

Proceedings ArticleDOI
13 Jun 2010
TL;DR: This paper shows that the approximate-capacity result for Gaussian relay networks can also be established by using lattices for transmission and quantization, along with structured mappings at the relays.
Abstract: Recently, it has been shown that a quantize-map-and-forward scheme approximately achieves (within a constant number of bits) the Gaussian relay network capacity for arbitrary topologies [1]. This was established using Gaussian codebooks for transmission and random mappings at the relays. In this paper, we show that the same approximation result can be established by using lattices for transmission and quantization along with structured mappings at the relays.
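The lattice quantization step can be pictured on the simplest possible lattice, the scaled integer lattice delta*Z^n, where quantizing a vector is just componentwise rounding. This toy sketch (hypothetical names) is only a stand-in for the structured lattices the paper actually uses:

```python
import numpy as np

def lattice_quantize(x, delta=0.5):
    """Map x to the nearest point of the scaled integer lattice
    delta * Z^n (componentwise nearest-multiple rounding)."""
    return delta * np.round(np.asarray(x) / delta)

y = lattice_quantize([0.30, -1.10, 0.74], delta=0.5)
# Each coordinate moves to the nearest multiple of 0.5:
# y == [0.5, -1.0, 0.5]
```

The appeal of lattices in this setting is exactly this structure: quantization and the relay mappings become linear-algebraic operations rather than lookups in unstructured random codebooks.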

110 citations

Proceedings ArticleDOI
25 Mar 2003
TL;DR: The paper derives the optimality conditions that quantizers for distributed source coding must satisfy and generalizes the Lloyd algorithm for their design; experimental results are shown for the scalar Gaussian asymmetric case.
Abstract: The problem of designing optimal quantizers for distributed source coding is addressed. The generality of this formulation includes both the symmetric and asymmetric scenarios, together with a number of coding schemes, such as ideal coding achieving a rate equal to the joint conditional entropy of the quantized sources given the side information. The paper shows the optimality conditions that quantizers must satisfy, and generalizes the Lloyd algorithm for their design. Experimental results are shown for the Gaussian scalar asymmetric case.
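The classical Lloyd iteration that the paper generalizes alternates nearest-codeword assignment with centroid updates. A minimal scalar version on samples (hypothetical names; without the side-information and rate terms the paper adds) might look like:

```python
import numpy as np

def lloyd_scalar(samples, levels=4, iters=50, seed=0):
    """Classical scalar Lloyd algorithm: alternate nearest-codeword
    assignment and centroid (conditional-mean) updates."""
    rng = np.random.default_rng(seed)
    codebook = np.sort(rng.choice(samples, size=levels, replace=False))
    for _ in range(iters):
        # Assign each sample to its nearest reproduction level.
        idx = np.argmin(np.abs(samples[:, None] - codebook[None, :]),
                        axis=1)
        # Move each level to the centroid of its assigned samples.
        for j in range(levels):
            if np.any(idx == j):
                codebook[j] = samples[idx == j].mean()
        codebook = np.sort(codebook)
    return codebook

samples = np.random.default_rng(1).normal(size=20000)
cb = lloyd_scalar(samples, levels=4)
# For a standard Gaussian source this converges toward the symmetric
# 4-level Lloyd-Max codebook, approximately (+/-0.45, +/-1.51).
```

The distributed-source-coding generalization keeps this alternating structure but redefines both the assignment rule and the centroid update to account for the side information and the coding rate.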

110 citations

Journal ArticleDOI
E. B. Dynkin1
TL;DR: In this article, a relation between two random fields, the free field and the occupation field of a Markov process, is established; it is useful both for field theory and for the theory of Markov processes.

110 citations


Network Information
Related Topics (5)
- Estimator: 97.3K papers, 2.6M citations (87% related)
- Optimization problem: 96.4K papers, 2.1M citations (85% related)
- Artificial neural network: 207K papers, 4.5M citations (84% related)
- Support vector machine: 73.6K papers, 1.7M citations (82% related)
- Deep learning: 79.8K papers, 2.1M citations (82% related)
Performance Metrics
No. of papers in the topic in previous years:
- 2023: 502
- 2022: 1,181
- 2021: 1,132
- 2020: 1,220
- 2019: 1,119
- 2018: 978