scispace - formally typeset
Topic

Gaussian process

About: Gaussian process is a research topic. Over its lifetime, 18,944 publications have been published within this topic, receiving 486,645 citations. The topic is also known as: Gaussian stochastic process.


Papers
Proceedings ArticleDOI
28 Jun 2009
TL;DR: This paper proposes a weighted ℓ1 minimization recovery algorithm and analyzes its performance using a Grassmann angle approach, on a model where the entries of the unknown vector fall into two sets, each with a different probability of being nonzero.
Abstract: In this paper we study the compressed sensing problem of recovering a sparse signal from a system of underdetermined linear equations when we have prior information about the probability of each entry of the unknown signal being nonzero. In particular, we focus on a model where the entries of the unknown vector fall into two sets, each with a different probability of being nonzero. We propose a weighted ℓ1 minimization recovery algorithm and analyze its performance using a Grassmann angle approach. We compute explicitly the relationship between the system parameters (the weights, the number of measurements, the size of the two sets, the probabilities of being nonzero) so that an iid random Gaussian measurement matrix along with weighted ℓ1 minimization recovers almost all such sparse signals with overwhelming probability as the problem dimension increases. This allows us to compute the optimal weights. We also provide simulations to demonstrate the advantages of the method over conventional ℓ1 optimization.
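The weighted ℓ1 program described above can be posed as a linear program and solved with off-the-shelf tools. Below is a minimal sketch, not the authors' implementation: the problem sizes, the two nonzero probabilities, and the weights are illustrative choices (the weights are hand-picked rather than computed from the paper's optimality analysis).

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m = 40, 25  # signal length, number of measurements (illustrative)

# Two index sets with different probabilities of being nonzero
p = np.where(np.arange(n) < n // 2, 0.05, 0.4)
x_true = rng.standard_normal(n) * (rng.random(n) < p)

A = rng.standard_normal((m, n)) / np.sqrt(m)  # iid Gaussian measurement matrix
b = A @ x_true

# Weighted l1 minimization: min sum_i w_i |x_i|  s.t.  A x = b.
# A larger weight on the set more likely to be zero pushes it harder toward zero.
w = np.where(np.arange(n) < n // 2, 3.0, 1.0)

# LP reformulation via x = u - v with u, v >= 0, so |x| = u + v at the optimum
c = np.concatenate([w, w])
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
x_hat = res.x[:n] - res.x[n:]
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The printed relative recovery error will typically be near zero when the measurement count is sufficient for the given sparsity levels, but recovery is probabilistic, as the analysis in the paper makes precise.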

195 citations

Proceedings Article
08 Dec 2008
TL;DR: This work presents an accelerated sampling procedure which enables Bayesian inference of parameters in nonlinear ordinary and delay differential equations via the novel use of Gaussian processes (GP).
Abstract: Identification and comparison of nonlinear dynamical system models using noisy and sparse experimental data is a vital task in many fields; however, current methods are computationally expensive and prone to error, due in part to the nonlinear nature of the likelihood surfaces they induce. We present an accelerated sampling procedure which enables Bayesian inference of parameters in nonlinear ordinary and delay differential equations via the novel use of Gaussian processes (GP). Our method involves GP regression over time-series data, and the resulting derivative and time-delay estimates make parameter inference possible without solving the dynamical system explicitly, resulting in dramatic savings of computational time. We demonstrate the speed and statistical accuracy of our approach using examples of both ordinary and delay differential equations, and provide a comprehensive comparison with current state-of-the-art methods.
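The core gradient-matching idea can be sketched in a few lines: fit a GP to noisy observations of an ODE trajectory, differentiate the posterior mean analytically, and fit the ODE parameter by least squares against the right-hand side, without ever integrating the system. The logistic-growth model, kernel hyperparameters, and least-squares estimator below are illustrative stand-ins for the paper's Bayesian sampling scheme, not its actual setup.

```python
import numpy as np

# Simulated data from logistic growth  x' = theta * x * (1 - x), theta = 1.5
theta_true = 1.5
t = np.linspace(0.1, 4.0, 30)
x0 = 0.1
x = 1.0 / (1.0 + (1.0 / x0 - 1.0) * np.exp(-theta_true * t))  # closed form
rng = np.random.default_rng(1)
y = x + 0.005 * rng.standard_normal(t.size)

# GP regression with an RBF kernel; hyperparameters fixed by hand for the sketch
ell, sigma_n = 0.5, 0.005
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / ell**2)
alpha = np.linalg.solve(K + sigma_n**2 * np.eye(t.size), y)

# Posterior mean and its analytic time derivative at the data points
mu = K @ alpha
dK = -(t[:, None] - t[None, :]) / ell**2 * K  # d k(t_i, t_j) / d t_i
dmu = dK @ alpha

# Gradient matching: least-squares fit of theta in  x' = theta * x * (1 - x),
# using the GP derivative estimates in place of solving the ODE
phi = mu * (1 - mu)
theta_hat = (phi @ dmu) / (phi @ phi)
print(theta_hat)
```

Because the derivative of the GP posterior mean is available in closed form, no numerical ODE solver appears anywhere in the loop, which is the source of the computational savings the paper describes.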

195 citations

Journal ArticleDOI
TL;DR: In this article, the authors give sufficient, and in some cases necessary and sufficient, conditions for the mild solution of a stochastic linear equation to be regular both "in time" and "in space" or to be the strong solution.
Abstract: The paper gives sufficient, and in some cases necessary and sufficient, conditions for the mild solution of a stochastic linear equation to be regular both "in time" and "in space" or to be the strong solution.

195 citations

Proceedings ArticleDOI
24 Jul 2016
TL;DR: Manifold Gaussian Processes is a novel supervised method that jointly learns a transformation of the data into a feature space and a GP regression from the feature space to the observed space, making it possible to learn data representations that are useful for the overall regression task.
Abstract: Off-the-shelf Gaussian Process (GP) covariance functions encode smoothness assumptions on the structure of the function to be modeled. To model complex and non-differentiable functions, these smoothness assumptions are often too restrictive. One way to alleviate this limitation is to find a different representation of the data by introducing a feature space. This feature space is often learned in an unsupervised way, which might lead to data representations that are not useful for the overall regression task. In this paper, we propose Manifold Gaussian Processes, a novel supervised method that jointly learns a transformation of the data into a feature space and a GP regression from the feature space to the observed space. The Manifold GP is a full GP and makes it possible to learn data representations that are useful for the overall regression task. As a proof of concept, we evaluate our approach on complex non-smooth functions where standard GPs perform poorly, such as step functions and robotics tasks with contacts.
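The benefit of learning the representation against the regression objective can be illustrated with a deliberately small stand-in: a one-parameter feature map m(x) = tanh(a·x), with a selected by marginal-likelihood grid search in place of the paper's joint gradient-based training. On a noisy step function, a steep tanh makes the target smooth in feature space, so the GP's negative log marginal likelihood should drop well below that of a GP on the raw inputs. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 40)
y = (x > 0).astype(float) + 0.05 * rng.standard_normal(x.size)  # noisy step

def nlml(z, y, ell=0.3, sn=0.05):
    # GP negative log marginal likelihood (up to a constant),
    # RBF kernel on inputs z with fixed length-scale and noise level
    K = np.exp(-0.5 * (z[:, None] - z[None, :]) ** 2 / ell**2)
    K += sn**2 * np.eye(z.size)
    L = np.linalg.cholesky(K)
    a = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ a + np.log(np.diag(L)).sum()

# "Manifold" feature map m(x) = tanh(a * x); grid search over a stands in
# for the joint gradient-based training used in the paper
grid = np.linspace(0.5, 30, 60)
a_best = min(grid, key=lambda a: nlml(np.tanh(a * x), y))
print(a_best, nlml(np.tanh(a_best * x), y), nlml(x, y))
```

The search favors a steep tanh: in feature space the inputs cluster on either side of the jump, the step becomes an easy (effectively piecewise-constant) regression target, and the marginal likelihood improves over the identity representation.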

195 citations

Journal ArticleDOI
24 Jun 2001
TL;DR: Aspects of the duality between the information-embedding problem and the Wyner-Ziv (1976) problem of source coding with side information at the decoder are developed and used to establish a spectrum of new results on these and related problems, with implications for a number of important applications.
Abstract: Aspects of the duality between the information-embedding problem and the Wyner-Ziv (1976) problem of source coding with side information at the decoder are developed and used to establish a spectrum of new results on these and related problems, with implications for a number of important applications. The single-letter characterization of the information-embedding problem is developed and related to the corresponding characterization of the Wyner-Ziv problem, both of which correspond to optimization of a common mutual information difference. Dual variables and dual Markov conditions are identified, along with the dual role of noise and distortion in the two problems. For a Gaussian context with quadratic distortion metric, a geometric interpretation of the duality is developed. From such insights, we develop a capacity-achieving information-embedding system based on nested lattices. We show that the resulting encoder-decoder has precisely the same decoder-encoder structure as the corresponding Wyner-Ziv system based on nested lattices that achieves the rate-distortion limit. For a binary context with Hamming distortion metric, the information-embedding capacity is developed, along with its relationship to the corresponding Wyner-Ziv rate-distortion function. In turn, an information-embedding system for this case based on nested linear codes is constructed having an encoder-decoder that is identical to the decoder-encoder structure for the corresponding system that achieves the Wyner-Ziv rate-distortion limit. Finally, based on these results, a simple layered joint source-channel coding system is developed with a perfectly symmetric encoder-decoder structure. Its application and performance are discussed in a broadcast setting in which there is a need to control the fidelity experienced by different receivers.
Among other results, we show that such systems and their multilayer extensions retain attractive optimality properties in the Gaussian-quadratic case, but not in the binary-Hamming case.
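The one-dimensional instance of the nested-lattice embedding structure is scalar quantization index modulation: the coarse lattice δZ and its half-step coset each encode one bit value, and the decoder picks whichever coset has a point nearest the received sample. Below is a minimal sketch; the step size and noise level are illustrative choices, not values from the paper.

```python
import numpy as np

delta = 1.0  # coarse lattice step (illustrative)

def embed(host, bit):
    # Quantize the host sample to the coset of delta*Z selected by the bit:
    # bit 0 -> lattice points k*delta, bit 1 -> points k*delta + delta/2
    d = bit * delta / 2
    return delta * np.round((host - d) / delta) + d

def decode(received):
    # Pick the coset whose nearest lattice point is closest to the sample
    d0 = abs(received - embed(received, 0))
    d1 = abs(received - embed(received, 1))
    return int(d1 < d0)

rng = np.random.default_rng(0)
host = 10 * rng.standard_normal(1000)
bits = rng.integers(0, 2, 1000)
marked = np.array([embed(h, b) for h, b in zip(host, bits)])
noisy = marked + 0.05 * rng.standard_normal(1000)  # noise well below delta/4
recovered = np.array([decode(r) for r in noisy])
acc = np.mean(recovered == bits)
print(acc)
```

Decoding is correct whenever the noise magnitude stays below delta/4, so with the small noise level above essentially every bit survives; the embedding distortion is bounded by delta/2 per sample regardless of the host value.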

194 citations


Network Information
Related Topics (5)
Estimator: 97.3K papers, 2.6M citations (87% related)
Optimization problem: 96.4K papers, 2.1M citations (85% related)
Artificial neural network: 207K papers, 4.5M citations (84% related)
Support vector machine: 73.6K papers, 1.7M citations (82% related)
Deep learning: 79.8K papers, 2.1M citations (82% related)
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    502
2022    1,181
2021    1,132
2020    1,220
2019    1,119
2018    978