Topic

Gaussian process

About: Gaussian process is a research topic. Over its lifetime, 18,944 publications have been published within this topic, receiving 486,645 citations. The topic is also known as: Gaussian stochastic process.


Papers
Journal ArticleDOI
TL;DR: In this paper, it was shown that the process (D_k) has the law of the process of the largest eigenvalues of the main minors of an infinite random matrix drawn from the Gaussian Unitary Ensemble.
Abstract: Consider the process D_k, k = 1, 2, …, built from independent standard Brownian motions B_1, B_2, …. This process describes the limiting behavior “near the edge” in queues in series, totally asymmetric exclusion processes, or oriented percolation. The problem of finding the distribution of D_k was posed in [GW]. The main result of this paper is that the process (D_k) has the law of the process of the largest eigenvalues of the main minors of an infinite random matrix drawn from the Gaussian Unitary Ensemble.
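For reference, a commonly cited form of this process in the queues-in-series / last-passage literature (stated here as an assumption, since the abstract above does not reproduce the defining equation) is

\[
D_k \;=\; \sup_{0 = t_0 \le t_1 \le \cdots \le t_k = 1} \;\sum_{i=1}^{k} \bigl( B_i(t_i) - B_i(t_{i-1}) \bigr), \qquad k = 1, 2, \ldots,
\]

so that in particular D_1 = B_1(1).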

276 citations

Book
13 Nov 1995
TL;DR: This book develops stochastic-process and stochastic-calculus tools for kinetic theory models of polymer dynamics and fluid mechanics, covering bead-spring and reptation models, Brownian dynamics simulations, and the CONNFFESSIT idea.
Abstract (table of contents):
1 Stochastic Processes, Polymer Dynamics, and Fluid Mechanics: 1.1 Approach to Kinetic Theory Models; 1.2 Flow Calculation and Material Functions (1.2.1 Shear Flows; 1.2.2 General Extensional Flows; 1.2.3 The CONNFFESSIT Idea); References.
Part I: Stochastic Processes
2 Basic Concepts from Stochastics: 2.1 Events and Probabilities (2.1.1 Events and σ-Algebras; 2.1.2 Probability Axioms; 2.1.3 Gaussian Probability Measures); 2.2 Random Variables (2.2.1 Definitions and Examples; 2.2.2 Expectations and Moments; 2.2.3 Joint Distributions and Independence; 2.2.4 Conditional Expectations and Probabilities; 2.2.5 Gaussian Random Variables; 2.2.6 Convergence of Random Variables); 2.3 Basic Theory of Stochastic Processes (2.3.1 Definitions and Distributions; 2.3.2 Gaussian Processes; 2.3.3 Markov Processes; 2.3.4 Martingales); References.
3 Stochastic Calculus: 3.1 Motivation (3.1.1 Naive Approach to Stochastic Differential Equations; 3.1.2 Criticism of the Naive Approach); 3.2 Stochastic Integration (3.2.1 Definition of the Ito Integral; 3.2.2 Properties of the Ito Integral; 3.2.3 Ito's Formula); 3.3 Stochastic Differential Equations (3.3.1 Definitions and Basic Theorems; 3.3.2 Linear Stochastic Differential Equations; 3.3.3 Fokker-Planck Equations; 3.3.4 Mean Field Interactions; 3.3.5 Boundary Conditions; 3.3.6 Stratonovich's Stochastic Calculus); 3.4 Numerical Integration Schemes (3.4.1 Euler's Method; 3.4.2 Mil'shtein's Method; 3.4.3 Weak Approximation Schemes; 3.4.4 More Sophisticated Methods); References.
Part II: Polymer Dynamics
4 Bead-Spring Models for Dilute Solutions: 4.1 Rouse Model (4.1.1 Analytical Solution for the Equations of Motion; 4.1.2 Stress Tensor; 4.1.3 Material Functions in Shear and Extensional Flows; 4.1.4 A Primer in Brownian Dynamics Simulations; 4.1.5 Variance Reduced Simulations); 4.2 Hydrodynamic Interaction (4.2.1 Description of Hydrodynamic Interaction; 4.2.2 Zimm Model; 4.2.3 Long Chain Limit and Universal Behavior; 4.2.4 Gaussian Approximation; 4.2.5 Simulation of Dumbbells); 4.3 Nonlinear Forces (4.3.1 Excluded Volume; 4.3.2 Finite Polymer Extensibility); References.
5 Models with Constraints: 5.1 General Bead-Rod-Spring Models (5.1.1 Philosophy of Constraints; 5.1.2 Formulation of Stochastic Differential Equations; 5.1.3 Generalized Coordinates Versus Constraint Conditions; 5.1.4 Numerical Integration Schemes; 5.1.5 Stress Tensor); 5.2 Rigid Rod Models (5.2.1 Dilute Solutions of Rod-like Molecules; 5.2.2 Liquid Crystal Polymers); References.
6 Reptation Models for Concentrated Solutions and Melts: 6.1 Doi-Edwards and Curtiss-Bird Models (6.1.1 Polymer Dynamics; 6.1.2 Stress Tensor; 6.1.3 Simulations in Steady Shear Flow; 6.1.4 Efficiency of Simulations); 6.2 Reptating-Rope Model (6.2.1 Basic Model Equations; 6.2.2 Results for Steady Shear Flow); 6.3 Modified Reptation Models (6.3.1 A Model Related to “Double Reptation”; 6.3.2 Doi-Edwards Model Without Independent Alignment); References.
Landmark Papers and Books; Solutions to Exercises; References; Author Index.
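Section 3.4 of the table of contents covers numerical integration schemes for stochastic differential equations, starting with Euler's method. As a minimal sketch (not taken from the book, with all coefficients and settings assumed for illustration), an Euler-Maruyama integration of a linear SDE of Ornstein-Uhlenbeck type, dX_t = -a X_t dt + b dW_t, looks like this:

```python
# Minimal Euler-Maruyama sketch for dX_t = -a X_t dt + b dW_t (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

a, b = 1.0, 0.5                 # drift and noise coefficients (assumed values)
T, n_steps = 5.0, 5000
dt = T / n_steps

x = np.empty(n_steps + 1)
x[0] = 1.0                      # initial condition
for k in range(n_steps):
    dW = np.sqrt(dt) * rng.standard_normal()   # Brownian increment over dt
    x[k + 1] = x[k] - a * x[k] * dt + b * dW   # Euler-Maruyama step

# For this linear SDE the stationary variance is b**2 / (2 * a); the late part
# of a long trajectory should fluctuate with roughly that variance.
print(round(x[2500:].var(), 3), b**2 / (2 * a))
```

Mil'shtein's method (Section 3.4.2) adds a correction term involving the derivative of the noise coefficient; for additive noise, as in this sketch, the two schemes coincide.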

276 citations

Journal ArticleDOI
TL;DR: An iterative approach based on inverse-filter criteria is developed, using the third-order or fourth-order normalized cumulants of the inverse-filtered data at zero lag, to estimate a multiple-input multiple-output (MIMO) system given only measurements of the vector output of the system.
Abstract: This paper is concerned with the problem of estimation and deconvolution of the matrix impulse response function of a multiple-input multiple-output (MIMO) system given only the measurements of the vector output of the system. The system is assumed to be driven by a temporally i.i.d. and spatially independent non-Gaussian vector sequence (which is not observed). An iterative, inverse filter criteria-based approach is developed using the third-order or the fourth-order normalized cumulants of the inverse filtered data at zero lag. Stationary points of the proposed cost functions are investigated. The approach is input iterative, i.e., the input sequences are extracted and removed one by one. The matrix impulse response is then obtained by cross correlating the extracted inputs with the observed outputs. Identifiability conditions are analyzed. The strong consistency of the proposed approach is also briefly discussed. Computer simulation examples are presented to illustrate the proposed approaches.
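As an illustration of the inverse-filter criterion described above, here is a deliberately simplified single-input single-output sketch (the paper treats the full multi-channel MIMO case): an FIR inverse filter is chosen to maximize the magnitude of the normalized fourth-order cumulant (excess kurtosis) of the inverse-filtered data, and the extracted input is then checked against the true input up to scale, sign, and delay. Signal length, filter length, and optimizer settings are assumptions for this toy example.

```python
# Toy single-channel sketch of an inverse-filter criterion (illustrative only).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
n = 5000
s = rng.laplace(size=n)                      # non-Gaussian i.i.d. input (unobserved)
h = np.array([1.0, 0.6, -0.3])               # unknown channel impulse response
y = np.convolve(s, h, mode="full")[:n]       # observed output

L = 8                                        # inverse-filter length (assumed)

def neg_abs_kurtosis(w):
    z = np.convolve(y, w, mode="same")       # inverse-filtered data
    # Excess kurtosis = fourth-order cumulant at zero lag / squared variance.
    return -abs(kurtosis(z, fisher=True))

w0 = np.zeros(L)
w0[0] = 1.0                                  # start from an identity filter
res = minimize(neg_abs_kurtosis, w0, method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-10})

s_hat = np.convolve(y, res.x, mode="same")   # extracted input, up to scale/sign/delay
s_hat /= s_hat.std()
xc = np.correlate(s_hat, s / s.std(), mode="full") / n
print("peak |correlation| with the true input:", round(float(np.abs(xc).max()), 2))
```

In the full MIMO setting of the paper, this extraction step is repeated input by input, each extracted input is removed, and the matrix impulse response is finally obtained by cross-correlating the extracted inputs with the observed outputs.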

276 citations

Journal ArticleDOI
TL;DR: This paper presents different efficient approximations for dependent-output Gaussian processes constructed through the convolution formalism, exploits the conditional independencies naturally present in the model, and shows experimental results with synthetic and real data.
Abstract: Recently there has been an increasing interest in regression methods that deal with multiple outputs. This has been motivated partly by frameworks like multitask learning, multisensor networks or structured output data. From a Gaussian processes perspective, the problem reduces to specifying an appropriate covariance function that, whilst being positive semi-definite, captures the dependencies between all the data points and across all the outputs. One approach to account for non-trivial correlations between outputs employs convolution processes. Under a latent function interpretation of the convolution transform we establish dependencies between output variables. The main drawbacks of this approach are the associated computational and storage demands. In this paper we address these issues. We present different efficient approximations for dependent output Gaussian processes constructed through the convolution formalism. We exploit the conditional independencies present naturally in the model. This leads to a form of the covariance similar in spirit to the so-called PITC and FITC approximations for a single output. We show experimental results with synthetic and real data; in particular, we show results in school exam score prediction, pollution prediction and gene expression data.
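A small numerical sketch of the convolution-process construction the paper builds on (a toy discretization, not the authors' implementation): two outputs are obtained by smoothing one latent Gaussian process with output-specific Gaussian kernels, and the resulting covariance blocks G_d K_u G_d'^T make the outputs dependent. Grid size, length-scale, and smoothing widths are arbitrary assumptions.

```python
# Toy numpy sketch: dependent outputs via a discretized convolution process.
import numpy as np

grid = np.linspace(-5, 5, 400)           # grid for the latent function u(z)
dz = grid[1] - grid[0]

def rbf(a, b, ell=1.0, var=1.0):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

Ku = rbf(grid, grid)                     # covariance of the latent process

X = np.linspace(-4, 4, 60)               # inputs shared by both outputs

def smoother(width):
    # Rows are discretized Gaussian smoothing kernels G_d(x - z) over the grid.
    d = X[:, None] - grid[None, :]
    return np.exp(-0.5 * (d / width) ** 2) / (width * np.sqrt(2 * np.pi)) * dz

G1, G2 = smoother(0.3), smoother(1.0)    # output-specific smoothing widths (assumed)

# Joint covariance of [f1(X), f2(X)]: blocks G_d Ku G_d'^T.
K = np.block([[G1 @ Ku @ G1.T, G1 @ Ku @ G2.T],
              [G2 @ Ku @ G1.T, G2 @ Ku @ G2.T]])

# Draw one correlated sample of both outputs.
L = np.linalg.cholesky(K + 1e-8 * np.eye(K.shape[0]))
sample = L @ np.random.default_rng(1).standard_normal(K.shape[0])
f1, f2 = sample[:len(X)], sample[len(X):]
print(f1[:3].round(3), f2[:3].round(3))
```

Roughly speaking, the PITC/FITC-style approximations discussed in the paper replace this dense latent covariance with a structured low-rank form; the toy version above keeps the full matrices for clarity.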

274 citations

Proceedings Article
01 May 2017
TL;DR: In this paper, a doubly stochastic variational inference algorithm for DGPs is proposed, which does not force independence between layers and can be used for both classification and regression.
Abstract: Deep Gaussian processes (DGPs) are multi-layer generalizations of GPs, but inference in these models has proved challenging. Existing approaches to inference in DGP models assume approximate posteriors that force independence between the layers, and do not work well in practice. We present a doubly stochastic variational inference algorithm, which does not force independence between layers. With our method of inference we demonstrate that a DGP model can be used effectively on data ranging in size from hundreds to a billion points. We provide strong empirical evidence that our inference scheme for DGPs works well in practice in both classification and regression.
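A conceptual sketch of the layer-wise sampling at the heart of such a doubly stochastic scheme (not the authors' code): Monte Carlo samples are propagated through the layers with the reparameterization trick, so the approximate posterior does not force independence between layers. The `layer.predict(F) -> (mean, var)` interface is a hypothetical stand-in for a sparse-GP layer's marginal posterior.

```python
# Conceptual sketch: propagate Monte Carlo samples through DGP layers.
import numpy as np

rng = np.random.default_rng(0)

def propagate(x_batch, layers, num_samples=10):
    """Draw num_samples samples of the final-layer output for a minibatch.

    x_batch: array of shape (N, D). Each element of `layers` is assumed to
    expose predict(F) -> (mean, var), the marginal posterior of that layer's
    GP evaluated at the previous layer's sampled outputs F (this interface is
    an assumption made for the sketch).
    """
    F = np.repeat(x_batch[None, :, :], num_samples, axis=0)   # shape (S, N, D)
    for layer in layers:
        mean, var = layer.predict(F)
        eps = rng.standard_normal(mean.shape)
        F = mean + np.sqrt(var) * eps    # reparameterized sample; each sample's
                                         # path stays coupled across layers
    return F
```

Averaging a likelihood over such samples, combined with minibatching over data points, gives the two sources of stochasticity that the "doubly stochastic" name refers to.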

274 citations


Network Information
Related Topics (5)
Estimator
97.3K papers, 2.6M citations
87% related
Optimization problem
96.4K papers, 2.1M citations
85% related
Artificial neural network
207K papers, 4.5M citations
84% related
Support vector machine
73.6K papers, 1.7M citations
82% related
Deep learning
79.8K papers, 2.1M citations
82% related
Performance Metrics
No. of papers in the topic in previous years:
Year    Papers
2023    502
2022    1,181
2021    1,132
2020    1,220
2019    1,119
2018    978