Open Access Proceedings Article

Gaussian Processes for Big Data

TL;DR
In this paper, the authors introduce stochastic variational inference for Gaussian process models, which enables the application of Gaussian process (GP) models to data sets containing millions of data points.
Abstract
We introduce stochastic variational inference for Gaussian process models. This enables the application of Gaussian process (GP) models to data sets containing millions of data points. We show how GPs can be variationally decomposed to depend on a set of globally relevant inducing variables which factorize the model in the necessary manner to perform variational inference. Our approach is readily extended to models with non-Gaussian likelihoods and latent variable models based around Gaussian processes. We demonstrate the approach on a simple toy problem and two real world data sets.
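In practice, this scheme is what stochastic variational GP implementations such as GPflow's SVGP model provide. Below is a minimal sketch of minibatch training, assuming GPflow 2.x and TensorFlow 2.x; the synthetic data, kernel choice, and optimization settings are illustrative assumptions, not taken from the paper.

```python
import numpy as np
import tensorflow as tf
import gpflow

# Synthetic stand-in for a large regression data set (illustrative only).
N = 100_000
X = np.random.rand(N, 1)
Y = np.sin(12 * X) + 0.1 * np.random.randn(N, 1)

# M inducing points with M << N act as the globally relevant variables
# that summarize the posterior.
M = 50
Z = X[np.random.choice(N, M, replace=False)].copy()

model = gpflow.models.SVGP(
    kernel=gpflow.kernels.SquaredExponential(),
    likelihood=gpflow.likelihoods.Gaussian(),
    inducing_variable=Z,
    num_data=N,  # rescales the minibatch ELBO to the full data set
)

batches = iter(
    tf.data.Dataset.from_tensor_slices((X, Y)).repeat().shuffle(N).batch(256)
)
optimizer = tf.keras.optimizers.Adam(learning_rate=0.01)

# Stochastic variational inference: each step follows a noisy gradient
# of the evidence lower bound estimated from a single minibatch.
for _ in range(2000):
    batch = next(batches)
    with tf.GradientTape() as tape:
        loss = -model.elbo(batch)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
```

Because each update touches only a minibatch, the per-step cost is independent of N (it is dominated by the M x M inducing-point computations), which is what makes data sets with millions of points feasible.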



Citations
Journal Article

Variational Inference: A Review for Statisticians

TL;DR: Reviews variational inference (VI), a method from machine learning that approximates probability densities through optimization; VI is used in many applications, tends to be faster than classical methods such as Markov chain Monte Carlo sampling, and has a variant that uses stochastic optimization to scale up to massive data.
Journal Article

Probabilistic machine learning and artificial intelligence

TL;DR: This review provides an introduction to the probabilistic machine learning framework and discusses some of the state-of-the-art advances in the field, namely probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
Journal Article

Hidden physics models: Machine learning of nonlinear partial differential equations

TL;DR: In this article, a new paradigm for learning partial differential equations from small data is presented: data-efficient learning machines that leverage the underlying laws of physics, expressed by time-dependent and nonlinear partial differential equations, to extract patterns from high-dimensional data generated from experiments.
Proceedings Article

Deep Neural Networks as Gaussian Processes

TL;DR: The exact equivalence between infinitely wide deep networks and GPs is derived, and it is found that test performance increases as finite-width trained networks are made wider and more similar to a GP, and thus that GP predictions typically outperform those of finite-width networks.
References
Book

Gaussian Processes for Machine Learning

TL;DR: The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
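For context, the exact GP regression predictor derived in this book scales cubically in the number of training points, which is the bottleneck the paper above targets. In the standard notation (assumed here, not quoted from the listing), with training targets y, noise variance sigma_n^2, and kernel matrices K:

$$\bar{\mathbf{f}}_* = \mathbf{K}_{*n}\left(\mathbf{K}_{nn} + \sigma_n^2 \mathbf{I}\right)^{-1}\mathbf{y}, \qquad \operatorname{cov}(\mathbf{f}_*) = \mathbf{K}_{**} - \mathbf{K}_{*n}\left(\mathbf{K}_{nn} + \sigma_n^2 \mathbf{I}\right)^{-1}\mathbf{K}_{n*},$$

where the $O(n^3)$ cost comes from the inverse of the $n \times n$ matrix $\mathbf{K}_{nn} + \sigma_n^2 \mathbf{I}$.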
Journal Article

Stochastic variational inference

TL;DR: Stochastic variational inference makes it possible to apply complex Bayesian models to massive data sets, and the Bayesian nonparametric topic model is shown to outperform its parametric counterpart.
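The central recipe in stochastic variational inference, as this reference develops it, is a stochastic natural-gradient update on the global variational parameters (the notation below is the standard one, stated here as an assumption of this note):

$$\lambda^{(t)} = (1 - \rho_t)\,\lambda^{(t-1)} + \rho_t\,\hat{\lambda}_t, \qquad \sum_t \rho_t = \infty, \quad \sum_t \rho_t^2 < \infty,$$

where $\hat{\lambda}_t$ is the coordinate-ascent update computed as if a single sampled data point (or minibatch) were replicated across the whole data set; under the Robbins-Monro step-size conditions above, the iterates converge to a local optimum of the ELBO.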
Journal Article

A Unifying View of Sparse Approximate Gaussian Process Regression

TL;DR: A new unifying view that covers all existing proper probabilistic sparse approximations for Gaussian process regression is presented; it relies on expressing the effective prior that each method uses and highlights the relationships between existing methods.
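The "effective prior" idea can be stated in one line (standard notation assumed, not quoted from the entry): each sparse approximation replaces the exact joint GP prior over test and training function values with one that factorizes through the inducing variables u,

$$p(\mathbf{f}_*, \mathbf{f}) \approx q(\mathbf{f}_*, \mathbf{f}) = \int q(\mathbf{f}_* \mid \mathbf{u})\, q(\mathbf{f} \mid \mathbf{u})\, p(\mathbf{u})\, \mathrm{d}\mathbf{u},$$

and different choices of the two approximate conditionals recover methods such as SoR, DTC, and FITC.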
Proceedings Article

Variational Learning of Inducing Variables in Sparse Gaussian Processes

TL;DR: A variational formulation for sparse approximations is presented that jointly infers the inducing inputs and the kernel hyperparameters by maximizing a lower bound on the true log marginal likelihood.
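For a Gaussian likelihood with noise variance sigma^2, the collapsed bound in question takes the following form (the notation is the usual one for this literature, an assumption of this note):

$$\mathcal{L} = \log \mathcal{N}\left(\mathbf{y} \mid \mathbf{0},\, \mathbf{Q}_{nn} + \sigma^2 \mathbf{I}\right) - \frac{1}{2\sigma^2}\operatorname{tr}\left(\mathbf{K}_{nn} - \mathbf{Q}_{nn}\right), \qquad \mathbf{Q}_{nn} = \mathbf{K}_{nm}\mathbf{K}_{mm}^{-1}\mathbf{K}_{mn}.$$

The trace term penalizes inducing inputs that represent the data poorly, which is what allows them to be optimized jointly with the kernel hyperparameters; "Gaussian Processes for Big Data" builds on an uncollapsed version of this bound that factorizes across data points.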
Journal Article

Probabilistic Non-linear Principal Component Analysis with Gaussian Process Latent Variable Models

TL;DR: A novel probabilistic interpretation of principal component analysis (PCA) is proposed, based on a Gaussian process latent variable model (GP-LVM) and related to popular spectral techniques such as kernel PCA and multidimensional scaling.