Open Access Book

Gaussian Processes for Machine Learning

TLDR
The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
Abstract
A comprehensive and self-contained introduction to Gaussian processes (GPs), which provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics. The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed from both a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines and others. Theoretical issues including learning curves and the PAC-Bayesian framework are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises, and code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes.
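The exact GP regression predictor summarized in the abstract can be sketched in a few lines of numpy; this is a minimal illustration with a squared-exponential kernel, a zero prior mean, and arbitrary toy data and hyperparameters (not taken from the book).

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, signal_var=1.0):
    """Squared-exponential covariance: k(x, x') = s^2 exp(-(x - x')^2 / (2 l^2))."""
    sq_dist = (X1[:, None] - X2[None, :]) ** 2
    return signal_var * np.exp(-0.5 * sq_dist / length_scale**2)

def gp_posterior(X_train, y_train, X_test, noise_var=1e-2):
    """Exact GP regression posterior mean and variance (zero prior mean)."""
    K = rbf_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)                # Cholesky for a stable solve
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                     # posterior mean at test points
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)  # posterior variance
    return mean, var

# Toy example: fit noisy samples of sin(x), predict on a finer grid.
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 8)
y = np.sin(X) + 0.05 * rng.standard_normal(8)
Xs = np.linspace(0, 2 * np.pi, 50)
mu, var = gp_posterior(X, y, Xs)
```

The Cholesky-based solve mirrors the standard numerically stable formulation of GP regression; the posterior variance shrinks near training inputs and grows away from them.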



Citations
Journal ArticleDOI

Entropy search for information-efficient global optimization

TL;DR: The authors develop desiderata for probabilistic optimization algorithms, then present a concrete algorithm that addresses each of the computational intractabilities with a sequence of approximations and explicitly addresses the decision problem of maximizing information gain from each evaluation.
Posted Content

DESIRE: Distant Future Prediction in Dynamic Scenes with Interacting Agents

TL;DR: In this paper, a deep stochastic IOC RNN encoder-decoder framework, DESIRE, is proposed to predict future locations of objects in multiple scenes by accounting for the multi-modal nature of future prediction (i.e., given the same context, the future may vary).
Journal Article

Distinguishing cause from effect using observational data: methods and benchmarks

TL;DR: Empirical results on real-world data indicate that certain methods are indeed able to distinguish cause from effect using only purely observational data, although more benchmark data would be needed to obtain statistically significant conclusions.
Journal ArticleDOI

Applying Bayesian parameter estimation to relativistic heavy-ion collisions: simultaneous characterization of the initial state and quark-gluon plasma medium

TL;DR: In this article, the authors quantitatively estimate properties of the quark-gluon plasma created in ultrarelativistic heavy-ion collisions utilizing Bayesian statistics and a multiparameter model-to-data comparison.
Proceedings ArticleDOI

Active Learning with Gaussian Processes for Object Categorization

TL;DR: This work derives a novel active category learning method based on the probabilistic regression model, and shows that a significant boost in classification performance is possible, especially when the amount of training data for a category is very small.
References
Book

The Nature of Statistical Learning Theory

TL;DR: Covers the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Book

Matrix Computations

Gene H. Golub
Book

Convex Optimization

TL;DR: This book gives a comprehensive introduction to convex optimization, with a focus on recognizing convex optimization problems and then finding the most appropriate technique for solving them.

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of learning processes, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Book

The Fractal Geometry of Nature

TL;DR: This book is a blend of erudition, popularization, and exposition, and the illustrations include many superb examples of computer graphics that are works of art in their own right.