Open Access Book
Gaussian Processes for Machine Learning
TLDR
The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.

Abstract:
Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics. The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed from both a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines, and others. Theoretical issues, including learning curves and the PAC-Bayesian framework, are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises, and code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes.
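The core idea the abstract describes, GP regression with a covariance (kernel) function, can be illustrated with a short sketch. This is not the book's own code; it is a minimal NumPy implementation of the standard GP posterior under a zero-mean prior with a squared-exponential kernel, using the Cholesky-based computation the GP literature recommends for numerical stability. Kernel hyperparameters and the noise level are arbitrary illustrative choices.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance: k(x, x') = s^2 * exp(-(x - x')^2 / (2 l^2))."""
    sq_dist = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dist / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    """Posterior mean and variance of a zero-mean GP at the test inputs."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    # Cholesky factorisation instead of a direct inverse, for stability.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                      # posterior mean
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)  # posterior variance
    return mean, var

# Toy regression problem: recover sin(x) from 10 noisy-free samples.
X = np.linspace(0, 5, 10)
y = np.sin(X)
mean, var = gp_posterior(X, y, np.array([2.5]))
```

The posterior mean interpolates the training data, and the posterior variance shrinks near observed inputs and grows away from them, which is the probabilistic behaviour the abstract refers to.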
Citations
Proceedings ArticleDOI
Are we ready for autonomous driving? The KITTI vision benchmark suite
TL;DR: The autonomous driving platform is used to develop novel challenging benchmarks for the tasks of stereo, optical flow, visual odometry/SLAM, and 3D object detection, revealing that methods ranking high on established datasets such as Middlebury perform below average when moved outside the laboratory to the real world.
Journal Article
Random search for hyper-parameter optimization
James Bergstra, Yoshua Bengio
TL;DR: This paper shows empirically and theoretically that randomly chosen trials are more efficient for hyper-parameter optimization than trials on a grid, and shows that random search is a natural baseline against which to judge progress in the development of adaptive (sequential) hyper-parameter optimization algorithms.
Proceedings Article
Practical Bayesian Optimization of Machine Learning Algorithms
TL;DR: This work describes new algorithms that take into account the variable cost of learning algorithm experiments and that can leverage the presence of multiple cores for parallel experimentation and shows that these proposed algorithms improve on previous automatic procedures and can reach or surpass human expert-level optimization for many algorithms.
Journal ArticleDOI
Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
TL;DR: In this article, the authors introduce physics-informed neural networks, which are trained to solve supervised learning tasks while respecting any given laws of physics described by general nonlinear partial differential equations.
Journal ArticleDOI
The future of employment: How susceptible are jobs to computerisation?
TL;DR: In this paper, a Gaussian process classifier was used to estimate the probability of computerisation for 702 detailed occupations and the expected impacts of future computerisation on US labour market outcomes, with the primary objective of analysing the number of jobs at risk and the relationship between an occupation's probability of computerisation, wages, and educational attainment.
References
Proceedings Article
Fast Forward Selection to Speed Up Sparse Gaussian Process Regression
TL;DR: A method for the sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection; it yields a sufficiently stable approximation of the log marginal likelihood of the training data, which can be optimised to adjust a large number of hyperparameters automatically.
BookDOI
Image Analysis, Random Fields and Dynamic Monte Carlo Methods
TL;DR: The book is mainly concerned with the mathematical foundations of Bayesian image analysis and its algorithms, which amounts to the study of Markov random fields and dynamic Monte Carlo algorithms like sampling, simulated annealing and stochastic gradient algorithms.
Journal Article
Gaussian Processes for Ordinal Regression
Wei Chu, Zoubin Ghahramani
TL;DR: A probabilistic kernel approach to ordinal regression based on Gaussian processes is presented, where a threshold model that generalizes the probit function is used as the likelihood function for ordinal variables.
Dissertation
Evaluation of gaussian processes and other methods for non-linear regression
TL;DR: It is shown that a Bayesian approach to learning in multi-layer perceptron neural networks achieves better performance than the commonly used early stopping procedure, even for reasonably short amounts of computation time.
Posted Content
Monte Carlo Implementation of Gaussian Process Models for Bayesian Regression and Classification
TL;DR: Software is now available that implements Gaussian process methods using covariance functions with hierarchical parameterizations, which can discover high-level properties of the data, such as which inputs are relevant to predicting the response.