Open Access Book
Gaussian Processes for Machine Learning
TLDR
The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
Abstract
Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics. The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed from both a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines, and others. Theoretical issues including learning curves and the PAC-Bayesian framework are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises, and code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes.
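The GP regression setting the abstract describes can be illustrated with a minimal sketch: given noisy training observations and a covariance (kernel) function, the GP posterior yields a predictive mean and variance in closed form. This is a generic illustration using a squared-exponential kernel and NumPy, not code from the book; the function names and hyperparameter values are assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between 1-D point sets a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_posterior(x_train, y_train, x_test, noise=0.1):
    """Posterior mean and variance of a zero-mean GP at x_test."""
    K = rbf_kernel(x_train, x_train) + noise ** 2 * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)                      # K = L L^T (stable inversion)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                           # predictive mean
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v ** 2, axis=0)   # predictive variance
    return mean, var

# Fit a GP to three noisy-free samples of sin(x) and predict at x = 0.
x = np.array([-2.0, 0.0, 1.5])
y = np.sin(x)
mu, var = gp_posterior(x, y, np.array([0.0]))
```

Because x = 0 is itself a training input with observed value sin(0) = 0, the predictive mean stays close to zero and the predictive variance shrinks toward the noise level there.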
Citations
Journal Article
Monthly streamflow forecasting using Gaussian Process Regression
TL;DR: Gaussian Process Regression (GPR), an effective kernel-based machine learning algorithm, is applied to probabilistic streamflow forecasting; the results indicate relatively strong persistence of streamflow predictability in the extended period, although the low-predictability basins tend to show more variation.
Proceedings Article
Weakly supervised structured output learning for semantic segmentation
TL;DR: A parametric family of structured models is defined, where each model weights visual cues in a different way, and a Maximum Expected Agreement model-selection principle is proposed that evaluates the quality of a model from the family without looking at superpixel labels.
Journal Article
Underwater Data Collection Using Robotic Sensor Networks
Geoffrey A. Hollinger,Sunav Choudhary,Parastoo Qarabaqi,C. Murphy,Urbashi Mitra,Gaurav S. Sukhatme,Milica Stojanovic,Hanumant Singh,Franz S. Hover +8 more
TL;DR: In this paper, an AUV is used to collect data from an underwater sensor network, where the AUV must plan a path that maximizes the information collected while minimizing travel time or fuel expenditure.
Journal Article
Learning control Lyapunov function to ensure stability of dynamical system-based robot reaching motions
TL;DR: An imitation learning approach that exploits the power of the Control Lyapunov Function (CLF) control scheme to ensure global asymptotic stability of nonlinear dynamical-system (DS) motions, and allows learning a larger set of robot motions compared to existing methods based on quadratic Lyapunov functions.
Journal Article
Learning-based Nonlinear Model Predictive Control to Improve Vision-based Mobile Robot Path Tracking
TL;DR: The results show that the system can start from a generic a priori vehicle model, subsequently learn to reduce vehicle- and trajectory-specific path-tracking errors based on experience, and balance trial time, path-tracking errors, and localization reliability based on previous experience.
References
Book
The Nature of Statistical Learning Theory
TL;DR: Covers the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Book
Convex Optimization
Stephen Boyd,Lieven Vandenberghe +1 more
TL;DR: A comprehensive introduction to convex optimization, with the focus on recognizing convex optimization problems and then finding the most appropriate technique for solving them, rather than on optimization theory itself.
Statistical learning theory
TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of learning processes, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Book
The Fractal Geometry of Nature
TL;DR: This book is a blend of erudition, popularization, and exposition, and the illustrations include many superb examples of computer graphics that are works of art in their own right.