
John Shawe-Taylor

Researcher at University College London

Publications: 522
Citations: 56,300

John Shawe-Taylor is an academic researcher at University College London. He has contributed to research in topics including support vector machines and kernel methods, has an h-index of 72, and has co-authored 503 publications receiving 52,369 citations. His previous affiliations include Royal Holloway, University of London and the Université de Montréal.

Papers
Book

An Introduction to Support Vector Machines and Other Kernel-based Learning Methods

TL;DR: This book is the first comprehensive introduction to Support Vector Machines (SVMs), a new-generation learning system based on recent advances in statistical learning theory, and it guides practitioners to updated literature, new applications, and on-line software.
Book

Kernel Methods for Pattern Analysis

TL;DR: This book provides an accessible introduction for students and researchers to the growing field of kernel-based pattern analysis, demonstrating with examples how to handcraft an algorithm or a kernel for a specific new application and covering the conceptual and mathematical tools needed to do so.
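
As a loose illustration of the book's theme of handcrafting a kernel for a new application, the sketch below combines two standard kernels into a custom similarity and plugs it into scikit-learn's SVC as a callable kernel. The mixture weights, data, and labels are illustrative assumptions, not taken from the book.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel

def mixed_kernel(A, B):
    # A convex combination of valid kernels is itself a valid kernel,
    # so this handcrafted similarity can be used by any kernel method.
    # The 0.5/0.5 weighting is an arbitrary choice for the example.
    return 0.5 * polynomial_kernel(A, B, degree=2) + 0.5 * rbf_kernel(A, B, gamma=0.5)

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 4))                 # toy inputs (hypothetical data)
y = (X[:, 0] * X[:, 1] > 0).astype(int)      # toy labels with nonlinear structure

clf = SVC(kernel=mixed_kernel, C=1.0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```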
Journal Article

Estimating the Support of a High-Dimensional Distribution

TL;DR: In this paper, the authors propose a method for estimating a region S of input space that captures most of the probability mass of the data-generating distribution, by learning a function f that is positive on S and negative on its complement. The functional form of f is given by a kernel expansion in terms of a potentially small subset of the training data; it is regularized by controlling the length of the weight vector in an associated feature space.
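
A minimal sketch of this idea, assuming scikit-learn's OneClassSVM (which implements this nu-parameterized formulation) and purely synthetic data; the parameter values below are illustrative, not taken from the paper's experiments.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_train = rng.normal(loc=0.0, scale=1.0, size=(200, 5))    # points assumed drawn from the region S
X_test = np.vstack([rng.normal(0.0, 1.0, size=(10, 5)),    # points from the same region
                    rng.normal(6.0, 1.0, size=(10, 5))])   # points far outside S

# nu upper-bounds the fraction of training points allowed to fall outside the
# estimated region and lower-bounds the fraction of support vectors.
clf = OneClassSVM(kernel="rbf", gamma=0.1, nu=0.1)
clf.fit(X_train)

# decision_function is the learned f: positive inside the estimated support,
# negative on its complement; predict maps this to +1 / -1.
print(clf.decision_function(X_test))
print(clf.predict(X_test))
```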
Book

An Introduction to Support Vector Machines

TL;DR: This book is the first comprehensive introduction to Support Vector Machines, a new-generation learning system based on recent advances in statistical learning theory. It also introduces Bayesian analysis of learning and relates SVMs to Gaussian processes and other kernel-based learning methods.
Proceedings Article

Large Margin DAGs for Multiclass Classification

TL;DR: The paper presents an algorithm, DAGSVM, which operates in a kernel-induced feature space and uses a two-class maximal-margin hyperplane at each decision node of the DDAG. It is substantially faster to train and evaluate than either the standard algorithm or Max Wins, while maintaining comparable accuracy to both.
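
To make the DDAG idea concrete, here is a rough sketch under stated assumptions: one pairwise maximal-margin classifier (scikit-learn's SVC) is trained per pair of classes, and prediction walks a decision list that eliminates one candidate class per node. The dataset and hyperparameters are placeholders, and this is a sketch of the evaluation scheme rather than the authors' original implementation.

```python
from itertools import combinations
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
classes = list(np.unique(y))

# One two-class maximal-margin classifier per (i, j) pair,
# trained only on the examples of those two classes.
pairwise = {}
for i, j in combinations(classes, 2):
    mask = np.isin(y, [i, j])
    pairwise[(i, j)] = SVC(kernel="rbf", gamma="scale", C=1.0).fit(X[mask], y[mask])

def ddag_predict(x):
    """Evaluate the DDAG: each node compares the first and last remaining
    candidate classes and discards the loser, until one class is left."""
    remaining = list(classes)
    while len(remaining) > 1:
        i, j = remaining[0], remaining[-1]
        node = pairwise[(i, j)] if (i, j) in pairwise else pairwise[(j, i)]
        winner = node.predict(x.reshape(1, -1))[0]
        remaining.remove(j if winner == i else i)
    return remaining[0]

preds = np.array([ddag_predict(x) for x in X])
print("training accuracy:", (preds == y).mean())
```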