Fast rates for support vector machines using Gaussian kernels
Ingo Steinwart, Clint Scovel, et al.
TLDR
This work uses concepts such as Tsybakov's noise assumption and local Rademacher averages to establish learning rates up to the order of $n^{-1}$ for nontrivial distributions, and introduces a geometric assumption on distributions that allows the approximation properties of Gaussian RBF kernels to be estimated.
Abstract
For binary classification we establish learning rates up to the order of $n^{-1}$ for support vector machines (SVMs) with hinge loss and Gaussian RBF kernels. These rates are given in terms of two assumptions on the considered distributions: Tsybakov's noise assumption, used to establish a small estimation error, and a new geometric noise condition, used to bound the approximation error. Unlike previously proposed concepts for bounding the approximation error, the geometric noise assumption does not employ any smoothness assumption.
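The object the paper analyzes is the standard soft-margin SVM: a regularized hinge-loss minimizer over the reproducing kernel Hilbert space of a Gaussian RBF kernel. A minimal sketch using scikit-learn is below; the synthetic dataset and the hyperparameters `C` and `gamma` are illustrative choices, not values from the paper.

```python
# Sketch of a hinge-loss SVM with a Gaussian RBF kernel, the learning
# method whose rates the paper studies. Data and hyperparameters are
# illustrative assumptions, not taken from the paper.
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# A nontrivial (non-linearly-separable) binary classification problem.
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)

# SVC minimizes a regularized hinge loss; kernel="rbf" corresponds to
# the Gaussian kernel k(x, x') = exp(-gamma * ||x - x'||^2).
clf = SVC(kernel="rbf", C=1.0, gamma=1.0)
clf.fit(X, y)

train_acc = clf.score(X, y)
print(f"training accuracy: {train_acc:.2f}")
```

In the paper's analysis, the regularization parameter (here `C`) and the kernel width (here `gamma`) must be chosen as functions of the sample size $n$ to trade off estimation against approximation error and attain the stated rates.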
Citations
Journal Article
On Early Stopping in Gradient Descent Learning
TL;DR: A family of gradient descent algorithms for approximating the regression function from reproducing kernel Hilbert spaces (RKHSs) is studied, the family being characterized by a polynomially decreasing step size (or learning rate).
Journal Article
Estimating Individualized Treatment Rules Using Outcome Weighted Learning
TL;DR: This article shows that estimating an optimal ITR, a deterministic function of patient-specific characteristics that maximizes expected clinical outcome, is equivalent to a classification problem in which each subject is weighted proportionally to his or her clinical outcome, and proposes an outcome weighted learning approach based on the support vector machine framework.
Book
Oracle Inequalities in Empirical Risk Minimization and Sparse Recovery Problems: École d'Été de Probabilités de Saint-Flour XXXVIII-2008
TL;DR: The purpose of these lecture notes is to provide an introduction to the general theory of empirical risk minimization with an emphasis on excess risk bounds and oracle inequalities in penalized problems.
Journal Article
Classification with a Reject Option using a Hinge Loss
Peter L. Bartlett,Marten Wegkamp +1 more
TL;DR: This work considers the problem of binary classification where the classifier can, for a particular cost, choose not to classify an observation and proposes a certain convex loss function φ, analogous to the hinge loss used in support vector machines (SVMs).
Journal Article
Fast learning rates for plug-in classifiers
TL;DR: This work constructs plug-in classifiers that can achieve not only fast but also super-fast rates, that is, rates faster than $n^{-1}$, and establishes minimax lower bounds showing that the obtained rates cannot be improved.
References
Book
An Introduction to Support Vector Machines and Other Kernel-based Learning Methods
TL;DR: This is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation learning system based on recent advances in statistical learning theory, and will guide practitioners to updated literature, new applications, and on-line software.
Book
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
TL;DR: Learning with Kernels provides an introduction to SVMs and related kernel methods that provide all of the concepts necessary to enable a reader equipped with some basic mathematical knowledge to enter the world of machine learning using theoretically well-founded yet easy-to-use kernel algorithms.
Book
Methods of Mathematical Physics
Richard Courant, David Hilbert, et al.
TL;DR: In this paper, the authors develop the algebra of linear transformations and quadratic forms and apply it to eigenvalue and variational problems.
Journal Article
Theory of Reproducing Kernels.
TL;DR: In this paper, a short historical introduction is given to indicate the different manners in which these kernels have been used by various investigators, and to discuss the more important trends in the application of these kernels, without attempting a complete bibliography of the subject matter.
Related Papers
Fundamental error bounds in state estimation: An information-theoretic analysis
Song Fang, Jie Chen, Hideaki Ishii, et al.