Proceedings Article (Open Access)
Model Selection for Support Vector Machines
Olivier Chapelle, Vladimir Vapnik, et al.
Vol. 12, pp. 230–236
Abstract: New functionals for parameter (model) selection of Support Vector Machines are introduced, based on the concepts of the span of support vectors and rescaling of the feature space. It is shown that, using these functionals, one can both predict the best choice of parameters of the model and the relative quality of performance for any value of the parameter.
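The abstract describes functionals that estimate how well an SVM will generalize for each candidate parameter setting, so that the best setting can be picked without a separate test set. The span-based estimate itself is not reproduced here; as a minimal sketch of the model-selection loop it targets, the following uses k-fold cross-validation error as the quality estimate, with a simple RBF kernel ridge classifier standing in for an SVM so the example stays dependency-free. All function names and the toy data are illustrative, not from the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma):
    # Pairwise squared Euclidean distances, then Gaussian (RBF) kernel.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def cv_error(X, y, gamma, lam, k=5):
    """k-fold cross-validation misclassification rate for a kernel
    ridge classifier; a stand-in for the paper's quality functionals."""
    n = len(y)
    idx = np.arange(n)
    errs = []
    for fold in np.array_split(idx, k):
        tr = np.setdiff1d(idx, fold)
        K = rbf_kernel(X[tr], X[tr], gamma)
        alpha = np.linalg.solve(K + lam * np.eye(len(tr)), y[tr])
        pred = np.sign(rbf_kernel(X[fold], X[tr], gamma) @ alpha)
        errs.append(np.mean(pred != y[fold]))
    return float(np.mean(errs))

def select_params(X, y, gammas, lams):
    """Pick (gamma, lam) minimising the estimated error over a grid."""
    return min((cv_error(X, y, g, l), g, l) for g in gammas for l in lams)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy two-class problem: two well-separated Gaussian blobs.
    X = np.vstack([rng.normal(-1, 0.5, (40, 2)), rng.normal(1, 0.5, (40, 2))])
    y = np.array([-1.0] * 40 + [1.0] * 40)
    err, g, l = select_params(X, y, gammas=[0.1, 1.0, 10.0], lams=[0.01, 0.1, 1.0])
    print(err, g, l)
```

The paper's contribution is precisely to replace the expensive cross-validation estimate above with cheaper functionals (the span of support vectors) that predict the same ranking of parameter values.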
Citations
Book
Semi-Supervised Learning
TL;DR: Semi-supervised learning (SSL), as discussed by the authors, is the middle ground between supervised learning (in which all training examples are labeled) and unsupervised learning (in which no labeled data are given).
Journal Article
Choosing Multiple Parameters for Support Vector Machines
TL;DR: The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered by minimizing some estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters.
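The cited work tunes multiple SVM parameters by gradient descent on a differentiable estimate of the generalization error. As a hedged illustration of that idea only (not the paper's actual span or radius-margin bounds), the sketch below runs finite-difference gradient descent over log(gamma), minimizing a held-out squared loss of a kernel ridge fit; all names and settings are assumptions for the example.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma):
    # Pairwise squared distances, then Gaussian (RBF) kernel.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def val_loss(X_tr, y_tr, X_va, y_va, log_gamma, lam=0.1):
    """Held-out squared loss of a kernel ridge fit: a smooth stand-in
    for the differentiable error estimates minimised in the paper."""
    gamma = np.exp(log_gamma)
    K = rbf_kernel(X_tr, X_tr, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(y_tr)), y_tr)
    pred = rbf_kernel(X_va, X_tr, gamma) @ alpha
    return float(np.mean((pred - y_va) ** 2))

def tune_gamma(X_tr, y_tr, X_va, y_va, log_gamma=0.0, lr=0.2, steps=25, eps=1e-3):
    """Finite-difference gradient descent on the validation loss over
    log(gamma); returns the best gamma seen and its validation loss."""
    best_lg, best_loss = log_gamma, val_loss(X_tr, y_tr, X_va, y_va, log_gamma)
    for _ in range(steps):
        grad = (val_loss(X_tr, y_tr, X_va, y_va, log_gamma + eps)
                - val_loss(X_tr, y_tr, X_va, y_va, log_gamma - eps)) / (2 * eps)
        log_gamma -= lr * grad
        loss = val_loss(X_tr, y_tr, X_va, y_va, log_gamma)
        if loss < best_loss:
            best_lg, best_loss = log_gamma, loss
    return np.exp(best_lg), best_loss
```

Optimizing in log space keeps gamma positive by construction; with many kernel parameters, the same descent scheme extends coordinate-wise, which is what makes the gradient approach scale where grid search does not.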
Journal Article
Practical selection of SVM parameters and noise estimation for SVM regression
Vladimir Cherkassky, Yunqian Ma, et al.
TL;DR: This work describes a new analytical prescription for setting the value of the insensitive zone ε as a function of training sample size, and compares the generalization performance of SVM regression in sparse-sample settings with regression using least-modulus loss (ε = 0) and standard squared loss.
Journal Article
Regularization Networks and Support Vector Machines
TL;DR: Both formulations, regularization networks and Support Vector Machines, are reviewed in the context of Vapnik's theory of statistical learning, which provides a general foundation for the learning problem by combining functional analysis and statistics.
Book
Adaptive Computation and Machine Learning
TL;DR: This book attempts to give an overview of the different recent efforts to deal with covariate shift, a challenging situation where the joint distribution of inputs and outputs differs between the training and test stages.
References
Book
The Nature of Statistical Learning Theory
TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?
Statistical learning theory
TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, applying these estimates to real-life problems, and much more.
Journal ArticleDOI
A Tutorial on Support Vector Machines for Pattern Recognition
TL;DR: Several arguments that support the observed high accuracy of SVMs are reviewed, and numerous examples and proofs of most of the key theorems are given.
Book
Nonlinear Programming: Theory and Algorithms
TL;DR: The book is a solid reference for professionals as well as a useful text for students in the fields of operations research, management science, industrial engineering, applied mathematics, and also in engineering disciplines that deal with analytical optimization techniques.