Choosing Multiple Parameters for Support Vector Machines
TL;DR: The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered by minimizing estimates of the generalization error of SVMs with a gradient descent algorithm over the set of parameters.

Abstract: The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered. This is done by minimizing some estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters. Usual methods for choosing parameters, based on exhaustive search, become intractable as soon as the number of parameters exceeds two. Some experimental results assess the feasibility of our approach for a large number of parameters (more than 100) and demonstrate an improvement of generalization performance.
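The core idea of the abstract — treating kernel parameters as variables and descending on an error estimate, rather than grid-searching — can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: they differentiate analytical bounds (e.g. span or radius-margin estimates) of the SVM itself, whereas this sketch uses a kernel ridge surrogate with a smooth validation loss and finite-difference gradients. All names (`rbf`, `val_error`, the per-feature scales `w`) are choices made here for illustration; with one scale per input feature, the same loop extends to the "more than 100 parameters" regime the paper targets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data: the label depends only on feature 0;
# feature 1 is noise, so tuning should learn to down-weight it.
X = rng.normal(size=(80, 2))
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=80))
X_tr, y_tr, X_va, y_va = X[:50], y[:50], X[50:], y[50:]

def rbf(A, B, w):
    """ARD-style RBF kernel with one inverse length-scale per feature."""
    d = (A[:, None, :] - B[None, :, :]) * w  # scaled pairwise differences
    return np.exp(-0.5 * np.sum(d ** 2, axis=-1))

def val_error(log_w, lam=1e-2):
    """Smooth validation loss of a kernel ridge classifier,
    used as a differentiable stand-in for the generalization error."""
    w = np.exp(log_w)  # optimize in log-space to keep scales positive
    K = rbf(X_tr, X_tr, w)
    alpha = np.linalg.solve(K + lam * np.eye(len(y_tr)), y_tr)
    f = rbf(X_va, X_tr, w) @ alpha
    return np.mean(np.log1p(np.exp(-y_va * f)))  # logistic loss

# Gradient descent over the set of kernel parameters, with
# central finite differences standing in for analytic gradients.
log_w, eps, lr = np.zeros(2), 1e-5, 0.5
for _ in range(100):
    g = np.array([(val_error(log_w + eps * e) - val_error(log_w - eps * e))
                  / (2 * eps) for e in np.eye(2)])
    log_w -= lr * g
```

The contrast with exhaustive search is the point: a grid with k values per parameter costs k^d model fits for d parameters, while each descent step here costs O(d) evaluations (and only O(1) with analytic gradients, as in the paper).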
Citations
Journal ArticleDOI
A new hybrid correction method for short-term load forecasting based on ARIMA, SVR and CSA
TL;DR: A new hybrid correction method based on autoregressive integrated moving average (ARIMA) model, support vector regression (SVR) and cuckoo search algorithm (CSA) to achieve a more reliable forecasting model is proposed.
Proceedings ArticleDOI
Leave-One-Out Kernel Optimization for Shadow Detection
TL;DR: The objective of this work is to detect shadows in images by posing this as a region-labeling problem, where each region corresponds to a group of superpixels; a new method for shadow removal based on region relighting is also proposed.
Journal ArticleDOI
Improving classification performance of Support Vector Machine by genetically optimising kernel shape and hyper-parameters
TL;DR: Numerical experiments show that the SVM algorithm using the proposed evolutionary kernel of kernels (eKoK) outperforms well-known classic kernels with optimised parameters, as well as a state-of-the-art convex linear kernel combination and an evolutionary linear kernel combination.
Journal ArticleDOI
An Efficient Approach to Integrating Radius Information into Multiple Kernel Learning
TL;DR: This paper proposes to incorporate the radius of the minimum enclosing ball (MEB) into MKL with the following advantages: more robust in the presence of outliers or noisy training samples; more computationally efficient by avoiding the quadratic optimization for computing the radius at each iteration; and readily solvable by the existing off-the-shelf MKL packages.
Journal ArticleDOI
Classification model selection via bilevel programming
TL;DR: This work proposes a bilevel program that is significantly more versatile than commonly used grid search procedures, enabling the use of models with many hyper-parameters, and demonstrates the practicality of this approach for model selection in machine learning.
References
Book
The Nature of Statistical Learning Theory
TL;DR: Covers the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Journal ArticleDOI
Support-Vector Networks
Corinna Cortes, Vladimir Vapnik +1 more
TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Statistical learning theory
TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of learning process, the author covers function estimates from small data pools, applying these estimations to real-life problems, and much more.
Book
An Introduction to Support Vector Machines and Other Kernel-based Learning Methods
TL;DR: This is the first comprehensive introduction to Support Vector Machines (SVMs), a new generation learning system based on recent advances in statistical learning theory, and will guide practitioners to updated literature, new applications, and on-line software.
Journal ArticleDOI
Molecular classification of cancer: class discovery and class prediction by gene expression monitoring.
Todd R. Golub, Donna K. Slonim, Pablo Tamayo, Christine Huard, Michelle Gaasenbeek, Jill P. Mesirov, Hilary A. Coller, Mignon L. Loh, James R. Downing, Michael A. Caligiuri, Clara D. Bloomfield, Eric S. Lander +12 more
TL;DR: A generic approach to cancer classification based on gene expression monitoring by DNA microarrays is described and applied to human acute leukemias as a test case and suggests a general strategy for discovering and predicting cancer classes for other types of cancer, independent of previous biological knowledge.