Open Access · Journal Article · DOI

Choosing Multiple Parameters for Support Vector Machines

TLDR
The problem of automatically tuning multiple parameters of pattern-recognition Support Vector Machines (SVMs) is addressed by minimizing estimates of their generalization error with a gradient descent algorithm over the set of parameters.
Abstract
The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered. This is done by minimizing some estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters. Usual methods for choosing parameters, based on exhaustive search, become intractable as soon as the number of parameters exceeds two. Some experimental results assess the feasibility of our approach for a large number of parameters (more than 100) and demonstrate an improvement of generalization performance.
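To make the idea concrete, the sketch below performs gradient descent over many kernel parameters (one scale per input feature of an anisotropic RBF kernel) in order to minimize an estimate of the generalization error. It is only an illustration under simplifying assumptions, not the authors' algorithm: the paper descends analytic gradients of theoretical estimates such as span or radius/margin bounds, whereas this sketch uses a held-out hinge loss as the error estimate and finite-difference gradients. The function names, the synthetic dataset, and all parameter values are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def aniso_rbf(theta):
    """Anisotropic RBF kernel with one scale parameter per input feature."""
    def kernel(A, B):
        diff = A[:, None, :] - B[None, :, :]             # shape (nA, nB, d)
        return np.exp(-np.sum(theta * diff ** 2, axis=2))
    return kernel

def error_estimate(log_theta, X_tr, y_tr, X_va, y_va, C=1.0):
    """Stand-in for the paper's bounds: mean hinge loss on a held-out split."""
    clf = SVC(kernel=aniso_rbf(np.exp(log_theta)), C=C).fit(X_tr, y_tr)
    margins = y_va * clf.decision_function(X_va)         # labels in {-1, +1}
    return np.mean(np.maximum(0.0, 1.0 - margins))

X, y = make_classification(n_samples=300, n_features=10, n_informative=4,
                           random_state=0)
y = 2 * y - 1                                            # map {0, 1} -> {-1, +1}
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.5, random_state=0)

# One tunable parameter per feature, optimized in log space.
log_theta = np.full(X.shape[1], -np.log(X.shape[1]))
lr, eps = 0.5, 1e-3
for it in range(20):
    base = error_estimate(log_theta, X_tr, y_tr, X_va, y_va)
    grad = np.zeros_like(log_theta)
    for k in range(log_theta.size):                      # finite-difference gradient
        bumped = log_theta.copy()
        bumped[k] += eps
        grad[k] = (error_estimate(bumped, X_tr, y_tr, X_va, y_va) - base) / eps
    log_theta -= lr * grad                               # gradient-descent step
    print(f"iter {it:2d}  error estimate {base:.3f}")
```

The per-feature scales play the role of the "more than 100" parameters mentioned in the abstract: driving a feature's scale toward zero effectively removes that feature from the kernel.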



Citations
Journal Article · DOI

New support vector algorithms with parametric insensitive/margin model

TL;DR: A modification of ν-support vector machines (ν-SVM) for regression and classification is described, and the use of a parametric insensitive/margin model with an arbitrary shape is demonstrated.
Journal Article · DOI

A multiple kernel support vector machine scheme for feature selection and rule extraction from gene expression data of cancer tissue

TL;DR: A novel rule-extraction approach using the information provided by the separating hyperplane and support vectors is proposed to improve the generalization capacity and comprehensibility of the rules and to reduce the computational complexity of the SVM.
Journal Article · DOI

Composite kernel learning

TL;DR: This work proposes Composite Kernel Learning to address the situation where distinct components give rise to a group structure among kernels; it describes the convexity of the learning problem and provides a general wrapper algorithm for computing solutions.
Journal Article · DOI

Leave-One-Out Bounds for Support Vector Regression Model Selection

TL;DR: The differentiability of the leave-one-out bounds is discussed, and experiments demonstrate that the proposed bounds are competitive with Bayesian SVR for parameter selection.
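For context, the quantity such bounds approximate can also be computed by brute force: refit the model once per held-out point and average the resulting errors. The sketch below does exactly that for SVR parameter selection with scikit-learn; it is not the paper's bounds (which avoid the n refits), and the dataset and parameter grid are chosen purely for illustration.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVR

# Brute-force leave-one-out error for SVR parameter selection.
# This is the expensive baseline that leave-one-out *bounds* approximate
# without refitting the model n times; the grid values are illustrative.
X, y = make_regression(n_samples=60, n_features=5, noise=0.3, random_state=0)

best = None
for C in (0.1, 1.0, 10.0):
    for gamma in (0.01, 0.1, 1.0):
        scores = cross_val_score(SVR(C=C, gamma=gamma), X, y,
                                 cv=LeaveOneOut(),
                                 scoring="neg_mean_squared_error")
        loo_mse = -scores.mean()                 # mean squared LOO residual
        if best is None or loo_mse < best[0]:
            best = (loo_mse, C, gamma)

print(f"LOO MSE {best[0]:.3f} at C={best[1]}, gamma={best[2]}")
```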
Journal Article · DOI

Spatial prediction of landslide hazard at the Luxi area (China) using support vector machines.

TL;DR: In this article, the authors investigated the potential application of GIS-based Support Vector Machines (SVM) with four kernel functions, i.e., radial basis function (RBF), polynomial (PL), sigmoid (SIG), and linear (LN), for landslide susceptibility mapping at Luxi city in Jiangxi province, China.
References
Book

The Nature of Statistical Learning Theory

TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?
Journal Article · DOI

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.

Book

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Book

An Introduction to Support Vector Machines and Other Kernel-based Learning Methods

TL;DR: This is the first comprehensive introduction to Support Vector Machines (SVMs), a new-generation learning system based on recent advances in statistical learning theory; it will guide practitioners to updated literature, new applications, and on-line software.
Journal Article · DOI

Molecular classification of cancer: class discovery and class prediction by gene expression monitoring.

TL;DR: A generic approach to cancer classification based on gene expression monitoring by DNA microarrays is described and applied to human acute leukemias as a test case, suggesting a general strategy for discovering and predicting cancer classes for other types of cancer, independent of previous biological knowledge.