Open Access · Journal Article · DOI

Choosing Multiple Parameters for Support Vector Machines

TL;DR: The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered by minimizing estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters.
Abstract
The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered. This is done by minimizing estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters. Usual methods for choosing parameters, based on exhaustive search, become intractable as soon as the number of parameters exceeds two. Experimental results assess the feasibility of our approach for a large number of parameters (more than 100) and demonstrate an improvement of generalization performance.
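The approach the abstract describes — minimizing an estimate of the generalization error over kernel parameters rather than grid-searching them — can be sketched as follows. This is a minimal illustration, not the paper's exact method: it uses the radius-margin quantity R²·‖w‖² as the error estimate, a crude centroid-based surrogate for R², a toy dataset, and a gradient-free optimizer in place of the paper's analytic gradients; C and γ are optimized in log-space to keep them positive.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

# Toy data: a hypothetical stand-in for the paper's benchmarks.
X, y = make_classification(n_samples=120, n_features=5, random_state=0)

def radius_margin(log_params):
    """Radius-margin style estimate T = R^2 * ||w||^2 for an RBF SVM."""
    C, gamma = np.exp(log_params)          # log-parameterization keeps both positive
    clf = SVC(C=C, gamma=gamma).fit(X, y)
    K = rbf_kernel(X, X, gamma=gamma)
    # ||w||^2 from the dual: sum_ij (y_i a_i)(y_j a_j) K_ij; dual_coef_ holds y_i a_i.
    a = np.zeros(len(X))
    a[clf.support_] = clf.dual_coef_.ravel()
    w2 = a @ K @ a
    # Crude R^2 surrogate: squared radius of the ball centred at the
    # feature-space mean (not the true smallest enclosing ball).
    R2 = np.max(np.diag(K) - 2 * K.mean(axis=1) + K.mean())
    return R2 * w2

x0 = np.log([1.0, 0.1])                    # initial (C, gamma)
res = minimize(radius_margin, x0, method="Nelder-Mead",
               options={"maxiter": 60})    # gradient-free stand-in for gradient descent
C_opt, gamma_opt = np.exp(res.x)
```

With analytic gradients of T with respect to each kernel parameter (as derived in the paper), the same loop scales to the 100+ parameters mentioned in the abstract, which is exactly where exhaustive search breaks down.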



Citations
Journal Article · DOI

Rademacher chaos complexities for learning the kernel problem

TL;DR: A novel generalization bound for the learning-the-kernel problem is developed, and satisfactory excess generalization bounds and misclassification error rates are established for learning Gaussian kernels and general radial basis kernels.
Journal Article · DOI

Protein subcellular localization prediction using multiple kernel learning based support vector machine

TL;DR: This study aimed to develop an efficient multi-label protein subcellular localization prediction system, named MKLoc, by introducing a multiple kernel learning (MKL) based SVM; results indicate that MKLoc not only achieves higher accuracy than a single-kernel SVM system but also shows significantly better results than other top systems.
Book Chapter · DOI

Optimization of the SVM Kernels Using an Empirical Error Minimization Scheme

TL;DR: The present work is an extended experimental study of the framework proposed by Chapelle et al. for optimizing SVM kernels using an analytic upper bound on the error; it minimizes an empirical error estimate using a quasi-Newton technique.
Proceedings Article

Convex formulations of radius-margin based Support Vector Machines

TL;DR: Two novel algorithms are presented: R-SVMµ+, a radius-margin based SVM feature selection algorithm, and R-SVM+, a metric-learning based SVM; both achieve significantly better classification performance than standard SVM and its other state-of-the-art variants.
Journal Article · DOI

Parameters selection in gene selection using Gaussian kernel support vector machines by genetic algorithm

TL;DR: A new method is proposed for selecting the parameters of the recursive feature elimination algorithm implemented with Gaussian-kernel SVMs: a genetic algorithm searches for an optimal parameter pair, as a better alternative to the common practice of picking the apparently best parameters by hand.
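A genetic search over (C, γ) of the kind this summary describes can be sketched as follows. This is illustrative only: the population size, mutation scale, toy dataset, and cross-validation fitness are my assumptions, not the cited paper's settings.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=150, n_features=8, random_state=0)

def fitness(ind):
    """3-fold cross-validated accuracy; individuals live in log-space."""
    C, gamma = np.exp(ind)
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

# Initial population of (log C, log gamma) pairs.
pop = rng.normal(0.0, 2.0, size=(12, 2))
for _ in range(10):                                   # generations
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-6:]]            # selection: keep best half
    children = parents[rng.integers(0, 6, 6)] \
        + rng.normal(0.0, 0.3, (6, 2))                # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(p) for p in pop])]
C_best, gamma_best = np.exp(best)
```

Unlike the gradient-based approach of the main paper, this search needs no differentiable error estimate, at the cost of many more SVM trainings per candidate.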
References
Book

The Nature of Statistical Learning Theory

TL;DR: Covers the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Journal Article · DOI

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, applying these estimates to real-life problems, and much more.
Book

An Introduction to Support Vector Machines and Other Kernel-based Learning Methods

TL;DR: This is the first comprehensive introduction to Support Vector Machines (SVMs), a new-generation learning system based on recent advances in statistical learning theory; it will guide practitioners to updated literature, new applications, and on-line software.
Journal Article · DOI

Molecular classification of cancer: class discovery and class prediction by gene expression monitoring.

TL;DR: A generic approach to cancer classification based on gene expression monitoring by DNA microarrays is described and applied to human acute leukemias as a test case and suggests a general strategy for discovering and predicting cancer classes for other types of cancer, independent of previous biological knowledge.