Open Access · Journal Article · DOI

Choosing Multiple Parameters for Support Vector Machines

TLDR
The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is addressed by minimizing estimates of the SVM generalization error with a gradient descent algorithm over the set of parameters.
Abstract
The problem of automatically tuning multiple parameters for pattern recognition Support Vector Machines (SVMs) is considered. This is done by minimizing some estimates of the generalization error of SVMs using a gradient descent algorithm over the set of parameters. Usual methods for choosing parameters, based on exhaustive search, become intractable as soon as the number of parameters exceeds two. Some experimental results assess the feasibility of our approach for a large number of parameters (more than 100) and demonstrate an improvement of generalization performance.
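The overall procedure can be illustrated with a minimal sketch. The code below is not the authors' implementation: the paper works with differentiable estimates of the generalization error and their analytic gradients, whereas this illustration descends a finite-difference gradient of a k-fold cross-validation error over log-scaled hyperparameters (C and an RBF gamma), using scikit-learn's SVC on a synthetic dataset; the learning rate, step count, and dataset are assumptions made for the example.

```python
# Minimal sketch (not the authors' code): tuning SVM hyperparameters by
# gradient descent on an estimate of the generalization error.
# Assumptions: scikit-learn's SVC with an RBF kernel, a synthetic dataset,
# and a finite-difference gradient of the 5-fold cross-validation error.
# The paper instead uses differentiable error estimates with analytic
# gradients, which is what makes the descent well behaved; the CV error
# used here is piecewise constant, so this loop is only illustrative.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC


def error_estimate(log_params, X, y):
    """Estimated generalization error for log-scale parameters (log C, log gamma)."""
    C, gamma = np.exp(log_params)
    accuracy = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()
    return 1.0 - accuracy


def tune(X, y, steps=25, lr=0.5, eps=0.1):
    """Descend a finite-difference gradient of the error estimate in log-space."""
    log_params = np.zeros(2)          # start at C = 1, gamma = 1
    for _ in range(steps):
        base = error_estimate(log_params, X, y)
        grad = np.zeros_like(log_params)
        for i in range(log_params.size):
            shifted = log_params.copy()
            shifted[i] += eps
            grad[i] = (error_estimate(shifted, X, y) - base) / eps
        log_params -= lr * grad       # gradient step on (log C, log gamma)
    return np.exp(log_params)


if __name__ == "__main__":
    X, y = make_classification(n_samples=200, n_features=10, random_state=0)
    C, gamma = tune(X, y)
    print(f"tuned C = {C:.3f}, gamma = {gamma:.4f}")
```

The many-parameter setting mentioned in the abstract corresponds to replacing the single gamma with one scaling factor per input feature; the same descent loop then runs over that larger parameter vector.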


Citations
Book Chapter · DOI

Saliency in Crowd

TL;DR: This work presents a first focused study of saliency in crowds and proposes a new saliency prediction model that takes crowding information into account; multiple kernel learning (MKL) serves as the core computational module for integrating features at both low and high levels.
Journal Article · DOI

Uma Introdução às Support Vector Machines (An Introduction to Support Vector Machines)

TL;DR: This paper presents an introduction to Support Vector Machines, a machine learning technique that has received increasing attention in recent years and has obtained results superior to those of other learning techniques in various applications.
Book Chapter · DOI

Multi-objective model selection for support vector machines

TL;DR: In this article, model selection for support vector machines is viewed as a multi-objective optimization problem, where model complexity and training accuracy define two conflicting objectives.
Journal Article · DOI

2005 Special Issue: Bayesian approach to feature selection and parameter tuning for support vector machine classifiers

TL;DR: In this article, a Nyström approximation of the Gram matrix is used to reduce sampling times significantly while leaving classification accuracy almost unchanged; the final tuned hyperparameter values provide a useful criterion for pruning irrelevant features and define a measure of relevance with which to determine systematically how many features should be removed.
Journal Article · DOI

Automatic time series analysis for electric load forecasting via support vector regression

TL;DR: Experiments demonstrate the virtues of the proposed approach in terms of predictive performance and correct identification of relevant lags and seasonal patterns, compared to well-known strategies for time series analysis designed for energy load forecasting and state-of-the-art strategies for automatic model selection.
References
Book

The Nature of Statistical Learning Theory

TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?
Journal Article · DOI

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Book

An Introduction to Support Vector Machines and Other Kernel-based Learning Methods

TL;DR: This is the first comprehensive introduction to Support Vector Machines (SVMs), a new-generation learning system based on recent advances in statistical learning theory; it will guide practitioners to updated literature, new applications, and on-line software.
Journal Article · DOI

Molecular classification of cancer: class discovery and class prediction by gene expression monitoring.

TL;DR: A generic approach to cancer classification based on gene expression monitoring by DNA microarrays is described and applied to human acute leukemias as a test case and suggests a general strategy for discovering and predicting cancer classes for other types of cancer, independent of previous biological knowledge.