Open Access · Journal Article · DOI

New Support Vector Algorithms

TLDR
A new class of support vector algorithms for regression and classification in which a parameter ν controls the number of support vectors and eliminates one of the other free parameters of the algorithm: the accuracy parameter ε in the regression case, and the regularization constant C in the classification case.
Abstract
We propose a new class of support vector algorithms for regression and classification. In these algorithms, a parameter ν lets one effectively control the number of support vectors. While this can be useful in its own right, the parameterization has the additional benefit of enabling us to eliminate one of the other free parameters of the algorithm: the accuracy parameter ε in the regression case, and the regularization constant C in the classification case. We describe the algorithms, give some theoretical results concerning the meaning and the choice of ν, and report experimental results.
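As a concrete illustration of the abstract, the sketch below (an assumption about tooling, not code from the paper) uses scikit-learn's NuSVC and NuSVR, which expose this ν-parameterization: in classification, ν replaces the constant C and acts as an upper bound on the fraction of margin errors and a lower bound on the fraction of support vectors; in regression, ν replaces the accuracy parameter ε while C remains.

```python
# Minimal sketch assuming scikit-learn is installed; illustrative only.
from sklearn.datasets import make_classification, make_regression
from sklearn.svm import NuSVC, NuSVR

# Classification: nu replaces C. It upper-bounds the fraction of margin
# errors and lower-bounds the fraction of support vectors.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
clf = NuSVC(nu=0.2, kernel="rbf", gamma="scale").fit(X, y)
print("classification: fraction of SVs =", clf.support_.size / X.shape[0])

# Regression: nu replaces the accuracy parameter epsilon of epsilon-SVR;
# the width of the epsilon-tube adapts to the data automatically.
Xr, yr = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)
reg = NuSVR(nu=0.2, C=1.0, kernel="rbf").fit(Xr, yr)
print("regression: fraction of SVs =", reg.support_.size / Xr.shape[0])
```

Because ν is expressed as a fraction of the training set, tuning it over a grid in (0, 1] is often more interpretable than searching over unbounded values of C or ε.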


Citations
Journal Article · DOI

A multiple kernel learning approach to perform classification of groups from complex-valued fMRI data analysis: application to schizophrenia

TL;DR: An MKL-based methodology is presented that improves schizophrenia characterization by using both magnitude and phase fMRI data, and that is also capable of detecting the brain regions conveying most of the discriminative information between patients and controls.
Proceedings Article · DOI

Detecting Symbian OS malware through static function call analysis

TL;DR: In this article, an approach to static malware detection in resource-limited mobile environments is presented, which can be used to extend currently used third-party application signing mechanisms and increase their malware detection capabilities.

Tuning support vector machines for minimax and Neyman-Pearson classification

TL;DR: It is demonstrated that smoothing significantly improves the accuracy of cross-validation error estimates, leading to dramatic performance gains, and coordinate descent strategies are proposed that offer significant gains in computational efficiency with little to no loss in performance.
Journal Article · DOI

A new feature set with new window techniques for customer churn prediction in land-line telecommunications

TL;DR: The experimental results show that the new features, combined with the new window techniques, are effective for churn prediction in the land-line telecommunication service field.
Journal Article · DOI

An overview of gradient-enhanced metamodels with applications

TL;DR: This article reviews the main metamodels that use function gradients in addition to function values, and indicates that there is a trade-off between the lower computing time of least-squares methods and the greater versatility of kernel-based approaches.
References
Book

The Nature of Statistical Learning Theory

TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; and what is important in learning theory?
Journal Article · DOI

Support-Vector Networks

TL;DR: The high generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Book

Matrix Analysis

TL;DR: In this book, the authors present both classical and recent results of matrix analysis, using canonical forms as a unifying theme, and demonstrate their importance in a variety of applications in linear algebra and matrix theory.
Journal Article · DOI

A Tutorial on Support Vector Machines for Pattern Recognition

TL;DR: Several arguments supporting the observed high accuracy of SVMs are reviewed, and numerous examples and proofs of most of the key theorems are given.
Book

Nonlinear Programming