Open Access Journal Article

New Support Vector Algorithms

TLDR
A new class of support vector algorithms for regression and classification in which a parameter ν controls the number of support vectors, eliminating one of the other free parameters of the algorithm: the accuracy parameter ε in the regression case, and the regularization constant C in the classification case.
Abstract
We propose a new class of support vector algorithms for regression and classification. In these algorithms, a parameter ν lets one effectively control the number of support vectors. While this can be useful in its own right, the parameterization has the additional benefit of enabling us to eliminate one of the other free parameters of the algorithm: the accuracy parameter ε in the regression case, and the regularization constant C in the classification case. We describe the algorithms, give some theoretical results concerning the meaning and the choice of ν, and report experimental results.
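To make the ν-parameterization concrete, here is a minimal sketch (not part of the paper) using scikit-learn's NuSVC and NuSVR, which implement the ν-SVM and ν-SVR formulations: in classification ν replaces C, in regression ν replaces ε, and in both cases ν lower-bounds the fraction of support vectors and upper-bounds the fraction of margin errors.

```python
# Minimal sketch (illustration only): the nu-parameterization as exposed by
# scikit-learn's NuSVC (classification) and NuSVR (regression).
import numpy as np
from sklearn.svm import NuSVC, NuSVR

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y_cls = (X[:, 0] + X[:, 1] > 0).astype(int)            # toy binary labels
y_reg = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)   # toy regression target

# Classification: nu takes the place of the regularization constant C.
clf = NuSVC(nu=0.2, kernel="rbf").fit(X, y_cls)
print("classification, fraction of SVs:", len(clf.support_) / len(X))

# Regression: nu takes the place of the accuracy parameter epsilon.
reg = NuSVR(nu=0.2, C=1.0, kernel="rbf").fit(X, y_reg)
print("regression, fraction of SVs:", len(reg.support_) / len(X))
# In both cases the fraction of support vectors should be at least roughly nu.
```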


Citations
Journal Article

A Robust Regularization Path Algorithm for ν-Support Vector Classification

TL;DR: A robust regularization path algorithm is proposed for ν-support vector classification, based on lower-upper (LU) decomposition with partial pivoting, that can avoid the exceptions completely, handle the singularities in the key matrix, and fit the entire solution path in a finite number of steps.
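The LU decomposition with partial pivoting mentioned in the summary can be illustrated generically with SciPy; the sketch below shows only that linear-algebra primitive, not the cited regularization path algorithm itself.

```python
# Generic illustration of LU decomposition with partial pivoting (SciPy).
# This is not the cited path algorithm, only the factorization it relies on
# to cope with otherwise troublesome pivots in the key matrix.
import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[0.0, 2.0, 1.0],
              [1.0, 1.0, 0.0],
              [2.0, 0.0, 1.0]])   # leading zero pivot: naive LU would break down
b = np.array([3.0, 2.0, 3.0])

lu, piv = lu_factor(A)            # factor PA = LU with row pivoting
x = lu_solve((lu, piv), b)        # solve Ax = b using the factorization
print(x, np.allclose(A @ x, b))
```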
Journal Article

Landslide susceptibility mapping based on Support Vector Machine: A case study on natural slopes of Hong Kong, China

TL;DR: An overview of the SVM, covering both one-class and two-class SVM methods, is first presented, followed by its use in landslide susceptibility mapping; it is concluded that the two-class SVM possesses better prediction efficiency than logistic regression and the one-class SVM.
Journal Article

Incremental learning for ν -Support Vector Regression

TL;DR: This paper proposes a special procedure called initial adjustments, which adjusts the weights of ν-SVC based on the Karush-Kuhn-Tucker conditions to prepare an initial solution for the incremental ν-support vector regression (INSVR) learning algorithm.
Journal Article

A tutorial on ν‐support vector machines

TL;DR: In this article, the main ideas of statistical learning theory, support vector machines (SVMs), and kernel feature spaces are briefly described, with particular emphasis on a description of the so-called ν-SVM.
Journal Article

Nonparametric Quantile Estimation

TL;DR: A nonparametric version of a quantile estimator is presented, which can be obtained by solving a simple quadratic programming problem; uniform convergence statements and bounds on the quantile property of the estimator are provided.
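The "quantile property" referred to in the summary rests on the standard fact that minimizing the pinball loss recovers a quantile; the check below illustrates that fact numerically and is not the cited kernel estimator.

```python
# Illustration only (not the cited estimator): the constant minimizing the
# pinball loss is the empirical tau-quantile, the property the nonparametric
# estimator carries over to kernel function classes.
import numpy as np

def pinball_loss(y, f, tau):
    """Pinball loss: tau*(y - f) for y >= f, (1 - tau)*(f - y) otherwise."""
    r = y - f
    return np.mean(np.where(r >= 0, tau * r, (tau - 1) * r))

rng = np.random.default_rng(0)
y = rng.exponential(size=1000)
tau = 0.9
grid = np.linspace(y.min(), y.max(), 2001)
best = grid[np.argmin([pinball_loss(y, f, tau) for f in grid])]
print(best, np.quantile(y, tau))  # the two values should roughly agree
```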
References
Book

The Nature of Statistical Learning Theory

TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?
Journal Article

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Book

Matrix Analysis

TL;DR: In this book, the authors present results of both classic and recent matrix analysis, using canonical forms as a unifying theme, and demonstrate their importance in a variety of applications, such as linear algebra and matrix theory.
Journal Article

A Tutorial on Support Vector Machines for Pattern Recognition

TL;DR: Several arguments that support the observed high accuracy of SVMs are reviewed, and numerous examples and proofs of most of the key theorems are given.
Book

Nonlinear Programming