Open Access Journal Article

New Support Vector Algorithms

TLDR
A new class of support vector algorithms for regression and classification is proposed in which a parameter ν controls the number of support vectors and eliminates one of the other free parameters of the algorithm: the accuracy parameter ε in the regression case and the regularization constant C in the classification case.
Abstract
We propose a new class of support vector algorithms for regression and classification. In these algorithms, a parameter ν lets one effectively control the number of support vectors. While this can be useful in its own right, the parameterization has the additional benefit of enabling us to eliminate one of the other free parameters of the algorithm: the accuracy parameter ε in the regression case, and the regularization constant C in the classification case. We describe the algorithms, give some theoretical results concerning the meaning and the choice of ν, and report experimental results.
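As a concrete illustration of the parameterization, the minimal sketch below uses scikit-learn's NuSVC and NuSVR, an independent implementation of the ν-formulation rather than code from the paper; the synthetic data and ν values are assumptions chosen for demonstration. In the classification case ν takes the place of C, while in the regression case C is kept and the tube width ε is determined automatically; in both cases ν lower-bounds the fraction of support vectors.

```python
# Minimal sketch (assumed data and nu values) of the nu parameterization,
# using scikit-learn's NuSVC / NuSVR implementations of the nu-formulation.
from sklearn.datasets import make_classification, make_regression
from sklearn.svm import NuSVC, NuSVR

# Classification: nu replaces the regularization constant C.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
for nu in (0.1, 0.3, 0.5):
    clf = NuSVC(nu=nu, kernel="rbf", gamma="scale").fit(X, y)
    frac_sv = clf.support_.size / X.shape[0]   # fraction of support vectors >= nu
    print(f"NuSVC  nu={nu:.1f}  fraction of support vectors = {frac_sv:.2f}")

# Regression: nu replaces the accuracy parameter epsilon (C is kept).
Xr, yr = make_regression(n_samples=200, n_features=5, noise=5.0, random_state=0)
for nu in (0.1, 0.3, 0.5):
    reg = NuSVR(nu=nu, C=1.0, kernel="rbf").fit(Xr, yr)
    frac_sv = reg.support_.size / Xr.shape[0]  # fraction of support vectors >= nu
    print(f"NuSVR  nu={nu:.1f}  fraction of support vectors = {frac_sv:.2f}")
```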


Citations
Journal Article

3D Visual Discomfort Predictor: Analysis of Disparity and Neural Activity Statistics

TL;DR: In this article, a model-based neuronal and statistical framework called the 3D visual discomfort predictor (3D-VDP) was developed to automatically predict the level of visual discomfort that is experienced when viewing S3D images.
Journal Article

Comparison of the molecular and cellular phenotypes of common mouse syngeneic models with human tumors

TL;DR: This study characterizes common mouse syngeneic models and compares them with their human tumor counterparts, contributing to a framework that may help investigators select the model most relevant to a particular immuno-oncology mechanism and may explain some of the challenges in translating preclinical findings to clinical studies.
Journal Article

Some greedy learning algorithms for sparse regression and classification with mercer kernels

TL;DR: Algorithms based on residual minimization and thin QR factorization are presented for constructing sparse regression and classification models with Mercer kernels, yielding efficient numerical schemes that reduce the training and runtime complexity of kernel-based methods on large datasets.
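To make the idea of greedy residual minimization concrete, here is a minimal, assumption-based sketch (not the cited authors' code): kernel columns are added one at a time according to how strongly they correlate with the current residual, and the coefficients are refit by ordinary least squares at each step. The thin QR updates that the cited work uses to make this efficient are omitted for brevity.

```python
# Greedy residual-minimization sketch for sparse kernel regression
# (illustrative data; least-squares refit in place of thin-QR updates).
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (Mercer) kernel matrix between rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_sparse_fit(X, y, n_terms=10, gamma=1.0):
    K = rbf_kernel(X, X, gamma)          # candidate basis columns k(., x_i)
    selected, residual = [], y.copy()
    for _ in range(n_terms):
        # Pick the unselected column most correlated with the residual.
        scores = np.abs(K.T @ residual)
        scores[selected] = -np.inf
        selected.append(int(np.argmax(scores)))
        # Refit coefficients on the selected columns by least squares.
        coef, *_ = np.linalg.lstsq(K[:, selected], y, rcond=None)
        residual = y - K[:, selected] @ coef
    return selected, coef

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
idx, coef = greedy_sparse_fit(X, y, n_terms=10)
print("selected centers:", idx)
```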
Proceedings Article

How SVMs can estimate quantiles and the median

TL;DR: It is shown that SVMs based on the ε-insensitive loss estimate the conditional median only under certain conditions on the distribution P, and this result is used to derive an oracle inequality for an SVM based on the pinball loss.
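For reference, the pinball loss mentioned above is the standard quantile-regression loss. The short sketch below uses illustrative data and τ values (not taken from the cited paper) to check numerically that minimizing the average pinball loss over a constant recovers the empirical τ-quantile, with τ = 0.5 giving the median.

```python
# Standard pinball (quantile) loss; data and tau values are illustrative.
import numpy as np

def pinball_loss(y, f, tau):
    """Pinball loss: tau*(y-f) if y >= f, else (1-tau)*(f-y)."""
    r = y - f
    return np.where(r >= 0, tau * r, (tau - 1) * r)

# Minimizing the average pinball loss over a constant f recovers the
# empirical tau-quantile; tau = 0.5 gives the median.
rng = np.random.default_rng(0)
y = rng.normal(size=10_000)
grid = np.linspace(-3, 3, 601)
for tau in (0.1, 0.5, 0.9):
    risks = [pinball_loss(y, f, tau).mean() for f in grid]
    f_star = grid[int(np.argmin(risks))]
    print(f"tau={tau:.1f}  argmin={f_star:+.2f}  empirical quantile={np.quantile(y, tau):+.2f}")
```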
Proceedings Article

Distance metric learning with kernels

TL;DR: In this article, a feature weighting method that works in both the input space and the kernel-induced feature space is proposed; it assumes only the availability of similarity (dissimilarity) information.
References
Book

The Nature of Statistical Learning Theory

TL;DR: Topics covered include the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Journal Article

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Book

Matrix Analysis

TL;DR: In this book, the authors present results of both classic and recent matrix analysis, using canonical forms as a unifying theme, and demonstrate their importance in a variety of applications.
Journal Article

A Tutorial on Support Vector Machines for Pattern Recognition

TL;DR: Several arguments that support the observed high accuracy of SVMs are reviewed, and numerous examples and proofs of most of the key theorems are given.
Book

Nonlinear Programming