Open Access · Journal Article (DOI)

Support-Vector Networks

Corinna Cortes, Vladimir Vapnik
15 Sep 1995 · Vol. 20, Iss. 3, pp. 273–297
TLDR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimensional feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
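As a concrete illustration of the idea described in the abstract, here is a minimal sketch that trains a soft-margin support-vector classifier with a polynomial kernel on synthetic two-class data. It uses scikit-learn's SVC as a modern stand-in rather than the authors' original implementation; the dataset, kernel degree, and value of the soft-margin penalty C are arbitrary choices for illustration.

```python
# A minimal sketch, not the authors' implementation: soft-margin SVM with a
# polynomial kernel on synthetic two-class data (scikit-learn as a stand-in).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# kernel="poly" plays the role of the polynomial input transformation;
# C is the soft-margin penalty that handles non-separable training data.
clf = SVC(kernel="poly", degree=3, C=1.0)
clf.fit(X_train, y_train)

print("held-out accuracy:", clf.score(X_test, y_test))
print("support vectors per class:", clf.n_support_)
```

The support vectors reported at the end are the training points that determine the linear decision surface in the (implicit) high-dimensional feature space.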



Citations
Journal Article (DOI)

Adaptive and self-confident on-line learning algorithms

TL;DR: In this paper, the authors studied on-line learning in the linear regression framework and proposed adaptive tunings for generalized linear regression and Weighted Majority over a finite set of experts.
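For readers unfamiliar with the expert setting mentioned above, the following is a rough sketch of the classic Weighted Majority scheme with a fixed penalty factor beta; the paper's adaptive, self-confident tuning of such parameters is not reproduced here, and the random experts and outcomes are purely illustrative.

```python
# Classic Weighted Majority over a finite set of experts (fixed penalty beta).
# The adaptive tuning studied in the paper is NOT implemented; data are random.
import numpy as np

rng = np.random.default_rng(0)
T, n_experts, beta = 100, 5, 0.5
weights = np.ones(n_experts)
mistakes = 0

for t in range(T):
    expert_preds = rng.integers(0, 2, size=n_experts)  # each expert predicts 0/1
    outcome = rng.integers(0, 2)                        # true binary outcome
    # predict by weighted vote of the experts
    prediction = int(weights[expert_preds == 1].sum() >= weights[expert_preds == 0].sum())
    mistakes += int(prediction != outcome)
    # multiplicatively penalize the experts that were wrong
    weights[expert_preds != outcome] *= beta

print("mistakes:", mistakes, "final weights:", np.round(weights, 3))
```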
Proceedings Article (DOI)

Unsupervised improvement of visual detectors using cotraining

TL;DR: A new technique for training visual detectors is described that requires only a small quantity of labeled data and then uses unlabeled data to improve performance over time; it is based on the cotraining framework of Blum and Mitchell.
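The snippet below is a much-simplified sketch of the underlying cotraining idea (two classifiers on two feature views pseudo-labeling unlabeled data for each other), not the paper's visual-detector pipeline; the synthetic data, the split into views, and the promotion of 20 confident examples per round are assumptions for illustration.

```python
# Simplified cotraining sketch: two logistic regressions on two feature views
# pseudo-label unlabeled examples for each other. Data and splits are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
view1, view2 = X[:, :10], X[:, 10:]     # two feature "views" (an assumption)
pseudo = np.full(len(y), -1)            # -1 marks "still unlabeled"
pseudo[:50] = y[:50]                    # small initial labeled pool

clf1 = LogisticRegression(max_iter=1000)
clf2 = LogisticRegression(max_iter=1000)
for _ in range(10):
    labeled = pseudo != -1
    clf1.fit(view1[labeled], pseudo[labeled])
    clf2.fit(view2[labeled], pseudo[labeled])
    # each classifier pseudo-labels the unlabeled examples it is most confident on
    for clf, view in ((clf1, view1), (clf2, view2)):
        unlabeled_idx = np.where(pseudo == -1)[0]
        if len(unlabeled_idx) == 0:
            break
        confidence = clf.predict_proba(view[unlabeled_idx]).max(axis=1)
        top = unlabeled_idx[np.argsort(confidence)[-20:]]
        pseudo[top] = clf.predict(view[top])

labeled = pseudo != -1
print("pseudo-labeled pool:", labeled.sum(),
      "pseudo-label accuracy:", round((pseudo[labeled] == y[labeled]).mean(), 3))
```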
Journal Article (DOI)

Multi-method analysis of MRI images in early diagnostics of Alzheimer's disease.

TL;DR: The presented results show that a comprehensive analysis of MRI images combining multiple features improves classification accuracy and predictive power in detecting early AD.
Journal Article (DOI)

Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties

TL;DR: In this article, an asynchronous parallel stochastic proximal coordinate descent algorithm is presented for minimizing a composite objective function that consists of a smooth convex function plus a separable convex function.
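To make the objective structure mentioned above concrete, here is a plain single-threaded sketch of proximal coordinate descent on a toy lasso problem (a smooth least-squares term plus a separable L1 term); the asynchronous, parallel aspects analyzed in the paper are not reproduced, and all data are synthetic.

```python
# Single-threaded proximal coordinate descent on 0.5*||Ax - b||^2 + lam*||x||_1.
# This omits the asynchronous parallelism that is the subject of the paper.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(100, 20))
b = rng.normal(size=100)
lam, x = 0.1, np.zeros(20)
col_norms = (A ** 2).sum(axis=0)         # per-coordinate curvature constants

def soft(v, t):                          # proximal operator of t*|.|
    return np.sign(v) * max(abs(v) - t, 0.0)

for epoch in range(100):
    for j in rng.permutation(20):        # visit coordinates in random order
        grad_j = A[:, j] @ (A @ x - b)   # partial derivative of the smooth term
        x[j] = soft(x[j] - grad_j / col_norms[j], lam / col_norms[j])

objective = 0.5 * np.sum((A @ x - b) ** 2) + lam * np.abs(x).sum()
print("nonzeros:", np.count_nonzero(x), "objective:", round(objective, 4))
```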
Proceedings Article

Multi-attention Recurrent Network for Human Communication Comprehension.

TL;DR: The main strength of the model comes from discovering interactions between modalities through time using a neural component called the Multi-attention Block (MAB) and storing them in the hybrid memory of a recurrent part called the Long-short Term Hybrid Memory (LSTHM).
References
Journal Article (DOI)

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, which helps to represent important features of the task domain.
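As a compact, self-contained illustration of the mechanism summarized above, the sketch below trains a tiny one-hidden-layer network on XOR by repeatedly adjusting its weights with back-propagated error gradients. This is not the paper's code; the network size, learning rate, and the use of a cross-entropy error measure (rather than the squared error of the original paper) are assumptions made to keep the example short and stable.

```python
# Tiny back-propagation sketch on XOR (illustrative; not the paper's code).
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: error signal at the output (sigmoid + cross-entropy)
    d_out = out - y
    d_h = (d_out @ W2.T) * h * (1 - h)          # error propagated to hidden layer
    # gradient-descent weight updates (learning rate 0.5)
    W2 -= 0.5 * h.T @ d_out;  b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;    b1 -= 0.5 * d_h.sum(axis=0)

print(np.round(out, 2))   # should approach [[0], [1], [1], [0]]
```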
Book Chapter (DOI)

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings Article (DOI)

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
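To make the margin notion concrete, the sketch below fits a (near) hard-margin linear classifier to separable synthetic blobs, using scikit-learn's SVC with a large C as a stand-in for the paper's training algorithm, and reports the resulting geometric margin width 2/||w||.

```python
# Near hard-margin linear SVM on separable synthetic data (illustrative stand-in).
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=200, centers=[[-3, -3], [3, 3]],
                  cluster_std=1.0, random_state=0)
clf = SVC(kernel="linear", C=1e6).fit(X, y)   # large C approximates a hard margin

w = clf.coef_[0]                              # normal vector of the decision boundary
print("margin width:", round(2.0 / np.linalg.norm(w), 3))
print("number of support vectors:", clf.support_vectors_.shape[0])
```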
Book

Methods of Mathematical Physics

TL;DR: Presents the algebra of linear transformations and quadratic forms and applies it to variational eigenvalue problems.