Open Access · Journal Article · DOI

Support-Vector Networks

Corinna Cortes, Vladimir Vapnik
15 Sep 1995 · Machine Learning, Vol. 20, Iss. 3, pp. 273-297
TLDR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimensional feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
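
As an illustration of the idea described in the abstract (a non-linear mapping realized through a polynomial kernel, plus a soft margin that tolerates non-separable data), here is a minimal sketch using scikit-learn; the library, toy dataset, and parameter values are assumptions for illustration, not the paper's original setup.

```python
# Minimal sketch of a soft-margin SVM with a polynomial kernel,
# in the spirit of the support-vector network described above.
# scikit-learn, the toy dataset, and the parameter values are
# illustrative assumptions, not the paper's original experiments.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# A small OCR-like task: handwritten digit images as input vectors.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# kernel="poly" plays the role of the polynomial input transformation;
# C controls the soft margin, i.e. the tolerance for non-separable data.
clf = SVC(kernel="poly", degree=3, C=1.0, gamma="scale")
clf.fit(X_train, y_train)

print("test accuracy:", clf.score(X_test, y_test))
print("support vectors per class:", clf.n_support_)
```

Note that scikit-learn's SVC handles the multi-class digits task by combining pairwise two-group classifiers, so each underlying subproblem matches the two-group setting of the paper.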


Citations
Journal Article · DOI

A hybrid annual power load forecasting model based on generalized regression neural network with fruit fly optimization algorithm

TL;DR: A hybrid annual power load forecasting model combining the fruit fly optimization algorithm (FOA) and a generalized regression neural network (GRNN) is proposed, in which the FOA automatically selects an appropriate spread parameter value for the GRNN forecasting model.
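
As context for the spread parameter mentioned above, here is a minimal sketch of a GRNN prediction (a Gaussian-kernel-weighted average of training targets); the data and the spread value are arbitrary assumptions, and the FOA search over the spread is not reproduced here.

```python
# Minimal sketch of a generalized regression neural network (GRNN)
# prediction, illustrating the role of the spread parameter sigma.
# The training data and sigma value are arbitrary assumptions; the
# fruit fly optimization step from the cited paper is not shown.
import numpy as np

def grnn_predict(x_query, X_train, y_train, sigma=0.5):
    # Gaussian kernel weights between the query and every training point.
    d2 = np.sum((X_train - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    # The prediction is the weight-normalized average of training targets.
    return np.dot(w, y_train) / np.sum(w)

# Tiny synthetic example: annual load as a function of a single feature.
X_train = np.array([[1.0], [2.0], [3.0], [4.0]])
y_train = np.array([10.0, 12.0, 15.0, 19.0])
print(grnn_predict(np.array([2.5]), X_train, y_train, sigma=0.5))
```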
Journal Article · DOI

Face recognition: component-based versus global approaches

TL;DR: A component-based method and two global methods for face recognition are presented and evaluated with respect to robustness against pose changes; the component-based system clearly outperformed both global systems.
Journal Article · DOI

Support vector machine-based classification of Alzheimer’s disease from whole-brain anatomical MRI

TL;DR: A new automated method based on support vector machine (SVM) classification of whole-brain anatomical magnetic resonance imaging discriminates between patients with Alzheimer's disease (AD) and elderly control subjects, and may therefore help in the early diagnosis of AD.
Dissertation · DOI

Automatic model construction with Gaussian processes

TL;DR: This dissertation develops methods for automatically constructing models based on Gaussian processes.
Journal Article · DOI

The Transporter Classification Database

TL;DR: This manuscript describes an update to the database, previously featured in NAR database issues, which is of increasing usefulness to the international scientific community and can serve as a model for the expansion of database technologies.
References
Journal Article · DOI

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result, internal "hidden" units come to represent important features of the task domain.
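
As a rough illustration of the weight-update rule summarized above, here is a minimal sketch of back-propagation with gradient descent on a tiny two-layer network; the synthetic data, network sizes, and learning rate are assumptions for illustration, not the setup of the cited paper.

```python
# Minimal sketch of back-propagation for a tiny two-layer network.
# NumPy, the synthetic data, network sizes, and learning rate are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))              # 8 input vectors, 3 features
T = rng.normal(size=(8, 2))              # desired output vectors

W1 = rng.normal(scale=0.1, size=(3, 4))  # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(4, 2))  # hidden -> output weights
lr = 0.01

for step in range(2000):
    # Forward pass: hidden activations and actual outputs.
    H = np.tanh(X @ W1)
    Y = H @ W2
    # Error measure: half the summed squared difference.
    E = 0.5 * np.sum((Y - T) ** 2)
    # Backward pass: propagate the error back through the layers.
    dY = Y - T
    dW2 = H.T @ dY
    dH = (dY @ W2.T) * (1.0 - H ** 2)    # tanh derivative
    dW1 = X.T @ dH
    # Adjust the weights against the gradient of the error.
    W1 -= lr * dW1
    W2 -= lr * dW2

print("final error:", E)
```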
Book Chapter · DOI

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings Article · DOI

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
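
To illustrate how the margin-maximizing formulation accommodates different classes of decision functions, here is a minimal sketch comparing linear, polynomial, and radial-basis-function kernels in scikit-learn; the toy data and parameters are arbitrary assumptions, not the algorithm or experiments of the cited paper.

```python
# Minimal sketch: the same margin-maximizing training procedure applied
# with different decision-function classes, selected via the kernel.
# scikit-learn and the toy dataset are illustrative assumptions.
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

for kernel in ("linear", "poly", "rbf"):
    clf = SVC(kernel=kernel, degree=3, C=1.0, gamma="scale").fit(X, y)
    print(kernel, "training accuracy:", round(clf.score(X, y), 3))
```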
Book

Methods of Mathematical Physics

TL;DR: This book presents the algebra of linear transformations and quadratic forms, with applications to eigenvalue problems.