Open Access · Journal ArticleDOI

Support-Vector Networks

Corinna Cortes, Vladimir Vapnik
- 15 Sep 1995 - Machine Learning, Vol. 20, Iss. 3, pp. 273-297
TLDR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimension feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
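The construction described in the abstract can be sketched with a modern library. Below is a minimal, illustrative example, assuming scikit-learn (which postdates the paper): the polynomial kernel stands in for the polynomial input transformation, and the penalty parameter C implements the soft margin that handles non-separable training data.

    # Sketch of a soft-margin SVM with a polynomial kernel (illustrative;
    # scikit-learn and this toy dataset are assumptions, not from the paper).
    from sklearn.datasets import make_moons
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # A noisy two-class problem that is not linearly separable in input space.
    X, y = make_moons(n_samples=400, noise=0.25, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # kernel="poly": implicit non-linear map to a high-dimension feature space,
    # where a linear decision surface with maximal margin is constructed.
    # C: penalty on margin violations, the extension to non-separable data.
    clf = SVC(kernel="poly", degree=3, C=1.0)
    clf.fit(X_train, y_train)

    print("number of support vectors:", len(clf.support_vectors_))
    print("test accuracy: %.3f" % clf.score(X_test, y_test))

The decision surface depends only on the support vectors, the training points lying on or inside the margin, which is where the network takes its name.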



Citations
Journal ArticleDOI

Classification of small UAVs and birds by micro-Doppler signatures

TL;DR: The problem of classifying small unmanned aerial vehicles and birds by their micro-Doppler signatures, measured with a continuous-wave radar, is considered; the proposed approach achieves correct classification with a probability of around 95%.
Proceedings ArticleDOI

A PAC analysis of a Bayesian estimator

TL;DR: The paper gives the first PAC-style analysis of a Bayesian-inspired estimator of generalisation, the size of a ball which can be placed in the consistent region of parameter space; the resulting bounds are independent of the complexity of the function class, though they depend linearly on the dimensionality of the parameter space.
Journal ArticleDOI

A learning method for the class imbalance problem with medical data sets

TL;DR: The proposed method extends the data attributes into a higher-dimensional space using classification-related information, and achieves better classification performance than SVM, a C4.5 decision tree, and the methods of two other studies.
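The paper's specific construction is not reproduced here; as a hedged point of comparison only, a common baseline for class imbalance is to make the SVM's error penalty inversely proportional to class frequency, which scikit-learn exposes as class_weight="balanced".

    # Class-weighted SVM baseline for imbalanced data (a standard technique,
    # not the method proposed in the paper above).
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    # A roughly 9:1 imbalanced two-class problem.
    X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)

    for cw in (None, "balanced"):
        clf = SVC(kernel="rbf", class_weight=cw)
        f1 = cross_val_score(clf, X, y, cv=5, scoring="f1").mean()
        print("class_weight=%s  F1: %.3f" % (cw, f1))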
Journal ArticleDOI

Support Vectors Machine-based identification of heart valve diseases using heart sounds

TL;DR: An automated diagnosis system for the identification of heart valve diseases, based on Support Vector Machine (SVM) classification of heart sounds, was applied to a representative global dataset of 198 heart sound signals drawn both from healthy cases and from cases suffering from the four most common heart valve diseases.
Journal ArticleDOI

Optimal weighted nearest neighbour classifiers

TL;DR: In this article, the authors derive an asymptotic expansion for the excess risk (regret) of a weighted nearest-neighbour classifier and show that the ratio of the regret of this classifier to that of an unweighted k-nearest-neighbour classifier depends only on the dimension of the feature vectors, and not on the underlying populations.
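As a hedged illustration of the two classifiers being compared (not of the paper's asymptotic analysis), scikit-learn's k-NN implementation supports both rules: weights="uniform" gives each of the k neighbours an equal vote, while weights="distance" weights each vote by inverse distance.

    # Unweighted vs. distance-weighted k-nearest-neighbour classification
    # (illustrative; the dataset and k are arbitrary choices).
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    X, y = make_classification(n_samples=500, n_features=5, random_state=0)

    for w in ("uniform", "distance"):
        clf = KNeighborsClassifier(n_neighbors=15, weights=w)
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print("weights=%s  accuracy: %.3f" % (w, acc))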
References
Journal ArticleDOI

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result of the weight adjustments, internal 'hidden' units come to represent important features of the task domain.
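A minimal sketch of the procedure this reference describes, in plain NumPy (illustrative, not the paper's exact formulation): a one-hidden-layer network learns XOR by propagating the output error backwards and adjusting the weights by gradient descent.

    import numpy as np

    rng = np.random.default_rng(0)

    # XOR: a task a network without hidden units cannot solve.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    t = np.array([[0.], [1.], [1.], [0.]])

    W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # input -> hidden
    W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # hidden -> output
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    lr = 1.0
    for step in range(5000):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        y = sigmoid(h @ W2 + b2)
        # Backward pass: error derivatives per layer (squared-error loss).
        dy = (y - t) * y * (1 - y)
        dh = (dy @ W2.T) * h * (1 - h)
        # Weight updates by gradient descent.
        W2 -= lr * (h.T @ dy); b2 -= lr * dy.sum(axis=0)
        W1 -= lr * (X.T @ dh); b1 -= lr * dh.sum(axis=0)

    print(np.round(y.ravel(), 2))   # should approach [0, 1, 1, 0]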
Book ChapterDOI

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings ArticleDOI

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented; the technique is applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
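In the notation of the abstract above (training pairs (x_i, y_i) with labels y_i in {-1, +1}), the optimal-margin problem this reference solves can be stated as the quadratic program

    \min_{w,\, b} \ \tfrac{1}{2}\lVert w \rVert^{2}
    \quad \text{subject to} \quad
    y_i \,(w \cdot x_i + b) \ \ge \ 1, \qquad i = 1, \dots, \ell,

whose solution is the separating hyperplane with maximal margin; the extension summarized in the abstract adds slack variables \xi_i \ge 0 and a penalty term C \sum_i \xi_i to the objective, which handles non-separable training data.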
Book

Methods of Mathematical Physics

TL;DR: In this paper, the authors present an algebraic extension of linear transformations and quadratic forms, and apply it to eigen-variations.