Open Access · Journal ArticleDOI

Support-Vector Networks

Corinna Cortes, Vladimir Vapnik
15 Sep 1995
Vol. 20, Iss. 3, pp. 273-297
TLDR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimensional feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
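As an illustration of the construction the abstract describes, the sketch below fits a soft-margin SVM with a polynomial kernel to synthetic two-group data. This is a minimal sketch only: scikit-learn, the toy dataset, and every parameter value are assumptions chosen for the example and are not taken from the paper.

# Illustrative sketch: a soft-margin SVM with a polynomial kernel,
# in the spirit of the paper's polynomial input transformations.
# scikit-learn, the toy data, and all parameter values are assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic two-group classification problem.
X, y = make_classification(n_samples=500, n_features=16, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Polynomial kernel of degree 3; C controls the soft-margin penalty
# used for the non-separable case discussed in the abstract.
clf = SVC(kernel="poly", degree=3, C=1.0)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
print("number of support vectors:", clf.n_support_.sum())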



Citations
Proceedings ArticleDOI

Dependency Tree Kernels for Relation Extraction

TL;DR: This work extends previous work on tree kernels to estimate the similarity between the dependency trees of sentences, and uses this kernel within a Support Vector Machine to detect and classify relations between entities in the Automatic Content Extraction (ACE) corpus of news articles.
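As a rough illustration of plugging a custom similarity into an SVM, as the cited work does with dependency-tree kernels, the sketch below supplies a precomputed kernel (Gram) matrix to a support vector classifier. The tree_kernel function is a hypothetical stand-in for an actual dependency-tree kernel, and the use of scikit-learn here is an assumption for illustration only.

# Sketch of using a custom similarity as an SVM kernel.
# tree_kernel is a hypothetical placeholder; a real implementation
# would compare dependency trees rather than dense vectors.
import numpy as np
from sklearn.svm import SVC

def tree_kernel(a, b):
    # Placeholder similarity between two feature vectors.
    return float(np.dot(a, b))

def gram_matrix(A, B):
    # Kernel (Gram) matrix between two sets of examples.
    return np.array([[tree_kernel(a, b) for b in B] for a in A])

X_train = np.random.rand(20, 5)
y_train = np.array([0, 1] * 10)   # two classes, toy labels
X_test = np.random.rand(5, 5)

clf = SVC(kernel="precomputed")
clf.fit(gram_matrix(X_train, X_train), y_train)
pred = clf.predict(gram_matrix(X_test, X_train))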
Journal ArticleDOI

Support vector machine approach for protein subcellular localization prediction.

TL;DR: A support vector machine has been introduced to predict the subcellular localization of proteins from their amino acid compositions, and can be a complementary method to other existing methods based on sorting signals.
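To make the feature representation concrete, the sketch below computes an amino acid composition vector (the relative frequencies of the 20 standard residues), which is the kind of fixed-length input an SVM classifier would receive in this setting. The example sequence and helper names are hypothetical, not taken from the cited work.

# Sketch of an amino acid composition feature vector.
# The sequence below and the function name are assumptions for illustration.
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition(sequence):
    # Relative frequency of each of the 20 standard residues;
    # the resulting 20-dimensional vector sums to 1.0.
    counts = Counter(sequence)
    total = len(sequence)
    return [counts.get(aa, 0) / total for aa in AMINO_ACIDS]

features = composition("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
# Vectors like this would be stacked into a matrix X and passed to an
# SVM classifier such as the SVC shown in the earlier sketch.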
Journal ArticleDOI

An Insight into Extreme Learning Machines: Random Neurons, Random Features and Kernels

TL;DR: An insight into ELMs in three aspects, viz. random neurons, random features and kernels, is provided, and it is shown that in theory ELMs (with the same kernels) tend to outperform support vector machines and their variants in both regression and classification applications, with much easier implementation.
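A minimal sketch of the random-feature idea behind ELMs: hidden-layer weights are drawn at random and left untrained, and only the output weights are fit in closed form by least squares. The sizes, activation, and synthetic data below are assumptions for illustration, not the cited paper's setup.

# Minimal ELM sketch: random untrained hidden layer + least-squares output weights.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                 # inputs
y = (X[:, 0] + X[:, 1] > 0).astype(float)      # a simple synthetic target

n_hidden = 50
W = rng.normal(size=(10, n_hidden))            # random input-to-hidden weights
b = rng.normal(size=n_hidden)                  # random biases

H = np.tanh(X @ W + b)                         # random feature map
beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # output weights, closed form

y_hat = (H @ beta > 0.5).astype(float)
print("training accuracy:", (y_hat == y).mean())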
Journal ArticleDOI

A comparative study on the predictive ability of the decision tree, support vector machine and neuro-fuzzy models in landslide susceptibility mapping using GIS

TL;DR: In this paper, three approaches, namely decision tree (DT), support vector machine (SVM) and adaptive neuro-fuzzy inference system (ANFIS), were compared for landslide susceptibility mapping in the Penang Hill area, Malaysia.
Journal ArticleDOI

Support vector machines for classification in remote sensing

TL;DR: Results show that the SVM achieves a higher level of classification accuracy than either the ML or the ANN classifier, and that the SVM can be used with small training datasets and high-dimensional data.
References
Journal ArticleDOI

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result, internal units come to represent important features of the task domain.
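The adjustment the TL;DR describes is gradient descent on a squared-error measure; restated here in the usual notation (not quoted from this page):

\[
E = \frac{1}{2} \sum_{c} \sum_{j} \left( y_{j,c} - d_{j,c} \right)^{2},
\qquad
\Delta w_{ij} = -\eta \, \frac{\partial E}{\partial w_{ij}},
\]

where \(y_{j,c}\) is the actual output of unit \(j\) for case \(c\), \(d_{j,c}\) is the desired output, and \(\eta\) is the learning rate.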
Book ChapterDOI

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings ArticleDOI

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
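The margin maximization referred to here is commonly written as the following quadratic program (a standard restatement for the separable case, not a quotation from this page):

\[
\min_{\mathbf{w},\, b} \; \tfrac{1}{2} \lVert \mathbf{w} \rVert^{2}
\quad \text{subject to} \quad
y_i \left( \mathbf{w} \cdot \mathbf{x}_i + b \right) \ge 1, \qquad i = 1, \dots, \ell,
\]

where the training patterns that satisfy the constraint with equality are the support vectors, and the resulting margin is \(2 / \lVert \mathbf{w} \rVert\).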
Book

Methods of Mathematical Physics

TL;DR: In this book, the authors develop the algebra of linear transformations and quadratic forms and apply it to the variational treatment of eigenvalue problems.