Open Access Journal ArticleDOI

Support-Vector Networks

Corinna Cortes, Vladimir Vapnik
- 15 Sep 1995 -
- Machine Learning, Vol. 20, Iss: 3, pp 273-297
TLDR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimensional feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. Here we extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
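The soft-margin construction and the polynomial feature mapping described above can be sketched in a few lines. The snippet below is illustrative only, using scikit-learn's SVC rather than anything from the paper; the dataset and hyperparameters (degree 3, C = 1.0) are placeholder choices.

```python
# Illustrative sketch: a soft-margin SVM with a polynomial kernel, mirroring
# the paper's recipe of a non-linear input transformation followed by a
# linear decision surface in feature space. Dataset and hyperparameters are
# placeholders, not taken from the paper.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.25, random_state=0)  # two-group, non-separable
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The polynomial kernel implements the high-dimensional mapping implicitly.
clf = SVC(kernel="poly", degree=3, C=1.0)
clf.fit(X_train, y_train)

print("support vectors:", clf.support_vectors_.shape[0])
print("test accuracy:  ", clf.score(X_test, y_test))
```

Here C plays the role of the error penalty the paper introduces for the non-separable case: a small C widens the margin at the cost of more training errors, while a large C approaches the error-free separable setting.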



Citations
Proceedings ArticleDOI

Discriminative learning of Markov random fields for segmentation of 3D scan data

TL;DR: This work addresses the problem of segmenting 3D scan data into objects or object classes by using a recently proposed maximum-margin framework to discriminatively train the model from a set of labeled scans and automatically learn the relative importance of the features for the segmentation task.
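As a rough sketch of the inference side of such a model (not the authors' implementation), the snippet below decodes a pairwise Markov random field with linear unary scores and a Potts smoothness term using iterated conditional modes; the max-margin training step of the paper is omitted, and all weights and data here are hypothetical.

```python
# Minimal MRF decoding sketch: linear per-class unary scores plus a Potts
# term that rewards agreeing neighbors, optimized by iterated conditional
# modes (ICM). Weights are assumed given; max-margin training is omitted.
import numpy as np

def icm_segment(feats, edges, W, smooth=1.0, iters=10):
    """feats: (n, d) point features; edges: (i, j) neighbor pairs;
    W: (k, d) per-class weights; returns per-point labels in {0..k-1}."""
    unary = feats @ W.T                       # (n, k) per-class scores
    labels = unary.argmax(axis=1)             # initialize from unaries
    nbrs = [[] for _ in range(len(feats))]
    for i, j in edges:
        nbrs[i].append(j); nbrs[j].append(i)
    for _ in range(iters):
        for i in range(len(feats)):
            score = unary[i].copy()
            for j in nbrs[i]:                 # Potts smoothness term
                score[labels[j]] += smooth
            labels[i] = score.argmax()
    return labels

# Toy usage with random data (purely illustrative).
rng = np.random.default_rng(0)
feats = rng.normal(size=(50, 4))
edges = [(i, i + 1) for i in range(49)]       # chain neighborhood
W = rng.normal(size=(3, 4))
print(icm_segment(feats, edges, W)[:10])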
Journal ArticleDOI

A tutorial on ν‐support vector machines

TL;DR: In this article, the main ideas of statistical learning theory, support vector machines (SVMs), and kernel feature spaces are briefly described, with particular emphasis on a description of the so-called ν-SVM.
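A minimal usage sketch of the ν-parameterization the tutorial describes, assuming scikit-learn's NuSVC as a stand-in: ν upper-bounds the fraction of margin errors and lower-bounds the fraction of support vectors. Data and the ν value are arbitrary.

```python
# Hedged sketch of the nu-SVM parameterization on synthetic data.
from sklearn.datasets import make_classification
from sklearn.svm import NuSVC

X, y = make_classification(n_samples=300, n_features=5, random_state=0)
clf = NuSVC(nu=0.2, kernel="rbf").fit(X, y)
# nu lower-bounds the fraction of support vectors, so this is >= 0.2.
print("fraction of support vectors:", len(clf.support_) / len(X))
```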
Journal ArticleDOI

A meta-analysis of remote sensing research on supervised pixel-based land-cover image classification processes: General guidelines for practitioners and future research

TL;DR: A statistical meta-analysis of the past 15 years of research on supervised per-pixel image classification revealed that inclusion of texture information yielded the greatest improvement in overall accuracy of land-cover classification with an average increase of 12.1%.
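The headline finding concerns adding texture information to per-pixel classifiers. As one concrete way to compute such a feature (assumed here, not taken from the meta-analysis), the sketch below derives a gray-level co-occurrence (GLCM) contrast statistic for an image window with scikit-image; the window size and GLCM settings are arbitrary.

```python
# Illustrative texture feature: GLCM contrast for an 8-bit image window,
# one distance and one angle, using scikit-image.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_contrast(window):
    """GLCM contrast of a uint8 window; a per-pixel texture feature when
    computed in a sliding window around each pixel."""
    glcm = graycomatrix(window, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return graycoprops(glcm, "contrast")[0, 0]

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)
print(glcm_contrast(img[:8, :8]))   # texture feature for one window
```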
Journal ArticleDOI

Locality-Preserving Dimensionality Reduction and Classification for Hyperspectral Image Analysis

TL;DR: The proposed framework employs local Fisher's discriminant analysis to reduce the dimensionality of the data while preserving its multi-dimensional structure; a subsequent Gaussian mixture model or support vector machine then provides effective classification of the reduced-dimension multimodal data.
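A hedged sketch of that two-stage pipeline: scikit-learn has no local Fisher's discriminant analysis, so ordinary LDA stands in for the dimensionality-reduction step, followed by one Gaussian mixture per class as in the paper's GMM variant. The data is synthetic rather than hyperspectral.

```python
# Two-stage pipeline sketch: supervised dimensionality reduction (plain LDA
# standing in for local Fisher's discriminant analysis), then one Gaussian
# mixture per class, classifying by maximum class-conditional likelihood.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.mixture import GaussianMixture

X, y = make_classification(n_samples=600, n_features=30, n_informative=10,
                           n_classes=3, random_state=0)
Z = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

gmms = {c: GaussianMixture(n_components=2, random_state=0).fit(Z[y == c])
        for c in np.unique(y)}
scores = np.column_stack([gmms[c].score_samples(Z) for c in sorted(gmms)])
print("training accuracy:", (scores.argmax(axis=1) == y).mean())
```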
Posted Content

Deep k-Nearest Neighbors: Towards Confident, Interpretable and Robust Deep Learning

TL;DR: The DkNN algorithm is evaluated on several datasets, and it is shown that the confidence estimates accurately identify inputs outside the model's training distribution, and that the explanations provided by nearest neighbors are intuitive and useful in understanding model failures.
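A much-simplified sketch of the DkNN idea (not the paper's code): gather each layer's representation of an input, find the k nearest training points per layer, and report the plurality label together with the fraction of agreeing neighbors as a crude confidence proxy. The `embed` list of per-layer feature functions is hypothetical, standing in for a real network's activations.

```python
# Simplified DkNN-style prediction: per-layer nearest-neighbor votes,
# with neighbor agreement as a rough confidence proxy.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def dknn_predict(embed, X_train, y_train, x, k=5):
    """Plurality label and neighbor-agreement fraction across all layers."""
    votes = []
    for f in embed:                                  # one feature map per "layer"
        nn = NearestNeighbors(n_neighbors=k).fit(f(X_train))
        _, idx = nn.kneighbors(np.atleast_2d(f(x)))
        votes.extend(y_train[idx[0]])
    counts = np.bincount(np.asarray(votes))
    return counts.argmax(), counts.max() / len(votes)

# Toy usage: two hand-made "layers" standing in for network activations.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 3))
y_train = (X_train[:, 0] > 0).astype(int)
embed = [lambda X: X, lambda X: np.tanh(X)]
print(dknn_predict(embed, X_train, y_train, rng.normal(size=3)))
```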
References
Journal ArticleDOI

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, which helps to represent important features of the task domain.
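A minimal numpy sketch of the procedure the summary describes: repeatedly adjusting connection weights to reduce the squared difference between the actual and desired output vectors. The network size, target function, and learning rate are arbitrary choices, not taken from the paper.

```python
# Minimal back-propagation for a one-hidden-layer network, minimizing
# squared error between actual and desired outputs.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 2))
t = (X[:, :1] * X[:, 1:] > 0).astype(float)      # XOR-like target

W1, W2 = rng.normal(size=(2, 8)), rng.normal(size=(8, 1))
lr = 0.5
for _ in range(2000):
    h = np.tanh(X @ W1)                          # forward pass, hidden layer
    y = 1 / (1 + np.exp(-(h @ W2)))              # sigmoid output
    err = y - t                                  # dE/dy for E = 0.5 * err**2
    g2 = err * y * (1 - y)                       # gradient at output pre-activation
    g1 = (g2 @ W2.T) * (1 - h ** 2)              # error propagated through tanh
    W2 -= lr * h.T @ g2 / len(X)                 # weight adjustments
    W1 -= lr * X.T @ g1 / len(X)

print("final mean squared error:", float(np.mean(err ** 2)))
```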
Book ChapterDOI

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings ArticleDOI

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
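The hard-margin optimum of this algorithm can be approximated by a soft-margin SVM with a very large penalty C; for a linear kernel the geometric margin equals 1/‖w‖. The snippet below is a hedged illustration on synthetic separable data, not the paper's own training procedure.

```python
# Approximate hard-margin training: a linear soft-margin SVM with very
# large C on linearly separable synthetic data; margin = 1 / ||w||.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, size=(50, 2)),
               rng.normal(2, 0.5, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="linear", C=1e6).fit(X, y)      # near hard-margin
w = clf.coef_[0]
print("geometric margin ~", 1 / np.linalg.norm(w))
print("number of support vectors:", len(clf.support_))
```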
Book

Methods of Mathematical Physics

TL;DR: In this paper, the authors present an algebraic extension of linear transformations and quadratic forms, and apply it to eigen-variations.
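The link the summary draws between quadratic forms and eigen-variations is the classical variational characterization of eigenvalues; the worked statement below uses generic notation ($A$ a symmetric matrix) rather than the book's own.

```latex
% Sketch of the classical link: extremizing a quadratic form on the unit
% sphere yields an eigenvalue problem.
\[
  \max_{x \neq 0} \frac{x^{\top} A x}{x^{\top} x}
  \quad\Longrightarrow\quad
  \nabla_x \left( x^{\top} A x - \lambda\, x^{\top} x \right) = 0
  \quad\Longrightarrow\quad
  A x = \lambda x ,
\]
where $A$ is symmetric and the stationary values of the Rayleigh quotient
are exactly the eigenvalues of $A$.
```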