Open Access Journal Article

Support-Vector Networks

Corinna Cortes, Vladimir Vapnik
Machine Learning, 15 Sep 1995, Vol. 20, Iss. 3, pp. 273–297
TL;DR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimensional feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
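As a rough illustration of the idea in the abstract, the sketch below trains a soft-margin support-vector classifier with a polynomial kernel on a small two-class problem. This is a minimal sketch assuming scikit-learn; the digits dataset, kernel degree, and C value are illustrative choices, not the authors' original experimental setup.

    # Minimal sketch: soft-margin SVM with a polynomial kernel (scikit-learn).
    # The digits dataset and hyperparameters are illustrative assumptions.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = load_digits(return_X_y=True)
    mask = y < 2                       # keep two classes: a two-group problem
    X, y = X[mask], y[mask]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # kernel="poly" applies the polynomial input transformation implicitly;
    # C penalizes margin violations, the non-separable extension in the paper.
    clf = SVC(kernel="poly", degree=3, C=1.0)
    clf.fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))

Here C trades off margin width against training errors, and a higher polynomial degree corresponds to a richer implicit feature space.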



Citations
Journal Article

A category-free neural population supports evolving demands during decision-making

TL;DR: This work evaluated rat posterior parietal cortex (PPC) neurons recorded during multisensory decisions and revealed that the network explored different dimensions during decision and movement, suggesting that a single network of neurons can support the evolving behavioral demands of decision-making.
Book Chapter

Centroid-Based Document Classification: Analysis and Experimental Results

TL;DR: The authors' experiments show that this centroid-based classifier consistently and substantially outperforms other algorithms, such as Naive Bayesian, k-nearest-neighbors, and C4.5, on a wide range of datasets.
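The centroid-based classifier summarized above can be sketched in a few lines: represent documents as TF-IDF vectors, average each class into a centroid, and assign new documents to the most cosine-similar centroid. The toy documents and the predict helper below are hypothetical illustrations, not the chapter's code.

    # Minimal sketch of centroid-based document classification.
    # Toy documents and labels are hypothetical.
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer

    docs = ["cheap offer buy now", "limited offer free prize",
            "meeting agenda for monday", "quarterly project status report"]
    labels = np.array([0, 0, 1, 1])    # 0 = promotional, 1 = work

    vec = TfidfVectorizer()
    X = vec.fit_transform(docs).toarray()

    # One centroid per class: the mean of that class's document vectors.
    centroids = np.stack([X[labels == c].mean(axis=0)
                          for c in np.unique(labels)])

    def predict(text):
        v = vec.transform([text]).toarray()[0]
        sims = centroids @ v / (np.linalg.norm(centroids, axis=1)
                                * np.linalg.norm(v) + 1e-12)
        return int(np.argmax(sims))    # most cosine-similar centroid

    print(predict("free offer for you"))   # expected: 0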
Journal Article

Continuous Locomotion-Mode Identification for Prosthetic Legs Based on Neuromuscular–Mechanical Fusion

TL;DR: An algorithm based on neuromuscular-mechanical fusion continuously recognizes a variety of locomotion modes performed by patients with transfemoral (TF) amputations, outperforming methods that use only EMG signals or mechanical information.
Proceedings Article

Fastfood: approximating kernel expansions in loglinear time

TL;DR: Fastfood is an approximation that accelerates kernel methods significantly, achieving accuracy similar to full kernel expansions and Random Kitchen Sinks while being 100x faster and using 1000x less memory; these improvements make kernel methods more practical for applications that have large training sets and/or require real-time prediction.
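For background, Fastfood speeds up the random-feature construction of Random Kitchen Sinks. The sketch below shows the plain random Fourier feature baseline (not Fastfood's Hadamard-based construction) approximating an RBF kernel; the dimensions and gamma value are illustrative assumptions.

    # Minimal sketch: random Fourier features (Random Kitchen Sinks baseline).
    # Fastfood replaces the dense Gaussian matrix W with structured
    # Hadamard-based products; that part is omitted here.
    import numpy as np

    rng = np.random.default_rng(0)
    d, D, gamma = 16, 2048, 0.05       # input dim, feature dim, RBF width

    W = rng.normal(scale=np.sqrt(2 * gamma), size=(D, d))
    b = rng.uniform(0, 2 * np.pi, size=D)

    def features(X):
        # z(x) such that z(x) @ z(y) ~ exp(-gamma * ||x - y||^2)
        return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

    x, y = rng.normal(size=(2, d))
    approx = features(x[None])[0] @ features(y[None])[0]
    exact = np.exp(-gamma * np.sum((x - y) ** 2))
    print(approx, exact)               # the two values should be close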
Book Chapter

FaceTracer: A Search Engine for Large Collections of Images with Faces

TL;DR: FaceTracer, the first image search engine based entirely on faces, shows state-of-the-art classification results compared to previous works and demonstrates the power of its architecture through a functional, large-scale face search engine.
References
Journal Article

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result of the weight adjustments, internal 'hidden' units come to represent important features of the task domain.
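As a concrete illustration of the update rule described above, the sketch below trains a one-hidden-layer network on XOR by gradient descent on squared error. The network size, learning rate, iteration count, and XOR data are illustrative assumptions, not the paper's experiments.

    # Minimal sketch of back-propagation: forward pass, error signal,
    # backward pass, then weight adjustment downhill on squared error.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    t = np.array([[0], [1], [1], [0]], dtype=float)   # desired outputs

    W1, W2 = rng.normal(size=(2, 4)), rng.normal(size=(4, 1))
    b1, b2 = np.zeros(4), np.zeros(1)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for _ in range(5000):
        h = sigmoid(X @ W1 + b1)       # forward pass through hidden layer
        y = sigmoid(h @ W2 + b2)       # actual output vector
        delta_out = (y - t) * y * (1 - y)             # output error signal
        delta_hid = (delta_out @ W2.T) * h * (1 - h)  # propagated backward
        W2 -= 0.5 * h.T @ delta_out    # adjust weights to reduce the error
        W1 -= 0.5 * X.T @ delta_hid
        b2 -= 0.5 * delta_out.sum(axis=0)
        b1 -= 0.5 * delta_hid.sum(axis=0)

    print(np.round(y.ravel(), 2))      # typically approaches [0, 1, 1, 0]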
Book Chapter

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings Article

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
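For reference, the optimization problem such a maximal-margin training algorithm solves can be written as below; this is the conventional hard-margin formulation in standard notation, not an excerpt from the paper.

    % Hard-margin formulation: maximize the margin by minimizing ||w||,
    % subject to every training pattern being classified with margin >= 1.
    \begin{align}
      \min_{w,\,b} \quad & \tfrac{1}{2}\lVert w \rVert^2 \\
      \text{subject to} \quad & y_i\,(w \cdot x_i + b) \ge 1, \qquad i = 1,\dots,\ell
    \end{align}

The support vectors are exactly the training patterns for which the constraint is active, i.e. those with $y_i\,(w \cdot x_i + b) = 1$.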
Book

Methods of Mathematical Physics

TL;DR: In this book, the authors present an algebraic treatment of linear transformations and quadratic forms, and apply it to eigen-variations.