Open Access · Journal Article (DOI)

Support-Vector Networks

Corinna Cortes, Vladimir Vapnik
15 Sep 1995
Vol. 20, Iss. 3, pp. 273-297
TLDR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimensional feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
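To make the idea concrete, here is a minimal sketch (not from the paper) of a soft-margin support-vector classifier with a polynomial kernel. The library (scikit-learn), the synthetic dataset, and the parameter values are illustrative assumptions; C penalizes margin violations on non-separable data, and degree plays the role of the polynomial input transformation described in the abstract.

```python
# Minimal illustrative sketch (assumptions: scikit-learn, a synthetic
# two-class dataset, and arbitrary parameter values) of the idea in the
# abstract: a soft-margin SVM with a polynomial kernel, i.e. a linear
# decision surface in a polynomial feature space.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two-class, non-linearly-separable toy data.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# kernel="poly" applies the polynomial input transformation implicitly;
# C trades margin width against training errors (the non-separable case).
clf = SVC(kernel="poly", degree=3, C=1.0)
clf.fit(X_train, y_train)

print("number of support vectors:", len(clf.support_vectors_))
print("test accuracy:", clf.score(X_test, y_test))
```

Only the training points that end up as support vectors determine the decision surface, which is why the classifier is compact even when the implicit feature space is very high-dimensional.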


Citations
Journal Article (DOI)

Fault detection, classification and location for transmission lines and distribution systems: a review on the methods

TL;DR: A comprehensive review of the methods used for fault detection, classification, and location in transmission lines and distribution systems is presented in this article, with fault detection techniques discussed on the basis of feature extraction.
Journal Article (DOI)

Road Extraction Using SVM and Image Segmentation

TL;DR: In this paper, a two-step approach to road extraction was proposed that uses pixel spectral information for classification together with object features derived from image segmentation; it achieved higher accuracy than Gaussian maximum likelihood (GML) classification.
Journal Article (DOI)

GraphProt: modeling binding preferences of RNA-binding proteins

TL;DR: Targets predicted by GraphProt models for Ago2 display higher levels of expression upon Ago2 knockdown, whereas control targets do not, and the estimated binding affinities correlate with experimental measurements.
Patent

Compensation of propagation delays of wireless signals

TL;DR: In this patent, the authors propose compensating propagation delay offsets of wireless signals by determining an effective propagation delay that accounts for both signal path delay and over-the-air propagation delay, based at least in part on statistical analysis of accurate location estimates of reference positions throughout a coverage sector or cell.
Journal Article (DOI)

Computerized Image-Based Detection and Grading of Lymphocytic Infiltration in HER2+ Breast Cancer Histopathology

TL;DR: A computer-aided diagnosis (CADx) scheme to automatically detect and grade the extent of lymphocytic infiltration in digitized HER2+ BC histopathology will potentially help clinicians determine disease outcome and make better therapy recommendations for patients with HER2+ BC.
References
Journal Article (DOI)

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the network's actual output vector and the desired output vector; the resulting internal representations come to capture important features of the task domain.
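For illustration, here is a minimal sketch of a single back-propagation weight update. The one-hidden-layer architecture, sigmoid activations, squared-error loss, learning rate, and use of NumPy are assumptions for the example, not details from the cited paper.

```python
import numpy as np

# Minimal sketch of one back-propagation step for a tiny one-hidden-layer
# network with sigmoid units and squared-error loss.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 1))        # input vector
t = np.array([[1.0]])              # desired output
W1 = rng.normal(size=(4, 3))       # input-to-hidden weights
W2 = rng.normal(size=(1, 4))       # hidden-to-output weights
lr = 0.1                           # learning rate


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


# Forward pass.
h = sigmoid(W1 @ x)                # hidden activations
y = sigmoid(W2 @ h)                # actual output

# Backward pass: propagate the error derivative through the network.
delta_out = (y - t) * y * (1 - y)             # output-layer error signal
delta_hid = (W2.T @ delta_out) * h * (1 - h)  # hidden-layer error signal

# Gradient-descent updates that reduce the squared output error.
W2 -= lr * delta_out @ h.T
W1 -= lr * delta_hid @ x.T
```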
Book Chapter (DOI)

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings Article (DOI)

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
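For context, the optimal-margin problem this reference describes is commonly written as the following quadratic program (a standard textbook formulation, not a quotation from the cited paper):

```latex
\begin{aligned}
\min_{\mathbf{w},\,b} \quad & \tfrac{1}{2}\,\lVert \mathbf{w} \rVert^{2} \\
\text{subject to} \quad & y_i\,(\mathbf{w}\cdot\mathbf{x}_i + b) \ge 1, \qquad i = 1,\dots,\ell ,
\end{aligned}
```

Since the margin equals $2/\lVert \mathbf{w} \rVert$, minimizing $\lVert \mathbf{w} \rVert$ under these constraints maximizes the separation between the two classes.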
Book

Methods of Mathematical Physics

TL;DR: In this book, the authors develop the theory of linear transformations and quadratic forms and apply it to eigenvalue and variational problems.