Open Access · Journal Article · DOI

Support-Vector Networks

Corinna Cortes, +1 more
15 Sep 1995 · Vol. 20, Iss. 3, pp. 273-297
TLDR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimensional feature space, in which a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. Here we extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
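The key property behind the polynomial input transformation can be illustrated with a small sketch: a degree-2 polynomial kernel evaluated directly in the input space equals an ordinary dot product after an explicit non-linear feature map, so the linear decision surface in the high-dimensional space never has to be built coordinate by coordinate. The feature map `phi` and the example vectors below are illustrative choices for 2-D inputs, not taken from the paper.

```python
import math

def poly_kernel(x, z, d=2):
    # polynomial kernel computed in the input space: (x . z + 1)^d
    return (sum(a * b for a, b in zip(x, z)) + 1) ** d

def phi(x):
    # explicit degree-2 feature map for a 2-D input (6-D feature space):
    # phi(x) . phi(z) == (x . z + 1)^2
    x1, x2 = x
    r2 = math.sqrt(2.0)
    return [1.0, r2 * x1, r2 * x2, x1 * x1, x2 * x2, r2 * x1 * x2]

x, z = [1.0, 2.0], [3.0, -1.0]
lhs = poly_kernel(x, z)                             # kernel in input space
rhs = sum(a * b for a, b in zip(phi(x), phi(z)))    # dot product in feature space
print(lhs, rhs)                                     # the two values agree
```

A classifier that only ever uses dot products (such as the support-vector network) can therefore work in the 6-D feature space while paying only the 2-D cost of evaluating the kernel.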



Citations
Journal Article

Feature extraction and recognition of ictal EEG using EMD and SVM

TL;DR: A novel method for feature extraction and pattern recognition of ictal EEG, based upon empirical mode decomposition (EMD) and the support vector machine (SVM): the EEG signal is decomposed into intrinsic mode functions (IMFs) using EMD, and the coefficient of variation and fluctuation index of the IMFs are then extracted as features.
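As a rough illustration of the feature-extraction step described above, the two statistics named in the summary can be computed per IMF. The exact definitions below (standard deviation over absolute mean, and mean absolute successive difference) are common formulations but are assumptions here, as is the stand-in `imf` signal; the cited paper's precise normalizations may differ.

```python
import math

def coefficient_of_variation(x):
    # std / |mean| -- dispersion relative to signal level (assumed definition)
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    return math.sqrt(var) / abs(mean)

def fluctuation_index(x):
    # mean absolute difference between successive samples (assumed definition)
    return sum(abs(b - a) for a, b in zip(x, x[1:])) / (len(x) - 1)

imf = [0.5, 1.5, 0.8, 2.0, 1.2]   # stand-in for one IMF produced by EMD
features = [coefficient_of_variation(imf), fluctuation_index(imf)]
print(features)
```

Computing this pair for each IMF yields a fixed-length feature vector per EEG segment, which is what the SVM then classifies.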
Journal Article

Word sequence kernels

TL;DR: This article proposes the use of string kernels, a novel way of computing document similarity based on matching non-consecutive subsequences, applied to sequences of words rather than characters, and presents some extensions to sequence kernels dealing with symbol-dependent and match-dependent decay factors.
Journal Article

A review on prognostic techniques for non-stationary and non-linear rotating systems

TL;DR: In this paper, the authors review the applicability of prognostic techniques to rotating machinery operating under non-linear and non-stationary conditions, and survey their application in the research field.
Journal Article

CatBoost for big data: an interdisciplinary review

TL;DR: This survey takes an interdisciplinary approach to cover studies related to CatBoost in a single work, and provides researchers with an in-depth understanding to help clarify proper application of CatBoost in solving problems.
Journal Article

A systematic performance evaluation of clustering methods for single-cell RNA-seq data.

TL;DR: A systematic and extensible performance evaluation of 14 clustering algorithms implemented in R, including both methods developed explicitly for scRNA-seq data and more general-purpose methods, found that consensus clustering typically did not improve performance compared to the best of the combined methods, but that several of the top-performing methods already perform some type of consensus clustering.
References
Journal Article

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, which helps to represent important features of the task domain.
Book Chapter

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings Article

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of the classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
Book

Methods of Mathematical Physics

TL;DR: In this paper, the authors present an algebraic extension of linear transformations and quadratic forms, and apply it to eigen-variations.