Open Access · Journal Article

Support-Vector Networks

Corinna Cortes, Vladimir Vapnik
15 Sep 1995 · Machine Learning, Vol. 20, Iss. 3, pp. 273-297
TLDR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimensional feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
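The construction described above, a polynomial (kernel-induced) mapping to a high-dimensional feature space combined with a soft margin that tolerates training errors, is what most modern SVM implementations provide. Below is a minimal sketch, assuming scikit-learn and a synthetic two-class dataset (neither is taken from the paper); it is illustrative only, not the authors' original procedure.

```python
# Minimal sketch (not from the paper): a soft-margin SVM with a polynomial
# kernel, mirroring the construction described in the abstract.
# Assumes scikit-learn; the dataset below is synthetic and purely illustrative.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Non-separable two-class data
X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# kernel="poly" applies the polynomial input transformation implicitly;
# C controls the soft-margin penalty on training errors (the non-separable case).
clf = SVC(kernel="poly", degree=3, C=1.0)
clf.fit(X_train, y_train)

print("number of support vectors:", clf.support_vectors_.shape[0])
print("test accuracy:", clf.score(X_test, y_test))
```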


Citations
Journal Article

Discriminative Shared Gaussian Processes for Multiview and View-Invariant Facial Expression Recognition

TL;DR: A discriminative shared Gaussian process latent variable model (DS-GPLVM) for multiview and view-invariant classification of facial expressions from multiple views is proposed and validated.
Journal Article

Unmanned Aerial System (UAS)-Based Phenotyping of Soybean using Multi-sensor Data Fusion and Extreme Learning Machine

TL;DR: In this paper, the power of high-spatial-resolution RGB, multispectral, and thermal data fusion is explored for estimating soybean (Glycine max) biochemical parameters, including chlorophyll content and nitrogen concentration, and biophysical parameters, including leaf area index (LAI) and above-ground fresh and dry biomass.
Posted Content

L2 Regularization for Learning Kernels

TL;DR: In this article, the authors study the problem of learning kernels from the same family of kernels but with an L2 regularization instead, focusing on regression problems; they derive the form of the solution of the optimization problem and give an efficient iterative algorithm for computing that solution.
Journal Article

Detection of non-coding RNAs on the basis of predicted secondary structure formation free energy change

TL;DR: Dynalign provides a method for discovering ncRNAs in sequenced genomes that other methods may not identify, and can be comparable to or more accurate than RNAz or QRNA in genomic screens, especially for low-identity regions.
Journal Article

A support vector machine-based state-of-health estimation method for lithium-ion batteries under electric vehicle operation

TL;DR: Capacity and resistance are state-of-health (SOH) indicators that are essential to monitor during the operation of batteries on board electric vehicles, but they are difficult to measure accurately; this paper presents a support vector machine-based method for estimating SOH under electric vehicle operation.
References
Journal Article

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result of these adjustments, internal "hidden" units come to represent important features of the task domain.
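For orientation, the error measure being minimized and the weight adjustment it implies can be written in standard notation (the textbook form, not a quotation from the paper):

```latex
% Squared-error measure between actual outputs o_k and desired outputs t_k,
% and the gradient-descent weight update with learning rate \eta
% (standard back-propagation notation, shown here for illustration only).
E = \tfrac{1}{2} \sum_{k} \left( t_k - o_k \right)^2,
\qquad
\Delta w_{ij} = -\eta \, \frac{\partial E}{\partial w_{ij}}
```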
Book Chapter

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings Article

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
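In the separable case, the margin-maximization problem this entry describes is usually stated as the following quadratic program (the conventional formulation, included here only for orientation; the symbols w, b, and the labels y_i are the standard ones, not quoted from the paper):

```latex
% Maximizing the margin is equivalent to minimizing \|w\|^2 subject to every
% training pattern (x_i, y_i), with y_i \in \{-1, +1\}, being classified with
% functional margin at least 1 (conventional hard-margin formulation).
\min_{w,\, b} \; \tfrac{1}{2} \, \|w\|^{2}
\quad \text{subject to} \quad
y_i \left( w \cdot x_i + b \right) \ge 1, \qquad i = 1, \dots, \ell
```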
Book

Methods of Mathematical Physics

TL;DR: In this book, the authors present an algebraic extension of linear transformations and quadratic forms, and apply it to eigen-variations.