Open Access Journal Article

Support-Vector Networks

Corinna Cortes, Vladimir Vapnik
15 Sep 1995 · Machine Learning, Vol. 20, Iss. 3, pp. 273–297
TLDR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimensional feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data. High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
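As a concrete illustration of the idea sketched in the abstract, here is a minimal soft-margin support-vector classifier with a polynomial kernel. This is a sketch only: it assumes scikit-learn, uses its small bundled 8x8 digits set rather than the NIST data from the paper's benchmark, and relies on scikit-learn's own multi-class handling instead of the paper's two-group setting.

```python
# Minimal soft-margin SVM with a polynomial kernel, in the spirit of the paper.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# kernel='poly' realizes the polynomial input transformation implicitly;
# C controls the soft margin that handles non-separable training data.
clf = SVC(kernel="poly", degree=3, C=10.0)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
```

The polynomial degree and the soft-margin constant C play the same roles here as the corresponding parameters discussed in the paper.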



Citations
Journal Article

Deep Representation Learning with Part Loss for Person Re-Identification

TL;DR: Wang et al. propose a part loss that automatically generates several parts for an image and computes the person classification loss on each part separately, forcing the deep network to attend to the entire human body and to learn discriminative representations for its different parts.
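A loose sketch of that idea, assuming PyTorch: uniform horizontal stripes of the convolutional feature map stand in for the paper's automatically generated parts, and every dimension and name below is illustrative rather than taken from the paper.

```python
import torch
import torch.nn as nn

class PartLoss(nn.Module):
    """Split the feature map into K horizontal stripes and classify each part."""
    def __init__(self, in_channels, num_classes, num_parts=4):
        super().__init__()
        self.num_parts = num_parts
        self.classifiers = nn.ModuleList(
            nn.Linear(in_channels, num_classes) for _ in range(num_parts))
        self.ce = nn.CrossEntropyLoss()

    def forward(self, feat_map, labels):
        # feat_map: (N, C, H, W) convolutional features; labels: (N,)
        stripes = feat_map.chunk(self.num_parts, dim=2)  # split along height
        loss = 0.0
        for part, clf in zip(stripes, self.classifiers):
            pooled = part.mean(dim=(2, 3))               # average-pool each part
            loss = loss + self.ce(clf(pooled), labels)   # per-part classification loss
        return loss / self.num_parts

# Usage with hypothetical shapes: batch of 8, 256-channel 24x8 maps, 751 identities.
loss_fn = PartLoss(in_channels=256, num_classes=751, num_parts=4)
loss = loss_fn(torch.randn(8, 256, 24, 8), torch.randint(0, 751, (8,)))
```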
Journal Article

Learning the Kernel with Hyperkernels

TL;DR: Shows that learning the kernel itself leads to a statistical estimation problem similar to minimizing a regularized risk functional, states the equivalent representer theorem for the choice of kernels, and presents a semidefinite programming formulation of the resulting optimization problem.
Journal Article

2009 Special Issue: A new learning paradigm: Learning using privileged information

TL;DR: Introduces the learning paradigm of Learning Using Privileged Information (LUPI), in which a teacher can supply additional information alongside the training examples, contained in explanations, comments, comparisons, and so on.
Journal Article

Machine Learning Techniques for Cooperative Spectrum Sensing in Cognitive Radio Networks

TL;DR: Proposes novel cooperative spectrum sensing (CSS) algorithms for cognitive radio (CR) networks that treat sensing as a pattern-classification problem solved with machine learning techniques, and shows that they outperform existing state-of-the-art CSS techniques.
Proceedings Article

Memory Fusion Network for Multi-view Sequential Learning

TL;DR: The Memory Fusion Network (MFN) explicitly accounts for both view-specific interactions and cross-view interactions in a single neural architecture, continuously modeling them through time and fusing the cross-view information in a gated memory.
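A loose sketch of that architecture, assuming PyTorch: the per-view LSTM cells, the sigmoid attention over concatenated states, and the gated memory update below are simplifications of the paper's attention and gated-memory components, and all sizes are illustrative.

```python
import torch
import torch.nn as nn

class TinyMFN(nn.Module):
    """Per-view LSTMs plus a shared memory updated by attention over the
    concatenated view states. A rough sketch of the MFN idea, not the paper's model."""
    def __init__(self, view_dims, hidden=32, mem=32):
        super().__init__()
        self.cells = nn.ModuleList(nn.LSTMCell(d, hidden) for d in view_dims)
        cat = hidden * len(view_dims)
        self.attn = nn.Sequential(nn.Linear(cat, cat), nn.Sigmoid())  # cross-view attention
        self.write = nn.Linear(cat, mem)                              # proposed memory content
        self.gate = nn.Sequential(nn.Linear(cat, mem), nn.Sigmoid())  # retain gate

    def forward(self, views):
        # views: list of (T, N, d_v) tensors, one per view, aligned in time
        T, N = views[0].shape[:2]
        hs = [torch.zeros(N, c.hidden_size) for c in self.cells]
        cs = [torch.zeros(N, c.hidden_size) for c in self.cells]
        m = torch.zeros(N, self.write.out_features)
        for t in range(T):
            for v, cell in enumerate(self.cells):                 # view-specific dynamics
                hs[v], cs[v] = cell(views[v][t], (hs[v], cs[v]))
            z = torch.cat(hs, dim=1)
            z = self.attn(z) * z                                  # highlight cross-view interactions
            g = self.gate(z)
            m = g * m + (1 - g) * torch.tanh(self.write(z))       # fuse into memory over time
        return m

# Toy usage: three views with different feature sizes over 12 time steps.
views = [torch.randn(12, 4, d) for d in (20, 5, 5)]
memory = TinyMFN([20, 5, 5])(views)   # -> (4, 32) fused memory
```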
References
Journal Article

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result, internal "hidden" units come to represent important features of the task domain.
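A from-scratch sketch of that procedure on the XOR task. Illustrative only: NumPy, a single hidden layer, squared-error loss, and all constants below are my choices, not the paper's.

```python
# Tiny back-propagation example: repeatedly adjust weights to reduce the
# squared difference between the network's output and the desired output.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)       # desired outputs

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)       # hidden layer
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)       # output layer
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for step in range(5000):
    h = sigmoid(X @ W1 + b1)                          # forward pass
    yhat = sigmoid(h @ W2 + b2)
    err = yhat - t                                    # dE/dyhat for E = 0.5*sum(err^2)
    d2 = err * yhat * (1 - yhat)                      # back-propagate through sigmoid
    d1 = (d2 @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d2;  b2 -= 0.5 * d2.sum(0)      # gradient-descent updates
    W1 -= 0.5 * X.T @ d1;  b1 -= 0.5 * d1.sum(0)

print(np.round(yhat.ravel(), 2))                      # approaches [0, 1, 1, 0]
```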
Book Chapter

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings Article

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
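The margin-maximization step can be written as a small quadratic program over the dual variables. A minimal sketch, assuming NumPy and SciPy and a linear kernel; the paper itself uses a dedicated quadratic-programming routine rather than SLSQP, and the toy data below is invented for illustration.

```python
# Hard-margin dual: maximize sum(a) - 0.5 * a^T (y y^T * K) a
# subject to a >= 0 and a . y = 0.
import numpy as np
from scipy.optimize import minimize

def optimal_margin_fit(X, y):
    n = len(y)
    K = X @ X.T                                   # linear kernel (Gram matrix)
    Q = np.outer(y, y) * K
    obj = lambda a: 0.5 * a @ Q @ a - a.sum()     # negated dual objective
    jac = lambda a: Q @ a - np.ones(n)
    res = minimize(obj, np.zeros(n), jac=jac, method="SLSQP",
                   bounds=[(0, None)] * n,
                   constraints={"type": "eq", "fun": lambda a: a @ y})
    a = res.x
    w = (a * y) @ X                               # weight vector from the multipliers
    sv = a > 1e-6                                 # support vectors carry nonzero alphas
    b = np.mean(y[sv] - X[sv] @ w)                # bias from the support vectors
    return w, b

# Toy usage on two separable clusters:
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)
w, b = optimal_margin_fit(X, y)
print(np.all(np.sign(X @ w + b) == y))            # True: all points on the right side
```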
Book

Methods of Mathematical Physics

TL;DR: This book develops the algebra of linear transformations and quadratic forms and applies it to eigenvalue and variational problems.