Support-Vector Networks
Corinna Cortes, Vladimir Vapnik
Abstract
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimensional feature space, in which a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. Here we extend this result to non-separable training data.
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
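The core idea — an implicit non-linear map to feature space in which a linear surface separates the classes — can be illustrated with a minimal sketch. The paper itself solves a soft-margin quadratic program; the toy below instead uses a dual (kernel) perceptron with a degree-2 polynomial kernel, purely to show that a problem that is linearly inseparable in input space (XOR) becomes separable after the mapping. All parameters here are illustrative choices, not values from the paper.

```python
import numpy as np

def poly_kernel(a, b, degree=2, coef0=1.0):
    # inner product in the implicit polynomial feature space
    return (a @ b + coef0) ** degree

# XOR: not linearly separable in the input space
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1., 1., 1., -1.])

n = len(X)
K = np.array([[poly_kernel(X[i], X[j]) for j in range(n)] for i in range(n)])
alpha = np.zeros(n)          # dual variables, one per training point

for _ in range(1000):        # epochs; converges quickly on this toy set
    mistakes = 0
    for i in range(n):
        # decision value f(x_i) = sum_j alpha_j y_j K(x_j, x_i)
        if np.sign(alpha @ (y * K[:, i]) + 1e-12) != y[i]:
            alpha[i] += 1.0  # dual perceptron update
            mistakes += 1
    if mistakes == 0:
        break

pred = np.sign((alpha * y) @ K + 1e-12)
```

After training, `pred` matches `y` on all four XOR points, because the degree-2 kernel implicitly supplies the `x1*x2` feature that makes the classes separable.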
Citations
Journal Article
The doubly regularized support vector machine
TL;DR: This paper proposes a doubly regularized support vector machine (DrSVM), which uses the elastic-net penalty, a mixture of the L2-norm and the L1-norm penalties, and performs automatic variable selection in a way similar to the L1-norm SVM.
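As a sketch of the penalty this TL;DR describes, the DrSVM objective can be written as hinge loss plus an elastic-net term. The function below is a hypothetical illustration: the names `lam1`/`lam2` and the exact scaling of the two penalty terms are assumptions, not taken from the paper.

```python
import numpy as np

def drsvm_objective(w, b, X, y, lam1, lam2):
    # hinge loss plus the elastic-net penalty:
    # L1 term encourages sparsity, L2 term encourages grouped selection
    hinge = np.maximum(0.0, 1.0 - y * (X @ w + b)).sum()
    return hinge + lam1 * np.abs(w).sum() + 0.5 * lam2 * np.dot(w, w)

X = np.array([[2., 0.], [-2., 0.]])
y = np.array([1., -1.])
w = np.array([1., 0.])
val = drsvm_objective(w, 0.0, X, y, lam1=0.5, lam2=1.0)  # → 1.0 (zero hinge loss, penalty only)
```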
Proceedings ArticleDOI
Support vector machines for speaker verification and identification
Vincent Wan, William M. Campbell, +1 more
TL;DR: A new technique for normalising the polynomial kernel is developed and used to achieve performance comparable to other classifiers on the YOHO database.
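The TL;DR does not spell out the normalization technique; one standard way to normalize a polynomial kernel is cosine-style normalization in feature space, sketched below under that assumption (the degree and offset values are illustrative).

```python
import numpy as np

def poly_kernel(x, z, degree=3, coef0=1.0):
    return (np.dot(x, z) + coef0) ** degree

def normalized_poly_kernel(x, z, degree=3, coef0=1.0):
    # cosine normalization in feature space: K'(x, x) == 1 for every x,
    # and |K'(x, z)| <= 1 by Cauchy-Schwarz
    return poly_kernel(x, z, degree, coef0) / np.sqrt(
        poly_kernel(x, x, degree, coef0) * poly_kernel(z, z, degree, coef0))

x = np.array([3., 4.])
z = np.array([0., 1.])
```

Normalization of this kind removes the dependence of the kernel value on the input magnitudes, which can otherwise dominate a high-degree polynomial.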
Posted Content
Cross-Age LFW: A Database for Studying Cross-Age Face Recognition in Unconstrained Environments
TL;DR: A Cross-Age LFW database is constructed by deliberately searching for and selecting 3,000 positive face pairs with age gaps, adding intra-class variance from the aging process; several metric-learning and deep-learning methods are evaluated on the new database.
Journal ArticleDOI
Radiomics: a new application from established techniques
TL;DR: Radiomics is defined as the high-throughput extraction of quantitative imaging features or textures from imaging to decode tissue pathology, creating a high-dimensional data set from which computational models can be developed using advanced machine learning algorithms, potentially serving as a tool for personalized diagnosis and treatment guidance.
Proceedings ArticleDOI
SemEval-2020 Task 12: Multilingual Offensive Language Identification in Social Media (OffensEval 2020)
Marcos Zampieri, Preslav Nakov, Sara Rosenthal, Pepa Atanasova, Georgi Karadzhov, Hamdy Mubarak, Leon Derczynski, Zeses Pitenis, Çağrı Çöltekin, +8 more
TL;DR: SemEval-2020 Task 12 on Multilingual Offensive Language Identification in Social Media (OffensEval 2020) included three subtasks corresponding to the hierarchical taxonomy of the OLID schema and was offered in five languages: Arabic, Danish, English, Greek, and Turkish.
References
Journal ArticleDOI
Learning representations by back-propagating errors
TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, which helps to represent important features of the task domain.
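The weight-adjustment rule described above can be sketched as a tiny two-layer network trained by back-propagation on squared error. The architecture, learning rate, and iteration count below are illustrative choices, not taken from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)   # hidden layer: 4 units
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)   # output layer: 1 unit
lr = 0.5

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

_, out0 = forward(X)
loss0 = np.mean((out0 - y) ** 2)                  # error before training

for _ in range(2000):
    h, out = forward(X)
    # propagate the error signal backwards through each layer
    d2 = (out - y) * out * (1 - out)              # output-layer delta
    d1 = (d2 @ W2.T) * h * (1 - h)                # hidden-layer delta
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)

_, out1 = forward(X)
loss1 = np.mean((out1 - y) ** 2)                  # error after training
```

Each update moves the weights down the gradient of the squared difference between actual and desired outputs, which is exactly the repeated adjustment the TL;DR describes.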
Book ChapterDOI
Learning internal representations by error propagation
TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings ArticleDOI
A training algorithm for optimal margin classifiers
TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including perceptrons, polynomials, and radial basis functions.
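The margin being maximized can be made concrete: for a separating hyperplane w·x + b = 0, the geometric margin is the smallest signed distance from a training point to the hyperplane, and for a canonical hyperplane (min y_i(w·x_i + b) = 1) the width of the margin band is 2/‖w‖. A minimal sketch with made-up data:

```python
import numpy as np

def geometric_margin(w, b, X, y):
    # smallest signed distance from any training point to the
    # hyperplane w.x + b = 0 (positive iff all points are classified correctly)
    return np.min(y * (X @ w + b)) / np.linalg.norm(w)

X = np.array([[2., 0.], [-2., 0.]])
y = np.array([1., -1.])
w = np.array([1., 0.])
b = 0.0
# each point lies 2 units from the hyperplane x1 = 0
m = geometric_margin(w, b, X, y)  # → 2.0
```

Maximizing this quantity over (w, b) is what distinguishes the optimal-margin classifier from an arbitrary separating hyperplane.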
Book
Methods of Mathematical Physics
Richard Courant, David Hilbert, +1 more
TL;DR: The authors present an algebraic treatment of linear transformations and quadratic forms, and apply it to eigen-variations.