Support-Vector Networks
Corinna Cortes, Vladimir Vapnik
TLDR
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Abstract
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimensional feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. Here we extend this result to non-separable training data.
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
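The soft-margin idea in the abstract can be sketched in a few lines. This is a minimal illustration, assuming scikit-learn (which the paper long predates) and a toy two-moons dataset of my choosing, not the paper's OCR benchmark:

```python
# Minimal sketch of a soft-margin support-vector classifier with a
# polynomial kernel, assuming scikit-learn is available; the dataset
# is an illustrative toy problem, not the paper's benchmark.
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# Two interleaved half-moons: not linearly separable in input space.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

# C sets the soft-margin trade-off between margin width and training
# errors; the degree-3 polynomial kernel implicitly maps inputs into
# a high-dimensional feature space where a linear surface is fit.
clf = SVC(kernel="poly", degree=3, coef0=1.0, C=1.0)
clf.fit(X, y)

# Only the support vectors determine the decision surface.
print(len(clf.support_vectors_), clf.score(X, y))
```

Because the data are noisy, some training points fall inside the margin or on the wrong side; the soft-margin extension tolerates them instead of failing, which is exactly the non-separable case the paper addresses.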
Citations
Journal Article
A structural approach to relaxation in glassy liquids
TL;DR: In this article, a new machine-learning-based approach was proposed that identifies a structural quantity, "softness," which is strongly correlated with local structure and with glassy dynamics.
Book Chapter
Breaking Cryptographic Implementations Using Deep Learning Techniques
TL;DR: In this article, a recent line of research investigating new profiling approaches based on machine learning techniques is reviewed; the results obtained are commensurate with, and in some particular cases better than, those of template attacks.
Journal Article
A Comprehensive Review on Current Advances in Peptide Drug Development and Design.
TL;DR: An updated review of key developments in the computational modeling of peptide–protein interactions (PepPIs), aiming to assist experimental biologists in exploiting suitable docking methods to advance peptide-based strategies for interfering with PPIs.
Journal Article
Data mining in the Life Sciences with Random Forest: a walk in the park or lost in the jungle?
Wouter G. Touw, Jumamurat R. Bayjanov, Lex Overmars, Lennart Backus, Jos Boekhorst, Michiel Wels, Sacha A. F. T. van Hijum, et al.
TL;DR: Details RF properties that, to the best of the authors' knowledge, are rarely or never used, but that allow maximizing the biological insights extracted from complex omics data sets using RF.
Journal Article
A real-valued genetic algorithm to optimize the parameters of support vector machine for predicting bankruptcy
TL;DR: Experimental results show that the GA-SVM model achieves the best predictive accuracy, implying that integrating the RGA with the traditional SVM model is very successful.
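The parameter-search idea behind this entry can be illustrated with a toy real-valued genetic algorithm tuning the SVM penalty C. The dataset, population size, and mutation scheme below are illustrative assumptions of mine, not the cited study's setup:

```python
# Toy real-valued genetic algorithm tuning the soft-margin penalty C
# of an SVM. Dataset and GA settings are illustrative assumptions,
# not the cited study's configuration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

def fitness(log_c):
    # Cross-validated accuracy of an RBF-kernel SVM with C = 10**log_c.
    return cross_val_score(SVC(C=10.0 ** log_c), X, y, cv=3).mean()

# Population of candidate log10(C) values in [-3, 3].
pop = rng.uniform(-3, 3, size=10)
for _ in range(5):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(scores)[-5:]]            # keep the best half
    children = rng.choice(parents, size=5) + rng.normal(0, 0.3, size=5)
    pop = np.concatenate([parents, children])         # elitism + mutation

best = pop[np.argmax([fitness(c) for c in pop])]
print(f"best C ~ {10.0 ** best:.3g}")
```

Encoding the parameter as a real value (here, log10 C) is what distinguishes a real-valued GA from a binary-coded one: mutation becomes simple Gaussian perturbation rather than bit flips.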
References
Journal Article
Learning representations by back-propagating errors
TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, which helps to represent important features of the task domain.
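The weight-adjustment loop this entry describes can be sketched with plain NumPy. The two-layer network, XOR task, and learning rate are illustrative choices of mine, not the cited paper's experiments:

```python
# Minimal NumPy sketch of back-propagation on a tiny two-layer network
# learning XOR; the architecture and learning rate are illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the output error through each layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates that shrink the output error.
    W2 -= h.T @ d_out; b2 -= d_out.sum(0)
    W1 -= X.T @ d_h;   b1 -= d_h.sum(0)

print(np.round(out.ravel(), 2))
```

Repeatedly adjusting the weights against the output error is exactly the loop the TL;DR describes; the hidden layer ends up encoding features of the task (here, the interaction between the two inputs) that no single input carries.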
Book Chapter
Learning internal representations by error propagation
TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings Article
A training algorithm for optimal margin classifiers
TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
Book
Methods of Mathematical Physics
Richard Courant, David Hilbert
TL;DR: In this book, the authors present an algebraic treatment of linear transformations and quadratic forms, and apply it to eigen-variations.