Support-Vector Networks
Corinna Cortes, Vladimir Vapnik
TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.

Abstract:
The support-vector network is a new learning machine for two-group classification problems. The machine conceptually implements the following idea: input vectors are non-linearly mapped to a very high-dimension feature space. In this feature space a linear decision surface is constructed. Special properties of the decision surface ensure high generalization ability of the learning machine. The idea behind the support-vector network was previously implemented for the restricted case where the training data can be separated without errors. We here extend this result to non-separable training data.
High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated. We also compare the performance of the support-vector network to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
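The core idea in the abstract, non-linearly mapping inputs to a feature space where a linear decision surface suffices, can be sketched on a toy problem. Below, XOR data that no linear rule can separate in input space becomes linearly separable after a degree-2 polynomial feature map. A plain perceptron stands in for the paper's optimal-margin solver purely for illustration; the dataset, function names, and training loop are assumptions for this sketch, not the paper's method.

```python
# Sketch: non-linear polynomial map + linear classifier in feature space.
# The perceptron here is an illustrative stand-in, NOT the paper's
# optimal-margin algorithm; the XOR toy dataset is an assumption.

def poly_features(x):
    """Degree-2 polynomial map for a 2-D input (x1, x2)."""
    x1, x2 = x
    return [1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2]

def train_perceptron(X, y, epochs=100):
    """Plain perceptron; converges because the mapped data is separable."""
    w = [0.0] * len(X[0])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * sum(wj * xj for wj, xj in zip(w, xi)) <= 0:
                w = [wj + yi * xj for wj, xj in zip(w, xi)]
                errors += 1
        if errors == 0:  # all points on the correct side: stop
            break
    return w

# XOR: not linearly separable in input space, but the x1*x2 feature
# makes it separable in the mapped space.
raw = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
labels = [-1, 1, 1, -1]
mapped = [poly_features(x) for x in raw]
w = train_perceptron(mapped, labels)
preds = [1 if sum(wj * xj for wj, xj in zip(w, xi)) > 0 else -1
         for xi in mapped]
print(preds)  # → [-1, 1, 1, -1], matching the labels
```

In the paper, the linear surface in feature space is instead chosen to maximize the margin (the soft-margin extension handles non-separable data), which is what gives the generalization guarantees the abstract refers to.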
Citations
Proceedings Article
Real-time eye detection and tracking under various light conditions
TL;DR: This paper presents a new real-time eye detection and tracking methodology that works under variable and realistic lighting conditions and can robustly track eyes when the pupils are not very bright due to significant external illumination interferences.
Journal Article
Occupant behavior and schedule modeling for building energy simulation through office appliance power consumption data mining
TL;DR: In this paper, the authors developed an indirect data mining approach using office appliance power consumption data to learn the occupant "passive" behavior in a medium office building, where the average percentage of correctly classified individual behavior instances is 90.29%.
Proceedings Article
An SVM learning approach to robotic grasping
TL;DR: This paper attempts to find optimal grasps of objects in a grasping simulator, combining numerical methods to recover parts of the grasp-quality surface for any robotic hand with contemporary machine learning methods that interpolate that surface in order to find the optimal grasp.
Journal Article
Land cover mapping of large areas using chain classification of neighboring Landsat satellite images
Jan Knorn, Andreas Rabe, Volker C. Radeloff, Tobias Kuemmerle, Jacek Kozak, Patrick Hostert et al.
TL;DR: It is noted that chain classification can only be applied when land cover classes are well represented in the overlap area of neighboring Landsat scenes, but as long as this constraint is met, chain classification is a powerful approach for large area land cover classifications, especially in areas of varying training data availability.
Journal Article
NullHop: A Flexible Convolutional Neural Network Accelerator Based on Sparse Representations of Feature Maps
Alessandro Aimar, Hesham Mostafa, Enrico Calabrese, Antonio Rios-Navarro, Ricardo Tapiador-Morales, Iulia-Alexandra Lungu, Moritz B. Milde, Federico Corradi, Alejandro Linares-Barranco, Shih-Chii Liu, Tobi Delbruck et al.
TL;DR: In this article, the sparsity of neuron activations in CNNs is exploited to accelerate the computation and reduce memory requirements for low-power and low-latency application scenarios.
References
Journal Article
Learning representations by back-propagating errors
TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, which helps to represent important features of the task domain.
Book Chapter
Learning internal representations by error propagation
TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings Article
A training algorithm for optimal margin classifiers
TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including perceptrons, polynomials, and radial basis functions.
Book
Methods of Mathematical Physics
Richard Courant, David Hilbert
TL;DR: In this paper, the authors present an algebraic treatment of linear transformations and quadratic forms, and apply it to eigenvalue problems of the calculus of variations.