
Showing papers on "Support vector machine published in 1995"


Proceedings Article
20 Aug 1995
TL;DR: It is observed that three different types of handwritten digit classifiers construct their decision surface from strongly overlapping small subsets of the data base, which opens up the possibility of compressing data bases significantly by disposing of the data which is not important for the solution of a given task.
Abstract: We report a novel possibility for extracting a small subset of a data base which contains all the information necessary to solve a given classification task: using the Support Vector Algorithm to train three different types of handwritten digit classifiers, we observed that these types of classifiers construct their decision surface from strongly overlapping small (≈ 4%) subsets of the data base. This finding opens up the possibility of compressing data bases significantly by disposing of the data which is not important for the solution of a given task. In addition, we show that the theory allows us to predict the classifier that will have the best generalization ability, based solely on performance on the training set and characteristics of the learning machines. This finding is important for cases where the amount of available data is limited.
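The abstract's central observation, that the decision surface is determined by a small subset of the training data (the support vectors), can be checked directly with any SVM implementation. A minimal sketch using scikit-learn's bundled digits data as a stand-in for the paper's data base (the library, dataset, and exact fraction are my assumptions, not taken from the paper):

```python
# Sketch: train an SVM on handwritten digits and count how many
# training points end up as support vectors. The paper's claim is
# that this subset is small relative to the full data base.
from sklearn.datasets import load_digits
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)          # 1797 8x8 digit images
clf = SVC(kernel="linear").fit(X, y)          # linear-kernel SVM

n_sv = clf.support_vectors_.shape[0]          # number of support vectors
print(f"support vectors: {n_sv}/{len(X)} = {n_sv / len(X):.1%}")
```

Only the points indexed by `clf.support_` contribute to the decision function, so the rest of the data base could in principle be discarded without changing the trained classifier.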

629 citations


Book ChapterDOI
Vladimir Vapnik
01 Jan 1995
TL;DR: To implement the SRM inductive principle in learning algorithms one has to minimize the risk in a given set of functions by controlling two factors: the value of the empirical risk and the value of the confidence interval.
Abstract: To implement the SRM inductive principle in learning algorithms one has to minimize the risk in a given set of functions by controlling two factors: the value of the empirical risk and the value of the confidence interval.
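The two factors named in the abstract appear together in the standard VC generalization bound from Vapnik's statistical learning theory (my sketch of the usual form, not quoted from this chapter). With probability at least \(1-\eta\) over a sample of size \(\ell\), for a function class of VC dimension \(h\):

```latex
R(\alpha)\;\le\;\underbrace{R_{\mathrm{emp}}(\alpha)}_{\text{empirical risk}}
\;+\;\underbrace{\sqrt{\frac{h\left(\ln\frac{2\ell}{h}+1\right)-\ln\frac{\eta}{4}}{\ell}}}_{\text{confidence interval}}
```

SRM controls the trade-off by choosing, within a nested sequence of function classes of increasing \(h\), the class minimizing the sum of the two terms rather than the empirical risk alone.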

32 citations