
Showing papers on "Support vector machine published in 1996"


Proceedings Article
03 Dec 1996
TL;DR: This work compares support vector regression (SVR) with a committee regression technique (bagging) based on regression trees, and with ridge regression done in feature space; on the basis of these experiments, SVR is expected to have advantages in high-dimensional spaces because SVR optimization does not depend on the dimensionality of the input space.
Abstract: A new regression technique based on Vapnik's concept of support vectors is introduced. We compare support vector regression (SVR) with a committee regression technique (bagging) based on regression trees, and with ridge regression done in feature space. On the basis of these experiments, SVR is expected to have advantages in high-dimensional spaces because SVR optimization does not depend on the dimensionality of the input space.
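
The experimental comparison can be sketched with modern tools; the following is a rough illustration using scikit-learn estimators on a synthetic dataset (make_friedman1), not the paper's actual benchmarks or hyperparameters:

```python
# Sketch comparing support vector regression with bagged regression trees
# and ridge regression, in the spirit of the comparison described above.
# Dataset and hyperparameters are illustrative, not the paper's.
import numpy as np
from sklearn.datasets import make_friedman1
from sklearn.ensemble import BaggingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor

X, y = make_friedman1(n_samples=500, n_features=10, noise=1.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "SVR (RBF kernel)": SVR(kernel="rbf", C=10.0, epsilon=0.1),
    "Bagged regression trees": BaggingRegressor(DecisionTreeRegressor(),
                                                n_estimators=50, random_state=0),
    "Ridge regression": Ridge(alpha=1.0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: R^2 = {model.score(X_te, y_te):.3f}")
```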

4,009 citations


Proceedings Article
03 Dec 1996
TL;DR: This presentation reports results of applying the Support Vector method to problems of estimating regressions, constructing multidimensional splines, and solving linear operator equations.
Abstract: The Support Vector (SV) method was recently proposed for estimating regressions, constructing multidimensional splines, and solving linear operator equations [Vapnik, 1995]. In this presentation we report results of applying the SV method to these problems.
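
As a concrete illustration of the regression case (a minimal sketch with scikit-learn's SVR rather than the authors' implementation; kernel and parameter choices are illustrative):

```python
# Minimal illustration of epsilon-insensitive SV regression on a 1-D function.
# Points fitted within the epsilon tube do not become support vectors.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 2 * np.pi, 80)).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(80)

svr = SVR(kernel="rbf", C=10.0, epsilon=0.1)
svr.fit(X, y)
print("support vectors used:", len(svr.support_), "of", len(X))
```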

2,632 citations


Proceedings Article
03 Jul 1996
TL;DR: The results show that the method can decrease the computational complexity of the decision rule by a factor of ten with no loss in generalization performance, making the SVM test speed competitive with that of other methods.
Abstract: A Support Vector Machine (SVM) is a universal learning machine whose decision surface is parameterized by a set of support vectors and by a set of corresponding weights. An SVM is also characterized by a kernel function: the choice of kernel determines whether the resulting SVM is a polynomial classifier, a two-layer neural network, a radial basis function machine, or some other learning machine. SVMs are currently considerably slower in test phase than other approaches with similar generalization performance. To address this, we present a general method to significantly decrease the complexity of the decision rule obtained using an SVM. The proposed method computes an approximation to the decision rule in terms of a reduced set of vectors. These reduced set vectors are not support vectors and can in some cases be computed analytically. We give experimental results for three pattern recognition problems. The results show that the method can decrease the computational complexity of the decision rule by a factor of ten with no loss in generalization performance, making the SVM test speed competitive with that of other methods. Further, the method allows the generalization performance/complexity trade-off to be directly controlled. The proposed method is not specific to pattern recognition and can be applied to any problem where the Support Vector algorithm is used, for example regression.
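
The reduced-set idea can be sketched as follows. This is a simplified stand-in, assuming k-means-chosen centers and least-squares refitting of the weights; the paper instead computes the reduced vectors themselves by minimizing the approximation error, in some cases analytically:

```python
# Simplified reduced-set sketch: approximate a trained RBF-SVM decision function
# with a smaller kernel expansion. Centers are chosen by k-means over the support
# vectors and weights are refit by least squares (an illustrative shortcut, not
# the paper's construction).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
gamma = 0.05
svm = SVC(kernel="rbf", gamma=gamma, C=1.0).fit(X, y)
print("support vectors in the full machine:", len(svm.support_vectors_))

# Full decision function, with the intercept removed, evaluated on the data.
full = svm.decision_function(X) - svm.intercept_

# Reduced set: a handful of centers plus refit weights.
n_reduced = 20
centers = KMeans(n_clusters=n_reduced, n_init=10, random_state=0).fit(
    svm.support_vectors_).cluster_centers_
K = rbf_kernel(X, centers, gamma=gamma)
weights, *_ = np.linalg.lstsq(K, full, rcond=None)

# Compare sign agreement of the reduced expansion with the full SVM.
approx = K @ weights + svm.intercept_
agreement = np.mean(np.sign(approx) == np.sign(svm.decision_function(X)))
print("agreement with full SVM decisions:", agreement)
```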

515 citations


Proceedings Article
03 Dec 1996
TL;DR: This paper combines two techniques on a pattern recognition problem: the "virtual support vector" method, which improves generalization performance by incorporating known invariances of the problem, and the "reduced set" method, which improves speed and is applicable to any support vector machine.
Abstract: Support Vector Learning Machines (SVM) are finding application in pattern recognition, regression estimation, and operator inversion for ill-posed problems. Against this very general backdrop, any methods for improving the generalization performance, or for improving the speed in test phase, of SVMs are of increasing interest. In this paper we combine two such techniques on a pattern recognition problem. The method for improving generalization performance (the "virtual support vector" method) does so by incorporating known invariances of the problem. This method reduces the error rate on 10,000 NIST test digit images from 1.4% to 1.0%. The method for improving the speed (the "reduced set" method) does so by approximating the support vector decision surface. We apply this method to achieve a factor of fifty speedup in test phase over the virtual support vector machine. The combined approach yields a machine which is both 22 times faster than the original machine, and which has better generalization performance, achieving 1.1% error. The virtual support vector method is applicable to any SVM problem with known invariances. The reduced set method is applicable to any support vector machine.
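
A minimal sketch of the virtual support vector step, assuming scikit-learn's 8x8 digit images and one-pixel translations in place of the NIST data and transformations used in the paper:

```python
# Sketch of the "virtual support vector" idea: retrain on the support vectors
# plus translated copies of them. Uses scikit-learn's 8x8 digits, not NIST, and
# illustrative hyperparameters.
import numpy as np
from scipy.ndimage import shift
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

base = SVC(kernel="rbf", gamma=0.001, C=10.0).fit(X_tr, y_tr)
print("baseline accuracy:", base.score(X_te, y_te))

# Generate virtual examples: shift each support vector image by one pixel
# in four directions (a known invariance of digit images).
sv_imgs = base.support_vectors_.reshape(-1, 8, 8)
sv_labels = y_tr[base.support_]
virtual_X, virtual_y = [], []
for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
    shifted = np.array([shift(img, (dy, dx), cval=0.0) for img in sv_imgs])
    virtual_X.append(shifted.reshape(len(shifted), -1))
    virtual_y.append(sv_labels)

X_aug = np.vstack([base.support_vectors_] + virtual_X)
y_aug = np.concatenate([sv_labels] + virtual_y)
vsv = SVC(kernel="rbf", gamma=0.001, C=10.0).fit(X_aug, y_aug)
print("virtual-SV accuracy:", vsv.score(X_te, y_te))
```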

413 citations


Book ChapterDOI
16 Jul 1996
TL;DR: This work presents a method of incorporating prior knowledge about transformation invariances by applying transformations to support vectors, the training examples most critical for determining the classification boundary.
Abstract: Developed only recently, support vector learning machines achieve high generalization ability by minimizing a bound on the expected test error; however, so far there existed no way of adding knowledge about invariances of a classification problem at hand. We present a method of incorporating prior knowledge about transformation invariances by applying transformations to support vectors, the training examples most critical for determining the classification boundary.

315 citations


Book ChapterDOI
16 Jul 1996
TL;DR: Two view-based object recognition algorithms are compared: a heuristic algorithm based on oriented filters, and a support vector learning machine trained on low-resolution images of the objects.
Abstract: Two view-based object recognition algorithms are compared: (1) a heuristic algorithm based on oriented filters, and (2) a support vector learning machine trained on low-resolution images of the objects. Classification performance is assessed using a large number of images generated by a computer graphics system under precisely controlled conditions. Training and test images show a set of 25 realistic three-dimensional models of chairs from viewing directions spread over the upper half of the viewing sphere. The percentage of correct identification of all 25 objects is measured.

217 citations


Proceedings ArticleDOI
07 May 1996
TL;DR: Though experimental results are preliminary, performance improvements over the BBN modified Gaussian Bayes decision system have been obtained on the Switchboard corpus.
Abstract: A novel approach to speaker identification is presented. The technique, based on Vapnik's (1995) work with support vectors, is exciting for several reasons. The support vector method is a discriminative approach, modeling the boundaries between speakers' voices directly in some feature space rather than via the difficult intermediate step of estimating speaker densities. Most importantly, support vector discriminant classifiers are unique in that they separate training data while keeping discriminating power low, thereby reducing test errors. As a result it is possible to build useful classifiers with many more parameters than training points. Furthermore, Vapnik's theory suggests which class of discriminating functions should be used for a given amount of training data, by providing bounds on the expected number of test errors. Support vector classifiers are efficient to compute compared to other discriminant functions. Though experimental results are preliminary, performance improvements over the BBN modified Gaussian Bayes decision system have been obtained on the Switchboard corpus.
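
The discriminative-versus-density contrast can be illustrated on synthetic feature vectors standing in for per-speaker acoustic features; this is only a sketch and bears no relation to the Switchboard data or the BBN system:

```python
# Sketch contrasting a discriminative SVM speaker classifier with a generative
# Gaussian (density-estimation) classifier. Features are synthetic stand-ins
# for acoustic feature vectors; each class plays the role of one speaker.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=1500, n_features=20, n_informative=12,
                           n_classes=5, n_clusters_per_class=2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

svm = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)    # models class boundaries
gauss = QuadraticDiscriminantAnalysis().fit(X_tr, y_tr)           # models class densities
print("SVM accuracy:     ", svm.score(X_te, y_te))
print("Gaussian accuracy:", gauss.score(X_te, y_te))
```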

125 citations