
Showing papers on "Relevance vector machine" published in 1996


Proceedings Article
03 Jul 1996
TL;DR: The results show that the method can decrease the computational complexity of the decision rule by a factor of ten with no loss in generalization performance, making the SVM test speed competitive with that of other methods.
Abstract: A Support Vector Machine (SVM) is a universal learning machine whose decision surface is parameterized by a set of support vectors and by a set of corresponding weights. An SVM is also characterized by a kernel function: choice of the kernel determines whether the resulting SVM is a polynomial classifier, a two-layer neural network, a radial basis function machine, or some other learning machine. SVMs are currently considerably slower in test phase than other approaches with similar generalization performance. To address this, we present a general method to significantly decrease the complexity of the decision rule obtained using an SVM. The proposed method computes an approximation to the decision rule in terms of a reduced set of vectors. These reduced set vectors are not support vectors and can in some cases be computed analytically. We give experimental results for three pattern recognition problems. The results show that the method can decrease the computational complexity of the decision rule by a factor of ten with no loss in generalization performance, making the SVM test speed competitive with that of other methods. Further, the method allows the generalization performance/complexity trade-off to be directly controlled. The proposed method is not specific to pattern recognition and can be applied to any problem where the Support Vector algorithm is used, for example regression.
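For orientation, the decision rule the paper simplifies, and the reduced set approximation it constructs, can be sketched in standard SVM notation (the symbols below are conventional assumptions, not the paper's exact definitions):

% Two-class SVM decision rule over the N_s support vectors s_i:
\[
  f(\mathbf{x}) = \operatorname{sgn}\Big( \sum_{i=1}^{N_s} \alpha_i y_i\, K(\mathbf{s}_i, \mathbf{x}) + b \Big)
\]
% The reduced set method picks N_z << N_s vectors z_k and weights gamma_k
% so that the reduced expansion approximates the full one in feature
% space Phi, shrinking test-phase cost by roughly N_s / N_z:
\[
  \Psi = \sum_{i=1}^{N_s} \alpha_i y_i \Phi(\mathbf{s}_i), \qquad
  \Psi' = \sum_{k=1}^{N_z} \gamma_k \Phi(\mathbf{z}_k), \qquad
  \min_{\{\gamma_k,\, \mathbf{z}_k\}} \|\Psi - \Psi'\|^2 .
\]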

515 citations


Proceedings Article
03 Dec 1996
TL;DR: This paper combines two techniques on a pattern recognition problem: the virtual support vector method, which improves generalization performance by incorporating known invariances of the problem, and the reduced set method, which improves test-phase speed and is applicable to any support vector machine.
Abstract: Support Vector Learning Machines (SVM) are finding application in pattern recognition, regression estimation, and operator inversion for ill-posed problems. Against this very general backdrop, any methods for improving the generalization performance, or for improving the speed in test phase, of SVMs are of increasing interest. In this paper we combine two such techniques on a pattern recognition problem. The method for improving generalization performance (the "virtual support vector" method) does so by incorporating known invariances of the problem. This method achieves a drop in the error rate on 10,000 NIST test digit images from 1.4% to 1.0%. The method for improving the speed (the "reduced set" method) does so by approximating the support vector decision surface. We apply this method to achieve a factor-of-fifty speedup in test phase over the virtual support vector machine. The combined approach yields a machine which is both 22 times faster than the original machine and which has better generalization performance, achieving 1.1% error. The virtual support vector method is applicable to any SVM problem with known invariances. The reduced set method is applicable to any support vector machine.
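As a hedged illustration of the virtual support vector step (the reduced set step is omitted; it needs the kernel-space optimization sketched under the previous entry), here is a minimal sketch assuming scikit-learn and its small 8x8 digits set, rather than the 28x28 NIST digits used in the paper:

# Minimal virtual-support-vector (VSV) sketch. Assumptions: scikit-learn's
# 8x8 digits stand in for NIST digits, and 1-pixel shifts stand in for the
# paper's invariance transformations.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Pass 1: ordinary SVM; only its support vectors shape the boundary.
svm = SVC(kernel="rbf", gamma=0.001, C=10).fit(X_tr, y_tr)
sv, sv_y = svm.support_vectors_, y_tr[svm.support_]

# Known invariance: shift each 8x8 support-vector image by one pixel in
# each of four directions (np.roll wraps at the border; a real
# implementation would zero-pad instead).
def shifted(img_flat):
    img = img_flat.reshape(8, 8)
    for axis, step in [(0, 1), (0, -1), (1, 1), (1, -1)]:
        yield np.roll(img, step, axis=axis).ravel()

virtual = np.array([v for s in sv for v in shifted(s)])
virtual_y = np.repeat(sv_y, 4)

# Pass 2: retrain on the support vectors plus their virtual copies.
svm_vsv = SVC(kernel="rbf", gamma=0.001, C=10).fit(
    np.vstack([sv, virtual]), np.concatenate([sv_y, virtual_y]))

print("plain SVM accuracy:", svm.score(X_te, y_te))
print("VSV   SVM accuracy:", svm_vsv.score(X_te, y_te))

Retraining only on support vectors plus their transforms, rather than augmenting the whole training set, mirrors the two-pass procedure the abstract describes and keeps the second pass cheap.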

413 citations


Book Chapter
16 Jul 1996
TL;DR: This work presents a method of incorporating prior knowledge about transformation invariances by applying transformations to support vectors, the training examples most critical for determining the classification boundary.
Abstract: Developed only recently, support vector learning machines achieve high generalization ability by minimizing a bound on the expected test error; however, until now there has been no way of adding knowledge about invariances of a classification problem at hand. We present a method of incorporating prior knowledge about transformation invariances by applying transformations to support vectors, the training examples most critical for determining the classification boundary.
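In equation form, a minimal sketch of the idea (notation assumed, not the chapter's): a known invariance means the label is unchanged under a transformation, so each support vector can be turned into extra labeled examples:

% A known invariance: the class label is unchanged under small
% transformations L_t (e.g., image translations):
\[
  y\big(\mathcal{L}_t(\mathbf{x})\big) = y(\mathbf{x}) \quad \text{for small } t,
\]
% so each support vector s_i yields virtual examples (L_t(s_i), y_i).
% Augmenting only the support vectors, rather than the whole training
% set, keeps the second training pass cheap, since the support vectors
% alone determine the classification boundary.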

315 citations


Journal Article
01 May 1996
TL;DR: In this paper, the authors considered the plant variations as being uncertainties represented by polytopes and proposed an alternative scheme of vector control for digital control of the induction machine, which involves treatment of the machine equations in state space and the acquisition of a robust state-feedback gain by an H₂ convex optimisation method.
Abstract: Using a discrete time-variant state-space model for the induction machine, the authors consider the plant variations as being uncertainties represented by polytopes. In addition, an alternative scheme of vector control is proposed for digital control of the induction machine. This scheme involves treatment of the machine equations in state space and the acquisition of a robust state-feedback gain by an H₂ convex optimisation method.
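As a rough sketch of the setup described above (the symbols are generic control-theory notation, not necessarily the paper's): the machine is modeled in discrete time with matrices confined to a polytope, and one feedback gain is sought that performs well at every vertex:

% Discrete-time model with polytopic (convex-hull) uncertainty:
\[
  x_{k+1} = A(\lambda)\, x_k + B(\lambda)\, u_k, \qquad
  \big(A(\lambda), B(\lambda)\big) = \sum_{j=1}^{p} \lambda_j (A_j, B_j),
  \quad \lambda_j \ge 0, \quad \sum_{j=1}^{p} \lambda_j = 1 .
\]
% Robust state feedback u_k = K x_k: a single gain K, found by convex
% (e.g., LMI-based) optimization of an H2 performance bound that must
% hold at every vertex (A_j, B_j), and hence over the whole polytope.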

8 citations