
Showing papers on "Statistical learning theory" published in 1997


Journal ArticleDOI
TL;DR: Within its category, The Nature of Statistical Learning Theory remains one of the most sought-after books.
Abstract: If you really want to become smarter, reading is one of the many ways to awaken and develop your mind. People who enjoy reading accumulate more knowledge and experience, since reading provides information on economics, politics, science, fiction, literature, religion, and many other subjects. Within its category, The Nature of Statistical Learning Theory remains one of the most sought-after books: many people are actively searching for it, which shows how much readers love this kind of book.

2,716 citations


Journal ArticleDOI
TL;DR: The results show that on the United States Postal Service database of handwritten digits, the SV machine achieves the highest recognition accuracy, followed by the hybrid system; the SV approach is thus not only theoretically well-founded but also superior in a practical application.
Abstract: The support vector (SV) machine is a novel type of learning machine, based on statistical learning theory, which contains polynomial classifiers, neural networks, and radial basis function (RBF) networks as special cases. In the RBF case, the SV algorithm automatically determines centers, weights, and threshold that minimize an upper bound on the expected test error. The present study is devoted to an experimental comparison of these machines with a classical approach, where the centers are determined by k-means clustering and the weights are computed using error backpropagation. We consider three machines, namely, a classical RBF machine, an SV machine with Gaussian kernel, and a hybrid system with the centers determined by the SV method and the weights trained by error backpropagation. Our results show that on the United States Postal Service database of handwritten digits, the SV machine achieves the highest recognition accuracy, followed by the hybrid system. The SV approach is thus not only theoretically well-founded but also superior in a practical application.
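
The comparison lends itself to a short sketch. The following is an illustrative Python example, not the study's code: it contrasts the three machines on synthetic data with scikit-learn, using a logistic-regression output layer as a stand-in for error backpropagation; the kernel width, the number of centers, and the dataset are all arbitrary assumptions rather than the paper's setup.

```python
# Sketch of the three machines compared in the study (illustrative only).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=16, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
gamma, n_centers = 0.1, 30

def rbf_head(centers):
    # Fit output weights on Gaussian RBF features computed against the
    # given centers; a linear fit stands in for error backpropagation.
    clf = LogisticRegression(max_iter=1000)
    clf.fit(rbf_kernel(X_tr, centers, gamma=gamma), y_tr)
    return clf.score(rbf_kernel(X_te, centers, gamma=gamma), y_te)

# 1. Classical RBF machine: centers chosen by k-means clustering.
km = KMeans(n_clusters=n_centers, n_init=10, random_state=0).fit(X_tr)

# 2. SV machine with Gaussian kernel: centers (the support vectors),
#    weights, and threshold all come out of the SV optimization.
svm = SVC(kernel="rbf", gamma=gamma).fit(X_tr, y_tr)

# 3. Hybrid: SV-chosen centers, separately trained output weights.
print("classical RBF:", rbf_head(km.cluster_centers_))
print("SV machine:   ", svm.score(X_te, y_te))
print("hybrid:       ", rbf_head(svm.support_vectors_))
```

On the actual USPS digits the paper reports the SV machine ahead of the hybrid, with the classical k-means RBF machine last; the sketch only shows where each machine gets its centers and weights.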

1,385 citations


Journal ArticleDOI
TL;DR: Compares three approaches to machine learning that have developed largely independently: classical statistics, Vapnik's statistical learning theory, and computational learning theory; it concludes that statisticians and data miners can profit by studying each other's methods and using a judiciously chosen combination of them.
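
For readers weighing the three traditions, one concrete artifact of Vapnik's statistical learning theory is its distribution-free generalization bound, which has no direct analogue in classical parametric statistics. The standard textbook form is sketched below (it is not a result of this particular paper): with probability at least 1 − η over a sample of size ℓ, for a hypothesis class of VC dimension h,

```latex
% Vapnik's VC generalization bound: the true risk R is controlled by the
% empirical risk R_emp plus a capacity term depending on the VC dimension
% h, the sample size \ell, and the confidence level \eta.
R(\alpha) \;\le\; R_{\mathrm{emp}}(\alpha)
  + \sqrt{\frac{h\left(\ln\frac{2\ell}{h} + 1\right) - \ln\frac{\eta}{4}}{\ell}}
```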

78 citations


Book ChapterDOI
08 Oct 1997
TL;DR: The effective VC dimension of the classifier for various input distributions is calculated experimentally, and these results are used as the basis for a discussion of the behaviour of the n-tuple classifier.
Abstract: One family of classifiers which has had considerable experimental success over the last thirty years is that of the n-tuple classifier and its descendants. However, the theoretical basis for such classifiers is uncertain, despite attempts from time to time to place them in a statistical framework. In particular, the most commonly used training algorithms do not even try to minimise recognition error on the training set. In this paper the tools of statistical learning theory are applied to the classifier in an attempt to describe its effectiveness. In particular, the effective VC dimension of the classifier for various input distributions is calculated experimentally, and these results are used as the basis for a discussion of the behaviour of the n-tuple classifier. As a side issue, an error-minimising algorithm for the n-tuple classifier is also proposed and briefly examined.
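
To make the object of study concrete, here is a hypothetical minimal n-tuple (WISARD-style) classifier in Python. The tuple size, the number of RAM nodes, and the binary-input assumption are illustrative choices, not the paper's experimental setup; note that fit uses the standard one-shot training the abstract refers to, which records addresses rather than minimising training-set error.

```python
import numpy as np

class NTupleClassifier:
    """Minimal n-tuple classifier: each RAM node watches n random bit
    positions and remembers, per class, which address patterns occurred
    during training."""

    def __init__(self, n_bits, n=4, n_tuples=50, n_classes=2, seed=0):
        rng = np.random.default_rng(seed)
        # Each tuple is a random selection of n input bit positions.
        self.tuples = [rng.choice(n_bits, size=n, replace=False)
                       for _ in range(n_tuples)]
        self.mem = np.zeros((n_tuples, n_classes, 2 ** n), dtype=bool)
        self.pows = 2 ** np.arange(n)

    def _addresses(self, x):
        # Turn each watched n-tuple of bits into a RAM address.
        return [int(x[t] @ self.pows) for t in self.tuples]

    def fit(self, X, y):
        # One-shot training: mark each addressed cell as seen. As noted
        # above, this does not minimise error on the training set.
        for x, c in zip(X, y):
            for i, a in enumerate(self._addresses(x)):
                self.mem[i, c, a] = True
        return self

    def predict(self, X):
        # Score each class by how many RAM nodes recognise the address.
        preds = []
        for x in X:
            scores = sum(self.mem[i, :, a]
                         for i, a in enumerate(self._addresses(x)))
            preds.append(int(np.argmax(scores)))
        return np.array(preds)

# Smoke test on synthetic binary patterns (not the paper's data).
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(200, 64))
y = (X[:, :32].sum(axis=1) > X[:, 32:].sum(axis=1)).astype(int)
clf = NTupleClassifier(n_bits=64).fit(X, y)
print("training accuracy:", (clf.predict(X) == y).mean())
```

The experimentally estimated VC dimension discussed in the paper would then characterise the capacity of exactly this kind of lookup-table hypothesis class as the tuple size and count vary.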

5 citations