Journal ArticleDOI

A formal analysis of stopping criteria of decomposition methods for support vector machines

Chih-Jen Lin
01 Sep 2002
Vol. 13, Iss. 5, pp. 1045-1052
TLDR
A result shows that, in final iterations of the decomposition method, only a particular set of variables are still being modified, which supports the use of the shrinking and caching techniques in some existing implementations.
Abstract
In a previous paper, the author (2001) proved the convergence of a commonly used decomposition method for support vector machines (SVMs). However, there is no theoretical justification for its stopping criterion, which is based on the gap of the violation of the optimality condition. It is essential that the gap asymptotically approach zero, so we are sure that existing implementations stop in a finite number of iterations after reaching a specified tolerance. Here, we prove this result and illustrate it by two extensions: ν-SVM and a multiclass SVM by Crammer and Singer (2001). A further result shows that, in final iterations of the decomposition method, only a particular set of variables are still being modified. This supports the use of the shrinking and caching techniques in some existing implementations. Finally, we prove the asymptotic convergence of a decomposition method for this multiclass SVM. Discussions on the difference between this convergence proof and the one in another paper by Lin are also included.
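
The stopping criterion discussed in the abstract is, in LIBSVM-style decomposition implementations, typically the maximal-violating-pair gap m(alpha) - M(alpha) of the C-SVM dual problem: the solver stops once this gap falls below a tolerance eps. The sketch below is a minimal illustration under that assumption, for the standard dual with labels y_i in {+1, -1}; the function and variable names are illustrative and are not taken from the paper or from LIBSVM.

```python
import numpy as np

def violation_gap(alpha, grad, y, C):
    """Maximal-violating-pair gap m(alpha) - M(alpha) for the C-SVM dual.

    Illustrative sketch (names are assumptions, not the paper's code):
      alpha : current dual variables, shape (n,)
      grad  : gradient of the dual objective 0.5*a'Qa - e'a, i.e. Q @ alpha - 1
      y     : labels in {+1, -1}, shape (n,)
      C     : upper bound on each alpha_i
    Assumes both index sets below are nonempty, which holds at the usual
    starting point alpha = 0 whenever both classes are present.
    """
    # I_up: indices whose dual variable can still move "up"
    up = ((y == 1) & (alpha < C)) | ((y == -1) & (alpha > 0))
    # I_low: indices whose dual variable can still move "down"
    low = ((y == 1) & (alpha > 0)) | ((y == -1) & (alpha < C))
    m = np.max(-y[up] * grad[up])    # m(alpha)
    M = np.min(-y[low] * grad[low])  # M(alpha)
    return m - M

def should_stop(alpha, grad, y, C, eps=1e-3):
    """Stop the decomposition iterations once the gap is within tolerance."""
    return violation_gap(alpha, grad, y, C) <= eps
```

The paper's result is what guarantees that a loop guarded by a check like should_stop terminates after finitely many iterations for any tolerance eps > 0.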


Citations
Journal ArticleDOI

LIBSVM: A library for support vector machines

TL;DR: Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Journal ArticleDOI

A comparison of methods for multiclass support vector machines

TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that, for large problems, methods that consider all data at once generally need fewer support vectors.

A Comparison of Methods for Multi-class Support Vector Machines

TL;DR: These experiments indicate that the "one-against-one" and DAG methods are more suitable for practical use than the other methods, and show that, for large problems, methods that consider all data at once generally need fewer support vectors.
Journal ArticleDOI

Training ν-Support Vector Classifiers: Theory and Algorithms

TL;DR: A decomposition method for ν-SVM is proposed that is competitive with existing methods for C-SVM, and it is shown that in general the two are different problems with the same optimal solution set.

Support Vector Machine Solvers

TL;DR: This chapter contains sections titled: Introduction, Support Vector Machines, Duality, Sparsity, Early SVM Algorithms, The Decomposition Method, A Case Study: LIBSVM, Conclusion and Outlook.
References
Journal ArticleDOI

LIBSVM: A library for support vector machines

TL;DR: Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Journal ArticleDOI

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, applying these estimates to real-life problems, and much more.
Journal ArticleDOI

A comparison of methods for multiclass support vector machines

TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that, for large problems, methods that consider all data at once generally need fewer support vectors.
Proceedings ArticleDOI

Advances in kernel methods: support vector learning

TL;DR: An edited collection on support vector learning; chapters include "Support vector machines for dynamic reconstruction of a chaotic system" (Klaus-Robert Muller et al.) and "Pairwise classification and support vector machines" (Ulrich Kressel).