New Support Vector Algorithms
TL;DR
A new class of support vector algorithms for regression and classification in which a new parameter ν eliminates one of the other free parameters of the algorithm: the accuracy parameter ε in the regression case, and the regularization constant C in the classification case.

Abstract
We propose a new class of support vector algorithms for regression and classification. In these algorithms, a parameter ν lets one effectively control the number of support vectors. While this can be useful in its own right, the parameterization has the additional benefit of enabling us to eliminate one of the other free parameters of the algorithm: the accuracy parameter ε in the regression case, and the regularization constant C in the classification case. We describe the algorithms, give some theoretical results concerning the meaning and the choice of ν, and report experimental results.
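The ν formulation described in the abstract is implemented in scikit-learn as `NuSVC` and `NuSVR`. The following minimal sketch (with a synthetic, illustrative dataset) shows the key property: ν is a lower bound on the fraction of support vectors, and in regression it replaces the ε accuracy parameter.

```python
import numpy as np
from sklearn.svm import NuSVC, NuSVR

rng = np.random.RandomState(0)

# Classification: two well-separated Gaussian blobs, 50 points per class.
X = np.vstack([rng.randn(50, 2) - 2, rng.randn(50, 2) + 2])
y = np.array([0] * 50 + [1] * 50)

# nu upper-bounds the fraction of margin errors and
# lower-bounds the fraction of support vectors.
clf = NuSVC(nu=0.3, kernel="rbf", gamma="scale").fit(X, y)
frac_sv = len(clf.support_) / len(X)
print(f"fraction of support vectors: {frac_sv:.2f}")

# Regression: nu replaces the epsilon accuracy parameter;
# the width of the epsilon-tube is adapted automatically.
Xr = np.linspace(0, 1, 100).reshape(-1, 1)
yr = np.sin(2 * np.pi * Xr).ravel() + 0.1 * rng.randn(100)
reg = NuSVR(nu=0.5, C=1.0).fit(Xr, yr)
```

Here ν = 0.3 means at least roughly 30% of the training points end up as support vectors, while at most 30% lie inside the margin; the exact dataset and parameter values above are illustrative choices, not ones from the paper's experiments.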
Citations
Journal Article
A comparative analysis of artificial neural network (ANN), wavelet neural network (WNN), and support vector machine (SVM) data-driven models to mineral potential mapping for copper mineralizations in the Shahr-e-Babak region, Kerman, Iran
TL;DR: An alternative method of mineral potential mapping is presented, based on integrating wavelet theory with ANN or WNN; results indicate that the WNN method with the POLYWOG 3 transfer function has a strong ability to learn and track unknown/undefined complex systems.
Book Chapter
Learning with Rigorous Support Vector Machines
Jinbo Bi, Vladimir Vapnik +1 more
TL;DR: RSVM produces classifiers equivalent to those obtained by classic SVMs for appropriate parameter choices, but the use of the parameter H facilitates model selection, thus minimizing VC bounds on the generalization risk more effectively.
Proceedings Article
Machine learning using hyperkernels
TL;DR: It is demonstrated that the kernel itself can be learned for various formulations of machine learning problems, with mathematical programming formulations and experimental results for the C-SVM, ν-SVM, and Lagrangian SVM on UCI classification data and on novelty detection.
Journal Article
Automatic classification of weld defects in radiographic images
TL;DR: This work proposes a method based on the direct multiclass support vector machine (DMSVM) to classify weld defects, which generalises well even with a small training set, and suggests four new features that greatly improve the separability of the feature group.
Journal Article
Accurate on-line ν-support vector learning
TL;DR: A new, effective, and accurate on-line algorithm designed from a modified formulation of the original ν-SVM, which achieves fast convergence, especially with the Gaussian kernel, and is faster than the batch algorithm.
References
Book
The Nature of Statistical Learning Theory
TL;DR: Covers the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Journal Article
Support-Vector Networks
Corinna Cortes, Vladimir Vapnik +1 more
TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Book
Matrix Analysis
Roger A. Horn, Charles R. Johnson +1 more
TL;DR: In this article, the authors present both classic and recent results of matrix analysis using canonical forms as a unifying theme, and demonstrate their importance in a variety of applications.
Journal ArticleDOI
A Tutorial on Support Vector Machines for Pattern Recognition
TL;DR: Several arguments supporting the observed high accuracy of SVMs are reviewed, and numerous examples and proofs of most of the key theorems are given.
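The Support-Vector Networks reference above highlights polynomial input transformations. As a hedged sketch (using scikit-learn's `SVC`, with a synthetic XOR-style dataset chosen here for illustration, not taken from that paper), a polynomial kernel separates data that a linear machine cannot:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(1)
# XOR-like pattern: the label is the sign of x1 * x2,
# so the classes are not linearly separable.
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)

# A degree-2 polynomial kernel induces the cross-term x1 * x2 as a feature,
# which makes this problem separable in the transformed space.
poly = SVC(kernel="poly", degree=2, coef0=1, C=10.0).fit(X, y)
linear = SVC(kernel="linear", C=10.0).fit(X, y)

poly_acc = poly.score(X, y)
lin_acc = linear.score(X, y)
print(f"polynomial kernel: {poly_acc:.2f}, linear kernel: {lin_acc:.2f}")
```

The polynomial kernel reaches near-perfect training accuracy while the linear kernel stays close to chance, illustrating why kernelized transformations are central to the SVM literature collected on this page.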