New Support Vector Algorithms
TLDR
A new class of support vector algorithms for regression and classification in which a parameter ν eliminates one of the other free parameters of the algorithm: the accuracy parameter ε in the regression case, and the regularization constant C in the classification case.
Abstract
We propose a new class of support vector algorithms for regression and classification. In these algorithms, a parameter ν lets one effectively control the number of support vectors. While this can be useful in its own right, the parameterization has the additional benefit of enabling us to eliminate one of the other free parameters of the algorithm: the accuracy parameter ε in the regression case, and the regularization constant C in the classification case. We describe the algorithms, give some theoretical results concerning the meaning and the choice of ν, and report experimental results.
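The ν-property described in the abstract — that ν lower-bounds the fraction of support vectors — can be observed directly with an off-the-shelf implementation. The sketch below uses scikit-learn's `NuSVC` (an implementation of ν-SVC; the synthetic two-blob dataset and the chosen `nu=0.5` are illustrative assumptions, not from the paper):

```python
import numpy as np
from sklearn.svm import NuSVC

# Two overlapping Gaussian blobs as a toy binary classification problem.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (40, 2)),
               rng.normal(1.0, 1.0, (40, 2))])
y = np.array([0] * 40 + [1] * 40)

# nu replaces the regularization constant C of standard soft-margin SVC.
clf = NuSVC(nu=0.5, kernel="rbf", gamma="scale")
clf.fit(X, y)

# By the nu-property, the fraction of support vectors is at least nu.
frac_sv = clf.support_vectors_.shape[0] / len(X)
print(f"fraction of support vectors: {frac_sv:.2f}")
```

The analogous regression variant, `NuSVR`, likewise trades the accuracy parameter ε for ν.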
Citations
Journal ArticleDOI
Enhancement of Visual Comfort and Sense of Presence on Stereoscopic 3D Images
TL;DR: A new visual comfort enhancement approach for processing S3D visual signals to deliver a more comfortable 3D viewing experience at the display through an optimization process whereby a predictive indicator of visual discomfort is minimized, while still aiming to maintain the viewer’s sense of 3D presence.
Patent
Defect classification with optimized purity
Vladimir Shlain,Assaf Glazer +1 more
TL;DR: In this article, a method for defect analysis is presented that builds single-class classifiers for a plurality of defect classes, each class characterized by respective ranges of inspection parameter values, while labeling defects that fall into no class as unknown defects.
Journal ArticleDOI
Rapid Identification of Rainbow Trout Adulteration in Atlantic Salmon by Raman Spectroscopy Combined with Machine Learning.
TL;DR: The results indicate that Raman spectroscopy can be used as an effective Atlantic salmon adulteration identification method and that the developed GA–KM–Cubist machine learning model achieved satisfactory results based on MSC preprocessing.
Journal ArticleDOI
Supervised machine learning techniques and genetic optimization for occupational diseases risk prediction
TL;DR: Three different machine learning approaches are compared: the first is based on the k-means algorithm, which determines a set of meaningful labelled clusters as the final model, while the latter two are based on fully supervised techniques, namely Support Vector Machines and k-Nearest Neighbours.
Journal ArticleDOI
Provably Fast Training Algorithms for Support Vector Machines
TL;DR: An upper bound on the expected running time is formally proved which is quasilinear with respect to the number of data points and polynomial with respect to the other parameters, i.e., the number and the inverse of a chosen soft margin parameter.
References
Book
The Nature of Statistical Learning Theory
TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?
Journal ArticleDOI
Support-Vector Networks
Corinna Cortes,Vladimir Vapnik +1 more
TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Book
Matrix Analysis
Roger A. Horn,Charles R. Johnson +1 more
TL;DR: In this article, the authors present both classical and recent results of matrix analysis, using canonical forms as a unifying theme, and demonstrate their importance in a variety of applications of linear algebra and matrix theory.
Journal ArticleDOI
A Tutorial on Support Vector Machines for Pattern Recognition
TL;DR: There are several arguments which support the observed high accuracy of SVMs, which are reviewed and numerous examples and proofs of most of the key theorems are given.