Journal ArticleDOI
LIBSVM: A library for support vector machines
Chih-Chung Chang, Chih-Jen Lin +1 more
TLDR
Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Abstract:
LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users easily apply SVM to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
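As a hedged illustration (not taken from the article itself), the multiclass classification and probability-estimate features the abstract mentions can be exercised through scikit-learn's `SVC`, which is built on LIBSVM; the dataset and hyperparameter values below are illustrative assumptions, not the article's.

```python
# Minimal sketch: multiclass SVM classification with probability estimates,
# via scikit-learn's SVC (a LIBSVM wrapper). Data and hyperparameters are
# illustrative assumptions, not values from the article.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)  # a standard 3-class problem
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# RBF kernel; probability=True enables the Platt-style probability
# estimates that the LIBSVM article discusses.
clf = SVC(kernel="rbf", C=1.0, gamma="scale", probability=True, random_state=0)
clf.fit(X_tr, y_tr)

proba = clf.predict_proba(X_te)  # shape (n_samples, 3); each row sums to 1
pred = clf.predict(X_te)
```

Internally, LIBSVM handles the multiclass case with a one-vs-one decomposition, training one binary SVM per class pair and combining their votes.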
Citations
Journal ArticleDOI
Vision-based action recognition of earthmoving equipment using spatio-temporal features and support vector machine classifiers
TL;DR: A computer vision based algorithm for recognizing single actions of earthmoving construction equipment, based on a multiple binary SVM classifier and spatio-temporal features, which outperforms previous algorithms for excavator and truck action recognition.
Proceedings ArticleDOI
Speech emotion recognition based on HMM and SVM
Yi-Lin Lin, Gang Wei +1 more
TL;DR: Two classification methods, the hidden Markov model (HMM) and the support vector machine (SVM), are used, to classify five emotional states: anger, happiness, sadness, surprise and a neutral state.
Journal ArticleDOI
Dimension Reduction Using Spatial and Spectral Regularized Local Discriminant Embedding for Hyperspectral Image Classification
TL;DR: Experimental results show that the proposed SSRLDE significantly outperforms the state-of-the-art DR methods for HSI classification.
Proceedings ArticleDOI
Combining Multiple Kernel Methods on Riemannian Manifold for Emotion Recognition in the Wild
TL;DR: The method for the Emotion Recognition in the Wild Challenge (EmotiW 2014) is presented, and an optimal fusion of classifiers learned from different kernels and different modalities (video and audio) is conducted at the decision level for further boosting the performance.
Journal ArticleDOI
Predicting Malignant Nodules from Screening CT Scans
Samuel H. Hawkins, Hua Wang, Ying Liu, Alberto Garcia, Olya Stringfield, Henry Krewer, Qian Li, Dmitry Cherezov, Robert A. Gatenby, Yoganand Balagurunathan, Dmitry B. Goldgof, Matthew B. Schabath, Lawrence O. Hall, Robert J. Gillies +13 more
TL;DR: The radiomics of lung cancer screening computed tomography scans at baseline can be used to assess risk for development of cancer.
References
Journal ArticleDOI
Support-Vector Networks
Corinna Cortes, Vladimir Vapnik +1 more
TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Statistical learning theory
TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Proceedings ArticleDOI
A training algorithm for optimal margin classifiers
TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of the classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
A Practical Guide to Support Vector Classification
TL;DR: A simple procedure is proposed, which usually gives reasonable results and is suitable for beginners who are not familiar with SVM.
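The guide's beginner-friendly recipe (scale the features, use an RBF kernel, and cross-validate over a grid of C and gamma values) can be sketched as follows; the dataset and the exponential grid ranges are assumed examples, not the guide's exact values.

```python
# Sketch of the scale + RBF kernel + cross-validated grid search recipe
# for SVM classification. The grid values are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Scaling first matters: RBF distances are sensitive to feature ranges.
pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# Coarse exponentially spaced grid over C and gamma, as the guide suggests.
grid = {
    "svc__C": [2.0**k for k in range(-2, 5, 2)],
    "svc__gamma": [2.0**k for k in range(-6, 1, 2)],
}
search = GridSearchCV(pipe, grid, cv=5)
search.fit(X, y)

best_C = search.best_params_["svc__C"]
best_gamma = search.best_params_["svc__gamma"]
```

In practice one would follow the coarse search with a finer grid around the best (C, gamma) pair before training the final model.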
Journal ArticleDOI
A comparison of methods for multiclass support vector machines
Chih-Wei Hsu, Chih-Jen Lin +1 more
TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that, for large problems, methods that consider all data at once generally need fewer support vectors.