Journal ArticleDOI
LIBSVM: A library for support vector machines
Chih-Chung Chang, Chih-Jen Lin
TL;DR: Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Abstract: LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users easily apply SVM to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
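As a concrete illustration of what "easily apply SVM" looks like in practice (a minimal sketch, not code from the article; it assumes scikit-learn is installed, whose `SVC` class is built on LIBSVM):

```python
# Minimal sketch of training an RBF-kernel SVM classifier.
# Assumption: scikit-learn is available; its SVC wraps LIBSVM.
from sklearn.svm import SVC

# Tiny toy dataset: two well-separated classes.
X = [[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]]
y = [0, 0, 1, 1]

# C and gamma are the main hyperparameters of an RBF-kernel SVM.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X, y)

# Predict one point near each cluster.
print(clf.predict([[0.1, 0.0], [1.0, 0.9]]))
```

The same model could equally be trained with LIBSVM's own command-line tools (`svm-train`, `svm-predict`) on data in its sparse `label index:value` format.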
Citations
Book ChapterDOI
Hough transform and 3D SURF for robust three dimensional classification
TL;DR: A new robust 3D shape classification method is proposed that extends a robust 2D feature descriptor, SURF, to the context of 3D shapes, and it is shown how 3D shape class recognition can be improved by probabilistic Hough transform based methods, already popular in 2D.
Journal ArticleDOI
Chaotic multi-swarm whale optimizer boosted support vector machine for medical diagnosis
Mingjing Wang, Huiling Chen, et al.
TL;DR: To perform parameter optimization and feature selection for SVM simultaneously, an improved whale optimization algorithm (CMWOA) combining chaotic and multi-swarm strategies is proposed; it significantly outperformed all the other competitors in terms of classification performance and feature subset size.
Journal ArticleDOI
Decision Tree and SVM-Based Data Analytics for Theft Detection in Smart Grid
TL;DR: This paper proposes a comprehensive top-down scheme that precisely detects and locates real-time electricity theft at every level of power transmission and distribution (T&D).
Journal ArticleDOI
SOLpro: accurate sequence-based prediction of protein solubility
TL;DR: A sequence-based method that accurately predicts the propensity of a protein to be soluble upon overexpression is presented; it could be used to prioritize targets in large-scale proteomics projects and to identify mutations likely to increase the solubility of insoluble proteins.
Journal ArticleDOI
Automatic speech emotion recognition using modulation spectral features
TL;DR: Modulation spectral features are proposed for the automatic recognition of human affective information from speech; they yield a substantial improvement in recognition performance when used to augment prosodic features, which have been extensively used for emotion recognition.
References
Journal ArticleDOI
Support-Vector Networks
Corinna Cortes, Vladimir Vapnik
TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Statistical learning theory
TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Proceedings ArticleDOI
A training algorithm for optimal margin classifiers
TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
A Practical Guide to Support Vector Classification
TL;DR: A simple procedure is proposed, which usually gives reasonable results and is suitable for beginners who are not familiar with SVM.
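The "simple procedure" the guide recommends is, in essence, a cross-validated grid search over exponentially growing values of C and the RBF parameter gamma. A hedged sketch of that procedure (assuming scikit-learn, whose `SVC` wraps LIBSVM; the specific grid endpoints below are illustrative):

```python
# Sketch of the guide's recommended model selection: grid search over
# exponentially spaced C and gamma, scored by cross-validation.
# Assumption: scikit-learn is available; grid bounds are illustrative.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Exponentially growing grids, in the spirit of C = 2^-5 ... 2^15,
# gamma = 2^-15 ... 2^3 (coarsened here to keep the search small).
param_grid = {
    "C": [2.0**k for k in range(-5, 16, 4)],
    "gamma": [2.0**k for k in range(-15, 4, 4)],
}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

A coarse grid like this is typically refined with a finer search around the best cell, as the guide suggests.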
Journal ArticleDOI
A comparison of methods for multiclass support vector machines
Chih-Wei Hsu, Chih-Jen Lin
TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that, for large problems, methods that consider all data at once generally need fewer support vectors.
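As context for this comparison (an illustrative sketch, not code from the paper): the main alternative to "all-together" methods is to decompose a k-class task into binary sub-problems. LIBSVM itself uses the one-against-one strategy, which trains one binary SVM per pair of classes, versus one per class for one-against-rest:

```python
# Number of binary SVM sub-problems each decomposition strategy trains
# for a k-class task. One-against-one (used by LIBSVM) trains one
# classifier per class pair; one-against-rest trains one per class.
def num_binary_svms(k: int, strategy: str) -> int:
    if strategy == "one-against-one":
        return k * (k - 1) // 2
    if strategy == "one-against-rest":
        return k
    raise ValueError(f"unknown strategy: {strategy}")

print(num_binary_svms(10, "one-against-one"))   # 45 pairwise classifiers
print(num_binary_svms(10, "one-against-rest"))  # 10 classifiers
```

Each one-against-one sub-problem is smaller (only two classes' data), which is why the pairwise count growing as k(k-1)/2 is often still cheap to train in total.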