Journal ArticleDOI
LIBSVM: A library for support vector machines
Chih-Chung Chang, Chih-Jen Lin
TL;DR: Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Abstract: LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users to easily apply SVM to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
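One of the issues the abstract mentions, multiclass classification, is handled in LIBSVM by a one-vs-one scheme: one binary classifier is trained per pair of classes, and each casts a vote at prediction time. The sketch below shows only the voting logic in plain Python; the pairwise rule passed in is a hypothetical stand-in, not a trained SVM.

```python
# One-vs-one multiclass voting, the scheme LIBSVM uses.
# Each of the k*(k-1)/2 pairwise classifiers casts one vote;
# the class with the most votes wins.
from itertools import combinations
from collections import Counter

def one_vs_one_predict(classes, pairwise_predict):
    """classes: list of labels; pairwise_predict(a, b) returns a or b."""
    votes = Counter()
    for a, b in combinations(classes, 2):
        votes[pairwise_predict(a, b)] += 1
    # The winner is the label that collects the most pairwise votes.
    return votes.most_common(1)[0][0]

# Toy example: a stand-in pairwise rule that always prefers the larger label.
print(one_vs_one_predict([0, 1, 2], lambda a, b: max(a, b)))  # → 2
```

In LIBSVM itself the pairwise decisions come from binary SVMs; ties are broken by the library's own rule, which this toy sketch does not model.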
Citations
Proceedings ArticleDOI
Extracting shared subspace for multi-label classification
TL;DR: This paper considers a general framework for extracting shared structures in multi-label classification that includes several well-known algorithms as special cases, thus elucidating their intrinsic relationships.
Journal ArticleDOI
Learning Sparse Codes for Hyperspectral Imagery
TL;DR: This work modified an existing unsupervised learning approach and applied it to HSI data to learn an optimal sparse coding dictionary, which improves the performance of a supervised classification algorithm, both in terms of the classifier complexity and generalization from very small training sets.
Journal ArticleDOI
Mining Social Media Data for Understanding Students' Learning Experiences
TL;DR: This work presents a methodology and results that show how informal social media data can provide insights into students' experiences, and implements a multi-label classification algorithm to classify tweets reflecting students' problems.
Journal ArticleDOI
Unsupervised Spatial–Spectral Feature Learning by 3D Convolutional Autoencoder for Hyperspectral Classification
TL;DR: Experimental results on several benchmark hyperspectral data sets have demonstrated that the proposed 3D-CAE is very effective in extracting spatial–spectral features and outperforms not only traditional unsupervised feature extraction algorithms but also many supervised feature extraction algorithms in classification applications.
Proceedings ArticleDOI
Cuff-less high-accuracy calibration-free blood pressure estimation using pulse transit time
TL;DR: A novel, calibration-free method is proposed for accurate and reliable estimation of blood pressure, based on extracting several physiological parameters from the photoplethysmography (PPG) signal and applying signal processing and machine learning algorithms.
References
Journal ArticleDOI
Support-Vector Networks
Corinna Cortes, Vladimir Vapnik
TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
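The polynomial input transformation this summary highlights is applied implicitly through a polynomial kernel of the form K(x, z) = (x·z + 1)^d, so the transformed feature space never has to be constructed explicitly. A minimal sketch (the degree and the toy vectors below are made up for illustration):

```python
# Polynomial kernel: inner product in the implicit polynomial feature
# space, computed directly from the input vectors.
def poly_kernel(x, z, d=2):
    dot = sum(a * b for a, b in zip(x, z))
    return (dot + 1) ** d

# Toy example: x.z = 1*3 + 2*4 = 11, so K = (11 + 1)^2 = 144.
print(poly_kernel([1.0, 2.0], [3.0, 4.0], d=2))  # → 144.0
```

This "kernel trick" is what lets a support-vector network learn a nonlinear decision boundary while still solving a linear problem in the transformed space.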
Statistical learning theory
TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimates from small data pools, applying these estimations to real-life problems, and much more.
Proceedings ArticleDOI
A training algorithm for optimal margin classifiers
TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of the classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
A Practical Guide to Support Vector Classification
TL;DR: A simple procedure is proposed, which usually gives reasonable results and is suitable for beginners who are not familiar with SVM.
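The simple procedure this guide proposes starts by linearly scaling each attribute (e.g. to [0, 1]) before training, using the training set's min/max, which must then be reused on test data. A minimal pure-Python sketch of that scaling step (the helper name and toy data are assumptions, not from the guide):

```python
# Linearly scale each column of a numeric dataset to [0, 1] using
# per-column min/max from the training rows. Constant columns map to 0.
def scale_columns(rows):
    cols = list(zip(*rows))
    mins = [min(c) for c in cols]
    maxs = [max(c) for c in cols]
    scaled = [
        [(x - lo) / (hi - lo) if hi > lo else 0.0
         for x, lo, hi in zip(row, mins, maxs)]
        for row in rows
    ]
    # Return mins/maxs so the same transform can be applied to test data.
    return scaled, mins, maxs

train = [[1.0, 200.0], [3.0, 400.0], [2.0, 300.0]]
scaled, mins, maxs = scale_columns(train)
print(scaled[0])  # → [0.0, 0.0]
```

Scaling both attributes to the same range keeps large-valued features from dominating the kernel computation, which is the guide's motivation for this step.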
Journal ArticleDOI
A comparison of methods for multiclass support vector machines
Chih-Wei Hsu, Chih-Jen Lin
TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that, for large problems, methods that consider all data at once generally need fewer support vectors.