Journal ArticleDOI

LIBSVM: A library for support vector machines

TLDR
Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Abstract
LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users to easily apply SVM to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
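The abstract's points about parameter selection and probability estimates map onto LIBSVM's command-style options. Below is a minimal sketch, assuming the libsvm-official Python package (which exposes libsvm.svmutil); the toy data and parameter values are illustrative only.

```python
# Minimal sketch of driving LIBSVM from its Python interface.
# Assumes the libsvm-official package; data and parameters are illustrative.
import random
from libsvm.svmutil import svm_train, svm_predict

random.seed(0)
y, x = [], []
for _ in range(40):  # two Gaussian blobs with labels +1 / -1
    cls = random.choice([1, -1])
    y.append(cls)
    x.append({1: cls + random.gauss(0, 0.3), 2: cls + random.gauss(0, 0.3)})

# Parameter selection: 5-fold cross-validation accuracy for one (C, gamma) pair.
cv_accuracy = svm_train(y, x, '-t 2 -c 4 -g 0.5 -v 5')

# Train an RBF-kernel C-SVC with probability estimates enabled (-b 1) ...
model = svm_train(y, x, '-t 2 -c 4 -g 0.5 -b 1')

# ... and predict labels plus per-class probability estimates.
labels, accuracy, prob_values = svm_predict(y, x, model, '-b 1')
```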



Citations
Journal ArticleDOI

3D analysis of facial morphology

TL;DR: Dense surface models can be used to analyze 3D facial morphology by establishing a correspondence of thousands of points across each 3D face image, and they provide dramatic visualizations of face-shape variation with potential for training physicians to recognize the key components of particular syndromes.
Journal ArticleDOI

An effective feature selection method for hyperspectral image classification based on genetic algorithm and support vector machine

TL;DR: A hybrid feature selection strategy based on a genetic algorithm and a support vector machine (GA-SVM) forms a wrapper that searches for the best combination of bands yielding higher classification accuracy, which reduces the computational cost of the genetic algorithm.
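The wrapper idea can be sketched in a few lines. This is not the authors' implementation, only a simplified stand-in in which a genetic algorithm searches over band (feature) subsets and an SVM's cross-validated accuracy serves as the fitness function; the dataset, population size, and rates below are assumptions.

```python
# Simplified GA-SVM wrapper sketch: GA over feature masks, SVM accuracy as fitness.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=30, n_informative=8, random_state=0)

def fitness(mask):
    # Cross-validated SVM accuracy on the selected bands.
    if not mask.any():
        return 0.0
    return cross_val_score(SVC(kernel='rbf'), X[:, mask], y, cv=3).mean()

pop = rng.random((12, X.shape[1])) < 0.5           # random initial band subsets
for generation in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-6:]]          # keep the fittest half
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(6, size=2)]
        cut = rng.integers(1, X.shape[1])
        child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
        child ^= rng.random(X.shape[1]) < 0.05      # bit-flip mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print('selected bands:', np.flatnonzero(best))
```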
Proceedings Article

Classifying Political Orientation on Twitter: It’s Not Easy!

TL;DR: An evaluation of standard techniques for inferring political orientation shows that methods which previously reported greater than 90% inference accuracy actually achieve barely 65% accuracy on normal users, and that classifiers cannot be used to classify users outside the narrow range of political orientation on which they were trained.
Journal ArticleDOI

Image annotation by kNN-sparse graph-based label propagation over noisily tagged web images

TL;DR: This article addresses the problem of annotating a large-scale image corpus by label propagation over noisily tagged web images, proposing a novel kNN-sparse graph-based semi-supervised learning approach for harnessing the labeled and unlabeled data simultaneously.
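Not the paper's exact method, but the core idea of propagating a few (possibly noisy) tags over a kNN graph can be illustrated with scikit-learn's LabelSpreading as a stand-in; the sparse-coding step of the cited approach is omitted and the data below is synthetic.

```python
# Stand-in sketch: semi-supervised label propagation over a kNN graph.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.semi_supervised import LabelSpreading

X, y = make_blobs(n_samples=300, centers=3, random_state=0)

# Pretend only ~5% of the images carry tags; unlabeled samples are marked -1.
rng = np.random.default_rng(0)
y_partial = np.where(rng.random(len(y)) < 0.05, y, -1)

propagator = LabelSpreading(kernel='knn', n_neighbors=7)
propagator.fit(X, y_partial)
print('propagated labels for the first 10 samples:', propagator.transduction_[:10])
```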
Journal ArticleDOI

Solar Power Prediction Based on Satellite Images and Support Vector Machine

TL;DR: In this article, a solar power prediction model based on various satellite images and a support vector machine (SVM) learning scheme is proposed to forecast the motion of clouds by utilizing atmospheric motion vectors (AMVs) derived from satellite images.
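A hedged sketch of the general idea, not the cited model: support vector regression mapping illustrative satellite-derived features (such as cloud motion components) to a power value; all features and data here are synthetic assumptions.

```python
# Sketch: support vector regression on synthetic satellite-derived features.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
motion_features = rng.normal(size=(500, 4))   # e.g. AMV u/v components, cloud cover, irradiance proxy
power = 3.0 - motion_features[:, 2] + 0.1 * rng.normal(size=500)  # synthetic target

model = SVR(kernel='rbf', C=10.0, epsilon=0.1).fit(motion_features[:400], power[:400])
print('predicted power for hold-out samples:', model.predict(motion_features[400:405]))
```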
References
Journal ArticleDOI

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Proceedings ArticleDOI

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
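A minimal sketch, not from the cited paper: fitting a soft-margin SVM while swapping the kernel between linear, polynomial, and RBF forms, echoing the classifier families named in the summary; the data and parameters are illustrative.

```python
# Sketch: margin-maximizing SVM with different kernel functions.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

for kernel in ('linear', 'poly', 'rbf'):
    acc = cross_val_score(SVC(kernel=kernel, C=1.0), X, y, cv=5).mean()
    print(f'{kernel} kernel: {acc:.3f} cross-validated accuracy')
```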

A Practical Guide to Support Vector Classification

TL;DR: A simple procedure is proposed, which usually gives reasonable results and is suitable for beginners who are not familiar with SVM.
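The procedure is commonly summarized as: scale each feature, start with the RBF kernel, and pick C and gamma by cross-validated grid search. A hedged sketch using scikit-learn (which wraps LIBSVM), with illustrative data and grid values:

```python
# Sketch of the beginner-friendly recipe: scaling + RBF kernel + (C, gamma) grid search.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

pipeline = make_pipeline(MinMaxScaler(feature_range=(-1, 1)), SVC(kernel='rbf'))
grid = {'svc__C': [2**k for k in range(-5, 16, 4)],
        'svc__gamma': [2**k for k in range(-15, 4, 4)]}
search = GridSearchCV(pipeline, grid, cv=5).fit(X, y)
print('best parameters:', search.best_params_, 'accuracy:', round(search.best_score_, 3))
```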
Journal ArticleDOI

A comparison of methods for multiclass support vector machines

TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that, for large problems, methods that consider all data at once generally need fewer support vectors.
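A small illustration, not the paper's experiments: the one-against-one strategy that LIBSVM adopts trains k*(k-1)/2 binary classifiers for a k-class problem. The sketch below uses scikit-learn's SVC (which wraps LIBSVM) on a standard toy dataset.

```python
# Sketch: counting the pairwise binary subproblems of one-against-one multiclass SVM.
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
k = len(set(y))
clf = SVC(kernel='rbf', decision_function_shape='ovo').fit(X, y)
print('classes:', k, '-> binary subproblems:', k * (k - 1) // 2)
print('pairwise decision values per sample:', clf.decision_function(X[:1]).shape[1])
```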