Journal Article

LIBSVM: A library for support vector machines

TLDR
Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Abstract
LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users easily apply SVM to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
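As a minimal illustration of the features listed above (C-SVC training, multiclass handling, probability estimates), the sketch below uses scikit-learn's SVC, which is built on LIBSVM. The dataset and parameter values are illustrative assumptions, not settings from the paper.

```python
# Minimal sketch of C-SVC with probability estimates via scikit-learn's
# LIBSVM-backed SVC. Dataset and parameters are illustrative only.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)                      # small 3-class toy problem
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = SVC(C=1.0, kernel="rbf", gamma="scale",          # RBF-kernel C-SVC
          probability=True)                            # enable probability estimates
clf.fit(X_tr, y_tr)

print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
print("class probabilities for first test point:", clf.predict_proba(X_te[:1]))
```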



Citations
Journal Article

Semi-Supervised Sparse Representation Based Classification for Face Recognition With Insufficient Labeled Samples

TL;DR: In this paper, a semi-supervised sparse representation-based classification method was proposed to deal with the non-linear nuisance variations between labeled and unlabeled samples; it combines a gallery dictionary, consisting of one or more examples of each person, with a variation dictionary representing linear nuisance variables (e.g., different lighting conditions and different glasses).
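The full semi-supervised method is beyond a short snippet, but the sparse-representation step it builds on can be sketched: code a probe face over a stacked gallery-plus-variation dictionary with an l1 penalty and assign the class with the smallest reconstruction residual. The dictionary contents, the lasso solver, and the alpha value below are assumptions for illustration, not the authors' setup.

```python
# Rough sketch of sparse-representation-based classification with a gallery
# dictionary G (one column per labeled face) and a variation dictionary V
# (columns modeling lighting/occlusion differences). Illustrative only.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
d, n_classes, per_class, n_var = 64, 5, 2, 10        # toy sizes (assumed)
G = rng.normal(size=(d, n_classes * per_class))      # gallery columns
labels = np.repeat(np.arange(n_classes), per_class)  # class of each gallery column
V = rng.normal(size=(d, n_var))                      # variation columns
D = np.hstack([G, V])                                # combined dictionary
probe = G[:, 3] + 0.1 * V @ rng.normal(size=n_var)   # a probe with some variation

lasso = Lasso(alpha=0.01, fit_intercept=False, max_iter=10000)
lasso.fit(D, probe)
a, b = lasso.coef_[:G.shape[1]], lasso.coef_[G.shape[1]:]  # gallery / variation parts

# Class-wise residual: keep only that class's gallery coefficients plus the
# shared variation term, then pick the class that reconstructs the probe best.
residuals = []
for c in range(n_classes):
    a_c = np.where(labels == c, a, 0.0)
    residuals.append(np.linalg.norm(probe - (G @ a_c + V @ b)))
print("predicted class:", int(np.argmin(residuals)))
```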
Journal Article

Time Complexity Analysis of Support Vector Machines (SVM) in LibSVM

TL;DR: The study shows that the time complexity of SVM training in LibSVM is O(n³), that the C++ implementation is faster than the Java one in both training and testing, and that computation time increases as the data grow.
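A quick way to see this super-linear scaling empirically (not a proof of the O(n³) figure) is to time training on growing sample sizes. The dataset, sizes, and parameters below are arbitrary assumptions.

```python
# Illustrative timing of SVM training as the number of samples grows.
# Shows the trend only; it does not establish the O(n^3) bound itself.
import time
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
for n in (500, 1000, 2000, 4000):
    X = rng.normal(size=(n, 20))
    y = (X[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)
    t0 = time.perf_counter()
    SVC(C=1.0, kernel="rbf", gamma="scale").fit(X, y)
    print(f"n={n:5d}  train time {time.perf_counter() - t0:.3f} s")
```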
Journal Article

Feature subset selection and feature ranking for multivariate time series

TL;DR: This work proposes a family of novel unsupervised methods for feature subset selection from multivariate time series (MTS) based on common principal component analysis, termed CLeVer, which outperforms RFE, FC, and random selection by up to a factor of two in classification accuracy while taking up to two orders of magnitude less processing time than RFE and FC.
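CLeVer's exact algorithm is not reproduced here; the sketch below only illustrates the underlying idea of ranking the variables of a multivariate time series by their loadings on principal components shared across items. The data shapes, the number of components, and the aggregation by averaging correlation matrices are assumptions.

```python
# Rough illustration of ranking MTS variables via "common" principal components:
# average the per-item correlation matrices, take the leading eigenvectors of the
# average, and score each variable by its loading magnitude. Not CLeVer itself.
import numpy as np

rng = np.random.default_rng(0)
n_items, T, n_vars, n_components = 30, 100, 6, 2     # toy sizes (assumed)
items = [rng.normal(size=(T, n_vars)) for _ in range(n_items)]

avg_corr = np.mean([np.corrcoef(item, rowvar=False) for item in items], axis=0)
eigvals, eigvecs = np.linalg.eigh(avg_corr)           # eigenvalues in ascending order
leading = eigvecs[:, -n_components:]                  # leading "common" components

scores = np.linalg.norm(leading, axis=1)              # per-variable loading magnitude
ranking = np.argsort(scores)[::-1]
print("variables ranked by loading:", ranking)
```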
Proceedings Article

Question Classification using Head Words and their Hypernyms

TL;DR: This work proposes a head-word feature, presents two approaches to augmenting the semantic features of such head words using WordNet, and yields a compact yet effective feature set.
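One way to realize this kind of hypernym augmentation is through NLTK's WordNet interface: look up the head word's synsets and walk up their hypernym chains to produce extra features. The head word below is hard-coded and the depth cutoff is an assumption; the paper's own feature extraction differs in detail.

```python
# Sketch of augmenting a question's head word with WordNet hypernyms as features.
# Requires: pip install nltk, then nltk.download("wordnet") once.
from nltk.corpus import wordnet as wn

def hypernym_features(head_word, max_depth=3):
    """Collect hypernym lemma names up to max_depth levels above the head word."""
    features = set()
    for synset in wn.synsets(head_word, pos=wn.NOUN):
        frontier = [synset]
        for _ in range(max_depth):
            frontier = [h for s in frontier for h in s.hypernyms()]
            features.update(l.name() for s in frontier for l in s.lemmas())
    return features

# e.g. the head word of "What city hosted the 1992 Olympics?" is "city"
print(sorted(hypernym_features("city")))
```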
Proceedings Article

The Interestingness of Images

TL;DR: This work introduces a set of features that computationally capture the three main aspects of visual interestingness, builds an interestingness predictor from them, and demonstrates it on three datasets with varying context, reflecting the differing prior knowledge of the viewers.
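Given precomputed feature vectors for images and human interestingness ratings, a predictor of the kind described above can be sketched as a regression model. The use of support vector regression and the random stand-in data below are assumptions, not the paper's setup.

```python
# Sketch: regressing interestingness scores from precomputed image features.
# Random data stands in for real features and annotations.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

rng = np.random.default_rng(0)
features = rng.normal(size=(300, 32))                 # assumed precomputed image features
scores = features[:, :3].sum(axis=1) + 0.1 * rng.normal(size=300)  # stand-in ratings

X_tr, X_te, y_tr, y_te = train_test_split(features, scores, random_state=0)
model = SVR(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print("R^2 on held-out images:", round(model.score(X_te, y_te), 3))
```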
References
Journal Article

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared with various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
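The polynomial input transformation mentioned above corresponds to a polynomial kernel in modern SVM libraries; the sketch below fits such a classifier on a small OCR-style digits dataset. The degree and other parameters are illustrative, not the paper's settings.

```python
# Sketch of a support-vector classifier with a polynomial kernel,
# the modern analogue of the polynomial input transformations described above.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)                   # small OCR-style digits data
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = SVC(kernel="poly", degree=3, C=1.0, gamma="scale").fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```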

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Proceedings Article

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
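For the linear case, the margin being maximized is 2/||w||. A small check of that quantity after fitting a linear SVM is sketched below; the toy data and the value of C are illustrative assumptions.

```python
# Sketch: fit a linear SVM and report the geometric margin 2 / ||w||.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=100, centers=2, random_state=0)   # separable-ish toy data
clf = SVC(kernel="linear", C=10.0).fit(X, y)

w = clf.coef_[0]
margin = 2.0 / np.linalg.norm(w)
print("margin between the two supporting hyperplanes:", round(margin, 3))
print("number of support vectors per class:", clf.n_support_)
```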

A Practical Guide to Support Vector Classification

TL;DR: A simple procedure is proposed, which usually gives reasonable results and is suitable for beginners who are not familiar with SVM.
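The procedure recommended in this guide is commonly summarized as: scale the features, use the RBF kernel, and pick C and gamma by cross-validated grid search over exponentially growing values. The sketch below follows that recipe with scikit-learn; the dataset and grid values are illustrative.

```python
# Sketch of the usual "scale + RBF kernel + grid search over C and gamma" recipe.
from sklearn.datasets import load_wine
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
grid = {"svc__C": [2**k for k in range(-5, 16, 4)],
        "svc__gamma": [2**k for k in range(-15, 4, 4)]}
search = GridSearchCV(pipe, grid, cv=5).fit(X_tr, y_tr)

print("best parameters:", search.best_params_)
print("held-out accuracy:", round(search.score(X_te, y_te), 3))
```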
Journal Article

A comparison of methods for multiclass support vector machines

TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that, for large problems, methods that consider all data at once generally need fewer support vectors.
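LIBSVM itself handles multiclass classification with the one-against-one decomposition rather than an all-together formulation. The sketch below simply makes the k(k-1)/2 pairwise classifiers visible through scikit-learn's "ovo" decision function shape; the dataset is illustrative.

```python
# Sketch: the one-against-one decomposition trains k(k-1)/2 binary classifiers;
# with decision_function_shape="ovo" their pairwise outputs are visible.
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)                     # k = 3 classes
clf = SVC(kernel="rbf", gamma="scale",
          decision_function_shape="ovo").fit(X, y)

k = len(clf.classes_)
print("pairwise classifiers:", k * (k - 1) // 2)      # 3 for three classes
print("decision values per sample:", clf.decision_function(X[:1]).shape[1])
```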