Journal ArticleDOI

LIBSVM: A library for support vector machines

TLDR
Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Abstract
LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users easily apply SVM to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
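As a rough illustration of how the library is typically applied, the sketch below trains an RBF-kernel SVM with probability estimates on a small multiclass dataset. It uses scikit-learn's SVC, which is built on top of LIBSVM rather than the library's own command-line tools or C API; the dataset, the scaling step, and the parameter values (C, gamma) are illustrative assumptions, not taken from the article.

    # Minimal sketch: RBF-kernel SVM with probability estimates via scikit-learn's
    # SVC, which wraps LIBSVM. Data and parameter values are illustrative only.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)                      # small multiclass dataset
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    scaler = StandardScaler().fit(X_train)                 # scale features before training
    clf = SVC(kernel="rbf", C=1.0, gamma="scale", probability=True)
    clf.fit(scaler.transform(X_train), y_train)            # one-vs-one multiclass, as in LIBSVM

    print("test accuracy:", clf.score(scaler.transform(X_test), y_test))
    print("class probabilities, first test point:",
          clf.predict_proba(scaler.transform(X_test[:1])))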



Citations
Journal ArticleDOI

Wavelet-Based Energy Features for Glaucomatous Image Classification

TL;DR: This paper proposes a novel technique that extracts energy signatures using the 2-D discrete wavelet transform and subjects these signatures to different feature ranking and feature selection strategies, achieving an accuracy of around 93% under tenfold cross-validation.
Journal ArticleDOI

A feature weighted support vector machine and K-nearest neighbor algorithm for stock market indices prediction

TL;DR: A basic hybridized framework combining a feature-weighted support vector machine with a feature-weighted K-nearest neighbor algorithm is proposed to effectively predict stock market indices; it achieves better prediction capability for the Shanghai Stock Exchange Composite Index and the Shenzhen Stock Exchange Component Index over the short, medium, and long term, respectively.
Journal ArticleDOI

Predictive modelling for solar thermal energy systems: A comparison of support vector regression, random forest, extra trees and regression trees

TL;DR: It was found that RF and ET have comparable predictive power and are equally applicable for predicting useful solar thermal energy (USTE), with root mean square error values of 6.86 and 7.12 on the testing dataset, respectively.
Journal ArticleDOI

Support vector machines under adversarial label contamination

TL;DR: This work considers an attacker that aims to maximize the SVM's classification error by flipping a number of labels in the training data, formalizes a corresponding optimal attack strategy, and solves it by means of heuristic approaches to keep the computational complexity tractable.
Journal ArticleDOI

Remaining Useful Life Estimation in Rolling Bearings Utilizing Data-Driven Probabilistic E-Support Vectors Regression

TL;DR: A data-driven approach for remaining useful life (RUL) estimation of rolling element bearings based on ε-Support Vector Regression is presented, with Wiener entropy utilized for the first time in the condition monitoring of rolling bearings.
References
Journal ArticleDOI

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Proceedings ArticleDOI

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.

A Practical Guide to Support Vector Classification

TL;DR: A simple procedure is proposed, which usually gives reasonable results and is suitable for beginners who are not familiar with SVM.
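The parameter selection discussed in the guide and in the LIBSVM article is commonly carried out as a cross-validated grid search over C and gamma. The sketch below shows one way to do this with scikit-learn, whose SVC wraps LIBSVM; the exponentially spaced grid ranges and the dataset are illustrative assumptions, not values mandated by the guide.

    # Sketch: parameter selection via 5-fold cross-validated grid search over C and
    # gamma for an RBF-kernel SVM. Grid ranges and data are illustrative only.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))    # scale, then SVM
    param_grid = {
        "svc__C": 2.0 ** np.arange(-5, 16, 2),        # coarse, exponentially spaced grid
        "svc__gamma": 2.0 ** np.arange(-15, 4, 2),
    }
    search = GridSearchCV(pipe, param_grid, cv=5)     # 5-fold cross-validation
    search.fit(X, y)
    print("best parameters:", search.best_params_)
    print("best cross-validation accuracy:", search.best_score_)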
Journal ArticleDOI

A comparison of methods for multiclass support vector machines

TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that, for large problems, methods that consider all data at once generally need fewer support vectors.