Journal ArticleDOI

LIBSVM: A library for support vector machines

TLDR
Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Abstract
LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users easily apply SVM to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
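The workflow the abstract describes (train an SVM, optionally with probability estimates) can be sketched with scikit-learn, whose `SVC` class is built on LIBSVM. This is a minimal illustration, not LIBSVM's own C or command-line interface; the dataset and parameter values are arbitrary choices for the example.

```python
# Minimal sketch of applying an SVM classifier via scikit-learn's SVC,
# which wraps LIBSVM internally. Dataset and hyperparameters are illustrative.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic binary classification data.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# RBF kernel with probability estimates enabled (one of the issues
# the paper discusses; LIBSVM fits a sigmoid to decision values).
clf = SVC(kernel="rbf", C=1.0, probability=True, random_state=0)
clf.fit(X_train, y_train)

acc = clf.score(X_test, y_test)          # test-set accuracy in [0, 1]
proba = clf.predict_proba(X_test[:1])    # class-probability estimates
```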


Citations
Journal ArticleDOI

Transfer Learning Improves Supervised Image Segmentation Across Imaging Protocols

TL;DR: Four transfer classifiers are presented that can train a classification scheme with only a small amount of representative training data plus a larger amount of other training data with slightly different characteristics; this may improve performance over supervised learning for segmentation across scanners and scan protocols.
Journal ArticleDOI

TEclass—a tool for automated classification of unknown eukaryotic transposable elements

TL;DR: TEclass is a tool to classify unknown transposable elements into their four main functional categories, which reflect their mode of transposition: DNA transposons, long terminal repeats (LTRs), long interspersed nuclear elements (LINEs) and short interspersed nuclear elements (SINEs).
Journal ArticleDOI

A comparative study of different machine learning methods on microarray gene expression data.

TL;DR: The importance of feature selection in accurately classifying new samples and how an integrated feature selection and classification algorithm is performing and is capable of identifying significant genes are revealed.
Journal ArticleDOI

Comparison of four statistical and machine learning methods for crash severity prediction.

TL;DR: Ranking methods by overall correct prediction rate gave almost the exact opposite result compared to the proposed cost-based approach, showing that neglecting crash costs can lead to misjudgment in choosing the right prediction method.
Journal ArticleDOI

A bagging SVM to learn from positive and unlabeled examples

TL;DR: It is shown theoretically and experimentally that the proposed method can match and even outperform the performance of state-of-the-art methods for PU learning, particularly when the number of positive examples is limited and the fraction of negatives among the unlabeled examples is small.
References
Journal ArticleDOI

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of learning process, the author covers function estimates from small data pools, applying these estimations to real-life problems, and much more.
Proceedings ArticleDOI

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of the classification functions, including Perceptrons, polynomials, and Radial Basis Functions.

A Practical Guide to Support Vector Classification

TL;DR: A simple procedure is proposed, which usually gives reasonable results and is suitable for beginners who are not familiar with SVM.
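The "simple procedure" that guide recommends (scale the features, use an RBF kernel, then cross-validate over C and gamma) can be sketched as follows. This uses scikit-learn as a stand-in for LIBSVM's own `svm-scale` and `grid.py` tools; the dataset and grid values are illustrative assumptions.

```python
# Sketch of the guide's recommended procedure: scale features, use an RBF
# kernel, and grid-search C and gamma with cross-validation.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Scaling happens inside the pipeline so each CV fold is scaled
# using only its own training split (no leakage).
pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# Small illustrative grid; the guide suggests an exponentially
# growing sequence for both parameters.
param_grid = {"svc__C": [0.1, 1, 10], "svc__gamma": [0.01, 0.1, 1]}
search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X, y)
```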
Journal ArticleDOI

A comparison of methods for multiclass support vector machines

TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that for large problems, methods that consider all data at once generally need fewer support vectors.