Journal ArticleDOI

LIBSVM: A library for support vector machines

TLDR
Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Abstract
LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users easily apply SVMs to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
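The abstract's themes (kernel SVM training, probability estimates) can be illustrated with a minimal sketch. This assumes scikit-learn is available; its SVC class wraps LIBSVM internally, so the example exercises LIBSVM's solver and its Platt-scaling probability estimates rather than calling the C API directly.

```python
# Minimal sketch of applying an SVM classifier via scikit-learn,
# whose SVC wraps the LIBSVM solver (an assumption about the reader's
# environment, not the LIBSVM C API itself).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# RBF kernel; probability=True enables LIBSVM's internal
# cross-validated Platt scaling for probability estimates.
clf = SVC(kernel="rbf", C=1.0, gamma="scale", probability=True)
clf.fit(X_tr, y_tr)

acc = clf.score(X_te, y_te)          # held-out accuracy
proba = clf.predict_proba(X_te[:1])  # one row of class probabilities
```

The `probability=True` path is noticeably slower to train, since LIBSVM fits a sigmoid on cross-validated decision values; leave it off when only hard labels are needed.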



Citations
Journal ArticleDOI

Semantic point cloud interpretation based on optimal neighborhoods, relevant features and efficient classifiers

TL;DR: It is demonstrated that selecting optimal neighborhoods for individual 3D points significantly improves the results of 3D scene analysis, and can further increase the quality of the derived results while significantly reducing both processing time and memory consumption.
Journal ArticleDOI

Comparative study on classifying human activities with miniature inertial and magnetic sensors

TL;DR: Bayesian decision making (BDM) results in the highest correct classification rate with relatively small computational cost, and a performance comparison of the classification techniques is provided in terms of their correct differentiation rates, confusion matrices, and computational cost.
Journal ArticleDOI

Improving Computer-Aided Detection Using Convolutional Neural Networks and Random View Aggregation

TL;DR: In this paper, a coarse-to-fine cascade framework is proposed to generate 2D or 2.5D views via sampling through scale transformations, random translations and rotations, which are used to train deep convolutional neural network (ConvNet) classifiers.
Proceedings ArticleDOI

Open Set Domain Adaptation

TL;DR: This work learns a mapping from the source to the target domain by jointly solving an assignment problem that labels those target instances that potentially belong to the categories of interest present in the source dataset.
Journal ArticleDOI

PLEK: a tool for predicting long non-coding RNAs and messenger RNAs based on an improved k-mer scheme

TL;DR: PLEK is an efficient alignment-free computational tool to distinguish lncRNAs from mRNAs in RNA-seq transcriptomes of species lacking reference genomes and is especially suitable for PacBio or 454 sequencing data and large-scale transcriptome data.
References
Journal ArticleDOI

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Proceedings ArticleDOI

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.

A Practical Guide to Support Vector Classification

TL;DR: A simple procedure is proposed, which usually gives reasonable results and is suitable for beginners who are not familiar with SVM.
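The guide's procedure for beginners is commonly summarized as: scale the features, use an RBF kernel, and cross-validate over a grid of C and gamma. A hedged sketch of that recipe, again assuming scikit-learn (with LIBSVM underneath) and a standard toy dataset standing in for the user's data:

```python
# Sketch of the guide's recommended workflow: feature scaling plus a
# cross-validated grid search over C and gamma for an RBF-kernel SVM.
# Dataset and grid values are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Scaling inside the pipeline keeps test folds from leaking into the scaler.
pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
grid = {"svc__C": [0.1, 1, 10, 100], "svc__gamma": [1e-3, 1e-2, 1e-1]}

search = GridSearchCV(pipe, grid, cv=5)
search.fit(X, y)

best_C = search.best_params_["svc__C"]
best_score = search.best_score_
```

In practice the guide suggests an exponentially spaced grid (e.g. C = 2^-5 … 2^15); the coarse grid above is kept small for illustration.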
Journal ArticleDOI

A comparison of methods for multiclass support vector machines

TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that for large problems, methods that consider all data at once generally need fewer support vectors.
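For contrast with the "all-together" formulations compared above, LIBSVM itself handles multiclass problems by decomposition into one-vs-one binary subproblems. A small sketch, assuming scikit-learn's LIBSVM-backed SVC and a three-class toy dataset:

```python
# Sketch of one-vs-one multiclass decomposition, the strategy LIBSVM uses:
# for k classes, k*(k-1)/2 pairwise binary SVMs are trained and combined
# by voting. Dataset choice is an illustrative assumption.
from sklearn.datasets import load_wine
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)  # k = 3 classes

clf = SVC(kernel="linear", decision_function_shape="ovo")
clf.fit(X, y)

# With k = 3, one-vs-one yields 3*(3-1)/2 = 3 pairwise decision values.
n_pairwise = clf.decision_function(X[:1]).shape[1]
```

The pairwise scheme trains more (but much smaller) subproblems than one-vs-rest, which is part of why it scales well in practice.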