Journal ArticleDOI

LIBSVM: A library for support vector machines

TLDR
Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Abstract
LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users easily apply SVM to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
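
As a concrete illustration of how the library is typically applied (a minimal sketch, not taken from the article: it uses scikit-learn's SVC, which is built on the LIBSVM solver, rather than LIBSVM's own command-line tools or C API):

```python
# Minimal sketch: training and evaluating an RBF-kernel SVM through
# scikit-learn's SVC, which wraps the LIBSVM solver internally.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scaling features before SVM training is the usual recommendation.
scaler = StandardScaler().fit(X_train)

# C and gamma play the same role as LIBSVM's -c and -g options;
# probability=True corresponds to LIBSVM's -b 1 probability-estimate mode.
clf = SVC(kernel="rbf", C=1.0, gamma="scale", probability=True)
clf.fit(scaler.transform(X_train), y_train)

print("test accuracy:", clf.score(scaler.transform(X_test), y_test))
```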



Citations
Journal ArticleDOI

pRNAm-PC: Predicting N(6)-methyladenosine sites in RNA sequences via physical-chemical properties.

TL;DR: A new predictor called pRNAm-PC is proposed, in which RNA sequence samples are expressed by a novel mode of pseudo dinucleotide composition (PseDNC) whose components are derived from a physical-chemical matrix via a series of auto-covariance and cross-covariance transformations.
Journal ArticleDOI

WS-SNPs&GO: a web server for predicting the deleterious effect of human protein variants using functional annotation

TL;DR: This work presents the web server implementation of SNPs&GO, a valuable tool that includes in a unique framework information derived from protein sequence, structure, evolutionary profile, and protein function.
Journal ArticleDOI

Robust sound event classification using deep neural networks

TL;DR: A sound event classification framework is outlined that compares auditory image front-end features with spectrogram image-based front-end features, using support vector machine and deep neural network classifiers, and is shown to compare very well with current state-of-the-art classification techniques.
Proceedings ArticleDOI

PFID: Pittsburgh fast-food image dataset

TL;DR: The first visual dataset of fast foods is introduced with a total of 4,545 still images, 606 stereo pairs, 303 360° videos for structure from motion, and 27 privacy-preserving videos of eating events of volunteers to stimulate research on fast food recognition for dietary assessment.
Journal ArticleDOI

A hybrid SOFM-SVR with a filter-based feature selection for stock market forecasting

TL;DR: Wang et al. hybridize SVR with the self-organizing feature map (SOFM) technique and a filter-based feature selection method to reduce training time and improve prediction accuracy.
References
Journal ArticleDOI

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Proceedings ArticleDOI

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including perceptrons, polynomials, and radial basis functions.
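
For background (a standard formulation, not quoted from the paper): for separable training data (x_i, y_i) with labels y_i in {-1, +1}, the optimal-margin training problem can be written as

```latex
\min_{w,\,b} \; \tfrac{1}{2}\,\|w\|^2
\quad \text{subject to} \quad
y_i \bigl( w^\top \phi(x_i) + b \bigr) \ge 1, \qquad i = 1, \dots, l,
```

where \phi maps inputs into the kernel-induced feature space (e.g. polynomial or RBF), and minimizing \|w\| is equivalent to maximizing the geometric margin 2/\|w\|.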

A Practical Guide to Support Vector Classification

TL;DR: A simple procedure is proposed, which usually gives reasonable results and is suitable for beginners who are not familiar with SVM.
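
The recipe the guide recommends is, roughly: scale the features, start with the RBF kernel, and choose C and gamma by cross-validated grid search over exponentially spaced values. A minimal sketch of that procedure, using scikit-learn's LIBSVM-backed SVC instead of the grid.py script shipped with LIBSVM (the dataset and grid ranges below are illustrative):

```python
# Sketch of the guide's recipe: scale features, use the RBF kernel,
# and pick C and gamma by cross-validated grid search.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_wine(return_X_y=True)

pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
param_grid = {
    "svc__C": 2.0 ** np.arange(-5, 16, 2),      # coarse grid, e.g. 2^-5 ... 2^15
    "svc__gamma": 2.0 ** np.arange(-15, 4, 2),  # e.g. 2^-15 ... 2^3
}
search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```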
Journal ArticleDOI

A comparison of methods for multiclass support vector machines

TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that, for large problems, methods that consider all the data at once generally need fewer support vectors.
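
For context on the decomposition approaches compared there (a hedged sketch, not code from the paper), the two most common reductions of a k-class problem to binary SVMs can be contrasted as follows; LIBSVM itself uses the one-versus-one strategy.

```python
# Sketch: two decomposition strategies for multiclass SVMs,
# each reducing the problem to a set of binary classifiers.
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

ovo = OneVsOneClassifier(SVC(kernel="rbf", gamma="scale"))   # k(k-1)/2 binary SVMs
ovr = OneVsRestClassifier(SVC(kernel="rbf", gamma="scale"))  # k binary SVMs

print("one-vs-one :", cross_val_score(ovo, X, y, cv=3).mean())
print("one-vs-rest:", cross_val_score(ovr, X, y, cv=3).mean())
```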