Journal ArticleDOI

LIBSVM: A library for support vector machines

TL;DR: Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
Abstract
LIBSVM is a library for Support Vector Machines (SVMs). We have been actively developing this package since the year 2000. The goal is to help users easily apply SVM to their applications. LIBSVM has gained wide popularity in machine learning and many other areas. In this article, we present all implementation details of LIBSVM. Issues such as solving SVM optimization problems, theoretical convergence, multiclass classification, probability estimates, and parameter selection are discussed in detail.
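
As a usage illustration (not part of the article), the functionality described above — C-SVC training with an RBF kernel and probability estimates — can be exercised through scikit-learn's SVC class, which is built on LIBSVM. The dataset and parameter values below are placeholders, not from the paper.

    # Minimal sketch: C-SVC with an RBF kernel and probability estimates,
    # via scikit-learn's SVC, which wraps LIBSVM internally.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    scaler = StandardScaler().fit(X_train)
    clf = SVC(C=1.0, kernel="rbf", gamma="scale", probability=True)
    clf.fit(scaler.transform(X_train), y_train)

    print("test accuracy:", clf.score(scaler.transform(X_test), y_test))
    print("class probabilities for one sample:",
          clf.predict_proba(scaler.transform(X_test[:1])))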

Citations
Journal ArticleDOI

TROP-ELM: A double-regularized ELM using LARS and Tikhonov regularization

TL;DR: The proposed modification of the OP-ELM uses a cascade of two regularization penalties: first an L1 penalty to rank the neurons of the hidden layer, followed by an L2 penalty on the regression weights for numerical stability and efficient pruning of the neurons.
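
A rough sketch of that two-stage idea, assuming a random-projection hidden layer: an L1 path (LARS) ranks and prunes hidden neurons, then Tikhonov (L2) regularization re-estimates the output weights on the retained neurons. This illustrates the cascade only; it is not the authors' OP-ELM/TROP-ELM implementation, and all data and penalty values are made up.

    # Illustrative two-stage regularization for an ELM-style model:
    # L1/LARS prunes hidden neurons, then ridge (Tikhonov) regression
    # re-fits the output weights on the survivors.
    import numpy as np
    from sklearn.linear_model import LassoLars, Ridge

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

    # Random hidden layer (the "extreme learning machine" part).
    n_hidden = 100
    W = rng.normal(size=(10, n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)

    # Stage 1: L1 penalty via LARS ranks/prunes hidden neurons.
    lars = LassoLars(alpha=0.01).fit(H, y)
    kept = np.flatnonzero(lars.coef_)

    # Stage 2: L2 (Tikhonov) penalty on the retained neurons.
    ridge = Ridge(alpha=1.0).fit(H[:, kept], y)
    print(f"kept {kept.size} of {n_hidden} neurons, "
          f"train R^2 = {ridge.score(H[:, kept], y):.3f}")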
Proceedings Article

Integrating Language and Vision to Generate Natural Language Descriptions of Videos in the Wild

TL;DR: This paper proposes a strategy for generating textual descriptions of videos by using a factor graph to combine visual detections with language statistics, and uses state-of-the-art visual recognition systems to obtain confidences on entities, activities, and scenes present in the video.
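
The combination of visual confidences with language statistics can be illustrated with a toy scoring scheme: choose the subject-verb-object triple that maximizes the product of detector confidences and corpus co-occurrence statistics. This is a hypothetical miniature, not the paper's factor-graph model; every name and number below is made up.

    # Toy illustration: combine visual detector confidences with
    # corpus-derived language statistics to pick an SVO triple.
    from itertools import product

    subj_conf = {"person": 0.9, "dog": 0.4}          # hypothetical detections
    verb_conf = {"ride": 0.6, "walk": 0.5}
    obj_conf = {"bicycle": 0.8, "street": 0.3}

    # Hypothetical language statistics, e.g. normalized co-occurrence counts.
    lang_sv = {("person", "ride"): 0.7, ("person", "walk"): 0.6,
               ("dog", "ride"): 0.05, ("dog", "walk"): 0.5}
    lang_vo = {("ride", "bicycle"): 0.8, ("ride", "street"): 0.1,
               ("walk", "bicycle"): 0.1, ("walk", "street"): 0.6}

    def score(s, v, o):
        return (subj_conf[s] * verb_conf[v] * obj_conf[o]
                * lang_sv[(s, v)] * lang_vo[(v, o)])

    best = max(product(subj_conf, verb_conf, obj_conf), key=lambda t: score(*t))
    print("best triple:", best)  # expected: ('person', 'ride', 'bicycle')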
Journal ArticleDOI

Stream-based active learning for sentiment analysis in the financial domain

TL;DR: This paper analyzes whether the sentiment expressed in Twitter feeds that discuss selected companies and their products can indicate changes in their stock prices, and finds that changes in positive sentiment probability can serve as indicators of changes in stock closing prices.
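
Stream-based active learning of this kind can be sketched with uncertainty sampling: an SVM's probability estimates decide, document by document, whether to query a label and retrain. The sketch below is a hypothetical miniature, not the paper's pipeline; texts, labels, and the 0.7 threshold are placeholders.

    # Minimal sketch of stream-based (uncertainty) active learning with an SVM.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.svm import SVC

    seed_texts = ["great earnings report", "profits rise sharply",
                  "strong quarterly growth", "stock plunges after loss",
                  "weak guidance disappoints", "shares fall on downgrade"]
    seed_labels = [1, 1, 1, 0, 0, 0]                 # 1 = positive, 0 = negative
    stream = ["record revenue announced", "shares tumble on lawsuit",
              "quarterly results mixed"]
    oracle = {"record revenue announced": 1, "shares tumble on lawsuit": 0,
              "quarterly results mixed": 0}          # stand-in for a human annotator

    vec = TfidfVectorizer().fit(seed_texts + stream)
    X, y = list(seed_texts), list(seed_labels)

    for doc in stream:
        clf = SVC(kernel="linear", probability=True).fit(vec.transform(X), y)
        p = clf.predict_proba(vec.transform([doc]))[0].max()
        if p < 0.7:                                  # uncertain: query a label
            X.append(doc)
            y.append(oracle[doc])
            print(f"queried label for: {doc!r}")
        else:
            print(f"{doc!r} -> predicted class {clf.predict(vec.transform([doc]))[0]}")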
Proceedings ArticleDOI

Inferring semantic concepts from community-contributed images and noisy tags

TL;DR: This paper proposes a novel sparse graph-based semi-supervised learning approach that harnesses labeled and unlabeled data simultaneously, and constructs an informative, compact concept space with a small semantic gap, inferring the semantic concepts in this space to bridge that gap.
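
Graph-based semi-supervised learning of this flavor can be approximated with a k-nearest-neighbour graph and label propagation; the sketch below uses scikit-learn's LabelSpreading as a stand-in and is not the paper's sparse-graph construction. Dataset, neighbourhood size, and the number of labeled points are illustrative.

    # Stand-in for graph-based semi-supervised learning: propagate a handful
    # of labels over a kNN graph to the unlabeled points.
    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.semi_supervised import LabelSpreading

    X, y_true = make_moons(n_samples=300, noise=0.1, random_state=0)
    y = np.full(300, -1)                    # -1 marks unlabeled points
    labeled = np.random.default_rng(0).choice(300, size=10, replace=False)
    y[labeled] = y_true[labeled]

    model = LabelSpreading(kernel="knn", n_neighbors=7).fit(X, y)
    acc = (model.transduction_[y == -1] == y_true[y == -1]).mean()
    print("accuracy on unlabeled points:", round(acc, 3))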
Journal ArticleDOI

Extracting biological information with computational analysis of Fourier-transform infrared (FTIR) biospectroscopy datasets: current practices to future perspectives

TL;DR: Many of the methods presented in this review are Machine Learning and Statistical techniques that are extendable to other forms of computer-based biomedical analysis, including mass spectrometry and magnetic resonance.
References
Journal ArticleDOI

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
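
For reference, the polynomial input transformation mentioned above corresponds to the standard kernel form of the decision rule, evaluated without constructing the transformed features explicitly (support vectors x_i, labels y_i in {-1, +1}, multipliers alpha_i, bias b; a textbook recap rather than material from this page):

    f(x) = \operatorname{sign}\!\Big( \sum_i \alpha_i \, y_i \, K(x_i, x) + b \Big),
    \qquad K(u, v) = (u \cdot v + 1)^d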

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data samples, the application of these estimates to real-life problems, and much more.
Proceedings ArticleDOI

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
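
The margin maximization referred to above is conventionally written as the following optimization over training pairs (x_i, y_i) with y_i in {-1, +1}; the geometric margin between the two classes is then 2 / ||w|| (standard formulation, recapped here for context):

    \min_{w,\, b} \; \tfrac{1}{2} \|w\|^2
    \quad \text{subject to} \quad y_i \,(w \cdot x_i + b) \ge 1, \qquad i = 1, \dots, \ell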

A Practical Guide to Support Vector Classification

TL;DR: A simple procedure is proposed, which usually gives reasonable results and is suitable for beginners who are not familiar with SVM.
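
The procedure recommended for beginners amounts to scaling the data, using an RBF kernel, and choosing C and gamma by cross-validated grid search over exponentially growing values. The sketch below reproduces that recipe with scikit-learn as an assumption for illustration (the guide itself works with LIBSVM's own tools), and the dataset and grid are placeholders.

    # Cross-validated grid search over (C, gamma) for an RBF-kernel SVM,
    # in the spirit of the guide's recommended procedure.
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)
    pipe = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    grid = {"svc__C": [2**k for k in range(-5, 16, 4)],
            "svc__gamma": [2**k for k in range(-15, 4, 4)]}

    search = GridSearchCV(pipe, grid, cv=5).fit(X, y)
    print("best parameters:", search.best_params_)
    print("cross-validated accuracy:", round(search.best_score_, 3))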
Journal ArticleDOI

A comparison of methods for multiclass support vector machines

TL;DR: Decomposition implementations for two "all-together" multiclass SVM methods are given, and it is shown that, for large problems, methods that consider all data at once generally need fewer support vectors.
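
The decomposition strategies compared in that study differ mainly in how many binary subproblems they solve: one-against-one trains k(k-1)/2 pairwise classifiers, one-against-rest trains k, while "all-together" methods solve a single larger problem. A small sketch contrasting the first two strategies (illustrative only, not the paper's experiments; dataset and parameters are placeholders):

    # One-against-one vs. one-against-rest decompositions for a k-class problem.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import cross_val_score
    from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_digits(return_X_y=True)      # k = 10 classes
    base = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))

    strategies = [
        ("one-vs-one  (k(k-1)/2 = 45 binary SVMs)", OneVsOneClassifier(base)),
        ("one-vs-rest (k = 10 binary SVMs)", OneVsRestClassifier(base)),
    ]
    for name, clf in strategies:
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{name}: CV accuracy = {acc:.3f}")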