Open Access Journal Article

Learning with progressive transductive support vector machine

TLDR
In this paper, a progressive transductive support vector machine (PTSVM) is proposed to extend Joachims' transductive SVM to handle different class distributions, removing the need to estimate the ratio of positive to negative examples in the working set; such transductive learning is expected to help especially when the training sets are small or when there is a significant deviation between the training and working set subsamples of the total population.
Abstract
Support vector machine (SVM) is a learning method developed in recent years on the foundations of statistical learning theory. By taking a transductive approach instead of an inductive one in support vector classifiers, the working set can be used as an additional source of information about margins. Compared with traditional inductive support vector machines, the transductive support vector machine is often more powerful and gives better performance. In transduction, one estimates the classification function at points within the working set using information from both the training and the working set data. This helps to improve the generalization performance of SVMs, especially when training data are inadequate. Intuitively, we would expect transductive learning to yield improvements when the training sets are small or when there is a significant deviation between the training and working set subsamples of the total population. In this paper, a progressive transductive support vector machine is proposed to extend Joachims' transductive SVM to handle different class distributions. It solves the problem of having to estimate the ratio of positive/negative examples from the working set. The experimental results show that the algorithm is very promising.
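The progressive scheme described in the abstract can be sketched as an iterative self-labeling loop: train on the labeled set, move the most confidently classified working-set points into the training set, and retrain. The sketch below is illustrative, assuming a linear kernel, a fixed number of transfers per step, and scikit-learn's SVC; it is not the authors' exact algorithm.

```python
# Hedged sketch of a progressive transductive SVM: iteratively label the most
# confidently classified working-set (unlabeled) examples and retrain.
# Function name, thresholds, and step size are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

def progressive_tsvm(X_train, y_train, X_work, n_per_step=2, max_iter=50):
    """Progressively transfer confident working-set points into the training set."""
    X_lab, y_lab = X_train.copy(), y_train.copy()
    unlabeled = list(range(len(X_work)))
    clf = SVC(kernel="linear").fit(X_lab, y_lab)
    for _ in range(max_iter):
        if not unlabeled:
            break
        # Distance to the separating hyperplane serves as labeling confidence.
        scores = clf.decision_function(X_work[unlabeled])
        order = np.argsort(-np.abs(scores))[:n_per_step]
        picked = [unlabeled[i] for i in order]
        new_y = np.where(scores[order] >= 0, 1, -1)
        X_lab = np.vstack([X_lab, X_work[picked]])
        y_lab = np.concatenate([y_lab, new_y])
        unlabeled = [i for i in unlabeled if i not in picked]
        clf = SVC(kernel="linear").fit(X_lab, y_lab)
    return clf
```

Because no positive/negative ratio is imposed on the working set, the loop adapts to whatever class distribution the confident labelings reveal, which is the property the paper highlights.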



Citations
Journal Article

Domain Adaptation Problems: A DASVM Classification Technique and a Circular Validation Strategy

TL;DR: Experimental results confirmed the effectiveness and reliability of both the DASVM technique and the proposed circular validation strategy, which validates the learning of domain-adaptation classifiers when no true labels for the target-domain instances are available.
Journal Article

A Novel Transductive SVM for Semisupervised Classification of Remote-Sensing Images

TL;DR: A novel modified TSVM classifier is proposed for addressing ill-posed remote-sensing problems; it mitigates the effects of suboptimal model selection and can address multiclass cases.
Journal Article

Optimization Techniques for Semi-Supervised Support Vector Machines

TL;DR: The performance and behavior of various S3VM algorithms are studied under a common experimental setting, reviewing key ideas in the literature on semi-supervised support vector machines.
Journal Article

Transmembrane protein topology prediction using support vector machines.

TL;DR: The high accuracy of transmembrane topology prediction, which includes detection of both signal peptides and re-entrant helices, combined with the ability to effectively discriminate between transmembrane and globular proteins, makes this method ideally suited to whole-genome annotation of alpha-helical transmembrane proteins.
References
Book

The Nature of Statistical Learning Theory

TL;DR: Covers the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Journal Article

Support-Vector Networks

TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of optical character recognition.

Statistical learning theory

TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of learning processes, the author covers function estimation from small data pools, applying these estimates to real-life problems, and much more.
Proceedings Article

A training algorithm for optimal margin classifiers

TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including perceptrons, polynomials, and radial basis functions.
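The quantity this training algorithm maximizes, the geometric margin, can be read off directly from a fitted linear SVM as 1/||w||. The snippet below is an illustrative sketch, not code from the paper; the large-C setting is an assumption used to approximate a hard margin with scikit-learn's SVC.

```python
# Illustrative sketch: fit a near-hard-margin linear SVM on two separable
# clusters and compute the geometric margin 1/||w||, the quantity an
# optimal-margin training algorithm maximizes.
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 0.0], [2.0, 1.0]])
y = np.array([-1, -1, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # large C approximates a hard margin
w = clf.coef_[0]
margin = 1.0 / np.linalg.norm(w)  # distance from the hyperplane to the nearest point
```

For these points the separating hyperplane sits midway between the classes at x = 1, so the geometric margin is 1.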
Proceedings Article

Combining labeled and unlabeled data with co-training

TL;DR: A PAC-style analysis is provided for a problem setting motivated by the task of learning to classify web pages, in which the description of each example can be partitioned into two distinct views, allowing inexpensive unlabeled data to augment a much smaller set of labeled examples.