Proceedings ArticleDOI
Learning with progressive transductive Support Vector Machine
Yisong Chen, Guoping Wang, Shihai Dong, et al.
pp. 67-74
TLDR
A progressive transductive support vector machine is proposed to extend Joachims' Transductive SVM to handle different class distributions, solving the problem of having to estimate the ratio of positive/negative examples from the working set.
Abstract
Support Vector Machine (SVM) is a learning method developed in recent years on the foundations of statistical learning theory. By taking a transductive approach instead of an inductive one in support vector classifiers, the test set can be used as an additional source of information about margins. Intuitively, we would expect transductive learning to yield improvements when the training sets are small or when there is a significant deviation between the training and working-set subsamples of the total population. In this paper, a progressive transductive support vector machine is proposed to extend Joachims' Transductive SVM to handle different class distributions. It removes the need to estimate the ratio of positive to negative examples in the working set. The experimental results show that the algorithm is very promising.
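The progressive idea described in the abstract can be approximated with a self-training loop: train on the labeled set, then repeatedly pseudo-label the most confident working-set examples on each side of the margin and retrain, so no fixed positive/negative ratio is assumed. The sketch below is an illustrative approximation using scikit-learn's SVC; the function name `progressive_tsvm`, the batch size, and the stopping rule are assumptions of this sketch, not the authors' exact algorithm.

```python
# Illustrative self-training sketch of a progressive transductive SVM.
# NOT the paper's exact algorithm: batch size and stopping rule are
# arbitrary choices made for this example.
import numpy as np
from sklearn.svm import SVC


def progressive_tsvm(X_lab, y_lab, X_work, batch=2, max_iter=20):
    """Each round, adopt the `batch` most confident working-set points
    per class as pseudo-labels (labels in {-1, +1}) and refit."""
    X, y = X_lab.copy(), y_lab.copy()
    remaining = X_work.copy()
    clf = SVC(kernel="linear").fit(X, y)
    for _ in range(max_iter):
        if len(remaining) == 0:
            break
        scores = clf.decision_function(remaining)
        order = np.argsort(scores)  # ascending: negatives first
        picked = []
        # most confident negatives (lowest scores, if actually negative)
        low = order[:batch]
        picked.extend(low[scores[low] < 0].tolist())
        # most confident positives (highest scores, if actually positive)
        high = order[-batch:]
        picked.extend(high[scores[high] > 0].tolist())
        picked = sorted(set(picked))
        if not picked:
            break
        # pseudo-label the picked points by the sign of their margin
        X = np.vstack([X, remaining[picked]])
        y = np.concatenate([y, np.where(scores[picked] > 0, 1, -1)])
        remaining = np.delete(remaining, picked, axis=0)
        clf = SVC(kernel="linear").fit(X, y)
    return clf
```

Because each round only adopts the points the current classifier is most confident about, the positive/negative split of the working set emerges from the data rather than being fixed in advance, which is the property the abstract highlights.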
Citations
Proceedings ArticleDOI
Cross-domain learning methods for high-level visual concept classification
TL;DR: This work develops a new cross-domain SVM (CDSVM) algorithm for adapting previously learned support vectors from one domain to help classification in another domain, and proposes an intuitive selection criterion to determine which cross-domain learning method to use for each concept.
Journal ArticleDOI
Combating Negative Transfer From Predictive Distribution Differences
TL;DR: A predictive distribution matching (PDM) regularizer and a PDM framework learn the target classifier by favoring source data with large positive transferability while inferring the labels of target unlabeled data; a criterion to measure the positive transferability between sample pairs of different domains in terms of their prediction distributions is also proposed.
Book ChapterDOI
Predictive distribution matching SVM for multi-domain learning
TL;DR: The Predictive Distribution Matching SVM (PDM-SVM) is proposed to learn a robust classifier in the target domain (referred to as the target classifier) by leveraging the labeled data from only the relevant regions of multiple sources.
Columbia University TRECVID 2007 High-Level Feature Extraction.
TL;DR: A new cross-domain SVM (CDSVM) algorithm for adapting previously learned support vectors from one domain to help classification in another domain is developed and tested, and an intuitive selection criterion is proposed to determine which cross-domain learning method to use for each concept.
Journal ArticleDOI
A new transductive learning method with universum data
Yanshan Xiao, Feng Junyao, Bo Liu, et al.
TL;DR: This paper proposes a new method, called information entropy-based transductive support vector machine with Universum data (IEB-TUSVM), which mainly consists of two steps, and analyzes the computational complexity of the proposed method.
References
Book
The Nature of Statistical Learning Theory
TL;DR: Covers the setting of the learning problem, consistency of learning processes, bounds on the rate of convergence of learning processes, controlling the generalization ability of learning processes, constructing learning algorithms, and what is important in learning theory.
Journal ArticleDOI
Support-Vector Networks
Corinna Cortes, Vladimir Vapnik, et al.
TL;DR: High generalization ability of support-vector networks utilizing polynomial input transformations is demonstrated, and the performance of the support-vector network is compared to various classical learning algorithms that all took part in a benchmark study of Optical Character Recognition.
Statistical learning theory
TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of learning processes, the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Proceedings ArticleDOI
A training algorithm for optimal margin classifiers
TL;DR: A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented, applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions.
Proceedings ArticleDOI
Combining labeled and unlabeled data with co-training
Avrim Blum, Tom M. Mitchell, et al.
TL;DR: A PAC-style analysis is provided for a problem setting motivated by the task of learning to classify web pages, in which the description of each example can be partitioned into two distinct views, allowing inexpensive unlabeled data to augment a much smaller set of labeled examples.