Topic

Statistical learning theory

About: Statistical learning theory is a research topic. Over the lifetime of the topic, 1,618 publications have been published, receiving 158,033 citations.


Papers
Journal Article
TL;DR: The most basic assumption in statistical learning theory is that training data and test data are drawn from the same underlying distribution, but in many applications, the "in-domai...
Abstract: The most basic assumption used in statistical learning theory is that training data and test data are drawn from the same underlying distribution. Unfortunately, in many applications, the "in-domai...

16 citations
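A minimal sketch of the assumption this entry refers to: a classifier is fitted on data drawn from one distribution and evaluated both on held-out data from that same distribution and on data from a shifted distribution, where its accuracy degrades. The Gaussian toy data, the shift of 2.0, and the use of logistic regression are illustrative choices, not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def sample(n, shift=0.0):
    """Two Gaussian classes in 2-D; `shift` moves the sampling distribution."""
    X0 = rng.normal(loc=-1.0 + shift, scale=1.0, size=(n, 2))
    X1 = rng.normal(loc=1.0 + shift, scale=1.0, size=(n, 2))
    X = np.vstack([X0, X1])
    y = np.array([0] * n + [1] * n)
    return X, y

X_train, y_train = sample(500)          # training distribution
X_in, y_in = sample(500)                # test data from the same distribution
X_out, y_out = sample(500, shift=2.0)   # test data from a shifted distribution

clf = LogisticRegression().fit(X_train, y_train)
print("accuracy, same distribution   :", clf.score(X_in, y_in))
print("accuracy, shifted distribution:", clf.score(X_out, y_out))
```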

Book Chapter
29 Oct 2012
TL;DR: In this article, the authors tackle the problem of prediction and confidence intervals for time series using a statistical learning approach and quantile loss functions, and show that the Gibbs estimator is able to predict as well as the best predictor in a given family for a wide set of loss functions.
Abstract: In this paper, we tackle the problem of prediction and confidence intervals for time series using a statistical learning approach and quantile loss functions. First, we show that the Gibbs estimator is able to predict as well as the best predictor in a given family for a wide set of loss functions. In particular, using the quantile loss function of [1], this allows us to build confidence intervals. We apply these results to the problem of prediction and confidence regions for the French Gross Domestic Product (GDP) growth, with promising results.

16 citations
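The quantile (pinball) loss used in this line of work has a simple closed form, rho_tau(u) = max(tau*u, (tau - 1)*u); minimizing it over constant predictors recovers the tau-quantile, which is what makes it suitable for building confidence intervals. The sketch below checks this numerically on synthetic data; it illustrates only the loss function, not the paper's Gibbs estimator.

```python
import numpy as np

def quantile_loss(y_true, y_pred, tau):
    """Pinball loss at level tau in (0, 1):
    tau * (y - q) if y >= q, (tau - 1) * (y - q) otherwise."""
    u = y_true - y_pred
    return np.mean(np.maximum(tau * u, (tau - 1) * u))

# Toy check: the constant predictor minimizing the pinball loss at level tau
# is (approximately) the empirical tau-quantile of the sample.
rng = np.random.default_rng(0)
y = rng.normal(size=10_000)
grid = np.linspace(-3, 3, 601)
losses = [quantile_loss(y, q, 0.95) for q in grid]
print("argmin of pinball loss :", grid[int(np.argmin(losses))])
print("empirical 95% quantile :", np.quantile(y, 0.95))
```

Fitting two such predictors at levels alpha/2 and 1 - alpha/2 gives lower and upper bounds, hence a (1 - alpha) prediction interval.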

Proceedings Article
Dixian Zhu, Zhe Li, Xiaoyu Wang, Boqing Gong, Tianbao Yang
11 Apr 2019
TL;DR: A novel robust zero-sum game framework for pool-based active learning, grounded in advanced statistical learning theory, is presented; it avoids the issues of many previous algorithms such as inefficiency, sampling bias, and sensitivity to imbalanced data distributions.
Abstract: In this paper, we present a novel robust zero-sum game framework for pool-based active learning grounded in advanced statistical learning theory. Pool-based active learning usually consists of two components, namely, learning of a classifier given labeled data and querying of unlabeled data for labeling. Most previous studies on active learning consider these as two separate tasks and propose various heuristics for selecting important unlabeled data for labeling, which may render the selection of unlabeled examples sub-optimal for minimizing the classification error. In contrast, the present work formulates active learning as a unified optimization framework for learning the classifier, i.e., the querying of labels and the learning of models are unified to minimize a common objective for statistical learning. In addition, the proposed method avoids the issues of many previous algorithms such as inefficiency, sampling bias and sensitivity to imbalanced data distribution. Besides theoretical analysis, we conduct extensive experiments on benchmark datasets and demonstrate the superior performance of the proposed active learning method compared with the state-of-the-art methods.

16 citations
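For contrast with the unified formulation described above, here is a generic pool-based active learning loop using uncertainty sampling, i.e., the kind of two-step heuristic (learn a classifier, then query the least certain unlabeled points) that the paper argues can be sub-optimal. The data set, model, and query budget are illustrative; this is not the paper's zero-sum game method.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

labeled = list(rng.choice(len(X), size=20, replace=False))  # small labeled seed set
pool = [i for i in range(len(X)) if i not in labeled]       # unlabeled pool

clf = LogisticRegression(max_iter=1000)
for _ in range(30):                          # querying rounds
    clf.fit(X[labeled], y[labeled])          # component 1: learn a classifier
    proba = clf.predict_proba(X[pool])
    uncertainty = 1.0 - proba.max(axis=1)    # component 2: query the least certain point
    pick = pool.pop(int(np.argmax(uncertainty)))
    labeled.append(pick)                     # the oracle supplies y[pick]

clf.fit(X[labeled], y[labeled])
print("accuracy of the actively trained classifier:", clf.score(X, y))
```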

Proceedings Article
23 Oct 2006
TL;DR: It is shown that if a kernel is well aligned with the classification task, the SVM classifier performs better.
Abstract: This paper studies several key aspects of support vector machines (SVM) for Web page classification. Developed from statistical learning theory, SVM is widely investigated and used for text categorization because of its high generalization performance and its ability to handle high-dimensional classification. First, several methods for Web page representation are studied. Second, Web page classification based on SVM is implemented on a data set, and a Naive Bayes (NB) classifier is used to assess the performance of the SVM classifier in high-dimensional spaces. Finally, the polynomial kernel function and the radial basis function (RBF) kernel function are compared. It is shown that if a kernel is well aligned with the classification task, the SVM classifier performs better.

16 citations
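Kernel-target alignment, mentioned in the abstract above, is commonly defined as A(K, yy^T) = <K, yy^T>_F / (||K||_F * ||yy^T||_F) for labels y in {-1, +1}; the paper's exact definition may differ. The sketch below computes this alignment for polynomial and RBF kernels on a synthetic data set (not the Web page data used in the paper) and compares the resulting SVM test accuracies.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def alignment(K, y):
    """Kernel-target alignment <K, yy^T>_F / (||K||_F * ||yy^T||_F), y in {-1, +1}."""
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y))

X, y = make_classification(n_samples=600, n_features=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, kernel_fn in [("polynomial", polynomial_kernel), ("RBF", rbf_kernel)]:
    K_tr = kernel_fn(X_tr, X_tr)           # Gram matrix on the training set
    K_te = kernel_fn(X_te, X_tr)           # cross-kernel for prediction
    a = alignment(K_tr, 2 * y_tr - 1)      # map labels {0,1} -> {-1,+1}
    acc = SVC(kernel="precomputed").fit(K_tr, y_tr).score(K_te, y_te)
    print(f"{name}: alignment = {a:.3f}, test accuracy = {acc:.3f}")
```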


Network Information
Related Topics (5)
Artificial neural network: 207K papers, 4.5M citations, 86% related
Cluster analysis: 146.5K papers, 2.9M citations, 82% related
Feature extraction: 111.8K papers, 2.1M citations, 81% related
Optimization problem: 96.4K papers, 2.1M citations, 80% related
Fuzzy logic: 151.2K papers, 2.3M citations, 79% related
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    9
2022    19
2021    59
2020    69
2019    72
2018    47