Topic

Statistical learning theory

About: Statistical learning theory is a research topic. Over the lifetime, 1618 publications have been published within this topic receiving 158033 citations.


Papers
Journal Article
TL;DR: This paper considers the General Learning Setting (introduced by Vapnik), which includes most statistical learning problems as special cases, and identifies stability as the key necessary and sufficient condition for learnability.
Abstract: The problem of characterizing learnability is the most basic question of statistical learning theory. A fundamental and long-standing answer, at least for the case of supervised classification and regression, is that learnability is equivalent to uniform convergence of the empirical risk to the population risk, and that if a problem is learnable, it is learnable via empirical risk minimization. In this paper, we consider the General Learning Setting (introduced by Vapnik), which includes most statistical learning problems as special cases. We show that in this setting, there are non-trivial learning problems where uniform convergence does not hold, empirical risk minimization fails, and yet they are learnable using alternative mechanisms. Instead of uniform convergence, we identify stability as the key necessary and sufficient condition for learnability. Moreover, we show that the conditions for learnability in the general setting are significantly more complex than in supervised classification and regression.

432 citations
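The equivalence of learnability and uniform convergence discussed above can be illustrated numerically. The following sketch (a toy setup, not from the paper) shows the empirical 0-1 risk of a fixed threshold classifier on Gaussian data concentrating around its population risk as the sample size grows:

```python
# Toy illustration (assumed setup, not from the paper): for a fixed
# threshold rule sign(x - theta) on N(0, 1) data labeled y = 1[x > 0],
# the empirical 0-1 risk approaches the population risk as n grows.
import numpy as np

rng = np.random.default_rng(0)

def empirical_risk(theta, x, y):
    # Average 0-1 loss of the threshold rule 1[x > theta]
    return np.mean((x > theta).astype(int) != y)

# Population risk of theta = 0.5 is P(0 < x <= 0.5) = Phi(0.5) - 0.5 ~ 0.19
for n in (10, 100, 10_000):
    x = rng.normal(size=n)
    y = (x > 0).astype(int)
    print(n, empirical_risk(0.5, x, y))
```

For a single fixed hypothesis this is just the law of large numbers; the paper's point is that *uniform* convergence over a hypothesis class is what classical theory requires, and that in the General Learning Setting stability can replace it.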

Journal ArticleDOI
TL;DR: In this article, the main ideas of statistical learning theory, support vector machines (SVMs), and kernel feature spaces are briefly described, with particular emphasis on a description of the so-called ν-SVM.
Abstract: We briefly describe the main ideas of statistical learning theory, support vector machines (SVMs), and kernel feature spaces. We place particular emphasis on a description of the so-called ν-SVM, including details of the algorithm and its implementation, theoretical results, and practical applications. Copyright © 2005 John Wiley & Sons, Ltd.

410 citations
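The ν-SVM described in this abstract replaces the usual penalty parameter C with a parameter ν in (0, 1] that upper-bounds the fraction of margin errors and lower-bounds the fraction of support vectors. A minimal sketch using scikit-learn's `NuSVC` (an assumption for illustration; the paper itself predates this library):

```python
# Minimal nu-SVM sketch on synthetic data using scikit-learn's NuSVC.
# The dataset and parameter values are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import NuSVC

X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# nu = 0.2: at most ~20% margin errors, at least ~20% support vectors
clf = NuSVC(nu=0.2, kernel="rbf", gamma="scale")
clf.fit(X_tr, y_tr)
print(clf.score(X_te, y_te))
```

The practical appeal of ν over C is that it has a direct interpretation as a fraction of the training set, so it can be chosen without a grid search over arbitrary scales.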

01 Jan 2000
TL;DR: Introduces a new framework for the general learning problem and a novel, powerful learning method, the Support Vector Machine (SVM), which better solves small-sample learning problems.
Abstract: Data-based machine learning covers a wide range of topics, from pattern recognition to function regression and density estimation. Most existing methods are based on traditional statistics, which provides conclusions only for the situation where the sample size tends to infinity, so they may not work in practical cases with limited samples. Statistical Learning Theory (SLT) is a small-sample statistics developed by Vapnik et al., which concerns mainly the statistical principles that apply when samples are limited, especially the properties of the learning procedure in such cases. SLT provides a new framework for the general learning problem and a novel, powerful learning method called the Support Vector Machine (SVM), which can better solve small-sample learning problems. The study of SLT and SVM is becoming a new hot area in the field of machine learning. This review introduces the basic ideas of SLT and SVM, their major characteristics, and some current research trends.

408 citations

Book
27 Aug 2004
TL;DR: This book presents probabilistic and randomized methods for the analysis and design of uncertain systems, covering statistical learning theory for control design, sequential algorithms for probabilistic robust design and LPV systems, and a scenario approach for probabilistic robust design.
Abstract: Overview.- Elements of Probability Theory.- Uncertain Linear Systems and Robustness.- Linear Robust Control Design.- Some Limits of the Robustness Paradigm.- Probabilistic Methods for Robustness.- Monte Carlo Methods.- Randomized Algorithms in Systems and Control.- Probability Inequalities.- Statistical Learning Theory and Control Design.- Sequential Algorithms for Probabilistic Robust Design.- Sequential Algorithms for LPV Systems.- Scenario Approach for Probabilistic Robust Design.- Random Number and Variate Generation.- Statistical Theory of Radial Random Vectors.- Vector Randomization Methods.- Statistical Theory of Radial Random Matrices.- Matrix Randomization Methods.- Applications of Randomized Algorithms.- Appendix.

393 citations

Book
28 Jun 2017
TL;DR: The kernel mean embedding maps a probability distribution into a reproducing kernel Hilbert space and can be viewed as a generalization of the original feature map common to support vector machines (SVMs) and other kernel methods.
Abstract: A Hilbert space embedding of a distribution—in short, a kernel mean embedding—has recently emerged as a powerful tool for machine learning and statistical inference. The basic idea behind this framework is to map distributions into a reproducing kernel Hilbert space (RKHS) in which the whole arsenal of kernel methods can be extended to probability measures. It can be viewed as a generalization of the original “feature map” common to support vector machines (SVMs) and other kernel methods. In addition to the classical applications of kernel methods, the kernel mean embedding has found novel applications in fields ranging from probabilistic modeling to statistical inference, causal discovery, and deep learning. Kernel Mean Embedding of Distributions: A Review and Beyond provides a comprehensive review of existing work and recent advances in this research area, and discusses some of the most challenging issues and open problems that could potentially lead to new research directions. The targeted audience includes graduate students and researchers in machine learning and statistics who are interested in the theory and applications of kernel mean embeddings.

375 citations
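The core idea in this review—representing a distribution by the mean of its kernel feature maps—can be sketched in a few lines. The following is an illustrative implementation (names and parameters are my own, not from the book) of the empirical kernel mean embedding with an RBF kernel, used to compare two samples via the maximum mean discrepancy (MMD):

```python
# Sketch of the empirical kernel mean embedding idea: each sample is
# mapped to the mean of its RBF feature maps, and two distributions are
# compared via the squared RKHS distance between their embeddings (MMD).
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # k(x, y) = exp(-gamma * ||x - y||^2), computed for all pairs
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2(X, Y, gamma=1.0):
    # Squared MMD = ||mu_X - mu_Y||^2 in the RKHS, estimated with the
    # biased V-statistic (means over the full kernel matrices).
    return (rbf_kernel(X, X, gamma).mean()
            - 2 * rbf_kernel(X, Y, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean())

rng = np.random.default_rng(0)
same = mmd2(rng.normal(size=(100, 2)), rng.normal(size=(100, 2)))
diff = mmd2(rng.normal(size=(100, 2)), rng.normal(3.0, 1.0, size=(100, 2)))
print(same, diff)  # mismatched distributions give a larger MMD
```

With a characteristic kernel such as the RBF kernel, the embedding is injective, so the MMD is zero only when the two distributions coincide—this is what makes the mean embedding useful for two-sample testing and the other inference tasks the review surveys.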


Network Information
Related Topics (5)
Artificial neural network: 207K papers, 4.5M citations (86% related)
Cluster analysis: 146.5K papers, 2.9M citations (82% related)
Feature extraction: 111.8K papers, 2.1M citations (81% related)
Optimization problem: 96.4K papers, 2.1M citations (80% related)
Fuzzy logic: 151.2K papers, 2.3M citations (79% related)
Performance
Metrics: number of papers in the topic in previous years

Year  Papers
2023       9
2022      19
2021      59
2020      69
2019      72
2018      47