scispace - formally typeset
Topic

Statistical learning theory

About: Statistical learning theory is a research topic. Over the lifetime, 1618 publications have been published within this topic receiving 158033 citations.


Papers
Proceedings Article
24 May 2019
TL;DR: In this paper, the authors address the question of how to learn robustly in such scenarios, and derive a procedure that allows for learning from all available sources, yet automatically suppresses irrelevant or corrupted data.
Abstract: Modern machine learning methods often require more data for training than a single expert can provide. Therefore, it has become a standard procedure to collect data from external sources, e.g. via crowdsourcing. Unfortunately, the quality of these sources is not always guaranteed. As additional complications, the data might be stored in a distributed way, or might even have to remain private. In this work, we address the question of how to learn robustly in such scenarios. Studying the problem through the lens of statistical learning theory, we derive a procedure that allows for learning from all available sources, yet automatically suppresses irrelevant or corrupted data. We show by extensive experiments that our method provides significant improvements over alternative approaches from robust statistics and distributed optimization.

28 citations
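The paper above describes suppressing irrelevant or corrupted sources when aggregating data from many providers. As a generic illustration of that idea (not the authors' actual procedure), the sketch below uses a coordinate-wise trimmed mean over per-source parameter estimates, a standard robust-statistics tool; all names and the toy data are hypothetical.

```python
import numpy as np

def trimmed_mean_aggregate(source_estimates, trim_frac=0.2):
    """Coordinate-wise trimmed mean across per-source estimates.

    source_estimates: array of shape (n_sources, dim); each row is a
    parameter estimate computed locally on one data source. The most
    extreme values in each coordinate are discarded before averaging,
    which suppresses grossly corrupted sources.
    """
    est = np.sort(np.asarray(source_estimates, dtype=float), axis=0)
    n = est.shape[0]
    k = int(np.floor(trim_frac * n))  # number trimmed from each tail
    return est[k:n - k].mean(axis=0)

# Toy setting: 8 honest sources near the true parameter 1.0, 2 corrupted ones.
rng = np.random.default_rng(0)
honest = 1.0 + 0.1 * rng.standard_normal((8, 3))
corrupted = 100.0 * np.ones((2, 3))
estimates = np.vstack([honest, corrupted])
robust = trimmed_mean_aggregate(estimates, trim_frac=0.2)
naive = estimates.mean(axis=0)
```

Here the naive average is dragged far from 1.0 by the two corrupted sources, while the trimmed aggregate stays close to the honest consensus.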

Book ChapterDOI
01 Jan 2018
TL;DR: In statistical learning theory (regression, classification, etc.), many regression models, such as algebraic polynomials, support the development of models for classification.
Abstract: In statistical learning theory (regression, classification, etc.) there are many regression models, such as algebraic polynomials,

28 citations

Proceedings ArticleDOI
23 Jun 2008
TL;DR: A framework formulated under statistical learning theory is proposed that facilitates robust learning of a discriminative projection; the experimental results suggest that the proposed method outperforms some recent regularized techniques when the number of training samples is small.
Abstract: Learning a robust projection with a small number of training samples is still a challenging problem in face recognition, especially when the unseen faces have extreme variation in pose, illumination, and facial expression. To address this problem, we propose a framework formulated under statistical learning theory that facilitates robust learning of a discriminative projection. Dimensionality reduction using the projection matrix is combined with a linear classifier in the regularized framework of lasso regression. The projection matrix in conjunction with the classifier parameters are then found by solving an optimization problem over the Stiefel manifold. The experimental results on standard face databases suggest that the proposed method outperforms some recent regularized techniques when the number of training samples is small.

28 citations
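The abstract above combines a projection with a linear classifier in the regularized framework of lasso regression. As a minimal sketch of the lasso building block only (the paper's joint optimization over the Stiefel manifold is not reproduced here), the following solves the lasso problem by iterative soft-thresholding (ISTA); the function name and toy data are hypothetical.

```python
import numpy as np

def lasso_ista(X, y, lam=0.1, n_iter=500):
    """Lasso regression via iterative soft-thresholding (ISTA).

    Minimizes 0.5 * ||X w - y||^2 + lam * ||w||_1.
    """
    n, d = X.shape
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the gradient
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)
        z = w - grad / L
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return w

# Toy sparse regression: only the first 3 of 10 coefficients are nonzero.
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true + 0.01 * rng.standard_normal(50)
w_hat = lasso_ista(X, y, lam=0.1)
```

The l1 penalty drives the irrelevant coefficients toward exactly zero, which is the regularization behavior the paper relies on.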

Journal Article
TL;DR: The research shows that the proposed method achieves better classification and generalization performance, and shorter training time, than methods based on artificial neural networks, especially for small samples.

Abstract: A fault diagnosis method based on SVM is proposed in this paper. The research shows that the proposed method achieves better classification and generalization performance, and shorter training time, than methods based on artificial neural networks, especially for small samples.

28 citations
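The abstract above applies an SVM to fault diagnosis. As a generic illustration of the core technique, not the paper's system, the sketch below trains a linear SVM by subgradient descent on the hinge loss and classifies a toy healthy-vs-faulty dataset; all names and data are hypothetical.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, n_epochs=200, lr=0.1):
    """Linear SVM trained by subgradient descent on the hinge loss.

    Minimizes lam/2 * ||w||^2 + mean(max(0, 1 - y * (X w + b))).
    Labels y must be in {-1, +1}.
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(n_epochs):
        margins = y * (X @ w + b)
        active = margins < 1  # samples violating the margin
        grad_w = lam * w - (y[active] @ X[active]) / n
        grad_b = -y[active].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy "fault diagnosis": two Gaussian clusters of 2-D sensor features.
rng = np.random.default_rng(2)
normal = rng.standard_normal((40, 2)) + [2, 2]    # healthy condition
faulty = rng.standard_normal((40, 2)) + [-2, -2]  # fault condition
X = np.vstack([normal, faulty])
y = np.concatenate([np.ones(40), -np.ones(40)])
w, b = train_linear_svm(X, y)
accuracy = (np.sign(X @ w + b) == y).mean()
```

Maximizing the margin, rather than fitting many weights as a neural network does, is what the paper credits for the good small-sample behavior.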

Journal ArticleDOI
01 Dec 2008
TL;DR: This paper proposes to address the small sample size (SSS) problem in the framework of statistical learning theory by computing linear discriminants via regularized least squares regression, which resolves the singularity problem.
Abstract: Linear discriminant analysis (LDA) as a dimension reduction method is widely used in classification such as face recognition. However, it suffers from the small sample size (SSS) problem when data dimensionality is greater than the sample size, as in images where features are high dimensional and correlated. In this paper, we propose to address the SSS problem in the framework of statistical learning theory. We compute linear discriminants by regularized least squares regression, where the singularity problem is resolved. The resulting discriminants are complete in that they include both regular and irregular information. We show that our proposal and its nonlinear extension belong to the same framework where powerful classifiers such as support vector machines are formulated. In addition, our approach allows us to establish an error bound for LDA. Finally, our experiments validate our theoretical analysis results.

28 citations
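The abstract above computes linear discriminants by regularized least squares regression so that the small-sample-size singularity disappears. A minimal two-class sketch of that idea follows, assuming {-1, +1} regression targets; the function name, regularization value, and toy data are hypothetical.

```python
import numpy as np

def regularized_ls_discriminant(X, y, reg=1e-2):
    """Two-class linear discriminant via regularized least squares.

    With fewer samples than dimensions, X^T X is singular and classical
    LDA breaks down; the ridge term reg * I makes the system solvable.
    Labels y are encoded as {-1, +1} regression targets.
    """
    n, d = X.shape
    return np.linalg.solve(X.T @ X + reg * np.eye(d), X.T @ y)

# Small-sample-size setting: 20 samples in 100 dimensions.
rng = np.random.default_rng(3)
d = 100
direction = np.zeros(d)
direction[0] = 3.0  # the classes differ along one coordinate
X_pos = rng.standard_normal((10, d)) + direction
X_neg = rng.standard_normal((10, d)) - direction
X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(10), -np.ones(10)])
w = regularized_ls_discriminant(X, y)
train_acc = (np.sign(X @ w) == y).mean()
```

Without the ridge term, `X.T @ X` here has rank at most 20 in 100 dimensions and cannot be inverted, which is exactly the SSS failure mode the paper addresses.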


Network Information
Related Topics (5)
Artificial neural network
207K papers, 4.5M citations
86% related
Cluster analysis
146.5K papers, 2.9M citations
82% related
Feature extraction
111.8K papers, 2.1M citations
81% related
Optimization problem
96.4K papers, 2.1M citations
80% related
Fuzzy logic
151.2K papers, 2.3M citations
79% related
Performance
Metrics
No. of papers in the topic in previous years
Year  Papers
2023       9
2022      19
2021      59
2020      69
2019      72
2018      47