scispace - formally typeset
Topic

Statistical learning theory

About: Statistical learning theory is a research topic. Over its lifetime, 1,618 publications have been published within this topic, receiving 158,033 citations.


Papers
01 Jan 2001
TL;DR: This chapter contains sections titled: Introduction, The Law of Large Numbers, When Does Learning Work: the Question of Consistency, Uniform Convergence and Consistency, How to Derive a VC Bound, A Model Selection Example, and Problems.
Abstract: This chapter contains sections titled: Introduction, The Law of Large Numbers, When Does Learning Work: the Question of Consistency, Uniform Convergence and Consistency, How to Derive a VC Bound, A Model Selection Example, Summary, Problems

8 citations
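For context, the VC bound that the chapter's "How to Derive a VC Bound" section builds toward has the following standard form in the literature (a general statement, not quoted from the chapter): with probability at least 1 − δ over an i.i.d. sample of size n, every f in a function class of VC dimension h satisfies

```latex
R(f) \;\le\; R_{\mathrm{emp}}(f)
  + \sqrt{\frac{h\left(\ln\tfrac{2n}{h} + 1\right) + \ln\tfrac{4}{\delta}}{n}}
```

where R(f) is the expected risk and R_emp(f) the empirical risk on the sample.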

Journal ArticleDOI
TL;DR: In this paper, the authors estimate real-world private firm default probabilities over a fixed time horizon, conditioned on a vector of explanatory variables, which include financial ratios, economic indicators, and market prices.
Abstract: We estimate real-world private firm default probabilities over a fixed time horizon, conditioned on a vector of explanatory variables, which include financial ratios, economic indicators, and market prices. To estimate our model, we apply a recently developed method from statistical learning theory. This method leads to a model that is particularly appropriate for financial market participants who would use the model to make financial decisions. We compare our model with various benchmark models, with respect to a number of performance measures. In all of these tests, our model outperformed the benchmark models. We also discuss possible reasons for this outperformance. A revised version of this paper appeared in the Journal of Credit Risk, Volume 2/Number 1, Spring 2006.

8 citations
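The abstract does not reproduce the paper's estimator, so the following is a hedged illustration only: it fits a probability-calibrated SVM (scikit-learn's SVC with Platt scaling) to synthetic data standing in for financial ratios and economic indicators. All variable names and parameters are hypothetical, not the paper's.

```python
# Illustrative sketch only: synthetic stand-ins for the paper's financial
# ratios, economic indicators, and market prices.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 3))                      # hypothetical firm features
logits = 1.5 * X[:, 0] - 1.0 * X[:, 1]           # synthetic default propensity
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)  # 1 = default

# probability=True turns the SVM's decision scores into class probabilities
# via internal Platt scaling, giving an estimate of P(default | features).
model = SVC(kernel="rbf", probability=True, random_state=0).fit(X, y)
probs = model.predict_proba(X)[:, 1]
print(probs[:3])
```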

Journal ArticleDOI
TL;DR: This work develops distribution-free uniform deviation bounds and applies these to obtain finite sample bounds and strong universal consistency to achieve distribution free, asymptotically optimal inference under the random effects model.
Abstract: The false discovery rate (FDR) and false nondiscovery rate (FNDR) have received considerable attention in the literature on multiple testing. These performance measures are also appropriate for classification, and in this work we develop generalization error analyses for FDR and FNDR when learning a classifier from labeled training data. Unlike more conventional classification performance measures, the empirical FDR and FNDR are not binomial random variables but rather a ratio of binomials, which introduces challenges not present in conventional formulations of the classification problem. We develop distribution-free uniform deviation bounds and apply these to obtain finite sample bounds and strong universal consistency. We also present a simulation study demonstrating the merits of variance-based bounds, which we also develop. In the context of multiple testing with FDR/FNDR, our framework may be viewed as a way to leverage training data to achieve distribution free, asymptotically optimal inference under the random effects model.

8 citations
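The "ratio of binomials" structure of the empirical FDR and FNDR mentioned above can be made concrete with a minimal NumPy sketch (illustrative only; the function name is hypothetical): each quantity is a count of errors divided by a count of decisions, not an error rate over the whole sample.

```python
import numpy as np

def empirical_fdr_fndr(y_true, y_pred):
    """Empirical FDR/FNDR: each is a ratio of two counts (binomials),
    unlike the plain misclassification rate."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    discoveries = y_pred == 1
    nondiscoveries = ~discoveries
    # FDR: fraction of discoveries whose true label is 0 (false discoveries)
    fdr = float(np.mean(y_true[discoveries] == 0)) if discoveries.any() else 0.0
    # FNDR: fraction of nondiscoveries whose true label is 1 (false nondiscoveries)
    fndr = float(np.mean(y_true[nondiscoveries] == 1)) if nondiscoveries.any() else 0.0
    return fdr, fndr

print(empirical_fdr_fndr([1, 0, 1, 0, 1], [1, 1, 0, 0, 1]))
```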

Proceedings ArticleDOI
15 Jun 2004
TL;DR: A novel regression technique, called Support Vector Machines (SVM), based on statistical learning theory is explored, and the prediction results show that the prediction accuracy of SVM is better than that of a neural network.
Abstract: Machine learning techniques are finding more and more applications in the field of load forecasting. In this paper, a novel regression technique, called Support Vector Machines (SVM), based on statistical learning theory is explored. SVM is based on the principle of Structural Risk Minimization, as opposed to the principle of Empirical Risk Minimization adopted by conventional regression techniques. The natural gas load data of Xi'an city in 2001 and 2002 are used in this study to demonstrate the forecasting capabilities of SVM. The result is compared with that of a neural network based model for 7-lead-day forecasting. The prediction results show that the prediction accuracy of SVM is better than that of the neural network. Thus, SVM appears to be a very promising prediction tool. The software package NGPSLF, based on support vector regression (SVR), has also been put into practical business application.

8 citations
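The Xi'an load data and the NGPSLF package are not publicly available, so the following is only a minimal sketch of support vector regression on a synthetic daily load series, using scikit-learn's SVR with lagged loads as features; all hyperparameters are illustrative, not the paper's.

```python
# Illustrative only: a synthetic seasonal load series stands in for the
# Xi'an natural gas data; SVR hyperparameters are arbitrary.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
days = np.arange(365)
load = 100 + 20 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 2, 365)

window = 7  # predict today's load from the previous 7 days
X = np.array([load[i:i + window] for i in range(len(load) - window)])
y = load[window:]

# Train on the first 300 windows, evaluate on the remaining 58.
model = SVR(kernel="rbf", C=10.0, epsilon=0.5).fit(X[:300], y[:300])
pred = model.predict(X[300:])
mae = np.mean(np.abs(pred - y[300:]))
print(f"test MAE: {mae:.2f}")
```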

Journal ArticleDOI
TL;DR: In this essay, it is shown that, despite their differences, statistical and formal learning theory yield precisely the same result for a class of inductive problems that I call strongly VC ordered, of which Goodman’s riddle is just one example.
Abstract: Nelson Goodman’s new riddle of induction forcefully illustrates a challenge that must be confronted by any adequate theory of inductive inference: provide some basis for choosing among alternative hypotheses that fit past data but make divergent predictions. One response to this challenge is to distinguish among alternatives by means of some epistemically significant characteristic beyond fit with the data. Statistical learning theory takes this approach by showing how a concept similar to Popper’s notion of degrees of testability is linked to minimizing expected predictive error. In contrast, formal learning theory appeals to Ockham’s razor, which it justifies by reference to the goal of enhancing efficient convergence to the truth. In this essay, I show that, despite their differences, statistical and formal learning theory yield precisely the same result for a class of inductive problems that I call strongly VC ordered, of which Goodman’s riddle is just one example.

8 citations


Network Information
Related Topics (5)
Artificial neural network
207K papers, 4.5M citations
86% related
Cluster analysis
146.5K papers, 2.9M citations
82% related
Feature extraction
111.8K papers, 2.1M citations
81% related
Optimization problem
96.4K papers, 2.1M citations
80% related
Fuzzy logic
151.2K papers, 2.3M citations
79% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    9
2022    19
2021    59
2020    69
2019    72
2018    47