scispace - formally typeset
Topic

Statistical learning theory

About: Statistical learning theory is a research topic. Over the lifetime, 1618 publications have been published within this topic receiving 158033 citations.


Papers
Posted Content
TL;DR: This chapter provides an overview of the key ideas and insights of statistical learning theory and describes some other variants of machine learning.
Abstract: Statistical learning theory provides the theoretical basis for many of today's machine learning algorithms. In this article we attempt to give a gentle, non-technical overview of the key ideas and insights of statistical learning theory. We target a broad audience, not necessarily machine learning researchers. This paper can serve as a starting point for people who want to get an overview of the field before diving into technical details.

7 citations

Proceedings ArticleDOI
Zhong Yi1, Zhou Chun-guang1, Huang Lan1, Wang Yan1, Yang Bin1 
11 Dec 2009
TL;DR: The support vector regression (SVR) algorithm is used to solve a practical problem in real estate, predicting housing values, with the aim of giving consumers good guidance when choosing a home.
Abstract: Support vector regression is a new general-purpose machine learning method built on the framework of statistical learning theory, and an effective way to deal with nonlinear classification and nonlinear regression. Owing to its comprehensive theoretical basis and excellent learning performance, the technique has become a hot spot in the international machine learning research community, since it better addresses practical difficulties such as small samples, high dimensionality, nonlinearity, and local minima. In this article, support vector regression (SVR) and an RBF neural network are used in function-fitting tests on simulated data, and the results are compared and evaluated. The SVR algorithm is then applied to a practical problem in real estate, predicting housing values, with the aim of providing consumers with good guidance in choosing a home.
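The paper's simulation and housing data are not available, but the function-fitting test it describes can be sketched on synthetic data. The target function, noise level, and SVR hyperparameters below are illustrative assumptions, not the paper's settings; only scikit-learn's standard `SVR` estimator is relied on.

```python
import numpy as np
from sklearn.svm import SVR

# Synthetic stand-in for the paper's simulation data: a noisy sinc function.
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, 200)).reshape(-1, 1)
y = np.sinc(X).ravel() + rng.normal(0, 0.05, 200)

# Epsilon-insensitive SVR with an RBF kernel; C and epsilon are assumed values.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.05)
svr.fit(X, y)

# Only samples outside the epsilon tube become support vectors,
# which keeps the fitted model sparse.
pred = svr.predict(X)
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"RMSE: {rmse:.3f}, support vectors: {len(svr.support_)} / {len(X)}")
```

The same data could be fed to an RBF network for the comparison the paper describes; the sparsity of the SVR solution (fewer support vectors than training points) is one of the properties the abstract credits to the statistical-learning-theory foundation.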

7 citations

Book ChapterDOI
01 Jan 2012
TL;DR: This chapter gives a general overview of Artificial Neural Networks Learning from the perspectives of Statistical Learning Theory and Multi-objective Optimization.
Abstract: This chapter gives a general overview of artificial neural network learning from the perspectives of statistical learning theory and multi-objective optimization. Both approaches treat the general learning problem as a trade-off between the empirical risk obtained from the data set and the model complexity. Learning is seen as a problem of fitting the model output to the data, and the model complexity to the system complexity. Since the latter is not known in advance, only bounds on model complexity can be assumed, so model selection can only be accomplished with ad hoc decision-making strategies, such as those provided by multi-objective learning. The main concepts of multi-objective learning are then presented in the context of ECG problems.
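The empirical-risk versus model-complexity trade-off described above can be illustrated with a minimal sketch (NumPy only; the target function, noise level, and use of polynomial degree as the complexity measure are assumptions for illustration, not the chapter's ECG setting):

```python
import numpy as np

# Training and held-out data from the same noisy process.
rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 60)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)
x_val = np.linspace(-1, 1, 200)
y_val = np.sin(2 * np.pi * x_val) + rng.normal(0, 0.3, x_val.size)

train_err, val_err = [], []
for degree in range(1, 13):            # model complexity = polynomial degree
    coeffs = np.polyfit(x, y, degree)
    train_err.append(np.mean((np.polyval(coeffs, x) - y) ** 2))
    val_err.append(np.mean((np.polyval(coeffs, x_val) - y_val) ** 2))

# Empirical risk keeps shrinking as complexity grows, while risk on fresh
# data bottoms out at an intermediate degree: the trade-off that
# multi-objective learning treats as two explicit objectives.
print("best degree by validation error:", 1 + int(np.argmin(val_err)))
```

A multi-objective formulation would keep the whole Pareto front of (empirical risk, complexity) pairs rather than picking a single degree up front.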

7 citations

Journal Article
TL;DR: This paper develops deviation inequalities and a symmetrization inequality for the learning process, derives risk bounds based on the covering number, and studies the asymptotic convergence and the rate of convergence of the learning process for a Lévy process.
Abstract: Lévy processes are a class of stochastic processes that includes, for example, Poisson processes and Brownian motions, and they play an important role in stochastic processes and machine learning. It is therefore essential to study risk bounds of the learning process for time-dependent samples drawn from a Lévy process (briefly, the learning process for a Lévy process). Notably, samples in this learning process are not independently and identically distributed (i.i.d.), so results from traditional statistical learning theory are not applicable (or at least cannot be applied directly), because they are obtained under the sample-i.i.d. assumption. In this paper, we study risk bounds of the learning process for time-dependent samples drawn from a Lévy process, and then analyze the asymptotic behavior of the learning process. In particular, we first develop deviation inequalities and a symmetrization inequality for the learning process. Using the resulting inequalities, we then obtain risk bounds based on the covering number. Finally, based on these risk bounds, we study the asymptotic convergence and the rate of convergence of the learning process for a Lévy process, and give a comparison with the related results under the sample-i.i.d. assumption.
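For orientation, the classical i.i.d. covering-number risk bound that results of this kind generalize has the following shape (the constants shown are one textbook variant and vary by source; the paper's non-i.i.d. bounds replace this form with Lévy-process-specific quantities):

```latex
% Classical i.i.d. risk bound via the covering number \mathcal{N}
% (constants illustrative; R = expected risk, R_emp = empirical risk,
% n = sample size, \xi = deviation level).
\Pr\Big\{\sup_{f\in\mathcal{F}}\big|R(f)-R_{\mathrm{emp}}(f)\big|>\xi\Big\}
  \;\le\; 8\,\mathcal{N}\!\Big(\mathcal{F},\tfrac{\xi}{8},2n\Big)
  \exp\!\Big(-\tfrac{n\xi^{2}}{32}\Big)
```

The symmetrization inequality is what lets the supremum over the (possibly infinite) class be controlled by the covering number of its restriction to a double sample; the paper's contribution is establishing the analogous steps without the i.i.d. assumption.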

7 citations

Proceedings ArticleDOI
29 May 2012
TL;DR: The results show that SVMs can overcome the ANNs' inherent drawback of slow training convergence and demonstrate high potential for the chosen task of modeling unsteady aerodynamics.
Abstract: Accurately modeling nonlinear and unsteady aerodynamics in high-angle-of-attack flight plays an important role in the design of future high-performance fighters, and it can also improve the prediction of high-angle-of-attack dynamics for normal aircraft configurations. Support vector machines (SVMs), a novel type of learning machine based on statistical learning theory and the structural risk minimization (SRM) principle, can be used to handle regression problems. By applying a nonlinear transformation from the complex input space to a high-dimensional feature space, SVMs approximate the regression function by a linear regression in the feature space; this formulation is simple enough to be analyzed mathematically. Using SVMs, the present work models the unsteady pitching-oscillation aerodynamic data of a 1/10-scale aircraft model, with input data taken from wind tunnel experiments at different frequencies and amplitudes. For comparison, the artificial neural network (ANN) technique is also used. The results show that SVMs overcome the ANNs' inherent drawback of slow training convergence, and consequently demonstrate high potential for modeling unsteady aerodynamics.
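The feature-space idea in the abstract, a nonlinear map under which the regression becomes linear, can be made concrete with scikit-learn. The wind-tunnel data are not available, so the target function and all hyperparameters below are assumptions; a random-Fourier-feature map (`RBFSampler`) stands in as an explicit approximation of the RBF kernel's feature space, so that ordinary ridge regression in that space behaves like kernel SVR on the original inputs.

```python
import numpy as np
from sklearn.kernel_approximation import RBFSampler  # explicit approximate feature map
from sklearn.linear_model import Ridge
from sklearn.svm import SVR

# Illustrative stand-in for the pitching-oscillation data.
rng = np.random.default_rng(2)
X = rng.uniform(-2, 2, (300, 1))
y = np.tanh(3 * X).ravel() + rng.normal(0, 0.05, 300)

# Route 1: map inputs into a high-dimensional feature space explicitly,
# then fit a plain linear (ridge) regression there.
feat = RBFSampler(gamma=1.0, n_components=500, random_state=0)
Z = feat.fit_transform(X)
lin = Ridge(alpha=1e-3).fit(Z, y)

# Route 2: kernel SVR works in the same feature space implicitly.
svr = SVR(kernel="rbf", gamma=1.0, C=10.0).fit(X, y)

mse_lin = np.mean((lin.predict(Z) - y) ** 2)
mse_svr = np.mean((svr.predict(X) - y) ** 2)
print("feature-space linear fit MSE:", mse_lin)
print("kernel SVR MSE:", mse_svr)
```

Both routes capture the nonlinearity that a linear fit in the original input space would miss, which is the property the abstract highlights as making the SVM formulation mathematically tractable.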

7 citations


Network Information
Related Topics (5)
- Artificial neural network: 207K papers, 4.5M citations (86% related)
- Cluster analysis: 146.5K papers, 2.9M citations (82% related)
- Feature extraction: 111.8K papers, 2.1M citations (81% related)
- Optimization problem: 96.4K papers, 2.1M citations (80% related)
- Fuzzy logic: 151.2K papers, 2.3M citations (79% related)
Performance Metrics
No. of papers in the topic in previous years

Year    Papers
2023    9
2022    19
2021    59
2020    69
2019    72
2018    47