scispace - formally typeset
Topic

Statistical learning theory

About: Statistical learning theory is a research topic. Over its lifetime, 1618 publications have been published within this topic, receiving 158033 citations.


Papers
Book Chapter
01 Jan 2013
Abstract: The finite sample distribution of many nonparametric methods from statistical learning theory is unknown, both because the distribution P from which the data were generated is unknown and because often only asymptotic results on the behaviour of such methods exist.

12 citations

Proceedings Article
Lei Guo, Xuena Liu, Youxi Wu, Weili Yan, Xueqin Shen
22 Oct 2007
TL;DR: Segmentation of MRI images based on a multi-classification SVM (MCSVM) is investigated; the method reaches satisfactory generalization accuracy, demonstrating the potential of SVMs in image segmentation.
Abstract: In head MRI images, the boundary of each encephalic tissue is highly complicated and irregular, posing a real challenge to traditional segmentation algorithms. As a machine learning method, the support vector machine (SVM), based on statistical learning theory (SLT), has high generalization ability, especially for datasets with a small number of samples in a high-dimensional space. SVM was originally developed for two-class classification; here it is extended to solve the multi-class classification problem. In this paper, 57-dimensional feature vectors for MRI images are selected as input for the SVM, and segmentation of MRI images based on the multi-classification SVM (MCSVM) is investigated. As our experiment demonstrates, the boundaries of 7 kinds of encephalic tissues are extracted successfully with satisfactory generalization accuracy. Thus, SVM exhibits great potential in image segmentation.

12 citations
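The abstract above builds a multi-class classifier out of binary SVMs. A minimal sketch of that idea, assuming scikit-learn is available: `SVC` extends the binary SVM to multiple classes internally via one-vs-one voting. The bundled iris data stands in for the paper's 57-dimensional MRI feature vectors, which are not reproduced here.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Small multi-class dataset (a stand-in for the paper's 57-D MRI features)
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Standardize features: SVM margins are sensitive to feature scale
scaler = StandardScaler().fit(X_train)

# SVC handles the 3 classes via pairwise (one-vs-one) binary SVMs
clf = SVC(kernel="rbf", C=1.0).fit(scaler.transform(X_train), y_train)
print(clf.score(scaler.transform(X_test), y_test))
```

The one-vs-one construction trains k(k-1)/2 binary SVMs for k classes; one-vs-rest is the other common scheme for extending SVMs to the multi-class setting.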

Book Chapter
01 Jan 2004
TL;DR: This chapter builds on the theoretical knowledge of SVMs to apply them to the classification of multi- and hyperspectral remote sensing data and to overcome the Hughes phenomenon.
Abstract: As discussed in Chap. 5, support vector machines (SVMs) originated from statistical learning theory for classification and regression problems. Unlike popular neural network classifiers, SVMs do not minimize the empirical training error (Byun and Lee 2003). Instead, they aim to maximize the margin between two classes of interest by placing a linear separating hyperplane between them; in doing so, the upper bound on the generalization error is minimized. Thus, SVM-based classifiers are expected to have greater generalization capability than neural networks. Other advantages of SVMs are their ability to adapt their learning characteristics via a kernel function and to adequately classify data in a high-dimensional feature space with a limited number of training samples, thereby overcoming the Hughes phenomenon. The theoretical background on SVMs has been presented in Chap. 5; this chapter builds on that knowledge to apply SVMs to the classification of multi- and hyperspectral remote sensing data.

11 citations
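The margin-maximization idea described in the abstract above can be made concrete on toy data. A minimal sketch, assuming scikit-learn and synthetic 2-D points rather than remote sensing data: a linear SVM places the hyperplane w.x + b = 0 so that the margin 2/||w|| between the two classes is maximal.

```python
import numpy as np
from sklearn.svm import SVC

# Two well-separated 2-D point clouds
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=(-3, -3), size=(20, 2)),
               rng.normal(loc=(3, 3), size=(20, 2))])
y = np.array([0] * 20 + [1] * 20)

# A (nearly) hard-margin linear SVM: large C penalizes margin violations,
# so the solver maximizes the margin between the separable classes
clf = SVC(kernel="linear", C=1e6).fit(X, y)
w, b = clf.coef_[0], clf.intercept_[0]
margin = 2 / np.linalg.norm(w)   # geometric margin width
print(margin)
```

The same margin argument underlies the generalization-error bound the chapter refers to: a larger margin gives a smaller bound, independently of the feature-space dimension.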

Journal Article
TL;DR: In this paper, orthogonal projection kernels of the father wavelet (OPFW kernels) are introduced into SVMs, yielding good performance in both approximation and generalisation.
Abstract: Recent work on wavelet theory shows that wavelets have not only the multi-resolution property in both the frequency and time domains, but also good approximation ability. SVMs, based on statistical learning theory, are a general and effective class of learning machines with broad application in machine learning. However, they face a bottleneck: the pre-selection of kernel parameters. In this paper, the orthogonal projection kernels of the father wavelet (OPFW kernels) are introduced into SVMs. SVMs based on the OPFW kernels achieve good performance in both approximation and generalisation, and the parameter pre-selection for the wavelet kernels can be carried out by the discrete wavelet transform. Experiments on regression estimation illustrate the approximation and generalisation ability of the method.

11 citations
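The paper's OPFW kernels are specific to orthogonal father-wavelet projections and are not reproduced here. As a rough illustration of plugging a wavelet-shaped kernel into an SVM, the sketch below uses a Morlet-style translation-invariant wavelet kernel (a common construction in the wavelet-kernel literature, assumed here, not the paper's kernel) passed to scikit-learn's SVR as a callable:

```python
import numpy as np
from sklearn.svm import SVR

def wavelet_kernel(X, Y, a=1.0):
    """Morlet-style wavelet kernel: product over input dimensions of
    cos(1.75*d/a) * exp(-d^2 / (2*a^2)), where d = x_i - y_j pairwise."""
    D = X[:, None, :] - Y[None, :, :]          # shape (n, m, n_dims)
    return np.prod(np.cos(1.75 * D / a) * np.exp(-D**2 / (2 * a**2)), axis=2)

# Toy regression: recover a sine curve with the wavelet kernel
X = np.linspace(0, 2 * np.pi, 60).reshape(-1, 1)
y = np.sin(X).ravel()
model = SVR(kernel=wavelet_kernel, C=10.0, epsilon=0.01).fit(X, y)
print(model.score(X, y))   # coefficient of determination on the training data
```

The dilation parameter a plays the role the paper's discrete wavelet transform is used to pre-select; here it is simply fixed by hand.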

Journal Article
TL;DR: This paper introduces a method to compute the Shattering coefficient of DT models using recurrence equations, and assesses the bias of models provided by DT algorithms while solving practical problems, as well as their overall learning bounds in light of the SLT.
Abstract: In spite of the relevance of Decision Trees (DTs), there is still a disconnection between their theoretical and practical results when selecting models to address specific learning tasks. A particular criterion is provided by the Shattering coefficient, a growth function formulated in the context of the Statistical Learning Theory (SLT), which measures the complexity of the algorithm bias as sample sizes increase. In an attempt to establish the basis for a relative theoretical complexity analysis, this paper introduces a method to compute the Shattering coefficient of DT models using recurrence equations. Next, we assess the bias of models provided by DT algorithms while solving practical problems, as well as their overall learning bounds in light of the SLT. As the main contribution, our results help other researchers decide on the most adequate DT models to tackle specific supervised learning tasks.

11 citations
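The Shattering coefficient mentioned above counts how many distinct labelings a hypothesis class can induce on n points. For intuition (this is not the paper's DT recurrence), a brute-force count for 1-D decision stumps, the simplest single-split "tree", matches the closed form 2n on n distinct points:

```python
import numpy as np

def stump_dichotomies(x):
    """Count the distinct labelings of points x induced by 1-D decision
    stumps h(z) = s * sign(z - t), enumerating all useful thresholds t
    and both orientations s."""
    x = np.sort(np.asarray(x, dtype=float))
    # Candidate thresholds: below all points, between each adjacent pair,
    # and above all points
    thresholds = np.concatenate(([x[0] - 1], (x[:-1] + x[1:]) / 2, [x[-1] + 1]))
    labelings = set()
    for t in thresholds:
        base = tuple(bool(v) for v in (x >= t))
        labelings.add(base)
        labelings.add(tuple(not b for b in base))  # sign-flipped stump
    return len(labelings)

n = 10
x = np.random.default_rng(0).uniform(size=n)
print(stump_dichotomies(x))  # 2 * n = 20 distinct dichotomies
```

The count is 2(n+1) threshold/orientation pairs minus the two labelings (all-positive, all-negative) that both orientations can realize, giving 2n: a growth function linear in n, far below the 2^n of a class that shatters every sample.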


Network Information
Related Topics (5)

Topic                       Papers    Citations   Related
Artificial neural network   207K      4.5M        86%
Cluster analysis            146.5K    2.9M        82%
Feature extraction          111.8K    2.1M        81%
Optimization problem        96.4K     2.1M        80%
Fuzzy logic                 151.2K    2.3M        79%
Performance Metrics
No. of papers in the topic in previous years:

Year   Papers
2023   9
2022   19
2021   59
2020   69
2019   72
2018   47