Topic

Statistical learning theory

About: Statistical learning theory is a research topic. Over its lifetime, 1,618 publications have been published within this topic, receiving 158,033 citations.


Papers
Journal ArticleDOI
TL;DR: Support Vector Regression (SVR) is gaining in popularity in the detection of outliers and classification problems in high-dimensional data (HDD) as this technique does not require the data to be of high quality.
Abstract: Support Vector Regression (SVR) is gaining in popularity in the detection of outliers and classification problems in high-dimensional data (HDD) as this technique does not require the data to be of...

6 citations

Posted ContentDOI
TL;DR: This paper embeds evolutionary computation into the most prominent representative of this class of learning methods, namely Support Vector Machines (SVMs), and shows that evolutionary SVMs are at least as accurate as their quadratic programming counterparts on eight real-world benchmark data sets in terms of generalization performance.
Abstract: In this paper we embed evolutionary computation into statistical learning theory. First, we outline the connection between large margin optimization and statistical learning and see why this paradigm is successful for many pattern recognition problems. We then embed evolutionary computation into the most prominent representative of this class of learning methods, namely into Support Vector Machines (SVMs). In contrast to former applications of evolutionary algorithms to SVMs, we do not optimize only the method or kernel parameters. Rather, we use evolution strategies to directly solve the posed constrained optimization problem. Transforming the problem into the Wolfe dual reduces the total runtime and allows the use of kernel functions just as for traditional SVMs. We show that evolutionary SVMs are at least as accurate as their quadratic programming counterparts on eight real-world benchmark data sets in terms of generalization performance. They always outperform traditional approaches in terms of the original optimization problem. Additionally, the proposed algorithm is more generic than existing traditional solutions, since it also works for non-positive semidefinite or indefinite kernel functions. The evolutionary SVM variants frequently outperform their quadratic programming competitors in cases where such an indefinite kernel function is used.

6 citations
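The evolution-strategy approach described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the synthetic data, the RBF kernel, the penalty used for the equality constraint, and the simple (1+1)-ES loop are all assumptions made for the example.

```python
# Minimal sketch: a (1+1) evolution strategy directly optimizing the Wolfe
# dual of a soft-margin SVM (illustrative only; not the paper's algorithm).
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic two-class problem.
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(1, 0.5, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])

C = 1.0
K = np.exp(-np.linalg.norm(X[:, None] - X[None, :], axis=2) ** 2)  # RBF kernel

def dual_objective(alpha):
    """Wolfe dual: sum(alpha) - 0.5 * alpha^T (y y^T * K) alpha."""
    return alpha.sum() - 0.5 * alpha @ ((y[:, None] * y[None, :]) * K) @ alpha

def project(alpha):
    """Clip to the box constraints 0 <= alpha_i <= C."""
    return np.clip(alpha, 0.0, C)

alpha = rng.uniform(0, C, len(y))
sigma = 0.1
# The equality constraint sum(alpha_i * y_i) = 0 is handled as a penalty.
best = dual_objective(alpha) - abs(alpha @ y)
for _ in range(2000):
    cand = project(alpha + rng.normal(0, sigma, len(alpha)))
    fit = dual_objective(cand) - abs(cand @ y)
    if fit >= best:
        alpha, best = cand, fit

print("dual objective:", round(float(dual_objective(alpha)), 3))
```

Note the design point the abstract makes: because only kernel evaluations appear in the objective, nothing here requires K to be positive semidefinite, which is why the evolutionary approach also works for indefinite kernels.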

Proceedings ArticleDOI
01 Aug 2013
TL;DR: The support vector machine's classification mechanism and its application in mechanical fault diagnosis are introduced and some of the shortcomings of the machine learning algorithm are put forward.
Abstract: The support vector machine is a machine learning algorithm, developed by Vapnik from statistical learning theory, that classifies data by learning from a small sample. For fault data it can isolate the fault categories accurately even when only a small sample is available. In the present work, the support vector machine's classification mechanism and its application in mechanical fault diagnosis are introduced, and an example is given in which the support vector machine classifies faults of a coal mine scraper conveyor. Finally, some shortcomings of the support vector machine are put forward, and future directions for SVM-based fault diagnosis are discussed.

6 citations
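The small-sample property the abstract relies on can be illustrated with a toy example. This sketch assumes scikit-learn's SVC and invented sensor-like readings; it is not the paper's scraper-conveyor dataset or code.

```python
# Illustrative sketch: SVM fault classification from a small sample.
# The data and class labels are invented stand-ins.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Only 8 samples per class -- SVMs cope with small samples because the
# decision boundary depends only on the support vectors.
normal = rng.normal(0.0, 0.3, (8, 3))
fault = rng.normal(1.5, 0.3, (8, 3))
X = np.vstack([normal, fault])
y = np.array([0] * 8 + [1] * 8)

clf = SVC(kernel="rbf", C=10.0).fit(X, y)
print("support vectors used:", clf.n_support_.sum(), "of", len(y))
print("predicted class for a new reading:", clf.predict([[1.4, 1.6, 1.5]])[0])
```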

Proceedings ArticleDOI
12 Jul 2008
TL;DR: A fast SVM training algorithm is proposed that is significantly faster than LibSVM and requires fewer support vectors to achieve good classification accuracy; by integrating kernel caching, shrinking, and second-order information, a fast quadratic programming (QP) trainer is achieved.
Abstract: A fast SVM training algorithm is proposed in this paper. By integrating kernel caching, shrinking, and second-order information, a fast quadratic programming (QP) trainer is achieved. For the traditional two-class SVM, the generalized error bound derived from statistical learning theory (SLT) is computed and minimized for the selection of parameters, with the Zoutendijk (ZQP) idea and a parallel method used to speed up the process. For the one-class SVM, a compression criterion is proposed to search for the best kernel width automatically. Experiments demonstrate that the proposed method is significantly faster than LibSVM and requires fewer support vectors to achieve good classification accuracy.

6 citations
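One ingredient named in the abstract, kernel caching, can be sketched in isolation: a QP working-set solver revisits the same kernel-matrix rows many times, so caching them avoids recomputation. The cache size, data, and access pattern below are illustrative assumptions, not the paper's trainer.

```python
# Toy sketch of kernel caching: keep recently used kernel rows in an LRU
# cache so the solver does not recompute them on every working-set pass.
from functools import lru_cache
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 5))

calls = {"n": 0}

@lru_cache(maxsize=64)
def kernel_row(i):
    """Row i of the RBF kernel matrix, computed on demand and cached."""
    calls["n"] += 1
    d = np.linalg.norm(X - X[i], axis=1)
    return tuple(np.exp(-d ** 2))  # tuple so the result is hashable/cacheable

# A solver's working set revisits the same rows repeatedly.
for i in [0, 1, 2, 0, 1, 2, 0, 1]:
    row = kernel_row(i)

print("rows actually computed:", calls["n"])  # 3, not 8
```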

Proceedings ArticleDOI
Lei Guo, Youxi Wu, Weili Yan, Xueqin Shen, Ying Li
01 Jan 2005
TL;DR: A medical diagnosis decision system (MDDSS) based on SVM has been established to intelligently diagnose 4 types of acid-base disturbance, demonstrating the great potential of SVM in MDDSS.
Abstract: The support vector machine (SVM) is a new learning technique based on statistical learning theory (SLT). In this paper, a medical diagnosis decision system (MDDSS) based on SVM is established to intelligently diagnose 4 types of acid-base disturbance. SVM was originally developed for two-class classification; here it is extended to the multi-class case as a hierarchical SVM, using a clustering algorithm based on stepwise decomposition. Compared with other classical classification techniques, SVM not only has a more solid theoretical foundation but also greater generalization ability, as our experiment demonstrates. Thus, SVM shows great potential in MDDSS.

6 citations
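The hierarchical, clustering-based decomposition the abstract describes can be sketched roughly as follows: class centroids are clustered into two super-groups, a root SVM separates the super-groups, and a binary SVM decides among the classes inside each group. The class layout, data, and names are illustrative assumptions, not the paper's acid-base diagnosis data or exact algorithm.

```python
# Illustrative sketch of a hierarchical multi-class SVM built by
# clustering-based stepwise decomposition (not the authors' code).
import numpy as np
from sklearn.svm import SVC
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
centers = np.array([[0, 0], [0, 3], [3, 0], [3, 3]])  # 4 hypothetical classes
X = np.vstack([c + rng.normal(0, 0.3, (25, 2)) for c in centers])
y = np.repeat(np.arange(4), 25)

# Step 1: cluster the class centroids into two super-groups.
groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(centers)

# Step 2: a root SVM separates the two super-groups.
root = SVC(kernel="rbf").fit(X, groups[y])

# Step 3: one binary SVM per super-group for the classes inside it.
leaves = {}
for g in (0, 1):
    mask = np.isin(y, np.flatnonzero(groups == g))
    leaves[g] = SVC(kernel="rbf").fit(X[mask], y[mask])

def predict(x):
    """Route a sample down the tree: root SVM, then the leaf SVM."""
    g = root.predict(x)[0]
    return leaves[g].predict(x)[0]

print("predicted class near centroid 2:", predict([[3.1, -0.1]]))
```

Each classifier in the tree stays binary, which is the point of the stepwise decomposition: a K-class problem is solved with K - 1 two-class SVMs instead of one large multi-class optimization.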


Network Information
Related Topics (5)
Artificial neural network: 207K papers, 4.5M citations (86% related)
Cluster analysis: 146.5K papers, 2.9M citations (82% related)
Feature extraction: 111.8K papers, 2.1M citations (81% related)
Optimization problem: 96.4K papers, 2.1M citations (80% related)
Fuzzy logic: 151.2K papers, 2.3M citations (79% related)
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    9
2022    19
2021    59
2020    69
2019    72
2018    47