Topic
Statistical learning theory
About: Statistical learning theory is a research topic. Over the lifetime, 1618 publications have been published within this topic receiving 158033 citations.
Papers published on a yearly basis
Papers
05 Jan 2004
TL;DR: The work presented here examines the feasibility of applying SVMs to aerodynamic modeling through empirical comparisons between SVMs and the commonly used neural-network technique on two practical data-modeling cases.
Abstract: Aerodynamic data modeling plays an important role in aerospace and industrial fluid engineering. Support vector machines (SVMs), a novel type of learning algorithm based on statistical learning theory, can be used for regression problems and have been reported to perform well with promising results. The work presented here examines the feasibility of applying SVMs to the aerodynamic modeling field. Mainly, empirical comparisons between SVMs and the commonly used neural-network technique are carried out through two practical data modeling cases – performance prediction of a new prototype mixer for engine combustors, and calibration of a five-hole pressure probe. A CFD-based diffuser optimization design is also involved in the article, in which an SVM is used to construct a response surface, thereby allowing the optimization to run on an easily computable surrogate space. The obtained simulation results in all the application cases demonstrate that SVMs are the potential options for the ch...
13 citations
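The surrogate-based optimization idea in the abstract above can be sketched briefly: fit an SVM regression model (a response surface) to a handful of expensive simulation samples, then optimize on the cheap surrogate instead of the solver. The data below is a hypothetical stand-in for CFD output, not from the paper, and the hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Stand-in for expensive CFD samples: a 1-D design variable x and a
# noisy loss-like response y (hypothetical data, not from the paper).
x = rng.uniform(-2.0, 2.0, size=(40, 1))
y = (x[:, 0] - 0.5) ** 2 + 0.05 * rng.normal(size=40)

# Fit an RBF-kernel SVR as the response surface (surrogate model).
surrogate = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(x, y)

# Optimize on the cheap surrogate instead of re-running the solver:
# a dense grid evaluation costs almost nothing compared to one CFD run.
grid = np.linspace(-2.0, 2.0, 401).reshape(-1, 1)
x_best = float(grid[np.argmin(surrogate.predict(grid)), 0])
print(round(x_best, 2))  # should land near the true minimum at 0.5
```

In a real design loop the surrogate would be refit as new solver samples arrive near the current optimum.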
TL;DR: The objective of this paper is to discuss and compare some aspects of pattern recognition across the various frameworks in which it has traditionally been formulated, including techniques and methods imported from statistical learning theory.
Abstract: The objective of this paper is to discuss and compare some aspects of pattern recognition across the various frameworks in which pattern recognition has traditionally been formulated. The primary goal of pattern recognition is supervised or unsupervised classification. More recently, neural network techniques and methods imported from statistical learning theory have been receiving increasing attention. The design of a recognition system requires careful attention to the following issues: definition of pattern classes, sensing environment, pattern representation, feature extraction and selection, cluster analysis, classifier design and learning, selection of training and test samples, and performance evaluation.
13 citations
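The design issues listed in the abstract above (feature extraction, classifier design, training/test sample selection, performance evaluation) map naturally onto a small pipeline. A minimal sketch, using the classic iris data set as a stand-in for a real sensing problem:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Pattern classes and representation: iris stands in for real sensed data.
X, y = load_iris(return_X_y=True)

# Selection of training and test samples (stratified to preserve classes).
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Feature scaling plus classifier design in one pipeline; the SVC is one
# of the methods imported from statistical learning theory.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)

# Performance evaluation on held-out samples.
acc = clf.score(X_te, y_te)
print(f"{acc:.2f}")
```

The held-out accuracy is the performance-evaluation step; in a full system, cluster analysis and feature selection would precede classifier design.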
21 Jun 2019
TL;DR: In this article, the authors show that the standard complexity measures of Gaussian and Rademacher complexities and VC dimension are sufficient measures of complexity for the purposes of bounding the generalization error and learning rates of hypothesis classes in this setting.
Abstract: Statistical learning theory has largely focused on learning and generalization given independent and identically distributed (i.i.d.) samples. Motivated by applications involving time-series data, there has been a growing literature on learning and generalization in settings where data is sampled from an ergodic process. This work has also developed complexity measures, which appropriately extend the notion of Rademacher complexity to bound the generalization error and learning rates of hypothesis classes in this setting. Rather than time-series data, our work is motivated by settings where data is sampled on a network or a spatial domain, and thus does not fit well within the framework of prior work. We provide learning and generalization bounds for data that are complexly dependent, yet whose distribution satisfies the standard Dobrushin's condition. Indeed, we show that the standard complexity measures of Gaussian and Rademacher complexities and VC dimension are sufficient measures of complexity for the purposes of bounding the generalization error and learning rates of hypothesis classes in our setting. Moreover, our generalization bounds only degrade by constant factors compared to their i.i.d. analogs, and our learnability bounds degrade by log factors in the size of the training set.
13 citations
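For context, the i.i.d. baseline that the abstract's bounds degrade against by only constant factors is the standard Rademacher-complexity generalization bound: for a hypothesis class $\mathcal{H}$ with losses in $[0,1]$, true risk $R(h)$, empirical risk $\hat{R}_n(h)$ on an i.i.d. sample of size $n$, and empirical Rademacher complexity $\mathfrak{R}_n(\mathcal{H})$, with probability at least $1-\delta$,

$$
\sup_{h \in \mathcal{H}} \left( R(h) - \hat{R}_n(h) \right) \;\le\; 2\,\mathfrak{R}_n(\mathcal{H}) + \sqrt{\frac{\log(1/\delta)}{2n}}.
$$

The paper's contribution, as summarized above, is that a bound of this same form continues to hold (up to constants) when the i.i.d. assumption is replaced by Dobrushin's condition on the sample's joint distribution.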
TL;DR: A new system identification method based on support vector regression (SVR) is proposed for linear-in-parameters models, and the effectiveness of the proposed method is examined through numerical examples.
13 citations
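The core idea of SVR-based identification of a linear-in-parameters model can be sketched as follows: stack past inputs and outputs into a regressor matrix, then estimate the parameters with an epsilon-insensitive linear SVR. The first-order system and all hyperparameters below are hypothetical illustrations, not taken from the paper.

```python
import numpy as np
from sklearn.svm import LinearSVR

rng = np.random.default_rng(1)

# Hypothetical linear-in-parameters system:
#   y[k] = a*y[k-1] + b*u[k-1] + noise
a_true, b_true = 0.8, 0.5
u = rng.normal(size=200)
y = np.zeros(200)
for k in range(1, 200):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 0.01 * rng.normal()

# Regressor matrix phi[k] = [y[k-1], u[k-1]]; identification reduces to a
# linear regression that SVR solves with an epsilon-insensitive loss.
phi = np.column_stack([y[:-1], u[:-1]])
target = y[1:]

model = LinearSVR(epsilon=0.0, C=100.0, max_iter=20000).fit(phi, target)
a_hat, b_hat = (float(c) for c in model.coef_)
print(round(a_hat, 2), round(b_hat, 2))  # close to 0.8 and 0.5
```

The epsilon-insensitive loss makes the estimate robust to small measurement noise; larger epsilon would trade bias for sparser support.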
TL;DR: The concepts of fuzzy expected risk functional, fuzzy empirical risk functional and fuzzy empirical risk minimization principle are redefined, and the key theorem of learning theory based on fuzzy-number samples is proved.
13 citations