Topic
Statistical learning theory
About: Statistical learning theory is a research topic. Over the lifetime, 1618 publications have been published within this topic receiving 158033 citations.
[Chart: papers published on a yearly basis]
Papers
TL;DR: Proposes a Bayesian algorithm for wireless fingerprinting and indoor location determination that uses fuzzy clustering with Bayesian learning, grounded in statistical learning theory.
Abstract: For indoor positioning, wireless fingerprinting is the most favorable approach because it is the most accurate among wireless-network-based techniques and requires no special equipment dedicated to positioning. Deploying a fingerprinting method consists of an off-line phase and an on-line phase, and more efficient and accurate methods have been studied. This paper proposes a Bayesian algorithm for wireless fingerprinting and indoor location determination that uses fuzzy clustering with Bayesian learning, grounded in statistical learning theory.
3 citations
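The Bayesian decision step behind fingerprint-based positioning can be sketched as a MAP estimate over calibration points. The fingerprint database, location names, and Gaussian noise model below are illustrative assumptions, not taken from the paper (which additionally applies fuzzy clustering in the off-line phase):

```python
import math

# Hypothetical off-line fingerprint database: mean RSSI (dBm) seen from
# three access points at each calibration location (values are made up).
fingerprints = {
    "room_A": [-40.0, -70.0, -60.0],
    "room_B": [-65.0, -45.0, -75.0],
    "room_C": [-75.0, -68.0, -42.0],
}
SIGMA = 4.0  # assumed RSSI noise standard deviation in dB

def log_likelihood(observed, mean_rssi, sigma=SIGMA):
    """Gaussian log-likelihood of an observed RSSI vector given a fingerprint."""
    return sum(
        -0.5 * ((o - m) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))
        for o, m in zip(observed, mean_rssi)
    )

def locate(observed, prior=None):
    """On-line phase: MAP location estimate, posterior ∝ prior × likelihood."""
    prior = prior or {loc: 1.0 / len(fingerprints) for loc in fingerprints}
    scores = {
        loc: math.log(prior[loc]) + log_likelihood(observed, mean)
        for loc, mean in fingerprints.items()
    }
    return max(scores, key=scores.get)

print(locate([-42.0, -69.0, -61.0]))  # closest to room_A's fingerprint
```

A non-uniform `prior` could encode knowledge from previous position fixes; the paper's fuzzy clustering would restrict the candidate set before this step.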
26 Aug 2004
TL;DR: The hybrid genetic algorithm based on statistical learning theory proposed in this paper shows better generalization ability on the testing sample set.
Abstract: Based on statistical learning theory and support vector machines, a novel fitness function is constructed according to the structural risk minimization principle. A new hybrid genetic process is then presented and implemented with real coding. This hybrid genetic process is used to optimize neural networks, and a classification task is taken as an example to examine the performance of the new hybrid genetic algorithm. The simulation results are compared with those obtained from neural networks trained by a conventional genetic algorithm. The hybrid genetic algorithm based on statistical learning theory shows better generalization ability on the testing sample set.
3 citations
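The structural-risk-minimization fitness idea can be illustrated with a toy real-coded GA: the fitness combines empirical risk with a capacity penalty, so individuals that fit the sample well *and* stay simple are preferred. The data, penalty weight, and GA settings below are assumptions for illustration; the paper applies this to neural-network training rather than the linear fit used here:

```python
import random

random.seed(0)

# Toy data: y = 2x + 1 plus noise (illustrative; the paper's task is
# neural-network classification, simplified here to a linear fit).
data = [(x / 10.0, 2 * (x / 10.0) + 1 + random.gauss(0, 0.05)) for x in range(20)]

LAMBDA = 0.01  # assumed weight on the capacity (complexity) term

def structural_risk(w):
    """SRM-style fitness: empirical risk plus a complexity penalty (lower is better)."""
    emp = sum((y - (w[0] * x + w[1])) ** 2 for x, y in data) / len(data)
    capacity = sum(wi * wi for wi in w)  # crude stand-in for model capacity
    return emp + LAMBDA * capacity

def evolve(pop_size=30, gens=200, mut=0.1):
    """Minimal real-coded GA: tournament selection, blend crossover, Gaussian mutation."""
    pop = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            a = min(random.sample(pop, 2), key=structural_risk)
            b = min(random.sample(pop, 2), key=structural_risk)
            alpha = random.random()
            child = [alpha * ai + (1 - alpha) * bi for ai, bi in zip(a, b)]
            nxt.append([c + random.gauss(0, mut) for c in child])
        pop = nxt
    return min(pop, key=structural_risk)

best = evolve()
print(best)  # best individual found
```

Under these settings the best individual should land near slope 2 and intercept 1, with the capacity penalty pulling the weights slightly toward zero.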
23 Jan 2009
TL;DR: Based on the theory of support vector machines for regression (SVR), an SVR model is established for predicting the output of a fully mechanized mining face; the model is implemented in Matlab and compared with a genetic neural network prediction model.
Abstract: The support vector machine is a machine learning technique developed on the basis of statistical learning theory; it has become a focus of machine learning research because of its excellent learning performance. Based on an analysis of the theory of support vector machines for regression (SVR), an SVR model is established for predicting the output of a fully mechanized mining face. The model is implemented in Matlab and compared with a genetic neural network (GNN) prediction model. The results show that SVR achieves higher prediction accuracy than GNN, which demonstrates the validity and practicality of the model.
3 citations
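The regression machinery behind such a model can be sketched with a tiny RBF-kernel regressor. Note this is kernel ridge regression, a close stand-in sharing the same kernel machinery, not the epsilon-insensitive SVR formulation the paper uses, and the "output" series, kernel width, and regularization below are made-up assumptions:

```python
import math

GAMMA, LAM = 10.0, 1e-4  # assumed RBF kernel width and ridge regularization

def rbf(a, b):
    """RBF (Gaussian) kernel on scalars."""
    return math.exp(-GAMMA * (a - b) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for the small dense system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def fit(xs, ys):
    """Solve (K + LAM*I) alpha = y for the dual coefficients."""
    K = [[rbf(xi, xj) + (LAM if i == j else 0.0) for j, xj in enumerate(xs)]
         for i, xi in enumerate(xs)]
    return solve(K, ys)

def predict(xs, alpha, x):
    return sum(a * rbf(xi, x) for a, xi in zip(alpha, xs))

# Hypothetical normalized "daily output" series; values are made up.
xs = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
ys = [1.0, 1.4, 1.9, 2.1, 2.4, 3.0]
alpha = fit(xs, ys)
print(predict(xs, alpha, 0.5))  # should interpolate to roughly 2.0
```

A production SVR (e.g. via a library) would add the epsilon-insensitive loss and a sparse set of support vectors; the kernel evaluation and prediction step look the same.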
TL;DR: This article proposes margin-based multicategory classification methods with a reject option and introduces a new and unique refine option for the multicategory problem, where the class of an observation is predicted to be from a set of class labels, whose cardinality is not necessarily one.
Abstract: In many real applications of statistical learning, a decision made from misclassification can be too costly to afford; in this case, a reject option, which defers the decision until further investigation is conducted, is often preferred. In recent years, there has been much development for binary classification with a reject option. Yet, little progress has been made for the multicategory case. In this article, we propose margin-based multicategory classification methods with a reject option. In addition, and more importantly, we introduce a new and unique refine option for the multicategory problem, where the class of an observation is predicted to be from a set of class labels, whose cardinality is not necessarily one. The main advantage of both options lies in their capacity of identifying error-prone observations. Moreover, the refine option can provide more constructive information for classification by effectively ruling out implausible classes. Efficient implementations have been developed for the proposed methods. On the theoretical side, we offer a novel statistical learning theory and show a fast convergence rate of the excess $\ell$-risk of our methods with emphasis on diverging dimensionality and number of classes. The results can be further improved under a low noise assumption. A set of comprehensive simulation and real data studies has shown the usefulness of the new learning tools compared to regular multicategory classifiers. Detailed proofs of theorems and extended numerical results are included in the supplemental materials available online.
3 citations
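The reject and refine options above can be sketched with a simple score-based rule: defer when no class is confident, otherwise return the set of classes close to the best one. The thresholds and the score-gap rule below are illustrative assumptions, not the paper's margin-based formulation:

```python
def refine_predict(scores, accept_margin=0.2, reject_floor=0.1):
    """Return the set of plausible classes given per-class scores.

    - Reject option: if even the best score is weak, return None and
      defer the decision for further investigation.
    - Refine option: otherwise keep every class whose score is within
      accept_margin of the best; cardinality 1 recovers an ordinary
      classifier, while larger sets flag error-prone observations.
    """
    best = max(scores.values())
    if best < reject_floor:
        return None  # reject: no confident decision
    return {c for c, s in scores.items() if best - s <= accept_margin}

print(refine_predict({"a": 0.9, "b": 0.3, "c": 0.1}))    # {'a'}
print(refine_predict({"a": 0.5, "b": 0.45, "c": 0.1}))   # {'a', 'b'}
print(refine_predict({"a": 0.05, "b": 0.04, "c": 0.02})) # None
```

The second call shows the refine option ruling out only the implausible class `c`; tuning `accept_margin` trades set size against the chance of excluding the true class.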