Topic

Statistical learning theory

About: Statistical learning theory is a research topic. Over its lifetime, 1,618 publications have been published within this topic, receiving 158,033 citations.


Papers
Journal ArticleDOI
TL;DR: This work proposes a novel quantum version of the classical binary classification task, considering maps with classical input and quantum output together with the corresponding classical-quantum training data, and provides sample complexity lower bounds showing that the upper bounds are essentially tight for pure output states.
Abstract: In classical statistical learning theory, one of the most well-studied problems is that of binary classification. The information-theoretic sample complexity of this task is tightly characterized by the Vapnik-Chervonenkis (VC) dimension. A quantum analog of this task, with training data given as a quantum state, has also been intensely studied and is now known to have the same sample complexity as its classical counterpart. We propose a novel quantum version of the classical binary classification task by considering maps with classical input and quantum output and corresponding classical-quantum training data. We discuss learning strategies for the agnostic and for the realizable case and study their performance to obtain sample complexity upper bounds. Moreover, we provide sample complexity lower bounds which show that our upper bounds are essentially tight for pure output states. In particular, we see that the sample complexity is the same as in the classical binary classification task with respect to its dependence on accuracy, confidence, and the VC dimension.
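For orientation, here is a minimal sketch (Python, not taken from the paper) of the classical agnostic sample-complexity scaling referenced above, where d is the VC dimension, eps the accuracy, and delta the confidence parameter; the constant c is a schematic placeholder, since the tight constants are not reproduced here.

```
import math

def agnostic_pac_sample_complexity(vc_dim: int, eps: float, delta: float, c: float = 1.0) -> int:
    """Schematic estimate m ~ c * (d + log(1/delta)) / eps^2 for agnostic
    binary classification; c stands in for an unspecified universal constant."""
    return math.ceil(c * (vc_dim + math.log(1.0 / delta)) / eps ** 2)

# Example: VC dimension 10, accuracy 0.05, confidence 0.01.
print(agnostic_pac_sample_complexity(10, 0.05, 0.01))
```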

8 citations

Proceedings ArticleDOI
13 Oct 2005
TL;DR: Simulation results show that the proposed least squares support vector machines estimation algorithm provides a powerful tool for identification and soft-sensor modeling and shows promise for industrial process applications.
Abstract: The support vector machine (SVM) is a machine learning method based on small-sample statistical learning theory (SLT), and it is powerful for problems with small samples, nonlinearity, high dimensionality, and local minima. SVMs have been very successful in pattern recognition, fault diagnosis, and function estimation problems. The least squares support vector machine (LS-SVM) is an SVM variant that involves equality instead of inequality constraints and works with a least squares cost function. This paper discusses the LS-SVM estimation algorithm and introduces applications of the method to nonlinear control systems. Identification of MIMO models and soft-sensor modeling based on LS-SVM are then proposed. Simulation results show that the proposed method provides a powerful tool for identification and soft-sensor modeling and shows promise for industrial process applications.
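To make the LS-SVM idea concrete, here is a minimal sketch assuming the standard function-estimation formulation with an RBF kernel (the toy dataset and the values of gamma and sigma below are illustrative assumptions, not taken from the paper): training amounts to solving a single linear system rather than the inequality-constrained QP of a standard SVM.

```
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """LS-SVM function estimation: solve
        [ 0      1^T        ] [b    ]   [0]
        [ 1   K + I/gamma   ] [alpha] = [y]
    for the bias b and the dual weights alpha."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy usage: identify a nonlinear static map y = sin(x) from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (60, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(60)
b, alpha = lssvm_fit(X, y)
print(lssvm_predict(X, b, alpha, np.array([[1.0]])))  # roughly sin(1.0)
```

The equality constraints are what collapse the optimization to this linear system; the price is that every training point receives a nonzero dual weight, so the solution is not sparse.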

8 citations

Journal ArticleDOI
TL;DR: The authors reexamine anomalous data from an old study, shedding new light on statistical learning theory and on deterministic assumptions about human behavior.
Abstract: Reexamination of anomalous data from an old study sheds new light on statistical learning theory and deterministic assumptions about human behavior.

8 citations

Posted Content
TL;DR: In this article, the problem of reconstructing encrypted databases from access pattern leakage is shown to be closely related to statistical learning theory, and the authors show that devastatingly small numbers of queries suffice to attain very accurate database reconstruction.
Abstract: We show that the problem of reconstructing encrypted databases from access pattern leakage is closely related to statistical learning theory. This new viewpoint enables us to develop broader attacks that are supported by streamlined performance analyses. First, we address the problem of ε-approximate database reconstruction (ε-ADR) from range query leakage, giving attacks whose query cost scales only with the relative error ε and is independent of the size of the database or the number N of possible values of data items. This already goes significantly beyond the state of the art for such attacks, as represented by Kellaris et al. (ACM CCS 2016) and Lacharité et al. (IEEE S&P 2018). Using real data, we show that devastatingly small numbers of queries are needed to attain very accurate database reconstruction. Finally, we generalize from ranges to consider what learning theory tells us about the impact of access pattern leakage for other classes of queries, focusing on prefix and suffix queries. We illustrate this with both concrete attacks for prefix queries and with a general lower bound for all query classes. We also show a very general reduction from reconstruction with known or chosen queries to PAC learning.
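As a schematic illustration of why query costs can be independent of the domain size N (this is a generic PAC-style estimate, not the paper's actual analysis): range queries behave like intervals on a line, a concept class of constant VC dimension, so a realizable-case sample-complexity bound depends only on the error eps, the confidence delta, and that constant dimension.

```
import math

def realizable_pac_bound(vc_dim: int, eps: float, delta: float, c: float = 1.0) -> int:
    """Schematic realizable-case bound m ~ c * (d*log(1/eps) + log(1/delta)) / eps;
    c stands in for an unspecified universal constant."""
    return math.ceil(c * (vc_dim * math.log(1.0 / eps) + math.log(1.0 / delta)) / eps)

# Intervals on a line (a stand-in for range queries) have VC dimension 2,
# so the estimate varies with eps but not with the number N of possible values.
for eps in (0.1, 0.05, 0.01):
    print(eps, realizable_pac_bound(vc_dim=2, eps=eps, delta=0.05))
```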

8 citations

Proceedings ArticleDOI
09 Nov 2006
TL;DR: A successful adoption of the particle swarm optimization (PSO) algorithm is presented to improve the performance of an SVM classifier for incipient fault syndrome diagnosis of power transformers.
Abstract: Based on statistical learning theory, the Support Vector Machine (SVM) has been well recognized as a powerful computational tool for problems with nonlinearity and high dimensionality. In this paper, we present a successful adoption of the particle swarm optimization (PSO) algorithm to improve the performance of an SVM classifier for incipient fault syndrome diagnosis of power transformers. A PSO-based encoding technique is applied to improve the accuracy of classification. The proposed scheme removes irrelevant input features that may confuse the classifier and optimizes the kernel parameters simultaneously. Experiments on real operational data demonstrate the effectiveness and high efficiency of the proposed approach, which makes operation faster and also increases the accuracy of the classification.
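As a rough sketch of the general idea (a plain global-best PSO over the SVM kernel hyperparameters only; it omits the paper's feature-selection encoding, and the dataset, search ranges, and PSO coefficients below are illustrative assumptions):

```
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=10, n_informative=5, random_state=0)

def fitness(pos):
    """Cross-validated accuracy of an RBF SVM with C and gamma read off a log-space position."""
    C, gamma = 10.0 ** pos[0], 10.0 ** pos[1]
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

# Global-best PSO over (log10 C, log10 gamma) in [-2, 3] x [-4, 1].
n_particles, n_iters = 12, 20
lo, hi = np.array([-2.0, -4.0]), np.array([3.0, 1.0])
pos = rng.uniform(lo, hi, (n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
for _ in range(n_iters):
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([fitness(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved] = pos[improved]
    pbest_val[improved] = vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()

print("best (log10 C, log10 gamma):", gbest, "CV accuracy:", pbest_val.max())
```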

8 citations


Network Information
Related Topics (5)
Artificial neural network: 207K papers, 4.5M citations, 86% related
Cluster analysis: 146.5K papers, 2.9M citations, 82% related
Feature extraction: 111.8K papers, 2.1M citations, 81% related
Optimization problem: 96.4K papers, 2.1M citations, 80% related
Fuzzy logic: 151.2K papers, 2.3M citations, 79% related
Performance Metrics
No. of papers in the topic in previous years:
Year: Papers
2023: 9
2022: 19
2021: 59
2020: 69
2019: 72
2018: 47