Journal ArticleDOI

Learnability and the Vapnik-Chervonenkis dimension

TLDR
This paper shows that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned.
Abstract
Valiant's learnability model is extended to learning classes of concepts defined by regions in Euclidean space E^n. The methods in this paper lead to a unified treatment of some of Valiant's results, along with previous results on distribution-free convergence of certain pattern recognition algorithms. It is shown that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned. Using this parameter, the complexity and closure properties of learnable classes are analyzed, and necessary and sufficient conditions for feasible learnability are given.
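To make the combinatorial parameter concrete, the sketch below (illustrative Python, not from the paper) brute-forces the VC dimension of a toy concept class, closed intervals on the real line, by checking which point sets can be shattered.

```python
# Illustrative sketch (not from the paper): brute-force check of the
# Vapnik-Chervonenkis dimension for a simple concept class -- closed
# intervals [a, b] on the real line. A point set is "shattered" if every
# subset can be picked out by some interval; the VC dimension is the
# size of the largest shatterable set (2 for intervals).
from itertools import combinations

def interval_labelings(points):
    """All labelings of `points` realizable by intervals [a, b]."""
    labelings = set()
    cuts = [min(points) - 1] + sorted(set(points)) + [max(points) + 1]
    for a in cuts:
        for b in cuts:  # a > b yields the empty concept
            labelings.add(tuple(a <= p <= b for p in points))
    return labelings

def shattered(points):
    """True if intervals realize all 2^n labelings of `points`."""
    return len(interval_labelings(points)) == 2 ** len(points)

def vc_dimension(sample, max_d=5):
    """Largest d such that some d-subset of `sample` is shattered."""
    d = 0
    for k in range(1, max_d + 1):
        if any(shattered(list(s)) for s in combinations(sample, k)):
            d = k
    return d

print(vc_dimension([0.0, 1.0, 2.0, 3.0]))  # prints 2: two points shatter, three never do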



Citations

Optimization and approximation on systems of geometric objects

Journal ArticleDOI

A novel iron loss reduction technique for distribution transformers based on a combined genetic algorithm - neural network approach

TL;DR: An effective method is presented for reducing the iron losses of wound-core distribution transformers, based on a combined neural network/genetic algorithm approach; it tackles iron loss reduction during the transformer production phase, whereas previous works concentrated on the design phase.
BookDOI

Computational Theories of Learning and Developmental Psycholinguistics

TL;DR: The primary purpose of this chapter is to explain, to developmental psycholinguists and language scientists more generally, the main conclusions and issues in computational learning theories.
Journal ArticleDOI

Efficient learning with virtual threshold gates

TL;DR: This paper finds ways to maintain the exponentially many weights of Winnow implicitly, so that the time for the algorithm to compute a prediction and update its “virtual” weights is polynomial, and suggests that other online algorithms with multiplicative weight updates, whose loss bounds grow logarithmically with the dimension, are amenable to these methods.
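The “virtual weights” construction itself is specific to that paper and is not reproduced here, but the multiplicative update it builds on is the standard Winnow algorithm; the following minimal Python sketch (names and parameters are illustrative) shows that update for monotone disjunctions.

```python
# Minimal sketch of the standard Winnow algorithm (Littlestone), the
# multiplicative-update learner that the cited paper runs over
# exponentially many "virtual" weights. Names and parameters here are
# illustrative, not taken from the paper.

def winnow(examples, n, alpha=2.0):
    """Online Winnow for monotone disjunctions over n Boolean attributes.

    `examples` is an iterable of (x, y) pairs with x a 0/1 tuple of
    length n and y in {0, 1}. Returns the final weight vector.
    """
    w = [1.0] * n          # one weight per attribute
    theta = n              # standard threshold
    for x, y in examples:
        y_hat = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0
        if y_hat != y:     # mistake: multiplicative update on active attributes
            factor = alpha if y == 1 else 1.0 / alpha
            w = [wi * (factor if xi else 1.0) for wi, xi in zip(w, x)]
    return w

# Target concept: x1 OR x3 over n = 4 attributes.
data = [((1, 0, 0, 0), 1), ((0, 1, 0, 0), 0),
        ((0, 0, 1, 1), 1), ((0, 0, 0, 1), 0)] * 5
print(winnow(data, n=4))
```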
Journal ArticleDOI

Bounding sample size with the Vapnik-Chervonenkis dimension

TL;DR: A proof is given that a concept class is learnable provided its Vapnik-Chervonenkis dimension is finite.
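For context, the headline paper by Blumer, Ehrenfeucht, Haussler, and Warmuth gives an explicit sufficient sample size in terms of the VC dimension d, the accuracy eps, and the confidence delta; the sketch below computes that classic bound (stated from general knowledge, not quoted from the cited article, which tightens such bounds).

```python
# Sketch of the well-known sample-size bound of Blumer, Ehrenfeucht,
# Haussler, and Warmuth: a consistent learner needs at most
#   max( (4/eps) * log2(2/delta), (8*d/eps) * log2(13/eps) )
# examples to PAC-learn a class of VC dimension d to accuracy eps with
# confidence 1 - delta. Stated from general knowledge, not verbatim
# from the cited article.
import math

def behw_sample_size(d, eps, delta):
    """Sufficient sample size for PAC learning a class of VC dimension d."""
    term1 = (4.0 / eps) * math.log2(2.0 / delta)
    term2 = (8.0 * d / eps) * math.log2(13.0 / eps)
    return math.ceil(max(term1, term2))

print(behw_sample_size(d=3, eps=0.1, delta=0.05))  # bound for d=3, eps=0.1, delta=0.05
```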
References
Book

Computers and Intractability: A Guide to the Theory of NP-Completeness

TL;DR: This quarterly column provides a continuing update to the list of problems (NP-complete and harder) presented by M. R. Garey and D. S. Johnson in their book “Computers and Intractability: A Guide to the Theory of NP-Completeness,” W. H. Freeman & Co., San Francisco, 1979.
Book

The Art of Computer Programming

Book

Pattern classification and scene analysis

TL;DR: This book provides a unified, comprehensive, and up-to-date treatment of both statistical and descriptive methods for pattern recognition, including Bayesian decision theory, supervised and unsupervised learning, nonparametric techniques, discriminant analysis, clustering, preprocessing of pictorial data, spatial filtering, shape description techniques, perspective transformations, projective invariants, linguistic procedures, and artificial intelligence techniques for scene analysis.