Open Access Journal Article

Results on learnability and the Vapnik-Chervonenkis dimension

TL;DR: The notion of dynamic sampling is introduced, wherein the number of examples examined may increase with the complexity of the target concept, and this method is used to establish the learnability of various concept classes with an infinite Vapnik-Chervonenkis dimension.
Abstract
We consider the problem of learning a concept from examples in the distribution-free model of Valiant. (An essentially equivalent model, if one ignores issues of computational difficulty, was studied by Vapnik and Chervonenkis.) We introduce the notion of dynamic sampling, wherein the number of examples examined may increase with the complexity of the target concept. This method is used to establish the learnability of various concept classes with an infinite Vapnik-Chervonenkis (VC) dimension. We also discuss an important variation on the problem of learning from examples, called approximating from examples, in which we do not assume that the target concept T is a member of the concept class C from which approximations are chosen. This problem takes on particular interest when the VC dimension of C is infinite. Finally, we discuss the problem of computing the VC dimension of a finite concept set defined on a finite domain and consider the structure of classes of a fixed small dimension.
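The last problem mentioned, computing the VC dimension of a finite concept set on a finite domain, is easy to make concrete. The sketch below is an illustrative brute-force check, not the algorithm studied in the paper; it marks a set S as shattered when the class induces all 2^|S| labelings of S, and the function and variable names are assumptions.

    from itertools import combinations

    def shatters(concepts, subset):
        # The class shatters `subset` iff intersecting the concepts with it
        # produces every one of the 2^|subset| possible labelings.
        labelings = {frozenset(c & subset) for c in concepts}
        return len(labelings) == 2 ** len(subset)

    def vc_dimension(domain, concepts):
        # Brute force over subset sizes; exponential in |domain|, consistent
        # with the difficulty of the problem discussed in the abstract.
        concepts = [frozenset(c) for c in concepts]
        d = 0
        for k in range(1, len(domain) + 1):
            if any(shatters(concepts, frozenset(s)) for s in combinations(domain, k)):
                d = k  # some k-element subset is shattered
            else:
                break  # every subset of a shattered set is shattered, so stop
        return d

    # Example: threshold concepts {x : x < t} on {0,...,4} have VC dimension 1.
    domain = list(range(5))
    thresholds = [{x for x in domain if x < t} for t in range(6)]
    print(vc_dimension(domain, thresholds))  # -> 1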


Citations
Journal Article

Structural risk minimization over data-dependent hierarchies

TL;DR: A result is presented that allows one to trade off errors on the training sample against improved generalization performance, together with a more general result in terms of "luckiness" functions, which provide a quite general way to exploit serendipitous simplicity in observed data and obtain better prediction accuracy from small training sets.
Proceedings Article

Some PAC-Bayesian theorems

TL;DR: The PAC-Bayesian theorems given here apply to an arbitrary prior measure on an arbitrary concept space and provide an alternative to the use of VC dimension in proving PAC bounds for parameterized concepts.
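For orientation, a representative PAC-Bayesian bound in the style this summary describes (one common modern form; constants and logarithmic factors vary across statements) reads: with probability at least 1 - \delta over an i.i.d. sample of size m, simultaneously for every "posterior" Q over concepts,

    \mathrm{err}(Q) \;\le\; \widehat{\mathrm{err}}(Q)
      \;+\; \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\!\big(2\sqrt{m}/\delta\big)}{2m}} ,

where P is the arbitrary prior the summary refers to and KL is the Kullback-Leibler divergence. The VC dimension appears nowhere; the prior plays its role.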
Journal Article

Problems and results in extremal combinatorics, II

TL;DR: This paper contains a collection of problems and results in the area, including solutions or partial solutions to open problems suggested by various researchers, and is a sequel to a previous paper of the same flavor.
Journal Article

PAC-Bayesian Stochastic Model Selection

TL;DR: A PAC-Bayesian performance guarantee for stochastic model selection is given that is superior to analogous guarantees for deterministic model selection, and it is shown that the posterior optimizing the performance guarantee is a Gibbs distribution.
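The Gibbs form mentioned here is, schematically (notation assumed, not taken from the paper), an exponential reweighting of the prior by empirical loss:

    Q^\ast(h) \;\propto\; P(h)\, e^{-\lambda\, \hat{L}(h)} ,

where \hat{L}(h) is the empirical loss of concept h and \lambda > 0 trades off prior mass against observed fit.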
Proceedings Article

Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers

TL;DR: The results show that for two general kinds of concept class the V-C dimension is polynomially bounded in the number of real numbers used to define a problem instance, and that in the continuous case, as in the discrete, the real barrier to efficient learning in the Occam sense is complexity-theoretic and not information-theoretic.
References
Journal Article

Modeling by shortest data description

Jorma Rissanen
01 Sep 1978
TL;DR: The number of digits it takes to write down an observed sequence x1, ..., xN of a time series depends on the model, together with its parameters, that one assumes to have generated the observed data.
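In modern notation, the minimum description length principle this abstract introduces is usually summarized by the two-part code length (a standard formulation, not a quotation from the paper): choose the model minimizing

    \hat{M} \;=\; \arg\min_{M \in \mathcal{M}}\; \Big[ L(x_1, \dots, x_N \mid M) + L(M) \Big] ,

where L(\cdot) counts the digits (bits) needed to encode the data given the model and to encode the model itself.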
Proceedings Article

A theory of the learnable

TL;DR: This paper regards learning as the phenomenon of knowledge acquisition in the absence of explicit programming, and gives a precise methodology for studying this phenomenon from a computational viewpoint.
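In the now-standard notation (a paraphrase of Valiant's criterion, not the paper's own wording): a class is learnable if, for every target concept c, distribution D, and parameters \varepsilon, \delta, a learner seeing polynomially many examples outputs, with probability at least 1 - \delta, a hypothesis h satisfying

    \Pr_{x \sim D}\big[\, h(x) \neq c(x) \,\big] \;\le\; \varepsilon .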
Book Chapter

On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities

TL;DR: This chapter reproduces the English translation by B. Seckler of the paper by Vapnik and Chervonenkis in which they gave proofs for the innovative results they had obtained in a draft form in July 1966 and announced in 1968 in their note in Soviet Mathematics Doklady.
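The central result of that paper is a uniform convergence bound; in one standard rendering (constants as usually quoted, worth checking against the translation),

    \Pr\Big[\, \sup_{A \in \mathcal{S}} \big|\, \nu_A^{(l)} - P(A) \,\big| > \varepsilon \,\Big]
      \;\le\; 4\, m^{\mathcal{S}}(2l)\, e^{-\varepsilon^2 l / 8} ,

where \nu_A^{(l)} is the relative frequency of event A over l trials and m^{\mathcal{S}} is the growth function of the event class \mathcal{S}.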
Journal Article

Learnability and the Vapnik-Chervonenkis dimension

TL;DR: This paper shows that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned.
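Quantitatively, finiteness of the VC dimension d yields the familiar sample-size bound for learning to accuracy \varepsilon with confidence 1 - \delta (stated here up to constants, in the form usually quoted for consistent learners):

    m(\varepsilon, \delta) \;=\; O\!\left( \frac{1}{\varepsilon}\left( d \log \frac{1}{\varepsilon} + \log \frac{1}{\delta} \right) \right) .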
Journal Article

Learning Decision Lists

TL;DR: This paper introduces a new representation for Boolean functions, called decision lists, shows that they are efficiently learnable from examples, and thereby strictly increases the set of functions known to be polynomially learnable in the sense of Valiant (1984).
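For concreteness, a decision list in this sense is an ordered sequence of (term, bit) rules evaluated top to bottom, with the first satisfied term fixing the output. A minimal evaluation sketch follows; the representation and names are assumptions, not Rivest's notation.

    def eval_decision_list(rules, default, x):
        # Evaluate a decision list: the first satisfied term decides the output.
        # `rules` is a sequence of (term, bit) pairs, where a term is a tuple of
        # literals (index, wanted_value) that must all hold on input x.
        for term, bit in rules:
            if all(x[i] == v for i, v in term):
                return bit
        return default

    # Example: output 1 if x0=1 and x1=0, else 1 if x2=1, otherwise 0.
    rules = [(((0, 1), (1, 0)), 1), (((2, 1),), 1)]
    print(eval_decision_list(rules, 0, (1, 0, 0)))  # -> 1
    print(eval_decision_list(rules, 0, (0, 0, 1)))  # -> 1
    print(eval_decision_list(rules, 0, (0, 0, 0)))  # -> 0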