Learnability and the Vapnik-Chervonenkis dimension
Citations
80 citations
Cites background from "Learnability and the Vapnik-Chervon..."
...The existence of the latter would mean that the combinatorial properties that determine the information complexity of PAC-learning (i.e., of learning from randomly drawn examples) are essentially the same as those that determine the information complexity of teaching (i.e., of learning from helpfully selected examples), at least when using the recursive teaching model....
[...]
...For example, in PAC-learning (Valiant, 1984), the information complexity of a concept class C is the worst-case sample complexity a best possible PAC learner for C can achieve on all concepts in C....
[...]
...In this way, sample bounds for PAC-learning of a class C can be obtained from the size of a smallest sample compression scheme for C (Littlestone and Warmuth, 1996; Floyd and Warmuth, 1995)....
[...]
...Among many relevant properties, it provides bounds on the sample complexity of PAC-learning (Blumer et al., 1989)....
[...]
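To make concrete what the excerpts above mean by the VC dimension bounding sample complexity, the following sketch computes the VC dimension of a toy hypothesis class by brute force: it searches for the largest point set the class shatters. The class of one-sided thresholds h_t(x) = [x >= t] is chosen purely as an illustrative example; the helper names are not from the cited works.

```python
from itertools import combinations

def shatters(hypotheses, points):
    """True if the class realizes all 2^|points| labelings of the points."""
    realized = {tuple(h(x) for x in points) for h in hypotheses}
    return len(realized) == 2 ** len(points)

def vc_dimension(hypotheses, domain, max_d=5):
    """Largest d such that some d-point subset of the domain is shattered."""
    d = 0
    for k in range(1, max_d + 1):
        if any(shatters(hypotheses, pts) for pts in combinations(domain, k)):
            d = k
    return d

# One-sided thresholds h_t(x) = [x >= t] on a small discrete domain.
domain = list(range(6))
thresholds = [lambda x, t=t: int(x >= t) for t in range(7)]

print(vc_dimension(thresholds, domain))  # → 1
```

Thresholds shatter any single point but no pair (the labeling "left point positive, right point negative" is unrealizable), so the VC dimension is 1; by the bounds of Blumer et al. (1989) referenced above, PAC sample complexity grows linearly in this quantity.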
Cites background from "Learnability and the Vapnik-Chervon..."
...Observe, however, that in some respects parameterized complexity appears to be, in a sense, ‘orthogonal’ to classical complexity: For instance, the so-called problem of computing the V-C dimension from learning theory [13,77], which is not known (and not believed) to be NP-hard, is W[1]-complete [30,31]....
[...]