Learnability and the Vapnik-Chervonenkis dimension
Citations
9 citations
Cites result from "Learnability and the Vapnik-Chervon..."
...By the VC dimension Theorem [7], for S of size poly(1/ε, 1/δ, VCD(Hₙ)), any function f ∈ Hₙ satisfies, with probability 1 − δ, ∣ Pr_{(x,y)∼D_{X×{0,1}}}[f(x) = y] − Pr_{(x,y)∼U_S}[f(x) = y] ∣ ≤ ε/2....
[...]
...If R ⊂ A, then by the VC-dimension Theorem [7], with probability at least 1/2, a hypothesis h consistent with R agrees with (1 − ε) of the points of A....
[...]
Cites background from "Learnability and the Vapnik-Chervon..."
...However, one can use the PAC model to approximate an equivalence query without formally specifying the hypothesis h. Blumer, Ehrenfeucht, Haussler and Warmuth [27] proved that any hypothesis consistent with a labeled random sample of size (1/ε)(log(1/δ) + log|C|) is a PAC-hypothesis; i.e., with probability at least 1 − δ, the hypothesis is ε-good....
[...]
...Blumer, Ehrenfeucht, Haussler and Warmuth [28] have shown that a sample of size polynomial in the VC-dimension of the hypothesis class is sufficient for PAC-learning, so the number of examples needed is polynomial in n. [...] that query is "positive", then mark ~x neg....
[...]
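The two bounds quoted in the snippets above can be turned into concrete numbers. The following sketch computes a sufficient sample size for the finite-class (Occam) bound of [27] and one common textbook form of the VC-dimension bound of [28]; the exact constants and logarithm bases vary between presentations and are assumptions here, not the paper's verbatim statement.

```python
from math import ceil, log, log2

def occam_sample_size(epsilon: float, delta: float, class_size: int) -> int:
    """Finite class C: any hypothesis consistent with this many labeled
    examples is epsilon-good with probability at least 1 - delta."""
    return ceil((1 / epsilon) * (log(class_size) + log(1 / delta)))

def vc_sample_size(epsilon: float, delta: float, d: int) -> int:
    """One standard form of the Blumer et al. VC bound for a class of
    VC-dimension d (constants differ across presentations)."""
    return ceil(max((4 / epsilon) * log2(2 / delta),
                    (8 * d / epsilon) * log2(13 / epsilon)))
```

Both functions grow only polynomially in 1/ε, 1/δ, and the complexity measure (log|C| or d), which is exactly why the citing thesis can conclude that the number of examples needed is polynomial in n.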
...[57] David Haussler, Michael Kearns, Nick Littlestone, and Manfred K. Warmuth....
[...]
...[58] David Haussler, Nick Littlestone, and Manfred K. Warmuth....
[...]
...[86] Leonard Pitt and Manfred K. Warmuth....
[...]
Cites background from "Learnability and the Vapnik-Chervon..."
...Blumer et al. (1989) proved that there exists a class that cannot be efficiently learned by SQ, but is actually efficiently learnable....
[...]