Learnability and the Vapnik-Chervonenkis dimension
Citations
...Let us recall some basic facts about ε-nets; see, for example, the papers [12], [4], and [7], or the books [2], [16], and [18]....
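The ε-net fact the excerpt alludes to (from Haussler and Welzl, sharpened by Blumer et al.) says that for a range space of bounded VC dimension, a small random sample hits every "heavy" range. A minimal empirical sketch for intervals on [0, 1), whose VC dimension is 2 (the interval endpoints and sample sizes here are illustrative choices, not values from the excerpt):

```python
import random

random.seed(0)

# Ranges: intervals [lo, hi) in [0, 1).  An eps-net is a point set that
# hits every interval of measure >= eps.  By the Haussler-Welzl bound,
# O((d/eps) * log(1/eps)) uniform samples suffice with high probability;
# we take a generous fixed sample instead of the exact constant.
eps = 0.1
sample = [random.random() for _ in range(200)]

# A few heavy intervals (each of measure exactly eps) to spot-check.
heavy = [(a / 100.0, a / 100.0 + eps) for a in range(0, 90, 7)]

assert all(any(lo <= x < hi for x in sample) for (lo, hi) in heavy)
print("sample of size", len(sample), "hits all", len(heavy), "heavy intervals")
```

With 200 uniform points, the chance that any fixed interval of measure 0.1 is missed is 0.9^200, so the check passes overwhelmingly often; the fixed seed makes this run deterministic.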
[...]
...Later, Blumer et al. [4] showed that Haussler and Welzl’s upper bound could be lowered....
[...]
Additional excerpts
...The third line of the table, included for comparison, is simply a standard sequential algorithm for learning a halfspace based on polynomial-time linear programming executed on one processor (Blumer et al., 1989; Karmarkar, 1984)....
[...]
...
Algorithm                                Processors      Parallel time
naive parallelization of Perceptron      poly(n, 1/γ)    Õ(1/γ²) + O(log n)
naive parallelization of [27]            poly(n, 1/γ)    Õ(1/γ²) + O(log n)
polynomial-time linear programming [2]   1               poly(n, log(1/γ))
This paper                               poly(n, 1/γ)    Õ(1/γ) + O(log n)
...
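The Õ(1/γ²) terms in the first two rows come from the classical Perceptron mistake bound: on data separable with margin γ, Perceptron makes at most O(1/γ²) updates. A minimal sequential sketch on toy data (the data generation and margin threshold are illustrative assumptions, not from the excerpt):

```python
import random

random.seed(1)

def perceptron(samples):
    """Run Perceptron to convergence on linearly separable samples.

    samples: list of (x, y) with x a tuple of floats and y in {-1, +1}.
    Returns the final weight vector and the number of updates, which
    the classical bound caps at O(1/gamma^2) for margin gamma.
    """
    w = [0.0] * len(samples[0][0])
    mistakes = 0
    changed = True
    while changed:
        changed = False
        for x, y in samples:
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
                mistakes += 1
                changed = True
    return w, mistakes

# Toy data separable by w* = (1, 1): label = sign(x1 + x2),
# keeping only points at distance > 0.2 from the separator sum.
data = []
while len(data) < 50:
    x = (random.uniform(-1, 1), random.uniform(-1, 1))
    s = x[0] + x[1]
    if abs(s) > 0.2:
        data.append((x, 1 if s > 0 else -1))

w, m = perceptron(data)
assert all(y * (w[0] * x[0] + w[1] * x[1]) > 0 for x, y in data)
print("converged after", m, "updates")
```

The table's contribution row improves the parallel time dependence from Õ(1/γ²) to Õ(1/γ); the sketch above only illustrates the sequential baseline being parallelized.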
[...]
...The Vapnik-Chervonenkis dimension (or VC dimension) is introduced to measure the capacity of neural networks [35]....
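The VC dimension is the size of the largest point set on which a hypothesis class realizes all possible labelings ("shatters" the set). A brute-force shattering check for the toy class of 1-D threshold functions h_t(x) = [x ≥ t], whose VC dimension is 1 (the domain and threshold grid are illustrative choices):

```python
from itertools import combinations

def labelings(points, thresholds):
    """Distinct binary labelings of `points` realized by the thresholds."""
    return {tuple(int(x >= t) for x in points) for t in thresholds}

def vc_dim(domain, thresholds):
    """Largest k such that some k-subset of `domain` is shattered,
    i.e. all 2^k labelings are realized."""
    d = 0
    for k in range(1, len(domain) + 1):
        if any(len(labelings(s, thresholds)) == 2 ** k
               for s in combinations(domain, k)):
            d = k
    return d

domain = [1, 2, 3, 4]
thresholds = [0.5, 1.5, 2.5, 3.5, 4.5]
print(vc_dim(domain, thresholds))  # → 1
```

Any single point is shattered (a threshold below or above it gives both labels), but no pair is: a threshold can never label the smaller point 1 and the larger point 0, so the dimension is exactly 1.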
[...]