Learnability and the Vapnik-Chervonenkis dimension
Citations
23 citations
Cites background from "Learnability and the Vapnik-Chervon..."
...Such an area has found many applications in computational learning (Blumer et al. 1989), and its central notion of shattered sets is a widely studied topic in extremal set theory (Jukna 2010)....
[...]
Cites background or methods or result from "Learnability and the Vapnik-Chervon..."
...The chopping procedure described above suggests that the use of threshold decision lists is fairly natural, if an iterative approach is to be taken to pattern classification....
[...]
...Following a form of the PAC model of computational learning theory (see Anthony and Biggs, 1992; Vapnik, 1998; Blumer et al., 1989), we assume that labeled data points (x, b) (where x ∈ R^n and b ∈ {0,1}) have been generated randomly (perhaps from some larger corpus of data) according to a fixed…...
[...]
...For similar results, see Vapnik and Chervonenkis (1971), Blumer et al. (1989), and Anthony and Bartlett (1999). Then, for m ≥ 8/ε, P^m(Q) ≤ 2 P^{2m}(T)....
[...]
...The key probability results we employ are the following bounds, due respectively to Vapnik and Chervonenkis (1971) and Blumer et al. (1989) (see also Anthony and Bartlett, 1999): for any ε ∈ (0,1), P^m({s ∈ Z^m : there exists f ∈ H, er_P(f) ≥ er_s(f) + ε}) < 4 Π_H(2m) e^{−mε²/8}, and, for m ≥ 8/ε, P^m({s…...
[...]
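The probability bound quoted in that excerpt, combined with the Sauer–Shelah bound Π_H(m) ≤ (em/d)^d for a hypothesis class of VC-dimension d ≤ m, yields an explicit sample-size estimate. A minimal sketch (the function names and the doubling search are illustrative, not from the paper):

```python
import math

def growth_bound(m, d):
    # Sauer-Shelah upper bound on the growth function:
    # Pi_H(m) <= (e*m/d)**d when m >= d, and trivially 2**m otherwise.
    return (math.e * m / d) ** d if m >= d else 2.0 ** m

def vc_bound(m, d, eps):
    # The excerpted bound: P^m(some f in H has er_P(f) >= er_s(f) + eps)
    # is less than 4 * Pi_H(2m) * exp(-m * eps^2 / 8).
    return 4 * growth_bound(2 * m, d) * math.exp(-m * eps * eps / 8)

def sample_size(d, eps, delta):
    # Smallest m found by a doubling search for which the bound drops
    # below delta; the excerpt's results also assume m >= 8/eps.
    m = max(d, math.ceil(8 / eps))
    while vc_bound(m, d, eps) >= delta:
        m *= 2
    return m
```

Because the exponential term e^{−mε²/8} eventually dominates the polynomial growth function, the doubling search always terminates, and the returned m guarantees uniform deviation below ε with probability at least 1 − δ.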
...Lower bounds on the VC-dimension would provide worst-case lower bounds on generalization error (see Ehrenfeucht et al., 1989; Anthony and Biggs, 1992; Anthony and Bartlett, 1999; Blumer et al., 1989)....
[...]