Learnability and the Vapnik-Chervonenkis dimension
Citations
16 citations
Cites methods from "Learnability and the Vapnik-Chervonenkis dimension"
...We adopt Valiant's formalization of this intuitive notion into what is known as the Probably Approximately Correct (PAC), or Distribution Free, model of learning (Valiant, 1984; Blumer et al., 1989)....
[...]
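For reference, the PAC criterion alluded to in the excerpt above can be written out explicitly (standard notation, assumed here rather than quoted from the excerpt): an algorithm that returns a hypothesis $h_S$ from a sample $S$ of $m$ examples drawn i.i.d. from a distribution $P$ PAC-learns a class $H$ if, for every $P$ and every $\epsilon, \delta \in (0, 1)$,

$$\Pr_{S \sim P^m}\big[\operatorname{error}_P(h_S) \le \epsilon\big] \ge 1 - \delta,$$

with $m$ bounded by a polynomial in $1/\epsilon$ and $1/\delta$.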
Cites background from "Learnability and the Vapnik-Chervonenkis dimension"
...PROOF: Since H is PAC-learnable, it must necessarily have a finite VC-dimension [24]....
[...]
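The step used in this proof is one direction of the characterization established in Blumer et al. (1989); in standard form (notation assumed here, and modulo the measurability conditions imposed in the paper):

$$H \text{ is PAC-learnable} \iff \operatorname{VCdim}(H) < \infty,$$

and when $\operatorname{VCdim}(H) = d$ is finite, a sample of size $O\!\left(\frac{1}{\epsilon}\left(d \log \frac{1}{\epsilon} + \log \frac{1}{\delta}\right)\right)$ suffices.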
Cites background from "Learnability and the Vapnik-Chervonenkis dimension"
...According to the result in (Blumer et al. 1989) for learning half-spaces separated by a hyperplane, there exists a learning algorithm which satisfies the following conditions for every distribution $\tilde{P}$ on $[0, \infty)^n$ and every $\epsilon$ and $\delta$ in the range $(0, 1)$: 1. ...
[...]
...Since the VC dimension of this problem is $n + 1$, according to Theorem 2.1 in (Blumer et al. 1989), the number of required points $N$ is at most $\max\!\left(\frac{4}{\epsilon}\log_2\frac{2}{\delta},\; \frac{8(n+1)}{\epsilon}\log_2\frac{13}{\epsilon}\right)$, and any algorithm which produces consistent values of $w$ and $d$ with the following constraints: for every $x \in X^+$, $w \cdot x < d$, ...
[...]
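To make the bound above concrete, the following sketch (Python; the function names and the negative-side constraint are illustrative assumptions, not code from the cited work) evaluates the Theorem 2.1 sample-size bound for VC dimension $n + 1$ and checks the excerpt's consistency constraint on a candidate pair $(w, d)$:

    import math

    def sample_size_bound(n, eps, delta):
        """Theorem 2.1 (Blumer et al., 1989) sample-size bound for a class
        of VC dimension n + 1 (half-spaces in R^n with a threshold)."""
        term1 = (4.0 / eps) * math.log2(2.0 / delta)
        term2 = (8.0 * (n + 1) / eps) * math.log2(13.0 / eps)
        return math.ceil(max(term1, term2))

    def is_consistent(w, d, positives, negatives):
        """Check the excerpt's constraint w . x < d for every positive x;
        the negative-side constraint w . x >= d is assumed by symmetry,
        since the excerpt truncates before stating it."""
        dot = lambda w, x: sum(wi * xi for wi, xi in zip(w, x))
        return (all(dot(w, x) < d for x in positives) and
                all(dot(w, x) >= d for x in negatives))

    # Example: half-spaces in R^5, accuracy eps = 0.1, confidence delta = 0.05.
    print(sample_size_bound(5, 0.1, 0.05))  # a few thousand examples suffice
    print(is_consistent([1.0, -1.0], 0.5, [(0.2, 0.1)], [(1.0, 0.2)]))  # True

The printed bound illustrates the polynomial dependence on $1/\epsilon$, $1/\delta$, and the dimension $n$ that the PAC guarantee requires.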
Cites background from "Learnability and the Vapnik-Chervonenkis dimension"
...Document pre-processing: In general, the first step in understanding a document is to segment it into blocks that can be said to be atomic, i.e. to represent one distinct logical entity in the document’s structure....
[...]