Open Access Book

An Introduction to Computational Learning Theory

Abstract
The probably approximately correct (PAC) learning model; Occam's razor; the Vapnik-Chervonenkis dimension; weak and strong learning; learning in the presence of noise; inherent unpredictability; reducibility in PAC learning; learning finite automata by experimentation; appendix: some tools for probabilistic analysis.
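To give a flavor of the PAC model the book develops (this is a standard textbook bound, not quoted from the book itself): for a finite hypothesis class H, a learner that outputs any hypothesis consistent with its sample achieves error at most ε with probability at least 1 − δ once the sample size m satisfies

```latex
m \;\geq\; \frac{1}{\varepsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right)
```

For infinite classes, the Vapnik-Chervonenkis dimension plays the role of ln|H| in bounds of this form.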


Citations
Book Chapter

Traffic Sign Detection

TL;DR: This chapter introduces an evolutionary approach to feature selection that allows building detectors from feature sets of large cardinality, and reviews the basic concepts of the machine learning framework along with some bio-inspired features.
Journal Article

Query-Efficient Algorithms for Polynomial Interpolation over Composites

TL;DR: The interpolation algorithm is used to design algorithms for zero-testing and for distributional learning of polynomials over Z_m.
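As background for why composite moduli are delicate (an illustration of the general phenomenon, not code from the cited paper): over Z_m with m composite, a nonzero polynomial can evaluate to zero at every point, so naive evaluation-based zero-testing fails. A minimal sketch:

```python
# Illustration: p(x) = x^3 - x is a nonzero polynomial, yet it
# evaluates to 0 at every point of Z_6. Over a field this cannot
# happen for a nonzero polynomial of degree below the field size.
def vanishes_everywhere(coeffs, m):
    """Return True if the polynomial with coefficients `coeffs`
    (low degree first) evaluates to 0 at every point of Z_m."""
    def eval_poly(x):
        return sum(c * pow(x, i, m) for i, c in enumerate(coeffs)) % m
    return all(eval_poly(x) == 0 for x in range(m))

# x^3 - x has coefficients [0, -1, 0, 1]
print(vanishes_everywhere([0, -1, 0, 1], 6))  # True over Z_6
print(vanishes_everywhere([0, -1, 0, 1], 5))  # False over the field Z_5
```

This is the obstacle that makes query-efficient interpolation and zero-testing over composites a nontrivial algorithmic question.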
Book Chapter

Data Complexity Issues in Grammatical Inference

TL;DR: It is argued that there are three levels at which data complexity for grammatical inference can be studied, and that the main difficulties arise from the fact that the structural definitions of the languages and the topological measures do not match.
Posted Content

Bounding the Fat Shattering Dimension of a Composition Function Class Built Using a Continuous Logic Connective

TL;DR: Sauer's Lemma, which involves the VC dimension, is explained and used to prove that a concept class is distribution-free PAC learnable if and only if it has finite VC dimension; the work then studies the construction of a new function class from a collection of function classes via a continuous logic connective, bounding its fat-shattering dimension.
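For context (a sketch, not code from the cited work): Sauer's Lemma bounds the number of distinct labelings a class of VC dimension d can realize on n points by Phi_d(n) = sum of C(n, i) for i = 0..d, which is polynomial in n rather than 2^n. That combinatorial bound is what drives the PAC-learnability/finite-VC equivalence mentioned above:

```python
from math import comb

def sauer_bound(d, n):
    """Sauer-Shelah bound Phi_d(n) = sum_{i=0}^{d} C(n, i): the maximum
    number of distinct labelings on n points realizable by a concept
    class of VC dimension d."""
    return sum(comb(n, i) for i in range(d + 1))

# For d = 3 the bound grows like O(n^3), far below the 2^n possible labelings:
print(sauer_bound(3, 10))   # 176, versus 2**10 = 1024 labelings in total
```

When d equals n the bound recovers all 2^n labelings, consistent with a class that shatters the whole sample.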