Open Access Book

An Introduction to Computational Learning Theory

TLDR
The book covers the probably approximately correct (PAC) learning model, Occam's razor, the Vapnik-Chervonenkis dimension, weak and strong learning, learning in the presence of noise, inherent unpredictability, reducibility in PAC learning, and learning finite automata.
Abstract
The probably approximately correct learning model; Occam's razor; the Vapnik-Chervonenkis dimension; weak and strong learning; learning in the presence of noise; inherent unpredictability; reducibility in PAC learning; learning finite automata by experimentation; appendix: some tools for probabilistic analysis.
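The PAC model listed above comes with a standard sample-complexity bound for finite hypothesis classes: a learner that outputs a hypothesis consistent with m ≥ (1/ε)(ln|H| + ln(1/δ)) examples is, with probability at least 1 − δ, within error ε of the target. A minimal sketch of that bound in Python; the monotone-conjunction example is illustrative, not taken from the abstract:

```python
import math

def pac_sample_size(hypothesis_count, epsilon, delta):
    """Samples sufficient for a consistent learner over a finite
    hypothesis class H to be probably approximately correct:
    m >= (1/epsilon) * (ln|H| + ln(1/delta))."""
    return math.ceil((math.log(hypothesis_count) + math.log(1.0 / delta)) / epsilon)

# Illustrative: conjunctions over n = 10 Boolean variables give |H| = 2**10,
# so error <= 0.1 with confidence 0.95 needs this many examples:
print(pac_sample_size(2**10, epsilon=0.1, delta=0.05))
```

Note the logarithmic dependence on |H| and 1/δ: doubling the hypothesis class adds only ln 2 / ε examples.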


Citations
Book Chapter

Convergence Theorems of Estimation of Distribution Algorithms

TL;DR: It is shown that a good approximation of the true distribution is not necessary; it suffices to use a factorization in which the global optima have a large enough probability. This explains the success of EDAs that use Bayesian networks in practical applications.
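For context on the loop an estimation of distribution algorithm (EDA) runs, here is an illustrative univariate sketch (UMDA on the OneMax function), not the paper's Bayesian-network factorization; the function name and parameters are assumptions for this example:

```python
import random

def umda_onemax(n=20, pop_size=100, select=50, generations=60, seed=0):
    """Minimal univariate EDA (UMDA) maximizing OneMax (count of 1-bits):
    each generation, sample a population from a product distribution,
    keep the best half, and re-estimate the per-bit marginals from it."""
    rng = random.Random(seed)
    p = [0.5] * n  # per-bit marginal probabilities
    best = 0
    for _ in range(generations):
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n)]
               for _ in range(pop_size)]
        pop.sort(key=sum, reverse=True)
        elite = pop[:select]
        best = max(best, sum(pop[0]))
        # Re-estimate marginals, clamped away from 0/1 to retain diversity.
        p = [min(0.95, max(0.05, sum(ind[i] for ind in elite) / select))
             for i in range(n)]
    return best

print(umda_onemax())
```

The product distribution here is the simplest possible factorization; the cited result concerns when such factorizations still place enough probability on the global optima.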
Book Chapter

A Novel Learning Algorithm for Büchi Automata Based on Family of DFAs and Classification Trees

TL;DR: In this paper, a classification tree structure, instead of the standard observation table structure, is used to learn a Büchi automaton from a teacher who knows an ω-regular language.
Journal Article

On-line learning in parity machines

TL;DR: A set of recursion relations for the relevant probability distributions is introduced, permitting study of the general K case; the generalization error curve is determined and shown to decay to zero for large training sets, even in the presence of noise.
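For context, a parity machine is a committee machine whose output is the product of the signs of K hidden perceptron fields. A minimal sketch of that output rule (names and shapes are illustrative assumptions, not from the paper):

```python
import numpy as np

def parity_machine_output(weights, x):
    """Output of a K-unit parity machine on input x: the product of the
    signs of the K hidden fields w_k . x (weights has shape (K, N)).
    Returns +1 or -1 (or 0 if an input lies exactly on a hyperplane)."""
    fields = weights @ x
    return int(np.prod(np.sign(fields)))

# Illustrative K = 2, N = 2 case: fields are (+1, -1), so the output is -1.
w = np.array([[1.0, 0.0],
              [0.0, 1.0]])
print(parity_machine_output(w, np.array([1.0, -1.0])))
```

Because only the product of signs is visible, flipping any even number of hidden units leaves the output unchanged, which is what makes the generalization behavior of the general K case nontrivial.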
Book Chapter

Generalized Graph Colorability and Compressibility of Boolean Formulae

TL;DR: It is proved that approximating the minimally consistent DNF formula, and a generalization of graph colorability, is very hard; the proof technique is such that the stronger the complexity hypothesis used, the larger the inapproximability ratio obtained.