Open Access Book

An Introduction to Computational Learning Theory

Abstract
The probably approximately correct (PAC) learning model; Occam's razor; the Vapnik-Chervonenkis dimension; weak and strong learning; learning in the presence of noise; inherent unpredictability; reducibility in PAC learning; learning finite automata by experimentation; appendix: some tools for probabilistic analysis.
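The PAC model's standard opening example is learning an axis-aligned rectangle by taking the tightest fit around the positively labeled points. A minimal sketch, where the target rectangle, sample size, and random seed are arbitrary choices for illustration:

```python
import random

def tightest_rectangle(samples):
    """Smallest axis-aligned rectangle containing every positively
    labeled point; None if there are no positives."""
    pos = [p for p, label in samples if label]
    if not pos:
        return None
    xs = [x for x, _ in pos]
    ys = [y for _, y in pos]
    return (min(xs), max(xs), min(ys), max(ys))

def inside(rect, point):
    if rect is None:
        return False
    x1, x2, y1, y2 = rect
    x, y = point
    return x1 <= x <= x2 and y1 <= y <= y2

# Hypothetical target concept: the unit square inside the domain [0, 2]^2.
target = (0.0, 1.0, 0.0, 1.0)
random.seed(0)
points = [(random.uniform(0, 2), random.uniform(0, 2)) for _ in range(500)]
labeled = [(p, inside(target, p)) for p in points]

h = tightest_rectangle(labeled)
# The hypothesis sits inside the target, so it is consistent:
# it mislabels no training point.
errors = sum(inside(h, p) != label for p, label in labeled)
```

Because the learned rectangle is contained in the target, its error region is the thin band between the two rectangles, whose probability mass shrinks as the sample grows; this is the intuition the PAC definitions make precise.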


Citations
Book Chapter

AveBoost2: Boosting for noisy data

TL;DR: An algorithm is developed that first constructs a distribution the same way as AdaBoost, but then averages it with the previous base models' distributions to create the next base model's distribution; its experimental performance improvement over AdaBoost is even greater on noisy data than on the original data.
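The idea summarized above can be sketched as follows. This is a minimal illustration of the averaging intuition, not the published AveBoost2 algorithm: the paper's exact reweighting and averaging scheme may differ (for instance, it may maintain a weighted running average rather than a plain mean of two distributions).

```python
def adaboost_style_update(dist, mistakes, epsilon):
    """One AdaBoost-style reweighting: keep the weight of examples the
    current base model got wrong, shrink the rest by beta < 1, then
    renormalize.  `mistakes` is a boolean per example; `epsilon` is the
    base model's weighted error (assumed < 0.5)."""
    beta = epsilon / (1.0 - epsilon)
    new = [w * (1.0 if wrong else beta) for w, wrong in zip(dist, mistakes)]
    total = sum(new)
    return [w / total for w in new]

def averaged_update(dist, mistakes, epsilon):
    """Damped variant in the spirit of AveBoost2's summary above:
    compute the AdaBoost-style distribution, then average it with the
    previous distribution, softening the aggressive reweighting that
    tends to chase noisy examples."""
    boosted = adaboost_style_update(dist, mistakes, epsilon)
    averaged = [(a + b) / 2.0 for a, b in zip(dist, boosted)]
    total = sum(averaged)
    return [w / total for w in averaged]

# Four examples, uniform weights; the base model errs only on example 0.
dist = [0.25, 0.25, 0.25, 0.25]
mistakes = [True, False, False, False]
boosted = adaboost_style_update(dist, mistakes, 0.25)
damped = averaged_update(dist, mistakes, 0.25)
```

After one round, the plain update puts half of all weight on the single mistake, while the averaged update puts noticeably less there, which is the damping effect the abstract credits for the robustness on noisy data.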

Connectionism and the problem of systematicity

TL;DR: It is argued that Connectionism potentially has an explanation for the acquisition of systematic behaviour, an issue the Classical paradigm does not address, since systematicity is built into a Classical architecture.
Proceedings Article

Learning Path Queries on Graph Databases

TL;DR: This paper investigates the problem of learning graph queries from user examples, focusing on path queries defined by regular expressions; it identifies fundamental difficulties of the problem setting, formalizes what it means for such queries to be learnable, and proves that the class of queries under study enjoys this property.

Inference of genetic regulatory networks under the best-fit extension paradigm

TL;DR: This work shows that for many classes of Boolean functions, including the class of all Boolean functions, the problem of inferring the network structure is polynomial-time solvable, implying its practical applicability to real data analysis.
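For the class of all Boolean functions in particular, the best-fit extension reduces to a majority vote: for each distinct input vector, output the label seen most often among the (possibly inconsistent) observations, and count the outvoted observations as unavoidable misfits. A minimal sketch, using a small hypothetical set of observations:

```python
from collections import Counter

def best_fit_all_boolean(examples):
    """Best-fit extension over the class of ALL Boolean functions.
    `examples` is a list of (input_tuple, label) pairs, possibly
    inconsistent.  Returns the function as a dict from input vector
    to majority label, plus the number of misfit observations."""
    votes = {}
    for x, y in examples:
        votes.setdefault(x, Counter())[y] += 1
    f, misfits = {}, 0
    for x, counts in votes.items():
        label, _ = counts.most_common(1)[0]
        f[x] = label
        misfits += sum(counts.values()) - counts[label]
    return f, misfits

# Hypothetical noisy observations of a 2-input gene regulation function:
# input (0, 0) was observed as 1 twice and as 0 once.
obs = [((0, 0), 1), ((0, 0), 1), ((0, 0), 0), ((1, 1), 0)]
f, misfits = best_fit_all_boolean(obs)
```

One pass over the data and one vote per distinct input vector make the polynomial running time evident for this class; the paper's contribution covers restricted function classes as well, where the argument is less immediate.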
Journal Article

Relational learning as search in a critical region

TL;DR: The chances of success and the computational cost of relational learning are investigated; both appear to be severely affected by the presence of a phase transition in the covering test.