Open Access Book

An Introduction to Computational Learning Theory

Abstract
The probably approximately correct (PAC) learning model; Occam's razor; the Vapnik-Chervonenkis dimension; weak and strong learning; learning in the presence of noise; inherent unpredictability; reducibility in PAC learning; learning finite automata by experimentation; appendix: some tools for probabilistic analysis.
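The PAC model's standard quantitative tool is the sample-complexity bound for a consistent learner over a finite hypothesis class H: m ≥ (1/ε)(ln|H| + ln(1/δ)) examples suffice to output, with probability at least 1−δ, a hypothesis of error at most ε. A minimal sketch (the class size 3^n + 1 for boolean conjunctions over n variables is illustrative, not taken from this page):

```python
import math

def pac_sample_size(hypothesis_count: int, epsilon: float, delta: float) -> int:
    """Samples sufficient for a consistent learner over a finite class H
    to be (epsilon, delta)-PAC: m >= (1/epsilon) * (ln|H| + ln(1/delta))."""
    return math.ceil((math.log(hypothesis_count) + math.log(1 / delta)) / epsilon)

# e.g. boolean conjunctions over n = 10 variables: |H| = 3**10 + 1
print(pac_sample_size(3**10 + 1, epsilon=0.1, delta=0.05))
```

Note the bound is only logarithmic in |H| and 1/δ, which is why even exponentially large hypothesis classes can be PAC-learnable from polynomially many examples.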


Citations
Journal ArticleDOI

Searching for interacting features in subset selection

TL;DR: This paper takes up the challenge of designing a special data structure for feature quality evaluation, and employs an information-theoretic feature ranking mechanism to handle feature interaction in subset selection efficiently.
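The paper's data structure is not reproduced here, but the information-theoretic criterion that such rankings build on is information gain, I(X;Y) = H(Y) − H(Y|X). A hedged sketch with made-up toy data (function and variable names below are mine):

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy H(Y) of a label sequence, estimated from counts."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_column, labels):
    """I(X; Y) = H(Y) - H(Y | X): how much knowing X reduces label uncertainty."""
    n = len(labels)
    conditional = 0.0
    for value in set(feature_column):
        subset = [y for x, y in zip(feature_column, labels) if x == value]
        conditional += (len(subset) / n) * entropy(subset)
    return entropy(labels) - conditional

# toy example: X1 predicts y perfectly, X2 is independent of y
X1 = [0, 0, 1, 1]
X2 = [0, 1, 0, 1]
y  = [0, 0, 1, 1]
print(information_gain(X1, y), information_gain(X2, y))
```

Ranking features by this score alone is a univariate baseline; the paper's point is precisely that it misses interacting features (e.g. two features that predict y only jointly), which motivates the specialized structure.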
Book ChapterDOI

Toward Attribute Efficient Learning of Decision Lists and Parities

TL;DR: This work considers two well-studied problems in attribute-efficient learning, learning decision lists and learning parity functions, and gives the first polynomial-time algorithm for learning parity on a superconstant number of variables with sublinear sample complexity.
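The attribute-efficient algorithm itself is more involved, but the baseline it improves on is simple: a hidden parity function over noise-free examples can be recovered by Gaussian elimination over GF(2). A minimal sketch (data and names are illustrative, not from the paper):

```python
def learn_parity(samples):
    """Recover the hidden index set of a noise-free parity function by
    Gaussian elimination over GF(2). samples: list of (bit-vector, label)."""
    rows = [list(x) + [y] for x, y in samples]   # augmented matrix [A | b]
    n = len(rows[0]) - 1
    pivot_row = 0
    pivots = []
    for col in range(n):
        for r in range(pivot_row, len(rows)):
            if rows[r][col]:
                rows[pivot_row], rows[r] = rows[r], rows[pivot_row]
                # eliminate this column from every other row (XOR = GF(2) add)
                for rr in range(len(rows)):
                    if rr != pivot_row and rows[rr][col]:
                        rows[rr] = [a ^ b for a, b in zip(rows[rr], rows[pivot_row])]
                pivots.append(col)
                pivot_row += 1
                break
    # free variables default to 0; pivot variables read off the augmented column
    a = [0] * n
    for r, col in enumerate(pivots):
        a[col] = rows[r][n]
    return a

# hidden parity over variables {0, 2}: label = x0 XOR x2
data = [((1, 0, 0), 1), ((0, 1, 0), 0), ((0, 0, 1), 1), ((1, 1, 1), 0)]
print(learn_parity(data))
```

This needs about n examples; the cited work's contribution is doing it attribute-efficiently, with sample complexity sublinear in the total number of variables when only a few are relevant.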
Journal Article

Hardness of Learning Halfspaces with Noise.

TL;DR: In this paper, it was shown that weak proper agnostic learning of halfspaces is NP-hard and that the problem is intractable in the presence of random classification noise.
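For contrast with these hardness results, the noise-free separable case is easy: the classic perceptron algorithm finds a consistent halfspace w·x + b > 0 whenever one exists. A minimal sketch (shown only to fix what "learning a halfspace" means; data and names are illustrative):

```python
def perceptron(samples, epochs=100):
    """Mistake-driven perceptron updates; converges when the samples
    (x, y) with y in {-1, +1} are linearly separable."""
    dim = len(samples[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        updated = False
        for x, y in samples:
            # misclassified (or on the boundary): nudge w, b toward the point
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
                updated = True
        if not updated:          # a full pass with no mistakes: consistent
            break
    return w, b

data = [((0.0, 0.0), -1), ((1.0, 1.0), 1), ((0.2, 0.1), -1), ((0.9, 1.2), 1)]
w, b = perceptron(data)
print(w, b)
```

The hardness result above says that once even random classification noise corrupts the labels, no efficient algorithm can do the analogous job (properly and agnostically) unless P = NP.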
Book ChapterDOI

A theory of inductive query answering

TL;DR: In this article, a decomposition of a Boolean query Q into k sub-queries Q_i = Q_A ∧ Q_M, each the conjunction of a monotonic and an anti-monotonic predicate, is presented.
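A toy illustration of one such conjunct Q_i = Q_A ∧ Q_M over itemsets, where a minimum-frequency constraint is anti-monotonic (once false for a set, it stays false for every superset) and a minimum-size constraint is monotonic (once true, it stays true for every superset). The transactions and thresholds below are made up for illustration:

```python
from itertools import chain, combinations

transactions = [{"a", "b"}, {"a", "b", "c"}, {"a", "c"}, {"b", "c"}]
items = {"a", "b", "c"}

def support(itemset):
    """Number of transactions containing every item of itemset."""
    return sum(itemset <= t for t in transactions)

# anti-monotonic predicate Q_A: frequent (support >= 2)
q_anti = lambda s: support(s) >= 2
# monotonic predicate Q_M: at least 2 items
q_mono = lambda s: len(s) >= 2

# brute-force evaluation of Q = Q_A AND Q_M over all non-empty itemsets
answers = [frozenset(s)
           for s in chain.from_iterable(combinations(sorted(items), k)
                                        for k in range(1, len(items) + 1))
           if q_anti(frozenset(s)) and q_mono(frozenset(s))]
print(sorted(map(sorted, answers)))
```

The point of the decomposition is that each conjunct's answer set is a version space bounded by two borders, so it can be computed by level-wise search instead of the brute-force enumeration used here.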