scispace - formally typeset
Open Access Book

An Introduction to Computational Learning Theory

TLDR
The book covers the probably approximately correct (PAC) learning model, Occam's razor, the Vapnik-Chervonenkis dimension, weak and strong learning, learning in the presence of noise, inherent unpredictability, reducibility in PAC learning, and learning finite automata.
Abstract
The probably approximately correct learning model; Occam's razor; the Vapnik-Chervonenkis dimension; weak and strong learning; learning in the presence of noise; inherent unpredictability; reducibility in PAC learning; learning finite automata by experimentation; appendix: some tools for probabilistic analysis.
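The PAC model listed above can be illustrated with a minimal sketch. All names here are illustrative, and the target threshold, sample size, and seed are hypothetical choices for the demo: a learner sees uniform samples from [0, 1] labeled by an unknown threshold concept and outputs a consistent hypothesis whose error shrinks with the number of samples.

```python
import random

def pac_learn_threshold(samples):
    """Consistent learner for threshold concepts on [0, 1]:
    output the smallest point labeled positive (1.0 if none seen)."""
    positives = [x for x, y in samples if y == 1]
    return min(positives) if positives else 1.0

theta = 0.3                     # hypothetical target threshold for the demo
rng = random.Random(0)          # fixed seed so the run is reproducible
data = [(x, int(x >= theta)) for x in (rng.random() for _ in range(500))]
h = pac_learn_threshold(data)
err = h - theta                 # mass of the error region under Uniform[0, 1]
```

With 500 samples the interval between the target and the hypothesis has small probability mass except with tiny failure probability, which is exactly the "probably approximately correct" guarantee.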


Citations
Journal Article

Breaking the Minsky-Papert Barrier for Constant-Depth Circuits.

TL;DR: The threshold degree of a Boolean function is defined in this article as the minimum degree of a real polynomial that represents the Boolean function in sign.
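The sign-representation in that definition can be checked by brute force on small functions. This is a toy illustration, not the paper's construction; the function names are made up for the demo: a polynomial p sign-represents f: {0,1}^n → {-1,+1} when sign(p(x)) = f(x) everywhere, and the threshold degree of f is the least degree of such a p.

```python
from itertools import product

def sign_represents(p, f, n):
    """True iff p(x) is nonzero and sign(p(x)) == f(x) in {-1,+1}
    for every x in {0,1}^n (the standard sign-representation)."""
    return all(p(x) != 0 and (p(x) > 0) == (f(x) > 0)
               for x in product((0, 1), repeat=n))

AND = lambda x: 1 if all(x) else -1   # 2-bit AND in the +-1 convention
p = lambda x: x[0] + x[1] - 1.5       # a degree-1 sign-representation of AND
const = lambda x: -1.0                # no degree-0 polynomial works for AND
```

Here p witnesses that the threshold degree of 2-bit AND is 1, while any constant polynomial fails at some input.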
Proceedings Article

Learning cooperative games

TL;DR: A novel connection between PAC learnability and core stability is established: for games that are efficiently learnable, it is possible to find payoff divisions that are likely to be stable using a polynomial number of samples.
Journal Article

Constrained Counting and Sampling: Bridging the Gap Between Theory and Practice

TL;DR: This work introduces a hashing-based algorithmic framework for constrained sampling and counting that combines the classical technique of universal hashing with the dramatic progress made in combinatorial reasoning tools, in particular SAT and SMT solvers, over the past two decades.
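The hashing idea behind that framework can be sketched in a drastically simplified, brute-force form (this is not the paper's algorithm, which relies on SAT/SMT oracles; all names and parameters here are assumptions for the demo): each random XOR parity constraint cuts the solution set roughly in half, so once at most `pivot` solutions survive m constraints, the original count is approximately the survivor count times 2^m.

```python
import itertools, random

def approx_count(n, constraint, rng, pivot=8):
    """Toy hashing-based model counter over {0,1}^n (brute force):
    add random XOR parity constraints until at most `pivot` solutions
    survive, then scale the survivor count by 2^m."""
    parities = []  # each entry: (subset of variable indices, target parity bit)
    while True:
        sols = [x for x in itertools.product((0, 1), repeat=n)
                if constraint(x)
                and all(sum(x[i] for i in s) % 2 == b for s, b in parities)]
        if len(sols) <= pivot:
            return len(sols) * 2 ** len(parities)
        subset = [i for i in range(n) if rng.random() < 0.5]
        parities.append((subset, rng.randrange(2)))

# a constraint with exactly 64 satisfying assignments over 8 variables
est = approx_count(8, lambda x: x[0] == 0 and x[1] == 1, random.Random(1))
```

With `pivot` at least the true count, no constraints are needed and the count is exact; with a small pivot the estimate is correct up to a multiplicative factor with high probability.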
Proceedings Article

Kernel methods for learning languages

TL;DR: In this article, the authors study the linear separability of automata and languages by examining the rich class of piecewise-testable languages and prove that all languages linearly separable under a regular finite cover embedding are regular.
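The feature map underlying such separability results can be sketched as follows (a toy illustration, not the paper's kernel; the names are made up for the demo): a piecewise-testable language is a boolean combination of tests of the form "the word contains u as a scattered subsequence", so mapping words to their subsequence sets makes such a language linearly separable, e.g. with weight 1 on a single subsequence feature.

```python
from itertools import combinations

def subseq_features(w, max_len=2):
    """Map a word to the set of its scattered subsequences of length <= max_len."""
    return {"".join(w[i] for i in idxs)
            for k in range(1, max_len + 1)
            for idxs in combinations(range(len(w)), k)}

# L = words containing "ab" as a scattered subsequence: in feature space,
# a separator with weight 1 on the "ab" coordinate and threshold 1/2 suffices
in_L = lambda w: "ab" in subseq_features(w)
```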
Book Chapter

Repairing Decision-Making Programs Under Uncertainty

TL;DR: In this paper, the authors propose distribution-guided inductive synthesis, a repair technique that iteratively samples a finite set of inputs from a probability distribution defining the precondition, synthesizes a minimal repair to the program over the sampled inputs using an SMT-based encoding, and verifies that the resulting program is correct and semantically close to the original program.
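The iterative sample-repair-verify loop can be sketched on a toy threshold program. The spec, the candidate thresholds, the input distribution, and the "closest repair" metric below are all stand-ins for the paper's SMT-based encoding, chosen only to make the loop runnable:

```python
import random

def repair(samples, spec, candidates, orig_t=10):
    """Stand-in for the SMT-based minimal repair: among candidate thresholds
    correct on every sampled input, pick the one closest to the original."""
    ok = [t for t in candidates if all((x >= t) == spec(x) for x in samples)]
    return min(ok, key=lambda t: abs(t - orig_t)) if ok else None

spec = lambda x: x >= 5        # desired behavior; the buggy program uses threshold 10
rng = random.Random(2)         # fixed seed so the run is reproducible
t = None
for _ in range(10):            # sample -> repair -> verify, as in the paper's loop
    samples = [rng.randrange(0, 20) for _ in range(50)]
    t = repair(samples, spec, range(0, 20))
    if t is not None and all((x >= t) == spec(x) for x in range(0, 20)):
        break                  # repaired program verified on the whole input space
```

Each round, the synthesized repair is checked against the full specification; counterexamples are covered by fresh samples in later rounds until verification succeeds.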