Open Access Book
An Introduction to Computational Learning Theory
Michael Kearns, Umesh Vazirani
TLDR
Describes the probably approximately correct (PAC) learning model, Occam's razor, the Vapnik-Chervonenkis dimension, weak and strong learning, learning in the presence of noise, inherent unpredictability, reducibility in PAC learning, and learning finite automata.
Abstract:
The probably approximately correct learning model; Occam's razor; the Vapnik-Chervonenkis dimension; weak and strong learning; learning in the presence of noise; inherent unpredictability; reducibility in PAC learning; learning finite automata by experimentation; appendix: some tools for probabilistic analysis.
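The PAC model listed above comes with a standard sample-complexity bound for finite hypothesis classes: a consistent learner needs m >= (1/eps) * (ln|H| + ln(1/delta)) examples to be probably (1 - delta) approximately (error < eps) correct. A minimal sketch of that arithmetic; the bound is the textbook one, the concrete numbers are illustrative:

```python
import math

def pac_sample_bound(hypothesis_count, epsilon, delta):
    """Samples sufficient for a consistent learner over a finite
    hypothesis class H in the realizable PAC model:
    m >= (1/epsilon) * (ln|H| + ln(1/delta))."""
    return math.ceil((math.log(hypothesis_count) + math.log(1.0 / delta)) / epsilon)

# e.g. |H| = 2**20 hypotheses, 5% error, 95% confidence
m = pac_sample_bound(2 ** 20, epsilon=0.05, delta=0.05)  # 338 examples suffice
```

Note that the bound grows only logarithmically in |H|, which is why even exponentially large hypothesis classes can be PAC-learnable from modest samples.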
Citations
Journal Article (DOI)
A statistical perspective on data mining
TL;DR: Compares three approaches to machine learning that have developed largely independently (classical statistics, Vapnik's statistical learning theory, and computational learning theory) and concludes that statisticians and data miners can profit by studying each other's methods and using a judiciously chosen combination of them.
Book Chapter (DOI)
Distance Between Herbrand Interpretations: A Measure for Approximations to a Target Concept
TL;DR: This paper defines a metric d on the set of expressions, motivated by the structure and complexity of the expressions and the symbols used therein, which makes it possible to measure the distance between Herbrand interpretations by considering the elements in their symmetric difference.
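A Herbrand interpretation is a set of ground atoms, so a distance based on the symmetric difference is easy to illustrate. The paper's metric d additionally weights atoms by their structure and complexity; this sketch uses only the raw symmetric-difference count, which already satisfies the metric axioms:

```python
def herbrand_distance(interp_i, interp_j):
    """Toy distance between two Herbrand interpretations, each a set
    of ground atoms (here plain strings): the number of atoms on
    which the interpretations disagree.  The paper's metric refines
    this by weighting atoms by structure; that weighting is omitted."""
    return len(interp_i ^ interp_j)  # symmetric difference

I = {"p(a)", "q(a)", "q(b)"}
J = {"p(a)", "q(b)", "r(a)"}
# I and J disagree on exactly two atoms: q(a) and r(a)
```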
Journal Article (DOI)
Learning from interpretation transition
TL;DR: Proposes a novel framework for learning normal logic programs from transitions of interpretations: given a set of pairs of interpretations (I, J) such that J = T_P(I), where T_P is the immediate consequence operator, the task is to learn the program P.
Proceedings Article
Learning to take actions
TL;DR: In this article, the authors formalize a model for supervised learning of action strategies in dynamic stochastic domains, and show that PAC-learning results on Occam algorithms hold in this model as well.
Proceedings Article (DOI)
New degree bounds for polynomial threshold functions
Ryan O'Donnell, Rocco A. Servedio, et al.
TL;DR: The upper bounds for Boolean formulas yield the first known subexponential-time learning algorithms for formulas of superconstant depth; the first new degree lower bounds since 1968 are also given, improving results of Minsky and Papert.
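A polynomial threshold function (PTF) computes a Boolean function as the sign of a real polynomial over {-1, +1} inputs, and the degree of the polynomial is the quantity the paper bounds. A toy instance of the degree phenomenon (the specific function is an illustration, not taken from the paper): equality of two bits requires degree 2.

```python
def xnor_ptf(x1, x2):
    """Degree-2 PTF sign(x1 * x2) over {-1, +1} inputs: outputs +1
    exactly when the two bits agree (XNOR).  No degree-1 polynomial
    can sign-represent this function, the classic Minsky-Papert
    style lower bound the paper's new bounds generalize."""
    return 1 if x1 * x2 > 0 else -1
```

Degree matters for learning because a degree-d PTF on n variables can be found, when one exists, by linear programming over the roughly n**d monomials, which is where the subexponential-time learning algorithms mentioned above come from.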