Open Access · Journal Article · DOI

Sample size lower bounds in PAC learning by algorithmic complexity theory

Bruno Apolloni, +1 more
06 Dec 1998
Vol. 209, Iss. 1, pp. 141–162
TLDR
This paper presents a general setup for obtaining sample-size lower bounds for learning concept classes under fixed distribution laws in an extended PAC learning framework, based on incompressibility methods drawn from Kolmogorov complexity and algorithmic probability theories.
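For context, the flavor of result this framework targets can be sketched with the classical distribution-free PAC lower bound; the statement below is the standard textbook form (not quoted from the paper, whose fixed-distribution bounds refine it):

```latex
% Classical PAC sample-size lower bound (standard statement, for context).
% For a concept class $C$ of VC dimension $d$, any $(\epsilon,\delta)$-PAC
% learner requires
m \;=\; \Omega\!\left( \frac{d + \log(1/\delta)}{\epsilon} \right)
% examples. Incompressibility arguments obtain bounds of this kind by showing
% that a Kolmogorov-random target concept cannot be identified from a sample
% carrying fewer bits of information than the concept's description.
```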
About
This article was published in Theoretical Computer Science on 1998-12-06 and is currently open access. It has received 11 citations to date. The article focuses on the topics: Algorithmic learning theory & Sample exclusion dimension.


Citations
Journal ArticleDOI

Recent trends on nanofluid heat transfer machine learning research applied to renewable energy

TL;DR: This review surveys recent trends in machine learning research on nanofluid heat transfer applied to renewable energy and examines the features and applicability of various machine learning methods.

Optimal quantum sample complexity of learning algorithms

TL;DR: In this paper, it was shown that in the quantum setting, where each example is a coherent quantum state, quantum and classical sample complexity are in fact equal up to constant factors in both the PAC and agnostic models.
Proceedings ArticleDOI

Bounds on the sample complexity of Bayesian learning using information theory and the VC dimension

TL;DR: An understanding of the sample complexity of learning in several existing models is provided and a systematic investigation and comparison of two fundamental quantities in learning and information theory is undertaken.
Journal ArticleDOI

On the relationship between workflow models and document types

TL;DR: This paper presents a framework in which the business processes to be supported and the database of the information system can be derived separately, and shows that there is a one-to-one correspondence between Jackson nets and Jackson types.
Proceedings ArticleDOI

Improved lower bounds for learning from noisy examples: an information-theoretic approach

TL;DR: This paper presents a general information-theoretic approach for obtaining lower bounds on the number of examples needed to PAC learn in the presence of noise, and the technique is applied to several different models, illustrating its generality and power.
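As a point of comparison, information-theoretic arguments in the noisy-PAC literature typically show that classification noise inflates sample complexity; the bound below is the commonly stated form (an assumption about the general literature, not a quote from this paper, and exact constants and log factors vary by model):

```latex
% Standard lower bound under classification noise of rate $\eta < 1/2$,
% for a concept class of VC dimension $d$:
m \;=\; \Omega\!\left( \frac{d}{\epsilon\,(1 - 2\eta)^{2}} \right)
% The $(1-2\eta)^{-2}$ factor reflects the reduced information carried by
% each noisy label as $\eta$ approaches $1/2$.
```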
References
Proceedings ArticleDOI

A theory of the learnable

TL;DR: This paper regards learning as the phenomenon of knowledge acquisition in the absence of explicit programming, and gives a precise methodology for studying this phenomenon from a computational viewpoint.
Book

An Introduction to Kolmogorov Complexity and Its Applications

TL;DR: As reviewed in the Journal of Symbolic Logic, the book presents a thorough treatment of the subject with a wide range of illustrative applications, such as the randomness of finite objects or infinite sequences, Martin-Löf tests for randomness, information theory, computational learning theory, the complexity of algorithms, and the thermodynamics of computing.

An Introduction to Kolmogorov Complexity and Its Applications

TL;DR: The book presents a thorough treatment of the central ideas and their applications of Kolmogorov complexity with a wide range of illustrative applications, and will be ideal for advanced undergraduate students, graduate students, and researchers in computer science, mathematics, cognitive sciences, philosophy, artificial intelligence, statistics, and physics.
Book

Estimation of Dependences Based on Empirical Data

TL;DR: This book develops methods for estimating dependences based on empirical data and frames the big picture of inference as direct inference instead of generalization.