Journal ArticleDOI

How many queries are needed to learn?

TLDR
It is shown that an honest class is exactly polynomial-query learnable if and only if it is learnable using an oracle for Γ^p_4, establishing a new relationship between query complexity and time complexity in exact learning.
Abstract
We investigate the query complexity of exact learning in the membership and (proper) equivalence query model. We give a complete characterization of concept classes that are learnable with a polynomial number of polynomial-sized queries in this model. We give applications of this characterization, including results on learning a natural subclass of DNF formulas and on learning with membership queries alone. Query complexity has previously been used to prove lower bounds on the time complexity of exact learning. We show a new relationship between query complexity and time complexity in exact learning: if any “honest” class is exactly and properly learnable with polynomial query complexity, but not learnable in polynomial time, then P ≠ NP. In particular, we show that an honest class is exactly polynomial-query learnable if and only if it is learnable using an oracle for Γ^p_4.
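
To make the membership/equivalence query model concrete, here is a minimal sketch (not taken from the paper) of exact learning for the textbook class of monotone conjunctions. The Teacher class, the brute-force equivalence check, and the variable names are assumptions made for illustration only.

```python
from itertools import product

class Teacher:
    """Oracle for a hidden monotone conjunction over n Boolean variables.
    The target is the set of variable indices that must all be 1."""
    def __init__(self, n, target):
        self.n = n
        self.target = frozenset(target)

    def _eval(self, x):
        return all(x[i] for i in self.target)

    def membership(self, x):
        """Membership query: is assignment x a positive example?"""
        return self._eval(x)

    def equivalence(self, hypothesis):
        """Proper equivalence query: hypothesis is a set of variable indices.
        Returns None if equivalent, otherwise a counterexample (brute force)."""
        for x in product([0, 1], repeat=self.n):
            if all(x[i] for i in hypothesis) != self._eval(x):
                return x
        return None

def learn_monotone_conjunction(teacher, n):
    """Start with the most specific conjunction and drop every variable
    that is 0 in a (necessarily positive) counterexample."""
    hypothesis = set(range(n))
    while True:
        counterexample = teacher.equivalence(hypothesis)
        if counterexample is None:
            return hypothesis
        hypothesis = {i for i in hypothesis if counterexample[i] == 1}

teacher = Teacher(n=5, target={0, 3})
print(learn_monotone_conjunction(teacher, 5))   # {0, 3}
```

The learner uses at most n + 1 proper equivalence queries; the membership oracle is included only to show the full interface of the query model.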


Citations
Book

Grammatical Inference: Learning Automata and Grammars

TL;DR: The author describes a number of techniques and algorithms for learning from text, from an informant, or through interaction with the environment, covering automata, grammars, rewriting systems, pattern languages, and transducers.
Proceedings ArticleDOI

A bound on the label complexity of agnostic active learning

TL;DR: General bounds are derived on the number of label requests made by the A² algorithm proposed by Balcan, Beygelzimer & Langford; these represent the first nontrivial general-purpose upper bound on label complexity in the agnostic PAC model.
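
As a rough illustration of where label savings can come from, the sketch below implements the disagreement-region idea in the simplest realizable setting (1-D thresholds). It is not the A² algorithm and has none of its agnostic (noise-tolerant) analysis; the function names and the label budget are assumptions for the example.

```python
import random

def active_learn_threshold(points, oracle_label, budget):
    """Disagreement-based label requests for 1-D thresholds h_t(x) = [x >= t],
    realizable case: only points inside the current interval of disagreement
    are sent to the labeling oracle."""
    lo, hi = min(points), max(points)       # surviving thresholds lie in (lo, hi]
    labels_used = 0
    unlabeled = list(points)
    random.shuffle(unlabeled)
    for x in unlabeled:
        if not (lo < x < hi):               # every surviving threshold agrees on x
            continue
        if labels_used >= budget:
            break
        y = oracle_label(x)                 # label request
        labels_used += 1
        if y == 1:
            hi = min(hi, x)                 # threshold must be <= x
        else:
            lo = max(lo, x)                 # threshold must be > x
    return (lo + hi) / 2, labels_used

points = [i / 100 for i in range(100)]
estimate, used = active_learn_threshold(points, lambda x: int(x >= 0.37), budget=20)
print(round(estimate, 2), used)
```
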
Journal ArticleDOI

Equivalences and Separations Between Quantum and Classical Learnability

TL;DR: These results contrast with known results showing that testing black-box functions for various properties, as opposed to learning them, can require exponentially more classical queries than quantum queries.
Journal ArticleDOI

The Geometry of Generalized Binary Search

TL;DR: In this paper, the authors investigate the problem of determining a binary-valued function through a sequence of strategically selected queries and develop novel incoherence and geometric conditions under which generalized binary search (GBS) achieves the information-theoretically optimal query complexity, i.e., given a collection of N hypotheses, GBS terminates with the correct function after no more than a constant times log N queries.
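
A minimal sketch of the GBS selection rule on a finite hypothesis class, assuming the learner can evaluate every hypothesis on every candidate query: at each step it asks the query whose answers split the surviving hypotheses most evenly. The threshold class and the grid of queries are illustrative choices, not taken from the paper.

```python
def generalized_binary_search(hypotheses, queries, oracle):
    """GBS over a finite hypothesis class.
    hypotheses: functions h(q) -> {-1, +1}; oracle: the unknown target,
    used only to answer the selected queries."""
    alive = list(hypotheses)
    n_queries = 0
    while len(alive) > 1:
        # ask the query that splits the surviving hypotheses most evenly
        q = min(queries, key=lambda q: abs(sum(h(q) for h in alive)))
        answer = oracle(q)
        n_queries += 1
        alive = [h for h in alive if h(q) == answer]
    return alive[0], n_queries

# Toy class: 1-D thresholds, h_t(x) = +1 iff x >= t, on the grid {0, ..., 15}
def make_threshold(t):
    return lambda x: 1 if x >= t else -1

hypotheses = [make_threshold(t) for t in range(16)]
queries = list(range(16))
target = make_threshold(11)
learned, used = generalized_binary_search(hypotheses, queries, target)
print(learned(10), learned(11), used)   # -1 1 4  (log2 of the 16 hypotheses)
```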

Theoretical foundations of active learning

TL;DR: Bounds on the rates of convergence achievable by active learning are derived, under various noise models and under general conditions on the hypothesis class.
References
Book

Introduction to Automata Theory, Languages, and Computation

TL;DR: This book is a rigorous exposition of formal languages and models of computation, with an introduction to computational complexity, appropriate for upper-level computer science undergraduates who are comfortable with mathematical arguments.
Proceedings ArticleDOI

A theory of the learnable

TL;DR: This paper regards learning as the phenomenon of knowledge acquisition in the absence of explicit programming, and gives a precise methodology for studying this phenomenon from a computational viewpoint.
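
For a finite hypothesis class H in the realizable PAC model introduced here, the standard bound for consistent learners, m ≥ (1/ε)(ln|H| + ln(1/δ)), can be evaluated directly. The helper below is an illustrative sketch; the conjunction example assumes |H| ≈ 3^n.

```python
from math import ceil, log

def pac_sample_size(num_hypotheses, epsilon, delta):
    """m >= (1/eps) * (ln|H| + ln(1/delta)) examples suffice for any
    consistent learner over a finite class H to be (eps, delta)-PAC."""
    return ceil((log(num_hypotheses) + log(1 / delta)) / epsilon)

# e.g. conjunctions over n = 20 Boolean variables: roughly |H| = 3^20
print(pac_sample_size(3 ** 20, epsilon=0.1, delta=0.05))   # 250
```
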
Journal ArticleDOI

Learning regular sets from queries and counterexamples

TL;DR: In this article, the problem of identifying an unknown regular set from examples of its members and nonmembers is addressed, where the regular set is presented by a minimally adequate teacher that can answer membership queries about the set and can also test a conjecture, indicating whether it is equal to the unknown set and providing a counterexample if not.
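
The "minimally adequate teacher" interface can be sketched directly: a membership oracle plus an equivalence oracle that returns a shortest counterexample via a breadth-first search of the product automaton. This is only the teacher side, not an implementation of the learner; the class names and the toy DFA are assumptions for illustration.

```python
from collections import deque

class DFA:
    def __init__(self, states, alphabet, delta, start, accepting):
        self.states, self.alphabet = states, alphabet
        self.delta, self.start, self.accepting = delta, start, accepting

    def accepts(self, word):
        q = self.start
        for a in word:
            q = self.delta[(q, a)]
        return q in self.accepting

class MinimallyAdequateTeacher:
    """Angluin-style teacher for a hidden regular set, given here as a DFA."""
    def __init__(self, target):
        self.target = target

    def membership(self, word):
        return self.target.accepts(word)

    def equivalence(self, conjecture):
        """Return None if the conjecture accepts the same language,
        otherwise a shortest counterexample (breadth-first product search)."""
        start = (self.target.start, conjecture.start)
        seen, queue = {start}, deque([(start, "")])
        while queue:
            (p, q), word = queue.popleft()
            if (p in self.target.accepting) != (q in conjecture.accepting):
                return word
            for a in self.target.alphabet:
                nxt = (self.target.delta[(p, a)], conjecture.delta[(q, a)])
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, word + a))
        return None

# Target: strings over {a, b} with an even number of a's; conjecture: all strings.
target = DFA({0, 1}, "ab", {(0, 'a'): 1, (0, 'b'): 0, (1, 'a'): 0, (1, 'b'): 1}, 0, {0})
conjecture = DFA({0}, "ab", {(0, 'a'): 0, (0, 'b'): 0}, 0, {0})
mat = MinimallyAdequateTeacher(target)
print(mat.membership("aa"), mat.equivalence(conjecture))   # True a
```
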
Journal ArticleDOI

Learnability and the Vapnik-Chervonenkis dimension

TL;DR: This paper shows that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned.
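
For small finite classes, the Vapnik-Chervonenkis dimension can be computed by brute force straight from its definition (the size of the largest set the class shatters). The sketch below is illustrative; the threshold class used as an example is an assumption, and its VC dimension is 1.

```python
from itertools import combinations

def vc_dimension(domain, hypotheses):
    """Brute-force VC dimension of a finite class of 0/1-valued hypotheses.
    A set S is shattered if every labelling of S is realized by some h."""
    best = 0
    for d in range(1, len(domain) + 1):
        shattered_any = False
        for S in combinations(domain, d):
            patterns = {tuple(h(x) for x in S) for h in hypotheses}
            if len(patterns) == 2 ** d:     # all 2^d labellings realized
                shattered_any = True
                break
        if shattered_any:
            best = d
        else:
            break   # if no set of size d is shattered, none of size d+1 is either
    return best

# Thresholds h_t(x) = [x >= t] on {0, ..., 9}: VC dimension 1
domain = list(range(10))
thresholds = [lambda x, t=t: int(x >= t) for t in range(11)]
print(vc_dimension(domain, thresholds))   # 1
```
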
Journal ArticleDOI

Queries and Concept Learning

TL;DR: This work considers the problem of using queries to learn an unknown concept, and describes and studies several types of queries: membership, equivalence, subset, superset, disjointness, and exhaustiveness queries.
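
For a finite target concept, the six query types studied here have direct set-theoretic readings. The sketch below spells them out as oracles that return a counterexample (or witness) when the answer is no; representing concepts as sets of positive examples over a finite domain is an assumption made for the example.

```python
def make_oracle(target, domain):
    """Query types from the queries-and-concept-learning framework, for a
    finite target concept given as its set of positive examples."""
    target, domain = set(target), set(domain)

    def membership(x):            # is x in the target concept?
        return x in target

    def equivalence(h):           # h == target? else an element of the symmetric difference
        diff = set(h) ^ target
        return (True, None) if not diff else (False, next(iter(diff)))

    def subset(h):                # h a subset of target? else a counterexample in h \ target
        diff = set(h) - target
        return (True, None) if not diff else (False, next(iter(diff)))

    def superset(h):              # h a superset of target? else a counterexample in target \ h
        diff = target - set(h)
        return (True, None) if not diff else (False, next(iter(diff)))

    def disjointness(h):          # h and target disjoint? else a witness of the intersection
        inter = set(h) & target
        return (True, None) if not inter else (False, next(iter(inter)))

    def exhaustiveness(h):        # h union target == domain? else a missing element
        missing = domain - (set(h) | target)
        return (True, None) if not missing else (False, next(iter(missing)))

    return membership, equivalence, subset, superset, disjointness, exhaustiveness

m, eq, sub, sup, dis, exh = make_oracle(target={1, 2, 3}, domain=range(6))
print(m(2), eq({1, 2}), sub({1, 4}), dis({4, 5}))   # True (False, 3) (False, 4) (True, None)
```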