Journal ArticleDOI

On learning ring-sum-expansions

Paul Fischer, +1 more
01 Feb 1992
Vol. 21, Iss. 1, pp. 181-192
TLDR
It is proved that 2-term-RSE is learnable by a conjunction of a 2-CNF and a 1-DNF, and that k-RSE, the class of ring-sum-expansions containing only monomials of length at most k, can be learned from positive (negative) examples alone.
Abstract
The problem of learning ring-sum-expansions from examples is studied. Ring-sum-expansions (RSE) are representations of Boolean functions over the basis $\{\wedge, \oplus, 1\}$, which reflect arithmetic operations in $GF(2)$. k-RSE is the class of ring-sum-expansions containing only monomials of length at most k. k-term-RSE is the class of ring-sum-expansions having at most k monomials. It is shown that k-RSE, $k \geq 1$, is learnable, while k-term-RSE, $k \geq 2$, is not learnable if $RP \ne NP$. Without using a complexity-theoretic hypothesis, it is proved that k-RSE, $k \geq 1$, and k-term-RSE, $k \geq 2$, cannot be learned from positive (negative) examples alone. However, if the restriction that the hypothesis output by the learning algorithm must itself be a k-RSE is dropped, then k-RSE is learnable from positive (negative) examples only. Moreover, it is proved that 2-term-RSE is learnable by a conjunction of a 2-CNF and a 1-DNF. Finally, the paper presents learning (on-line prediction) algorithms for k-...
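As a concrete illustration (not taken from the paper), the following minimal Python sketch evaluates a ring-sum-expansion given as a list of monomials; the encoding, the helper eval_rse, and the example function are assumptions made for exposition only.

# Sketch of a ring-sum-expansion (RSE) over GF(2): an XOR (ring sum) of monomials,
# where each monomial is an AND of variables and the empty monomial is the constant 1.
from functools import reduce

def eval_rse(monomials, assignment):
    # monomials:  list of tuples of variable indices, e.g. [(0, 1), (2,), ()]
    #             encodes x0*x1 XOR x2 XOR 1
    # assignment: sequence of 0/1 values for the variables
    terms = (int(all(assignment[i] for i in m)) for m in monomials)
    return reduce(lambda a, b: a ^ b, terms, 0)

# Illustrative function: a 2-RSE (every monomial has length at most 2) that is
# also a 3-term-RSE (three monomials, counting the constant 1).
f = [(0, 1), (2,), ()]
print(eval_rse(f, [1, 1, 0]))  # (1 AND 1) XOR 0 XOR 1 = 0
print(eval_rse(f, [0, 1, 1]))  # (0 AND 1) XOR 1 XOR 1 = 0
print(eval_rse(f, [1, 0, 0]))  # (1 AND 0) XOR 0 XOR 1 = 1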


Citations
Proceedings ArticleDOI

Efficient noise-tolerant learning from statistical queries

TL;DR: This paper formalizes a new but related model of learning from statistical queries and demonstrates the generality of the statistical query model, showing that practically every class learnable in Valiant's model and its variants can also be learned in the new model (and thus can be learned in the presence of noise).
Journal ArticleDOI

Efficient noise-tolerant learning from statistical queries

TL;DR: This paper formalizes a new but related model of learning from statistical queries and demonstrates the generality of the statistical query model, showing that practically every class learnable in Valiant's model and its variants can also be learned in the new model (and thus can be learned in the presence of noise).
Posted Content

What Can We Learn Privately?

TL;DR: In this paper, it was shown that a concept class is learnable by a local algorithm if and only if it is learnable in the statistical query (SQ) model.
Proceedings ArticleDOI

On the learnability of discrete distributions

TL;DR: A new model of learning probability distributions from independent draws is introduced, inspired by the popular Probably Approximately Correct (PAC) model for learning Boolean functions from labeled examples, in the sense that it emphasizes efficient and approximate learning, and it studies the learnability of restricted classes of target distributions.
Proceedings ArticleDOI

What Can We Learn Privately?

TL;DR: This work investigates learning algorithms that satisfy differential privacy, a notion that provides strong confidentiality guarantees in contexts where aggregate information is released about a database containing sensitive information about individuals.