Open Access Proceedings Article

Normal Forms in Semantic Language Identification

TLDR
It is shown that strongly locking learning can be assumed for partially set-driven learners, even when learning restrictions apply, and also the converse is true: every strongly locking learner can be made partially set-driven.
Abstract
We consider language learning in the limit from text where all learning restrictions are semantic, that is, where any conjecture may be replaced by a semantically equivalent conjecture. For different such learning criteria, starting with the well-known TxtGBc-learning, we consider three different normal forms: strongly locking learning, consistent learning and (partially) set-driven learning. These normal forms support and simplify proofs and give insight into what behaviors are necessary for successful learning (for example, that consistency in conservative learning implies cautiousness and strong decisiveness). We show that strongly locking learning can be assumed for partially set-driven learners, even when learning restrictions apply. We give a very general proof relying only on a natural property of the learning restriction, namely, allowing for simulation on equivalent text. Furthermore, when no restrictions apply, the converse is also true: every strongly locking learner can be made partially set-driven. For several semantic learning criteria we show that learning can be done consistently. Finally, we deduce for which learning restrictions partial set-drivenness and set-drivenness coincide, including a general statement about classes of infinite languages. The latter again relies on a simulation argument.
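As a toy illustration of the set-driven idea (a hedged sketch, not taken from the paper): a learner is set-driven if its conjecture depends only on the set of data observed so far, not on their order or multiplicity. A classic example is learning the class of all finite languages over the naturals from text by conjecturing exactly the content seen so far; once the whole (finite) target language has appeared in the text, the conjecture stabilizes on it.

```python
# Hypothetical toy example of a set-driven learner (not from the paper):
# the class of all finite languages over the naturals is learnable in the
# limit from text by conjecturing exactly the content observed so far.
# The conjecture depends only on the *set* of data seen, so the learner
# is set-driven by construction.

def set_driven_learner(observed):
    """Map the set of observed data to a conjecture (here: the content itself).

    Pause symbols '#' in the text are encoded as None and carry no content.
    """
    return frozenset(x for x in observed if x is not None)

def run_on_text(text):
    """Feed the text to the learner one datum at a time, collecting conjectures."""
    seen = set()
    conjectures = []
    for datum in text:
        seen.add(datum)
        conjectures.append(set_driven_learner(seen))
    return conjectures

# A text for the finite language {1, 2, 3}, with repetitions and a pause (None):
text = [1, 2, None, 2, 3, 3, 3]
conjectures = run_on_text(text)
# Once all of {1, 2, 3} has appeared, the conjecture no longer changes.
```

Order and repetition in the text do not affect the final conjecture, which is exactly the independence from presentation that (partial) set-drivenness formalizes.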


Citations
Proceedings Article

Cautious Limit Learning.

TL;DR: This paper compares the known variants of cautious learning in a number of different settings, namely full-information and (partially) set-driven learning, paired either with the syntactic convergence restriction (explanatory learning) or the semantic convergence restriction (behaviourally correct learning), in order to understand the restriction of cautious learning more fully.
Posted Content

Learning from Informants: Relations between Learning Success Criteria

TL;DR: The deduced main theorem states the relations between the most important delayable learning success criteria, that is, those not ruined by delaying the hypothesis output in time, and underpins the claim that delayability is the right structural property.
Posted Content

Learning Families of Formal Languages from Positive and Negative Information.

TL;DR: These investigations underpin the claim that delayability is the right structural property for gaining a deeper understanding of the nature of learning restrictions.
Posted Content

Learning Languages in the Limit from Positive Information with Finitely Many Memory Changes

TL;DR: It is shown that non-U-shapedness is not restrictive, while conservativeness and (strong) monotonicity are, and that iterative and bounded memory states learning are equivalent for a wealth of restrictions.
Posted Content

Learning Languages with Decidable Hypotheses

TL;DR: This paper establishes a hierarchy of learning power depending on whether $C$-indices are required (a) on all outputs, (b) only on outputs relevant for the class to be learned, or (c) only in the limit on the final, correct hypotheses.
References
Journal ArticleDOI

Language identification in the limit

TL;DR: It was found that the class of context-sensitive languages is learnable from an informant, but that not even the class of regular languages is learnable from a text.
Book

Formal Principles of Language Acquisition

TL;DR: The authors of this book have developed a rigorous and unified theory that opens the study of language learnability to discoveries about the mechanisms of language acquisition in human beings and has important implications for linguistic theory, child language research, and the philosophy of language.
Journal ArticleDOI

Theory of Recursive Functions and Effective Computability

D. C. Cooper
- 01 Feb 1969 - 
Journal ArticleDOI

Inductive inference of formal languages from positive data

TL;DR: A theorem characterizing when an indexed family of nonempty recursive formal languages is inferable from positive data is proved, and other useful conditions for inference from positive data are obtained.
Journal ArticleDOI

A Machine-Independent Theory of the Complexity of Recursive Functions

TL;DR: The number of steps required to compute a function depends on the type of computer that is used, on the choice of computer program, and on the input-output code, but the results obtained in this paper are nearly independent of these considerations.