Open Access

Empirical tests of the Gradual Learning Algorithm

TLDR
It is argued that the Gradual Learning Algorithm has a number of special advantages: it can learn free variation, deal effectively with noisy learning data, and account for gradient well-formedness judgments.
Abstract
The Gradual Learning Algorithm (Boersma 1997) is a constraint ranking algorithm for learning Optimality-theoretic grammars. The purpose of this article is to assess the capabilities of the Gradual Learning Algorithm, particularly in comparison with the Constraint Demotion algorithm of Tesar and Smolensky (1993, 1996, 1998), which initiated the learnability research program for Optimality Theory. We argue that the Gradual Learning Algorithm has a number of special advantages: it can learn free variation, avoid failure when confronted with noisy learning data, and account for gradient well-formedness judgments. The case studies we examine involve Ilokano reduplication and metathesis, Finnish genitive plurals, and the distribution of English light and dark /l/.
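The core of the algorithm can be illustrated with a minimal sketch. In stochastic OT, each constraint carries a real-valued ranking; at evaluation time Gaussian noise is added to every ranking, and candidates compete under the resulting order. On an error, constraints favoring the correct form are promoted by a small plasticity step and constraints favoring the learner's form are demoted. The constraint names, candidate forms, violation profiles, and parameter values below are invented for illustration and are not drawn from the article:

```python
import random

PLASTICITY = 0.1   # step size for ranking adjustments (assumed value)
NOISE_SD = 2.0     # evaluation noise, the stochastic part of stochastic OT

def evaluate(ranking, candidates, violations):
    """Add noise to each ranking value, order constraints by the noisy
    values, and pick the candidate with the best violation profile."""
    noisy = {c: v + random.gauss(0, NOISE_SD) for c, v in ranking.items()}
    order = sorted(noisy, key=noisy.get, reverse=True)
    return min(candidates, key=lambda cand: tuple(violations[cand][c] for c in order))

def gla_update(ranking, learner_out, correct_out, violations):
    """Error-driven update: promote constraints that favor the correct
    form, demote constraints that favor the learner's erroneous form."""
    if learner_out == correct_out:
        return
    for c in ranking:
        if violations[learner_out][c] > violations[correct_out][c]:
            ranking[c] += PLASTICITY   # favors correct form: promote
        elif violations[learner_out][c] < violations[correct_out][c]:
            ranking[c] -= PLASTICITY   # favors learner's form: demote

# Hypothetical mini-grammar: two candidates, two constraints.
ranking = {"MARKEDNESS": 100.0, "FAITH": 100.0}
violations = {
    "ta": {"MARKEDNESS": 0, "FAITH": 1},   # unfaithful but unmarked
    "da": {"MARKEDNESS": 1, "FAITH": 0},   # faithful but marked
}

random.seed(1)
for _ in range(1000):
    out = evaluate(ranking, list(violations), violations)
    gla_update(ranking, out, "da", violations)  # adult data: always "da"
```

Because the updates are gradual and the evaluation is noisy, the learner converges on ranking FAITH above MARKEDNESS without any single datum forcing a categorical re-ranking; with variable adult data, the same mechanism settles on ranking values that reproduce the observed output frequencies, which is how the algorithm handles free variation.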


Citations
Journal ArticleDOI

Differential Object Marking: Iconicity vs. Economy

TL;DR: The degree to which DOM penetrates the class of objects reflects the tension between two types of principles: iconicity (the more marked a direct object qua object, the more likely it is to be overtly case-marked) and economy (overt case-marking is avoided where it is not needed).
Journal ArticleDOI

Second language acquisition.

TL;DR: The review details the theoretical stances of two approaches to the nature of language, generative linguistics and general cognitive approaches, and discusses results of key acquisition studies from both frameworks.

Handbook of Pragmatics

TL;DR: The Handbook of Pragmatics is a reference work on pragmatics drawn upon by many of the works cited here.
Journal ArticleDOI

Redundancy and reduction: Speakers manage syntactic information density

TL;DR: A principle of efficient language production based on information-theoretic considerations is proposed: Uniform Information Density predicts that language production is affected by a preference to distribute information uniformly across the linguistic signal, and this prediction is tested against data from syntactic reduction.
Journal ArticleDOI

A Maximum Entropy Model of Phonotactics and Phonotactic Learning

TL;DR: This work proposes a theory of phonotactic grammars and a learning algorithm that constructs such grammars from positive evidence, and applies the model in a variety of learning simulations, showing that the learned grammars capture the distributional generalizations of these languages and accurately predict the findings of a phonotactics experiment.
References
Book

Phonology and Syntax: The Relation between Sound and Structure

TL;DR: This book develops a fundamentally new approach to the theory of phonology and is the first to address the relation between syntax and phonology in a systematic way.

Faithfulness and reduplicative identity

TL;DR: The comments, questions, and suggestions from participants in the (eventually joint) UMass/Rutgers Correspondence Theory seminars were particularly important for the development of this work.
Journal Article

Compensatory Lengthening in Moraic Phonology

Bruce Hayes
- 01 Jan 1989 - 
TL;DR: A phonological representation of the prosodic level, containing a single unit corresponding to the traditional notion of the mora within a metrical framework.