
Showing papers on "Chomsky hierarchy published in 2013"


Journal ArticleDOI
TL;DR: It is argued that the hypothesis that humans employ distinct learning mechanisms for phonology and syntax currently offers the best explanation for this difference.
Abstract: An important distinction between phonology and syntax has been overlooked. All phonological patterns belong to the regular region of the Chomsky Hierarchy, but not all syntactic patterns do. We argue that the hypothesis that humans employ distinct learning mechanisms for phonology and syntax currently offers the best explanation for this difference.
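The contrast can be made concrete in code. Below is an illustrative sketch (not from the paper): a phonotactic-style constraint such as "no two adjacent sibilants" needs only a finite amount of state, while a syntactic pattern with matched embeddings, schematized as a^n b^n, requires an unbounded counter and so falls outside the regular languages.

```python
def no_adjacent_sibilants(word, sibilants=frozenset("sz")):
    """Finite-state check: the entire memory is one boolean
    (whether the previous symbol was a sibilant), so the
    pattern is regular."""
    prev_was_sibilant = False
    for ch in word:
        if ch in sibilants and prev_was_sibilant:
            return False
        prev_was_sibilant = ch in sibilants
    return True


def matched_embedding(word):
    """Check membership in a^n b^n (n >= 0): the unbounded
    counter is exactly what no finite-state machine has."""
    count = 0
    seen_b = False
    for ch in word:
        if ch == "a":
            if seen_b:
                return False
            count += 1
        elif ch == "b":
            seen_b = True
            count -= 1
            if count < 0:
                return False
        else:
            return False
    return count == 0
```

The first function could be compiled into a two-state automaton; the second cannot be realized by any finite automaton, which is the formal sense in which such syntactic patterns sit strictly above the regular region.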

61 citations


BookDOI
01 Jan 2013
TL;DR: Some of the most prominent figures in linguistics, including Noam Chomsky and Barbara H. Partee, offer new insights into the nature of linguistic meaning and pave the way for the further development of formal semantics and formal pragmatics.
Abstract: In recent years, the study of formal semantics and formal pragmatics has grown tremendously showing that core aspects of language meaning can be explained by a few principles. These principles are grounded in the logic that is behind - and tightly intertwined with - the grammar of human language. In this book, some of the most prominent figures in linguistics, including Noam Chomsky and Barbara H. Partee, offer new insights into the nature of linguistic meaning and pave the way for the further development of formal semantics and formal pragmatics. Each chapter investigates various dimensions in which the logical nature of human language manifests itself within a language and/or across languages. Phenomena like bare plurals, free choice items, scalar implicatures, intervention effects, and logical operators are investigated in depth and at times cross-linguistically and/or experimentally. This volume will be of interest to scholars working within the fields of semantics, pragmatics, language acquisition and psycholinguistics.

42 citations


01 Jan 2013
TL;DR: This paper is an attempt to share with a larger audience some modern developments in the theory of finite automata, written for the mathematician who has a background in semigroup theory but knows next to nothing on automata and languages.
Abstract: This paper is an attempt to share with a larger audience some modern developments in the theory of finite automata. It is written for the mathematician who has a background in semigroup theory but knows next to nothing on automata and languages. No proofs are given, but the main results are illustrated by several examples and counterexamples. What is the topic of this theory? It deals with languages, automata and semigroups, although recent developments have shown interesting connections with model theory in logic, symbolic dynamics and topology. Historically, in their attempt to formalize natural languages, linguists such as Chomsky gave a mathematical definition of natural concepts such as words, languages or grammars: given a finite set A, a word on A is simply an element of the free monoid on A, and a language is a set of words. But since scientists are fond of classifications of all sorts, language theory didn't escape this mania. Chomsky established a first hierarchy, based on his formal grammars. In this paper, we are interested in the recognizable languages, which form the lowest level of the Chomsky hierarchy. A recognizable language can be described in terms of finite automata while, for the higher levels, more powerful machines, ranging from pushdown automata to Turing machines, are required. For this reason, problems on finite automata are often underestimated, according to the vague — but totally erroneous — feeling that "if a problem has been reduced to a question about finite automata, then it should be easy to solve". Kleene's theorem [23] is usually considered the foundation of the theory. It shows that the class of recognizable languages (i.e. recognized by finite automata) coincides with the class of rational languages, which are given by rational expressions. Rational expressions can be thought of as a generalization of polynomials involving three operations: union (which plays the role of addition), product and the star operation.
An important corollary of Kleene’s theorem is that rational languages are closed under complement. In the sixties, several classification schemes for the rational languages were proposed, based on the
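The complement-closure corollary mentioned above has a very direct automaton-level proof sketch: for a *complete* deterministic finite automaton, complementing the recognized language just swaps accepting and non-accepting states. The class and the example language below are illustrative, not taken from the survey.

```python
class DFA:
    """A complete deterministic finite automaton: delta must be
    defined on every (state, letter) pair."""

    def __init__(self, states, alphabet, delta, start, accepting):
        self.states, self.alphabet = states, alphabet
        self.delta, self.start, self.accepting = delta, start, accepting

    def accepts(self, word):
        q = self.start
        for ch in word:
            q = self.delta[(q, ch)]  # total transition function
        return q in self.accepting

    def complement(self):
        """Recognize the complement language by flipping the
        accepting set; this relies on delta being total."""
        return DFA(self.states, self.alphabet, self.delta,
                   self.start, self.states - self.accepting)


# Example: words over {a, b} with an even number of a's.
even_a = DFA(states={0, 1}, alphabet={"a", "b"},
             delta={(0, "a"): 1, (1, "a"): 0,
                    (0, "b"): 0, (1, "b"): 1},
             start=0, accepting={0})
odd_a = even_a.complement()  # words with an odd number of a's
```

By Kleene's theorem both languages are also rational, even though rational expressions themselves have no complement operator.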

24 citations


Book
12 Nov 2013
TL;DR: A book that moves from the settling of a language and the Whorf hypothesis, through the choice between relativism and a universal theory and a testbed for grammatical theories, to the Chomsky hierarchy in perspective and the cognitive implications of a generalized logic hierarchy.
Abstract: Contents: Introduction; 1. The Settling of a Language; 2. The Whorf Hypothesis; 3. Relativism or a Universal Theory?; 4. What Does Language Have to Do With Logic and Mathematics?; 5. A Testbed for Grammatical Theories; 6. The Chomsky Hierarchy in Perspective; 7. Reflexivity and Identity in Language and Cognition; 8. The Generalized Logic Hierarchy and its Cognitive Implications; 9. The Intensionalization of Extensions; Bibliography; Index.

21 citations


Journal ArticleDOI
Kaoru Fujioka
TL;DR: Some characterizations of and representation theorems for languages in the Chomsky hierarchy are obtained by using insertion systems, strictly locally testable languages, and morphisms in the framework of the Chomsky-Schutzenberger theorem.

6 citations


01 Jan 2013
TL;DR: The aim of this article is to survey some connections between formal language theory and group theory with particular emphasis on the word problem for groups and the consequence on the algebraic structure of a group of its word problem belonging to a certain class of formal languages.
Abstract: The aim of this article is to survey some connections between formal language theory and group theory, with particular emphasis on the word problem for groups and the consequences for the algebraic structure of a group of its word problem belonging to a certain class of formal languages. We define our terms in Section 2 and then consider the structure of groups whose word problem is regular or context-free in Section 3. In Section 4 we look at groups whose word problem is a one-counter language, and we move up the Chomsky hierarchy to briefly consider what happens above context-free in Section 5. In Section 6, we see what happens if we consider languages lying in certain complexity classes, and we summarize the situation in Section 7. For general background material on group theory we refer the reader to [25, 26], and for formal language theory to [7, 16, 20].
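A standard first example of the word problem as a language (a sketch of the general idea, not of any specific result in this survey): for the infinite cyclic group Z with generator a and inverse written here as A, a word lies in the word problem exactly when its exponents sum to zero. This set of words can be accepted with a single counter (plus a sign bit in the finite control), placing it among the one-counter languages discussed in Section 4, while a pumping argument shows it is not regular.

```python
def in_word_problem_of_Z(word):
    """Membership in the word problem of Z over the alphabet
    {a, A}, where A denotes the inverse of a: accept iff the
    running exponent sum returns to zero."""
    count = 0
    for ch in word:
        if ch == "a":
            count += 1
        elif ch == "A":
            count -= 1
        else:
            return False  # not a generator of the group
    return count == 0
```

Because Z is abelian, only the exponent sum matters, which is why one counter suffices; for free groups of higher rank the word problem instead needs a full pushdown stack.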

5 citations


Journal ArticleDOI
TL;DR: It is proved that unrestricted Szilard languages and certain leftmost Szilard languages of context-free matrix grammars, without appearance checking, can be accepted by indexing alternating Turing machines in logarithmic time and space.
Abstract: The regulated rewriting mechanism is one of the most efficient methods to augment the Chomsky hierarchy with a large variety of language classes. In this paper we investigate the derivation mechanism in regulated rewriting grammars, such as matrix grammars, by studying their Szilard languages. We focus on the complexity of Szilard languages associated with unrestricted and leftmost-like derivations in matrix grammars, with or without appearance checking. The reason is twofold: first, to relate these classes of languages to parallel complexity classes such as NC1 and AC1, and, second, to improve some previous results. We prove that unrestricted Szilard languages and certain leftmost Szilard languages of context-free matrix grammars, without appearance checking, can be accepted by indexing alternating Turing machines in logarithmic time and space. Consequently, these classes are included in UE*-uniform NC1. Unrestricted Szilard languages of matrix grammars with appearance checking can be accepted by deterministic Turing machines in O(n log n) time and O(log n) space. Leftmost-like Szilard languages of context-free matrix grammars, with appearance checking, can be recognized by nondeterministic Turing machines using the same time and space resources. Hence, all these classes are included in AC1.
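For readers unfamiliar with Szilard languages, here is a minimal illustrative sketch (the grammar and labels are invented, not from the paper): label each production of a grammar, and record the sequence of labels applied along a leftmost derivation. The set of all such label sequences is the Szilard language of the grammar.

```python
# Labelled productions of a small context-free grammar:
# p1: S -> aSb,  p2: S -> (empty word)
rules = {
    "p1": ("S", "aSb"),
    "p2": ("S", ""),
}


def leftmost_derive(labels, start="S"):
    """Apply the labelled rules in the given order to the
    leftmost occurrence of each rule's left-hand side; return
    the derived string, or None if some rule is inapplicable.
    A label sequence is a (leftmost) Szilard word iff it
    drives a complete derivation."""
    form = start
    for label in labels:
        lhs, rhs = rules[label]
        i = form.find(lhs)  # leftmost occurrence
        if i < 0:
            return None
        form = form[:i] + rhs + form[i + 1:]
    return form
```

For instance, the label sequence p1 p1 p2 drives the derivation S => aSb => aaSbb => aabb, so "p1 p1 p2" is a Szilard word of this grammar. Matrix grammars constrain which label sequences are allowed, which is what makes the complexity of their Szilard languages interesting.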

4 citations


Proceedings Article
01 Jan 2013
TL;DR: In this paper, the authors present the state of the art of machine learning, agent technologies and formal languages, not considering them as isolated research areas, but focusing on the relationship among them.
Abstract: The paper presents the state of the art of machine learning, agent technologies and formal languages, not considering them as isolated research areas but focusing on the relationships among them. First, we consider the relationship between learning and agents; second, between machine learning and formal languages; and third, between agents and formal languages. Finally, we point to some promising directions at the intersection of these three areas.

2 citations


Journal Article
TL;DR: A novel dynamical Bayesian network model, while based on a generalization of the Hidden Markov model, has qualitatively greater generative power than either the Hidden Markov model itself or any of its existing variants and generalizations.

1 citation


Proceedings ArticleDOI
17 Aug 2013
TL;DR: This work extracts dependency-tree-based lexical information and incorporates it into the language model, extending the traditional type-3 approach to consider Chomsky hierarchy types 1 and 2 and improving the baseline Malay large vocabulary automatic speech recognition system.
Abstract: This research work describes our approaches in using dependency parse tree information to derive useful hidden word statistics to improve the baseline system of a Malay large vocabulary automatic speech recognition system. The traditional approaches to training a language model are mainly based on Chomsky hierarchy type 3, which approximates natural language as a regular language. This approach ignores characteristics of natural language. Our work attempted to overcome these limitations by extending the approach to consider Chomsky hierarchy types 1 and 2. We extracted the dependency-tree-based lexical information and incorporated it into the language model. A second-pass lattice rescoring was performed to produce better hypotheses for the Malay large vocabulary continuous speech recognition system. The absolute WER reduction was 2.2% and 3.8% for the MASS and MASS-NEWS corpora, respectively.
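The "type 3" characterization of traditional language models can be seen in a maximum-likelihood bigram model: its only memory is the previous word, so it is a finite-state (Markov) process over the vocabulary. The toy corpus and counts below are purely illustrative and have no connection to the paper's MASS data.

```python
from collections import defaultdict

# Tiny illustrative corpus (two Malay sentences).
corpus = [["saya", "makan", "nasi"], ["saya", "minum", "teh"]]

bigram_counts = defaultdict(lambda: defaultdict(int))
unigram_counts = defaultdict(int)
for sent in corpus:
    tokens = ["<s>"] + sent + ["</s>"]
    for prev, cur in zip(tokens, tokens[1:]):
        bigram_counts[prev][cur] += 1
        unigram_counts[prev] += 1


def bigram_prob(prev, cur):
    """Maximum-likelihood P(cur | prev). The conditioning
    context is a single word, i.e. a state in a finite-state
    model -- which is why plain n-gram models sit at Chomsky
    type 3."""
    if unigram_counts[prev] == 0:
        return 0.0
    return bigram_counts[prev][cur] / unigram_counts[prev]
```

Dependency-tree features, by contrast, condition on structurally related words at unbounded distance, which is what pushes the model beyond the regular (type 3) approximation.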

1 citation


Journal ArticleDOI
TL;DR: A mathematical apparatus for the description of context-dependent grammars of artificial languages, viz., their syntax, as well as their logical and generating semantics, is presented.
Abstract: This paper presents a mathematical apparatus for the description of context-dependent grammars of artificial languages, viz., their syntax, as well as their logical and generating semantics. Examples that demonstrate the applicability of the suggested apparatus are given; the model of formal languages (Chomsky generating grammars) is compared with the model of artificial languages introduced in this work.