Showing papers on "Chomsky hierarchy published in 2016"


Proceedings Article
12 Jun 2016
TL;DR: This paper investigates STRIPS with and without conditional effects, tightens some existing results on hierarchical formalisms, and discusses the language-based expressivity measure with respect to the other approaches.
Abstract: From a theoretical perspective, judging the expressivity of planning formalisms helps to understand the relationship of different representations and to infer theoretical properties. From a practical point of view, it is important to be able to choose the best formalism for a problem at hand, or to ponder the consequences of introducing new representation features. Most work on expressivity is based either on compilation approaches or on the computational complexity of the plan existence problem. Recently, we introduced a new notion of expressivity. It is based on comparing the structural complexity of the set of solutions to a planning problem by interpreting the set as a formal language and classifying it with respect to the Chomsky hierarchy. This is a more direct measure than the plan existence problem and also enables the comparison of formalisms that cannot be compiled into each other. While existing work on this language-based approach focused on different hierarchical problem classes, this paper investigates STRIPS with and without conditional effects, though we also tighten some existing results on hierarchical formalisms. Our second contribution is a discussion of the language-based expressivity measure with respect to the other approaches.
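To make the language view concrete (a toy illustration, not an example taken from the paper): if a planning problem over two actions a and b is solved exactly by the plans b, ab, aab, aaab, ..., its solution set is the formal language a*b, which is regular; a formalism whose problems can have solution sets such as { a^n b^n | n >= 1 } exceeds the regular level under this measure, since that language is context-free but not regular.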

39 citations


BookDOI
05 May 2016
TL;DR: This book explains advanced theoretical and application-related issues in grammatical inference, a research area inside the inductive inference paradigm for machine learning, and focuses on the main classes of formal languages according to Chomsky's hierarchy.
Abstract: This book explains advanced theoretical and application-related issues in grammatical inference, a research area inside the inductive inference paradigm for machine learning. The first three chapters of the book deal with issues regarding theoretical learning frameworks; the next four chapters focus on the main classes of formal languages according to Chomsky's hierarchy, in particular regular and context-free languages; and the final chapter addresses the processing of biosequences. The topics chosen are of foundational interest with relatively mature and established results, algorithms and conclusions. The book will be of value to researchers and graduate students in areas such as theoretical computer science, machine learning, computational linguistics, bioinformatics, and cognitive psychology who are engaged with the study of learning, especially of the structure underlying the concept to be learned. Some knowledge of mathematics and theoretical computer science, including formal language theory, automata theory, formal grammars, and algorithmics, is a prerequisite for reading this book.
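As a toy illustration of what inference of regular languages can look like in practice (this is not an algorithm taken from the book, and the sample strings are invented), classical state-merging methods such as RPNI start from a prefix tree acceptor built from the positive examples; the Python sketch below constructs that acceptor and uses it to accept or reject strings:

```python
# Minimal sketch: build a prefix tree acceptor (PTA) from positive examples.
# State-merging algorithms such as RPNI start from the PTA and then merge
# states to generalize; here we only build the PTA and use it directly,
# so it accepts exactly the sample strings.

def build_pta(positive_samples):
    """Return (transitions, accepting_states) of the prefix tree acceptor."""
    transitions = {0: {}}          # state -> {symbol: next_state}
    accepting = set()
    next_state = 1
    for word in positive_samples:
        state = 0
        for symbol in word:
            if symbol not in transitions[state]:
                transitions[state][symbol] = next_state
                transitions[next_state] = {}
                next_state += 1
            state = transitions[state][symbol]
        accepting.add(state)
    return transitions, accepting

def accepts(transitions, accepting, word):
    """Run the deterministic automaton on `word`."""
    state = 0
    for symbol in word:
        if symbol not in transitions[state]:
            return False
        state = transitions[state][symbol]
    return state in accepting

if __name__ == "__main__":
    # Hypothetical positive sample drawn from the language (ab)+
    trans, acc = build_pta(["ab", "abab", "ababab"])
    print(accepts(trans, acc, "abab"))      # True  (seen in the sample)
    print(accepts(trans, acc, "abba"))      # False
    print(accepts(trans, acc, "abababab"))  # False (the PTA alone does not generalize)
```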

21 citations


Journal ArticleDOI
TL;DR: This work introduces a new variant of numerical P systems, numerical P systems with migrating variables (MNP systems), and investigates the computational power of MNP systems both as number generators and as string generators, working in the one-parallel or the sequential mode.

17 citations


Journal ArticleDOI
01 Jan 2016 - Language
TL;DR: This article examines the evolution of the goals of Chomsky's research program into the nature, origin, and use of language, and compares the early goals, first formulated in the 1950s, revised less than a decade later in Chomsky 1965, and summarized in Chomsky & Lasnik 1977, with their reformulation under the minimalist program.
Abstract: In order to provide a framework for evaluating the generative enterprise as discussed in Chomsky’s linguistics, which spans almost four decades—what has been and what remains to be accomplished—this essay examines the evolution of the goals of Chomsky’s research program into the nature, origin, and use of language. It compares the early goals, first formulated in the 1950s and revised less than a decade later in Chomsky 1965 and summarized in Chomsky & Lasnik 1977, with the reformulation under the minimalist program, focusing on the strong minimalist thesis, which motivates a search for principled explanation in terms of interface conditions and general principles of computational efficiency, the latter based on the operation Merge and a theory of phases. This evaluation develops some alternative proposals to the formulations in the volume under review.

8 citations


Journal ArticleDOI
TL;DR: This model, based on P systems with isotonic arrays and isotonic array rules, generates interesting pictures for a given regulated language, where the regulated language is a language of the Chomsky hierarchy.
Abstract: In this paper, we propose a new regulated evolution in P systems with isotonic arrays and isotonic array rules. The regulated language is a language of the Chomsky hierarchy. This model generates interesting pictures for a given regulated language, and its generative capacity is explored.

6 citations


Journal ArticleDOI
06 Jul 2016
TL;DR: In this article, a formal grammar model of call-by-name stack behavior is presented, and it is shown that Turchin's relation can terminate all computations generated by the model.
Abstract: Supercompilation is a program transformation technique that was first described by V. F. Turchin in the 1970s. In supercompilation, Turchin's relation as a similarity relation on call-stack configurations is used both for call-by-value and call-by-name semantics to terminate unfolding of the program being transformed. In this paper, we give a formal grammar model of call-by-name stack behaviour. We classify the model in terms of the Chomsky hierarchy and then formally prove that Turchin's relation can terminate all computations generated by the model.

3 citations


01 Jan 2016

3 citations


Book ChapterDOI
25 Jul 2016
TL;DR: This work considers a variant of rewriting P systems with only regular or linear rewriting rules and alphabetic flat splicing rules, and investigates the language generative power of rewriting P systems with flat splicing rules in comparison with flat splicing systems and the Chomsky hierarchy.
Abstract: Rewriting P systems, as language generating devices, are one of the earliest classes of P systems with structured strings as objects and rewriting rules as evolution rules. Flat splicing is an operation on strings, inspired by a splicing operation on circular strings. In this work, we consider a variant of rewriting P systems with only regular or linear rewriting rules and alphabetic flat splicing rules, and investigate the language generative power of rewriting P systems with flat splicing rules in comparison with flat splicing systems and the Chomsky hierarchy.

2 citations


Book ChapterDOI
11 Jan 2016

2 citations


Journal ArticleDOI
TL;DR: It is Chomsky's great achievement that he instituted rationalist notions such as linguistic creativity and innateness in linguistics, but his mutation theory, on which UG consists of external and internal Merge, or free Merge, cannot provide an adequate explanation for language acquisition.
Abstract: It is Chomsky's great achievement that he instituted rationalist notions such as linguistic creativity and innateness in linguistics. Based on the uniformity and rapidity of language acquisition (or growth), he established a species-specific innate Universal Grammar (UG), the faculty of language. However, Chomsky proposed that UG results from mutations. This mutation theory, in relation to UG consisting of external and internal Merge, or free Merge, cannot provide an adequate explanation for language acquisition. Nor is Chomsky's merge-mutation theory powerful enough to explain abstract structures such as the phonological and syntactic structures of English as well as other languages. More importantly, this theory would be repeatedly refuted by many biologists' claim that mutations hardly add new, complex and specified information. To these fallacies of Chomsky's merge-mutation theory, an adequate answer is "intelligent design" consisting of irreducible and specified complexity. Intelligent design proposes that human beings are designed with some species-specific biological endowments different from those of other animals, which in turn have their own pre-programmed behaviors. Only human beings are designed with UG, as we can see from the numerous cases that Chomsky suggested.

Journal ArticleDOI
TL;DR: The criticism is presented, and the conclusion that the general rule is not compatible with all human languages is supported by studying several human languages as cases, namely Arabic, English, and Russian.
Abstract: Every language has its own characteristics and rules, though all languages share the same components, such as words, sentences, subject, verb, object and so on. Nevertheless, Chomsky suggested a theory on which children acquire language instinctively through a universal grammar common to all human languages. Since its declaration, this theory has encountered criticism from linguists. In this paper, that criticism is presented, and the conclusion that the general rule is not compatible with all human languages is supported by studying several human languages as cases, namely Arabic, English, and Russian.

Proceedings ArticleDOI
01 Nov 2016
TL;DR: An introduction to the fundamentals of formal language theory, the classification, restrictions and representation of languages, and an implementation of an algorithm that analyzes streams of text, accepting or rejecting them according to whether they comply with the predefined rules.
Abstract: Formal language theory plays, in computer science, a fundamental role that enables, among other things, the development of one of the cornerstones of information technology: programming languages. They define the mandatory grammatical rules that programmers need to follow to create the tools that enable humans to interact with machines. Despite its significance, formal language theory is often taken for granted, even by software developers, who regularly follow the rules of their programming domain. Unless developers are creating their own programming language or data structure, or describing the formal background of an existing one, they will not need to dive deep into formal languages, grammars, or automata theory. Those who do need to develop their own rules will first have to understand the theory of formal languages and the limitations it imposes. This paper gives an introduction to the fundamentals of language theory, the classification, restrictions, and representation of languages. Once these ground rules are set, we use a widely known data structure format, the JavaScript Object Notation (JSON), to formally define its grammar rules and automaton. All of this formal background allows us to transform theory into bits, by developing an algorithm that analyzes streams of text, accepting or rejecting them according to whether they comply with the predefined rules. Finally, we analyze the outcomes of this implementation, its benefits, limitations, and alternatives that could have been followed.
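The paper's own implementation is not reproduced in this listing. As a rough sketch of the general idea it describes (not the authors' algorithm), the following Python program is a recursive-descent acceptor for a simplified JSON subset (objects, arrays, escape-free strings, integers, true/false/null) that accepts or rejects an input string:

```python
# Minimal recursive-descent acceptor for a small JSON subset.
# Illustrative only -- a simplified stand-in, not the parser from the paper.

import re

# One token per match: punctuation, escape-free string, integer, or keyword.
TOKEN_RE = re.compile(r'\s*(?:([{}\[\]:,])|("(?:[^"\\])*")|(-?\d+)|(true|false|null))')

def tokenize(text):
    tokens, pos = [], 0
    while pos < len(text):
        m = TOKEN_RE.match(text, pos)
        if not m:
            if text[pos:].strip() == "":   # trailing whitespace is fine
                break
            raise ValueError(f"lexical error at position {pos}")
        tokens.append(next(g for g in m.groups() if g is not None))
        pos = m.end()
    return tokens

class Parser:
    def __init__(self, tokens):
        self.tokens, self.i = tokens, 0

    def peek(self):
        return self.tokens[self.i] if self.i < len(self.tokens) else None

    def eat(self, expected=None):
        tok = self.peek()
        if tok is None or (expected is not None and tok != expected):
            raise ValueError(f"unexpected token {tok!r}")
        self.i += 1
        return tok

    def value(self):
        tok = self.peek()
        if tok == "{":
            self.obj()
        elif tok == "[":
            self.arr()
        elif tok is not None and (tok[0] == '"' or tok in ("true", "false", "null")
                                  or tok.lstrip("-").isdigit()):
            self.eat()
        else:
            raise ValueError(f"unexpected token {tok!r}")

    def obj(self):
        self.eat("{")
        if self.peek() != "}":
            while True:
                if not self.eat().startswith('"'):
                    raise ValueError("object keys must be strings")
                self.eat(":")
                self.value()
                if self.peek() == ",":
                    self.eat(",")
                else:
                    break
        self.eat("}")

    def arr(self):
        self.eat("[")
        if self.peek() != "]":
            while True:
                self.value()
                if self.peek() == ",":
                    self.eat(",")
                else:
                    break
        self.eat("]")

def accepts(text):
    """Return True iff `text` is a single value of the JSON subset."""
    try:
        parser = Parser(tokenize(text))
        parser.value()
        return parser.i == len(parser.tokens)
    except ValueError:
        return False

if __name__ == "__main__":
    print(accepts('{"name": "JSON", "ok": true, "ids": [1, 2, 3]}'))  # True
    print(accepts('{"name": "JSON",}'))                               # False
```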

Journal ArticleDOI
TL;DR: Alternative proofs are given for the characterizations of the structure of palindromic regular and palindromic context-free languages.
Abstract: The characterization of the structure of palindromic regular and palindromic context-free languages was described by S. Horvath, J. Karhumaki, and J. Kleijn in 1987. In this paper, alternative proofs are given for these characterizations.
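For a concrete sense of the objects involved (an illustration, not drawn from the paper): the regular language a* is palindromic, since every word a^n reads the same forwards and backwards, while { a^n b a^n | n >= 0 } is a palindromic language that is context-free but not regular.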

Proceedings ArticleDOI
22 May 2016
TL;DR: It is demonstrated that, despite negative learnability results in the theoretical regime, it is possible to use long short-term memory networks, a type of recurrent neural network (RNN) architecture, to learn, with high accuracy, a grammar for URIs that appear in the Apache HTTP access logs of a particular server.
Abstract: Formal Language Theory for Security (LangSec) applies the tools of theoretical computer science to the problem of protocol design and analysis. In practice, most results have focused on protocol design, showing that by restricting the complexity of protocols it is possible to design parsers with desirable and formally verifiable properties, such as correctness and equivalence. When we consider existing protocols, however, many of these were not subjected to formal analysis during their design, and many are not implemented in a manner consistent with their formal documentation. Determining a grammar for such protocols is the first step in analyzing them, which places this problem in the domain of grammatical inference, for which a deep theoretical literature exists. In particular, although it has been shown that the higher level categories of the Chomsky hierarchy cannot be generically learned, it is also known that certain subcategories of that hierarchy can be effectively learned. In this paper, we summarize some theoretical results for inferring well-known Chomsky grammars, with special attention to context-free grammars (CFGs) and their generated languages (CFLs). We then demonstrate that, despite negative learnability results in the theoretical regime, we can use long short-term memory (LSTM) networks, a type of recurrent neural network (RNN) architecture, to learn a grammar for URIs that appear in Apache HTTP access logs for a particular server with high accuracy. We discuss these results in the context of grammatical inference, and suggest avenues for further research into learnability of a subgroup of the context-free grammars.
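The abstract does not spell out the network or training setup, so the following is only a minimal Python/PyTorch sketch of the general approach it describes: a character-level LSTM trained to accept strings that look like the logged URIs and to reject others. The tiny example strings, the byte-level encoding, and all hyperparameters are assumptions made for the illustration, not details from the paper.

```python
# Minimal sketch of a character-level LSTM acceptor (PyTorch).
# Hypothetical toy data; the paper's actual architecture, features, and
# training data are not reproduced here.

import torch
import torch.nn as nn

class CharLSTMAcceptor(nn.Module):
    def __init__(self, vocab_size=256, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.classify = nn.Linear(hidden_dim, 2)        # accept / reject

    def forward(self, byte_batch):
        # byte_batch: LongTensor of shape (batch, length), zero-padded
        embedded = self.embed(byte_batch)
        _, (hidden, _) = self.lstm(embedded)
        return self.classify(hidden[-1])                # (batch, 2) logits

def encode(strings, length=128):
    """Encode strings as fixed-length byte tensors (truncate or zero-pad)."""
    batch = torch.zeros(len(strings), length, dtype=torch.long)
    for i, s in enumerate(strings):
        data = s.encode("utf-8")[:length]
        batch[i, :len(data)] = torch.tensor(list(data))
    return batch

# Hypothetical training set: positive URIs would come from the access logs,
# negatives from strings generated by the experimenter.
uris = ["/index.html", "/api/v1/users?id=42"]
non_uris = ["<<garbage>>", "not a uri at all"]
inputs = encode(uris + non_uris)
labels = torch.tensor([1] * len(uris) + [0] * len(non_uris))

model = CharLSTMAcceptor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for _ in range(50):                                     # tiny toy training run
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), labels)
    loss.backward()
    optimizer.step()

print(model(encode(["/api/v1/users?id=7"])).argmax(dim=1))  # predicted class
```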

Journal ArticleDOI
TL;DR: In The Cultural Logic of Computation David Golumbia offers a critique of Chomsky and of computational linguistics that is rendered moot by his poor understanding of those ideas.
Abstract: In The Cultural Logic of Computation David Golumbia offers a critique of Chomsky and of computational linguistics that is rendered moot by his poor understanding of those ideas. He fails to understand and appreciate the distinction between the abstract theory of computation and real computation, such as that involved in machine translation; he confuses the Chomsky hierarchy of language types with hierarchical social organization; he misperceives the conceptual value of computation as a way of thinking about the mind; he ignores the standard account of the defunding of machine translation in the 1960s (it wasn’t working) in favor of obscure political speculations; he offers casual remarks about the demographics of linguistics without any evidence, thus betraying his ideological preconceptions; and he seems to hold a view of analog phenomena that is at odds with the analog/digital distinction as it is used in linguistics, computation, and the cognitive sciences.