
Formal language

About: Formal language is a research topic. Over the lifetime, 5763 publications have been published within this topic receiving 154114 citations.


Papers
Book
18 Mar 1995
TL;DR: A textbook introduction to formal languages and automata, covering alphabets and languages, regular and context-free languages, Turing machines, decidability, and an introduction to computational complexity.
Abstract: 0. Mathematical Preliminaries. 1. Alphabets and Languages. 2. Regular Languages. 3. Context Free Languages. 4. Turing Machines. 5. Turing Machines and Languages. 6. Decidability. 7. An Introduction to Computational Complexity. Bibliography.

36 citations

Book Chapter
30 Aug 1999
TL;DR: The model of generalized P-systems (GP-systems for short), a new model for computation using membrane structures recently introduced by Gheorghe Păun, is considered; it allows for the simulation of graph controlled grammars of arbitrary type based on productions working on single objects.
Abstract: We consider a variant of P-systems, a new model for computations using membrane structures and recently introduced by Gheorghe Paun. Using the membranes as a kind of filter for specific objects when transferring them into an inner compartment turns out to be a very powerful mechanism in combination with suitable rules to be applied within the membranes. The model of generalized P-systems, GP-systems for short, considered in this paper allows for the simulation of graph controlled grammars of arbitrary type based on productions working on single objects; for example, the general results we establish in this paper can immediately be applied to the graph controlled versions of context-free string grammars, n-dimensional #-context-free array grammars, and elementary graph grammars.
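
To make the membrane-transfer idea concrete, the following is a minimal, hypothetical Python sketch under simplified assumptions, not the formal GP-system definition from the paper; the class and function names (Membrane, step) are illustrative. Each membrane holds a multiset of objects, and a rule rewrites an object and sends the result to the current membrane ("here"), outward ("out"), or into a named inner membrane, so an inner compartment only receives the objects explicitly targeted at it, acting as a filter.

```python
from collections import Counter

# Hypothetical sketch of a P-system-style computation (not the paper's model):
# rules map a symbol to a list of (product, target) pairs, where target is
# "here", "out", or the name of a directly enclosed membrane.
class Membrane:
    def __init__(self, name, objects, rules, children=()):
        self.name = name
        self.objects = Counter(objects)      # multiset of objects
        self.rules = rules                   # dict: symbol -> list[(symbol, target)]
        self.children = {c.name: c for c in children}
        for c in self.children.values():
            c.parent = self
        self.parent = None

def step(membrane):
    """One rewriting step; children go first so objects sent inward
    during this step are only rewritten in the next step."""
    for child in membrane.children.values():
        step(child)
    produced_here = Counter()
    for sym in list(membrane.objects):
        if sym in membrane.rules:
            count = membrane.objects.pop(sym)     # consume every copy of sym
            for out_sym, target in membrane.rules[sym]:
                for _ in range(count):
                    if target == "here":
                        produced_here[out_sym] += 1
                    elif target == "out" and membrane.parent is not None:
                        membrane.parent.objects[out_sym] += 1
                    elif target in membrane.children:
                        # the membrane acts as a filter: only objects sent
                        # with this target enter the inner compartment
                        membrane.children[target].objects[out_sym] += 1
    membrane.objects += produced_here

# Toy run: 'a' objects in the skin become 'b' and are filtered into the
# inner membrane, where they become terminal 'c' objects.
inner = Membrane("inner", "", {"b": [("c", "here")]})
skin = Membrane("skin", "aaa", {"a": [("b", "inner")]}, children=[inner])
step(skin); step(skin)
print(inner.objects)   # Counter({'c': 3})
```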

36 citations

Proceedings Article
01 Nov 2018
TL;DR: This work investigates whether recurrent neural networks are capable of learning the rules of opening and closing brackets by applying them to synthetic Dyck languages that consist of different types of brackets, and provides an analysis of the statistical properties of these languages as a baseline.
Abstract: Many natural and formal languages contain words or symbols that require a matching counterpart for making an expression well-formed. The combination of opening and closing brackets is a typical example of such a construction. Due to their commonness, the ability to follow such rules is important for language modeling. Currently, recurrent neural networks (RNNs) are extensively used for this task. We investigate whether they are capable of learning the rules of opening and closing brackets by applying them to synthetic Dyck languages that consist of different types of brackets. We provide an analysis of the statistical properties of these languages as a baseline and show strengths and limits of Elman-RNNs, GRUs and LSTMs in experiments on random samples of these languages. In terms of perplexity and prediction accuracy, the RNNs get close to the theoretical baseline in most cases.
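
As an illustration of the kind of synthetic data involved, the Python sketch below (an assumption about the setup, not the authors' code; generate_dyck and is_dyck are illustrative names) generates well-formed words of a Dyck language with several bracket types and checks membership with a stack.

```python
import random

# Bracket types of the Dyck language: opening symbol -> matching closing symbol.
PAIRS = {"(": ")", "[": "]", "{": "}"}

def generate_dyck(n_pairs, rng=random):
    """Return a random well-formed word containing n_pairs bracket pairs."""
    out, stack = [], []
    opens_left = n_pairs
    while opens_left > 0 or stack:
        # Open a new bracket while openings remain; otherwise close the last one.
        if opens_left > 0 and (not stack or rng.random() < 0.5):
            opening = rng.choice(list(PAIRS))
            out.append(opening)
            stack.append(PAIRS[opening])
            opens_left -= 1
        else:
            out.append(stack.pop())
    return "".join(out)

def is_dyck(word):
    """Stack-based membership test for the multi-bracket Dyck language."""
    stack = []
    for ch in word:
        if ch in PAIRS:
            stack.append(PAIRS[ch])
        elif not stack or ch != stack.pop():
            return False
    return not stack

assert all(is_dyck(generate_dyck(10)) for _ in range(100))
```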

36 citations

Book Chapter
TL;DR: Two types of languages defined by a string through iterative factor duplications, inspired by the production of tandem repeats in the evolution of DNA, are considered, and some conditions for the non-regularity of these languages are given.
Abstract: We consider two types of languages defined by a string through iterative factor duplications, inspired by the process of tandem repeats production in the evolution of DNA. We investigate some decidability matters concerning the unbounded duplication languages and then fix the place of bounded duplication languages in the Chomsky hierarchy by showing that all these languages are context-free. We give some conditions for the non-regularity of these languages. Finally, we discuss some open problems and directions for further research.
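
For concreteness, the sketch below (a hypothetical illustration, not the paper's construction; the function names are illustrative) enumerates the words reachable from a start string by iterated factor duplication, i.e. rewriting a word x u z into x u u z, with an optional bound k on the factor length for the bounded case.

```python
def duplication_step(word, k=None):
    """All words reachable from `word` by duplicating one factor u,
    i.e. rewriting x u z into x u u z, with |u| <= k if k is given."""
    results = set()
    n = len(word)
    for i in range(n):
        for j in range(i + 1, n + 1):
            if k is not None and j - i > k:
                break
            u = word[i:j]
            results.add(word[:j] + u + word[j:])
    return results

def duplication_language(word, steps, k=None):
    """Words reachable from `word` in at most `steps` duplication steps."""
    lang, frontier = {word}, {word}
    for _ in range(steps):
        frontier = {w2 for w in frontier for w2 in duplication_step(w, k)} - lang
        lang |= frontier
    return lang

# Words reachable from "ab" in at most one step: ['aab', 'ab', 'abab', 'abb']
print(sorted(duplication_language("ab", 1)))
```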

36 citations

01 Jan 2001
TL;DR: This paper shows that the analysis presented in earlier papers can be extended in a reasonable way to several well-known cases that were unaccounted for in the original discussion.
Abstract: In recent papers (Kroch and Joshi 1985, Kroch 1987) we claimed that, if one adopts the Tree Adjoining Grammar (TAG) formalism of Joshi, Levy, and Takahashi (1975) as the formal language of syntax, the ungrammaticality of extractions from wh-islands can be made to follow in a straightforward way from the nonexistence of multiple wh-fronting in simple questions. The analysis we gave was oversimplified, however, because it wrongly predicted all wh-island extractions to be ungrammatical, and we know that certain of them are well-formed, not only in languages like Swedish or Italian, but also in English (Chomsky 1986, Grimshaw 1986). Nevertheless, the analysis we gave had the attraction of providing a simple structural explanation for the wh-island effect, and it generalized directly to such other manifestations of subjacency as the Complex Noun Phrase Constraint (CNPC). In this paper we show that the analysis presented in our earlier papers can be extended in a reasonable way to several cases that were unaccounted for in the original discussion. In particular, we discuss such well-known examples as the following:

35 citations


Network Information
Related Topics (5)
Data structure: 28.1K papers, 608.6K citations, 87% related
Time complexity: 36K papers, 879.5K citations, 86% related
Graph (abstract data type): 69.9K papers, 1.2M citations, 85% related
Semantics: 24.9K papers, 653K citations, 85% related
Component-based software engineering: 24.2K papers, 461.9K citations, 83% related
Performance Metrics
No. of papers in the topic in previous years
Year    Papers
2023    7
2022    37
2021    113
2020    175
2019    173
2018    142