
Showing papers on "Context-sensitive grammar" published in 2012


BookDOI
01 Jan 2012
TL;DR: A method for harvesting fowl which includes the steps of horizontally extending beneath the fowl, in a confined area, a plurality of lifting fingers.
Abstract: A method for harvesting fowl which includes the steps of horizontally extending beneath the fowl, in a confined area, a plurality of lifting fingers; raising and pivoting the fingers to lift the fowl and supporting them at least in part upon a continuously moving structure; continuing to move the fowl on the continuously moving structure to convey the fowl to a cooping location; and moving the fowl into a coop from the continuously moving structure.

100 citations


Journal ArticleDOI
TL;DR: It is shown that every linear straight-line context-free tree grammar can be transformed in polynomial time into a monadic (and linear) one.

45 citations


Proceedings Article
22 Jul 2012
TL;DR: A simple EM-based grammar induction algorithm for Combinatory Categorial Grammar (CCG) that achieves state-of-the-art performance by relying on a minimal number of very general linguistic principles, and discovers all categories automatically.
Abstract: We present a simple EM-based grammar induction algorithm for Combinatory Categorial Grammar (CCG) that achieves state-of-the-art performance by relying on a minimal number of very general linguistic principles. Unlike previous work on unsupervised parsing with CCGs, our approach has no prior language-specific knowledge, and discovers all categories automatically. Additionally, unlike other approaches, our grammar remains robust when parsing longer sentences, performing as well as or better than other systems. We believe this is a natural result of using an expressive grammar formalism with an extended domain of locality.

40 citations


Book ChapterDOI
05 Mar 2012
TL;DR: A new connection between formal language theory and proof theory is introduced and one of the most fundamental proof transformations is shown to correspond exactly to the computation of the language of a certain class of tree grammars.
Abstract: We introduce a new connection between formal language theory and proof theory. One of the most fundamental proof transformations in a class of formal proofs is shown to correspond exactly to the computation of the language of a certain class of tree grammars. Translations in both directions, from proofs to grammars and from grammars to proofs, are provided. This correspondence allows theoretical as well as practical applications.

30 citations


Proceedings Article
08 Jul 2012
TL;DR: Simple context-free tree grammars strongly lexicalize tree adjoining grammars and themselves: they admit a normal form in which each production contains a lexical symbol.
Abstract: Recently, it was shown (KUHLMANN, SATTA: Tree-adjoining grammars are not closed under strong lexicalization. Comput. Linguist., 2012) that finitely ambiguous tree adjoining grammars cannot be transformed into a normal form (preserving the generated tree language) in which each production contains a lexical symbol. A more powerful model, the simple context-free tree grammar, admits such a normal form. It can be effectively constructed and the maximal rank of the nonterminals only increases by 1. Thus, simple context-free tree grammars strongly lexicalize tree adjoining grammars and themselves.

18 citations


Book ChapterDOI
Ryo Yoshinaka
05 Mar 2012
TL;DR: This paper shows how the opposite primal and dual approaches to distributional learning, which model and exploit the relation between strings and contexts, are integrated into single learning algorithms that learn quite rich classes of context-free grammars.
Abstract: Recently several "distributional learning algorithms" have been proposed and have made great success in learning different subclasses of context-free grammars. The distributional learning models and exploits the relation between strings and contexts that form grammatical sentences in the language of the learning target. There are two main approaches. One, which we call primal, constructs nonterminals whose language is supposed to be characterized by strings. The other, which we call dual, uses contexts to characterize the language of each nonterminal of the conjecture grammar. This paper shows how those opposite approaches are integrated into single learning algorithms that learn quite rich classes of context-free grammars.

17 citations
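
The abstract above turns on the duality between substrings and the contexts in which they occur. As a minimal illustration (hypothetical helper names, not the paper's algorithms), the Python sketch below computes, from a finite sample, the two observation functions the primal and dual approaches build on: the contexts of a given substring, and the substrings that fill a given context.

# Minimal sketch of the string/context duality exploited by distributional
# learning (hypothetical helpers, not the paper's algorithms). A context is a
# pair (l, r); a substring s "fits" it when l + s + r is a sentence of the sample.

def contexts_of(s, sample):
    """All contexts (l, r) such that l + s + r is in the sample."""
    result = set()
    for w in sample:
        for i in range(len(w) - len(s) + 1):
            if w[i:i + len(s)] == s:
                result.add((w[:i], w[i + len(s):]))
    return result

def strings_of(context, sample):
    """All substrings s such that l + s + r is in the sample."""
    l, r = context
    return {w[len(l):len(w) - len(r)]
            for w in sample
            if w.startswith(l) and w.endswith(r) and len(w) >= len(l) + len(r)}

sample = {"ab", "aabb", "aaabbb"}
print(contexts_of("ab", sample))       # ('', ''), ('a', 'b'), ('aa', 'bb') in some order
print(strings_of(("a", "b"), sample))  # '', 'ab', 'aabb' in some order

In this picture, the primal approach characterizes a nonterminal by a set of strings like the second result, while the dual approach characterizes it by a set of contexts like the first.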


Proceedings ArticleDOI
26 Oct 2012
TL;DR: A Watson-Crick regular grammar, which has rules as in a regular grammar but involves double-stranded strings, is introduced; some properties of these grammars are obtained, and an application to the generation of chain code pictures is indicated.
Abstract: Motivated by Watson-Crick automata, we introduce here a grammar counterpart called a Watson-Crick regular grammar that has rules as in a regular grammar but involves double stranded strings. The language generated by this grammar consists of strings in the upper strands of the double stranded strings related by a complementarity relation. We obtain some properties of these grammars and also indicate an application to generation of chain code pictures.

15 citations
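
As a rough illustration of the double-stranded setting described above (a sketch under an assumed DNA-style complementarity relation, not code from the paper), the snippet below checks that two strands are related position by position and reads the generated word off the upper strand.

# Sketch only: a double-stranded string is a pair of equal-length strands that
# must be related symbol-by-symbol by a complementarity relation rho; the word
# contributed to the language is the upper strand.

RHO = {("a", "t"), ("t", "a"), ("c", "g"), ("g", "c")}  # assumed relation

def well_formed(upper, lower, rho=RHO):
    return len(upper) == len(lower) and all((u, l) in rho for u, l in zip(upper, lower))

def generated_word(upper, lower):
    if not well_formed(upper, lower):
        raise ValueError("strands violate the complementarity relation")
    return upper

print(generated_word("acgt", "tgca"))  # acgt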


Journal ArticleDOI
TL;DR: It is proved that every recursively enumerable language can be generated by a one-sided random context grammar with no more than ten nonterminals, and the notion of a right random context nonterminal is introduced.
Abstract: In the present paper, we study the nonterminal complexity of one-sided random context grammars. More specifically, we prove that every recursively enumerable language can be generated by a one-sided random context grammar with no more than ten nonterminals. An analogous result holds for thirteen nonterminals in terms of these grammars with the set of left random context rules coinciding with the set of right random context rules. Furthermore, we introduce the notion of a right random context nonterminal, defined as a nonterminal that appears on the left-hand side of a right random context rule. We demonstrate how to convert any one-sided random context grammar G to an equivalent one-sided random context grammar H with two right random context nonterminals. An analogous conversion is given in terms of (1) propagating one-sided random context grammars and (2) left random context nonterminals. In the conclusion, two open problems are stated.

11 citations


Journal ArticleDOI
TL;DR: It is proved that one-sided forbidding grammars are equivalent to selective substitution grammars, and that those whose left and right forbidding rule sets coincide characterize the family of context-free languages.
Abstract: In one-sided forbidding grammars, the set of rules is divided into the set of left forbidding rules and the set of right forbidding rules. A left forbidding rule can rewrite a nonterminal if each of its forbidding symbols is absent to the left of the rewritten symbol in the current sentential form, while a right forbidding rule is applied analogously except that this absence is verified to the right. Apart from this, they work like ordinary forbidding grammars. As its main result, this paper proves that one-sided forbidding grammars are equivalent to selective substitution grammars. This equivalence is established in terms of grammars with and without erasing rules. Furthermore, this paper proves that one-sided forbidding grammars in which the set of left forbidding rules coincides with the set of right forbidding rules characterize the family of context-free languages. In the conclusion, the significance of the achieved results is discussed.

11 citations
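
The applicability test described in this abstract is easy to make concrete. The sketch below is an illustration under an assumed encoding (rules as tuples, sentential forms as strings of one-character symbols), not code from the paper: a left forbidding rule may rewrite an occurrence of its nonterminal only if none of its forbidding symbols appears to the left of that occurrence, and a right forbidding rule checks the right side instead.

# Illustrative applicability check for one-sided forbidding rules.
# A rule is (lhs, rhs, forbidden, side); symbols are single characters.

def applicable(rule, form, pos):
    lhs, rhs, forbidden, side = rule
    if form[pos] != lhs:
        return False
    checked = form[:pos] if side == "left" else form[pos + 1:]
    return not any(sym in forbidden for sym in checked)

def apply_rule(rule, form, pos):
    lhs, rhs, forbidden, side = rule
    assert applicable(rule, form, pos)
    return form[:pos] + rhs + form[pos + 1:]

rule = ("A", "aB", {"C"}, "left")   # left forbidding rule A -> aB with forbidding set {C}
print(applicable(rule, "aAC", 1))   # True: no C occurs to the left of A
print(apply_rule(rule, "aAC", 1))   # aaBC
print(applicable(rule, "CaA", 2))   # False: C occurs to the left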


Journal ArticleDOI
TL;DR: This paper studies context-free grammars with a simpler restriction, where only the symbols to be rewritten are restricted, not the rules, in the sense that any rule rewriting the chosen nonterminal can be applied.

10 citations


Journal ArticleDOI
TL;DR: It has been claimed in the literature that for every tree-adjoining grammar, one can construct a strongly equivalent lexicalized version, but it is shown that such a procedure does not exist: Tree-adjoining grammars are not closed under strong lexicalization.
Abstract: A lexicalized tree-adjoining grammar is a tree-adjoining grammar where each elementary tree contains some overt lexical item. Such grammars are being used to give lexical accounts of syntactic phenomena, where an elementary tree defines the domain of locality of the syntactic and semantic dependencies of its lexical items. It has been claimed in the literature that for every tree-adjoining grammar, one can construct a strongly equivalent lexicalized version. We show that such a procedure does not exist: Tree-adjoining grammars are not closed under strong lexicalization.

Book ChapterDOI
05 Mar 2012
TL;DR: This paper gives two equivalent definitions of the model and establishes its basic properties, including a transformation to a normal form, a cubic-time parsing algorithm, and another recognition algorithm working in linear space.
Abstract: Conjunctive grammars (Okhotin, 2001) are an extension of the standard context-free grammars with a conjunction operation, which maintains most of their practical properties, including many parsing algorithms. This paper introduces a further extension to the model, which is equipped with quantifiers for referring to the left context, in which the substring being defined does occur. For example, a rule A → a & ◺B defines a string a, as long as it is preceded by any string defined by B. The paper gives two equivalent definitions of the model--by logical deduction and by language equations--and establishes its basic properties, including a transformation to a normal form, a cubic-time parsing algorithm, and another recognition algorithm working in linear space.
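
The semantics of the example rule A → a & ◺B quoted above can be illustrated directly: an occurrence of the string a is derived by A only if its entire left context is a string defined by B. The sketch below uses a toy stand-in for the language of B and is not the paper's deduction system or parsing algorithm.

# Illustrative check for a rule of the form A -> a & <|B: the substring w[i:j]
# is derived by A iff it equals "a" and the whole prefix w[:i] is in L(B).
# Here L(B) is a toy stand-in (all strings over {b}).

def in_B(s):
    return all(ch == "b" for ch in s)   # assumed toy language for B

def A_derives(w, i, j):
    return w[i:j] == "a" and in_B(w[:i])

print(A_derives("bba", 2, 3))   # True: "a" is preceded by "bb", which B defines
print(A_derives("cba", 2, 3))   # False: the left context "cb" is not in L(B)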

Journal ArticleDOI
TL;DR: It is shown that many decision problems can be decided in polynomial time for Muller context-free grammars in normal form, and a limitedness property is established: if the language generated by a grammar contains only scattered words, then there is an integer n such that each word of the language has Hausdorff rank at most n.

Proceedings Article
01 Jan 2012
TL;DR: The approach to Object Grammars is implemented as one of the foundations of the Ensō system and the utility of the approach is illustrated by showing how it enables definition and composition of domain-specific languages (DSLs).
Abstract: Object Grammars define mappings between text and object graphs. Parsing recognizes syntactic features and creates the corresponding object structure. In the reverse direction, formatting recognizes object graph features and generates an appropriate textual presentation. The key to Object Grammars is the expressive power of the mapping, which decouples the syntactic structure from the graph structure. To handle graphs, Object Grammars support declarative annotations for resolving textual names that refer to arbitrary objects in the graph structure. Predicates on the semantic structure provide additional control over the mapping. Furthermore, Object Grammars are compositional so that languages may be defined in a modular fashion. We have implemented our approach to Object Grammars as one of the foundations of the Ensō system and illustrate the utility of our approach by showing how it enables definition and composition of domain-specific languages (DSLs).
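
As a toy illustration of the text-to-object-graph mapping described above (a sketch in the spirit of Object Grammars, not Ensō's actual notation), the snippet below parses a small declaration language in which edge lines refer to points by name; resolving those names to object references turns the parsed structure into a graph rather than a tree.

# Sketch only: parse text into objects, resolving textual names to object references.

from dataclasses import dataclass

@dataclass
class Point:
    name: str
    x: int
    y: int

@dataclass
class Edge:
    src: Point   # resolved references to Point objects, not strings
    dst: Point

def parse(text):
    points, edges = {}, []
    for line in text.strip().splitlines():
        kind, *args = line.split()
        if kind == "point":
            name, x, y = args
            points[name] = Point(name, int(x), int(y))
        elif kind == "edge":
            src, dst = args
            edges.append(Edge(points[src], points[dst]))  # name resolution
    return list(points.values()), edges

points, edges = parse("point p1 0 0\npoint p2 3 4\nedge p1 p2")
print(edges[0].dst.y)   # 4: the edge holds a reference to the p2 object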

30 Sep 2012
TL;DR: This paper revisits and explores an exemplar shape grammar from the literature to illustrate the use of different grammar formalisms and considers the implementation of rule application within a sortal grammar interpreter.
Abstract: Grammar formalisms for design come in a large variety, requiring different representations of the entities being generated, and different interpretative mechanisms for this generation. Most examples of shape grammars rely on labeled shapes, a combination of line segments and labeled points. Color grammars extend the shape grammar formalism to allow for a variety of qualitative aspects of design, such as color, to be integrated in the rules of a shape grammar. Sortal grammars consider a compositional approach to the representational structures underlying (augmented) shape grammars, allowing for a variety of grammar formalisms to be defined and explored. In this paper, we revisit and explore an exemplar shape grammar from the literature to illustrate the use of different grammar formalisms and consider the implementation of rule application within a sortal grammar interpreter.

Journal ArticleDOI
TL;DR: A new algorithm solving the membership problem for context-free grammars generating strings over a one-letter alphabet is developed, which is based upon fast multiplication of integers, works in time |G| · n log³ n · 2^{O(log* n)}, and is applicable to context-free grammars augmented with Boolean operations, known as Boolean grammars.

Journal ArticleDOI
TL;DR: Some results on the power of external contextual grammars with regular commutative, regular circular, definite, suffix-free, ordered, combinational, nilpotent, and union-free selection languages are given.

Patent
Yusuke Doi, Yumiko Sakai
18 Oct 2012
TL;DR: In this paper, an EXI decoder is provided with a grammar store storing a first set of type grammars generated from a basic XML schema and a second set of type grammars generated from an extension schema, from which the type grammars common to the first set are excluded.
Abstract: There is provided with an EXI decoder, including: a grammar store storing a first set of type grammars and a second set of type grammars, the first set of type grammars being type grammars generated according to the EXI specification from a basic schema of an XML and the second set of type grammars being those type grammars, among a set of type grammars generated according to the EXI specification from an extension schema of XML, that are not common to the first set of type grammars; a stream input unit to receive an EXI stream; and a parser unit decoding the EXI stream, when the EXI stream is compatible with the basic schema, based on the first set of type grammars, and, when the EXI stream is compatible with the extension schema, based on the second set of type grammars and the common type grammars.
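
A hedged sketch of the grammar-selection logic this abstract describes (hypothetical class and method names, not an actual EXI implementation): streams compatible with the basic schema are decoded with the first set of type grammars, while streams compatible with the extension schema are decoded with the reduced second set together with the shared grammars, so the common grammars are stored only once.

# Sketch of the described grammar selection (hypothetical names, illustration only).

class GrammarStore:
    def __init__(self, basic, extension_only, common):
        self.basic = basic                    # generated from the basic schema
        self.extension_only = extension_only  # extension-schema grammars, common ones excluded
        self.common = common                  # grammars shared by both schemas (subset of basic)

    def grammars_for(self, schema):
        if schema == "basic":
            return self.basic
        if schema == "extension":
            return self.extension_only | self.common
        raise ValueError("unknown schema")

store = GrammarStore(basic={"G1", "G2"}, extension_only={"G3"}, common={"G2"})
print(store.grammars_for("extension"))   # {'G2', 'G3'} (set order may vary)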

Journal ArticleDOI
20 Jun 2012
TL;DR: The notion of a new transducer is introduced as a two-component system consisting of a finite automaton and a context-free grammar; with appearance checking, the system can accept and generate all recursively enumerable languages.
Abstract: This paper introduces the notion of a new transducer as a two-component system, which consists of a finite automaton and a context-free grammar. In essence, while the automaton reads its input string, the grammar produces its output string, and their cooperation is controlled by a set, which restricts the usage of their rules. From a theoretical viewpoint, the present paper discusses the power of this system working in an ordinary way as well as in a leftmost way. In addition, the paper introduces an appearance checking, which allows us to check whether some symbols are present in the rewritten string, and studies its effect on the power. It achieves the following three main results. First, the system generates and accepts languages defined by matrix grammars and partially blind multi-counter automata, respectively. Second, if we place a leftmost restriction on derivation in the context-free grammar, both the accepting and the generating power of the system are equal to the generative power of context-free grammars. Third, the system with appearance checking can accept and generate all recursively enumerable languages. From a more pragmatic viewpoint, this paper describes several linguistic applications. Special attention is paid to Japanese-Czech translation.
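
A rough sketch of one cooperation step of the two-component system described above (an assumed encoding, with the grammar rewriting the leftmost occurrence of the rule's left-hand side; not the paper's exact definition): the automaton consumes an input symbol while the grammar rewrites its output sentential form, and the pair of rules used must belong to the control set.

# Illustrative synchronized step: fa_rules maps ids to (state, symbol, next_state),
# cf_rules maps ids to (nonterminal, right-hand side), and control is the set of
# rule-id pairs that may be used together.

def step(state, remaining, form, fa_rules, cf_rules, control):
    for fa_id, (p, a, q) in fa_rules.items():
        if p != state or not remaining.startswith(a):
            continue
        for cf_id, (lhs, rhs) in cf_rules.items():
            if (fa_id, cf_id) not in control:
                continue
            i = next((k for k, s in enumerate(form) if s == lhs), None)
            if i is None:
                continue
            return q, remaining[len(a):], form[:i] + list(rhs) + form[i + 1:]
    return None   # no synchronized step is possible

fa_rules = {"r1": ("q0", "a", "q0")}
cf_rules = {"p1": ("S", "bS"), "p2": ("S", "b")}
control = {("r1", "p1"), ("r1", "p2")}
print(step("q0", "aa", ["S"], fa_rules, cf_rules, control))   # ('q0', 'a', ['b', 'S'])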

Journal ArticleDOI
01 Aug 2012
TL;DR: A fundamental framework of fuzzy grammars based on lattices is established, and it is proved that l-VFAs, l-valued deterministic finite automata, l-RGs and l-DRGs are equivalent based on the depth-first way.
Abstract: In this paper, on the basis of breadth-first and depth-first ways, we establish a fundamental framework of fuzzy grammars based on lattices, which provides a necessary tool for the analysis of fuzzy automata. The relationship among finite automata with membership values in lattices (l-VFAs), lattice-valued regular grammars (l-RGs) and lattice-valued deterministic regular grammars (l-DRGs) is investigated. It is demonstrated that, based on each semantic way, l-VFAs and l-RGs are equivalent in the sense that they accept or generate the same classes of fuzzy languages. Furthermore, it is proved that l-VFAs, l-valued deterministic finite automata, l-RGs and l-DRGs are equivalent based on the depth-first way. For any l-RG, the language based on the breadth-first way coincides with the language based on the depth-first way if and only if the truth-valued domain l is a distributive lattice.
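
As a small illustration of how such an automaton assigns a degree of membership to a string (a sketch using the unit interval with min as meet and max as join, as a stand-in for the general lattice l; not the paper's definitions), the snippet below takes the join over all runs of the meet of the transition weights along each run.

# Sketch: membership degree of a word in a lattice-valued automaton.
# delta[(p, a, q)] is the weight of the transition p --a--> q; missing
# transitions have weight 0.0 (the bottom element of this toy lattice).

from itertools import product

def degree(word, states, delta, initial, final):
    best = 0.0                                   # bottom (join identity)
    for run in product(states, repeat=len(word) + 1):
        if run[0] != initial or run[-1] not in final:
            continue
        weight = 1.0                             # top (meet identity)
        for p, a, q in zip(run, word, run[1:]):
            weight = min(weight, delta.get((p, a, q), 0.0))   # meet along the run
        best = max(best, weight)                              # join over runs
    return best

delta = {("q0", "a", "q0"): 0.9, ("q0", "b", "q1"): 0.6}
print(degree("ab", ["q0", "q1"], delta, "q0", {"q1"}))   # 0.6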

Journal Article
TL;DR: The notion of new synchronous grammars as systems consisting of two context-free grammars with linked rules instead of linked nonterminals is introduced, and linguistic application prospects are presented for natural language translation between Japanese and English.
Abstract: This paper introduces the notion of new synchronous grammars as systems consisting of two context-free grammars with linked rules instead of linked nonterminals. Further, synchronous versions of regulated grammars, specifically matrix grammars and scattered context grammars, are discussed. From a theoretical point of view, this paper discusses the power of these synchronous grammars. It demonstrates the following main results. First, if we synchronize context-free grammars by linking rules, the grammar generates the languages defined by matrix grammars. Second, if we synchronize matrix grammars by linking matrices, the generative power remains unchanged. Third, synchronous scattered context grammars generate the class of recursively enumerable languages. From a more practical viewpoint, this paper presents linguistic application prospects. The focus is on natural language translation between Japanese and English.
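
A minimal sketch of the synchronization mechanism described above (an assumed encoding with single-character nonterminals and leftmost rewriting in both components, not the paper's exact definition): each link pairs a rule of the first grammar with a rule of the second, and a derivation step applies both rules of a link, one to each sentential form.

# Illustrative synchronized derivation step for two context-free grammars with
# linked rules; nonterminals are single uppercase characters.

def sync_step(form1, form2, link):
    (lhs1, rhs1), (lhs2, rhs2) = link
    i, j = form1.find(lhs1), form2.find(lhs2)
    if i < 0 or j < 0:
        return None   # the link is not applicable
    return (form1[:i] + rhs1 + form1[i + 1:],
            form2[:j] + rhs2 + form2[j + 1:])

link1 = (("S", "aSb"), ("S", "Sc"))   # S -> aSb in grammar 1 linked with S -> Sc in grammar 2
link2 = (("S", "ab"), ("S", "c"))     # S -> ab linked with S -> c

forms = sync_step("S", "S", link1)    # ('aSb', 'Sc')
forms = sync_step(*forms, link2)
print(forms)                          # ('aabb', 'cc')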

Book ChapterDOI
23 Jul 2012
TL;DR: It is demonstrated that the family of languages generated by unambiguous conjunctive grammars with 1 nonterminal symbol is strictly included in the languages generated by 2-nonterminal grammars, which is in turn a proper subset of the family generated using 3 or more nonterminals.
Abstract: It is demonstrated that the family of languages generated by unambiguous conjunctive grammars with 1 nonterminal symbol is strictly included in the languages generated by 2-nonterminal grammars, which is in turn a proper subset of the family generated using 3 or more nonterminal symbols. This hierarchy is established by considering grammars over a one-letter alphabet, for which it is shown that 1-nonterminal grammars generate only regular languages, 2-nonterminal grammars generate some non-regular languages, but all of them have upper density zero, while 3-nonterminal grammars may generate some non-regular languages of non-zero density. It is also shown that the equivalence problem for 2-nonterminal grammars is undecidable.

Book ChapterDOI
02 Jul 2012
TL;DR: A normal form of CDG similar to Greibach normal form for cf-grammars is defined and an effective algorithm which transforms any CDG into an equivalent CDG in the normal form is proposed.
Abstract: Categorial Dependency Grammars (CDG) studied in this paper are categorial grammars expressing projective and discontinuous dependencies, stronger than cf-grammars and presumably nonequivalent to mild context-sensitive grammars. We define a normal form of CDG similar to Greibach normal form for cf-grammars and propose an effective algorithm which transforms any CDG into an equivalent CDG in the normal form. A class of push-down automata with independent counters is defined and it is proved that they accept the class of CDG-languages. We present algorithms that transform any CDG into an automaton and vice versa.


Journal ArticleDOI
TL;DR: The present paper examines the source of the expressive power of graph grammars by analyzing the complexity and decidability of the so-called k-connecting Lin-A-NLC (k-Lin-A-NLC) grammars, in which the right-hand side of each production contains at most k nodes that can be connected to outside nodes.

Journal Article
TL;DR: Based on the analysis and induction of the key characteristics of context-sensitive graph grammars, the relationships between their expressiveness are uncovered and proved by constructing formalism-transforming algorithms.
Abstract: Context-sensitive graph grammars are formal tools used for specifying visual languages. In order to intuitively describe and parse visual languages, current research has stressed the formalisms and algorithms of graph grammars, but has neglected the comparison of their expressiveness. Based on the analysis and induction of the key characteristics of context-sensitive graph grammars, the relationships between their expressiveness are uncovered and proved in this paper by constructing formalism-transforming algorithms. Moreover, the proposed algorithms correlate with these formalisms, thus facilitating the usage of context-sensitive graph grammars, as alternative formalisms rather than merely one can be chosen to separately specify and parse visual objects in applications.

Journal ArticleDOI
TL;DR: An algorithm is presented which solves the membership problem for Petri net controlled grammars without λ-rules and cyclic rules; the languages of such grammars are included in the class of context-sensitive languages.

Book ChapterDOI
14 Aug 2012
TL;DR: It is shown that the class of languages that are generated by centralized PC grammar systems with context-sensitive components working in nonreturning mode coincides with the complexity class NEXT = ∪_{c≥1} NTIME(2^{c·n}).
Abstract: It is known that in returning mode centralized PC grammar systems with context-sensitive components only generate context-sensitive languages. Here we show that the class of languages that are generated by centralized PC grammar systems with context-sensitive components working in nonreturning mode coincides with the complexity class NEXT = ∪_{c≥1} NTIME(2^{c·n}).

Journal ArticleDOI
TL;DR: The hierarchy induced by the k-valued constraint is established in the class of categorial grammars extended with iterated types, adapted to express the so-called projective dependency structures.

Proceedings Article
Niklas Fors
01 Jan 2012
TL;DR: RAGs extend Knuth’s attribute grammars with references that turn the abstract syntax tree into a graph; this work investigates how RAGs can be used for implementing tools for visual languages.
Abstract: Reference attributed grammars (RAGs) extend Knuth’s attribute grammars with references. These references can be used to extend the abstract syntax tree to a graph. We investigate how RAGs can be used for implementing tools for visual languages. Programs in those languages can often be expressed as graphs.
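
A minimal sketch of the reference-attribute idea (hypothetical classes, not the syntax of an actual RAG system): the decl attribute of a use node is a reference to a declaration node elsewhere in the tree, so following attribute values extends the abstract syntax tree to a graph.

# Sketch only: a reference attribute computed by name lookup in the enclosing block.

class Decl:
    def __init__(self, name):
        self.name = name

class Use:
    def __init__(self, name, block):
        self.name, self.block = name, block

    @property
    def decl(self):   # reference attribute: its value is another AST node
        return self.block.lookup(self.name)

class Block:
    def __init__(self, decls):
        self.decls = decls

    def lookup(self, name):
        return next((d for d in self.decls if d.name == name), None)

block = Block([Decl("x"), Decl("y")])
use = Use("y", block)
print(use.decl.name)   # y -- the use node now links directly to its declaration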