
Showing papers on "Formal grammar published in 2003"


Book ChapterDOI
01 Jan 2003
TL;DR: In this article, the authors propose secondary notations that can be used however the user likes, such as comments in a programming language, and the use of colours or format choices to indicate information additional to the content of text.
Abstract: Abstraction: types and availability of abstraction mechanisms. Abstractions (redefinitions) change the underlying notation. Macros, data structures, global find-and-replace commands, quick-dial telephone codes, and word-processor styles are all abstractions. Some are persistent, some are transient. Abstractions, if the user is allowed to modify them, always require an abstraction manager, a redefinition sub-device. It will sometimes have its own notation and environment (e.g. the Word style sheet manager) but not always (for example, a class hierarchy can be built in a conventional text editor). Systems that allow many abstractions are potentially difficult to learn. Secondary notation: extra information carried by means other than formal syntax. Users often need to record things that have not been anticipated by the notation designer. Rather than anticipating every possible user requirement, many systems support secondary notations that can be used however the user likes. One example is comments in a programming language; another is the use of colours or format choices to indicate information additional to the content of text. Closeness of mapping: closeness of representation to domain. How closely related is the notation to the result it is describing?

220 citations


Book ChapterDOI
15 Jul 2003
TL;DR: This programming language is an extension of 3APL and allows the programmer to implement agents’ mental attitudes like beliefs, goals, plans, and actions, and agents' reasoning rules by means of which agents can modify their mental attitudes.
Abstract: This paper presents the specification of a programming language for cognitive agents. This programming language is an extension of 3APL (An Abstract Agent Programming Language) and allows the programmer to implement agents’ mental attitudes, such as beliefs, goals, plans, and actions, as well as agents’ reasoning rules by means of which agents can modify their mental attitudes. The formal syntax and semantics of this language are presented, together with a discussion of the deliberation cycle and an example.
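
To give a concrete feel for what such a language offers, the following minimal Python sketch mimics an agent with beliefs, goals, plans, and reasoning rules. It is only an illustration of the general idea: the names (Agent, Rule, deliberate) and the one-pass deliberation cycle are hypothetical and do not reproduce 3APL's formal syntax or the semantics defined in the paper.

from dataclasses import dataclass, field

# Illustrative sketch only: a toy cognitive agent in the spirit of a
# 3APL-like language, with beliefs, goals, plans and reasoning rules.
# All names here are hypothetical and not taken from the paper.

@dataclass
class Rule:
    goal: str      # goal the rule is applicable to
    guard: str     # belief that must hold for the rule to fire
    plan: list     # sequence of basic actions (as strings)

@dataclass
class Agent:
    beliefs: set = field(default_factory=set)
    goals: list = field(default_factory=list)
    rules: list = field(default_factory=list)

    def deliberate(self):
        """One pass of a simple deliberation cycle:
        pick a goal, find an applicable rule, execute its plan."""
        for goal in list(self.goals):
            for rule in self.rules:
                if rule.goal == goal and rule.guard in self.beliefs:
                    for action in rule.plan:
                        print("executing", action)
                    self.goals.remove(goal)   # assume the plan achieves the goal
                    self.beliefs.add(goal)    # and record the result as a belief
                    break

agent = Agent(beliefs={"door_closed"},
              goals=["door_open"],
              rules=[Rule("door_open", "door_closed", ["unlock", "push"])])
agent.deliberate()   # prints: executing unlock / executing push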

189 citations


Journal ArticleDOI
TL;DR: In this paper, it is suggested that the theory of parameters, being able to deal with cultural variation, is in the best position to achieve adequacy at the fourth level, one connected with historical explanations, and some methods to pursue this goal are proposed.
Abstract: The present paper addresses some foundational issues on the status of parametric linguistics, understood as a partially independent new branch of formal grammar and of the cognitive sciences more generally. Chomsky’s (1964) original three levels of adequacy are extended to five, and it is then suggested that the theory of parameters, being able to deal with cultural variation, is in the best position to achieve adequacy at the fourth proposed level, one connected with historical explanations; some methods to pursue this goal are proposed. The development of parametric linguistics is viewed as a major step toward the potential application of the Galilean style of formal grammar both to the study of linguistic history and to other domains of cognitive science and cultural anthropology.

65 citations


Proceedings ArticleDOI
09 Mar 2003
TL;DR: Grammar-specific genetic operators for crossover and mutation are proposed to achieve grammar induction, with the aim of making domain-specific language development easier for domain experts not versed in programming language design.
Abstract: One of the open problems in the area of domain-specific languages is how to make domain-specific language development easier for domain experts not versed in programming language design. Possible approaches are to build a domain-specific language from parameterized building blocks or by language (grammar) induction. This paper uses an evolutionary approach to grammar induction. Grammar-specific genetic operators for crossover and mutation are proposed to achieve this task. Suitability of the approach is shown by small experiments in which underlying grammars are successfully obtained genetically and parsers are then generated automatically.
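
The paper's concrete operators are not reproduced here, but the following Python sketch suggests what grammar-specific crossover and mutation can look like when a candidate grammar is represented as a map from nonterminals to alternative right-hand sides; the representation and the operator choices below are assumptions made purely for illustration.

import random

# Hypothetical illustration of grammar-specific genetic operators.
# A candidate grammar is a dict mapping a nonterminal to a list of
# alternative right-hand sides (each a list of symbols).

def crossover(g1, g2, rng=random):
    """Exchange the production alternatives of one randomly chosen
    nonterminal shared by the two parent grammars."""
    child1 = {n: list(alts) for n, alts in g1.items()}
    child2 = {n: list(alts) for n, alts in g2.items()}
    shared = sorted(set(g1) & set(g2))
    if shared:
        nt = rng.choice(shared)
        child1[nt], child2[nt] = list(g2[nt]), list(g1[nt])
    return child1, child2

def mutate(g, terminals, rng=random):
    """Append a random terminal to one randomly chosen right-hand side."""
    child = {n: [list(alt) for alt in alts] for n, alts in g.items()}
    nt = rng.choice(sorted(child))
    alt = rng.choice(child[nt])
    alt.append(rng.choice(terminals))
    return child

g_a = {"S": [["a", "S"], ["a"]]}
g_b = {"S": [["b", "S", "b"], ["b"]]}
print(crossover(g_a, g_b))
print(mutate(g_a, ["a", "b"]))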

37 citations


Journal ArticleDOI
TL;DR: For instance, this article pointed out that the grammar debate is really about conflicting social forces people would rather not discuss: race and ethnicity, power and privilege, oppression and marginalization, and that usage rules are the conventions of written English that allow Americans to discriminate against one another.
Abstract: The word grammar can hardly be discussed by itself without some clue as to whether we mean formal grammar, school grammar, linguistic descriptions of grammar, spelling, punctuation, usage, grammar worksheets, grammar in context (Constance Weaver's phrase), error avoidance, or memorization of the parts of speech. However, in this article, we deliberately use the word grammar in its unmodified form because that is the way most people who complain about student writing still employ that word. We must add, however, that there are undoubtedly other things people mean by "grammar" that are not explicit in the above list and perhaps not even recognized consciously by users of the word. As James Zebroski suggests, the grammar debate is really about conflicting social forces people would rather not discuss: race and ethnicity, power and privilege, oppression and marginalization (318-19). Rhetoricians Sharon Crowley and Debra Hawhee are even more direct in their view that "usage rules are the conventions of written English that allow Americans to discriminate against one another" (283). The ongoing grammar issue is a patina for a more complex, serious debate we all need to have about power and opportunity in this culture. In light of these important problems, why do so many handwringing arguments about grammar circle back to the same tired question of how to make grammar interesting to students? We want to move important issues in the teaching of writing off the dime about grammar. The question in our title is meant to change the conversation and explode simplistic answers regarding writing pedagogy. We know there are many effective writing teachers who understand that grammar is a tool for making meaning and not an end in itself. However, even those teachers are under increasing pressure to teach handbook rules in traditional fashion to address

16 citations


01 Jan 2003
TL;DR: The proposed formulation allows context-dependent languages to be treated as natural extensions of context-free ones, by adding some arrangements, while covering issues related to scopes, types, dynamic syntax, static semantics and language extensibility.
Abstract: The aim of the present paper is to revisit some important topics regarding pedagogical issues in the teaching of concepts and techniques of programming languages and compiler construction. Essentially, a unified formal model is used in the proposed approach to expose the students to a set of lessons of growing complexity, with a heavy practical component, all sharing the same common formal model. Experiments start with the definition of regular languages, aimed at introducing concepts and giving the students familiarity with formal languages and their description through a metalanguage suitable for direct mapping into finite-state automata. Context-free languages are then considered as a simple extension of regular languages, both in grammar and acceptor aspects. The proposed formulation allows context-dependent languages to be treated as natural extensions of context-free ones, by adding some arrangements, while covering issues related to scopes, types, dynamic syntax, static semantics and language extensibility.
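
As an illustration of the first step in such a course, a regular language description maps directly onto a finite-state acceptor. The Python sketch below is not taken from the paper; it encodes a small deterministic automaton for identifier-like words (a letter followed by letters or digits) as a transition table and runs it over input strings.

# Illustrative only: a regular language of identifier-like words,
# letter (letter | digit)*, mapped onto a deterministic finite-state
# automaton given as a transition table.

LETTERS, DIGITS = "abc", "01"

transitions = {
    ("start", "letter"): "ident",
    ("ident", "letter"): "ident",
    ("ident", "digit"): "ident",
}
accepting = {"ident"}

def classify(ch):
    if ch in LETTERS:
        return "letter"
    if ch in DIGITS:
        return "digit"
    return "other"

def accepts(word):
    state = "start"
    for ch in word:
        state = transitions.get((state, classify(ch)))
        if state is None:        # no transition: reject
            return False
    return state in accepting

print(accepts("a01"))   # True
print(accepts("1ab"))   # False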

13 citations


Journal ArticleDOI
TL;DR: This paper explores the behavior of range concatenation grammars in counting, a domain in which the bad reputation of other classical syntactic formalisms is well known, and arrives at some surprising results.
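
As a standard illustration of that counting ability (not necessarily one of the grammars analysed in the paper), a positive range concatenation grammar for the triple-counting language L = {a^n b^n c^n : n >= 0} needs only three clauses:

\begin{align*}
S(XYZ) &\rightarrow A(X, Y, Z)\\
A(aX,\, bY,\, cZ) &\rightarrow A(X, Y, Z)\\
A(\varepsilon, \varepsilon, \varepsilon) &\rightarrow \varepsilon
\end{align*}

The first clause splits the input into three ranges; the second strips one a, one b, and one c simultaneously, so the clauses succeed exactly when the three counts are equal.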

11 citations


Proceedings ArticleDOI
28 Oct 2003
TL;DR: The paper describes a formal model to specify the syntax and semantics of interactive visual languages by SR-task grammars, in which a new form of production rules allows the interactive behavior of the described visual environments to be specified directly.
Abstract: The paper describes a formal model to specify the syntax and semantics of interactive visual languages by SR-task grammars. The proposed approach provides language designers with a new type of grammar based on the formalism of symbol-relation grammars, in which a new form of production rules allows the interactive behavior of the described visual environments to be specified directly. Therefore, the language designer only has to define how a visual scenario develops in response to occurring events, and each change of the state of a scene is regarded as the application of a rewriting rule.

6 citations


01 Jan 2003
TL;DR: This paper shows an epistemological ambiguity that arises in the context of logic programming, investigates its causes and consequences, and points out some directions for overcoming the ambiguity.
Abstract: It is commonly believed that the meaning of a formal declarative knowledge representation language is determined by its formal semantics. This is not quite so. This paper shows an epistemological ambiguity that arises in the context of logic programming. Several different logic programming formalisms and semantics have been proposed. Hence, logic programming can be seen as an overlapping family of formal logics, each induced by a pair of a formal syntax and a formal semantics. We would expect that (a) each such pair has a unique declarative reading and (b) for a program in the intersection of several formal LP logics with the same formal semantics in each of them, its declarative reading is the same in each of them. I show in this paper that neither (a) nor (b) holds. The paper investigates the causes and the consequences of this phenomenon and points out some directions to overcome the ambiguity.

01 Jan 2003
TL;DR: It is proved that the languages of link-structured lists of words associated with rigid link grammars have finite elasticity, a learning algorithm is presented, and this result leads to the learnability of rigid or k-valued link grammars from strings.
Abstract: The article is concerned with learning link grammars in the model of Gold. We show that rigid and k-valued link grammars are learnable from strings. In fact, we prove that the languages of link-structured lists of words associated with rigid link grammars have finite elasticity, and we give a learning algorithm. As a standard corollary, this result leads to the learnability of rigid or k-valued link grammars from strings.

Journal ArticleDOI
TL;DR: Grammars for RNA secondary structure [5, 8] are identified as subclasses of mcfg, and the inclusion relation among the classes of languages generated by these grammars is clarified.
Abstract: Much attention has been paid to RNA secondary structure prediction based on context-free grammar (cfg), since cfg can represent stem-loop structure by its derivation tree. In particular, techniques based on the CKY (Cocke-Kasami-Younger) algorithm have been widely investigated [1]. Pseudoknots play an important role in RNA functions such as ribosomal frameshifting and splicing. A database (PseudoBase) for RNA pseudoknots has been constructed [9]. Unfortunately, it is known that cfg cannot represent pseudoknot structure, and a few grammars have been proposed to represent pseudoknots [5, 8]. However, the relation among the expressive (generative) power of these grammars and/or other grammars in formal language theory beyond cfg has not been clarified. The authors have proposed a class of grammars called multiple context-free grammars (mcfg) [3, 7]. In this research, we identify grammars for RNA secondary structure [5, 8] as subclasses of mcfg and also clarify the inclusion relation among the classes of languages generated by these grammars.
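
As an illustration of how an mcfg can capture crossing dependencies of the pseudoknot kind (this particular grammar is an assumption for exposition, not one of the grammars from [5, 8]), the following rules generate {a^m b^n c^m d^n}, in which the a-c pairings cross the b-d pairings much as pseudoknotted base pairs do:

\begin{align*}
S(x_1 y_1 x_2 y_2) &\leftarrow A(x_1, x_2)\; B(y_1, y_2)\\
A(a\,x_1,\; c\,x_2) &\leftarrow A(x_1, x_2)\\
B(b\,y_1,\; d\,y_2) &\leftarrow B(y_1, y_2)\\
A(\varepsilon, \varepsilon) &\leftarrow\\
B(\varepsilon, \varepsilon) &\leftarrow
\end{align*}

Here A derives the paired components (a^m, c^m) and B derives (b^n, d^n); the start rule interleaves them as a^m b^n c^m d^n, a pattern no context-free grammar can generate.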

01 Jan 2003
TL;DR: It is proved that the number of components in context-free cooperating distributed (CD) grammar systems can be reduced to 3 when they are working in the so-called sf-mode of derivation, the cooperation protocol which was considered first for CD grammar systems.
Abstract: It is proved that the number of components in context-free cooperating distributed (CD) grammar systems can be reduced to 3 when they are working in the so-called sf-mode of derivation, which is the cooperation protocol that was considered first for CD grammar systems. In this derivation mode, a component continues the derivation until there is a nonterminal in the sentential form which cannot be rewritten according to that component. Moreover, it is shown that CD grammar systems in sf-mode with only one component can generate only the context-free languages, but they can generate non-context-free languages if two components are used. The sf-mode of derivation is compared with other well-known cooperation protocols with respect to the hierarchies induced by the number of components.

Book ChapterDOI
16 Feb 2003
TL;DR: This paper will illustrate some of the new insights into syntactic description, semantic composition, discourse structure, language generation, psycholinguistic and statistical processing in the context of the lexicalized tree-adjoining grammar (LTAG).
Abstract: For the specification of formal systems for a grammar formalism, conventional mathematical wisdom dictates that we start with primitives (basic primitive structures or building blocks) as simple as possible and then introduce various operations for constructing more complex structures. Alternatively, we can start with complex (more complicated) primitives that directly capture crucial linguistic properties and then introduce some general operations (language independent operations) for composing them. This latter approach has led to the so-called strongly lexicalized grammars, providing some new insights into syntactic description, semantic composition, discourse structure, language generation, psycholinguistic and statistical processing, all with computational implications. In this paper, we will illustrate some of these insights in the context of the lexicalized tree-adjoining grammar (LTAG).

Patent
24 Oct 2003
TL;DR: In this article, a heterogeneous data access system sets up a formal granule identification grammar and interfaced granule (9, 11) client server adaptor for each tool with direct HTTP (Hyper Text Transfer Protocol), FTP (File Transfer Protocol) or HTTPS (Hyper Text Transfer Protocol Secured) connection of appropriate client (C) and server (S).
Abstract: A heterogeneous data access system sets up a formal granule identification grammar and interfaced granule (9, 11) client server adaptor (10, 12) for each tool (6, 7) with direct HTTP (Hyper Text Transfer Protocol), FTP (File Transfer Protocol) or HTTPS (Hyper Text Transfer Protocol Secured) connection of appropriate client (C) and server (S). Includes an Independent claim for use of the system with XML (Extended Mark Up Language) format granules with structured data.

Proceedings ArticleDOI
18 Aug 2003
TL;DR: This paper provides a calculus, a subset of the Pi calculus with enriched timing constructs, that helps in expressing these performance issues at the design stage, together with a formal syntax, an operational semantics and a bisimulation method.
Abstract: Server-side computation is very important to the end user. It is a performance issue that is usually not well realized. In this paper, we provide a calculus that helps in expressing these performance issues at the design stage. The calculus is a subset of the Pi calculus with enriched timing constructs. We provide a formal syntax, an operational semantics and a bisimulation method. To illustrate the calculus, an example from the Web domain is given.

Book ChapterDOI
01 Jan 2003
TL;DR: The use of a formal specification language as a modelling tool provides a possible way to achieve precise analysis in requirements engineering or analysis.
Abstract: Requirements engineering or analysis is normally seen as the process by which software requirements are identified and specified, and has been described as a ‘process of discovery, refinement, modelling and specification’ [36]. System modelling is a useful tool in this process. We can create models to express our understanding of a current system problem, to identify areas for change or to describe the final product to be built. They permit the partitioning of complex system problems into smaller and more manageable units, and can be used as a basis for communication and validation with the client. Many of the modelling approaches use semi-formal diagramming techniques, resulting, for example, in data flow diagrams and entity relationship diagrams, and may be supported by explanatory text. However, these approaches lack the well-defined and formal syntax that permits precise analysis, and models expressed in these terms are open to possible misunderstanding and ambiguity. At some point we need to express the model more formally and to subject it to precise and rigorous analysis. The use of a formal specification language as a modelling tool provides a possible way to achieve this.

Journal ArticleDOI
TL;DR: Mimico, a compiler generator that outputs code based on the use of monadic combinators, provides an easy way of specifying the syntax and semantics of languages and generates readable output in the form of Haskell programs.
Abstract: This article describes a compiler generator, called Mimico, that outputs code based on the use of monadic combinators. Mimico can parse infinite-look-ahead and left-recursive context-free grammars, and defines a scheme for handling the precedence and associativity of binary infix operators, as well as monadic code in semantic rules. Mimico provides an easy way of specifying the syntax and semantics of languages, and generates readable output in the form of Haskell programs. The article presents Mimico's general principles, its formal syntax and semantics, its limitations and illustrative examples of its behaviour.
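
Mimico's own output is Haskell built from monadic combinators; as a rough, language-neutral sketch of the precedence and associativity handling such a generator has to produce, the following Python fragment (hypothetical, not Mimico output) parses left-associative binary infix operators by precedence climbing.

# Illustrative sketch only: precedence-climbing parsing of
# left-associative binary infix operators over integer atoms.

import re

TOKEN = re.compile(r"\d+|[+\-*/()]")
PREC = {"+": 1, "-": 1, "*": 2, "/": 2}   # higher binds tighter

def tokenize(src):
    return TOKEN.findall(src)

def parse_expr(tokens, pos=0, min_prec=1):
    node, pos = parse_atom(tokens, pos)
    while pos < len(tokens) and tokens[pos] in PREC and PREC[tokens[pos]] >= min_prec:
        op = tokens[pos]
        # left associativity: the right operand only absorbs tighter operators
        rhs, pos = parse_expr(tokens, pos + 1, PREC[op] + 1)
        node = (op, node, rhs)
    return node, pos

def parse_atom(tokens, pos):
    if tokens[pos] == "(":
        node, pos = parse_expr(tokens, pos + 1)
        return node, pos + 1          # skip the closing ")"
    return int(tokens[pos]), pos + 1

print(parse_expr(tokenize("1+2*3-4"))[0])   # ('-', ('+', 1, ('*', 2, 3)), 4)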

Book ChapterDOI
19 Nov 2003
TL;DR: Software tools have been developed that allow executing the formal language definition on a computer; these tools greatly help to eliminate numerous errors from the formal definition that would likely not have been found without them.
Abstract: With the latest revision of the ITU-T Specification and Description Language (SDL-2000), a formal language definition based on the concept of Abstract State Machines (ASMs) became an integral part of the standard. Together with the formal definition, we have developed software tools that allow executing the formal language definition on a computer. In doing so, we found that such tools greatly help to eliminate numerous errors from the formal definition, errors which would likely not have been found without them.

01 Jan 2003
TL;DR: A Mathematical Theory governing word order typology is proposed that not only explains the established generative grammar rules of a language but also lays the groundwork for understanding sentence competency in terms of irreducible components that have not been accounted for in previous formal theories.
Abstract: In this paper I attempt to lay the groundwork for an algorithm that measures sentence competency. Heretofore, competency of sentences was determined by interviewing speakers of the language. The data compiled forms the basis for grammatical rules that establish the generative grammar of a language. However, the generative grammar, once established, does not filter out all incompetent sentences. Chomsky has noted that there are many sentences that are grammatical but do not satisfy the notion of competency and, similarly, many non-grammatical constructions that do. I propose that generative grammar constructions, as well as formal theory frameworks such as Transformational Grammar, Minimalist Theory, and Government and Binding, do not represent the most irreducible component of a language that determines sentence competency. I propose a Mathematical Theory governing word order typology that not only explains the established generative grammar rules of a language but also lays the groundwork for understanding sentence competency in terms of irreducible components that have not been accounted for in previous formal theories. I have done so by relying on a mathematical analysis of word frequency relationships based upon large, representative corpora, an analysis that represents a more basic component of sentence construction overlooked by current text processing and artificial intelligence parsing systems and unaccounted for by the generative grammar rules of a language.

Journal Article
TL;DR: A system structure for constructing the whole processing system for mechanical documentation is introduced, together with an XML viewer for tabular forms based on the attribute graph grammar.
Abstract: We deal with mechanical documentation in software development tools. First, we review tabular forms for program specification and their formal syntax given by an attribute edNCE graph grammar. Next, we explain a parser based on the syntactic definitions and attribute rules. Furthermore, we introduce an XML viewer for tabular forms based on the attribute graph grammar. Finally, we introduce a system structure to construct the whole processing system for mechanical documentation. These results are applied to the mechanical manipulation of general tabular forms.

Book ChapterDOI
23 Jun 2003
TL;DR: The system consists of a knowledge base of plant unit operations that can be linked to a graphical front end for inputting operating instructions; it builds a formal model of the instruction set as an interlingua and uses it to output multilingual operating procedures.
Abstract: This paper presents a system, named CAPTOP, for authoring and checking operating procedures for plant operations. It consists of a knowledge base of plant unit operations that can be linked to a graphical front end for inputting operating instructions. The system then builds a formal model of the instruction set as an interlingua and uses it to output multilingual operating procedures. It avoids the problems of natural language understanding that make machine translation so difficult. Furthermore, the system could also generate output in a formal syntax that can be used as input to another knowledge-based component, CHECKOP, for checking the procedure for operability and safety problems.

Dissertation
26 Nov 2003
TL;DR: This thesis presents a novel model for analyzing queries of the users of spoken language systems in restricted discourse areas by means of a formal grammar in the discourse area "currency conversion", which allows the automatic analysis of spoken input.
Abstract: This thesis presents a novel model for analyzing queries of the users of spoken language systems in restricted discourse areas. Such systems are able to interact with the user by means of speech and provide information in specific domains. The queries that may be posed are modeled by means of a formal grammar in the discourse area "currency conversion". This grammar allows the automatic analysis of spoken input. The results of many scientific disciplines, such as linguistics, computer science, and cognitive science, contribute to the model. Special attention is paid to the integration of various knowledge types (semantic, syntactic, and pragmatic information) into a consistent model. This leads to a grammar life cycle, which is composed of a sequence of steps: grammar specification, grammar mining, grammar design, and iterative grammar evaluation. The development of the grammar is illustrated with numerous examples from the discourse area "currency conversion". Finally, our approach is compared with currently existing systems, and the integration into the design of user-friendly man-machine interfaces is discussed.
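
A tiny, purely hypothetical grammar fragment (not taken from the thesis) suggests how queries in such a restricted discourse area can be modelled as rewrite rules that a parser applies to the recognized utterance:

\begin{align*}
\langle\textit{query}\rangle &\rightarrow \langle\textit{request}\rangle\ \langle\textit{amount}\rangle\ \langle\textit{currency}\rangle\ \text{``to''}\ \langle\textit{currency}\rangle\\
\langle\textit{request}\rangle &\rightarrow \text{``convert''} \mid \text{``change''}\\
\langle\textit{amount}\rangle &\rightarrow \langle\textit{number}\rangle\\
\langle\textit{currency}\rangle &\rightarrow \text{``euros''} \mid \text{``dollars''} \mid \text{``yen''}
\end{align*}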

Book ChapterDOI
Sergey Verlan1
17 Jul 2003
TL;DR: In this article, the authors considered splicing P systems with immediate communication and proved that such systems with two membranes can generate any recursively enumerable language, and showed that the same holds for non-extended splicing P systems with two membranes.
Abstract: We consider splicing P systems with immediate communication introduced by Gh. Paun in [5]. We solve the open problem Q17 from that book by proving that systems with two membranes can generate any recursively enumerable language. We discuss the similarities between splicing P systems with immediate communication having two membranes and time-varying distributed H systems with two components. We also consider non-extended splicing P systems, i.e., without a terminal alphabet. We show that it is possible to generate any recursively enumerable language by such systems with two membranes. In this way we solve the open problem Q16 from the same book.