
Showing papers on "Formal grammar" published in 1988


Journal ArticleDOI
M. Leyton1
TL;DR: It is argued that the inference of history arises from a newly discovered duality between curvature extrema and symmetry structure, and a formal grammar is developed by which one can infer the processes that produced the second stage from the first.

246 citations


Journal ArticleDOI
TL;DR: The syntax of PICASSO queries is presented, and PICASSO queries are compared with similar queries in standard relational query languages.
Abstract: PICASSO (PICture Aided Sophisticated Sketch Of database queries) is a graphics-based database query language designed for use with a universal relation database system. The primary objective of PICASSO is ease of use. Graphics are used to provide a simple method of expressing queries and to provide visual feedback to the user about the system's interpretation of the query. Inexperienced users can use the graphical feedback to aid them in formulating queries, whereas experienced users can ignore the feedback. Inexperienced users can pose queries without knowing the details of the underlying database schema and without learning the formal syntax of an SQL-like query language. This paper presents the syntax of PICASSO queries and compares PICASSO queries with similar queries in standard relational query languages. Comparisons are also made with System/U, a non-graphical universal relation system on which PICASSO is based. The hypergraph semantics of the universal relation are used as the foundation for PICASSO, and their integration with a graphical workstation enhances the usability of database systems.

93 citations


Book ChapterDOI
Y. Shoham1, N. Goyal1
01 Oct 1988
TL;DR: This chapter presents an introduction to a representative temporal logic with formal syntax and semantics and provides an overview of the problems and the advances made in nonmonotonic temporal reasoning.
Abstract: Publisher Summary In one way or another, every area of artificial intelligence (AI) has to do with time. Medical diagnosis systems reason about the time at which a virus infected the blood system. Device troubleshooting systems look at how long it takes a capacitor to saturate. In automatic programming, the time at which a variable becomes bound is important. In robot planning, one wants to achieve one goal before another to meet deadlines, and so on. In qualitative physics, the concept of time is essential as well. One can identify several classes of tasks in AI that require reasoning about time: (1) prediction, (2) explanation, (3) planning, and (4) learning new rules. These classes of tasks, though related, have given rise to by and large disjoint fields of research. These disjoint research areas can be unified to some extent by providing a uniform framework for temporal reasoning. The somewhat mythical area of temporal reasoning aims to provide such a framework. The representation of temporal information and reasoning about such information requires a language that can capture the concept of change over time and can express the truth or falsity of statements at different times. This language should not only be well-defined but also have a clear meaning. This has led researchers to develop temporal logic. The passage of time is important only because changes are possible with time. This chapter explains two different approaches to reasoning about change: change-based and time-based. The chapter presents an introduction to a representative temporal logic with formal syntax and semantics. It also provides an overview of the problems and the advances made in nonmonotonic temporal reasoning.

56 citations


01 Jun 1988
TL;DR: The formal syntax and semantics of Real Time Logic (RTL), a logic for the specification of real-time systems, are presented and RTL is shown to be undecidable by a reduction from the acceptance problem for two-counter machines.
Abstract: This paper presents the formal syntax and semantics of Real Time Logic (RTL), a logic for the specification of real-time systems. An example illustrating the specification of a system in RTL is presented, and natural deduction is used to verify that the system satisfies a given safety property. RTL is shown to be undecidable by a reduction from the acceptance problem for two-counter machines. Decidable subclasses of the logic are also discussed.
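The occurrence-function notation of RTL makes such specifications compact. As an illustration (a sketch of our own, not the paper's worked example), writing @(e, i) for the time of the i-th occurrence of event e, a response-deadline safety property of the kind verified in the paper can be stated as:

```latex
% Illustrative RTL-style assertion (our sketch; @ is the occurrence
% function of Jahanian and Mok: @(e, i) denotes the time of the i-th
% occurrence of event e, and d is a deadline constant):
\[
  \forall i \;.\; @(\mathrm{RESPOND}, i) \;\le\; @(\mathrm{REQUEST}, i) + d
\]
```

Assertions of this shape, combined over events of a system, are what the natural-deduction verification in the paper operates on.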

37 citations


Book ChapterDOI
01 Jan 1988
TL;DR: In the first part of this paper, it is shown how one natural methodological approach to linguistic problems leads directly to a framework which emphasizes the generalization of the problem of composition to a number of dimensions.
Abstract: In the first part of this paper, I show how one natural methodological approach to linguistic problems leads directly to a framework which emphasizes the generalization of the problem of composition to a number of dimensions. In the second part, I introduce a general notion of compositional function, which provides an appropriate abstract setting in which to discuss the problem of multi-dimensional composition. In this general setting, we may think of a grammar G, much as Montague did (1974, Chapter 7, hereafter ‘UG’), as the closure of a set of compositional functions over a set of postulated basic expressions (each endowed with properties in the various dimensions under consideration). There are then a number of ways of conceiving of languages associated with G, depending on whether we wish to consider every analysis generated by G, some special class of analyses generated by G (such as those associated with a designated symbol ‘S’), the phonological structures definable in the structure G, and so on.

37 citations


Journal ArticleDOI

17 citations


Journal ArticleDOI
TL;DR: The paper discusses the consequences of this approach (Grammatically Restricted Machine Translation, GRMT) and describes the limits set by a standard choice of grammatical rules for sentences and clauses, noun phrases, verb phrases, sentence adverbials, etc.
Abstract: One may indicate the potentials of an MT system by stating what text genres it can process, e.g., weather reports and technical manuals. This approach is practical, but misleading, unless domain knowledge is highly integrated in the system. Another way to indicate which fragments of language the system can process is to state its grammatical potentials, or more formally, which languages the grammars of the system can generate. This approach is more technical and less understandable to the layman (customer), but it is less misleading, since it stresses the point that the fragments which can be translated by the grammars of a system need not necessarily coincide exactly with any particular genre. Generally, the syntactic and lexical rules of an MT system allow it to translate many sentences other than those belonging to a certain genre. On the other hand it probably cannot translate all the sentences of a particular genre. Swetra is a multilanguage MT system defined by the potentials of a formal grammar (standard referent grammar) and not by reference to a genre. Successful translation of sentences can be guaranteed if they are within a specified syntactic format based on a specified lexicon. The paper discusses the consequences of this approach (Grammatically Restricted Machine Translation, GRMT) and describes the limits set by a standard choice of grammatical rules for sentences and clauses, noun phrases, verb phrases, sentence adverbials, etc. Such rules have been set up for English, Swedish and Russian, mainly on the basis of familiarity (frequency) and computer efficiency, but restricting the grammar and making it suitable for several languages poses many problems for optimization. Sample texts — newspaper reports — illustrate the type of text that can be translated with reasonable success among Russian, English and Swedish.
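The "specified syntactic format" check at the heart of GRMT amounts to recognition against a fixed formal grammar. A minimal sketch of such a check, using a hypothetical toy grammar in Chomsky normal form (not Swetra's standard referent grammar) and CYK recognition:

```python
from itertools import product

# Toy grammar in Chomsky normal form (hypothetical, not Swetra's
# standard referent grammar): S -> NP VP, NP -> Det N, VP -> V NP.
RULES = {
    ("NP", "VP"): {"S"},
    ("Det", "N"): {"NP"},
    ("V", "NP"): {"VP"},
}
LEXICON = {
    "the": {"Det"}, "a": {"Det"},
    "report": {"N"}, "system": {"N"},
    "translates": {"V"},
}

def in_format(sentence):
    """CYK recognition: True iff the sentence lies inside the
    restricted grammar, i.e. is guaranteed translatable under GRMT."""
    words = sentence.split()
    n = len(words)
    if n == 0 or any(w not in LEXICON for w in words):
        return False
    # table[i][j] = nonterminals deriving words[i : i+j+1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        table[i][0] = set(LEXICON[w])
    for span in range(2, n + 1):          # span length
        for i in range(n - span + 1):     # span start
            for k in range(1, span):      # split point
                for b, c in product(table[i][k - 1],
                                    table[i + k][span - k - 1]):
                    table[i][span - 1] |= RULES.get((b, c), set())
    return "S" in table[0][n - 1]

print(in_format("the system translates a report"))  # True
print(in_format("translates the system"))           # False
```

A sentence rejected by the check is outside the system's guaranteed fragment, even if it belongs to the intended genre, which is exactly the point the abstract makes.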

14 citations



Book ChapterDOI
14 Nov 1988
TL;DR: It is shown that only regular languages belong to the same levels of both hierarchies, which implies that VLSI circuits need Θ(n) area and Θ(n^2) area·(time)^2 complexity to recognize deterministic context-free languages.
Abstract: The Chomsky hierarchy is compared with the hierarchy of communication complexity for VLSI. It is shown that only regular languages belong to the same levels of both hierarchies. There are languages that are hard according to the Chomsky hierarchy yet belong to the lowest level of the communication complexity hierarchy. On the other hand, there is a deterministic linear language that requires the highest (linear) communication complexity. This is the main result, because it implies that VLSI circuits need Θ(n) area and Θ(n^2) area·(time)^2 complexity to recognize deterministic context-free languages, which solves an open problem of Hromkovič [7].

7 citations


Book ChapterDOI
04 Sep 1988

7 citations



Book ChapterDOI
01 Jan 1988
TL;DR: In this article, the authors present a representation which closely mirrors the characteristics of generic structures in patents, and which enables an internal representation to be created, and the application of formal grammar theory to some of the problems caused by generic radical terms.
Abstract: The achievements of the project are reviewed briefly. They include: a) the design, implementation and testing of a representation which closely mirrors the characteristics of generic structures in patents, and which enables an internal representation to be created, b) the application of formal grammar theory to some of the problems caused by generic radical terms, and c) the development of a number of search representations including both fragments and ring descriptors, as well as reduced graph representations, together with appropriate search algorithms, for screening and atom- and bond-level searching.

Book
01 Jan 1988
TL;DR: This book discusses Semantic Aspects of Natural Language; Utterer's Meaning, Sentence-Meaning, and Word-Meaning; and the Proper Treatment of Quantification in Ordinary English.
Abstract: Philosophy and Natural-Language Processing.- Prologue: Modes of Meaning.- Utterer's Meaning, Sentence-Meaning, and Word-Meaning.- I: Formal Syntax of Natural Language.- Footloose and Context-Free.- Evidence Against the Context-Freeness of Natural Language.- II: Semantic Aspects of Natural Language.- Truth and Meaning.- Semantics for Propositional Attitudes.- III: Connecting Syntax with Semantics.- The Proper Treatment of Quantification in Ordinary English.- Phrase Structure Grammar.- IV: Natural Language and Logical Form.- Quantifiers in Natural Languages: Some Logical Problems, I.- Generalized Quantifiers and Natural Language.- V: Possible-Worlds and Situation Semantics.- From Worlds to Situations.- Possible Worlds and Situations.- Epilogue: From Semantics to Pragmatics.- Semantics versus Pragmatics.- Selected Bibliography.- Index of Names.- Index of Subjects.


Journal ArticleDOI
TL;DR: It is argued that formal semantics, in the model-theoretic style pioneered by Tarski, is appropriate for specifying the meanings of the compositional component of artificial formal languages but not of natural languages.
Abstract: It is argued that formal semantics, in the model-theoretic style pioneered by Tarski, is appropriate for specifying the meanings of the compositional component of artificial formal languages but not of natural languages. Since computer programming languages are both formal and artificial, formal semantics has a clear application to them, but this does not mean that it is in any way relevant to the problem of meaning in AI. The distinction is drawn between what an expression in a language means, and what a person means by using it. The former is the only kind of meaning that formal semantics can ever explain, whereas for AI to succeed it is essential to elucidate, and then to recreate, the latter. No verdict is offered on whether or not this may ultimately be possible; but it is argued that formal semantics would be an inappropriate tool to use to this end.

01 Jan 1988
TL;DR: The possibility of using a logic grammar as the basis for automatically generating a part-of-speech lexicon is investigated, and an implementation of such a lexicon learner is demonstrated together with several possible improvements concerning efficiency and other factors.
Abstract: The possibility of using a logic grammar as the basis for automatically generating a part-of-speech lexicon is investigated, and an implementation of such a lexicon learner is demonstrated together with several possible improvements concerning efficiency and other factors.
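The underlying idea can be sketched in a few lines: tag each unknown word with every open-class category, and keep the categories under which the sentence is grammatical. The grammar here is a hypothetical set of flat sentence patterns, not the paper's logic grammar:

```python
# A minimal sketch of lexicon learning (hypothetical grammar and
# lexicon, not the paper's logic grammar): an unknown word is assigned
# whichever parts of speech make some known sentence pattern succeed.
PATTERNS = {("Det", "N", "V"), ("Det", "N", "V", "Det", "N")}
OPEN_CLASSES = ("N", "V")
LEXICON = {"the": "Det", "a": "Det", "dog": "N", "sees": "V"}

def learn(sentence, lexicon):
    """Return {word: possible categories} for each unknown word that
    lets the sentence match a grammatical pattern."""
    words = sentence.split()
    unknown = [w for w in words if w not in lexicon]
    hypotheses = {w: set() for w in unknown}

    def tagseqs(i):
        # Enumerate full tag sequences, branching on unknown words.
        if i == len(words):
            yield ()
            return
        cats = ([lexicon[words[i]]] if words[i] in lexicon
                else list(OPEN_CLASSES))
        for c in cats:
            for rest in tagseqs(i + 1):
                yield (c,) + rest

    for seq in tagseqs(0):
        if seq in PATTERNS:
            for w, c in zip(words, seq):
                if w in hypotheses:
                    hypotheses[w].add(c)
    return hypotheses

print(learn("the cat sees a dog", LEXICON))  # {'cat': {'N'}}
```

With a real parser in place of the pattern table, the same generate-and-test loop is what the efficiency improvements mentioned in the abstract would target, since the number of tag sequences grows exponentially in the number of unknown words.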

Book ChapterDOI
11 Feb 1988
TL;DR: The main result of the paper states that L is a language with a Hotz-isomorphism if and only if F(X)/L is a finitely presentable group.
Abstract: A language L ⊆ X* is called a language with a Hotz-isomorphism if there is a generating grammar such that its Hotz group is canonically isomorphic to F(X)/L. The main result of the paper states that L is a language with a Hotz-isomorphism if and only if F(X)/L is a finitely presentable group. We also prove an analogous result for Hotz-monoids. Further, we show when the construction of a grammar which allows the Hotz-isomorphism is effective, and we prove various undecidability results concerning this construction.
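For reference, the objects in this statement can be sketched as follows (standard definitions, not quoted from the paper): the Hotz group of a grammar is the free group on its symbols modulo the production relations, and F(X)/L is the free group on X modulo the normal closure of L.

```latex
% Sketch of the standard definitions assumed above. For a grammar
% G = (N, T, P, S), the Hotz group is the free group on N \cup T
% modulo the relations u = v given by the productions u \to v:
\[
  \mathcal{H}(G) \;=\; F(N \cup T)\,\big/\,
    \langle\!\langle\, \{\, u v^{-1} : (u \to v) \in P \,\} \,\rangle\!\rangle ,
  \qquad
  F(X)/L \;=\; F(X)\,\big/\, \langle\!\langle\, L \,\rangle\!\rangle .
\]
```

L has a Hotz-isomorphism when the canonical map from the Hotz group of some grammar for L onto F(X)/L is an isomorphism; the theorem characterizes exactly when such a grammar exists.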

Journal ArticleDOI
TL;DR: A detailed study of a particular biomedical application, the mechanized identification of submedian and telocentric chromosomes, is given, and the context-free grammar is exploited to find the canonical collections of the corresponding LR parser.
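The canonical collection referred to is the standard LR(0) item-set construction. A self-contained sketch over a hypothetical toy grammar (the paper's chromosome grammar is not reproduced here):

```python
# Canonical collection of LR(0) item sets for a toy grammar
# (illustrative only): S' -> S, S -> ( S ) | x.
# An item is (lhs, rhs, dot): a production with a dot position.
GRAMMAR = {
    "S'": [("S",)],
    "S": [("(", "S", ")"), ("x",)],
}
NONTERMINALS = set(GRAMMAR)

def closure(items):
    """Add items B -> . gamma for every nonterminal B after a dot."""
    items = set(items)
    changed = True
    while changed:
        changed = False
        for lhs, rhs, dot in list(items):
            if dot < len(rhs) and rhs[dot] in NONTERMINALS:
                for prod in GRAMMAR[rhs[dot]]:
                    item = (rhs[dot], prod, 0)
                    if item not in items:
                        items.add(item)
                        changed = True
    return frozenset(items)

def goto(items, symbol):
    """Advance the dot over `symbol` and close the result."""
    return closure({(lhs, rhs, dot + 1)
                    for lhs, rhs, dot in items
                    if dot < len(rhs) and rhs[dot] == symbol})

def canonical_collection():
    """Explore all reachable item sets from the start item."""
    start = closure({("S'", ("S",), 0)})
    states, worklist = {start}, [start]
    while worklist:
        state = worklist.pop()
        symbols = {rhs[dot] for _, rhs, dot in state if dot < len(rhs)}
        for sym in symbols:
            nxt = goto(state, sym)
            if nxt and nxt not in states:
                states.add(nxt)
                worklist.append(nxt)
    return states

print(len(canonical_collection()))  # 6
```

Each item set becomes one state of the LR parser; applying the same construction to a chromosome-shape grammar yields the parser the abstract describes.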

Journal ArticleDOI
J. E. Lang1
TL;DR: This paper will write down formal grammars for FFP based on the syntax as given by Backus and Williams and will discuss their semantic implications.
Abstract: In Backus's original paper on FP and FFP [1], the syntax of these languages is described informally. Though a formal syntax for Berkeley FP has appeared in print [2], no formal representation of FFP syntax has yet appeared. It is desirable to have such a formal representation for implementation purposes, and such a representation may also be useful for specifying where the syntax of FFP ends and its semantics begins. In this paper, we write down formal grammars for FFP based on the syntax given by Backus [1] and Williams [3] and discuss their semantic implications. A third grammar, similar to that for FP, is presented; it may be used for implementing the language.

Proceedings ArticleDOI
22 Aug 1988
TL;DR: A new type of abstract automaton is introduced, and both formal and linguistic implications are discussed, most importantly a new possibility of proving certain formal properties of (natural) languages and their grammars and of refinement of the Chomsky hierarchy.
Abstract: A new type of abstract automaton is introduced, and both formal and linguistic implications are discussed, most importantly a new possibility of proving certain formal properties of (natural) languages and their grammars (such as context-freeness) and of refinement of the Chomsky hierarchy.

Proceedings ArticleDOI
16 Mar 1988
TL;DR: A survey of the impact on practical natural language processing systems of linguistically sound, formally specified grammatical theories that reflect a declarative rather than procedural approach to grammar and embody the flexibility of information location afforded by unification.
Abstract: Special logics for representing the semantics of domains peculiar to human language have been developed, and formal syntax research has been enriched by a substantial number of linguistically sound, formally specified grammatical theories that reflect a declarative rather than procedural approach to grammar and which embody the flexibility of information location afforded by unification. These theories have been investigated for their weak generative capacity, the complexity of their parsing, and the ways in which grammatical rules or principles can be embedded in actual parsers, developments which promise to impact greatly on the future of practical natural language processing systems. The impact of this research on knowledge representation and formal language theory is surveyed.

Book ChapterDOI
01 Jan 1988
TL;DR: Most AI theorists and researchers would agree with Hofstadter’s arguments or at least be in close sympathy with them, as would information theorists like MacKay, but a number of computer scientists and others, like Steiner, would deny that any formalism ever could simulate natural-language processes fully, even in theory.
Abstract: Most AI theorists and researchers would agree with Hofstadter’s arguments or at least be in close sympathy with them, as would information theorists like MacKay. Jackson, for example, contends that ‘There is no known a priori limit to the extensibility of a computer’s language capability other than those limits of a purely practical nature (memory size and processing speed)’ and that, ‘Although the difficulties involved with understanding natural language should not be minimized, no one has been able to show … that English is theoretically outside the language capability of all computers ….’1 However, a number of computer scientists and others, like Steiner, would deny that any formalism ever could simulate natural-language processes fully, even in theory. Weizenbaum’s opinions in that regard are well known and representative.