
Showing papers on "Formal language published in 1996"


Journal ArticleDOI
TL;DR: Properties shared by many proposed distinctions between systems of reasoning are distilled into two systems: one associative, whose computations reflect similarity and temporal contiguity, and one rule-based, which operates on symbolic structures and whose computations have the properties normally assigned to rules.
Abstract: Distinctions have been proposed between systems of reasoning for centuries. This article distills properties shared by many of these distinctions and characterizes the resulting systems in light of recent findings and theoretical developments. One system is associative because its computations reflect similarity structure and relations of temporal contiguity. The other is "rule based" because it operates on symbolic structures that have logical content and variables and because its computations have the properties that are normally assigned to rules. The systems serve complementary functions and can simultaneously generate different solutions to a reasoning problem. The rule-based system can suppress the associative system but not completely inhibit it. The article reviews evidence in favor of the distinction and its characterization. One of the oldest conundrums in psychology is whether people are best conceived as parallel processors of information who operate along diffuse associative links or as analysts who operate by deliberate and sequential manipulation of internal representations. Are inferences drawn through a network of learned associative pathways or through application of a kind of "psychologic" that manipulates symbolic tokens in a rule-governed way? The debate has raged (again) in cognitive psychology for almost a decade now. It has pitted those who prefer models of mental phenomena to be built out of networks of associative devices that pass activation around in parallel and distributed form (the way brains probably function) against those who prefer models built out of formal languages in which symbols are composed into sentences that are processed sequentially (the way computers function). An obvious solution to the conundrum is to conceive of the

3,488 citations


Book
01 Aug 1996
TL;DR: In this paper, the notions of union, intersection, concatenation, Kleene closure and grammar for fuzzy languages are defined as extensions of the corresponding notions in the theory of formal languages.
Abstract: A fuzzy language is defined to be a fuzzy subset of the set of strings over a finite alphabet. The notions of union, intersection, concatenation, Kleene closure and grammar for such languages are defined as extensions of the corresponding notions in the theory of formal languages. An explicit expression for the membership function of the language L(G) generated by a fuzzy grammar G is given and it is shown that any context-sensitive fuzzy grammar is recursive. For fuzzy context-free grammars, procedures for constructing the Chomsky and Greibach normal forms are outlined and illustrated by examples.
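The set-to-fuzzy-set lift described above is easy to make concrete. Below is a minimal Python sketch of the standard definitions (union as pointwise max, intersection as pointwise min, concatenation as a sup-min over all splits); the toy membership functions `mu_a` and `mu_b` are illustrative inventions, not from the paper.

```python
def fuzzy_union(mu1, mu2):
    # membership of the union is the pointwise maximum
    return lambda w: max(mu1(w), mu2(w))

def fuzzy_intersection(mu1, mu2):
    # membership of the intersection is the pointwise minimum
    return lambda w: min(mu1(w), mu2(w))

def fuzzy_concat(mu1, mu2):
    # mu(w) = max over all splits w = u v of min(mu1(u), mu2(v))
    return lambda w: max(
        min(mu1(w[:i]), mu2(w[i:])) for i in range(len(w) + 1)
    )

# toy fuzzy languages over {a, b}* (illustrative, not from the paper)
mu_a = lambda w: 1.0 if w and set(w) <= {"a"} else 0.0  # nonempty runs of a
mu_b = lambda w: 0.7 if w and set(w) <= {"b"} else 0.0  # runs of b, grade 0.7

print(fuzzy_concat(mu_a, mu_b)("aabb"))  # 0.7, from the split "aa" | "bb"
```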

324 citations


Book
01 Jan 1996
TL;DR: Formal languages, automata, computability, and related matters form the major part of the theory of computation; this textbook is designed for an introductory course for computer science and computer engineering majors who have knowledge of some higher-level programming language.
Abstract: Formal languages, automata, computability, and related matters form the major part of the theory of computation. This textbook is designed for an introductory course for computer science and computer engineering majors who have knowledge of some higher-level programming language, the fundamentals of

322 citations


Journal ArticleDOI
TL;DR: A simpler proof of the fundamental fact that the closure of a regular language under iterated splicing using a finite number of splicing rules is again regular is given.
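The splicing operation behind this result can be sketched directly on finite string sets. The code below applies Head-style splicing rules (u1, u2, u3, u4): from x = p·u1·u2·q and y = r·u3·u4·s it produces p·u1·u4·s. The concrete rule and starting language are invented for illustration; the theorem cited above says that iterating finitely many such rules on a regular language yields a regular language again.

```python
def splice(L, rules):
    """One round of splicing on a finite set of strings: a rule
    (u1, u2, u3, u4) applied to x = p.u1.u2.q and y = r.u3.u4.s
    yields the new string p + u1 + u4 + s."""
    out = set(L)
    for (u1, u2, u3, u4) in rules:
        for x in L:
            for i in range(len(x) + 1):
                if not (x[i:].startswith(u1) and x[i + len(u1):].startswith(u2)):
                    continue
                p = x[:i]
                for y in L:
                    for j in range(len(y) + 1):
                        if y[j:].startswith(u3) and y[j + len(u3):].startswith(u4):
                            s = y[j + len(u3) + len(u4):]
                            out.add(p + u1 + u4 + s)
    return out

# an invented rule that pumps a's; iterating stays inside the regular set a+b
L = {"aab"}
for _ in range(2):
    L = splice(L, [("a", "b", "a", "ab")])
print(sorted(L, key=len))   # ['aab', 'aaab', 'aaaab']
```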

159 citations


Journal ArticleDOI
TL;DR: A new class of languages, called multi-push-down (mpd), is introduced; it generalizes the classical context-free (cf, or Chomsky type 2) languages while preserving some of their important properties.
Abstract: A new class of languages, called multi-push-down (mpd), that generalize the classical context-free (cf, or Chomsky type 2) ones is introduced. These languages preserve some important properties of ...
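To make the flavour of the class concrete, here is a hedged sketch of a two-stack recognizer for {a^n b^n c^n : n >= 1}, a standard non-context-free language, obeying an mpd-style discipline in which pops are taken only from the first non-empty stack. The coding details are illustrative, not taken from the paper.

```python
def accepts_anbncn(w):
    """Two-stack sketch for {a^n b^n c^n : n >= 1}: pushes go to both
    stacks, and each pop is taken from the first non-empty stack, in
    the spirit of the multi-push-down discipline."""
    s1, s2 = [], []
    phase = "a"
    for ch in w:
        if ch == "a" and phase == "a":
            s1.append("A"); s2.append("A")   # push one symbol on each stack
        elif ch == "b" and phase in ("a", "b"):
            phase = "b"
            if not s1:
                return False
            s1.pop()                         # stack 1 is the first non-empty
        elif ch == "c" and phase in ("b", "c"):
            phase = "c"
            if s1:                           # all b's must already be matched
                return False
            if not s2:
                return False
            s2.pop()                         # now stack 2 is the first non-empty
        else:
            return False
    return phase == "c" and not s1 and not s2

print(accepts_anbncn("aabbcc"))  # True
```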

113 citations


Journal ArticleDOI
TL;DR: In its earliest formulation, supervisory control theory addresses formal control synthesis for discrete-event systems in an abstract framework of formal languages and automata; a principal theme is the modular decomposition of control problems as a means of managing their complexity.

91 citations


Pei Wang
03 Oct 1996
TL;DR: This research sheds light on several notions in artificial intelligence and cognitive science, including symbol-grounding, induction, categorization, logic, and computation; the resulting system has many properties that are shared by human cognition but absent from conventional computational models of reasoning.
Abstract: Every artificial-intelligence research project needs a working definition of "intelligence", on which the deepest goals and assumptions of the research are based. In the project described in the following chapters, "intelligence" is defined as the capacity to adapt under insufficient knowledge and resources. Concretely, an intelligent system should be finite and open, and should work in real time. If these criteria are used in the design of a reasoning system, the result is NARS, a non-axiomatic reasoning system. NARS uses a term-oriented formal language, characterized by the use of subject-predicate sentences. The language has an experience-grounded semantics, according to which the truth value of a judgment is determined by previous experience, and the meaning of a term is determined by its relations with other terms. Several different types of uncertainty, such as randomness, fuzziness, and ignorance, can be represented in the language in a single way. The inference rules of NARS are based on three inheritance relations between terms. With different combinations of premises, revision, deduction, induction, abduction, exemplification, comparison, and analogy can all be carried out in a uniform format, the major difference between these types of inference being that different functions are used to calculate the truth value of the conclusion from the truth values of the premises. Since it has insufficient space-time resources, the system needs to distribute them among its tasks very carefully, and to dynamically adjust the distribution as the situation changes. This leads to a "controlled concurrency" control mechanism, and a "bag-based" memory organization. A recent implementation of the NARS model, with examples, is discussed. The system has many interesting properties that are shared by human cognition, but are absent from conventional computational models of reasoning. 
This research sheds light on several notions in artificial intelligence and cognitive science, including symbol-grounding, induction, categorization, logic, and computation. These are discussed to show the implications of the new theory of intelligence. Finally, the major results of the research are summarized, a preliminary evaluation of the working definition of intelligence is given, and the limitations and future extensions of the research are discussed.
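The "different truth functions per inference type" idea can be illustrated with the deduction rule. In NARS each judgment carries a (frequency, confidence) pair; the formula below follows the published NAL deduction function, but treat this sketch as illustrative rather than as the system's implementation.

```python
# Hedged sketch of NARS-style truth values: each judgment is a
# (frequency, confidence) pair, and each inference rule computes the
# conclusion's pair with its own function. Deduction per NAL:
#   f = f1 * f2,  c = f1 * f2 * c1 * c2

def deduction(t1, t2):
    (f1, c1), (f2, c2) = t1, t2
    return (f1 * f2, f1 * f2 * c1 * c2)

# "ravens are birds" <1.0, 0.9> and "birds fly" <0.9, 0.9>
f, c = deduction((1.0, 0.9), (0.9, 0.9))
print(round(f, 2), round(c, 3))   # 0.9 0.729
```

Note how confidence degrades through the chain even when both premises are highly confident, which is the intended behaviour under insufficient evidence.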

73 citations


Proceedings ArticleDOI
19 Jun 1996
TL;DR: A policy-based approach to distributed systems management is proposed; the architecture is extended by a graph model of the process semantics of operational policy and event specifications, supported by a compiler that maps operational specifications into semantic graphs and performs analysis and manipulation functions on such graphs.
Abstract: The heterogeneity, increasing size and complexity of distributed systems requires new architectures, strategies, and tools for their technical management. In this paper we propose a policy based approach to distributed systems management. The use of different abstraction levels allows a stepwise refinement from an informal strategic level to a formalized operational level. On the lowest level, we use a formal language for separate definition of policies and events, that enables the computer to check the syntax of a given policy description and to translate policies into executable rules. To increase the capability for reasoning on a given set of policies, we extended the architecture by a graph model of the process semantics of operational policy and event specifications. The graph model is supported by a compiler mapping operational specifications into their semantic graphs, and performing analysis and manipulation functions on such graphs.

64 citations


Book ChapterDOI
23 Sep 1996
TL;DR: A state-oriented logical approach to active rules is presented which combines the declarative semantics of deductive rules with the possibility of defining updates in the style of production rules and active rules.
Abstract: After briefly reviewing the basic notions and terminology of active rules and relating them to production rules and deductive rules, respectively, we survey a number of formal approaches to active rules. Subsequently, we present our own state-oriented logical approach to active rules which combines the declarative semantics of deductive rules with the possibility to define updates in the style of production rules and active rules. The resulting language Statelog is surprisingly simple, yet captures many features of active rules including composite event detection and different coupling modes. Thus, it can be used for the formal analysis of rule properties like termination and expressive power. Finally, we show how nested transactions can be modeled in Statelog, both from the operational and the model-theoretic perspective.

61 citations


Book ChapterDOI
23 Sep 1996
TL;DR: It is shown that, in a formal sense, Old Georgian can be taken to provide an example of a non-semilinear language, and hence that mildly context-sensitive grammar formalisms such as multi-component TAGs and linear context-free rewrite systems are not strong enough to generate this language.
Abstract: Mildly context sensitive grammar formalisms such as multi-component TAGs and linear context free rewrite systems have been introduced to capture the full complexity of natural languages. We show that, in a formal sense, Old Georgian can be taken to provide an example of a non-semilinear language. This implies that none of the aforementioned grammar formalisms is strong enough to generate this language.

61 citations


Proceedings ArticleDOI
27 Jul 1996
TL;DR: The notions of serializability, linearizability and sequential consistency are used in the specification of concurrent systems and it is shown that the model checking problem for each of these properties can be cast in terms of the containment of one regular language in another regular language shuffled using a semi-commutative alphabet.
Abstract: The notions of serializability, linearizability and sequential consistency are used in the specification of concurrent systems. We show that the model checking problem for each of these properties can be cast in terms of the containment of one regular language in another regular language shuffled using a semi-commutative alphabet. The three model checking problems are shown to be, respectively, in PSPACE, in EXPSPACE, and undecidable.

Journal ArticleDOI
TL;DR: Pattern-matching and text-compression algorithms are two important subjects in the wider domain of text processing, and they also play an important role in theoretical computer science by providing challenging problems.
Abstract: Pattern matching is the problem of locating a specific pattern inside raw data. The pattern is usually a collection of strings described in some formal language. Applications require two kinds of solution depending upon which string, the pattern or the text, is given first. Solutions based on the use of automata or combinatorial properties of strings are commonly implemented to preprocess the pattern. The notion of indices realized by trees or automata is used in the second kind of solution. The aim of data compression is to provide a representation of data in a reduced form in order to save both storage space and transmission time. There is no loss of information: the compression processes are reversible. Pattern-matching and text-compression algorithms are two important subjects in the wider domain of text processing. They apply to the manipulation of texts (word editors), to the storage of textual data (text compression), and to data retrieval systems (full-text search). They are basic components used in implementations of practical software existing under most operating systems. Moreover, they emphasize programming methods that serve as paradigms in other fields of computer science (system or software design). Finally, they also play an important role in theoretical computer science by providing challenging problems. Although data are recorded in various ways, text remains the main way to exchange information. This is particularly evident in literature or linguistics, where data are composed of huge corpora and dictionaries, but applies as well to computer science, where a large amount of data is stored in linear files. And it is also the case, for instance, in molecular biology, because biological molecules can often be approximated as sequences of nucleotides or amino acids. Furthermore, the quantity of available data in these fields tends to double every 18 months.
This is the reason that algorithms must be efficient even if the speed and storage capacity of computers increase continuously.
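The "preprocess the pattern into an automaton" strategy mentioned above is exactly what the classic Knuth-Morris-Pratt algorithm does; a compact Python version is given below as an illustration (it is not code from the paper).

```python
def kmp_search(pattern, text):
    """Knuth-Morris-Pratt: preprocess the pattern into a failure table
    (an automaton in disguise), then scan the text in O(|text|)."""
    if not pattern:
        return list(range(len(text) + 1))
    # failure table: fail[i] = length of the longest proper border of pattern[:i+1]
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # scan the text, never moving backwards in it
    hits, k = [], 0
    for i, ch in enumerate(text):
        while k and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            hits.append(i - k + 1)
            k = fail[k - 1]      # allow overlapping occurrences
    return hits

print(kmp_search("aba", "ababab"))   # [0, 2]
```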

Proceedings ArticleDOI
27 Jul 1996
TL;DR: This work introduces a new syntactic explanation of type expressions as functors as well as a simple logic for programs with recursive types in which they carry out their proofs.
Abstract: We study recursive types from a syntactic perspective. In particular, we compare the formulations of recursive types that are used in programming languages and formal systems. Our main tool is a new syntactic explanation of type expressions as functors. We also introduce a simple logic for programs with recursive types in which we carry out our proofs.

Journal ArticleDOI
TL;DR: An algorithm is presented for computing a minimally restrictive control when the plant behaviour is a deterministic Petri net language and the desired behaviour is a regular language; the set of forbidden markings can be identified directly, without constructing any reachability tree.
Abstract: Algorithms for computing a minimally restrictive control in the context of supervisory control of discrete-event systems have been well developed when both the plant and the desired behaviour are given as regular languages. In this paper the authors extend such prior results by presenting an algorithm for computing a minimally restrictive control when the plant behaviour is a deterministic Petri net language and the desired behaviour is a regular language. As part of the development of the algorithm, the authors establish the following results that are of independent interest: i) the problem of determining whether a given deterministic Petri net language is controllable with respect to another deterministic Petri net language is reducible to a reachability problem of Petri nets and ii) the problem of synthesizing the minimally restrictive supervisor so that the controlled system generates the supremal controllable sublanguage is reducible to a forbidden marking problem. In particular, the authors can directly identify the set of forbidden markings without having to construct any reachability tree.
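For the purely regular case that this paper generalizes, the supremal controllable sublanguage can be computed by a simple fixpoint over a finite-state product. The sketch below is illustrative only: it assumes prefix-closed languages given as partial transition maps (dict of state to {event: next_state}), whereas the paper's contribution is precisely the harder case where the plant is a deterministic Petri net language.

```python
def supremal_controllable(plant, spec, uncontrollable, init):
    """Fixpoint sketch of the supremal controllable sublanguage for
    prefix-closed regular languages: build the product plant x spec,
    then repeatedly delete product states where an uncontrollable
    plant event is not permitted (or leads to a deleted state)."""
    # build the reachable part of the synchronous product
    states, frontier, trans = {(init, init)}, [(init, init)], {}
    while frontier:
        p, s = frontier.pop()
        for e, p2 in plant.get(p, {}).items():
            if e in spec.get(s, {}):
                q2 = (p2, spec[s][e])
                trans.setdefault((p, s), {})[e] = q2
                if q2 not in states:
                    states.add(q2)
                    frontier.append(q2)
    # iterate, since deletions can expose new violations upstream
    changed = True
    while changed:
        changed = False
        for q in list(states):
            p, _ = q
            for e in plant.get(p, {}):
                if e in uncontrollable:
                    q2 = trans.get(q, {}).get(e)
                    if q2 is None or q2 not in states:
                        states.discard(q)
                        changed = True
                        break
    return states

plant = {0: {"a": 1}, 1: {"u": 2}, 2: {}}   # u is uncontrollable
spec  = {0: {"a": 1}, 1: {}}                 # spec forbids u after a
print(supremal_controllable(plant, spec, {"u"}, 0))  # {(0, 0)}: only the empty string survives
```

Here the supervisor must disable the controllable event a at the start, because once a occurs the plant can fire u, which the specification forbids.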

Proceedings ArticleDOI
10 Mar 1996
TL;DR: This paper shows how to make the task of proving security properties of cryptographic protocols easier by automating the generation of lemmas involving the use of formal languages.
Abstract: The NRL protocol analyzer is a tool for proving security properties of cryptographic protocols, and for finding flaws if they exist. It is used by having the user first prove a number of lemmas stating that infinite classes of states are unreachable, and then performing an exhaustive search on the remaining state space. One main source of difficulty in using the tool is in generating the lemmas that are to be proved. In this paper we show how we have made this task easier by automating the generation of lemmas involving the use of formal languages.

Journal ArticleDOI
TL;DR: This approach, based on Universal Algebra, facilitates the development of the theory of formal languages so as to include the description of sets of finite trees, finite graphs, finite hypergraphs, tuples of words, partially commutative words and other similar finite objects.

Journal ArticleDOI
TL;DR: The paper provides an accurate analysis of the derivation mechanism and the expressive power of the SR formalism, which is necessary to fully exploit the capabilities of the model.
Abstract: A common approach to the formal description of pictorial and visual languages makes use of formal grammars and rewriting mechanisms. The present paper is concerned with the formalism of Symbol-Relation Grammars (SR grammars, for short). Each sentence in an SR language is composed of a set of symbol occurrences representing visual elementary objects, which are related through a set of binary relational items. The main feature of SR grammars is the uniform way they use context-free productions to rewrite symbol occurrences as well as relation items. The clearness and uniformity of the derivation process for SR grammars allow the extension of well-established techniques of syntactic and semantic analysis to the case of SR grammars. The paper provides an accurate analysis of the derivation mechanism and the expressive power of the SR formalism. This is necessary to fully exploit the capabilities of the model. The most meaningful features of SR grammars as well as their generative power are compared with those of well-known graph grammar families. In spite of their structural simplicity, variations of SR grammars have a generative power comparable with that of expressive classes of graph grammars, such as the edNCE and the N-edNCE classes.

Book ChapterDOI
14 May 1996
TL;DR: This paper proposes a conceptual organization for assumptions of problem-solving methods and suggests a formal language to describe them, taking examples from the Propose & Revise problem-solving method and from diagnosis.
Abstract: Assumptions of problem-solving methods refer to necessary applicability conditions of problem-solving methods, indicating that a problem-solving method is only applicable to realize a task, if the assumptions are met. In principle, such assumptions may refer to any kind of condition involved in a problem-solving method's applicability, including its required domain knowledge. In this paper, we propose a conceptual organization for assumptions of problem-solving methods and suggest a formal language to describe them. For illustration we take examples from the Propose & Revise problem-solving method and from diagnosis.

Journal ArticleDOI
TL;DR: Current results for letter-to-phoneme conversion are at least as good as the best reported so far for a data-driven approach, while being comparable in performance to knowledge-based approaches.

Proceedings ArticleDOI
22 Feb 1996
TL;DR: A tool currently under construction is presented that will allow the temporal validation of a system specification with respect to its R/T constraints while staying within the context of the SIGNAL language, by use of so-called temporal homomorphisms.
Abstract: We present a tool currently under construction in order to enhance the SIGNAL language environment with a facility that will allow the temporal validation of a system specification with respect to its R/T constraints while staying within the context of the SIGNAL language. By use of the so-called temporal homomorphisms we express the temporal dimension of a functional specification as a SIGNAL program. This facility can be further extended to evaluate the temporal behavior of a system with respect to a chosen execution architecture, by modelling processor architectural features influencing execution time. Reasoning about the timing properties of a program is indispensable in the development of time-critical systems, where failure to meet deadlines can result in loss of life or material. Much work has been done in the domain of functional specification, where formal languages like SIGNAL, Esterel, and Lustre can be successfully used. There seems to be an inadequacy as far as the validation of temporal properties is concerned, mainly because many different factors influence the execution time of a program, making this problem quite complicated when viewed from a high abstraction level.

Journal ArticleDOI
TL;DR: The use of formal specification languages for knowledge-based system (KBS) development is proposed, with languages that are closely based on the structure of informal knowledge models.
Abstract: Much of the work on validation and verification of knowledge based systems (KBSs) has been done in terms of implementation languages (mostly rule-based languages). Recent papers have argued that it is advantageous to do validation and verification in terms of a more abstract and formal specification of the system. However, constructing such formal specifications is a difficult task. This paper proposes the use of formal specification languages for KBS-development that are closely based on the structure of informal knowledge-models. The use of such formal languages has as advantages that (i) we can give strong support for the construction of a formal specification, namely on the basis of the informal description of the system; and (ii) we can use the structural correspondence to verify that the formal specification does indeed capture the informally stated requirements.

Journal ArticleDOI
TL;DR: Some set operations on collections of closed intervals are presented and used to parallelize MMach programs and to prove the equivalence between distinct MMach programs.
Abstract: Mathematical morphology on sets can be understood as a formal language, whose vocabulary comprises erosions, dilations, complementation, intersection and union. This language is complete, that is, it is enough to perform any set operation. Since the sixties special machines, called morphological machines (MMachs), have been built to implement this language. In the literature, we find hundreds of MMach programs that are used to solve image analysis problems. However, the design of these programs is not an elementary task. Thus, recently much research effort has been addressed to automating the programming of MMachs. A very promising approach to this problem is the description of the target operator by input-output pairs of images and the translation of these data into efficient MMach programs. This approach can be decomposed into two equally important steps: (1) learning of the target operator from pairs of images; (2) search for economical representations for the operators learned. The theory presented in this paper is useful in the second step of this procedure. We present some set operations on collections of closed intervals and give efficient algorithms to perform them. These operations are used to parallelize MMach programs and to prove the equivalence between distinct MMach programs. © 1996 SPIE and IS&T.
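The complete "vocabulary" of erosions, dilations, complementation, intersection and union is easy to demonstrate on sets of pixel coordinates: intersection, union and complement come for free as Python set operations, and the sketch below implements erosion and dilation directly from their Minkowski definitions. The 3x3 square and the cross structuring element are illustrative choices.

```python
def dilate(X, B):
    # Minkowski sum: every pixel of X stamped with the structuring element B
    return {(x + bx, y + by) for (x, y) in X for (bx, by) in B}

def erode(X, B):
    # a pixel survives only if B, translated there, fits entirely inside X
    return {(x, y) for (x, y) in X
            if all((x + bx, y + by) in X for (bx, by) in B)}

square = {(x, y) for x in range(3) for y in range(3)}  # 3x3 block of pixels
cross = {(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)}     # 4-neighbourhood element

print(erode(square, cross))        # {(1, 1)}: only the centre survives
print(len(dilate(square, cross)))  # 21
```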

Book
01 Sep 1996
TL;DR: Pan is a language-based editing and browsing system designed to support development and maintenance of complex software documents, including programs; it uses a variety of mechanisms to help users understand and manipulate complex documents effectively, in terms of underlying language when necessary, but always in the framework of a coherent, user-oriented interface.
Abstract: Many kinds of complex documents, including programs, are based on underlying formal languages. Language-based editing systems exploit knowledge of these languages to provide services beyond the scope of traditional text editors. To be effective, these services must use the power of language-based information to broaden the options available to the user, but without revealing complex linguistic and implementation models. Users understand complex documents in terms of many overlapping structures, only some of which are related to linguistic structure. Communications with the user concerning document structures must be based on models of document structure that are natural, convenient, and coherent to the user. Pan is a language-based editing and browsing system designed to support development and maintenance of complex software documents. Pan uses a variety of mechanisms to help users understand and manipulate complex documents effectively, in terms of underlying language when necessary, but always in the framework of a coherent, user-oriented interface. A fully functional prototype is in regular use. It serves as a testbed for ongoing research on a wide variety of tools to aid document comprehension, as well as on language-based analysis methods.

Journal ArticleDOI
TL;DR: An infinite hierarchy of languages that comprises the context-free languages as the first and all the languages generated by Tree Adjoining Grammars (TAGs) as the second element is obtained.

Proceedings ArticleDOI
22 Mar 1996
TL;DR: The focus of this paper is on the specification of integrated language-based tools; it outlines the main concepts of the object-oriented tool specification language GTSL (GOODSTEP Tool Specification Language), developed within GOODSTEP, an ESPRIT-III project to produce a General Object-Oriented Database for SofTware Engineering Processes.
Abstract: The definition of software development methods encompasses the definition of syntax and static semantics of formal languages. These languages determine documents to be produced during the application of a method. Developers demand language-based tools that provide document production support, check the syntax and static semantics of documents, and thus implement methods. Method integration must determine inter-document consistency constraints between documents produced in the various tasks. Tools must, therefore, be integrated to implement the required method integration and check or even preserve inter-document consistency. The focus of this paper is on the specification of such integrated tools and outlines the main concepts of the object-oriented tool specification language GTSL (GOODSTEP Tool Specification Language). GOODSTEP is an ESPRIT-III project (no. 6115) to produce a General Object-Oriented Database for SofTware Engineering Processes.

Book ChapterDOI
23 Sep 1996
TL;DR: This survey reviews a number of logical theories in which the state of the underlying database can evolve with time, discusses their application domains, and highlights their strong and weak points.
Abstract: Updates are a crucial component of any database programming language. Even the simplest database transactions, such as withdrawal from a bank account, require updates. Unfortunately, updates are not accounted for by the classical Horn semantics of logic programs and deductive databases, which limits their usefulness in real-world applications. As a short-term practical solution, logic programming languages have resorted to handling updates using ad hoc operators without a logical semantics. A great many works have been dedicated to developing logical theories in which the state of the underlying database can evolve with time. Many of these theories were developed with specific applications in mind, such as reasoning about actions, database transactions, program verification, etc. As a result, the different approaches have different strengths and weaknesses. In this survey, we review a number of these works, discuss their application domains, and highlight their strong and weak points.

Journal ArticleDOI
TL;DR: This paper shows how the RCC spatial calculus can be used to provide an unambiguous, formal description of visual computer languages by systematically describing the syntax of Pictorial Janus.
Abstract: Visual computer languages exploit the natural language of diagrams and pictures to provide a simple and intelligible approach to programming. Unfortunately, it is difficult to provide them with a formal syntax and semantics. In this paper we show how the RCC spatial calculus can be used to provide an unambiguous, formal description of such languages by systematically describing the syntax of Pictorial Janus.

Journal ArticleDOI
TL;DR: A solution to the Sisyphus-II elevator-design problem is described that combines the formal specification language KARL with the configurable role-limiting shell approach; the result is a description of the knowledge-based system at different levels, corresponding to the different activities of its development process.
Abstract: This paper describes a solution to the Sisyphus-II elevator-design problem by combining the formal specification language KARL and the configurable role-limiting shell approach. A knowledge-based system configuring elevator systems is specified and implemented. First, the knowledge is described in a graphical and semi-formal manner influenced by the KADS models of expertise. A formal description is then gained by supplementing the semi-formal description with formal specifications which add a new level of precision and uniqueness. Finally, a generic shell for propose-and-revise systems is designed and implemented as the realization of the final system. This shell was derived by adapting the shellbox COKE, also used for the previous Sisyphus office-assignment problem. As a result of this integration, we get a description of the knowledge-based system at different levels corresponding to the different activities of its development process.

Journal ArticleDOI
TL;DR: A method is introduced for ray tracing recursive objects defined by parametric rewriting systems, using constructive solid geometry (CSG) as the underlying object representation; the rewriting systems are translated into cyclic CSG graphs, which can be used directly as an object representation for ray tracing.
Abstract: A method for ray tracing recursive objects defined by parametric rewriting systems using constructive solid geometry (CSG) as the underlying method of object representation is introduced. Thus, the formal languages of our rewriting systems are subsets of the infinite set of CSG expressions. Instead of deriving such expressions to build up large CSG trees, we translate the systems into cyclic CSG graphs, which can be used directly as an object representation for ray tracing. For this purpose the CSG concept is extended by three new nodes. Selection nodes join all the rules for one grammar symbol, control flow by selecting proper rules, and are end-points of cyclic edges. Transformation nodes map the rays in affine space. Calculation nodes evaluate a finite set of arithmetic expressions to modify global parameters, which effect flow control and transformations. The CSG graphs introduced here are a very compact data structure, much like the describing data set. This property meets our intention to avoid both restrictions of the complexity of the scenes by computer memory and the approximation accuracy of objects.

Proceedings ArticleDOI
24 May 1996
TL;DR: A strictly stronger version of the Conversion Lemma, one that allows iterated application, is proved, and the succinct version of every problem is shown to be complete under projection reductions for the leaf language it defines.
Abstract: The concepts of succinct problem representation, and of NP leaf languages, were developed to characterize complexity classes above polynomial time. Here, we work out a descriptive complexity approach to succinctly represented problems, and prove a strictly stronger version of the Conversion Lemma from Balcazar et al (1992) which allows iterated application. Moreover, we prove that for every problem Π its succinct version sΠ is complete under projection reductions for the leaf language it defines. Projection reductions are a highly restrictive reducibility notion stemming from descriptive complexity theory. Our main tool is a characterization of polynomial-time Turing machines in terms of circuits which are constructed uniformly by quantifier-free formulas. Finally, we show that an alternative succinct representation model makes it possible to obtain completeness results for all syntactic complexity classes even under monotone projection reductions. Thus, we positively answer a question by Stewart (1991, 1994) for a large number of complexity classes.