
Showing papers on "Formal language published in 1999"


Book
01 Jan 1999
TL;DR: This book develops a logical theory of belief change, covering the contraction, revision, and base-generated operations on beliefs, and extends the formal language used to express them.
Abstract: Preface. Suggested Courses. 1. Having Beliefs. 1+. Basic Tools. 2. Giving Up Beliefs. 2+. The Logic of Contraction. 3. Taking on Beliefs. 3+. The Logic of Revision. 4. Hidden Structures of Belief. 4+. The Logic of Base-Generated Operations. 5. Believing and Hypothesizing. 5+. Extending the Formal Language. 6. Afterthought. 6+. Some Unsolved Problems. Suggested Readings. Bibliography. Symbol Index. Name Index.

354 citations


Book ChapterDOI
01 Sep 1999
TL;DR: Isar's main aspect is its formal language for natural deduction proofs, which sets out to bridge the semantic gap between internal notions of proof given by state-of-the-art interactive theorem proving systems and an appropriate level of abstraction for user-level work.
Abstract: We present a generic approach to readable formal proof documents, called Intelligible semi-automated reasoning (Isar). It addresses the major problem of existing interactive theorem proving systems that there is no appropriate notion of proof available that is suitable for human communication, or even just maintenance. Isar's main aspect is its formal language for natural deduction proofs, which sets out to bridge the semantic gap between internal notions of proof given by state-of-the-art interactive theorem proving systems and an appropriate level of abstraction for user-level work. The Isar language is both human readable and machine-checkable, by virtue of the Isar/VM interpreter. Compared to existing declarative theorem proving systems, Isar avoids several shortcomings: it is based on a few basic principles only, it is quite independent of the underlying logic, and supports a broad range of automated proof methods. Interactive proof development is supported as well. Most of the Isar concepts have already been implemented within Isabelle. The resulting system already accommodates simple applications.

248 citations


Proceedings ArticleDOI
23 Mar 1999
TL;DR: A (formal) meta-model for event algebras is introduced that subdivides the semantics of complex events into elementary, independent dimensions and fulfils the criteria for a good language design to a large extent.
Abstract: Active database management systems have been developed for applications needing an automatic reaction in response to certain events. Events can be simple in nature or complex. Complex events rely on simpler ones and are usually specified with the help of operators of an event algebra. There are quite a few papers dealing with extensions of existing event algebras. However, a systematic and comprehensive analysis of the semantics of complex events is still lacking. As a consequence, most proposals suffer from different kinds of peculiarities. Independent aspects are not treated independently, leading to shady mixtures of aspects in operators. Moreover, aspects are not always treated uniformly. Operators may have semantics other than those expected. The paper addresses these problems by an extensive and in-depth analysis of the foundations of complex events. As a result of this analysis, a (formal) meta-model for event algebras is introduced that subdivides the semantics of complex events into elementary, independent dimensions. Each of these dimensions is discussed in detail. The resulting language specification fulfils the criteria for good language design (orthogonality, symmetry, homogeneity, a lean set of language constructs) to a large extent.

211 citations
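
The abstract above argues that complex-event operators silently mix independent semantic dimensions. As a toy illustration (not the paper's meta-model), the sketch below implements a sequence operator over timestamped events and makes one such dimension, instance selection ("first" vs. "last"), an explicit parameter. All names and the event format are invented.

```python
from dataclasses import dataclass

@dataclass
class Event:
    type: str
    time: int

def sequence(stream, a, b, selection="first"):
    """Detect the complex event 'a; b': an instance of a followed by an
    instance of b. Which instance of a is consumed is an independent
    semantic dimension; here it is an explicit parameter, not an
    implicit property of the operator."""
    candidates = [e for e in stream if e.type == a]
    if not candidates:
        return None
    anchor = candidates[0] if selection == "first" else candidates[-1]
    return next((e for e in stream if e.type == b and e.time > anchor.time), None)

stream = [Event("A", 1), Event("B", 2), Event("A", 3), Event("B", 5)]
print(sequence(stream, "A", "B", "first"))  # Event(type='B', time=2)
print(sequence(stream, "A", "B", "last"))   # Event(type='B', time=5)
```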


Journal Article
Joshua T. Goodman
TL;DR: This work synthesizes work on parsing algorithms, deductive parsing, and the theory of algebra applied to formal languages into a general system for describing parsers, in which a single, simple representation describes parsers that compute recognition, derivation forests, Viterbi, n-best, inside values, and other values.
Abstract: We synthesize work on parsing algorithms, deductive parsing, and the theory of algebra applied to formal languages into a general system for describing parsers. Each parser performs abstract computations using the operations of a semiring. The system allows a single, simple representation to be used for describing parsers that compute recognition, derivation forests, Viterbi, n-best, inside values, and other values, simply by substituting the operations of different semirings. We also show how to use the same representation, interpreted differently, to compute outside values. The system can be used to describe a wide variety of parsers, including Earley's algorithm, tree adjoining grammar parsing, Graham-Harrison-Ruzzo parsing, and prefix value computation.

187 citations
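
The central idea above is that one chart computation, parameterized by a semiring, yields recognition, inside values, Viterbi scores, and more. Here is a minimal sketch in that spirit: CKY over a toy grammar where only the (plus, times, zero, one) operations change. The grammar encoding is an invented convenience, not the paper's notation.

```python
# Semirings as (plus, times, zero, one); swapping them changes what the
# same chart computation means, in the spirit of semiring parsing.
BOOLEAN = (lambda a, b: bool(a) or bool(b), lambda a, b: bool(a) and bool(b), False, True)
INSIDE  = (lambda a, b: a + b, lambda a, b: a * b, 0.0, 1.0)
VITERBI = (max, lambda a, b: a * b, 0.0, 1.0)

def cky(words, unary, binary, start, semiring):
    plus, times, zero, one = semiring
    n = len(words)
    chart = {}  # (i, j, nonterminal) -> semiring value

    def get(i, j, nt):
        return chart.get((i, j, nt), zero)

    for i, w in enumerate(words):                      # terminal rules
        for nt, val in unary.get(w, []):
            chart[(i, i + 1, nt)] = plus(get(i, i + 1, nt), times(one, val))
    for span in range(2, n + 1):                       # binary rules
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (lhs, b, c), val in binary.items():
                    v = times(val, times(get(i, k, b), get(k, j, c)))
                    chart[(i, j, lhs)] = plus(get(i, j, lhs), v)
    return get(0, n, start)

# Toy PCFG: S -> A B (1.0), A -> 'a' (1.0), B -> 'b' (1.0)
unary = {"a": [("A", 1.0)], "b": [("B", 1.0)]}
binary = {("S", "A", "B"): 1.0}
print(cky(["a", "b"], unary, binary, "S", BOOLEAN))  # True  (recognition)
print(cky(["a", "b"], unary, binary, "S", INSIDE))   # 1.0   (inside value)
```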


Journal ArticleDOI
TL;DR: A variant of Fine and Wilf's theorem for partial words is proved, and extensions of some general combinatorial properties of words are given.

168 citations


Proceedings ArticleDOI
06 Oct 1999
TL;DR: The lexical, syntactic and semantic structure of function identifiers is analyzed by means of a segmentation technique, a regular language and a conceptual classification, and the application of these analyses to a database of procedural programs suggests some potential uses.
Abstract: The identifiers chosen by programmers as function names contain valuable information. They are often the starting point for program understanding activities, especially when high-level views, like the call graph, are available. In this paper, the lexical, syntactic and semantic structure of function identifiers is analyzed by means of a segmentation technique, a regular language and a conceptual classification. The application of these analyses to a database of procedural programs suggests some potential uses of the results, ranging from support for program understanding to the evolution towards standard and more maintainable forms of programs.

152 citations
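
As a rough illustration of the segmentation step described above (the paper's segmentation technique, regular language, and conceptual classification are richer), here is a camel-case/underscore splitter. The regular expression encodes common naming conventions and is an assumption of this sketch, not the paper's regular language.

```python
import re

def segment(identifier):
    """Split a function identifier into candidate words."""
    parts = re.split(r"[_\W]+", identifier)
    words = []
    for part in parts:
        # split camelCase and acronym boundaries, e.g. parseHTTPHeader
        words += re.findall(r"[A-Z]+(?![a-z])|[A-Z]?[a-z]+|\d+", part)
    return [w.lower() for w in words if w]

print(segment("getFileName"))        # ['get', 'file', 'name']
print(segment("parse_HTTP_header"))  # ['parse', 'http', 'header']
```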


Book ChapterDOI
01 Jan 1999
TL;DR: Table-processing techniques of rough set theory may be used to simplify the tables and table transformations of fuzzy logic designs, reducing the complexity of large-scale fuzzy systems.
Abstract: The primary goal of granular computing is to elevate lower-level data processing to higher-level knowledge processing. Such an elevation is achieved by granulating the data space into a concept space. Each granule represents a certain primitive concept, and the granulation as a whole represents knowledge. In this paper, this intuitive idea is formalized into a mathematical theory: Zadeh’s informal words are taken literally as a formal definition of granulation. Such a mathematical notion is a mild generalization of the “old” notion of crisp/fuzzy neighborhood systems of (pre-)topological spaces. A crisp/fuzzy neighborhood is a granule and is assigned a meaningful name to represent a certain primitive concept or to summarize the information content. The set of all linear combinations of these names, called formal words, mathematically forms a vector space over the real numbers. Each vector is intuitively an advanced concept represented by some “weighted average” of primitive concepts. In terms of these concepts, the universe can be represented by a formal word table; this is one form of Zadeh’s veristic constraints. Such a representation is useful; fuzzy logic designs can be formulated as series of table transformations. So table-processing techniques of rough set theory may be used to simplify these tables and their transformations. Therefore the complexity of large-scale fuzzy systems may be reduced; details will be reported in future papers.

134 citations
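
One way to picture the "formal words" construction in the abstract above: granules get names, and an advanced concept is a linear combination of those names, i.e. a vector. The granule names and weights below are invented for illustration only.

```python
def combine(*weighted_words):
    """Linear combination of formal words, each a {granule_name: weight} map."""
    result = {}
    for coeff, word in weighted_words:
        for name, w in word.items():
            result[name] = result.get(name, 0.0) + coeff * w
    return result

warm = {"red": 0.7, "orange": 0.3}   # primitive-concept granules
calm = {"blue": 0.8, "green": 0.2}
# an "advanced concept" as a weighted average of primitive ones
print(combine((0.5, warm), (0.5, calm)))
# {'red': 0.35, 'orange': 0.15, 'blue': 0.4, 'green': 0.1}
```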


Proceedings ArticleDOI
17 Oct 1999
TL;DR: This paper discusses testability of more complex languages and shows that the query complexity required for testing context-free languages cannot be bounded by any function of ε.
Abstract: We continue the study of combinatorial property testing, initiated by Goldreich, Goldwasser and Ron (1996). The subject of this paper is testing regular languages. Our main result is as follows. For a regular language L ⊆ {0, 1}* and an integer n there exists a randomized algorithm which always accepts a word w of length n if w ∈ L, and rejects it with high probability if w has to be modified in at least εn positions to create a word in L. The algorithm queries Õ(1/ε) bits of w. This query complexity is shown to be optimal up to a factor polylogarithmic in 1/ε. We also discuss testability of more complex languages and show, in particular, that the query complexity required for testing context-free languages cannot be bounded by any function of ε. The problem of testing regular languages can be viewed as a part of a very general approach, seeking to probe testability of properties defined by logical means.

134 citations
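
The paper's tester for arbitrary regular languages is intricate; as a hedged toy instance, here is the folklore pair-sampling tester for the single regular language 0*1*. Note it uses O(1/ε²) samples, weaker than the paper's Õ(1/ε) bound; it only illustrates the accept/reject contract of a property tester.

```python
import random

def test_01star(w, eps):
    """Accept words in 0*1*; reject, with high probability, words that are
    eps-far from it. Words eps-far from 0*1* contain many '10' inversion
    pairs, so random position pairs find one with high probability."""
    n = len(w)
    trials = int(4 / eps ** 2)
    for _ in range(trials):
        i, j = sorted(random.sample(range(n), 2))
        if w[i] == "1" and w[j] == "0":
            return False  # witness that w is not in 0*1*
    return True  # w is (probably) close to 0*1*

print(test_01star("0001111", eps=0.1))          # True
print(test_01star("1010101010" * 10, eps=0.1))  # very likely False
```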


Journal ArticleDOI
TL;DR: A class of algorithms is proposed which allows for the identification of the structure of the minimal stochastic automaton generating the language, and it is shown that the time needed grows only linearly with the size of the sample set.
Abstract: In this paper, the identification of stochastic regular languages is addressed. For this purpose, we propose a class of algorithms which allow for the identification of the structure of the minimal stochastic automaton generating the language. It is shown that the time needed grows only linearly with the size of the sample set and a measure of the complexity of the task is provided. Experimentally, our implementation proves very fast for application purposes.

127 citations
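
State-merging learners of stochastic automata typically build a frequency prefix-tree acceptor from the sample and then merge statistically compatible states. The sketch below shows those two ingredients in an ALERGIA-like style; it is an illustrative assumption about the approach, not the algorithm class proposed in the paper.

```python
from collections import defaultdict
from math import sqrt, log

def prefix_tree(sample):
    """Frequency prefix-tree acceptor: per-node children, visit counts,
    and counts of words ending at each node."""
    children, counts, ends = defaultdict(dict), defaultdict(int), defaultdict(int)
    next_id = [1]
    for word in sample:
        node = 0
        counts[node] += 1
        for sym in word:
            if sym not in children[node]:
                children[node][sym] = next_id[0]
                next_id[0] += 1
            node = children[node][sym]
            counts[node] += 1
        ends[node] += 1
    return children, counts, ends

def compatible(n1, n2, counts, ends, alpha=0.05):
    """Hoeffding test: are the termination frequencies of two states close
    enough, given how often each was visited, to justify merging them?"""
    f1, f2 = ends[n1] / counts[n1], ends[n2] / counts[n2]
    bound = sqrt(0.5 * log(2 / alpha)) * (1 / sqrt(counts[n1]) + 1 / sqrt(counts[n2]))
    return abs(f1 - f2) < bound

children, counts, ends = prefix_tree(["ab", "ab", "a", "abab"])
print(compatible(0, children[0]["a"], counts, ends))  # True on this tiny sample
```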


BookDOI
01 Mar 1999
TL;DR: For the first time in book form, original results from the last ten years are presented, some previously unpublished, using combinatorial and algebraic methods, mainly based on combinatorics on words and especially on the theory of "unavoidable regularities".
Abstract: This is a rigorous and self-contained monograph on a central topic in theoretical computer science. For the first time in book form, original results from the last ten years are presented, some previously unpublished, using combinatorial and algebraic methods. These are mainly based on combinatorics on words and especially on the theory of "unavoidable regularities." Researchers will find important new results on semigroups and formal languages, as well as various applications for these methods.

118 citations


Proceedings ArticleDOI
02 Jul 1999
TL;DR: This model is shown to be fully abstract, with respect to an equivalence based on both safety and liveness properties, by means of a factorization theorem which states that every nondeterministic strategy is the composite of a deterministic strategy with a nondeterministic oracle.
Abstract: A game semantics of finite nondeterminism is proposed. In this model, a strategy may make a choice between different moves in a given situation; moreover, strategies carry extra information about their possible divergent behaviour. A Cartesian closed category is built and a model of a simple, higher-order nondeterministic imperative language is given. This model is shown to be fully abstract, with respect to an equivalence based on both safety and liveness properties, by means of a factorization theorem which states that every nondeterministic strategy is the composite of a deterministic strategy with a nondeterministic oracle.

Journal ArticleDOI
TL;DR: A system which translates the Itten theory into a formal language that expresses the semantics associated with the combination of chromatic properties of color images and exploits a competitive learning technique to segment images into regions with homogeneous colors.
Abstract: The development of a system supporting querying of image databases by color content tackles a major design choice about properties of colors which are referenced within user queries. On the one hand, low-level properties directly reflect numerical features and concepts tied to the machine representation of color information. On the other hand, high-level properties address concepts such as the perceptual quality of colors and the sensations that they convey. Color-induced sensations include warmth, accordance or contrast, harmony, excitement, depression, anguish, etc. In other words, they refer to the semantics of color usage. In particular, paintings are an example where the message is contained more in the high-level color qualities and spatial arrangements than in the physical properties of colors. Starting from this observation, Johannes Itten introduced a formalism to analyze the use of color in art and the effects that this induces on the user's psyche. In this paper, we present a system which translates the Itten theory into a formal language that expresses the semantics associated with the combination of chromatic properties of color images. The system exploits a competitive learning technique to segment images into regions with homogeneous colors. Fuzzy sets are used to represent low-level region properties such as hue, saturation, luminance, warmth, size and position. A formal language and a set of model-checking rules are implemented to define semantic clauses and verify the degree of truth by which they hold over an image.
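
As a hint of the low-level layer described above, here is a fuzzy membership function for the "warm" color concept as a function of hue. The triangular shape and breakpoints are invented assumptions for illustration, not Itten's values or the paper's learned regions.

```python
def warmth(hue_degrees):
    """Triangular fuzzy membership in 'warm': reds/oranges high, blues zero."""
    h = hue_degrees % 360
    d = min(abs(h - 30), 360 - abs(h - 30))  # angular distance to orange (30°)
    return max(0.0, 1.0 - d / 120.0)

print(warmth(0))    # red  -> 0.75
print(warmth(210))  # blue -> 0.0
```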

Journal ArticleDOI
TL;DR: The notion of regularity, i.e., finiteness of automata representation, of probabilistic languages has been defined and it is shown that regularity is preserved under choice, concatenation, and Kleene-closure.
Abstract: The formalism of probabilistic languages has been introduced for modeling the qualitative behavior of stochastic discrete-event systems. A probabilistic language is a unit interval-valued map over the set of traces of the system satisfying certain consistency constraints. Regular language operators such as choice, concatenation, and Kleene-closure have been defined in the setting of probabilistic languages to allow modeling of complex systems in terms of simpler ones. The set of probabilistic languages is closed under such operators, thus forming an algebra. It is also a complete partial order under a natural ordering in which the operators are continuous. Hence, recursive equations can be solved in this algebra. This is alternatively derived by using the contraction mapping theorem on the set of probabilistic languages, which is shown to be a complete metric space. The notion of regularity, i.e., finiteness of automata representation, of probabilistic languages has been defined, and it is shown that regularity is preserved under choice, concatenation, and Kleene-closure. We show that this formalism is also useful in describing system performance measures such as completion time, reliability, etc., and present properties to aid their computation.
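
A minimal sketch of the objects involved, assuming a probabilistic language is a map from traces to [0,1] that is monotone under prefixes. The choice operator below follows one plausible reading of the abstract; it is not taken from the paper's formal definitions.

```python
def consistent(L):
    """Check prefix consistency: extending a trace cannot raise its value."""
    return all(L[t] <= L[t[:-1]] for t in L if t and t[:-1] in L)

def choice(L1, L2, p):
    """Probabilistic choice: with probability p behave as L1, else as L2."""
    traces = set(L1) | set(L2)
    return {t: p * L1.get(t, 0.0) + (1 - p) * L2.get(t, 0.0) for t in traces}

L1 = {"": 1.0, "a": 0.6, "ab": 0.3}
L2 = {"": 1.0, "b": 0.5}
print(consistent(L1))         # True
print(choice(L1, L2, 0.5))    # pointwise convex combination
```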

Patent
03 Sep 1999
TL;DR: In this paper, a method for hierarchical translation of input to a formal command in natural language understanding systems is presented, in which an input command is given to a natural language understanding engine containing at least two translator levels.
Abstract: A method for hierarchical translation of input to a formal command in natural language understanding systems includes presenting an input command to be translated to a natural language understanding engine. At least two translator levels are provided in the natural language understanding engine. A first translator level of the at least two translator levels translates the input command into at least one category by associating the input command with the at least one category for the next level of translators. A formal language command is output for the input command from a last of the at least two translator levels based on the input command and the at least one category.
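
A minimal sketch of the two-level idea in the patent abstract: a first translator level assigns the input command a category, and the last level emits the formal command. The categories, keyword rules, and command syntax below are invented for illustration.

```python
# Hypothetical category lexicon; the patent does not specify one.
CATEGORIES = {"mail": ["send", "forward", "reply"], "calendar": ["schedule", "cancel"]}

def level1(utterance):
    """First translator level: coarse categorization of the input command."""
    for category, verbs in CATEGORIES.items():
        if any(v in utterance.lower() for v in verbs):
            return category
    return "unknown"

def level2(utterance, category):
    """Last translator level: category plus input -> formal language command."""
    verb = utterance.lower().split()[0]
    return f"{category}.{verb}()"

utt = "Send the report to Alice"
print(level2(utt, level1(utt)))  # mail.send()
```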

Book ChapterDOI
22 Mar 1999
TL;DR: It is shown that Lr is decidable, and it is explained how Lr relates to two previously defined structure-description formalisms by showing how an arbitrary shape descriptor from each of these formalisms can be translated into an Lr formula.
Abstract: This paper aims to provide a better formalism for describing properties of linked data structures (e.g., lists, trees, graphs), as well as the intermediate states that arise when such structures are destructively updated. The paper defines a new logic that is suitable for these purposes (called Lr, for "logic of reachability expressions"). We show that Lr is decidable, and explain how Lr relates to two previously defined structure-description formalisms ("path matrices" and "static shape graphs") by showing how an arbitrary shape descriptor from each of these formalisms can be translated into an Lr formula.

Journal ArticleDOI
TL;DR: Computing Reviews is a monthly journal that publishes critical reviews on a broad range of computing subjects, including models of computation, formal languages, computational complexity theory, analysis of algorithms, and logics and semantics of programs.
Abstract: As a service to our readers, SIGACT News has an agreement with Computing Reviews to reprint reviews of books and articles of interest to the theoretical computer science community. Computing Reviews is a monthly journal that publishes critical reviews on a broad range of computing subjects, including models of computation, formal languages, computational complexity theory, analysis of algorithms, and logics and semantics of programs. ACM members can receive a subscription to Computing Reviews for $45 per year by writing to ACM headquarters.

Journal ArticleDOI
TL;DR: It will be shown that greatest consistent specializations (GCSs) always exist and are compatible with conjunctions of invariants, and that under certain mild restrictions the general construction of such GCSs is possible.
Abstract: State-oriented specifications with invariants occur in almost all formal specification languages. Hence the problem is to prove the consistency of the specified operations with respect to the invariants. Whilst the problem seems to be easily solvable in predicative specifications, it usually requires sophisticated verification efforts when specifications in the style of Dijkstra's guarded commands, as e.g. in the specification language B, are used. As an alternative, consistency enforcement is discussed in this paper. The basic idea is to replace inconsistent operations by new consistent ones that preserve the intention of the old ones. More precisely, this can be formalized by consistent specializations, where specialization is a specific partial order on operations defined via predicate transformers. It will be shown that greatest consistent specializations (GCSs) always exist and are compatible with conjunctions of invariants. Then, under certain mild restrictions, the general construction of such GCSs is possible. Precisely, given the GCSs of simple basic assignments, the GCS of a complex operation results from replacing the involved assignments by their GCSs and investigating a guard. In general, GCS construction can be embedded in refinement calculi and therefore strengthens the systematic development of correct programs.

Journal ArticleDOI
TL;DR: An efficient, O(n²), syntax analyzer for languages generated by (dynamically) programmed grammars is introduced; the class of grammars introduced is descriptively stronger than context-free grammars and can be used for the analysis of complex trend functions describing the behavior of industrial equipment.

Book ChapterDOI
30 Aug 1999
TL;DR: The model of generalized P-systems (GP-systems for short) is considered, a new model for computation using membrane structures recently introduced by Gheorghe Păun; it allows for the simulation of graph-controlled grammars of arbitrary type based on productions working on single objects.
Abstract: We consider a variant of P-systems, a new model for computations using membrane structures recently introduced by Gheorghe Păun. Using the membranes as a kind of filter for specific objects when transferring them into an inner compartment turns out to be a very powerful mechanism in combination with suitable rules to be applied within the membranes. The model of generalized P-systems, GP-systems for short, considered in this paper allows for the simulation of graph-controlled grammars of arbitrary type based on productions working on single objects; for example, the general results we establish in this paper can immediately be applied to the graph-controlled versions of context-free string grammars, n-dimensional #-context-free array grammars, and elementary graph grammars.
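
A minimal sketch of the filtering idea described in the abstract: objects are rewritten in an outer region, and a membrane admits only those passing its filter into an inner compartment. The rule and filter formats are invented here; GP-systems proper are far more general.

```python
def step(outer, inner, rules, membrane_filter):
    """One step: apply multiset rewriting rules in the outer region, then
    transfer the objects the membrane filter admits into the inner region."""
    produced = [out for obj in outer for out in rules.get(obj, [obj])]
    stay = [o for o in produced if not membrane_filter(o)]
    passed = [o for o in produced if membrane_filter(o)]
    return stay, inner + passed

outer, inner = ["a", "a", "b"], []
rules = {"a": ["b", "c"]}  # a -> b c (multiset rewriting)
outer, inner = step(outer, inner, rules, lambda o: o == "c")
print(outer, inner)  # ['b', 'b', 'b'] ['c', 'c']
```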

Journal ArticleDOI
01 Oct 1999
TL;DR: In this paper, the authors present an automatic approach to verifying designs of real-time distributed systems against complex timing requirements, for designs which adhere to the hypotheses of the analytical theory of fixed-priority scheduling.
Abstract: We present an automatic approach to verify designs of real-time distributed systems against complex timing requirements. We focus our analysis on designs which adhere to the hypotheses of the analytical theory of fixed-priority scheduling. Unlike previous formal approaches, we draw from that theory and build small formal models (based on Timed Automata) to be analyzed by means of model-checking tools. We are thus integrating scheduling analysis into the framework of automatic formal verification.
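
The analytical fixed-priority theory the authors build on includes classic response-time analysis, where the worst-case response time of task i solves R_i = C_i + Σ_j ceil(R_i / T_j) · C_j over higher-priority tasks j. A worked sketch of that fixed-point iteration, with invented task parameters:

```python
from math import ceil

def response_time(C, T, i):
    """Fixed-point iteration for the worst-case response time of task i,
    given execution times C and periods T sorted by decreasing priority."""
    R = C[i]
    while True:
        R_next = C[i] + sum(ceil(R / T[j]) * C[j] for j in range(i))
        if R_next == R:
            return R
        if R_next > T[i]:
            return None  # deadline (taken equal to the period) missed
        R = R_next

C, T = [1, 2, 3], [4, 6, 12]  # highest-priority task first
print([response_time(C, T, i) for i in range(3)])  # [1, 3, 10]
```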

Proceedings ArticleDOI
12 Oct 1999
TL;DR: This work describes how Attempto Controlled English (ACE), a subset of English that can be unambiguously translated into first-order logic and thus can conveniently replace first-order logic as a formal notation, has been used as a front-end to EP Tableaux.
Abstract: Many domain specialists are not familiar or comfortable with formal notations and formal tools like theorem provers or model generators. To address this problem, we developed Attempto Controlled English (ACE), a subset of English that can be unambiguously translated into first-order logic and thus can conveniently replace first-order logic as a formal notation. We describe how ACE has been used as a front-end to EP Tableaux, a model generation method complete for unsatisfiability and for finite satisfiability. We specified in ACE a database example that was previously expressed in the EP Tableaux language PRQ, automatically translated the ACE specification into PRQ, and with the help of EP Tableaux reproduced the previously found results.
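
To make the controlled-language idea concrete, here is a toy translator for a single rigid English pattern into first-order logic. ACE's actual grammar, and the PRQ translation used in the paper, cover far more; the pattern and output syntax below are invented.

```python
import re

def ace_to_fol(sentence):
    """Translate 'Every X Vs a Y.' into a first-order formula string."""
    m = re.match(r"Every (\w+) (\w+)s a (\w+)\.?", sentence)
    if not m:
        raise ValueError("outside the toy fragment")
    noun1, verb, noun2 = m.groups()
    return f"forall X ({noun1}(X) -> exists Y ({noun2}(Y) & {verb}(X, Y)))"

print(ace_to_fol("Every student reads a book."))
# forall X (student(X) -> exists Y (book(Y) & read(X, Y)))
```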

Journal ArticleDOI
TL;DR: The formal definition of the SCAN language is provided, and the underlying method for spatial access, motivated by the principle of recursive decomposition of an image array into hierarchical levels for efficient local and global processing, is described.

Book ChapterDOI
26 Apr 1999
TL;DR: MobiS, a specification language based on a tuple-space model which specifies coordination by multiset rewriting, is introduced; it is shown how MobiS can be flexibly used to specify architectures containing mobile components and to formalize some common mobility paradigms.
Abstract: New formal languages have been proposed for the specification of mobility aspects of systems and for understanding the recently devised technologies for mobile computing. In this paper we introduce MobiS, a specification language based on a tuple-space model which specifies coordination by multiset rewriting. We show how MobiS can be flexibly used to specify architectures containing mobile components and give formalizations of some common mobility paradigms. We explore the styles we introduce, showing how they model the software architecture of a "Purchasing System", a case study in electronic commerce.

Journal ArticleDOI
TL;DR: This paper solves the famous open problem of Kanellakis by showing that uniform boundedness is undecidable for single-rule programs (also called sirups).
Abstract: DATALOG is the language of logic programs without function symbols. It is considered to be the paradigmatic database query language. If it is possible to eliminate recursion from a DATALOG program then the program is bounded. Since bounded programs can be executed in constant parallel time, the possibility of automated boundedness detection is believed to be an important issue and has been studied in many papers. Boundedness was proved to be undecidable under different kinds of semantical assumptions and syntactical restrictions, using many different proof techniques. In this paper we propose a uniform proof method based on the discovery of what we call the Achilles-Turtle machine, and strongly improve most of the known undecidability results. In particular, we solve the famous open problem of Kanellakis by showing that uniform boundedness is undecidable for single-rule programs (also called sirups). This paper is the full version of [J. Marcinkowski, Proc. 13th STACS, Lecture Notes in Computer Science 1046, pp. 427-438] and [J. Marcinkowski, 11th IEEE Symposium on Logic in Computer Science, pp. 13-24].

Journal ArticleDOI
TL;DR: It is proved that all recursively enumerable languages can be generated by context-free returning parallel communicating grammar systems, by showing how such systems can simulate two-counter machines.

Journal ArticleDOI
TL;DR: This paper outlines the basic steps of a methodology that allows business analysts to go from high-level enterprise objectives, to detailed and formal specifications of business processes that can be enacted to realise these objectives.
Abstract: This paper presents a formal framework for representing enterprise knowledge. The concepts of our framework (objectives and goals, roles and actors, actions and processes, responsibilities and constraints) allow business analysts to capture knowledge about an enterprise in a way that is both intuitive and mathematically formal. It also outlines the basic steps of a methodology that allows business analysts to go from high-level enterprise objectives, to detailed and formal specifications of business processes that can be enacted to realise these objectives. The formal language used means that the specifications can be verified as having certain correctness properties, e.g. that responsibilities assigned to roles are fulfilled, and constraints are maintained as a result of process execution.

Book ChapterDOI
28 Jun 1999
TL;DR: An integration of Z and timed CSP called RT-Z is presented, incorporating the strengths of both formal languages in a coherent framework; RT-Z is equipped with structuring constructs built on top of the integration.
Abstract: We present an integration of Z and timed CSP called RT-Z, incorporating the strengths of both formal languages in a coherent framework. To cope with complex systems, RT-Z is equipped with structuring constructs built on top of the integration, because both Z and timed CSP lack appropriate facilities. For RT-Z to be built on formal grounds, a formal semantics is defined based on the denotational semantics of Z and timed CSP.

Journal ArticleDOI
TL;DR: This paper gives the theoretical setup so that an ecological model, as a particular mathematical model, can be considered a text written in a formal language (mathematics), and therefore, statistical linguistic laws can be applied to obtain information parameters in different semantic levels of the same model.
Abstract: This paper gives the theoretical setup so that an ecological model, as a particular mathematical model, can be considered a text written in a formal language (mathematics), and therefore statistical linguistic laws can be applied to obtain information parameters at different semantic levels of the same model. The statistical laws will be useful to: a) compare semantic levels, submodels, and different models with one another; b) prove that the information temperature parameter is an indirect measure of meaning, i.e. the significance or semantic component of information, as opposed to the signifier, or comprehension, on the part of the observer (modeller) of the model text. We apply these ideas in two practical examples.

Journal ArticleDOI
TL;DR: The authors develop the syntax of a formal language of complex structures, called here L(M), based on general assumptions from the theory of linguistic mathematics, and develop its interpretation over a reproductive submodel of a more general one.

Book ChapterDOI
20 Sep 1999
TL;DR: This paper presents a method for the development of mixed systems, i.e. systems with both a static and a dynamic part, equipped with object-oriented code generation in Java to be used for prototyping purposes.
Abstract: Methods are needed to help use formal specifications in a practical way. We herein present a method for the development of mixed systems, i.e. systems with both a static and a dynamic part. Our method helps the specifier by providing means to structure the system in terms of communicating subcomponents and to describe the sequential components using semi-automatic concurrent automata generation with associated algebraic data types. These components and the whole system may be verified using a common set of tools for transition systems or algebraic specifications. Furthermore, our method is equipped with object-oriented code generation in Java, to be used for prototyping purposes. In this paper, we present our method on a small example: a transit node component in a communication network.