
Showing papers on "Formal system published in 1988"


Journal ArticleDOI
01 Jun 1988
TL;DR: Higher-order abstract syntax incorporates name binding information in a uniform and language generic way and acts as a powerful link integrating diverse tools in program manipulation and other formal systems where matching and substitution or unification are central operations.
Abstract: We describe motivation, design, use, and implementation of higher-order abstract syntax as a central representation for programs, formulas, rules, and other syntactic objects in program manipulation and other formal systems where matching and substitution or unification are central operations. Higher-order abstract syntax incorporates name binding information in a uniform and language generic way. Thus it acts as a powerful link integrating diverse tools in such formal environments. We have implemented higher-order abstract syntax, a supporting matching and unification algorithm, and some clients in Common Lisp in the framework of the Ergo project at Carnegie Mellon University.
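The binder-as-function idea at the core of higher-order abstract syntax can be sketched in a few lines. This is a minimal illustration in Python rather than the Common Lisp of the Ergo implementation, and the names (`Var`, `App`, `Lam`, `beta_reduce`) are the sketch's own, not the paper's:

```python
# Minimal sketch of higher-order abstract syntax (HOAS): object-language
# binders are represented by host-language functions, so capture-avoiding
# substitution comes for free from host-language application.
from dataclasses import dataclass
from typing import Callable, Union

Term = Union["Var", "App", "Lam"]

@dataclass
class Var:
    name: str

@dataclass
class App:
    fn: Term
    arg: Term

@dataclass
class Lam:
    body: Callable[[Term], Term]  # the binder is a host-language function

def beta_reduce(t: Term) -> Term:
    # One step of beta reduction: substitution is just function application.
    if isinstance(t, App) and isinstance(t.fn, Lam):
        return t.fn.body(t.arg)
    return t

# (lambda x. x x) applied to y reduces to (y y)
self_app = Lam(lambda x: App(x, x))
result = beta_reduce(App(self_app, Var("y")))
```

Note that the host function performs the substitution, which is exactly why HOAS makes matching and substitution uniform across object languages.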

726 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examine a different aspect of "informality": the pervasive utilization of informal modes of exchange within the formal sector itself, including various forms of trading influence and bureaucratic favors for equivalent services or cash.
Abstract: some degree from access to services provided by the modern state. Networks of reciprocity and patron-client relations have been shown to play an important role within these disadvantaged sectors, in articulating their members to the formal market system and in creating an informal social security system to survive (Lomnitz 1977, 1978, 1982). The present article proposes to examine a different aspect of "informality": the pervasive utilization of informal modes of exchange within the formal sector itself. These exchanges include various forms of trading influence and bureaucratic favors for equivalent services or cash. Depending on the political system, some forms of informal exchange may be tolerated while others may be severely repressed. Even in the latter case, however, illicit economic activities ("economic crimes") in the state bureaucracy are often seen as inevitable (if not actually useful) by members of elite groups within the formal system. I show that these activities are not random or chaotic but are based on informal networks following principles similar to those in shantytown networks: patronage, loyalty, and confianza (trust). Often the networks run underneath and parallel to the formal hierarchy. Weberian analysis of the rationality of bureaucratic systems ignored the informal activities that sprang up within formal organizations as a response to the malfunctionings of bureaucracies. Political science and anthropology, however, based on first-hand observations in underdeveloped, non-Western societies, have developed an extensive literature focusing on the discrepancies between the goals and structures of organizations and the historical and cultural specificities of the social systems in which those organizations are embedded. The main consequences of this conflict appear to be inefficiencies resulting from rigidity and corruption (arising from inefficiency). 
Personalistic, culturally determined loyalties to kin and local groups often defy the more nationalistic ideologies that underlie bureaucratic rationality. (For a review of this literature, see Britan and Cohen 1980; Scott 1972.) This article is generally in line with the literature mentioned above, but goes a step further in that it sees "informality" not only as a residue of traditionalism, but as an intrinsic element of "formality" insofar as it is a response to the inadequacies of formal

252 citations


Book ChapterDOI
30 May 1988
TL;DR: There have been several technical improvements and new insights into the computational model, the logic itself, the proof system and its presentation, and connections with alternative formalisms, such as finite automata.
Abstract: In this survey paper we present some of the recent developments in the temporal formal system for the specification, verification and development of reactive programs. While the general methodology remains very much the one presented in some earlier works on the subject, such as [MP83c,MP83a,Pnu86], there have been several technical improvements and new insights into the computational model, the logic itself, the proof system and its presentation, and connections with alternative formalisms, such as finite automata. In this paper we explicate some of these improvements and extensions.

166 citations


Journal ArticleDOI
TL;DR: In this paper, the authors draw attention to certain aspects of causal reasoning which are pervasive in ordinary discourse yet, based on the author's scan of the literature, have not received due treatment by logical formalisms of common-sense reasoning.

129 citations


Book ChapterDOI
01 Jan 1988
TL;DR: Analogical reasoning is a traditional mode of argument which has been applied in a variety of contexts, such as rhetorics, ethics, politics, mythology, metaphysics, theology, mathematics, logic, physics, biology, medicine, psychology, jurisprudence, engineering, and artificial intelligence as mentioned in this paper.
Abstract: Analogical reasoning is a traditional mode of argument which has been applied in a variety of contexts — such as rhetorics, ethics, politics, mythology, metaphysics, theology, mathematics, logic, physics, biology, medicine, psychology, jurisprudence, engineering, and artificial intelligence. Within its everyday and scientific applications, analogy has been employed for different purposes — such as heuristics, justification, and problem solving. For these reasons, it cannot be expected that there is only one “correct” formal system for representing the “valid” patterns of analogical inference.

43 citations


Journal ArticleDOI
TL;DR: This paper describes two pieces of software developed to allow easy definition of assembly tasks using the ABC system, and develops 14 primitive operations which can be used to completely specify assembly steps for a large class of problems.
Abstract: Experience has proved that exploiting robots for assembly tasks is much more difficult than manufacturing engineers had expected and many attempts at implementing robotic assembly have failed. Our research has led us to believe that a formal approach to specifying the steps required for assembly would be of great benefit in developing the required software for a specific task, and in adaptively controlling and monitoring the execution of robotic assembly steps. The US National Bureau of Standards has developed a formal system, called ABC (for Assembly By Constraints) for specifying the steps required for assembly. The system is based on the reduction in the degrees of freedom of objects as they are assembled. Using this basic concept, we have developed 14 primitive operations which can be used to completely specify assembly steps for a large class of problems. This paper initially outlines the historical development of the system, then describes two pieces of software developed to allow easy definition of assembly tasks using the ABC system, and finally presents two practical examples.
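The degrees-of-freedom idea at the heart of ABC can be illustrated with a toy model. The constraint names and DOF counts below are illustrative assumptions, not the NBS system's actual 14 primitive operations:

```python
# Toy model of Assembly By Constraints: each mating constraint removes
# degrees of freedom (DOF) from a rigid part; a part is fully placed at 0 DOF.
# Constraint names and DOF counts here are illustrative only.

FREE_DOF = 6  # 3 translational + 3 rotational

CONSTRAINT_DOF = {
    "against": 3,  # planar face contact: removes 1 translation + 2 rotations
    "fits": 4,     # round peg in round hole: removes 2 translations + 2 rotations
}

def remaining_dof(constraints):
    """DOF left after applying constraints, floored at zero.

    Simplification: removed DOF are treated as independent; a real system
    would intersect the allowed motions of the two parts symbolically.
    """
    dof = FREE_DOF
    for c in constraints:
        dof = max(0, dof - CONSTRAINT_DOF[c])
    return dof

# A peg fitted into a hole still slides and spins about its axis (2 DOF);
# adding a face contact pins it down completely.
assert remaining_dof(["fits"]) == 2
assert remaining_dof(["fits", "against"]) == 0
```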

22 citations


Book ChapterDOI
01 Jan 1988

21 citations


01 Jan 1988
TL;DR: In this paper, an interval-based temporal logic (IQ) model is used to characterize the semantics of the progressive, and an event-based AI model of temporal reference is used for the same purpose.
Abstract: Formal semantics constitutes the framework of this thesis, and the aim is to characterise the semantics of the progressive, as it appears in sentence (1). Among the problems is one known as the "imperfective paradox". According to intuitions, sentence (1) entails (2), but no entailment holds between (3) and (4).

(1) Max was running towards the station
(2) Max ran towards the station
(3) Max was running to the station
(4) Max ran to the station

Since (1) and (3) would seem to have the same logical form, they ought to have similar entailments. Why is this not so? This thesis is divided into two parts. The first part, containing chapters 2 to 5, evaluates the current formal theories that tackle the imperfective paradox. Solving the imperfective paradox consists of two tasks: the first is to characterise a semantic distinction between (2) and (4), and the second is to supply a semantic analysis of the progressive that is sensitive to this distinction and so results in a solution to the imperfective paradox. According to how the current theories tackle these two tasks, they can be classified into three camps which I will name as follows: the Heterogeneous Strategy (adopted by Dowty, Taylor and Cooper) provides one approach for fulfilling the first task, the Eventual Outcome Strategy (adopted by Dowty, Cooper and Hinrichs) provides an approach for defining the semantics of the progressive, and the Event-based Strategy (adopted by Parsons and Bach) provides a further alternative for achieving the two tasks at hand. All these strategies are intuitively motivated, but we will argue that they are ultimately untenable. The Heterogeneous Strategy and the Event-based Strategy fail to mesh with the treatment of point adverbials such as "At 3pm", and the Eventual Outcome Strategy produces a definition of the progressive that is viciously circular. 
Thus although the current theories that tackle the imperfective paradox are highly intuitively motivated, we will ultimately show that the formulations of these intuitions give rise to conflicts and tensions when it comes to explaining the natural language data. The second part of the thesis, containing chapters 6 and 7, offers a new approach for tackling the imperfective paradox. This new approach invokes two tools: the interval-based temporal logic IQ (Richards 1986), and Moens' (1987) event-based AI model of temporal reference. IQ is an interval-based temporal logic with several innovations. First, unlike the previous interval-based theories, IQ maintains a high level of homogeneity: an atomic sentence is true at an interval I only if it is true at all subintervals of I. Second, IQ offers a technique whereby temporal expressions can have representations that receive their semantic interpretation with respect to context. We use the roles of homogeneity and context in IQ to characterise the semantics of aspect, where the characterisation is based on Moens' model. This provides an arena in which to tackle the imperfective paradox anew. We explain the entailment between (1) and (2), and at the same time explain why no entailment holds between (3) and (4). Furthermore, we overcome the problems concerning the treatment of adverbials such as "At 3pm" that are encountered in the Heterogeneous Strategy and the Event-based Strategy, and, since we do not adopt the Eventual Outcome Strategy in defining the progressive, we overcome that strategy's problem of circularity. Hence our solution to the imperfective paradox will provide answers to the puzzles posed in the earlier chapters of the thesis.
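The homogeneity property attributed to IQ can be checked mechanically on a toy interval model. Everything below (integer-pair intervals, the two example valuations) is an illustrative assumption, not the thesis's formalism:

```python
# Toy check of homogeneity: an atomic sentence true at an interval must be
# true at every subinterval. Intervals are integer pairs (start, end).

def subintervals(interval):
    s, e = interval
    return [(i, j) for i in range(s, e + 1) for j in range(i, e + 1)]

def is_homogeneous(truth, interval):
    """True iff `truth` holds on every subinterval of `interval`."""
    return all(truth(sub) for sub in subintervals(interval))

# "Max was running towards the station": a process, true on any stretch
# inside the running interval (0, 5) -- homogeneous.
running = lambda iv: 0 <= iv[0] and iv[1] <= 5
assert is_homogeneous(running, (1, 4))

# "Max ran to the station": an accomplishment, true only of the interval
# containing the whole culminated event -- not homogeneous.
ran_to_station = lambda iv: iv == (0, 5)
assert not is_homogeneous(ran_to_station, (0, 5))
```

The failed check for the accomplishment mirrors the semantic distinction between sentences (2) and (4) that the thesis builds on.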

16 citations


11 Jul 1988
TL;DR: A general specification language, Z, based on set theory and developed at Oxford University is presented and some conclusions are drawn about the advantages and disadvantages of using a formal approach.
Abstract: A general specification language, Z, based on set theory and developed at Oxford University is presented. A major advantage of a formal notation is that it is precise and unambiguous and thus the formal notation always provides the definitive description in the case of any misunderstanding. A number of examples are discussed, including network services, window systems, and microprocessor instruction sets. This paper is split into two main parts. The first half deals with the nature of formal specification and why it should be used. Additionally, a brief introduction to Z and how it is used is also presented in general terms, without covering the notation itself. The second half of the paper deals with the experience gained using Z for the design and documentation of network services and during some case studies of existing systems. Finally some conclusions are drawn about the advantages and disadvantages of using a formal approach.

14 citations



Journal ArticleDOI
TL;DR: In this article, a number of managerial perspectives are developed and illustrated in relation to the management of universities, and the approach is contrasted with the management of health authorities. Some authors are shown to provide a critique of the Jarratt proposals without making explicit the managerial perspective from which it is offered; whilst their organisational analysis may be insightful, it does not add to the debate on how to manage universities or health authorities.
Abstract: Designing an accounting and information system (ACS) for a university is difficult, and whilst there is substantial evidence that formal systems can have dysfunctional effects, it is necessary to precede any design or critique of an ACS by clarifying the underlying managerial perspective that is being applied to the organisation. In this paper a number of managerial perspectives are developed and illustrated in relation to the management of universities. The approach is also contrasted with the management of health authorities. It is demonstrated that some authors provide a critique of the Jarratt proposals without illustrating the managerial perspective from which this is being done, and that whilst their organisational analysis may be insightful it does not add to the debate on how to manage universities or health authorities.

Proceedings ArticleDOI
05 Jul 1988
TL;DR: A precise notion of a formal framework meant to capture the intuition of an open-ended range of deductive interpreted languages is proposed, and a particular framework called the logical theory of constructions (LTC) is developed as an example.
Abstract: A precise notion of a formal framework, meant to capture the intuition of an open-ended range of deductive interpreted languages, is proposed. A particular framework called the logical theory of constructions (LTC) is developed as an example. A series of languages in the LTC framework is defined, demonstrating how a language can be thought of as gradually evolving.

Proceedings ArticleDOI
01 Jan 1988

Book ChapterDOI
04 Sep 1988

T.W.G. Docker1
11 Jul 1988
TL;DR: An eclectic approach is being pursued in which formal, semi-formal, and informal approaches can be integrated to support the software development process, thus providing flexibility both in terms of the rigour required by (and from) a user, and in the choice of methods and tools available.
Abstract: In a related paper by Docker and Tate (1986), a prototyping environment called SAME (Structured Analysis Modelling Environment) is described, which supports the exercising of structured systems analysis data flow diagrams (DFDs). SAME provides formal operational semantics for DFDs. These are suitable for building executable models of DFDs but are inadequate for proving properties about either DFDs or their components. The authors address this inadequacy by providing a formal mathematical semantics for DFDs from which formal specifications can be generated. The intention is to 'graft' this formal system onto SAME, and other tools, so that (traditional) semi-formal tools can be used as the 'front-ends' in the rigorous development of software. The aim is not merely to create an environment consisting of a set of tools that occupy the middle of the informal-formal spectrum. Rather, an eclectic approach is being pursued in which formal, semi-formal, and informal approaches can be integrated to support the software development process, thus providing flexibility both in terms of the rigour required by (and from) a user, and in terms of the choice of methods and tools available.

Journal ArticleDOI
TL;DR: It is argued that formal semantics, in the model-theoretic style pioneered by Tarski, is appropriate for specifying the meanings of the compositional component of artificial formal languages but not of natural languages.
Abstract: It is argued that formal semantics, in the model-theoretic style pioneered by Tarski, is appropriate for specifying the meanings of the compositional component of artificial formal languages but not of natural languages. Since computer programming languages are both formal and artificial, formal semantics has a clear application to them, but this does not mean that it is in any way relevant to the problem of meaning in AI. The distinction is drawn between what an expression in a language means, and what a person means by using it. The former is the only kind of meaning that formal semantics can ever explain, whereas for AI to succeed it is essential to elucidate, and then to recreate, the latter. No verdict is offered on whether or not this may ultimately be possible; but it is argued that formal semantics would be an inappropriate tool to use to this end.

Journal ArticleDOI
01 Dec 1988-Noûs
TL;DR: Gödel's second Incompleteness Theorem states that no consistent formal system that contains a certain amount of finitary number theory can prove its own consistency; this statement, more or less in Gödel's own words, is imprecise, but the imprecision is remediable.
Abstract: NO CONSISTENT formal system that contains a certain amount of finitary number theory can prove its own consistency. This statement of Gödel's second Incompleteness Theorem, which is more or less in Gödel's own words, is imprecise but the imprecision is remediable. What counts as a consistent formal system, what constitutes its containing the requisite amount of finitary number theory and what it is for a system not to be able to prove its own consistency are all susceptible of exact specification. Nevertheless, in textbook treatments of it, the second Incompleteness Theorem (unlike the first) does not very often receive a fully general statement which is any more precise than this. Often it is applied to some particular formal system, to yield the result that such and such a sentence is not a theorem of that system; and if it is then characterized as a generalization of this result, and the generalization is spelt out, then the statement above is typical of what one finds. One explanation for this is that precision here is particularly hard to come by and, once attained, very much lacking in intuitive impact. But I shall argue that there are also non-heuristic reasons why it can be more appropriate to invoke this imprecise statement than a corresponding precise version. First, I need to broach the question of what a corresponding precise version would look like. This in turn requires some declaration of the canons of precision that I am adopting. I shall proceed to lay these down stipulatively. Given any mathematical claim, there will be some theory in which it is embedded (be it set theory, number theory, analysis or

01 Jan 1988
TL;DR: The naturalness of this approach is demonstrated by offering new analyses and solutions to some classical distributed computing problems, namely the coordinated attack and authenticated Byzantine agreement.
Abstract: We present a formal system to reason about implicit belief. Implicit belief captures the (possibly probabilistic) information available to agents in probabilistic distributed systems. Our system also deals with non-determinism where all the non-deterministic choices are made at the beginning of the computation. The naturalness of this approach is demonstrated by offering new analyses and solutions to some classical distributed computing problems, namely the coordinated attack and authenticated Byzantine agreement. Keywords: Safety properties; Knowledge; Distributed systems; Probabilistic system; Liveness; Byzantine agreement; Coordinated.



Proceedings ArticleDOI
03 Jun 1988
TL;DR: This paper defines several types of scenarios of interest and discusses the algorithms for generating them, and defines a data structure, called “influence graphs,” that reflects how the objects in a system affect one another.
Abstract: A formal system specification is often written declaratively, in terms of the properties of the system components. This makes the system description modular and concise. However, this does not make the procedural aspects of the system easily understandable. Here arises the need for automatically generating interesting behavior patterns, specifically, scenarios. This paper is a preliminary report of our research in generating interesting scenarios. We define several types of scenarios of interest and discuss the algorithms for generating them. At the center of the algorithms is a data structure, called "influence graphs," that reflects how the objects in a system affect one another.
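The influence-graph idea can be sketched as a directed graph plus a chain search. The object names and the traversal below are assumptions for illustration, not the paper's algorithm:

```python
# Toy influence graph: edges record which objects affect which others; a
# candidate scenario is a simple (acyclic) chain of influences from a
# stimulus object to a target object.
from collections import defaultdict

class InfluenceGraph:
    def __init__(self):
        self.edges = defaultdict(set)

    def add_influence(self, src, dst):
        """Record that a change in `src` can affect `dst`."""
        self.edges[src].add(dst)

    def scenarios(self, start, target, path=None):
        """Yield every acyclic influence chain from `start` to `target`."""
        path = (path or []) + [start]
        if start == target:
            yield path
            return
        for nxt in sorted(self.edges[start]):
            if nxt not in path:  # keep chains simple (no cycles)
                yield from self.scenarios(nxt, target, path)

g = InfluenceGraph()
g.add_influence("button", "controller")
g.add_influence("controller", "motor")
g.add_influence("controller", "alarm")
g.add_influence("motor", "door")

chains = list(g.scenarios("button", "door"))
```

Each yielded chain is a candidate scenario: a causally ordered account of how a stimulus at one object can propagate to behavior at another.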

Book ChapterDOI
23 May 1988

Book ChapterDOI
01 Jan 1988
TL;DR: In this paper, the relation between formal systems and natural language is investigated, and it is shown that Hofstadter's ideas concerning meaning are interlocked with his ideas concerning formal systems, and those latter ideas should be addressed first, to preserve continuity.
Abstract: There are many problems and possibilities relevant to the new paradigm, especially in regard to the relation between formal systems and natural language, that must be investigated. I have already touched on some of them, but, since Hofstadter’s ideas concerning meaning are obviously interlocked with his ideas concerning formal systems and natural language, those latter ideas should be addressed first, to preserve continuity, before I turn to the ideas of others.