
Showing papers on "Formal system published in 1985"



Journal ArticleDOI
TL;DR: Any attempt to give “foundations”, for category theory or any domain in mathematics, could have two objectives: to provide a formal frame rich enough so that all the actual activity in the domain can be carried out within this frame, and to be consistent with a well-established and “safe” theory.
Abstract: Any attempt to give “foundations”, for category theory or any domain in mathematics, could have two objectives, of course related. (0.1) Noncontradiction: Namely, to provide a formal frame rich enough so that all the actual activity in the domain can be carried out within this frame, and consistent, or at least relatively consistent with a well-established and “safe” theory, e.g. Zermelo-Fraenkel (ZF). (0.2) Adequacy, in the following, nontechnical sense: (i) The basic notions must be simple enough to make transparent the syntactic structures involved. (ii) The translation between the formal language and the usual language must be, or very quickly become, obvious. This implies in particular that the terminology and notations in the formal system should be identical, or very similar, to the current ones. Although this may seem minor, it is in fact very important. (iii) “Foundations” can only be “foundations of a given domain at a given moment”, therefore the frame should be easily adaptable to extensions or generalizations of the domain, and, even better, in view of (i), it should suggest how to find meaningful generalizations. (iv) Sometimes (ii) and (iii) can be incompatible because the current notations are not adapted to a more general situation. A compromise is then necessary. Usually when the tradition is very strong (ii) is predominant, but this causes some incoherence for the notations in the more general case (e.g. the notation f(x) for the value of a function f at x obliges one, in category theory, to denote the composition of arrows (f, g) → g∘f, and all attempts to change this notation have, so far, failed).
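The notational clash described in (iv) can be made concrete with a small sketch (illustrative only, not from the paper): with application written f(x), "apply f first, then g" must be written g∘f, so the pair (f, g) and the symbol g∘f list the functions in opposite order.

```python
def compose(f, g):
    """Diagrammatic composition of the pair (f, g): apply f first, then g, i.e. g∘f."""
    return lambda x: g(f(x))

double = lambda x: 2 * x
inc = lambda x: x + 1

# The argument order (double, inc) reverses in the traditional symbol inc∘double.
h = compose(double, inc)
print(h(3))  # inc(double(3)) = 7
```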

144 citations


01 Jan 1985
TL;DR: In this paper, the problem of giving "foundations" for category theory or any domain in mathematics has been addressed, in the following way: (i) the basic notions must be simple enough to make transparent the syntactic structures involved, and (ii) the translation between the formal language and the usual language must be, or very quickly become, obvious.
Abstract: §0. Introduction. Any attempt to give "foundations", for category theory or any domain in mathematics, could have two objectives, of course related. (0.1) Noncontradiction: Namely, to provide a formal frame rich enough so that all the actual activity in the domain can be carried out within this frame, and consistent, or at least relatively consistent with a well-established and "safe" theory, e.g. Zermelo-Fraenkel (ZF). (0.2) Adequacy, in the following, nontechnical sense: (i) The basic notions must be simple enough to make transparent the syntactic structures involved. (ii) The translation between the formal language and the usual language must be, or very quickly become, obvious. This implies in particular that the terminology and notations in the formal system should be identical, or very similar, to the current ones. Although this may seem minor, it is in fact very important. (iii) "Foundations" can only be "foundations of a given domain at a given moment", therefore the frame should be easily adaptable to extensions or generalizations of the domain, and, even better, in view of (i), it should suggest how to find meaningful generalizations. (iv) Sometimes (ii) and (iii) can be incompatible because the current notations are not adapted to a more general situation. A compromise is then necessary. Usually when the tradition is very strong (ii) is predominant, but this causes some incoherence for the notations in the more general case (e.g. the notation f(x) for the value of a function f at x obliges one, in category theory, to denote the composition of arrows (f, g) → g∘f, and all attempts to change this notation have, so far, failed).
(0.3) Although it seems to have been the main preoccupation of the logicians who tried to give foundations for category theory, I am only mildly interested in mere consistency, for the following reasons: (i) Categoricians have, in their everyday work, a clear view of what could lead to contradiction, and know how to build ad hoc safeguards. (ii) If a formal system fails to satisfy too many of the adequacy requirements, it will be totally useless; and worse, the inadequacy will probably reflect too superficial an analysis of the real activity of categoricians.

140 citations


Journal ArticleDOI
01 Jun 1985-Synthese
TL;DR: In this paper, a survey of options in constructing a formal system of dialogue rules is presented, and the equivalence of derivability in intuitionistic logic and the existence of a winning strategy (for the Proponent) on the strength of Ei is shown by simple inductive proofs.
Abstract: Section 1 contains a survey of options in constructing a formal system of dialogue rules. The distinction between material and formal systems is discussed (section 1.1). It is stressed that the material systems are, in several senses, formal as well. In section 1.2 variants as to language form (choices of logical constants and logical rules) are pointed out. Section 1.3 is concerned with options as to initial positions and the permissibility of attacks on elementary statements. The problem of ending a dialogue, and of infinite dialogues, is treated in section 1.4. Other options, e.g., as to the number of attacks allowed with respect to each statement, are listed in section 1.5. Section 1.6 explains the concept of a ‘chain of arguments’. From section 2 onward four types of dialectic systems are picked out for closer study: D, E, Di and Ei. After a preliminary section on dialogue sequents and winning strategies, the equivalence of derivability in intuitionistic logic and the existence of a winning strategy (for the Proponent) on the strength of Ei is shown by simple inductive proofs. Section 3 contains a — relatively quick — proof of the equivalence of the four systems. It follows that each of them yields intuitionistic logic.
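As an illustration of the dialogue-game idea, here is one possible play for the intuitionistically valid formula (A ∧ B) → A, in which the Proponent has a winning strategy. The transcript and move encoding are hypothetical and do not reproduce the paper's systems D, E, Di or Ei.

```python
# Hypothetical Lorenzen-style dialogue for (A ∧ B) → A.
dialogue = [
    ("P", "assert", "(A ∧ B) → A"),  # Proponent states the thesis
    ("O", "attack", "A ∧ B"),        # O attacks the implication by granting its antecedent
    ("P", "attack", "?L"),           # P counter-attacks the conjunction, asking for its left part
    ("O", "defend", "A"),            # O must state A
    ("P", "defend", "A"),            # P defends the thesis with A, which O has already stated
]

# P wins this play: the final move is P's, and P asserted the elementary
# formula A only after O had stated it first (the "formal" restriction).
assert dialogue[-1][0] == "P"
print("Proponent wins:", dialogue[-1][0] == "P")
```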

48 citations


Journal ArticleDOI
TL;DR: In this article, the authors argue that the most important applications of logical theories are normative, and that the epistemology is that of wide reflective equilibrium, and tie their discussion to Thagard's views concerning the relationship between psychology and logic, arguing against him that psychology has and should have only a peripheral role in normative applications of logic.
Abstract: By a logical theory I mean a formal system together with its semantics, meta-theory, and rules for translating ordinary language into its notation. Logical theories can be used descriptively (for example, to represent particular arguments or to depict the logical form of certain sentences). Here the logician uses the usual methods of empirical science to assess the correctness of his descriptions. However, the most important applications of logical theories are normative, and here, I argue, the epistemology is that of wide reflective equilibrium. The result is that logic not only assesses our inferential practice but also changes it. I tie my discussion to Thagard's views concerning the relationship between psychology and logic, arguing against him that psychology has and should have only a peripheral role in normative (and most descriptive) applications of logic.

33 citations


Book ChapterDOI
TL;DR: This chapter is concerned mainly with formal systems for the hyperarithmetical sets and represents a vast extension of the program of Reverse Mathematics.
Abstract: Publisher Summary This chapter focuses on Friedman's research on subsystems of second-order arithmetic, which can be examined from two related but essentially disparate points of view. On the one hand, such systems have many interesting meta-mathematical properties, which can be investigated using proof-theoretic and model-theoretic tools. On the other hand, subsystems of second-order arithmetic are a natural vehicle for the formal axiomatic study of ordinary mathematics. Ordinary mathematics comprises branches of mathematics—such as geometry, number theory, differential equations, algebra, and functional analysis. The chapter is concerned mainly with formal systems for the hyperarithmetical sets and represents a vast extension of the program of Reverse Mathematics. Reductionist programs, where a reductionist is anyone who proposes to reduce a large part of mathematics to some restricted set of “acceptable” principles, are also discussed.

30 citations



Journal ArticleDOI
TL;DR: Using a formal grammar makes it possible to provide a clearer logical basis for the arguments for and against particular pronouncements of the FASB in three areas: possibility, consistency and resolution of accounting principles.

6 citations


01 Jan 1985
TL;DR: A foundational investigation of the notion of hypothesis in knowledge representation schemes is presented, and the relation between these two parts, for introducing and reasoning with hypotheses, is explored.
Abstract: A foundational investigation of the notion of hypothesis in knowledge representation schemes is presented. The problem has two distinct but related components. The first concerns the ultimately non-deductive problem of forming and maintaining a set of hypotheses, or a theory, based on a stream of ground atomic formulae. The second concerns deductively reasoning with a theory, together with arbitrary known and hypothesised sentences. For the first part, theory formation, a language HL (and from it an algebra and logic) is derived for forming hypotheses framed in set-theoretic terms. Two soundness and completeness results for the logic are presented. The first explicitly links the logic to the algebra; the second treats the logic as a three-valued system. Through the formal systems, the set of potential hypotheses is precisely specified, and a procedure is derived for restoring the consistency of a set of hypotheses after conflicting evidence is encountered. For the second part, reasoning with a theory, an existent first-order language that can represent and reason about what it knows is extended to one that can reason with knowledge and hypothesis. The original proof-theoretic and semantic results for the language are extended appropriately. The relation between these two parts, for introducing and reasoning with hypotheses, is also explored. The theory formation process is considered as a source of hypothetical sentences for the deductive (reasoning) component, and the deductive component is considered as a source of a priori knowledge and hypothesis for the theory formation process.
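The consistency-restoration step can be caricatured in a few lines (a toy sketch with invented names, not the paper's language HL): hypotheses that conflict with incoming ground evidence are retracted, and the surviving set remains consistent with everything observed so far.

```python
def restore_consistency(hypotheses, evidence):
    """Drop every hypothesis whose prediction is contradicted by the evidence.

    hypotheses: dict mapping a hypothesis name to a predicate on observations
    evidence:   list of ground observations seen so far
    """
    return {
        name: pred
        for name, pred in hypotheses.items()
        if all(pred(obs) for obs in evidence)
    }

hypotheses = {
    "all_positive": lambda x: x > 0,
    "all_even": lambda x: x % 2 == 0,
}
surviving = restore_consistency(hypotheses, [2, 4, 7])  # the observation 7 refutes "all_even"
print(sorted(surviving))  # ['all_positive']
```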

5 citations


01 Jan 1985
TL;DR: A programming notation for CMOS-circuits is given and it is shown that this can be achieved by postulating two rules, the substitution rule described in section 3, and the elimination rule described in section 4.
Abstract: Rudolf H. Mak, Department of Mathematics and Computing Science, Eindhoven University of Technology, P.O. Box 513, 5600 MB Eindhoven, The Netherlands. A programming notation for CMOS-circuits is given. With each circuit a Boolean expression is associated that specifies the logic properties of the circuit. Circuits are designed in a hierarchical fashion and rules are given to derive the logic properties of a composite circuit from the logic properties of its subcomponents. Combinational circuits and sequential circuits are treated in a uniform fashion. 1. Introduction. The task of designing a circuit is, or should become, very similar to the task of designing a program. Ideally, one would like to describe a circuit in some "high level" language, in which the designer needs to be concerned with the functional aspects of his design only, and is not burdened with the physics underlying the constituting components, nor with the problem of their layout on the chip. Hence, just like programs, we would like to be able to derive circuits from their formal specifications. We show that this can be achieved by postulating two rules: the substitution rule described in section 3, and the elimination rule described in section 4.
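A rough sketch of the hierarchical idea (illustrative Python, not Mak's notation; gates are modelled as pure Boolean functions rather than CMOS transistor networks): each circuit denotes a Boolean expression, and a composite circuit's logic is obtained from its subcomponents by substitution.

```python
def nand(a, b):
    """Primitive component: its associated Boolean expression is ¬(a ∧ b)."""
    return not (a and b)

# Hierarchical design: NOT and AND are composites; their logic properties
# follow by substituting the NAND expression into further NAND instances.
def inv(a):
    return nand(a, a)          # ¬(a ∧ a) = ¬a

def and_gate(a, b):
    return inv(nand(a, b))     # ¬¬(a ∧ b) = a ∧ b

# Exhaustive check of the derived truth table against the specification.
assert all(and_gate(a, b) == (a and b)
           for a in (False, True) for b in (False, True))
print("AND from NAND verified")
```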

5 citations


Journal ArticleDOI
TL;DR: Different formal systems development methodologies differ in their acceptability to DP staff and users, and in their support for strategic planning, estimation, flexibility, maintenance, interfaces, and suitability to applications.

Posted Content
TL;DR: This paper considers four aspects of end user computing: false stereotypes, information tasks supported, inconsistency between flexibility and formal systems, and contributions made by information technology.
Abstract: This paper considers four aspects of end user computing: false stereotypes, information tasks supported, inconsistency between flexibility and formal systems, and contributions made by information technology.

Book ChapterDOI
01 Jan 1985
TL;DR: The changing role of comments is described, giving examples of new uses of program comments in the area of formal system specification and verification, and the need for different types of comments and their syntactical representation is outlined.
Abstract: Comments in programming languages have traditionally been used to add extra descriptive information to program source code in an informal manner; programmers have been encouraged to use commenting only as a means of creating readable code and specific code documentation. However, research currently being carried out indicates a dramatic change in the importance of comments within programs. This paper describes the changing role of comments, giving examples of new uses of program comments in the area of formal system specification and verification. The relative unimportance which language designers and implementors have associated with a comment facility is highlighted; the need for different types of comments and their syntactical representation is outlined. An increasing awareness of the power of software tools has led to integrated programming environments which support the manipulation of source code and possibly its intermediate forms. This approach provides a base for tools to act upon comments in a functional way. Proposals are made which are intended to promote discussion on language design, implementation, and internal representation.
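One possible shape for such specification-carrying comments (the @pre/@post tags are hypothetical, not from the paper or any standard) is a machine-readable pre/postcondition annotation that a tool could extract and check, here mirrored by runtime assertions:

```python
def integer_sqrt(n):
    # @pre  n >= 0
    # @post result * result <= n < (result + 1) * (result + 1)
    assert n >= 0, "precondition violated"
    r = 0
    while (r + 1) * (r + 1) <= n:   # grow r while the next square still fits
        r += 1
    assert r * r <= n < (r + 1) * (r + 1), "postcondition violated"
    return r

print(integer_sqrt(10))  # 3
```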


Book ChapterDOI
01 Jan 1985
TL;DR: In this paper, the authors outline an approach to economic development that aims at avoiding domestic inflationary pressures and external payments imbalances, while increasing the per capita consumption of certain commodities which are treated as "basics".
Abstract: This paper attempts to outline an approach to economic development that aims at avoiding domestic inflationary pressures and external payments’ imbalances, while increasing the per capita consumption of certain commodities which are treated as ‘basics’. Predictably, a prior assessment of the availability of and demand for ‘basics’ emerges as the keystone of policies for balanced development. The argument is offered, however, as no more than a tentative first step toward strengthening the real economic content of policies for financial balance. Also, the aim is not so much to present a complete formal system as to assemble a few operationally related ideas. As such, the most that the paper can offer is a theoretical basis for further reflections upon a somewhat new framework for analysing an old problem.

Journal ArticleDOI
TL;DR: The CODE concept can be used to develop formal system specification languages that are application-specific, and that can be part of a semi-automatic system generation process, in which the system specification is transformed directly to the implementation.

01 Jan 1985
TL;DR: The incompleteness of formal systems for arithmetic has been a recognized fact of mathematics, as discussed by the authors; the term suggests that the formal system in question fails to offer a deduction which it ought to.
Abstract: Publisher Summary The incompleteness of formal systems for arithmetic has been a recognized fact of mathematics. The term “incompleteness” suggests that the formal system in question fails to offer a deduction which it ought to. This chapter focuses on the status of a formal system, Peano Arithmetic, and explores a viewpoint on which Peano Arithmetic occupies an intrinsic, conceptually well-defined region of arithmetical truth. The idea is that it consists of those truths which can be perceived directly from the purely arithmetical content of a categorical conceptual analysis of the notion of natural number. The chapter explores its conceptual stability in light of the apparently destabilizing effect, through extensibility, of the phenomenon of incompleteness. It also offers heuristic and conceptual support for the viewpoint that Peano Arithmetic is the strongest natural first-order system for arithmetic, and that there is a sense in which it is complete with respect to purely arithmetical truth.
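For reference, the first-order Peano Arithmetic discussed here consists of the successor, addition and multiplication axioms together with the induction schema, instantiated for every arithmetical formula φ:

```latex
\begin{align*}
&\forall x\, \bigl(S(x) \neq 0\bigr) \\
&\forall x\,\forall y\, \bigl(S(x) = S(y) \rightarrow x = y\bigr) \\
&\forall x\, \bigl(x + 0 = x\bigr) \qquad \forall x\,\forall y\, \bigl(x + S(y) = S(x + y)\bigr) \\
&\forall x\, \bigl(x \cdot 0 = 0\bigr) \qquad \forall x\,\forall y\, \bigl(x \cdot S(y) = x \cdot y + x\bigr) \\
&\bigl(\varphi(0) \wedge \forall x\,(\varphi(x) \rightarrow \varphi(S(x)))\bigr) \rightarrow \forall x\,\varphi(x)
\end{align*}
```

Gödel's theorem shows that no such recursively axiomatized system proves all arithmetical truths, which is the "destabilizing" extensibility the chapter confronts.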

Book ChapterDOI
01 Jan 1985
TL;DR: The Theory of Interaction Systems is briefly introduced, which adequately models the behavior in distributed systems; for basic principles and regulations in the human organization, like delegation of decision and hierarchical privileges and their influences, primitive or standard representations are constructed.
Abstract: In order to be independent of implementational or environmental details, requirements for the practical use of formal models in designing and reorganizing large information systems are developed, discussing the example of an insurance company (criteria for model adequacy). Then the Theory of Interaction Systems is briefly introduced, which adequately models the behavior in distributed systems (including partial autonomy of components, limited range of decisions, conflicting interests). For basic principles and regulations in the human organization, like delegation of decision, hierarchical privileges and their influences, primitive or standard representations are constructed. In this way both human and technical requirements can for the first time be specified in a common and precise language. The particular modeling flexibility of the formalism is discussed, and hints and references for the practical use (implementation) of the formal tools are given.

Proceedings ArticleDOI
01 Aug 1985
TL;DR: The history of advances in programming - the little that there is of it - is the history of successful formalisation: by inventing and studying formalism, by extracting rigorous procedures, the authors progressed from programming in machine code to programming in high level languages (HLLs).
Abstract: The history of advances in programming - the little that there is of it - is the history of successful formalisation: by inventing and studying formalism, by extracting rigorous procedures, we progressed from programming in machine code to programming in high level languages (HLLs). Note that before HLLs were sufficiently formalised, compilers used to be unreliable and expensive to use, programs could not be ported between machines, and the very use of HLLs was more or less restricted to academia. Observe also that if today a piece of software is still difficult to port, it is almost invariably due to a deviation from strict formality.

For many application domains HLLs provide an acceptable linguistic level for program (and system) specification. One does not hear horror stories about computer applications in such domains (physics, astronomy). The expert views of such domains, their descriptive theories, can be easily expressed on the linguistic level of HLLs, thus becoming prescriptive theories (specifications) for computer software. If it is desired to use computers reliably in domains where this is not the case, then we must try to narrow the gap between the linguistic level on which domain-descriptive theories are formulated and the linguistic level on which programs are prescribed (specified).

The programs, as ultimately run, are always formal objects. A rigorous, calculably verifiable relationship may be defined only between formal systems (scientific theories of nature are not calculably verified; they are tested by experiments!). Therefore, if we want the programs to reliably satisfy their specifications, the latter must be formal objects too. Or, to put it differently: no unbroken chain consisting only of calculably verified steps can lead from a natural domain to a formal system.

There must be an interface at which the informality of a natural domain (or its description) comes into contact with the formality of a computer program (or its specification). Such a watershed interface can be avoided only if the application domain is itself formal (e.g. mathematical).

The problem statement at the watershed interface is crucial for the eventual success of any computer application: if it is formal, we stand a chance of developing and implementing a provably correct solution, but we cannot prove its validity in the application domain. The informality of a problem statement does not help a bit: it is still impossible to prove the validity, and we cannot guarantee the correctness of the eventually implemented software. Taking the lesser of two evils, we choose a formal problem statement and call it the specification.

Basically, there are two policies with respect to the selection of the watershed linguistic level. They can be roughly characterised by the size and structure of the extra logic component (EC) of that level. We may opt for as big an EC as we can handle in our formal system, or for an EC not larger than necessary to validate the problem statement in the application domain. The large EC policy corresponds to axiomatisation of individual elementary observations (as exemplified by Horn clause and other data-based specifications); the small EC, to axiomatisation of 'laws' abstracted from and validated by elementary observations within the application domain (as illustrated by the physical sciences). Notice, however, that the choice of EC is almost entirely domain-dependent, determined by the availability (or otherwise) of well-tested developed theories for each individual domain.

Invention, formulation and validation of domain-descriptive theories is, of course, the main activity of scientifically-minded experts. Nothing in a software engineer's education entitles him to claim the necessary knowledge and experience. Similarly, no results of computing science entitle computing scientists to claim that theory-forming research in application domains can be reduced to collecting elementary observations. And logic does not recognise the notion of intra-domain validity.

Logic - a calculus of formal systems - plays an important role in software development from specification to implementation. It also can play a minor role in validation of specifications, insofar as the validation may be seen to include derivation of consequences. That logic plays no discernible role in descriptive theory formation - an act always serendipitous and contingent on invention - should not worry us too much, as long as we do not wish to claim possession of the philosopher's stone.

Journal ArticleDOI
Maria Nowakowska
TL;DR: A theoretical analysis of the process of communication, actions and dialogues is presented, with implications for experimental research; the basic concepts of the formal system are multimedial units and languages of actions.

Book ChapterDOI
01 Jan 1985

Journal ArticleDOI
TL;DR: The primary goal of this paper is to define an initial step towards the definition of ‘systems grammar’ based on the notion of formal languages which can be used as a ‘tool’ in the formal representation of computer security systems.
Abstract: The primary goal of this paper is to define an initial step towards the definition of ‘systems grammar’ based on the notion of formal languages which can be used as a ‘tool’ in the formal representation of computer security systems. Currently all modelling done on computer security systems is written up as mathematical models. These mathematical models are usually based on the mathematics of relations amongst objects, as opposed to the model described in this paper which is based on the theory of formal languages. This paper is aimed at people who are doing research on the logical aspects of computer security. It is the first of a series of two papers. This paper will give interim results and make more specific the definition of a ‘formal language’ which suits the computer security environment. The second paper will illustrate the actual use of the defined ‘formal language’ and show how to represent the characteristics of a computer security environment by using this ‘formal language’.
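The kernel of the idea, representing permitted behaviour as the strings of a formal language, can be sketched as follows (a toy grammar with invented symbols, not the paper's 'systems grammar'): permitted action sequences are exactly the strings derivable from the grammar, so a policy check becomes a membership test.

```python
# Toy policy: every session must begin with "login" and end with "logout",
# with any number of "read"/"write" actions in between.
GRAMMAR = {
    "S": [["login", "A"]],
    "A": [["read", "A"], ["write", "A"], ["logout"]],
}

def derives(symbols, tokens):
    """Check whether the grammar derives `tokens` from `symbols` (leftmost derivation)."""
    if not symbols:
        return not tokens
    head, rest = symbols[0], symbols[1:]
    if head in GRAMMAR:  # nonterminal: try each production
        return any(derives(body + rest, tokens) for body in GRAMMAR[head])
    # terminal: must match the next token
    return bool(tokens) and tokens[0] == head and derives(rest, tokens[1:])

print(derives(["S"], ["login", "read", "logout"]))  # True: permitted sequence
print(derives(["S"], ["read", "logout"]))           # False: no login first
```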