
Showing papers on "Program transformation published in 1994"


Journal ArticleDOI
TL;DR: In this paper, an approach to the semantics of logic programs whose aim is to find notions of models which really capture the operational semantics, and are, therefore, useful for defining program equivalences and for semantics-based program analysis is presented.
Abstract: This paper is a general overview of an approach to the semantics of logic programs whose aim is to find notions of models which really capture the operational semantics, and are, therefore, useful for defining program equivalences and for semantics-based program analysis. The approach leads to the introduction of extended interpretations which are more expressive than Herbrand interpretations. The semantics in terms of extended interpretations can be obtained as a result of both an operational (top-down) and a fixpoint (bottom-up) construction. It can also be characterized from the model-theoretic viewpoint, by defining a set of extended models which contains standard Herbrand models. We discuss the original construction modeling computed answer substitutions, its compositional version, and various semantics modeling more concrete observables. We then show how the approach can be applied to several extensions of positive logic programs. We finally consider some applications, mainly in the area of semantics-based program transformation and analysis.

168 citations


Journal ArticleDOI
TL;DR: This paper provides natural syntactic conditions that allow the occur-check to be safely omitted from the unification algorithm, and proposes a program transformation that transforms every program into a program for which only the calls to the built-in unification predicate need to be resolved by a unification algorithm with the occur-check.
Abstract: In most PROLOG implementations, the occur-check is omitted from the unification algorithm for efficiency. This paper provides natural syntactic conditions that allow the occur-check to be safely omitted. The established results apply to most well-known PROLOG programs, including those that use difference lists, and seem to explain why this omission does not lead in practice to any complications. When applying these results to general programs, we show their usefulness for proving absence of floundering. Finally, we propose a program transformation that transforms every program into a program for which only the calls to the built-in unification predicate need to be resolved by a unification algorithm with the occur-check.
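For illustration (not from the paper): a toy unifier sketch showing what the occur-check guards against. Without the check, unifying X with f(X) silently records a cyclic binding; the term encoding and names such as `unify` are invented for this sketch.

```python
# Toy unifier. Variables are strings starting with an uppercase letter;
# compound terms are tuples ('functor', arg1, arg2, ...). Invented encoding.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    # Follow variable bindings to their current value.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    # The occur-check: does variable v appear inside term t?
    t = walk(t, subst)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, arg, subst) for arg in t[1:])
    return False

def unify(a, b, subst, occur_check=True):
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        if occur_check and occurs(a, b, subst):
            return None           # X = f(X) is rejected, as it should be
        return {**subst, a: b}    # without the check: a cyclic binding
    if is_var(b):
        return unify(b, a, subst, occur_check)
    if isinstance(a, tuple) and isinstance(b, tuple) \
            and a[0] == b[0] and len(a) == len(b):
        for x, y in zip(a[1:], b[1:]):
            subst = unify(x, y, subst, occur_check)
            if subst is None:
                return None
        return subst
    return None

print(unify('X', ('f', 'X'), {}))                     # None: rejected
print(unify('X', ('f', 'X'), {}, occur_check=False))  # {'X': ('f', 'X')}: cyclic term
```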

46 citations


Journal ArticleDOI
TL;DR: The specification and derivation of substitution for the de Bruijn representation of λ-terms is used to illustrate programming with a function-sequence monad and is improved by interactive program transformation methods into an efficient implementation that uses primitive machine arithmetic.
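As context for what the paper specifies and then derives efficiently (this is only the naive textbook baseline, not the paper's function-sequence-monad derivation or its machine-arithmetic implementation), a sketch of shift and substitution for de Bruijn terms with an invented tuple encoding:

```python
# λ-terms in de Bruijn notation, as nested tuples (invented encoding):
# ('var', k) | ('lam', body) | ('app', f, a)

def shift(t, d, cutoff=0):
    """Add d to every free variable index (those >= cutoff)."""
    tag = t[0]
    if tag == 'var':
        k = t[1]
        return ('var', k + d if k >= cutoff else k)
    if tag == 'lam':
        return ('lam', shift(t[1], d, cutoff + 1))
    return ('app', shift(t[1], d, cutoff), shift(t[2], d, cutoff))

def subst(t, j, s):
    """Replace variable j in t by term s."""
    tag = t[0]
    if tag == 'var':
        return s if t[1] == j else t
    if tag == 'lam':
        # Under a binder: the target index and the substituted term shift by 1.
        return ('lam', subst(t[1], j + 1, shift(s, 1)))
    return ('app', subst(t[1], j, s), subst(t[2], j, s))

def beta(redex):
    """Reduce (λ.body) arg: substitute, then fix up the indices."""
    _, (_, body), arg = redex
    return shift(subst(body, 0, shift(arg, 1)), -1)

# (λ. 0 1) applied to variable 5 reduces to (5 0).
print(beta(('app', ('lam', ('app', ('var', 0), ('var', 1))), ('var', 5))))
# ('app', ('var', 5), ('var', 0))
```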

39 citations


Book ChapterDOI
14 Sep 1994
TL;DR: An unfold/fold program transformation system which extends the unfold/fold transformations of H. Tamaki and T. Sato is presented, together with a proof of the correctness of the proposed transformations in the sense of the least Herbrand model semantics of the program.
Abstract: An unfold/fold program transformation system which extends the unfold/fold transformations of H. Tamaki and T. Sato is presented in this paper. The system consists of unfolding, simultaneous folding, and generalization + equality introduction rules. The simultaneous folding rule permits the folding of a set of folded clauses into a single clause, using a set of folding clauses, while the generalization + equality introduction rule facilitates the application of the simultaneous folding rule by performing appropriate abstractions. A proof of the correctness of the proposed transformations in the sense of the least Herbrand model semantics of the program is also presented.
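To convey the flavor of unfold/fold (the paper works on logic program clauses, with simultaneous folding of clause sets; the standard functional analogue below is illustrative only), a derivation of a one-pass average by generalization, unfolding, and folding, with the steps recorded as comments:

```python
# Unfold/fold in a functional setting, for flavor only: derive a one-pass
# 'average' from a two-pass specification by introducing a tupled auxiliary
# (a generalization / "eureka" step), unfolding the recursive calls, and
# folding them back into the auxiliary.

def average_spec(xs):
    # Original specification: two traversals of the list.
    return sum(xs) / len(xs)

# Generalization step: define sumlen(xs) = (sum(xs), len(xs)).
# Unfolding sum and len on a non-empty list, then folding the pair of
# recursive calls back into sumlen, yields the one-pass recursion below.
def sumlen(xs):
    if not xs:
        return (0, 0)
    s, n = sumlen(xs[1:])
    return (xs[0] + s, n + 1)

def average(xs):
    s, n = sumlen(xs)
    return s / n

assert average([1, 2, 3, 4]) == average_spec([1, 2, 3, 4])
```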

36 citations


Book ChapterDOI
Robert Paige1
14 Sep 1994
TL;DR: How to decrease labor and improve reliability in the development of efficient implementations of nonnumerical algorithms and labor intensive software is an increasingly important problem as the demand for computer technology shifts from easier applications to more complex algorithmic ones.
Abstract: How to decrease labor and improve reliability in the development of efficient implementations of nonnumerical algorithms and labor intensive software is an increasingly important problem as the demand for computer technology shifts from easier applications to more complex algorithmic ones; e.g., optimizing compilers for supercomputers, intricate data structures to implement efficient solutions to operations research problems, search and analysis algorithms in genetic engineering, complex software tools for workstations, design automation, etc. It is also a difficult problem that is not solved by current CASE tools and software management disciplines, which are oriented towards data processing and other applications, where the implementation and a prediction of its resource utilization follow more directly from the specification.

36 citations


Journal ArticleDOI
Martin Ward1
TL;DR: A detailed look at a larger example of program analysis by transformation, carried out in the WSL language, a "wide spectrum language" which includes both low-level program operations and high-level specifications, and which has been specifically designed to be easy to transform.
Abstract: In this paper we will take a detailed look at a larger example of program analysis by transformation. We will be considering Algorithm 2.3.3.A from Knuth's "Fundamental Algorithms" (Knuth, 1968, p. 357), which is an algorithm for the addition of polynomials represented using four-directional links. Knuth (1974) describes this as having "a complicated structure with excessively unrestrained goto statements" and goes on to say "I hope someday to see the algorithm cleaned up without loss of its efficiency". Our aim is to manipulate the program, using semantics-preserving operations, into an equivalent, high-level specification. The transformations are carried out in the WSL language, a "wide spectrum language" which includes both low-level program operations and high-level specifications, and which has been specifically designed to be easy to transform.

32 citations


01 Jan 1994
TL;DR: This paper puts Turchin's driving methodology on a solid semantic foundation which is not tied to any particular programming language or data structure, and includes program optimizations not achievable by simple partial evaluation.
Abstract: An abstract framework is developed to describe program transformation by specializing a given program to a restricted set of inputs. Particular cases include partial evaluation [19] and Turchin's more powerful "driving" transformation [33]. Such automatic program speedups have been seen to give quite significant speedups in practical applications. This paper's aims are similar to those of [18]: better to understand the fundamental mathematical phenomena that make such speedups possible. The current paper is more complete than [18], since it precisely formulates correctness of code generation; and more powerful, since it includes program optimizations not achievable by simple partial evaluation. Moreover, for the first time it puts Turchin's driving methodology on a solid semantic foundation which is not tied to any particular programming language or data structure. This paper is dedicated to Satoru Takasu with thanks for good advice early in my career on how to do research, and for insight into how to see the essential part of a new problem.
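A minimal illustration of the weaker of the two transformations the paper covers, partial evaluation (driving is strictly more powerful): specializing power to a known exponent yields a residual program with the recursion unrolled. All names and the residual-code encoding are assumptions of this sketch.

```python
# Specialization by partial evaluation, in miniature: the exponent n is
# static, so the recursion can be unrolled at specialization time.

def power(x, n):
    return 1 if n == 0 else x * power(x, n - 1)

def specialize_power(n):
    """Generate the residual program power_n(x) as Python source."""
    body = '1'
    for _ in range(n):
        body = f'x * ({body})'
    return f'def power_{n}(x):\n    return {body}\n'

src = specialize_power(3)
print(src)   # def power_3(x):
             #     return x * (x * (x * (1)))

ns = {}
exec(src, ns)
assert ns['power_3'](2) == power(2, 3) == 8
```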

30 citations


Journal ArticleDOI
06 Jun 1994
TL;DR: A verification method for proving the correctness of formula transformations makes it possible to prove just once that a formula transformation corresponds to a program transformation, removing the need to prove separately the correctness of each transformed program.
Abstract: The incorporation of a recovery algorithm into a program can be viewed as a program transformation, converting the basic program into a fault-tolerant version. We present a framework in which such program transformations are accompanied by a corresponding formula transformation which obtains properties of the fault-tolerant versions of the programs from properties of the basic programs. Compositionality is achieved when every property of the fault-tolerant version can be obtained from a transformed property of the basic program. A verification method for proving the correctness of formula transformations is presented. This makes it possible to prove just once that a formula transformation corresponds to a program transformation, removing the need to prove separately the correctness of each transformed program. Keywords: Parallel Algorithms, Distributed Algorithms, Fault-Tolerance, Specification, Verification.

26 citations


04 Nov 1994
TL;DR: The principles that have driven the design and implementation of vpo, the implications of these principles on vpo's implementation, and its use as a key component for the realization of several other applications are described.
Abstract: Current and future high-performance systems require language processors that can generate code that fully exploits the power of the underlying architecture. A key and necessary component of such language processors is a global code improver. This article describes the key principles behind the design and implementation of a global code improver that has been used to construct several high-quality compilers and other program transformation and analysis tools. The code improver, called vpo, employs a paradigm of compilation that has proven to be flexible and adaptable: all code-improving transformations are performed on a target-specific representation of the program. The aggressive use of this paradigm yields a code improver with several valuable properties. Four properties stand out. First, vpo is language and compiler independent. That is, it has been used to implement compilers for several different computer languages. For the C programming language, it has been used with several front ends, each of which generates a different intermediate language. Second, because all code improvements are applied to a single low-level intermediate representation, phase ordering problems are minimized. Third, vpo is easily retargeted and handles a wide variety of architectures. In particular, vpo's structure allows new architectures and new implementations of existing architectures to be accommodated quickly and easily. Fourth and finally, because of its flexible structure, vpo has several other interesting uses in addition to its primary use in an optimizing compiler. This article describes the principles that have driven the design of vpo and the implications of these principles on vpo's implementation. The article concludes with a brief description of vpo's use as a back end with front ends for several different languages, and its use as a key component for the realization of several other applications.

25 citations


Journal ArticleDOI
TL;DR: A compilation system for the concurrent programming language Program Composition Notation (PCN) that incorporates a concurrent-transformation system that allows abstractions to be defined through concurrent source-to-source transformations; these convert programmer-defined operators into a core notation.
Abstract: We describe a compilation system for the concurrent programming language Program Composition Notation (PCN). This notation provides a single-assignment programming model that permits concurrent-programming concerns such as decomposition, communication, synchronization, mapping, granularity, and load balancing to be addressed separately in a design. PCN is also extensible with programmer-defined operators, allowing common abstractions to be encapsulated and reused in different contexts. The compilation system incorporates a concurrent-transformation system that allows abstractions to be defined through concurrent source-to-source transformations; these convert programmer-defined operators into a core notation. Run-time techniques allow the core notation to be compiled into a simple concurrent abstract machine which can be implemented in a portable fashion using a run-time library. The abstract machine provides a uniform treatment of single-assignment and mutable data structures, allowing data sharing between concurrent and sequential program segments and permitting integration of sequential C and Fortran code into concurrent programs. This compilation system forms part of a program development toolkit that operates on a wide variety of networked workstations, multicomputers, and shared-memory multiprocessors. The toolkit has been used both to develop substantial applications and to teach introductory concurrent-programming classes, including a freshman course at Caltech.

23 citations


Dissertation
01 Jan 1994
TL;DR: The results show that program transformation is a viable method of renovating old (370 assembler) code in a cost-effective way, and that MetaWSL provides an effective basis for clearly and concisely expressing the required transformations.
Abstract: This thesis addresses the software maintenance problem of extracting high-level designs from code. The investigated solution is to use a mathematically-based formal program transformation system. The resulting tool, the Maintainer's Assistant, is based on Ward's [177] WSL (wide spectrum language) and method of proving program equivalence. The problems addressed include: how to reverse engineer from code alone (the only reliable source of information about a program [158]), how to express program transformations within the system, what kinds of transformations should be incorporated, how to make the tool simple to use, how to perform abstraction and how to create a tool suitable for use with large programs. Using the Maintainer's Assistant, the program code is automatically translated into WSL and the transformations, although tested for valid applicability by the system, are interactively applied by the user. Notable features include a mathematical simplifier, a large flexible transformation catalogue and, significantly, the use of an extension of WSL, MetaWSL, for representing the transformations. MetaWSL expands WSL by incorporating a variety of extensions, including: program editing statements, pattern matching and template filling functions, symbolic mathematics and logic functions, statements for moving within the program's syntax tree and statements for repeating an operation at each node of the tree. Using MetaWSL, 80% of the 601 transformations can be expressed in less than 20 program statements. The Maintainer's Assistant has been used on a wide variety of examples of up to several thousand lines, including commercial software written in IBM 370 assembler. It has been possible to transform initially unstructured programs into a hierarchy of procedures, facilitating subsequent design recovery. These results show that program transformation is a viable method of renovating old (370 assembler) code in a cost-effective way, and that MetaWSL provides an effective basis for clearly and concisely expressing the required transformations.


Book ChapterDOI
14 Sep 1994
TL;DR: In this paper, the authors propose a non-standard semantics for data flow analysis of constraint logic programs (clp-like programs), which is equivalent to the standard semantics for suspension-free programs.
Abstract: Because of synchronization based on blocking ask, some of the most important techniques for data flow analysis of (sequential) constraint logic programs (clp) are no longer applicable to cc languages. In particular, the generalized approach to the semantics, intended to factorize the (standard) semantics so as to make explicit the domain-dependent features (i.e. operators and semantic objects which may be influenced by abstraction) becomes useless for relevant applications. A possible solution to this problem is based on a more abstract (non-standard) semantics: the success semantics, which models non suspended computations only. With a program transformation (NoSynch) that simply ignores synchronization, we obtain a clp-like program which allows us to apply standard techniques for data flow analysis. For suspension-free programs the success semantics is equivalent to the standard semantics thus justifying the use of suspension analysis to generate sound approximations. A second transformation (Angel) is introduced, applying a different abstraction of synchronization in possibly suspending programs and resulting in a framework which is adequate to suspension analysis. Applicability and accuracy of these solutions are investigated.


Proceedings ArticleDOI
06 Apr 1994
TL;DR: A program transformation process is described, which transforms originally procedural systems to object-oriented systems, and the objects of the resulting system may then be used for further object-oriented systems engineering, avoiding many problems arising in connection with procedural software reuse.
Abstract: Object-oriented concepts seem to be useful concerning the reuse of existing software. Therefore a transformation of procedural programs to object-oriented programs becomes an important process to enhance the reuse potential of procedural programs. In this paper we describe a program transformation process, which transforms originally procedural systems to object-oriented systems. The objects of the resulting system may then be used for further object-oriented systems engineering, avoiding many problems arising in connection with procedural software reuse (i.e., module interconnection, etc.).
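A schematic before/after of the kind of restructuring described, reduced to a toy (the paper's process operates on whole procedural systems, not single functions); all names here are illustrative:

```python
# Procedures clustered around a shared data structure become methods of a
# class that owns that structure.

# Before: procedural style, a dict passed to free functions.
def make_account(owner):
    return {'owner': owner, 'balance': 0}

def deposit(acc, amount):
    acc['balance'] += amount

# After: the data and the procedures that use it are encapsulated.
class Account:
    def __init__(self, owner):
        self.owner = owner
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount

acc = Account('ada')
acc.deposit(10)
assert acc.balance == 10
```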

Proceedings Article
01 Nov 1994
TL;DR: This paper considers a class of commonly encountered computations whose "natural" specification is essentially sequential, and shows how algebraic properties of the operators involved can be used to transform them into divide-and-conquer programs that are considerably more efficient, both in theory and in practice, on parallel machines.
Abstract: Most of the research, to date, on optimizing program transformations for declarative languages has focused on sequential execution strategies. In this paper, we consider a class of commonly encountered computations whose "natural" specification is essentially sequential, and show how algebraic properties of the operators involved can be used to transform them into divide-and-conquer programs that are considerably more efficient, both in theory and in practice, on parallel machines.
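A minimal sketch of the underlying idea, under the assumption that the combining operator is associative: a left-to-right accumulation re-expressed as a balanced divide-and-conquer whose halves could run in parallel (depth O(log n) instead of O(n)). Names are illustrative.

```python
import operator

def reduce_seq(op, xs, unit):
    # The "natural", essentially sequential specification.
    acc = unit
    for x in xs:
        acc = op(acc, x)
    return acc

def reduce_dc(op, xs, unit):
    # Valid only because op is associative; the two halves are independent
    # and could be evaluated on separate processors.
    if not xs:
        return unit
    if len(xs) == 1:
        return xs[0]
    mid = len(xs) // 2
    return op(reduce_dc(op, xs[:mid], unit), reduce_dc(op, xs[mid:], unit))

xs = list(range(1, 100))
assert reduce_seq(operator.add, xs, 0) == reduce_dc(operator.add, xs, 0)
```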

Dissertation
01 Jan 1994
TL;DR: The method has indicated that acquiring a data design from existing data intensive program code by program transformation with human assistance is an effective method in software maintenance.
Abstract: The problem area addressed in this thesis is extraction of a data design from existing data intensive program code. The purpose of this is to help a software maintainer to understand a software system more easily, because a view of a software system at a high abstraction level can be obtained. Acquiring a data design from existing data intensive program code is an important part of reverse engineering in software maintenance. A large proportion of software systems currently needing maintenance is data intensive. The research results in this thesis can be directly used in a reverse engineering tool. A method has been developed for acquiring data designs from existing data intensive programs, COBOL programs in particular. Program transformation is used as the main tool. Abstraction techniques and the method of crossing levels of abstraction are also studied for acquiring data designs. A prototype system has been implemented based on the method developed. This involved implementing a number of program transformations for data abstraction, and thus contributing to the production of a tool. Several case studies, including one case study using a real program with 7000 lines of source code, are presented. The experiment results show that the Entity-Relationship Attribute Diagrams derived from the prototype can represent the data designs of the original data intensive programs. The original contribution of the thesis is that the approach presented can identify and extract data relationships from the existing code by combining analysis of data with analysis of code. The approach is believed to be able to provide better capabilities than other work in the field. The method has indicated that acquiring a data design from existing data intensive program code by program transformation with human assistance is an effective method in software maintenance. Future work is suggested at the end of the thesis, including extending the method to build an industrial strength tool.

Book ChapterDOI
01 Jan 1994
TL;DR: The Weight Finder is introduced, an advanced profiler for Fortran programs, which is based on a von Neumann architecture, to detect the most important regions of code in the program, as far as execution time is concerned.
Abstract: This paper introduces the Weight Finder, an advanced profiler for Fortran programs, which is based on a von Neumann architecture. Existing Fortran codes are generally too large to analyze fully in depth with respect to performance tuning. It is the responsibility of the Weight Finder to detect the most important regions of code in the program, as far as execution time is concerned. Program transformation systems, compilers and users may then subsequently concentrate their optimization efforts upon these areas in code.
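Not the Weight Finder itself, but the idea in miniature using Python's standard profiler: measure where execution time goes and rank the heaviest regions, so that tuning effort can be concentrated there. The example workload functions are invented.

```python
import cProfile
import io
import pstats

def hot():
    # Dominates the run time; this is where optimization effort belongs.
    return sum(i * i for i in range(200_000))

def cold():
    return sum(range(1_000))

def program():
    hot()
    cold()

prof = cProfile.Profile()
prof.runcall(program)

# Report the five most expensive regions by cumulative time.
stream = io.StringIO()
pstats.Stats(prof, stream=stream).sort_stats('cumulative').print_stats(5)
print(stream.getvalue())
```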

15 Dec 1994
TL;DR: This thesis studies mechanisms for program transformation at the source program level, in the context of Polya, which attempts to support modularization and at the same time incorporate the operations that are provided by the modules in the programming language itself.
Abstract: In understanding complex algorithms, the notions of encapsulation and modularization have played a key role. An algorithm is broken into several parts or modules, and understanding of each part is independent of the others. In addition, each part contains details that are not needed by other parts and so can be hidden from them. Programming languages provide support for encapsulation and modularization in many different forms. Early programming languages provided the procedure and function as a means for modularization. Later, files were introduced as a means of modularizing programs. More sophisticated mechanisms were then introduced, like modules, packages, structures, and classes. In all these cases, the interface to a module remained the procedure or function call. Programs that use such modules contain calls to functions and procedures for communicating with a module. Ideally, using the operations that are provided by a module should be done in exactly the same way as using operations of primitive types of the programming language. Primitive operations of the language and operations provided by modules should be easy to intermix. In addition, substituting one module for another that has the same functionality but a different implementation should involve a minimal amount of effort. Recently, a new programming language, Polya, has been designed, which attempts to support modularization and at the same time incorporate the operations that are provided by the modules in the programming language itself. This is done by a sophisticated type-definition facility and a mechanism for transforming programs at the source-program level. This thesis studies mechanisms for program transformation at the source program level, in the context of Polya. Program transformation is based on a set of transformation rules that prescribe how a part of a program is to be transformed, and a set of directives that prescribe which program variables are to be transformed. We first give an algorithm for processing program transformations as described by the transform construct. The algorithm constructs a coordinate transformation of an abstract program based on a set of transforms and transform directives for transforming program variables. We then study the problem of transforming expressions that have compound types. Both the type constructor and the component expressions of the original expression may be transformed. No extra rules need be added to the bodies of transforms that transform the type constructor and the component expressions. In the sequel we investigate the problem of transforming procedures and functions that have parameters that need to be transformed. Finally, the problem of transforming program-transformation rules is studied. The program transformation techniques are applied to two well-known algorithms. The algorithms are source programs, which are subsequently transformed to programs of conventional programming languages, and then compiled and run.

Proceedings ArticleDOI
01 Apr 1994
TL;DR: An original approach to automatic array alignment, the step in the hierarchical transformation system aimed at the efficient execution of shared memory programs on distributed memory machines, is presented.
Abstract: Presents an original approach to automatic array alignment, the step in the hierarchical transformation system aimed at the efficient execution of shared memory programs on distributed memory machines. The array alignment algorithm deals with a broad set of intra-dimension and inter-dimension alignment preferences, including offsets, strides, permutations, embeddings, and their combinations. The authors discuss the algorithm and the tests performed on the Connection Machine CM-2.

Proceedings Article
23 Aug 1994
TL;DR: In this paper, the Davis-Putnam procedure is used to enumerate all stable models of disjunctive logic programs without repetition and without the need for a minimality check.
Abstract: In analogy to the Davis-Putnam procedure we develop a new procedure for computing stable models of propositional normal disjunctive logic programs, using case analysis and simplification. Our procedure enumerates all stable models without repetition and without the need for a minimality check. Since it is not necessary to store the set of stable models explicitly, the procedure runs in polynomial space. We allow clauses with empty heads, in order to represent truth or falsity of a proposition as a one-literal clause. In particular, a clause of form ∼A → expresses that A is constrained to be true, without providing a justification for A. Adding this clause to a program restricts its stable models to those containing A, without introducing new stable models. Together with A → this provides the basis for case analysis. We present our procedure as a set of rules which transform a program into a set of solved forms, which resembles the standard method for presenting unification algorithms. Rules are sound in the sense that they preserve the set of stable models. A subset of the rules is shown to be complete in the sense that for each stable model a solved form can be obtained. The method allows for concise presentation, flexible choice of a control strategy and simple correctness proofs.
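For orientation, here is what the paper's procedure computes, not how it computes it: the brute-force enumeration below, which checks every candidate against the Gelfond-Lifschitz reduct, is exactly the kind of search the solved-form method improves on. The rule encoding is invented for this sketch.

```python
from itertools import chain, combinations

def minimal_model(definite_rules):
    # Least model of a definite program by naive fixpoint iteration.
    model = set()
    changed = True
    while changed:
        changed = False
        for head, pos in definite_rules:
            if pos <= model and head not in model:
                model.add(head)
                changed = True
    return model

def is_stable(program, candidate):
    # Gelfond-Lifschitz reduct: drop rules whose negative body meets the
    # candidate, strip negative literals from the survivors, then compare
    # the least model of the reduct with the candidate.
    reduct = [(h, pos) for h, pos, neg in program if not (neg & candidate)]
    return minimal_model(reduct) == candidate

def stable_models(program, atoms):
    subsets = chain.from_iterable(
        combinations(atoms, r) for r in range(len(atoms) + 1))
    return [set(s) for s in subsets if is_stable(program, set(s))]

# Rules are (head, positive_body, negative_body) triples (invented encoding):
# p :- not q.      q :- not p.
prog = [('p', set(), {'q'}), ('q', set(), {'p'})]
print(stable_models(prog, ['p', 'q']))   # -> [{'p'}, {'q'}]
```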

Book ChapterDOI
17 Jun 1994
TL;DR: A new constructive characterization is given of those semantics for disjunctive logic programs which are extensions of the well-founded semantics for normal programs; the computation consists of an inefficient preprocessing phase (implementing the program transformation procedure) and an efficient runtime computation, obtained as a variation of any effective procedural semantics for the well-founded model.
Abstract: In this paper, we propose a new constructive characterization of those semantics for disjunctive logic programs which are extensions of the well-founded semantics for normal programs. Based on considerations about how disjunctive information is treated by a given semantics, we divide the computation of that semantics into two phases. The first one is a program transformation phase, which applies axiom schemata expressing how derivations involving disjunctions are made in the given semantic framework. The second one is a constructive phase, based on a variation of the well-founded model construction for normal programs. We apply this two-phase procedural semantics to the computation of the static semantics of disjunctive logic programs as a case-study, showing how it works and what its results are in several examples. A main perspective of this proposal is a procedural semantics for disjunctive programs consisting of an inefficient preprocessing phase (implementing the program transformation procedure), which, however, need be performed only once, and of an efficient runtime computation, obtained as a variation of any effective procedural semantics for the well-founded model.

Book ChapterDOI
01 Jan 1994
TL;DR: The transformation rules are based on a theory of contextual equivalence for functional languages with imperative features; such notions of equivalence are fundamental for the process of program specification, derivation, transformation, refinement and other forms of code generation and optimization.
Abstract: In this paper we describe progress towards a theory of transformational program development. The transformation rules are based on a theory of contextual equivalence for functional languages with imperative features. Such notions of equivalence are fundamental for the process of program specification, derivation, transformation, refinement and other forms of code generation and optimization. This paper is dedicated to Professor Satoru Takasu.


Journal ArticleDOI
TL;DR: The techniques have been applied in practice to a wide range of source programs and analysis problems, including assessment problems, and meet some of the integrity requirements for verification tools given in ‘The procurement of safety critical software in defence equipment’ (MoD, 1991).
Abstract: This paper describes an approach to the semantic analysis of procedural code. The techniques differ from those adopted in current static analysis tools such as MALPAS (Bramson, 1984) and SPADE (Clutterbuck and Carre, 1988) in two key respects: (1) A database is used, together with language-specific and language-independent data models, as a repository for all information about a program or set of programs which is required for analysis, and for storing and interrelating the results of analyses; (2) The techniques aim to treat the full language under consideration by a process of successive transformation and abstraction from the source code until a representation is obtained which is amenable to analysis. This abstraction process can include the production of formal specifications from code. The techniques have been partially implemented for the IBM OS/VS dialect of COBOL '74 and for FORTRAN '77. Several components of the resulting toolset have been formally specified in Z, thus meeting some of the integrity requirements for verification tools given in ‘The procurement of safety critical software in defence equipment’ (MoD, 1991). The techniques have been applied in practice to a wide range of source programs and analysis problems (Lano and Haughton, 1993b; Lano, et al., 1991), including assessment problems (Lloyd's Register, 1992, 1993; Hornsby and Eldridge, 1990). Section 1 gives an overview of the analysis process. Section 2 describes the representations used to support the process. Section 3 describes some of the techniques involved, and Section 4 gives examples of applications of the process. The Appendix contains extracts from a large case study carried out using tools developed to support the process.


Book ChapterDOI
16 Oct 1994
TL;DR: An approach to transforming the algebraic specification of a mathematical domain of computation into a knowledge base, preserving the semantics determined in the specification, is introduced; it involves the algebraic specification language Formal-⌆ and the hybrid knowledge representation system Mantra.
Abstract: An approach to transforming the algebraic specification of a mathematical domain of computation into a knowledge base, preserving the semantics determined in the specification, is introduced. It involves the algebraic specification language Formal-⌆ and the hybrid knowledge representation system Mantra. In the framework of Formal-⌆ mathematical domains of computation are represented algebraically. The transformation aims at achieving the executability of a specification.

Book ChapterDOI
06 Jun 1994
TL;DR: This paper describes a Knowledge-Based Program Transformation System designed on top of an object-oriented knowledge base for automatic program transformation and optimization, in which the conventional unification algorithm has been modified into an analogical unification algorithm.
Abstract: This paper describes a Knowledge-Based Program Transformation System (KBPTS) that has been designed on top of an object-oriented knowledge base for the purpose of automatic program transformation and optimization. In KBPTS, a program can be specified by means of a flowchart or a set of logical descriptions. Generally, given a specification of a program, it can be synthesized with a set of procedural methods. However, a simple substitution of a method for a basic computation in a specification may leave a large amount of possible optimizations unexplored. Assuming that a set of efficient algorithms for abstract problems (e.g., graph algorithms) is implemented and saved in an object-oriented knowledge base, a given program (or parts of it) can be evaluated by those algorithms efficiently with proper instantiations of variables. To identify the proper algorithms, the conventional unification algorithm has been modified into an analogical unification algorithm. Also, in order to control the overall search space more clearly, a set of global search strategies are encoded in meta rules.


Book ChapterDOI
01 Jan 1994
TL;DR: A practical tool is described which enables the user to extract high level specifications from existing source codes, using semantic preserving formal transformations, and extensions are described to support the acquisition of explicit timing information from real-time source codes.
Abstract: A practical tool is described which enables the user to extract high level specifications from existing source codes, using semantic preserving formal transformations. A brief overview of the theoretical foundation is given. Extensions are then described to support the acquisition of explicit timing information from real-time source codes.