
Showing papers in "New Generation Computing in 1986"


Journal ArticleDOI
TL;DR: An approach for reasoning about events and time within a logic programming framework where the notion of event is taken to be more primitive than that of time and both are represented explicitly by means of Horn clauses augmented with negation by failure.
Abstract: We outline an approach for reasoning about events and time within a logic programming framework. The notion of event is taken to be more primitive than that of time and both are represented explicitly by means of Horn clauses augmented with negation by failure. The main intended applications are the updating of databases and narrative understanding. In contrast with conventional databases which assume that updates are made in the same order as the corresponding events occur in the real world, the explicit treatment of events allows us to deal with updates which provide new information about the past. Default reasoning on the basis of incomplete information is obtained as a consequence of using negation by failure. Default conclusions are automatically withdrawn if the addition of new information renders them inconsistent. Because events are differentiated from times, we can represent events with unknown times, as well as events which are partially ordered and concurrent.

1,573 citations
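
As a rough illustration of the approach, here is a minimal Prolog sketch (not the paper's full calculus) in which events are primitive and what holds at a time is derived by negation as failure; the predicates happens/2, initiates/2, terminates/2, holds_at/2 and the example events are hypothetical.

    % Event records (may be asserted in any order, including new information about the past).
    happens(hire(mary), 1).
    happens(fire(mary), 5).

    initiates(hire(P), employed(P)).
    terminates(fire(P), employed(P)).

    % F holds at time T if some earlier event initiated it and no intervening event
    % is known to have terminated it (default persistence via negation by failure).
    holds_at(F, T) :-
        happens(E, T0), T0 =< T,
        initiates(E, F),
        \+ clipped(F, T0, T).

    clipped(F, T0, T) :-
        happens(E1, T1), T0 < T1, T1 =< T,
        terminates(E1, F).

    % ?- holds_at(employed(mary), 3).   % succeeds
    % ?- holds_at(employed(mary), 7).   % fails: fire(mary) at time 5 clips the fact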


Journal ArticleDOI
TL;DR: In this paper, the authors contrast the interpreted-symbolic-structure approach and situated-automata approach, which seeks to analyze knowledge in terms of relations between the state of a machine and its environment over time using logic as a metalanguage in which the analysis is carried out.
Abstract: Although the concept of knowledge plays a central role in artificial intelligence, the theoretical foundations of knowledge representation currently rest on a very limited conception of what it means for a machine to know a proposition. In the current view, the machine is regarded as knowing a fact if its state either explicitly encodes the fact as a sentence of an interpreted formal language or if such a sentence can be derived from other encoded sentences according to the rules of an appropriate logical system. We contrast this conception, the interpreted-symbolic-structure approach, with another, the situated-automata approach, which seeks to analyze knowledge in terms of relations between the state of a machine and the state of its environment over time, using logic as a metalanguage in which the analysis is carried out.

162 citations


Journal ArticleDOI
TL;DR: The Alexander Method is the first solution exhibiting all these properties: the program to evaluate the query always terminates, the relational program is produced by a pure compilation of a source query and of the axioms, and the relational operations are optimized.
Abstract: We propose a technique for handling recursive axioms in deductive databases. More precisely, we solve the following problem: given a relational query including virtual relations defined from axioms (Horn clauses, with the variables of the conclusion appearing in the hypotheses), which can be recursive, how to translate this query into a relational program, i.e. a set of relational operations concerning only real (not virtual) relations. Our solution has the following properties: the program that evaluates the query always terminates; the relational program is produced by a pure compilation of the source query and the axioms; and the relational operations are optimized. As far as we know, the Alexander Method is the first solution exhibiting all these properties. This work is partly funded by Esprit Project 112 (KIMS).

135 citations
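
For concreteness, the kind of input handled is a set of Horn-clause axioms defining a virtual relation, possibly recursively, over stored relations. The tiny Prolog/Datalog example below (our own, not from the paper) shows such axioms and a query, which the Alexander Method would compile into a terminating set of relational operations over the stored relation only.

    % Stored (real) relation.
    parent(john, mary).
    parent(mary, sue).

    % Virtual relation defined by recursive Horn clauses; every variable in the
    % conclusion also appears in the hypotheses.
    ancestor(X, Y) :- parent(X, Y).
    ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).

    % Query to be compiled into a relational program over parent/2 only:
    % ?- ancestor(john, W).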


Journal ArticleDOI
TL;DR: An extension of PROLOG using modal logic is presented, and a new deduction method is also given, based on a rule closer to the classical inference rule of PROLOG.
Abstract: In this paper we present an extension of PROLOG using modal logic. A new deduction method is also given based on a rule closer to the classical inference rule of PROLOG.

95 citations


Journal ArticleDOI
Wolfgang Bibel
TL;DR: In this paper, a deductive method for solving robot problems is presented, where the descriptions of initial and goal situations are given by logical formulas, and primitive actions are described by rules, i.e., logical formulas as well.
Abstract: A new deductive method for solving robot problems is presented in this paper. The descriptions of initial and goal situations are given by logical formulas. The primitive actions are described by rules, i.e., logical formulas as well. Altogether this results in a problem description like those in program synthesis or logic programming. A solution is generated by a proof of this description, as usual in logic programming, except that here proofs have to be strictly linear. This restriction is the key to our solution; it can easily be added as an option to any theorem prover, such as one based on the connection method used here. In fact, this restriction speeds up the proof search considerably. At the same time, our approach offers an elegant solution to the frame problem.

90 citations
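
The following Prolog sketch (ours, not the paper's connection-method prover) conveys the flavour of such a problem description: states as sets of facts, primitive actions as rules with preconditions, additions, and deletions, and plans found by a linear search. The blocks-world actions and predicate names are hypothetical.

    % action(Name, Preconditions, Additions, Deletions)
    action(pickup(X),  [clear(X), handempty], [holding(X)], [clear(X), handempty]).
    action(putdown(X), [holding(X)], [clear(X), handempty], [holding(X)]).

    % Iterative deepening over plan length keeps the search from diverging.
    solve(State, Goal, Plan) :- length(Plan, _), plan(State, Goal, Plan).

    plan(State, Goal, []) :- holds_all(Goal, State).
    plan(State, Goal, [A | Plan]) :-
        action(A, Pre, Add, Del),
        holds_all(Pre, State),
        subtract(State, Del, S1),     % remove deleted facts
        append(Add, S1, S2),          % add new facts
        plan(S2, Goal, Plan).

    holds_all([], _).
    holds_all([F | Fs], State) :- member(F, State), holds_all(Fs, State).

    % ?- solve([clear(a), handempty], [holding(a)], P).
    % P = [pickup(a)]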


Journal ArticleDOI
K A Bowen
TL;DR: It is shown how frames, semantic nets, scripts, message passing, and non-standard control can be represented in Prolog.
Abstract: The nature of a metalevel extension of Prolog is outlined. The key features include the treatment of theories (databases) and metalevel names as first-class objects which may be the values of variables. The use of the power of these constructs in traditional knowledge representation is explored. In particular, it is shown how frames, semantic nets, scripts, message passing, and non-standard control can be represented.

76 citations
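
As an illustration of the kind of encoding meant (a common textbook-style sketch, not necessarily the paper's metalevel formulation), frames with slot inheritance can be written as Prolog facts plus a lookup rule; the animal taxonomy below is hypothetical.

    % frame(Name, isa(Parent), list of Slot-Value pairs)
    frame(animal,  isa(none),   [alive-yes]).
    frame(bird,    isa(animal), [locomotion-fly, covering-feathers]).
    frame(penguin, isa(bird),   [locomotion-walk]).

    % A locally stored slot value overrides an inherited one.
    slot(Frame, Slot, Value) :-
        frame(Frame, _, Slots), member(Slot-Value, Slots), !.
    slot(Frame, Slot, Value) :-
        frame(Frame, isa(Super), _), Super \== none,
        slot(Super, Slot, Value).

    % ?- slot(penguin, locomotion, V).   % V = walk
    % ?- slot(penguin, covering, V).     % V = feathers (inherited from bird)
    % ?- slot(penguin, alive, V).        % V = yes (inherited from animal)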


Journal ArticleDOI
TL;DR: A new style of information processing, requirements for knowledge representation, a knowledge representation satisfying these requirements, a knowledge processing system designed on this basis, and a new style of problem solving using this system are discussed.

Abstract: A new generation computer is expected to be the knowledge processing system of the future. However, many aspects of this technology are as yet unknown, and a number of fundamental concepts directly concerning knowledge processing system design, such as knowledge, data, inference, communication, information management, learning, and the human interface, need investigation.

49 citations


Journal ArticleDOI
TL;DR: The Incremental Query as mentioned in this paper is a new query mechanism for Prolog, which can be used to display the state of the substitution of an incremental query in Prolog as a spreadsheet.
Abstract: We believe that currently marketed programs leave unexploited much of the potential of the spreadsheet interface. The purpose of our work is to obtain suggestions for wider application of this interface by showing how to obtain its main features as a subset of logic programming. Our work is based on two observations. The first is that spreadsheets would already be a useful enhancement to interactive languages such as APL and Basic. Although Prolog is also an interactive language, this interface cannot be used in the same direct way. Hence our second observation: the usual query mechanism of Prolog does not provide the kind of interaction this application requires. But it can be provided by the Incremental Query, a new query mechanism for Prolog. The two observations together yield the spreadsheet as a display of the state of the substitution of an incremental query in Prolog. Recalculation of dependent cells is achieved by automatic modification of the query in response to a new increment that would make it unsolvable without the modification.

28 citations
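
A minimal Prolog sketch of the underlying idea (ours, not the paper's Incremental Query mechanism): cell definitions are evaluated against an environment of already computed cells, so adding a new definition and re-solving plays the role of recalculation. The predicate names and cell syntax are our own.

    % eval_sheet(+Definitions, +Env0, -Env): each definition is Name = Expr,
    % where Expr may refer to previously defined cell names.
    eval_sheet([], Env, Env).
    eval_sheet([Name = Expr | Rest], Env0, Env) :-
        eval_expr(Expr, Env0, Value),
        eval_sheet(Rest, [Name-Value | Env0], Env).

    eval_expr(N, _, N)       :- number(N), !.
    eval_expr(Cell, Env, V)  :- atom(Cell), !, member(Cell-V, Env).
    eval_expr(X + Y, Env, V) :- eval_expr(X, Env, VX), eval_expr(Y, Env, VY), V is VX + VY.
    eval_expr(X * Y, Env, V) :- eval_expr(X, Env, VX), eval_expr(Y, Env, VY), V is VX * VY.

    % ?- eval_sheet([a1 = 5, b1 = 3, c1 = a1 + b1], [], Env).
    % Env = [c1-8, b1-3, a1-5]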


Journal ArticleDOI
TL;DR: In this paper, the authors present a parallel execution model for Horn Clause logic programs based on the generator-consumer approach, which can be implemented efficiently with small run-time overhead.
Abstract: This paper presents a parallel execution model for exploiting AND-parallelism in Horn Clause logic programs. The model is based upon the generator-consumer approach, and can be implemented efficiently with small run-time overhead. Other related models that have been proposed to minimize the run-time overhead are unable to exploit the full parallelism inherent in the generator-consumer approach. Furthermore, our model performs backtracking more intelligently than these models. We also present two implementation schemes to realize our model: one has a coordinator to control the activities of processes solving different literals in the same clause; and the other achieves synchronization by letting processes pass messages to each other in a distributed fashion. Trade-offs between these two schemes are then discussed.

25 citations


Journal ArticleDOI
Y Shoham
TL;DR: This work proposes a list of requirements that a rigorous theory of time and change must meet and “benchmarks” some of the better known temporal formalisms that have been proposed in Artificial Intelligence by stating which of the requirements are met, in the authors' opinion, by each formalism.

Abstract: In order to focus the discussion on temporal reasoning in Artificial Intelligence, we propose a list of requirements that a rigorous theory of time and change must meet. These requirements refer to the desired expressiveness of the formalism (such as representing continuous change) and to potential pitfalls (such as the inter- and intra-frame problems) that should be avoided. We “benchmark” some of the better known temporal formalisms that have been proposed in Artificial Intelligence by stating which of the requirements are met, in our opinion, by each formalism.

25 citations


Journal ArticleDOI
N Nilsson
TL;DR: It is demonstrated that Prolog can be used for implementation of table driven parsers that generalize deterministic LR(k)-parsing techniques to the case of ambiguous grammars and an experimental system, AID, based on this technique is presented.
Abstract: This paper presents an alternative approach to implementation of DCGs. In contrast to the standard implementation we use the well-known bottom-up SLR(1)-parsing technique. An experimental system, AID, based on this technique is presented and discussed. The aim of this work is twofold. Firstly, we describe an alternative principle of implementation of DCGs, which makes it possible to cope with left-recursive DCGs and to avoid unnecessary backtracking when parsing. Secondly, we demonstrate that Prolog can be used for implementation of table driven parsers that generalize deterministic LR(k)-parsing techniques to the case of ambiguous grammars. The parsers generated by our system are deterministic whenever the submitted grammar is SLR(1). Otherwise the shift/reduce or reduce/reduce conflicts are stored in the generalized table and are used as Prolog backtrack points.
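
To make the motivation concrete, here is a small left-recursive DCG of the kind the bottom-up SLR(1) approach is designed to handle (our own example, not from the paper); under the standard top-down DCG translation, the query shown loops instead of terminating.

    % Left-recursive grammar: acceptable to a bottom-up SLR(1) parser,
    % non-terminating under Prolog's standard top-down DCG execution.
    expr --> expr, [+], term.
    expr --> term.
    term --> [id].

    % ?- phrase(expr, [id, +, id]).
    % Loops with the standard translation; a bottom-up SLR(1) parser
    % accepts the same sentence without backtracking.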

Journal ArticleDOI
TL;DR: A more detailed account is given of the implementation for a lazy combinator reduction machine, as this offers one solution to well-known problems with debugging functional programs in the context of lazy evaluation and combinator code.
Abstract: A functional computation involves substituting for function applications in an expression until that expression is reduced to normal form. The views of computations presented by the sequence- and state-oriented debugging tools of imperative systems are inappropriate for use with functional computations. New debugging tools are needed to aid the development of functional programs, especially in the context of lazy evaluation.

Journal ArticleDOI
TL;DR: The development of one such model, its implementation and the design of a data-driven machine to support it are described, which exploits a form of parallelism known as OR-parallelism and is particularly suited to database applications, although it would also support general applications.
Abstract: Research in the area of parallel evaluation mechanisms for logic programs has led to the proposal of a number of schemes exploiting various forms of parallelism. Many of the early models have been based on the conventional approach of organising concurrent components of computations as communicating processes. More recently, however, models based on more novel computation organisations, in particular, data-driven organisations, have been proposed. This paper describes the development of one such model, its implementation and the design of a data-driven machine to support it. The model exploits a form of parallelism known as OR-parallelism and is particularly suited to database applications, although it would also support general applications. It is envisaged that the proposed machine may be refined into an efficient database engine, which can then be a component of a more powerful and integrated logic programming machine.

Journal ArticleDOI
TL;DR: This paper describes an interpreter-centered list processing language TAO which supports the logic programming paradigm and the object-oriented programming paradigm together with the conventional procedural programming paradigm in the framework of the Lisp language.
Abstract: This paper describes an interpreter-centered list processing language TAO which supports the logic programming paradigm and the object-oriented programming paradigm together with the conventional procedural programming paradigm in the framework of the Lisp language. TAO allows the user to mix these programming paradigms in solving complicated and multifaceted AI problems. The fundamental mechanisms of these programming paradigms, namely unification, message passing and function calls, can be nested within one another in an expression. Thus the user can use the result of a function call or of message passing directly in a unification, and vice versa. TAO also supports concurrent programming. The implementation of the TAO interpreter on a Lisp machine called ELIS achieves remarkable efficiency.

Journal ArticleDOI
TL;DR: The syntax and semantics of the new language are described and detailed examples are given to demonstrate discourse models based on situation semantics in the proposed language.
Abstract: Complex indeterminates and their notation are introduced into Prolog. The syntax and semantics of the new language are described. The language builds in an extension to the usual unification. Detailed examples are given to demonstrate discourse models based on situation semantics in the proposed language. Discourse, connective and resource situations are included in the examples.

Journal ArticleDOI
Yukio Kaneda, Naoyuki Tamura, Koichi Wada, Hideo Matsuda, S Kuo
TL;DR: The sequential Prolog machine PEK as discussed by the authors is an experimental machine designed for high speed execution of Prolog programs, which includes bit slice microprocessor elements comprising a microprogram sequencer and ALU, and possesses hardware circuits for unification and backtracking.
Abstract: The sequential Prolog machine PEK currently under development is described. PEK is an experimental machine designed for high speed execution of Prolog programs. The PEK machine is controlled by horizontal-type microinstructions. The machine includes bit slice microprocessor elements comprising a microprogram sequencer and ALU, and possesses hardware circuits for unification and backtracking. The PEK machine consists of a host processor (MC68000) and a backend processor (PEK engine). A Prolog interpreter has been developed on the machine and the machine performance evaluated. A single inference can be executed in 89 microinstructions, and execution speed is approximately 60–70 KLIPS.

Journal ArticleDOI
TL;DR: This work addresses the problem of collecting information about failures and successes while unifying a set of equations and shows that an algorithm due to Yasuura is particularly well suited as a basis for a method to construct the maximal unifiable subset and minimal non-unifiable subsets in conjunction with the unification process.
Abstract: We address the problem of collecting information about failures and successes while unifying a set of equations. This is relevant to the study of efficient backtracking, for which Cox used the concept of maximal unifiable subsets while Bruynooghe and Pereira used a notion which is closely related to that of minimal non-unifiable subsets. As we show, both these concepts play a fundamental role in the process of exploring the search space for breadth-first resolution in logic programs. In a special case they lead to similar search strategies but in general have complementary and even incompatible aspects. We then show that an algorithm due to Yasuura is particularly well suited as a basis for a method to construct the maximal unifiable subsets and minimal non-unifiable subsets in conjunction with the unification process. In addition to its simplicity, this method provides an answer to two problems raised by Cox concerning the preservation of successful partial computations and unification without occur check.
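
To fix intuitions about the two notions, here is a naive Prolog enumeration (ours, not Yasuura's algorithm): a subset of equations is unifiable if solving all of its equations together succeeds, and the maximal unifiable and minimal non-unifiable subsets can be read off the enumeration.

    unify_all([]).
    unify_all([X = Y | Eqs]) :- X = Y, unify_all(Eqs).

    % Double negation tests solvability without leaving any bindings behind.
    unifiable_subset(Eqs, Sub) :-
        sub_list(Eqs, Sub),
        \+ \+ unify_all(Sub).

    sub_list([], []).
    sub_list([E | R], [E | S]) :- sub_list(R, S).
    sub_list([_ | R], S)       :- sub_list(R, S).

    % ?- unifiable_subset([X = f(Y), X = g(Z)], Sub).
    % Sub = [X = f(Y)] ;  Sub = [X = g(Z)] ;  Sub = []
    % Here both singletons are maximal unifiable subsets, and the whole set
    % is the only minimal non-unifiable subset.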

Journal ArticleDOI
TL;DR: The design of a high-speed cellular pattern matcher, called the Associative Linear Text Processor (ALTEP), is presented and it is shown that it is suitable for systems which store the database in fixed length blocks.
Abstract: The design of a high-speed cellular pattern matcher, called the Associative Linear Text Processor (ALTEP), is presented. ALTEP was originally designed for systems which use signature files as an access method. However, it is also suitable for systems which store the database in fixed length blocks. ALTEP is a linear array of logic cells which respond to commands sent from a central controller over a bus. A text block is loaded into the cells and pattern characters are broadcast to the cells for comparison. ALTEP has the capability of recognizing full regular expressions and is the only cellular logic array which has this capability. It requires O(p) steps for patterns which do not contain closures and O(len(max(T(P)))) steps for closures, where p is the length of the pattern and len(max(T(P))) is the length of the longest substring in the text which matched the closure.

Journal ArticleDOI
TL;DR: In this paper, the authors define efficient multiway dynamic mergers with constant delay, which are an essential component of a parallel logic programming language.

Abstract: Multiway dynamic mergers with constant delay are an essential component of a parallel logic programming language. Previous attempts to define efficient mergers have required complex optimising compilers and run-time support.

Journal ArticleDOI
TL;DR: It is shown that there exists a fixed logic program form which can provide a universal descriptive capability in the sense that any recursively enumerable language is expressed by a logic program obtained from the program form.
Abstract: In this paper we propose the concept of a logic program form. A logic program form is a kind of program abstraction where the skeleton of a program called program form is separated from its detailed structural information called interpretation. Given a logic program form, the class of logic programs obtained from the master form by giving interpretations is defined. It is shown that there exists a fixed logic program form which can provide a universal descriptive capability in the sense that any recursively enumerable language is expressed by a logic program obtained from the program form.

Journal ArticleDOI
TL;DR: Three different uses of Prolog are identified: building expert systems directly in ordinary Prolog, using Prolog as the implementation language for a higher level of interpretation, and extending Prolog with suitable features and using it directly.

Abstract: Prolog is becoming a popular language in A.I. applications and particularly in the implementation of knowledge based expert systems. We have identified three different uses of Prolog: (1) building expert systems directly in ordinary Prolog, (2) using Prolog as the implementation language for a higher level of interpretation, and (3) extending Prolog with suitable features and using it directly. In this paper, we define the three uses in more detail, compare them, and cite some concrete examples.
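
As a tiny illustration of use (1) (our own example, not from the paper), an expert-system rule can be written directly as an ordinary Prolog clause over a fact base.

    % Hypothetical fact base and diagnostic rules in plain Prolog.
    symptom(fever).
    symptom(cough).

    diagnosis(flu)  :- symptom(fever), symptom(cough).
    diagnosis(cold) :- symptom(cough), \+ symptom(fever).

    % ?- diagnosis(D).
    % D = flu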

Journal ArticleDOI
TL;DR: Man-machine dialogue functions were added to the production system, demonstrating the potential of the rule translation method for application to expert systems.

Abstract: Several attempts have been made to design a production system using Prolog. To construct a forward reasoning system, the rule interpreter is often written in Prolog, but its execution is slow. To develop an efficient production system, we propose a rule translation method in which production rules are translated into a Prolog program and forward reasoning is done by the translated program. To translate the rules, we adopted the technique developed in BUP, the bottom-up parsing system in Prolog. Man-machine dialogue functions were added to the production system, demonstrating the potential of our method for application to expert systems.
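
A minimal sketch of the general idea (ours, not the paper's BUP-based translator): a production rule such as "if a and b then add c" becomes a Prolog clause, and forward reasoning repeatedly fires translated rules against a dynamic fact base until nothing new can be derived.

    :- dynamic fact/1.

    % Translated form of the rule "if a and b then add c".
    fire :- fact(a), fact(b), \+ fact(c), assertz(fact(c)).

    % Forward chaining: keep firing rules until no rule applies.
    forward :- fire, !, forward.
    forward.

    % ?- assertz(fact(a)), assertz(fact(b)), forward, fact(c).
    % succeeds: c has been derived by forward reasoning.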

Journal ArticleDOI
TL;DR: This paper describes prospective models for a logic computer system, and its hardware and software components, using the language Concurrent Prolog as the single implementation, specification, and machine language.
Abstract: A logic computer system consists of an inference machine and a compatible logic operating system. This paper describes prospective models for a logic computer system, and its hardware and software components. The language Concurrent Prolog serves as the single implementation, specification, and machine language. The computer system is represented as a logic programming goal, logic_computer_system. Specification of the system corresponds to resolution of this goal. Clauses used to solve the goal, and the ensuing subgoals, progressively refine the machine, operating system, and computer system designs. In addition, the accumulation of all clauses describing the logic operating system constitutes its implementation. Logic computer systems with vastly different fundamental characteristics can be concisely specified in this manner. Two contrasting examples are given and discussed. An important characteristic of both peripheral devices and the overall computer system, namely whether they are restartable or perpetual, is examined. A method for operational initialization of the logic computer system is also presented. The same clauses which incrementally specify characteristics of the computer system also describe the manner in which this initialization takes place.

Journal ArticleDOI
TL;DR: The design and implementation of a Prolog-based representation system called DLOG is described; DLOG extends the definite clause language of Prolog with several features, including sets, descriptions, unary lambda abstracts, and simple integrity constraints.

Abstract: We describe the design and implementation of a Prolog-based representation system called DLOG. The DLOG representation language extends the definite clause language of Prolog with several features, including sets, descriptions, unary lambda abstracts, and simple integrity constraints. Our development methodology relies on the notion of a logic-based knowledge model, which is a synthesis of the concepts of representation language and database management data model. This model provides a framework for specifying DLOG’s syntax, semantics, query mechanism, and assertion mechanism. The semantics of DLOG is not first order. A specification of its semantics based on meta axioms expressed in Prolog provides an implementation of the non-first-order features. We provide a suggestion for the development of an amalgamation-independent specification of DLOG semantics.

Journal ArticleDOI
TL;DR: A logic programming language R+-Maple is presented which computes by solving equations instead of unifications and refutations and has the advantage over the traditional method which buries the rules, as they are closely connected with environments, deeply within the code of an interpreter.
Abstract: A computation of formulas of the full first order predicate calculus is performed by first converting the multi-variable formulas into a single variable presentation of the Theory of Pairs (TP). Pairs are symbolic expressions of LISP with only one atom, 0. Single variable calculus is suitable for both computations and mechanical theorem proving because the problems of multiple variable names and clashes between free and bound variables are eliminated. We present a logic programming language R+-Maple which computes by solving equations instead of unifications and refutations. Pairs permit a single one-variable equation, called the environment equation, to hold the values of all variables. Traditionally, environments are implementation tools used to carry bindings of variables during the computation. With the help of environment equations for the single variable of our calculus we make the environments visible within the framework of a first order theory. This allows a straightforward demonstration of the soundness of our computations. Moreover, the explicit form of environments allows experimentation with different forms of computational rules directly within the logic. The soundness of new rules can thus be readily proven formally. This is an advantage over the traditional method, which buries the rules, closely connected as they are with environments, deep within the code of an interpreter.

Journal ArticleDOI
Hideo Aiso

Journal ArticleDOI
TL;DR: This paper defines the class of “low cost calls” and introduces a technique of detecting and executing these calls in a modified shallow binding system known as “standardized shallow binding”, and finds that standardized shallow binding is also well suited for an efficient implementation of static scoping.
Abstract: This paper presents a general approach to the optimization of function calls. We define the class of “low cost calls” and introduce a technique for detecting and executing these calls in a modified shallow binding system known as “standardized shallow binding”. We show that with this technique the overhead of changing environments for low cost calls is reduced to nearly zero. The new method can be applied to any imperative or applicative language. In this paper LISP is taken as an example; it is shown how the technique has been applied in the implementation of a LISP interpreter. We prove that our method improves upon a number of optimizations that have been proposed recently. Finally, we find that standardized shallow binding is also well suited for an efficient implementation of static scoping.



Journal ArticleDOI
TL;DR: An algorithm is presented for inserting injection operations into denotational specifications as part of the typechecking process, for use in semantic processing systems that accept denotational specifications as input and mechanically calculate them for debugging the semantics.

Abstract: In describing the denotational semantics of programming languages, injection operations into sum domains are conventionally omitted for the sake of brevity. This in turn leads to difficulties for semantic processing systems which accept denotational specifications as input and mechanically calculate them for debugging the semantics. This paper describes an algorithm for inserting injection operations into denotational specifications as part of the typechecking process.