Book

Foundations of logic programming

01 Jan 1984
TL;DR: This is the second edition of an account of the mathematical foundations of logic programming; it collects, in a unified and comprehensive manner, the basic theoretical results of the field, which were previously available only in widely scattered research papers.
Abstract: This is the second edition of an account of the mathematical foundations of logic programming. Its purpose is to collect, in a unified and comprehensive manner, the basic theoretical results of the field, which have previously only been available in widely scattered research papers. In addition to presenting the technical results, the book also contains many illustrative examples and problems. The text is intended to be self-contained, the only prerequisites being some familiarity with PROLOG and knowledge of some basic undergraduate mathematics. The material is suitable either as a reference book for researchers or as a textbook for a graduate course on the theoretical aspects of logic programming and deductive database systems.
Citations
Journal ArticleDOI
TL;DR: By showing that argumentation can be viewed as a special form of logic programming with negation as failure, this paper introduces a general logic-programming-based method for generating meta-interpreters for argumentation systems, a method very much similar to the compiler-compiler idea in conventional programming.

4,386 citations


Cites background from "Foundations of logic programming"

  • ...Then according to theorem 9.6 in [38], there is a definite program P and an (n+1)-ary predicate symbol p_f such that all computed answers for P ∪ {← p_f(s^k1(0), ..., s^kn(0), x)} have the form {x/s^k(0)} and for all...


  • ...A program is said to be hierarchical [38] if there is no p such that p ≥ p. A program is said to be stratified [2] if we never have both p ≡ q and p ≥−1 q. A program is strict [35,54] if there are no p, q such that p ≥+1 q and p ≥−1 q. A program is call-consistent [35,54,12,18] if there is no predicate symbol p such that p ≥−1 p....

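The second snippet above classifies programs through the signed predicate dependency relation (p ≥+1 q for dependence through an even number of negations, p ≥−1 q for an odd number). The checks are mechanical; below is a minimal sketch, assuming a program is given as (head predicate, body) clauses whose body literals carry a sign, +1 for a positive literal and −1 for one under negation as failure. The encoding and function names are illustrative, not taken from the book.

```python
from itertools import product

def signed_dependencies(clauses):
    """Transitive closure of the signed 'refers to' relation: (p, q, s) means
    p depends on q through an even (s = +1) or odd (s = -1) number of negations."""
    deps = {(head, pred, sign) for head, body in clauses for sign, pred in body}
    changed = True
    while changed:
        changed = False
        for (p, q, s1), (q2, r, s2) in product(list(deps), list(deps)):
            if q == q2 and (p, r, s1 * s2) not in deps:
                deps.add((p, r, s1 * s2))
                changed = True
    return deps

def classify(clauses):
    deps = signed_dependencies(clauses)
    preds = {h for h, _ in clauses} | {q for _, body in clauses for _, q in body}
    dep_any = {(p, q) for p, q, _ in deps}          # p >= q with either sign
    return {
        "hierarchical":    all((p, p) not in dep_any for p in preds),
        "stratified":      not any((p, q, -1) in deps and (q, p) in dep_any
                                   for p, q in dep_any),
        "strict":          not any((p, q, +1) in deps and (p, q, -1) in deps
                                   for p in preds for q in preds),
        "call-consistent": all((p, p, -1) not in deps for p in preds),
    }

# win :- move, not win.   (negative dependence of win on itself)
print(classify([("win", [(+1, "move"), (-1, "win")])]))   # all four properties fail
# p :- not q.   q.        (no recursion at all)
print(classify([("p", [(-1, "q")]), ("q", [])]))          # all four properties hold
```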

Journal ArticleDOI
TL;DR: Experiments with a real-world database and knowledge base in a university domain illustrate the promise of this approach to combining first-order logic and probabilistic graphical models in a single representation.
Abstract: We propose a simple approach to combining first-order logic and probabilistic graphical models in a single representation. A Markov logic network (MLN) is a first-order knowledge base with a weight attached to each formula (or clause). Together with a set of constants representing objects in the domain, it specifies a ground Markov network containing one feature for each possible grounding of a first-order formula in the KB, with the corresponding weight. Inference in MLNs is performed by MCMC over the minimal subset of the ground network required for answering the query. Weights are efficiently learned from relational databases by iteratively optimizing a pseudo-likelihood measure. Optionally, additional clauses are learned using inductive logic programming techniques. Experiments with a real-world database and knowledge base in a university domain illustrate the promise of this approach.
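The construction described in the abstract is easy to see at toy scale: ground one weighted first-order clause over a small set of constants and score a possible world by the exponential of the weighted count of satisfied groundings. The predicate names, constants and weight below are invented for illustration, and the normalization over all possible worlds is omitted; this is a sketch of the idea, not the authors' system.

```python
import math
from itertools import product

constants = ["Anna", "Bob"]
weight = 1.5  # weight attached to the single clause below (illustrative value)

# Clause: Smokes(x) ∧ Friends(x, y) → Smokes(y), evaluated on one grounding
def clause(world, x, y):
    return (not (world["Smokes", x] and world["Friends", x, y])) or world["Smokes", y]

def unnormalized_weight(world):
    """exp(weight × number of satisfied groundings); the true probability
    divides this by the sum of the same quantity over all worlds, omitted here."""
    satisfied = sum(clause(world, x, y) for x, y in product(constants, repeat=2))
    return math.exp(weight * satisfied)

# One possible world: a truth assignment to every ground atom
world = {
    ("Smokes", "Anna"): True,   ("Smokes", "Bob"): False,
    ("Friends", "Anna", "Anna"): False, ("Friends", "Anna", "Bob"): True,
    ("Friends", "Bob", "Anna"): True,   ("Friends", "Bob", "Bob"): False,
}
print(unnormalized_weight(world))  # 3 of the 4 groundings are satisfied: exp(4.5)
```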

2,916 citations

Proceedings ArticleDOI
03 Jun 2002
TL;DR: The tutorial is focused on some of the theoretical issues that are relevant for data integration: modeling a data integration application, processing queries in data integration, dealing with inconsistent data sources, and reasoning on queries.
Abstract: Data integration is the problem of combining data residing at different sources, and providing the user with a unified view of these data. The problem of designing data integration systems is important in current real-world applications, and is characterized by a number of issues that are interesting from a theoretical point of view. This document presents an overview of the material to be presented in a tutorial on data integration. The tutorial is focused on some of the theoretical issues that are relevant for data integration. Special attention will be devoted to the following aspects: modeling a data integration application, processing queries in data integration, dealing with inconsistent data sources, and reasoning on queries.

2,716 citations


Cites background from "Foundations of logic programming"

  • ...Such an expansion is performed by viewing each foreign key constraint r1[X] ⊆ r2[Y], where X and Y are sets of h attributes and Y is a key for r2, as a logic programming [77] rule r′2(~...

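The (truncated) snippet above treats a foreign key constraint r1[X] ⊆ r2[Y] as a rule that completes r2 with fresh values for the attributes outside the key Y. A minimal sketch of generating such a rule as text follows; since the quoted rule is cut off, the use of Skolem terms f_j(...) for the existential positions is an assumption made here for illustration, not the paper's exact notation.

```python
def foreign_key_rule(r1, arity1, x_pos, r2, arity2, y_pos):
    """Emit a Datalog-style rule for the foreign key constraint r1[X] ⊆ r2[Y]:
    every r1 tuple forces a matching r2 tuple, with invented values (here
    Skolem terms) for the attributes of r2 outside the key Y."""
    body_vars = [f"X{i}" for i in range(arity1)]
    head_args = [None] * arity2
    for xi, yi in zip(x_pos, y_pos):               # copy the referencing attributes
        head_args[yi] = body_vars[xi]
    for j in range(arity2):                        # invent the remaining values
        if head_args[j] is None:
            head_args[j] = f"f{j}({', '.join(body_vars[xi] for xi in x_pos)})"
    return f"{r2}({', '.join(head_args)}) :- {r1}({', '.join(body_vars)})."

# person(Id, Dept), where Dept references dept(Code, Name) on its key Code:
print(foreign_key_rule("person", 2, [1], "dept", 2, [0]))
# dept(X1, f1(X1)) :- person(X0, X1).
```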

Journal ArticleDOI
TL;DR: It is shown that some facts of commonsense knowledge can be represented by logic programs and disjunctive databases more easily when classical negation is available.
Abstract: An important limitation of traditional logic programming as a knowledge representation tool, in comparison with classical logic, is that logic programming does not allow us to deal directly with incomplete information. In order to overcome this limitation, we extend the class of general logic programs by including classical negation, in addition to negation-as-failure. The semantics of such extended programs is based on the method of stable models. The concept of a disjunctive database can be extended in a similar way. We show that some facts of commonsense knowledge can be represented by logic programs and disjunctive databases more easily when classical negation is available. Computationally, classical negation can be eliminated from extended programs by a simple preprocessor. Extended programs are identical to a special case of default theories in the sense of Reiter.
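The "simple preprocessor" mentioned at the end is usually the renaming trick: every classically negated atom ¬p is replaced by a fresh atom p_neg, turning an extended program into a general one (the mutual exclusion of p and p_neg still has to be enforced separately). A minimal sketch; the clause encoding, with classical sign "+" or "-" on literals and a flag for negation as failure, is an assumption made for illustration.

```python
def rename(literal):
    sign, atom = literal                    # sign "+" or "-" is classical negation
    return atom if sign == "+" else atom + "_neg"

def eliminate_classical_negation(rules):
    """rules: list of (head_literal, body), where body items are (naf, literal)
    and naf is True when the literal appears under negation as failure."""
    compiled = []
    for head, body in rules:
        new_body = [("not " if naf else "") + rename(lit) for naf, lit in body]
        compiled.append((rename(head), new_body))
    return compiled

# -flies(tweety) :- penguin(tweety).
# flies(tweety) :- bird(tweety), not -flies(tweety).
program = [
    (("-", "flies(tweety)"), [(False, ("+", "penguin(tweety)"))]),
    (("+", "flies(tweety)"), [(False, ("+", "bird(tweety)")),
                              (True,  ("-", "flies(tweety)"))]),
]
for head, body in eliminate_classical_negation(program):
    print(f"{head} :- {', '.join(body)}.")
# flies(tweety)_neg :- penguin(tweety).
# flies(tweety) :- bird(tweety), not flies(tweety)_neg.
```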

2,451 citations

Journal ArticleDOI
TL;DR: It is shown that the class of programs possessing a total well-founded model properly includes the previously studied classes of "stratified" and "locally stratified" programs, and the approach is compared with other proposals in the literature.
Abstract: A general logic program (abbreviated to "program" hereafter) is a set of rules that have both positive and negative subgoals. It is common to view a deductive database as a general logic program consisting of rules (IDB) sitting above elementary relations (EDB, facts). It is desirable to associate one Herbrand model with a program and think of that model as the "meaning of the program," or its "declarative semantics." Ideally, queries directed to the program would be answered in accordance with this model. Recent research indicates that some programs do not have a "satisfactory" total model; for such programs, the question of an appropriate partial model arises. Unfounded sets and well-founded partial models are introduced, and the well-founded semantics of a program is defined to be its well-founded partial model. If the well-founded partial model is in fact a total model, it is called the well-founded model. It is shown that the class of programs possessing a total well-founded model properly includes the previously studied classes of "stratified" and "locally stratified" programs. The method in this paper is also compared with other proposals in the literature, including Clark's "program completion," Fitting's and Kunen's 3-valued interpretations of it, and the "stable models" of Gelfond and Lifschitz.
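For ground programs the well-founded model can also be computed with the later alternating-fixpoint characterization rather than the paper's own unfounded-set construction: Γ(S) is the least model of the reduct with respect to S; atoms in the least fixpoint of Γ∘Γ are true, atoms outside its greatest fixpoint are false, and the rest are undefined. A small sketch under that formulation, with rules encoded as (head, positive body, negative body) triples; the encoding is illustrative.

```python
def gamma(rules, s):
    """Least model of the reduct of `rules` w.r.t. the atom set `s`."""
    reduct = [(head, pos) for head, pos, neg in rules if not (set(neg) & s)]
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos in reduct:
            if set(pos) <= model and head not in model:
                model.add(head)
                changed = True
    return model

def well_founded(rules):
    atoms = {h for h, _, _ in rules} | {a for _, p, n in rules for a in p + n}
    true, over = set(), set(atoms)              # iterate Γ∘Γ from below and from above
    while True:
        new_true = gamma(rules, gamma(rules, true))
        new_over = gamma(rules, gamma(rules, over))
        if (new_true, new_over) == (true, over):
            break
        true, over = new_true, new_over
    false = atoms - over
    return true, false, atoms - true - false    # well-founded true, false, undefined

# p :- not q.   q :- not p.   r :- not r.   s.
rules = [("p", [], ["q"]), ("q", [], ["p"]), ("r", [], ["r"]), ("s", [], [])]
print(well_founded(rules))   # s is true, nothing is false, p, q and r stay undefined
```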

1,908 citations

References
Journal ArticleDOI
TL;DR: This paper proposes a logic for default reasoning, develops a complete proof theory and shows how to interface it with a top down resolution theorem prover, and provides criteria under which the revision of derived beliefs must be effected.

4,146 citations

Journal ArticleDOI
TL;DR: In the present paper, a uniform proof procedure for quantification theory is given which is feasible for use with some rather complicated formulas and which does not ordinarily lead to exponentiation.
Abstract: The hope that mathematical methods employed in the investigation of formal logic would lead to purely computational methods for obtaining mathematical theorems goes back to Leibniz and has been revived by Peano around the turn of the century and by Hilbert's school in the 1920's. Hilbert, noting that all of classical mathematics could be formalized within quantification theory, declared that the problem of finding an algorithm for determining whether or not a given formula of quantification theory is valid was the central problem of mathematical logic. And indeed, at one time it seemed as if investigations of this "decision" problem were on the verge of success. However, it was shown by Church and by Turing that such an algorithm cannot exist. This result led to considerable pessimism regarding the possibility of using modern digital computers in deciding significant mathematical questions. However, recently there has been a revival of interest in the whole question. Specifically, it has been realized that while no decision procedure exists for quantification theory there are many proof procedures available—that is, uniform procedures which will ultimately locate a proof for any formula of quantification theory which is valid but which will usually involve seeking "forever" in the case of a formula which is not valid—and that some of these proof procedures could well turn out to be feasible for use with modern computing machinery. Hao Wang [9] and P. C. Gilmore [3] have each produced working programs which employ proof procedures in quantification theory. Gilmore's program employs a form of a basic theorem of mathematical logic due to Herbrand, and Wang's makes use of a formulation of quantification theory related to those studied by Gentzen. However, both programs encounter decisive difficulties with any but the simplest formulas of quantification theory, in connection with methods of doing propositional calculus. Wang's program, because of its use of Gentzen-like methods, involves exponentiation on the total number of truth-functional connectives, whereas Gilmore's program, using normal forms, involves exponentiation on the number of clauses present. Both methods are superior in many cases to truth table methods which involve exponentiation on the total number of variables present, and represent important initial contributions, but both run into difficulty with some fairly simple examples. In the present paper, a uniform proof procedure for quantification theory is given which is feasible for use with some rather complicated formulas and which does not ordinarily lead to exponentiation. The superiority of the present procedure over those previously available is indicated in part by the fact that a formula on which Gilmore's routine for the IBM 704 causes the machine to compute for 21 minutes without obtaining a result was worked successfully by hand computation using the present method in 30 minutes. Cf. §6, below. It should be mentioned that, before it can be hoped to employ proof procedures for quantification theory in obtaining proofs of theorems belonging to "genuine" mathematics, finite axiomatizations, which are "short," must be obtained for various branches of mathematics. This last question will not be pursued further here; cf., however, Davis and Putnam [2], where one solution to this problem is given for ele
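The procedure reduces validity of a quantificational formula to unsatisfiability tests over finite sets of ground clauses; the gain over Gilmore's and Wang's programs lies in the propositional part. The sketch below is the DPLL refinement that grew out of this procedure (unit rule, affirmative-negative rule, splitting), not the paper's elimination rule, and it leaves out the generation of ground instances over the Herbrand universe. Clauses are sets of nonzero integers, a negative integer standing for a negated variable.

```python
def dpll(clauses):
    clauses = [set(c) for c in clauses]
    if not clauses:
        return True                               # no clauses left: satisfiable
    if any(not c for c in clauses):
        return False                              # empty clause: unsatisfiable
    # One-literal (unit) rule
    units = {next(iter(c)) for c in clauses if len(c) == 1}
    if units:
        if any(-u in units for u in units):
            return False
        return dpll([c - {-u for u in units} for c in clauses if not (c & units)])
    # Affirmative-negative (pure literal) rule
    literals = {lit for c in clauses for lit in c}
    pure = {lit for lit in literals if -lit not in literals}
    if pure:
        return dpll([c for c in clauses if not (c & pure)])
    # Splitting rule on some literal of the first clause
    lit = next(iter(clauses[0]))
    return dpll(clauses + [{lit}]) or dpll(clauses + [{-lit}])

# (p ∨ q) ∧ (¬p ∨ q) ∧ (¬q ∨ r) ∧ ¬r   is unsatisfiable:
print(dpll([{1, 2}, {-1, 2}, {-2, 3}, {-3}]))     # False
```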

2,743 citations


"Foundations of logic programming" refers background in this paper

  • ...Building on work of Herbrand [44] in 1930, there was much activity in theorem proving in the early 1960's by Prawitz [84], Gilmore [39], Davis, Putnam [26] and others....


Book
01 Jan 1972
TL;DR: A textbook introduction to mathematical logic covering sets, sentential logic, first-order logic, undecidability, and second-order logic.
Abstract: USEFUL FACTS ABOUT SETS. SENTENTIAL LOGIC. FIRST-ORDER LOGIC. UNDECIDABILITY. SECOND-ORDER LOGIC.

2,216 citations


"Foundations of logic programming" refers background or methods in this paper

  • ...We will also require the usual type theory [33]....


  • ...The intuitive idea of a typed theory (also called a many-sorted theory [33]) is that there are several sorts of variables, each ranging over a different domain....


  • ...The other fact that we will need about typed logics is that there is a transformation of typed formulas into (type-free) formulas, which shows that the apparent extra generality provided by typed logics is illusory [33]....


  • ...For this, we use a standard transformation [33]....


  • ...We suggest reading the first few chapters of [14], [33], [64], [69] or [99]....

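The quoted passages rely on the standard transformation that reduces a typed (many-sorted) formula to an ordinary one by relativizing each quantifier to a sort predicate: ∀x:τ φ becomes ∀x (τ(x) → φ) and ∃x:τ φ becomes ∃x (τ(x) ∧ φ). A minimal sketch over a tuple-based formula representation; the representation itself is an assumption made here for illustration.

```python
def relativize(formula):
    """Formulas: ('forall', var, sort, body), ('exists', var, sort, body),
    ('and'/'or'/'implies', f1, f2), ('not', f), or an atom given as a string."""
    if isinstance(formula, str):
        return formula
    tag = formula[0]
    if tag == "forall":
        _, var, sort, body = formula
        return ("forall", var, ("implies", f"{sort}({var})", relativize(body)))
    if tag == "exists":
        _, var, sort, body = formula
        return ("exists", var, ("and", f"{sort}({var})", relativize(body)))
    if tag == "not":
        return ("not", relativize(formula[1]))
    return (tag, relativize(formula[1]), relativize(formula[2]))

# ∀x:nat ∃y:nat. less(x, y)
sorted_formula = ("forall", "x", "nat", ("exists", "y", "nat", "less(x, y)"))
print(relativize(sorted_formula))
# ('forall', 'x', ('implies', 'nat(x)', ('exists', 'y', ('and', 'nat(y)', 'less(x, y)'))))
```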

Journal ArticleDOI
TL;DR: In this paper the operational and fixpoint semantics of predicate logic programs are defined and the connections with the proof theory and model theory of logic are investigated; it is concluded that operational semantics is a part of proof theory and that fixpoint semantics is a special case of model-theoretic semantics.
Abstract: Sentences in first-order predicate logic can be usefully interpreted as programs. In this paper the operational and fixpoint semantics of predicate logic programs are defined, and the connections with the proof theory and model theory of logic are investigated. It is concluded that operational semantics is a part of proof theory and that fixpoint semantics is a special case of model-theoretic semantics.
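For a (ground) definite program the fixpoint semantics is concrete: the least Herbrand model is the least fixpoint of the immediate-consequence operator T_P, reached by iterating T_P from the empty interpretation. A minimal sketch; the clause encoding and the example program are invented for illustration.

```python
def t_p(clauses, interpretation):
    """One application of the immediate-consequence operator T_P."""
    return {head for head, body in clauses if set(body) <= interpretation}

def least_herbrand_model(clauses):
    model = set()
    while True:
        nxt = t_p(clauses, model)
        if nxt <= model:                   # fixpoint reached
            return model
        model |= nxt

# Ground instances of edge facts plus path rules over constants a, b, c:
clauses = [
    ("edge(a,b)", []), ("edge(b,c)", []),
    ("path(a,b)", ["edge(a,b)"]), ("path(b,c)", ["edge(b,c)"]),
    ("path(a,c)", ["edge(a,b)", "path(b,c)"]),
]
print(least_herbrand_model(clauses))       # all five atoms above are derived
```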

1,636 citations

Book
11 Jun 1973
TL;DR: This book contains an introduction to symbolic logic and a thorough discussion of mechanical theorem proving and its applications, showing how theorem proving can be applied to areas such as question answering, problem solving, program analysis, and program synthesis.
Abstract: This book contains an introduction to symbolic logic and a thorough discussion of mechanical theorem proving and its applications. The book consists of three major parts. Chapters 2 and 3 constitute an introduction to symbolic logic. Chapters 4-9 introduce several techniques in mechanical theorem proving, and Chapters 10 and 11 show how theorem proving can be applied to various areas such as question answering, problem solving, program analysis, and program synthesis.

1,611 citations


"Foundations of logic programming" refers background in this paper

  • ...We suggest reading the first few chapters of [14], [33], [64], [69] or [99]....


  • ...We suggest consulting [9], [14], [64] or [66]....
