
Showing papers in "ACM Transactions on Computational Logic in 2006"


Journal ArticleDOI
TL;DR: The experimental results confirm the solidity of DLV and highlight its potential for emerging application areas like knowledge management and information integration, and the main international projects investigating the potential of the system for industrial exploitation are described.
Abstract: Disjunctive Logic Programming (DLP) is an advanced formalism for knowledge representation and reasoning, which is very expressive in a precise mathematical sense: it allows one to express every property of finite structures that is decidable in the complexity class Σ^P_2 (NP^NP). Thus, under widely believed assumptions, DLP is strictly more expressive than normal (disjunction-free) logic programming, whose expressiveness is limited to properties decidable in NP. Importantly, apart from enlarging the class of applications which can be encoded in the language, disjunction often allows for representing problems of lower complexity in a simpler and more natural fashion. This article presents the DLV system, which is widely considered the state-of-the-art implementation of disjunctive logic programming, and addresses several aspects. As for problem solving, we provide a formal definition of its kernel language, function-free disjunctive logic programs (also known as disjunctive datalog), extended by weak constraints, which are a powerful tool to express optimization problems. We then illustrate the usage of DLV as a tool for knowledge representation and reasoning, describing a new declarative programming methodology which allows one to encode complex problems (up to Δ^P_3-complete problems) in a declarative fashion. On the foundational side, we provide a detailed analysis of the computational complexity of the language of DLV, and by deriving new complexity results we chart a complete picture of the complexity of this language and important fragments thereof. Furthermore, we illustrate the general architecture of the DLV system, which has been influenced by these results. As for applications, we overview application front-ends which have been developed on top of DLV to solve specific knowledge representation tasks, and we briefly describe the main international projects investigating the potential of the system for industrial exploitation.
Finally, we report on the thorough experimentation and benchmarking that has been carried out to assess the efficiency of the system. The experimental results confirm the solidity of DLV and highlight its potential for emerging application areas like knowledge management and information integration.
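For intuition about the kind of NP guess that a disjunctive rule expresses, the sketch below brute-forces graph 3-colorability in Python, mimicking the classic disjunctive encoding col(X,r) | col(X,g) | col(X,b) with an integrity constraint on edges. This is an illustrative simulation only; the function name and the example graph are hypothetical and not part of DLV.

```python
from itertools import product

def three_colorings(nodes, edges):
    """Brute-force analogue of the disjunctive guess
        col(X,r) | col(X,g) | col(X,b) :- node(X).
    with the integrity constraint
        :- edge(X,Y), col(X,C), col(Y,C)."""
    for guess in product("rgb", repeat=len(nodes)):
        col = dict(zip(nodes, guess))
        if all(col[u] != col[v] for u, v in edges):   # the "check" part
            yield col

# the triangle has exactly 3! = 6 proper 3-colorings
triangle_nodes = ["a", "b", "c"]
triangle_edges = [("a", "b"), ("b", "c"), ("a", "c")]
```

A disjunctive solver performs this guess-and-check declaratively; the point of the expressiveness results above is that disjunction lets such guesses be combined with a second, co-NP check.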

1,306 citations


Journal ArticleDOI
TL;DR: The basic calculus is developed, the most interesting models are presented, and applicability is demonstrated by two examples: algebraic reconstructions of Noethericity and propositional Hoare logic based on equational reasoning.
Abstract: We propose Kleene algebra with domain (KAD), an extension of Kleene algebra by simple equational axioms for a domain and a codomain operation. KAD considerably augments the expressiveness of Kleene algebra, in particular for the specification and analysis of programs and state transition systems. We develop the basic calculus, present the most interesting models and discuss some related theories. We demonstrate applicability by two examples: algebraic reconstructions of Noethericity and propositional Hoare logic based on equational reasoning.
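As a minimal illustration (not from the article), the domain operation can be computed in the concrete model of binary relations, where d(R) is the sub-identity on the points at which R is defined; a characteristic KAD identity, d(x);x = x, can then be checked directly. Function names are chosen only for this sketch.

```python
def compose(r, s):
    """Relational composition r ; s."""
    return {(x, z) for (x, y1) in r for (y2, z) in s if y1 == y2}

def domain(r):
    """d(r): the sub-identity relation on the points where r is defined."""
    return {(x, x) for (x, _) in r}

# One characteristic KAD identity, d(x);x = x, holds in this model:
R = {(1, 2), (2, 3)}
assert compose(domain(R), R) == R
# and d is idempotent: d(d(x)) = d(x)
assert domain(domain(R)) == domain(R)
```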

170 citations


Journal ArticleDOI
TL;DR: In this paper, an implementation methodology for partial and disjunctive stable models is presented, in which partiality and disjunctions are unfolded from a logic program so that an implementation of stable models for normal (disjunction-free) programs can be used as the core inference engine.
Abstract: This article studies an implementation methodology for partial and disjunctive stable models where partiality and disjunctions are unfolded from a logic program so that an implementation of stable models for normal (disjunction-free) programs can be used as the core inference engine. The unfolding is done in two separate steps. First, it is shown that partial stable models can be captured by total stable models using a simple linear and modular program transformation. Hence, reasoning tasks concerning partial stable models can be solved using an implementation of total stable models. Disjunctive partial stable models have so far lacked implementations; these now become available, as the translation also handles the disjunctive case. Second, it is shown how total stable models of disjunctive programs can be determined by computing stable models for normal programs. Thus, an implementation of stable models of normal programs can be used as a core engine for implementing disjunctive programs. The feasibility of the approach is demonstrated by constructing a system for computing stable models of disjunctive programs using the SMODELS system as the core engine. The performance of the resulting system is compared to that of DLV, which is a state-of-the-art system for disjunctive programs.
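The flavor of unfolding disjunctions can be suggested by the well-known "shifting" of disjunctive heads into negation-as-failure, sketched below on a toy rule representation. Note that shifting alone is not equivalent to the disjunctive semantics in general; the article's translation additionally enforces minimality. The data representation here is hypothetical.

```python
def shift(rule):
    """Shift a disjunctive rule (head_atoms, body_atoms) into normal rules.
    Each head atom is derived when every *other* head atom is assumed false;
    rules are returned as (head, positive_body, negated_body)."""
    head, body = rule
    return [(a, list(body), [b for b in head if b != a]) for a in head]

# a | b :- c.   becomes   a :- c, not b.   and   b :- c, not a.
rules = shift((["a", "b"], ["c"]))
```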

140 citations


Journal ArticleDOI
TL;DR: This article shows that any equivalent translation from logic programs to propositional formulas involves a significant increase in size, and assuming P ⊈ NC1/poly, that this is inevitable.
Abstract: A theorem by Lin and Zhao shows how to turn any nondisjunctive logic program, understood in accordance with the answer set semantics, into an equivalent set of propositional formulas. The set of formulas generated by this process can be significantly larger than the original program. In this article we show (assuming P ⊈ NC1/poly, a conjecture from the theory of computational complexity that is widely believed to be true) that this is inevitable: any equivalent translation from logic programs to propositional formulas involves a significant increase in size.

138 citations


Journal ArticleDOI
TL;DR: This article gives an algorithm that terminates with a solution for all but very special, pathological inputs, and ensures the practical efficiency of this algorithm by employing constraint programming techniques.
Abstract: Let a quantified inequality constraint over the reals be a formula in the first-order predicate language over the structure of the real numbers, where the allowed predicate symbols are ≤ and <.

87 citations


Journal ArticleDOI
TL;DR: In this article, the authors show how the classical concurrent constraint (cc) programming framework can work with soft constraints, and also propose an extension of cc languages which can use soft constraints to prune and direct the search for a solution.
Abstract: Soft constraints extend classical constraints to represent multiple consistency levels, and thus provide a way to express preferences, fuzziness, and uncertainty. While there are many soft constraint solving formalisms, even distributed ones, as yet there seems to be no concurrent programming framework where soft constraints can be handled. In this article we show how the classical concurrent constraint (cc) programming framework can work with soft constraints, and we also propose an extension of cc languages which can use soft constraints to prune and direct the search for a solution. We believe that this new programming paradigm, called soft cc (scc), can also be very useful in many Web-related scenarios. In fact, the language level allows Web agents to express their interaction and negotiation protocols, and also to post their requests in terms of preferences, and the underlying soft constraint solver can find an agreement among the agents even if their requests are incompatible.
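To make the "multiple consistency levels" concrete, the sketch below models soft constraint combination in the standard c-semiring style (fuzzy: levels in [0,1] combined with min and compared with max; weighted: costs combined with + and preferred by min). The names and representation are hypothetical, chosen only for illustration.

```python
class Semiring:
    """c-semiring skeleton: `plus` compares preference levels, `times`
    combines them; `zero` is the worst level, `one` the best."""
    def __init__(self, plus, times, zero, one):
        self.plus, self.times, self.zero, self.one = plus, times, zero, one

# Fuzzy constraints: levels in [0, 1], combined with min, compared with max
fuzzy = Semiring(max, min, 0.0, 1.0)
# Weighted constraints: costs combined with +, preferred by min
weighted = Semiring(min, lambda a, b: a + b, float("inf"), 0.0)

def best_level(semiring, constraints, assignments):
    """Best combined level over the candidate assignments; each constraint
    maps an assignment to a preference level."""
    best = semiring.zero
    for a in assignments:
        level = semiring.one
        for c in constraints:
            level = semiring.times(level, c(a))
        best = semiring.plus(best, level)
    return best
```

Instantiating the same combinator with different semirings is what lets one framework cover preferences, fuzziness, and costs uniformly.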

67 citations


Journal ArticleDOI
TL;DR: This article shows how temporal, procedural and HTN-based control knowledge can be incorporated into AnsProlog* by the modular addition of a small number of domain-dependent rules, without the need to modify the planner.
Abstract: In this article we consider three different kinds of domain-dependent control knowledge (temporal, procedural and HTN-based) that are useful in planning. Our approach is declarative and relies on the language of logic programming with answer set semantics (AnsProlog*). AnsProlog* is designed to plan without control knowledge. We show how temporal, procedural and HTN-based control knowledge can be incorporated into AnsProlog* by the modular addition of a small number of domain-dependent rules, without the need to modify the planner. We formally prove the correctness of our planner, both in the absence and presence of the control knowledge. Finally, we perform some initial experimentation that demonstrates the potential reduction in planning time that can be achieved when procedural domain knowledge is used to solve planning problems with large plan length.

67 citations


Journal ArticleDOI
TL;DR: The λΠΣS≤ calculus models type constructors and kinds in the intermediate language used by the TILT compiler for Standard ML to implement the SML module system and it is shown that type checking, subtyping, and all other judgments of the system are decidable.
Abstract: We study the λΠΣS≤ calculus, which contains singleton types S(M) classifying terms of base type provably equivalent to the term M. The system includes dependent types for pairs and functions (Σ and Π) and a subtyping relation induced by regarding singletons as subtypes of the base type. The decidability of type checking for this language is non-obvious, since to type check we must be able to determine equivalence of well-formed terms. But in the presence of singleton types, the provability of an equivalence judgment Γ ⊢ M1 ≡ M2 : A can depend both on the typing context Γ and on the particular type A at which M1 and M2 are compared. We show how to prove decidability of term equivalence, hence of type checking, in λΠΣS≤ by exhibiting a type-directed algorithm for directly computing normal forms. The correctness of normalization is shown using an unusual variant of Kripke logical relations organized around sets; rather than defining a logical equivalence relation, we work directly with (subsets of) the corresponding equivalence classes. We then provide a more efficient algorithm for checking type equivalence without constructing normal forms. We also show that type checking, subtyping, and all other judgments of the system are decidable. The λΠΣS≤ calculus models type constructors and kinds in the intermediate language used by the TILT compiler for Standard ML to implement the SML module system. The decidability of λΠΣS≤ term equivalence allows us to show decidability of type checking for TILT's intermediate language. We also obtain a consistency result that allows us to prove type safety for the intermediate language. The algorithms derived here form the core of the type checker used for internal type checking in TILT.

54 citations


Journal ArticleDOI
TL;DR: It is shown how several features of monodic FOTL can be established as corollaries of the completeness result for the clausal temporal resolution method, including definitions of new decidable monodic classes, simplification of existing monodic classes by reductions, and completeness of clausal temporal resolution in the case of monodic logics with expanding domains.
Abstract: Until recently, First-Order Temporal Logic (FOTL) has been only partially understood. While it is well known that the full logic has no finite axiomatisation, a more detailed analysis of fragments of the logic was not previously available. However, a breakthrough by Hodkinson et al., identifying a finitely axiomatisable fragment, termed the monodic fragment, has led to improved understanding of FOTL. Yet, in order to utilise these theoretical advances, it is important to have appropriate proof techniques for this monodic fragment. In this paper, we modify and extend the clausal temporal resolution technique, originally developed for propositional temporal logics, to enable its use in such monodic fragments. We develop a specific normal form for monodic formulae in FOTL, and provide a complete resolution calculus for formulae in this form. Not only is this clausal resolution technique useful as a practical proof technique for certain monodic classes, but the use of this approach provides us with increased understanding of the monodic fragment. In particular, we here show how several features of monodic FOTL can be established as corollaries of the completeness result for the clausal temporal resolution method. These include definitions of new decidable monodic classes, simplification of existing monodic classes by reductions, and completeness of clausal temporal resolution in the case of monodic logics with expanding domains, a case with much significance in both theory and practice.

53 citations


Journal ArticleDOI
TL;DR: In this paper, a general algebraic splitting theory for logics with a fixpoint semantics is presented. Together with the framework of approximation theory, a general fixpoint theory for arbitrary operators, this gives a uniform and powerful way of deriving splitting results for each logic with a fixpoint semantics.
Abstract: It is well known that, under certain conditions, it is possible to split logic programs under stable model semantics, that is, to divide such a program into a number of different “levels”, such that the models of the entire program can be constructed by incrementally constructing models for each level. Similar results exist for other nonmonotonic formalisms, such as auto-epistemic logic and default logic. In this work, we present a general, algebraic splitting theory for logics with a fixpoint semantics. Together with the framework of approximation theory, a general fixpoint theory for arbitrary operators, this gives us a uniform and powerful way of deriving splitting results for each logic with a fixpoint semantics. We demonstrate the usefulness of these results, by generalizing existing results for logic programming, auto-epistemic logic and default logic.

48 citations


Journal ArticleDOI
TL;DR: This article gives a detailed exposition of a soundness and completeness proof for the rather new type of a deductive propositional system CL1, the logical vocabulary of which contains operators for the so called parallel and choice operations, and the atoms of which represent elementary problems, that is, predicates in the standard sense.
Abstract: In the same sense as classical logic is a formal theory of truth, the recently initiated approach called computability logic is a formal theory of computability. It understands (interactive) computational problems as games played by a machine against the environment, their computability as existence of a machine that always wins the game, logical operators as operations on computational problems, and validity of a logical formula as being a scheme of “always computable” problems. Computability logic has been introduced semantically, and now among its main technical goals is to axiomatize the set of valid formulas or various natural fragments of that set. The present contribution signifies a first step towards this goal. It gives a detailed exposition of a soundness and completeness proof for the rather new type of a deductive propositional system CL1, the logical vocabulary of which contains operators for the so called parallel and choice operations, and the atoms of which represent elementary problems, that is, predicates in the standard sense. This article is self-contained as it explains all relevant concepts. While not technically necessary, familiarity with the foundational paper “Introduction to Computability Logic” [Annals of Pure and Applied Logic 123 (2003), pp. 1-99] would greatly help the reader in understanding the philosophy, underlying motivations, potential and utility of computability logic, the context that determines the value of the present results.

Journal ArticleDOI
TL;DR: This article proposes two logics based on predicate calculus as formalisms for encoding search problems and shows that the expressive power of these logics is given by the class NPMV.
Abstract: The answer-set programming (ASP) paradigm is a way of using logic to solve search problems. Given a search problem, to solve it one designs a logic theory so that models of this theory represent problem solutions. To compute a solution to the problem, one computes a model of the theory. Several answer-set programming formalisms have been developed on the basis of logic programming with the semantics of answer sets. In this article we show that predicate logic also gives rise to effective implementations of the ASP paradigm, similar in spirit to logic programming with the answer-set semantics and with a similar scope of applicability. Specifically, we propose two logics based on predicate calculus as formalisms for encoding search problems. We show that the expressive power of these logics is given by the class NPMV. We demonstrate their use in programming and discuss computational approaches to model finding. To address this latter issue, we follow a two-pronged approach. On the one hand, we show that the problem can be reduced to that of computing models of propositional theories and, more generally, of collections of pseudo-Boolean constraints. Consequently, programs (solvers) developed in the areas of propositional and pseudo-Boolean satisfiability can be used to compute models of theories in our logics. On the other hand, we develop native solvers designed specifically to exploit features of our formalisms. We present experimental results demonstrating the computational effectiveness of the overall approach.

Journal ArticleDOI
TL;DR: This article addresses the problems of contradictory information elimination, conflict resolution, and syntactic representation in logic program-based updates in a systematic manner and provides nontrivial solutions to simplify various update evaluation procedures under certain conditions.
Abstract: In logic program-based updates, contradictory information elimination, conflict resolution, and syntactic representation are three major issues that interfere with each other and significantly influence the update result. We observe that existing approaches to logic program-based updates are, in one way or another, problematic in dealing with these issues. In this article, we address all these problems in a systematic manner. Our approach to the logic program-based update has the following features: (1) a prioritized logic programming language is employed for providing a formal basis of formalizing logic program-based updates, so that information conflict and its related problems in updates can be handled properly; (2) our approach presents both semantic characterization and syntactic representation for the underlying update procedure, and hence is consistent with the nature of updates within the logic program extent: declarative semantics and syntactic sensitivity; and (3) our approach also provides nontrivial solutions to simplify various update evaluation procedures under certain conditions.

Journal ArticleDOI
TL;DR: Here it is proved (sometimes subject to an assumption) that certain theories weaker than S^1_2 do not prove either BB(Σ^b_1) or BB(Σ^b_0), assuming that integer factoring is not possible in probabilistic polynomial time.
Abstract: The replacement (or collection or choice) axiom scheme BB(Γ) asserts bounded quantifier exchange as follows: ∀i

Journal ArticleDOI
TL;DR: In this article, simple techniques for defining and reasoning about quotient constructions are presented, based on a general lemma library concerning functions that operate on equivalence classes; the techniques are applied to a definition of the integers from the natural numbers, and then to the definition of a recursive datatype satisfying equational constraints.
Abstract: A quotient construction defines an abstract type from a concrete type, using an equivalence relation to identify elements of the concrete type that are to be regarded as indistinguishable. The elements of a quotient type are equivalence classes: sets of equivalent concrete values. Simple techniques are presented for defining and reasoning about quotient constructions, based on a general lemma library concerning functions that operate on equivalence classes. The techniques are applied to a definition of the integers from the natural numbers, and then to the definition of a recursive datatype satisfying equational constraints.
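The integers-from-naturals construction mentioned above can be sketched directly: a pair (a, b) of naturals stands for a - b, with (a, b) ~ (c, d) iff a + d = b + c, and operations defined via canonical representatives. This Python version (with hypothetical names) only illustrates the idea; the article works with the equivalence classes themselves inside a proof assistant.

```python
def equiv(p, q):
    """(a, b) ~ (c, d) iff a + d == b + c; both pairs denote a - b."""
    (a, b), (c, d) = p, q
    return a + d == b + c

def canon(p):
    """Canonical representative of the equivalence class of p."""
    a, b = p
    return (a - b, 0) if a >= b else (0, b - a)

def add(p, q):
    """Addition lifted to classes; well defined because it respects ~."""
    (a, b), (c, d) = p, q
    return canon((a + c, b + d))

# 2 + (-3) = -1: the class of (2, 0) plus the class of (0, 3) is (0, 1)
assert add((2, 0), (0, 3)) == (0, 1)
```

The key proof obligation in the formal development is exactly the comment on `add`: the operation must map equivalent arguments to equivalent results.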

Journal ArticleDOI
TL;DR: In this article, the authors define the concept of a heterogeneous temporal probabilistic (HTP) agent, which can be built on top of existing databases, data structures, and software code bases without explicitly accessing the internal code of those systems.
Abstract: To date, there has been no work on temporal probabilistic agent reasoning on top of heterogeneous legacy databases and software modules. We will define the concept of a heterogeneous temporal probabilistic (HTP) agent. Such agents can be built on top of existing databases, data structures, and software code bases without explicitly accessing the internal code of those systems and can take actions compatible with a policy or operating principles specified by an agent developer. We will develop a formal semantics for such agents through the notion of a feasible temporal probabilistic status interpretation (FTPSI for short). Intuitively, an FTPSI specifies everything an HTP agent is permitted, forbidden, or obliged to do at various times t. As changes occur in the environment, the HTP agent must compute a new FTPSI. HTP agents continuously compute FTPSIs in order to determine what they should do and, hence, the problem of computing FTPSIs is very important. We give a sound and complete algorithm to compute FTPSIs for a very large class of HTP agents called strict HTP agents. In a given state, many FTPSIs may exist. These represent alternative courses of action that the HTP agent can take. We provide a notion of an optimal FTPSI that selects an FTPSI optimizing an objective function and give a sound and complete algorithm to compute an optimal FTPSI.

Journal ArticleDOI
TL;DR: A type inference algorithm for lambda terms in elementary affine logic (EAL) is proposed that decorates the syntax tree of a simple typed lambda term and collects a set of linear constraints, resulting in a parametric elementary type that can be instantiated with any solution of the set of collected constraints.
Abstract: We propose a type inference algorithm for lambda terms in elementary affine logic (EAL). The algorithm decorates the syntax tree of a simple typed lambda term and collects a set of linear constraints. The result is a parametric elementary type that can be instantiated with any solution of the set of collected constraints. We point out that the typeability of lambda terms in EAL has a practical counterpart, since any EAL-typeable lambda term can be reduced with Lamping's abstract algorithm, obtaining a substantial performance improvement. We show how to apply the same techniques to obtain decorations of intuitionistic proofs into linear logic proofs.

Journal ArticleDOI
TL;DR: This article shows how to obtain linear-time algorithms for EssNet, and shows further that it is possible to optimize the verification so that each node of the input structure is visited at most once.
Abstract: We consider the following decision problems: ProofNet: Is a given multiplicative linear logic (MLL) proof structure a proof net? EssNet: Is a given essential net (of an intuitionistic MLL sequent) correct? In this article we show how to obtain linear-time algorithms for EssNet. As a corollary, by showing that ProofNet is linear-time reducible to EssNet (by the Trip Translation), we obtain a linear-time algorithm for ProofNet. We show further that it is possible to optimize the verification so that each node of the input structure is visited at most once. Finally, we present linear-time algorithms for sequentializing proof nets and essential nets, that is, for finding derivations of the underlying sequents.

Journal ArticleDOI
TL;DR: This article applies mathematical logic to obtain a rigorous foundation for previous inherently nonrigorous results and also extends those previous results to find the main theorem, which avoids infallibility assumptions on both the agent and the software.
Abstract: This article applies mathematical logic to obtain a rigorous foundation for previous inherently nonrigorous results and also extends those previous results. Roughly speaking, our main theorem states: any agent A that comprehends the correctness-related properties of software S also comprehends an intelligence-related limitation of S. The theorem treats the output of S, if any, as an attempt at solving a halting problem. Previous nonrigorous attempts to obtain similar theorems depend on infallibility assumptions on both the agent and the software. The hypothesis that intelligent agents and intelligent software must be infallible has been widely questioned. In addition, recent work by others has determined that well-known previous attempts use a fallacious form of reasoning; that is, the same form of reasoning can yield paradoxical results. Our main theorem avoids infallibility assumptions on both the agent and the software. In addition, our proof is rigorous, in the sense that in principle one can carry it out in Zermelo-Fraenkel set theory. The software correctness framework considered in the main theorem is that of Hoare logic.

Journal ArticleDOI
TL;DR: This work establishes the first size separation between Nullstellensatz and polynomial calculus refutations, and obtains new upper bounds on refutation sizes for certain CNFs in constant-depth Frege systems with counting axioms.
Abstract: We show that constant-depth Frege systems with counting axioms modulo m polynomially simulate Nullstellensatz refutations modulo m. Central to this is a new definition of reducibility from propositional formulas to systems of polynomials. Using our definition of reducibility, most previously studied propositional formulas reduce to their polynomial translations. When combined with a previous result of the authors, this establishes the first size separation between Nullstellensatz and polynomial calculus refutations. We also obtain new upper bounds on refutation sizes for certain CNFs in constant-depth Frege systems with counting axioms.

Journal ArticleDOI
TL;DR: It is proved that choosing the optimal literal to branch on in DPLL is Δ^p_2[log n]-hard, and becomes NP^PP-hard if branching is only allowed on a subset of variables.
Abstract: DPLL and resolution are two popular methods for solving the problem of propositional satisfiability. Rather than algorithms, they are families of algorithms, as their behavior depends on some choices they face during execution: DPLL depends on the choice of the literal to branch on; resolution depends on the choice of the pair of clauses to resolve at each step. The complexity of making the optimal choice is analyzed in this article. Extending previous results, we prove that choosing the optimal literal to branch on in DPLL is Δ^p_2[log n]-hard, and becomes NP^PP-hard if branching is only allowed on a subset of variables. Optimal choice in regular resolution is both NP-hard and coNP-hard. The problem of determining the size of the optimal proofs is also analyzed: it is coNP-hard for DPLL, and Δ^p_2[log n]-hard if a conjecture we make is true. This problem is coNP-hard for regular resolution.
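To see concretely why DPLL is a family of algorithms, the sketch below parametrizes a toy DPLL solver by the branching-choice function; `first_literal` is one deliberately naive instance of the choice whose optimal version the article proves hard. The encoding (DIMACS-style signed integers for literals) and all names are hypothetical.

```python
def simplify(clauses, assignment):
    """Drop satisfied clauses and falsified literals under `assignment`
    (a dict mapping variable -> bool; literals are signed integers)."""
    out = []
    for clause in clauses:
        rest, satisfied = [], False
        for lit in clause:
            var = abs(lit)
            if var in assignment:
                if assignment[var] == (lit > 0):
                    satisfied = True
                    break
            else:
                rest.append(lit)
        if not satisfied:
            out.append(rest)
    return out

def dpll(clauses, choose, assignment=None):
    """DPLL parametrized by the branching-choice function `choose`."""
    assignment = dict(assignment or {})
    cs = simplify(clauses, assignment)
    if not cs:
        return assignment           # all clauses satisfied
    if any(not c for c in cs):
        return None                 # empty clause: conflict, backtrack
    lit = choose(cs)                # the choice analyzed in the article
    for val in (lit > 0, lit < 0):  # try making the chosen literal true first
        res = dpll(clauses, choose, {**assignment, abs(lit): val})
        if res is not None:
            return res
    return None

first_literal = lambda cs: cs[0][0]   # a deliberately naive heuristic
```

Swapping `first_literal` for a smarter heuristic changes the search tree but not the interface, which is exactly the degree of freedom whose optimal exploitation is shown to be Δ^p_2[log n]-hard.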

Journal ArticleDOI
TL;DR: In this article, the authors show the decidability of this problem relativized to ∃*∀-sentences, and develop a goal-driven unification algorithm to solve it.
Abstract: Formal set theory is traditionally concerned with pure sets; consequently, the satisfiability problem for fragments of set theory was most often addressed (and in many cases positively solved) in the pure framework. In practical applications, however, it is common to assume the existence of a number of primitive objects (sometimes called atoms) that can be members of sets but behave differently from them. If these entities are assumed to be devoid of members, the standard extensionality axiom must be revised; then decidability results can sometimes be achieved via reduction to the pure case and sometimes can be based on direct goal-driven algorithms. An alternative approach to modeling atoms that allows one to retain the original formulation of extensionality was proposed by Quine: atoms are self-singletons. In this article we adopt this approach in coping with the satisfiability problem: We show the decidability of this problem relativized to ∃*∀-sentences, and develop a goal-driven unification algorithm.