
Showing papers in "Theory and Practice of Logic Programming" in 2005


Journal ArticleDOI
TL;DR: FLUX is a programming method for the design of agents that reason logically about their actions and sensor information in the presence of incomplete knowledge.
Abstract: FLUX is a programming method for the design of agents that reason logically about their actions and sensor information in the presence of incomplete knowledge. The core of FLUX is a system of Constraint Handling Rules, which enables agents to maintain an internal model of their environment by which they control their own behavior. The general action representation formalism of the fluent calculus provides the formal semantics for the constraint solver. FLUX exhibits excellent computational behavior due to both a carefully restricted expressiveness and the inference paradigm of progression.
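
To give the flavour of the constraint-solver core, here is a minimal sketch in SWI-Prolog CHR syntax (our illustration, not the actual FLUX solver): a not_holds/2 constraint, stating that a fluent is absent from a list-encoded state, is propagated through the state.

    :- use_module(library(chr)).
    :- chr_constraint not_holds/2.

    %% not_holds(F, Z): fluent F does not hold in state Z.
    not_holds(_, [])    <=> true.                        % empty state: trivially satisfied
    not_holds(F, [G|Z]) <=> dif(F, G), not_holds(F, Z).  % F differs from the head; recurse

    %% ?- not_holds(on(a,b), [on(b,c), clear(a)]).   % succeeds, posting dif/2 constraints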

175 citations


Journal ArticleDOI
TL;DR: In this article, the authors compare two recent extensions of the answer set semantics of logic programs, one of which allows the bodies and heads of rules to contain nested expressions and the other of which uses weight constraints.
Abstract: We compare two recent extensions of the answer set (stable model) semantics of logic programs. One of them, due to Lifschitz, Tang and Turner, allows the bodies and heads of rules to contain nested expressions. The other, due to Niemela and Simons, uses weight constraints. We show that there is a simple, modular translation from the language of weight constraints into the language of nested expressions that preserves the program's answer sets. Nested expressions can be eliminated from the result of this translation in favor of additional atoms. The translation makes it possible to compute answer sets for some programs with weight constraints using satisfiability solvers, and to prove the strong equivalence of programs with weight constraints using the logic of here-and-there.
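
A tiny instance of the translation on our own toy example (lparse-style syntax; the paper gives the general definition):

    %% Weight-constraint rule: p holds if at least one of q, r holds.
    p :- 1 { q, r }.
    %% Nested-expression counterpart with the same answer sets:
    p :- q ; r.
    %% Eliminating the nested body in favour of ordinary rules:
    p :- q.
    p :- r.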

153 citations


Journal ArticleDOI
TL;DR: In this article, an or-parallel tabling engine called OPTYap is proposed for logic programs with tabling; its execution model is based on the SLG-WAM for tabling and on environment copying for or-parallelism.
Abstract: Logic programming languages, such as Prolog, provide a high-level, declarative approach to programming. Logic Programming offers great potential for implicit parallelism, thus allowing parallel systems to often reduce a program's execution time without programmer intervention. We believe that for complex applications that take several hours, if not days, to return an answer, even limited speedups from parallel execution can directly translate to very significant productivity gains. It has been argued that Prolog's evaluation strategy – SLD resolution – often limits the potential of the logic programming paradigm. The past years have therefore seen widening efforts at increasing Prolog's declarativeness and expressiveness. Tabling has proved to be a viable technique to efficiently overcome SLD's susceptibility to infinite loops and redundant subcomputations. Our research demonstrates that implicit or-parallelism is a natural fit for logic programs with tabling. To substantiate this belief, we have designed and implemented an or-parallel tabling engine – OPTYap – and we used a shared-memory parallel machine to evaluate its performance. To the best of our knowledge, OPTYap is the first implementation of a parallel tabling engine for logic programming systems. OPTYap builds on Yap's efficient sequential Prolog engine. Its execution model is based on the SLG-WAM for tabling, and on the environment copying for or-parallelism. Preliminary results indicate that the mechanisms proposed to parallelize search in the context of SLD resolution can indeed be effectively and naturally generalized to parallelize tabled computations, and that the resulting systems can achieve good performance on shared-memory parallel machines. More importantly, it emphasizes our belief that through applying or-parallelism and tabling to logic programs the range of applications for Logic Programming can be increased.
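
Tabling's effect is easy to see on a textbook example (ours, in the standard tabled-Prolog syntax used by YAP, XSB and others):

    :- table path/2.

    path(X, Y) :- edge(X, Y).
    path(X, Y) :- path(X, Z), edge(Z, Y).

    edge(a, b).  edge(b, c).  edge(c, a).   % a cyclic graph

    %% ?- path(a, N).
    %% Plain SLD resolution loops forever on the left-recursive clause;
    %% tabling records each subgoal and answer once, so the query terminates
    %% with N = b, c, a.  Independent branches of this search are what
    %% OPTYap explores in parallel.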

65 citations


Journal ArticleDOI
TL;DR: This is the fifth and most recent edition of a legendary book whose first edition, dating from 1981, was probably the first introductory Prolog book; it remains the most gentle introduction to Prolog for everyone, including non-computer scientists.
Abstract: This is the fifth and most recent edition of a legendary book whose first edition dates from 1981. It was probably the first introductory Prolog book and it still is the most gentle introduction to Prolog for everyone, including non-computer scientists. The authors make very few assumptions indeed about the computer science knowledge and programming skills of their readers. Even so, the book covers most of the Prolog language. The first edition was based on the de facto Edinburgh Prolog standard. Obviously, with the appearance of the ISO Prolog Standard, a new edition of the book needed to adapt to ISO, and that is the reason for the subtitle Using the ISO Standard. Other changes were introduced as well and this review mentions some of the differences, but mainly concentrates on the current contents, independent of the history involved. The Preface of the book sketches the rationale behind the book and the new edition. It also lists in a table differences between the new and the old editions related to switching from the de facto standard to ISO. These differences are related to syntax and built-in predicates. There are surprisingly few differences and they are small. The book consists of eleven chapters

55 citations


Journal ArticleDOI
TL;DR: This paper shows how to use different kinds of information in the compilation of CHRs to obtain access efficiency, and a better translation of the CHR rules into the underlying language, which in this case is HAL.
Abstract: In this paper we discuss the optimizing compilation of Constraint Handling Rules (CHRs). CHRs are a multi-headed committed choice constraint language, commonly applied for writing incremental constraint solvers. CHRs are usually implemented as a language extension that compiles to the underlying language. In this paper we show how we can use different kinds of information in the compilation of CHRs to obtain access efficiency, and a better translation of the CHR rules into the underlying language, which in this case is HAL. The kinds of information used include the types, modes, determinism, functional dependencies and symmetries of the CHR constraints. We also show how to analyze CHR programs to determine functional dependencies, symmetries and other kinds of information that support these optimizations.
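
As a concrete reference point, the classic less-than-or-equal solver below (standard CHR folklore, not taken from the paper) is the kind of program such a compiler optimizes: declared types, modes and functional dependencies of leq/2 let the compiler choose cheap index structures for the multi-headed rule matching.

    :- use_module(library(chr)).
    :- chr_constraint leq/2.

    reflexivity  @ leq(X, X) <=> true.
    antisymmetry @ leq(X, Y), leq(Y, X) <=> X = Y.
    idempotence  @ leq(X, Y) \ leq(X, Y) <=> true.
    transitivity @ leq(X, Y), leq(Y, Z) ==> leq(X, Z).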

54 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a method based on level mappings which allows them to provide uniform characterizations of different semantics for logic programs, in particular of the least model semantics for definite programs, of the Fitting semantics, and of the well-founded semantics.
Abstract: Part of the theory of logic programming and nonmonotonic reasoning concerns the study of fixed-point semantics for these paradigms. Several different semantics have been proposed during the last two decades, and some have been more successful and acknowledged than others. The rationales behind those various semantics have been manifold, depending on one's point of view, which may be that of a programmer or inspired by commonsense reasoning, and consequently the constructions which lead to these semantics are technically very diverse, and the exact relationships between them have not yet been fully understood. In this paper, we present a conceptually new method, based on level mappings, which allows us to provide uniform characterizations of different semantics for logic programs. We will display our approach by giving new and uniform characterizations of some of the major semantics, more particularly of the least model semantics for definite programs, of the Fitting semantics, and of the well-founded semantics. A novel characterization of the weakly perfect model semantics will also be provided.
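
To give the flavour of a level-mapping characterization (our paraphrase of the well-known case, not the paper's exact statement): for a definite program P, the least Herbrand model M_P is the unique model M of P for which there exists a level mapping l such that

    \forall A \in M \;\exists (A \leftarrow B_1, \dots, B_n) \in ground(P):
        \{B_1, \dots, B_n\} \subseteq M \;\text{and}\; l(B_i) < l(A) \;\text{for}\; 1 \le i \le n.

The other semantics are then obtained uniformly by varying the conditions imposed on M and l.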

40 citations


Journal ArticleDOI
TL;DR: In this article, a programming tactic involving polyhedra is reported that has been widely applied in the polyhedral analysis of (constraint) logic programs; it enables the computation of the convex hulls required for polyhedral analysis to be coded with linear constraint solving machinery that is available in many Prolog systems.
Abstract: A programming tactic involving polyhedra is reported that has been widely applied in the polyhedral analysis of (constraint) logic programs. The method enables the computation of the convex hulls that are required for polyhedral analysis to be coded with linear constraint solving machinery that is available in many Prolog systems.
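
A one-dimensional sketch of the tactic in CLP(Q) (our toy rendering of the standard relaxation, with hypothetical polyhedra P1 = [0,1] and P2 = [3,4]): each point of the hull is the sum of points from scaled copies of the two polyhedra.

    :- use_module(library(clpq)).

    %% hull(X): X lies in the convex hull of [0,1] and [3,4], i.e. in [0,4].
    hull(X) :-
        { X = X1 + X2,
          S1 >= 0, S2 >= 0, S1 + S2 = 1,
          0*S1 =< X1, X1 =< 1*S1,        % scaled copy of P1: 0 <= x <= 1
          3*S2 =< X2, X2 =< 4*S2 }.      % scaled copy of P2: 3 <= x <= 4

    %% ?- hull(2).   % succeeds: 2 is in the hull although it is in neither interval
    %% ?- hull(5).   % fails: the hull is [0,4]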

35 citations


Journal ArticleDOI
TL;DR: In this paper, the authors provide correctness results for partial evaluation based on needed narrowing and show that the nice properties of this strategy are essential for the specialization process, in particular the structure of the original program is preserved by partial evaluation and thus the same evaluation strategy can be applied for the execution of specialized programs.
Abstract: Many functional logic languages are based on narrowing, a unification-based goal-solving mechanism which subsumes the reduction mechanism of functional languages and the resolution principle of logic languages. Needed narrowing is an optimal evaluation strategy which constitutes the basis of modern (narrowing-based) lazy functional logic languages. In this work, we present the fundamentals of partial evaluation in such languages. We provide correctness results for partial evaluation based on needed narrowing and show that the nice properties of this strategy are essential for the specialization process. In particular, the structure of the original program is preserved by partial evaluation and, thus, the same evaluation strategy can be applied for the execution of specialized programs. This is in contrast to other partial evaluation schemes for lazy functional logic programs which may change the program structure in a negative way. Recent proposals for the partial evaluation of declarative multi-paradigm programs use (some form of) needed narrowing to perform computations at partial evaluation time. Therefore, our results constitute the basis for the correctness of such partial evaluators.
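
The structure-preservation property is easiest to picture in the plain logic programming analogue (our illustration; the paper works in the richer needed-narrowing setting): specializing a predicate with respect to a partially static call yields a residual program of the same shape.

    app([], Ys, Ys).
    app([X|Xs], Ys, [X|Zs]) :- app(Xs, Ys, Zs).

    %% Partially evaluating app([a,b], Ys, Zs) by unfolding the recursive
    %% clause twice yields the residual, structure-preserving definition
    app_ab(Ys, [a, b|Ys]).
    %% which can be run under the same evaluation strategy as the original.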

33 citations


Journal ArticleDOI
TL;DR: The architecture of cTI is described and an extensive experimental evaluation of the system is reported, covering many classical examples from the logic programming termination literature and several Prolog programs of respectable size and complexity.
Abstract: We present cTI, the first system for universal left-termination inference of logic programs. Termination inference generalizes termination analysis and checking. Traditionally, a termination analyzer tries to prove that a given class of queries terminates. This class must be provided to the system, for instance by means of user annotations. Moreover, the analysis must be redone every time the class of queries of interest is updated. Termination inference, in contrast, requires neither user annotations nor recomputation. In this approach, terminating classes for all predicates are inferred at once. We describe the architecture of cTI and report an extensive experimental evaluation of the system covering many classical examples from the logic programming termination literature and several Prolog programs of respectable size and complexity.
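
To convey the flavour of termination inference (our illustration, not cTI's actual output format): instead of checking a single query class, the system infers, per predicate, a boolean groundness condition under which left-termination is guaranteed.

    app([], Ys, Ys).
    app([X|Xs], Ys, [X|Zs]) :- app(Xs, Ys, Zs).

    %% Inferred termination condition:  app(x1, x2, x3) left-terminates if  x1 \/ x3,
    %% i.e. whenever the first or the third argument is a finite ground list,
    %% which covers the modes app(in, in, out) and app(out, out, in) at once.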

30 citations


Journal ArticleDOI
TL;DR: This work shows how to deal with correctness and completeness in a declarative way, treating programs only from the logical point of view, and employs the ideas of this work to generalize a known method of proving termination of normal programs.
Abstract: We advocate a declarative approach to proving properties of logic programs. Total correctness can be separated into correctness, completeness and clean termination; the latter includes non-floundering. Only clean termination depends on the operational semantics, in particular on the selection rule. We show how to deal with correctness and completeness in a declarative way, treating programs only from the logical point of view. Specifications used in this approach are interpretations (or theories). We point out that specifications for correctness may differ from those for completeness, as usually there are answers which are neither considered erroneous nor required to be computed. We present proof methods for correctness and completeness for definite programs and generalize them to normal programs. For normal programs we use the 3-valued completion semantics; this is a standard semantics corresponding to negation as finite failure. The proof methods employ solely the classical 2-valued logic. We use a 2-valued characterization of the 3-valued completion semantics, which may be of separate interest. The method of proving correctness of definite programs is not new and can be traced back to the work of Clark in 1979. However, a more complicated approach using operational semantics was proposed by some authors. We show that it is not stronger than the declarative one, as far as properties of program answers are concerned. For a corresponding operational approach to normal programs, we show that it is (strictly) weaker than our method. We also employ the ideas of this work to generalize a known method of proving termination of normal programs.
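
The correctness half of the method, on a standard example (ours): fix a specification S, read as an interpretation, and check that every clause preserves it; then every computed answer lies in S.

    %% Specification S = { app(xs, ys, zs) | zs is the concatenation of xs and ys }.
    app([], Ys, Ys).                 % head is in S: [] ++ Ys = Ys
    app([X|Xs], Ys, [X|Zs]) :-       % if the body atom is in S (Xs ++ Ys = Zs)
        app(Xs, Ys, Zs).             % then so is the head: [X|Xs] ++ Ys = [X|Zs]

    %% Completeness is proved separately, possibly w.r.t. a weaker specification
    %% describing only the answers that are actually required.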

22 citations


Journal ArticleDOI
TL;DR: In this paper, a technique for the optimization of bound queries over disjunctive deductive databases with constraints is presented, which is based on the exploitation of binding propagation techniques which reduce the size of the data relevant to answer the query and, consequently, reduce both the complexity of computing a single model and the number of models to be considered.
Abstract: This paper presents a technique for the optimization of bound queries over disjunctive deductive databases with constraints. The proposed approach is an extension of the well-known Magic-Set technique and is well-suited for being integrated in current bottom-up (stable) model inference engines. More specifically, it is based on the exploitation of binding propagation techniques which reduce the size of the data relevant to answer the query and, consequently, reduce both the complexity of computing a single model and the number of models to be considered. The motivation of this work stems from the observation that traditional binding propagation optimization techniques for bottom-up model generator systems, simulating the goal driven evaluation of top-down engines, are only suitable for positive (disjunctive) queries, while hard problems are expressed using unstratified negation. The main contribution of the paper consists in the extension of a previous technique, defined for positive disjunctive queries, to queries containing both disjunctive heads and constraints (a simple and expressive form of unstratified negation). As the usual way of expressing declaratively hard problems is based on the guess-and-check technique, where the guess part is expressed by means of disjunctive rules and the check part is expressed by means of constraints, the technique proposed here is highly relevant for the optimization of queries expressing hard problems. The value of the technique has been proved by several experiments.
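
For orientation, here is the standard Magic-Set rewriting that the paper extends to the disjunctive setting (our simplified, non-disjunctive example): for the bound query path(a, Y), a magic predicate seeds and propagates the binding so that bottom-up evaluation touches only relevant facts.

    %% Original program:
    %%   path(X, Y) :- edge(X, Y).
    %%   path(X, Y) :- edge(X, Z), path(Z, Y).
    %% Rewriting for the query path(a, Y):
    magic_path(a).                                   % seed from the query binding
    magic_path(Z) :- magic_path(X), edge(X, Z).      % propagate bindings forward
    path(X, Y) :- magic_path(X), edge(X, Y).
    path(X, Y) :- magic_path(X), edge(X, Z), path(Z, Y).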

Journal ArticleDOI
TL;DR: In this paper, the authors present a framework for the automatic generation of CHR solvers given the logical specification of the constraints, taking advantage of the power of tabled resolution for constraint logic programming.
Abstract: In this paper, we present a framework for automatic generation of CHR solvers given the logical specification of the constraints. This approach takes advantage of the power of tabled resolution for constraint logic programming, in order to check the validity of the rules. Compared to previous work (Apt and Monfroy 1999; Ringeissen and Monfroy 2000; Abdennadher and Rigotti 2000; Abdennadher and Rigotti 2001a), where different methods for automatic generation of constraint solvers have been proposed, our approach enables the generation of more expressive rules (even recursive and splitting rules) that can be used directly as CHR solvers.
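
The kind of rules such a generator derives from a logical specification looks as follows (our illustration for a Boolean and/3 constraint, not actual output of the paper's system):

    :- use_module(library(chr)).
    :- chr_constraint and/3.

    %% and(X, Y, Z): Z is the conjunction of X and Y (0 = false, 1 = true).
    and(0, _, Z) <=> Z = 0.
    and(_, 0, Z) <=> Z = 0.
    and(1, Y, Z) <=> Z = Y.
    and(X, 1, Z) <=> Z = X.
    and(X, Y, 1) <=> X = 1, Y = 1.
    and(X, X, Z) <=> Z = X.       % a derived idempotence rule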

Journal ArticleDOI
TL;DR: This paper experimentally evaluates a number of proposed refinements of the Sharing domain, including an original proposal for a new mode recording the set of variables that are deemed to be ground or free, and discusses whether tracking compoundness allows the computation of more sharing information.
Abstract: Sharing, an abstract domain developed by D. Jacobs and A. Langen for the analysis of logic programs, derives useful aliasing information. It is well-known that a commonly used core of techniques, such as the integration of Sharing with freeness and linearity information, can significantly improve the precision of the analysis. However, a number of other proposals for refined domain combinations have been circulating for years. One feature that is common to these proposals is that they do not seem to have undergone a thorough experimental evaluation even with respect to the expected precision gains. In this paper we experimentally evaluate: helping Sharing with the definitely ground variables found using Pos, the domain of positive Boolean formulas; the incorporation of explicit structural information; a full implementation of the reduced product of Sharing and Pos; the issue of reordering the bindings in the computation of the abstract mgu; an original proposal for the addition of a new mode recording the set of variables that are deemed to be ground or free; a refined way of using linearity to improve the analysis; the recovery of hidden information in the combination of Sharing with freeness information. Finally, we discuss the issue of whether tracking compoundness allows the computation of more sharing information.
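
A worked instance of the domain (standard textbook material, our example): for the substitution θ = {X/f(U,V), Y/g(V,W), Z/a}, each variable occurring in the codomain induces the set of clause variables through whose bindings it is reachable: U gives {X}, V gives {X,Y}, W gives {Y}, while Z, being ground, contributes nothing. The Sharing description of θ is therefore {{X},{X,Y},{Y}}: X and Y may share (through V) and Z is definitely ground; the freeness and linearity components mentioned above refine this description further.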

Journal ArticleDOI
TL;DR: In this paper, the authors define a formal model for abduction with penalization over logic programs, which extends the abductive framework proposed by Kakas and Mancarella, and design a translation from abduction problems with penalties into logic programs with weak constraints.
Abstract: Abduction, first proposed in the setting of classical logics, has been studied with growing interest in the logic programming area in recent years. In this paper we study abduction with penalization in the logic programming framework. This form of abductive reasoning, which has not been previously analyzed in logic programming, turns out to represent several relevant problems, including optimization problems, very naturally. We define a formal model for abduction with penalization over logic programs, which extends the abductive framework proposed by Kakas and Mancarella. We address knowledge representation issues, encoding a number of problems in our abductive framework. In particular, we consider some relevant problems, taken from different domains, ranging from optimization theory to diagnosis and planning; their encodings turn out to be simple and elegant in our formalism. We thoroughly analyze the computational complexity of the main problems arising in the context of abduction with penalization from logic programs. Finally, we implement a system supporting the proposed abductive framework on top of the DLV engine. To this end, we design a translation from abduction problems with penalties into logic programs with weak constraints. We prove that this approach is sound and complete.
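
The translation target can be pictured with a toy encoding in DLV syntax (ours, not the paper's actual rewriting): hypotheses are guessed disjunctively, hard constraints check the observations, and weak constraints charge the penalties.

    %% Guess a set of hypotheses:
    assume(H) v discard(H) :- hypothesis(H).
    %% Hard constraint: every observation must be explained
    %% (explained/1 is defined from the assumed hypotheses in a concrete encoding):
    :- observation(O), not explained(O).
    %% Weak constraint: pay penalty P for each assumed hypothesis:
    :~ assume(H), penalty(H, P). [P:1]

    %% Optimal answer sets then correspond to minimum-penalty explanations.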

Journal ArticleDOI
TL;DR: A methodology for performing a correct termination analysis for a broad class of practical meta-interpreters, including ones that use negation and perform different tasks during execution, based on combining the power of general orderings and the well-known acceptability condition.
Abstract: The term meta-programming refers to the ability of writing programs that have other programs as data and exploit their semantics. The aim of this paper is to present a methodology that allows us to perform a correct termination analysis for a broad class of practical meta-interpreters, including ones that use negation and perform different tasks during the execution. It is based on combining the power of general orderings, used in proving termination of term-rewrite systems and programs, with the well-known acceptability condition, used in proving termination of logic programs. The methodology establishes a relationship between the ordering needed to prove termination of the interpreted program and the ordering needed to prove termination of the meta-interpreter together with this interpreted program. If such a relationship is established, termination of one of those implies termination of the other one, i.e. the meta-interpreter preserves termination. Among the meta-interpreters that are analysed correctly are a proof-tree constructing meta-interpreter, different kinds of tracers and reasoners.
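
The archetype of the class analysed is the "vanilla" meta-interpreter (standard folklore; the paper's class also covers tracers, reasoners and a proof-tree constructing interpreter):

    solve(true).
    solve((A, B)) :- solve(A), solve(B).
    solve(A)      :- clause(A, Body), solve(Body).

    %% The termination question: given an ordering that proves a goal G
    %% terminates for the interpreted program, construct an ordering that
    %% proves solve(G) terminates for the meta-interpreter combined with it.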

Journal ArticleDOI
TL;DR: In this paper, the authors compare an improved version of conflict-directed backjumping and two variants of dynamic backtracking with respect to chronological backtracking on some of the AIM instances which are a benchmark set of random 3-SAT problems.
Abstract: The most advanced implementation of adaptive constraint processing with Constraint Handling Rules (CHR) allows the application of intelligent search strategies to solve Constraint Satisfaction Problems (CSP). This presentation compares an improved version of conflict-directed backjumping and two variants of dynamic backtracking with chronological backtracking on some of the AIM instances, a benchmark set of random 3-SAT problems. A CHR implementation of a Boolean constraint solver combined with these different search strategies in Java is thus compared with a CHR implementation of the same Boolean constraint solver combined with chronological backtracking in SICStus Prolog. This comparison shows that the addition of “intelligence” to the search process may reduce the number of search steps dramatically. Furthermore, the Java implementations of the intelligent strategies are in most cases faster than the implementation of chronological backtracking. More specifically, conflict-directed backjumping is even faster than the SICStus Prolog implementation of chronological backtracking, although our Java implementation of CHR lacks the optimisations made in the SICStus Prolog system.

Journal ArticleDOI
TL;DR: A general hierarchy of argumentation semantics parameterised by the notions of attack chosen by proponent and opponent is defined, and one particular argumentation semantics is proved equivalent to the paraconsistent well-founded semantics with explicit negation, WFSX_p.
Abstract: Argumentation has proved a useful tool in defining formal semantics for assumption-based reasoning by viewing a proof as a process in which proponents and opponents attack each other's arguments by undercuts (attacks on an argument's premise) and rebuts (attacks on an argument's conclusion). In this paper, we formulate a variety of notions of attack for extended logic programs from combinations of undercuts and rebuts and define a general hierarchy of argumentation semantics parameterised by the notions of attack chosen by proponent and opponent. We prove the equivalence and subset relationships between the semantics and examine some essential properties concerning consistency and the coherence principle, which relates default negation and explicit negation. Most significantly, we place existing semantics put forward in the literature in our hierarchy and identify a particular argumentation semantics for which we prove equivalence to the paraconsistent well-founded semantics with explicit negation, WFSX_p. Finally, we present a general proof theory, based on dialogue trees, and show that it is sound and complete with respect to the argumentation semantics.

Journal ArticleDOI
TL;DR: In this paper, the authors introduce two normal forms for logic programs under stable/answer set semantics, which can simplify the study of program properties, mainly consistency, and show that the 3-kernel normal form is very useful for the static analysis of program consistency.
Abstract: Normal forms for logic programs under stable/answer set semantics are introduced. We argue that these forms can simplify the study of program properties, mainly consistency. The first normal form, called the kernel of the program, is useful for studying existence and number of answer sets. A kernel program is composed of the atoms which are undefined in the well-founded semantics, which are those that directly affect the existence of answer sets. Rule bodies are composed of negative literals only. Thus, the kernel form tends to be significantly more compact than other formulations. Also, it is possible to check consistency of kernel programs in terms of colorings of the Extended Dependency Graph program representation which we previously developed. The second normal form is called 3-kernel. A 3-kernel program is composed of the atoms which are undefined in the well-founded semantics. Rules in 3-kernel programs have at most two conditions, and each rule either belongs to a cycle, or defines a connection between cycles. 3-kernel programs may have positive conditions. The 3-kernel normal form is very useful for the static analysis of program consistency, i.e. the syntactic characterization of existence of answer sets. This result can be obtained thanks to a novel graph-like representation of programs, called the Cycle Graph, which is presented in the companion article Costantini (2004b).
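
A small example of the kernel shape (ours): rule bodies contain only negative literals, so the existence and number of answer sets hinge on the cycles through negation.

    %% Even cycle: this two-rule program has two answer sets, {p} and {q}.
    p :- not q.
    q :- not p.

    %% Odd cycle: taken alone, this rule admits no answer set at all.
    r :- not r.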

Journal ArticleDOI
TL;DR: This paper identifies a condition on the components of the analyser which guarantees that termination inference will infer all modes which can be checked to terminate, and applies this methodology to enhance a traditional termination analyser so that it also performs termination inference.
Abstract: This paper focuses on the inference of modes for which a logic program is guaranteed to terminate. This generalises traditional termination analysis, where an analyser tries to verify termination for a specified mode. Our contribution is a methodology in which components of traditional termination analysis are combined with backwards analysis to obtain an analyser for termination inference. We identify a condition on the components of the analyser which guarantees that termination inference will infer all modes which can be checked to terminate. The application of this methodology to enhance a traditional termination analyser so that it also performs termination inference is demonstrated.

Journal ArticleDOI
TL;DR: An extended compilation model is presented that treats higher-order unification and also handles dynamically emergent goals, and a satisfactory representation for lambda terms is developed by exploiting the nameless notation of de Bruijn as well as explicit encodings of substitutions.
Abstract: The logic programming paradigm provides the basis for a new intensional view of higher-order notions. This view is realized primarily by employing the terms of a typed lambda calculus as representational devices and by using a richer form of unification for probing their structures. These additions have important meta-programming applications but they also pose non-trivial implementation problems. One issue concerns the machine representation of lambda terms suitable to their intended use: an adequate encoding must facilitate comparison operations over terms in addition to supporting the usual reduction computation. Another aspect relates to the treatment of a unification operation that has a branching character and that sometimes calls for the delaying of the solution of unification problems. A final issue concerns the execution of goals whose structures become apparent only in the course of computation. These various problems are exposed in this paper and solutions to them are described. A satisfactory representation for lambda terms is developed by exploiting the nameless notation of de Bruijn as well as explicit encodings of substitutions. Special mechanisms are molded into the structure of traditional Prolog implementations to support branching in unification and carrying of unification problems over other computation steps; a premium is placed in this context on exploiting determinism and on emulating usual first-order behaviour. An extended compilation model is presented that treats higher-order unification and also handles dynamically emergent goals. The ideas described here have been employed in the Teyjus implementation of the λProlog language, a fact that is used to obtain a preliminary assessment of their efficacy.
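
The representational point about de Bruijn's nameless notation can be made with a small Prolog encoding (ours, not Teyjus's internal one): indices count enclosing binders, so alpha-equivalent terms are syntactically identical and can be compared directly.

    %% \x. \y. x y  is encoded as  lam(lam(app(var(2), var(1)))).
    %% \a. \b. a b  is encoded identically, so ==/2 decides alpha-equivalence.
    example(lam(lam(app(var(2), var(1))))).

    %% Explicit substitutions then allow beta-redexes to be reduced lazily,
    %% interleaving reduction with the index shifting the encoding requires.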

Journal ArticleDOI
TL;DR: This paper shows how a solver for the two-sorted CLP language, defined in previous work, has been implemented in the Constraint Handling Rules (CHR) language, a declarative language particularly suitable for high level implementation of constraint solvers.
Abstract: In classical CLP(FD) systems, domains of variables are completely known at the beginning of the constraint propagation process. However, in systems interacting with an external environment, acquiring the whole domains of variables before the beginning of constraint propagation may cause waste of computation time, or even obsolescence of the acquired data at the time of use. For such cases, the Interactive Constraint Satisfaction Problem (ICSP) model has been proposed (Cucchiara et al. 1999a) as an extension of the CSP model, to make it possible to start constraint propagation even when domains are not fully known, performing acquisition of domain elements only when necessary, and without the need for restarting the propagation after every acquisition. In this paper, we show how a solver for the two-sorted CLP language, defined in previous work (Gavanelli et al. 2005) to express ICSPs, has been implemented in the Constraint Handling Rules (CHR) language, a declarative language particularly suitable for high level implementation of constraint solvers.

Journal ArticleDOI
TL;DR: The Language for Structured Design (LSD) as mentioned in this paper is a language for combining the design and programming activities in one language, which allows the designer/programmer to work at a higher level, giving declarative specifications of a design in order to obtain the design descriptions.
Abstract: Computer Aided Design systems provide tools for building and manipulating models of solid objects. Some also provide access to programming languages so that parametrised designs can be expressed. There is a sharp distinction, therefore, between building models, a concrete graphical editing activity, and programming, an abstract, textual, algorithm-construction activity. The recently proposed Language for Structured Design (LSD) was motivated by a desire to combine the design and programming activities in one language. LSD achieves this by extending a visual logic programming language to incorporate the notions of solids and operations on solids. Here we investigate another aspect of the LSD approach, namely, that by using visual logic programming as the engine to drive the parametrised assembly of objects, we also gain the powerful symbolic problem-solving capability that is the forte of logic programming languages. This allows the designer/programmer to work at a higher level, giving declarative specifications of a design in order to obtain the design descriptions. Hence LSD integrates problem solving, design synthesis, and prototype assembly in a single homogeneous programming/design environment. We demonstrate this specification-to-final-assembly capability using the masterkeying problem for designing systems of locks and keys.

Journal ArticleDOI
TL;DR: Constraint Handling Rules have become a major specification and implementation language for logical, constraint-based algorithms and intelligent applications that can be almost directly written in CHR.
Abstract: During the last decade, Constraint Handling Rules (CHR) have become a major specification and implementation language for logical, constraint-based algorithms and intelligent applications (as witnessed for example by several hundred publications available online that mention CHR). Algorithms are often specified using inference rules, rewrite rules, sequents, proof rules, or logical axioms that can be almost directly written in CHR.

Journal ArticleDOI
TL;DR: This study systematically derives a scheduler for a class of rules that naturally arise in the context of rule-based constraint programming from a generic iteration algorithm of Apt (2000), and applies it to the membership rules of Apt and Monfroy (2001), leading to an implementation that yields considerably better performance than their execution as standard CHR rules.
Abstract: We study here schedulers for a class of rules that naturally arise in the context of rule-based constraint programming. We systematically derive a scheduler for them from a generic iteration algorithm of Apt (2000). We apply this study to so-called membership rules of Apt and Monfroy (2001). This leads to an implementation that yields a considerably better performance for these rules than their execution as standard CHR rules. Finally, we show how redundant rules can be identified and how appropriately reduced sets of rules can be computed.

Journal ArticleDOI
TL;DR: In this article, the authors define mode checking for strongly typed constraint logic programming (CLP) languages which require reordering of clause body literals and show how to handle a simple case of polymorphic modes by using polymorphic types.
Abstract: Recent constraint logic programming (CLP) languages, such as HAL and Mercury, require type, mode and determinism declarations for predicates. This information allows the generation of efficient target code and the detection of many errors at compile-time. Unfortunately, mode checking in such languages is difficult. One of the main reasons is that, for each predicate mode declaration, the compiler is required to appropriately re-order literals in the predicate's definition. The task is further complicated by the need to handle complex instantiations (which interact with type declarations and higher-order predicates) and automatic initialization of solver variables. Here we define mode checking for strongly typed CLP languages which require reordering of clause body literals. In addition, we show how to handle a simple case of polymorphic modes by using the corresponding polymorphic types.
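
A minimal reordering scenario (our example, in Mercury-style declarations): with both a forward and a backward mode for edge/2, checking the backward mode of two_step/2 forces the compiler to swap the body literals.

    :- pred edge(int, int).
    :- mode edge(in, out) is nondet.
    :- mode edge(out, in) is nondet.

    two_step(X, Y) :- edge(X, Z), edge(Z, Y).

    %% For mode two_step(in, out) the body runs as written; for
    %% two_step(out, in) the checker reorders it to edge(Z, Y), edge(X, Z),
    %% scheduling each call in its (out, in) mode so that no variable is
    %% consumed before it has been produced.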