
Showing papers in "Journal of Automated Reasoning in 2007"


Journal ArticleDOI
TL;DR: It is shown that, for the DLs of the DL-Lite family, the usual DL reasoning tasks are polynomial in the size of the TBox, and query answering is LogSpace in the size of the ABox, which is the first result of polynomial-time data complexity for query answering over DL knowledge bases.
Abstract: We propose a new family of description logics (DLs), called DL-Lite, specifically tailored to capture basic ontology languages, while keeping low complexity of reasoning. Reasoning here means not only computing subsumption between concepts and checking satisfiability of the whole knowledge base, but also answering complex queries (in particular, unions of conjunctive queries) over the instance level (ABox) of the DL knowledge base. We show that, for the DLs of the DL-Lite family, the usual DL reasoning tasks are polynomial in the size of the TBox, and query answering is LogSpace in the size of the ABox (i.e., in data complexity). To the best of our knowledge, this is the first result of polynomial-time data complexity for query answering over DL knowledge bases. Notably our logics allow for a separation between TBox and ABox reasoning during query evaluation: the part of the process requiring TBox reasoning is independent of the ABox, and the part of the process requiring access to the ABox can be carried out by an SQL engine, thus taking advantage of the query optimization strategies provided by current database management systems. Since even slight extensions to the logics of the DL-Lite family make query answering at least NLogSpace in data complexity, thus ruling out the possibility of using off-the-shelf relational technology for query processing, we can conclude that the logics of the DL-Lite family are the maximal DLs supporting efficient query answering over large amounts of instances.
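The TBox/ABox separation described in this abstract can be sketched in miniature: first rewrite the query using only the TBox, then hand the result to an SQL engine. The rule format, table naming, and rewriting loop below are illustrative assumptions, not the paper's actual rewriting algorithm.

```python
# Minimal sketch of DL-Lite-style query rewriting (hypothetical, simplified):
# an atomic query over a concept is expanded using TBox inclusions A ⊑ B,
# then evaluated as a plain SQL UNION over ABox tables.

def rewrite(query_concept, tbox):
    """Return all concepts whose instances answer the query.

    tbox: list of (sub, sup) pairs meaning 'sub ⊑ sup'.
    Note: the ABox is never consulted here, matching the separation
    property described in the abstract.
    """
    result, frontier = {query_concept}, [query_concept]
    while frontier:
        c = frontier.pop()
        for sub, sup in tbox:
            if sup == c and sub not in result:
                result.add(sub)
                frontier.append(sub)
    return sorted(result)

def to_sql(concepts):
    # Assume each concept is stored as a one-column table of instance names.
    return " UNION ".join(f"SELECT x FROM {c}" for c in concepts)

tbox = [("Professor", "Teacher"), ("AssistantProfessor", "Professor")]
print(to_sql(rewrite("Teacher", tbox)))
```

The rewriting step depends only on the TBox; the generated SQL can then be optimized and executed entirely by the database, which is the point the abstract makes.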

1,482 citations


Journal ArticleDOI
TL;DR: This paper develops two different algorithms: one implementing a bottom-up approach using the support of an external description logic reasoner, the other implementing a specialized tableau-based calculus.
Abstract: In this paper we study the diagnosis and repair of incoherent terminologies. We define a number of new nonstandard reasoning services to explain incoherence through pinpointing, and we present algorithms for all of these services. For one of the core tasks of debugging, the calculation of minimal unsatisfiability preserving subterminologies, we developed two different algorithms, one implementing a bottom-up approach using support of an external description logic reasoner, the other implementing a specialized tableau-based calculus. Both algorithms have been prototypically implemented. We study the effectiveness of our algorithms in two ways: we present a realistic case study where we diagnose a terminology used in a practical application, and we perform controlled benchmark experiments to get a better understanding of the computational properties of our algorithms in particular and the debugging problem in general.

211 citations


Journal ArticleDOI
TL;DR: A novel reasoning algorithm is developed that reduces a knowledge base to a disjunctive datalog program while preserving the set of ground consequences and identifies an expressive fragment of $\mathcal{SHIQ}$ with polynomial data complexity.
Abstract: As applications of description logics proliferate, efficient reasoning with knowledge bases containing many assertions becomes ever more important. For such cases, we developed a novel reasoning algorithm that reduces a $\mathcal{SHIQ}$ knowledge base to a disjunctive datalog program while preserving the set of ground consequences. Queries can then be answered in the resulting program while reusing existing and practically proven optimization techniques of deductive databases, such as join-order optimizations or magic sets. Moreover, we use our algorithm to derive precise data complexity bounds: we show that $\mathcal{SHIQ}$ is data complete for NP, and we identify an expressive fragment of $\mathcal{SHIQ}$ with polynomial data complexity.
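The evaluation style this reduction enables, answering queries over ground facts with datalog-style rules derived from terminological axioms, can be shown with a toy forward-chaining sketch. The actual reduction from $\mathcal{SHIQ}$ to disjunctive datalog is far richer; treat this purely as an illustration of the idea.

```python
# Toy illustration: evaluate datalog-style rules B(x) :- A(x), obtained
# from simple concept inclusions A ⊑ B, by naive forward chaining over
# ground facts. (The real SHIQ translation handles much more.)

def forward_chain(rules, facts):
    """rules: (body_pred, head_pred) pairs; facts: set of (pred, individual)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            for pred, ind in list(facts):
                if pred == body and (head, ind) not in facts:
                    facts.add((head, ind))
                    changed = True
    return facts

rules = [("GraduateStudent", "Student"), ("Student", "Person")]
facts = {("GraduateStudent", "anna")}
print(("Person", "anna") in forward_chain(rules, facts))  # True
```

Real deductive-database engines replace this naive loop with the optimizations the abstract mentions, such as join-order optimization and magic sets.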

188 citations


Journal ArticleDOI
TL;DR: This work presents a decision procedure for $\mathcal{SHOIQ}$, a slightly more expressive logic than $\mathcal{SHOIN}$, extending the well-known algorithm for $\mathcal{SHIQ}$, which is the basis for several highly successful implementations.
Abstract: OWL DL, a new W3C ontology language recommendation, is based on the expressive description logic $\mathcal{SHOIN}$ . Although the ontology consistency problem for $\mathcal{SHOIN}$ is known to be decidable, up to now there has been no known "practical" decision procedure, that is, a goal-directed procedure that is likely to perform well with realistic ontology-derived problems. We present such a decision procedure for $\mathcal{SHOIQ}$ , a slightly more expressive logic than $\mathcal{SHOIN}$ , extending the well-known algorithm for $\mathcal{SHIQ}$ , which is the basis for several highly successful implementations.

163 citations


Journal ArticleDOI
TL;DR: This paper identifies a general property of concrete domains that is sufficient for proving decidability of DLs with both concrete domains and general TBoxes, and presents a tableau algorithm for reasoning in DLs equipped with concrete domains and general TBoxes.
Abstract: In order to use description logics (DLs) in an application, it is crucial to identify a DL that is sufficiently expressive to represent the relevant notions of the application domain, but for which reasoning is still decidable. Two means of expressivity required by many modern applications of DLs are concrete domains and general TBoxes. The former are used for defining concepts based on concrete qualities of their instances such as the weight, age, duration, and spatial extension. The purpose of the latter is to capture background knowledge by stating that the extension of a concept is included in the extension of another concept. Unfortunately, combining concrete domains with general TBoxes often leads to DLs for which reasoning is undecidable. In this paper, we identify a general property of concrete domains that is sufficient for proving decidability of DLs with both concrete domains and general TBoxes. We exhibit some useful concrete domains, most notably a spatial one based on the RCC-8 relations that have this property. Then, we present a tableau algorithm for reasoning in DLs equipped with concrete domains and general TBoxes.

123 citations


Journal ArticleDOI
TL;DR: This work focuses on optimizations of the description logic system FaCT++, which implements a wide variety of such optimizations, some present in other reasoners and some novel or refined in FaCT++.
Abstract: Tableau algorithms are currently the most widely used and empirically the fastest algorithms for reasoning in expressive description logics, including the important description logics $\mathcal{SHIQ}$ and $\mathcal{SHOIQ}$ . Achieving a high level of performance on terminological reasoning in expressive description logics when using tableau-based algorithms requires the incorporation of a wide variety of optimizations. The description logic system FaCT++ implements a wide variety of such optimizations, some present in other reasoners and some novel or refined in FaCT++.

103 citations


Journal ArticleDOI
TL;DR: This paper focuses on some of the distinctive features of the user interaction with Matita, characterized mostly by the organization of the library as a searchable knowledge base, the emphasis on a high-quality notational rendering, and the complex interplay between syntax, presentation, and semantics.
Abstract: Matita is a new, document-centric, tactic-based interactive theorem prover. This paper focuses on some of the distinctive features of the user interaction with Matita, characterized mostly by the organization of the library as a searchable knowledge base, the emphasis on a high-quality notational rendering, and the complex interplay between syntax, presentation, and semantics.

95 citations


Journal ArticleDOI
TL;DR: This work describes the reconstruction of a phylogeny for a set of taxa, with a character-based cladistics approach, in a declarative knowledge representation formalism, and shows how to use computational methods of answer set programming to generate conjectures about the evolution of the given taxa.
Abstract: We describe the reconstruction of a phylogeny for a set of taxa, with a character-based cladistics approach, in a declarative knowledge representation formalism, and show how to use computational methods of answer set programming to generate conjectures about the evolution of the given taxa. We have applied this computational method in two domains: historical analysis of languages and historical analysis of parasite-host systems. In particular, using this method, we have computed some plausible phylogenies for Chinese dialects, for Indo-European language groups, and for Alcataenia species. Some of these plausible phylogenies are different from the ones computed by other software. Using this method, we can easily describe domain-specific information (e.g., temporal and geographical constraints), and thus prevent the reconstruction of some phylogenies that are not plausible.

83 citations


Journal ArticleDOI
TL;DR: For the first time, the testing and evaluation of ATP systems for intuitionistic logic have been put on a firm basis by running comprehensive tests of currently available intuitionistic ATP systems on all problems in the ILTP library.
Abstract: The Intuitionistic Logic Theorem Proving (ILTP) library provides a platform for testing and benchmarking automated theorem proving (ATP) systems for intuitionistic propositional and first-order logic. It includes about 2,800 problems in a standardized syntax from 24 problem domains. For each problem an intuitionistic status and difficulty rating were obtained by running comprehensive tests of currently available intuitionistic ATP systems on all problems in the library. Thus, for the first time, the testing and evaluation of ATP systems for intuitionistic logic have been put on a firm basis.

69 citations


Journal ArticleDOI
TL;DR: This paper has implemented a diagrammatic theorem prover, called Edith, which has access to four sound and complete sets of reasoning rules for Euler diagrams, and develops a sophisticated heuristic to guide the search for a proof.
Abstract: Diagrammatic reasoning has the potential to be important in numerous application areas. This paper focuses on the simple, but widely used, Euler diagrams that form the basis of many more expressive logics. We have implemented a diagrammatic theorem prover, called Edith, which has access to four sound and complete sets of reasoning rules for Euler diagrams. Furthermore, for each rule set we develop a sophisticated heuristic to guide the search for a proof. This paper is about understanding how the choice of reasoning rule set affects the time taken to find proofs. Such an understanding will influence reasoning rule design in other logics. Moreover, this work specific to Euler diagrams directly benefits the many logics based on Euler diagrams. We investigate how the time taken to find a proof depends not only on the proof task but also on the reasoning system used. Our evaluation allows us to predict the best choice of reasoning system, given a proof task, in terms of time taken, and we extract a guide for defining reasoning rules for other logics in order to minimize time requirements.

51 citations


Journal ArticleDOI
TL;DR: This paper addresses the decision problem for the future fragment of Neighborhood Logic (RPNL) and solves it positively by showing that the satisfiability problem for RPNL over the natural numbers is NEXPTIME-complete; a sound and complete tableau-based decision procedure is then developed and proved optimal.
Abstract: Propositional interval temporal logics are quite expressive temporal logics that allow one to naturally express statements that refer to time intervals. Unfortunately, most such logics turn out to be (highly) undecidable. In order to get decidability, severe syntactic or semantic restrictions have been imposed to interval-based temporal logics to reduce them to point-based ones. The problem of identifying expressive enough, yet decidable, new interval logics or fragments of existing ones that are genuinely interval-based is still largely unexplored. In this paper, we focus our attention on interval logics of temporal neighborhood. We address the decision problem for the future fragment of Neighborhood Logic (Right Propositional Neighborhood Logic, RPNL for short), and we positively solve it by showing that the satisfiability problem for RPNL over natural numbers is NEXPTIME-complete. Then, we develop a sound and complete tableau-based decision procedure, and we prove its optimality.

Journal ArticleDOI
TL;DR: The design of a graphical user interface to deal with proofs in geometry is presented and an interactive proof system (Coq) to mechanically check proofs built interactively by the user is developed.
Abstract: We present in this paper the design of a graphical user interface to deal with proofs in geometry. The software developed combines three tools: a dynamic geometry software to explore, measure, and invent conjectures; an automatic theorem prover to check facts; and an interactive proof system (Coq) to mechanically check proofs built interactively by the user.

Journal ArticleDOI
TL;DR: This paper aims to gain new insights into the hardness of the SAT problem and to help in teaching SAT algorithms by converting SAT instances into graphs and applying established graph layout techniques.
Abstract: SAT-solvers have turned into essential tools in many areas of applied logic like, for example, hardware verification or satisfiability checking modulo theories. However, although recent implementations are able to solve problems with hundreds of thousands of variables and millions of clauses, much smaller instances remain unsolved. What makes a particular instance hard or easy is at most partially understood, and is often attributed to the instance's internal structure. By converting SAT instances into graphs and applying established graph layout techniques, this internal structure can be visualized and thus serve as the basis of subsequent analysis. Moreover, by providing tools that animate the structure during the run of a SAT algorithm, dynamic changes of the problem instance become observable. Thus, we expect both to gain new insights into the hardness of the SAT problem and to help in teaching SAT algorithms.
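The conversion step mentioned above can be sketched with one common graph model, in which variables are nodes and co-occurrence in a clause adds an edge. This particular model is an assumption chosen for illustration; the tool described in the abstract may use other graph models as well.

```python
# Sketch: turn a CNF instance into a variable-interaction graph.
# Nodes are variables; an edge joins two variables sharing a clause.
# (One common model for SAT visualization; an illustrative assumption.)

from itertools import combinations

def interaction_graph(clauses):
    """clauses: iterable of clauses, each a list of nonzero ints
    (DIMACS-style literals). Returns an adjacency-set dict."""
    graph = {}
    for clause in clauses:
        vars_in_clause = {abs(lit) for lit in clause}
        for v in vars_in_clause:
            graph.setdefault(v, set())
        for u, v in combinations(sorted(vars_in_clause), 2):
            graph[u].add(v)
            graph[v].add(u)
    return graph

# (x1 ∨ ¬x2) ∧ (x2 ∨ x3) ∧ (¬x1 ∨ x3)
g = interaction_graph([[1, -2], [2, 3], [-1, 3]])
print(g)  # every pair among the three variables interacts
```

Feeding such an adjacency structure to a standard force-directed layout is then enough to expose clustering in the instance.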

Journal ArticleDOI
TL;DR: This paper investigates using an automated proof assistant, particularly Isabelle/HOL, as the model supporting first year undergraduate exercises in which students write proofs in number theory, using MathsTiles: composable tiles that resemble written mathematics.
Abstract: The Intelligent Book project aims to improve online education by designing materials that can model the subject matter they teach, in the manner of a reactive learning environment. In this paper, we investigate using an automated proof assistant, particularly Isabelle/HOL, as the model supporting first year undergraduate exercises in which students write proofs in number theory. Automated proof assistants are generally considered to be difficult for novices to learn. We examine whether, by providing a very specialized interface, it is possible to build something that is usable enough to be of educational value. To ensure students cannot "game the system" the exercise avoids tactic-choosing interaction styles but asks the student to write out the proof. Proofs are written using MathsTiles: composable tiles that resemble written mathematics. Unlike traditional syntax-directed editors, MathsTiles allows students to keep many answer fragments on the canvas at the same time and does not constrain the order in which an answer is written. Also, the tile syntax does not need to match the underlying Isar syntax exactly, and different tiles can be used for different questions. The exercises take place within the context of an Intelligent Book. We performed a user study and qualitative analysis of the system. Some users were able to complete proofs with much less training than is usual for the automated proof assistant itself, but there remain significant usability issues to overcome.

Journal ArticleDOI
TL;DR: This paper introduces the notion of models for quantified Boolean formulas and shows the complexity of the model checking problem, which is to check whether a given set of Boolean functions is a model for a formula.
Abstract: In this paper, we introduce the notion of models for quantified Boolean formulas. For various classes of quantified Boolean formulas and various classes of Boolean functions, we investigate the problem of determining whether a model exists. Furthermore, we show for these classes the complexity of the model checking problem, which is to check whether a given set of Boolean functions is a model for a formula. For classes of Boolean functions, we establish some characterizations in terms of classes of quantified Boolean formulas that have such a model.
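The model-checking task can be made concrete with a toy checker: a candidate model supplies one Boolean function per existential variable, and checking means verifying the matrix under every assignment to the universal variables. The representation below is an illustrative assumption, not the paper's formalism.

```python
# Toy QBF model checker (illustrative; the encoding is an assumption).
# A QBF ∀x1..xn ∃y1..ym φ has a model if there are Boolean functions
# f_j over the universals such that φ holds for every universal
# assignment when each y_j is replaced by f_j.

from itertools import product

def is_model(universals, matrix, model):
    """universals: universal variable names.
    matrix: function from a {name: bool} dict to bool.
    model: dict mapping existential names to functions of that dict."""
    for bits in product([False, True], repeat=len(universals)):
        env = dict(zip(universals, bits))
        for y, f in model.items():
            env[y] = f(env)
        if not matrix(env):
            return False
    return True

# ∀x ∃y. (x ↔ y): the identity function for y is a model,
# the constant-False function is not.
phi = lambda env: env["x"] == env["y"]
print(is_model(["x"], phi, {"y": lambda env: env["x"]}))  # True
print(is_model(["x"], phi, {"y": lambda env: False}))     # False
```

Even this brute-force check is exponential in the number of universals, which hints at why the paper studies the complexity of model checking for restricted function classes.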

Journal ArticleDOI
TL;DR: This work illustrates a methodology for formalizing and reasoning about Abadi and Cardelli’s object-based calculi, in (co)inductive type theory, by taking advantage of natural deduction semantics and coinduction in combination with weak higher-order abstract syntax and the Theory of Contexts.
Abstract: We illustrate a methodology for formalizing and reasoning about Abadi and Cardelli's object-based calculi, in (co)inductive type theory, such as the Calculus of (Co)Inductive Constructions, by taking advantage of natural deduction semantics and coinduction in combination with weak higher-order abstract syntax and the Theory of Contexts. Our methodology allows us to implement smoothly the calculi in the target metalanguage; moreover, it suggests novel presentations of the calculi themselves. In detail, we present a compact formalization of the syntax and semantics for the functional and the imperative variants of the ς-calculus. Our approach simplifies the proofs of subject reduction theorems, which are proved formally in the proof assistant Coq with a relatively small overhead.

Journal ArticleDOI
TL;DR: This paper abstracts from the scenario two distinct styles of searching and describes how the Alcor interface implements these with a keyword and LSI-based search.
Abstract: The vision of a computerized assistant to mathematicians has existed since the inception of theorem-proving systems. The Alcor system has been designed to investigate and explore how a mathematician might interact with such an assistant by providing an interface to Mizar and the Mizar Mathematical Library. Our current research focuses on the integration of searching and authoring while proving. In this paper we use a scenario to elaborate on the nature of the interaction. We abstract from the scenario two distinct styles of searching and describe how the Alcor interface implements these with a keyword and LSI-based search. Though Alcor is still in its early stages of development, there are clear implications for the general problem of integrating searching and authoring, as well as technical issues with Mizar.

Journal ArticleDOI
TL;DR: The relationship between tableau-based and saturation-based calculi is investigated and the question to what extent refutation or consistency proofs in one calculus can be simulated in another one is answered.
Abstract: The clause-linking technique of Lee and Plaisted proves the unsatisfiability of a set of first-order clauses by generating a sufficiently large set of instances of these clauses that can be shown to be propositionally unsatisfiable. In recent years, this approach has been refined in several directions, leading to both tableau-based methods, such as the disconnection tableau calculus, and saturation-based methods, such as primal partial instantiation and resolution-based instance generation. We investigate the relationship between these calculi and answer the question to what extent refutation or consistency proofs in one calculus can be simulated in another one.

Journal ArticleDOI
TL;DR: A graph-based decision procedure for Gödel-Dummett logics and an algorithm to compute countermodels are presented and a conditional bicolored graph in which some specific cycles and alternating chains are detected.
Abstract: We present a graph-based decision procedure for Gödel-Dummett logics and an algorithm to compute countermodels. A formula is transformed into a conditional bicolored graph in which we detect some specific cycles and alternating chains using matrix computations. From an instance graph containing no such cycle (resp. no (n + 1)-alternating chain), we extract a countermodel in LC (resp. LC_n).

Journal ArticleDOI
TL;DR: This paper gives a comprehensive presentation of the disconnection tableau calculus, a proof method for formulas in classical first-order clause logic that uses unification in such a manner that important proof-theoretic advantages of the classical tableau calculus are preserved.
Abstract: In this paper we give a comprehensive presentation of the disconnection tableau calculus, a proof method for formulas in classical first-order clause logic. The distinguishing property of this calculus is that it uses unification in such a manner that important proof-theoretic advantages of the classical (i.e., Smullyan-style) tableau calculus are preserved, specifically the termination and model generation characteristics for certain formula classes. Additionally, the calculus is well suited for fully automated proof search. The calculus is described in detail with soundness and completeness proofs, and a number of important calculus refinements developed over the past years are presented. Referring to the model-finding abilities of the disconnection calculus, we explain the extraction and representation of models. We also describe the integration of paramodulation-based equality handling. Finally, we give an overview of related methods.

Journal ArticleDOI
TL;DR: A generic sound δ-rule is proposed, based on a quite general method for the construction of Skolem terms, which can be used as a common framework for proving the soundness of known variants of the δ-rule, and two revised versions are proposed that are proved sound by reducing them within this framework.
Abstract: We propose a generic sound δ-rule, based on a quite general method for the construction of Skolem terms, which can be used as a common framework for proving the soundness of known variants of the δ-rule, and we compare their relative effectiveness. Attempts to instantiate some of the δ-rules present in the literature within our framework allowed us to pinpoint unsoundness problems for two of them. In both cases, we propose revised versions that are proved sound by reducing them within our framework.

Journal ArticleDOI
TL;DR: The splitting system is liberalized with respect to β-inferences analogously to a well-known liberalization of δ-rules, and this is used to show an exponential speedup compared to free variable systems without splitting.
Abstract: Variable splitting is a technique applicable to free variable tableaux, sequent calculi, and matrix characterizations that exploits a relationship between β- and δ-rules. Using contextual information to differentiate between occurrences of the same free variable in different branches, the technique admits conditions under which these occurrences may safely be assigned different values by substitutions. This article investigates a system of variable splitting and shows its consistency by a semantical argument. The splitting system is liberalized with respect to β-inferences analogously to a well-known liberalization of δ-rules, and this is used to show an exponential speedup compared to free variable systems without splitting.

Journal ArticleDOI
TL;DR: A target for knowledge compilation called the ri-trie is introduced; it has the property that even if the knowledge bases are large, they nevertheless admit fast queries, so that a query can be processed in time linear in the size of the query regardless of the size of the compiled knowledge base.
Abstract: The goal of knowledge compilation is to enable fast queries. Prior approaches had the goal of small (i.e., polynomial in the size of the initial knowledge bases) compiled knowledge bases. Typically, query-response time is linear, so that the efficiency of querying the compiled knowledge base depends on its size. In this paper, a target for knowledge compilation called the ri-trie is introduced; it has the property that even if the knowledge bases are large, they nevertheless admit fast queries. Specifically, a query can be processed in time linear in the size of the query regardless of the size of the compiled knowledge base.
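The claimed query complexity, linear in the query and independent of the size of the compiled base, is easy to see on a plain trie. The sketch below is a generic trie, not the ri-trie construction itself, whose details are in the paper.

```python
# Generic trie sketch illustrating query time that depends only on the
# query length, not on how many entries the compiled structure holds.
# (This is NOT the ri-trie; it only illustrates the complexity claim.)

def build_trie(entries):
    root = {}
    for entry in entries:
        node = root
        for symbol in entry:
            node = node.setdefault(symbol, {})
        node["$end"] = True  # marks the end of a stored entry
    return root

def query(trie, entry):
    """Walks at most len(entry) nodes: O(|query|), whatever the trie size."""
    node = trie
    for symbol in entry:
        if symbol not in node:
            return False
        node = node[symbol]
    return "$end" in node

kb = build_trie([("a", "b"), ("a", "c", "d"), ("b",)])
print(query(kb, ("a", "c", "d")))  # True
print(query(kb, ("a", "c")))       # False
```

Doubling the number of stored entries never slows an individual lookup, which is the trade-off the abstract contrasts with size-focused compilation targets.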

Journal ArticleDOI
TL;DR: These foundations are also used to justify significant enhancements to ACL2(r), which has been used to prove basic theorems of analysis and the correctness of the implementation of transcendental functions in hardware.
Abstract: ACL2(r) is a modified version of the theorem prover ACL2 that adds support for the irrational numbers using nonstandard analysis. It has been used to prove basic theorems of analysis, as well as the correctness of the implementation of transcendental functions in hardware. This paper presents the logical foundations of ACL2(r). These foundations are also used to justify significant enhancements to ACL2(r).

Journal ArticleDOI
TL;DR: It is proved that, as in other approaches, inferences into variable positions are not needed in the goal-directed procedure, and the system is independent of the selection rule and complete for any E-unification problem.
Abstract: We present a goal-directed E-unification procedure with eager Variable Elimination and a new rule, Cycle, for the case of collapsing equations, that is, equations of the type x ≈ v where x ∈ Var(v). Cycle replaces Variable Decomposition (or the so-called Root Imitation) and thus removes the possibility of some obviously unnecessary infinite paths of inferences in the E-unification procedure. We prove that, as in other approaches, such inferences into variable positions in our goal-directed procedure are not needed. Our system is independent of the selection rule and complete for any E-unification problem.
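Collapsing equations are exactly where syntactic unification's occurs-check fails, although an E-unifier may still exist modulo the theory; that is why dedicated rules such as Cycle are needed. A small sketch, with an assumed term representation, of recognizing the collapsing case:

```python
# Sketch: detect collapsing equations x =? v with x ∈ Var(v).
# Terms: variables are strings; compound terms are (functor, args) tuples.
# Syntactic unification must fail on these (occurs-check), yet modulo a
# theory E (e.g., one containing f(x) = x) a unifier may still exist.

def variables(term):
    if isinstance(term, str):  # a variable
        return {term}
    _, args = term             # a compound term
    return set().union(*(variables(a) for a in args)) if args else set()

def is_collapsing(x, v):
    """x =? v is collapsing when v is compound and x occurs in v."""
    return isinstance(v, tuple) and x in variables(v)

# x =? f(g(x), y) is collapsing; x =? f(y) is not.
print(is_collapsing("x", ("f", [("g", ["x"]), "y"])))  # True
print(is_collapsing("x", ("f", ["y"])))                # False
```

A goal-directed procedure branches differently on such equations instead of decomposing them, which is the role the abstract assigns to Cycle.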

Journal ArticleDOI
TL;DR: In this article, it was shown that the set of quantifier-free formulas in the first-order language with variables ranging over functions, symbols for 0, +, −, min, max, and absolute value, and a ternary relation f = g + O(h) is decidable.
Abstract: Let F be the set of functions from an infinite set, S, to an ordered ring, R. For f, g, and h in F, the assertion f = g + O(h) means that for some constant C, |f(x) − g(x)| ≤ C|h(x)| for every x in S. Let L be the first-order language with variables ranging over such functions, symbols for 0, +, −, min, max, and absolute value, and a ternary relation f = g + O(h). We show that the set of quantifier-free formulas in this language that are valid in the intended class of interpretations is decidable and does not depend on the underlying set, S, or the ordered ring, R. If R is a subfield of the real numbers, we can add a constant 1 function, as well as multiplication by constants from any computable subfield. We obtain further decidability results for certain situations in which one adds symbols denoting the elements of a fixed sequence of functions of strictly increasing rates of growth.

Journal ArticleDOI
TL;DR: A new higher-order rewriting formalism, called expression reduction systems with patterns (ERSP), where abstraction is allowed not only on variables but also on nested patterns with metavariables, and it is shown that confluence holds for a reasonable class of systems and terms.
Abstract: We introduce a new higher-order rewriting formalism, called expression reduction systems with patterns (ERSP), where abstraction is allowed not only on variables but also on nested patterns with metavariables. These patterns are built by combining standard algebraic patterns with choice constructors denoting cases. In other words, the nondeterministic choice between different rewrite rules which is inherent to classical rewriting formalisms can be lifted here to the level of patterns. We show that confluence holds for a reasonable class of systems and terms.

Journal ArticleDOI
TL;DR: This paper investigates an abstract form of rewriting, by following the paradigm of logical-system independency, and introduces convergent rewrite systems that enable one to describe deduction procedures for their corresponding theory, and proposes a Knuth-Bendix–style completion procedure.
Abstract: When rewriting is used to generate convergent and complete rewrite systems in order to answer the validity problem for some theories, all the rewriting theories rely on a same set of notions, properties, and methods. Rewriting techniques have been used mainly to answer the validity problem of equational theories, that is, to compute congruences. Recently, however, they have been extended in order to be applied to other algebraic structures such as preorders and orders. In this paper, we investigate an abstract form of rewriting, by following the paradigm of logical-system independency. To achieve this purpose, we provide a few simple conditions (or axioms) under which rewriting (and then the set of classical properties and methods) can be modeled, understood, studied, proven, and generalized. This enables us to extend rewriting techniques to other algebraic structures than congruences and preorders such as congruences closed under monotonicity and modus ponens. We introduce convergent rewrite systems that enable one to describe deduction procedures for their corresponding theory, and we propose a Knuth-Bendix-style completion procedure in this abstract framework.
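A convergent rewrite system yields a decision procedure for its theory by comparing normal forms: rewrite both sides until no rule applies and check that the results coincide. The sketch below uses a toy string-rewriting system chosen for illustration, not an example from the paper.

```python
# Sketch: deciding equality via a convergent (terminating + confluent)
# rewrite system by comparing normal forms. The rules below are a toy
# free-group-style cancellation system over strings (an assumption),
# where uppercase letters stand for inverses.

RULES = [
    ("aA", ""),  # a · a⁻¹ → ε
    ("Aa", ""),  # a⁻¹ · a → ε
    ("bB", ""),
    ("Bb", ""),
]

def normal_form(word):
    """Rewrite until no rule applies; this terminates because every
    step strictly shortens the word."""
    changed = True
    while changed:
        changed = False
        for lhs, rhs in RULES:
            if lhs in word:
                word = word.replace(lhs, rhs, 1)
                changed = True
                break
    return word

def equal_modulo_rules(u, v):
    # With a convergent system, u = v holds iff normal forms coincide.
    return normal_form(u) == normal_form(v)

print(normal_form("abBA"))             # "" (the empty word)
print(equal_modulo_rules("aAb", "b"))  # True
```

Completion procedures in the Knuth-Bendix style exist precisely to turn an arbitrary set of equations into such a convergent rule set so that this normal-form comparison is a valid decision procedure.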

Journal ArticleDOI
TL;DR: A variant of the basic ordered superposition rules to handle equality in an analytic free-variable tableau calculus is presented and completeness of this calculus is proved by an adaptation of the model generation technique commonly used for completeness proofs of superposition in the context of resolution calculi.
Abstract: We present a variant of the basic ordered superposition rules to handle equality in an analytic free-variable tableau calculus. We prove completeness of this calculus by an adaptation of the model generation technique commonly used for completeness proofs of superposition in the context of resolution calculi. The calculi and the completeness proof are compared to earlier results of Degtyarev and Voronkov. Some variations and refinements are discussed.

Journal ArticleDOI
TL;DR: In this article, the authors propose a graphically oriented interface for theorem provers, with intuitive gestures to supplement or even replace textual interaction, which can be used in restricted problem domains.
Abstract: Theorem proving is coming of age. While its foundations predate the first computers, and systems have been built since the 1950s, dramatic improvements of proving power and expressivity have been made over the last fifteen years. And like other parts of computing, the complex symbolic manipulations of theorem provers have greatly benefited from the rapid increase in computing power. This means that applications of theorem proving and related methods have now become large, diverse and mature, ranging from real-world hardware and software verification to the formalisation of complex and deep mathematical proofs. One area that is still very much in need of improvement is interfaces for theorem provers. There is no broad agreement about what makes a good user interface in this area, and little is known about how to bridge the gap between imprecise human interaction and the highly stringent demands of fully formal mathematics. Yet there is universal recognition that interfaces must improve if theorem proving is to become more accessible and productive. Many present systems are unbearably complicated to use and can take months to learn because their interfaces are inadequate for beginning users. Experts are held back too: many interface operations which could significantly enhance productivity are not supported, although they are commonplace in other modern applications. An example operation is searching: instead of carefully structuring data in order to later efficiently retrieve them, nowadays the emphasis has shifted towards flexible and speedy searching in a large shared body of knowledge. Theorem provers ought to provide such search facilities, for theorems, definitions, and proofs. Second, we have the critical question of how proofs are shown to the user: presentations of proof should be readily comprehensible to both author and subsequent readers.
Finally, there is the issue of the interaction itself, taking place between the user and prover to help direct proof construction. Here, there is much scope for graphically oriented interfaces with intuitive gestures to supplement or even replace textual interaction. This seems especially possible in restricted problem domains.