Showing papers in "Artificial Intelligence in 1984"
TL;DR: In this article, a formalism for reasoning about actions is proposed that is based on a temporal logic, which allows a much wider range of actions to be described than with previous approaches such as the situation calculus.
Abstract: A formalism for reasoning about actions is proposed that is based on a temporal logic. It allows a much wider range of actions to be described than with previous approaches such as the situation calculus. This formalism is then used to characterize the different types of events, processes, actions, and properties that can be described in simple English sentences. In addressing this problem, we consider actions that involve non-activity as well as actions that can only be defined in terms of the beliefs and intentions of the actors. Finally, a framework for planning in a dynamic world with external events and multiple agents is suggested.
2,439 citations
TL;DR: This paper describes the basic concepts of qualitative process theory and several different kinds of reasoning that can be performed with them, and discusses the theory's implications for causal reasoning.
Abstract: Objects move, collide, flow, bend, heat up, cool down, stretch, compress, and boil. These and other things that cause changes in objects over time are intuitively characterized as processes . To understand commonsense physical reasoning and make programs that interact with the physical world as well as people do we must understand qualitative reasoning about processes, when they will occur, their effects, and when they will stop. Qualitative process theory defines a simple notion of physical process that appears useful as a language in which to write dynamical theories. Reasoning about processes also motivates a new qualitative representation for quantity in terms of inequalities, called the quantity space . This paper describes the basic concepts of qualitative process theory, several different kinds of reasoning that can be performed with them, and discusses its implications for causal reasoning. Several extended examples illustrate the utility of the theory, including figuring out that a boiler can blow up, that an oscillator with friction will eventually stop, and how to say that you can pull with a string, but not push with it.
2,087 citations
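To make the influence-resolution idea above concrete, here is a minimal Python sketch, assuming a toy representation in which active processes contribute signed influences to a quantity and a quantity space is just an ordered list of landmark names; it conveys the flavour of qualitative process theory, not Forbus's implementation.

```python
# A sketch (not Forbus's implementation) of two ideas from qualitative process
# theory: a quantity space as an ordered list of landmark values, and resolving
# the net direction of change of a quantity from the influences of active processes.

def resolve_influences(influences):
    """influences: list of +1 / -1 signs contributed by active processes.
    Returns +1, -1, 0, or None (ambiguous) for the net direction of change."""
    pos = any(s > 0 for s in influences)
    neg = any(s < 0 for s in influences)
    if pos and neg:
        return None      # opposing influences: not resolvable without more information
    if pos:
        return +1
    if neg:
        return -1
    return 0             # no active influence: the quantity is steady

# Quantity space for the temperature of heated water: an ordering of landmark
# values rather than numbers (the names are illustrative only).
quantity_space = ["freezing-point", "current-temp", "boiling-point"]
print(quantity_space)

# A heat-flow process influences the temperature positively while the source is hotter.
print(resolve_influences([+1]))        # -> 1 : temperature is rising
print(resolve_influences([+1, -1]))    # -> None : ambiguous
print(resolve_influences([]))          # -> 0 : steady
```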
TL;DR: A fairly encompassing account of qualitative physics, which introduces causality as an ontological commitment for explaining how devices behave, and presents algorithms for determining the behavior of a composite device from the generic behavior of its components.
Abstract: A qualitative physics predicts and explains the behavior of mechanisms in qualitative terms. The goals for the qualitative physics are (1) to be far simpler than the classical physics and yet retain all the important distinctions (e.g., state, oscillation, gain, momentum) without invoking the mathematics of continuously varying quantities and differential equations, (2) to produce causal accounts of physical mechanisms that are easy to understand, and (3) to provide the foundations for commonsense models for the next generation of expert systems. This paper presents a fairly encompassing account of qualitative physics. First, we discuss the general subject of naive physics and some of its methodological considerations. Second, we present a framework for modeling the generic behavior of individual components of a device based on the notions of qualitative differential equations (confluences) and qualitative state. This requires developing a qualitative version of the calculus. The modeling primitives induce two kinds of behavior, intrastate and interstate, which are governed by different laws. Third, we present algorithms for determining the behavior of a composite device from the generic behavior of its components. Fourth, we examine a theory of explanation for these predictions based on logical proof. Fifth, we introduce causality as an ontological commitment for explaining how devices behave.
1,550 citations
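As an illustration of the qualitative sign algebra on which confluences (qualitative differential equations) rest, here is a small Python sketch; the valve confluence and variable names are assumptions for the example, and the code is not the authors' system.

```python
# A minimal sketch of the sign algebra underlying confluences. Values are
# '+', '-', '0', or '?' (unknown/ambiguous). Illustration only.

def q_add(a, b):
    """Qualitative addition of two signs."""
    if a == '0':
        return b
    if b == '0':
        return a
    if a == b:
        return a
    return '?'               # '+' plus '-' cannot be resolved qualitatively

def satisfies(confluence, values):
    """confluence: list of variable names whose qualitative sum must be '0'
    (terms pre-negated where needed); values: dict name -> sign."""
    total = '0'
    for var in confluence:
        total = q_add(total, values[var])
    return total in ('0', '?')   # '?' means the confluence is not (yet) violated

# Hypothetical valve confluence dQ = dA + dP (flow changes with area and pressure),
# rewritten as dA + dP + (-dQ) = 0 with the sign of dQ pre-negated.
values = {'dA': '+', 'dP': '0', 'neg_dQ': '-'}
print(q_add('+', '-'))                                    # -> '?'
print(satisfies(['dA', 'dP', 'neg_dQ'], values))          # -> True
```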
TL;DR: In this paper, the authors describe a system that reasons from first principles, i.e., using knowledge of structure and behavior, to deal with situations that are novel in the sense that their outward manifestations may not have been encountered previously.
Abstract: We describe a system that reasons from first principles, i.e., using knowledge of structure and behavior. The system has been implemented and tested on several examples in the domain of troubleshooting digital electronic circuits. We give an example of the system in operation, illustrating that this approach provides several advantages, including a significant degree of device independence, the ability to constrain the hypotheses it considers at the outset, yet deal with a progressively wider range of problems, and the ability to deal with situations that are novel in the sense that their outward manifestations may not have been encountered previously. As background we review our basic approach to describing structure and behavior, then explore some of the technologies used previously in troubleshooting. Difficulties encountered there lead us to a number of new contributions, four of which make up the central focus of this paper. • We describe a technique we call constraint suspension that provides a powerful tool for troubleshooting. • We point out the importance of making explicit the assumptions underlying reasoning and describe a technique that helps enumerate assumptions methodically. • The result is an overall strategy for troubleshooting based on the progressive relaxation of underlying assumptions. The system can focus its efforts initially, yet will methodically expand its focus to include a broad range of faults. • Finally, abstracting from our examples, we find that the concept of adjacency proves to be useful in understanding why some faults are especially difficult to diagnose and why multiple representations are useful.
959 citations
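A minimal sketch of the constraint-suspension idea follows, under the simplifying assumption that every value in the toy device is observable; the two-component device and its constraints are invented for illustration, not taken from the paper.

```python
# Constraint suspension, sketched: each component is a constraint relating measured
# values; a candidate fault is any component whose constraint can be suspended so
# that the remaining constraints are consistent with the observations.

def consistent(constraints, obs):
    return all(check(obs) for check in constraints)

def suspect_components(components, obs):
    """components: dict name -> constraint function over the observations."""
    suspects = []
    for name in components:
        remaining = [c for n, c in components.items() if n != name]
        if consistent(remaining, obs):
            suspects.append(name)       # suspending this component explains the data
    return suspects

# Hypothetical device: adder A1 computes x = a + b, multiplier M1 computes y = x * c.
components = {
    'A1': lambda o: o['x'] == o['a'] + o['b'],
    'M1': lambda o: o['y'] == o['x'] * o['c'],
}

# Observed behaviour: the adder's output is wrong; the multiplier is consistent
# with the (wrong) intermediate value it actually received.
obs = {'a': 1, 'b': 2, 'x': 5, 'c': 4, 'y': 20}
print(suspect_components(components, obs))    # -> ['A1']
```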
TL;DR: DART differs from previous approaches to diagnosis taken in the design-automation community in that it is more general and in many cases more efficient; its device-independent language and inference procedure allow it to be applied to a wide class of devices ranging from digital logic to nuclear reactors.
Abstract: This paper describes a device-independent diagnostic program called DART. DART differs from previous approaches to diagnosis taken in the Artificial Intelligence community in that it works directly from design descriptions rather than MYCIN-like symptom-fault rules. DART differs from previous approaches to diagnosis taken in the design-automation community in that it is more general and in many cases more efficient. DART uses a device-independent language for describing devices and a device-independent inference procedure for diagnosis. The resulting generality allows it to be applied to a wide class of devices ranging from digital logic to nuclear reactors. Although this generality engenders some computational overhead on small problems, it facilitates the use of multiple design descriptions and thereby makes possible combinatoric savings that more than offset this overhead on problems of realistic size.
598 citations
TL;DR: This paper presents a qualitative-reasoning method for predicting the behavior of mechanisms characterized by continuous, time-varying parameters that can detect a previously unsuspected landmark point at which the system is in stable equilibrium.
Abstract: This paper presents a qualitative-reasoning method for predicting the behavior of mechanisms characterized by continuous, time-varying parameters. The structure of a mechanism is described in terms of a set of parameters and the constraints that hold among them: essentially a ‘qualitative differential equation’. The qualitative-behavior description consists of a discrete set of time-points, at which the values of the parameters are described in terms of ordinal relations and directions of change. The behavioral description, or envisionment, is derived by two sets of rules: propagation rules which elaborate the description of the current time-point, and prediction rules which determine what is known about the next qualitatively distinct state of the mechanism. A detailed example shows how the envisionment method can detect a previously unsuspected landmark point at which the system is in stable equilibrium.
552 citations
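The following Python sketch illustrates, under heavily simplified assumptions, the flavour of the prediction step: a quantity that stops changing strictly between known landmarks introduces a new landmark, a candidate stable equilibrium. The state encoding and transition rules are illustrative, not the paper's.

```python
# Qualitative states pair a value (a landmark or an open interval between landmarks)
# with a direction of change; a prediction rule introduces a new landmark when a
# changing quantity becomes steady inside an interval. Simplified illustration only.

landmarks = ['0', 'full']                      # ordered landmark values

def next_states(value, direction):
    """Enumerate qualitatively possible successors of (value, direction)."""
    if direction == 'std':
        return [(value, 'std')]                # a steady state persists
    if direction == 'inc' and value == ('0', 'full'):
        return [
            (('0', 'full'), 'inc'),            # still rising in the interval
            ('full', 'std'),                   # reaches the upper landmark
            (('0', 'full'), 'std'),            # becomes steady inside the interval
        ]
    return []

for value, direction in next_states(('0', 'full'), 'inc'):
    if direction == 'std' and isinstance(value, tuple):
        # A quantity that stops changing strictly between known landmarks
        # defines a new landmark value: a candidate stable equilibrium.
        landmarks.insert(1, 'L-new')
    print((value, direction))

print(landmarks)    # -> ['0', 'L-new', 'full']
```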
TL;DR: A domain-independent planning program that supports both automatic and interactive generation of hierarchical, partially ordered plans is described, and an improved formalism makes extensive use of constraints and resources to represent domains and actions more powerfully.
Abstract: A domain-independent planning program that supports both automatic and interactive generation of hierarchical, partially ordered plans is described. An improved formalism makes extensive use of constraints and resources to represent domains and actions more powerfully. The formalism also offers efficient methods for representing properties of objects that do not change over time, allows specification of the plan rationale (which includes scoping of conditions and appropriately relating different levels in the hierarchy), and provides the ability to express deductive rules for deducing the effects of actions. The implications of allowing parallel actions in a plan or problem solution are discussed, and new techniques for efficiently detecting and remedying harmful parallel interactions are presented. The most important of these techniques, reasoning about resources, is emphasized and explained. The system supports concurrent exploration of different branches in the search, making best-first search easy to implement.
411 citations
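As a rough illustration of reasoning about resources to detect harmful parallel interactions, here is a Python sketch; the actions and resource names are hypothetical, and real planners use far richer representations than this.

```python
# Treat objects as resources and flag parallel actions that claim the same resource
# as a harmful interaction to be remedied (e.g. by ordering the actions).

from itertools import combinations

def resource_conflicts(parallel_actions):
    """parallel_actions: dict action-name -> set of resources it requires exclusively."""
    conflicts = []
    for (a, ra), (b, rb) in combinations(parallel_actions.items(), 2):
        shared = ra & rb
        if shared:
            conflicts.append((a, b, shared))
    return conflicts

plan_branch = {
    'move-block-A': {'robot-arm', 'block-A'},
    'move-block-B': {'robot-arm', 'block-B'},
    'open-door':    {'door-1'},
}

for a, b, shared in resource_conflicts(plan_branch):
    print(f"order {a} and {b}: both need {shared}")
# -> order move-block-A and move-block-B: both need {'robot-arm'}
```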
TL;DR: A new approach to knowledge representation where knowledge bases are characterized not in terms of the structures they use to represent knowledge, but functionally, in terms of what they can be asked or told about some domain, which cleanly separates functionality from implementation structure.
Abstract: We present a new approach to knowledge representation where knowledge bases are characterized not in terms of the structures they use to represent knowledge, but functionally, in terms of what they can be asked or told about some domain. Starting with a representation system that can be asked questions and told facts in a full first-order logical language, we then define ask- and tell-operations over an extended language that can refer not only to the domain but to what the knowledge base knows about that domain. The major technical result is that the resulting knowledge, which now includes auto-epistemic aspects, can still be represented symbolically in first-order terms. We also consider extensions to the framework such as defaults and definitional facilities. The overall result is a formal foundation for knowledge representation which, in accordance with current principles of software design, cleanly separates functionality from implementation structure.
385 citations
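The functional view can be suggested with a toy Python sketch: the knowledge base is defined only by its TELL and ASK operations, and the internal structure is hidden behind them. This handles only ground facts, not the full first-order or auto-epistemic language of the paper.

```python
# A knowledge base characterized only by its interface: tell() asserts a fact,
# ask() queries one. The internal structure (here a set of ground atoms with a
# trivial prover) could be swapped out without changing the interface.

class KB:
    def __init__(self):
        self._facts = set()          # hidden implementation structure

    def tell(self, sentence):
        """Assert a fact about the domain."""
        self._facts.add(sentence)

    def ask(self, query):
        """Is the query known to be true? (Unknown is reported as False.)"""
        return query in self._facts

kb = KB()
kb.tell(('Parent', 'alice', 'bob'))
print(kb.ask(('Parent', 'alice', 'bob')))     # -> True
print(kb.ask(('Parent', 'bob', 'alice')))     # -> False: not known, not refuted
```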
TL;DR: In this article, the problem of motion measurement is formulated as the computation of an instantaneous two-dimensional velocity field, and a smoothness constraint of the velocity field is formulated based on the physical assumption that surfaces are generally smooth.
Abstract: The organization of movement in a changing image provides a valuable source of information for analyzing the environment in terms of objects, their motion in space, and their three-dimensional structure. A description of this movement is not provided to our visual system directly, however; it must be inferred from the pattern of changing intensity that reaches the eye. This paper examines the problem of motion measurement, which we formulate as the computation of an instantaneous two-dimensional velocity field. Initial motion measurements occur at the location of significant intensity changes, as suggested by Marr and Ullman [1]. These measurements provide only one component of velocity, and must be integrated to compute the two-dimensional velocity field. A fundamental problem for this integration is that motion is not determined uniquely from the changing image. An additional constraint of smoothness of the velocity field is formulated, based on the physical assumption that surfaces are generally smooth, allowing the computation of a unique solution. A theoretical analysis of the conditions under which this computation yields the correct velocity field suggests that the solution is physically plausible; empirical studies show the results to be consistent with human motion perception.
349 citations
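A rough numerical sketch of how a smoothness constraint selects a unique velocity field from sparse measurements follows; it uses scalar velocities on a 1-D contour and a simple iterative averaging scheme, which is an assumption for illustration rather than the authors' formulation.

```python
# Initial measurements constrain the velocity only at some contour points; a
# smoothness constraint over neighbouring points fills in the rest by repeatedly
# averaging neighbours while pulling toward the measurements where they exist.

measured = [1.0, None, None, 3.0, None]    # sparse measurements along a contour
v = [0.0] * len(measured)                  # velocity estimate at each point

for _ in range(500):
    new_v = v[:]
    for i in range(len(v)):
        neighbours = [v[j] for j in (i - 1, i + 1) if 0 <= j < len(v)]
        smooth = sum(neighbours) / len(neighbours)
        if measured[i] is not None:
            new_v[i] = 0.5 * measured[i] + 0.5 * smooth   # honour the measurement
        else:
            new_v[i] = smooth                             # pure smoothness elsewhere
    v = new_v

print([round(x, 2) for x in v])
# -> roughly [1.44, 1.89, 2.33, 2.78, 2.78]: values vary smoothly along the contour
```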
TL;DR: A polymorphic type scheme for Prolog is described which makes static type checking possible and it is observed that the type resolution problem for a Prolog program is another Prolog (meta)program.
Abstract: We describe a polymorphic type scheme for Prolog which makes static type checking possible. Polymorphism gives a good degree of flexibility to the type system, and makes it intrude very little on a user's programming style. The only additions to the language are type declarations, which an interpreter can ignore if it so desires, with the guarantee that a well-typed program will behave identically with or without type checking. Our implementation is discussed and we observe that the type resolution problem for a Prolog program is another Prolog (meta)program.
316 citations
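The core mechanism of such a checker is unification over type terms. The following Python sketch checks a call against a polymorphic declaration like append(list(T), list(T), list(T)); the term encoding and variable convention are assumptions for the example, and the real checker is, as the authors note, itself a Prolog metaprogram.

```python
# Type terms are either variables (strings starting with an upper-case letter)
# or (functor, args...) tuples. Unification binds type variables consistently.

def walk(t, subst):
    while isinstance(t, str) and t[0].isupper() and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a[0].isupper():
        return {**subst, a: b}
    if isinstance(b, str) and b[0].isupper():
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and a[0] == b[0] and len(a) == len(b):
        for x, y in zip(a[1:], b[1:]):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None    # type clash

list_of = lambda t: ('list', t)
declared = (list_of('T'), list_of('T'), list_of('T'))            # append/3 declaration
call = (list_of(('int',)), list_of(('int',)), list_of('X'))      # a particular call

subst = {}
for d, c in zip(declared, call):
    subst = unify(d, c, subst)
    if subst is None:
        break
print("well-typed" if subst is not None else "type error")       # -> well-typed
```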
TL;DR: Recent insights into the AM program spawn questions about "where the meaning really resides" in the concepts discovered by AM, leading to an appreciation of the crucial and unique role of representation in theory formation, a role involving the relationship between Form and Content.
Abstract: The AM program was constructed by Lenat in 1975 as an early experiment in getting machines to learn by discovery. In the preceding article in this issue of the AI Journal, Ritchie and Hanna focus on that work as they raise several fundamental questions about the methodology of artificial intelligence research. Part of this paper is a response to the specific points they make. It is seen that the difficulties they cite fall into four categories, the most serious of which are omitted heuristics, and the most common of which are miscommunications. Their considerations, and our post-AM work on machines that learn, have clarified why AM succeeded in the first place, and why it was so difficult to use the same paradigm to discover new heuristics. Those recent insights spawn questions about "where the meaning really resides" in the concepts discovered by AM. This in turn leads to an appreciation of the crucial and unique role of representation in theory formation, specifically the benefits of having syntax mirror semantics. Some criticism of the paradigm of this work arises due to the ad hoc nature of many pieces of the work; at the end of this article we examine how this very adhocracy may be a potential source of power in itself.
[...]
TL;DR: A theory of commonsense understanding of the behavior of electronic circuits, based on the intuitive qualitative reasoning electrical engineers use when they analyze circuits; although qualitative in nature, this reasoning captures important quantitative aspects of circuit functioning.
Abstract: This paper presents a theory of commonsense understanding of the behavior of electronic circuits. It is based on the intuitive qualitative reasoning electrical engineers use when they analyze circuits. This intuitive reasoning provides a great deal of important information about the operation of the circuit, which although qualitative in nature, describes important quantitative aspects of circuit functioning (feedback paths, stability, impedance and gain estimates, etc.). One aspect of the theory, causal analysis, describes how the behavior of the individual components can be combined to explain the behavior of composite systems. Another aspect of the theory, teleological analysis, describes how the notion that the system has a purpose can be used to structure and aid this causal analysis. The theory is implemented in a computer program, EQUAL, which, given a circuit topology, can construct by qualitative causal analysis a description of the mechanism by which the circuit operates. This mechanism is then parsed by a grammar for circuit functions.
TL;DR: This volume brings together current work on qualitative reasoning, and presents knowledge bases for a number of very different domains, from heat flow, to transistors, to digital computation.
Abstract: This volume brings together current work on qualitative reasoning. Previous publication has been primarily in scattered conference proceedings. The appearance of this volume reflects the maturity of qualitative reasoning as a research area, and the growing interest in problems of reasoning about physical systems. The papers present knowledge bases for a number of very different domains, from heat flow, to transistors, to digital computation. Anyone concerned with automated reasoning about the real (physical) world should read and understand this material.
TL;DR: Temporal qualitative analysis, as discussed by the authors, is a technique for analyzing the qualitative large-signal behavior of MOS circuits that straddle the line between the digital and analog domains; it rests on four components: a qualitative representation of open regions separated by boundaries, circuit-theory constraints ordered causally by the designer's intuition of electrodynamics, transition analysis of how quantities pass between regions over time, and feedback analysis to resolve ambiguities.
Abstract: With the push towards submicron technology, transistor models have become increasingly complex. The number of components in integrated circuits has forced designers' efforts and skills towards higher levels of design. This has created a gap between design expertise and the performance demands increasingly imposed by the technology. To alleviate this problem, software tools must be developed that provide the designer with expert advice on circuit performance and design. This requires a theory that links the intuitions of an expert circuit analyst with the corresponding principles of formal theory (i.e., algebra, calculus, feedback analysis, network theory, and electrodynamics), and that makes each underlying assumption explicit. Temporal qualitative analysis is a technique for analyzing the qualitative large signal behavior of MOS circuits that straddle the line between the digital and analog domains. Temporal qualitative analysis is based on the following four components: First, a qualitative representation is composed of a set of open regions separated by boundaries. These boundaries are chosen at the appropriate level of detail for the analysis. This concept is used in modeling time, space, circuit state variables, and device operating regions. Second, constraints between circuit state variables are established by circuit theory. At a finer time scale, the designer's intuition of electrodynamics is used to impose a causal relationship among these constraints. Third, large signal behavior is modeled by transition analysis, using continuity and theorems of calculus to determine how quantities pass between regions over time. Finally, feedback analysis uses knowledge about the structure of equations and the properties of structure classes to resolve ambiguities.
TL;DR: A partial solution to the problem of constructing a fuzzy map is presented, an algorithm that assimilates a fact first by imposing constraints on the fuzzy coordinates of the objects involved, then by rearranging or growing the tree of frames of reference.
Abstract: Planning routes and executing them requires both topological and metric information. A natural implementation of a ‘cognitive map’ might therefore consist of an assertional data base for topological information and a ‘fuzzy map’ for the metric information. A fuzzy map captures facts about objects by recording their relative positions, orientations, and scales in convenient frames of reference. It is fuzzy in the sense that coordinates are specified to lie in a range rather than having fixed values. The fuzzy map allows easy retrieval of information. The same information is also represented in a discrimination tree, which allows an object to be retrieved given its location and other attributes. The problem of constructing a fuzzy map is more difficult; we present a partial solution, an algorithm that assimilates a fact first by imposing constraints on the fuzzy coordinates of the objects involved, then by rearranging or growing the tree of frames of reference. Route planning is modelled as a process of finding the overall direction and topology of the path, then filling in the details by deciding how to go around barriers. It uses the retrieval algorithms. Our program SPAM carries out all these processes.
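A minimal sketch of the 'fuzzy' coordinate idea in Python, assuming a single frame of reference and one interval per axis; frame-of-reference trees, orientation, and scale are omitted, and the object names are invented for illustration.

```python
# Each object's position on an axis is an interval rather than a number, and
# assimilating a new fact narrows the interval by intersection.

def intersect(r1, r2):
    lo, hi = max(r1[0], r2[0]), min(r1[1], r2[1])
    if lo > hi:
        raise ValueError("inconsistent facts about this object")
    return (lo, hi)

fuzzy_map = {'tree': {'x': (0.0, 100.0), 'y': (0.0, 100.0)}}

def assimilate(obj, axis, bounds):
    """Record a fact constraining obj's coordinate on one axis to lie in bounds."""
    fuzzy_map.setdefault(obj, {'x': (-1e9, 1e9), 'y': (-1e9, 1e9)})
    fuzzy_map[obj][axis] = intersect(fuzzy_map[obj][axis], bounds)

assimilate('tree', 'x', (10.0, 40.0))    # e.g. "the tree is west of the fence"
assimilate('tree', 'x', (25.0, 80.0))    # e.g. "the tree is east of the path"
print(fuzzy_map['tree']['x'])            # -> (25.0, 40.0): both facts respected
```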
TL;DR: VERIFY, as described in this paper, is a Prolog program that attempts to prove the correctness of a digital design by showing that the behavior inferred from the interconnection of its parts and their behaviors is equivalent to the specified behavior.
Abstract: VERIFY is a Prolog program that attempts to prove the correctness of a digital design. It does so by showing that the behavior inferred from the interconnection of its parts and their behaviors is equivalent to the specified behavior. It has successfully verified large designs involving many thousands of transistors.
TL;DR: An algorithm to obtain local surface orientation from the apparent surface-pattern distortion in an image is described, and a spherical projection to model perspective imaging is proposed.
Abstract: This paper describes an algorithm to obtain local surface orientation from the apparent surface-pattern distortion in an image. We propose a spherical projection to model perspective imaging. A mapping is defined based on the measurement of the local distortions of a repeated known texture pattern due to the image projection. This mapping maps an apparent shape on the image sphere to a locus of possible surface orientations on the Gaussian sphere. An iterative constraint propagation algorithm with the orientations at occluding boundaries reduces possible surface orientations to a unique orientation. This algorithm can recover local surface orientation as well as interpolate surface orientations where no information is available. The algorithm is applied to a real image to demonstrate its performance.
TL;DR: An abstract computational system for extended person-machine interface that incorporates many of the rules used by individuals in everyday interactions; these rules function in place of explicit meta-communication about the structure of an ongoing exchange.
Abstract: Current computer natural language systems lack discourse capabilities. They mainly treat utterances in isolation and are not able to track the coherence of extended dialogue. In general, however, human communication and task performance are not accomplished in this manner. Useful interface systems should incorporate extended conversational power. In this paper, we present an abstract computational system for extended person-machine interface. The system is theoretically motivated by an investigation into the human conversational system. The computer system incorporates many of the rules used by individuals in everyday interactions. Use of these rules enables implicit communication between conversants about their underlying models of an interaction. They function in place of explicit meta-communication about the structure of an ongoing exchange. Such communication is needed because building compatible models of an interaction is essential for effective interaction. Providing a comparable system of rules for the computer is then mandatory if we expect it to relate subsequent requests and exchanges to earlier ones.
TL;DR: The new algorithm is proved never to expand more nodes than B or A* and to expand far fewer in some cases; the paper also proves that no overall optimal algorithm exists if the cost of an algorithm is measured by the total number of node expansions.
Abstract: This paper describes an improved version of two previously published algorithms in the area: A* and B. The new approach is based on considering the estimate ĥ(n) on node n as a variable rather than as a constant. The new algorithm thus improves the estimate as it goes on, avoiding some useless node expansions. It is proved to never expand more nodes than B or A* and to expand a much smaller number of them in some cases. Another result of the paper is a proof that no overall optimal algorithm exists if the cost of an algorithm is measured by the total number of node expansions.
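The following Python sketch conveys the flavour of treating ĥ(n) as a variable: a best-first search that backs up a node's estimate from its successors when expansion shows the estimate was too optimistic. It is a generic illustration under that assumption, not a faithful reproduction of the algorithm in the paper; the graph and estimates are invented for the example.

```python
import heapq

def search(start, goal, succ, h):
    """succ(n) -> list of (neighbour, edge_cost); h is a dict of estimates (mutated)."""
    open_heap = [(h[start], 0, start, [start])]
    best_g = {start: 0}
    while open_heap:
        f, g, n, path = heapq.heappop(open_heap)
        if n == goal:
            return path, g
        children = succ(n)
        if children:
            # back up: n's estimate is at least the best (cost + estimate) of its children
            h[n] = max(h[n], min(c + h[m] for m, c in children))
        for m, c in children:
            if g + c < best_g.get(m, float('inf')):
                best_g[m] = g + c
                heapq.heappush(open_heap, (g + c + h[m], g + c, m, path + [m]))
    return None, float('inf')

graph = {'S': [('A', 1), ('B', 4)], 'A': [('G', 5)], 'B': [('G', 1)], 'G': []}
h = {'S': 2, 'A': 1, 'B': 1, 'G': 0}     # h['A'] is too optimistic and gets raised
path, cost = search('S', 'G', lambda n: graph[n], h)
print(path, cost)    # -> ['S', 'B', 'G'] 5
```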
TL;DR: The problem of building a stable base for further research in artificial intelligence is illustrated by referring to Lenat's am program, in which the techniques employed are somewhat obscure in spite of the impressive performance.
Abstract: Much artificial intelligence research is based on the construction of large impressive-looking programs, the theoretical content of which may not always be clearly stated. This is unproductive from the point of view of building a stable base for further research. We illustrate this problem by referring to Lenat's am program, in which the techniques employed are somewhat obscure in spite of the impressive performance.
TL;DR: This paper presents a formulation of B&B general enough to include previous formulations as special cases, and shows how two well-known AI search procedures are special cases of this general formulation.
Abstract: Branch and Bound (B&B) is a problem-solving technique which is widely used for various problems encountered in operations research and combinatorial mathematics. Various heuristic search procedures used in artificial intelligence (AI) are considered to be related to B&B procedures. However, in the absence of any generally accepted terminology for B&B procedures, there have been widely differing opinions regarding the relationships between these procedures and B&B. This paper presents a formulation of B&B general enough to include previous formulations as special cases, and shows how two well-known AI search procedures (A* and AO*) are special cases of this general formulation.
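For illustration, here is a generic branch-and-bound sketch in Python: split the partial solution with the smallest lower bound and prune against the incumbent. With the bound chosen as path cost so far plus a heuristic, it behaves like an A*-style search, which is the kind of correspondence the paper formalizes; the example problem and encoding are assumptions, not the paper's formulation.

```python
import heapq

def branch_and_bound(root, branch, bound, is_complete, cost):
    best, best_cost = None, float('inf')
    pool = [(bound(root), 0, root)]
    tiebreak = 1
    while pool:
        b, _, node = heapq.heappop(pool)
        if b >= best_cost:
            continue                      # prune: cannot beat the incumbent
        if is_complete(node):
            best, best_cost = node, cost(node)
            continue
        for child in branch(node):
            heapq.heappush(pool, (bound(child), tiebreak, child))
            tiebreak += 1
    return best, best_cost

# Example: shortest path as B&B over partial paths, bound = cost so far (no heuristic).
graph = {'S': [('A', 1), ('B', 4)], 'A': [('G', 5)], 'B': [('G', 1)], 'G': []}
path_cost = lambda p: sum(c for _, c in p[1:])
print(branch_and_bound(
    root=[('S', 0)],
    branch=lambda p: [p + [(m, c)] for m, c in graph[p[-1][0]]],
    bound=path_cost,
    is_complete=lambda p: p[-1][0] == 'G',
    cost=path_cost,
))   # -> ([('S', 0), ('B', 4), ('G', 1)], 5)
```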
TL;DR: In this article, a new approach is given to detect surface orientation and motion from the texture on the surface by making use of a mathematical principle called 'stereology'; the scheme can also detect surface motion relative to the viewer by computing features of the texture at one time and a short time later.
Abstract: A new approach is given to detect the surface orientation and motion from the texture on the surface by making use of a mathematical principle called ‘stereology’. Information about the surface orientation is contained in ‘features’ computed by scanning the image by parallel lines and counting the number of intersections with the curves of the texture. A synthetic example is given to illustrate the technique. This scheme can also detect surface motions relative to the viewer by computing features of its texture at one time and a short time later. The motion is specified by explicit formulae of the computed features.
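A small Python sketch of the intersection-counting features: scan a binary texture image with parallel lines and count crossings of texture boundaries; anisotropy in the counts is what carries the orientation information. The synthetic image is an assumption for illustration, and the paper's actual formulae are not reproduced here.

```python
def crossings(image, direction):
    """Count 0/1 transitions along rows ('horizontal') or columns ('vertical')."""
    rows, cols = len(image), len(image[0])
    if direction == 'horizontal':
        lines = [[image[r][c] for c in range(cols)] for r in range(rows)]
    else:
        lines = [[image[r][c] for r in range(rows)] for c in range(cols)]
    total = 0
    for line in lines:
        total += sum(1 for a, b in zip(line, line[1:]) if a != b)
    return total

# A texture of vertical stripes, as it might appear after horizontal foreshortening:
# stripes become narrow, so horizontal scan lines cross many boundaries.
image = [[1, 0, 1, 0, 1, 0]] * 4

h = crossings(image, 'horizontal')
v = crossings(image, 'vertical')
print(h, v)   # -> 20 0 : strongly anisotropic counts hint at surface slant
```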
TL;DR: The most primitive case of deciding what to do next, selecting between two independent alternatives, is examined, and the value of the control knowledge that makes the decision is determined.
Abstract: The most basic activity performed by an intelligent agent is deciding what to do next. Usually, the decision takes the form of selecting, from among many applicable methods, one method to try first, or opting to expand a particular node in a simple search. The most primitive case is selecting between two independent alternatives. Below, this case is examined and the value of the control knowledge that makes the decision is determined. Another result derived is the sensitivity of the expected value of control knowledge as a function of the accuracy of the parameters used to make these control decisions.
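The kind of calculation involved can be sketched as follows, assuming two independent methods with known success probabilities and costs; the numbers and the "value = expected saving over an uninformed ordering" formulation are illustrative assumptions, not the paper's exact analysis.

```python
# Two methods with success probabilities p and attempt costs c. Trying method 1
# first costs c1 plus, on failure, the cost of the second attempt. Control
# knowledge that always picks the cheaper ordering saves the difference relative
# to an uninformed (random) choice of which method to try first.

def expected_cost(first, second):
    (p1, c1), (p2, c2) = first, second
    return c1 + (1 - p1) * c2          # pay for the second try only on failure

m1 = (0.3, 2.0)    # (probability of success, cost of attempt) -- illustrative numbers
m2 = (0.8, 5.0)

cost_1_first = expected_cost(m1, m2)   # 2 + 0.7 * 5 = 5.5
cost_2_first = expected_cost(m2, m1)   # 5 + 0.2 * 2 = 5.4
uninformed = 0.5 * (cost_1_first + cost_2_first)
informed = min(cost_1_first, cost_2_first)

print(round(uninformed - informed, 2))   # -> 0.05 : value of the control knowledge
```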
TL;DR: A comprehensive treatment of the use of an automated reasoning program to answer certain previously open questions from equivalential calculus with sufficient detail both to permit the work to be duplicated and to enable one to consider other applications of the techniques.
Abstract: The field of automated reasoning is an outgrowth of the field of automated theorem proving. The difference in the two fields is not so much in the procedures on which they rest, but rather in the way the corresponding programs are used. Here we present a comprehensive treatment of the use of an automated reasoning program to answer certain previously open questions from equivalential calculus. The questions are answered with a uniform method that employs schemata to study the infinite domain of theorems deducible from certain formulas. We include sufficient detail both to permit the work to be duplicated and to enable one to consider other applications of the techniques. Perhaps more important than either the results or the methodology is the demonstration of how an automated reasoning program can be used as an assistant and a colleague. Precise evidence is given of the nature of this assistance.
TL;DR: This paper shows how the chunk libraries are compiled and how CHUNKER works, and discusses plans for extending it to play the whole domain of king and pawn endings.
Abstract: CHUNKER is a chess program that uses chunked knowledge to improve its performance. Its domain is a subset of king and pawn endings in chess that has been studied for over 300 years. CHUNKER has a large library of chunk instances where each chunk type has a property list and each instance has a set of values for these properties. This allows CHUNKER to reason about positions that come up in the search that would otherwise have to be handled by means of additional search. Thus the program is able to solve the most difficult problem of its present domain (a problem that would require 45 ply of search and on the order of 10^13 years of CPU time to be solved by the best of present-day chess programs) in 18 ply and one minute of CPU time. Further, CHUNKER is undoubtedly the world's foremost expert in its domain; it has discovered two mistakes in the literature and has been instrumental in discovering a new theorem about the domain that allows positions to be assessed with a new degree of ease and confidence. In this paper we show how the libraries are compiled, how CHUNKER works, and discuss our plans for extending it to play the whole domain of king and pawn endings.
TL;DR: This method, which is basically a constraint refinement search, provides for a much more complete analysis of carbon-13 nuclear magnetic resonance spectra of complex organic molecules than any approach currently utilized.
Abstract: An expert system has been developed to aid in the analysis of carbon-13 nuclear magnetic resonance (¹³C NMR) spectra of complex organic molecules. This system uses a knowledge base of rules relating substructural and spectral features; these rules are derived automatically from data for known structures. Such rules have a number of current, practical applications relating to spectrum prediction. They also constitute the basis of a method for the structural interpretation of ¹³C spectral data of unknown compounds. This method, which is basically a constraint refinement search, provides for a much more complete analysis of such data than any approach currently utilized.