
Showing papers on "Commonsense reasoning published in 1988"


Journal ArticleDOI
01 May 1988
TL;DR: This paper shows that the difficulties McDermott described are a result of insisting on using logic as the language of commonsense reasoning, and if (Bayesian) probability is used, none of the technical difficulties found in using logic arise.
Abstract: The paper examines issues connected with the choice of the best method for representing and reasoning about common sense. McDermott (1978) has shown that a direct translation of common sense reasoning into logical form leads to insurmountable difficulties. It is shown, in the present work, that if Bayesian probability is used instead of logic as the language of such reasoning, none of the technical difficulties found in using logic arise. Bayesian inference is applied to a simple example of linguistic information to illustrate the potential of this type of inference for artificial intelligence.
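
As a toy illustration of the paper's thesis (the example is invented here, not taken from the paper): a Bayesian reasoner handles an exception to a commonsense rule like "birds fly" by smoothly updating a degree of belief, where a direct logical translation would face a contradiction.

```python
# Minimal sketch (invented example): belief revision by Bayes' rule.
def posterior(prior, likelihood_true, likelihood_false):
    """P(hypothesis | evidence) via Bayes' rule."""
    num = likelihood_true * prior
    return num / (num + likelihood_false * (1.0 - prior))

# Prior belief that an arbitrary bird flies.
p_flies = 0.95
# Evidence "it is a penguin" is far more likely among non-flying birds.
p_flies = posterior(p_flies, 0.001, 0.30)
print(round(p_flies, 4))  # 0.0596 -- belief drops smoothly; no contradiction
```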

161 citations


Journal ArticleDOI
01 Jan 1988
TL;DR: The work shows that qualitative simulation and consolidation work on different problems of qualitative reasoning, and that their differences and similarities lead to several implications about their role in qualitative reasoning.
Abstract: To understand commonsense reasoning, it is necessary to discover what kinds of problems a commonsense reasoner should be able to solve, what the reasoner needs in order to solve those problems, and the relationships among the various kinds of problem-solving abilities. Four methods for qualitative reasoning about the behavior of physical situations are examined. Three perform qualitative simulation, which determines the behavior of a situation by a qualitative analogue of numerical simulation. The fourth, called consolidation, derives the behavior of a situation by composing the behaviors of the situation's components. The work shows that qualitative simulation and consolidation address different problems of qualitative reasoning, and that their differences and similarities carry several implications about their respective roles.
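
The flavor of qualitative simulation can be sketched with sign arithmetic (a generic illustration, not code from any of the four methods reviewed): quantities take qualitative values in {-, 0, +}, and combining opposite signs is ambiguous, which is what forces a qualitative simulator to branch.

```python
# Sign algebra underlying many qualitative-simulation schemes (sketch).
def qadd(a, b):
    """Qualitative sum of two signed quantities; '?' means ambiguous."""
    if a == '0':
        return b
    if b == '0':
        return a
    if a == b:
        return a
    return '?'  # opposite signs: relative magnitudes unknown, so branch

# Net change in tank level = sign(inflow) "+" sign(-outflow)
print(qadd('+', '0'))  # '+': level rising
print(qadd('+', '-'))  # '?': depends on magnitudes the model abstracts away
```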

32 citations


Journal ArticleDOI
TL;DR: This paper proposes a method for combining the imprecision and uncertainty of fuzzy numbers based on the use of information measures, and develops a theoretical context for information measures on fuzzy values.

29 citations



Book ChapterDOI
01 Jan 1988
TL;DR: Reasoning about any realistic domain requires that some simplifications be made, and one way to summarize exceptions is to assign to each proposition a numerical measure of uncertainty and then combine these measures according to uniform syntactic principles, the way truth values are combined in logic.
Abstract: Reasoning about any realistic domain requires that some simplifications be made. The act of preparing knowledge to support reasoning requires that many facts are left unknown, unsaid, or crudely summarized. An alternative to the extremes of ignoring or enumerating exceptions is to summarize them, i.e., provide some warning signs to indicate which areas of the minefield are more dangerous than others. Summarization is essential to find a reasonable compromise between safety and speed of movement. One way to summarize exceptions is to assign to each proposition a numerical measure of uncertainty and then combine these measures according to uniform syntactic principles, the way truth values are combined in logic. Artificial Intelligence (AI) provides a computational model of intelligent behavior and commonsense reasoning. Probability theory provides a coherent account of how belief should change in light of partial or uncertain information. Network representations are not foreign to AI systems. Most reasoning systems encode relevancies using intricate systems of pointers, i.e., networks of indices that group facts into structures, such as frames, scripts, causal chains, and inheritance hierarchies.
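
One such "uniform syntactic principle" can be sketched as truth-functional combination of numeric certainties (an illustration invented here; it assumes independence, which is exactly the kind of crude summarization under discussion):

```python
# Truth-functional combination of uncertainty measures (sketch).
def p_and(p, q):
    return p * q             # conjunction, assuming independence

def p_or(p, q):
    return p + q - p * q     # disjunction ("noisy-OR" style)

# "Tweety is a bird" and "birds fly" combined like truth values:
print(round(p_and(0.9, 0.95), 3))  # 0.855
print(round(p_or(0.5, 0.5), 2))    # 0.75
```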

10 citations


Proceedings Article
21 Aug 1988
TL;DR: A radically different semantics for reflexives is presented, based on nonmonotonic inheritance and an extension to Touretzky's inferential distance ordering, which can derive new generic reflexive statements as well as statements about individuals.
Abstract: Generic reflexive statements such as Elephants love themselves have traditionally been formalized using some variant of predicate logic, with variables to mark coreferentiality. We present a radically different semantics for reflexives, based on nonmonotonic inheritance and an extension to Touretzky's inferential distance ordering. Our system can derive new generic reflexive statements as well as statements about individuals. And unlike the leading predicate logic-based approaches, our formalism does not use variables; this brings it closer in structure to actual human languages. The significance of this work for AI is its demonstration of the benefits of a non-classical knowledge representation for analyzing commonsense reasoning phenomena.
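
The specificity idea behind an inferential-distance ordering can be caricatured in a few lines (the hierarchy, property names, and "royal elephant" exception are invented for illustration): an assertion on a more specific class defeats one inherited from a more general class.

```python
# Nonmonotonic inheritance by specificity (sketch; data invented).
PARENT = {'royal_elephant': 'elephant', 'elephant': 'animal'}
PROPS = {
    'elephant': {'loves_self': True},        # "Elephants love themselves"
    'royal_elephant': {'loves_self': False}, # exception: more specific wins
}

def lookup(cls, prop):
    """Walk up the hierarchy; the most specific assertion wins."""
    while cls is not None:
        if prop in PROPS.get(cls, {}):
            return PROPS[cls][prop]
        cls = PARENT.get(cls)
    return None  # no assertion anywhere on the chain

print(lookup('elephant', 'loves_self'))        # True
print(lookup('royal_elephant', 'loves_self'))  # False (overrides inherited)
```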

8 citations


Journal ArticleDOI
TL;DR: The authors present, in a unifying framework, the principles of the IIICAD system, a generic design apprentice currently under development at CWI, which uses AI techniques in the following areas: formalization of design processes, extensional versus intensional descriptions, modal and other non-standard logics as knowledge representation tools.
Abstract: The authors present, in a unifying framework, the principles of the IIICAD system, a generic design apprentice currently under development at CWI. IIICAD incorporates three kinds of design knowledge. First, it has general knowledge about the stepwise nature of design based on a set-theoretic design theory. Second, it has domain-dependent knowledge belonging to the specific design areas where it may actually be used. Finally, it maintains knowledge about previously designed objects; this is somewhat similar to software reuse. Furthermore, IIICAD uses AI techniques in the following areas: formalization of design processes; extensional versus intensional descriptions; modal and other non-standard logics as knowledge representation tools; commonsense reasoning about the physical world (naive physics); coupling symbolic and numerical computation; integration of object-oriented and logic programming paradigms; and development of a common base language for design.

8 citations


Book ChapterDOI
Ronald R. Yager1
04 Jul 1988
TL;DR: A new operation called a non-monotonic intersection is introduced that has significant applications in default and other commonsense reasoning systems; it is neither commutative nor associative but rather supports a concept of priority amongst its arguments.
Abstract: We introduce a new operation called a non-monotonic intersection. This operation has significant applications in default and other commonsense reasoning systems. In addition to being non-monotonic, this operator is not a pointwise operator. Furthermore, it is neither commutative nor associative but rather supports a concept of priority amongst its arguments.
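
A prioritized intersection of this general flavor can be sketched over fuzzy subsets of a finite universe (the operator below is a plausible reconstruction, not necessarily Yager's exact definition): the lower-priority argument is discounted by its compatibility with the higher-priority one, so a conflicting default cannot overturn it.

```python
# Prioritized ("non-monotonic") fuzzy intersection -- sketch only.
def poss(A, B):
    """Degree to which A and B are jointly possible."""
    return max(min(A[x], B[x]) for x in A)

def nm_intersect(A, B):
    """A has priority; B applies only to the degree it is consistent with A."""
    c = poss(A, B)
    return {x: min(A[x], max(B[x], 1 - c)) for x in A}

A = {'a': 1.0, 'b': 0.0}
B = {'a': 0.0, 'b': 1.0}       # fully conflicts with A
print(nm_intersect(A, B))      # {'a': 1.0, 'b': 0.0}: A survives intact
print(nm_intersect(B, A))      # {'a': 0.0, 'b': 1.0}: not commutative
```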

4 citations


01 Jun 1988
TL;DR: Modal Situation Logic (KZ) is introduced, a modal-logic treatment of the situation calculus which, when combined with reflective reasoning, can provide step-by-step derivations for various commonsense reasoning problems.
Abstract: The most natural aspect of human thinking, namely commonsense, has been the hardest to grasp in the artificial intelligence field. Without commonsense, it is impossible for a robot to survive in the real world. Classical logics are not suitable for commonsense reasoning because, first, they are not flexible enough to allow conflicting conclusions from incomplete knowledge, and second, they are not powerful enough to reason about a changing world. In this research, we introduce Modal Situation Logic, KZ, a modal-logic treatment of the situation calculus. When combined with reflective reasoning (where a knowledge base can be described in terms of itself), this logic system can provide step-by-step derivations of various commonsense reasoning problems. Three of the most fundamental commonsense reasoning problems, namely the nonmonotonicity problem, the frame problem, and the belief revision problem, are investigated using our formalism to show its applicability.
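
The frame problem mentioned in the abstract can be illustrated with a tiny situation-calculus-style state transition (the action and fluents are invented): after an action, fluents it does not mention should persist by default, which is exactly what classical logic alone does not license.

```python
# The frame problem in miniature (sketch; fluents and actions invented).
def result(action, state):
    """Successor situation: change what the action affects, keep the rest."""
    state = set(state)
    if action == 'paint_red':
        state.discard('green')
        state.add('red')
    return state  # unmentioned fluents persist -- the "frame assumption"

s0 = {'green', 'on_table'}
s1 = result('paint_red', s0)
print(sorted(s1))  # ['on_table', 'red']: 'on_table' persisted for free
```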

3 citations


Proceedings ArticleDOI
10 Oct 1988
TL;DR: A computation architecture called 'Conposit' is outlined, which manipulates very-short-term complex symbolic data structures of types that are useful in high-level cognitive tasks such as commonsense reasoning, planning, and natural language understanding.
Abstract: A computation architecture called 'Conposit' is outlined. Conposit manipulates very-short-term complex symbolic data structures of types that are useful in high-level cognitive tasks such as commonsense reasoning, planning, and natural language understanding. Conposit's data structures are, essentially, temporary configurations of symbol occurrences in a two-dimensional array of registers. Each register is implementable as a neural subnetwork whose activation pattern realizes the symbol occurrence. The data structures are manipulated by condition-action rules that are realizable as further neural subnetworks attached to the array. In simulations, Conposit performs symbolic processing of types previously found difficult for connectionist/neural networks. The paper concentrates on a version of Conposit, simulated on a massively parallel processor, that embodies core aspects of P. Johnson-Laird's (1983) mental-model theory of human syllogistic reasoning. This version illustrates Conposit's power and flexibility, which arise from two unusual data-structure encoding techniques: relative-position encoding and pattern-similarity association.
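
Relative-position encoding can be caricatured in conventional code (the representation below is invented for illustration and is not Conposit's actual register semantics): a filler's role is given by where it sits relative to a handle cell in the 2D array, not by an explicit pointer.

```python
# Role-by-position in a 2D register array (sketch; encoding invented).
GRID = {}  # (row, col) -> symbol occurrence

def place_structure(handle, predicate, args):
    """Predicate at the handle cell; argument i at horizontal offset i+1."""
    r, c = handle
    GRID[(r, c)] = predicate
    for i, arg in enumerate(args):
        GRID[(r, c + 1 + i)] = arg

place_structure((0, 0), 'loves', ['elephant', 'elephant'])
# Argument roles are recovered purely from position relative to the handle:
print(GRID[(0, 1)], GRID[(0, 2)])  # elephant elephant
```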

2 citations


Book
01 Dec 1988
TL;DR: The semantics of non-monotonic entailment defined using partial interpretations and the complexity of Model-Preference Default theories are studied.
Abstract: Contents: General theory of cumulative inference; New results on semantical nonmonotonic reasoning; The semantics of non-monotonic entailment defined using partial interpretations; Hierarchic autoepistemic theories for nonmonotonic reasoning (preliminary report); Autoepistemic stable closures and contradiction resolution; Compiling circumscriptive theories into logic programs; A circumscriptive theorem prover; The complexity of Model-Preference Default theories; Massively parallel Assumption-based Truth Maintenance; An extended basic ATMS; A nonmonotonic logic for reasoning about speech acts and belief revision; Autoepistemic logic and formalization of commonsense reasoning (preliminary report); Nonmonotonic reasoning in temporal domains: the knowledge independence problem; Benchmark problems for formal nonmonotonic reasoning; Logics for inheritance theory.

Book ChapterDOI
01 Jan 1988
TL;DR: A method is proposed which combines three types of information for disambiguation: fixed and frequent phrases, syntactic information, and commonsense reasoning; it differs from prior work in using psycholinguistically motivated word meaning representations as the basis of a generalized disambiguation procedure.
Abstract: Computational lexical approaches to disambiguation divide into syntactic category assignment, such as whether farm is a noun or a verb (Milne, 1986), and word sense disambiguation within a syntactic category. The latter problem is the subject of this chapter. Assuming that word senses are listed together under one lexical entry in a given syntactic category, the problem is to select the correct one. One computational method of disambiguation is pattern matching, where the surrounding words frequently associated with a sense are used to disambiguate a word. Such methods are powerful and can be used to eliminate 70% of the ambiguity (Black, 1986). A second method employs a rich syntactic lexicon which includes selectional restrictions (Gross, 1985). A third method uses a combination of structural and conceptual analysis for disambiguation (Black, 1986). In the present work a method is proposed which combines three types of information to disambiguate: fixed and frequent phrases, syntactic information, and commonsense reasoning. It is similar to Black’s approach, but it differs in using psycholinguistically motivated word meaning representations as the basis of a generalized disambiguation procedure. The advantage of the method is that it employs computationally expensive commonsense reasoning only for the difficult cases, and not for simpler ones.
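
The cheap-to-expensive cascade can be sketched as follows (the lexicon, tags, and helper function are invented placeholders): inexpensive tests run first, and the costly commonsense stage is reached only when they fail.

```python
# Cheap-to-expensive disambiguation cascade (sketch; all data invented).
FIXED_PHRASES = {('run', 'of', 'the', 'mill'): 'ordinary'}
NOUN_SENSES = {'farm': 'farm/land'}

def disambiguate(word, context, pos_tag, commonsense):
    # Stage 1: fixed and frequent phrases.
    for phrase, sense in FIXED_PHRASES.items():
        if word in phrase and all(w in context for w in phrase):
            return sense
    # Stage 2: syntactic information (category-specific sense).
    if pos_tag == 'NOUN' and word in NOUN_SENSES:
        return NOUN_SENSES[word]
    # Stage 3: commonsense reasoning, only for the hard residue.
    return commonsense(word, context)

print(disambiguate('run', ['run', 'of', 'the', 'mill'], 'VERB',
                   lambda w, c: 'fallback'))  # 'ordinary'
```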

Book ChapterDOI
01 Jan 1988
TL;DR: This paper discusses ARMS, a system implemented to extend Prolog with capabilities such as assumption-based and hypothetical reasoning, and shows how the system is extended to reason with information residing in a changing database.
Abstract: Prolog is a programming language based on logic. Its ability to traverse a search space efficiently and automatically, as well as its ability to do automated inference, has proved very useful in constructing expert systems that involve search or reasoning. But a wide range of applications need additional capabilities, such as hypothetical reasoning, commonsense reasoning with incomplete knowledge, an explanation facility, and the ability to assimilate new information into the knowledge base. This paper discusses ARMS, a system which has been implemented to extend Prolog and provide some of these capabilities. The meta-interpreter, which embeds a model of assumption-based reasoning with backward and forward inferencing capability, is discussed along with the interface to an assumption-based truth maintenance system. The syntax and semantics of the extended logic-programming language of ARMS are given. We show how the system is extended to reason with information residing in a changing database. ARMS has been implemented on the TI Explorer and runs as an extension of the TI-Prolog programming environment.
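
The core idea of an assumption-recording meta-interpreter can be sketched outside Prolog (the rule base and `assume:` syntax are invented; this is not the ARMS implementation): a proof succeeds while recording which defeasible assumptions it rests on, so a truth maintenance system can later retract conclusions when an assumption fails.

```python
# Assumption-based backward chaining (sketch; not the ARMS system).
RULES = {
    'flies(tweety)': [['bird(tweety)', 'assume:normal(tweety)']],
    'bird(tweety)': [[]],  # fact: empty body
}

def prove(goal, assumptions):
    """Backward chaining; 'assume:' goals succeed but are recorded."""
    if goal.startswith('assume:'):
        assumptions.add(goal[len('assume:'):])
        return True
    return any(all(prove(g, assumptions) for g in body)
               for body in RULES.get(goal, []))

used = set()
print(prove('flies(tweety)', used))  # True
print(used)  # {'normal(tweety)'}: retract this and the conclusion goes too
```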