
Showing papers on "Knowledge representation and reasoning" published in 1982


Proceedings Article
18 Aug 1982
TL;DR: The experiments of the late 1960s on problem-solving by theorem-proving did not show that the use of logic and deduction in AI systems was necessarily inefficient, but rather that what was needed was better control of the deduction process, combined with more attention to the computational properties of axioms.
Abstract: This paper examines the role that formal logic ought to play in representing and reasoning with commonsense knowledge. We take issue with the commonly held view (as expressed by Newell [1980]) that the use of representations based on formal logic is inappropriate in most applications of artificial intelligence. We argue to the contrary that there is an important set of issues, involving incomplete knowledge of a problem situation, that so far have been addressed only by systems based on formal logic and deductive inference, and that, in some sense, probably can be dealt with only by systems based on logic and deduction. We further argue that the experiments of the late 1960s on problem-solving by theorem-proving did not show that the use of logic and deduction in AI systems was necessarily inefficient, but rather that what was needed was better control of the deduction process, combined with more attention to the computational properties of axioms.
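
To make the incomplete-knowledge argument concrete, here is a minimal sketch (not from the paper; the propositions and clauses are invented) of entailment by case analysis: a conclusion follows from a disjunctive fact even though neither disjunct is individually known.

```python
# Minimal sketch (invented propositions): deduction with incomplete knowledge.
# We know block A sits on B or on C, but not which; either way it follows that
# A is not on the table. A brute-force check over all truth assignments
# confirms the entailment, while the location of A itself stays unknown.
from itertools import product

PROPS = ["on_a_b", "on_a_c", "on_a_table"]

def satisfied(model, clause):
    # A clause is a list of (proposition, required_truth_value) literals.
    return any(model[p] == v for p, v in clause)

KB = [
    [("on_a_b", True), ("on_a_c", True)],        # A is on B or on C (incomplete)
    [("on_a_b", False), ("on_a_table", False)],  # on B implies not on the table
    [("on_a_c", False), ("on_a_table", False)],  # on C implies not on the table
]

def entails(kb, literal):
    """True if every truth assignment satisfying kb also satisfies literal."""
    for values in product([True, False], repeat=len(PROPS)):
        model = dict(zip(PROPS, values))
        if all(satisfied(model, c) for c in kb) and not satisfied(model, [literal]):
            return False
    return True

print(entails(KB, ("on_a_table", False)))  # True: A is not on the table
print(entails(KB, ("on_a_b", True)))       # False: we still do not know where A is
```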

287 citations



01 Jan 1982
TL;DR: The approach taken in this research is that the generation process should not simply trace the knowledge representation to produce text, but instead, communicative strategies people are familiar with are used to effectively convey information.
Abstract: There are two major aspects of computer-based text generation: (1) determining the content and textual shape of what is to be said; and (2) transforming that message into natural language. Emphasis in this research has been on a computational solution to the questions of what to say and how to organize it effectively. A generation method was developed and implemented in a system called TEXT that uses principles of discourse structure, discourse coherency, and relevancy criterion. The main features of the generation method developed for the TEXT strategic component include (1) selection of relevant information for the answer, (2) the pairing of rhetorical techniques for communication (such as analogy) with discourse purposes (for example, providing definitions) and (3) a focusing mechanism. Rhetorical techniques, which encode aspects of discourse structure, are used to guide the selection of propositions from a relevant knowledge pool. The focusing mechanism aids in the organization of the message by constraining the selection of information to be talked about next to that which ties in with the previous discourse in an appropriate way. This work on generation has been done within the framework of a natural language interface to a database system. The implemented system generates responses of paragraph length to questions about database structure. Three classes of questions have been considered: questions about information available in the database, requests for definitions, and questions about the differences between database entities. The main theoretical results of this research have been on the effect of discourse structure and focus constraints on the generation process. A computational treatment of rhetorical devices has been developed which is used to guide the generation process. Previous work on focus of attention has been extended for the task of generation to provide constraints on what to say next. The use of these two interacting mechanisms constitutes a departure from earlier generation systems. The approach taken in this research is that the generation process should not simply trace the knowledge representation to produce text. Instead, communicative strategies people are familiar with are used to effectively convey information. This means that the same information may be described in different ways on different occasions.
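
As an illustration of the focus mechanism described above, the following sketch (purely hypothetical; it is not the TEXT implementation) selects the next proposition from a relevant knowledge pool only if it ties in with what the discourse has already introduced.

```python
# Illustrative sketch only (not the TEXT system): choosing the next proposition
# from a relevant knowledge pool under a focus constraint. A proposition about
# the current focus is preferred; otherwise anything tied to an item already
# introduced may be used, shifting the focus. All data is hypothetical.
propositions = [
    {"predicate": "identification", "focus": "destroyer", "mentions": ["ship"],
     "text": "A destroyer is a surface ship."},
    {"predicate": "attributive", "focus": "destroyer", "mentions": ["missile"],
     "text": "Destroyers carry guided missiles."},
    {"predicate": "attributive", "focus": "missile", "mentions": [],
     "text": "The missiles have a range of several miles."},
]

def generate(pool, initial_focus):
    pool, focus, introduced, said = list(pool), initial_focus, {initial_focus}, []
    while pool:
        candidates = ([p for p in pool if p["focus"] == focus]
                      or [p for p in pool if p["focus"] in introduced])
        if not candidates:
            break                      # nothing ties in with the discourse so far
        chosen = candidates[0]
        pool.remove(chosen)
        said.append(chosen["text"])
        introduced.update([chosen["focus"], *chosen["mentions"]])
        focus = chosen["focus"]
    return " ".join(said)

print(generate(propositions, "destroyer"))
```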

125 citations


Proceedings Article
18 Aug 1982
TL;DR: Some aspects of current representation research that offer a foundation for coping with complexity and incompleteness in knowledge representation systems are reviewed, and a way of integrating these ideas into a powerful, practical knowledge representation paradigm is suggested.
Abstract: The range of domains and tasks for “knowledge-based systems” has been expanding at a furious pace. As we move away from trivial domains, such as the “blocks world”, the demands on knowledge representation systems used by expert programs are becoming more extreme. For one thing, the domains themselves are getting so complex that specialized technical vocabularies are unavoidable; consequently, the issue of a system talking with an expert in his own language cannot be ignored. For another, tasks such as medical diagnosis, scene analysis, speech understanding, and game playing all have as a central feature an incrementally evolving model representing probably incomplete knowledge of part of the task domain. In this paper, we explore some of the impact of these two critical issues, complexity and incompleteness, on knowledge representation systems. We review some aspects of current representation research that offer a foundation for coping with these problems, and finally suggest a way of integrating these ideas into a powerful, practical knowledge representation paradigm.

75 citations


Journal ArticleDOI
01 Nov 1982
TL;DR: An illustrative example is presented to show how the knowledge base system assists the decisionmaker in grasping the complex problem in its totality.
Abstract: A decision support system using a knowledge base of documentary data is presented. Causal assertions in documents are extracted and organized into cognitive maps, which are networks of causal relations, by the methodology of documentary coding. Our knowledge base is constructed by joining cognitive maps of several documents concerned with a societal complex problem. The knowledge base is an integration of several expertises described in documents, though it is only concerned with causal structure of the problem, and includes overall and detailed information about the problem. Decisionmakers concerned with the problem interactively retrieve relevant information from the knowledge base in the process of decisionmaking and form their overall and detailed understanding of the complex problem based on the expertises stored in the knowledge base. Three retrieval modes are proposed according to types of the decisionmakers' requests: 1) skeleton maps indicate overall causal structure of the problem, 2) hierarchical graphs give detailed information about parts of the causal structure, and 3) sources of causal relations are presented when necessary, for example when the decisionmaker wants to browse the causal assertions in documents. An illustrative example is presented to show how our knowledge base system assists the decisionmaker in grasping the complex problem in its totality. In addition two coders' coding results of a document are investigated under the same condition and confirm the reliability of the methodology for coding cognitive maps, i.e., the capacity of the method to yield the same coding result.
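
The following sketch (hypothetical data, not the authors' system) shows one way such a knowledge base of causal assertions could be organized, with rough analogues of the three retrieval modes: skeleton view, detail expansion, and source lookup.

```python
# Hypothetical data, not the authors' system: a knowledge base of causal
# assertions extracted from documents, with simple analogues of the three
# retrieval modes in the abstract: (1) skeleton view over key concepts,
# (2) detail around one concept, (3) sources behind a causal link.
from collections import defaultdict

# (cause, effect, sign, source document)
assertions = [
    ("fuel price", "transport cost", "+", "doc-A"),
    ("transport cost", "food price", "+", "doc-A"),
    ("food price", "public unrest", "+", "doc-B"),
    ("subsidies", "food price", "-", "doc-B"),
]

graph = defaultdict(list)
for cause, effect, sign, source in assertions:
    graph[cause].append((effect, sign, source))

def skeleton(key_concepts):
    """Mode 1: only direct links whose endpoints are both key concepts."""
    return [(c, e, s) for c in key_concepts for e, s, _ in graph[c] if e in key_concepts]

def detail(concept):
    """Mode 2: everything the knowledge base says this concept causes."""
    return graph[concept]

def sources(cause, effect):
    """Mode 3: the documents asserting a given causal relation."""
    return [src for e, _, src in graph[cause] if e == effect]

print(skeleton({"food price", "public unrest", "subsidies"}))
print(detail("fuel price"))
print(sources("subsidies", "food price"))
```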

72 citations


Proceedings Article
18 Aug 1982
TL;DR: This paper attempts to resolve some of the controversy between advocates of predicate calculus and users of other knowledge representation languages by demonstrating that it is possible to have the key features of both in a hybrid system.
Abstract: This paper attempts to resolve some of the controversy between advocates of predicate calculus and users of other knowledge representation languages by demonstrating that it is possible to have the key features of both in a hybrid system. An example is given of a recently implemented hybrid system in which a specialized planning language coexists with its translation into predicate calculus. In this system, various kinds of reasoning required for a program understanding task are implemented at either the predicate calculus level or the planning language level, depending on which is more natural.
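
A toy sketch of the hybrid idea, using invented structures rather than the paper's planning language: the same plan step is kept both in a specialized notation and as its predicate-calculus translation, so a query can be answered at whichever level is more natural.

```python
# Toy sketch with invented structures (not the paper's planning language): one
# plan step kept both in a specialized planning notation and as its translation
# into predicate-calculus literals, so reasoning can happen at either level.
plan_step = {
    "action": "open",
    "agent": "editor_task",
    "object": "file42",
    "precondition": ("exists", "file42"),
    "effect": ("open", "file42"),
}

def to_fopc(step):
    """Translate a planning-language step into predicate-calculus literals."""
    return [
        ("performs", step["agent"], step["action"], step["object"]),
        ("precondition", step["action"], step["precondition"]),
        ("effect", step["action"], step["effect"]),
    ]

# Planning-level reasoning: direct structural access.
print(plan_step["effect"])
# Logic-level reasoning: the same fact consulted as a literal.
print(("effect", "open", ("open", "file42")) in to_fopc(plan_step))  # True
```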

56 citations


Journal ArticleDOI
01 Jul 1982
TL;DR: Many of the ideas presented here have already been implemented in the MEDAS system, a medical DSS for emergency and critical care medicine, and are accompanied by examples for military situation assessment.
Abstract: Situation assessment tasks, e.g., medical diagnosis, battlefield reading, corporation status assessment for merger or acquisition purposes, are formulated as a general family of problem solving tasks. The generic nature of this family of tasks as a multimembership hierarchical pattern recognition problem is characterized and the types of decision support systems (DSS) are identified. The focus is on knowledge representation and elicitation, although issues related to inference mechanisms, system structure, and expert-machine-user interface are also discussed. Two types of knowledge are distinguished: global knowledge and local knowledge. Global knowledge is required to determine directions on which to focus attention, while local knowledge is required for assessing the validity of a specific alternative based on a given set of findings. Global knowledge is represented as a network of relevancy pointers between alternatives and features. Attached to the links of this network are weights by which the strength of relevancy is evaluated and global directions (hypotheses) for situation analysis are determined. For local knowledge, it seems that in most practical problems multiple representation techniques would be required to characterize adequately the alternatives by means of their relevant features. The presentation is accompanied by examples for military situation assessment. However, comparable examples from medical and business applications are also cited. In fact, many of the ideas presented here have already been implemented in the MEDAS system, a medical DSS for emergency and critical care medicine.
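
A minimal sketch of the global-knowledge idea (weights, findings, and alternatives are all invented): relevancy pointers with attached weights rank the alternatives on which attention should focus.

```python
# Minimal sketch with invented weights and names: global knowledge as a network
# of weighted relevancy pointers from findings to alternatives. Observed
# findings are pooled to rank the alternatives (hypotheses) deserving focus,
# which are then examined using the separately represented local knowledge.
relevancy = {
    "chest pain":       [("cardiac event", 0.9), ("anxiety", 0.3)],
    "elevated enzymes": [("cardiac event", 0.8)],
    "rapid breathing":  [("anxiety", 0.6), ("cardiac event", 0.2)],
}

def rank_alternatives(findings):
    scores = {}
    for finding in findings:
        for alternative, weight in relevancy.get(finding, []):
            scores[alternative] = scores.get(alternative, 0.0) + weight
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

print(rank_alternatives({"chest pain", "elevated enzymes"}))
# [('cardiac event', 1.7), ('anxiety', 0.3)]
```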

53 citations


Journal ArticleDOI
TL;DR: A method for measuring both labile emotional and more permanent psychological events through content analysis of verbal behavior has been developed by Gottschalk and Gleser, and provides an interesting challenge for “artificial intelligence,” the study of thought and action in forms that can be expressed in computer programs.

40 citations


Proceedings ArticleDOI
16 Jun 1982
TL;DR: This paper presents a representation language in the notation of FOPC whose form facilitates the design of a semantic-network-like retriever.
Abstract: Ever since Woods's "What's in a Link" paper, there has been a growing concern for formalization in the study of knowledge representation. Several arguments have been made that frame representation languages and semantic-network languages are syntactic variants of the first-order predicate calculus (FOPC). The typical argument proceeds by showing how any given frame or network representation can be mapped to a logically isomorphic FOPC representation. For the past two years we have been studying the formalization of knowledge retrievers as well as the representation languages that they operate on. This paper presents a representation language in the notation of FOPC whose form facilitates the design of a semantic-network-like retriever.
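
The "syntactic variant" argument the abstract refers to can be shown in miniature; the sketch below (illustrative only, and deliberately simplified to ground atomic formulas) maps a frame with slots into FOPC-style assertions.

```python
# Illustrative only, and deliberately simplified: mapping a frame with slots
# onto FOPC-style atomic formulas (represented as tuples). Real translations
# also handle defaults and quantification; this sketch shows only the basic
# "syntactic variant" correspondence, with invented frame contents.
frame = {
    "name": "Clyde",
    "isa": "Elephant",
    "slots": {"color": "gray", "legs": 4},
}

def frame_to_fopc(f):
    formulas = [("isa", f["name"], f["isa"])]
    formulas += [(slot, f["name"], value) for slot, value in f["slots"].items()]
    return formulas

for formula in frame_to_fopc(frame):
    print(formula)  # e.g. ('isa', 'Clyde', 'Elephant') stands for isa(Clyde, Elephant)
```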

39 citations


Book
01 Jan 1982

33 citations


Book
01 Jan 1982
TL;DR: The use of AI in expert systems, computer vision, natural language processing, speech recognition and understanding, speech synthesis, problem solving, and planning is examined.
Abstract: An overview of artificial intelligence (AI), its core ingredients, and its applications is presented. The knowledge representation, logic, problem solving approaches, languages, and computers pertaining to AI are examined, and the state of the art in AI is reviewed. The use of AI in expert systems, computer vision, natural language processing, speech recognition and understanding, speech synthesis, problem solving, and planning is examined. Basic AI topics, including automation, search-oriented problem solving, knowledge representation, and computational logic, are discussed.

14 Sep 1982
TL;DR: KBS provides facilities for interactive model creation and alteration, simulation monitoring and control, graphics display, and selective instrumentation, and allows the user to define and simulate a system at different levels of abstraction, and to check the completeness and consistency of a model, hence reducing model debugging time.
Abstract: This report describes KBS, a knowledge-based simulation system. The report describes the use of SRL, an AI-based knowledge representation system, for modeling (e.g., factory organizations) and its interpretation of discrete simulations. KBS provides facilities for interactive model creation and alteration, simulation monitoring and control, graphics display, and selective instrumentation. It also allows the user to define and simulate a system at different levels of abstraction, and to check the completeness and consistency of a model, hence reducing model debugging time.

Proceedings ArticleDOI
16 Jun 1982
TL;DR: A system is presented which uses the contents of the database to form part of this knowledge representation automatically and employs three types of world knowledge axioms to ensure that the representation formed is meaningful and contains salient information.
Abstract: The knowledge representation is an important factor in natural language generation since it limits the semantic capabilities of the generation system. This paper identifies several information types in a knowledge representation that can be used to generate meaningful responses to questions about database structure. Creating such a knowledge representation, however, is a long and tedious process. A system is presented which uses the contents of the database to form part of this knowledge representation automatically. It employs three types of world knowledge axioms to ensure that the representation formed is meaningful and contains salient information.
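
A sketch of the general idea under invented data and a single made-up axiom (the paper uses three kinds of world knowledge axioms): scan the database contents and keep, as part of the knowledge representation, only attribute values that actually distinguish a subclass of entities from the rest.

```python
# Sketch with invented data and one simplified "world knowledge" style axiom:
# derive part of a knowledge representation automatically from database
# contents by recording only attribute values that distinguish a subclass.
rows = [
    {"type": "destroyer", "propulsion": "steam", "role": "escort"},
    {"type": "destroyer", "propulsion": "steam", "role": "escort"},
    {"type": "carrier",   "propulsion": "steam", "role": "aviation"},
]

def salient_attributes(rows, class_key, class_value):
    inside  = [r for r in rows if r[class_key] == class_value]
    outside = [r for r in rows if r[class_key] != class_value]
    salient = {}
    for attribute in rows[0]:
        if attribute == class_key:
            continue
        inside_values  = {r[attribute] for r in inside}
        outside_values = {r[attribute] for r in outside}
        # Simplified axiom: keep the attribute only if the subclass takes a
        # single value that no other subclass shares.
        if len(inside_values) == 1 and not inside_values & outside_values:
            salient[attribute] = inside_values.pop()
    return salient

print(salient_attributes(rows, "type", "destroyer"))  # {'role': 'escort'}
```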

Proceedings ArticleDOI
TL;DR: An algebraic model of SBA programs is presented which allows us to determine the expressive power of the SBA language, and a more general control mechanism is discussed, based on homomorphic transformations, which allows all primitive recursive functions to be computed.
Abstract: Programming by Examples is one of the methodologies that has been proposed for supporting the development of office applications by non-computer specialists. Programs are built by performing direct manipulations on objects visually represented on a display, simulating the execution of the program on exemplary data items. We distinguish two major approaches: a declarative approach (SBA by DeJong and Zloof) and a procedural approach (Tinker by Lieberman and Hewitt). We examine the computational power and the expressive convenience of the two systems. The procedural approach achieves full generality by allowing the user to explicitly introduce iterative or recursive control structures. The SBA system has built-in control mechanisms, which avoid explicit use of such constructs, but it is correspondingly less powerful. We present an algebraic model of SBA programs which allows us to determine the expressive power of the SBA language. We discuss a more general control mechanism than the one embedded in SBA, based on homomorphic transformations, which allows all primitive recursive functions to be computed. Further flexibility is obtained by coupling the PBE methodology with the knowledge representation language Omega.
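
The claim about computing all primitive recursive functions can be illustrated with a small sketch (not the paper's algebraic model): the primitive recursion scheme written once as a higher-order combinator, from which specific functions are obtained by supplying a base case and a step function.

```python
# Illustrative sketch, not the paper's algebraic model: the primitive recursion
# scheme as one higher-order combinator. Each concrete function is then just a
# choice of base case g and step function h, which is the kind of uniform
# control the abstract attributes to homomorphic transformations.
def primitive_rec(g, h):
    """Return f with f(0, *xs) = g(*xs) and f(n + 1, *xs) = h(n, f(n, *xs), *xs)."""
    def f(n, *xs):
        acc = g(*xs)
        for i in range(n):
            acc = h(i, acc, *xs)
        return acc
    return f

add      = primitive_rec(lambda y: y, lambda i, acc, y: acc + 1)
multiply = primitive_rec(lambda y: 0, lambda i, acc, y: add(acc, y))
fact     = primitive_rec(lambda: 1,   lambda i, acc: acc * (i + 1))

print(add(3, 4), multiply(3, 4), fact(5))  # 7 12 120
```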

Proceedings ArticleDOI
16 Jun 1982
TL;DR: In this paper, a rule-based knowledge engineering approach for natural language understanding is presented, where knowledge of various types can be entered and utilized: syntactic and semantic; assertions and rules.
Abstract: This paper describes the results of a preliminary study of a Knowledge Engineering approach to Natural Language Understanding. A computer system is being developed to handle the acquisition, representation, and use of linguistic knowledge. The computer system is rule-based and utilizes a semantic network for knowledge storage and representation. In order to facilitate the interaction between user and system, input of linguistic knowledge and computer responses are in natural language. Knowledge of various types can be entered and utilized: syntactic and semantic; assertions and rules. The inference tracing facility is also being developed as a part of the rule-based system with output in natural language. A detailed example is presented to illustrate the current capabilities and features of the system.

Proceedings Article
18 Aug 1982
TL;DR: The QBKG system produces critical analyses of possible moves for a wide variety of backgammon positions, using a hierarchically structured, non-discrete form of knowledge representation.
Abstract: The QBKG system produces critical analyses of possible moves for a wide variety of backgammon positions, using a hierarchically structured, non-discrete form of knowledge representation. This report compares discrete and continuous representations and reasoning systems, addressing issues of competence, robustness, and explainability. The QBKG system is described and demonstrated.
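
A toy sketch of a hierarchically structured, continuous evaluation in this spirit (features, concepts, and weights are invented, not QBKG's): low-level features feed named intermediate concepts whose values both drive the overall judgment and supply material for explanations.

```python
# Toy sketch with invented features and weights (not QBKG's knowledge): a
# hierarchically structured, continuous evaluation. Low-level features feed
# named intermediate concepts; the intermediate values drive the overall
# judgment and also provide the raw material for explaining a preference.
features = {"blot_exposure": 0.2, "home_board_points": 0.7,
            "pip_count_lead": 0.6, "anchor_quality": 0.5}

hierarchy = {
    "safety":  {"blot_exposure": -1.0, "home_board_points": 0.8},
    "racing":  {"pip_count_lead": 1.0},
    "defense": {"anchor_quality": 1.0},
}
top_weights = {"safety": 0.5, "racing": 0.3, "defense": 0.2}

def evaluate(features):
    concepts = {name: sum(w * features[f] for f, w in parts.items())
                for name, parts in hierarchy.items()}
    overall = sum(top_weights[name] * value for name, value in concepts.items())
    return overall, concepts

score, concepts = evaluate(features)
print(round(score, 3), concepts)   # intermediate concepts explain the score
```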

Proceedings ArticleDOI
13 Sep 1982
TL;DR: It is shown that constants as well as rules and regulations typically found in business applications should be factored out and stored separately from the application programs in a data base.
Abstract: This paper describes a methodology for application software development, the objective being the reduction of volume of code and ease of maintenance. It is shown that constants as well as rules and regulations typically found in business applications should be factored out and stored separately from the application programs in a data base. Definitional equations are proposed as a method for specifying such rules and regulations. The equations can be used as parameters to various types of interpreters to be used by application programs. As an illustration of the methodology, one such interpreter has been implemented. This paper shows its application to a screen handling program; other uses are discussed. The interpreter and its implementation are outlined.
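
A minimal sketch of the factoring idea, with invented rules and Python's eval() standing in for the interpreter the paper outlines: definitional equations live in a small rule base, and the application program only asks the interpreter for values.

```python
# Minimal sketch with invented rules; eval() stands in for a real expression
# interpreter. Business rules are kept as definitional equations in a small
# rule base (data, not code), and the application program merely asks for the
# value of a defined quantity.
rule_base = {
    "discount":    "0.10 * order_total if order_total > 1000 else 0.0",
    "net_payable": "order_total - discount",
}

def evaluate(name, facts, rules=rule_base):
    """Evaluate a definitional equation, resolving referenced definitions first."""
    env = dict(facts)
    for other in rules:
        if other != name and other in rules[name]:
            env[other] = evaluate(other, facts, rules)
    return eval(rules[name], {"__builtins__": {}}, env)

print(evaluate("net_payable", {"order_total": 2000.0}))  # 1800.0
# Changing the discount policy means editing the rule base, not the program.
```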

Journal ArticleDOI
TL;DR: Conceptual systems analysis analyzes the organizing principle binding related concepts, procedures, relations, etc. together to permit the construction of visual representations of a subject matter so that the scope of this conceptual system and its subordinate and superordinate systems can be seen readily.
Abstract: Conceptual systems analysis analyzes the organizing principle binding related concepts, procedures, relations, etc. together. It permits the construction of visual representations of a subject matter so that the scope of this conceptual system and its subordinate and superordinate systems can be seen readily. Any teacher or instructional designer must have and use a model or representation of the subject matter, no matter how inchoate. Equally the instructional system (e.g. computer program) must incorporate a knowledge representation of the subject matter. But it must also have a representation of the student's knowledge and a representation of his capability for carrying out required operations, though this feature is difficult to implement today. Ideally the instructional system will have a knowledge representation of the teaching process or learning opportunities so that it or the student can select an optimal route through the knowledge structure. In the absence of such intelligent software, the use of knowledge representations as teaching aids is a valuable addition to the theory and practice of educational technology.

Journal ArticleDOI
TL;DR: Interlisp, formerly available only on large timesharing systems, now operates in the form of lnterlisp-D on the Xerox 1100, 1108, and 1132 scientific information processors, and the multiple window system provides reshaping, horizontal and vertical scrolling, and redisplaying.
Abstract: \"Because of the type of problems we work on, we encountered the limitations of machine architectures earlier than anyone else,\" David L. Waltz of the University of Illinois, program chairman , pointed out in an interview at the National Conference on Artificial Intelligence last August. Sponsored by the American Association for Artificial Intelligence , the meeting attracted more than 1500 participants to the University of Pittsburgh and Carnegie-Mellon University for sessions on vision, natural languages and speech, problem solving and search, knowledge representation, robotics, cognitive modeling, and other subjects. This attendance represented an increase of 50 percent over last year's. The artificial intelligence community needed computers that were able to run very large programs on the time scale of interactive usage, and-because certain concepts were reaching the point of practical application-it needed computers at a price low enough to make these applications economically feasible. The key features required were a large virtual address space and the ability to process symbolic information. Symbolics, Inc., one of the vendors who offer machines with this capability, describes symbolic processing in terms of symbols and relationships. \"A symbol represents a real-world object and provides a place to store information about that. object. Symbols and properties of symbols may be freely created and manipulated while the symbol processor worries about the details.\" November 1982 Whereas a traditional database represents a person's name, for example, as a fixed-format sequence of characters, a symbol-processing computer represents the name as a symbol and can assign various properties to that symbol: first name John, birthdate 6/8/47, hair color blond, etc. These properties or relation-Interlisp, formerly available only on large timesharing systems, now operates in the form of lnterlisp-D on the Xerox 1100, 1108, and 1132 scientific information processors. The multiple window system provides reshaping, horizontal and vertical scrolling, and redisplaying. In addition to interactive graphics, Interlisp-D contains a display editor and inspector, programmer's assistant, debugging tools, and program analysis tools.

Proceedings ArticleDOI
05 Jul 1982
TL;DR: The purpose of FKR-0 is to store information required for machine translation processing as flexibly as possible, and to make the translation system as expandable as possible.
Abstract: This paper describes a new knowledge representation called "frame knowledge representation-0" (FKR-0), and an experimental machine translation system named ATLAS/I which uses FKR-0. The purpose of FKR-0 is to store information required for machine translation processing as flexibly as possible, and to make the translation system as expandable as possible.

Proceedings ArticleDOI
05 Jul 1982
TL;DR: The main part of the paper describes the design criteria for the SEMANTICS, PRAGMATICS, and DIALOG modules, and the structure of their interactions within a general discourse understanding framework, in particular the role of a user/task model.
Abstract: This paper reflects some thoughts on pragmatics in the context of a Speech Understanding System which is currently developed at the University Erlangen-Nuernberg. After a brief outline of the system's structure with an emphasis on the characteristics of the parser and the knowledge representation scheme we present some of the underlying theoretical considerations. The main part of the paper describes the design criteria for the SEMANTICS, PRAGMATICS, and DIALOG modules, and the structure of their interactions within a general discourse understanding framework, in particular the role of a user/task model.

Journal ArticleDOI
TL;DR: The detailed design specifications of an artificial intelligence system called KNOWTRAN are developed, along with ideas about knowledge representation that meet the requirements of a general heat transfer problem solver.


01 Jan 1982
TL;DR: The need is discussed for design procedures and protocols to insure that knowledge bases and cognitive engines, which enable integration of facts and values to form judgments, are as free as possible from information processing biases.
Abstract: The authors attempt an integration of cognitive structural mapping, knowledge representation in expert consulting systems, and information processing biases in judgment and choice activities. The need is discussed for design procedures and protocols to insure that knowledge bases and cognitive engines, which enable integration of facts and values to form judgments, are as free as possible from information processing biases. (35 references)

Book ChapterDOI
01 Jan 1982
TL;DR: A deduction-oriented language is planned to be designed, which, with its processor, will form the core of the whole computing system, which includes an intelligent programming system, a knowledge representation language and system, and a meta inference system to be built on the core.
Abstract: The heart of the fifth generation computer in prospect is powerful mechanisms for problem solving and inference. A deduction-oriented language is planned to be designed, which, with its processor, will form the core of the whole computing system. The language is based on predicate logic with the extended features of structuring facilities, meta structures and relational data base interfaces. Parallel computation mechanisms and specialized hardware architectures are extensively investigated to make possible efficient realization of the language features. The project includes an intelligent programming system, a knowledge representation language and system, and a meta inference system to be built on the core.

Journal ArticleDOI
TL;DR: The program endeavours to teach database concepts and is based upon a relational database management system, influenced by the recent emphasis on knowledge representation in artificial intelligence.
Abstract: This paper describes a program written to investigate the derivation of teaching strategies in computer-assisted learning dialogue systems. The program endeavours to teach database concepts and is based upon a relational database management system. The program's design, however, is intended to be independent of the particular subject matter taught and is influenced by the recent emphasis on knowledge representation in artificial intelligence.

Journal ArticleDOI
TL;DR: Knowledge representation and utilisation in a man-machine dialogue with a medical decision aid system and its applications in medicine and science are studied.
Abstract: Knowledge representation and utilisation in a man-machine dialogue with a medical decision aid system.

Proceedings Article
01 Jul 1982

01 Jan 1982
TL;DR: This dissertation describes BRUIN, a unified AI system that can perform both problem-solving and language comprehension tasks and describes how context recognition can be done in a problem solving environment.
Abstract: Over the years, it has been argued that there ought to be an artificial intelligence (AI) system which can do both problem solving and language comprehension using the same database of knowledge. Such a system has not previously been constructed because researchers in these two areas have generally used rather different knowledge representations: the predicate calculus for problem solving and some frame-like representation for language comprehension. This dissertation describes BRUIN, a unified AI system that can perform both problem-solving and language comprehension tasks. Included in the system is a frame-based knowledge-representation language called FRAIL, a problem solving component called NASL (which is based on McDermott's problem-solving language of the same name), and a context-recognition component known as PRAGMATICS. Examples that have been tested in this system are drawn from the inventory-control, restaurant and blocks-world domains. The main intent of this dissertation is to describe how context recognition can be done in a problem solving environment. Also discussed is the knowledge representation language FRAIL and the relevant portions of the problem solver, NASL. Finally, there is a discussion of the problems with the context recognizer, PRAGMATICS and possibilities for future research.

Proceedings ArticleDOI
05 Jul 1982
TL;DR: A first experimental system including a grammatico-semantic analysis of the input texts and questions, a procedure of inferencing, a search for appropriate answers to individual questions and a synthesis of the answers are being implemented, mainly in the language Q and PL/1.
Abstract: A method of automatic answering of questions in natural language, based only on input texts and a set of rules of inference, is described. A first experimental system including a grammatico-semantic analysis of the input texts and questions, a procedure of inferencing, a search for appropriate answers to individual questions and a synthesis of the answers are being implemented, mainly in the language Q and PL/1. The output of the analysis, the underlying representations of the utterances of the input text, serves as a base of the knowledge representation scheme, on which the inference rules (mapping dependency trees into dependency trees) operate.
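
A toy sketch of an inference rule that maps dependency trees into dependency trees (the tree encoding and the rule are invented for illustration): from an utterance tree for "X works in Prague" a new tree asserting "X is in Prague" is derived.

```python
# Invented tree encoding and rule, for illustration: an inference rule mapping
# dependency trees into dependency trees. A tree is (head, [(relation, subtree)]);
# the rule derives "X is in Prague" from an utterance tree for "X works in Prague".
def rewrite_work_location(tree):
    head, dependents = tree
    if head == "work":
        actor    = [(r, sub) for r, sub in dependents if r == "actor"]
        location = [(r, sub) for r, sub in dependents if r == "location"]
        if actor and location:
            return ("be", actor + location)   # derived assertion as a new tree
    return None

utterance = ("work", [("actor", ("Charles", [])),
                      ("location", ("Prague", []))])

print(rewrite_work_location(utterance))
# ('be', [('actor', ('Charles', [])), ('location', ('Prague', []))])
```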