
Showing papers in "Intelligence/SIGART Bulletin in 1977"


Journal ArticleDOI
TL;DR: The type of cognitive system studied here has a set of interacting elementary productions, called classifiers, a performance algorithm that directs the action of the system in the environment, and learning algorithms that modify the set of classifiers so that variants of good classifiers persist and new, potentially better ones are created in a provably efficient manner.
Abstract: The type of cognitive system (CS) studied here has four basic parts: (1) a set of interacting elementary productions, called classifiers, (2) a performance algorithm that directs the action of the system in the environment, (3) a simple learning algorithm that keeps a record of each classifier's success in bringing about rewards, and (4) a more complex learning algorithm, called the genetic algorithm, that modifies the set of classifiers so that variants of good classifiers persist and new, potentially better ones are created in a provably efficient manner.
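
As a concrete illustration, here is a minimal sketch of the four-part loop the abstract describes. It is not Holland's implementation; every name and parameter (the ternary condition alphabet, the strength-update rate, the mutation rate) is an illustrative assumption.

```python
import random

# (1) A classifier pairs a ternary condition ('0', '1', '#' = wildcard)
# with an action, and carries a strength updated from rewards.
class Classifier:
    def __init__(self, condition, action, strength=1.0):
        self.condition = condition
        self.action = action
        self.strength = strength

    def matches(self, message):
        return all(c in ('#', m) for c, m in zip(self.condition, message))

def perform(classifiers, message):
    """(2) Performance algorithm: among matching classifiers, pick one
    with probability proportional to its strength."""
    matched = [c for c in classifiers if c.matches(message)]
    if not matched:
        return None
    return random.choices(matched, weights=[c.strength for c in matched])[0]

def reinforce(classifier, reward, rate=0.1):
    """(3) Simple learning algorithm: move strength toward the reward."""
    classifier.strength += rate * (reward - classifier.strength)

def genetic_step(classifiers, mutation_rate=0.05):
    """(4) Genetic algorithm: cross over two strong classifiers, mutate
    the offspring's condition, and replace the weakest classifier."""
    a, b = sorted(classifiers, key=lambda c: c.strength)[-2:]
    cut = random.randrange(1, len(a.condition))
    cond = a.condition[:cut] + b.condition[cut:]
    cond = ''.join(random.choice('01#') if random.random() < mutation_rate
                   else ch for ch in cond)
    child = Classifier(cond, random.choice([a.action, b.action]),
                       strength=(a.strength + b.strength) / 2)
    weakest = min(range(len(classifiers)),
                  key=lambda i: classifiers[i].strength)
    classifiers[weakest] = child
```

Running perform and reinforce on every cycle while invoking genetic_step only occasionally mirrors the division of labor among parts (2), (3), and (4).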

656 citations


Proceedings ArticleDOI
TL;DR: Discusses an approach to a problem that arises in this context: futures which were thought to be relevant when they were created become irrelevant through being ignored in the body of the expression where they were bound.
Abstract: This paper investigates some problems associated with an argument evaluation order that we call “future” order, which is different from both call-by-name and call-by-value. In call-by-future, each formal parameter of a function is bound to a separate process (called a “future”) dedicated to the evaluation of the corresponding argument. This mechanism allows the fully parallel evaluation of arguments to a function, and has been shown to augment the expressive power of a language. We discuss an approach to a problem that arises in this context: futures which were thought to be relevant when they were created become irrelevant through being ignored in the body of the expression where they were bound. The problem of irrelevant processes also appears in multiprocessing problem-solving systems which start several processors working on the same problem but with different methods, and return with the solution which finishes first. This parallel method strategy has the drawback that the processes which are investigating the losing methods must be identified, stopped, and re-assigned to more useful tasks.
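
A sketch of the two mechanisms the abstract mentions, using Python's concurrent.futures as a modern stand-in for the paper's futures; the function names and the cancellation behavior shown are assumptions, not the paper's design.

```python
import time
from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait

pool = ThreadPoolExecutor(max_workers=4)

def call_by_future(f, *arg_thunks):
    """Bind each argument to a 'future' evaluated in parallel,
    then enter the function body immediately."""
    return f(*[pool.submit(thunk) for thunk in arg_thunks])

def slow_method():
    time.sleep(5)            # stands in for a losing search method
    return "slow answer"

def fast_method():
    return "fast answer"

def first_of(*futures):
    """Parallel-methods strategy: take whichever method finishes first
    and try to stop the losers -- the cleanup problem the paper studies.
    (cancel() only succeeds on futures that have not started running.)"""
    done, pending = wait(futures, return_when=FIRST_COMPLETED)
    for f in pending:
        f.cancel()
    return next(iter(done)).result()

print(call_by_future(first_of, fast_method, slow_method))
```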

335 citations


Journal ArticleDOI
TL;DR: This note describes LIFER, a practical facility for creating natural language interfaces to other computer software that has bundled natural language specification and parsing technology into one convenient package.
Abstract: This note describes LIFER, a practical facility for creating natural language interfaces to other computer software. Emphasizing human engineering, LIFER has bundled natural language specification and parsing technology into one convenient package.

179 citations


Journal ArticleDOI
TL;DR: Four examples, including the classic examples of Winograd and Charniak, are presented that demonstrate pronoun resolution within the semantic approach.
Abstract: Two approaches to the problem of pronoun resolution are presented. The first is a naive algorithm that works by traversing the surface parse trees of the sentences of the text in a particular order looking for noun phrases of the correct gender and number. The algorithm is shown to incorporate many, though not all, of the constraints on co-referentiality between a non-reflexive pronoun and a possible antecedent which have been discovered recently by linguists. The algorithm clearly does not work in all cases, but the results of an examination of several hundred examples from published texts show that it performs remarkably well. In the second approach, it is shown how pronoun resolution is handled in a comprehensive system for semantic analysis of English texts. The system consists of four basic semantic operations which work by accessing a data base of "world knowledge" inferences, which are drawn selectively and in a context-dependent way in response to the operations. The first two operations seek to satisfy the demands made by predicates on the nature of their arguments and to discover the relations between sentences. The third operation - knitting - recognizes and merges redundant expressions. These three operations frequently result in a pronoun reference being resolved as a by-product. The fourth operation seeks to resolve those pronouns not resolved by the first three. It involves a bidirectional search of the text and "world knowledge" for an appropriate chain of inference and utilizes the efficiency of the naive algorithm.
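
A schematic rendition of the naive algorithm (the paper's actual traversal order and co-referentiality constraints are more elaborate): walk the parse trees of the current and preceding sentences and propose the first noun phrase that agrees with the pronoun in gender and number.

```python
from collections import deque

class Node:
    def __init__(self, label, children=(), gender=None, number=None):
        self.label, self.children = label, list(children)
        self.gender, self.number = gender, number

def resolve(pronoun, sentence_trees):
    """sentence_trees: parse trees, most recent sentence first.
    Return the first agreeing NP found in breadth-first order."""
    for tree in sentence_trees:
        queue = deque([tree])
        while queue:
            node = queue.popleft()
            if (node.label == 'NP' and node is not pronoun
                    and node.gender == pronoun.gender
                    and node.number == pronoun.number):
                return node
            queue.extend(node.children)
    return None

# "The castle had a window. It was open."
castle = Node('NP', gender='neuter', number='sing')
window = Node('NP', gender='neuter', number='sing')
it = Node('NP', gender='neuter', number='sing')
s1 = Node('S', [castle, Node('VP', [window])])
# The naive algorithm proposes 'castle' here; misses like this are why
# the paper adds linguistic constraints and measures it empirically.
assert resolve(it, [Node('S', [it]), s1]) is castle
```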

152 citations


Journal ArticleDOI
TL;DR: The intent has been to produce a front end system which enables the user to concentrate on his problem or task rather than making him worry about how to communicate his ideas or questions to the machine.
Abstract: One of the major stumbling blocks to more effective use of computers by naive users is the lack of natural means of communication between the user and the computer system. This report discusses a paradigm for constructing efficient and friendly man-machine interface systems involving subsets of natural language for limited domains of discourse. As such this work falls somewhere between highly constrained formal language query systems and unrestricted natural language understanding systems. The primary purpose of this research is not to advance our theoretical understanding of natural language but rather to put forth a set of techniques for embedding both semantic/conceptual and pragmatic information into a useful natural language interface module. Our intent has been to produce a front end system which enables the user to concentrate on his problem or task rather than making him worry about how to communicate his ideas or questions to the machine.

122 citations


Journal ArticleDOI
TL;DR: The paper gives a brief description of a methodology for rule induction and of a computer program; the VL21 logic system permits a more general rule format than typically used and facilitates a compact and easy-to-understand expression of descriptions of different degrees of generality.
Abstract: The problem considered is a transformation of a set of user-given decision rules into a set of new rules which are more general than the original ones and more optimal with regard to a user-defined criterion. The decision rules are expressed in the VL21 logic system, which permits a more general rule format than typically used and facilitates a compact and easy-to-understand expression of descriptions of different degrees of generality. The paper gives a brief description of the methodology for rule induction and of a computer program.

119 citations


Journal ArticleDOI
TL;DR: The Meta-DENDRAL program is described in general terms that are intended to clarify the similarities and differences to other learning programs and appears well suited to many induction tasks.
Abstract: The Meta-DENDRAL program is described in general terms that are intended to clarify the similarities and differences to other learning programs. Its approach of model-directed heuristic search through a complex space of possible rules appears well suited to many induction tasks. The use of a strong model of the domain to direct the rule search has been demonstrated for rule formation in two areas of chemistry. The high performance of programs which use the generated rules attests to the success of this learning strategy.

108 citations


Journal ArticleDOI
TL;DR: The role of conflict resolution in providing support for production systems designed to function and grow in environments that make large numbers of different, sometimes competing, and sometimes unexpected demands is explored.
Abstract: Production systems designed to function and grow in environments that make large numbers of different, sometimes competing, and sometimes unexpected demands require support from their interpreters that is qualitatively different from the support required by systems that can be carefully hand crafted to function in constrained environments. In this paper we explore the role of conflict resolution in providing such support. Using criteria developed in the paper, we evaluate both individual conflict resolution rules and strategies that make use of several rules.
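
To make the object of study concrete, here is a toy conflict-resolution strategy of the kind the paper evaluates, combining two individual rules (recency of matched working-memory elements, then specificity of conditions); the data structures and this particular ordering are illustrative assumptions.

```python
from collections import namedtuple

WME = namedtuple('WME', 'content time')             # time = recency tag
Production = namedtuple('Production', 'name conditions')

def resolve_conflict(conflict_set):
    """conflict_set: (production, matched WMEs) instantiations.
    Prefer instantiations matching more recent WMEs; break ties by
    specificity (number of conditions), a common composite strategy."""
    def key(inst):
        production, wmes = inst
        recency = sorted((w.time for w in wmes), reverse=True)
        return (recency, len(production.conditions))
    return max(conflict_set, key=key)

p1 = Production('react-to-goal', ['goal', 'object'])
p2 = Production('default', ['object'])
wme_old, wme_new = WME('object A', time=1), WME('goal stack', time=7)
winner, _ = resolve_conflict([(p1, [wme_new, wme_old]), (p2, [wme_old])])
print(winner.name)   # react-to-goal
```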

102 citations


Journal ArticleDOI
TL;DR: A separate, explicit control structure for modeling asynchronous, concurrent processes with PS's is proposed; the use of a Petri net is addressed and a hierarchy of such networks is proposed.
Abstract: Because of the event-driven nature of asynchronous, concurrent processes, production systems (PS's) are an attractive modeling tool. The system of interest can be modeled with a large number of independent states, with independent actions, and the knowledge base can be conveniently encoded declaratively. However, asynchronous, concurrent processes normally have strict requirements for inter-process communication and coordination; this requires a substantial degree of inter-rule communication in the PS. The result of this is that a complex control structure is embedded in the short term memory (STM); this is generally considered unattractive for a number of reasons. This paper proposes a separate, explicit control structure for modeling asynchronous, concurrent processes with PS's. Specifically, the use of a Petri net is addressed. A system of asynchronous, concurrent processes can be modeled using PS's to model the individual processes or events and using a Petri net to model the relationships between the processes. Furthermore, a hierarchy of such networks is proposed; an allowable production rule action is the instantiation of another network. This is supported with a structured, hierarchical STM.
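
A minimal sketch, under assumptions, of the proposed separation: a Petri net whose transitions gate when each production-system module may run, with tokens carrying the inter-process coordination that would otherwise clutter STM.

```python
class PetriNet:
    def __init__(self):
        self.marking = {}              # place -> token count
        self.transitions = []          # (inputs, outputs, action)

    def add_transition(self, inputs, outputs, action):
        self.transitions.append((inputs, outputs, action))

    def step(self):
        """Fire one enabled transition: consume a token from every
        input place, run its action, produce tokens on outputs."""
        for inputs, outputs, action in self.transitions:
            if all(self.marking.get(p, 0) > 0 for p in inputs):
                for p in inputs:
                    self.marking[p] -= 1
                action()               # e.g. run one PS to quiescence
                for p in outputs:
                    self.marking[p] = self.marking.get(p, 0) + 1
                return True
        return False                   # no transition enabled

net = PetriNet()
net.marking['request'] = 1
net.add_transition(['request'], ['served'],
                   lambda: print("production system for 'serve' runs"))
while net.step():
    pass
```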

79 citations


Proceedings ArticleDOI
TL;DR: Describes an approach to the construction of expert problem-solving systems based on making explicit some knowledge which is usually an implicit part of an expert problem solver, thus allowing this knowledge about control to be manipulated and reasoned about.
Abstract: The construction of expert problem-solving systems requires the development of techniques for using modular representations of knowledge without encountering combinatorial explosions in the solution effort. This report describes an approach to dealing with this problem based on making some knowledge which is usually implicitly part of an expert problem solver explicit, thus allowing this knowledge about control to be manipulated and reasoned about. The basic components of this approach involve using explicit representations of the control structure of the problem solver, and linking this and other knowledge manipulated by the expert by means of explicit data dependencies.

74 citations


Journal ArticleDOI
Frederick Jelinek
TL;DR: This group works towards automatic transcription of continuous speech with a vocabulary and syntax as unrestricted as possible and an experimental system is operational.
Abstract: This group works towards automatic transcription of continuous speech with a vocabulary and syntax as unrestricted as possible. It is a long-term effort; however, an experimental system is operational. The acoustic processor contains a spectrum analyzer based on the Fast Fourier Transform and a phone segmenter/recognizer which makes use of transitional and steady-state information in its classification. The linguistic processor accepts an imperfect string of phones and produces an estimated transcription of the speech input.

Journal ArticleDOI
TL;DR: SU/X and SU/P are concerned with the interpretation of large quantities of digitized signal data; features of their design include incremental interpretation of data employing many different pattern-invoked sources of knowledge.
Abstract: SU/X and SU/P are knowledge-based programs which employ pattern-invoked inference methods. Both tasks are concerned with the interpretation of large quantities of digitized signal data. The task of SU/X is to understand "continuous signals", that is, signals which persist over time. The task of SU/P is to interpret protein x-ray crystallographic data. Some features of the design are: (1) incremental interpretation of data employing many different pattern-invoked sources of knowledge, (2) production rule representation of knowledge, including high level strategy knowledge, (3) "opportunistic" hypothesis formation using both data-driven and model-driven techniques within a general hypothesize-and-test paradigm; and (4) multilevel representation of the solution hypothesis.

Journal ArticleDOI
TL;DR: PHLIQA1 is an experimental system for answering isolated English questions about a restricted subject domain; it runs under the MDS time sharing system on a Philips P1400 computer.
Abstract: PHLIQA1 is an experimental system for answering isolated English questions about a restricted subject domain. It was developed at Philips Research Laboratories in Eindhoven, The Netherlands, by a team consisting of W. J. Bronnenberg, H. C. Bunt, S. P. J. Landsbergen, P. Medema, R. J. H. Scha, W. J. Schoenmakers and E. P. C. van Utteren. The system was designed during the period 1972-1975, and implemented in 1975. It runs under the MDS time sharing system on a Philips P1400 computer. Some details of the 1975 version have been improved in the currently running system.

Proceedings ArticleDOI
TL;DR: Examples are presented to illustrate the unusual position taken by production systems on a number of control and pattern-matching issues and to provide critical tests which might be used to evaluate the effectiveness of new designs.
Abstract: Programs in the artificial intelligence domain impose unusual requirements on control structures. Production systems are a control structure with promising attributes for building generally intelligent systems with large knowledge bases. This paper presents examples to illustrate the unusual position taken by production systems on a number of control and pattern-matching issues. Examples are chosen to illustrate certain powerful features and to provide critical tests which might be used to evaluate the effectiveness of new designs.

Journal ArticleDOI
TL;DR: An operational theory for the generalization of productions for before-and-after situation pairs and situation sequences is developed, based on previous work in concept induction, and this theory has been computer implemented.
Abstract: Relational productions provide a mathematically tractable formal model for operators in discrete systems. Two paradigms for the inductive learning of such operators are considered: before-and-after situation pairs and situation sequences. An operational theory for the generalization of productions for these paradigms is then developed, based on previous work in concept induction. This theory has been computer implemented. In examples, three "blocks world" operators are learned from six before-and-after pairs and also from a sequence of fifteen blocks world situations. A transformational grammar learning example of Hayes-Roth is repeated with an improvement in speed of two orders of magnitude.
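
A small sketch of the inductive step under simplifying assumptions: situations are sets of ground literals, and an operator is generalized from before-and-after pairs by intersecting what is common to every example. The paper's theory goes further by variablizing terms, which this sketch omits.

```python
def induce_operator(pairs):
    """pairs: list of (before, after) frozensets of ground literals.
    Precondition keeps literals common to every 'before' situation;
    delete- and add-lists come from the observed differences."""
    precondition = frozenset.intersection(*(b for b, _ in pairs))
    deletes = frozenset.intersection(*(b - a for b, a in pairs))
    adds = frozenset.intersection(*(a - b for b, a in pairs))
    return precondition, deletes, adds

# Two observations of a blocks-world 'unstack' step.
pairs = [(frozenset({'on(A,B)', 'clear(A)', 'handempty'}),
          frozenset({'holding(A)', 'clear(B)'})),
         (frozenset({'on(C,D)', 'clear(C)', 'handempty'}),
          frozenset({'holding(C)', 'clear(D)'}))]
# Without variablization only 'handempty' survives; replacing A and C
# with a variable is exactly what relational productions contribute.
print(induce_operator(pairs))
```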

Journal ArticleDOI
TL;DR: This work describes a rule-based system that addresses an important design issue, the representational form for the premises and conclusions of rules, by using a partitioned semantic network representation.
Abstract: Rule-based inference systems allow judgmental knowledge about a specific problem domain to be represented as a collection of discrete rules. Each rule states that if certain premises are known, then certain conclusions can be inferred. An important design issue concerns the representational form for the premises and conclusions of the rules. We describe a rule-based system that uses a partitioned semantic network representation for the premises and conclusions.

Journal ArticleDOI
TL;DR: If a production system architecture is augmented with a mechanism that enables knowledge of the degree to which each production is currently satisfied to be maintained across cycles, then the dependency on the size of working memory can be eliminated as well.
Abstract: The obvious method of determining which productions are satisfied on a given cycle involves matching productions, one at a time, against the contents of working memory. The cost of this processing is essentially linear in the product of the number of productions in production memory and the number of assertions in working memory. By augmenting a production system architecture with a mechanism that enables knowledge of similarities among productions to be precomputed and then exploited during a run, it is possible to eliminate the dependency on the size of production memory. If in addition, the architecture is augmented with a mechanism that enables knowledge of the degree to which each production is currently satisfied to be maintained across cycles, then the dependency on the size of working memory can be eliminated as well. After a particular production system architecture, PSG, is described, two sets of mechanisms that increase its efficiency are presented. To determine their effectiveness, two augmented versions of PSG are compared experimentally with each other and with the original version.
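
A much-simplified sketch of the second augmentation (maintaining match state across cycles): per-condition hit sets are updated only when assertions enter or leave working memory, so match cost tracks the changes rather than the full memory. Variable bindings shared between conditions, which the real architecture must handle, are ignored here, and all names are assumptions.

```python
from collections import defaultdict

class IncrementalMatcher:
    def __init__(self, productions):
        # productions: name -> list of condition predicates over WMEs
        self.productions = productions
        self.hits = {name: defaultdict(set) for name in productions}

    def add_wme(self, wme):
        """Process one working-memory addition against all conditions."""
        for name, conditions in self.productions.items():
            for i, cond in enumerate(conditions):
                if cond(wme):
                    self.hits[name][i].add(wme)

    def remove_wme(self, wme):
        for name in self.productions:
            for satisfied in self.hits[name].values():
                satisfied.discard(wme)

    def conflict_set(self):
        """A production is satisfied when every condition has a hit;
        no rematching against the whole working memory is needed."""
        return [name for name, conditions in self.productions.items()
                if all(self.hits[name][i] for i in range(len(conditions)))]

m = IncrementalMatcher({'p1': [lambda w: w[0] == 'goal',
                               lambda w: w[0] == 'block']})
m.add_wme(('goal', 'stack'))
m.add_wme(('block', 'A'))
print(m.conflict_set())   # ['p1']
```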

Journal ArticleDOI
TL;DR: Assembling the required knowledge base is a difficult task that often extends over several years and involves numerous modifications, which presents a challenging problem in system construction.
Abstract: Recent research efforts aimed at task-oriented systems have emphasized the importance of large stores of domain-specific knowledge as a basis for high performance. But assembling the required knowledge base is a difficult task that often extends over several years, and involves numerous modifications to the knowledge base. Given the difficulty of making even small changes to a program, this presents a challenging problem in system construction.

Journal ArticleDOI
TL;DR: A semantic network is defined with its arcs and nodes separated into various sets, and a match routine is defined which is given a source node and a binding and finds target nodes, target bindings and more fully specified source bindings.
Abstract: A semantic network is defined with its arcs and nodes separated into various sets. Arcs are partitioned into descending, ascending, and auxiliary arcs. Nodes are partitioned into base, variable, assertion, pattern and auxiliary nodes. Nodes can be temporary or permanent.

Some pattern and assertion nodes, called rule nodes, represent propositional functions of the nodes they dominate. Rule nodes may bind the variables they dominate with any one of a set of binding relations representing quantifiers. A rule node which dominates variables all of which are bound is a constant deduction rule.

Deduction rules may be viewed as pattern-invoked procedures. The type of propositional function determines the procedure, the variables bound by the rule are the local variables, and the quantifier determines the type of binding.

A binding is defined as a list of variables associated with the nodes they are bound to. A binding can be used like a substitution, except it is seldom actually applied. Instead, a pattern node and a binding for it are used as a pair.

A match routine is defined which is given a source node and a binding and finds target nodes, target bindings and more fully specified source bindings. Target nodes that are patterns provide entrees into relevant rules.
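
A toy rendition of the match routine's contract, with assertions and patterns flattened to triples rather than network nodes: given a source pattern and a partial binding, it finds target assertions and more fully specified source bindings without ever actually applying the substitution.

```python
def is_variable(term):
    return isinstance(term, str) and term.startswith('?')

def match(pattern, binding, assertions):
    """Yield (target assertion, extended binding) pairs; the binding
    is carried alongside the pattern rather than substituted into it."""
    for assertion in assertions:
        new = dict(binding)
        for p, a in zip(pattern, assertion):
            p = new.get(p, p)          # dereference bound variables
            if is_variable(p):
                new[p] = a             # specify the binding further
            elif p != a:
                break                  # constant mismatch
        else:
            yield assertion, new

network = [('isa', 'Fido', 'dog'), ('isa', 'dog', 'mammal')]
for target, b in match(('isa', '?x', 'dog'), {}, network):
    print(target, b)    # ('isa', 'Fido', 'dog') {'?x': 'Fido'}
```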

Journal ArticleDOI
W. A. Woods
TL;DR: Concerns the use of natural language to manipulate data retrieval and display capabilities, enabling a decision-maker to obtain a grasp of an overall situation in the face of an overwhelming availability of low-level data.
Abstract: For many years, I have been pursuing a long-range research objective in the area of natural language understanding for man-machine communication, an objective that I share with many of my colleagues. The objective is to develop the capability for people to interact directly in fluent natural language with a computer system for support of some decision making task they are involved in. Specifically, I am concerned with the use of natural language to manipulate data retrieval and display capabilities to enable a decision-maker to obtain a grasp of an overall situation in the face of an overwhelming availability of low level data. Such a system must give concise answers to specific high-level questions posed by the decision-maker within a small number of seconds in most cases.

Journal ArticleDOI
TL;DR: It is argued that the use of a particular class of production systems demands a more detailed justification in domain-specific terms than is often given.
Abstract: The use of production systems as the primary method for encoding knowledge in large knowledge-based systems is discussed at two levels; their suitability as an architecture that can be efficiently supported and their appropriateness as a language of expression. Questions of efficiency are posed in the framework of a broad class of pattern-directed rewrite systems. Factors governing efficiency are discussed informally, and the usefulness of production systems as an information processing abstraction is examined critically. In this regard, several problems suggested by work on lexically motivated inference are described. It is argued that the use of a particular class of production systems demands a more detailed justification in domain-specific terms than is often given.

Journal ArticleDOI
TL;DR: As you can see by the thickness of this issue, the response to my request for contributions was overwhelming - I received 52 separate items!
Abstract: As you can see by the thickness of this issue, the response to my request for contributions was overwhelming - I received 52 separate items! Since the contributions arrived over a period of time, and since we had to use archaic means (typing, cutting and pasting) to produce this newsletter, the articles are not in optimal order. I hope that the index below will help untangle the issue.

Journal ArticleDOI
TL;DR: Shows how 13 kinds of design deviations arise from the level of sophistication of the task that the system is designed to perform, and proposes a new architecture that should be significantly more powerful and natural for building rule systems that do scientific discovery tasks.
Abstract: Some scientific inference tasks (including mass spectrum identification by Dendral, medical diagnosis by Mycin, and math theory development by AM) have been successfully modelled as rule-directed search processes. These rule systems are designed quite differently from "pure production systems". By concentrating upon the design of one program (AM), we shall show how 13 kinds of design deviations arise from (i) the level of sophistication of the task that the system is designed to perform, (ii) the inherent nature of the task, and (iii) the designer's view of the task. The limitations of AM suggest even more radical departures from traditional rule system architecture. All these modifications are then collected into a new, complicated set of constraints on the form of the data structures, the rules, the interpreter, and the distribution of knowledge between rules and data structures. These new policies sacrifice uniformity in the interests of clarity, efficiency and power derivable from a thorough characterization of the task. Rule systems whose architectures conform to the new design principles will be more awkward for many tasks than would "pure" systems. Nevertheless, the new architecture should be significantly more powerful and natural for building rule systems that do scientific discovery tasks.

Proceedings ArticleDOI
TL;DR: A detailed look is taken at the problem of factoring program proofs into a proof of the underlying algorithm, followed by a proof of correct implementation of abstract variables at the concrete level; an intermediate assertion is given, as well as sufficient conditions for correct initialization, invariance, and correctness at termination.
Abstract: A detailed look is taken at the problem of factoring program proofs into a proof of the underlying algorithm, followed by a proof of correct implementation of abstract variables at the concrete level. We do this considering four different concrete “marking” algorithms and formulating a single abstract algorithm and set of abstract specifications that can be instantiated to each of the four concrete cases. An intermediate assertion, as well as sufficient conditions for correct initialization, invariance, and correctness at termination are given at the abstract level. Proofs at the concrete level are then given by exhibiting appropriate mapping functions (from the concrete state vector to the abstract variables), and showing that the sufficient conditions are true. Proofs of termination are given by instantiating “termination schemas”.
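
One standard way to phrase such a factoring, sketched here as a simulation argument (the paper's exact conditions may differ): with a mapping function h from the concrete state vector to the abstract variables, it suffices to establish three conditions.

```latex
% Sketch only; h maps concrete states to abstract variables.
\begin{align*}
\text{Initialization:} \quad & \mathit{init}_c(s_0) \;\Rightarrow\; \mathit{init}_a(h(s_0)) \\
\text{Invariance (simulation):} \quad & s \to_c s' \;\Rightarrow\;
    h(s) \to_a h(s') \ \text{or}\ h(s) = h(s') \\
\text{Correctness at termination:} \quad &
    \mathit{halted}_c(s) \wedge \mathit{post}_a(h(s)) \;\Rightarrow\; \mathit{post}_c(s)
\end{align*}
```

Proving the abstract postcondition once then transfers to each of the four concrete marking algorithms via its own mapping function h.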

Proceedings ArticleDOI
TL;DR: This work presents a stack implementation of multiple environments similar in principle to that of Bobrow and Wegbreit, but based on a model which provides both static and dynamic scoping.
Abstract: We present a stack implementation of multiple environments similar in principle to that of Bobrow and Wegbreit, but based on a model which provides both static and dynamic scoping. We note some of the pragmatic consequences of this choice of models; one is that no unnecessary control stack is retained for certain important constructions such as “upward funargs” and coroutines. We also discuss the correct treatment of exit functions, and the need for “entry functions” if dynamic switching of control contexts is to be consistent.
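
A small model, under stated assumptions, of the two chains the paper distinguishes: each frame carries a static (lexical) link used for variable lookup and a separate dynamic (control) link, so a closure retains only its static chain and no unnecessary control stack.

```python
class Frame:
    def __init__(self, bindings, static=None, dynamic=None):
        self.bindings, self.static, self.dynamic = bindings, static, dynamic

    def lookup(self, name):
        """Lexical lookup: follow the static chain."""
        env = self
        while env is not None:
            if name in env.bindings:
                return env.bindings[name]
            env = env.static
        raise NameError(name)

    def fluid(self, name):
        """Dynamic lookup: follow the control (caller) chain."""
        env = self
        while env is not None:
            if name in env.bindings:
                return env.bindings[name]
            env = env.dynamic
        raise NameError(name)

globals_ = Frame({'x': 'lexical-x'})
maker = Frame({'y': 1}, static=globals_, dynamic=globals_)
# A closure made in `maker` keeps only the static chain alive; the
# control frame of `maker` can be popped (the "upward funarg" case).
closure_env = Frame({}, static=maker, dynamic=None)
caller = Frame({'x': 'fluid-x'}, static=globals_, dynamic=globals_)
call = Frame({}, static=closure_env, dynamic=caller)
print(call.lookup('x'), call.fluid('x'))   # lexical-x fluid-x
```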

Journal ArticleDOI
TL;DR: This work reports on a rough and inconclusive experiment designed to answer one aspect of the tantalizing question: How much chess-specific knowledge does it take to play at a given level of competence?
Abstract: Chess has served as a convenient vehicle for studying cognition and perception (see de Groot [1965], Chase and Simon [1973]) as well as machine intelligence. Perhaps the central question for both of these research uses of chess is: How much chess-specific knowledge does it take to play at a given level of competence, for example, at the master level? It is difficult to say what chess-specific knowledge is, and it certainly consists of different types of knowledge that must be considered independently of each other (for example, "book knowledge" obtained by studying chess books is quite different from experience obtained in over-the-board play). Even if one succeeds in defining what "chess-specific knowledge" is, there remains the difficulty of measuring it. Because of these difficulties, any approach to measuring the amount of knowledge possessed by a practitioner of a craft must be based on questionable assumptions, and any result obtained is subject to uncertainty and criticism. Only the inherent interest of the question posed justifies reporting on a rough and inconclusive experiment designed to answer one aspect of the tantalizing question: How much chess-specific knowledge does it take to play at a given level of competence?

Journal ArticleDOI
TL;DR: It is argued that the proposed pattern-directed processing model could be successfully implemented in artificial intelligence systems to provide adaptive error-handling mechanisms such as those observed in human behavior.
Abstract: A framework for viewing human text comprehension, memory, and recall is presented that assumes patterns of abstract conceptual relations are used to guide processing. These patterns consist of clusters of knowledge that encode prototypical co-occurrences of situations and events in narrative texts. The patterns are assumed to be a part of a person's world knowledge and can be activated during comprehension to build associations among multiple linguistic propositions in memory according to their higher-order conceptual relations. During text reproduction from memory, these patterns provide retrieval plans for recall and a mechanism for sophisticated "guessing" when retrieval fails. Some data from human text learning tasks are presented as evidence for these higher-order conceptual patterns. Several structural and processing properties of the model are evaluated in light of these data. It is argued that the proposed pattern-directed processing model could be successfully implemented in artificial intelligence systems to provide adaptive error-handling mechanisms such as those observed in human behavior.

Proceedings ArticleDOI
TL;DR: By extending a given analogy, a known program which solves a given problem is converted to a program which solves a different but analogous problem, the two problems being related by an initial specified analogy.
Abstract: By extending a given analogy, a known program which solves a given problem is converted to a program which solves a different but analogous problem. The domains of the two problems need not be the same, but they must be related by an initial specified analogy. There are three features which distinguish the approach. First, the analogy formation evolves gradually with the synthesis of the new program. Secondly, the formation of the analogy is directed by the correctness proof of the known program. Finally, the synthesis process also produces a correctness proof for the synthesized program.

Journal ArticleDOI
D. A. Waterman
TL;DR: Describes the development of a RITA agent for Exemplary Programming (EP), which learns new facts and stores them in a data base and can learn new procedures for data manipulation.
Abstract: This paper describes the development of a RITA agent for Exemplary Programming (EP). The EP agent learns new facts and stores them in a data base and can learn new procedures for data manipulation. Both the EP agent and the programs it creates are written as sets of IF-THEN rules (production systems) in RITA: the Rule-directed Interactive Transaction Agent system. The programs produced by the EP agent act as "personal computer agents" to perform a variety of tasks for the user. Program creation is a cooperative effort between the user and the EP agent: the user illustrates what he wants done by performing a series of operations on the computer, and the agent watches and asks the user pertinent questions during the demonstration. The resulting program then becomes the user's personal computer agent for performing the given task.
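
For flavor, a toy forward-chaining interpreter over IF-THEN rules of the general kind described; the rule syntax and the mail-fetching task are invented for illustration and do not reflect RITA's actual language.

```python
# Illustrative only -- RITA's real rule syntax differs. Both the EP
# agent and the programs it writes are sets of IF-THEN rules run by
# a forward-chaining interpreter over a fact data base.
facts = {('task', 'fetch-mail')}

rules = [
    # IF the user asked to fetch mail THEN record the host to contact.
    (lambda f: ('task', 'fetch-mail') in f,
     lambda f: f.add(('connect', 'mailhost'))),   # hypothetical action
    # IF we know the host THEN mark the task done.
    (lambda f: ('connect', 'mailhost') in f,
     lambda f: f.add(('done', 'fetch-mail')))]

changed = True
while changed:                     # fire rules until no new facts appear
    changed = False
    for condition, action in rules:
        before = len(facts)
        if condition(facts):
            action(facts)
            changed = changed or len(facts) != before
print(facts)
```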

Journal ArticleDOI
TL;DR: The ROBOT project develops an English interface to data base management systems (DBMS) that works in a "movable miniworld": the semantic primitives are fixed while the area of discourse varies with the content of the data base, so a new data base can be interfaced by making only dictionary changes.
Abstract: The goal of the ROBOT project is to develop an English interface to data base management systems (DBMS). Our approach calls for mapping English language questions into a language of data base semantics that is independent of the content of the data base. In this way we work in a "movable miniworld" since the semantic primitives are fixed, but the area of discourse varies with the content of the data base. Thus, by making only dictionary changes we have interfaced a student grade file, an employee file, and a data dictionary.