
Showing papers on "Domain knowledge published in 1985"


Journal ArticleDOI
TL;DR: The knowledge and skills that were central to each experimentally created design context are described, and the functional utility of each is discussed.
Abstract: A designer's expertise rests on the knowledge and skills which develop with experience in a domain. As a result, when a designer is designing an object in an unfamiliar domain he will not have the same knowledge and skills available to him as when he is designing an object in a familiar domain. In this paper we look at the software designer's underlying constellation of knowledge and skills, and at the way in which this constellation is dependent upon experience in a domain. What skills drop out, what skills, or interactions of skills come forward as experience with the domain changes? To answer the above question, we studied expert designers in experimentally created design contexts with which they were differentially familiar. In this paper we describe the knowledge and skills we found were central to each of the above contexts and discuss the functional utility of each. In addition to discussing the knowledge and skills we observed in expert designers, we will also compare novice and expert behavior.

392 citations


Journal ArticleDOI
TL;DR: Research on knowledge representation in artificial intelligence provides a wealth of relevant techniques that can be incorporated into specification languages.
Abstract: Specification of many kinds of knowledge about the world is essential to requirements engineering. Research on knowledge representation in artificial intelligence provides a wealth of relevant techniques that can be incorporated into specification languages.

274 citations


Journal ArticleDOI
TL;DR: In this paper, a reconceptualisation of the theory-practice problem in initial and continuing professional education is presented, based on considering the influence of academic and professional contexts on knowledge operation and knowledge use.
Abstract: This paper presents a radical reconceptualisation of the theory-practice problem in initial and continuing professional education, based on considering the influence of academic and professional contexts on knowledge operation and knowledge use. The first part is concerned with making important conceptual distinctions, the second with implications for the practice of professional education and the relationships between higher education and the professions. The conceptual section of the paper first distinguishes different kinds of professional knowledge with particular attention to generalisability (knowledge of particular cases, knowledge of precepts, knowledge of theory) and explicitness (codified knowledge, knowledge embedded in traditions, craft knowledge, tacit knowledge, etc.). Then it takes Broudy's four modes of knowledge use—replication, application, interpretation, association—and illustrates their significance for understanding the theory-practice relationship in a number of professions...

262 citations


Journal ArticleDOI
TL;DR: In this paper, the authors describe knowledge acquisition strategies developed in the course of handcrafting a diagnostic system and report on their consequent implementation in MORE, an automated knowledge acquisition system.
Abstract: This paper describes knowledge acquisition strategies developed in the course of handcrafting a diagnostic system and reports on their consequent implementation in MORE, an automated knowledge acquisition system. We describe MORE in some detail, focusing on its representation of domain knowledge, rule generation capabilities, and interviewing techniques. MORE's approach is shown to embody methods which may prove fruitful to the development of knowledge acquisition systems in other domains.

229 citations


Proceedings Article
18 Aug 1985
TL;DR: CHECK, a program that verifies the consistency and completeness of expert system knowledge bases which utilize the Lockheed Expert System (LES) framework, combines logical principles with specific information about the knowledge representation formalism of LES.
Abstract: In this paper we describe a program that verifies the consistency and completeness of expert system knowledge bases which utilize the Lockheed Expert System (LES) framework. The algorithms described here are not specific to LES and can be applied to most rule-based systems. The program, called CHECK, combines logical principles as well as specific information about the knowledge representation formalism of LES. The program checks for redundant rules, conflicting rules, subsumed rules, missing rules, circular rules, unreachable clauses, and deadend clauses. It also generates a dependency chart which shows the dependencies among the rules and between the rules and the goals. CHECK can help the knowledge engineer to detect many programming errors even before the knowledge base testing phase. It also helps detect gaps in the knowledge base which the knowledge engineer and the expert might have overlooked. A wide variety of knowledge bases have been analyzed using CHECK.

220 citations
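As an illustration of the kinds of static checks described above, the following toy sketch flags redundant, subsumed, and circular rules in a small IF/THEN rule base. The rule format, rule names, and findings are invented for this example; CHECK itself operates on the LES representation and covers further checks (missing rules, unreachable and deadend clauses, dependency charts) not shown here.

```python
# Toy sketch of CHECK-style static analysis of a rule base (illustrative only;
# CHECK's real rule formalism is the Lockheed Expert System's, not this one).
from itertools import combinations

# A rule: IF all premises hold THEN conclude the conclusion.
rules = [
    ("r1", frozenset({"fever", "cough"}), "flu"),
    ("r2", frozenset({"cough", "fever"}), "flu"),   # redundant with r1
    ("r3", frozenset({"fever"}), "flu"),            # subsumes r1 and r2
    ("r4", frozenset({"flu"}), "rest"),
    ("r5", frozenset({"rest"}), "flu"),             # circular with r4
]

def redundant(rules):
    # two rules with identical premises and the same conclusion
    return [(a, b) for (a, pa, ca), (b, pb, cb) in combinations(rules, 2)
            if pa == pb and ca == cb]

def subsumed(rules):
    # rule A is subsumed by rule B if B reaches the same conclusion
    # from a strict subset of A's premises
    return [(a, b) for (a, pa, ca) in rules for (b, pb, cb) in rules
            if a != b and ca == cb and pb < pa]

def circular(rules):
    # a rule is circular if its conclusion can transitively re-derive
    # one of its own premises through other rules
    graph = {}
    for _, prem, concl in rules:
        for p in prem:
            graph.setdefault(p, set()).add(concl)

    def reaches(start, target, seen=()):
        return any(nxt == target or
                   (nxt not in seen and reaches(nxt, target, seen + (nxt,)))
                   for nxt in graph.get(start, ()))

    return [name for name, prem, concl in rules
            if any(reaches(concl, p) for p in prem)]

print("redundant pairs:", redundant(rules))
print("subsumed (rule, by):", subsumed(rules))
print("rules on a cycle:", circular(rules))
```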


Patent
07 Jun 1985
TL;DR: A tool is provided for knowledge engineers for building and interpreting a knowledge base having separate portions encoding control knowledge, factual knowledge, and judgmental rules; it can be used to build knowledge systems that can always explain their conclusions and reasoning, and that are intelligible and modifiable.
Abstract: A tool is provided for knowledge engineers for building and interpreting a knowledge base having separate portions encoding control knowledge, factual knowledge, and judgmental rules. The tool has an inference engine applying the judgmental rules according to a built-in control procedure defining discrete states or control steps during a consultation with a user. The control knowledge is encoded in an applicative and imperative language defining control actions to be executed during interruption of the built-in control procedure at specified control steps. Since the control knowledge is explicit and results in the modification of data only in a precisely defined fashion, the tool can be used to build knowledge systems that can always explain their conclusions and reasoning, and that are intelligible and modifiable. To provide transparent representation of control knowledge as well as factual knowledge, the knowledge base is preferably organized into distinct frames which include the rules; control blocks separately encoding the control knowledge; and classes which become instantiated, attributes which take on values describing the class instances, class types, legal value hierarchies, and user-defined functions, which all encode factual knowledge. The knowledge engineer may provide control blocks to be executed at the start of the consultation, after the instantiation of specified classes, when a value for a specified attribute is to be determined, after a specified attribute is determined, and upon explicit invocation by another control block. The tool can also implicitly determine subsumed attributes.

213 citations


Patent
06 Jun 1985
TL;DR: A tool for building a knowledge system and running a consultation on a computer is easily mastered by people with little computer experience yet also provides advanced capabilities for the experienced knowledge engineer as discussed by the authors.
Abstract: A tool for building a knowledge system and running a consultation on a computer is easily mastered by people with little computer experience yet also provides advanced capabilities for the experienced knowledge engineer. The knowledge system includes a knowledge base in an easily understood English-like language expressing facts, rules, and meta-facts for specifying how the rules are to be applied to solve a specific problem. The tool includes interactive knowledge base debugging, question generation, legal response checking, explanation, certainty factors, and the use of variables. The knowledge base language permits recursion and is extensible. Preferably, control during a consultation is goal directed in depth-first fashion as specified by rule order. The tool is easily embodied in assembly language, or in PROLOG to allow user-defined PROLOG functions.

209 citations
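The goal-directed, depth-first control regime described above can be sketched in a few lines. The facts, goals, and rules below are invented for illustration; the actual tool works over an English-like knowledge base language with certainty factors, variables, and recursion, none of which are modeled here.

```python
# Minimal backward-chaining sketch: goals are pursued depth-first, and
# competing rules for the same goal are tried in the order they appear.
rules = [
    # (conclusion, premises) -- earlier rules are tried first
    ("should_irrigate", ["soil_dry", "no_rain_forecast"]),
    ("soil_dry",        ["days_since_rain_over_7"]),
    ("soil_dry",        ["moisture_reading_low"]),
]
facts = {"moisture_reading_low", "no_rain_forecast"}

def prove(goal, depth=0):
    """Try to establish `goal` from known facts, backward-chaining through rules."""
    print("  " * depth + "goal:", goal)
    if goal in facts:
        return True
    for conclusion, premises in rules:      # rule order decides search order
        if conclusion == goal and all(prove(p, depth + 1) for p in premises):
            return True
    return False

print("should_irrigate?", prove("should_irrigate"))
```

Rule order matters here exactly as the abstract says: reordering the two soil_dry rules changes which premise is explored first.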


Journal ArticleDOI
TL;DR: A framework for domain-specific automatic programming is developed in terms of two activities, formalization and implementation, each of which transforms descriptions of the program as it proceeds through intermediate states of development.
Abstract: Domain knowledge is crucial to an automatic programming system, and the interaction between domain knowledge and programming is not well understood at the current time. The NIX project at Schlumberger-Doll Research has been investigating this issue in the context of two application domains related to oil well logging. Based on these experiments we have developed a framework for domain-specific automatic programming. Within the framework, programming is modeled in terms of two activities, formalization and implementation, each of which transforms descriptions of the program as it proceeds through intermediate states of development. The activities and transformations may be used to characterize the interaction of programming knowledge and domain knowledge in an automatic programming system.

142 citations


Proceedings Article
18 Aug 1985
TL;DR: MORE is a tool that assists in eliciting knowledge from domain experts by formulating its questions in a way that focuses on what kinds of knowledge are likely to be diagnostically significant.
Abstract: MORE is a tool that assists in eliciting knowledge from domain experts. Acquired information is added to a domain model of qualitative causal relations that may hold among hypotheses, symptoms, and background conditions. After generating diagnostic rules from the domain model, MORE prompts for additional information that would allow a stronger set of diagnostic rules to be generated. MORE's primary value lies in its understanding of what kinds of knowledge are likely to be diagnostically significant. By formulating its questions in a way that focuses on such knowledge, it makes the most effective use of the domain experts' time.

127 citations
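A toy sketch of the step MORE automates may help: turning qualitative causal links among hypotheses, symptoms, and background conditions into candidate diagnostic rules, and asking a follow-up question wherever a link is too weak to yield a strong rule. The causal model and question wording below are invented, not MORE's actual representation or interviewing strategy.

```python
# Toy illustration (not MORE's actual representation): a qualitative causal
# model linking hypotheses to symptoms, plus background conditions that make
# a link more diagnostic. Diagnostic rules run from symptom to hypothesis.
causal_links = [
    # (hypothesis, symptom, background conditions that strengthen the link)
    ("drum_leak",     "low_water_level", ["gauge_recently_calibrated"]),
    ("valve_failure", "low_water_level", []),
    ("valve_failure", "pressure_drop",   []),
]

def generate_rules(links):
    rules, questions = [], []
    for hypothesis, symptom, conditions in links:
        premises = [symptom] + conditions
        rules.append(f"IF {' AND '.join(premises)} THEN suspect {hypothesis}")
        if not conditions:
            # MORE-style follow-up: an unconditioned link yields a weak rule,
            # so ask the expert what would make the symptom more telling.
            questions.append(
                f"Under what background conditions does {symptom} "
                f"point strongly to {hypothesis}?")
    return rules, questions

rules, questions = generate_rules(causal_links)
print(*rules, sep="\n")
print(*questions, sep="\n")
```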


Journal ArticleDOI
TL;DR: This anecdotal article considers some of the advantages of using a diverse collection of domain experts rather than basing an expert system project on collaboration with a single domain expert.
Abstract: Expert system projects are often based on collaboration with a single domain expert. This leads to difficulties in judging the suitability of the chosen task and in acquiring the detailed knowledge required to carry out the task. This anecdotal article considers some of the advantages of using a diverse collection of domain experts.

122 citations


Proceedings ArticleDOI
01 May 1985
TL;DR: A "complete" model of a database environment is employed to study the relative performance of three different approaches to the concurrency control problem under a variety of modeling assumptions, and the paper examines how differences in the underlying assumptions explain the seemingly contradictory performance results.
Abstract: A number of recent studies have examined the performance of concurrency control algorithms for database management systems. The results reported to date, rather than being definitive, have tended to be contradictory. In this paper, rather than presenting "yet another algorithm performance study", we critically investigate the assumptions made in the models used in past studies and their implications. We employ a "complete" model of a database environment to study the relative performance of three different approaches to the concurrency control problem under a variety of modeling assumptions. We show how differences in the underlying assumptions explain the seemingly contradictory performance results. We also examine how realistic the various assumptions would be for "real" database systems.

Proceedings Article
18 Aug 1985
TL;DR: This paper describes SALT, a tool designed to assist with knowledge acquisition for configuration tasks; SALT exploits a problem-solving strategy involving stages of generate, test, backup, modify, and regenerate to guide its interrogation of domain experts.
Abstract: Over the past ten years, significant progress has been made in understanding how the knowledge acquisition process for classification systems can be automated. But during this period little attention has been paid to the problem of how to automate the knowledge acquisition process for systems that solve problems by constructing solutions. This paper describes SALT, a tool designed to assist with knowledge acquisition for configuration tasks. SALT assumes a problem-solving strategy involving stages of generate, test, backup, modify, and regenerate. It exploits this problem-solving strategy to guide its interrogation of domain experts and to represent the knowledge they provide in a way that insures it will be brought to bear whenever relevant.
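The generate, test, backup, modify, and regenerate cycle that SALT assumes can be illustrated with a toy propose-and-revise loop. The design parameters, constraint, and fix below are invented; SALT's real contribution is eliciting exactly this kind of knowledge (what to propose, what to test, how to fix a violation) from domain experts.

```python
# Toy propose-and-revise loop in the spirit of SALT's assumed strategy:
# generate a design extension, test it against constraints, and on failure
# back up and modify an earlier choice before regenerating. All parameters
# and constraints here are invented.
def propose(design):
    # generate: fill in the next unspecified parameter with a default
    defaults = {"motor_size": 10, "cable_count": 2}
    for name, value in defaults.items():
        if name not in design:
            design[name] = value
            return name
    return None

def test(design):
    # test: return the name of a violated constraint, if any
    if "cable_count" in design and design["cable_count"] * 5 < design.get("load", 0):
        return "cables_too_weak"
    return None

fixes = {"cables_too_weak": ("cable_count", lambda v: v + 1)}  # modify knowledge

design = {"load": 18}
while (step := propose(design)) is not None or test(design):
    violation = test(design)
    if violation:                    # back up and modify, then regenerate
        param, fix = fixes[violation]
        design[param] = fix(design[param])
print(design)
```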

Proceedings Article
18 Aug 1985
TL;DR: It is shown how justifications can be used by a system to generate explanations, for its own use, of potential causes of observed failures, and how the information included in these justifications allows the system to isolate potentially faulty supporting beliefs for its rules and to effect repairs.
Abstract: We discuss the representation and use of justification structures as an aid to knowledge base refinement. We show how justifications can be used by a system to generate explanations, for its own use, of potential causes of observed failures. We discuss specific information that is usefully included in these justifications to allow the system to isolate potentially faulty supporting beliefs for its rules and to effect repairs. This research is part of a larger effort to develop a Learning Apprentice System (LAS) that partially automates initial construction of a knowledge base from first-principle domain knowledge as well as knowledge base refinement during routine use. A simple implementation has been constructed that demonstrates the feasibility of building such a system.
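A minimal sketch of the justification bookkeeping idea: each conclusion records the beliefs it rests on, so an observed failure can be traced back to candidate faulty beliefs. The beliefs, rule, and failure below are invented and much simpler than the justification structures the paper describes.

```python
# Sketch of justification bookkeeping for knowledge base refinement:
# each rule firing records the beliefs it depended on, so an observed
# failure can be traced back to candidate culprits. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class Conclusion:
    value: str
    justification: list = field(default_factory=list)  # supporting beliefs

beliefs = {
    "pump_is_primed": "assumed by default",
    "pressure_sensor_accurate": "stated by expert",
}

def apply_rule(beliefs):
    # a rule that concludes "flow_ok" from two supporting beliefs
    support = ["pump_is_primed", "pressure_sensor_accurate"]
    if all(b in beliefs for b in support):
        return Conclusion("flow_ok", justification=support)
    return None

def explain_failure(conclusion, observed):
    # if the conclusion contradicts an observation, every supporting
    # belief is a candidate repair point
    if conclusion and conclusion.value != observed:
        return [(b, beliefs[b]) for b in conclusion.justification]
    return []

c = apply_rule(beliefs)
print("suspect beliefs:", explain_failure(c, observed="flow_blocked"))
```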

01 Jan 1985
TL;DR: The design and implementation of a knowledge representation framework, called Ace, geared towards facilitating the interaction of linguistic and conceptual knowledge in language processing, and a general-purpose natural language generator, KING (Knowledge INtensive Generator), has been implemented to apply knowledge in the Ace form.
Abstract: The development of natural language interfaces to Artificial Intelligence systems is dependent on the representation of knowledge. A major impediment to building such systems has been the difficulty in adding sufficient linguistic and conceptual knowledge to extend and adapt their capabilities. This difficulty has been apparent in systems which perform the task of language production, i.e. the generation of natural language output to satisfy the communicative requirements of a system. The problem of extending and adapting linguistic capabilities is rooted in the problem of integrating abstract and specialized knowledge and applying this knowledge to the language processing task. Three aspects of a knowledge representation system are highlighted by this problem: hierarchy, or the ability to represent relationships between abstract and specific knowledge structures; explicit referential knowledge, or knowledge about relationships among concepts used in referring to concepts; and uniformity, the use of a common framework for linguistic and conceptual knowledge. The knowledge-based approach to language production addresses the language generation task from within the broader context of the representation and application of conceptual and linguistic knowledge. This knowledge-based approach has led to the design and implementation of a knowledge representation framework, called Ace, geared towards facilitating the interaction of linguistic and conceptual knowledge in language processing. Ace is a uniform, hierarchical representation of the referential and metaphorical relationships among concepts. A general-purpose natural language generator, KING (Knowledge INtensive Generator), has been implemented to apply knowledge in the Ace form. The generator is designed for knowledge-intensivity and incrementality, to exploit the power of the Ace knowledge in generation. The generator works by applying structured associations, or mappings from conceptual to linguistic structures, and combining these structures into grammatical utterances. This has proven to be a simple but powerful mechanism, easy to adapt and extend, and has provided strong support for the role of conceptual organization in language generation.
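The "structured association" idea, mapping conceptual structures onto linguistic patterns and composing the results, can be caricatured in a few lines. The concept roles, patterns, and recursive realization below are invented for illustration and bear no resemblance to Ace's actual hierarchy of referential and metaphorical relationships.

```python
# Toy illustration of mapping conceptual structures to linguistic structures
# and composing them into an utterance (not Ace/KING's actual machinery).
concept = {
    "type": "transfer-event",
    "agent": {"type": "person", "name": "the analyst"},
    "object": {"type": "report", "name": "the report"},
    "recipient": {"type": "person", "name": "the committee"},
}

# structured associations: concept type -> linguistic pattern over its roles
associations = {
    "transfer-event": "{agent} sent {object} to {recipient}",
    "person": "{name}",
    "report": "{name}",
}

def realize(concept):
    # realize each role recursively, then fill the pattern for this concept type
    pattern = associations[concept["type"]]
    fillers = {role: realize(value) if isinstance(value, dict) else value
               for role, value in concept.items()}
    return pattern.format(**fillers)

print(realize(concept))   # -> "the analyst sent the report to the committee"
```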

Proceedings Article
18 Aug 1985
TL;DR: This paper describes an approach to knowledge base refinement, an important aspect of knowledge acquisition, and describes both domain-independent and domain-specific metaknowledge about the refinement process.
Abstract: This paper describes an approach to knowledge base refinement, an important aspect of knowledge acquisition. Knowledge base refinement is characterized by the addition, deletion, and alteration of rule-components in an existing knowledge base, in an attempt to improve an expert system's performance. SEEK2 extends the capabilities of its predecessor rule refinement system, SEEK [1]. In this paper we describe the progress we have made since developing the original SEEK program: (a) SEEK2 works with a more general class of knowledge bases than SEEK, (b) SEEK2 has an "automatic pilot" capability, i.e., it can, if desired, perform all of the basic tasks involved in knowledge base refinement without human interaction, (c) a metalanguage for knowledge base refinement has been specified which describes both domain-independent and domain-specific metaknowledge about the refinement process.
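A toy version of the refinement loop: propose small edits to a rule (dropping or adding a premise) and keep an edit only if accuracy on stored cases improves. The rule, cases, and scoring below are invented; SEEK2's refinement operators, metalanguage, and case handling are far richer.

```python
# Toy rule-refinement loop in the spirit of SEEK2: propose small edits to a
# rule and keep the edit only if accuracy on stored cases improves.
cases = [  # (observed findings, correct diagnosis)
    ({"rash", "fever"}, "measles"),
    ({"rash"}, "measles"),
    ({"fever"}, "other"),
]

def accuracy(premises, cases, conclusion="measles"):
    def predict(findings):
        return conclusion if premises <= findings else "other"
    return sum(predict(f) == d for f, d in cases) / len(cases)

def refine(premises, cases, all_findings=frozenset({"rash", "fever", "cough"})):
    best, best_score = premises, accuracy(premises, cases)
    candidates = [premises - {p} for p in premises]                  # generalize
    candidates += [premises | {p} for p in all_findings - premises]  # specialize
    for cand in candidates:
        score = accuracy(cand, cases)
        if score > best_score:
            best, best_score = cand, score
    return best, best_score

rule = frozenset({"rash", "fever"})   # IF rash AND fever THEN measles
print(refine(rule, cases))            # dropping "fever" raises accuracy
```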

Journal ArticleDOI
TL;DR: The primary features of this approach are that it gives software engineers who do not know knowledge engineering an easy place to start, and that it proceeds in a step-by-step fashion from initiation to implementation without inducing conceptual bottlenecks into the development process.
Abstract: Getting started on a new knowledge engineering project is a difficult and challenging task, even for those who have done it before. For those who haven't, the task can often prove impossible. One reason is that the requirements-oriented methods and intuitions learned in the development of other types of software do not carry over well to the knowledge engineering task. Another reason is that methodologies for developing expert systems by extracting, representing, and manipulating an expert's knowledge have been slow in coming. At Tektronix, we have been using a step-by-step approach to prototyping expert systems for over two years now. The primary features of this approach are that it gives software engineers who do not know knowledge engineering an easy place to start, and that it proceeds in a step-by-step fashion from initiation to implementation without inducing conceptual bottlenecks into the development process. This methodology has helped us collect the knowledge necessary to implement several prototype knowledge-based systems, including a troubleshooting assistant for the Tektronix FG-502 function generator and an operator's assistant for a wave solder machine.

Proceedings ArticleDOI
18 Aug 1985
TL;DR: The method utilizes proofs of design correctness to guide the process of generalization and can generalize a rotational shift register into a schema describing devices capable of computing arbitrary permutations.
Abstract: This paper presents a method of learning to solve design problems by generalizing examples. The technique has been developed in the domain of logic circuit design. It involves the use of domain knowledge to analyze examples and produce generalized circuit designs. The method utilizes proofs of design correctness to guide the process of generalization. Our approach is illustrated by showing it can generalize a rotational shift register into a schema describing devices capable of computing arbitrary permutations.

Book ChapterDOI
01 Apr 1985
TL;DR: In HERACLES, inference procedures are represented as abstract metarules, separate from domain facts and relations; combining these abstract procedures with a relational language for organizing domain knowledge provides a generic framework for constructing knowledge bases for related problems in other domains and a useful starting point for studying the nature of strategies.
Abstract: A poorly designed knowledge base can be as cryptic as an arbitrary program and just as difficult to maintain. Representing inference procedures abstractly, separately from domain facts and relations, makes the design more transparent and explainable. The combination of abstract procedures and a relational language for organizing domain knowledge provides a generic framework for constructing knowledge bases for related problems in other domains and also provides a useful starting point for studying the nature of strategies. In HERACLES, inference procedures are represented as abstract metarules, expressed in a form of the predicate calculus, organized and controlled as rule sets. A compiler converts the rules into Lisp code and allows domain relations to be encoded as arbitrary data structures for efficiency. Examples are given of the explanation and teaching capabilities afforded by this representation. Different perspectives for understanding HERACLES's inference procedure and how it defines knowledge bases are discussed in some detail.
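The separation the abstract argues for, strategy expressed as abstract metarules over a relational encoding of domain knowledge, can be suggested with a toy example. The relations and metarules below are invented and written in Python rather than the predicate-calculus form, compiled to Lisp, that HERACLES actually uses.

```python
# Toy sketch of domain-independent metarules over a relational encoding of
# domain knowledge (invented relations; illustrative only).
domain = {
    # relational domain knowledge: findings that strongly suggest a hypothesis
    "trigger": {("stiff_neck", "meningitis"), ("chest_pain", "angina")},
    # sub-types of a hypothesis worth refining into
    "subtype": {("meningitis", "bacterial_meningitis"),
                ("meningitis", "viral_meningitis")},
}

def metarule_pursue_triggered(findings, domain):
    """If a finding triggers a hypothesis, make that hypothesis a goal."""
    return [h for f, h in domain["trigger"] if f in findings]

def metarule_refine(goals, domain):
    """If a goal hypothesis has subtypes, pursue the subtypes next."""
    return [s for g in goals for h, s in domain["subtype"] if h == g]

findings = {"stiff_neck", "fever"}
goals = metarule_pursue_triggered(findings, domain)
print("hypotheses to pursue:", goals)
print("refinements to consider:", metarule_refine(goals, domain))
```

The point of the separation is that the two metarules mention only the relations "trigger" and "subtype", never the medical vocabulary, so the same strategy could be reused over a different domain's relations.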


Proceedings ArticleDOI
01 Jun 1985
TL;DR: This paper takes a retrospective look at the codified knowledge base of the Design Automation Assistant, a knowledge-based expert system (KBES), examining what has been learned about VLSI design and discussing both the major steps in the implementation design process and the extent to which each rule embodies domain knowledge.
Abstract: The Design Automation Assistant is a knowledge-based expert-system, KBES, that generates a technology-independent list of operators, registers, data paths and control signals from an algorithmic description of a VLSI system. One merit of codifying knowledge in a KBES is that it can be easily quantified and qualified. This paper takes a retrospective on that codified knowledge base, examining what has been learned about VLSI design. It discusses both the major steps in the implementation design process and the extent to which each rule embodies domain knowledge. Finally, the paper provides an example design with typical rules from each of the major steps in the implementation design process.

Proceedings Article
18 Aug 1985
TL;DR: This work proposes using the Knuth-Bendix completion procedure to synthesize logic programs, as well as functional programs, from specifications and domain knowledge expressed as equivalence-preserving rewrite rules.
Abstract: The Knuth-Bendix completion procedure was introduced as a means of deriving canonical term-rewriting systems to serve as decision procedures for given equational theories. The procedure generates new rewrite rules to resolve ambiguities resulting from existing rules that overlap. We propose using this procedure to synthesize logic programs, as well as functional programs, from specifications and domain knowledge expressed as equivalence-preserving rewrite rules. An implementation is underway.
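For readers unfamiliar with the raw material of this approach, the sketch below shows equivalence-preserving rewrite rules being applied exhaustively to normalize a term. It deliberately omits the completion procedure itself (orienting equations and adding new rules from overlapping left-hand sides); the rules and term encoding are invented for illustration.

```python
# Minimal term-rewriting sketch (illustrative; the completion procedure,
# which adds new rules derived from overlapping left-hand sides, is omitted).
# Terms are nested tuples; pattern variables start with "?".
RULES = [
    (("plus", "0", "?x"), "?x"),                                  # 0 + x -> x
    (("plus", ("s", "?x"), "?y"), ("s", ("plus", "?x", "?y"))),   # s(x)+y -> s(x+y)
]

def match(pattern, term, env):
    if isinstance(pattern, str) and pattern.startswith("?"):
        if pattern in env:
            return env if env[pattern] == term else None
        return {**env, pattern: term}
    if (isinstance(pattern, tuple) and isinstance(term, tuple)
            and len(pattern) == len(term)):
        for p, t in zip(pattern, term):
            env = match(p, t, env)
            if env is None:
                return None
        return env
    return env if pattern == term else None

def substitute(template, env):
    if isinstance(template, str) and template.startswith("?"):
        return env[template]
    if isinstance(template, tuple):
        return tuple(substitute(t, env) for t in template)
    return template

def rewrite(term):
    # normalize subterms first, then try each rule at the root
    if isinstance(term, tuple):
        term = tuple(rewrite(t) for t in term)
    for lhs, rhs in RULES:
        env = match(lhs, term, {})
        if env is not None:
            return rewrite(substitute(rhs, env))
    return term

two_plus_one = ("plus", ("s", ("s", "0")), ("s", "0"))
print(rewrite(two_plus_one))   # -> ('s', ('s', ('s', '0')))
```

When completion succeeds, the resulting rule set is confluent and terminating, so normalization like this decides equality in the underlying equational theory; that is the property the proposed synthesis method builds on.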

Proceedings ArticleDOI
01 May 1985
TL;DR: The paper proposes concepts, methods and tools, based on database design techniques, to support the extraction, integration, transformation and evaluation of terminological knowledge, and discusses the possibilities and limitations of automating these steps.
Abstract: One of the most difficult problems in knowledge base design is the acquisition and formalization of an expert's rules concerning a special universe of discourse. In most cases different experts and the knowledge base designer himself will use different terminologies, and will represent rules concerning the same objects in a different way. Therefore, one of the first steps in knowledge base design has to be the construction of an integrated, commonly accepted terminology that can be shared by all persons involved in the design process. This design step will be the topic of the paper. The paper proposes concepts, methods and tools to support the extraction, integration, transformation and evaluation of terminological knowledge that are based on database design techniques, and discusses the possibilities and limitations of automating these. Keywords and phrases: knowledge-based systems, knowledge base design, database design, conceptual modelling, semantic modelling, terminological knowledge acquisition, knowledge integration, design automation.

Journal ArticleDOI
TL;DR: Three basic methodologies—frames, rules, and logic—have emerged to support the complex task of storing human knowledge in an expert system and each of the articles in this Special Section describes and illustrates one of these methodologies.
Abstract: A fundamental shift in the preferred approach to building applied artificial intelligence (AI) systems has taken place since the late 1960s. Previous work focused on the construction of general-purpose intelligent systems; the emphasis was on powerful inference methods that could function efficiently even when the available domain-specific knowledge was relatively meager. Today the emphasis is on the role of specific and detailed knowledge, rather than on reasoning methods. The first successful application of this method, which goes by the name of knowledge-based or expert-system research, was the DENDRAL program at Stanford, a long-term collaboration between chemists and computer scientists for automating the determination of molecular structure from empirical formulas and mass spectral data. The key idea is that knowledge is power, for experts, be they human or machine, are often those who know more facts and heuristics about a domain than lesser problem solvers. The task of building an expert system, therefore, is predominantly one of “teaching” a system enough of these facts and heuristics to enable it to perform competently in a particular problem-solving context. Such a collection of facts and heuristics is commonly called a knowledge base. Knowledge-based systems are still dependent on inference methods that perform reasoning on the knowledge base, but experience has shown that simple inference methods like generate and test, backward-chaining, and forward-chaining are very effective in a wide variety of problem domains when they are coupled with powerful knowledge bases. If this methodology remains preeminent, then the task of constructing knowledge bases becomes the rate-limiting factor in expert-system development. Indeed, a major portion of the applied AI research in the last decade has been directed at developing techniques and tools for knowledge representation. We are now in the third generation of such efforts. The first generation was marked by the development of enhanced AI languages like Interlisp and PROLOG. The second generation saw the development of knowledge representation tools at AI research institutions; Stanford, for instance, produced EMYCIN, The Unit System, and MRS. The third generation is now producing fully supported commercial tools like KEE and S.1. Each generation has seen a substantial decrease in the amount of time needed to build significant expert systems. Ten years ago prototype systems commonly took on the order of two years to show proof of concept; today such systems are routinely built in a few months. Three basic methodologies—frames, rules, and logic—have emerged to support the complex task of storing human knowledge in an expert system. Each of the articles in this Special Section describes and illustrates one of these methodologies. “The Role of Frame-Based Representation in Reasoning,” by Richard Fikes and Tom Kehler, describes an object-centered view of knowledge representation, whereby all knowledge is partitioned into discrete structures (frames) having individual properties (slots). Frames can be used to represent broad concepts, classes of objects, or individual instances or components of objects. They are joined together in an inheritance hierarchy that provides for the transmission of common properties among the frames without multiple specification of those properties.
The authors use the KEE knowledge representation and manipulation tool to illustrate the characteristics of frame-based representation for a variety of domain examples. They also show how frame-based systems can be used to incorporate a range of inference methods common to both logic and rule-based systems. “Rule-Based Systems,” by Frederick Hayes-Roth, chronicles the history and describes the implementation of production rules as a framework for knowledge representation. In essence, production rules use IF conditions THEN conclusions and IF conditions THEN actions structures to construct a knowledge base. The author catalogs a wide range of applications for which this methodology has proved natural and (at least partially) successful for replicating intelligent behavior. The article also surveys some already-available computational tools for facilitating the construction of rule-based knowledge bases and discusses the inference methods (particularly backward- and forward-chaining) that are provided as part of these tools. The article concludes with a consideration of the future improvement and expansion of such tools. The third article, “Logic Programming,” by Michael Genesereth and Matthew Ginsberg, provides a tutorial introduction to the formal method of programming by description in the predicate calculus. Unlike traditional programming, which emphasizes how computations are to be performed, logic programming focuses on the what of objects and their behavior. The article illustrates the ease with which incremental additions can be made to a logic-oriented knowledge base, as well as the automatic facilities for inference (through theorem proving) and explanation that result from such formal descriptions. A practical example of diagnosis of digital device malfunctions is used to show how significant and complex problems can be represented in the formalism. A note to the reader who may infer that the AI community is being split into competing camps by these three methodologies: Although each provides advantages in certain specific domains (logic where the domain can be readily axiomatized and where complete causal models are available, rules where most of the knowledge can be conveniently expressed as experiential heuristics, and frames where complex structural descriptions are necessary to adequately describe the domain), the current view is one of synthesis rather than exclusivity. Both logic and rule-based systems commonly incorporate frame-like structures to facilitate the representation of large amounts of factual information, and frame-based systems like KEE allow both production rules and predicate calculus statements to be stored within and activated from frames to do inference. The next generation of knowledge representation tools may even help users to select appropriate methodologies for each particular class of knowledge, and then automatically integrate the various methodologies so selected into a consistent framework for knowledge.
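A toy example combining two of the methodologies surveyed here, frames with slot inheritance and an IF-THEN production rule run by forward chaining over frame slots, is sketched below. It is only an illustration in Python; KEE, EMYCIN, and the other tools named above are vastly richer.

```python
# Toy combination of frame-based representation (slots + inheritance) with a
# forward-chaining production rule, in the spirit of the editorial's survey.
frames = {
    "instrument":   {"is_a": None,           "slots": {"powered": True}},
    "oscilloscope": {"is_a": "instrument",   "slots": {"channels": 2}},
    "scope_01":     {"is_a": "oscilloscope", "slots": {"status": "no_display"}},
}

def get_slot(frame, slot):
    """Look up a slot, inheriting from parent frames when absent."""
    while frame is not None:
        if slot in frames[frame]["slots"]:
            return frames[frame]["slots"][slot]
        frame = frames[frame]["is_a"]
    return None

# IF conditions THEN conclusion, expressed over frame slots
rules = [
    (lambda f: get_slot(f, "powered") and get_slot(f, "status") == "no_display",
     ("suspected_fault", "display_section")),
]

def forward_chain(frame):
    changed = True
    while changed:                   # fire rules until nothing new is added
        changed = False
        for condition, (slot, value) in rules:
            if condition(frame) and get_slot(frame, slot) != value:
                frames[frame]["slots"][slot] = value
                changed = True
    return frames[frame]["slots"]

print(forward_chain("scope_01"))
```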

Proceedings Article
18 Aug 1985
TL;DR: A novel expert system architecture which supports explicit representation and effective use of both declarative and procedural knowledge, and has been adopted for the design of PROP, an expert system for on-line monitoring of the cycle water pollution in a thermal power plant.
Abstract: The paper presents a novel expert system architecture which supports explicit representation and effective use of both declarative and procedural knowledge. These two types of expert knowledge are represented by means of production rules and event-graphs respectively, and they are processed by a unified inference engine. Communication between the rule level and the event-graph level is based on a full visibility of each level on the internal state of the other, and it is structured in such a way as to allow each level to exert control on the other. This structure offers several advantages over more traditional architectures. Knowledge representation is more natural and transparent; knowledge acquisition turns out to be easier as pieces of knowledge can be immediately represented without the need of complex transformation and restructuring; inference is more effective due to reduced non-determinism resulting from explicit representation of fragments of procedural knowledge in event-graphs; finally, explanations are more natural and understandable. The proposed architecture has been adopted for the design of PROP, an expert system for on-line monitoring of the cycle water pollution in a thermal power plant. PROP is running on a SUN-2 workstation and has been tested on a sample of real cases.

Proceedings ArticleDOI
01 May 1985
TL;DR: This paper proposes several parallel recovery architectures for multiprocessor database machines, examines the characteristics and performance of each, and evaluates their impact on the performance of the database machine.
Abstract: During the past decade, a number of database machine designs have been proposed. Most of these designs have been optimized with respect to retrieval queries, while ignoring the issue of recovery and its impact on the performance of the proposed architecture. In this paper, we propose several parallel recovery architectures for multiprocessor database machines, examine the characteristics and performance of each, and evaluate their impact on the performance of the database machine. Our results indicate that a recovery architecture based on parallel logging has the best overall performance.

Book ChapterDOI
01 Jan 1985
TL;DR: The paper discusses problems of how to structure the knowledge base and to adopt the most appropriate approaches and tools for developing SICONFEX, a knowledge-based system for the interactive configuration of an operating system for SICOMP computers.
Abstract: The paper discusses problems of how to structure the knowledge base and to adopt the most appropriate approaches and tools for developing SICONFEX. SICONFEX is a knowledge-based system for the interactive configuration of an operating system for SICOMP computers. It was developed by the expert systems research lab of Siemens, mainly based on an object-oriented approach. The paper's aim is not to give a comprehensive overview of the system's structure and development, but rather to reflect on and discuss some problems we encountered during this phase. These, in our opinion, are problems of a more general character, emerging in designing an expert system in a non-trivial domain.



Proceedings Article
21 Aug 1985
TL;DR: This paper discusses an alternative approach: a system whose knowledge is in the form of text fragments plus a query language to help users access appropriate fragments to support knowledge exploration.
Abstract: Knowledge exploration is the activity of finding out what other people have thought about. Normally, people explore knowledge by reading books or articles or by talking to other people. This paper discusses an alternative approach: a system whose knowledge is in the form of text fragments plus a query language to help users access appropriate fragments. Drawing primary inspiration from database theory, hypertext systems, knowledge representation, and a study of textual fragments called fragment theory, the paper describes and motivates a data model to support knowledge exploration.
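The fragment-plus-query idea can be suggested with a toy fragment store and an attribute-matching query. The fragments, attributes, and query form below are invented and much simpler than the data model the paper motivates.

```python
# Toy sketch of a fragment store with a tiny attribute-based query
# (the fragments, attributes, and query form are invented for illustration).
fragments = [
    {"text": "Frames group slots under an inheritance hierarchy.",
     "topic": "knowledge representation", "stance": "descriptive"},
    {"text": "Production rules suit experiential heuristics.",
     "topic": "knowledge representation", "stance": "opinion"},
    {"text": "Hypertext links let readers follow their own paths.",
     "topic": "hypertext", "stance": "descriptive"},
]

def query(store, **constraints):
    """Return fragments whose attributes match every given constraint."""
    return [f["text"] for f in store
            if all(f.get(k) == v for k, v in constraints.items())]

for text in query(fragments, topic="knowledge representation"):
    print(text)
```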