
Showing papers in "Expert Systems With Applications in 1996"


Journal ArticleDOI
TL;DR: This paper focuses on three alternative techniques that can be used to empirically select predictors for neural networks in failure prediction, based on linear discriminant analysis, logit analysis and genetic algorithms.
Abstract: We focus on three alternative techniques (linear discriminant analysis, logit analysis and genetic algorithms) that can be used to empirically select predictors for neural networks in failure prediction. The selected techniques all have different assumptions about the relationships between the independent variables. Linear discriminant analysis is based on a linear combination of independent variables, logit analysis uses the logistic cumulative function, and genetic algorithms are a global search procedure based on the mechanics of natural selection and natural genetics. In an empirical test all three selection methods chose different bankruptcy prediction variables. The best prediction results were achieved when using genetic algorithms.
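
A minimal sketch of the genetic-algorithm selection step described above, assuming a binary mask over candidate financial ratios and an off-the-shelf logistic classifier (scikit-learn) as a stand-in fitness function; the paper's own predictor set and the downstream neural network are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    """Cross-validated accuracy of a classifier built on the selected predictors."""
    if mask.sum() == 0:
        return 0.0
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=5).mean()

def select_predictors(X, y, pop_size=20, generations=30, p_mut=0.05):
    n = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n))            # random bit-string masks
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        probs = (scores + 1e-9) / (scores + 1e-9).sum()      # fitness-proportional selection
        parents = pop[rng.choice(pop_size, size=pop_size, p=probs)]
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):                  # one-point crossover
            cut = rng.integers(1, n)
            children[i, cut:] = parents[i + 1, cut:]
            children[i + 1, cut:] = parents[i, cut:]
        flips = rng.random(children.shape) < p_mut           # bit-flip mutation
        pop = np.where(flips, 1 - children, children)
    scores = np.array([fitness(ind, X, y) for ind in pop])
    return pop[scores.argmax()].astype(bool)                 # best predictor subset found
```

The mask returned by select_predictors would then define the inputs of the failure-prediction network.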

195 citations


Journal ArticleDOI
TL;DR: This paper provides an initial framework for verifying Knowledge Sharing Technology (KST); such verification is the engineering activity that guarantees the correctness of the definitions in an ontology, its associated software environments and documentation with respect to a frame of reference during each phase and between phases of its life cycle.
Abstract: Based on the empirical verification of bibliographic-data and other Ontolingua ontologies, this paper provides an initial framework for verifying Knowledge Sharing Technology (KST). Verification of KST refers to the engineering activity that guarantees the correctness of the definitions in an ontology, its associated software environments and documentation with respect to a frame of reference during each phase and between phases of its life cycle. Verification of the ontologies refers to building the correct ontology, and it verifies that (1) the architecture of the ontology is sound, (2) the lexicon and the syntax of the definitions are correct and (3) the content of the ontologies and their definitions are internally and metaphysically consistent, complete, concise, expandable and sensitive.

171 citations


Journal ArticleDOI
Hongkyu Jo1, Ingoo Han1
TL;DR: In this article, a new structured model with multiple stages was proposed for bankruptcy prediction, which consists of four phases (training, test, adjustment, and prediction), and three types of input data.
Abstract: How to integrate classification models to increase prediction performance has recently become an issue of interest. This paper suggests a new structured model with multiple stages. It consists of four phases (training, test, adjustment, and prediction) and three types of input data (training, testing, and generalization). The integrated model is applied to bankruptcy prediction. A statistical model (discriminant analysis) and two artificial intelligence models (a neural network and case-based forecasting) are used in this study. The integration approach produces higher prediction accuracy than the individual models.

130 citations


Journal ArticleDOI
TL;DR: The methodology for spatial scheduling, including the time dimension, is applied to the scheduling of Daewoo shipbuilding to build a system, DAS-CURVE, which is successfully operational and whose experimental performance is remarkable.
Abstract: Spatial scheduling considers not only traditional scheduling constraints like resource capacity and due dates, but also the dynamic spatial layout of the objects. Automation of spatial scheduling is particularly important when the spatial resources are critical bottleneck resources, as is the case in the shipbuilding industry. To develop a spatial scheduling expert system for shipbuilding, a methodology for the spatial layout of polygonal objects within rectangular plates is first developed. This study is then extended to a methodology for spatial scheduling that includes the time dimension. The methodology is applied to the scheduling of Daewoo shipbuilding to build a system, DAS-CURVE. DAS-CURVE is successfully operational and its experimental performance is remarkable.

70 citations


Journal ArticleDOI
TL;DR: It is concluded that AI technologies in the form of robotic platforms and decision support tools can have a tremendous impact on overall search efficiency for the USAR community, and represent an important field of study for related military applications.
Abstract: Structural collapse disasters routinely inspire sympathy not only for victims and their families, but also for heroic rescue personnel who are faced with a tremendously complex, hazardous and often frustrating task environment. Military operations and rescue activities in the aftermath of recent earthquakes and bombings indicate a tremendous need for greater access to denied areas within any crisis site involving collapsed structures. Recent developments in the remote inspection industry show great potential for employment of small robotic micro-rover systems in expanded roles for Urban Search and Rescue. This paper discusses key issues in the application of robotic systems to Urban Search and Rescue (USAR) activities and discusses ongoing development of a knowledge-based system for efficient management of automated search assets. USAR modeling and “micro-bot” employment advantages are addressed first, followed by a discussion of numerical method shortcomings in the context of search asset allocation. KNOBSAR is then proposed as an initial expert system prototype designed to interact with various structural collapse simulation packages and provide advice on search asset allocation to specific entry points within a crisis site. KNOBSAR structure and design are then illustrated in terms of micro-bot allocation scenarios to various collapsed structure entry points. The conclusion drawn from literature review, experimentation and personal experience is that AI technologies in the form of robotic platforms and decision support tools can have a tremendous impact on overall search efficiency for the USAR community, and represent an important field of study for related military applications.

67 citations


Journal ArticleDOI
TL;DR: A new methodology is presented that allows the development of highly simplified backpropagation neural network models based on a variable importance measure that addresses the problem of producing an interpretation of a neural network's functioning.
Abstract: A new methodology for building inductive expert systems known as neural networks has emerged as one of the most promising applications of artificial intelligence in the 1990s. The primary advantages of a neural network approach for modeling expert decision processes are: (1) the ability of the network to learn from examples of experts' decisions that avoids the costly, time consuming, and error prone task of trying to directly extract knowledge of a problem domain from an expert and (2) the ability of the network to handle noisy, incomplete, and distorted data that are typically found in decision making under conditions of uncertainty. Unfortunately, a major limitation of neural network-based models has been the opacity of the inference process. Unlike conventional expert system decision support tools, decision makers are generally unable to understand the basis of neural network decisions. This problem often makes such systems undesirable for decision support applications. A new methodology is presented that allows the development of highly simplified backpropagation neural network models. This methodology simplifies networks by eliminating input variables that are not contributing to the network's ability to produce accurate predictions. Elimination of unnecessary input variables directly reduces the number of network parameters that must be estimated and consequently the complexity of the network structure. A primary benefit of this development methodology is that it is based on a variable importance measure that addresses the problem of producing an interpretation of a neural network's functioning. Decision makers may easily understand the resulting networks in terms of the proportional contribution each input variable is making in the production of accurate predictions. Furthermore, in actual application the accuracy of these simplified models should be comparable to or better than the more complex models developed with the standard approach. This new methodology is demonstrated by two classification problems based on sets of actual data.
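
The abstract does not spell out the variable importance measure, so the sketch below uses a Garson-style, weight-based contribution score only as an illustration of how low-contribution inputs could be identified and removed before retraining a smaller network.

```python
import numpy as np

def input_importance(W_ih, W_ho):
    """Garson-style importance of each input, computed from trained weight matrices.

    W_ih: input-to-hidden weights, shape (n_inputs, n_hidden)
    W_ho: hidden-to-output weights, shape (n_hidden, n_outputs)
    """
    # share of each input in each hidden unit's incoming weight mass
    contrib = np.abs(W_ih) / np.abs(W_ih).sum(axis=0, keepdims=True)
    # weight each hidden unit by its absolute influence on the outputs
    hidden_strength = np.abs(W_ho).sum(axis=1)
    raw = contrib @ hidden_strength
    return raw / raw.sum()                      # proportional contribution per input

def prune_inputs(X, importance, keep_fraction=0.5):
    """Keep only the most important inputs; a simplified network is then retrained on them."""
    k = max(1, int(keep_fraction * len(importance)))
    keep = np.sort(np.argsort(importance)[::-1][:k])
    return X[:, keep], keep
```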

54 citations


Journal ArticleDOI
TL;DR: FBRL, a new language for representing behavior and function by combining the primitives identified, is proposed, and the relation between function and behavior based on the primitives is investigated.
Abstract: Although many researchers have pointed out the significance of functional representation, the general relations between function and behavior are not yet fully understood. We consider the knowledge of each component in a system as consisting of two elements. One is a necessary and sufficient concept for simulation of the component, which we call the behavior model. The other is the interpretation of the behavior under a desirable state which the component is expected to achieve, which we call function. By classification of the primitives necessary for the interpretation of the behavior in various domains, which we call “domain ontology”, we can capture and represent the function by selection and combination of the primitives. This paper proposes FBRL, a new language for representing behavior and function by combining the primitives we identified. We also investigate the relation between function and behavior based on the primitives. As the primitives can represent concepts at various levels of abstraction, they will contribute to those tasks which rely on simulations of the model of the target object, such as diagnosis, design, explanation and so on.

39 citations


Journal ArticleDOI
TL;DR: The paper discusses the structure of the hybrid system, the methods for improving learning performance of backpropagation procedure using a multilayer ANN, as well as providing an example of application.
Abstract: The paper presents a knowledge-based decision support system (KB-CEDSS) that is integrated with a multilayer artificial neural network (ANN) for evaluating urban development. The multilayer ANN is trained to rapidly replicate the evaluation provided by the KB-CEDSS. By integrating knowledge-based systems, decision support systems and ANNs, the system achieves improvements in the implementation of each as well as increases the scope of the application. The paper discusses the structure of the hybrid system, the methods for improving learning performance of backpropagation procedure using a multilayer ANN, as well as providing an example of application.

38 citations


Journal ArticleDOI
TL;DR: The authors believe this is the first egress and evacuation modelling tool to incorporate both decision making and movement modelling, and that it is therefore an important step in the introduction of improved approaches to the evaluation of offshore safety management.
Abstract: In this paper we describe a software tool that we have developed for modelling the decisions that people make in emergency situations in offshore environments. The tool was developed using C++ and runs on a PC under MS Windows. It has a generic architecture and can be easily extended to other environments with different characteristics, e.g. hospitals, commercial buildings, etc. We use frames to represent a person's characteristics and their perception of the environment; scripts are used to define typical behaviours for particular situations. Our tool can be used to predict the likely behaviours of a population in hazardous situations and help evaluate the effectiveness of emergency procedures and training. We have worked with our collaborators to integrate our decision model with their model of people's movement to produce a system that can realistically simulate emergency scenarios on offshore structures. We believe that this is the first egress and evacuation modelling tool to incorporate both decision making and movement modelling. Our work is therefore an important step in the introduction of improved approaches to the evaluation of offshore safety management. Validating the decision model proved difficult because of lack of suitable data. We acquired additional data by interviewing offshore personnel and monitoring a mustering exercise. We then simulated an offshore emergency scenario and the results were encouraging. In the future we would like to enhance our model by incorporating communication between personnel. This would allow us to model complex scenarios, especially those that cannot be simulated realistically in training exercises.

37 citations


Journal ArticleDOI
TL;DR: The Cognitive Interview is a technique that was developed specifically to capture episodic knowledge and has been extensively tested in other disciplines; it has practical implications for enhanced knowledge elicitation in the development of expert systems.
Abstract: Various knowledge elicitation techniques have been recommended for the development of expert systems, but a review of these techniques reveals that most are designed to elicit rules (i.e. procedural knowledge) from the expert in order to build rule-based expert systems. With the growing interest in case-based expert systems comes the need to introduce and evaluate new knowledge elicitation techniques designed specifically to capture knowledge in the form of cases. Cases represent the unique combination of situational variables and solutions experienced by the expert, i.e. episodic knowledge. This paper reports the findings of a study that investigated the applicability of the Cognitive Interview for eliciting the detail-rich cases required for the development of case-based expert systems. The Cognitive Interview is a technique that was developed specifically to capture episodic knowledge and has been extensively tested for use in other disciplines. This theoretically grounded technique has practical implications for enhanced knowledge elicitation in the development of expert systems.

33 citations


Journal ArticleDOI
TL;DR: Automated tools to assist in terrain analysis rely on the combination of several concepts and techniques that have been developed within the artificial intelligence field, including knowledge representation schemes, spatial reasoning techniques, autonomous agent planning methods, rule-based paradigms, and heuristic search strategies.
Abstract: Terrain analysis in support of planned military training or combat operations is a task which requires considerable training, skill and experience. Military planners must synthesize knowledge of both their own and their expected adversary's tactics, weapons systems and probable courses of action with descriptions of the physical battlefield area to identify key terrain, those features in the environment which have the greatest potential to influence the outcome of military operations. Terrain analysis is the cornerstone activity during the “intelligence preparation of the battlefield”, an analytic process designed to reduce uncertainty, identify likely enemy courses of action, and help select the most favorable friendly course of action. Automated tools to assist in this process are now being constructed to support a variety of Defense Advanced Research Projects Agency (DARPA) programs. The tools rely on the combination of several concepts and techniques that have been developed within the artificial intelligence field. These include knowledge representation schemes, spatial reasoning techniques, autonomous agent planning methods, rule-based paradigms, and heuristic search strategies. While no single technique in isolation can fully solve the broad problems of military operations planning, their combination provides a synergy that results in a useful end product.

Journal ArticleDOI
TL;DR: The expert system STATEXS for dimensioning optimization and manufacture of gears and gearing is presented, using genetic algorithms which are well suited for such problems, particularly because of their robustness and ability to detect global extremes.
Abstract: This paper presents the expert system STATEXS for dimensioning optimization and manufacture of gears and gearing. To determine the optimum dimensions of gearing, we used genetic algorithms which are well suited for such problems, particularly because of their robustness and ability to detect global extremes. After completion of calculations and optimization of gears or gear pairs, one of the most difficult operations follows, i.e. the manufacture of the product with theoretically determined and optimized properties. To this end, at our Faculty we have also started to develop an expert system for the field of manufacture of various products of demanding shapes.

Journal ArticleDOI
TL;DR: It is shown that the use of forecasting tools based on Expert Systems can provide a substantial improvement over traditional techniques.
Abstract: The ability to provide an accurate prediction of future events plays an important role in most decision making processes. The effectiveness of any decision depends upon the nature of the sequence of events that follow the decision. In most cases, the risks involved in decision making are reduced by forecasting future events and by predicting their uncontrollable aspects. Unfortunately, most decision making processes involve the consideration of several operational factors which by themselves may require forecasting. As a result, the whole process can be extremely complex, and therefore it is important not only to forecast accurately but also to use an appropriate forecasting model. Failure to use such a model can lead to substantial errors or risks and cause unnecessary financial setbacks. The work presented in this paper investigates the impact of Expert Systems on the decision making process. In particular, this paper presents the work carried out during a case study involving the forecasting of gas demands by a regional gas company. The software development of an appropriate forecasting tool based on an Expert System shell is presented and evaluated. The evaluation process described here involved running the software at the regional control centre of the gas company and comparing its results with those obtained from traditional techniques. It is shown that the use of forecasting tools based on Expert Systems can provide a substantial improvement over traditional techniques.

Journal ArticleDOI
TL;DR: It is shown that the availability of a model can be of substantial help in identifying positive and negative impacts of a knowledge based system.
Abstract: In the framework of the CommonKADS methodology, attention is paid to the organizational consequences that knowledge-based systems can have. The CommonKADS organization model is developed to support this important aspect of project work. This model is described in some detail. Additionally, the model is implemented within the CommonKADS Workbench. This permits computer-supported construction of a model. The organization model has been used in several projects. Results of applying the model are described. It is shown that the availability of a model can be of substantial help in identifying positive and negative impacts of a knowledge-based system. These impacts are crucial for decision making before, during and after a project. In addition the model provides people involved in a knowledge-based systems project with a kind of “feeling” for the organization in which the system will become operational.

Journal ArticleDOI
TL;DR: A novel method to acquire efficient decision rules from questionnaire data using both simulated breeding and inductive learning techniques, which has been qualitatively and quantitatively validated by a case study on consumer product questionnaire data.
Abstract: Marketing decision making tasks require the acquisition of efficient decision rules from noisy questionnaire data. Unlike popular learning-from-example settings, in such tasks we must interpret the characteristics of the data without clear features or pre-determined evaluation criteria. The problem is how domain experts can obtain simple, easy-to-understand, and accurate knowledge from noisy data. This paper describes a novel method to acquire efficient decision rules from questionnaire data using both simulated breeding and inductive learning techniques. The basic ideas of the method are that simulated breeding is used to extract the effective features from the questionnaire data and that inductive learning is used to acquire simple decision rules from the data. Simulated breeding is a Genetic Algorithm based technique in which the qualities of offspring generated by genetic operations are evaluated subjectively or interactively. The proposed method has been qualitatively and quantitatively validated by a case study on consumer product questionnaire data: the acquired rules are simpler than the results from the direct application of inductive learning; a domain expert confirms that they are easy to understand; and their accuracy is at the same level as that of the other methods. Furthermore, we address three variations of the basic interactive version of the method: (i) with semiautomated GA phases, (ii) with a relative evaluation phase via AHP, and (iii) with an automated multiagent learning method.
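
As a rough illustration of the inductive-learning half of the method (the interactive simulated-breeding phase that pre-selects features is omitted), a shallow decision tree can be exported as human-readable rules; the depth limit stands in for the paper's emphasis on simple, expert-inspectable rules.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

def learn_rules(X, y, feature_names, max_depth=3):
    """Fit a shallow tree on the (pre-selected) questionnaire features and
    return it as human-readable if-then rules that a domain expert can inspect."""
    tree = DecisionTreeClassifier(max_depth=max_depth, min_samples_leaf=20)
    tree.fit(X, y)
    return export_text(tree, feature_names=list(feature_names))
```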

Journal ArticleDOI
TL;DR: An on-line expert system-based fault-tolerant control system (ESFTC) is considered which allows reconfiguration of the controller in feedback process systems during sensor or actuator failures or misoperation.
Abstract: Fault-tolerant control systems have been studied very actively during recent years. Software and hardware redundancy techniques are the most common methods used to solve the problems. In recent years, expert systems or artificial intelligence have been used successfully in fault diagnosis of dynamic systems and their suitability for fault-tolerant control problems has also been demonstrated. In this paper an on-line expert system-based fault-tolerant control system (ESFTC) is considered which allows reconfiguration of the controller in feedback process systems during sensor or actuator failures or misoperation. It forms an on-line expert system, which consists of an analytical problem solution, a process knowledge base, a knowledge acquisition part and an inference mechanism. The analytical problem solution is based on process coefficient changes, which are symptoms of process faults. A controller is reconfigured based on the symptoms. The process knowledge base comprises analytical knowledge in the form of process performances and heuristic knowledge in the form of fault trees and fault statistics. In the phase of knowledge acquisition the process-specific knowledge, such as theoretical process models, abnormal behavior, failures and misoperation, is compiled. The inference mechanism performs the fault-tolerant control, based on the observable symptoms, controller trees, fault probabilities and process history. Case study experiments with the Yeast Fermentation Process Control System show the performance of the fault-tolerant controller.
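
A toy sketch of the symptom-driven reconfiguration idea, with a hypothetical two-entry fault table and made-up nominal coefficients; the paper's knowledge base (fault trees, fault statistics, process history) is far richer than this.

```python
# Estimated process coefficients are compared with nominal values; matching
# entries in a (hypothetical) fault table select a fallback control configuration.
NOMINAL = {"gain": 2.0, "time_constant": 15.0}
FAULT_TABLE = {
    "gain_drop":    "switch to backup actuator",
    "lag_increase": "detune controller (use a larger time constant)",
}

def diagnose(estimated, tolerance=0.2):
    symptoms = []
    if estimated["gain"] < NOMINAL["gain"] * (1 - tolerance):
        symptoms.append("gain_drop")
    if estimated["time_constant"] > NOMINAL["time_constant"] * (1 + tolerance):
        symptoms.append("lag_increase")
    return symptoms

def reconfigure(estimated):
    actions = [FAULT_TABLE[s] for s in diagnose(estimated)]
    return actions or ["keep nominal controller"]

print(reconfigure({"gain": 1.4, "time_constant": 16.0}))  # -> ['switch to backup actuator']
```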

Journal ArticleDOI
TL;DR: In this article, a comparative study of inductive learning and statistical methods using the simulation approach to provide generalizable results was conducted, which showed that the relative classification accuracy of ID3 to probit increases as the proportion of binary variables increases in the classification model.
Abstract: This is a comparative study of inductive learning and statistical methods using the simulation approach to provide generalizable results. The purpose of this study is to investigate the impact of the measurement scale of explanatory variables on the relative performance of the statistical method (probit) and the inductive learning method (ID3), and to examine the impact of correlation structure on the classification behavior of the probit method and the ID3 method. The simulation results show that the relative classification accuracy of ID3 to probit increases as the proportion of binary variables increases in the classification model, and that the relative accuracy of ID3 to probit is higher when the covariance matrices are unequal among populations than when the covariance matrices are equal among populations. The empirical tests on ID3 reveal that the classification accuracy of ID3 is lower when the covariance matrices are unequal among populations than when the covariance matrices are equal among populations and that the classification accuracy of ID3 decreases as the correlations among explanatory variables increase.
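
A small Monte-Carlo harness in the spirit of the comparison, with statsmodels' Probit and a CART tree standing in for the probit and ID3 procedures; the data-generating design (coefficients, noise level, covariance structure) is invented for illustration only.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)

def simulate(n=500, n_vars=6, n_binary=3):
    """Generate a two-group sample with a mix of continuous and binary predictors."""
    X = np.hstack([rng.normal(size=(n, n_vars - n_binary)),
                   rng.integers(0, 2, size=(n, n_binary))])
    y = (X @ rng.normal(size=n_vars) + rng.normal(scale=1.0, size=n) > 0).astype(int)
    return X, y

def compare(n_binary, reps=20):
    probit_acc, tree_acc = [], []
    for _ in range(reps):
        X, y = simulate(n_binary=n_binary)
        Xtr, Xte, ytr, yte = X[:350], X[350:], y[:350], y[350:]
        pr = sm.Probit(ytr, sm.add_constant(Xtr)).fit(disp=0)
        p_hat = pr.predict(sm.add_constant(Xte))
        probit_acc.append(((p_hat > 0.5).astype(int) == yte).mean())
        tree = DecisionTreeClassifier().fit(Xtr, ytr)
        tree_acc.append((tree.predict(Xte) == yte).mean())
    return np.mean(probit_acc), np.mean(tree_acc)

print(compare(n_binary=1))   # few binary predictors
print(compare(n_binary=5))   # mostly binary predictors
```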

Journal ArticleDOI
TL;DR: A knowledge-based software framework is presented for automating the execution of collaborative organizational processes performed by multiple organizational members; the intelligent assistants used as the basis of the solution design proved to increase organizational productivity by effectively carrying out several tedious watchdog activities, thereby freeing humans to work on other challenging job-related responsibilities.
Abstract: In this paper we present a knowledge-based software framework for automating the execution of collaborative organizational processes performed by multiple organizational members. Intelligent computerized assistants or agents are used as the basis for the solution design. These assistants emulate the work and behavior of human agents; each of them is capable of acting autonomously, cooperatively and collaboratively to achieve the collective goal. Organizational processes expand in scope and evolve in time, and thus suffer from constantly changing requirements and assumptions. We demonstrate the effectiveness of our design formalism in managing change so that the software solution can adapt easily to the changing needs and situations of the organization. A knowledge-based operational prototype of the framework, using object-oriented technology and implemented using PROLOG, allowed us to examine the feasibility of having the “assistants” replace human agents in organizational process execution. These assistants proved to increase organizational productivity by effectively carrying out several tedious watchdog activities, thereby freeing humans to work on other challenging job-related responsibilities.

Journal ArticleDOI
TL;DR: This research explores a Delphi-based approach for managing a portion of the knowledge acquisition process with multiple, geographically dispersed experts and asserts the resulting model to be applicable at virtually any hospital site by nature of the way the knowledge is identified and structured.
Abstract: Prerequisite to the development of an expert system is the identification of relevant knowledge and expertise. This research explores a Delphi-based approach for managing a portion of the knowledge acquisition process with multiple, geographically dispersed experts. The general context in which the methodology is useful and beneficial, the multiple site, related domain problem, is defined and discussed. An instance of a multiple site, related domain application, hospital operating room scheduling, is used to demonstrate the feasibility of the approach. Thirty five domain experts were engaged in a Delphi process that enabled the creation of a global model of knowledge for operating room scheduling. The resulting model is asserted to be applicable at virtually any hospital site by nature of the way the knowledge is identified and structured. The generality of the knowledge base is novel vis-a-vis the traditional paradigm of custom development typically associated with knowledge acquisition and knowledge bases. The model was successfully validated using two hospitals, one whose operating room Director participated in the Delphi study and one whose Director did not.

Journal ArticleDOI
TL;DR: An analogical reasoning algorithm, analogy by abstraction (ABA), is proposed that uses previously solved problems as cases to solve new design problems in case-based reasoning systems, with a focus on design expert systems.
Abstract: In this paper we discuss issues of case retrieval and adaptation in case-based reasoning systems, focusing on design expert systems. We also propose an analogical reasoning algorithm, analogy by abstraction (ABA), that uses previously solved problems as cases to solve new design problems. ABA employs a classification hierarchy of previous problems to find wider similarities for case retrieval. ABA also uses a concept hierarchy for problem descriptions to modify a similar case efficiently for case adaptation. The major steps of ABA are SEARCH, ABSTRACT, REFINE, APPLY, and VERIFY. These steps are designed generally so that they can be applied to various task domains in synthesis-type expert systems. We exemplify the effectiveness of ABA with a case study of the invention of a simple mechanical device.
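
A skeletal rendering of the ABA control flow (SEARCH, ABSTRACT, REFINE, APPLY, VERIFY) on a hypothetical two-level classification hierarchy; the hierarchy, cases and adaptation step are stand-ins, and only the sequence of steps follows the abstract.

```python
HIERARCHY = {"fastener": None, "screw": "fastener", "bolt": "fastener"}   # child -> parent
CASES = {"screw": {"solution": "helical thread", "torque": "low"}}        # solved problems

def search(problem_class):                  # SEARCH: look for an identical past case
    return CASES.get(problem_class)

def abstract(problem_class):                # ABSTRACT: climb the classification hierarchy
    return HIERARCHY.get(problem_class)

def refine(abstract_class):                 # REFINE: pick a known specialisation with a case
    return next((c for c, p in HIERARCHY.items()
                 if p == abstract_class and c in CASES), None)

def apply_case(case, problem_class):        # APPLY: adapt the retrieved solution
    return dict(case, adapted_for=problem_class)

def verify(candidate):                      # VERIFY: domain check (stubbed here)
    return "solution" in candidate

def aba(problem_class):
    case = search(problem_class)
    if case is None:
        case = CASES.get(refine(abstract(problem_class)))
    if case is None:
        return None
    candidate = apply_case(case, problem_class)
    return candidate if verify(candidate) else None

print(aba("bolt"))   # reuses the 'screw' case via the shared 'fastener' abstraction
```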

Journal ArticleDOI
TL;DR: In this article, the authors discuss mechanisms by which domain knowledge can be used effectively in discovering knowledge from databases, in particular, the use of domain knowledge to reduce the search as well as optimize the hypotheses which represent the interesting knowledge to be discovered.
Abstract: Modern database technologies process large volumes of data to discover new knowledge. Some large databases make discovery computationally expensive. Additional knowledge, known as domain or background knowledge, hidden in the database can often guide and restrict the search for interesting knowledge. This paper discusses mechanisms by which domain knowledge can be used effectively in discovering knowledge from databases. In particular, we look at the use of domain knowledge to reduce the search as well as to optimize the hypotheses which represent the interesting knowledge to be discovered. Also, we discuss how to use domain knowledge to test the validity of the discovered knowledge. Although domain knowledge can be used to improve database searches, it should not block the discovery of unexpected knowledge. We provide some guidelines to use domain knowledge properly.

Journal ArticleDOI
Christof Ebert1
TL;DR: Fuzzy classification techniques are introduced as a basis for constructing rule-based quality models that can identify outlying software components that might cause potential quality problems.
Abstract: Managing software development and maintenance projects requires predictions about components of the software system that are likely to have a high error rate or that need high development effort. The value of any expert system is determined by the accuracy and cost of such predictions. Fuzzy classification techniques are introduced as a basis for constructing rule-based quality models that can identify outlying software components that might cause potential quality problems. The suggested approach and its advantages over common classification and decision techniques are illustrated with experimental results. A module quality model (with respect to changes) provides both quality of fit (according to past data) and predictive accuracy (according to ongoing projects). Its portability is shown by applying it to industrial real-time projects.
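
A minimal example of the kind of fuzzy rule such a quality model might contain, with invented membership-function breakpoints and a single rule; the paper derives its rules and thresholds from project data.

```python
def mu_high(x, low, high):
    """Piecewise-linear membership degree of 'high' for a software metric."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def fault_proneness(complexity, size):
    """Activation of the rule
       'IF complexity is high AND size is large THEN module is fault-prone',
       using min as the fuzzy AND."""
    return min(mu_high(complexity, low=5, high=20), mu_high(size, low=100, high=1000))

# flag modules whose rule activation exceeds a review threshold
modules = [("parser.c", 24, 1500), ("util.c", 4, 90), ("sched.c", 12, 600)]
flagged = [(name, fault_proneness(c, s)) for name, c, s in modules
           if fault_proneness(c, s) > 0.4]
print(flagged)   # parser.c and sched.c are flagged as potential quality problems
```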

Journal ArticleDOI
TL;DR: Five techniques for aggregating expertise are presented and the neural net method was shown to outperform the other methods in robustness and predictive accuracy.
Abstract: Knowledge acquisition consists of eliciting expertise from one or more experts in order to construct a knowledge base. When knowledge is elicited from multiple experts, it is necessary to combine the multiple sources of expertise in order to arrive at a single knowledge base. In this paper, we present and compare five techniques for aggregating expertise. An experiment was conducted to extract expert judgments on new product entry timing. The elicited knowledge was aggregated using classical statistical methods (logit regression and discriminant analysis), the ID3 pattern classification method, the k-NN (Nearest Neighbor) technique, and neural networks. The neural net method was shown to outperform the other methods in robustness and predictive accuracy. In addition, the explanation capability of the neural net was investigated. The contributions of the input variables to the change in the output variable were interpreted by analyzing the connection strengths of the neural net when the net stabilized. We conclude by discussing the use of neural nets in knowledge aggregation and decision support.

Journal ArticleDOI
TL;DR: This paper presents an audit expert system, called INFAUDITOR, that aids the audit process related to management information systems; it is a decision aid for auditors and provides integrated support for all aspects of the information system audit process.
Abstract: In a competitive environment, the need to select and effectively use reliable information technologies, accurate methodologies and the right information systems has led to a search for an audit process. The information system audit process is therefore viewed as a detailed field program of many steps related to two basic decisions: (1) the information acquisition decision, addressing the selection of audit tests to be performed, and (2) the audit opinion decision, involving the integration of test results leading to an opinion. In this paper, we present an audit expert system, called INFAUDITOR, aiding the audit process related to management information systems. It has two fundamental features: (1) it covers all domains of information systems, managerial as well as technical aspects; (2) it helps to determine, in a given audit situation, the respective importance that should be given to the different audit domains and tests of control. This customization process is determined by means of rules and depends on the audit objectives, the enterprise's characteristics and its information system. Since information systems auditing is a multicriterion decision, INFAUDITOR structures the audit domains and tests of control as a hierarchical audit tree following an analytic hierarchy process. It is a decision aid for auditors and provides integrated support for all aspects of the information system audit process. INFAUDITOR consists of several expert systems in a blackboard architecture. Its fact bases include the characteristics of the enterprise, its information system and the audit objectives, while the rule bases encompass the criteria, the general audit tree and the rules of customization. INFAUDITOR has been successfully applied to several real-life situations.

Journal ArticleDOI
TL;DR: The methodology involves the development of a knowledge-based system (KBIDEF) for the integration of the IDEF1 model with its corresponding IDEF0 model, and the design of two databases: object-oriented IDEF0 and IDEF1 databases.
Abstract: In this paper a methodology is proposed for the integration of IDEF1 with IDEF0, allowing an IDEF1 model to be generated easily from the corresponding IDEF0 model. The methodology involves: (1) the principle for the integration of IDEF1 with IDEF0 based on the concepts of IDEF methods; (2) the new requirements for IDEF0 diagrams at the relative bottom levels to meet the prerequisites for the integration of IDEF models; (3) the development of a knowledge-based system (KBIDEF) for the integration of the IDEF1 model with its corresponding IDEF0 model; (4) the design of two databases (object-oriented IDEF0 and IDEF1 databases), three libraries (Entity Class Library, Relation Class Library and Domain Relation Class Library) and two knowledge bases (Relation Analysis Knowledge Base and Domain Knowledge Base) for CIM information system design. Finally, the paper suggests some areas for future work.

Journal ArticleDOI
TL;DR: This work introduces an automated timetabler that combines data, model and knowledge-bases, developed using object-oriented methodology, which allows for a more efficient design and code implementation of scheduling procedures.
Abstract: The timetabling problem has traditionally been treated as a mathematical optimization, heuristic, or human-machine interactive problem. These approaches tend to suffer from lack of flexibility, computational intractability, and poor results. We introduce an automated timetabler that combines data, model and knowledge-bases, developed using object-oriented methodology. The inclusion of expert knowledge allows for solutions that fit the problem context better, while the use of a database enables a more flexible and maintainable system. The object-oriented paradigm allows for a more efficient design and code implementation of scheduling procedures. Results have been promising as the system has automatically scheduled over 2000 students and instructors into more than 300 classes.

Journal ArticleDOI
TL;DR: In this paper, the authors describe lessons learned through the sequential construction of four expert systems for menu planning, and show how to represent common sense knowledge about food and menus in a form amenable to successful menu planning.
Abstract: Planning nutritious and appetizing menus is a task at which experts consistently outperform computer systems, making it a challenging domain for AI research. This paper describes lessons learned through the sequential construction of four expert systems for menu planning. The first of these systems, ESOMP (Expert System on Menu Planning), was built to plan menus for patients on severely restricted low protein diets. A need for common sense in structuring “sensible” looking meals was identified and addressed via meal patterns and food exchange groups. PRISM (Pattern Regulator for the Intelligent Selection of Menus) expanded upon ESOMP by planning menus to meet a broad range of dietary requirements and personal preferences. Simple knowledge representation structures, implemented in PRISM 1.0, proved inadequate for the expanded task. A hierarchical network structure was developed and implemented in PRISM 2.0 and PRISM 3.0. This structure captures the common sense concept of menu form and describes context-sensitive relationships among menu parts. A major contribution of this paper is showing how to represent common sense knowledge about food and menus in a form amenable to successful menu planning.

Journal ArticleDOI
TL;DR: This paper presents a collaborative effort to design and implement a cooperative material handling system by a small team of human and robotic agents in an unstructured indoor environment and constructed an experimental distributed multi-agent architecture testbed facility.
Abstract: In this paper we present a collaborative effort to design and implement a cooperative material handling system by a small team of human and robotic agents in an unstructured indoor environment. Our approach makes fundamental use of the human agent's expertise for aspects of task planning, task monitoring and error recovery. Our system is neither fully autonomous nor fully teleoperated. It is designed to make effective use of the human's abilities within the present state of the art of autonomous systems. Our robotic agents refer to systems which are each equipped with at least one sensing modality and which possess some capability for self-orientation and/or mobility. Our robotic agents are not required to be homogeneous with respect to either capabilities or function. Our research stresses both paradigms and testbed experimentation. Theory issues include the requisite coordination principles and techniques which are fundamental to a cooperative multi-agent system's basic functioning. We have constructed an experimental distributed multi-agent architecture testbed facility. The required modular components of this testbed are currently operational and have been tested individually. Our current research focuses on the agent's integration in a scenario for cooperative material handling.

Journal ArticleDOI
TL;DR: The integrated modeling approach is presented, based on object-oriented and event-driven modeling, which provides the explicit, declarative and executable representation of heuristic scheduling knowledge which aids the communication between human schedulers and system analysts.
Abstract: In spite of the efforts to automate production scheduling systems, most manufacturing plants rely on human schedulers for their practical production scheduling, due mainly to the complexity and unpredictability of the manufacturing systems. One of the reasons for the difficulties in automating the production scheduling process is the lack of an explicit representation scheme of heuristic scheduling knowledge. In this paper, an integrated modeling approach is presented to represent the complicated heuristic knowledge in production scheduling systems. The integrated modeling approach is based on object-oriented and event-driven modeling. It provides the explicit, declarative and executable representation of heuristic scheduling knowledge which aids the communication between human schedulers and system analysts. The declarative and executable representation of scheduling knowledge will enhance the flexibility of production scheduling systems and aid the development of production scheduling information systems. A heuristic production scheduling model of a tire manufacturing system is used throughout the paper to illustrate the concepts and applicability of the approach.

Journal ArticleDOI
TL;DR: The neural network had the best overall prediction rate (92%) and the best prediction rate for asthma (96%); the discriminant analysis had the best prediction rate for non-asthma (80%); and the CBR had the lowest prediction rates in all categories.
Abstract: This paper compared three knowledge models (namely, neural network, case-based reasoning, and discriminant analysis) for the diagnosis of asthma. The data were collected from 294 patients with asthmatic symptoms who visited the Bronchial Asthma Clinics, Internal Medicine Department of Yonsei University Severance Hospital from June 1992 to May 1995. Diagnostic capabilities for the three knowledge models varied. The neural network had the best overall prediction rate (92%) and the best prediction rate for asthma (96%); the discriminant analysis had the best prediction rate for non-asthma (80%); and the CBR had the lowest prediction rates in all categories.