
Showing papers in "Expert Systems With Applications in 1998"


Journal ArticleDOI
TL;DR: This paper addresses the issue of identifying important input parameters in building a multilayer, backpropagation network for a typical class of engineering problems by comparing three different methods for ranking input importance: sensitivity analysis, fuzzy curves, and change of MSE (mean square error).
Abstract: Artificial neural networks have been used for simulation, modeling, and control purposes in many engineering applications as an alternative to conventional expert systems. Although neural networks usually do not reach the level of performance exhibited by expert systems, they do enjoy the tremendous advantage of very low construction costs. This paper addresses the issue of identifying important input parameters in building a multilayer, backpropagation network for a typical class of engineering problems. These problems are characterized by having a large number of input variables of varying degrees of importance, and identifying the important variables is a common issue, since elimination of the unimportant inputs leads to a simplification of the problem and often to more accurate modeling or solutions. We compare three different methods for ranking input importance: sensitivity analysis, fuzzy curves, and change of MSE (mean square error), and analyze their effectiveness. Simulation results based on experiments with simple mathematical functions as well as a real engineering problem are reported. Based on the analysis and our experience in building neural networks, we also propose a general methodology for building backpropagation networks for typical engineering applications.
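Of the three ranking methods the abstract compares, perturbation-based sensitivity analysis is the simplest to illustrate. The sketch below is a hedged, minimal version: the `model` function merely stands in for a trained network, and the sample values are invented, so that input 0 should rank as most important and input 2 as irrelevant.

```python
def model(x):
    # Toy stand-in for a trained network: the output depends strongly on
    # x[0], weakly on x[1], and not at all on x[2].
    return 3.0 * x[0] + 0.1 * x[1]

def sensitivity_ranking(model, samples, delta=0.01):
    """Rank inputs by mean absolute output change under a small perturbation."""
    n_inputs = len(samples[0])
    scores = [0.0] * n_inputs
    for x in samples:
        base = model(x)
        for i in range(n_inputs):
            perturbed = list(x)
            perturbed[i] += delta
            scores[i] += abs(model(perturbed) - base)
    scores = [s / len(samples) for s in scores]
    # Input indices sorted from most to least important.
    return sorted(range(n_inputs), key=lambda i: -scores[i]), scores

samples = [[0.2, 0.5, 0.9], [0.7, 0.1, 0.3], [0.4, 0.8, 0.6]]
ranking, scores = sensitivity_ranking(model, samples)
```

In practice the perturbation would be applied to the trained backpropagation network over the training set, and inputs with near-zero mean sensitivity become candidates for elimination.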

187 citations


Journal ArticleDOI
Sung Ho Ha, Sang Chan Park
TL;DR: This paper presents the data mining process from data extraction to knowledge interpretation and data mining tasks, and corresponding algorithms, and proposes a new marketing strategy that fully utilizes the knowledge resulting from data mining.
Abstract: Data mining, which is also referred to as knowledge discovery in databases, is the process of extracting valid, previously unknown, comprehensible and actionable information from large databases and using it to make crucial business decisions. In this paper, we present the data mining process from data extraction to knowledge interpretation and data mining tasks, and corresponding algorithms. Before applying data mining techniques to a real-world application, we build a data mart on the enterprise Intranet. RFM (recency, frequency, and monetary) data extracted from the data mart are used extensively for our analysis. We then propose a new marketing strategy that fully utilizes the knowledge resulting from data mining.
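The RFM analysis mentioned above can be illustrated with a simple rank-based scoring pass; the customer records, field names, and the scoring scheme below are invented for illustration and are not taken from the paper:

```python
from datetime import date

# Hypothetical customer records extracted from a data mart.
customers = [
    {"id": "A", "last_purchase": date(1998, 5, 20), "orders": 12, "spend": 940.0},
    {"id": "B", "last_purchase": date(1998, 1, 3),  "orders": 2,  "spend": 80.0},
    {"id": "C", "last_purchase": date(1998, 4, 11), "orders": 7,  "spend": 310.0},
]

def rfm_scores(customers, today):
    """Score each customer on R, F, and M by rank; the best rank scores highest."""
    by_recency = sorted(customers, key=lambda c: today - c["last_purchase"])
    by_frequency = sorted(customers, key=lambda c: -c["orders"])
    by_monetary = sorted(customers, key=lambda c: -c["spend"])
    n = len(customers)
    scores = {}
    for ranked, key in [(by_recency, "R"), (by_frequency, "F"), (by_monetary, "M")]:
        for rank, c in enumerate(ranked):
            scores.setdefault(c["id"], {})[key] = n - rank
    return scores

scores = rfm_scores(customers, date(1998, 6, 1))
```

Real RFM implementations commonly bin customers into quintiles per dimension and concatenate the three digits into a segment code; the rank-based variant above keeps the example small.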

117 citations


Journal ArticleDOI
TL;DR: The development and implementation of an automatic controller for path planning and navigation of an autonomous mobile robot using simulated annealing and fuzzy logic and a collision-free optimal trajectory among fixed polygonal obstacles is described.
Abstract: This article describes the development and implementation of an automatic controller for path planning and navigation of an autonomous mobile robot using simulated annealing and fuzzy logic. The simulated annealing algorithm was used to obtain a collision-free optimal trajectory among fixed polygonal obstacles. C-space was used to represent the working space and B-spline curves were used to represent the trajectories. The trajectory tracking was performed with a fuzzy logic algorithm. A detailed explanation of the algorithm is given. The objectives of the control algorithm were to track the planned trajectory and to avoid collision with moving obstacles. Simulation and implementation results are shown. A Nomadic 200 mobile robot was used to perform the experiments.
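As a hedged illustration of the annealing step only (not the paper's planner, which anneals B-spline trajectories in C-space), the sketch below anneals a single via-point between a fixed start and goal so the path clears one circular obstacle; all coordinates and parameters are invented:

```python
import math
import random

START, GOAL = (0.0, 0.0), (10.0, 0.0)
OBSTACLE, RADIUS = (5.0, 0.0), 2.0

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def cost(waypoint):
    """Path length plus a heavy penalty for entering the obstacle."""
    length = dist(START, waypoint) + dist(waypoint, GOAL)
    penalty = 100.0 if dist(waypoint, OBSTACLE) < RADIUS else 0.0
    return length + penalty

def anneal(initial, steps=5000, temp=5.0, cooling=0.999):
    random.seed(0)  # deterministic for the example
    current, current_cost = initial, cost(initial)
    best, best_cost = current, current_cost
    for _ in range(steps):
        candidate = (current[0] + random.gauss(0, 0.5),
                     current[1] + random.gauss(0, 0.5))
        cand_cost = cost(candidate)
        delta = cand_cost - current_cost
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the temperature cools.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current, current_cost = candidate, cand_cost
            if cand_cost < best_cost:
                best, best_cost = candidate, cand_cost
        temp *= cooling
    return best, best_cost

waypoint, best_cost = anneal((5.0, 0.0))  # start inside the obstacle
```

The straight-line distance between start and goal is 10, so a collision-free detour around the obstacle should cost only slightly more than that.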

91 citations


Journal ArticleDOI
TL;DR: Simulation and experimental results show the fuzzy logic controller designed in this study can be used satisfactorily by wheeled mobile robots moving on unknown static terrains.
Abstract: In this study a fuzzy logic controller for mobile robot navigation has been designed. The designed controller deals with the uncertainty and ambiguity of the information the system receives. The technique has been used on an experimental mobile robot which uses a set of seven ultrasonic sensors to perceive the environment. The designed fuzzy controller maps the input space (information coming from ultrasonic sensors) to a safe collision-avoidance trajectory (output space) in real time. This is accomplished by an inference process based on rules (a list of IF-THEN statements) taken from a knowledge base. The technique generates satisfactory direction and velocity maneuvers of the autonomous vehicle which are used by the robot to reach its goal safely. Simulation and experimental results show the method can be used satisfactorily by wheeled mobile robots moving on unknown static terrains.
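A minimal sketch of the kind of IF-THEN mapping described, reduced to one input (obstacle distance) and one output (steering correction); the membership functions and rule consequents are invented and stand in for the paper's seven-sensor rule base:

```python
# Invented triangular/shoulder membership functions over distance in metres.
def near(d):   return max(0.0, 1.0 - d / 1.5)               # strong below 1.5 m
def medium(d): return max(0.0, 1.0 - abs(d - 1.5))          # peaks at 1.5 m
def far(d):    return max(0.0, min(1.0, (d - 1.5) / 1.5))   # saturates at 3 m

# Rule base (Sugeno-style singleton consequents):
# IF distance is <set> THEN turn <degrees>.
RULES = [(near, 45.0), (medium, 15.0), (far, 0.0)]

def steering(distance):
    """Weighted-average defuzzification over the fired rules."""
    fired = [(mu(distance), turn) for mu, turn in RULES]
    total = sum(w for w, _ in fired)
    if total == 0.0:
        return 0.0
    return sum(w * turn for w, turn in fired) / total
```

Between the anchor points the controller interpolates smoothly, which is what lets a small rule base produce continuous collision-avoidance maneuvers.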

62 citations


Journal ArticleDOI
TL;DR: A CSCL environment based on two kinds of intelligent agents, domain agents and mediator agents, modeled based on ideas from distributed artificial intelligence is presented.
Abstract: One of the current trends in the research on AI applied to computer assisted learning is the area of Computer Supported Collaborative Learning (CSCL). A CSCL environment should include software elements that assist the learners in the application of the domain knowledge and promote opportunities of effective collaboration and learning. In this paper we present a CSCL environment based on two kinds of intelligent agents, domain agents and mediator agents, modeled based on ideas from distributed artificial intelligence. The domain agents are able to assist the learners in the application of domain knowledge elements. In order to support the collaboration between learners each mediator agent constructs and maintains a learner model represented as a set of beliefs about the capabilities, commitments, intentions and learning opportunities of its learner. Mediator agents cooperate by exchanging their beliefs about the capabilities of their learners and proposing the practice of those knowledge elements that promote assistance among learners, keeping the group heterogeneous and collaborative. Based on its learner model the mediator agent is able to support awareness in the environment and maintain the learning possibilities of its learner in a community of practice. This framework provides a collaborative environment for second language learning based on an implementation of the theoretical concepts of social learning (Vygotsky, L. S., 1978, Mind in Society: the Development of Higher Psychological Processes. London: Harvard University Press) and agent oriented programming (Shoham, Y., 1993, Agent-oriented programming. Artificial Intelligence, 60, 51–92).

53 citations


Journal ArticleDOI
TL;DR: In this article, a rule-based expert system called BOARDEX was used to screen officer personnel records being considered for Command and General Staff College (CGSC) for the first time.
Abstract: The purpose of this study was to begin the development and testing of an Expert System (ES) to screen officer personnel records being considered for Command and General Staff College (CGSC). The study included a review of the Artificial Intelligence (AI) literature relevant to the military and human resource management, problem selection, knowledge acquisition, knowledge representation, knowledge encoding, and knowledge testing and evaluation. The AI computer language PROLOG was used to develop a basic rule-based expert system called BOARDEX. Data from mock officer personnel records obtained from U.S. Army Total Personnel Command (PERSCOM) were entered into BOARDEX. Results from BOARDEX were correlated to votes generated by a board consisting of 10 human experts from the U.S. Army War College Class of 1996. The results show that BOARDEX's decisions are not statistically different from the decisions made by human experts.

52 citations


Journal ArticleDOI
TL;DR: A scalable probabilistic algorithm that expedites feature selection further and can scale up without sacrificing the quality of selected features and an incremental algorithm that adapts to the newly extended feature set and captures `concept drifts' by removing features from previously selected and newly added ones are proposed.
Abstract: Feature selection determines relevant features in the data. It is often applied in pattern classification, data mining, as well as machine learning. A special concern for feature selection nowadays is that the size of a database is normally very large, both vertically and horizontally. In addition, feature sets may grow as the data collection process continues. Effective solutions are needed to accommodate the practical demands. This paper concentrates on three issues: large number of features, large data size, and expanding feature set. For the first issue, we suggest a probabilistic algorithm to select features. For the second issue, we present a scalable probabilistic algorithm that expedites feature selection further and can scale up without sacrificing the quality of selected features. For the third issue, we propose an incremental algorithm that adapts to the newly extended feature set and captures `concept drifts' by removing features from previously selected and newly added ones. We expect that research on scalable feature selection will be extended to distributed and parallel computing and have impact on applications of data mining and machine learning.
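The probabilistic algorithm for the first issue can be sketched in the Las-Vegas style such methods typically use: sample random feature subsets and keep the smallest one that leaves the data consistent. The toy dataset and the simplified inconsistency test below are invented for illustration:

```python
import random

# Toy dataset: features 0 and 2 jointly determine the class (XOR);
# feature 1 is pure noise.
DATA = [
    ((0, 0, 0), 0), ((0, 1, 0), 0), ((0, 0, 1), 1), ((0, 1, 1), 1),
    ((1, 0, 0), 1), ((1, 1, 0), 1), ((1, 0, 1), 0), ((1, 1, 1), 0),
]

def inconsistent(subset):
    """True if two rows agree on the chosen features but differ in class."""
    seen = {}
    for features, label in DATA:
        key = tuple(features[i] for i in subset)
        if seen.setdefault(key, label) != label:
            return True
    return False

def probabilistic_select(n_features, trials=200, seed=1):
    """Randomly sample subsets; keep the smallest consistent one seen."""
    random.seed(seed)
    best = list(range(n_features))  # the full feature set is always consistent here
    for _ in range(trials):
        k = random.randint(1, n_features)
        subset = sorted(random.sample(range(n_features), k))
        if len(subset) < len(best) and not inconsistent(subset):
            best = subset
    return best

best = probabilistic_select(3)
```

Because each trial touches the data only once, this style of search is what makes the scalable variant in the paper feasible on large data sizes.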

46 citations


Journal ArticleDOI
TL;DR: This paper presents a novel inductive learning algorithm called the Inductive Learning Algorithm (ILA) for extracting production rules from a set of examples and describes application of the ILA to a range of data sets with different numbers of attributes and classes.
Abstract: In this paper we present a novel inductive learning algorithm called the Inductive Learning Algorithm (ILA) for extracting production rules from a set of examples. We also describe application of the ILA to a range of data sets with different numbers of attributes and classes. The results obtained show that the ILA is more general and robust than most other algorithms for inductive learning. Most of the time, ILA appears to be comparable to other well-known algorithms, such as AQ and ID3, if not better.

43 citations


Journal ArticleDOI
TL;DR: Two fuzzy personnel performance models will be represented and their variables fuzzified and an application to the performance evaluation in a higher educational setting will be proposed.
Abstract: This paper presents a proposed application of the fuzzy set theory to a personnel performance evaluation system. Two fuzzy personnel performance models will be represented and their variables fuzzified. An application to the performance evaluation in a higher educational setting will be proposed.

42 citations


Journal ArticleDOI
TL;DR: An automated graphical knowledge acquisition tool is presented, based upon object-oriented principles, that provides a representation independent methodology that can easily be mapped into any other object- oriented expert system or otherobject-oriented intelligent tools.
Abstract: Knowledge acquisition and knowledge representation are the fundamental building blocks of knowledge-based systems (KBSs). How to efficiently elicit knowledge from experts and transform this elicited knowledge into a machine usable format is a significant and time consuming problem for KBS developers. Object-orientation provides several solutions to persistent knowledge acquisition and knowledge representation problems including transportability, knowledge reuse, and knowledge growth. An automated graphical knowledge acquisition tool is presented, based upon object-oriented principles. The object-oriented graphical interface provides a modeling platform that is easily understood by experts and knowledge engineers. The object-oriented base for the automated KA tool provides a representation independent methodology that can easily be mapped into any other object-oriented expert system or other object-oriented intelligent tools.

40 citations


Journal ArticleDOI
TL;DR: This paper shows how to analyze a document containing natural language sentences, so as to recognize its main topics or themes, as well as to find similarities and discrepancies.
Abstract: The computer can easily carry out many operations on systematic collections of data when these are numbers:
• What are the data about? What are the main topics?
• Make a summary. Obtain a summary of May sales of a given store.
• Compare. Compare May sales in stores A and B.
• Find similarities and discrepancies. How are sales of stores A and B similar?
• Find averages. Find the sales in the South of Mexico, in Fall 1997.
• Find tendencies. Extrapolate.
On the other hand, when data appear in documents in Spanish, organized in sections, paragraphs and sentences, it is not possible for the computer to carry out the above operations. As much of human knowledge is in texts written in natural language, it is convenient to discover methods to carry out those operations. For that, the computer must understand or comprehend the text. This paper shows how to analyze a document containing natural language sentences, so as to recognize its main topics or themes.
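As a baseline for the topic-recognition operation listed first, a crude frequency-based topic spotter (which the paper's comprehension-based approach goes well beyond) can be sketched as follows; the stop-word list and sample text are invented:

```python
import re
from collections import Counter

# Tiny illustrative stop-word list; a real one would be far larger.
STOPWORDS = {"the", "of", "in", "and", "a", "to", "are", "is", "on"}

def main_topics(text, n=3):
    """Return the n most frequent content words as candidate topics."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [w for w, _ in counts.most_common(n)]

doc = ("Sales in the north rose. Sales of store A and store B are compared. "
       "The stores report sales monthly.")
topics = main_topics(doc)
```

Word frequency captures surface themes only; answering "compare" or "summarize" requests, as the abstract notes, requires the computer to actually comprehend the text.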

Journal ArticleDOI
TL;DR: Results on nonlinear system identification, nonlinear trajectory tracking, and input-to-state stability (ISS) of dynamic neural networks are presented and the Lyapunov approach is used.
Abstract: In this paper, the authors summarize their research related to dynamic neural control. In particular, results on nonlinear system identification, nonlinear trajectory tracking, and input-to-state stability (ISS) of dynamic neural networks are presented. The main analysis tool utilized is the Lyapunov approach. References for the detailed demonstrations are given. We illustrate the applicability of the results by means of examples.

Journal ArticleDOI
James R. Nolan
TL;DR: Fuzzy classification techniques are introduced as a basis for constructing rule-based scoring models that can encapsulate knowledge needed for consistent scoring results, which allows for more valid individual and group assessment.
Abstract: This article reports on the design and development of an expert fuzzy classification scoring system for grading student writing samples. The growing use of written response tests in the education sector provides fertile domain areas for the application of soft computing and expert systems technology. The main function of the expert fuzzy classification scoring system is to support teachers in the evaluation of student writing samples by providing them with a uniform framework for generating ratings based on the consistent application of scoring rubrics. The system has been tested using actual student response data. A controlled experiment demonstrated that teachers using the expert fuzzy classification scoring system can make assessments in less time and with a level of accuracy comparable to the best teacher graders. The article introduces fuzzy classification techniques as a basis for constructing rule-based scoring models that can encapsulate knowledge needed for consistent scoring results. This increased consistency in the application of the scoring rubrics allows for more valid individual and group assessment.

Journal ArticleDOI
Boo-Sik Kang, Jang-Hee Lee, Chung-Kwan Shin, Song Jin Yu, Sang Chan Park
TL;DR: This research provides a framework for implementing an integrated yield management system, which uses inductive decision trees and neural networks with a back propagation algorithm and a self-organizing mapping algorithm to manage yields over major manufacturing processes.
Abstract: Yield is one of the most important indices determining success in the semiconductor manufacturing business. Previous yield management efforts enhance the yield of a specific process through statistical and experimental analysis, but they fail to manage the yields of the overall manufacturing process. This research provides a framework for implementing an integrated yield management system, which uses inductive decision trees and neural networks with a back-propagation algorithm and a self-organizing mapping algorithm to manage yields over the major manufacturing processes.

Journal ArticleDOI
TL;DR: The payoff distribution scheme presented here opens the way for the use of the fuzzy classifier system in control tasks, and other mechanisms that improve learning speed are presented.
Abstract: This paper describes the fuzzy classifier system and a new payoff distribution scheme that performs true reinforcement learning. The fuzzy classifier system is a crossover between learning classifier systems and fuzzy logic controllers. By the use of fuzzy logic, the fuzzy classifier system allows variables to take continuous values, and thus can be applied to the identification and control of continuous dynamic systems. The fuzzy classifier system adapts the mechanics of learning classifier systems to fuzzy logic to evolve sets of coadapted fuzzy rules. The payoff distribution scheme presented here opens the way for the use of the fuzzy classifier system in control tasks. Additionally, other mechanisms that improve learning speed are presented.

Journal ArticleDOI
TL;DR: An Intelligent Task Management System (ITMS) is presented which comprises a rule-based inference mechanism responsible for the division of a client's job request into basic tasks and an Object-Oriented Virtual Agent module created using object-oriented technology for achieving automatic task decomposition and assignment.
Abstract: Recent research related to agent-based systems has seen significant advances made in terms of the `intelligence' level of collaborative and autonomous features of agents with a number of proposed frameworks reported in contemporary publications. However, the automatic decomposition of job requests into basic tasks to be carried out by relevant agents, which enhances the `intelligence' level of the system, has not received as much attention as it deserves. This article presents an Intelligent Task Management System (ITMS) which can usefully be deployed in a manufacturing information network. It comprises a rule-based inference mechanism responsible for the division of a client's job request into basic tasks and an Object-Oriented Virtual Agent (OOVA) module created using object-oriented technology for achieving automatic task decomposition and assignment. The prototype program of this ITMS has been developed and then tested in an emulated manufacturing environment, using CLIPS as the tool for building the rule-based program and Visual Basic 5 for constructing the OOVA module. It is expected that the experience of developing and implementing the ITMS may be useful for the design of the next generation of collaborative agent-based systems to be adopted in a manufacturing information network. In this article, details related to the structure, design and implementation of the ITMS are covered with actual program codes included. Further, a methodology for the design of the Rule-based Inference Mechanism (RIM) is also presented.
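The division of a client's job request into basic tasks can be sketched, at its simplest, as a rule lookup; the job types and task lists below are invented and far simpler than the ITMS's CLIPS rule base:

```python
# Invented decomposition rules: each job type maps to the basic tasks
# that would be assigned to relevant agents.
DECOMPOSITION_RULES = {
    "machine-part": ["fetch-drawing", "generate-nc-program", "schedule-machine"],
    "quote-order":  ["fetch-drawing", "estimate-cost", "draft-quotation"],
}

def decompose(job_request):
    """Map a job request onto its basic tasks via the rule base."""
    job_type = job_request["type"]
    if job_type not in DECOMPOSITION_RULES:
        raise ValueError(f"no rule for job type {job_type!r}")
    return [{"task": t, "job_id": job_request["id"]}
            for t in DECOMPOSITION_RULES[job_type]]

tasks = decompose({"type": "quote-order", "id": 42})
```

In the ITMS the resulting tasks would then be handed to the OOVA module for assignment to agents; here they are simply returned.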

Journal ArticleDOI
TL;DR: The proposed TNBN model is an extension of a standard Bayesian network, in which each temporal node represents an event or state change of a variable and the arcs represent causal–temporal relationships between nodes.
Abstract: Many real-world applications, such as industrial diagnosis, require an adequate representation and inference mechanism that combines uncertainty and time. In this work, we propose a novel approach for representing dynamic domains under uncertainty based on a probabilistic framework, called temporal nodes Bayesian networks (TNBN). The TNBN model is an extension of a standard Bayesian network, in which each temporal node represents an event or state change of a variable and the arcs represent causal–temporal relationships between nodes. Each temporal node has an associated probability distribution for its time of occurrence, where time is discretized into a finite number of temporal intervals, allowing a different number of intervals for each node and a different duration for the intervals within a node (multiple granularity). The main difference with previous probabilistic temporal models is that the representation is based on state changes at different times instead of state values at different times. Given this model, we can reason about the probability of occurrence of certain events, for diagnosis or prediction, using standard probability propagation techniques developed for Bayesian networks. The proposed approach is applied to fossil power plant diagnosis through two detailed case studies: power load increment and control level system failure. The results show that the proposed formalism could help to improve power plant availability through early diagnosis of events and disturbances.

Journal ArticleDOI
TL;DR: The relevance of an intelligent software agent approach in environmental scanning activities and ways that an agent can help in accomplishing scanning tasks are examined and a prototype system that is currently under development is described.
Abstract: A good knowledge and understanding of the business environment is a basic premise for strategic management. A system that is able to help managers actively scan the environment contributes to active executive support. This paper examines the relevance of an intelligent software agent approach in environmental scanning activities and explores ways that an agent can help in accomplishing scanning tasks. It then describes a prototype system that is currently under development. The system is designed to make use of potential business environment information resources on the World Wide Web to extract useful information for managers. Using the pulp and paper industry as a case context, it is shown in this paper how the agent is constructed and how it provides up-to-date industry news and market information to managers.

Journal ArticleDOI
TL;DR: KROL has sufficient expressive power to be used in applying demanding knowledge-based modeling methodologies, such as KADS and Generic Task, which are the major landmarks of the second-generation expert systems technology.
Abstract: This paper presents a knowledge representation object language (KROL) on top of Prolog. KROL is aimed at providing the ability to develop second-generation expert systems. The main aspects of KROL include multi-paradigm knowledge representation (first-order predicate logic, objects, rules), inference mechanisms at different levels of granularity, explanation facility, object-oriented database management module, and user-friendly interface. KROL has sufficient expressive power to be used in applying demanding knowledge-based modeling methodologies, such as KADS and Generic Task, which are the major landmarks of the second-generation expert systems technology. Four successful agricultural expert systems have been developed in the last 6 years using KROL. To demonstrate the language capabilities, we present an example of disorder diagnosis.

Journal ArticleDOI
TL;DR: This paper investigates an approach that deals with inter-tabular anomalies in decision tables to verify knowledge based systems (KBS) and partly uses heuristics where exhaustive checks would be too inefficient.
Abstract: The use of decision tables to verify knowledge based systems (KBS) has been advocated several times in the validation and verification (V&V) literature. However, one of the main drawbacks of these systems is that they fail to detect anomalies that occur over rule chains. In a decision table based context this means that anomalies that occur due to interactions between tables are neglected. These anomalies are called inter-tabular anomalies. In this paper we investigate an approach that deals with inter-tabular anomalies. One of the prerequisites for the approach was that it could be used by the knowledge engineer during the development of the KBS. This requires that the anomaly check can be performed on-line. As a result, the approach partly uses heuristics where exhaustive checks would be too inefficient. All detection facilities that will be described have been implemented in a table-based development tool called PROLOGA. The use of this tool will be briefly illustrated. In addition, some experiences in verifying large knowledge bases are discussed.

Journal ArticleDOI
TL;DR: An integrated knowledge-based system for alternative design decisions, materials selection and cost estimating, mainly for pre-design analysis, is presented, together with an analysis of alternative design and materials selection for a residential building.
Abstract: Building design and material type used are the two main parameters that have a significant impact on the cost of a building. Therefore, it is important for a cost estimating tool to have a mechanism that allows the designer to perform a rapid `what if' analysis on design alternatives and materials selection at the early planning stage without the accompaniment of detailed design and drawings. However, the traditional design cost estimating procedures generally focus on assessing costs after design decisions are made and conveyed to the construction drawing as a quantity take-off. On the other hand, comparative cost estimating (unit cost per square meter) uses historical data collected by averaging buildings of different designs, heights and materials. Even today's computer aided design (CAD) software incorporates a mechanism to take off quantities from completed designs and drawings. However, it is basically the same as traditional quantity take-off estimating because a single specified design still controls the resulting costs, and there is no alternative materials selection integrated into the cost system. This article presents an integrated knowledge-based system for alternative design decisions, materials selection and cost estimating, mainly for pre-design analysis. The knowledge-based system also supports materials selection and cost estimating after detailed design by providing edit tables for inputting the dimensions of building elements from completed designs and drawings or by directly importing from CAD. The system generates activities quantities, costs and lists of materials selected and their respective quantities. An analysis of alternative design and materials selection for a residential building using the integrated system is presented.

Journal ArticleDOI
TL;DR: KAMET is a formal plan based on models designed to manage knowledge acquisition from multiple knowledge sources and to improve the knowledge-acquisition phase and the knowledge-modeling process, making them more efficient.
Abstract: At the beginning of the 1980s, the Artificial Intelligence (AI) community showed little interest in research on methodologies for the construction of knowledge-based systems (KBS) and for knowledge acquisition (KA). The main idea was the rapid construction of prototypes with LISP machines, expert system shells, and so on. Over time, the community saw the need for a structured development of KBS projects, and KA was recognized as the critical stage and the bottleneck for the construction of KBS. Concerning KA, many publications have appeared since then. However, very few have focused on formal plans to manage knowledge acquisition from multiple knowledge sources. This paper addresses this important problem. KAMET is a formal plan based on models designed to manage knowledge acquisition from multiple knowledge sources. The objective of KAMET is to improve the knowledge-acquisition phase and the knowledge-modeling process, making them more efficient.

Journal ArticleDOI
TL;DR: A case-based expert system approach for developing new product properties that can effectively support all the steps in the design process from storing past cases, through retrieving similar cases, to adapting the retrieved case for the new product.
Abstract: Quality design in an enterprise is to design the formulation and processing parameters of products in a cost-effective manner so that the resulting properties can meet the quality specifications given by customers. This design problem often lacks the exact formula for predicting quality properties, and highly depends upon human designers' past experiences. This paper presents a case-based expert system approach for developing new product properties that can effectively support all the steps in the design process from storing past cases, through retrieving similar cases, to adapting the retrieved case for the new product. Within this approach, case-based reasoning uses a hierarchical case indexing method for efficient case retrieval, and provides sophisticated similarity metrics for accurate case matching. When there is a discrepancy between the most similar case retrieved and the new product, the case-based design process embraces the expert system technique to reconcile the discrepancy by using domain-specific knowledge on the relationships between design parameters and quality properties. Such integration of case-based reasoning and expert systems provides a systematic procedure for design engineers to retrieve past cases quickly and accurately, and to effectively accumulate their expertise for design adaptation.
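The retrieve step with weighted similarity metrics can be sketched as nearest-neighbour matching over normalised design parameters; the attribute names, weights, ranges, and cases below are invented, and the flat scan stands in for the paper's hierarchical index:

```python
# Hypothetical case base of past product designs.
CASES = [
    {"name": "case-1", "params": {"temp": 180.0, "pressure": 2.0, "additive": 0.5}},
    {"name": "case-2", "params": {"temp": 150.0, "pressure": 3.5, "additive": 0.2}},
    {"name": "case-3", "params": {"temp": 185.0, "pressure": 2.1, "additive": 0.4}},
]
WEIGHTS = {"temp": 1.0, "pressure": 5.0, "additive": 10.0}  # relative importance
RANGES  = {"temp": 50.0, "pressure": 2.0, "additive": 0.5}  # for normalisation

def similarity(query, case_params):
    """Weighted similarity in [0, 1]; 1 means identical on all attributes."""
    total_w = sum(WEIGHTS.values())
    score = 0.0
    for attr, w in WEIGHTS.items():
        diff = abs(query[attr] - case_params[attr]) / RANGES[attr]
        score += w * max(0.0, 1.0 - diff)
    return score / total_w

def retrieve(query):
    """Return the most similar past case to the new product spec."""
    return max(CASES, key=lambda c: similarity(query, c["params"]))

best = retrieve({"temp": 182.0, "pressure": 2.0, "additive": 0.45})
```

The retrieved case would then be handed to the adaptation step, where the expert-system rules reconcile any remaining discrepancy with the new product's specification.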

Journal ArticleDOI
TL;DR: The development of a framework titled PAFEX (Performance Analysis and Forecasting Expert System), a diagnostic expert system that has the objective of analyzing construction project performance and adopts causal diagnostic reasoning as a strategy and object-oriented programming technology as a knowledge representation scheme is described.
Abstract: Analyzing construction project performance is a diagnostic process, and forecasting project performance is a cognitive process. Both processes require the exercise of expert judgment and intelligence. Artificial Intelligence programming techniques such as expert systems and neural networks provide assistance to construction project managing personnel in capturing construction knowledge needed for analyzing and forecasting construction projects. This paper describes the development of a framework titled PAFEX (Performance Analysis and Forecasting Expert System). The framework has two main modules. The first module is a diagnostic expert system that has the objective of analyzing construction project performance and adopts causal diagnostic reasoning as a strategy and object-oriented programming technology as a knowledge representation scheme. The second module has the objective of forecasting the construction performance and applies artificial neural networks as a modeling strategy. The advantages and limitations of these two modeling processes are discussed. Issues regarding integrating the proposed approach with existing project management systems are also discussed.

Journal ArticleDOI
TL;DR: A set of critical issues underlying CBR are presented, then their consequences for a practical domain are explored and issues such as retrieval and adaptation are examined in the context of production process control.
Abstract: Explicit and quick problem-solving is one of the critical processes that creates a competitive advantage in the business environment. By analyzing the current problem-solving process in the production line, we identified that the main problem is a lack of knowledge sharing and of consistent, referable solutions. Therefore, all knowledge related to the problem-solving process must be accumulated and shared in an integrated system. When a process engineer is confronted with a new problem, he needs to receive a suggested solution based on accumulated data and knowledge. The Case-Based Reasoning (CBR) approach utilizes previous experience in the form of problem descriptions, or cases. To date, however, most Case-Based Systems (CBS) have tended to focus on relatively simple domains. The current study involves a project to develop a decision support system for a complex production process. This paper presents a set of critical issues underlying CBR, then explores their consequences for a practical domain. For concreteness, issues such as retrieval and adaptation are examined in the context of production process control. In this research, the evaluation of the system is carried out by comparing the solution of the process engineer with the solution proposed by the system.

Journal ArticleDOI
TL;DR: It has been demonstrated that the adaptive knowledge-acquisition system performs better than FOIL on inducing logical relations from perfect or noisy training examples, and this result suggests that the process of natural selection and evolution can successfully evolve a high-performance learning system.
Abstract: The knowledge-acquisition bottleneck greatly obstructs the development of knowledge-based systems. One popular approach to knowledge acquisition uses inductive concept learning to derive knowledge from examples stored in databases. However, existing learning systems cannot improve themselves automatically. This paper describes an adaptive knowledge-acquisition system that can learn first-order logical relations and improve itself automatically. The system is composed of an external interface, a base of search biases, a knowledge base of background knowledge, an example database, an empirical ILP learner, a meta-level learner, and a learning controller. In this system, the empirical ILP learner performs a top-down search of the hypothesis space defined by the concept description language, the language bias, and the background knowledge. The search is directed by search biases, which can be induced and refined by the meta-level learner based on generic genetic programming. It has been demonstrated that the adaptive knowledge-acquisition system outperforms FOIL in inducing logical relations from perfect or noisy training examples. This result implies that the search bias evolved by evolutionary learning is better than that of FOIL, which was designed by a top researcher in the field. Consequently, generic genetic programming is a promising technique for implementing a meta-level learning system. The result is very encouraging, as it suggests that the process of natural selection and evolution can successfully evolve a high-performance learning system.
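The meta-level learner's core idea, evolving program trees against a fitness measure, can be illustrated with a toy, mutation-only genetic-programming sketch. Here the trees fit a tiny symbolic-regression target as a stand-in for evolving ILP search biases; everything below is an illustrative assumption, not the paper's bias language.

```python
# Toy genetic programming: evolve arithmetic expression trees (nested
# tuples) toward a target function by elitist selection plus subtree
# mutation. A stand-in for evolving search biases, not the paper's setup.
import random

OPS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def random_tree(depth):
    # Leaf: the variable x or a random constant; else a binary operator.
    if depth == 0 or random.random() < 0.3:
        return random.choice(["x", round(random.uniform(-2, 2), 2)])
    return (random.choice(list(OPS)), random_tree(depth - 1),
            random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree):
    # Negative squared error on sample points of the target x*x + 1.
    pts = [(-1, 2), (0, 1), (1, 2), (2, 5)]
    return -sum((evaluate(tree, x) - y) ** 2 for x, y in pts)

def mutate(tree):
    if random.random() < 0.2:        # replace this subtree entirely
        return random_tree(2)
    if not isinstance(tree, tuple):  # keep the leaf as-is
        return tree
    op, left, right = tree
    return (op, mutate(left), mutate(right))

random.seed(1)
pop = [random_tree(3) for _ in range(60)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:30] + [mutate(random.choice(pop[:30])) for _ in range(30)]
best = max(pop, key=fitness)
print(best, fitness(best))
```

In the paper's architecture the individuals would encode search biases and the fitness would come from how well the ILP learner performs under each bias, rather than from a regression error.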

Journal ArticleDOI
TL;DR: Tests on well-researched cases have shown that the algorithm is capable of finding a better loading pattern—enabling the reactor to run both longer and more efficiently per cycle—than solutions reported in the domain literature and found by other methods, such as expert systems and simulated annealing.
Abstract: How to `refuel' a nuclear power reactor, when it is shut down every year or so between two successive operation cycles, is the `in-core fuel management' problem. Solving it requires designing and simulating a safe and efficient fuel loading pattern. `Reload design' plays a crucial role in nuclear power plant operation, in terms of both economy and safety. This article presents FuelGen, a system embodying a specialized genetic algorithm for designing refuellings. Tests on well-researched cases have shown that the algorithm is capable of finding a better loading pattern (enabling the reactor to run both longer and more efficiently per cycle) than solutions reported in the domain literature and found by other methods, such as expert systems and simulated annealing. Over a decade, the parent project Fuelcon first inaugurated the rule-driven refuelling paradigm, then turned to probing hybrid architectures. Its sequel, FuelGen, radically supersedes Fuelcon's search mechanism while retaining the architectural and ergonomic outlook that Fuelcon had evolved.
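A genetic algorithm over loading-pattern permutations, in the spirit of FuelGen, can be sketched minimally as below. The toy fitness function stands in for the reactor-physics simulation a real reload-design system would call; all parameters are illustrative assumptions.

```python
# Minimal permutation GA sketch: elitist selection, order crossover (OX),
# and swap mutation over candidate loading patterns. The fitness is a
# hypothetical placeholder, not a core simulation.
import random

def fitness(pattern):
    # Placeholder objective: reward assemblies placed near their target
    # positions (a real system would score the simulated core instead).
    return -sum(abs(pos - fuel) for pos, fuel in enumerate(pattern))

def crossover(p1, p2):
    # Order crossover (OX): the child stays a valid permutation.
    a, b = sorted(random.sample(range(len(p1)), 2))
    middle = p1[a:b]
    rest = [g for g in p2 if g not in middle]
    return rest[:a] + middle + rest[a:]

def mutate(pattern, rate=0.2):
    pattern = pattern[:]
    if random.random() < rate:           # occasionally swap two positions
        i, j = random.sample(range(len(pattern)), 2)
        pattern[i], pattern[j] = pattern[j], pattern[i]
    return pattern

def evolve(n_assemblies=8, pop_size=30, generations=60, seed=0):
    random.seed(seed)
    pop = [random.sample(range(n_assemblies), n_assemblies)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]  # elitist truncation selection
        children = [mutate(crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

The permutation encoding matters: OX and swap mutation guarantee every candidate remains a legal assignment of fuel assemblies to core positions.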

Journal ArticleDOI
TL;DR: How a three-layer artificial neural network (NN) can be used to improve the accuracy of short-term automatic numerical prediction of road surface temperature in order to cut winter road maintenance costs, reduce environmental damage from oversalting and provide safer roads for road users is described.
Abstract: This paper describes how a three-layer artificial neural network (NN) can be used to improve the accuracy of short-term (3–12 hours) automatic numerical prediction of road surface temperature, in order to cut winter road maintenance costs, reduce environmental damage from oversalting and provide safer roads for road users. In this paper, the training of the network is based on historical and preliminary meteorological parameters measured at an automatic roadside weather station, and the target of the training is hourly error of original numerical forecasts. The generalization of the trained network is then used to adjust the original model forecast. The effectiveness of the network in improving the accuracy of numerical model forecasts was tested at 39 sites in eight countries. Results of the tests show that the NN technique is able to reduce absolute error and root-mean-square error of temperature forecasts by 9.9–29%, and increase the accuracy of frost/ice prediction.
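The correction scheme described above (train a small network on the error of the numerical forecast, then subtract the predicted error from new forecasts) can be sketched as follows. The synthetic data, layer sizes, and learning rate are illustrative assumptions, not the paper's setup.

```python
# A small three-layer network learns the error of a numerical forecast
# from meteorological inputs; the predicted error is then subtracted
# from new forecasts. Data here is synthetic and hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: 2 input features (e.g. air temperature, cloud
# cover, pre-scaled to [-1, 1]) and a forecast error that depends
# nonlinearly on them (a hypothetical relation).
X = rng.uniform(-1, 1, size=(200, 2))
y = (0.5 * X[:, 0] - 0.3 * X[:, 1] ** 2).reshape(-1, 1)

# Three layers: inputs -> tanh hidden units -> linear output.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

lr = 0.1
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)              # forward pass
    pred = h @ W2 + b2
    g_out = 2 * (pred - y) / len(X)       # d(MSE)/d(pred)
    g_h = (g_out @ W2.T) * (1 - h ** 2)   # backprop through tanh
    W2 -= lr * (h.T @ g_out); b2 -= lr * g_out.sum(0)
    W1 -= lr * (X.T @ g_h);   b1 -= lr * g_h.sum(0)

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))

# Adjust a new numerical forecast by subtracting the predicted error.
x_new = np.array([[0.2, -0.4]])
raw_forecast = 3.0                        # deg C, from the numerical model
predicted_err = float((np.tanh(x_new @ W1 + b1) @ W2 + b2)[0, 0])
corrected = raw_forecast - predicted_err
print(f"training MSE: {mse:.4f}, corrected forecast: {corrected:.2f} deg C")
```

In the paper, the inputs are roadside-station measurements and the training target is the hourly error of the original numerical road-surface-temperature forecast.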

Journal ArticleDOI
TL;DR: A new fuzzy expert system for real-time process condition monitoring and incident prevention is developed based on dynamic membership functions of fuzzy systems and has been successfully used in a chemical pulp mill.
Abstract: A new fuzzy expert system for real-time process condition monitoring and incident prevention is developed. Its reasoning strategy is based on the dynamic membership functions of fuzzy systems. Through a multimedia user interface, the fuzzy expert system codifies expert knowledge for handling incidents, performs process condition monitoring, and provides operation support. A prototype of this system has been successfully used in a chemical pulp mill for process condition monitoring and incident prevention.
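One simple reading of the dynamic-membership idea is a membership function that shifts with the current operating setpoint. The sketch below illustrates this with a triangular fuzzy set and a min-combined monitoring rule; the variable names, widths, and rule are hypothetical, not the authors' formulation.

```python
# Fuzzy condition-monitoring sketch with a "dynamic" membership
# function: the fuzzy set for "level high" is recentred on the current
# setpoint. All numbers and names are illustrative assumptions.

def triangular(x, left, peak, right):
    """Degree in [0, 1] to which x belongs to a triangular fuzzy set."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def level_high(reading, setpoint):
    # Dynamic membership: the fuzzy set shifts with the setpoint.
    return triangular(reading, setpoint, setpoint + 5.0, setpoint + 10.0)

# Rule: IF level is high AND trend is rising THEN raise an alarm,
# with min as the fuzzy AND (rule firing strength).
def alarm_degree(reading, setpoint, trend_rising):
    return min(level_high(reading, setpoint), trend_rising)

deg = alarm_degree(reading=57.0, setpoint=50.0, trend_rising=0.8)
print(deg)  # prints 0.6: level_high = (60-57)/5 = 0.6, min(0.6, 0.8) = 0.6
```

Because the membership function follows the setpoint, the same rule base keeps working as the process moves between operating regimes.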

Journal ArticleDOI
TL;DR: The evaluations performed in this work indicate that SADEP can potentially improve plant availability through early diagnosis of disturbances that could lead to plant shutdown.
Abstract: Artificial Intelligence applications in large-scale industry, such as fossil fuel power plants, require the ability to manage uncertainty and time. In these domains, knowledge about the process comes from experts' experience and is generally expressed in a vague, fuzzy way using ill-defined linguistic terms. In this paper, we present a fuzzy expert system shell to assist operators of fossil power plants. The shell, called SADEP, is based on a new methodology for dealing with uncertainty and time called the Fuzzy Temporal Network (FTN). The FTN provides a formal and systematic structure for modeling the temporal evolution of a process under uncertainty. The inference mechanism for an FTN consists of calculating the possibility degree of the real-time occurrence of events using the sup-min fuzzy compositional rule. An FTN can be used to recognize the significance of events and state variables with respect to current plant conditions and to predict the future propagation of disturbances. SADEP was validated on the diagnosis of two detailed disturbances of a fossil power plant: a power load increment in the drum level and a water condenser pump failure. The evaluations performed in this work indicate that SADEP can potentially improve plant availability through early diagnosis of disturbances that could lead to plant shutdown.
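The sup-min compositional rule the FTN relies on can be sketched directly: given a fuzzy observation A over process states and a fuzzy relation R from states to predicted events, the possibility of each event is B(y) = sup_x min(A(x), R(x, y)). The states, events, and degrees below are illustrative, not the paper's plant model.

```python
# Sup-min fuzzy composition: propagate possibility degrees from an
# observed fuzzy state through a fuzzy relation to predicted events.
# All states, events, and degrees are hypothetical examples.

def sup_min(A, R):
    """A: dict state -> degree; R: dict (state, event) -> degree."""
    events = {e for (_, e) in R}
    return {e: max(min(A[s], R.get((s, e), 0.0)) for s in A)
            for e in events}

# Fuzzy observation of the current plant state (hypothetical degrees).
A = {"level_low": 0.7, "level_normal": 0.3}

# Fuzzy temporal relation: how strongly each state presages each event.
R = {
    ("level_low", "pump_trip"): 0.9,
    ("level_low", "no_fault"): 0.1,
    ("level_normal", "no_fault"): 0.8,
}

B = sup_min(A, R)
print(B)  # possibility degree of each predicted event
```

Chaining such compositions over time steps is what lets an FTN predict how a disturbance may propagate from the current plant conditions.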