
Showing papers in "Expert Systems With Applications in 2000"


Journal ArticleDOI
Kyoung-jae Kim1, Ingoo Han1
TL;DR: A genetic algorithms (GA) approach to feature discretization and to the determination of connection weights for artificial neural networks (ANNs) to predict the stock price index is proposed.
Abstract: This paper proposes a genetic algorithms (GA) approach to feature discretization and the determination of connection weights for artificial neural networks (ANNs) to predict the stock price index. Previous research proposed many hybrid models of ANN and GA for training the network, feature subset selection, and topology optimization. In most of these studies, however, the GA is used only to improve the learning algorithm itself. In this study, the GA is employed not only to improve the learning algorithm, but also to reduce the complexity of the feature space. The GA simultaneously optimizes the connection weights between layers and the thresholds for feature discretization. The genetically evolved weights mitigate the well-known limitations of the gradient descent algorithm. In addition, globally searched feature discretization reduces the dimensionality of the feature space and eliminates irrelevant factors. Experimental results show that the GA approach to the feature discretization model outperforms the other two conventional models.

669 citations
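As a rough illustration of the mechanism described above, the sketch below evolves a connection weight, a bias, and a discretization threshold together in one chromosome. The data set, network size (a single unit), and GA operators are simplified assumptions for the sketch, not the paper's actual configuration:

```python
import random

random.seed(0)

# Toy data: one continuous feature per sample (hypothetical, not the paper's
# stock-index factors); label 1 when the underlying value is high.
DATA = [(0.1, 0), (0.2, 0), (0.3, 0), (0.6, 1), (0.8, 1), (0.9, 1)]

def discretize(x, threshold):
    """Reduce the feature space by mapping a continuous feature to {0, 1}."""
    return 1.0 if x >= threshold else 0.0

def predict(chromosome, x):
    weight, bias, threshold = chromosome   # all three are evolved together
    return 1 if weight * discretize(x, threshold) + bias > 0 else 0

def fitness(chromosome):
    return sum(predict(chromosome, x) == y for x, y in DATA) / len(DATA)

def evolve(pop_size=20, generations=40):
    pop = [[random.uniform(-1, 1), random.uniform(-1, 1), random.random()]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]                 # elitist selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]       # crossover
            child[random.randrange(3)] += random.gauss(0, 0.1)  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because the discretization threshold sits in the chromosome next to the weights, the same global search that avoids gradient-descent pitfalls also prunes the feature representation, which is the paper's central idea.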


Journal ArticleDOI
TL;DR: A hybrid intelligent system that predicts the failure of firms based on past financial performance data, combining the rough set approach and a neural network, is proposed; the number of evaluation criteria, such as financial ratios and qualitative variables, is reduced with no information loss through the rough set approach.
Abstract: This paper proposes a hybrid intelligent system that predicts the failure of firms based on past financial performance data, combining the rough set approach and a neural network. Through the rough set approach we obtain a reduced information table, which implies that the number of evaluation criteria, such as financial ratios and qualitative variables, is reduced with no information loss. This reduced information is then used to develop classification rules and to train a neural network to infer appropriate parameters. The rules developed by rough set analysis show the best prediction accuracy when a case matches one of the rules. The rationale of our hybrid system is to use the rules developed by rough sets for an object that matches any of the rules, and the neural network for one that does not. The effectiveness of our methodology was verified by experiments comparing traditional discriminant analysis and the neural network approach with our hybrid approach. For the experiment, the financial data of 2400 Korean firms during the period 1994–1997 were selected, and k-fold cross-validation was used for validation.

308 citations
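The attribute reduction step described above, shrinking the set of evaluation criteria with no loss of discriminating power, can be sketched as a brute-force rough-set reduct over a toy decision table (the attribute names and values below are hypothetical):

```python
from itertools import combinations

# Toy decision table: rows are firms, last value is the decision (1 = failed).
# Attribute names and discretized values are hypothetical.
ATTRS = ('liquidity', 'leverage', 'profit')
TABLE = [
    ('low',  'high', 'low',  1),
    ('low',  'high', 'high', 1),
    ('high', 'low',  'low',  0),
    ('high', 'high', 'high', 0),
]

def consistent(attr_idx):
    """True if the attribute subset still separates the decision classes."""
    seen = {}
    for row in TABLE:
        key = tuple(row[i] for i in attr_idx)
        if seen.setdefault(key, row[-1]) != row[-1]:
            return False   # two rows agree on the subset but differ in decision
    return True

def reduct():
    """Smallest attribute subset with no information loss (brute force)."""
    for size in range(1, len(ATTRS) + 1):
        for subset in combinations(range(len(ATTRS)), size):
            if consistent(subset):
                return [ATTRS[i] for i in subset]

print(reduct())   # ['liquidity'] — one ratio already separates the classes
```

In the paper's hybrid, rules induced over such a reduct handle the cases they match, and the trained neural network handles the remainder.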


Journal ArticleDOI
TL;DR: The findings indicate that both the fuzzy clustering and self-organizing neural networks are promising classification tools for identifying potentially failing banks.
Abstract: In this paper, we present experimental results of fuzzy clustering and two self-organizing neural networks used as classification tools for identifying potentially failing banks. We first describe the distinctive characteristics of the fuzzy clustering algorithm, which provides an estimate of the likelihood of bank failure. We then compare the results of the closest hard partitioning of fuzzy clustering with those of two self-organizing neural networks, and present our results as a ranking structure of relative bankruptcy likelihood. Our findings indicate that both fuzzy clustering and self-organizing neural networks are promising classification tools for identifying potentially failing banks.

122 citations
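A minimal sketch of the fuzzy clustering step, assuming standard fuzzy c-means with fuzzifier m = 2 on a hypothetical one-dimensional financial ratio (the paper's actual features and algorithm details may differ):

```python
# Hypothetical one-dimensional "financial ratio" per bank: two loose groups.
X = [0.1, 0.15, 0.2, 0.8, 0.85, 0.9]
C, M = 2, 2.0                      # number of clusters, fuzzifier m

centers = [0.0, 1.0]               # initial cluster centers
for _ in range(50):
    # u[i][j]: membership of bank i in cluster j (each row sums to 1)
    u = []
    for x in X:
        d = [max(abs(x - c), 1e-9) for c in centers]
        u.append([1.0 / sum((d[j] / d[k]) ** (2 / (M - 1)) for k in range(C))
                  for j in range(C)])
    # update each center as the membership-weighted mean of the data
    centers = [sum(u[i][j] ** M * X[i] for i in range(len(X))) /
               sum(u[i][j] ** M for i in range(len(X)))
               for j in range(C)]

print(centers)
```

The membership matrix u plays the role of the failure-likelihood estimate; the closest hard partition mentioned in the abstract simply assigns each bank to its highest-membership cluster.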


Journal ArticleDOI
JB Noh1, Kun Chang Lee, Jae Kyeong Kim2, Jae Kwang Lee, Soung Hie Kim1 
TL;DR: The cognitive map (CM) is proposed as the main vehicle for formalizing tacit knowledge, with case-based reasoning as a tool for storing CM-driven tacit knowledge in the form of frame-typed cases and retrieving appropriate tacit knowledge from the case base for a new problem.
Abstract: Knowledge is at the heart of knowledge management. In the literature, many studies have covered the role of knowledge in improving the performance of management. However, few studies investigate knowledge itself in the arena of knowledge management. Knowledge circulating in an organization may be explicit or tacit. Until now, the knowledge management literature has mainly focused on explicit knowledge. Tacit knowledge, on the other hand, plays an important role in the success of knowledge management, but is relatively hard to formalize and reuse. Therefore, research proposing the explication and reuse of tacit knowledge would contribute significantly to knowledge management research. In this sense, we propose using the cognitive map (CM) as the main vehicle for formalizing tacit knowledge, and case-based reasoning as a tool for storing CM-driven tacit knowledge in the form of frame-typed cases and retrieving appropriate tacit knowledge from the case base for a new problem. Our proposed methodology was applied to a credit analysis problem in which decision-makers need tacit knowledge to assess whether a firm under consideration is healthy or not. Experimental results showed that our methodology for tacit knowledge management can provide decision-makers with robust knowledge-based support.

118 citations


Journal ArticleDOI
TL;DR: A more generalized fuzzy Petri net model for expert systems is proposed, called AFPN (Adaptive Fuzzy Petri Net), which has both the features of a fuzzy Petri net and the learning ability of a neural network.
Abstract: Knowledge in fields like medicine, science and engineering is very dynamic because of the continuous contributions of research and development. Therefore, it would be very useful to design knowledge-based systems that can be adjusted, like human cognition and thinking, according to knowledge dynamics. Aiming at this objective, a more generalized fuzzy Petri net model for expert systems is proposed, called the Adaptive Fuzzy Petri Net (AFPN). This model has both the features of a fuzzy Petri net and the learning ability of a neural network. Once trained, an AFPN model can be used for dynamic knowledge representation and inference. After the introduction of the AFPN model, the reasoning algorithm and the weight learning algorithm are developed. An example is included as an illustration.

98 citations


Journal ArticleDOI
TL;DR: An expert system for differential diagnosis of erythemato-squamous diseases incorporating decisions made by three classification algorithms: nearest neighbor classifier, naive Bayesian classifier and voting feature intervals-5 is presented.
Abstract: This paper presents an expert system for differential diagnosis of erythemato-squamous diseases incorporating decisions made by three classification algorithms: the nearest neighbor classifier, the naive Bayesian classifier and voting feature intervals-5. This tool enables doctors to differentiate six types of erythemato-squamous diseases using clinical and histopathological parameters obtained from a patient. The program also gives explanations for the classifications of each classifier. The patient records are also maintained in a database for future reference.

83 citations
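The voting scheme can be illustrated with two of the three classifiers named above, a nearest neighbor classifier and a naive Bayesian classifier, combined by majority vote. The records and feature encoding are invented for the sketch, and the third algorithm, voting feature intervals-5, is omitted:

```python
from collections import Counter
import math

# Toy records: (encoded clinical features, disease label). Hypothetical data,
# not the paper's clinical/histopathological parameters.
TRAIN = [((1, 0), 'psoriasis'), ((1, 1), 'psoriasis'),
         ((0, 1), 'eczema'),    ((0, 0), 'eczema')]

def knn(x, k=1):
    """k-nearest-neighbor vote by squared Euclidean distance."""
    ranked = sorted(TRAIN, key=lambda r: sum((a - b) ** 2 for a, b in zip(r[0], x)))
    return Counter(label for _, label in ranked[:k]).most_common(1)[0][0]

def naive_bayes(x):
    """Categorical naive Bayes with Laplace smoothing, in log space."""
    labels = {label for _, label in TRAIN}
    def score(label):
        rows = [f for f, l in TRAIN if l == label]
        p = math.log(len(rows) / len(TRAIN))
        for i, v in enumerate(x):
            matches = sum(1 for f in rows if f[i] == v)
            p += math.log((matches + 1) / (len(rows) + 2))
        return p
    return max(labels, key=score)

def diagnose(x):
    # On a 1-1 split the nearest-neighbor vote (listed first) wins the tie.
    return Counter([knn(x), naive_bayes(x)]).most_common(1)[0][0]

print(diagnose((1, 0)))   # 'psoriasis'
```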


Journal ArticleDOI
TL;DR: Two methods based on rough set analysis were developed and merged with the integration of neural networks and expert systems, forming a new hybrid architecture of expert systems called a rough neural expert system, which has some advantages over conventional expert system architectures.
Abstract: The knowledge acquisition process is a crucial stage in the technology of expert systems. However, this process is not well defined. One promising structured source of learning can be found in recent work on neural network technology: a neural network can serve as the knowledge base of an expert system that performs classification tasks. Another way of learning is to use the rough set, a new mathematical tool for dealing with uncertain and imprecise data. Two methods based on rough set analysis were developed and merged with the integration of neural networks and expert systems, forming a new hybrid architecture of expert systems called a rough neural expert system. The first method works as a pre-processor for the neural networks within the architecture and is called a pre-processing rough engine, while the second was added to the architecture to build a new structure of inference engine called a rough neural inference engine. Consequently, a new knowledge base architecture was designed, based on the connectionist structure of neural networks and the reduction of rough set analysis. The performance of the proposed system was evaluated by an application to medical diagnosis using a real example of hepatitis diseases. The results indicate that the new methods have improved the inference procedures of expert systems, and show that the new architecture has some advantages over conventional expert system architectures.

57 citations


Journal ArticleDOI
TL;DR: A tool is presented which facilitates the fulfilment of the tasks included in the methodology, whilst covering quantitative as well as heuristic aspects, and the result is an intelligent tool for the validation of intelligent systems.
Abstract: One of the most important phases in the methodology for the development of intelligent systems is the evaluation of the performance of the implemented product. This process is popularly known as verification and validation (V&V). The majority of tools designed to support the V&V process are directed preferentially at verification, to the detriment of validation, and are limited to an analysis of the internal structures of the system. The authors of this article propose a methodology for the development of a results-oriented validation, and present a tool (SHIVA) that facilitates the fulfilment of the tasks included in the methodology, covering quantitative as well as heuristic aspects. The result is an intelligent tool for the validation of intelligent systems.

57 citations


Journal ArticleDOI
TL;DR: A new algorithm is proposed to deal with the problem of producing a set of maximally general fuzzy rules for coverage of training examples from quantitative data and can be used to build a prototype knowledge base in a fuzzy expert system.
Abstract: Expert systems have been widely used in domains where mathematical models cannot be easily built, human experts are not available or the cost of querying an expert is high. Machine learning or data mining can extract desirable knowledge or interesting patterns from existing databases and ease the development bottleneck in building expert systems. In the past we proposed a method [Hong, T.P., Wang, T.T., Wang, S.L. (2000). Knowledge acquisition from quantitative data using the rough-set theory. Intelligent Data Analysis (in press).], which combined the rough set theory and the fuzzy set theory to produce all possible fuzzy rules from quantitative data. In this paper, we propose a new algorithm to deal with the problem of producing a set of maximally general fuzzy rules for coverage of training examples from quantitative data. A rule is maximally general if no other rule exists that is both more general and with larger confidence than it. The proposed method first transforms each quantitative value into a fuzzy set of linguistic terms using membership functions and then calculates the fuzzy lower approximations and the fuzzy upper approximations. The maximally general fuzzy rules are then generated based on these fuzzy approximations by an iterative induction process. The rules derived can then be used to build a prototype knowledge base in a fuzzy expert system.

54 citations
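The first step of the proposed method, transforming a quantitative value into a fuzzy set of linguistic terms via membership functions, might look as follows; the terms and triangular membership parameters are assumptions for the sketch:

```python
def triangular(x, a, b, c):
    """Membership of x in the triangular fuzzy set (a, b, c), peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical linguistic terms for a quantitative attribute scaled to [0, 100].
TERMS = {'low': (-50, 0, 50), 'medium': (0, 50, 100), 'high': (50, 100, 150)}

def fuzzify(x):
    """Transform one quantitative value into a fuzzy set of linguistic terms."""
    return {term: round(triangular(x, *abc), 3)
            for term, abc in TERMS.items() if triangular(x, *abc) > 0}

print(fuzzify(70))   # {'medium': 0.6, 'high': 0.4}
```

The fuzzy lower and upper approximations, and the iterative induction of maximally general rules, then operate on these membership degrees rather than on the raw quantitative values.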


Journal ArticleDOI
Kyong Jo Oh1, Ingoo Han1
TL;DR: This study examines the predictability of an integrated neural network model for interest-rate forecasting that combines change-point detection with the backpropagation neural network (BPN).
Abstract: Interest rates are one of the most closely watched variables in the economy. They have been studied by a number of researchers as they strongly affect other economic and financial parameters. Contrary to other chaotic financial data, the movement of interest rates has a series of change points owing to the monetary policy of the US government. The basic concept of this proposed model is to obtain intervals divided by change points, to identify them as change-point groups, and to use them in interest rates forecasting. The proposed model consists of three stages. The first stage is to detect successive change points in the interest rates dataset. The second stage is to forecast the change-point group with the backpropagation neural network (BPN). The final stage is to forecast the output with BPN. This study then examines the predictability of the integrated neural network model for interest rates forecasting using change-point detection.

51 citations
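The first stage, detecting change points in the series, can be sketched as a least-squares split search; this is a stand-in for whatever detection method the paper actually uses, and the rate values are hypothetical:

```python
def change_point(series):
    """Index that best splits the series into two constant-mean segments."""
    def sse(seg):
        mean = sum(seg) / len(seg)
        return sum((v - mean) ** 2 for v in seg)
    return min(range(1, len(series)),
               key=lambda i: sse(series[:i]) + sse(series[i:]))

# Hypothetical interest-rate path with a policy shift after the sixth value.
rates = [5.0, 5.1, 5.0, 4.9, 5.0, 5.1, 3.0, 3.1, 2.9, 3.0]
print(change_point(rates))   # 6: the first index of the new regime
```

Applied successively to each resulting segment, such a split yields the change-point groups; the paper's second and third stages then train a BPN to classify the group and to produce the forecast, which is beyond this sketch.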


Journal ArticleDOI
Taeksoo Shin1, Ingoo Han1
TL;DR: An integrated thresholding design of the optimal or near-optimal wavelet transformation by genetic algorithms (GAs) to represent a significant signal most suitable in artificial neural network models is proposed and applied to Korean won/US dollar exchange-rate forecasting.
Abstract: Detecting the features of significant patterns in historical data is crucial for good performance in time-series forecasting. Wavelet analysis, which processes information effectively at different scales, can be very useful for feature detection from complex and chaotic time series. In particular, the specific local properties of wavelets can be useful in describing signals with discontinuous or fractal structure in financial markets. Wavelet analysis also allows the removal of noise-dependent high frequencies while conserving the signal-bearing high-frequency terms of the signal. However, one of the most critical issues in applying wavelet analysis is choosing the correct wavelet thresholding parameters. If the threshold is too small or too large, the thresholded result will tend to overfit or underfit the data. The threshold has so far been selected arbitrarily or by a few statistical criteria. This study proposes an integrated thresholding design of the optimal or near-optimal wavelet transformation by genetic algorithms (GAs) to represent the signal most suitable for artificial neural network models. This approach is applied to Korean won/US dollar exchange-rate forecasting. The experimental results show that this integrated approach using GAs performs better than three other wavelet thresholding algorithms (cross-validation, best basis selection and best level tree).
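A minimal sketch of wavelet thresholding with a searched threshold: a one-level Haar transform with soft thresholding, and a grid search standing in for the GA. The fitness here compares against a known clean signal purely for illustration; in the paper, the threshold is scored by forecasting performance of the downstream ANN:

```python
import random

random.seed(0)

def haar(signal):
    """One-level Haar transform: pairwise averages and differences."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def inverse_haar(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

def denoise(signal, threshold):
    """Soft-threshold the detail coefficients, then reconstruct."""
    approx, detail = haar(signal)
    detail = [max(abs(d) - threshold, 0.0) * (1 if d > 0 else -1)
              for d in detail]
    return inverse_haar(approx, detail)

clean = [float(i % 4) for i in range(16)]         # hypothetical target signal
noisy = [v + random.gauss(0, 0.1) for v in clean]

def fitness(threshold):
    """Stand-in fitness: negated squared error against the clean signal."""
    restored = denoise(noisy, threshold)
    return -sum((a - b) ** 2 for a, b in zip(restored, clean))

# A coarse grid search stands in for the GA over the thresholding parameter.
best = max((t / 30 for t in range(31)), key=fitness)
```

A too-small threshold keeps the noise (overfitting); a too-large one erases genuine high-frequency detail (underfitting), which is exactly the trade-off the searched threshold balances.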

Journal ArticleDOI
TL;DR: Fuzzy reasoning algorithms were designed to evaluate and assess the likelihood of equipment failure mode precipitation and aggravation and an approximate reasoning scheme which considers local, product, and adjacent machinery effects was constructed to prioritize the equipment failure modes likely to precipitate in the process.
Abstract: A new framework for the implementation of reliability centered maintenance (RCM) in the initial design phases of industrial chemical processes was developed and implemented. Fuzzy reasoning algorithms were designed to evaluate and assess the likelihood of equipment failure mode precipitation and aggravation. Furthermore, an approximate reasoning scheme which considers local, product, and adjacent machinery effects was constructed to prioritize the equipment failure modes likely to precipitate in the process. The new RCM approach was implemented through an expert system. The computer system reads the process flowsheet generated by ASPEN Plus and, based on relevant machine operating data, it provides the user with the final process RCM availability structure diagram. This availability diagram consists of a listing of all critical machine failure modes likely to precipitate, prioritized according to their overall negative impact on the process, as well as important information on their corresponding local and system effects, and suggested controls for their detection.

Journal ArticleDOI
TL;DR: In this paper, a low-level approach to measuring the value of separate knowledge assets is defined in a formal model, which calculates the return on a knowledge asset (its value) as the revenues generated by the products it helps produce minus the cost incurred for using the asset in the activities that produce them.
Abstract: Measuring the value of knowledge is rapidly becoming a topic of interest in the wake of the increasing attention to knowledge management. Several approaches have been proposed in the past, most of them focused on measuring, at a high level of abstraction, the "intellectual capital" of a company. A low-level approach, meant to measure the value of separate knowledge assets, is defined in a formal model. The model calculates the return on a knowledge asset (its value) as the revenues generated by the products it helps produce minus the cost incurred for using the knowledge asset in the activities that produce those products. The activity side of this equation relies on Activity Based Costing. For the revenue side, different procedures can be used for distributing product revenues over activities and knowledge assets. The approach is illustrated by a case study concerning loan revision performed at a large bank in the Netherlands. The method proved applicable and led, in the case study, to the unexpected result that the return on most knowledge assets for loan revision was negative. The results of the method could also be used to calculate the financial prospects of re-engineering proposals. To conclude, several constraints and benefits of the method are discussed.
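Once Activity Based Costing has attributed costs and revenues to an asset, the valuation rule reduces to a one-line computation; all figures below are hypothetical:

```python
# Hypothetical figures for one knowledge asset used in a loan-revision activity.
activity_cost = 120_000.0    # Activity Based Costing cost of the activities
asset_share = 0.4            # share of activity cost attributed to this asset
product_revenue = 100_000.0  # revenue of the products the activities produce
revenue_share = 0.3          # share of revenue attributed to this asset

cost = activity_cost * asset_share
revenue = product_revenue * revenue_share
return_on_asset = revenue - cost
print(return_on_asset)   # negative here, as for most assets in the case study
```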

Journal ArticleDOI
TL;DR: The objective of this project is to develop an automated case-based help desk system to support both call center personnel and customers, and to contribute to shortening the response time on incoming calls and reduce training time for new employees.
Abstract: Help desks are computer-aided environments in customer support centers that provide frontline support to external and internal customers. The paper reports on an automated help desk system developed at an information technology company. With the proliferation of diverse software and hardware, the center provides support to a large variety of client systems. The number of calls increases while the turnover rate of employees is high, which means the cost of training escalates. The objective of this project is to develop an automated case-based help desk system to support both call center personnel and customers. The system would contribute to shortening the response time on incoming calls and reduce training time for new employees. The focus of the paper is on the knowledge engineering process of the system. We discuss in detail the knowledge acquisition, knowledge representation, system implementation and verification processes, and we emphasize the structured and automated development methods adopted.

Journal ArticleDOI
TL;DR: The integration of discrete event simulation with a rule-based system is suggested to effectively handle condition-based events within a rule-based environment, and the capability to change the dispatching rules during simulation is desirable to reduce OSBSS development efforts.
Abstract: The detailed system architecture of the optimized simulation-based scheduling system (OSBSS) is presented to generate an optimized schedule. In the OSBSS environment, a simulation optimizer interactively communicates with a simulation model to improve the current schedule with respect to the performance criteria. The performance of the simulation engine is a very important issue in developing OSBSS, since it is necessary to simulate multiple alternatives until the given criteria are satisfied. The integration of discrete event simulation with a rule-based system is suggested to effectively handle condition-based events within a rule-based environment. The capability to change the dispatching rules during simulation is also desirable to reduce OSBSS development efforts. A database-driven simulation model generation concept is presented to effectively generate and maintain the simulation model with updated domain data. The generic development procedures for OSBSS, using IDEF modeling methods, are presented and may serve as a template for actual development.

Journal ArticleDOI
TL;DR: A real environment is introduced for integrating ontologies supplied by a predetermined set of (expert) users, who might be distributed through a communication network and working cooperatively in the integration process.
Abstract: Nowadays, we can find systems and environments supporting processes of ontology building. However, these processes have not yet been specified in enough detail. In this work, a real environment is introduced for integrating ontologies supplied by a predetermined set of (expert) users, who may be distributed across a communication network and working cooperatively in the integration process. In this environment, the (expert) user can inspect the ontology that is being produced, and is thus able to refine his or her private ontology. Furthermore, the experts who take part in the ontology construction process are allowed to use their own terminology, even for requesting information about the global derived ontology, until a specific instant after the integration.

Journal ArticleDOI
TL;DR: An expert system is presented as a decision support tool to optimize natural gas pipeline operations by providing consistent, fast and reliable decision support to the dispatcher, so that inconsistency in the dispatcher's performance can be minimized.
Abstract: This paper presents an expert system as a decision support tool to optimize natural gas pipeline operations. A natural gas pipeline control system involves many complicated operating processes. Since a dispatcher (who operates the system) might not be able to handle all of his or her tasks consistently, an expert system has been developed to optimize operations by providing consistent, fast and reliable decision support to the dispatcher. Consequently, inconsistency in the dispatcher's performance can be minimized. To build the expert system, knowledge was acquired from an experienced dispatcher who is familiar with the process in this control system, and that knowledge was implemented as rules in the knowledge base. Once validated by gas pipeline experts, the expert system can help inexperienced dispatchers operate the processes more effectively. The expert system is implemented on the real-time expert system shell G2 (trademark of Gensym Corp. of the USA). The system also includes a user interface that helps dispatchers visualize system conditions.

Journal ArticleDOI
TL;DR: This paper attempts to introduce a Neural On-Line Analytical Processing System (NOLAPS), which is able to contribute to the creation of decision support functionality in a virtual enterprise network.
Abstract: Enterprises now face growing global competition, and continued success in the marketplace depends very much on how efficiently and effectively companies are able to respond to customer demands. The Internet has provided a powerful tool to link manufacturers, suppliers and consumers and to facilitate the bi-directional interchange of useful information. The formation of virtual enterprise networks is gathering momentum to meet this challenge. The idea of the virtual enterprise network is to establish a dynamic organization through the synergetic combination of dissimilar companies with different core competencies, thereby forming a "best of everything" consortium to perform a given business project and achieve the maximum degree of customer satisfaction. In this emerging business model, the decision support functionality, which addresses issues such as the selection of business partners, coordination in the distribution of production processes and the prediction of production problems, is an important domain to be studied. This paper introduces a Neural On-Line Analytical Processing System (NOLAPS), which contributes to the creation of decision support functionality in a virtual enterprise network. NOLAPS is equipped with two main technologies: a neural network for extrapolating probable outcomes based on available patterns of events, and data mining for converting complex data into useful corporate information. A case example is also covered to validate the feasibility of adopting NOLAPS in real industrial situations.

Journal ArticleDOI
TL;DR: A Decision Supporting Expert System (DSES) is structured to help with decisions related to the preliminary activities for inspection development, most of them relating to determining the need or convenience of carrying out the inspection itself.
Abstract: This paper describes the basic actions proposed for quality inspection. These actions involve structuring a Decision Supporting Expert System (DSES) to help with decisions related to the preliminary activities for inspection development, most of them relating to determining the need or convenience of carrying out the inspection itself. Once the opportunity to carry it out is defined, the expert system helps the user to select the type of inspection to adopt from amongst: (1) automatic or sensorial inspection; (2) inspection by samples or complete inspection; (3) acceptance or rectifying inspection; and, in the most relevant module, (4) inspection by attributes or by variables. The complementary documentation of the DSES contains the directions to operate it and the rules and qualifiers that make up the system, as well as the results achieved through its experimental implementation.

Journal ArticleDOI
F. Li1, R.K. Aggarwal1
TL;DR: A relaxed hybrid genetic algorithm (RHGA) and gradient technique (GT) is proposed to economically allocate power generation in a fast, accurate, and relaxed manner; the simulation results obtained are very encouraging with regard to computational time and production cost.
Abstract: A relaxed hybrid genetic algorithm (RHGA) and gradient technique (GT) is proposed to economically allocate power generation in a fast, accurate, and relaxed manner. The proposed hybrid scheme is constructed in such a way that the GA performs a base-level search and makes rapid decisions to direct the local GT to quickly climb the potential hill. The proposed method further ensures dispatch quality as well as speed by allowing a loose match between power generation and load demand in the base search, and compensating for any mismatch at the beginning of the local search. Consequently, the GA is able to devote equal effort to searching for the least cost and for power balance without the risk of attaining infeasible solutions. The effectiveness of the proposed RHGA is verified on two test cases. The first is the static economic dispatch (SED) problem on a three-generator system, for which the near-optimal solution is found within a comparably short time. The second is the dynamic economic dispatch (DED) problem on the practical Northern Ireland Electricity (NIE) system, which has a total of 25 generator units. The simulation results obtained are very encouraging with regard to computational time and production cost.
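The core dispatch problem, with the power balance relaxed into a penalty term, can be sketched on a hypothetical two-generator system; a coarse grid search stands in for the GA/GT hybrid, and the cost coefficients are invented:

```python
# Hypothetical quadratic fuel-cost curves: cost(p) = a*p**2 + b*p + c
GENS = [(0.010, 2.0, 10.0), (0.015, 1.5, 20.0)]
DEMAND = 100.0                                  # MW of load to be met

def cost(p, gen):
    a, b, c = gen
    return a * p * p + b * p + c

def dispatch(penalty=50.0, step=5.0):
    """Coarse search with the power balance relaxed into a penalty term,
    echoing the loose match allowed at the RHGA's base-level search."""
    best, best_score = None, float('inf')
    for i in range(21):
        for j in range(21):
            p1, p2 = i * step, j * step
            mismatch = abs(p1 + p2 - DEMAND)
            score = cost(p1, GENS[0]) + cost(p2, GENS[1]) + penalty * mismatch
            if score < best_score:
                best, best_score = (p1, p2), score
    return best

p1, p2 = dispatch()
print(p1 + p2)   # matches DEMAND: the stiff penalty enforces the balance
```

In the paper's scheme, a residual mismatch left by the relaxed base search would be compensated at the start of the local gradient search rather than forced to zero here.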

Journal ArticleDOI
TL;DR: This paper compares the relative performance of ANN and multiple regression when the data contain skewed variables and reports results for two separate data sets; one related to individual performance and the second to firm performance.
Abstract: Business organizations can be viewed as information-processing units making decisions under varying conditions of uncertainty, complexity, and fuzziness in the causal links between performance and various organizational and environmental factors. The development and use of appropriate decision-making tools has, therefore, been an important activity of management researchers and practitioners. Artificial neural networks (ANNs) are turning out to be an important addition to an organization's decision-making tool kit. A host of studies has compared the efficacy of ANNs to that of multivariate statistical methods. Our paper contributes to this stream of research by comparing the relative performance of ANN and multiple regression when the data contain skewed variables. We report results for two separate data sets; one related to individual performance and the second to firm performance. The results are used to highlight some salient issues related to the use of ANN and multiple regression models in organizational decision-making.

Journal ArticleDOI
TL;DR: A hybrid cognitive model consistent with maintainers' cognitive types is shown to interact reciprocally with fault recovery, and the FRMM conceptual model could serve as a guide for similar logistic systems to prevent maintainer errors.
Abstract: An effective and efficient problem-solving mechanism is one of the critical processes that ensure good service quality in the maintenance environment. Maintenance errors can easily be induced by time stress due to frequent task varieties and logistic decision uncertainties. In this sense, comprehensive maintenance support to maintainers in critical events was strongly suggested to reduce maintainer errors. A practical framework is proposed for analyzing cognitive types and enhancing fault recovery ability through a knowledge-based system. It is shown that the suggested hybrid cognitive model, consistent with maintainers' cognitive types, interacts reciprocally with fault recovery. In addition, the vast amount of maintenance data, which includes much implicit information, can indicate critical events for the policymaker through statistical analyses in the maintenance domain. These same data were used to reassess which kinds of issues should be treated as the first priority. Through interviews with professional maintenance engineers and analysis of maintenance task documents, the development process of a maintenance protocol is applied in the knowledge acquisition implementation. Based on the sharing of human experts' domain-specific knowledge and the use of well-preserved documents, a fault recovery management mechanism (FRMM) was developed. This integration of the reliability-centered maintenance method and an expert system provides a systematic procedure for maintenance engineers and managers to retrieve fault cases quickly and accurately, and to effectively accumulate their expertise for logistic adaptation. The FRMM conceptual model could serve as a guide for other similar logistic systems to prevent maintainer errors.

Journal ArticleDOI
TL;DR: An application of case-based reasoning with an automatic indexing IR component in the legal domain of bankruptcy law and the end result is an IR–CBR bankruptcy support system (BanXupport).
Abstract: This paper is a presentation of an on-going work in which we attempt to take advantage of information retrieval (IR) and artificial intelligence techniques combined. It is an application of case-based reasoning (CBR) with an automatic indexing IR component in the legal domain of bankruptcy law. The model is based on our intuition of how lawyers go about doing their legal research and reasoning tasks in case law. We take advantage of the built-in knowledge contained in the carefully prepared statute text in a front-end processor and classification component to the CBR system. Our end result is an IR–CBR bankruptcy support system (BanXupport).

Journal ArticleDOI
TL;DR: An expert tutoring system (E-TCL) for teaching computer programming languages through the WWW, which allows many teacher agents to attend to the needs of one or more student agents.
Abstract: This paper presents an expert tutoring system (E-TCL) for teaching computer programming languages through the WWW. In this version, many teachers can cooperate to build the curriculum for one or more programming languages. Their contributions may include: (a) adding or modifying the command structures to be taught; (b) generating different tutoring dialogs for the same command; and (c) generating different tutoring styles. Students, in turn, can access the system through the WWW, select any language they want to learn as well as the presentation style they prefer, and exchange their experiences. A personal assistant agent for teachers (PAA-T), a personal assistant agent for students (PAA-S) with an adaptive interface, and a tutoring agent (TA) have been built. The TA resides on the server side and communicates via HTTP and IIOP with both the PAA-T and PAA-S on the client side. This structure allows customization of the PAA-T and PAA-S to the needs of the teachers and students without putting an extra burden on the server. In addition, it allows many teacher agents to attend to the needs of one or more student agents.

Journal ArticleDOI
A Waheed1
TL;DR: A knowledge-based expert system, called PERMIT_EXPERT, has been developed in the Microsoft Windows environment for handling of superload hauling permit applications and can be used as a knowledgeable assistant by bridge rating engineers as well as for training new personnel.
Abstract: The preservation of the state highway system and the normal flow and safety of the motoring public are the statutory responsibility of state departments of transportation. Trucking companies hauling overweight and oversized loads apply for Special Hauling Permits for travel on state routes. The process of approving such superload permit applications is currently performed manually. This is a labor-intensive task that can take a few days per application. To alleviate the problem, a knowledge-based expert system, called PERMIT_EXPERT, has been developed in the Microsoft Windows environment for handling superload hauling permit applications. Knowledge representation is through decision trees and production rules. Since calculation of the various rating factors requires structural analysis, PERMIT_EXPERT is developed as a coupled expert system with access to an external structural analysis and bridge rating program. The knowledge base of PERMIT_EXPERT was acquired from design and operation guides of the AASHTO specifications and the Ohio Department of Transportation, and through interviews with experienced rating engineers. PERMIT_EXPERT can be used as a knowledgeable assistant by bridge rating engineers as well as for training new personnel.
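The production-rule style of knowledge representation the abstract mentions can be sketched as a simple rule-based screening function. The weight threshold, the rating-factor test, and the outcome labels below are invented placeholders, not AASHTO or Ohio DOT values, and the real system couples these rules to an external structural-analysis program rather than a fixed rating number.

```python
# Sketch: production-rule screening of a superload permit application.
# All limits and labels are hypothetical, for illustration only.

ROUTINE_LIMIT_LBS = 80_000  # hypothetical routine-permit weight threshold

def screen_permit(gross_weight_lbs, min_bridge_rating_factor):
    """Route an application through simple if-then production rules."""
    if gross_weight_lbs <= ROUTINE_LIMIT_LBS:
        return "routine review"
    if min_bridge_rating_factor >= 1.0:
        return "approve with escort"        # every bridge can carry the load
    return "refer to structural analysis"   # coupled analysis program needed

print(screen_permit(75_000, 1.2))
print(screen_permit(120_000, 0.8))
```

In the coupled architecture the paper describes, the third branch is where control would pass to the external bridge rating program to recompute rating factors for the specific route.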

Journal ArticleDOI
TL;DR: This research introduces rule-predicted typicality and suggests a combined approach of rule-induction and case-based reasoning to improve the ranking of the classified cases.
Abstract: Direct mail is a typical application of response modeling. To decide which people will receive a mailing, the potential customers are divided into two groups or classes (buyers and non-buyers) and a response model is created. Since improving response modeling is the purpose of this paper, we suggest a combined approach of rule-induction and case-based reasoning. The initial classification of buyers and non-buyers is done by means of the C5 algorithm. To improve the ranking of the classified cases, we introduce rule-predicted typicality . The synergy of the two approaches is tested by elaborating a direct-mail example.
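The two-stage idea above (rules classify, cases refine the ranking) can be sketched generically. C5/See5 is a proprietary rule inducer, so the hand-written rule below merely stands in for an induced rule, and "typicality" is approximated here as closeness to the nearest known buyer; the training data, features, and threshold are all invented.

```python
# Sketch: rule-induction classification followed by case-based re-ranking.
# A hand-written rule stands in for a C5-induced rule; data are invented.
import math

train_buyers = [(35, 3), (42, 5), (38, 4)]          # (age, past_purchases)
prospects = {"p1": (36, 4), "p2": (60, 0), "p3": (40, 5)}

def rule_predict(age, purchases):
    """Stand-in induced rule: frequent past buyers are likely to respond."""
    return "buyer" if purchases >= 2 else "non-buyer"

def typicality(case):
    """Closeness to the nearest known buyer (higher = more typical)."""
    d = min(math.dist(case, b) for b in train_buyers)
    return 1.0 / (1.0 + d)

# Stage 1: rules pick the predicted buyers.
predicted = {k: v for k, v in prospects.items()
             if rule_predict(*v) == "buyer"}
# Stage 2: case-based typicality orders them for the mailing list.
ranking = sorted(predicted, key=lambda k: typicality(prospects[k]),
                 reverse=True)
print(ranking)
```

The payoff of the second stage is a ranked mailing list rather than a flat yes/no split, so a fixed mailing budget can be spent on the most typical predicted buyers first.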

Journal ArticleDOI
TL;DR: In this paper, a hybrid knowledge-based system developed in China is introduced, which integrates a deductive rule-based reasoning system, an inductive case-based reasoning (CBR) system, and a quantitative model-based reasoning (MBR) system, and which was applied to diagnose and assess patients at risk of AIDS.
Abstract: Improving health care assessment through the implementation of advanced technologies has been a concern of medical professionals as well as computer scientists in China. Many believe that advanced technology can improve Chinese people's access to health services and improve service quality. This belief has encouraged the application of technology to health services. One of the earliest expert systems developed in China, in the late 1970s, was for the diagnosis and treatment of hepatitis. Since then Chinese researchers have developed numerous knowledge-based systems that have been implemented in various areas. In this study, a hybrid knowledge-based system developed in China is introduced. The system, which integrates a deductive rule-based reasoning system, an inductive case-based reasoning (CBR) system, and a quantitative model-based reasoning (MBR) system, was applied to diagnose and assess patients at risk of AIDS. The system was tested using actual data, and the results are encouraging.
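A common control flow for such hybrids is to try the deductive rules first and fall back to case retrieval when no rule fires. The sketch below shows only that dispatch pattern; the rules, cases, attributes, and risk labels are illustrative placeholders, not the paper's medical knowledge base, and the quantitative MBR component is omitted for brevity.

```python
# Sketch: rule-first / case-based-fallback dispatch in a hybrid system.
# All rules, cases, and attributes are invented for illustration.

rules = [
    (lambda p: p["risk_contacts"] > 5 and p["symptom_score"] > 7, "high risk"),
    (lambda p: p["risk_contacts"] == 0 and p["symptom_score"] < 2, "low risk"),
]

case_base = [
    ({"risk_contacts": 3, "symptom_score": 6}, "moderate risk"),
    ({"risk_contacts": 1, "symptom_score": 1}, "low risk"),
]

def assess(patient):
    # 1. Deductive rule-based reasoning: use the first rule that fires.
    for condition, conclusion in rules:
        if condition(patient):
            return conclusion
    # 2. Inductive case-based reasoning: reuse the nearest past case.
    def distance(case):
        return sum(abs(case[k] - patient[k]) for k in patient)
    _, outcome = min(case_base, key=lambda c: distance(c[0]))
    return outcome

print(assess({"risk_contacts": 7, "symptom_score": 9}))  # a rule fires
print(assess({"risk_contacts": 2, "symptom_score": 5}))  # CBR fallback
```

In a full system the MBR component would supply quantitative estimates that rules and cases alone cannot, e.g. as a third fallback or as a cross-check on the rule-based conclusion.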

Journal ArticleDOI
TL;DR: The new expert system framework incorporates the existing design programs without requiring any modification to them; it aims to automate the design process and make it more efficient.
Abstract: This paper describes work completed in creating a new prototype blackboard framework to automate the design process used for the design of high-recirculation airlift reactors. The new expert system framework incorporates the existing design programs without requiring any modification to them. It aims to automate the design process and make it more efficient. The paper begins by describing high-recirculation airlift reactors, then considers the function of the new framework and presents it. Components of the new system are described in terms of their function and implementation. Initial testing is described and the results are presented.
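The blackboard architecture named in the abstract has a well-known generic shape: independent knowledge sources watch a shared data store and contribute whenever their preconditions are met, until no source can add anything. The sketch below shows only that control loop; the design fields and the placeholder correlations have nothing to do with the paper's actual airlift-reactor design programs.

```python
# Sketch: a minimal blackboard control loop. Knowledge sources stand in for
# the wrapped design programs; all fields and formulas are invented.

blackboard = {"gas_flow": 0.5}  # initial design datum (arbitrary units)

def size_riser(bb):
    """Contribute a riser area once the gas flow is known."""
    if "gas_flow" in bb and "riser_area" not in bb:
        bb["riser_area"] = bb["gas_flow"] * 2.0   # placeholder correlation
        return True
    return False

def size_downcomer(bb):
    """Contribute a downcomer area once the riser is sized."""
    if "riser_area" in bb and "downcomer_area" not in bb:
        bb["downcomer_area"] = bb["riser_area"] * 0.8
        return True
    return False

# Deliberately listed out of dependency order: the control loop, not the
# listing order, decides when each knowledge source can contribute.
knowledge_sources = [size_downcomer, size_riser]

progress = True
while progress:
    progress = any(ks(blackboard) for ks in knowledge_sources)

print(blackboard)
```

This is why such a framework can wrap existing design programs "without any further modification": each program only needs a thin knowledge-source adapter that reads its inputs from, and writes its outputs to, the blackboard.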

Journal ArticleDOI
M Rao1, X Sun1, J Feng1
TL;DR: A generic intelligent system platform, namely INTEMOR (INTElligent Multimedia system for On-line Real-time application), for process operation support, developed by combining artificial intelligence, computer system and information technology into a unified environment.
Abstract: This paper presents a generic intelligent system platform, namely INTEMOR (INTElligent Multimedia system for On-line Real-time application), for process operation support. INTEMOR is developed by combining artificial intelligence, computer systems and information technology into a unified environment. It integrates various function modules to perform operation support tasks, including a communication gateway, data processing and analysis, on-line process monitoring and diagnosis, an on-line operation manual, equipment maintenance assistance, a reasoning system, a knowledge-base creator and a multimedia interface. Industrial applications of INTEMOR to a boiler system and a chemical pulping process are illustrated.

Journal ArticleDOI
TL;DR: A cognitive approach is presented that allows the design of linguistic applications that integrate different formalisms, reuse existing language resources and support the implementation of the required control in a flexible way, showing the suitability of knowledge-based technology in linguistic engineering.
Abstract: The automatic processing of written texts is tackled by a variety of scientific disciplines. Within computer science, the area of Natural Language Processing is deeply concerned with developing software systems that include language-analysis functionality to solve real problems. During the last decade the main contributions have approached the problem in a more engineering-oriented way, drawing on available technologies from different areas in order to develop a suitable framework for integrating the different techniques and resources required to solve natural language processing problems. Knowledge engineering is increasingly regarded as a means to complement traditional formal models by adding symbolic modelling and inference capabilities in a way that facilitates the introduction and maintenance of linguistic expertise. There has also been a substantial effort in resource development, so reusability is a key question shared by language applications. In this paper we present a cognitive approach that allows the design of linguistic applications that integrate different formalisms, reuse existing language resources and support the implementation of the required control in a flexible way. Several related works showing the current state of knowledge-based engineering technology are also discussed. Finally, to show the suitability of knowledge-based technology in linguistic engineering, a case study dealing with the automatic generation of database conceptual models from short Spanish texts is included.