Showing papers in "International Journal of Computational Intelligence Systems in 2008"


Journal ArticleDOI
TL;DR: A literature review in clinical decision support systems with a focus on the way knowledge bases are constructed and how inference mechanisms and group decision making methods are used in CDSSs, with particular attention to the uncertainty handling capability of commonly used knowledge representation and inference schemes.
Abstract: This paper provides a literature review in clinical decision support systems (CDSSs) with a focus on the way knowledge bases are constructed, and how inference mechanisms and group decision making methods are used in CDSSs. Particular attention is paid to the uncertainty handling capability of the commonly used knowledge representation and inference schemes. The definition of what constitutes good CDSSs and how they can be evaluated and validated are also considered. Some future research directions for handling uncertainties in CDSSs are proposed.

138 citations


Journal ArticleDOI
TL;DR: A Knowledge Based Recommender System that uses the fuzzy linguistic approach to help users find suitable items by means of recommendations based on information provided by different sources such as other users, experts and item features.
Abstract: Recommender systems are applications that have emerged in the e-commerce area in order to assist users in their searches in electronic shops. These shops usually offer a wide range of items that cover the necessities of a great variety of users. Nevertheless, searching in such a wide range of items can be a very difficult and time-consuming task. Recommender systems assist users in finding suitable items by means of recommendations based on information provided by different sources such as other users, experts, item features, etc. Most recommender systems force users to provide their preferences or necessities using a unique numerical scale of information fixed in advance. However, this information is usually related to opinions, tastes and perceptions, and therefore it seems better expressed in a qualitative way, with linguistic terms, than in a quantitative way, with precise numbers. We propose a Knowledge Based Recommender System that uses the fuzzy linguistic approach to de...

84 citations


Journal ArticleDOI
TL;DR: The computational results indicate that the proposed efficient genetic algorithm approach is effective in terms of reduced total completion time or makespan (Cmax) for HFS problems.
Abstract: This paper addresses Hybrid Flow Shop (HFS) scheduling problems to minimize the makespan value. In recent years, much attention has been given to heuristic and search techniques. Genetic algorithms (GAs) are also known as efficient heuristic and search techniques. This paper proposes an efficient genetic algorithm for hybrid flow shop scheduling problems. The proposed algorithm is tested on Carlier and Neron's (2000) benchmark problems from the literature. The computational results indicate that the proposed efficient genetic algorithm approach is effective in terms of reduced total completion time or makespan (Cmax) for HFS problems.
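
The GA itself is not spelled out in the abstract; as a rough illustration of the kind of permutation genetic algorithm used for makespan minimization, the sketch below evolves job orders for a simplified (single-machine-per-stage, i.e. non-hybrid) flow shop with made-up processing times. The operators (order crossover, swap mutation, truncation selection) are common defaults, not necessarily those of the paper.

```python
import random

# Processing times: TIMES[job][stage]; a simplified (non-hybrid) flow shop
# used only to illustrate the GA loop, not the paper's exact HFS model.
TIMES = [[6, 4, 3], [2, 5, 8], [7, 3, 2], [5, 6, 4], [3, 7, 5]]
N_JOBS, N_STAGES = len(TIMES), len(TIMES[0])

def makespan(perm):
    """Completion time of the last job on the last stage (Cmax)."""
    finish = [0.0] * N_STAGES
    for job in perm:
        for s in range(N_STAGES):
            start = max(finish[s], finish[s - 1] if s else 0.0)
            finish[s] = start + TIMES[job][s]
    return finish[-1]

def order_crossover(p1, p2):
    """Classical OX: copy a slice from p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(N_JOBS), 2))
    child = [None] * N_JOBS
    child[a:b] = p1[a:b]
    rest = [j for j in p2 if j not in child]
    for i in range(N_JOBS):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def mutate(perm, rate=0.2):
    if random.random() < rate:
        i, j = random.sample(range(N_JOBS), 2)
        perm[i], perm[j] = perm[j], perm[i]
    return perm

def ga(pop_size=30, generations=200):
    pop = [random.sample(range(N_JOBS), N_JOBS) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)
        elite = pop[: pop_size // 2]                 # truncation selection
        children = [mutate(order_crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    best = min(pop, key=makespan)
    return best, makespan(best)

print(ga())   # best job order found and its makespan
```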

81 citations


Journal ArticleDOI
TL;DR: This article models a Smart Home scenario using knowledge in the form of Event-Condition-Action rules together with a new inference scheme which incorporates spatio-temporal reasoning and uncertainty, and extends RIMER to permit the monitoring of situations according to the place where they occur and the specific order and duration of the activities.
Abstract: The health system in developed countries is facing a problem of scalability in order to accommodate the increased proportion of the elderly population. Scarce resources cannot be sustained unless innovative technology is considered to provide health care in a more effective way. The Smart Home provides preventive and assistive technology to vulnerable sectors of the population. Much research and development has been focused on the technological side (e.g., sensors and networks) but less effort has been invested in the capability of the Smart Home to intelligently monitor situations of interest and act in the best interest of the occupants. In this article we model a Smart Home scenario, using knowledge in the form of Event-Condition-Action rules together with a new inference scheme which incorporates spatio-temporal reasoning and uncertainty. A reasoning system, called RIMER, has been extended to permit the monitoring of situations according to the place where they occur and the specific order and duration of the activities. The system allows for the specification of uncertainty both in terms of knowledge representation and credibility of the conclusions that can be achieved in terms of the evidence available.

73 citations


Journal ArticleDOI
TL;DR: Current trends in research studies related to reliability prediction and prognostics are overviewed, and a Bayesian technique is presented which integrates the prognostic types by incorporating prior reliability knowledge into the prognostic models.
Abstract: The article overviews current trends in research studies related to reliability prediction and prognostics. The trends are organized into three major types of prognostic models: failure data models, stressor models, and degradation models. Methods in each of these categories are presented and examples are given. Additionally, three particular computational prognostic approaches are considered: Markov chain-based models, general path models, and shock models. A Bayesian technique is then presented which integrates the prognostic types by incorporating prior reliability knowledge into the prognostic models. Finally, the article also discusses the usage of diagnostic/prognostic predictions for optimal control.
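
The abstract does not give the Bayesian formulation; the following minimal sketch only illustrates the general idea of folding prior reliability knowledge into a data-driven estimate, using a conjugate Gamma prior on an exponential failure rate. The prior parameters and failure times are hypothetical, and the paper's actual technique may be considerably richer.

```python
# Minimal sketch (not the paper's model): combine prior reliability knowledge
# with observed failure data via a conjugate Gamma prior on an exponential
# failure rate lambda.  Prior Gamma(a, b) plus n failures over total time T
# gives posterior Gamma(a + n, b + T).
a_prior, b_prior = 2.0, 2000.0                   # prior belief: ~1 failure per 1000 h
failure_times = [850.0, 1200.0, 640.0, 980.0]    # hypothetical observed lifetimes
n, T = len(failure_times), sum(failure_times)

a_post, b_post = a_prior + n, b_prior + T
lambda_mean = a_post / b_post                    # posterior mean failure rate
mttf = 1.0 / lambda_mean                         # corresponding mean time to failure
print(f"posterior failure rate ~ {lambda_mean:.2e} /h, MTTF ~ {mttf:.0f} h")
```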

56 citations


Journal ArticleDOI
TL;DR: This paper aims at proposing an associative classification approach, namely Classification with Fuzzy Association Rules (CFAR), where fuzzy logic is used in partitioning the domains, and reveals that CFAR generates better understandability in terms of fewer rules and smoother boundaries than the traditional CBA approach while maintaining satisfactory accuracy.
Abstract: Classification based on association rules is considered to be effective and advantageous in many cases. However, there is a so-called "sharp boundary" problem in association rules mining with quantitative attribute domains. This paper aims at proposing an associative classification approach, namely Classification with Fuzzy Association Rules (CFAR), where fuzzy logic is used in partitioning the domains. In doing so, the notions of support and confidence are extended, along with the notion of compact set in dealing with rule redundancy and conflict. Furthermore, the corresponding mining algorithm is introduced and tested on benchmarking datasets. The experimental results revealed that CFAR generated better understandability in terms of fewer rules and smoother boundaries than the traditional CBA approach while maintaining satisfactory accuracy.
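
To make the extended support and confidence concrete, here is a minimal sketch in which a triangular membership function replaces a crisp interval on a quantitative attribute, so records contribute fractionally to a rule rather than falling sharply inside or outside a partition. The membership function, rule and data are illustrative assumptions, not CFAR's actual definitions.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Hypothetical records: an age attribute and a binary class label.
age = np.array([23, 31, 38, 45, 52, 61])
high_income = np.array([0, 0, 1, 1, 1, 0])

# Fuzzy set "middle-aged" instead of a crisp interval like [35, 50],
# which avoids the sharp-boundary problem at the interval edges.
mu_middle = tri(age, 30, 42, 55)

# Fuzzy support/confidence of the rule "middle-aged -> high income":
# memberships replace the 0/1 indicator of crisp itemsets.
support = np.sum(mu_middle * high_income) / len(age)
confidence = np.sum(mu_middle * high_income) / np.sum(mu_middle)
print(f"fuzzy support = {support:.2f}, fuzzy confidence = {confidence:.2f}")
```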

43 citations


Journal ArticleDOI
TL;DR: A sensory evaluation model that manages a multigranular linguistic evaluation framework based on a decision analysis scheme is presented.
Abstract: Evaluation is a process that analyzes elements in order to achieve different objectives such as quality inspection, marketing and other fields in industrial companies. This paper focuses on sensory evaluation, where the evaluated items are assessed by a panel of experts according to knowledge acquired via the human senses. In these evaluation processes the information provided by the experts involves uncertainty, vagueness and imprecision. The use of the Fuzzy Linguistic Approach has provided successful results in modelling such information. In sensory evaluation it may happen that the experts on the panel have different degrees of knowledge about the evaluated items or indicators. So, it seems suitable that each expert could express their preferences in different linguistic term sets based on their own knowledge. In this paper, we present a sensory evaluation model that manages a multigranular linguistic evaluation framework based on a decision analysis scheme. This model will be applied to the sensor...

39 citations


Journal ArticleDOI
TL;DR: This paper proposes a new method to compute the distance of an object using a single image, based on the observation that there exists a relationship between the physical distance of an object and its pixel height.
Abstract: Computing object distance using image processing is an important research area in the field of computer vision and robot navigation applications. In this paper we propose a new method to compute the distance of an object using a single image. According to our observation, there exists a relationship between the physical distance of an object and its pixel height. We exploit this relationship to train a system that finds a mapping between an object's pixel height and physical distance. This mapping is then used to find the physical distance of test objects from the pixel height in the image. Experimental results demonstrate the capability of our proposed technique by estimating physical distance with accuracy as high as 98.76%.
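
A minimal sketch of the idea of learning a mapping from pixel height to physical distance, assuming a pinhole-style inverse model d ≈ a/h + b fitted to hypothetical calibration pairs; the paper trains its own mapping from data, so the functional form and the numbers here are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical calibration pairs: pixel height (px) of a reference object
# vs. its measured physical distance (cm).
pixel_height = np.array([320, 210, 150, 118, 96, 82], dtype=float)
distance_cm  = np.array([50, 75, 105, 133, 163, 190], dtype=float)

# A pinhole-style inverse model d ~ a / h + b (an assumption; the paper
# learns the mapping from data rather than fixing this form).
def model(h, a, b):
    return a / h + b

params, _ = curve_fit(model, pixel_height, distance_cm)

def estimate_distance(h_px):
    return model(h_px, *params)

print(f"object with pixel height 130 px -> ~{estimate_distance(130):.1f} cm")
```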

37 citations


Journal ArticleDOI
TL;DR: A probabilistic fuzzy rule-based life-long learning system, equipped with intention reading capability by learning human behavioral patterns, is discussed and introduced as a solution for uncertain and time-varying situations.
Abstract: The smart house under consideration is a service-integrated complex system to assist older persons and/or people with disabilities. The primary goal of the system is to achieve independent living by various robotic devices and systems. Such a system is treated as a human-in-the-loop system in which human-robot interaction takes place intensely and frequently. Based on our experiences of having designed and implemented a smart house environment, called Intelligent Sweet Home (ISH), we present a framework for realizing a human-friendly HRI (human-robot interaction) module with various effective techniques of computational intelligence. More specifically, we partition the robotic tasks of the HRI module into three groups in consideration of the level of specificity, fuzziness or uncertainty of the context of the system, and present an effective interaction method for each case. We first show a task planning algorithm and its architecture to deal with well-structured tasks autonomously by a simplified set of user commands instead of inconvenient manual operations. To provide the capability of interacting in a human-friendly way in a fuzzy context, it is proposed that the robot should make use of human bio-signals as input to the HRI module, as shown in a hand gesture recognition system called a soft remote control system. Finally we discuss a probabilistic fuzzy rule-based life-long learning system, equipped with intention reading capability by learning human behavioral patterns, which is introduced as a solution in uncertain and time-varying situations.

32 citations


Journal ArticleDOI
TL;DR: The ER approach is applied to the assessment of strategic R&D projects for a car manufacturer, which is characterized by many qualitative factors that may be imprecise or fuzzy and can generate comprehensive distributed assessments for different projects.
Abstract: Assessment of strategic R&D projects is in essence a multiple-attribute decision analysis (MADA) problem. In such problems, qualitative information with subjective judgments of ambiguity is often provided by people together with quantitative data that may be imprecise or incomplete. A few approaches can be used to deal with such quantitative and qualitative MADA problems under uncertainty, such as the evidential reasoning (ER) approach, which has its own unique features. In this paper, the ER approach is applied to the assessment of strategic R&D projects for a car manufacturer, which is characterized by many qualitative factors that may be imprecise or fuzzy. The ER approach is well suited to dealing with such problems and can generate comprehensive distributed assessments for different projects. The group analytic hierarchy process (GAHP) method is applied to calculate the weights of attributes in the ER assessment process, where a group of people from the company were involved. We also provide a new al...

30 citations


Journal ArticleDOI
TL;DR: The presentation theorem of set-valued stochastic integrals is proved, and further properties that will be useful for studying set-valued stochastic differential equations and their applications are discussed.
Abstract: In this paper, we shall firstly illustrate why we should introduce set-valued stochastic integrals, and then we shall discuss some properties of set-valued stochastic processes and the relation between a set-valued stochastic process and its selection set. After recalling the Aumann type definition of stochastic integral, we shall introduce a new definition of Lebesgue integral of a set-valued stochastic process with respect to the time t. Finally we shall prove the presentation theorem of set-valued stochastic integral and discuss further properties that will be useful to study set-valued stochastic differential equations with their applications.
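
For orientation, the Aumann-type definition recalled in the abstract can be sketched as follows (standard formulation; measurability, decomposability and closure details are as treated in the paper):

```latex
% Aumann-type integral of a set-valued random variable F (standard form):
% the set of integrals of all integrable selections of F.
\[
  \int_{\Omega} F \, \mathrm{d}P \;=\;
  \Bigl\{\, \int_{\Omega} f \, \mathrm{d}P \;:\; f \in S_F^1 \,\Bigr\},
  \qquad
  S_F^1 = \{\, f \in L^1 : f(\omega) \in F(\omega) \ \text{a.e.} \,\}.
\]
% Lebesgue integral of a set-valued stochastic process with respect to time t
% (sketch): the (closure of the) set of time integrals of its selections.
\[
  \int_0^t F(s,\omega)\,\mathrm{d}s
  \;=\;
  \operatorname{cl}\Bigl\{\, \int_0^t f(s,\omega)\,\mathrm{d}s
  \;:\; f \in S(F) \,\Bigr\}.
\]
```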

Journal ArticleDOI
TL;DR: The nonlinear and non-stationary time series traffic is predicted using neural network and statistical methods, and the results of both methods are compared on different time scales or time granularities.
Abstract: In a wireless network environment, accurate and timely estimation or prediction of network traffic has gained much importance in the recent past. Network applications use traffic prediction results to maintain their performance by adapting their behaviour. Network service providers use the predicted values to ensure better Quality of Service (QoS) for network users through admission control and load balancing by inter- or intra-network handovers. This paper presents modeling and prediction of wireless network traffic. Here traffic is modeled as a nonlinear and non-stationary time series. The nonlinear and non-stationary time series traffic is predicted using neural network and statistical methods. The results of both methods are compared on different time scales or time granularities. The Neural Network (NN) architectures used in this study are the Recurrent Radial Basis Function Network (RRBFN) and the Echo State Network (ESN). The statistical model used in this work is the Fractional Auto Regressive Integ...
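
Of the two neural architectures mentioned, the Echo State Network is simple enough to sketch compactly. The snippet below trains a minimal ESN (fixed random reservoir, ridge-regression readout) for one-step-ahead prediction on a synthetic nonlinear, non-stationary series standing in for the traffic data; the reservoir size, spectral radius and regularization are illustrative, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic non-stationary, nonlinear "traffic" series (stand-in for real data).
t = np.arange(2000)
traffic = 5 + 0.002 * t + np.sin(2 * np.pi * t / 48) + 0.3 * rng.standard_normal(len(t))

# Minimal Echo State Network: fixed random reservoir + ridge-regression readout.
n_res, rho, ridge = 200, 0.9, 1e-6
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius

def run_reservoir(series):
    x, states = np.zeros(n_res), []
    for u in series:
        x = np.tanh(W_in[:, 0] * u + W @ x)       # leaky-free state update
        states.append(x)
    return np.array(states)

train, test = traffic[:1500], traffic[1500:]
X = run_reservoir(train[:-1])                     # states for inputs u(t)
Y = train[1:]                                     # targets u(t+1)
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)

# One-step-ahead prediction on the test segment (teacher-forced inputs).
X_test = run_reservoir(np.concatenate([train, test[:-1]]))[len(train):]
pred = X_test @ W_out
print("test RMSE:", np.sqrt(np.mean((pred - test[1:]) ** 2)))
```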

Journal ArticleDOI
TL;DR: The theory of the natural immune system is first briefly introduced, and several representative artificial immune networks are discussed; their principles and learning algorithms are given in detail.
Abstract: Artificial Immune Systems (AIS), which are inspired by the natural immune system, have been applied to solving complex computational problems in classification, pattern recognition, and optimization. In this paper, the theory of the natural immune system is first briefly introduced. Next, we compare some well-known AIS and their applications. Several representative artificial immune network models are also discussed. Moreover, we demonstrate the applications of artificial immune networks in various engineering fields.

Journal ArticleDOI
TL;DR: This paper presents a web-based fuzzy group decision support system (WFGDSS) and demonstrates how this system can provide a means of support for generating team SA in a distributed team work context with the ability of handling uncertain information.
Abstract: Situation awareness (SA) is an important element in supporting responses and decision making for crisis problems. Decision making for a complex situation often needs a team to work cooperatively to reach consensus awareness of the situation. Team SA is characterized by information sharing, opinion integration and consensus SA generation. In the meantime, various uncertainties are involved in team SA during information collection and awareness generation. Also, the collaboration between team members may take place across distances and needs web-based technology to facilitate it. This paper presents a web-based fuzzy group decision support system (WFGDSS) and demonstrates how this system can provide a means of support for generating team SA in a distributed teamwork context with the ability to handle uncertain information.

Journal ArticleDOI
TL;DR: The objective of this paper is to propose a visual-clue-based procedure to identify the Kannada, Hindi and English text portions of an Indian multilingual document.
Abstract: In a multilingual country like India, a document may contain text words in more than one language. For a multilingual environment, a multilingual Optical Character Recognition (OCR) system is needed to read multilingual documents. So, it is necessary to identify the different language regions of the document before feeding the document to the OCR of each individual language. The objective of this paper is to propose a visual-clue-based procedure to identify the Kannada, Hindi and English text portions of an Indian multilingual document.
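
The paper's specific visual clues are not listed in the abstract; one clue commonly used to separate Hindi (Devanagari) text from Kannada and English is the headline (shirorekha) that joins the characters of a word. The sketch below checks, on a binarized word image, whether some row in the upper half is almost entirely ink; the threshold and toy images are assumptions rather than the paper's actual procedure.

```python
import numpy as np

def has_headline(word_img, ink_threshold=0.8):
    """Heuristic: Devanagari words usually contain one row (the shirorekha)
    in the upper half whose ink coverage spans most of the word width."""
    binary = (word_img > 0).astype(int)            # 1 = ink pixel
    row_coverage = binary.sum(axis=1) / binary.shape[1]
    upper = row_coverage[: binary.shape[0] // 2]
    return upper.max() >= ink_threshold

# Hypothetical 8x20 binarized word images (1 = black pixel).
devanagari_like = np.zeros((8, 20), dtype=int)
devanagari_like[1, :] = 1                          # solid headline row
devanagari_like[3:7, 2:18:3] = 1                   # some strokes below

latin_like = np.zeros((8, 20), dtype=int)
latin_like[2:7, 1:19:4] = 1                        # vertical strokes, no headline

print(has_headline(devanagari_like))               # True  -> likely Hindi
print(has_headline(latin_like))                    # False -> Kannada/English branch
```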

Journal ArticleDOI
TL;DR: An extended kth-best approach is proposed to solve the referential-uncooperative BLMF problem and an example of logistics planning illustrates the impact this approach has on decision-making in a hierarchical organization.
Abstract: Bilevel decision techniques have been mainly developed for solving decentralized management problems with decision makers in a hierarchical organization. When multiple followers are involved in a bilevel decision problem, called a bilevel multi-follower (BLMF) decision problem, the leader's decision will be affected not only by the reactions of these followers but also by the relationships among them. The referential-uncooperative situation is one of the popular cases of BLMF decision problems, where the multiple followers do not share decision variables with each other but may take others' decisions as references for their own decisions. This paper presents a model for the referential-uncooperative BLMF decision problem. As the kth-best approach is one of the most successful approaches for dealing with normal bilevel decision problems, this paper then proposes an extended kth-best approach to solve the referential-uncooperative BLMF problem. Finally an example of logistics planning illustrates the...
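
As a rough sketch (the paper's exact linear model and notation may differ), a referential-uncooperative BLMF problem can be written with a leader controlling x and K followers each controlling their own y_i while referencing the other followers' decisions y_{-i}:

```latex
% Generic referential-uncooperative BLMF model (sketch; the paper's exact
% linear formulation and notation may differ).
\[
\begin{aligned}
  \min_{x \in X}\;& F\bigl(x, y_1, \dots, y_K\bigr) \\
  \text{s.t.}\;& G\bigl(x, y_1, \dots, y_K\bigr) \le 0, \\
  & \text{where each } y_i \ (i = 1,\dots,K) \text{ solves} \\
  & \quad \min_{y_i \in Y_i} f_i\bigl(x, y_i, y_{-i}\bigr)
    \quad \text{s.t. } g_i\bigl(x, y_i, y_{-i}\bigr) \le 0 .
\end{aligned}
\]
```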

Journal ArticleDOI
TL;DR: This work tries to build a regression tool to partially replace the use of CPU-time consuming atomic-level procedures for the calculation of point-defect migration energies in Atomistic Kinetic Monte Carlo simulations, as functions of the Local Atomic Configuration (LAC).
Abstract: In this work, we try to build a regression tool to partially replace the use of CPU-time-consuming atomic-level procedures for the calculation of point-defect migration energies in Atomistic Kinetic Monte Carlo (AKMC) simulations, as functions of the Local Atomic Configuration (LAC). Two approaches are considered: the Cluster Expansion (CE) and the Artificial Neural Network (ANN). The first is found to be unpromising because of its high computational complexity. In contrast, the second provides very encouraging results and is found to be very well behaved.
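
A minimal sketch of the ANN route: regress migration energies on a numeric encoding of the local atomic configuration using a small feed-forward network. The 12-site occupation descriptor, the synthetic energies and the network size below are placeholders, not the paper's actual data or architecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder dataset: each LAC is encoded as the occupation of 12 neighbour
# sites (0 = matrix atom, 1 = solute); the "energy" is synthetic.
X = rng.integers(0, 2, size=(2000, 12)).astype(float)
y = 0.6 + 0.05 * X.sum(axis=1) + 0.02 * rng.standard_normal(len(X))   # eV

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Small feed-forward network as a fast surrogate for the atomistic calculation.
ann = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0)
ann.fit(X_tr, y_tr)
print("R^2 on held-out configurations:", round(ann.score(X_te, y_te), 3))
```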

Journal ArticleDOI
TL;DR: This work investigates a number of measures relating partitions and introduces a measure of the non-specificity of a partition, which measures a feature of a partition related to the granularity of its constituent classes.
Abstract: We investigate a number of measures relating partitions. One class of measures we consider is congruence measures. These measures are used to calculate the similarity between two partitionings. We provide a number of examples of this type of measure. Another class of measures we investigate is prognostication measures. This type of measure, closely related to a concept of containment between partitions, is useful in indicating how well knowledge of an object's class in one partitioning indicates its class in a second partitioning. We apply our measures to some data mining applications. One example is in choosing the appropriate level of a concept hierarchy. We also introduce a measure of the non-specificity of a partition. This measures a feature of a partition related to the granularity of the constituent classes of the partition.
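
As a concrete illustration (not the paper's exact formulas), a congruence-type measure can be exemplified by the Rand index, which counts object pairs on which two partitionings agree, and a non-specificity-type measure by the expected log-size of the block containing a random object:

```python
from itertools import combinations
from math import log2

def rand_index(p1, p2):
    """Congruence-style measure: fraction of object pairs on which the two
    partitionings agree (same block in both, or different in both)."""
    items = list(p1)
    agree = sum((p1[a] == p1[b]) == (p2[a] == p2[b])
                for a, b in combinations(items, 2))
    return agree / (len(items) * (len(items) - 1) / 2)

def non_specificity(partition, n_items):
    """Granularity-style measure (an assumption, not the paper's definition):
    expected log2-size of the block an object falls into; 0 for singletons,
    log2(n) for the trivial one-block partition."""
    return sum(len(block) / n_items * log2(len(block)) for block in partition)

# Objects 1..6 in two partitionings (dict: object -> class label).
P1 = {1: "a", 2: "a", 3: "b", 4: "b", 5: "c", 6: "c"}
P2 = {1: "x", 2: "x", 3: "x", 4: "y", 5: "y", 6: "y"}

print("congruence (Rand):", round(rand_index(P1, P2), 3))
blocks = [[1, 2], [3, 4], [5, 6]]
print("non-specificity:", round(non_specificity(blocks, 6), 3))
```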

Journal ArticleDOI
TL;DR: The paper proposes the use of an attack pattern ontology and a formal framework for network traffic anomaly detection within a distributed multiagent Intrusion Detection System architecture, and defines how to use the proposed techniques in a distributed IDS using the attack pattern ontology.
Abstract: The paper proposes the use of an attack pattern ontology and a formal framework for network traffic anomaly detection within a distributed multiagent Intrusion Detection System architecture. Our framework assumes ontology-based attack definition and a distributed processing scheme with exchange of messages between agents. The role of traffic anomaly detection is presented, and it is then discussed how some specific values characterizing network communication can be used to detect network anomalies caused by security incidents (worm attacks, virus spreading). Finally, it is defined how to use the proposed techniques in a distributed IDS using the attack pattern ontology.

Journal ArticleDOI
TL;DR: A framework for decision-making in relation to disaster management is presented, with a focus on situation assessment during disaster management monitoring, which not only provides assistance for a single decision-maker but also shows how to handle opinions from a hierarchy of decision-makers.
Abstract: We present a framework for decision-making in relation to disaster management with a focus on situation assessment during disaster management monitoring. The use of causality reasoning based on the temporal evolution of a scenario provides a natural way to chain meaningful events and possible states of the system. There are usually different ways to analyse a problem and different strategies to follow as a solution, and it is also often the case that information originating from different sources can be inconsistent or unreliable. Therefore we allow the specification of possibly conflicting situations, as they are typical elements in disaster management. A decision procedure to decide on those conflicting situations is presented, which not only provides a framework for the assistance of a single decision-maker but also shows how to handle opinions from a hierarchy of decision-makers.

Journal ArticleDOI
TL;DR: This paper describes a method for finding a fuzzy membership matrix in the case of numerical and categorical features using fuzzy c-means, and simulations show the method to be very effective in comparison with other methods.
Abstract: This paper describes a method for finding a fuzzy membership matrix in the case of numerical and categorical features. The set of feature vectors with mixed features is mapped to a set of feature vectors with only real-valued components, with the condition that the new set of vectors has the same proximity matrix as the original feature vectors. This new set of vectors is then clustered using fuzzy c-means. Simulations show the method to be very effective in comparison with other methods.
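
A minimal sketch of the pipeline described above, under stated assumptions: a Gower-style dissimilarity over mixed features plays the role of the proximity matrix, classical MDS supplies real-valued vectors that approximately preserve it, and a plain fuzzy c-means (implemented inline) produces the membership matrix. The paper's actual mapping and distance may differ.

```python
import numpy as np

# Mixed data: (numeric value, categorical label) -- placeholder records.
data = [(1.0, "red"), (1.2, "red"), (5.0, "blue"), (5.3, "blue"), (5.1, "green")]
nums = np.array([r[0] for r in data])
cats = [r[1] for r in data]
n = len(data)

# Gower-style dissimilarity: scaled numeric difference + categorical mismatch.
D = np.zeros((n, n))
span = (nums.max() - nums.min()) or 1.0
for i in range(n):
    for j in range(n):
        D[i, j] = 0.5 * abs(nums[i] - nums[j]) / span + 0.5 * (cats[i] != cats[j])

# Classical MDS: embed into real vectors whose Euclidean distances
# approximate D (this plays the role of the proximity-preserving mapping).
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
vals, vecs = np.linalg.eigh(B)
idx = np.argsort(vals)[::-1][:2]
X = vecs[:, idx] * np.sqrt(np.clip(vals[idx], 0, None))

# Plain fuzzy c-means on the embedded vectors (fuzzifier m = 2).
def fcm(X, c=2, m=2.0, iters=100):
    rng = np.random.default_rng(0)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)
    return U

print(np.round(fcm(X), 2))   # fuzzy membership matrix (rows: records)
```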

Journal ArticleDOI
TL;DR: This work considers a measure between words based on dictionaries, assuming that a dictionary is formalized as a fuzzy graph, and shows that the approach permits measures to be computed not only for pairs of words but also for sets of them.
Abstract: The computation of similarities between words is a basic element of information retrieval systems when retrieval is not solely based on word matching. In this work we consider a measure between words based on dictionaries. This is achieved by assuming that a dictionary is formalized as a fuzzy graph. We show that the approach permits measures to be computed not only for pairs of words but also for sets of them.

Journal ArticleDOI
TL;DR: Both quality measurement analysis and human knowledge processing are integrated in the system and it allows designers to optimize the structure of nonwoven materials with limited trials according to the functional properties given in customers’ specifications.
Abstract: In this paper, a computer aided system for designing nonwoven materials is presented. As an original approach in the field of nonwoven research, both quality measurement analysis and human knowledge processing are integrated in the system. It allows designers to optimize the structure of nonwoven materials with limited trials according to the functional properties given in customers’ specifications. This system aims at modeling the relation between functional or physical properties (outputs) and structural parameters (inputs) of nonwoven products. In order to reduce the complexity of the system, a procedure is proposed for selecting the most relevant input variables based on a ranking criterion, which takes into account both the expertise of manufacturers and the measured data. In this criterion, fuzzy logic is used to establish a good compromise or a fusion between these two uncertain and incomplete information sources. Then, two models are set up by utilizing multilayer feed forward neural networks, which take into account the generality and the specificity of the product families respectively. The presented models have been validated with the use of experimental data concerning several families of nonwoven products.

Journal ArticleDOI
TL;DR: In this paper, a new vector ordering for colours modelled in the RedGreenBlue colour model is presented, which is compatible with the morphological magnification method described in [8] and [1].
Abstract: In this paper we present a new vector ordering ⪯RGB for colours modelled in the RedGreenBlue colour model. With this new ordering and the associated minimum and maximum operators, the RedGreenBlue colour model becomes a complete lattice. We have also defined a complement co for colours in the RedGreenBlue model, with which our new ordering is compatible. As an application we illustrate the extension of our morphological magnification method, described in [8] and [1], towards colour images with sharp edges modelled in the RedGreenBlue model. There we need the compatibility of ⪯RGB with co to detect the corners in an image using the hit-or-miss transformation. Experimental results demonstrate that our method gives very good results.
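
The exact definition of ⪯RGB is given in the article; as a stand-in, the sketch below uses a lexicographic ordering on (R, G, B) to show how an ordering, its min/max operators and a channel-wise complement co fit together for colour morphology (both the ordering and the complement here are assumptions, not the paper's definitions).

```python
# Sketch only: a lexicographic stand-in for the paper's ordering; the actual
# definition of the ordering and the complement co in the article may differ.
def leq_rgb(c1, c2):
    return c1 <= c2            # tuple comparison = lexicographic on (R, G, B)

def inf_rgb(colours):
    return min(colours)        # infimum under the stand-in ordering

def sup_rgb(colours):
    return max(colours)        # supremum under the stand-in ordering

def co(c):
    """Channel-wise complement of an 8-bit RGB colour (assumed form)."""
    return tuple(255 - v for v in c)

pixels = [(200, 30, 60), (200, 30, 80), (10, 240, 0)]
print(inf_rgb(pixels), sup_rgb(pixels))      # erosion / dilation of one window
print(co((200, 30, 60)))                     # (55, 225, 195)
print(leq_rgb((10, 240, 0), (200, 30, 60)))  # True under the stand-in ordering
```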

Journal ArticleDOI
TL;DR: This paper addresses the question of which method to use when the learning sample is large or small, and of the computational complexity resulting from the choice of external parameters such as the number of kernels and their widths in kernel mixture models, the robustness to initial conditions, etc.
Abstract: Probability density function (PDF) estimation is a task of primary importance in many contexts, including Bayesian learning and novelty detection. Despite the wide variety of methods available to estimate PDFs, only a few of them are widely used in practice by data analysts. Among the most used methods are histograms, Parzen windows, vector-quantization-based Parzen windows, and finite Gaussian mixtures. This paper compares these estimation methods from a practical point of view, i.e. when the user is faced with various requirements from the applications. In particular it addresses the question of which method to use when the learning sample is large or small, and of the computational complexity resulting from the choice (by cross-validation methods) of external parameters such as the number of kernels and their widths in kernel mixture models, the robustness to initial conditions, etc. Expected behaviour of the estimation algorithms is drawn from an algorithmic perspective; numerical experiments are used to illustrate these results.
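
To make the parameter-selection cost concrete, the sketch below picks the Parzen-window bandwidth by cross-validated log-likelihood and the number of Gaussian mixture components by BIC on a synthetic sample, using scikit-learn; it illustrates the kind of tuning the paper discusses rather than reproducing its experiments.

```python
import numpy as np
from sklearn.neighbors import KernelDensity
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
# Two-component synthetic sample standing in for the data to be modelled.
X = np.concatenate([rng.normal(-2, 0.5, 300), rng.normal(1, 1.0, 700)])[:, None]

# Parzen windows: bandwidth chosen by cross-validated log-likelihood.
kde_cv = GridSearchCV(KernelDensity(kernel="gaussian"),
                      {"bandwidth": np.logspace(-1.5, 0.5, 20)}, cv=5)
kde_cv.fit(X)
print("best Parzen bandwidth:", round(kde_cv.best_params_["bandwidth"], 3))

# Finite Gaussian mixture: number of kernels chosen by BIC.
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
        for k in range(1, 6)}
print("GMM components by BIC:", min(bics, key=bics.get))
```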

Journal ArticleDOI
TL;DR: Two methods to deal with the problem using support vector regression are proposed, and two new methods for evaluating the performance of prediction interval estimation are presented as well.
Abstract: Support vector machines (for classification and regression) are powerful machine learning techniques for crisp data. In this paper, the problem is considered for interval data. Two methods to deal with the problem using support vector regression are proposed, and two new methods for evaluating the performance of prediction interval estimation are presented as well.
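
The paper's two methods are not described in the abstract; as one simple, hypothetical way of applying support vector regression to interval-valued targets, the sketch below fits separate SVR models to the lower and upper interval endpoints.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, (200, 1))
# Interval-valued targets [lo, hi] (synthetic): a centre plus a varying width.
centre = np.sin(X[:, 0]) * 3 + 0.3 * rng.standard_normal(200)
width = 0.5 + 0.1 * X[:, 0]
y_lo, y_hi = centre - width / 2, centre + width / 2

# One simple strategy (an assumption, not necessarily the paper's methods):
# independent SVR models for the lower and upper endpoints.
svr_lo = SVR(kernel="rbf", C=10.0).fit(X, y_lo)
svr_hi = SVR(kernel="rbf", C=10.0).fit(X, y_hi)

x_new = np.array([[2.5]])
print("predicted interval:",
      (round(svr_lo.predict(x_new)[0], 2), round(svr_hi.predict(x_new)[0], 2)))
```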

Journal ArticleDOI
TL;DR: This work introduces online feature selection for the classification of emphysema, a smoking related disease that appears as low attenuation regions in High Resolution Computer Tomography (HRCT) images.
Abstract: Feature subset selection, applied as a preprocessing step to machine learning, is valuable in dimensionality reduction, eliminating irrelevant data and improving classifier performance. In the classic formulation of the feature selection problem, it is assumed that all the features are available at the beginning. However, in many real world problems, there are scenarios where not all features are present initially and must be integrated as they become available. In such scenarios, online feature selection provides an efficient way to sort through a large space of features. It is in this context that we introduce online feature selection for the classification of emphysema, a smoking related disease that appears as low attenuation regions in High Resolution Computer Tomography (HRCT) images. The technique was successfully evaluated on 61 HRCT scans and compared with different online feature selection approaches, including hill climbing, best first search, grafting, and correlation-based feature selection. ...
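
A minimal sketch of the online flavour of the problem, under assumptions: features "arrive" one at a time and a feature is kept only if it improves the cross-validated accuracy of the current subset. This greedy baseline is for illustration only and is not one of the specific methods (grafting, best-first search, etc.) compared in the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=0)

selected, best_score = [], 0.0
clf = LogisticRegression(max_iter=1000)

# Features "arrive" one by one; keep a feature only if it improves CV accuracy.
for f in range(X.shape[1]):
    trial = selected + [f]
    score = cross_val_score(clf, X[:, trial], y, cv=5).mean()
    if score > best_score:
        selected, best_score = trial, score

print("kept features:", selected, "CV accuracy:", round(best_score, 3))
```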

Journal ArticleDOI
TL;DR: In this paper, a survey of Data Mining frameworks has been done for proposing Data Mining methodologies for engineering materials design applications. An exhaustive literature survey made in this article covers modeling systems such as Analytical Models, Numerical Simulation Models and Computer Based Modeling Systems, which were developed and implemented for Polymer Composite processing from 1950 to 2006. Motivation for the present investigation is inspired by the Computer Based Models and is depicted as a Mining Framework for determining optimal decision-making strategies and performing intelligent computational operations associated with advanced Composite materials design applications. Data Mining and Knowledge...
Abstract: In this paper, a survey of Data Mining frameworks has been done for proposing Data Mining methodologies for engineering materials design applications. An exhaustive literature survey made in this article covers modeling systems such as Analytical Models, Numerical Simulation Models and Computer Based Modeling Systems, which were developed and implemented for Polymer Composite processing from 1950 to 2006. Motivation for the present investigation is inspired by the Computer Based Models and is depicted as a Mining Framework for determining optimal decision-making strategies and performing intelligent computational operations associated with advanced Composite materials design applications. Data Mining and Knowledge Discovery has made tremendous progress in Computer Science in the last 15 years. However, a large gap exists between the results of Data Mining and Knowledge Base systems that can provide and support proper decision making. Though many Modeling and Simulation Systems have been d...
