
Showing papers on "Rough set published in 2016"


Journal ArticleDOI
TL;DR: The definition and basic properties of the different types of fuzzy sets that have appeared up to now in the literature are reviewed and the relationships between them are analyzed.
Abstract: In this paper, we review the definition and basic properties of the different types of fuzzy sets that have appeared up to now in the literature. We also analyze the relationships between them and enumerate some of the applications in which they have been used.

386 citations


Journal ArticleDOI
TL;DR: It is proved that the newly-defined entropy meets the common requirement of monotonicity and can equivalently characterize the existing attribute reductions in the fuzzy rough set theory.

259 citations


Journal ArticleDOI
TL;DR: The basic concepts, operations and characteristics of rough set theory are introduced; then the extensions of the rough set model, the state of their applications, some application software, and the key problems in applied research on rough set theory are presented.

185 citations


Journal ArticleDOI
TL;DR: This paper constructs a novel rough set model for feature subset selection, defines the dependency between the fuzzy decision and condition attributes, and employs that dependency to evaluate the significance of a candidate feature, from which a greedy feature subset selection algorithm is designed.
Abstract: Rough set theory has been extensively discussed in machine learning and pattern recognition. It provides another important theoretical tool for feature selection. In this paper, we construct a novel rough set model for feature subset selection. First, we define the fuzzy decision of a sample by using the concept of fuzzy neighborhood. A parameterized fuzzy relation is introduced to characterize fuzzy information granules for the analysis of real-valued data. Then, we use the relationship between fuzzy neighborhood and fuzzy decision to construct a new rough set model: the fuzzy neighborhood rough set model. Based on this model, the definitions of upper and lower approximation, boundary region and positive region are given, and the effects of parameters on these concepts are discussed. To make the new model tolerant of noise in data, we introduce a variable-precision fuzzy neighborhood rough set model. This model decreases the possibility that a sample is classified into a wrong category. Finally, we define the dependency between the fuzzy decision and condition attributes and employ the dependency to evaluate the significance of a candidate feature, from which a greedy feature subset selection algorithm is designed. The proposed algorithm is compared with some classical algorithms. The experiments show that the proposed algorithm achieves higher classification performance while selecting relatively few features.
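The dependency-driven greedy search described above can be illustrated with a minimal sketch. Note this uses crisp equivalence classes rather than the paper's parameterized fuzzy neighborhoods, and the function names are illustrative, not from the paper:

```python
def partition(rows, attrs):
    """Group row indices into equivalence classes by their values on attrs."""
    classes = {}
    for i, row in enumerate(rows):
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, set()).add(i)
    return list(classes.values())

def dependency(rows, labels, attrs):
    """gamma(attrs): fraction of samples in the positive region, i.e. whose
    equivalence class under attrs is pure with respect to the decision label."""
    if not attrs:
        return 0.0
    pure = sum(len(c) for c in partition(rows, attrs)
               if len({labels[i] for i in c}) == 1)
    return pure / len(rows)

def greedy_reduct(rows, labels, all_attrs):
    """Greedily add the attribute with the largest dependency gain
    (a crisp analogue of significance-based forward selection)."""
    selected, best = [], 0.0
    while True:
        gains = [(dependency(rows, labels, selected + [a]), a)
                 for a in all_attrs if a not in selected]
        if not gains:
            return selected
        gamma, a = max(gains)
        if gamma <= best:        # no attribute improves the dependency: stop
            return selected
        selected, best = selected + [a], gamma
```

On a toy table where attribute 2 alone determines the label, the search stops after one pick.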

177 citations


Journal ArticleDOI
TL;DR: By introducing the new concepts of fuzzy β-covering and fuzzy β-neighborhood, two new types of fuzzy covering rough set models are defined, which can be regarded as bridges linking covering rough set theory and fuzzy rough set theory.

173 citations


Journal ArticleDOI
TL;DR: This paper proposes a generalized attribute reduct that considers not only the data but also user preference; several reduction approaches are summarized to help users design appropriate reducts.
Abstract: Attribute reduction plays an important role in the areas of rough sets and granular computing. Many kinds of attribute reducts have been defined in previous studies. However, most of them concentrate on data only, which result in the difficulties of choosing appropriate attribute reducts for specific applications. It would be ideal if we could combine properties of data and user preference in the definition of attribute reduct. In this paper, based on reviewing existing definitions of attribute reducts, we propose a generalized attribute reduct which not only considers the data but also user preference. The generalized attribute reduct is the minimal subset which satisfies a specific condition defined by users. The condition is represented by a group of measures and a group of thresholds, which are relevant to user requirements or real applications. For the same data, different users can define different reducts and obtain their interested results according to their applications. Most current attribute reducts can be derived from the generalized reduct. Several reduction approaches are also summarized to help users to design their appropriate reducts.
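The generalized reduct above is "the minimal subset which satisfies a specific condition defined by users", where the condition is a group of measures with a group of thresholds. A brute-force sketch of that idea (the measure and names below are hypothetical illustrations, only practical for small attribute sets):

```python
from itertools import combinations

def satisfies(subset, measures, thresholds):
    """The user-defined condition: every chosen measure meets its threshold."""
    return all(m(subset) >= t for m, t in zip(measures, thresholds))

def generalized_reducts(attrs, measures, thresholds):
    """All minimal attribute subsets satisfying the condition.

    Enumerates subsets by increasing size, so any satisfying subset whose
    strict subsets all fail is minimal by construction."""
    found = []
    for k in range(1, len(attrs) + 1):
        for subset in combinations(attrs, k):
            is_superset = any(set(r) <= set(subset) for r in found)
            if not is_superset and satisfies(subset, measures, thresholds):
                found.append(subset)
    return found
```

Different users plug in different measure/threshold pairs and obtain different reducts from the same data, which is exactly the flexibility the abstract describes.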

154 citations


Journal ArticleDOI
TL;DR: This study investigates the relationship between multigranulation rough sets and concept lattices via rule acquisition, and analyzes the algorithmic complexity of acquiring "AND" decision rules, "OR" decision rules, granular rules and disjunctive rules.
Abstract: Transforming decision systems into formal decision contexts is studied. The relationship between "AND" decision rules and granular rules is discussed. The relationship between "OR" decision rules and disjunctive rules is investigated. Support and certainty factors of different rules are compared. The algorithmic complexity of rule acquisition is analyzed. Recently, by combining rough set theory with granular computing, pessimistic and optimistic multigranulation rough sets have been proposed to derive "AND" and "OR" decision rules from decision systems. At the same time, by integrating granular computing and formal concept analysis, Wille's concept lattice and the object-oriented concept lattice were used to obtain granular rules and disjunctive rules from formal decision contexts. So the problem of rule acquisition can bring rough set theory, granular computing and formal concept analysis together. In this study, to shed some light on the comparison and combination of rough set theory, granular computing and formal concept analysis, we investigate the relationship between multigranulation rough sets and concept lattices via rule acquisition. Some interesting results are obtained in this paper: (1) "AND" decision rules in pessimistic multigranulation rough sets are proved to be granular rules in concept lattices, but the inverse may not be true; (2) the combination of the truth parts of an "OR" decision rule in optimistic multigranulation rough sets is an item of the decomposition of a disjunctive rule in concept lattices; (3) a non-redundant disjunctive rule in concept lattices is shown to be the multi-combination of the truth parts of "OR" decision rules in optimistic multigranulation rough sets; and (4) the same rule is defined with the same certainty factor but a different support factor in multigranulation rough sets and concept lattices. Moreover, algorithm complexity analysis is carried out for the acquisition of "AND" decision rules, "OR" decision rules, granular rules and disjunctive rules.

149 citations


Journal ArticleDOI
TL;DR: This paper proposes a unified framework to classify and compare existing studies that explain rough sets in a multigranulation space through rough sets derived from individual equivalence relations.

147 citations


Journal ArticleDOI
01 Sep 2016
TL;DR: A hybridization of two techniques, Tolerance Rough Set and Firefly Algorithm, is used to select the imperative features of brain tumors; the results show the effectiveness of the proposed technique as well as improvements over existing supervised feature selection algorithms.
Abstract: Brain tumor is one of the most harmful diseases, and has affected the majority of people, including children, in the world. The probability of survival can be enhanced if the tumor is detected at a premature stage. The intention of a feature selection approach is to select a small subset of features which minimizes redundancy and maximizes relevance to the target, such as the class labels in classification. Thus, the machine learning model receives a compact representation with high predictive accuracy using the selected prominent features. Therefore, feature selection currently plays a significant role in machine learning and knowledge discovery. A novel hybrid supervised feature selection algorithm, called TRSFFQR (Tolerance Rough Set Firefly based Quick Reduct), is developed and applied to MRI brain images. The hybrid intelligent system aims to exploit the benefits of the basic models and, at the same time, moderate their limitations. Different categories of features are extracted from the segmented MRI images, i.e., shape, intensity and texture based features. The features extracted from brain tumor images are real-valued; hence, the Tolerance Rough Set is applied in this work. In this study, a hybridization of two techniques, Tolerance Rough Set (TRS) and Firefly Algorithm (FA), is used to select the imperative features of brain tumors. The performance of TRSFFQR is compared with Artificial Bee Colony (ABC), Cuckoo Search Algorithm (CSA), Supervised Tolerance Rough Set-PSO based Relative Reduct (STRSPSO-RR) and Supervised Tolerance Rough Set-PSO based Quick Reduct (STRSPSO-QR). The experimental results show the effectiveness of the proposed technique as well as improvements over the existing supervised feature selection algorithms.

128 citations


Journal ArticleDOI
TL;DR: The diagnosis results show that the proposed method can reliably identify the different fault categories, including both single and compound faults, and achieves better classification performance than any one of the individual classifiers.

125 citations


Journal ArticleDOI
TL;DR: A novel way of performing design concept evaluation is proposed: instead of considering the cost and benefit characteristics of design criteria, the work identifies the best concept that satisfies the constraints imposed by the team of designers on the design criteria while fulfilling the maximum of customers' preferences.

Journal ArticleDOI
TL;DR: This work proposes two kinds of generalized multigranulation double-quantitative decision-theoretic rough sets (GMDq-DTRS), which are more feasible when making decisions in real life, and an illustrative case is shown to elaborate the theoretical advantages of GMDq-DTRS.
Abstract: The principle of the minority being subordinate to the majority is the most feasible and credible when people make decisions in the real world. So generalized multigranulation rough set theory is a desirable fusion method, in which upper and lower approximations are approximated by granular structures satisfying a certain level of information. However, the relationship between an equivalence class and a concept under each granular structure is very strict. Therefore, more attention is paid to the fault tolerance capabilities of double-quantitative rough set theory and the feasibility of the majority principle. By considering relative and absolute quantitative information between the class and the concept, we propose two kinds of generalized multigranulation double-quantitative decision-theoretic rough sets (GMDq-DTRS). First, we define the upper and lower approximations of generalized multigranulation double-quantitative rough sets by introducing upper and lower support characteristic functions. Then, important properties of the two kinds of GMDq-DTRS models are explored and corresponding decision rules are given. Moreover, the internal relations between the two models under certain constraints, and between GMDq-DTRS and other models, are explored. The definition of approximation accuracy in GMDq-DTRS is proposed to show the advantage of GMDq-DTRS. Finally, an illustrative case is shown to elaborate the theoretical advantages of GMDq-DTRS, which are valuable for dealing with practical problems. Generalized multigranulation double-quantitative decision-theoretic rough set theory will be more feasible when making decisions in real life.

Journal ArticleDOI
TL;DR: This paper investigates approaches to attribute reduction in parallel using dominance-based neighborhood rough sets (DNRS), which take into consideration the partial orders among numerical and categorical attribute values, and can be utilized in a multicriteria decision-making method.

Journal ArticleDOI
TL;DR: Three missing-value imputation methods based on fuzzy-rough nearest neighbors are proposed: FRNNI, OWANNI and VQNNI; non-parametric statistical analysis shows that they perform excellently.

Journal ArticleDOI
01 Feb 2016
TL;DR: This paper reviews how to define orthopairs and a hierarchy on them in the light of granular computing, along with possible generalizations and connections with different paradigms.
Abstract: Pairs of disjoint sets (orthopairs) naturally arise in, or have points in common with, many tools for managing uncertainty: rough sets, shadowed sets, version spaces, three-valued logics, etc. Indeed, they can be used to model partial knowledge, borderline cases, consensus, and example/counter-example pairs. Moreover, generalized versions of orthopairs include the well-known theories of Atanassov intuitionistic fuzzy sets and possibility theory, and the newly established three-way decision theory. Thus, it is worth studying them at an abstract level in order to outline general properties that can then be cast to the different paradigms they are connected with. In this paper, we review how to define orthopairs and a hierarchy on them in the light of granular computing. Aggregation operators are also discussed, as well as possible generalizations and connections with different paradigms. This permits us to point out new facets of these paradigms and outline some possible future developments.
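A minimal sketch of the orthopair notion and a knowledge-ordering hierarchy on orthopairs might look as follows (class and method names are illustrative, not from the paper):

```python
class Orthopair:
    """A pair of disjoint sets (P, N) over a universe U; the boundary U - P - N
    models borderline / undecided cases, as in rough sets and three-way decisions."""

    def __init__(self, universe, pos, neg):
        pos, neg = set(pos), set(neg)
        if not pos.isdisjoint(neg):
            raise ValueError("the two parts of an orthopair must be disjoint")
        self.U, self.P, self.N = set(universe), pos, neg

    def boundary(self):
        """Elements left undecided by this orthopair."""
        return self.U - self.P - self.N

    def refines(self, other):
        """Knowledge (information) ordering: self decides at least everything
        other decides, and never contradicts it."""
        return other.P <= self.P and other.N <= self.N
```

Under `refines`, orthopairs form a partial order whose bottom is the fully undecided pair (empty P and N), which is one way to read the hierarchy the abstract mentions.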

Journal ArticleDOI
TL;DR: Four kinds of constructive methods for rough approximation operators from existing rough sets are established, and an important conclusion is obtained: some rough sets are essentially direct applications of these constructive methods.
Abstract: Four kinds of constructive methods of rough approximation operators from existing rough sets are established, and the important conclusion is obtained: some rough sets are essentially direct applications of these constructive methods. Moreover, the new notions of non-dual multigranulation rough sets and hybrid multigranulation rough sets are introduced, and some properties are investigated.
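As an example of such constructive definitions, the classic Pawlak approximations built from a partition, plus an optimistic multigranulation lower approximation assembled from several partitions, can be sketched as follows (these are textbook constructions, not the paper's specific four methods):

```python
def approximations(classes, X):
    """Pawlak lower/upper approximations of X from a partition into equivalence classes."""
    X = set(X)
    lower, upper = set(), set()
    for c in classes:
        c = set(c)
        if c <= X:
            lower |= c   # class fully inside X: certainly in X
        if c & X:
            upper |= c   # class meets X: possibly in X
    return lower, upper

def optimistic_lower(partitions, X):
    """Optimistic multigranulation lower approximation: an object is accepted
    if its class under at least one granulation fits inside X."""
    return set().union(*(approximations(p, X)[0] for p in partitions))
```

Combining the per-granulation lower approximations by union (optimistic) or intersection (pessimistic) is exactly the kind of construction the non-dual and hybrid multigranulation models generalize.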

Journal ArticleDOI
01 Mar 2016
TL;DR: It is shown how the accuracy of rule-based classifiers can be increased by learning the number and parameters of the granules which partition the involved variables, exploiting a multi-objective evolutionary approach to classifier generation that the authors have recently proposed.
Abstract: In the last years, rule-based systems have been widely employed in several different application domains. The performance of these systems is strongly affected by the process of information granulation, which defines in terms of specific information granules such as sets, fuzzy sets and rough sets, the labels used in the rules. Generally, information granules are either provided by an expert, when possible, or extracted from the available data. In the framework of rule-based classifiers, we investigate the importance of determining an effective information granulation from data, preserving the comprehensibility of the granules. We show how the accuracies of rule-based classifiers can be increased by learning number and parameters of the granules, which partition the involved variables. To perform this analysis, we exploit a multi-objective evolutionary approach to the classifier generation we have recently proposed. We discuss different levels of information granulation optimization employing both the learning of the number of granules per variable and the tuning of each granule during the evolutionary process. We show and discuss the results obtained on several classification benchmark datasets using fuzzy sets and intervals as types of information granules.

Journal ArticleDOI
TL;DR: A new decision-making method to evaluate product design concepts based on the distance between the interval vector of each alternative and the positive and negative ideal reference vectors; the results are compared to the TOPSIS method.
Abstract: An interval-based relative closeness index is proposed to rank design concepts. Design concepts with respect to quantitative criteria are evaluated with rough sets. Design concepts with respect to qualitative criteria are evaluated with fuzzy sets. The weights of criteria are obtained using the extent analysis method on fuzzy AHP. The results of design concept evaluation for our method are compared to TOPSIS. Design concept evaluation is a critical stage in product development which has a significant impact on the downstream process and thus on the success of the new product. Design concept evaluation is widely recognized as a complex multi-criteria decision-making (MCDM) problem involving various decision criteria and large amounts of data which are usually imprecise and subjective. This paper proposes a new decision-making method to evaluate product design concepts based on the distance between the interval vector of each alternative and the positive and negative ideal reference vectors. The ranking of design concepts is obtained by calculating an interval-based relative closeness index for each alternative. In this method, to deal with the uncertainty and vagueness of data in the early phases of product design, the performance of design concepts with respect to quantitative and qualitative criteria is concurrently evaluated using rough sets and fuzzy sets. The weights of criteria used in the evaluation are obtained using the extent analysis method on fuzzy AHP. The efficacy of the method is demonstrated with a numerical example and the results are compared to the TOPSIS method. Finally, the conclusions of our method are presented and some future directions are proposed to improve the model.
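One plausible reading of an interval-based relative closeness index, sketched under the assumption of a simple Euclidean-style interval distance (the paper's exact distance measure may differ):

```python
import math

def interval_distance(v, w):
    """Distance between interval vectors; each component is a (lo, hi) pair.
    Averages the squared gaps of the two endpoints per component."""
    return math.sqrt(sum(((a - c) ** 2 + (b - d) ** 2) / 2
                         for (a, b), (c, d) in zip(v, w)))

def relative_closeness(alt, ideal_pos, ideal_neg):
    """Index in [0, 1]: nearer 1 means closer to the positive ideal
    and farther from the negative ideal (TOPSIS-style)."""
    dp = interval_distance(alt, ideal_pos)
    dn = interval_distance(alt, ideal_neg)
    return dn / (dp + dn)
```

Alternatives are then ranked by this index in descending order, mirroring how TOPSIS ranks by relative closeness to the crisp ideal solutions.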

Journal ArticleDOI
TL;DR: A neighborhood based decision-theoretic rough set model (NDTRS) under the framework of DTRS is proposed and a new neighborhood classifier based on three-way decisions is constructed and compared with other classifiers.

Journal ArticleDOI
TL;DR: This article studies 24 neighborhood operators that can be derived from a single covering, showing which operators yield smaller or greater neighborhoods than others and the connection between these operators and relation-based approximation operators, another prominent generalization of Pawlak's rough sets.

Journal ArticleDOI
TL;DR: The OSFS problem is considered from the rough sets (RS) perspective and a new OSFS algorithm, called OS-NRRSAR-SA, is proposed, which uses the classical significance analysis concepts in RS theory to control the unknown feature space in OSFS problems.

Journal ArticleDOI
01 Aug 2016
TL;DR: An incremental algorithm for attribute reduction with VPRS, designed around the attribute reduction process, is presented to address the time complexity of current algorithms.
Abstract: Two Boolean row vectors are introduced to characterize the discernibility matrix and reduct. Rather than the whole discernibility matrix, minimal elements are incrementally computed. The attribute reduction process is studied to reveal how to add and delete attributes. Our incremental algorithm is developed by adopting the attribute reduction process. The experimental results show our method can handle datasets with large samples. Attribute reduction with variable precision rough sets (VPRS) attempts to select the most information-rich attributes from a dataset by incorporating a controlled degree of misclassification into the approximations of rough sets. However, existing attribute reduction algorithms with VPRS have no incremental mechanism for handling dynamic datasets with increasing samples, so they are computationally time-consuming on such datasets. Therefore, this paper presents an incremental algorithm for attribute reduction with VPRS to address the time complexity of current algorithms. First, two Boolean row vectors are introduced to characterize the discernibility matrix and reduct in VPRS. Then, an incremental manner is employed to update the minimal elements in the discernibility matrix at the arrival of an incremental sample. Based on this, a deep insight into the attribute reduction process is gained to reveal which attributes should be added to and/or deleted from a current reduct, and our incremental algorithm is designed by adopting this attribute reduction process. Finally, experimental comparisons validate the effectiveness of the proposed incremental algorithm.
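The discernibility-matrix machinery this abstract builds on can be sketched in its classical crisp form; the `minimal_elements` step shows why maintaining only the minimal entries suffices (this is the standard Pawlak construction, not the paper's VPRS variant):

```python
def discernibility_matrix(rows, labels, attrs):
    """For each pair of samples with different labels, the set of attributes
    on which they differ; any reduct must hit every such set."""
    entries = []
    for i in range(len(rows)):
        for j in range(i + 1, len(rows)):
            if labels[i] != labels[j]:
                diff = frozenset(a for a in attrs if rows[i][a] != rows[j][a])
                if diff:
                    entries.append(diff)
    return entries

def minimal_elements(entries):
    """Entries with no strict subset among the others. A subset of attributes
    hits every entry iff it hits every minimal entry, so only these need to be
    kept (and updated) when new samples arrive."""
    return [e for e in entries if not any(f < e for f in entries)]
```

Incrementally adding a sample only adds pairwise entries involving that sample, so the minimal-element set can be updated without recomputing the whole matrix.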

Journal ArticleDOI
TL;DR: The results show that the novel attribute reduction algorithm based on an improved AFSA and rough sets can find the attribute reduction set effectively, with low time complexity and excellent global search ability.

Journal ArticleDOI
TL;DR: A new kind of dominance relation, named the characteristic-based dominance relation, is introduced for incomplete ordered information systems in which some attribute values may be lost or absent, and a heuristic algorithm with polynomial time complexity for finding a unique (relative) reduct is designed.

Journal ArticleDOI
TL;DR: A novel technical approach is put forward, based on domain mapping in Axiomatic Design and on quality and reliability data from the product lifecycle, with an integrated application of the artificial intelligence techniques Rough Set and fuzzy TOPSIS to compute the weights of the root causes of complex-product infant failures.

Journal ArticleDOI
TL;DR: The proposed data mining framework consists of four stages: problem definition and data collection; RST analysis; rule validation; and knowledge extraction and usage. Results indicated that latent knowledge can be identified to support location selection decisions.

Journal ArticleDOI
TL;DR: A new rough set model of conflict analysis is proposed: a conflict analysis decision model based on rough set theory over two universes, which not only reveals the core causes of a conflict situation but also finds a possible optimal feasible consensus strategy that satisfies the agents as much as possible.

Journal ArticleDOI
Bao Qing Hu1
TL;DR: This paper attempts to generalize the measurement of decision conclusions in three-way decision spaces from fuzzy lattices to partially ordered sets, and points out that the collection of non-empty subsets of [0, 1] and the family of hesitant fuzzy sets are both partially ordered sets.
Abstract: Three-way decisions on three-way decision spaces are based on fuzzy lattices, i.e., complete distributive lattices with involutive negators. However, some popular structures, such as hesitant fuzzy sets and type-2 fuzzy sets, do not constitute fuzzy lattices, which limits applications of the theory of three-way decision spaces. So this paper attempts to generalize the measurement of decision conclusions in three-way decision spaces from fuzzy lattices to partially ordered sets. First, three-way decision spaces and three-way decisions are discussed based on general partially ordered sets. Then, this paper points out that the collection of non-empty subsets of [0, 1] and the family of hesitant fuzzy sets are both partially ordered sets. Finally, this paper systematically discusses three-way decision spaces and three-way decisions based on hesitant fuzzy sets and interval-valued hesitant fuzzy sets, and obtains many useful decision evaluation functions.

Journal ArticleDOI
TL;DR: In this article, the authors exploit matrix approaches to study incremental decision-theoretic rough set approach for evolving data and develop incremental algorithms for updating probabilistic rough set approximations with respect to the addition/deletion of objects.
Abstract: Decision-theoretic rough sets is a generalized probabilistic model for the expression of uncertainty and the representation of knowledge from data. It provides a semantic explanation and systematic computation of the probabilistic thresholds that define probabilistic rough set approximations, offering a ternary classification framework based on Bayesian decision theory. In practice, data for the decision-making process reside in a dynamic database whose content typically evolves through periodical or occasional updating, e.g., new data are appended and obsolete data are removed. It is impractical to have a mature decision model stalled until all helpful training data have been prepared. To address this issue, incremental learning is a feasible solution for continuous knowledge modeling from evolving data, incorporating the unlearned knowledge embedded in the updated data. In this paper, we exploit matrix approaches to study an incremental decision-theoretic rough set approach for evolving data. Starting from the representation of object subsets and the indiscernibility relation in matrix form, we obtain a matrix characterization of probabilistic rough set approximations in decision-theoretic rough sets by using matrix properties associated with the multiplication operator. We also develop incremental algorithms for updating probabilistic rough set approximations with respect to the addition/deletion of objects, which enables decision-theoretic rough sets to deal gracefully with evolving data. A detailed experimental study is conducted to examine the performance of the proposed incremental algorithms on UCI data sets.
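The matrix characterization of approximations can be sketched for the crisp case: with a Boolean relation matrix R and a characteristic vector x of a set X, the upper approximation is the Boolean product of R and x, and the lower approximation tests elementwise inclusion of each row in x (the probabilistic thresholds of DTRS are omitted here for brevity):

```python
def boolean_product(R, x):
    """Boolean matrix-vector product: entry i is 1 iff the set related to
    object i meets X -- the (crisp) upper approximation of X."""
    return [int(any(rij and xj for rij, xj in zip(row, x))) for row in R]

def lower_by_matrix(R, x):
    """Lower approximation via the relation matrix: object i belongs iff every
    object related to i is in X (row i is elementwise <= x)."""
    return [int(all(xj >= rij for rij, xj in zip(row, x))) for row in R]
```

Representing the relation and the set as matrices is what makes incremental updating natural: adding or deleting an object only appends or removes a row/column of R and an entry of x.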

Journal ArticleDOI
TL;DR: A three-way decision making approach based on decisions of acceptance, rejection or deferment, which aims to mitigate false decisions at the model level by determining a tradeoff between different properties of decision making, such as accuracy, generality and uncertainty.