
Showing papers on "Rough set published in 2009"


Journal ArticleDOI
TL;DR: Three new approaches to fuzzy-rough feature selection based on fuzzy similarity relations are proposed, including a fuzzy extension of crisp discernibility matrices; initial experimentation shows that the methods greatly reduce dimensionality while preserving classification accuracy.
Abstract: There has been great interest in developing methodologies that are capable of dealing with imprecision and uncertainty. The large amount of research currently being carried out in fuzzy and rough sets is representative of this. Many deep relationships have been established, and recent studies have concluded that the two methodologies are complementary. It is therefore desirable to extend and hybridize the underlying concepts to deal with additional aspects of data imperfection. Such developments offer a high degree of flexibility and provide robust solutions and advanced tools for data analysis. Fuzzy-rough set-based feature selection (FS) has been shown to be highly useful at reducing data dimensionality, but possesses several problems that render it ineffective for large datasets. This paper proposes three new approaches to fuzzy-rough FS based on fuzzy similarity relations. In particular, a fuzzy extension of crisp discernibility matrices is proposed and utilized. Initial experimentation shows that the methods greatly reduce dimensionality while preserving classification accuracy.
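As a rough sketch of how fuzzy similarity relations can drive feature selection in this setting: the dependency of the decision on a feature subset is computed from a fuzzy-rough lower approximation and maximised greedily, QuickReduct-style. The similarity measure, the min t-norm, and the Kleene-Dienes implicator below are illustrative assumptions, not the paper's exact construction.

```python
# A minimal sketch of fuzzy-rough feature selection with fuzzy
# similarity relations. Assumptions: per-attribute similarity is
# 1 - normalised distance, attributes combine via the min t-norm,
# and the lower approximation uses the Kleene-Dienes implicator.

def similarity(data, attr, x, y):
    """Per-attribute fuzzy similarity: 1 - normalised distance."""
    vals = [row[attr] for row in data]
    rng = max(vals) - min(vals) or 1.0
    return 1.0 - abs(data[x][attr] - data[y][attr]) / rng

def relation(data, attrs, x, y):
    """Combine per-attribute similarities with the min t-norm."""
    return min(similarity(data, a, x, y) for a in attrs)

def dependency(data, labels, attrs):
    """Fuzzy-rough positive-region dependency degree (gamma)."""
    total = 0.0
    for x in range(len(data)):
        # Lower-approximation membership via Kleene-Dienes:
        # inf_y max(1 - R(x, y), [label(y) == label(x)])
        total += min(
            max(1.0 - relation(data, attrs, x, y),
                1.0 if labels[y] == labels[x] else 0.0)
            for y in range(len(data))
        )
    return total / len(data)

def quickreduct(data, labels):
    """Greedily add the attribute that most increases gamma."""
    all_attrs = list(range(len(data[0])))
    reduct, best = [], 0.0
    while True:
        cand = max((a for a in all_attrs if a not in reduct),
                   key=lambda a: dependency(data, labels, reduct + [a]),
                   default=None)
        if cand is None:
            break
        score = dependency(data, labels, reduct + [cand])
        if score <= best:
            break
        reduct, best = reduct + [cand], score
    return reduct, best
```

On a tiny dataset where the first attribute separates the classes, the greedy search keeps only that attribute.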

521 citations


Journal ArticleDOI
TL;DR: The equivalence between this type of covering-based rough sets and a type of binary-relation-based rough sets is established, and axiomatic systems for the corresponding covering lower and upper approximation operations are presented.

427 citations


Journal ArticleDOI
TL;DR: A new procedure is proposed that joins quantitative values of RFM attributes and the K-means algorithm with rough set theory (RS theory) to extract meaningful rules, effectively mitigating known drawbacks of data mining tools.
Abstract: Data mining is a powerful technique that helps companies mine the patterns and trends in their customer data and thereby drive improved customer relationships; it is one of the well-known tools of customer relationship management (CRM). However, data mining tools have some drawbacks: for example, neural networks have long training times and genetic algorithms are brute-force computing methods. This study proposes a new procedure, joining quantitative values of RFM attributes and the K-means algorithm with rough set theory (RS theory), to extract meaningful rules and effectively mitigate these drawbacks. Three purposes are involved in this study: (1) discretizing continuous attributes to enhance the rough set algorithm; (2) clustering customer value as output (customer loyalty), partitioned into 3, 5, and 7 classes based on a subjective view, to see which class count yields the best accuracy rate; and (3) finding the characteristics of customers in order to strengthen CRM. A dataset collected from C-company in Taiwan's electronic industry is employed in an empirical case study to illustrate the proposed procedure. Referring to [Hughes, A. M. (1994). Strategic database marketing. Chicago: Probus Publishing Company], this study first utilizes the RFM model to yield quantitative values as input attributes; next, it uses the K-means algorithm to cluster customer value; finally, it employs rough sets (the LEM2 algorithm) to mine classification rules that help enterprises drive excellent CRM. In the analysis of the empirical results, the proposed procedure outperforms the listed methods in accuracy rate regardless of whether the output has 3, 5, or 7 classes, and generates understandable decision rules.
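The preprocessing chain the study describes (quantify RFM attributes, then cluster customer value with K-means) can be sketched roughly as follows. The 1-5 rank-based scoring and the toy one-dimensional k-means are assumptions for illustration, not the study's exact settings.

```python
# Illustrative sketch: score customers on Recency/Frequency/Monetary,
# then cluster the scores with a tiny 1-D k-means (Lloyd's algorithm).

def rfm_score(values, reverse=False):
    """Rank-based 1..5 score. reverse=True ranks smaller values higher
    (e.g. days-since-purchase for recency)."""
    order = sorted(range(len(values)), key=lambda i: values[i],
                   reverse=reverse)
    scores = [0] * len(values)
    for rank, i in enumerate(order):
        scores[i] = 1 + (5 * rank) // len(values)
    return scores

def kmeans_1d(points, k, iters=20):
    """Plain Lloyd's algorithm on scalars."""
    centers = sorted(points)[:: max(1, len(points) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)), key=lambda j: abs(p - centers[j]))
            clusters[j].append(p)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return centers
```

The cluster assignments would then serve as the decision attribute for rule mining (LEM2 in the study).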

386 citations


Journal ArticleDOI
01 Jan 2009
TL;DR: Rough set hybridizations with fuzzy sets, neural networks, and metaheuristic algorithms are reviewed, and the performance of the algorithms is discussed in connection with classification.
Abstract: Rough set theory is a relatively new mathematical tool for dealing with uncertainty and vagueness in decision systems, and it has been applied successfully in many fields. It is used to identify a reduct of the set of all attributes of the decision system. The reduct is used as a preprocessing step for classification of the decision system, in order to bring out potential patterns, association rules, or knowledge through data mining techniques. Several researchers have contributed a variety of algorithms for computing reducts, considering different cases such as inconsistency, missing attribute values, and multiple decision attributes. This paper reviews techniques for dimensionality reduction in the rough set theory environment. Further, rough set hybridizations with fuzzy sets, neural networks, and metaheuristic algorithms are also reviewed. The performance of the algorithms is discussed in connection with classification.

348 citations


Book ChapterDOI
Yiyu Yao1
01 Jul 2009
TL;DR: A new interpretation of rules in rough set theory is introduced, which enables us to derive three types of decision rules, namely, positive rules for acceptance, boundary rules for indecision or delayed decision, and negative rules for rejection.
Abstract: A new interpretation of rules in rough set theory is introduced. According to the positive, boundary, and negative regions of a set, one can make a three-way decision: accept, abstain, or reject. The three regions enable us to derive three types of decision rules, namely, positive rules for acceptance, boundary rules for indecision or delayed decision, and negative rules for rejection. Within the decision-theoretic rough set model, the associated costs of rules are analyzed.
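The three regions behind these rule types can be computed directly from the equivalence classes of the approximation space; a minimal sketch:

```python
# Yao's three-way regions: given equivalence classes of a universe and
# a target set X, each class goes to the positive region (accept), the
# negative region (reject), or the boundary (abstain / delayed decision).

def three_way_regions(classes, X):
    pos, bnd, neg = set(), set(), set()
    for block in classes:
        block = set(block)
        if block <= X:        # fully inside X: positive rule
            pos |= block
        elif block & X:       # partial overlap: boundary rule
            bnd |= block
        else:                 # disjoint from X: negative rule
            neg |= block
    return pos, bnd, neg
```

For example, with classes {1,2}, {3,4}, {5,6} and X = {1,2,3}, the class {3,4} straddles X and lands in the boundary.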

305 citations


Journal ArticleDOI
TL;DR: The equivalence is established between unary coverings and coverings with the property that the intersection of any two elements is a union of finitely many elements of the covering.

274 citations


Journal ArticleDOI
TL;DR: The experimental results show that the combined approach improves the forecasting performance of each individual forecast and is free from the restrictions of a pure rough sets approach, making it a promising forecasting approach and a new application of soft set theory.

251 citations


Journal ArticleDOI
TL;DR: Through attribute reduction based on variable-precision rough sets, the influence of noise data and weakly interdependent data on BP is avoided, so training time is decreased.
Abstract: Precise short-term load forecasting (STLF) plays a significant role in the management of power systems of countries and regions, given insufficient electric energy for increasing demand. This paper presents a back-propagation neural network with rough set (RSBP) approach for complicated STLF with dynamic and nonlinear factors, aiming to improve prediction accuracy. Through attribute reduction based on variable-precision rough sets, the influence of noise data and weakly interdependent data on BP is avoided, so training time is decreased. Using load time series from a practical power system, we tested the performance of RSBP by comparing its predictions with those of a BP network.

231 citations


Journal ArticleDOI
TL;DR: The result of an example shows that the proposed rough-grey analysis provides a novel alternative for design concept evaluation, in which vague design information and expert knowledge can be modeled and analyzed more effectively and objectively.
Abstract: Design concept evaluation plays a critical role in the early phases of product development, as it has significant impact on the downstream development processes as well as on the success of the product developed. Essentially, design concept evaluation is a complex multi-criteria decision-making process involving large amounts of data and expert knowledge which are usually imprecise and subjective. Aiming to improve the effectiveness and objectivity of the design concept evaluation process, this paper proposes a novel method based on grey relational analysis and rough set theory. By integrating the strength of rough sets in handling vagueness and the merit of grey relational analysis in modeling multi-criteria decision-making, a rough number enabled grey relational analysis (called rough-grey analysis) is proposed to evaluate design concepts. The result of an example shows that the proposed rough-grey analysis provides a novel alternative for design concept evaluation, in which vague design information and expert knowledge can be modeled and analyzed more effectively and objectively.
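Separately from the rough-number extension the paper proposes, the grey relational analysis ingredient can be sketched in its classical form. The distinguishing coefficient zeta = 0.5 is the conventional default, and inputs are assumed to be pre-normalised to comparable scales.

```python
# Classical grey relational analysis: each alternative is compared to a
# reference sequence; per-criterion coefficients are averaged into a
# grey relational grade (higher = closer to the reference).

def grey_relational_grades(reference, alternatives, zeta=0.5):
    """Grey relational grade of each alternative vs. the reference."""
    deltas = [[abs(r - a) for r, a in zip(reference, alt)]
              for alt in alternatives]
    dmin = min(min(row) for row in deltas)   # global minimum deviation
    dmax = max(max(row) for row in deltas)   # global maximum deviation
    grades = []
    for row in deltas:
        coeffs = [(dmin + zeta * dmax) / (d + zeta * dmax) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades
```

An alternative identical to the reference gets grade 1.0; one maximally far on every criterion gets the floor value zeta / (1 + zeta).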

222 citations


Journal ArticleDOI
TL;DR: This paper sets up a model named fuzzy VPRS (FVPRS) by combining FRS and VPRS with the goal of making FRS a special case, and compares FVPRS with RS, FRS, and several flexible RS-based approaches with respect to misclassification and perturbation.
Abstract: The fuzzy rough set (FRS) model has been introduced to handle databases with real values. However, FRS is sensitive to misclassification and perturbation (here misclassification means errors or missing values in classification, and perturbation means small changes of numerical data). The variable precision rough set (VPRS) model was introduced to handle databases with misclassification; however, it cannot effectively handle real-valued datasets. It is therefore valuable, from both theoretical and practical perspectives, to combine FRS and VPRS so as to develop a powerful tool that not only can handle numerical data but also is less sensitive to misclassification and perturbation. In this paper, we set up a model named fuzzy VPRS (FVPRS) by combining FRS and VPRS, with the goal of making FRS a special case. First, we study the knowledge representation of FRS and VPRS, and then propose the set approximation operators of FVPRS. Second, we employ the discernibility matrix approach to investigate the structure of attribute reductions in FVPRS and develop an algorithm to find all reductions. Third, in order to overcome the NP-completeness of finding all reductions, we develop fast heuristic algorithms to obtain one near-optimal attribute reduction. Finally, we compare FVPRS with RS, FRS, and several flexible RS-based approaches with respect to misclassification and perturbation. The experimental comparisons show the feasibility and effectiveness of FVPRS.
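The variable-precision ingredient of FVPRS relaxes the crisp lower approximation by an inclusion-degree threshold. A minimal crisp-side sketch (the fuzzy generalisation in the paper replaces sets by membership functions):

```python
# Variable-precision lower approximation: a block is admitted when its
# inclusion degree in X reaches a threshold beta, tolerating a bounded
# rate of misclassified objects inside the block.

def vprs_lower(classes, X, beta=0.8):
    lower = set()
    for block in classes:
        block = set(block)
        inclusion = len(block & X) / len(block)
        if inclusion >= beta:
            lower |= block
    return lower
```

With beta = 1.0 this collapses to the classical (Pawlak) lower approximation, which admits only fully included blocks.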

201 citations


Journal ArticleDOI
TL;DR: An operator-oriented characterization of L-fuzzy rough sets is presented; that is, L-fuzzy approximation operators are defined by axioms, and the relationship between L-fuzzy rough sets and L-topological spaces is obtained.
Abstract: Rough set theory was developed by Pawlak as a formal tool for approximate reasoning about data. Various fuzzy generalizations of rough approximations have been proposed in the literature. As a further generalization of the notion of rough sets, L-fuzzy rough sets were proposed by Radzikowska and Kerre. In this paper, we present an operator-oriented characterization of L-fuzzy rough sets; that is, L-fuzzy approximation operators are defined by axioms. The methods of axiomatization of L-fuzzy upper and L-fuzzy lower set-theoretic operators guarantee the existence of corresponding L-fuzzy relations which produce the operators. Moreover, the relationship between L-fuzzy rough sets and L-topological spaces is obtained. The sufficient and necessary condition is examined for the conjecture that an L-fuzzy interior (closure) operator derived from an L-fuzzy topological space can associate with an L-fuzzy reflexive and transitive relation such that the corresponding L-fuzzy lower (upper) approximation operator is the L-fuzzy interior (closure) operator.

Journal ArticleDOI
TL;DR: An advantage of VP-DRSA over the variable-consistency dominance-based rough set approach in decision rule induction is emphasized, and some relations among VP-DRSA-based attribute reduction approaches are investigated.

Journal ArticleDOI
TL;DR: This paper compares the covering-based rough sets defined by Zhu with ones defined by Xu and Zhang, and further explores the properties and structures of these types of rough set models.

Journal ArticleDOI
Duoqian Miao1, Yan Zhao2, Yiyu Yao2, Huaxiong Li2, Feifei Xu1 
TL;DR: This paper investigates three different classification properties, and suggests three distinct definitions accordingly, based on the common structure of the specific definitions of relative reducts and discernibility matrices.

Journal ArticleDOI
TL;DR: An axiomatic definition of knowledge granulation for an information system is given, under which three existing measures are modified; the modified measures are shown to be effective and suitable for evaluating the roughness and accuracy of a set in an information system and the approximation accuracy of a rough classification in a decision table.

Journal ArticleDOI
01 Nov 2009
TL;DR: A review of the current literature on rough-set- and near-set-based approaches to solving various problems in medical imaging, such as medical image segmentation, object extraction, and image classification, is presented, together with rough set frameworks hybridized with other computational intelligence technologies.
Abstract: This paper presents a review of the current literature on rough-set- and near-set-based approaches to solving various problems in medical imaging such as medical image segmentation, object extraction, and image classification. Rough set frameworks hybridized with other computational intelligence technologies that include neural networks, particle swarm optimization, support vector machines, and fuzzy sets are also presented. In addition, a brief introduction to near sets and near images with an application to MRI images is given. Near sets offer a generalization of traditional rough set theory and a promising approach to solving the medical image correspondence problem as well as an approach to classifying perceptual objects by means of features in solving medical imaging problems. Other generalizations of rough sets such as neighborhood systems, shadowed sets, and tolerance spaces are also briefly considered in solving a variety of medical imaging problems. Challenges to be addressed and future directions of research are identified and an extensive bibliography is also included.

Journal ArticleDOI
01 Jul 2009
TL;DR: It is shown that consistency measures used so far in the definition of rough approximation lack some monotonicity properties, and new measures within two kinds of rough set approaches are proposed: Variable Consistency Indiscernibility-based Rough Set Approaches (VC-IRSA) and Variable Consistency Dominance-based Rough Set Approaches (VC-DRSA).
Abstract: We consider probabilistic rough set approaches based on different versions of the definition of rough approximation of a set. In these versions, consistency measures are used to control assignment of objects to lower and upper approximations. Inspired by some basic properties of rough sets, we find it reasonable to require from these measures several properties of monotonicity. We consider three types of monotonicity properties: monotonicity with respect to the set of attributes, monotonicity with respect to the set of objects, and monotonicity with respect to the dominance relation. We show that consistency measures used so far in the definition of rough approximation lack some of these monotonicity properties. This observation led us to propose new measures within two kinds of rough set approaches: Variable Consistency Indiscernibility-based Rough Set Approaches (VC-IRSA) and Variable Consistency Dominance-based Rough Set Approaches (VC-DRSA). We investigate properties of these approaches and compare them to previously proposed Variable Precision Rough Set (VPRS) model, Rough Bayesian (RB) model, and previous versions of VC-DRSA.

Proceedings ArticleDOI
01 Apr 2009
TL;DR: The RST-and-SVM schema can improve the false positive rate and accuracy, and the method effectively decreases the space density of the data.
Abstract: The main function of an IDS (Intrusion Detection System) is to protect the system and to analyze and predict the behaviors of users, which are then classified as attacks or normal behaviors. Though IDSs have been developed for many years, the large number of returned alert messages makes it inefficient for managers to maintain the system. In this paper, we use RST (Rough Set Theory) and SVM (Support Vector Machine) to detect intrusions. First, RST is used to preprocess the data and reduce its dimensionality. Next, the features selected by RST are sent to an SVM model for training and testing. The method effectively decreases the space density of the data. The experiments compare the results with those of different methods and show that the RST-and-SVM schema can improve the false positive rate and accuracy.

Journal ArticleDOI
TL;DR: This paper identifies the main uncertainty factors affecting the evaluation process, then models and analyzes them using rough data envelopment analysis (RDEA) models, created by integrating classical DEA with rough set theory.

Journal ArticleDOI
TL;DR: The dominance principle is reformulated and the meaning of the precisiation property is extended to the considered case; a way to reduce decision tables and to induce decision rules from rough approximations is also presented.

Book ChapterDOI
01 Jan 2009
TL;DR: Multiple attribute (or multiple criteria) decision support aims at giving the decision maker (DM) a recommendation concerning a set of objects A evaluated from multiple points of view called attributes.
Abstract: Multiple attribute (or multiple criteria) decision support aims at giving the decision maker (DM) a recommendation concerning a set of objects A (also called alternatives, actions, acts, solutions, options, candidates, ...) evaluated from multiple points of view called attributes (also called features, variables, criteria, objectives, ...).

Book ChapterDOI
07 Jul 2009
TL;DR: This paper revisits the hybridization of rough sets and fuzzy sets by introducing vague quantifiers like "some" or "most" into the definition of upper and lower approximation, and develops a vaguely quantified rough set model that is closely related to Ziarko's variable precision rough set (VPRS) model.
Abstract: The hybridization of rough sets and fuzzy sets has focused on creating an end product that extends both contributing computing paradigms in a conservative way. As a result, the hybrid theory inherits their respective strengths, but also exhibits some weaknesses. In particular, although they allow for gradual membership, fuzzy rough sets are still abrupt in the sense that adding or omitting a single element may drastically alter the outcome of the approximations. In this paper, we revisit the hybridization process by introducing vague quantifiers like "some" or "most" into the definition of upper and lower approximation. The resulting vaguely quantified rough set (VQRS) model is closely related to Ziarko's variable precision rough set (VPRS) model.
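The effect of a vague quantifier can be sketched with a simple ramp-shaped fuzzy quantifier for "most"; the ramp endpoints below are illustrative assumptions, not values from the paper.

```python
# Sketch of a vaguely quantified lower approximation: membership of x
# is Q_most(fraction of x's neighbourhood inside X), a smooth
# quantifier instead of a hard "all elements must belong".

def q_most(p, lo=0.5, hi=0.9):
    """Fuzzy quantifier 'most': 0 below lo, 1 above hi, linear ramp."""
    if p <= lo:
        return 0.0
    if p >= hi:
        return 1.0
    return (p - lo) / (hi - lo)

def vqrs_lower(neighbours, X, x):
    """Membership of x in the vaguely quantified lower approximation."""
    nb = neighbours[x]
    frac = len(nb & X) / len(nb)
    return q_most(frac)
```

Removing one element from X now shifts the membership gradually (e.g. from 0.75 to 0.25 in the test below) rather than flipping it from 1 to 0, which is exactly the abruptness the VQRS model addresses.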

Journal ArticleDOI
TL;DR: Combining rough set theory, Kano's model, analytical hierarchy process (AHP), and scale method, an integrated method is proposed to obtain the final importance of CRs in PPHOQ.
Abstract: Owing to the typical vagueness or imprecision of customer requirements (CRs) in the product planning house of quality (PPHOQ), determining the final importance of CRs is very difficult. Combining rough set theory, Kano's model, the analytical hierarchy process (AHP), and the scale method, an integrated method is proposed to obtain the final importance of CRs in PPHOQ. Firstly, by using relative reduction and the relative core in rough set theory, a decision system is built to acquire CRs in PPHOQ. Secondly, based on the relative positive field in rough sets, the decision system is simplified and a corresponding new decision system is established to determine the fundamental importance ratings of CRs. Thirdly, by integrating the scale method into the AHP approach, formulas are developed for calculating the importance rating of achieving the improvement ratio of a CR's satisfaction estimation. Next, for every CR, its final importance rating is determined from a combination of its fundamental importance rating, the importance rating of achieving the improvement ratio of its satisfaction estimation, and its "sales point". Finally, a case study is provided to illustrate the effectiveness of the presented method.

Journal ArticleDOI
01 Nov 2009
TL;DR: The purpose of this paper is to further investigate the dominance-based rough set in incomplete interval-valued information systems, which contain both incomplete and imprecise evaluations of objects.
Abstract: Since preference order is a crucial feature of data concerning decision situations, the classical rough set model has been generalized by replacing the indiscernibility relation with a dominance relation. The purpose of this paper is to further investigate the dominance-based rough set in incomplete interval-valued information system, which contains both incomplete and imprecise evaluations of objects. By considering three types of unknown values in the incomplete interval-valued information system, a data complement method is used to transform the incomplete interval-valued information system into a traditional one. To generate the optimal decision rules from the incomplete interval-valued decision system, six types of relative reducts are proposed. Not only the relationships between these reducts but also the practical approaches to compute these reducts are then investigated. Some numerical examples are employed to substantiate the conceptual arguments.
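The dominance relation that replaces indiscernibility here can be sketched for criteria with preference-ordered numeric values; interval-valued and incomplete evaluations, which the paper treats, are omitted for brevity.

```python
# Dominance relation on gain-type criteria: y dominates x when y is at
# least as good as x on every criterion.

def dominates(y, x):
    return all(yv >= xv for yv, xv in zip(y, x))

def dominating_set(objects, x):
    """D^+(x): objects that dominate x (used in upward approximations)."""
    return {i for i, y in objects.items() if dominates(y, objects[x])}
```

The sets D^+(x) (and the dual dominated sets D^-(x)) play the role that equivalence classes play in the classical model when approximating upward and downward unions of decision classes.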

BookDOI
01 Jan 2009
TL;DR: Proceedings volume whose keynote talks and contributed chapters span human-computer interaction, computational techniques in biosciences, decision support and rule inference, rough and fuzzy methods, classification, pattern recognition and signal processing, computer vision, algorithmics, databases and data warehousing, and embedded systems applications.
Abstract: Keynote Talks.- Speech Man-Machine Communication.- Stochastic Effects in Signaling Pathways in Cells: Interaction between Visualization and Modeling.- Rough-Granular Computing in Human-Centric Information Processing.- Discovering Affinities between Perceptual Granules.- Human-Computer Interactions.- A Psycholinguistic Model of Man-Machine Interactions Based on Needs of Human Personality.- Adaptable Graphical User Interfaces for Player-Based Applications.- Case-Based Reasoning Model in Process of Emergency Management.- Enterprise Ontology According to Roman Ingarden Formal Ontology.- Hand Shape Recognition for Human-Computer Interaction.- System for Knowledge Mining in Data from Interactions between User and Application.- Computational Techniques in Biosciences.- Analyze of Maldi-TOF Proteomic Spectra with Usage of Mixture of Gaussian Distributions.- Energy Properties of Protein Structures in the Analysis of the Human RAB5A Cellular Activity.- Fuzzy Weighted Averaging of Biomedical Signal Using Bayesian Inference.- Fuzzy Clustering and Gene Ontology Based Decision Rules for Identification and Description of Gene Groups.- Estimation of the Number of Primordial Genes in a Compartment Model of RNA World.- Quasi Dominance Rough Set Approach in Testing for Traces of Natural Selection at Molecular Level.- Decision Support, Rule Inferrence and Representation.- The Way of Rules Representation in Composited Knowledge Bases.- Clustering of Partial Decision Rules.- Decision Trees Constructing over Multiple Data Streams.- Decision Tree Induction Methods for Distributed Environment.- Extensions of Multistage Decision Transition Systems: The Rough Set Perspective.- Emotion Recognition Based on Dynamic Ensemble Feature Selection.- Rough Fuzzy Investigations.- On Construction of Partial Association Rules with Weights.- Fuzzy Rough Entropy Clustering Algorithm Parametrization.- Data Grouping Process in Extended SQL Language Containing Fuzzy Elements.- Rough Sets in Flux: Crispings and Change.- Simplification of Neuro-Fuzzy Models.- Fuzzy Weighted Averaging Using Criterion Function Minimization.- Approximate String Matching by Fuzzy Automata.- Remark on Membership Functions in Neuro-Fuzzy Systems.- Capacity-Based Definite Rough Integral and Its Application.- Advances in Classification Methods.- Classifier Models in Intelligent CAPP Systems.- Classification Algorithms Based on Template's Decision Rules.- Fast Orthogonal Neural Network for Adaptive Fourier Amplitude Spectrum Computation in Classification Problems.- Relative Reduct-Based Selection of Features for ANN Classifier.- Enhanced Ontology Based Profile Comparison Mechanism for Better Recommendation.- Privacy Preserving Classification for Ordered Attributes.- Incorporating Detractors into SVM Classification.- Bayes Multistage Classifier and Boosted C4.5 Algorithm in Acute Abdominal Pain Diagnosis.- Pattern Recognition and Signal Processing.- Skrybot - A System for Automatic Speech Recognition of Polish Language.- Speaker Verification Based on Fuzzy Classifier.- Support Vector Classifier with Linguistic Interpretation of the Kernel Matrix in Speaker Verification.- Application of Discriminant Analysis to Distinction of Musical Instruments on the Basis of Selected Sound Parameters.- Computer Vision, Image Analysis and Virtual Reality.- Spatial Color Distribution Based Indexing and Retrieval Scheme.- Synthesis of Static Medical Images with an Active Shape Model.- New Method for Personalization of Avatar Animation.- Multidimensional Labyrinth - Multidimensional Virtual Reality.- Shape Recognition Using Partitioned Iterated Function Systems.- Computer Vision Support for the Orthodontic Diagnosis.- From Museum Exhibits to 3D Models.- Advances in Algorithmics.- A Method for Automatic Standardization of Text Attributes without Reference Data Sets.- Internal Conflict-Free Projection Sets.- The Comparison of an Adapted Evolutionary Algorithm with the Invasive Weed Optimization Algorithm Based on the Problem of Predetermining the Progress of Distributed Data Merging Process.- Cumulation of Pheromone Values in Web Searching Algorithm.- Mining for Unconnected Frequent Graphs with Direct Subgraph Isomorphism Tests.- Numerical Evaluation of the Random Walk Search Algorithm.- On Two Variants of the Longest Increasing Subsequence Problem.- Computing the Longest Common Transposition-Invariant Subsequence with GPU.- Databases and Data Warehousing.- Usage of the Universal Object Model in Database Schemas Comparison and Integration.- Computational Model for Efficient Processing of Geofield Queries.- Applying Advanced Methods of Query Selectivity Estimation in Oracle DBMS.- How to Efficiently Generate PNR Representation of a Qualitative Geofield.- RBTAT: Red-Black Table Aggregate Tree.- Performing Range Aggregate Queries in Stream Data Warehouse.- LVA-Index: An Efficient Way to Determine Nearest Neighbors.- Embedded Systems Applications.- Basic Component of Computational Intelligence for IRB-1400 Robots.- Factors Having Influence upon Efficiency of an Integrated Wired-Wireless Network.- FFT Based EMG Signals Analysis on FPGAs for Dexterous Hand Prosthesis Control.- The VHDL Implementation of Reconfigurable MIPS Processor.- Time Optimal Target Following by a Mobile Vehicle.- Improving Quality of Satellite Navigation Devices.

Journal ArticleDOI
TL;DR: This work proposes a rough set based QFD approach to manage the aforementioned imprecise design information in product development, using a novel concept known as the rough number, which is derived from the basic notions of rough sets.

Journal ArticleDOI
TL;DR: The proposed cluster validity index, based on the decision-theoretic rough set model and considering various loss functions, is shown to help determine the optimal number of clusters as well as an important parameter called the threshold in rough clustering.
Abstract: Quality of clustering is an important issue in application of clustering techniques. Most traditional cluster validity indices are geometry-based cluster quality measures. This paper proposes a cluster validity index based on the decision-theoretic rough set model by considering various loss functions. Experiments with synthetic, standard, and real-world retail data show the usefulness of the proposed validity index for the evaluation of rough and crisp clustering. The measure is shown to help determine optimal number of clusters, as well as an important parameter called threshold in rough clustering. The experiments with a promotional campaign for the retail data illustrate the ability of the proposed measure to incorporate financial considerations in evaluating quality of a clustering scheme. This ability to deal with monetary values distinguishes the proposed decision-theoretic measure from other distance-based measures. The proposed validity index can also be extended for evaluating other clustering algorithms such as fuzzy clustering.
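The decision-theoretic model underlying the index derives its acceptance and rejection thresholds from loss functions. Yao's standard threshold formulas can be sketched as follows; the symbols l_xy stand for the losses lambda of taking action x (Positive, Boundary, Negative) when the object is in state y (in the set or not).

```python
# Alpha/beta thresholds of the decision-theoretic rough set model,
# obtained by minimising expected cost over the three actions.

def dtrs_thresholds(l_pp, l_bp, l_np, l_pn, l_bn, l_nn):
    """l_{action,state}: e.g. l_pn = loss of accepting an object
    that is actually outside the set. Assumes the usual ordering
    l_pp <= l_bp < l_np and l_nn <= l_bn < l_pn."""
    alpha = (l_pn - l_bn) / ((l_pn - l_bn) + (l_bp - l_pp))
    beta = (l_bn - l_nn) / ((l_bn - l_nn) + (l_np - l_bp))
    return alpha, beta
```

Objects with conditional probability above alpha are accepted, below beta rejected, and in between deferred to the boundary, which is how loss functions (including monetary ones, as in the retail example) enter the evaluation.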

Journal ArticleDOI
TL;DR: This introduction to the R package sets is a (slightly) modified version of Meyer and Hornik (2009a), published in the Journal of Statistical Software.
Abstract: This introduction to the R package sets is a (slightly) modified version of Meyer and Hornik (2009a), published in the Journal of Statistical Software. We present data structures and algorithms for sets and some generalizations thereof (fuzzy sets, multisets, and fuzzy multisets) available for R through the sets package. Fuzzy (multi-)sets are based on dynamically bound fuzzy logic families. Further extensions include user-definable iterators and matching functions.
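Since the package itself is written in R, here is only a loose Python analogue (not the package's API) of two of the generalisations it covers, fuzzy sets and multisets:

```python
from collections import Counter

# Fuzzy sets as membership dicts with the standard max/min connectives;
# multisets via Counter, whose + and & already give multiset sum and
# intersection.

def fuzzy_union(a, b):
    return {k: max(a.get(k, 0.0), b.get(k, 0.0)) for k in a.keys() | b.keys()}

def fuzzy_intersection(a, b):
    return {k: min(a.get(k, 0.0), b.get(k, 0.0)) for k in a.keys() | b.keys()}

a = {"x": 0.7, "y": 0.2}
b = {"x": 0.4, "z": 1.0}
m1, m2 = Counter("aab"), Counter("abb")
```

In the R package these operations come from dynamically bound fuzzy logic families, so max/min is only one of the available connective choices.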

Journal ArticleDOI
TL;DR: This paper proposes a new approach based on the tolerance rough set model, which has the ability to deal with real-valued data whilst simultaneously retaining dataset semantics and describes the underlying mechanism for this new approach to utilise the information contained within the boundary region or region of uncertainty.

Journal ArticleDOI
TL;DR: The switching relation between type-2 fuzzy sets and intuitionistic fuzzy sets is defined axiomatically, and the switching results are applied to show the usefulness of the proposed method in pattern recognition and medical diagnosis reasoning.
Abstract: When dealing with vagueness, there are situations when insufficient information is available, making it impossible to evaluate membership satisfactorily. Intuitionistic fuzzy set theory is more suitable than fuzzy sets for dealing with such problems. In 1996, Atanassov proposed the mapping from intuitionistic fuzzy sets to fuzzy sets. Furthermore, intuitionistic fuzzy sets are isomorphic to interval-valued fuzzy sets, and interval-valued fuzzy sets are regarded as special cases of type-2 fuzzy sets in recent studies. However, these discussions are hard to comprehend and lack reliable applications. In this study, the advantage of type-2 fuzzy sets is exploited, and the switching relation between type-2 fuzzy sets and intuitionistic fuzzy sets is defined axiomatically. The switching results are applied to show the usefulness of the proposed method in pattern recognition and medical diagnosis reasoning.
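The isomorphism with interval-valued fuzzy sets mentioned above can be sketched via the standard correspondence (this is the textbook mapping, not the paper's axiomatic switching relation):

```python
# An intuitionistic fuzzy value (membership mu, non-membership nu,
# with mu + nu <= 1) corresponds to the interval [mu, 1 - nu], an
# interval-valued (hence special type-2) fuzzy membership.

def to_interval(mu, nu):
    if mu + nu > 1:
        raise ValueError("requires mu + nu <= 1")
    return (mu, 1.0 - nu)

def hesitancy(mu, nu):
    """pi = 1 - mu - nu: the hesitation margin, i.e. the interval width."""
    return 1.0 - mu - nu
```

The hesitation margin pi captures exactly the "insufficient information" the abstract refers to: when pi = 0 the interval collapses and the value is an ordinary fuzzy membership.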