
Showing papers in "Information Sciences in 2007"


Journal ArticleDOI
TL;DR: The basic concepts of rough set theory are presented and some rough set-based research directions and applications are pointed out, indicating that the rough set approach is fundamentally important in artificial intelligence and cognitive sciences.
Abstract: Worldwide, there has been a rapid growth in interest in rough set theory and its applications in recent years. Evidence of this can be found in the increasing number of high-quality articles on rough sets and related topics that have been published in a variety of international journals, symposia, workshops, and international conferences in recent years. In addition, many international workshops and conferences have included special sessions on the theory and applications of rough sets in their programs. Rough set theory has led to many interesting applications and extensions. It seems that the rough set approach is fundamentally important in artificial intelligence and cognitive sciences, especially in research areas such as machine learning, intelligent systems, inductive reasoning, pattern recognition, mereology, knowledge discovery, decision analysis, and expert systems. In the article, we present the basic concepts of rough set theory and point out some rough set-based research directions and applications.

2,004 citations
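
To make the lower and upper approximations at the heart of rough set theory concrete, here is a minimal Python sketch; the universe, the single condition attribute, and the target concept are invented for illustration and are not taken from the survey.

```python
# Minimal sketch of Pawlak-style rough approximations (hypothetical data).
from itertools import groupby

def partition(universe, key):
    """Group objects into equivalence classes of identical attribute values."""
    objs = sorted(universe, key=key)
    return [set(g) for _, g in groupby(objs, key=key)]

def lower_approx(classes, target):
    """Union of the equivalence classes fully contained in the target concept."""
    return set().union(*(c for c in classes if c <= target))

def upper_approx(classes, target):
    """Union of the equivalence classes that intersect the target concept."""
    return set().union(*(c for c in classes if c & target))

# Hypothetical universe described by one condition attribute ("colour").
colour = {1: "red", 2: "red", 3: "blue", 4: "blue", 5: "green"}
classes = partition(colour, key=lambda x: colour[x])
target = {1, 2, 3}                        # the concept to approximate
print(lower_approx(classes, target))      # {1, 2}: certainly in the concept
print(upper_approx(classes, target))      # {1, 2, 3, 4}: possibly in the concept
```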


Journal ArticleDOI
TL;DR: Some extensions of the rough set approach are presented and a challenge for rough set based research is outlined.
Abstract: In this article, we present some extensions of the rough set approach and we outline a challenge for the rough set based research.

1,161 citations


Journal ArticleDOI
TL;DR: The basic properties of soft sets are introduced, soft sets are compared to the related concepts of fuzzy sets and rough sets, and a definition of soft groups is given.
Abstract: Molodtsov introduced the concept of soft set theory, which can be used as a generic mathematical tool for dealing with uncertainty. In this paper we introduce the basic properties of soft sets, and compare soft sets to the related concepts of fuzzy sets and rough sets. We then give a definition of soft groups, and derive their basic properties using Molodtsov's definition of the soft sets.

1,012 citations
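
As a rough illustration of the soft set idea (a parameter-indexed family of subsets of a universe), the following Python sketch uses invented "house" data and one commonly used reading of soft intersection and the soft subset relation; it is an illustrative encoding, not Molodtsov's formal notation.

```python
# A soft set (F, A) over a universe U maps each parameter in A to a subset of U.
U = {"h1", "h2", "h3", "h4"}

# F, G: parameters -> subsets of U saying which houses satisfy each parameter.
F = {
    "expensive": {"h1", "h4"},
    "wooden":    {"h2", "h3"},
    "modern":    {"h1", "h2", "h4"},
}
G = {
    "expensive": {"h1"},
    "modern":    {"h2", "h4"},
    "large":     {"h3", "h4"},
}

def soft_intersection(F, G):
    """Intersection on the common parameters: (F ^ G)(e) = F(e) & G(e)."""
    return {e: F[e] & G[e] for e in F.keys() & G.keys()}

def is_soft_subset(F, G):
    """One common reading: every parameter of F is in G with F(e) a subset of G(e)."""
    return all(e in G and F[e] <= G[e] for e in F)

print(soft_intersection(F, G))   # {'expensive': {'h1'}, 'modern': {'h2', 'h4'}}
print(is_soft_subset(G, F))      # False ('large' is not a parameter of F)
```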


Journal ArticleDOI
TL;DR: Methods based on the combination of rough sets and Boolean reasoning with applications in pattern recognition, machine learning, data mining and conflict analysis are discussed.
Abstract: In this article, we discuss methods based on the combination of rough sets and Boolean reasoning with applications in pattern recognition, machine learning, data mining and conflict analysis.

940 citations


Journal ArticleDOI
Zeshui Xu1
TL;DR: This paper develops an approach to group decision making based on intuitionistic preference relations and an approach based on incomplete intuitionistic preference relations, in which the intuitionistic fuzzy arithmetic averaging operator and the intuitionistic fuzzy weighted arithmetic averaging operator are used to aggregate intuitionistic preference information.
Abstract: Intuitionistic fuzzy set, characterized by a membership function and a non-membership function, was introduced by Atanassov [Intuitionistic fuzzy sets, Fuzzy Sets and Systems 20 (1986) 87-96]. In this paper, we define the concepts of intuitionistic preference relation, consistent intuitionistic preference relation, incomplete intuitionistic preference relation and acceptable intuitionistic preference relation, and study their various properties. We develop an approach to group decision making based on intuitionistic preference relations and an approach to group decision making based on incomplete intuitionistic preference relations respectively, in which the intuitionistic fuzzy arithmetic averaging operator and intuitionistic fuzzy weighted arithmetic averaging operator are used to aggregate intuitionistic preference information, and the score function and accuracy function are applied to the ranking and selection of alternatives. Finally, a practical example is provided to illustrate the developed approaches.

781 citations
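
The sketch below shows the commonly used form of the intuitionistic fuzzy weighted arithmetic averaging operator together with the score and accuracy functions mentioned above; the preference values and weights are made up for illustration.

```python
# Sketch of the intuitionistic fuzzy weighted averaging (IFWA) operator and the
# score/accuracy functions, with hypothetical preference values.
from math import prod

def ifwa(values, weights):
    """Aggregate intuitionistic fuzzy values (mu, nu) with weights summing to 1."""
    mu = 1 - prod((1 - m) ** w for (m, _), w in zip(values, weights))
    nu = prod(n ** w for (_, n), w in zip(values, weights))
    return mu, nu

def score(alpha):      # higher is better
    mu, nu = alpha
    return mu - nu

def accuracy(alpha):   # used to break ties in the score
    mu, nu = alpha
    return mu + nu

prefs = [(0.6, 0.2), (0.5, 0.4), (0.7, 0.1)]   # three experts' evaluations
weights = [0.5, 0.3, 0.2]
agg = ifwa(prefs, weights)
print(agg, score(agg), accuracy(agg))
```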


Journal ArticleDOI
TL;DR: A process for quantitative SWOT analysis that can be performed even when there is dependence among strategic factors is demonstrated, which uses the analytic network process (ANP), which allows measurement of the dependency among the strategic factors, as well as AHP, which is based on the independence between the factors.
Abstract: Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis does not provide an analytical means to determine the importance of the identified factors or the ability to assess decision alternatives according to these factors. Although the analysis successfully pinpoints the factors, individual factors are usually described briefly and very generally. For this reason, SWOT analysis possesses deficiencies in the measurement and evaluation steps. Although the analytic hierarchy process (AHP) technique removes these deficiencies, it does not allow for measurement of the possible dependencies among the factors. The AHP method assumes that the factors presented in the hierarchical structure are independent; however, this assumption may be inappropriate in light of certain internal and external environmental effects. Therefore, it is necessary to employ a form of SWOT analysis that measures and takes into account the possible dependency among the factors. This paper demonstrates a process for quantitative SWOT analysis that can be performed even when there is dependence among strategic factors. The proposed algorithm uses the analytic network process (ANP), which allows measurement of the dependency among the strategic factors, as well as AHP, which is based on the independence between the factors. Dependency among the SWOT factors is observed to affect the strategic and sub-factor weights, as well as to change the strategy priorities.

635 citations
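
For context, both AHP and ANP start from pairwise comparison matrices whose principal eigenvector yields local weights; in ANP these local vectors are then assembled into a supermatrix that captures dependencies among factors. The sketch below derives such a priority vector and a consistency index from an invented 3x3 comparison matrix, not the paper's SWOT data.

```python
# Deriving factor weights from a pairwise comparison matrix, the basic building
# block shared by AHP and ANP (illustrative 3x3 matrix on Saaty's 1-9 scale).
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],   # factor 1 compared with factors 1..3
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

def priority_vector(A, iters=100):
    """Principal eigenvector by power iteration, normalized to sum to 1."""
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):
        w = A @ w
        w /= w.sum()
    return w

w = priority_vector(A)
lam_max = (A @ w / w).mean()                      # estimate of the principal eigenvalue
ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)    # consistency index
print(w, ci)
```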


Journal ArticleDOI
TL;DR: In this state-of-the-art paper, important advances that have been made during the past five years for both general and interval type-2 fuzzy sets and systems are described.
Abstract: In this state-of-the-art paper, important advances that have been made during the past five years for both general and interval type-2 fuzzy sets and systems are described. Interest in type-2 subjects is worldwide and touches on a broad range of applications and many interesting theoretical topics. The main focus of this paper is on the theoretical topics, with descriptions of what they are, what has been accomplished, and what remains to be done.

614 citations


Journal ArticleDOI
TL;DR: This paper explores the topological properties of covering-based rough sets, studies the interdependency between the lower and the upper approximation operations, and establishes the conditions under which two coverings generate the same lower approximation operation and the same upper approximation operation.
Abstract: Rough sets, a tool for data mining, deal with the vagueness and granularity in information systems. This paper studies covering-based rough sets from the topological view. We explore the topological properties of this type of rough sets, study the interdependency between the lower and the upper approximation operations, and establish the conditions under which two coverings generate the same lower approximation operation and the same upper approximation operation. Lastly, axiomatic systems for the lower approximation operation and the upper approximation operation are constructed.

588 citations
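
Covering-based rough sets replace the partition of classical rough sets by a covering whose blocks may overlap. The sketch below implements one commonly studied pair of covering approximation operators on an invented covering; the paper investigates several such operators and their topological properties, not necessarily this exact pair.

```python
# One commonly studied pair of covering-based approximation operators:
# lower(X) = union of covering blocks contained in X,
# upper(X) = union of covering blocks intersecting X (hypothetical data).
def cover_lower(cover, X):
    return set().union(*(K for K in cover if K <= X))

def cover_upper(cover, X):
    return set().union(*(K for K in cover if K & X))

U = {1, 2, 3, 4, 5}
cover = [{1, 2}, {2, 3}, {3, 4, 5}, {5}]   # blocks may overlap, unlike a partition
X = {1, 2, 3}
print(cover_lower(cover, X))   # {1, 2, 3}
print(cover_upper(cover, X))   # {1, 2, 3, 4, 5}
```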


Journal ArticleDOI
TL;DR: A new diversity parameter has been used to ensure sufficient diversity amongst the solutions of the non-dominated fronts, while retaining at the same time the convergence to the Pareto-optimal front.
Abstract: In this article we describe a novel Particle Swarm Optimization (PSO) approach to multi-objective optimization (MOO), called Time Variant Multi-Objective Particle Swarm Optimization (TV-MOPSO). TV-MOPSO is made adaptive in nature by allowing its vital parameters (viz., inertia weight and acceleration coefficients) to change with iterations. This adaptiveness helps the algorithm to explore the search space more efficiently. A new diversity parameter has been used to ensure sufficient diversity amongst the solutions of the non-dominated fronts, while retaining at the same time the convergence to the Pareto-optimal front. TV-MOPSO has been compared with some recently developed multi-objective PSO techniques and evolutionary algorithms for 11 function optimization problems, using different performance measures.

482 citations
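
The time-variant idea behind TV-MOPSO is to let the inertia weight and acceleration coefficients change with the iteration count. The sketch below shows a generic linear schedule and a single PSO velocity update; the schedule end-points are standard PSO values rather than the paper's tuned settings, and the multi-objective archive and diversity handling are omitted.

```python
# Time-variant PSO parameters plus one velocity update (illustrative values only).
import random

def tv_params(t, t_max, w_range=(0.9, 0.4), c1_range=(2.5, 0.5), c2_range=(0.5, 2.5)):
    frac = t / t_max
    w  = w_range[0]  + (w_range[1]  - w_range[0])  * frac   # exploration -> exploitation
    c1 = c1_range[0] + (c1_range[1] - c1_range[0]) * frac   # cognitive component shrinks
    c2 = c2_range[0] + (c2_range[1] - c2_range[0]) * frac   # social component grows
    return w, c1, c2

def velocity_update(v, x, pbest, gbest, w, c1, c2):
    r1, r2 = random.random(), random.random()
    return [w*vi + c1*r1*(pb - xi) + c2*r2*(gb - xi)
            for vi, xi, pb, gb in zip(v, x, pbest, gbest)]

w, c1, c2 = tv_params(t=50, t_max=200)
print(velocity_update([0.1, -0.2], [1.0, 2.0], [0.8, 1.5], [0.5, 1.0], w, c1, c2))
```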


Journal ArticleDOI
TL;DR: A hybrid approach involving genetic algorithms (GA) and bacterial foraging algorithms for function optimization problems and results clearly illustrate that the proposed approach is very efficient and could easily be extended for other global optimization problems.
Abstract: The social foraging behavior of Escherichia coli bacteria has been used to solve optimization problems. This paper proposes a hybrid approach involving genetic algorithms (GA) and bacterial foraging (BF) algorithms for function optimization problems. We first illustrate the proposed method using four test functions and the performance of the algorithm is studied with an emphasis on mutation, crossover, variation of step sizes, chemotactic steps, and the lifetime of the bacteria. The proposed algorithm is then used to tune a PID controller of an automatic voltage regulator (AVR). Simulation results clearly illustrate that the proposed approach is very efficient and could easily be extended for other global optimization problems.

468 citations


Journal ArticleDOI
TL;DR: This work interprets a fuzzy differential equation by using the strongly generalized differentiability concept, and finds solutions which have a decreasing length of their support and which, in several applications, better reflect the behaviour of some real-world systems.
Abstract: First order linear fuzzy differential equations are investigated. We interpret a fuzzy differential equation by using the strongly generalized differentiability concept, because under this interpretation, we may obtain solutions which have a decreasing length of their support (which means a decreasing uncertainty). In several applications the behaviour of these solutions better reflects the behaviour of some real-world systems. Derivatives of the H-difference and the product of two functions are obtained and we provide solutions of first order linear fuzzy differential equations, using different versions of the variation of constants formula. Some examples show the rich behaviour of the solutions obtained.

Journal ArticleDOI
TL;DR: This paper studies arbitrary binary relation based generalized rough sets, in which a binary relation can generate a lower approximation operation and an upper approximation operation, but some of the common properties of classical lower and upper approximation operations are no longer satisfied.
Abstract: Rough set theory has been proposed by Pawlak as a tool for dealing with the vagueness and granularity in information systems. The core concepts of classical rough sets are lower and upper approximations based on equivalence relations. This paper studies arbitrary binary relation based generalized rough sets. In this setting, a binary relation can generate a lower approximation operation and an upper approximation operation, but some of the common properties of classical lower and upper approximation operations are no longer satisfied. We investigate conditions for a relation under which these properties hold for the relation based lower and upper approximation operations.

Journal ArticleDOI
TL;DR: A new SVM approach is proposed, named Enhanced SVM, which combines these two methods in order to provide unsupervised learning and low false alarm capability, similar to that of a supervised S VM approach.
Abstract: Zero-day cyber attacks such as worms and spy-ware are becoming increasingly widespread and dangerous. The existing signature-based intrusion detection mechanisms are often not sufficient in detecting these types of attacks. As a result, anomaly intrusion detection methods have been developed to cope with such attacks. Among the variety of anomaly detection approaches, the Support Vector Machine (SVM) is known to be one of the best machine learning algorithms to classify abnormal behaviors. The soft-margin SVM is one of the well-known basic SVM methods using supervised learning. However, it is not appropriate to use the soft-margin SVM method for detecting novel attacks in Internet traffic since it requires pre-acquired learning information for the supervised learning procedure. Such pre-acquired learning information is divided into normal and attack traffic with separate labels. Furthermore, we apply the one-class SVM approach using unsupervised learning for detecting anomalies, which means the one-class SVM does not require labeled information. However, there is a downside to using the one-class SVM: it is difficult to use in the real world, due to its high false positive rate. In this paper, we propose a new SVM approach, named Enhanced SVM, which combines these two methods in order to provide unsupervised learning and low false alarm capability, similar to that of a supervised SVM approach. We use the following additional techniques to improve the performance of the proposed approach (referred to as Anomaly Detector using Enhanced SVM): First, we create a profile of normal packets using the Self-Organized Feature Map (SOFM), for SVM learning without pre-existing knowledge. Second, we use a packet filtering scheme based on Passive TCP/IP Fingerprinting (PTF), in order to reject incomplete network traffic that violates either the TCP/IP standard or the generation policy inside well-known platforms. Third, a feature selection technique using a Genetic Algorithm (GA) is used for extracting optimized information from raw Internet packets. Fourth, we use the flow of packets based on temporal relationships during data preprocessing, to account for the temporal relationships among the inputs used in SVM learning. Lastly, we demonstrate the effectiveness of the Enhanced SVM approach using the above-mentioned techniques (SOFM, PTF, and GA) on the MIT Lincoln Lab datasets and a live dataset captured from a real network. The experimental results are verified by m-fold cross validation, and the proposed approach is compared with real-world Network Intrusion Detection Systems (NIDS).
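
The unsupervised one-class SVM stage discussed above can be sketched with scikit-learn as follows; the synthetic "traffic features" are invented, and the Enhanced SVM combination, SOFM profiling, PTF filtering, and GA feature selection from the paper are not reproduced here.

```python
# One-class SVM anomaly detection on synthetic feature vectors (illustration only).
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
normal_train = rng.normal(loc=0.0, scale=1.0, size=(500, 10))   # unlabeled "normal" traffic
test_normal  = rng.normal(loc=0.0, scale=1.0, size=(50, 10))
test_attack  = rng.normal(loc=4.0, scale=1.0, size=(50, 10))    # shifted distribution

clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)   # nu bounds the outlier fraction
clf.fit(normal_train)

pred_normal = clf.predict(test_normal)    # +1 = inlier, -1 = anomaly
pred_attack = clf.predict(test_attack)
print("false alarms:", (pred_normal == -1).mean())
print("detections:  ", (pred_attack == -1).mean())
```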

Journal ArticleDOI
TL;DR: Formulas for computing the cardinality, fuzziness, variance and skewness of an IT2FS are derived and should be useful in IT2 fuzzy logic systems design using the principles of uncertainty, and in measuring the similarity between two IT2 FSs.
Abstract: Fuzziness (entropy) is a commonly used measure of uncertainty for type-1 fuzzy sets. For interval type-2 fuzzy sets (IT2 FSs), centroid, cardinality, fuzziness, variance and skewness are all measures of uncertainties. The centroid of an IT2 FS has been defined by Karnik and Mendel. In this paper, the other four concepts are defined. All definitions use a Representation Theorem for IT2 FSs. Formulas for computing the cardinality, fuzziness, variance and skewness of an IT2 FS are derived. These definitions should be useful in IT2 fuzzy logic systems design using the principles of uncertainty, and in measuring the similarity between two IT2 FSs.
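
As a small illustration of interval-valued uncertainty measures for an IT2 FS, the sketch below discretizes an invented footprint of uncertainty and computes a cardinality interval from its lower and upper membership functions; the paper's formal definitions via the Representation Theorem are more general than this simple bound.

```python
# Interval-valued cardinality of an interval type-2 fuzzy set via discretized
# lower/upper membership functions (the FOU parameters are made up).
import numpy as np

x = np.linspace(0.0, 10.0, 1001)
dx = x[1] - x[0]

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

umf = tri(x, 2.0, 5.0, 8.0)          # upper membership function of the FOU
lmf = 0.6 * tri(x, 3.0, 5.0, 7.0)    # lower membership function (inside the FOU)

# Every embedded type-1 set lies between LMF and UMF, so its (discretized)
# cardinality lies in the interval below.
card_low  = lmf.sum() * dx
card_high = umf.sum() * dx
print((card_low, card_high))
```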

Journal ArticleDOI
TL;DR: This paper proposes that an interval type-2 fuzzy set (IT2 FS) be used as a FS model of a word, because it is characterized by its footprint of uncertainty (FOU), and therefore has the potential to capture word uncertainties.
Abstract: Words mean different things to different people, and so are uncertain. We, therefore, need a fuzzy set model for a word that has the potential to capture their uncertainties. In this paper I propose that an interval type-2 fuzzy set (IT2 FS) be used as a FS model of a word, because it is characterized by its footprint of uncertainty (FOU), and therefore has the potential to capture word uncertainties. Two approaches are presented for collecting data about a word from a group of subjects and then mapping that data into a FOU for that word. The person MF approach, in which each person provides their FOU for a word, is limited to fuzzy set experts because it requires the subject to be knowledgeable about fuzzy sets. The interval end-points approach, in which each person provides the end-points for an interval that they associate with a word on a prescribed scale, is not limited to fuzzy set experts. Both approaches map data collected from subjects into a parsimonious parametric model of a FOU, and illustrate the combining of fuzzy sets and statistics: type-2 fuzzistics.

Journal ArticleDOI
TL;DR: A lossless and reversible steganography scheme for hiding secret data in each block of quantized discrete cosine transformation (DCT) coefficients in JPEG images that provides acceptable stego-image quality and successfully achieves reversibility.
Abstract: This paper presents a lossless and reversible steganography scheme for hiding secret data in each block of quantized discrete cosine transformation (DCT) coefficients in JPEG images. In this scheme, the two successive zero coefficients of the medium-frequency components in each block are used to hide the secret data. Furthermore, the scheme modifies the quantization table to maintain the quality of the stego-image. Experimental results also confirm that the proposed scheme can provide expected acceptable image quality of stego-images and successfully achieve reversibility.
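
The following toy sketch illustrates only the core idea of hiding data in successive zero-valued quantized DCT coefficients of an 8x8 block; the zig-zag band and the 0/1 encoding below are arbitrary choices, and the paper's reversibility mechanism and quantization-table modification are not reproduced.

```python
# Toy embedding of one bit into a pair of successive zero DCT coefficients.
import numpy as np

ZIGZAG = [(0,0),(0,1),(1,0),(2,0),(1,1),(0,2),(0,3),(1,2),(2,1),(3,0),
          (4,0),(3,1),(2,2),(1,3),(0,4),(0,5),(1,4),(2,3),(3,2),(4,1),(5,0)]
BAND = ZIGZAG[9:21]   # "medium-frequency" positions, chosen arbitrarily here

def embed_bit(block, bit):
    """Find the first pair of successive zeros in the band; encode 1 as (0, 1)."""
    for p, q in zip(BAND, BAND[1:]):
        if block[p] == 0 and block[q] == 0:
            if bit:
                block[q] = 1
            return block
    return block   # no capacity in this block

def extract_bit(block):
    for p, q in zip(BAND, BAND[1:]):
        if block[p] == 0 and block[q] in (0, 1):
            return int(block[q] == 1)
    return None

block = np.zeros((8, 8), dtype=int)
block[0, 0] = 52          # DC coefficient; AC band left at zero for this toy example
stego = embed_bit(block.copy(), 1)
print(extract_bit(stego))  # 1
```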

Journal ArticleDOI
TL;DR: Numerical tests show that the proposed attribute reductions of covering decision systems accomplish better classification performance than those of traditional rough sets.
Abstract: Traditional rough set theory is mainly used to extract rules from and reduce attributes in databases in which attributes are characterized by partitions, while the covering rough set theory, a generalization of traditional rough set theory, does the same yet characterizes attributes by covers. In this paper, we propose a way to reduce the attributes of covering decision systems, which are databases characterized by covers. First, we define consistent and inconsistent covering decision systems and their attribute reductions. Then, we state the sufficient and the necessary conditions for reduction. Finally, we use a discernibility matrix to design algorithms that compute all the reducts of consistent and inconsistent covering decision systems. Numerical tests on four public data sets show that the proposed attribute reductions of covering decision systems accomplish better classification performance than those of traditional rough sets.

Journal ArticleDOI
TL;DR: Three numerical methods to solve fuzzy ordinary differential equations are discussed, and a predictor-corrector method is obtained by combining the Adams-Bashforth and Adams-Moulton methods.
Abstract: In this paper three numerical methods to solve fuzzy ordinary differential equations are discussed. These methods are Adams-Bashforth, Adams-Moulton and predictor-corrector. The predictor-corrector method is obtained by combining the Adams-Bashforth and Adams-Moulton methods. Convergence and stability of the proposed methods are also proved in detail. In addition, these methods are illustrated by solving two fuzzy Cauchy problems.
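
A two-step Adams-Bashforth predictor with an Adams-Moulton (trapezoidal) corrector can be sketched on the alpha-cut endpoints of a simple fuzzy test problem as follows; the equation y' = y, the alpha level, the fuzzy initial value, and the step size are chosen only for illustration and are not the paper's Cauchy problems.

```python
# AB2 predictor / AM2 corrector applied to the alpha-cut endpoints of y' = y,
# y(0) = "about 1" (for this right-hand side the endpoint equations are l' = l,
# u' = u under the usual interval arithmetic).
def f(t, y):
    return y

def pece(f, y0, t0, t1, h):
    """Two-step Adams-Bashforth predictor with trapezoidal corrector; Euler bootstrap."""
    ts, ys = [t0], [y0]
    ys.append(ys[-1] + h * f(ts[-1], ys[-1]))    # one Euler step to start AB2
    ts.append(ts[-1] + h)
    while ts[-1] < t1 - 1e-12:
        fp, fc = f(ts[-2], ys[-2]), f(ts[-1], ys[-1])
        y_pred = ys[-1] + h * (1.5 * fc - 0.5 * fp)                # AB2 predictor
        y_corr = ys[-1] + h / 2 * (fc + f(ts[-1] + h, y_pred))     # AM2 corrector
        ys.append(y_corr)
        ts.append(ts[-1] + h)
    return ts, ys

alpha = 0.5
lower0, upper0 = 0.75 + 0.25 * alpha, 1.25 - 0.25 * alpha   # alpha-cut of the initial value
_, lo = pece(f, lower0, 0.0, 1.0, 0.05)
_, hi = pece(f, upper0, 0.0, 1.0, 0.05)
print(lo[-1], hi[-1])    # approximates the alpha-cut [0.875*e, 1.125*e] at t = 1
```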

Journal ArticleDOI
TL;DR: This work tackles the Project Scheduling Problem by using genetic algorithms (GAs) to solve many different software project scenarios and shows that GAs are quite flexible and accurate for this application, and an important tool for automatic project management.
Abstract: A Project Scheduling Problem consists in deciding who does what during the software project lifetime. This is a capital issue in the practice of software engineering, since the total budget and human resources involved must be managed optimally in order to end in a successful project. In short, companies are principally concerned with reducing the duration and cost of projects, and these two goals are in conflict with each other. In this work we tackle the problem by using genetic algorithms (GAs) to solve many different software project scenarios. Thanks to our newly developed instance generator we can perform structured studies on the influence the most important problem attributes have on the solutions. Our conclusions show that GAs are quite flexible and accurate for this application, and an important tool for automatic project management.

Journal ArticleDOI
TL;DR: A fuzzy integrated multi-period and multi-product production and distribution model in supply chain is developed in terms of fuzzy programming and the solution is provided by genetic optimization (genetic algorithm).
Abstract: Aggregate production-distribution planning (APDP) is one of the most important activities in supply chain management (SCM). When solving the problem of APDP, we are usually faced with uncertain market demands and capacities in production environment, imprecise process times, and other factors introducing inherent uncertainty to the solution. Using deterministic and stochastic models in such conditions may not lead to fully satisfactory results. Using fuzzy models allows us to remove this drawback. It also facilitates the inclusion of expert knowledge. However, the majority of existing fuzzy models deal only with separate aggregate production planning without taking into account the interrelated nature of production and distribution systems. This limited approach often leads to inadequate results. An integration of the two interconnected processes within a single production-distribution model would allow better planning and management. Due to the need for a joint general strategic plan for production and distribution and vague planning data, in this paper we develop a fuzzy integrated multi-period and multi-product production and distribution model in supply chain. The model is formulated in terms of fuzzy programming and the solution is provided by genetic optimization (genetic algorithm). The use of the interactive aggregate production-distribution planning procedure developed on the basis of the proposed fuzzy integrated model with fuzzy objective function and soft constraints allows sound trade-off between the maximization of profit and fillrate. The experimental results demonstrate high efficiency of the proposed method.

Journal ArticleDOI
TL;DR: This paper deals with the design of control systems using type-2 fuzzy logic for minimizing the effects of uncertainty produced by the instrumentation elements, environmental noise, etc.
Abstract: Uncertainty is an inherent part of control systems used in real world applications. The use of new methods for handling incomplete information is of fundamental importance. Type-1 fuzzy sets used in conventional fuzzy systems cannot fully handle the uncertainties present in control systems. Type-2 fuzzy sets that are used in type-2 fuzzy systems can handle such uncertainties in a better way because they provide us with more parameters and more design degrees of freedom. This paper deals with the design of control systems using type-2 fuzzy logic for minimizing the effects of uncertainty produced by the instrumentation elements, environmental noise, etc. The experimental results are divided into two classes: in the first class, simulations of a feedback control system for a non-linear plant using type-1 and type-2 fuzzy logic controllers are presented, and a comparative analysis of the systems' response in both cases was performed, with and without the presence of uncertainty. For the second class, a non-linear identification problem for time-series prediction is presented. Based on the experimental results, the conclusion is that the best results are obtained using type-2 fuzzy systems.

Journal ArticleDOI
TL;DR: An interactive method for multiple attribute group decision making under fuzzy environment that can not only reflect the importance of the given arguments and the ordered positions of the arguments, but also relieve the influence of unfair arguments on the decision result.
Abstract: In this paper, we develop an interactive method for multiple attribute group decision making under fuzzy environment. The method can be used in situations where the information about attribute weights is partly known, the weights of decision makers are expressed in exact numerical values or triangular fuzzy numbers, and the attribute values are triangular fuzzy numbers. The method transforms fuzzy decision matrices into their expected decision matrices, constructs the corresponding normalized expected decision matrices by two simple formulas, and then aggregates these normalized expected decision matrices into a complex decision matrix. Moreover, the decision makers are asked to provide their preferences gradually in the course of interactions. By solving linear programming models, the method diminishes the given alternative set gradually, and finally finds the most preferred alternative. By using the method, the decision makers can provide and modify their preference information gradually in the process of decision making so as to make the decision result more reasonable. The method can not only reflect the importance of the given arguments and the ordered positions of the arguments, but also relieve the influence of unfair arguments on the decision result. Finally, a practical problem is used to illustrate the developed method.

Journal ArticleDOI
TL;DR: This paper discusses the mathematical relationship between intuitionistic fuzzy sets and other models of imprecision.
Abstract: Intuitionistic fuzzy sets [K.T. Atanassov, Intuitionistic fuzzy sets, VII ITKR's Session, Sofia (deposed in Central Science-Technical Library of Bulgarian Academy of Science, 1697/84), 1983 (in Bulgarian)] are an extension of fuzzy set theory in which not only a membership degree is given, but also a non-membership degree, which is more or less independent. Considering the increasing interest in intuitionistic fuzzy sets, it is useful to determine the position of intuitionistic fuzzy set theory in the framework of the different theories modelling imprecision. In this paper we discuss the mathematical relationship between intuitionistic fuzzy sets and other models of imprecision.

Journal ArticleDOI
TL;DR: An adaptive fuzzy control approach is proposed for a class of multiple-input-multiple-output (MIMO) nonlinear systems with completely unknown nonaffine functions by introducing some special type Lyapunov functions and taking advantage of the mean-value theorem, the backstepping design method and the approximation property of the fuzzy systems.
Abstract: An adaptive fuzzy control approach is proposed for a class of multiple-input-multiple-output (MIMO) nonlinear systems with completely unknown nonaffine functions. The MIMO systems are composed of n subsystems and each of the subsystems is in the nested lower triangular form. It is difficult and complicated to control this class of systems due to the existence of unknown nonaffine functions and the couplings among the nested subsystems. This difficulty is overcome by introducing some special type Lyapunov functions and taking advantage of the mean-value theorem, the backstepping design method and the approximation property of the fuzzy systems. The proposed control approach can guarantee that all the signals in the closed-loop system are bounded. A simulation experiment is utilized to verify the feasibility of the proposed approach.

Journal ArticleDOI
TL;DR: The results show that the combined utilization of SVD with demographic data is promising, since it not only tackles some of the recorded problems of Recommender Systems, but also assists in increasing the accuracy of systems employing it.
Abstract: In this paper we examine how Singular Value Decomposition (SVD) along with demographic information can enhance plain Collaborative Filtering (CF) algorithms. After a brief introduction to SVD, where some of its previous applications in Recommender Systems are revisited, we proceed with a full description of our proposed method, which utilizes SVD and demographic data at various points of the filtering procedure in order to improve the quality of the generated predictions. We test the efficiency of the resulting approach on two commonly used CF approaches (User-based and Item-based CF). The experimental part of this work involves a number of variations of the proposed approach. The results show that the combined utilization of SVD with demographic data is promising, since it not only tackles some of the recorded problems of Recommender Systems, but also assists in increasing the accuracy of systems employing it.
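
The plain SVD step underlying such approaches can be sketched as follows: fill the sparse user-item matrix, keep the top-k singular values, and read predictions off the low-rank reconstruction. The ratings below are invented and the paper's demographic enhancement is not shown.

```python
# Rank-k SVD reconstruction of a small user-item rating matrix (toy data).
import numpy as np

R = np.array([                      # rows = users, columns = items, 0 = unrated
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 1, 5, 4],
], dtype=float)

# Impute missing entries with the item means before factorizing.
item_means = np.true_divide(R.sum(0), (R != 0).sum(0))
filled = np.where(R == 0, item_means, R)

U, s, Vt = np.linalg.svd(filled, full_matrices=False)
k = 2                                               # number of latent factors kept
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]      # rank-k reconstruction

user, item = 1, 2                                   # predict an unseen rating
print(round(float(approx[user, item]), 2))
```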

Journal ArticleDOI
TL;DR: This paper provides new definitions of fuzzy lower and upper approximations by considering the similarity between the two objects and proposes a heuristic algorithm to learn fuzzy rules from initial fuzzy data.
Abstract: Although the traditional rough set theory has been a powerful mathematical tool for modeling incompleteness and vagueness, its performance in dealing with initial fuzzy data is usually poor. This paper makes an attempt to improve its performance by extending the traditional rough set approach to the fuzzy environment. The extension is twofold. One is knowledge representation and the other is knowledge reduction. First, we provide new definitions of fuzzy lower and upper approximations by considering the similarity between the two objects. Second, we extend a number of underlying concepts of knowledge reduction (such as the reduct and core) to the fuzzy environment and use these extensions to propose a heuristic algorithm to learn fuzzy rules from initial fuzzy data. Finally, we provide some numerical experiments to demonstrate the feasibility of the proposed algorithm. One of the main contributions of this paper is that the fundamental relationship between the reducts and core of rough sets is still pertinent after the proposed extension.

Journal ArticleDOI
TL;DR: This paper identifies a set of QoS metrics in the context of WS workflows, and proposes a unified probabilistic model for describing QoS values of a broader spectrum of atomic and composite Web services.
Abstract: Web services promise to become a key enabling technology for B2B e-commerce. One of the most-touted features of Web services is their capability to recursively construct a Web service as a workflow of other existing Web services. The quality of service (QoS) of Web-services-based workflows may be an essential determinant when selecting constituent Web services and determining the service-level agreement with users. To make such a selection possible, it is essential to estimate the QoS of a WS workflow based on the QoSs of its constituent WSs. In the context of WS workflow, this estimation can be made by a method called QoS aggregation. While most of the existing work on QoS aggregation treats the QoS as a deterministic value, we argue that due to some uncertainty related to a WS, it is more realistic to model its QoS as a random variable, and estimate the QoS of a WS workflow probabilistically. In this paper, we identify a set of QoS metrics in the context of WS workflows, and propose a unified probabilistic model for describing QoS values of a broader spectrum of atomic and composite Web services. Emulation data are used to demonstrate the efficiency and accuracy of the proposed approach.
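
Treating per-service QoS as a random variable, the aggregate QoS of a small workflow can be estimated by Monte Carlo simulation as in the sketch below; the response-time distributions and the tiny sequence/parallel workflow are invented, and the paper's own probabilistic aggregation model is more general than this simulation.

```python
# Monte Carlo aggregation of response time over sequence(A, parallel(B, C)).
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

svc_a = rng.lognormal(mean=3.0, sigma=0.3, size=N)   # ms, invoked first
svc_b = rng.lognormal(mean=3.2, sigma=0.5, size=N)   # invoked in parallel with C
svc_c = rng.lognormal(mean=2.8, sigma=0.4, size=N)

workflow = svc_a + np.maximum(svc_b, svc_c)          # sequential: add; parallel: max

print("mean response time:", workflow.mean())
print("95th percentile:   ", np.percentile(workflow, 95))
```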

Journal ArticleDOI
TL;DR: The new model for fuzzy rough sets is based on the concepts of both fuzzy covering and binary fuzzy logical operators (fuzzy conjunction and fuzzy implication) and a link between the generalized fuzzy rough approximation operators and fundamental morphological operators is presented in a translation-invariant additive group.
Abstract: This paper proposes an approach to fuzzy rough sets in the framework of lattice theory. The new model for fuzzy rough sets is based on the concepts of both fuzzy covering and binary fuzzy logical operators (fuzzy conjunction and fuzzy implication). The conjunction and implication are connected by using the complete lattice-based adjunction theory. With this theory, fuzzy rough approximation operators are generalized and fundamental properties of these operators are investigated. Particularly, comparative studies of the generalized fuzzy rough sets to the classical fuzzy rough sets and Pawlak rough set are carried out. It is shown that the generalized fuzzy rough sets are an extension of the classical fuzzy rough sets as well as a fuzzification of the Pawlak rough set within the framework of complete lattices. A link between the generalized fuzzy rough approximation operators and fundamental morphological operators is presented in a translation-invariant additive group.

Journal ArticleDOI
TL;DR: The central idea of the jittered ensemble is to add noise to the input data, thus augmenting the original training data set to form models based on different but related training samples; the method is shown to consistently outperform the single modeling approach across a variety of time series processes.
Abstract: Improving forecasting especially time series forecasting accuracy is an important yet often difficult task facing decision makers in many areas. Combining multiple models can be an effective way to improve forecasting performance. Recently, considerable research has been taken in neural network ensembles. Most of the work, however, is devoted to the classification type of problems. As time series problems are often more difficult to model due to issues such as autocorrelation and single realization at any particular time point, more research is needed in this area. In this paper, we propose a jittered ensemble method for time series forecasting and test its effectiveness with both simulated and real time series. The central idea of the jittered ensemble is to add noise to the input data, thus augmenting the original training data set to form models based on different but related training samples. Our results show that the proposed method is able to consistently outperform the single modeling approach with a variety of time series processes. We also find that relatively small ensemble sizes of 5 and 10 are quite effective in improving forecasting performance.
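
The jittering idea itself is easy to sketch: perturb the training inputs with small Gaussian noise, fit one model per perturbed copy, and average the forecasts. The example below uses a linear autoregression in place of the neural networks studied in the paper, with an invented series, lag order, ensemble size, and noise level.

```python
# Jittered ensemble for one-step-ahead time series forecasting (toy example).
import numpy as np

rng = np.random.default_rng(42)
series = np.sin(np.arange(300) * 0.2) + rng.normal(0, 0.1, 300)   # synthetic series

def make_lagged(y, p=4):
    """Rows are [y_t, ..., y_{t+p-1}], targets are y_{t+p}."""
    X = np.column_stack([y[i:len(y) - p + i] for i in range(p)])
    return X, y[p:]

X, y = make_lagged(series)
ensemble_size, noise_scale = 10, 0.05
coefs = []
for _ in range(ensemble_size):
    Xj = X + rng.normal(0, noise_scale, X.shape)      # jitter only the inputs
    beta, *_ = np.linalg.lstsq(np.c_[np.ones(len(Xj)), Xj], y, rcond=None)
    coefs.append(beta)

x_new = np.r_[1.0, series[-4:]]                       # intercept + last 4 observations
forecasts = [x_new @ b for b in coefs]
print(np.mean(forecasts))                             # averaged ensemble forecast
```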

Journal ArticleDOI
TL;DR: The AdROSA system for automatic web banner personalization, which integrates web usage and content mining techniques to reduce user input and to respect users' privacy, is presented in the paper.
Abstract: One of the greatest and most recent challenges for online advertising is the use of adaptive personalization at the same time that the Internet continues to grow as a global market. Most existing solutions to online advertising placement are based on demographic targeting or on information gained directly from the user. The AdROSA system for automatic web banner personalization, which integrates web usage and content mining techniques to reduce user input and to respect users' privacy, is presented in the paper. Furthermore, certain advertising policies, important factors for both publishers and advertisers, are taken into consideration. The integration of all the relevant information is accomplished in one vector space to enable online and fully personalized advertising.