
Showing papers in "International Journal of Computers Communications & Control in 2011"


Journal ArticleDOI
TL;DR: A novel Energy Efficient Clustering and Data Aggregation (EECDA) protocol is proposed for the heterogeneous WSNs which combines the ideas of energy efficient cluster based routing and data aggregation to achieve a better performance in terms of lifetime and stability.
Abstract: In recent years, energy efficiency and data gathering have been major concerns in many applications of Wireless Sensor Networks (WSNs). One of the important issues in WSNs is how to reduce energy consumption so as to prolong the network lifetime. For this purpose, many novel techniques are required to improve the energy efficiency and lifetime of the network. In this paper, we propose a novel Energy Efficient Clustering and Data Aggregation (EECDA) protocol for heterogeneous WSNs which combines the ideas of energy-efficient cluster-based routing and data aggregation to achieve better performance in terms of lifetime and stability. The EECDA protocol includes a novel cluster-head election technique and selects the path with the maximum sum of residual energy for data transmission, instead of the path with minimum energy consumption. Simulation results show that EECDA balances the energy consumption and prolongs the network lifetime by a factor of 51%, 35% and 10% when compared with Low-Energy Adaptive Clustering Hierarchy (LEACH), Energy Efficient Hierarchical Clustering Algorithm (EEHCA) and Effective Data Gathering Algorithm (EDGA), respectively.
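The route-selection rule the abstract describes can be sketched as follows; this is a minimal reading of the abstract, not the authors' code, and the path and energy data are invented:

```python
def select_path(paths, residual_energy):
    """paths: list of node-id lists; residual_energy: node-id -> remaining joules."""
    # EECDA-style rule: prefer the path whose nodes hold the most residual
    # energy in total, not the path that costs the least to transmit on.
    return max(paths, key=lambda path: sum(residual_energy[n] for n in path))

paths = [["a", "b"], ["c", "d", "e"]]
energy = {"a": 0.9, "b": 0.2, "c": 0.5, "d": 0.5, "e": 0.4}
best = select_path(paths, energy)   # residual sum 1.4 beats 1.1
```

The point of the rule is load balancing: routing through energy-rich nodes spreads consumption, which is what the reported lifetime gains come from.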

101 citations


Journal ArticleDOI
TL;DR: This paper surveys the current state of complex fuzzy sets, complex fuzzy classes, and complex fuzzy logic; provides an axiomatic basis for first order predicate complex fuzzy logic and complex fuzzy class theory; and uses these rigorous definitions to exemplify inference in complex economic systems.
Abstract: Complex fuzzy sets, classes, and logic have an important role in applications, such as prediction of periodic events and advanced control systems, where several fuzzy variables interact with each other in a multifaceted way that cannot be represented effectively via simple fuzzy operations such as union, intersection, complement, negation, conjunction and disjunction. The initial formulation of these terms stems from the definition of complex fuzzy grade of membership. The problem, however, with these definitions is twofold: 1) the complex fuzzy membership is limited to polar representation with only one fuzzy component; 2) the definition is based on grade of membership and lacks the rigor of axiomatic formulation. A new interpretation of complex fuzzy membership enables polar and Cartesian representation of the membership function, where the two function components carry uncertain information. Moreover, the new interpretation is used to define complex fuzzy classes and develop an axiomatically based theory of complex propositional fuzzy logic. Additionally, the generalization of the theory to multidimensional fuzzy grades of membership is demonstrated. In this paper we propose an axiomatic framework for first order predicate complex fuzzy logic and use this framework for the axiomatic definition of complex fuzzy classes. We use these rigorous definitions to exemplify inference in complex economic systems. The new framework overcomes the main limitations of the current theory and provides several advantages. First, the derivation of the new theory is based on an axiomatic approach and does not assume the existence of complex fuzzy sets or complex fuzzy classes. Second, the new form significantly improves the expressive power and inference capability of complex fuzzy logic and class theory.
The paper surveys the current state of complex fuzzy sets, complex fuzzy classes, and complex fuzzy logic; and provides an axiomatic basis for first order predicate complex fuzzy logic and complex class theory.

60 citations


Journal ArticleDOI
TL;DR: A hybrid Pareto-based artificial bee colony (HABC) algorithm for solving the multi-objective flexible job shop scheduling problem is proposed, and its efficiency and effectiveness are shown.
Abstract: In this paper, we propose a hybrid Pareto-based artificial bee colony (HABC) algorithm for solving the multi-objective flexible job shop scheduling problem. In the hybrid algorithm, each food source is represented by two vectors, i.e., the machine assignment vector and the operation scheduling vector. The artificial bees are divided into three groups, namely, employed bees, onlookers, and scout bees. Furthermore, an external Pareto archive set is introduced to record the non-dominated solutions found so far. To balance the exploration and exploitation capability of the algorithm, the scout bees in the hybrid algorithm are divided into two parts. The scout bees in one part perform random search in the predefined region, while each scout bee in the other part randomly selects one non-dominated solution from the Pareto archive set. Experimental results on well-known benchmark instances and comparisons with other recently published algorithms show the efficiency and effectiveness of the proposed algorithm.
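The external Pareto archive mentioned above can be sketched as follows, assuming minimization of all objectives; the function names and sample objective vectors are illustrative, not the authors' implementation:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Keep candidate only if no member dominates it; evict members it dominates."""
    if any(dominates(m, candidate) for m in archive):
        return archive
    return [m for m in archive if not dominates(candidate, m)] + [candidate]

archive = [(3, 5), (4, 4)]
archive = update_archive(archive, (2, 6))   # non-dominated: kept
archive = update_archive(archive, (3, 3))   # dominates (3,5) and (4,4): evicts both
```

This invariant (the archive contains only mutually non-dominated solutions) is what lets the scout bees sample "non-dominated solutions found so far" as restart points.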

59 citations


Journal ArticleDOI
TL;DR: This paper presents a new approach based on an adaptive non-singleton interval type-2 FLS where the numerical uncertainties are modeled and handled by non-singleton type-2 fuzzy inputs and the linguistic uncertainties are handled by interval type-2 fuzzy sets representing the antecedents' linguistic labels.
Abstract: Real world environments are characterized by high levels of linguistic and numerical uncertainties. A Fuzzy Logic System (FLS) is recognized as an adequate methodology to handle the uncertainties and imprecision present in real world environments and applications. Since the invention of fuzzy logic, it has been applied with great success to numerous real world applications such as washing machines, food processors, battery chargers, electrical vehicles, and several other domestic and industrial appliances. The first generation of FLSs were type-1 FLSs, in which type-1 fuzzy sets were employed. Later, it was found that using type-2 FLSs can enable the handling of higher levels of uncertainty. Recent works have shown that interval type-2 FLSs can outperform type-1 FLSs in applications which encompass high uncertainty levels. However, the majority of interval type-2 FLSs handle the linguistic and input numerical uncertainties using singleton interval type-2 FLSs, which mix the numerical and linguistic uncertainties so that they are handled only by the type-2 fuzzy sets of the linguistic labels. This ignores the fact that if input numerical uncertainties are present, they should affect the incoming inputs to the FLS. Even in the papers that employed non-singleton type-2 FLSs, the input signals were assumed to have a predefined shape (mostly Gaussian or triangular), which might not reflect the real uncertainty distribution, which can vary with the associated measurement. In this paper, we present a new approach based on an adaptive non-singleton interval type-2 FLS where the numerical uncertainties are modeled and handled by non-singleton type-2 fuzzy inputs and the linguistic uncertainties are handled by interval type-2 fuzzy sets representing the antecedents' linguistic labels.
The non-singleton type-2 fuzzy inputs are dynamic: they are automatically generated from data and do not assume a specific shape for the distribution associated with the given sensor. We will present several real world experiments using a real world robot which show how the proposed non-singleton interval type-2 FLS produces superior performance to its singleton type-1 and type-2 counterparts when encountering high levels of uncertainty.

37 citations


Journal ArticleDOI
TL;DR: A time-bound ticket-based mutual authentication scheme that achieves mutual authentication between the server and the client and is resistant to masquerade attack, replay attack and password guessing attack.
Abstract: Cloud computing is becoming popular quickly. In cloud computing, people store their important data in the cloud, which makes it important to ensure data integrity and availability. Remote data integrity checking enables the client to perform data integrity verification without access to the complete file. This service brings convenience to clients, but degrades the server's performance severely. Proper schemes must be designed to reduce the performance degradation. In this paper, a time-bound ticket-based mutual authentication scheme is proposed for solving this problem. The proposed authentication scheme achieves mutual authentication between the server and the client. The use of time-bound tickets reduces the server's processing overhead efficiently. The correspondence relationship between the digital ticket and the client's smart card prevents user masquerade attacks effectively. By security analysis, we show that the proposed scheme is resistant to masquerade attacks, replay attacks and password guessing attacks. By performance analysis, we show that the proposed scheme has good efficiency. The proposed scheme is very suitable for cloud computing.

35 citations


Journal ArticleDOI
TL;DR: The purpose of this paper is to present heuristic algorithms, namely constructive algorithms and local search algorithms, for approximately solving the generalized vehicle routing problem.
Abstract: The vehicle routing problem (VRP) is one of the most famous combinatorial optimization problems and has been intensively studied due to its many practical applications in the fields of distribution, collection, logistics, etc. We study a generalization of the VRP called the generalized vehicle routing problem (GVRP) where, given a partition of the nodes of the graph into clusters, we want to find the optimal routes from the given depot through the predefined clusters, visiting exactly one node from each cluster. The purpose of this paper is to present heuristic algorithms that solve this problem approximately. We present constructive algorithms and local search algorithms for solving the generalized vehicle routing problem.
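A minimal constructive heuristic in the spirit described above (visit exactly one node per cluster, chosen greedily by distance) might look like this; the coordinates and the single-vehicle simplification are invented for illustration and are not the paper's algorithms:

```python
import math

def greedy_gvrp_route(depot, clusters):
    """clusters: list of lists of (x, y) points; returns one point per cluster,
    chosen by repeatedly visiting the nearest node of an unserved cluster."""
    route, pos, remaining = [], depot, list(clusters)
    while remaining:
        _, i, p = min((math.dist(pos, p), i, p)
                      for i, c in enumerate(remaining) for p in c)
        route.append(p)
        pos = p
        remaining.pop(i)   # this cluster is now served
    return route

route = greedy_gvrp_route((0, 0), [[(1, 0), (5, 5)], [(0, 2), (9, 9)]])
```

A construction like this gives a feasible starting solution that the local search algorithms mentioned in the abstract would then improve.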

33 citations


Journal ArticleDOI
TL;DR: An investigation about the possibilities of enhancement of learning Java using the visualization software Jeliot indicates that a significant percentage of students had achieved better results when they were using a software visualization tool.
Abstract: This study was carried out to observe, measure and analyze the effects of using software visualization in teaching programming, with participants from two institutions of higher education in Serbia. Basic programming is notoriously complex to learn for many novice students at university level. Visualizations of examples of program code or programming tasks could help students grasp programming structures more easily. This paper describes an investigation of the possibilities of enhancing the learning of Java using the visualization software Jeliot. An analysis of 400 students' test results indicates that a significant percentage of students achieved better results when they were using a software visualization tool. In the experience of the authors, Jeliot may yield the best results if used with students who are new to the art of programming.

33 citations


Journal ArticleDOI
TL;DR: The goal of the paper is to show that the potential of applying fuzzy technology in management is still very large and hardly exploited so far.
Abstract: Fuzzy Set Theory has developed during the last decades into a demanding mathematical theory. There exist more than 50,000 publications in this area by now. Unfortunately, the number of reports on applications of fuzzy technology has become very scarce. The reasons for that are manifold: real applications are normally not single-method applications but rather complex combinations of different techniques, which are not suited for publication in a journal. Sometimes considerations of competition may play a role, and sometimes the theoretical core of an application is not suited for publication. In this paper we focus on applications of fuzzy technology to real problems in business management. Two versions of fuzzy technology will be used: fuzzy knowledge-based systems and fuzzy clustering. It is assumed that the reader is familiar with basic fuzzy set theory; the goal of the paper is to show that the potential of applying fuzzy technology in management is still very large and hardly exploited so far.

26 citations


Journal ArticleDOI
TL;DR: This book is about the disassembly of end-of-life products, with particular emphasis on methods and techniques for solving the Disassembly Line Balancing Problem, a complex NP-complete problem which necessitates specialized solution techniques.
Abstract: This book is about the disassembly of end-of-life products with particular emphasis on methods and techniques for solving the Disassembly Line Balancing Problem. Disassembly is viewed as "the systematic separation and extracting valuable entities for possible future re-usage". In fact, disassembly is a distinct phase of the product lifecycle. It follows the "before life" phases (such as design and economical evaluation), "useful period" phases (such as manufacturing, distribution, usage and maintenance) and "end of life" phases (such as collecting and sorting). Disassembly might represent the essential first phase of future activities, such as re-use, re-manufacturing and recycling. Due to ever higher public awareness, increasingly strict regulations concerning environmental quality preservation, and the increasing economic effectiveness and attractiveness for industry, the activities of recovering valuable parts and subassemblies have become a desirable alternative to the old-fashioned disposal processes of end-of-life products. The authors state, in the preface of the book, that the "disassembly line seems to be the most efficient way to disassemble a product". Consequently, the primary concern of the book is "the complete disassembly of [end-of-life] products on a paced disassembly line for component/material recovery purposes". The authors aim at investigating "the qualitative and quantitative aspects of multi-criteria solution sequences using the various combinatorial optimization techniques" (page 16) to solve the Disassembly Line Balancing Problem (DLBP).
The DLBP consists in finding a feasible disassembly solution sequence which preserves precedence constraints and aims at attaining several objectives, such as minimizing the number of work stations and the total idle time, ensuring similar idle time at each work station, removing hazardous parts and materials and extracting highly demanded product components at the earliest moments possible, and minimizing the number of direction changes required for disassembly (removing parts with similar removal directions together) (page 102). The book is composed of 29 chapters grouped into three parts entitled "Disassembly Background", "Disassembly-Line Balancing" and "Further Disassembly-Line Considerations", which address general aspects concerning disassembly processes, variations of methods and techniques to solve the DLBP, and other problems related to the disassembly line, respectively. Part I comprises six chapters which are meant to set the stage for the subsequent chapters. Information concerning disassembly processes, assembly lines, disassembly lines, other related research, graphical representations and the computational complexity of combinatorial problems is provided. Part II is made up of 20 chapters and addresses the statement and analysis of the DLBP and several specific variations of methods and techniques which were adapted for solving the problem, tested on four experimental instances and compared. The objectives of this part of the book are: stating the mathematical model of the DLBP, establishing the difficulty of the problem by using complexity theory, and determining the data sets and evaluation criteria to be used in analyzing the problem and the solving techniques which are selected (page 99). It is demonstrated (in chapter 9) that the DLBP is a complex NP-complete problem in the strong sense and necessitates specialized solution techniques.
Accordingly, the authors argue for combinatorial optimization approaches and select several algorithms to solve the problem. The techniques to be utilized to solve the DLBP are introduced in chapter 10, and their usage and performance in solving the problem are presented in chapters 12 through 19. There are seven techniques which are adapted, tested and compared. The exhaustive search is used to provide the optimal solution. Two metaheuristic approaches (genetic algorithms and ant colony optimization) are studied next. Two purely deterministic searches (the greedy algorithm and the "hunter-killer" search) and two 2-phase hybrid methods are adapted and tested. The four experimental instances (the eight-part personal computer, the enhanced 10-part DLBP case, the 25-part cellular instance, and the size-independent "a priori" benchmark with a known optimal solution) are described in chapter 11. Chapter 20 contains a detailed comparison of the six heuristic and metaheuristic techniques as applied to the DLBP with respect to several performance measures. Several complementary research results are reviewed in chapter 21, together with future research directions. Disassembly processes interact with other "before life", "useful", and "after life" periods of product usage and recovery. As a result, to make the picture complete, Part III addresses other areas of disassembly research such as product planning, line and facility design, operations scheduling and sequencing, inventory, "just-in-time", revenue and unbalanced lines (chapters 22 through 29). The authors of the book form a team who may be viewed as a fine and synergic combination of two complementary experiences and backgrounds, from academia and industry. Seamus McGovern, an Electronics Engineer at the Volpe National Transportation Systems Center, holds a commission in the US Navy as an aerospace duty engineer, as well as a part-time industrial engineering faculty appointment at Northeastern University. Surendra M.
Gupta is a professor of Mechanical and Industrial Engineering and director of the Laboratory for Responsible Manufacturing at Northeastern University. He has authored/co-authored over 400 technical papers and is a pioneer in the domain of the book. This book represents a very valuable work in a rather young research domain, which may be viewed as opened by the pioneering paper of Gungor and Gupta entitled "Disassembly Line in Product Recovery" (International Journal of Production Research, 40 (11), 2002). The volume mainly reflects the original studies of the authors and their colleagues. It also makes an exhaustive and systematic review of the results which are reported in the domain's scientific literature and are due to other scientists. The organization of the document is well thought out and the presentation style is rigorous and clear. Consequently, though the information content is very dense and diverse, the book is accessible and its study is scientifically rewarding. Special mention should be made of the uniform and coherent notation which is used throughout the book, and of the graphical illustrations. A final remark of appreciation goes to the excellent quality of the editing and printing of the book, due to the staff of the McGraw-Hill Companies. In conclusion, the book is a timely work which contains relevant, inspiring and challenging information. Therefore, this reviewer warmly recommends it to readers from academia and industry alike who are interested in modern manufacturing issues and in combinatorial optimization methods and software.

25 citations


Journal ArticleDOI
TL;DR: The idea of sensitivity in ant colony systems has been exploited in hybrid ant-based models with promising results for many combinatorial optimization problems, such as generalized vehicle routing problems as discussed by the authors.
Abstract: The idea of sensitivity in ant colony systems has been exploited in hybrid ant-based models with promising results for many combinatorial optimization problems. Heterogeneity is induced in the ant population by endowing individual ants with a certain level of sensitivity to the pheromone trail. The variable pheromone sensitivity within the same population of ants can potentially intensify the search while at the same time inducing diversity for the exploration of the environment. The performance of sensitive ant models is investigated for solving the generalized vehicle routing problem. Numerical results and comparisons are discussed and analysed with a focus on emphasizing particular aspects and potential benefits related to hybrid ant-based models.

23 citations


Journal ArticleDOI
TL;DR: An optimal task scheduling algorithm (OTSA-WSN) in a clustered wireless sensor network is proposed based on divisible load theory and the optimal number of rounds and the most reasonable load allocation ratio on each node could be derived.
Abstract: Sensing tasks should be allocated and processed among sensor nodes in minimum time so that users can draw useful conclusions by analyzing sensed data. Furthermore, finishing sensing tasks faster benefits energy saving, which is critical in the system design of wireless sensor networks. To minimize the execution time (makespan) of a given task, an optimal task scheduling algorithm (OTSA-WSN) in a clustered wireless sensor network is proposed based on divisible load theory. The algorithm consists of two phases: intra-cluster task scheduling and inter-cluster task scheduling. Intra-cluster task scheduling deals with allocating different fractions of sensing tasks among the sensor nodes in each cluster; inter-cluster task scheduling involves the assignment of sensing tasks among all clusters in multiple rounds to improve the overlap of communication with computation. OTSA-WSN builds on eliminating transmission collisions and idle gaps between two successive data transmissions. By removing the performance degradation caused by communication interference and idle time, a reduced finish time and improved network resource utilization can be achieved. With the proposed algorithm, the optimal number of rounds and the most reasonable load allocation ratio on each node can be derived. Finally, simulation results are presented to demonstrate the impact of different network parameters, such as the number of clusters, computation/communication latency, and measurement/communication speed, on the number of rounds, makespan and energy consumption.
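The divisible-load idea underlying this scheduling can be illustrated in its simplest form, ignoring communication time (which the actual OTSA-WSN algorithm does account for); the node speeds are invented:

```python
def split_load(total, speeds):
    """Split a divisible load proportionally to node processing speeds,
    so that every node finishes at the same instant (no communication cost)."""
    s = sum(speeds)
    return [total * v / s for v in speeds]

parts = split_load(120.0, [1.0, 2.0, 3.0])
# Each node's finish time is parts[i] / speeds[i] = 20.0, so the makespan
# is equal across nodes; any other split would leave some node idle.
```

Equalizing finish times is the core optimality principle of divisible load theory; the paper extends it with multi-round scheduling and communication/collision constraints.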

Journal ArticleDOI
TL;DR: This paper proposes a fuzzy signature model including relevance weights and weighted aggregations for each node and parent node, respectively, so that as a result a single membership value may be calculated for each building in question.
Abstract: In the historical districts of European cities it is a major problem how to decide on renovating or replacing existing buildings. This problem is imminent in Budapest (Hungary) in many traditional districts, such as the Ferencvaros district, where we selected a compound area for further examination. Regarding the financial aid for the renovation of these buildings awarded by the Municipal Assembly of the district in question, there is much uncertainty and confusion concerning whether and how to reconstruct a building when new private owners apply for support. In this paper we propose a formal evaluation method based on fuzzy signature rule bases (the former being a special case of L-fuzzy objects). Using the available expert knowledge we propose a fuzzy signature model including relevance weights and weighted aggregations for each node and parent node, respectively, so that as a result a single membership value may be calculated for each building in question. Linguistic labels for decision (such as worthless, average, highly valuable, etc.) are generated from the values thus obtained. Such linguistic calculations might be of help for the Municipal Assembly awarding financial support. A complete example with 26 buildings is presented.
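The weighted aggregation over a fuzzy signature described above might be sketched as follows; the tree layout, weights and leaf memberships are invented, and a real model would use the paper's expert-derived signature structure:

```python
def aggregate(node):
    """node is either a membership value in [0, 1], or a list of
    (weight, subnode) pairs aggregated by weighted mean, bottom-up."""
    if isinstance(node, (int, float)):
        return float(node)
    total_weight = sum(w for w, _ in node)
    return sum(w * aggregate(sub) for w, sub in node) / total_weight

# Toy two-level signature for one building (all numbers invented):
building = [
    (0.6, [(0.5, 0.8), (0.5, 0.4)]),  # e.g. a structural-condition subtree
    (0.4, 0.9),                       # e.g. an architectural-value leaf
]
score = aggregate(building)           # inner node -> 0.6, root -> 0.72
```

The single root value is what the paper then maps onto linguistic labels such as "average" or "highly valuable".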

Journal ArticleDOI
TL;DR: A domain ontology was developed based on the UML models of the VirDenT information system, making sure in this way that the ontology captures the knowledge identified and described in the analysis of the information system.
Abstract: The goal of using virtual and augmented reality technologies for the simulation of therapeutic interventions in fixed prosthodontics (the VirDenT project) is to increase the quality of the educational process in dental faculties, by assisting students in learning how to prepare teeth for all-ceramic restorations. Its main component is an e-learning virtual reality-based software system that will be used for developing the skills in grinding teeth needed in all-ceramic restorations. This paper presents a domain ontology that formally describes the knowledge of the domain problem that the VirDenT e-learning system deals with. The ontology was developed based on the UML models of the VirDenT information system, making sure in this way that the ontology captures the knowledge identified and described in the analysis of the information system. First, we constructed the taxonomy of the concepts, using the DOLCE ontology and its modules. Then, we defined the conceptual relations between the concepts. We also added SWRL rules that formally describe the business rules and knowledge previously identified. Finally, with the assistance of the Pellet reasoner system, we checked the ontology's consistency.

Journal ArticleDOI
TL;DR: An extension of spiking neural P systems in which several types of spikes are allowed is investigated, with a motivation related to gene expression, where enzymes act in series, somewhat similar to the trains of spikes traveling along the axons of neurons. Some further extensions are mentioned, such as considering a process of decay in time of the spikes.
Abstract: With a motivation related to gene expression, where enzymes act in series, somewhat similar to the trains of spikes traveling along the axons of neurons, we consider an extension of spiking neural P systems where several types of "spikes" are allowed. The power of the obtained spiking neural P systems is investigated. Some further extensions are mentioned, such as considering a process of decay in time of the spikes.

Journal ArticleDOI
Tudor Barbu1
TL;DR: A robust face detection approach that works for digital color images by deciding for each skin region if it represents a human face or not using a set of candidate criteria, an edge detection process, a correlation based technique and a threshold-based method.
Abstract: We propose a robust face detection approach that works for digital color images. Our automatic detection method is based on image skin regions, therefore a skin-based segmentation of RGB images is provided first. Then, we decide for each skin region if it represents a human face or not, using a set of candidate criteria, an edge detection process, a correlation based technique and a threshold-based method. A high face detection rate is obtained using the proposed method.
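The skin-based segmentation step mentioned above is often implemented with an explicit per-pixel RGB rule; the rule below is the widely cited Kovac et al. daylight heuristic, shown only as an illustration, since the paper's actual segmentation may differ:

```python
def is_skin(r, g, b):
    """Explicit RGB skin rule (Kovac et al., uniform daylight variant)."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15   # enough colour spread
            and abs(r - g) > 15 and r > g and r > b)

samples = {"skin-ish": (220, 170, 150), "sky blue": (60, 120, 200)}
labels = {name: is_skin(*rgb) for name, rgb in samples.items()}
```

Pixels passing such a rule are grouped into connected skin regions, which are then the face candidates that the abstract's criteria (edges, correlation, thresholds) accept or reject.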

Journal ArticleDOI
TL;DR: This paper presents a comprehensive containerized cargo monitoring system based on WSNs which incorporates a tilt/motion sensor to improve the network convergence time of container networks and periodically switches the nodes into sleeping mode to save energy and extend the lifetime of the network.
Abstract: Due to globalization, global trade is growing strongly nowadays. The use of containers has significantly increased, changing the shape of the world economy. Thus, monitoring every single container is a big challenge for port industries. Furthermore, rapid development in embedded computing systems has led to the emergence of Wireless Sensor Network (WSN) technology, which enables us to envision intelligent containers. This represents the next evolutionary development in the logistics industry to increase the efficiency, productivity and security of containerized cargo shipping. In this paper, we present a comprehensive containerized cargo monitoring system based on WSNs. We incorporated a tilt/motion sensor to improve the network convergence time of container networks. Moreover, we periodically switch the nodes into sleeping mode to save energy and extend the lifetime of the network. Based on the technical implementation on a real container vessel, we strongly believe that our design, which employs WSN technology, is viable for implementation in container logistics to improve port services and provide safe transport of containerized cargo.

Journal ArticleDOI
TL;DR: Feature extraction is especially effective for classification algorithms that do not have any inherent feature selection or feature extraction built in, such as the nearest neighbour methods or some types of neural networks.
Abstract: A comparison between several classification algorithms with feature extraction on real datasets is presented. Principal Component Analysis (PCA) has been used for feature extraction with different values of the ratio R, evaluated and compared using four different types of classifiers on two real benchmark data sets. The accuracy of the classifiers is influenced by the choice of the ratio R. There is no single best value of the ratio R: for different datasets and different classifiers, accuracy curves as a function of the number of features used may differ significantly. In our cases, feature extraction is especially effective for classification algorithms that do not have any inherent feature selection or feature extraction built in, such as the nearest neighbour methods or some types of neural networks.
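Choosing the number of principal components from a variance ratio R, as described above, can be sketched with NumPy; the synthetic data and this exact pipeline are my illustration, not the paper's setup:

```python
import numpy as np

def pca_components_for_ratio(X, R):
    """Smallest number of principal components whose cumulative
    explained-variance ratio reaches R."""
    Xc = X - X.mean(axis=0)
    # Singular values of the centered data give component variances: s_i**2 / (n-1)
    s = np.linalg.svd(Xc, compute_uv=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    return int(np.searchsorted(cum, R) + 1)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 1] = X[:, 0] * 2.0          # make one feature fully redundant
k = pca_components_for_ratio(X, 0.95)   # redundancy leaves ~4 useful directions
```

This is why accuracy curves depend on R: a higher R keeps more components, which helps some classifiers and merely adds noise dimensions for others.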

Journal ArticleDOI
TL;DR: This communication provides an example of such a development, showing how to include the automation-level operational modes in the interfacing system and how the human operator can enter the control loop in different ways.
Abstract: Human-Machine Interfaces are without doubt one of the constitutive parts of an automation system. However, it is not until recently that they have received appropriate attention, as a major concern has arisen about aspects related to maintenance, safety, operator awareness, etc. Even though there are software solutions on the market that allow for the design of efficient and complex interaction systems, the rational design of the overall interface system is not widespread, especially for large-scale systems where the monitoring and supervision systems may include hundreds of interfacing screens. It is in this respect that this communication provides an example of such a development, also showing how to include the automation-level operational modes in the interfacing system. Another important aspect is how the human operator can enter the control loop in different ways; such interaction needs to be considered as an integral part of the automation procedure, as well as of the communication with the automation device. In this paper the application of design and operational modes guidelines in automation are

Journal ArticleDOI
TL;DR: This work proposes a fuzzy inference system used to produce sharp control states from noisy EEG data, enabling actions to be performed in a Brain-Computer Interface.
Abstract: A Brain-Computer Interface uses measurements of scalp electric potential (electroencephalography - EEG) reflecting brain activity to communicate with external devices. Recent developments in electronics and computer science have enabled applications that may help users with disabilities and also the development of new types of Human Machine Interfaces. By producing modifications in their brain potential activity, users can control different devices. In order to perform actions, these EEG signals must be processed with proper algorithms. Our approach is based on a fuzzy inference system used to produce sharp control states from noisy EEG data.

Journal ArticleDOI
TL;DR: A genetic algorithm for multi-objective optimization of a multi pickup and delivery problem with time windows based on aggregation method and lower bounds is presented, minimizing the compromise between total travel cost and total tardiness time.
Abstract: The PDPTW is a vehicle routing optimization problem in which requests for transport between suppliers and customers must be met while satisfying precedence, capacity and time constraints. In this paper, we present a genetic algorithm for multi-objective optimization of a multi pickup and delivery problem with time windows (m-PDPTW), based on an aggregation method and lower bounds. We give a brief literature review of the PDPTW and present our approach to obtaining a satisfying solution to the m-PDPTW, minimizing the compromise between total travel cost and total tardiness time.
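The aggregation method mentioned in the abstract reduces the two objectives to one scalar fitness for the genetic algorithm. A minimal sketch, in which the weights and the candidate routings are assumed values for illustration:

```python
# Sketch of the aggregation step: combining the two m-PDPTW objectives
# (total travel cost and total tardiness) into one scalar fitness for
# GA selection. Weights and candidate costs are illustrative assumptions.

def fitness(total_cost, total_tardiness, w_cost=0.6, w_tard=0.4):
    """Weighted-sum aggregation of the two objectives; lower is better."""
    return w_cost * total_cost + w_tard * total_tardiness

# Two candidate routings: selection keeps the better compromise.
a = fitness(total_cost=120.0, total_tardiness=30.0)  # 84.0
b = fitness(total_cost=100.0, total_tardiness=80.0)  # 92.0
print(min(a, b))  # 84.0 -> routing a wins despite higher travel cost
```

Note how the weights encode the compromise: routing b has the lower travel cost, but its tardiness penalty makes routing a the better aggregate solution.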

Journal ArticleDOI
TL;DR: This paper illuminates the first decade of Fuzzy Sets and Systems (FSS), when nobody thought the theory would be successful in the field of applied sciences and technology, and shows that Lotfi A. Zadeh in particular expected his theory to have a role in the future of computer systems as well as the humanities and social sciences.
Abstract: In this paper we illuminate the first decade of Fuzzy Sets and Systems (FSS), when nobody thought that this theory would be successful in the field of applied sciences and technology. We show that Lotfi A. Zadeh, the founder of the theory of FSS, in particular expected that his theory would have a role in the future of computer systems as well as in the humanities and social sciences. When Mamdani and Assilian picked up the idea of FSS, and particularly fuzzy algorithms, to establish the first fuzzy control system for a small steam engine, this was the kick-off for the "fuzzy boom" in Japan and later in the whole world, and Zadeh's primary intention trailed away for decades. Only in the new millennium was a new movement for fuzzy sets in the social sciences and humanities launched and, hopefully, it will persist!

Journal ArticleDOI
TL;DR: It was identified that the present outsourcing scenario has led to frequent requirement changes, shrunk design and stretched development phases, and frequent deliverables, which have to be accommodated by the software developer with extra effort and commitment beyond the project norms.
Abstract: Software Requirement Engineering (RE) is one of the most important and fundamental activities in the software life cycle. With the introduction of different software process paradigms, Requirement Engineering has appeared in different facets, yet has undoubtedly retained its significance. Software development outsourcing is considered a win-win situation for both developed and developing countries. Large numbers of low-paid, yet talented workers in developing countries can be employed for software outsourcing projects, with the outsourcer holding the power to decide the projects, their scope and priorities with the intention of profit maximization. This study was conducted to analyze the impact of poor Requirement Engineering in outsourced software projects from the developers' context (sample size n = 57). It was identified that the present outsourcing scenario has led to frequent requirement changes, shrunk design and stretched development phases, and frequent deliverables, which have to be accommodated by the software developer with extra effort and commitment beyond the project norms. The results reveal important issues and open policy-level discussions while questioning our insights into the outsourcing benefits as a whole.

Journal ArticleDOI
TL;DR: A learning mechanism based on genetic algorithms with local crossover is proposed that can be applied to various topologies of fuzzy neural networks with L-R fuzzy numbers as inputs, outputs and weights.
Abstract: Fuzzy feed-forward (FFNR) and fuzzy recurrent networks (FRNN) have proved to be solutions for "real world problems". In most cases, the learning algorithms are based on gradient techniques adapted for fuzzy logic, with heuristic rules in the case of fuzzy numbers. In this paper we propose a learning mechanism based on genetic algorithms (GA) with local crossover that can be applied to various topologies of fuzzy neural networks with fuzzy numbers. The mechanism is applied to the FFNR and FRNN with L-R fuzzy numbers as inputs, outputs and weights, and fuzzy arithmetic as forward signal propagation. The α-cuts and fuzzy biases are also taken into account. The effectiveness of the proposed method is proven in two applications: mapping a vector of triangular fuzzy numbers into another vector of triangular fuzzy numbers for the FFNR, and the dynamic capture of fuzzy sinusoidal oscillations for the FRNN.
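The forward signal propagation with triangular fuzzy numbers can be sketched as below. Triangular numbers are represented as (left, mode, right) triples; the product rule is the common approximation for positive triangular fuzzy numbers, and the weights, inputs and bias are illustrative assumptions, not values from the paper:

```python
# Minimal sketch of fuzzy-arithmetic forward propagation with triangular
# fuzzy numbers as (left, mode, right) triples. Multiplication uses the
# usual component-wise approximation valid for positive triangular
# numbers; all concrete values are illustrative assumptions.

def t_add(a, b):
    """Sum of two triangular fuzzy numbers (exact, component-wise)."""
    return tuple(x + y for x, y in zip(a, b))

def t_mul(a, b):
    """Approximate product of two positive triangular fuzzy numbers."""
    return tuple(x * y for x, y in zip(a, b))

def neuron(inputs, weights, bias):
    """Weighted fuzzy sum plus a fuzzy bias (activation omitted here)."""
    acc = bias
    for x, w in zip(inputs, weights):
        acc = t_add(acc, t_mul(x, w))
    return acc

out = neuron(inputs=[(1.0, 2.0, 3.0)],
             weights=[(0.5, 1.0, 1.5)],
             bias=(0.0, 0.1, 0.2))
print(out)  # approximately (0.5, 2.1, 4.7)
```

Note how the output support widens relative to the input: fuzzy arithmetic propagates uncertainty through the network, which is what the GA-based learning must cope with.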

Journal ArticleDOI
TL;DR: A non-cooperative game algorithm (NGTSA) for task scheduling in wireless sensor networks is proposed, which provides incentives to the sensors to obey the prescribed algorithms and to truthfully report their parameters, leading to efficient task scheduling and execution.
Abstract: Scheduling tasks in wireless sensor networks is one of the most challenging problems. Sensing tasks should be allocated and processed among sensors in minimum time, so that users can draw prompt and effective conclusions by analyzing the sensed data. Furthermore, finishing sensing tasks faster benefits energy saving, which is critical in the system design of wireless sensor networks. But sensors may refuse to take pains to carry out the tasks due to their limited energy. To solve this potential selfishness problem, a non-cooperative game algorithm (NGTSA) for task scheduling in wireless sensor networks is proposed. In the proposed algorithm, according to divisible load theory, the tasks are distributed reasonably from the sink to every node based on processing capability and communication capability. By removing the performance degradation caused by communication interference and idle time, reduced task completion time and improved network resource utilization are achieved. A strategy-proof mechanism is designed which provides incentives to the sensors to obey the prescribed algorithms and to truthfully report their parameters, leading to efficient task scheduling and execution. A utility function related to the total task completion time and the task allocation scheme is designed. The Nash equilibrium of the game algorithm is proved. The simulation results show that with this mechanism, selfish nodes can be forced to report their true processing capability and endeavor to participate in the measurement, so the total time for accomplishing the task is minimized and the energy consumption of the nodes is balanced.
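The divisible-load idea behind the allocation step can be sketched as follows: the sink splits the workload in proportion to each node's reported processing capability, so that all nodes finish at roughly the same time. The node capacities and load figure are assumed values for illustration:

```python
# Sketch of proportional divisible-load allocation: the sink splits a
# task among sensor nodes according to their reported processing
# capability. Capacities and total load are illustrative assumptions.

def allocate(total_load, capacities):
    """Split total_load proportionally to each node's capacity."""
    total_cap = sum(capacities)
    return [total_load * c / total_cap for c in capacities]

caps = [4.0, 2.0, 2.0]
shares = allocate(total_load=100.0, capacities=caps)
print(shares)  # [50.0, 25.0, 25.0]

# Equal finishing times: share / capacity is identical for every node,
# which is what minimizes the overall completion time.
print([s / c for s, c in zip(shares, caps)])  # [12.5, 12.5, 12.5]
```

This also shows why truthful capability reporting matters: a node under-reporting its capacity would receive a smaller share, which is exactly the selfish behavior the strategy-proof mechanism is designed to deter.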

Journal ArticleDOI
TL;DR: Several types of fuzzy logic system iterations are exemplified in relationship with oscillations in FLS and with the problem of stability in fuzzy logic control.
Abstract: I exemplify various elementary cases of fuzzy sequences and results related to the iteration of fuzzy mappings and to fuzzy logic systems (FLS). Several types of fuzzy logic system iterations are exemplified in relationship with oscillations in FLS and with the problem of stability in fuzzy logic control. I establish several conditions for fixed points and periodicity of the iterations based on fuzzy systems.
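The fixed-point and periodicity questions raised here can be illustrated by iterating a simple mapping on membership grades in [0, 1] and watching the orbit. The mappings below are assumed toy examples, not the fuzzy systems studied in the paper:

```python
# Illustrative iteration of mappings on membership grades in [0, 1],
# detecting a fixed point or a periodic cycle in the orbit. The two
# example mappings are assumptions chosen to exhibit each behavior.

def iterate(f, x0, max_steps=100, tol=1e-9):
    """Return ('fixed', x) or ('cycle', length) for the orbit of x0."""
    seen = [x0]
    x = x0
    for _ in range(max_steps):
        x = f(x)
        if abs(x - seen[-1]) < tol:        # orbit stopped moving
            return ("fixed", x)
        for i, y in enumerate(seen):       # orbit revisited a value
            if abs(x - y) < tol:
                return ("cycle", len(seen) - i)
        seen.append(x)
    return ("none", x)

# A min-type (t-norm-like) mapping settles into a fixed point;
# a complement-type mapping oscillates with period 2.
print(iterate(lambda x: min(x, 0.6), 0.9))  # ('fixed', 0.6)
print(iterate(lambda x: 1.0 - x, 0.2))      # ('cycle', 2)
```

The period-2 orbit of the complement mapping is the simplest analogue of the oscillations in fuzzy logic control that the stability conditions in the paper address.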

Journal ArticleDOI
TL;DR: This paper investigates the use of the Rate Monotonic algorithm, making adjustments in order to make it more suitable for real-time applications.
Abstract: For real-time applications, task scheduling is a problem of paramount importance. Several scheduling algorithms have been proposed in the literature, ranging from static scheduling or cyclic executives, which provide very deterministic yet inflexible behaviour, to so-called best-effort scheduling, which facilitates maximum run-time flexibility but allows only probabilistic predictions of run-time performance, presenting a non-predictable and non-deterministic solution. Between these two extremes lie fixed-priority scheduling algorithms, such as Rate Monotonic, which are not the most efficient for real-time purposes but exhibit a predictable approach, because scheduling is done offline and guarantees regarding process deadlines can be obtained using appropriate analysis methods. This paper investigates the use of the Rate Monotonic algorithm, making adjustments in order to make it more suitable for real-time applications. The interest in fixed-priority scheduling algorithms such as Rate Monotonic for real-time systems lies in their associated analysis, which can be oriented in two directions: schedulability analysis and analysis of process interactions. The analysis is carried out using a previously implemented framework that allows modelling, simulation and schedulability analysis for a set of real-time system tasks, and some of the results obtained are presented.
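The classic schedulability analysis associated with Rate Monotonic is the Liu-Layland utilization bound: a set of n periodic tasks is guaranteed schedulable if its total utilization does not exceed n(2^(1/n) - 1). A sketch, with task parameters (C = computation time, T = period) chosen as illustrative assumptions:

```python
# Sketch of the Liu-Layland utilization-bound test for Rate Monotonic
# scheduling. The test is sufficient but not necessary: a set above the
# bound may still be schedulable (exact response-time analysis decides).
# The example (C, T) task parameters are illustrative assumptions.

def rm_schedulable(tasks):
    """tasks: list of (C, T) pairs; returns (utilization, bound, verdict)."""
    n = len(tasks)
    u = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)      # -> ln 2 ~ 0.693 as n grows
    return u, bound, u <= bound

u, bound, ok = rm_schedulable([(1, 4), (1, 5), (2, 10)])
print(f"U = {u:.3f}, bound = {bound:.3f}, schedulable: {ok}")
# U = 0.650, bound = 0.780, schedulable: True
```

Because the bound tends to ln 2 ≈ 69.3% as n grows, this offline test is conservative, which is exactly the kind of predictability trade-off the abstract attributes to fixed-priority scheduling.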

Journal ArticleDOI
TL;DR: A discrete-time digital control scheme is proposed in which a decentralized controller is designed; local controllers take the form of a Two-Degree-of-Freedom PI controller tuned using the data-driven Virtual Reference Feedback Tuning approach.
Abstract: The Activated Sludge Process (ASP) is arguably the most popular bioprocess utilized in the treatment of polluted water. The ASP is described by means of a nonlinear model and results in a Two-Input Two-Output multivariable system. In this paper a discrete-time digital control scheme is proposed in which the design of a decentralized controller is addressed. Local controllers take the form of a Two-Degree-of-Freedom PI controller tuned using the data-driven Virtual Reference Feedback Tuning approach.
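A Two-Degree-of-Freedom PI law separates setpoint response from disturbance rejection by weighting the reference differently in the proportional term. A minimal discrete-time sketch, where the gains and setpoint weight are illustrative assumptions rather than VRFT-tuned values:

```python
# Sketch of a discrete-time Two-Degree-of-Freedom PI control law:
# the proportional part acts on (b*r - y) with a setpoint weight b,
# while the integral part acts on the full error (r - y). Gains below
# are illustrative assumptions, not values tuned by VRFT.

class TwoDofPI:
    def __init__(self, kp, ki, b, ts):
        self.kp, self.ki, self.b, self.ts = kp, ki, b, ts
        self.integral = 0.0

    def step(self, r, y):
        """One sampling period: returns the control action u[k]."""
        self.integral += self.ki * (r - y) * self.ts   # forward-Euler integrator
        return self.kp * (self.b * r - y) + self.integral

ctrl = TwoDofPI(kp=2.0, ki=0.5, b=0.6, ts=0.1)
print(ctrl.step(r=1.0, y=0.0))  # 2*(0.6*1 - 0) + 0.5*1*0.1 = 1.25
```

With b < 1 a setpoint step produces a softer proportional kick than a disturbance of the same size, which is the practical benefit of the second degree of freedom.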

Journal ArticleDOI
TL;DR: An authenticated key agreement protocol, which resists attack by quantum computers, is proposed and provides the known security properties: session key security, forward security, resistance to key-compromise impersonation and unknown key-share attacks, and key control.
Abstract: All current public-key cryptosystems will become insecure once the size of a quantum register is sufficient. An authenticated key agreement protocol that resists attack by quantum computers is proposed. The proposed protocol provides the known security properties: session key security, forward security, resistance to key-compromise impersonation and unknown key-share attacks, and key control. We also prove its security in a widely accepted model.

Journal ArticleDOI
TL;DR: A new human-inspired approach to concept identification in documents is introduced here, which takes keywords and builds concept structures based on them using an Adaptive Assignment of Term Importance (AATI) schema.
Abstract: Intelligent agent-based systems can be used to identify high-level concepts matching sets of keywords provided by users. A new human-inspired approach to concept identification in documents is introduced here. The proposed method takes keywords and builds concept structures based on them. These concept structures are represented as hierarchies of concepts (HofC). An ontology is used to enrich HofCs with terms and other concepts (sub-concepts) based on concept definitions, as well as with related concepts. Additionally, the approach uses levels of importance of the terms defining the concepts. These levels of importance are continuously updated based on a flow of documents using an Adaptive Assignment of Term Importance (AATI) schema. The levels of activation of concepts identified in a document that match those in the HofC are estimated using ordered weighted averaging (OWA) operators with linguistic quantifiers. A simple case study presented in the paper is designed to illustrate the approach.
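The OWA aggregation with a linguistic quantifier can be sketched as follows. The weights come from a regular increasing monotone quantifier Q(r) = r^alpha; the choice alpha = 2 (roughly "most") and the term-match degrees are illustrative assumptions:

```python
# Sketch of OWA aggregation with quantifier-guided weights: the weights
# are differences of a regular increasing monotone quantifier
# Q(r) = r**alpha over the ordered arguments. alpha = 2 (roughly the
# quantifier "most") and the matching degrees are assumptions.

def owa(values, alpha=2.0):
    """Ordered weighted average with quantifier-based weights."""
    n = len(values)
    ordered = sorted(values, reverse=True)          # OWA sorts first
    weights = [((i + 1) / n) ** alpha - (i / n) ** alpha for i in range(n)]
    return sum(w * v for w, v in zip(weights, ordered))

# Degrees to which a document matches the terms defining one concept;
# the result is the concept's level of activation.
print(owa([0.9, 0.7, 0.2]))  # 4/9, approximately 0.444
```

With alpha > 1 the larger weights fall on the smaller ordered values, so a concept only activates strongly when "most" of its defining terms match, which is the intent of the linguistic quantifier.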

Journal ArticleDOI
TL;DR: This work proposes a novel feedback-based network coding (FNC) retransmission scheme which achieves quite low decoding delay without sacrificing throughput, and keeps the end-to-end philosophy of TCP in that the coding operations are only performed at the end hosts.
Abstract: Incorporating network coding into TCP has the advantage of masking packet losses from the congestion control algorithm. It can make a lossy channel appear as a lossless channel to TCP, so the transport protocol can focus solely on handling congestion. However, most schemes do not consider the decoding delay and are thus not suitable for implementation in practical systems. We propose a novel feedback-based network coding (FNC) retransmission scheme which achieves quite low decoding delay without sacrificing throughput. It uses the implicit information of the "seen" scheme to acquire, from feedback, the exact number of packets the receiver needs to decode all packets. We also change the encoding rules of retransmission so as to decode part of the packets in advance. The scheme handles not only random losses but also bursty losses well. Our scheme also keeps the end-to-end philosophy of TCP in that the coding operations are only performed at the end hosts, making it easier to implement in practical systems. Simulation results show that our scheme significantly outperforms the previous coding approach in reducing decoding delay, and obtains throughput close to that of scenarios with zero packet loss. It is particularly useful for streaming applications.
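The feedback idea can be sketched as follows: the receiver reports how many more degrees of freedom it needs, and the sender answers with exactly that many linear combinations of the unacknowledged packets (here XOR, i.e. coding over GF(2)). Packet contents, the subset-selection rule and the seed are illustrative assumptions, not the paper's exact encoding rules:

```python
# Sketch of feedback-driven coded retransmission: the receiver feeds
# back how many more independent combinations it needs; the sender
# emits exactly that many XOR (GF(2)) combinations of the
# unacknowledged packets. All concrete values are assumptions.

import random

def coded_retransmissions(packets, needed, seed=1):
    """Return `needed` XOR combinations of the given equal-length packets."""
    rng = random.Random(seed)
    combos = []
    for _ in range(needed):
        # Random non-empty subset of the packets, XORed byte-wise.
        subset = [p for p in packets if rng.random() < 0.5] or [packets[0]]
        acc = bytes(len(packets[0]))
        for p in subset:
            acc = bytes(a ^ b for a, b in zip(acc, p))
        combos.append(acc)
    return combos

lost = [b"\x01\x02", b"\x0f\x0f"]
for combo in coded_retransmissions(lost, needed=2):
    print(combo.hex())
```

Sending exactly the reported number of combinations, rather than retransmitting individual packets, is what lets the receiver close its decoding gap in one feedback round and keeps the decoding delay low.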