
Showing papers in "Evolutionary Intelligence in 2019"


Journal ArticleDOI
TL;DR: A new metaheuristic algorithm called Emperor Penguins Colony (EPC), inspired by the behavior of emperor penguins, is proposed; it is controlled by the body heat radiation of the penguins and their spiral-like movement in their colony.
Abstract: A metaheuristic is a high-level, problem-independent algorithmic framework that provides a set of guidelines or strategies to develop heuristic optimization algorithms. Metaheuristic algorithms attempt to find the best solution out of all possible solutions of an optimization problem. A very active area of research is the design of nature-inspired metaheuristics. Nature acts as a source of concepts, mechanisms and principles for designing artificial computing systems to deal with complex computational problems. In this paper, a new metaheuristic algorithm inspired by the behavior of emperor penguins, called Emperor Penguins Colony (EPC), is proposed. The algorithm is controlled by the body heat radiation of the penguins and their spiral-like movement in their colony. The proposed algorithm is compared with eight developed metaheuristic algorithms; ten benchmark test functions are applied to all of them. The experimental results show that the proposed algorithm finds the optimum better than the other metaheuristic algorithms.

112 citations


Journal ArticleDOI
TL;DR: This work focuses on reviewing a heuristic global optimization method called particle swarm optimization (PSO), the mathematical representation of PSO in continuous and binary spaces, the evolution and modifications of PSO over the last two decades, and a comprehensive taxonomy of heuristic-based optimization algorithms.
Abstract: Swarm intelligence is a kind of artificial intelligence that is based on the collective behavior of decentralized and self-organized systems. This work focuses on reviewing a heuristic global optimization method called particle swarm optimization (PSO). This includes the mathematical representation of PSO in continuous and binary spaces and the evolution and modifications of PSO over the last two decades. We also present a comprehensive taxonomy of heuristic-based optimization algorithms such as genetic algorithms, tabu search, simulated annealing and cross entropy, and illustrate the advantages and disadvantages of these algorithms. Furthermore, we present the application of PSO on graphics processing units and show various applications of PSO in networks.

99 citations
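The velocity and position updates at the heart of PSO, including the sigmoid-based rule commonly used for the binary spaces the review covers, can be sketched as follows; the inertia and acceleration coefficients are illustrative defaults, not values from any surveyed variant:

```python
import math
import random

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One velocity/position update for a single particle (continuous PSO)."""
    r1, r2 = random.random(), random.random()
    v_new = [w*vi + c1*r1*(pb - xi) + c2*r2*(gb - xi)
             for vi, xi, pb, gb in zip(v, x, pbest, gbest)]
    x_new = [xi + vi for xi, vi in zip(x, v_new)]
    return x_new, v_new

def binary_position(v):
    """Binary PSO: each bit becomes 1 with probability sigmoid(v_i)."""
    return [1 if random.random() < 1/(1 + math.exp(-vi)) else 0 for vi in v]
```

In the binary variant the velocity is kept continuous; only the position is discretized through the sigmoid, which is why the same `pso_step` can drive both versions.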


Journal ArticleDOI
TL;DR: A spiral cuckoo search based clustering method has been introduced to discover spam reviews, and the experimental results and statistical analysis validate that the proposed method outperforms the existing methods.
Abstract: Nowadays, online reviews play an important role in customers' decisions. From buying a shirt on an e-commerce site to dining in a restaurant, online reviews have become a basis of selection. However, people are always in a rush and do not have time to pay attention to the intrinsic details of products and services, so the dependency on online reviews has increased. Because of this reliance, some people and organizations deliberately generate spam reviews in order to promote or demote the reputation of a person/product/organization. It is difficult to identify whether a review is spam or ham with the naked eye, and it is also impractical to classify all the reviews manually. Therefore, a spiral cuckoo search based clustering method has been introduced to discover spam reviews. The proposed method uses the strength of cuckoo search and the Fermat spiral to resolve the convergence issue of the cuckoo search method. The efficiency of the proposed method has been tested on four spam datasets and one Twitter spammer dataset. To validate its efficacy, the proposed clustering method is compared with six metaheuristic clustering methods, namely particle swarm optimization, differential evolution, genetic algorithm, cuckoo search, K-means, and improved cuckoo search. The experimental results and statistical analysis validate that the proposed method outperforms the existing methods.

57 citations
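Standard cuckoo search, whose convergence behavior the paper modifies, generates new nests with Lévy flights; a common implementation draws the step with Mantegna's algorithm. The sketch below shows that generic mechanism only; the paper's Fermat-spiral modification is not reproduced here:

```python
import math
import random

def levy_step(beta=1.5):
    """Draw one Levy-distributed step via Mantegna's algorithm."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2**((beta - 1) / 2)))**(1 / beta)
    u = random.gauss(0.0, sigma)   # numerator ~ N(0, sigma^2)
    v = random.gauss(0.0, 1.0)     # denominator ~ N(0, 1)
    return u / abs(v)**(1 / beta)

def new_nest(x, best, alpha=0.01):
    """Move a nest by a Levy flight biased toward the current best solution."""
    return [xi + alpha * levy_step() * (xi - bi) for xi, bi in zip(x, best)]
```

The heavy-tailed steps give occasional long jumps (global exploration) among many short ones (local refinement), which is the property clustering-oriented variants try to preserve.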


Journal ArticleDOI
TL;DR: The proposed short-term electrical load forecasting method based on stacked auto-encoding and a GRU (gated recurrent unit) neural network can effectively predict the daily variation of power load, with lower prediction error and higher precision.
Abstract: With the rapid development of the smart grid, to meet power enterprises' requirements in short-term load forecasting, this paper proposes a short-term electrical load forecasting method based on stacked auto-encoding and a GRU (gated recurrent unit) neural network. Firstly, the method takes as input historical data containing power load, weather information and holiday information, and uses auto-encoding to compress the historical data; then, a multi-layer GRU is used to construct the model that predicts the power load. The experimental results show that, compared with traditional models, the proposed method can effectively predict the daily variation of power load and has lower prediction error and higher precision.

53 citations
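The GRU cell underlying such forecasters combines an update gate, a reset gate and a candidate state. A scalar sketch of the standard gate equations (weights shown as plain numbers, biases omitted for brevity; this is the textbook cell, not the paper's trained multi-layer network):

```python
import math

def sigmoid(a):
    return 1 / (1 + math.exp(-a))

def gru_cell(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step for a scalar input x and scalar hidden state h_prev."""
    z = sigmoid(Wz*x + Uz*h_prev)              # update gate: how much new state to accept
    r = sigmoid(Wr*x + Ur*h_prev)              # reset gate: how much history to expose
    h_tilde = math.tanh(Wh*x + Uh*(r*h_prev))  # candidate state
    return (1 - z)*h_prev + z*h_tilde          # blend old state and candidate
```

With all weights at zero the gates sit at 0.5 and the candidate at 0, so the state simply decays by half each step; training moves the gates away from this neutral point.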


Journal ArticleDOI
TL;DR: The proposed SCA-FOPID controller is designed at the global optimum of the objective function, has good reference tracking ability and frequency response, and gives excellent performance in extensive simulation studies.
Abstract: To enhance controller performance, an advanced sine-cosine algorithm (SCA) is employed in this paper for fractional-order PID (FOPID) controller tuning. The SCA-FOPID controller follows a model-based controller design method for physical systems to obtain better performance. The FOPID controller is designed by the SCA optimization technique using a time-domain objective function for the AVR system. The SCA technique optimizes the five parameters of the FOPID controller based on the minimum value of the controller design's objective function. The proposed SCA-FOPID controller is designed once the global optimum of the objective function is achieved, so the AVR system regulates the terminal voltage at the output well enough to meet the desired performance. The proposed method has good reference tracking ability and frequency response. Compared with recent PID and FOPID controller designs for the AVR system, the proposed SCA-FOPID controller gives excellent performance in extensive simulation studies.

47 citations
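The core of SCA, which here searches over the five FOPID gains, is a position update that oscillates each candidate around the best solution found so far, with the oscillation amplitude shrinking over the iterations:

```python
import math
import random

def sca_update(x, dest, t, t_max, a=2.0):
    """Sine-cosine position update of candidate x toward destination (best) solution."""
    r1 = a - t * (a / t_max)                 # amplitude decays linearly with iteration t
    new = []
    for xi, di in zip(x, dest):
        r2 = 2 * math.pi * random.random()   # oscillation phase
        r3 = 2 * random.random()             # random weight on the destination
        r4 = random.random()                 # sine/cosine switch
        osc = math.sin(r2) if r4 < 0.5 else math.cos(r2)
        new.append(xi + r1 * osc * abs(r3 * di - xi))
    return new
```

For FOPID tuning, `x` would hold the five controller parameters and the objective would be the time-domain cost of the simulated AVR response; that simulation is outside the scope of this sketch.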


Journal ArticleDOI
TL;DR: Feed-forward neural network training problems are addressed, for the first time, with a recently invented meta-heuristic optimization algorithm, locust swarm optimization (LSO); the experimental results show that the training algorithm not only attains very good convergence speed but also achieves reliability, owing to the reduced likelihood of being trapped in local minima.
Abstract: The need to avoid computer system breaches is increasing. Many researchers have adopted different approaches, such as intrusion detection systems (IDSs), to handle various threats. Intrusion detection has become an imperative system to detect various security breaches. To this day, researchers face the problem of building reliable and effective IDSs that can handle numerous attacks with changing patterns. This paper addresses feed-forward neural network (FNN) training problems using, for the first time, a recently invented meta-heuristic optimization algorithm, locust swarm optimization (LSO). The FNN is combined with LSO (FNN-LSO) to build an advanced detection system and improve the performance of the IDS. Our method is applied in a series of experiments to study the capability and performance of the proposed approach. The experimental studies begin with intrusion detection evaluation data, namely NSL-KDD and UNSW-NB15, to benchmark the performance of the proposed approach. The most common evolutionary trainers, namely a particle swarm optimizer (PSO) based trainer and a genetic algorithm (GA) based trainer, were implemented to verify the results. Compared with existing methods in the literature, our proposed approach proves to be more accurate and can serve as an alternative solution for IDS. The experimental results show that our training algorithm not only attains very good convergence speed but also achieves reliability, owing to the reduced likelihood of being trapped in local minima. Furthermore, our proposed model improves the detection rate.

42 citations


Journal ArticleDOI
TL;DR: A better performance of the proposed algorithm in reaching the optimal solution with fewer iterations than other methods is confirmed.
Abstract: This paper proposes a new optimization algorithm named the future search algorithm (FSA). This algorithm mimics a person's life: people in the world search for the best life, and if a person finds that his life is not good, he tries to change it and imitates successful persons. According to this behavior, the algorithm is built from mathematical equations. The FSA can update the random initial solutions. Furthermore, it uses local search between people and global search between the historically best persons to achieve the best solutions. The proposed algorithm has no tuned parameters. In addition, it has low computational complexity, fast convergence, and high local-optima avoidance. The performance of the proposed algorithm is evaluated by applying it to solve benchmark test functions. These test functions have various characteristics necessary to evaluate the FSA. In addition, the performance of the proposed algorithm is compared with five other well-known methods. The results confirm the better performance of the proposed algorithm in reaching the optimal solution with fewer iterations than the other methods.

38 citations


Journal ArticleDOI
TL;DR: This work proposes a discrete version of the Particle Swarm Optimization (PSO) algorithm, namely Integer-PSO, for task scheduling in the cloud computing environment which can be used for optimizing a single objective function and multiple objective functions as well.
Abstract: Cloud computing is an emerging technology that changes the computing world through its power to serve the needs of any user who requires better computing power over the Internet. In this environment the end user may want a better quality of service at low cost, while cloud service providers have the different goal of achieving maximum profit with minimal management overhead. Task scheduling is a challenging task in this scenario, as it must meet the requirements of both ends. This work proposes a discrete version of the Particle Swarm Optimization (PSO) algorithm, namely Integer-PSO, for task scheduling in the cloud computing environment, which can be used for optimizing a single objective function as well as multiple objective functions. Experimental studies on different types of task sets characterising normal traffic and bursty traffic in the cloud computing environment show that this approach performs better and has good convergence and load balancing.

37 citations
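A common single-objective fitness in such schedulers is makespan, the completion time of the busiest VM; the integer assignment vector below is exactly the kind of discrete encoding an Integer-PSO particle would carry. This is an illustration; the abstract does not specify the paper's exact objective functions:

```python
def makespan(assignment, task_len, vm_speed):
    """Completion time of the busiest VM.
    assignment[i] is the VM index that task i is mapped to;
    task_len[i] is the length of task i; vm_speed[j] is VM j's processing rate."""
    loads = [0.0] * len(vm_speed)
    for task, vm in enumerate(assignment):
        loads[vm] += task_len[task] / vm_speed[vm]
    return max(loads)
```

A scheduler minimizes this value over assignment vectors; multi-objective variants would add further terms such as cost or energy.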


Journal ArticleDOI
TL;DR: An improved elephant herding optimization (IEHO) is presented to solve the multilevel image thresholding problem for image segmentation by introducing oppositional-based learning (OBL) and dynamic Cauchy mutation (DCM).
Abstract: This paper presents an improved elephant herding optimization (IEHO) to solve the multilevel image thresholding problem for image segmentation by introducing oppositional-based learning (OBL) and dynamic Cauchy mutation (DCM). OBL accelerates the convergence rate and enhances the performance of standard EHO, whereas DCM mitigates premature convergence. The suggested optimization approach maximizes two popular objective functions, 'Kapur's entropy' and 'between-class variance', to estimate optimized threshold values for segmentation of the image. The performance of the proposed technique is verified on a set of test images taken from the benchmark Berkeley segmentation dataset. The results are analyzed and compared with conventional EHO, four other popular recent metaheuristic algorithms, namely cuckoo search, artificial bee colony, the bat algorithm and particle swarm optimization, and one classical method, dynamic programming, from the literature. Experimental results show that the proposed IEHO provides promising performance compared to the other methods in terms of optimized fitness value, peak signal-to-noise ratio, structure similarity index and feature similarity index. The suggested algorithm also converges better than the other methods under consideration.

31 citations
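Of the two objective functions named above, 'between-class variance' (Otsu's criterion) is easy to sketch for a single threshold on a grayscale histogram; multilevel thresholding generalizes this to several thresholds, which is where the metaheuristic search becomes worthwhile:

```python
def between_class_variance(hist, t):
    """Otsu's between-class variance for threshold t: pixels < t form class 0."""
    total = sum(hist)
    p = [h / total for h in hist]          # normalized histogram
    w0 = sum(p[:t])                        # class probabilities
    w1 = 1 - w0
    if w0 == 0 or w1 == 0:
        return 0.0
    mu0 = sum(i * p[i] for i in range(t)) / w0            # class means
    mu1 = sum(i * p[i] for i in range(t, len(p))) / w1
    return w0 * w1 * (mu0 - mu1)**2

def best_threshold(hist):
    """Exhaustive bilevel search; multilevel versions search tuples of thresholds."""
    return max(range(1, len(hist)), key=lambda t: between_class_variance(hist, t))
```

For k thresholds the exhaustive search grows combinatorially, which is why IEHO and the compared metaheuristics maximize this criterion instead of enumerating.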


Journal ArticleDOI
TL;DR: The proposed technique, named FA-PS, uses pattern search (PS) to enhance the solution quality of the standard firefly algorithm (FA); it has been applied to various types of maximization and minimization functions, and its performance has been compared with the standard FA and a genetic algorithm.
Abstract: The firefly algorithm (FA) is a recently introduced meta-heuristic, nature-inspired, stochastic algorithm for solving various types of optimization problems. FA takes inspiration from the natural phenomenon of light emission by fireflies and is one of the most robust and easily implementable algorithms. The standard FA consists of three stages, namely initialization, the firefly position-changing stage and the termination stage. A major drawback of the standard FA is its failure, in the termination stage, to reach the most optimal value: after a fixed number of iterations, no significant improvement can be observed in the solution quality. In this paper, this issue is resolved by introducing pattern search (PS) at the termination stage of the standard FA, when there is no further improvement in the solution quality. The proposed approach consists of three stages. In the first stage, the parameters of the standard FA are initialized. In the firefly position-changing stage, the randomization factor is used to update the solution in each iteration. In the final stage, the optimized values obtained from the FA over its maximum number of iterations are given as inputs to the pattern search algorithm, an optimization algorithm that further refines them. The proposed technique is named FA-PS, since PS is used to enhance the solution quality of the standard FA. The developed approach has been applied to various types of maximization and minimization functions, and its performance has been compared with the standard FA and a genetic algorithm in terms of reaching the most optimal values of the functions being considered. A significant improvement has been observed in the solution quality of FA.

29 citations
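The two stages combined in FA-PS can be sketched as follows: the standard firefly attraction move, and one polling step of pattern search of the kind used to refine the FA's final solution. This is an illustrative sketch, not the authors' exact implementation:

```python
import math
import random

def firefly_move(xi, xj, beta0=1.0, gamma=1.0, alpha=0.2):
    """Move firefly i toward a brighter firefly j; attractiveness decays with distance."""
    r2 = sum((a - b)**2 for a, b in zip(xi, xj))        # squared distance
    beta = beta0 * math.exp(-gamma * r2)                # attractiveness
    return [a + beta * (b - a) + alpha * (random.random() - 0.5)
            for a, b in zip(xi, xj)]

def pattern_search_poll(x, f, step):
    """One polling step of pattern search: try +/- step on each coordinate,
    keep the best improvement (for minimization)."""
    best, fbest = list(x), f(x)
    for i in range(len(x)):
        for d in (step, -step):
            cand = list(x)
            cand[i] += d
            fc = f(cand)
            if fc < fbest:
                best, fbest = cand, fc
    return best, fbest
```

In full pattern search the step size is halved whenever no poll direction improves, which gives the derivative-free local refinement that FA alone lacks near convergence.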


Journal ArticleDOI
TL;DR: The proposed HL-NBC method for sentiment analysis performs sentiment classification in an improved way and gives an accuracy of 82%, comparatively better than other methods, and achieves a 93% improvement in processing time for larger datasets.
Abstract: Twitter is a major micro-blogging service with millions of active users. These users use Twitter to post status messages called tweets and to share their opinions on various events using hash tags. Hence, Twitter is considered a major real-time streaming source and one of the most effective and accurate indicators of opinion. The amount of data generated by Twitter is huge, and it is difficult to scan the entire data manually. This paper proposes a Hybrid Lexicon-Naive Bayesian Classifier (HL-NBC) method for sentiment analysis. In addition, the sentiment analysis engine is preceded by topic classification, which classifies tweets into different categories and filters irrelevant tweets. The proposed method is compared with lexicon and Naive Bayesian classifiers for unigram and bigram features. Out of the different approaches, the proposed HL-NBC method performs sentiment classification in an improved way and gives an accuracy of 82%, which is comparatively better than the other methods. The sentiment analysis is also performed in a shorter time compared to traditional methods, achieving a 93% improvement in processing time for larger datasets.
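A minimal sketch of the hybrid lexicon-plus-Naive-Bayes idea, assuming a toy sentiment lexicon and unigram features (the actual HL-NBC lexicon, features and combination rule are not specified in the abstract):

```python
import math
from collections import Counter

# Hypothetical tiny lexicon; real systems use larger resources.
LEXICON = {"good": 1, "great": 1, "bad": -1, "terrible": -1}

def lexicon_score(tokens):
    return sum(LEXICON.get(t, 0) for t in tokens)

class NaiveBayes:
    def fit(self, docs, labels):
        self.classes = set(labels)
        self.prior = Counter(labels)
        self.counts = {c: Counter() for c in self.classes}
        for toks, y in zip(docs, labels):
            self.counts[y].update(toks)
        self.vocab = set(t for c in self.classes for t in self.counts[c])
        return self

    def predict(self, tokens):
        def logp(c):
            total = sum(self.counts[c].values())
            lp = math.log(self.prior[c] / sum(self.prior.values()))
            for t in tokens:
                # Laplace smoothing avoids zero probabilities for unseen words
                lp += math.log((self.counts[c][t] + 1) / (total + len(self.vocab)))
            return lp
        return max(self.classes, key=logp)

def hybrid_predict(nb, tokens):
    """Lexicon decides when it is confident; Naive Bayes handles the rest."""
    s = lexicon_score(tokens)
    if s > 0:
        return "pos"
    if s < 0:
        return "neg"
    return nb.predict(tokens)
```

The hybrid rule keeps the lexicon's speed on clear-cut tweets while falling back to the trained classifier on neutral or out-of-lexicon vocabulary.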

Journal ArticleDOI
TL;DR: An improved fast and robust fuzzy c-means segmentation algorithm is proposed in this research work to reduce noise in and smooth brain tumor magnetic resonance images.
Abstract: A novel modified adaptive sine cosine optimization algorithm (MASCA) integrated with particle swarm optimization (PSO) based local linear radial basis function neural network (LLRBFNN) model has been proposed for automatic brain tumor detection and classification. In the segmentation process, techniques based on the fuzzy c-means algorithm drastically fail to remove noise from magnetic resonance images. So, to reduce noise and smooth brain tumor magnetic resonance images, an improved fast and robust fuzzy c-means segmentation algorithm has been proposed in this research work. The gray level co-occurrence matrix technique has been employed to extract features from brain tumor magnetic resonance images, and the extracted features are fed as input to the proposed modified ASCA-PSO based LLRBFNN model for classification of benign and malignant tumors. In this research work the LLRBFNN model's weights are optimized using the proposed MASCA-PSO algorithm, which provides a unique solution and relieves the radiologist of the hectic task of manual detection. The classification accuracy results obtained from LLRBFNN models based on the sine cosine optimization algorithm, PSO, and the adaptive sine cosine optimization algorithm integrated with particle swarm optimization are compared with those of the proposed MASCA-PSO based LLRBFNN model. It is observed that the proposed model shows better classification accuracy than the other LLRBFNN based models.

Journal ArticleDOI
TL;DR: Computational results show that CSPSO outperforms other existing algorithms by obtaining the optimum solutions for most of the systems of nonlinear equations and 28 benchmark functions of CEC 2013, and reveals its efficacy in the comparison with other algorithms in the literature.
Abstract: In numerical computation, one of the most strenuous problems is solving systems of nonlinear equations. It is known that traditional numerical methods such as Newton methods and their variants require differentiability and/or a good initial guess for the solutions. In practice, it is difficult to obtain this initial solution and costly, in terms of time, to compute the Jacobian. Therefore, there is a need to develop an algorithm that avoids the requirements of these traditional methods. This study proposes a new hybrid algorithm that incorporates cuckoo search (CS) into particle swarm optimization (PSO), called CSPSO, for solving systems of nonlinear equations. The goal of the hybridization of CS and PSO is to combine the best attributes of the two algorithms into a single good-quality algorithm. The disadvantages are that CS requires a large number of function evaluations to reach the optimal solution, while PSO can become trapped in local minima. Our proposed hybrid algorithm attempts to overcome these disadvantages of CS and PSO. Computational experiments on nine benchmark systems of nonlinear equations and 28 benchmark functions of CEC 2013 with various dimensions are used to test the performance of CSPSO. Computational results show that CSPSO outperforms other existing algorithms by obtaining the optimum solutions for most of the systems of nonlinear equations and the 28 benchmark functions of CEC 2013, revealing its efficacy in comparison with other algorithms in the literature.
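The derivative-free reformulation that makes metaheuristics like CSPSO applicable is to minimize the squared residual norm of the system, which needs no Jacobian and no initial guess near a root. A sketch with a made-up two-equation example and a crude random search standing in for the metaheuristic:

```python
import random

def residual_norm2(F, x):
    """Merit function ||F(x)||^2: roots of the system are its global minima (value 0)."""
    return sum(fi**2 for fi in F(x))

# Example system: x^2 + y^2 = 4 and x*y = 1 (chosen for illustration).
F = lambda v: [v[0]**2 + v[1]**2 - 4, v[0]*v[1] - 1]

def random_search(F, bounds, iters=20000, seed=0):
    """Placeholder optimizer: any metaheuristic (CS, PSO, CSPSO) would slot in here."""
    rng = random.Random(seed)
    best, fbest = None, float("inf")
    for _ in range(iters):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        v = residual_norm2(F, x)
        if v < fbest:
            best, fbest = x, v
    return best, fbest
```

Any point driving the merit function to zero solves the original system; this is why population-based methods can attack systems where Newton iterations fail to start.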

Journal ArticleDOI
TL;DR: This paper presents a new optimization methodology called the movable damped wave algorithm for solving global optimization problems; it mathematically mimics the behavior of waveforms induced by oscillating phenomena and outperforms the comparative algorithms in most cases.
Abstract: This paper presents a new optimization methodology called the movable damped wave algorithm for solving global optimization problems. The proposed methodology mathematically mimics the behavior of waveforms induced by oscillating phenomena. It starts by creating multiple initial random solutions, which are updated through a mathematical model based on a damped wave function. In the proposed methodology, the solution updating mechanisms are based on a mathematical relation for the movable wave designed to achieve robust solutions effectively. Therefore, this methodology can be robust, statistically sound, and quick to converge to the global optimal solution. The performance of the proposed algorithm is validated on 23 benchmark problems and three engineering design problems. The results show clearly that the proposed algorithm is reliable and outperforms the comparative algorithms in most cases.

Journal ArticleDOI
TL;DR: The present paper deals with the tuning of the gains (Kp, Kd and Ki) of the proposed PID controller using two non-traditional global optimization algorithms, namely Particle Swarm Optimization and a variant of Invasive Weed Optimization called Modified Chaotic Invasive Weed Optimization (MCIWO), which is newly proposed by the authors.
Abstract: The design of an appropriate controller plays an important role in achieving dynamically balanced gaits of a biped robot. The present paper deals with the tuning of the gains (Kp, Kd and Ki) of the proposed PID controller using two non-traditional global optimization algorithms, namely Particle Swarm Optimization (PSO) and a variant of Invasive Weed Optimization (IWO) called Modified Chaotic Invasive Weed Optimization (MCIWO), which is newly proposed by the authors. The effectiveness of the newly proposed MCIWO algorithm has been verified on benchmark functions by conducting normality, parametric and non-parametric tests. Further, the developed MCIWO algorithm is used to design the optimal PID controller for the biped robot. Once the PID controllers are optimized, their performance is compared in terms of various performance measures of the biped robot. Finally, the gaits generated using the optimal PID controllers are tested on a real biped robot.
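What "tuning Kp, Ki and Kd" means can be sketched with a discrete PID loop driving a hypothetical first-order plant; the optimizer (PSO or MCIWO in the paper) would search the three gains to minimize a cost of the simulated response. The plant model and gain values below are illustrative only, not the biped dynamics:

```python
def pid_step(err, prev_err, integral, kp, ki, kd, dt):
    """One discrete PID update; returns the control signal and the updated integral."""
    integral += err * dt
    deriv = (err - prev_err) / dt
    return kp*err + ki*integral + kd*deriv, integral

def simulate(kp, ki, kd, steps=200, dt=0.05, setpoint=1.0):
    """Close the loop around a toy first-order plant x' = -x + u and
    return the final state (should approach the setpoint for decent gains)."""
    x, integral, prev_err = 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - x
        u, integral = pid_step(err, prev_err, integral, kp, ki, kd, dt)
        prev_err = err
        x += dt * (-x + u)      # forward-Euler step of the plant
    return x
```

A tuner would wrap `simulate` in a cost such as integral absolute error and let the metaheuristic search (kp, ki, kd); the integral term is what removes the steady-state error here.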

Journal ArticleDOI
TL;DR: In this paper, CRO based algorithms for some well-known optimization problems are reviewed to show the robustness of the CRO algorithm.
Abstract: Chemical Reaction Optimization (CRO) is a recently established population-based metaheuristic for optimization problems, inspired by the natural behavior of chemical reactions. Optimization is a way of ensuring that resources and related technologies are used in the best possible way. We encounter optimization problems in our daily lives, and some problems are so hard that we can, at best, approximate the best solutions with heuristic or metaheuristic methods. The CRO algorithm inherits several features from other metaheuristics such as Simulated Annealing and Genetic Algorithms. After its invention, it was successfully applied to various optimization problems previously solved by other metaheuristic algorithms. The robustness of the CRO algorithm was demonstrated when comparisons with other evolutionary algorithms such as Particle Swarm Optimization, Genetic Algorithms, Simulated Annealing, Ant Colony Optimization, Tabu Search and Bee Colony Optimization showed superior results. As a result, the CRO algorithm has come to be used for solving problems in different fields of optimization. In this paper, we review CRO based algorithms for some well-known optimization problems. A brief description of the variants of the CRO algorithm will help readers understand its diversified quality. For the different problems where CRO algorithms were used, a study of the parameters and the experimental results are included to show the robustness of the CRO algorithm.

Journal ArticleDOI
TL;DR: In this study, 55 important risk factors causing cost overrun in Indian construction projects are identified through an intensive literature review and expert opinion, and a new fuzzy based model is proposed to estimate the risk magnitude.
Abstract: Cost is considered one of the most important parameters for the success of any construction project. Therefore the risk factors causing cost overrun in the construction industry should be assessed. In this study, 55 important risk factors causing cost overrun in Indian construction projects are identified through an intensive literature review and expert opinion. A new fuzzy based model has been proposed to estimate the risk magnitude of these factors, as fuzzy theory has the potential to deal with the vagueness, uncertainty and subjective nature of such problems and can handle the ambiguity found in complex construction projects. In order to assess the risk factors causing cost overrun, a probability index and a severity index are considered. A new cost overrun factor index, namely the fuzzy index for cost overrun, is calculated, which indicates the risk magnitude of a given factor. The applicability of the model is shown by an example: the risk magnitude for the factor "fluctuation in price material" is determined by collecting data from experts in the Indian construction industry. On the basis of these risk magnitudes, the importance levels of the factors are assessed. The top ten factors causing cost overrun in the Indian construction industry are recognised as fluctuation in price material, lowest bid procurement policy, inflation, inappropriate government policy, mistakes and discrepancies in the contract document, inaccurate time and cost estimates, additional work, frequent design change, unrealistic contract duration and financial difficulty faced by contractors.

Journal ArticleDOI
TL;DR: An overview of the main concepts and supervised algorithms of incremental learning is presented, including a synthesis of research studies done in this field, focusing on neural networks, decision trees and support vector machines.
Abstract: The most effective well-known methods in static machine learning offer no mechanism for evolution or dynamic adaptation to integrate new data or to restructure problems already partially learned. Here, incremental learning represents an interesting alternative and constitutes an open research field, having become one of the major concerns of the machine learning and classification community. In this paper, we study incremental supervised learning techniques and their applications, especially in the field of pattern recognition. This article presents an overview of the main concepts and supervised algorithms of incremental learning, including a synthesis of research studies done in this field, focusing on neural networks, decision trees and support vector machines.
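A classic concrete instance of incremental supervised learning is the online perceptron: it updates from one example at a time and never needs to revisit old data, which is exactly the property static batch learners lack. A minimal sketch:

```python
def perceptron_update(w, b, x, y, lr=1.0):
    """Single incremental update for one example; y must be -1 or +1.
    The model changes only when the example is misclassified."""
    if y * (sum(wi*xi for wi, xi in zip(w, x)) + b) <= 0:   # mistake-driven rule
        w = [wi + lr*y*xi for wi, xi in zip(w, x)]
        b = b + lr*y
    return w, b

def train_stream(stream, dim, epochs=5):
    """Feed examples one at a time, as an incremental learner would see them."""
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in stream:
            w, b = perceptron_update(w, b, x, y)
    return w, b
```

For linearly separable streams the perceptron convergence theorem bounds the total number of updates, so the model stabilizes after finitely many mistakes regardless of how long the stream runs.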

Journal ArticleDOI
TL;DR: A parallel metaheuristic framework based on moth-flame optimization (MFO), clustering and pre-processed datasets is presented to solve the link prediction problem; experimental results show that the PMFO-LP algorithm outperforms other well-regarded algorithms in terms of error rate, area under the curve and speedup.
Abstract: Providing a solution to the link prediction problem attracts several computer science fields and has become a popular research challenge. The challenge is addressed by several approaches that aim to provide the most precise prediction quality within a short period of time. The difficulty of the link prediction problem comes from the sparse nature of most complex networks, such as social networks. This paper presents a parallel metaheuristic framework based on moth-flame optimization (MFO), clustering and pre-processed datasets to solve the link prediction problem. The framework is implemented and tested on a high-performance computing cluster and applied to large and complex networks from different fields, including social, citation, biological, information and publication networks. The framework is called Parallel MFO for Link Prediction (PMFO-LP) and is composed of a data preprocessing stage and a prediction stage. Dataset division with stratified sampling, feature extraction, data under-sampling and feature selection are performed in the data preprocessing stage. In the prediction stage, clustering-based MFO is used as the prediction optimizer. PMFO-LP provides a solution to the link prediction problem with more accurate prediction results within a reasonable amount of time. Experimental results show that the PMFO-LP algorithm outperforms other well-regarded algorithms in terms of error rate, area under the curve and speedup. The source code of the PMFO-LP algorithm is available at https://github.com/RehamBarham/PMFO_MPI.cpp .
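Typical topological features extracted in the preprocessing stage of link predictors are neighborhood-overlap scores between candidate node pairs; two standard ones are sketched below as an illustration (the abstract does not list PMFO-LP's exact feature set):

```python
def neighbors(adj, u):
    """adj maps each node to the set of its neighbors."""
    return adj.get(u, set())

def common_neighbors(adj, u, v):
    """Raw overlap: how many neighbors u and v share."""
    return len(neighbors(adj, u) & neighbors(adj, v))

def jaccard(adj, u, v):
    """Overlap normalized by the size of the combined neighborhood."""
    n_u, n_v = neighbors(adj, u), neighbors(adj, v)
    union = n_u | n_v
    return len(n_u & n_v) / len(union) if union else 0.0
```

Scores like these become the columns of the feature matrix that the prediction stage (here, clustering-based MFO) then operates on; pairs with high scores are more likely to form future links.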

Journal ArticleDOI
TL;DR: An ant colony optimization approach based on a gradual weight generation method, named Gw-ACO, is proposed to solve the MOMKP; it enables ants to target different regions at each cycle so as to reach almost all solutions covering the Pareto front.
Abstract: The multiobjective multidimensional knapsack problem (MOMKP) is an extension of the multiobjective knapsack problem that consists in selecting a subset of items in order to maximize m objective functions. The MOMKP is more difficult than the monodimensional version because more than one constraint must be respected simultaneously. In this paper, we propose to solve the MOMKP with an ant colony optimization approach based on a gradual weight generation method, named Gw-ACO. Here, the weight vectors are gradually distributed in the objective space and change as the optimization process advances. This enables ants to target different regions at each cycle, so as to try to reach almost all solutions covering the Pareto front. To evaluate the suggested Gw-ACO approach, a set of experiments is performed on MOMKP benchmark instances and compared with well-known state-of-the-art metaheuristic approaches. The experimental results show that Gw-ACO is significantly better and able to achieve a good distribution over the Pareto-optimal front.
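The weight-vector idea behind such decomposition schemes can be illustrated for two objectives: vectors evenly spread over the simplex, each turning a solution's objective vector into a single scalar score. This is a generic sketch, not Gw-ACO's exact gradual-generation schedule:

```python
def weight_vectors(m, steps):
    """Evenly spaced weight vectors summing to 1 (sketch covers m = 2 objectives)."""
    assert m == 2, "sketch covers the bi-objective case only"
    return [(k / steps, 1 - k / steps) for k in range(steps + 1)]

def scalarize(obj_values, w):
    """Weighted-sum scalarization of one solution's objective vector."""
    return sum(wi * vi for wi, vi in zip(w, obj_values))
```

Each weight vector steers the search toward a different region of the Pareto front; sweeping the whole set is what produces the front coverage the abstract describes.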

Journal ArticleDOI
TL;DR: This paper proposes an efficient sentiment analysis classifier using convolutional neural networks by analyzing the impact of the hyper-parameters on model performance; the different configurations exceed the reference models in most cases and perform similarly to state-of-the-art models, with gains of up to 2% in some cases.
Abstract: Convolutional neural networks are known for their excellent performance in computer vision, achieving state-of-the-art results. Moreover, recent research has shown that these networks can also provide promising results for natural language processing. In this case, the basic idea is to concatenate the vector representations of words into a single block and use it as an image. However, despite the good results, the problem with using convolutional networks is the large number of design decisions that need to be made a priori. These models require the definition of many hyper-parameters, including the type of word embeddings, which provide the vectorized representation of the data; the activation function, which gives the model its non-linear characteristics; the size of the filter that performs the convolution; the number of feature maps, which are responsible for identifying the attributes; and the pooling method used for data reduction. In addition, one must also predefine the regularization constant and the dropout rate, which are responsible for avoiding network over-fitting. Existing research presents convolutional neural network architectures capable of overcoming the performance of traditional machine learning models. Even though these can compete with more complex models, how different settings of the hyper-parameters may affect the performance of this type of network has not yet been explored. In this paper, we propose an efficient sentiment analysis classifier using convolutional neural networks by analyzing the impact of the hyper-parameters on model performance. The main interest in analyzing sentiment comes from the advent of social media and the technological advances that flood the Internet with opinions. Nonetheless, mining the Internet for opinion and sentiment analysis is not an easy task, and thus demands strong models with the best hyper-parameter settings to obtain pertinent answers. The results, obtained with the use of a GPU, show that the different configurations exceed the reference models in most cases, with gains of up to 18%, and perform on par with state-of-the-art models, with gains of up to 2% in some cases.
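To make the role of these hyper-parameters concrete, the following toy sketch (plain Python, with made-up dimensions and randomly initialized weights rather than the paper's trained models) shows where the embedding dimensionality, filter size, number of feature maps, ReLU activation and max-pooling each enter a convolution-over-text forward pass:

```python
import random

random.seed(0)

EMBED_DIM = 4        # word-embedding dimensionality (hyper-parameter)
FILTER_SIZE = 3      # words covered by each filter (hyper-parameter)
NUM_FILTERS = 2      # number of feature maps (hyper-parameter)

def embed(tokens, table):
    """Look up (or lazily create) a random embedding vector per token."""
    for t in tokens:
        if t not in table:
            table[t] = [random.uniform(-1, 1) for _ in range(EMBED_DIM)]
    return [table[t] for t in tokens]

def conv_max_pool(vectors, filters):
    """Slide each filter over windows of FILTER_SIZE words, apply ReLU,
    then max-pool each feature map down to a single scalar."""
    features = []
    for f in filters:
        activations = []
        for i in range(len(vectors) - FILTER_SIZE + 1):
            window = [x for vec in vectors[i:i + FILTER_SIZE] for x in vec]
            pre = sum(w * x for w, x in zip(f, window))
            activations.append(max(0.0, pre))   # ReLU non-linearity
        features.append(max(activations))       # 1-max pooling
    return features

table = {}
tokens = "the movie was surprisingly good".split()
vectors = embed(tokens, table)
filters = [[random.uniform(-1, 1) for _ in range(FILTER_SIZE * EMBED_DIM)]
           for _ in range(NUM_FILTERS)]
features = conv_max_pool(vectors, filters)
print(len(features))   # one pooled value per feature map
```

Changing FILTER_SIZE, NUM_FILTERS or the pooling rule here is exactly the kind of configuration choice whose effect on accuracy the paper studies.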

Journal ArticleDOI
TL;DR: A self-adaptive mutation factor and cross-over probability based differential evolution (SA-MCDE) algorithm is proposed for the LSL problem, improving localization accuracy while maintaining a high convergence speed.
Abstract: Node localization or positioning is essential for many position-aware protocols in a wireless sensor network. The classical global positioning system used for node localization is limited by its high cost and its unavailability in indoor environments. Hence, several localization algorithms have been proposed in the recent past to improve localization accuracy and reduce implementation cost. One popular approach is to formulate localization as a least-square localization (LSL) problem. When optimizing the LSL problem, the performance of the classical Gauss–Newton method is limited because it can be trapped by local minima. By contrast, the differential evolution (DE) algorithm achieves high localization accuracy because it is able to find the global optimal solution to the LSL problem. However, the convergence speed of the conventional DE algorithm is low, as it uses fixed values of the mutation factor and cross-over probability. Thus, in this paper, a self-adaptive mutation factor and cross-over probability based differential evolution (SA-MCDE) algorithm is proposed for the LSL problem to improve convergence speed. The SA-MCDE algorithm adaptively adjusts the mutation factor and cross-over probability in each generation to better explore and exploit the global optimal solution. Improved localization accuracy with high convergence speed is therefore expected from the SA-MCDE algorithm. Rigorous simulation results for several localization algorithms show that the proposed SA-MCDE based localization achieves about 40–90% higher localization accuracy than the classical techniques.
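The abstract does not give SA-MCDE's exact adaptation rule, so the sketch below uses a jDE-style scheme as a stand-in: each individual carries its own mutation factor F and cross-over probability CR, which are occasionally re-sampled and retained only when the resulting trial vector survives selection. A sphere function stands in for the least-square localization residual:

```python
import random

random.seed(1)

DIM, POP, GENS = 2, 20, 200

def sphere(x):
    """Stand-in objective; a real LSL fitness would sum squared
    range-measurement residuals instead."""
    return sum(v * v for v in x)

pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
F = [0.5] * POP    # per-individual mutation factor
CR = [0.9] * POP   # per-individual cross-over probability

for _ in range(GENS):
    for i in range(POP):
        # Self-adaptation: occasionally re-sample the control parameters.
        Fi = random.uniform(0.1, 1.0) if random.random() < 0.1 else F[i]
        CRi = random.random() if random.random() < 0.1 else CR[i]
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [a[d] + Fi * (b[d] - c[d]) for d in range(DIM)]
        jrand = random.randrange(DIM)
        trial = [mutant[d] if (random.random() < CRi or d == jrand)
                 else pop[i][d] for d in range(DIM)]
        if sphere(trial) <= sphere(pop[i]):   # greedy selection
            pop[i], F[i], CR[i] = trial, Fi, CRi  # keep successful parameters

best = min(pop, key=sphere)
print(sphere(best))
```

Because F and CR survive only when they produce winning trial vectors, the search self-tunes its explore/exploit balance per generation, which is the mechanism the paper credits for the improved convergence speed.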

Journal ArticleDOI
TL;DR: This survey paper provides an overview of the clustering techniques for reducing energy consumption by reviewing several CH selection techniques in WSN that provide high energy efficiency.
Abstract: Recently, wireless sensor networks (WSNs) have become very popular, as they are inexpensive and easy to maintain and manage. The network contains a group of sensor nodes, which are capable of sensing, computing, and transmitting. Energy efficiency is one of the most challenging problems in WSN. Sensor nodes have limited energy and are often installed in remote areas, so it is difficult to replace their batteries. Therefore, to maximize the network lifetime, appropriate clustering techniques and cluster head (CH) selection methods should be implemented. The main idea behind the clustering technique is that it groups the sensor nodes into clusters, reduces the collected data, and then broadcasts the data. In this process, CH selection is an essential part. Therefore, this survey paper provides an overview of clustering techniques for reducing energy consumption by reviewing several CH selection techniques in WSN that provide high energy efficiency. Several techniques have been employed for CH selection based on partitional clustering, optimization, low-energy adaptive clustering hierarchy, hierarchical, distributed, and other classification methods. Finally, an analysis is presented based on the implementation tools, metrics employed, accuracy, and achievements of the considered CH selection techniques.
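Among the surveyed families, the low-energy adaptive clustering hierarchy (LEACH) election rule is easy to state: in round r, a node that has not served as CH in the last 1/P rounds becomes CH with probability P / (1 - P * (r mod 1/P)), where P is the desired CH fraction. A toy sketch of that rule (node count and P here are made up, not from the survey):

```python
import random

random.seed(2)

P = 0.1      # desired fraction of cluster heads per round
NODES = 100
last_ch_round = {n: -10**9 for n in range(NODES)}  # never served yet

def elect_cluster_heads(r):
    """One LEACH-style election round: rotate the CH role so no node
    drains its battery by serving twice within 1/P rounds."""
    heads = []
    for n in range(NODES):
        if r - last_ch_round[n] < 1 / P:   # recently served as CH: skip
            continue
        threshold = P / (1 - P * (r % (1 / P)))
        if random.random() < threshold:
            last_ch_round[n] = r
            heads.append(n)
    return heads

heads = elect_cluster_heads(0)
print(len(heads))   # roughly P * NODES heads in expectation
```

The rotation is what spreads the energy cost of aggregation and long-range transmission evenly across the network, which is the lifetime argument the surveyed techniques build on.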

Journal ArticleDOI
TL;DR: A hybrid model that combines a genetic algorithm and heuristics such as remove-sharp and local-opt with the ant colony system (ACS) has been implemented to speed up convergence, exploit positive feedback, and reduce the search space, generating efficient solutions for complex problems.
Abstract: In this paper, a hybrid model that combines a genetic algorithm and heuristics such as remove-sharp and local-opt with the ant colony system (ACS) has been implemented to speed up convergence, exploit positive feedback, and reduce the search space, generating efficient solutions for complex problems. This model is validated with the well-known travelling salesman problem (TSP). Finally, performance and complexity analyses show that the proposed nested hybrid ACS has a faster convergence rate towards the optimal solution than other standard existing algorithms, such as exact and approximation algorithms. Standard TSP instances from the TSP library are also tested, with satisfactory results.
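As a minimal illustration of the ACS core that the hybrid builds on (the GA operators and the remove-sharp/local-opt heuristics are omitted), the sketch below constructs tours biased by pheromone and inverse distance on a random toy instance, then reinforces the best tour found; all parameter values are illustrative, not the paper's:

```python
import math
import random

random.seed(3)

cities = [(random.random(), random.random()) for _ in range(8)]
n = len(cities)
dist = [[math.dist(a, b) for b in cities] for a in cities]
tau = [[1.0] * n for _ in range(n)]     # pheromone matrix
ALPHA, BETA, RHO = 1.0, 2.0, 0.1        # illustrative ACS parameters

def tour_length(tour):
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def build_tour():
    """One ant: pick each next city with probability proportional to
    pheromone^ALPHA * (1/distance)^BETA."""
    tour = [random.randrange(n)]
    while len(tour) < n:
        cur = tour[-1]
        cand = [c for c in range(n) if c not in tour]
        weights = [tau[cur][c] ** ALPHA * (1.0 / dist[cur][c]) ** BETA
                   for c in cand]
        tour.append(random.choices(cand, weights=weights)[0])
    return tour

best = min((build_tour() for _ in range(50)), key=tour_length)
for i in range(n):                      # deposit pheromone on the best tour
    a, b = best[i], best[(i + 1) % n]
    tau[a][b] = (1 - RHO) * tau[a][b] + RHO / tour_length(best)

print(tour_length(best))
```

The pheromone update is the positive-feedback mechanism the abstract refers to: good edges attract more ants in later iterations, which the hybrid then accelerates with GA recombination and local improvement.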

Journal ArticleDOI
TL;DR: Results obtained by the proposed ensemble are compared with popular FS models such as the gravitational search algorithm, histogram based multi objective GA, GA, BPSO and ACO, and the comparison shows that the proposed algorithm outperforms the others.
Abstract: Feature selection (FS) is an integral part of many machine learning problems, providing a better and more time-efficient classification model. In recent times, many new FS algorithms have been proposed that combine well-established algorithms to overcome the drawbacks of the constituent algorithms. The usual approach is to let the constituent algorithms operate consecutively or simultaneously. In many cases, these rudimentary combinations do not properly retain the advantages of the individual algorithms, which necessitates an alternative approach to combination. We first allow the algorithms to run uninterrupted and generate their results. After selecting the most dominant features, the rest of the combination is done using the concept of a histogram, assigning a weight to the fuzzy features based on the quality of the candidate solutions in which they appear. In the proposed method, the outcomes of three popular algorithms with complementary exploitation–exploration trade-offs, namely the genetic algorithm (GA), binary particle swarm optimisation (BPSO) and ant colony optimisation (ACO), are combined. Then, 14 popular UCI datasets are used to evaluate the proposed FS method. Results obtained by the proposed ensemble are compared with popular FS models such as the gravitational search algorithm, histogram based multi objective GA, GA, BPSO and ACO, showing that our algorithm outperforms the others.
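The combination step can be illustrated with a toy sketch: each constituent algorithm returns a binary feature mask together with the fitness of its best solution, features accumulate fitness-weighted votes (the histogram), and the top-k scoring features form the ensemble subset. The masks, fitness values and k below are invented for illustration and are not taken from the paper:

```python
# Hypothetical best masks and fitnesses from the three constituents.
masks = {
    "GA":   ([1, 0, 1, 1, 0, 0], 0.90),
    "BPSO": ([1, 1, 0, 1, 0, 1], 0.85),
    "ACO":  ([0, 1, 1, 1, 0, 0], 0.80),
}

n_features = 6
scores = [0.0] * n_features
for mask, fitness in masks.values():
    for f, selected in enumerate(mask):
        scores[f] += fitness * selected   # weight votes by solution quality

k = 3
top_k = sorted(range(n_features), key=lambda f: -scores[f])[:k]
print(sorted(top_k))
```

A feature chosen by several high-fitness solutions outranks one chosen only by a weak solution, which is how the histogram preserves each constituent algorithm's strengths instead of simply intersecting or unioning their outputs.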

Journal ArticleDOI
TL;DR: The research results can be applied to the interactive mode practice of the intangible cultural heritage center, and provide an effective interactive mode for the intangiblecultural heritage in the form of augmented reality interaction.
Abstract: The current predicament of intangible cultural heritage lies in its vulnerability and rarity, and in an expression that is difficult to construct in the space-time dimension. It is difficult for visitors to really understand and touch these local intangible cultural heritages. For this reason, the need to develop new methodologies for cultural heritage learning and art appreciation is always present, and this is increasingly supported by digital technologies. The research in this paper applies the self-issuance method to the folk intangible cultural heritage center. It differs from the traditional AR method of presenting crafts directly through an HMD. Instead, low-cost hardware, augmented reality technology and Unity 3D provide a new way of interaction. By using an HMD, a smartphone, Leap Motion and software created in Unity 3D, the augmented reality application is realized. It allows users to interact with the intangible cultural heritage easily and experience it fully in a virtual physical environment. Based on the principle of natural human interaction, this can be achieved through structured-light target image scanning and the design and experimentation of photographic measurement. A local intangible cultural heritage model database can then be established to provide virtual prototype files. The experimental results show that the data transmission is effective and that three-dimensional interaction between people and exhibits can be realized. The research results can be applied to the interactive practice of the intangible cultural heritage center, and provide an effective interactive mode for intangible cultural heritage in the form of augmented reality interaction. In this way, visitors can gain a deep impression of the intangible cultural heritage, and the culture can be spread widely all over the world.

Journal ArticleDOI
TL;DR: This paper identifies a set of common features among some well-known swarm-based algorithms and shows how each of these approaches implements them, providing the community with the core features of swarm-intelligence algorithms.
Abstract: The literature is now filled with swarm intelligence algorithms developed by taking inspiration from a number of insects and other animals and phenomena, such as ants, termites, bees, fishes and cockroaches, to name just a few. Many, if not most, of these bioinspirations carry with them some common issues and features which happen at the individual level, promoting very similar collective emergent phenomena. Thus, despite using different biological metaphors as inspiration, most algorithms present a similar structure and it is possible to identify common macro-processes among them. In this context, this paper identifies a set of common features among some well-known swarm-based algorithms and how each of these approaches implement them. By doing this, we provide the community with the core features of swarm-intelligence algorithms. This diagnostic is crucial and timely to the field, because once we are able to list and explain these commonalities, we are also able to better analyze and design swarm intelligence algorithms.
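The shared structure the paper describes (initialize a population, evaluate, exchange social information, then move each agent by blending private memory with the group's) can be made concrete with one generic loop; the update rule below instantiates the template in PSO style, with illustrative constants and a sphere objective that are not taken from the paper:

```python
import random

random.seed(4)

def sphere(x):
    return sum(v * v for v in x)

DIM, SWARM, ITERS = 2, 15, 100
pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]               # individual memory
gbest = min(pbest, key=sphere)[:]         # shared social memory

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(DIM):
            # Move: inertia + pull toward own memory + pull toward group's.
            vel[i][d] = (0.7 * vel[i][d]
                         + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                         + 1.5 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if sphere(pos[i]) < sphere(pbest[i]):    # update individual memory
            pbest[i] = pos[i][:]
            if sphere(pbest[i]) < sphere(gbest): # update social memory
                gbest = pbest[i][:]

print(sphere(gbest))
```

Swapping the velocity update for a pheromone rule or a firefly attraction term changes the metaphor but not the macro-processes, which is precisely the commonality the paper argues for.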

Journal ArticleDOI
TL;DR: In this article, the relation between robustness to mutations, phenotypic complexity, and evolvability in the context of artificial circuits evolved for the ability to solve a parity problem was analyzed.
Abstract: We analyze the relation between robustness to mutations, phenotypic complexity, and evolvability in the context of artificial circuits evolved for the ability to solve a parity problem. We demonstrate that whether robustness to mutations enhances or diminishes phenotypic variability and evolvability depends on whether robustness is achieved through the development of parsimonious (phenotypically simple) solutions, that minimize the number of genes playing functional roles, or through phenotypically more complex solutions, capable of buffering the effect of mutations. We show that the characteristics of the selection process strongly influence the robustness and the performance of the evolving candidate solutions. Finally, we propose a new evolutionary method that outperforms evolutionary algorithms commonly used in this domain.

Journal ArticleDOI
TL;DR: In the presented survey, several papers are taken for analyzing the performance of sisal fibers, and the contributions regarding the tensile strength and compression strength in the adopted papers are analyzed together with their percentage of composition.
Abstract: Sisal fiber cement is comprehensively deployed in construction works owing to its flexibility, in cladding panels, rigid equipment, and water containers that are used in a massive number of cultivation and construction applications. The most important reason for integrating sisal fibers into the cement matrix is to increase the toughness, tensile strength and bending characteristics of the resulting composite. In recent times, sisal fibers have been employed as reinforcement in concretes. These cementitious composites are presently considered to be among the most promising structural materials in modern industrial technology. Accordingly, in the presented survey, several papers are taken for analyzing the performance of sisal fibers. In addition, the papers taken for review are studied depending on the composition of concrete and sisal constituents in building works. Moreover, sisal fiber with compositions other than concrete is also illustrated. The contributions regarding tensile strength and compression strength in the adopted papers are analyzed together with their percentage of composition. The evolution of the adopted papers along with their various applications is moreover analyzed in detail.

Journal ArticleDOI
TL;DR: An ACRO algorithm is adopted to solve partitional clustering problems; because this algorithm suffers from a slow convergence rate and sometimes becomes stuck in local optima, two additional operators are incorporated, and the performance of the proposed algorithm is tested over well-known clustering datasets.
Abstract: In the field of engineering, heuristic algorithms are widely adopted to solve a variety of optimization problems, and have proven their efficacy over classical algorithms. Chemical reactions can be seen as an efficient computational procedure for designing a new product: the formation of a new product involves a number of objects, states, events and well-defined procedural steps. A meta-heuristic algorithm inspired by chemical reactions has been developed, called the artificial chemical reaction optimization (ACRO) algorithm. In this work, the ACRO algorithm is adopted to solve partitional clustering problems. However, this algorithm suffers from a slow convergence rate and sometimes becomes stuck in local optima. To handle these problems, two operators are incorporated into the ACRO algorithm. The performance of the proposed algorithm is tested over well-known clustering datasets. The simulation results confirm that the proposed ACRO algorithm is an effective and competitive algorithm for solving partitional clustering problems.
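The optimization view of partitional clustering that ACRO operates on can be sketched as follows: a candidate solution is a set of K centroids, and its fitness is the sum of squared errors (SSE) from each point to its nearest centroid. A naive random search stands in for the chemical-reaction operators here, and the two-cluster data set is synthetic:

```python
import random

random.seed(5)

def sse(points, centroids):
    """Fitness of a candidate solution: sum of squared distances from
    each point to its nearest centroid (lower is better)."""
    total = 0.0
    for p in points:
        total += min(sum((a - b) ** 2 for a, b in zip(p, c))
                     for c in centroids)
    return total

# Synthetic data: two Gaussian blobs around (0, 0) and (3, 3).
points = ([(random.gauss(0, 0.2), random.gauss(0, 0.2)) for _ in range(20)]
          + [(random.gauss(3, 0.2), random.gauss(3, 0.2)) for _ in range(20)])
K = 2
best = [(random.uniform(-1, 4), random.uniform(-1, 4)) for _ in range(K)]
for _ in range(500):   # placeholder search loop for the ACRO operators
    cand = [(random.uniform(-1, 4), random.uniform(-1, 4)) for _ in range(K)]
    if sse(points, cand) < sse(points, best):
        best = cand

print(round(sse(points, best), 2))
```

An ACRO-style search would replace the random-candidate line with reaction operators that transform existing candidate solutions, keeping the same SSE fitness; the convergence and local-optima issues the paper addresses show up in how those operators navigate this landscape.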