Journal ArticleDOI

Repository and Mutation based Particle Swarm Optimization (RMPSO): A new PSO variant applied to reconstruction of Gene Regulatory Network

01 Jan 2019-Applied Soft Computing (Elsevier)-Vol. 74, pp 330-355
TL;DR: RMPSO is applied to a practical scenario: the reconstruction of Gene Regulatory Networks (GRN) based on the Recurrent Neural Network (RNN) model. The experimental results show that RMPSO performs better than state-of-the-art methods on synthetic (gold-standard) gene data sets as well as on real gene data sets.
About: This article was published in Applied Soft Computing on 2019-01-01 and has received 40 citations to date. It focuses on the topics of particle swarm optimization and swarm intelligence.
Citations
Journal ArticleDOI
TL;DR: A rigorous yet systematic review is presented to organize and summarize the information on the PSO algorithm and the developments and trends of its most basic as well as of some of the very notable implementations that have been introduced recently, bearing in mind the coverage of paradigm, theory, hybridization, parallelization, complex optimization, and the diverse applications of the algorithm.
Abstract: Over the ages, nature has constantly been a rich source of inspiration for science, with much still to discover about and learn from. Swarm Intelligence (SI), a major branch of artificial intelligence, was rendered to model the collective behavior of social swarms in nature. Ultimately, Particle Swarm Optimization algorithm (PSO) is arguably one of the most popular SI paradigms. Over the past two decades, PSO has been applied successfully, with good return as well, in a wide variety of fields of science and technology with a wider range of complex optimization problems, thereby occupying a prominent position in the optimization field. However, through in-depth studies, a number of problems with the algorithm have been detected and identified; e.g., issues regarding convergence, diversity, and stability. Consequently, since its birth in the mid-1990s, PSO has witnessed a myriad of enhancements, extensions, and variants in various aspects of the algorithm, specifically after the twentieth century, and the related research has therefore now reached an impressive state. In this paper, a rigorous yet systematic review is presented to organize and summarize the information on the PSO algorithm and the developments and trends of its most basic as well as of some of the very notable implementations that have been introduced recently, bearing in mind the coverage of paradigm, theory, hybridization, parallelization, complex optimization, and the diverse applications of the algorithm, making it more accessible and easier for researchers to determine which PSO variant is currently best suited, or has yet to be invented, for a given optimization problem or application. This up-to-date review also highlights the current pressing issues and intriguing open challenges haunting PSO, prompting scholars and researchers to conduct further research both on the theory and application of the algorithm in the forthcoming years.

169 citations

Journal ArticleDOI
21 Mar 2020-Entropy
TL;DR: A systematic literature review about variants and improvements of the Particle Swarm Optimisation algorithm and its extensions to other classes of optimisation problems, taking into consideration the most important ones is made.
Abstract: The Particle Swarm Optimisation (PSO) algorithm was inspired by the social and biological behaviour of bird flocks searching for food sources. In this nature-based algorithm, individuals are referred to as particles and fly through the search space searching for the global best position that minimises (or maximises) a given problem. Today, PSO is one of the most well-known and widely used swarm intelligence algorithms and metaheuristic techniques, because of its simplicity and ability to be used in a wide range of applications. However, in-depth studies of the algorithm have led to the detection and identification of a number of problems with it, especially convergence problems and performance issues. Consequently, a myriad of variants, enhancements and extensions to the original version of the algorithm, developed and introduced in the mid-1990s, have been proposed, especially in the last two decades. In this article, a systematic literature review about those variants and improvements is made, which also covers the hybridisation and parallelisation of the algorithm and its extensions to other classes of optimisation problems, taking into consideration the most important ones. These approaches and improvements are appropriately summarised, organised and presented, in order to allow and facilitate the identification of the most appropriate PSO variant for a particular application.

94 citations


Cites methods from "Repository and Mutation based Parti..."

  • ...For example, the Cauchy mutation operator can be implemented as follows [93]:...

    [...]
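The quoted passage refers to a Cauchy mutation operator. As a minimal sketch of such an operator (not the paper's exact formulation — the `scale` and bound parameters below are illustrative assumptions), a standard Cauchy deviate can be drawn via the inverse-CDF method and added to each coordinate:

```python
import random
import math

def cauchy_mutation(position, scale=1.0, lower=-10.0, upper=10.0):
    """Mutate a particle's position by adding Cauchy-distributed noise.

    `scale`, `lower`, and `upper` are illustrative parameters, not values
    taken from the paper.
    """
    mutated = []
    for x in position:
        # Standard Cauchy deviate via the inverse-CDF method:
        # C = tan(pi * (U - 0.5)) for U ~ Uniform(0, 1).
        c = math.tan(math.pi * (random.random() - 0.5))
        # Clamp the mutated coordinate back into the search bounds.
        mutated.append(min(max(x + scale * c, lower), upper))
    return mutated
```

The heavy tails of the Cauchy distribution occasionally produce large jumps, which is why such mutations are often used to help particles escape local optima.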

Journal ArticleDOI
TL;DR: A new selection method based on fitness-distance balance (FDB) is developed in order to solve the premature convergence problem in the MHS process and makes a significant contribution to the meta-heuristic search process.
Abstract: Selection methods have an important role in the meta-heuristic search (MHS) process. However, apart from a few successful methods developed in the past, new and effective studies have not been found in recent years. It is known that solution candidates selecting from the population during the search process directly affects the direction and success of the search. In this study, a new selection method based on fitness-distance balance (FDB) is developed in order to solve the premature convergence problem in the MHS process. Thanks to the developed method, solution candidates with the highest potential to improve the search process can be determined effectively and consistently from the population. Experimental studies have been conducted to test and verify the developed FDB selection method. For this purpose, 90 benchmark functions with different types and complexity levels have been used. In order to test the developed FDB method, numerous variants have been formed. These variants have been compared to each other to determine the most effective FDB variant. In the validation study, the FDB-SOS (FDB-based symbiotic organism search) algorithm is compared with thirteen well-known and up-to-date MHS techniques. The search performance of the algorithms has been analyzed by the Wilcoxon Rank Sum Test. The results show that the developed selection method makes a significant contribution to the meta-heuristic search process.
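As a rough illustration of the fitness-distance balance idea (a sketch under assumptions, not the paper's exact formulation), an FDB-style score can combine each candidate's normalized fitness with its normalized Euclidean distance to the current best solution, so that selection favors candidates that are both good and diverse:

```python
import math

def fdb_scores(population, fitness, minimize=True):
    """Illustrative fitness-distance balance (FDB) scoring sketch.

    Each candidate gets a score combining (i) its normalized fitness and
    (ii) its normalized distance to the current best solution; candidates
    with higher scores are favored during selection.
    """
    # Index of the best solution in the population.
    key = fitness.__getitem__
    best_i = min(range(len(fitness)), key=key) if minimize \
        else max(range(len(fitness)), key=key)
    best = population[best_i]

    # Euclidean distance of every candidate to the best solution.
    dists = [math.dist(p, best) for p in population]

    def norm(vals, invert):
        lo, hi = min(vals), max(vals)
        span = (hi - lo) or 1.0
        return [(hi - v) / span if invert else (v - lo) / span for v in vals]

    # For minimization, lower fitness is better, so invert its normalization.
    nf = norm(fitness, invert=minimize)
    nd = norm(dists, invert=False)
    return [f + d for f, d in zip(nf, nd)]
```

A candidate far from the best solution but with reasonable fitness can outscore a near-duplicate of the best, which is how this style of selection counters premature convergence.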

74 citations

Journal ArticleDOI
TL;DR: Two new path-loss models were formulated based on the MATLAB curve-fitting tool for ZigBee WSN in a farm field and noticeably improved the coefficient of determination (R2) of the regression line, with the mean absolute error found to be 1.6 and 2.7 dBm.
Abstract: Wireless sensor networks (WSNs) have received significant attention in the last few years in the agriculture field. Among the major challenges for sensor nodes’ deployment in agriculture is the path loss in the presence of dense grass or the height of trees. This results in degradation of communication link quality due to absorption, scattering, and attenuation through the crop’s foliage or trees. In this study, two new path-loss models were formulated based on the MATLAB curve-fitting tool for ZigBee WSN in a farm field. The path loss between the router node (mounted on a drone) and the coordinator node was modeled and derived based on the received signal strength indicator (RSSI) measurements with the particle swarm optimization (PSO) algorithm in the farm field. Two path-loss models were formulated based on exponential (EXP) and polynomial (POLY) functions. Both functions were combined with PSO, namely, the hybrid EXP-PSO and POLY-PSO algorithms, to find the optimal coefficients of functions that would result in accurate path-loss models. The results show that the hybrid EXP-PSO and POLY-PSO models noticeably improved the coefficient of determination (R2) of the regression line, with the mean absolute error (MAE) found to be 1.6 and 2.7 dBm for EXP-PSO and POLY-PSO algorithms. The achieved R2 in this study outperformed the previous state-of-the-art models. An accurate path-loss model is essential for smart agriculture application to determine the behavior of the propagated signals and to deploy the nodes in the WSN in a position that ensures data communication without unnecessary packets’ loss between nodes.
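As a hedged sketch of the two fitted model families described above (the exact functional forms and coefficient values in the paper may differ), path loss can be modeled with exponential and polynomial functions of distance, with a PSO tuning the coefficients to minimize the mean absolute error against RSSI measurements:

```python
import math

def exp_model(d, a, b, c):
    """Illustrative exponential path-loss form PL(d) = a * exp(b * d) + c.
    The actual form and coefficients used in the paper may differ."""
    return a * math.exp(b * d) + c

def poly_model(d, coeffs):
    """Illustrative polynomial path-loss form
    PL(d) = c0 + c1*d + c2*d^2 + ..."""
    return sum(c * d ** k for k, c in enumerate(coeffs))

def mae(predict, distances, rssi):
    """Mean absolute error between model predictions and RSSI measurements;
    this is the objective a PSO would minimize to fit the coefficients."""
    return sum(abs(predict(d) - r) for d, r in zip(distances, rssi)) / len(rssi)
```

A PSO run would then treat the model coefficients as particle positions and `mae` as the fitness function, e.g. `mae(lambda d: exp_model(d, a, b, c), distances, rssi)` for a candidate `(a, b, c)`.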

74 citations


Cites background from "Repository and Mutation based Parti..."

  • ...Each particle upgrades its velocity and positions as shown as follows [24]:...

    [...]
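The quoted passage refers to the per-particle velocity and position update. A generic textbook sketch of one such update follows (the coefficient values `w`, `c1`, `c2`, and `v_max` here are illustrative assumptions, not the settings of reference [24]):

```python
import random

def update_particle(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5, v_max=4.0):
    """One PSO velocity/position update in the common textbook form.

    `w`, `c1`, `c2`, and `v_max` are illustrative values, not the
    parameters used in the cited paper.
    """
    new_vel, new_pos = [], []
    for x, v, pb, gb in zip(pos, vel, pbest, gbest):
        r1, r2 = random.random(), random.random()
        # Inertia term + cognitive pull toward the particle's own best
        # + social pull toward the swarm's global best.
        nv = w * v + c1 * r1 * (pb - x) + c2 * r2 * (gb - x)
        # Velocity clamping keeps step sizes bounded.
        nv = max(-v_max, min(v_max, nv))
        new_vel.append(nv)
        new_pos.append(x + nv)
    return new_pos, new_vel
```

Each iteration applies this update to every particle, then refreshes `pbest` and `gbest` from the newly evaluated positions.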

Journal ArticleDOI
TL;DR: The results of the analysis showed that the problem of premature convergence had been largely eliminated by the application of the FDB method and that the exploitation-exploration balance was also effectively provided, and the proposed FDBSFS algorithm ranked first among the thirty-nine competing algorithms.
Abstract: Stochastic Fractal Search (SFS) is a new and original meta-heuristic search (MHS) algorithm with strong foundations. As with many other MHS methods, there are problems in effectively balancing the exploitation-exploration in the SFS algorithm. In order to achieve this balance, it is necessary to improve its diversity capability. This article presents the studies that were carried out to strengthen the diversity and balanced search capabilities of the SFS algorithm. For this purpose, the diversity operator of the SFS algorithm was designed with a novel method called Fitness-Distance Balance (FDB), which more effectively mimics the way fractals occur in nature. Thus, the FDBSFS algorithm, which has a much stronger search performance, was developed. Comprehensive experimental studies were conducted to test and validate the developed FDB-based SFS algorithm (FDBSFS). Thirty-nine novel and powerful MHS algorithms, eighty-nine unconstrained test functions and five constrained engineering problems were used. Two nonparametric tests, the Wilcoxon signed rank test and the Friedman test, were used to analyze the results obtained from the experimental studies. The results of the analysis showed that the problem of premature convergence had been largely eliminated by the application of the FDB method and that the exploitation-exploration balance was also effectively provided. Moreover, the proposed FDBSFS algorithm ranked first among the thirty-nine competing algorithms.

52 citations

References
Journal ArticleDOI
TL;DR: A snapshot of particle swarming from the authors’ perspective, including variations in the algorithm, current and ongoing research, applications and open problems, is included.
Abstract: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced. The evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed. Benchmark testing of the paradigm is described, and applications, including nonlinear function optimization and neural network training, are proposed. The relationships between particle swarm optimization and both artificial life and genetic algorithms are described.

18,439 citations

Proceedings ArticleDOI
04 Oct 1995
TL;DR: The optimization of nonlinear functions using particle swarm methodology is described and implementations of two paradigms are discussed and compared, including a recently developed locally oriented paradigm.
Abstract: The optimization of nonlinear functions using particle swarm methodology is described. Implementations of two paradigms are discussed and compared, including a recently developed locally oriented paradigm. Benchmark testing of both paradigms is described, and applications, including neural network training and robot task learning, are proposed. Relationships between particle swarm optimization and both artificial life and evolutionary computation are reviewed.

14,477 citations

Proceedings ArticleDOI
04 May 1998
TL;DR: A new parameter, called inertia weight, is introduced into the original particle swarm optimizer, which resembles a school of flying birds in that each particle adjusts its flight according to its own flying experience and that of its companions.
Abstract: Evolutionary computation techniques, genetic algorithms, evolutionary strategies and genetic programming are motivated by the evolution of nature. A population of individuals, which encode the problem solutions are manipulated according to the rule of survival of the fittest through "genetic" operations, such as mutation, crossover and reproduction. A best solution is evolved through the generations. In contrast to evolutionary computation techniques, Eberhart and Kennedy developed a different algorithm through simulating social behavior (R.C. Eberhart et al., 1996; R.C. Eberhart and J. Kennedy, 1996; J. Kennedy and R.C. Eberhart, 1995; J. Kennedy, 1997). As in other algorithms, a population of individuals exists. This algorithm is called particle swarm optimization (PSO) since it resembles a school of flying birds. In a particle swarm optimizer, instead of using genetic operators, these individuals are "evolved" by cooperation and competition among the individuals themselves through generations. Each particle adjusts its flying according to its own flying experience and its companions' flying experience. We introduce a new parameter, called inertia weight, into the original particle swarm optimizer. Simulations have been done to illustrate the significant and effective impact of this new parameter on the particle swarm optimizer.
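The modification described above is commonly written as the following velocity and position update (standard notation from the PSO literature; setting the inertia weight w = 1 recovers the original optimizer, while w < 1 damps earlier velocities and aids convergence):

```latex
v_{id}^{t+1} = w\, v_{id}^{t} + c_1 r_1 \left(p_{id} - x_{id}^{t}\right) + c_2 r_2 \left(p_{gd} - x_{id}^{t}\right),
\qquad
x_{id}^{t+1} = x_{id}^{t} + v_{id}^{t+1}
```

Here x and v are the position and velocity of particle i in dimension d, p_{id} is the particle's personal best, p_{gd} the global best, c_1 and c_2 are acceleration coefficients, and r_1, r_2 are uniform random numbers in [0, 1].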

9,373 citations

Journal ArticleDOI
TL;DR: Artificial Bee Colony (ABC) Algorithm is an optimization algorithm based on the intelligent behaviour of honey bee swarm that is used for optimizing multivariable functions and the results showed that ABC outperforms the other algorithms.
Abstract: Swarm intelligence is a research branch that models the population of interacting agents or swarms that are able to self-organize. An ant colony, a flock of birds or an immune system is a typical example of a swarm system. Bees' swarming around their hive is another example of swarm intelligence. Artificial Bee Colony (ABC) Algorithm is an optimization algorithm based on the intelligent behaviour of honey bee swarm. In this work, ABC algorithm is used for optimizing multivariable functions and the results produced by ABC, Genetic Algorithm (GA), Particle Swarm Algorithm (PSO) and Particle Swarm Inspired Evolutionary Algorithm (PS-EA) have been compared. The results showed that ABC outperforms the other algorithms.

6,377 citations

BookDOI
01 Jan 1999
TL;DR: This chapter discusses Ant Foraging Behavior, Combinatorial Optimization, and Routing in Communications Networks, and its application to Data Analysis and Graph Partitioning.
Abstract: 1. Introduction 2. Ant Foraging Behavior, Combinatorial Optimization, and Routing in Communications Networks 3. Division of Labor and Task Allocation 4. Cemetery Organization, Brood Sorting, Data Analysis, and Graph Partitioning 5. Self-Organization and Templates: Application to Data Analysis and Graph Partitioning 6. Nest Building and Self-Assembling 7. Cooperative Transport by Insects and Robots 8. Epilogue

5,822 citations