Journal ArticleDOI

Glowworm swarm optimization for simultaneous capture of multiple local optima of multimodal functions

K. N. Krishnanand, +1 more
01 Jun 2009
Vol. 3, Iss. 2, pp. 87-124
TLDR
Experimental results demonstrate the efficacy of the proposed glowworm-based algorithm in capturing multiple optima of a series of standard multimodal test functions and more complex ones, such as stair-case and multiple-plateau functions.
Abstract
This paper presents glowworm swarm optimization (GSO), a novel algorithm for the simultaneous computation of multiple optima of multimodal functions. The algorithm shares a few features with some better known swarm intelligence based optimization algorithms, such as ant colony optimization and particle swarm optimization, but with several significant differences. The agents in GSO are thought of as glowworms that carry a luminescence quantity called luciferin along with them. The glowworms encode the fitness of their current locations, evaluated using the objective function, into a luciferin value that they broadcast to their neighbors. The glowworm identifies its neighbors and computes its movements by exploiting an adaptive neighborhood, which is bounded above by its sensor range. Each glowworm selects, using a probabilistic mechanism, a neighbor that has a luciferin value higher than its own and moves toward it. These movements, based only on local information and selective neighbor interactions, enable the swarm of glowworms to partition into disjoint subgroups that converge on multiple optima of a given multimodal function. We provide some theoretical results related to the luciferin update mechanism in order to prove the bounded nature and convergence of luciferin levels of the glowworms. Experimental results demonstrate the efficacy of the proposed glowworm-based algorithm in capturing multiple optima of a series of standard multimodal test functions and more complex ones, such as stair-case and multiple-plateau functions. We also report the results of tests in higher-dimensional spaces with a large number of peaks. We address the parameter selection problem by conducting experiments to show that only two parameters need to be selected by the user. Finally, we provide some comparisons of GSO with PSO and an experimental comparison with Niche-PSO, a PSO variant that is designed for the simultaneous computation of multiple optima.
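
The abstract describes GSO's update cycle in words: a luciferin update that encodes fitness, an adaptive local-decision range bounded above by the sensor range, probabilistic selection of a brighter neighbor, and a small step toward that neighbor. The following minimal Python sketch illustrates that cycle under stated assumptions; the parameter names and values (rho, gamma, step, r_s, beta, n_t) and the two-peak test function are illustrative choices, not settings taken from the paper.

import numpy as np

# Minimal GSO-style sketch (illustrative only). Parameter names and values
# (rho, gamma, step, r_s, beta, n_t) and the two-peak test function are
# assumptions for demonstration, not settings taken from the paper.

def two_peaks(x):
    # Bimodal test function with maxima near (-1.5, 0) and (1.5, 0).
    return (np.exp(-((x[0] - 1.5) ** 2 + x[1] ** 2))
            + np.exp(-((x[0] + 1.5) ** 2 + x[1] ** 2)))

def gso(objective, n_agents=50, iters=200, rho=0.4, gamma=0.6,
        step=0.03, r_s=3.0, beta=0.08, n_t=5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-3.0, 3.0, size=(n_agents, 2))   # glowworm positions
    luciferin = np.full(n_agents, 5.0)                # initial luciferin level
    r_d = np.full(n_agents, r_s)                      # adaptive decision range

    for _ in range(iters):
        # Luciferin update: decay the old value and add the fitness of the
        # current location, so brighter glowworms sit on better locations.
        fitness = np.array([objective(xi) for xi in x])
        luciferin = (1.0 - rho) * luciferin + gamma * fitness

        new_x = x.copy()
        for i in range(n_agents):
            d = np.linalg.norm(x - x[i], axis=1)
            # Neighbors: within the adaptive range and brighter than agent i.
            nbrs = np.where((d < r_d[i]) & (luciferin > luciferin[i]))[0]
            if nbrs.size > 0:
                # Probabilistic choice, weighted by the luciferin difference.
                w = luciferin[nbrs] - luciferin[i]
                j = rng.choice(nbrs, p=w / w.sum())
                unit = (x[j] - x[i]) / (np.linalg.norm(x[j] - x[i]) + 1e-12)
                new_x[i] = x[i] + step * unit         # small step toward j
            # Grow or shrink the decision range toward a desired neighbor
            # count n_t, never exceeding the sensor range r_s.
            r_d[i] = min(r_s, max(0.0, r_d[i] + beta * (n_t - nbrs.size)))
        x = new_x
    return x

if __name__ == "__main__":
    final = gso(two_peaks)
    print(final[:5])   # positions cluster near the two maxima

Running the sketch, the agents split into two clusters, one near each maximum, which mirrors the abstract's claim that purely local, selective interactions partition the swarm into subgroups that converge on distinct optima.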


Citations
Journal ArticleDOI

Ab initio random structure searching

TL;DR: Ab initio random structure searching (AIRSS), as discussed by the authors, searches for stable structures of materials using first-principles electronic-structure methods such as density functional theory (DFT) and is a rapidly growing field.
Journal ArticleDOI

Manta ray foraging optimization: An effective bio-inspired optimizer for engineering applications

TL;DR: The comparison results on the benchmark functions suggest that MRFO is far superior to its competitors, and the real-world engineering applications show the merits of this algorithm in tackling challenging problems in terms of computational cost and solution precision.
Journal ArticleDOI

A survey on new generation metaheuristic algorithms

TL;DR: In this survey, fourteen new and outstanding metaheuristics introduced over the last twenty years, other than classical ones such as genetic algorithms, particle swarm optimization, and tabu search, are distinguished.
Journal ArticleDOI

African vultures optimization algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems

TL;DR: The proposed algorithm, named the African Vultures Optimization Algorithm (AVOA), simulates African vultures' foraging and navigation behaviors; the results indicate the significant superiority of the AVOA algorithm at a 95% confidence interval.
Journal ArticleDOI

A novel meta-heuristic optimization algorithm

TL;DR: A new optimization algorithm based on Newton's law of cooling, called the Thermal Exchange Optimization algorithm, is developed and evaluated on several mathematical functions and four mechanical benchmark problems.
References
Proceedings ArticleDOI

Particle swarm optimization

TL;DR: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced, the evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed.
Journal ArticleDOI

Ant system: optimization by a colony of cooperating agents

TL;DR: It is shown how the ant system (AS) can be applied to other optimization problems, such as the asymmetric traveling salesman, quadratic assignment, and job-shop scheduling problems, and the salient characteristics of the AS are highlighted: global data-structure revision, distributed communication, and probabilistic transitions.
Journal ArticleDOI

Ant colony system: a cooperative learning approach to the traveling salesman problem

TL;DR: The results show that the ACS outperforms other nature-inspired algorithms such as simulated annealing and evolutionary computation; the paper concludes by comparing ACS-3-opt, a version of the ACS augmented with a local search procedure, with some of the best-performing algorithms for symmetric and asymmetric TSPs.
Book

Ant Colony Optimization

TL;DR: Ant colony optimization (ACO) is a relatively new approach to problem solving that takes inspiration from the social behaviors of insects and other animals, as discussed by the authors. In particular, ants have inspired a number of methods and techniques, among which the most studied and most successful is the general-purpose optimization technique known as ant colony optimization.