Journal Article (DOI)
GSA: A Gravitational Search Algorithm
TL;DR: A new optimization algorithm based on the law of gravity and mass interactions is introduced, and the obtained results confirm the high performance of the proposed method in solving various nonlinear functions.
About: This article was published in Information Sciences on 2009-06-01 and has received 5,501 citations to date. It focuses on the topics: Metaheuristic & Best-first search.
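The gravitational metaphor summarized above treats each candidate solution as a "mass" whose quality determines its gravitational pull, so agents attract one another and drift toward heavier (better) solutions while the gravitational constant decays over time. A minimal Python sketch of that scheme follows; the function name, parameter defaults, and clipping-to-bounds step are illustrative assumptions, not the paper's reference implementation:

```python
import numpy as np

def gsa(f, lo, hi, n_agents=15, iters=60, G0=100.0, alpha=20.0, seed=0):
    """Minimize f over the box [lo, hi] with a basic Gravitational Search Algorithm."""
    rng = np.random.default_rng(seed)
    dim = lo.shape[0]
    X = rng.uniform(lo, hi, size=(n_agents, dim))   # agent positions
    V = np.zeros_like(X)                            # agent velocities
    best_x, best_f = None, np.inf
    for t in range(iters):
        fit = np.array([f(x) for x in X])
        i_best = int(fit.argmin())
        if fit[i_best] < best_f:                    # track best solution seen so far
            best_x, best_f = X[i_best].copy(), float(fit[i_best])
        best, worst = fit.min(), fit.max()
        # Map fitness to masses: lower (better) fitness -> heavier agent
        m = (fit - worst) / (best - worst - 1e-12)
        M = m / (m.sum() + 1e-12)
        G = G0 * np.exp(-alpha * t / iters)         # gravitational "constant" decays over time
        A = np.zeros_like(X)
        for i in range(n_agents):
            for j in range(n_agents):
                if i != j:
                    diff = X[j] - X[i]
                    R = np.linalg.norm(diff)
                    # Acceleration contribution: rand * G * M_j * (x_j - x_i) / (R + eps)
                    A[i] += rng.random() * G * M[j] * diff / (R + 1e-12)
        V = rng.random(X.shape) * V + A             # stochastic velocity update
        X = np.clip(X + V, lo, hi)
    return best_x, best_f
```

On a simple sphere function the agents cluster around the origin as gravity cools, e.g. `gsa(lambda z: float(np.sum(z ** 2)), np.full(2, -5.0), np.full(2, 5.0))`.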
Citations
Journal Article (DOI)
ASCA-PSO: Adaptive sine cosine optimization algorithm integrated with particle swarm for pairwise local sequence alignment
TL;DR: This article presents an enhanced version of the SCA obtained by merging it with particle swarm optimization (PSO), called ASCA-PSO. The hybrid has been tested over several unimodal and multimodal benchmark functions and shows superiority over the SCA and other recent and standard meta-heuristic algorithms.
Journal Article (DOI)
A global optimization algorithm inspired in the behavior of selfish herds.
TL;DR: The experimental results show the remarkable performance of the proposed approach against those of the other compared methods, and as such SHO is proven to be an excellent alternative to solve global optimization problems.
Journal Article (DOI)
Chaotic dynamic weight particle swarm optimization for numerical function optimization
Ke Chen, Fengyu Zhou, Aling Liu, +2 more
TL;DR: The experimental results show that, for almost all functions, the proposed chaotic dynamic weight particle swarm optimization technique has superior performance compared with other nature-inspired optimizations and well-known PSO variants.
Journal Article (DOI)
Wild horse optimizer: a new meta-heuristic algorithm for solving engineering optimization problems
Iraj Naruei, Farshid Keynia, +1 more
TL;DR: A new optimizer called the wild horse optimizer (WHO), inspired by the social life behaviour of wild horses, is proposed; experiments showed that it produces very competitive results compared to other algorithms.
Journal Article (DOI)
An efficient Harris hawks-inspired image segmentation method
Erick Rodríguez-Esparza, Laura A. Zanella-Calzada, Diego Oliva, Ali Asghar Heidari, Daniel Zaldivar, Marco Pérez-Cisneros, Loke Kok Foong, +7 more
TL;DR: An efficient methodology for multilevel segmentation is proposed using the Harris Hawks Optimization (HHO) algorithm and the minimum cross-entropy as a fitness function and it presents an improvement over other segmentation approaches that are currently used in the literature.
References
Journal Article (DOI)
Optimization by Simulated Annealing
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
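The annealing analogy sketched in this abstract accepts worse moves with a temperature-dependent probability (the Metropolis rule), so the search can escape local minima early and becomes effectively greedy as the system cools. A minimal one-dimensional sketch follows; the geometric cooling schedule, step size, and function name are illustrative assumptions:

```python
import math
import random

def simulated_annealing(f, x0, step=1.0, t0=10.0, cooling=0.95, iters=500, seed=0):
    """Minimize a 1-D function f starting from x0 via simulated annealing."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, bf = x, fx
    T = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)         # random neighbor
        fc = f(cand)
        # Metropolis rule: always accept improvements; accept worse
        # moves with probability exp(-delta / T)
        if fc <= fx or rng.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < bf:
                best, bf = x, fx
        T *= cooling                                # geometric cooling schedule
    return best, bf
```

For example, `simulated_annealing(lambda x: (x - 3.0) ** 2, 10.0)` walks downhill (with occasional uphill excursions while the temperature is high) toward the minimum near x = 3.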
Proceedings Article (DOI)
Particle swarm optimization
TL;DR: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced, the evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed.
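The particle swarm methodology this abstract refers to moves each particle under the combined pull of its own best-found position and the swarm's global best. A minimal sketch of the standard inertia-weight formulation follows; the parameter values (w, c1, c2) and function name are illustrative assumptions, not the original paper's exact scheme:

```python
import random

def pso(f, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over [lo, hi]^dim with a basic inertia-weight particle swarm."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                           # personal best positions
    pf = [f(x) for x in X]                          # personal best values
    g = min(range(n_particles), key=lambda i: pf[i])
    gbest, gf = P[g][:], pf[g]                      # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity: inertia + cognitive pull + social pull
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = min(max(X[i][d] + V[i][d], lo), hi)
            fx = f(X[i])
            if fx < pf[i]:
                P[i], pf[i] = X[i][:], fx
                if fx < gf:
                    gbest, gf = X[i][:], fx
    return gbest, gf
```

Calling `pso(lambda x: sum(v * v for v in x), 2, (-5.0, 5.0))` drives the swarm toward the sphere function's minimum at the origin.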
Book
Artificial Intelligence: A Modern Approach
Stuart Russell,Peter Norvig +1 more
TL;DR: In this article, the authors present a comprehensive introduction to the theory and practice of artificial intelligence for modern applications, including game playing, planning and acting, and reinforcement learning with neural networks.
Journal Article (DOI)
Ant system: optimization by a colony of cooperating agents
TL;DR: It is shown how the ant system (AS) can be applied to other optimization problems such as the asymmetric traveling salesman, quadratic assignment, and job-shop scheduling problems, and the salient characteristics of the AS are discussed: global data-structure revision, distributed communication, and probabilistic transitions.
Journal Article (DOI)
No free lunch theorems for optimization
TL;DR: A framework is developed to explore the connection between effective optimization algorithms and the problems they are solving and a number of "no free lunch" (NFL) theorems are presented which establish that for any algorithm, any elevated performance over one class of problems is offset by performance over another class.