Journal ArticleDOI

GSA: A Gravitational Search Algorithm

01 Jun 2009-Information Sciences (INFORMATION SCIENCES)-Vol. 179, Iss: 13, pp 2232-2248
TL;DR: A new optimization algorithm based on the law of gravity and mass interactions is introduced and the obtained results confirm the high performance of the proposed method in solving various nonlinear functions.
About: This article was published in Information Sciences on 2009-06-01 and has received 5,501 citations to date. It focuses on the topics: Metaheuristic & Best-first search.
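The core mechanism the TL;DR describes, agents treated as masses that attract one another in proportion to solution quality, can be sketched in a few lines. This is a minimal, illustrative reconstruction for minimization; the function name `gsa_step` and the epsilon guard are ours, not the paper's:

```python
import numpy as np

def gsa_step(X, fitness, G, eps=1e-12):
    """One GSA step: masses from fitness, then gravitational accelerations.

    X: (n, d) agent positions; fitness: (n,) values (lower is better);
    G: current gravitational constant.
    """
    # Masses: better (lower) fitness -> larger mass, normalized to sum to 1
    worst, best = fitness.max(), fitness.min()
    m = (fitness - worst) / (best - worst + eps)
    M = m / (m.sum() + eps)

    n, _ = X.shape
    acc = np.zeros_like(X)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            R = np.linalg.norm(X[i] - X[j])
            # Pull of agent j on i; dividing the force by M_i leaves G * M[j] / R
            acc[i] += np.random.rand() * G * M[j] * (X[j] - X[i]) / (R + eps)
    return acc
```

In the full algorithm, G decays over iterations and these accelerations feed a velocity update; this sketch shows only the mass and acceleration computation.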
Citations
Journal ArticleDOI
TL;DR: The results of the classical engineering design problems and real application prove that the proposed GWO algorithm is applicable to challenging problems with unknown search spaces.

10,082 citations


Cites methods from "GSA: A Gravitational Search Algorit..."

  • ...The algorithm is then benchmarked on 29 well-known test functions, and the results are verified by a comparative study with Particle Swarm Optimization (PSO), Gravitational Search Algorithm (GSA), Differential Evolution (DE), Evolutionary Programming (EP), and Evolution Strategy (ES)....

  • ...The results showed that GWO was able to provide highly competitive results compared to well-known heuristics such as PSO, GSA, DE, EP, and ES....

  • ...Generally speaking, constraint handling becomes very challenging when the fitness function directly affects the position updating of the search agents (GSA for instance)....

  • ...GSA is another physics-based algorithm....

  • Comparison of optimum designs for the pressure vessel problem:

    Algorithm                 Ts        Th        R           L           Optimum cost
    GWO                       0.812500  0.434500  42.089181   176.758731  6051.5639
    GSA                       1.125000  0.625000  55.9886598  84.4542025  8538.8359
    PSO (He and Wang)         0.812500  0.437500  42.091266   176.746500  6061.0777
    GA (Coello)               0.812500  0.434500  40.323900   200.000000  6288.7445
    GA (Coello and Montes)    0.812500  0.437500  42.097398   176.654050  6059.9463
    GA (Deb and Gene)         0.937500  0.500000  48.329000   112.679000  6410.3811
    ES (Montes and Coello)    0.812500  0.437500  42.098087   176.640518  6059.7456
    DE (Huang et al.)         ...

Journal ArticleDOI
TL;DR: Optimization results prove that the WOA algorithm is very competitive compared to the state-of-art meta-heuristic algorithms as well as conventional methods.

7,090 citations

Journal ArticleDOI
TL;DR: The SCA algorithm obtains a smooth shape for the airfoil with very low drag, which demonstrates that this algorithm can be highly effective in solving real problems with constrained and unknown search spaces.
Abstract: This paper proposes a novel population-based optimization algorithm called the Sine Cosine Algorithm (SCA) for solving optimization problems. The SCA creates multiple initial random candidate solutions and requires them to fluctuate outwards or towards the best solution using a mathematical model based on sine and cosine functions. Several random and adaptive variables are also integrated into this algorithm to emphasize exploration and exploitation of the search space at different milestones of optimization. The performance of SCA is benchmarked in three test phases. Firstly, a set of well-known test cases including unimodal, multi-modal, and composite functions is employed to test the exploration, exploitation, local optima avoidance, and convergence of SCA. Secondly, several performance metrics (search history, trajectory, average fitness of solutions, and the best solution during optimization) are used to qualitatively observe and confirm the performance of SCA on shifted two-dimensional test functions. Finally, the cross-section of an aircraft's wing is optimized by SCA as a real, challenging case study to verify and demonstrate the performance of this algorithm in practice. The results of the test functions and performance metrics prove that the proposed algorithm is able to explore different regions of a search space, avoid local optima, converge towards the global optimum, and exploit promising regions of a search space during optimization effectively. The SCA algorithm obtains a smooth shape for the airfoil with very low drag, which demonstrates that this algorithm can be highly effective in solving real problems with constrained and unknown search spaces. Note that the source codes of the SCA algorithm are publicly available at http://www.alimirjalili.com/SCA.html .
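The sine/cosine fluctuation the abstract describes can be sketched as a single vectorized position update. The r1-r4 naming follows the convention commonly used to present SCA; the default amplitude (a = 2) is illustrative:

```python
import numpy as np

def sca_step(X, best, t, T, a=2.0):
    """One SCA position update toward the best solution found so far.

    X: (n, d) population; best: (d,) destination point;
    t: current iteration; T: max iterations; a: initial amplitude.
    """
    r1 = a - t * (a / T)                       # decays linearly: exploration -> exploitation
    r2 = 2 * np.pi * np.random.rand(*X.shape)  # random angle
    r3 = 2 * np.random.rand(*X.shape)          # random weight on the destination
    r4 = np.random.rand(*X.shape)              # switches between sine and cosine branches
    sine = X + r1 * np.sin(r2) * np.abs(r3 * best - X)
    cosine = X + r1 * np.cos(r2) * np.abs(r3 * best - X)
    return np.where(r4 < 0.5, sine, cosine)
```

Because r1 shrinks as t approaches T, early iterations fluctuate far outside the current best while later ones contract toward it.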

3,088 citations

Journal ArticleDOI
TL;DR: The qualitative and quantitative results prove the efficiency of SSA and MSSA and demonstrate the merits of the algorithms proposed in solving real-world problems with difficult and unknown search spaces.

3,027 citations

Journal ArticleDOI
TL;DR: The MFO algorithm is compared with other well-known nature-inspired algorithms on 29 benchmark and 7 real engineering problems and the statistical results show that this algorithm is able to provide very promising and competitive results.
Abstract: In this paper a novel nature-inspired optimization paradigm called the Moth-Flame Optimization (MFO) algorithm is proposed. The main inspiration of this optimizer is the navigation method of moths in nature called transverse orientation. Moths fly at night by maintaining a fixed angle with respect to the moon, a very effective mechanism for travelling in a straight line over long distances. However, these fancy insects become trapped in a useless or deadly spiral path around artificial lights. This paper mathematically models this behaviour to perform optimization. The MFO algorithm is compared with other well-known nature-inspired algorithms on 29 benchmark and 7 real engineering problems. The statistical results on the benchmark functions show that this algorithm is able to provide very promising and competitive results. Additionally, the results on the real problems demonstrate the merits of this algorithm in solving challenging problems with constrained and unknown search spaces. The paper also considers the application of the proposed algorithm to marine propeller design to further investigate its effectiveness in practice. Note that the source codes of the MFO algorithm are publicly available at http://www.alimirjalili.com/MFO.html.
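The spiral path around a flame is usually presented as a logarithmic spiral. A minimal sketch of that update follows; the shape constant b and the uniform draw of t match common presentations of MFO, not verbatim code from the paper:

```python
import numpy as np

def mfo_spiral(moth, flame, b=1.0):
    """Logarithmic-spiral update of a moth around its paired flame.

    D = |flame - moth| is the distance; t in [-1, 1] controls how close
    the next position lands to the flame (t = -1 is closest).
    """
    D = np.abs(flame - moth)
    t = np.random.uniform(-1, 1, size=np.shape(moth))
    return D * np.exp(b * t) * np.cos(2 * np.pi * t) + flame
```

In the full algorithm the lower bound of t shrinks over iterations, so moths spiral ever tighter around their flames.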

2,892 citations


Cites background or methods from "GSA: A Gravitational Search Algorit..."

  • ...As per the p-values in Table 7, the MFO algorithm is the only algorithm that provides a p-value greater than 0.05 on F12, which means that the superiority of the GSA algorithm is not statistically significant....

  • Comparison of optimal designs for the welded beam problem:

    Algorithm               h         l         t         b         Optimal cost
    MFO                     0.2057    3.4703    9.0364    0.2057    1.72452
    GSA                     0.182129  3.856979  10.0000   0.202376  1.87995
    CPSO [66]               0.202369  3.544214  9.048210  0.205723  1.73148
    GA [60]                 0.1829    4.0483    9.3666    0.2059    1.82420
    GA [62]                 0.2489    6.1730    8.1789    0.2533    2.43312
    Coello [58]             0.208800  3.420500  8.997500  0.2100    1.74831
    Coello and Montes [67]  0.205986  3.471328  9.020224  0.206480  1.72822
    Siddall [68]            0.2444    6.2189    8.2915    0.2444    2.38154
    Ragsdell [65]           0.2455    6.1960    8.2730    0.2455    2.38594
    Random [65]             0.4575    4.7313    5.0853    0.6600    4.11856
    Simplex [65]            0.2792    5.6256    7.7512    0.2796    2.53073
    David [65]              0.2434    6.2552    8.2915    0.2444    2.38411
    APPROX [65]             0.2444    6.2189    8.2915    0.2444    2.38154

    Fig....

  • ...In other words, MFO and GSA perform very similar and can be considered as the best algorithms when solving F12....

  • ...In order to verify the performance of the proposed MFO algorithm, some of the well-known and recent algorithms in the literature are chosen: PSO [55], GSA [30], BA [22], FPA [45], SMS [46], FA [23], and GA [56]....

  • Comparison of optimum designs for the pressure vessel problem:

    Algorithm                   Ts      Th      R          L           Optimum cost
    MFO                         0.8125  0.4375  42.098445  176.636596  6059.7143
    GSA                         1.1250  0.6250  55.988659  84.4542025  8538.8359
    PSO [66]                    0.8125  0.4375  42.091266  176.746500  6061.0777
    GA [79]                     0.8125  0.4345  40.323900  200.000000  6288.7445
    GA [67]                     0.8125  0.4375  42.097398  176.654050  6059.9463
    GA [80]                     0.9375  0.5000  48.329000  112.679000  6410.3811
    ES [81]                     0.8125  0.4375  42.098087  176.640518  6059.7456
    DE [82]                     0.8125  0.4375  42.098411  176.637690  6059.7340
    ACO [83]                    0.8125  0.4375  42.103624  176.572656  6059.0888
    Lagrangian multiplier [84]  1.1250  0.6250  58.291000  43.6900000  7198.0428
    Branch-bound [85]           1.1250  0.6250  47.700000  117.701000  8129.1036

    Table 14 Comparison results for cantilever design problem....

References
Journal ArticleDOI
13 May 1983-Science
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
Abstract: There is a deep and useful connection between statistical mechanics (the behavior of systems with many degrees of freedom in thermal equilibrium at a finite temperature) and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters). A detailed analogy with annealing in solids provides a framework for optimization of the properties of very large and complex systems. This connection to statistical mechanics exposes new information and provides an unfamiliar perspective on traditional optimization problems and methods.
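The annealing analogy reduces, in code, to the Metropolis acceptance rule under a decreasing temperature. A minimal sketch for minimizing a function; the function name, cooling schedule, and default parameters are illustrative, not taken from the paper:

```python
import math
import random

def simulated_annealing(f, x0, neighbor, T0=1.0, cooling=0.995, steps=5000):
    """Minimize f via the Metropolis rule: always accept downhill moves,
    accept uphill moves with probability exp(-dE / T), and cool T geometrically."""
    x, fx, T = x0, f(x0), T0
    best, fbest = x, fx
    for _ in range(steps):
        y = neighbor(x)
        fy = f(y)
        dE = fy - fx
        if dE < 0 or random.random() < math.exp(-dE / T):
            x, fx = y, fy          # accept the move
            if fx < fbest:
                best, fbest = x, fx
        T *= cooling               # lower the temperature
    return best, fbest
```

For example, `simulated_annealing(lambda x: (x - 3) ** 2, 0.0, lambda x: x + random.uniform(-0.5, 0.5))` typically ends near x = 3: early high temperatures allow uphill moves that escape poor regions, while late low temperatures freeze the search into a minimum.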

41,772 citations

Proceedings ArticleDOI
06 Aug 2002
TL;DR: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced, the evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed.
Abstract: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced. The evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed. Benchmark testing of the paradigm is described, and applications, including nonlinear function optimization and neural network training, are proposed. The relationships between particle swarm optimization and both artificial life and genetic algorithms are described.
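The particle swarm update this reference introduces is conventionally written as a velocity rule combining inertia with pulls toward each particle's own best and the swarm's best. A minimal sketch; the parameter values (w, c1, c2) are illustrative defaults, not the paper's originals:

```python
import numpy as np

def pso_step(X, V, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """Canonical PSO update: inertia plus randomly weighted cognitive
    (own best) and social (swarm best) pulls.

    X, V, pbest: (n, d) positions, velocities, personal bests; gbest: (d,).
    Returns the new positions and velocities.
    """
    r1 = np.random.rand(*X.shape)
    r2 = np.random.rand(*X.shape)
    V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
    return X + V, V
```

After each step, pbest and gbest are refreshed wherever the new positions improve on them.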

35,104 citations

Book
01 Jan 2020
TL;DR: In this article, the authors present a comprehensive introduction to the theory and practice of artificial intelligence for modern applications, including game playing, planning and acting, and reinforcement learning with neural networks.
Abstract: The long-anticipated revision of this #1 selling book offers the most comprehensive, state of the art introduction to the theory and practice of artificial intelligence for modern applications. Intelligent Agents. Solving Problems by Searching. Informed Search Methods. Game Playing. Agents that Reason Logically. First-order Logic. Building a Knowledge Base. Inference in First-Order Logic. Logical Reasoning Systems. Practical Planning. Planning and Acting. Uncertainty. Probabilistic Reasoning Systems. Making Simple Decisions. Making Complex Decisions. Learning from Observations. Learning with Neural Networks. Reinforcement Learning. Knowledge in Learning. Agents that Communicate. Practical Communication in English. Perception. Robotics. For computer professionals, linguists, and cognitive scientists interested in artificial intelligence.

16,983 citations

Journal ArticleDOI
01 Feb 1996
TL;DR: It is shown how the ant system (AS) can be applied to other optimization problems such as the asymmetric traveling salesman, quadratic assignment and job-shop scheduling problems, and the salient characteristics of the AS are discussed: global data structure revision, distributed communication and probabilistic transitions.
Abstract: An analogy with the way ant colonies function has suggested the definition of a new computational paradigm, which we call ant system (AS). We propose it as a viable new approach to stochastic combinatorial optimization. The main characteristics of this model are positive feedback, distributed computation, and the use of a constructive greedy heuristic. Positive feedback accounts for rapid discovery of good solutions, distributed computation avoids premature convergence, and the greedy heuristic helps find acceptable solutions in the early stages of the search process. We apply the proposed methodology to the classical traveling salesman problem (TSP), and report simulation results. We also discuss parameter selection and the early setups of the model, and compare it with tabu search and simulated annealing using TSP. To demonstrate the robustness of the approach, we show how the ant system (AS) can be applied to other optimization problems like the asymmetric traveling salesman, the quadratic assignment and the job-shop scheduling. Finally we discuss the salient characteristics of the AS: global data structure revision, distributed communication and probabilistic transitions.
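The probabilistic transitions the abstract mentions can be sketched for the TSP case: an ant at city i chooses among unvisited cities with probability proportional to pheromone raised to alpha times heuristic desirability raised to beta. The function and parameter names here are illustrative:

```python
import numpy as np

def ant_transition_probs(tau, eta, visited, i, alpha=1.0, beta=2.0):
    """AS transition rule for an ant at city i.

    tau: (n, n) pheromone matrix; eta: (n, n) heuristic desirability
    (typically 1 / distance); visited: set of already-visited city indices.
    Returns a probability vector over all n cities (zero on visited ones).
    """
    mask = np.ones(len(tau), dtype=bool)
    mask[list(visited)] = False           # exclude visited cities
    weights = np.where(mask, (tau[i] ** alpha) * (eta[i] ** beta), 0.0)
    return weights / weights.sum()
```

The positive-feedback loop comes afterwards: edges on short completed tours receive extra pheromone, raising their weight in future transitions.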

11,224 citations


"GSA: A Gravitational Search Algorit..." refers background or methods in this paper

  • ...Various heuristic approaches have been adopted by researchers so far, for example Genetic Algorithm [32], Simulated Annealing [21], Ant Colony Search Algorithm [5], Particle Swarm Optimization [17], etc....

  • ...The Genetic Algorithm, GA, is inspired by Darwinian evolutionary theory [32]; Simulated Annealing, SA, is designed using thermodynamic effects [21]; Artificial Immune Systems, AIS, simulate biological immune systems [8]; Ant Colony Optimization, ACO, mimics the behavior of ants foraging for food [5]; the Bacterial Foraging Algorithm, BFA, comes from the search and optimal foraging of bacteria [11,19]; and Particle Swarm Optimization, PSO, simulates the behavior of flocks of birds [3,17]....

  • ...These operations are very simple; however, their collective effect, known as swarm intelligence [5,33], produces surprising results....

  • ...In this case, member operations, including randomized search, positive feedback, negative feedback and multiple interactions, lead to self-organization [5]....

  • ...Over the last decades, there has been a growing interest in algorithms inspired by the behaviors of natural phenomena [5,8,17,19,21,32]....

Journal ArticleDOI
TL;DR: A framework is developed to explore the connection between effective optimization algorithms and the problems they are solving and a number of "no free lunch" (NFL) theorems are presented which establish that for any algorithm, any elevated performance over one class of problems is offset by performance over another class.
Abstract: A framework is developed to explore the connection between effective optimization algorithms and the problems they are solving. A number of "no free lunch" (NFL) theorems are presented which establish that for any algorithm, any elevated performance over one class of problems is offset by performance over another class. These theorems result in a geometric interpretation of what it means for an algorithm to be well suited to an optimization problem. Applications of the NFL theorems to information-theoretic aspects of optimization and benchmark measures of performance are also presented. Other issues addressed include time-varying optimization problems and a priori "head-to-head" minimax distinctions between optimization algorithms, distinctions that result despite the NFL theorems' enforcing of a type of uniformity over all algorithms.

10,771 citations


"GSA: A Gravitational Search Algorit..." refers background in this paper

  • ...In other words, an algorithm may solve some problems better and some problems worse than others [35]....

  • ...Hence, searching for new heuristic optimization algorithms is an open problem [35]....
