Journal Article • DOI

SCA: A Sine Cosine Algorithm for solving optimization problems

TL;DR: The SCA algorithm obtains a smooth airfoil shape with very low drag, which demonstrates that this algorithm can be highly effective in solving real problems with constrained and unknown search spaces.
Abstract: This paper proposes a novel population-based optimization algorithm called the Sine Cosine Algorithm (SCA) for solving optimization problems. The SCA creates multiple initial random candidate solutions and requires them to fluctuate outwards or towards the best solution using a mathematical model based on sine and cosine functions. Several random and adaptive variables are also integrated into this algorithm to emphasize exploration and exploitation of the search space at different milestones of the optimization. The performance of SCA is benchmarked in three test phases. Firstly, a set of well-known test cases including unimodal, multi-modal, and composite functions is employed to test the exploration, exploitation, local optima avoidance, and convergence of SCA. Secondly, several performance metrics (search history, trajectory, average fitness of solutions, and the best solution during optimization) are used to qualitatively observe and confirm the performance of SCA on shifted two-dimensional test functions. Finally, the cross-section of an aircraft's wing is optimized by SCA as a real, challenging case study to verify and demonstrate the performance of this algorithm in practice. The results on the test functions and performance metrics show that the proposed algorithm is able to explore different regions of the search space, avoid local optima, converge towards the global optimum, and effectively exploit promising regions of the search space during optimization. The SCA algorithm obtains a smooth airfoil shape with very low drag, which demonstrates that this algorithm can be highly effective in solving real problems with constrained and unknown search spaces. Note that the source code of the SCA algorithm is publicly available at http://www.alimirjalili.com/SCA.html.
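For reference, the position-update rule summarized in the abstract can be sketched in a few lines of Python. This is a minimal illustration of the published SCA update (sine/cosine steps around the best solution found so far, with a linearly decreasing amplitude), not the authors' released code; the objective function, bounds, agent count, iteration budget, and the constant a = 2 below are illustrative assumptions.

```python
import numpy as np

def sca(objective, bounds, n_agents=30, max_iter=500, a=2.0):
    """Minimal Sine Cosine Algorithm sketch for minimization."""
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    X = lo + np.random.rand(n_agents, dim) * (hi - lo)   # random initial candidates
    fitness = np.apply_along_axis(objective, 1, X)
    best = X[fitness.argmin()].copy()                     # destination: best solution so far
    best_fit = fitness.min()

    for t in range(max_iter):
        r1 = a - t * (a / max_iter)                       # linearly decreasing step amplitude
        for i in range(n_agents):
            r2 = 2 * np.pi * np.random.rand(dim)          # how far towards/outwards to move
            r3 = 2 * np.random.rand(dim)                  # random weight on the destination
            r4 = np.random.rand(dim)                      # per-dimension sine/cosine switch
            step = np.abs(r3 * best - X[i])
            X[i] = np.where(r4 < 0.5,
                            X[i] + r1 * np.sin(r2) * step,
                            X[i] + r1 * np.cos(r2) * step)
            X[i] = np.clip(X[i], lo, hi)
            f = objective(X[i])
            if f < best_fit:                              # update destination when improved
                best, best_fit = X[i].copy(), f
    return best, best_fit

# Example: minimize the sphere function in 5 dimensions
best, val = sca(lambda x: np.sum(x ** 2), bounds=[(-10, 10)] * 5)
```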
Citations
Journal Article • DOI
TL;DR: The qualitative and quantitative results prove the efficiency of SSA and MSSA and demonstrate the merits of the algorithms proposed in solving real-world problems with difficult and unknown search spaces.

3,027 citations

Journal Article • DOI
TL;DR: The proposed slime mould algorithm has several new features and a unique mathematical model that uses adaptive weights to simulate the positive and negative feedback of the slime mould's propagation wave, based on a bio-oscillator, forming the optimal path for connecting to food with excellent exploratory ability and exploitation propensity.
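To make the adaptive-weight idea in this TL;DR concrete, the sketch below shows one way such weights can be computed: candidates with better fitness receive weights above 1 (positive feedback) and worse ones below 1 (negative feedback). It is a simplified illustration based on the commonly cited slime-mould formulation, not the authors' code; the logarithm base, the half-population split, and the small epsilon are assumptions.

```python
import numpy as np

def slime_mould_weights(fitness):
    """Adaptive weights in the spirit of the slime mould model (minimization).
    Better candidates get weights > 1 (positive feedback), worse ones < 1
    (negative feedback).  Simplified sketch, not the paper's exact code."""
    fitness = np.asarray(fitness, dtype=float)
    n = len(fitness)
    order = np.argsort(fitness)                        # best first
    bF, wF = fitness[order[0]], fitness[order[-1]]
    ratio = (fitness - bF) / (wF - bF + 1e-12)         # 0 for the best, 1 for the worst
    r = np.random.rand(n)
    w = np.empty(n)
    better, worse = order[: n // 2], order[n // 2:]
    w[better] = 1 + r[better] * np.log10(ratio[better] + 1)   # positive feedback
    w[worse] = 1 - r[worse] * np.log10(ratio[worse] + 1)      # negative feedback
    return w
```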

1,443 citations


Cites methods from "SCA: A Sine Cosine Algorithm for solving optimization problems"

  • ...z-value, p-value, α/i (α = 0.05), α/i (α = 0.10):
    Traditional algorithms: PBIL (9.9878, 8.6010E-24, 0.0042, 0.0083); DA (8.2494, 7.9860E-17, 0.0045, 0.0091); SCA (6.9851, 1.4240E-12, 0.005, 0.01); MFO (6.7639, 6.7120E-12, 0.0056, 0.0111); PSO (4.9307, 4.0900E-07, 0.0063, 0.0125); WOA (4.2669, 9.9060E-06, 0.0071, 0.0143); GOA (3.8877, 5.0540E-05, 0.0083, 0.0167); ALO (3.7296, 9.5740E-05, 0.01, 0.02); GWO (2.9394, 1.6460E-03, 0.0125, 0.025); SSA (2.3389, 9.6680E-03, 0.0167, 0.0333); MVO (1.7384, 0.04111, 0.025, 0.05); DE (1.6436, 0.05009, 0.05, 0.1).
    Advanced algorithms: BLPSO (12.0748, 7.1760E-34, 0.00556, 0.01111); CLPSO (10.7331, 3.5580E-27, 0.00625, 0.0125); CSSA (9.3915, 2.9580E-21, 0.00714, 0.01429); CDLOBA (8.0498, 4.1460E-16, 0.00833, 0.01667); CBA (6.7082, 9.8520E-12, 0.01, 0.02); RCBA (5.3666, 4.0120E-08, 0.0125, 0.025); IWOA (4.0249, 2.8500E-05, 0.01667, 0.03333); m_SCA (2.6833, 0.00365, 0.025, 0.05); LWOA (1.3416, 0.08986, 0.05, 0.1).
    Evaluation: PSO (5.648039, 8.1160E-09, 0.005, 0.01); MFO (4.896673, 4.8730E-07, 0.0056, 0.0111); SCA (4.801299, 7.8820E-07, 0.00625, 0.0125); GOA (3.532738, 2.0570E-04, 0.0071, 0.0143); MVO (2.644126, 0.0041, 0.0083, 0.0167); SSA (2.422361, 0.0077, 0.01, 0.02); GWO (2.130033, 0.0166, 0.0125, 0.025); WOA (1.901289, 0.0286, 0.0167, 0.0333); AGA (1.156902, 0.1237, 0.025, 0.05); DE (1.077811, 0.1406, 0.05, 0.1)...
    (The α/i columns appear to be Holm step-down thresholds; see the sketch after this list.)

    [...]

  • ...The first kind mainly simulate physical phenomena, apply mathematical rules or methodologies including: Multi-Verse Optimizer (MVO) [3], Gravitational Local Search Algorithm (GLSA) [4], Charged System Search (CSS) [5], Gravitational Search Algorithm (GSA) [6], Sine Cosine Algorithm (SCA) [7], Simulated Annealing (SA) [8], Teaching-Learning-Based Optimization (TLBO) [9], Central Force Optimization (CFO) [10] and Tabu Search (TS) [11]....

    [...]

  • ...The MAs used for comparison include well-regarded and recent ones: WOA [50], GWO [21], MFO [23], BA [20], SCA [7], FA[51], PSO[18], SSA [52], MVO [3], ALO...

    [...]

  • ...Table 15: True p-value (#TSS) obtained from comparison on thirty-three benchmarks.
    Traditional MAs: F1 2.08E-05, F2 2.08E-05, F3 2.08E-05, F4 2.08E-05, F5 2.08E-05, F6 4.13E-03, F7 2.68E-05, F8 1.34E-04, F9 1.00E+00, F10 2.31E-05, F11 1.00E+00, F12 3.71E-05, F13 5.23E-02, F14 5.53E-02, F15 6.34E-03, F16 7.95E-01, F17 1.32E-01, F18 3.26E-01, F19 2.24E-01, F20 1.98E-01, F21 6.88E-02, F22 7.01E-01, F23 4.53E-01, F24 7.59E-01, F25 9.96E-01, F26 5.34E-01, F27 4.96E-01, F28 2.29E-05, F29 2.08E-05, F30 2.63E-04, F31 4.36E-01, F32 2.08E-05, F33 2.16E-05.
    Advanced MAs: F1 1.56E-05, F2 1.56E-05, F3 1.56E-05, F4 1.56E-05, F5 1.85E-02, F6 1.40E-03, F7 2.60E-03, F8 1.60E-05, F9 1.00E+00, F10 2.74E-05, F11 1.00E+00, F12 1.83E-01, F13 4.91E-01, F14 1.25E-01, F15 1.91E-01, F16 9.93E-01, F17 1.86E-05, F18 7.57E-01, F19 5.59E-04, F20 1.56E-05, F21 3.22E-05, F22 2.04E-03, F23 3.37E-01, F24 5.58E-02, F25 3.91E-02, F26 3.62E-01, F27 8.15E-01, F28 1.56E-05, F29 1.56E-05, F30 7.49E-05, F31 7.75E-01, F32 1.56E-05, F33 1.64E-05.
    Table 16: Results of the Friedman test of the iterative version and the function-evaluation version, given as average rank (overall rank).
    Iterative version on F1-33 (traditional MAs): SMA 3.057 (1), SCA 9.396 (11), SSA 5.180 (4), GWO 5.280 (5), MFO 8.037 (10), WOA 6.735 (8), GOA 6.690 (7), DA 10.580 (12), ALO 6.124 (6), MVO 5.013 (3), PBIL 12.228 (13), PSO 7.865 (9), DE 4.815 (2).
    Iterative version on F1-33 (advanced MAs): SMA 2.297 (1), BLPSO 7.578 (10), CLPSO 6.996 (9), CBA 5.507 (6), RCBA 5.408 (5), CDLOBA 6.100 (7), m_SCA 5.000 (4), IWOA 4.710 (2), LWOA 4.907 (3), CSSA 6.497 (8).
    Evaluation version on F1-21: SMA 3.189 (1), SCA 8.103 (9), SSA 5.668 (6), GWO 5.369 (5), MFO 8.201 (10), WOA 5.135 (4), GOA 6.805 (8), MVO 5.895 (7), PSO 8.970 (11), DE 4.292 (2), AGA 4.373 (3).
    Table 17: Holm's test (taking SMA as the control algorithm), SMA vs. ...

    [...]

  • ...The MAs used for comparison include well-regarded and recent ones: WOA [50], GWO [21], MFO [23], BA [20], SCA [7], FA[51], PSO[18], SSA [52], MVO [3], ALO [53], PBIL [54], DE [55] and advanced MAs: AGA[56], BLPSO [57], CLPSO [58], CBA [59], RCBA [60], CDLOBA [61], m_SCA [62], IWOA [63], LWOA [64], and CSSA [65]....

    [...]
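The α/i columns in the first excerpt above are consistent with Holm's step-down correction, in which the i-th smallest p-value among m comparisons is tested against α/(m - i + 1) and rejections stop at the first non-significant comparison. The Python sketch below illustrates that procedure; the p-values in the example are made up for illustration and the function name is ours.

```python
def holm_thresholds(p_values, alpha=0.05):
    """Holm step-down procedure: sort p-values ascending and compare the i-th
    smallest (0-indexed) against alpha / (m - i).  Returns, in sorted order,
    tuples of (original index, p, threshold, reject)."""
    m = len(p_values)
    order = sorted(range(m), key=lambda k: p_values[k])
    results, still_rejecting = [], True
    for i, k in enumerate(order):
        thr = alpha / (m - i)                 # alpha/m for the smallest p, ..., alpha/1 for the largest
        if p_values[k] > thr:
            still_rejecting = False           # stop rejecting from here on
        results.append((k, p_values[k], thr, still_rejecting))
    return results

# Example with three hypothetical pairwise comparisons
for idx, p, thr, rej in holm_thresholds([8.6e-24, 0.041, 0.2], alpha=0.05):
    print(f"comparison {idx}: p={p:.3g}  threshold={thr:.4f}  reject={rej}")
```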

Journal Article • DOI
TL;DR: Experimental results show that the AOA provides very promising results in solving challenging optimization problems compared with eleven other well-known optimization algorithms.

1,218 citations

Journal Article • DOI
TL;DR: The experimental results, in which AO is compared with well-known meta-heuristic methods, demonstrate the superiority of the developed AO algorithm.

989 citations

Journal Article • DOI
Gaurav Dhiman, Vijay Kumar
TL;DR: The main concept behind this algorithm is the social relationship between spotted hyenas and their collaborative behavior; the results reveal that the proposed algorithm performs better than the other competitive metaheuristic algorithms.

676 citations

References
Journal Article • DOI
Rainer Storn, Kenneth Price
TL;DR: In this article, a new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented, which requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
Abstract: A new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented. By means of an extensive testbed it is demonstrated that the new method converges faster and with more certainty than many other acclaimed global optimization methods. The new method requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
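The abstract summarizes Differential Evolution without giving the update rule, so here is a sketch of the classic DE/rand/1/bin scheme usually attributed to Storn and Price (mutation, binomial crossover, greedy selection). The population size, F, CR, iteration budget, and the Rosenbrock example are illustrative choices, not values from the paper.

```python
import numpy as np

def differential_evolution(objective, bounds, pop_size=30, F=0.8, CR=0.9, max_iter=300):
    """Classic DE/rand/1/bin sketch for minimization."""
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    pop = lo + np.random.rand(pop_size, dim) * (hi - lo)
    fit = np.apply_along_axis(objective, 1, pop)

    for _ in range(max_iter):
        for i in range(pop_size):
            # pick three distinct individuals other than i
            r1, r2, r3 = np.random.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            mutant = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi)
            # binomial crossover with at least one gene taken from the mutant
            cross = np.random.rand(dim) < CR
            cross[np.random.randint(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            f_trial = objective(trial)
            if f_trial <= fit[i]:                     # greedy selection
                pop[i], fit[i] = trial, f_trial
    best = fit.argmin()
    return pop[best], fit[best]

# Example: minimize the 2-D Rosenbrock function
best, val = differential_evolution(lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2,
                                   bounds=[(-5, 5)] * 2)
```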

24,053 citations

Proceedings Article • DOI
04 Oct 1995
TL;DR: The optimization of nonlinear functions using particle swarm methodology is described and implementations of two paradigms are discussed and compared, including a recently developed locally oriented paradigm.
Abstract: The optimization of nonlinear functions using particle swarm methodology is described. Implementations of two paradigms are discussed and compared, including a recently developed locally oriented paradigm. Benchmark testing of both paradigms is described, and applications, including neural network training and robot task learning, are proposed. Relationships between particle swarm optimization and both artificial life and evolutionary computation are reviewed.
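As a companion to this description, here is a compact global-best PSO sketch. It uses the widely adopted inertia-weight form (the inertia term w was introduced in later work, not in this 1995 paper), and the values w = 0.7 and c1 = c2 = 1.5 are common defaults rather than parameters prescribed by the paper.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, max_iter=300, w=0.7, c1=1.5, c2=1.5):
    """Global-best PSO sketch: velocities are pulled towards each particle's
    personal best and the swarm's global best."""
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    x = lo + np.random.rand(n_particles, dim) * (hi - lo)
    v = np.zeros((n_particles, dim))
    pbest, pbest_fit = x.copy(), np.apply_along_axis(objective, 1, x)
    g = pbest_fit.argmin()
    gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]

    for _ in range(max_iter):
        r1 = np.random.rand(n_particles, dim)
        r2 = np.random.rand(n_particles, dim)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
        x = np.clip(x + v, lo, hi)                                  # position update
        fit = np.apply_along_axis(objective, 1, x)
        improved = fit < pbest_fit
        pbest[improved], pbest_fit[improved] = x[improved], fit[improved]
        if pbest_fit.min() < gbest_fit:
            g = pbest_fit.argmin()
            gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
    return gbest, gbest_fit
```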

14,477 citations

Journal Article • DOI
David H. Wolpert, William G. Macready
TL;DR: A framework is developed to explore the connection between effective optimization algorithms and the problems they are solving and a number of "no free lunch" (NFL) theorems are presented which establish that for any algorithm, any elevated performance over one class of problems is offset by performance over another class.
Abstract: A framework is developed to explore the connection between effective optimization algorithms and the problems they are solving. A number of "no free lunch" (NFL) theorems are presented which establish that for any algorithm, any elevated performance over one class of problems is offset by performance over another class. These theorems result in a geometric interpretation of what it means for an algorithm to be well suited to an optimization problem. Applications of the NFL theorems to information-theoretic aspects of optimization and benchmark measures of performance are also presented. Other issues addressed include time-varying optimization problems and a priori "head-to-head" minimax distinctions between optimization algorithms, distinctions that result despite the NFL theorems' enforcing of a type of uniformity over all algorithms.
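In the paper's notation, the central NFL result can be stated compactly: for any two algorithms a1 and a2 and any sample size m, the probability of producing a particular sequence of cost values d_m^y, summed over all objective functions f, is identical:

```latex
\sum_{f} P\left(d_m^{y} \mid f, m, a_1\right) = \sum_{f} P\left(d_m^{y} \mid f, m, a_2\right)
```

Averaged over all problems, therefore, no search algorithm outperforms any other.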

10,771 citations

Journal Article • DOI
TL;DR: The Artificial Bee Colony (ABC) algorithm is an optimization algorithm based on the intelligent behaviour of honey bee swarms; it is used here for optimizing multivariable functions, and the results show that ABC outperforms the other algorithms.
Abstract: Swarm intelligence is a research branch that models the population of interacting agents or swarms that are able to self-organize. An ant colony, a flock of birds or an immune system is a typical example of a swarm system. Bees' swarming around their hive is another example of swarm intelligence. Artificial Bee Colony (ABC) Algorithm is an optimization algorithm based on the intelligent behaviour of honey bee swarm. In this work, ABC algorithm is used for optimizing multivariable functions and the results produced by ABC, Genetic Algorithm (GA), Particle Swarm Algorithm (PSO) and Particle Swarm Inspired Evolutionary Algorithm (PS-EA) have been compared. The results showed that ABC outperforms the other algorithms.
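The core move of ABC, the neighbour search carried out by employed and onlooker bees, is small enough to show directly. The sketch below illustrates that single step only; the scout-bee phase, the abandonment limit, and the onlooker selection probabilities are omitted, and the function name is ours.

```python
import numpy as np

def abc_candidate(food_sources, i):
    """ABC neighbour search: perturb one randomly chosen dimension of food
    source i towards/away from a randomly chosen other source k.  The new
    candidate is kept only if it is fitter (greedy selection, done by the caller)."""
    n, dim = food_sources.shape
    k = np.random.choice([s for s in range(n) if s != i])   # a different food source
    j = np.random.randint(dim)                               # one random dimension
    phi = np.random.uniform(-1, 1)                           # random step factor
    candidate = food_sources[i].copy()
    candidate[j] += phi * (food_sources[i, j] - food_sources[k, j])
    return candidate
```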

6,377 citations

Journal Article • DOI
TL;DR: A new optimization algorithm based on the law of gravity and mass interactions is introduced and the obtained results confirm the high performance of the proposed method in solving various nonlinear functions.
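The TL;DR only names the physical metaphor, so a brief sketch of one Gravitational Search Algorithm iteration may help: each agent's mass is derived from its relative fitness, and agents accelerate towards heavier (better) agents under a gravity-like force. This is an illustrative reconstruction under common assumptions (no Kbest restriction, no schedule for the gravitational constant G), not the authors' implementation.

```python
import numpy as np

def gsa_step(X, fitness, velocities, G):
    """One GSA iteration sketch (minimization): masses from relative fitness,
    gravity-like attraction towards better agents, stochastic velocity memory."""
    n, dim = X.shape
    worst, best = fitness.max(), fitness.min()
    m = (worst - fitness) / (worst - best + 1e-12)        # better fitness -> larger mass
    M = m / m.sum()
    acc = np.zeros_like(X)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            diff = X[j] - X[i]
            dist = np.linalg.norm(diff)
            # each pairwise attraction is randomly weighted; the agent's own mass cancels
            acc[i] += np.random.rand() * G * M[j] * diff / (dist + 1e-12)
    velocities = np.random.rand(n, 1) * velocities + acc   # stochastic memory of velocity
    return X + velocities, velocities
```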

5,501 citations