Open Access Journal Article

A comprehensive survey of sine cosine algorithm: variants and applications

TLDR
Sine Cosine Algorithm (SCA) is a recent meta-heuristic algorithm inspired by the properties of the trigonometric sine and cosine functions; it has attracted great attention from researchers and has been widely used to solve different optimization problems in several fields.
Abstract
Sine Cosine Algorithm (SCA) is a recent meta-heuristic algorithm inspired by the properties of the trigonometric sine and cosine functions. Since its introduction by Mirjalili in 2016, SCA has attracted great attention from researchers and has been widely used to solve different optimization problems in several fields. This attention is due to its reasonable execution time, good convergence acceleration rate, and high efficiency compared to several well-regarded optimization algorithms available in the literature. This paper presents a brief overview of the basic SCA and its variants, divided into modified, multi-objective, and hybridized versions. Furthermore, the applications of SCA in several domains such as classification, image processing, robot path planning, scheduling, radial distribution networks, and other engineering problems are described. Finally, the paper recommends some potential future research directions for SCA.
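
For readers unfamiliar with the mechanism the abstract refers to, the following is a minimal Python sketch of the core SCA position update from Mirjalili (2016): each candidate moves toward (or around) the best-so-far destination point along a sine- or cosine-shaped trajectory whose amplitude r1 decreases linearly over the iterations. The function name, parameter defaults, and the sphere-function usage example are illustrative assumptions, not taken from the surveyed paper.

import numpy as np

def sca_minimize(f, bounds, n_agents=30, max_iter=500, a=2.0, seed=None):
    # Illustrative sketch of the basic Sine Cosine Algorithm (minimization).
    # f      : objective mapping a 1-D numpy array to a scalar
    # bounds : (lower, upper) arrays describing the search-space box
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    X = rng.uniform(lo, hi, size=(n_agents, lo.size))           # candidate solutions
    fitness = np.apply_along_axis(f, 1, X)
    best, best_fit = X[fitness.argmin()].copy(), fitness.min()  # destination point P
    for t in range(max_iter):
        r1 = a - t * (a / max_iter)                  # linearly decreasing amplitude
        for i in range(n_agents):
            r2 = rng.uniform(0, 2 * np.pi, lo.size)  # angle of the sine/cosine step
            r3 = rng.uniform(0, 2, lo.size)          # random weight on the destination
            r4 = rng.uniform(0, 1, lo.size)          # sine/cosine switch
            step = np.where(r4 < 0.5,
                            r1 * np.sin(r2) * np.abs(r3 * best - X[i]),
                            r1 * np.cos(r2) * np.abs(r3 * best - X[i]))
            X[i] = np.clip(X[i] + step, lo, hi)
            fit = f(X[i])
            if fit < best_fit:                       # greedy update of the destination
                best, best_fit = X[i].copy(), fit
    return best, best_fit

# usage: minimize the 10-dimensional sphere function
best, val = sca_minimize(lambda x: float(np.sum(x ** 2)),
                         (np.full(10, -10.0), np.full(10, 10.0)), seed=0)

When r4 falls below 0.5 the sine branch is used, otherwise the cosine branch; the decreasing r1 shifts the search from exploration to exploitation over time.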



Citations
Journal Article

Novel Improved Salp Swarm Algorithm: An Application for Feature Selection

TL;DR: A modified version of the salp swarm algorithm is proposed for feature selection; compared with the best-performing algorithms under the same test setup, the proposed solution yields a smaller number of selected features and higher classification accuracy.
Journal Article

Hybridized sine cosine algorithm with convolutional neural networks dropout regularization application

TL;DR: In this paper, the authors propose an automated framework based on the hybridized sine cosine algorithm for tackling the overfitting problem in convolutional neural networks (CNNs).
Journal Article

Boosting Marine Predators Algorithm by Salp Swarm Algorithm for Multilevel Thresholding Image Segmentation

TL;DR: In this paper, a hybrid of the Marine Predators Algorithm (MPA) and the Salp Swarm Algorithm (SSA), named MPASSA, is proposed to determine the optimal thresholds for multilevel image segmentation.
Journal Article

An Improved Teaching-Learning-Based Optimization Algorithm with Reinforcement Learning Strategy for Solving Optimization Problems

TL;DR: A new learning mode that accounts for the effect of the teacher is presented, and the Q-Learning method from reinforcement learning (RL) is introduced to build a switching mechanism between the two learning modes in the learner phase, improving the local-optima avoidance ability of RLTLBO.
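
The switching mechanism described in this TL;DR can be pictured with a small, hypothetical Q-learning sketch: a Q-table scores the two learner-phase modes, a mode is chosen epsilon-greedily each iteration, and the table is updated according to whether the chosen mode improved the learner. All names and constants below are illustrative assumptions, not taken from the RLTLBO paper.

import random

# Hypothetical sketch: state = whether the previous step improved (0/1),
# action = which of the two learner-phase modes to apply next.
Q = {0: [0.0, 0.0], 1: [0.0, 0.0]}
alpha, gamma, eps = 0.1, 0.9, 0.1     # learning rate, discount, exploration rate

def choose_mode(state):
    # epsilon-greedy choice between learning mode 0 and learning mode 1
    if random.random() < eps:
        return random.randrange(2)
    return 0 if Q[state][0] >= Q[state][1] else 1

def update_q(state, mode, reward, next_state):
    # standard one-step Q-learning update
    Q[state][mode] += alpha * (reward + gamma * max(Q[next_state]) - Q[state][mode])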
References
Journal Article

Optimization by Simulated Annealing

TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
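
The annealing analogy in this summary boils down to the Metropolis acceptance rule: improvements are always accepted, while uphill moves are accepted with probability exp(-dE/T) under a slowly decreasing temperature T. Below is a minimal, generic Python sketch of that rule; the function name, cooling schedule, and defaults are illustrative assumptions, not the paper's formulation.

import math, random

def simulated_annealing(f, x0, neighbor, T0=1.0, cooling=0.995, steps=10000):
    # f        : objective ("energy") to minimize
    # neighbor : function returning a random neighbor of a state
    x, fx = x0, f(x0)
    best, best_f = x, fx
    T = T0
    for _ in range(steps):
        cand = neighbor(x)
        fc = f(cand)
        # always accept improvements; accept worse moves with probability exp(-dE/T)
        if fc < fx or random.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < best_f:
                best, best_f = x, fx
        T *= cooling            # geometric cooling schedule
    return best, best_f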
Journal Article

Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces

TL;DR: In this article, a new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous-space functions is presented; the approach requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
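
The heuristic summarized here is differential evolution; its classic DE/rand/1/bin form needs only a few control parameters (population size, the differential weight F, and the crossover rate CR). A minimal Python sketch under those standard conventions follows; the function name and defaults are illustrative assumptions.

import numpy as np

def differential_evolution(f, bounds, pop_size=40, F=0.8, CR=0.9, max_iter=300, seed=None):
    # Illustrative DE/rand/1/bin sketch (minimization).
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    dim = lo.size
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.apply_along_axis(f, 1, pop)
    for _ in range(max_iter):
        for i in range(pop_size):
            # mutation: scaled difference of two random vectors added to a third
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # binomial crossover, forcing at least one component from the mutant
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # greedy one-to-one selection
            f_trial = f(trial)
            if f_trial <= fit[i]:
                pop[i], fit[i] = trial, f_trial
    i_best = fit.argmin()
    return pop[i_best], fit[i_best]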
Book

Genetic Algorithms

Proceedings Article

A new optimizer using particle swarm theory

TL;DR: The optimization of nonlinear functions using particle swarm methodology is described, and implementations of two paradigms are discussed and compared, including a recently developed locally oriented paradigm.
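
The two paradigms compared in this summary are the globally oriented ("gbest") and locally oriented ("lbest") particle swarms; both share the same velocity-and-position update, differing only in which neighbourhood best attracts each particle. The sketch below shows the gbest form with an inertia weight; the function name and coefficient defaults are illustrative assumptions rather than the paper's original settings.

import numpy as np

def pso_minimize(f, bounds, n_particles=30, max_iter=200, w=0.7, c1=1.5, c2=1.5, seed=None):
    # Illustrative global-best PSO sketch (minimization).
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    dim = lo.size
    X = rng.uniform(lo, hi, (n_particles, dim))     # positions
    V = np.zeros((n_particles, dim))                # velocities
    pbest, pbest_f = X.copy(), np.apply_along_axis(f, 1, X)
    g, g_f = pbest[pbest_f.argmin()].copy(), pbest_f.min()
    for _ in range(max_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # velocity update: inertia + pull toward personal best + pull toward global best
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (g - X)
        X = np.clip(X + V, lo, hi)
        fx = np.apply_along_axis(f, 1, X)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = X[improved], fx[improved]
        if pbest_f.min() < g_f:
            g, g_f = pbest[pbest_f.argmin()].copy(), pbest_f.min()
    return g, g_f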
Journal Article

No free lunch theorems for optimization

TL;DR: A framework is developed to explore the connection between effective optimization algorithms and the problems they are solving, and a number of "no free lunch" (NFL) theorems are presented which establish that, for any algorithm, any elevated performance over one class of problems is offset by performance over another class.