Journal ArticleDOI

Opposition versus randomness in soft computing techniques

TLDR
This paper proves, both mathematically and experimentally, that the simultaneous consideration of randomness and opposition is more advantageous than pure randomness, and applies this result to accelerate differential evolution (DE).
Abstract
For many soft computing methods, we need to generate random numbers to use either as initial estimates or during the learning and search process. Recently, results for evolutionary algorithms, reinforcement learning and neural networks have been reported which indicate that the simultaneous consideration of randomness and opposition is more advantageous than pure randomness. This new scheme, called opposition-based learning, has the apparent effect of accelerating soft computing algorithms. This paper proves this advantage both mathematically and experimentally and, as an application, applies it to accelerate differential evolution (DE). By taking advantage of random numbers and their opposites, the optimization, search or learning process in many soft computing techniques can be accelerated when there is no a priori knowledge about the solution. The mathematical proofs and the results of the conducted experiments confirm each other.
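As a concrete illustration of the opposition idea described in the abstract, the sketch below initializes a population from random candidates together with their opposites, where the opposite of x in [lower, upper] is lower + upper - x, and keeps the fitter half. This is a minimal sketch rather than the paper's own implementation; the sphere objective and all function names are illustrative assumptions.

```python
import random

def opposite(x, lower, upper):
    """Opposite point of x within the box [lower, upper]: lo + hi - x per dimension."""
    return [lo + hi - xi for xi, lo, hi in zip(x, lower, upper)]

def obl_init_population(pop_size, lower, upper, fitness):
    """Opposition-based initialization: evaluate random candidates and their
    opposites, then keep the pop_size fittest points of the combined set."""
    randoms = [[random.uniform(lo, hi) for lo, hi in zip(lower, upper)]
               for _ in range(pop_size)]
    opposites = [opposite(x, lower, upper) for x in randoms]
    combined = randoms + opposites
    combined.sort(key=fitness)          # minimization: smaller is better
    return combined[:pop_size]

if __name__ == "__main__":
    sphere = lambda x: sum(xi * xi for xi in x)    # illustrative objective
    pop = obl_init_population(10, [-5.0] * 3, [5.0] * 3, sphere)
    print(pop[0])                                  # best initial candidate
```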


Citations
Journal ArticleDOI

Opposition-Based Differential Evolution

TL;DR: This paper presents a novel algorithm, opposition-based differential evolution (ODE), which accelerates differential evolution (DE) by employing opposition-based learning (OBL) for population initialization and for generation jumping; the results confirm that ODE outperforms the original DE and FADE in terms of convergence speed and solution accuracy.
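The generation-jumping step mentioned above can be sketched as follows, under the common convention that opposites are taken with respect to the current population's per-dimension minimum and maximum rather than the original bounds. This is a hedged sketch, not the cited paper's exact code, and the jumping-rate value is illustrative.

```python
import random

def generation_jumping(population, fitness, jumping_rate=0.3):
    """With probability jumping_rate, merge the population with its opposites
    (taken w.r.t. the population's current per-dimension min/max) and keep
    the fittest half; otherwise return the population unchanged."""
    if random.random() > jumping_rate:
        return population
    dim = len(population[0])
    mins = [min(ind[d] for ind in population) for d in range(dim)]
    maxs = [max(ind[d] for ind in population) for d in range(dim)]
    opposites = [[mins[d] + maxs[d] - ind[d] for d in range(dim)]
                 for ind in population]
    combined = population + opposites
    combined.sort(key=fitness)          # minimization
    return combined[:len(population)]
```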
Journal ArticleDOI

Parameters identification of solar cell models using generalized oppositional teaching learning based optimization

TL;DR: The performance of GOTLBO is comprehensively evaluated on thirteen benchmark functions and on two parameter identification problems of solar cell models, i.e., the single diode model and the double diode model.
Proceedings ArticleDOI

Quasi-oppositional Differential Evolution

TL;DR: The proposed mathematical proof shows that, in a black-box optimization problem, quasi-opposite points have a higher chance of being closer to the solution than opposite points.
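For context, the quasi-opposite point is commonly defined as a point drawn uniformly at random between the interval center and the opposite point; the sketch below follows that common definition and is illustrative rather than the cited paper's exact procedure.

```python
import random

def quasi_opposite(x, lower, upper):
    """Quasi-opposite point: per dimension, sample uniformly between the
    interval center (lo + hi) / 2 and the opposite point lo + hi - x."""
    qo = []
    for xi, lo, hi in zip(x, lower, upper):
        center = (lo + hi) / 2.0
        opp = lo + hi - xi
        low, high = (center, opp) if center <= opp else (opp, center)
        qo.append(random.uniform(low, high))
    return qo
```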
Journal ArticleDOI

Optimal reactive power dispatch using quasi-oppositional teaching learning based optimization

TL;DR: Results demonstrate the superiority, in terms of solution quality, of the proposed QOTLBO approach over the original TLBO and other optimization techniques, and confirm its potential to solve the ORPD problem.
Journal ArticleDOI

Enhanced leader PSO (ELPSO)

TL;DR: A novel optimisation algorithm, named enhanced leader PSO (ELPSO), is introduced; it mitigates the premature convergence problem of conventional PSO, and the results confirm that ELPSO outperforms the other compared algorithms.
References
Book

Genetic algorithms in search, optimization, and machine learning

TL;DR: In this book, the author presents the computer techniques, mathematical tools, and research results that enable both students and practitioners to apply genetic algorithms to problems in many fields, including computer programming and mathematics.
Journal ArticleDOI

Optimization by Simulated Annealing

TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
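The annealing analogy corresponds to the Metropolis acceptance rule: a worse candidate is accepted with probability exp(-delta / T), and the temperature T is gradually lowered. The sketch below is a generic minimal implementation under standard assumptions; the geometric cooling schedule and the neighbor callback are illustrative choices, not the cited paper's code.

```python
import math
import random

def simulated_annealing(f, x0, neighbor, t0=1.0, cooling=0.95, steps=1000):
    """Minimize f starting from x0: accept worse moves with probability
    exp(-delta / T) (Metropolis rule) while T decays geometrically."""
    x, fx, t = x0, f(x0), t0
    for _ in range(steps):
        y = neighbor(x)
        fy = f(y)
        delta = fy - fx
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x, fx = y, fy
        t *= cooling
    return x, fx
```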
Journal ArticleDOI

Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces

TL;DR: In this article, a new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented; it requires few control variables, is robust, is easy to use, and lends itself very well to parallel computation.
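The heuristic summarized above is the classic DE scheme; a minimal sketch of one DE/rand/1/bin generation follows. The parameter values (F = 0.5, CR = 0.9) and the bounds format (a list of (lo, hi) pairs) are illustrative assumptions, not prescriptions from the cited paper.

```python
import random

def de_rand_1_bin(population, fitness, bounds, f_weight=0.5, cr=0.9):
    """One generation of DE/rand/1/bin for minimization."""
    dim = len(population[0])
    new_pop = []
    for i, target in enumerate(population):
        r1, r2, r3 = random.sample([j for j in range(len(population)) if j != i], 3)
        a, b, c = population[r1], population[r2], population[r3]
        j_rand = random.randrange(dim)
        trial = []
        for j in range(dim):
            if random.random() < cr or j == j_rand:
                v = a[j] + f_weight * (b[j] - c[j])   # differential mutation
                lo, hi = bounds[j]
                v = min(max(v, lo), hi)               # clamp to the search box
            else:
                v = target[j]                         # inherit from target
            trial.append(v)
        new_pop.append(trial if fitness(trial) <= fitness(target) else target)
    return new_pop
```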
Book

Genetic Algorithms
