Journal ArticleDOI

Penalty Function Methods for Constrained Optimization with Genetic Algorithms

01 Apr 2005-Mathematical & Computational Applications (Association for Scientific Research)-Vol. 10, Iss: 1, pp 45-56
TL;DR: Penalty-based methods for handling constraints in Genetic Algorithms are presented, and their strengths and weaknesses are discussed.
Abstract: Genetic Algorithms are most directly suited to unconstrained optimization. Application of Genetic Algorithms to constrained optimization problems is often a challenging effort. Several methods have been proposed for handling constraints. The most common method in Genetic Algorithms to handle constraints is to use penalty functions. In this paper, we present these penalty-based methods and discuss their strengths and weaknesses.
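As a concrete illustration of the idea (a minimal sketch, not taken from the paper): with a static penalty, a GA simply minimizes the objective plus a weighted measure of constraint violation. The toy objective, constraints, and weight R below are illustrative assumptions.

def objective(x):
    # toy objective: sphere function, to be minimized
    return sum(xi ** 2 for xi in x)

def constraints(x):
    # toy inequality constraints, each written as g(x) <= 0
    return [1.0 - (x[0] + x[1]),   # x1 + x2 >= 1
            x[0] - 5.0]            # x1 <= 5

def penalized_fitness(x, R=1000.0):
    # static penalty: the penalty term is zero whenever no constraint is violated
    violation = sum(max(0.0, g) ** 2 for g in constraints(x))
    return objective(x) + R * violation

# a GA would then minimize penalized_fitness(x) instead of objective(x)

More elaborate schemes, such as the dynamic and adaptive penalties discussed in the paper, vary the weight over generations or adapt it to the population instead of keeping R fixed.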


Citations
Proceedings ArticleDOI
04 Apr 2016
TL;DR: The investigation shows that the attractive and repulsive Particle Swarm Optimization (ARPSO) performs as well as Differential Evolution, and both are consequently the most suitable for solving the PST optimization problem.
Abstract: The European transmission system operators of today have to struggle with a general increase in power flows, leading more and more to line congestions in certain transmission corridors. One reasonable countermeasure is active power flow control using phase shifting transformers (PST). To avoid negative interdependencies between these PSTs in different European control areas, the need for coordination of these power flow controllers is expected to rise. This paper uses several variants of Differential Evolution, Genetic Algorithm, Mean Variance Mapping Optimization and Particle Swarm Optimization to solve a PST optimization problem in the IEEE 57-Bus System. With the help of the results, the algorithms are compared with respect to the average fitness, standard deviation, computation time and necessary number of iterations. The investigation shows that the attractive and repulsive Particle Swarm Optimization (ARPSO) performs as well as Differential Evolution, and both are consequently the most suitable for solving the PST optimization problem.

9 citations


Cites background from "Penalty Function Methods for Constr..."

  • ...If there is no limit violation the penalty term will be zero [14]....

    [...]

Journal ArticleDOI
TL;DR: In this article, the authors present the application of the cuckoo search (CS) algorithm to the minimization of the commutation torque ripple in a brushless DC (BLDC) motor.
Abstract: This paper presents the application of the cuckoo search (CS) algorithm to the minimization of the commutation torque ripple in a brushless DC (BLDC) motor. The optimization algorithm was created based on the cuckoo's reproductive behavior. A lumped-parameter mathematical model of the BLDC motor was developed. The values of self-inductances, mutual inductances, and back-electromotive force waveforms applied in the mathematical model were calculated by the use of the finite element method. The optimization algorithm was developed in Python 3.8. The CS algorithm was coupled with a static penalty function. During the optimization process, the shape of the voltage supplying the stator windings was determined to minimize the commutation torque ripple. Selected results of computer simulation are presented and discussed.
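The abstract does not give implementation details, but the coupling of cuckoo search with a static penalty can be sketched generically. Below is a hedged, simplified cuckoo search loop (Lévy flights via Mantegna's algorithm) minimizing a penalized objective; the objective, constraints, bounds and parameter values are placeholder assumptions, not the authors' BLDC model or code.

import math
import random

def penalized(x, f, gs, R=1000.0):
    # static penalty: objective plus R times the summed squared violations
    # of inequality constraints written as g(x) <= 0
    return f(x) + R * sum(max(0.0, g(x)) ** 2 for g in gs)

def levy_step(beta=1.5):
    # Mantegna's algorithm for a Levy-distributed step length
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u, v = random.gauss(0, sigma), random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(f, gs, dim, bounds, n_nests=15, pa=0.25, iters=200):
    lo, hi = bounds
    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_nests)]
    cost = [penalized(x, f, gs) for x in nests]
    best = min(range(n_nests), key=lambda i: cost[i])
    for _ in range(iters):
        for i in range(n_nests):
            # new candidate: Levy flight biased by the distance to the best nest
            cand = [min(hi, max(lo, xi + 0.01 * levy_step() * (xi - nests[best][d])))
                    for d, xi in enumerate(nests[i])]
            j = random.randrange(n_nests)          # compare against a random nest
            c = penalized(cand, f, gs)
            if c < cost[j]:
                nests[j], cost[j] = cand, c
        # abandon a fraction pa of the worst nests and re-seed them randomly
        k = max(1, int(pa * n_nests))
        for i in sorted(range(n_nests), key=lambda i: cost[i])[-k:]:
            nests[i] = [random.uniform(lo, hi) for _ in range(dim)]
            cost[i] = penalized(nests[i], f, gs)
        best = min(range(n_nests), key=lambda i: cost[i])
    return nests[best], cost[best]

# example: cuckoo_search(lambda x: sum(v * v for v in x), [lambda x: 1.0 - sum(x)], dim=4, bounds=(-5.0, 5.0))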

9 citations

Journal ArticleDOI
TL;DR: In this article, the authors used a scaled-down industrial pneumatic conveying and drying system in order to develop control-oriented models and suitable robust control strategies for the air preheating furnace of the system.
Abstract: The present work uses a scaled-down industrial pneumatic conveying and drying system in order to develop control-oriented models and suitable robust control strategies for the air preheating furnace of the system. A better control system has been achieved by utilizing the benefits of integrating first-principle models, system identification techniques and parametric robust control methods. Though these processes are widely used in the drying and transmission of various food, pharmaceutical and industrial products in the form of powder-like fine-grained material, suitable control-oriented thermal models for these processes have not been studied. In this work, the air preheating furnace of a pneumatic conveying and drying system is initially modeled with first principles. The novel dynamic models derived from first principles are intended to evaluate dynamic changes in outlet air temperature corresponding to changes in current input to the heating coils, air flow velocity and ambient temperature. Then a continuous ...

8 citations

Journal ArticleDOI
TL;DR: The introduced δ-PSO specialization is easy to implement in PSO or other swarm intelligence methods, and hopefully can provide similar improvements in other applications.
Abstract: The paper deals with the optimization of the S-Lay submarine pipe-laying. The considered laying model is based on a nonlinear elastic beam model with elastic contact interactions with rigid structures of roller supports and the seabed, solved in the Abaqus software. The optimization problem is formulated so as to determine the main parameters of pipe-laying. In order to maximize the efficiency of the optimization procedure, a specialized Particle Swarm Optimization variant is developed. The introduced δ-PSO employs an additional displacement of agent positions, through which the optimization is directed towards solutions based on offshore engineering practice. Two different cases of submarine pipe-laying were used for testing. In these tests, the specialized PSO was compared to standard PSO and Mesh Adaptive Direct Search, which it both outperformed. The δ-PSO specialization is easy to implement in PSO or other swarm intelligence methods, and hopefully can provide similar improvements in other applications.
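The abstract does not specify the exact form of the additional displacement, so the following is only a hedged sketch of the general idea: a standard PSO update followed by a small shift of the position toward a reference configuration reflecting engineering practice. x_ref, delta and the coefficients are illustrative assumptions, not the paper's δ-PSO.

import random

def delta_pso_step(x, v, p_best, g_best, x_ref,
                   w=0.7, c1=1.5, c2=1.5, delta=0.05):
    # standard PSO velocity update (inertia + cognitive + social terms)
    r1, r2 = random.random(), random.random()
    new_v = [w * vi + c1 * r1 * (pb - xi) + c2 * r2 * (gb - xi)
             for xi, vi, pb, gb in zip(x, v, p_best, g_best)]
    # additional displacement: after the usual move, shift the position a
    # small fraction delta of the way toward the reference configuration
    new_x = [(xi + vi) + delta * (xr - (xi + vi))
             for xi, vi, xr in zip(x, new_v, x_ref)]
    return new_x, new_v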

8 citations

Journal ArticleDOI
TL;DR: This paper proposes a general heuristic framework that extends the well-known Variable Neighborhood Search algorithm to include dynamic constraint penalization and calls the new algorithm scheduled-penalty Variable Neighborhood Search.
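As a rough illustration of what such a scheme can look like (a sketch under assumptions, not the paper's algorithm): a basic VNS loop whose cost function adds a constraint-violation penalty with a weight that grows on a fixed schedule. The neighborhood operators, local search and geometric schedule below are placeholders.

def scheduled_penalty_vns(x0, objective, violation, neighborhoods,
                          local_search, iters=100, w0=1.0, growth=1.05):
    # cost = objective + current penalty weight * total constraint violation
    def cost(x, w):
        return objective(x) + w * violation(x)

    x, w = x0, w0
    for _ in range(iters):
        k = 0
        while k < len(neighborhoods):
            x_shaken = neighborhoods[k](x)                    # shaking in N_k(x)
            x_local = local_search(x_shaken, lambda y: cost(y, w))
            if cost(x_local, w) < cost(x, w):
                x, k = x_local, 0                             # improvement: restart at N_0
            else:
                k += 1                                        # otherwise widen the neighborhood
        w *= growth                                           # schedule: tighten the penalty
    return x

# neighborhoods is a list of callables returning a random neighbor of x, and
# local_search(start, cost_fn) returns a locally improved solution under cost_fn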

8 citations

References
Book
01 Sep 1988
TL;DR: In this article, the authors present the computer techniques, mathematical tools, and research results that will enable both students and practitioners to apply genetic algorithms to problems in many fields, assuming no prior knowledge of GAs or genetics and only a minimum of computer programming and mathematics background.
Abstract: From the Publisher: This book brings together, in an informal and tutorial fashion, the computer techniques, mathematical tools, and research results that will enable both students and practitioners to apply genetic algorithms to problems in many fields. Major concepts are illustrated with running examples, and major algorithms are illustrated by Pascal computer programs. No prior knowledge of GAs or genetics is assumed, and only a minimum of computer programming and mathematics background is required.

52,797 citations

Book
01 Jan 1992
TL;DR: A textbook on genetic algorithms and evolution programs, covering how and why GAs work, constraint handling, evolution strategies, and applications such as the transportation problem, the traveling salesman problem, and other discrete problems.
Abstract: Contents: 1. GAs: What Are They? 2. GAs: How Do They Work? 3. GAs: Why Do They Work? 4. GAs: Selected Topics. 5. Binary or Float? 6. Fine Local Tuning. 7. Handling Constraints. 8. Evolution Strategies and Other Methods. 9. The Transportation Problem. 10. The Traveling Salesman Problem. 11. Evolution Programs for Various Discrete Problems. 12. Machine Learning. 13. Evolutionary Programming and Genetic Programming. 14. A Hierarchy of Evolution Programs. 15. Evolution Programs and Heuristics. 16. Conclusions. Appendices A-D. References.

12,212 citations

Book
03 Mar 1993
TL;DR: The book is a solid reference for professionals as well as a useful text for students in the fields of operations research, management science, industrial engineering, applied mathematics, and also in engineering disciplines that deal with analytical optimization techniques.
Abstract: COMPREHENSIVE COVERAGE OF NONLINEAR PROGRAMMING THEORY AND ALGORITHMS, THOROUGHLY REVISED AND EXPANDED. "Nonlinear Programming: Theory and Algorithms"--now in an extensively updated Third Edition--addresses the problem of optimizing an objective function in the presence of equality and inequality constraints. Many realistic problems cannot be adequately represented as a linear program owing to the nature of the nonlinearity of the objective function and/or the nonlinearity of any constraints. The Third Edition begins with a general introduction to nonlinear programming with illustrative examples and guidelines for model construction. Concentration on the three major parts of nonlinear programming is provided: convex analysis, with discussion of topological properties of convex sets, separation and support of convex sets, polyhedral sets, extreme points and extreme directions of polyhedral sets, and linear programming; optimality conditions and duality, with coverage of the nature, interpretation, and value of the classical Fritz John (FJ) and the Karush-Kuhn-Tucker (KKT) optimality conditions, the interrelationships between various proposed constraint qualifications, and Lagrangian duality and saddle point optimality conditions; and algorithms and their convergence, with a presentation of algorithms for solving both unconstrained and constrained nonlinear programming problems. Important features of the Third Edition include: new topics such as second interior point methods, nonconvex optimization, nondifferentiable optimization, and more; updated discussion and new applications in each chapter; detailed numerical examples and graphical illustrations; essential coverage of modeling and formulating nonlinear programs; simple numerical problems; and advanced theoretical exercises. The book is a solid reference for professionals as well as a useful text for students in the fields of operations research, management science, industrial engineering, applied mathematics, and also in engineering disciplines that deal with analytical optimization techniques. The logical and self-contained format uniquely covers nonlinear programming techniques with a great depth of information and an abundance of valuable examples and illustrations that showcase the most current advances in nonlinear problems.
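For reference, the Karush-Kuhn-Tucker (KKT) conditions mentioned above can be stated in standard textbook form (not quoted from this book). For the problem of minimizing f(x) subject to g_i(x) <= 0 (i = 1,...,m) and h_j(x) = 0 (j = 1,...,l), a candidate x* together with multipliers u_i >= 0 and v_j must satisfy, in LaTeX notation:

\nabla f(x^{*}) + \sum_{i=1}^{m} u_i \,\nabla g_i(x^{*}) + \sum_{j=1}^{\ell} v_j \,\nabla h_j(x^{*}) = 0,
\qquad u_i \, g_i(x^{*}) = 0, \quad u_i \ge 0, \quad i = 1,\dots,m,
\qquad g_i(x^{*}) \le 0, \quad h_j(x^{*}) = 0 .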

6,259 citations

Journal ArticleDOI
TL;DR: The GA's population-based approach and ability to make pair-wise comparisons in the tournament selection operator are exploited to devise a penalty function approach that does not require any penalty parameter to guide the search towards the constrained optimum.
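A hedged sketch of that pair-wise comparison rule (a paraphrase of the usual feasibility-first tournament, not code from the cited article), where total_violation sums the constraint violations and returns zero for feasible candidates:

def tournament_winner(a, b, objective, total_violation):
    va, vb = total_violation(a), total_violation(b)
    if va == 0 and vb == 0:
        # both feasible: the better objective value wins
        return a if objective(a) <= objective(b) else b
    if va == 0 or vb == 0:
        # exactly one feasible: the feasible candidate wins
        return a if va == 0 else b
    # both infeasible: the smaller total constraint violation wins
    return a if va <= vb else b

Because candidates are only ever compared pairwise, no penalty weight has to be chosen, which is exactly the property the TL;DR highlights.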

3,495 citations


"Penalty Function Methods for Constr..." refers background in this paper

  • ...These approaches can be grouped in four major categories [28]:
    Category 1: Methods based on penalty functions - Death Penalty [2] - Static Penalties [15,20] - Dynamic Penalties [16,17] - Annealing Penalties [5,24] - Adaptive Penalties [10,12,35,37] - Segregated GA [21] - Co-evolutionary Penalties [8]
    Category 2: Methods based on a search of feasible solutions - Repairing unfeasible individuals [27] - Superiority of feasible points [9,32] - Behavioral memory [34]...

    [...]

Book
01 Jan 1996
TL;DR: In this work, the author compares the three most prominent representatives of evolutionary algorithms: genetic algorithms, evolution strategies, and evolutionary programming within a unified framework, thereby clarifying the similarities and differences of these methods.

2,679 citations