Journal ArticleDOI

Penalty Function Methods for Constrained Optimization with Genetic Algorithms

01 Apr 2005 - Mathematical & Computational Applications (Association for Scientific Research) - Vol. 10, Iss. 1, pp. 45-56
TL;DR: Penalty-based methods for handling constraints in Genetic Algorithms are presented, and their strengths and weaknesses are discussed.
Abstract: Genetic Algorithms are most directly suited to unconstrained optimization. Application of Genetic Algorithms to constrained optimization problems is often a challenging effort. Several methods have been proposed for handling constraints. The most common method in Genetic Algorithms to handle constraints is to use penalty functions. In this paper, we present these penalty-based methods and discuss their strengths and weaknesses.
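
To make the idea concrete, here is a minimal, self-contained sketch of the most common variant the paper surveys, a static penalty: an infeasible candidate keeps its raw objective value but is worsened by a fixed weight times its constraint violations, so an ordinary unconstrained GA can rank all individuals on one scale. The test problem, the penalty weight, and the simple evolutionary loop are illustrative assumptions, not examples taken from the paper.

```python
import random

# Static-penalty sketch: infeasible solutions are penalized by a fixed
# weight r, so a standard unconstrained GA-style loop can handle the
# constrained problem. Problem and settings are illustrative assumptions.

def objective(x):
    # Minimize a simple quadratic; hypothetical test function.
    return (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2

def constraint_violations(x):
    # Constraints in g(x) <= 0 form; positive values measure violation.
    g1 = x[0] + x[1] - 2.0      # x0 + x1 <= 2
    g2 = x[0] ** 2 - x[1]       # x0^2 <= x1
    return [max(0.0, g1), max(0.0, g2)]

def penalized_fitness(x, r=1000.0):
    # Static penalty: the weight r stays fixed for the whole run.
    return objective(x) + r * sum(v ** 2 for v in constraint_violations(x))

def evolve(pop_size=50, generations=200, sigma=0.1, bounds=(-3.0, 3.0)):
    # Tiny truncation-selection evolutionary loop using the penalized fitness.
    pop = [[random.uniform(*bounds) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=penalized_fitness)
        parents = pop[: pop_size // 2]
        children = [[p + random.gauss(0.0, sigma) for p in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=penalized_fitness)

if __name__ == "__main__":
    best = evolve()
    print("best point:", best, "violations:", constraint_violations(best))
```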


Citations
Journal ArticleDOI
01 Apr 2016
TL;DR: By applying the idea of the exact penalty function approach, a DS algorithm with an S-type dynamic penalty factor, introduced to achieve a better balance between exploration and exploitation, is developed for constrained global optimization problems.
Abstract: Differential search (DS) is a recently developed derivative-free global heuristic optimization algorithm for solving unconstrained optimization problems. In this paper, by applying the idea of the exact penalty function approach, a DS algorithm is developed for constrained global optimization problems, in which an S-type dynamic penalty factor is introduced so as to achieve a better balance between exploration and exploitation. To illustrate the applicability and effectiveness of the proposed approach, a comparison study is carried out by applying the proposed algorithm and other widely used evolutionary methods to 24 benchmark problems. The results obtained clearly indicate that the proposed method is more effective and efficient than the other widely used evolutionary methods for most of these benchmark problems.

95 citations
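
The abstract above does not spell out the S-type factor, but the idea can be sketched as a penalty weight that follows a sigmoid over the run: small in early generations so infeasible regions can still be explored, and large near the end so the search is pushed onto the feasible set. The logistic form and all parameter values below are assumptions for illustration, not the authors' formulation.

```python
import math

# Hedged sketch of an "S-type" (sigmoid-shaped) dynamic penalty factor:
# mild early in the run, severe late in the run. The logistic shape and
# parameter values are illustrative assumptions.

def s_type_penalty_factor(t, t_max, r_min=1.0, r_max=1e4, steepness=10.0):
    """Penalty weight at generation t of t_max, rising along a logistic curve."""
    progress = t / t_max                                  # 0 -> 1 over the run
    s = 1.0 / (1.0 + math.exp(-steepness * (progress - 0.5)))
    return r_min + (r_max - r_min) * s

def penalized_objective(f_value, total_violation, t, t_max):
    # total_violation: sum of max(0, g_j(x)) over the inequality constraints.
    return f_value + s_type_penalty_factor(t, t_max) * total_violation

if __name__ == "__main__":
    for t in (0, 50, 100, 150, 200):
        print(t, round(s_type_penalty_factor(t, 200), 2))
```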

Journal ArticleDOI
01 Dec 2013
TL;DR: Simulation results show that the proposed approach, which maximizes the profit of the electricity retailer (utility company) and minimizes the bills of its customers, is beneficial for both the customers and the retailer.
Abstract: This paper proposes a Stackelberg game approach to maximize the profit of the electricity retailer (utility company) and minimize the payment bills of its customers. The electricity retailer determines the retail price through the proposed smart energy pricing scheme to optimally adjust the real-time pricing with the aim to maximize its profit. The price information is sent to the customers through a smart meter. According to the announced price, the customers can automatically manage the energy use of appliances in the households by the proposed optimal electricity consumption scheduling system with the aim to minimize their electricity bills. We model the interactions between the retailer and its electricity customers as a 1-leader, N-follower Stackelberg game. At the leader's side, i.e., for the retailer, we adopt genetic algorithms to maximize its profit while at the followers' side, i.e., for customers, we develop an analytical solution to the linear programming problem to minimize their bills. Simulation results show that the proposed approach is beneficial for both the customers and the retailer.

93 citations
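
A hedged sketch of the bilevel evaluation described above: for a candidate hourly price vector (which the leader's GA would propose), each follower solves a small linear program that schedules its required energy across the day to minimize its bill, and the retailer's profit over the resulting loads is returned as the GA fitness. The problem sizes, costs, and limits are illustrative assumptions, not the paper's model or data.

```python
import numpy as np
from scipy.optimize import linprog

# Leader-follower evaluation sketch: the followers' LPs respond to a price
# vector, and the leader's profit over those responses is the GA fitness.
# All numbers below are illustrative assumptions.

HOURS = 24
WHOLESALE_COST = 0.05          # retailer's cost per kWh (assumed)

def customer_response(prices, total_energy=30.0, max_per_hour=3.0):
    """Follower LP: minimize price . load subject to sum(load) = total_energy."""
    res = linprog(
        c=prices,
        A_eq=np.ones((1, HOURS)), b_eq=[total_energy],
        bounds=[(0.0, max_per_hour)] * HOURS,
        method="highs",
    )
    return res.x                               # hourly consumption schedule

def retailer_profit(prices, customer_demands=(30.0, 25.0, 40.0)):
    """Leader's fitness: revenue minus wholesale cost over all followers."""
    prices = np.asarray(prices, dtype=float)
    profit = 0.0
    for demand in customer_demands:
        load = customer_response(prices, total_energy=demand)
        profit += float(np.dot(prices - WHOLESALE_COST, load))
    return profit                              # a GA would maximize this

if __name__ == "__main__":
    flat = np.full(HOURS, 0.10)
    print("profit under a flat 0.10/kWh tariff:", round(retailer_profit(flat), 2))
```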

Journal ArticleDOI
Li-Chiu Chang
TL;DR: The results demonstrated that a penalty-type genetic algorithm could effectively provide rational hydrographs to reduce flood damage during flood operation and to increase final storage for future use.

90 citations


Cites background from "Penalty Function Methods for Constr..."

  • ...The additive penalty-type has received much more attention than the multiplicative type in the GA community (Yeniay, 2005)....

    [...]

Journal ArticleDOI
TL;DR: A modified PSO, called self-adaptive velocity particle swarm optimization (SAVPSO), is presented; it adopts the recently proposed dynamic-objective constraint-handling method (DOCHM), which is essentially a constituent of the inherent search mechanism of the integrated SAVPSO.
Abstract: Particle swarm optimization (PSO) was originally developed as an unconstrained optimization technique and therefore lacks an explicit mechanism for handling constraints. When solving constrained optimization problems (COPs) with PSO, the existing research mainly focuses on how to handle constraints, and the impact of constraints on the inherent search mechanism of PSO has been scarcely explored. Motivated by this fact, in this paper we mainly investigate how to utilize the impact of constraints (or the knowledge about the feasible region) to improve the optimization ability of the particles. Based on these investigations, we present a modified PSO, called self-adaptive velocity particle swarm optimization (SAVPSO), for solving COPs. To handle constraints, in SAVPSO we adopt our recently proposed dynamic-objective constraint-handling method (DOCHM), which is essentially a constituent part of the inherent search mechanism of the integrated SAVPSO, i.e., DOCHM + SAVPSO. The performance of the integrated SAVPSO is tested on a well-known benchmark suite, and the experimental results show that appropriately utilizing the knowledge about the feasible region can substantially improve the performance of the underlying algorithm in solving COPs.

88 citations
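
The exact SAVPSO velocity update and DOCHM are not reproduced here; the sketch below only illustrates the general flavor of a constraint-aware PSO in which an infeasible particle is judged by its total constraint violation and a feasible one by the original objective when personal and global bests are updated. The test problem and swarm settings are assumptions for illustration, not the authors' algorithm.

```python
import random

# Generic constrained-PSO sketch (not the authors' SAVPSO/DOCHM): infeasible
# particles compete on total violation, feasible ones on the objective.

def objective(x):
    return x[0] ** 2 + x[1] ** 2

def total_violation(x):
    g = [1.0 - x[0] - x[1]]                  # require x0 + x1 >= 1
    return sum(max(0.0, gi) for gi in g)

def better(a, b):
    """Feasibility-first comparison used for pbest/gbest updates."""
    va, vb = total_violation(a), total_violation(b)
    if va > 0.0 or vb > 0.0:
        return va < vb                       # at least one infeasible: compare violations
    return objective(a) < objective(b)       # both feasible: compare objectives

def pso(n=30, iters=300, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    pos = [[random.uniform(lo, hi) for _ in range(2)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=lambda p: (total_violation(p), objective(p)))
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            if better(pos[i], pbest[i]):
                pbest[i] = pos[i][:]
                if better(pbest[i], gbest):
                    gbest = pbest[i][:]
    return gbest

if __name__ == "__main__":
    best = pso()
    print("best:", best, "violation:", total_violation(best))
```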

Journal ArticleDOI
01 Apr 2017
TL;DR: The obtained results show the effectiveness of the proposed enhanced optimization algorithm as an advanced optimization technique that was successfully implemented with good performance characteristics.
Abstract: Highlights: An adaptive differential evolution procedure is presented to solve the optimal reactive power dispatch problem. A multi-objective function aims at minimizing power losses and enhancing the voltage profile. A strategy for an adaptive penalty factor is investigated to alleviate the effects of dependent-variable violations. Numerical applications are carried out on three standard IEEE test systems and on a real Western Delta system. The flexibility of synchronous machines as reactive power sources is proven compared to switchable devices. This paper introduces a proposed procedure to solve the optimal reactive power management (ORPM) problem based on a multi-objective function using a modified differential evolution algorithm (MDEA). The proposed MDEA is investigated in order to enhance the voltage profile as well as to reduce the active power losses by solving the ORPM problem. The ORPM objective function aims to minimize transmission power losses and voltage deviation considering the system constraints. The MDEA aims to enhance the convergence characteristic of the differential evolution algorithm by updating a self-adaptive scaling factor, which exchanges information dynamically every generation. The scaling factor dynamically balances the global and local searches to efficiently avoid trapping in local optima. In addition, a strategy is developed to update the penalty factor for alleviating the effects of various system constraints. Numerical applications of different case studies are carried out on three standard IEEE systems, i.e., the 14-bus, 30-bus and 57-bus test systems. Also, the proposed procedure is applied to the Western Delta Network, which is a real part of the Egyptian main grid system. The flexibility of synchronous machines to provide controllable reactive power is proven, with less dependency on discrete reactive power controllers such as switchable devices and tap changers. The obtained results show the effectiveness of the proposed enhanced optimization algorithm as an advanced optimization technique that was successfully implemented with good performance characteristics.

87 citations
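
A hedged sketch of the penalized evaluation suggested by the abstract above: the fitness combines transmission losses and voltage deviation, and violations of dependent-variable limits (load-bus voltages here) are added through a penalty factor that is tightened as the generations progress. The update rule, weights, and voltage limits are assumptions for illustration, not the paper's settings.

```python
# Adaptive-penalty evaluation sketch for a reactive power dispatch objective.
# All constants and the penalty-update rule are illustrative assumptions.

V_MIN, V_MAX = 0.95, 1.05        # per-unit voltage limits (assumed)

def voltage_violation(voltages):
    # Amount by which each load-bus voltage leaves its allowed band.
    return sum(max(0.0, V_MIN - v) + max(0.0, v - V_MAX) for v in voltages)

def penalty_factor(generation, base=100.0, growth=1.05):
    # Simple geometric tightening of the penalty across generations (assumed rule).
    return base * growth ** generation

def fitness(losses_mw, voltages, generation, w_loss=1.0, w_dev=0.5, v_ref=1.0):
    deviation = sum(abs(v - v_ref) for v in voltages)
    violation = voltage_violation(voltages)
    return (w_loss * losses_mw
            + w_dev * deviation
            + penalty_factor(generation) * violation)

if __name__ == "__main__":
    # A candidate whose power-flow solution (assumed given) yields these values:
    print(fitness(losses_mw=4.8, voltages=[1.02, 0.97, 1.06, 0.99], generation=10))
```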

References
Book
01 Sep 1988
TL;DR: In this article, the authors present the computer techniques, mathematical tools, and research results that will enable both students and practitioners to apply genetic algorithms to problems in many fields, including computer programming and mathematics.
Abstract: From the Publisher: This book brings together - in an informal and tutorial fashion - the computer techniques, mathematical tools, and research results that will enable both students and practitioners to apply genetic algorithms to problems in many fields. Major concepts are illustrated with running examples, and major algorithms are illustrated by Pascal computer programs. No prior knowledge of GAs or genetics is assumed, and only a minimum of computer programming and mathematics background is required.

52,797 citations

Book
01 Jan 1992
TL;DR: A tutorial treatment of genetic algorithms and evolution programs: how and why they work, handling constraints, evolution strategies and other methods, and evolution programs for various discrete problems such as the transportation and traveling salesman problems.
Abstract: Contents: 1. GAs: What Are They? 2. GAs: How Do They Work? 3. GAs: Why Do They Work? 4. GAs: Selected Topics. 5. Binary or Float? 6. Fine Local Tuning. 7. Handling Constraints. 8. Evolution Strategies and Other Methods. 9. The Transportation Problem. 10. The Traveling Salesman Problem. 11. Evolution Programs for Various Discrete Problems. 12. Machine Learning. 13. Evolutionary Programming and Genetic Programming. 14. A Hierarchy of Evolution Programs. 15. Evolution Programs and Heuristics. 16. Conclusions. Appendices A-D. References.

12,212 citations

Book
03 Mar 1993
TL;DR: The book is a solid reference for professionals as well as a useful text for students in the fields of operations research, management science, industrial engineering, applied mathematics, and also in engineering disciplines that deal with analytical optimization techniques.
Abstract: COMPREHENSIVE COVERAGE OF NONLINEAR PROGRAMMING THEORY AND ALGORITHMS, THOROUGHLY REVISED AND EXPANDED. "Nonlinear Programming: Theory and Algorithms"--now in an extensively updated Third Edition--addresses the problem of optimizing an objective function in the presence of equality and inequality constraints. Many realistic problems cannot be adequately represented as a linear program owing to the nature of the nonlinearity of the objective function and/or the nonlinearity of any constraints. The Third Edition begins with a general introduction to nonlinear programming with illustrative examples and guidelines for model construction. Concentration on the three major parts of nonlinear programming is provided: convex analysis, with discussion of topological properties of convex sets, separation and support of convex sets, polyhedral sets, extreme points and extreme directions of polyhedral sets, and linear programming; optimality conditions and duality, with coverage of the nature, interpretation, and value of the classical Fritz John (FJ) and the Karush-Kuhn-Tucker (KKT) optimality conditions, the interrelationships between various proposed constraint qualifications, and Lagrangian duality and saddle point optimality conditions; and algorithms and their convergence, with a presentation of algorithms for solving both unconstrained and constrained nonlinear programming problems. Important features of the Third Edition include: new topics such as second interior point methods, nonconvex optimization, nondifferentiable optimization, and more; updated discussion and new applications in each chapter; detailed numerical examples and graphical illustrations; essential coverage of modeling and formulating nonlinear programs; simple numerical problems; and advanced theoretical exercises. The book is a solid reference for professionals as well as a useful text for students in the fields of operations research, management science, industrial engineering, applied mathematics, and also in engineering disciplines that deal with analytical optimization techniques. The logical and self-contained format uniquely covers nonlinear programming techniques with a great depth of information and an abundance of valuable examples and illustrations that showcase the most current advances in nonlinear problems.

6,259 citations

Journal ArticleDOI
TL;DR: GA's population-based approach and ability to make pair-wise comparisons in the tournament selection operator are exploited to devise a penalty function approach that does not require any penalty parameter to guide the search towards the constrained optimum.

3,495 citations
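
The TL;DR above refers to exploiting pairwise comparisons in tournament selection so that no penalty parameter is needed. The commonly cited form of that comparison is sketched below: a feasible solution beats an infeasible one, two feasible solutions are compared on the objective, and two infeasible solutions are compared on total constraint violation. The helper names and the toy problem are ours, not the paper's code.

```python
import random

# Parameter-free constraint handling via pairwise comparison in a binary
# tournament: feasibility first, then objective, then amount of violation.

def total_violation(x, constraints):
    """constraints: iterable of g(x) functions in g(x) <= 0 form."""
    return sum(max(0.0, g(x)) for g in constraints)

def tournament_winner(a, b, objective, constraints):
    va, vb = total_violation(a, constraints), total_violation(b, constraints)
    if va == 0.0 and vb == 0.0:
        return a if objective(a) <= objective(b) else b   # both feasible
    if va == 0.0:
        return a                                          # only a feasible
    if vb == 0.0:
        return b                                          # only b feasible
    return a if va <= vb else b                           # both infeasible

def tournament_select(population, objective, constraints):
    a, b = random.sample(population, 2)
    return tournament_winner(a, b, objective, constraints)

if __name__ == "__main__":
    objective = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 1.0) ** 2
    constraints = [lambda x: x[0] + x[1] - 1.5]           # x0 + x1 <= 1.5
    pop = [[random.uniform(-2, 2), random.uniform(-2, 2)] for _ in range(20)]
    print(tournament_select(pop, objective, constraints))
```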


"Penalty Function Methods for Constr..." refers background in this paper

  • ...These approaches can be grouped in four major categories [28]:
    Category 1: Methods based on penalty functions - Death Penalty [2], Static Penalties [15,20], Dynamic Penalties [16,17], Annealing Penalties [5,24], Adaptive Penalties [10,12,35,37], Segregated GA [21], Co-evolutionary Penalties [8]
    Category 2: Methods based on a search of feasible solutions - Repairing unfeasible individuals [27], Superiority of feasible points [9,32], Behavioral memory [34]...

    [...]

Book
01 Jan 1996
TL;DR: In this work, the author compares the three most prominent representatives of evolutionary algorithms: genetic algorithms, evolution strategies, and evolutionary programming within a unified framework, thereby clarifying the similarities and differences of these methods.

2,679 citations