Journal ArticleDOI

Penalty Function Methods for Constrained Optimization with Genetic Algorithms

01 Apr 2005-Mathematical & Computational Applications (Association for Scientific Research)-Vol. 10, Iss: 1, pp 45-56
TL;DR: This paper presents the penalty-based methods for handling constraints in Genetic Algorithms and discusses their strengths and weaknesses.
Abstract: Genetic Algorithms are most directly suited to unconstrained optimization. Application of Genetic Algorithms to constrained optimization problems is often a challenging effort. Several methods have been proposed for handling constraints. The most common method in Genetic Algorithms to handle constraints is to use penalty functions. In this paper, we present these penalty-based methods and discuss their strengths and weaknesses.
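The penalty idea the abstract summarizes can be sketched in a few lines. This is an illustrative example only, not code from the paper: the toy problem (minimize f(x) = x² subject to x ≥ 1) and the coefficient r are hypothetical choices.

```python
# Minimal sketch of a static penalty function for a GA fitness evaluation.
# Hypothetical problem: minimize f(x) = x^2 subject to g(x) = 1 - x <= 0,
# i.e. x >= 1. The GA minimizes the penalized fitness, not the raw objective.

def objective(x):
    return x * x

def constraint_violation(x):
    # g(x) <= 0 is feasible; positive values measure the violation amount.
    return max(0.0, 1.0 - x)

def penalized_fitness(x, r=1000.0):
    # r is a fixed (static) penalty coefficient -- choosing it well is
    # exactly the difficulty the penalty-method literature discusses.
    return objective(x) + r * constraint_violation(x) ** 2

# Feasible point: no penalty, fitness equals f(x).
print(penalized_fitness(1.5))  # 2.25
# Infeasible point: the penalty term dominates the objective.
print(penalized_fitness(0.5))  # 250.25
```

If r is too small the GA converges to infeasible points; if too large, the penalty swamps the objective and the search stalls at the feasibility boundary — the trade-off that motivates the dynamic and adaptive variants surveyed in the paper.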


Citations
Journal ArticleDOI
TL;DR: In this article, a probabilistic power flow (PPF)-embedded genetic algorithm (GA)-based approach is proposed in order to solve the optimisation problem that is modelled mathematically under a chance constrained programming framework.
Abstract: The scope of this study is the optimal siting and sizing of distributed generation within a power distribution network considering uncertainties. A probabilistic power flow (PPF)-embedded genetic algorithm (GA)-based approach is proposed in order to solve the optimisation problem that is modelled mathematically under a chance constrained programming framework. Point estimate method (PEM) is proposed for the solution of the involved PPF problem. The uncertainties considered include: (i) the future load growth in the power distribution system, (ii) the wind generation, (iii) the output power of photovoltaics, (iv) the fuel costs and (v) the electricity prices. Based on some candidate schemes of different distributed generation types and sizes, placed on specific candidate buses of the network, GA is applied in order to find the optimal plan. The proposed GA with embedded PEM (GA-PEM) is applied on the IEEE 33-bus network by considering several scenarios and is compared with the method of GA with embedded Monte Carlo simulation (GA-MCS). The main conclusions of this comparison are: (i) the proposed GA-PEM is seven times faster than GA-MCS, and (ii) both methods provide almost identical results.

170 citations

Journal ArticleDOI
TL;DR: The basic concepts and implementation of an enhanced PSO algorithm combined with a gradient-based quasi-Newton sequential quadratic programming (SQP) method for handling structural optimization problems are presented.
Abstract: The particle swarm optimization (PSO) method is an instance of a successful application of the philosophy of bounded rationality and decentralized decision making for solving global optimization problems. A number of advantages with respect to other evolutionary algorithms are attributed to PSO making it a prospective candidate for optimum structural design. The PSO-based algorithm is robust and well suited to handle nonlinear, nonconvex design spaces with discontinuities, exhibiting fast convergence characteristics. Furthermore, hybrid algorithms can exploit the advantages of the PSO and gradient methods. This article presents in detail the basic concepts and implementation of an enhanced PSO algorithm combined with a gradient-based quasi-Newton sequential quadratic programming (SQP) method for handling structural optimization problems. The proposed PSO is shown to explore the design space thoroughly and to detect the neighborhood of the global optimum. Then the mathematical optimizer, starting from the best estimate of the PSO and using gradient information, accelerates convergence toward the global optimum. A nonlinear weight update rule for PSO and a simple, yet effective, constraint handling technique for structural optimization are also proposed. The performance, the functionality, and the effect of different setting parameters are studied. The effectiveness of the approach is illustrated in some benchmark structural optimization problems. The numerical results confirm the ability of the proposed methodology to find better optimal solutions for structural optimization problems than other optimization algorithms.
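The two-stage strategy the abstract describes — global exploration by PSO, then gradient-based refinement from the PSO's best estimate — can be sketched as follows. This is a hedged illustration, not the paper's implementation: the objective f(x) = (x − 3)², the swarm parameters, and the use of plain gradient descent in place of the SQP stage are all assumptions for demonstration.

```python
import random

# Stage 1: a tiny PSO locates the neighborhood of the optimum.
# Stage 2: gradient descent (standing in for the quasi-Newton SQP stage)
# refines the PSO's best estimate. Toy objective: f(x) = (x - 3)^2.

def f(x):
    return (x - 3.0) ** 2

def pso_stage(iters=50, swarm=10, seed=0):
    rng = random.Random(seed)
    xs = [rng.uniform(-10.0, 10.0) for _ in range(swarm)]
    vs = [0.0] * swarm
    pbest = list(xs)                      # personal best positions
    gbest = min(xs, key=f)                # global best position
    for _ in range(iters):
        for i in range(swarm):
            r1, r2 = rng.random(), rng.random()
            # Standard velocity update: inertia + cognitive + social terms.
            vs[i] = (0.7 * vs[i]
                     + 1.5 * r1 * (pbest[i] - xs[i])
                     + 1.5 * r2 * (gbest - xs[i]))
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
            if f(xs[i]) < f(gbest):
                gbest = xs[i]
    return gbest

def gradient_refine(x, steps=200, lr=0.1):
    # Analytic gradient of the toy objective; a real SQP step would also
    # use second-order (curvature) information.
    for _ in range(steps):
        x -= lr * 2.0 * (x - 3.0)
    return x

x_star = gradient_refine(pso_stage())
assert abs(x_star - 3.0) < 1e-6
```

The division of labor mirrors the article's argument: the population-based stage is good at escaping local basins, while the gradient stage converges quickly once inside the right basin.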

149 citations


Additional excerpts

  • ...Yeniay (2005) examined various penalty function methods for GA, highlighting the strengths and weaknesses of each method....

    [...]

Journal ArticleDOI
TL;DR: This paper proposes a novel dynamic S-type soft-threshold penalty method, which is mainly comprised of two steps: parameter iteration and solution iteration, to solve generalized constrained optimization problems, particularly for structure design optimization problems.

144 citations


Cites background or methods from "Penalty Function Methods for Constr..."

  • ...We also compare our dynamic penalty method with the dynamic penalty method in [28] in which the dynamic penalty factor is generated through a polynomial function rather than our exponential formula in (8)....

    [...]

  • ...For the test problem g07, the numerical results obtained by D-DS as well as by the penalty method in [28] integrated with the DS algorithm are depicted in Fig....

    [...]

  • ...Different from the existing dynamic penalty method [28,29], we will introduce a novel S-type function to generate the dynamic penalty parameter....

    [...]

  • ...Different from the dynamic methods in the literature [28], a new dynamic penalty function method is introduced in this paper....

    [...]

  • ...6 clearly shows that our proposed dynamic penalty method can capture an optimal solution faster than that by the dynamic method in [28] while keeping smaller violations....

    [...]
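The excerpts above contrast two ways of growing the dynamic penalty factor over the run: a polynomial schedule (attributed to the method in [28]) versus an exponential one (the citing paper's formula). A sketch of the contrast, where the exact functional forms and constants are illustrative assumptions, not taken from either paper:

```python
import math

# Two families of dynamic penalty schedules: the penalty coefficient
# grows with the generation counter t, tightening feasibility pressure
# as the run progresses. Forms and constants are illustrative only.

def polynomial_factor(t, c=0.5, alpha=2.0):
    # Polynomial growth law: (c * t) ** alpha.
    return (c * t) ** alpha

def exponential_factor(t, r0=1.0, beta=0.1):
    # Exponential growth law: r0 * exp(beta * t).
    return r0 * math.exp(beta * t)

def penalized(objective_value, violation, t, factor=polynomial_factor):
    # Generic dynamic-penalty fitness: early generations tolerate
    # infeasibility, later generations punish it heavily.
    return objective_value + factor(t) * violation

# At generation 10 the polynomial factor is (0.5 * 10) ** 2 = 25,
# so a unit violation adds 25 to the fitness.
print(penalized(2.0, 1.0, 10))  # 27.0
```

The exponential schedule overtakes the polynomial one for large t, which is consistent with the excerpts' claim of faster convergence to feasible optima at the cost of needing smaller growth constants.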

Journal ArticleDOI
01 Sep 2016
TL;DR: The MOSOS is combined with an adaptive penalty function to handle the equality and inequality constraints associated with the problems, and the results reveal the superior performance of the proposed algorithm over multi-objective colliding bodies optimization (MOCBO), multi-objective particle swarm optimization (MOPSO), the non-dominated sorting genetic algorithm II (NSGA-II), and two gradient-based multi-objective algorithms, Multi-Gradient Explorer (MGE) and Multi-Gradient Pathfinder (MGP).
Abstract: Graphical abstract omitted. Highlights: a new multi-objective Symbiotic Organisms Search algorithm is proposed; performance is validated on 12 unconstrained and 6 constrained problems; real-life applications are demonstrated on constrained truss design problems. Many real-world engineering optimization problems are multi-modal and associated with constraints. Multi-modal problems involve local optima, so conventional derivative-based algorithms are not able to effectively determine the global optimum. The complexity of the problem increases when two or more objective functions, each associated with certain constraints, must be optimized simultaneously. In 2014, Cheng and Prayogo proposed a new metaheuristic optimization algorithm known as Symbiotic Organisms Search (SOS). The algorithm is inspired by the interaction strategies adopted by living organisms to survive and propagate in the ecosystem; the concept aims to achieve optimal survivability in the ecosystem by considering the harm and benefits received from other organisms. In this manuscript the SOS algorithm is formulated to solve multi-objective problems (termed MOSOS). The MOSOS is combined with an adaptive penalty function to handle the equality and inequality constraints associated with the problems. Extensive simulation studies are carried out on twelve unconstrained and six constrained benchmark multi-objective functions. The results obtained over fifty independent runs reveal the superior performance of the proposed algorithm over multi-objective colliding bodies optimization (MOCBO), multi-objective particle swarm optimization (MOPSO), the non-dominated sorting genetic algorithm II (NSGA-II), and two gradient-based multi-objective algorithms, Multi-Gradient Explorer (MGE) and Multi-Gradient Pathfinder (MGP). The engineering applications of the proposed algorithm are demonstrated by solving two constrained truss design problems.

138 citations

Journal ArticleDOI
TL;DR: In this paper, a multi-objective optimisation problem with the net present value (NPV), initial investment, energy target and payback period as constraints is formulated using genetic algorithms (GAs).

130 citations


Additional excerpts

  • ...GA is initially designed to handle unconstrained optimisation problems, there is a need to use additional tools to keep the solutions within the feasible domain [16]....

    [...]

References
Book
01 Sep 1988
TL;DR: In this article, the authors present the computer techniques, mathematical tools, and research results that will enable both students and practitioners to apply genetic algorithms to problems in many fields, including computer programming and mathematics.
Abstract: From the Publisher: This book brings together, in an informal and tutorial fashion, the computer techniques, mathematical tools, and research results that will enable both students and practitioners to apply genetic algorithms to problems in many fields. Major concepts are illustrated with running examples, and major algorithms are illustrated by Pascal computer programs. No prior knowledge of GAs or genetics is assumed, and only a minimum of computer programming and mathematics background is required.

52,797 citations

Book
01 Jan 1992
TL;DR: This book covers GAs and evolution programs — what they are, how and why they work — through constraint handling, evolution strategies and other methods, and evolution programs for various discrete problems such as the transportation problem and the traveling salesman problem.
Abstract: 1 GAs: What Are They?.- 2 GAs: How Do They Work?.- 3 GAs: Why Do They Work?.- 4 GAs: Selected Topics.- 5 Binary or Float?.- 6 Fine Local Tuning.- 7 Handling Constraints.- 8 Evolution Strategies and Other Methods.- 9 The Transportation Problem.- 10 The Traveling Salesman Problem.- 11 Evolution Programs for Various Discrete Problems.- 12 Machine Learning.- 13 Evolutionary Programming and Genetic Programming.- 14 A Hierarchy of Evolution Programs.- 15 Evolution Programs and Heuristics.- 16 Conclusions.- Appendix A.- Appendix B.- Appendix C.- Appendix D.- References.

12,212 citations

Book
03 Mar 1993
TL;DR: The book is a solid reference for professionals as well as a useful text for students in the fields of operations research, management science, industrial engineering, applied mathematics, and also in engineering disciplines that deal with analytical optimization techniques.
Abstract: COMPREHENSIVE COVERAGE OF NONLINEAR PROGRAMMING THEORY AND ALGORITHMS, THOROUGHLY REVISED AND EXPANDED. "Nonlinear Programming: Theory and Algorithms", now in an extensively updated Third Edition, addresses the problem of optimizing an objective function in the presence of equality and inequality constraints. Many realistic problems cannot be adequately represented as a linear program owing to the nature of the nonlinearity of the objective function and/or the nonlinearity of any constraints. The Third Edition begins with a general introduction to nonlinear programming with illustrative examples and guidelines for model construction. Concentration on the three major parts of nonlinear programming is provided:
  • Convex analysis, with discussion of topological properties of convex sets, separation and support of convex sets, polyhedral sets, extreme points and extreme directions of polyhedral sets, and linear programming
  • Optimality conditions and duality, with coverage of the nature, interpretation, and value of the classical Fritz John (FJ) and the Karush-Kuhn-Tucker (KKT) optimality conditions; the interrelationships between various proposed constraint qualifications; and Lagrangian duality and saddle point optimality conditions
  • Algorithms and their convergence, with a presentation of algorithms for solving both unconstrained and constrained nonlinear programming problems
Important features of the Third Edition include:
  • New topics such as second interior point methods, nonconvex optimization, nondifferentiable optimization, and more
  • Updated discussion and new applications in each chapter
  • Detailed numerical examples and graphical illustrations
  • Essential coverage of modeling and formulating nonlinear programs
  • Simple numerical problems
  • Advanced theoretical exercises
The book is a solid reference for professionals as well as a useful text for students in the fields of operations research, management science, industrial engineering, applied mathematics, and also in engineering disciplines that deal with analytical optimization techniques. The logical and self-contained format uniquely covers nonlinear programming techniques with a great depth of information and an abundance of valuable examples and illustrations that showcase the most current advances in nonlinear problems.

6,259 citations

Journal ArticleDOI
TL;DR: GA's population-based approach and ability to make pair-wise comparison in tournament selection operator are exploited to devise a penalty function approach that does not require any penalty parameter to guide the search towards the constrained optimum.

3,495 citations


"Penalty Function Methods for Constr..." refers background in this paper

  • ...These approaches can be grouped in four major categories [28]:
    Category 1: Methods based on penalty functions - Death Penalty [2] - Static Penalties [15,20] - Dynamic Penalties [16,17] - Annealing Penalties [5,24] - Adaptive Penalties [10,12,35,37] - Segregated GA [21] - Co-evolutionary Penalties [8]
    Category 2: Methods based on a search of feasible solutions - Repairing unfeasible individuals [27] - Superiority of feasible points [9,32] - Behavioral memory [34]...

    [...]
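The simplest member of Category 1 in the taxonomy above, the "death penalty," discards infeasible individuals outright rather than penalizing them. A minimal sketch, on a hypothetical toy problem (feasibility means x + y ≤ 1 on the unit square):

```python
import random

# "Death penalty" constraint handling: infeasible candidates are simply
# rejected, so the population only ever contains feasible individuals.
# Its weakness, noted in the survey, is wasted sampling effort when the
# feasible region is small. Problem and bounds are hypothetical.

def feasible(ind):
    x, y = ind
    return x + y <= 1.0   # toy linear constraint

def sample_feasible_population(size, rng):
    population = []
    while len(population) < size:
        candidate = (rng.uniform(0.0, 1.0), rng.uniform(0.0, 1.0))
        if feasible(candidate):          # death penalty: reject outright
            population.append(candidate)
    return population

pop = sample_feasible_population(20, random.Random(42))
assert all(feasible(ind) for ind in pop)
```

Here half the unit square is feasible, so rejection is cheap; the penalty-function categories in the list exist precisely because rejection becomes hopeless when feasible regions are thin or disconnected.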

Book
01 Jan 1996
TL;DR: In this work, the author compares the three most prominent representatives of evolutionary algorithms: genetic algorithms, evolution strategies, and evolutionary programming within a unified framework, thereby clarifying the similarities and differences of these methods.

2,679 citations