Journal ArticleDOI

Penalty Function Methods for Constrained Optimization with Genetic Algorithms

01 Apr 2005-Mathematical & Computational Applications (Association for Scientific Research)-Vol. 10, Iss: 1, pp 45-56
TL;DR: Penalty-based methods for handling constraints in Genetic Algorithms are presented, and their strengths and weaknesses are discussed.
Abstract: Genetic Algorithms are most directly suited to unconstrained optimization. Application of Genetic Algorithms to constrained optimization problems is often a challenging effort. Several methods have been proposed for handling constraints. The most common method in Genetic Algorithms to handle constraints is to use penalty functions. In this paper, we present these penalty-based methods and discuss their strengths and weaknesses.
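The penalty-function idea summarized above replaces the constrained problem with an unconstrained one whose fitness adds a penalty for constraint violation. A minimal sketch of a static penalty, assuming a hypothetical objective, constraints and fixed weight R (none of these are taken from the paper):

```python
# Minimal sketch of a static penalty function for a GA fitness evaluation.
# The objective, constraint functions and penalty weight R are illustrative
# assumptions, not values from the paper.

def constraint_violations(x):
    """Return the amount by which each inequality constraint g(x) <= 0 is violated."""
    g = [x[0] + x[1] - 10.0,   # example constraint: x0 + x1 <= 10
         -x[0]]                # example constraint: x0 >= 0
    return [max(0.0, gi) for gi in g]

def penalized_fitness(x, R=1000.0):
    """Objective plus a fixed penalty weight times the squared total violation."""
    objective = (x[0] - 2.0) ** 2 + (x[1] - 3.0) ** 2   # example objective to minimize
    return objective + R * sum(v ** 2 for v in constraint_violations(x))
```

With a static scheme like this, the GA minimizes `penalized_fitness` directly; the main weakness the paper discusses is that a single fixed R must be tuned per problem.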


Citations
Book
17 Feb 2014
TL;DR: This book can serve as an introduction for graduate students, doctoral students and lecturers in computer science, engineering and the natural sciences; researchers, engineers and experienced experts will also find it a handy reference.
Abstract: Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background and practical implementation, complements extensive literature with well-chosen case studies to illustrate how these algorithms work. Topics include particle swarm optimization, ant and bee algorithms, simulated annealing, cuckoo search, firefly algorithm, bat algorithm, flower algorithm, harmony search, algorithm analysis, constraint handling, hybrid methods, parameter tuning and control, as well as multi-objective optimization. This book can serve as an introductory book for graduates, doctoral students and lecturers in computer science, engineering and natural sciences. It can also serve as a source of inspiration for new applications. Researchers and engineers as well as experienced experts will also find it a handy reference. Discusses and summarizes the latest developments in nature-inspired algorithms with comprehensive, timely literature. Provides a theoretical understanding as well as practical implementation hints. Provides a step-by-step introduction to each algorithm.

901 citations

Journal ArticleDOI
TL;DR: The ABC algorithm is applied to engineering design problems by extending the basic algorithm with a constraint handling technique in the selection step, so that feasible regions of the search space are preferred.
Abstract: Engineering design problems are generally large scale, nonlinear or constrained optimization problems. The Artificial Bee Colony (ABC) algorithm is a successful tool for optimizing unconstrained problems. In this work, the ABC algorithm is used to solve large scale optimization problems, and it is applied to engineering design problems by extending the basic ABC algorithm with a constraint handling technique in the selection step, so that feasible regions of the search space are preferred. Nine well-known large scale unconstrained test problems and five well-known constrained engineering problems are solved using the ABC algorithm, and its performance is compared against those of state-of-the-art algorithms.
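A selection step that prefers feasible regions, as described above, is often built on a pairwise comparison in the spirit of Deb's feasibility rules; the sketch below is an illustrative assumption, not necessarily the exact rule used in the ABC paper:

```python
# Sketch of a feasibility-preferring pairwise comparison for a selection step
# (in the spirit of Deb's rules; the exact rule in the ABC paper may differ).

def total_violation(g_values):
    """Total violation of inequality constraints g(x) <= 0."""
    return sum(max(0.0, g) for g in g_values)

def better(a, b):
    """Return True if candidate a should be preferred over candidate b.
    Each candidate is (objective_value, list_of_g_values), minimization assumed."""
    va, vb = total_violation(a[1]), total_violation(b[1])
    if va == 0.0 and vb == 0.0:
        return a[0] < b[0]     # both feasible: smaller objective wins
    if va == 0.0 or vb == 0.0:
        return va == 0.0       # a feasible solution beats an infeasible one
    return va < vb             # both infeasible: smaller total violation wins
```

Unlike penalty functions, this comparison needs no weight parameter, which is one reason such rules are popular in selection-based constraint handling.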

468 citations


Cites background from "Penalty Function Methods for Constr..."

  • ...Although the use of penalty functions is very common due to their simplicity and direct applicability (Smith and Coit 1997; Coello Coello 1999; Yeniay 2005; Parsopoulos and Vrahatis 2002), they have several drawbacks, too....


Journal ArticleDOI
TL;DR: The numerical results demonstrate that constrained blended BBO outperforms SGA and performs similarly to SPSO 07 for constrained single-objective optimization problems.

251 citations


Cites background from "Penalty Function Methods for Constr..."

  • ...However, penalty functions have several limitations and problems which are difficult to deal with (Smith and Coit, 1997; Yeniay, 2005), including the difficulty of tuning the penalty parameters....


Book ChapterDOI
27 Aug 2005
TL;DR: The performance of the recently proposed Unified Particle Swarm Optimization method is investigated on constrained engineering optimization problems; a penalty function approach is employed and the algorithm is modified to preserve feasibility of the encountered solutions.
Abstract: We investigate the performance of the recently proposed Unified Particle Swarm Optimization method on constrained engineering optimization problems. For this purpose, a penalty function approach is employed and the algorithm is modified to preserve feasibility of the encountered solutions. The algorithm is illustrated on four well-known engineering problems with promising results. Comparisons with the standard local and global variants of Particle Swarm Optimization are reported and discussed.

237 citations

Journal ArticleDOI
TL;DR: Two novel extensions of the well-known ant colony optimization (ACO) framework are introduced, which allow the solution of mixed integer nonlinear programs (MINLPs); a hybrid implementation based on this extended ACO framework, specially developed for complex non-convex MINLPs, is also presented.

201 citations


Cites background or methods from "Penalty Function Methods for Constr..."

  • ...A wide range of modifications of this method is known and comprehensive reviews can be found in Coello [11] or Yeniay [35]....

  • ...Sophisticated penalty methods (e.g. adaptive or annealing, see Yeniay [35]) are more powerful and adjustable to a specific problem due to their larger number of parameters....

  • ...In general, it should be noted that simple penalty methods (e.g. death or static, see Yeniay [35]) do not require many problem-specific parameters to be selected, which makes their use and implementation very easy and popular....
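The simple schemes named in the excerpts above, death and static penalties, can be sketched side by side; the objective and constraint here are illustrative assumptions:

```python
# Sketch contrasting two simple penalty schemes: the death penalty discards
# infeasible individuals outright, the static penalty scales the violation by
# a fixed weight. Objective and constraint are illustrative assumptions.
import math

def objective(x):
    return x * x

def g(x):
    """Feasible when g(x) <= 0, i.e. x >= 1."""
    return 1.0 - x

def death_penalty_fitness(x):
    """Death penalty: infeasible individuals get the worst possible fitness."""
    return objective(x) if g(x) <= 0.0 else math.inf

def static_penalty_fitness(x, R=100.0):
    """Static penalty: a single fixed weight R is the only parameter to tune."""
    return objective(x) + R * max(0.0, g(x))
```

The death penalty needs no parameters at all but wastes the information in infeasible individuals; the static penalty keeps that information at the cost of tuning R.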

References
Book
01 Jan 2004
TL;DR: In this book, the authors present a set of heuristics for solving problems with probability and statistics, including the Traveling Salesman Problem and the Problem of Who Owns the Zebra.
Abstract: I What Are the Ages of My Three Sons?.- 1 Why Are Some Problems Difficult to Solve?.- II How Important Is a Model?.- 2 Basic Concepts.- III What Are the Prices in 7-11?.- 3 Traditional Methods - Part 1.- IV What Are the Numbers?.- 4 Traditional Methods - Part 2.- V What's the Color of the Bear?.- 5 Escaping Local Optima.- VI How Good Is Your Intuition?.- 6 An Evolutionary Approach.- VII One of These Things Is Not Like the Others.- 7 Designing Evolutionary Algorithms.- VIII What Is the Shortest Way?.- 8 The Traveling Salesman Problem.- IX Who Owns the Zebra?.- 9 Constraint-Handling Techniques.- X Can You Tune to the Problem?.- 10 Tuning the Algorithm to the Problem.- XI Can You Mate in Two Moves?.- 11 Time-Varying Environments and Noise.- XII Day of the Week of January 1st.- 12 Neural Networks.- XIII What Was the Length of the Rope?.- 13 Fuzzy Systems.- XIV Everything Depends on Something Else.- 14 Coevolutionary Systems.- XV Who's Taller?.- 15 Multicriteria Decision-Making.- XVI Do You Like Simple Solutions?.- 16 Hybrid Systems.- 17 Summary.- Appendix A: Probability and Statistics.- A.1 Basic concepts of probability.- A.2 Random variables.- A.2.1 Discrete random variables.- A.2.2 Continuous random variables.- A.3 Descriptive statistics of random variables.- A.4 Limit theorems and inequalities.- A.5 Adding random variables.- A.6 Generating random numbers on a computer.- A.7 Estimation.- A.8 Statistical hypothesis testing.- A.9 Linear regression.- A.10 Summary.- Appendix B: Problems and Projects.- B.1 Trying some practical problems.- B.2 Reporting computational experiments with heuristic methods.- References.

2,089 citations

Journal ArticleDOI
TL;DR: A comprehensive survey of the most popular constraint-handling techniques currently used with evolutionary algorithms, including approaches that range from simple variations of a penalty function to more sophisticated ones that are biologically inspired, emulating the immune system, culture or ant colonies.

1,924 citations

Journal ArticleDOI
TL;DR: Difficulties connected with solving the general nonlinear programming problem are discussed; several approaches that have emerged in the evolutionary computation community are surveyed; and a set of 11 interesting test cases is provided that may serve as a handy reference for future methods.
Abstract: Evolutionary computation techniques have received a great deal of attention regarding their potential as optimization techniques for complex numerical functions. However, they have not produced a significant breakthrough in the area of nonlinear programming due to the fact that they have not addressed the issue of constraints in a systematic way. Only recently have several methods been proposed for handling nonlinear constraints by evolutionary algorithms for numerical optimization problems; however, these methods have several drawbacks, and the experimental results on many test cases have been disappointing. In this paper we (1) discuss difficulties connected with solving the general nonlinear programming problem; (2) survey several approaches that have emerged in the evolutionary computation community; and (3) provide a set of 11 interesting test cases that may serve as a handy reference for future methods.

1,620 citations


"Penalty Function Methods for Constr..." refers background in this paper

  • ...These approaches can be grouped in four major categories [28]:
    Category 1: Methods based on penalty functions - Death Penalty [2], Static Penalties [15,20], Dynamic Penalties [16,17], Annealing Penalties [5,24], Adaptive Penalties [10,12,35,37], Segregated GA [21], Co-evolutionary Penalties [8]
    Category 2: Methods based on a search of feasible solutions - Repairing unfeasible individuals [27], Superiority of feasible points [9,32], Behavioral memory [34]...

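Among the Category 1 variants listed above, dynamic penalties grow the penalty weight with the generation counter. A sketch in the style often attributed to Joines and Houck; the parameter values C, alpha and beta are illustrative assumptions, not values from the paper:

```python
# Sketch of a dynamic penalty: the weight (C*t)^alpha increases with the
# generation counter t, tightening pressure toward feasibility as the run
# progresses. C, alpha and beta are illustrative parameter choices.

def dynamic_penalty_fitness(objective_value, violations, t, C=0.5, alpha=2.0, beta=2.0):
    """Objective plus a generation-dependent penalty on constraint violations."""
    total = sum(max(0.0, v) ** beta for v in violations)
    return objective_value + (C * t) ** alpha * total
```

Early in the run the penalty is mild, letting the GA explore infeasible regions; later generations are pushed firmly toward feasibility.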

Journal ArticleDOI
TL;DR: The notion of using co-evolution to adapt the penalty factors of a fitness function incorporated in a genetic algorithm (GA) for numerical optimization is introduced.

1,096 citations


"Penalty Function Methods for Constr..." refers background or methods in this paper

  • ...Co-evolutionary Penalties Coello [8] developed a method of co-evolutionary penalties that splits the penalty into two values, so that the GA has enough information about both the number of constraints violated and the amount of violation....

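The two-part penalty described in the excerpt above can be sketched as follows; the weights w1 and w2 are illustrative, whereas in the co-evolutionary scheme they would themselves be evolved by a second population:

```python
# Sketch of a two-part penalty: the fitness carries both the number of
# violated constraints and the total amount of violation. The weights w1
# and w2 are illustrative assumptions; in Coello's co-evolutionary scheme
# they are adapted by a co-evolving population rather than fixed.

def two_part_penalized_fitness(objective_value, violations, w1=10.0, w2=100.0):
    """Objective plus separate penalties for violation count and violation amount."""
    amounts = [max(0.0, v) for v in violations]
    num_violated = sum(1 for v in amounts if v > 0.0)
    return objective_value + w1 * num_violated + w2 * sum(amounts)
```

Separating the count from the amount gives the search distinct signals for "how many" constraints fail and "how badly" they fail.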

  • ...These approaches can be grouped in four major categories [28]:
    Category 1: Methods based on penalty functions - Death Penalty [2], Static Penalties [15,20], Dynamic Penalties [16,17], Annealing Penalties [5,24], Adaptive Penalties [10,12,35,37], Segregated GA [21], Co-evolutionary Penalties [8]
    Category 2: Methods based on a search of feasible solutions - Repairing unfeasible individuals [27], Superiority of feasible points [9,32], Behavioral memory [34]
    Category 3: Methods based on preserving feasibility of solutions -...


01 Jan 1991
TL;DR: Evolution Strategies are algorithms which imitate the principles of natural evolution as a method to solve parameter optimization problems; the self-adaptation of the strategy parameters for the mutation variances as well as their covariances is described.
Abstract: Similar to Genetic Algorithms, Evolution Strategies (ESs) are algorithms which imitate the principles of natural evolution as a method to solve parameter optimization problems. The development of Evolution Strategies from the first mutation-selection scheme to the refined (μ,λ)-ES, including the general concept of self-adaptation of the strategy parameters for the mutation variances as well as their covariances, is described.

876 citations


"Penalty Function Methods for Constr..." refers background in this paper

  • ...These approaches can be grouped in four major categories [28]:
    Category 1: Methods based on penalty functions - Death Penalty [2], Static Penalties [15,20], Dynamic Penalties [16,17], Annealing Penalties [5,24], Adaptive Penalties [10,12,35,37], Segregated GA [21], Co-evolutionary Penalties [8]
    Category 2: Methods based on a search of feasible solutions - Repairing unfeasible individuals [27], Superiority of feasible points [9,32], Behavioral memory [34]
    Category 3: Methods based on preserving feasibility of solutions -...
