Journal ArticleDOI

Penalty Function Methods for Constrained Optimization with Genetic Algorithms

01 Apr 2005-Mathematical & Computational Applications (Association for Scientific Research)-Vol. 10, Iss: 1, pp 45-56
TL;DR: Penalty-based methods for handling constraints in Genetic Algorithms are presented, and their strengths and weaknesses are discussed.
Abstract: Genetic Algorithms are most directly suited to unconstrained optimization. Application of Genetic Algorithms to constrained optimization problems is often a challenging effort. Several methods have been proposed for handling constraints. The most common method in Genetic Algorithms to handle constraints is to use penalty functions. In this paper, we present these penalty-based methods and discuss their strengths and weaknesses.
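The static penalty approach, the simplest of the methods the abstract refers to, adds a weighted measure of constraint violation to the objective. A minimal sketch follows; the objective, constraints, and penalty weight here are illustrative examples, not taken from the paper:

```python
# Minimal sketch of a static penalty method for a GA fitness function.
# Illustrative problem: minimize f(x) subject to inequality constraints
# g_i(x) <= 0. Violations are added to the objective with a fixed weight r.

def objective(x):
    # Example objective: sphere function
    return sum(xi ** 2 for xi in x)

def constraints(x):
    # Example inequality constraints, expressed as g_i(x) <= 0
    return [1.0 - sum(x),   # requires sum(x) >= 1
            x[0] - 5.0]     # requires x[0] <= 5

def penalized_fitness(x, r=1000.0):
    # Static penalty: objective plus r times the sum of squared violations.
    # Feasible points are unaffected; infeasible points are made unattractive.
    violation = sum(max(0.0, g) ** 2 for g in constraints(x))
    return objective(x) + r * violation
```

The GA then minimizes `penalized_fitness` as if the problem were unconstrained; choosing the weight `r` well is the known weakness of this approach.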


Citations
Journal ArticleDOI
TL;DR: In this article, an integrated model is presented to determine an optimal recovery plan for a manufacturer, which is to maximize its recovery value when producing a remanufactured product by considering practical constraints of the manufacturing lead-time, waste and quality as a whole.

11 citations

Journal ArticleDOI
TL;DR: This study proposes a context-aware path planning mechanism based on spatial conceptual map (SCM) and genetic algorithm (GA), referred to as UbiPaPaGo, which attempts to automatically find the best path that satisfies the requirements of an individual user.
Abstract: The increased prevalence of digital devices with communication capability heralds the era of ubiquitous computing, as predicted by Mark Weiser. Ubiquitous computing aims to provide users with intelligent human-centric context-aware services at anytime anywhere. Optimal path planning in a ubiquitous network considers the needs of users and the surrounding context. This approach is very different from that applied by existing research on car navigation and mobile robots. This study proposes a context-aware path planning mechanism based on spatial conceptual map (SCM) and genetic algorithm (GA), referred to as UbiPaPaGo. The SCM model is adopted to represent the real map of the surrounding environment. The optimal path is planned using a GA, which is a robust metaheuristic algorithm. UbiPaPaGo attempts to automatically find the best path that satisfies the requirements of an individual user. A prototype of UbiPaPaGo is implemented to demonstrate its feasibility and scalability. Experimental results validate the effectiveness and the efficiency of UbiPaPaGo in finding the optimal path.

11 citations


Cites background from "Penalty Function Methods for Constr..."

  • ...The concept of penalty function is adopted to handle the constraints (Yeniay, 2005)....


Journal ArticleDOI
TL;DR: A new efficient interval partitioning approach to solve constrained global optimization problems is proposed, which involves a new parallel subdivision direction selection method as well as an adaptive tree search that results in improved performance across standard solution and computational indicators when compared to previously proposed techniques.
Abstract: A new efficient interval partitioning approach to solve constrained global optimization problems is proposed. This involves a new parallel subdivision direction selection method as well as an adaptive tree search. The latter explores nodes (intervals in variable domains) using a restricted hybrid depth-first and best-first branching strategy. This hybrid approach is also used for activating local search to identify feasible stationary points. The new tree search management technique results in improved performance across standard solution and computational indicators when compared to previously proposed techniques. On the other hand, the new parallel subdivision direction selection rule detects infeasible and suboptimal boxes earlier than existing rules, and this contributes to performance by enabling earlier reliable deletion of such subintervals from the search space.

11 citations

Journal ArticleDOI
TL;DR: Results show that the CPS framework can reliably drive the aeroelastic specimen to the optimal solution which minimizes stiffness while satisfying multiple acceleration and drift constraints.

11 citations

Journal ArticleDOI
TL;DR: This work describes a new genetic algorithm for continuous and pattern-free heliostat field optimization that relies on elitism, uniform crossover, static penalization of infeasibility, and tournament selection and has been adapted to run in parallel on shared-memory environments.
Abstract: The heliostat field of solar power tower plants can suppose up to 50% of investment costs and 40% of energy loss. Unfortunately, obtaining an optimal field requires facing a complex non-convex, continuous, large-scale, and constrained optimization problem. Although pattern-based layouts and iterative deployment are popular heuristics to simplify the problem, they limit flexibility and might be suboptimal. This work describes a new genetic algorithm for continuous and pattern-free heliostat field optimization. Considering the potential computational cost of the objective function and the necessity of broad explorations, it has been adapted to run in parallel on shared-memory environments. It relies on elitism, uniform crossover, static penalization of infeasibility, and tournament selection. Interesting experimental results show an optimization speedup up to 15 $$\times $$ with 16 threads. It could approximately reduce a one year runtime, at complete optimization, to a month only. The optimizer has also been made available as a generic C++ library.
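Of the GA operators this abstract names, uniform crossover is easy to illustrate: each gene of the child is drawn from either parent with equal probability. A generic real-coded sketch (not the paper's implementation) might look like:

```python
import random

# Sketch of uniform crossover for a real-coded GA chromosome:
# each gene is copied from either parent with probability 0.5.

def uniform_crossover(parent_a, parent_b, rng=random):
    return [a if rng.random() < 0.5 else b
            for a, b in zip(parent_a, parent_b)]
```

Passing a seeded `random.Random` instance as `rng` makes the operator reproducible, which helps when comparing parallel runs.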

10 citations


Cites background from "Penalty Function Methods for Constr..."

  • ...However, GA are mainly aimed at unconstrained optimization [29] and, as the target problem is constrained, some adaptations must be incorporated....


  • ...This strategy, which is quite common to handle constrained optimization problems with GA, is called ‘static penalization’ [29]....


References
Book
01 Sep 1988
TL;DR: In this book, the authors present the computer techniques, mathematical tools, and research results that enable both students and practitioners to apply genetic algorithms to problems in many fields.
Abstract: From the Publisher: This book brings together - in an informal and tutorial fashion - the computer techniques, mathematical tools, and research results that will enable both students and practitioners to apply genetic algorithms to problems in many fields. Major concepts are illustrated with running examples, and major algorithms are illustrated by Pascal computer programs. No prior knowledge of GAs or genetics is assumed, and only a minimum of computer programming and mathematics background is required.

52,797 citations

Book
01 Jan 1992
TL;DR: This book introduces genetic algorithms and evolution programs, covering constraint handling, evolution strategies, and applications to discrete problems such as the transportation and traveling salesman problems.
Abstract: 1 GAs: What Are They?.- 2 GAs: How Do They Work?.- 3 GAs: Why Do They Work?.- 4 GAs: Selected Topics.- 5 Binary or Float?.- 6 Fine Local Tuning.- 7 Handling Constraints.- 8 Evolution Strategies and Other Methods.- 9 The Transportation Problem.- 10 The Traveling Salesman Problem.- 11 Evolution Programs for Various Discrete Problems.- 12 Machine Learning.- 13 Evolutionary Programming and Genetic Programming.- 14 A Hierarchy of Evolution Programs.- 15 Evolution Programs and Heuristics.- 16 Conclusions.- Appendix A.- Appendix B.- Appendix C.- Appendix D.- References.

12,212 citations

Book
03 Mar 1993
TL;DR: The book is a solid reference for professionals as well as a useful text for students in the fields of operations research, management science, industrial engineering, applied mathematics, and also in engineering disciplines that deal with analytical optimization techniques.
Abstract: COMPREHENSIVE COVERAGE OF NONLINEAR PROGRAMMING THEORY AND ALGORITHMS, THOROUGHLY REVISED AND EXPANDED. "Nonlinear Programming: Theory and Algorithms" - now in an extensively updated Third Edition - addresses the problem of optimizing an objective function in the presence of equality and inequality constraints. Many realistic problems cannot be adequately represented as a linear program owing to the nature of the nonlinearity of the objective function and/or the nonlinearity of any constraints. The Third Edition begins with a general introduction to nonlinear programming with illustrative examples and guidelines for model construction.

Concentration on the three major parts of nonlinear programming is provided:

  • Convex analysis, with discussion of topological properties of convex sets, separation and support of convex sets, polyhedral sets, extreme points and extreme directions of polyhedral sets, and linear programming
  • Optimality conditions and duality, with coverage of the nature, interpretation, and value of the classical Fritz John (FJ) and the Karush-Kuhn-Tucker (KKT) optimality conditions; the interrelationships between various proposed constraint qualifications; and Lagrangian duality and saddle point optimality conditions
  • Algorithms and their convergence, with a presentation of algorithms for solving both unconstrained and constrained nonlinear programming problems

Important features of the Third Edition include:

  • New topics such as second interior point methods, nonconvex optimization, nondifferentiable optimization, and more
  • Updated discussion and new applications in each chapter
  • Detailed numerical examples and graphical illustrations
  • Essential coverage of modeling and formulating nonlinear programs
  • Simple numerical problems
  • Advanced theoretical exercises

The book is a solid reference for professionals as well as a useful text for students in the fields of operations research, management science, industrial engineering, applied mathematics, and also in engineering disciplines that deal with analytical optimization techniques. The logical and self-contained format uniquely covers nonlinear programming techniques with a great depth of information and an abundance of valuable examples and illustrations that showcase the most current advances in nonlinear problems.

6,259 citations

Journal ArticleDOI
TL;DR: GA's population-based approach and ability to make pair-wise comparison in tournament selection operator are exploited to devise a penalty function approach that does not require any penalty parameter to guide the search towards the constrained optimum.

3,495 citations


"Penalty Function Methods for Constr..." refers background in this paper

  • ...These approaches can be grouped in four major categories [28]:
    Category 1: Methods based on penalty functions - Death Penalty [2], Static Penalties [15,20], Dynamic Penalties [16,17], Annealing Penalties [5,24], Adaptive Penalties [10,12,35,37], Segregated GA [21], Co-evolutionary Penalties [8]
    Category 2: Methods based on a search of feasible solutions - Repairing unfeasible individuals [27], Superiority of feasible points [9,32], Behavioral memory [34]...

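The penalty-parameter-free pairwise comparison described in the TL;DR above is commonly implemented as a set of feasibility rules applied inside tournament selection: a feasible solution beats an infeasible one, two feasible solutions compare by objective value, and two infeasible solutions compare by total constraint violation. A minimal sketch, with illustrative names and assuming minimization:

```python
# Sketch of a penalty-parameter-free tournament comparison (feasibility rules).
# A solution is a (objective_value, constraint_values) pair, where each
# constraint value g_i corresponds to an inequality g_i(x) <= 0.

def total_violation(g_values):
    # Sum of the positive parts of the constraint values
    return sum(max(0.0, g) for g in g_values)

def better(a, b):
    """Return True if solution a wins the tournament against b (minimization)."""
    fa, ga = a
    fb, gb = b
    va, vb = total_violation(ga), total_violation(gb)
    if va == 0.0 and vb == 0.0:
        return fa < fb        # both feasible: lower objective wins
    if va == 0.0 or vb == 0.0:
        return va == 0.0      # feasible beats infeasible
    return va < vb            # both infeasible: smaller violation wins
```

Because solutions are only ever compared pairwise, no penalty weight has to be chosen, which is the point of this approach.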

Book
01 Jan 1996
TL;DR: In this work, the author compares the three most prominent representatives of evolutionary algorithms: genetic algorithms, evolution strategies, and evolutionary programming within a unified framework, thereby clarifying the similarities and differences of these methods.

2,679 citations