Journal ArticleDOI

jMetal: A Java framework for multi-objective optimization

01 Oct 2011 - Advances in Engineering Software (Elsevier Science Ltd.) - Vol. 42, Iss. 10, pp. 760-771
TL;DR: This paper describes jMetal, an object-oriented Java-based framework aimed at the development, experimentation, and study of metaheuristics for solving multi-objective optimization problems. It includes two case studies that illustrate the use of jMetal: solving a problem with a metaheuristic, and designing and carrying out an experimental study.
About: This article was published in Advances in Engineering Software on 2011-10-01 and has received 1,025 citations to date. The article focuses on the topics: Metaheuristic & Multi-objective optimization.
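
To make the intended workflow concrete, the sketch below configures and runs NSGA-II on the ZDT1 benchmark in the style of the runner examples distributed with jMetal. The package paths, factory methods, and parameter keys are assumptions based on the 4.x-era API and may differ between releases, so treat this as an illustration rather than the paper's own listing.

```java
// Illustrative sketch only: modeled on the NSGA-II runner examples shipped with jMetal 4.x.
// All class names, factory names, and parameter keys below are assumptions to be checked
// against the actual jMetal release in use.
import java.util.HashMap;

import jmetal.core.Algorithm;
import jmetal.core.Operator;
import jmetal.core.Problem;
import jmetal.core.SolutionSet;
import jmetal.metaheuristics.nsgaII.NSGAII;
import jmetal.operators.crossover.CrossoverFactory;
import jmetal.operators.mutation.MutationFactory;
import jmetal.operators.selection.SelectionFactory;
import jmetal.problems.ZDT.ZDT1;

public class NSGAIIRunner {
  public static void main(String[] args) throws Exception {
    Problem problem = new ZDT1("Real");        // two-objective, real-coded benchmark problem
    Algorithm algorithm = new NSGAII(problem); // the metaheuristic to run

    // Stopping condition and population size are passed as named input parameters
    algorithm.setInputParameter("populationSize", 100);
    algorithm.setInputParameter("maxEvaluations", 25000);

    // Variation operators are obtained from factories and attached to the algorithm by name
    HashMap<String, Object> crossoverParams = new HashMap<String, Object>();
    crossoverParams.put("probability", 0.9);
    crossoverParams.put("distributionIndex", 20.0);
    Operator crossover = CrossoverFactory.getCrossoverOperator("SBXCrossover", crossoverParams);

    HashMap<String, Object> mutationParams = new HashMap<String, Object>();
    mutationParams.put("probability", 1.0 / problem.getNumberOfVariables());
    mutationParams.put("distributionIndex", 20.0);
    Operator mutation = MutationFactory.getMutationOperator("PolynomialMutation", mutationParams);

    Operator selection = SelectionFactory.getSelectionOperator("BinaryTournament2", null);

    algorithm.addOperator("crossover", crossover);
    algorithm.addOperator("mutation", mutation);
    algorithm.addOperator("selection", selection);

    // Run the algorithm and dump the approximated Pareto front and decision variables
    SolutionSet population = algorithm.execute();
    population.printObjectivesToFile("FUN");
    population.printVariablesToFile("VAR");
  }
}
```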
Citations
Journal ArticleDOI
TL;DR: In the proposed algorithm, a scalarization approach, termed angle-penalized distance, is adopted to balance convergence and diversity of the solutions in the high-dimensional objective space. Reference vectors are also shown to be effective and cost-efficient for preference articulation, which is particularly desirable for many-objective optimization.
Abstract: In evolutionary multiobjective optimization, maintaining a good balance between convergence and diversity is particularly crucial to the performance of the evolutionary algorithms (EAs). In addition, it becomes increasingly important to incorporate user preferences because it will be less likely to achieve a representative subset of the Pareto-optimal solutions using a limited population size as the number of objectives increases. This paper proposes a reference vector-guided EA for many-objective optimization. The reference vectors can be used not only to decompose the original multiobjective optimization problem into a number of single-objective subproblems, but also to elucidate user preferences to target a preferred subset of the whole Pareto front (PF). In the proposed algorithm, a scalarization approach, termed angle-penalized distance, is adopted to balance convergence and diversity of the solutions in the high-dimensional objective space. An adaptation strategy is proposed to dynamically adjust the distribution of the reference vectors according to the scales of the objective functions. Our experimental results on a variety of benchmark test problems show that the proposed algorithm is highly competitive in comparison with five state-of-the-art EAs for many-objective optimization. In addition, we show that reference vectors are effective and cost-efficient for preference articulation, which is particularly desirable for many-objective optimization. Furthermore, a reference vector regeneration strategy is proposed for handling irregular PFs. Finally, the proposed algorithm is extended for solving constrained many-objective optimization problems.
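
For reference, the angle-penalized distance mentioned above is usually written as in the sketch below; the symbols (M objectives, generation counter t, maximum generation t_max, penalty exponent α, and neighbor angle γ) follow the commonly cited form, and the exact normalization should be checked against the original paper.

```latex
% Angle-penalized distance of the translated objective vector \tilde{f}_{t,i}
% with respect to reference vector v_{t,j} (sketch of the commonly cited form)
d_{t,i,j} = \bigl(1 + P(\theta_{t,i,j})\bigr)\,\lVert \tilde{f}_{t,i} \rVert,
\qquad
P(\theta_{t,i,j}) = M \left(\frac{t}{t_{\max}}\right)^{\alpha} \frac{\theta_{t,i,j}}{\gamma_{v_{t,j}}}
```

Here θ_{t,i,j} is the angle between the translated objective vector and the reference vector, and γ_{v_{t,j}} is the smallest angle between v_{t,j} and its neighboring reference vectors, so the penalty adapts to both the stage of the search and the local density of reference vectors.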

1,020 citations

Journal ArticleDOI
TL;DR: PlatEMO, as discussed by the authors, is a MATLAB platform for evolutionary multi-objective optimization that includes more than 50 multi-objective evolutionary algorithms and more than 100 multi-objective test problems, along with several widely used performance indicators.
Abstract: Over the last three decades, a large number of evolutionary algorithms have been developed for solving multi-objective optimization problems. However, there lacks an up-to-date and comprehensive software platform for researchers to properly benchmark existing algorithms and for practitioners to apply selected algorithms to solve their real-world problems. The demand for such a common tool becomes even more urgent when the source code of many proposed algorithms has not been made publicly available. To address these issues, we have developed a MATLAB platform for evolutionary multi-objective optimization in this paper, called PlatEMO, which includes more than 50 multi-objective evolutionary algorithms and more than 100 multi-objective test problems, along with several widely used performance indicators. With a user-friendly graphical user interface, PlatEMO enables users to easily compare several evolutionary algorithms at one time and collect statistical results in Excel or LaTeX files. More importantly, PlatEMO is completely open source, such that users are able to develop new algorithms on the basis of it. This paper introduces the main features of PlatEMO and illustrates how to use it for performing comparative experiments, embedding new algorithms, creating new test problems, and developing performance indicators. Source code of PlatEMO is now available at: http://bimk.ahu.edu.cn/index.php?s=/Index/Software/index.html.

915 citations


Journal ArticleDOI
TL;DR: This work develops pymoo, a multi-objective optimization framework in Python that addresses practical needs, such as the parallelization of function evaluations, methods to visualize low and high-dimensional spaces, and tools for multi-criteria decision making.
Abstract: Python has become the programming language of choice for research and industry projects related to data science, machine learning, and deep learning. Since optimization is an inherent part of these research fields, more optimization related frameworks have arisen in the past few years. Only a few of them support optimization of multiple conflicting objectives at a time, but do not provide comprehensive tools for a complete multi-objective optimization task. To address this issue, we have developed pymoo, a multi-objective optimization framework in Python. We provide a guide to getting started with our framework by demonstrating the implementation of an exemplary constrained multi-objective optimization scenario. Moreover, we give a high-level overview of the architecture of pymoo to show its capabilities followed by an explanation of each module and its corresponding sub-modules. The implementations in our framework are customizable and algorithms can be modified/extended by supplying custom operators. Moreover, a variety of single, multi- and many-objective test problems are provided and gradients can be retrieved by automatic differentiation out of the box. Also, pymoo addresses practical needs, such as the parallelization of function evaluations, methods to visualize low and high-dimensional spaces, and tools for multi-criteria decision making. For more information about pymoo, readers are encouraged to visit: https://pymoo.org.

644 citations


Cites methods from "jMetal: A Java framework for multi-..."

  • ...Recently, the well-known multi-objective optimization framework jMetal [15] developed in Java [19] has been ported to a Python version, namely jMetalPy [2]....


  • ...If the search for frameworks is not limited to Python, other popular frameworks should be considered: PlatEMO [45] in Matlab, MOEA [20] and jMetal [15] in Java, jMetalCpp [31] and PaGMO [3] in C++....


Journal ArticleDOI
TL;DR: An evolutionary algorithm based on a new dominance relation is proposed for many-objective optimization; it aims to enhance the convergence of the recently suggested nondominated sorting genetic algorithm III by exploiting the fitness evaluation scheme of the decomposition-based MOEA, while inheriting the former's strength in diversity maintenance.
Abstract: Many-objective optimization has posed a great challenge to the classical Pareto dominance-based multiobjective evolutionary algorithms (MOEAs). In this paper, an evolutionary algorithm based on a new dominance relation is proposed for many-objective optimization. The proposed evolutionary algorithm aims to enhance the convergence of the recently suggested nondominated sorting genetic algorithm III by exploiting the fitness evaluation scheme in the MOEA based on decomposition, but still inherit the strength of the former in diversity maintenance. In the proposed algorithm, the nondominated sorting scheme based on the introduced new dominance relation is employed to rank solutions in the environmental selection phase, ensuring both convergence and diversity. The proposed algorithm is evaluated on a number of well-known benchmark problems having 3–15 objectives and compared against eight state-of-the-art algorithms. The extensive experimental results show that the proposed algorithm can work well on almost all the test functions considered in this paper, and it is compared favorably with the other many-objective optimizers. Additionally, a parametric study is provided to investigate the influence of a key parameter in the proposed algorithm.

556 citations


Cites methods from "jMetal: A Java framework for multi-..."

  • ...ing the proposed θ-DEA are implemented in the jMetal framework [80], and run on an Intel 2....


References
Journal ArticleDOI
TL;DR: This paper suggests a non-dominated sorting-based MOEA, called NSGA-II (Non-dominated Sorting Genetic Algorithm II), which alleviates three difficulties of earlier MOEAs (high computational complexity, non-elitism, and the need to specify a sharing parameter), and modifies the definition of dominance in order to solve constrained multi-objective problems efficiently.
Abstract: Multi-objective evolutionary algorithms (MOEAs) that use non-dominated sorting and sharing have been criticized mainly for: (1) their O(MN³) computational complexity (where M is the number of objectives and N is the population size); (2) their non-elitism approach; and (3) the need to specify a sharing parameter. In this paper, we suggest a non-dominated sorting-based MOEA, called NSGA-II (Non-dominated Sorting Genetic Algorithm II), which alleviates all of the above three difficulties. Specifically, a fast non-dominated sorting approach with O(MN²) computational complexity is presented. Also, a selection operator is presented that creates a mating pool by combining the parent and offspring populations and selecting the best N solutions (with respect to fitness and spread). Simulation results on difficult test problems show that NSGA-II is able, for most problems, to find a much better spread of solutions and better convergence near the true Pareto-optimal front compared to the Pareto-archived evolution strategy and the strength-Pareto evolutionary algorithm - two other elitist MOEAs that pay special attention to creating a diverse Pareto-optimal front. Moreover, we modify the definition of dominance in order to solve constrained multi-objective problems efficiently. Simulation results of the constrained NSGA-II on a number of test problems, including a five-objective, seven-constraint nonlinear problem, are compared with another constrained multi-objective optimizer, and the much better performance of NSGA-II is observed.
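
To make the fast non-dominated sorting step concrete, the sketch below partitions a population into Pareto fronts using the O(MN²) bookkeeping scheme described above. It is a generic, self-contained illustration (assuming all objectives are minimized), not the paper's or jMetal's actual code.

```java
import java.util.ArrayList;
import java.util.List;

public final class FastNonDominatedSort {

  /** Returns true if objective vector a Pareto-dominates b (all <=, at least one <). */
  static boolean dominates(double[] a, double[] b) {
    boolean strictlyBetter = false;
    for (int m = 0; m < a.length; m++) {
      if (a[m] > b[m]) return false;     // worse in some objective: cannot dominate
      if (a[m] < b[m]) strictlyBetter = true;
    }
    return strictlyBetter;
  }

  /** Partitions the population (rows = objective vectors) into fronts F0, F1, ... of indices. */
  static List<List<Integer>> sort(double[][] pop) {
    int n = pop.length;
    int[] dominationCount = new int[n];                   // how many solutions dominate i
    List<List<Integer>> dominatedBy = new ArrayList<>();  // solutions that i dominates
    List<List<Integer>> fronts = new ArrayList<>();
    fronts.add(new ArrayList<>());
    for (int i = 0; i < n; i++) dominatedBy.add(new ArrayList<>());

    // Pairwise comparisons: O(M N^2) objective-value checks in total
    for (int i = 0; i < n; i++) {
      for (int j = 0; j < n; j++) {
        if (i == j) continue;
        if (dominates(pop[i], pop[j])) dominatedBy.get(i).add(j);
        else if (dominates(pop[j], pop[i])) dominationCount[i]++;
      }
      if (dominationCount[i] == 0) fronts.get(0).add(i);  // non-dominated: first front
    }

    // Peel off subsequent fronts by decrementing the domination counts
    int k = 0;
    while (!fronts.get(k).isEmpty()) {
      List<Integer> next = new ArrayList<>();
      for (int i : fronts.get(k)) {
        for (int j : dominatedBy.get(i)) {
          if (--dominationCount[j] == 0) next.add(j);
        }
      }
      fronts.add(next);
      k++;
    }
    fronts.remove(fronts.size() - 1);  // the last collected front is always empty
    return fronts;
  }
}
```

In NSGA-II this ranking is combined with a crowding-distance measure to fill the next population front by front, truncating the last front that fits by preferring less crowded solutions.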

37,111 citations


"jMetal: A Java framework for multi-..." refers background or methods in this paper

  • ...– Constrained problems: Srinivas [33], Tanaka [34], Osyczka2 [35], Constr_Ex [6], Golinski [36], Water [37]....


  • ...Implementation of a number of modern multi-objective optimization algorithms: NSGA-II [6], SPEA2 [8], PAES [7], PESA-II [13], OMOPSO [14], MOCell [15], AbYSS [16], MOEA/D [17], Densea [18], CellDE [19], GDE3 [20], FastPGA [21], IBEA [22], SMPSO [23], MOCHC [24], and SMS-EMOA [25]....


  • ...Implementation of the most widely used quality indicators: Hypervolume [38], Spread [6], Generational Distance [39], Inverted Generational Distance [39], Epsilon [40]....


  • ...The constraint handling mechanism implemented by default is the one proposed in [6]....


Book
01 Jan 2001
TL;DR: This text provides an excellent introduction to the use of evolutionary algorithms in multi-objective optimization, allowing use as a graduate course text or for self-study.
Abstract: From the Publisher: Evolutionary algorithms are relatively new, but very powerful techniques used to find solutions to many real-world search and optimization problems. Many of these problems have multiple objectives, which leads to the need to obtain a set of optimal solutions, known as effective solutions. It has been found that using evolutionary algorithms is a highly effective way of finding multiple effective solutions in a single simulation run.

  • Comprehensive coverage of this growing area of research
  • Carefully introduces each algorithm with examples and in-depth discussion
  • Includes many applications to real-world problems, including engineering design and scheduling
  • Includes discussion of advanced topics and future research
  • Features exercises and solutions, enabling use as a course text or for self-study
  • Accessible to those with limited knowledge of classical multi-objective optimization and evolutionary algorithms

The integrated presentation of theory, algorithms and examples will benefit those working and researching in the areas of optimization, optimal design and evolutionary computing. This text provides an excellent introduction to the use of evolutionary algorithms in multi-objective optimization, allowing use as a graduate course text or for self-study.

12,134 citations

Journal ArticleDOI
TL;DR: The proof-of-principle results obtained on two artificial problems as well as a larger problem, the synthesis of a digital hardware-software multiprocessor system, suggest that SPEA can be very effective in sampling from along the entire Pareto-optimal front and distributing the generated solutions over the tradeoff surface.
Abstract: Evolutionary algorithms (EAs) are often well-suited for optimization problems involving several, often conflicting objectives. Since 1985, various evolutionary approaches to multiobjective optimization have been developed that are capable of searching for multiple solutions concurrently in a single run. However, the few comparative studies of different methods presented up to now remain mostly qualitative and are often restricted to a few approaches. In this paper, four multiobjective EAs are compared quantitatively where an extended 0/1 knapsack problem is taken as a basis. Furthermore, we introduce a new evolutionary approach to multicriteria optimization, the strength Pareto EA (SPEA), that combines several features of previous multiobjective EAs in a unique manner. It is characterized by (a) storing nondominated solutions externally in a second, continuously updated population, (b) evaluating an individual's fitness dependent on the number of external nondominated points that dominate it, (c) preserving population diversity using the Pareto dominance relationship, and (d) incorporating a clustering procedure in order to reduce the nondominated set without destroying its characteristics. The proof-of-principle results obtained on two artificial problems as well as a larger problem, the synthesis of a digital hardware-software multiprocessor system, suggest that SPEA can be very effective in sampling from along the entire Pareto-optimal front and distributing the generated solutions over the tradeoff surface. Moreover, SPEA clearly outperforms the other four multiobjective EAs on the 0/1 knapsack problem.
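
As a pointer to how the strength-based fitness in point (b) works, a commonly stated sketch is given below, where P is the current population of size N, the archive holds the externally stored non-dominated solutions, and ≻ denotes Pareto dominance. This is the textbook form, so the exact covering relation and tie-breaking should be checked against the original paper.

```latex
% Strength of an archive member i: fraction of population members it dominates
S(i) = \frac{\bigl|\{\, j \in P : i \succ j \,\}\bigr|}{N + 1}

% Fitness of a population member j (to be minimized): one plus the strengths
% of all archive members that dominate it
F(j) = 1 + \sum_{i \in \bar{P},\; i \succ j} S(i)
```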

7,512 citations


"jMetal: A Java framework for multi-..." refers background or methods in this paper

  • ...3, Q = {A,B,C}, for problems where all objectives are to be minimized [38]....


  • ...A number of quality indicators for measuring these two criteria have been proposed in the literature: Generational Distance (GD) [39], Inverse Generational Distance (IGD), Hypervolume (HV) [38], Epsilon [40], Spread or Δ [4], Generalized Spread indicators, and others....


  • ...Implementation of the most widely used quality indicators: Hypervolume [38], Spread [6], Generational Distance [39], Inverted Generational Distance [39], Epsilon [40]....


  • ...Hypervolume enclosed by the non-dominated solutions A, B, and C. Spread or Δ....


  • ...For our purpose, we have to select Epsilon Indicator, Spread, and Hypervolume....


Journal ArticleDOI
TL;DR: This paper investigates Goldberg's notion of nondominated sorting in GAs, together with a niche and speciation method, to find multiple Pareto-optimal points simultaneously, and suggests that the method can be extended to higher-dimensional and more difficult multiobjective problems.
Abstract: In trying to solve multiobjective optimization problems, many traditional methods scalarize the objective vector into a single objective. In those cases, the obtained solution is highly sensitive to the weight vector used in the scalarization process and demands that the user have knowledge about the underlying problem. Moreover, in solving multiobjective problems, designers may be interested in a set of Pareto-optimal points, instead of a single point. Since genetic algorithms (GAs) work with a population of points, it seems natural to use GAs in multiobjective optimization problems to capture a number of solutions simultaneously. Although a vector evaluated GA (VEGA) has been implemented by Schaffer and has been tried to solve a number of multiobjective problems, the algorithm seems to have bias toward some regions. In this paper, we investigate Goldberg's notion of nondominated sorting in GAs along with a niche and speciation method to find multiple Pareto-optimal points simultaneously. The proof-of-principle results obtained on three problems used by Schaffer and others suggest that the proposed method can be extended to higher dimensional and more difficult multiobjective problems. A number of suggestions for extension and application of the algorithm are also discussed.
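
The niche and speciation method referred to above is typically realized with a fitness-sharing function; the sketch below gives the standard textbook form (σ_share and α are user-chosen parameters), offered here as background rather than a quotation from the paper.

```latex
% Sharing function between solutions i and j at distance d_{ij}
Sh(d_{ij}) =
\begin{cases}
  1 - \left(\dfrac{d_{ij}}{\sigma_{\mathrm{share}}}\right)^{\alpha} & \text{if } d_{ij} < \sigma_{\mathrm{share}},\\[4pt]
  0 & \text{otherwise.}
\end{cases}

% Niche count and shared (degraded) fitness of solution i in a population of size N
m_i = \sum_{j=1}^{N} Sh(d_{ij}), \qquad f'_i = \frac{f_i}{m_i}
```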

6,411 citations

Book
30 Jun 2002
TL;DR: This book provides comprehensive coverage of evolutionary algorithms for solving multi-objective optimization problems, spanning basic concepts, MOEA approaches, test suites, testing and analysis, theory, applications, parallelization, and multi-criteria decision making.
Abstract: List of Figures. List of Tables. Preface. Foreword. 1. Basic Concepts. 2. Evolutionary Algorithm MOP Approaches. 3. MOEA Test Suites. 4. MOEA Testing and Analysis. 5. MOEA Theory and Issues. 6. Applications. 7. MOEA Parallelization. 8. Multi-Criteria Decision Making. 9. Special Topics. 10. Epilog. Appendix A: MOEA Classification and Technique Analysis. Appendix B: MOPs in the Literature. Appendix C: Ptrue & PFtrue for Selected Numeric MOPs. Appendix D: Ptrue & PFtrue for Side-Constrained MOPs. Appendix E: MOEA Software Availability. Appendix F: MOEA-Related Information. Index. References.

5,994 citations


"jMetal: A Java framework for multi-..." refers background in this paper

  • ...Among them, evolutionary algorithms are very popular [4,5], and some of the most well-known algorithms in this field belong to this class (e....
