Efficient multiobjective optimization employing Gaussian processes, spectral sampling and a genetic algorithm
TLDR
A new algorithm, TSEMO, is proposed that uses Gaussian processes as surrogates. It gives a simple algorithm that requires no a priori knowledge, reduces hypervolume calculations to approach linear scaling with the number of objectives, handles noise, and supports batch-sequential usage.
Abstract:
Many engineering problems require the optimization of expensive black-box functions involving multiple conflicting criteria, such that commonly used methods like multiobjective genetic algorithms are inadequate. To tackle this problem, several algorithms have been developed using surrogates. However, these often have disadvantages such as requiring a priori knowledge of the output functions or computational cost that scales exponentially with the number of objectives. In this paper a new algorithm is proposed, TSEMO, which uses Gaussian processes as surrogates. The Gaussian processes are sampled using spectral sampling techniques to make use of Thompson sampling in conjunction with the hypervolume quality indicator and NSGA-II to choose a new evaluation point at each iteration. The reference point required for the hypervolume calculation is estimated within TSEMO. Further, a simple extension is proposed to carry out batch-sequential design. TSEMO was compared to ParEGO, an expected hypervolume implementation, and NSGA-II on nine test problems with a budget of 150 function evaluations. Overall, TSEMO shows promising performance while giving a simple algorithm that requires no a priori knowledge, reduces hypervolume calculations to approach linear scaling with the number of objectives, handles noise, and allows batch-sequential usage.
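The core per-iteration idea described above — drawing an approximate function from the Gaussian-process posterior via spectral (random Fourier feature) sampling, so that Thompson sampling reduces to optimizing an ordinary deterministic function — can be sketched as follows. This is an illustrative sketch assuming an RBF kernel with fixed hyperparameters, not the paper's implementation; `spectral_sample` and its parameters are hypothetical names:

```python
import numpy as np

def spectral_sample(X, y, n_features=200, lengthscale=1.0, sf2=1.0, noise=1e-6, seed=0):
    """Draw one approximate sample function from the posterior of a GP with an
    RBF kernel, using random Fourier features (Bochner's theorem). The returned
    callable is deterministic, so it can be handed to any optimizer."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # The spectral density of the RBF kernel is Gaussian,
    # so frequencies are drawn as omega ~ N(0, lengthscale^-2 I).
    W = rng.normal(0.0, 1.0 / lengthscale, size=(n_features, d))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)

    def phi(x):
        return np.sqrt(2.0 * sf2 / n_features) * np.cos(np.atleast_2d(x) @ W.T + b)

    # Bayesian linear regression on the feature map gives theta ~ N(mu, Sigma);
    # one draw of theta fixes one deterministic sample function phi(x) @ theta.
    Phi = phi(X)
    A = Phi.T @ Phi + noise * np.eye(n_features)
    mu = np.linalg.solve(A, Phi.T @ y)
    Sigma = noise * np.linalg.inv(A)
    Sigma = 0.5 * (Sigma + Sigma.T)  # symmetrize against round-off
    theta = rng.multivariate_normal(mu, Sigma)
    return lambda x: phi(x) @ theta
```

A multiobjective optimizer such as NSGA-II can then be run on one such sample per objective, and a new evaluation point chosen from the resulting front via the hypervolume criterion.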
Citations
Journal Article
Machine learning meets continuous flow chemistry: Automated optimization towards the Pareto front of multiple objectives
Artur M. Schweidtmann, Adam D. Clayton, Nicholas Holmes, Eric Bradford, Richard A. Bourne, Alexei A. Lapkin +7 more
TL;DR: A new multi-objective machine learning optimization algorithm is implemented for self-optimization, identifying the set of optimal conditions along the trade-off curve (Pareto front) between environmental and economic objectives in both cases.
Journal Article
Performance indicators in multiobjective optimization
Charles Audet, Jean Bigeon, Dominique Cartier, Sébastien Le Digabel, Ludovic Salomon +5 more
TL;DR: A review of 63 performance indicators is proposed, partitioned into four groups according to their properties: cardinality, convergence, distribution, and spread.
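For intuition, the hypervolume indicator used by TSEMO above — one of the quality indicators of the kind surveyed in this review — can be computed for two objectives with a simple sweep. A minimal sketch for minimization; `hypervolume_2d` is an illustrative name, not from either paper:

```python
import numpy as np

def hypervolume_2d(points, ref):
    """Hypervolume dominated by a set of 2-objective points (minimization)
    relative to the reference point `ref`: after sorting by the first
    objective, sum the rectangles added by each non-dominated point."""
    P = np.asarray(points, dtype=float)
    # Keep only points that strictly dominate the reference point.
    P = P[np.all(P < ref, axis=1)]
    if len(P) == 0:
        return 0.0
    P = P[np.argsort(P[:, 0])]
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in P:
        if f2 < prev_f2:  # dominated points add no area and are skipped
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv
```

For more than two objectives exact computation becomes expensive, which is why TSEMO's approximately linear scaling in the number of objectives matters.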
Journal Article
Deterministic Global Optimization with Artificial Neural Networks Embedded
TL;DR: A method for deterministic global optimization with artificial neural networks embedded is proposed, based on McCormick relaxations in a reduced space employing the convex and concave envelopes of the nonlinear activation function.
Journal Article
Evolutionary multiobjective optimization: open research areas and some challenges lying ahead
Carlos A. Coello Coello,Silvia González Brambila,Josué Figueroa Gamboa,Ma. Guadalupe Castillo Tapia,Raquel Hernández Gómez +4 more
TL;DR: The main aim of this paper is to motivate researchers and students to pursue these open research areas, as this will contribute to keeping the discipline active over the next few years.
References
Journal Article
A fast and elitist multiobjective genetic algorithm: NSGA-II
TL;DR: This paper suggests a non-dominated sorting-based MOEA, called NSGA-II (Non-dominated Sorting Genetic Algorithm II), which alleviates three difficulties of earlier MOEAs (computational complexity, non-elitism, and the need for a sharing parameter) and modifies the definition of dominance to solve constrained multi-objective problems efficiently.
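The non-dominated sorting at the heart of NSGA-II can be sketched as follows — a minimal O(n²)-dominance version for minimization, with illustrative function names rather than the paper's pseudocode:

```python
import numpy as np

def dominates(a, b):
    """True if a Pareto-dominates b (minimization): a is no worse in every
    objective and strictly better in at least one."""
    return np.all(a <= b) and np.any(a < b)

def non_dominated_sort(F):
    """Partition the objective vectors in F into fronts of equal rank,
    returning lists of row indices (rank-0, i.e. non-dominated, front first)."""
    n = len(F)
    dominated_by = [[] for _ in range(n)]  # indices that point i dominates
    dom_count = [0] * n                    # how many points dominate i
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(F[i], F[j]):
                dominated_by[i].append(j)
            elif dominates(F[j], F[i]):
                dom_count[i] += 1
    fronts, current = [], [i for i in range(n) if dom_count[i] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for i in current:  # peeling off a front may expose the next one
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        current = nxt
    return fronts
```

NSGA-II then fills the next population front by front, breaking ties within a front by crowding distance.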
Book
Gaussian Processes for Machine Learning
TL;DR: The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
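The GP regression setting this book covers, which underpins the surrogates in TSEMO, reduces to a few lines for a fixed kernel. A sketch assuming an RBF kernel with fixed hyperparameters; `gp_posterior` is an illustrative helper, not an interface from the book:

```python
import numpy as np

def gp_posterior(X, y, Xs, lengthscale=1.0, sf2=1.0, noise=1e-4):
    """Posterior mean and variance of a GP with an RBF kernel at test points
    Xs, via the standard Cholesky-based equations."""
    def k(A, B):
        # Squared Euclidean distances, then the RBF kernel.
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
        return sf2 * np.exp(-0.5 * d2 / lengthscale**2)

    L = np.linalg.cholesky(k(X, X) + noise * np.eye(len(X)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # (K + noise*I)^-1 y
    Ks = k(Xs, X)
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = sf2 - np.sum(v**2, axis=0)  # prior variance minus explained variance
    return mean, var
```

The posterior variance grows back to the prior variance far from the data, which is what lets Thompson sampling trade off exploration against exploitation.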
Journal Article
A comparison of three methods for selecting values of input variables in the analysis of output from a computer code
TL;DR: In this paper, two sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies, and both are shown to improve on simple random sampling with respect to the variance of a class of estimators that includes the sample mean and the empirical distribution function.
Journal Article
Multiobjective evolutionary algorithms: a comparative case study and the strength Pareto approach
Eckart Zitzler,Lothar Thiele +1 more
TL;DR: The proof-of-principle results obtained on two artificial problems, as well as a larger problem (the synthesis of a digital hardware-software multiprocessor system), suggest that SPEA can be very effective in sampling along the entire Pareto-optimal front and distributing the generated solutions over the tradeoff surface.
Journal Article
Efficient Global Optimization of Expensive Black-Box Functions
TL;DR: This paper introduces the reader to a response surface methodology that is especially good at modeling the nonlinear, multimodal functions that often occur in engineering and shows how these approximating functions can be used to construct an efficient global optimization algorithm with a credible stopping rule.