Open Access Journal Article

Direct multisearch for multiobjective optimization

TLDR
Direct multisearch (DMS) is a direct-search method that does not aggregate any of the objective functions to optimize and uses the concept of Pareto dominance to maintain a list of nondominated points, from which the new iterates or poll centers are chosen.
Abstract
In practical applications of optimization it is common to have several conflicting objective functions to optimize. Frequently, these functions are subject to noise or can be of black-box type, preventing the use of derivative-based techniques. We propose a novel multiobjective derivative-free methodology, calling it direct multisearch (DMS), which does not aggregate any of the objective functions. Our framework is inspired by the search/poll paradigm of direct-search methods of directional type and uses the concept of Pareto dominance to maintain a list of nondominated points (from which the new iterates or poll centers are chosen). The aim of our method is to generate as many points in the Pareto front as possible from the polling procedure itself, while keeping the whole framework general enough to accommodate other disseminating strategies, in particular, when using the (here also) optional search step. DMS generalizes to multiobjective optimization (MOO) all direct-search methods of directional type....
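As a concrete illustration of the dominance bookkeeping the abstract describes, here is a minimal Python sketch (not the authors' implementation) of maintaining a list of nondominated points under Pareto dominance; the sample objective vectors are invented for the example:

```python
# Illustrative sketch only: keeping an archive of nondominated points,
# as a DMS-style method does between iterations (both objectives minimized).

def dominates(f, g):
    """True if objective vector f Pareto-dominates g (minimization)."""
    return all(fi <= gi for fi, gi in zip(f, g)) and \
           any(fi < gi for fi, gi in zip(f, g))

def update_nondominated(archive, candidate):
    """Add `candidate` to `archive`, discarding any archived points it
    dominates; reject it if some archived point dominates it."""
    if any(dominates(f, candidate) for f in archive):
        return archive
    return [f for f in archive if not dominates(candidate, f)] + [candidate]

# Example with two objectives (values are made up for illustration).
archive = []
for point in [(2.0, 3.0), (1.0, 4.0), (2.5, 3.5), (1.0, 3.0)]:
    archive = update_nondominated(archive, point)
# (2.5, 3.5) is rejected, and (1.0, 3.0) displaces the other two points.
```

In a full method, new trial points produced by polling around a chosen center would be filtered through exactly this kind of update.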


Direct Multisearch for Multiobjective Optimization
Ana Luísa Custódio (Universidade Nova de Lisboa), José F. Aguilar Madeira (IDMEC-IST, ISEL), A. Ismael F. Vaz (Universidade do Minho), Luís Nunes Vicente (Universidade de Coimbra)
CERFACS, September 30, 2011
A.I.F. Vaz (CERFACS 2011) DMS September 30, 2011 1 / 53

Outline
1. Introduction and motivation
2. Direct MultiSearch
3. Numerical results
4. Further improvements on DMS
5. Conclusions and references


Citations
Journal Article

A tutorial on multiobjective optimization: fundamentals and evolutionary methods

TL;DR: This tutorial will review some of the most important fundamentals in multiobjective optimization and then introduce representative algorithms, illustrate their working principles, and discuss their application scope.
Journal Article

Derivative-free optimization methods

TL;DR: A review of derivative-free methods for non-convex optimization problems is given in this paper, with an emphasis on recent developments and on unifying treatment of such problems in the non-linear optimization and machine learning literature.
Journal Article

A survey on handling computationally expensive multiobjective optimization problems with evolutionary algorithms

TL;DR: A survey of 45 different recent algorithms, proposed in the literature between 2008 and 2016, for handling computationally expensive multiobjective optimization problems with evolutionary algorithms; it identifies and discusses promising elements and major issues among these algorithms related to the use of approximations and to the numerical settings employed.
Journal Article

Performance indicators in multiobjective optimization

TL;DR: A review of a total of 63 performance indicators partitioned into four groups according to their properties: cardinality, convergence, distribution and spread is proposed.
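One widely used indicator from the convergence and spread groups such a review covers is the hypervolume. A minimal Python sketch for two objectives (minimization), with an illustrative nondominated front and reference point of our own choosing:

```python
# Hedged sketch of the hypervolume indicator in two dimensions: the area
# dominated by a front of mutually nondominated points, bounded by a
# reference point that every front point dominates (both objectives minimized).

def hypervolume_2d(front, ref):
    """Area dominated by `front` and bounded above by `ref`."""
    pts = sorted(front)          # ascending in f1, hence descending in f2
    area, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        # each front point contributes a rectangle not covered so far
        area += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return area

print(hypervolume_2d([(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)], (4.0, 4.0)))  # prints 6.0
```

Larger hypervolume indicates a front that is both closer to the true Pareto front and better spread, which is why the indicator appears in both property groups.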
Journal Article

Improving the Flexibility and Robustness of Model-based Derivative-free Optimization Solvers

TL;DR: Numerical experiments show that Py-BOBYQA is comparable to or better than existing general DFO solvers on noisy problems; the paper also introduces an adaptive accuracy measure for data profiles of noisy functions, striking a balance between measuring the true and the noisy objective improvement.
References
Journal Article

A fast and elitist multiobjective genetic algorithm: NSGA-II

TL;DR: This paper suggests a nondominated-sorting-based MOEA, called NSGA-II (Nondominated Sorting Genetic Algorithm II), which addresses the high computational complexity, lack of elitism, and sharing-parameter dependence of earlier nondominated sorting algorithms, and modifies the definition of dominance in order to solve constrained multiobjective problems efficiently.
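The nondominated-sorting idea at the core of NSGA-II can be pictured as peeling off successive Pareto fronts. The Python sketch below shows only this idea, not Deb et al.'s fast bookkeeping scheme, and the sample points are invented:

```python
# Illustrative sketch of nondominated sorting: rank points into fronts,
# where front 0 is the nondominated set, front 1 is nondominated once
# front 0 is removed, and so on (all objectives minimized).

def dominates(f, g):
    return all(a <= b for a, b in zip(f, g)) and \
           any(a < b for a, b in zip(f, g))

def sort_into_fronts(points):
    """Group `points` (assumed distinct) into successive Pareto fronts."""
    remaining, fronts = list(points), []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

fronts = sort_into_fronts([(1.0, 4.0), (2.0, 3.0), (3.0, 3.0), (4.0, 1.0)])
# (3.0, 3.0) is dominated by (2.0, 3.0), so it lands in the second front.
```

NSGA-II uses these ranks, plus a crowding-distance tiebreaker within each front, to select survivors each generation.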
Book

Numerical Optimization

TL;DR: Numerical Optimization presents a comprehensive and up-to-date description of the most effective methods in continuous optimization, responding to the growing interest in optimization in engineering, science, and business by focusing on the methods that are best suited to practical problems.
Book

Optimization and nonsmooth analysis

TL;DR: The book develops nonsmooth analysis around the generalized gradient, which extends the classical differential calculus to nondifferentiable functions, with applications to optimization, the calculus of variations, and optimal control.
Book

Multiple Attribute Decision Making: Methods and Applications

TL;DR: The authors present a classification of multiple attribute decision making (MADM) methods by data type and describe a range of ranking methods, including approaches based on the similarity of alternatives to an ideal solution.
Journal Article

A comparison of three methods for selecting values of input variables in the analysis of output from a computer code

TL;DR: Two sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies and are shown to improve on simple random sampling, with respect to variance, for a class of estimators that includes the sample mean and the empirical distribution function.
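Latin hypercube sampling, one of the alternatives to simple random sampling examined in that paper, stratifies each input dimension into N equal intervals and draws exactly one sample per interval. A minimal sketch, assuming inputs scaled to the unit cube (the function name and example sizes are our own):

```python
import random

# Illustrative Latin hypercube sampler on [0,1]^n_dims: each dimension is
# split into n_samples equal strata, one point is drawn per stratum, and
# the strata are shuffled independently per dimension.

def latin_hypercube(n_samples, n_dims, seed=0):
    rng = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        # one uniform draw inside each of the n_samples strata
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        columns.append(col)
    return list(zip(*columns))  # one (x_1, ..., x_n_dims) tuple per sample

sample = latin_hypercube(5, 2)
# every dimension of `sample` has exactly one value in each fifth of [0,1]
```

The per-dimension stratification is what reduces estimator variance relative to simple random sampling: no region of any input's range can be left unsampled.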