Journal Article•DOI•

An algorithm for finding the global maximum of a multimodal, multivariate function

R. H. Mladineo
01 Mar 1986 - Mathematical Programming (Springer-Verlag) - Vol. 34, Iss. 2, pp. 188-200
TL;DR: This algorithm for global optimization uses an arbitrary starting point, requires no derivatives, uses comparatively few function evaluations, and is not side-tracked by nearby relative optima; it builds a gradually closer piecewise-differentiable approximation to the objective function.
Abstract: This algorithm for global optimization uses an arbitrary starting point, requires no derivatives, uses comparatively few function evaluations and is not side-tracked by nearby relative optima. The algorithm builds a gradually closer piecewise-differentiable approximation to the objective function. The computer program exhibits a (theoretically expected) strong tendency to cluster around relative optima close to the global. Results of testing with several standard functions are given.
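
To make the construction concrete, here is a minimal Python sketch of a cone-based Lipschitz search of this kind, assuming a known Lipschitz bound L and approximating the envelope's maximizer over a random candidate set; the function names and the candidate-set shortcut are illustrative, not the paper's implementation.

```python
import numpy as np

def lipschitz_maximize(f, bounds, L, n_iter=50, n_candidates=4096, seed=0):
    """Sketch of a cone-based Lipschitz search in R^d.

    Upper envelope after n evaluations:
        F_n(x) = min_i ( f(x_i) + L * ||x - x_i|| )
    Each iteration evaluates f where the envelope is largest; the exact
    maximization is approximated here over a random candidate set.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T     # bounds: list of (lo, hi)
    X = [lo + rng.random(lo.size) * (hi - lo)]     # arbitrary starting point
    y = [f(X[0])]
    for _ in range(n_iter):
        C = lo + rng.random((n_candidates, lo.size)) * (hi - lo)
        dist = np.linalg.norm(C[:, None, :] - np.array(X)[None, :, :], axis=2)
        F = (np.array(y)[None, :] + L * dist).min(axis=1)   # envelope at C
        X.append(C[F.argmax()])                    # sample where F is largest
        y.append(f(X[-1]))
    best = int(np.argmax(y))
    return X[best], y[best]

# Example: a multimodal function on [-5, 5]^2 (L = 2.5 bounds its gradient)
x_star, f_star = lipschitz_maximize(
    lambda x: float(np.sin(x[0]) + np.cos(x[1]) - 0.05 * x @ x),
    bounds=[(-5, 5), (-5, 5)], L=2.5)
```
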
Citations
Journal Article•DOI•
TL;DR: In this article, the Lipschitz constant is viewed as a weighting parameter that indicates how much emphasis to place on global versus local search; searching with all possible constants simultaneously accounts for the fast convergence of the new algorithm on the test functions.
Abstract: We present a new algorithm for finding the global minimum of a multivariate function subject to simple bounds. The algorithm is a modification of the standard Lipschitzian approach that eliminates the need to specify a Lipschitz constant. This is done by carrying out simultaneous searches using all possible constants from zero to infinity. On nine standard test functions, the new algorithm converges in fewer function evaluations than most competing methods. The motivation for the new algorithm stems from a different way of looking at the Lipschitz constant. In particular, the Lipschitz constant is viewed as a weighting parameter that indicates how much emphasis to place on global versus local search. In standard Lipschitzian methods, this constant is usually large because it must equal or exceed the maximum rate of change of the objective function. As a result, these methods place a high emphasis on global search and exhibit slow convergence. In contrast, the new algorithm carries out simultaneous searches using all possible constants, and therefore operates at both the global and local level. Once the global part of the algorithm finds the basin of convergence of the optimum, the local part of the algorithm quickly and automatically exploits it. This accounts for the fast convergence of the new algorithm on the test functions.
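
A sketch of the selection step this abstract describes: in the usual formulation, a rectangle is "potentially optimal" if some Lipschitz constant K > 0 makes its lower bound best and sufficiently below the incumbent. The helper below is a hypothetical illustration of that test, not the paper's code.

```python
import numpy as np

def potentially_optimal(d, f, f_min, eps=1e-4):
    """Select 'potentially optimal' rectangles, DIRECT-style.

    d[j]: size of rectangle j; f[j]: objective value at its center.
    Rectangle j is kept if some Lipschitz constant K > 0 makes its lower
    bound f[j] - K*d[j] the smallest and improves on f_min by eps*|f_min|.
    """
    keep = []
    n = len(d)
    for j in range(n):
        if any(f[i] < f[j] for i in range(n) if d[i] == d[j]):
            continue                                   # beaten at its own size
        K_lo = max(((f[j] - f[i]) / (d[j] - d[i])
                    for i in range(n) if d[i] < d[j]), default=0.0)
        K_hi = min(((f[i] - f[j]) / (d[i] - d[j])
                    for i in range(n) if d[i] > d[j]), default=np.inf)
        if K_lo > K_hi:
            continue                                   # no feasible constant
        # The bound f[j] - K*d[j] is lowest at K = K_hi; an unbounded K_hi
        # (largest rectangles) always satisfies the improvement test.
        if np.isinf(K_hi) or f[j] - K_hi * d[j] <= f_min - eps * abs(f_min):
            keep.append(j)
    return keep
```
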

1,994 citations

Journal Article•DOI•
TL;DR: In this paper, the global optimization problem of a multidimensional "black-box" function satisfying the Lipschitz condition over a hyperinterval with an unknown Lipschitz constant is considered.
Abstract: In the paper, the global optimization problem of a multidimensional "black-box" function satisfying the Lipschitz condition over a hyperinterval with an unknown Lipschitz constant is considered. A new efficient algorithm for solving this problem is presented. At each iteration of the method a number of possible Lipschitz constants are chosen from a set of values varying from zero to infinity. This idea is unified with an efficient diagonal partition strategy. A novel technique balancing usage of local and global information during partitioning is proposed. A new procedure for finding lower bounds of the objective function over hyperintervals is also considered. It is demonstrated by extensive numerical experiments performed on more than 1600 multidimensional test functions that the new algorithm shows a very promising performance.
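
For intuition, the simplest lower bound of this diagonal flavor, stated along the main diagonal of a hyperinterval and assuming a known Lipschitz constant L (the paper's actual procedure uses estimated constants and an adjusted bound over the whole hyperinterval):

```latex
% f evaluated at the two diagonal vertices a and b; the bound holds for
% every x on the diagonal segment from a to b:
\min_{x \in [a,\,b]} f(x) \;\ge\; \frac{f(a) + f(b)}{2} \;-\; \frac{L}{2}\,\lVert b - a \rVert
```
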

157 citations

Journal Article•DOI•
TL;DR: It is shown that the largest slope in a fixed-size sample of slopes has an approximate Reverse Weibull distribution; this distribution is fitted to the largest slopes and its location parameter is used as an estimator of the Lipschitz constant.
Abstract: A number of global optimisation algorithms rely on the value of the Lipschitz constant of the objective function. In this paper we present a stochastic method for estimating the Lipschitz constant. We show that the largest slope in a fixed size sample of slopes has an approximate Reverse Weibull distribution. Such a distribution is fitted to the largest slopes and the location parameter used as an estimator of the Lipschitz constant. Numerical results are presented.
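
The estimation recipe can be sketched as follows, using SciPy's weibull_max (the Reverse Weibull); the block size, sample counts, and function names are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy.stats import weibull_max

def estimate_lipschitz(f, bounds, n_blocks=100, block_size=50, seed=0):
    """Stochastic Lipschitz-constant estimate via extreme-value fitting.

    Samples random point pairs, computes slopes |f(x)-f(y)| / ||x-y||,
    records the largest slope in each fixed-size block, fits a Reverse
    Weibull to those block maxima, and returns the fitted location
    parameter (the distribution's upper endpoint) as the estimate.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    maxima = []
    for _ in range(n_blocks):
        X = lo + rng.random((block_size, lo.size)) * (hi - lo)
        Y = lo + rng.random((block_size, lo.size)) * (hi - lo)
        fx = np.array([f(x) for x in X])
        fy = np.array([f(y) for y in Y])
        slopes = np.abs(fx - fy) / np.linalg.norm(X - Y, axis=1)
        maxima.append(slopes.max())
    shape, loc, scale = weibull_max.fit(maxima)
    return loc

# Example: the true Lipschitz constant of sin on [0, 2*pi] is 1
L_hat = estimate_lipschitz(lambda x: float(np.sin(x[0])),
                           bounds=[(0.0, 2 * np.pi)])
```
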

145 citations

Journal Article•DOI•
TL;DR: An algorithm for finding the global maximum of a multimodal, multivariate function for which derivatives are available that assumes a bound on the second derivatives of the function and uses this to construct an upper envelope.
Abstract: We present an algorithm for finding the global maximum of a multimodal, multivariate function for which derivatives are available. The algorithm assumes a bound on the second derivatives of the function and uses this to construct an upper envelope. Successive function evaluations lower this envelope until the value of the global maximum is known to the required degree of accuracy. The algorithm has been implemented in RATFOR and execution times for standard test functions are presented at the end of the paper.
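
A minimal one-dimensional sketch of this envelope idea, assuming a known bound M on |f''| and locating the envelope's maximum on a grid for simplicity; the paper's RATFOR implementation is multivariate and more careful.

```python
import numpy as np

def second_derivative_envelope_max(f, df, a, b, M, n_iter=30, grid=2001):
    """Sketch of maximization under a bound |f''| <= M (1-D version).

    Each sample (x_i, f(x_i), f'(x_i)) yields the quadratic upper bound
        f(x) <= f(x_i) + f'(x_i)(x - x_i) + (M/2)(x - x_i)^2,
    and the upper envelope is the pointwise minimum of these quadratics.
    """
    xs = np.linspace(a, b, grid)
    X = [0.5 * (a + b)]
    y, g = [f(X[0])], [df(X[0])]
    for _ in range(n_iter):
        U = np.min([yi + gi * (xs - xi) + 0.5 * M * (xs - xi) ** 2
                    for xi, yi, gi in zip(X, y, g)], axis=0)
        x_next = xs[np.argmax(U)]          # evaluate where the envelope peaks
        X.append(x_next); y.append(f(x_next)); g.append(df(x_next))
        if U.max() - max(y) < 1e-6:        # envelope certifies the maximum
            break
    i = int(np.argmax(y))
    return X[i], y[i]

# Example: f = sin with |f''| <= 1 on [0, 2*pi]
x_star, f_star = second_derivative_envelope_max(np.sin, np.cos,
                                                0.0, 2 * np.pi, M=1.0)
```
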

142 citations

Journal Article•DOI•
TL;DR: A review of the methods for global optimization reveals that most methods have been developed for unconstrained problems and need to be extended to general constrained problems because most of the engineering applications have constraints.
Abstract: A review of the methods for global optimization reveals that most methods have been developed for unconstrained problems. They need to be extended to general constrained problems because most of the engineering applications have constraints. Some of the methods can be easily extended while others need further work. It is also possible to transform a constrained problem to an unconstrained one by using penalty or augmented Lagrangian methods and solve the problem that way. Some of the global optimization methods find all the local minimum points while others find only a few of them. In any case, all the methods require a very large number of calculations. Therefore, the computational effort to obtain a global solution is generally substantial. The methods for global optimization can be divided into two broad categories: deterministic and stochastic. Some deterministic methods are based on certain assumptions on the cost function that are not easy to check. These methods are not very useful since they are not applicable to general problems. Other deterministic methods are based on certain heuristics which may not lead to the true global solution. Several stochastic methods have been developed as some variation of the pure random search. Some methods are useful for only discrete optimization problems while others can be used for both discrete and continuous problems. Main characteristics of each method are identified and discussed. The selection of a method for a particular application depends on several attributes, such as types of design variables, whether or not all local minima are desired, and availability of gradients of all the functions.
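
As a concrete instance of the penalty transformation mentioned in this review, the sketch below converts inequality constraints into a quadratic penalty term so that any unconstrained global method can be applied; the fixed weight rho and all names are illustrative.

```python
import numpy as np

def quadratic_penalty(f, g_list, rho):
    """Turn min f(x) s.t. g_i(x) <= 0 into an unconstrained objective
    by adding rho * sum(max(0, g_i(x))^2) for the violated constraints.
    """
    def penalized(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in g_list)
        return f(x) + rho * violation
    return penalized

# Example: minimize x^2 + y^2 subject to x + y >= 1 (i.e. 1 - x - y <= 0)
obj = quadratic_penalty(lambda x: float(x @ x),
                        [lambda x: 1.0 - x[0] - x[1]], rho=100.0)
print(obj(np.array([0.5, 0.5])))   # feasible point: no penalty added
```
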

129 citations


Cites methods from "An algorithm for finding the global maximum of a multimodal, multivariate function"

  • ...Another group of methods is due to Danilin and Pijavskij (1967), Pijavskij (1967), and Shubert (1972), with modifications due to Mladineo (1986) and Pintér (1986a,b)....


References
Journal Article•DOI•
01 Mar 1953
TL;DR: The problem formulated below was motivated by that of determining an interval containing the point at which a unimodal function on the unit interval possesses a maximum, without postulating regularity conditions involving continuity, derivatives, etc.
Abstract: The problem formulated below was motivated by that of determining an interval containing the point at which a unimodal function on the unit interval possesses a maximum, without postulating regularity conditions involving continuity, derivatives, etc. Our solution is to give, for every ε > 0 and every specified number N of values of the argument at which the function may be observed, a procedure which is ε-minimax (see (1) below) among the class of all sequential nonrandomized procedures which terminate by giving an interval containing the required point, where the payoff of the computer to nature is the length of this final interval. (The same result holds if, e.g., we consider all nonrandomized procedures and let the payoff be length of interval plus c or 0 according to whether the interval fails to cover or covers the desired point, where c ≥ 1/U_{N+1}, the latter being defined below.) The analogous problem where errors are present in the observations was considered in [1], but no optimum results are yet known for that more difficult case. Search for a maximum is a "second-order" search in the sense that information is given by pairs of observations.
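
Kiefer's procedure is the Fibonacci search; its limiting form, golden-section search, is simpler to state and is sketched below for a unimodal maximum. This is an illustration of the idea, not Kiefer's exact ε-minimax rule.

```python
import math

def golden_section_max(f, a, b, tol=1e-8):
    """Golden-section search for the maximum of a unimodal f on [a, b].

    The limiting form of Kiefer's Fibonacci search: each step reuses one
    interior probe and shrinks the bracket by the golden ratio, so no
    regularity beyond unimodality is assumed.
    """
    inv_phi = (math.sqrt(5) - 1) / 2           # ~0.618
    x1 = b - inv_phi * (b - a)
    x2 = a + inv_phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                            # maximizer lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + inv_phi * (b - a)
            f2 = f(x2)
        else:                                  # maximizer lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - inv_phi * (b - a)
            f1 = f(x1)
    return 0.5 * (a + b)

# Example: unimodal with maximum at x = 0.3
x_star = golden_section_max(lambda x: -(x - 0.3) ** 2, 0.0, 1.0)
```
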

999 citations


"An algorithm for finding the global..." refers methods in this paper

  • ...Many optimization methods have been developed which minimize the largest interval of uncertainty [2, 6]. In our case this becomes meaningless....


Journal Article•DOI•

525 citations

Book•
01 Jan 1964

524 citations

Journal Article•DOI•
TL;DR: In this article, a sequential search method for finding the global maximum of an objective function is proposed, which is applicable to a single variable defined on a closed interval and such that some bound on its rate of change is available.
Abstract: In this paper a sequential search method for finding the global maximum of an objective function is proposed. The method is applicable to an objective function of a single variable defined on a closed interval and such that some bound on its rate of change is available. The method is shown to be minimax. Computational aspects of the method are also discussed.
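
A compact sketch of this one-dimensional method, assuming a known rate-of-change bound L; the intersection formulas for adjacent cones follow from the Lipschitz condition, while the tolerance and names are illustrative.

```python
import numpy as np

def shubert_max(f, a, b, L, tol=1e-6, max_iter=1000):
    """Shubert-style search for the global maximum of f on [a, b],
    assuming |f(x) - f(y)| <= L |x - y|.

    With sorted samples, the cones f(x_i) + L|x - x_i| from an adjacent
    pair (x_i, x_j) intersect at
        peak  x* = (x_i + x_j)/2 + (f(x_j) - f(x_i)) / (2L),
        value F* = (f(x_i) + f(x_j))/2 + L (x_j - x_i)/2,
    a valid upper bound for f on [x_i, x_j]. Each iteration evaluates f
    at the peak of the interval with the largest bound.
    """
    xs, ys = [a, b], [f(a), f(b)]
    for _ in range(max_iter):
        peaks = [((ys[i] + ys[i + 1]) / 2 + L * (xs[i + 1] - xs[i]) / 2,
                  (xs[i] + xs[i + 1]) / 2 + (ys[i + 1] - ys[i]) / (2 * L))
                 for i in range(len(xs) - 1)]
        bound, x_new = max(peaks)
        if bound - max(ys) < tol:          # bound certifies the maximum
            break
        i = int(np.searchsorted(xs, x_new))   # keep samples sorted
        xs.insert(i, x_new)
        ys.insert(i, f(x_new))
    j = int(np.argmax(ys))
    return xs[j], ys[j]

# Example: maximize sin(x) + sin(3x)/3 on [0, 6]; L = 2 bounds |f'|
x_star, f_star = shubert_max(lambda x: np.sin(x) + np.sin(3 * x) / 3,
                             0.0, 6.0, L=2.0)
```
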

417 citations


"An algorithm for finding the global..." refers background or methods in this paper

  • ...The rate of this convergence is worst for a constant function [4]; for such a function our search rule amounts to a grid search....


  • ...Pijavskii [3] and, independently, of B. Shubert [4] for a function of one variable....


  • ...The proof that the algorithm is minimax [4] or optimal in one step [5] in the class, S, of all sequential sampling rules with respect to the estimate error, φ − φ_n, is similar to Shubert's....


  • ...is non-increasing by construction: M_i is chosen as max F_i, and graph F_{i+1} consists of erecting a cone at (x_i, f(x_i)) on the surface of graph F_i, thus approximating graph f even closer; thus M_{i+1} = max F_{i+1} ≤ max F_i = M_i. The proof is similar to that in [4], but with some changes and is included for the sake of completeness....


  • ...In using this sampling rule for computations, one must find the maximum of F_n at step n. For N = 1, this is easy [4]....


Journal Article•DOI•
TL;DR: A general algorithm for finding the absolute minimum of a function to a given accuracy is described and special aspects of its application are illustrated by examples involving functions of one or more variables, satisfying a Lipschitz condition.
Abstract: A general algorithm for finding the absolute minimum of a function to a given accuracy is described, and special aspects of its application are illustrated by examples involving functions of one or more variables satisfying a Lipschitz condition.
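
In the minimization form used here, the method's piecewise-conical minorant and sampling rule can be written as follows (assuming a Lipschitz constant L for f):

```latex
% After evaluations at x_1, ..., x_n the minorant and next point are
F_n(x) \;=\; \max_{1 \le i \le n} \bigl( f(x_i) - L\,\lVert x - x_i \rVert \bigr) \;\le\; f(x),
\qquad x_{n+1} \;=\; \operatorname*{arg\,min}_{x} F_n(x)
```
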

311 citations


"An algorithm for finding the global..." refers background in this paper

  • ...Pijavskii [3] and, independently, of B. Shubert [4] for a function of one variable....
