
Showing papers by "John E. Dennis published in 2004"


Journal ArticleDOI
TL;DR: This paper formulates and analyzes a pattern search method for general constrained optimization based on filter methods for step acceptance that preserves the division into SEARCH and local POLL steps, which allows the explicit use of inexpensive surrogates or random search heuristics in the SEARCH step.
Abstract: This paper formulates and analyzes a pattern search method for general constrained optimization based on filter methods for step acceptance. Roughly, a filter method accepts a step that improves either the objective function value or the value of some function that measures the constraint violation. The new algorithm does not compute or approximate any derivatives, penalty constants, or Lagrange multipliers. A key feature of the new algorithm is that it preserves the division into SEARCH and local POLL steps, which allows the explicit use of inexpensive surrogates or random search heuristics in the SEARCH step. It is shown here that the algorithm identifies limit points at which optimality conditions depend on local smoothness of the functions and, to a greater extent, on the choice of a certain set of directions. Stronger optimality conditions are guaranteed for smoother functions and, in the constrained case, for a fortunate choice of the directions on which the algorithm depends. These directional conditions generalize those given previously for linear constraints, but they do not require a feasible starting point. In the absence of general constraints, the proposed algorithm and its convergence analysis generalize previous work on unconstrained, bound constrained, and linearly constrained generalized pattern search. The algorithm is illustrated on some test examples and on an industrial wing planform engineering design application.
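The filter acceptance rule described in the abstract can be sketched roughly as follows (an illustrative simplification, not the authors' exact algorithm; the function names and the tie-breaking convention on equal values are assumptions made for the sketch):

```python
# Sketch of a filter for step acceptance: a trial point with objective value
# f and constraint-violation measure h is accepted if no stored (f, h) pair
# dominates it, i.e. if it improves either f or h against every filter entry.

def dominates(a, b):
    """Pair a = (f_a, h_a) dominates b if it is no worse in both measures."""
    return a[0] <= b[0] and a[1] <= b[1]

def filter_accept(filter_pairs, f_trial, h_trial):
    """Accept the trial point unless some filter entry dominates it."""
    trial = (f_trial, h_trial)
    return not any(dominates(entry, trial) for entry in filter_pairs)

def filter_add(filter_pairs, f_trial, h_trial):
    """Insert an accepted pair and drop entries that it now dominates."""
    trial = (f_trial, h_trial)
    kept = [e for e in filter_pairs if not dominates(trial, e)]
    kept.append(trial)
    return kept
```

For example, against a filter containing (f, h) = (1.0, 0.5), the trial (0.8, 0.6) is accepted because it improves the objective, while (1.2, 0.7) is rejected because it is worse in both measures. Real filter methods use strict improvement margins (envelopes) around filter entries to guarantee convergence; those are omitted here.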

315 citations


Journal ArticleDOI
TL;DR: In this paper, a non-gradient-based pattern search method is used for shape optimization to minimize aerodynamic noise in a laminar flow past an acoustically compact airfoil.
Abstract: Shape optimization is applied to time-dependent trailing-edge flow in order to minimize aerodynamic noise. Optimization is performed using the surrogate management framework (SMF), a non-gradient-based pattern search method chosen for its efficiency and rigorous convergence properties. Using SMF, design space exploration is performed not with the expensive actual function but with an inexpensive surrogate function. The use of a polling step in the SMF guarantees that the algorithm generates a convergent subsequence of mesh points in the parameter space. Each term of this subsequence is a weak local minimizer of the cost function on the mesh in a sense to be made precise later. We will discuss necessary optimality conditions for the design problem that are satisfied by the limit of this subsequence. Results are presented for an unsteady laminar flow past an acoustically compact airfoil. Constraints on lift and drag are handled within SMF by applying the filter pattern search method of Audet and Dennis, within which a penalty function is used to form and optimize a surrogate function. Optimal shapes that minimize noise have been identified for the trailing-edge problem in constrained and unconstrained cases. Results show a significant reduction (as much as 80%) in acoustic power with reasonable computational cost using several shape parameters. Physical mechanisms for noise reduction are discussed.
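The SEARCH/POLL structure that the surrogate management framework shares with pattern search can be sketched as follows (a minimal illustration under simplifying assumptions: the function names and the mesh-refinement factor are invented for the sketch, and a real SMF would fit and re-optimize a surrogate model of the expensive function in the SEARCH step):

```python
# Skeleton of a search/poll pattern search loop: a cheap SEARCH step proposes
# a mesh point (here via a user-supplied surrogate minimizer); if it fails to
# improve, the POLL step evaluates the expensive objective at neighboring
# mesh points, and the mesh is refined when polling also fails.

def pattern_search(f, surrogate_min, x, step, directions, tol=1e-3):
    fx = f(x)
    while step > tol:
        # SEARCH step: the surrogate suggests a candidate mesh point (or None).
        cand = surrogate_min(x, step)
        if cand is not None:
            fc = f(cand)
            if fc < fx:
                x, fx = cand, fc
                continue
        # POLL step: evaluate the true objective at mesh neighbors x + step*d.
        improved = False
        for d in directions:
            y = tuple(xi + step * di for xi, di in zip(x, d))
            fy = f(y)
            if fy < fx:
                x, fx, improved = y, fy, True
                break
        if not improved:
            step *= 0.5  # refine the mesh when the poll finds no improvement
    return x, fx
```

With no surrogate at all (`surrogate_min` always returning `None`) this reduces to a plain pattern search driven only by the poll step, which is the part that carries the convergence guarantees mentioned in the abstract.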

136 citations


ReportDOI
21 Jun 2004
TL;DR: This class combines and extends the Audet-Dennis Generalized Pattern Search algorithms for bound constrained mixed variable optimization, and their GPS-filter algorithms for general nonlinear constraints, and is believed to be the first algorithm with provable convergence results to directly target this class of problems.
Abstract: A new class of algorithms for solving nonlinearly constrained mixed variable optimization problems is presented. This class combines and extends the Audet-Dennis Generalized Pattern Search (GPS) algorithms for bound constrained mixed variable optimization, and their GPS-filter algorithms for general nonlinear constraints. In generalizing existing algorithms, new theoretical convergence results are presented that reduce seamlessly to existing results for more specific classes of problems. While no local continuity or smoothness assumptions are required to apply the algorithm, a hierarchy of theoretical convergence results based on the Clarke calculus is given, in which local smoothness dictates what can be proved about certain limit points generated by the algorithm. To demonstrate the usefulness of the algorithm, the algorithm is applied to the design of a load-bearing thermal insulation system. We believe this is the first algorithm with provable convergence results to directly target this class of problems.

67 citations


Journal ArticleDOI
TL;DR: It is shown that rather rough approximations to the gradient are sufficient to reduce the pollstep to a single function evaluation, and it is proved that using these less expensive pollsteps does not weaken the known convergence properties of the method, all of which depend only on the pollstep.
Abstract: A common question asked by users of direct search algorithms is how to use derivative information at iterates where it is available. This paper addresses that question with respect to Generalized Pattern Search (GPS) methods for unconstrained and linearly constrained optimization. Specifically, this paper concentrates on the GPS pollstep. Polling is done to certify the need to refine the current mesh, and it requires O(n) function evaluations in the worst case. We show that the use of derivative information significantly reduces the maximum number of function evaluations necessary for pollsteps, even to a worst case of a single function evaluation with certain algorithmic choices given here. Furthermore, we show that rather rough approximations to the gradient are sufficient to reduce the pollstep to a single function evaluation. We prove that using these less expensive pollsteps does not weaken the known convergence properties of the method, all of which depend only on the pollstep.
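The key idea in the abstract, that even a rough gradient approximation lets the poll skip directions, can be illustrated as follows (a hedged sketch, not the paper's exact rule; the function name is an assumption). A descent direction d must satisfy g·d < 0 for the approximate gradient g, so directions with g·d ≥ 0 can be discarded before any expensive function evaluation:

```python
# Prune poll directions using an approximate gradient g: only directions
# making a negative inner product with g can be descent directions for the
# model, so the others need not be evaluated. With the maximal positive
# basis {+e_i, -e_i}, at most half the 2n directions survive; the sharper
# constructions in the paper reduce the poll to a single evaluation.

def pruned_poll_directions(g, directions):
    """Keep only poll directions d with g . d < 0."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return [d for d in directions if dot(g, d) < 0]
```

For g = (1, -2) and the coordinate directions ±e1, ±e2, only -e1 and +e2 survive, so the poll cost drops from four evaluations to at most two in this toy case.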

61 citations


Journal ArticleDOI
TL;DR: In this paper, the authors describe the application of a derivative-free optimization technique, the surrogate management framework (SMF), for designing the shape of an airfoil trailing edge which minimizes the noise of vortex shedding.
Abstract: In this Letter we describe the application of a derivative-free optimization technique, the surrogate management framework (SMF), for designing the shape of an airfoil trailing edge which minimizes the noise of vortex shedding. Constraints on lift and drag are enforced within SMF using a filter. Several optimal shapes have been identified for the case of laminar vortex shedding with reasonable computational cost using several shape parameters, and results show a significant reduction in acoustic power. Physical mechanisms for noise reduction are discussed.

37 citations


Journal ArticleDOI
TL;DR: In this article, a filter-based method for nonlinear optimization problems with nonlinear inequality constraints is presented, under the simplifying assumption that the active constraint normals are linearly independent at all points of interest on the boundary of the feasible region.
Abstract: A direct search method for nonlinear optimization problems with nonlinear inequality constraints is presented. A filter-based approach is used, which allows infeasible starting points. The constraints are assumed to be continuously differentiable, and approximations to the constraint gradients are used. For simplicity it is assumed that the active constraint normals are linearly independent at all points of interest on the boundary of the feasible region. An infinite sequence of iterates is generated, some of which are surrounded by sets of points called bent frames. An infinite subsequence of these iterates is identified, and its convergence properties are studied by applying Clarke's non-smooth calculus to the bent frames. It is shown that each cluster point of this subsequence is a Karush-Kuhn-Tucker point of the optimization problem under mild conditions which include strict differentiability of the objective function at each cluster point. This permits the objective function to be non-smooth, infinite, or undefined away from these cluster points. When the objective function is only locally Lipschitz at these cluster points it is shown that certain directions still have interesting properties at these cluster points.

22 citations


01 Jan 2004
TL;DR: The mesh adaptive direct search (MADS) algorithm as discussed by the authors extends the generalized pattern search (GPS) class by allowing local exploration, called polling, in an asymptotically dense set of directions in the space of optimization variables.
Abstract: This paper addresses the problem of minimization of a nonsmooth function under general nonsmooth constraints when no derivatives of the objective or constraint functions are available. We introduce the mesh adaptive direct search (MADS) class of algorithms which extends the generalized pattern search (GPS) class by allowing local exploration, called polling, in an asymptotically dense set of directions in the space of optimization variables. This means that under certain hypotheses, including a weak constraint qualification due to Rockafellar, MADS can treat constraints by the extreme barrier approach of setting the objective to infinity for infeasible points and treating the problem as unconstrained. The main GPS convergence result is to identify limit points $\hat{x}$, where the Clarke generalized derivatives are nonnegative in a finite set of directions, called refining directions. Although in the unconstrained case, nonnegative combinations of these directions span the whole space, the fact that there can only be finitely many GPS refining directions limits rigorous justification of the barrier approach to finitely many linear constraints for GPS. The main result of this paper is that the general MADS framework is flexible enough to allow the generation of an asymptotically dense set of refining directions along which the Clarke derivatives are nonnegative. We propose an instance of MADS for which the refining directions are dense in the hypertangent cone at $\hat{x}$ with probability 1 whenever the iterates associated with the refining directions converge to a single $\hat{x}$. The instance of MADS is compared to versions of GPS on some test problems. We also illustrate the limitation of our results with examples.
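The extreme barrier approach mentioned in the abstract is simple to state in code (a sketch; the wrapper name is an assumption): infeasible points are assigned the value +∞, so a method that only accepts strict decreases never leaves the feasible set and the constrained problem can be treated as unconstrained.

```python
import math

# Extreme barrier wrapper: the objective is set to +infinity outside the
# feasible set Omega, so the constraint functions themselves never need
# derivatives and infeasible trial points are never accepted.

def extreme_barrier(f, is_feasible):
    """Return f_Omega(x) = f(x) if x is feasible, +inf otherwise."""
    def f_omega(x):
        return f(x) if is_feasible(x) else math.inf
    return f_omega
```

As the abstract notes, the rigor of this approach hinges on the poll directions: GPS produces only finitely many refining directions, while MADS can make them asymptotically dense in the hypertangent cone, which is what justifies the barrier for general nonsmooth constraints.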

13 citations