
Global Optimization for Constrained Nonlinear Programming

Tao Wang
TL;DR
Constrained simulated annealing (CSA), a global optimization algorithm that asymptotically converges to constrained global minima (CGM) with probability one, is developed for solving discrete constrained nonlinear programming problems (NLPs).
Abstract
In this thesis, we develop constrained simulated annealing (CSA), a global optimization algorithm that asymptotically converges to constrained global minima (CGM) with probability one, for solving discrete constrained nonlinear programming problems (NLPs). The algorithm is based on the necessary and sufficient condition for constrained local minima (CLM) in the theory of discrete constrained optimization using Lagrange multipliers developed in our group. The theory proves the equivalence between the set of discrete saddle points and the set of CLM, leading to the first-order necessary and sufficient condition for CLM. To find a CGM, CSA searches for a discrete saddle point with the minimum objective value by carrying out both probabilistic descents in the original-variable space of a discrete augmented Lagrangian function and probabilistic ascents in the Lagrange-multiplier space. We prove that CSA converges asymptotically to a CGM with probability one. We also extend CSA to solve continuous and mixed-integer constrained NLPs. By achieving asymptotic convergence, CSA represents one of the major developments in nonlinear constrained global optimization today and complements simulated annealing (SA) in unconstrained global optimization. We have further studied various strategies of CSA and their trade-offs for solving continuous, discrete, and mixed-integer NLPs. The strategies evaluated include adaptive neighborhoods, distributions to control sampling, acceptance probabilities, and cooling schedules. An optimization software package based on CSA and its various strategies has been implemented. Finally, we apply CSA to solve a collection of engineering application benchmarks and to design filters for subband image coding. Much better results are obtained in comparison with other existing methods.
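
The abstract describes CSA as alternating probabilistic descents in the variable space of a discrete augmented Lagrangian with probabilistic ascents in the Lagrange-multiplier space under a cooling schedule. The sketch below illustrates that alternation; the function names, the penalty form f(x) + sum_i lambda_i*|h_i(x)|, the 50/50 move split, and the geometric cooling schedule are assumptions made for illustration, not the thesis's exact formulation (whose convergence proof relies on a specific schedule).

```python
import math
import random

def csa_sketch(f, h, neighbor_x, neighbor_lam, x0, lam0,
               T0=10.0, alpha=0.9, moves_per_T=100, T_min=1e-4):
    """Illustrative sketch of a CSA-style search.  f(x) is the objective,
    h(x) returns a list of constraint violations (zero when satisfied),
    and neighbor_x / neighbor_lam generate candidate moves in the variable
    and Lagrange-multiplier spaces.  All names and defaults are assumptions."""
    def L(x, lam):
        # Augmented-Lagrangian-style merit function: f(x) + sum_i lam_i * |h_i(x)|
        return f(x) + sum(l * abs(v) for l, v in zip(lam, h(x)))

    x, lam, T = x0, list(lam0), T0
    best_x, best_f = None, float("inf")
    while T > T_min:
        for _ in range(moves_per_T):
            if random.random() < 0.5:
                # Probabilistic descent in the original-variable space.
                cand = neighbor_x(x)
                d = L(cand, lam) - L(x, lam)
                if d <= 0 or random.random() < math.exp(-d / T):
                    x = cand
            else:
                # Probabilistic ascent in the Lagrange-multiplier space.
                cand = neighbor_lam(lam)
                d = L(x, cand) - L(x, lam)
                if d >= 0 or random.random() < math.exp(d / T):
                    lam = cand
            # Track the best feasible point seen so far.
            if all(abs(v) < 1e-12 for v in h(x)) and f(x) < best_f:
                best_x, best_f = x, f(x)
        T *= alpha  # geometric cooling (assumed; the proof uses a slower schedule)
    return best_x
```

The even split between variable moves and multiplier moves, like the other parameters above, is only a placeholder; the thesis studies exactly these strategies (neighborhoods, sampling distributions, acceptance probabilities, and cooling schedules) and their trade-offs.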


Citations

Global optimization and simulated annealing

TL;DR: The mathematical formulation of the simulated annealing algorithm is extended to continuous optimization problems, and asymptotic convergence to the set of global optima is proved.
Book Chapter

Network Support: The Radio Environment Map

TL;DR: This chapter discusses the strategy of exploiting network support in cognitive radio (CR) system architectures, introducing the radio environment map (REM) as an innovative vehicle for providing network support to CRs.
Proceedings Article

Siting and sizing of distributed generation for optimal microgrid architecture

TL;DR: In this paper, a technique is presented for determining the optimal locations and sizes of distributed generation (DG) units in a microgrid, given the network configuration and the heat and power requirements at various load points.
Journal Article

A robust approach for iterative contaminant source location and release history recovery.

TL;DR: The combination of CRLS with the global optimization solver achieved better performance than the combination of a non-robust estimator, i.e., the nonnegative least squares (NNLS) method, with the same solver.
Book Chapter

Simulated Annealing with Asymptotic Convergence for Nonlinear Constrained Global Optimization

TL;DR: Constrained simulated annealing (CSA), a global minimization algorithm that converges to constrained global minima with probability one, is presented for solving nonlinear discrete non-convex constrained minimization problems.
References
Book

Genetic algorithms in search, optimization, and machine learning

TL;DR: In this book, the authors present the computer techniques, mathematical tools, and research results that will enable both students and practitioners to apply genetic algorithms to problems in many fields, including computer programming and mathematics.
Journal Article

Optimization by Simulated Annealing

TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
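
The annealing analogy summarized above reduces, in practice, to a temperature-controlled acceptance rule inside a cooling loop; a minimal sketch is shown below. The names, the geometric schedule, and the parameter defaults are illustrative assumptions, not the paper's specification.

```python
import math
import random

def anneal(energy, neighbor, x0, T0=1.0, alpha=0.95, moves_per_T=200, T_min=1e-3):
    """Minimal simulated-annealing loop: energy(x) plays the role of the
    objective, neighbor(x) proposes a random move, and the temperature T
    is lowered geometrically.  All names and defaults are illustrative."""
    x, best, T = x0, x0, T0
    while T > T_min:
        for _ in range(moves_per_T):
            cand = neighbor(x)
            d = energy(cand) - energy(x)
            # Metropolis criterion: always accept improvements; accept
            # uphill moves with probability exp(-d / T).
            if d <= 0 or random.random() < math.exp(-d / T):
                x = cand
                if energy(x) < energy(best):
                    best = x
        T *= alpha
    return best
```
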
Journal Article

A simplex method for function minimization

TL;DR: A method is described for the minimization of a function of n variables, which depends on the comparison of function values at the (n + 1) vertices of a general simplex, followed by the replacement of the vertex with the highest value by another point.
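
The step described above, comparing values at the n + 1 vertices and replacing the worst one, can be sketched as the reflection move below; the expansion, contraction, and shrink steps of the full method are omitted, and the names are illustrative.

```python
def reflect_worst(simplex, f, alpha=1.0):
    """One reflection move of the simplex idea described above: rank the
    n + 1 vertices (each a list of n coordinates) by f, then try to replace
    the worst vertex with its reflection through the centroid of the others."""
    simplex = sorted(simplex, key=f)                      # best first, worst last
    worst, others = simplex[-1], simplex[:-1]
    n = len(worst)
    centroid = [sum(v[i] for v in others) / len(others) for i in range(n)]
    reflected = [centroid[i] + alpha * (centroid[i] - worst[i]) for i in range(n)]
    if f(reflected) < f(worst):
        simplex[-1] = reflected                           # accept the reflected point
    return simplex
```
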
Journal Article

Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces

TL;DR: In this article, a new heuristic approach for minimizing possibly nonlinear and non-differentiable continuous space functions is presented, which requires few control variables, is robust, easy to use, and lends itself very well to parallel computation.
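
The "few control variables" mentioned above are typically the population size, the differential weight F, and the crossover rate CR. The sketch below shows one generation of the common DE/rand/1/bin variant, with illustrative names and defaults rather than the authors' reference implementation.

```python
import random

def de_generation(pop, f, F=0.5, CR=0.9):
    """One generation of a DE/rand/1/bin-style update (requires len(pop) >= 4,
    where each member is a list of floats).  Names and defaults are illustrative."""
    dim = len(pop[0])
    new_pop = []
    for i, target in enumerate(pop):
        # Mutation: combine three distinct random vectors as a + F * (b - c).
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [a[d] + F * (b[d] - c[d]) for d in range(dim)]
        # Binomial crossover: mix mutant and target, forcing at least one mutant gene.
        jrand = random.randrange(dim)
        trial = [mutant[d] if (random.random() < CR or d == jrand) else target[d]
                 for d in range(dim)]
        # Greedy selection: keep whichever of trial / target scores better.
        new_pop.append(trial if f(trial) <= f(target) else target)
    return new_pop
```
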
Journal Article

Monte Carlo Sampling Methods Using Markov Chains and Their Applications

TL;DR: A generalization of the sampling method introduced by Metropolis et al. is presented, along with an exposition of the relevant theory, techniques of application, and methods and difficulties of assessing the error in Monte Carlo estimates.
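
The generalization summarized above is the acceptance ratio that corrects for an asymmetric proposal distribution; one step might be sketched as follows, using log densities for numerical stability. The function names and argument conventions are illustrative assumptions.

```python
import math
import random

def mh_step(x, log_pi, propose, log_q=None):
    """One Metropolis-Hastings step: draw y from the proposal and accept it
    with probability min(1, pi(y) q(x|y) / (pi(x) q(y|x))).  Here log_q(a, b)
    is assumed to return log q(a | b); when the proposal is symmetric
    (log_q is None), this reduces to the original Metropolis rule."""
    y = propose(x)
    log_ratio = log_pi(y) - log_pi(x)
    if log_q is not None:
        # Hastings correction for asymmetric proposals: q(x|y) / q(y|x).
        log_ratio += log_q(x, y) - log_q(y, x)
    accept_prob = math.exp(min(0.0, log_ratio))
    return y if random.random() < accept_prob else x
```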