Topic
Discrete optimization
About: Discrete optimization is a research topic. Over its lifetime, 4598 publications have been published within this topic, receiving 158297 citations. The topic is also known as: discrete optimisation.
Papers published on a yearly basis
Papers
TL;DR: Generalized simulated annealing is an optimization procedure for locating the global optimum (maximum or minimum) of multidimensional continuous functions, here modified for discrete functions and applied to selecting calibration samples for near-infrared spectra.
Abstract: Generalized simulated annealing (GSA) is an optimization procedure for locating the global optimum (maximum or minimum) of multidimensional continuous functions. GSA has been modified for optimization of discrete functions. Selecting calibration samples from an existing set is a discrete optimization problem, and GSA is used to select optimal sets of calibration samples for specific analysis samples. The procedure is applied to near-infrared spectra. When compared to using the complete set of 37 calibration samples, concentration prediction errors were reduced 50%–100% by using select sets of two to seven calibration samples. Additionally, GSA was able to improve a poorly designed experiment: it devised augmented experimental designs such that the overall experimental design (original plus augmented) was more orthogonal than the original.
43 citations
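The discrete subset search described above can be sketched with a plain simulated-annealing loop over fixed-size subsets. This is a minimal illustration of annealing with swap moves, not the paper's GSA variant; the function names and the toy objective are made up for the example.

```python
import math
import random

def anneal_subset(candidates, k, cost, iters=2000, t0=1.0, cooling=0.995, seed=0):
    """Simulated-annealing sketch for choosing a k-element subset that
    minimizes cost(subset) -- a stand-in for a calibration-set search."""
    rng = random.Random(seed)
    current = rng.sample(candidates, k)
    cur_c = cost(current)
    best, best_c = list(current), cur_c
    t = t0
    for _ in range(iters):
        # Propose a neighbor: swap one selected sample for an unselected one.
        pool = [c for c in candidates if c not in current]
        if not pool:
            break
        proposal = list(current)
        proposal[rng.randrange(k)] = rng.choice(pool)
        new_c = cost(proposal)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if new_c < cur_c or rng.random() < math.exp((cur_c - new_c) / max(t, 1e-12)):
            current, cur_c = proposal, new_c
            if cur_c < best_c:
                best, best_c = list(current), cur_c
        t *= cooling  # geometric cooling schedule
    return best, best_c

# Toy objective: pick the 3 numbers whose sum is closest to 10.
subset, err = anneal_subset(list(range(1, 20)), 3, lambda s: abs(sum(s) - 10))
```

The same skeleton applies to calibration-sample selection by replacing the toy cost with a prediction-error estimate for the candidate calibration set.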
01 Jan 2004
TL;DR: Ideas from combinatorial optimization are applied to find globally optimal solutions to continuous variational problems, via an algorithm that solves for globally optimal discrete minimal surfaces.
Abstract: In this paper, we apply the ideas from combinatorial optimization to find globally optimal solutions to continuous variational problems. At the heart of our method is an algorithm to solve for globally optimal discrete minimal surfaces. This discrete surface problem is a natural generalization of the planar-graph shortest path problem.
43 citations
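The planar-graph shortest path problem that the discrete minimal-surface method generalizes can be illustrated with Dijkstra's algorithm. This is only the base case, not the paper's surface algorithm; the graph format (node mapped to a list of neighbor/weight pairs) is an assumption for the example.

```python
import heapq

def dijkstra(graph, src, dst):
    """Globally optimal shortest path by Dijkstra's algorithm.
    graph maps node -> list of (neighbor, weight) pairs."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    # Walk predecessor links back from the destination.
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dst]

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 1)], "c": []}
path, d = dijkstra(g, "a", "c")
```

A shortest path is a globally minimal 1-dimensional "surface" between two boundary points; the paper lifts this guarantee one dimension up.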
09 Jul 2002
TL;DR: Two methods for forming reduced models to speed up genetic-algorithm-based optimization are compared: one makes the genetic operators more informed, the other genetically engineers some individuals instead of using the regular Darwinian evolution approach.
Abstract: In this paper we compare two methods for forming reduced models to speed up genetic-algorithm-based optimization. The methods work by forming functional approximations of the fitness function which are used to speed up the GA optimization. One method speeds up the optimization by making the genetic operators more informed. The other method speeds up the optimization by genetically engineering some individuals instead of using the regular Darwinian evolution approach. Empirical results in several engineering design domains are presented.
43 citations
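The "informed operators" idea above can be sketched as a GA in which each mutation generates several candidate offspring, screens them with a cheap surrogate model, and spends the expensive fitness call only on the most promising one. This is an illustrative scheme under assumed toy objectives, not the paper's exact method.

```python
import random

def surrogate_ga(true_fitness, surrogate, dim, pop_size=20, gens=30, trials=5, seed=1):
    """Surrogate-assisted GA sketch (minimization). The surrogate pre-screens
    mutations so each new individual costs only one true evaluation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    scores = [true_fitness(x) for x in pop]
    for _ in range(gens):
        new_pop = []
        for _ in range(pop_size):
            # Binary tournament selection of a parent on true fitness.
            i, j = rng.randrange(pop_size), rng.randrange(pop_size)
            parent = pop[i] if scores[i] < scores[j] else pop[j]
            # Informed mutation: draw several offspring, keep the one the
            # cheap surrogate ranks best.
            cands = [[g + rng.gauss(0, 0.3) for g in parent] for _ in range(trials)]
            new_pop.append(min(cands, key=surrogate))
        pop = new_pop
        scores = [true_fitness(x) for x in pop]
    k = min(range(pop_size), key=lambda i: scores[i])
    return pop[k], scores[k]

sphere = lambda x: sum(g * g for g in x)   # stand-in for an expensive objective
cheap = lambda x: sum(abs(g) for g in x)   # cheap, correlated surrogate
x_best, f_best = surrogate_ga(sphere, cheap, dim=3)
```

The speedup comes from the ratio of surrogate calls to true evaluations: here each generation makes `trials` times as many cheap calls as expensive ones.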
TL;DR: A direct-search derivative-free Matlab optimizer for bound-constrained problems is described, whose remarkable features are its ability to handle a mix of continuous and discrete variables, a versatile interface as well as a novel self-training option.
Abstract: A direct-search derivative-free Matlab optimizer for bound-constrained problems is described, whose remarkable features are its ability to handle a mix of continuous and discrete variables, a versatile interface as well as a novel self-training option. Its performance compares favorably with that of NOMAD (Nonsmooth Optimization by Mesh Adaptive Direct Search), a well-known derivative-free optimization package. It is also applicable to multilevel equilibrium- or constrained-type problems. Its easy-to-use interface provides a number of user-oriented features, such as checkpointing and restart, variable scaling, and early termination tools.
43 citations
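A direct-search poll over mixed continuous and discrete variables, as described above, can be sketched as follows. This is an unconstrained coordinate-poll illustration with mesh refinement in the spirit of mesh adaptive direct search, not the interface or algorithm of the Matlab optimizer in the paper; the function names and toy problem are invented for the example.

```python
def mixed_pattern_search(f, x_cont, x_disc, disc_values, step=1.0, min_step=1e-3):
    """Direct-search sketch: poll each continuous coordinate with +/- step
    moves and each discrete coordinate over its neighboring allowed values;
    halve the step when no poll improves (mesh refinement)."""
    x_cont, x_disc = list(x_cont), list(x_disc)
    fx = f(x_cont, x_disc)
    while step > min_step:
        improved = False
        # Poll continuous coordinates.
        for i in range(len(x_cont)):
            for d in (step, -step):
                trial = list(x_cont)
                trial[i] += d
                ft = f(trial, x_disc)
                if ft < fx:
                    x_cont, fx, improved = trial, ft, True
        # Poll discrete coordinates over adjacent allowed values.
        for i, vals in enumerate(disc_values):
            k = vals.index(x_disc[i])
            for nk in (k - 1, k + 1):
                if 0 <= nk < len(vals):
                    trial = list(x_disc)
                    trial[i] = vals[nk]
                    ft = f(x_cont, trial)
                    if ft < fx:
                        x_disc, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5  # refine the mesh and poll again
    return x_cont, x_disc, fx

# Toy problem: minimize (c - 1.5)^2 + (d - 2)^2 with d restricted to 0..4.
obj = lambda c, d: (c[0] - 1.5) ** 2 + (d[0] - 2) ** 2
xc, xd, val = mixed_pattern_search(obj, [0.0], [0], [list(range(5))])
```

Because the poll uses only function values, no derivatives are needed, which is what makes this family of methods applicable to mixed-variable and black-box problems.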