Topic

Discrete optimization

About: Discrete optimization is a research topic. Over its lifetime, 4,598 publications have been published within this topic, receiving 158,297 citations. The topic is also known as: discrete optimisation.


Papers
Journal Article (DOI)
TL;DR: Selecting the optimal policy using simulation is subject to input model risk when input models that mimic real-world randomness in the simulation have estimation error due to finite sample sizes.

29 citations
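
To make the input-model-risk idea concrete, here is a minimal sketch (a toy newsvendor, not the paper's setting; all names and parameter values are assumptions): the demand distribution is fitted from n real-world samples, so the policy that simulation selects can change with the estimation error.

```python
# Minimal sketch of input-model risk (assumed toy newsvendor setting).
import numpy as np

rng = np.random.default_rng(0)
true_mean = 10.0                       # real-world demand mean (unknown in practice)
price, cost = 3.0, 1.0
candidates = [5, 10, 15, 20]           # candidate policies: order quantities

def sim_profit(q, demand_mean, reps=20_000):
    d = rng.exponential(demand_mean, size=reps)
    return np.mean(price * np.minimum(d, q) - cost * q)

# policy that is optimal under the real input distribution
best_true = max(candidates, key=lambda q: sim_profit(q, true_mean))

# with only n observations, the fitted input model carries estimation
# error, and the simulation study may select a different policy
for n in (10, 100, 1000):
    data = rng.exponential(true_mean, size=n)
    fitted_mean = data.mean()          # MLE of the exponential mean
    best_fitted = max(candidates, key=lambda q: sim_profit(q, fitted_mean))
    print(f"n={n:4d}  fitted mean={fitted_mean:5.2f}  "
          f"selected q={best_fitted}  (true-optimal q={best_true})")
```

The point is only that the argmax is taken under an estimated distribution; the same logic applies with any input family.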

Journal Article (DOI)
TL;DR: Theoretical properties of the proposed discrete filled function are investigated, and an algorithm for discrete global optimization is developed from the new discrete filled function.

29 citations
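
As a rough illustration of the filled-function idea (the auxiliary function G below is a generic textbook-style choice, not the function proposed in the paper): descend to a discrete local minimizer, then minimize an auxiliary function that penalizes the current basin and rewards moving away from it, and restart the descent wherever a lower point appears.

```python
# Rough sketch of a discrete filled-function method on Z^2
# (toy objective f and filled function G are illustrative assumptions).
import numpy as np

def f(x):  # toy multimodal objective on the integer lattice
    return x[0]**2 + x[1]**2 + 8*np.sin(x[0]) + 8*np.cos(x[1])

def neighbors(x):
    for i in range(len(x)):
        for step in (-1, 1):
            y = list(x); y[i] += step
            yield tuple(y)

def local_search(g, x):
    """Discrete steepest descent to a local minimizer of g."""
    while True:
        best = min(neighbors(x), key=g)
        if g(best) >= g(x):
            return x
        x = best

x_star = local_search(f, (6, 6))
for _ in range(10):                      # escape phases
    f_star = f(x_star)
    # filled function: heavily penalize f above f(x*), reward distance
    G = lambda y: (100.0 * max(f(y) - f_star, 0.0)
                   - sum(abs(a - b) for a, b in zip(y, x_star)))
    y = local_search(G, next(neighbors(x_star)))
    z = local_search(f, y)               # descend from where G settled
    if f(z) < f_star:
        x_star = z                       # found a lower basin; continue
    else:
        break
print("approximate global minimizer:", x_star, "f =", f(x_star))
```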

Journal Article (DOI)
TL;DR: A discrete adjoint approach for the optimization of unsteady, turbulent flows that can be implemented efficiently with the use of a sparse forward mode of Automatic Differentiation; numerical results show the efficiency of the approach for a shape optimization problem involving a three-dimensional Large Eddy Simulation (LES).
Abstract: In this paper we present a discrete adjoint approach for the optimization of unsteady, turbulent flows. While discrete adjoint methods usually rely on the use of the reverse mode of Automatic Differentiation (AD), which is difficult to apply to complex unsteady problems, our approach is based on the discrete adjoint equation directly and can be implemented efficiently with the use of a sparse forward mode of AD. We demonstrate the approach on the basis of a parallel, multigrid flow solver that incorporates various turbulence models. Thanks to grid deformation routines, shape optimization problems can also be handled. We consider the relevant aspects, in particular the efficient generation of the discrete adjoint equation and the parallel implementation of a multigrid method for the adjoint, which is derived from the multigrid scheme of the flow solver. Numerical results show the efficiency of the approach for a shape optimization problem involving a three-dimensional Large Eddy Simulation (LES).

29 citations
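
The adjoint recipe itself is compact. Below is a minimal sketch on an assumed toy tracking problem (nothing like the paper's LES solver; R, J, K, a, u are all illustrative): with the state u defined by R(u, a) = 0, the gradient of J(u(a), a) needs one adjoint solve, (dR/du)^T lam = (dJ/du)^T, giving dJ/da = -lam^T dR/da here. Forward-mode AD would build the Jacobians column by column; the jac() helper mimics that with finite-difference seed vectors.

```python
# Minimal sketch of the discrete adjoint recipe (assumed toy problem).
import numpy as np

n = 5
K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # 1-D Laplacian
b = np.ones(n)
u_target = np.linspace(0.0, 1.0, n)

def R(u, a):                 # linear "flow" residual in u, design a
    return (K + np.diag(a)) @ u - b

def J(u, a):                 # tracking objective
    return 0.5 * np.sum((u - u_target) ** 2)

def solve_state(a):
    return np.linalg.solve(K + np.diag(a), b)

def jac(F, x, h=1e-7):       # one forward seed per Jacobian column
    F0 = np.atleast_1d(F(x))
    cols = []
    for i in range(x.size):
        xp = x.copy(); xp[i] += h
        cols.append((np.atleast_1d(F(xp)) - F0) / h)
    return np.column_stack(cols)

a = 0.5 * np.ones(n)
u = solve_state(a)
Ru = jac(lambda v: R(v, a), u)         # dR/du
Ra = jac(lambda w: R(u, w), a)         # dR/da
Ju = u - u_target                      # dJ/du (analytic here)

lam = np.linalg.solve(Ru.T, Ju)        # single adjoint solve
grad = -Ra.T @ lam                     # dJ/da (J has no explicit a-term)

# sanity check against brute-force finite differences
fd = jac(lambda w: np.array([J(solve_state(w), w)]), a).ravel()
print(np.allclose(grad, fd, atol=1e-4))
```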

Journal Article (DOI)
TL;DR: This paper presents the derivation of the modified Newton step in the calculus of variations framework needed for image processing, and demonstrates the method with two common objective functionals: variational image deblurring and geometric active contours for image segmentation.
Abstract: Many problems in image processing are addressed via the minimization of a cost functional. The most prominently used optimization technique is gradient descent, often chosen for its simplicity and applicability where other techniques, e.g., those coming from discrete optimization, cannot be applied. Yet gradient descent suffers from slow convergence and often reaches only local minima, which depend strongly on the initialization and on the condition number of the functional's Hessian. Newton-type methods, on the other hand, are known to have a faster, quadratic convergence. In its classical form, the Newton method relies on the $L^2$-type norm to define the descent direction. In this paper, we generalize and reformulate this very important optimization method by introducing Newton-type methods based on more general norms. Such norms are introduced both in the descent computation (Newton step) and in the corresponding stabilizing trust-region. This generalization opens up new possibilities in the extraction of the Newton step, including benefits such as mathematical stability and the incorporation of smoothness constraints. We first present the derivation of the modified Newton step in the calculus of variations framework needed for image processing. Then, we demonstrate the method with two common objective functionals: variational image deblurring and geometric active contours for image segmentation. We show that in addition to the fast convergence, norms adapted to the problem at hand yield different and superior results.

29 citations
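
The norm-dependence of the descent step can be seen in a few lines. The sketch below uses an assumed 1-D quadratic deblurring energy (not the paper's formulation): the step p minimizes g.p + 0.5 * p^T M p, so p = -M^{-1} g, and its character changes as the metric M moves from the identity (classical $L^2$ gradient) to a Sobolev-type smoothing metric to the full Hessian (Newton).

```python
# Minimal sketch: descent steps under different norms (assumed 1-D toy).
import numpy as np

n = 50
x = np.linspace(0, 1, n)
u_true = (x > 0.3).astype(float) - (x > 0.7).astype(float)   # boxcar signal

# blur operator A (Gaussian convolution) and noisy data f
A = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.03) ** 2)
A /= A.sum(axis=1, keepdims=True)
rng = np.random.default_rng(1)
f = A @ u_true + 0.01 * rng.standard_normal(n)

L = np.eye(n) - np.eye(n, k=1)         # forward-difference operator
alpha = 1e-3
H = A.T @ A + alpha * L.T @ L          # Hessian of the quadratic energy

def grad(u):
    return A.T @ (A @ u - f) + alpha * L.T @ (L @ u)

def run(M, steps, tau):
    u = np.zeros(n)
    for _ in range(steps):
        u -= tau * np.linalg.solve(M, grad(u))   # p = -M^{-1} grad
    return 0.5 * np.linalg.norm(A @ u - f) ** 2

beta = 10.0
for name, M in [("L2 gradient", np.eye(n)),
                ("H1 (Sobolev)", np.eye(n) + beta * L.T @ L),
                ("Newton", H)]:
    print(f"{name:12s} residual after 20 steps: {run(M, 20, 1.0):.2e}")
```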

Journal Article (DOI)
TL;DR: The interpolants are designed in such a way as to allow valid over- and underestimation functions to be constructed to provide the global optimization algorithm with a guarantee of ε-global optimality for the surrogate problem.
Abstract: This paper presents an approach for the global optimization of constrained nonlinear programming problems in which some of the constraints are nonfactorable, defined by a computational model for which no explicit analytical representation is available. A three-phase approach to the global optimization is considered. In the sampling phase, the nonfactorable functions and their gradients are evaluated and an interpolation function is constructed. In the global optimization phase, the interpolants are used as surrogates in a deterministic global optimization algorithm. In the final local optimization phase, the global optimum of the interpolation problem is used as a starting point for a local optimization of the original problem. The interpolants are designed in such a way as to allow valid over- and underestimation functions to be constructed to provide the global optimization algorithm with a guarantee of ε-global optimality for the surrogate problem.

29 citations
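
A minimal sketch of the three-phase structure on an assumed 1-D black box (the deterministic ε-global phase is stood in for by a dense grid search on the surrogate, and the RBF interpolant is an illustrative choice, not the paper's interpolants):

```python
# Minimal sketch: sample -> surrogate -> global -> local (assumed toy).
import numpy as np
from scipy.optimize import minimize

def black_box(x):                 # "nonfactorable" model: treated as opaque
    return float(np.sin(3 * x) + 0.5 * x ** 2)

# Phase 1: sampling + Gaussian RBF interpolant
X = np.linspace(-2.0, 2.0, 12)
y = np.array([black_box(v) for v in X])
sigma = 0.5
Phi = np.exp(-((X[:, None] - X[None, :]) / sigma) ** 2)
w = np.linalg.solve(Phi, y)

def surrogate(x):
    return float(np.exp(-((x - X) / sigma) ** 2) @ w)

# Phase 2: global optimization of the cheap surrogate
grid = np.linspace(-2.0, 2.0, 2001)
x_glob = grid[np.argmin([surrogate(v) for v in grid])]

# Phase 3: local refinement on the original function
res = minimize(lambda z: black_box(z[0]), x0=[x_glob], method="Nelder-Mead")
print(f"surrogate minimizer {x_glob:.3f} -> refined {res.x[0]:.4f}, "
      f"f = {res.fun:.4f}")
```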


Network Information
Related Topics (5)

Topic                            Papers    Citations    Relatedness
Optimization problem             96.4K     2.1M         90%
Optimal control                  68K       1.2M         84%
Robustness (computer science)    94.7K     1.6M         84%
Scheduling (computing)           78.6K     1.3M         83%
Linear system                    59.5K     1.4M         82%
Performance

Metrics: number of papers in the topic in previous years

Year    Papers
2023    13
2022    36
2021    104
2020    128
2019    113
2018    140