
Showing papers on "Discrete optimization published in 2014"


Journal ArticleDOI
TL;DR: The proposed binary bat algorithm (BBA) significantly outperforms the compared algorithms on the majority of the benchmark functions, and a real optical engineering application, optical buffer design, evidences the superior performance of BBA in practice.
Abstract: The bat algorithm (BA) is one of the recently proposed heuristic algorithms that imitates the echolocation behavior of bats to perform global optimization. Its superior performance has been demonstrated against other well-known algorithms such as the genetic algorithm (GA) and particle swarm optimization (PSO). However, the original version of this algorithm is suited to continuous problems, so it cannot be applied to binary problems directly. In this paper, a binary version of this algorithm is proposed. A comparative study with binary PSO and GA over twenty-two benchmark functions is conducted to draw conclusions. Furthermore, Wilcoxon's rank-sum nonparametric statistical test was carried out at the 5% significance level to judge whether the results of the proposed algorithm differ from those of the other algorithms in a statistically significant way. The results show that the proposed binary bat algorithm (BBA) significantly outperforms the others on the majority of the benchmark functions. In addition, a real application of the proposed method in optical engineering, optical buffer design, is presented at the end of the paper. The results of this application also evidence the superior performance of BBA in practice.
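Binary variants of continuous metaheuristics like this one typically map continuous velocity updates to bit flips through a transfer function. A minimal sketch of that idea (the sigmoid transfer function and all names here are illustrative assumptions, not the paper's exact formulation):

```python
import math
import random

def sigmoid_transfer(v):
    # Map a continuous velocity component to a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-v))

def binarize_position(velocity, rng):
    # Each bit is set to 1 with probability given by the transfer function,
    # turning a continuous bat update into a binary position update.
    return [1 if rng.random() < sigmoid_transfer(v) else 0 for v in velocity]

rng = random.Random(0)
bits = binarize_position([-2.0, 0.0, 3.0], rng)
```

Large positive velocities make a bit likely to be 1, large negative velocities make it likely to be 0, so the continuous search dynamics carry over to the binary space.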

549 citations


BookDOI
TL;DR: The Handbook of Simulation Optimization presents an overview of the state of the art of simulation optimization, providing a survey of the most well-established approaches for optimizing stochastic simulation models and a sampling of recent research advances in theory and methodology.
Abstract: The Handbook of Simulation Optimization presents an overview of the state of the art of simulation optimization, providing a survey of the most well-established approaches for optimizing stochastic simulation models and a sampling of recent research advances in theory and methodology. Leading contributors cover such topics as discrete optimization via simulation, ranking and selection, efficient simulation budget allocation, random search methods, response surface methodology, stochastic gradient estimation, stochastic approximation, sample average approximation, stochastic constraints, variance reduction techniques, model-based stochastic search methods and Markov decision processes. This single volume should serve as a reference for those already in the field and as a means for those new to the field for understanding and applying the main approaches. The intended audience includes researchers, practitioners and graduate students in the business/engineering fields of operations research, management science, operations management and stochastic control, as well as in economics/finance and computer science.

227 citations


Journal ArticleDOI
TL;DR: ACOMV: an ant colony optimization (ACO) algorithm that extends the ACOR algorithm for continuous optimization to tackle mixed-variable optimization problems, together with a novel procedure to generate artificial mixed-variable benchmark functions that is used to automatically tune ACOMV's parameters.
Abstract: In this paper, we introduce ACOMV: an ant colony optimization (ACO) algorithm that extends the ACOR algorithm for continuous optimization to tackle mixed-variable optimization problems. In ACOMV, the decision variables of an optimization problem can be explicitly declared as continuous, ordinal, or categorical, which allows the algorithm to treat them adequately. ACOMV includes three solution generation mechanisms: a continuous optimization mechanism (ACOR), a continuous relaxation mechanism (ACOMV-o) for ordinal variables, and a categorical optimization mechanism (ACOMV-c) for categorical variables. Together, these mechanisms allow ACOMV to tackle mixed-variable optimization problems. We also define a novel procedure to generate artificial mixed-variable benchmark functions, and we use it to automatically tune ACOMV's parameters. The tuned ACOMV is tested on various real-world continuous and mixed-variable engineering optimization problems. Comparisons with results from the literature demonstrate the effectiveness and robustness of ACOMV on mixed-variable optimization problems.

211 citations


Journal Article
TL;DR: BayesOpt as mentioned in this paper is a library with state-of-the-art Bayesian optimization methods to solve nonlinear optimization, stochastic bandits or sequential experimental design problems.
Abstract: BayesOpt is a library with state-of-the-art Bayesian optimization methods to solve nonlinear optimization, stochastic bandit, or sequential experimental design problems. Bayesian optimization is characterized by being sample efficient, as it builds a posterior distribution to capture the evidence and prior knowledge of the target function. Built in standard C++, the library is extremely efficient while being portable and flexible. It includes a common interface for C, C++, Python, Matlab, and Octave.

200 citations



Journal ArticleDOI
TL;DR: Optimization results indicate that the modified TLBO algorithm can generate improved designs when compared to other population-based techniques and in some cases improve the overall computational efficiency.

163 citations


Journal ArticleDOI
TL;DR: The proposed multilevel method improves the performance of the whole drive system, with higher output power and lower material cost, and significantly reduces the computation cost compared with the single-level design optimization method.
Abstract: Electrical drive systems are key components in modern appliances, industry equipment, and systems, e.g., hybrid electric vehicles. To obtain the best performance of these drive systems, the motors and their control systems should be designed and optimized at the system level rather than the component level. This paper presents an effort to develop system-level design and optimization methods for electrical drive systems. Two system-level design optimization methods are presented in this paper: 1) a single-level method (only at the system level); and 2) a multilevel method. Meanwhile, approximate models, the design of experiments technique, and the sequential subspace optimization method are presented to improve the optimization efficiency. Finally, a drive system consisting of a permanent-magnet transverse flux machine with a soft magnetic composite core is investigated, and detailed results are presented and discussed. This is a high-dimensional optimization problem with 14 parameters mixing both discrete and continuous variables. The finite-element analysis model and method are verified by experimental results on the motor prototype. From the discussion, it can be seen that the proposed multilevel method improves the performance of the whole drive system, with higher output power and lower material cost, and significantly reduces the computation cost compared with the single-level design optimization method.

141 citations


Journal ArticleDOI
TL;DR: A way of finding energy representations with large classical gaps between ground and first excited states, efficient algorithms for mapping non-compatible Ising models into the hardware, and the use of decomposition methods for problems that are too large to fit in hardware are proposed.
Abstract: This paper discusses techniques for solving discrete optimization problems using quantum annealing. Practical issues likely to affect the computation include precision limitations, finite temperature, bounded energy range, sparse connectivity, and small numbers of qubits. To address these concerns we propose a way of finding energy representations with large classical gaps between ground and first excited states, efficient algorithms for mapping non-compatible Ising models into the hardware, and the use of decomposition methods for problems that are too large to fit in hardware. We validate the approach by describing experiments with D-Wave quantum hardware for low density parity check decoding with up to 1000 variables.
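For small instances, the "classical gap" between the ground and first excited states that the paper seeks to enlarge can be checked by brute-force enumeration of the Ising spectrum. A hypothetical sketch (the instance below is made up for illustration, not taken from the paper's experiments):

```python
import itertools

def ising_energy(spins, h, J):
    # E(s) = sum_i h_i * s_i + sum_{i<j} J_ij * s_i * s_j, with spins in {-1, +1}.
    e = sum(h[i] * s for i, s in enumerate(spins))
    e += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e

def classical_gap(h, J):
    # Enumerate all 2^n spin configurations of a small instance and return
    # the gap between the ground and first excited energy levels.
    n = len(h)
    energies = sorted({ising_energy(s, h, J)
                       for s in itertools.product((-1, 1), repeat=n)})
    return energies[1] - energies[0]

gap = classical_gap([0.5, -0.2, 0.1], {(0, 1): -1.0, (1, 2): 0.5})
```

Brute force is only feasible for a handful of spins; the paper's point is to choose energy representations so this gap is large relative to hardware precision and temperature.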

132 citations


Posted Content
TL;DR: This article presents the principles of primal–dual approaches while providing an overview of the numerical methods that have been proposed in different contexts, which lead to algorithms that are easily parallelizable.
Abstract: Optimization methods are at the core of many problems in signal/image processing, computer vision, and machine learning. For a long time, it has been recognized that looking at the dual of an optimization problem may drastically simplify its solution. Deriving efficient strategies which jointly bring into play the primal and the dual problems is, however, a more recent idea which has generated many important new contributions in recent years. These novel developments are grounded on recent advances in convex analysis, discrete optimization, parallel processing, and non-smooth optimization with emphasis on sparsity issues. In this paper, we aim at presenting the principles of primal-dual approaches while giving an overview of numerical methods which have been proposed in different contexts. We show the benefits which can be drawn from primal-dual algorithms both for solving large-scale convex optimization problems and discrete ones, and we provide various application examples to illustrate their usefulness.

118 citations



Journal ArticleDOI
TL;DR: The main objective of this paper is to investigate how the (hidden) structure of a given real/complex-valued optimization problem makes it easy to solve, and to this end, three conic relaxations are proposed.
Abstract: This work is concerned with finding a global optimization technique for a broad class of nonlinear optimization problems, including quadratic and polynomial optimization problems. The main objective of this paper is to investigate how the (hidden) structure of a given real/complex-valued optimization problem makes it easy to solve. To this end, three conic relaxations are proposed. Necessary and sufficient conditions are derived for the exactness of each of these relaxations, and it is shown that these conditions are satisfied if the optimization problem is highly structured. More precisely, the structure of the optimization problem is mapped into a generalized weighted graph, where each edge is associated with a weight set extracted from the coefficients of the optimization problem. In the real-valued case, it is shown that the relaxations are all exact if each weight set is sign definite and in addition a condition is satisfied for each cycle of the graph. It is also proved that if some of these conditi...

Journal ArticleDOI
TL;DR: A novel synthetic discrete optimization design methodology of IBDC based on advanced components for battery energy storage system (BESS) is proposed and an experimental prototype is implemented.
Abstract: This paper reviews recent advances in key components of the isolated bidirectional dc-dc converter (IBDC) and discusses the potential of IBDC based on advanced components. The concept of the safe operation area of IBDC is proposed and clearly defined, which is determined by the intersection of the effective operation areas of transmission power, current stress, and current root mean square. By analyzing efficiency and power density characteristics, this paper points out that the efficiency optimization of IBDC needs to give way to power density optimization. On this basis, a novel synthetic discrete optimization design methodology of IBDC based on advanced components for battery energy storage systems (BESS) is proposed. Finally, an experimental prototype is implemented, and some design suggestions for hardware optimization are provided. The experimental results verify the effectiveness of the designed BESS and show that practical applications of next-generation high-frequency conversion systems can be expected with the appearance and application of advanced components.

Journal ArticleDOI
TL;DR: This paper studies leader selection in order to minimize convergence errors experienced by the follower agents, and introduces a novel connection to random walks on the network graph that shows that the convergence error has an inherent supermodular structure as a function of the leader set.
Abstract: In a leader-follower multi-agent system (MAS), the leader agents act as control inputs and influence the states of the remaining follower agents. The rate at which the follower agents converge to their desired states, as well as the errors in the follower agent states prior to convergence, are determined by the choice of leader agents. In this paper, we study leader selection in order to minimize convergence errors experienced by the follower agents, which we define as a norm of the distance between the follower agents' intermediate states and the convex hull of the leader agent states. By introducing a novel connection to random walks on the network graph, we show that the convergence error has an inherent supermodular structure as a function of the leader set. Supermodularity enables development of efficient discrete optimization algorithms that directly approximate the optimal leader set, provide provable performance guarantees, and do not rely on continuous relaxations. We formulate two leader selection problems within the supermodular optimization framework, namely, the problem of selecting a fixed number of leader agents in order to minimize the convergence error, as well as the problem of selecting the minimum-size set of leader agents to achieve a given bound on the convergence error. We introduce algorithms for approximating the optimal solution to both problems in static networks, dynamic networks with known topology distributions, and dynamic networks with unknown and unpredictable topology distributions. Our approach is shown to provide significantly lower convergence errors than existing random and degree-based leader selection methods in a numerical study.
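Supermodularity of the convergence error is what makes greedy selection work with provable guarantees. A generic greedy sketch under an assumed toy error function (the weights and `toy_error` below are purely illustrative stand-ins, not the paper's actual convergence-error metric):

```python
def greedy_leader_selection(nodes, error_fn, k):
    # Greedy selection: repeatedly add the node whose inclusion most
    # reduces the (supermodular) convergence-error objective.
    leaders = set()
    for _ in range(k):
        candidates = [n for n in nodes if n not in leaders]
        best = min(candidates, key=lambda n: error_fn(leaders | frozenset({n})))
        leaders.add(best)
    return leaders

# Toy stand-in for the convergence error: better-connected nodes
# (higher weight) reduce the error more.  Purely illustrative.
weights = {"a": 3.0, "b": 1.0, "c": 2.0}

def toy_error(leader_set):
    return 1.0 / (1.0 + sum(weights[n] for n in leader_set))

chosen = greedy_leader_selection(list(weights), toy_error, k=2)
```

With a supermodular (diminishing-returns) error function, this kind of greedy loop is exactly the structure that yields the constant-factor approximation guarantees the abstract refers to.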

Journal ArticleDOI
TL;DR: This work presents an approach for field development optimization with two objectives, called BiPSOMADS, which utilizes at its core the recently developed PSO–MADS (Particle Swarm Optimization–Mesh Adaptive Direct Search) hybrid optimization algorithm.

Journal ArticleDOI
TL;DR: It is concluded that the TLBO algorithm presented in this study can be effectively used in the weight minimization of truss structures.
Abstract: The aim of this study is to present a new efficient optimization algorithm called Teaching-Learning-Based Optimization (TLBO). The TLBO algorithm is based on the influence of a teacher on the output of learners in a class. Several benchmark truss structure problems with discrete design variables are used to show the efficiency of the TLBO algorithm, and the results are compared with those reported in the literature. It is concluded that the TLBO algorithm presented in this study can be effectively used in the weight minimization of truss structures.
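The "teacher" mechanism can be sketched as follows. This is a schematic continuous-variable version under assumed names and an illustrative sphere objective; the paper applies TLBO to discrete truss sizing:

```python
import random

def sphere(x):
    # Illustrative objective: minimize the sum of squares.
    return sum(v * v for v in x)

def teacher_phase(population, objective, rng):
    # TLBO teacher phase: each learner moves toward the teacher (current
    # best solution), offset by the class mean scaled by a teaching factor.
    best = min(population, key=objective)
    dims = len(best)
    mean = [sum(x[d] for x in population) / len(population) for d in range(dims)]
    updated = []
    for x in population:
        tf = rng.choice((1, 2))  # teaching factor, commonly 1 or 2
        cand = [x[d] + rng.random() * (best[d] - tf * mean[d]) for d in range(dims)]
        # Greedy replacement: keep the move only if it improves the objective.
        updated.append(cand if objective(cand) < objective(x) else x)
    return updated

rng = random.Random(42)
pop = [[rng.uniform(-5.0, 5.0) for _ in range(3)] for _ in range(6)]
new_pop = teacher_phase(pop, sphere, rng)
```

The greedy replacement step guarantees each learner's objective value never worsens within a phase, which is why the class-wide quality improves monotonically.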

Proceedings ArticleDOI
01 Aug 2014
TL;DR: A selection of complementary theoretical advances in addressing power system planning and operation problems in the fields of non-convex optimization, in mixed-integer programming, and in optimization under uncertainty are introduced.
Abstract: Power system planning and operation offers multitudinous opportunities for optimization methods. In practice, these problems are generally large-scale, non-linear, subject to uncertainties, and combine both continuous and discrete variables. In recent years, a number of complementary theoretical advances in addressing such problems have been obtained in the field of applied mathematics. The paper introduces a selection of these advances in the fields of non-convex optimization, mixed-integer programming, and optimization under uncertainty. The practical relevance of these developments for power system planning and operation is discussed, and the opportunities for combining them with high-performance computing and big data infrastructures, as well as novel machine learning and randomized algorithms, are highlighted.

Journal ArticleDOI
TL;DR: In this paper, a new one-parameter discrete distribution is introduced, and its mathematical properties and estimation procedures are derived, and four real data sets are used to show that the new model performs at least as well as the traditional one parameter discrete models.
Abstract: A new one-parameter discrete distribution is introduced. Its mathematical properties and estimation procedures are derived. Four real data sets are used to show that the new model performs at least as well as the traditional one-parameter discrete models and other newly proposed two-parameter discrete models.

Journal ArticleDOI
TL;DR: Bat inspired (BI) algorithm is examined in the context of discrete size optimization of steel frames designed for minimum weight in order to provide sufficient evidence for successful performance of the BI algorithm in comparison to other metaheuristics employed in structural optimization.

Journal ArticleDOI
TL;DR: It is shown that a standard formulation of the BB-BC algorithm occasionally falls short of producing acceptable solutions to problems from discrete size optimum design of steel trusses, so a reformulation of the algorithm is proposed and implemented for design optimization of various discrete truss structures.
Abstract: This article presents a methodology for design optimization of steel truss structures based on a refined big bang–big crunch (BB-BC) algorithm. It is shown that a standard formulation of the BB-BC algorithm occasionally falls short of producing acceptable solutions to problems from discrete size optimum design of steel trusses. A reformulation of the algorithm is proposed and implemented for design optimization of various discrete truss structures according to American Institute of Steel Construction Allowable Stress Design (AISC-ASD) specifications. Furthermore, the performance of the proposed BB-BC algorithm is compared to its standard version as well as other well-known metaheuristic techniques. The numerical results confirm the efficiency of the proposed algorithm in practical design optimization of truss structures.

Journal ArticleDOI
Tunchan Cura
TL;DR: This study proposes a relatively new technique called artificial bee colony (ABC) approach to solve the TOPTW and introduces a new food source acceptance criterion and a new scout bee search behavior, both of which significantly contribute to the solution quality.

Journal ArticleDOI
TL;DR: This paper proposes a new scheme that derives a sampling distribution from a fast fitted Gaussian process based on previously evaluated solutions that has the desired properties and can automatically balance the exploitation and exploration trade-off.
Abstract: Random search algorithms are often used to solve discrete optimization-via-simulation (DOvS) problems. The most critical component of a random search algorithm is the sampling distribution that is used to guide the allocation of the search effort. A good sampling distribution can balance the trade-off between the effort used in searching around the current best solution (called exploitation) and the effort used in searching largely unknown regions (called exploration). However, most random search algorithms for DOvS problems have difficulty balancing this trade-off in a seamless way. In this paper we propose a new scheme that derives a sampling distribution from a fast fitted Gaussian process based on previously evaluated solutions. We show that the sampling distribution has the desired properties and can automatically balance the exploitation and exploration trade-off. Furthermore, we integrate this sampling distribution into a random search algorithm, called Gaussian process-based search (GPS), and show that the GPS algorithm has the desired global convergence as the simulation effort goes to infinity. We illustrate the properties of the algorithm through a number of numerical experiments.
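The idea of turning surrogate predictions into a sampling distribution can be illustrated schematically: low predicted objective values (exploitation) and high predictive uncertainty (exploration) both earn probability mass. The softmax-style weighting below is an illustrative simplification under assumed names, not the paper's exact GPS construction:

```python
import math

def sampling_distribution(pred_means, pred_stds, temperature=1.0):
    # Score each discrete solution: a low predicted objective (minimization)
    # rewards exploitation, high predictive uncertainty rewards exploration.
    scores = [-m / temperature + s for m, s in zip(pred_means, pred_stds)]
    mx = max(scores)  # subtract the max for numerical stability
    weights = [math.exp(v - mx) for v in scores]
    total = sum(weights)
    return [w / total for w in weights]

# Three candidate solutions: one good known point, one bad known point,
# and one bad-looking but highly uncertain point.
probs = sampling_distribution([1.0, 5.0, 5.0], [0.1, 0.1, 2.0])
```

Note how the uncertain third solution receives more mass than the equally bad but well-explored second one, which is the exploitation/exploration balance the paper formalizes through the fitted Gaussian process.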

Journal ArticleDOI
TL;DR: A new evolutionary optimization algorithm based on the actual manifold of the objective function and a fast opposite gradient search was proposed to improve the accuracy and speed of solution finding.

Journal ArticleDOI
01 Jul 2014
TL;DR: This paper discusses the linear optimization problem constrained by a system of bipolar fuzzy relational equations with max-T composition, where the involved triangular norm is the Łukasiewicz t-norm.
Abstract: This paper discusses the linear optimization problem constrained by a system of bipolar fuzzy relational equations with max-T composition, where the involved triangular norm T is the Łukasiewicz t-norm. Although it is in general NP-hard, such an optimization problem can be reformulated in polynomial time into a 0-1 integer linear optimization problem and then solved taking advantage of well-developed techniques in integer optimization.

BookDOI
20 Jan 2014
TL;DR: The authors compare the post-optimal analysis with alternative approaches to uncertain LSIO problems and provide readers with criteria to choose the best way to model a given uncertain LSIO problem depending on the nature and quality of the data along with the available software.
Abstract: Post-Optimal Analysis in Linear Semi-Infinite Optimization examines the following topics in regard to linear semi-infinite optimization: modeling uncertainty, qualitative stability analysis, quantitative stability analysis, and sensitivity analysis. Linear semi-infinite optimization (LSIO) deals with linear optimization problems where the dimension of the decision space or the number of constraints is infinite. The authors compare the post-optimal analysis with alternative approaches to uncertain LSIO problems and provide readers with criteria to choose the best way to model a given uncertain LSIO problem depending on the nature and quality of the data along with the available software. This work also contains open problems which readers will find intriguing and challenging. Post-Optimal Analysis in Linear Semi-Infinite Optimization is aimed toward researchers, graduate and post-graduate students of mathematics interested in optimization, parametric optimization and related topics.

Journal ArticleDOI
TL;DR: A flexible Mixed Integer Linear Programming formulation that allows easy modifications for representing different situations and scenarios is proposed that can be solved to optimality by a standard Branch-and-Cut procedure even for very long planning horizons.
Abstract: Personnel scheduling deals with the attribution of a number of duty shifts to a number of workers respecting several types of requirements. In this work, the problem of scheduling physicians in health care departments is studied. This problem is NP-hard, and we propose a flexible Mixed Integer Linear Programming formulation that allows easy modifications for representing different situations and scenarios. This formulation can be solved to optimality by a standard Branch-and-Cut procedure even for very long planning horizons. A real-world case study is considered. A comparison of the solutions obtained by the proposed approach with the solutions currently adopted in the considered structure is presented. Results are very encouraging both from the schedule quality (e.g., workload balancing) and from the computational point of view.

Journal ArticleDOI
TL;DR: The proposed algorithm, HCSGA, is first applied to 13 standard benchmark constrained optimization functions and subsequently used to solve three well-known design problems reported in the literature.
Abstract: This article presents an effective hybrid cuckoo search and genetic algorithm (HCSGA) for solving engineering design optimization problems involving problem-specific constraints and mixed variables such as integer, discrete and continuous variables. The proposed algorithm, HCSGA, is first applied to 13 standard benchmark constrained optimization functions and subsequently used to solve three well-known design problems reported in the literature. The numerical results obtained by HCSGA show competitive performance with respect to recent algorithms for constrained design optimization problems.

Journal ArticleDOI
TL;DR: In this paper, an approach for dynamic optimization considering uncertainties is developed and applied to robust aircraft trajectory optimization; the nonintrusive polynomial chaos expansion scheme is employed to convert a robust trajectory optimization problem with stochastic ordinary differential equations into an equivalent deterministic trajectory optimization problem with deterministic ordinary differential equations.
Abstract: The development of algorithms for aircraft robust dynamic optimization considering uncertainties (for example, trajectory optimization) is relatively limited compared to aircraft robust static optimization (for example, configuration shape optimization). In this paper, an approach for dynamic optimization considering uncertainties is developed and applied to robust aircraft trajectory optimization. In the present approach, the nonintrusive polynomial chaos expansion scheme is employed to convert a robust trajectory optimization problem with stochastic ordinary differential equations into an equivalent deterministic trajectory optimization problem with deterministic ordinary differential equations. Two computational strategies for trajectory optimization considering uncertainties are compared. The performance of the developed method is studied by considering a classical deterministic trajectory optimization problem of supersonic aircraft short-time climb with uncertainties in the aerodynamic data. Detailed...

Book
27 May 2014
TL;DR: A systematic and comprehensive treatment of optimization software using R, streamlining nonlinear parameter optimization techniques to analyze more variables, especially under nonlinear multivariable conditions.
Abstract: Nonlinear Parameter Optimization Using R Tools, by John C. Nash (Telfer School of Management, University of Ottawa, Canada), provides a systematic and comprehensive treatment of optimization software using R. In recent decades, optimization techniques have been streamlined by computational and artificial intelligence methods to analyze more variables, especially under nonlinear multivariable conditions.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed an alternative information-maximization clustering method based on a squared-loss variant of mutual information, which gives a clustering solution analytically in a computationally efficient way via kernel eigenvalue decomposition.
Abstract: Information-maximization clustering learns a probabilistic classifier in an unsupervised manner so that mutual information between feature vectors and cluster assignments is maximized. A notable advantage of this approach is that it involves only continuous optimization of model parameters, which is substantially simpler than discrete optimization of cluster assignments. However, existing methods still involve nonconvex optimization problems, and therefore finding a good local optimal solution is not straightforward in practice. In this letter, we propose an alternative information-maximization clustering method based on a squared-loss variant of mutual information. This novel approach gives a clustering solution analytically in a computationally efficient way via kernel eigenvalue decomposition. Furthermore, we provide a practical model selection procedure that allows us to objectively optimize tuning parameters included in the kernel function. Through experiments, we demonstrate the usefulness of the pr...
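The flavor of the approach, reading cluster structure directly off an eigendecomposition of a kernel matrix, can be sketched with a crude spectral-style assignment. This is a schematic illustration under assumed names, not the paper's squared-loss mutual information estimator:

```python
import numpy as np

def kernel_eig_clusters(X, n_clusters, sigma=1.0):
    # Gaussian kernel matrix over the data points.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2.0 * sigma ** 2))
    # Eigendecomposition; eigh returns eigenvalues in ascending order.
    _, vecs = np.linalg.eigh(K)
    top = vecs[:, -n_clusters:]          # leading eigenvectors
    # Crude assignment: which leading eigenvector dominates each point.
    return np.abs(top).argmax(axis=1)

# Two well-separated pairs of points should land in two clusters.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.2, 5.0]])
labels = kernel_eig_clusters(X, n_clusters=2)
```

The closed-form eigendecomposition is what makes this family of methods cheap compared with iterating a nonconvex objective; the paper's contribution is doing it in a principled way for an SMI-based clustering criterion.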

Book ChapterDOI
06 Sep 2014
TL;DR: This work proposes shape convexity as a new high-order regularization constraint for binary image segmentation and designs a dynamic programming technique for evaluating and approximating these cliques in linear time.
Abstract: Convexity is known as an important cue in human vision. We propose shape convexity as a new high-order regularization constraint for binary image segmentation. In the context of discrete optimization, object convexity is represented as a sum of 3-clique potentials penalizing any 1-0-1 configuration on all straight lines. We show that these non-submodular interactions can be efficiently optimized using a trust region approach. While the quadratic number of all 3-cliques is prohibitively high, we designed a dynamic programming technique for evaluating and approximating these cliques in linear time. Our experiments demonstrate general usefulness of the proposed convexity constraint on synthetic and real image segmentation examples. Unlike standard second-order length regularization, our convexity prior is scale invariant, does not have shrinking bias, and is virtually parameter-free.
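The 3-clique penalty can be illustrated by counting forbidden 1-0-1 configurations along a single scanline. This brute-force O(n^3) reference loop is only for illustration; the paper's contribution is evaluating and approximating these cliques in linear time via dynamic programming:

```python
def count_101_violations(labels):
    # Count 3-cliques (i < j < k) on one straight line whose labels form
    # the forbidden 1-0-1 pattern, i.e. convexity violations: foreground,
    # then background, then foreground again.
    n = len(labels)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(j + 1, n):
                if labels[i] == 1 and labels[j] == 0 and labels[k] == 1:
                    count += 1
    return count
```

A segmentation is convex along a line exactly when this count is zero, so summing it over all straight lines gives the high-order regularization term described above.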