scispace - formally typeset

Showing papers on "Constraint programming published in 2020"


Journal ArticleDOI
TL;DR: The results show that the sequence-based MILP model is the most efficient one, and the proposed CP model is effective in finding good-quality solutions for both the small-sized and large-sized instances.

94 citations


Posted Content
TL;DR: This work proposes a general and hybrid approach, based on DRL and CP, for solving combinatorial optimization problems, and experimentally shows that the framework introduced outperforms the stand-alone RL and CP solutions, while being competitive with industrial solvers.
Abstract: Combinatorial optimization has found applications in numerous fields, from aerospace to transportation planning and economics. The goal is to find an optimal solution among a finite set of possibilities. The well-known challenge in combinatorial optimization is the state-space explosion problem: the number of possibilities grows exponentially with the problem size, which makes solving intractable for large problems. In recent years, deep reinforcement learning (DRL) has shown its promise for designing good heuristics dedicated to solving NP-hard combinatorial optimization problems. However, current approaches have two shortcomings: (1) they mainly focus on the standard travelling salesman problem and cannot easily be extended to other problems, and (2) they only provide an approximate solution with no systematic way to improve it or to prove optimality. In another context, constraint programming (CP) is a generic tool for solving combinatorial optimization problems. Based on a complete search procedure, it will always find the optimal solution given enough execution time. A critical design choice, which makes CP non-trivial to use in practice, is the branching decision directing how the search space is explored. In this work, we propose a general hybrid approach, based on DRL and CP, for solving combinatorial optimization problems. The core of our approach is a dynamic programming formulation that acts as a bridge between the two techniques. We experimentally show that our solver efficiently solves two challenging problems: the traveling salesman problem with time windows, and the 4-moments portfolio optimization problem. The results show that the framework introduced outperforms the stand-alone RL and CP solutions, while being competitive with industrial solvers.
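The dynamic-programming formulation that bridges DRL and CP can be illustrated on the plain TSP (without the time windows used in the paper) via the classic Held-Karp recursion. A minimal sketch, not the authors' implementation:

```python
from itertools import combinations

def held_karp(dist):
    """Exact TSP via the Held-Karp dynamic program, O(n^2 * 2^n).
    dist is a symmetric distance matrix; the tour starts/ends at city 0."""
    n = len(dist)
    # C[(S, j)]: cheapest path from 0 visiting exactly the cities in
    # bitmask S and ending at city j.
    C = {}
    for j in range(1, n):
        C[(1 << j, j)] = dist[0][j]
    for size in range(2, n):
        for S in combinations(range(1, n), size):
            bits = 0
            for j in S:
                bits |= 1 << j
            for j in S:
                prev = bits & ~(1 << j)
                C[(bits, j)] = min(C[(prev, k)] + dist[k][j]
                                   for k in S if k != j)
    full = sum(1 << j for j in range(1, n))
    # close the tour back to city 0
    return min(C[(full, j)] + dist[j][0] for j in range(1, n))
```

The DRL component of the paper's hybrid learns which branching decisions to explore first within such a DP state space; CP search then proves optimality.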

70 citations


Journal ArticleDOI
TL;DR: A survey of intelligent scheduling systems is provided by categorizing them into five major techniques: fuzzy logic, expert systems, machine learning, stochastic local search optimization algorithms, and constraint programming.

Abstract: Intelligent scheduling covers a range of tools and techniques for solving scheduling problems successfully and efficiently. In this paper, we provide a survey of intelligent scheduling systems by categorizing them into five major techniques: fuzzy logic, expert systems, machine learning, stochastic local search optimization algorithms, and constraint programming. We also review application case studies of these techniques.

67 citations


Proceedings ArticleDOI
11 Jun 2020
TL;DR: The filtering method of GraphQL is competitive with that of the latest algorithms CFL, CECI and DP-iso in terms of pruning power; the ordering methods in GraphQL and RI are usually the most effective; and the set-intersection-based local candidate computation in CECI and DP-iso performs best in the enumeration.
Abstract: We study the performance of eight representative in-memory subgraph matching algorithms. Specifically, we put QuickSI, GraphQL, CFL, CECI, DP-iso, RI and VF2++ in a common framework to compare them on the following four aspects: (1) the method of filtering candidate vertices in the data graph; (2) the method of ordering query vertices; (3) the method of enumerating partial results; and (4) other optimization techniques. Then, we compare the overall performance of these algorithms with Glasgow, an algorithm based on constraint programming. Through experiments, we find that (1) the filtering method of GraphQL is competitive with that of the latest algorithms CFL, CECI and DP-iso in terms of pruning power; (2) the ordering methods in GraphQL and RI are usually the most effective; (3) the set-intersection-based local candidate computation in CECI and DP-iso performs best in the enumeration; and (4) the failing-sets pruning in DP-iso can significantly improve performance when queries become large. Our source code is publicly available at https://github.com/RapidsAtHKUST/SubgraphMatching.
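The three aspects compared above (candidate filtering, query-vertex ordering, and enumeration of partial results) can be seen in a toy backtracking matcher. A simplified illustration, not any of the surveyed algorithms:

```python
def subgraph_match(query, data):
    """Find one subgraph-isomorphism embedding of `query` into `data`.
    Graphs are adjacency dicts {vertex: set_of_neighbours}. Returns a
    mapping query-vertex -> data-vertex, or None if no embedding exists."""
    # (1) filtering: a candidate must have at least the query vertex's degree
    cand = {u: {v for v in data if len(data[v]) >= len(query[u])}
            for u in query}
    # (2) ordering: match the most constrained query vertices first
    order = sorted(query, key=lambda u: len(cand[u]))

    def backtrack(i, mapping, used):
        # (3) enumeration: extend the partial mapping one vertex at a time
        if i == len(order):
            return dict(mapping)
        u = order[i]
        for v in cand[u]:
            if v in used:
                continue
            # already-mapped neighbours of u must map to neighbours of v
            if all(mapping[w] in data[v] for w in query[u] if w in mapping):
                mapping[u] = v
                used.add(v)
                result = backtrack(i + 1, mapping, used)
                if result:
                    return result
                del mapping[u]
                used.remove(v)
        return None

    return backtrack(0, {}, set())
```

The surveyed algorithms differ precisely in how sophisticated each of these three steps is (e.g. auxiliary data structures for filtering, cost-model-driven orderings, set intersection for local candidates).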

62 citations


Journal ArticleDOI
TL;DR: The results are promising and show that the proposed model and the solution approach can handle the real case study of Tehran earthquake in an efficient way.
Abstract: In this study, a stochastic multi-objective mixed-integer mathematical programming model is proposed for logistic distribution and evacuation planning during an earthquake. Decisions about the pre- and post-disaster phases are considered seamlessly. The decisions of the pre-disaster phase concern the location of permanent relief distribution centers and the quantities of commodities to be stored. The decisions of the second phase determine the optimal locations of temporary care centers, to speed up the treatment of injured people, and the distribution of commodities to the affected areas. Humanitarian and cost issues are captured in the proposed model through three objective functions. Several sets of constraints are also included to make the model flexible enough to handle real issues. Demands for food, blood, water, blankets, and tents are assumed to be probabilistic; they depend on several complex factors and are modeled using a complex network in this study. A simulation is set up to generate the probabilistic distribution of demands through several scenarios. The stochastic demands serve as inputs for the proposed stochastic multi-objective mixed-integer model, which is transformed into its deterministic equivalent using a chance-constrained programming approach. The equivalent deterministic model is solved using an efficient epsilon-constraint approach and an evolutionary algorithm, the non-dominated sorting genetic algorithm (NSGA-II). First, several illustrative numerical examples are solved using both solution procedures. Their performance is compared, and the more efficient procedure, NSGA-II, is used to handle the case study of the Tehran earthquake. The results are promising and show that the proposed model and solution approach can handle the real case study efficiently.
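The chance-constrained transformation mentioned above can be illustrated on a single normally distributed demand. A toy example, not the paper's multi-objective model; `alpha` is an assumed service level:

```python
from statistics import NormalDist

def deterministic_equivalent(mu, sigma, alpha):
    """A chance constraint P(demand <= stock) >= alpha, with demand
    ~ Normal(mu, sigma), reduces to the deterministic bound
    stock >= F^{-1}(alpha), where F is the demand CDF."""
    return NormalDist(mu, sigma).inv_cdf(alpha)

# e.g. demand ~ Normal(100, 10): to cover demand with 97.5% probability,
# stock roughly mu + 1.96 * sigma is required
stock_lb = deterministic_equivalent(100, 10, 0.975)
```

The paper applies the same principle per scenario within its mixed-integer model; the point of the sketch is only that a probabilistic constraint becomes an ordinary inequality once the distribution's quantile is known.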

46 citations


Journal ArticleDOI
TL;DR: Mixed integer linear programming and constraint programming models for minimizing the makespan are presented, and the commercial solver is able to deliver feasible solutions for large instances of the size that appears in practice.

32 citations


Proceedings ArticleDOI
27 Apr 2020
TL;DR: This work proposes a scheduling model for converged networks supporting different traffic types and introduces a novel procedure for schedule planning of isochronous traffic which exploits the hierarchical structure of factory networks.
Abstract: Industry 4.0 and the vision of smart factories drive the need for real-time communication. Time-Sensitive Networking (TSN) augments IEEE Std 802.1Q with a family of mechanisms enabling real-time communication. One of the key mechanisms is the Time-Aware Shaper (TAS), which implements a TDMA scheme on a per-traffic-class basis. With proper synchronization it can even be used to schedule individual frames or streams. With this capability, the network can guarantee communication deadlines, bounded latency, and bounded jitter. However, these guarantees require a system-wide schedule to be calculated, which is an NP-hard problem. Current approaches are mainly based on constraint programming and optimization formulations and therefore do not scale well to larger topologies and numbers of streams. In this paper, our contribution is twofold: first, we propose a scheduling model for converged networks supporting different traffic types and, second, we introduce a novel procedure for schedule planning of isochronous traffic which exploits the hierarchical structure of factory networks. To this end, we split the network into sub-networks and use a two-stage approach based on a heuristic and tracing. Our evaluation shows that the new scheduling approach outperforms the reference scheduler by more than two orders of magnitude with regard to execution time.

30 citations


Journal ArticleDOI
TL;DR: This work develops the first exact decomposition approaches for a multi-level operating room planning and scheduling problem that integrates case mix planning, master surgical scheduling, and surgery sequencing in the presence of multiple surgical specialties.

29 citations


Journal ArticleDOI
TL;DR: This work introduces and analyzes four algorithmic ideas for this class of time/sequence-dependent over-subscribed scheduling problems with time windows: a novel hybridization of adaptive large neighbourhood search (ALNS) and tabu search (TS), a new randomization strategy for neighbourhood operators, a partial sequence dominance heuristic, and a fast insertion strategy.
Abstract: In intelligent manufacturing, it is important to schedule orders from customers efficiently. Make-to-order companies may have to reject or postpone orders when the production capacity does not meet the demand. Many such real-world scheduling problems are characterised by processing times being dependent on the start time (time dependency) or on the preceding orders (sequence dependency), and typically have an earliest and latest possible start time. We introduce and analyze four algorithmic ideas for this class of time/sequence-dependent over-subscribed scheduling problems with time windows: a novel hybridization of adaptive large neighbourhood search (ALNS) and tabu search (TS), a new randomization strategy for neighbourhood operators, a partial sequence dominance heuristic, and a fast insertion strategy. Through factor analysis, we demonstrate the performance of these new algorithmic features on problem domains with varying properties. Evaluation of the resulting general purpose algorithm on three domains—an order acceptance and scheduling problem, a real-world multi-orbit agile Earth observation satellite scheduling problem, and a time-dependent orienteering problem with time windows—shows that our hybrid algorithm robustly outperforms general algorithms including a mixed integer programming method, a constraint programming method, recent state-of-the-art problem-dependent meta-heuristic methods, and a two-stage hybridization of ALNS and TS.
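A minimal destroy-and-repair loop with randomized repair conveys the flavour of the ALNS component on a toy order-acceptance instance. An illustrative skeleton only, far simpler than the paper's hybrid; the instance and operators are invented:

```python
import random

def alns_order_acceptance(profits, weights, capacity, iters=200, seed=0):
    """Tiny ALNS-style loop: accept a subset of orders maximising profit
    under a capacity budget. Destroy drops accepted orders at random;
    repair greedily reinserts in a random order (the randomisation
    strategy keeps the search from cycling on one greedy solution)."""
    rng = random.Random(seed)
    n = len(profits)

    def value(sel):
        return sum(profits[i] for i in sel)

    def repair(sel):
        load = sum(weights[i] for i in sel)
        pool = [i for i in range(n) if i not in sel]
        rng.shuffle(pool)
        for i in pool:
            if load + weights[i] <= capacity:
                sel.add(i)
                load += weights[i]
        return sel

    cur = repair(set())
    best, best_val = set(cur), value(cur)
    for _ in range(iters):
        sel = set(cur)
        for i in list(sel):          # destroy: drop each order w.p. 1/2
            if rng.random() < 0.5:
                sel.discard(i)
        sel = repair(sel)            # repair: randomized greedy reinsertion
        if value(sel) >= value(cur):  # accept non-worsening moves
            cur = sel
        if value(sel) > best_val:
            best, best_val = set(sel), value(sel)
    return best, best_val
```

The paper's contribution is what happens around this skeleton: a tabu layer, partial-sequence dominance, and fast insertion, all of which prune or guide these destroy/repair moves.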

28 citations


Journal ArticleDOI
TL;DR: In this article, the authors proposed the Integrated Port Container Terminal Problem (IPCTP) that considers the joint optimization of quay crane assignment and scheduling, yard crane assignment, yard location assignment, and yard truck assignment, aiming at minimizing the turnover times of the vessels and maximizing terminal throughput.

28 citations


Journal ArticleDOI
TL;DR: The solution of this integrated scheduling problem for the container handling operations of a single vessel is improved in terms of instance size, solution efficiency, and solution optimality.

Posted Content
TL;DR: This work introduces SketchGraphs, a collection of 15 million sketches extracted from real-world CAD models coupled with an open-source data processing pipeline that demonstrates and establishes benchmarks for two use cases of the dataset: generative modeling of sketches and conditional generation of likely constraints given unconstrained geometry.
Abstract: Parametric computer-aided design (CAD) is the dominant paradigm in mechanical engineering for physical design. Distinguished by relational geometry, parametric CAD models begin as two-dimensional sketches consisting of geometric primitives (e.g., line segments, arcs) and explicit constraints between them (e.g., coincidence, perpendicularity) that form the basis for three-dimensional construction operations. Training machine learning models to reason about and synthesize parametric CAD designs has the potential to reduce design time and enable new design workflows. Additionally, parametric CAD designs can be viewed as instances of constraint programming and they offer a well-scoped test bed for exploring ideas in program synthesis and induction. To facilitate this research, we introduce SketchGraphs, a collection of 15 million sketches extracted from real-world CAD models coupled with an open-source data processing pipeline. Each sketch is represented as a geometric constraint graph where edges denote designer-imposed geometric relationships between primitives, the nodes of the graph. We demonstrate and establish benchmarks for two use cases of the dataset: generative modeling of sketches and conditional generation of likely constraints given unconstrained geometry.
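The geometric constraint graph representation described above can be sketched as a labelled graph; the primitives and constraint names below are invented for illustration and do not follow the SketchGraphs file format:

```python
# A sketch as a constraint graph: primitives are nodes, designer-imposed
# geometric relationships are labelled edges (toy representation).
sketch = {
    "nodes": {"l1": "line", "l2": "line", "a1": "arc"},
    "edges": [("l1", "l2", "perpendicular"),
              ("l1", "a1", "coincident"),
              ("l2", "a1", "tangent")],
}

def constraints_on(graph, node):
    """All constraint labels incident to a primitive, sorted."""
    return sorted(lbl for u, v, lbl in graph["edges"] if node in (u, v))
```

The dataset's second benchmark task (conditional constraint generation) amounts to predicting the labelled edges of such a graph given only its nodes.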

Journal ArticleDOI
TL;DR: This paper proposes a mobility-aware robust PRA approach (MRPRA) in heterogeneous networks that pre-allocates resources in both time and frequency domains among mobile users with users’ trajectories predicted by hidden Markov model.
Abstract: Proactive resource allocation (PRA) is an essential technology for intelligent communication, as it can make full use of prediction and significantly improve network performance. However, most of the promised gains rest on perfect prediction, which is unrealistic. How to make PRA robust against prediction uncertainty while maximizing the benefits brought by prediction thus becomes an important issue. In this paper, we tackle this problem and propose a mobility-aware robust PRA approach (MRPRA) for heterogeneous networks. MRPRA pre-allocates resources in both the time and frequency domains among mobile users, with users' trajectories predicted by a hidden Markov model. The objective is to minimize service delay under constraints reflecting different levels of quality-of-service (QoS) requirements and mobility intensity. MRPRA is robust against prediction uncertainty because it uses probabilistic constraint programming to model QoS requirements in a probabilistic sense. To this end, the probabilistic distribution of the achievable rate is derived. To flexibly coordinate resource allocation among multiple mobile users over the time horizon, a deep reinforcement learning based multi-actor deep deterministic policy gradient algorithm is designed. It learns robust PRA policies by distributed acting and centralized criticizing. Extensive numerical simulations are performed to analyze the performance of the proposed approach.

Journal ArticleDOI
Andy Ham1
TL;DR: Two different constraint programming formulations are proposed for the first time for a flexible job shop scheduling problem with transbots, significantly outperforming all other benchmark approaches in the literature and proving optimality on well-known benchmark instances.
Abstract: This paper studies the simultaneous scheduling of production and material transfer in a flexible job shop environment. The simultaneous scheduling approach has recently been adopted in robotic mobile fulfillment systems, wherein transbots pick up jobs and deliver them to pick-stations for processing, which requires simultaneous scheduling of jobs, transbots, and stations. Two different constraint programming formulations are proposed for the first time for a flexible job shop scheduling problem with transbots, significantly outperforming all other benchmark approaches in the literature and proving optimality on well-known benchmark instances.

Journal ArticleDOI
TL;DR: This study investigates this type of task allocation to minimise makespan in the printed circuit board industry, using a Constraint Programming (CP) based approach to solve the problem as the main novelty of the study.
Abstract: The advancement of technology and the empowerment of the industry have made humans and robots more closely tied together, known as human-robot collaboration. A sector that specifically utilises thi...

Journal ArticleDOI
TL;DR: This paper addresses the trade-off between makespan and total energy consumption in hybrid flowshops, where machines can operate at varying speed levels and a bi-objective mixed-integer linear programming (MILP) model and aBi-Objective constraint programming (CP) model are proposed for the problem employing speed scaling.
Abstract: Due to its practical relevance, the hybrid flowshop scheduling problem (HFSP) has been widely studied in the literature with objectives related to production efficiency. However, studies regarding energy consumption and environmental effects have been rather limited. This paper addresses the trade-off between makespan and total energy consumption in hybrid flowshops where machines can operate at varying speed levels. A bi-objective mixed-integer linear programming (MILP) model and a bi-objective constraint programming (CP) model are proposed for the problem employing speed scaling. Since minimizing makespan and minimizing total energy consumption are conflicting objectives, the augmented epsilon-constraint approach is used to obtain Pareto-optimal solutions. While close approximations of the Pareto-optimal frontier are obtained for small-sized instances, sets of non-dominated solutions are obtained for large instances by solving the MILP and CP models under a time limit. As the problem is NP-hard, two variants of the iterated greedy algorithm, a variable block insertion heuristic, and four variants of ensembles of metaheuristic algorithms are also proposed, as well as a novel constructive heuristic. The performance of the proposed seven bi-objective metaheuristics is compared with that of the MILP and CP models on a set of well-known HFSP benchmarks in terms of cardinality, closeness, and diversity of the solutions. Initially, the algorithms are tested on small-sized instances against the Pareto-optimal solutions. It is then shown that the proposed algorithms are very effective for large instances in terms of both solution quality and CPU time.
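For a discrete set of candidate schedules, the epsilon-constraint idea used above (minimize one objective subject to a bound on the other, sweeping the bound) can be sketched as follows. A toy enumeration over explicit points, not the augmented variant from the paper:

```python
def epsilon_constraint_front(solutions, f1, f2):
    """Enumerate the non-dominated points of a discrete bi-objective
    minimisation problem: minimise f1 subject to f2 <= eps, sweeping
    eps over all attainable f2 values, with a lexicographic tie-break
    on f2 so weakly dominated points are filtered out."""
    front = []
    for eps in sorted({f2(s) for s in solutions}, reverse=True):
        feasible = [s for s in solutions if f2(s) <= eps]
        best = min(feasible, key=lambda s: (f1(s), f2(s)))
        point = (f1(best), f2(best))
        if point not in front:
            front.append(point)
    return front
```

In the paper the inner "min" is a full MILP or CP solve rather than a scan over an explicit list, but the sweep structure is the same.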

Journal ArticleDOI
TL;DR: A flexible repetitive scheduling model is proposed that integrates soft logic into time-cost tradeoffs, allowing the same activities in different units to be performed in parallel, in sequence, or partly in parallel and partly in sequence.

Journal ArticleDOI
TL;DR: An effective Variable Neighborhood Search (VNS) algorithm is presented that incorporates the proposed decoding scheme, uses a self-tuning routine to set its most important parameter, and exhibits consistent and very competitive performance in terms of computation time and solution quality.

Journal ArticleDOI
TL;DR: This paper proposes a new, more scalable constraint programming approach for learning decision trees with a fixed maximum depth that minimize the classification error, efficiently creating optimal decision trees of limited depth.
Abstract: Decision trees are among the most popular classification models in machine learning. Traditionally, they are learned using greedy algorithms. However, such algorithms pose several disadvantages: it is difficult to limit the size of the decision trees while maintaining a good classification accuracy, and it is hard to impose additional constraints on the models that are learned. For these reasons, there has been a recent interest in exact and flexible algorithms for learning decision trees. In this paper, we introduce a new approach to learn decision trees using constraint programming. Compared to earlier approaches, we show that our approach obtains better performance, while still being sufficiently flexible to allow for the inclusion of constraints. Our approach builds on three key building blocks: (1) the use of AND/OR search, (2) the use of caching, (3) the use of the CoverSize global constraint proposed recently for the problem of itemset mining. This allows our constraint programming approach to deal in a much more efficient way with the decompositions in the learning problem.

Journal ArticleDOI
TL;DR: This work presents a Constraint Programming approach capable of automating the short-term scheduling process in a cut-and-fill mine, and introduces two models: one that directly solves the original interruptible scheduling problem, and one that is based on solving a related uninterruptable scheduling problem and transforming its solution back to the original domain.

Book ChapterDOI
25 Jun 2020
TL;DR: The Glasgow Subgraph Solver provides an implementation of state-of-the-art algorithms for subgraph isomorphism problems, and is suitable for use on a wide range of graphs, including many that other solvers find computationally hard.
Abstract: The Glasgow Subgraph Solver provides an implementation of state-of-the-art algorithms for subgraph isomorphism problems. It combines constraint programming concepts with a variety of strong but fast domain-specific search and inference techniques, and is suitable for use on a wide range of graphs, including many that other solvers find computationally hard. It can also be equipped with side constraints, and can easily be adapted to solve other subgraph matching problem variants. We outline its key features from the view of both users and algorithm developers, and discuss future directions.

Journal ArticleDOI
TL;DR: New mixed integer and constraint programming (CP) models are proposed for the developed integrated flexible project scheduling problem, and the real-world applicability of the suggested CP models is shown by additionally solving a large industry case.

Proceedings Article
05 May 2020
TL;DR: A formal model for providing a justification for a given choice in the context of a given corpus of basic normative principles (so-called axioms) on which to base any possible step-by-step explanation for why a given target outcome has been or should be selected in a given situation is proposed.
Abstract: Given the preferences of several agents over a set of alternatives, there may be competing views on which of the alternatives would be the "best" compromise. We propose a formal model, grounded in social choice theory, for providing a justification for a given choice in the context of a given corpus of basic normative principles (so-called axioms) on which to base any possible step-by-step explanation for why a given target outcome has been or should be selected in a given situation. Thus, our notion of justification has both an explanatory and a normative component. We also develop an algorithm for computing such justifications that exploits the analogy between the notion of explanation and the concept of minimal unsatisfiable subset used in constraint programming. Finally, we report on an application of a proof-of-concept implementation of our approach to run an experimental study of the explanatory power of several axioms proposed in the social choice literature.
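The minimal-unsatisfiable-subset concept the abstract refers to can be computed with the standard deletion-based algorithm, sketched here over toy constraints on an integer variable. An illustrative sketch, not the authors' implementation:

```python
def satisfiable(constraints, domain=range(10)):
    """Naive check: does any value in `domain` satisfy all constraints?"""
    return any(all(c(x) for c in constraints) for x in domain)

def minimal_unsat_subset(constraints, domain=range(10)):
    """Deletion-based MUS extraction: try removing each constraint; if the
    rest is still unsatisfiable, the constraint was not needed. What
    survives is a minimal unsatisfiable subset (every member necessary)."""
    assert not satisfiable(constraints, domain)
    mus = list(constraints)
    i = 0
    while i < len(mus):
        trial = mus[:i] + mus[i + 1:]
        if not satisfiable(trial, domain):
            mus = trial   # constraint i is redundant for unsatisfiability
        else:
            i += 1        # constraint i is necessary; keep it
    return mus
```

In the paper's setting, the "constraints" are normative axioms plus the negation of the target outcome, and the surviving subset plays the role of a minimal justification.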

Journal ArticleDOI
TL;DR: Mathematical models are developed for bi-level programming in a Stackelberg game under a type-2 fuzzy environment, and a LINGO iterative scheme with fuzzy programming is used to solve the deterministic problem.
Abstract: This paper develops mathematical models for bi-level programming in a Stackelberg game under a type-2 fuzzy environment. In the first case, the parameters of the objective functions on both levels are considered type-2 fuzzy numbers, whereas in the second case the parameters of both the objective functions and the constraints are type-2 fuzzy numbers. In the first case, critical-value-based reduction methods are applied to reduce type-2 fuzzy numbers to type-1 fuzzy numbers, after which the centroid method is used to completely defuzzify them; the obtained results are compared using a LINGO iterative scheme and a genetic algorithm. In the second case, chance-constrained programming with a generalized credibility measure is used to convert the fuzzy problem to its equivalent crisp form, and the LINGO iterative scheme is used to solve the deterministic problem via fuzzy programming. A sensitivity analysis over different credibility levels of the right-hand side of the constraints shows the value of the objective function at each level. Finally, real-life numerical problems are presented to show the performance of the proposed models and techniques, and conclusions and an outlook are given.

Proceedings ArticleDOI
09 Jul 2020
TL;DR: This paper introduces a new approach to learn decision trees using constraint programming, and shows that this approach obtains better performance, while still being sufficiently flexible to allow for the inclusion of constraints.
Abstract: Decision trees are among the most popular classification models in machine learning. Traditionally, they are learned using greedy algorithms. However, such algorithms have their disadvantages: it is difficult to limit the size of the decision trees while maintaining a good classification accuracy, and it is hard to impose additional constraints on the models that are learned. For these reasons, there has been a recent interest in exact and flexible algorithms for learning decision trees. In this paper, we introduce a new approach to learn decision trees using constraint programming. Compared to earlier approaches, we show that our approach obtains better performance, while still being sufficiently flexible to allow for the inclusion of constraints. Our approach builds on three key building blocks: (1) the use of AND/OR search, (2) the use of caching, (3) the use of the CoverSize global constraint proposed recently for the problem of itemset mining. This allows our constraint programming approach to deal in a much more efficient way with the decompositions in the learning problem.

Journal ArticleDOI
TL;DR: A compact integer linear programming (ILP) model based on a discretisation of the defective object is proposed for this problem, and a Benders decomposition algorithm and a constraint-programming (CP) based algorithm are developed.
Abstract: This paper addresses a variant of two-dimensional cutting problems in which rectangular small pieces are obtained by cutting a rectangular object through guillotine cuts. The characteristics of thi...

Journal ArticleDOI
TL;DR: The overall efficiency of the standard branch-and-bound algorithm is demonstrated on a standard set of benchmarks from the literature, in comparison with the best state-of-the-art alternative.

Journal ArticleDOI
TL;DR: A constraint programming (CP) model is formulated to address both small and large data sets; results indicate that the CP model finds optimal solutions in approximately 90% of the instances and small optimality gaps in the remaining ones.

Journal ArticleDOI
TL;DR: A multi-objective mixed integer linear programming model is developed to jointly determine the best suppliers and purchasing quantities, along with the number and type of transportation modes, and a novel flexible-possibilistic programming approach is proposed to cope with the epistemic uncertainty and the flexibility of constraints.

Book ChapterDOI
21 Sep 2020
TL;DR: This work focuses on constrained clustering, a semi-supervised learning task that involves using limited amounts of labelled data, formulated as constraints, to improve clustering accuracy, and presents an Ising modeling framework that is flexible enough to support various types of constraints.
Abstract: The recent emergence of novel hardware platforms, such as quantum computers and Digital/CMOS annealers, capable of solving combinatorial optimization problems has spurred interest in formulating key problems as Ising models, a mathematical abstraction shared by a number of these platforms. In this work, we focus on constrained clustering, a semi-supervised learning task that involves using limited amounts of labelled data, formulated as constraints, to improve clustering accuracy. We present an Ising modeling framework that is flexible enough to support various types of constraints and we instantiate the framework with two common types of constraints: pairwise instance-level and partition-level. We study the proposed framework, both theoretically and empirically, and demonstrate how constrained clustering problems can be solved on a specialized CMOS annealer. Empirical evaluation across eight benchmark sets shows that our framework outperforms the state-of-the-art heuristic algorithms and that, unlike those algorithms, it can solve problems that involve combinations of constraint types. We also show that our framework provides high quality solutions orders of magnitudes more quickly than a recent constraint programming approach, making it suitable for mainstream data mining tasks.
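A two-cluster instance with pairwise must-link/cannot-link constraints can be encoded as an Ising ground-state problem and, at toy scale, solved by brute force. A sketch of the encoding idea only, not the paper's annealer framework:

```python
from itertools import product

def ising_cluster(n, must_link, cannot_link):
    """Two-cluster constrained clustering as an Ising ground state.
    Spin +1/-1 encodes the cluster label of each item. Must-link pairs
    get ferromagnetic couplings (favouring equal spins), cannot-link
    pairs antiferromagnetic ones; when the constraints are consistent,
    the minimum-energy configuration satisfies all of them.
    Brute force over 2^n states, so illustrative only."""
    J = {}
    for i, j in must_link:
        J[(i, j)] = 1.0     # reward s_i == s_j
    for i, j in cannot_link:
        J[(i, j)] = -1.0    # reward s_i != s_j
    best, best_e = None, float('inf')
    for spins in product([-1, 1], repeat=n):
        # Ising energy E(s) = -sum_{ij} J_ij * s_i * s_j
        e = -sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
        if e < best_e:
            best, best_e = spins, e
    return best
```

Specialized hardware such as the CMOS annealer in the paper minimizes exactly this kind of energy function, but over thousands of spins and with additional similarity-based couplings rather than by enumeration.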