
Showing papers on "Constraint programming published in 2017"


BookDOI
01 Jan 2017
TL;DR: This work proposes a global ranking constraint and shows that GAC can be achieved in polynomial time and proposes an O(n^3 log n) algorithm for achieving RC as well as an efficient quadratic algorithm offering a better tradeoff.
Abstract: In many problems we want to reason about the ranking of items. For example, in information retrieval, when aggregating several search results, we may have ties and consequently rank orders (e.g. [2, 3]). As a second example, we may wish to construct an overall ranking of tennis players based on pairwise comparisons between players. One principled method for constructing a ranking is the Kemeny distance [5], as this is the unique scheme that is neutral, consistent, and Condorcet. Unfortunately, determining this ranking is NP-hard, and remains so when we permit ties in the input or output [4]. As a third example, tasks in a scheduling problem may run in parallel, resulting in a ranking. In a ranking, unlike a permutation, we can have ties. Thus, 12225 is a ranking whilst 12345 is a permutation. To reason about permutations, we have efficient and effective global constraints. Regin [7] proposed an O(n^4) GAC propagator for permutations. For BC, there is an even faster O(n log n) propagator [6]. Every constraint toolkit now provides propagators for permutation constraints. Surprisingly, ranking constraints are not yet supported. In [1], we tackle this weakness by proposing a global ranking constraint. We show that simple decompositions of this constraint hurt pruning. We then show that GAC can be achieved in polynomial time, and we propose an O(n^3 log n) algorithm for achieving RC as well as an efficient quadratic algorithm offering a better tradeoff.
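As a concrete illustration of the distinction drawn above, here is a plain-Python sketch (not code from [1]; all names are invented for illustration) that checks whether a vector is a ranking or a permutation, and brute-forces a Kemeny-style median over permutations for a tiny preference profile — feasible only at toy sizes, since the problem is NP-hard.

```python
# Minimal sketch: rankings vs. permutations, and brute-force Kemeny aggregation.
from itertools import permutations


def is_ranking(r):
    """True if r is a valid ranking with ties (e.g. [1, 2, 2, 2, 5])."""
    return all(rank == 1 + sum(1 for s in r if s < rank) for rank in r)


def is_permutation(r):
    """True if r is a permutation of 1..n (a ranking without ties)."""
    return sorted(r) == list(range(1, len(r) + 1))


def kendall_tau(r, s):
    """Number of item pairs ordered differently by r and s."""
    n = len(r)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if (r[i] - r[j]) * (s[i] - s[j]) < 0)


def kemeny_permutation(votes):
    """Brute-force Kemeny median restricted to permutations (exponential!)."""
    n = len(votes[0])
    return min(permutations(range(1, n + 1)),
               key=lambda p: sum(kendall_tau(p, v) for v in votes))


if __name__ == "__main__":
    print(is_ranking([1, 2, 2, 2, 5]), is_permutation([1, 2, 2, 2, 5]))  # True False
    print(is_ranking([1, 2, 3, 4, 5]), is_permutation([1, 2, 3, 4, 5]))  # True True
    print(kemeny_permutation([(1, 2, 3), (1, 3, 2), (2, 1, 3)]))         # (1, 2, 3)
```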

113 citations


Journal ArticleDOI
TL;DR: This dissertation explores Constraint Programming (CP) and proposes two CP-based models for constrained clustering tasks. It shows that these models can easily be embedded in a more general process, illustrated on the problem of finding the Pareto front of a bi-criterion optimization problem.

83 citations


Journal ArticleDOI
TL;DR: This paper proposes a methodology called Empirical Model Learning (EML) that relies on Machine Learning for obtaining components of a prescriptive model, using data either extracted from a predictive model or harvested from a real system, and uses two learning methods, namely Artificial Neural Networks and Decision Trees.

82 citations


Journal ArticleDOI
TL;DR: In this paper, the constraint programming (CP) approach is applied to the simple assembly line balancing problem (SALBP) as well as some of its generalizations; the proposed formulations are conversions of well-known mixed-integer linear programming (MILP) formulations to CP, along with a new set of constraints that helps the CP solver converge faster.
Abstract: In this paper, the constraint programming (CP) approach is applied to the simple assembly line balancing problem (SALBP) as well as some of its generalizations. CP is a rich modeling language that enables the formulation of general combinatorial problems and is coupled with a strong set of solution methods available through general-purpose solvers. The proposed formulations are conversions of well-known mixed-integer linear programming (MILP) formulations to CP, along with a new set of constraints that helps the CP solver converge faster. As a generic solution method, we compare its performance with the best known generic MILP formulations and show that it consistently outperforms MILP for medium to large problem instances. A comparison with SALOME, a well-known custom-made algorithm for solving the SALBP-1, shows that both approaches are capable of efficiently solving problems with up to 100 tasks. When 1000-task problems are concerned, SALOME provides better performance; however, CP can provide relatively good, close-to-optimal solutions for multiple combinations of problem parameters. Finally, the generality of the CP approach is demonstrated by some adaptations of the proposed formulation to other variants of the assembly line balancing problem, including the U-shaped assembly line balancing problem and the task assignment and equipment selection problem.
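To make the modelling idea concrete, here is a minimal SALBP-1 sketch, assuming Google OR-Tools CP-SAT rather than the solver and formulations used in the paper; the task data, cycle time and variable names are invented for illustration.

```python
# Toy SALBP-1: assign tasks to stations, respect precedence and the cycle time,
# minimise the number of stations used.
from ortools.sat.python import cp_model

durations = [3, 4, 2, 5, 3]                              # hypothetical task times
precedence = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]    # (a, b): a before b
cycle_time = 7
n_tasks = len(durations)
max_stations = n_tasks                                   # trivial upper bound

model = cp_model.CpModel()
station = [model.NewIntVar(0, max_stations - 1, f"st_{t}") for t in range(n_tasks)]
assign = [[model.NewBoolVar(f"x_{t}_{s}") for s in range(max_stations)]
          for t in range(n_tasks)]

for t in range(n_tasks):
    # channel the integer station variable to the 0/1 assignment variables
    model.AddExactlyOne(assign[t])
    model.Add(station[t] == sum(s * assign[t][s] for s in range(max_stations)))

for s in range(max_stations):
    # station workload must fit into the cycle time
    model.Add(sum(durations[t] * assign[t][s] for t in range(n_tasks)) <= cycle_time)

for a, b in precedence:
    model.Add(station[a] <= station[b])

n_stations = model.NewIntVar(1, max_stations, "n_stations")
for t in range(n_tasks):
    model.Add(station[t] + 1 <= n_stations)
model.Minimize(n_stations)

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    print("stations used:", solver.Value(n_stations))
    print("assignment:", [solver.Value(v) for v in station])
```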

78 citations


Journal ArticleDOI
TL;DR: A novel hybrid framework uses Constraint Programming to generate initial feasible solutions that are fed as ‘warm start’ solutions to a Mixed Integer Programming solver for further improvement; using the Mixed Integer Programming solver to improve the initial solutions generated by Constraint Programming is shown to be significantly superior to addressing the problem as a Constraint Optimisation Problem.

52 citations


Journal ArticleDOI
TL;DR: A multi-level approach to the modeling and solving of combinatorial optimization problems, which is versatile and effective owing to the use of multi-level presolving and multiple paradigms, such as constraint programming, logic programming, mathematical programming and fuzzy logic, for their complementary strengths.
Abstract: Constraints, although ubiquitous in production and distribution planning, scheduling and control, often lead to inconsistencies in the decision-making process. Constraint-based modeling helps circumvent many such organization-impacting issues. To this end, we developed a multi-level approach to the modeling and solving of combinatorial optimization problems. It is versatile and effective owing to the use of multi-level presolving and multiple paradigms, such as constraint programming, logic programming, mathematical programming and fuzzy logic, for their complementary strengths. The capability of this framework and its advantage over mathematical programming alone or over hybrid frameworks is shown in an illustrative example, in which combinatorial optimization is used as a benchmark to prove the effectiveness of the proposed approach. Knowledge of the problem is stored in the form of facts.

48 citations


Journal ArticleDOI
TL;DR: A new method is constructed from a combination of the Charnes−Cooper scheme and the multi-objective linear programming problem, and this method is compared with some existing methods.
Abstract: This paper deals with developing an efficient algorithm for solving the fully fuzzy linear fractional programming problem. To this end, we construct a new method obtained from a combination of the Charnes−Cooper scheme and the multi-objective linear programming problem. Furthermore, the application of the proposed method to real-life problems is presented, and the method is compared with some existing methods. The numerical experiments and comparisons show promising results for finding the fuzzy optimal solution.
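For reference, the classical crisp Charnes−Cooper transformation that the proposed method builds on can be stated as follows; the paper's contribution lies in combining it with fuzzy coefficients and a multi-objective reformulation, which is not shown here.

```latex
% Classical (crisp) Charnes-Cooper transformation of a linear fractional program.
\max_{x}\; \frac{c^{\top}x+\alpha}{d^{\top}x+\beta}
\quad \text{s.t.}\quad Ax \le b,\; x \ge 0,\; d^{\top}x+\beta > 0.
% Substituting t = 1/(d^{\top}x+\beta) and y = t\,x yields the equivalent linear program
\max_{y,t}\; c^{\top}y+\alpha t
\quad \text{s.t.}\quad Ay - bt \le 0,\; d^{\top}y+\beta t = 1,\; y \ge 0,\; t \ge 0.
```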

46 citations


Journal ArticleDOI
TL;DR: A new algorithm is introduced and described in detail to perform Associative–Commutative Common Subexpression Elimination (AC-CSE) in constraint problems, significantly improving existing CSE techniques for associative and commutative operators such as +.

45 citations


Journal ArticleDOI
TL;DR: In this paper, the problem of determining the exact lower bound of the number of active S-boxes for 6-round AES-128 in the related-key model was solved using constraint programming.
Abstract: Searching for different types of distinguishers is a common task in symmetric-key cryptanalysis. In this work, we employ the constraint programming (CP) technique to tackle such problems. First, we show that a simple application of the CP approach proposed by Gerault et al. leads to the solution of the open problem of determining the exact lower bound of the number of active S-boxes for 6-round AES-128 in the related-key model. Subsequently, we show that the same approach can be applied in searching for integral distinguishers, impossible differentials, and zero-correlation linear approximations, in both the single-key and related-(twea)key model. We implement the method using the open source constraint solver Choco and apply it to the block ciphers PRESENT, SKINNY, and HIGHT (an ARX construction). As a result, we find 16 related-tweakey impossible differentials for 12-round SKINNY-64-128, based on which we construct an 18-round attack on SKINNY-64-128 (one target version for the crypto competition https://sites.google.com/site/skinnycipher announced at ASK 2016). Moreover, we show that in some cases, when equipped with proper strategies (ordering heuristic, restart and dynamic branching strategy), the CP approach can be very efficient. Therefore, we suggest that the constraint programming technique should become a convenient tool in the hands of symmetric-key cryptanalysts.
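The word-level flavour of such models can be sketched as follows, under heavy simplification: a single-key, truncated-difference toy with one AES-like column of four words and an MDS layer of branch number 5, written with OR-Tools CP-SAT instead of Choco. It is only an illustration of the modelling style, not the bit-level related-key model from the paper.

```python
# Toy word-level "active S-box" counting with CP.
from ortools.sat.python import cp_model

ROUNDS, WORDS, BRANCH = 3, 4, 5

model = cp_model.CpModel()
# act[r][i] = 1 iff word i carries a non-zero difference at the start of round r
act = [[model.NewBoolVar(f"a_{r}_{i}") for i in range(WORDS)] for r in range(ROUNDS + 1)]

# non-trivial characteristic: the input difference is not all-zero
model.AddBoolOr(act[0])

for r in range(ROUNDS):
    # MDS layer between rounds r and r+1: the number of active input and output
    # words is either 0 or at least the branch number
    mixing_active = model.NewBoolVar(f"mix_{r}")
    total = sum(act[r]) + sum(act[r + 1])
    model.Add(total >= BRANCH).OnlyEnforceIf(mixing_active)
    model.Add(total == 0).OnlyEnforceIf(mixing_active.Not())

# every active word passes through one S-box per round
model.Minimize(sum(act[r][i] for r in range(ROUNDS) for i in range(WORDS)))

solver = cp_model.CpSolver()
if solver.Solve(model) == cp_model.OPTIMAL:
    print("minimum number of active S-boxes:", int(solver.ObjectiveValue()))
```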

42 citations


Journal ArticleDOI
TL;DR: This work considers the problem of packing a set of rectangular items into a strip of fixed width, without overlapping, using minimum height, and proposes an alternative method, based on Benders' decomposition, which can be successfully used to solve relevant related problems, like rectangle packing and pallet loading.
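A minimal strip-packing model in this spirit, assuming OR-Tools CP-SAT and its two-dimensional no-overlap constraint rather than the Benders' decomposition proposed in the paper, could look as follows (item sizes are invented for illustration).

```python
# Toy strip packing: pack rectangles into a strip of width W, minimise the height.
from ortools.sat.python import cp_model

W = 5
items = [(2, 3), (3, 2), (2, 2), (5, 1), (1, 4)]    # (width, height)
H_max = sum(h for _, h in items)                     # trivial upper bound

model = cp_model.CpModel()
x_iv, y_iv, y_end = [], [], []
for i, (w, h) in enumerate(items):
    x = model.NewIntVar(0, W - w, f"x_{i}")
    y = model.NewIntVar(0, H_max - h, f"y_{i}")
    xe = model.NewIntVar(0, W, f"xe_{i}")
    ye = model.NewIntVar(0, H_max, f"ye_{i}")
    model.Add(xe == x + w)
    model.Add(ye == y + h)
    x_iv.append(model.NewIntervalVar(x, w, xe, f"xi_{i}"))
    y_iv.append(model.NewIntervalVar(y, h, ye, f"yi_{i}"))
    y_end.append(ye)

model.AddNoOverlap2D(x_iv, y_iv)                     # no two rectangles overlap

height = model.NewIntVar(0, H_max, "height")
for ye in y_end:
    model.Add(ye <= height)
model.Minimize(height)

solver = cp_model.CpSolver()
if solver.Solve(model) == cp_model.OPTIMAL:
    print("optimal strip height:", solver.Value(height))
```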

42 citations


Journal ArticleDOI
TL;DR: In this paper, the authors provide a comprehensive account of the constraint answer set language and solver ezcsp, a mainstream representative of this research area that has been used in various successful applications.
Abstract: Researchers in answer set programming and constraint programming have spent significant efforts in the development of hybrid languages and solving algorithms combining the strengths of these traditionally separate fields. These efforts resulted in a new research area: constraint answer set programming. Constraint answer set programming languages and systems proved to be successful at providing declarative, yet efficient solutions to problems involving hybrid reasoning tasks. One of the main contributions of this paper is the first comprehensive account of the constraint answer set language and solver ezcsp, a mainstream representative of this research area that has been used in various successful applications. We also develop an extension of the transition systems proposed by Nieuwenhuis et al. in 2006 to capture Boolean satisfiability solvers. We use this extension to describe the ezcsp algorithm and prove formal claims about it. The design and algorithmic details behind ezcsp clearly demonstrate that the development of hybrid systems of this kind is challenging. Many questions arise when one faces various design choices in an attempt to maximize the system's benefits. One of the key decisions that a developer of a hybrid solver makes is settling on a particular integration schema within its implementation. Thus, another important contribution of this paper is a thorough case study based on ezcsp, focused on the various integration schemas that it provides.

Journal ArticleDOI
TL;DR: The MiningZinc language allows one to model constraint-based itemset mining problems in a solver independent way, and its execution mechanism can automatically chain different algorithms and solvers, leading to a unique combination of declarative modeling with high-performance solving.

Journal ArticleDOI
TL;DR: The results of the performance evaluation demonstrate the effectiveness of MRCP-RM/HCP-RM in generating a schedule that leads to a low proportion of jobs missing their deadlines (P) and also provide insights into system behaviour and performance.
Abstract: Resource allocation and scheduling on clouds are required to harness the power of the underlying resource pool such that the service provider can meet the quality of service requirements of users, which are often captured in service level agreements (SLAs). This paper focuses on resource allocation and scheduling on clouds and clusters that process MapReduce jobs with SLAs. The resource allocation and scheduling problem is modelled as an optimization problem using constraint programming, and a novel MapReduce Constraint Programming based Resource Management algorithm (MRCP-RM) is devised that can effectively process an open stream of MapReduce jobs where each job is characterized by an SLA comprising an earliest start time, a required execution time, and an end-to-end deadline. A detailed performance evaluation of MRCP-RM is conducted for an open system subjected to a stream of job arrivals using both simulation and experimentation on a real system. The experiments on a real system are performed on a Hadoop cluster (deployed on Amazon EC2) that runs our new Hadoop Constraint Programming based Resource Management algorithm (HCP-RM) that incorporates a technique for handling data locality. The results of the performance evaluation demonstrate the effectiveness of MRCP-RM/HCP-RM in generating a schedule that leads to a low proportion of jobs missing their deadlines (P) and also provide insights into system behaviour and performance. In the simulation experiments, it is observed that MRCP-RM achieves on average an 82 percent lower P compared to a technique from the existing literature when processing a synthetic workload from Facebook. Furthermore, in the experiments performed on a Hadoop cluster deployed on Amazon EC2, it is observed that HCP-RM achieved on average a 63 percent lower P compared to an EDF-Scheduler for a wide variety of workload and system parameters experimented with.
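The underlying optimization problem can be sketched as follows, assuming OR-Tools CP-SAT and a drastically simplified setting (a single pool of identical slots, made-up SLAs); this is not the MRCP-RM/HCP-RM algorithm itself.

```python
# Toy SLA scheduling: jobs with earliest start, execution time and deadline on a
# small pool of identical slots, minimising the number of late jobs.
from ortools.sat.python import cp_model

# (earliest_start, exec_time, deadline) -- hypothetical SLAs
jobs = [(0, 4, 6), (1, 3, 5), (2, 5, 12), (0, 2, 3), (3, 4, 9)]
slots = 2
horizon = max(r for r, _, _ in jobs) + sum(p for _, p, _ in jobs)

model = cp_model.CpModel()
intervals, late = [], []
for j, (release, proc, deadline) in enumerate(jobs):
    start = model.NewIntVar(release, horizon, f"s_{j}")
    end = model.NewIntVar(release, horizon, f"e_{j}")
    model.Add(end == start + proc)
    intervals.append(model.NewIntervalVar(start, proc, end, f"iv_{j}"))
    is_late = model.NewBoolVar(f"late_{j}")
    model.Add(end <= deadline).OnlyEnforceIf(is_late.Not())  # on time unless marked late
    late.append(is_late)

# at most `slots` jobs run concurrently
model.AddCumulative(intervals, [1] * len(jobs), slots)
model.Minimize(sum(late))

solver = cp_model.CpSolver()
if solver.Solve(model) == cp_model.OPTIMAL:
    print("late jobs:", int(solver.ObjectiveValue()))
```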

Journal ArticleDOI
TL;DR: Five different mathematical programming models and two constraint programming models are developed for the no-wait flow shop problem with due date constraints, and an exact algorithm that takes advantage of the unique characteristics of the problem is designed.
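The no-wait structure that such models exploit can be illustrated with a short plain-Python computation of completion times for a given job order (not one of the paper's models; the data is invented).

```python
# No-wait flow shop: once a job starts it flows through all machines without waiting,
# so consecutive jobs must be delayed enough on the first machine.
def no_wait_schedule(order, proc, due):
    """proc[j][m]: processing time of job j on machine m; returns (makespan, lateness)."""
    machines = len(proc[0])
    # cum[j][m]: total processing of job j on the first m machines
    cum = [[sum(p[:m]) for m in range(machines + 1)] for p in proc]
    start = {order[0]: 0}
    for prev, cur in zip(order, order[1:]):
        # minimum start-time gap so cur never overtakes prev on any machine
        delay = max(cum[prev][m + 1] - cum[cur][m] for m in range(machines))
        start[cur] = start[prev] + delay
    finish = {j: start[j] + cum[j][machines] for j in order}
    makespan = max(finish.values())
    lateness = {j: finish[j] - due[j] for j in order}
    return makespan, lateness


if __name__ == "__main__":
    proc = [[2, 3, 1], [1, 4, 2], [3, 1, 2]]   # 3 jobs, 3 machines (illustrative)
    due = [8, 10, 14]
    print(no_wait_schedule([0, 1, 2], proc, due))
```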

Journal ArticleDOI
TL;DR: Experiments conducted on three variants of the time-dependent traveling salesman problem indicate that the proposed techniques substantially outperform general-purpose methods, such as mixed-integer linear programming and constraint programming models.
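For intuition, a plain-Python brute force over tours with a departure-time-dependent travel-time function (a made-up congestion model, not the paper's instances or methods) looks like this.

```python
# Toy time-dependent TSP: travel times depend on the departure time; enumerate all
# tours and return the one with the earliest return to the depot.
from itertools import permutations


def travel_time(i, j, t, base):
    """Hypothetical congestion model: 50% slower during 'rush hour' t in [4, 8)."""
    return base[i][j] * (1.5 if 4 <= t < 8 else 1.0)


def best_tour(base):
    n = len(base)
    best = (float("inf"), None)
    for perm in permutations(range(1, n)):          # tours start/end at depot 0
        t, tour = 0.0, (0,) + perm + (0,)
        for a, b in zip(tour, tour[1:]):
            t += travel_time(a, b, t, base)         # departure-time dependent cost
        best = min(best, (t, tour))
    return best


if __name__ == "__main__":
    base = [[0, 2, 4, 3],
            [2, 0, 3, 5],
            [4, 3, 0, 2],
            [3, 5, 2, 0]]
    print(best_tour(base))
```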

Journal Article
TL;DR: Nine scenarios are analyzed to reflect the impacts of the imprecision (fuzziness and randomness) associated with the size of the population in a plume emergency planning zone and the results are valuable for supporting local decision makers to generate effective emergency evacuation strategies.
Abstract: Nuclear power accidents are one of the most dangerous disasters posing a lethal threat to human health and have detrimental effects lasting for decades. Therefore, emergency evacuation is important to minimize injuries and prevent lethal consequences resulting from a nuclear power accident. An inexact fuzzy stochastic chance constrained programming (IFSCCP) method is developed to address various uncertainties in evacuation management problems. It integrates the interval-parameter programming (IPP) and fuzzy stochastic chance constrained programming (FSCCP) methods into a general framework, in which the IPP method addresses the uncertainties presented as intervals defined by crisp lower and upper bounds, and the FSCCP method treats the dual uncertainties expressed as fuzzy random variables. The measures of possibility and necessity are employed to convert the fuzzy random variables into crisp values to reflect the decision maker's pessimistic and optimistic preferences. The IFSCCP model is applied to support nuclear emergency evacuation management at the Qinshan Nuclear Power Site, which is one of the largest nuclear plants in China. The results provide stable intervals for the objective function and decision variables with different fuzzy and probability confidence levels regarding the local residents' distribution. Nine scenarios are analyzed to reflect the impacts of the imprecision (fuzziness and randomness) associated with the size of the population in a plume emergency planning zone. The results are valuable for supporting local decision makers in generating effective emergency evacuation strategies.

Journal ArticleDOI
TL;DR: A hybrid algorithm is presented, which combines Integer Programming and Constraint Programming to efficiently solve the highly-constrained Nurse Rostering Problem, and exploits the strength of IP in obtaining lower-bounds and finding an optimal solution with the capability of CP in finding feasible solutions in a co-operative manner.


Journal ArticleDOI
TL;DR: This work investigates the automatic learning of constraints (formulas and relations) in raw tabular data in an unsupervised way and is able to accurately discover constraints in spreadsheets from various sources.
Abstract: Spreadsheets, comma separated value files and other tabular data representations are in wide use today. However, writing, maintaining and identifying good formulas for tabular data and spreadsheets can be time-consuming and error-prone. We investigate the automatic learning of constraints (formulas and relations) in raw tabular data in an unsupervised way. We represent common spreadsheet formulas and relations through predicates and expressions whose arguments must satisfy the inherent properties of the constraint. The challenge is to automatically infer the set of constraints present in the data, without labeled examples or user feedback. We propose a two-stage generate and test method where the first stage uses constraint solving techniques to efficiently reduce the number of candidates, based on the predicate signatures. Our approach takes inspiration from inductive logic programming, constraint learning and constraint satisfaction. We show that we are able to accurately discover constraints in spreadsheets from various sources.
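The generate-and-test idea can be sketched in plain Python as follows; the two constraint templates and the toy table are invented for illustration and stand in for the richer predicate signatures and solving techniques described above.

```python
# Generate candidate constraints over table columns and keep those satisfied by
# every row.
from itertools import permutations

rows = [  # toy spreadsheet: columns A, B, C, D
    {"A": 2, "B": 3, "C": 5, "D": 6},
    {"A": 1, "B": 4, "C": 5, "D": 4},
    {"A": 7, "B": 1, "C": 8, "D": 7},
]
cols = list(rows[0])


def holds(constraint):
    return all(constraint(r) for r in rows)


found = []
# template 1: one column is the sum of two others
for x, y, z in permutations(cols, 3):
    if y < z and holds(lambda r, x=x, y=y, z=z: r[x] == r[y] + r[z]):
        found.append(f"{x} = {y} + {z}")
# template 2: one column is the product of two others
for x, y, z in permutations(cols, 3):
    if y < z and holds(lambda r, x=x, y=y, z=z: r[x] == r[y] * r[z]):
        found.append(f"{x} = {y} * {z}")

print(found)   # ['C = A + B', 'D = A * B'] for the toy table above
```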

Book ChapterDOI
28 Aug 2017
TL;DR: This work introduces the CoverSize constraint for itemset mining problems, a global constraint for counting and constraining the number of transactions covered by the itemset decision variables, and exposes the size of the cover as a variable, which opens up new modelling perspectives compared to an existing global constraint.
Abstract: Constraint Programming is becoming competitive for solving certain data-mining problems, largely due to the development of global constraints. We introduce the CoverSize constraint for itemset mining problems, a global constraint for counting and constraining the number of transactions covered by the itemset decision variables. We show the relation of this constraint to the well-known table constraint, and our filtering algorithm internally uses the reversible sparse bitset data structure recently proposed for filtering the table constraint. Furthermore, we expose the size of the cover as a variable, which opens up new modelling perspectives compared to an existing global constraint for (closed) frequent itemset mining. For example, one can constrain minimum frequency or compare the frequency of an itemset in different datasets, as is done in discriminative itemset mining. We demonstrate experimentally on the frequent, closed and discriminative itemset mining problems that the CoverSize constraint with reversible sparse bitsets allows us to outperform other CP approaches.
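The bitset view of cover counting behind this constraint can be illustrated with a few lines of plain Python (this shows the idea only, not the reversible, propagator-grade data structure).

```python
# Each item's cover is a bitmask over transactions; the cover of an itemset is the
# AND of the masks, and its frequency is the popcount. Toy database for illustration.
transactions = [
    {"a", "b", "c"},
    {"a", "c"},
    {"b", "c", "d"},
    {"a", "b", "c", "d"},
]

items = sorted(set().union(*transactions))
# cover[i]: bit t is set iff transaction t contains item i
cover = {i: sum(1 << t for t, T in enumerate(transactions) if i in T) for i in items}


def cover_size(itemset):
    """Number of transactions containing every item of the itemset."""
    mask = (1 << len(transactions)) - 1          # start from the full transaction set
    for i in itemset:
        mask &= cover[i]                         # intersect covers item by item
    return bin(mask).count("1")


print(cover_size({"a", "c"}))   # 3: transactions 0, 1 and 3
print(cover_size({"b", "d"}))   # 2: transactions 2 and 3
```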

Proceedings ArticleDOI
01 Aug 2017
TL;DR: This work introduces a new branch and bound algorithm for the maximum common subgraph and maximum common connected subgraph problems which is based around vertex labelling and partitioning which dramatically reduce the memory and computation requirements during search, and allow better dual viewpoint ordering heuristics to be calculated cheaply.
Abstract: We introduce a new branch and bound algorithm for the maximum common subgraph and maximum common connected subgraph problems which is based around vertex labelling and partitioning. Our method in some ways resembles a traditional constraint programming approach, but uses a novel compact domain store and supporting inference algorithms which dramatically reduce the memory and computation requirements during search, and allow better dual viewpoint ordering heuristics to be calculated cheaply. Experiments show a speedup of more than an order of magnitude over the state of the art, and demonstrate that we can operate on much larger graphs without running out of memory.

Proceedings ArticleDOI
01 Feb 2017
TL;DR: This survey aims to show which product requirements are currently supported by the studied methods, how scalability and performance are considered in existing approaches, and to point out some challenges to be addressed in future research.
Abstract: Product lines have been employed as a mass customisation method that reduces production costs and time-to-market. Multiple product variants are represented in a product line; however, the selection of a particular configuration depends on stakeholders' functional and non-functional requirements. Methods like constraint programming and evolutionary algorithms have been used to support the configuration process. They consider a set of product requirements like resource constraints, stakeholders' preferences, and optimization objectives. Nevertheless, scalability and performance concerns start to be an issue when facing large-scale product lines and runtime environments. Thus, this paper presents a survey that analyses the strengths and drawbacks of 21 approaches that support product line configuration. This survey aims to: i) show which product requirements are currently supported by the studied methods; ii) examine how scalability and performance are considered in existing approaches; and iii) point out some challenges to be addressed in future research.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated three different technologies for solving a planning and scheduling problem of deploying multiple robots in a retirement home environment to assist elderly residents, and concluded that a constraint-based scheduling approach, specifically a decomposition using constraint programming, provides the most promising results for their application.
Abstract: This paper investigates three different technologies for solving a planning and scheduling problem of deploying multiple robots in a retirement home environment to assist elderly residents. The models proposed make use of standard techniques and solvers developed in AI planning and scheduling, with two primary motivations. First, to find a planning and scheduling solution that we can deploy in our real-world application. Second, to evaluate planning and scheduling technology in terms of the "model-and-solve" functionality that forms a major research goal in both domain-independent planning and constraint programming. Seven variations of our application are studied using the following three technologies: PDDL-based planning, timeline planning and scheduling, and constraint-based scheduling. The variations address specific aspects of the problem that we believe can impact the performance of the technologies while also representing reasonable abstractions of the real-world application. We evaluate the capabilities of each technology and conclude that a constraint-based scheduling approach, specifically a decomposition using constraint programming, provides the most promising results for our application. PDDL-based planning is able to find mostly low-quality solutions, while the timeline approach was unable to model the full problem without alterations to the solver code, thus moving away from the model-and-solve paradigm. It would be misleading to conclude that constraint programming is "better" than PDDL-based planning in a general sense, both because we have examined a single application and because the approaches make different assumptions about the knowledge one is allowed to embed in a model. Nonetheless, we believe our investigation is valuable for AI planning and scheduling researchers as it highlights these different modelling assumptions and provides insight into avenues for the application of AI planning and scheduling for similar robotics problems. In particular, as constraint programming has not been widely applied to robot planning and scheduling in the literature, our results suggest significant untapped potential in doing so.

Proceedings ArticleDOI
17 Oct 2017
TL;DR: Two new workload prediction models, based on constraint programming and neural networks, are proposed for dynamic resource provisioning in Cloud environments; the two approaches are shown to be complementary, as neural networks give better prediction results while constraint programming is more suitable for trace generation.
Abstract: Cloud computing allows for elasticity, as users can dynamically benefit from new virtual resources when their workload increases. Such a feature requires highly reactive resource provisioning mechanisms. In this paper, we propose two new workload prediction models, based on constraint programming and neural networks, that can be used for dynamic resource provisioning in Cloud environments. We also present two workload trace generators that can help to extend an experimental dataset in order to test resource optimization heuristics more widely. Our models are validated using real traces from a small Cloud provider. The two approaches are shown to be complementary, as neural networks give better prediction results, while constraint programming is more suitable for trace generation.

Book ChapterDOI
17 Jul 2017
TL;DR: In the early stages of model-driven development, models are frequently incomplete and partial; while well-formedness constraints can be efficiently checked for (fully specified) concrete models, checking the same constraints over partial models is more challenging.
Abstract: In the early stages of model driven development, models are frequently incomplete and partial. Partial models represent multiple possible concrete models, and thus, they are able to capture uncertainty and possible design decisions. When using models of a complex modeling language, several well-formedness constraints need to be continuously checked to highlight conceptual design flaws for the engineers in an early phase. While well-formedness constraints can be efficiently checked for (fully specified) concrete models, checking the same constraints over partial models is more challenging since, for instance, a currently valid constraint may be violated (or an invalid constraint may be respected) when refining a partial model into a concrete model.

Journal ArticleDOI
TL;DR: A parallel batch-scheduling problem involving different job release times, non-identical job sizes, and incompatible job families is addressed; mixed integer programming and constraint programming models are proposed, tested, and compared with a variable neighborhood search heuristic.
Abstract: We study a parallel batch-scheduling problem that involves different job release times, non-identical job sizes, and incompatible job families. Mixed integer programming and constraint programming (CP) models are proposed and tested on a set of common problem instances from a paper in the literature. Then, we compare the performance of the models with that of a variable neighborhood search (VNS) heuristic from the same paper. Computational results show that CP outperforms VNS with respect to solution quality and run time by 3.4%–6.8% and 47%–91%, respectively. When compared to optimal solutions, the results demonstrate that CP is capable of generating a near-optimal solution in a short amount of time.

Journal ArticleDOI
TL;DR: The theory of spatial constraint systems is developed, with operators to specify information and processes moving from one space to another, and it is shown that spatial constraint systems can also express the epistemic notion of knowledge by means of a derived spatial operator that specifies global information.

Journal ArticleDOI
TL;DR: Key in the approach is the concept of an extension window defined by gap/span; it is demonstrated that the proposed approach outperforms both specialized and CP-based approaches in almost all cases and that the advantage increases as the minimum frequency threshold decreases.
Abstract: Constraint Programming (CP) has proven to be an effective platform for constraint-based sequence mining. Previous work has focused on standard frequent sequence mining, as well as frequent sequence mining with a maximum 'gap' between two matching events in a sequence. The main challenge in the latter is that this constraint cannot be imposed independently of the omnipresent frequency constraint. Indeed, the gap constraint changes whether a subsequence is included in a sequence, and hence its frequency. In this work, we go beyond that and investigate the integration of timed events and constraining the minimum/maximum gap as well as the minimum/maximum span. The latter constrains the allowed time between the first and last matching event of a pattern. We show how the three are interrelated, and what the required changes to the frequency constraint are. Key in our approach is the concept of an extension window defined by gap/span, and we develop techniques to avoid scanning the sequences needlessly, as well as using a backtracking-aware data structure. Experiments demonstrate that the proposed approach outperforms both specialized and CP-based approaches in almost all cases, and that the advantage increases as the minimum frequency threshold decreases. This paper is an extension of the original manuscript presented at CPAIOR'17 [5].
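What the gap and span constraints require of an embedding can be made concrete with a small plain-Python occurrence check (illustrative only; the paper's contribution is propagating these conditions inside a CP solver, not this naive search).

```python
# A pattern is embedded in a timed sequence only if consecutive matched events are at
# most `max_gap` apart and the whole embedding fits within `max_span`.
def occurs(pattern, sequence, max_gap, max_span):
    """sequence: list of (event, time) pairs in chronological order."""

    def extend(p_idx, s_idx, first_time, last_time):
        if p_idx == len(pattern):                       # every symbol matched
            return True
        for k in range(s_idx, len(sequence)):
            event, time = sequence[k]
            if event != pattern[p_idx]:
                continue
            if last_time is not None and time - last_time > max_gap:
                break                                   # later events only get farther away
            start = first_time if first_time is not None else time
            if time - start > max_span:
                break                                   # the span can only grow
            if extend(p_idx + 1, k + 1, start, time):
                return True
        return False

    return extend(0, 0, None, None)


seq = [("a", 1), ("b", 3), ("a", 4), ("c", 9), ("b", 10), ("c", 11)]
print(occurs(("a", "b", "c"), seq, max_gap=6, max_span=8))   # True: e.g. a@1, b@3, c@9
print(occurs(("a", "b", "c"), seq, max_gap=6, max_span=5))   # False: no embedding fits the span
```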

Journal ArticleDOI
TL;DR: This paper analyzes new exact approaches for the multi-mode resource-constrained project scheduling problem (MRCPSP) with the aim of makespan minimization, and is the first to close (find the optimal solution and prove its optimality for) 628 open instances with 50 and 100 jobs from the literature.

Journal ArticleDOI
TL;DR: This work improves and generalizes a recently introduced constraint-based method for learning undirected graphical models and shows that the method is capable of efficiently handling a more general class of models, called stratified/labeled graphical models, which have an astronomically larger model space.
Abstract: Statistical model learning problems are traditionally solved using either heuristic greedy optimization or stochastic simulation, such as Markov chain Monte Carlo or simulated annealing. Recently, there has been an increasing interest in the use of combinatorial search methods, including those based on computational logic. Some of these methods are particularly attractive since they can also be successful in proving the global optimality of solutions, in contrast to stochastic algorithms that only guarantee optimality at the limit. Here we improve and generalize a recently introduced constraint-based method for learning undirected graphical models. The new method combines perfect elimination orderings with various strategies for solution pruning and offers a dramatic improvement both in terms of time and memory complexity. We also show that the method is capable of efficiently handling a more general class of models, called stratified/labeled graphical models, which have an astronomically larger model space.
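The notion of a perfect elimination ordering used by the method can be illustrated with a short plain-Python check (a textbook definition check, not the paper's learning algorithm; the example graph is invented).

```python
# An ordering of the vertices is a perfect elimination ordering iff, for every vertex,
# its neighbours that appear later in the ordering form a clique; a graph admits one
# exactly when it is chordal.
def is_perfect_elimination_ordering(order, edges):
    adj = {v: set() for v in order}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    position = {v: i for i, v in enumerate(order)}
    for v in order:
        later = [u for u in adj[v] if position[u] > position[v]]
        # all later neighbours of v must be pairwise adjacent
        if any(b not in adj[a] for i, a in enumerate(later) for b in later[i + 1:]):
            return False
    return True


edges = [(1, 2), (1, 3), (2, 3), (2, 4), (3, 4)]              # chordal: 4-cycle with a chord
print(is_perfect_elimination_ordering([1, 2, 3, 4], edges))   # True
print(is_perfect_elimination_ordering([2, 1, 3, 4], edges))   # False: 1 and 4 follow 2 but are not adjacent
```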