
Proceedings ArticleDOI

Adaptive Constructive Interval Disjunction

04 Nov 2013 - pp. 900-906

TL;DR: On a representative sample of instances, ACID appears to be the best approach in solving and optimization, and has been added to the default strategies of the Ibex interval solver.

Abstract: An operator called CID and an efficient variant 3BCID were proposed in 2007. For numerical CSPs handled by interval methods, these operators compute a partial consistency equivalent to Partition-1-AC for discrete CSPs. The two main parameters of CID are the number of times the main CID procedure is called and the maximum number of sub-intervals treated by the procedure. The 3BCID operator is state-of-the-art in numerical CSP solving, but not in constrained global optimization. This paper proposes an adaptive variant of 3BCID. The number of variables handled is auto-adapted during the search, the other parameters are fixed and robust to modifications. On a representative sample of instances, ACID appears to be the best approach in solving and optimization, and has been added to the default strategies of the Ibex interval solver.

Summary (3 min read)

Introduction

  • This paper proposes an adaptive variant of 3BCID.
  • The number of variables handled is auto-adapted during the search, the other parameters are fixed and robust to modifications.
  • On a representative sample of instances, ACID appears to be the best approach in solving and optimization, and has been added to the default strategies of the Ibex interval solver.

I. CONSTRUCTIVE INTERVAL DISJUNCTION (CID)

  • A filtering/contracting operator for numerical CSPs called Constructive Interval Disjunction (in short CID) has been proposed in [13].
  • Applied first to continuous constraint satisfaction problems handled by interval methods, it has been more recently applied to constrained global optimization problems.
  • This algorithm is state-of-the-art in constraint satisfaction, but is generally dominated by constraint propagation algorithms like HC4 in optimization.
  • The main practical contribution is that an adaptive version of CID becomes efficient for both real-valued satisfaction and optimization problems, while needing no additional parameter value from the user.

B. Numerical CSP

  • The constraints defined in an NCSP are numerical.
  • They are equations and inequalities using mathematical operators like +, ×, /, exp, log, sin.
  • NCSPs are generally solved by a Branch & Contract interval strategy: Branch (a variable xi is chosen and its interval [xi] is split into two sub-intervals, thus making the whole process combinatorial) and Contract (a filtering process contracts the intervals without loss of solutions).
  • The 2B-Revise procedure works with all the projection functions of a given constraint.

C. 3B algorithm

  • Stronger interval partial consistencies have also been proposed.

D. CID

  • Constructive Interval Disjunction (CID) is a partial consistency stronger than 3B-consistency [13].
  • CID-consistency is similar to Partition-1-AC (P-1-AC) in finite domain CSPs [4].
  • The main procedure varCID handles a single variable xi.
  • The subboxes are contracted by ctc and hulled, giving [Xcid].
  • The procedure var3BCID has been deeply studied and experimented in the past.

II. ADAPTIVE CID: LEARNING THE NUMBER OF HANDLED VARIABLES

  • Like for SAC or 3B, a quasi fixed-point in terms of contraction can be reached by 3BCID (or CID) by calling var3BCID inside two nested loops.
  • An outer loop calls the inner loop until no interval is contracted more than a predefined precision (thus reaching a quasi-fixed point).
  • The authors will write in the remaining part of the paper that a variable is varcided when the procedure var3BCID is called on that variable to contract the current box.
  • Fixing numVarCID to the number n of variables (a version called 3BCID-n) gives good results in satisfaction but is dominated by pure constraint propagation in optimization.
  • All the policies measure the decrease in search space size after each call to var3BCID.

B. ACID1: interleaving learning and exploitation phases

  • A more sophisticated approach, ACID1, interleaves learning and exploitation phases and avoids this drawback (ACID0 prevents numVarCID from reaching 0, i.e. from calling only constraint propagation).
  • During a learning phase, the lastSignificantGain function returns the number kvarCID of varcided variables after which the gain in current box size from one var3BCID call to the next, computed by the gainRatio formula, never exceeds a small given ratio called ctratio.
  • During the exploitation phase following the previous learning phase, the average of the different kvarCID values (obtained in the nodes of the learning phase) provides the new value of numVarCID.
  • Numerous variants of this schema were tested.
  • The authors fixed experimentally the 3 parameters of the ACID1 procedure learnLength, cycleLength and ctratio, respectively to 50, 1000 and 0.002.

C. ACID2: taking into account the level in the search tree

  • A criticism against ACID1 is that the authors average kvarCID values obtained at different levels of the search tree.
  • This drawback is partially corrected by the successive learning phases of ACID1, where each learning phase corresponds to a part of the search tree.
  • Each learning phase tunes at most 10 numVarCID values, one per order of magnitude of the width of the studied box.
  • This approach, called ACID2, gave in general results similar to those of ACID1 and appeared to be less robust.
  • Indeed, only a few nodes sometimes fall at certain width levels, which renders the statistics not significant.

III. EXPERIMENTS

  • All the algorithms were implemented in the C++ interval library Ibex (Interval Based EXplorer) [6].
  • All the experiments were run on the same computer (Intel X86 3GHz).
  • The authors tested the algorithms on square NCSP solving and constrained global optimization.
  • NCSP solving consists in finding all the solutions of a square system of n nonlinear equations with n real-valued variables with bounded domains.
  • Global optimization consists in finding the global minimum of a function over n variables subject to constraints (equations and inequalities), the objective function and/or the constraints being non-convex.

A. Experiments in constraint satisfaction

  • The authors selected from the COPRIN benchmark all the systems that were solved by one of the tested algorithms in a time between 2 s and 3600 s.
  • The authors compared their ACID method and its variants with the well known filtering techniques: a simple constraint propagation HC4, 3BCID-n (see Section II) and 3BCID-fp (fixed-point) in which a new iteration on all the variables is run when a variable domain width is reduced by more than 1%.
  • In particular, setting s3b to 10 gives better results than smaller values (s3b = 5) or greater values.
  • ACID1 obtains better gains w.r.t 3BCID-n in total time than on average because the best gains were obtained on difficult instances with more variables.
  • In the right part of the table, the authors report the solving time ratios obtained when X-Newton is removed (¬ XN) from the contractor sequence (4 problems could not be solved in 10,000s).

B. Experiments in constrained global optimization

  • The authors used the IbexOpt strategy of Ibex that performs a Best First Branch & Bound.
  • The precision required on the objective is 10⁻⁸.
  • In fact, the more recent Mohc constraint propagation algorithm [1] is better than HC4.
  • It is significant because the CP contraction is only a part of the IbexOpt algorithm [12] (linear relaxation and the search of feasible points are other important parts, not studied in this paper and set to their default algorithms in IbexOpt).
  • ACID2 obtains results slightly worse than ACID1, rendering this refinement not promising in practice.

IV. CONCLUSION

  • The authors have presented in this paper an adaptive version of the 3BCID contraction operator used by interval methods and close to partition-1-AC.
  • The best variant of this Adaptive CID operator (ACID1 in the paper) interleaves learning phases and exploitation phases to auto-adapt the number of variables handled.
  • These variables are selected by an efficient branching heuristic and all the other parameters are fixed and robust to modifications.
  • Overall, ACID1 adds no parameter to the solving or optimization strategies.
  • It offers the best results on average and is the best or close to the best on every tested instance, even in presence of the best Ibex devices (Interval-Newton, X-Newton).


HAL Id: hal-00936654
https://hal-enpc.archives-ouvertes.fr/hal-00936654
Submitted on 27 Jan 2014
HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.
Adaptive Constructive Interval Disjunction
Bertrand Neveu, Gilles Trombettoni
To cite this version:
Bertrand Neveu, Gilles Trombettoni. Adaptive Constructive Interval Disjunction. ICTAI: International Conference on Tools with Artificial Intelligence, Nov 2013, Washington, DC, United States. pp. 900-906, DOI 10.1109/ICTAI.2013.138. hal-00936654

Adaptive Constructive Interval Disjunction
Bertrand Neveu
LIGM, Université Paris Est, Marne-la-Vallée, France
Email: Bertrand.Neveu@enpc.fr

Gilles Trombettoni
LIRMM, Université Montpellier II, Montpellier, France
Email: Gilles.Trombettoni@lirmm.fr
Abstract—An operator called CID and an efficient variant 3BCID were proposed in 2007. For numerical CSPs handled by interval methods, these operators compute a partial consistency equivalent to Partition-1-AC for discrete CSPs. The two main parameters of CID are the number of times the main CID procedure is called and the maximum number of sub-intervals treated by the procedure. The 3BCID operator is state-of-the-art in numerical CSP solving, but not in constrained global optimization.

This paper proposes an adaptive variant of 3BCID. The number of variables handled is auto-adapted during the search, the other parameters are fixed and robust to modifications. On a representative sample of instances, ACID appears to be the best approach in solving and optimization, and has been added to the default strategies of the Ibex interval solver.
I. CONSTRUCTIVE INTERVAL DISJUNCTION (CID)
A filtering/contracting operator for numerical CSPs called Constructive Interval Disjunction (in short CID) has been proposed in [13]. Applied first to continuous constraint satisfaction problems handled by interval methods, it has been more recently applied to constrained global optimization problems. This algorithm is state-of-the-art in constraint satisfaction, but is generally dominated by constraint propagation algorithms like HC4 in optimization. The main practical contribution is that an adaptive version of CID becomes efficient for both real-valued satisfaction and optimization problems, while needing no additional parameter value from the user.
A. Shaving
The shaving principle is used to compute the Singleton Arc Consistency (SAC) of finite domain CSPs [7] and the 3B-consistency of numerical CSPs [9]. It is also at the core of the SATZ algorithm [11] used to prove the satisfiability of Boolean formulas. Shaving works as follows. A value is temporarily assigned to a variable (the other values are temporarily discarded) and a partial consistency is computed on the remaining subproblem. If an inconsistency is obtained, then the value can be safely removed from the domain of the variable. Otherwise, the value is kept in the domain.

Contrary to arc consistency, this consistency is not incremental [7]. Indeed, the work of the underlying refutation procedure on the whole subproblem is the reason why a single value can be removed. Thus, obtaining singleton arc consistency on finite-domain CSPs requires an expensive fixed-point algorithm where all the variables must be handled again every time a single value is removed [7]. The remark still holds for the improved version SAC-Opt [5]. A similar idea can be followed on numerical CSPs (NCSPs).
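
A minimal sketch of this singleton test on a finite-domain variable is shown below. The data types and the propagation procedure (propagateAC) are hypothetical stand-ins, not taken from the paper or from any particular solver; the point is only to illustrate why a successful propagation teaches nothing and why shaving is therefore not incremental.

// Shaving / singleton test on a finite-domain variable (illustrative sketch).
// The propagator is passed as a parameter: it stands for an arc-consistency
// filter that returns false when some domain is wiped out.
#include <functional>
#include <set>
#include <vector>

using Domain     = std::set<int>;
using Domains    = std::vector<Domain>;
using Propagator = std::function<bool(Domains&)>;

// Temporarily assign value v to variable i; if propagation refutes the
// resulting subproblem, v can be safely removed from the domain of i.
void singletonTest(Domains& doms, int i, int v, const Propagator& propagateAC) {
    Domains trial = doms;
    trial[i] = {v};                // the other values are temporarily discarded
    if (!propagateAC(trial))
        doms[i].erase(v);          // v belongs to no solution
    // otherwise v is kept and nothing else is learned (hence the costly
    // fixed-point loop needed to enforce SAC)
}
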
B. Numerical CSP
An NCSP is defined by a tuple P = (X, [X], C), where X denotes an n-set of numerical, real-valued variables ranging in a domain [X]. We denote by $[x_i] = [\underline{x_i}, \overline{x_i}]$ the interval/domain of variable xi ∈ X, where the bounds $\underline{x_i}$ and $\overline{x_i}$ are floating-point numbers (allowing interval algorithms to be implemented on computers). A solution of P is an n-vector in [X] satisfying all the constraints in C. The constraints defined in an NCSP are numerical. They are equations and inequalities using mathematical operators like +, ×, /, exp, log, sin.

A Cartesian product of intervals like the domain [X] = [x1] × ... × [xn] is called a (parallel-to-axes) box. w(xi) denotes the width $\overline{x_i} - \underline{x_i}$ of an interval [xi]. The width of a box is given by the width $\overline{x_m} - \underline{x_m}$ of its largest dimension xm. The union of several boxes is generally not a box, and a Hull operator has been defined instead to define the smallest box enclosing all of them.

NCSPs are generally solved by a Branch & Contract interval strategy:
• Branch: a variable xi is chosen and its interval [xi] is split into two sub-intervals, thus making the whole process combinatorial.
• Contract: a filtering process allows contracting the intervals (i.e., improving interval bounds) without loss of solutions.

The process starts with the initial domain [X] and stops when the leaves/boxes of the search tree reach a width inferior to a precision given as input. These leaves yield an approximation of all the solutions of the NCSP.
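
The following sketch illustrates such a Branch & Contract loop. It is a simplified illustration, not the Ibex search strategy: the interval type, the contraction step and the branching heuristic are stand-ins passed in by the caller.

// Schematic Branch & Contract search (illustrative; not the actual Ibex strategy).
#include <algorithm>
#include <functional>
#include <stack>
#include <utility>
#include <vector>

struct Interval { double lb, ub; double width() const { return ub - lb; } };
using Box = std::vector<Interval>;

using Contractor = std::function<bool(Box&)>;      // shrinks the box; false = proved empty
using Chooser    = std::function<int(const Box&)>; // branching heuristic (e.g. Smear-based)

double maxWidth(const Box& b) {
    double m = 0.0;
    for (const Interval& x : b) m = std::max(m, x.width());
    return m;
}

void branchAndContract(Box initial, double precision,
                       const Contractor& contract, const Chooser& chooseVar,
                       std::vector<Box>& leaves) {
    std::stack<Box> open;
    open.push(std::move(initial));
    while (!open.empty()) {
        Box box = open.top(); open.pop();
        if (!contract(box)) continue;          // Contract: the box contains no solution
        if (maxWidth(box) <= precision) {      // precision reached: output a leaf
            leaves.push_back(box);
            continue;
        }
        int i = chooseVar(box);                // Branch: split [x_i] into two halves
        double mid = 0.5 * (box[i].lb + box[i].ub);
        Box left = box, right = box;
        left[i].ub = mid;
        right[i].lb = mid;
        open.push(std::move(left));
        open.push(std::move(right));
    }
}
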
Several contraction algorithms have been proposed. Let us mention the constraint propagation algorithm called HC4 [3], [10], an efficient implementation of 2B [9], that can enforce the optimal local consistency (called hull-consistency) only if strong hypotheses are met (in particular, each variable must occur at most once in a same constraint). The 2B-Revise procedure works with all the projection functions of a given constraint. Informally, a projection function isolates a given variable occurrence within the constraint. For instance, consider the constraint x + y = z·x; x ← z·x - y is a projection function (among others) that aims at reducing the domain of variable x. Evaluating the projection function with interval arithmetic on the domain [x] × [y] × [z] (i.e., replacing the variable occurrences of the projection function by their domains and using the interval counterpart of the involved mathematical operators) provides an interval that is intersected with [x]. Hence a potential domain reduction. A constraint propagation loop close to that of AC3 is used to propagate reductions obtained for a given variable domain to the other constraints in the system.
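
As an illustration, the sketch below evaluates this projection with naive interval arithmetic and intersects the result with [x]. The interval operations are simplified (the outward rounding required by rigorous implementations is ignored), and none of the names below belong to the Ibex API.

// One projection step of a 2B/HC4-like revision for x <- z*x - y
// (from the constraint x + y = z*x). Simplified, non-rigorous interval arithmetic.
#include <algorithm>

struct Interval { double lb, ub; };

Interval mul(Interval a, Interval b) {
    double p[4] = {a.lb * b.lb, a.lb * b.ub, a.ub * b.lb, a.ub * b.ub};
    return { *std::min_element(p, p + 4), *std::max_element(p, p + 4) };
}
Interval sub(Interval a, Interval b) { return { a.lb - b.ub, a.ub - b.lb }; }
Interval inter(Interval a, Interval b) {
    return { std::max(a.lb, b.lb), std::min(a.ub, b.ub) };  // may be empty (lb > ub)
}

// Evaluate z*x - y on the current domains and intersect with [x]:
// the result is the (possibly) reduced domain of x.
Interval projectX(Interval x, Interval y, Interval z) {
    Interval image = sub(mul(z, x), y);
    return inter(x, image);
}

If the intersection is empty, the constraint, and hence the current box, has no solution.
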
C. 3B algorithm
Stronger interval partial consistencies have also been proposed. 3B-consistency [9] is a theoretical partial consistency similar to SAC for CSP although limited to the bounds of the domains. Consider the 2n subproblems of the studied NCSP where each interval [xi] (i ∈ {1..n}) is reduced to its lower bound $\underline{x_i}$ (resp. upper bound $\overline{x_i}$). 3B-consistency is enforced iff each of these 2n subproblems is hull-consistent.

In practice, the 3B(w) algorithm splits the intervals in several sub-intervals, also called slices, of width w, which gives the accuracy: the 3B(w)-consistency is enforced iff the slices at the bounds of the handled box cannot be eliminated by HC4. Let us denote var3B the procedure of the 3B algorithm that shaves one variable interval [xi] and s3b its parameter, a positive integer specifying a number of sub-intervals: w = w(xi)/s3b is the width of a sub-interval.
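
A sketch of such a shaving procedure on one variable is given below. The refutation test stands for a call to a contractor like HC4 that proves a slice empty, and the loop structure is only one straightforward reading of the description above, not the paper's implementation.

// Sketch of var3B: try to discard slices of width w(x_i)/s3b at both bounds of [x_i].
#include <functional>
#include <vector>

struct Interval { double lb, ub; double width() const { return ub - lb; } };
using Box = std::vector<Interval>;
using Refuter = std::function<bool(const Box&)>;  // true if the sub-box is proved empty (e.g. by HC4)

void var3B(Box& box, int i, int s3b, const Refuter& refuted) {
    const double w = box[i].width() / s3b;
    // Shave from the lower bound while the leftmost slice can be eliminated.
    while (box[i].width() > w) {
        Box slice = box;
        slice[i].ub = slice[i].lb + w;
        if (!refuted(slice)) break;
        box[i].lb += w;
    }
    // Shave symmetrically from the upper bound.
    while (box[i].width() > w) {
        Box slice = box;
        slice[i].lb = slice[i].ub - w;
        if (!refuted(slice)) break;
        box[i].ub -= w;
    }
}
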
D. CID
Constructive Interval Disjunction (CID) is a partial consistency stronger than 3B-consistency [13]. CID-consistency is similar to Partition-1-AC (P-1-AC) in finite domain CSPs [4]. P-1-AC is strictly stronger than SAC [4].

The main procedure varCID handles a single variable xi. The main parameters of varCID are xi, a number scid of sub-intervals (accuracy) and a contraction algorithm ctc like HC4. [xi] is split into scid slices of equal width, each corresponding subproblem is contracted by the contractor ctc and the hull of the different contracted subproblems is finally returned, as shown in Algorithm 1.

Intuitively, CID generalizes 3B because a sub-box that is eliminated by var3B can also be discarded by varCID. In addition, contrary to var3B, varCID can also contract [X] along several dimensions.

Note that in the actual implementation the for loop can be interrupted earlier, when [X]' becomes equal to the initial box [X] in all the dimensions except xi.

var3BCID is a hybrid and operational variant of varCID.
Procedure VarCID (xi, scid, (X, C, in-out [X]), ctc)
    [X]' ← empty box
    for j ← 1 to scid do
        /* The j-th sub-box of [X] on xi is handled: */
        sliceBox ← SubBox (j, xi, [X])
        /* Enforce a partial consistency on the sub-box: */
        sliceBox' ← ctc (X, C, sliceBox)
        /* "Union" with the previous sub-boxes: */
        [X]' ← Hull ([X]', sliceBox')
    [X] ← [X]'

Algorithm 1: The main VarCID procedure of the CID operator shaving a given variable xi.
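
For concreteness, here is a C++-style transcription of VarCID with simplified data types. The sub-contractor ctc is a stand-in (e.g. an HC4-like filter), and nothing below is the actual Ibex implementation.

// C++-style sketch of VarCID (illustrative only).
#include <algorithm>
#include <functional>
#include <optional>
#include <vector>

struct Interval { double lb, ub; double width() const { return ub - lb; } };
using Box = std::vector<Interval>;
// Sub-contractor: returns the contracted sub-box, or nothing if it is proved empty.
using Ctc = std::function<std::optional<Box>(const Box&)>;

// j-th of scid equal-width slices of [x_i] inside box b (j in [0, scid)).
Box subBox(const Box& b, int i, int j, int scid) {
    Box s = b;
    double w = b[i].width() / scid;
    s[i].lb = b[i].lb + j * w;
    s[i].ub = (j == scid - 1) ? b[i].ub : s[i].lb + w;
    return s;
}

Box hull(const Box& a, const Box& b) {
    Box h = a;
    for (std::size_t k = 0; k < h.size(); ++k) {
        h[k].lb = std::min(h[k].lb, b[k].lb);
        h[k].ub = std::max(h[k].ub, b[k].ub);
    }
    return h;
}

// Returns the hull of the contracted slices; an empty optional means the
// whole box has been refuted.
std::optional<Box> varCID(const Box& box, int i, int scid, const Ctc& ctc) {
    std::optional<Box> result;                       // the "empty box" of Algorithm 1
    for (int j = 0; j < scid; ++j) {
        std::optional<Box> contracted = ctc(subBox(box, i, j, scid));
        if (!contracted) continue;                   // this slice contains no solution
        result = result ? hull(*result, *contracted) : *contracted;
    }
    return result;
}

The early interruption mentioned above would simply add a test breaking out of the loop as soon as the hull covers [X] on every dimension except xi.
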
The var3BCID operator proceeds in three steps:
1) Like var3B, it first tries to eliminate sub-intervals at the bounds of [xi] of width w = w(xi)/s3b each. We store the left box [Xl] and the right box [Xr] that are not excluded by the contractor ctc (if any).
2) Second, the remaining box [X]' is handled by varCID that splits [X]' into scid sub-boxes. The sub-boxes are contracted by ctc and hulled, giving [Xcid].
3) Finally, we return the hull of [Xl], [Xr] and [Xcid].
The var3BCID process is illustrated in Figure 1.
[Figure 1 (panels: 3B, CID). Task of the var3BCID procedure. The parameter s3b is set to 10 and scid is set to 1.]
var3BCID comes from the wish of managing different widths (accuracies) for s3b and scid. Indeed, the best choice for s3b generally belongs to {5..20} while scid should always be set to 1 or 2 (implying a final hull of 3 or 4 sub-boxes). The reason is that the actual time cost of the shaving part is smaller than that of the constructive domain disjunction. Indeed, if no sub-interval is discarded by var3B, only two calls to ctc are performed, one for each bound of the handled interval; if varCID is applied, the subcontractor is often called scid times.

The procedure var3BCID has been deeply studied and experimented in the past. The number of calls to var3BCID and the order in which they are made is a harder question, studied in this paper.
II. ADAPTIVE CID: LEARNING THE NUMBER OF HANDLED VARIABLES
Like for SAC or 3B, a quasi fixed-point in terms of contraction can be reached by 3BCID (or CID) by calling var3BCID inside two nested loops. An inner loop calls var3BCID on each variable xi. An outer loop calls the
inner loop until no interval is contracted more than a predefined (width) precision (thus reaching a quasi-fixed point). Let us call 3BCID-fp (fixed-point) this historical version.

Two reasons led us to radically change this policy. First, as said above, var3BCID can contract the handled box in several dimensions. One significant advantage is that the fixed-point in terms of contraction can thus be reached in a small number of calls to var3BCID. On most of the instances in satisfaction or optimization, it appears that a quasi fixed-point is reached in less than n calls. In this case, 3BCID is clearly too expensive. Second, the varCID principle is close to a branching point in a search tree. The difference is that a hull is achieved at the end of the sub-box contractions. Therefore an idea is to use a standard branching heuristic to select the next variable to be "varcided". We will write in the remaining part of the paper that a variable is varcided when the procedure var3BCID is called on that variable to contract the current box.

To sum up, the idea for rendering 3BCID even more efficient in practice is to replace the two nested loops by a single loop calling var3BCID numVarCID times, and to use an efficient variant of the Smear function branching heuristic for selecting the variables to be varcided (called SmearSumRel in [12]). Informally, the Smear function favors variables having a large domain and a high impact on the constraints, measured by interval partial derivatives.

A first idea is to fix numVarCID to the number n of variables. We call this version 3BCID-n. This gives good results in satisfaction but is dominated by pure constraint propagation in optimization. As said above, it is too time-costly when the right numVarCID is smaller than n (which is often the case in optimization), but it can also have a very bad impact on performance if a bigger effort would have brought a significantly greater filtering.

The goal of Adaptive CID (ACID) is precisely to compute dynamically during search the value of the numVarCID parameter. Several auto-adaptation policies have been tested and we report three interesting versions. All the policies measure the decrease in search space size after each call to var3BCID. They measure a contraction ratio of a box $[X]_b$ over another box $[X]_a$ as an average relative gain in all the dimensions:

$$\mathrm{gainRatio}([X]_b, [X]_a) = \frac{1}{n} \sum_{i=1}^{n} \left(1 - \frac{w(x_i^b)}{w(x_i^a)}\right)$$

where $x_i^b$ (resp. $x_i^a$) denotes the i-th interval of $[X]_b$ (resp. $[X]_a$).
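
A direct transcription of this formula, with a simplified interval type rather than the Ibex data structures, could look as follows.

// Average relative width reduction of box b (after contraction) w.r.t. box a.
#include <vector>

struct Interval { double lb, ub; double width() const { return ub - lb; } };
using Box = std::vector<Interval>;

double gainRatio(const Box& b, const Box& a) {
    double sum = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i)
        sum += 1.0 - b[i].width() / a[i].width();
    return sum / a.size();   // 0 = no contraction, 1 = box reduced to a point
}
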
A. ACID0: auto-adapting numVarCID during search
The first version, ACID0, adapts the number of shaved variables dynamically at each node of the search tree. First, the variables are sorted by their impact, computed by the same formula as the SmearSumRel function (used for branching). Variables are then varcided until the cumulative contraction ratio during the last nv calls to var3BCID becomes less than ctratio. This algorithm has thus 2 parameters, nv and ctratio, and it was difficult to tune them. We experimentally found that ctratio could be fixed to 0.001 and that nv should depend on the number of variables n of the problem. Setting nv to 1 is often a bad choice, and fixing it with the formula nv = max(3, n/4) experimentally gave the best results. The experimental results are not bad, but this policy prevents numVarCID from reaching 0, i.e. from calling only constraint propagation. This is a significant drawback when a simple constraint propagation is the most efficient approach.
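
The sketch below illustrates one reasonable reading of this stopping rule (summing the gain ratios of the last nv calls). The helper functions are hypothetical stand-ins for the corresponding Ibex components, and details such as the exact stagnation test are assumptions, not the paper's implementation.

// Illustrative sketch of the ACID0 policy (not the actual implementation).
#include <algorithm>
#include <vector>

struct Interval { double lb, ub; double width() const { return ub - lb; } };
using Box = std::vector<Interval>;

void var3BCID(int var, Box& box);                      // shaving/CID step on one variable
double gainRatio(const Box& contracted, const Box& before);
std::vector<int> sortByImpact(const Box& box);         // SmearSumRel-like ordering

void acid0(Box& box, double ctratio = 0.001) {
    const int n  = static_cast<int>(box.size());
    const int nv = std::max(3, n / 4);                 // window size found experimentally
    std::vector<int> order = sortByImpact(box);
    std::vector<double> gains;
    for (int k = 0; ; ++k) {
        Box before = box;
        var3BCID(order[k % n], box);
        gains.push_back(gainRatio(box, before));
        if (static_cast<int>(gains.size()) >= nv) {
            double recent = 0.0;                       // cumulative gain over the last nv calls
            for (int j = static_cast<int>(gains.size()) - nv; j < static_cast<int>(gains.size()); ++j)
                recent += gains[j];
            if (recent < ctratio) break;               // stagnation: stop varciding
        }
    }
}
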
B. ACID1: interleaving learning and exploitation phases
A more sophisticated approach avoids this drawback. ACID1 interleaves learning and exploitation phases for auto-adapting the numVarCID value. Depending on the node number, the algorithm is in a learning or in an exploitation phase.

The behavior of ACID1, shown in Algorithm 2, is the following:
• The variables are first sorted according to their impact measurement (using the SmearSumRel heuristic).
• During a learning phase (lasting learnLength nodes), we then analyze how the contraction ratio evolves from a var3BCID call to the next one, and store the number kvarCID of varcided variables necessary to obtain most of the possible filtering.
  More precisely, 2·numVarCID variables are varcided at each node (with a minimum value equal to 2, in case numVarCID = 0). In the first learning phase, we handle n variables. At the current node, the lastSignificantGain function returns the number kvarCID of varcided variables giving the last significant improvement: after the kvarCID-th call to var3BCID, the gain in current box size from a var3BCID call to the next one, computed by the gainRatio formula, never exceeded a small given ratio, called ctratio. This analysis starts from the last varcided variable. (For the readability of the pseudo-code, we omit the parameters of the var3BCID procedure, i.e. s3b, scid, the constraints C and the contractor ctc.)
• During the exploitation phase following the previous learning phase, the average of the different kvarCID values (obtained in the nodes of the learning phase) provides the new value of numVarCID. This value will be used by 3BCID during the exploitation phase. Compared to the previous value (previous call to an exploitation phase), note that this new value can at most double, but can also drastically decrease.
• Every cycleLength nodes in the search tree, both phases are called again.

Procedure ACID1 (X, n, in-out [X], in-out call, in-out numVarCID)
    learnLength ← 50
    cycleLength ← 1000
    ctratio ← 0.002
    /* Sort the variables according to their impact */
    X ← smearSumRelSort (X)
    if call % cycleLength ≤ learnLength then
        /* Learning phase */
        nvarCID ← max(2, 2·numVarCID)
        for i from 1 to nvarCID do
            [X]old ← [X]
            var3BCID (X[i % n], [X], ...)
            ctcGains[i] ← gainRatio ([X], [X]old)
        kvarCID[call] ← lastSignificantGain (ctcGains, ctratio, nvarCID)
        if call % cycleLength = learnLength then
            /* End of learning phase */
            numVarCID ← average (kvarCID[])
    else
        /* Exploitation phase */
        if numVarCID > 0 then
            for i from 1 to numVarCID do
                var3BCID (X[i % n], [X], ...)
    call ← call + 1

Algorithm 2: Algorithm ACID1

Function lastSignificantGain (ctcGains, ctratio, nvarCID)
    for i from nvarCID downto 1 do
        if ctcGains[i] > ctratio then
            return i
    return 0

Numerous variants of this schema were tested. In particular, it is counterproductive to learn numVarCID only once or, on the contrary, to memorize the computations from one learning phase to the next.

We fixed experimentally the 3 parameters of the ACID1 procedure, learnLength, cycleLength and ctratio, respectively to 50, 1000 and 0.002. ACID1 then becomes a parameter-free procedure. With these parameter values, the overhead of the learning phases (where we double the previous numVarCID value) remains small.
C. ACID2: taking into account the level in the search tree
A criticism against ACID1 is that we average kvarCID values obtained at different levels of the search tree. This drawback is partially corrected by the successive learning phases of ACID1, where each learning phase corresponds to a part of the search tree.

In order to go further in that direction, we designed a refinement of ACID1 for which each learning phase tunes at most 10 different values depending on the width of the studied box. A value corresponds to one order of magnitude in the box width. For example, we store a numVarCID value for the boxes with a width comprised between 1 and 0.1, another one for the boxes with a width comprised between 0.1 and 0.01, etc. However, this approach, called ACID2, gave in general results similar to those of ACID1 and appeared to be less robust. Indeed, only a few nodes sometimes fall at certain width levels, which renders the statistics not significant.
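
A possible way to index these buckets is sketched below; the cap at 10 levels follows the description above, but the exact mapping and boundary handling used in ACID2 are assumptions.

// Map a box width to one of (at most) 10 levels: widths in [1, 0.1) -> level 0,
// widths in [0.1, 0.01) -> level 1, and so on (illustrative sketch).
#include <algorithm>
#include <cmath>

int widthLevel(double boxWidth) {
    if (boxWidth >= 1.0) return 0;                 // very large boxes share the first level
    int level = static_cast<int>(std::floor(-std::log10(boxWidth)));
    return std::clamp(level, 0, 9);                // at most 10 distinct numVarCID values
}
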
III. EXPERIMENTS
All the algorithms were implemented in the C++ interval library Ibex (Interval Based EXplorer) [6]. All the experiments were run on the same computer (Intel X86, 3 GHz). We tested the algorithms on square NCSP solving and constrained global optimization. NCSP solving consists in finding all the solutions of a square system of n nonlinear equations with n real-valued variables with bounded domains. Global optimization consists in finding the global minimum of a function over n variables subject to constraints (equations and inequalities), the objective function and/or the constraints being non-convex.
A. Experiments in constraint satisfaction
We selected from the COPRIN benchmark¹ all the systems that were solved by one of the tested algorithms in a time between 2 s and 3600 s. The timeout was fixed to 10,000 s. The required precision on the solution is 10⁻⁸. Some of these problems are scalable. In this case, we selected the problem with the greatest number of variables that was solved by one of the algorithms in less than one hour.

We compared our ACID method and its variants with the well-known filtering techniques: a simple constraint propagation HC4, 3BCID-n (see Section II) and 3BCID-fp (fixed-point) in which a new iteration on all the variables is run when a variable domain width is reduced by more than 1%. At each node of the search tree, we used the following sequence of contractors: HC4, shaving, Interval-Newton [8], and X-Newton [2]. shaving denotes a variant of ACID, 3BCID-n, 3BCID-fp, or nothing when only HC4 is tested.

For each problem, we used the best bisection heuristics available (among two variants of the Smear function [12]). The main parameter ctratio of ACID1 and ACID2, measuring a stagnation in the filtering while variables are varcided, was fixed to 0.002. The var3BCID parameters s3b and scid were fixed to the default settings, respectively 10 and 1, proposed in [13]. Experiments on the selected instances confirm that these settings are relevant and robust to variations. In particular, setting s3b to 10 gives better results than smaller values (s3b = 5) and greater values. (For 21 of the 26 instances, s3b = 20 gives worse results.)
¹ www-sop.inria.fr/coprin/logiciels/ALIAS/Benches/benches.html

Citations

Proceedings Article
27 Jul 2014
TL;DR: This paper proposes adaptive variants of partition-one-AC, a singleton-based consistency which is able to prune values on all variables when it performs singleton tests on one of them, and shows that adaptive Partition-one-AC can obtain significant speed-ups over arc consistency and over the full version of partition-one-AC.
Abstract: Singleton-based consistencies have been shown to dramatically improve the performance of constraint solvers on some difficult instances. However, they are in general too expensive to be applied exhaustively during the whole search. In this paper, we focus on partition-one-AC, a singleton-based consistency which, as opposed to singleton arc consistency, is able to prune values on all variables when it performs singleton tests on one of them. We propose adaptive variants of partition-one-AC that do not necessarily run until having proved the fixpoint. The pruning can be weaker than the full version but the computational effort can be significantly reduced. Our experiments show that adaptive Partition-one-AC can obtain significant speed-ups over arc consistency and over the full version of partition-one-AC.

18 citations


Cites background from "Adaptive Constructive Interval Disj..."

  • ...|) for approximating the search space, as done in (Neveu and Trombettoni 2013)....


  • ...In (Trombettoni and Chabert 2007; Neveu and Trombettoni 2013) a consistency called Constructive Interval Disjunction (CID), close to POAC in its principle, gave good results by simply calling the main procedure once on each variable or by adapting during search the number of times it is called....



Journal ArticleDOI
TL;DR: The basic principles behind interval Branch and Bound algorithms are explained and issues that should be considered to improve the efficiency of the algorithms are described.
Abstract: Interval Branch and Bound algorithms are used to solve rigorously continuous constraint satisfaction and constrained global optimization problems. In this paper, we explain the basic principles behind interval Branch and Bound algorithms. We detail the main components and describe issues that should be considered to improve the efficiency of the algorithms.

15 citations


Cites background from "Adaptive Constructive Interval Disj..."

  • ...Another adaptive interval-based contractor is proposed in [95]....



Journal ArticleDOI
TL;DR: Constraint propagation has been widely used in nonlinear single-objective optimization inside interval Branch & Bound algorithms as an efficient way to discard infeasible and non-optimal regions of the search space.
Abstract: Constraint propagation has been widely used in nonlinear single-objective optimization inside interval Branch & Bound algorithms as an efficient way to discard infeasible and non-optimal regions of the search space. On the other hand, when considering two objective functions, constraint propagation is uncommon. It has mostly been applied in combinatorial problems inside particular methods. The difficulty is in the exploitation of dominance relations in order to discard the so-called non-Pareto optimal solutions inside a decision domain, which complicates the design of complete and efficient constraint propagation methods exploiting dominance relations. In this paper, we present an interval Branch & Bound algorithm which integrates dominance contractors , constraint propagation mechanisms that exploit an upper bound set using dominance relations. This method discards from the decision space values yielding solutions dominated by some solutions from the upper bound set. The effectiveness of the approach is shown on a sample of benchmark problems.

9 citations


Journal ArticleDOI
TL;DR: An appropriate analysis routine which is able to determine the worst-case bounds of the coupler curve resulting from uncertainties is proposed and routines are introduced for evaluating the classification and assembly of mechanisms with uncertainties.
Abstract: Uncertainties are inherent in the fabrication and operation of mechanisms. In terms of the four-bar linkage, uncertainties in the geometric parameters result in non-exact solutions for the coupler point. A design description which accounts for bounded uncertainties is termed an appropriate design. Appropriate analysis routines are analysis routines for guaranteeing that specific requirements will be satisfied, which is applicable to mechanisms described by an appropriate design. These routines rely on interval analysis to provide reliable results and are able to handle the uncertainties present in a mechanism. An appropriate analysis routine which is able to determine the worst-case bounds of the coupler curve resulting from uncertainties is proposed. As well, routines are introduced for evaluating the classification and assembly of mechanisms with uncertainties. Examples are presented to demonstrate the coupler curves of four-bar linkages described by an appropriate design which corresponds to each linkage classification, including folding linkages.

9 citations


Journal ArticleDOI
TL;DR: This work proposes a new variable selection strategy which first weights the constraints by using the optimal Lagrangian multipliers of a linearization of the original problem.
Abstract: Smear-based variable selection strategies are well-known and commonly used by branch-and-prune interval-based solvers. They estimate the impact of the variables on each constraint of the system by using the partial derivatives and the sizes of the variable domains. Then they aggregate these values, in some way, to estimate the impact of each variable on the whole system. The variable with the greatest impact is then selected. A problem of these strategies is that they, generally, consider all constraints equally important. In this work, we propose a new variable selection strategy which first weights the constraints by using the optimal Lagrangian multipliers of a linearization of the original problem. Then, the impact of the variables is computed with a typical smear-based function but taking into account the weights of the constraints. The strategy isg tested on a set of well-known benchmark instances outperforming significantly the classical variable selection strategies.

8 citations


References

Journal ArticleDOI
TL;DR: This Second Edition of Global Optimization Using Interval Analysis expands and improves various aspects of its forerunner and features significant new discussions, such as those on the use of consistency methods to enhance algorithm performance.
Abstract: Employing a closed set-theoretic foundation for interval computations, Global Optimization Using Interval Analysis simplifies algorithm construction and increases generality of interval arithmetic. This Second Edition contains an up-to-date discussion of interval methods for solving systems of nonlinear equations and global optimization problems. It expands and improves various aspects of its forerunner and features significant new discussions, such as those on the use of consistency methods to enhance algorithm performance. Provided algorithms are guaranteed to find and bound all solutions to these problems despite bounded errors in data, in approximations, and from use of rounded arithmetic.

1,690 citations


"Adaptive Constructive Interval Disj..." refers methods in this paper

  • ...The experimental protocol is the same as the NCSP experimental protocol, except that we do not use Interval-Newton that is only implemented for square systems....


  • ...At each node of the search tree, we used the following sequence of contractors : HC4, shaving, Interval-Newton [8], and X-Newton [2]....


  • ...It offers the best results on average and is the best or close to the best on every tested instance, even in presence of the best Ibex devices (Interval-Newton, X-Newton)....


  • ...At each node of the search tree, we used the following sequence of contractors : HC4, shaving, Interval-Newton [8], and X-Newton [2]. shaving denotes a variant of ACID, 3BCID-n, 3BCID-fp or nothing when only HC4 is tested....



Book
21 Jul 1992

1,326 citations


Proceedings Article
28 Aug 1993
TL;DR: The semantics that have been elaborated, plus the complexity analysis and good experimental results, confirm that the consistency techniques developed for CSPs can be used in real applications.
Abstract: Many problems can be expressed in terms of a numeric constraint satisfaction problem over finite or continuous domains (numeric CSP). The purpose of this paper is to show that the consistency techniques that have been developed for CSPs can be adapted to numeric CSPs. Since the numeric domains are ordered the underlying idea is to handle domains only by their bounds. The semantics that have been elaborated, plus the complexity analysis and good experimental results, confirm that these techniques can be used in real applications.

497 citations


"Adaptive Constructive Interval Disj..." refers background or methods in this paper

  • ...NCSPs are generally solved by a Branch & Contract interval strategy: • Branch: a variable xi is chosen and its interval [xi] is split into two sub-intervals, thus making the whole process combinatorial....


  • ...A similar idea can be followed on numerical CSPs (NCSPs)....


  • ...3B-consistency [9] is a theoretical partial consistency similar to SAC for CSP although limited to the bounds of the domains....


  • ...Let us mention the constraint propagation algorithm called HC4 [3], [10], an efficient implementation of 2B [9], that can enforce the optimal local consistency (called hull-consistency) only if strong hypotheses are met (in particular, each variable...


  • ...CID-consistency is similar to Partition-1-AC (P-1-AC) in finite domain CSPs [4]....



Proceedings Article
23 Aug 1997
TL;DR: A new simple unit propagation based heuristic is put forward that compares favorably with the heuristics employed in the current state-of-the-art DPL implementations (C-SAT, Tableau, POSIT).
Abstract: The paper studies new unit propagation based heuristics for Davis-Putnam-Loveland (DPL) procedure. These are the novel combinations of unit propagation and the usual "Maximum Occurrences in clauses of Minimum Size" heuristics. Based on the experimental evaluations of different alternatives a new simple unit propagation based heuristic is put forward. This compares favorably with the heuristics employed in the current state-of-the-art DPL implementations (C-SAT, Tableau, POSIT).

404 citations


"Adaptive Constructive Interval Disj..." refers methods in this paper

  • ...of the SATZ algorithm [11] used to prove the satisfiability of Boolean formula....



Proceedings Article
29 Nov 1999
TL;DR: HC4, an algorithm to enforce hull consistency without decomposing complex constraints into primitives is presented, and BC4, a new algorithm to efficiently enforce box consistency is described, which is shown to significantly outperform both HC3 and BC3.
Abstract: Most interval-based solvers in the constraint logic programming framework are based on either hull consistency or box consistency (or a variation of these ones) to narrow domains of variables involved in continuous constraint systems. This paper first presents HC4, an algorithm to enforce hull consistency without decomposing complex constraints into primitives. Next, an extended definition for box consistency is given and the resulting consistency is shown to subsume hull consistency. Finally, BC4, a new algorithm to efficiently enforce box consistency is described, that replaces BC3—the “original” solely Newton-based algorithm to achieve box consistency—by an algorithm based on HC4 and BC3 taking care of the number of occurrences of each variable in a constraint. BC4 is then shown to significantly outperform both HC3 (the original algorithm enforcing hull consistency by decomposing constraints) and BC3.

403 citations


"Adaptive Constructive Interval Disj..." refers methods in this paper

  • ...Let us mention the constraint propagation algorithm called HC4 [3], [10], an efficient implementation of 2B [9], that can enforce the optimal local consistency (called hull-consistency) only if strong hypotheses are met (in particular, each variable must occur at most once in a same constraint)....


  • ...Therefore, we report in the penultimate column a comparison between ACID1 and HC4....


  • ...The main parameters of varCID are xi, a number scid of sub-intervals (accuracy) and a contraction algorithm ctc like HC4....


  • ...At each node of the search tree, we used the following sequence of contractors : HC4, shaving, Interval-Newton [8], and X-Newton [2]. shaving denotes a variant of ACID, 3BCID-n, 3BCID-fp or nothing when only HC4 is tested....


  • ...Table IV shows that we obtained an average gain of 10% over HC4....
