
A study of multi-parent crossover operators in a memetic algorithm

Book chapter, 11 Sep 2010, pp. 556-565
TL;DR: This work evaluates the performance of four multi-parent crossover operators (called MSX, Diagonal, U-Scan and OB-Scan) and provides evidence and insights as to why one particular multi-parent crossover operator leads to better computational results than another.
Abstract: Using the unconstrained binary quadratic programming problem as a case study, we investigate the role of multi-parent crossover operators within the memetic algorithm framework. We evaluate the performance of four multi-parent crossover operators (called MSX, Diagonal, U-Scan and OB-Scan) and provide evidence and insights as to why one particular multi-parent crossover operator leads to better computational results than another. For this purpose, we employ several indicators like population entropy and average solution distance in the population.

Summary (3 min read)

1 Introduction

  • Memetic algorithms (MA) are known to be one of the highly effective metaheuristic approaches for solving a large number of constraint satisfaction and optimization problems [1].
  • The literature reports a number of evolutionary and memetic algorithms with two-parent crossover operators for solving the UBQP problem ([3,4,5,6]).
  • The authors are particularly interested in investigating the role of multi-parent crossover operators as well as a number of related important questions, e.g., why does one particular multi-parent crossover operator lead to better computational results than another?

2.1 Main Scheme and Initial Population

  • This study is based on the general memetic framework described in Algorithm 1 that alternates between a combination operator and a local improvement procedure.
  • The combination operator (Section 2.4) is used to generate new offspring solutions while the local improvement procedure based on tabu search (Section 2.2) aims at optimizing each offspring solution.
  • As soon as an offspring solution is improved by tabu search, the population is accordingly updated based on two criteria: the solution quality and the diversity of the population.

2.2 Tabu Search Procedure

  • The authors employ a simple tabu search algorithm as their local search procedure.
  • The authors' tabu search procedure uses a neighborhood defined by the simple one-flip move, which consists of changing the value of a single variable xi to its complementary value 1 − xi.
  • The implementation of this neighborhood uses a fast incremental evaluation technique [9] to calculate the cost (move value) of transitioning to each neighboring solution.
  • Tabu search incorporates a tabu list as a “recency-based” memory structure to assure that solutions visited within a certain span of iterations, called the tabu tenure, will not be revisited [10].
  • The authors' tabu search method stops when a given number α of moves, called the depth of tabu search, has been performed.

2.3 Pool Updating

  • In their memetic algorithm, after an offspring x0 is obtained by the crossover operator and improved by tabu search, the authors decide whether the improved offspring should be inserted into the population, replacing the existing worst solution.
  • For this purpose, the authors define a quality-and-distance goodness score of the offspring x0 with respect to the population.
  • Therefore, if the goodness score of the offspring solution is good enough, it will have high probability to replace the worst solution in the population.
  • Interested readers are referred to [7] for more details.

2.4 Combination Operators

  • The authors use four multi-parent crossover (or combination) operators to generate offspring solutions: a "logic" multi-parent combination operator (MSX), a diagonal multi-parent crossover (Diagonal), a multi-parent uniform scanning crossover (U-Scan), and a multi-parent occurrence-based scanning crossover (OB-Scan).
  • Thus, in MSX the Avg variables with the largest Strength values are assigned the value 1 and all other variables are assigned 0.
  • The formal definitions are given in the full text (Section 2.4).
  • Thus, in U-Scan each parent has the same probability of determining the value inherited by the offspring.

3.1 Instances and Experimental Protocol

  • To evaluate the MSX, Diagonal, U-Scan and OB-Scan crossover operators, the authors carry out experiments on a set of 15 large random instances with 3000 to 5000 variables from the literature [13].
  • The authors' algorithm is programmed in C and compiled using GNU GCC on a PC running Windows XP with a 2.66 GHz Pentium CPU and 512 MB RAM.
  • Given the stochastic nature of the algorithm, each problem instance is independently solved 10 times.
  • The stopping condition for a single run is set to 5, 10 and 20 minutes for instances with 3000, 4000 and 5000 variables, respectively.
  • Note that when performing experiments on each crossover, the only difference is the crossover operator; all other components of the algorithm are kept unchanged.

3.2 Computational Results

  • Tables 1 and 2 report the best objective values (with the number of hits over 10 runs in parentheses) and the average objective values obtained with the four crossover operators, respectively.
  • The authors observe that MSX and U-Scan perform slightly better in terms of the best objective values, since each of these two crossovers obtains the best value for 4 out of the 15 instances.
  • OB-Scan seems to be the worst in terms of the best objective value and the success rate.
  • Moreover, U-Scan and Diagonal perform quite well on 3 and 2 instances, respectively.
  • In addition, the results also disclose that the performance of the various crossover operators depends on the instances to be solved.

4 Analysis

  • The above computational results show that for certain instances, some crossover operators perform better than others in terms of solution quality.
  • For this purpose, the authors introduce the following evaluation criteria to characterize the search capacity of different crossover operators: population entropy, average solution distance and average solution quality in the population.
  • The authors also perform an experiment to show how different crossover operators and local search jointly influence the performance of the memetic algorithms.
  • As an example, the experiments are presented on the large random instance p5000.5.
  • From Tables 1 and 2, one observes that for this instance MSX performs the best, while U-Scan and OB-Scan are much worse than MSX and even Diagonal.

4.1 Evolution of Solution Quality

  • The authors first study one of the most important characteristics of the four crossover operators, namely the solution gap to the best known value as it evolves over the generations, denoted by gb.
  • gb is defined as the average gap between the best solution in the current population and the best known objective value over 10 independent runs.
  • One observes that in the first generations, the four crossover operators show no clear difference in terms of this criterion.
  • As the search progresses, MSX performs much better than the other three.
  • This observation coincides very well with the results reported in Tables 1 and 2, showing the advantage of the MSX and Diagonal crossover operators, as well as the weakness of U-Scan and OB-Scan, for this problem instance.

4.2 Population Entropy and Distance

  • In their second experiment, the authors observe two characteristics of the four multi-parent crossover operators related to population diversity: the population entropy vs. the number of generations, and the average solution distance in the population vs. the number of generations (a minimal sketch of how such indicators can be computed is given after this list).
  • The authors see that the population diversity measured in terms of these two characteristics is better preserved during the evolution process for MSX and Diagonal than for U-Scan and OB-Scan, especially after the first 60 generations.
  • Following the spirit of scatter search and path-relinking, an efficient solution combination operator is one that ensures not only high quality solutions but also a good diversity of the population.
  • In other words, the diversification of the population induced by MSX and Diagonal allows the algorithm to benefit from a better exploration of the search space and prevents the population from stagnating in poor local optima.
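
The paper does not reproduce its exact diversity formulas in this summary, but the two indicators can be computed along the following lines. This is a minimal C sketch assuming the per-variable frequency form of entropy and the normalized average pairwise Hamming distance; the authors' precise definitions may differ.

```c
#include <math.h>

/* Population entropy: one common definition for binary populations, based on
 * per-variable value frequencies. Returns a value in [0, 1]; 0 means all
 * solutions in the population are identical. */
double population_entropy(int **P, int p, int n) {
    double e = 0.0;
    int i, j;
    for (j = 0; j < n; j++) {
        int ones = 0;
        for (i = 0; i < p; i++) ones += P[i][j];
        double f1 = (double)ones / p, f0 = 1.0 - f1;
        if (f1 > 0.0) e -= f1 * log(f1);
        if (f0 > 0.0) e -= f0 * log(f0);
    }
    return e / (n * log(2.0));
}

/* Average pairwise Hamming distance in the population, normalized by n. */
double average_distance(int **P, int p, int n) {
    long total = 0;
    int i, k, j;
    for (i = 0; i < p; i++)
        for (k = i + 1; k < p; k++)
            for (j = 0; j < n; j++) total += (P[i][j] != P[k][j]);
    return (2.0 * total) / ((double)p * (p - 1) * n);
}
```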

4.3 Tradeoff between Intensification and Diversification

  • The authors turn their attention to study another important aspect of the memetic algorithms, i.e., the tradeoff between local search and crossover operators.
  • Under a limited computational resource, the depth of tabu search reflects the relative proportion of combination operators and tabu search in the algorithm.
  • For each tested value of α, the authors perform 10 independent runs, each run being given 20 minutes of CPU time.
  • Figure 3 shows the average evolution of the best solution gaps during the search obtained with these three α values and four crossover operators.
  • When it comes to the MSX crossover operator, one observes that MSX is not really sensitive to the various α values, showing that MSX plays a real driving role in the search process.

5 Conclusions

  • Understanding and explaining the performance of crossover operators within a memetic algorithm is an important topic.
  • The authors presented an attempt to analyze the intrinsic characteristics of four crossover operators for the UBQP problem.
  • To this end, the authors employed several evaluation indicators to characterize the search capability of a crossover operator.
  • The experimental analysis allowed the authors to understand, to some extent, the relative advantages and weaknesses of the four studied crossover operators within the memetic framework.


A Study of Multi-parent Crossover Operators in
a Memetic Algorithm
Yang Wang, Zhipeng Lü, and Jin-Kao Hao
LERIA, Université d'Angers, 2 Boulevard Lavoisier, 49045 Angers Cedex 01, France
{yangw,lu,hao}@info.univ-angers.fr
Abstract. Using the unconstrained binary quadratic programming problem as a case study, we investigate the role of multi-parent crossover operators within the memetic algorithm framework. We evaluate the performance of four multi-parent crossover operators (called MSX, Diagonal, U-Scan and OB-Scan) and provide evidence and insights as to why one particular multi-parent crossover operator leads to better computational results than another one. For this purpose, we employ several indicators like population entropy and average solution distance in the population.
Keywords: multi-parent crossover, memetic algorithm, unconstrained quadratic programming, performance analysis.
1 Introduction
Memetic algorithms (MA) are known to be among the most effective metaheuristic approaches for solving a large number of constraint satisfaction and optimization problems [1]. One of the most important features of a MA is the crossover operator for generating offspring solutions. In general, meaningful crossover operators help to create healthy diversification in the population and to avoid a premature convergence of the population.

In this paper, we provide a case study of multi-parent crossover operators within memetic algorithms for the unconstrained binary quadratic programming (UBQP) problem, which can be written as

    UBQP: Maximize f(x) = x'Qx, x binary

where Q is an n by n matrix of constants and x is an n-vector of binary variables.

The UBQP formulation is notable for its ability to represent a wide range of important problems [2]. The literature reports a number of evolutionary and memetic algorithms with two-parent crossover operators for solving the UBQP problem [3,4,5,6]. However, one finds no studies concerning multi-parent crossover operators for UBQP, where multi-parent crossover operators generate offspring solutions by combining more than two parent solutions. In this work, we are particularly interested in investigating the role of multi-parent crossover operators as well as a number of related important questions: why does one particular multi-parent crossover operator lead to better computational results than another one? What are the main characteristics of a good multi-parent crossover operator? To what extent can the crossover operators influence the performance of the memetic algorithms?

Without claiming to answer all these questions, we present an experimental analysis of various multi-parent crossover operators within a memetic algorithm. For this purpose, we use four multi-parent crossover operators, respectively called MSX, Diagonal, U-Scan and OB-Scan. The last three are well known in the literature, while the first one was recently proposed in [7]. The analysis shows that the computational results are strongly correlated with characteristics of the corresponding crossover operators, such as the entropy and average solution distance of the population and the average solution quality in the population. Furthermore, the analysis sheds light on how a tradeoff between local search and crossover operator can be achieved when using different crossover operators.
2 Multi-parent Crossover within Memetic Algorithms

2.1 Main Scheme and Initial Population

This study is based on the general memetic framework described in Algorithm 1, which alternates between a combination operator and a local improvement procedure. The combination operator (Section 2.4) is used to generate new offspring solutions, while the local improvement procedure based on tabu search (Section 2.2) aims at optimizing each offspring solution. As soon as an offspring solution is improved by tabu search, the population is accordingly updated based on two criteria: the solution quality and the diversity of the population. The individuals of the initial population are generated randomly (i.e., each variable x_i of the n-vector x receives a value of 0 or 1 with equal probability).
Algorithm 1. Pseudo-code of the memetic algorithm
1: Input: matrix Q
2: Output: the best solution x* found so far
3: P = {x^1, ..., x^p} ← Population Initialization()
4: for i = 1, ..., p do
5:     x^i ← Tabu Search(x^i)
6: end for
7: x* = arg max{ f(x^i) | i = 1, ..., p }
8: repeat
9:     randomly choose a subset of individuals E (|E| ∈ [4, 8]) from P
10:    x^0 ← Crossover Operator(E)
11:    x^0 ← Tabu Search(x^0)
12:    if f(x^0) > f(x*) then
13:        x* = x^0
14:    end if
15:    P ← Pool Updating(x^0, P)
16: until a stop criterion is met

2.2 Tabu Search Procedure

In this paper, we employ a simple tabu search algorithm as our local search procedure. Our tabu search procedure uses a neighborhood defined by the simple one-flip move, which consists of changing (flipping) the value of a single variable x_i to its complementary value 1 − x_i. The implementation of this neighborhood uses a fast incremental evaluation technique [9] to calculate the cost (move value) of transitioning to each neighboring solution.

Tabu search incorporates a tabu list as a "recency-based" memory structure to assure that solutions visited within a certain span of iterations, called the tabu tenure, will not be revisited [10]. In our implementation, we elected to set the tabu tenure as TabuTenure(i) = tt + rand(10), where tt is a given constant (n/100) and rand(10) takes a random value from 1 to 10. Our tabu search method stops when a given number α of moves, called the depth of tabu search, has been performed.
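
As an illustration of the procedure just described, the following is a minimal C sketch of a one-flip tabu search with incremental move-value updates, assuming a symmetric matrix Q stored row-major in a flat array and solutions stored as 0/1 int arrays. The function name, data layout and aspiration rule are illustrative assumptions, not the authors' code.

```c
#include <stdlib.h>
#include <string.h>

/* One-flip tabu search maximizing f(x) = x'Qx for a symmetric Q (q[i*n+j]).
 * alpha = depth of tabu search, tt = base tabu tenure (e.g., n/100). */
void tabu_search(const double *q, int n, int *x, long alpha, int tt) {
    double *delta = malloc(n * sizeof *delta);   /* move values            */
    long   *tabu  = malloc(n * sizeof *tabu);    /* iteration until which   */
    int    *xbest = malloc(n * sizeof *xbest);   /* a variable stays tabu   */
    double  f = 0.0, fbest;
    long    iter;
    int     i, j, k;

    for (i = 0; i < n; i++)
        for (j = 0; j < n; j++) f += q[i * n + j] * x[i] * x[j];
    fbest = f;
    memcpy(xbest, x, n * sizeof *x);

    /* initial move values: delta[i] = (1 - 2x_i)(q_ii + 2 * sum_{j!=i} q_ij x_j) */
    for (i = 0; i < n; i++) {
        double s = q[i * n + i];
        for (j = 0; j < n; j++)
            if (j != i) s += 2.0 * q[i * n + j] * x[j];
        delta[i] = (1 - 2 * x[i]) * s;
        tabu[i] = 0;
    }

    for (iter = 1; iter <= alpha; iter++) {
        /* best non-tabu move; aspiration accepts a tabu move that improves fbest */
        k = -1;
        for (i = 0; i < n; i++) {
            int allowed = tabu[i] < iter || f + delta[i] > fbest;
            if (allowed && (k < 0 || delta[i] > delta[k])) k = i;
        }
        if (k < 0) continue;

        /* flip x_k and update all move values incrementally */
        f += delta[k];
        for (i = 0; i < n; i++)
            if (i != k)
                delta[i] += 2.0 * q[i * n + k] * (1 - 2 * x[i]) * (1 - 2 * x[k]);
        delta[k] = -delta[k];
        x[k] = 1 - x[k];
        tabu[k] = iter + tt + rand() % 10 + 1;   /* tenure = tt + rand(10) */

        if (f > fbest) { fbest = f; memcpy(xbest, x, n * sizeof *x); }
    }
    memcpy(x, xbest, n * sizeof *x);
    free(delta); free(tabu); free(xbest);
}
```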
2.3 Pool Updating

In our memetic algorithm, after an offspring x^0 is obtained by the crossover operator and improved by tabu search, we decide whether the improved offspring should be inserted into the population, replacing the existing worst solution. For this purpose, we define a quality-and-distance goodness score of the offspring x^0 with respect to the population. The main idea is to favor the inclusion of x^0 in the population if x^0 is "good enough" (in terms of its objective function evaluation) and is not too similar to any solution currently in the population.

Our aim is not only to maintain a pool of good quality solutions but also to emphasize the importance of the diversity of the solutions to avoid a premature convergence of the population. Therefore, if the goodness score of the offspring solution is good enough, it has a high probability of replacing the worst solution in the population. Interested readers are referred to [7] for more details.
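
The exact goodness score is defined in [7] and is not reproduced in this paper. The sketch below only illustrates the general quality-and-distance idea, assuming, purely for illustration, a weighted sum of the normalized objective value and the normalized minimum Hamming distance to the population; the authors' actual scoring function may differ.

```c
#include <limits.h>

static int hamming(const int *a, const int *b, int n) {
    int d = 0, j;
    for (j = 0; j < n; j++) d += (a[j] != b[j]);
    return d;
}

/* Minimum Hamming distance from x to the solutions in `set`, skipping index
 * `skip` (pass -1 to skip nothing). */
static int min_dist(const int *x, int **set, int count, int n, int skip) {
    int i, d, best = INT_MAX;
    for (i = 0; i < count; i++) {
        if (i == skip) continue;
        d = hamming(x, set[i], n);
        if (d < best) best = d;
    }
    return best;
}

/* Assumed goodness = beta * normalized quality + (1 - beta) * normalized distance. */
static double goodness(double f, double fmin, double fmax,
                       int dmin, int n, double beta) {
    double q = (fmax > fmin) ? (f - fmin) / (fmax - fmin) : 1.0;
    return beta * q + (1.0 - beta) * (double)dmin / n;
}

/* Returns the index of the population member that x0 should replace, or -1 if
 * x0 is rejected (e.g., it duplicates an existing solution). */
int pool_update(int **P, double *fvals, int p, int n,
                const int *x0, double f0, double beta) {
    double fmin = f0, fmax = f0, gworst = 1e300, g0;
    int i, worst = -1;

    for (i = 0; i < p; i++) {
        if (fvals[i] < fmin) fmin = fvals[i];
        if (fvals[i] > fmax) fmax = fvals[i];
    }
    if (min_dist(x0, P, p, n, -1) == 0) return -1;   /* clone: reject */

    for (i = 0; i < p; i++) {
        double g = goodness(fvals[i], fmin, fmax,
                            min_dist(P[i], P, p, n, i), n, beta);
        if (g < gworst) { gworst = g; worst = i; }
    }
    g0 = goodness(f0, fmin, fmax, min_dist(x0, P, p, n, -1), n, beta);
    return (g0 > gworst) ? worst : -1;
}
```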
2.4 Combination Operators

In this paper, we use four multi-parent crossover (or combination) operators to generate offspring solutions: a "logic" multi-parent combination operator (MSX), a diagonal multi-parent crossover (Diagonal), a multi-parent uniform scanning crossover (U-Scan), and a multi-parent occurrence-based scanning crossover (OB-Scan). Note that except for MSX, which was recently proposed for UBQP [7], the last three have been widely used for other combinatorial optimization problems in the literature [11,12].

All four combination operators used in our algorithm are applied to a set E of s (|E| = s) parent solutions randomly selected from the current population, i.e., E = {x^(1), ..., x^(s)}, where x^(i) = (x^(i)_1, ..., x^(i)_n). In our implementation, we set s to be a random number between 4 and 8.
MSX Crossover (MSX): we define a weight w(i) for the solution x^(i) and a strength value Strength(j) for variable x_j as

    w(i) = 1 / Σ_{j=1..n} x^(i)_j    and    Strength(j) = Σ_{i=1..s} w(i) · x^(i)_j.

The value Strength(j) gives a relative indication of the tendency of the solutions in E to favor x_j = 1 or x_j = 0. That is, we may say "the larger the value of Strength(j), the greater is the degree to which E favors x_j = 1". Then, we take the average of the sum(i) values over E, where sum(i) = Σ_{j=1..n} x^(i)_j is the number of variables set to 1 in x^(i), to get a value for the number of x_j components that should be 1 in an "average" solution, denoted by Avg = Σ_{i=1..s} sum(i) / s.

Thus, the Avg variables with the largest Strength values receive assignment 1 and the other variables receive assignment 0. In practice, it is preferable to shift Avg slightly in one direction or another to increase the diversity of the generated offspring solutions [7].
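
A minimal C sketch of the MSX logic described above (parent weights, per-variable strength values, and assignment of 1 to the Avg strongest variables); the data layout and function name are assumptions, and the optional shift of Avg mentioned in [7] is omitted.

```c
#include <stdlib.h>

typedef struct { double strength; int index; } VarScore;

static int by_strength_desc(const void *a, const void *b) {
    double d = ((const VarScore *)b)->strength - ((const VarScore *)a)->strength;
    return (d > 0) - (d < 0);
}

/* MSX: weight each parent by 1/(number of 1s), accumulate per-variable
 * strength, then set the Avg strongest variables to 1 in the child. */
void msx_crossover(int **parents, int s, int n, int *child) {
    VarScore *score = malloc(n * sizeof *score);
    double *w = malloc(s * sizeof *w);
    int i, j, ones, avg = 0;

    for (i = 0; i < s; i++) {
        ones = 0;
        for (j = 0; j < n; j++) ones += parents[i][j];
        w[i] = ones > 0 ? 1.0 / ones : 0.0;   /* w(i) = 1 / sum_j x_j^(i) */
        avg += ones;                          /* accumulate sum(i)        */
    }
    avg /= s;                                 /* Avg = (1/s) * sum_i sum(i) */

    for (j = 0; j < n; j++) {
        score[j].strength = 0.0;
        score[j].index = j;
        for (i = 0; i < s; i++) score[j].strength += w[i] * parents[i][j];
    }
    qsort(score, n, sizeof *score, by_strength_desc);

    for (j = 0; j < n; j++) child[j] = 0;
    for (j = 0; j < avg && j < n; j++) child[score[j].index] = 1;

    free(score); free(w);
}
```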
Diagonal Crossover (Diagonal): this is a generalization of the one-point crossover. For s parent solutions, diagonal crossover divides each parent into s sections through s − 1 crossover points. Each section has the same length, except the last one, which contains the surplus variables when the division is unequal. The offspring is constructed by extracting, in a diagonal way, one section from each parent. The formal definition is given as follows.

Given the set E with s solutions and s − 1 crossover points {y_1, y_2, ..., y_{s−1}}, where y_i = i·n/s and 0 < i < s, diagonal crossover reproduces the offspring c = (c_1, c_2, ..., c_s), written as s sections, by

    c_k = ( x^(k)_j ) with  1 ≤ j < y_1        for k = 1,
                            y_{k−1} ≤ j < y_k  for 1 < k < s,
                            y_{s−1} ≤ j ≤ n    for k = s.        (1)
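
A minimal C sketch of the diagonal crossover, assuming 0-indexed int arrays; parent k contributes the k-th section, and the last section absorbs the surplus variables.

```c
/* Diagonal crossover with crossover points y_k = k * n / s. */
void diagonal_crossover(int **parents, int s, int n, int *child) {
    int k, j;
    for (k = 0; k < s; k++) {
        int start = k * n / s;                            /* y_k (y_0 = 0)   */
        int end   = (k == s - 1) ? n : (k + 1) * n / s;   /* last one keeps  */
        for (j = start; j < end; j++)                     /* the surplus     */
            child[j] = parents[k][j];
    }
}
```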
Uniform Scanning Crossover (U-Scan): this is a generalization of the two-parent uniform crossover. U-Scan uses a scheme in which one of the parents, selected at random, determines the value of the offspring at each position. Thus each parent has the same probability of contributing the value inherited by the offspring. It removes the restriction to the traditional two parents and extends the number of parents to an arbitrary number.

Given the set E with s solutions, U-Scan generates the offspring solution c = (c_1, c_2, ..., c_n) as follows: value c_j is obtained by c_j = x^(i)_j, where x^(i)_j denotes the j-th value of parent x^(i) and i is randomly selected from 1 to s with probability 1/s.
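
A minimal C sketch of U-Scan; rand() is used here as an assumed source of the uniform parent choice.

```c
#include <stdlib.h>

/* Uniform scanning: each offspring position copies the value of a parent
 * chosen uniformly at random (probability 1/s per parent). */
void uscan_crossover(int **parents, int s, int n, int *child) {
    int j;
    for (j = 0; j < n; j++)
        child[j] = parents[rand() % s][j];
}
```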
Occurrence-Based Scanning Crossover (OB-Scan): OB-Scan relies on parental occurrence to determine the offspring values. Generally speaking, each parent votes and the value inherited is the one favored by the majority of parents. For the UBQP problem, each variable equals either one or zero, so for each variable we record the frequency with which the value one appears in the parents. If this frequency surpasses or falls below half of the number of parents, the corresponding offspring value is assigned one or zero, respectively. Otherwise, the variable is assigned one or zero at random. The following gives a formal definition of OB-Scan.

Given the set E, OB-Scan reproduces the offspring c = (c_1, c_2, ..., c_n) by

    c_j = 0            if Σ_{i=1..s} x^(i)_j < s/2;
    c_j = 1            if Σ_{i=1..s} x^(i)_j > s/2;
    c_j = rand(0, 1)   otherwise.                            (2)

where rand(0, 1) ∈ {0, 1} is a binary random function.
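
A minimal C sketch of OB-Scan implementing the majority vote of equation (2), with ties broken at random; the integer comparison avoids the fractional s/2.

```c
#include <stdlib.h>

/* Occurrence-based scanning: each position takes the majority value among the
 * parents; ties (sum exactly s/2) are broken randomly. */
void obscan_crossover(int **parents, int s, int n, int *child) {
    int i, j;
    for (j = 0; j < n; j++) {
        int ones = 0;
        for (i = 0; i < s; i++) ones += parents[i][j];
        if (2 * ones > s)      child[j] = 1;          /* sum > s/2 -> 1      */
        else if (2 * ones < s) child[j] = 0;          /* sum < s/2 -> 0      */
        else                   child[j] = rand() % 2; /* tie -> random 0/1   */
    }
}
```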
3 Experimental Results

3.1 Instances and Experimental Protocol

To evaluate the MSX, Diagonal, U-Scan and OB-Scan crossover operators, we carry out experiments on a set of 15 large random instances with 3000 to 5000 variables from the literature [13]. Our algorithm is programmed in C and compiled using GNU GCC on a PC running Windows XP with a 2.66 GHz Pentium CPU and 512 MB RAM. Given the stochastic nature of the algorithm, each problem instance is independently solved 10 times. The stopping condition for a single run is set to 5, 10 and 20 minutes on our computer for instances with 3000, 4000 and 5000 variables, respectively. Note that when performing experiments on each crossover, the only difference is the crossover operator; all other components of the algorithm are kept unchanged. The parameters are set as follows: population size p = 30, depth of tabu search α = 2n. The experimental results are summarized in Tables 1 and 2.
3.2 Computational Results

Tables 1 and 2 report the best objective values (with the number of hits over 10 runs in parentheses) and the average objective values obtained with the four crossover operators, respectively.

Table 1. Computational results on the 15 large random instances with 3000 to 5000 variables: best values (success rate)

Instance   MSX            U-Scan         OB-Scan        Diagonal
p3000.1    3931583(9)     3931583(10)    3931583(8)     3931583(9)
p3000.2    5193073(10)    5193073(10)    5193073(10)    5193073(10)
p3000.3    5111533(7)     5111533(7)     5111533(7)     5111533(6)
p3000.4    5761822(10)    5761822(10)    5761822(9)     5761822(10)
p3000.5    5675625(6)     5675625(1)     5675598(1)     5675625(4)
p4000.1    6181830(10)    6181830(10)    6181830(10)    6181830(10)
p4000.2    7801355(7)     7801355(6)     7801355(4)     7801355(6)
p4000.3    7741685(9)     7741685(9)     7741685(9)     7741685(7)
p4000.4    8711822(10)    8711822(9)     8711822(7)     8711822(9)
p4000.5    8908979(4)     8908979(7)     8908979(3)     8908979(2)
p5000.1    8559015(1)     8559312(1)     8559210(3)     8559210(4)
p5000.2    10835437(1)    10835832(3)    10835437(3)    10835437(5)
p5000.3    10488783(10)   10489137(3)    10489137(1)    10489137(1)
p5000.4    12251211(1)    12251211(2)    12251211(2)    12251211(1)
p5000.5    12731803(10)   12731803(1)    12731803(1)    12731803(1)

Citations
Journal ArticleDOI
TL;DR: A hybrid framework composed of two stages for gene selection and classification of DNA microarray data is proposed, and it is observed that the proposed approach works better than other methods reported in the literature.
Abstract: A hybrid framework composed of two stages for gene selection and classification of DNA microarray data is proposed. At the first stage, five traditional statistical methods are combined for preliminary gene selection (Multiple Fusion Filter). Then, different relevant gene subsets are selected by using an embedded Genetic Algorithm (GA), Tabu Search (TS), and Support Vector Machine (SVM). A gene subset, consisting of the most relevant genes, is obtained from this process by analyzing the frequency of each gene in the different gene subsets. Finally, the most frequent genes are evaluated by the embedded approach to obtain a final relevant small gene subset with high performance. The proposed method is tested in four DNA microarray datasets. From the simulation study, it is observed that the proposed approach works better than other methods reported in the literature.

48 citations

Journal ArticleDOI
05 May 2019
TL;DR: The proposed NMGA is the combination of Boltzmann probability selection and a multi-parent crossover technique with known random mutation to solve the Traveling Salesman Problem.
Abstract: In the present study, a Novel Memetic Genetic Algorithm (NMGA) is developed to solve the Traveling Salesman Problem (TSP). The proposed NMGA is the combination of Boltzmann probability selection and a multi-parent crossover technique with known random mutation. In the proposed multi-parent crossover, parents and a common crossing point are selected randomly. After comparing the cost/distance with the adjacent nodes (genes) of the participating parents, two offspring are produced. To establish the efficiency of the developed algorithm, standard benchmarks from TSPLIB are solved against the classical genetic algorithm (GA), and the fruitfulness of the proposed algorithm is recognized. Some statistical tests have been done and the parameters are studied.

28 citations


Cites result from this paper:

  • ...A study on multi-parent MA is found in (Wang et al., 2010) and they concluded that different combination got better results from others....


Journal ArticleDOI
TL;DR: A new variation of HSA that uses multi-parent crossover is proposed (HSA-MPC), where three harmonies are used to generate three new harmonies that will replace the worst three solution vectors in the harmony memory (HM).
Abstract: Harmony search algorithm (HSA) is a recent evolutionary algorithm used to solve several optimization problems. The algorithm mimics the improvisation behaviour of a group of musicians to find a good harmony. Several variations of HSA have been proposed to enhance its performance. In this paper, a new variation of HSA that uses multi-parent crossover is proposed (HSA-MPC). In this technique three harmonies are used to generate three new harmonies that will replace the worst three solution vectors in the harmony memory (HM). The algorithm has been applied to solve a set of eight real world numerical optimization problems (1-8) introduced for IEEE-CEC2011 evolutionary algorithm competition. The experimental results of the proposed algorithm are compared with the original HSA, and two variations of HSA: global best HSA and tournament HSA. The HSA-MPC almost always shows superiority on all test problems.

9 citations

Journal ArticleDOI
TL;DR: This paper mainly describes the available selection mechanisms as well as the crossover and the mutation operators for genetic algorithm.
Abstract: A genetic algorithm is a search heuristic that mimics the natural process of evolution and generates solutions to very complex NP-hard problems. Genetic algorithms belong to the class of evolutionary algorithms (EA) and generate solutions by using nature-inspired techniques like selection, crossover and mutation. The performance of the genetic algorithm mainly depends on the genetic operators. Genetic operators have the capability to maintain genetic diversity. This paper mainly describes the available selection mechanisms as well as the crossover and mutation operators. Keywords: genetic algorithm; selection mechanism; crossover; mutation

7 citations


Cites background or methods from this paper:

  • ...The offspring is constructed through extracting in a diagonal fashion correspondingly one section from each parent [29]....


  • ...The choice can be uniform random choice or a random fitness proportional choice [27] [29]....


  • ...This U-Scan breaks the traditional two parents approach and can extend the number of parents to an arbitrary number [29]....


  • ...UNDX was implemented in the steady state genetic algorithm and it is used to solve three difficult large optimization problems [3][29]....


  • ...In general, each parent votes and the values inherited will be the one privileged by the majority of the parents [29]....


Book ChapterDOI
17 Sep 2016
TL;DR: This paper presents entropy-based population diversity measures that take into account dependencies between the variables in order to maintain genetic diversity in a GA for the traveling salesman problem and develops a more elaborate population diversity measure.
Abstract: This paper presents entropy-based population diversity measures that take into account dependencies between the variables in order to maintain genetic diversity in a GA for the traveling salesman problem. The first one is formulated as the entropy rate of a variable-order Markov process, where the probability of occurrence of each vertex is assumed to be dependent on the preceding vertices of variable length in the population. Compared to the use of a fixed-order Markov model, the variable-order model has the advantage of avoiding the lack of sufficient statistics for the estimation of the exponentially increasing number of conditional probability components as the order of the Markov process increases. Moreover, we develop a more elaborate population diversity measure by further reducing the problem of the lack of statistics

6 citations

References
Book
01 Jan 1999

528 citations


"A study of multi-parent crossover o..." refers background in this paper

  • ...Memetic algorithms (MA) are known to be one of the highly effective metaheuristic approaches for solving a large number of constraint satisfaction and optimization problems [1]....


Book ChapterDOI
09 Oct 1994
TL;DR: The experiments show that 2-parent recombination is inferior on the classical DeJong functions and in some cases 2 parents are optimal, while in some others more parents are better.
Abstract: We investigate genetic algorithms where more than two parents are involved in the recombination operation. We introduce two multi-parent recombination mechanisms: gene scanning and diagonal crossover, which generalize uniform and n-point crossovers, respectively. In this paper we concentrate on the gene scanning mechanism and we perform extensive tests to observe the effect of different numbers of parents on the performance of the GA. We consider different problem types, such as numerical optimization, constrained optimization (TSP) and constraint satisfaction (graph coloring). The experiments show that 2-parent recombination is inferior on the classical DeJong functions. For the other problems the results are not conclusive: in some cases 2 parents are optimal, while in some others more parents are better.

397 citations


"A study of multi-parent crossover o..." refers background in this paper

  • ...Note that except MSX which is recently proposed for UBQP [7], the last three ones have been widely used for other combinatorial optimization problems in the literature [11,12]....


Journal ArticleDOI
TL;DR: This paper describes the development and use of adaptive memory tabu search procedures to solve binary quadratic programs, and demonstrates that the approach is significantly more efficient and yields better solutions than the best heuristic method reported to date.
Abstract: Recent studies have demonstrated the effectiveness of applying adaptive memory tabu search procedures to combinatorial optimization problems. In this paper we describe the development and use of such an approach to solve binary quadratic programs. Computational experience is reported, showing that the approach optimally solves the most difficult problems reported in the literature. For challenging problems of limited size, which are capable of being approached by exact procedures, we find optimal solutions considerably faster than the best reported exact method. Moreover, we demonstrate that our approach is significantly more efficient and yields better solutions than the best heuristic method reported to date. Finally, we give outcomes for larger problems that are considerably more challenging than any currently reported in the literature.

206 citations


"A study of multi-parent crossover o..." refers methods in this paper

  • ...The implementation of this neighborhood uses a fast incremental evaluation technique [9] to calculate the cost (move value) of transitioning to each neighboring solution....
