A tutorial for competent memetic algorithms: model, taxonomy, and design issues
Summary (6 min read)
Introduction
- Specifically, solutions to a given problem are codified in so-called chromosomes.
- Thus, a memetic model of adaptation exhibits the plasticity of individuals that a strictly genetic model fails to capture.
- The authors adopt the name MAs for this metaheuristic because they think it encompasses all the major concepts involved in the alternative names and because, for better or worse, it has become the de facto standard, e.g., [13]–[15].
II. GOALS, AIMS, AND METHODS
- The process of designing effective and efficient MAs currently remains fairly ad hoc and is frequently hidden behind problem-specific details.
- The first goal is to define a syntactic model which enables a better understanding of the interplay between the different component parts of an MA.
- At the same time, it will provide a conceptual framework to deal with more difficult questions about the general behavior of MAs.
- Section V presents a syntax-only model for MAs, and a taxonomy of possible architectures for these metaheuristics is given in Section VI.
- Finally, Section VIII presents a discussion and conclusions.
A. Defining the Subject of Study
- It has been argued that the success of MAs is due to the tradeoff between the exploration abilities of the EA, and the exploitation abilities of the local search used.
- A priori formalizations such as [13] and [19] inevitably leave out many demonstrably successful MAs and can seriously limit analysis and generalization of the (already complex) behavior of MAs.
- This is not because MAs are unsuited to these domains—they have been very successfully applied to the fields of multiobjective optimization (see, e.g., [21]–[24], an extensive bibliography can be found in [25]), and numerical optimization (see, e.g., [26]–[32]).
- Rather, the reason for this omission is partly practical, to do with the space this large field would demand.
- Nevertheless, it is worth stressing that these issues cloud the exposition rather than invalidate the concept of “schedulers,” which leads to their syntactic model and taxonomy, and that the subsequent design guidelines can equally well be applied in these more complex domains.
B. Design Issues for MAs
- Having provided a fairly broad-brush definition of the class of metaheuristics that the authors are concerned with, it is still vital to note that the design of “competent” MAs [33] raises a number of important questions.
- As the authors will see in the following sections, there are a host of possible answers to these questions, and it is important to use both empirical experience and theoretical reasoning in the search for answers.
- The aim of their syntactic model is to provide a sound basis for understanding and comparing the effects of different schemes.
MAs IN OPTIMIZATION AND SEARCH
- The authors will briefly comment on the use of MAs on different combinatorial optimization problems and adaptive landscapes.
- For the definition of the problems, the notation in [35] will be used.
- Begin: produce a starting solution to the problem instance; Repeat: move to an improving neighboring solution; Until (locally optimal); End.
- It does not specify tie-breaking policies, neighborhood structure, etc.
- Note also that the algorithm above implies that local search continues until a local optimum is found.
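The pseudocode above can be made concrete with a minimal first-improvement hill climber. The function names, the toy objective, and the first-improvement tie-breaking rule are illustrative choices, since the summary notes that tie-breaking policies and neighborhood structure are deliberately left unspecified:

```python
def local_search(solution, neighbors, cost):
    """Repeatedly move to an improving neighbor until no neighbor
    improves on the current solution, i.e., until a local optimum."""
    improved = True
    while improved:
        improved = False
        for candidate in neighbors(solution):
            if cost(candidate) < cost(solution):  # first-improvement rule
                solution = candidate
                improved = True
                break
    return solution

# Toy instance: minimize f(x) = (x - 7)^2 over the integers,
# with neighborhood N(x) = {x - 1, x + 1}.
best = local_search(20, lambda x: [x - 1, x + 1], lambda x: (x - 7) ** 2)
```

Swapping the `break` for a full scan of the neighborhood would turn this into a steepest-ascent (best-improvement) variant, one of the design choices discussed later in the summary.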
A. MAs for the TSP
- The TSP is one of the most studied combinatorial optimization problems.
- In [37], early works on the application of MAs to the TSP were commented on.
- The local search procedure is used after the application of each of the genetic operators and not only once in every iteration of the EA.
- In [38], the local search used is based on the powerful guided local search (GLS) metaheuristic [39].
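For reference, the core idea of GLS is to augment the objective with feature penalties; in its usual formulation (this is the standard GLS scheme, not a detail recovered from this summary) the local search minimizes an augmented cost

```latex
h(s) = g(s) + \lambda \sum_{i} p_i \, I_i(s)
```

where \(g\) is the original objective, \(I_i(s)\) indicates whether feature \(i\) (e.g., a particular tour edge in the TSP) is present in solution \(s\), \(p_i\) is a penalty counter incremented for selected features when the search stagnates, and \(\lambda\) balances intensification against diversification.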
- GLS_Based_Memetic_Algorithm:
  Begin
    Initialize the population;
    For i := 1 To sizeOf(population) Do
      apply GLS to the i-th individual; Evaluate it;
    Od
    Repeat Until (termination_condition) Do
      For j := 1 To #recombinations Do
        selectToMerge a set of parents; recombine them;
        apply GLS to the offspring; Evaluate it; Add it to the population;
      Od
      For j := 1 To #mutations Do …
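The skeleton above, in which every newly generated individual is improved by local search before entering the population, can be sketched as runnable code. All operator names and the toy bitstring instance below are placeholders, not the paper's operators:

```python
import random

def memetic_algorithm(init, evaluate, recombine, mutate, improve,
                      pop_size=20, n_recomb=10, n_mut=5, generations=50):
    """Minimal MA skeleton: local search (`improve`) is applied to every
    initial individual and to every offspring of recombination/mutation."""
    pop = [improve(init()) for _ in range(pop_size)]
    for _ in range(generations):
        for _ in range(n_recomb):
            a, b = random.sample(pop, 2)
            pop.append(improve(recombine(a, b)))
        for _ in range(n_mut):
            pop.append(improve(mutate(random.choice(pop))))
        pop.sort(key=evaluate)       # minimization: best first
        pop = pop[:pop_size]         # survivor selection by truncation
    return pop[0]

# Toy run: minimize the number of ones in a 10-bit string.
random.seed(0)
length = 10
init = lambda: [random.randint(0, 1) for _ in range(length)]
recombine = lambda a, b: [random.choice(pair) for pair in zip(a, b)]

def mutate(s):
    s = list(s)
    i = random.randrange(len(s))
    s[i] ^= 1
    return s

def improve(s):
    # One step of local search: clear the first set bit, if any.
    s = list(s)
    for i, bit in enumerate(s):
        if bit:
            s[i] = 0
            break
    return s

best = memetic_algorithm(init, sum, recombine, mutate, improve)
```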
B. MAs for the QAP
- The QAP is found in the core of many practical problems such as facility location, architectural design, VLSI optimization, etc.
- The problem is formally defined as the following.
- Because of the nature of QAP, it is difficult to treat with exact methods, and many heuristics and metaheuristics have been used to solve it.
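The formal definition did not survive extraction; the standard Koopmans–Beckmann formulation of the QAP (given here as background, not recovered from the summary) is

```latex
\min_{\pi \in S_n} \; \sum_{i=1}^{n} \sum_{j=1}^{n} f_{ij} \, d_{\pi(i)\pi(j)}
```

where \(f_{ij}\) is the flow between facilities \(i\) and \(j\), \(d_{kl}\) is the distance between locations \(k\) and \(l\), and the permutation \(\pi\) assigns each facility to a location.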
- As before, the optimization step is applied only to the newly generated individual, that is, the output of the crossover stage.
- Beyond the new representation and crossover on which the MA relies to perform its search, it should particularly be noted that mutation is applied only when a diversity crisis arises, and that immediately after mutating a solution of the population a new local search improvement is performed.
C. MAs for the BQP
- Binary quadratic programming (BQP) is defined as follows.
- Binary Quadratic Programming Problem. Instance: a symmetric rational n × n matrix Q = (q_ij). The benefit of a binary vector x ∈ {0, 1}^n is f(x) = xᵀQx. Aim: find a maximum-benefit solution.
- As well as being a well-known NP-hard problem, BQP has many applications, e.g., financial analysis, CAD problems, machine scheduling, etc. In [47], the authors used an MA with a similar architecture tailored for BQP, and they were able to improve over previous approaches based on TS and simulated annealing (SA).
- They also were able to find new best solutions for instances in the ORLIB [48].
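To make the BQP benefit and the role of local search concrete, here is a small sketch of the objective f(x) = xᵀQx together with a greedy one-flip improvement. The instance and the choice of a 1-flip neighborhood are illustrative assumptions, not taken from [47]:

```python
def bqp_value(Q, x):
    """Benefit f(x) = x^T Q x for a symmetric matrix Q and binary vector x."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def one_flip_local_search(Q, x):
    """Greedy 1-flip improvement: flip the bit with the largest positive
    gain until no single flip increases the benefit (a 1-flip optimum).
    For symmetric Q, flipping bit i changes f by
    (1 - 2*x_i) * (q_ii + 2 * sum_{j != i} q_ij * x_j)."""
    x = list(x)
    n = len(x)
    while True:
        gains = [(1 - 2 * x[i]) *
                 (Q[i][i] + 2 * sum(Q[i][j] * x[j] for j in range(n) if j != i))
                 for i in range(n)]
        i_best = max(range(n), key=lambda i: gains[i])
        if gains[i_best] <= 0:
            return x
        x[i_best] = 1 - x[i_best]

# Toy 3x3 instance (illustrative only).
Q = [[ 2, -1,  0],
     [-1,  3,  1],
     [ 0,  1, -4]]
x_star = one_flip_local_search(Q, [0, 0, 0])
```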
D. MAs for the MGC
- The MGC is one of the most studied problems in graph theory, with many applications in the area of scheduling and timetabling.
- In [49], an MA was presented for this problem which used an embedded form of local search after the mutation stage.
- The authors reported what, at the time the paper was written, were exciting results.
- Fleurent and Ferland [50] studied a number of MAs for MGC based on the hybridization of a standard steady-state GA with problem-specific local searchers and TS.
- The improvement stage was used instead of the mutation stage of the standard GA.
E. MAs for the PSP
- Protein structure prediction is one of the most exciting problems that computational biology faces today.
- There remains the problem of how the one-dimensional string of amino acids folds up to form a three-dimensional protein: it would be extremely useful to be able to deduce the three-dimensional form of a protein from the base sequence of the genes coding for it, but this is still beyond reach.
- One well-studied example is Dill’s HP model [53].
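In Dill's HP model, residues are either hydrophobic (H) or polar (P), a conformation is a self-avoiding walk on a lattice, and the energy is −1 for each pair of H residues that are lattice neighbors but not adjacent along the chain. A minimal sketch of this energy function on the 2-D square lattice (the 4-residue instance is illustrative; the MAs surveyed search over conformations, which this sketch does not attempt):

```python
def hp_energy(sequence, coords):
    """Energy of an HP-model conformation on the 2-D square lattice:
    -1 for every pair of H residues that are lattice neighbors but not
    adjacent along the chain. `coords` lists the (x, y) lattice position
    of each residue, in chain order."""
    energy = 0
    n = len(sequence)
    for i in range(n):
        for j in range(i + 2, n):  # skip chain neighbors
            if sequence[i] == sequence[j] == 'H':
                (xi, yi), (xj, yj) = coords[i], coords[j]
                if abs(xi - xj) + abs(yi - yj) == 1:  # lattice contact
                    energy -= 1
    return energy

# A 4-residue chain folded into a square: the two end H's touch.
e = hp_energy("HPPH", [(0, 0), (1, 0), (1, 1), (0, 1)])
```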
- A replacement strategy was used, together with fitness-proportionate selection for mating.
- In [58], several MAs for other molecular conformation problems are briefly commented on.
A. Syntactic Model for EAs
- Following [62], the EA can be formalized within a “generate-and-test” framework, starting from an initial population.
- Note that in this model, the authors consider the effects of survivor selection at the end of one generation, and parent selection at the start of the next, to be amortized into a single updating function, which is responsible for updating the working memory of their algorithm.
- The MA literature, like the general EA literature, contains examples of the incorporation of diversity-preservation measures into this updating function.
- This issue will be discussed in more depth in Section VII.
- Note that the use of the superscript permits the modeling of crossover operators with variable arity, e.g., Moscato’s K-mergers.
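The generate-and-test view with a single amortized update function can be sketched as follows; the toy integer instance and the truncation-based update are illustrative assumptions, not the paper's formal operators:

```python
import random

def evolve(pop, generate, test, update, generations):
    """Generate-and-test skeleton: `generate` produces candidates from the
    current population, `test` evaluates them, and one `update` function
    plays the combined role of survivor selection at the end of a
    generation and parent selection at the start of the next."""
    for _ in range(generations):
        offspring = generate(pop)
        pop = update(pop, offspring, test)
    return pop

# Toy run: maximize f(x) = -(x - 5)^2 over the integers.
random.seed(1)
fitness = lambda x: -(x - 5) ** 2
generate = lambda pop: [x + random.choice([-1, 1]) for x in pop]
update = lambda pop, off, f: sorted(pop + off, key=f, reverse=True)[:len(pop)]
final = evolve([0, 20], generate, fitness, update, generations=40)
```

Because `update` keeps the best of parents plus offspring, the best member can never worsen between generations, which is the elitist behavior the amortized function makes easy to express.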
B. Extension to MAs
- The authors will need to extend this notation to include local search operators as new generating functions.
- Examples of so-called “multimeme algorithms,” where the local search phase has access to several distinct local searchers, can be found in [20] and [67].
- In general, the authors will assume that each local searcher acts on a single solution and, consequently, drop the subscript for the sake of clarity; as an example of a local searcher of higher arity, the reader might consider Jones’ Crossover Hill Climber [68].
- To model this, the authors define entities called schedulers which are higher order functions.
- For an early example of the application of higher order functions to MAs, see [69], where the authors implement Radcliffe and Surry’s formalism [19] in a functional language.
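The idea of a scheduler as a higher-order function can be illustrated in a few lines: it consumes a set of local searchers plus a selection policy and returns a wrapper for any generating operator. All names and the toy memes below are hypothetical, chosen only to make the mechanism concrete:

```python
def make_scheduler(local_searchers, pick):
    """A scheduler as a higher-order function: given a set of local
    searchers and a policy `pick`, return a function that wraps any
    generating operator (e.g., mutation or crossover) so that its output
    is post-processed by a chosen local searcher."""
    def schedule(operator):
        def scheduled(*solutions):
            candidate = operator(*solutions)
            improve = pick(local_searchers, candidate)
            return improve(candidate)
        return scheduled
    return schedule

# Hypothetical memes: two one-step "improvers" for the toy objective x^2.
memes = [lambda x: x - 1 if x > 0 else x,   # step down toward 0
         lambda x: x + 1 if x < 0 else x]   # step up toward 0
sched = make_scheduler(memes, pick=lambda ms, cand: ms[0] if cand > 0 else ms[1])

mutate = lambda x: x + 3           # a plain generating function
memetic_mutate = sched(mutate)     # mutation followed by scheduled local search
```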
C. Coordinating Local Search With Crossover and Mutation
- The fine-grain scheduler (fS) coordinates when, where, and with which parameters local searchers from the available set will be applied during the mutation and crossover stages of the evolutionary cycle.
- It has the following signature: the fine-grain scheduler receives three arguments.
- Usually this argument will have the value 1: for example, in most of the examples above, local search is applied after recombination or mutation.
- The symmetric case is equally valid, i.e., applying mutation to the result of a local search improvement.
- Viewed this way, crossover itself can act as a local search procedure that uses a neighborhood based on two solutions.
D. Coordinating Local Search With Population Management
- An alternative model, as illustrated in Section IV-E, is to coordinate the action of local search with the population management and updating functions.
- A coarse-grain scheduler (cS) is defined by a signature whose formal parameters stand for the updating function, the set of local searchers, the sets of parents and offspring, and the operator-specific parameter sets.
- Further, it is possible to model the local search methods described in [26], where statistics from the population are used to apply local search selectively.
- With the introduction of this scheduler, a new class of metaheuristics becomes available, given by the many possible instantiations of (2), where the use of superscripts recognizes that the several parameters may be time-dependent.
- As an example of its use, one can imagine that the local searchers are based on TS and that the metascheduler uses information from previous populations to update their tabu lists, thus combining global and local information across time.
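A sketch of coarse-grain scheduling during the update step, where a population statistic decides which offspring receive local search. The better-than-mean threshold rule and the toy instance are illustrative assumptions, not the specific scheme of [26]:

```python
import statistics

def coarse_grain_update(parents, offspring, improve, evaluate, pop_size):
    """Coarse-grain scheduling sketch: apply local search selectively,
    only to offspring already better than the current population mean
    (minimization), instead of polishing every individual."""
    mean_fit = statistics.mean(evaluate(p) for p in parents)
    polished = [improve(o) if evaluate(o) < mean_fit else o
                for o in offspring]
    merged = sorted(parents + polished, key=evaluate)
    return merged[:pop_size]

# Toy run on integers with cost = absolute value; `improve` halves a value.
parents = [4, 8]
offspring = [2, 10]
next_pop = coarse_grain_update(parents, offspring,
                               improve=lambda x: x // 2,
                               evaluate=abs, pop_size=2)
```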
A. Scheduler-Based Taxonomy
- With the use of (2), it is possible to model the vast majority of the MAs found in the literature, capturing the interaction between local search and the standard evolutionary operators (mutation, crossover, selection).
- To understand the ordering of the bits, note that the least significant bit is associated with the scheduler that receives at most one solution among its arguments, the next bit with the one that receives several solutions, and the next two bits with the schedulers that employ progressively larger sets of solutions in their arguments.
- Table I classifies the various methods discussed in Section III according to their index number, but it will rapidly be seen that only a small fraction of the alternative MAs have been employed and investigated, and that the pattern is inconsistent across different problem types.
- Of particular interest are the frontiers between neighboring index classes.
- The authors have included in this table a reference to [21].
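One plausible reading of the scheduler-based index is a 4-bit number with one bit per scheduler used, ordered by the number of solutions each scheduler receives. The exact bit assignment below is an assumption for illustration, not the paper's Table I:

```python
def ma_index(fine_single, fine_nary, coarse, meta):
    """Illustrative 4-bit taxonomy index: one bit per scheduler present in
    the MA, with the least significant bit for the scheduler acting on a
    single solution and higher bits for schedulers acting on progressively
    larger sets of solutions. Bit order is a hypothetical assumption."""
    bits = [fine_single, fine_nary, coarse, meta]
    return sum(int(b) << i for i, b in enumerate(bits))
```

For example, under this assumed ordering, an MA using only the single-solution fine-grain scheduler and the coarse-grain scheduler would get index 0101 in binary, i.e., 5.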
B. Relationship to Other Taxonomies
- The taxonomy presented here complements the one introduced in [91] by Calégari et al., who provide a comprehensive taxonomic framework for EAs.
- A hierarchical organization is then developed for each one.
- The authors’ approach categorizes the architecture of a subclass of the algorithms that both of the previous taxonomies include.
- In that way, a more refined classification is obtained for the subclass of EAs and hybrid metaheuristics that are MAs.
- Of course such a syntactic model and taxonomy is of little interest to the practitioner unless it in some way aids in the conceptualization and design process.
C. Distinguishing Types of Local Search Strategies
- Making the separation into two sets of objects (candidate solutions in the EA’s population, and local search heuristics), with interactions mediated by a set of schedulers, facilitates a closer examination of the potential nature of the local searchers themselves.
- If a local searcher adapts through changes in its parameters as the search progresses, then the authors call it an adaptive meme.
- In the same way, if just one meme is self-adaptive, then the entire set of memes is considered self-adaptive.
- The simplest case uses static memes and requires that the scheduler’s parameter set be enlarged to include a probability distribution function (pdf) for the likelihood of applying the different memes, in addition to their operational parameters.
- The simplest adaptive case requires that this pdf be time-dependent, with the scheduler becoming responsible for adapting it.
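The adaptive case can be sketched as a scheduler that maintains an (unnormalized) pdf over the memes and reinforces those whose applications recently improved a solution. The reward rule and the toy memes are illustrative assumptions, not the paper's update scheme:

```python
import random

def make_meme_picker(memes, learning_rate=0.1):
    """Adaptive-meme sketch: keep weights (an unnormalized pdf) over the
    memes; after each application, reinforce the chosen meme if it
    strictly improved the solution's cost (minimization)."""
    weights = [1.0] * len(memes)

    def apply(solution, cost):
        i = random.choices(range(len(memes)), weights=weights)[0]
        improved = memes[i](solution)
        if cost(improved) < cost(solution):   # success: reinforce meme i
            weights[i] += learning_rate
        return improved

    return apply

# Toy run: two memes on the integers; only the first one ever helps here,
# so its weight grows and it is picked increasingly often.
random.seed(0)
picker = make_meme_picker([lambda x: x - 1,    # decrement (useful)
                           lambda x: x])       # no-op (useless)
s = 10_000
for _ in range(200):
    s = picker(s, cost=abs)
```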
VII. DESIGN ISSUES FOR “COMPETENT” MAS
- In [33], Goldberg describes “competent” GAs as: Genetic algorithms that solve hard problems quickly, reliably, and accurately.
- As the authors have described above, for a wide variety of problems, MAs can fulfil these criteria better than traditional EAs.
- It is now appropriate to revisit these issues, in the light of their syntactic model and taxonomy, in order to see what new insights can be gained.
A. Choice of Local Search Operators
- The reader will probably not be surprised to find that their answer to the first question is “it depends.”
- In [67], the authors showed that even within a single problem class (in that case TSP) the choice of which single LS operator gave the best results when incorporated in an MA was entirely instance-specific.
- It is perhaps worth noting that in [95], it was shown that while coarse-grain adaptation was sufficient for a steepest-ascent LS, the extra noise inherent in a first-ascent approach gave worse results.
- The hoped-for synergy in such an MA is that the use of genetic variation operators will produce offspring which are more likely to be in the basin of attraction of a high-quality local optimum than simply randomly selecting another point to optimize.
C. Managing the Global-Local Search Tradeoff
- The majority of MAs in the literature apply local search to every individual in every generation of the EA; however, their model makes it clear that this is not mandatory.
- They achieve this by providing sophisticated coarse-grain schedulers that measure population statistics and take them into consideration at the time of applying local search.
- In [74], Land addresses the problem of how to best integrate the local search operators with the genetic operators.
- That is, instead of performing a complete local search on every solution generated by the evolutionary operators, a partial local search is applied; only those solutions that lie in promising basins of attraction will later be assigned (by the coarse-grain scheduler) an extended CPU budget for local search.
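Partial local search with a small per-solution budget can be sketched as follows. The budget cap and toy instance are illustrative; the idea is that a coarse-grain scheduler could call this again with a larger budget for solutions whose short run looked promising:

```python
def budgeted_local_search(solution, neighbors, cost, budget):
    """Partial (budgeted) local search: improve the solution but spend at
    most `budget` neighborhood evaluations, returning the best solution
    found together with the number of evaluations actually spent."""
    spent = 0
    improved = True
    while improved and spent < budget:
        improved = False
        for cand in neighbors(solution):
            spent += 1
            if cost(cand) < cost(solution):   # first-improvement move
                solution = cand
                improved = True
                break
            if spent >= budget:
                break
    return solution, spent

# Toy instance: minimize f(x) = (x - 7)^2 from x = 20 with a budget of 5
# evaluations; the search stops well short of the local optimum at 7.
result = budgeted_local_search(20, lambda x: [x - 1, x + 1],
                               lambda x: (x - 7) ** 2, budget=5)
```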
VIII. CONCLUSION AND FURTHER WORK
- The authors committed themselves to the study of several works on MAs, coming from different sources, with the purpose of designing a syntactic model for MAs.
- The authors were able to identify two kinds of helpers, static and adaptive, and to generalize a third type: self-adaptive helpers.
- While examples were found of the first two types, the third type has only recently been explored [93]–[98], suggesting another interesting line of research.
- Another important avenue of research is the study of which kind of MA, defined by its index, is suitable for different types of problems.
- Both the syntactic model and the taxonomy aid their understanding of the design issues involved in the engineering of MAs.
Frequently Asked Questions (10)
Q2. What is the meta scheduler able to use to influence its search?
The meta scheduler is able to use information from previous populations to influence its search; hence, a kind of evolutionary memory is introduced into the evolutionary search mechanisms.
Q3. What is the hoped-for synergy in a Lamarckian MA?
The hoped-for synergy in such an MA is that the use of genetic variation operators will produce offspring which are more likely to be in the basin of attraction of a high-quality local optimum than simply randomly selecting another point to optimize.
Q4. What is the definition of a coarse-grain scheduler?
A coarse-grain scheduler (cS) is defined by a signature whose formal parameters stand for the updating function, the set of local searchers, the sets of parents and offspring, and the operator-specific parameter sets.
Q5. How does the author address the issue of large neutral plateaus and deep local optima?
In [57] and [71], the issue of large neutral plateaus and deep local optima is addressed by providing modified local searchers that can change their behavior according to the convergence state of the evolutionary search.
Q6. What is the important avenue of research?
Another important avenue of research is the study of which kind of MA, defined by its index, is suitable for different types of problems.
Q7. What is the effect of survivor selection on the algorithm?
Note that in this model, the authors consider the effects of survivor selection at the end of one generation, and parent selection at the start of the next, to be amortized into a single updating function, which is responsible for updating the working memory of their algorithm.
Q8. What is the general classification scheme for metaheuristics?
In a related and complementary work, Talbi [92] provides a general classification scheme for metaheuristics based on two different aspects of a metaheuristic: its design space and its implementation space.
Q9. What is the main argument for a sensible design approach?
In earlier sections, the authors have listed a number of papers in the recent MA literature which use multiple LS operators, and the authors would certainly argue that faced with a choice of operators, a sensible design approach would be not to decide a priori but to incorporate several.
Q10. Why is the concept of optimality clouded by the tradeoffs between objectives?
When the authors consider multiobjective problems, the whole concept of optimality becomes clouded by the tradeoffs between objectives, and dominance relations are usually preferred.