
A generic framework for population-based algorithms, implemented on multiple FPGAs

14 Aug 2005, pp. 43-55
TL;DR: This work outlines a generic framework that captures a collection of population-based algorithms, allowing commonalities to be factored out, and properties previously thought particular to one class of algorithms to be applied uniformly across all the algorithms.
Abstract: Many bio-inspired algorithms (evolutionary algorithms, artificial immune systems, particle swarm optimisation, ant colony optimisation,...) are based on populations of agents. Stepney et al [2005] argue for the use of conceptual frameworks and meta-frameworks to capture the principles and commonalities underlying these, and other bio-inspired algorithms. Here we outline a generic framework that captures a collection of population-based algorithms, allowing commonalities to be factored out, and properties previously thought particular to one class of algorithms to be applied uniformly across all the algorithms. We then describe a prototype proof-of-concept implementation of this framework on a small grid of FPGA (field programmable gate array) chips, thus demonstrating a generic architecture for both parallelism (on a single chip) and distribution (across the grid of chips) of the algorithms.

Summary (4 min read)

1 Introduction

  • Many bio-inspired algorithms are based on populations of agents trained to solve some problem such as optimising functions or recognising categories.
  • The authors take up this challenge, and, in section 2, outline a generic framework abstracted from the individual population-based models of the following classes: genetic algorithms (GA), AIS negative selection, AIS clonal selection, PSO, and ant colony optimisation (ACO).
  • The framework provides a basis for factoring out the commonalities, and applying various properties uniformly across all the classes of algorithms, even where they were previously thought particular to one class (section 3).
  • In section 4 the authors describe their proof-of-concept prototype implementation of the generic framework on a platform of multiple field programmable gate array (FPGA) chips.

2 The generic framework for population algorithms

  • There are many specific algorithms and implementation variants of the different classes.
  • It is not their intention to capture every variant in the literature; rather, the authors step back from the specifics and abstract the basic underlying concepts, particularly the more bio-inspired ones, of each class of algorithm.
  • The authors unify the similarities between these basics in order to develop a generic framework.
  • The intention is that such a framework provides a useful starting point for the subsequent development of more sophisticated variants of the algorithms.

Basic underlying concepts

  • Each individual contains a set of characteristics, which represent the solution.
  • AIS negative selection: the individuals are antibodies; each characteristic is a shape receptor.
  • AIS clonal selection: there are two populations; in the main population the individuals are antibodies, each characteristic a shape receptor. There is also a population of memory cells drawn from this main population.
  • Ants: the individuals are the complete paths (not the ants, which are merely mechanisms to construct the complete paths from path steps); the characteristics are the sequence of path steps, where each step has an associated length and pheromone level.

Algorithm stages

  • The different specific algorithms each exhibit six clearly distinct stages, comprising a generation.
  • These are generalised as: 1. Create: make novel members of the population; 2. Evaluate: evaluate each individual for its affinity to the solution; 3. Test: test if some termination condition has been met; 4. Select: select certain individuals from the current generation, based on their affinity, to be used in the creation of the next generation; 5. Spawn: create new individuals for the next generation; 6. Mutate: change selected individuals.
  • The authors describe each of these stages, covering the generic properties, and how they are instantiated for each specific class of algorithm.
  • Rather than saying that some individuals survive from generation to generation, for uniformity the authors consistently consider each generation to be a completely fresh set of individuals, with some possibly being copies of previous generation individuals.
  • As another example, the pheromone changes in the Ant algorithm are mapped to the generic mutate step.

Create

  • Creation makes novel members of the populations.
  • In the first generation, the whole population is set up, and the members have their characteristics initialised.
  • On subsequent generations, creation “tops up” the population with fresh individuals, as necessary.

Evaluate

  • The affinity measures how well each individual solves (part of) the problem.
  • This function should ideally (but does not always) have the structure of a metric over the space defined by the characteristics.

Test

  • The test for termination is either (a) a sufficiently good solution is found, or (b) enough generations have been run without finding a sufficiently good solution.
  • On termination, the solution is: for GA, Swarms, and Ants, the highest affinity individual; for AIS negative selection, the set of individuals with above-threshold affinities; for AIS clonal selection, the population of memory cells.

Select

  • High affinity individuals are selected to contribute somehow to the next generation’s population.
  • There are several selection algorithms commonly used.
  • n best selects the n highest affinity individuals from the current population; threshold selects all individuals with an affinity greater than some given threshold value.
  • Roulette wheel selection randomly chooses a given number of individuals, with probability of selection proportional to their affinity, or to their ranking.
  • Tournament randomly selects teams of individuals, and then selects a subset of individuals from each team.

Spawn

  • Production of new individuals for the next generation usually involves combining the characteristics of parent individuals from the selected population (ants are a special case).
  • GA: the characteristics of pairs of selected parents are combined using a crossover mask; if the mask is set to the identity, the two new individuals are clones of the two parents.
  • AIS negative selection: the selected parents become the basis of the new generation, which is topped up to the population size by creating sufficient new individuals.
  • Swarms: the new position is derived from the parent’s position and velocity, the velocity is modified to point towards the best neighbour, and the neighbourhood group is copied from the parent.
  • Ants : no individuals are specifically spawned for the next generation: each generation is created afresh from the path steps (whose characteristics are changed by the mutate step).

Other generalisations

  • The generic framework allows further features of one specific algorithm to be generalised to the others.
  • Evolutionary Strategies encode the mutation rates as characteristics: a similar approach can be used in the other algorithms.
  • The ant algorithm could allow the pheromone decay rate to be a characteristic.
  • The range of selection strategies can be employed across all the algorithms that have a non-trivial selection stage.
  • In particular, AIS clonal selection has two populations: selection strategies could be used on the memory cell population too.

4 The prototype implementation

  • There is much opportunity for parallelism in these algorithms: individuals can (to some degree) be evaluated, selected, and created in parallel.
  • This suggests efficiency gains by executing these algorithms on parallel hardware.

FPGAs and Handel-C

  • The authors chose as their prototype implementation platform a small grid of FPGAs, executing the framework implemented in Handel-C.
  • So each individual FPGA can host multiple individuals executing in parallel, and multiple FPGAs allow distributed implementations.
  • Handel-C is essentially an executable subset of CSP [Stepney 2003], with some extensions to support FPGA hardware.
  • It would have been possible to design a protocol implementing channel communication between chips, allowing the distributed program to be (very close to) a pure Handel-C program.
  • Instead, for this prototype, a simple handshaking protocol is used, with the inter-chip communication hidden in a wrapper.

The implemented framework

  • The prototype implementation of the framework provides much of the functionality described above.
  • The Handel-C compiler optimises away dead code, so options that are not selected by the user (such as various choices of creation or selection functions) do not appear in the compiled code.
  • It is also possible to return intermediate results every generation, to allow investigation of the performance, or for debugging, but this introduces a communication bottleneck.
  • Each FPGA chip holds a certain number of islands, each of which holds its individuals.
  • The appropriate selection method is then applied to each team of individuals in parallel.
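The island arrangement, with periodic migration of the best individuals between chips (as in the experiment of section 5), can be sketched as follows. This is a minimal illustrative sketch, not the Handel-C implementation: the ring topology and the representation of each individual as a `[affinity, characteristics]` list are assumptions.

```python
def migrate(islands, n_migrants=2):
    """Ring migration: copy each island's n_migrants best individuals
    (individuals are stored as [affinity, characteristics] lists) to the
    next island in the ring, replacing that island's worst."""
    # Snapshot each island sorted best-first, before any replacement happens.
    snapshots = [sorted(isl, key=lambda ind: ind[0], reverse=True)
                 for isl in islands]
    for i, isl in enumerate(islands):
        incoming = snapshots[i - 1][:n_migrants]  # best of previous island
        isl.sort(key=lambda ind: ind[0])          # worst first
        isl[:n_migrants] = [list(ind) for ind in incoming]
    return islands
```

In the prototype this exchange would run every fixed number of generations (100 in the reported experiment), with the handshaking wrapper carrying the migrants between chips.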

Restrictions due to the platform choice

  • Some of the design decisions for the framework prototype are due to specific features and limitations of FPGAs and Handel-C, and different platform choices could result in different decisions.
  • The use of families is to cope with the limited size of the FPGAs.
  • Certain parts of the selection can be performed in parallel, for example, to find the n best, where each individual can read the affinity of all its teammates in parallel.
  • Handel-C supports variable bit-width values, requiring explicit casting between values with different widths.
  • This can lead to arcane code, particularly when trying to write generic routines.

5 Preliminary results

  • The number of (families of) individuals possible per chip varies depending on the settings.
  • With all the capabilities turned on, this number drops to about 18 individuals run sequentially, or four if run in parallel, the reduction being due to the increased routing and copies of code.
  • The FPGAs being used (300K gate Xilinx SpartanIIE chips) are relatively small: it was thought more important for this proof of concept work to get the maximum number of FPGAs for the budget, rather than the maximum size of each one.
  • Looking at only the evaluate stage shows the sequential form taking about twice as long as the parallel form.
  • The experiment compares running four individuals in parallel on one chip versus four individuals in parallel on each of the five chips (20 individuals in total), migrating the two best individuals every 100 generations.

7 Acknowledgments

  • The authors would like to thank Wilson Ifill and AWE, who provided funding for the FPGAs used in this work.
  • Also thanks to Neil Audsley and Michael Ward for turning a large box of components into a usable FPGA grid, and to Fiona Polack and Jon Timmis for detailed comments on earlier versions.



A generic framework for population-based algorithms,
implemented on multiple FPGAs
John Newborough and Susan Stepney
Department of Computer Science, University of York, Heslington, York, YO10 5DD, UK
Abstract. Many bio-inspired algorithms (evolutionary algorithms, artificial immune
systems, particle swarm optimisation, ant colony optimisation, …) are based on
populations of agents. Stepney et al [2005] argue for the use of conceptual frameworks
and meta-frameworks to capture the principles and commonalities underlying these, and
other bio-inspired algorithms. Here we outline a generic framework that captures a
collection of population-based algorithms, allowing commonalities to be factored out, and
properties previously thought particular to one class of algorithms to be applied uniformly
across all the algorithms. We then describe a prototype proof-of-concept implementation
of this framework on a small grid of FPGA (field programmable gate array) chips, thus
demonstrating a generic architecture for both parallelism (on a single chip) and distribution
(across the grid of chips) of the algorithms.
1 Introduction
Many bio-inspired algorithms are based on populations of agents trained to solve some
problem such as optimising functions or recognising categories. For example,
Evolutionary Algorithms (EA) are based on analogy to populations of organisms
mutating, breeding and selecting to become “fitter” [Mitchell 1996]. The negative and
clonal selection algorithms of Artificial Immune Systems (AIS) use populations of agents
trained to recognise certain aspects of interest (see de Castro & Timmis [2002] for an
overview): negative selection involves essentially random generation of candidate
recognisers, whilst clonal selection uses reinforcement based on selection and mutation of
the best recognisers. Particle swarm optimisation (PSO) [Kennedy & Eberhart 2001] and
social insect algorithms [Bonabeau 1999] use populations of agents whose co-operations
(direct, or stigmergic) result in problem solving.
Stepney et al [2005] argue for the use of conceptual frameworks and meta-frameworks
to capture the principles and commonalities underlying various bio-inspired algorithms.
We take up this challenge, and, in section 2, outline a generic framework abstracted from
the individual population-based models of the following classes: genetic algorithms (GA),
AIS negative selection, AIS clonal selection, PSO, and ant colony optimisation (ACO).
The framework provides a basis for factoring out the commonalities, and applying various
properties uniformly across all the classes of algorithms, even where they were previously
thought particular to one class (section 3).
ICARIS 2005, Banff, Canada, August 2005. LNCS 3627:43-55. Springer, 2005.

In section 4 we describe our proof-of-concept prototype implementation of the generic
framework on a platform of multiple field programmable gate array (FPGA) chips. Thus
the generic architecture naturally permits both parallelism (multiple individuals executing
on a single chip) and distribution (multiple individuals executing across the array of chips)
of the algorithms. In section 5 we outline what needs to be done next to take these
concepts into a fully rigorous framework architecture and implementation.
2 The generic framework for population algorithms
There are many specific algorithms and implementation variants of the different classes.
To take one case, AIS clonal selection, see, for example [Cutello et al 2004] [Garrett
2004] [Kim & Bentley 2002]. It is not our intention to capture every detail of all the
variants in the literature. Rather, we take a step back from the specifics, and abstract the
basic underlying concepts, particularly the more bio-inspired ones, of each class of
algorithm. So when we refer to “GA” or “AIS clonal selection”, for example, we are not
referring to any one specific algorithm or implementation, but rather of the general
properties of this class. We unify the similarities between these basics in order to develop
a generic framework. The intention is that such a framework provides a useful starting
point for the subsequent development of more sophisticated variants of the algorithms.
Basic underlying concepts
The generic algorithm is concerned with a population of individuals, each of which
captures a possible solution, or part of a solution. Each individual contains a set of
characteristics, which represent the solution. The characteristics define the (phase or
state) space that the population of individuals inhabit. The goal of the algorithm is to find
“good” regions of this space, based on some affinity (a measure that relates position in the
space to goodness of solution, so defining a landscape). The individuals and
characteristics of the specific classes of algorithm are as follows:
GA : the individuals are chromosomes; each characteristic is a gene.
AIS negative selection : the individuals are antibodies; each characteristic is a shape
receptor.
AIS clonal selection : there are two populations. In the main population the
individuals are antibodies; each characteristic is a shape receptor. There is also a
population of memory cells drawn from this main population.
Swarms : the individuals are boids; the characteristics are position, velocity and
neighbourhood group (the other visible individuals).
Ants: the individuals are the complete paths (not the ants, which are merely
mechanisms to construct the complete paths from path steps); the characteristics are the
sequence of path steps, where each step has an associated characteristic of length and
pheromone level.

Algorithm stages
The different specific algorithms each exhibit six clearly distinct stages, comprising a
generation. These are generalised as:
1. Create : make novel members of the population
2. Evaluate : evaluate each individual for its affinity to the solution
3. Test : test if some termination condition has been met
4. Select : select certain individuals from the current generation, based on their affinity, to
be used in the creation of the next generation
5. Spawn : create new individuals for the next generation
6. Mutate : change selected individuals
We describe each of these stages, covering the generic properties, and how they are
instantiated for each specific class of algorithm. Using this framework results in
descriptions that sometimes differ from, but are equivalent to, the traditional descriptions
of the algorithms. For example, rather than saying that some individuals survive from
generation to generation, for uniformity we consistently consider each generation to be a
completely fresh set of individuals, with some possibly being copies of previous
generation individuals. As another example, the pheromone changes in the Ant algorithm
are mapped to the generic mutate step.
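The six stages can be expressed as a single generic driver loop. The following is an illustrative Python sketch, not the paper's Handel-C implementation: the function names, signatures, and the list-based population are assumptions, with the stage functions supplied by the specific algorithm class.

```python
def run(pop_size, n_generations, create, evaluate, good_enough,
        select, spawn, mutate):
    """Drive one population through the six generic stages.

    create/evaluate/good_enough/select/spawn/mutate are supplied by the
    specific algorithm class (GA, AIS, PSO, ACO, ...); this loop only
    fixes their order within a generation.
    """
    population = [create() for _ in range(pop_size)]             # 1. Create
    best = population[0]
    for _ in range(n_generations):
        affinities = [evaluate(ind) for ind in population]       # 2. Evaluate
        best = population[max(range(len(population)),
                              key=lambda i: affinities[i])]
        if good_enough(max(affinities)):                         # 3. Test
            return best
        parents = select(population, affinities)                 # 4. Select
        population = [mutate(ind) for ind in spawn(parents)]     # 5. Spawn, 6. Mutate
        while len(population) < pop_size:                        # 1. Create: top up
            population.append(create())
    return best
```

Note that, in line with the framework's view, each generation is a completely fresh list of individuals, some of which may be copies of the previous generation's.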
Create
Creation makes novel members of the populations. In the first generation, the whole
population is set up, and the members have their characteristics initialised. On subsequent
generations, creation “tops up” the population with fresh individuals, as necessary.
GA: an individual chromosome is created usually with random characteristics, giving a
broad coverage of the search space
AIS negative selection : an individual antibody is created usually with random shape
receptors
AIS clonal selection : an individual antibody in the main population is created usually
with random shape receptors; memory cells are not created, rather they are spawned from
the main population
Swarms : an individual boid is created usually with random position and velocity
characteristics, giving a broad coverage of the search space; the neighbourhood
characteristic is usually set to implement a ring, grid or star connection topology
Ants : each path step is initially set up usually with a fixed pheromone level, and with
the relevant (fixed) path length; the population of paths is created by the ants from these
steps each generation

Evaluate
The affinity measures how well each individual solves (part of) the problem. It is a user-
defined function of (some of) an individual’s characteristics. This function should ideally
(but does not always) have the structure of a metric over the space defined by the
characteristics.
GA : the affinity is the fitness function, a function of the values of the genes
AIS : the affinity is a measure of how closely the shape receptors complement the
target of recognition, inspired by the “lock and key” metaphor
Swarms : the affinity, or fitness function, is a function of the current position
Ants : the affinity is the (inverse of the) path length
Test
The test for termination is either (a) a sufficiently good solution is found, or (b) enough
generations have been run without finding a sufficiently good solution. On termination,
the solution is:
GA, Swarms, Ants : the highest affinity (fittest) individual
AIS negative selection : the set of individuals with above-threshold affinities
AIS clonal selection : the population of memory cells
Select
High affinity individuals are selected to contribute somehow to the next generation’s
population. There are several selection algorithms commonly used. n best selects the n
highest affinity individuals from the current population. Threshold selects all the
individuals with an affinity greater than some given threshold value. Roulette wheel
selection randomly chooses a given number of individuals, with probability of selection
proportional to their affinity, or to their ranking. Tournament randomly selects teams of
individuals, and then selects a subset of individuals from each team.
GA : different variants use any of the above methods of selection, to find the parents
that will produce the next generation
AIS negative selection : threshold selection is used to find the next generation
AIS clonal selection : a combination of n best and threshold selection is used to find
the next generation of the main population; all individuals of the memory cell population
are selected to become the basis of its next generation
Swarms : all individuals are selected to become the basis of the next generation
Ants : no individuals are specifically selected to become the next generation: each
generation is created afresh from the path steps (whose characteristics are changed by the
mutate step)

Spawn
Production of new individuals for the next generation usually involves combining the
characteristics of parent individuals from the selected population (ants are a special case).
GA : the characteristics of pairs of selected parents are combined by using a crossover
mask (predefined or randomly generated) to generate two new individuals. If the
crossover mask is set to the identity, then the two new individuals are clones of the two
parents.
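The mask mechanism can be sketched as follows, assuming characteristics stored as equal-length lists; the identity-mask-gives-clones property falls out directly.

```python
def crossover(parent_a, parent_b, mask):
    """Combine two parents under a crossover mask.

    Where a mask bit is 1 the first child keeps parent_a's gene and the
    second keeps parent_b's; where it is 0 the genes are swapped.  The
    identity mask (all ones) therefore yields clones of the two parents.
    """
    child_a = [a if m else b for a, b, m in zip(parent_a, parent_b, mask)]
    child_b = [b if m else a for a, b, m in zip(parent_a, parent_b, mask)]
    return child_a, child_b
```

The mask itself may be predefined (e.g. one-point crossover as a block of ones followed by zeros) or randomly generated (uniform crossover).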
AIS negative selection : the selected parents become the basis of the new generation
(which is topped up to the population size by creating sufficient new individuals). If the
threshold is a constant value throughout the run, this has the effect that an individual, once
selected, continues from generation to generation, and only the newly created individuals
need be evaluated.
AIS clonal selection : in the main population new individuals are spawned as clones of
each parent, with the number of clones being produced proportional to the parent’s
affinity; in the memory cell population, the selected parents become the basis of the new
generation, and a new individual is spawned, as (a copy of) the best individual of the main
population.
Swarms : a new individual is spawned from the sole parent and the highest affinity
individual in that parent’s neighbourhood group, with the intention of making the new
individual “move towards” the best neighbour. The new position is derived from the
parent’s position and velocity, the velocity is modified to point towards the best
neighbour, and the neighbourhood group is copied from the parent.
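The swarm spawn rule can be sketched as below. The paper gives no update coefficients, so the single attraction coefficient (and the absence of inertia weighting or a personal-best term, found in many PSO variants) is an illustrative assumption.

```python
def spawn_boid(position, velocity, best_neighbour, attraction=0.5):
    """Spawn the next-generation boid from its sole parent.

    New position = parent position + parent velocity; the velocity is
    then nudged towards the best individual in the parent's
    neighbourhood group.  The attraction coefficient is assumed.
    """
    new_position = [p + v for p, v in zip(position, velocity)]
    new_velocity = [v + attraction * (b - p)
                    for p, v, b in zip(new_position, velocity, best_neighbour)]
    return new_position, new_velocity
```

The neighbourhood group itself is simply copied from the parent, so only position and velocity change between generations.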
Ants : no individuals are specifically spawned for the next generation: each generation
is created afresh from the path steps (whose characteristics are changed by the mutate
step)
Mutate
Mutation involves altering the characteristics of single individuals in the population. It
would be possible to unify spawning and mutation into a single generate stage, but since
most algorithms consider these to be separate processes, we have followed that view,
rather than strive for total generality at this stage. The mutation rate might be globally
random, or based on the value of a characteristic or the affinity of each individual. How a
characteristic is mutated depends on its type: a boolean might be flipped, a numerical
value might be increased or decreased by an additive or multiplicative factor, etc.
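The type-dependent mutation just described can be sketched as a single dispatch function; the rate and step size defaults are illustrative assumptions.

```python
import random

def mutate_value(value, rate=0.1, step=1):
    """Mutate one characteristic according to its type.

    With probability rate: a boolean is flipped; a numerical value is
    nudged up or down by an additive step.  (A multiplicative factor
    would be the other option the text mentions.)
    """
    if random.random() >= rate:
        return value          # no mutation this time
    if isinstance(value, bool):
        return not value
    return value + random.choice((-step, step))
```

Making `rate` itself a characteristic of the individual, in the Evolutionary Strategies style, is exactly the generalisation proposed in section 3.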
GA, Swarms : individuals are mutated, usually randomly, in order to reintroduce lost
values of characteristics; evolutionary strategy algorithms encode mutation rates as
characteristics
AIS negative selection : no mutation occurs. (That is, the next generation consists of
copies of the selected above threshold individuals, topped up with newly created
individuals. An alternative, but equivalent, formulation in terms of this framework would

Citations
More filters
Book ChapterDOI
Inho Park1, Dokyun Na1, Kwang H. Lee1, Doheon Lee1
14 Aug 2005
TL;DR: Fuzzy continuous Petri nets are extended by adding new types of places and transitions called fuzzy places and fuzzy transitions that makes it possible to perform fuzzy inference with fuzzy placesand fuzzy transitions acting as kinetic parameters and fuzzy inference systems between input and output places, respectively.
Abstract: Helper T(Th) cells regulate immune response by producing various kinds of cytokines in response to antigen stimulation. The regulatory functions of Th cells are promoted by their differentiation into two distinct subsets, Th1 and Th2 cells. Th1 cells are involved in inducing cellular immune response by activating cytotoxic T cells. Th2 cells trigger B cells to produce antibodies, protective proteins used by the immune system to identify and neutralize foreign substances. Because cellular and humoral immune responses have quite different roles in protecting the host from foreign substances, Th cell differentiation is a crucial event in the immune response. The destiny of a naive Th cell is mainly controlled by cytokines such as IL-4, IL-12, and IFN-γ. To understand the mechanism of Th cell differentiation, many mathematical models have been proposed. One of the most difficult problems in mathematical modeling is to find appropriate kinetic parameters needed to complete a model. However, it is relatively easy to get qualitative or linguistic knowledge of a model dynamics. To incorporate such knowledge into a model, we propose a novel approach, fuzzy continuous Petri nets extending traditional continuous Petri net by adding new types of places and transitions called fuzzy places and fuzzy transitions. This extension makes it possible to perform fuzzy inference with fuzzy places and fuzzy transitions acting as kinetic parameters and fuzzy inference systems between input and output places, respectively.

13 citations

Dissertation
30 Jun 2010
TL;DR: This thesis describes a path from a model of a biological system to a biologically-inspired algorithm, which has application in tracking probability distributions and demonstrates the algorithm on a related but different problem of detecting anomalies in spectrometer data.
Abstract: This thesis describes a path from a model of a biological system to a biologically-inspired algorithm The thesis commences with a discussion of the principled design of biologically-inspired algorithms It is argued that modelling a biological system can be tremendously helpful in eventual algorithm construction A proposal is made that it is possible to reduce modelling biases by modelling the biological system without any regard to algorithm development, that is, with only concern of understanding the biological mechanisms As a consequence the thesis investigates a detailed model of T cell signalling process The model is subjected to stochastic analysis which results in a hypothesis for T cell activation This hypothesis is abstracted to form a simplified model which retains key mechanisms The abstracted model is shown to have connections to Kernel Density Estimation, through developing these connections the Receptor Density Algorithm is developed By design, the algorithm has application in tracking probability distributions Finally, the thesis demonstrates the algorithm on a related but different problem of detecting anomalies in spectrometer data

10 citations


Cites result from "A generic framework for population-..."

  • ...This suggestion is in part related to the observation that many population-based bio-inspired algorithms have very similar methods [Newborough and Stepney, 2005]....

    [...]

01 Jan 2006
TL;DR: The intent of this work is to provide both a high-level overview of the principle computational paradigms of the field and provide some indication of the major works and their relationships within each paradigm.
Abstract: This work provides a high-level taxonomy of the field of Artificial Immune Systems (AIS). Specifically, the taxonomy presented is not application focused as is the case in de Castro and Timmis' seminal reference on the field [212], rather the field is presented firstly by computational paradigm, and secondary by so-called thrust of research. This secondary aggregation involves the subjective organisation of work based on authors involved and the theme of the presented research. The intent of this work is to provide both a high-level overview of the principle computational paradigms of the field and provide some indication of the major works and their relationships within each paradigm. This work is not a complete bibliography, rather it contains seminal and influencial works from each selected paradigm. Section 2 considers the general field of Artificial Immune Systems, and provides references regarding defintinios, overview of the field and conceptual frameworks. Section 3 considers those works inspired by the self-nonself principle refered to as the Negative Selection Paradigm. Section 4 considers those works inspired by Burnet's clonal selection theory refered to as the Clonal Selection Paradigm. Section 5 considers those works inspired by Jerne's immune network theory refered to as the Immune Network Paradigm. Section 6 considers those works inspired by Matzinger's danger theory refered to as the Danger Signal Paradigm. Section 7 considers those works that do not fit neatly into the selected organisation of the field. Finally, Section 8 considers works that focus on the distributed concerns of the immune system. The field of Artificial Immune Systems (AIS) is primarily concerned with the development, demonstration, and implementation of computational tools inspired by principles and processes of the vertebrate immune system. 
The field of artificial immune systems is not concerned with the theoretical or computational modelling of the biological system. Standard definitions of both the field of AIS and what comprises an artificial immune system follow. The authors de Castro and Von Zuben [210] define an AIS as " … a computational system based upon metaphors of the biological immune system ". They go on to define Immune Engineering (IE) as " … a meta-synthesis process that uses the information contained in the problem itself to define the solution tool to a given problem, and then apply it to obtain the problem solution ". The distinction is the IE is the engineering process, differentiated as to the conventional engineering process that uses ideas …

9 citations


Cites background or methods from "A generic framework for population-..."

  • ...A distributed framework for multiple search algorithms (applied to FPGA) [153] - Distributed algorithm for searching P2P networks (ImmuneSearch) [245] - Multi-agent system for breaking up a problem (distributed problem solving) (immune network and MHC) applied to TSP (n-TSP) [242] -...

    [...]

  • ...[153] John Newborough and Susan Stepney, "A Generic Framework for Population-Based Algorithms, Implemented on Multiple FPGAs," Artificial Immune Systems: 4th International Conference, ICARIS 2005, Banff, Canada, pp....

    [...]

  • ...Summary References Conceptual framework for AIS Framework [310] Application of framework [153] Seeking a more complete model [241] More on the theme [309]...

    [...]

  • ...[153] John Newborough and Susan Stepney, "A Generic Framework for Population-Based Algorithms, Implemented on Multiple FPGAs," Artificial Immune Systems: 4th International Conference, ICARIS 2005, Banff, Canada, pp. 43-55, 2005....

    [...]

  • ...- AIS as a context aware ubiquitous distributed system (immune network) [259] - A distributed framework for multiple search algorithms (applied to FPGA) [153] - Distributed algorithm for searching P2P networks (ImmuneSearch) [245] - Multi-agent system for breaking up a problem (distributed problem solving) (immune network and MHC) applied to TSP (n-TSP) [242] - An agent-based intrusion detection system (Dasgupta, Gonzalez) [66] o A very similar piece of work on agent-based intrusion detection by the same author [68] - Robot control (autonomous, distributed, etc....

    [...]

Dissertation
09 Dec 2010
TL;DR: It is found that the dendritic cell algorithm has a hitherto unknown frequency-dependent component, making it ideal for filtering out sensor noise, and it is concluded that traditional machine learning approaches are likely to outperform the implemented system in its current form.
Abstract: The implementation and running of physical security systems is costly and potentially hazardous for those employed to patrol areas of interest. From a technical perspective, the physical security problem can be seen as minimising the probability that intruders and other anomalous events will occur unobserved. A robotic solution is proposed using an artificial immune system, traditionally applied to software security, to identify threats and hazards: the dendritic cell algorithm. It is demonstrated that the migration from the software world to the hardware world is achievable for this algorithm, and key properties of the resulting system are explored empirically and theoretically. It is found that the algorithm has a hitherto unknown frequency-dependent component, making it ideal for filtering out sensor noise. Weaknesses of the algorithm are also discovered by mathematically phrasing the signal processing phase as a collection of linear classifiers. It is concluded that traditional machine learning approaches are likely to outperform the implemented system in its current form. However, it is also observed that the algorithm's inherent filtering characteristics make modification, rather than rejection, the most beneficial course of action. Hybridising the dendritic cell algorithm with more traditional machine learning techniques, through the introduction of a training phase and a non-linear classification phase, is suggested as a possible future direction.
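The abstract's observation that the signal-processing phase can be phrased as a collection of linear classifiers can be illustrated with a small sketch: each output signal is a weighted sum of the input signals, so the anomaly decision reduces to a linear boundary. The weight values and function names below are illustrative assumptions, not the published DCA weights.

```python
# Sketch: the DCA signal-processing phase viewed as linear classifiers.
# Weight values here are illustrative, not the canonical published weights.

# Each output (costimulation, semi-mature, mature) is a weighted sum of
# the inputs (PAMP, danger, safe) -- one linear classifier per output.
WEIGHTS = {
    "csm":    (2.0, 1.0, 2.0),
    "semi":   (0.0, 0.0, 3.0),
    "mature": (2.0, 1.0, -3.0),
}

def process_signals(pamp, danger, safe):
    """Map raw input signals to the three output signals."""
    inputs = (pamp, danger, safe)
    return {name: sum(w * x for w, x in zip(ws, inputs))
            for name, ws in WEIGHTS.items()}

def is_anomalous(pamp, danger, safe):
    # Anomaly when the mature output dominates the semi-mature output:
    # a linear decision boundary in the input-signal space.
    out = process_signals(pamp, danger, safe)
    return out["mature"] > out["semi"]
```

Phrased this way, the classification behaviour can be analysed with standard linear-classifier tools, which is how the dissertation exposes the algorithm's weaknesses.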

9 citations

01 Jan 2008
TL;DR: This investigation was motivated by three open problems in the broader field of Artificial Immune Systems, specifically the perceived impasse in the development, identity, and application of the field, the promise of distributed information processing, and the need for a framework to motivate such work.
Abstract: The development of adaptive and intelligent computational methods is an important frontier in the field of Artificial Intelligence, given the fragility of top-down software solutions to complex problems involving incomplete information. This dissertation describes a systematic investigation of the Clonal Selection Theory of acquired immunity as a motivating information processing metaphor of a series of adaptive and distributed Computational Intelligence algorithms. The broader structure and function of the mammalian immune system is used to frame the cellular theory and classically inspired approaches, providing the additional distributed perspectives of a 'host of tissues' called the Tissue Paradigm and a 'population of hosts' called the Hosts Paradigm. This investigation was motivated by three open problems in the broader field of Artificial Immune Systems, specifically the perceived impasse in the development, identity, and application of the field, the promise of distributed information processing, and the need for a framework to motivate such work. The state of Clonal Selection Algorithms is investigated in the context of immunological theory, and considered in the context of broader related machine learning fields and adaptive systems theory. A systematic approach is adopted in considering the adaptive qualities of clonal selection beyond a cellular perspective, involving the identification of the lymphatic system and lymphocyte migration as a motivating metaphor for intra-host distributed systems, and host immunisation and evolutionary immunology as a motivating metaphor for intra-population distributed system design. Relevant immunophysiology and theory was reviewed, abstracted to computational models and algorithms, and systematically assessed on model pattern recognition problems to demonstrate and verify expected information processing capabilities.
The empirical investigation reveals a variety of tissue and host based clonal selection systems capable of acquiring distributed information via internal processes of controlled localisation and dissemination, in a decentralised information exposure environment. The general capabilities of Clonal Selection Algorithms as a Computational Intelligence paradigm are defined in the context of a detailed assessment of the suitability of the approaches to the important problem primitives of Function Optimisation and Function Approximation. The findings highlight the general capabilities of the approaches as mutation-based parallel hill climbers for global optimisation and prototype quantisation approaches to function approximation. Finally, an Integrated Hierarchical Clonal Selection Framework demonstrates the subsumed relationship between the cellular, tissue, and host classes of algorithm, the dependent relationship with the complexity of the abstract antigenic environment addressed by the system, and a general scaffold for a broader class of distributed artificial immune systems.
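The dissertation characterises clonal selection algorithms as mutation-based parallel hill climbers for global optimisation. A minimal CLONALG-style sketch of that view, with illustrative (assumed) parameter values and function names:

```python
# Sketch of a CLONALG-style clonal selection loop, illustrating the
# "mutation-based parallel hill climber" characterisation. All parameter
# values (population size, clone count, mutation scale) are illustrative.
import random

def clonal_selection(fitness, dim=5, pop=10, clones=5, gens=50, seed=0):
    rng = random.Random(seed)
    population = [[rng.uniform(-1, 1) for _ in range(dim)]
                  for _ in range(pop)]
    for _ in range(gens):
        new_pop = []
        for cell in population:
            # Clone each cell, hypermutate the clones, keep the best:
            # each lineage acts as an independent (parallel) hill climber.
            candidates = [cell] + [
                [x + rng.gauss(0, 0.1) for x in cell]
                for _ in range(clones)
            ]
            new_pop.append(max(candidates, key=fitness))
        population = new_pop
    return max(population, key=fitness)

# Example: maximise a simple unimodal function (peak at the origin).
best = clonal_selection(lambda v: -sum(x * x for x in v))
```

Because selection here is per-lineage rather than global, no lineage is ever lost, which is exactly the "parallel hill climbing" behaviour: each cell climbs its own local gradient under blind mutation.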

8 citations


Cites background or result from "A generic framework for population-..."

  • ...[304] provide a framework that suggested that all population algorithms (including clonal selection) may be treated in a similar manner....

    [...]

  • ...Newborough and Stepney proposed a generic framework for population based algorithms based on a general biologically-inspired conceptual framework [304] (considered further in Section 2....

    [...]

References
Journal ArticleDOI
TL;DR: It is suggested that input and output are basic primitives of programming and that parallel composition of communicating sequential processes is a fundamental program structuring method.
Abstract: This paper suggests that input and output are basic primitives of programming and that parallel composition of communicating sequential processes is a fundamental program structuring method. When combined with a development of Dijkstra's guarded command, these concepts are surprisingly versatile. Their use is illustrated by sample solutions of a variety of familiar programming exercises.

11,419 citations
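The abstract's central idea, sequential processes composed in parallel that interact only through explicit input and output, can be sketched loosely with Python threads and queues. The process names and the pipeline are illustrative, and Python queues are asynchronous buffers rather than CSP's synchronous channels.

```python
# Loose sketch of CSP-style structuring: two sequential processes
# composed in parallel, communicating only via message passing.
import queue
import threading

def producer(out):
    for i in range(5):
        out.put(i)          # output primitive: send a value downstream
    out.put(None)           # sentinel marking end of stream

def doubler(inp, out):
    while (v := inp.get()) is not None:   # input primitive: receive
        out.put(v * 2)
    out.put(None)

a, b = queue.Queue(), queue.Queue()
threads = [threading.Thread(target=producer, args=(a,)),
           threading.Thread(target=doubler, args=(a, b))]
for t in threads:
    t.start()
for t in threads:
    t.join()

results = []
while (v := b.get()) is not None:
    results.append(v)
# results == [0, 2, 4, 6, 8]
```

Each process is purely sequential internally; all coordination happens at the communication points, which is the structuring discipline the paper argues for.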

Book
01 Jan 1996
TL;DR: An Introduction to Genetic Algorithms focuses in depth on a small set of important and interesting topics -- particularly in machine learning, scientific modeling, and artificial life -- and reviews a broad span of research, including the work of Mitchell and her colleagues.
Abstract: From the Publisher: "This is the best general book on Genetic Algorithms written to date. It covers background, history, and motivation; it selects important, informative examples of applications and discusses the use of Genetic Algorithms in scientific models; and it gives a good account of the status of the theory of Genetic Algorithms. Best of all the book presents its material in clear, straightforward, felicitous prose, accessible to anyone with a college-level scientific background. If you want a broad, solid understanding of Genetic Algorithms -- where they came from, what's being done with them, and where they are going -- this is the book. -- John H. Holland, Professor, Computer Science and Engineering, and Professor of Psychology, The University of Michigan; External Professor, the Santa Fe Institute. Genetic algorithms have been used in science and engineering as adaptive algorithms for solving practical problems and as computational models of natural evolutionary systems. This brief, accessible introduction describes some of the most interesting research in the field and also enables readers to implement and experiment with genetic algorithms on their own. It focuses in depth on a small set of important and interesting topics -- particularly in machine learning, scientific modeling, and artificial life -- and reviews a broad span of research, including the work of Mitchell and her colleagues. The descriptions of applications and modeling projects stretch beyond the strict boundaries of computer science to include dynamical systems theory, game theory, molecular biology, ecology, evolutionary biology, and population genetics, underscoring the exciting "general purpose" nature of genetic algorithms as search methods that can be employed across disciplines. An Introduction to Genetic Algorithms is accessible to students and researchers in any scientific discipline. 
It includes many thought and computer exercises that build on and reinforce the reader's understanding of the text. The first chapter introduces genetic algorithms and their terminology and describes two provocative applications in detail. The second and third chapters look at the use of genetic algorithms in machine learning (computer programs, data analysis and prediction, neural networks) and in scientific models (interactions among learning, evolution, and culture; sexual selection; ecosystems; evolutionary activity). Several approaches to the theory of genetic algorithms are discussed in depth in the fourth chapter. The fifth chapter takes up implementation, and the last chapter poses some currently unanswered questions and surveys prospects for the future of evolutionary computation.

9,933 citations

Book
01 Jan 1985

9,210 citations

Journal ArticleDOI
TL;DR: An Introduction to Genetic Algorithms as discussed by the authors is one of the rare examples of a book in which every single page is worth reading, and the author, Melanie Mitchell, manages to describe in depth many fascinating examples as well as important theoretical issues.
Abstract: An Introduction to Genetic Algorithms is one of the rare examples of a book in which every single page is worth reading. The author, Melanie Mitchell, manages to describe in depth many fascinating examples as well as important theoretical issues, yet the book is concise (200 pages) and readable. Although Mitchell explicitly states that her aim is not a complete survey, the essentials of genetic algorithms (GAs) are contained: theory and practice, problem solving and scientific models, a "Brief History" and "Future Directions." Her book is both an introduction for novices interested in GAs and a collection of recent research, including hot topics such as coevolution (interspecies and intraspecies), diploidy and dominance, encapsulation, hierarchical regulation, adaptive encoding, interactions of learning and evolution, self-adapting GAs, and more. Nevertheless, the book focuses more on machine learning, artificial life, and modeling evolution than on optimization and engineering.

7,098 citations


"A generic framework for population-..." refers background in this paper

  • ...For example, Evolutionary Algorithms (EA) are based on analogy to populations of organisms mutating, breeding and selecting to become “fitter” [Mitchell 1996]....

    [...]

  • ...© Springer-Verlag Berlin Heidelberg 2005...

    [...]

BookDOI
01 Jan 1999
TL;DR: This chapter discusses Ant Foraging Behavior, Combinatorial Optimization, and Routing in Communications Networks, and their application to Data Analysis and Graph Partitioning.
Abstract: 1. Introduction 2. Ant Foraging Behavior, Combinatorial Optimization, and Routing in Communications Networks 3. Division of Labor and Task Allocation 4. Cemetery Organization, Brood Sorting, Data Analysis, and Graph Partitioning 5. Self-Organization and Templates: Application to Data Analysis and Graph Partitioning 6. Nest Building and Self-Assembling 7. Cooperative Transport by Insects and Robots 8. Epilogue

5,822 citations