
Showing papers in "Artificial Life in 1997"


Journal ArticleDOI
TL;DR: An Introduction to Genetic Algorithms as discussed by the authors is one of the rare examples of a book in which every single page is worth reading, and the author, Melanie Mitchell, manages to describe in depth many fascinating examples as well as important theoretical issues.
Abstract: An Introduction to Genetic Algorithms is one of the rare examples of a book in which every single page is worth reading. The author, Melanie Mitchell, manages to describe in depth many fascinating examples as well as important theoretical issues, yet the book is concise (200 pages) and readable. Although Mitchell explicitly states that her aim is not a complete survey, the essentials of genetic algorithms (GAs) are contained: theory and practice, problem solving and scientific models, a "Brief History" and "Future Directions." Her book is both an introduction for novices interested in GAs and a collection of recent research, including hot topics such as coevolution (interspecies and intraspecies), diploidy and dominance, encapsulation, hierarchical regulation, adaptive encoding, interactions of learning and evolution, self-adapting GAs, and more. Nevertheless, the book focuses more on machine learning, artificial life, and modeling evolution than on optimization and engineering.

7,098 citations


Journal ArticleDOI
TL;DR: These simulations show that the evolutionary component of Echo makes a significant contribution to its behavior and that Echo shows good qualitative agreement with naturally occurring species abundance distributions and species-area scaling relations.
Abstract: Echo is a generic ecosystem model in which evolving agents are situated in a resource-limited environment. The Echo model is described, and the behavior of Echo is evaluated on two well-studied measures of ecological diversity: relative species abundance and the species-area scaling relation. In simulation experiments, these measures are used to compare the behavior of Echo with that of a neutral model, in which selection on agent genotypes is random. These simulations show that the evolutionary component of Echo makes a significant contribution to its behavior and that Echo shows good qualitative agreement with naturally occurring species abundance distributions and species-area scaling relations.

98 citations
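
The comparison the abstract describes, an evolving model judged against a neutral baseline on relative species abundance, can be illustrated with a toy drift model. The sketch below is not Echo; it only shows how a neutral baseline, in which reproduction is blind to genotype, produces a rank-abundance curve that an evolving model can be compared against. All parameters are invented for the example.

```python
import random
from collections import Counter

def neutral_community(n_agents=200, steps=20000, mutation=0.01):
    """Neutral drift: each step one random agent dies and is replaced by a
    copy of another random agent; with small probability the copy mutates
    into a brand-new species (selection is blind to genotype)."""
    next_id = 1
    community = [0] * n_agents
    for _ in range(steps):
        victim = random.randrange(n_agents)
        parent = random.randrange(n_agents)
        if random.random() < mutation:
            community[victim] = next_id
            next_id += 1
        else:
            community[victim] = community[parent]
    return community

def rank_abundance(community):
    """Relative species abundance, sorted from commonest to rarest."""
    counts = sorted(Counter(community).values(), reverse=True)
    total = sum(counts)
    return [c / total for c in counts]

if __name__ == "__main__":
    print(rank_abundance(neutral_community()))
```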


Journal ArticleDOI
Eric Bonabeau
TL;DR: It is argued in this article that the notion of agent-based pattern formation, which is introduced and exemplified, can serve as a basis to study pattern formation in nature and can certainly be derived from existing theories of pattern formation.
Abstract: An extremely large body of theoretical work exists on pattern formation, but very few experimental results have confirmed the relevance of theoretical models. It is argued in this article that the notion of agent-based pattern formation, which is introduced and exemplified, can serve as a basis to study pattern formation in nature, especially because pattern-forming systems based on agents are (relatively) more easily amenable to experimental observations. Moreover, understanding agent-based pattern formation is a necessary step if one wishes to design distributed artificial pattern-forming systems. But, to achieve this goal, a theory of agent-based pattern formation is needed. This article suggests that it can certainly be derived from existing theories of pattern formation.

91 citations


Journal ArticleDOI
TL;DR: The article contains an informal presentation as well as the formal definition of the model, presents some properties of variants of eco-grammar systems, and discusses the emergence of important lifelike features such as birth and death.
Abstract: A formal framework for studying systems made up of a community of agents and their environment is proposed. The suggested model, technically based on the theory of formal grammars and called an eco-grammar system, captures some common features of ecological, economic, social, and collective robotic systems. The article contains an informal presentation as well as the formal definition of the model, presents some properties of variants of eco-grammar systems, and discusses the emergence of important lifelike features such as birth and death. Emphasis is put on results with relevance for artificial life. Some recent developments are also briefly reported.

68 citations
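
As a rough illustration of the flavor of such grammar-based models (not the formal definition given in the article), the toy below rewrites an environment string with parallel 0L-style rules while simple agents, each carrying a state symbol, rewrite one environment symbol per step according to their own tables. Every rule and symbol here is invented for the example.

```python
import random

# Environment: parallel (0L-style) rewriting of every symbol each step.
ENV_RULES = {"a": "ab", "b": "a"}          # invented rules

# Agents: (state, observed env symbol) -> (new state, symbol written back).
AGENT_RULES = {
    ("hungry", "a"): ("fed", "b"),
    ("hungry", "b"): ("hungry", "b"),
    ("fed", "a"): ("fed", "a"),
    ("fed", "b"): ("hungry", "a"),
}

def step(env, agents):
    env = list(env)
    # 1. Each agent acts on one randomly chosen environment position.
    new_agents = []
    for state in agents:
        pos = random.randrange(len(env))
        state, written = AGENT_RULES[(state, env[pos])]
        env[pos] = written
        new_agents.append(state)
    # 2. The environment then rewrites itself in parallel.
    env = "".join(ENV_RULES[s] for s in env)
    return env, new_agents

if __name__ == "__main__":
    env, agents = "aab", ["hungry", "fed"]
    for t in range(5):
        env, agents = step(env, agents)
        print(t, env[:40], agents)
```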


Journal ArticleDOI
TL;DR: This article investigates one crucial aspect of brokers' dynamical behavior, their price-setting mechanisms, in the context of a simple information-filtering economy, and shows that the system's dynamical behavior in such myopic cases is generally an unending cycle of disastrous competitive wars in price/product space.
Abstract: One scenario of the future of computation populates the Internet with vast numbers of software agents providing, trading, and using a rich variety of information goods and services in an open, free-market economy. An essential task in such an economy is the retailing or brokering of information: gathering it from the right producers and distributing it to the right consumers. This article investigates one crucial aspect of brokers' dynamical behavior, their price-setting mechanisms, in the context of a simple information-filtering economy. We consider only the simplest cases in which a broker sets its price and product parameters based solely on the system's current state, without explicit prediction of the future. Analytical and numerical results show that the system's dynamical behavior in such “myopic” cases is generally an unending cycle of disastrous competitive “wars” in price/product space. These in turn are directly attributable to the existence of multiple peaks in the brokers' profitability land...

57 citations
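
The "unending cycle of price wars" under myopic price setting has a familiar minimal analogue: two sellers who repeatedly undercut each other until price approaches cost, then jump back up. The sketch below is that textbook-style caricature, not the paper's information-filtering economy; the cost, monopoly price, and undercut step are all invented numbers.

```python
def myopic_best_response(rival_price, cost=1.0, monopoly_price=2.0, step=0.05):
    """Undercut the rival slightly whenever that is still profitable;
    once prices have been driven down to cost, jump back to the
    monopoly price and let the war start over."""
    if rival_price - step > cost:
        return rival_price - step
    return monopoly_price

def simulate(rounds=40):
    p_a, p_b = 2.0, 1.9
    history = []
    for t in range(rounds):
        if t % 2 == 0:                       # brokers take turns reacting
            p_a = myopic_best_response(p_b)
        else:
            p_b = myopic_best_response(p_a)
        history.append((round(p_a, 2), round(p_b, 2)))
    return history

if __name__ == "__main__":
    for prices in simulate():
        print(prices)
```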


Journal ArticleDOI
TL;DR: The origin of multicellular organisms and the mechanism of development in cell societies were studied by choosing a model with intracellular biochemical dynamics allowing for oscillations, cell-cell... as discussed by the authors.
Abstract: The origin of multicellular organisms and the mechanism of development in cell societies are studied by choosing a model with intracellular biochemical dynamics allowing for oscillations, cell–cell...

56 citations


Journal ArticleDOI
TL;DR: The emergence of collective strategies in a prey-predator system is studied and the strategy of random swarming encourages symbiosis in the sense that it is associated with a low extinction probability for the whole system.
Abstract: The emergence of collective strategies in a prey-predator system is studied. We use the term "collective" in the sense of the collective motion of defense or attack often found in behaviors of animal groups. In our prey-predator system, both prey and predators move around on a two-dimensional plane, interacting by playing a game; predators can score by touching the backside of a prey. Thresholds are assumed for the scores of both prey and predators. The species with the higher scores can reproduce more, and that with the lower scores will be diminished. As a result, strategies as collective motions are observed; these consist of rotating cluster motions, line formations, disordered but one-way marching, and random swarming. In particular, the strategy of random swarming encourages symbiosis in the sense that it is associated with a low extinction probability for the whole system.

48 citations
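
The scoring rule described, a predator scoring by touching the backside of a prey, comes down to a simple geometric test: contact within some radius, with the predator located behind the prey relative to the prey's heading. A minimal sketch of that test, with random motion standing in for the evolved strategies and all thresholds invented:

```python
import math
import random

class Agent:
    def __init__(self):
        self.x = random.uniform(0, 50)
        self.y = random.uniform(0, 50)
        self.heading = random.uniform(0, 2 * math.pi)
        self.score = 0

    def move(self, speed=1.0, turn_noise=0.4):
        self.heading += random.uniform(-turn_noise, turn_noise)
        self.x = (self.x + speed * math.cos(self.heading)) % 50
        self.y = (self.y + speed * math.sin(self.heading)) % 50

def predator_scores(predator, prey, contact_radius=1.5):
    """True if the predator touches the prey from behind: it is within
    contact range and lies in the half-plane opposite the prey's heading."""
    dx, dy = predator.x - prey.x, predator.y - prey.y
    if math.hypot(dx, dy) > contact_radius:
        return False
    behind = dx * math.cos(prey.heading) + dy * math.sin(prey.heading) < 0
    return behind

if __name__ == "__main__":
    prey = [Agent() for _ in range(20)]
    predators = [Agent() for _ in range(5)]
    for _ in range(500):
        for a in prey + predators:
            a.move()
        for pred in predators:
            for p in prey:
                if predator_scores(pred, p):
                    pred.score += 1
    print([pred.score for pred in predators])
```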


Journal ArticleDOI
Ginger Booth
TL;DR: An individual-based simulation system, named Gecko, is presented for modeling multiple species at multiple trophic levels, on a spatially explicit, continuous two-dimensional landscape, and results show promise for application in both theoretical and natural ecosystem modeling.
Abstract: An individual-based simulation system, named Gecko, is presented for modeling multiple species at multiple trophic levels, on a spatially explicit, continuous two-dimensional landscape. Biologically motivated rules are specified at an individual level, and resulting behaviors are observed at an ecosystem level. Individuals are represented by circles with free range on a resource-producing plane. These circles grow allometrically with biomass of fixed resources. Resource acquisition behaviors include competition by area overlap for producers, and movement based on perception and intent. Individual-level energetics are explicitly modeled with inefficient assimilation, resource transformation, and allometrically specified metabolic costs. Individual growth and reproduction requires a history of successful resource acquisition. Terrestrial producer, herbivore, and carnivore species classes are included, extensible to further classes. A grassland food chain model of "plants," "grasshoppers," and "spiders" is used to demonstrate ecosystem-level results of given individual-level behaviors. Ecosystem-level behaviors include a trophic cascade of indirect carnivore-producer interaction effects; stable persistence of all populations; a near-realistic biomass pyramid; and spatial competition and coexistence of multiple producer species. Initial Gecko results show promise for application in both theoretical and natural ecosystem modeling.

46 citations
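
Two of the mechanisms named here, competition by circle-area overlap and allometrically specified metabolic cost, are easy to write down. The sketch below is a loose reading, not Gecko itself: the radius and metabolic exponents and every constant are assumptions made for the example.

```python
import math

def overlap_area(x1, y1, r1, x2, y2, r2):
    """Area of intersection of two circles (standard lens formula)."""
    d = math.hypot(x2 - x1, y2 - y1)
    if d >= r1 + r2:
        return 0.0
    if d <= abs(r1 - r2):
        return math.pi * min(r1, r2) ** 2
    a1 = r1 * r1 * math.acos((d * d + r1 * r1 - r2 * r2) / (2 * d * r1))
    a2 = r2 * r2 * math.acos((d * d + r2 * r2 - r1 * r1) / (2 * d * r2))
    tri = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2)
                          * (d - r1 + r2) * (d + r1 + r2))
    return a1 + a2 - tri

class Producer:
    def __init__(self, x, y, biomass):
        self.x, self.y, self.biomass = x, y, biomass

    @property
    def radius(self):                 # assumed allometric scaling
        return 0.5 * self.biomass ** (2.0 / 3.0)

    def net_energy(self, others, production_rate=1.0, metabolic_coeff=0.3):
        area = math.pi * self.radius ** 2
        # Overlapped area is contested: assume half of it is lost to rivals.
        contested = sum(overlap_area(self.x, self.y, self.radius,
                                     o.x, o.y, o.radius)
                        for o in others if o is not self)
        intake = production_rate * max(area - 0.5 * contested, 0.0)
        metabolism = metabolic_coeff * self.biomass ** 0.75   # Kleiber-like
        return intake - metabolism

if __name__ == "__main__":
    plants = [Producer(0, 0, 4.0), Producer(1.5, 0, 2.0), Producer(10, 10, 1.0)]
    for p in plants:
        print(round(p.net_energy(plants), 2))
```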


Journal ArticleDOI
TL;DR: In this paper, a simple neural controller that is less complex than the controllers traditionally hypothesized for cricket phonotaxis and syllable rate preference was used to control recognition and choice behavior in female crickets.
Abstract: Behavioral experiments with crickets show that female crickets respond to male calling songs with syllable rates within a certain bandwidth only. We have made a robot model in which we implement a simple neural controller that is less complex than the controllers traditionally hypothesized for cricket phonotaxis and syllable rate preference. The simple controller, which had been successfully used with a slowed and simplified signal, is here demonstrated to function, using songs with identical parameters to those found in real male cricket song, using an analog electronic model of the peripheral auditory morphology of the female cricket as the sensor. We put the robot under the same experimental conditions as the female crickets, and it responds with phonotaxis to calling songs of real male Gryllus bimaculatus. Further, the robot only responds to songs with syllable rates within a bandwidth similar to the bandwidth found for crickets. By making polar plots of the heading direction of the robot, we obtain behavioral data that can be used in statistical analyses. These analyses show that there are statistically significant differences between the behavioral responses to calling songs with syllable rates within the bandwidth and calling songs with syllable rates outside the bandwidth. This gives the verification that the simple neural control mechanism (together with morphological auditory matched filtering) can account for the syllable rate preference found in female crickets. With our robot system, we can now systematically explore the mechanisms controlling recognition and choice behavior in the female cricket by experimental replication.

45 citations


Journal ArticleDOI
TL;DR: It has been shown that the evolution of linguistic diversity in vocabulary sharing will support cooperative behavior in a population of agents, and the possibility of using linguistic diversity in the field of distributed AI and robotics is examined.
Abstract: This article reports on the current state of our efforts to shed light on the origin and evolution of linguistic diversity using synthetic modeling and artificial life techniques. We construct a si...

27 citations


Journal ArticleDOI
TL;DR: The use of an evolutionary modeling approach to simulate the adaptation of mosquitoes and parasites to the available pesticides and drugs is introduced and suggests that adequate use of insecticides and drugs may reduce the occurrence of malaria in regions of low endemicity, although increased efforts would be necessary in the event of a climate change.
Abstract: As the resistance of the malaria parasite to antimalarial drugs continues to increase, as does that of the malarial mosquito to insecticides, the efficacy of efforts to control malaria in many tropical countries is diminishing. This trend, together with the projected consequences of climate change, may prove to exacerbate substantially the significance of malaria in the coming decades. In this article we introduce the use of an evolutionary modeling approach to simulate the adaptation of mosquitoes and parasites to the available pesticides and drugs. By coupling genetic algorithms with a dynamic malaria-epidemiological model, we derive a complex adaptive system capable of simulating adapting and evolving processes within both the mosquito and the parasite populations. This approach is used to analyze malaria management strategies appropriate to regions of higher and lower degrees of endemicity. The results suggest that adequate use of insecticides and drugs may reduce the occurrence of malaria in regions of low endemicity, although increased efforts would be necessary in the event of a climate change. However, our model indicates that in regions of high endemicity the use of insecticides and drugs may lead to an increase in incidence due to enhanced resistance development. Projected climate change, on the other hand, may lead to a limited reduction of the occurrence of malaria due to the presence of a higher percentage of immune persons in the older age class.
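
The coupling described here, a genetic algorithm for resistance evolution driven by a dynamic transmission model, can be caricatured in a few lines. The sketch below is only a schematic of that coupling, not the authors' model: the mosquito genome is a single resistance bit, the epidemiology is a crude discrete-time prevalence update, and every rate is invented.

```python
import random

def run(generations=60, pop_size=500, insecticide_coverage=0.7,
        mutation=0.001, transmission=0.002, recovery=0.1):
    # Mosquito genome: one bit, 1 = resistant to the insecticide (assumption).
    mosquitoes = [0] * pop_size
    prevalence = 0.2                       # fraction of infected humans
    for gen in range(generations):
        # Selection: the insecticide kills susceptibles far more often.
        survivors = [g for g in mosquitoes
                     if random.random() > insecticide_coverage * (0.9 if g == 0 else 0.1)]
        # Reproduction with mutation back up to constant population size.
        mosquitoes = []
        while len(mosquitoes) < pop_size and survivors:
            g = random.choice(survivors)
            if random.random() < mutation:
                g = 1 - g
            mosquitoes.append(g)
        # Crude epidemiological update: infection pressure scales with
        # the number of surviving mosquitoes.
        prevalence += (transmission * len(survivors) * (1 - prevalence)
                       - recovery * prevalence)
        prevalence = min(max(prevalence, 0.0), 1.0)
        if gen % 10 == 0:
            resistant = sum(mosquitoes) / max(len(mosquitoes), 1)
            print(f"gen {gen:3d}  resistant {resistant:.2f}  prevalence {prevalence:.2f}")

if __name__ == "__main__":
    run()
```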

Journal ArticleDOI
TL;DR: [Personetics:] At present a "world" for personoid "inhabitants" can be prepared in a matter of a couple of hours; this is the time it takes to feed into the machine one of the full-fledged programs.
Abstract: [Personetics:] At present a "world" for personoid "inhabitants" can be prepared in a matter of a couple of hours. This is the time it takes to feed into the machine one of the full-fledged programs.... (p. 168) A specific type of personoid activity serves as a triggering mechanism, setting in motion a production process that will gradually augment and define itself; in other words, the world surrounding these beings takes on an unequivocalness only in accordance with their own behavior.... (p. 172) As hundreds of experiments have shown, groups numbering from four to seven personoids are optimal, at least for the development of speech and typical exploratory activity, and also for "culturization." [Sociodynamics:] On the other hand, phenomena corresponding to social processes on a larger scale require larger groups. At present it is possible to "accommodate" up to one thousand personoids, roughly speaking, in a computer Universum of fair capacity.... (p. 177) There have arisen many different philosophies (ontologies and epistemologies), and also "metaphysical experiments" of a type all their own.... (p. 186) "I can enlarge their world or reduce it, speed up its time or slow it down, alter the mode and means of their perception; I can liquidate them, divide them, multiply them, transform the very ontological foundation of their existence. I am thus omnipotent with respect to them...." (p. 195)

Journal ArticleDOI
TL;DR: "virtual" strong alife amounts to the claim that, by programming a computer, one can literally bring bits of its hardware to life.
Abstract: This article concerns the claim that it is possible to create living organisms, not merely models that represent organisms, simply by programming computers ("virtual" strong alife). I ask what sort of things these computer-generated organisms are supposed to be (where are they, and what are they made of?). I consider four possible answers to this question: (a) The organisms are abstract complexes of pure information; (b) they are material objects made of bits of computer hardware; (c) they are physical processes going on inside the computer; and (d) they are denizens of an entire artificial world, different from our own, that the programmer creates. I argue that (a) could not be right, that (c) collapses into (b), and that (d) would make strong alife either absurd or uninteresting. Thus, "virtual" strong alife amounts to the claim that, by programming a computer, one can literally bring bits of its hardware to life.

Journal ArticleDOI
TL;DR: The key elements and relationships that must be incorporated or synthesized in an artificial life system if these transitions are to emerge are identified.
Abstract: A major challenge for artificial life is to synthesize the evolutionary transitions that have repeatedly formed differentiated higher-level entities from cooperative organizations of lower-level entities, producing the nested hierarchical structure of living processes. This article identifies the key elements and relationships that must be incorporated or synthesized in an artificial life system if these transitions are to emerge. The processes currently included in artificial life systems are unable to provide an adequate basis for the emergence of the complex cooperative organization that is essential to the transitions. A new theory of the evolution of cooperative organization is developed that points to the additional processes that must be included in artificial life systems to underpin the emergence of the transitions.

Journal ArticleDOI
TL;DR: It is suggested that the capacity for intelligent behavior shown by many behavior-based robots is similar to that of animals of the late Precambrian and early Cambrian periods approximately 530 to 565 million years ago.
Abstract: The study of trace fossils, the fossilized remains of animal behavior, reveals interesting parallels with recent research in behavior-based robotics. This article reports robot simulations of the meandering foraging trails left by early invertebrates that demonstrate that such trails can be generated by mechanisms similar to those used for robot wall-following. We conclude with the suggestion that the capacity for intelligent behavior shown by many behavior-based robots is similar to that of animals of the late Precambrian and early Cambrian periods approximately 530 to 565 million years ago.

Journal ArticleDOI
TL;DR: This research focuses upon the self-organization and evolution of lower levels of aquatic food webs, Gaian interactions between primitive organisms and their physical environments, and species interactions governed by varying life-history strategies.
Abstract: In the spirit of contemporary artificial life research, EUZONE provides a virtual laboratory for the emergence of complex ecosystems from simple primitives. However, whereas most alife systems abstract away many real-world environmental constraints, EUZONE employs detailed physical and chemical models in combination with evolutionary algorithms to support the emergence of carbon-based aquatic ecosystems. With an emphasis on planktonlike organisms, this research focuses upon the self-organization and evolution of (a) lower levels of aquatic food webs, (b) Gaian interactions between primitive organisms and their physical environments, and (c) species interactions governed by varying life-history strategies.

Journal ArticleDOI
TL;DR: It is argued that natural selection is necessarily in operation because sufficient conditions for its occurrence are met: replication, mutagenicity, and trait/fitness covariance.
Abstract: I introduce a new alife model, an ecology based on a corpus of text, and apply it to the analysis of posts to USENET News. In this corporal ecology posts are organisms, the newsgroups of NetNews define an environment, and human posters situated in their wider context make up a scarce resource. I apply latent semantic indexing (LSI), a text retrieval method based on principal component analysis, to distill from the corpus those replicating units of text. LSI arrives at suitable replicators because it discovers word co-occurrences that segregate and recombine with appreciable frequency. I argue that natural selection is necessarily in operation because sufficient conditions for its occurrence are met: replication, mutagenicity, and trait/fitness covariance. I describe a set of experiments performed on a static corpus of over 10,000 posts. In these experiments I study average population fitness, a fundamental element of population ecology. My study of fitness arrives at the unhappy discovery that a flame-war, centered around an overly prolific poster, is the king of the jungle.
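
Latent semantic indexing, as invoked here, rests on a standard linear-algebra step: a truncated singular value decomposition of a term-document matrix. The sketch below (plain NumPy, with a made-up toy corpus and an arbitrary choice of two latent dimensions) shows only that mechanical step; it is not the author's experimental setup.

```python
import numpy as np

# Toy "posts" standing in for NetNews articles (illustrative only).
posts = [
    "genetic algorithms evolve populations of solutions",
    "evolution of populations under natural selection",
    "flame war about posting etiquette",
    "another flame war about moderation and posting",
]

# Build a term-document count matrix.
vocab = sorted({w for p in posts for w in p.split()})
A = np.array([[p.split().count(w) for p in posts] for w in vocab], float)

# Truncated SVD: keep k latent dimensions (the LSI step).
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T    # each row: a post in latent space

# Cosine similarity between posts in the reduced space.
def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

for i in range(len(posts)):
    for j in range(i + 1, len(posts)):
        print(i, j, round(cos(doc_vecs[i], doc_vecs[j]), 2))
```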


Journal ArticleDOI
TL;DR: This proceedings of a conference held in San Diego, California, in February 1995 consists of 48 papers grouped under the following sessions: Novel Areas of Evolutionary Programming and Evolution Strategies, Evolutionary Computation with Medical Applications, Evolutionary Optimization, System Identification, Hierarchical Learning Methods, Self-Adaptation, Morphogenic Evolutionary Computation, VLSI and Part Placement Applications, Application to Biology and Biochemistry, Control Applications, and Genetic and Inductive Logic.
Abstract: This proceedings of a conference held in San Diego, California, in February 1995 consists of 48 papers grouped under the following sessions: Novel Areas of Evolutionary Programming and Evolution Strategies, Evolutionary Computation with Medical Applications, Evolutionary Optimization, System Identification, Hierarchical Learning Methods, Self-Adaptation, Morphogenic Evolutionary Computation, VLSI and Part Placement Applications, Application to Biology and Biochemistry, Control Applications, and Genetic and Inductive Logic. Although entitled "Evolutionary Programming IV, Proceedings of the Fourth Annual Conference on Evolutionary Programming," the Evolutionary Programming Society includes all of the evolutionary computation methods and represents current research in the area of evolutionary computation, which generally includes evolutionary programming (EP), genetic algorithms (GA), evolution strategies (ES), and genetic programming (GP). Evolutionary computation is inspired by the seemingly simplistic approach of nature and has become a powerful tool for the study of complex systems. Nature is quite adept at adapting multiple solutions to a single complex niche. The development of flight is just one example. Adaptation of flight can be found in both Animalia and Plantae across all phyla. Each taxon has adapted a unique method of flight. There is little similarity between the flying fish and the seed of a pine tree, except that they both have evolved the mastery of flight. There is no right or wrong solution to the phenotypic behavior of flight. From a scientific point of view, is the sparrow a better solution to flight than the fruit bat? While science and engineering focus on a single absolute truth, the study of complex systems shows that there exists a myriad of solutions to even the simplest of problems. Evolutionary computation is a statistical population-based approach to computational optimization. The general underlying algorithm is adaptation based: Randomly generate an initial population of individuals, consisting of the parameters to a particular problem. Evaluate each individual's parameters and determine a relative merit of the individual with respect to the objective. Rank the individuals within the current population by relative merit. Generate a new population by applying selection and reproductive pressure to the population. Several groups of researchers in the United States and Europe first simulated the process of evolution and applied the process to engineering problems, including Rechenberg [5] for evolution strategies, L. Fogel, Owens, and Walsh [3] with evolutionary programming, and Holland [4] for genetic algorithms. This research took place in the early 1960s, and primitive computers were not capable of fully exploiting the power of evolutionary computations. A revitalized interest in evolutionary computation was presented by D. Fogel [2] for evolutionary programming, De Jong [1] for genetic algorithms, and Schwefel [6] for evolution strategies.
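
The review's summary of the underlying algorithm (random initialization, evaluation, ranking, then selection and reproduction) can be made concrete with a minimal sketch. The Python below is a generic illustration, not taken from the proceedings; the sphere objective and all parameter values are arbitrary choices for the example.

```python
import random

# Toy objective: minimize the sphere function (smaller is better).
def evaluate(params):
    return sum(x * x for x in params)

def evolve(dim=5, pop_size=30, generations=100, mutation_sigma=0.1):
    # 1. Randomly generate an initial population of parameter vectors.
    population = [[random.uniform(-5, 5) for _ in range(dim)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # 2. Evaluate each individual's parameters against the objective.
        scored = [(evaluate(ind), ind) for ind in population]
        # 3. Rank individuals within the current population by relative merit.
        scored.sort(key=lambda pair: pair[0])
        parents = [ind for _, ind in scored[:pop_size // 2]]
        # 4. Generate a new population by applying selection and
        #    reproductive pressure (here: keep the better half, mutate copies).
        population = list(parents)
        while len(population) < pop_size:
            parent = random.choice(parents)
            child = [x + random.gauss(0, mutation_sigma) for x in parent]
            population.append(child)
    return min(population, key=evaluate)

if __name__ == "__main__":
    best = evolve()
    print("best fitness:", evaluate(best))
```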

Journal ArticleDOI
TL;DR: One of Edelman's early computer experiments, Darwin I, is revisited, and it is shown that adding replication greatly improves the adaptive power of the system.
Abstract: Neural Darwinism is a theory of cognition developed by Gerald Edelman along with George Reeke and Olaf Sporns at Rockefeller University. As its name suggests, neural Darwinism is modeled after biological Darwinism, and its authors assert that the two processes are strongly analogous. Both operate on variation in a population, amplifying the more adaptive individuals. However, from a computational perspective, neural Darwinism is quite different from other models of natural selection, such as genetic algorithms. The individuals of neural Darwinism do not replicate, thus robbing the process of the capacity to explore new solutions over time and ultimately reducing it to a random search. Because neural Darwinism does not have the computational power of a truly Darwinian process, it is misleading to label it as such. To illustrate this disparity in adaptive power, one of Edelman's early computer experiments, Darwin I, is revisited, and it is shown that adding replication greatly improves the adaptive power of the system.
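
The article's computational point, that selective amplification without replication cannot discover anything beyond its initial random sample, can be seen in a toy comparison. The sketch below is a generic illustration of that argument, not a re-implementation of Darwin I; the bit-counting task and all sizes are arbitrary.

```python
import random

TARGET_LEN = 40

def fitness(bits):
    # Toy task: count of 1s (any fixed target pattern would do).
    return sum(bits)

def random_individual():
    return [random.randint(0, 1) for _ in range(TARGET_LEN)]

def selection_only(pop_size=50, steps=200):
    """Amplification without replication: the candidate set is fixed at the
    start, so the best score can never exceed the best initial sample."""
    population = [random_individual() for _ in range(pop_size)]
    return max(fitness(ind) for ind in population)   # unchanged by 'steps'

def selection_with_replication(pop_size=50, steps=200, mut_rate=0.02):
    """Darwinian search: good individuals are copied with variation, so new,
    better variants can appear over time."""
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(steps):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        children = [[b ^ (random.random() < mut_rate) for b in parent]
                    for parent in parents]
        population = parents + children
    return max(fitness(ind) for ind in population)

if __name__ == "__main__":
    print("selection only:         ", selection_only())
    print("selection + replication:", selection_with_replication())
```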

Journal ArticleDOI
TL;DR: A spatially structured model of a coevolutionary predator-prey system with interactions in a one-dimensional phenotype space is considered, and it is shown that in phenotype space predators and prey organize themselves into distinct clusters of phenotypes called quasi-species.
Abstract: We consider a spatially structured model of a coevolutionary predator-prey system with interactions in a one-dimensional phenotype space. We show that in phenotype space predators and prey organize themselves into distinct clusters of phenotypes called quasi-species. The prey quasi-species also cluster in patches in real space. As the prey quasi-species evolve away from the predator quasi-species (in phenotype space) the prey patch size reduces and the single predator quasi-species is inhibited from evolving toward either of the two prey species. We show that it is the interaction between the phenotype space patterns (quasi-species) and the real space patterns (patches) that inhibits the predators from evolving.

Journal ArticleDOI
TL;DR: It is argued that greater expressive power is obtained using Kauffman networks and the new method was tested in the artificial evolution of morphology and finally successfully applied to the synthesis of structure in dynamical neural networks.
Abstract: Developmental genetic representations are schemes where each genotype encodes a program for the construction of a phenotype. A new method for contriving abstract genetic representations is presented, based upon Kauffman's ideas regarding biological development [13-16]. Phenogenesis is controlled by the genotype via cell replication and differentiation. Comparison is made with the earlier published methods of Gruau [9] and Kitano [18]. It is argued that greater expressive power is obtained using Kauffman networks. The new method was tested in the artificial evolution of morphology and finally successfully applied to the synthesis of structure in dynamical neural networks.

Journal ArticleDOI
Hideaki Suzuki
TL;DR: A novel system composed of multiple von Neumann computers and an appropriate problem environment is proposed and simulated, and crossover helps create novel advantageous sets of machine codes and evidently accelerates optimization by GAs.
Abstract: A novel system composed of multiple von Neumann computers and an appropriate problem environment is proposed and simulated. Each computer has a memory to store the machine instruction program, and when a program is executed, a series of machine codes in the memory is sequentially decoded, leading to register operations in the central processing unit (CPU). By means of these operations, the computer not only can handle its generally used registers but also can read and write the environmental database. Simulation is driven by genetic algorithms (GAs) performed on the population of program memories. Mutation and crossover create program diversity in the memory, and selection facilitates the reproduction of appropriate programs. Through these evolutionary operations, advantageous combinations of machine codes are created and fixed in the population one by one, and the higher function, which enables the computer to calculate an appropriate number from the environment, finally emerges in the program memory. In the latter half of the article, the performance of GAs on this system is studied. Under different sets of parameters, the evolutionary speed, which is determined by the time until the domination of the final program, is examined and the conditions for faster evolution are clarified. At an intermediate mutation rate and at an intermediate population size, crossover helps create novel advantageous sets of machine codes and evidently accelerates optimization by GAs.
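
The setup described, machine-code programs whose execution reads an environmental value and whose population is evolved by mutation, crossover, and selection, can be sketched with a toy linear register machine. Everything below (the instruction set, the target function, the fitness measure, all rates) is invented for illustration and is far simpler than the system in the article.

```python
import random

OPS = ["LOAD_ENV", "INC", "DEC", "DOUBLE", "NOP"]   # toy instruction set
PROG_LEN = 12

def execute(program, env_value):
    """Run a straight-line program on a single register; the result is what
    the machine would write back to the environment."""
    reg = 0
    for op in program:
        if op == "LOAD_ENV":
            reg = env_value
        elif op == "INC":
            reg += 1
        elif op == "DEC":
            reg -= 1
        elif op == "DOUBLE":
            reg *= 2
    return reg

def fitness(program, cases=range(1, 6)):
    # Toy task: compute 2 * env + 1 for each environmental value.
    return -sum(abs(execute(program, e) - (2 * e + 1)) for e in cases)

def mutate(program, rate=0.1):
    return [random.choice(OPS) if random.random() < rate else op
            for op in program]

def crossover(a, b):
    cut = random.randrange(1, PROG_LEN)
    return a[:cut] + b[cut:]

def evolve(pop_size=60, generations=200):
    population = [[random.choice(OPS) for _ in range(PROG_LEN)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 3]
        population = parents + [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(pop_size - len(parents))
        ]
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print(fitness(best), best)
```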

Journal ArticleDOI
TL;DR: This book is the proceedings of the fourth annual conference on evolutionary programming held in 1995 in San Diego and contains 44 articles that cover the fields of evolutionary programming, inductive logic programming, morphogenic evolutionary computation, evolutionary strategies, evolutionary optimization, medical engineering, pattern recognition, system identification, learning, self-adaptation, biology, biochemistry, and control.
Abstract: This book is the proceedings of the fourth annual conference on evolutionary programming held in 1995 in San Diego. It contains 44 articles, most of which are applications. They cover the fields of evolutionary programming (EP), inductive logic programming, morphogenic evolutionary computation, evolutionary strategies, evolutionary optimization, medical engineering, pattern recognition, system identification, learning, self-adaptation, biology, biochemistry, and control. Wide applications make this book very useful to practitioners facing real-world problems. The evolutionary methods, including EP, genetic algorithms (GA), evolutionary computation, and evolutionary strategy, are believed to be easy and powerful alternatives to the conventional mathematical and AI methods. In addition, the book is of great value because it can be used not only for solving problems but also for defining or finding problems. In other words, traditional methods are useful and precise for well-defined problems, but evolutionary methods have the advantage of handling ill-defined problems that we unfortunately come across in the practical fields so often. Several articles are fundamental. They, I expect, promote the evolutionary method in a sophisticated and rigorous...

Journal Article
TL;DR: A paradigm is proposed in which multiple von Neumann machines (MUltiple von Neumann Computers; MUNCs) evolve and create function; this approach enables GAs to direct evolutionary growth toward functional emergence that is useful for humans.
Abstract: For functions to emerge spontaneously in a multiagent system through evolution, the designer must refrain from designing agents to excess and must let agents behave as freely as possible based upon the rules within themselves. All the designer can do is implement the system architecture and specify the environments. The von Neumann machine, which is logically equivalent to the universal Turing machine, can do everything based upon its own rules (the program in its memory) and hence is the most suitable agent architecture for functional emergence. Such machines, after installation into appropriate environments and subsequent evolution, can attain higher functions in their programs. In all evolutionary adaptive systems, multiplication is the final purpose toward which agents strive. The emergence of agent functions is always directed toward this purpose; hence, when some resource helps multiplication, it becomes a target of acquisition for agents. In core-memory systems, in which a number of self-reproducing programs are confined to the same memory and compete with each other, CPU time and memory space facilitate reproduction; hence programs grow increasingly adept through evolution at obtaining these resources [1, 2]. Recently Adami [3] succeeded in imposing another task (adding two numbers) on core-memory agents by means of an additional bonus. A bonus is awarded to a program that fulfills the task prepared by the designer; it is effectively converted into CPU time and eventually facilitates self-reproduction. Though in this strategy the whole program is not devoted to the task imposed by the designer, the created program was composed of two parts, i.e., a self-reproducing part and a task-managing part, and met the requirements of both the program itself and the designer. This is one example of how to make the inherent purpose of an evolutionary system compatible with engineering demands. Genetic algorithms (GAs) [4, 5] are another method for imposing human requirements on agents more directly. In GAs, reproduction is not put in agents' hands but is carried out by a program prepared by the designer. Released from the duty of self-reproduction, the whole program of an agent is devoted to accomplishing the task requested by the designer. The fitness score, indicating how well each agent copes with the task, is used to determine its reproduction rate. This selection strategy enables GAs to direct evolutionary growth toward functional emergence that is useful for humans. In previous studies of GAs, however, the inner architecture of the agents under genetic operations was too simple to create higher functions. An agent is typically a bit sequence coding a solution to some specific problem: the bit string coding the route of a traveling salesman, the bit string describing the state transition matrix of a finite state automaton, and so on. Although in the artificial life approach a designer must refrain from designing agents to excess, we must at least implement their basic architecture so that they can create higher functions based upon it. In this paper, I propose a paradigm in which multiple von Neumann machines (MUltiple von Neumann Computers; MUNCs) evolve and create function. Each machine is a standard computer system with a hardware architecture consisting of a CPU and memory. (Unlike core-memory programs, the machines are not placed in the same memory but are prepared separately.) Each machine can independently read or write one common environmental database. GAs operate on the bit sequences (the binary representations of programs) in the memories of the machines and, through optimi...

Journal ArticleDOI
TL;DR: In this paper, the authors introduce the term artificial evolution, defined as the controlled micromanipulation of genetic information from one generation to the next, where the first variational step is engineered and the second selection step is insured by humankind.
Abstract: Most of us know about specific biotechnologies but may be less aware of the underlying process. This essay analyzes that process and speculates on its meaning. It introduces the term artificial evolution, here defined as the controlled micromanipulation of genetic information from one generation to the next, where the first variational step is engineered and the second selection step is insured by humankind. This is qualitatively different from natural evolution. The characteristics of this artificial mode of evolution are immediacy, as opposed to Darwin's law of gradualism; transclass descent, unlike Darwin's common descent; identity, as opposed to variety; and an artificial rate of mutational change, as opposed to a natural one. It constitutes evolution out of evolution, and redoubles our ethical responsibility for the future.