
A comparison between cellular encoding and direct encoding for genetic neural networks

TL;DR
This paper compares the efficiency of two encoding schemes for Artificial Neural Networks optimized by evolutionary algorithms and solves a more difficult problem: balancing two poles when no information about the velocity is provided as input.
Abstract
This paper compares the efficiency of two encoding schemes for Artificial Neural Networks optimized by evolutionary algorithms. Direct Encoding encodes the weights for an a priori fixed neural network architecture. Cellular Encoding encodes both the weights and the architecture of the neural network. In previous studies, Direct Encoding and Cellular Encoding have been used to create neural networks for balancing one and two poles attached to a cart on a fixed track. The poles are balanced by a controller that pushes the cart to the left or the right. In some cases velocity information about the pole and cart is provided as an input; in other cases the network must learn to balance a single pole without velocity information. A careful study of the behavior of these systems suggests that it is possible to balance a single pole with velocity information as an input and without learning to compute the velocity. A new fitness function is introduced that forces the neural network to compute the velocity. By using this new fitness function and tuning the syntactic constraints used with Cellular Encoding, we achieve a tenfold speedup over our previous study and solve a more difficult problem: balancing two poles when no velocity information is provided as input.
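
To make the setup concrete, the following is a minimal Python sketch, not the authors' code: standard cart-pole dynamics, a fixed recurrent network whose flat weight vector is the direct-encoded genome, and an evaluation that withholds the velocities so the controller must infer them from its recurrent state. The paper's new fitness function goes further, also rewarding runs that end with small cart and pole displacements and velocities so that a controller cannot score well by merely oscillating; the sketch keeps only the simpler time-balanced term. All names and constants below are illustrative.

    import math, random

    # Standard single-pole cart-pole dynamics (Euler integration); the
    # constants follow the common benchmark formulation.
    G, M_CART, M_POLE, L, FORCE, DT = 9.8, 1.0, 0.1, 0.5, 10.0, 0.02

    def step(state, push_right):
        x, x_dot, th, th_dot = state
        f = FORCE if push_right else -FORCE
        m = M_CART + M_POLE
        tmp = (f + M_POLE * L * th_dot ** 2 * math.sin(th)) / m
        th_acc = (G * math.sin(th) - math.cos(th) * tmp) / (
            L * (4.0 / 3.0 - M_POLE * math.cos(th) ** 2 / m))
        x_acc = tmp - M_POLE * L * th_acc * math.cos(th) / m
        return (x + DT * x_dot, x_dot + DT * x_acc,
                th + DT * th_dot, th_dot + DT * th_acc)

    def act(w, obs, h):
        # One recurrent hidden unit: integrating the position inputs over
        # time is what lets the network recover the withheld velocities.
        h = math.tanh(w[0] * obs[0] + w[1] * obs[1] + w[2] * h + w[3])
        out = w[4] * obs[0] + w[5] * obs[1] + w[6] * h + w[7]
        return out > 0.0, h

    def fitness(w, max_steps=1000):
        state, h = (0.0, 0.0, 0.05, 0.0), 0.0   # slight initial pole tilt
        for t in range(max_steps):
            push, h = act(w, (state[0], state[2]), h)   # positions only
            state = step(state, push)
            if abs(state[0]) > 2.4 or abs(state[2]) > 0.21:  # ~12 degrees
                return t
        return max_steps

    # Direct Encoding: the genome is nothing but the weight vector of this
    # fixed 2-input, 1-hidden, 1-output topology.
    def evolve(pop_size=50, gens=100):
        pop = [[random.uniform(-1, 1) for _ in range(8)]
               for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness, reverse=True)
            elite = pop[:pop_size // 5]
            pop = elite + [[wi + random.gauss(0, 0.2)
                            for wi in random.choice(elite)]
                           for _ in range(pop_size - len(elite))]
        return max(pop, key=fitness)

Cellular Encoding removes exactly the fixed-topology assumption above: its genome is a program of graph-rewriting instructions that grows both the architecture and the weights, so the search is not tied to a hand-picked network.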


Citations
Journal Article

Deep learning in neural networks

TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, reviewing deep supervised learning, unsupervised learning, reinforcement learning and evolutionary computation, and indirect search for short programs encoding deep and large networks.
Journal Article

Evolving neural networks through augmenting topologies

TL;DR: NeuroEvolution of Augmenting Topologies (NEAT) employs a principled method of crossover between different topologies, protects structural innovation using speciation, and incrementally grows networks from minimal structure.
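
For illustration, NEAT's speciation rests on a compatibility distance between genomes, delta = c1*E/N + c2*D/N + c3*W, where E and D count excess and disjoint genes, W is the mean weight difference of matching genes, and N is the size of the larger genome. A sketch, under the simplifying assumption that genomes are dicts mapping innovation number to connection weight:

    def compatibility(g1, g2, c1=1.0, c2=1.0, c3=0.4):
        # g1, g2: non-empty dicts {innovation number: connection weight}.
        shared = g1.keys() & g2.keys()
        w_bar = (sum(abs(g1[i] - g2[i]) for i in shared) / len(shared)
                 if shared else 0.0)
        cutoff = min(max(g1), max(g2))       # genes numbered past the
        unshared = g1.keys() ^ g2.keys()     # shorter genome are "excess"
        excess = sum(1 for i in unshared if i > cutoff)
        disjoint = len(unshared) - excess
        n = max(len(g1), len(g2))
        return c1 * excess / n + c2 * disjoint / n + c3 * w_bar

Genomes within a threshold distance of a species representative compete mainly among themselves, which is what lets a structural innovation survive long enough to have its weights optimized.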
Journal Article

Neuroevolution: from architectures to learning

TL;DR: This paper gives an overview of the most prominent methods for evolving ANNs with a special focus on recent advances in the synthesis of learning architectures.
Journal Article

A hypercube-based encoding for evolving large-scale neural networks

TL;DR: The main conclusion is that the ability to explore the space of regular connectivity patterns opens up a new class of complex high-dimensional tasks to neuroevolution.
Journal Article

Compositional pattern producing networks: A novel abstraction of development

TL;DR: Results produced with CPPNs through interactive evolution of two-dimensional images show that such an encoding can produce structural motifs often attributed to more conventional developmental abstractions, suggesting that local interaction may not be as essential to the desirable properties of natural encodings as is usually assumed.
References
Book

Genetic Programming: On the Programming of Computers by Means of Natural Selection

TL;DR: This book discusses the evolution of architecture, primitive functions, terminals, sufficiency, and closure, and the role of representation and the lens effect in genetic programming.
Journal Article

The parallel genetic algorithm as function optimizer

TL;DR: The parallel genetic algorithm (PGA) is applied to the optimization of continuous functions and finds the global minimum of Rastrigin's function in dimension 400 on a 64-processor system.
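
For context, Rastrigin's function is a standard highly multimodal benchmark: in n dimensions, f(x) = 10n + sum over i of (x_i^2 - 10 cos(2 pi x_i)), with global minimum 0 at the origin surrounded by an exponential number of regularly spaced local minima. A direct transcription:

    import math

    def rastrigin(x):
        # Global minimum f(0, ..., 0) = 0; the cosine term creates a
        # regular lattice of local minima that traps local optimizers.
        return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v)
                                 for v in x)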
Journal Article

Automatic definition of modular neural networks

TL;DR: This work presents an artificial developmental system that is a computationally efficient technique for the automatic generation of complex artificial neural networks (ANNs), together with simulation results showing that the same problem cannot be solved if the mechanism for automatic definition of subnetworks is suppressed.

Selection in Massively Parallel Genetic Algorithms.

TL;DR: This paper characterizes the difference between panmictic and local selection/mating schemes in terms of diversity of alleles, diversity of genotypes, inbreeding, and the speed and robustness of the genetic algorithm.
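
The panmictic/local distinction is easy to state in code. A sketch under illustrative assumptions (tournament selection on a one-dimensional ring; the function names are my own): a panmictic scheme lets any two individuals mate, while a local scheme draws mates only from a small neighborhood, which slows the spread of dominant genotypes and preserves allele diversity longer.

    import random

    def panmictic_parents(fit, k=2):
        # Tournament over the whole population: any individuals can mate.
        n = len(fit)
        return [max(random.sample(range(n), k), key=lambda i: fit[i])
                for _ in range(2)]

    def local_parents(fit, i, radius=2, k=2):
        # Mates come only from a small ring neighborhood around index i.
        n = len(fit)
        hood = [(i + d) % n for d in range(-radius, radius + 1)]
        return [max(random.sample(hood, k), key=lambda j: fit[j])
                for _ in range(2)]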