Open Access · Proceedings Article

The Cascade-Correlation Learning Architecture

TLDR
The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
Abstract
Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature-detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
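The abstract sketches the whole training loop: fit output weights on the current features, train a candidate unit against the remaining error, freeze its input weights, and repeat. Below is a minimal Python/numpy sketch of that loop for a single-output regression task; all names are illustrative, and the plain gradient-descent/ascent training here is a simplified stand-in for the Quickprop-trained pool of candidate units used in the paper.

```python
import numpy as np

def train_output_weights(H, y, lr=0.1, epochs=200):
    """Fit the linear output weights on the current feature matrix H."""
    w = np.zeros(H.shape[1])
    for _ in range(epochs):
        w -= lr * H.T @ (H @ w - y) / len(y)
    return w

def train_candidate(H, residual, lr=0.5, epochs=200):
    """Train one candidate unit to maximize the (absolute) correlation
    between its activation and the residual error at the output."""
    v = 0.1 * np.random.randn(H.shape[1])
    for _ in range(epochs):
        a = np.tanh(H @ v)                           # candidate activation
        s = np.sign(np.sum((a - a.mean()) * residual))
        grad = s * H.T @ ((residual - residual.mean()) * (1 - a ** 2))
        v += lr * grad / len(residual)               # gradient *ascent*
    return v

def cascade_correlation(X, y, max_hidden=5, tol=1e-3):
    H = np.hstack([X, np.ones((len(X), 1))])         # minimal net: inputs + bias
    w = train_output_weights(H, y)
    for _ in range(max_hidden):
        residual = y - H @ w
        if np.mean(residual ** 2) < tol:
            break
        v = train_candidate(H, residual)             # input weights frozen from here on
        H = np.hstack([H, np.tanh(H @ v)[:, None]])  # new unit sees every earlier unit
        w = train_output_weights(H, y)               # only output weights are retrained
    return w, H
```

Note how the cascade arises: each new unit receives input from every earlier hidden unit, so depth grows by one with each addition, and no error signal is ever propagated backward through the frozen connections.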



Citations
Book Chapter · DOI

Constructive Function Approximation

TL;DR: Three algorithms are applied to two tasks, measuring the error in the approximated function as learning proceeds, and issues of feature overlap, independence, and coverage are addressed.
Book Chapter · DOI

Learning Controllers for Industrial Robots

TL;DR: From the experimental comparison, it appears that both Fuzzy Controllers and RBFNs synthesised from examples are excellent approximators, and that, in practice, they can be even more accurate than MLPs.
Journal Article · DOI

Evaluation of constructive neural networks with cascaded architectures

TL;DR: Five constructive neural network algorithms are investigated: four are methods found in the literature and one is the authors' own recently developed algorithm, which most often produces the best performance among the algorithms investigated.
Journal Article · DOI

On the automated, evolutionary design of neural networks: past, present, and future

TL;DR: This work aims to provide a complete reference for all work to date on the neuroevolution of convolutional neural networks and, to the best of the authors' knowledge, is the first survey reviewing the literature in this field.
Book Chapter · DOI

The Legion System: A Novel Approach to Evolving Heterogeneity for Collective Problem Solving

TL;DR: It was found that the amount of heterogeneity evolved in an agent group depends on the given problem domain: for the first task, the Legion system evolved heterogeneous groups; for the second task, primarily homogeneous groups evolved.
References
Book Chapter · DOI

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
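Since this chapter is the source of the generalized delta rule, a hedged one-hidden-layer sketch of that rule may help; the logistic activation and squared-error setup below match the chapter, while the shapes and names are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def generalized_delta_rule_step(x, t, W1, W2, lr=0.1):
    """One pattern presentation: forward pass, then propagate the error
    signal (delta) backward layer by layer to update both weight matrices."""
    h = sigmoid(W1 @ x)                           # hidden activations
    o = sigmoid(W2 @ h)                           # output activations
    delta_o = (t - o) * o * (1 - o)               # output-layer error signal
    delta_h = (W2.T @ delta_o) * h * (1 - h)      # error propagated to hidden layer
    W2 += lr * np.outer(delta_o, h)               # delta rule: lr * delta * input
    W1 += lr * np.outer(delta_h, x)
    return W1, W2
```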
Journal Article · DOI

Increased Rates of Convergence Through Learning Rate Adaptation

TL;DR: Presents a study of steepest descent, analyzes why it can be slow to converge, and proposes four heuristics for achieving faster rates of convergence.
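This entry is Jacobs' learning-rate-adaptation paper; one of its heuristics is the delta-bar-delta rule, sketched below in numpy under assumed constants (kappa, phi, and theta are illustrative values, not the paper's exact settings).

```python
import numpy as np

def delta_bar_delta_step(w, grad, lr, dbar, kappa=0.01, phi=0.5, theta=0.7):
    """One update with a separate learning rate per weight: grow a rate
    additively while the gradient agrees in sign with its running average,
    shrink it multiplicatively on a sign flip."""
    agree = dbar * grad
    lr = np.where(agree > 0, lr + kappa,            # consistent sign: speed up
         np.where(agree < 0, lr * (1 - phi), lr))   # sign flip: slow down
    dbar = (1 - theta) * grad + theta * dbar        # exponential average of gradients
    return w - lr * grad, lr, dbar
```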