Open Access Proceedings Article

The Cascade-Correlation Learning Architecture

TL;DR
The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
Abstract
Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
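The candidate-training step described in the abstract can be made concrete with a short sketch. The numpy code below is illustrative rather than the paper's implementation: it trains a single candidate unit by plain gradient ascent on the correlation score S (the original paper uses quickprop), with all previously installed weights held frozen; the function name and defaults are invented for this example.

```python
import numpy as np

def train_candidate(inputs, residuals, lr=0.5, steps=200, rng=None):
    """Train one candidate unit to maximize the Cascade-Correlation score
    S = sum_o |sum_p (v_p - v_bar) * (e_po - e_bar_o)|, where v is the
    candidate's activation and e is the residual error at each output.
    Weights of units already in the network stay frozen; only the
    candidate's incoming weights w move."""
    rng = np.random.default_rng(0) if rng is None else rng
    w = rng.normal(scale=0.1, size=inputs.shape[1])
    e_c = residuals - residuals.mean(axis=0)        # centered residuals (P, O)
    for _ in range(steps):
        v = np.tanh(inputs @ w)                     # candidate activations (P,)
        v_c = v - v.mean()
        corr = v_c @ e_c                            # one correlation per output (O,)
        # dS/dw = sum_{p,o} sign(corr_o) * e_c[p, o] * tanh'(a_p) * inputs[p]
        delta = (e_c @ np.sign(corr)) * (1.0 - v ** 2)
        w += lr * (inputs.T @ delta) / len(inputs)
    v = np.tanh(inputs @ w)
    return w, np.abs((v - v.mean()) @ e_c).sum()    # trained weights and final S
```

In the paper, a pool of such candidates is trained in parallel from different random starting weights; the best scorer is installed in the network, its input-side weights are frozen from then on, and only the output-side weights are retrained.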



Citations
Journal ArticleDOI

Neural networks in geophysical applications

TL;DR: Techniques are described for faster training, better overall performance (i.e., generalization), and the automatic estimation of network size and architecture.
Journal ArticleDOI

An iterative pruning algorithm for feedforward neural networks

TL;DR: A new pruning method is developed, based on the idea of iteratively eliminating units and adjusting the remaining weights in such a way that the network performance does not worsen over the entire training set.
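To illustrate the compensate-while-pruning idea, here is a generic numpy sketch, not the paper's exact update: a hidden unit is deleted and the surviving output weights are re-fit by least squares so the network's outputs over the training set change as little as possible. The names `prune_one_unit`, `H`, and `W_out` are invented for this example.

```python
import numpy as np

def prune_one_unit(H, W_out, idx):
    """Delete hidden unit `idx` and re-fit the remaining output weights by
    least squares so that outputs over the whole training set are preserved
    as closely as possible.  H is the (patterns x hidden) activation matrix,
    W_out the (hidden x outputs) weight matrix.  A sketch of the general
    idea, not the cited paper's exact procedure."""
    target = H @ W_out                              # outputs before pruning
    keep = [j for j in range(H.shape[1]) if j != idx]
    W_new, *_ = np.linalg.lstsq(H[:, keep], target, rcond=None)
    return keep, W_new                              # surviving units, new weights
```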
Journal ArticleDOI

Inversion methods for physically‐based models

TL;DR: In this paper, the characteristics of traditional inversion, table look-up, neural-network, and other methods are discussed, as well as the major achievements, advantages and disadvantages, and open research issues of each method.
Journal ArticleDOI

Learning with limited numerical precision using the cascade-correlation algorithm

TL;DR: An empirical study of the effects of limited precision in cascade-correlation networks on three different learning problems is presented and techniques for dynamic rescaling and probabilistic rounding that allow reliable convergence down to 7 bits of precision or less are introduced.
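Probabilistic rounding in this setting means rounding each updated value up or down at random, with probability proportional to its distance from the two neighboring grid points, so that small weight updates survive in expectation instead of always being truncated to zero. A minimal sketch of the general technique (not the paper's exact scheme), assuming a uniform fixed-point grid of spacing `step`:

```python
import numpy as np

def probabilistic_round(x, step, rng=None):
    """Round values onto a fixed-point grid of spacing `step`, going up with
    probability equal to the fractional remainder.  An update of 0.1 * step
    then becomes `step` one time in ten and zero otherwise, preserving its
    expected value.  Function name and interface are illustrative."""
    rng = np.random.default_rng() if rng is None else rng
    scaled = np.asarray(x, dtype=float) / step
    lower = np.floor(scaled)
    frac = scaled - lower
    return (lower + (rng.random(lower.shape) < frac)) * step
```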
Journal ArticleDOI

A functional hypothesis for adult hippocampal neurogenesis: Avoidance of catastrophic interference in the dentate gyrus

TL;DR: The hypothesis is that old neurons remain relatively stable, preserving an optimal encoding learned for known environments, while new neurons stay plastic and adapt to features that are qualitatively new when a new environment is encountered.
References
Book ChapterDOI

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
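For context, the generalized delta rule derived in this chapter is the backpropagation weight update Δw = η·δ·o. A minimal numpy sketch for a single training pattern on a one-hidden-layer network of sigmoid units (names and the single-pattern formulation are illustrative):

```python
import numpy as np

def delta_rule_step(x, t, W1, W2, lr=0.5):
    """One generalized-delta-rule (backprop) update for one pattern.
    x: input vector (D,), t: target vector (O,),
    W1: input-to-hidden weights (D, H), W2: hidden-to-output weights (H, O)."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    h = sigmoid(x @ W1)                             # hidden activations
    y = sigmoid(h @ W2)                             # output activations
    delta_out = (t - y) * y * (1.0 - y)             # output-layer deltas
    delta_hid = (delta_out @ W2.T) * h * (1.0 - h)  # back-propagated deltas
    W2 += lr * np.outer(h, delta_out)               # dw = lr * delta * o
    W1 += lr * np.outer(x, delta_hid)
    return y
```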
Journal ArticleDOI

Increased Rates of Convergence Through Learning Rate Adaptation

TL;DR: A study of steepest descent, an analysis of why it can be slow to converge, and four heuristics for achieving faster convergence are presented.
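The best-known heuristic in this line of work gives every weight its own learning rate, raised additively when successive gradients agree in sign and cut multiplicatively when they disagree. A sketch in the spirit of the delta-bar-delta rule proposed there, with illustrative parameter values:

```python
import numpy as np

def delta_bar_delta_step(w, grad, lrates, bar, kappa=0.01, phi=0.5, theta=0.7):
    """One delta-bar-delta-style update: each weight's learning rate grows
    by `kappa` when the current gradient agrees in sign with the running
    average `bar`, and shrinks by the factor (1 - phi) when it disagrees.
    Parameter values here are illustrative only."""
    agree = grad * bar
    lrates = np.where(agree > 0, lrates + kappa, lrates)        # additive increase
    lrates = np.where(agree < 0, lrates * (1.0 - phi), lrates)  # multiplicative decrease
    w = w - lrates * grad                                       # per-weight descent step
    bar = (1.0 - theta) * grad + theta * bar                    # update running average
    return w, lrates, bar
```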