Open Access Proceedings Article

The Cascade-Correlation Learning Architecture

TLDR
The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
Abstract
Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature-detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
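The abstract describes two alternating phases: train the output weights until the error plateaus, then train a pool of candidate units to correlate with the remaining error and install the best one with its input weights frozen. The sketch below is one minimal NumPy reading of that loop, not the authors' implementation: it assumes sigmoid candidates and linear outputs, uses plain gradient steps where the paper uses Quickprop, and the class name, method names, and candidate-pool size are all illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class CascadeCorrelation:
    """Minimal constructive network sketch: linear outputs, sigmoid hidden units."""

    def __init__(self, n_in, n_out, rng=None):
        self.n_out = n_out
        self.hidden = []                              # frozen input-weight vectors
        self.W_out = np.zeros((n_in + 1, n_out))      # +1 column for the bias
        self.rng = rng or np.random.default_rng(0)

    def _features(self, X):
        # Inputs plus bias, then each frozen hidden unit fed by everything before it,
        # so the network deepens as units are added.
        F = np.hstack([X, np.ones((len(X), 1))])
        for w in self.hidden:
            F = np.hstack([F, sigmoid(F @ w)[:, None]])
        return F

    def predict(self, X):
        return self._features(X) @ self.W_out

    def _train_outputs(self, X, Y, lr=0.05, epochs=2000):
        # Phase 1: only the output weights move (the paper uses Quickprop here).
        F = self._features(X)                         # hidden outputs are frozen
        for _ in range(epochs):
            E = F @ self.W_out - Y
            self.W_out -= lr * F.T @ E / len(X)

    def _grow_unit(self, X, Y, n_candidates=8, lr=0.5, epochs=1000):
        # Phase 2: train candidates to maximize S = sum over outputs of the
        # magnitude of the covariance between the candidate's activation V and
        # the residual error E, then freeze the winner's input weights.
        F = self._features(X)
        E = F @ self.W_out - Y
        E -= E.mean(axis=0)                           # centred residual errors
        best_w, best_S = None, -np.inf
        for _ in range(n_candidates):
            w = self.rng.normal(scale=1.0, size=F.shape[1])
            for _ in range(epochs):
                V = sigmoid(F @ w)
                corr = (V - V.mean()) @ E             # one covariance per output
                dV = (E @ np.sign(corr)) * V * (1 - V)
                w += lr * F.T @ dV / len(X)           # gradient ascent on S
            V = sigmoid(F @ w)
            S = np.abs((V - V.mean()) @ E).sum()
            if S > best_S:
                best_S, best_w = S, w
        self.hidden.append(best_w)                    # input weights now frozen
        self.W_out = np.vstack([self.W_out, np.zeros(self.n_out)])

    def fit(self, X, Y, max_hidden=8, tol=1e-3):
        self._train_outputs(X, Y)
        for _ in range(max_hidden):
            if np.mean((self.predict(X) - Y) ** 2) < tol:
                break                                 # network sized itself
            self._grow_unit(X, Y)
            self._train_outputs(X, Y)

# Toy check on XOR: the initial linear net cannot fit it, so units get added.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)
net = CascadeCorrelation(2, 1)
net.fit(X, Y)
print(np.round(net.predict(X), 2), "hidden units:", len(net.hidden))
```

On XOR the initial perceptron-like net plateaus at a large error, so fit typically installs one or two hidden units (an AND- or OR-like detector makes XOR linear in the augmented features); because each new unit also sees all earlier units' outputs, growth adds depth as well as width.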



Citations
Journal ArticleDOI

Water demand forecasting: review of soft computing methods

TL;DR: The review finds that soft computing methods are used mainly for short-term demand forecasting and argues that they have much more to contribute to water demand forecasting.
Book ChapterDOI

Predicting the Mackey-Glass Timeseries With Cascade-Correlation Learning

TL;DR: In this paper, a cascade-correlation learning algorithm is used to predict real-valued time series; results on predicting the Mackey-Glass chaotic time series with Cascade-Correlation are compared with other neural-network learning algorithms as well as standard techniques.
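For context on the benchmark named in this chapter: the Mackey-Glass series comes from the delay differential equation dx/dt = beta*x(t-tau)/(1 + x(t-tau)^n) - gamma*x(t), which is chaotic at the standard setting tau = 17, beta = 0.2, gamma = 0.1, n = 10. The generator below is a rough sketch using first-order Euler integration; the step size, initial history, and the prediction framing in the comment are assumptions, and the chapter's exact setup may differ.

```python
import numpy as np

def mackey_glass(n_steps, tau=17, beta=0.2, gamma=0.1, n=10, dt=1.0, x0=1.2):
    """Euler integration of dx/dt = beta*x(t-tau)/(1 + x(t-tau)**n) - gamma*x(t)."""
    delay = int(tau / dt)
    x = np.full(n_steps + delay, x0)      # constant initial history of length tau
    for t in range(delay, n_steps + delay - 1):
        x_tau = x[t - delay]
        x[t + 1] = x[t] + dt * (beta * x_tau / (1.0 + x_tau ** n) - gamma * x[t])
    return x[delay:]

series = mackey_glass(2000)
# One common benchmark framing: predict x(t + 85) from the delay embedding
# x(t), x(t - 6), x(t - 12), x(t - 18).
```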
Journal ArticleDOI

Exploring constructive cascade networks

TL;DR: The acasper algorithm, which incorporates the insights obtained from the empirical studies, was shown to have good generalization and network-construction properties and was compared to the Cascade-Correlation algorithm on the Proben 1 and additional regression data sets.
Journal ArticleDOI

Connectionist theory refinement: genetically searching the space of network topologies

TL;DR: The Regent algorithm uses domain-specific knowledge to help create an initial population of knowledge-based neural networks, then applies the genetic operators of crossover and mutation to continually search for better network topologies.

Learning to pronounce written words: a study in inductive language learning

References
Book ChapterDOI

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Journal ArticleDOI

Increased Rates of Convergence Through Learning Rate Adaptation

TL;DR: Presents a study of steepest descent, an analysis of why it can be slow to converge, and four heuristics for achieving faster rates of convergence.
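This reference is commonly associated with the delta-bar-delta family of updates: each weight keeps its own learning rate, which grows additively while successive gradients agree in sign with a running average of past gradients and shrinks multiplicatively when the sign flips (a symptom of overshooting). The step below is a hedged sketch of that idea, not the article's exact four heuristics; the constants kappa, phi, theta and the function name are illustrative.

```python
import numpy as np

def delta_bar_delta_step(w, grad, lr, avg_grad, kappa=0.01, phi=0.5, theta=0.7):
    """One per-weight adaptive step: raise a rate additively on sign agreement
    between the gradient and its running average, cut it on disagreement."""
    agree = grad * avg_grad
    lr = np.where(agree > 0, lr + kappa, np.where(agree < 0, lr * phi, lr))
    w = w - lr * grad                                  # plain descent step
    avg_grad = (1.0 - theta) * grad + theta * avg_grad # exponential trace
    return w, lr, avg_grad

# Usage on a toy quadratic loss 0.5*||w||^2, whose gradient is w itself:
w = np.array([1.0, -2.0])
lr = np.full_like(w, 0.1)
avg = np.zeros_like(w)
for _ in range(50):
    w, lr, avg = delta_bar_delta_step(w, w, lr, avg)
print(np.round(w, 4))
```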