Open Access Proceedings Article

The Cascade-Correlation Learning Architecture

TL;DR
The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
Abstract
Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature-detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
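A minimal sketch of this training loop, in Python with NumPy, may help make the cascade concrete. It is illustrative, not the paper's implementation: plain gradient steps stand in for the Quickprop updates used in the original paper, output units are linear, and a single candidate unit is trained where the paper trains a pool of candidates and keeps the best; all names and defaults here are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class CascadeCorrelation:
    """Sketch of the Cascade-Correlation loop (simplified)."""

    def __init__(self, n_inputs, n_outputs, lr=0.1):
        self.lr = lr
        self.hidden = []                                  # frozen weight vectors, one per unit
        self.W_out = np.zeros((n_inputs + 1, n_outputs))  # trainable; last row is the bias

    def _features(self, X):
        # Each hidden unit sees the inputs plus every earlier unit (the "cascade").
        F = X
        for w in self.hidden:
            F = np.hstack([F, sigmoid(F @ w[:-1] + w[-1])[:, None]])
        return F

    def predict(self, X):
        F = self._features(X)
        return np.hstack([F, np.ones((len(F), 1))]) @ self.W_out

    def _train_outputs(self, X, Y, epochs=500):
        # Only the output-side weights are adjusted; everything else is frozen.
        F = np.hstack([self._features(X), np.ones((len(X), 1))])
        for _ in range(epochs):
            E = F @ self.W_out - Y                        # residual errors
            self.W_out -= self.lr * F.T @ E / len(X)

    def _train_candidate(self, X, Y, epochs=500):
        # Gradient *ascent* on S = sum over outputs of |cov(candidate value, residual error)|.
        F = self._features(X)
        E = np.hstack([F, np.ones((len(F), 1))]) @ self.W_out - Y
        cE = E - E.mean(axis=0)
        w = np.random.default_rng(0).normal(scale=0.1, size=F.shape[1] + 1)
        for _ in range(epochs):
            v = sigmoid(F @ w[:-1] + w[-1])
            sign = np.sign((v - v.mean()) @ cE)           # sign of each covariance term
            dS_dv = (cE * sign).sum(axis=1) * v * (1 - v) # chain through the sigmoid
            w += self.lr * np.append(F.T @ dS_dv, dS_dv.sum()) / len(X)
        return w

    def fit(self, X, Y, max_hidden=8, tol=1e-3):
        self._train_outputs(X, Y)
        for _ in range(max_hidden):
            if np.mean((self.predict(X) - Y) ** 2) < tol:
                break
            self.hidden.append(self._train_candidate(X, Y))     # input weights now frozen
            self.W_out = np.insert(self.W_out, -1, 0.0, axis=0) # output weight for new unit
            self._train_outputs(X, Y)
        return self
```

Freezing each recruited unit's input-side weights is what lets the network retain the structure it has built and avoid propagating error signals backward through the cascade, as the abstract notes.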



Citations
Journal Article

An efficient constrained learning algorithm with momentum acceleration

TL;DR: The algorithm's performance, in terms of learning speed and scalability, is evaluated on benchmark problems and found superior to that of reputedly fast variants of the back-propagation algorithm.
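The constrained-learning scheme itself is not detailed in this summary; for context, classical momentum acceleration augments plain gradient descent with a decaying velocity term. A generic sketch (the function name and hyperparameter values are illustrative, not from the cited paper):

```python
import numpy as np

def momentum_step(w, grad, velocity, lr=0.01, beta=0.9):
    """One gradient-descent update with classical momentum:
    v <- beta * v - lr * grad;  w <- w + v."""
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity
```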
Book

Soft Computing for Knowledge Discovery and Data Mining

Oded Maimon, +1 more
TL;DR: This edited volume by highly regarded authors includes several contributors to the 2005 Data Mining and Knowledge Discovery Handbook, and is suitable as a secondary textbook or reference for advanced-level students in information systems, engineering, computer science, and statistics management.
Journal Article

Connectivity and performance tradeoffs in the cascade correlation learning architecture

TL;DR: The cascade correlation algorithm is modified to generate networks with restricted fan-in and small depth by controlling the connectivity and the results reveal that there is a tradeoff between connectivity and other performance attributes like depth, total number of independent parameters, and learning time.
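The paper's connectivity-control rule is not given in this summary; the idea of restricted fan-in can be illustrated as capping the number of source units a new hidden unit may connect to. A hypothetical sketch (random selection is an assumption, not the cited method):

```python
import numpy as np

def candidate_fan_in(n_available, max_fan_in, rng=None):
    """Pick the source units a new hidden unit may connect to,
    capping its fan-in. Random selection is illustrative only;
    the cited paper's selection rule may differ."""
    rng = rng or np.random.default_rng()
    if n_available <= max_fan_in:
        return np.arange(n_available)
    return np.sort(rng.choice(n_available, size=max_fan_in, replace=False))
```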
Journal Article

Neural networks applied in chemistry. I. Determination of the optimal topology of multilayer perceptron neural networks

TL;DR: Presents methods that can be used to determine optimum or near-optimum geometries of artificial neural networks, along with several case studies illustrating the development of neural network models for applications in chemistry and chemical engineering.
Journal Article

A new strategy for adaptively constructing multilayer feedforward neural networks

TL;DR: Introduces a new strategy for adaptively and autonomously constructing a multi-hidden-layer feedforward neural network (FNN), which adds both new hidden units and new hidden layers one at a time whenever they are determined to be needed.
References
Book Chapter

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
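For reference, the generalized delta rule developed in this chapter updates each weight from an error signal propagated backward through the network; in the standard notation (learning rate $\eta$, activations $o$, targets $t$, activation function $f$):

```latex
\Delta w_{ji} = \eta\, \delta_j\, o_i,
\qquad
\delta_j =
\begin{cases}
(t_j - o_j)\, f'(\mathrm{net}_j) & j \text{ an output unit},\\
f'(\mathrm{net}_j) \sum_k \delta_k w_{kj} & j \text{ a hidden unit}.
\end{cases}
```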
Journal Article

Increased Rates of Convergence Through Learning Rate Adaptation

TL;DR: Presents a study of steepest descent, an analysis of why it can be slow to converge, and four proposed heuristics for achieving faster rates of convergence.
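The best known of these heuristics is Jacobs' delta-bar-delta rule: every weight gets its own learning rate, increased additively while the gradient keeps its sign and decreased multiplicatively when the sign flips. A minimal sketch, with illustrative constant names and values (kappa, phi, theta):

```python
import numpy as np

def delta_bar_delta_step(w, grad, lr, bar_delta, kappa=1e-3, phi=0.5, theta=0.7):
    """One delta-bar-delta update (sketch). `lr` holds per-weight learning
    rates; `bar_delta` is an exponential average of past gradients."""
    agree = grad * bar_delta > 0                    # gradient sign persists
    flip = grad * bar_delta < 0                     # gradient sign reversed
    lr = np.where(agree, lr + kappa,                # additive increase
                  np.where(flip, lr * phi, lr))     # multiplicative decrease
    bar_delta = (1 - theta) * grad + theta * bar_delta
    return w - lr * grad, lr, bar_delta             # per-weight gradient step
```

Giving each weight its own adaptive rate addresses the slow convergence of steepest descent that the paper analyzes.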