Open Access Proceedings Article

The Cascade-Correlation Learning Architecture

TL;DR: The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
Abstract
Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature-detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
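The abstract outlines the whole training loop: fit a minimal network, then repeatedly install candidate hidden units trained to correlate with the residual error, freezing each unit's input-side weights once it is added. Below is a minimal NumPy sketch of that loop, not the authors' implementation: it trains a single candidate instead of a pool, uses least squares in place of quickprop for the output weights, and uses plain gradient ascent on the candidate's correlation score; all function names and hyperparameters are illustrative.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_output(F, y):
    # Retrain output weights on the current feature matrix F by least
    # squares (the paper uses quickprop; this is a simplification).
    w, *_ = np.linalg.lstsq(F, y, rcond=None)
    return w

def train_candidate(F, residual, steps=2000, lr=0.5):
    # Gradient ascent on the magnitude of the correlation between the
    # candidate's activation V and the residual output error E, the
    # quantity Cascade-Correlation maximizes for new hidden units.
    w = rng.normal(scale=0.5, size=F.shape[1])
    Ec = residual - residual.mean()          # centered error
    for _ in range(steps):
        V = sigmoid(F @ w)
        sign = np.sign((V - V.mean()) @ Ec)  # derivative of |S| w.r.t. S
        grad = F.T @ (sign * Ec * V * (1 - V))
        w += lr * grad
    return w

# Toy task: XOR, which the initial single-layer network cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0.0, 1.0, 1.0, 0.0])

F = np.hstack([X, np.ones((4, 1))])          # inputs plus a bias column
hidden = []                                  # frozen input-side weights
for _ in range(3):                           # grow up to 3 hidden units
    w_out = train_output(F, y)
    residual = y - F @ w_out
    if np.abs(residual).max() < 1e-2:
        break
    w_cand = train_candidate(F, residual)
    hidden.append(w_cand)                    # freeze: never retrained
    # The new unit sees all inputs and all earlier hidden units,
    # so each addition deepens the network by one layer.
    F = np.hstack([F, sigmoid(F @ w_cand)[:, None]])

w_out = train_output(F, y)
print(np.round(F @ w_out, 2))                # approaches [0, 1, 1, 0]

Because frozen units become fixed feature columns, each round reduces to training a single layer of weights, which is why no error signals ever need to propagate backward through the network.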



Citations
Journal Article

A comparison of linear regression and neural network methods for predicting excess returns on large stocks

TL;DR: This paper investigates whether the predictive power of the economic and financial variables employed in earlier studies can be enhanced if the statistical method of linear regression is replaced by feedforward neural networks with backpropagation of error, and explores two methods for reducing the complexity of the network.

Selective transfer of neural network task knowledge

TL;DR: The research objectives are to develop a theoretical model and to test a prototype system that sequentially retains ANN task knowledge and selectively reuses that knowledge to bias the learning of a new task efficiently and effectively.
Posted Content

Intelligent Systems: Architectures and Perspectives

TL;DR: This chapter introduces the different generic architectures for integrating intelligent systems, and presents the design aspects and perspectives of different hybrid architectures such as NN-FIS, EC-FIS, EC-NN, FIS-PR, and NN-FIS-EC systems.
Journal Article

Wavelet-Based Feature Extraction for the Analysis of EEG Signals Associated with Imagined Fists and Feet Movements

TL;DR: This work used Neural Networks (NNs) as a classifier to label imagined movements as either fist or foot movements, and showed performance good enough to control computer applications via such imagined movements.
Journal Article

A computational analysis of conservation.

TL;DR: In this article, an approach to modeling cognitive development with a generative connectionist algorithm is described and illustrated with a new model of conservation acquisition that captures the problem size effect, the length bias effect, and the screening effect.
References
Book Chapter

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
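This chapter introduces the generalized delta rule, i.e. back-propagation, the procedure the abstract above says Cascade-Correlation avoids. For contrast, here is a minimal sketch of that rule for one hidden layer: the output error is propagated backward through the hidden units so that every weight in the network is updated. The network size, learning rate, and XOR task are illustrative choices, not taken from the chapter.

import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# XOR with a 2-4-1 sigmoid network; biases handled via appended 1-columns.
X = np.hstack([np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float),
               np.ones((4, 1))])
y = np.array([[0.0], [1.0], [1.0], [0.0]])

W1 = rng.normal(scale=1.0, size=(3, 4))        # input (+bias) -> hidden
W2 = rng.normal(scale=1.0, size=(5, 1))        # hidden (+bias) -> output

for _ in range(10000):
    H = sigmoid(X @ W1)                        # forward pass
    Hb = np.hstack([H, np.ones((4, 1))])
    out = sigmoid(Hb @ W2)
    d_out = (out - y) * out * (1 - out)        # delta at the output layer
    d_hid = (d_out @ W2[:-1].T) * H * (1 - H)  # error propagated backward
    W2 -= 0.5 * Hb.T @ d_out                   # every weight is updated
    W1 -= 0.5 * X.T @ d_hid

print(out.round(2))                            # typically approaches [0, 1, 1, 0]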
Journal Article

Increased Rates of Convergence Through Learning Rate Adaptation

TL;DR: A study of Steepest Descent, an analysis of why it can be slow to converge, and four heuristics for achieving faster rates of convergence are presented.
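The TL;DR does not spell out the four heuristics, so the sketch below illustrates only the general idea with one common sign-based rule (not necessarily one of the paper's four): keep a per-parameter step size, grow it while the gradient's sign stays stable, and shrink it on a sign flip. The quadratic test problem and constants are illustrative.

import numpy as np

# Steepest descent on an ill-conditioned quadratic 0.5 * x^T A x, where a
# single global learning rate is slow; a per-parameter adaptive rate helps.
A = np.diag([1.0, 100.0])

def grad(x):
    return A @ x                              # gradient of the quadratic

x = np.array([1.0, 1.0])
lr = np.full(2, 0.01)                         # per-parameter learning rates
prev = np.zeros(2)
for _ in range(200):
    g = grad(x)
    same = np.sign(g) == np.sign(prev)
    lr = np.where(same, lr * 1.2, lr * 0.5)   # grow on agreement, shrink on flip
    x -= lr * g
    prev = g

print(x)                                      # close to the minimum at the origin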