Open Access Proceedings Article

The Cascade-Correlation Learning Architecture

TLDR
The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
Abstract
Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature-detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
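To make the training loop concrete, here is a minimal sketch of the procedure the abstract describes, assuming a single-output regression task, tanh candidate units, and plain gradient descent in place of the Quickprop updates Fahlman used. All names (train_cascade, n_candidates, and so on) are illustrative, and the candidate-training step is simplified to random sampling rather than the gradient ascent on the correlation score used in the paper.

import numpy as np

rng = np.random.default_rng(0)

def train_outputs(H, y, w, lr=0.01, steps=500):
    # Gradient descent on the output weights only; hidden weights stay frozen.
    for _ in range(steps):
        err = H @ w - y                    # residual error per training pattern
        w = w - lr * (H.T @ err) / len(y)  # mean-squared-error gradient step
    return w

def correlation_score(v, e):
    # Fahlman's score S: covariance magnitude between a candidate unit's
    # activation v and the network's residual error e over all patterns.
    return abs(np.dot(v - v.mean(), e - e.mean()))

def train_cascade(X, y, max_hidden=5, n_candidates=8):
    N = len(X)
    H = np.hstack([X, np.ones((N, 1))])    # inputs plus a bias column
    w = rng.normal(scale=0.1, size=H.shape[1])
    w = train_outputs(H, y, w)             # phase 1: train the minimal network
    for _ in range(max_hidden):
        e = H @ w - y                      # error the next unit should explain
        best_v, best_s = None, -1.0
        for _ in range(n_candidates):
            # Each candidate sees all inputs AND all existing hidden units.
            c = rng.normal(scale=0.5, size=H.shape[1])
            v = np.tanh(H @ c)
            s = correlation_score(v, e)
            if s > best_s:
                best_s, best_v = s, v
        # Install the winner: its activations become a new frozen feature
        # column; only the (extended) output weights are retrained.
        H = np.hstack([H, best_v.reshape(N, 1)])
        w = train_outputs(H, y, np.append(w, 0.0))
    return H, w

Freezing each installed unit is what lets the network keep the feature detectors it has already built even when training continues or the data change.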



Citations
Journal ArticleDOI

A new algorithm to design compact two-hidden-layer artificial neural networks

TL;DR: The experimental results show that CNNDA can produce compact ANNs with good generalization ability and short training time compared with other algorithms on benchmark problems including cancer, diabetes, and character recognition.
Journal ArticleDOI

Development of artificial intelligence for modeling wastewater heavy metal removal: State of the art, application assessment and possible future research

TL;DR: In this review, each element of the predictive models and their corresponding treatment processes, including their pros and cons, is discussed thoroughly, and several research directions that could bridge gaps in this domain are proposed and recommended on the basis of the identified research limitations.
Journal ArticleDOI

Applying local measures of spatial heterogeneity to Landsat-TM images for predicting wildfire occurrence in Mediterranean landscapes

TL;DR: In this paper, the relationship between local landscape heterogeneity and wildfire occurrence was investigated in a 672.3 km² rural area of eastern Spain; several neural network models found significant relationships between local spatial pattern and future fire occurrence.
Book ChapterDOI

Connectionist models of cognition.

TL;DR: This book is a definitive reference source for the growing, increasingly important, and interdisciplinary field of computational cognitive modeling, that is, computational psychology.
Book ChapterDOI

Hierarchical Classifiers for Complex Spatio-temporal Concepts

TL;DR: The general methodology presented here is applied to approximate complex spatial and spatio-temporal concepts defined for structured and unstructured complex objects, to identify the behavioral patterns of complex objects, and to automate behavior planning for such objects when their states are represented by spatio-temporal concepts requiring approximation.
References
Book ChapterDOI

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
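For readers unfamiliar with the rule this chapter names, the generalized delta rule is the standard statement of back-propagation; reconstructed here in conventional notation (not quoted from the chapter), each weight update is

\[
\Delta w_{ji} = \eta\,\delta_j\,o_i,
\qquad
\delta_j =
\begin{cases}
(t_j - o_j)\,f'(\mathrm{net}_j) & \text{for an output unit } j,\\
f'(\mathrm{net}_j)\sum_k \delta_k w_{kj} & \text{for a hidden unit } j,
\end{cases}
\]

where $o_i$ is the activation feeding the weight, $t_j$ the target, and $f$ the unit's squashing function; hidden-unit error signals are obtained by propagating the output-unit errors backward through the weights $w_{kj}$.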
Journal ArticleDOI

Increased Rates of Convergence Through Learning Rate Adaptation

TL;DR: This paper studies steepest descent, analyzes why it can be slow to converge, and proposes four heuristics for achieving faster rates of convergence.
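As a sketch of the kind of adaptation this reference proposes (a delta-bar-delta-style scheme; the constants and the toy objective below are illustrative choices, not values from the paper), each weight gets its own learning rate that grows additively while successive gradients agree in sign and shrinks multiplicatively when they disagree:

import numpy as np

def delta_bar_delta(grad_fn, w, steps=200, kappa=0.01, phi=0.5, theta=0.7):
    lr = np.full_like(w, 0.1)    # one adaptive learning rate per weight
    bar = np.zeros_like(w)       # exponential trace of past gradients
    for _ in range(steps):
        g = grad_fn(w)
        # Grow the rate additively while successive gradients agree in sign;
        # shrink it multiplicatively when they disagree.
        lr = np.where(g * bar > 0, lr + kappa, lr)
        lr = np.where(g * bar < 0, lr * phi, lr)
        w = w - lr * g
        bar = (1 - theta) * g + theta * bar
    return w

# Toy usage: minimize a poorly scaled quadratic, where per-weight rates help.
grad = lambda w: np.array([2.0 * w[0], 200.0 * w[1]])
w_opt = delta_bar_delta(grad, np.array([3.0, 3.0]))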