Open Access Proceedings Article

The Cascade-Correlation Learning Architecture

Abstract
Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
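The training loop described in the abstract — train the output weights, freeze each new hidden unit's input-side weights, and grow the cascade one unit at a time — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it uses sigmoid units and plain gradient descent where the paper uses Quickprop, trains a single candidate unit rather than a candidate pool, and the function names (`train_output_weights`, `train_candidate`, `cascade_correlation`) are hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_output_weights(F, y, lr=0.5, epochs=2000):
    """Train output weights on fixed features F (inputs + frozen
    hidden-unit outputs + bias column), minimizing squared error."""
    w = np.zeros(F.shape[1])
    for _ in range(epochs):
        p = sigmoid(F @ w)
        w -= lr * F.T @ ((p - y) * p * (1 - p)) / len(y)
    return w

def train_candidate(F, err, lr=0.5, epochs=2000, seed=0):
    """Train a candidate unit to maximize the magnitude of the
    covariance S between its activation and the residual error."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.5, size=F.shape[1])
    e = err - err.mean()          # centered residual error
    for _ in range(epochs):
        v = sigmoid(F @ w)
        s = np.dot(v - v.mean(), e)
        # Gradient ascent on |S|; the v-mean term drops out
        # because e sums to zero.
        w += lr * np.sign(s) * F.T @ (e * v * (1 - v)) / len(e)
    return w

def cascade_correlation(X, y, max_hidden=3):
    """Grow the network: each new unit sees all inputs and all
    previously frozen units (the cascade), then is frozen itself."""
    N = len(y)
    F = np.hstack([X, np.ones((N, 1))])   # inputs + bias
    n_hidden = 0
    while True:
        w_out = train_output_weights(F, y)
        pred = sigmoid(F @ w_out)
        err = pred - y
        solved = np.all((pred > 0.5) == (y > 0.5))
        if solved or n_hidden >= max_hidden:
            return F, w_out
        # Train one candidate, then freeze its input-side weights
        # by appending its activations as a new fixed feature column.
        w_h = train_candidate(F, err)
        F = np.hstack([F, sigmoid(F @ w_h)[:, None]])
        n_hidden += 1

# Toy run on XOR, a problem the paper uses to motivate hidden units.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
F, w = cascade_correlation(X, y)
print((sigmoid(F @ w) > 0.5).astype(int))
```

Note how the sketch never back-propagates through the frozen units: each candidate is trained against the current residual error using only a forward pass through the existing features, mirroring the abstract's claim that no error signals flow backward through the network's connections.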



Citations
Book ChapterDOI

Using genetic search to refine knowledge-based neural networks

TL;DR: The REGENT algorithm is presented, which uses genetic algorithms to broaden the type of networks seen during its search by using (a) the domain theory to help create an initial population and (b) crossover and mutation operators specifically designed for knowledge-based networks.
Book ChapterDOI

Nonlinear image processing using artificial neural networks

TL;DR: The experiment on supervised classification, in handwritten digit recognition, showed that ANNs are quite capable of solving difficult object recognition problems, and a number of ANN architectures were trained to mimic the Kuwahara filter, a nonlinear edge-preserving smoothing filter used in preprocessing.
Journal ArticleDOI

Determining and classifying the region of interest in ultrasonic images of the breast using neural networks

TL;DR: This paper describes how ultrasonic images of the female breast were processed, and how neural networks were used to help identify malignant and benign areas in them; it concludes that the system is promising and should be developed further by providing more training to the network.
Journal ArticleDOI

An empirical evaluation of constructive neural network algorithms in classification tasks

TL;DR: Presents an empirical evaluation of seven two-class constructive neural network (CoNN) algorithms, namely Tower, Pyramid, Tiling, Upstart, Shift, Perceptron Cascade (PC) and Partial Target Inversion (PTI), across 12 knowledge domains.
References
Book ChapterDOI

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Journal ArticleDOI

Increased Rates of Convergence Through Learning Rate Adaptation

TL;DR: Presents a study of steepest descent, an analysis of why it can be slow to converge, and four proposed heuristics for achieving faster rates of convergence.