Open Access Proceedings Article

The Cascade-Correlation Learning Architecture

TLDR
The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
Abstract
Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature-detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
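The abstract describes the full training loop: train the output weights, then repeatedly train a candidate hidden unit to correlate with the remaining error, freeze its input-side weights, and install it as a new feature detector. A minimal NumPy sketch of that loop is given below; the class name, hyperparameters, and the use of plain gradient steps (rather than the Quickprop updates used in the paper's experiments) are assumptions of this illustration, not the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class CascadeCorrelation:
    """Minimal sketch: names and constants are illustrative, not the paper's."""

    def __init__(self, n_in, n_out, lr=0.5):
        self.n_out, self.lr = n_out, lr
        self.hidden = []                          # frozen input-side weight vectors
        self.W_out = np.zeros((n_in + 1, n_out))  # +1 bias row; grows with each unit
        self.rng = np.random.default_rng(0)

    def _features(self, X):
        # Inputs plus a bias column, then the frozen output of every installed
        # unit; each unit sees all inputs and all earlier units (the cascade).
        F = np.hstack([X, np.ones((len(X), 1))])
        for w in self.hidden:
            F = np.hstack([F, sigmoid(F @ w)[:, None]])
        return F

    def predict(self, X):
        return sigmoid(self._features(X) @ self.W_out)

    def _train_outputs(self, X, Y, epochs=1000):
        # Only the output weights are trained; hidden units stay frozen.
        F = self._features(X)
        for _ in range(epochs):
            E = sigmoid(F @ self.W_out) - Y       # residual error
            self.W_out -= self.lr * F.T @ E / len(X)

    def _train_candidate(self, X, Y, epochs=1000):
        # Train one candidate to maximize the correlation S between its output
        # and the residual error, with the rest of the network held fixed.
        F = self._features(X)
        E_c = self.predict(X) - Y
        E_c -= E_c.mean(axis=0)                   # centered error, per output unit
        w = self.rng.normal(0.0, 0.1, F.shape[1])
        for _ in range(epochs):
            v = sigmoid(F @ w)
            sign = np.sign((v - v.mean()) @ E_c)  # sign of covariance, per output
            dv = (E_c @ sign) * v * (1.0 - v)     # dS/dv through the sigmoid
            w += self.lr * F.T @ dv / len(X)      # gradient ascent on S
        return w

    def fit(self, X, Y, max_hidden=8):
        self._train_outputs(X, Y)
        for _ in range(max_hidden):
            self.hidden.append(self._train_candidate(X, Y))  # input weights now frozen
            self.W_out = np.vstack([self.W_out, np.zeros((1, self.n_out))])
            self._train_outputs(X, Y)             # retrain outputs with the new feature

if __name__ == "__main__":
    # XOR: not linearly separable, so the initial input-output network fails
    # and the algorithm must grow hidden units to fit it.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    Y = np.array([[0], [1], [1], [0]], dtype=float)
    net = CascadeCorrelation(n_in=2, n_out=1)
    net.fit(X, Y)
    print(np.round(net.predict(X), 2))
```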



Citations
Posted Content

Adaptive Neural Trees

TL;DR: Adaptive neural trees (ANTs) incorporate representation learning into the edges, routing functions and leaf nodes of a decision tree, along with a backpropagation-based training algorithm that adaptively grows the architecture from primitive modules (e.g., convolutional layers).
Journal Article

The Parallel Transfer of Task Knowledge Using Dynamic Learning Rates Based on a Measure of Relatedness

Daniel Silver
01 Jun 1996
TL;DR: Experimental results demonstrate the ability of ηMTL to dynamically select the most related source task(s) for the functional transfer of prior domain knowledge.
Journal Article

Modeling developmental cognitive neuroscience.

TL;DR: Two classes of connectionist models are described: those that focus on experience-dependent structural elaboration within a brain region by adding or deleting units and connections during learning, and those that model the gradual integration of different brain areas based on combinations of experience-dependent and maturational factors.
Journal Article

A developmental model for the evolution of artificial neural networks

TL;DR: This paper presents a model of decentralized growth and development for artificial neural networks (ANNs), inspired by developmental biology and the physiology of nervous systems, in which each individual artificial neuron is an autonomous unit whose behavior is determined only by the genetic information it harbors and the local concentrations of substrates.
Journal Article

Perceptual constraints and the learnability of simple grammars.

TL;DR: It is shown that some spontaneous generalizations are driven by specialized, highly constrained symbolic operations, which suggests that some simple grammars are acquired using perceptual primitives rather than general-purpose mechanisms.
References
Book Chapter

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Journal Article

Increased Rates of Convergence Through Learning Rate Adaptation

TL;DR: A study of steepest descent analyzes why it can be slow to converge, and four heuristics for achieving faster rates of convergence are proposed.
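This last reference concerns sign-based learning-rate heuristics; the sketch below illustrates one such scheme in the spirit of the delta-bar-delta rule associated with this paper, where each weight's rate grows additively while its gradient keeps the same sign and shrinks multiplicatively when the sign flips. The function name and the constants kappa, phi, and theta are illustrative assumptions, not the paper's values.

```python
import numpy as np

def adaptive_lr_step(w, grad, lr, bar, kappa=0.01, phi=0.5, theta=0.7):
    """One update of weights w given the current gradient grad.
    lr  : per-parameter learning rates (same shape as w)
    bar : exponential moving average of past gradients (same shape as w)
    """
    lr = np.where(bar * grad > 0, lr + kappa, lr)  # consistent sign: increase rate
    lr = np.where(bar * grad < 0, lr * phi, lr)    # sign flip: decrease rate
    bar = (1.0 - theta) * grad + theta * bar       # update the gradient average
    return w - lr * grad, lr, bar

# Usage: minimize f(w) = ||w||^2, whose gradient is 2w, from a fixed start.
w = np.array([2.0, -3.0])
lr = np.full_like(w, 0.1)
bar = np.zeros_like(w)
for _ in range(100):
    w, lr, bar = adaptive_lr_step(w, 2 * w, lr, bar)
print(np.round(w, 4))  # approaches the minimum at the origin
```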