Open Access Proceedings Article
The Cascade-Correlation Learning Architecture
Scott E. Fahlman, Christian Lebiere
Vol. 2, pp. 524–532
TL;DR: The Cascade-Correlation architecture learns very quickly, lets the network determine its own size and topology, retains the structures it has built even if the training set changes, and requires no back-propagation of error signals through the connections of the network.
Abstract:
Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
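The training loop described in the abstract can be sketched in a few dozen lines. The following is a minimal illustrative sketch, not the authors' implementation: it uses NumPy, trains a single logistic output unit by gradient descent, and trains each candidate hidden unit by gradient ascent on the (signed) covariance between its activation and the residual error. The paper itself maximizes the magnitude of this correlation over a pool of candidate units and uses Quickprop rather than plain gradient descent; those details, along with all function names and hyperparameters here, are simplifying assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_outputs(H, y, epochs=2000, lr=0.5):
    # Gradient descent on a single logistic output unit; only the
    # output-side weights are (re)trained, never the hidden units' inputs.
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=H.shape[1])
    for _ in range(epochs):
        p = sigmoid(H @ w)
        w -= lr * H.T @ (p - y) / len(y)
    return w

def train_candidate(H, resid, epochs=2000, lr=0.5):
    # Gradient ascent on the covariance between the candidate's tanh
    # activation and the residual error (simplified from the paper, which
    # maximizes the correlation's magnitude over a pool of candidates).
    rng = np.random.default_rng(1)
    v = rng.normal(scale=0.5, size=H.shape[1])
    for _ in range(epochs):
        a = np.tanh(H @ v)
        grad = H.T @ ((resid - resid.mean()) * (1.0 - a ** 2))
        v += lr * grad / len(resid)
    return v

def cascade_correlation(X, y, max_hidden=3):
    # Start with a minimal network: the inputs plus a bias, no hidden units.
    H = np.hstack([X, np.ones((len(X), 1))])
    for _ in range(max_hidden):
        w = train_outputs(H, y)
        resid = sigmoid(H @ w) - y
        if np.mean(resid ** 2) < 1e-3:
            break
        v = train_candidate(H, resid)
        # The candidate's input-side weights v are frozen here: the unit
        # becomes a permanent feature detector whose output feeds every
        # later unit, so each new unit deepens the cascade by one layer.
        H = np.hstack([H, np.tanh(H @ v)[:, None]])
    w = train_outputs(H, y)
    return H, w

# Toy usage: XOR, which the initial network with no hidden units cannot learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
H, w = cascade_correlation(X, y)
pred = (sigmoid(H @ w) > 0.5).astype(int)
```

Because each unit's output is appended as a new input column for everything trained later, only one unit is being trained at any moment, which is what lets the algorithm avoid back-propagating error signals through the network.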
Citations
Journal Article
Using and designing massively parallel computers for artificial neural networks
Tomas Nordström, Bertil Svensson, et al.
TL;DR: This paper surveys attempts to run ANN algorithms on massively parallel computers, reviews designs of new parallel systems tuned for ANN computing, and identifies the different classes of parallel architectures used or designed for ANNs.
Journal Article
State of the Art of Artificial Neural Networks in Geotechnical Engineering
TL;DR: A state-of-the-art examination of ANNs in geotechnical engineering, with insights into the modeling issues of ANNs.
Speech recognition using neural networks
TL;DR: It is argued that an NN-HMM hybrid has several theoretical advantages over a pure HMM system, including better acoustic modeling accuracy, better context sensitivity, more natural discrimination, and a more economical use of parameters.
Journal Article
Review article: Attributes of neural networks for extracting continuous vegetation variables from optical and radar measurements
TL;DR: In this article, the advantages and power of neural networks for extracting continuous vegetation variables using optical and/or radar data and ancillary data are discussed and compared to traditional techniques.
Journal Article
A dynamic all parameters adaptive BP neural networks model and its application on oil reservoir prediction
Shiwei Yu, Kejun Zhu, Fengqin Diao, et al.
TL;DR: Results of an application to oil reservoir prediction show that the proposed model, despite its comparatively simple structure, can meet the precision requirements and enhances generalization ability.
References
Book Chapter
Learning internal representations by error propagation
TL;DR: This chapter introduces the generalized delta rule (back-propagation), with sections on the problem setting, the rule itself, simulation results, and further generalizations.
Monograph
Parallel Distributed Processing: Explorations in the Microstructure of Cognition: Foundations
Journal Article
Increased Rates of Convergence Through Learning Rate Adaptation
TL;DR: A study of steepest descent, an analysis of why it can be slow to converge, and four heuristics for achieving faster rates of convergence.