Open Access · Proceedings Article
The Cascade-Correlation Learning Architecture
Scott E. Fahlman, Christian Lebiere
Vol. 2, pp. 524–532
TLDR
The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
Abstract
Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature-detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
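To make the growth loop concrete, below is a minimal NumPy sketch of the idea, not the authors' implementation: it uses a single tanh output unit and one candidate per round trained by plain gradient descent/ascent, whereas the paper trains a pool of candidate units with Quickprop and recruits the best one. Names such as `cascade_correlation` and `train_candidate` are illustrative.

```python
# Minimal sketch of the Cascade-Correlation loop (assumptions: NumPy, one
# tanh output, one candidate per round; the original uses Quickprop and a
# candidate pool). All function names here are illustrative.
import numpy as np

def train_outputs(X, y, w, lr=0.5, epochs=500):
    """Train only the output weights by gradient descent on squared error."""
    for _ in range(epochs):
        pred = np.tanh(X @ w)
        err = pred - y
        w -= lr * (X.T @ (err * (1.0 - pred**2))) / len(X)
    return w

def train_candidate(X, residual, lr=0.5, epochs=500):
    """Train a candidate's input weights to maximize the magnitude of the
    covariance S between its activation V and the residual error E; for a
    single output, S = |sum_p (V_p - mean V)(E_p - mean E)|."""
    v = 0.1 * np.random.randn(X.shape[1])
    e_c = residual - residual.mean()      # errors are held fixed meanwhile
    for _ in range(epochs):
        a = np.tanh(X @ v)
        sign = np.sign((a - a.mean()) @ e_c)      # sign of the correlation
        grad = X.T @ (sign * e_c * (1.0 - a**2))  # gradient ascent on S
        v += lr * grad / len(X)
    return v

def cascade_correlation(X, y, max_hidden=8, tol=1e-3):
    X = np.hstack([X, np.ones((len(X), 1))])      # inputs plus a bias column
    w = np.zeros(X.shape[1])                      # start with a minimal net
    frozen = []                                   # frozen hidden-unit weights
    for _ in range(max_hidden):
        w = train_outputs(X, y, w)
        residual = np.tanh(X @ w) - y
        if np.mean(residual**2) < tol:
            break
        v = train_candidate(X, residual)          # input weights frozen below
        frozen.append(v)
        X = np.hstack([X, np.tanh(X @ v)[:, None]])  # new unit feeds outputs
        w = np.append(w, 0.0)                        # and all later candidates
    return frozen, train_outputs(X, y, w)

def predict(X, frozen, w):
    X = np.hstack([X, np.ones((len(X), 1))])
    for v in frozen:                      # replay units in the order added
        X = np.hstack([X, np.tanh(X @ v)[:, None]])
    return np.tanh(X @ w)
```

As a quick check, XOR-style data (X of shape (4, 2), targets near ±1) is typically fit after recruiting a couple of hidden units, though plain gradient descent needs more epochs than the Quickprop-based original.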
Citations
Proceedings Article
Image Segmentation with Cascaded Hierarchical Models and Logistic Disjunctive Normal Networks
TL;DR: This work proposes a multi-resolution contextual framework, called the cascaded hierarchical model (CHM), which learns contextual information hierarchically for image segmentation, and introduces a novel classification scheme, called logistic disjunctive normal networks (LDNN), which outperforms state-of-the-art classifiers and can be used within the CHM to improve segmentation performance.
Journal Article
Design and Application of a Variable Selection Method for Multilayer Perceptron Neural Network With LASSO
TL;DR: The results show that the proposed approach constructs a more compact model while achieving higher prediction accuracy than other existing methods.
Journal Article
Application of Cascade Correlation Networks for Structures to Chemistry
TL;DR: This work reports results obtained for QSPR on alkanes (predicting the boiling point) and QSAR of a class of benzodiazepines, and shows the approach is competitive with ‘ad hoc’ MLPs for the QSPR problem.
Book Chapter
Time Series Prediction with the Self-Organizing Map: A Review
TL;DR: The main goal of the paper is to show that, despite being originally designed as an unsupervised learning algorithm, the SOM is flexible enough to give rise to a number of efficient supervised neural architectures for time series prediction (TSP) tasks.
Journal Article
Automated Cellular Modeling and Prediction on a Large Scale
TL;DR: This work describes CHAMP (CHurn Analysis, Modeling, and Prediction), an automated system for modeling cellular subscriber churn, that is, for predicting which customers will discontinue cellular phone service.
References
Book Chapter
Learning internal representations by error propagation
TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Monograph
Parallel Distributed Processing: Explorations in the Microstructure of Cognition: Foundations
Journal Article
Increased Rates of Convergence Through Learning Rate Adaptation
TL;DR: This work presents a study of steepest descent, analyzes why it can be slow to converge, and proposes four heuristics for achieving faster rates of convergence.