Open Access Proceedings Article
The Cascade-Correlation Learning Architecture
Scott E. Fahlman, Christian Lebiere
Advances in Neural Information Processing Systems, Vol. 2, pp. 524–532
TL;DR: The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
Abstract: Cascade-Correlation is a new architecture and supervised learning algorithm for artificial neural networks. Instead of just adjusting the weights in a network of fixed topology, Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure. Once a new hidden unit has been added to the network, its input-side weights are frozen. This unit then becomes a permanent feature-detector in the network, available for producing outputs or for creating other, more complex feature detectors. The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
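The abstract describes the training procedure only at a high level. Below is a minimal NumPy sketch of that procedure for illustration: the output weights are trained first, then a candidate unit is trained to maximize the correlation S = Σ_o |Σ_p (V_p − V̄)(E_{p,o} − Ē_o)| between its activation V and the residual output errors E, its input-side weights are frozen, and the cycle repeats. The class name, the use of plain gradient descent/ascent in place of Quickprop, the single-candidate pool, the zero initialization of new output connections, and all hyperparameters are assumptions made for this sketch, not details taken from the paper.

```python
# Minimal sketch of the Cascade-Correlation idea, assuming plain gradient
# descent/ascent instead of Quickprop and a single candidate unit per round.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class CascadeCorrelationSketch:
    def __init__(self, n_inputs, n_outputs, lr=1.0, seed=0):
        self.rng = np.random.default_rng(seed)
        self.n_outputs = n_outputs
        self.lr = lr
        self.hidden_weights = []  # frozen input-side weight vectors, one per hidden unit
        # Output connections: one row per input to the output layer (bias + inputs + hidden units).
        self.w_out = self.rng.normal(0, 0.1, (n_inputs + 1, n_outputs))

    def _features(self, X):
        """Bias, raw inputs, and the activations of all frozen hidden units."""
        cols = [np.ones((X.shape[0], 1)), X]
        for w in self.hidden_weights:
            # Each hidden unit sees everything computed before it (cascade structure).
            cols.append(sigmoid(np.hstack(cols) @ w))
        return np.hstack(cols)

    def predict(self, X):
        return sigmoid(self._features(X) @ self.w_out)

    def _train_outputs(self, X, Y, epochs=2000):
        # Only output-side weights are adjusted; frozen hidden units make the
        # feature matrix constant, so it is computed once.
        F = self._features(X)
        for _ in range(epochs):
            P = sigmoid(F @ self.w_out)
            grad = F.T @ ((P - Y) * P * (1 - P)) / len(X)
            self.w_out -= self.lr * grad

    def _train_candidate(self, X, Y, epochs=1000):
        # Train one candidate to maximize S = sum_o |cov(V, E_o)| against the
        # residual errors, which are held fixed; then return its frozen weights.
        F = self._features(X)
        E_centered = (sigmoid(F @ self.w_out) - Y)
        E_centered -= E_centered.mean(axis=0)
        w = self.rng.normal(0, 0.1, (F.shape[1], 1))
        for _ in range(epochs):
            V = sigmoid(F @ w)
            S_o = ((V - V.mean()) * E_centered).sum(axis=0)   # covariance per output
            dV = V * (1 - V)
            grad = F.T @ (dV * (E_centered @ np.sign(S_o)).reshape(-1, 1))
            w += self.lr * grad / len(X)                      # gradient ascent on S
        return w

    def fit(self, X, Y, max_hidden=8, tol=0.05):
        self._train_outputs(X, Y)
        while len(self.hidden_weights) < max_hidden:
            if np.mean((self.predict(X) - Y) ** 2) < tol:
                break
            # Recruit and freeze a new hidden unit, then retrain the output layer.
            self.hidden_weights.append(self._train_candidate(X, Y))
            self.w_out = np.vstack([self.w_out, np.zeros((1, self.n_outputs))])
            self._train_outputs(X, Y)
        return self

if __name__ == "__main__":
    # Two-input XOR: the initial minimal network cannot reach the tolerance,
    # so at least one hidden unit should be recruited.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    Y = np.array([[0], [1], [1], [0]], dtype=float)
    net = CascadeCorrelationSketch(n_inputs=2, n_outputs=1).fit(X, Y)
    print("hidden units recruited:", len(net.hidden_weights))
    print("predictions:", net.predict(X).ravel().round(2))
```

Freezing each recruited unit's input-side weights is what keeps the feature matrix fixed, so only the output connections ever need retraining and no error signals are propagated backward through the network.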
Citations
Journal Article
Predicting internal bond strength of particleboard under outdoor exposure based on climate data: comparison of multiple linear regression and artificial neural network
TL;DR: In this paper, the internal bond strength (IB) of a commercial particleboard exposed to various outdoor conditions was modeled using multiple linear regression (MLR) and an artificial neural network (ANN).
Journal Article
Design of low-cost, real-time simulation systems for large neural networks
M. James, Doan B. Hoang, et al.
TL;DR: This paper analyzes several software and hardware strategies for making real-time simulation of large neural networks feasible and presents a particular multicomputer design able to implement these strategies.
Journal Article
Adaptive structure learning method of deep belief network using neuron generation–annihilation and layer generation
TL;DR: An adaptive DBN is developed using a neuron generation–annihilation and layer generation algorithm that observes the variance of certain parameters; it achieved the highest classification accuracy among several recent DBN- and CNN-based methods.
Proceedings Article
Fault tolerance of feedforward neural nets for classification tasks
Dhananjay S. Phatak, Israel Koren, et al.
TL;DR: A method is proposed to estimate the fault tolerance of feedforward artificial neural nets (ANNs) and to synthesize robust nets; lower bounds on the required redundancy are derived analytically for some canonical problems.
Interference cancellation in EMG signal using ANFIS
C. Kezi, Selva Vijila, Selva Kumar, et al.
TL;DR: The performance evaluation of the proposed Adaptive Neuro-Fuzzy Inference System (ANFIS) shows that it successfully cancels the interference in the EMG signal.
References
Book Chapter
Learning internal representations by error propagation
TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Monograph
Parallel Distributed Processing: Explorations in the Microstructure of Cognition: Foundations
Journal Article
Increased Rates of Convergence Through Learning Rate Adaptation
TL;DR: This paper studies steepest descent, analyzes why it can be slow to converge, and proposes four heuristics for achieving faster rates of convergence.