Journal ArticleDOI
Dynamic Node Creation in Backpropagation Networks
TL;DR: A new method called Dynamic Node Creation (DNC) automatically grows BP networks until the target problem is solved, and it yielded a solution for every problem tried.
Abstract: This paper introduces a new method called Dynamic Node Creation (DNC) which automatically grows BP networks until the target problem is solved. DNC sequentially adds nodes one at a time to the hidden layer(s) of the network until the desired approximation accuracy is achieved. Simulation results for parity, symmetry, binary addition, and the encoder problem are presented. The procedure was capable of finding known minimal topologies in many cases, and was always within three nodes of the minimum. Computational expense for finding the solutions was comparable to training normal BP networks with the same final topologies. Starting out with fewer nodes than needed to solve the problem actually seems to help find a solution. The method yielded a solution for every problem tried.
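The growth loop the abstract describes (train with ordinary backpropagation, and splice a fresh hidden node into the network whenever the error curve flattens before the target accuracy is reached) can be sketched as below. This is a minimal illustration on the 2-bit parity (XOR) task, one of the benchmark families named above; the plateau test, learning rate, and names such as `dnc_train` and `add_node` are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(X, W1, b1, W2, b2):
    H = sigmoid(X @ W1 + b1)      # hidden-layer activations
    Y = sigmoid(H @ W2 + b2)      # output activations
    return H, Y

def add_node(W1, b1, W2):
    # Append one hidden unit with small random weights,
    # leaving all previously learned weights untouched.
    W1 = np.hstack([W1, rng.normal(0, 0.1, (W1.shape[0], 1))])
    b1 = np.append(b1, 0.0)
    W2 = np.vstack([W2, rng.normal(0, 0.1, (1, W2.shape[1]))])
    return W1, b1, W2

def dnc_train(X, T, target_err=0.005, lr=0.5, window=2000,
              max_nodes=8, max_rounds=50):
    n_in, n_out = X.shape[1], T.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, 1)); b1 = np.zeros(1)  # start with ONE hidden node
    W2 = rng.normal(0, 0.5, (1, n_out)); b2 = np.zeros(n_out)
    prev_err = np.inf
    for _ in range(max_rounds):
        for _ in range(window):                   # plain batch backpropagation
            H, Y = forward(X, W1, b1, W2, b2)
            dY = (Y - T) * Y * (1 - Y)
            dH = (dY @ W2.T) * H * (1 - H)
            W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(0)
            W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(0)
        _, Y = forward(X, W1, b1, W2, b2)
        err = np.mean((Y - T) ** 2)
        if err < target_err:                      # desired accuracy reached: stop
            break
        if err > 0.95 * prev_err and W1.shape[1] < max_nodes:
            W1, b1, W2 = add_node(W1, b1, W2)     # error has plateaued: grow
        prev_err = err
    return W1, b1, W2, b2

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([[0], [1], [1], [0]], float)         # 2-bit parity (XOR)
W1, b1, W2, b2 = dnc_train(X, T)
_, Y = forward(X, W1, b1, W2, b2)
print("hidden nodes:", W1.shape[1])
```

Consistent with the abstract's observation that starting with fewer nodes than needed seems to help, the sketch begins with a single hidden unit, which cannot represent XOR, and only grows when training stalls.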
Citations
Journal ArticleDOI
Deep learning in neural networks
TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, reviewing deep supervised learning, unsupervised learning, reinforcement learning and evolutionary computation, and indirect search for short programs encoding deep and large networks.
Proceedings Article
The Cascade-Correlation Learning Architecture
TL;DR: The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
Journal ArticleDOI
Introduction to neural networks.
TL;DR: This book is for non-commercial use, as long as it is distributed as a whole in its original form, and the names of the authors and the University of Amsterdam are mentioned.
Journal ArticleDOI
Learning and development in neural networks: the importance of starting small
TL;DR: Possible synergistic interactions between maturational change and the ability to learn a complex domain (language) as investigated in connectionist networks suggest that developmental restrictions on resources may constitute a necessary prerequisite for mastering certain complex domains.
Book ChapterDOI
Theory of the backpropagation neural network
TL;DR: A speculative neurophysiological model illustrating how the backpropagation neural network architecture might plausibly be implemented in the mammalian brain for corticocortical learning between nearby regions of the cerebral cortex is presented.
References
Journal ArticleDOI
Multilayer feedforward networks are universal approximators
Hornik K., Stinchcombe M., White H.
TL;DR: It is rigorously established that standard multilayer feedforward networks with as few as one hidden layer using arbitrary squashing functions are capable of approximating any Borel measurable function from one finite dimensional space to another to any desired degree of accuracy, provided sufficiently many hidden units are available.
Book ChapterDOI
Learning internal representations by error propagation
TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
MonographDOI
Parallel Distributed Processing: Explorations in the Microstructure of Cognition: Foundations