Journal ArticleDOI

Initializing back propagation networks with prototypes

Thierry Denoeux, +1 more
06 Mar 1993, Vol. 6, Iss. 3, pp. 351-363
TLDR
Simulation results are presented, showing that initializing back propagation networks with prototypes generally results in drastic reductions in training time, improved robustness against local minima, and better generalization.
About
This article was published in Neural Networks on 1993-03-06 and has received 138 citations to date. The article focuses on the topics: Feature vector & Backpropagation.
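The core idea summarized in the TLDR, seeding a network's weights from class prototypes instead of small random values, can be illustrated with a minimal sketch. This is not the authors' exact procedure: the prototype choice (per-class mean vectors), the `scale` parameter, and the bias formula below are illustrative assumptions.

```python
import numpy as np

def prototype_init(X, y, scale=1.0):
    """Sketch of prototype-based initialization: each hidden unit's
    weight vector is set to one class prototype (here, the per-class
    mean), and its bias is chosen so that the unit's pre-activation
    w.x + b is largest for inputs near that prototype."""
    classes = np.unique(y)
    W, b = [], []
    for c in classes:
        proto = X[y == c].mean(axis=0)            # class prototype
        W.append(scale * proto)                   # weight vector points at the prototype
        b.append(-scale * 0.5 * proto @ proto)    # centers the sigmoid on the prototype
    return np.array(W), np.array(b)
```

With this choice, the net input equals `scale * (p.x - ||p||^2 / 2)`, which for inputs of comparable norm is largest when `x` is closest to prototype `p`, so each hidden unit starts out as a detector for one class region instead of a random hyperplane.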


Citations
Journal ArticleDOI

Neural networks in geophysical applications

TL;DR: Techniques are described for faster training, better overall performance (i.e., generalization), and automatic estimation of network size and architecture.
Journal ArticleDOI

A big data urban growth simulation at a national scale: Configuring the GIS and neural network based Land Transformation Model to run in a High Performance Computing (HPC) environment

TL;DR: An overview of a redesigned LTM capable of running at continental scales and at a fine (30m) resolution using a new architecture that employs a Windows-based High Performance Computing (HPC) cluster is provided.
Proceedings ArticleDOI

Learning multidimensional signal processing

TL;DR: This paper presents the general strategy for designing learning machines as well as a number of particular designs based on two main principles: simple adaptive local models; and adaptive model distribution.
Journal ArticleDOI

A weight initialization method for improving training speed in feedforward neural network

TL;DR: The proposed method ensures that the outputs of neurons are in the active region, which increases the rate of convergence. With the optimal initial weights determined, the initial error is substantially smaller and the number of iterations required to reach the error criterion is significantly reduced.
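The "active region" idea described in this TLDR can be sketched as follows. The uniform draw, the input bound `x_max`, and the region limit `a` are assumptions for illustration, not the paper's exact prescription.

```python
import numpy as np

def active_region_init(n_in, n_out, x_max=1.0, a=4.0, seed=0):
    """Draw random weights, then rescale each unit's weight vector so
    that its worst-case net input magnitude, sum_i |w_i| * x_max,
    equals a. This keeps pre-activations inside the sigmoid's active
    region, where the derivative is non-negligible and gradient
    descent makes progress."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(n_out, n_in))
    worst = np.abs(W).sum(axis=1, keepdims=True) * x_max   # per-unit bound on |net|
    return W * (a / worst)
```

The point of the rescaling is to avoid the flat tails of the sigmoid at initialization, where near-zero derivatives would stall backpropagation.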
References
Book ChapterDOI

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Proceedings Article

The Cascade-Correlation Learning Architecture

TL;DR: The Cascade-Correlation architecture has several advantages over existing algorithms: it learns very quickly, the network determines its own size and topology, it retains the structures it has built even if the training set changes, and it requires no back-propagation of error signals through the connections of the network.
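The candidate-scoring step at the heart of Cascade-Correlation, picking the candidate hidden unit whose activation covaries most with the network's residual output errors, can be sketched in a few lines. This shows only that one step of the algorithm, and the function name is illustrative.

```python
import numpy as np

def candidate_score(candidate_act, residuals):
    """Cascade-Correlation candidate score (sketch): the magnitude of
    the covariance between a candidate unit's activation (one value per
    training pattern) and the residual errors (patterns x outputs),
    summed over output units. The candidate with the highest score is
    frozen into the network as a new hidden unit."""
    v = candidate_act - candidate_act.mean()     # centered activations
    e = residuals - residuals.mean(axis=0)       # centered residual errors
    return np.abs(v @ e).sum()                   # sum over outputs of |covariance|
```

Because spent candidates are frozen rather than retrained, the network keeps the structure it has built even when the training set changes, which is the retention property the TL;DR highlights.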