
Showing papers in "Neural Networks in 1994"


Journal ArticleDOI
TL;DR: A new self-organizing neural network model with two variants is presented; it performs unsupervised learning and can be used for data visualization, clustering, and vector quantization. Results on the two-spirals benchmark and a vowel classification problem are better than any previously published.

1,319 citations
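
The entry above does not spell out the model's update rules, so the following is only a generic illustration of the kind of unsupervised vector quantization such self-organizing networks perform, not the paper's algorithm; the function name, toy data, and parameters are assumptions chosen for the example.

```python
import numpy as np

def online_vq(data, n_units=10, lr=0.05, epochs=20, seed=0):
    """Minimal online vector quantization: each sample pulls its
    nearest codebook vector slightly toward itself."""
    rng = np.random.default_rng(seed)
    # initialise the codebook from randomly chosen data points
    codebook = data[rng.choice(len(data), n_units, replace=False)].copy()
    for _ in range(epochs):
        for x in data[rng.permutation(len(data))]:
            winner = np.argmin(np.linalg.norm(codebook - x, axis=1))
            codebook[winner] += lr * (x - codebook[winner])
    return codebook

# toy usage: quantize 2-D points drawn from three blobs
rng = np.random.default_rng(1)
blobs = np.vstack([rng.normal(c, 0.1, size=(100, 2))
                   for c in [(0, 0), (1, 1), (0, 1)]])
print(np.round(online_vq(blobs, n_units=3), 2))
```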


Journal ArticleDOI
TL;DR: This competitive Hebbian rule provides a novel approach to constructing topology preserving feature maps and representing intricately structured manifolds, making it particularly useful in applications where neighborhood relations have to be exploited or the shape and topology of submanifolds have to be taken into account.

912 citations
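
As a minimal sketch of a competitive Hebbian rule of this kind: for every input sample, an edge is inserted between the nearest and second-nearest reference vectors, and the resulting graph approximates the topology of the data manifold. The fixed unit positions, ring-shaped toy data, and parameters below are illustrative assumptions; the paper may additionally adapt the reference vectors themselves.

```python
import numpy as np

def competitive_hebbian_edges(data, units):
    """For every input sample, insert an edge between the nearest and
    second-nearest reference vectors; the edge set approximates the
    neighbourhood structure (topology) of the data manifold."""
    edges = set()
    for x in data:
        d = np.linalg.norm(units - x, axis=1)
        first, second = np.argsort(d)[:2]
        edges.add((min(first, second), max(first, second)))
    return edges

# toy usage: units placed on a ring, data drawn near the same ring
theta = np.linspace(0, 2 * np.pi, 12, endpoint=False)
units = np.c_[np.cos(theta), np.sin(theta)]
rng = np.random.default_rng(0)
angles = rng.uniform(0, 2 * np.pi, 500)
data = np.c_[np.cos(angles), np.sin(angles)] + rng.normal(0, 0.05, (500, 2))
print(sorted(competitive_hebbian_edges(data, units)))  # mostly ring-adjacent pairs
```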


Journal ArticleDOI
TL;DR: A class of nonlinear PCA (principal component analysis) type learning algorithms is derived by minimizing a general statistical signal representation error; several known algorithms emerge as special cases of these optimization approaches, which provides useful information on the properties of the algorithms.

396 citations
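
A sketch of one widely cited rule from this nonlinear PCA family is shown below; the paper derives several variants, and this particular form (with g = tanh, reducing to Oja's subspace rule when g is linear) is chosen here only for illustration. The data, learning rate, and dimensions are assumptions.

```python
import numpy as np

def nonlinear_pca_subspace(X, k=2, lr=0.01, epochs=50, seed=0):
    """Nonlinear-PCA-type subspace rule (sketch):
        y = W x,   W <- W + lr * (g(y) x^T - g(y) g(y)^T W),   g = tanh.
    With a linear g this reduces to Oja's linear subspace rule."""
    rng = np.random.default_rng(seed)
    W = rng.normal(0, 0.1, (k, X.shape[1]))
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            y = W @ x
            gy = np.tanh(y)
            W += lr * (np.outer(gy, x) - np.outer(gy, gy) @ W)
    return W

# toy usage: data whose variance is concentrated in the first components
rng = np.random.default_rng(1)
X = rng.normal(0, 1, (500, 5)) * np.array([3.0, 1.0, 0.5, 0.2, 0.1])
print(np.round(nonlinear_pca_subspace(X, k=2), 2))
```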


Journal ArticleDOI
TL;DR: A model of hippocampal function is presented in which the firing rate maps of cells downstream of the hippocampus provide a "population vector" encoding the instantaneous direction of the rat from a previously encountered reward site, enabling navigation to it.

377 citations


Journal ArticleDOI
TL;DR: In this article, a combination method based on the Dempster-Shafer theory of evidence is proposed, which uses statistical information about the relative classification strengths of several classifiers.

376 citations
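
The core combination step such a method relies on is Dempster's rule of combination, sketched below for two classifiers whose outputs have been converted into basic probability assignments. How those assignments are built from statistical information about classifier strengths is the paper's contribution and is not reproduced here; the masses in the toy example are made up.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments, given as dicts mapping frozenset(hypotheses) -> mass."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb          # mass on contradictory evidence
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# toy usage: two classifiers giving partial support to class 'A'
theta = frozenset({"A", "B", "C"})          # frame of discernment
clf1 = {frozenset({"A"}): 0.7, theta: 0.3}  # 0.3 left uncommitted
clf2 = {frozenset({"A"}): 0.6, frozenset({"B"}): 0.1, theta: 0.3}
print(dempster_combine(clf1, clf2))
```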


Journal ArticleDOI
TL;DR: This paper finds that combining a small number of nets whose mistakes are sufficiently uncorrelated yields a combined performance significantly higher than the best obtainable from the individual nets, at negligible effort when starting from a pool of networks produced in the development phase of an application.

323 citations
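
A minimal sketch of the underlying idea follows: nets whose mistakes are weakly correlated can be combined (here by plurality vote) to outperform the best individual net. The toy predictions and the use of majority voting are assumptions for illustration, not the paper's exact combination scheme.

```python
import numpy as np

def majority_vote(predictions):
    """Combine class predictions (n_nets x n_samples) by plurality vote."""
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(),
                               0, predictions)

def error_correlation(predictions, targets):
    """Pairwise correlation of the nets' error indicators (1 = mistake)."""
    errors = (predictions != targets).astype(float)
    return np.corrcoef(errors)

# toy usage with made-up predictions from three nets
targets = np.array([0, 1, 1, 0, 2, 2, 1, 0])
preds = np.array([
    [0, 1, 1, 0, 2, 0, 1, 0],   # net 1: one mistake
    [0, 1, 2, 0, 2, 2, 1, 0],   # net 2: a different mistake
    [1, 1, 1, 0, 2, 2, 1, 0],   # net 3: yet another mistake
])
print("individual accuracies:", (preds == targets).mean(axis=1))
print("combined accuracy:    ", (majority_vote(preds) == targets).mean())
print("error correlations:\n", np.round(error_correlation(preds, targets), 2))
```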


Journal ArticleDOI
TL;DR: It is shown that by using sensitivity analysis, neural networks can provide a reasonable explanation of their predictive behaviour and can model their environment more convincingly than regression models.

309 citations
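
A minimal sketch of input sensitivity analysis is given below, assuming a simple finite-difference perturbation of each input of a trained network; the paper's exact procedure may differ. The random-weight network merely stands in for a trained model.

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """One-hidden-layer network with tanh hidden units (illustrative weights)."""
    return np.tanh(x @ W1 + b1) @ W2 + b2

def input_sensitivity(f, x, eps=1e-4):
    """Finite-difference sensitivity of the output to each input variable:
    how much the prediction moves when one input is nudged by eps."""
    base = f(x)
    sens = np.zeros(len(x))
    for i in range(len(x)):
        xp = x.copy()
        xp[i] += eps
        sens[i] = (f(xp) - base) / eps
    return sens

# toy usage: a fixed random network stands in for a trained one
rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 1, (4, 6)), np.zeros(6)
W2, b2 = rng.normal(0, 1, (6, 1)), np.zeros(1)
x = np.array([0.2, -0.1, 0.5, 0.3])
print(np.round(input_sensitivity(
    lambda v: mlp_forward(v, W1, b1, W2, b2)[0], x), 3))
```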


Journal ArticleDOI
TL;DR: A series of simulations and analyses with modular neural networks is presented, suggesting a number of design principles, in the form of explicit ways in which neural modules can cooperate in recognition tasks, that may supplement recent accounts of the relation between structure and function in the brain.

289 citations


Journal ArticleDOI
TL;DR: Minimisation methods for training feedforward networks with back propagation are compared; conjugate gradient proves to be the superior method because, by using not only the local gradient but also the second derivative of the error function, it requires a much shorter training time.

225 citations
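
As a sketch of the conjugate gradient idea referred to above: each search direction is built from the current gradient plus a multiple of the previous direction, which implicitly exploits curvature information. The Polak-Ribiere update, backtracking line search, and quadratic toy objective below are assumptions, not the paper's specific implementation.

```python
import numpy as np

def conjugate_gradient_minimize(E, grad, w0, iters=50, tol=1e-10):
    """Polak-Ribiere nonlinear conjugate gradient with a simple
    backtracking line search (a sketch, not the paper's exact scheme)."""
    w = w0.astype(float).copy()
    g = grad(w)
    d = -g
    for _ in range(iters):
        if g @ g < tol:                      # gradient vanished: done
            break
        step = 1.0                           # backtracking line search
        while E(w + step * d) > E(w) and step > 1e-12:
            step *= 0.5
        w_new = w + step * d
        g_new = grad(w_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # Polak-Ribiere+
        d = -g_new + beta * d                # mix in the previous direction
        w, g = w_new, g_new
    return w

# toy usage: a quadratic error surface E(w) = 0.5 w^T A w - b^T w,
# whose exact minimiser solves A w = b
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
E = lambda w: 0.5 * w @ A @ w - b @ w
grad = lambda w: A @ w - b
print(np.round(conjugate_gradient_minimize(E, grad, np.zeros(2)), 3))
print(np.round(np.linalg.solve(A, b), 3))    # reference solution
```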


Journal ArticleDOI
TL;DR: It is shown that under appropriate conditions, it may be possible to design efficient neural controllers for nonlinear multivariable systems for which linear controllers are inadequate.

182 citations


Journal ArticleDOI
TL;DR: An application of layered neural networks to nonlinear power systems control is described; the need to model the dynamic system is avoided by introducing the Jacobian matrices of the system into the back propagation chain used in controller training.

Journal ArticleDOI
TL;DR: The proposed deterministic annealing neural network is shown to be capable of generating optimal solutions to convex programming problems; the conditions for asymptotic stability, solution feasibility, and solution optimality are derived.

Journal ArticleDOI
TL;DR: A novel method is proposed for solving the assignment problem using techniques adapted from statistical physics to derive a convex effective energy function whose unique minimum corresponds to the optimal assignment.
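
A sketch in the spirit of such statistical-physics relaxations of the assignment problem is given below: match variables follow a Gibbs distribution over the benefit matrix, and alternating row/column normalisation (Sinkhorn iterations) enforces the assignment constraints, with a near-hard assignment recovered at large beta. This illustrates the general approach, not the paper's effective-energy derivation; the benefit matrix and parameters are made up.

```python
import numpy as np

def soft_assignment(benefit, beta=20.0, sinkhorn_iters=200):
    """Relax the assignment problem with a Gibbs distribution over matches
    (M ~ exp(beta * benefit)) and enforce doubly stochastic row/column
    constraints by alternating normalisation (Sinkhorn iterations)."""
    M = np.exp(beta * (benefit - benefit.max()))   # shift avoids overflow
    for _ in range(sinkhorn_iters):
        M /= M.sum(axis=1, keepdims=True)   # rows sum to 1
        M /= M.sum(axis=0, keepdims=True)   # columns sum to 1
    return M

# toy usage: 3 workers x 3 jobs benefit matrix
benefit = np.array([[4.0, 1.0, 3.0],
                    [2.0, 0.0, 5.0],
                    [3.0, 2.0, 2.0]])
M = soft_assignment(benefit)
print(np.round(M, 2))          # near-permutation matrix
print(M.argmax(axis=1))        # recovered assignment per row
```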

Journal ArticleDOI
TL;DR: In this paper, the convergence rate of RBF nets with respect to the number of hidden units is investigated and the existence of a consistent estimator for RBF networks is proven constructively.

Journal ArticleDOI
TL;DR: The proposed fuzzy algorithms have several distinctive features, such as converging more often to the desired solutions or, equivalently, reducing the likelihood of neuron underutilization, which has long been a major shortcoming of crisp competitive learning.
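
A sketch of the flavour of fuzzy competitive learning follows: every prototype is updated for every sample, weighted by a graded (fuzzy) membership, so no prototype is starved of updates the way a crisp winner-take-all loser can be. The membership formula (fuzzy-c-means style), the fuzzifier m, and the toy data are assumptions, not the paper's specific algorithms.

```python
import numpy as np

def fuzzy_competitive_step(data, prototypes, m=2.0, lr=0.1):
    """One pass of fuzzy membership-weighted updates: each prototype moves
    toward every sample, weighted by a soft (graded) win."""
    for x in data:
        d = np.linalg.norm(prototypes - x, axis=1) + 1e-12
        u = d ** (-2.0 / (m - 1.0))          # fuzzy-c-means-style memberships
        u /= u.sum()
        prototypes += lr * (u ** m)[:, None] * (x - prototypes)
    return prototypes

# toy usage: two clusters, three prototypes, deliberately poor initialisation
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 0.1, (100, 2)), rng.normal(1, 0.1, (100, 2))])
prototypes = rng.normal(0.5, 0.01, (3, 2))
for _ in range(30):
    prototypes = fuzzy_competitive_step(data, prototypes)
print(np.round(prototypes, 2))   # all three prototypes end up used
```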

Journal ArticleDOI
TL;DR: It is shown by mathematical analyses that WTA-type neural networks composed of nonlinear dynamic model neurons, characterized by a nonlinear loss term, have the WTA property even when their neurons have nonidentical characteristics and the interconnections have nonidentical strengths.

Journal ArticleDOI
TL;DR: A spectral timing model is developed to explain how the cerebellum learns adaptively timed responses during the rabbit's conditioned nictitating membrane response (NMR) and reproduces key behavioral features of the NMR.

Journal ArticleDOI
TL;DR: A distributed model of the spatiotemporal neural processing that underlies the control of two-dimensional saccadic eye movements in the monkey is presented; it produces accurate eye movements and realistic neural discharge for saccades evoked under a variety of experimental conditions not included in the training set.

Journal ArticleDOI
TL;DR: This contribution describes a procedure for determining the optimal topology of a static three-layer neural network based on a canonical decomposition technique that establishes a link between the number of neurons in each hidden layer and the dimensions of the subspaces of the canonical decompositions.

Journal ArticleDOI
TL;DR: Feedforward processing does not deny the possibility that top-down influences, although poorly understood, may play a role in nulling image aspects that are predictable in appearance and/or not the object of attention, so that only features containing relevant discriminatory information are processed further.

Journal ArticleDOI
TL;DR: A flexible neural mechanism for invariant recognition based on correlated neuronal activity and the self-organization of dynamic links is proposed which allows an unsupervised decision of whether a given input pattern matches with a stored model pattern.

Journal ArticleDOI
TL;DR: An algorithm based on the back propagation procedure dynamically configures the structure of feedforward multilayered neural networks; its potential for control applications is demonstrated.

Journal ArticleDOI
TL;DR: A new competitive learning algorithm with a selection mechanism, called the CSL (competitive and selective learning) algorithm, which is based on the equidistortion principle and can obtain better performance without a particular initialization procedure even when the input data cluster in a number of regions in the input vector space.

Journal ArticleDOI
TL;DR: A hybrid algorithm that combines the modified back-propagation method and the random optimization method is proposed to find the global minimum of the total error function of a neural network in a small number of steps; it is shown to ensure convergence to a global minimum with probability 1 in a compact region of the weight vector space.
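
The hybrid idea can be sketched as alternating a gradient step with a random perturbation that is kept only when it lowers the error (Matyas-style random optimization). The multimodal toy objective below merely stands in for a network's error surface; the schedule, noise level, and acceptance rule are assumptions rather than the paper's exact method or its convergence conditions.

```python
import numpy as np

def hybrid_minimize(E, grad, w0, iters=2000, lr=0.01, noise=0.5, seed=0):
    """Alternate a gradient-descent step with a random perturbation that is
    accepted only if it lowers the error, giving a chance of escaping
    poor local minima."""
    rng = np.random.default_rng(seed)
    w = w0.copy()
    for _ in range(iters):
        w = w - lr * grad(w)                 # local (gradient) refinement
        trial = w + rng.normal(0, noise, w.shape)
        if E(trial) < E(w):                  # accept only improving jumps
            w = trial
    return w

# toy usage: a multimodal stand-in for a network's error surface,
# E(w) = sum(w^2) + 2*sum(1 - cos(3w)), global minimum at w = 0
E = lambda w: np.sum(w ** 2) + 2.0 * np.sum(1.0 - np.cos(3.0 * w))
grad = lambda w: 2.0 * w + 6.0 * np.sin(3.0 * w)
w = hybrid_minimize(E, grad, w0=np.array([2.5, -2.2]))
print(np.round(w, 3), round(float(E(w)), 4))
```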

Journal ArticleDOI
TL;DR: The dynamics of learning in a cortical model are described, focusing on the exponential growth produced by allowing synaptic transmission at previously modified synapses during learning of a new pattern.

Journal ArticleDOI
TL;DR: A new approach to Darwinian learning over a repertoire of competing neural nets is described; it includes a credit apportionment algorithm for individual neurons within the networks, providing the basis for implementing a competition among single neurons across all of the networks.

Journal ArticleDOI
TL;DR: The application, the neural architectures and algorithms, the current status, and the lessons learned in developing a neural network system for production use in industry are described.

Journal ArticleDOI
TL;DR: A real-time, view-based neurocomputational architecture for unsupervised 2-D mapping and localization within a 3-D environment defined by a spatially distributed set of visual landmarks, which emulates place learning by hippocampal place cells in rats.

Journal ArticleDOI
TL;DR: A novel scheme for sensory-based navigation of a mobile robot is presented; the robot is trained to learn a goal-directed task under adequate supervision, utilizing local sensory inputs and focusing on the topological changes of the temporal sensory flow.

Journal ArticleDOI
TL;DR: An information theoretic method, called dynamic node architecture learning (DNAL), is presented that eliminates the need to select network size before training by building the appropriate network architecture dynamically during the training process.
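
A sketch of dynamic node addition in this spirit is given below: training starts with a single hidden unit and another is appended whenever a training stage stops improving the error appreciably. The growth criterion used here is a plain error-plateau test; the information theoretic criterion that defines DNAL is not reproduced, and all parameters are illustrative.

```python
import numpy as np

def train_growing_net(X, y, max_hidden=10, epochs_per_stage=500,
                      lr=0.05, tol=1e-3, seed=0):
    """Start with one tanh hidden unit and add another whenever a training
    stage fails to improve the mean squared error by more than `tol`."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    W1, b1 = rng.normal(0, 0.5, (n, 1)), np.zeros(1)
    W2, b2 = rng.normal(0, 0.5, (1, 1)), np.zeros(1)

    def mse():
        return float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))

    prev = np.inf
    while True:
        for _ in range(epochs_per_stage):        # plain batch backprop
            H = np.tanh(X @ W1 + b1)
            out = H @ W2 + b2
            d_out = 2.0 * (out - y) / len(X)
            d_H = d_out @ W2.T * (1.0 - H ** 2)
            W2 -= lr * H.T @ d_out; b2 -= lr * d_out.sum(0)
            W1 -= lr * X.T @ d_H;   b1 -= lr * d_H.sum(0)
        err = mse()
        if prev - err < tol or W1.shape[1] >= max_hidden:
            return (W1, b1, W2, b2), err
        prev = err
        # grow: append a freshly initialised hidden unit
        W1 = np.hstack([W1, rng.normal(0, 0.5, (n, 1))])
        b1 = np.append(b1, 0.0)
        W2 = np.vstack([W2, rng.normal(0, 0.5, (1, 1))])

# toy usage: a 1-D regression target that a single hidden unit cannot fit
X = np.linspace(-2, 2, 80)[:, None]
y = np.sin(2.0 * X)
(_, _, W2, _), err = train_growing_net(X, y)
print("hidden units:", W2.shape[0], "final mse:", round(err, 4))
```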