
Showing papers in "Neural Networks in 1988"


Journal ArticleDOI
TL;DR: An historical discussion is provided of the intellectual trends that caused nineteenth century interdisciplinary studies of physics and psychobiology by leading scientists such as Helmholtz, Maxwell, and Mach to splinter into separate twentieth-century scientific movements.

1,586 citations


Journal ArticleDOI
TL;DR: A brief survey of the motivations, fundamentals, and applications of artificial neural networks, as well as some detailed analytical expressions for their theory.

1,418 citations


Journal ArticleDOI
TL;DR: A neural network learning procedure has been applied to the classification of sonar returns from two undersea targets, a metal cylinder and a similarly shaped rock; network performance and classification strategy were comparable to those of trained human listeners.

1,146 citations
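
A minimal sketch of this kind of classifier, assuming a small one-hidden-layer network trained by gradient descent; the synthetic data, dimensions, and learning rate below are illustrative stand-ins for the paper's 60-band sonar spectra:

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative stand-in for spectral envelopes of sonar returns:
    # X holds 60-dimensional feature vectors, y holds labels
    # (1 = metal cylinder, 0 = rock).
    X = rng.normal(size=(200, 60))
    y = (X[:, :5].sum(axis=1) > 0).astype(float)   # synthetic labels

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    W1 = rng.normal(scale=0.1, size=(60, 12))      # input -> hidden
    W2 = rng.normal(scale=0.1, size=(12, 1))       # hidden -> output

    lr = 0.5
    for epoch in range(500):
        h = sigmoid(X @ W1)                        # hidden activations
        p = sigmoid(h @ W2).ravel()                # P(cylinder)
        d2 = ((p - y) / len(y))[:, None]           # cross-entropy gradient
        d1 = (d2 @ W2.T) * h * (1 - h)             # backpropagated error
        W2 -= lr * h.T @ d2
        W1 -= lr * X.T @ d1

    acc = ((sigmoid(sigmoid(X @ W1) @ W2).ravel() > 0.5) == y).mean()
    print(f"training accuracy: {acc:.2f}")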


Journal ArticleDOI
TL;DR: The operation of tolerating positional error a little at a time at each stage, rather than all in one step, plays an important role in endowing the network with an ability to recognize even distorted patterns.

1,037 citations
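
The idea can be seen in a toy 1-D sketch: several stages of small local pooling cover the same shift range as one large pooling step, but in the real network the feature extraction interleaved between stages is what lets it absorb distortion rather than just translation. All sizes here are illustrative:

    import numpy as np

    def local_max_pool(x, radius=1):
        """Tolerate a small positional shift: each unit takes the max over
        a small neighborhood of the previous layer (stride 1, same size)."""
        padded = np.pad(x, radius, mode="constant")
        return np.array([padded[i:i + 2 * radius + 1].max()
                         for i in range(len(x))])

    x = np.zeros(16)
    x[5] = 1.0                      # a single active feature detector

    stagewise = x                   # small tolerance at each of 3 stages...
    for _ in range(3):
        stagewise = local_max_pool(stagewise, radius=1)

    all_at_once = local_max_pool(x, radius=3)   # ...vs. one big step
    print(np.array_equal(stagewise, all_at_once))   # True for pure pooling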


Journal ArticleDOI
TL;DR: This paper derives a generalization of backpropagation to recurrent systems (which feed their own output back as input), such as hybrids of perceptron-style networks and Grossberg/Hopfield networks; the method does not require storing intermediate iterations to deal with continuous recurrence.

960 citations
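
A naive unrolled-in-time sketch of recurrent gradient computation; note that it stores every intermediate state, which is exactly the cost the paper's formulation avoids. Sizes and the quadratic error are illustrative:

    import numpy as np

    rng = np.random.default_rng(1)
    T, n = 20, 4                       # sequence length, state size
    W = rng.normal(scale=0.3, size=(n, n))
    x0 = rng.normal(size=n)
    target = rng.normal(size=n)

    # Forward pass: the network feeds its own output back as input.
    xs = [x0]
    for t in range(T):
        xs.append(np.tanh(W @ xs[-1]))

    # Backward pass: propagate the final-step error back through time.
    grad_W = np.zeros_like(W)
    delta = 2 * (xs[-1] - target)            # d(squared error)/d(output)
    for t in reversed(range(T)):
        pre = W @ xs[t]
        delta = delta * (1 - np.tanh(pre) ** 2)   # through the tanh
        grad_W += np.outer(delta, xs[t])
        delta = W.T @ delta                       # through the recurrence

    W -= 0.01 * grad_W                 # one gradient step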


Journal ArticleDOI
TL;DR: In this paper, an analog model of the first stages of retinal processing has been constructed on a single silicon chip, where each photoreceptor computes the logarithm of the incident light intensity.

510 citations
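
The logarithmic response is commonly idealized as below (an assumed textbook form for subthreshold photoreceptor circuits, not the paper's exact circuit equation); it maps equal contrast ratios to equal voltage differences:

    % Assumed idealization of a logarithmic photoreceptor:
    % V_0 and I_0 are circuit- and device-dependent reference constants.
    V_{\text{out}} \;=\; V_{0} + \frac{kT}{q}\,\ln\!\left(\frac{I}{I_{0}}\right)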


Journal ArticleDOI
TL;DR: A hierarchical arrangement of the transcortical loop and the inverse-dynamics model is applied to learning trajectory control of an industrial robotic manipulator; the control performance of the neural-network model improved gradually during 30 minutes of learning.

500 citations
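
A hedged sketch of feedback-error learning in this spirit, on a toy one-degree-of-freedom plant: the total command is feedback plus inverse-model output, and the feedback command itself serves as the error signal for training the inverse model. The plant, gains, and linear model are all illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(2)

    def plant(state, torque, dt=0.01):
        """Toy 1-DOF 'manipulator': illustrative dynamics, not the paper's robot."""
        pos, vel = state
        acc = torque - 0.5 * vel
        return np.array([pos + vel * dt, vel + acc * dt])

    w = np.zeros(3)                  # linear inverse model mapping
                                     # (desired pos, vel, acc) -> torque
    Kp, Kv, lr = 40.0, 5.0, 0.01     # feedback gains, learning rate
    state = np.array([0.0, 0.0])

    for step in range(5000):
        t = step * 0.01
        des = np.array([np.sin(t), np.cos(t), -np.sin(t)])  # desired trajectory
        u_ff = w @ des                                      # inverse-model torque
        u_fb = Kp * (des[0] - state[0]) + Kv * (des[1] - state[1])
        state = plant(state, u_ff + u_fb)
        # Feedback-error learning: the feedback command doubles as the
        # teaching signal; it shrinks as the inverse model takes over.
        w += lr * u_fb * des

    print("residual feedback command:", abs(u_fb))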


Journal ArticleDOI
TL;DR: In this paper, the authors present a survey of the basic backpropagation neural network architecture, covering the areas of architectural design, performance measurement, function approximation capability, and learning.

486 citations


Journal ArticleDOI
TL;DR: A new statistical neurodynamical method is proposed for analyzing the non-equilibrium dynamical behaviors of an autocorrelation associative memory model; it explains the strange behaviors as due to the strange shapes of the basins of attraction.

400 citations
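
The model under analysis is the standard autocorrelation (outer-product) associative memory; the theory follows the overlap between the network state and the stored pattern across recall steps. A minimal recall sketch with illustrative sizes:

    import numpy as np

    rng = np.random.default_rng(3)
    N, m = 400, 20                           # neurons, stored patterns
    xi = rng.choice([-1, 1], size=(m, N))

    # Autocorrelation (Hebbian outer-product) weights, no self-coupling.
    W = (xi.T @ xi) / N
    np.fill_diagonal(W, 0.0)

    # Recall from a corrupted version of pattern 0 (15% of bits flipped).
    s = xi[0] * np.where(rng.random(N) < 0.15, -1, 1)
    for step in range(10):
        s = np.where(W @ s >= 0, 1, -1)
        overlap = (s @ xi[0]) / N            # the order parameter the theory tracks
        print(step, overlap)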


Journal ArticleDOI
TL;DR: Based on Kohonen's work on self-organizing feature maps, an algorithm is derived for solving the classical Travelling Salesman Problem: given a set of cities defined by their positions in the plane, the network iteratively organizes itself towards a quasi-optimal tour.

347 citations
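
A hedged sketch of the elastic-ring variant of this idea: nodes on a closed ring are repeatedly pulled toward randomly chosen cities, with a neighbourhood that shrinks over time, so the ring settles into a short tour. Node counts, schedules, and rates are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(4)
    cities = rng.random((30, 2))            # city coordinates in the unit square
    n = 90                                  # ring nodes, roughly three per city
    ring = rng.random((n, 2))

    for it in range(4000):
        sigma = 10.0 * np.exp(-it / 1000)   # shrinking neighbourhood width
        lr = 0.8 * np.exp(-it / 2000)       # decaying learning rate
        city = cities[rng.integers(len(cities))]
        winner = np.argmin(((ring - city) ** 2).sum(axis=1))
        d = np.abs(np.arange(n) - winner)   # distance along the ring...
        d = np.minimum(d, n - d)            # ...taken circularly, to keep a tour
        h = np.exp(-(d ** 2) / (2 * sigma ** 2 + 1e-12))
        ring += lr * h[:, None] * (city - ring)

    # Read off the tour: visit cities in the ring order of their nearest node.
    order = np.argsort([np.argmin(((ring - c) ** 2).sum(axis=1)) for c in cities])
    print("tour:", order)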


Journal ArticleDOI
TL;DR: This counterpropagation network functions as a statistically optimal self-adapting look-up table that is applicable to pattern recognition, function approximation, statistical analysis, and data compression.
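
A minimal sketch of the lookup-table behaviour, assuming the usual two-stage structure (a winner-take-all Kohonen layer followed by a Grossberg outstar layer); the target function, table size, and rates are illustrative:

    import numpy as np

    rng = np.random.default_rng(5)

    f = lambda x: np.sin(2 * np.pi * x)    # function to tabulate
    K = 32                                 # table entries
    w_in = rng.random(K)                   # Kohonen layer: input prototypes
    w_out = np.zeros(K)                    # outstar layer: stored outputs

    a, b = 0.1, 0.1
    for _ in range(20000):
        x = rng.random()
        k = np.argmin(np.abs(w_in - x))    # winner-take-all competition
        w_in[k] += a * (x - w_in[k])       # move prototype toward the input
        w_out[k] += b * (f(x) - w_out[k])  # outstar learns the matching output

    # The trained net acts as a piecewise-constant, self-adapting lookup table.
    x = 0.3
    print(w_out[np.argmin(np.abs(w_in - x))], "vs", f(x))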

Journal ArticleDOI
TL;DR: The non-redundant, up to Nth order polynomial expansion of N-dimensional binary vectors is shown to yield orthogonal feature vectors, and optical implementations of quadratic associative memories are described.

Journal ArticleDOI
TL;DR: A model for position-invariant pattern recognition is presented that consists of two layers of neurons, an input layer and a memory and recognition layer, in which patterns are represented by labeled graphs.

Journal ArticleDOI
TL;DR: A neural-network clustering algorithm proposed by T. Kohonen (1986, 1988) is used to design a codebook for the vector quantization of images, and the results are compared with images coded using a codebook designed by the Linde-Buzo-Gray algorithm.
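
A simplified sketch of codebook design by competitive learning in Kohonen's style (no neighbourhood function, decaying rate); the block data and codebook size are illustrative stand-ins for real image blocks:

    import numpy as np

    rng = np.random.default_rng(6)

    # Stand-in for 4x4 image blocks flattened to 16-vectors.
    blocks = rng.random((5000, 16))
    K = 64                                  # codebook size
    codebook = blocks[rng.choice(len(blocks), K, replace=False)].copy()

    for i, v in enumerate(blocks):
        lr = 0.2 * (1 - i / len(blocks))                   # decaying rate
        k = np.argmin(((codebook - v) ** 2).sum(axis=1))   # nearest codeword
        codebook[k] += lr * (v - codebook[k])              # pull it toward v

    # Encoding an image = replacing each block by its nearest codeword's
    # index; LBG (k-means-style batch updates) is the usual baseline.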


Journal ArticleDOI
TL;DR: Certain simple neural network models of associative memory with N binary neurons and symmetric lth order synaptic connections, in which m randomly chosen N-bit patterns are to be stored and retrieved with a small fraction δ of bit errors allowed, are considered.
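
The standard lth order generalization of the outer-product memory has the form below (an assumed form consistent with the abstract, not copied from the paper); for such models the number of storable patterns m is known to grow polynomially in N, on the order of N^(l-1) up to logarithmic factors:

    % Assumed l-th order energy and Hebb-like rule for m patterns \xi^\mu:
    E(\mathbf{s}) = -\sum_{i_1 < \cdots < i_l} J_{i_1 \cdots i_l}\,
        s_{i_1} \cdots s_{i_l},
    \qquad
    J_{i_1 \cdots i_l} = \frac{1}{N^{\,l-1}} \sum_{\mu=1}^{m}
        \xi^{\mu}_{i_1} \cdots \xi^{\mu}_{i_l}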

Journal ArticleDOI
TL;DR: An input correlation rule is applied to a set of fully interconnected coupled oscillators comprising a dynamical model of the olfactory bulb, so as to form “templates” of oscillators with strengthened interconnections with respect to inputs classed as “learned.”
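
A hedged sketch of the correlation rule alone, abstracting away the oscillator dynamics: coupling is strengthened between every pair of units co-activated by an input classed as "learned", forming a template assembly. Sizes and increments are illustrative:

    import numpy as np

    n = 8                                   # oscillator units
    C = 0.1 * np.ones((n, n))               # baseline coupling
    np.fill_diagonal(C, 0.0)

    def learn(pattern, C, dC=0.5):
        """Input correlation rule: strengthen coupling between every pair
        of units co-activated by a 'learned' input, forming a template."""
        active = np.flatnonzero(pattern)
        for i in active:
            for j in active:
                if i != j:
                    C[i, j] += dC
        return C

    C = learn(np.array([1, 1, 0, 0, 1, 0, 0, 0]), C)
    # Units 0, 1, 4 now form a strongly interconnected template; an input
    # overlapping the template recruits the whole assembly's oscillation.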


Journal ArticleDOI
TL;DR: Newman's description of the “energy landscape” is extended somewhat, and the existence of an exponential number of stable states (extraneous memories) with convergence properties similar to those of the fundamental memories is proved.


Journal ArticleDOI
TL;DR: The gastric and pyloric rhythms of the lobster stomatogastric system are presented as possible computational databases for modeling studies.

Journal ArticleDOI
TL;DR: Recognition memory for familiar and unfamiliar spatial-frequency-filtered faces was examined using a combination of experiments and computer simulations based on the physical-systems approach to memory; the results show a significant interaction in recognition performance between the frequency characteristics of the learning and test pictures of the faces.



Journal ArticleDOI
TL;DR: Results for expanded neocognitron architectures operating on complex images of 128 × 128 pixels are described, giving insight into the role of various model parameters and their proper values, as well as demonstrating the model's applicability to complex images.


Journal ArticleDOI
TL;DR: Several design strategies for feed-forward networks are examined within the scope of pattern classification; a hierarchical structure with pairwise training of two-class models proves superior to a single uniform network for speaker-independent word recognition.
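
A sketch of the pairwise idea, assuming one two-class model per class pair combined by voting; the paper's hierarchical routing and actual word-recognition features are omitted, and the least-squares linear discriminant is an illustrative stand-in for the trained two-class networks:

    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(8)
    n_classes, dim = 4, 10
    # Synthetic data: class c clusters around 3 * e_c.
    X = rng.normal(size=(400, dim)) + np.repeat(np.eye(n_classes, dim) * 3,
                                                100, axis=0)
    y = np.repeat(np.arange(n_classes), 100)

    models = {}
    for a, b in combinations(range(n_classes), 2):
        mask = (y == a) | (y == b)
        Xa, ya = X[mask], np.where(y[mask] == a, 1.0, -1.0)
        # One two-class model per pair, trained only on its two classes.
        models[(a, b)] = np.linalg.lstsq(Xa, ya, rcond=None)[0]

    def predict(x):
        votes = np.zeros(n_classes)
        for (a, b), w in models.items():
            votes[a if x @ w > 0 else b] += 1
        return votes.argmax()

    print(predict(X[0]), "expected", y[0])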

Journal ArticleDOI
TL;DR: The basic features of the model to be described here are a 100% interconnected network with symmetrical weights, a continuously changing activation of the nodes between 0 and 1, a "fatigue" of the activability of nodes as a function of time and current activation, a Hebbian-like learning rule, and a strongly negative starting weight of the connections.

Journal ArticleDOI
TL;DR: The Brain-State-in-a-Box neural model is shown to converge to the hypercube's extreme points under mild assumptions; this is possible because the extreme points are the only stable equilibria.
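
A minimal sketch of the dynamics in question, assuming the standard Brain-State-in-a-Box update x <- clip(x + alpha * A x) with symmetric weights; the size, gain, and iteration count are illustrative:

    import numpy as np

    rng = np.random.default_rng(9)
    n = 6
    A = rng.normal(size=(n, n))
    A = (A + A.T) / 2                        # symmetric weights, as assumed

    x = 0.1 * rng.normal(size=n)             # start inside the box
    for _ in range(200):
        x = np.clip(x + 0.1 * (A @ x), -1.0, 1.0)

    # Under the paper's assumptions the trajectory ends at a corner of the
    # hypercube, i.e., every component saturates at +1 or -1.
    print(x)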

Journal ArticleDOI
TL;DR: It is proved that a large class of neural networks with symmetric interaction coefficients admit a global Liapunov function guaranteeing that their trajectories approach equilibrium points, and the equilibria are the stored memories.
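
For systems of the Cohen-Grossberg type that this result covers, the global Liapunov function has the standard form below (assuming nonnegative amplification functions a_i, monotone nondecreasing signal functions d_j, and symmetric coefficients c_ij = c_ji):

    \dot{x}_i = a_i(x_i)\Big[\, b_i(x_i) - \sum_{j=1}^{N} c_{ij}\, d_j(x_j) \Big],
    \qquad c_{ij} = c_{ji},

    V(\mathbf{x}) = -\sum_{i=1}^{N} \int_{0}^{x_i} b_i(\xi)\, d_i'(\xi)\, d\xi
        \;+\; \tfrac{1}{2} \sum_{j,k=1}^{N} c_{jk}\, d_j(x_j)\, d_k(x_k),

    \dot{V} = -\sum_{i=1}^{N} a_i(x_i)\, d_i'(x_i)
        \Big[\, b_i(x_i) - \sum_{j} c_{ij}\, d_j(x_j) \Big]^{2} \;\le\; 0,

so trajectories descend V and approach the set of equilibrium points, which are the stored memories.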