Showing papers in "Neural Networks in 1988"
••
TL;DR: A historical discussion is provided of the intellectual trends that caused nineteenth-century interdisciplinary studies of physics and psychobiology by leading scientists such as Helmholtz, Maxwell, and Mach to splinter into separate twentieth-century scientific movements.
1,586 citations
••
TL;DR: A brief survey is presented of the motivations, fundamentals, and applications of artificial neural networks, along with some detailed analytical expressions for their theory.
1,418 citations
••
TL;DR: A neural network learning procedure has been applied to the classification of sonar returns from two undersea targets, a metal cylinder and a similarly shaped rock, and network performance and classification strategy were comparable to those of trained human listeners.
1,146 citations
••
TL;DR: The operation of tolerating positional error a little at a time at each stage, rather than all in one step, plays an important role in endowing the network with an ability to recognize even distorted patterns.
1,037 citations
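The stage-by-stage tolerance of positional error can be illustrated with a toy max-pooling stand-in (the 2×2 max-pool and all names here are illustrative assumptions, not the model's actual C-cell operation):

```python
import numpy as np

def pool2(a):
    """2x2 max-pool: one stage absorbs up to a pixel of positional error."""
    h, w = a.shape
    return a[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

img = np.zeros((8, 8)); img[2, 2] = 1.0
shifted = np.zeros((8, 8)); shifted[3, 3] = 1.0   # same feature, moved one pixel

# The small shift is absorbed by a single stage; stacking such stages lets the
# network tolerate larger distortions a little at a time rather than all at once.
print(np.array_equal(pool2(img), pool2(shifted)))
```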
••
TL;DR: This paper will derive a generalization of backpropagation to recurrent systems (which input their own output), such as hybrids of perceptron-style networks and Grossberg/Hopfield networks, and does not require the storage of intermediate iterations to deal with continuous recurrence.
960 citations
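One way to see how recurrence can be handled without storing intermediate iterations is fixed-point (recurrent) backpropagation, where the backward pass solves a single adjoint linear system at the settled state instead of replaying the forward iterates. This sketch illustrates that general idea only; the tanh network, sizes, and parameters are assumptions, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 5
W = 0.15 * rng.standard_normal((n, n))  # small weights so the relaxation contracts
b = rng.standard_normal(n)
target = rng.uniform(-0.5, 0.5, n)

# Forward pass: relax the recurrent net x = tanh(W x + b) to its fixed point.
x = np.zeros(n)
for _ in range(200):
    x = np.tanh(W @ x + b)

# Backward pass: solve one adjoint system (I - J^T) z = dE/dx at the fixed
# point, instead of storing and replaying every intermediate iterate.
e = x - target                           # dE/dx for E = 0.5 * ||x - target||^2
J = np.diag(1 - x**2) @ W                # Jacobian of the update at the fixed point
z = np.linalg.solve(np.eye(n) - J.T, e)
grad_W = np.outer((1 - x**2) * z, x)     # dE/dW recovered from the adjoint

# Sanity check of one gradient entry against a finite difference.
i, j, eps = 1, 2, 1e-6
W2 = W.copy(); W2[i, j] += eps
x2 = np.zeros(n)
for _ in range(200):
    x2 = np.tanh(W2 @ x2 + b)
fd = (0.5 * np.sum((x2 - target)**2) - 0.5 * np.sum((x - target)**2)) / eps
print(abs(grad_W[i, j] - fd))
```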
••
TL;DR: In this paper, an analog model of the first stages of retinal processing has been constructed on a single silicon chip, where each photoreceptor computes the logarithm of the incident light intensity.
510 citations
••
TL;DR: A hierarchical arrangement of the transcortical loop and the inverse-dynamics model is applied to learning trajectory control of an industrial robotic manipulator; the control performance of the neural-network model improved gradually during 30 minutes of learning.
500 citations
••
TL;DR: In this paper, the authors present a survey of the basic backpropagation neural network architecture, covering the areas of architectural design, performance measurement, function approximation capability, and learning.
486 citations
••
TL;DR: A new statistical neurodynamical method is proposed for analyzing the non-equilibrium dynamical behaviors of an autocorrelation associative memory model; it explains the strange behaviors as arising from the strangely shaped basins of attraction.
400 citations
••
TL;DR: Based on Kohonen's work on self-organizing feature maps, an algorithm for solving the classical Travelling Salesman Problem is derived that, given a set of cities defined by their positions in the plane, iteratively organizes towards a quasi-optimal solution.
347 citations
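A minimal sketch of this elastic-ring idea, assuming a plain Kohonen-style update on a ring of nodes (all sizes, schedules, and learning rates are illustrative, and the result is only quasi-optimal, not a guaranteed shortest tour):

```python
import numpy as np

rng = np.random.default_rng(2)
cities = rng.uniform(0, 1, (10, 2))     # cities as positions in the plane
m = 30                                  # ring of m nodes, more nodes than cities
nodes = rng.uniform(0, 1, (m, 2))

for t in range(2000):
    lr = 0.8 * (1 - t / 2000)                     # decaying learning rate
    radius = max(1, int(m / 5 * (1 - t / 2000)))  # shrinking neighborhood on the ring
    c = cities[rng.integers(len(cities))]         # present one random city
    w = np.argmin(np.linalg.norm(nodes - c, axis=1))  # winner node
    for d in range(-radius, radius + 1):          # pull winner and ring neighbors
        k = (w + d) % m
        g = np.exp(-(d * d) / (2 * (radius / 2 + 1e-9) ** 2))
        nodes[k] += lr * g * (c - nodes[k])

# Read the tour off the ring: visit cities in the order of their winning nodes.
tour = sorted(range(len(cities)),
              key=lambda i: np.argmin(np.linalg.norm(nodes - cities[i], axis=1)))
print(tour)
```

The shrinking neighborhood is what makes the ring untangle into a short closed path around the cities.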
••
TL;DR: This counterpropagation network functions as a statistically optimal self-adapting look-up table that is applicable to pattern recognition, function approximation, statistical analysis, and data compression.
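The look-up-table behavior can be sketched with a competitive (Kohonen-style) input layer paired with an outstar-style output layer; the 1-D function, table size, and learning rates here are illustrative assumptions, not the paper's network:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.uniform(0, 2 * np.pi, 2000)     # training inputs
Y = np.sin(X)                           # target function to tabulate

K = 20
proto_x = np.linspace(0, 2 * np.pi, K)  # competitive layer: input prototypes
proto_y = np.zeros(K)                   # output layer: associated table entries

for x, y in zip(X, Y):
    w = np.argmin(np.abs(proto_x - x))      # winner-take-all on the input
    proto_x[w] += 0.1 * (x - proto_x[w])    # move winning prototype toward input
    proto_y[w] += 0.1 * (y - proto_y[w])    # adapt the winner's table entry

def lookup(x):
    """Self-adapting look-up table: return the winning prototype's entry."""
    return proto_y[np.argmin(np.abs(proto_x - x))]

print(abs(lookup(1.0) - np.sin(1.0)))   # small residual table error
```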
••
TL;DR: The non-redundant, up to Nth order polynomial expansion of N-dimensional binary vectors is shown to yield orthogonal feature vectors and optical implementations of quadratic associative memories are described.
••
TL;DR: A model for position invariant pattern recognition is presented that consists of two layers of neurons, an input layer, and a memory and recognition layer, where patterns are represented by labeled graphs.
••
TL;DR: A neural-network clustering algorithm proposed by T. Kohonen (1986, 88) is used to design a codebook for the vector quantization of images and the results are compared with coded images when the codebook is designed by the Linde-Buzo-Gray algorithm.
••
TL;DR: Certain simple neural network models of associative memory with N binary neurons and symmetric lth order synaptic connections, in which m randomly chosen N-bit patterns are to be stored and retrieved with a small fraction δ of bit errors allowed, are considered.
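For intuition, a version of such a memory with second-order interactions (each unit's update sums products of pairs of the other units' states, via a Hebbian third-order tensor) can be sketched; the sizes, the number of flipped bits, and the retrieval loop are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
N, m = 64, 4
P = rng.choice([-1.0, 1.0], (m, N))        # m random N-bit patterns to store
# Hebbian storage with second-order interactions: a symmetric third-order tensor.
T = np.einsum('ui,uj,uk->ijk', P, P, P) / N**2

x = P[0].copy()
flip = rng.choice(N, 6, replace=False)
x[flip] *= -1                              # corrupt 6 bits of pattern 0
for _ in range(5):
    # Each unit's field sums products of pairs of the other units' states.
    x = np.sign(np.einsum('ijk,j,k->i', T, x, x))
print(int(np.sum(x != P[0])))              # residual bit errors after retrieval
```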
••
TL;DR: An input correlation rule is applied to a set of fully interconnected coupled oscillators that comprises a dynamical model of the olfactory bulb, so as to form “templates” of oscillators with strengthened interconnections with respect to inputs classed as “learned.”
••
TL;DR: Newman's description of the “energy landscape” is extended somewhat, and the existence of an exponential number of stable states (extraneous memories) with convergence properties similar to those of the fundamental memories is proved.
••
TL;DR: The gastric and pyloric rhythms of the lobster stomatogastric system are presented as possible computational databases for modeling studies.
••
TL;DR: Recognition memory for familiar and unfamiliar spatial frequency filtered faces was examined using a combination of experiments and computer simulations based on the physical-systems approach to memory to show a significant interaction in recognition performance between the frequency characteristics of the learning and test pictures of the faces.
••
TL;DR: Results for expanded neocognitron architectures operating on complex images of 128 × 128 pixels are described, giving insight into the role of various model parameters and their proper values, as well as demonstrating the model's applicability to complex images.
••
TL;DR: Several design strategies for feed-forward networks are examined within the scope of pattern classification and a hierarchical structure with pairwise training of two-class models is superior to a single uniform network for speaker-independent word recognition.
••
TL;DR: The basic features of the model to be described here are a 100% interconnected network with symmetrical weights, a continuously changing activation of the nodes between 0 and 1, a "fatigue" of the activability of nodes as a function of time and current activation, a Hebbian-like learning rule, and a strongly negative starting weight of the connections.
••
TL;DR: The Brain-State-in-a-Box neural model is shown to converge to the hypercube's extreme points under mild assumptions; this is possible because the extreme points are the only stable equilibria.
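A minimal numerical sketch of such dynamics, assuming Hebbian outer-product weights (which make W symmetric and positive semidefinite) so that a trajectory runs to a vertex of the box; all names and parameters are illustrative, not the paper's setting:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16
p1 = rng.choice([-1.0, 1.0], n)                 # a stored corner of the hypercube
p2 = rng.choice([-1.0, 1.0], n)
W = (np.outer(p1, p1) + np.outer(p2, p2)) / n   # symmetric Hebbian weights

def bsb_step(x, W, alpha=0.5):
    """One BSB update: linear feedback, then clip back into the box [-1, 1]^n."""
    return np.clip(x + alpha * W @ x, -1.0, 1.0)

x = 0.3 * p1 + 0.05 * rng.standard_normal(n)    # noisy cue inside the box
for _ in range(100):
    x = bsb_step(x, W)
print(np.array_equal(x, p1))                     # did it settle on the stored vertex?
```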
••
TL;DR: It is proved that a large class of neural networks with symmetric interaction coefficients admit a global Liapunov function guaranteeing that their trajectories approach equilibrium points, and the equilibria are the stored memories.
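The Liapunov property is easy to check numerically for the discrete Hopfield special case (symmetric, zero-diagonal weights and asynchronous sign updates); this sketch illustrates the energy argument only, not the paper's general setting:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 12
A = rng.standard_normal((n, n))
W = (A + A.T) / 2
np.fill_diagonal(W, 0.0)      # symmetric interactions, no self-coupling

def energy(x, W):
    """Liapunov (energy) function for the symmetric network."""
    return -0.5 * x @ W @ x

x = rng.choice([-1.0, 1.0], n)
energies = [energy(x, W)]
for _ in range(100):
    i = rng.integers(n)                        # asynchronous single-unit update
    x[i] = 1.0 if W[i] @ x >= 0 else -1.0      # align unit i with its local field
    energies.append(energy(x, W))

# With symmetric W, each update can only lower (or keep) the energy, so the
# trajectory must approach an equilibrium point.
print(all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:])))
```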