
Showing papers on "Neuromorphic engineering published in 1987"


Proceedings Article
01 Jan 1987
TL;DR: A family of learning algorithms that operate on a recurrent, symmetrically connected, neuromorphic network that, like the Boltzmann machine, settles in the presence of noise, including a supervised learning algorithm for networks with analog activation functions.
Abstract: We describe a family of learning algorithms that operate on a recurrent, symmetrically connected, neuromorphic network that, like the Boltzmann machine, settles in the presence of noise. These networks learn by modifying synaptic connection strengths on the basis of correlations seen locally by each synapse. We describe a version of the supervised learning algorithm for a network with analog activation functions. We also demonstrate unsupervised competitive learning with this approach, where weight saturation and decay play an important role, and describe preliminary experiments in reinforcement learning, where noise is used in the search procedure. We identify the above-described phenomena as elements that can unify learning techniques at a physical microscopic level. These algorithms were chosen for ease of implementation in VLSI. We have designed a CMOS test chip in 2-micron rules that can speed up learning by about a millionfold over an equivalent simulation on a VAX 11/780. The speedup is due to parallel analog computation for summing and multiplying weights and activations, and the use of physical processes for generating random noise. The components of the test chip are a noise amplifier, a neuron amplifier, and a 300-transistor adaptive synapse, each of which is separately testable. These components are also integrated into a network of 6 neurons and 15 synapses. Finally, we point out techniques for reducing the area of the electronic correlational synapse both in technology and design, and show how the algorithms we study can be implemented naturally in electronic systems.
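
The locally-correlational update the abstract describes is easy to sketch in software. Below is a minimal illustration, assuming a contrastive-Hebbian formulation (a clamped "teacher" phase and a free phase, each settled with injected noise and tanh analog activations); the function names, noise model, and parameters are illustrative, not the paper's actual circuit-level implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def settle(W, clamp_vals, clamp_mask, noise=0.05, steps=50):
    """Relax the symmetric network to a (noisy) fixed point.
    Clamped units are held at clamp_vals; the rest follow tanh dynamics."""
    x = np.where(clamp_mask, clamp_vals, 0.0)
    for _ in range(steps):
        h = W @ x + noise * rng.standard_normal(len(x))   # noisy net input
        x = np.where(clamp_mask, clamp_vals, np.tanh(h))  # analog activation
    return x

def contrastive_update(W, inp, target, in_mask, out_mask, lr=0.01):
    """One weight update from correlations seen locally by each synapse."""
    clamp = np.where(in_mask, inp, 0.0) + np.where(out_mask, target, 0.0)
    plus = settle(W, clamp, in_mask | out_mask)   # teacher phase: I/O clamped
    minus = settle(W, clamp, in_mask)             # free phase: inputs only
    # Each synapse moves by the difference of the local correlations it sees.
    W += lr * (np.outer(plus, plus) - np.outer(minus, minus))
    np.fill_diagonal(W, 0.0)                      # no self-connections
    return W
```

For the unsupervised competitive variant the abstract mentions, the same loop would add weight saturation (clipping W to a fixed range) and a small decay term after each update.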

55 citations


Proceedings Article
01 Jan 1987
TL;DR: A family of neuromorphic networks specifically designed for communications and optical signal-processing applications is presented; these networks can implement general functions such as code filtering, code mapping, code joining, code shifting, and code projecting.
Abstract: A family of neuromorphic networks specifically designed for communications and optical signal-processing applications is presented. The information is encoded using sparse Optical Orthogonal Code sequences built from unipolar, binary (0,1) signals. The generalized synaptic connectivity matrix is likewise unipolar and clipped to binary (0,1) values. In addition to providing high-capacity associative memory, the resulting neural networks can implement general functions such as code filtering, code mapping, code joining, code shifting, and code projecting.
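
A minimal sketch of the clipped, unipolar associative memory described above: storage clips the sum of outer products to (0,1), and recall thresholds a matrix-vector sum against the number of active pulses in the probe. The codewords here are hand-picked sparse (0,1) sequences for illustration only, not a true Optical Orthogonal Code construction.

```python
import numpy as np

def store(codewords):
    """Clipped outer-product storage: T[i, j] = 1 if positions i and j
    co-fire in any stored codeword (unipolar, clipped to (0,1))."""
    n = codewords.shape[1]
    T = np.zeros((n, n), dtype=np.uint8)
    for c in codewords:
        T |= np.outer(c, c).astype(np.uint8)
    return T

def recall(T, probe):
    """One-shot recall: sum unipolar inputs through T, then threshold
    at the number of active pulses in the probe."""
    return (T @ probe >= probe.sum()).astype(np.uint8)

# Sparse weight-3 (0,1) sequences of length 12 (illustrative only).
codes = np.array([[1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0],
                  [0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0]], dtype=np.uint8)
T = store(codes)
probe = codes[0].copy()
probe[3] = 0                    # drop one pulse from the first codeword
print(recall(T, probe))         # recovers the full codeword despite the loss
```

The low cross-correlation of sparse orthogonal codes is what keeps the clipped matrix from producing false positives at this threshold.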

33 citations


Journal ArticleDOI
TL;DR: A combinatorial optimization methodology is developed that enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions.
Abstract: A combinatorial optimization methodology is developed that enables the efficient use of hypercube multiprocessors onboard mobile intelligent robots dedicated to time-critical missions. The methodology is implemented as large-scale concurrent algorithms based either on fast simulated annealing or on nonlinear asynchronous neural networks. In particular, analytic expressions are given for the effect of single-neuron perturbations on the system's configuration energy. Compact neuromorphic data structures are used to model effects such as precedence constraints, processor idling times, and task-schedule overlaps. Results for a typical robot-dynamics benchmark are presented.
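
The single-neuron perturbation expression mentioned above has a simple closed form for a symmetric binary network, and it is what drives each annealing step. The sketch below assumes a generic energy E(s) = -1/2 s·W·s - b·s with geometric cooling; it illustrates the Metropolis-style flip test, not the paper's specific task-scheduling formulation or data structures.

```python
import numpy as np

rng = np.random.default_rng(1)

def delta_E(W, b, s, k):
    """Energy change from flipping binary neuron k. For
    E(s) = -1/2 * s @ W @ s - b @ s  (W symmetric, zero diagonal),
    flipping s[k] -> 1 - s[k] gives dE = -(1 - 2*s[k]) * (W[k] @ s + b[k])."""
    return -(1 - 2 * s[k]) * (W[k] @ s + b[k])

def anneal(W, b, s, T0=2.0, cooling=0.95, sweeps=200):
    """Simulated annealing over single-neuron flips (Metropolis rule)."""
    T = T0
    for _ in range(sweeps):
        for k in rng.permutation(len(s)):
            dE = delta_E(W, b, s, k)
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[k] = 1 - s[k]          # accept the perturbation
        T *= cooling                     # geometric cooling schedule
    return s
```

Because dE is computed from one row of W, each flip test is O(n) and the flips parallelize naturally across processors, which is the property the concurrent hypercube implementation exploits.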

25 citations