Journal ArticleDOI

An all-analog expandable neural network LSI with on-chip backpropagation learning

Takashi Morie, +1 more
- 01 Sep 1994 - 
- Vol. 29, Iss: 9, pp 1086-1093
TL;DR: An analog neural system made by combining LSI's with feedback connections is promising for implementing continuous-time models of recurrent networks with real-time learning.
Abstract
This paper proposes an all-analog neural network LSI architecture and a new learning procedure called contrastive backpropagation learning. In analog neural LSI's with on-chip backpropagation learning, inevitable offset errors that arise in the learning circuits seriously degrade the learning performance. With the procedure proposed here, offset errors are canceled to a large extent and their effect on the learning performance is minimized. The paper also describes a prototype LSI with 9 neurons and 81 synapses based on the proposed architecture; because it is fully analog and fully parallel, it is capable of continuous neuron-state and continuous-time operation. An analog neural system made by combining such LSI's with feedback connections is therefore promising for implementing continuous-time models of recurrent networks with real-time learning.
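One plausible reading of the offset-cancellation idea can be illustrated numerically. The sketch below is not the paper's circuit-level procedure: it simply assumes each synapse's update circuit adds an unknown constant offset, and shows that taking the difference of two update measurements made with the error signal's sign flipped cancels any offset common to both passes. The network, learning rate, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative single-layer network y = tanh(W x); W and all names are ours.
W = rng.normal(scale=0.1, size=(3, 3))
offset = rng.normal(scale=0.05, size=W.shape)  # unknown fixed offset per update circuit

def measured_update(x, err, lr=0.1):
    """Analog-style weight update: the ideal gradient term plus the circuit offset."""
    return lr * np.outer(err, x) + offset

x = rng.normal(size=3)
target = np.array([0.5, -0.2, 0.1])
err = target - np.tanh(W @ x)

naive = measured_update(x, err)                 # every update is corrupted by the offset
contrastive = (measured_update(x, err)
               - measured_update(x, -err)) / 2  # sign-flipped pass: offsets cancel

ideal = 0.1 * np.outer(err, x)
print(np.abs(naive - ideal).max())        # on the order of the offset magnitude
print(np.abs(contrastive - ideal).max())  # 0.0: the additive offset is gone
```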


Citations
Journal ArticleDOI

Neuromorphic computing using non-volatile memory

TL;DR: The relevant virtues and limitations of non-volatile memory devices are assessed in terms of properties such as conductance dynamic range, (non)linearity and (a)symmetry of conductance response, retention, endurance, required switching power, and device variability.
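A hedged sketch of why the "(non)linearity and (a)symmetry of conductance response" matter for training. The bounded, exponentially saturating update below is a common behavioral model of NVM devices, not something taken from this survey; all parameters are illustrative.

```python
import numpy as np

def step(g, up, g_min=0.0, g_max=1.0, beta=3.0, dg=0.02):
    """Bounded, asymmetric conductance update: steps shrink near the approached rail."""
    if up:   # potentiation pulse
        return g + dg * np.exp(-beta * (g - g_min) / (g_max - g_min))
    else:    # depression pulse
        return g - dg * np.exp(-beta * (g_max - g) / (g_max - g_min))

g = 0.5
for _ in range(10):          # ten potentiation pulses...
    g = step(g, up=True)
for _ in range(10):          # ...followed by ten depression pulses
    g = step(g, up=False)
print(g)  # ends away from 0.5: a zero-net pulse train still drifts the weight
```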
Journal ArticleDOI

Equivalent-accuracy accelerated neural-network training using analogue memory

TL;DR: Mixed hardware–software neural-network implementations that involve up to 204,900 synapses and that combine long-term storage in phase-change memory, near-linear updates of volatile capacitors and weight-data transfer with ‘polarity inversion’ to cancel out inherent device-to-device variations are demonstrated.
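The "polarity inversion" trick can be illustrated with a deliberately simplified toy model. The assumption here is ours, not taken from the summary: the dominant per-device error is a fixed additive offset on each weight-data transfer, so transferring the weight once more with inverted sign and combining the two reads cancels that offset exactly.

```python
import numpy as np

rng = np.random.default_rng(1)
o = rng.normal(scale=0.05)   # fixed per-device transfer offset (illustrative model)

def write_read(value):
    """One imperfect weight-data transfer: the stored value picks up the offset."""
    return value + o

w = 0.3
plain = write_read(w)                           # w + o
inverted = -write_read(-w)                      # transfer -w, negate the read: w - o
print(plain, inverted, (plain + inverted) / 2)  # the average recovers w exactly
```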
Posted Content

A Survey of Neuromorphic Computing and Neural Networks in Hardware.

TL;DR: An exhaustive review of the research conducted in neuromorphic computing since the inception of the term is provided to motivate further work by illuminating gaps in the field where new research is needed.
Journal ArticleDOI

Memristor-Based Multilayer Neural Networks With Online Gradient Descent Training

TL;DR: The proposed memristor-based circuit can compactly implement hardware MNNs trainable by scalable algorithms based on online gradient descent (e.g., backpropagation); the paper demonstrates the utility and robustness of this approach.
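As a hedged illustration of the training style this summary refers to (plain online gradient descent, not the paper's memristor circuit): updates are computed one example at a time, which is what makes them implementable as local, in-place conductance changes. The task, dimensions, and names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
G = rng.normal(scale=0.1, size=(1, 2))   # "conductances" acting as weights

def forward(x):
    return np.tanh(G @ x)

lr = 0.2
for _ in range(2000):                    # online: one example per update
    x = rng.normal(size=2)
    t = 0.5 * x[0] - 0.3 * x[1]          # illustrative target function
    y = forward(x)
    err = (y - t) * (1 - y**2)           # delta backpropagated through tanh
    G -= lr * np.outer(err, x)           # each update is a local conductance change

print(G)  # roughly proportional to (0.5, -0.3) within tanh's near-linear range
```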
Journal ArticleDOI

The Next Generation of Deep Learning Hardware: Analog Computing

TL;DR: A detailed analysis, with design guidelines for how nonvolatile memory materials need to be reengineered for optimal performance in deep learning, shows a strong deviation from the materials used in memory applications.
References
Journal ArticleDOI

A learning algorithm for continually running fully recurrent neural networks

TL;DR: The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal supervised learning tasks.
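The algorithm summarized here is real-time recurrent learning (RTRL). The sketch below shows its core recursion, with an illustrative toy task and dimensions of our choosing: each sensitivity dy_k/dW_ij is carried forward in time alongside the network state, so a gradient step is available at every time step without unrolling the past.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 3, 1                          # recurrent units, external inputs
W = rng.normal(scale=0.5, size=(n, n + m))
y = np.zeros(n)
p = np.zeros((n, n, n + m))          # p[k, i, j] = dy_k / dW[i, j]
lr = 0.01

for t in range(3000):
    x = np.array([np.sin(0.1 * t)])
    z = np.concatenate([y, x])
    y_new = np.tanh(W @ z)
    fprime = 1 - y_new**2
    # Sensitivity recursion: carryover through recurrent weights plus direct term.
    p_new = np.einsum('kl,lij->kij', W[:, :n], p)
    for i in range(n):
        p_new[i, i, :] += z          # unit i depends directly on its own row W[i, :]
    p = fprime[:, None, None] * p_new
    y = y_new
    # Illustrative task: train unit 0 to track a delayed copy of the input.
    err = y[0] - np.sin(0.1 * (t - 5))
    W -= lr * err * p[0]             # gradient of 0.5*err**2 w.r.t. every weight
```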
Journal ArticleDOI

Neural networks at work

Dan Hammerstrom
- 01 Jun 1993 - 
TL;DR: The benefits of neural networks and the types of application for which they are suited are outlined, and four representative applications are described in enough detail to show how they work.
Journal ArticleDOI

A CMOS Four-Quadrant Analog Multiplier

TL;DR: A new configuration for an MOS four-quadrant analog multiplier is presented, based on the square-law characteristics of the MOS transistor; the circuit has floating inputs and linearity better than 0.14 percent.
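The square-law principle such multipliers exploit can be shown as a worked identity (the generic quarter-square argument, not this paper's specific topology). Assume each MOS device in saturation draws I_D = K (V_GS - V_T)^2; then branches driven by the sum and difference of the two inputs isolate the product:

```latex
\[
K (a + b)^2 - K (a - b)^2
  = K \bigl[ (a^2 + 2ab + b^2) - (a^2 - 2ab + b^2) \bigr]
  = 4 K a b .
\]
```

The squared terms cancel regardless of the signs of a and b, which is what gives four-quadrant operation.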
Journal ArticleDOI

Recurrent backpropagation and the dynamical approach to adaptive neural computation

TL;DR: It is now possible to efficiently compute the error gradients for networks that have temporal dynamics, which opens applications to a host of problems in systems identification and control.
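A hedged sketch of the technique generally called recurrent backpropagation (the generic fixed-point method, not necessarily this paper's exact formulation): relax the state dynamics to a fixed point, then relax a second, linear "adjoint" dynamics whose fixed point yields the exact gradient. All sizes and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
W = rng.normal(scale=0.3, size=(n, n))
x = rng.normal(size=n)
target = np.zeros(n); target[0] = 0.5

# Phase 1: relax the state dynamics dy/dt = -y + tanh(W y + x) to a fixed point.
y = np.zeros(n)
for _ in range(500):
    y += 0.1 * (-y + np.tanh(W @ y + x))

# Phase 2: relax the linear adjoint dynamics; its fixed point carries the errors.
fprime = 1 - np.tanh(W @ y + x)**2   # f'(u) at the fixed point
e = y - target                       # output error for E = 0.5 * ||y - target||^2
z = np.zeros(n)
for _ in range(500):
    z += 0.1 * (-z + W.T @ (fprime * z) + e)

grad_W = np.outer(fprime * z, y)     # dE/dW at the fixed point, ready for descent
```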
Journal ArticleDOI

Adaptive neural oscillator using continuous-time back-propagation learning

TL;DR: A neural network model of temporal pattern memory in animal motor systems is proposed: the network receives an external oscillatory input with some desired wave form and, after sufficient learning, autonomously oscillates in that wave form.