Journal Article
An all-analog expandable neural network LSI with on-chip backpropagation learning
Takashi Morie, Yoshihito Amemiya, et al.
TL;DR: An analog neural system made by combining LSI's with feedback connections is promising for implementing continuous-time models of recurrent networks with real-time learning.
Abstract: This paper proposes an all-analog neural network LSI architecture and a new learning procedure called contrastive backpropagation learning. In analog neural LSI's with on-chip backpropagation learning, inevitable offset errors that arise in the learning circuits seriously degrade the learning performance. With the proposed procedure, offset errors are canceled to a large extent and their effect on the learning performance is minimized. This paper also describes a prototype LSI with 9 neurons and 81 synapses based on the proposed architecture; because it is fully analog and fully parallel, it is capable of continuous neuron-state and continuous-time operation. An analog neural system made by combining such LSI's with feedback connections is therefore promising for implementing continuous-time models of recurrent networks with real-time learning.
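The offset-cancellation principle behind contrastive learning schemes can be illustrated numerically. This is a minimal sketch of the principle only, not the paper's circuit-level procedure: it assumes the analog learning circuit reports the true gradient plus a fixed additive offset, and that inverting the signal polarity flips the sign of the true gradient but not of the offset. The function name `measured_grad` and all values are hypothetical.

```python
import numpy as np

# Toy model: the analog learning circuit reports the backprop gradient
# plus a fixed, device-dependent offset error (hypothetical).
def measured_grad(true_grad, offset):
    return true_grad + offset

rng = np.random.default_rng(1)
true_grad = rng.normal(size=4)          # hypothetical gradient values
offset = rng.normal(scale=0.1, size=4)  # hypothetical circuit offsets
lr = 0.1

# Naive update: the offset error leaks into every weight change.
naive_update = -lr * measured_grad(true_grad, offset)

# Contrastive idea: take two measurements with opposite signal polarity,
# so the true gradient flips sign while the circuit offset does not,
# then difference them -- the offset cancels.
g_plus = measured_grad(+true_grad, offset)
g_minus = measured_grad(-true_grad, offset)
contrastive_update = -lr * 0.5 * (g_plus - g_minus)

# The contrastive update matches the offset-free gradient step.
assert np.allclose(contrastive_update, -lr * true_grad)
```

Differencing two opposite-polarity measurements removes any component of the error that is constant across the two phases, which is why a fixed circuit offset cancels while the gradient survives.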
Citations
Journal Article
Neuromorphic computing using non-volatile memory
Geoffrey W. Burr, Robert M. Shelby, Abu Sebastian, Sangbum Kim, Seyoung Kim, Severin Sidler, Kumar Virwani, Masatoshi Ishii, Pritish Narayanan, Alessandro Fumarola, Lucas L. Sanches, Irem Boybat, Manuel Le Gallo, Kibong Moon, Jiyoo Woo, Hyunsang Hwang, Yusuf Leblebici, et al.
TL;DR: The relevant virtues and limitations of these devices are assessed, in terms of properties such as conductance dynamic range, (non)linearity and (a)symmetry of conductance response, retention, endurance, required switching power, and device variability.
Journal Article
Equivalent-accuracy accelerated neural-network training using analogue memory
Stefano Ambrogio, Pritish Narayanan, Hsinyu Tsai, Robert M. Shelby, Irem Boybat, Carmelo di Nolfo, Severin Sidler, Massimo Giordano, Martina Bodini, Nathan C. P. Farinha, Benjamin Killeen, Christina Cheng, Yassine Jaoudi, Geoffrey W. Burr, et al.
TL;DR: Mixed hardware–software neural-network implementations that involve up to 204,900 synapses and that combine long-term storage in phase-change memory, near-linear updates of volatile capacitors and weight-data transfer with ‘polarity inversion’ to cancel out inherent device-to-device variations are demonstrated.
Posted Content
A Survey of Neuromorphic Computing and Neural Networks in Hardware.
Catherine D. Schuman, Thomas E. Potok, Robert M. Patton, J. Douglas Birdwell, Mark Edward Dean, Garrett S. Rose, James S. Plank, et al.
TL;DR: An exhaustive review of the research conducted in neuromorphic computing since the inception of the term is provided to motivate further work by illuminating gaps in the field where new research is needed.
Journal Article
Memristor-Based Multilayer Neural Networks With Online Gradient Descent Training
TL;DR: The proposed memristor-based circuit can compactly implement hardware MNNs trainable by scalable algorithms based on online gradient descent (e.g., backpropagation), and its utility and robustness are demonstrated.
Journal Article
The Next Generation of Deep Learning Hardware: Analog Computing
TL;DR: A detailed analysis, with design guidelines for how nonvolatile memory materials need to be reengineered for optimal performance in the deep learning space, shows a strong deviation from the materials used in memory applications.
References
Journal Article
A learning algorithm for continually running fully recurrent neural networks
Ronald J. Williams, David Zipser, et al.
TL;DR: The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal supervised learning tasks.
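The gradient-following idea in Williams and Zipser's algorithm (real-time recurrent learning, RTRL) can be sketched for a tiny network. This is a minimal sketch under simplifying assumptions, not the paper's full formulation: a fixed target state instead of a temporal task, a fixed untrained bias in place of external inputs, and tanh units. The network size, learning rate, and target values are hypothetical.

```python
import numpy as np

# Tiny fully recurrent network trained online with RTRL: the sensitivities
# p[i, j, k] = d y_k / d W_ij are carried forward in time along with the
# network state, so the gradient is available at every step.
rng = np.random.default_rng(0)
n = 3
W = rng.normal(0.0, 0.5, (n, n))     # recurrent weights (trained)
b = rng.normal(0.0, 0.5, n)          # fixed bias input (not trained)
y = np.zeros(n)                      # unit outputs
p = np.zeros((n, n, n))              # sensitivities d y_k / d W_ij

lr = 0.05
target = np.array([0.5, -0.5, 0.0])  # hypothetical fixed target state
errs = []

for t in range(300):
    s = W @ y + b
    y_new = np.tanh(s)
    fp = 1.0 - y_new ** 2            # tanh'(s)

    # RTRL recursion:
    # p'_{ij,k} = f'(s_k) * ( sum_l W_{kl} p_{ij,l} + delta_{ki} y_j )
    p_new = np.empty_like(p)
    for i in range(n):
        for j in range(n):
            p_new[i, j] = fp * (W @ p[i, j])
            p_new[i, j, i] += fp[i] * y[j]

    e = target - y_new
    errs.append(np.linalg.norm(e))
    # Gradient-following weight update, applied online at every time step.
    W += lr * np.einsum('k,ijk->ij', e, p_new)
    y, p = y_new, p_new
```

Carrying the full sensitivity tensor costs O(n^3) memory and O(n^4) time per step, which is why RTRL is practical only for small networks but attractive for on-line, continually running ones.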
Journal Article
Neural networks at work
TL;DR: The benefits of neural networks and the types of application for which they are suited are outlined, and four representative applications are described in enough detail to show how they work.
Journal Article
A CMOS Four-Quadrant Analog Multiplier
K. Bult, H. Wallinga, et al.
TL;DR: A new circuit configuration for an MOS four-quadrant analog multiplier circuit is presented, based on the square-law characteristics of the MOS transistor, which has floating inputs and linearity better than 0.14 percent.
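The square-law trick such multipliers exploit can be checked numerically. This is not Bult and Wallinga's circuit itself, only the textbook identity behind square-law multiplication: with devices whose output goes as the square of the input, the difference of two squares yields a true four-quadrant product.

```python
# Square-law principle behind MOS four-quadrant multipliers:
#   (a + b)^2 - (a - b)^2 = 4ab
# so two square-law devices and a subtraction produce a product with
# the correct sign for all four sign combinations of a and b.
def square_law_multiply(a, b):
    return ((a + b) ** 2 - (a - b) ** 2) / 4.0

assert square_law_multiply(3.0, -2.0) == -6.0   # mixed signs
assert square_law_multiply(-1.5, -4.0) == 6.0   # both negative
```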
Journal Article
Recurrent backpropagation and the dynamical approach to adaptive neural computation
TL;DR: It is now possible to efficiently compute the error gradients for networks that have temporal dynamics, which opens applications to a host of problems in systems identification and control.
Journal Article
Adaptive neural oscillator using continuous-time back-propagation learning
Kenji Doya, Shuji Yoshizawa, et al.
TL;DR: A neural network model of temporal pattern memory in animal motor systems is proposed that receives an external oscillatory input with some desired wave form, then, after sufficient learning, the network autonomously oscillates in the previously given wave form.
Related Papers
Weight perturbation: an optimal architecture and learning technique for analog VLSI feedforward and recurrent multilayer networks
Marwan A. Jabri, Barry Flower, et al.