Long short-term memory networks in memristor crossbar arrays
Can Li, Zhongrui Wang, Mingyi Rao, Daniel Belkin, Wenhao Song, Hao Jiang, Peng Yan, Yunning Li, Peng Lin, Miao Hu, Ning Ge, John Paul Strachan, Mark Barnell, Qing Wu, R. Stanley Williams, Jianhua Yang, Qiangfei Xia
TL;DR: It is demonstrated experimentally that the synaptic weights shared across different time steps in an LSTM can be implemented with a memristor crossbar array, which has a small circuit footprint, can store a large number of parameters and offers in-memory computing capability that contributes to circumventing the 'von Neumann bottleneck'.

Abstract
Recent breakthroughs in recurrent deep neural networks with long short-term memory (LSTM) units have led to major advances in artificial intelligence. However, state-of-the-art LSTM models with significantly increased complexity and a large number of parameters have a bottleneck in computing power resulting from both limited memory capacity and limited data communication bandwidth. Here we demonstrate experimentally that the synaptic weights shared in different time steps in an LSTM can be implemented with a memristor crossbar array, which has a small circuit footprint, can store a large number of parameters and offers in-memory computing capability that contributes to circumventing the 'von Neumann bottleneck'. We illustrate the capability of our crossbar system as a core component in solving real-world problems in regression and classification, which shows that memristor LSTM is a promising low-power and low-latency hardware platform for edge inference.

Deep neural networks are increasingly popular in data-intensive applications, but are power-hungry. New types of computer chips that are suited to the task of deep learning, such as memristor arrays where data handling and computing take place within the same unit, are required. A widely used deep learning model called long short-term memory, which can handle temporal sequential data analysis, is now implemented in a memristor crossbar array, promising an energy-efficient and low-footprint deep learning platform.
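The in-memory computing capability described above rests on the fact that a crossbar array computes a vector-matrix product in a single step: input voltages drive the rows, each device's conductance acts as a stored weight (Ohm's law), and the currents summed along each column (Kirchhoff's current law) form the output. A minimal numerical sketch of that operation, with illustrative conductance and voltage values that are assumptions rather than values from the paper:

```python
import numpy as np

def crossbar_vmm(voltages, conductances):
    """Analog vector-matrix multiply performed by a crossbar.

    voltages: shape (rows,), read voltages in volts.
    conductances: shape (rows, cols), device conductances in siemens.
    Returns the column currents in amperes (I = V^T G).
    """
    return voltages @ conductances

rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # illustrative device conductances
V = np.array([0.10, 0.20, -0.10, 0.05])    # illustrative row read voltages
I = crossbar_vmm(V, G)                     # three output column currents
```

Because every device contributes simultaneously, the whole multiply happens in one analog step regardless of matrix size, which is the source of the low-latency claim.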
Citations
Memristive crossbar arrays for brain-inspired computing
Qiangfei Xia, Jianhua Yang +1 more
TL;DR: The challenges of integrating large-scale memristive neural networks and using them for computation are discussed, both as accelerators for deep learning and as building blocks for spiking neural networks.
Resistive switching materials for information processing
Zhongrui Wang, Huaqiang Wu, Geoffrey W. Burr, Cheol Seong Hwang, Kang L. Wang, Qiangfei Xia, Jianhua Yang +6 more
TL;DR: This Review surveys the four physical mechanisms that give rise to resistive switching, describes how resistive switching materials enable novel in-memory information processing that may resolve the von Neumann bottleneck, and examines the device requirements for systems based on RSMs.
Photonics for artificial intelligence and neuromorphic computing
Bhavin J. Shastri, Alexander N. Tait, T. Ferreira de Lima, Wolfram H. P. Pernice, Harish Bhaskaran, C. D. Wright, Paul R. Prucnal +8 more
TL;DR: In this paper, the authors review recent advances in integrated photonic neuromorphic systems, discuss current and future challenges, and outline the advances in science and technology needed to meet those challenges.
Bridging Biological and Artificial Neural Networks with Emerging Neuromorphic Devices: Fundamentals, Progress, and Challenges.
Jianshi Tang, Fang Yuan, Xinke Shen, Zhongrui Wang, Mingyi Rao, Yuanyuan He, Yuhao Sun, Xinyi Li, Wenbin Zhang, Yijun Li, Bin Gao, He Qian, Guo-Qiang Bi, Sen Song, Jianhua Yang, Huaqiang Wu +15 more
TL;DR: A systematic overview of biological and artificial neural systems and their critical mechanisms is given, and the remaining challenges are highlighted to shed light on future research directions.
References
Long short-term memory
TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
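The constant error carousel mentioned in this summary is the additive cell-state update, which lets gradients flow across many time steps without vanishing. A minimal sketch of one LSTM step, assuming concatenated [h, x] inputs and illustrative weight names (W_f, W_i, W_o, W_c, with biases omitted for brevity; these names are not from the paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W_f, W_i, W_o, W_c):
    z = np.concatenate([h, x])             # previous hidden state + new input
    f = sigmoid(W_f @ z)                   # forget gate
    i = sigmoid(W_i @ z)                   # input gate
    o = sigmoid(W_o @ z)                   # output gate
    c_new = f * c + i * np.tanh(W_c @ z)   # cell state: the "constant error carousel"
    h_new = o * np.tanh(c_new)             # hidden state / output
    return h_new, c_new

rng = np.random.default_rng(1)
n_h, n_x = 3, 2
W_f, W_i, W_o, W_c = (rng.standard_normal((n_h, n_h + n_x)) for _ in range(4))
h, c = np.zeros(n_h), np.zeros(n_h)
h, c = lstm_step(rng.standard_normal(n_x), h, c, W_f, W_i, W_o, W_c)
```

Note that the same four weight matrices are reused at every time step, which is exactly what makes a crossbar mapping attractive: the weights are programmed once and shared across the sequence.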
Deep learning
TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Learning representations by back-propagating errors
TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, which helps to represent important features of the task domain.
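The repeated weight adjustment described here can be sketched for the simplest case, a single linear layer under a squared-error loss; the function name and learning rate are illustrative assumptions, not the cited paper's notation:

```python
import numpy as np

def backprop_step(W, x, target, lr=0.1):
    """One gradient-descent update of a linear layer y = W x."""
    y = W @ x                  # forward pass: actual output vector
    err = y - target           # difference from the desired output vector
    grad = np.outer(err, x)    # dL/dW for L = 0.5 * ||y - target||^2
    return W - lr * grad       # move the weights down the gradient

W = np.array([[0.5, -0.2],
              [0.1,  0.3]])
x = np.array([1.0, 2.0])
t = np.array([1.0, 0.0])
W2 = backprop_step(W, x, t)    # the loss shrinks after the update
```

In a multi-layer network the same rule applies layer by layer, with the error term propagated backward through the chain rule.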
The missing memristor found
TL;DR: It is shown, using a simple analytical example, that memristance arises naturally in nanoscale systems in which solid-state electronic and ionic transport are coupled under an external bias voltage.
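The coupling of electronic and ionic transport summarized here is often written as a linear ion-drift model, in which an applied current moves the boundary between doped and undoped regions and thereby changes the resistance. A numerical sketch of that model follows; the device parameters are illustrative assumptions, not values from the cited work:

```python
import numpy as np

def simulate_memristor(voltages, dt, R_on=100.0, R_off=16e3, D=10e-9, mu_v=1e-14):
    """Linear ion-drift memristor model (illustrative parameters)."""
    w = 0.5 * D                  # doped-region width: the internal state
    currents = []
    for v in voltages:
        R = R_on * (w / D) + R_off * (1.0 - w / D)  # memristance M(w)
        i = v / R                                    # Ohm's law
        w += mu_v * (R_on / D) * i * dt              # ionic drift moves the boundary
        w = min(max(w, 0.0), D)                      # boundary stays inside the device
        currents.append(i)
    return np.array(currents)

t = np.linspace(0.0, 2.0, 2000)
v = np.sin(2 * np.pi * t)                # sinusoidal drive voltage
i = simulate_memristor(v, dt=t[1] - t[0])
```

Because the resistance depends on the history of the current through the device, sweeping the voltage traces out the pinched hysteresis loop that is the memristor's signature.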
Memristor-The missing circuit element
TL;DR: In this article, the memristor is introduced as the fourth basic circuit element, defined by a relationship between charge and flux linkage, and an electromagnetic field interpretation of this relationship in terms of a quasi-static expansion of Maxwell's equations is presented.