Open Access Journal Article

Long short-term memory networks in memristor crossbar arrays

TL;DR
It is demonstrated experimentally that the synaptic weights shared in different time steps in an LSTM can be implemented with a memristor crossbar array, which has a small circuit footprint, can store a large number of parameters and offers in-memory computing capability that contributes to circumventing the ‘von Neumann bottleneck’.
Abstract
Recent breakthroughs in recurrent deep neural networks with long short-term memory (LSTM) units have led to major advances in artificial intelligence. However, state-of-the-art LSTM models with significantly increased complexity and a large number of parameters face a computing-power bottleneck resulting from both limited memory capacity and limited data-communication bandwidth. Here we demonstrate experimentally that the synaptic weights shared across different time steps in an LSTM can be implemented with a memristor crossbar array, which has a small circuit footprint, can store a large number of parameters and offers in-memory computing capability that helps circumvent the ‘von Neumann bottleneck’. We illustrate the capability of our crossbar system as a core component in solving real-world regression and classification problems, showing that the memristor LSTM is a promising low-power, low-latency hardware platform for edge inference.

Deep neural networks are increasingly popular in data-intensive applications but are power-hungry. New types of computer chips suited to deep learning, such as memristor arrays in which data handling and computing take place within the same unit, are required. A widely used deep learning model called long short-term memory, which can handle temporal sequential data, has now been implemented in a memristor crossbar array, promising an energy-efficient, low-footprint deep learning platform.
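As a rough illustration of the idea described in the abstract, the sketch below emulates in software how the vector-matrix multiplications of one LSTM step could be carried out by a crossbar: signed weights are stored as differential pairs of conductances, inputs are applied as read voltages, and the summed column currents perform the multiply-accumulate. This is a minimal sketch under assumed device parameters (conductance range, read voltage) and with function names of our own choosing; it is not the authors' hardware or code.

```python
# A minimal software emulation (not the authors' chip) of mapping the LSTM's
# shared weight matrix onto a memristor crossbar: signed weights become
# differential conductance pairs, inputs become read voltages, and the summed
# column currents (Ohm's law + Kirchhoff's current law) give the
# multiply-accumulate result. Conductance range and read voltage are assumed.
import numpy as np

def weights_to_conductances(W, g_min=1e-6, g_max=1e-4):
    """Map a signed weight matrix onto a differential pair of conductance
    matrices inside an assumed programmable range [g_min, g_max] siemens."""
    scale = (g_max - g_min) / np.abs(W).max()
    g_pos = g_min + scale * np.clip(W, 0.0, None)
    g_neg = g_min + scale * np.clip(-W, 0.0, None)
    return g_pos, g_neg, scale

def crossbar_vmm(x, g_pos, g_neg, scale, v_read=0.2):
    """One analogue vector-matrix multiply: encode x as read voltages, take the
    difference of the two arrays' column currents, rescale to the weight domain."""
    v = v_read * x
    i_out = v @ g_pos - v @ g_neg          # differential column currents
    return i_out / (v_read * scale)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, g_pos, g_neg, scale, b):
    """One LSTM time step; the weights shared across time steps stay in the
    (emulated) crossbar, so only the small state vectors move each step."""
    z = crossbar_vmm(np.concatenate([x_t, h_prev]), g_pos, g_neg, scale) + b
    i, f, g, o = np.split(z, 4)            # input, forget, candidate, output
    c_t = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)
    h_t = sigmoid(o) * np.tanh(c_t)
    return h_t, c_t
```

For an input size n_x and hidden size n_h, the weight matrix W passed to weights_to_conductances would have shape (n_x + n_h, 4·n_h), with the four gate blocks stored side by side. Because the same conductance matrices are reused at every time step, the weights are read in place rather than shuttled between memory and processor, which is the in-memory-computing argument made in the abstract.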


Citations

Memristive crossbar arrays for brain-inspired computing

TL;DR: The challenges in the integration and use in computation of large-scale memristive neural networks are discussed, both as accelerators for deep learning and as building blocks for spiking neural networks.

Resistive switching materials for information processing

TL;DR: This Review surveys the four physical mechanisms that lead to resistive switching. Resistive switching materials (RSMs) enable novel in-memory information processing, which may resolve the von Neumann bottleneck, and the Review examines the device requirements for systems based on RSMs.

Photonics for artificial intelligence and neuromorphic computing

TL;DR: In this paper, the authors review recent advances in integrated photonic neuromorphic systems, discuss current and future challenges, and outline the advances in science and technology needed to meet those challenges.

Bridging Biological and Artificial Neural Networks with Emerging Neuromorphic Devices: Fundamentals, Progress, and Challenges.

TL;DR: A systematic overview of biological and artificial neural systems is given, along with their related critical mechanisms, and the existing challenges are highlighted to hopefully shed light on future research directions.
References

Long short-term memory

TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1,000 discrete time steps by enforcing constant error flow through constant error carousels within special units.
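For reference, the cell equations of the now-standard LSTM formulation are sketched below; the forget gate was added after the original 1997 paper, and the cell-state recurrence is the "constant error carousel" mentioned above.

```latex
\begin{aligned}
i_t &= \sigma\!\left(W_i x_t + U_i h_{t-1} + b_i\right), &
f_t &= \sigma\!\left(W_f x_t + U_f h_{t-1} + b_f\right), \\
o_t &= \sigma\!\left(W_o x_t + U_o h_{t-1} + b_o\right), &
\tilde{c}_t &= \tanh\!\left(W_c x_t + U_c h_{t-1} + b_c\right), \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t, &
h_t &= o_t \odot \tanh(c_t).
\end{aligned}
```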

Deep learning

TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the network and the desired output vector; as a result, internal 'hidden' units come to represent important features of the task domain.
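The update being summarized is the usual gradient-descent-on-error rule; the notation below is generic rather than quoted from the paper.

```latex
E = \tfrac{1}{2} \sum_{k} \left( y_k - d_k \right)^2,
\qquad
\Delta w_{ij} = -\eta \, \frac{\partial E}{\partial w_{ij}},
```

where y is the actual output vector, d the desired output vector, and η the learning rate.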

The missing memristor found

TL;DR: It is shown, using a simple analytical example, that memristance arises naturally in nanoscale systems in which solid-state electronic and ionic transport are coupled under an external bias voltage.
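The simple analytical example referred to is commonly written as a two-terminal device of length D whose doped region of width w(t) drifts under the applied current; the following is a sketch in the usual notation, with parameter names as typically used rather than quoted from the paper.

```latex
v(t) = \left( R_{\mathrm{ON}} \, \frac{w(t)}{D} + R_{\mathrm{OFF}} \left( 1 - \frac{w(t)}{D} \right) \right) i(t),
\qquad
\frac{\mathrm{d} w}{\mathrm{d} t} = \mu_{v} \, \frac{R_{\mathrm{ON}}}{D} \, i(t),
```

so the effective resistance (memristance) depends on the history of the current that has flowed through the device.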

Memristor-The missing circuit element

TL;DR: In this article, the memristor is introduced as the fourth basic circuit element, defined by a relationship between charge and flux, and an electromagnetic-field interpretation of this relationship is presented in terms of a quasi-static expansion of Maxwell's equations.
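The defining relation of the fourth element links charge and flux; in the usual notation:

```latex
\mathrm{d}\varphi = M(q) \, \mathrm{d} q
\quad \Longrightarrow \quad
v(t) = M\!\left(q(t)\right) \, i(t),
```

where the memristance M(q) has the units of resistance but depends on the charge that has passed through the device.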