Journal ArticleDOI

In-memory Learning with Analog Resistive Switching Memory: A Review and Perspective

TLDR
This article defines the main figures of merit (FoMs) of analog RSM hardware and analyzes the basic device characteristics, hardware algorithms, and corresponding mapping methods for device arrays, as well as the architecture and circuit design considerations for neural networks.
Abstract
In this article, we review existing analog resistive switching memory (RSM) devices and their hardware technologies for in-memory learning, along with their challenges and prospects. Because the device characteristics required for in-memory learning differ from those required for digital memory, an in-depth understanding across layers, from devices and circuits to architectures and algorithms, is essential. First, taking a top-down view from architecture to devices for analog computing, we define the main figures of merit (FoMs) and comprehensively analyze analog RSM hardware, including the basic device characteristics, hardware algorithms, the corresponding mapping methods for device arrays, and the architecture and circuit design considerations for neural networks. Second, we classify the FoMs of analog RSM devices into two levels. Level 1 FoMs are essential for achieving the functionality of a system (e.g., linearity, symmetry, dynamic range, number of levels, fluctuation, variability, and yield). Level 2 FoMs make a functional system more efficient and reliable (e.g., area, operating voltage, energy consumption, speed, endurance, retention, and compatibility with back-end-of-line processing). By constructing a device-to-application simulation framework, we analyze in depth how these FoMs influence in-memory learning and give a target list of device requirements. Finally, we evaluate the main FoMs of most existing devices with analog characteristics and review optimization methods ranging from programming schemes to materials and device structures. Key challenges and prospects for analog RSM devices, from the device to the system level, are discussed.
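The Level 1 FoMs above can be made concrete with a short simulation. The sketch below is ours, not the article's simulation framework: it uses a commonly assumed exponential conductance-update model, and every name and value in it (G_MIN, G_MAX, nl_p, nl_d, N_LEVELS) is illustrative. It shows how update nonlinearity and potentiation/depression asymmetry distort the response of an analog RSM synapse to identical programming pulses:

```python
# Minimal sketch (not the article's framework): two Level 1 FoMs of an
# analog RSM synapse -- update nonlinearity and asymmetry -- under a
# commonly assumed exponential conductance-update model.
import numpy as np

G_MIN, G_MAX, N_LEVELS = 1e-6, 1e-5, 64   # illustrative conductance range (S)

def potentiate(g, nl_p=3.0):
    """One LTP pulse: the step shrinks as g approaches G_MAX (nonlinearity)."""
    step = (G_MAX - G_MIN) / N_LEVELS
    decay = np.exp(-nl_p * (g - G_MIN) / (G_MAX - G_MIN))  # nl_p = 0 -> linear
    return min(g + step * decay, G_MAX)

def depress(g, nl_d=5.0):
    """One LTD pulse: nl_d != nl_p models potentiation/depression asymmetry."""
    step = (G_MAX - G_MIN) / N_LEVELS
    decay = np.exp(-nl_d * (G_MAX - g) / (G_MAX - G_MIN))
    return max(g - step * decay, G_MIN)

g = G_MIN
ltp, ltd = [], []
for _ in range(N_LEVELS):        # 64 identical potentiating pulses
    g = potentiate(g)
    ltp.append(g)
for _ in range(N_LEVELS):        # 64 identical depressing pulses
    g = depress(g)
    ltd.append(g)
print(f"G after LTP: {ltp[-1]:.2e} S, after LTD: {ltd[-1]:.2e} S")
```

Setting nl_p and nl_d to zero recovers the linear, symmetric update that is the ideal case for in-memory learning; raising them reproduces the saturating, asymmetric trajectories that degrade training accuracy.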


Citations
Journal ArticleDOI

Memristive Crossbar Arrays for Storage and Computing Applications

TL;DR: This review introduces the crossbar architecture, traces the origin of sneak-path current, discusses mitigation techniques from the angle of materials and circuits, and surveys the applications of memristive crossbars in both machine learning and neuromorphic computing.
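In the computing role surveyed above, an ideal crossbar performs a vector-matrix multiplication in one analog step: each cell passes a current I = V * G (Ohm's law) and each column wire sums its cells' currents (Kirchhoff's current law). A minimal sketch under that idealization follows; the array size and value ranges are our assumptions, and sneak paths, wire resistance, and device noise are deliberately ignored:

```python
# Minimal sketch of in-memory vector-matrix multiplication on an ideal
# memristive crossbar: I_j = sum_i V_i * G_ij. Array size and value
# ranges are illustrative; nonidealities are ignored.
import numpy as np

rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-5, size=(128, 64))  # cell conductances (S)
V = rng.uniform(0.0, 0.2, size=128)          # row read voltages (V)

I = V @ G                                    # column currents (A): one analog MVM
print(f"output vector shape: {I.shape}, first column current: {I[0]:.3e} A")
```

Sneak-path current is exactly what breaks this idealization in a passive array, since unselected cells offer parasitic conduction routes, which is why the review covers materials- and circuit-level mitigation.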
Journal ArticleDOI

High-precision and linear weight updates by subnanosecond pulses in ferroelectric tunnel junction for neuro-inspired computing

TL;DR: In this paper, a high-performance synaptic device is designed and demonstrated based on an Ag/PbZr0.52Ti0.48O3 (PZT, (111)-oriented)/Nb:SrTiO3 ferroelectric tunnel junction (FTJ).
Journal ArticleDOI

Analog memristive synapse based on topotactic phase transition for high-performance neuromorphic computing and neural network pruning

TL;DR: In this article, a topotactic phase transition random access memory (TPT-RAM) with a unique diffusive/nonvolatile dual mode based on SrCoOx is demonstrated.
Journal ArticleDOI

Ferroelectric P(VDF-TrFE) wrapped InGaAs nanowires for ultralow-power artificial synapses

- 01 Jan 2022 -
TL;DR: In this article, poly(vinylidene fluoride-trifluoroethylene) (P(VDF-TrFE)) wrapped InGaAs nanowire (NW) artificial synapses capable of operating with record-low subfemtojoule power consumption are presented.
References
Journal ArticleDOI

Deep learning

TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Journal ArticleDOI

Gradient-based learning applied to document recognition

TL;DR: In this article, a graph transformer network (GTN) is proposed for handwritten character recognition; trained end to end with gradient-based learning, it synthesizes a complex decision surface that can classify high-dimensional patterns such as handwritten characters.
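The GTN composes many trainable modules, but the underlying idea of gradient-based learning of a decision surface fits in a few lines. The sketch below is our illustration, not the paper's architecture: plain logistic regression trained by batch gradient descent on synthetic high-dimensional two-class data:

```python
# Minimal sketch of gradient-based learning (ours, not the paper's GTN):
# logistic regression trained by gradient descent learns a decision
# surface separating two synthetic high-dimensional clusters.
import numpy as np

rng = np.random.default_rng(0)
d, n = 64, 500
X = np.vstack([rng.normal(-0.5, 1.0, (n, d)),   # class 0 cluster
               rng.normal(+0.5, 1.0, (n, d))])  # class 1 cluster
y = np.concatenate([np.zeros(n), np.ones(n)])

w, b, lr = np.zeros(d), 0.0, 0.1
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid predictions
    w -= lr * (X.T @ (p - y)) / len(y)      # gradient of the cross-entropy loss
    b -= lr * np.mean(p - y)

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))      # predictions with final weights
print(f"training accuracy: {np.mean((p > 0.5) == y):.3f}")
```

The paper's contribution is to scale this same training loop to multilayer convolutional networks and graph transformer networks trained end to end.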
Journal ArticleDOI

Mastering the game of Go with deep neural networks and tree search

TL;DR: Using this search algorithm, the program AlphaGo achieved a 99.8% winning rate against other Go programs and defeated the human European Go champion by 5 games to 0, the first time a computer program has defeated a human professional player in the full-sized game of Go.
Journal ArticleDOI

The missing memristor found

TL;DR: It is shown, using a simple analytical example, that memristance arises naturally in nanoscale systems in which solid-state electronic and ionic transport are coupled under an external bias voltage.
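That analytical example is the coupled variable-resistor (linear ionic drift) model: a doped region of width w inside a film of thickness D divides the device between R_on and R_off, and w drifts in proportion to the current. A minimal numerical sketch of that model follows; the parameter values are ours, chosen only so the state swings visibly over a few drive cycles:

```python
# Minimal sketch of the linear ionic drift memristor model: memristance
# M = R_on * (w / D) + R_off * (1 - w / D), with dw/dt = mu_v * (R_on / D) * i.
# Parameter values are illustrative, not fitted to any device.
import numpy as np

D = 10e-9                    # film thickness (m)
mu_v = 1e-14                 # dopant mobility (m^2 V^-1 s^-1)
R_on, R_off = 100.0, 16e3    # resistance bounds (ohm)

dt = 1e-4
t = np.arange(0.0, 2.0, dt)
v = np.sin(2 * np.pi * 1.0 * t)               # 1 V, 1 Hz sinusoidal drive

w = 0.1 * D                                   # initial doped-region width
i = np.zeros_like(t)
for k, vk in enumerate(v):
    M = R_on * (w / D) + R_off * (1 - w / D)  # state-dependent memristance
    i[k] = vk / M
    w += mu_v * (R_on / D) * i[k] * dt        # linear ionic drift
    w = min(max(w, 0.0), D)                   # hard boundaries at 0 and D

# Plotting i against v traces the pinched hysteresis loop that is the
# signature of memristive behavior.
print(f"current range: {i.min():.2e} A to {i.max():.2e} A")
```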
Book

Introduction To The Theory Of Neural Computation

TL;DR: This book is a detailed, logically developed treatment that covers the theory and uses of collective computational networks, including associative memory, feedforward networks, and unsupervised learning.