Proceedings ArticleDOI
Flash Based In-Memory Multiply-Accumulate Realisation: A Theoretical Study
S. Ashwin Balagopal, Janakiraman Viraraghavan +1 more
- pp 1-5
Abstract:
In-memory computing is gaining traction as a technique to implement the Multiply-Accumulate (MAC) operation on edge network devices, performing neural network inference while reducing the energy expended on memory fetches. The voltage developed along a bit-line is an analog representation of the MAC value and must be digitized for further processing. In this paper, we propose to use the Sense Amp as a comparator to perform the digitization using a serial flash ADC, implemented in memory. Flash ADCs require an ordered set of reference voltages against which the input to be digitized is compared. Recognizing that the MAC value is non-uniformly distributed and application specific, we propose an algorithm to generate reference voltages tailored to the MAC distribution function. Further, we show that the reference voltage can be generated in much the same way as the MAC voltage is generated along a column, in-memory. We provide an algorithm to populate the bit-cells of the reference column so as to generate the appropriate reference voltage. Experiments on the MNIST, SVHN, and CIFAR-10 data sets show that the proposed technique results in a worst-case accuracy reduction of 0.8% compared to double-precision evaluation.
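The two ideas in the abstract can be illustrated with a short sketch: placing the flash ADC's reference levels according to the (non-uniform) MAC distribution, and digitizing serially with a single comparator, as a Sense Amp would. The paper's exact reference-generation algorithm is not reproduced here; the quantile-based placement below is only one plausible way to tailor references to an empirical distribution, and all function names are illustrative.

```python
import numpy as np

def reference_levels(mac_samples, n_bits=3):
    """Place 2**n_bits - 1 reference voltages at equal-probability
    quantiles of the empirical MAC distribution, so each ADC bin
    covers the same fraction of observed MAC values.
    (Quantile placement is an assumption for illustration; the
    paper tailors references to the distribution but its exact
    rule is not given here.)"""
    n_refs = 2 ** n_bits - 1
    qs = np.linspace(0.0, 1.0, n_refs + 2)[1:-1]  # interior quantiles only
    return np.quantile(mac_samples, qs)           # sorted ascending

def serial_flash_digitize(v_mac, refs):
    """Serial flash conversion: compare v_mac against the ordered
    references one at a time with a single comparator (the role the
    Sense Amp plays in-memory); the output code is the number of
    references the input exceeds."""
    code = 0
    for v_ref in refs:
        if v_mac > v_ref:
            code += 1
        else:
            break
    return code

# Example: digitize MAC voltages drawn from a bell-shaped distribution.
rng = np.random.default_rng(0)
samples = rng.normal(loc=0.5, scale=0.1, size=10_000)
refs = reference_levels(samples, n_bits=3)  # 7 levels for a 3-bit code
code = serial_flash_digitize(0.5, refs)
```

Because the references sit at quantiles, codes are used roughly equally often, which is the intuition behind matching the quantizer to an application-specific MAC distribution rather than spacing levels uniformly.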
References
Proceedings ArticleDOI
Deep Residual Learning for Image Recognition
TL;DR: In this article, the authors proposed a residual learning framework to ease the training of networks that are substantially deeper than those used previously, which won 1st place in the ILSVRC 2015 classification task.
Journal ArticleDOI
Gradient-based learning applied to document recognition
Yann LeCun, Léon Bottou, Yoshua Bengio, Patrick Haffner
TL;DR: In this article, a graph transformer network (GTN) is proposed for handwritten character recognition, showing that gradient-based learning can synthesize a complex decision surface that classifies high-dimensional patterns such as handwritten characters.
Journal ArticleDOI
Deep learning in neural networks
TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium; it reviews deep supervised learning, unsupervised learning, reinforcement learning and evolutionary computation, and indirect search for short programs encoding deep and large networks.
Journal ArticleDOI
Deep Neural Networks for Acoustic Modeling in Speech Recognition: The Shared Views of Four Research Groups
Geoffrey E. Hinton, Li Deng, Dong Yu, George E. Dahl, Abdel-rahman Mohamed, Navdeep Jaitly, Andrew W. Senior, Vincent Vanhoucke, Patrick Nguyen, Tara N. Sainath, Brian Kingsbury
TL;DR: This article provides an overview of progress and represents the shared views of four research groups that have had recent successes in using DNNs for acoustic modeling in speech recognition.
Reading Digits in Natural Images with Unsupervised Feature Learning
TL;DR: A new benchmark dataset for research use is introduced, containing over 600,000 labeled digits cropped from Street View images; variants of two recently proposed unsupervised feature learning methods are employed and found to be convincingly superior on the benchmarks.