Rathinakumar Appuswamy
Researcher at IBM
Publications - 52
Citations - 5824
Rathinakumar Appuswamy is an academic researcher from IBM. The author has contributed to research on linear network coding and TrueNorth. The author has an h-index of 18 and has co-authored 51 publications receiving 4506 citations. Previous affiliations of Rathinakumar Appuswamy include Indian Institute of Technology Kanpur and University of California, San Diego.
Papers
Journal ArticleDOI
A million spiking-neuron integrated circuit with a scalable communication network and interface
Paul A. Merolla, John V. Arthur, Rodrigo Alvarez-Icaza, Andrew S. Cassidy, Jun Sawada, Filipp Akopyan, Bryan L. Jackson, Nabil Imam, Chen Guo, Yutaka Nakamura, Bernard Brezzo, Ivan Vo, Steven K. Esser, Rathinakumar Appuswamy, Brian Taba, Arnon Amir, Myron D. Flickner, William P. Risk, Rajit Manohar, Dharmendra S. Modha +19 more
TL;DR: Inspired by the brain’s structure, an efficient, scalable, and flexible non–von Neumann architecture is developed that leverages contemporary silicon technology and is well suited to many applications that use complex neural networks in real time, for example, multiobject detection and classification.
Journal ArticleDOI
Convolutional networks for fast, energy-efficient neuromorphic computing
Steven K. Esser, Paul A. Merolla, John V. Arthur, Andrew S. Cassidy, Rathinakumar Appuswamy, Alexander Andreopoulos, David Berg, Jeffrey L. McKinstry, Timothy Melano, R Davis, Carmelo di Nolfo, Pallab Datta, Arnon Amir, Brian Taba, Myron D. Flickner, Dharmendra S. Modha +15 more
TL;DR: This approach allows the algorithmic power of deep learning to be merged with the efficiency of neuromorphic processors, bringing the promise of embedded, intelligent, brain-inspired computing one step closer.
Proceedings Article
Learned step size quantization
Steven K. Esser, Jeffrey L. McKinstry, Deepika Bablani, Rathinakumar Appuswamy, Dharmendra S. Modha +4 more
TL;DR: This work introduces a novel means to estimate and scale the task loss gradient at each weight and activation layer's quantizer step size, such that it can be learned in conjunction with other network parameters.
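The summary above describes learning the quantizer step size alongside the network weights. A minimal sketch of that idea, using the standard uniform-quantizer forward pass and a straight-through estimate of the step-size gradient (function names and the scalar, single-value framing are illustrative assumptions, not the paper's implementation):

```python
def lsq_quantize(v, s, q_n, q_p):
    """Forward pass of a uniform quantizer with learnable step size s.

    v   : real-valued weight or activation
    s   : step size (a trainable parameter in this scheme)
    q_n : magnitude of the most negative quantization level
    q_p : magnitude of the most positive quantization level
    """
    v_bar = min(max(v / s, -q_n), q_p)  # scale by s, clip to the quantizer range
    return round(v_bar) * s             # round to an integer level, rescale


def lsq_step_grad(v, s, q_n, q_p):
    """Straight-through estimate of d(quantizer output)/d(step size).

    Clipped values contribute the clip bound; in-range values contribute
    the rounding residual, so s can be learned with the other parameters.
    """
    v_bar = v / s
    if v_bar <= -q_n:
        return -q_n
    if v_bar >= q_p:
        return q_p
    return round(v_bar) - v_bar
```

In practice this per-value gradient would be accumulated over a whole layer and scaled down (the paper's loss-gradient scaling) before being handed to the optimizer along with the weight gradients.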
Proceedings Article
Backpropagation for energy-efficient neuromorphic computing
TL;DR: This work treats spikes and discrete synapses as continuous probabilities, which allows training the network using standard backpropagation and naturally maps to neuromorphic hardware by sampling the probabilities to create one or more networks, which are merged using ensemble averaging.
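The summary above outlines training with spikes and synapses as continuous probabilities, then sampling those probabilities to obtain discrete networks whose outputs are ensemble-averaged. A toy sketch of the sampling-and-averaging step for a single linear layer (the function names, layer shape, and sample count are illustrative assumptions):

```python
import random


def sample_network(probs, rng):
    """Draw one binary synapse matrix by sampling each connection
    probability independently (1 = synapse present, 0 = absent)."""
    return [[1 if rng.random() < p else 0 for p in row] for row in probs]


def ensemble_predict(probs, x, n_samples=100, seed=0):
    """Average the outputs of several sampled binary networks.

    probs : list of rows of connection probabilities (one row per output unit)
    x     : input vector
    """
    rng = random.Random(seed)
    out = [0.0] * len(probs)
    for _ in range(n_samples):
        w = sample_network(probs, rng)          # one discrete network
        for i, row in enumerate(w):
            out[i] += sum(wij * xj for wij, xj in zip(row, x))
    return [o / n_samples for o in out]         # ensemble average
```

As the number of samples grows, the ensemble average approaches the expected output of the probabilistic network, which is what makes training on the continuous probabilities with standard backpropagation carry over to the sampled discrete hardware networks.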
Proceedings ArticleDOI
Cognitive computing systems: Algorithms and applications for networks of neurosynaptic cores
Steven K. Esser, Alexander Andreopoulos, Rathinakumar Appuswamy, Pallab Datta, Davis, Arnon Amir, John V. Arthur, Andrew S. Cassidy, Myron D. Flickner, Paul A. Merolla, Shyamal Suhana Chandra, Nicola Basilico, Stefano Carpin, Tom Zimmerman, Frank Zee, Rodrigo Alvarez-Icaza, Jeffrey A. Kusnitz, Theodore M. Wong, William P. Risk, Emmett McQuinn, Tapan K. Nayak, Raghavendra Singh, Dharmendra S. Modha +22 more
TL;DR: A set of abstractions, algorithms, and applications is developed that is natively efficient for TrueNorth, a non-von Neumann architecture inspired by the brain's function and efficiency. Seven applications are demonstrated: speaker recognition, music composer recognition, digit recognition, sequence prediction, collision avoidance, optical flow, and eye detection.