Deep learning in spiking neural networks
Amirhossein Tavanaei, Masoud Ghodrati, Saeed Reza Kheradpisheh, Timothée Masquelier, Anthony S. Maida
TLDR
The emerging picture is that SNNs still lag behind ANNs in terms of accuracy, but the gap is decreasing and can even vanish on some tasks, while SNNs typically require many fewer operations and are the better candidates for processing spatio-temporal data.
About
This article was published in Neural Networks on 2019-03-01 and is currently open access. It has received 756 citations to date. The article focuses on the topics: Spiking neural network & Artificial neural network.
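To make the operation-count contrast concrete, here is a minimal sketch, assuming a standard leaky integrate-and-fire (LIF) neuron, written in plain NumPy (names and constants are illustrative, not taken from the paper): an SNN unit integrates its input over time and communicates only through sparse binary spike events, rather than the dense real-valued activations of a conventional ANN.

```python
import numpy as np

# Illustrative LIF parameters (assumptions, not from the paper).
TAU_M = 20.0      # membrane time constant (ms)
V_THRESH = 1.0    # firing threshold
V_RESET = 0.0     # reset potential after a spike
DT = 1.0          # simulation time step (ms)

def lif_neuron(input_current, v=0.0):
    """Simulate one LIF neuron over a 1-D array of input currents.

    Returns the binary spike train: the neuron communicates only
    through discrete spike events.
    """
    spikes = np.zeros_like(input_current)
    for t, i_t in enumerate(input_current):
        # Leaky integration of the input current.
        v += (DT / TAU_M) * (-v + i_t)
        if v >= V_THRESH:          # threshold crossing -> spike
            spikes[t] = 1.0
            v = V_RESET            # hard reset
    return spikes

# Example: a constant supra-threshold current yields a regular spike train.
print(lif_neuron(np.full(100, 1.5)))
```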
Citations
Journal Article
Crossing the Cleft: Communication Challenges Between Neuroscience and Artificial Intelligence
Frances S. Chance, James B. Aimone, Srideep Musuvathy, Michael R. Smith, Craig M. Vineyard, Felix Wang
TL;DR: Discusses cultural differences between the two fields, including divergent priorities that should be considered when leveraging modern-day neuroscience for AI, and highlights small but significant cultural shifts that would greatly facilitate increased synergy between the two fields.
Proceedings Article
Spiking Neural Networks Trained With Backpropagation for Low Power Neuromorphic Implementation of Voice Activity Detection
TL;DR: In this paper, the authors exploit an SNN model that can be recast into a recurrent network and trained with known deep learning techniques, achieving state-of-the-art performance at a fraction of the power consumption compared to other methods.
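The recasting the summary refers to is commonly done by unrolling the spiking dynamics in time, like a recurrent network, and substituting a smooth surrogate for the non-differentiable spike in the backward pass. The following PyTorch sketch is an illustrative reconstruction of that general technique, not the authors' code; all shapes and constants are assumptions.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; smooth surrogate gradient
    (fast-sigmoid derivative) in the backward pass."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        return grad_out / (1.0 + 10.0 * v.abs()) ** 2

def lif_layer(inputs, weight, beta=0.9, threshold=1.0):
    """Unroll a LIF layer over time like a recurrent network.
    inputs: (time, batch, in_features); weight: (out_features, in_features)."""
    v = torch.zeros(inputs.shape[1], weight.shape[0])
    spikes = []
    for x_t in inputs:
        v = beta * v + x_t @ weight.t()          # leaky integration
        s = SurrogateSpike.apply(v - threshold)  # spike if above threshold
        v = v - s * threshold                    # soft reset
        spikes.append(s)
    return torch.stack(spikes)  # trainable end-to-end with backprop

# Usage: 50 time steps, batch of 4, 10 inputs -> 20 spiking units.
out = lif_layer(torch.randn(50, 4, 10), torch.randn(20, 10, requires_grad=True))
```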
Journal Article
An ensemble unsupervised spiking neural network for objective recognition
Qiang Fu, Hongbin Dong +1 more
TL;DR: A hierarchical SNN comprising convolutional and pooling layers is designed, consisting of excitatory and inhibitory neurons modeled on mechanisms of the primate brain; the results suggest that the ensemble SNN architecture with transfer learning is key to improving the performance of the SNN.
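As an illustration of the ensembling idea only (a hypothetical sketch, not the authors' method), the simplest combination rule is majority voting over the class predictions of independently trained spiking classifiers:

```python
import numpy as np

def ensemble_predict(member_outputs):
    """Majority vote over class predictions from several ensemble members.
    member_outputs: (n_members, n_samples) array of predicted labels."""
    member_outputs = np.asarray(member_outputs)
    n_classes = member_outputs.max() + 1
    # Per-sample vote counts, shape (n_classes, n_samples).
    votes = np.apply_along_axis(
        lambda col: np.bincount(col, minlength=n_classes),
        0, member_outputs)
    return votes.argmax(axis=0)

# Three hypothetical members voting on four samples.
print(ensemble_predict([[0, 1, 2, 1],
                        [0, 1, 1, 1],
                        [2, 1, 2, 0]]))  # -> [0 1 2 1]
```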
Journal Article
Spiking neural P systems with target indications
TL;DR: It is shown that six neurons are sufficient to construct a universal SNP system with the proposed spike-distribution mechanism, both as a number generator and as a function-computing device.
Journal Article
Quantized STDP-based online-learning spiking neural network
TL;DR: A spike-timing-dependent plasticity (STDP)-based, weight-quantized/binarized online-learning spiking neural network (SNN) is reported. It uses bio-plausible integrate-and-fire neurons and conductance-based synapses as its basic building blocks, and realizes online learning through STDP and a winner-take-all (WTA) mechanism.
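The core STDP rule referred to here strengthens a synapse when a presynaptic spike shortly precedes a postsynaptic one and weakens it otherwise; quantization then snaps the updated weight onto a small set of levels. An illustrative sketch follows (all parameter values are assumptions, not those of the paper):

```python
import numpy as np

# Illustrative STDP parameters (assumed, not from the paper).
A_PLUS, A_MINUS = 0.01, 0.012        # potentiation / depression amplitudes
TAU = 20.0                           # STDP time constant (ms)
LEVELS = np.linspace(0.0, 1.0, 16)   # 4-bit quantized weight levels

def stdp_update(w, t_pre, t_post):
    """Pair-based STDP followed by quantization to the nearest level."""
    dt = t_post - t_pre
    if dt >= 0:   # pre before post -> potentiate
        w += A_PLUS * np.exp(-dt / TAU)
    else:         # post before pre -> depress
        w -= A_MINUS * np.exp(dt / TAU)
    w = np.clip(w, 0.0, 1.0)
    # Online quantization: snap to the nearest allowed weight level.
    return LEVELS[np.abs(LEVELS - w).argmin()]

print(stdp_update(0.5, t_pre=10.0, t_post=15.0))  # slight potentiation
```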
References
Proceedings Article
Deep Residual Learning for Image Recognition
TL;DR: In this article, the authors proposed a residual learning framework to ease the training of networks that are substantially deeper than those used previously; it won first place in the ILSVRC 2015 classification task.
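The residual trick itself is compact: each block learns a residual function F(x) and adds the input back through an identity shortcut, so y = F(x) + x. A minimal PyTorch sketch (channel counts are illustrative):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """y = F(x) + x: the block learns the residual F, and the identity
    shortcut lets gradients flow through very deep stacks."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # identity shortcut

block = ResidualBlock(64)
print(block(torch.randn(1, 64, 32, 32)).shape)  # torch.Size([1, 64, 32, 32])
```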
Proceedings Article
ImageNet Classification with Deep Convolutional Neural Networks
TL;DR: A deep convolutional neural network that achieved state-of-the-art performance, consisting of five convolutional layers, some of which are followed by max-pooling layers, and three fully connected layers with a final 1000-way softmax.
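The layer recipe in the summary (five convolutional layers, some followed by max-pooling, then three fully connected layers ending in a 1000-way softmax) maps directly onto a sequential model. A condensed PyTorch sketch, with channel counts following the original AlexNet but offered as an illustration rather than the authors' code:

```python
import torch.nn as nn

# Condensed AlexNet-style stack: 5 conv layers (some with max-pooling),
# then 3 fully connected layers and a 1000-way softmax.
# Assumes 3x227x227 inputs, so the flattened feature map is 256*6*6.
alexnet_like = nn.Sequential(
    nn.Conv2d(3, 96, 11, stride=4), nn.ReLU(), nn.MaxPool2d(3, 2),
    nn.Conv2d(96, 256, 5, padding=2), nn.ReLU(), nn.MaxPool2d(3, 2),
    nn.Conv2d(256, 384, 3, padding=1), nn.ReLU(),
    nn.Conv2d(384, 384, 3, padding=1), nn.ReLU(),
    nn.Conv2d(384, 256, 3, padding=1), nn.ReLU(), nn.MaxPool2d(3, 2),
    nn.Flatten(),
    nn.Linear(256 * 6 * 6, 4096), nn.ReLU(),
    nn.Linear(4096, 4096), nn.ReLU(),
    nn.Linear(4096, 1000),           # logits for the 1000-way softmax
    nn.LogSoftmax(dim=1),
)
```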
Journal Article
Long short-term memory
TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
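The constant error carousel mentioned above is the memory cell whose self-connection has weight 1, so error flow neither vanishes nor explodes across long time lags; in a modern framework this comes built in. A minimal usage sketch (all shapes are illustrative):

```python
import torch
import torch.nn as nn

# A single-layer LSTM: its gated memory cells (the "constant error
# carousels") keep error flow constant across long time lags.
lstm = nn.LSTM(input_size=8, hidden_size=32, batch_first=True)

x = torch.randn(4, 1000, 8)        # batch of 4 sequences, 1000 steps
output, (h_n, c_n) = lstm(x)
print(output.shape, c_n.shape)     # (4, 1000, 32) and (1, 4, 32)
```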
Proceedings Article
Very Deep Convolutional Networks for Large-Scale Image Recognition
Karen Simonyan, Andrew Zisserman
TL;DR: This work investigates the effect of the convolutional network depth on its accuracy in the large-scale image recognition setting using an architecture with very small convolution filters, which shows that a significant improvement on the prior-art configurations can be achieved by pushing the depth to 16-19 weight layers.
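The key design point is stacking very small 3×3 convolutions: two stacked 3×3 layers cover a 5×5 receptive field with fewer parameters and an extra nonlinearity compared to a single 5×5 layer. A VGG-style building block as an illustrative PyTorch sketch (configuration values are assumptions following the VGG-16 layout):

```python
import torch.nn as nn

def vgg_block(in_ch, out_ch, n_convs):
    """Stack of 3x3 convolutions followed by 2x2 max-pooling: the
    repeating unit that lets VGG reach 16-19 weight layers."""
    layers = []
    for i in range(n_convs):
        layers += [nn.Conv2d(in_ch if i == 0 else out_ch, out_ch,
                             kernel_size=3, padding=1),
                   nn.ReLU()]
    layers.append(nn.MaxPool2d(2, 2))
    return nn.Sequential(*layers)

# VGG-16-style configuration: (in_channels, out_channels, convs per block).
blocks = [vgg_block(i, o, n) for (i, o, n) in
          [(3, 64, 2), (64, 128, 2), (128, 256, 3),
           (256, 512, 3), (512, 512, 3)]]
```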
Related Papers (5)
Loihi: A Neuromorphic Manycore Processor with On-Chip Learning
Michael Davies, Narayan Srinivasa, Tsung-Han Lin, Gautham N. Chinya, Cao Yongqiang, Sri Harsha Choday, Georgios D. Dimou, Prasad Joshi, Nabil Imam, Shweta Jain, Yuyun Liao, Chit-Kwan Lin, Andrew Lines, Ruokun Liu, Deepak A. Mathaikutty, Steven McCoy, Arnab Paul, Jonathan Tse, Guruguhanathan Venkataramanan, Yi-Hsin Weng, Andreas Wild, Yoon Seok Yang, Hong Wang
Training Deep Spiking Neural Networks Using Backpropagation.
A million spiking-neuron integrated circuit with a scalable communication network and interface
Paul A. Merolla, John V. Arthur, Rodrigo Alvarez-Icaza, Andrew S. Cassidy, Jun Sawada, Filipp Akopyan, Bryan L. Jackson, Nabil Imam, Chen Guo, Yutaka Nakamura, Bernard Brezzo, Ivan Vo, Steven K. Esser, Rathinakumar Appuswamy, Brian Taba, Arnon Amir, Myron D. Flickner, William P. Risk, Rajit Manohar, Dharmendra S. Modha