Open Access Proceedings ArticleDOI

T2FSNN: Deep Spiking Neural Networks with Time-to-first-spike Coding

TLDR
T2FSNN is presented, which introduces time-to-first-spike coding into deep SNNs using a kernel-based dynamic threshold and dendrite, and gradient-based optimization and early firing methods are proposed to further increase the efficiency of T2FSNN.
Abstract
Spiking neural networks (SNNs) have gained considerable interest due to their energy-efficient characteristics, yet the lack of a scalable training algorithm has restricted their applicability to practical machine learning problems. The deep neural network-to-SNN conversion approach has been widely studied to broaden the applicability of SNNs. Most previous studies, however, have not fully utilized the spatio-temporal aspects of SNNs, which has led to inefficiency in terms of the number of spikes and inference latency. In this paper, we present T2FSNN, which introduces the concept of time-to-first-spike coding into deep SNNs using a kernel-based dynamic threshold and dendrite to overcome the aforementioned drawback. In addition, we propose gradient-based optimization and early firing methods to further increase the efficiency of T2FSNN. According to our results, the proposed methods reduce the inference latency and number of spikes to 22% and less than 1%, respectively, of those of burst coding, the previous state-of-the-art result on CIFAR-100.
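The core mechanism named in the abstract, TTFS coding under a decaying dynamic threshold, can be illustrated with a minimal sketch. The integrate-and-fire loop, the exponential threshold kernel, and the parameters gain, tau, and T below are illustrative assumptions, not the paper's implementation; the point is only that each neuron fires at most once and that stronger inputs cross the falling threshold earlier.

```python
import numpy as np

# Minimal sketch of TTFS coding with a kernel-based dynamic threshold,
# assuming an integrate-and-fire neuron and an exponential threshold
# kernel; gain, tau, and T are illustrative, not the paper's values.

def ttfs_first_spike(x, T=20, tau=5.0, gain=0.1):
    """Return the first-spike time for an input intensity x in [0, 1]."""
    v = 0.0
    for t in range(T):
        v += gain * x                 # integrate a constant input current
        theta = np.exp(-t / tau)      # threshold decays over time
        if v >= theta:
            return t                  # fire once; the spike time encodes x
    return None                       # neuron stays silent within the window

# Stronger inputs cross the falling threshold earlier.
for x in (0.9, 0.5, 0.1):
    print(f"x={x}: first spike at t={ttfs_first_spike(x)}")
```

Because each neuron emits at most one spike per inference, spike counts drop sharply compared with rate-based coding, which is the source of the efficiency figures quoted above.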


Citations
Posted Content

Revisiting Batch Normalization for Training Low-latency Deep Spiking Neural Networks from Scratch.

TL;DR: A temporal Batch Normalization Through Time (BNTT) technique is proposed; varying the BN parameters at every time step allows the model to better learn the time-varying input distribution.
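A minimal sketch of that idea follows, keeping one BatchNorm2d per time step so that the affine parameters and running statistics can differ across steps. This module layout is an assumption for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Sketch of batch normalization with time-step-specific parameters,
# in the spirit of BNTT; one BatchNorm2d per time step is an assumed
# layout, not the authors' code.

class BNThroughTime(nn.Module):
    def __init__(self, num_features, num_timesteps):
        super().__init__()
        self.bns = nn.ModuleList(
            nn.BatchNorm2d(num_features) for _ in range(num_timesteps)
        )

    def forward(self, x, t):
        # x: (batch, channels, height, width) activations at time step t;
        # each step normalizes with its own parameters and statistics.
        return self.bns[t](x)

bntt = BNThroughTime(num_features=16, num_timesteps=25)
out = bntt(torch.randn(8, 16, 32, 32), t=3)
```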
Journal ArticleDOI

Neural Coding in Spiking Neural Networks: A Comparative Study for Robust Neuromorphic Systems.

TL;DR: In this paper, the impact and performance of four important neural coding schemes, namely rate coding, time-to-first-spike (TTFS) coding, phase coding, and burst coding, were compared under an unsupervised spike-timing-dependent plasticity (STDP) learning rule.
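To make the contrast concrete, here is a hedged sketch of two of the compared schemes for a single input intensity; the window length T and the Bernoulli rate encoder are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative contrast between rate coding (many spikes, the count
# carries the value) and TTFS coding (one spike, the latency carries
# the value); T and the Bernoulli encoder are assumptions.

def rate_encode(x, T=50):
    """Spike at each step with probability x: roughly x*T spikes."""
    return (rng.random(T) < x).astype(int)

def ttfs_encode(x, T=50):
    """Exactly one spike whose latency decreases as x grows."""
    train = np.zeros(T, dtype=int)
    train[int(round((1.0 - x) * (T - 1)))] = 1
    return train

x = 0.8
print(rate_encode(x).sum(), "spikes (rate) vs",
      ttfs_encode(x).sum(), "spike (TTFS)")
```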
Book ChapterDOI

Neural Architecture Search for Spiking Neural Networks

TL;DR: SNASNet proposes a Neural Architecture Search (NAS) approach for finding better SNN architectures by selecting, without training, the architecture that can represent diverse spike activation patterns across different data samples.
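A hedged sketch of the selection idea: score untrained candidates by how distinct their spike activation patterns are across input samples, and keep the highest-scoring one. The log-determinant scoring below follows NASWOT-style training-free metrics and is an assumption, not necessarily the paper's exact criterion.

```python
import numpy as np

# Assumed training-free score: architectures whose untrained binary
# spike patterns differ more across samples get a higher score.

def activation_diversity_score(patterns):
    """patterns: (num_samples, num_units) array of 0/1 spike patterns."""
    c = patterns.astype(np.float64)
    # K[i, j] counts units where samples i and j agree (both spike or
    # both stay silent); near-identical samples make K close to singular.
    K = c @ c.T + (1 - c) @ (1 - c).T
    sign, logdet = np.linalg.slogdet(K)
    return logdet if sign > 0 else -np.inf   # higher = more diverse

rng = np.random.default_rng(0)
candidates = [rng.integers(0, 2, size=(32, 256)) for _ in range(5)]
best = max(range(5), key=lambda i: activation_diversity_score(candidates[i]))
print("pick candidate", best)
```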
Proceedings Article

AutoSNN: Towards Energy-Efficient Spiking Neural Networks

TL;DR: This work investigates the design choices of previous studies in terms of accuracy and number of spikes, points out that they are not best suited for SNNs, and proposes AutoSNN, a spike-aware neural architecture search framework whose search space excludes these undesirable design choices.
Proceedings ArticleDOI

Spatio-Temporal Pruning and Quantization for Low-latency Spiking Neural Networks

TL;DR: In this article, spatial and temporal pruning of SNNs is proposed to reduce both the number of operations and the energy per operation, leading to 10-14x model compression.
References
Proceedings Article

Learning both weights and connections for efficient neural networks

TL;DR: In this paper, the authors propose a three-step method that learns only the important connections, reducing the storage and computation required by neural networks by an order of magnitude without affecting their accuracy.
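The three steps are train, prune low-magnitude connections, and retrain with the pruning mask held fixed. The prune step can be sketched as follows; the fixed per-layer sparsity rule is an assumed simplification of the paper's magnitude threshold.

```python
import numpy as np

# Sketch of the prune step in the train -> prune -> retrain pipeline;
# using a fixed sparsity quantile per layer is an assumption.

def magnitude_prune(weights, sparsity=0.9):
    """Zero out the smallest-magnitude weights and return the mask.

    During retraining, gradients are multiplied by the mask so pruned
    connections stay removed while the survivors readjust.
    """
    threshold = np.quantile(np.abs(weights), sparsity)
    mask = (np.abs(weights) > threshold).astype(weights.dtype)
    return weights * mask, mask

w = np.random.randn(256, 128).astype(np.float32)
w_pruned, mask = magnitude_prune(w, sparsity=0.9)
print(f"kept {mask.mean():.1%} of connections")
```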
Proceedings Article

EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks

TL;DR: This paper proposes a new scaling method that uniformly scales all dimensions of depth, width, and resolution using a simple yet highly effective compound coefficient; the resulting EfficientNet-B7 achieves state-of-the-art accuracy on ImageNet while being 8.4x smaller and 6.1x faster at inference than the best existing ConvNet.
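Concretely, a single compound coefficient phi scales depth, width, and resolution by alpha^phi, beta^phi, and gamma^phi, where the paper's grid search found alpha=1.2, beta=1.1, gamma=1.15 under the constraint alpha * beta^2 * gamma^2 ~ 2. A small sketch:

```python
# EfficientNet-style compound scaling; alpha, beta, gamma are the
# values reported in the paper, and phi is the user-chosen coefficient.

ALPHA, BETA, GAMMA = 1.2, 1.1, 1.15   # depth, width, resolution bases

def compound_scale(phi):
    depth = ALPHA ** phi        # multiply layer count by this factor
    width = BETA ** phi         # multiply channel count by this factor
    resolution = GAMMA ** phi   # multiply input resolution by this factor
    return depth, width, resolution

for phi in range(4):
    d, w, r = compound_scale(phi)
    print(f"phi={phi}: depth x{d:.2f}, width x{w:.2f}, resolution x{r:.2f}")
```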
Journal ArticleDOI

A million spiking-neuron integrated circuit with a scalable communication network and interface

TL;DR: Inspired by the brain’s structure, an efficient, scalable, and flexible non–von Neumann architecture is developed that leverages contemporary silicon technology and is well suited to many applications that use complex neural networks in real time, for example, multiobject detection and classification.
Journal ArticleDOI

Networks of spiking neurons: the third generation of neural network models

TL;DR: It is shown that networks of spiking neurons are, with regard to the number of neurons needed, computationally more powerful than other neural network models based on McCulloch-Pitts neurons and sigmoidal gates.