Open Access · Journal Article · DOI

A unified information perceptron using deep reservoir computing

TLDR
This paper presents a unified information-processing structure that fuses a convolutional or fully connected neural network with a delay-feedback reservoir into a hybrid neural network model, so that one architecture can handle the full information-processing task.
About
This article is published in Computers & Electrical Engineering. It was published on 2020-07-01 and is currently open access. It has received 5 citations to date. The article focuses on the topics: Reservoir computing & MNIST database.
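The delay-feedback reservoir mentioned above can be illustrated with a minimal sketch: a single nonlinear node whose delay line is time-multiplexed into "virtual" nodes, each acting as one reservoir feature. All names and parameter values below (mask amplitude, feedback gain `eta`, input gain `gamma`) are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def delay_feedback_reservoir(u, n_virtual=50, eta=0.5, gamma=0.05):
    """Drive a single nonlinear node with delayed feedback.

    Each input sample u[t] is time-multiplexed across `n_virtual`
    virtual nodes via a fixed random mask; the node state at each
    virtual node forms one reservoir feature for that time step.
    """
    rng = np.random.default_rng(0)
    mask = rng.choice([-0.1, 0.1], size=n_virtual)  # fixed input mask
    states = np.zeros((len(u), n_virtual))
    x = np.zeros(n_virtual)                         # delay-line contents
    for t, sample in enumerate(u):
        for i in range(n_virtual):
            # nonlinear node: delayed feedback plus masked input
            x[i] = np.tanh(eta * x[i] + gamma * mask[i] * sample)
        states[t] = x
    return states

features = delay_feedback_reservoir(np.sin(np.linspace(0, 8 * np.pi, 200)))
print(features.shape)  # (200, 50)
```

In a hybrid model of the kind the TL;DR describes, features like these would be combined with (or fed from) a convolutional or fully connected front end before a trained readout.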


Citations
Journal Article · DOI

Reservoir-based convolution

TL;DR: Zhang et al. propose a novel convolutional neural network based on reservoir computing (RC) whose readout can be optimized by ridge regression rather than back-propagation.
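Training only a reservoir's linear readout by ridge regression, as described above, has a closed-form solution. The toy sketch below uses a generic echo state network on a next-step sine prediction task, not the authors' reservoir-based convolution; all sizes and the regularization value are assumptions.

```python
import numpy as np

# Toy echo state network: fixed random recurrent weights, with only
# the linear readout trained via ridge regression (no back-propagation).
rng = np.random.default_rng(42)
n_in, n_res = 1, 100

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

def run_reservoir(u):
    """Collect reservoir states for an input sequence u."""
    x = np.zeros(n_res)
    states = []
    for sample in u:
        x = np.tanh(W_in @ np.atleast_1d(sample) + W @ x)
        states.append(x.copy())
    return np.array(states)

u = np.sin(np.linspace(0, 6 * np.pi, 300))
X = run_reservoir(u[:-1])   # reservoir states
y = u[1:]                   # one-step-ahead targets

lam = 1e-6                  # ridge regularization strength
W_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ y)
pred = X @ W_out
mse = np.mean((pred - y) ** 2)
```

The closed-form solve replaces iterative gradient descent entirely, which is the main computational appeal of RC-style training.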
Journal Article · DOI

Chain-structure time-delay reservoir computing for synchronizing chaotic signal and an application to secure communication

TL;DR: A chain-structure time-delay reservoir (CSTDR) is proposed for synchronizing chaotic signals, and a novel secure-communication scheme is designed in which the smart receiver adaptively synchronizes to the chaotic signal used for encryption.

In-material reservoir implementation of reservoir-based convolution

TL;DR: A reservoir-based convolutional neural network (CNN) is implemented on a physical reservoir computing (RC) substrate to build an efficient image-recognition system for edge AI; the in-material reservoir achieves 81.7% accuracy on an image classification task, compared with 87.7% for an echo state network-based CNN.
Peer Review

Design and Optimization of Temporal Encoders Using Integrate-and-Fire and Leaky Integrate-and-Fire Neurons

TL;DR: The authors propose a new form of signal processing that mimics biological neural networks with electrical components, encoding signals directly using analog temporal encoders from SNNs.
References
Proceedings Article · DOI

Deep Residual Learning for Image Recognition

TL;DR: The authors propose a residual learning framework that eases the training of networks substantially deeper than those used previously; the approach won 1st place in the ILSVRC 2015 classification task.
Journal Article · DOI

Long short-term memory

TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete time steps by enforcing constant error flow through constant error carousels within special units.
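The "constant error carousel" mentioned above is the additive cell-state update of the LSTM, which lets error gradients flow across long time lags without vanishing. A minimal single-step sketch (weight shapes, initialization, and sizes are illustrative assumptions, not from the paper):

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def lstm_step(x, h, c, params):
    """One LSTM step. The cell state c is updated additively
    (f * c + i * g): the 'constant error carousel'."""
    Wf, Wi, Wo, Wg, bf, bi, bo, bg = params
    z = np.concatenate([x, h])
    f = sigmoid(Wf @ z + bf)   # forget gate
    i = sigmoid(Wi @ z + bi)   # input gate
    o = sigmoid(Wo @ z + bo)   # output gate
    g = np.tanh(Wg @ z + bg)   # candidate cell update
    c = f * c + i * g          # additive carousel update
    h = o * np.tanh(c)         # gated hidden output
    return h, c

rng = np.random.default_rng(0)
n_x, n_h = 3, 4
params = [rng.normal(0, 0.1, (n_h, n_x + n_h)) for _ in range(4)] \
       + [np.zeros(n_h) for _ in range(4)]
h = c = np.zeros(n_h)
for t in range(10):
    h, c = lstm_step(rng.normal(size=n_x), h, c, params)
print(h.shape, c.shape)  # (4,) (4,)
```

Because the cell update is a sum rather than a repeated matrix multiplication, the local gradient along `c` stays close to the forget-gate value instead of shrinking multiplicatively at every step.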
Journal Article · DOI

Gradient-based learning applied to document recognition

TL;DR: A graph transformer network (GTN) is proposed for handwritten character recognition; it synthesizes a complex decision surface capable of classifying high-dimensional patterns such as handwritten characters.
Dissertation

Learning Multiple Layers of Features from Tiny Images

TL;DR: The authors describe how to train a multi-layer generative model of natural images using a dataset of millions of tiny colour images.
Proceedings Article

Understanding the difficulty of training deep feedforward neural networks

TL;DR: The objective is to understand why standard gradient descent from random initialization performs so poorly with deep neural networks, in order to explain recent relative successes and help design better algorithms in the future.