Journal Article

Minimum Complexity Echo State Network

Ali Rodan, +1 more
01 Jan 2011 · Vol. 22, Iss. 1, pp. 131-144
TLDR
It is shown that a simple deterministically constructed cycle reservoir is comparable to the standard echo state network methodology, and that the (short-term) memory capacity of linear cyclic reservoirs can be made arbitrarily close to the proved optimal value.
Abstract
Reservoir computing (RC) refers to a new class of state-space models with a fixed state transition structure (the reservoir) and an adaptable readout from the state space. The reservoir is supposed to be sufficiently complex so as to capture a large number of features of the input stream that can be exploited by the reservoir-to-output readout mapping. The field of RC has been growing rapidly with many successful applications. However, RC has been criticized for not being principled enough. Reservoir construction is largely driven by a series of randomized model-building stages, with both researchers and practitioners having to rely on trial and error. To initialize a systematic study of the field, we concentrate on one of the most popular classes of RC methods, namely echo state networks (ESNs), and ask: What is the minimal complexity of reservoir construction for obtaining competitive models, and what is the memory capacity (MC) of such simplified reservoirs? On a number of widely used time series benchmarks of different origin and characteristics, as well as by conducting a theoretical analysis, we show that a simple deterministically constructed cycle reservoir is comparable to the standard echo state network methodology. The (short-term) memory capacity of linear cyclic reservoirs can be made arbitrarily close to the proved optimal value.
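The deterministically constructed cycle reservoir described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's reference implementation: the reservoir size `N`, cycle weight `r`, input weight magnitude `v`, the random choice of input signs (the paper fixes the signs deterministically), the toy sine prediction task, and the ridge-regression readout are all assumptions made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)
N, r, v = 50, 0.9, 0.5  # reservoir size, cycle weight, input weight magnitude (assumed values)

# Cycle reservoir: unit i feeds unit i+1 (mod N), all connections share the weight r.
W = np.zeros((N, N))
W[np.arange(1, N), np.arange(0, N - 1)] = r
W[0, N - 1] = r  # close the cycle

# Input weights: a single magnitude v; signs chosen randomly here for simplicity.
w_in = v * rng.choice([-1.0, 1.0], size=N)

def run_reservoir(u):
    """Drive the reservoir with a scalar input sequence u; return all states."""
    x = np.zeros(N)
    states = np.empty((len(u), N))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + w_in * u_t)
        states[t] = x
    return states

# Train a linear readout to predict u(t+1) from the state x(t) on a toy signal.
u = np.sin(0.2 * np.arange(500))
X, y = run_reservoir(u[:-1]), u[1:]
ridge = 1e-6
w_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)
mse = np.mean((X @ w_out - y) ** 2)
```

Because the cycle matrix has spectral radius |r| < 1, the reservoir satisfies the echo state property; only the readout `w_out` is trained, which is the defining feature of RC.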


Citations
Journal Article

Information processing using a single dynamical node as complex system

TL;DR: This work introduces a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback and proves that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing.
Journal Article

Parallel photonic information processing at gigabyte per second data rates using transient states

TL;DR: The potential of a simple photonic architecture to process information at unprecedented data rates is demonstrated using a learning-based approach; all digits are identified with very low classification error, and chaotic time-series prediction is performed with 10% error.
Journal Article

Artificial Neural Networks-Based Machine Learning for Wireless Networks: A Tutorial

TL;DR: This paper constitutes the first holistic tutorial on the development of ANN-based ML techniques tailored to the needs of future wireless networks and overviews how artificial neural networks (ANNs)-based ML algorithms can be employed for solving various wireless networking problems.
Journal Article

Photonic information processing beyond Turing: an optoelectronic implementation of reservoir computing.

TL;DR: This work experimentally demonstrate optical information processing using a nonlinear optoelectronic oscillator subject to delayed feedback and implements a neuro-inspired concept, called Reservoir Computing, proven to possess universal computational capabilities.
Book Chapter

A Practical Guide to Applying Echo State Networks

TL;DR: Practical techniques and recommendations for successfully applying Echo State Networks are presented, along with some more advanced application-specific modifications.
References
Journal Article

Novel approach to nonlinear/non-Gaussian Bayesian state estimation

TL;DR: An algorithm, the bootstrap filter, is proposed for implementing recursive Bayesian filters, represented as a set of random samples, which are updated and propagated by the algorithm.
Journal Article

Learning long-term dependencies with gradient descent is difficult

TL;DR: This work shows why gradient based learning algorithms face an increasingly difficult problem as the duration of the dependencies to be captured increases, and exposes a trade-off between efficient learning by gradient descent and latching on information for long periods.
Journal Article

Real-time computing without stable states: a new framework for neural computation based on perturbations

TL;DR: A new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks, based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.
Journal Article

Harnessing Nonlinearity: Predicting Chaotic Systems and Saving Energy in Wireless Communication

TL;DR: A method for learning nonlinear systems, echo state networks (ESNs), which employ artificial recurrent neural networks in a way that has recently been proposed independently as a learning mechanism in biological brains is presented.
Journal Article

A Two-dimensional Mapping with a Strange Attractor

TL;DR: In this article, the same properties can be observed in a simple mapping of the plane defined by \(x_{i+1} = y_i + 1 - ax_i^2,\ y_{i+1} = bx_i\).
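The mapping above (the Hénon map) can be iterated directly; as a small sketch, the parameter values a = 1.4 and b = 0.3 are the classic choice from Hénon's paper that produces the strange attractor, and the starting point (0, 0) is an arbitrary assumption here.

```python
def henon(n, a=1.4, b=0.3, x=0.0, y=0.0):
    """Iterate x_{i+1} = y_i + 1 - a*x_i**2, y_{i+1} = b*x_i for n steps."""
    traj = []
    for _ in range(n):
        x, y = y + 1.0 - a * x * x, b * x
        traj.append((x, y))
    return traj

traj = henon(1000)  # from (0, 0): first iterate is (1.0, 0.0), then (-0.4, 0.3), ...
```

For these parameter values the trajectory from the origin remains bounded, settling onto the attractor; such chaotic series are a standard benchmark for the reservoir models discussed above.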