Survey: Reservoir computing approaches to recurrent neural network training
Citations
...As done in Martens and Sutskever (2011), we address the pathological problems proposed by Hochreiter and Schmidhuber (1997) that require learning long-term correlations....
...Echo State Networks (Lukoševičius and Jaeger, 2009) avoid the exploding and vanishing gradients problem by not learning the recurrent and input weights....
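The excerpt above states the core Echo State Network trick: the recurrent and input weights stay fixed and random, so no gradient ever flows through the recurrence, and only a linear readout is trained. A minimal sketch of that idea, assuming a tanh reservoir with its spectral radius scaled below 1 and a ridge-regression readout; all dimensions, scalings, and the toy delay task are illustrative choices, not taken from the survey:

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir dimensions (illustrative values)
n_in, n_res = 1, 100

# Fixed random input and recurrent weights -- never trained
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect states x(n)."""
    x = np.zeros(n_res)
    states = []
    for u_n in u:
        x = np.tanh(W_in @ np.atleast_1d(u_n) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy memory task: reproduce the input signal delayed by 3 steps
u = np.sin(np.arange(300) * 0.2)
y = np.roll(u, 3)  # target: y(n) = u(n - 3)

X = run_reservoir(u)[50:]  # discard the initial transient (washout)
Y = y[50:]

# Only the linear readout W_out is learned, by ridge regression
reg = 1e-6
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ Y)

mse = np.mean((X @ W_out - Y) ** 2)
print("train MSE:", mse)
```

Because no error signal is backpropagated through time, the exploding/vanishing-gradient problem never arises; training reduces to one linear least-squares solve.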
...An overview of echo-state networks in the context of reservoir computing principles is provided in [301]....
References
...Evolino [46] transfers the idea of ESNs from an RNN of simple sigmoidal units to a Long Short-Term Memory type of RNNs [40] constructed from units capable of preserving memory for long periods of time....
...There are also other versions of supervised RNN training, formulating the training problem differently, such as using Extended Kalman Filters [38] or the Expectation-Maximization algorithm [39], as well as dealing with special types of RNNs, such as Long Short-Term Memory [40] modular networks capable of learning long-term dependencies....
...Other types of high-dimensional dynamical systems that can take an input u(n) and have an observable state x(n) (which does not necessarily fully describe the state of the system) can be used as well....