
Showing papers in "Neural Networks in 2007"


Journal ArticleDOI
TL;DR: Three different uses of a recurrent neural network as a reservoir that is not trained but instead read out by a simple external classification layer are compared, and a new measure for the reservoir dynamics based on Lyapunov exponents is introduced (a minimal sketch of such a reservoir-plus-readout setup follows this entry).

930 citations
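The setup summarized above — a random, untrained recurrent reservoir whose states are read out by a simple trained layer — is easy to sketch. The following is a minimal illustration only, not the authors' implementation: the reservoir size, spectral radius, toy task and ridge parameter are all assumed for demonstration.

```python
import numpy as np

# Minimal echo state network sketch: the recurrent reservoir is random and
# never trained; only the linear readout is fitted (here by ridge regression).
rng = np.random.default_rng(0)
n_in, n_res = 1, 200                                       # assumed sizes
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W_res = rng.standard_normal((n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))    # spectral radius below 1

def run_reservoir(u_seq):
    """Drive the untrained reservoir with an input sequence and collect states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W_res @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task (illustrative only): one-step-ahead prediction of a sine wave.
u = np.sin(0.1 * np.arange(1000))
X, y = run_reservoir(u[:-1]), u[1:]
ridge = 1e-6                                               # assumed regularization strength
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
print("train MSE:", np.mean((X @ W_out - y) ** 2))
```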


Journal ArticleDOI
TL;DR: Stability conditions are presented, a stochastic gradient descent method is introduced, and the usefulness of leaky-integrator ESNs is demonstrated for learning very slow dynamic systems and replaying the learnt system at different speeds (the leaky state update is sketched after this entry).

740 citations
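For context, the leaky-integrator variant referenced here differs from a standard ESN only in its state update, where a leak rate mixes the previous state with the new activation; small leak rates slow the effective time scale. A minimal sketch of that update, with an assumed example leak rate, is:

```python
import numpy as np

def leaky_esn_step(x, u, W_in, W_res, leak=0.1):
    """One leaky-integrator ESN state update (leak=0.1 is an assumed example).

    The new state is a convex mixture of the previous state and the usual
    tanh activation; smaller leak rates give slower effective dynamics,
    which is what makes very slow target systems learnable.
    """
    activation = np.tanh(W_in @ u + W_res @ x)
    return (1.0 - leak) * x + leak * activation
```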


Journal ArticleDOI
TL;DR: In this article, the significance of the edge of chaos for real-time computations in neural microcircuit models consisting of spiking neurons and dynamic synapses is analyzed, and a new method for predicting the computational performance of neural microcircuits is proposed (a generic edge-of-chaos diagnostic is sketched after this entry).

373 citations
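Whether a recurrent circuit operates in the ordered regime, at the edge of chaos, or in the chaotic regime is commonly diagnosed by the sign of its largest Lyapunov exponent. The sketch below estimates that exponent for a generic tanh reservoir via the standard tangent-space method; it is a generic diagnostic under assumed dynamics, not the spiking-microcircuit predictor proposed in the paper.

```python
import numpy as np

def largest_lyapunov(W_in, W_res, u_seq, n_discard=100, seed=0):
    """Estimate the largest Lyapunov exponent of a driven tanh reservoir.

    Standard tangent-space method: propagate a unit perturbation through the
    update Jacobian diag(1 - x**2) @ W_res and average the log growth rate.
    A negative estimate indicates ordered dynamics, a positive one chaos.
    """
    n = W_res.shape[0]
    x = np.zeros(n)
    v = np.random.default_rng(seed).standard_normal(n)
    v /= np.linalg.norm(v)
    log_growth = []
    for t, u in enumerate(u_seq):
        x = np.tanh(W_in @ np.atleast_1d(u) + W_res @ x)
        J = (1.0 - x ** 2)[:, None] * W_res      # Jacobian of the state update
        v = J @ v
        gain = np.linalg.norm(v)
        v /= gain
        if t >= n_discard:                       # skip the initial transient
            log_growth.append(np.log(gain))
    return float(np.mean(log_growth))
```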


Journal ArticleDOI
TL;DR: It is shown that, under mild conditions, global convergence to a concurrently synchronized regime is preserved under basic system combinations such as negative feedback or hierarchies, so that stable concurrently synchronized aggregates of arbitrary size can be constructed.

262 citations


Journal ArticleDOI
TL;DR: An echo state network (ESN) is combined with a competitive state machine framework to create a classification engine called the predictive ESN classifier, which was significantly more noise robust than a hidden Markov model in noisy speech classification experiments.

211 citations


Journal ArticleDOI
TL;DR: An enhanced self-organizing incremental neural network (ESOINN) is proposed to accomplish online unsupervised learning tasks and is more stable than SOINN.

198 citations


Journal ArticleDOI
TL;DR: The cerebellum, which has to date been considered a biological counterpart of the perceptron, is reinterpreted as a liquid state machine whose information processing capability exceeds that of a perceptron.

184 citations


Journal ArticleDOI
TL;DR: It is shown that the memory of ESNs in this word-prediction task, although noisy, extends significantly beyond that of bigrams and trigrams, enabling ESNs to make good predictions of verb agreement at distances over which these methods operate at chance.

169 citations


Journal ArticleDOI
TL;DR: In a real-life prediction task using noisy sea clutter data, both schemes exhibit higher prediction accuracy and successful design ratio than a conventional ESN with a single reservoir.

168 citations


Journal ArticleDOI
TL;DR: Echo state networks and liquid state machines are viewed as further members of this family of versatile basic computational metaphors with a clear biological footing, which makes them amenable to mathematical analysis and invites mapping to biological brains in many ways.

164 citations



Journal ArticleDOI
Jochen J. Steil
TL;DR: It is shown experimentally that a biologically motivated learning rule based on neural intrinsic plasticity can drive the neurons' output activities to approximate exponential distributions and implement sparse codes in the reservoir.
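Intrinsic plasticity adapts the gain and bias of each neuron so that its output distribution approaches a target; an exponential target yields the sparse codes mentioned above. A commonly used formulation — a Triesch-style gradient rule for a logistic neuron, shown here as an assumption about the rule's general shape rather than the paper's exact derivation — looks like this:

```python
import numpy as np

def ip_step(a, b, x, y, eta=1e-3, mu=0.2):
    """One intrinsic-plasticity update for a logistic neuron y = 1/(1 + exp(-(a*x + b))).

    Gradient rule pushing the output distribution toward an exponential with
    mean mu, which yields sparse activity; eta and mu are assumed example values.
    """
    db = eta * (1.0 - (2.0 + 1.0 / mu) * y + (y * y) / mu)
    da = eta / a + db * x
    return a + da, b + db
```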

Journal ArticleDOI
TL;DR: This model is instantiated with a sparse recurrently connected neural network that has spiking leaky-integrator units and continuous Hebbian learning, and it is concluded that an implementation on a cluster computer is bounded by computation rather than communication.

Journal ArticleDOI
TL;DR: This work describes a "hybrid" cognitive architecture that is implementable in neuronal nets, and which has uniform brainlike features, including activation-passing and highly distributed "codelets," implementable as small-scale neural nets.

Journal ArticleDOI
TL;DR: The experimental results on two data sets studied in this paper demonstrate that the DEPSO algorithm performs better in RNN training, and the RNN-based model can provide meaningful insight in capturing the nonlinear dynamics of genetic networks and revealing genetic regulatory interactions.

Journal ArticleDOI
TL;DR: Computer simulations show that training of the CPG can be successfully performed by the proposed CPG-actor-critic method, thus allowing the biped robot to not only walk stably but also adapt to environmental changes.

Journal ArticleDOI
TL;DR: This work proposes a generalization of CCA to several data sets, which is shown to be equivalent to the classical maximum variance (MAXVAR) generalization proposed by Kettenring.

Journal ArticleDOI
TL;DR: It is demonstrated that the new network can lead to a parsimonious model with much better generalization properties compared with traditional single-width RBF networks.
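The contrast here is with conventional RBF networks in which all basis functions share a single width; giving each centre its own width is what allows a more parsimonious model. A generic sketch of an RBF regressor with per-centre widths follows; the centres, widths, toy data and least-squares fit are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def rbf_design(X, centres, widths):
    """Gaussian RBF design matrix in which every centre has its own width."""
    sq_dist = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dist / (2.0 * widths ** 2))    # widths broadcast per centre

# Toy 1-D regression with hand-picked centres and varying widths (illustrative only).
rng = np.random.default_rng(1)
X = np.linspace(-3.0, 3.0, 200)[:, None]
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)
centres = np.linspace(-3.0, 3.0, 10)[:, None]
widths = np.linspace(0.3, 1.5, 10)                   # a different width for each centre
Phi = rbf_design(X, centres, widths)
w = np.linalg.lstsq(Phi, y, rcond=None)[0]
print("fit MSE:", np.mean((Phi @ w - y) ** 2))
```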

Journal ArticleDOI
TL;DR: This work formalizes the Bayesian Student-t mixture model as a latent variable model in a different way from Svensén and Bishop, and expects that the lower bound on the log-evidence is tighter and the model complexity can be inferred with a higher confidence.

Journal ArticleDOI
TL;DR: HYPINV is the only pedagogical rule extraction method that extracts hyperplane rules from neural networks with continuous or binary attributes, and it is able to generate rules with arbitrarily desired fidelity, maintaining a fidelity-complexity tradeoff.

Journal ArticleDOI
TL;DR: The present article proposes that hippocampal spatial and temporal processing are carried out by parallel circuits within entorhinal cortex, dentate gyrus and CA3 that are variations of the same circuit design.

Journal ArticleDOI
TL;DR: Under easily verified conditions, exponential stability is obtained when the delay is finite, while asymptotic stability is obtained when the delay is infinite.

Journal ArticleDOI
TL;DR: In this article, the authors view consciousness as operating simultaneously in a field at all levels ranging from the subatomic to the social, and argue that the relations and transpositions between levels require sophisticated mathematical treatments that are still to be devised.

Journal ArticleDOI
TL;DR: The advantages of the ESN WAM are that it learns the dynamics of the power system in a shorter training time with a higher accuracy and with considerably fewer weights to be adapted compared to the design-based on a TDNN.

Journal ArticleDOI
TL;DR: This article demonstrates that the concept of sensorimotor anticipation, whereby a robot learns to predict how its visual input changes under movement commands, can be realized in a mobile robot.

Journal ArticleDOI
TL;DR: It is demonstrated that the combination of STDP and IP shapes the network structure and dynamics in ways that allow the discovery of patterns in input time series and lead to good performance in time series prediction.

Journal ArticleDOI
TL;DR: This paper is concerned with the problem of robust stability for stochastic interval delayed additive neural networks (SIDANN) with Markovian switching, and proposes a mathematical model of this kind of neural network, with stability analysed by the Lyapunov method.

Journal ArticleDOI
TL;DR: A neural model of saccade initiation based on competitive integration of planned and reactive saccade decision signals in the intermediate layer of the superior colliculus is presented; the decision processes grow nonlinearly towards a preset criterion level, and when they cross it, a movement is initiated.
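The mechanism described — competing decision signals accumulating toward a preset criterion, with the saccade triggered at the crossing — can be illustrated with a toy race model. All rates, the mutual inhibition strength, the noise level and the criterion in the sketch below are arbitrary assumptions, not the model's fitted parameters.

```python
import numpy as np

def race_to_threshold(rate_planned=0.8, rate_reactive=1.1, inhibition=0.3,
                      criterion=1.0, dt=0.001, noise=0.1, seed=0):
    """Two mutually inhibiting decision signals accumulate toward a criterion;
    whichever crosses it first initiates the saccade (illustrative values only)."""
    rng = np.random.default_rng(seed)
    planned, reactive, t = 0.0, 0.0, 0.0
    while planned < criterion and reactive < criterion:
        drive_p = rate_planned - inhibition * reactive
        drive_r = rate_reactive - inhibition * planned
        planned = max(0.0, planned + dt * drive_p + noise * np.sqrt(dt) * rng.standard_normal())
        reactive = max(0.0, reactive + dt * drive_r + noise * np.sqrt(dt) * rng.standard_normal())
        t += dt
    winner = "planned" if planned >= criterion else "reactive"
    return winner, t

winner, latency = race_to_threshold()
print(f"{winner} saccade initiated after ~{latency * 1000:.0f} ms")
```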

Journal ArticleDOI
TL;DR: Simulations in which two networks interact provide the beginnings of a computational mechanism to account for mental attitudes, that is, an understanding by a cognitive system of the manner in which its first-order knowledge is held.

Journal ArticleDOI
TL;DR: A method that drastically reduces the complexity of time-frequency representations by modelling the maps with elementary functions is described, validated on artificial signals, and subsequently applied to electrophysiological brain signals recorded from the olfactory bulb of rats while they are trained to recognize odours.