Journal ArticleDOI

Real-time computing without stable states: a new framework for neural computation based on perturbations

TL;DR: A new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks; it is based on principles of high-dimensional dynamical systems combined with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.
Abstract: A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.
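To make the computational scheme concrete, the following is a minimal rate-based sketch of the liquid-state-machine idea, not the paper's spiking implementation: a fixed random recurrent circuit turns the input stream into high-dimensional transient states, and a linear readout is trained afterwards to recover information about past inputs from the current state alone. The reservoir size, spectral-radius scaling, tanh units, ridge regression, and delay-recall task are all illustrative assumptions.

```python
# Minimal rate-based sketch of the liquid-state-machine idea (illustrative only):
# a fixed random recurrent "liquid" turns an input stream into high-dimensional
# transient states, and a linear readout is trained to recover information about
# past inputs from the current state alone. All parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
N, T, delay = 200, 2000, 5                   # reservoir size, stream length, memory depth

W_in = rng.normal(0, 0.5, size=N)            # input weights
W = rng.normal(0, 1.0, size=(N, N))          # recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # scale down so the memory fades

u = rng.uniform(-1, 1, size=T)               # time-varying input stream
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):                           # transient dynamics, never settling
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Readout task: report the input from `delay` steps ago using only the current state.
X, y = states[delay:], u[:-delay]
w_out = np.linalg.solve(X.T @ X + 1e-3 * np.eye(N), X.T @ y)   # ridge regression

pred = X @ w_out
print("readout correlation with delayed input:", np.corrcoef(pred, y)[0, 1])
```

The point of the example is that no stable internal state is ever reached; the readout simply projects the transient trajectory onto a stable target, which is possible because the state space is high-dimensional.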


Citations
Book
18 Nov 2016
TL;DR: Deep learning as mentioned in this paper is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts, and it is used in many applications such as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames.
Abstract: Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors.

38,208 citations

Journal ArticleDOI
TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, reviewing deep supervised learning, unsupervised learning, reinforcement learning & evolutionary computation, and indirect search for short programs encoding deep and large networks.

14,635 citations


Cites result from "Real-time computing without stable ..."

  • ...In fact, some popular RNN algorithms restricted credit assignment to a single step backwards (Elman, 1990; Jordan, 1986, 1997), also in more recent studies (Jaeger, 2001, 2004; Maass et al., 2002)....


Journal ArticleDOI
02 Apr 2004-Science
TL;DR: A method for learning nonlinear systems, echo state networks (ESNs), which employ artificial recurrent neural networks in a way that has recently been proposed independently as a learning mechanism in biological brains is presented.
Abstract: We present a method for learning nonlinear systems, echo state networks (ESNs). ESNs employ artificial recurrent neural networks in a way that has recently been proposed independently as a learning mechanism in biological brains. The learning method is computationally efficient and easy to use. On a benchmark task of predicting a chaotic time series, accuracy is improved by a factor of 2400 over previous techniques. The potential for engineering applications is illustrated by equalizing a communication channel, where the signal error rate is improved by two orders of magnitude.
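The construction behind this abstract is that only the output weights are trained; the recurrent weights are generated randomly and merely rescaled so that the network forgets its initial conditions (the echo state property). Below is a hedged sketch of that recipe on a toy one-step-ahead prediction task; the sparsity level, spectral radius, washout length, and signal are invented for illustration and are not the paper's benchmark setup.

```python
# Sketch of echo-state-network training (illustrative, not the paper's exact setup):
# only the linear output weights are learned; the recurrent matrix is merely rescaled
# so that its spectral radius is below 1, giving the echo state (fading-memory) property.
import numpy as np

rng = np.random.default_rng(1)
N, T = 300, 3000
W = rng.normal(size=(N, N)) * (rng.random((N, N)) < 0.05)   # sparse random reservoir
W *= 0.95 / max(abs(np.linalg.eigvals(W)))                  # echo state scaling
W_in = rng.uniform(-1, 1, size=N)

s = np.sin(np.arange(T) * 0.2) * np.cos(np.arange(T) * 0.031)  # toy signal to predict

x = np.zeros(N)
states = []
for t in range(T - 1):
    x = np.tanh(W @ x + W_in * s[t])        # drive the reservoir with the signal itself
    states.append(x.copy())

X = np.array(states[100:])                  # discard the initial transient (washout)
y = s[101:]                                 # one-step-ahead targets
W_out = np.linalg.lstsq(X, y, rcond=None)[0]
print("train MSE:", np.mean((X @ W_out - y) ** 2))
```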

3,122 citations

Book
15 Aug 2002
TL;DR: An introduction to spiking neuron models that covers single-neuron dynamics and spike generation, models of synaptic plasticity, and the behaviour of large populations of coupled neurons.
Abstract: Neurons in the brain communicate by short electrical pulses, the so-called action potentials or spikes. How can we understand the process of spike generation? How can we understand information transmission by neurons? What happens if thousands of neurons are coupled together in a seemingly random network? How does the network connectivity determine the activity patterns? And, vice versa, how does the spike activity influence the connectivity pattern? These questions are addressed in this 2002 introduction to spiking neurons aimed at those taking courses in computational neuroscience, theoretical biology, biophysics, or neural networks. The approach will suit students of physics, mathematics, or computer science; it will also be useful for biologists who are interested in mathematical modelling. The text is enhanced by many worked examples and illustrations. There are no mathematical prerequisites beyond what the audience would meet as undergraduates: more advanced techniques are introduced in an elementary, concrete fashion when needed.
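The book's starting point is the question of how a spike is generated; the leaky integrate-and-fire model is the simplest answer discussed in texts of this kind. A minimal simulation with illustrative constants (not taken from the book):

```python
# Minimal leaky integrate-and-fire neuron, the simplest spiking model covered in
# introductions like this one. Constants below are illustrative assumptions.
import numpy as np

dt, T = 0.1, 200.0                                             # time step and duration (ms)
tau_m, v_rest, v_thresh, v_reset = 10.0, -65.0, -50.0, -65.0   # ms, mV
R, I_ext = 10.0, 2.0                                           # MOhm, nA (constant input)

v = v_rest
spike_times = []
for step in range(int(T / dt)):
    # membrane potential relaxes toward v_rest + R*I and fires on crossing threshold
    v += dt / tau_m * (-(v - v_rest) + R * I_ext)
    if v >= v_thresh:
        spike_times.append(step * dt)
        v = v_reset

print(f"{len(spike_times)} spikes in {T} ms, first at {spike_times[0]:.1f} ms")
```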

2,814 citations

Book
01 Jan 2018

2,291 citations


Cites background from "Real-time computing without stable ..."

  • ...Early variants of the recurrent neural network included the echo-state network [219], which is also referred to as the liquid-state machine [304]....


  • ...Echo-state networks are also referred to as liquid-state machines [304], except that the latter uses spiking neurons with binary outputs, whereas echo-state networks use conventional activations like the sigmoid and the tanh functions....


References
01 Jan 1998
TL;DR: Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools and the application of these estimates to real-life problems.
Abstract: A comprehensive look at learning and generalization theory. The statistical theory of learning and generalization concerns the problem of choosing desired functions on the basis of empirical data. Highly applicable to a variety of computer science and robotics fields, this book offers lucid coverage of the theory as a whole. Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools and the application of these estimates to real-life problems.
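"Function estimation from small data pools" refers to distribution-free bounds that relate the expected risk of a learned function to its empirical risk through the capacity (VC dimension) of the function class. One standard form of such a bound, stated here from memory, so the constants should be treated as indicative only:

```latex
% With probability at least 1 - \delta over an i.i.d. sample of size n, uniformly over
% a function class of VC dimension h (one standard form of the bound):
R(f) \;\le\; R_{\mathrm{emp}}(f) \;+\; \sqrt{\frac{h\left(\ln\tfrac{2n}{h} + 1\right) + \ln\tfrac{4}{\delta}}{n}}
```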

26,531 citations

Book
01 Jan 1991
TL;DR: This book is a detailed, logically developed treatment that covers the theory and uses of collective computational networks, including associative memory, feedforward networks, and unsupervised learning.
Abstract: From the Publisher: This book is a comprehensive introduction to the neural network models currently under intensive study for computational applications. It is a detailed, logically developed treatment that covers the theory and uses of collective computational networks, including associative memory, feedforward networks, and unsupervised learning. It also provides coverage of neural network applications in a variety of problems of both theoretical and practical interest.
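"Associative memory" here means attractor networks of the Hopfield type, the stable-states paradigm that the citing paper explicitly contrasts with its liquid-state approach: patterns are written into symmetric weights by a Hebbian rule and recalled by letting the dynamics settle into a fixed point. A minimal sketch with invented sizes:

```python
# Hebbian storage and attractor recall in a Hopfield-style associative memory
# (illustrative sizes and corruption level, not taken from the book).
import numpy as np

rng = np.random.default_rng(2)
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))      # stored +/-1 patterns

W = (patterns.T @ patterns) / N                  # Hebbian outer-product rule
np.fill_diagonal(W, 0)

# Recall: start from a corrupted version of pattern 0 and iterate to a fixed point.
state = patterns[0].copy()
flip = rng.choice(N, size=20, replace=False)
state[flip] *= -1
for _ in range(10):
    state = np.sign(W @ state)
    state[state == 0] = 1

print("overlap with stored pattern:", (state @ patterns[0]) / N)
```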

7,518 citations


"Real-time computing without stable ..." refers background in this paper

  • ...Readout neurons can learn to extract in real time from the current state of such recurrent neural circuit information about current and past inputs that may be needed for diverse tasks....


Book
05 Jun 1975
TL;DR: A chapter-by-chapter treatment of the synaptic organization of the brain, moving from synaptic circuits, membrane properties, and neurotransmitter actions to the microcircuitry of specific regions such as the retina, olfactory bulb, cerebellum, thalamus, basal ganglia, hippocampus, and neocortex.
Abstract: Contents: Introduction to synaptic circuits (Gordon M. Shepherd and Christof Koch); Membrane properties and neurotransmitter actions (David A. McCormick); Peripheral ganglia (Paul R. Adams and Christof Koch); Spinal cord - ventral horn (Robert E. Burke); Olfactory bulb (Gordon M. Shepherd and Charles A. Greer); Retina (Peter Sterling); Cerebellum (Rodolfo R. Llinas and Kerry D. Walton); Thalamus (S. Murray Sherman and Christof Koch); Basal ganglia (Charles J. Wilson); Olfactory cortex (Lewis B. Haberly); Hippocampus (Thomas H. Brown and Anthony M. Zador); Neocortex (Rodney J. Douglas and Kevan A. C. Martin). Appendix: Dendritic electrotonus and synaptic integration (Gordon M. Shepherd).

3,241 citations


"Real-time computing without stable ..." refers background in this paper

  • ...This information is processed by extremely complex but surprisingly stereotypic microcircuits that can perform a wide spectrum of tasks (Shepherd, 1988; Douglas & Martin, 1998; von Melchner, Pallas, & Sur, 2000)....


Journal ArticleDOI
27 Sep 1996-Science
TL;DR: In this article, the authors examined whether the variability can be attributed to ongoing activity in the mammalian cortex and found that evoked responses in single trials could be predicted by linear summation of the deterministic response and preceding ongoing activity.
Abstract: Evoked activity in the mammalian cortex and the resulting behavioral responses exhibit a large variability to repeated presentations of the same stimulus. This study examined whether the variability can be attributed to ongoing activity. Ongoing and evoked spatiotemporal activity patterns in the cat visual cortex were measured with real-time optical imaging; local field potentials and discharges of single neurons were recorded simultaneously, by electrophysiological techniques. The evoked activity appeared deterministic, and the variability resulted from the dynamics of ongoing activity, presumably reflecting the instantaneous state of cortical networks. In spite of the large variability, evoked responses in single trials could be predicted by linear summation of the deterministic response and the preceding ongoing activity. Ongoing activity must play an important role in cortical function and cannot be ignored in exploration of cognitive processes.
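The quantitative claim is simply additive: a single-trial response is approximated by the trial-averaged (deterministic) evoked response plus the ongoing activity present at stimulus onset. The toy script below only illustrates that arithmetic on synthetic signals, so the prediction succeeds by construction; it is not a model of the recordings.

```python
# Toy illustration of the reported decomposition: single-trial response ≈
# deterministic (trial-averaged) evoked component + preceding ongoing activity.
# All signals are synthetic; only the arithmetic of the claim is demonstrated.
import numpy as np

rng = np.random.default_rng(3)
trials, T = 50, 100                          # number of trials, post-stimulus samples
evoked = np.exp(-np.arange(T) / 20.0)        # deterministic (trial-averaged) response

# Ongoing activity: a per-trial cortical state already present at stimulus onset,
# modeled here as an offset that decays slowly through the post-stimulus window.
onset_state = rng.normal(0.0, 0.5, size=trials)
ongoing = onset_state[:, None] * np.exp(-np.arange(T) / 60.0)

single_trials = evoked + ongoing + rng.normal(0.0, 0.05, size=(trials, T))

predicted = evoked + ongoing                 # linear summation: evoked + ongoing
var_without = np.var(single_trials - evoked)      # residual if ongoing activity is ignored
var_with = np.var(single_trials - predicted)      # residual of the summation prediction
print("fraction of trial-to-trial variance explained by ongoing activity:",
      1.0 - var_with / var_without)
```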

1,651 citations

Journal ArticleDOI
TL;DR: Differential signaling is a key mechanism in neocortical information processing, which can be regulated by selective synaptic modifications, and heterogeneity of synaptic transfer functions allows multiple synaptic representations of the same presynaptic action potential train and suggests that these synaptic representations are regulated in a complex manner.
Abstract: The nature of information stemming from a single neuron and conveyed simultaneously to several hundred target neurons is not known. Triple and quadruple neuron recordings revealed that each synaptic connection established by neocortical pyramidal neurons is potentially unique. Specifically, synaptic connections onto the same morphological class differed in the numbers and dendritic locations of synaptic contacts, their absolute synaptic strengths, as well as their rates of synaptic depression and recovery from depression. The same axon of a pyramidal neuron innervating another pyramidal neuron and an interneuron mediated frequency-dependent depression and facilitation, respectively, during high frequency discharges of presynaptic action potentials, suggesting that the different natures of the target neurons underlie qualitative differences in synaptic properties. Facilitating-type synaptic connections established by three pyramidal neurons of the same class onto a single interneuron, were all qualitatively similar with a combination of facilitation and depression mechanisms. The time courses of facilitation and depression, however, differed for these convergent connections, suggesting that different pre-postsynaptic interactions underlie quantitative differences in synaptic properties. Mathematical analysis of the transfer functions of frequency-dependent synapses revealed supra-linear, linear, and sub-linear signaling regimes in which mixtures of presynaptic rates, integrals of rates, and derivatives of rates are transferred to targets depending on the precise values of the synaptic parameters and the history of presynaptic action potential activity. Heterogeneity of synaptic transfer functions therefore allows multiple synaptic representations of the same presynaptic action potential train and suggests that these synaptic representations are regulated in a complex manner. It is therefore proposed that differential signaling is a key mechanism in neocortical information processing, which can be regulated by selective synaptic modifications.
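The depression and facilitation dynamics summarized here are conventionally parameterized per connection by U (utilization of synaptic efficacy), D (recovery time constant from depression), and F (facilitation time constant); this is the Markram, Wang, and Tsodyks (1998) model that the liquid-state-machine paper adopts for its synapses (see the quote below). The following is a sketch of one common discrete per-spike formulation; the exact normalization and the parameter values are illustrative and may differ from the original.

```python
# Per-spike update of a dynamic (depressing/facilitating) synapse in the spirit of
# Markram, Wang & Tsodyks (1998). u_k is the fraction of resources used by spike k,
# R_k the fraction available; the k-th PSP amplitude is proportional to u_k * R_k.
# Parameter values and the exact normalization are illustrative assumptions.
import numpy as np

def psp_amplitudes(spike_times, U=0.5, D=1.1, F=0.05, weight=1.0):
    """Return relative PSP amplitudes for a presynaptic spike train (times in seconds)."""
    u, R = U, 1.0
    amps = [weight * u * R]
    for k in range(1, len(spike_times)):
        dt = spike_times[k] - spike_times[k - 1]
        u_prev, R_prev = u, R
        u = U + u_prev * (1 - U) * np.exp(-dt / F)          # facilitation builds up
        R = 1 + (R_prev - u_prev * R_prev - 1) * np.exp(-dt / D)   # recovery from depression
        amps.append(weight * u * R)
    return np.array(amps)

# A 20 Hz regular train probed with two parameter sets.
train = np.arange(10) * 0.05
print("depressing: ", psp_amplitudes(train, U=0.5, D=1.1, F=0.05).round(3))
print("facilitating:", psp_amplitudes(train, U=0.1, D=0.1, F=1.0).round(3))
```

With the first parameter set the amplitudes shrink across the train (a depressing synapse); with the second they grow (a facilitating one), which is the "differential signaling" the abstract refers to.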

1,113 citations


"Real-time computing without stable ..." refers background or methods in this paper

  • ...…inevitable consequence of collapsing the high-dimensional space of liquid states into a single dimension, but what is surprising is that the equivalence classes are meaningful in terms of the task, allowing invariant and appropriately scaled readout responses and therefore real-time computation on…...


  • ...In the case of a synaptic connection from a to b, we modeled the synaptic dynamics according to the model proposed in Markram, Wang, and Tsodyks (1998), with the synaptic parameters U (use), D (time constant for depression), and F (time constant for facilitation) randomly chosen from gaussian…...
