Book ChapterDOI

Stochastic neurodynamics and the system size expansion

01 Oct 1997-pp 290-294
TL;DR: In this paper, the authors present a method for the study of stochastic neurodynamics in the master equation framework and obtain a statistical description of the dynamics of fluctuations and correlations of neural activity in large neural networks.
Abstract: We present a method for the study of stochastic neurodynamics in the master equation framework. Our aim is to obtain a statistical description of the dynamics of fluctuations and correlations of neural activity in large neural networks. We focus on a macroscopic description of the network via a master equation for the number of active neurons in the network. We present a systematic expansion of this equation using the “system size expansion”, obtaining coupled dynamical equations for the average activity and for the fluctuations around this average. These equations exhibit non-monotonic approaches to equilibrium, as seen in Monte Carlo simulations.
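As a rough illustration of the setup the abstract describes, the sketch below simulates a one-population birth-death master equation for the number of active neurons using the Gillespie algorithm, and compares the sampled mean activity against the deterministic rate equation recovered in the large-N limit. The sigmoidal gain function and all parameter values are hypothetical, chosen only to make the example run; they are not taken from the paper.

```python
import math
import random

# Hypothetical one-population birth-death process for the number of
# active neurons k out of N (all parameters are illustrative).
N = 100        # system size
alpha = 1.0    # decay rate of active neurons
w, theta = 5.0, 2.0

def f(x):
    """Sigmoidal firing rate as a function of mean activity x = k/N."""
    return 1.0 / (1.0 + math.exp(-(w * x - theta)))

def gillespie(k0, t_end, seed=0):
    """Sample one trajectory of k(t) with the Gillespie algorithm."""
    rng = random.Random(seed)
    k, t = k0, 0.0
    while t < t_end:
        birth = (N - k) * f(k / N)   # quiescent -> active
        death = alpha * k            # active -> quiescent
        total = birth + death
        if total == 0:
            break
        t += rng.expovariate(total)  # exponential waiting time
        if rng.random() < birth / total:
            k += 1
        else:
            k -= 1
    return k

# In the N -> infinity limit, x = k/N obeys dx/dt = (1 - x) f(x) - alpha x;
# fluctuations around that mean are O(1/sqrt(N)).
samples = [gillespie(10, 20.0, seed=s) / N for s in range(200)]
mean_x = sum(samples) / len(samples)
```

With these placeholder parameters the sampled mean activity settles near the stable fixed point of the rate equation, while individual trajectories fluctuate around it.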
Citations
Journal ArticleDOI
TL;DR: This work surveys recent analytical approaches to studying the spatiotemporal dynamics of continuum neural fields, an important example of spatially extended excitable systems with nonlocal interactions.
Abstract: We survey recent analytical approaches to studying the spatiotemporal dynamics of continuum neural fields. Neural fields model the large-scale dynamics of spatially structured biological neural networks in terms of nonlinear integrodifferential equations whose associated integral kernels represent the spatial distribution of neuronal synaptic connections. They provide an important example of spatially extended excitable systems with nonlocal interactions and exhibit a wide range of spatially coherent dynamics, including traveling waves, oscillations, and Turing-like patterns.

412 citations

Journal ArticleDOI
TL;DR: An important implication is that a network need not be “critical” for the production of avalanches, so experimentally observed power laws in burst size may be a signature of noisy functionally feedforward structure rather than of, for example, self-organized criticality.
Abstract: Neuronal avalanches are a form of spontaneous activity widely observed in cortical slices and other types of nervous tissue, both in vivo and in vitro. They are characterized by irregular, isolated population bursts when many neurons fire together, where the number of spikes per burst obeys a power law distribution. We simulate, using the Gillespie algorithm, a model of neuronal avalanches based on stochastic single neurons. The network consists of excitatory and inhibitory neurons, first with all-to-all connectivity and later with random sparse connectivity. Analyzing our model using the system size expansion, we show that the model obeys the standard Wilson-Cowan equations for large network sizes ( neurons). When excitation and inhibition are closely balanced, networks of thousands of neurons exhibit irregular synchronous activity, including the characteristic power law distribution of avalanche size. We show that these avalanches are due to the balanced network having weakly stable functionally feedforward dynamics, which amplifies some small fluctuations into the large population bursts. Balanced networks are thought to underlie a variety of observed network behaviours and have useful computational properties, such as responding quickly to changes in input. Thus, the appearance of avalanches in such functionally feedforward networks indicates that avalanches may be a simple consequence of a widely present network structure, when neuron dynamics are noisy. An important implication is that a network need not be “critical” for the production of avalanches, so experimentally observed power laws in burst size may be a signature of noisy functionally feedforward structure rather than of, for example, self-organized criticality.
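The large-N limit mentioned in the abstract (the standard Wilson-Cowan equations for coupled excitatory and inhibitory populations) can be sketched by direct numerical integration. The logistic gain, coupling weights, and thresholds below are illustrative placeholders, not parameters from the paper.

```python
import math

def f(x):
    # Logistic gain function (illustrative, not from the paper).
    return 1.0 / (1.0 + math.exp(-x))

# Wilson-Cowan-type rate equations for excitatory (E) and inhibitory (I)
# population activities, recovered from the stochastic model as N -> infinity:
#   dE/dt = -E + (1 - E) f(wEE*E - wEI*I + hE)
#   dI/dt = -I + (1 - I) f(wIE*E - wII*I + hI)
wEE, wEI, wIE, wII = 16.0, 12.0, 15.0, 3.0   # hypothetical couplings
hE, hI = -3.0, -5.0                           # hypothetical thresholds

E, I, dt = 0.1, 0.05, 1e-3
for _ in range(100_000):  # forward-Euler integration to t = 100
    dE = -E + (1.0 - E) * f(wEE * E - wEI * I + hE)
    dI = -I + (1.0 - I) * f(wIE * E - wII * I + hI)
    E += dt * dE
    I += dt * dI
```

The activities remain confined to [0, 1] by construction; depending on the chosen couplings the system relaxes to a fixed point or settles onto a limit cycle.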

196 citations


Cites background from "Stochastic neurodynamics and the sy..."

  • ...These equations generalize the one population system-size expansion of the master equation reported in [3]....


Journal ArticleDOI
TL;DR: This work analyzes a master equation formulation of stochastic neurodynamics for a network of synaptically coupled homogeneous neuronal populations each consisting of N identical neurons to derive the lowest order corrections to these rate equations for large but finite N.
Abstract: We analyze a master equation formulation of stochastic neurodynamics for a network of synaptically coupled homogeneous neuronal populations each consisting of N identical neurons. The state of the network is specified by the fraction of active or spiking neurons in each population, and transition rates are chosen so that in the thermodynamic or deterministic limit ($N\rightarrow\infty$) we recover standard activity-based or voltage-based rate models. We derive the lowest order corrections to these rate equations for large but finite N using two different approximation schemes, one based on the Van Kampen system-size expansion and the other based on path integral methods. Both methods yield the same series expansion of the moment equations, which at $\mathcal{O}(1/N)$ can be truncated to form a closed system of equations for the first- and second-order moments. Taking a continuum limit of the moment equations while keeping the system size N fixed generates a system of integrodifferential equations for the ...
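The closed system of first- and second-order moment equations described above can be sketched, for a single population, as coupled mean/variance ODEs in the van Kampen (linear-noise) form. The gain function and parameters below are illustrative assumptions, not taken from the paper.

```python
import math

N = 100                 # population size (illustrative)
w, theta = 5.0, 2.0     # hypothetical gain parameters

def f(x):
    return 1.0 / (1.0 + math.exp(-(w * x - theta)))

def fprime(x):
    s = f(x)
    return w * s * (1.0 - s)

# Euler-integrate the coupled mean/variance ("linear noise") equations:
#   dm/dt     = (1 - m) f(m) - m            (deterministic rate equation)
#   dSigma/dt = 2 A(m) Sigma + b(m)         (O(1/N) fluctuation correction)
# with A(m) = d/dm [(1 - m) f(m) - m]  and  b(m) = ((1 - m) f(m) + m) / N.
m, Sigma, dt = 0.1, 0.0, 1e-3
for _ in range(200_000):
    drift = (1.0 - m) * f(m) - m
    A = -f(m) + (1.0 - m) * fprime(m) - 1.0
    b = ((1.0 - m) * f(m) + m) / N
    m += dt * drift
    Sigma += dt * (2.0 * A * Sigma + b)

# At a stable fixed point A < 0, so Sigma relaxes to -b / (2A) > 0,
# i.e. a stationary variance of order 1/N.
```

This truncation at O(1/N) is what makes the moment hierarchy close: the variance equation depends only on the mean and on itself.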

179 citations

Journal ArticleDOI
TL;DR: This work describes how the Markov models account for many recent measurements of the resting or spontaneous activity of the neocortex, and shows that the power spectrum of large-scale neocortical activity has a Brownian motion baseline, and that the statistical structure of the random bursts of spiking activity found near the resting state indicates that such a state can be represented as a percolation process on a random graph, called directed percolation.
Abstract: In 1972-1973 Wilson and Cowan introduced a mathematical model of the population dynamics of synaptically coupled excitatory and inhibitory neurons in the neocortex. The model dealt only with the mean numbers of activated and quiescent excitatory and inhibitory neurons, and said nothing about fluctuations and correlations of such activity. However, in 1997 Ohira and Cowan, and then in 2007-2009 Buice and Cowan, introduced Markov models of such activity that included fluctuation and correlation effects. Here we show how both models can be used to provide a quantitative account of the population dynamics of neocortical activity.

We first describe how the Markov models account for many recent measurements of the resting or spontaneous activity of the neocortex. In particular we show that the power spectrum of large-scale neocortical activity has a Brownian motion baseline, and that the statistical structure of the random bursts of spiking activity found near the resting state indicates that such a state can be represented as a percolation process on a random graph, called directed percolation.

Other data indicate that resting cortex exhibits pair correlations between neighboring populations of cells, the amplitudes of which decay slowly with distance, whereas stimulated cortex exhibits pair correlations which decay rapidly with distance. Here we show how the Markov model can account for the behavior of the pair correlations.

Finally we show how the 1972-1973 Wilson-Cowan equations can account for recent data indicating that there are at least two distinct modes of cortical response to stimuli. In mode 1 a low intensity stimulus triggers a wave that propagates at a velocity of about 0.3 m/s, with an amplitude that decays exponentially. In mode 2 a high intensity stimulus triggers a larger response that remains local and does not propagate to neighboring regions.

100 citations

Journal ArticleDOI
TL;DR: An analytical expression for the speed of a binocular rivalry wave is derived as a function of various neurophysiological parameters, and properties of the wave are consistent with the wave-like propagation of perceptual dominance observed in recent psychophysical experiments.
Abstract: We present a neural field model of binocular rivalry waves in visual cortex. For each eye we consider a one-dimensional network of neurons that respond maximally to a particular feature of the corresponding image, such as the orientation of a grating stimulus. Recurrent connections within each one-dimensional network are assumed to be excitatory, whereas connections between the two networks are inhibitory (cross-inhibition). Slow adaptation is incorporated into the model by taking the network connections to exhibit synaptic depression. We derive an analytical expression for the speed of a binocular rivalry wave as a function of various neurophysiological parameters, and show how properties of the wave are consistent with the wave-like propagation of perceptual dominance observed in recent psychophysical experiments. In addition to providing an analytical framework for studying binocular rivalry waves, we show how neural field methods provide insights into the mechanisms underlying the generation of the waves. In particular, we highlight the important role of slow adaptation in providing a "symmetry breaking mechanism" that allows waves to propagate.

58 citations

References
Book
01 Jan 1979
TL;DR: This book provides an introduction to Monte Carlo simulations in classical statistical physics and is aimed both at students beginning work in the field and at more experienced researchers who wish to learn more about Monte Carlo methods.
Abstract: This book provides an introduction to Monte Carlo simulations in classical statistical physics and is aimed both at students beginning work in the field and at more experienced researchers who wish to learn more about Monte Carlo methods. The material covered includes methods for both equilibrium and out of equilibrium systems, and common algorithms like the Metropolis and heat-bath algorithms are discussed in detail, as well as more sophisticated ones such as continuous time Monte Carlo, cluster algorithms, multigrid methods, entropic sampling and simulated tempering. Data analysis techniques are also explained starting with straightforward measurement and error-estimation techniques and progressing to topics such as the single and multiple histogram methods and finite size scaling. The last few chapters of the book are devoted to implementation issues, including discussions of such topics as lattice representations, efficient implementation of data structures, multispin coding, parallelization of Monte Carlo algorithms, and random number generation. At the end of the book the authors give a number of example programmes demonstrating the applications of these techniques to a variety of well-known models.
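To give a flavor of the algorithms the book covers, here is a minimal single-spin-flip Metropolis sketch for a 1D Ising chain with periodic boundaries; the chain length, inverse temperature, and sweep count are illustrative choices, not values from the book.

```python
import math
import random

# Single-spin-flip Metropolis sampling of a 1D Ising chain
# H = -sum_i s_i s_{i+1}, periodic boundary conditions.
L, beta = 64, 0.5          # chain length and inverse temperature (illustrative)
rng = random.Random(1)
spins = [1] * L

def sweep():
    """One Monte Carlo sweep: L attempted single-spin flips."""
    for _ in range(L):
        i = rng.randrange(L)
        # Energy change of flipping spin i (couplings to its two neighbors).
        dE = 2 * spins[i] * (spins[(i - 1) % L] + spins[(i + 1) % L])
        # Metropolis acceptance: always accept downhill, uphill with exp(-beta dE).
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i] = -spins[i]

for _ in range(1000):
    sweep()

energy = -sum(spins[i] * spins[(i + 1) % L] for i in range(L))
```

The same accept/reject structure generalizes directly to the heat-bath, cluster, and tempering schemes discussed later in the book.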

2,765 citations

BookDOI
01 Jan 1979

847 citations

Journal ArticleDOI
TL;DR: A new statistical neurodynamical method is proposed for analyzing the non-equilibrium dynamical behaviors of an autocorrelation associative memory model; the model's strange behaviors are explained by the unusual shapes of its basins of attraction.

400 citations

Journal ArticleDOI
TL;DR: The theory of neuronal correlation functions in large networks comprising several highly connected subpopulations and obeying stochastic dynamic rules is developed, and extended to networks with random connectivity, such as randomly dilute networks.
Abstract: One of the main experimental tools in probing the interactions between neurons has been the measurement of the correlations in their activity. In general, however, the interpretation of the observed correlations is difficult since the correlation between a pair of neurons is influenced not only by the direct interaction between them but also by the dynamic state of the entire network to which they belong. Thus a comparison between the observed correlations and the predictions from specific model networks is needed. In this paper we develop a theory of neuronal correlation functions in large networks comprising several highly connected subpopulations and obeying stochastic dynamic rules. When the networks are in asynchronous states, the cross correlations are relatively weak, i.e., their amplitude relative to that of the autocorrelations is of order of 1/N, N being the size of the interacting populations. Using the weakness of the cross correlations, general equations that express the matrix of cross correlations in terms of the mean neuronal activities and the effective interaction matrix are presented. The effective interactions are the synaptic efficacies multiplied by the gain of the postsynaptic neurons. The time-delayed cross-correlation matrix can be expressed as a sum of exponentially decaying modes that correspond to the (nonorthogonal) eigenvectors of the effective interaction matrix. The theory is extended to networks with random connectivity, such as randomly dilute networks. This allows for a comparison between the contribution from the internal common input and that from the direct interactions to the correlations of monosynaptically coupled pairs.A closely related quantity is the linear response of the neurons to external time-dependent perturbations. We derive the form of the dynamic linear response function of neurons in the above architecture in terms of the eigenmodes of the effective interaction matrix. 
The behavior of the correlations and the linear response when the system is near a bifurcation point is analyzed. Near a saddle-node bifurcation, the correlation matrix is dominated by a single slowly decaying critical mode. Near a Hopf bifurcation the correlations exhibit weakly damped sinusoidal oscillations. The general theory is applied to the case of a randomly dilute network consisting of excitatory and inhibitory subpopulations, using parameters that mimic the local circuit of 1 mm³ of the rat neocortex. Both the effect of dilution and the influence of a nearby bifurcation to an oscillatory state are demonstrated.
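The eigenmode decomposition described above can be sketched numerically: for a small effective interaction matrix (synaptic efficacies times postsynaptic gains), the decay rates of the correlation modes are the negated real parts of the eigenvalues of the linearized dynamics. The 2x2 matrix below is a hypothetical example for one excitatory and one inhibitory population, not data from the paper.

```python
import numpy as np

# Hypothetical 2x2 effective interaction matrix J (excitatory column
# positive, inhibitory column negative); values are illustrative only.
J = np.array([[0.8, -0.6],
              [0.7, -0.5]])

# Linearized fluctuation dynamics around the asynchronous state:
#   dx/dt = A x + noise,  with A = J - I.
A = J - np.eye(2)

# The time-delayed cross-correlation matrix is a sum of exp(lambda * tau)
# terms over the (nonorthogonal) eigenmodes of A; in a stable asynchronous
# state every mode decays, i.e. Re(lambda) < 0 for all eigenvalues.
eigvals, eigvecs = np.linalg.eig(A)
decay_rates = -eigvals.real
```

For this example the characteristic polynomial is lambda^2 + 1.7 lambda + 0.72, giving decay rates 0.8 and 0.9; as a critical mode's decay rate approaches zero (a nearby saddle-node bifurcation), its contribution to the correlations becomes slow and dominant.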

255 citations