Author

Siwei Qiu

Bio: Siwei Qiu is an academic researcher from the National Institutes of Health. The author has contributed to research on spiking neural networks, has an h-index of 2, and has co-authored 2 publications receiving 9 citations.

Papers
Journal ArticleDOI
TL;DR: A path-integral formalism is used to derive a perturbation expansion in the inverse system size around the mean-field limit for the covariance of the input to a neuron (synaptic drive) and firing rate fluctuations due to dynamical deterministic finite-size effects.
Abstract: We study finite-size fluctuations in a network of spiking deterministic neurons coupled with nonuniform synaptic coupling. We generalize a previously developed theory of finite-size effects for globally coupled neurons with a uniform coupling function. In the uniform coupling case, mean-field theory is well defined by averaging over the network as the number of neurons in the network goes to infinity. However, for nonuniform coupling it is no longer possible to average over the entire network if we are interested in fluctuations at a particular location within the network. We show that if the coupling function approaches a continuous function in the infinite system size limit, then an average over a local neighborhood can be defined such that mean-field theory is well defined for a spatially dependent field. We then use a path-integral formalism to derive a perturbation expansion in the inverse system size around the mean-field limit for the covariance of the input to a neuron (synaptic drive) and firing rate fluctuations due to dynamical deterministic finite-size effects.
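To orient the reader, the structure of such an expansion can be sketched as follows (the notation below is illustrative, not the paper's): the synaptic drive at location x fluctuates around its mean-field value, and its covariance is organized in inverse powers of the system size N.

```latex
% Schematic 1/N expansion of the synaptic-drive covariance around mean field;
% a_0 denotes the mean-field (neural-field) solution and C^{(1)} the leading
% finite-size correction. All symbols are placeholders, not the paper's notation.
\[
a(x,t) = a_0(x,t) + \delta a(x,t), \qquad
\big\langle \delta a(x,t)\, \delta a(x',t') \big\rangle
  = \frac{1}{N}\, C^{(1)}(x,t;x',t') + O\!\left(N^{-2}\right)
\]
```

Here a_0 solves the deterministic mean-field equation obtained from the local averaging described above, and C^{(1)} is computed from the leading-order term of the path-integral perturbation expansion.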

8 citations

Posted Content
TL;DR: Here, it is shown how ideas from quantum field theory can be used to construct an effective reduced theory, which may be analyzed with lattice computations.
Abstract: The human brain is a complex system composed of a network of on the order of a hundred billion discrete neurons that are coupled through time-dependent synapses. Simulating the entire brain is a daunting challenge. Here, we show how ideas from quantum field theory can be used to construct an effective reduced theory, which may be analyzed with lattice computations. We give some examples of how the formalism can be applied to biophysically plausible neural network models.
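As a hedged sketch of the general idea in generic field-theory notation (none of the symbols below are taken from the paper): the microscopic spiking degrees of freedom are traded for coarse-grained fields whose statistics are encoded in a generating functional.

```latex
% Generic schematic only; \varphi is a coarse-grained activity field, \tilde{\varphi}
% its response field, S an effective action, and J a source used to generate moments.
\[
Z[J] = \int \mathcal{D}\varphi\, \mathcal{D}\tilde{\varphi}\;
       \exp\!\Big( -S[\varphi,\tilde{\varphi}]
       + \int dx\, dt\; J(x,t)\, \varphi(x,t) \Big)
\]
```

Correlation and response functions of the reduced theory follow from functional derivatives of Z[J], and once S is written down it can, in principle, be discretized in space and time for lattice computations.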

2 citations


Cited by
Journal ArticleDOI
19 Mar 2020, Chaos
TL;DR: In this paper, localized patterns in an exact mean-field description of a spatially extended network of quadratic integrate-and-fire neurons are studied, and conditions for the existence and stability of localized solutions, so-called bumps, are investigated.
Abstract: We study localized patterns in an exact mean-field description of a spatially extended network of quadratic integrate-and-fire neurons. We investigate conditions for the existence and stability of localized solutions, so-called bumps, and give an analytic estimate for the parameter range, where these solutions exist in parameter space, when one or more microscopic network parameters are varied. We develop Galerkin methods for the model equations, which enable numerical bifurcation analysis of stationary and time-periodic spatially extended solutions. We study the emergence of patterns composed of multiple bumps, which are arranged in a snake-and-ladder bifurcation structure if a homogeneous or heterogeneous synaptic kernel is suitably chosen. Furthermore, we examine time-periodic, spatially localized solutions (oscillons) in the presence of external forcing, and in autonomous, recurrently coupled excitatory and inhibitory networks. In both cases, we observe period-doubling cascades leading to chaotic oscillations.
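For orientation, exact mean-field reductions of quadratic integrate-and-fire networks with Lorentzian-distributed excitabilities are commonly written in the Montbrió-Pazó-Roxin form; a spatially extended variant with a synaptic kernel can be sketched schematically as follows (assumed generic form, not copied from the paper):

```latex
% Schematic spatially extended QIF mean-field equations: r is the firing rate,
% v the mean membrane potential, \Delta and \bar{\eta} the width and center of the
% Lorentzian excitability distribution, w(x-y) a synaptic kernel, I an external input.
\[
\tau\, \partial_t r(x,t) = \frac{\Delta}{\pi\tau} + 2\, r(x,t)\, v(x,t), \qquad
\tau\, \partial_t v(x,t) = v^2(x,t) + \bar{\eta} - \big(\pi\tau\, r(x,t)\big)^2
  + \tau \int w(x-y)\, r(y,t)\, dy + I(x,t)
\]
```

Bumps are then stationary, spatially localized solutions of these rate equations, and the Galerkin discretization described in the abstract turns the integro-differential system into a finite set of ODEs amenable to numerical bifurcation analysis.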

22 citations

Journal ArticleDOI
TL;DR: The Wilson-Cowan equations represent a landmark in the history of computational neuroscience; they crystallized an approach to modeling neural dynamics and brain function and are still used in various guises today.
Abstract: The Wilson–Cowan equations represent a landmark in the history of computational neuroscience. Along with the insights Wilson and Cowan offered for neuroscience, they crystallized an approach to modeling neural dynamics and brain function that is still used in various guises today.
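For readers who have not seen them, the Wilson-Cowan equations couple the mean activities E and I of excitatory and inhibitory populations through sigmoidal gain functions. The sketch below integrates the classic textbook form with forward Euler; the parameter values are illustrative placeholders, not taken from this paper.

```python
import numpy as np

def S(x, a=1.2, theta=2.8):
    """Wilson-Cowan sigmoid, shifted so that S(0) = 0."""
    return 1.0 / (1.0 + np.exp(-a * (x - theta))) - 1.0 / (1.0 + np.exp(a * theta))

# Illustrative parameters (not from the paper): coupling weights, time constants, drives.
wEE, wEI, wIE, wII = 16.0, 12.0, 15.0, 3.0
tauE, tauI = 1.0, 1.0
P, Q = 1.25, 0.0   # external inputs to the excitatory and inhibitory populations
rE, rI = 1.0, 1.0  # refractory factors

def step(E, I, dt=0.01):
    """One forward-Euler step of the Wilson-Cowan equations."""
    dE = (-E + (1.0 - rE * E) * S(wEE * E - wEI * I + P)) / tauE
    dI = (-I + (1.0 - rI * I) * S(wIE * E - wII * I + Q)) / tauI
    return E + dt * dE, I + dt * dI

E, I = 0.1, 0.05
for _ in range(5000):
    E, I = step(E, I)
print(f"E = {E:.3f}, I = {I:.3f}")
```

With parameters in this range the model is known to support both fixed points and E-I limit-cycle oscillations, which is the kind of population-level dynamics the review discusses.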

21 citations

Journal ArticleDOI
TL;DR: This work studies the emergence of patterns composed of multiple bumps, which are arranged in a snake-and-ladder bifurcation structure when a homogeneous or heterogeneous synaptic kernel is suitably chosen.
Abstract: We study localized patterns in an exact mean-field description of a spatially-extended network of quadratic integrate-and-fire (QIF) neurons. We investigate conditions for the existence and stability of localized solutions, so-called bumps, and give an analytic estimate for the parameter range where these solutions exist in parameter space, when one or more microscopic network parameters are varied. We develop Galerkin methods for the model equations, which enable numerical bifurcation analysis of stationary and time-periodic spatially-extended solutions. We study the emergence of patterns composed of multiple bumps, which are arranged in a snake-and-ladder bifurcation structure if a homogeneous or heterogeneous synaptic kernel is suitably chosen. Furthermore, we examine time-periodic, spatially-localized solutions (oscillons) in the presence of external forcing, and in autonomous, recurrently coupled excitatory and inhibitory networks. In both cases we observe period doubling cascades leading to chaotic oscillations.

12 citations

Journal ArticleDOI
TL;DR: An original path-integral description of the chemical master equation (CME) is constructed, and it is shown how applying Gillespie's two conditions to it leads directly to a path integral equivalent to the chemical Langevin equation (CLE).
Abstract: In 2000, Gillespie rehabilitated the chemical Langevin equation (CLE) by describing two conditions that must be satisfied for it to yield a valid approximation of the chemical master equation (CME). In this work, we construct an original path-integral description of the CME and show how applying Gillespie's two conditions to it directly leads to a path-integral equivalent to the CLE. We compare this approach to the path-integral equivalent of a large system size derivation and show that they are qualitatively different. In particular, both approaches involve converting many sums into many integrals, and the difference between the two methods is essentially the difference between using the Euler-Maclaurin formula and using Riemann sums. Our results shed light on how path integrals can be used to conceptualize coarse-graining biochemical systems and are readily generalizable.
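For context, the chemical Langevin equation that Gillespie's two conditions justify is usually written in the following standard form (taken from the general literature; the paper's path-integral construction itself is not reproduced here):

```latex
% Standard CLE for species counts X_i: the sum runs over reactions j with
% propensities a_j, stoichiometric changes \nu_{ji}, and independent Wiener processes W_j.
\[
dX_i(t) = \sum_{j} \nu_{ji}\, a_j\!\big(\mathbf{X}(t)\big)\, dt
        + \sum_{j} \nu_{ji}\, \sqrt{a_j\!\big(\mathbf{X}(t)\big)}\; dW_j(t)
\]
```

Gillespie's first condition requires the propensities to remain roughly constant over the time step, and the second requires each reaction channel to fire many times within it, which is what licenses the Gaussian (Langevin) approximation of the underlying jump process.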

11 citations

Journal ArticleDOI
TL;DR: In this paper, the authors formulate the stochastic neuron dynamics in the Martin-Siggia-Rose de Dominicis-Janssen (MSRDJ) formalism and present the fluctuation expansion of the effective action and the functional renormalization group (fRG) as two systematic ways to incorporate corrections to the mean dynamics and time-dependent statistics due to fluctuations in the presence of nonlinear neuronal gain.
Abstract: Neural dynamics is often investigated with tools from bifurcation theory. However, many neuron models are stochastic, mimicking fluctuations in the input from unknown parts of the brain or the spiking nature of signals. Noise changes the dynamics with respect to the deterministic model; in particular, classical bifurcation theory cannot be applied. We formulate the stochastic neuron dynamics in the Martin-Siggia-Rose de Dominicis-Janssen (MSRDJ) formalism and present the fluctuation expansion of the effective action and the functional renormalization group (fRG) as two systematic ways to incorporate corrections to the mean dynamics and time-dependent statistics due to fluctuations in the presence of nonlinear neuronal gain. To formulate self-consistency equations, we derive a fundamental link between the effective action in the Onsager-Machlup (OM) formalism, which allows the study of phase transitions, and the MSRDJ effective action, which is computationally advantageous. These results in particular allow the derivation of an OM effective action for systems with non-Gaussian noise. This approach naturally leads to effective deterministic equations for the first moment of the stochastic system; they explain how nonlinearities and noise cooperate to produce memory effects. Moreover, the MSRDJ formulation yields an effective linear system that has identical power spectra and linear response. Starting from the better-known loopwise approximation, we then discuss the use of the fRG as a method to obtain self-consistency beyond the mean. We present a new efficient truncation scheme for the hierarchy of flow equations for the vertex functions by adapting the Blaizot, Méndez, and Wschebor approximation from the derivative expansion to the vertex expansion. The methods are presented by means of the simplest possible example of a stochastic differential equation that has generic features of neuronal dynamics.
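As a minimal orienting sketch in generic notation (conventions and symbols below are mine, not the paper's): for a single stochastic variable obeying dx/dt = f(x) + noise of strength D, the MSRDJ action underlying this formalism can be written as

```latex
% Schematic MSRDJ (response-field) action for one SDE in the Ito convention;
% \tilde{x} is the auxiliary response field. Sign and contour conventions for
% \tilde{x} vary across the literature.
\[
S[x,\tilde{x}] = \int dt\; \Big[ \tilde{x}(t)\, \big( \dot{x}(t) - f(x(t)) \big)
               - \frac{D}{2}\, \tilde{x}^{2}(t) \Big]
\]
```

Moments and linear-response functions then follow from a generating functional with sources coupled to x and to the response field, and the effective-action and fRG machinery described in the abstract are systematic ways of resumming fluctuation corrections on top of this bare action.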

9 citations