Journal ArticleDOI
Critical Branching Captures Activity in Living Neural Networks and Maximizes the Number of Metastable States
Clayton Haldeman, John M. Beggs, et al.
TL;DR: When the branching parameter is tuned to the critical point, metastable states are most numerous and network dynamics are not attracting, but neutral.
Abstract: Recent experimental work has shown that activity in living neural networks can propagate as a critical branching process that revisits many metastable states. Neural network theory suggests that attracting states could store information, but little is known about how a branching process could form such states. Here we use a branching process to model actual data and to explore metastable states in the network. When we tune the branching parameter to the critical point, we find that metastable states are most numerous and that network dynamics are not attracting, but neutral.
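The branching parameter is the mean number of units activated by each active unit, and the critical point is where it equals 1. A minimal illustrative sketch of this idea (a plain Galton-Watson branching process with assumed Poisson offspring, not the authors' fitted network model):

```python
import math
import random

def poisson(lam, rng):
    """Sample a Poisson(lam) variate via Knuth's multiplication method."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def avalanche_size(sigma, rng, max_size=10_000):
    """Total activations in one avalanche of a branching process with
    branching parameter sigma (mean offspring per active unit).
    Capped at max_size so supercritical runs stay finite."""
    active, total = 1, 1
    while active > 0 and total < max_size:
        # each active unit independently activates Poisson(sigma) descendants
        active = sum(poisson(sigma, rng) for _ in range(active))
        total += active
    return total

rng = random.Random(0)
for sigma in (0.8, 1.0, 1.2):  # subcritical, critical, supercritical
    sizes = [avalanche_size(sigma, rng) for _ in range(2000)]
    print(f"sigma={sigma}: mean avalanche size {sum(sizes) / len(sizes):.1f}")
```

For sigma < 1 avalanches die out quickly (mean total size 1/(1 - sigma)); at sigma = 1 the avalanche-size distribution becomes heavy-tailed, the regime in which the paper finds metastable states most numerous.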
Citations
Variation in var gene switching rates in Plasmodium causes antigenic variation [in English] / Paul H, Robert P, Christodoulou Z, et al // Proc Natl Acad Sci U S A
TL;DR: PfEMP1 interacts with one or more receptors on infected erythrocytes, dendritic cells, and the placenta, playing a key role in adhesion and immune evasion.
Journal ArticleDOI
Optimal Dynamical Range of Excitable Networks at Criticality
Osame Kinouchi, Mauro Copelli, et al.
TL;DR: It is proposed that the main functional role of electrical coupling is to enhance dynamic range, thereby allowing the coding of information spanning several orders of magnitude, which could provide a microscopic neural basis for psychophysical laws.
Journal ArticleDOI
Real-time computation at the edge of chaos in recurrent neural networks
TL;DR: It is shown that only near the critical boundary can recurrent networks of threshold gates perform complex computations on time series, which strongly supports conjectures that dynamical systems that are capable of doing complex computational tasks should operate near the edge of chaos.
Journal ArticleDOI
Networks of the Brain
TL;DR: All networks, whether social, technological, or biological, are the result of a growth process, and many continue to grow for prolonged periods, continually modifying their connectivity structure throughout their existence.
Journal ArticleDOI
Dynamical synapses causing self-organized criticality in neural networks
TL;DR: It is demonstrated analytically and numerically that by assuming (biologically more realistic) dynamical synapses in a spiking neural network, the neuronal avalanches turn from an exceptional phenomenon into a typical and robust self-organized critical behaviour, if the total resources of neurotransmitter are sufficiently large.
References
Journal ArticleDOI
Neural networks and physical systems with emergent collective computational abilities
TL;DR: A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
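The content-addressable memory described here can be illustrated with a tiny Hopfield-style network: patterns are stored in Hebbian weights and a corrupted probe is driven back to the full stored memory by threshold updates. A minimal sketch with two assumed orthogonal example patterns, not Hopfield's original experiments:

```python
def hebbian_weights(patterns):
    """Hebbian storage: W_ij = (1/n) * sum over patterns of p_i * p_j, zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(weights, state, n_sweeps=5):
    """Synchronous sign-threshold updates until the state stops changing."""
    n = len(state)
    for _ in range(n_sweeps):
        new = [1 if sum(weights[i][j] * state[j] for j in range(n)) >= 0 else -1
               for i in range(n)]
        if new == state:
            break
        state = new
    return state

# Two orthogonal +/-1 patterns of length 32.
p1 = [1] * 16 + [-1] * 16
p2 = [1, -1] * 16
w = hebbian_weights([p1, p2])

# Corrupt three bits of p1; the network restores the entire memory.
probe = p1[:]
for i in (0, 5, 20):
    probe[i] = -probe[i]
restored = recall(w, probe)
print(restored == p1)
```

With only two well-separated patterns, one update sweep already flips the corrupted bits back, which is the content-addressability the summary describes: a sufficiently large subpart retrieves the whole memory.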
Journal ArticleDOI
Determining Lyapunov exponents from a time series
TL;DR: The authors present the first algorithms that allow the estimation of non-negative Lyapunov exponents from an experimental time series, providing a qualitative and quantitative characterization of dynamical behavior.
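The quantity these algorithms estimate from data can be computed directly when the map is known: the largest Lyapunov exponent is the orbit average of the log of the local stretching rate. A minimal sketch for the logistic map (the textbook definition, not the paper's time-series algorithm, which works without knowing the underlying map):

```python
import math

def logistic_lyapunov(r, x0=0.2, n_transient=1000, n_iter=200_000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    computed as the orbit average of ln|f'(x)| = ln|r*(1 - 2x)|."""
    x = x0
    for _ in range(n_transient):  # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n_iter):
        acc += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return acc / n_iter

# At r = 4 the known analytic value is ln 2, approx 0.693.
print(logistic_lyapunov(4.0))
```

A positive exponent, as here, indicates exponential divergence of nearby trajectories, i.e. chaos; the paper's contribution is recovering such exponents from experimental measurements alone.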
Book
Introduction to Phase Transitions and Critical Phenomena
H. Eugene Stanley, Guenter Ahlers, et al.
TL;DR: A paperback edition of a distinguished book originally published by Clarendon Press in 1971, written at a level at which a graduate student who has studied condensed matter physics can begin to comprehend the nature of phase transitions, the transformation of one state of matter into another.
Journal ArticleDOI
Real-time computing without stable states: a new framework for neural computation based on perturbations
TL;DR: A new computational model for real-time computing on time-varying input provides an alternative to paradigms based on Turing machines or attractor neural networks; it is based on principles of high-dimensional dynamical systems combined with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry.