
Showing papers in "Journal of Computational Neuroscience in 2015"


Journal ArticleDOI
TL;DR: It is shown that for the colored Gaussian noise case the optimal template matching is given by a form of linear filtering that can be derived via linear discriminant analysis, which provides a Bayesian interpretation for the well-known matched filter output.
Abstract: Spike sorting, i.e., the separation of the firing activity of different neurons from extracellular measurements, is a crucial but often error-prone step in the analysis of neuronal responses. Usually, three different problems have to be solved: the detection of spikes in the extracellular recordings, the estimation of the number of neurons and their prototypical (template) spike waveforms, and the assignment of individual spikes to those putative neurons. If the template spike waveforms are known, template matching can be used to solve the detection and classification problem. Here, we show that for the colored Gaussian noise case the optimal template matching is given by a form of linear filtering, which can be derived via linear discriminant analysis. This provides a Bayesian interpretation for the well-known matched filter output. Moreover, with this approach it is possible to compute a spike detection threshold analytically. The method can be implemented by a linear filter bank derived from the templates, and can be used for online spike sorting of multielectrode recordings. It may also be applicable to detection and classification problems of transient signals in general. Its application significantly decreases the error rate on two publicly available spike-sorting benchmark data sets in comparison to state-of-the-art template matching procedures. Finally, we explore the possibility to resolve overlapping spikes using the template matching outputs and show that they can be resolved with high accuracy.
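The matched-filter idea in this abstract can be sketched numerically: whiten by the noise covariance, project each window of the recording onto the resulting filter, and threshold the output. The snippet below is a minimal illustration, not the paper's implementation; the template shape, the AR(1) noise, and the heuristic threshold at half the expected peak are all assumptions (in the paper the threshold follows analytically from the Bayesian model).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative only: a Gaussian-bump "template" embedded in AR(1) colored noise.
L = 20
template = 3.0 * np.exp(-0.5 * ((np.arange(L) - 6) / 2.0) ** 2)

n = 2000
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = 0.8 * noise[i - 1] + rng.normal(scale=0.3)

signal = noise.copy()
spike_times = [400, 1100, 1700]
for t0 in spike_times:
    signal[t0:t0 + L] += template

# Matched (LDA-style) filter: f = C^{-1} t, with C the noise covariance of
# template-length windows (estimated here from a noise-only stretch for simplicity).
wins = np.lib.stride_tricks.sliding_window_view(noise, L)
C = np.cov(wins.T) + 1e-6 * np.eye(L)
f = np.linalg.solve(C, template)

# Filter output: projection of every window onto f; score[i] peaks when the
# template starts at sample i.
score = np.convolve(signal, f[::-1], mode="valid")
threshold = 0.5 * f @ template   # heuristic: halfway to the expected peak f·t
detections = np.flatnonzero(score > threshold)
```

The filter bank mentioned in the abstract corresponds to one such `f` per template, applied in parallel to the same recording.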

79 citations


Journal ArticleDOI
TL;DR: Simple formulas are derived for the essential interspike-interval statistics of a canonical model of a tonically firing neuron subjected to arbitrarily correlated input from the network, providing a framework for the interpretation of firing statistics measured in vivo in the brain.
Abstract: Nerve cells in the brain generate sequences of action potentials with complex statistics. Theoretical attempts to understand these statistics were largely limited to the case of a temporally uncorrelated input (Poissonian shot noise) from the neurons in the surrounding network. However, the stimulation from thousands of other neurons has various sorts of temporal structure. Firstly, input spike trains are temporally correlated because their firing rates can carry complex signals and because of cell-intrinsic properties like neural refractoriness, bursting, or adaptation. Secondly, at the connections between neurons, the synapses, usage-dependent changes in the synaptic weight (short-term plasticity) further shape the correlation structure of the effective input to the cell. From the theoretical side, it is poorly understood how these correlated stimuli, so-called colored noise, affect the spike train statistics. In particular, no standard method exists to solve the associated first-passage-time problem for the interspike-interval statistics with an arbitrarily colored noise. Assuming that input fluctuations are weaker than the mean neuronal drive, we derive simple formulas for the essential interspike-interval statistics for a canonical model of a tonically firing neuron subjected to arbitrarily correlated input from the network. We verify our theory by numerical simulations for three paradigmatic situations that lead to input correlations: (i) rate-coded naturalistic stimuli in presynaptic spike trains; (ii) presynaptic refractoriness or bursting; (iii) synaptic short-term plasticity. In all cases, we find severe effects on interval statistics. Our results provide a framework for the interpretation of firing statistics measured in vivo in the brain.
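The setup described here, a tonically firing neuron with weak colored input fluctuations, can be simulated directly. The sketch below uses a perfect integrate-and-fire neuron driven by a mean input plus Ornstein-Uhlenbeck (colored) noise; all parameter values are illustrative, and the paper's analytical formulas are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Perfect integrate-and-fire neuron: dv/dt = mu + eta(t), threshold 1,
# reset v -> v - 1, with weak Ornstein-Uhlenbeck noise eta (illustrative).
dt, n_steps = 0.001, 400_000
mu, D, tau_c = 1.0, 0.02, 0.5

# Exact discrete OU update: eta_{k} = a*eta_{k-1} + s*xi_k.
a = np.exp(-dt / tau_c)
s = np.sqrt(D / tau_c * (1 - a * a))
xi = rng.normal(size=n_steps)
eta = np.empty(n_steps)
eta[0] = 0.0
for i in range(1, n_steps):
    eta[i] = a * eta[i - 1] + s * xi[i]

# With the v -> v - 1 reset, spike times are first passages of the running
# maximum of the integrated input through the integer levels 1, 2, 3, ...
V = np.maximum.accumulate(np.cumsum(dt * (mu + eta)))
levels = np.arange(1, int(V[-1]) + 1)
spike_idx = np.searchsorted(V, levels)
isis = np.diff(spike_idx) * dt

mean_isi = isis.mean()
cv = isis.std() / mean_isi   # coefficient of variation of the ISIs
```

Because the noise is weak relative to the mean drive (stationary std 0.2 versus mu = 1), the mean ISI stays close to 1/mu while the ISI variability reflects the noise correlation time, which is the regime the paper's perturbative formulas address.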

62 citations


Journal ArticleDOI
TL;DR: The results highlight the complexity of the voltage response to oscillatory inputs in nonlinear models and the roles that resonant and amplifying currents have in shaping these responses.
Abstract: We investigate the biophysical and dynamic mechanisms of generation of subthreshold amplitude and phase resonance in response to sinusoidal input currents in two-dimensional models of quadratic type. These models feature a parabolic voltage nullcline and a linear nullcline for the recovery gating variable, capturing the interplay of the so-called resonant currents (e.g., hyperpolarization-activated mixed-cation inward and slow potassium) and amplifying currents (e.g., persistent sodium) in biophysically realistic parameter regimes. These currents underlie the generation of resonance in medial entorhinal cortex layer II stellate cells and CA1 pyramidal cells. We show that quadratic models exhibit nonlinear amplifications of the voltage response to sinusoidal inputs in the resonant frequency band. These are expressed as an increase in the impedance profile as the input amplitude increases. They are stronger for values above than below the resting potential and are accompanied by a shift in the phase profile, a decrease in the resonant and phase-resonant frequencies, and an increase in the sharpness of the voltage response. These effects are more prominent for smaller values of 𝜖 (larger levels of the time scale separation between the voltage and the resonant gating variable) and for values of the resting potential closer to threshold for spike generation. With all other parameters fixed, as 𝜖 increases the voltage response becomes "more linear"; i.e., the nonlinearities are present, but "ignored". In addition, the nonlinear effects are strongly modulated by the curvature of the parabolic voltage nullcline (partially reflecting the effects of the amplifying current) and the slope of the resonant current activation curve.
Following the effects of changes in the biophysical conductances of realistic conductance-based models through the parameters of the quadratic model, we characterize the qualitatively different effects that resonant and amplifying currents have on the nonlinear properties of the voltage response. We identify different classes of resonant currents, represented by h- and slow potassium, according to whether they enhance (h-) or attenuate (slow potassium) the nonlinear effects. Finally, we use dynamical systems tools to investigate the dynamic mechanisms of generation of resonance and phase-resonance. We show that the nonlinear effects on the voltage response (e.g., amplification of the voltage response in the resonant frequency band and shifts in the resonant and phase-resonant frequencies) result from the ability of limit cycle trajectories to follow the unstable (right) branch of the voltage nullcline for a significant amount of time. This is a canard-related mechanism that has been shown to underlie the generation of intrinsic subthreshold oscillations in quadratic type models such as medial entorhinal cortex stellate cells. Overall, our results highlight the complexity of the voltage response to oscillatory inputs in nonlinear models and the roles that resonant and amplifying currents have in shaping these responses.
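The baseline notions of resonant and phase-resonant frequency used throughout this abstract can be illustrated on the linearized two-variable system (the nonlinear amplification effects of the quadratic model are beyond this snippet). The model below and its parameter values are illustrative assumptions, not the paper's equations.

```python
import numpy as np

# Linearized resonant model: voltage v coupled to a slow gating variable w,
#   dv/dt = -g_L*v - g*w + I(t),   tau*dw/dt = v - w.
# For I(t) = exp(i*omega*t), eliminating w in steady state gives the impedance
#   Z(omega) = 1 / (g_L + i*omega + g / (1 + i*omega*tau)).
g_L, g, tau = 0.3, 1.0, 10.0
freqs = np.linspace(0.001, 2.0, 4000)   # input frequency, arbitrary units
omega = 2 * np.pi * freqs
Z = 1.0 / (g_L + 1j * omega + g / (1 + 1j * omega * tau))

amp = np.abs(Z)                  # amplitude (impedance) profile
phase = np.angle(Z)              # phase profile
f_res = freqs[np.argmax(amp)]    # resonant frequency: peak of |Z|
```

The slow negative-feedback variable w makes |Z| peak at a nonzero frequency (above the DC value 1/(g_L + g)) and makes the phase positive at low frequencies, the signature of a resonant current; the quadratic nonlinearity then amplifies and shifts this baseline response as described above.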

55 citations


Journal ArticleDOI
TL;DR: It is shown that in the presence of a visual input, the stable eigenmodes of the linearized operator represent perceptual units of the visual stimulus, strictly related to dimensionality reduction and clustering problems.
Abstract: In this paper we show that the emergence of perceptual units in V1 can be explained in terms of a physical mechanism of symmetry breaking of the mean field neural equation. We consider a mean field neural model which takes into account the functional architecture of the visual cortex, modeled as a group of rotations and translations equipped with a degenerate metric. The model generalizes well-known results of Bressloff and Cowan which, in the absence of input, account for hallucination patterns. The main result of our study consists in showing that in the presence of a visual input, the stable eigenmodes of the linearized operator represent perceptual units of the visual stimulus. The result is strictly related to dimensionality reduction and clustering problems.

54 citations


Journal ArticleDOI
TL;DR: Two representations for stochastic ion channel kinetics are provided, a random time change representation and a "Gillespie"-style representation, and the performance of exact simulation is compared with a commonly used numerical approximation strategy.
Abstract: In this paper we provide two representations for stochastic ion channel kinetics, and compare the performance of exact simulation with a commonly used numerical approximation strategy. The first representation we present is a random time change representation, popularized by Thomas Kurtz, with the second being analogous to a "Gillespie" representation. Exact stochastic algorithms are provided for the different representations, which are preferable to either (a) fixed time step or (b) piecewise constant propensity algorithms, which still appear in the literature. As examples, we provide versions of the exact algorithms for the Morris-Lecar conductance based model, and detail the error induced, both in a weak and a strong sense, by the use of approximate algorithms on this model. We include ready-to-use implementations of the random time change algorithm in both XPP and Matlab. Finally, through the consideration of parametric sensitivity analysis, we show how the representations presented here are useful in the development of further computational methods. The general representations and simulation strategies provided here are known in other parts of the sciences, but less so in the present setting.

52 citations


Journal ArticleDOI
TL;DR: The results demonstrate how synaptic connectivity could potentially be inferred from large-scale parallel spike train recordings and describe a minimal model that is optimized for large networks and an efficient scheme for its parallelized numerical optimization on generic computing clusters.
Abstract: Dynamics and function of neuronal networks are determined by their synaptic connectivity. Current experimental methods to analyze synaptic network structure on the cellular level, however, cover only small fractions of functional neuronal circuits, typically without a simultaneous record of neuronal spiking activity. Here we present a method for the reconstruction of large recurrent neuronal networks from thousands of parallel spike train recordings. We employ maximum likelihood estimation of a generalized linear model of the spiking activity in continuous time. For this model the point process likelihood is concave, such that a global optimum of the parameters can be obtained by gradient ascent. Previous methods, including those of the same class, did not allow recurrent networks of that order of magnitude to be reconstructed due to prohibitive computational cost and numerical instabilities. We describe a minimal model that is optimized for large networks and an efficient scheme for its parallelized numerical optimization on generic computing clusters. For a simulated balanced random network of 1000 neurons, synaptic connectivity is recovered with a misclassification error rate of less than 1 % under ideal conditions. We show that the error rate remains low in a series of example cases under progressively less ideal conditions. Finally, we successfully reconstruct the connectivity of a hidden synfire chain that is embedded in a random network, which requires clustering of the network connectivity to reveal the synfire groups. Our results demonstrate how synaptic connectivity could potentially be inferred from large-scale parallel spike train recordings.
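The core computation, gradient ascent on a concave point-process (GLM) likelihood, can be illustrated at toy scale. The sketch below fits a discrete-time Poisson GLM with an exponential link for a single pair of neurons; it is a minimal stand-in for the paper's continuous-time, network-scale method, and all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy GLM: neuron 2's rate depends on neuron 1's spike one bin earlier,
#   lambda_t = exp(b + w * s1[t-1]).  True parameters are illustrative.
T, b_true, w_true = 20_000, -3.0, 2.0
s1 = (rng.random(T) < 0.2).astype(float)     # presynaptic spike train
lam = np.exp(b_true + w_true * np.roll(s1, 1))
lam[0] = np.exp(b_true)                      # no wrap-around history
s2 = rng.poisson(lam)                        # postsynaptic spike counts

# Gradient ascent on the (concave) Poisson log-likelihood
#   L(b, w) = sum_t [ s2_t*(b + w*x_t) - exp(b + w*x_t) ].
x = np.roll(s1, 1)
x[0] = 0.0
b, w = 0.0, 0.0
for _ in range(3000):
    rate = np.exp(b + w * x)
    gb = (s2 - rate).mean()        # dL/db (per bin)
    gw = ((s2 - rate) * x).mean()  # dL/dw (per bin)
    b += 0.2 * gb
    w += 0.2 * gw
```

Concavity is what makes the approach scale: the same gradient ascent reaches the global optimum regardless of initialization, so the per-neuron fits can be parallelized across a cluster exactly as the abstract describes.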

45 citations


Journal ArticleDOI
TL;DR: To maintain optimal conduction velocities, increased myelin coverage of axonal surface must be accompanied by an increase in channel density at the evolving nodes, but along with increases in myelin thickness, a reduction in overall average channel density must occur.
Abstract: Multilayered, lipid-rich myelin increases nerve impulse conduction velocity, contributes to compact nervous systems, and reduces metabolic costs of neural activity. Based on the hypothesis that increased impulse conduction velocity provides a selective advantage that drives the evolution of myelin, we simulated a sequence of plausible intermediate stages of myelin evolution, each providing an enhancement of conduction speed. We started with the expansion of insulating glial coverage, which led first to a single layer of myelin surrounding the axon and then to multiple myelin wraps with well-organized nodes. The myelinated fiber was modeled at three levels of complexity as the hypothesized evolutionary progression became more quantitatively exacting: 1) representing the fiber as a mathematically tractable uniform active cylinder with the effect of myelination approximated by changing its specific capacitance (Cm); 2) representing it as a chain of simple, cable-model compartments having alternating nodal and internodal parameters subject to optimization; and 3) representing it in a double cable model with the axon and myelin sheath treated separately. Conduction velocity was optimized at each stage. To maintain optimal conduction velocities, increased myelin coverage of the axonal surface must be accompanied by an increase in channel density at the evolving nodes, but along with increases in myelin thickness, a reduction in overall average channel density must occur. Leakage under the myelin sheath becomes more of a problem with smaller fiber diameters, which may help explain the tendency for myelin to occur preferentially in larger nerve fibers in both vertebrates and invertebrates.

37 citations


Journal ArticleDOI
TL;DR: A nominal model of swallowing in Aplysia californica is described and it is shown that the underlying stable heteroclinic channel architecture exhibits dramatic slowing of activity when sensory and endogenous input is reduced, and that similar slowing with removal of proprioception is seen in vitro.
Abstract: Many behaviors require reliably generating sequences of motor activity while adapting the activity to incoming sensory information. This process has often been conceptually explained as either fully dependent on sensory input (a chain reflex) or fully independent of sensory input (an idealized central pattern generator, or CPG), although the consensus of the field is that most neural pattern generators lie somewhere between these two extremes. Many mathematical models of neural pattern generators use limit cycles to generate the sequence of behaviors, but other models, such as a heteroclinic channel (an attracting chain of saddle points), have been suggested. To explore the range of intermediate behaviors between CPGs and chain reflexes, in this paper we describe a nominal model of swallowing in Aplysia californica. Depending upon the value of a single parameter, the model can transition from a generic limit cycle regime to a heteroclinic regime (where the trajectory slows as it passes near saddle points). We then study the behavior of the system in these two regimes and compare the behavior of the models with behavior recorded in the animal in vivo and in vitro. We show that while both pattern generators can generate similar behavior, the stable heteroclinic channel can better respond to changes in sensory input induced by load, and that the response matches the changes seen when a load is added in vivo. We then show that the underlying stable heteroclinic channel architecture exhibits dramatic slowing of activity when sensory and endogenous input is reduced, and show that similar slowing with removal of proprioception is seen in vitro. Finally, we show that the distributions of burst lengths seen in vivo are better matched by the distribution expected from a system operating in the heteroclinic regime than that expected from a generic limit cycle. 
These observations suggest that generic limit cycle models may fail to capture key aspects of Aplysia feeding behavior, and that alternative architectures such as heteroclinic channels may provide better descriptions.

36 citations


Journal ArticleDOI
TL;DR: It is observed that background activity tends to give rise to extremely broad distributions of event sizes and inter-event times, while driving a system imposes a certain regularity on its inter- event times, producing a rhythm consistent with broad-band gamma oscillations.
Abstract: This numerical study documents and analyzes emergent spiking behavior in local neuronal populations. Emphasis is given to a phenomenon we call clustering, by which we refer to a tendency of random groups of neurons large and small to spontaneously coordinate their spiking activity in some fashion. Using a sparsely connected network of integrate-and-fire neurons, we demonstrate that spike clustering occurs ubiquitously in both high firing and low firing regimes. As a practical tool for quantifying such spike patterns, we propose a simple scheme with two parameters, one setting the temporal scale and the other the amount of deviation from the mean to be regarded as significant. Viewing population activity as a sequence of events, meaning relatively brief durations of elevated spiking, separated by inter-event times, we observe that background activity tends to give rise to extremely broad distributions of event sizes and inter-event times, while driving a system imposes a certain regularity on its inter-event times, producing a rhythm consistent with broad-band gamma oscillations. We note also that event sizes and inter-event times decorrelate very quickly. Dynamical analyses supported by numerical evidence are offered.
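The two-parameter scheme proposed here, a temporal scale plus a significance threshold on deviations from the mean, is straightforward to sketch. The snippet below applies it to synthetic population spike counts (the injected bursts, the Poisson background, and both parameter values are assumptions for illustration, not the paper's data).

```python
import numpy as np

rng = np.random.default_rng(4)

# Two-parameter event detection: `win` sets the temporal scale, `k` sets how
# far above the mean counts must be to count as significant.
T = 10_000
counts = rng.poisson(5.0, T).astype(float)   # background population activity
counts[3000:3050] += 20.0                    # two injected bursts ("events")
counts[7000:7030] += 20.0

win, k = 10, 4.0
smooth = np.convolve(counts, np.ones(win) / win, mode="same")
elevated = smooth > smooth.mean() + k * smooth.std()

# Contiguous runs of elevated bins are events; the gaps between runs are
# the inter-event times.
d = np.diff(np.concatenate(([0], elevated.astype(int), [0])))
starts = np.flatnonzero(d == 1)
ends = np.flatnonzero(d == -1)
event_sizes = ends - starts
inter_event = starts[1:] - ends[:-1]
```

With real network data the interesting output is the two distributions, `event_sizes` and `inter_event`, whose shapes the study uses to distinguish background activity from driven regimes.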

33 citations


Journal ArticleDOI
TL;DR: A thalamo-cortical neuronal population model is applied to reproduce the power spectrum changes in EEG during propofol-induced anaesthesia sedation and suggests an important role for synaptic GABAergic receptors at relay neurons and, more generally, for the thalamus in the generation of both the δ and α EEG patterns.
Abstract: Increasing concentrations of the anaesthetic agent propofol initially induce sedation before achieving full general anaesthesia. During this state of anaesthesia, the observed specific changes in electroencephalographic (EEG) rhythms comprise increased activity in the δ (0.5–4 Hz) and α (8–13 Hz) frequency bands over the frontal region, but increased δ and decreased α activity over the occipital region. It is known that the cortex, the thalamus, and the thalamo-cortical feedback loop contribute to some degree to the propofol-induced changes in the EEG power spectrum. However, the precise role of each structure in the dynamics of the EEG is unknown. In this paper we apply a thalamo-cortical neuronal population model to reproduce the power spectrum changes in EEG during propofol-induced sedation. The model reproduces the power spectrum features observed experimentally both in frontal and occipital electrodes. Moreover, a detailed analysis of the model indicates the importance of multiple resting states in brain activity. The work suggests that the δ activity originates from the cortico-thalamic relay interaction, whereas the emergence of α activity results from the full cortico-reticular-relay-cortical feedback loop with a prominent enforced thalamic reticular-relay interaction. This model suggests an important role for synaptic GABAergic receptors at relay neurons and, more generally, for the thalamus in the generation of both the δ and the α EEG patterns that are seen during propofol sedation.

33 citations


Journal ArticleDOI
TL;DR: It is shown that Bayesian inference can be implemented in the non-uniform population code model using one spike per neuron when the population is large and is thus able to support the rapid inference that is necessary for sound localization.
Abstract: Bayesian models are often successful in describing perception and behavior, but the neural representation of probabilities remains in question. There are several distinct proposals for the neural representation of probabilities, but they have not been directly compared in an example system. Here we consider three models: a non-uniform population code where the stimulus-driven activity and distribution of preferred stimuli in the population represent a likelihood function and a prior, respectively; the sampling hypothesis which proposes that the stimulus-driven activity over time represents a posterior probability and that the spontaneous activity represents a prior; and the class of models which propose that a population of neurons represents a posterior probability in a distributed code. It has been shown that the non-uniform population code model matches the representation of auditory space generated in the owl's external nucleus of the inferior colliculus (ICx). However, the alternative models have not been tested, nor have the three models been directly compared in any system. Here we tested the three models in the owl's ICx. We found that spontaneous firing rate and the average stimulus-driven response of these neurons were not consistent with predictions of the sampling hypothesis. We also found that neural activity in ICx under varying levels of sensory noise did not reflect a posterior probability. On the other hand, the responses of ICx neurons were consistent with the non-uniform population code model. We further show that Bayesian inference can be implemented in the non-uniform population code model using one spike per neuron when the population is large and is thus able to support the rapid inference that is necessary for sound localization.
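The first proposal tested here, a non-uniform population code in which the distribution of preferred stimuli represents the prior, admits a compact numerical sketch. The Gaussian tuning curves, Poisson spiking, and all parameter values below are illustrative assumptions; the snippet shows a maximum-likelihood readout of such a code, not the owl ICx analysis itself.

```python
import numpy as np

rng = np.random.default_rng(6)

# Non-uniform population code: preferred stimuli are drawn from the prior,
# so the density of the population itself carries prior information.
prior_mean, prior_sd = 0.0, 1.0
sig, r0, N = 0.5, 5.0, 200
prefs = rng.normal(prior_mean, prior_sd, N)   # non-uniform preferred stimuli

# One trial: Poisson spike counts from Gaussian tuning curves.
s_true = 0.8
counts = rng.poisson(r0 * np.exp(-0.5 * ((s_true - prefs) / sig) ** 2))

# Poisson log-likelihood over a stimulus grid:
#   log L(s) = sum_i [ counts_i * log f_i(s) - f_i(s) ] + const.
s_grid = np.linspace(-3.0, 3.0, 601)
tune = r0 * np.exp(-0.5 * ((s_grid[:, None] - prefs[None, :]) / sig) ** 2)
loglik = (counts * np.log(tune + 1e-12)).sum(axis=1) - tune.sum(axis=1)
s_hat = s_grid[np.argmax(loglik)]
```

Because more neurons (and hence more spikes) cover prior-likely stimulus values, the readout is most reliable exactly where the prior puts its mass, which is the sense in which the code's non-uniformity stands in for an explicit prior.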

Journal ArticleDOI
TL;DR: In this paper, morphological and electrophysiological properties of layer 3 pyramidal neurons from young and aged rhesus monkeys were characterized using in vitro whole-cell patch-clamp recordings and high-resolution digital reconstruction of neurons. Consistent with previous studies, aged neurons exhibited significantly reduced dendritic arbor length and spine density, as well as increased input resistance and firing rates.
Abstract: Layer 3 (L3) pyramidal neurons in the lateral prefrontal cortex (LPFC) of rhesus monkeys exhibit dendritic regression, spine loss and increased action potential (AP) firing rates during normal aging. The relationship between these structural and functional alterations, if any, is unknown. To address this issue, morphological and electrophysiological properties of L3 LPFC pyramidal neurons from young and aged rhesus monkeys were characterized using in vitro whole-cell patch-clamp recordings and high-resolution digital reconstruction of neurons. Consistent with our previous studies, aged neurons exhibited significantly reduced dendritic arbor length and spine density, as well as increased input resistance and firing rates. Computational models using the digital reconstructions with Hodgkin-Huxley and AMPA channels allowed us to assess relationships between demonstrated age-related changes and to predict physiological changes that have not yet been tested empirically. For example, the models predict that in both backpropagating APs and excitatory postsynaptic currents (EPSCs), attenuation is lower in aged versus young neurons. Importantly, when identical densities of passive parameters and voltage- and calcium-gated conductances were used in young and aged model neurons, neither input resistance nor firing rates differed between the two age groups. Tuning passive parameters for each model predicted significantly higher membrane resistance (Rm) in aged versus young neurons. This Rm increase alone did not account for increased firing rates in aged models, but coupling these Rm values with subtle differences in morphology and membrane capacitance did. The predicted differences in passive parameters (or parameters with similar effects) are mathematically plausible, but must be tested empirically.

Journal ArticleDOI
TL;DR: This work proposes a mechanism for learning both the order and precise timing of event sequences, choosing short term facilitation as a time-tracking process, and demonstrates that other mechanisms, such as spike rate adaptation, can fulfill this role.
Abstract: Neuronal circuits can learn and replay firing patterns evoked by sequences of sensory stimuli. After training, a brief cue can trigger a spatiotemporal pattern of neural activity similar to that evoked by a learned stimulus sequence. Network models show that such sequence learning can occur through the shaping of feedforward excitatory connectivity via long term plasticity. Previous models describe how event order can be learned, but they typically do not explain how precise timing can be recalled. We propose a mechanism for learning both the order and precise timing of event sequences. In our recurrent network model, long term plasticity leads to the learning of the sequence, while short term facilitation enables temporally precise replay of events. Learned synaptic weights between populations determine the time necessary for one population to activate another. Long term plasticity adjusts these weights so that the trained event times are matched during playback. While we chose short term facilitation as a time-tracking process, we also demonstrate that other mechanisms, such as spike rate adaptation, can fulfill this role. We also analyze the impact of trial-to-trial variability, showing how observational errors as well as neuronal noise result in variability in learned event times. The dynamics of the playback process determines how stochasticity is inherited in learned sequence timings. Future experiments that characterize such variability can therefore shed light on the neural mechanisms of sequence learning.

Journal ArticleDOI
TL;DR: A three-dimensional electrical model of syncytial smooth muscle developed using the compartmental modeling technique is presented, with special reference to the bladder detrusor, and it was found that a size of 21-cube may be considered the critical minimum size for an electrically infinite syncytium.
Abstract: Certain smooth muscles, such as the detrusor of the urinary bladder, exhibit a variety of spikes that differ markedly in their amplitudes and time courses. The origin of this diversity is poorly understood but is often attributed to the syncytial nature of smooth muscle and its distributed innervation. In order to help clarify such issues, we present here a three-dimensional electrical model of syncytial smooth muscle developed using the compartmental modeling technique, with special reference to the bladder detrusor. Values of model parameters were sourced or derived from experimental data. The model was validated against various modes of stimulation employed experimentally, and the results were found to accord with both theoretical predictions and experimental observations. Model outputs also satisfied criteria characteristic of electrical syncytia, such as correlation between the spatial spread and temporal decay of electrotonic potentials as well as a positively skewed amplitude frequency histogram for sub-threshold potentials, and led to interesting conclusions. Based on analysis of syncytia of different sizes, it was found that a size of 21-cube may be considered the critical minimum size for an electrically infinite syncytium. Set against experimental results, we conjecture the existence of electrically sub-infinite bundles in the detrusor. Moreover, the absence of coincident activity between closely spaced cells potentially implies, counterintuitively, highly efficient electrical coupling between such cells. The model thus provides a heuristic platform for the interpretation of electrical activity in syncytial tissues.

Journal ArticleDOI
TL;DR: The work highlights the crucial role of nonlinearities for the frequency dependence of neuronal information transmission by demonstrating nonlinearity-mediated band-pass filtering of information at frequencies close to the subthreshold impedance resonance in three different model systems.
Abstract: Neuronal information transmission is frequency specific. In single cells, a band-pass like frequency preference can arise from the subthreshold dynamics of the membrane potential, shaped by properties of the cell's membrane and its ionic channels. In these cases, a cell is termed resonant and its membrane impedance spectrum exhibits a peak at non-vanishing frequencies. Here, we show that this frequency selectivity of neuronal response amplitudes need not translate into a similar frequency selectivity of information transfer. In particular, neurons with resonant but linear subthreshold voltage dynamics (without threshold) do not show a resonance of information transfer at the level of subthreshold voltage; the corresponding coherence has low-pass characteristics. Interestingly, we find that when combined with nonlinearities, subthreshold resonances do shape the frequency dependence of coherence and the peak in the subthreshold impedance translates to a peak in the coherence function. In other words, the nonlinearity inherent to spike generation allows a subthreshold impedance resonance to shape a resonance of voltage-based information transfer. We demonstrate such nonlinearity-mediated band-pass filtering of information at frequencies close to the subthreshold impedance resonance in three different model systems: the resonate-and-fire model, the conductance-based Morris-Lecar model, and linear resonant dynamics combined with a simple static nonlinearity. In the spiking neuron models, the band-pass filtering is most pronounced for low firing rates and a high variability of interspike intervals, similar to the spiking statistics observed in vivo. We show that band-pass filtering is achieved by reducing information transfer over low-frequency components and, consequently, comes along with an overall reduction of information rate. Our work highlights the crucial role of nonlinearities for the frequency dependence of neuronal information transmission.
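The third model system named in this abstract, linear resonant dynamics followed by a simple static nonlinearity, can be sketched directly. In the snippet below the filter, threshold, and segment sizes are illustrative assumptions: a linear resonant filter alone yields coherence 1 at every frequency (no frequency selectivity of information transfer), while passing its output through a static threshold makes the coherence peak near the subthreshold resonance.

```python
import numpy as np

rng = np.random.default_rng(5)

n_seg, L = 200, 512
f = np.fft.rfftfreq(L, d=1.0)

def coherence(x, y):
    # Magnitude-squared coherence, averaged over segments (rows).
    X, Y = np.fft.rfft(x, axis=1), np.fft.rfft(y, axis=1)
    Sxy = (np.conj(X) * Y).mean(axis=0)
    Sxx = (np.abs(X) ** 2).mean(axis=0)
    Syy = (np.abs(Y) ** 2).mean(axis=0)
    return np.abs(Sxy) ** 2 / (Sxx * Syy + 1e-12)

# Damped-oscillator transfer function with a resonance near f0.
f0, zeta = 0.1, 0.3
H = 1.0 / (1.0 + 2j * zeta * (f / f0) - (f / f0) ** 2)

x = rng.normal(size=(n_seg, L))                       # white-noise input
v = np.fft.irfft(np.fft.rfft(x, axis=1) * H, axis=1)  # "subthreshold voltage"
y = (v > 0.5).astype(float)                           # static threshold output

coh_lin = coherence(x, v)   # flat (== 1): no resonance of information transfer
coh_nl = coherence(x, y)    # shaped by the resonance
f_peak = f[1:][np.argmax(coh_nl[1:])]
```

The linear output carries the input back perfectly at every frequency, so its coherence is flat; only after the nonlinearity does the impedance peak leave a signature in the coherence, which is the paper's central point.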

Journal ArticleDOI
TL;DR: This work presents a multi-input multi-output neural circuit architecture for nonlinear processing and encoding of stimuli in the spike domain and demonstrates a fundamental duality between the identification of the dendritic stimulus processor of a single neuron and the decoding of stimuli encoded by a population of neurons with a bank of dendrites.
Abstract: We present a multi-input multi-output neural circuit architecture for nonlinear processing and encoding of stimuli in the spike domain. In this architecture a bank of dendritic stimulus processors implements nonlinear transformations of multiple temporal or spatio-temporal signals, such as spike trains or auditory and visual stimuli, in the analog domain. Dendritic stimulus processors may act on both individual stimuli and on groups of stimuli, thereby executing complex computations that arise as a result of interactions between concurrently received signals. The results of the analog-domain computations are then encoded into a multi-dimensional spike train by a population of spiking neurons modeled as nonlinear dynamical systems. We investigate general conditions under which such circuits faithfully represent stimuli and demonstrate algorithms for (i) stimulus recovery, or decoding, and (ii) identification of dendritic stimulus processors from the observed spikes. Taken together, our results demonstrate a fundamental duality between the identification of the dendritic stimulus processor of a single neuron and the decoding of stimuli encoded by a population of neurons with a bank of dendritic stimulus processors. This duality result enabled us to derive lower bounds on the number of experiments to be performed and the total number of spikes that need to be recorded for identifying a neural circuit.

Journal ArticleDOI
TL;DR: This paper compares the ability of different methods to detect and resolve spikes recorded extracellularly with polytrode and high-density microelectrode arrays (MEAs) and finds two methods based on windowing and clustering could resolve spikes occurring 1 ms or more apart, regardless of their spatial separation.
Abstract: This paper compares the ability of different methods to detect and resolve spikes recorded extracellularly with polytrode and high-density microelectrode arrays (MEAs). Detecting spikes on such arrays is more complex than with single electrodes or tetrodes since a single spike from a neuron may cause threshold crossings on several adjacent channels, giving rise to multiple events. These initial events have to be recognized as belonging to a single spike. Combining them is, in essence, a clustering problem. A conflicting need is to be able to resolve spike waveforms that occur close together in space and time. We first evaluated three different detection methods, using simulated data in which spike shape waveforms obtained from real recordings were added to noise with an amplitude and temporal structure similar to that found in real recordings. Performance was assessed by calculating the percentage of correctly identified spikes vs. the false positive rate. Using the best of these detection methods, two different methods for avoiding multiple detections per spike were tested: one based on windowing and the other based on clustering. Using parameters that avoided spatial and temporal duplication, the spatiotemporal resolution of the two methods was next evaluated. The method based on clustering gave slightly better results. Both methods could resolve spikes occurring 1 ms or more apart, regardless of their spatial separation. There was no restriction on the temporal resolution of spike pairs for units more than 200 μm apart.

Journal ArticleDOI
TL;DR: The model and numerical methods presented here can be applied to any neuronal circuit where dendritic spines are invaginated in presynaptic terminals or boutons and provide convincing evidence that an ephaptic mechanism can produce the feedback effect seen in experiments.
Abstract: Experimental evidence suggests the existence of a negative feedback pathway between horizontal cells and cone photoreceptors in the outer plexiform layer of the retina that modulates the flow of calcium ions into the synaptic terminals of cones. However, the underlying mechanism for this feedback is controversial and there are currently three competing hypotheses: the ephaptic hypothesis, the pH hypothesis, and the GABA hypothesis. The goal of this investigation is to demonstrate the ephaptic hypothesis by means of detailed numerical simulations. The drift-diffusion (Poisson-Nernst-Planck) model with membrane boundary current equations is applied to a realistic two-dimensional cross-section of the triad synapse in the goldfish retina to verify the existence of strictly electrical feedback, as predicted by the ephaptic hypothesis. The effect on electrical feedback from the behavior of the bipolar cell membrane potential is also explored. The computed steady-state cone calcium transmembrane current-voltage curves for several cases are presented and compared with experimental data on goldfish. The results provide convincing evidence that an ephaptic mechanism can produce the feedback effect seen in experiments. The model and numerical methods presented here can be applied to any neuronal circuit where dendritic spines are invaginated in presynaptic terminals or boutons.

Journal ArticleDOI
TL;DR: The spike statistics of an adaptive exponential integrate-and-fire neuron stimulated by white Gaussian current noise are studied to provide an alternative explanation for interspike-interval correlations observed in vivo and may be useful in fitting point neuron models to experimental data.
Abstract: We study the spike statistics of an adaptive exponential integrate-and-fire neuron stimulated by white Gaussian current noise. We derive analytical approximations for the coefficient of variation and the serial correlation coefficient of the interspike interval assuming that the neuron operates in the mean-driven tonic firing regime and that the stochastic input is weak. Our result for the serial correlation coefficient has the form of a geometric sequence and is confirmed by the comparison to numerical simulations. The theory predicts various patterns of interval correlations (positive or negative at lag one, monotonically decreasing or oscillating) depending on the strength of the spike-triggered and subthreshold components of the adaptation current. In particular, for pure subthreshold adaptation we find strong positive ISI correlations that are usually ascribed to positive correlations in the input current. Our results i) provide an alternative explanation for interspike-interval correlations observed in vivo, ii) may be useful in fitting point neuron models to experimental data, and iii) may be instrumental in exploring the role of adaptation currents for signal detection and signal transmission in single neurons.
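The interval statistics in question are straightforward to compute from an ISI sequence. The sketch below (a surrogate toy sequence, not the adaptive exponential integrate-and-fire model itself) defines the coefficient of variation and the serial correlation coefficient, and checks them on an MA(1)-structured ISI sequence whose negative lag-one correlation mimics the effect of spike-triggered adaptation:

```python
import numpy as np

def cv(isi):
    """Coefficient of variation of an interspike-interval sequence."""
    return float(isi.std() / isi.mean())

def scc(isi, lag):
    """Serial correlation coefficient of the ISI sequence at the given lag."""
    d = isi - isi.mean()
    return float((d[:-lag] * d[lag:]).mean() / d.var())

rng = np.random.default_rng(1)
theta = 0.6
xi = 0.05 * rng.standard_normal(200_001)
# Toy ISI sequence with MA(1) structure: a long interval tends to be
# followed by a short one, as with spike-triggered adaptation
isi = 1.0 + xi[1:] - theta * xi[:-1]
rho1 = scc(isi, 1)   # theory for MA(1): -theta / (1 + theta**2), about -0.44
rho2 = scc(isi, 2)   # theory for MA(1): 0
```

The MA(1) surrogate has correlations only at lag one; the geometric-sequence pattern derived in the paper additionally produces nonzero correlations at higher lags.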

Journal ArticleDOI
TL;DR: It is found that absence seizures arise from instabilities in the system and replicate experimental studies in numerous animal models and clinical studies.
Abstract: The mechanisms underlying generalized seizures are explored with neural field theory. A corticothalamic neural field model that has accounted for multiple brain activity phenomena and states is used to explore changes leading to pathological seizure states. It is found that absence seizures arise from instabilities in the system and replicate experimental studies in numerous animal models and clinical studies.

Journal ArticleDOI
TL;DR: A pulse-based mechanism by which a graded current amplitude may be exactly propagated from one neuronal population to another via a pulse is presented and postulate that such circuits, with automatic and stereotyped control and processing of information, are the neural correlates of Crick and Koch's zombie modes.
Abstract: Neural oscillations can enhance feature recognition (Azouz and Gray Proceedings of the National Academy of Sciences of the United States of America, 97, 8110–8115 2000), modulate interactions between neurons (Womelsdorf et al. Science, 316, 1609–1612 2007), and improve learning and memory (Markowska et al. The Journal of Neuroscience, 15, 2063–2073 1995). Numerical studies have shown that coherent spiking can give rise to windows in time during which information transfer can be enhanced in neuronal networks (Abeles Israel Journal of Medical Sciences, 18, 83–92 1982; Lisman and Idiart Science, 267, 1512–1515 1995; Salinas and Sejnowski Nature Reviews Neuroscience, 2, 539–550 2001). Unanswered questions are: 1) What is the transfer mechanism? And 2) how well can a transfer be executed? Here, we present a pulse-based mechanism by which a graded current amplitude may be exactly propagated from one neuronal population to another. The mechanism relies on the downstream gating of mean synaptic current amplitude from one population of neurons to another via a pulse. Because transfer is pulse-based, information may be dynamically routed through a neural circuit with fixed connectivity. We demonstrate the transfer mechanism in a realistic network of spiking neurons and show that it is robust to noise in the form of pulse timing inaccuracies, random synaptic strengths and finite size effects. We also show that the mechanism is structurally robust in that it may be implemented using biologically realistic pulses. The transfer mechanism may be used as a building block for fast, complex information processing in neural circuits. We show that the mechanism naturally leads to a framework wherein neural information coding and processing can be considered as a product of linear maps under the active control of a pulse generator.
Distinct control and processing components combine to form the basis for the binding, propagation, and processing of dynamically routed information within neural pathways. Using our framework, we construct example neural circuits to 1) maintain a short-term memory, 2) compute time-windowed Fourier transforms, and 3) perform spatial rotations. We postulate that such circuits, with automatic and stereotyped control and processing of information, are the neural correlates of Crick and Koch's zombie modes.

Journal ArticleDOI
TL;DR: This paper combines the use of an information theoretic approach with intracellular recording to establish patterns of connections between layers of interneurons in a neural network responsible for mediating reflex movements of the hind limb of an insect.
Abstract: Understanding the patterns of interconnections between neurons in complex networks is an enormous challenge using traditional physiological approaches. Here we combine the use of an information theoretic approach with intracellular recording to establish patterns of connections between layers of interneurons in a neural network responsible for mediating reflex movements of the hind limb of an insect. By analysing delayed mutual information of the synaptic and spiking responses of sensory neurons, spiking and nonspiking interneurons in response to movement of a joint receptor that monitors the position of the tibia relative to the femur, we are able to predict the patterns of interconnections between the layers of sensory neurons and interneurons in the network, with results matching closely those known from the literature. In addition, we use cross-correlation methods to establish the sign of those interconnections and show that they also show a high degree of similarity with those established for these networks over the last 30 years. The method proposed in this paper has great potential to elucidate functional connectivity at the neuronal level in many different neuronal networks.
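A minimal version of the delayed-mutual-information analysis can be sketched as follows, using a histogram estimator on synthetic analog signals (the 7-sample delay, signal construction, and bin count are illustrative, not taken from the paper):

```python
import numpy as np

def delayed_mi(x, y, delay, bins=8):
    """Mutual information (bits) between x(t) and y(t + delay), histogram estimate."""
    xs, ys = x[:len(x) - delay], y[delay:]
    pxy, _, _ = np.histogram2d(xs, ys, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])).sum())

rng = np.random.default_rng(2)
# smoothed noise as a stand-in "sensory" signal
x = np.convolve(rng.standard_normal(50_000), np.ones(5) / 5, mode='same')
y = np.roll(x, 7) + 0.1 * rng.standard_normal(x.size)  # "interneuron" lags x by 7 samples
mis = [delayed_mi(x, y, d) for d in range(15)]
best = int(np.argmax(mis))   # the peak of delayed MI recovers the transmission delay
```

The delay at which mutual information peaks identifies the lag between the two signals, which is the kind of evidence the paper uses to order neurons into layers.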

Journal ArticleDOI
TL;DR: This study demonstrates that a new class of perturbations can achieve SR, namely, series of stochastically generated biphasic pulse trains, and shows that a Hodgkin–Huxley model neuron exhibits SR behavior when detecting weak input signals.
Abstract: Stochastic resonance (SR) is the enhanced representation of a weak input signal by the addition of an optimal level of broadband noise to a nonlinear (threshold) system. Since its discovery in the 1980s, the domain of input signals shown to be applicable to SR has greatly expanded, from strictly periodic inputs to now nearly any aperiodic forcing function. The perturbations (noise) used to generate SR have also expanded, from white noise to now colored noise or vibrational forcing. This study demonstrates that a new class of perturbations can achieve SR, namely, series of stochastically generated biphasic pulse trains. Using these pulse trains as 'noise' we show that a Hodgkin–Huxley model neuron exhibits SR behavior when detecting weak input signals. This result is of particular interest to neuroscience because nearly all artificial neural stimulation is implemented with square current or voltage pulses rather than broadband noise, and this new method may facilitate the translation of the performance gains achievable through SR to neural prosthetics.
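The core effect fits in a few lines: a subthreshold sinusoid is pushed across a hard threshold by a sparse train of random positive and negative pulses (a crude stand-in for the stochastically generated biphasic pulse trains of the paper; all amplitudes and rates are illustrative). Detection quality is non-monotonic in pulse amplitude, which is the signature of stochastic resonance:

```python
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(0.0, 2000.0, 0.1)
signal = 0.8 * np.sin(2 * np.pi * 0.01 * t)   # subthreshold: never crosses 1.0 alone

def detection_quality(pulse_amp):
    """Correlation between the threshold-crossing output and the hidden signal."""
    # sparse random train of positive/negative pulses
    pulses = pulse_amp * rng.choice([-1.0, 0.0, 1.0], size=t.size, p=[0.05, 0.9, 0.05])
    out = ((signal + pulses) > 1.0).astype(float)
    if out.std() == 0.0:
        return 0.0                             # no crossings at all
    return float(np.corrcoef(out, signal)[0, 1])

# no noise, moderate noise, excessive noise
quality = [detection_quality(a) for a in (0.0, 0.5, 5.0)]
```

With no pulses the signal is invisible; with excessive pulses the output is dominated by the noise; an intermediate amplitude lets crossings cluster around the signal peaks.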

Journal ArticleDOI
TL;DR: The results reveal that the asymmetry index of the dendritic tree does not correlate with the performance for the full range of tree morphologies, and the performance of neurons with dendritic tapering is best predicted by the mean and variance of the electrotonic distance of their synapses to the soma.
Abstract: In this paper we examine how a neuron's dendritic morphology can affect its pattern recognition performance. We use two different algorithms to systematically explore the space of dendritic morphologies: an algorithm that generates all possible dendritic trees with 22 terminal points, and one that creates representative samples of trees with 128 terminal points. Based on these trees, we construct multi-compartmental models. To assess the performance of the resulting neuronal models, we quantify their ability to discriminate learnt and novel input patterns. We find that the dendritic morphology does have a considerable effect on pattern recognition performance and that the neuronal performance is inversely correlated with the mean depth of the dendritic tree. The results also reveal that the asymmetry index of the dendritic tree does not correlate with the performance for the full range of tree morphologies. The performance of neurons with dendritic tapering is best predicted by the mean and variance of the electrotonic distance of their synapses to the soma. All relationships found for passive neuron models also hold, even in more accentuated form, for neurons with active membranes.
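The "mean depth" statistic that predicts performance is easy to state precisely. A small sketch, encoding binary dendritic trees as nested 2-tuples (an illustrative encoding, not the authors' generation algorithm):

```python
def terminal_depths(tree, depth=0):
    """Depths of all terminal points of a binary tree given as nested 2-tuples."""
    if not isinstance(tree, tuple):
        return [depth]                 # a leaf, i.e. a terminal point
    left, right = tree
    return terminal_depths(left, depth + 1) + terminal_depths(right, depth + 1)

def mean_depth(tree):
    """Mean depth of the tree's terminal points."""
    d = terminal_depths(tree)
    return sum(d) / len(d)

chain = ('a', ('b', ('c', 'd')))       # maximally asymmetric tree, 4 terminals
balanced = (('a', 'b'), ('c', 'd'))    # maximally symmetric tree, 4 terminals
```

Here mean_depth(chain) is 2.25 while mean_depth(balanced) is 2.0; by the inverse correlation reported in the paper, the balanced tree would be expected to perform better at pattern recognition.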

Journal ArticleDOI
TL;DR: Considering the role of calcium dependent synaptic plasticity, the results suggest that RyR-regulated calcium propagation induces a similar change at the synapses; however, according to the dependence of RyR calcium regulation on the model parameters, whether the chain activation of RyRs occurred depended sensitively on the spatial expression of RyR and on nominal fluctuations of calcium flux.
Abstract: Synaptic modifications induced at one synapse are accompanied by hetero-synaptic changes at neighboring sites. In addition, it is suggested that the mechanism of spatial association of synaptic plasticity is based on intracellular calcium signaling that is mainly regulated by two types of receptors of endoplasmic reticulum calcium store: the ryanodine receptor (RyR) and the inositol triphosphate receptor (IP3R). However, it is not clear how these types of receptors regulate intracellular calcium flux and contribute to the outcome of calcium-dependent synaptic change. To understand the relation between the synaptic association and store-regulated calcium dynamics, we focused on the function of RyR calcium regulation and simulated its behavior by using a computational neuron model. As a result, we observed that RyR-regulated calcium release depended on spike timings of pre- and postsynaptic neurons. From the induction site of calcium release, the chain activation of RyRs occurred, and spike-like calcium increase propagated along the dendrite. For calcium signaling, the propagated calcium increase did not tend to attenuate; these characteristics came from an all-or-none behavior of RyR-sensitive calcium store. Considering the role of calcium dependent synaptic plasticity, the results suggest that RyR-regulated calcium propagation induces a similar change at the synapses. However, according to the dependence of RyR calcium regulation on the model parameters, whether the chain activation of RyRs occurred depended sensitively on the spatial expression of RyR and on nominal fluctuations of calcium flux. Therefore, calcium regulation of RyR helps initiate rather than relay calcium propagation.

Journal ArticleDOI
TL;DR: In this paper, the authors analyzed the directionality of information flow between different layers of the cortex and the connected thalamus during spontaneous activity and found that infragranular layers lead the information flow during slow oscillations, both towards supragranular layers and towards the thalamus.
Abstract: The recurrent circuitry of the cerebral cortex generates an emergent pattern of activity that is organized into rhythmic periods of firing and silence referred to as slow oscillations (ca. 1 Hz). Slow oscillations not only are dominant during slow wave sleep and deep anesthesia, but also can be generated by the isolated cortical network in vitro, being a sort of default activity of the cortical network. The cortex is densely and reciprocally connected with subcortical structures and, as a result, the slow oscillations in situ are the result of an interplay between cortex and thalamus. Due to this reciprocal connectivity and interplay, the mechanism responsible for the initiation of waves in the corticothalamocortical loop during slow oscillations is still a matter of debate. It was our objective to determine the directionality of the information flow between different layers of the cortex and the connected thalamus during spontaneous activity. With that purpose we obtained multilayer local field potentials from the rat visual cortex and from its connected thalamus, the lateral geniculate nucleus, during deep anesthesia. We analyzed directionality of information flow between thalamus, cortical infragranular layers (5 and 6) and supragranular layers (2/3) by means of three information theoretical indicators: transfer entropy, symbolic transfer entropy and transcript mutual information. These three indicators coincided in finding that infragranular layers lead the information flow during slow oscillations both towards supragranular layers and towards the thalamus.
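For intuition, transfer entropy for binarized signals (e.g. thresholded field potentials) can be estimated directly from joint histograms. A minimal sketch with one step of history, on synthetic data where the direction of drive is known (all details illustrative, not the paper's estimators or data):

```python
import numpy as np

def transfer_entropy(x, y):
    """Transfer entropy x -> y in bits for binary series, history length one."""
    yt1, yt, xt = y[1:], y[:-1], x[:-1]
    te = 0.0
    for a in (0, 1):
        for b in (0, 1):
            for c in (0, 1):
                p_abc = np.mean((yt1 == a) & (yt == b) & (xt == c))
                if p_abc == 0.0:
                    continue
                p_bc = np.mean((yt == b) & (xt == c))
                p_ab = np.mean((yt1 == a) & (yt == b))
                p_b = np.mean(yt == b)
                # compares p(y_next | y, x) against p(y_next | y)
                te += p_abc * np.log2((p_abc / p_bc) / (p_ab / p_b))
    return te

rng = np.random.default_rng(3)
x = (rng.random(100_000) < 0.5).astype(int)
flips = (rng.random(x.size - 1) < 0.1).astype(int)
y = np.empty_like(x)
y[0] = 0
y[1:] = x[:-1] ^ flips                 # y copies x one step later, with 10% errors
te_xy = transfer_entropy(x, y)         # close to 1 - H(0.1), about 0.53 bits
te_yx = transfer_entropy(y, x)         # close to zero: no feedback
```

The asymmetry te_xy >> te_yx is what identifies x as the leading signal, analogous to infragranular layers leading supragranular layers and the thalamus in the paper.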

Journal ArticleDOI
TL;DR: It is shown that computational models that are carefully linked with experiment provide insight into the generation of population rhythms in the mammalian brain and that cellular adaptation in pyramidal cells could be an important aspect for the occurrence of theta frequency population bursting in the hippocampus.
Abstract: Determining the biological details and mechanisms that are essential for the generation of population rhythms in the mammalian brain is a challenging problem. This problem cannot be addressed either by experimental or computational studies in isolation. Here we show that computational models that are carefully linked with experiment provide insight into this problem. Using the experimental context of a whole hippocampus preparation in vitro that spontaneously expresses theta frequency (3–12 Hz) population bursts in the CA1 region, we create excitatory network models to examine whether cellular adaptation bursting mechanisms could critically contribute to the generation of this rhythm. We use biologically-based cellular models of CA1 pyramidal cells and network sizes and connectivities that correspond to the experimental context. By expanding our mean field analyses to networks with heterogeneity and non all-to-all coupling, we allow closer correspondence with experiment, and use these analyses to greatly extend the range of parameter values that are explored. We find that our model excitatory networks can produce theta frequency population bursts in a robust fashion. Thus, even though our networks are limited by not including inhibition at present, our results indicate that cellular adaptation in pyramidal cells could be an important aspect for the occurrence of theta frequency population bursting in the hippocampus. These models serve as a starting framework for the inclusion of inhibitory cells and for the consideration of additional experimental features not captured in our present network models.

Journal ArticleDOI
TL;DR: The results add a new warning about extracting conductance traces from intracellular recordings and about the conclusions concerning neuronal activity that can be drawn from them; an alternative method based on quadratization of the subthreshold dynamics reduces the relative errors of the estimated conductances by more than one order of magnitude.
Abstract: We study the influence of subthreshold activity in the estimation of synaptic conductances. It is known that differences between actual conductances and the estimated ones using linear regression methods can be huge in spiking regimes, so caution has been taken to remove spiking activity from experimental data before proceeding to linear estimation. However, not much attention has been paid to the influence of ionic currents active in the non-spiking regime where such linear methods are still profusely used. In this paper, we use conductance-based models to test this influence using several representative mechanisms to induce ionic subthreshold activity. In all the cases, we show that the currents activated during subthreshold activity can lead to significant errors when estimating synaptic conductance linearly. Thus, our results add a new warning about extracting conductance traces from intracellular recordings and about the conclusions concerning neuronal activity that can be drawn from them. Additionally, we present, as a proof of concept, an alternative method that takes into account the main nonlinear effects of specific ionic subthreshold currents. This method, based on the quadratization of the subthreshold dynamics, allows us to reduce the relative errors of the estimated conductances by more than one order of magnitude. In experimental conditions, under appropriate fitting to canonical models, it could be useful for obtaining better estimates, even in the presence of noise.
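The linear estimation step the abstract critiques can be sketched on a purely passive membrane, where it is exact (all parameter values and units are illustrative). Adding subthreshold ionic currents to the simulated neuron, as the paper does, would bias the recovered conductances:

```python
import numpy as np

# Known passive parameters (illustrative, arbitrary units)
C, gL, EL = 1.0, 0.1, -70.0
Ee, Ei = 0.0, -80.0
ge_true, gi_true = 0.05, 0.12

rng = np.random.default_rng(4)
dt, n = 0.1, 50_000
I = 0.5 * rng.standard_normal(n)        # known fluctuating injected current
V = np.empty(n)
V[0] = -65.0
for t in range(n - 1):                   # forward-Euler passive membrane
    dV = (-gL * (V[t] - EL) - ge_true * (V[t] - Ee)
          - gi_true * (V[t] - Ei) + I[t]) / C
    V[t + 1] = V[t] + dt * dV

# Linear estimation: C dV/dt + gL (V - EL) - I = -ge (V - Ee) - gi (V - Ei)
dVdt = (V[1:] - V[:-1]) / dt
lhs = C * dVdt + gL * (V[:-1] - EL) - I[:-1]
A = np.column_stack([-(V[:-1] - Ee), -(V[:-1] - Ei)])
(ge_hat, gi_hat), *_ = np.linalg.lstsq(A, lhs, rcond=None)
```

On this passive model the least-squares estimates recover ge_true and gi_true essentially exactly; the paper's point is that active subthreshold currents, absent here, break this identity.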

Journal ArticleDOI
TL;DR: It was found that CA3 kernels had significantly more power in the theta and beta range than those of CA1, confirming CA3’s role as an endogenous hippocampal pacemaker.
Abstract: Although an anatomical connection from CA1 to CA3 via the Entorhinal Cortex (EC) and through backprojecting interneurons has long been known to exist, it has never been examined quantitatively on the single neuron level in the in-vivo, nonpathological, nonperturbed brain. Here, single spike activity was recorded using a multi-electrode array from the CA3 and CA1 areas of the rodent hippocampus (N = 7) during a behavioral task. The predictive power from CA3→CA1 and CA1→CA3 was examined by constructing Multivariate Autoregressive (MVAR) models from recorded neurons in both directions. All nonsignificant inputs and models were identified and removed by means of Monte Carlo simulation methods. It was found that 121/166 (73 %) CA3→CA1 models and 96/145 (66 %) CA1→CA3 models had significant predictive power, thus confirming a predictive 'Granger' causal relationship from CA1 to CA3. This relationship is thought to be caused by a combination of truly causal connections such as the CA1→EC→CA3 pathway and common inputs such as those from the Septum. All MVAR models were then examined in the frequency domain and it was found that CA3 kernels had significantly more power in the theta and beta range than those of CA1, confirming CA3's role as an endogenous hippocampal pacemaker.
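The Granger logic behind the MVAR analysis fits in a few lines: a direction is deemed "causal" when adding the other signal's past shrinks the one-step prediction error. A toy sketch with lag-1 models on synthetic continuous data (not the paper's spike trains, model orders, or significance testing):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20_000
ex = rng.standard_normal(n)
ey = rng.standard_normal(n)
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + ex[t]
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + ey[t]   # x drives y, not vice versa

def resid_var(target, predictors):
    """One-step-ahead residual variance of a lag-1 linear model."""
    A = np.column_stack([p[:-1] for p in predictors])
    coef, *_ = np.linalg.lstsq(A, target[1:], rcond=None)
    return np.var(target[1:] - A @ coef)

# Granger index: log ratio of restricted to full residual variance
gc_xy = np.log(resid_var(y, [y]) / resid_var(y, [y, x]))   # clearly positive
gc_yx = np.log(resid_var(x, [x]) / resid_var(x, [x, y]))   # near zero
```

A positive index in one direction and a near-zero index in the other is the pattern the MVAR analysis looks for between CA3 and CA1.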

Journal ArticleDOI
TL;DR: In this article, the authors employ a dimension reduction method that relies on a combination of Monte Carlo simulations and probability density function equations to analyze the relationship between intrinsic and network heterogeneity in spiking neural networks.
Abstract: Heterogeneity of neural attributes has recently gained a lot of attention and is increasingly recognized as a crucial feature in neural processing. Despite its importance, this physiological feature has traditionally been neglected in theoretical studies of cortical neural networks. Thus, much remains unknown about the consequences of cellular and circuit heterogeneity in spiking neural networks. In particular, combining network or synaptic heterogeneity and intrinsic heterogeneity has yet to be considered systematically despite the fact that both are known to exist and likely have significant roles in neural network dynamics. In a canonical recurrent spiking neural network model, we study how these two forms of heterogeneity lead to different distributions of excitatory firing rates. To analytically characterize how these types of heterogeneities affect the network, we employ a dimension reduction method that relies on a combination of Monte Carlo simulations and probability density function equations. We find that the relationship between intrinsic and network heterogeneity has a strong effect on the overall level of heterogeneity of the firing rates. Specifically, this relationship can lead to amplification or attenuation of firing rate heterogeneity, and these effects depend on whether the recurrent network is firing asynchronously or rhythmically. These observations are captured with the aforementioned reduction method, and furthermore simpler analytic descriptions based on this dimension reduction method are developed. The final analytic descriptions provide compact and descriptive formulas for how the relationship between intrinsic and network heterogeneity determines the firing rate heterogeneity dynamics in various settings.