
Showing papers in "Journal of Computational Neuroscience in 2004"


Journal ArticleDOI
TL;DR: It is demonstrated how DBS could work by increasing firing rates of target cells, rather than shutting them down, by replacing pathologically rhythmic basal ganglia output with tonic, high frequency firing.
Abstract: Deep brain stimulation (DBS) of the subthalamic nucleus (STN) or the internal segment of the globus pallidus (GPi) has recently been recognized as an important form of intervention for alleviating motor symptoms associated with Parkinson's disease, but the mechanism underlying its effectiveness remains unknown. Using a computational model, this paper considers the hypothesis that DBS works by replacing pathologically rhythmic basal ganglia output with tonic, high frequency firing. In our simulations of parkinsonian conditions, rhythmic inhibition from GPi to the thalamus compromises the ability of thalamocortical relay (TC) cells to respond to depolarizing inputs, such as sensorimotor signals. High frequency stimulation of STN regularizes GPi firing, and this restores TC responsiveness, despite the increased frequency and amplitude of GPi inhibition to thalamus that result. We provide a mathematical phase plane analysis of the mechanisms that determine TC relay capabilities in normal, parkinsonian, and DBS states in a reduced model. This analysis highlights the differences in deinactivation of the low-threshold calcium T-current that we observe in TC cells in these different conditions. Alternative scenarios involving convergence of thalamic signals in the cortex are also discussed, and predictions associated with these results, including the occurrence of rhythmic rebound bursts in certain TC cells in parkinsonian states and their drastic reduction by DBS, are stated. These results demonstrate how DBS could work by increasing firing rates of target cells, rather than shutting them down.
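The relay mechanism described in this abstract can be caricatured with a much simpler model than the authors' biophysical one. The sketch below (all parameters illustrative, not taken from the paper) drives a leaky-integrator "TC cell" with periodic excitatory pulses plus inhibition that is either rhythmic or tonic with the same mean strength; pulses coinciding with the inhibitory phase fail to be relayed.

```python
def relayed_spikes(g_inh, T=400.0, dt=0.1, tau=10.0, thresh=1.0):
    """Count spikes of a leaky-integrator relay cell driven by periodic
    excitatory pulses plus a time-varying subtractive inhibition g_inh(t)."""
    V, spikes = 0.0, 0
    for i in range(int(T / dt)):
        t = i * dt
        I_exc = 3.0 if (t % 20.0) < 10.0 else 0.0   # sensorimotor-like pulses
        V += dt * (-V + I_exc - g_inh(t)) / tau     # leaky integration
        if V >= thresh:                             # relay succeeded
            spikes += 1
            V = 0.0
    return spikes

# Same mean inhibition, different temporal pattern:
tonic = relayed_spikes(lambda t: 1.0)                                   # DBS-like
rhythmic = relayed_spikes(lambda t: 2.0 if (t % 40.0) < 20.0 else 0.0)  # parkinsonian
```

With these toy numbers the tonic case relays every pulse, while the rhythmic case blocks exactly those pulses that land in the inhibited half-cycle, illustrating why regularized high-frequency inhibition can restore relay fidelity.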

560 citations


Journal ArticleDOI
TL;DR: This article provides a guide to entering a new model into ModelDB, a web-accessible database for convenient entry, retrieval, and running of published models on different platforms.
Abstract: Wider dissemination and testing of computational models are crucial to the field of computational neuroscience. Databases are being developed to meet this need. ModelDB is a web-accessible database for convenient entry, retrieval, and running of published models on different platforms. This article provides a guide to entering a new model into ModelDB.

381 citations


Journal ArticleDOI
TL;DR: This study developed a set of map-based models that replicate spiking activity of cortical fast spiking, regular spiking and intrinsically bursting neurons that can be widely used for large-scale simulations and that such models are especially useful for tasks where the modeling of specific firing patterns of different cell classes is important.
Abstract: We develop a new computationally efficient approach for the analysis of complex large-scale neurobiological networks. Its key element is the use of a new phenomenological model of a neuron capable of replicating important spike pattern characteristics and designed in the form of a system of difference equations (a map). We developed a set of map-based models that replicate spiking activity of cortical fast spiking, regular spiking and intrinsically bursting neurons. Interconnected with synaptic currents, these model neurons demonstrated responses very similar to those found with Hodgkin-Huxley models and in experiments. We illustrate the efficacy of this approach in simulations of one- and two-dimensional cortical network models consisting of regular spiking neurons and fast spiking interneurons to model sleep and activated states of the thalamocortical system. Our study suggests that map-based models can be widely used for large-scale simulations and that such models are especially useful for tasks where the modeling of specific firing patterns of different cell classes is important.
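The map-based neuron family referred to here is closely related to Rulkov's two-dimensional map, in which a fast variable plays the role of membrane potential and a slow variable that of adaptation. A minimal sketch (parameter values illustrative, not the paper's calibrated set):

```python
def rulkov_step(x, y, alpha=4.5, mu=0.001, sigma=-0.5):
    """One iteration of a Rulkov-style map: fast variable x (membrane-like),
    slow variable y (adaptation-like)."""
    x_next = alpha / (1.0 + x * x) + y
    y_next = y - mu * (x - sigma)
    return x_next, y_next

x, y = -1.0, -2.8
xs = []
for n in range(20000):
    x, y = rulkov_step(x, y)
    xs.append(x)
tail = xs[5000:]   # discard the transient
```

Iterated from these initial conditions, x does not settle to rest but produces irregular spiking-bursting oscillations; each iteration costs only a few arithmetic operations, which is the efficiency argument behind map-based large-scale simulation.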

179 citations


Journal ArticleDOI
TL;DR: It is shown that in a hierarchical system of synfire chains, a part-binding problem may be resolved, and that such a system readily demonstrates the property of priming.
Abstract: This paper examines the feasibility of manifesting compositionality by a system of synfire chains. Compositionality is the ability to construct mental representations, hierarchically, in terms of parts and their relations. We show that synfire chains may synchronize their waves when a few orderly cross links are available. We propose that synchronization among synfire chains can be used for binding components into a whole. Such synchronization is shown both for detailed simulations, and by numerical analysis of the propagation of a wave along a synfire chain. We show that global inhibition may prevent spurious synchronization among synfire chains. We further show that selecting which synfire chains may synchronize to which others may be improved by including inhibitory neurons in the synfire pools. Finally, we show that in a hierarchical system of synfire chains, a part-binding problem may be resolved, and that such a system readily demonstrates the property of priming. We compare the properties of our system with the general requirements for neural networks that demonstrate compositionality.
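The wave propagation this scheme depends on can be illustrated with a toy feedforward chain of pools (sizes, connection probabilities, and thresholds here are illustrative, not the paper's): with strong interlayer connectivity the activity packet survives to the last pool, with weak connectivity it dies out within a few pools.

```python
import random

def synfire_wave(p_connect, n_layers=10, n_cells=100, threshold=30, seed=1):
    """Propagate a packet through a chain of pools: a cell fires if at least
    `threshold` spikes from the previous layer's active cells reach it."""
    rng = random.Random(seed)
    active = n_cells                      # fully ignited first pool
    for _ in range(n_layers - 1):
        firing = 0
        for _cell in range(n_cells):
            received = sum(1 for _ in range(active) if rng.random() < p_connect)
            if received >= threshold:
                firing += 1
        active = firing
        if active == 0:                   # the wave has died
            break
    return active

strong = synfire_wave(0.5)   # wave survives to the last pool
weak = synfire_wave(0.2)     # wave fades within a few pools
```

This all-or-none propagation regime is what makes the timing of a surviving wave sharp enough for the synchronization-based binding the authors analyze.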

118 citations


Journal ArticleDOI
TL;DR: It is argued that this procedure provides a ‘natural partitioning’ of ongoing brain dynamics into topographically distinct synchronous epochs which may be integral to the brain's adaptive function.
Abstract: The study of synchronous oscillations in neural systems is a very active area of research. However, cognitive function may depend more crucially upon a dynamic alternation between synchronous and desynchronous activity rather than synchronous behaviour per se. The principal aim of this study is to develop and validate a novel method of quantifying this complex process. The method permits a direct mapping of phase synchronous dynamics and desynchronizing bursts in the spatial and temporal domains. Two data sets are analyzed: Numeric data from a model of a sparsely coupled neural cell assembly and experimental data consisting of scalp-recorded EEG from 40 human subjects. In the numeric data, the approach enables the demonstration of complex relationships between cluster size and temporal duration that cannot be detected with other methods. Dynamic patterns of phase-clustering and desynchronization are also demonstrated in the experimental data. It is further shown that in a significant proportion of the recordings, the pattern of dynamics exhibits nonlinear structure. We argue that this procedure provides a 'natural partitioning' of ongoing brain dynamics into topographically distinct synchronous epochs which may be integral to the brain's adaptive function. In particular, the character of transitions between consecutive synchronous epochs may reflect important aspects of information processing and cognitive flexibility.
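A standard building block for quantifying the phase synchrony mapped here is the phase-locking value (PLV) between two signals' instantaneous phases. The synthetic sketch below (the paper's actual pipeline works on EEG and uses its own estimator) contrasts a synchronous epoch with a desynchronized one:

```python
import math
import random

def plv(phase_a, phase_b):
    """Phase-locking value: modulus of the mean unit phasor of the phase
    difference; 1 = perfect locking, near 0 = no consistent phase relation."""
    n = len(phase_a)
    c = sum(math.cos(a - b) for a, b in zip(phase_a, phase_b)) / n
    s = sum(math.sin(a - b) for a, b in zip(phase_a, phase_b)) / n
    return math.hypot(c, s)

rng = random.Random(0)
base = [0.2 * k for k in range(2000)]                       # common oscillation
locked = [p + rng.gauss(0.0, 0.3) for p in base]            # synchronous epoch
unlocked = [rng.uniform(0.0, 2 * math.pi) for _ in base]    # desynchronized epoch

plv_sync = plv(base, locked)
plv_desync = plv(base, unlocked)
```

Computed in sliding windows across channel pairs, such a measure yields exactly the kind of alternation between synchronous epochs and desynchronizing bursts that the authors partition.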

112 citations


Journal ArticleDOI
TL;DR: A realistic model of a hippocampal CA1 pyramidal neuron is used to show a major role for the hyperpolarization-activated current, Ih, in regulating the spike probability of a neuron when independent synaptic inputs are activated with different degrees of synchronization and at different distances from the soma.
Abstract: The active dendritic conductances shape the input-output properties of many principal neurons in different brain regions, and the various ways in which they regulate neuronal excitability need to be investigated to better understand their functional consequences. Using a realistic model of a hippocampal CA1 pyramidal neuron, we show a major role for the hyperpolarization-activated current, Ih, in regulating the spike probability of a neuron when independent synaptic inputs are activated with different degrees of synchronization and at different distances from the soma. The results allowed us to make the experimentally testable prediction that the Ih in these neurons is needed to reduce neuronal excitability selectively for distal unsynchronized, but not for synchronized, inputs.

75 citations


Journal ArticleDOI
TL;DR: A computational model shows that inhibitory processes in the olfactory bulb suffice to generate the behaviorally observed inverse relationship between two odorants' perceptual similarities and the perceptual similarities between either of these same odorants and their binary mixture.
Abstract: Contrast enhancement via lateral inhibitory circuits is a common mechanism in sensory systems. We here employ a computational model to show that, in addition to shaping experimentally observed molecular receptive fields in the olfactory bulb, functionally lateral inhibitory circuits can also mediate the elemental and configurational properties of odor mixture perception. To the extent that odor perception can be predicted by slow-timescale neural activation patterns in the olfactory bulb, and to the extent that interglomerular inhibitory projections map onto a space of odorant similarity, the model shows that these inhibitory processes in the olfactory bulb suffice to generate the behaviorally observed inverse relationship between two odorants' perceptual similarities and the perceptual similarities between either of these same odorants and their binary mixture.

73 citations


Journal ArticleDOI
TL;DR: Common patterns of connectivity between the model and biological networks suggest new functions for previously identified connections in the C. elegans nervous system, and it is shown that feedback regulates the latency between sensory input and behavior.
Abstract: The anatomical connectivity of the nervous system of the nematode Caenorhabditis elegans has been almost completely described, but determination of the neurophysiological basis of behavior in this system is just beginning. Here we used an optimization algorithm to search for patterns of connectivity sufficient to compute the sensorimotor transformation underlying C. elegans chemotaxis, a simple form of spatial orientation behavior in which turning probability is modulated by the rate of change of chemical concentration. Optimization produced differentiator networks capable of simulating chemotaxis. A surprising feature of these networks was inhibitory feedback connections on all neurons. Further analysis showed that feedback regulates the latency between sensory input and behavior. Common patterns of connectivity between the model and biological networks suggest new functions for previously identified connections in the C. elegans nervous system.
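The sensorimotor transformation being optimized here, turning probability modulated by the rate of change of concentration, can be sketched as a one-dimensional "pirouette" walker (gradient shape, step sizes, and turn probabilities are illustrative, not the paper's network):

```python
import random

def chemotax(steps=3000, start=60, seed=2):
    """Biased random walk: the walker differentiates the concentration
    C(x) = -|x| and reverses direction mostly when dC/dt < 0."""
    rng = random.Random(seed)
    x, heading = start, -1
    c_prev = -abs(x)
    for _ in range(steps):
        x += heading
        c = -abs(x)
        dc = c - c_prev                    # crude temporal differentiator
        c_prev = c
        p_turn = 0.5 if dc < 0 else 0.02   # turn mostly when things get worse
        if rng.random() < p_turn:
            heading = -heading
    return x

final = chemotax()
```

Starting 60 units from the concentration peak at x = 0, the walker ends up hovering near it; the differentiator (here a one-step difference) is the computation the optimized networks implement, and the feedback loops found by the optimization set how quickly such a difference signal reaches the turning decision.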

73 citations


Journal ArticleDOI
TL;DR: The present results suggest that the essential mechanism required to explain a range of data on in vivo neocortical activity is the conductance-based synapse and that the particular model of spike initiation used is not crucial.
Abstract: A model of in vivo-like neocortical activity is studied analytically in relation to experimental data and other models in order to understand the essential mechanisms underlying such activity. The model consists of a network of sparsely connected excitatory and inhibitory integrate-and-fire (IF) neurons with conductance-based synapses. It is shown that the model produces values for five quantities characterizing in vivo activity that are in agreement with both experimental ranges and a computer-simulated Hodgkin-Huxley model adapted from the literature (Destexhe et al. (2001) Neurosci. 107(1): 13–24). The analytical model builds on a study by Brunel (2000) (J. Comput. Neurosci. 8: 183–208), which used IF neurons with current-based synapses, and therefore does not account for the full range of experimental data. The present results suggest that the essential mechanism required to explain a range of data on in vivo neocortical activity is the conductance-based synapse and that the particular model of spike initiation used is not crucial. Thus the IF model with conductance-based synapses may provide a basis for the analytical study of the ‘large, fluctuating synaptic conductance state’ typical of neocortical neurons in vivo.
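The model class analyzed in this paper, integrate-and-fire neurons with conductance-based synapses, reduces at the single-cell level to the update below. Even one neuron under Poisson conductance bombardment (rates and weights illustrative, not the paper's network values) exhibits the fluctuation-driven, high-conductance firing regime in question:

```python
import random

def run_cond_if(T=2000.0, dt=0.1, seed=3):
    """Conductance-based integrate-and-fire neuron under Poisson bombardment."""
    rng = random.Random(seed)
    EL, Ee, Ei = -65.0, 0.0, -80.0     # reversal potentials (mV)
    gL, C = 0.05, 1.0                  # leak conductance, capacitance
    tau_e, tau_i = 5.0, 10.0           # synaptic decay times (ms)
    rate_e, rate_i = 1.0, 0.5          # synaptic event rates (per ms)
    we, wi = 0.015, 0.02               # conductance increments per event
    V, ge, gi = EL, 0.0, 0.0
    spikes = []
    for i in range(int(T / dt)):
        if rng.random() < rate_e * dt:
            ge += we
        if rng.random() < rate_i * dt:
            gi += wi
        ge -= dt * ge / tau_e
        gi -= dt * gi / tau_i
        dV = (-gL * (V - EL) - ge * (V - Ee) - gi * (V - Ei)) / C
        V += dt * dV
        if V >= -50.0:                 # threshold
            spikes.append(i * dt)
            V = -60.0                  # reset
    return spikes

spike_times = run_cond_if()
```

Note that the effective membrane time constant is C divided by the *total* conductance, so it shrinks as synaptic bombardment grows; this is the conductance effect the analysis identifies as essential, and it is absent by construction in current-based synapses.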

64 citations


Journal ArticleDOI
TL;DR: It is suggested that noise associated with finite populations of Na channels could evoke very different effects on the intrinsic variability of spiking and bursting discharges, depending on a biological neuron's complexity and applied current-dependent state.
Abstract: We explore the effects of stochastic sodium (Na) channel activation on the variability and dynamics of spiking and bursting in a model neuron. The complete model segregates Hodgkin-Huxley-type currents into two compartments, and undergoes applied current-dependent bifurcations between regimes of periodic bursting, chaotic bursting, and tonic spiking. Noise is added to simulate variable, finite sizes of the population of Na channels in the fast spiking compartment. During tonic firing, Na channel noise causes variability in interspike intervals (ISIs). The variance, as well as the sensitivity to noise, depend on the model's biophysical complexity. They are smallest in an isolated spiking compartment; increase significantly upon coupling to a passive compartment; and increase again when the second compartment also includes slow-acting currents. In this full model, sufficient noise can convert tonic firing into bursting. During bursting, the actions of Na channel noise are state-dependent. The higher the noise level, the greater the jitter in spike timing within bursts. The noise makes the burst durations of periodic regimes variable, while decreasing burst duration and variance in a chaotic regime. Na channel noise blurs the sharp transitions of spike time and burst length seen at the bifurcations of the noise-free model. Close to such a bifurcation, the burst behaviors of previously periodic and chaotic regimes become essentially indistinguishable. We discuss biophysical mechanisms, dynamical interpretations and physiological implications. We suggest that noise associated with finite populations of Na channels could evoke very different effects on the intrinsic variability of spiking and bursting discharges, depending on a biological neuron's complexity and applied current-dependent state. We find that simulated channel noise in the model neuron qualitatively replicates the observed variability in burst length and interburst interval in an isolated biological bursting neuron.
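The core premise, that a finite Na-channel population makes gating stochastic with fluctuations shrinking as the population grows, can be checked with a two-state Markov channel ensemble (opening and closing probabilities illustrative):

```python
import random
from statistics import pstdev

def open_fraction_trace(n_channels, steps=1000, p_open=0.2, p_close=0.2, seed=4):
    """Simulate n two-state channels; return the open-fraction time series."""
    rng = random.Random(seed)
    n_open = n_channels // 2
    trace = []
    for _ in range(steps):
        opening = sum(1 for _ in range(n_channels - n_open) if rng.random() < p_open)
        closing = sum(1 for _ in range(n_open) if rng.random() < p_close)
        n_open += opening - closing
        trace.append(n_open / n_channels)
    return trace

sigma_small = pstdev(open_fraction_trace(16))
sigma_large = pstdev(open_fraction_trace(1600))
```

With equal opening and closing rates the equilibrium open probability is 0.5 and the fluctuation amplitude scales roughly as 1/sqrt(N), so the small ensemble is about ten times noisier; injected into a spike generator, exactly this kind of conductance noise produces the ISI jitter and state-dependent burst variability described above.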

58 citations


Journal ArticleDOI
TL;DR: A phase oscillator model for spiking neurons in locus coeruleus, a brain nucleus involved in modulating cognitive performance, shows that post-stimulus response is elevated in populations with lower spike rates, and that shorter stimuli preferentially cause depressed post-activation spiking.
Abstract: We model spiking neurons in locus coeruleus (LC), a brain nucleus involved in modulating cognitive performance, and compare with recent experimental data. Extracellular recordings from LC of monkeys performing target detection and selective attention tasks show varying responses dependent on stimuli and performance accuracy. From membrane voltage and ion channel equations, we derive a phase oscillator model for LC neurons. Average spiking probabilities of a pool of cells over many trials are then computed via a probability density formulation. These show that: (1) Post-stimulus response is elevated in populations with lower spike rates; (2) Responses decay exponentially due to noise and variable pre-stimulus spike rates; and (3) Shorter stimuli preferentially cause depressed post-activation spiking. These results allow us to propose mechanisms for the different LC responses observed across behavioral and task conditions, and to make explicit the role of baseline firing rates and the duration of task-related inputs in determining LC response.

Journal ArticleDOI
TL;DR: This study compares the neural coding capabilities of tonically firing and bursting electroreceptor model neurons using information theoretic measures and shows that both bursting and tonic firing model neurons efficiently transmit information about the stimulus.
Abstract: It is well known that some neurons tend to fire packets of action potentials followed by periods of quiescence (bursts) while others within the same stage of sensory processing fire in a tonic manner. However, the respective computational advantages of bursting and tonic neurons for encoding time varying signals largely remain a mystery. Weakly electric fish use cutaneous electroreceptors to convey information about sensory stimuli and it has been shown that some electroreceptors exhibit bursting dynamics while others do not. In this study, we compare the neural coding capabilities of tonically firing and bursting electroreceptor model neurons using information theoretic measures. We find that both bursting and tonically firing model neurons efficiently transmit information about the stimulus. However, the decoding mechanisms that must be used for each differ greatly: a non-linear decoder would be required to extract all the available information transmitted by the bursting model neuron whereas a linear one might suffice for the tonically firing model neuron. Further investigations using stimulus reconstruction techniques reveal that, unlike the tonically firing model neuron, the bursting model neuron does not encode the detailed time course of the stimulus. A novel measure of feature detection reveals that the bursting neuron signals certain stimulus features. Finally, we show that feature extraction and stimulus estimation are mutually exclusive computations occurring in bursting and tonically firing model neurons, respectively. Our results therefore suggest that stimulus estimation and feature extraction might be parallel computations in certain sensory systems rather than being sequential as has been previously proposed.

Journal ArticleDOI
TL;DR: This approach provides a direct link between intracellular recordings in vivo and the design of models consistent with the dynamics and spectral structure of synaptic noise, as well as to estimate equivalent stochastic models.
Abstract: In cortical neurons, synaptic "noise" is caused by the nearly random release of thousands of synapses. Few methods are presently available to analyze synaptic noise and deduce properties of the underlying synaptic inputs. We focus here on the power spectral density (PSD) of several models of synaptic noise. We examine different classes of analytically solvable kinetic models for synaptic currents, such as the "delta kinetic models," which use Dirac delta functions to represent the activation of the ion channel. We first show that, for this class of kinetic models, one can obtain an analytic expression for the PSD of the total synaptic conductance and derive equivalent stochastic models with only a few variables. This yields a method for constraining models of synaptic currents by analyzing voltage-clamp recordings of synaptic noise. Second, we show that a similar approach can be followed for the PSD of the membrane potential (Vm) through an effective-leak approximation. Third, we show that this approach is also valid for inputs distributed in dendrites. In this case, the frequency scaling of the Vm PSD is preserved, suggesting that this approach may be applied to intracellular recordings of real neurons. In conclusion, using simple mathematical tools, we show that Vm recordings can be used to constrain kinetic models of synaptic currents, as well as to estimate equivalent stochastic models. This approach, therefore, provides a direct link between intracellular recordings in vivo and the design of models consistent with the dynamics and spectral structure of synaptic noise.
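For the simplest delta-kinetic case, Poisson impulses each adding w to a conductance that decays with time constant tau, the first two moments have closed forms by Campbell's theorem (mean = r·w·tau, variance = r·w²·tau/2), and the PSD is Lorentzian, proportional to 1/(1 + (2πf·tau)²). A quick numeric check of the moments (parameters illustrative):

```python
import random
from statistics import fmean, pvariance

def shot_noise(rate=0.5, w=1.0, tau=5.0, dt=0.1, T=20000.0, seed=5):
    """Exponentially decaying conductance kicked by Poisson impulses."""
    rng = random.Random(seed)
    g, trace = 0.0, []
    for _ in range(int(T / dt)):
        if rng.random() < rate * dt:   # Poisson arrival in this time bin
            g += w
        g -= dt * g / tau              # exponential decay
        trace.append(g)
    return trace

g = shot_noise()
mean_g, var_g = fmean(g), pvariance(g)
# Campbell's theorem: mean = 0.5 * 1 * 5 = 2.5, variance = 0.5 * 1 * 5 / 2 = 1.25
```

Fitting the measured PSD's corner frequency and total power to these expressions is the sense in which voltage-clamp noise recordings constrain the kinetic model's rate, amplitude, and decay parameters.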

Journal ArticleDOI
TL;DR: This computational study is based on Type I and II implementations of the Morris-Lecar model, which concerns neurons, such as those in the auditory or electrosensory system, which encode band-limited amplitude modulations of a periodic carrier signal, and which fire at random cycles yet preferred phases of this carrier.
Abstract: We consider the dependence of information transfer by neurons on the Type I vs. Type II classification of their dynamics. Our computational study is based on Type I and II implementations of the Morris-Lecar model. It mainly concerns neurons, such as those in the auditory or electrosensory system, which encode band-limited amplitude modulations of a periodic carrier signal, and which fire at random cycles yet preferred phases of this carrier. We first show that the Morris-Lecar model with additive broadband noise ("synaptic noise") can exhibit such firing patterns with either Type I or II dynamics, with or without amplitude modulations of the carrier. We then compare the encoding of band-limited random amplitude modulations for both dynamical types. The comparison relies on a parameter calibration that closely matches firing rates for both models across a range of parameters. In the absence of synaptic noise, Type I performs slightly better than Type II, and its performance is optimal for perithreshold signals. However, Type II performs well over a slightly larger range of inputs, and this range lies mostly in the subthreshold region. Further, Type II performs marginally better than Type I when synaptic noise, which yields more realistic baseline firing patterns, is present in both models. These results are discussed in terms of the tuning and phase locking properties of the models with deterministic and stochastic inputs.
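The Morris-Lecar model used in this comparison is compact enough to state in full. Below is a minimal Euler integration with a standard Type I parameter set (the common Rinzel-Ermentrout values; the paper's calibrated parameters will differ):

```python
import math

def morris_lecar(I, T=1000.0, dt=0.05):
    """Type I Morris-Lecar neuron; returns number of spikes (0 mV upcrossings)."""
    C, gL, EL = 20.0, 2.0, -60.0
    gCa, ECa = 4.0, 120.0
    gK, EK = 8.0, -84.0
    V1, V2, V3, V4, phi = -1.2, 18.0, 12.0, 17.4, 1.0 / 15.0
    V, w = -60.0, 0.0
    spikes, above = 0, False
    for _ in range(int(T / dt)):
        m_inf = 0.5 * (1.0 + math.tanh((V - V1) / V2))   # instantaneous Ca gate
        w_inf = 0.5 * (1.0 + math.tanh((V - V3) / V4))   # K gate steady state
        tau_w = 1.0 / math.cosh((V - V3) / (2.0 * V4))   # K gate time scale
        dV = (I - gL * (V - EL) - gCa * m_inf * (V - ECa) - gK * w * (V - EK)) / C
        dw = phi * (w_inf - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V > 0.0 and not above:
            spikes += 1
        above = V > 0.0
    return spikes

quiet = morris_lecar(I=20.0)    # below rheobase: no repetitive firing
firing = morris_lecar(I=80.0)   # above rheobase: tonic spiking
```

For this Type I set the firing rate rises continuously from zero at rheobase; swapping in a Type II parameterization (Hopf onset, nonzero minimum frequency) is the contrast whose coding consequences the paper quantifies.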

Journal ArticleDOI
TL;DR: This work shows how a multi-compartmental model of a cerebellar Purkinje cell can learn to recognise temporal parallel fibre activity patterns by adapting latencies of calcium responses after activation of metabotropic glutamate receptors (mGluRs).
Abstract: It has been suggested that information in the brain is encoded in temporal spike patterns which are decoded by a combination of time delays and coincidence detection. Here, we show how a multi-compartmental model of a cerebellar Purkinje cell can learn to recognise temporal parallel fibre activity patterns by adapting latencies of calcium responses after activation of metabotropic glutamate receptors (mGluRs). In each compartment of our model, the mGluR signalling cascade is represented by a set of differential equations that reflect the underlying biochemistry. Phosphorylation of the mGluRs changes the concentration of receptors which are available for activation by glutamate and thereby adjusts the time delay between mGluR stimulation and voltage response. The adaptation of a synaptic delay as opposed to a weight represents a novel non-Hebbian learning mechanism that can also implement the adaptive timing of the classically conditioned eye-blink response.

Journal ArticleDOI
TL;DR: It is shown that each neuron's discharge rate should increase quadratically with the stimulus and that statistically independent neural outputs provide optimal coding, and that only cooperative populations can achieve this condition in an informationally effective way.
Abstract: We create a framework based on Fisher information for determining the most effective population coding scheme for representing a continuous-valued stimulus attribute over its entire range. Using this scheme, we derive optimal single- and multi-neuron rate codes for homogeneous populations using several statistical models frequently used to describe neural data. We show that each neuron's discharge rate should increase quadratically with the stimulus and that statistically independent neural outputs provide optimal coding. Only cooperative populations can achieve this condition in an informationally effective way.
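The quadratic-rate result can be illustrated for a single Poisson neuron, whose Fisher information about s (per unit observation time) is f'(s)²/f(s): if the rate grows quadratically, f(s) = (a·s + b)², the information is the constant 4a² over the whole range, i.e. equally accurate coding everywhere. A numeric check (a and b illustrative):

```python
def fisher_poisson(f, s, h=1e-5):
    """Fisher information of a Poisson neuron with tuning curve f at stimulus s,
    estimated with a central finite difference."""
    df = (f(s + h) - f(s - h)) / (2.0 * h)
    return df * df / f(s)

a, b = 2.0, 1.0
rate = lambda s: (a * s + b) ** 2            # discharge rate quadratic in s
info = [fisher_poisson(rate, s) for s in (0.5, 1.0, 2.0, 4.0)]
# analytic value: 4 * a**2 = 16, independent of s
```

Flat Fisher information across the stimulus range is precisely the "entire range" optimality criterion the framework is built around.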

Journal ArticleDOI
TL;DR: This work shows, in a simplified network consisting of an oscillator inhibiting a follower neuron, how the interaction between synaptic depression and a transient potassium current in the follower neuron determines the activity phase of this neuron.
Abstract: Many inhibitory rhythmic networks produce activity in a range of frequencies. The relative phase of activity between neurons in these networks is often a determinant of the network output. This relative phase is determined by the interaction between synaptic inputs to the neurons and their intrinsic properties. We show, in a simplified network consisting of an oscillator inhibiting a follower neuron, how the interaction between synaptic depression and a transient potassium current in the follower neuron determines the activity phase of this neuron. We derive a mathematical expression to determine at what phase of the oscillation the follower neuron becomes active. This expression can be used to understand which parameters determine the phase of activity of the follower as the frequency of the oscillator is changed. We show that in the presence of synaptic depression, there can be three distinct frequency intervals, in which the phase of the follower neuron is determined by different sets of parameters. Alternatively, when the synapse is not depressing, only one set of parameters determines the phase of activity at all frequencies.
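The frequency dependence at the heart of this analysis can be seen in a standard depressing-synapse recurrence: each presynaptic spike consumes a fraction u of the available resources, which recover toward 1 with time constant tau_rec between spikes. Iterating to steady state shows per-spike efficacy falling with drive frequency (parameters illustrative, not fit to the oscillator-follower circuit):

```python
import math

def steady_state_strength(freq_hz, u=0.5, tau_rec=500.0, n_iter=200):
    """Steady-state per-spike efficacy u*d of a depressing synapse driven
    periodically at freq_hz (tau_rec in ms)."""
    T = 1000.0 / freq_hz                   # interspike interval (ms)
    d = 1.0                                # available resources
    for _ in range(n_iter):
        # spike uses fraction u, then resources recover for time T
        d = 1.0 - (1.0 - d * (1.0 - u)) * math.exp(-T / tau_rec)
    return u * d

strengths = [steady_state_strength(f) for f in (1.0, 5.0, 10.0, 20.0, 40.0)]
```

Because inhibition weakens at high frequency while the follower's transient potassium current does not, the balance that sets the follower's firing phase shifts with frequency, which is why distinct parameter sets dominate in distinct frequency intervals.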

Journal ArticleDOI
TL;DR: A biophysical model for the interactions between bursting (B) cells and nonbursting (NB) cells in the procerebral lobe of Limax is developed and tested and a novel calcium current is suggested to explain the effects of nitric oxide (NO) on the lobe.
Abstract: A biophysical model for the interactions between bursting (B) cells and nonbursting (NB) cells in the procerebral lobe of Limax is developed and tested. Phase-sensitivity of the NB cells is exhibited due to the strong inhibition from the rhythmically bursting B cells. Electrical and chemical junctions coupled with a parameter gradient lead to sustained periodic waves in the lobe. Excitatory interactions between the NB cells, which rarely fire, lead to stimulus evoked synchrony in the lobe oscillations. A novel calcium current is suggested to explain the effects of nitric oxide (NO) on the lobe. Gap junctions are shown both experimentally and through simulations to be required for the oscillating field potentials.

Journal ArticleDOI
TL;DR: It is speculated that combined membrane and stochastic resonances have physiological utility in coupling synaptic activity to preferred firing frequency and in network synchronization under noise.
Abstract: We examined the interactions of subthreshold membrane resonance and stochastic resonance using whole-cell patch clamp recordings in thalamocortical neurons of rat brain slices, as well as with a Hodgkin-Huxley-type mathematical model of thalamocortical neurons. The neurons exhibited the subthreshold resonance when stimulated with small amplitude sine wave currents of varying frequency, and stochastic resonance when noise was added to sine wave inputs. Stochastic resonance was manifest as a maximum in signal-to-noise ratio of output response to subthreshold periodic input combined with noise. Stochastic resonance in conjunction with subthreshold resonance resulted in action potential patterns that showed frequency selectivity for periodic inputs. Stochastic resonance was maximal near subthreshold resonance frequency and a high noise level was required for detection of high frequency signals. We speculate that combined membrane and stochastic resonances have physiological utility in coupling synaptic activity to preferred firing frequency and in network synchronization under noise.
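Stochastic resonance on its own can be demonstrated with the simplest threshold unit: a subthreshold sinusoid plus Gaussian noise, with phase locking of threshold crossings best at intermediate noise (all numbers illustrative; the thalamocortical neurons studied here add a genuine subthreshold membrane resonance to this picture):

```python
import math
import random

def crossing_phase_locking(sigma, periods=200, steps_per_period=100, seed=6):
    """Drive a unit-threshold detector with a subthreshold sine plus noise;
    return (number of upward crossings, vector strength of their phases)."""
    rng = random.Random(seed)
    phases, prev = [], 0.0
    for k in range(periods * steps_per_period):
        phi = 2.0 * math.pi * (k % steps_per_period) / steps_per_period
        x = 0.8 * math.sin(phi) + rng.gauss(0.0, sigma)
        if x >= 1.0 and prev < 1.0:        # upward threshold crossing
            phases.append(phi)
        prev = x
    if not phases:
        return 0, 0.0
    c = sum(math.cos(p) for p in phases) / len(phases)
    s = sum(math.sin(p) for p in phases) / len(phases)
    return len(phases), math.hypot(c, s)

n0, vs0 = crossing_phase_locking(0.0)    # no noise: the signal never crosses
n1, vs1 = crossing_phase_locking(0.3)    # moderate noise: phase-locked spikes
n2, vs2 = crossing_phase_locking(3.0)    # strong noise: crossings, weak locking
```

The paper's point is that when the membrane itself is resonant, this noise-assisted detection becomes frequency selective, peaking near the subthreshold resonance frequency.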

Journal ArticleDOI
TL;DR: The study used a computational approach to identify combinations of synaptic input timing and strength superimposed on a variety of active dendritic conductances that could evoke similar levels of motor unit synchronization in model motor neurons and suggested that the experimentally observed correlation between discharge variability and synchronization is caused by an increase in fast inward ionic conductances in the dendrites.
Abstract: The study used a computational approach to identify combinations of synaptic input timing and strength superimposed on a variety of active dendritic conductances that could evoke similar levels of motor unit synchronization in model motor neurons. Two motor neurons with low recruitment thresholds but different passive properties were modeled using GENESIS software. The timing and strength of synaptic inputs and the density of dendritic ion channels were optimized with a genetic algorithm to produce a set of target discharge times. The target times were taken from experimental recordings made in a human subject and had the synchronization characteristics that are commonly observed in hand muscles. The main finding was that the two parameters with the highest association to output synchrony were the ratio of inward-to-outward ionic conductances (r = 0.344; P = 0.003) and the degree of correlation in inhibitory inputs (r = 0.306; P = 0.009). Variation in the amount of correlation in the excitatory input was not positively correlated with variation in output synchrony. Further, the variability in discharge rate of the model neurons was positively correlated with the density of N-type calcium channels in the dendritic compartments (r = 0.727; P < 0.001 and r = 0.533; P < 0.001 for the two cells). This result suggests that the experimentally observed correlation between discharge variability and synchronization is caused by an increase in fast inward ionic conductances in the dendrites. Given the moderate level of correlation between output synchrony and each of the model parameters, especially at moderate levels of synchrony (E < 0.09 and CIS < 1.0), the results suggest caution in ascribing mechanisms to observations of motor unit synchronization.

Journal ArticleDOI
TL;DR: A novel iterative algorithm is introduced that allows one to find stimuli that are reliably represented by the sensory system under study and shows that the optimal stimuli often exhibit pronounced sub-threshold periods that are interrupted by short, yet intense pulses.
Abstract: Shaped by evolutionary processes, sensory systems often represent behaviorally relevant stimuli with higher fidelity than other stimuli. The stimulus dependence of neural reliability could therefore provide an important clue in a search for relevant sensory signals. We explore this relation and introduce a novel iterative algorithm that allows one to find stimuli that are reliably represented by the sensory system under study. To assess the quality of a neural representation, we use stimulus reconstruction methods. The algorithm starts with the presentation of an initial stimulus (e.g. white noise). The evoked spike train is recorded and used to reconstruct the stimulus online. Within a closed-loop setup, this reconstruction is then played back to the sensory system. Iterating this procedure, the newly generated stimuli can be better and better reconstructed. We demonstrate the feasibility of this method by applying it to auditory receptor neurons in locusts. Our data show that the optimal stimuli often exhibit pronounced sub-threshold periods that are interrupted by short, yet intense pulses. Similar results are obtained for simple model neurons and suggest that these stimuli are encoded with high reliability by a large class of neurons.
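The skeleton of the closed-loop iteration can be sketched in a few lines. The real algorithm records from a neuron and reconstructs with optimal linear filters online; the toy below (threshold "neuron" and pulse "reconstruction" are stand-ins) only shows how replaying reconstructions drives the stimulus toward a self-consistent, pulse-like form:

```python
import random

def spikes_of(stim, theta=0.7):
    """Toy sensory neuron: spikes wherever the stimulus exceeds threshold."""
    return [i for i, v in enumerate(stim) if v > theta]

def reconstruct(spikes, length, high=0.9, low=0.1):
    """Toy read-out: a pulse at each spike time, quiet baseline elsewhere."""
    stim = [low] * length
    for i in spikes:
        stim[i] = high
    return stim

rng = random.Random(7)
stim = [rng.random() for _ in range(500)]     # initial white-noise stimulus
history = []
for _ in range(5):
    sp = spikes_of(stim)                      # record response
    history.append(sp)
    stim = reconstruct(sp, len(stim))         # play reconstruction back
```

Here the loop reaches a fixed point immediately; in the experiments, iterating similarly concentrates stimulus energy into the short, intense pulses the neurons encode most reliably.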

Journal ArticleDOI
TL;DR: This temporal model can be combined with a population model for average rate to derive a spatio-temporal description of the responses of somatosensory afferents.
Abstract: Rapidly-adapting (RA) mechanoreceptive fibers, which are associated with Meissner corpuscles, mediate one component of the neural information that contributes to the sense of touch. Responses of cat RA fibers subject to 40-Hz sinusoidal stimulation were modeled as a Markov process. Since an RA fiber generates one, two or no spikes in each cycle of the stimulus, the fiber's activity was considered to exist in one of these three possible states. By analyzing empirically generated spike trains, the probability of each state and the probabilities of transitions between the three states were found as a function of the average firing rate of the fiber. The average firing rate depends on the stimulus amplitude. In addition, the phase of each spike with respect to the stimulus cycle was represented by a Laplace distribution. Based on empirical data, the mean and the standard deviation of this distribution decrease as the stimulus amplitude is increased. The entire stochastic model was implemented on a computer to simulate the responses of RA fibers. The post-stimulus time, inter-spike interval and period histograms generated from the simulations match the histograms obtained from the empirical data well as quantified by relative errors. This temporal model can be combined with a population model for average rate to derive a spatio-temporal description of the responses of somatosensory afferents. The effects of changing the stimulation frequency are discussed.
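The three-state Markov scheme described above is straightforward to simulate. The transition matrix below is illustrative (the paper estimates these probabilities from recorded spike trains as a function of firing rate); the simulation's empirical state frequencies can be checked against the matrix's stationary distribution:

```python
import random

def simulate_ra(P, n_cycles=20000, seed=8):
    """Per-cycle state sequence (0, 1, or 2 spikes per stimulus cycle) of a
    three-state Markov model; returns empirical state frequencies."""
    rng = random.Random(seed)
    state, counts = 0, [0, 0, 0]
    for _ in range(n_cycles):
        r, cum, nxt_state = rng.random(), 0.0, 2
        for nxt, p in enumerate(P[state]):
            cum += p
            if r < cum:
                nxt_state = nxt
                break
        state = nxt_state
        counts[state] += 1
    return [c / n_cycles for c in counts]

# Illustrative transition matrix (rows = current state 0/1/2 spikes per cycle):
P = [[0.60, 0.35, 0.05],
     [0.30, 0.50, 0.20],
     [0.10, 0.40, 0.50]]

freq = simulate_ra(P)

# Stationary distribution by power iteration, for comparison:
pi = [1.0 / 3.0] * 3
for _ in range(200):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]
```

Layering Laplace-distributed phase jitter onto the spikes emitted in states 1 and 2 would complete the temporal part of the model as described.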

Journal ArticleDOI
Don H. Johnson1
TL;DR: This work applies the new theory of information processing to determine the fidelity limits of simple population structures to encode stimulus features and shows that noncooperative populations always exhibit positively correlated responses and that as population size increases, they perfectly represent the information conveyed by their inputs regardless of the individual neuron's coding scheme.
Abstract: Researchers studying neural coding have speculated that populations of neurons would more effectively represent the stimulus if the neurons "cooperated": by interacting through lateral connections, the neurons would process and represent information better than if they functioned independently. We apply our new theory of information processing to determine the fidelity limits of simple population structures in encoding stimulus features. We focus on noncooperative populations, which have no lateral connections. We show that they always exhibit positively correlated responses and that as population size increases, they perfectly represent the information conveyed by their inputs regardless of the individual neuron's coding scheme. Cooperative populations, which do have lateral connections, can, depending on the nature of the connections, perform better or worse than their noncooperative counterparts. We further show that common notions of synergy fail to capture the level of cooperation and to reflect the information processing properties of populations.

Journal ArticleDOI
TL;DR: The paper demonstrates that wave subspace caricatures from the three cortical preparations have qualitative similarities, and proposes a robust technique for extracting wave structure from experimental data by calculating "wave subspaces" from the KL decomposition of the data set.
Abstract: Waves have long been thought to be a fundamental mechanism for communicating information within a medium and are widely observed in biological systems. However, a quantitative analysis of biological waves is confounded by the variability and complexity of the response. This paper proposes a robust technique for extracting wave structure from experimental data by calculating “wave subspaces” from the KL decomposition of the data set. If a wave subspace contains a substantial portion of the data set energy during a particular time interval, one can deduce the structure of the wave and potentially isolate its information content. This paper uses the wave subspace technique to extract and compare wave structure in data from three different preparations of the turtle visual cortex. The paper demonstrates that wave subspace caricatures from the three cortical preparations have qualitative similarities. In the numerical model, where information about the underlying dynamics is available, wave subspace landmarks are related to activation and changes in behavior of other dynamic variables besides membrane potential.
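The Karhunen-Loeve (KL) decomposition underlying the wave subspace technique is, in practice, the singular value decomposition of the space-by-time data matrix. The sketch below applies it to synthetic traveling-wave data; the data, subspace dimension, and energy criterion are illustrative assumptions, not the paper's turtle cortex recordings.

```python
import numpy as np

# Synthetic data: a traveling wave across 32 recording sites plus noise.
rng = np.random.default_rng(2)
n_sites, n_times = 32, 400
x = np.arange(n_sites)[:, None]
t = np.arange(n_times)[None, :]
data = np.sin(2 * np.pi * (0.05 * x - 0.01 * t)) + 0.2 * rng.normal(size=(n_sites, n_times))

# KL decomposition via the SVD of the data matrix.
U, s, Vt = np.linalg.svd(data, full_matrices=False)

# A traveling sinusoid spans a 2-D subspace (a sine and a cosine spatial
# mode), so take the two leading KL modes as the candidate "wave subspace".
wave_subspace = U[:, :2]

# Fraction of total energy captured by the subspace: if it is large during
# a time interval, the activity there is well described as a wave.
energy_fraction = np.sum(s[:2] ** 2) / np.sum(s ** 2)
```

The design choice mirrors the abstract: one deduces wave structure by checking whether a low-dimensional subspace contains a substantial portion of the data-set energy.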

Journal ArticleDOI
TL;DR: A model derived from statistical learning theory and using the biological model of Thorpe et al. is introduced; the biological model serves as an interesting front-end for algorithms derived from the Vapnik theory, and its performance is evaluated on a restrained sign language recognition experiment.
Abstract: Regarding biological visual classification, a recent series of experiments has highlighted the fact that data classification can be realized in the human visual cortex with latencies of about 100-150 ms, which, considering visual pathway latencies, is only compatible with a very specific processing architecture, described by models from Thorpe et al. Surprisingly enough, this experimental evidence is consistent with algorithms derived from statistical learning theory. More precisely, there is a double link: on one hand, the so-called Vapnik theory offers tools to evaluate and analyze the performance of the biological model, and on the other hand, this model is an interesting front-end for algorithms derived from the Vapnik theory. The present contribution develops this idea, introducing a model derived from statistical learning theory and using the biological model of Thorpe et al. We evaluate its performance on a restrained sign language recognition experiment. This paper is intended to be read by biologists as well as statisticians; as a consequence, basic material from both fields has been reviewed.
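The Thorpe-style front-end relies on rank-order coding: with latencies this short, only the order in which neurons first fire can carry the information, with earlier (stronger) responses weighted more heavily. A minimal sketch, where the function name and decay factor are my own illustrative choices rather than the paper's:

```python
import numpy as np

def rank_order_code(activations, decay=0.9):
    """Encode a feature vector by firing order: the strongest unit fires
    first and gets weight 1, the next gets `decay`, then `decay`**2, etc."""
    order = np.argsort(-activations)          # strongest activation fires first
    code = np.zeros_like(activations, dtype=float)
    code[order] = decay ** np.arange(len(activations))
    return code

x = np.array([0.1, 0.9, 0.4, 0.7])
code = rank_order_code(x)
```

The resulting code discards amplitudes but preserves rank, and could then be fed to a Vapnik-style (e.g. large-margin) classifier as the abstract suggests.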

Journal ArticleDOI
TL;DR: Simulation results from a biologically plausible computational model of the primary somatosensory system relate the dynamics of interactions between excitatory and inhibitory neurons to the process of somatotopic map reorganization immediately after peripheral lesion.
Abstract: In this work we study the connection between some dynamic effects at the synaptic level and fast reorganization of cortical sensory maps. By using a biologically plausible computational model of the primary somatosensory system we obtained simulation results that can be used to relate the dynamics of the interactions of excitatory and inhibitory neurons to the process of somatotopic map reorganization immediately after peripheral lesion. The model consists of three regions integrated into a single structure: tactile receptors representing the glabrous surface of the hand, ventral posterior lateral nucleus of the thalamus and area 3b of the primary somatosensory cortex, reproducing the main aspects of the connectivity of these regions. By applying informational measures to the simulation results of the dynamic behavior of AMPA, NMDA and GABA synaptic conductances we draw some conjectures about how the several neuronal synaptic elements are related to the initial stage of the digit-induced reorganization of the hand map in the somatosensory cortex.

Journal ArticleDOI
TL;DR: Simulations of the interaction between light and hair cell activity show that paired stimuli do not produce a greater calcium increase than unpaired stimuli, which suggests thathair cell activity is acting via some other pathway to initiate memory storage.
Abstract: The sea slug Hermissenda learns to associate light and hair cell stimulation, but not when the stimuli are temporally uncorrelated. Memory storage, which requires an elevation in calcium, occurs in the photoreceptors, which receive monosynaptic input from hair cells that sense acceleration stimuli such as turbulence. Both light and hair cell activity increase calcium concentration in the photoreceptor, but it is unknown whether paired calcium signals combine supralinearly to initiate memory storage. A correlate of memory storage is an enhancement of the long-lasting depolarization (LLD) after light offset, which is attributed to a reduction in voltage-dependent potassium currents; however, it is unclear what causes the LLD in the untrained animal. These issues were addressed using a multi-compartmental computer model of phototransduction, calcium dynamics, and ionic currents of the Hermissenda photoreceptor. Simulations of the interaction between light and hair cell activity show that paired stimuli do not produce a greater calcium increase than unpaired stimuli. This suggests that hair cell activity is acting via some other pathway to initiate memory storage. In addition, simulations show that a potassium leak channel, which closes with an increase in calcium, is required to produce both the untrained LLD and the enhanced LLD due to the decrease in voltage-dependent potassium currents. Thus, the expression of this correlate of classical conditioning may depend on a leak potassium current.
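The proposed calcium-gated leak potassium channel can be sketched as a conductance that closes as intracellular calcium rises; closing it depolarizes the cell, producing the LLD. The Hill-type functional form and all parameter values below are illustrative assumptions, not taken from the paper's model.

```python
import numpy as np

def g_leak_K(ca, g_max=1.0, K_half=0.5, n=2):
    """Hypothetical calcium-gated leak K+ conductance: fully open (g_max)
    at zero calcium, closing as [Ca2+] rises past the half-closure point."""
    return g_max / (1.0 + (ca / K_half) ** n)

ca = np.linspace(0.0, 2.0, 5)   # illustrative calcium levels (arbitrary units)
g = g_leak_K(ca)
```

Because the conductance decreases monotonically with calcium, a light-evoked calcium rise shuts a tonic outward current, which is the mechanism the abstract invokes for both the untrained and the enhanced LLD.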

Journal ArticleDOI
TL;DR: In this article, a model of columnar networks of neocortical association areas is studied, where memory items, or patterns, are peculiar combinations of features sparsely distributed over the multi-modular network.
Abstract: A model of columnar networks of neocortical association areas is studied. The neuronal network is composed of many Hebbian autoassociators, or modules, each of which interacts with a relatively small number of the others, randomly chosen. Each module encodes and stores a number of elementary percepts, or features. Memory items, or patterns, are particular combinations of features sparsely distributed over the multi-modular network. Each feature stored in a module can be involved in several of the stored patterns; feature-sharing is in fact a source of local ambiguities and, consequently, a potential cause of erroneous memory retrieval spreading through the model network in pattern completion tasks. The memory retrieval dynamics of the large modular autoassociator is investigated by combining mathematical analysis and numerical simulations. An oscillatory retrieval process is proposed that is very efficient in overcoming the drawbacks of feature-sharing; it requires a mechanism that modulates the robustness of local attractors to noise, and neuronal activity sparseness such that quiescent and active modules are about equally noisy to any post-synaptic module. Moreover, it is shown that statistical correlation between ‘kinds’ of features across the set of memory patterns can be exploited to achieve memory retrieval more efficiently. It is also shown that some spots of the network cannot be reached by the spread of retrieval activity if they are not directly cued by the stimulus. The locations of these activity isles depend on the pattern to retrieve, while their extension only depends (in large networks) on statistics of inter-modular connections and stored patterns. The existence of activity isles determines an upper bound to retrieval quality that does not depend on the specific retrieval dynamics adopted, nor on whether feature-sharing is permitted. The oscillatory retrieval process nearly saturates this bound.
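Each module in such a network is a Hebbian autoassociator. A generic Hopfield-style sketch of a single module follows, using dense +/-1 patterns and an outer-product learning rule rather than the paper's sparse coding; network size, pattern count, and corruption level are all illustrative.

```python
import numpy as np

# One "module": a Hopfield-style Hebbian autoassociator storing 5 patterns.
rng = np.random.default_rng(3)
n_units, n_patterns = 100, 5
patterns = rng.choice([-1, 1], size=(n_patterns, n_units))

# Outer-product (Hebbian) learning rule, with no self-connections.
W = (patterns.T @ patterns) / n_units
np.fill_diagonal(W, 0.0)

def retrieve(cue, steps=10):
    """Iterated synchronous threshold updates; the state falls into the
    attractor nearest the cue (pattern completion)."""
    state = cue.copy()
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

# Cue the module with a corrupted version of the first stored pattern.
cue = patterns[0].copy()
flip = rng.choice(n_units, size=10, replace=False)
cue[flip] *= -1
recalled = retrieve(cue)
overlap = (recalled @ patterns[0]) / n_units   # 1.0 means perfect recall
```

In the paper's multi-modular setting, many such modules are coupled, and a shared feature retrieved in one module can cue (correct or erroneous) retrieval in its neighbors.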

Journal ArticleDOI
TL;DR: The spatial localization of synapses supporting supralinear summation of APs and EPSPs is computationally illustrated within thin dendritic branches, where patch-clamp experiments cannot easily be conducted.
Abstract: Although the supralinear summation of synchronizing excitatory postsynaptic potentials (EPSPs) and backpropagating action potentials (APs) is important for spike-timing-dependent synaptic plasticity (STDP), the spatial conditions of the amplification in the divergent dendritic structure have yet to be analyzed. In the present study, we simulated the coincidence of APs with EPSPs at randomly determined synaptic sites of a morphologically reconstructed hippocampal CA1 pyramidal model neuron and clarified the spatial condition of the amplifying synapses. In the case of uniform conductance inputs, the amplifying synapses were localized in the middle apical dendrites and distal basal dendrites with small diameters, and the fraction of amplifying synapses was unexpectedly small: 8-16% in both apical and basal dendrites. This was because the appearance of strong amplification requires the coincidence of both APs of 3-30 mV and EPSPs of over 6 mV, both of which depend on the dendritic location of synaptic sites. We found that the localization of amplifying synapses depends on A-type K+ channel distribution because backpropagating APs depend on the A-type K+ channel distribution, and that the localizations of amplifying synapses were similar within a range of physiological synaptic conductances. We also quantified the spread of membrane amplification in dendrites, indicating that neighboring synapses can also show the amplification. These findings allowed us to computationally illustrate the spatial localization of synapses for supralinear summation of APs and EPSPs within thin dendritic branches, where patch-clamp experiments cannot easily be conducted.

Journal ArticleDOI
Steven J. Cox1
TL;DR: A method is shown to permit accurate and robust extraction of the location and time course of synaptic conductance from potentials recorded on either side of, and perhaps at some distance from, the synapse in question, fully overcoming the problems typically associated with lack of space clamp.
Abstract: A method is introduced that permits accurate and robust extraction of the location and time course of synaptic conductance from potentials recorded on either side of, and perhaps at some distance from, the synapse in question. It is shown that such data permit one to fully overcome the problems typically associated with lack of space clamp. The method does not presume anything about the nature of the time course and yet is applicable to branched, active cells receiving simultaneous input from a number of synapses.
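The core idea of conductance extraction can be illustrated in a drastically simplified setting: a single passive compartment, where the current-balance equation can be solved directly for the synaptic conductance given the recorded voltage. The paper's method handles branched, active cables with multiple synapses; the parameters and the Gaussian conductance waveform below are purely illustrative.

```python
import numpy as np

# Hypothetical passive parameters for a single compartment.
C, g_L, E_L, E_syn = 1.0, 0.1, -65.0, 0.0

dt = 0.1
t = np.arange(0, 100, dt)
g_true = 0.05 * np.exp(-((t - 30.0) / 5.0) ** 2)   # "ground-truth" conductance

# Forward-simulate the membrane voltage (Euler integration of
# C dV/dt = -g_L (V - E_L) - g_syn (V - E_syn))...
V = np.full_like(t, E_L)
for k in range(len(t) - 1):
    dV = (-g_L * (V[k] - E_L) - g_true[k] * (V[k] - E_syn)) / C
    V[k + 1] = V[k] + dt * dV

# ...then invert the same current-balance equation to recover the
# conductance time course from the "recorded" voltage alone.
dVdt = np.gradient(V, dt)
g_est = (-C * dVdt - g_L * (V - E_L)) / (V - E_syn)
```

Because V never reaches E_syn here, the division is well defined; in a real cell, lack of space clamp distorts the recorded voltage, which is exactly the difficulty the paper's multi-site method is designed to overcome.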