
Showing papers in "Network: Computation In Neural Systems in 2000"


Journal ArticleDOI
TL;DR: The similarities between these results and the observed properties of simple cells in the primary visual cortex are further evidence for the hypothesis that visual cortical neurons perform some type of redundancy reduction, which was one of the original motivations for ICA in the first place.
Abstract: Previous work has shown that independent component analysis (ICA) applied to feature extraction from natural image data yields features resembling Gabor functions and simple-cell receptive fields. This article considers the effects of including chromatic and stereo information. The inclusion of colour leads to features divided into separate red/green, blue/yellow, and bright/dark channels. Stereo image data, on the other hand, leads to binocular receptive fields which are tuned to various disparities. The similarities between these results and the observed properties of simple cells in the primary visual cortex are further evidence for the hypothesis that visual cortical neurons perform some type of redundancy reduction, which was one of the original motivations for ICA in the first place. In addition, ICA provides a principled method for feature extraction from colour and stereo images; such features could be used in image processing operations such as denoising and compression, as well as in pattern recognition.

274 citations
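The ICA procedure the abstract builds on can be illustrated on synthetic data. The numpy-only sketch below implements a FastICA-style fixed-point iteration (tanh contrast, symmetric decorrelation) and recovers two independent sources from linear mixtures. It is a generic illustration of ICA, not the authors' image pipeline; the mixing matrix and all parameter values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent, non-Gaussian (uniform) sources, linearly mixed.
n = 5000
S = rng.uniform(-1.0, 1.0, size=(2, n))       # rows = independent sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])        # unknown mixing matrix
X = A @ S                                     # observed mixtures

# Whiten: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# FastICA-style fixed-point iteration (tanh contrast, symmetric decorrelation).
W = rng.normal(size=(2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    W = G @ Z.T / n - np.diag((1.0 - G ** 2).mean(axis=1)) @ W
    d2, E2 = np.linalg.eigh(W @ W.T)
    W = E2 @ np.diag(d2 ** -0.5) @ E2.T @ W   # W <- (W W^T)^(-1/2) W

Y = W @ Z                                     # recovered components
# Each recovered component should match one source up to sign and permutation.
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(corr.max(axis=1))                       # both entries close to 1
```

Applied to whitened natural image patches instead of these toy sources, the rows of the learned unmixing matrix are what yield the Gabor-like, simple-cell-like features the paper describes.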


Journal ArticleDOI
TL;DR: This important and cautionary book contributes to the understanding of the rich behaviours of these neural elements when combined in circuits, excepting, of course, important examples from various invertebrate systems.
Abstract: Fast oscillations in cortical circuits. Network: Computation in Neural Systems: Vol. 11, No. 4, pp. 333-334 (2000).

128 citations


Journal ArticleDOI
TL;DR: Stochastic weak synchronization (SWS), in which neurons spike within a short interval of each other but not necessarily at each period, is found to be much more robust than strong synchronization (in which neurons spike at each period); it occurs in the gamma-frequency range (20-80 Hz), for stronger synaptic coupling than in previous models, and for networks of 10-1000 neurons.
Abstract: Recent experiments suggest that inhibitory networks of interneurons can synchronize the neuronal discharge in in vitro hippocampal slices. Subsequent theoretical work has shown that strong synchronization by mutual inhibition is only moderately robust against neuronal heterogeneities in the current drive, provided by activation of metabotropic glutamate receptors. In vivo neurons display greater variability in the interspike intervals due to the presence of synaptic noise. Noise and heterogeneity affect synchronization properties differently. In this paper we study, using model simulations, how robust synchronization can be in the presence of synaptic noise and neuronal heterogeneity. We find that stochastic weak synchronization (SWS) (i.e. when neurons spike within a short interval from each other, but not necessarily at each period) is produced with at least a minimum amount of noise and that it is much more robust than strong synchronization (i.e. when neurons spike at each period). The statistics prod...

120 citations


Journal ArticleDOI
TL;DR: A novel mechanism for self-organized oscillations in networks that have strong, sparse random electrical coupling via gap junctions is described and insight into possible mechanisms for frequency control and modulation in networks of neurons is provided.
Abstract: Recent evidence suggests that electrical coupling plays a role in generating oscillatory behaviour in networks of neurons; however, the underlying mechanisms have not been identified. Using a cellular automata model proposed by Traub et al (Traub R D, Schmitz D, Jefferys J G and Draguhn A 1999 High-frequency population oscillations are predicted to occur in hippocampal pyramidal neural networks interconnected by axo-axonal gap junctions Neuroscience 92 407-26), we describe a novel mechanism for self-organized oscillations in networks that have strong, sparse random electrical coupling via gap junctions. The network activity is generated by random spontaneous activity that is moulded into regular population oscillations by the propagation of activity through the network. We explain how this activity gives rise to particular dependences of mean oscillation frequency on network connectivity parameters and on the rate of spontaneous activity, and we derive analytical expressions to approximate the mean frequency and variance of the oscillations. In doing so, we provide insight into possible mechanisms for frequency control and modulation in networks of neurons.

106 citations


Journal ArticleDOI
TL;DR: In a network of neurons with two sites of synaptic integration, one set of synapses mainly influences the neurons' activity while the other set gates synaptic plasticity; this demonstrates how a new measure of performance can be implemented: cells learn to represent only the part of the input that is relevant to the processing at higher stages.
Abstract: Since the classical work of D O Hebb (1949 The Organization of Behaviour (New York: Wiley)) it has been assumed that synaptic plasticity depends solely on the activity of the pre- and the post-synaptic cells. Synapses influence the plasticity of other synapses exclusively via the post-synaptic activity. This confounds effects on synaptic plasticity and neuronal activation and thus makes it difficult to implement networks which optimize global measures of performance. Exploring solutions to this problem, inspired by recent research on the properties of apical dendrites, we examine a network of neurons with two sites of synaptic integration. These communicate in such a way that one set of synapses mainly influences the neurons' activity; the other set gates synaptic plasticity. Analysing the system with a constant set of parameters reveals: (1) the afferents that gate plasticity act as supervisors, individual to every cell. (2) While the neurons acquire specific receptive fields the net activity remains constant ...

84 citations


Journal ArticleDOI
TL;DR: A model of an olfactory system that performs odour segmentation is presented that consists of a pair of coupled modules, bulb and cortex, which implements odour segmentation by suppressing the bulbar response to the pre-existing odour, thereby allowing subsequent odours to be singled out for recognition.
Abstract: We present a model of an olfactory system that performs odour segmentation. Based on the anatomy and physiology of natural olfactory systems, it consists of a pair of coupled modules, bulb and cortex. The bulb encodes the odour inputs as oscillating patterns. The cortex functions as an associative memory: when the input from the bulb matches a pattern stored in the connections between its units, the cortical units resonate in an oscillatory pattern characteristic of that odour. Further circuitry transforms this oscillatory signal to a slowly varying feedback to the bulb. This feedback implements olfactory segmentation by suppressing the bulbar response to the pre-existing odour, thereby allowing subsequent odours to be singled out for recognition.

68 citations


Journal ArticleDOI
TL;DR: A cortical network model in which this persistent activity appears due to recurrent synaptic interactions is considered, which predicts that, in networks of the cerebral cortex in which persistent activity phenomena are observed, average synaptic inputs in both spontaneous and persistent activity should bring the cells close to firing threshold.
Abstract: Neurophysiological experiments indicate that working memory of an object is maintained by the persistent activity of cells in the prefrontal cortex and infero-temporal cortex of the monkey. This paper considers a cortical network model in which this persistent activity appears due to recurrent synaptic interactions. The conditions under which the magnitude of spontaneous and persistent activity are close to one another (as is found empirically) are investigated using a simplified mean-field description in which firing rates in these states are given by the intersections of a straight line with the f-I curve of a single pyramidal cell. The present analysis relates a network phenomenon—persistent activity in a ‘working memory’ state—to single-cell data which are accessible to experiment. It predicts that, in networks of the cerebral cortex in which persistent activity phenomena are observed, average synaptic inputs in both spontaneous and persistent activity should bring the cells close to firing threshold....

67 citations
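The mean-field picture in the abstract — firing rates given by the intersections of a straight line with a single cell's f-I curve — can be illustrated numerically. The sigmoidal f-I curve and all parameter values below are hypothetical, chosen only to produce the geometry described: a low (spontaneous) and a high (persistent) stable rate, separated by an unstable one.

```python
import numpy as np

# Hypothetical sigmoidal f-I curve of a single pyramidal cell (rate in Hz).
def f(I, r_max=50.0, I_half=5.0, k=1.2):
    return r_max / (1.0 + np.exp(-(I - I_half) / k))

# Network self-consistency: r = f(I_ext + J*r), i.e. the intersections of the
# straight line I = I_ext + J*r with the f-I curve.
I_ext, J = 2.0, 0.12   # illustrative external drive and recurrent coupling

def mismatch(r):
    return f(I_ext + J * r) - r

# Scan for sign changes of the mismatch, then bisect each bracket.
rs = np.linspace(0.0, 60.0, 6001)
vals = mismatch(rs)
fixed_points = []
for i in range(len(rs) - 1):
    if vals[i] == 0.0:
        fixed_points.append(float(rs[i]))
    elif vals[i] * vals[i + 1] < 0.0:
        lo, hi = float(rs[i]), float(rs[i + 1])
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if mismatch(lo) * mismatch(mid) <= 0.0:
                hi = mid
            else:
                lo = mid
        fixed_points.append(0.5 * (lo + hi))

# Three intersections: spontaneous and persistent rates (outer two, stable)
# separated by an unstable middle rate.
print([round(r, 2) for r in fixed_points])
```

The outer intersections are stable because the f-I curve crosses the line from above there (slope of f times J is less than 1), while the middle crossing, where that slope exceeds 1, is unstable.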


Journal ArticleDOI
TL;DR: A biologically based neural network simulation is used to model the recency and familiarity effects of neurons in inferior temporal (IT) cortex: the recency effects are caused by adaptation due to a calcium-dependent potassium current, and the familiarity effects by competitive self-organization of modifiable feedforward synapses terminating on IT cortex neurons.
Abstract: Neurons in inferior temporal (IT) cortex exhibit selectivity for complex visual stimuli and can maintain activity during the delay following the presentation of a stimulus in delayed match to sample tasks. Experimental work in awake monkeys has shown that the responses of IT neurons decline during presentation of stimuli which have been seen recently (within the past few seconds). In addition, experiments have found that the responses of IT neurons to visual stimuli also decline as the stimuli become familiar, independent of recency. Here a biologically based neural network simulation is used to model these effects primarily through two processes. The recency effects are caused by adaptation due to a calcium-dependent potassium current, and the familiarity effects are caused by competitive self-organization of modifiable feedforward synapses terminating on IT cortex neurons.

55 citations
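The recency mechanism — adaptation through a calcium-dependent potassium current — can be caricatured with a one-variable firing-rate model. Everything below (time constants, gains, stimulus timing) is hypothetical and chosen only to reproduce the qualitative effect: a reduced response to a recently seen stimulus and recovery after a long delay.

```python
import numpy as np

dt = 1.0                  # ms per step
tau_a = 200.0             # adaptation decay time constant (hypothetical)
g_ca = 0.02               # calcium influx per unit rate (hypothetical)
T = 2000
onsets = [0, 100, 1500]   # three 50 ms presentations of the same stimulus
stim = np.zeros(T)
for t0 in onsets:
    stim[t0:t0 + 50] = 1.0

a = 0.0                   # adaptation variable (Ca-dependent K conductance)
rates = np.zeros(T)
for t in range(T):
    rates[t] = max(stim[t] - a, 0.0)          # rate suppressed by adaptation
    a += dt * (-a / tau_a + g_ca * rates[t])  # builds with firing, decays slowly

peaks = [rates[t0:t0 + 50].max() for t0 in onsets]
# The second response (100 ms after the first) is reduced; the third,
# 1.4 s later, has largely recovered.
print([round(p, 3) for p in peaks])
```

The familiarity effect in the paper requires the second, slower process (competitive synaptic self-organization), which this single-variable sketch deliberately omits.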


Journal ArticleDOI
TL;DR: Divisive inhibition, acting through interneurons that are themselves divisively inhibited, can solve the problems of instability, parameter sensitivity and slow response without degrading the selectivity of a recurrent network.
Abstract: Models of visual cortex suggest that response selectivity can arise from recurrent networks operating at high gain. However, such networks have a number of problematic features: (i) they operate perilously close to a point of instability, (ii) small changes in synaptic strength can dramatically modify the degree of amplification, and (iii) they respond slowly to rapidly changing stimuli. Divisive inhibition, acting through interneurons that are themselves divisively inhibited, can solve these problems without degrading the selectivity of a recurrent network.

54 citations
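The role of divisive inhibition in such a network can be sketched with a toy ring of orientation-tuned units: a high-gain cosine kernel amplifies the tuned component of a weak input, while a divisive pool term keeps the fixed point bounded. This is an illustrative caricature with invented kernel, gain, and semi-saturation values, not the model analysed in the paper.

```python
import numpy as np

n = 32
theta = np.linspace(-np.pi / 2, np.pi / 2, n, endpoint=False)
stimulus = np.exp(-theta ** 2 / (2 * 0.3 ** 2))   # weakly tuned feedforward drive

# Recurrent excitatory kernel: broad cosine tuning; per-mode gain 0.9,
# i.e. linear amplification ~10x for the tuned component (close to instability).
W = 1.8 * np.cos(2 * (theta[:, None] - theta[None, :])) / n

sigma = 0.5                                       # semi-saturation constant
r = np.zeros(n)
for _ in range(2000):                             # damped fixed-point iteration
    pool = r.sum() / n                            # divisive inhibitory pool
    drive = np.maximum(stimulus + W @ r, 0.0)
    r += 0.2 * (drive / (sigma + pool) - r)       # division, not subtraction

print(int(r.argmax()))                            # peak at the stimulated orientation
```

Because the pool term scales with total activity, the response saturates instead of running away, while the location of the peak — the network's selectivity — is unchanged.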


Journal ArticleDOI
TL;DR: A model of neurite outgrowth combined with cell movement is used to investigate the hypothesis that lateral cell movement is guided by dendritic interactions, and shows that small cell movements are sufficient to transform random cell distributions into regular mosaics.
Abstract: The formation of retinal mosaics is thought to involve lateral movement of retinal cells from their clonal column of origin. The forces underlying this lateral cell movement are currently unknown. ...

45 citations


Journal ArticleDOI
TL;DR: Analysis of the development and structure of orientation (OR) and ocular dominance (OD) maps in the primary visual cortex of cats and monkeys using the elastic net algorithm suggests that developmental order can be predicted from the final OR and OD periodicities.
Abstract: The development and structure of orientation (OR) and ocular dominance (OD) maps in the primary visual cortex of cats and monkeys can be modelled using the elastic net algorithm, which attempts to find an 'optimal' cortical representation of the input features. Here we analyse this behaviour in terms of parameters of the feature space. We derive expressions for the OR periodicity, and the first bifurcation point as a function of the annealing parameter using the methods of Durbin et al (Durbin R, Szeliski R and Yuille A 1989 Neural Computation 1 348-58). We also investigate the effect of the relative order of OR and OD development on overall map structure. This analysis suggests that developmental order can be predicted from the final OR and OD periodicities. In conjunction with experimentally measured values for these periodicities, the model predicts that (i) in normal macaques OD develops first, (ii) in normal cats OR develops first and (iii) in strabismic cats OD develops first.

Journal ArticleDOI
TL;DR: It is concluded that the differences in velocity and shape between the front of thalamic spindle waves in vitro and cortical paroxysmal discharges stem from their different effective delays.
Abstract: We study a one-dimensional model of integrate-and-fire neurons that are allowed to fire only one spike, and are coupled by excitatory synapses with delay. At small delay values, this model describes a disinhibited cortical slice. At large delay values, the model is a reduction of a model of thalamic networks composed of excitatory and inhibitory neurons, in which the excitatory neurons show the post-inhibitory rebound mechanism. The velocity and stability of propagating continuous pulses are calculated analytically. Two pulses with different velocities exist if the synaptic coupling is larger than a minimal value; the pulse with the lower velocity is always unstable. Above a certain critical value of the constant delay, continuous pulses lose stability via a Hopf bifurcation, and lurching pulses emerge. The parameter regime for which lurching occurs is strongly affected by the synaptic footprint (connectivity) shape. A bistable regime, in which both continuous and lurching pulses can propagate, may occur ...

Journal ArticleDOI
TL;DR: A set of sigma–pi units randomly connected to two input vectors forms a type of hetero-associator related to convolution- and matrix-based associative memories, which encodes information in activation values rather than in weight values, which makes the information about relationships accessible to further processing.
Abstract: A set of sigma-pi units randomly connected to two input vectors forms a type of hetero-associator related to convolution- and matrix-based associative memories. Associations are represented as patterns of activity rather than connection strengths. Decoding the associations requires another network of sigma-pi units, with connectivity dependent on the encoding network. Learning the connectivity of the decoding network involves setting n³ parameters (where n is the size of the vectors), and can be accomplished in approximately 3e n log n presentations of random patterns. This type of network encodes information in activation values rather than in weight values, which makes the information about relationships accessible to further processing. This accessibility is essential for higher-level cognitive tasks such as analogy processing. The fact that random networks can perform useful operations makes it more plausible that these types of associative network could have arisen in the nervous systems of natural organisms during the course of evolution.
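The convolution-based associative memories mentioned as relatives can be sketched with circular-convolution binding (as in holographic reduced representations): two random vectors are bound by circular convolution and approximately unbound by circular correlation. This illustrates that family of memories generically; it is not the randomly connected sigma-pi network itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1024
# Convention: elements drawn i.i.d. N(0, 1/n), so vectors have ~unit norm.
a = rng.normal(0.0, 1.0 / np.sqrt(n), n)
b = rng.normal(0.0, 1.0 / np.sqrt(n), n)

def cconv(x, y):
    """Circular convolution (binding), computed via FFT."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)))

def ccorr(x, y):
    """Circular correlation (approximate unbinding)."""
    return np.real(np.fft.ifft(np.conj(np.fft.fft(x)) * np.fft.fft(y)))

trace = cconv(a, b)        # the association: same dimensionality as a and b
b_hat = ccorr(a, trace)    # cue with a to retrieve a noisy version of b

def cos(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Retrieval is noisy but well above chance, and specific to the bound item.
print(cos(b_hat, b), cos(b_hat, a))
```

As in the abstract's sigma-pi scheme, the association here lives in an activity pattern of the same dimensionality as the inputs, not in connection weights, so it remains available for further processing.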

Journal ArticleDOI
TL;DR: The calculations suggest that differences in the experimental dynamical responses of cells in different cortical layers originate from differences in their recurrent connections with other cells, which furnishes a variety of information that is not available from experiment alone.
Abstract: A typical functional region in cortex contains thousands of neurons, therefore direct neuronal simulation of the dynamics of such a region necessarily involves massive computation. A recent efficient alternative formulation is in terms of kinetic equations that describe the collective activity of the whole population of similar neurons. A previous paper has shown that these equations produce results that agree well with detailed direct simulations. Here we illustrate the power of this new technique by applying it to the investigation of the effect of recurrent connections upon the dynamics of orientation tuning in the visual cortex. Our equations express the kinetic counterpart of the hypercolumn model from which Somers et al (Somers D, Nelson S and Sur M 1995 J. Neurosci. 15 5448-65) computed steady-state cortical responses to static stimuli by direct simulation. We confirm their static results. Our method presents the opportunity to simulate the data-intensive dynamical experiments of Ringach et al (Ringach D, Hawken M and Shapley R 1997 Nature 387 281-4), in which 60 randomly oriented stimuli were presented each second for 15 min, to gather adequate statistics of responses to multiple presentations. Without readjustment of the previously defined parameters, our simulations yield substantial agreement with the experimental results. Our calculations suggest that differences in the experimental dynamical responses of cells in different cortical layers originate from differences in their recurrent connections with other cells. Thus our method of efficient simulation furnishes a variety of information that is not available from experiment alone.

Journal ArticleDOI
TL;DR: This work adds an additional, biologically appropriate, parameter to control the magnitude and stability of activity oscillations in recurrent neural network models of brain regions, and shows how the size of external input activity interacts with this parameter to affect network activity.
Abstract: Controlling activity in recurrent neural network models of brain regions is essential both to enable effective learning and to reproduce the low activities that exist in some cortical regions such as hippocampal region CA3. Previous studies of sparse, random, recurrent networks constructed with McCulloch-Pitts neurons used probabilistic arguments to set the parameters that control activity. Here, we extend this work by adding an additional, biologically appropriate, parameter to control the magnitude and stability of activity oscillations. The new constant can be considered to be the rest conductance in a shunting model or the threshold when subtractive inhibition is used. This new parameter is critical for large networks run at low activity levels. Importantly, extreme activity fluctuations that act to turn large networks totally on or totally off can now be avoided. We also show how the size of external input activity interacts with this parameter to affect network activity. Then the model based on fixed weights is extended to estimate activities in networks with distributed weights. Because the theory provides accurate control of activity fluctuations, the approach can be used to design a predictable amount of pseudorandomness into deterministic networks. Such nonminimal fluctuations improve learning in simulations trained on the transitive inference problem.

Journal ArticleDOI
TL;DR: In this article, the scaling properties of changes in contrast of natural images in different visual environments are investigated; the scaling exponents are found not to be universal: even though most images follow the same type of statistics, they do so with different values of the distribution parameters.
Abstract: We report results on the scaling properties of changes in contrast of natural images in different visual environments. This study confirms the existence, in a vast class of images, of a multiplicative process relating the variations in contrast seen at two different scales, as was found in Turiel et al (Turiel A, Mato G, Parga N and Nadal J-P 1998 Self-Similarity Properties of Natural Images: Proc. NIPS'97 (Cambridge, MA: MIT Press), Turiel A, Mato G, Parga N and Nadal J-P 1998 Phys. Rev. Lett. 80 1098-101). But it also shows that the scaling exponents are not universal: even though most images follow the same type of statistics, they do so with different values of the distribution parameters. Motivated by these results, we also present the analysis of a generative model of images that reproduces those properties and that has the correct power spectrum. Possible implications for visual processing are also discussed.

Journal ArticleDOI
TL;DR: The results show that synchronization phenomena far beyond completely synchronized oscillations can occur even in simple coupled networks, suggesting a similarly rich spatio-temporal behaviour in real neural systems.
Abstract: Synchronization of neural signals has been proposed as a temporal coding scheme representing cooperative computation in distributed cortical networks. Previous theoretical studies in that direction mainly focused on the synchronization of coupled oscillatory subsystems and neglected more complex dynamical modes that already exist on the single-unit level. In this paper we study the parametrized time-discrete dynamics of two coupled recurrent networks of graded neurons. Conditions for the existence of partially synchronized dynamics of these systems are derived, referring to a situation where only subsets of neurons in each sub-network are synchronous. The coupled networks can have different architectures and even a different number of neurons. Periodic as well as quasiperiodic and chaotic attractors constrained to a manifold M of synchronized components are observed. Examples are discussed for coupled 3-neuron networks having different architectures, and for coupled 2-neuron and 3-neuron networks. Partial synchronization of different degrees is demonstrated by numerical results for selected sets of parameters. In conclusion, the results show that synchronization phenomena far beyond completely synchronized oscillations can occur even in simple coupled networks. The type of synchronization depends in an intricate way on stimuli, history and connectivity as well as other parameters of the network. Specific inputs can further switch between different operational modes in a complex way, suggesting a similarly rich spatio-temporal behaviour in real neural systems.

Journal ArticleDOI
TL;DR: Estimation procedures for the parameters of this model are extended to the case of large spatial covariances, based upon linear regression techniques, and applied to an optical recording of the auditory cortex of a guinea pig stimulated with pure tone bursts.
Abstract: The diffusion model has been introduced as a statistical model for processing multidimensional neuronal data. This paper extends estimation procedures for the parameters of this model when spatial covariances are large. The new method is based upon linear regression techniques. It is applied to an optical recording of the auditory cortex of a guinea pig stimulated with pure tone bursts (frequency 14 kHz).

Journal ArticleDOI
TL;DR: The results indicate that dendritic spines may act as an analogue pattern matching device, and suggest that modulation of potassium channels by protein kinases may mediate neuronal pattern recognition.
Abstract: Modification of potassium channels by protein phosphorylation has been shown to play a role in learning and memory. If such memory storage machinery were part of dendritic spines, then a set of spines could act as an 'analogue pattern matching' device by learning a repeatedly presented pattern of synaptic activation. In this study, the plausibility of such analogue pattern matching is investigated in a detailed circuit model of a set of spines attached to a dendritic branch. Each spine head contains an AMPA synaptic channel in parallel with a calcium-dependent potassium channel whose sensitivity depends on its phosphorylation state. Repeated presentation of synaptic activity results in calcium activation of protein kinases and subsequent channel phosphorylation. Simulations demonstrate that signal strength is greatest when the synaptic input pattern is equal to the previously learned pattern, and smaller when components of the synaptic input pattern are either smaller or larger than corresponding components of the previously learned pattern. Therefore, our results indicate that dendritic spines may act as an analogue pattern matching device, and suggest that modulation of potassium channels by protein kinases may mediate neuronal pattern recognition.