
Showing papers by "Sacha B. Nelson published in 2001"


Journal ArticleDOI
20 Dec 2001-Neuron
TL;DR: A novel form of cooperativity operating even when postsynaptic firing is evoked by current injection is demonstrated, and a complex dependence of LTP and LTD on rate and timing is revealed, providing a quantitative framework for predicting the impact of in vivo firing patterns on synaptic strength.
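The rate-and-timing dependence referred to in this TL;DR can be pictured with a pair-based spike-timing-dependent plasticity rule. The snippet below is a generic textbook sketch with arbitrarily chosen amplitudes and time constants, not the quantitative model fitted in the paper:

```python
import numpy as np

# Minimal pair-based STDP sketch (illustrative only; not the paper's fitted model).
# dw > 0 when the presynaptic spike precedes the postsynaptic spike (LTP),
# dw < 0 when it follows (LTD); magnitude decays exponentially with |dt|.
A_PLUS, A_MINUS = 0.01, 0.012      # assumed LTP/LTD amplitudes
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # assumed time constants (ms)

def stdp_dw(dt_ms):
    """Weight change for a single pre/post spike pair; dt_ms = t_post - t_pre."""
    if dt_ms >= 0:
        return A_PLUS * np.exp(-dt_ms / TAU_PLUS)
    return -A_MINUS * np.exp(dt_ms / TAU_MINUS)

# Example: pre 10 ms before post -> potentiation; pre 10 ms after post -> depression.
print(stdp_dw(+10.0), stdp_dw(-10.0))
```

In the paper's data the picture is more complex: the sign and size of the change depend on firing rate as well as on spike timing, which is what the quantitative framework mentioned above captures.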

1,153 citations


Journal ArticleDOI
TL;DR: Together with previous work, these data suggest that there are two independent signals that regulate activity-dependent synaptic scaling in pyramidal neurons: low levels of BDNF cause excitatory synapses to scale up in strength, whereas depolarization causes excitatory synapses to scale down in strength.
Abstract: Pyramidal neurons scale the strength of all of their excitatory synapses up or down in response to long-term changes in activity, and in the direction needed to stabilize firing rates. This form of homeostatic plasticity is likely to play an important role in stabilizing firing rates during learning and developmental plasticity, but the signals that translate a change in activity into global changes in synaptic strength are poorly understood. Some but not all of the effects of long-lasting changes in activity on synaptic strengths can be accounted for by activity-dependent release of the neurotrophin brain-derived neurotrophic factor (BDNF). Other candidate activity signals include changes in glutamate receptor (GluR) activation, changes in firing rate, or changes in the average level of postsynaptic depolarization. Here we combined elevated KCl (3-12 mM) with ionotropic receptor blockade to dissociate postsynaptic depolarization from receptor activation. Chronic (48 hr) depolarization, ranging between -62 and -36 mV, parametrically reduced the quantal amplitude of excitatory synapses in a BDNF-independent manner. This effect of depolarization did not depend on AMPA, NMDA, or GABA(A) receptor signaling, action-potential generation, or metabotropic GluR activation. Together with previous work, these data suggest that there are two independent signals that regulate activity-dependent synaptic scaling in pyramidal neurons: low levels of BDNF cause excitatory synapses to scale up in strength, whereas depolarization causes excitatory synapses to scale down in strength.

135 citations


01 Jan 2001
TL;DR: A simple, intuitive network for reading out the population code of a direction-selective population is proposed; readout using a non-linear recurrent network approaches the optimal performance, suggesting that the brain may indeed have implemented a population code reader at the circuit level.
Abstract: We study the optimal way to decode information present in a population code. Using a matched filter, the performance in Gaussian additive noise is as good as the theoretical maximum. The scheme can be applied when correlations among the neurons in the population are present. We show how the readout of the matched filter can be implemented in a neurophysiologically realistic manner. The method seems advantageous for computations in layered networks.

At many stages of neural information processing in the brain, information is not carried by a single neuron but shared by many neurons in parallel. Evidence for this has been found in sensory systems [7], motor systems [3, 6] and hippocampal place cells [4]. In population codes, neurons have receptive fields centered at different locations, but their tuning curves are wide and overlap considerably. Thus a stimulus will activate a large population of neurons. By combining the response rates of the different neurons, the stimulus can be reconstructed. Population coding is not vulnerable to the failure of a single neuron. In addition, the population signal is less noisy than the signal from a single neuron alone. This can be used to increase accuracy or to limit temporal averaging and allow for a quicker response [5, 13].

One approach to reading out the encoded stimulus is to average the responses of all neurons in the population in order to construct the population vector [3]. Due to noise in the responses of the neurons, there are trial-to-trial fluctuations in the estimate of the encoded quantity (e.g., motion direction). A statistical construct, the Cramer-Rao bound, gives the minimal trial-to-trial error obtainable at a given noise level, but it does not prescribe how to achieve that minimal error. It is not unreasonable to assume that the nervous system, in its evolutionary quest for efficiency, performs close to the optimum. However, the population vector scores worse than the bound in most situations [11]. In this paper we discuss optimal readout mechanisms. Interestingly, readout using a non-linear recurrent network approaches the optimal performance [9, 2], suggesting that possibly on a circuit level the brain has indeed implemented a population code reader. Yet such a recurrent network has two possible disadvantages: 1) the output of the network becomes non-linear in the input; in an extreme case, a subthreshold input will not lead to any response in the output; and 2) the network is recurrent, and therefore some time passes before the final output is reached, slowing down computation.

[Figure 1: a) Simulated population activity of 200 units in response to a stimulus at a given angle. Lower panel: the stimulus distorted with Gaussian additive noise. Upper panel: the response filtered with a matched filter. b) The error of the matched filter in the estimate of the stimulus angle, as a function of the filter width; the standard deviation across trials of the angle estimate is plotted. The error approaches the theoretical limit (dotted line) when the filter width equals the width of the stimulus, that is, when the filter is matched (arrow). Dashed line: error when using the population vector. c) As b), but for the amplitude estimate.]
Here we propose a simple, intuitive network to read out the population code of a direction-selective population. We extend the readout mechanism to recover not only the encoded angle of the stimulus but also its amplitude. We prove that the method attains the Cramer-Rao bound for both the angle and the amplitude.
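The decoding scheme described in this abstract can be illustrated with a toy simulation. The sketch below is an assumption-laden illustration, not the authors' implementation: it uses a circular-Gaussian tuning curve, arbitrary noise level and filter width, and a brute-force search over candidate angles to realize the matched filter, with the classical population-vector estimate computed for comparison.

```python
import numpy as np

# Toy readout of a direction-selective population (illustrative assumptions only):
# 200 tuned units, Gaussian additive noise, matched-filter vs. population-vector decoding.
rng = np.random.default_rng(0)
n_units = 200
preferred = np.linspace(0.0, 2 * np.pi, n_units, endpoint=False)  # preferred angles

def tuning(angle, width=0.5, amplitude=1.0):
    """Circular-Gaussian population response to a stimulus at `angle`."""
    return amplitude * np.exp((np.cos(preferred - angle) - 1.0) / width**2)

true_angle, true_amp, noise_sd = 1.2, 1.0, 0.2
response = tuning(true_angle, amplitude=true_amp) + noise_sd * rng.standard_normal(n_units)

# Matched filter: correlate the noisy response with a unit-amplitude template of the
# same shape centered on each candidate angle; the peak location gives the angle,
# and the peak correlation normalized by the template energy gives the amplitude.
templates = np.stack([tuning(a) for a in preferred])   # (n_candidates, n_units)
scores = templates @ response
best = int(np.argmax(scores))
mf_angle = preferred[best]
mf_amp = scores[best] / np.sum(templates[best] ** 2)

# Population vector: sum of preferred-direction unit vectors weighted by the responses.
pv = np.sum(response * np.exp(1j * preferred))
pv_angle = np.angle(pv) % (2 * np.pi)

print(f"true angle {true_angle:.3f}  matched filter {mf_angle:.3f}  pop. vector {pv_angle:.3f}")
print(f"true amplitude {true_amp:.3f}  matched filter estimate {mf_amp:.3f}")
```

Because the filter has the same shape as the tuning curve, the peak of the correlation recovers the angle and its normalized height recovers the amplitude, which is the sense in which the filter is "matched"; the abstract's claim is that this estimator reaches the Cramer-Rao limit that the population vector generally falls short of.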

1 citation