Hebbian learning and spiking neurons
Citations
Deep learning in neural networks
Stochastic Processes in Physics and Chemistry
Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems
Spiking Neuron Models: Single Neurons, Populations, Plasticity
Competitive Hebbian learning through spike-timing-dependent synaptic plasticity
References
A synaptic model of memory: long-term potentiation in the hippocampus
Self-Organization and Associative Memory
Stochastic Processes in Physics and Chemistry
Related Papers (5)
Synaptic Modifications in Cultured Hippocampal Neurons: Dependence on Spike Timing, Synaptic Strength, and Postsynaptic Cell Type
Competitive Hebbian learning through spike-timing-dependent synaptic plasticity
Regulation of Synaptic Efficacy by Coincidence of Postsynaptic APs and EPSPs
Frequently Asked Questions (8)
Q2. What is the simplest way to explain the spiking?
Given that input spiking is random but partially correlated, and that the generation of spikes is in general a complicated dynamic process, an analysis of Eq. (1) is a formidable problem.
Q3. What is the correlation function for a spike at time t8?
The correlation function $\langle S_i^{\mathrm{in}}(t'')\, S^{\mathrm{out}}(t') \rangle$ is to be interpreted as the joint probability density for observing an input spike at synapse $i$ at time $t''$ and an output spike at time $t'$.
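As a rough illustration of how such a joint spike statistic can be estimated from data, one can histogram the pairwise time differences between output and input spikes. All numbers below (spike counts, duration, bin width) are made-up illustration parameters, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical spike trains on [0, 100] ms (illustration only).
t_in = np.sort(rng.uniform(0.0, 100.0, size=400))   # input spike times t''
t_out = np.sort(rng.uniform(0.0, 100.0, size=300))  # output spike times t'

# Histogram all pairwise lags t' - t'' to estimate coincidence statistics.
bin_width = 1.0                                     # ms
lags = (t_out[:, None] - t_in[None, :]).ravel()
edges = np.arange(-50.0, 50.0 + bin_width, bin_width)
counts, _ = np.histogram(lags, bins=edges)

# Normalizing by duration and bin width gives a raw density of
# input/output coincidences per unit time per unit lag.
T = 100.0
corr = counts / (T * bin_width)
print(corr.shape)  # prints (100,)
```

Averaging such histograms over trials would approximate the ensemble average denoted by the angular brackets.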
Q4. How long is the learning window in cortical pyramidal cells?
According to the results of Markram et al. [25], the width $W$ of the Hebbian learning window in cortical pyramidal cells is in the range of 100 ms.
Q5. How can the authors formulate the rules (i)–(iii)?
Denoting the input spike train at synapse $i$ by a series of $\delta$ functions, $S_i^{\mathrm{in}}(t) = \sum_f \delta(t - t_i^f)$, and, similarly, output spikes by $S^{\mathrm{out}}(t) = \sum_n \delta(t - t^n)$, the authors can formulate the rules (i)–(iii).
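With spike trains represented as lists of firing times, a pair-based spike-timing update can be sketched as below: each input/output spike pair $(t_i^f, t^n)$ contributes $W(t_i^f - t^n)$ to the weight change. The exponential learning window and its amplitudes here are placeholder assumptions for illustration, not the paper's fitted window:

```python
import numpy as np

# Placeholder learning window W(s), s = t_in - t_out (ms):
# input before output (s < 0) potentiates, input after output depresses.
def W(s, A_plus=1.0, A_minus=-1.0, tau=10.0):
    s = np.asarray(s, dtype=float)
    return np.where(s < 0,
                    A_plus * np.exp(s / tau),
                    A_minus * np.exp(-s / tau))

def weight_change(t_in, t_out):
    # Sum W over all pairs of input spikes t_i^f and output spikes t^n.
    s = np.subtract.outer(np.asarray(t_in), np.asarray(t_out))
    return float(W(s).sum())

# One input spike 9 ms before an output spike, one 2 ms after.
print(weight_change([1.0, 12.0], [10.0]))
```

The double sum over $f$ and $n$ is exactly the pairing of the two $\delta$-function trains under the learning window.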
Q6. How is Eq. (20) rewritten?
The authors rewrite Eq. (20) in the standard form $\dot{J}^{\mathrm{av}} = [J_*^{\mathrm{av}} - J^{\mathrm{av}}]/\tau^{\mathrm{av}}$, where $J_*^{\mathrm{av}} = -k_1/[N(k_2 + Q^{\mathrm{av}})]$ (21) is the fixed point for the average weight and $\tau^{\mathrm{av}} = J_*^{\mathrm{av}}/k_1 = -1/[N(k_2 + Q^{\mathrm{av}})]$ (22) is the time constant of normalization.
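These relations can be checked numerically with a simple Euler integration; the parameter values below are arbitrary placeholders chosen so that $k_2 + Q^{\mathrm{av}} < 0$, which makes $\tau^{\mathrm{av}} > 0$ and the fixed point stable:

```python
# Relaxation of the average weight toward its fixed point.
# k1, k2, Q_av, N are illustrative placeholders, not fitted parameters.
k1, k2, Q_av, N = 1.0, -0.02, 0.01, 100
J_star = -k1 / (N * (k2 + Q_av))   # fixed point, Eq. (21)
tau = J_star / k1                  # time constant, Eq. (22)

dt, J = 0.01 * tau, 0.0
for _ in range(2000):              # Euler steps of dJ/dt = (J* - J)/tau
    J += dt * (J_star - J) / tau

print(abs(J - J_star) < 1e-6 * abs(J_star))  # prints True: converged
```

After about 20 time constants the average weight has relaxed to $J_*^{\mathrm{av}}$ to within numerical tolerance, as the exponential solution of the linear equation predicts.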
Q7. What are the restrictions on the size of individual weights?
Even if the average weight $J^{\mathrm{av}}$ approaches a fixed point $J_*^{\mathrm{av}}$, there is no restriction on the size of individual weights, apart from $J_i \ge 0$ for excitatory synapses and $J_i \lesssim N J_*^{\mathrm{av}}$.
Q8. What is the simplest way to explain the effects of the k3 term?
A numerical example confirming the above theoretical considerations is presented in Fig. 9; simulation parameters are as given in Appendix B. Up to this point the authors have neglected the influence of the $k_3$ term in Eq. (15), which may lead to a stabilization of weight distributions, in particular when synapses are few and strong [22,54]; cf. Sec. IV D.