Hebbian plasticity requires compensatory processes on multiple timescales.
TL;DR
It is suggested that learning and memory rely on an intricate interplay of diverse plasticity mechanisms on different timescales, which jointly ensure the stability and plasticity of neural circuits.

Abstract
We review a body of theoretical and experimental research on Hebbian and homeostatic plasticity, starting from a puzzling observation: while homeostasis of synapses found in experiments is a slow c…
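The tension the review describes can be illustrated with a minimal sketch (not from the paper itself): a plain Hebbian rule lets synaptic weights grow without bound, while adding a fast multiplicative renormalization, standing in here for a rapid compensatory process, keeps the weight norm stable. The learning rate and input statistics below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: plain Hebbian updates diverge, whereas a fast
# compensatory renormalization (a stand-in for rapid homeostatic-like
# processes) keeps the weights bounded.
rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 10))      # presynaptic activity samples (assumed)
w_hebb = np.full(10, 0.1)            # plain Hebbian weights
w_norm = np.full(10, 0.1)            # Hebbian weights with fast normalization
eta = 0.01                           # learning rate (assumed)

for xi in x:
    y = w_hebb @ xi
    w_hebb += eta * y * xi           # dw = eta * post * pre (unstable)

    y = w_norm @ xi
    w_norm += eta * y * xi
    w_norm /= np.linalg.norm(w_norm) # rapid compensatory rescaling

print(np.linalg.norm(w_hebb))        # grows very large
print(np.linalg.norm(w_norm))        # held at 1 by the compensatory step
```

The renormalization must act on the same (fast) timescale as the Hebbian updates; applying it only every many updates lets the weights transiently run away, which is the temporal mismatch the review highlights.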
Citations
Journal Article
Integrating Hebbian and homeostatic plasticity: the current state of the field and future research directions
Tara Keck, Taro Toyoizumi, Lu Chen, Brent Doiron, Daniel E. Feldman, Kevin Fox, Wulfram Gerstner, Philip G. Haydon, Mark Hübener, Hey Kyoung Lee, John E. Lisman, Tobias Rose, Frank Sengpiel, David Stellwagen, Michael P. Stryker, Gina G. Turrigiano, Mark C. W. van Rossum, +17 more
TL;DR: The authors suggest the research directions most promising for developing an understanding of how these two forms of plasticity interact to facilitate functional changes in the brain.
Journal Article
The temporal paradox of Hebbian learning and homeostatic plasticity
TL;DR: It is suggested that homeostatic plasticity is complemented by additional rapid compensatory processes, which stabilize neuronal activity on short timescales.
Journal Article
Plasticity of intrinsic neuronal excitability.
TL;DR: The nature of the learning rules shared by intrinsic and synaptic plasticity and the impact of intrinsic plasticity on temporal processing are discussed.
Journal Article
Lifelong learning of human actions with deep neural network self-organization
TL;DR: Proposes a self-organizing neural architecture that incrementally learns to classify human actions from video sequences, using hierarchically arranged recurrent networks that learn action representations without supervision through increasingly large spatiotemporal receptive fields.
References
Journal Article
Neural networks and physical systems with emergent collective computational abilities
TL;DR: A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
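The content-addressable memory described above can be sketched in a few lines: a Hopfield-style network stores binary patterns in a symmetric weight matrix via the outer-product (Hebbian) rule and then recovers a full pattern from a corrupted cue. The network size, number of patterns, and corruption level below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal Hopfield-network sketch: store patterns with the Hebbian
# outer-product rule, then recall a full memory from a partial/corrupted cue.
rng = np.random.default_rng(1)
N = 100
patterns = rng.choice([-1, 1], size=(3, N))    # three stored memories (assumed)

W = sum(np.outer(p, p) for p in patterns) / N  # Hebbian outer-product weights
np.fill_diagonal(W, 0)                         # no self-connections

cue = patterns[0].copy()
cue[:15] *= -1                                 # corrupt 15% of the bits

state = cue
for _ in range(5):                             # synchronous update sweeps
    state = np.sign(W @ state)
    state[state == 0] = 1                      # break exact ties, if any

print(np.array_equal(state, patterns[0]))      # cue is completed to the memory
```

With only three stored patterns in 100 units, the network is well below capacity, so the dynamics settle into the stored pattern nearest the cue, which is the "entire memory from any subpart of sufficient size" behaviour the TL;DR describes.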
Journal Article
Receptive fields, binocular interaction and functional architecture in the cat's visual cortex
David H. Hubel, Torsten N. Wiesel, +1 more
TL;DR: This method is used to examine receptive fields of a more complex type and to make additional observations on binocular interaction; such an approach is necessary for understanding the behaviour of individual cells, but it does not address how one cell relates to its neighbours.
Book
Independent Component Analysis
TL;DR: Independent component analysis is a statistical generative model that gives a proper probabilistic formulation of the ideas underlying sparse coding and can be interpreted as providing a Bayesian prior.
Journal Article
Long-lasting potentiation of synaptic transmission in the dentate area of the anaesthetized rabbit following stimulation of the perforant path.
Timothy V. P. Bliss, T. Lømo, +1 more
TL;DR: The after‐effects of repetitive stimulation of the perforant path fibres to the dentate area of the hippocampal formation have been examined with extracellular micro‐electrodes in rabbits anaesthetized with urethane.
Journal Article
Emergence of simple-cell receptive field properties by learning a sparse code for natural images
TL;DR: It is shown that a learning algorithm that attempts to find sparse linear codes for natural scenes will develop a complete family of localized, oriented, bandpass receptive fields, similar to those found in the primary visual cortex.
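The inference half of sparse coding can be sketched compactly (this is not the paper's full learning algorithm, which also adapts the dictionary): given a fixed dictionary, find a sparse code for a signal by iterative soft-thresholding (ISTA). The dictionary size, sparsity penalty, and signal below are illustrative assumptions.

```python
import numpy as np

# Sketch of sparse-code inference via ISTA: minimize
#   0.5 * ||D a - x||^2 + lam * ||a||_1
# for a fixed dictionary D. Dictionary learning (adapting D) is omitted.
rng = np.random.default_rng(2)
D = rng.normal(size=(16, 32))
D /= np.linalg.norm(D, axis=0)            # unit-norm dictionary atoms

true_code = np.zeros(32)
true_code[[3, 17]] = [1.5, -2.0]          # a genuinely sparse signal (assumed)
x = D @ true_code

a = np.zeros(32)
lam = 0.05                                 # sparsity penalty (assumed)
step = 1.0 / np.linalg.norm(D, 2) ** 2     # step size from the Lipschitz bound
for _ in range(500):
    grad = D.T @ (D @ a - x)
    a = a - step * grad
    a = np.sign(a) * np.maximum(np.abs(a) - step * lam, 0.0)  # soft threshold

print(np.sum(np.abs(a) > 1e-3))           # the inferred code is sparse
print(np.linalg.norm(D @ a - x))          # with low reconstruction error
```

The soft-thresholding step is what drives most coefficients to exactly zero, yielding the localized, sparse codes whose learned dictionary atoms resemble cortical receptive fields in the paper.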