Author
O. P. Buneman
Bio: O. P. Buneman is an academic researcher at the University of Edinburgh. The author has contributed to research on the topics of recall and bidirectional associative memory, has an h-index of 2, and has co-authored 2 publications receiving 1,007 citations.
Papers
Abstract: The features of a hologram that commend it as a model of associative memory can be improved on by other devices.
981 citations
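The kind of "other device" that paper argues for can be illustrated with a binary correlation-matrix (Willshaw-style) associative memory, which stores pattern pairs by superimposing outer products in a single matrix. This is a minimal sketch under that assumption; the pattern sizes and the threshold rule are illustrative choices, not details taken from the paper.

```python
import numpy as np

def store(pairs, n_in, n_out):
    """Superimpose all pattern pairs into one binary weight matrix via outer-product OR."""
    W = np.zeros((n_out, n_in), dtype=bool)
    for x, y in pairs:
        W |= np.outer(y, x).astype(bool)
    return W

def recall(W, x):
    """Retrieve the associated pattern: a unit fires if it receives input from
    every active input line, i.e. the thresholded matrix-vector product."""
    return ((W.astype(int) @ x) >= x.sum()).astype(int)

# Two illustrative pattern pairs (binary vectors).
x1 = np.array([1, 0, 1, 0, 0, 1]); y1 = np.array([0, 1, 1, 0])
x2 = np.array([0, 1, 0, 1, 1, 0]); y2 = np.array([1, 0, 0, 1])

W = store([(x1, y1), (x2, y2)], n_in=6, n_out=4)
print(recall(W, x1))  # → [0 1 1 0], i.e. y1 is recovered
```

Because the traces are superimposed rather than stored separately, recall degrades gracefully as more pairs are packed into the matrix.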
TL;DR: Most of the models developed beyond their initial hypotheses have a certain family resemblance, and it seems as if the authors may now be in possession of the basic ideas which will be needed for the understanding of one of the central problems of memory, namely the mechanism of associative recall.
Abstract: The problem of how the brain stores and retrieves information is ultimately an experimental one, and its solution will doubtless call for the combined resources of psychology, physiology and molecular biology. But it is also a problem of great theoretical sophistication; and one of the major tasks confronting the brain scientist is the construction of theoretical models which are worthy of, and open to, experimental test. In this review we shall be concerned with the latter aspect of the problem of memory, which has attracted quite a lot of attention in the last few years. It is early yet to judge the relative merits of the various models in any detail; but as we shall see, most of those which have been developed beyond their initial hypotheses have a certain family resemblance, and it seems as if we may now be in possession of the basic ideas which will be needed for the understanding of one of the central problems of memory, namely the mechanism of associative recall.
66 citations
Cited by
01 Jan 1997
Abstract: From the Publisher: An Introduction to Neural Networks will be warmly welcomed by a wide readership seeking an authoritative treatment of this key subject without an intimidating level of mathematics in the presentation.
2,135 citations
01 Jan 1992
TL;DR: A speculative neurophysiological model illustrating how the backpropagation neural network architecture might plausibly be implemented in the mammalian brain for corticocortical learning between nearby regions of the cerebral cortex is presented.
Abstract: Publisher Summary This chapter presents a survey of the elementary theory of the basic backpropagation neural network architecture, covering the areas of architectural design, performance measurement, function approximation capability, and learning. The survey includes a formulation of the backpropagation neural network architecture to make it a valid neural network and a proof that the backpropagation mean squared error function exists and is differentiable. Also included in the survey is a theorem showing that any L2 function can be implemented to any desired degree of accuracy with a three-layer backpropagation neural network. An appendix presents a speculative neurophysiological model illustrating the way in which the backpropagation neural network architecture might plausibly be implemented in the mammalian brain for corticocortical learning between nearby regions of cerebral cortex. One of the crucial decisions in the design of the backpropagation architecture is the selection of a sigmoidal activation function.
1,729 citations
01 Jan 1989
TL;DR: A speculative neurophysiological model illustrating how the backpropagation neural network architecture might plausibly be implemented in the mammalian brain for corticocortical learning between nearby regions of the cerebral cortex is presented.
Abstract: The author presents a survey of the basic theory of the backpropagation neural network architecture covering architectural design, performance measurement, function approximation capability, and learning. The survey includes previously known material, as well as some new results, namely, a formulation of the backpropagation neural network architecture to make it a valid neural network (past formulations violated the locality of processing restriction) and a proof that the backpropagation mean-squared-error function exists and is differentiable. Also included is a theorem showing that any L2 function from (0,1)^n to R^m can be implemented to any desired degree of accuracy with a three-layer backpropagation neural network. The author presents a speculative neurophysiological model illustrating how the backpropagation neural network architecture might plausibly be implemented in the mammalian brain for corticocortical learning between nearby regions of the cerebral cortex.
1,668 citations
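The three-layer architecture with a sigmoidal activation that this survey analyzes can be sketched in a few lines. This is a minimal illustration, not the paper's formulation: the layer sizes, learning rate, iteration count, and the XOR task are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # The sigmoidal activation function discussed in the survey.
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative task: learn XOR with a 2-4-1 network.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 4))  # input -> hidden weights
W2 = rng.normal(0.0, 1.0, (4, 1))  # hidden -> output weights

for _ in range(5000):
    # Forward pass.
    H = sigmoid(X @ W1)
    out = sigmoid(H @ W2)
    # Backward pass: gradient of the mean-squared-error function
    # (up to a constant factor) through the sigmoid nonlinearities.
    delta_out = (out - Y) * out * (1 - out)
    delta_hid = (delta_out @ W2.T) * H * (1 - H)
    # Gradient descent step.
    W2 -= 1.0 * (H.T @ delta_out)
    W1 -= 1.0 * (X.T @ delta_hid)

print(sigmoid(sigmoid(X @ W1) @ W2).ravel())
```

The mean squared error is differentiable everywhere because the sigmoid is smooth, which is exactly the property the survey's differentiability proof concerns.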
TL;DR: Recent physiological recordings from sensory neurons have indicated that sparse coding could be a ubiquitous strategy employed in several different modalities across different organisms.
1,414 citations
TL;DR: A simple neuronal model capable of superimposing multiple memory traces within the same matrix of connections is outlined, and the correspondence between such models and the properties of LTE in the context of the hippocampal circuitry in which it occurs is considered.
1,254 citations