Open Access Journal Article DOI

Neural networks and physical systems with emergent collective computational abilities

John J. Hopfield
- 01 Apr 1982 - Vol. 79, Iss. 8, pp. 2554-2558
TLDR
A model of a system having a large number of simple equivalent components, based on aspects of neurobiology but readily adapted to integrated circuits, produces a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size.
Abstract
Computational properties of use to biological organisms or to the construction of computers can emerge as collective properties of systems having a large number of simple equivalent components (or neurons). The physical meaning of content-addressable memory is described by an appropriate phase space flow of the state of a system. A model of such a system is given, based on aspects of neurobiology but readily adapted to integrated circuits. The collective properties of this model produce a content-addressable memory which correctly yields an entire memory from any subpart of sufficient size. The algorithm for the time evolution of the state of the system is based on asynchronous parallel processing. Additional emergent collective properties include some capacity for generalization, familiarity recognition, categorization, error correction, and time sequence retention. The collective properties are only weakly sensitive to details of the modeling or the failure of individual devices.
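
The abstract describes the model only at a high level. The following is a minimal sketch of the standard formulation usually associated with this paper, assuming binary ±1 units, a Hebbian outer-product weight matrix with zero diagonal, and asynchronous threshold updates; function names, parameter values, and the random-seed choices are illustrative, not taken from the paper.

import numpy as np

def store_patterns(patterns):
    """Build a symmetric weight matrix from a stack of ±1 pattern vectors."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n          # Hebbian outer-product rule
    np.fill_diagonal(w, 0.0)               # no self-connections
    return w

def recall(w, probe, n_sweeps=10, seed=0):
    """Asynchronously update units, one at a time in random order, for a fixed number of sweeps."""
    rng = np.random.default_rng(seed)
    s = probe.copy()
    for _ in range(n_sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if w[i] @ s >= 0 else -1
    return s

# Usage: corrupt a subpart of a stored memory and let the dynamics restore it.
rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(3, 100))   # 3 random memories, 100 units
w = store_patterns(patterns)
probe = patterns[0].copy()
probe[:30] = rng.choice([-1, 1], size=30)       # damage part of the first memory
restored = recall(w, probe)
print(np.mean(restored == patterns[0]))         # fraction of bits recovered; typically near 1.0 at this low load

Starting from a probe that agrees with a stored pattern on a sufficient subpart, the asynchronous dynamics typically flow to that stored pattern, which is the content-addressable behaviour the abstract describes.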


Citations
Journal Article DOI

Reducing the Dimensionality of Data with Neural Networks

TL;DR: Describes an effective way of initializing the weights that allows deep autoencoder networks to learn low-dimensional codes that work much better than principal components analysis as a tool for reducing the dimensionality of data.
Journal Article DOI

Deep learning in neural networks

TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, reviewing deep supervised learning, unsupervised learning, reinforcement learning and evolutionary computation, and indirect search for short programs encoding deep and large networks.
Book Chapter DOI

Neural Networks for Pattern Recognition

TL;DR: The chapter discusses two important directions of research for improving learning algorithms: dynamic node generation, as used by the cascade-correlation algorithm, and the design of learning algorithms in which the choice of parameters is not an issue.
Journal Article DOI

An integrative theory of prefrontal cortex function

TL;DR: It is proposed that cognitive control stems from the active maintenance of patterns of activity in the prefrontal cortex that represent goals and the means to achieve them. These patterns provide bias signals to other brain structures whose net effect is to guide the flow of activity along neural pathways that establish the proper mappings between inputs, internal states, and outputs needed to perform a given task.
Book

Information Theory, Inference and Learning Algorithms

TL;DR: A fun and exciting textbook on the mathematics underpinning the most dynamic areas of modern science and engineering.
References
Journal Article DOI

A Theory of Cerebellar Cortex

TL;DR: A detailed theory of cerebellar cortex is proposed, the consequence of which is that the cerebellum learns to perform motor skills; two forms of input-output relation are described, both consistent with the cortical theory.
Book

Perceptrons: An Introduction to Computational Geometry

TL;DR: The aim of this book is to seek general results from the close study of abstract versions of devices known as perceptrons.
Book

Associative Memory: A System-Theoretical Approach

Teuvo Kohonen
TL;DR: A monograph developing a system-theoretical approach to associative memory, treating the distributed storage and recall of information.
Journal Article DOI

Neural theory of association and concept-formation

TL;DR: Primitive neural models of association and concept-formation are presented to elucidate the distributed and multiply superposed manner in which knowledge is retained in the brain.
Journal Article DOI

On associative memory.

TL;DR: The information storage capacity of certain associative and auto-associative memories is calculated, and the usefulness of associative memories, as opposed to conventional listing memories, is discussed, especially in connection with brain modelling.