Showing papers on "Bidirectional associative memory" published in 1986


Journal ArticleDOI
TL;DR: The author finds that the asymmetry in the synaptic strengths may be crucial for the process of learning.
Abstract: Studies the influence of a strong asymmetry of the synaptic strengths on the behavior of a neural network that works as an associative memory. The author finds that this asymmetry may be crucial for the learning process.
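
As a concrete illustration of the setting (a sketch, not the paper's model): the snippet below stores random ±1 patterns with a Hebbian rule, mixes in a tunable antisymmetric component controlled by a made-up `asymmetry` knob, and runs asynchronous recall. With asymmetric couplings no energy function exists, so convergence is no longer guaranteed.

```python
# Sketch of recall with asymmetric couplings (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)

def store(patterns):
    """Hebbian couplings J_ij = (1/N) sum_mu x_i^mu x_j^mu, zero diagonal."""
    n = patterns.shape[1]
    J = patterns.T @ patterns / n
    np.fill_diagonal(J, 0.0)
    return J

def asymmetrize(J, asymmetry):
    """Mix the symmetric Hebbian couplings with random antisymmetric noise."""
    A = rng.standard_normal(J.shape)
    A = (A - A.T) / np.sqrt(2 * J.shape[0])
    return (1 - asymmetry) * J + asymmetry * A

def recall(J, s, sweeps=20):
    """Asynchronous sign updates; with asymmetric J there is no energy
    function, so convergence is not guaranteed."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

patterns = rng.choice([-1, 1], size=(5, 100))          # 5 patterns, N = 100
J = asymmetrize(store(patterns), asymmetry=0.3)
noisy = patterns[0] * rng.choice([1, -1], p=[0.9, 0.1], size=100)
print("overlap after recall:", recall(J, noisy) @ patterns[0] / 100)
```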

164 citations


Journal ArticleDOI
TL;DR: The model of Hopfield for a neural network with associative memory is modified by the introduction of a maximum value for the synaptic strength; in this way old patterns are automatically forgotten and the memory recalls only the most recent ones.
Abstract: The model of Hopfield for a neural network with associative memory is modified by the introduction of a maximum value for the synaptic strength; in this way old patterns are automatically forgotten and the memory recalls only the most recent ones. If the parameters are correctly chosen, the memory never goes into the state of total confusion characteristic of the Hopfield model.
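
A minimal sketch of the bounded-synapse idea (the increment scale `eps` and bound `J_max` below are illustrative choices, not the paper's values): each new pattern is added as a Hebbian increment and every coupling is clipped to [-J_max, J_max], so old patterns fade rather than pushing the network into total confusion.

```python
# Palimpsest-style storage: Hebbian increments clipped at a maximum synaptic
# strength. eps and J_max are assumptions chosen to make the bound bite.
import numpy as np

def learn_with_forgetting(patterns, eps=1.0, J_max=0.3):
    n = patterns.shape[1]
    J = np.zeros((n, n))
    for x in patterns:                       # present patterns one at a time
        J += eps * np.outer(x, x) / np.sqrt(n)
        np.clip(J, -J_max, J_max, out=J)     # saturate synaptic strengths
    np.fill_diagonal(J, 0.0)
    return J

rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(30, 100))      # 30 patterns, N = 100
J = learn_with_forgetting(patterns)
# Fraction of correctly aligned bits per pattern; recent patterns should
# score near 1.0 while the oldest drift back toward chance (0.5).
stability = [(np.sign(J @ x) == x).mean() for x in patterns]
print([round(s, 2) for s in stability[::5]])        # oldest -> newest
```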

161 citations


Journal ArticleDOI
TL;DR: It is shown that the storage capacity of such networks is similar to that of the Hopfield network, and that it is not significantly affected by the restriction of keeping the couplings' signs constant throughout the learning phase.
Abstract: A new learning mechanism is proposed for networks of formal neurons analogous to Ising spin systems; it brings such models substantially closer to biological data in three respects: first, the learning procedure is applied initially to a network with random connections (which may be similar to a spin-glass system), instead of starting from a system void of any knowledge (as in the Hopfield model); second, the resultant couplings are not symmetrical; third, patterns can be stored without changing the sign of the coupling coefficients. It is shown that the storage capacity of such networks is similar to that of the Hopfield network, and that it is not significantly affected by the restriction of keeping the couplings' signs constant throughout the learning phase. Although this approach does not claim to model the central nervous system, it provides new insight into a frontier area between statistical physics, artificial intelligence, and neurobiology.
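
A hedged sketch of the sign-constrained learning idea (the perceptron-style correction, learning rate, and stopping rule are assumptions for illustration, not the authors' procedure): training starts from random, asymmetric couplings, and any coupling whose sign would flip during learning is projected back to zero.

```python
# Sign-constrained perceptron-style learning from random initial couplings.
import numpy as np

rng = np.random.default_rng(2)

def train_sign_constrained(patterns, sweeps=100, lr=0.1):
    p, n = patterns.shape
    J = rng.standard_normal((n, n)) / np.sqrt(n)   # random, asymmetric start
    np.fill_diagonal(J, 0.0)
    sign0 = np.sign(J)                             # signs to be preserved
    for _ in range(sweeps):
        errors = 0
        for x in patterns:
            wrong = (J @ x) * x <= 0               # units not aligned with x
            errors += wrong.sum()
            J[wrong] += lr * np.outer(x[wrong], x) / n   # perceptron step
            J[np.sign(J) == -sign0] = 0.0          # never flip a coupling's sign
            np.fill_diagonal(J, 0.0)
        if errors == 0:                            # every pattern is a fixed point
            break
    return J

patterns = rng.choice([-1, 1], size=(10, 100))     # 10 patterns, N = 100
J = train_sign_constrained(patterns)
print("all patterns stable:",
      all(((J @ x) * x > 0).all() for x in patterns))
```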

27 citations


Proceedings ArticleDOI
10 Dec 1986
TL;DR: Issues of orthogonal projection vectors and associative memory capacity, along with new results and techniques for synthesizing associative memories, are addressed throughout.
Abstract: Associative memories represent a major new artificial-intelligence type of processor. We consider their use in pattern recognition, with particular attention to distortion-invariant and adaptive pattern recognition. New associative memory techniques and pattern-recognition-oriented architectures suitable for multi-class distortion-invariant pattern recognition (including systems that provide adaptive updating and forgetting, and that achieve reduced dynamic range and improved performance) are discussed and initial results presented. The first results on distortion-invariant, multi-class associative memories for pattern recognition are presented, together with new architectures and algorithms for multi-stage associative processors, iterative processors for associative memory synthesis, and multi-class distortion-invariant associative processors. Issues of orthogonal projection vectors and associative memory capacity, along with new results and techniques for synthesizing associative memories, are addressed throughout.
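
One standard way to synthesize such a memory, shown here as a hedged sketch rather than the authors' algorithm, is orthogonal-projection (pseudoinverse) synthesis: the memory matrix is chosen so that each stored key maps onto its recollection in the least-squares sense. The keys, one-hot recollections, and sizes below are illustrative assumptions.

```python
# Orthogonal-projection (pseudoinverse) synthesis of an associative memory.
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((64, 10))     # 10 key vectors, dimension 64 (columns)
Y = np.eye(10)                        # one-hot class recollections

M = Y @ np.linalg.pinv(X)             # solves M X = Y in the least-squares sense

probe = X[:, 4] + 0.1 * rng.standard_normal(64)   # distorted key for class 4
print("recalled class:", np.argmax(M @ probe))    # expected: 4
```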

9 citations


Proceedings ArticleDOI
23 Mar 1986
TL;DR: A series of computer simulations performed on a 100-node Hopfield network examined the sources of confusion and led to a preprocessing approach which substantially reduces the confusion.
Abstract: The performance of an associative memory based on the Hopfield model of a neural network is data dependent. When programmed memories are too similar (a small Hamming distance between memories), the associative memory system is easily confused, settling either to incorrect or, in some cases, undefined states. This paper describes a series of computer simulations performed on a 100-node Hopfield network. The programs were written in the APL language, which is highly efficient for this type of system. The simulations examined the sources of confusion and led to a preprocessing approach which substantially reduces the confusion. The simulations were also extended in the direction of coupling several small neural networks to form one integrated low-confusion associative memory. The coupling of the neural subnetworks was through a voting scheme wherein each node of a subnetwork consulted the analogous nodes of the other subnetworks; the decision to change state or remain the same was based on majority rule. The performance of these two associative memory systems is detailed and compared to that of a conventional Hopfield system.
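
A minimal sketch of the voting scheme (the random dilution used to make the subnetworks differ and the synchronous schedule are assumptions; the paper's preprocessing and APL code are not reproduced here): each node adopts the majority decision of the analogous nodes across subnetworks.

```python
# Majority-rule coupling of several Hopfield subnetworks (illustrative).
import numpy as np

rng = np.random.default_rng(4)
N, K = 100, 3                                   # nodes per net, 3 subnetworks

def hebb(patterns):
    J = patterns.T @ patterns / patterns.shape[1]
    np.fill_diagonal(J, 0.0)
    return J

patterns = rng.choice([-1, 1], size=(5, N))
# Each copy keeps a random 80% of the couplings so the subnetworks differ
# (an assumption standing in for the paper's preprocessing step).
Js = [hebb(patterns) * (rng.random((N, N)) < 0.8) for _ in range(K)]
states = [patterns[0] * rng.choice([1, -1], p=[0.85, 0.15], size=N)
          for _ in range(K)]                    # independently corrupted probes

for _ in range(20):                             # synchronous voting sweeps
    proposals = [np.sign(J @ s + 1e-9) for J, s in zip(Js, states)]
    vote = np.sign(sum(proposals))              # each node follows the majority
    states = [vote.copy() for _ in range(K)]

print("overlap with stored pattern:", states[0] @ patterns[0] / N)
```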

5 citations


Journal ArticleDOI
TL;DR: A theory of the binary-pattern storage capacity of a quasi-neural network with threshold-logic units confirms results obtained by Hopfield in 1982 for patterns with a 50% density of active elements.
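
For context, the classical capacity estimates that this kind of theory addresses (well-known results for Hebbian storage of random ±1 patterns at 50% activity, quoted here for orientation rather than taken from the paper) are:

```latex
% Capacity of a Hopfield-type network of N threshold units:
P_{\max} \approx 0.138\,N \quad \text{(recall with a small fraction of bit errors)},
\qquad
P_{\max} \sim \frac{N}{2\ln N} \quad \text{(essentially error-free recall)}.
```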

5 citations


Proceedings ArticleDOI
01 Dec 1986
TL;DR: This paper proposes the application of an associative memory algorithm for the identification and isolation of system failures; the algorithm assumes that all possible failure modes are known and that their signatures have been stored in a random-access memory, with feedback used to provide a search algorithm.

Abstract: This paper proposes the application of an associative memory algorithm for the identification and isolation of system failures. It is assumed that the system is time-invariant and that, prior to the occurrence of the failure, all signals are stationary random processes. It is also assumed that the fact that a failure has occurred, and its time of occurrence, are both known from some other method. The associative memory algorithm assumes that all possible failure modes are known and that their signatures have been stored in a random-access memory; feedback is used to provide a search algorithm. The combination of the memory and the feedback loop produces a dynamic associative memory whose output is a "nearest neighbor" of the actual failure mode.
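
A hedged sketch of such a feedback loop (the correlation-based matcher, the gain, and the random signature bank are illustrative assumptions, not the paper's construction): the observed signal is repeatedly pulled toward the best-matching stored signature until the loop settles on a nearest neighbor.

```python
# Feedback-driven "nearest neighbor" search over a bank of failure signatures.
import numpy as np

rng = np.random.default_rng(5)
signatures = rng.standard_normal((8, 32))            # 8 known failure modes
signatures /= np.linalg.norm(signatures, axis=1, keepdims=True)

def identify(observed, steps=50, gain=0.5):
    s = observed / np.linalg.norm(observed)
    for _ in range(steps):
        best = signatures[np.argmax(signatures @ s)]  # closest stored signature
        s = (1 - gain) * s + gain * best              # feed the match back
        s /= np.linalg.norm(s)
    return np.argmax(signatures @ s)

observed = signatures[3] + 0.3 * rng.standard_normal(32)
print("identified failure mode:", identify(observed))   # expected: 3
```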

2 citations


Book ChapterDOI
02 Jul 1986

2 citations


Journal ArticleDOI
TL;DR: One form of dynamic memory has the highest storage capacity of any known network model of associative memory; the paper concludes with some implications of static and dynamic architectures.

Abstract: Associative memories are of two fundamental types: those that store representations of prototypical patterns (auto-associative memories) and those that store associations between pairs of arbitrary patterns (hetero-associative memories). Four network models of the latter type, each employing a single layer of linear threshold units, are presented. Two of these models maintain fixed arrangements of their components. The other two are dynamically self-organizing; they employ feedback about performance to guide changes in the organization of their components. These models are evaluated in terms of storage capacity, error tolerance, and storage-space efficiency. One form of dynamic memory has the highest storage capacity of any known network model of associative memory. A discussion of models by Anderson and Hopfield and some implications of static and dynamic architectures conclude the paper.
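
As a minimal sketch of the hetero-associative case (correlation storage with one-shot threshold recall; the pattern counts and sizes are illustrative assumptions, and the paper's four specific models are not reproduced): a single layer of linear threshold units maps a noisy input key onto its associated output pattern.

```python
# Correlation (Hebbian) hetero-associative memory with threshold recall.
import numpy as np

rng = np.random.default_rng(6)
X = rng.choice([-1, 1], size=(5, 80))     # 5 input patterns, 80 bits each
Y = rng.choice([-1, 1], size=(5, 40))     # associated 40-bit output patterns

W = Y.T @ X                               # W = sum_mu y^mu (x^mu)^T

probe = X[2] * rng.choice([1, -1], p=[0.9, 0.1], size=80)   # noisy key
recalled = np.sign(W @ probe)             # one-shot linear-threshold recall
print("output bits correct:", (recalled == Y[2]).mean())
```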

2 citations