scispace - formally typeset

Showing papers on "Bidirectional associative memory published in 2000"


Journal ArticleDOI
TL;DR: These conditions are stated in terms of the system parameters and are of practical significance for the design and application of periodic oscillatory neural circuits based on the BAM with delays.

Abstract: Periodic oscillatory phenomena of bidirectional associative memory (BAM) networks with axonal signal transmission delays are investigated by constructing suitable Lyapunov functionals and applying some analysis techniques. Some simple sufficient conditions are derived ensuring the existence and uniqueness of periodic oscillatory solutions of the BAM with delays, and all other solutions of the BAM converge exponentially to a periodic oscillatory solution. These conditions are stated in terms of the system parameters and are of practical significance for the design and application of periodic oscillatory neural circuits based on the BAM with delays.
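
For context, delayed BAM systems of the kind analyzed here are usually written in the following general form (a common formulation assumed for illustration; the paper's exact coefficients and delay structure may differ):

```latex
\begin{aligned}
\dot{x}_i(t) &= -a_i\,x_i(t) + \sum_{j=1}^{m} w_{ji}\, f_j\big(y_j(t-\tau_{ji})\big) + I_i(t), \qquad i = 1,\dots,n,\\
\dot{y}_j(t) &= -b_j\,y_j(t) + \sum_{i=1}^{n} v_{ij}\, g_i\big(x_i(t-\sigma_{ij})\big) + J_j(t), \qquad j = 1,\dots,m,
\end{aligned}
```

with decay rates a_i, b_j > 0, connection weights w_ji, v_ij, bounded activation functions f_j, g_i, transmission delays τ_ji, σ_ij, and periodic external inputs I_i(t), J_j(t); periodicity of the inputs is what gives rise to the periodic oscillatory solutions studied.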

165 citations


Journal ArticleDOI
TL;DR: The Lyapunov function method is utilized to analyze stability of continuous nonlinear neural networks with delays and obtain some new sufficient conditions ensuring the globally asymptotic stability independent of delays.

122 citations


Journal ArticleDOI
TL;DR: This paper establishes the proofs for all claims made about the choice of kernel vectors and perfect recall in kernel method applications and provides arguments for the success of both approaches beyond the experimental results presented up to this point.

75 citations


Journal ArticleDOI
TL;DR: It is shown that the Hamming attractive radius of each prototype reaches the maximum possible value, and the overall network design procedure is fully scalable in the sense that any number p ≤ 2^min{m,n} of bidirectional associations can be implemented.

Abstract: In contrast to conventional feedback bidirectional associative memory (BAM) network models, a feedforward BAM network is developed based on a one-shot design algorithm of O(p^2(n+m)) computational complexity, where p is the number of prototype pairs and n, m are the dimensions of the input/output bipolar vectors. The feedforward BAM is an n-p-m three-layer network of McCulloch-Pitts neurons with storage capacity 2^min{m,n} and guaranteed perfect bidirectional recall. The overall network design procedure is fully scalable in the sense that any number p ≤ 2^min{m,n} of bidirectional associations can be implemented. The prototype patterns may be arbitrarily correlated. With respect to inference performance, it is shown that the Hamming attractive radius of each prototype reaches the maximum possible value. Simulation studies and comparisons illustrate and support these theoretical developments.
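
For comparison, the conventional feedback BAM that this feedforward design is contrasted with can be sketched in a few lines (a minimal Kosko-style outer-product sketch, not the paper's one-shot design algorithm; function names are illustrative):

```python
import numpy as np

def bam_store(X, Y):
    """Kosko-style outer-product BAM weight matrix from bipolar pairs.
    X: (p, n) input patterns, Y: (p, m) output patterns, entries in {-1, +1}."""
    return X.T @ Y

def bam_recall(W, x, max_iters=10):
    """Iterate x -> y -> x bidirectionally until a stable pair is reached."""
    y = np.sign(x @ W)
    for _ in range(max_iters):
        x_new = np.sign(W @ y)
        y_new = np.sign(x_new @ W)
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break
        x, y = x_new, y_new
    return x, y
```

Outer-product encoding only guarantees recall for weakly correlated pattern pairs, which is precisely the limitation the paper's design removes for arbitrarily correlated prototypes.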

47 citations


Journal ArticleDOI
TL;DR: It is demonstrated how associative memory studies on different levels of abstraction can specify the functionality to be expected in real cortical neuronal circuits.
Abstract: The interplay between modelling and experimental studies can support the exploration of the function of neuronal circuits in the cortex. We exemplify such an approach with a study on the role of spike timing and gamma-oscillations in associative memory in strongly connected circuits of cortical neurones. It is demonstrated how associative memory studies on different levels of abstraction can specify the functionality to be expected in real cortical neuronal circuits. In our model, overlapping random configurations of sparse cell populations correspond to memory items that are stored by simple Hebbian coincidence learning. This associative memory task is implemented with the biophysically well-tested compartmental neurone model developed by Pinsky and Rinzel [58]. We ran simulation experiments to study memory recall in two network architectures: one interconnected pool of cells, and two reciprocally connected pools. When recalling a memory by stimulating a spatially overlapping set of cells, the completed pattern is coded by an event of synchronized single spikes occurring after 25–60 ms. These fast associations are performed even at a memory load corresponding to the memory capacity of optimally tuned formal associative networks (>0.1 bit/synapse). With tonic stimulation or feedback loops in the network the neurones fire periodically in the gamma-frequency range (20–80 Hz). With fast-changing inputs, memory recall can be switched between items within a single gamma cycle. Thus, oscillation is not a primary coding feature necessary for associative memory. However, it accompanies reverberatory feedback, providing an improved iterative memory recall completed after a few gamma cycles (60–260 ms). In the bidirectional architecture, reverberations do not settle into rigid phase locking between the pools. For small stimulation sets, bursting occurred in these cells, acting as a supportive mechanism for associative memory.

41 citations


Journal ArticleDOI
TL;DR: The domain of attraction of memory patterns and the exponential convergence rate of the network trajectories to memory patterns for Hopfield continuous associative memory are estimated and can be used for the evaluation of fault-tolerance capability and the synthesis procedures for Hopfield continuous feedback associative memory neural networks.

28 citations


Journal ArticleDOI
TL;DR: A new procedure to implement a recurrent neural network (RNN), based on a new approach to the well-known Hopfield autoassociative memory, with a very significant difference: the weights, which control the dynamics of the net, are obtained by coloring the graph G.

Abstract: This paper describes a new procedure to implement a recurrent neural network (RNN), based on a new approach to the well-known Hopfield autoassociative memory. In our approach an RNN is seen as a complete graph G, and the learning mechanism is also based on Hebb's law, but with a very significant difference: the weights, which control the dynamics of the net, are obtained by coloring the graph G. Once the training is complete, the synaptic matrix of the net will be the weight matrix of the graph. Any of these matrices fulfils certain spatial properties; for this reason they will be referred to as tetrahedral matrices. The geometrical properties of these tetrahedral matrices may be used for classifying the n-dimensional state-vector space into n classes. In the recall stage, a parameter vector is introduced, which is related to the capacity of the network. It may be shown that the bigger the value of the ith component of the parameter vector is, the lower the capacity of the [i] class of the state-vector space becomes. Once the capacity has been controlled, a new set of parameters that uses the statistical deviation of the prototypes to compare them with those that appear as fixed points is introduced, thus eliminating a great number of parasitic fixed points.

24 citations


Journal ArticleDOI
01 Jan 2000
TL;DR: A novel learning algorithm, superior to the conventional ones, is explained, and an associative memory suited for intelligent control is introduced, with its effectiveness shown by a number of computer simulations.

Abstract: In many industrial applications of soft computing, intelligent controls are important for accomplishing high-level tasks. Intelligent controls, however, need specific knowledge for each task, so developing a good memory is crucial to store the required knowledge efficiently and robustly. Neural network associative memories are the most suitable for this role because of their flexibility and content addressability. In this paper we first describe the basic concept of neural network associative memories and the conventional learning algorithms. After pointing out some problems of these associative memories, we explain a novel learning algorithm which is superior to the conventional ones. Finally, we introduce an associative memory suited for intelligent control and demonstrate its effectiveness through a number of computer simulations.

23 citations


Journal ArticleDOI
TL;DR: The constructive representation theorem provides a storing rule for a training set that allows a concept interpretation of patterns for bidirectional associative memory and a representation of hierarchical structures of concepts (concept lattices) by BAMs.
Abstract: This article presents a concept interpretation of patterns for bidirectional associative memory (BAM) and a representation of hierarchical structures of concepts (concept lattices) by BAMs. The constructive representation theorem provides a storing rule for a training set that allows a concept interpretation. Examples demonstrating the theorems are presented.

21 citations


Journal ArticleDOI
TL;DR: This work proposes a mechanism for storing and retrieving pairs of spatio-temporal sequences with the network architecture of the standard bidirectional associative memory (BAM), thereby achieving hetero-associations of spatiotemporal sequences.
Abstract: Autoassociations of spatio-temporal sequences have been discussed by a number of authors. We propose a mechanism for storing and retrieving pairs of spatio-temporal sequences with the network architecture of the standard bidirectional associative memory (BAM), thereby achieving heteroassociations of spatio-temporal sequences.

19 citations


Journal ArticleDOI
TL;DR: A neural network model of paired-associate learning based upon an auto-associative learning mechanism is developed that can replicate complex human behavioral data, but only when forward and backward learning are highly correlated.

Proceedings ArticleDOI
01 Dec 2000
TL;DR: A chaotic associative memory for successive learning (CAMSL) using internal patterns, where the learning process and the recall process are not divided and the CAMSL can learn the pattern successively.
Abstract: The authors propose a chaotic associative memory for successive learning (CAMSL) using internal patterns. In the CAMSL, the learning process and the recall process are not divided. When an unstored pattern is given to the network, the CAMSL can learn the pattern successively. The CAMSL distinguishes an unstored pattern from the stored patterns. When a stored pattern is given, the CAMSL recalls the pattern. When an unstored pattern is given, the CAMSL changes the internal pattern for the input pattern by chaos and presents the other pattern candidates. When the CAMSL cannot recall the desired pattern, it learns the input pattern as an unstored pattern. We carried out a series of computer simulations and confirmed the effectiveness of the CAMSL.

Journal ArticleDOI
TL;DR: An associative memory retrieval in a pulsed neural network composed of the FitzHugh-Nagumo neurons is investigated, and it is demonstrated that this pulsed network is capable of an alternate retrieval of two patterns.
Abstract: An associative memory retrieval in a pulsed neural network composed of the FitzHugh-Nagumo neurons is investigated. The memory is represented in the spatio-temporal firing pattern of the neurons, and the memory retrieval is accomplished using the fluctuations in the system. The storage capacity of the network is investigated numerically. It is demonstrated that this pulsed neural network is capable of an alternate retrieval of two patterns.


Journal ArticleDOI
TL;DR: Stochastic resonance in associative memory is discussed with a canonical neural network model that describes the generic behavior of a large family of dynamical systems near bifurcation and shows that stochastic resonance helps memory association.
Abstract: We discuss stochastic resonance in associative memory with a canonical neural network model that describes the generic behavior of a large family of dynamical systems near bifurcation. Our results show that stochastic resonance helps memory association. The relationship between stochastic resonance, associative memory, storage load, memory history and initial states is studied. In intelligent systems like neural networks, it is likely that stochastic resonance combined with synaptic information enhances memory recall.

01 Jan 2000
TL;DR: It is concluded that a symmetry constraint does not have any adverse effect on performance but that it does offer benefits in learning time and in network dynamics.

Abstract: Two existing high-capacity training rules for the standard Hopfield architecture associative memory are examined. Both rules, based on the perceptron learning rule, produce asymmetric weight matrices, for which the simple dynamics (only point attractors) of a symmetric network can no longer be guaranteed. This paper examines the consequences of imposing a symmetry constraint in learning. The mean size of attractor basins of trained patterns and the mean time for learning convergence are analysed for the networks that arise from these learning rules, in both the asymmetric and symmetric instantiations. It is concluded that a symmetry constraint does not have any adverse effect on performance but that it does offer benefits in learning time and in network dynamics.
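
A symmetry-constrained variant of perceptron-style Hopfield training can be sketched as follows (an illustrative reconstruction under stated assumptions, not the exact rules examined in the paper; every correction is mirrored so the weight matrix stays symmetric):

```python
import numpy as np

def train_symmetric_perceptron(patterns, epochs=100, lr=1.0):
    """Perceptron-style Hopfield training with an explicit symmetry
    constraint: every correction is applied to W[i, j] and W[j, i] alike,
    and self-connections are kept at zero."""
    p, n = patterns.shape
    W = np.zeros((n, n))
    for _ in range(epochs):
        stable = True
        for xi in patterns:
            h = W @ xi                      # local fields for this pattern
            for i in range(n):
                if xi[i] * h[i] <= 0:       # unit i is not stably aligned
                    stable = False
                    for j in range(n):
                        if j != i:
                            delta = lr * xi[i] * xi[j]
                            W[i, j] += delta
                            W[j, i] += delta
        if stable:                          # all patterns are fixed points
            break
    return W
```

With the mirrored update the matrix is symmetric by construction, so the network's energy function guarantees only point attractors, at a possible cost in capacity that the paper's analysis addresses.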

Proceedings ArticleDOI
Jinde Cao1
28 Jun 2000
TL;DR: The domain of attraction of memory patterns and the exponential convergence rate of the network trajectories to memory patterns for Hopfield continuous associative memory are estimated by means of a matrix measure and a comparison principle, and can be used for the evaluation of fault-tolerance capability and the synthesis procedures for Hopfield continuous feedback associative memory neural networks.
Abstract: The domain of attraction of memory patterns and exponential convergence rate of the network trajectories to memory patterns for Hopfield continuous associative memory are estimated by means of a matrix measure and a comparison principle. These results can be used for the evaluation of fault-tolerance capability and the synthesis procedures for Hopfield continuous feedback associative memory neural networks.

Proceedings ArticleDOI
28 May 2000
TL;DR: It is shown that because the dynamics of a pattern are not interfered with by patterns other than the target pattern, the associative memory ability of higher-order neural networks is superior to that of the traditional model.

Abstract: This paper describes the dynamics of the recall process for associative memory in higher-order neural networks using a statistical method. As a result, it is shown that because the dynamics of a pattern are not interfered with by patterns other than the target pattern, the associative memory ability of higher-order neural networks is superior to that of the traditional model.

Journal ArticleDOI
TL;DR: A new synthesis approach is developed for bidirectional associative memories implemented with feedback neural networks; the design is formulated as a set of linear inequalities which can be solved using the perceptron training algorithm.

Proceedings ArticleDOI
01 Jan 2000
TL;DR: The proposed KPICAM is based on an improved chaotic associative memory composed of chaotic neurons and has the following features: it can deal with knowledge represented in the form of a semantic network; it can deal with characteristics inheritance; it is robust to noisy input.

Abstract: In this paper, we propose a knowledge processing system using an improved chaotic associative memory (KPICAM). The proposed KPICAM is based on an improved chaotic associative memory (ICAM) composed of chaotic neurons. In the conventional chaotic neural network, when a stored pattern is given to the network continuously as an external input, the region around the input pattern is searched. The ICAM makes use of this property in order to separate superimposed patterns and to deal with many-to-many associations. In this research, the ICAM is applied to knowledge processing in which the knowledge is represented in the form of a semantic network. The proposed KPICAM has the following features: (1) it can deal with knowledge represented in the form of a semantic network; (2) it can deal with characteristics inheritance; (3) it is robust to noisy input. A series of computer simulations shows the effectiveness of the proposed system.

Journal ArticleDOI
TL;DR: A novel, high-speed realization of an inner-product processor for the multi-valued exponential bidirectional associative memory (MV-eBAM) is presented in order to reduce the carry propagation delay in the inner product of two vectors.

Abstract: Inner-product calculations are often required in digital neural computing, and the critical path of the inner product of two vectors is the carry propagation delay generated from the individual product terms. In this work, a novel, high-speed realization of an inner-product processor for the multi-valued exponential bidirectional associative memory (MV-eBAM) is presented in order to reduce this carry propagation delay. Notably, a systolic-like architecture of digital compressors is used to reduce the carry propagation delay in the critical path of the inner product of two vectors. The architecture we propose here might offer a sub-optimal solution for the digital hardware realization of inner-product computation.

Journal ArticleDOI
TL;DR: This paper shows that when structured maps are encoded into bidirectional associative memories using outer-product correlation encoding, the memory of these associations annihilates under certain mild conditions, and the centroidal association emerges as a stable association, and is called an alien attractor.
Abstract: Structured sets comprise Boolean vectors with equal pair-wise Hamming distances, h. An external vector, if it exists at an equidistance of h/2 from each vector of the structured set, is called the centroid of the set. A structured map is a one-one mapping between structured sets. It is a set of associations between Boolean vectors, where both domain and range vectors are drawn from structured sets. Associations between centroids are called centroidal associations. We show that when structured maps are encoded into bidirectional associative memories using outer-product correlation encoding, the memory of these associations is annihilated under certain mild conditions. When annihilation occurs, the centroidal association emerges as a stable association, and we call it an alien attractor. For the special case of maps where h=2, self-annihilation can take place when either the domain or range dimensions are greater than five. In fact, we show that for dimensions greater than eight, as few as three associations suffice for self-annihilation. As an example shows, annihilation occurs even for the case of bipolar decoding, which is well known for its improved error-correction capability in such associative memory models.

Proceedings ArticleDOI
24 Jul 2000
TL;DR: The results indicate that, indeed, a small-world approach can lead to networks with high performance and minimal interconnect requirements, and there is evidence that this strategy is widely used in the nervous system.
Abstract: "Small-world" networks is a term recently coined by Watts and Strogatz to describe networks which simultaneously exhibit a high degree of node clustering and short minimum path lengths between nodes. Such networks represent a very efficient architecture for achieving maximal internode communication with minimal connection length, a feature that is extremely important in highly connected physical networks, where interconnections consume most space. Neural networks, both in the brain and in hardware implementations, can benefit greatly from a small-world architecture, and there is evidence that this strategy is widely used in the nervous system. In this paper, we study the recall performance of associative memories with regard to their small-world characteristics. The results indicate that, indeed, a small-world approach can lead to networks with high performance and minimal interconnect requirements.
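
A Watts-Strogatz-style connectivity mask of the kind studied here can be generated with a short sketch (illustrative only; the rewiring details and the Hebbian weighting applied on top of the mask are assumptions, not the paper's exact construction):

```python
import numpy as np

def small_world_mask(n, k, p, seed=None):
    """Watts-Strogatz-style connectivity mask: a ring lattice with k
    neighbours on each side, where each lattice edge is rewired to a
    random node with probability p."""
    rng = np.random.default_rng(seed)
    A = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for offset in range(1, k + 1):
            j = (i + offset) % n            # regular lattice neighbour
            if rng.random() < p:            # rewire with probability p
                j = int(rng.integers(n))
                while j == i or A[i, j]:    # avoid self-loops and duplicates
                    j = int(rng.integers(n))
            A[i, j] = A[j, i] = True
    return A

def masked_hebbian_weights(patterns, mask):
    """Hebbian (outer-product) weights restricted to the sparse mask."""
    W = patterns.T @ patterns
    np.fill_diagonal(W, 0)
    return W * mask
```

Sweeping p from 0 (pure lattice) to 1 (random graph) traces out the small-world regime in which recall performance can stay high while total wiring stays short.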

Proceedings ArticleDOI
24 Jul 2000
TL;DR: A new method for the synthesis of neural networks with BAM (bidirectional associative memory) features, based on the ART structure, is presented, intended for pattern classification, which avoids the inherent defects of the BAM and its misclassifications with appropriate actions on the thresholds of the neurons of the ART layers.
Abstract: A new method for the synthesis of neural networks with BAM (bidirectional associative memory) features, based on the ART structure, is presented. Intended for pattern classification, it contains a new procedure for the correct usage of the relation matrix, and avoids the inherent defects of the BAM and its misclassifications with appropriate actions on the thresholds of the neurons of the ART layers. The results clearly indicate that this method leads to a good improvement in the performance that is achievable in a BAM, with a 0% error rate found in a test on the well-known NIST 19 character database.

Journal ArticleDOI
TL;DR: The capacities of various eBAMs are derived and compared to solve classification problems such as pattern recognition, data compression, etc.
Abstract: Bidirectional associative memory (BAM) has been widely used to solve classification problems such as pattern recognition, data compression, etc. Since Jeng et al. (1990) proposed the exponential BAM (eBAM), which has a high storage capacity and error-correction capability as well as being simple to implement, a great deal of research has gone into further improving its performance. The capacities of various eBAMs are derived and compared.
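
The exponential BAM recall referred to here weights each stored pair by an exponential of its similarity to the probe; a minimal one-pass sketch (assuming the standard Jeng-style formulation, with illustrative names):

```python
import numpy as np

def ebam_recall(X, Y, x, b=2.0):
    """One forward pass of exponential BAM recall: each stored pair
    (X[k], Y[k]) is weighted by b raised to the correlation X[k].x,
    so the closest stored pattern dominates the weighted vote."""
    weights = np.power(float(b), X @ x)      # shape (p,)
    return np.sign(weights @ Y.astype(float))
```

Because the weighting is exponential in the correlation, even a modest similarity gap lets the best-matching pair dominate, which is the source of the eBAM's high capacity and error correction.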

Proceedings ArticleDOI
24 Jul 2000
TL;DR: A novel method of pattern recognition using Polynomial Bidirectional Hetero-Correlator (PBHC) is proposed and simulation results show that the new scheme displays superior storage capacity over other BAM-like associative memories and fuzzy Associative memories.
Abstract: A novel method of pattern recognition using Polynomial Bidirectional Hetero-Correlator (PBHC) is proposed. Simulation results show that the new scheme displays superior storage capacity over other BAM-like associative memories and fuzzy associative memories.

Proceedings ArticleDOI
28 Jun 2000
TL;DR: Several sufficient conditions guaranteeing the network's global asymptotic stability are derived; the results are of practical significance for the design and application of BAM networks.

Abstract: Bi-directional associative memory (BAM) models are two-layer heteroassociative networks. In this paper, global asymptotic stability is studied for continuous bi-directional associative memory neural networks with axonal signal transmission delay, where the neuronal output signal function S is not assumed to be differentiable or strictly monotone increasing. Several sufficient conditions guaranteeing the network's global asymptotic stability are derived; the obtained results are of practical significance for the design and application of BAM networks.

Journal ArticleDOI
TL;DR: It is demonstrated that semantic networks can be well represented by the adaptive associative memories proposed by Ma; the representation is flexible in the sense that knowledge can easily be modified using one-shot relearning, and the generalization of knowledge is a basic system property.

Journal ArticleDOI
TL;DR: A new associative memory model that stores arbitrary bipolar patterns without the problems the authors can find in other models like BAM or LAM is presented and its learning and recall stages are explained.
Abstract: We present a new associative memory model that stores arbitrary bipolar patterns without the problems we can find in other models like BAM or LAM. After identifying those problems, we show the new memory topology and explain its learning and recall stages. Mathematical demonstrations are provided to prove that the new memory model guarantees the perfect retrieval of every stored pattern, and also to prove that whatever the input of the memory is, it operates as a nearest neighbor classifier.
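
The nearest-neighbour behaviour claimed for the model can be pictured with a generic Hamming-distance recall sketch (this is the generic classifier, not the paper's specific memory topology):

```python
import numpy as np

def nearest_neighbour_recall(patterns, x):
    """Recall as nearest-neighbour classification: return the stored
    bipolar pattern with the smallest Hamming distance to the input."""
    distances = np.count_nonzero(patterns != x, axis=1)
    return patterns[np.argmin(distances)]
```

A memory that provably behaves like this classifier for every input is the strongest retrieval guarantee an associative memory can offer: no spurious attractors, and every probe lands on its closest stored pattern.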

Journal ArticleDOI
TL;DR: A new neural network algorithm based on the counter‐propagation network (CPN) architecture, named MVL‐CPN, is proposed in this paper for bidirectional mapping and recognition of multiple‐valued patterns.
Abstract: A new neural network algorithm based on the counter‐propagation network (CPN) architecture, named MVL‐CPN, is proposed in this paper for bidirectional mapping and recognition of multiple‐valued patterns. The MVL‐CPN is capable of performing a mathematical mapping of a set of multiple‐valued vector pairs by self‐organization. The use of the MVL‐CPN considerably reduces the number of nodes required for the input layers, as well as the number of synaptic weights, compared to the binary CPN. The training of the network is stable because all synaptic weights are monotonically nonincreasing. The bidirectional mapping and associative recall features of the MVL‐CPN are tested by using various sets of quaternary patterns. It is observed that the MVL‐CPN can converge within three or four iterations. The high‐speed convergence characteristics of the network can lead to the possibility of using this architecture for real‐time applications. An important advantage of the proposed type of neural network is that it can be implemented in VLSI with a reduced number of neurons and synaptic weights compared to the larger binary network needed for the same application. © 2000 John Wiley & Sons, Inc. Int J Imaging Syst Technol 11, 125–129, 2000