
Showing papers on "Bidirectional associative memory published in 1998"


Journal ArticleDOI
TL;DR: By introducing a scaling model, it is demonstrated that a network approaching experimentally reported neuron numbers and synaptic distributions could also work like the model studied here, which has become sufficiently detailed to allow evaluation against electrophysiological and anatomical observations.
Abstract: An attractor network model of cortical associative memory functions has been constructed and simulated. By replacing the single cell as the functional unit by multiple cells in cortical columns connected by long-range fibers, the model is improved in terms of correspondence with cortical connectivity. The connectivity is improved, since the original dense and symmetric connectivity of a standard recurrent network becomes sparse and asymmetric at the cell-to-cell level. Our simulations show that this kind of network, with model neurons of the Hodgkin-Huxley type arranged in columns, can operate as an associative memory in much the same way as previous models having simpler connectivity. The network shows attractor-like behaviour and performs the standard assembly operations despite differences in the dynamics introduced by the more detailed cell model and network structure. Furthermore, the model has become sufficiently detailed to allow evaluation against electrophysiological and anatomical observations. For instance, cell activities comply with experimental findings and reaction times are within biological and psychological ranges. By introducing a scaling model, we demonstrate that a network approaching experimentally reported neuron numbers and synaptic distributions could also work like the model studied here.

82 citations


Journal ArticleDOI
TL;DR: It is proved that all the equilibrium (fixed) points of CDBAM correspond to local energy minima so that the design problem can be solved by a gradient descent algorithm.

45 citations


Journal ArticleDOI
01 Aug 1998
TL;DR: A general model for bidirectional associative memories that associate patterns between the X-space and the Y-space is proposed and an algorithm for learning the asymptotic stability conditions using the Rosenblatt perceptron rule is developed.
Abstract: This paper proposes a general model for bidirectional associative memories that associate patterns between the X-space and the Y-space. The general model does not require the usual assumption that the interconnection weight from a neuron in the X-space to a neuron in the Y-space is the same as the one from the Y-space to the X-space. We start by defining a supporting function to measure how well a state supports another state in a general bidirectional associative memory (GBAM). We then use the supporting function to formulate the associative recalling process as a dynamic system, explore its stability and asymptotic stability conditions, and develop an algorithm for learning the asymptotic stability conditions using the Rosenblatt perceptron rule. The effectiveness of the proposed model for recognition of noisy patterns and the performance of the model in terms of storage capacity, attraction, and spurious memories are demonstrated by some outstanding experimental results.
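To make the recall mechanism concrete, the following is a minimal sketch (not the authors' implementation) of the core idea, assuming bipolar patterns and two independent weight matrices Wxy and Wyx trained with the Rosenblatt perceptron rule so that each stored pair becomes a fixed point of the bidirectional recall loop; all names and parameters are illustrative.

import numpy as np

def sgn(v):
    # bipolar sign with sgn(0) = +1
    return np.where(v >= 0, 1, -1)

def train_gbam(X, Y, epochs=100, lr=1.0):
    # X: (m, n) bipolar patterns in the X-space, Y: (m, p) in the Y-space.
    # Two independent matrices are trained; Wyx is NOT constrained to be Wxy.T.
    n, p = X.shape[1], Y.shape[1]
    Wxy = np.zeros((p, n))
    Wyx = np.zeros((n, p))
    for _ in range(epochs):
        converged = True
        for x, y in zip(X, Y):
            ey = (sgn(Wxy @ x) != y)          # components of y not yet supported
            ex = (sgn(Wyx @ y) != x)          # components of x not yet supported
            Wxy += lr * np.outer(ey * y, x)   # Rosenblatt perceptron correction
            Wyx += lr * np.outer(ex * x, y)
            converged &= not (ey.any() or ex.any())
        if converged:
            break
    return Wxy, Wyx

def recall(Wxy, Wyx, x, steps=20):
    # bidirectional recall: alternate X->Y and Y->X until a fixed point is reached
    y = sgn(Wxy @ x)
    for _ in range(steps):
        x_new = sgn(Wyx @ y)
        y_new = sgn(Wxy @ x_new)
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break
        x, y = x_new, y_new
    return x, y

Dropping the symmetry constraint Wyx = Wxy.T is what distinguishes this setting from the classical Kosko BAM, at the cost of having to establish stability separately, as the paper does.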

44 citations


Proceedings ArticleDOI
04 May 1998
TL;DR: A chaotic associative memory that can recall correct stored patterns from superimposed input and deal with many-to-many associations is proposed.
Abstract: We propose a chaotic associative memory (CAM). It has two distinctive features: 1) it can recall correct stored patterns from superimposed input; and 2) it can deal with many-to-many associations. As for the first feature, when a stored pattern is given to the conventional chaotic neural network as an external input, the input pattern is continuously searched. The proposed model makes use of the above property to separate the superimposed patterns. As for the second feature, most of the conventional associative memories cannot deal with many-to-many associations due to the superimposed pattern caused by the stored common data. However, since the proposed model can separate the superimposed pattern, it can deal with many-to-many associations. A series of computer simulations shows the effectiveness of the proposed model.
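The abstract refers to "the conventional chaotic neural network"; the sketch below illustrates Aihara-type chaotic neuron dynamics with an external input term, which is one common way such wandering retrieval is realized. It is offered only as an assumed stand-in for the authors' model; all parameter values and names are illustrative.

import numpy as np

def hebbian_weights(P):
    # Correlation (Hebbian) autoassociative weights for bipolar patterns P: (m, n)
    W = (P.T @ P).astype(float) / P.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def chaotic_recall(W, external, steps=200, kf=0.2, kr=0.9, alpha=10.0, a=2.0, eps=0.015):
    # Aihara-style chaotic network: the external input drives the state to wander
    # among stored patterns, which is the property exploited to separate superimposed input.
    n = W.shape[0]
    eta = np.zeros(n)    # feedback internal state
    zeta = np.zeros(n)   # refractory internal state
    x = np.zeros(n)      # analogue outputs in (0, 1)
    trajectory = []
    for _ in range(steps):
        eta = kf * eta + W @ x + external      # recurrent drive plus external input
        zeta = kr * zeta - alpha * x + a       # refractoriness
        x = 1.0 / (1.0 + np.exp(-(eta + zeta) / eps))
        trajectory.append(np.where(x > 0.5, 1, -1))   # binarized snapshot of the visit
    return trajectory

Inspecting which stored patterns appear among the binarized snapshots is, in this illustrative setting, how a superimposed input can be decomposed into its components.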

39 citations


Journal ArticleDOI
TL;DR: A multilayered reasoning method associated with a multilayered structured CFS is proposed, which has the following features: (1) capable of simultaneous symbolic and quantitative processing, (2) capable of simultaneous top-down and bottom-up processing.
Abstract: The real world consists of instances of events and continuous numeric values, while people represent and process their knowledge in terms of symbols. Fuzzy sets provide a strong notation connecting the symbolic representation to the real world. In previously proposed Conceptual Fuzzy Sets (CFS), the meaning of a concept is represented by the distribution of activations of labels in a bidirectional associative memory. In particular, a multilayered structured CFS represents the meaning of the same concept as it is used in various expressions in each layer. The propagation of activations corresponds to reasoning. Therefore, we propose a multilayered reasoning method associated with a multilayered structured CFS, which has the following features: (1) capable of simultaneous symbolic and quantitative processing, (2) capable of simultaneous top-down and bottom-up processing. The effectiveness of the proposed method is illustrated by practical examples of deciding the amount of steering in the task of parking a car, and of recognizing facial expressions for an image understanding system.

35 citations


Journal ArticleDOI
TL;DR: The bidirectional associative memory connectionist neural network model provides a theoretical basis for explaining the clinical symptom constellation of PTSD, with special emphasis on why trauma is re-experienced through memory.

29 citations


Journal ArticleDOI
TL;DR: In order to memorize and recall such very complicated training data, the (MMA)2 employs pseudo-noise patterns, transformation of distributed patterns into locally represented patterns, and logical operations to avoid producing a mixed unknown pattern.

27 citations


Journal ArticleDOI
TL;DR: It is shown in this letter that multistep retrieval can be realized with conventional Discrete Bidirectional Associative Memories (DBAMs) by using a modified Minimum Overlap Algorithm (MMOA) to train the weight matrix of the DBAM.

24 citations


Journal ArticleDOI
TL;DR: Two ECG processing techniques are described for the classification of QRSs, PVCs and normal and ischaemic beats, and the results show that this method, if properly calibrated, can result in a fast and reliable ischaemic beat detection algorithm.
Abstract: Two ECG processing techniques are described for the classification of QRSs, PVCs and normal and ischaemic beats. The techniques use neural network (NN) technology in two ways. The first technique uses nonlinear ECG mapping for preprocessing and then a shrinking algorithm based on NNs for classification; it is applied to the QRS/PVC problem with good results. The second technique is based on the Bidirectional Associative Memory (BAM) NN and is used to distinguish normal from ischaemic beats. In this technique the ECG beat is treated as a digitized image which is then transformed into a bipolar vector suitable for input to the BAM. The results show that this method, if properly calibrated, can yield a fast and reliable ischaemic beat detection algorithm.
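A minimal sketch of the general scheme the abstract describes for the second technique, not the authors' exact preprocessing: a digitized beat image is thresholded into a bipolar vector and associated with a class code via a standard Kosko-style BAM. Threshold, sizes, and class coding are illustrative assumptions.

import numpy as np

def beat_image_to_bipolar(beat_image, threshold=0.5):
    # beat_image: 2-D array of pixel intensities in [0, 1] representing one ECG beat
    return np.where(beat_image.ravel() >= threshold, 1, -1)

def train_bam(X, Y):
    # Kosko correlation encoding: W = sum_k y_k x_k^T
    return sum(np.outer(y, x) for x, y in zip(X, Y))

def classify_beat(W, x, steps=10):
    sgn = lambda v: np.where(v >= 0, 1, -1)
    y = sgn(W @ x)
    for _ in range(steps):            # settle bidirectionally before reading out
        x = sgn(W.T @ y)
        y_new = sgn(W @ x)
        if np.array_equal(y_new, y):
            break
        y = y_new
    return y   # bipolar class code, e.g. distinguishing normal from ischaemic beats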

24 citations


13 Mar 1998
TL;DR: In this paper, a chaotic neural network model is proposed for chaotic auto-associative memory; it is characterized by a time-dependent periodic activation function that introduces chaotic dynamics as well as an energy steepest-descent strategy.
Abstract: In this paper we propose a novel chaotic neural network model applied to chaotic auto-associative memory. The artificial neuron model is characterized by a time-dependent periodic activation function, which introduces chaotic dynamics alongside an energy steepest-descent strategy. It is shown that the present neural network has a remarkable ability for dynamic memory retrieval, beyond that of conventional models with non-monotonic activation functions as well as models with monotonic activation functions such as the sigmoid. This advantage is found to result from the analogue periodic mapping accompanied by chaotic behaviour of the neurons. It is also concluded that the present analogue neuron model with periodicity control has an apparently larger memory capacity than previously proposed association models.
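The listing does not reproduce the paper's activation function, so the following is only a rough illustration of the idea of a time-dependent periodic activation combined with energy-descent recurrent dynamics; the sinusoidal modulation and all parameters are assumptions made for illustration, not the authors' definition.

import numpy as np

def periodic_activation(u, t, omega=0.05):
    # Illustrative time-dependent periodic activation: a squashed local field is
    # wrapped by a periodic (sinusoidal) map, so the input-output mapping itself
    # varies periodically with the time step t.
    return np.sin(np.tanh(u) * np.pi / 2 + omega * t)

def retrieve(W, x0, steps=500):
    # Recurrent update driven by the local field u = W x (the steepest-descent
    # direction of the quadratic energy E = -x^T W x / 2), passed through the
    # time-periodic activation above.
    x = x0.astype(float)
    for t in range(steps):
        u = W @ x
        x = periodic_activation(u, t)
    return np.where(x >= 0, 1, -1)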

21 citations


PatentDOI
Pentti O. A. Haikonen1
28 May 1998
TL;DR: In this paper, an associative artificial neural network comprises a number of associative neurons (N1 to Nm), each of which has associative inputs for receiving associative signals (A111 to Ajql) and a concept input for receiving a concept input signal (CS1 to CSm).
Abstract: An associative artificial neural network comprises a number of associative neurons (N1 to Nm), each one of which has a number of associative inputs for receiving associative signals (A111 to Ajql) and a concept input for receiving a concept input signal (CS1 to CSm). In the invention, the neural network also comprises memory means (M11 to Mjq), which are arranged to convert the temporally sequential input signals (G11 to G1k) of the neural network to temporally parallel signals (A111 to Ajql), which are operatively connected to the associative inputs of the neurons (N1 to Nm). Further, the neural network comprises selection means (SELa), which are arranged to select the output signal (O1 to Om) on the basis of at least one predetermined criterion from only some of the neurons (N1 to Nm), preferably from at most one neuron at a time.

01 Jan 1998
TL;DR: The QuAM makes use of two quantum computational algorithms, one for pattern storage and the other for pattern recall, resulting in an exponential increase in the capacity of the memory when compared to traditional associative memories such as the Hopfield network.
Abstract: This paper discusses an approach to constructing an artificial quantum associative memory (QuAM). The QuAM makes use of two quantum computational algorithms, one for pattern storage and the other for pattern recall. The result is an exponential increase in the capacity of the memory when compared to traditional associative memories such as the Hopfield network. Further, the paper argues for considering pattern recall as a non-unitary process and demonstrates the utility of non-unitary operators for improving the pattern recall performance of the QuAM.
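For orientation, the storage/recall asymmetry the abstract points to can be stated compactly; the notation below is assumed here, not quoted from the paper. Storing m binary patterns p_k of n bits amounts to preparing the superposition

\[ |\psi\rangle = \frac{1}{\sqrt{m}} \sum_{k=1}^{m} |p_k\rangle, \qquad p_k \in \{0,1\}^n, \]

so n qubits can in principle hold on the order of 2^n patterns, compared with O(n) patterns for an n-unit Hopfield network, while recall of a pattern consistent with a partial cue takes on the order of \(\sqrt{2^n/r}\) Grover-type iterations, where r is the number of stored patterns matching the cue.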

Journal Article
TL;DR: The stability of the MVeMDAM is proven in synchronous and asynchronous update modes for neuron states, which ensures that all the training pattern sets become stable states of the system.
Abstract: MDAM (multidirectional associative memory) is a direct extension of Kosko's BAM (bidirectional associative memory). It can be applied to data fusion and to splitting high-dimensional input patterns in order to ease certain problems. At present, the existing multidirectional models deal only with binary input-output patterns or data. However, some patterns in applications such as image processing and pattern recognition are represented in a multivalued mode, so the above models face processing difficulties. The purpose of this paper is to present the MVeMDAM (multivalued exponential multidirectional associative memory) to partially resolve these difficulties. In this paper, the stability of the MVeMDAM is proven in synchronous and asynchronous update modes for neuron states, which ensures that all the training pattern sets become stable states of the system. Finally, computer simulation results confirm the feasibility of the proposed model.


Book ChapterDOI
TL;DR: Evolutionary computations are applied to the Hopfield neural network model of associative memory in order to learn the geometry of a fitness landscape defined on the synaptic weight space.
Abstract: We apply evolutionary computations to the Hopfield neural network model of associative memory. In the model, some of the appropriate configurations of synaptic weights give the network the function of an associative memory. One of our goals is to obtain the distribution of these configurations in the synaptic weight space. In other words, our aim is to learn the geometry of a fitness landscape defined on that space. For this purpose, we use evolutionary walks to explore the fitness landscape in this paper.
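A minimal sketch of such an evolutionary walk, under assumptions of my own choosing (the paper's exact fitness and mutation operator are not given in this listing): fitness is taken as the fraction of target patterns that are fixed points of the network, and the walk accepts symmetric weight perturbations that do not decrease fitness.

import numpy as np

def fitness(W, patterns):
    # fraction of target patterns that are fixed points of one synchronous update
    sgn = lambda v: np.where(v >= 0, 1, -1)
    return np.mean([np.array_equal(sgn(W @ p), p) for p in patterns])

def evolutionary_walk(patterns, n, steps=2000, sigma=0.05, seed=None):
    rng = np.random.default_rng(seed)
    W = np.zeros((n, n))
    best = fitness(W, patterns)
    visited = [(W.copy(), best)]           # samples of the fitness landscape
    for _ in range(steps):
        dW = sigma * rng.standard_normal((n, n))
        dW = (dW + dW.T) / 2               # keep weights symmetric, as in Hopfield nets
        np.fill_diagonal(dW, 0.0)
        cand = W + dW
        f = fitness(cand, patterns)
        if f >= best:                      # uphill / neutral move on the landscape
            W, best = cand, f
        visited.append((W.copy(), best))
    return visited

The list of visited configurations and fitness values is the raw material from which the geometry of the landscape (e.g. the distribution of high-fitness weight configurations) can be studied.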

Journal ArticleDOI
01 Jul 1998
TL;DR: The authors prove the stability of the exponential bidirectional associative memory (eBAM) and obtain the absolute lower bound on its radix.
Abstract: The exponential bidirectional associative memory (eBAM) is a high-capacity associative memory. In the hardware realisation of the eBAM, however, effort must be made to obtain an optimally small radix for the exponential circuit given the fixed dynamic range of the VLSI circuit transistors, thereby allowing the dimension of the stored patterns to be maximised. In this paper, the authors prove the stability of the eBAM. The absolute lower bound on the radix of the eBAM is also obtained. In addition, an algorithm is presented to compute the optimal radix of an exponential circuit. To preserve the optimality of the radix, an algorithm capable of updating the radix when new pattern pairs are to be installed is proposed. Moreover, a deterministic method is presented to train and install pattern pairs with a predetermined fault-tolerance ability.
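For context, the eBAM recall rule commonly used in this literature (assumed here, since the abstract does not restate it) weights each stored pair by an exponential of its correlation with the probe; the radix b analysed in the paper is the base of that exponential.

import numpy as np

def ebam_recall_y(X, Y, x, b=2.0):
    # X: (m, n) stored X-patterns, Y: (m, p) stored Y-patterns, x: bipolar probe.
    # Each stored pair is weighted by b ** <x_k, x>; a larger radix b sharpens recall
    # but demands a wider dynamic range from the exponential circuit.
    weights = b ** (X @ x).astype(float)
    return np.where(Y.T @ weights >= 0, 1, -1)

def ebam_recall_x(X, Y, y, b=2.0):
    weights = b ** (Y @ y).astype(float)
    return np.where(X.T @ weights >= 0, 1, -1)

Bidirectional recall alternates these two maps until the (x, y) pair stops changing; the paper's contribution is the stability proof and the smallest radix b for which such recall remains reliable.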

Journal ArticleDOI
01 Apr 1998
TL;DR: A novel data compression algorithm utilizing the histogram and the high-capacity exponential bidirectional associative memory (eBAM) is presented.
Abstract: A novel data compression algorithm utilizing the histogram and the high-capacity exponential bidirectional associative memory (eBAM) is presented. Since the eBAM has been proved to possess high capacity and fault tolerance, it is well suited to data compression using a table-lookup scheme. The histogram approach is employed to extract the feature vectors from the given data. Simulation results show that the proposed algorithm outperforms traditional methods.

Book ChapterDOI
01 Dec 1998
TL;DR: While the straightforward BAM extension of the Willshaw model does not improve performance at high memory load, a new bidirectional recall method (CB-retrieval) is proposed that accesses patterns with highly improved fault tolerance and also allows segmentation of ambiguous input.
Abstract: Reciprocal pathways are presumably the dominant wiring organization for cortico-cortical long-range projections [5]. This paper examines the hypothesis that synaptic modification and activation flow in a reciprocal cortico-cortical pathway correspond to learning and retrieval in a bidirectional associative memory (BAM): unidirectional activation flow may provide a fast estimate of stored information, whereas bidirectional activation flow might establish an improved recall mode. The idea is tested in a network of binary neurons where pairs of sparse memory patterns have been stored in bidirectional synapses by fast Hebbian learning (Willshaw model). We assume that cortical long-range connections shall be used efficiently, i.e., in many different hetero-associative projections, corresponding in technical terms to a high memory load. While the straightforward BAM extension of the Willshaw model does not improve the performance at high memory load, a new bidirectional recall method (CB-retrieval) is proposed that accesses patterns with highly improved fault tolerance and also allows segmentation of ambiguous input. The improved performance is demonstrated in simulations. The consequences and predictions of such a cortico-cortical pathway model are discussed. A brief outline of the relations between a theory of modular BAM operation and common ideas about cell assemblies is given.
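A sketch of the bidirectional Willshaw setting the paper starts from: sparse binary pattern pairs stored by clipped Hebbian learning and retrieved by thresholded matrix-vector products. The improved CB-retrieval itself is not reproduced here; the activity levels k_x, k_y and the naive bidirectional loop are illustrative assumptions.

import numpy as np

def store_willshaw(X, Y):
    # Clipped (binary) Hebbian learning for sparse binary pattern pairs.
    # X: (m, n) in {0,1}, Y: (m, p) in {0,1}; a synapse is set once any pair uses it.
    W = np.zeros((Y.shape[1], X.shape[1]), dtype=np.uint8)
    for x, y in zip(X, Y):
        W |= np.outer(y, x).astype(np.uint8)
    return W

def retrieve_y(W, x, k):
    # One-step heteroassociative retrieval: keep the k units with the largest
    # dendritic sums (k = number of active units in the stored target patterns).
    s = W @ x
    y = np.zeros(W.shape[0], dtype=np.uint8)
    y[np.argsort(s)[-k:]] = 1
    return y

def retrieve_bidirectional(W, x, k_x, k_y, iters=3):
    # Naive BAM extension: iterate X -> Y -> X using the transposed matrix.
    for _ in range(iters):
        y = retrieve_y(W, x, k_y)
        x = retrieve_y(W.T, y, k_x)
    return x, y

At high memory load the matrix W becomes densely filled with ones, which is why this naive bidirectional loop adds little, and why the paper introduces CB-retrieval instead.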

Journal Article
TL;DR: This paper provides sufficient conditions for kernel vectors which confirm the intuitive notion of kernel vectors as sparse representations of the input vectors, and deduces exact statements on the amount of noise that is permissible for perfect recall.
Abstract: The ability of human beings to retrieve information on the basis of associated cues continues to elicit great interest among researchers. Investigations of how the brain is capable of making such associations from partial information have led to a variety of theoretical neural network models that act as associative memories. Recently, several researchers have had significant success in retrieving complete stored patterns from noisy or incomplete input pattern keys by using morphological associative memories. For certain types of noise in the input patterns, this new model of artificial associative memories can be successfully applied following a direct approach. If the input patterns contain both dilative and erosive noise, an indirect approach using kernel vectors is recommended; however, the problem of how to select these kernel vectors has not yet been solved. In this paper, we provide sufficient conditions for kernel vectors which confirm the intuitive notion of kernel vectors as sparse representations of the input vectors. In addition, we deduce exact statements on the amount of noise which is permissible for perfect recall.
The concept of morphological neural networks grew out of the theory of image algebra. A subalgebra of image algebra includes the mathematical formulations of currently popular neural network models [11, 9]. G.X. Ritter and J.L. Davidson were the first to formulate useful morphological neural networks [10, 4]. Since then, only a few papers involving morphological neural networks have appeared. Davidson employed morphological neural networks in order to solve template identification and target classification problems [3, 2]. Suarez-Araujo applied morphological neural networks to compute homothetic auditory and visual invariances [15]. Another interesting network, consisting of a morphological net and a classical feedforward network used for feature extraction and classification, was designed by Won, Gader, and Coffield [16, 17]. All of these researchers devised multilayer morphological neural networks for very specialized applications. A more comprehensive and rigorous basis for computing with morphological neural networks appeared in [12]. The properties of morphological neural networks differ drastically from those of traditional neural network models. These differences are due to the fact that traditional neural network operations consist of linear operations followed by an application of nonlinear activation functions, whereas in morphological neural computing the next state of a neuron, or the next layer's computation, involves the nonlinear operation of adding neural values and their synaptic strengths followed by forming the maximum of the …
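The min/max algebra underlying morphological associative memories, sketched for the heteroassociative W-memory (a standard construction assumed here); the kernel-based two-stage recall the paper analyses is only indicated in a comment.

import numpy as np

def morph_W(X, Y):
    # X: (m, n), Y: (m, p). W_XY[i, j] = min over stored pairs k of ( y_k[i] - x_k[j] ).
    return np.min(Y[:, :, None] - X[:, None, :], axis=0)

def maxplus_recall(W, x):
    # (W [max-plus] x)_i = max_j ( W[i, j] + x[j] ); under suitable conditions this
    # reproduces y_k exactly, and the W-memory tolerates erosive noise in x
    # (the dual M-memory, built with max and recalled with min-plus, tolerates dilative noise).
    return np.max(W + x[None, :], axis=1)

# Kernel idea (indirect approach, for inputs with both dilative and erosive noise):
# choose sparse kernel vectors z_k with z_k <= x_k, recall z_k from a corrupted x
# with one memory, then map z_k to y_k with a second memory. The paper's contribution
# is sufficient conditions on how such z_k can be chosen.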

Proceedings ArticleDOI
11 Oct 1998
TL;DR: A new bidirectional associative memory (BAM), named the multiwinners self-organizing bidirectional associative memory (MWS-BAM), is proposed; it represents one member of a pattern pair in distributed form in the distributed representation layer and stores the relation using the upward weights.
Abstract: We propose a new bidirectional associative memory (BAM) named the multiwinners self-organizing bidirectional associative memory (MWS-BAM) in this paper. The proposed MWS-BAM has two processes: a storage process and a recall process. In the storage process, the MWS-BAM represents one member of a pattern pair in distributed form in the distributed representation layer and stores the relation using the upward weights. In addition, the MWS-BAM stores the relation between the distributed representation and the other member of the pattern pair in the downward weights. In the recall process, the MWS-BAM can recall information bidirectionally: for any stored pattern pair, it recalls one member of the pair when the other member is presented to the input-output layer.

Journal ArticleDOI
TL;DR: Sufficient conditions for the existence, uniqueness, and global asymptotic stability of the equilibrium position are given, together with two interesting examples to illustrate the theory.
Abstract: In this paper, the global asymptotic stability of more general two-layer nonlinear feedback associative memory neural networks with time delays is examined. Sufficient conditions for the existence, uniqueness, and global asymptotic stability of the equilibrium position are given. Finally, two interesting examples are given to illustrate the theory.
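For reference, the two-layer feedback model with delays analysed in this line of work is typically of the following form (stated here as an assumption, since the abstract does not reproduce it):

\[ \dot{x}_i(t) = -a_i x_i(t) + \sum_{j=1}^{p} w_{ij}\, f_j\!\big(y_j(t-\tau_{ij})\big) + I_i, \qquad i = 1,\dots,n, \]
\[ \dot{y}_j(t) = -b_j y_j(t) + \sum_{i=1}^{n} v_{ji}\, g_i\!\big(x_i(t-\sigma_{ji})\big) + J_j, \qquad j = 1,\dots,p, \]

with f_j and g_i Lipschitz (constants \(L^f_j, L^g_i\)). The sufficient conditions reported in such papers are usually of the diagonal-dominance type, e.g. \(a_i > \sum_j |v_{ji}|\, L^g_i\) and \(b_j > \sum_i |w_{ij}|\, L^f_j\), or an equivalent M-matrix condition, and are independent of the delays.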

Journal ArticleDOI
TL;DR: The global exponential stability of the equilibrium position for general bidirectional associative memory neural networks is studied, and sufficient conditions for the existence and uniqueness of the equilibrium position are given.
Abstract: In this paper, the global exponential stability of the equilibrium position for general bidirectional associative memory neural networks is studied. Sufficient conditions for the existence and uniqueness of the equilibrium position are given. The energy function method is employed. Two examples are given to illustrate the theory.

Journal ArticleDOI
TL;DR: A general model of the intraconnected bidirectional associative memory (GIBAM) is proposed, and it is demonstrated that the GIBAM has a much higher storage capacity and much better error-correcting capability than Jeng's modified IBAM.
Abstract: A general model of the intraconnected bidirectional associative memory (GIBAM) is proposed. In a GIBAM, states are represented by complex values on the unit circle of the complex plane, and the weight matrices are learned using the generalised inverse technique. The stability of the GIBAM is demonstrated by defining an energy function which decreases with every change in neuron states. Computer simulation demonstrates that the GIBAM has a much higher storage capacity and much better error-correcting capability than Jeng's modified IBAM.
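A sketch of generalised-inverse (pseudo-inverse) weight learning for a complex-state BAM with states on the unit circle; the GIBAM's intraconnections and exact energy function are not reproduced, and the quantisation level is an illustrative assumption.

import numpy as np

def generalized_inverse_weights(X, Y):
    # X: (m, n) complex states on the unit circle, Y: (m, p).
    # Least-squares choice of W with W x_k ≈ y_k for every stored pair: W = Y^T pinv(X^T).
    return Y.T @ np.linalg.pinv(X.T)

def recall_complex(Wxy, Wyx, x, levels=8, steps=10):
    # Wxy = generalized_inverse_weights(X, Y), Wyx = generalized_inverse_weights(Y, X).
    def project(z):
        # quantize each component back onto one of `levels` points of the unit circle
        step = 2 * np.pi / levels
        return np.exp(1j * np.round(np.angle(z) / step) * step)
    y = project(Wxy @ x)
    for _ in range(steps):
        x_new = project(Wyx @ y)
        y_new = project(Wxy @ x_new)
        if np.allclose(x_new, x) and np.allclose(y_new, y):
            break
        x, y = x_new, y_new
    return x, y

Using the generalised inverse rather than simple correlation encoding is what allows linearly independent (rather than merely orthogonal-like) pattern sets to be stored exactly, which is the source of the higher capacity reported.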

Proceedings ArticleDOI
21 Apr 1998
TL;DR: A novel relaxation method is derived for systems of linear inequalities and applied to learning for associative memories, and a multimodule associative memory that can be trained by the intersection learning algorithm is proposed.
Abstract: In this paper, we first derive a novel relaxation method for systems of linear inequalities and apply it to learning for associative memories. Since the proposed intersection learning can guarantee the recall of all training data, it can greatly enlarge the storage capacity of associative memories. In addition, it requires far fewer weight updates than conventional methods. We also propose a multimodule associative memory which can be trained by the intersection learning algorithm. The proposed associative memory can deal with many-to-many associations, and it is applied to a knowledge processing task. Computer simulation results show the effectiveness of the proposed learning algorithm and associative memory.
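The learning problem here is a system of linear inequalities: each stored pattern must be recalled with a positive margin. As a stand-in for the paper's method, the sketch below shows a classic relaxation-projection solver for such a system; margins, the relaxation factor, and the encoding comment are assumptions for illustration.

import numpy as np

def relaxation_solve(A, b, margin=0.0, lam=1.5, max_sweeps=1000):
    # Find w with A @ w > b (component-wise) by repeatedly projecting onto the
    # half-space of the most violated inequality, relaxed by a factor lam in (0, 2).
    w = np.zeros(A.shape[1])
    for _ in range(max_sweeps):
        viol = b + margin - A @ w
        k = np.argmax(viol)
        if viol[k] <= 0:
            return w                      # all inequalities satisfied: recall guaranteed
        w = w + lam * viol[k] * A[k] / (A[k] @ A[k])
    return w                              # may not have converged within max_sweeps

# For an associative memory, row k of A would encode "output component has the correct
# sign for training pattern k", e.g. A[k] = target_bit_k * input_pattern_k with b = 0
# and a positive margin; solving one such system per output unit yields the weight matrix.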

Journal ArticleDOI
TL;DR: The Hopfield model of associative memories is reconsidered in terms of pattern recognition in the presence of noise, and a learning rule is deduced that increases the storage capacity of the Hopfield neural network.
Abstract: The Hopfield model of associative memories is reconsidered in terms of pattern recognition in the presence of noise. A learning rule for the investigated model is deduced. With the obtained learning rule, the storage capacity of the Hopfield neural network is increased from α≃0.14 to the optimal value α=1. The pattern recognition capabilities of the model increase exponentially with the dimensionality of the patterns.
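The listing does not reproduce the paper's rule; as a point of comparison only, the classical pseudo-inverse (projection) rule is one learning rule known to raise the capacity of a Hopfield-type network to α = 1 for linearly independent patterns, and it is sketched below.

import numpy as np

def pseudo_inverse_weights(P):
    # P: (m, n) bipolar patterns, m <= n, rows linearly independent.
    # Projection rule: W is the orthogonal projector onto the span of the patterns,
    # so W @ p == p for every stored p, i.e. all m patterns are exact fixed points
    # and the capacity can reach alpha = m/n = 1.
    C = P @ P.T                      # (m, m) overlap matrix
    return P.T @ np.linalg.inv(C) @ P

def recall(W, x, steps=50):
    sgn = lambda v: np.where(v >= 0, 1, -1)
    for _ in range(steps):
        x_new = sgn(W @ x)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x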

Proceedings ArticleDOI
11 Oct 1998
TL;DR: A chaotic episodic associative memory is proposed that is based on the conventional TAM and has connections in the input layer for autoassociation, which enables the CEAM to recall plural episodes that have common terms.
Abstract: We propose a chaotic episodic associative memory (CEAM). It can deal with complex episodes which have common terms. Temporal associative memory (TAM) and episodic associative memory (EAM) have been proposed as models for episodic memory. However, these models cannot deal with association among plural episodes that have common terms, because the stored common patterns cause superimposed patterns. The proposed CEAM is based on the conventional TAM and has connections in the input layer for autoassociation. It also employs chaotic neurons in a part of the input layer. Each scene of an episode is memorized together with its own contextual information; that is, the training set including common terms is converted into a form that does not include any common terms. The chaotic neurons in the input layer corresponding to contextual information change their states chaotically. As a result, the contextual information changes dynamically, which enables the CEAM to recall plural episodes that have common terms. A series of computer simulations shows the effectiveness of the proposed model.

Journal ArticleDOI
TL;DR: A "designer" neural network that synthesizes the associative memories with prespecified interconnecting weights is proposed and an upper bound on the time required for the designer network to reach a solution is determined.

Journal ArticleDOI
TL;DR: A majority rule for decision making under presentation of multiple pieces of evidence is also found through a study of the signal-to-noise ratio (SNR) of the multi-BAM network.

Journal Article
TL;DR: The stability of the new model, in synchronous and asynchronous updating modes, is proven by defining an energy function that ensures all the training pattern pairs become asymptotically stable points of the model.
Abstract: In this paper, a new higher order bidirectional associative memory model is presented. It is an extension of Tai's HOBAM (higher order bidirectional associative memory) and Jeng's MIBAM (modified intraconnected BAM). The stability of the new model, in synchronous and asynchronous updating modes, is proven by defining an energy function that ensures all the training pattern pairs become asymptotically stable points of the model. Using statistical analysis, the storage capacity of the proposed model is estimated. Computer simulations show that this model has not only higher storage capacity but also better error-correcting capability.
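For orientation only (the exact model is not reproduced in this listing), a generic second-order BAM of the kind such stability proofs address can be written with the energy

\[ E(x, y) = -\sum_{i,j} w_{ij}\, x_i y_j \;-\; \sum_{i<k}\sum_{j} t_{ikj}\, x_i x_k y_j, \]

with bipolar states updated towards the sign of their local fields, \(x_i \leftarrow \operatorname{sgn}(-\partial E/\partial x_i)\) and \(y_j \leftarrow \operatorname{sgn}(-\partial E/\partial y_j)\). The stability argument then consists of showing that E cannot increase under the chosen update schedule, so that stored pairs located at local minima of E are asymptotically stable.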