
Showing papers by "DeLiang Wang published in 1993"


Journal ArticleDOI
TL;DR: In this paper, four representative architectures that are able to generalize are reviewed: the backpropagation network, the ART architecture, the dynamic link architecture, and associative memories.
Abstract: Invariant pattern recognition will remain a problem for neural networks for some time, and the challenge is to overcome the limitation of Hamming-distance generalization. Four representative architectures that are able to generalize are reviewed: the backpropagation network, the ART architecture, the dynamic link architecture, and associative memories. Image representation, segmentation, and invariance are discussed.

62 citations


Journal ArticleDOI
01 Jul 1993
TL;DR: A mechanism of degree self-organization based on a global inhibitor is proposed for the model to learn required context lengths in order to disambiguate associations in complex sequence reproduction.
Abstract: A computational framework for learning, recognition, and reproduction of temporal sequences is provided, based on an interference theory of forgetting in short-term memory (STM), modelled as a network of neural units with mutual inhibition. The STM model provides information for recognition and reproduction of arbitrary temporal sequences. Sequences are acquired by a new learning rule, the attentional learning rule, which combines Hebbian learning and a normalization rule with sequential system activation. Acquired sequences can be recognized without being affected by the speed of presentation or by certain distortions in symbol form. Different layers of the STM model can be naturally constructed in a feedforward manner to recognize hierarchical sequences, significantly expanding the model's capability in a way similar to human information chunking. A model of sequence reproduction is presented that consists of two reciprocally connected networks, one of which behaves as a sequence recognizer. Reproduction of complex sequences can maintain the interval lengths of sequence components while varying the overall speed. A mechanism of degree self-organization based on a global inhibitor is proposed for the model to learn the required context lengths in order to disambiguate associations in complex sequence reproduction. Certain implications of the model are discussed at the end of the paper.

50 citations
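The interference theory of forgetting described in this abstract can be illustrated with a minimal sketch, under the assumption (not taken from the paper's equations) that each newly presented symbol inhibits all currently active STM units, so items fade with the number of subsequent presentations rather than with elapsed time. The class name and parameters below are illustrative, not the paper's.

```python
# Minimal sketch of interference-based forgetting in STM: each new symbol
# suppresses every active unit via mutual inhibition, leaving a recency
# gradient of activations that encodes the order of presentation.
# All names and coefficients here are illustrative assumptions.

class InterferenceSTM:
    def __init__(self, inhibition=0.25, threshold=0.05):
        self.inhibition = inhibition   # fraction of activity lost per new item
        self.threshold = threshold     # activity below this counts as forgotten
        self.activity = {}             # symbol -> current activation

    def present(self, symbol):
        # Mutual inhibition: every active unit is suppressed by the newcomer,
        # so forgetting is driven by interference, not by the passage of time.
        for s in self.activity:
            self.activity[s] *= (1.0 - self.inhibition)
        self.activity[symbol] = 1.0    # the newest item is fully active

    def trace(self):
        # Items still above threshold, strongest (most recent) first.
        return sorted(((s, a) for s, a in self.activity.items()
                       if a >= self.threshold),
                      key=lambda sa: -sa[1])

stm = InterferenceSTM()
for sym in "ABCD":
    stm.present(sym)
print(stm.trace())  # activations decrease from D (newest) to A (oldest)
```

Because the gradient depends only on how many items followed each symbol, recognition of the stored order is unaffected by presentation speed, which is consistent with the behavior the abstract attributes to the STM model.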


Journal ArticleDOI
TL;DR: A parsimonious model of short-term and long-term synaptic plasticity at the electrophysiological level consists of two interacting differential equations, one describing alterations of the synaptic weight and the other describing changes to the speed of recovery (forgetting).
Abstract: It has been demonstrated that short-term habituation may be caused by a decrease in the release of presynaptic neurotransmitters, while long-term habituation appears to be caused by morphological changes of presynaptic terminals. A parsimonious model of short-term and long-term synaptic plasticity at the electrophysiological level is presented. The model consists of two interacting differential equations, one describing alterations of the synaptic weight and the other describing changes to the speed of recovery (forgetting). The latter exhibits an inverse S-shaped curve whose high value corresponds to fast recovery (short-term habituation) and whose low value corresponds to slow recovery (long-term habituation). The model has been tested on short-term habituation data and a set of long-term habituation data of prey-catching behavior in toads, spanning timescales from minutes to hours to several weeks.

33 citations
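The two-equation structure described in this abstract can be sketched numerically. The sketch below is an assumption-laden illustration, not the paper's exact equations: the synaptic weight y is depressed by stimulation S and recovers toward 1 at a speed set by z, while z itself declines under prolonged stimulation along an inverse S-shaped (logistic) curve, converting fast recovery (short-term habituation) into slow recovery (long-term habituation). All coefficients and function names are illustrative.

```python
# Hedged sketch of a two-variable habituation model in the spirit of the
# abstract (Euler integration; illustrative coefficients):
#   y : synaptic weight -- depressed by stimulus S, recovers toward 1.0
#       at a speed proportional to z
#   z : recovery-speed variable -- decays logistically under stimulation,
#       tracing an inverse S-shaped curve from fast to slow recovery

def step(y, z, S, dt=0.01, alpha=1.0, beta=5.0, gamma=0.5):
    dy = alpha * z * (1.0 - y) - beta * y * S   # recovery vs. depression
    dz = -gamma * z * (1.0 - z) * S             # slow inverse-S decay of recovery speed
    return y + dt * dy, z + dt * dz

def habituate_then_rest(stim_steps, rest_steps):
    y, z = 1.0, 0.99
    for _ in range(stim_steps):                 # stimulation phase
        y, z = step(y, z, S=1.0)
    y_depressed = y
    for _ in range(rest_steps):                 # recovery phase, no stimulus
        y, z = step(y, z, S=0.0)
    return y_depressed, y, z

# Brief stimulation: z stays high, so the weight recovers quickly (short-term).
_, y_short, z_short = habituate_then_rest(stim_steps=100, rest_steps=1000)
# Prolonged stimulation: z collapses, so the same rest barely restores y (long-term).
_, y_long, z_long = habituate_then_rest(stim_steps=2000, rest_steps=1000)
print(y_short, z_short, y_long, z_long)
```

The key design point this illustrates is parsimony: a single pair of coupled equations reproduces both timescales of habituation, with the slow variable z switching the system between the fast- and slow-recovery regimes.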