Journal ArticleDOI
Properties of learning related to pattern diversity in ART1
TL;DR: It is shown that if this network is repeatedly presented with an arbitrary list of binary input patterns, learning self-stabilizes in at most m list presentations, where m corresponds to the number of patterns of distinct size in the input list.
Citations
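The stabilization property stated above can be illustrated with a minimal simulation. This is a sketch, not the paper's implementation: it assumes a simplified ART1 module with fast learning (templates updated by component-wise AND), a Fuzzy-ART-style choice function T_j = |I ∧ w_j| / (α + |w_j|), and a vigilance test |I ∧ w_j| / |I| ≥ ρ; the function name `art1_fast_learn` and the parameter defaults are illustrative.

```python
import numpy as np

def art1_fast_learn(patterns, rho=0.5, alpha=0.001, max_presentations=20):
    """Repeatedly present a list of binary patterns to a simplified ART1
    module with fast learning until the category templates stop changing.
    Returns (templates, presentation_at_which_no_change_occurred)."""
    dim = len(patterns[0])
    templates = []  # committed category templates (binary vectors)
    for presentation in range(1, max_presentations + 1):
        changed = False
        for I in patterns:
            I = np.asarray(I)
            # committed nodes plus one uncommitted node (all-ones template)
            candidates = templates + [np.ones(dim, dtype=int)]
            T = [np.minimum(I, w).sum() / (alpha + w.sum()) for w in candidates]
            for j in np.argsort(T)[::-1]:        # search in order of choice value
                w = candidates[j]
                match = np.minimum(I, w).sum() / I.sum()
                if match >= rho:                 # vigilance test passed
                    new_w = np.minimum(I, w)     # fast learning: template AND input
                    if j == len(templates):      # uncommitted node becomes committed
                        templates.append(new_w)
                        changed = True
                    elif not np.array_equal(new_w, templates[j]):
                        templates[j] = new_w
                        changed = True
                    break                        # resonance: stop the search
        if not changed:
            return templates, presentation
    return templates, max_presentations

patterns = [[1, 1, 0, 0], [0, 0, 1, 1], [1, 1, 1, 0]]  # two distinct sizes: m = 2
templates, stable_at = art1_fast_learn(patterns)
print(stable_at, [w.tolist() for w in templates])
```

For this list the pattern sizes are {2, 3}, so m = 2; the sketch commits two categories and the templates are already unchanged on the second pass, consistent with the at-most-m bound.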
Book
An introduction to neural networks
TL;DR: An Introduction to Neural Networks will be warmly welcomed by a wide readership seeking an authoritative treatment of this key subject without an intimidating level of mathematics in the presentation.
Journal ArticleDOI
Order of search in fuzzy ART and fuzzy ARTMAP: effect of the choice parameter
TL;DR: This work provides a geometrical and clearer understanding of why, and in what order, these categories are chosen for various ranges of the choice parameter of the Fuzzy ART module.
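The dependence of search order on the choice parameter can be made concrete with a small sketch. Assumptions beyond the abstract: the standard Fuzzy ART choice function T_j = |I ∧ w_j| / (α + |w_j|), where ∧ is the component-wise minimum, and the hypothetical helper name `choice_order`.

```python
import numpy as np

def choice_order(I, templates, alpha):
    """Indices of Fuzzy ART categories, sorted by descending choice value
    T_j = |I ^ w_j| / (alpha + |w_j|), where ^ is component-wise min."""
    I = np.asarray(I, dtype=float)
    T = [np.minimum(I, np.asarray(w, dtype=float)).sum()
         / (alpha + np.asarray(w, dtype=float).sum())
         for w in templates]
    return np.argsort(T)[::-1].tolist()

# A small subset template competes with a large, partially matching one;
# the winner flips as alpha moves from the small to the large regime.
print(choice_order([1, 1, 0, 0], [[1, 0, 0, 0], [1, 1, 1, 1]], alpha=0.1))   # -> [0, 1]
print(choice_order([1, 1, 0, 0], [[1, 0, 0, 0], [1, 1, 1, 1]], alpha=10.0))  # -> [1, 0]
```

For small α the subset template (index 0) wins the competition; for large α the choice values approach |I ∧ w_j| / α and the template with the larger raw overlap (index 1) is searched first.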
Journal ArticleDOI
Fuzzy ART properties
TL;DR: The properties described in the paper are grouped into a number of categories, including properties related to the number of list presentations needed for weight stabilization, and provide numerous insights into how Fuzzy ART operates.
Journal ArticleDOI
NIRS: large scale ART-1 neural architectures for engineering design retrieval
TL;DR: The application, the neural architectures and algorithms, the current status, and the lessons learned in developing a neural network system for production use in industry are described.
Journal ArticleDOI
Artificial neural network control of FES in paraplegics for patient responsive ambulation
Daniel Graupe, H. Kordylewski, et al.
TL;DR: A binary adaptive resonance theory (ART-1)-based artificial neural network adapted for controlling functional electrical stimulation (FES) to facilitate patient-responsive ambulation by paralyzed patients with spinal cord injuries is described.
References
Journal ArticleDOI
A massively parallel architecture for a self-organizing neural pattern recognition machine
TL;DR: A neural network architecture for the learning of recognition categories is derived which circumvents the noise, saturation, capacity, orthogonality, and linear predictability constraints that limit the codes which can be stably learned by alternative recognition models.
Journal ArticleDOI
Adaptive pattern classification and universal recoding: II. Feedback, expectation, olfaction, illusions
TL;DR: It is suggested that arousal is gated by a chemical transmitter system—for example, norepinephrine—whose relative states of accumulation at antagonistic pairs of on-cells and off-cells through time can shift the spatial pattern of STM activity across a field of feature detectors.
Journal ArticleDOI
Convergence properties of learning in ART1
TL;DR: It is shown that in the fast learning case, an ART1 network that is repeatedly presented with an arbitrary list of binary input patterns self-stabilizes the recognition code of every size-l pattern in at most l list presentations.