Open Access Proceedings Article

A Growing Neural Gas Network Learns Topologies

Bernd Fritzke
- Vol. 7, pp. 625-632
Abstract
An incremental network model is introduced which is able to learn the important topological relations in a given set of input vectors by means of a simple Hebb-like learning rule. In contrast to previous approaches like the "neural gas" method of Martinetz and Schulten (1991, 1994), this model has no parameters which change over time and is able to continue learning, adding units and connections, until a performance criterion has been met. Applications of the model include vector quantization, clustering, and interpolation.
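The abstract describes the method only at a high level. Below is a minimal sketch of a typical growing neural gas update loop, assuming the usual parameter roles (winner and neighbor learning rates eps_b and eps_n, maximum edge age a_max, insertion interval lam, error scaling alpha, global error decay d). The function name, data structures, and default values are illustrative assumptions rather than code from the paper, and removal of isolated units is omitted for brevity.

```python
# Illustrative sketch of a growing neural gas loop; not the paper's code.
import numpy as np

def growing_neural_gas(data, max_units=100, n_iter=10000,
                       eps_b=0.05, eps_n=0.006, a_max=50,
                       lam=100, alpha=0.5, d=0.995, seed=0):
    rng = np.random.default_rng(seed)
    # start with two units placed on randomly chosen inputs, zero error
    w = [data[rng.integers(len(data))].astype(float) for _ in range(2)]
    error = [0.0, 0.0]
    edges = {}  # {frozenset({i, j}): age}

    def neighbors(i):
        return [j for e in edges for j in e if i in e and j != i]

    for t in range(1, n_iter + 1):
        x = data[rng.integers(len(data))]

        # find the nearest (s1) and second-nearest (s2) units
        dists = [float(np.sum((x - wi) ** 2)) for wi in w]
        s1, s2 = (int(i) for i in np.argsort(dists)[:2])

        # age the edges emanating from the winner, accumulate its error
        for e in edges:
            if s1 in e:
                edges[e] += 1
        error[s1] += dists[s1]

        # adapt the winner and its direct topological neighbors toward x
        w[s1] += eps_b * (x - w[s1])
        for j in neighbors(s1):
            w[j] += eps_n * (x - w[j])

        # create or refresh the edge between winner and runner-up
        edges[frozenset((s1, s2))] = 0
        # drop edges that exceeded the maximum age
        edges = {e: a for e, a in edges.items() if a <= a_max}
        # (removal of units left without any edges is omitted in this sketch)

        # every lam steps, insert a unit between the highest-error unit
        # and its highest-error neighbor
        if t % lam == 0 and len(w) < max_units:
            q = max(range(len(w)), key=lambda i: error[i])
            nq = neighbors(q)
            if nq:
                f = max(nq, key=lambda j: error[j])
                r = len(w)
                w.append(0.5 * (w[q] + w[f]))
                edges.pop(frozenset((q, f)), None)
                edges[frozenset((q, r))] = 0
                edges[frozenset((f, r))] = 0
                error[q] *= alpha
                error[f] *= alpha
                error.append(error[q])

        # global decay of the accumulated errors
        error = [e * d for e in error]

    return np.array(w), edges
```

Called on, say, growing_neural_gas(np.random.rand(5000, 2)), this sketch should return unit positions and an edge set whose graph traces the support of the 2-D input distribution, which is the topology-learning behavior the abstract refers to.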



Citations
Journal Article

Deep learning in neural networks

TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, reviewing deep supervised learning, unsupervised learning, reinforcement learning and evolutionary computation, and indirect search for short programs encoding deep and large networks.
Journal Article

Continual lifelong learning with neural networks: A review.

TL;DR: This review critically summarizes the main challenges linked to lifelong learning for artificial learning systems and compares existing neural network approaches that alleviate, to different extents, catastrophic forgetting.
Journal Article

Review: A review of novelty detection

TL;DR: This review aims to provide an updated and structured investigation of novelty detection research papers that have appeared in the machine learning literature during the last decade.
Journal Article

DENFIS: dynamic evolving neural-fuzzy inference system and its application for time-series prediction

TL;DR: It is demonstrated that DENFIS can effectively learn complex temporal sequences in an adaptive way and outperform some well-known, existing models.
References
Journal Article

Self-organized formation of topologically correct feature maps

TL;DR: In this paper, the authors describe a self-organizing system in which the signal representations are automatically mapped onto a set of output responses in such a way that the responses acquire the same topological order as that of the primary events.
Journal Article

Simplified neuron model as a principal component analyzer

TL;DR: A simple linear neuron model with constrained Hebbian-type synaptic modification is analyzed and a new class of unconstrained learning rules is derived.
Journal Article

Growing cell structures—a self-organizing network for unsupervised and supervised learning

Bernd Fritzke
- 01 Nov 1994
TL;DR: A new self-organizing neural network model with two variants is presented; it performs unsupervised learning and can be used for data visualization, clustering, and vector quantization. Results on the two-spirals benchmark and a vowel classification problem are better than any previously published.
Journal Article

Topology representing networks

TL;DR: This competitive Hebbian rule provides a novel approach to constructing topology preserving feature maps and representing intricately structured manifolds, making it particularly useful in applications where neighborhood relations have to be exploited or the shape and topology of submanifolds have to be taken into account.
Book Chapter

Competitive Hebbian Learning Rule Forms Perfectly Topology Preserving Maps

TL;DR: It is shown that Hebbian learning with competition leads to lateral connections that correspond to the edges of the induced Delaunay triangulation, and to a network structure that forms a topology preserving map of a given manifold, independent of the manifold's topology.
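As a concrete illustration of the rule summarized above, here is a minimal sketch assuming fixed reference vectors stored in a NumPy array w; the function name and arguments are hypothetical, not taken from the cited chapter. For each input it links the two closest units, so the collected edges approximate the induced Delaunay triangulation of the units with respect to the data.

```python
# Illustrative sketch of the competitive Hebbian rule; names are hypothetical.
import numpy as np

def competitive_hebbian_edges(w, data):
    """Link the two units closest to each input sample."""
    edges = set()
    for x in data:
        d = np.sum((w - x) ** 2, axis=1)   # squared distances to all units
        s1, s2 = np.argsort(d)[:2]         # nearest and second-nearest unit
        edges.add(frozenset((int(s1), int(s2))))
    return edges
```

For example, competitive_hebbian_edges(np.random.rand(20, 2), np.random.rand(1000, 2)) should return an edge set connecting neighboring units, which is the ingredient the growing neural gas model above combines with incremental unit insertion.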