Journal ArticleDOI
Vector quantization using tree-structured self-organizing feature maps
TL;DR: A binary-tree structure neural network model suitable for structured clustering is proposed and used to design tree-search vector quantization codebooks for image coding; it performs better than the generalized Lloyd algorithm in terms of distortion, bits per pixel, and encoding complexity.
Abstract: In this paper, we propose a binary-tree structure neural network model suitable for structured clustering. During and after training, the centroids of the clusters in this model always form a binary tree in the input pattern space. This model is used to design tree-search vector quantization codebooks for image coding. Simulation results show that the acquired codebook not only produces better-quality images but also achieves a higher compression ratio than conventional tree-search vector quantization. When source coding is applied after VQ, the new model performs better than the generalized Lloyd algorithm in terms of distortion, bits per pixel, and encoding complexity for low-detail and medium-detail images.
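The encoding side of tree-search VQ can be illustrated with a short sketch. This is a minimal, hypothetical illustration (the node structure and function name are assumptions, not the paper's model): a vector descends a binary tree of centroids, following the nearer child at each level, so encoding takes O(log N) distance computations instead of the O(N) of a full-search codebook.

```python
import numpy as np

def tree_search_encode(vector, node):
    """Descend a binary tree of centroids, following the nearer child.

    Each node is a dict with a 'centroid' and 'left'/'right' children
    (None at the leaves). The recorded bit path doubles as the
    transmitted codeword index.
    """
    path = []
    while node["left"] is not None:
        d_left = np.linalg.norm(vector - node["left"]["centroid"])
        d_right = np.linalg.norm(vector - node["right"]["centroid"])
        if d_left <= d_right:
            path.append(0)
            node = node["left"]
        else:
            path.append(1)
            node = node["right"]
    return path, node["centroid"]

# Tiny two-level example tree with hand-picked centroids.
leaf = lambda c: {"centroid": np.array(c), "left": None, "right": None}
tree = {
    "centroid": np.array([0.5, 0.5]),
    "left":  {"centroid": np.array([0.2, 0.2]),
              "left": leaf([0.1, 0.1]), "right": leaf([0.3, 0.3])},
    "right": {"centroid": np.array([0.8, 0.8]),
              "left": leaf([0.7, 0.7]), "right": leaf([0.9, 0.9])},
}
bits, codeword = tree_search_encode(np.array([0.15, 0.1]), tree)
```

For a depth-d tree, only 2d distance computations are needed per input vector, which is the "encoding complexity" advantage the abstract refers to.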
Citations
Bibliography of Self-Organizing Map (SOM) Papers: 1981-1997
TL;DR: A comprehensive list of papers that use the Self-Organizing Map algorithms, have benefited from them, or contain analyses of them is collected, provided with both a thematic and a keyword index to help find articles of interest.
Book ChapterDOI
Modification of Kohonen's SOFM to Simulate Cortical Plasticity Induced by Coactivation Input Patterns
TL;DR: A modification of Kohonen's SOFM is presented to simulate cortical plasticity induced by coactivation patterns, by introducing a probabilistic mode of stimulus presentation and substituting the winner-takes-all mechanism with selection of the winner from a set of best-matching neurons.
Journal ArticleDOI
TASOM: a new time adaptive self-organizing map
TL;DR: Several versions of the TASOM-based networks are proposed in this paper for different applications, including bilevel thresholding of grey level images, tracking of moving objects and their boundaries, and adaptive clustering.
Journal ArticleDOI
Color image compression and limited display using self-organization Kohonen map
Soo-Chang Pei, You-Shen Lo +1 more
TL;DR: Under the two-dimensional (2-D) mesh neural structure, SOFM vector quantization on an indexed image could largely reduce the color shift artifacts and avoid the requantization problem.
Proceedings ArticleDOI
TASOM: the time adaptive self-organizing map
TL;DR: This work proposes a modified SOM algorithm called "time adaptive SOM", or TASOM, that automatically adjusts the learning rate and neighborhood size of each neuron independently, and its performance is compared with that of the basic SOM.
References
Journal ArticleDOI
An Algorithm for Vector Quantizer Design
Y. Linde, A. Buzo, Robert M. Gray +2 more
TL;DR: An efficient and intuitive algorithm is presented for the design of vector quantizers based either on a known probabilistic model or on a long training sequence of data.
Journal Article
Vector quantization
TL;DR: During the past few years several design algorithms have been developed for a variety of vector quantizers and the performance of these codes has been studied for speech waveforms, speech linear predictive parameter vectors, images, and several simulated random processes.
Journal ArticleDOI
Speech coding based upon vector quantization
TL;DR: The vector quantizing approach is shown to be a mathematically and computationally tractable method which builds upon knowledge obtained in linear prediction analysis studies and is introduced in a nonrigorous form.
Journal ArticleDOI
Optimal pruning with applications to tree-structured source coding and modeling
TL;DR: An algorithm introduced by Breiman et al. (1984) in the context of classification and regression trees is reinterpreted and extended to cover a variety of applications in source coding and modeling in which trees are involved.
Journal ArticleDOI
Variants of self-organizing maps
TL;DR: Two innovations are discussed: dynamic weighting of the input signals at each input of each cell, which improves the ordering when very different input signals are used, and definition of neighborhoods in the learning algorithm by the minimum spanning tree, which provides a far better and faster approximation of prominently structured density functions.
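For reference, one step of the standard Kohonen SOM update that these variants modify can be sketched as follows. This is the textbook formulation, not the paper's variants: dynamic input weighting would reweight the distance computation, and the minimum-spanning-tree variant would replace the lattice-distance neighbourhood below.

```python
import numpy as np

def som_step(weights, x, lr, sigma):
    """One update of a 1-D Kohonen SOM.

    weights: (n_units, dim) array of unit weight vectors.
    x: input vector. lr: learning rate. sigma: neighbourhood width.
    """
    # Winner: the unit whose weight vector is closest to the input.
    winner = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
    units = np.arange(len(weights))
    # Gaussian neighbourhood measured along the map lattice.
    h = np.exp(-((units - winner) ** 2) / (2 * sigma ** 2))
    # Pull every unit toward the input, scaled by its neighbourhood value.
    weights += lr * h[:, None] * (x - weights)
    return weights, winner

# Toy usage: five zero-initialized units, one input presentation.
weights = np.zeros((5, 2))
weights, winner = som_step(weights, np.ones(2), lr=0.5, sigma=1.0)
```

The winner moves half the way to the input while distant units barely move, which is the ordering mechanism the neighbourhood-definition variants aim to improve.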