Topic
Codebook
About: Codebook is a research topic. Over its lifetime, 8492 publications have been published within this topic, receiving 115995 citations.
[Figure: papers published on a yearly basis]
Papers
03 Apr 1990
TL;DR: The results show that the codebooks generated by these two methods both enable low bits-per-pixel coding with low distortion, and when given a suboptimal initial codebook, the KNN method outperformed the LBG.
Abstract: The creation of an acceptable codebook, as defined by three methods of measuring performance (peak signal-to-noise ratio, image quality, and entropy), is discussed, and how the Linde-Buzo-Gray (LBG) and Kohonen neural network (KNN) methods differ is detailed. The results show that the codebooks generated by these two methods both enable low bits-per-pixel coding with low distortion. When using fewer training vectors, and when given a suboptimal initial codebook, the KNN method outperformed the LBG. For a theoretical lower bound, mean square error comparisons to an optimal N-level k-dimensional quantizer lower bound were made using a Gaussian source. As k increased, the KNN performance came quite close to that of the optimal quantizer.
44 citations
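The LBG method compared above is the generalized Lloyd iteration: alternate nearest-neighbour assignment of training vectors with centroid updates of the code vectors. A minimal NumPy sketch, using a random (deliberately suboptimal) initial codebook as in the paper's comparison; the function name and parameters are illustrative, not taken from the paper:

```python
import numpy as np

def lbg_codebook(training, n_codes, n_iters=20, seed=0):
    """Generalized Lloyd / LBG iteration: alternately assign each
    training vector to its nearest code vector, then move each code
    vector to the centroid of its assigned cell."""
    rng = np.random.default_rng(seed)
    # random initial codebook drawn from the training set
    codebook = training[rng.choice(len(training), n_codes, replace=False)]
    for _ in range(n_iters):
        # nearest-neighbour assignment
        d = np.linalg.norm(training[:, None, :] - codebook[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # centroid update for every non-empty cell
        for k in range(n_codes):
            members = training[labels == k]
            if len(members):
                codebook[k] = members.mean(axis=0)
    return codebook

# toy usage: a 4-level codebook for 2-D Gaussian data
data = np.random.default_rng(1).normal(size=(500, 2))
cb = lbg_codebook(data, 4)
```

Each iteration can only decrease the mean squared distortion, which is why a poor initial codebook (the case where KNN won in the paper) can trap LBG in a bad local minimum.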
TL;DR: A practical high-throughput architecture and its implementation for real-time coding of television-quality signals are presented and the architecture is directed toward the implementation of multistage vector quantization (VQ), as the authors' simulation results show that the latter is more suitable for real-time coding.
Abstract: A practical high-throughput architecture and its implementation for real-time coding of television-quality signals are presented. The architecture is directed toward the implementation of multistage vector quantization (VQ), as the authors' simulation results show that the latter is more suitable for real-time coding. However, the implementation is suitable for both single-stage and multistage VQ. The functional blocks of the VQ encoder system have been designed and implemented in VLSI technology. The VQ encoding scheme designed has an encoding delay of 25 clock cycles and is independent of the codebook size.
44 citations
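Multistage VQ, which the architecture above targets, quantizes the residual left by each preceding stage, so the search cost grows with the sum rather than the product of the per-stage codebook sizes. A hedged sketch of the scheme (the `multistage_encode`/`multistage_decode` names are invented for illustration, not from the paper):

```python
import numpy as np

def multistage_encode(x, stages):
    """Encode x stage by stage: each stage full-searches its own small
    codebook against the residual of the previous stages."""
    indices, residual = [], np.asarray(x, dtype=float).copy()
    for codebook in stages:
        d = np.linalg.norm(codebook - residual, axis=1)
        k = int(d.argmin())
        indices.append(k)
        residual = residual - codebook[k]   # pass the residual on
    return indices

def multistage_decode(indices, stages):
    """Reconstruction is the sum of the selected stage code vectors."""
    return sum(cb[k] for cb, k in zip(stages, indices))

# toy usage with two hand-made 2-vector stages
stage1 = np.array([[0.0, 0.0], [1.0, 2.0]])
stage2 = np.array([[0.0, 0.0], [5.0, 5.0]])
x = np.array([1.0, 2.0])
idx = multistage_encode(x, [stage1, stage2])
```

Two stages of 256 vectors each cover 256 x 256 combined reconstruction points while searching only 512 vectors, which is what makes the approach attractive for real-time hardware.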
22 Nov 2006
TL;DR: The main contribution of this paper is the introduction of the "Hybrid Cone-Cylinder" Codebook (HC3) model, which shows superior speed and quantitatively better performance in many different conditions and environments.
Abstract: In the interest of 24-7 long-term surveillance, a truly robust, adaptive, and fast background-foreground segmentation technique is required. This paper deals with the especially difficult but extremely common problems of moving backgrounds, shadows, highlights, and illumination changes. To produce reliable foreground extraction in the face of these problems, the best practical aspects of two algorithms, Codebook Segmentation[6] and HSV Shadow Suppression[2] are combined. The main contribution of this paper is the introduction of the "Hybrid Cone-Cylinder" Codebook (HC3) model. Results show superior speed and quantitatively better performance in many different conditions and environments. Applications include people-tracking with Omni-directional cameras and vehicle-counting with rectilinear cameras.
44 citations
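The cylinder part of the codebook model above can be pictured as a brightness band along each codeword's chromaticity axis combined with a colour-distortion threshold around that axis. A simplified sketch of that matching test, assuming RGB codewords per pixel; `eps`, `alpha`, and `beta` are illustrative placeholder thresholds, not the paper's values:

```python
import numpy as np

def color_distortion(pixel, codeword):
    """Distance from the pixel to the line through the origin and the
    codeword (the axis of the 'cylinder')."""
    p2 = float(pixel @ pixel)
    axis2 = float(codeword @ codeword)
    if axis2 == 0.0:
        return np.sqrt(p2)
    proj2 = (float(pixel @ codeword) ** 2) / axis2
    return np.sqrt(max(p2 - proj2, 0.0))

def is_background(pixel, codewords, eps=10.0, alpha=0.6, beta=1.3):
    """A pixel matches the background model if some codeword accepts it:
    brightness inside [alpha*I, beta*I] and colour distortion <= eps."""
    b = np.linalg.norm(pixel)
    for cw in codewords:
        i = np.linalg.norm(cw)
        if alpha * i <= b <= beta * i and color_distortion(pixel, cw) <= eps:
            return True
    return False

# toy usage: one grey codeword; a shadowed grey pixel matches,
# a saturated red pixel of similar brightness does not
model = [np.array([100.0, 100.0, 100.0])]
shadowed = np.array([70.0, 70.0, 70.0])
red = np.array([200.0, 0.0, 0.0])
```

The brightness band is what lets shadows and highlights stay classified as background while genuine colour changes are flagged as foreground.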
13 Sep 1993
TL;DR: A new vector quantization method is proposed which generates codebooks incrementally by inserting vectors in areas of the input vector space where the quantization error is especially high until the desired number of codebook vectors is reached.
Abstract: A new vector quantization method is proposed which generates codebooks incrementally. New vectors are inserted in areas of the input vector space where the quantization error is especially high until the desired number of codebook vectors is reached. A one-dimensional topological neighborhood makes it possible to interpolate new vectors from existing ones. Vectors not contributing to error minimization are removed. After the desired number of vectors is reached, a stochastic approximation phase fine tunes the codebook. The final quality of the codebooks is exceptional. A comparison with two well-known methods for vector quantization was performed by solving an image compression problem. The results indicate that the new method is significantly better than both other approaches.
44 citations
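The incremental insertion idea above — grow the codebook where the quantization error concentrates — can be sketched as follows. This is a simplified illustration with invented names, omitting the paper's one-dimensional topological neighbourhood, vector removal, and stochastic fine-tuning phase:

```python
import numpy as np

def grow_codebook(training, target_size, seed=0):
    """Start from two code vectors; repeatedly find the code vector
    with the largest accumulated squared error and interpolate a new
    vector between it and the farthest sample in its cell."""
    rng = np.random.default_rng(seed)
    codebook = training[rng.choice(len(training), 2, replace=False)].astype(float)
    while len(codebook) < target_size:
        # assign every training vector to its nearest code vector
        d = np.linalg.norm(training[:, None, :] - codebook[None, :, :], axis=2)
        labels, dists = d.argmin(axis=1), d.min(axis=1)
        # accumulated squared error per cell
        err = np.bincount(labels, weights=dists ** 2, minlength=len(codebook))
        worst = err.argmax()
        cell = training[labels == worst]
        far = cell[np.linalg.norm(cell - codebook[worst], axis=1).argmax()]
        # insert the interpolated vector into the overloaded region
        new = 0.5 * (codebook[worst] + far)
        codebook = np.vstack([codebook, new])
    return codebook

# toy usage: grow a 6-vector codebook for 2-D Gaussian data
data = np.random.default_rng(2).normal(size=(200, 2))
cb = grow_codebook(data, 6)
```

Growing the codebook this way targets error reduction directly, instead of hoping a fixed-size random initialization happens to cover the high-density regions.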
TL;DR: Novel probability density function (PDF) models, based on beta and wrapped Cauchy distributions, are proposed for Givens rotations in correlated MIMO channels and precoding using the proposed codebooks achieves significant performance improvement, in terms of mean square error and sum rate, as compared to using uniform codebooks.
Abstract: Parametrization of unitary matrices using Givens rotations has been used for limited feedback in multiple-input multiple-output (MIMO) systems. Feedback based on these rotations has been adopted in IEEE 802.11n and other upcoming standards. However, the probability distributions of Givens rotations are not known for correlated channels, forcing the use of uniform quantization. In this paper, novel probability density function (PDF) models, based on beta and wrapped Cauchy distributions, are proposed for Givens rotations in correlated MIMO channels. Empirical distributions and goodness-of-fit tests show that the proposed distributions characterize the spatial correlation behavior with good accuracy. Moreover, it is shown that the distributions known in the literature for uncorrelated MIMO channels are only special cases. Distributions of Givens rotations are useful to understand the behavior of singular vectors of correlated channels. In this paper, the PDF models are utilized for bit allocation and optimized codebook design. Simulations show that precoding using the proposed codebooks achieves significant performance improvement, in terms of mean square error and sum rate, as compared to using uniform codebooks. It is also shown that the bit allocations proposed in this paper reduce to those of the IEEE 802.11n standard when the MIMO channel is not spatially correlated.
44 citations
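To make the Givens parametrization concrete: a unit vector can be reduced to the first standard basis vector by a sequence of plane rotations, and those rotation angles are the quantities a limited-feedback codebook quantizes. A real-valued sketch with illustrative names (IEEE 802.11n actually operates on complex matrices and adds diagonal phase rotations, which this toy version omits):

```python
import numpy as np

def givens(n, i, j, theta):
    """n x n plane rotation acting on coordinates (i, j)."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = c
    G[j, j] = c
    G[i, j] = -s
    G[j, i] = s
    return G

def vector_to_angles(v):
    """Rotate each trailing component into coordinate 0; the returned
    angles are what a feedback codebook would quantize."""
    v = v / np.linalg.norm(v)
    n, angles = len(v), []
    for j in range(n - 1, 1 - 1, -1):
        theta = np.arctan2(v[j], v[0])
        angles.append(theta)
        v = givens(n, 0, j, theta).T @ v   # zeros component j
    return angles

def angles_to_vector(angles, n):
    """Inverse map: rebuild the unit vector from its Givens angles."""
    v = np.zeros(n)
    v[0] = 1.0
    for theta, j in zip(reversed(angles), range(1, n)):
        v = givens(n, 0, j, theta) @ v
    return v
```

Because the angles fully determine the vector, modelling their distribution (beta and wrapped Cauchy in the paper) tells the designer where to spend quantization bits instead of quantizing uniformly.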