Topic

Quantization (image processing)

About: Quantization (image processing) is a research topic. Over its lifetime, 7,977 publications have been published within this topic, receiving 126,632 citations.


Papers
Proceedings ArticleDOI
Fan Ling, Weiping Li, Hongqiao Sun
28 Dec 1998
TL;DR: Bitplane coding of the DCT coefficients is a new coding scheme that provides better performance than run_value coding under all conditions.
Abstract: In the current image and video coding standards, such as MPEG-1, MPEG-2, MPEG-4, H.261, H.263, and JPEG, quantized DCT coefficients are entropy coded using a so-called run_value coding technique. A problem with this technique is that the statistics of the run_value symbols are highly dependent on the quantization step size and the dynamic range of the DCT coefficients. Therefore, a single fixed entropy coding table cannot achieve optimal coding efficiency for all possible quantization step sizes and dynamic ranges of the DCT coefficients. Bitplane coding of the DCT coefficients is a new coding scheme that overcomes this problem. It provides better performance than run_value coding under all conditions.

53 citations
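For readers unfamiliar with the two schemes being compared, the sketch below illustrates both on one zigzag-scanned block. It is a minimal Python reconstruction, not the authors' code; the 'EOB' marker and the 64-coefficient block layout are assumptions borrowed from JPEG-style coders.

# Minimal sketch contrasting run_value coding with bitplane coding
# of quantized DCT coefficients. Input is a zigzag-scanned block.

def run_value_encode(coeffs):
    """Emit (run, value) pairs: 'run' zeros followed by a nonzero value."""
    symbols, run = [], 0
    for c in coeffs:
        if c == 0:
            run += 1
        else:
            symbols.append((run, c))
            run = 0
    symbols.append("EOB")  # end-of-block: all remaining coefficients are zero
    return symbols

def bitplane(coeffs, k):
    """Bit k (0 = least significant) of each magnitude; a bitplane coder
    sends planes from most to least significant, with signs coded separately."""
    return [(abs(c) >> k) & 1 for c in coeffs]

# Coarser quantization lengthens the zero runs, shifting the run_value
# symbol statistics -- the step-size dependence the abstract points out.
block = [34, 0, 0, -3, 1, 0, 0, 0, 2] + [0] * 55
print(run_value_encode(block))  # [(0, 34), (2, -3), (0, 1), (3, 2), 'EOB']
print(bitplane(block[:9], 1))   # [1, 0, 0, 1, 0, 0, 0, 0, 1]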

Proceedings ArticleDOI
10 Dec 2002
TL;DR: This paper proposes an extension of the zerotree-based space-frequency quantization (SFQ) algorithm that adds a wedgelet symbol to its tree-pruning optimization, incorporating wedgelets into a rate-distortion compression framework and allowing simple, coherent descriptions of the wavelet coefficients near edges.
Abstract: Most wavelet-based image coders fail to model the joint coherent behavior of wavelet coefficients near edges. Wedgelets offer a convenient parameterization for the edges in an image, but they have yet to yield a viable compression algorithm. In this paper, we propose an extension of the zerotree-based space-frequency quantization (SFQ) algorithm by adding a wedgelet symbol to its tree-pruning optimization. This incorporates wedgelets into a rate-distortion compression framework and allows simple, coherent descriptions of the wavelet coefficients near edges. The resulting method yields improved visual quality and increased compression efficiency over the standard SFQ technique.

53 citations
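The tree-pruning decision described above reduces, per tree node, to picking the symbol with the lowest Lagrangian cost D + lambda * R. The toy below sketches that choice with made-up distortion and rate numbers; it is illustrative only, not the SFQ implementation.

# Toy sketch of the per-node rate-distortion pruning decision.
# Each candidate symbol carries a (distortion, rate) pair; the coder
# keeps whichever minimizes D + lambda * R. All numbers are invented.

def best_symbol(candidates, lam):
    """candidates: dict of symbol name -> (distortion, rate_bits)."""
    return min(candidates, key=lambda s: candidates[s][0] + lam * candidates[s][1])

# Near an edge, one wedgelet (two constant regions split by a line) can
# describe the subtree more cheaply than coding every wavelet coefficient:
node = {
    "code_coeffs": (5.0, 120),   # low distortion, many bits
    "zerotree":    (900.0, 1),   # one symbol, but a poor fit at an edge
    "wedgelet":    (40.0, 18),   # edge orientation plus two region means
}
print(best_symbol(node, lam=2.0))  # -> 'wedgelet'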

Proceedings ArticleDOI
24 Mar 1992
TL;DR: This formulation of the inverse discrete cosine transform has several advantages over previous approaches, including the elimination of multiplies from the central loop of the algorithm and its adaptability to incremental evaluation.
Abstract: The paper presents a new realization of the inverse discrete cosine transform (IDCT). It exploits both the decorrelation properties of the discrete cosine transform (DCT) and the quantization process that is frequently applied to the DCT's resultant coefficients. This formulation has several advantages over previous approaches, including the elimination of multiplies from the central loop of the algorithm and its adaptability to incremental evaluation. The technique provides a significant reduction in the computational requirements of the IDCT, enabling a software-based implementation to perform at rates that were previously achievable only through dedicated hardware.

53 citations
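A hedged sketch of the underlying idea: after quantization most coefficients are zero and the rest take few distinct values, so the inner loop can replace multiplies with lookups of precomputed (level x basis) rows. This 1-D Python reconstruction is an assumption about the approach, not the paper's exact algorithm.

# Sketch: multiply-free sparse IDCT via table lookup.

import math

N = 8
# 1-D inverse DCT (DCT-III) basis, basis[k][n]
basis = [[(math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N))
          * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
          for n in range(N)] for k in range(N)]

# Precompute q * basis[k] for the small set of quantized levels in use.
levels = range(-32, 33)
tables = [{q: [q * b for b in basis[k]] for q in levels} for k in range(N)]

def idct_sparse(coeffs, tables):
    """Accumulate precomputed rows tables[k][q] = q * basis[k],
    skipping zero coefficients -- no multiplies in the loop."""
    out = [0.0] * N
    for k, q in enumerate(coeffs):
        if q == 0:
            continue  # typical blocks are mostly zeros after quantization
        row = tables[k][q]  # table lookup replaces N multiplies
        for n in range(N):
            out[n] += row[n]
    return out

print([round(x, 3) for x in idct_sparse([8, 0, 0, -2, 0, 0, 0, 0], tables)])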

Journal ArticleDOI
TL;DR: A JPEG 2000-based codec framework is proposed that provides a generic architecture suitable for the compression of many types of off-axis holograms. The framework has a JPEG 2000 codec at its core, extended with fully arbitrary wavelet decomposition styles and directional wavelet transforms.
Abstract: With the advent of modern computing and imaging technologies, digital holography is becoming widespread in various scientific disciplines such as microscopy, interferometry, surface shape measurements, vibration analysis, data encoding, and certification. Therefore, designing an efficient data representation technology is of particular importance. Off-axis holograms have very different signal properties with respect to regular imagery, because they represent a recorded interference pattern with its energy biased toward the high-frequency bands. This causes traditional image coders, which assume an underlying 1/f^2 power spectral density distribution, to perform suboptimally for this type of imagery. We propose a JPEG 2000-based codec framework that provides a generic architecture suitable for the compression of many types of off-axis holograms. This framework has a JPEG 2000 codec at its core, extended with (1) fully arbitrary wavelet decomposition styles and (2) directional wavelet transforms. Using this codec, we report significant improvements in coding performance for off-axis holography relative to the conventional JPEG 2000 standard, with Bjontegaard delta peak signal-to-noise ratio improvements ranging from 1.3 to 11.6 dB for lossy compression in the 0.125 to 2.00 bpp range, and bit-rate reductions of up to 1.6 bpp for lossless compression.

52 citations
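The 1/f^2 mismatch is easy to check empirically. The sketch below (plain NumPy, not from the paper) estimates a radially averaged power spectrum; for a synthetic off-axis fringe pattern the energy peaks at the carrier frequency instead of decaying from DC.

# Radially averaged power spectral density of a 2-D image.

import numpy as np

def radial_psd(img):
    F = np.fft.fftshift(np.fft.fft2(img))
    psd = np.abs(F) ** 2
    h, w = img.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h // 2, x - w // 2).astype(int)
    # Mean power in each integer-radius bin
    total = np.bincount(r.ravel(), weights=psd.ravel())
    count = np.bincount(r.ravel())
    return total / np.maximum(count, 1)

# Synthetic, zero-mean off-axis fringe pattern: a tilted plane-wave carrier.
h = w = 256
y, x = np.indices((h, w))
img = np.cos(2 * np.pi * (0.2 * x + 0.1 * y))
spectrum = radial_psd(img)
print(spectrum.argmax())  # peak radius sits far from zero frequency (the carrier)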

Journal ArticleDOI
TL;DR: A novel method based on topology-preserving neural networks is used to implement vector quantization for medical image compression; unlike standard vector quantization, it can be applied to larger image blocks and yields better estimates of the underlying probability distribution.

52 citations
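As a rough illustration of topology-preserving vector quantization, the sketch below trains a small 1-D self-organizing map as a codebook for 4x4 image blocks. The network type, block size, and training schedule are all assumptions chosen for illustration, not the paper's configuration.

# Hedged sketch: vector quantization with a 1-D self-organizing map.

import numpy as np

def train_som(blocks, n_codes=64, epochs=10, lr=0.5, sigma=2.0):
    """Fit a 1-D SOM codebook to flattened image blocks."""
    rng = np.random.default_rng(0)
    codebook = blocks[rng.choice(len(blocks), n_codes)].astype(float)
    idx = np.arange(n_codes)
    for _ in range(epochs):
        for v in blocks[rng.permutation(len(blocks))]:
            winner = np.argmin(((codebook - v) ** 2).sum(axis=1))
            # Neighbors of the winner move too: this preserves topology,
            # so nearby indices end up holding similar blocks.
            h = np.exp(-((idx - winner) ** 2) / (2 * sigma ** 2))
            codebook += lr * h[:, None] * (v - codebook)
        lr *= 0.7
        sigma *= 0.8
    return codebook

def encode(blocks, codebook):
    d = ((blocks[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)  # one codeword index per block

# 4x4 blocks from a random test "image"; real use would tile a scan.
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64)).astype(float)
blocks = img.reshape(16, 4, 16, 4).swapaxes(1, 2).reshape(-1, 16)
codebook = train_som(blocks)
indices = encode(blocks, codebook)
print(indices.shape, codebook.shape)  # (256,) (64, 16)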


Network Information
Related Topics (5)

Topic                           Papers    Citations   Related
Feature extraction              111.8K    2.1M        84%
Image segmentation              79.6K     1.8M        84%
Feature (computer vision)       128.2K    1.7M        84%
Image processing                229.9K    3.5M        83%
Robustness (computer science)   94.7K     1.6M        81%
Performance Metrics

No. of papers in the topic in previous years:

Year   Papers
2022   8
2021   354
2020   283
2019   294
2018   259
2017   295