scispace - formally typeset

Quantization (image processing)

About: Quantization (image processing) is a research topic. Over its lifetime, 7977 publications have been published within this topic, receiving 126632 citations.


Papers
Journal ArticleDOI
TL;DR: This correspondence addresses the use of a joint source-channel coding strategy for enhancing the error resilience of images transmitted over a binary channel with additive Markov noise via a maximum a posteriori (MAP) channel detector.
Abstract: This article addresses the use of a joint source-channel coding strategy for enhancing the error resilience of images transmitted over a binary channel with additive Markov noise. In this scheme, inherent or residual (after source coding) image redundancy is exploited at the receiver via a maximum a posteriori (MAP) channel detector. This detector, which is optimal in terms of minimizing the probability of error, also exploits the larger capacity of the channel with memory as opposed to the interleaved (memoryless) channel. We first consider MAP channel decoding of uncompressed two-tone and bit-plane encoded grey-level images. Next, we propose a scheme relying on unequal error protection and MAP detection for transmitting grey-level images compressed using the discrete cosine transform (DCT), zonal coding, and quantization. Experimental results demonstrate that for various overall (source and channel) operational rates, significant performance improvements can be achieved over interleaved systems that do not incorporate image redundancy.

39 citations
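The MAP detection idea above can be sketched in miniature: for a toy binary channel with additive (XOR) Markov noise, the detector scores every candidate source word by its source prior times the probability of the implied noise sequence, so both residual source redundancy and the noise memory are exploited. All parameters below are illustrative assumptions, not values from the paper, and the brute-force search is only viable for toy block lengths.

```python
import itertools

# Bursty Markov noise model (assumed values): P(z_t = 1 | z_{t-1} = s)
P_Z0 = 0.1                   # P(z_1 = 1)
P_Z = {0: 0.05, 1: 0.6}      # noise tends to stay "on" once it starts

# Residual source redundancy (assumed): i.i.d. bits with P(x_t = 1) = 0.2
P_X1 = 0.2

def prob_noise(z):
    # Probability of a noise sequence under the first-order Markov model
    p = P_Z0 if z[0] else 1 - P_Z0
    for prev, cur in zip(z, z[1:]):
        p *= P_Z[prev] if cur else 1 - P_Z[prev]
    return p

def prob_source(x):
    # Prior probability of a source word under the i.i.d. bit model
    p = 1.0
    for b in x:
        p *= P_X1 if b else 1 - P_X1
    return p

def map_detect(y):
    # MAP rule: maximize P(x) * P(z = x XOR y) over all source words
    n = len(y)
    best = max(itertools.product((0, 1), repeat=n),
               key=lambda x: prob_source(x) *
                             prob_noise(tuple(a ^ b for a, b in zip(x, y))))
    return list(best)

print(map_detect([1, 1, 1, 0]))  # burst of ones is plausibly all noise here
```

Because the noise model favors bursts, a run of received ones can be more cheaply explained as a noise burst than as transmitted ones, which is exactly the memory a memoryless (interleaved) detector cannot exploit.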

Journal ArticleDOI
TL;DR: A novel watermarking scheme that exploits the features of micro images of watermarks to build association rules and embeds the rules into a host image instead of the bit stream of the watermark, which is commonly used in digital water marking.
Abstract: A novel watermarking scheme is proposed that could substantially improve current watermarking techniques. This scheme exploits the features of micro images of watermarks to build association rules and embeds the rules into a host image instead of the bit stream of the watermark, which is commonly used in digital watermarking. Next, similar micro images with the same rules are collected or even created from the host image to simulate an extracted watermark. This method, called the features classification forest, can achieve blind extraction and is adaptable to any watermarking scheme using a quantization-based mechanism. Furthermore, a larger watermark can be accepted without an adverse effect on the imperceptibility of the host image. The experiments demonstrate the successful simulation of watermarks and the application to five different watermarking schemes. One of them is slightly adjusted from a reference to especially resist JPEG compression, and the others show native advantages in resisting different image processing attacks.

39 citations

Journal ArticleDOI
TL;DR: This work proposes a new compression format, .zfib, for streamline tractography datasets reconstructed from diffusion magnetic resonance imaging (dMRI), showing that such datasets are highly compressible and that this opens new opportunities for connectomics and tractometry applications.

39 citations

Journal ArticleDOI
TL;DR: In this paper, the properties of complex-valued SAR images relevant to the task of data compression are examined. The authors advocate transform-based compression methods but employ radically different quantization strategies from those commonly used for incoherent optical images.
Abstract: Synthetic aperture radars (SAR) are coherent imaging systems that produce complex-valued images of the ground. Because modern systems can generate large amounts of data, there is substantial interest in applying image compression techniques to these products. We examine the properties of complex-valued SAR images relevant to the task of data compression. We advocate the use of transform-based compression methods but employ radically different quantization strategies than those commonly used for incoherent optical images. The theory, methodology, and examples are presented.

39 citations
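One reason complex-valued SAR data calls for different quantization is its statistics: fully developed speckle has Rayleigh-distributed magnitude and uniform phase, so quantizing real and imaginary parts with an optical-image quantizer wastes levels. The sketch below quantizes toy complex samples in polar form (uniform in log-magnitude, uniform in phase) purely as a generic illustration of this point; it is not the strategy from the paper, and all bit allocations are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "SAR-like" speckle: circular complex Gaussian samples
samples = rng.normal(size=256) + 1j * rng.normal(size=256)

def quantize_polar(z, mag_bits=4, phase_bits=4):
    # Hypothetical polar quantizer: uniform in log-magnitude and in phase
    mag = np.abs(z)
    log_mag = np.log(mag + 1e-12)
    lo, hi = log_mag.min(), log_mag.max()
    levels = 2 ** mag_bits
    step = (hi - lo) / levels
    idx = np.floor((log_mag - lo) / step).clip(0, levels - 1)
    q_log = lo + (idx + 0.5) * step            # mid-point reconstruction
    pstep = 2 * np.pi / 2 ** phase_bits
    q_phase = (np.floor(np.angle(z) / pstep) + 0.5) * pstep
    return np.exp(q_log) * np.exp(1j * q_phase)

zq = quantize_polar(samples)
snr = 10 * np.log10(np.mean(np.abs(samples) ** 2) /
                    np.mean(np.abs(samples - zq) ** 2))
print(f"reconstruction SNR: {snr:.1f} dB")
```

Preserving phase accurately matters here because downstream coherent processing (e.g. interferometry) depends on it, which is precisely why incoherent-image quantizers are a poor fit.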

Patent
Ricardo L. de Queiroz
22 Jul 2002
TL;DR: In this article, a unique hashing function is derived from a first section of image data contained in the JPEG compressed image, such that any change subsequently made to that section yields a different hash; a signature string derived from the hash is embedded into the next section of the image data.
Abstract: A system and method for authentication of JPEG image data enables the recipient to ascertain whether the received image file originated from a known identified source or whether the contents of the file have been altered in some fashion prior to receipt. A unique hashing function is derived from a first section of image data contained in the JPEG compressed image in such a way that any changes subsequently made to the first section of image data are reflected in a different hashing function being derived; a signature string based on this hash is embedded into a next section of the image data. Since the embedding of a previous section's integrity checking number is done without modifying the JPEG bit stream, any JPEG decoder can thereafter properly decode the image.

39 citations
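The section-chaining idea in the patent abstract can be sketched abstractly: the digest of section i travels with section i+1, so tampering with any earlier section breaks verification downstream. The section contents and the use of SHA-256 below are illustrative assumptions; the actual scheme embeds the digest inside the JPEG data without altering the standard bit stream.

```python
import hashlib

def chain_sections(sections):
    # Pair each section with the digest of the PREVIOUS section
    chained = []
    prev_digest = b""
    for data in sections:
        chained.append((data, prev_digest))
        prev_digest = hashlib.sha256(data).digest()
    return chained

def verify(chained):
    # Recompute digests in order and compare with the embedded ones
    prev_digest = b""
    for data, embedded in chained:
        if embedded != prev_digest:
            return False
        prev_digest = hashlib.sha256(data).digest()
    return True

sections = [b"scan-part-1", b"scan-part-2", b"scan-part-3"]
chained = chain_sections(sections)
print(verify(chained))   # True

# Altering an earlier section invalidates the digest embedded in the next one
tampered = [(b"scan-part-X", d) if i == 0 else (s, d)
            for i, (s, d) in enumerate(chained)]
print(verify(tampered))  # False
```

Because verification only needs the file itself, any recipient can check integrity while an unmodified JPEG decoder still renders the image normally.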


Network Information
Related Topics (5)
Feature extraction: 111.8K papers, 2.1M citations, 84% related
Image segmentation: 79.6K papers, 1.8M citations, 84% related
Feature (computer vision): 128.2K papers, 1.7M citations, 84% related
Image processing: 229.9K papers, 3.5M citations, 83% related
Robustness (computer science): 94.7K papers, 1.6M citations, 81% related
Performance Metrics
No. of papers in the topic in previous years:

Year  Papers
2022  8
2021  354
2020  283
2019  294
2018  259
2017  295