Topic
Quantization (image processing)
About: Quantization (image processing) is a research topic. Over the lifetime, 7977 publications have been published within this topic receiving 126632 citations.
Papers published on a yearly basis
Papers
TL;DR: This correspondence addresses the use of a joint source-channel coding strategy for enhancing the error resilience of images transmitted over a binary channel with additive Markov noise via a maximum a posteriori (MAP) channel detector.
Abstract: This article addresses the use of a joint source-channel coding strategy for enhancing the error resilience of images transmitted over a binary channel with additive Markov noise. In this scheme, inherent or residual (after source coding) image redundancy is exploited at the receiver via a maximum a posteriori (MAP) channel detector. This detector, which is optimal in terms of minimizing the probability of error, also exploits the larger capacity of the channel with memory as opposed to the interleaved (memoryless) channel. We first consider MAP channel decoding of uncompressed two-tone and bit-plane encoded grey-level images. Next, we propose a scheme relying on unequal error protection and MAP detection for transmitting grey-level images compressed using the discrete cosine transform (DCT), zonal coding, and quantization. Experimental results demonstrate that for various overall (source and channel) operational rates, significant performance improvements can be achieved over interleaved systems that do not incorporate image redundancy.
39 citations
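The compression pipeline this abstract names — DCT, zonal coding, and quantization — can be sketched for a single image block as follows. This is a minimal illustration of the generic technique, not the paper's configuration; the function names and the zone/step parameters are assumptions.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix (rows are basis vectors)."""
    k = np.arange(n).reshape(-1, 1)
    j = np.arange(n)
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * j + 1) * k / (2 * n))
    c[0, :] /= np.sqrt(2.0)  # DC-row scaling for orthonormality
    return c

def zonal_quantize(block, zone=4, step=8.0):
    """2-D DCT, keep only the low-frequency zone x zone corner
    (zonal coding), then apply uniform scalar quantization."""
    c = dct_matrix(block.shape[0])
    coeffs = c @ block.astype(float) @ c.T
    mask = np.zeros_like(coeffs)
    mask[:zone, :zone] = 1.0          # retain the upper-left (low-frequency) zone
    return np.round(coeffs * mask / step)

def zonal_dequantize(indices, step=8.0):
    """Rescale the quantization indices and invert the 2-D DCT."""
    c = dct_matrix(indices.shape[0])
    return c.T @ (indices * step) @ c
```

With a small zone most coefficients are discarded outright, which is where the rate saving (and the distortion) comes from.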
••
TL;DR: A novel watermarking scheme that exploits the features of micro images of watermarks to build association rules and embeds the rules into a host image instead of the bit stream of the watermark, which is commonly used in digital watermarking.
Abstract: A novel watermarking scheme is proposed that could substantially improve current watermarking techniques. This scheme exploits the features of micro images of watermarks to build association rules and embeds the rules into a host image instead of the bit stream of the watermark, which is commonly used in digital watermarking. Next, similar micro images with the same rules are collected or even created from the host image to simulate an extracted watermark. This method, called the features classification forest, can achieve blind extraction and is adaptable to any watermarking scheme using a quantization-based mechanism. Furthermore, a larger size watermark can be accepted without an adverse effect on the imperceptibility of the host image. The experiments demonstrate the successful simulation of watermarks and the application to five different watermarking schemes. One of them is slightly adjusted from a reference to especially resist JPEG compression, and the others show native advantages to resist different image processing attacks.
39 citations
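The abstract above says the method is adaptable to any watermarking scheme using a quantization-based mechanism. A standard example of such a mechanism is quantization index modulation (QIM); the sketch below (illustrative function names, not this paper's algorithm) embeds one bit per coefficient by snapping it onto an even or odd lattice:

```python
import numpy as np

def qim_embed(coeffs, bits, step=16.0):
    """Embed one bit per coefficient: bit 0 snaps the value to
    multiples of `step`, bit 1 to the lattice offset by step/2."""
    bits = np.asarray(bits, dtype=float)
    return step * np.round((coeffs - bits * step / 2) / step) + bits * step / 2

def qim_extract(coeffs, step=16.0):
    """Recover each bit from which lattice the value lies nearer to."""
    return (np.round(coeffs / (step / 2)) % 2).astype(int)
```

Extraction needs no original image (blind extraction), and the embedded bits survive any perturbation smaller than step/4.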
••
TL;DR: This work proposes a new compression format, .zfib, for streamline tractography datasets reconstructed from diffusion magnetic resonance imaging (dMRI); such datasets are highly compressible, and the format opens new opportunities for connectomics and tractometry applications.
39 citations
••
TL;DR: This paper examines the properties of complex-valued SAR images relevant to the task of data compression, and advocates transform-based compression methods that employ radically different quantization strategies from those commonly used for incoherent optical images.
Abstract: Synthetic aperture radars (SAR) are coherent imaging systems that produce complex-valued images of the ground. Because modern systems can generate large amounts of data, there is substantial interest in applying image compression techniques to these products. We examine the properties of complex-valued SAR images relevant to the task of data compression. We advocate the use of transform-based compression methods but employ radically different quantization strategies than those commonly used for incoherent optical images. The theory, methodology, and examples are presented.
39 citations
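One family of "radically different" quantization strategies for complex-valued pixels is to quantize log-magnitude and phase separately rather than the real and imaginary parts. The sketch below is a generic polar quantizer under that assumption — the function name and bit allocations are illustrative, not the paper's specific design:

```python
import numpy as np

def quantize_polar(img, mag_bits=4, phase_bits=5):
    """Uniformly quantize log-magnitude over its observed range and
    phase over [-pi, pi), then reconstruct the complex pixels."""
    mag = np.log1p(np.abs(img))                 # compress dynamic range
    lo, hi = float(mag.min()), float(mag.max())
    levels = 2 ** mag_bits
    step = (hi - lo) / levels if hi > lo else 1.0
    idx = np.clip(np.floor((mag - lo) / step), 0, levels - 1)
    mag_rec = lo + (idx + 0.5) * step           # mid-point reconstruction

    p_levels = 2 ** phase_bits
    p_step = 2 * np.pi / p_levels
    p_idx = np.clip(np.floor((np.angle(img) + np.pi) / p_step), 0, p_levels - 1)
    ph_rec = -np.pi + (p_idx + 0.5) * p_step

    return np.expm1(mag_rec) * np.exp(1j * ph_rec)
```

The design choice here is that SAR phase is typically close to uniformly distributed, so a uniform phase quantizer wastes little; the magnitude's heavy tail is handled by the log transform before uniform quantization.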
22 Jul 2002
TL;DR: A unique hashing function is derived from a first section of image data contained in the JPEG compressed image, in such a way that any changes subsequently made to that section are reflected in a different hashing function being derived; a signature string based on this hash is embedded into a next section of the image data.
Abstract: A system and method for authentication of JPEG image data enables the recipient to ascertain whether the received image file originated from a known identified source or whether the contents of the file have been altered in some fashion prior to receipt. A unique hashing function is derived from a first section of image data contained in the JPEG compressed image in such a way that any changes subsequently made to the first section of image data are reflected in a different hashing function being derived; a signature string based on this hash is then embedded into a next section of the image data. Since the embedding of a previous section's integrity checking number is done without modifying the JPEG bit stream, any JPEG decoder can thereafter properly decode the image.
39 citations
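The chaining idea — each section's integrity value is computed and carried into the next section, so tampering anywhere invalidates everything downstream — can be sketched with an ordinary hash. SHA-256 here stands in for whatever hashing function the patent derives, and actually embedding the digest into the JPEG stream without breaking decodability is the part this sketch omits:

```python
import hashlib

def chain_digests(sections):
    """Digest of each section covers its own bytes plus the previous
    section's digest, so altering any section changes every digest
    computed from that point onward."""
    digest = b""
    out = []
    for data in sections:
        digest = hashlib.sha256(digest + data).digest()
        out.append(digest)
    return out
```

Comparing a stored chain against a recomputed one then localizes the first altered section: digests match up to the tamper point and diverge afterward.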