Topic
Lossless JPEG
About: Lossless JPEG is a research topic. Over the lifetime, 2415 publications have been published within this topic receiving 51110 citations. The topic is also known as: Lossless JPEG & .jls.
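For context, lossless JPEG itself is a DPCM scheme: each sample is predicted from its causal neighbors using one of seven standard predictors (ITU-T T.81, Annex H), and the prediction residual is entropy-coded. A minimal sketch of the predictors and residual computation (Ra = left, Rb = above, Rc = above-left; edge handling here follows the common fallbacks):

```python
import numpy as np

# The seven standard lossless JPEG predictors (ITU-T T.81, Annex H).
# Ra = left neighbor, Rb = above neighbor, Rc = above-left neighbor.
PREDICTORS = {
    1: lambda ra, rb, rc: ra,
    2: lambda ra, rb, rc: rb,
    3: lambda ra, rb, rc: rc,
    4: lambda ra, rb, rc: ra + rb - rc,
    5: lambda ra, rb, rc: ra + (rb - rc) // 2,
    6: lambda ra, rb, rc: rb + (ra - rc) // 2,
    7: lambda ra, rb, rc: (ra + rb) // 2,
}

def residuals(img, selector=4):
    """Prediction residuals for a 2-D image; first row/column fall back to Ra/Rb."""
    img = np.asarray(img, dtype=np.int64)
    pred = PREDICTORS[selector]
    out = np.zeros_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            if y == 0 and x == 0:
                p = 128                      # 2**(precision-1) for 8-bit samples
            elif y == 0:
                p = img[y, x - 1]            # only Ra is available
            elif x == 0:
                p = img[y - 1, x]            # only Rb is available
            else:
                p = pred(img[y, x - 1], img[y - 1, x], img[y - 1, x - 1])
            out[y, x] = img[y, x] - p        # residual to be entropy-coded
    return out
```

Because the predictor only uses already-decoded samples, the decoder can re-derive each prediction and add the residual back, making the scheme exactly invertible.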
Papers
••
30 Oct 2009
TL;DR: Different detection methods are proposed based on different q1, q2, and their effectiveness is demonstrated experimentally.
Abstract: The difference of the distributions under different q1 and q2 is analyzed in this paper. Detection methods are proposed for each case, and experiments demonstrate their effectiveness.
2 citations
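The abstract above survives only as a fragment, but the standard observation behind q1/q2-based analysis (assumed context, not recovered from the paper) is that double quantization leaves periodic gaps or peaks in the coefficient histogram. A minimal numeric illustration of that effect:

```python
import numpy as np

def double_quantize(x, q1, q2):
    """Quantize with step q1, dequantize, then requantize with step q2."""
    return np.round(np.round(x / q1) * q1 / q2)

rng = np.random.default_rng(1)
coeffs = rng.laplace(scale=20.0, size=100_000)   # DCT-like coefficient model

single = np.round(coeffs / 3)                    # single quantization, q2 = 3
double = double_quantize(coeffs, 5, 3)           # q1 = 5 followed by q2 = 3

# With q1 > q2, some quantization bins become unreachable: the
# double-quantized histogram has empty bins the single one lacks.
bins = np.arange(-10.5, 11.5)                    # bins centered on -10..10
h_single, _ = np.histogram(single, bins)
h_double, _ = np.histogram(double, bins)
```

Here bins ±1, ±4, ±6 and ±9 are unreachable after the 5-then-3 chain, which is exactly the kind of signature a detector keyed to (q1, q2) can test for.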
••
01 Dec 2013
TL;DR: A method for lossless compression of high dynamic range (HDR) images encoded in LogLuv32 format is presented; an average bit rate of 9.8 bits/pixel was obtained, shrinking the data set to 31% of its uncompressed size on average.
Abstract: A method of lossless compression of high dynamic range (HDR) images encoded in LogLuv32 format is presented. Simple bitplane coding (SBC) is independently applied to the components of the LogLuv data. A context model for coding a bit is created by comparing neighboring pixel values, and is fed to a range coder. As a result, an average bit rate of 9.8 bits/pixel was obtained in lossless compression of 34 popular LogLuv32 HDR images. That is, the data set shrinks to 31% of its uncompressed size on average.
2 citations
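The entry describes context-modeled bitplane coding only in outline. The sketch below is a generic illustration of the idea, not the paper's actual SBC model: each bit of a bitplane gets a context from already-coded neighbor bits, an adaptive per-context probability model is updated as coding proceeds, and the ideal code length -log2(p) stands in for a real range coder.

```python
import math
import numpy as np

def bitplane_code_length(img, bits=8):
    """Estimate coded size (bytes) of an image under per-bitplane context modeling.

    Context for bit (y, x): the same-plane bits of the left and upper
    neighbors (4 contexts). Each context keeps adaptive 0/1 counts; the
    ideal code length -log2(p) stands in for a real range coder.
    """
    img = np.asarray(img, dtype=np.uint64)
    total = 0.0
    for plane in range(bits - 1, -1, -1):        # most significant plane first
        bitmap = (img >> plane) & 1
        counts = {c: [1, 1] for c in range(4)}   # Laplace-smoothed 0/1 counts
        h, w = bitmap.shape
        for y in range(h):
            for x in range(w):
                left = int(bitmap[y, x - 1]) if x > 0 else 0
                up = int(bitmap[y - 1, x]) if y > 0 else 0
                ctx = (left << 1) | up
                b = int(bitmap[y, x])
                c0, c1 = counts[ctx]
                p = (c1 if b else c0) / (c0 + c1)
                total += -math.log2(p)           # ideal range-coder cost
                counts[ctx][b] += 1              # adapt the model
    return total / 8                             # bits -> bytes
```

On smooth images the per-context models quickly become skewed and the estimated size falls far below the raw size, while on noise-like data it stays near 1 byte per pixel.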
••
13 Mar 2003
TL;DR: The lossy compression methods discussed are the JPEG standard, and four approaches based on the Wavelet Transform: the Embedded coding of ZeroTree wavelet coefficients, the Set Partitioning in Hierarchical Trees, a Lattice Vector Quantizer, and the new JPEG2K.
Abstract: Several well-known methods for lossy compression of still images are analyzed here to evaluate their performance on hyperspectral images. The lossy compression methods discussed are the JPEG standard and four approaches based on the Wavelet Transform: the Embedded coding of ZeroTree wavelet coefficients (EZT), the Set Partitioning in Hierarchical Trees (SPIHT), a Lattice Vector Quantizer (LVQ), and the new JPEG2K.
Experiments are first performed on corpuses of natural grayscale still images to provide a general framework for the performance of each method. Then experiments are performed on several hyperspectral images taken with CASI and AVIRIS sensors. The experiments show that it is possible to employ the basic lossy compression methods for hyperspectral image coding. The wavelet-based approaches produce results consistently better than JPEG: JPEG cannot achieve compression ratios above 75:1, whereas with EZT, SPIHT and LVQ compression ratios of 250:1 or higher may be reached. JPEG2K can also reach higher compression ratios than JPEG, but with lower PSNR quality than the other three techniques. At compression ratios of about 8:1, the wavelet methods yield results 1.5 dB better than those of JPEG. These results help to explain why the JPEG2K standard uses the WT instead of the DCT.
2 citations
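The figures above mix compression ratios and PSNR; both quantities are straightforward to compute. A generic sketch, not tied to any of the codecs discussed:

```python
import numpy as np

def compression_ratio(original_bytes, compressed_bytes):
    """Ratio of original to compressed size, e.g. 75.0 for '75:1'."""
    return original_bytes / compressed_bytes

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB between two same-shape images."""
    original = np.asarray(original, dtype=np.float64)
    reconstructed = np.asarray(reconstructed, dtype=np.float64)
    mse = np.mean((original - reconstructed) ** 2)
    if mse == 0:
        return float("inf")                  # identical images
    return 10.0 * np.log10(peak ** 2 / mse)
```

As a sanity check on the numbers in the abstract: for 8-bit samples, a 75:1 ratio corresponds to 8/75 ≈ 0.107 bits/pixel, and 250:1 to 8/250 = 0.032 bits/pixel.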
•
TL;DR: Experimental results show that the proposed semi-fragile watermarking scheme is robust against JPEG lossy compression, and provides very effective classification of intentional and incidental tampering.
Abstract: A semi-fragile watermarking scheme is proposed that endures JPEG compression. The algorithm combines the JPEG image coding system with cryptography, using wavelets and chaos to generate a watermark related to the image content, which provides strong security for the watermark. The algorithm exploits quantities that are invariant before and after JPEG compression: the watermark is semi-fragilely embedded in the DCT domain while the key is simultaneously hidden in the carrier. Experimental results show that the proposed algorithm is robust against JPEG lossy compression and provides very effective classification of intentional versus incidental tampering.
2 citations
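The entry gives no implementation detail. One common way to embed a semi-fragile bit in the DCT domain (a generic quantization-index-modulation sketch under that assumption, not this paper's algorithm) is to quantize a mid-frequency coefficient of each 8×8 block so that the parity of its quantizer index carries the bit; mild JPEG recompression preserves the parity, while local tampering breaks it.

```python
import numpy as np

N = 8
# Orthonormal DCT-II basis matrix for 8x8 blocks.
k = np.arange(N)
C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * N)) * np.sqrt(2 / N)
C[0, :] /= np.sqrt(2)

def embed_bit(block, bit, pos=(3, 2), step=16.0):
    """Embed one bit in a mid-frequency DCT coefficient via parity quantization."""
    coeffs = C @ block @ C.T                  # forward 2-D DCT
    q = np.round(coeffs[pos] / step)
    if int(q) % 2 != bit:                     # force quantizer-index parity to the bit
        q += 1 if coeffs[pos] >= q * step else -1
    coeffs[pos] = q * step
    return C.T @ coeffs @ C                   # inverse 2-D DCT

def extract_bit(block, pos=(3, 2), step=16.0):
    coeffs = C @ block @ C.T
    return int(np.round(coeffs[pos] / step)) % 2
```

The coefficient position `pos` and step size are illustrative choices; a larger step survives coarser recompression at the cost of more visible distortion.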