scispace - formally typeset
Topic

Lossless JPEG

About: Lossless JPEG is a research topic. Over the lifetime, 2415 publications have been published within this topic receiving 51110 citations. The topic is also known as: Lossless JPEG & .jls.


Papers
Proceedings ArticleDOI
24 Mar 2010
TL;DR: Some coding technologies newly proposed and applied to the G.711.0 codec, such as Plus-Minus zero mapping for the mapped domain linear predictive coding and escaped-Huffman coding combined with adaptive recursive Rice coding for lossless compression of the prediction residual are introduced.
Abstract: ITU-T Rec. G.711 is widely used for narrowband speech communication. ITU-T has just established a very low-complexity and efficient lossless coding standard for G.711, called G.711.0 - Lossless compression of G.711 pulse code modulation. This paper introduces some coding technologies newly proposed and applied to the G.711.0 codec, such as Plus-Minus zero mapping for the mapped-domain linear predictive coding and escaped-Huffman coding combined with adaptive recursive Rice coding for lossless compression of the prediction residual. Performance test results for these coding tools are compared with those of the conventional technology. The performance is measured by a figure of merit (FoM), a function of the trade-off between compression performance and computational complexity. The proposed tools improve the compression performance by 0.16% in total while keeping the computational complexity of the encoder/decoder pair low (about 1.0 WMOPS on average and 1.667 WMOPS in the worst case).
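The residual-coding ideas above can be illustrated with a toy sketch (not the G.711.0 bitstream format): a sign-interleaving map in the spirit of the Plus-Minus zero mapping folds signed prediction residuals into non-negative indices, which a Rice code with parameter k then encodes as a unary quotient followed by a k-bit remainder. Function names and the sample residuals are illustrative.

```python
def zigzag_map(r):
    # Fold signed residuals into non-negative indices:
    # 0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ...
    return (r << 1) if r >= 0 else ((-r << 1) - 1)

def rice_encode(value, k):
    # Rice code with parameter k: unary-coded quotient (q ones and a
    # terminating zero) followed by the k-bit binary remainder.
    q = value >> k
    rem = format(value & ((1 << k) - 1), "b").zfill(k) if k else ""
    return "1" * q + "0" + rem

residuals = [0, -1, 3, -2, 5]
bits = "".join(rice_encode(zigzag_map(r), k=2) for r in residuals)
```

Small residuals cost few bits while large outliers remain representable, which is why Rice codes suit near-Laplacian prediction residuals; the adaptive variant in G.711.0 additionally tracks k over time.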

9 citations

Journal ArticleDOI
TL;DR: This paper proposes a procedure for the design of separable 2-D synthesis filters that minimize the reconstruction error power for transform coders and shows that the proposed decoding method gives some gain with respect to the usual decoder in most cases.
Abstract: Transform coding is a technique used worldwide for image coding, and JPEG has become the most common tool for image compression. In a JPEG decoder, the quantized transform coefficient blocks are usually processed using the inverse discrete cosine transform (DCT) in order to reconstruct an approximation of the original image. The direct and inverse DCT pair can be arranged in the form of a perfect reconstruction filter bank, and it can be shown that, in the presence of quantization of the transform coefficients, the perfect reconstruction synthesis is not the best choice. In this paper, we propose a procedure for the design of separable 2-D synthesis filters that minimize the reconstruction error power for transform coders. The procedure is used to design a family of filters which are used in the decoder instead of the inverse DCT. The appropriate reconstruction filters are selected on the basis of the standard quantization information provided in the JPEG bit stream. We show that the proposed decoding method gives some gain with respect to the usual decoder in most cases. Moreover, it only makes use of the standard information provided by a JPEG bit stream.
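The baseline that the paper improves on can be made concrete with a small numpy sketch of the standard perfect-reconstruction path: take the 2-D DCT of an 8×8 block, quantize and dequantize the coefficients, and reconstruct with the inverse DCT. The uniform step q is an illustrative stand-in for a JPEG quantization table, and the paper's optimized synthesis filters are not reproduced here.

```python
import numpy as np

def dct_matrix(N=8):
    # Orthonormal DCT-II basis; C @ C.T equals the identity.
    n = np.arange(N)
    C = np.sqrt(2 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    C[0] /= np.sqrt(2)
    return C

C = dct_matrix()
rng = np.random.default_rng(0)
block = rng.integers(0, 256, (8, 8)).astype(float)

q = 16.0                              # uniform step; real JPEG uses an 8x8 table
coeffs = C @ block @ C.T              # forward 2-D DCT (separable)
dequant = np.round(coeffs / q) * q    # quantize, then dequantize
recon = C.T @ dequant @ C             # perfect-reconstruction inverse DCT

err_power = np.mean((block - recon) ** 2)
```

Because the transform is orthonormal, the spatial-domain error power equals the coefficient-domain quantization error power; the paper's contribution is to replace the last step with synthesis filters tuned to that quantization noise.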

9 citations

Journal ArticleDOI
TL;DR: A simple and highly effective method is presented for automatically detecting the compression history of an image; it can be applied to multi-compression detection and is robust to different sources of out-camera compression, e.g. Adobe Photoshop.
Abstract: An illicit photographic work can be exposed by its unusual compression history. Our work aims at revealing the primary JPEG compression of a camera image, especially when it has undergone an out-camera JPEG compression. The proposed method runs a recompression operator on a given image using a chosen software tool (MATLAB). We measure the JPEG error between the given image and the recompressed version in the Y, Cb and Cr color channels. The in-camera compression can be easily identified by drawing the JPEG error curves. In this paper a simple and highly effective method is presented for automatically detecting the compression history of an image. For a doubly compressed image, the proposed method can give the historical compression sequence with the corresponding quality factors and determine whether the first compression is the in-camera compression. Experimental results, carried out on two datasets, show that the proposed method yields satisfactory detection accuracy: over 96% accuracy for in-camera compression and no false positives with a block size of 512 × 512. The proposed method is universal: it can be applied to multi-compression detection and is robust to different sources of out-camera compression, e.g. Adobe Photoshop. This makes it more practical than previous double-compression detection methods.
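The error-curve idea can be sketched with a toy quantizer standing in for JPEG (real JPEG quantizes DCT coefficients with quality-dependent tables; the steps and data here are illustrative): recompressing with the original quantization step reproduces the signal exactly, so the error curve dips to zero there.

```python
import numpy as np

def fake_jpeg(signal, step):
    # Toy stand-in for JPEG compression: uniform quantization.
    return np.round(signal / step) * step

rng = np.random.default_rng(1)
original = rng.uniform(0, 255, 4096)
compressed = fake_jpeg(original, step=12)   # unknown "in-camera" step

# Recompress with candidate steps and trace the error curve; the error
# vanishes whenever the candidate step divides the original one, so the
# largest zero-error candidate recovers the in-camera step.
candidates = range(2, 25)
errors = [np.mean((compressed - fake_jpeg(compressed, s)) ** 2) for s in candidates]
detected = max(s for s, e in zip(candidates, errors) if e < 1e-9)
```

The paper's method works analogously but per color channel (Y, Cb, Cr) over JPEG quality factors, reading the in-camera quality off the dips in the measured JPEG error curves.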

9 citations

Journal ArticleDOI
Yanjun Cao, Tiegang Gao, Guorui Sheng, Li Fan, Lin Gao
TL;DR: A novel anti-forensic algorithm is proposed, which is capable of concealing the quantization artifacts left in a singly JPEG-compressed image and can be used to verify the reliability of JPEG forensic tools.
Abstract: To prevent image forgeries, a number of forensic techniques for digital images have been developed that can detect an image's origin, trace its processing history, and locate the position of tampering. In particular, the statistical footprint left by a JPEG compression operation can be a valuable source of information for the forensic analyst, and some image forensic algorithms have been proposed based on image statistics in the DCT domain. Recently, it has been shown that these footprints can be removed by adding a suitable anti-forensic dithering signal to the image in the DCT domain, which invalidates some image forensic algorithms. In this paper, a novel anti-forensic algorithm is proposed that is capable of concealing the quantization artifacts left in a singly JPEG-compressed image. In the scheme, a chaos-based dither is added to an image's DCT coefficients to remove such artifacts. Both the effectiveness of the scheme and the loss of image quality are evaluated through experiments. The simulation results show that the proposed anti-forensic scheme can be used to verify the reliability of JPEG forensic tools.
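The dithering idea can be sketched in a few lines: quantized DCT coefficients cluster on multiples of the step q (a comb-shaped histogram), and adding a dither drawn from one quantization bin spreads them back out. The logistic map below is a generic chaotic generator used purely for illustration; the paper's actual dither construction and parameters are not reproduced.

```python
import numpy as np

def logistic_dither(n, x0=0.37, r=3.99):
    # Chaos-based sequence: iterate the logistic map x <- r*x*(1-x),
    # which stays inside (0, 1) for these parameters.
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1 - x)
        xs[i] = x
    return xs

q = 10.0
rng = np.random.default_rng(2)
coeffs = rng.laplace(scale=20, size=1000)    # model AC coefficients
quantized = np.round(coeffs / q) * q         # comb-shaped histogram
dither = (logistic_dither(1000) - 0.5) * q   # spread over one quantization bin
concealed = quantized + dither               # smooths out the comb
```

Each dithered coefficient stays within half a step of its quantized value, so the distortion is bounded while the telltale comb in the histogram disappears.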

9 citations

Journal ArticleDOI
TL;DR: This paper proposes a method for encoding still images based on the JPEG standard that allows the compression/decompression time cost and image quality to be adjusted to the needs of each application and to the bandwidth conditions of the network.
Abstract: There are a large number of image processing applications that work with different performance requirements and available resources. Recent advances in image compression focus on reducing image size and processing time, but offer no real-time solutions for providing time/quality flexibility of the resulting image, such as using them to transmit the image contents of web pages. In this paper we propose a method for encoding still images, based on the JPEG standard, that allows the compression/decompression time cost and image quality to be adjusted to the needs of each application and to the bandwidth conditions of the network. The real-time control is based on a collection of adjustable parameters relating both to aspects of implementation and to the hardware with which the algorithm is processed. The proposed encoding system is evaluated in terms of compression ratio, processing delay and quality of the compressed image when compared with the standard method. Highlights: our method performs real-time encoding/decoding of still images based on the JPEG standard; we introduce parameters that affect the performance of the image processing; adjusting the parameters allows the time/quality trade-off to meet the application constraints; experiments and quality tests were made to probe the method's consistency.
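As a hedged illustration of one possible time/quality knob (not the paper's actual parameter set), the sketch below keeps only the `keep` lowest-frequency DCT coefficients of an 8×8 block: a smaller `keep` means fewer coefficients to process downstream and a higher reconstruction error, exposing the trade-off as a single tunable parameter.

```python
import numpy as np

def dct_matrix(N=8):
    # Orthonormal DCT-II basis; C @ C.T equals the identity.
    n = np.arange(N)
    C = np.sqrt(2 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    C[0] /= np.sqrt(2)
    return C

def encode_block(block, keep):
    # Hypothetical knob: keep only the `keep` x `keep` lowest-frequency
    # coefficients; fewer coefficients -> less work, coarser quality.
    C = dct_matrix()
    coeffs = C @ block @ C.T
    mask = np.zeros_like(coeffs)
    mask[:keep, :keep] = 1
    return coeffs * mask

def decode_block(coeffs):
    C = dct_matrix()
    return C.T @ coeffs @ C

rng = np.random.default_rng(3)
block = rng.uniform(0, 255, (8, 8))
mse = [np.mean((block - decode_block(encode_block(block, k))) ** 2) for k in (2, 4, 8)]
```

The error decreases monotonically as `keep` grows, and `keep=8` discards nothing, so the reconstruction is exact up to floating-point rounding.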

9 citations


Network Information
Related Topics (5)
Image segmentation
79.6K papers, 1.8M citations
82% related
Feature (computer vision)
128.2K papers, 1.7M citations
82% related
Feature extraction
111.8K papers, 2.1M citations
82% related
Image processing
229.9K papers, 3.5M citations
80% related
Convolutional neural network
74.7K papers, 2M citations
79% related
Performance
Metrics
No. of papers in the topic in previous years
Year	Papers
2023	21
2022	40
2021	5
2020	2
2019	8
2018	15