Topic

Lossless JPEG

About: Lossless JPEG is a research topic. Over the lifetime, 2,415 publications have been published within this topic, receiving 51,110 citations. The topic is also known as .jls.


Papers
Proceedings ArticleDOI
05 Apr 1989
TL;DR: Many image transmission/storage applications requiring some form of data compression additionally require that the decoded image be an exact replica of the original; lossless image coding algorithms meet this requirement by generating a decoded image that is numerically identical to the original.
Abstract: Many image transmission/storage applications requiring some form of data compression additionally require that the decoded image be an exact replica of the original. Lossless image coding algorithms meet this requirement by generating a decoded image that is numerically identical to the original. Several lossless coding techniques are modifications of well-known lossy schemes, whereas others are new. Traditional Markov-based models and newer arithmetic coding techniques are applied to predictive coding, bit plane processing, and lossy plus residual coding. Generally speaking, the compression ratios offered by these techniques are in the range of 1.6:1 to 3:1 for 8-bit pictorial images. Compression ratios for 12-bit radiological images approach 3:1, as these images have less detailed structure, and hence, their higher pel correlation leads to a greater removal of image redundancy.
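As a concrete illustration of the predictive-coding idea this paper surveys, here is a minimal Python sketch (not the paper's algorithm) using a simple left-neighbor predictor on a NumPy image array. Real codecs use stronger context models and follow the residuals with an entropy coder such as arithmetic coding; the point here is only that the transform is exactly invertible, which is what makes the scheme lossless.

```python
import numpy as np

def predictive_residuals(img: np.ndarray) -> np.ndarray:
    """Left-neighbor predictive coding: each pixel is predicted by the
    pixel to its left; the first column is kept verbatim. The residual
    image is losslessly invertible and typically has lower entropy, so
    an entropy coder (e.g. arithmetic coding) compresses it better."""
    img = img.astype(np.int16)            # widen so residuals can be negative
    res = img.copy()
    res[:, 1:] = img[:, 1:] - img[:, :-1]
    return res

def reconstruct(res: np.ndarray) -> np.ndarray:
    """Exact inverse: a cumulative sum along each row restores the image."""
    return np.cumsum(res, axis=1).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)
    # Round trip must be bit-exact for a lossless scheme.
    assert np.array_equal(img, reconstruct(predictive_residuals(img)))
```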

20 citations

Book ChapterDOI
01 Oct 2014
TL;DR: A counter-forensic technique that makes multiple compression undetectable for any forensic detector based on the analysis of the histograms of quantized DCT coefficients is proposed.
Abstract: Detection of multiple JPEG compression of digital images has been attracting more and more interest in the field of multimedia forensics. On the other hand, techniques to conceal the traces of multiple compression are being proposed as well. Motivated by a recent trend towards the adoption of universal approaches, we propose a counter-forensic technique that makes multiple compression undetectable for any forensic detector based on the analysis of the histograms of quantized DCT coefficients. Experimental results show the effectiveness of our approach in removing the artifacts of both double and triple compression, while maintaining a good quality of the image.
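The detectors this counter-forensic technique targets operate on histograms of quantized DCT coefficients. Below is a minimal, hedged sketch of how such a histogram can be computed for a single DCT frequency: the 8x8 blocking and level shift follow JPEG conventions, but the quantization step q_step and the chosen frequency are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.fft import dctn

def quantized_dct_histogram(img: np.ndarray, freq=(0, 1), q_step=10):
    """Histogram of one quantized DCT frequency over all 8x8 blocks.
    Double JPEG compression leaves periodic gaps and peaks in exactly
    these histograms, which is what the targeted detectors exploit."""
    h, w = (d - d % 8 for d in img.shape)  # crop to whole 8x8 blocks
    coeffs = []
    for y in range(0, h, 8):
        for x in range(0, w, 8):
            # JPEG level-shifts 8-bit samples by 128 before the DCT.
            block = dctn(img[y:y+8, x:x+8].astype(float) - 128.0,
                         norm="ortho")
            coeffs.append(round(block[freq] / q_step))
    vals, counts = np.unique(coeffs, return_counts=True)
    return dict(zip(vals.tolist(), counts.tolist()))
```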

20 citations

Journal ArticleDOI
TL;DR: Two approaches to adaptive JPEG-based compression of color images inside digital cameras are presented, and it is demonstrated that the second approach provides a more accurate estimate of degrading-factor characteristics and thus a larger compression-ratio increase compared with the super-high-quality (SHQ) mode used in consumer digital cameras.
Abstract: The paper presents two approaches to adaptive JPEG-based compression of color images inside digital cameras. Compression for both approaches, although lossy, is organized in such a manner that the introduced distortions are not visible. This is done by taking into account the quality of each original image before it is subjected to lossy compression. Noise characteristics and blur are assumed to be the main factors determining the visual quality of original images. They are estimated in a fast and blind (automatic) manner for images in RAW format (first approach) and in Bitmap (second approach). The dominant distorting factor, which can be either noise or blur, is determined. Then, the scaling factor (SF) of the JPEG quantization table is adaptively adjusted to preserve valuable information in the compressed image, taking the estimated noise and blur influence into account. The advantages and drawbacks of the proposed approaches are discussed. Both approaches are intensively tested on real-life images. It is demonstrated that the second approach provides a more accurate estimate of degrading-factor characteristics and thus a larger compression-ratio (CR) increase compared with the super-high-quality (SHQ) mode used in consumer digital cameras. The first approach relies mainly on predicting the noise and blur characteristics that will be observed in Bitmap images after the set of nonlinear operations applied to RAW data in the image processing chain. It is simpler and requires less memory, but proved slightly less beneficial. Both approaches are shown to provide, on average, a more-than-twofold increase in CR compared with SHQ mode, without introducing visible distortions relative to SHQ-compressed images. This is confirmed by analysis with modern visual quality metrics able to adequately characterize compressed-image quality.
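The scaling factor (SF) mentioned above multiplies the entries of the JPEG quantization table. As an illustrative sketch only (the paper's adaptive SF adjustment rule is not reproduced here), the following Python snippet scales the standard Annex K luminance table and clips it to the range JPEG allows for 8-bit tables:

```python
import numpy as np

# Standard JPEG luminance quantization table (Annex K of the JPEG spec).
BASE_LUMA_QT = np.array([
    [16, 11, 10, 16,  24,  40,  51,  61],
    [12, 12, 14, 19,  26,  58,  60,  55],
    [14, 13, 16, 24,  40,  57,  69,  56],
    [14, 17, 22, 29,  51,  87,  80,  62],
    [18, 22, 37, 56,  68, 109, 103,  77],
    [24, 35, 55, 64,  81, 104, 113,  92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103,  99],
])

def scaled_qt(scaling_factor: float) -> np.ndarray:
    """Scale the base table by a multiplicative factor and clip to the
    1..255 range JPEG allows for 8-bit tables. A larger factor means
    coarser quantization, hence higher compression and more distortion;
    the paper picks the factor adaptively from noise/blur estimates."""
    qt = np.round(BASE_LUMA_QT * scaling_factor)
    return np.clip(qt, 1, 255).astype(np.uint8)
```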

20 citations

Journal ArticleDOI
TL;DR: A new tool for forensic recovery of single- and multi-fragment JPEG/JFIF data files is compared with the well-known state-of-the-art tool Adroit Photo Forensics and shown to significantly outperform it.
Abstract: In this paper, we present a new tool for forensic recovery of single- and multi-fragment JPEG/JFIF data files. First, we discuss the basic design and the technical methods composing our proposed data carving algorithm. Next, we compare the performance of our method with the well-known state-of-the-art tool Adroit Photo Forensics (APF). This comparison covers both the carving results and the obtained data processing speed, and is evaluated on several well-known reference data sets. Notably, we specifically focus on the fundamental recovery and fragment-matching performance of the tools by forcing them to use various assumed cluster sizes. We show that on all accounts our new tool can significantly outperform APF. This improvement in data processing speed and carving results can be mostly attributed to novel methods to iterate over and reduce the data search space, and to a novel parameterless method to determine the end of a fragment based on the pixel data. Finally, we discuss several options for future research.
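For orientation, contiguous JPEG/JFIF files can be located in a raw byte stream by their start-of-image (SOI) and end-of-image (EOI) markers. The sketch below is a deliberately simplified illustration, not the paper's carver: it ignores fragmentation entirely, which is precisely the hard case the tool's pixel-based fragment matching addresses.

```python
def carve_contiguous_jpegs(data: bytes) -> list[bytes]:
    """Scan a raw byte stream for contiguous JPEG files, delimited by
    the SOI marker (FF D8 FF) and the EOI marker (FF D9). This only
    recovers unfragmented files; fragment matching as in the paper
    needs decoding and pixel-level continuity checks on top of this."""
    SOI, EOI = b"\xff\xd8\xff", b"\xff\xd9"
    carved, pos = [], 0
    while (start := data.find(SOI, pos)) != -1:
        end = data.find(EOI, start + len(SOI))
        if end == -1:
            break  # truncated file: SOI without a matching EOI
        carved.append(data[start:end + len(EOI)])
        pos = end + len(EOI)
    return carved
```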

20 citations

Proceedings ArticleDOI
21 Sep 2013
TL;DR: Experimental results show that the JPEG image compression-encryption algorithm meets the requirements of actual engineering applications and is suitable for secure communication.
Abstract: With the maturity of communication technology, the digital image has become an important carrier of information. However, digital images also face the huge pressure of mass data storage and transmission, and they may be attacked or falsified during transmission. Thus, it is necessary to focus attention on image compression-encryption technology. First, this article discusses the necessity and classification of image compression technology and then analyzes the JPEG image compression algorithm in depth. Moreover, we focus on the JPEG encoding algorithm and give a detailed description of the JPEG encoder and decoder control processes. We select an original image and complete a MATLAB simulation based on the JPEG algorithm. Third, by using a DSP host processor, the hardware implementation of image acquisition and compression can be completed easily. Finally, the article selects a well-compressed image to complete the image encryption process. Experimental results show that the JPEG compression-encryption algorithm meets the requirements of actual engineering applications and is suitable for secure communication.
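The pipeline described here compresses first and encrypts second. A minimal Python sketch of that ordering is given below; the use of Pillow for JPEG encoding and the cryptography package's Fernet for encryption are illustrative assumptions, unrelated to the MATLAB/DSP implementation in the paper.

```python
import io
from PIL import Image
from cryptography.fernet import Fernet

def compress_then_encrypt(img: Image.Image, key: bytes, quality: int = 75) -> bytes:
    """Compress-then-encrypt: JPEG first (so the codec still sees the
    redundant pixel data), then encrypt the compressed bitstream.
    Encrypting before compression would destroy the redundancy JPEG
    relies on and yield almost no size reduction."""
    buf = io.BytesIO()
    img.convert("RGB").save(buf, format="JPEG", quality=quality)
    return Fernet(key).encrypt(buf.getvalue())

def decrypt_then_decompress(token: bytes, key: bytes) -> Image.Image:
    """Inverse path: decrypt, then decode the JPEG bitstream."""
    return Image.open(io.BytesIO(Fernet(key).decrypt(token)))

if __name__ == "__main__":
    key = Fernet.generate_key()
    img = Image.new("RGB", (64, 64), color=(200, 30, 30))
    token = compress_then_encrypt(img, key)
    restored = decrypt_then_decompress(token, key)
```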

20 citations


Network Information

Related Topics (5)
Image segmentation: 79.6K papers, 1.8M citations, 82% related
Feature (computer vision): 128.2K papers, 1.7M citations, 82% related
Feature extraction: 111.8K papers, 2.1M citations, 82% related
Image processing: 229.9K papers, 3.5M citations, 80% related
Convolutional neural network: 74.7K papers, 2M citations, 79% related
Performance Metrics

Number of papers in the topic in previous years:

Year    Papers
2023    21
2022    40
2021    5
2020    2
2019    8
2018    15