
Lossless JPEG

About: Lossless JPEG is a research topic. Over the lifetime, 2415 publications have been published within this topic, receiving 51110 citations. The topic is also known as .jls.


Papers
Journal ArticleDOI
TL;DR: This work presents a rate-distortion (RD) optimized, JPEG-compliant progressive encoder that produces a sequence of scans ordered by decreasing importance and can achieve precise rate/distortion control.
Abstract: Among the many modes of operation allowed in the current JPEG standard, the sequential and progressive modes are the most widely used. While the sequential JPEG mode yields essentially the same level of compression performance for most encoder implementations, the performance of progressive JPEG depends highly upon the designed encoder structure, owing to the flexibility the standard leaves open in designing progressive JPEG encoders. In this work, a rate-distortion (RD) optimized, JPEG-compliant progressive encoder is presented that produces a sequence of scans ordered by decreasing importance. Our encoder outperforms an optimized sequential JPEG encoder in terms of compression efficiency, substantially so at both low and high bit rates. Moreover, unlike existing JPEG-compliant encoders, our encoder can achieve precise rate/distortion control. Substantially better compression performance and precise rate control, provided by our progressive JPEG-compliant encoding algorithm, are two highly desired features currently sought for the emerging JPEG-2000 standard.
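The scan-ordering idea in this abstract lends itself to a small illustrative sketch: emit candidate scans greedily by estimated distortion reduction per bit, then truncate at a bit budget to get precise rate control. The Python below is a minimal sketch under assumed per-scan rate/distortion estimates, not the authors' optimization procedure; the Scan fields and all numbers are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Scan:
        name: str           # e.g. "DC, high bits" or "AC 1-5, refinement"
        rate_bits: int      # estimated cost of emitting this scan
        d_reduction: float  # estimated drop in MSE once this scan is decoded

    def order_scans(scans):
        """Greedy RD ordering: most distortion reduction per bit first."""
        return sorted(scans, key=lambda s: s.d_reduction / s.rate_bits,
                      reverse=True)

    def truncate_to_budget(ordered, budget_bits):
        """Rate control: keep whole scans while they fit the bit budget."""
        kept, used = [], 0
        for s in ordered:
            if used + s.rate_bits > budget_bits:
                break
            kept.append(s)
            used += s.rate_bits
        return kept, used

    # Hypothetical candidate scans (spectral-selection /
    # successive-approximation style), with made-up estimates.
    candidates = [
        Scan("DC, high bits", 9_000, 410.0),
        Scan("AC 1-5, high bits", 22_000, 260.0),
        Scan("AC 6-63, high bits", 31_000, 90.0),
        Scan("refinement bit 0", 40_000, 35.0),
    ]
    ordered = order_scans(candidates)
    kept, used = truncate_to_budget(ordered, budget_bits=60_000)
    print([s.name for s in kept], used)

Because whole scans are kept or dropped, the truncation point is known exactly, which is what makes this kind of rate control precise.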

23 citations

Proceedings Article
03 Oct 2012
TL;DR: The survey summarizes the major image compression methods spanning lossy and lossless image compression techniques and explains how the JPEG and JPEG2000 image compression techniques are distinct from each other.
Abstract: Research in the field of image compression is driven by the ever-increasing bandwidth requirements for transmission of images in computer, mobile and internet environments. In this context, the survey summarizes the major image compression methods, spanning lossy and lossless techniques, and explains how the JPEG and JPEG2000 image compression techniques are distinct from each other. Further, the paper concludes that research possibilities still exist in this field to explore more efficient image compression. General terms: image compression, Huffman coding, low bit-rate transmission.
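Since the survey's general terms single out Huffman coding, a minimal sketch of the standard textbook Huffman construction (not code from the paper) may help make the entropy-coding step concrete:

    import heapq
    from collections import Counter

    def huffman_codes(data: bytes) -> dict:
        """Build a Huffman code table for the symbols in `data`."""
        freq = Counter(data)
        if len(freq) == 1:                  # degenerate single-symbol input
            return {next(iter(freq)): "0"}
        # Heap entries: (frequency, tie-breaker, {symbol: code-so-far});
        # the tie-breaker keeps the dicts from ever being compared.
        heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            f1, _, c1 = heapq.heappop(heap)
            f2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in c1.items()}
            merged.update({s: "1" + c for s, c in c2.items()})
            heapq.heappush(heap, (f1 + f2, tie, merged))
            tie += 1
        return heap[0][2]

    table = huffman_codes(b"abracadabra")
    print(sorted(table.items()))  # frequent symbols get shorter codes

Frequent symbols receive shorter codes, which is the property that low bit-rate transmission schemes rely on.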

23 citations

Proceedings ArticleDOI
05 Apr 2004
TL;DR: A new method of image steganography is presented that is extremely robust against JPEG compression while allowing error-free information extraction, based on a 2D lossless wavelet transform and convolutional error-correction coding.
Abstract: Steganography is the art of hiding secret media within other media. One of the major challenges in steganography is robustness, since the stego-signal needs to survive multiple kinds of data processing. Low-pass filtering, for example JPEG compression, is a common attack against the stego-signal in image-based steganography. We present a new method of image steganography that is extremely robust against JPEG compression while allowing error-free information extraction. The method is based on a 2D lossless wavelet transform and convolutional error-correction coding. Experimental results show that hidden information can be retrieved with zero bit error rate even when the stego-image has undergone maximum JPEG compression.
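As a hedged illustration of what a "2D lossless wavelet transform" provides, below is a lifting-based integer Haar (S-transform) pair that reconstructs integer pixels exactly; it stands in for whichever integer wavelet the authors actually used.

    import numpy as np

    def haar_int_fwd(x):
        """1D integer Haar (S-transform) via lifting; x has even length."""
        a, b = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
        d = b - a             # detail (high-pass)
        s = a + (d >> 1)      # approximation = floor mean (low-pass)
        return s, d

    def haar_int_inv(s, d):
        """Exact inverse: undo the lifting steps in reverse order."""
        a = s - (d >> 1)
        b = a + d
        x = np.empty(2 * len(s), dtype=np.int64)
        x[0::2], x[1::2] = a, b
        return x

    def haar2d_fwd(img):
        """One 2D decomposition level: transform rows, then columns."""
        rows = np.array([np.concatenate(haar_int_fwd(r)) for r in img])
        cols = np.array([np.concatenate(haar_int_fwd(c)) for c in rows.T]).T
        return cols

    img = np.random.randint(0, 256, (8, 8))
    coeffs = haar2d_fwd(img)
    # Invertibility check for one row:
    s, d = haar_int_fwd(img[0])
    assert np.array_equal(haar_int_inv(s, d), img[0].astype(np.int64))

Because every lifting step rounds with an integer shift and the inverse undoes the same steps in reverse, the round trip is exact, which is the property that makes error-free extraction of embedded data possible.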

23 citations

Proceedings ArticleDOI
20 Nov 2014
TL;DR: It is demonstrated that profiles A and B lead to similar saturation of quality at higher bit rates, while profile C exhibits no saturation; profiles B and C also appear to be more dependent on the TMOs used for the base layer than profile A.
Abstract: The upcoming JPEG XT is under development for High Dynamic Range (HDR) image compression. This standard encodes a Low Dynamic Range (LDR) version of the HDR image, generated by a Tone-Mapping Operator (TMO), using conventional JPEG coding as a base layer, and encodes the extra HDR information in a residual layer. This paper studies the performance of the three profiles of JPEG XT (referred to as profiles A, B and C) using a test set of six HDR images. Four TMO techniques were used for the base layer image generation to assess the influence of the TMOs on the performance of JPEG XT profiles. Then, the HDR images were coded with different quality levels for the base layer and for the residual layer. The performance of each profile was evaluated using Signal to Noise Ratio (SNR), Feature SIMilarity Index (FSIM), Root Mean Square Error (RMSE), and CIEDE2000 color difference objective metrics. The evaluation results demonstrate that profiles A and B lead to similar saturation of quality at higher bit rates, while profile C exhibits no saturation. Profiles B and C appear to be more dependent on the TMOs used for the base layer than profile A.
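The two-layer structure described above can be sketched as follows, with a global Reinhard operator as the TMO and a log-ratio residual; both are illustrative stand-ins rather than the normative JPEG XT profile definitions, and Pillow is assumed for the JPEG round trip.

    import io
    import numpy as np
    from PIL import Image  # pip install pillow

    def reinhard_tmo(hdr_lum):
        """Global Reinhard tone mapping L/(1+L), scaled to 8 bits."""
        ldr = hdr_lum / (1.0 + hdr_lum)
        return np.clip(ldr * 255.0, 0, 255).astype(np.uint8)

    def encode_two_layer(hdr_lum, base_quality=90):
        # Base layer: conventional JPEG of the tone-mapped image.
        base = Image.fromarray(reinhard_tmo(hdr_lum), mode="L")
        buf = io.BytesIO()
        base.save(buf, format="JPEG", quality=base_quality)
        # Residual layer: log ratio between HDR and the *decoded* base,
        # so reconstruction uses exactly what the decoder will see.
        decoded = np.asarray(Image.open(io.BytesIO(buf.getvalue())),
                             dtype=np.float64)
        eps = 1e-6
        residual = np.log((hdr_lum + eps) / (decoded / 255.0 + eps))
        return buf.getvalue(), residual  # residual would itself be coded

    hdr = np.random.rand(64, 64) * 100.0   # synthetic HDR luminance
    base_bytes, residual = encode_two_layer(hdr)
    print(len(base_bytes), residual.min(), residual.max())

Coding the residual against the decoded base, rather than the pristine tone-mapped image, mirrors the layered design: the decoder rebuilds its HDR estimate from exactly the data it receives.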

23 citations

Journal ArticleDOI
TL;DR: The proposed FEREC algorithm is shown to be almost twice as fast as EREC in encoding the data, and its error-resilience capability is also observed to be significantly better.
Abstract: There has been a surge of research in image and video compression for transmission over noisy channels, and channel-matched source quantizer design has gained prominence. Further, the presence of variable-length codes in compression standards such as JPEG and MPEG has made the problem more interesting. Error-resilient entropy coding (EREC) has emerged as a new and effective method to combat catastrophic loss in the received signal due to burst and random errors. We propose a new channel-matched adaptive quantizer for JPEG image compression. A slow, frequency-nonselective Rayleigh fading channel model is assumed. The optimal quantizer that matches the human visibility threshold and the channel bit-error rate is derived. Further, a new fast error-resilient entropy code (FEREC) that exploits the statistics of the JPEG-compressed data is proposed. The proposed FEREC algorithm is shown to be almost twice as fast as EREC in encoding the data, and its error-resilience capability is also observed to be significantly better. On average, a 5% decrease in the number of significantly corrupted received image blocks is observed with FEREC. Up to a 2-dB improvement in the peak signal-to-noise ratio of the received image is also achieved.
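EREC's core idea, repacking variable-length coded blocks into equal-size slots so every block has a known start position, can be sketched as below; the offset schedule here is a simplified placeholder, not the paper's statistics-driven FEREC search.

    def erec_pack(blocks, slot_len):
        """Simplified EREC: block i starts in slot i; overflow bits are
        placed into the spare space of other slots found by offset search."""
        n = len(blocks)
        slots = [""] * n
        leftover = list(blocks)
        # Stage 0: fill each slot with (a prefix of) its own block.
        for i, b in enumerate(blocks):
            take = min(len(b), slot_len)
            slots[i] = b[:take]
            leftover[i] = b[take:]
        # Stages 1..n-1: move remaining bits into slots with spare room.
        for off in range(1, n):
            for i in range(n):
                j = (i + off) % n
                space = slot_len - len(slots[j])
                if leftover[i] and space > 0:
                    moved = leftover[i][:space]
                    leftover[i] = leftover[i][space:]
                    slots[j] += moved
        return slots

    blocks = ["10110", "0", "111", "0101010"]  # toy variable-length codes
    print(erec_pack(blocks, slot_len=4))

Because block i always starts at slot i, a bit error in one block cannot shift the start positions of the others, which is the source of EREC's resilience to burst and random errors.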

23 citations


Network Information
Related Topics (5)
Image segmentation: 79.6K papers, 1.8M citations (82% related)
Feature (computer vision): 128.2K papers, 1.7M citations (82% related)
Feature extraction: 111.8K papers, 2.1M citations (82% related)
Image processing: 229.9K papers, 3.5M citations (80% related)
Convolutional neural network: 74.7K papers, 2M citations (79% related)
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    21
2022    40
2021    5
2020    2
2019    8
2018    15