Topic
Lossless JPEG
About: Lossless JPEG is a research topic. Over its lifetime, 2415 publications have been published within this topic, receiving 51110 citations. The topic is also known as: Lossless JPEG and .jls.
Papers published on a yearly basis
Papers
04 Apr 2017
TL;DR: This work investigates the use of a similar approach for the reversible pipeline of the JPEG2000 standard to allow the creation of a scalable codestream that can provide both visually lossless and numerically lossless representations from a single codestream.
Abstract: Image compression systems that exploit the properties of the Human Visual System (HVS) have been studied extensively over the past few decades. For the JPEG2000 image compression standard, several methods to optimize perceptual quality have been proposed. In 2013, Han et al. proposed a visually lossless compression approach based on the irreversible pipeline defined in the JPEG2000 standard. In this approach, visibility thresholds were measured using psychovisual experiments. These thresholds were then incorporated in a JPEG2000 encoder to ensure that quantization distortions remain below visible levels in the compressed codestreams. In this work, we investigate the use of a similar approach for the reversible pipeline of the JPEG2000 standard. Our motivation is to allow the creation of a scalable codestream that can provide both visually lossless and numerically lossless representations from a single codestream. By comparing the difference in compression performance between the reversible and irreversible pipelines, we also quantify the overhead associated with the reversible pipeline for visually lossless compression.
1 citation
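The reversible JPEG2000 pipeline discussed in the abstract above is built on the integer LeGall 5/3 lifting transform, which maps integers to integers and inverts exactly. A minimal 1-D sketch is below; it assumes an even-length signal and uses periodic boundary extension for brevity, whereas the standard itself uses symmetric extension:

```python
import numpy as np

def fwd_53(x):
    """One level of the reversible LeGall 5/3 lifting transform (1-D).

    Simplified sketch: assumes an even-length signal and periodic
    extension; JPEG2000 proper uses symmetric boundary extension.
    """
    x = np.asarray(x, dtype=np.int64)
    s, d = x[0::2].copy(), x[1::2].copy()
    d -= (s + np.roll(s, -1)) >> 1      # predict: detail from even neighbours
    s += (np.roll(d, 1) + d + 2) >> 2   # update: correct the approximation
    return s, d

def inv_53(s, d):
    """Exact inverse: undo the lifting steps in reverse order."""
    s = s - ((np.roll(d, 1) + d + 2) >> 2)
    d = d + ((s + np.roll(s, -1)) >> 1)
    x = np.empty(s.size + d.size, dtype=np.int64)
    x[0::2], x[1::2] = s, d
    return x
```

Because every lifting step is integer-valued and exactly invertible, the same codestream can be truncated for lossy (and, with visibility thresholds, visually lossless) decoding or decoded fully for numerically lossless reconstruction.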
07 Dec 2001
TL;DR: A scan-based implementation of TCQ has been realized and tested, with a very small performance loss as compared with the full image (frame-based) version.
Abstract: JPEG 2000 Part 2 (Extensions) contains a number of technologies that are of potential interest in remote sensing applications. These include arbitrary wavelet transforms, techniques to limit boundary artifacts in tiles, multiple component transforms, and trellis-coded quantization (TCQ). We are investigating the addition of these features to the low-memory (scan-based) implementation of JPEG 2000 Part 1. A scan-based implementation of TCQ has been realized and tested, with a very small performance loss as compared with the full image (frame-based) version. A proposed amendment to JPEG 2000 Part 2 will effect the syntax changes required to make scan-based TCQ compatible with the standard.
1 citation
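Trellis-coded quantization, as mentioned above, pairs a scalar codebook partitioned into subsets with a Viterbi search over a small trellis. The following is a textbook-style sketch using a 4-state Ungerboeck trellis and subsets D_j = {Δ(4k + j)}; it illustrates the basic frame-based idea only, not the scan-based Part 2 implementation the paper describes:

```python
# 4-state trellis: TRANS[state] = [(next_state, subset), (next_state, subset)]
TRANS = [[(0, 0), (2, 2)],
         [(0, 2), (2, 0)],
         [(1, 1), (3, 3)],
         [(1, 3), (3, 1)]]

def nearest(x, subset, step):
    """Nearest codeword in subset D_j = {step * (4k + j) : k integer}."""
    k = round((x / step - subset) / 4)
    return step * (4 * k + subset)

def tcq_quantize(samples, step=1.0):
    """Viterbi search over the trellis; returns the reconstructed sequence."""
    INF = float('inf')
    cost = [0.0, INF, INF, INF]          # encoding starts in state 0
    back = []                            # per step: (prev_state, codeword) per state
    for x in samples:
        new_cost, new_back = [INF] * 4, [None] * 4
        for s in range(4):
            if cost[s] == INF:
                continue
            for nxt, sub in TRANS[s]:
                q = nearest(x, sub, step)
                c = cost[s] + (x - q) ** 2
                if c < new_cost[nxt]:
                    new_cost[nxt], new_back[nxt] = c, (s, q)
        cost = new_cost
        back.append(new_back)
    # trace back from the cheapest final state
    s = min(range(4), key=lambda i: cost[i])
    out = []
    for pointers in reversed(back):
        prev, q = pointers[s]
        out.append(q)
        s = prev
    return out[::-1]
```

At every state the two outgoing branches together cover one coset of spacing 2Δ, which is how TCQ approaches the performance of a quantizer with twice the rate; the full-search structure here is exactly what a scan-based (low-memory) variant has to break into stripes.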
16 May 2009
TL;DR: A new shuffle algorithm is proposed to rearrange JPEG data so that universal lossless algorithms can compress it; experiments on a typical example show a preferable compression ratio at little time cost.
Abstract: Universal lossless data-compression algorithms are efficient on text but inefficient on JPEG data. Exploiting the characteristics of JPEG, a new shuffle algorithm is proposed to rearrange the JPEG data, after which universal lossless algorithms are used to compress the rearranged data. Finally, an experiment on a typical example was carried out and obtained a preferable compression ratio at little time cost.
1 citation
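The paper does not spell out its shuffle, but the general idea — losslessly rearranging bytes so that a universal compressor sees more local redundancy — can be illustrated with a Blosc-style byte shuffle followed by zlib. The stride-4 grouping here is our illustrative choice, not the paper's algorithm:

```python
import zlib

def shuffle(data: bytes, stride: int = 4) -> bytes:
    """Gather every stride-th byte together; this can expose redundancy
    that a byte-oriented compressor such as zlib then exploits."""
    return b''.join(data[i::stride] for i in range(stride))

def unshuffle(data: bytes, stride: int = 4) -> bytes:
    """Exact inverse of shuffle(): scatter the groups back into place."""
    n, r = divmod(len(data), stride)
    out, pos = bytearray(len(data)), 0
    for i in range(stride):
        size = n + (1 if i < r else 0)   # first r groups are one byte longer
        out[i::stride] = data[pos:pos + size]
        pos += size
    return bytes(out)

# usage: compress the shuffled stream, decompress and unshuffle to recover
payload = b'\x00\x01' * 500
packed = zlib.compress(shuffle(payload))
restored = unshuffle(zlib.decompress(packed))
```

Since the shuffle is a pure permutation, the pipeline stays lossless end to end; whether it actually improves the ratio depends on how much structure the rearrangement exposes in the particular JPEG stream.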
01 Nov 2011
TL;DR: A new Reduced Reference model for assessing the quality of JPEG and JPEG2000 images is proposed; it incorporates a simple design to model the distortions in the image without using a complex Human Visual System (HVS) model.
Abstract: We propose a new Reduced Reference model for assessing the quality of JPEG and JPEG2000 images. We utilize the Haar Wavelet Decomposition as a tool to model the image information along with the distortions present in JPEG or JPEG2000 images. One strength of our method is the ability to independently assess the quality of JPEG and JPEG2000 images without using any additional discrimination method. Our method also incorporates a simple design to model the distortions in the image without using a complex Human Visual System (HVS) model. Despite its simple design, our method has demonstrated an accurate and reliable quality metric. Experimental results using several categories of JPEG and JPEG2000 images from the LIVE Image Database Release 2 are presented to verify our metric. The results of our experiments show that the method achieves high correlation with the subjective data in most of the categories.
1 citation
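The one-level 2-D Haar decomposition the method above builds on can be sketched in a few lines. Using subband statistics (e.g., energies) as the compact reduced-reference features is our illustration of the general approach, not the paper's exact model:

```python
import numpy as np

def haar2d(img):
    """One level of the 2-D Haar wavelet transform.

    Returns (LL, LH, HL, HH) subbands; assumes even height and width.
    """
    a = np.asarray(img, dtype=np.float64)
    # transform along rows: pairwise averages (lo) and differences (hi)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2
    hi = (a[:, 0::2] - a[:, 1::2]) / 2
    # transform along columns
    ll = (lo[0::2, :] + lo[1::2, :]) / 2
    lh = (lo[0::2, :] - lo[1::2, :]) / 2
    hl = (hi[0::2, :] + hi[1::2, :]) / 2
    hh = (hi[0::2, :] - hi[1::2, :]) / 2
    return ll, lh, hl, hh
```

In a reduced-reference setting, only a handful of numbers derived from the subbands (such as the mean energy of LH, HL and HH) would travel alongside the image; the receiver recomputes them on the distorted image and compares, so blocking (JPEG) and blurring/ringing (JPEG2000) both show up as shifts in the high-frequency statistics.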
TL;DR: Experimental results demonstrate that the proposed algorithm performs well in tamper forensics of JPEG images spliced after re-sampling operations, including scaling and rotation.
Abstract: In order not to leave any visual evidence of tampering when a digital image is tampered by splicing, the tampered region may be subjected to re-sampling operations such as scaling and rotation. Targeting this phenomenon, this paper proposes a new JPEG image splicing forgery detection algorithm based on re-sampling detection. First, the second derivative of the local JPEG image region is calculated and its Radon transform is taken; the auto-covariance is then computed and a fast Fourier transform is applied to it. This processing eliminates the impact of JPEG compression in the frequency domain. Second, judging whether the local region has undergone re-sampling serves as evidence of whether the detected JPEG image has been tampered with by splicing. Experimental results demonstrate that the proposed algorithm performs well in tamper forensics of JPEG images spliced after re-sampling operations, including scaling and rotation.
1 citation
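A simplified version of the detection chain above can be sketched with NumPy. For brevity a single 0-degree projection stands in for the full Radon transform, so this only probes one orientation; the second difference approximates the second derivative, and periodic peaks in the resulting spectrum hint at interpolation from re-sampling:

```python
import numpy as np

def resample_spectrum(block):
    """Second difference -> 0-degree projection -> auto-covariance -> |FFT|.

    Simplified sketch of a resampling detector: the real method projects
    at many angles (Radon transform) to catch rotation as well.
    """
    b = np.asarray(block, dtype=np.float64)
    d2 = np.abs(np.diff(b, n=2, axis=0))           # 2nd difference down columns
    proj = d2.sum(axis=1)                          # project onto the vertical axis
    proj -= proj.mean()                            # remove DC before correlating
    acov = np.correlate(proj, proj, mode='full')[len(proj) - 1:]
    return np.abs(np.fft.rfft(acov))               # peaks reveal periodicity
```

Interpolated (re-sampled) regions have second derivatives that are periodically correlated, so their spectrum shows isolated peaks, while unresampled camera content tends to produce a flat spectrum; comparing the two per region is what flags a spliced patch.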