Topic

Lossless JPEG

About: Lossless JPEG is a research topic. Over the lifetime, 2415 publications have been published within this topic receiving 51110 citations. The topic is also known as: Lossless JPEG & .jls.


Papers
Journal ArticleDOI
10 Jun 2013
TL;DR: A new lossless adaptive prediction based algorithm for continuous-tone images, based on the prediction method used in Context-based Adaptive Lossless Image Coding (CALIC), that reduces spatial redundancy in an image.
Abstract: Images are an important part of today's digital world. However, due to the large quantity of data needed to represent modern imagery, the storage of such data can be expensive. Thus, work on efficient image storage (image compression) has the potential to reduce storage costs and enable new applications. Lossless image compression has uses in medical, scientific, and professional video processing applications. Compression is a process in which a given amount of data is reduced to a smaller size. Storing and sending images in their original form can present a problem in terms of storage space and transmission speed; compression is efficient for storage and transmission purposes. In this paper we describe a new lossless adaptive prediction based algorithm for continuous-tone images, in which spatial redundancy exists. Our approach is to develop a new backward-adaptive prediction technique to reduce this spatial redundancy. The new prediction technique, known as the Modified Gradient Adjusted Predictor (MGAP), is based on the prediction method used in Context-based Adaptive Lossless Image Coding (CALIC). An adaptive selection method, which selects the predictor in a slope bin in terms of minimum entropy, improves the compression performance.

2 citations
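The abstract above says MGAP builds on CALIC's prediction step. For context, CALIC's original Gradient Adjusted Predictor (GAP) can be sketched as follows; the paper's modifications (slope bins, minimum-entropy predictor selection) are not specified in the abstract and are not shown here. Neighbour names (W = left, N = above, NE = above-right, and so on) follow the usual causal-template convention.

```python
def gap_predict(W, N, WW, NN, NE, NW, NNE):
    """Gradient Adjusted Predictor (GAP) from CALIC.

    Predicts the current pixel from its causal neighbours.
    This is the baseline predictor only; MGAP's slope-bin
    selection is not reproduced here.
    """
    dh = abs(W - WW) + abs(N - NW) + abs(N - NE)    # horizontal gradient estimate
    dv = abs(W - NW) + abs(N - NN) + abs(NE - NNE)  # vertical gradient estimate

    if dv - dh > 80:    # sharp horizontal edge -> predict from the left
        return W
    if dh - dv > 80:    # sharp vertical edge -> predict from above
        return N

    pred = (W + N) / 2 + (NE - NW) / 4  # smooth-region prediction
    if dv - dh > 32:
        pred = (pred + W) / 2
    elif dv - dh > 8:
        pred = (3 * pred + W) / 4
    elif dh - dv > 32:
        pred = (pred + N) / 2
    elif dh - dv > 8:
        pred = (3 * pred + N) / 4
    return pred
```

In a flat region all neighbours agree and the predictor returns that common value; across a sharp edge it falls back to the neighbour on the same side of the edge, which is what keeps the prediction residual (and hence the entropy-coded output) small.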

Proceedings ArticleDOI
27 Mar 2009
TL;DR: The proposed method uses the number of zero bitplanes, obtained by parsing the header information without decoding the JPEG 2000 codestream, to achieve joint identification and retrieval of JPEG 2000 images.
Abstract: A joint bitstream-level identification and retrieval method for JPEG 2000 images is proposed in this paper. Although the main purpose is to identify a JPEG 2000 image, the proposed method simultaneously retrieves JPEG 2000 images for a given query by computing a similarity. The proposed method uses the number of zero bitplanes, which is obtained by parsing the header information without decoding the JPEG 2000 codestream, to achieve joint identification and retrieval of JPEG 2000 images. Experimental results on retrieving JPEG 2000 images are provided to evaluate the performance of the proposed method. The main advantage is not only accuracy but also fast processing: since there is no need to decode JPEG 2000 images, the average time per retrieval operation is less than 1 ms and is independent of image size. Moreover, the proposed method can be combined with the scalability of the JPEG 2000 codestream to balance the trade-off between accuracy and speed.

2 citations
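The feature described above is the per-codeblock count of zero bitplanes read from JPEG 2000 packet headers; the header parsing itself is codec-specific and omitted here. A minimal sketch of the matching step, assuming the feature vectors have already been extracted, and using an L1-distance-based similarity as a stand-in (the abstract does not give the exact similarity measure):

```python
def similarity(feat_a, feat_b):
    """Similarity between two zero-bitplane feature vectors
    (one count per codeblock), mapped into (0, 1] via L1 distance.
    The actual measure used in the paper is an assumption here."""
    assert len(feat_a) == len(feat_b)
    l1 = sum(abs(a - b) for a, b in zip(feat_a, feat_b))
    return 1.0 / (1.0 + l1)

def retrieve(query_feat, database):
    """Rank database images (name -> feature vector) by similarity
    to the query; identification is then 'take the top hit'."""
    return sorted(database,
                  key=lambda name: similarity(query_feat, database[name]),
                  reverse=True)
```

Because the features come straight from header fields, the whole pipeline never touches the entropy-coded data, which is why retrieval time stays independent of image size.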

Proceedings ArticleDOI
14 Nov 1996
TL;DR: A novel image-adaptive encoding scheme for the baseline JPEG standard that maximizes decoded image quality without compromising compatibility with current JPEG decoders, and that may be applied to other systems that use run-length encoding, including intra-frame MPEG and subband or wavelet coding.
Abstract: We introduce a novel image-adaptive encoding scheme for the baseline JPEG standard that maximizes the decoded image quality without compromising compatibility with current JPEG decoders. Our algorithm jointly optimizes quantizer selection, coefficient 'thresholding', and entropy coding within a rate-distortion (R-D) framework. It unifies two previous approaches to image-adaptive JPEG encoding: R-D optimized quantizer selection by Wu and Gersho, and R-D optimal coefficient thresholding by Ramchandran and Vetterli. By formulating an algorithm which optimizes these two operations jointly, we have obtained performance that is the best in the reported literature for JPEG-compatible coding. In fact, the performance of this JPEG coder is comparable to that of more complex 'state-of-the-art' image coding schemes: e.g., for the benchmark 512 by 512 'Lenna' image at a coding rate of 1 bit per pixel, our algorithm achieves a peak signal-to-noise ratio of 39.6 dB, which represents a gain of 1.7 dB over JPEG using the example Q-matrix with a customized Huffman entropy coder, and even slightly exceeds the published performance of Shapiro's celebrated embedded zerotree wavelet coding scheme. Furthermore, with the choice of appropriate visually-based error metrics, noticeable subjective improvement has been achieved as well. The reason for our algorithm's superior performance can be attributed to its conceptual equivalence to the application of entropy-constrained vector quantization design principles to a JPEG-compatible framework. Furthermore, our algorithm may be applied to other systems that use run-length encoding, including intra-frame MPEG and subband or wavelet coding. © 1996 SPIE, The International Society for Optical Engineering.

2 citations
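The R-D framework above makes every coding choice by comparing Lagrangian costs J = D + λ·R. As a toy sketch of just the coefficient-thresholding decision for one DCT coefficient, under the simplifying assumption that zeroing a coefficient costs essentially no extra rate (it merely extends a zero run in the entropy coder) while keeping it costs a known number of bits; the paper's actual algorithm optimizes quantizers, thresholds, and Huffman tables jointly, which this sketch does not attempt:

```python
def rd_threshold(coeff, step, rate_bits, lam):
    """Keep or zero a quantized DCT coefficient by comparing
    Lagrangian costs J = D + lambda * R.

    coeff     : original DCT coefficient value
    step      : quantizer step size
    rate_bits : assumed cost in bits of coding the kept level
    lam       : rate-distortion trade-off multiplier
    """
    q = round(coeff / step)
    d_keep = (coeff - q * step) ** 2   # quantization error if kept
    j_keep = d_keep + lam * rate_bits
    d_drop = coeff ** 2                # squared error if zeroed
    j_drop = d_drop                    # zeroing assumed rate-free (extends a run)
    return q if j_keep <= j_drop else 0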

Journal ArticleDOI
TL;DR: Testing results on a variety of erroneous fragmented and normal JPEG files prove the strength of this operator for the purpose of forensics analysis, data recovery and abnormal fragment inconsistencies classification and detection and show that the proposed DCT coefficient analysis methods are efficient and practical in terms of classification accuracy.

2 citations

Proceedings ArticleDOI
01 Dec 2012
TL;DR: The objective of this paper is to prove the significance of the optimized run length coding algorithm for biomedical imaging technology and open source the idea behind the optimized algorithm in a comprehensive way.
Abstract: The objective of this paper is to prove the significance of the optimized run length coding algorithm for biomedical imaging technology and open source the idea behind the optimized algorithm in a comprehensive way An optimized scheme for entropy encoding part of JPEG image compression by modifying the run length encoding method has been provided by the authors for a Space Research Program at Institute of Space technology (IST) The same has been observed to produce a large amount of saving in terms of memory required for Biomedical compressed images In JPEG (Joint Photographic Experts Group) image compression algorithm run length coding performs the actual compression by removing the redundancy from transformed and quantized image data Using the fact that the preceding processes of run length coding produces a large number of zeros the original run length coding uses an ordered pair (a, b), where ‘a’ is the length of consecutive zeros preceding the ASCII character ‘b’ The proposed run length encoding scheme removes the unintended redundancy by using an ordered pair only when a zero occurs The proposed encoding scheme does not alter the PSNR value for the algorithm Using Matlab simulation, the proposed scheme has been tested on various biomedical images over a range of quantization (quality) factor and the results confirmed the effectiveness of the new run length encoding scheme in reducing the run length encoded data size and processing time delay

2 citations


Network Information
Related Topics (5)
Image segmentation
79.6K papers, 1.8M citations
82% related
Feature (computer vision)
128.2K papers, 1.7M citations
82% related
Feature extraction
111.8K papers, 2.1M citations
82% related
Image processing
229.9K papers, 3.5M citations
80% related
Convolutional neural network
74.7K papers, 2M citations
79% related
Performance
Metrics
No. of papers in the topic in previous years
YearPapers
202321
202240
20215
20202
20198
201815