Topic
Lossless JPEG
About: Lossless JPEG is a research topic. Over the lifetime, 2415 publications have been published within this topic, receiving 51110 citations. The topic is also known as Lossless JPEG and .jls.
Papers published on a yearly basis
Papers
24 Jun 2004
TL;DR: A compression scheme that exploits inter-slice redundancy in medical imaging that is compatible with the JPEG 2000 core coding system extensions (part 2) is evaluated.
Abstract: We evaluate a compression scheme that exploits inter-slice redundancy in medical imaging and is compatible with the JPEG 2000 core coding system extensions (part 2). The scheme consists of applying a decorrelating transform in the cross-slice (vertical) direction and then compressing each slice with a core part 1 JPEG 2000 coder. The rate for each slice is chosen according to a heuristic log-based allocation algorithm. Two transforms were considered, the KLT and the 9×7 DWT. A gain of up to 4.4 dB in RMSE was achieved using the 9×7 DWT over a baseline JPEG 2000 system that uses no decorrelating transform.
7 citations
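The cross-slice decorrelation idea above can be sketched in a few lines of Python. This is a hypothetical stand-in, not the paper's method: a one-level Haar transform along the slice axis takes the place of the KLT or 9×7 DWT, and the per-slice JPEG 2000 coding step is omitted.

```python
# Hypothetical sketch of cross-slice decorrelation: a one-level Haar
# transform along the slice axis stands in for the paper's KLT / 9x7 DWT.

def haar_1d(samples):
    """One Haar level: pairwise averages, then pairwise differences."""
    avgs = [(a + b) / 2 for a, b in zip(samples[::2], samples[1::2])]
    diffs = [(a - b) / 2 for a, b in zip(samples[::2], samples[1::2])]
    return avgs + diffs

def decorrelate_volume(volume):
    """Apply the 1-D transform across slices at every (row, col) position.
    `volume` is a list of 2-D slices; its depth must be even."""
    depth, rows, cols = len(volume), len(volume[0]), len(volume[0][0])
    out = [[[0.0] * cols for _ in range(rows)] for _ in range(depth)]
    for r in range(rows):
        for c in range(cols):
            column = [volume[z][r][c] for z in range(depth)]
            for z, v in enumerate(haar_1d(column)):
                out[z][r][c] = v
    return out

# Adjacent medical slices are highly correlated, so the difference half of
# the transform is near zero and compresses well with a per-slice 2-D coder.
vol = [[[10, 10], [10, 10]],
       [[11, 11], [11, 11]],
       [[12, 12], [12, 12]],
       [[13, 13], [13, 13]]]
transformed = decorrelate_volume(vol)
```

After the transform, the first half of the output slices holds the averages and the second half holds near-zero differences, which is where the compression gain comes from.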
08 Jul 2009
TL;DR: This paper proposes an algorithm that utilizes both JPEG 2000 and robust watermarking for protection and compression of the medical image; experiments confirmed that the combined algorithm is faster than performing the two steps separately.
Abstract: The Picture Archiving and Communication System (PACS) was introduced for computerization of the medical system and telediagnosis between hospitals. It is becoming possible to create, store, and transmit medical images via PACS. There has been a growing interest in protecting medical images, which carry an enormous amount of information. To improve transmission speed among hospitals, medical images should be compressed with JPEG 2000 at a high compression ratio. This paper proposes an algorithm that utilizes both JPEG 2000 and robust watermarking for protection and compression of the medical image. With the proposed algorithm, it takes considerably less time to perform JPEG 2000 compression and watermarking than when they are done separately. Based on the experimental results, the proposed algorithm takes 0.72 seconds, versus 1.11 seconds when the steps are done separately. We confirmed that the proposed algorithm is faster than performing the steps separately.
7 citations
04 Jan 2012
TL;DR: The results show that the performance of the standard JPEG method can be improved by the proposed method; this hybrid approach achieves about 20% more compression than standard JPEG.
Abstract: Lossy JPEG compression is a widely used compression technique. Normally the JPEG technique uses two processes: quantization, which is lossy, and entropy encoding, which is considered lossless. In this paper, a new technique is proposed that combines the JPEG algorithm with a symbol-reduction Huffman technique to achieve a higher compression ratio. The symbol-reduction technique reduces the number of symbols by combining them to form new symbols. As a result, the number of Huffman codes to be generated is also reduced. The results show that the performance of the standard JPEG method can be improved by the proposed method; this hybrid approach achieves about 20% more compression than standard JPEG.
7 citations
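The symbol-reduction idea can be illustrated with a small Python sketch. The grouping rule here is hypothetical (pack pairs of consecutive symbols into one composite symbol before building the Huffman code); the paper's exact reduction rule may differ.

```python
import heapq
from collections import Counter

def reduce_symbols(data, group=2):
    """Hypothetical symbol reduction: pack `group` consecutive symbols
    into one composite symbol, halving the number of symbols to encode."""
    return [tuple(data[i:i + group])
            for i in range(0, len(data) - group + 1, group)]

def huffman_code_lengths(symbols):
    """Return {symbol: code length} for a Huffman code over `symbols`."""
    freq = Counter(symbols)
    if len(freq) == 1:                       # degenerate one-symbol alphabet
        return {next(iter(freq)): 1}
    # Heap entries: (weight, tiebreak, {symbol: depth-so-far}).
    heap = [(n, i, {s: 0}) for i, (s, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

data = [0, 1, 0, 1, 0, 0, 0, 0]
pairs = reduce_symbols(data)           # 4 composite symbols instead of 8 raw ones
lengths = huffman_code_lengths(pairs)  # one code per composite symbol
```

In a real codec the composite alphabet would be built from the entropy-coder's actual symbol stream; the sketch only shows why fewer codes need to be generated.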
09 Aug 1992
TL;DR: A quantization scheme for discrete cosine transform (DCT) coefficients in JPEG's baseline sequential method for image compression is proposed in this paper; it is adaptive to the image characteristics and statistical in nature.
Abstract: A quantization scheme for discrete cosine transform (DCT) coefficients in the Joint Photographic Experts Group's (JPEG's) baseline sequential method for image compression is proposed. The DCT coefficients should be quantized to achieve maximum compression without degrading the visual image quality. The scheme is adaptive to the image characteristics and is statistical in nature. The results are evaluated in terms of compression, root-mean-square error, and subjective visual quality. 8-bit/pixel monochrome images of size 512 × 512 have been compressed in the range 0.4-1 bit/pixel with good to excellent quality.
7 citations
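A statistics-adaptive quantizer of this kind can be sketched as follows. The adaptation rule here is an assumption for illustration only (scale each base quantizer step by the measured spread of that DCT coefficient across blocks); the paper's actual statistical model may differ.

```python
# Hypothetical statistics-adaptive quantization table: coefficients whose
# values vary a lot across blocks get a finer (smaller) quantizer step.

def coeff_spread(blocks, r, c):
    """Standard deviation of DCT coefficient (r, c) over all blocks."""
    vals = [b[r][c] for b in blocks]
    mean = sum(vals) / len(vals)
    return (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5

def adaptive_table(blocks, base_table, strength=0.5):
    """Scale each entry of `base_table` by the measured coefficient spread.
    `blocks` is a list of same-sized 2-D DCT blocks."""
    n = len(base_table)
    table = [[0] * n for _ in range(n)]
    for r in range(n):
        for c in range(n):
            spread = coeff_spread(blocks, r, c)
            # Larger spread -> smaller step; never let the step fall below 1.
            table[r][c] = max(1, round(base_table[r][c] / (1 + strength * spread)))
    return table

# Toy 2x2 "DCT blocks": only coefficient (0, 0) varies across blocks.
blocks = [[[100, 0], [0, 0]],
          [[50, 0], [0, 0]],
          [[150, 0], [0, 0]]]
base = [[16, 11], [12, 14]]
table = adaptive_table(blocks, base)
```

With this rule the high-variance coefficient (0, 0) is quantized finely while the constant coefficients keep their base steps.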
29 Mar 1994
TL;DR: The authors show a rate-distortion optimal quantization technique to threshold the DCT coefficients in the industry image and video coding standards JPEG and MPEG, respectively, which achieves a decent thresholding gain and uses a fast dynamic programming recursive structure that exploits certain monotonicity characteristics of the JPEG and MPEG codebooks to drastically reduce the complexity.
Abstract: The authors show a rate-distortion optimal quantization technique to threshold the DCT coefficients in the industry image and video coding standards JPEG and MPEG, respectively. Their scheme achieves a decent thresholding gain in terms of both objective SNR (about 1 dB) and perceived quality, and uses a fast dynamic programming recursive structure which exploits certain monotonicity characteristics of the JPEG and MPEG codebooks to drastically reduce the complexity. The primary advantage of their encoding algorithm is that it is completely compatible with baseline JPEG and MPEG decoders.
7 citations
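A much-simplified Python sketch of coefficient thresholding follows. It replaces the paper's dynamic program over JPEG/MPEG run-length codewords with an independent per-coefficient Lagrangian test, so it is an illustration of the rate-distortion trade-off, not the actual algorithm; `lam` and the bit costs are hypothetical inputs.

```python
# Simplified rate-distortion thresholding: drop a quantized DCT coefficient
# when the distortion it would add (its squared value) is smaller than the
# rate it saves, weighted by the Lagrange multiplier `lam`.

def threshold_coeffs(coeffs, rate_bits, lam):
    """Zero each coefficient whose Lagrangian cost favors dropping it.
    `coeffs` are quantized values; `rate_bits` are their coding costs."""
    kept = []
    for c, bits in zip(coeffs, rate_bits):
        if c != 0 and c * c < lam * bits:
            kept.append(0)   # cheaper to drop: accept distortion, save rate
        else:
            kept.append(c)   # worth its bits (or already zero)
    return kept
```

For example, with `lam = 2.0`, a coefficient of 1 that costs 3 bits is dropped (1 < 6), while a coefficient of 5 costing 6 bits is kept (25 > 12). The resulting stream still decodes on an unmodified baseline decoder, which is the compatibility point the abstract highlights.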