Topic

Lossless JPEG

About: Lossless JPEG is a research topic. Over its lifetime, 2415 publications have appeared within this topic, receiving 51110 citations. The topic is also known as Lossless JPEG and by the file extension .jls.


Papers
Proceedings ArticleDOI
07 Aug 2001
TL;DR: A lossless Karhunen-Loeve Transform (KLT) based on ladder networks is proposed; it reduces inter-color redundancies, which increases coding performance, and the combined scheme provides progressive transmission capability. Its coding performance is superior to that of DCT-based JPEG with the RGB/YUV transform.
Abstract: This paper proposes pathological microscopic image compression schemes suited to lossless and progressive transmission. Because pathological microscopic images require very high resolution, they impose heavy storage requirements and long transmission times, so image compression is desirable. First, we propose a lossless Karhunen-Loeve Transform (KLT) based on ladder networks. The proposed lossless KLT reduces inter-color redundancies, which increases coding performance. Next, we propose a progressive transmission algorithm that combines the lossless KLT with set partitioning in hierarchical trees (SPIHT) and the S+P transform. SPIHT is adopted to encode the individual color-transformed components, and the transmission bit rate of each encoded component is determined by its coding efficiency. The resulting algorithm gives high coding performance and progressive transmission capability; once all transmitted data are decoded, the original image is recovered exactly. We demonstrate the performance of the proposed algorithm on super-high-definition pathological microscopic images, each 2048×2048 pixels at 24 bits per pixel. The coding performance of the proposed algorithm is shown to be superior to that of DCT-based JPEG with the RGB/YUV transform.
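The abstract does not spell out the ladder-network factorization of the KLT; the core trick is that rounding inside each ladder (lifting) step keeps the transform integer-to-integer and exactly invertible. A minimal sketch of that idea, using the fixed reversible color transform of JPEG 2000 as a stand-in for the paper's data-dependent KLT:

```python
import numpy as np

def rct_forward(r, g, b):
    """Reversible color transform (JPEG 2000 RCT): each step adds a
    floor-rounded function of the other channels, so every step, and
    hence the whole ladder, inverts exactly on integers."""
    y = (r + 2 * g + b) >> 2   # rounded luma (floor division by 4)
    u = b - g                  # ladder step: chroma difference
    v = r - g
    return y, u, v

def rct_inverse(y, u, v):
    """Undo the ladder steps in reverse order."""
    g = y - ((u + v) >> 2)
    b = u + g
    r = v + g
    return r, g, b

r, g, b = np.random.randint(0, 256, (3, 4, 4), dtype=np.int64)
restored = rct_inverse(*rct_forward(r, g, b))
assert all(np.array_equal(x, y) for x, y in zip(restored, (r, g, b)))
```

A lossless KLT replaces these fixed coefficients with ones estimated from the image, factored into the same kind of rounded ladder steps so the inverse remains exact.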

1 citation

Proceedings ArticleDOI
07 Apr 2014
TL;DR: This paper helps the forger hide an image's JPEG compression history by recompressing the image and adding noise at a different quality factor, then restoring image quality with total-variation (TV) denoising; the scheme defends against forensic attacks on anti-forensic dither that measure the total variation of the image.
Abstract: Identifying and disguising an image's JPEG compression history has become a hot research topic in recent years. A forger can easily obtain JPEG images as tampering material, but JPEG's lossy compression leaves a strong imprint on the image's DCT histogram, and this compression trace has become an important basis for compression forensics. To fool compression forensics, anti-forensic dither can be added to a JPEG-compressed image, which is then saved in an uncompressed image format. However, such a fake lossless image can still be identified by measuring how 'noisy' the uncompressed image is after recompression. This paper helps the forger hide the JPEG compression history by recompressing and adding noise at a different quality factor and then restoring image quality with TV denoising. The method defends against the forensic attack that targets anti-forensic dither by measuring the total variation (TV) of the image. Simulation results show that it can fool the TV-detection forensic while maintaining high image quality.
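The detector the paper defends against hinges on a total-variation measurement; a minimal sketch, assuming the common anisotropic TV definition (the function name is illustrative):

```python
import numpy as np

def total_variation(img: np.ndarray) -> float:
    """Anisotropic total variation: the sum of absolute differences
    between horizontally and vertically adjacent pixels. JPEG blocking
    and added dither both inflate this relative to a clean image."""
    img = img.astype(np.float64)
    dh = np.abs(np.diff(img, axis=1)).sum()  # horizontal neighbors
    dv = np.abs(np.diff(img, axis=0)).sum()  # vertical neighbors
    return float(dh + dv)
```

Roughly, a TV-based detector compares an image's variation before and after a trial recompression and flags values an uncompressed original would not produce; the paper's TV-denoising step aims to push the dithered image back to a plausible value.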

1 citation

Journal Article
TL;DR: Simulated experiments show that the proposed methods give better image quality than JPEG2000, currently the most popular compressor, at any given bit rate.
Abstract: In many contemporary applications, such as distributed multimedia systems, rapid transmission of images is necessary. The cost of transmission and storage tends to be directly proportional to the volume of data, so digital image compression techniques become necessary to minimize cost. A number of digital image compression algorithms have been developed and standardized. The method proposed by the Joint Photographic Experts Group (JPEG) is a lossy compression technique; an improved version is JPEG2000, currently the most popular compressor. This paper presents new methods for coding non-negative integers with Gamma-code coders inside a modified JPEG2000 architecture: the entropy coder is replaced by the new coders. Simulated experiments show that the proposed methods give better image quality than JPEG2000 at any given bit rate.
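The abstract does not define its Gamma code; the classic choice for coding unbounded non-negative integers is the Elias gamma code, which writes a positive integer n as floor(log2 n) zeros followed by n in binary. A minimal sketch, assuming Elias gamma with a +1 shift so that zero is representable:

```python
def elias_gamma_encode(n: int) -> str:
    """Elias gamma codeword for a non-negative integer.
    Gamma codes positive integers only, so map n -> n + 1 first."""
    n += 1
    binary = bin(n)[2:]                      # e.g. 5 -> '101'
    return '0' * (len(binary) - 1) + binary  # 5 -> '00101'

def elias_gamma_decode(bits: str, pos: int = 0) -> tuple[int, int]:
    """Decode one codeword starting at `pos`; return (value, next pos)."""
    zeros = 0
    while bits[pos + zeros] == '0':          # count the zero prefix
        zeros += 1
    end = pos + 2 * zeros + 1                # codeword spans 2*zeros+1 bits
    return int(bits[pos + zeros:end], 2) - 1, end

assert elias_gamma_decode(elias_gamma_encode(4))[0] == 4
```

Small values get short codewords, which suits the residual-like distributions an image entropy coder typically sees.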

1 citation

Book ChapterDOI
01 Jan 1993
TL;DR: This work presents a new model of error-resilient communication in which, even though errors may not be detected, there are strong guarantees that their effects will not propagate.
Abstract: With dynamic communication, a sender and receiver work in "lock-step" cooperation to maintain identical copies of a dictionary D (which is constantly changing). A key application of dynamic communication is adaptive data compression. A potential drawback of dynamic communication is error propagation, which causes the sender and receiver dictionaries to diverge and can corrupt all data that follows. Protocols that require the receiver to request re-transmission from the sender when an error is detected can be impractical for many applications where such two-way communication is impossible or self-defeating (e.g., with data compression, re-transmission is tantamount to losing the data that could have been transmitted in the meantime). We present a new model of error-resilient communication where, even though errors may not be detected, there are strong guarantees that their effects will not propagate.
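The "lock-step" dictionary of the abstract is the mechanism behind adaptive coders such as LZW, where sender and receiver grow identical copies of D from the transmitted codes alone; a single corrupted code desynchronizes the two dictionaries, which is exactly the propagation the paper's model bounds. A minimal LZW sketch illustrating the shared-dictionary invariant (not the paper's protocol):

```python
def lzw_encode(data: bytes) -> list[int]:
    """Sender side: emit codes while growing dictionary D in lock-step."""
    D = {bytes([i]): i for i in range(256)}
    out, w = [], b""
    for byte in data:
        wc = w + bytes([byte])
        if wc in D:
            w = wc
        else:
            out.append(D[w])
            D[wc] = len(D)        # the same rule the receiver applies
            w = bytes([byte])
    if w:
        out.append(D[w])
    return out

def lzw_decode(codes: list[int]) -> bytes:
    """Receiver side: rebuild the identical dictionary from codes alone."""
    D = {i: bytes([i]) for i in range(256)}
    w = D[codes[0]]
    out = [w]
    for c in codes[1:]:
        entry = D[c] if c in D else w + w[:1]  # the KwKwK corner case
        out.append(entry)
        D[len(D)] = w + entry[:1]              # mirror of the sender's update
        w = entry
    return b"".join(out)

assert lzw_decode(lzw_encode(b"abracadabra")) == b"abracadabra"
```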

1 citation

Proceedings ArticleDOI
21 Jul 2015
TL;DR: A comparative study against current state-of-the-art lossless image coding standards shows that the proposed DNACoding approach provides high compression ratios.
Abstract: Lossless image compression is necessary for many applications related to digital cameras, medical imaging, mobile telecommunications, security and entertainment. Image compression, an important field in image processing, includes several coding standards providing high compression ratios. In this work, a novel method for lossless image encoding and decoding is introduced. Inspired by the storage and data-representation architectures used in living multicellular organisms, the proposed DNACoding approach encodes images on the same principles. The coding process includes three main stages, division, differentiation and specialization, allowing the exploitation of spatial and inter-pixel redundancies. The key element in achieving this representation and efficiency is the novel concept of 'stem' pixels introduced here. A comparative study against current state-of-the-art lossless image coding standards shows that the proposed methodology provides high compression ratios.
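The abstract only names the stem-pixel concept; purely as a hypothetical illustration (not the paper's actual DNACoding algorithm), one can read 'division' as block partitioning, a 'stem' pixel as a block's reference value, and 'differentiation' as coding the remaining pixels against it:

```python
import numpy as np

def stem_encode(img: np.ndarray, block: int = 8):
    """Hypothetical stem-pixel coding: keep one reference ('stem')
    pixel per block and store every other pixel as its difference
    from the stem. Losslessly invertible (decode adds each block's
    stem back); residuals cluster near zero, exposing spatial
    redundancy to a downstream entropy coder."""
    h, w = img.shape
    stems = np.empty(((h + block - 1) // block, (w + block - 1) // block),
                     dtype=img.dtype)
    residuals = np.empty(img.shape, dtype=np.int16)
    for y in range(0, h, block):
        for x in range(0, w, block):
            blk = img[y:y + block, x:x + block].astype(np.int16)
            stems[y // block, x // block] = img[y, x]
            residuals[y:y + block, x:x + block] = blk - blk[0, 0]
    return stems, residuals
```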

1 citation


Network Information
Related Topics (5)
Image segmentation
79.6K papers, 1.8M citations
82% related
Feature (computer vision)
128.2K papers, 1.7M citations
82% related
Feature extraction
111.8K papers, 2.1M citations
82% related
Image processing
229.9K papers, 3.5M citations
80% related
Convolutional neural network
74.7K papers, 2M citations
79% related
Performance
Metrics
No. of papers in the topic in previous years
Year  Papers
2023  21
2022  40
2021  5
2020  2
2019  8
2018  15