
Lossless JPEG

About: Lossless JPEG is a research topic. Over its lifetime, 2,415 publications have been published within this topic, receiving 51,110 citations. The topic is also known as: Lossless JPEG and .jls.


Papers
Posted Content
TL;DR: It is found that the noise distributions in higher compression cycles are different from those in the first compression cycle, and they are dependent on the quantization parameters used between two successive cycles.
Abstract: This paper focuses on the JPEG noises, namely the quantization noise and the rounding noise, that arise during a JPEG compression cycle. The JPEG noises in the first compression cycle have been well studied; however, so far, less attention has been paid to the JPEG noises in higher compression cycles. In this work, we present a statistical analysis of JPEG noises beyond the first compression cycle. To our knowledge, this is the first work on this topic. We find that the noise distributions in higher compression cycles are different from those in the first compression cycle, and that they depend on the quantization parameters used between two successive cycles. To demonstrate the benefits of the statistical analysis, we provide two applications that can employ the derived noise distributions to uncover JPEG compression history with state-of-the-art performance.
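As a toy illustration of the effect described above (not the paper's own analysis), the following Python sketch requantizes synthetic DCT coefficients with two different quantization steps; the Laplacian coefficient model and the steps q1, q2 are illustrative assumptions.

```python
# Illustrative sketch only: requantization of synthetic DCT coefficients.
# Real JPEG cycles also involve pixel-domain rounding noise, omitted here.
import numpy as np

rng = np.random.default_rng(0)
# AC coefficients of natural images are commonly modelled as Laplacian.
coeffs = rng.laplace(scale=8.0, size=100_000)

def quantize_cycle(c, q):
    """One quantize/dequantize round with step q (the lossy part of a cycle)."""
    return np.round(c / q) * q

q1, q2 = 5, 7                          # quantization steps of two cycles (assumed)
once = quantize_cycle(coeffs, q1)      # first compression cycle
twice = quantize_cycle(once, q2)       # second compression cycle

noise1 = once - coeffs                 # first-cycle quantization noise
noise2 = twice - once                  # second-cycle quantization noise

# First-cycle noise spreads roughly uniformly over [-q1/2, q1/2]; the
# second-cycle noise takes only discrete values (its inputs are already
# multiples of q1), so its distribution depends on the pair (q1, q2).
print("cycle-1 noise std:", round(float(noise1.std()), 3))
print("distinct cycle-2 noise values:", np.unique(noise2).size)
```

The point of the sketch is qualitative: after the first cycle the quantizer's inputs are no longer continuous, so the second-cycle noise is discrete and shaped by both steps, matching the dependence on successive quantization parameters reported above.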

1 citation

Proceedings ArticleDOI
01 Oct 2006
TL;DR: The proposed GPM approach has advantages over other serial lossless compression methods in terms of parallelism, scalability and easy hardware implementation.
Abstract: This paper presents a new generalized particle model (GPM) for generating the prediction coding used in lossless data compression. The basic concept, parallel algorithm, properties, and realization scheme of GPM are discussed. The proposed GPM approach has advantages over other serial lossless compression methods in terms of parallelism, scalability, and ease of hardware implementation. GPM is suitable for lossless compression based on various prediction models and higher-order transition models.
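The abstract does not spell out the GPM algorithm itself, so the sketch below shows only the generic prediction-coding idea it builds on: predict each sample from a causal neighbour and keep the exact integer residual, so the round trip is lossless. The left-neighbour predictor is an illustrative assumption, not part of GPM.

```python
# Generic prediction coding sketch (not the GPM algorithm): a causal
# left-neighbour predictor whose integer residuals reconstruct exactly.
import numpy as np

def predict_residuals(img):
    """Residuals of a left-neighbour predictor; the first column passes through."""
    img = img.astype(np.int32)
    pred = np.zeros_like(img)
    pred[:, 1:] = img[:, :-1]       # predict each pixel from its left neighbour
    return img - pred               # residuals cluster near zero -> entropy-codable

def reconstruct(residuals):
    """Invert the predictor with a cumulative sum along each row."""
    return np.cumsum(residuals, axis=1)

img = np.random.default_rng(1).integers(0, 256, size=(4, 8))
res = predict_residuals(img)
assert np.array_equal(reconstruct(res), img)   # exact (lossless) round trip
```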

1 citation

01 Jan 2007
TL;DR: This work investigates sparse coding with an overcomplete basis set representation, believed to be the strategy employed by the mammalian visual system for efficient coding of natural images, applies the developed models to image compression, and tests the achievable compression levels.
Abstract: Overcomplete representations are currently one of the most actively researched areas, especially in the field of signal processing, owing to their strong potential to generate sparse representations of signals. A sparse representation implies that a given signal can be represented with components that are only rarely significantly active. It has been strongly argued that the mammalian visual system is oriented towards sparse and overcomplete representations: the primary visual cortex has overcomplete responses in representing an input signal, which leads to the use of sparse neuronal activity for further processing. This work investigates sparse coding with an overcomplete basis set representation, which is believed to be the strategy employed by the mammalian visual system for efficient coding of natural images. It analyzes the Sparse Code Learning algorithm, in which a given image is represented as a linear superposition of sparse, statistically independent events on a set of overcomplete basis functions; the algorithm trains and adapts the overcomplete basis functions so as to represent any given image in terms of sparse structures. The second part of the work analyzes an inhibition-based sparse coding model in which Gabor-based overcomplete representations are used to represent the image; an iterative inhibition algorithm, based on competition between neighboring transform coefficients, then selects a subset of Gabor functions so as to represent the given image with a sparse set of coefficients. This work applies the developed models to image compression and tests the achievable compression levels. Research in these areas has so far shown that sparse coding algorithms are inefficient at representing sharp, high-frequency image features, so this work analyzes the performance of these algorithms only on natural images without sharp features and compares the compression results with current industry-standard coding schemes such as JPEG and JPEG 2000. It also models the characteristics of an image falling on the retina after the distortion effects of the eye, applies the developed algorithms to these images, and tests the compression results.
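As a concrete, hedged illustration of sparse coding over an overcomplete basis, the sketch below runs plain matching pursuit over a random dictionary; the work above uses learned and Gabor bases, and the dimensions and atom budget here are arbitrary assumptions.

```python
# Matching pursuit over a 4x-overcomplete random dictionary. The work
# above uses learned or Gabor bases; a random dictionary keeps this short.
import numpy as np

rng = np.random.default_rng(2)
n, k = 64, 256                           # 64-dim signal, 256 atoms: 4x overcomplete
D = rng.standard_normal((n, k))
D /= np.linalg.norm(D, axis=0)           # unit-norm atoms

def matching_pursuit(x, D, n_atoms=8):
    """Greedily subtract the atom most correlated with the current residual."""
    residual, coeffs = x.copy(), np.zeros(D.shape[1])
    for _ in range(n_atoms):
        corr = D.T @ residual
        j = int(np.argmax(np.abs(corr)))   # best-matching atom
        coeffs[j] += corr[j]
        residual -= corr[j] * D[:, j]
    return coeffs, residual

x = rng.standard_normal(n)
coeffs, residual = matching_pursuit(x, D)
print("non-zero coefficients:", np.count_nonzero(coeffs))
print("residual energy ratio:", float(residual @ residual) / float(x @ x))
```

Only a handful of coefficients end up active, which is the "rarely significantly active" property the abstract describes; the residual energy ratio measures how much of the signal those few atoms capture.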

1 citation

Journal ArticleDOI
TL;DR: In this paper, a combination of predictive coding and the integer wavelet transform is proposed to reduce the storage requirement and transmission time for radiographic non-destructive testing images of aircraft components.
Abstract: This paper presents an efficient lossless compression method to reduce the storage requirement and transmission time for radiographic non-destructive testing images of aircraft components. The method is based on a combination of predictive coding and the integer wavelet transform. By using the component CAD model to divide the radiographic image of aircraft components into different regions with each region having the same material structure, the parameters of the predictors and the choice of the integer wavelet transform are optimised according to the specific image features contained in each region. Using a real radiographic image of a practical aircraft component as an example, the proposed method is presented and shown to offer a significantly higher compression ratio than other lossless compression schemes currently available.
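The region-wise optimisation is the part of this method most easily sketched. The toy below is an assumption-laden stand-in: two fixed image halves replace the CAD-derived regions, and two trivial predictors replace the optimised predictor and wavelet parameters; each region gets the predictor whose residuals have the lowest empirical entropy.

```python
# Toy per-region predictor selection by residual entropy; the paper derives
# regions from the component CAD model and also optimises the wavelet choice.
import numpy as np

def entropy_bits(values):
    """Empirical zeroth-order entropy (bits/symbol) of an integer array."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def residuals(img, mode):
    """Integer residuals of a trivial causal predictor ('left' or 'up')."""
    img = img.astype(np.int32)
    res = img.copy()
    if mode == "left":
        res[:, 1:] -= img[:, :-1]
    else:                                  # "up"
        res[1:, :] -= img[:-1, :]
    return res

img = np.random.default_rng(3).integers(0, 256, size=(64, 64))
for name, region in [("top", img[:32]), ("bottom", img[32:])]:
    best = min(("left", "up"),
               key=lambda m: entropy_bits(residuals(region, m)))
    print(f"{name} region -> predictor: {best}")
```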

1 citation

Book ChapterDOI
14 Nov 2006
TL;DR: The capability of the Lower Tree Wavelet (LTW) image encoder to work in lossless mode is evaluated; despite being general purpose and lacking complex techniques, the LTW performs as well as JPEG 2000 in lossless mode, and only 5% below LOCO-I, a specific lossless algorithm.
Abstract: For a lossy encoder, it is important to be able to provide lossless compression as well, with little or no modification of the usual algorithm, so that an implementation of that algorithm can work in lossy or lossless mode, depending on the specific application, simply by varying the input parameters. In this paper, we evaluate the capability of the Lower Tree Wavelet (LTW) image encoder to work in lossless mode. LTW is a fast, multiresolution wavelet image encoder which uses trees as a fast means of grouping coefficients. In addition, general details on how to efficiently implement (i.e., with only shift and addition/subtraction operations) a reversible integer-to-integer wavelet transform are also given, as a requirement for implementing a wavelet-based lossless encoder. Numerical results show that, despite being general purpose (i.e., both lossy and lossless) and lacking complex techniques (such as high-order context and predictive coding), the LTW performs as well as JPEG 2000 in lossless mode, and only 5% below LOCO-I, a specific lossless algorithm.
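The shift-and-add reversible transform the abstract refers to can be illustrated with the standard 5/3 integer lifting scheme, the reversible filter used by JPEG 2000 lossless; the 1-D, even-length version below is a sketch under that assumption, and the paper's exact variant may differ.

```python
# Reversible integer-to-integer 5/3 lifting, using only shifts and
# add/subtract. 1-D, even-length input, symmetric boundary handling.
def fwd53(x):
    even, odd = list(x[0::2]), list(x[1::2])
    n = len(odd)
    # Predict step (high-pass): d[i] = odd[i] - floor((even[i] + even[i+1]) / 2)
    d = [odd[i] - ((even[i] + even[min(i + 1, n - 1)]) >> 1) for i in range(n)]
    # Update step (low-pass): s[i] = even[i] + floor((d[i-1] + d[i] + 2) / 4)
    s = [even[i] + ((d[max(i - 1, 0)] + d[i] + 2) >> 2) for i in range(n)]
    return s, d

def inv53(s, d):
    n = len(d)
    # Undo the lifting steps in reverse order with the same integer expressions.
    even = [s[i] - ((d[max(i - 1, 0)] + d[i] + 2) >> 2) for i in range(n)]
    odd = [d[i] + ((even[i] + even[min(i + 1, n - 1)]) >> 1) for i in range(n)]
    x = [0] * (2 * n)
    x[0::2], x[1::2] = even, odd
    return x

x = [5, 9, 7, 3, 8, 8, 2, 6]
s, d = fwd53(x)
assert inv53(s, d) == x   # integer arithmetic makes the round trip exact
```

Because each lifting step only adds an integer function of other samples, the inverse subtracts the same expressions in reverse order, which is what makes the transform exactly reversible and hence usable for lossless coding.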

1 citation


Network Information
Related Topics (5)
Image segmentation: 79.6K papers, 1.8M citations, 82% related
Feature (computer vision): 128.2K papers, 1.7M citations, 82% related
Feature extraction: 111.8K papers, 2.1M citations, 82% related
Image processing: 229.9K papers, 3.5M citations, 80% related
Convolutional neural network: 74.7K papers, 2M citations, 79% related
Performance Metrics
No. of papers in the topic in previous years:
Year: Papers
2023: 21
2022: 40
2021: 5
2020: 2
2019: 8
2018: 15