scispace - formally typeset
Topic

Lossless JPEG

About: Lossless JPEG is a research topic. Over its lifetime, 2415 publications have been published within this topic, receiving 51110 citations. The topic is also known as Lossless JPEG and .jls.


Papers

Proceedings ArticleDOI
17 Aug 2005
TL;DR: A priority-driven scheduling approach is introduced into the coding algorithm, which transmits important parts of an image earlier and with more data than other parts; this satisfies users' desired image quality and significantly reduces deadline misses for the important parts.
Abstract: Since high-quality image/video systems based on the JPEG/MPEG compression standards often require power-expensive implementations at relatively high bit-rates, they have not been widely used in low-power wireless applications. To alleviate this problem, we designed, implemented, and evaluated a strategy that can adapt to different compression and transmission rates. (1) It gives important parts of an image higher priority than unimportant parts. The high-priority parts can therefore achieve high image quality, while the low-priority parts, with a slight sacrifice of quality, can achieve a huge compression rate and thus save the power/energy of a low-power wireless system. (2) We also introduce a priority-driven scheduling approach into our coding algorithm, which transmits important parts earlier and with more data than other parts. Through a balanced trade-off between the available time/bandwidth/power and the image quality, this adaptive strategy can satisfy users' desired image quality and significantly reduce deadline misses for the important parts.
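The prioritization and scheduling idea above can be sketched as follows. The block structure, priority values, and the proportional budget-sharing rule are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of priority-driven scheduling for image block
# transmission. Block fields, priorities, and the budget-sharing rule
# are invented for illustration; they are not the paper's algorithm.

def schedule_blocks(blocks, budget):
    """Order image blocks by priority and spend more of the bit budget
    on important blocks, so they are sent earlier and at higher quality."""
    # Highest priority first: important parts are transmitted earlier.
    ordered = sorted(blocks, key=lambda b: b["priority"], reverse=True)
    plan, remaining = [], budget
    for b in ordered:
        # High-priority blocks claim a larger share of the remaining bits;
        # low-priority blocks get whatever is left (coarser quality).
        share = min(remaining, b["priority"] * b["size"])
        plan.append((b["id"], share))
        remaining -= share
    return plan

plan = schedule_blocks(
    [{"id": "face", "priority": 0.9, "size": 100},
     {"id": "sky", "priority": 0.2, "size": 100}],
    budget=150)
# The "face" block is scheduled first and receives more bits than "sky".
```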

7 citations

Proceedings ArticleDOI
03 Mar 2016
TL;DR: The proposed JPEG-LS algorithm, based on LOCO-I, is implemented in MATLAB; it uses a predictive technique, and the resulting prediction error is encoded using Golomb-Rice coding.
Abstract: The LOCO-I / JPEG-LS algorithm aims to provide lossless compression at a much lower algorithmic complexity. The official designation of JPEG-LS is ISO-14495-1/ITU-T.87. JPEG-LS is a simple and efficient algorithm that mainly consists of two stages: modeling and encoding. It thus divides the whole compression process into two phases, spatial pixel prediction and entropy coding, and uses contexts in both phases. The algorithm uses a predictive technique, and the resulting prediction error is encoded using Golomb-Rice coding. The proposed JPEG-LS algorithm based on LOCO-I is implemented in MATLAB.
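The two stages described above, prediction followed by Golomb-Rice coding, can be sketched as follows. The MED (median edge detector) predictor is the one LOCO-I actually uses; the fixed Rice parameter here is a simplifying assumption, since the full algorithm derives it from per-context error statistics.

```python
# Minimal sketch of the two JPEG-LS stages: the MED predictor and
# Golomb-Rice coding of a non-negative residual. The fixed parameter
# k=2 is illustrative; JPEG-LS adapts k per context.

def med_predict(a, b, c):
    """MED predictor: a = left, b = above, c = above-left neighbour.
    Picks min/max of a, b at edges, else the planar estimate a + b - c."""
    if c >= max(a, b):
        return min(a, b)
    if c <= min(a, b):
        return max(a, b)
    return a + b - c

def golomb_rice(n, k):
    """Encode non-negative integer n with Rice parameter k >= 1:
    unary quotient ('1'*q + '0'), then k binary remainder bits."""
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + f"{r:0{k}b}"

# Predict pixel 100 from neighbours left=100, above=90, above-left=90;
# the edge rule selects max(a, b) = 100, so the residual is 0.
pred = med_predict(100, 90, 90)
err = 100 - pred
code = golomb_rice(abs(err), 2)   # residual 0 encodes as "000"
```

A real JPEG-LS coder would also map signed residuals to non-negative integers and run-length encode flat regions; both are omitted here for brevity.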

7 citations

Proceedings ArticleDOI
TL;DR: The efficiency of several predictive techniques (MAP, CALIC, 3D predictors) is compared, the advantages of 2D versus 3D error feedback and context modeling are examined, and the use of wavelet transforms for lossless multispectral compression is discussed.
Abstract: In this paper, we address the problem of lossless and nearly-lossless multispectral compression of remote-sensing data acquired using SPOT satellites. Lossless compression algorithms classically have two stages: transformation of the available data, and coding. The purpose of the first stage is to decorrelate the data in an optimal way. In the second stage, coding is performed by means of an arithmetic coder. In this paper, we discuss two well-known approaches for spatial as well as multispectral compression of SPOT images: (1) the efficiency of several predictive techniques (MAP, CALIC, 3D predictors) is compared, and the advantages of 2D versus 3D error feedback and context modeling are examined; (2) the use of wavelet transforms for lossless multispectral compression is discussed. Then, applications of the above-mentioned methods to quincunx sampling are evaluated. Lastly, some results on how predictive and wavelet techniques behave when nearly-lossless compression is needed are given.
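To illustrate why a 3D predictor can beat a 2D one on correlated spectral bands, here is a minimal sketch; the predictor weights and sample values are invented for illustration and are not the paper's MAP or CALIC predictors.

```python
# Toy comparison of a 2D spatial predictor with a 3D predictor that also
# uses the co-located pixel in the previous spectral band. Weights and
# pixel values are illustrative assumptions, not the paper's predictors.

def predict_2d(left, above):
    """Spatial-only prediction from two causal neighbours."""
    return (left + above) // 2

def predict_3d(left, above, prev_band):
    """Also weight the co-located pixel of the previous band, exploiting
    the strong inter-band correlation of multispectral imagery."""
    return (left + above + 2 * prev_band) // 4

# Current pixel is 104; its spatial neighbours are 100 and 96, and the
# co-located pixel in the (correlated) previous band is 105.
err_2d = abs(104 - predict_2d(100, 96))        # |104 - 98|  = 6
err_3d = abs(104 - predict_3d(100, 96, 105))   # |104 - 101| = 3
```

Smaller residuals mean a more concentrated error distribution, which is exactly what the arithmetic coder in the second stage exploits.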

7 citations

Book ChapterDOI
08 Nov 2006
TL;DR: In this article, the authors proposed a new approach for universal JPEG steganalysis by computing higher-order statistics over Hamming weights and combining them with a Kullback-Leibler distance between the probability density function of these weights and a benchmark one.
Abstract: We present in this paper a new approach for universal JPEG steganalysis and propose studying statistics of the compressed DCT coefficients. This approach is motivated by the Avalanche Criterion of the JPEG lossless compression step. This criterion makes possible the design of detectors whose detection rates are independent of the payload. We design a universal steganalytic scheme using blocks of the JPEG file's binary output stream. We compute higher-order statistics over their Hamming weights and combine them with a Kullback-Leibler distance between the probability density function of these weights and a benchmark one. We evaluate the universality of our detector through its capacity to efficiently detect the use of a new algorithm not used during the training step. To that end, we examine training sets produced by Outguess, F5, and JPhide-and-Seek. The experimental results we obtained show that our scheme is able to detect the use of new algorithms with a high detection rate (≈90%) even at very low embedding rates (<10^-5).
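The detector's core computation, Hamming-weight statistics of the compressed stream compared to a benchmark distribution via a Kullback-Leibler distance, can be sketched as follows. The byte streams and the benchmark here are toy values, and the paper's higher-order statistics and training procedure are omitted.

```python
# Sketch of the detector idea: empirical distribution of per-byte Hamming
# weights over a compressed stream, scored against a benchmark with a
# Kullback-Leibler distance. Streams and benchmark are toy illustrations.
import math
from collections import Counter

def hamming_weight_pdf(data):
    """Empirical distribution of bit counts (0..8) over a byte stream."""
    counts = Counter(bin(b).count("1") for b in data)
    total = len(data)
    return [counts.get(w, 0) / total for w in range(9)]

def kl_distance(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q), smoothed to avoid log(0)."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

cover = bytes([0b10110010, 0b00010001, 0b11110000, 0b01010101])
stego = bytes([0b11111111, 0b11111110, 0b11111111, 0b11111111])
benchmark = hamming_weight_pdf(cover)
score = kl_distance(hamming_weight_pdf(stego), benchmark)
# A large score flags a stream whose weight statistics deviate from the
# benchmark; thresholding such a score would form a simple detector.
```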

7 citations


Network Information
Related Topics (5)
Image segmentation: 79.6K papers, 1.8M citations, 82% related
Feature (computer vision): 128.2K papers, 1.7M citations, 82% related
Feature extraction: 111.8K papers, 2.1M citations, 82% related
Image processing: 229.9K papers, 3.5M citations, 80% related
Convolutional neural network: 74.7K papers, 2M citations, 79% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    21
2022    40
2021    5
2020    2
2019    8
2018    15