Topic

Lossless JPEG

About: Lossless JPEG is a research topic. Over the lifetime, 2415 publications have been published within this topic receiving 51110 citations. The topic is also known as: Lossless JPEG & .jls.


Papers
Proceedings ArticleDOI
02 Nov 2004
TL;DR: This paper presents the technologies that are currently being developed to accommodate the coding of floating point datasets with JPEG 2000 and shows that these enhancements to the JPEG 2000 coding pipeline lead to better compression results than Part 1 encoding where the floating point data had been retyped as integers.
Abstract: JPEG 2000 Part 10 is a new work part of the ISO/IEC JPEG Committee dealing with the extension of JPEG 2000 technologies to three-dimensional data. One of the issues in Part 10 is the ability to encode floating point datasets. Many Part 10 use cases come from the scientific and engineering communities, where floating point data is often produced either from numerical simulations or from remote sensing instruments. This paper presents the technologies currently being developed to accommodate this Part 10 requirement. The coding of floating point datasets with JPEG 2000 requires two changes to the coding pipeline. Firstly, the wavelet transformation stage is optimized to correctly decorrelate data represented with the IEEE 754 floating point standard. Special IEEE 754 values such as infinities and NaNs are signaled beforehand, as they do not correlate well with other floating point values. Secondly, computation of distortion measures on the encoder side is performed in floating point space, rather than in integer space, in order to correctly perform rate allocation. Results show that these enhancements to the JPEG 2000 coding pipeline lead to better compression than Part 1 encoding where the floating point data has been retyped as integers.

35 citations
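The float-coding change described above hinges on making IEEE 754 values safe for an integer-style wavelet decorrelation. The abstract does not give the mapping, so the following is a minimal sketch of one standard building block, assuming a bit reinterpretation that makes unsigned-integer order match float order, with non-finite values flagged for the separate signaling the paper mentions (the function name and exact mapping are illustrative assumptions, not the Part 10 specification):

```python
import numpy as np

def float32_to_ordered_uint32(x):
    """Reinterpret float32 bits as uint32 so integer order matches float order.

    Assumed preprocessing sketch (not taken from the Part 10 text): negative
    floats get all bits inverted; non-negative floats get the sign bit set.
    The result is monotonic in the float value, so an integer-oriented
    wavelet transform can decorrelate it losslessly.
    """
    a = np.ascontiguousarray(x, dtype=np.float32)
    # NaNs and infinities do not order meaningfully; the paper signals such
    # values beforehand, so here they are simply flagged for separate coding.
    special = ~np.isfinite(a)
    bits = a.view(np.uint32)
    ordered = np.where(bits & 0x80000000, ~bits, bits | 0x80000000)
    return ordered, special
```

Decoding would invert the same bit map and splice the flagged special values back in verbatim.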

Journal ArticleDOI
TL;DR: This paper puts forward a new joint lossless-compression and encryption algorithm for images, based on chaotic maps, that keeps all original information intact and passes many security tests, such as the sensitivity, entropy, autocorrelation, and NIST SP800-22 tests.
Abstract: The poor security and the low transmission and storage efficiency of images have become serious concerns. To improve the situation, this paper puts forward a new joint lossless-compression and encryption algorithm for images, based on chaotic maps, that keeps all original information intact. The lossless compression uses SPIHT (Set Partitioning in Hierarchical Trees) encoding on an integer wavelet transform, and several kinds of chaotic maps are applied over multiple encryption rounds during the computation of the wavelet coefficients and the SPIHT coding. Experimental results show that the compressed file is about 50% of the original file size, a relatively good lossless compression ratio. The encryption method also passes many security tests, such as the sensitivity, entropy, autocorrelation, and NIST SP800-22 tests. The algorithm has high application value in the medical field and in national security departments, where image files must retain high quality.

35 citations
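The abstract interleaves several chaotic maps with the integer-wavelet/SPIHT pipeline but does not spell out the construction. As a minimal sketch of the chaotic-encryption idea, assuming a single logistic map driving an XOR keystream over an already-compressed byte stream (the function names, seed, and parameter r are illustrative, not the paper's scheme):

```python
def logistic_keystream(x0: float, n: int, r: float = 3.99) -> bytes:
    """Derive n keystream bytes from the logistic map x -> r*x*(1-x).

    With r near 4 the map is chaotic, so the bytes are highly sensitive to
    the key (x0, r). This is a toy single-map stand-in for the multi-map,
    multi-round scheme the abstract describes.
    """
    x, out = x0, bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256.0) & 0xFF)
    return bytes(out)

def xor_encrypt(spiht_stream: bytes, x0: float) -> bytes:
    """XOR the SPIHT-compressed bitstream with the chaotic keystream."""
    ks = logistic_keystream(x0, len(spiht_stream))
    return bytes(b ^ k for b, k in zip(spiht_stream, ks))
```

Decryption is the same XOR with the same key, and because both SPIHT and the XOR are invertible, the round trip remains lossless.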

Patent
15 Dec 2000
TL;DR: In this patent, the 8×8 Discrete Cosine Transform (DCT) blocks are stored after entropy decoding in a JPEG decoder, or after the Forward Discrete Cosine Transform (FDCT) in a JPEG encoder, to serve as an intermediate format between transform processes.
Abstract: JPEG (Joint Photographic Experts Group) images are encoded and decoded as fast as possible for a variety of disparate applications. A novel structure stores the 8×8 Discrete Cosine Transform (DCT) blocks after entropy decoding in a JPEG decoder, or after the Forward Discrete Cosine Transform (FDCT) in a JPEG encoder, to use as an intermediate format between transform processes. The format was chosen to speed up the entropy decode and encode processes and is based on the information needed for JPEG Huffman entropy coding, but it lends itself to fast execution of other DCT-based transforms, including arithmetic entropy coding.

35 citations
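The patent's intermediate format is keyed to what Huffman entropy coding actually consumes from a DCT block. The following is a minimal sketch of that idea, assuming the format reduces each block to its DC value plus (zero-run, level) pairs in zigzag order; the actual patented layout is not reproduced here:

```python
import numpy as np

# Standard JPEG zigzag order over an 8x8 block: walk the anti-diagonals,
# alternating direction (odd diagonals top-to-bottom, even ones bottom-to-top).
ZIGZAG = sorted(((r, c) for r in range(8) for c in range(8)),
                key=lambda rc: (rc[0] + rc[1],
                                rc[0] if (rc[0] + rc[1]) % 2 else rc[1]))

def block_to_intermediate(block: np.ndarray):
    """Reduce an 8x8 quantized DCT block to (DC, [(zero_run, level), ...]).

    Hypothetical stand-in for the patent's format: run/level pairs are
    exactly what JPEG Huffman coding consumes, so keeping blocks in this
    form avoids re-scanning coefficients between transform processes.
    """
    coeffs = [int(block[r, c]) for r, c in ZIGZAG]
    runs, zeros = [], 0
    for level in coeffs[1:]:            # AC coefficients only
        if level == 0:
            zeros += 1                  # trailing zeros end as an implicit EOB
        else:
            runs.append((zeros, level))
            zeros = 0
    return coeffs[0], runs              # DC coefficient, AC run/level pairs
```

Because the same run/level pairs can feed either a Huffman or an arithmetic entropy coder, a transcoder can switch coders without returning to the spatial domain, which matches the speed argument in the abstract.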

Proceedings ArticleDOI
01 Aug 2017
TL;DR: Experimental results show that training on a small set of the most powerful attacks enables good detection in the presence of a much wider variety of attacks and processing.
Abstract: In this paper we present an adversary-aware double JPEG detector that is capable of detecting two JPEG compression steps even in the presence of heterogeneous processing and counter-forensic (C-F) attacks. The detector is based on an SVM classifier fed with a large number of features and trained to recognise the traces left by double JPEG compression in the presence of attacks. Since it is not possible to train the SVM on all possible kinds of processing and C-F attacks, a selected set of images, manipulated with a limited number of attacks, is added to the training set. The processing tools used for training are chosen among those that proved most effective at disabling double JPEG detection. Experimental results show that training on this small set of the most powerful attacks enables good detection in the presence of a much wider variety of attacks and processing. Good performance is retained over a wide range of compression quality factors.

35 citations
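As a minimal sketch of the adversary-aware training recipe, assuming per-image feature vectors have already been extracted and scikit-learn is used for the SVM (the file names, feature choice, and hyperparameters below are assumptions, not the paper's settings):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical precomputed feature matrices, one row per image.
X_single   = np.load("feat_single_jpeg.npy")      # single compression -> class 0
X_double   = np.load("feat_double_jpeg.npy")      # double compression -> class 1
X_attacked = np.load("feat_double_attacked.npy")  # double + strongest C-F attacks -> class 1

# Adversary-aware training set: attacked positives are folded in so the
# decision boundary already accounts for counter-forensic processing.
X = np.vstack([X_single, X_double, X_attacked])
y = np.concatenate([np.zeros(len(X_single)),
                    np.ones(len(X_double) + len(X_attacked))])

detector = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
detector.fit(X, y)
```

The paper's finding is that augmenting only with the strongest attacks is enough for the learned boundary to generalize to a much wider variety of attacks and processing.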


Network Information
Related Topics (5)
Image segmentation: 79.6K papers, 1.8M citations (82% related)
Feature (computer vision): 128.2K papers, 1.7M citations (82% related)
Feature extraction: 111.8K papers, 2.1M citations (82% related)
Image processing: 229.9K papers, 3.5M citations (80% related)
Convolutional neural network: 74.7K papers, 2M citations (79% related)
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    21
2022    40
2021    5
2020    2
2019    8
2018    15