Topic

Lossless JPEG

About: Lossless JPEG is a research topic. Over its lifetime, 2,415 publications have been published within this topic, receiving 51,110 citations. The topic is also known as: Lossless JPEG and .jls.


Papers
Proceedings ArticleDOI
Yi Zhang, Xiangyang Luo, Chunfang Yang, Dengpan Ye, Fenlin Liu
24 Aug 2015
TL;DR: The proposed JPEG-compression-resistant adaptive steganography algorithm not only achieves a high correct rate of extracted messages after JPEG compression, rising from about 60% to nearly 100% compared with J-UNIWARD steganography under a JPEG quality factor of 75, but also offers strong resistance to detection.
Abstract: Current typical adaptive steganography algorithms cannot extract the embedded secret messages correctly after compression. To solve this problem, a JPEG-compression-resistant adaptive steganography algorithm is proposed. Utilizing the relationship between DCT coefficients, the domain for message embedding is determined. The modification magnitude of different DCT coefficients is determined according to the quality factor of the JPEG compression. To ensure completely correct extraction of the embedded messages after JPEG compression, RS codes are used to encode the messages to be embedded. In addition, based on the energy function in PQe steganography and the distortion function in J-UNIWARD steganography, the distortion value of each DCT coefficient is calculated. With its help, STCs are used to embed the encoded messages into the DCT coefficients with smaller distortion values. Experimental results under different JPEG quality factors and different payloads demonstrate that the proposed algorithm not only achieves a high correct rate of extracted messages after JPEG compression, rising from about 60% to nearly 100% compared with J-UNIWARD steganography under a JPEG quality factor of 75, but also offers strong resistance to detection.
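The robustness idea sketched in the abstract, choosing the modification magnitude of a DCT coefficient from the JPEG quantization step so that the embedded bit survives recompression, can be illustrated with a short sketch. This is only an illustration of that general principle under common textbook choices (parity of the quantized level as the embedding rule), not the authors' algorithm; the RS coding and STC-based distortion minimization described above are omitted.

```python
# A minimal sketch, assuming the standard IJG quality-factor scaling of the
# baseline JPEG luminance quantization table. It embeds one bit into a DCT
# coefficient by moving it to the nearest quantization level whose parity
# matches the bit, so the bit survives requantization with the same table.
# This is NOT the algorithm of the paper above: its RS coding and STC-based
# distortion minimization are omitted.
import numpy as np

# Baseline JPEG luminance quantization table (Annex K of the JPEG standard).
BASE_QTABLE = np.array([
    [16, 11, 10, 16,  24,  40,  51,  61],
    [12, 12, 14, 19,  26,  58,  60,  55],
    [14, 13, 16, 24,  40,  57,  69,  56],
    [14, 17, 22, 29,  51,  87,  80,  62],
    [18, 22, 37, 56,  68, 109, 103,  77],
    [24, 35, 55, 64,  81, 104, 113,  92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103,  99],
], dtype=np.float64)

def quant_table(quality: int) -> np.ndarray:
    """Scale the base table for a JPEG quality factor (standard IJG rule)."""
    scale = 5000 / quality if quality < 50 else 200 - 2 * quality
    return np.clip(np.floor((BASE_QTABLE * scale + 50) / 100), 1, 255)

def embed_bit(coeff: float, step: float, bit: int) -> float:
    """Move a DCT coefficient to the nearest quantized level with parity `bit`."""
    level = int(round(coeff / step))
    if level % 2 != bit:
        level += 1 if coeff / step >= level else -1  # nearest level of right parity
    return level * step

def extract_bit(coeff: float, step: float) -> int:
    """Read the bit back as the parity of the requantized level."""
    return int(round(coeff / step)) % 2

# Example: embed one bit into a mid-frequency coefficient for quality factor 75.
step = quant_table(75)[2, 3]          # quantization step of that frequency
stego = embed_bit(coeff=37.4, step=step, bit=1)
assert extract_bit(stego, step) == 1
```

Because the stego coefficient sits exactly on a level of the quality-75 quantization grid, requantizing with the same table maps it back to the same level, which is what keeps the extracted bit stable.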

37 citations

Proceedings ArticleDOI
Yuriy Reznik
17 May 2004
TL;DR: Describes the two alternative schemes for encoding the prediction residual adopted in the MPEG-4 ALS (Audio Lossless Coding) standard and provides both analytical and experimental analysis of their performance.
Abstract: We describe two alternative schemes for encoding the prediction residual adopted in the MPEG-4 ALS (Audio Lossless Coding) standard for lossless audio coding. We explain the choices of algorithms used in their design and provide both analytical and experimental analysis of their performance.
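For context, the simpler of the two residual-coding schemes adopted in MPEG-4 ALS is Golomb-Rice coding (the second, BGMC, is a Gilbert-Moore block coder and is not shown here). The sketch below is a generic Rice coder for prediction residuals; the sign mapping, bit layout, and parameter-selection heuristic are illustrative assumptions, not the normative ALS bitstream syntax.

```python
# A minimal sketch of Golomb-Rice coding of prediction residuals (the simpler
# of the two MPEG-4 ALS residual-coding schemes). Sign mapping, bit layout and
# the parameter heuristic are illustrative, not the normative ALS syntax.
import math

def signed_to_unsigned(r: int) -> int:
    """Interleave signs: 0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ..."""
    return 2 * r if r >= 0 else -2 * r - 1

def rice_encode(value: int, k: int) -> str:
    """Rice code: unary-coded quotient, then the k low-order bits of the remainder."""
    q = value >> k
    r = value & ((1 << k) - 1)
    return "1" * q + "0" + (format(r, f"0{k}b") if k else "")

def choose_k(residuals) -> int:
    """Pick a Rice parameter near log2 of the mean residual magnitude (a common heuristic)."""
    mean = sum(abs(r) for r in residuals) / max(len(residuals), 1)
    return max(0, math.ceil(math.log2(mean + 1)))

residuals = [0, -1, 3, 2, -4, 1, 0, -2]   # e.g. the output of a linear predictor
k = choose_k(residuals)
bits = "".join(rice_encode(signed_to_unsigned(r), k) for r in residuals)
print(f"k = {k}, bitstream = {bits}")
```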

37 citations

Patent
28 Mar 2005
TL;DR: A method of JPEG compression of an image frame divided into a plurality of non-overlapping, tiled 8×8 pixel blocks X_i, achieving the effect of per-block variable quantization while producing a fully compliant JPEG Part 1 file.
Abstract: A method of JPEG compression of an image frame divided into a plurality of non-overlapping, tiled 8×8 pixel blocks X_i. A global quantization matrix Q is determined either by selecting a standard JPEG quantization table or by selecting a quantization table such that the magnitude of each quantization matrix coefficient Q[m,n] is inversely proportional to the aggregate visual importance in the image of the corresponding DCT basis vector. Next, a linear scaling factor S_i is selected for each block, bounded by user-selected values S_min and S_max. The transform coefficients Y_i, obtained from a discrete cosine transform of X_i, are quantized with the global table S_min·Q while emulating the effects of quantization with the local table S_i·Q, and the quantized coefficients T_i[m,n] and the global quantization table S_min·Q are entropy encoded, where S_min is a user-selected minimum scaling factor, to create a JPEG Part 1 image file. The algorithm is unique in that it achieves the effect of variable quantization while still producing a fully compliant JPEG Part 1 file.
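A minimal sketch of the quantization step described above, under the assumption that "emulating" the local table means snapping each DCT block to the grid of S_i·Q and then re-expressing that result as integer levels of the single global table S_min·Q written to the file; the patent's selection of scaling factors and the entropy-coding stage are not reproduced.

```python
# A minimal sketch of the emulation step: quantize each block on the grid of
# its local table S_i * Q, then express the result as integer levels of the
# single global table S_min * Q that is actually written to the JPEG file.
import numpy as np

def emulate_variable_quantization(Y_i: np.ndarray, Q: np.ndarray,
                                  S_i: float, S_min: float) -> np.ndarray:
    """Return quantized levels T_i[m, n] relative to the global table S_min * Q."""
    local = S_i * Q                        # per-block (emulated) quantization table
    global_table = S_min * Q               # table stored in the JPEG Part 1 file
    recon = np.round(Y_i / local) * local  # block as the local table would reconstruct it
    return np.round(recon / global_table).astype(int)

# A standard decoder only sees the global table, so it dequantizes as
# Y_hat = T_i * (S_min * Q), reproducing the coarser local quantization while
# the bitstream stays a fully compliant single-table JPEG Part 1 file.
```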

36 citations

Journal ArticleDOI
01 Jan 2016 - Optik
TL;DR: A block-based lossless image compression algorithm using the Hadamard transform and Huffman encoding; the algorithm is simple, has low complexity, and yields better compression ratios than existing lossless compression algorithms such as JPEG 2000.
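A small sketch of the transform stage such a scheme relies on: an integer 2D Hadamard transform of a block, which maps integers to integers and is exactly invertible, so no information is lost before the Huffman stage. The 8×8 block size is an assumption for illustration, and the paper's block handling and Huffman coder are not reproduced here.

```python
# A minimal sketch of an integer 2D Hadamard transform on an 8x8 block. With
# integer inputs the forward transform is integer-valued and the inverse
# divides exactly by 64, so the round trip is lossless; the coefficients could
# then be passed to a Huffman (entropy) coder.
import numpy as np

def hadamard(n: int) -> np.ndarray:
    """Unnormalized (Sylvester) Hadamard matrix of size n, n a power of two."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

H8 = hadamard(8)

def forward(block: np.ndarray) -> np.ndarray:
    """2D Hadamard transform of an integer block (integer-valued result)."""
    return H8 @ block @ H8

def inverse(coeffs: np.ndarray) -> np.ndarray:
    """Exact inverse: H8 @ H8 == 8 * I, so the 2D round trip scales by 64."""
    return (H8 @ coeffs @ H8) // 64

block = np.random.randint(0, 256, size=(8, 8))
assert np.array_equal(inverse(forward(block)), block)   # lossless round trip
```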

36 citations


Network Information
Related Topics (5)
Image segmentation: 79.6K papers, 1.8M citations, 82% related
Feature (computer vision): 128.2K papers, 1.7M citations, 82% related
Feature extraction: 111.8K papers, 2.1M citations, 82% related
Image processing: 229.9K papers, 3.5M citations, 80% related
Convolutional neural network: 74.7K papers, 2M citations, 79% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    21
2022    40
2021    5
2020    2
2019    8
2018    15