Topic
Lossless JPEG
About: Lossless JPEG is a research topic. Over its lifetime, 2415 publications have been published within this topic, receiving 51110 citations. The topic is also known as: Lossless JPEG and .jls.
Papers published on a yearly basis
Papers
•
01 Jan 2007
TL;DR: A reversible watermarking scheme in the JPEG compression domain is proposed to authenticate content without quality loss, because the original content is preserved when the watermark is embedded.
Abstract: In this paper, we propose a reversible watermarking scheme in the JPEG compression domain. Reversible watermarking is useful for authenticating content without quality loss because it preserves the original content when the watermark is embedded. On the internet, to save storage space and improve communication efficiency, digital images are usually compressed with JPEG or GIF. It is therefore necessary to develop a reversible watermarking scheme for the JPEG compression domain. Lossless compression is used when the watermark is embedded, and the original image is recovered during the watermark extraction process. The test results show that PSNRs range from 38 dB to 42 dB and the payload from 2.5 Kbits to 3.4 Kbits when the QF is 75. When the QF of the Lena image is varied from 10 to 99, the PSNR is directly proportional to the QF and the payload is around .
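The core idea the abstract relies on — freeing up embedding capacity by losslessly compressing part of the cover data, so the original can be restored exactly — can be illustrated with a minimal sketch. This is not the authors' JPEG-domain method: it works on a plain pixel list, uses zlib as the lossless coder, and passes the watermark length and compressed-plane length as side information purely for simplicity.

```python
import zlib

def embed(pixels, wm_bits):
    """Reversible embedding sketch: losslessly compress the LSB plane,
    then overwrite the LSBs with watermark bits + the compressed plane."""
    lsb = bytes(p & 1 for p in pixels)
    comp = zlib.compress(lsb, 9)
    comp_bits = [(byte >> i) & 1 for byte in comp for i in range(8)]
    payload = wm_bits + comp_bits
    if len(payload) > len(pixels):
        raise ValueError("not enough capacity after compression")
    payload += [0] * (len(pixels) - len(payload))      # zero padding
    marked = [(p & ~1) | b for p, b in zip(pixels, payload)]
    return marked, len(comp)          # compressed length as side info

def extract(marked, n_wm, n_comp):
    """Recover the watermark AND the exact original pixels."""
    bits = [p & 1 for p in marked]
    wm = bits[:n_wm]
    comp_bits = bits[n_wm:n_wm + 8 * n_comp]
    comp = bytes(sum(comp_bits[i + j] << j for j in range(8))
                 for i in range(0, len(comp_bits), 8))
    lsb = zlib.decompress(comp)       # original LSB plane, bit-exact
    original = [(p & ~1) | b for p, b in zip(marked, lsb)]
    return wm, original
```

Because the LSB plane is restored bit-exactly from its compressed copy, extraction is fully reversible — the defining property the abstract describes.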
2 citations
•
TL;DR: This paper introduces the idea of using an art form called the ambigram to compress text, which is then further compressed by Huffman coding, achieving consistent compression efficiency.
Abstract: The new era of networking calls for improved and effective methods of channel utilization. There are many texts for which lossless data recovery is vitally important because of the information they hold. A lossless compression algorithm that is independent of the nature and pattern of the text is therefore a top concern today. The efficiency of the algorithms used today varies greatly with the nature of the text. This paper introduces the idea of using an art form called the ambigram to compress text, which is then further compressed by Huffman coding, achieving consistent compression efficiency. Keywords: Ambigrams, Huffman coding, Lossless compression, Steganography, Embedded algorithms, Encryption.
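The Huffman stage mentioned in the abstract is standard; a minimal sketch (frequencies taken from the text itself, prefix-free codes built with a heap) could look like the following. The ambigram transform itself is the paper's contribution and is not reproduced here.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a prefix-free Huffman code table for the symbols in text."""
    freq = Counter(text)
    if len(freq) == 1:                       # degenerate one-symbol input
        return {next(iter(freq)): "0"}
    heap = [(n, i, {s: ""}) for i, (s, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)                     # keeps heap tuples comparable
    while len(heap) > 1:
        n1, _, c1 = heapq.heappop(heap)      # two least-frequent subtrees
        n2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (n1 + n2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def encode(text, codes):
    return "".join(codes[s] for s in text)

def decode(bits, codes):
    inv = {v: k for k, v in codes.items()}
    out, cur = [], ""
    for b in bits:                           # prefix-freeness makes this
        cur += b                             # greedy scan unambiguous
        if cur in inv:
            out.append(inv[cur]); cur = ""
    return "".join(out)
```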
2 citations
••
01 Dec 2010
TL;DR: Experimental results indicate that the proposed GAP scheme outperforms the existing GAP prediction for all the fingerprint images tested, while the prediction algorithm is made more than four times faster with the help of a parallel implementation.
Abstract: In this paper we investigate the prediction scheme of Context-Based Adaptive Lossless Image Coding (CALIC), the standard for lossless/near-lossless compression, applied to continuous-tone fingerprint images. We show that it is not sufficient for the Gradient Adjusted Predictor (GAP) to consider prediction in a single direction across a fingerprint image as a whole. As a result, we propose an additional GAP scheme that achieves better speed and better prediction accuracy, and hence provides potential for further improvements in lossless image compression. Experimental results indicate that the proposed scheme outperforms the existing GAP prediction for all the fingerprint images tested, while the prediction algorithm is made more than four times faster with the help of a parallel implementation.
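CALIC's standard single-direction GAP, which the paper extends, is well documented; a sketch of the published predictor (the usual 80/32/8 thresholds) for one pixel could look like this. Out-of-image neighbours are treated as zero here purely for brevity — real coders use replication or context defaults.

```python
def gap_predict(img, r, c):
    """Gradient Adjusted Prediction (CALIC) for pixel (r, c)."""
    def px(rr, cc):                      # simplification: borders read as 0
        if rr < 0 or cc < 0 or cc >= len(img[0]):
            return 0
        return img[rr][cc]
    W, N = px(r, c - 1), px(r - 1, c)
    NW, NE = px(r - 1, c - 1), px(r - 1, c + 1)
    WW, NN, NNE = px(r, c - 2), px(r - 2, c), px(r - 2, c + 1)
    dh = abs(W - WW) + abs(N - NW) + abs(N - NE)    # horizontal gradient
    dv = abs(W - NW) + abs(N - NN) + abs(NE - NNE)  # vertical gradient
    if dv - dh > 80:                     # sharp horizontal edge: use W
        return W
    if dh - dv > 80:                     # sharp vertical edge: use N
        return N
    pred = (W + N) / 2 + (NE - NW) / 4   # smooth-region estimate
    if dv - dh > 32:
        pred = (pred + W) / 2
    elif dv - dh > 8:
        pred = (3 * pred + W) / 4
    elif dh - dv > 32:
        pred = (pred + N) / 2
    elif dh - dv > 8:
        pred = (3 * pred + N) / 4
    return int(pred)
```

The paper's point is that applying this fixed causal template in a single scan direction is suboptimal for the strongly oriented ridge patterns of fingerprints.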
2 citations
•
20 May 2016
TL;DR: In this article, lossless data compression and decompression devices and lossless compression and decompression methods are provided. But the authors do not discuss the use of entropy coding on the corresponding codewords to obtain a compressed data stream.
Abstract: Lossless data compression and decompression devices and lossless data compression and decompression methods are provided. The lossless data compression device includes a processor and an entropy coding circuit. The processor is arranged to determine whether a raw data stream matches data items in a dictionary when a compression command for the raw data stream is received, and to output corresponding codewords according to the determination result. The entropy coding circuit is arranged to perform entropy coding on the corresponding codewords to obtain a compressed data stream.
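The two-stage pipeline the claim describes — dictionary matching that emits codewords, followed by an entropy-coding stage — can be sketched in software with LZW as a stand-in dictionary stage (the patent does not specify LZW; this is an illustrative assumption). The entropy coder that would then pack the integer codewords is omitted.

```python
def lzw_compress(data: bytes):
    """Dictionary stage: match the raw stream against a growing
    dictionary and emit integer codewords."""
    dictionary = {bytes([i]): i for i in range(256)}
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:             # longest-match search
            w = wc
        else:
            out.append(dictionary[w])    # emit codeword for the match
            dictionary[wc] = len(dictionary)
            w = bytes([byte])
    if w:
        out.append(dictionary[w])
    return out

def lzw_decompress(codes):
    """Inverse dictionary stage: rebuild the dictionary on the fly."""
    dictionary = {i: bytes([i]) for i in range(256)}
    w = dictionary[codes[0]]
    out = [w]
    for k in codes[1:]:
        entry = dictionary[k] if k in dictionary else w + w[:1]
        out.append(entry)
        dictionary[len(dictionary)] = w + entry[:1]
        w = entry
    return b"".join(out)
```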
2 citations
••
26 May 2013
TL;DR: The results show that the proposed method is competitive with the well-known methods for lossless compression, in terms of compression ratio and computational efficiency.
Abstract: The basis pursuit algorithm is one of the most popular methods of sparse coding. Its goal is to represent a signal using as few coefficients as possible, which makes it suitable for acoustic signal compression. This paper presents a lossless coding/decoding method based on the basis pursuit algorithm. In this method, wavelet packet bases are used to compose the dictionary because of their natural sparsity. Experimental results are obtained by comparing the proposed method with four popular lossless coding/decoding methods on various types of acoustic signals. The results show that the proposed method is competitive with the well-known methods for lossless compression in terms of compression ratio and computational efficiency.
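Basis pursuit proper solves an ℓ1-minimization program and needs a convex solver; as a dependency-free stand-in, greedy matching pursuit over a small orthonormal Haar dictionary (a toy substitute for the paper's wavelet-packet dictionary) illustrates the sparse-coding step — representing a signal with few nonzero coefficients.

```python
import math

def haar_basis(n):
    """Orthonormal Haar basis for length n (n a power of two)."""
    atoms = [[1 / math.sqrt(n)] * n]          # scaling (DC) atom
    L = n
    while L >= 2:                             # wavelets at each scale
        c = 1 / math.sqrt(L)
        for s in range(0, n, L):
            atom = [0.0] * n
            for i in range(s, s + L // 2):
                atom[i] = c
            for i in range(s + L // 2, s + L):
                atom[i] = -c
            atoms.append(atom)
        L //= 2
    return atoms

def matching_pursuit(signal, atoms, n_terms):
    """Greedily pick the atom most correlated with the residual."""
    residual = list(signal)
    coeffs = {}
    for _ in range(n_terms):
        best, best_dot = None, 0.0
        for idx, a in enumerate(atoms):
            d = sum(r * x for r, x in zip(residual, a))
            if abs(d) > abs(best_dot):
                best, best_dot = idx, d
        if best is None:                      # residual already zero
            break
        coeffs[best] = coeffs.get(best, 0.0) + best_dot
        residual = [r - best_dot * x
                    for r, x in zip(residual, atoms[best])]
    return coeffs, residual
```

With an orthonormal dictionary the greedy picks are exact projections, so the residual vanishes once every significant atom has been selected; for lossless coding, any remaining residual would be coded alongside the coefficients.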
2 citations