
Lossless JPEG

About: Lossless JPEG is a research topic. Over its lifetime, 2,415 publications have been published within this topic, receiving 51,110 citations. The topic is also known as: Lossless JPEG and .jls.


Papers
01 Dec 2010
TL;DR: An experimental comparison of several lossless data compression algorithms is presented, and the algorithms that perform well for text data are identified.
Abstract: Data compression is a common requirement for most computerized applications. There are a number of data compression algorithms dedicated to compressing different data formats, and even for a single data type there are several algorithms that take different approaches. This paper examines lossless data compression algorithms and compares their performance. A set of selected algorithms is examined and implemented to evaluate their performance in compressing text data. An experimental comparison of these lossless data compression algorithms is presented, and the paper concludes by stating which algorithm performs well for text data.

120 citations
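
A measurement in the spirit of the comparison above can be reproduced with the general-purpose codecs in the Python standard library. This is a minimal sketch, not the paper's experiment: corpus.txt is a placeholder for any plain-text corpus, and only zlib (DEFLATE), bz2, and LZMA are compared by compressed size, ratio, and encoding time.

```python
import bz2
import lzma
import time
import zlib

# Hypothetical input file; any reasonably large plain-text corpus will do.
with open("corpus.txt", "rb") as f:
    data = f.read()

codecs = {
    "zlib (DEFLATE)": lambda d: zlib.compress(d, level=9),
    "bz2 (BWT)":      lambda d: bz2.compress(d, compresslevel=9),
    "lzma (LZMA2)":   lambda d: lzma.compress(d),
}

print(f"original size: {len(data)} bytes")
for name, compress in codecs.items():
    start = time.perf_counter()
    packed = compress(data)
    elapsed = time.perf_counter() - start
    ratio = len(data) / len(packed)
    print(f"{name:15s}  {len(packed):10d} bytes  ratio {ratio:5.2f}  {elapsed:6.3f} s")
```

On typical English text the three codecs differ noticeably in both ratio and speed, which is exactly the trade-off such a comparison is meant to surface.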

Proceedings ArticleDOI
28 Dec 2000
TL;DR: Evaluating JPEG 2000 versus JPEG-LS and MPEG-4 VTC, as well as the older but widely used JPEG, shows that the choice of the “best” standard depends strongly on the application at hand.
Abstract: JPEG 2000, the new ISO/ITU-T standard for still image coding, is about to be finished. Other new standards have been recently introduced, namely JPEG-LS and MPEG-4 VTC. This paper compares the set of features offered by JPEG 2000, and how well they are fulfilled, versus JPEG-LS and MPEG-4 VTC, as well as the older but widely used JPEG and more recent PNG. The study concentrates on compression efficiency and functionality set, while addressing other aspects such as complexity. Lossless compression efficiency as well as the fixed and progressive lossy rate-distortion behaviors are evaluated. Robustness to transmission errors, Region of Interest coding and complexity are also discussed. The principles behind each algorithm are briefly described. The results show that the choice of the "best" standard depends strongly on the application at hand, but that JPEG 2000 supports the widest set of features among the evaluated standards, while providing superior rate-distortion performance in most cases.

119 citations
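
A small rate-distortion measurement of the kind evaluated above can be scripted with Pillow and NumPy (both assumed to be installed; test.png is a placeholder image). The sketch sweeps the baseline JPEG quality setting and reports bits per pixel and PSNR, with the lossless PNG size as a reference point; JPEG 2000 and JPEG-LS are omitted because their availability depends on optional codec support.

```python
import io

import numpy as np
from PIL import Image

# Hypothetical test image; converted to 8-bit grayscale for a simple PSNR.
img = Image.open("test.png").convert("L")
ref = np.asarray(img, dtype=np.float64)
pixels = ref.size

def encode(image, **save_kwargs):
    """Encode the image in memory and return the compressed bytes."""
    buf = io.BytesIO()
    image.save(buf, **save_kwargs)
    return buf.getvalue()

# Lossless reference point.
png = encode(img, format="PNG", optimize=True)
print(f"PNG (lossless): {8 * len(png) / pixels:.3f} bpp")

# Lossy JPEG rate-distortion sweep.
for quality in (10, 25, 50, 75, 90, 95):
    jpg = encode(img, format="JPEG", quality=quality)
    decoded = np.asarray(Image.open(io.BytesIO(jpg)).convert("L"), dtype=np.float64)
    mse = np.mean((ref - decoded) ** 2)
    psnr = 10 * np.log10(255.0 ** 2 / mse)
    print(f"JPEG q={quality:3d}: {8 * len(jpg) / pixels:.3f} bpp, PSNR {psnr:.2f} dB")
```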

Journal ArticleDOI
TL;DR: A spatial subband image-compression method well suited to the local nature of the CNNUM is presented; it performs especially well on radiographic images (mammograms) and is suggested for use as part of a cellular neural/nonlinear (CNN)-based mammogram-analysis system.
Abstract: This paper demonstrates how the cellular neural-network universal machine (CNNUM) architecture can be applied to image compression. We present a spatial subband image-compression method well suited to the local nature of the CNNUM. In the case of lossless image compression, it outperforms the JPEG image-compression standard in both compression efficiency and speed. It performs especially well on radiographic images (mammograms); its use is therefore suggested as part of a cellular neural/nonlinear (CNN)-based mammogram-analysis system. This paper also gives a CNN-based method for fast implementation of the moving pictures experts group (MPEG) and joint photographic experts group (JPEG) moving and still image-compression standards.

118 citations
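
The spatial subband idea behind the method above can be illustrated with one level of a separable Haar decomposition. This NumPy sketch is a generic illustration rather than the CNNUM implementation, and it assumes a grayscale image with even dimensions; on smooth regions most of the energy lands in the LL band, which is what makes the detail bands cheap to code.

```python
import numpy as np

def haar_subbands(image):
    """One level of a separable Haar decomposition into LL, LH, HL, HH."""
    x = image.astype(np.float64)
    # Split columns into low-pass (averages) and high-pass (differences).
    lo = (x[:, 0::2] + x[:, 1::2]) / 2.0
    hi = (x[:, 0::2] - x[:, 1::2]) / 2.0
    # Repeat the split along rows for each half.
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return ll, lh, hl, hh

# Synthetic smooth ramp image, so most of the energy stays in the LL subband.
img = np.add.outer(np.arange(128.0), np.arange(128.0))
for name, band in zip(("LL", "LH", "HL", "HH"), haar_subbands(img)):
    print(f"{name}: shape {band.shape}, mean |coeff| = {np.abs(band).mean():.2f}")
```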

01 Jan 2010
TL;DR: A lossless method of image compression and decompression using a simple coding technique called Huffman coding is proposed; it is simple to implement and requires little memory.
Abstract: The need for an efficient image-compression technique is ever increasing, because raw images require large amounts of disk space, which is a significant disadvantage during transmission and storage. Even though many compression techniques already exist, a better technique that is faster, memory-efficient, and simple would suit the requirements of the user. In this paper we propose a lossless method of image compression and decompression using a simple coding technique called Huffman coding. This technique is simple to implement and requires little memory. A software algorithm has been developed and implemented to compress and decompress a given image using Huffman coding on the MATLAB platform.

118 citations
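
As a generic illustration of Huffman coding applied to pixel values (not the paper's MATLAB implementation), the sketch below builds a prefix code from a pixel-value histogram using the standard heapq module and compares the resulting average code length against the 8 bits per pixel of the raw image.

```python
import heapq
from collections import Counter

import numpy as np

def huffman_code(frequencies):
    """Return a {symbol: bitstring} prefix code for the given frequency table."""
    heap = [[freq, i, [sym, ""]] for i, (sym, freq) in enumerate(frequencies.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate single-symbol input
        return {heap[0][2][0]: "0"}
    count = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for pair in lo[2:]:                  # prefix every code in the low subtree with 0
            pair[1] = "0" + pair[1]
        for pair in hi[2:]:                  # and every code in the high subtree with 1
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0], count, *lo[2:], *hi[2:]])
        count += 1
    return {sym: code for sym, code in heap[0][2:]}

# Synthetic 8-bit "image" with a skewed histogram, so variable-length coding pays off.
rng = np.random.default_rng(0)
weights = np.arange(256, 0, -1, dtype=np.float64)
image = rng.choice(256, size=(64, 64), p=weights / weights.sum())
freqs = Counter(image.ravel().tolist())

code = huffman_code(freqs)
coded_bits = sum(freqs[sym] * len(code[sym]) for sym in freqs)
print(f"raw: 8.00 bits/pixel, Huffman: {coded_bits / image.size:.2f} bits/pixel")
```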

Journal ArticleDOI
TL;DR: A simple parallel algorithm for decoding a Huffman-encoded file is presented, exploiting the tendency of Huffman codes to resynchronize quickly, i.e., to recover from possible decoding errors, in most cases.
Abstract: A simple parallel algorithm for decoding a Huffman-encoded file is presented, exploiting the tendency of Huffman codes to resynchronize quickly, i.e., to recover from possible decoding errors, in most cases. The average number of bits that have to be processed until synchronization is analyzed and shows good agreement with empirical data. As Huffman coding is also a part of the JPEG image-compression standard, the suggested algorithm is then adapted to the parallel decoding of JPEG files.

116 citations
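
The resynchronization property exploited above can be observed directly: decode the same bitstream once from the true start and once with the first bit dropped, and find the first bit position where the two decoders' codeword boundaries coincide again; from that point on they emit identical symbols. The sketch below uses a fixed toy prefix code and illustrates only the property, not the paper's parallel algorithm.

```python
import random

# Toy prefix code (Huffman-shaped) over four symbols.
CODE = {"a": "0", "b": "10", "c": "110", "d": "111"}
DECODE = {bits: sym for sym, bits in CODE.items()}

def decode_boundaries(bitstring, start):
    """Decode from bit offset `start`; return the bit positions where codewords end."""
    boundaries, current = [], ""
    for pos in range(start, len(bitstring)):
        current += bitstring[pos]
        if current in DECODE:
            boundaries.append(pos + 1)   # a codeword boundary sits after this bit
            current = ""
    return boundaries

random.seed(1)
message = random.choices("abcd", weights=[8, 4, 2, 2], k=200)
bits = "".join(CODE[s] for s in message)

true_bounds = set(decode_boundaries(bits, 0))
shifted_bounds = decode_boundaries(bits, 1)   # simulate starting one bit too late

# First boundary where the mis-aligned decoder falls back in step with the true one;
# from that point on, both decoders emit identical symbols.
sync_at = next((b for b in shifted_bounds if b in true_bounds), None)
print(f"stream length: {len(bits)} bits, resynchronized at bit {sync_at}")
```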


Network Information
Related Topics (5)
Image segmentation: 79.6K papers, 1.8M citations, 82% related
Feature (computer vision): 128.2K papers, 1.7M citations, 82% related
Feature extraction: 111.8K papers, 2.1M citations, 82% related
Image processing: 229.9K papers, 3.5M citations, 80% related
Convolutional neural network: 74.7K papers, 2M citations, 79% related
Performance Metrics
No. of papers in the topic in previous years:
Year    Papers
2023    21
2022    40
2021    5
2020    2
2019    8
2018    15