Topic

Lossless JPEG

About: Lossless JPEG is a research topic. Over its lifetime, 2,415 publications have been published within this topic, receiving 51,110 citations. The topic is also known as .jls.


Papers
Journal Article
TL;DR: A low-cost solution, backed up by extensive experiments, is presented by introducing different levels of information loss and by exploiting the human visual perception of image quality.
Abstract: Further application of JPEG-LS to near-lossless image compression reveals a balance problem when both compression efficiency and image quality are required to be high. A low-cost solution, backed by extensive experiments, is presented by introducing different levels of information loss and by exploiting human visual perception of image quality.

4 citations
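
The entry above builds on the near-lossless mode of JPEG-LS, which bounds the per-pixel reconstruction error by quantizing prediction residuals against a NEAR parameter. The Python sketch below illustrates that underlying mechanism with the standard MED predictor; it is a minimal illustration of near-lossless residual quantization, not the paper's perception-guided choice of loss levels, and the function names are my own.

```python
import numpy as np

def med_predict(a, b, c):
    # Median edge detector (MED) predictor used by JPEG-LS:
    # a = left, b = above, c = upper-left neighbour of the current pixel.
    if c >= max(a, b):
        return min(a, b)
    if c <= min(a, b):
        return max(a, b)
    return a + b - c

def near_lossless(img, near):
    # Quantize prediction residuals so that |reconstructed - original| <= near
    # for every pixel (the JPEG-LS NEAR parameter); near = 0 is lossless.
    img = img.astype(np.int64)
    h, w = img.shape
    recon = np.zeros_like(img)
    residuals = np.zeros_like(img)
    step = 2 * near + 1
    for y in range(h):
        for x in range(w):
            a = recon[y, x - 1] if x > 0 else 0
            b = recon[y - 1, x] if y > 0 else 0
            c = recon[y - 1, x - 1] if x > 0 and y > 0 else 0
            pred = med_predict(a, b, c)
            err = int(img[y, x]) - pred
            q = int(np.sign(err)) * ((abs(err) + near) // step)  # bounded quantization
            residuals[y, x] = q            # what an entropy coder would transmit
            recon[y, x] = pred + q * step  # decoder reconstructs the same value
    return residuals, recon
```

With near = 0 the scheme reduces to lossless predictive coding; for near > 0 the residual array is what an entropy coder such as the Golomb coder in JPEG-LS would compress, and the reconstruction never deviates from the original by more than the chosen bound.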

Proceedings Article
15 Apr 1996
TL;DR: A novel adaptive coding algorithm is proposed to yield a high compression rate for medical images; it achieves a higher compression rate than the JPEG baseline algorithm, and its PSNR values are marginally higher than those obtained with the JPEG coder.
Abstract: Compression of medical images to reduce their storage and transmission bandwidth requirements is of great interest in the implementation of systems such as the picture archiving and communication system (PACS). Direct application of discrete cosine transform (DCT) coding to medical images such as CT or MRI images is not effective, as the characteristics of such images are not exploited. Firstly, the noisy background in medical images consists largely of uncorrelated data, which is difficult to compress with transform coding. Secondly, representing the background with fixed block-size transform coding incurs inefficient overhead. A novel adaptive coding algorithm is proposed to yield a high compression rate for medical images. The proposed algorithm is a two-stage process: the first stage (pre-processing) removes the background noise and identifies the border of the medical data using a visual mask; the second stage (encoding) uses an adaptive block-size DCT coding algorithm to compress the image data. The proposed coding algorithm is evaluated and compared with the JPEG baseline algorithm, with results reported for compression ratio and peak signal-to-noise ratio (PSNR). The results show that the proposed algorithm achieves a higher compression rate than the JPEG baseline algorithm; in addition, its PSNR values are marginally higher than those obtained with the JPEG coder.

4 citations
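
As a rough illustration of the two-stage idea in the entry above, the sketch below masks out low-intensity background blocks and then applies block DCT coding to the remaining foreground. A plain intensity threshold stands in for the paper's visual mask, and a fixed 8x8 DCT with coefficient pruning stands in for its adaptive block-size coder; the thresholds and function names are assumptions of mine, not the authors' parameters.

```python
import numpy as np
from scipy.fft import dctn, idctn

def two_stage_sketch(img, block=8, bg_threshold=16, keep=10):
    # Stage 1 (pre-processing): blocks whose pixels all fall below
    # bg_threshold are treated as noisy background and dropped.
    # Stage 2 (encoding): remaining blocks get an 8x8 DCT, and only the
    # `keep` largest-magnitude coefficients per block are retained.
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for y in range(0, h - h % block, block):
        for x in range(0, w - w % block, block):
            tile = img[y:y + block, x:x + block].astype(float)
            if tile.max() < bg_threshold:        # background block: not encoded at all
                continue
            coeffs = dctn(tile, norm='ortho')    # forward block DCT
            cutoff = np.sort(np.abs(coeffs).ravel())[-keep]
            coeffs[np.abs(coeffs) < cutoff] = 0  # crude coefficient selection
            out[y:y + block, x:x + block] = idctn(coeffs, norm='ortho')
    return np.clip(out, 0, 255).astype(np.uint8)
```

Dropping background blocks outright is what saves the overhead the abstract points to; the paper's adaptive block sizes would refine the second stage further.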

Proceedings Article
Ming-Te Wu
01 Aug 2015
TL;DR: A multi-scale sub-band analysis method is proposed that uses energy and entropy metrics; compared with JPEG 2000 and other recently developed schemes, it reduces artifacts such as blocking and blurred edges and also improves PSNR performance.
Abstract: Transform-based coding standards such as JPEG and JPEG 2000 remain among the best technologies for image and video compression. These standards have been widely applied to image compression, but the area is still open to new algorithms that provide a better compression ratio while maintaining a low mean square error. This study proposes a multi-scale sub-band analysis method that uses energy and entropy metrics. Compared with JPEG 2000 and other recently developed schemes, the method not only reduces artifacts such as blocking and blurred edges but also improves PSNR performance. Simulation results show that the proposed algorithms improve on JPEG, JPEG 2000, and other schemes. The method is evaluated against other approaches both subjectively and objectively, and the simulations confirm that it achieves the desired reconstruction performance.

4 citations
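
The entry above characterizes sub-bands by energy and entropy. The sketch below shows how those two metrics can be computed over a multi-scale decomposition; it assumes PyWavelets (pywt) as a stand-in for the paper's sub-band transform, which the abstract does not specify, and the 64-bin histogram entropy is my choice.

```python
import numpy as np
import pywt  # PyWavelets, assumed here for the multi-scale decomposition

def subband_metrics(img, wavelet='db2', levels=3):
    # Decompose the image into an approximation band plus horizontal/vertical/
    # diagonal detail bands at each scale, then report per-band energy and
    # histogram entropy -- the two metrics named in the abstract.
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=levels)
    bands = [('approx', coeffs[0])]
    for lvl, (ch, cv, cd) in enumerate(coeffs[1:], start=1):
        bands += [(f'H{lvl}', ch), (f'V{lvl}', cv), (f'D{lvl}', cd)]
    metrics = {}
    for name, band in bands:
        energy = float(np.sum(band ** 2))
        hist, _ = np.histogram(band, bins=64)
        p = hist[hist > 0] / hist.sum()
        entropy = float(-np.sum(p * np.log2(p)))
        metrics[name] = {'energy': energy, 'entropy': entropy}
    return metrics
```

Bands with high energy but low entropy are cheap to code accurately; ranking sub-bands this way is one plausible reading of how the two metrics would guide bit allocation.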

Journal Article
TL;DR: The algorithms developed in this paper are found to give almost the same Peak Signal-to-Noise Ratio (PSNR) as the standard JPEG algorithm.
Abstract: In recent years, the use of digital image communication in day-to-day activities has increased exponentially. The Joint Photographic Experts Group (JPEG) standard is the most widely used still-image compression standard for bandwidth conservation. This paper proposes and critically studies a new set of JPEG compression algorithms that combine mean filtering, median filtering, and outlier detection with the conventional JPEG DCT algorithm in a staged manner. The outlier-based JPEG algorithm gives exceptionally high compression compared to the conventional JPEG algorithm. Experiments are carried out on many standard still images, and the algorithms developed in this paper are found to give almost the same Peak Signal-to-Noise Ratio (PSNR) as the standard JPEG algorithm.

4 citations
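
The staged combination in the entry above places mean/median filtering and outlier detection in front of the standard JPEG DCT pipeline. The sketch below shows only that pre-filtering stage: pixels far from their local median are treated as outliers and replaced before the image is handed to an ordinary JPEG encoder. The k-sigma outlier rule and window size are illustrative assumptions, since the paper's exact test is not given in the abstract.

```python
import numpy as np
from scipy.ndimage import median_filter

def outlier_prefilter(img, window=3, k=3.0):
    # Replace pixels that deviate from their local median by more than
    # k standard deviations of the residual; the cleaned image would then
    # be compressed with a conventional JPEG (DCT) encoder.
    img = img.astype(float)
    med = median_filter(img, size=window)
    resid = img - med
    sigma = resid.std() + 1e-9           # guard against a perfectly flat image
    cleaned = np.where(np.abs(resid) > k * sigma, med, img)
    return np.clip(cleaned, 0, 255).astype(np.uint8)
```

Removing such outliers before the DCT reduces high-frequency energy in each block, which is consistent with the higher compression the abstract reports at nearly unchanged PSNR.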


Network Information
Related Topics (5)
Image segmentation: 79.6K papers, 1.8M citations, 82% related
Feature (computer vision): 128.2K papers, 1.7M citations, 82% related
Feature extraction: 111.8K papers, 2.1M citations, 82% related
Image processing: 229.9K papers, 3.5M citations, 80% related
Convolutional neural network: 74.7K papers, 2M citations, 79% related
Performance Metrics
No. of papers in the topic in previous years
Year    Papers
2023    21
2022    40
2021    5
2020    2
2019    8
2018    15