Topic
Lossless JPEG
About: Lossless JPEG is a research topic. Over the lifetime, 2415 publications have been published within this topic receiving 51110 citations. The topic is also known as: Lossless JPEG & .jls.
Papers published on a yearly basis
Papers
1 citation
TL;DR: A VLSI architecture for a high-speed MQ encoder is designed; it adopts a 3-stage pipeline and optimizes the algorithm by improving the content of the probability estimation table to achieve high throughput.
Abstract: The MQ encoder is an important lossless-compression algorithm in JPEG 2000 and can attain high compression efficiency, but its use is restricted by its algorithmic complexity and slow performance. To achieve a high throughput, a VLSI architecture for a high-speed MQ encoder is designed; it adopts a 3-stage pipeline and optimizes the algorithm by improving the content of the probability estimation table. The design is programmed in Verilog and simulated with ModelSim 6.1. Experimental results show that the encoder's speed is significantly improved. This work makes practical use of JPEG 2000 more feasible.
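The probability estimation table the abstract refers to is the MQ coder's adaptive state machine: each state holds an estimate of the less-probable-symbol (LPS) probability and pointers to the next state after a more-probable (MPS) or less-probable decision. The sketch below is a toy illustration of that mechanism only; the table values and the 4-state size are invented for clarity and are not the 47-state table defined in the JPEG 2000 standard.

```python
# Toy sketch of MQ-style adaptive probability estimation.
# NOT the real JPEG 2000 table; states and Qe values are illustrative only.

# Each state: (Qe estimate of LPS probability,
#              next state on MPS, next state on LPS, switch-MPS flag)
TOY_TABLE = [
    (0.50, 1, 0, True),   # state 0: symbols equally likely
    (0.30, 2, 0, False),  # state 1
    (0.15, 3, 1, False),  # state 2
    (0.05, 3, 2, False),  # state 3: strongly skewed toward the MPS
]

def encode_decisions(bits):
    """Track how the estimator adapts as binary decisions arrive.

    Returns the sequence of LPS-probability estimates used, so the
    adaptation is visible; a real MQ coder would also perform interval
    subdivision, renormalisation, and byte output.
    """
    state, mps = 0, 0
    estimates = []
    for bit in bits:
        qe, nmps, nlps, switch = TOY_TABLE[state]
        estimates.append(qe)
        if bit == mps:          # more probable symbol: move toward skew
            state = nmps
        else:                   # less probable symbol: back off
            if switch:
                mps = 1 - mps   # at state 0 the MPS sense can flip
            state = nlps
    return estimates

# A run of identical bits drives the LPS-probability estimate down.
print(encode_decisions([0, 0, 0, 0]))  # [0.5, 0.3, 0.15, 0.05]
```

Because each decision only does a table lookup and a state update, the estimator is a natural fit for the pipelined hardware the paper describes.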
1 citation
01 Jan 2003
TL;DR: Simulation results reveal that the TDS obtains a significant coding gain over conventional diversity schemes in terms of both objective and subjective measurements.
Abstract: A novel turbo diversity scheme (TDS) applied to JPEG-coded images is proposed in this paper. By utilising the built-in turbo encoder structure, the transmitted JPEG image encoded by a rate-1/2 code is recovered at the receiver using a more powerful rate-1/3 code generated by the proposed TDS. Simulation results reveal that the TDS obtains a significant coding gain over conventional diversity schemes in terms of both objective and subjective measurements.
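The paper's specific turbo code is not given in the abstract, but the notion of a rate-1/2 code (two coded bits per message bit) can be illustrated with a toy convolutional encoder. The generator pair G1 = 111, G2 = 101 (constraint length 3) is a textbook choice assumed here for illustration, not the code used in the paper.

```python
# Toy rate-1/2 convolutional encoder (illustrative only; not the
# specific turbo component code of the paper).

def conv_encode(bits, g1=0b111, g2=0b101, k=3):
    """Encode `bits` at rate 1/2: two output bits per input bit."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & ((1 << k) - 1)  # shift in new bit
        out.append(bin(state & g1).count("1") % 2)   # parity against G1
        out.append(bin(state & g2).count("1") % 2)   # parity against G2
    return out

# Two coded bits are emitted per message bit, hence rate 1/2.
msg = [1, 0, 1, 1]
code = conv_encode(msg)
print(len(code) // len(msg))  # 2
```

A rate-1/3 code, as used at the receiver in the TDS, would emit three such parity streams per message bit, trading more redundancy for stronger error correction.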
1 citation
08 Nov 2010
TL;DR: This paper deals with a quality analysis of reconstructed microscopic leukocyte images after lossy compression, evaluating the performance of several segmentation algorithms together with objective quality metrics.
Abstract: Reducing image file size by means of lossy compression algorithms can lead to distortions in image content, affecting the detection of fine detail structures by either human or automated observation. In the case of microscopic images of blood cells, which usually occupy large amounts of disk space, the use of such procedures is justified within a controlled quality loss. Although JPEG 2000 remains the accepted standard for lossy compression, a set of guidelines still needs to be established in order to use this codec in its lossy mode for particular applications. The present paper deals with a quality analysis of reconstructed microscopic leukocyte images after they have been lossy compressed. The quality loss is investigated near the lower compression bound by evaluating the performance of several segmentation algorithms together with objective quality metrics. A compression rate of 142:1 is estimated from the experiments.
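The reported 142:1 rate can be made concrete by converting it into a per-pixel bit budget and a compressed file size. The snippet below is a hedged sketch; the 8-bit depth and 1024x1024 dimensions are assumptions for illustration, not figures from the paper.

```python
# What a 142:1 compression rate on raw pixel data implies (illustrative;
# image dimensions and bit depth below are assumptions, not the paper's).

def bits_per_pixel(bit_depth, rate):
    """Per-pixel bit budget implied by a compression rate on raw pixels."""
    return bit_depth / rate

def compressed_size(width, height, bit_depth, rate):
    """Compressed size in bytes for a raw image at the given rate."""
    raw_bytes = width * height * bit_depth // 8
    return raw_bytes / rate

print(round(bits_per_pixel(8, 142), 4))            # 0.0563
print(round(compressed_size(1024, 1024, 8, 142)))  # 7384
```

At roughly 0.06 bits per pixel, a 1 MiB raw grayscale image shrinks to about 7 KiB, which shows why segmentation-based quality checks are needed at such aggressive rates.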
1 citation