Author

Fang Sheng

Bio: Fang Sheng is an academic researcher from the University of Arizona. The author has contributed to research on wavelet transforms and transform coding, has an h-index of 3, and has co-authored 4 publications receiving 133 citations.

Papers
Journal ArticleDOI
TL;DR: A new fully scalable image coder is presented, and the lossless and lossy performance of reversible integer wavelet transforms in the proposed coder is investigated; the lossless performance is comparable to JPEG-LS.
Abstract: Reversible integer wavelet transforms allow both lossless and lossy decoding using a single bitstream. We present a new fully scalable image coder and investigate the lossless and lossy performance of these transforms in the proposed coder. The lossless compression performance of the presented method is comparable to JPEG-LS. The lossy performance is quite competitive with other efficient lossy compression methods.
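The reversibility comes from integer lifting: each step adds a rounded prediction computed from the other band, and the same rounded value can be subtracted back out exactly. Below is a minimal sketch of one such transform, the LeGall 5/3 integer lifting step used for lossless coding in this family; the function names and the periodic (np.roll) boundary handling are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def fwd_53(x):
    """Forward reversible 5/3 lifting on a 1-D signal of even length.

    Floor-rounded lifting keeps every coefficient an integer, so the
    inverse reconstructs the input exactly (lossless), while coarsely
    coded coefficients still give a useful lossy approximation.
    """
    even, odd = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    d = odd - ((even + np.roll(even, -1)) >> 1)   # predict: detail band
    s = even + ((d + np.roll(d, 1) + 2) >> 2)     # update: approx band
    return s, d

def inv_53(s, d):
    """Undo the lifting steps in reverse order; exact for integers."""
    even = s - ((d + np.roll(d, 1) + 2) >> 2)
    odd = d + ((even + np.roll(even, -1)) >> 1)
    x = np.empty(even.size + odd.size, dtype=np.int64)
    x[0::2], x[1::2] = even, odd
    return x

x = np.random.randint(0, 256, 16)
s, d = fwd_53(x)
assert np.array_equal(inv_53(s, d), x)  # perfect reconstruction
```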

65 citations

Proceedings ArticleDOI
04 Oct 1998
TL;DR: The lossless compression performance of the presented method is comparable to JPEG-LS and the lossy performance is quite competitive with other efficient lossy compression methods.
Abstract: There has been recent interest in using reversible integer wavelet transforms for image compression. These transforms allow both lossless and lossy decoding (by resolution and/or accuracy) using a single bitstream. We investigate the lossless and lossy performance of these transforms in the JPEG-2000 Verification Model 0. The lossless compression performance of the presented method is comparable to JPEG-LS. The lossy performance is quite competitive with other efficient lossy compression methods.
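Decoding "by accuracy" from a single bitstream is possible because integer coefficients can be transmitted bit-plane by bit-plane: stopping early is equivalent to zeroing the least-significant magnitude bits. The sketch below illustrates only that truncation effect; it is a hedged toy, not the Verification Model's coder.

```python
import numpy as np

def drop_bitplanes(coeffs, k):
    """Simulate decoding only the top bit-planes of integer wavelet
    coefficients: k = 0 reproduces them exactly (the lossless path),
    while k > 0 zeroes the k least-significant magnitude bits (lossy).
    """
    mag = np.abs(coeffs.astype(np.int64))
    return np.sign(coeffs) * ((mag >> k) << k)

coeffs = np.array([37, -5, 0, 122, -64])
print(drop_bitplanes(coeffs, 0))  # [ 37  -5   0 122 -64]  exact
print(drop_bitplanes(coeffs, 3))  # [ 32   0   0 120 -64]  coarser
```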

59 citations

Proceedings ArticleDOI
26 Oct 1997
TL;DR: Coding techniques that enable progressive transmission when trellis coded quantization (TCQ) is applied to the wavelet coefficients are presented and comparable PSNR performance is achieved with a simple entropy coder.
Abstract: As image coders evolve from DCT-based to wavelet-based designs, the latter must be enhanced to include capabilities currently supported by standards such as JPEG. Said and Pearlman (IEEE Transactions on Circuits and Systems for Video Technology, vol. 6, pp. 243-250, 1996) and Schwartz, Zandi and Boliek (Proc. SPIE, vol. 2564, July 1995) have described approaches for incorporating progressive transmission capabilities within wavelet-based coders. All of these coders apply scalar quantization to wavelet transform coefficients and then apply sophisticated entropy coding methods to the quantized coefficients. In this paper, we present coding techniques that enable progressive transmission when trellis coded quantization (TCQ) is applied to the wavelet coefficients. While the trellis coded quantizer is more complex than the uniform scalar quantizer, comparable PSNR performance is achieved with a simple entropy coder. In addition, our use of sophisticated quantization and bit rate allocation algorithms enables the development of coders that are tuned for improved perceptual image quality.
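TCQ replaces the scalar quantizer's independent per-sample decision with a Viterbi search over a small trellis, letting a union of coarse sub-codebooks act like a finer quantizer at the same rate. The sketch below uses a standard 4-state trellis and an interleaved subset labeling; both tables are textbook choices and are not claimed to match the paper's configuration.

```python
import numpy as np

# From state s, path bit b moves to NEXT[s][b] and draws the codeword
# from sub-codebook SUBSET[s][b], where codebook[k::4] is subset k.
NEXT = [[0, 2], [0, 2], [1, 3], [1, 3]]
SUBSET = [[0, 2], [2, 0], [1, 3], [3, 1]]

def tcq_encode(x, codebook):
    """Viterbi search for the minimum-squared-error path through the
    trellis; returns one path bit per sample plus the codeword index
    within each branch's subset."""
    n = len(x)
    cost = np.full(4, np.inf)
    cost[0] = 0.0                                  # start in state 0
    back = np.zeros((n, 4, 3), dtype=np.int64)     # (prev state, bit, index)
    for t in range(n):
        new = np.full(4, np.inf)
        nb = np.zeros((4, 3), dtype=np.int64)
        for s in range(4):
            if not np.isfinite(cost[s]):
                continue
            for b in (0, 1):
                sub = codebook[SUBSET[s][b]::4]
                i = int(np.argmin((sub - x[t]) ** 2))
                c = cost[s] + (sub[i] - x[t]) ** 2
                ns = NEXT[s][b]
                if c < new[ns]:
                    new[ns] = c
                    nb[ns] = (s, b, i)
        cost, back[t] = new, nb
    s = int(np.argmin(cost))                       # best terminal state
    bits, idxs = [], []
    for t in range(n - 1, -1, -1):                 # trace back the winner
        ps, b, i = back[t, s]
        bits.append(int(b)); idxs.append(int(i)); s = int(ps)
    return bits[::-1], idxs[::-1]

x = np.array([0.9, -2.3, 1.7, 0.1])
codebook = np.linspace(-3.5, 3.5, 8)  # 8 levels in 4 interleaved subsets
print(tcq_encode(x, codebook))
```

With 8 levels split into 4 subsets, each sample costs one path bit plus one index bit, so the trellis effectively chooses among 8 reconstruction levels at a 2-bit rate; that is where the gain over a 4-level scalar quantizer at the same rate comes from.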

14 citations

Proceedings Article
26 Oct 1997
TL;DR: In this paper, trellis coded quantization is applied to the wavelet coefficients, and PSNR performance comparable to the uniform scalar quantizer is achieved with a simple entropy coder.
Abstract: As image coders evolve from DCT-based to wavelet-based designs, the latter must be enhanced to include capabilities currently supported by standards such as JPEG. Recent work [1, 2, 3] has described approaches for incorporating progressive transmission capabilities within wavelet-based coders. All of these coders apply scalar quantization to wavelet transform coefficients, and then apply sophisticated entropy coding methods to the quantized coefficients. In this paper, we present coding techniques that enable progressive transmission when trellis coded quantization (TCQ) is applied to the wavelet coefficients. While the trellis coded quantizer is more complex than the uniform scalar quantizer, comparable PSNR performance is achieved with a simple entropy coder. In addition, our use of sophisticated quantization and bit rate allocation algorithms enables the development of coders that are tuned for improved perceptual image quality.

1 citation


Cited by
Journal ArticleDOI
TL;DR: A new image compression algorithm is proposed, based on independent embedded block coding with optimized truncation of the embedded bit-streams (EBCOT), capable of modeling the spatially varying visual masking phenomenon.
Abstract: A new image compression algorithm is proposed, based on independent embedded block coding with optimized truncation of the embedded bit-streams (EBCOT). The algorithm exhibits state-of-the-art compression performance while producing a bit-stream with a rich set of features, including resolution and SNR scalability together with a "random access" property. The algorithm has modest complexity and is suitable for applications involving remote browsing of large compressed images. The algorithm lends itself to explicit optimization with respect to MSE as well as more realistic psychovisual metrics, capable of modeling the spatially varying visual masking phenomenon.
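The "optimized truncation" can be pictured as a Lagrangian rate-distortion allocation: each code-block offers a menu of (rate, distortion) truncation points, and a single slope parameter, found by bisection, makes the independent per-block choices meet a global rate budget. The following is a sketch under those assumptions, not the standard's exact PCRD-opt procedure.

```python
def optimal_truncation(blocks, rate_budget, iters=60):
    """Pick one truncation point per code-block to (approximately)
    minimize total distortion subject to a total rate budget.

    `blocks` is a list of [(rate, distortion), ...] menus, each of
    which should include a zero-rate point so any budget is feasible.
    For a multiplier lam, each block independently minimizes
    distortion + lam * rate; bisection on lam meets the budget.
    """
    def pick(lam):
        choice = [min(pts, key=lambda p: p[1] + lam * p[0]) for pts in blocks]
        return choice, sum(p[0] for p in choice)

    lo, hi = 0.0, 1e12          # lo overspends, hi underspends
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if pick(mid)[1] > rate_budget:
            lo = mid            # over budget: penalize rate more
        else:
            hi = mid
    return pick(hi)[0]

blocks = [[(0, 100.0), (10, 40.0), (20, 15.0)],
          [(0, 80.0), (8, 30.0), (16, 12.0)]]
print(optimal_truncation(blocks, rate_budget=18))  # spends 10 + 8 bits
```

As with any Lagrangian allocation, only truncation points on each block's lower convex hull are ever selected, which is why EBCOT restricts attention to those points.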

1,933 citations

Journal ArticleDOI
TL;DR: Some of the most significant features of the standard are presented, such as region-of-interest coding, scalability, visual weighting, error resilience and file format aspects, and some comparative results are reported.
Abstract: One of the aims of the standardization committee has been the development of Part I, which could be used on a royalty- and fee-free basis. This is important for the standard to become widely accepted. The standardization process, which is coordinated by the JTC1/SC29/WG1 of the ISO/IEC, has already produced the international standard (IS) for Part I. In this article the structure of Part I of the JPEG 2000 standard is presented and performance comparisons with established standards are reported. This article is intended to serve as a tutorial for the JPEG 2000 standard. The main application areas and their requirements are given. The architecture of the standard follows with the description of the tiling, multicomponent transformations, wavelet transforms, quantization and entropy coding. Some of the most significant features of the standard are presented, such as region-of-interest coding, scalability, visual weighting, error resilience and file format aspects. Finally, some comparative results are reported and the future parts of the standard are discussed.
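Of the pipeline stages listed, quantization is the easiest to show concretely: Part I uses a dead-zone uniform scalar quantizer, whose zero bin is twice as wide as the others. A minimal sketch follows; the reconstruction offset r = 0.5 is one common decoder choice, not mandated by the standard.

```python
import numpy as np

def deadzone_quantize(c, step):
    """Dead-zone quantizer: q = sign(c) * floor(|c| / step).
    Small, noise-like wavelet coefficients fall into the wide
    zero bin and cost almost nothing to code."""
    return np.sign(c) * np.floor(np.abs(c) / step)

def dequantize(q, step, r=0.5):
    """Reconstruct r of the way into the decoded interval (q != 0)."""
    return np.sign(q) * (np.abs(q) + r) * step * (q != 0)

c = np.array([0.4, -1.3, 2.9, -0.1])
q = deadzone_quantize(c, step=1.0)
print(q)                          # [ 0. -1.  2. -0.]
print(dequantize(q, step=1.0))    # [ 0.  -1.5  2.5  0. ]
```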

1,842 citations

Proceedings ArticleDOI
24 Oct 1999
TL;DR: A new image compression algorithm is proposed, based on independent embedded block coding with optimized truncation of the embedded bit-streams (EBCOT), capable of modeling the spatially varying visual masking phenomenon.
Abstract: A new image compression algorithm is proposed, based on independent embedded block coding with optimized truncation of the embedded bit-streams (EBCOT). The algorithm exhibits state-of-the-art compression performance while producing a bit-stream with a rich feature set, including resolution and SNR scalability together with a random access property. The algorithm has modest complexity and is extremely well suited to applications involving remote browsing of large compressed images. The algorithm lends itself to explicit optimization with respect to MSE as well as more realistic psychovisual metrics, capable of modeling the spatially varying visual masking phenomenon.

1,479 citations

Journal ArticleDOI
TL;DR: At low bit rates, reversible integer-to-integer and conventional versions of transforms were often found to yield results of comparable quality, with the best choice for a given application depending on the relative importance of lossy performance, lossless performance, and computational complexity.
Abstract: In the context of image coding, a number of reversible integer-to-integer wavelet transforms are compared on the basis of their lossy compression performance, lossless compression performance, and computational complexity. Of the transforms considered, several were found to perform particularly well, with the best choice for a given application depending on the relative importance of the preceding criteria. Reversible integer-to-integer versions of numerous transforms are also compared to their conventional (i.e., nonreversible real-to-real) counterparts for lossy compression. At low bit rates, reversible integer-to-integer and conventional versions of transforms were often found to yield results of comparable quality. Factors affecting the compression performance of reversible integer-to-integer wavelet transforms are also presented, supported by both experimental data and theoretical arguments.
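The simplest member of the reversible family compared here is the S-transform (an integer Haar), which makes the reversibility argument easy to see: the floor rounding applied in the forward direction is recomputed, not approximated, by the inverse. A hedged sketch follows; conventional real-to-real transforms lack this property because their coefficients would need unlimited precision for exact reconstruction.

```python
import numpy as np

def s_transform(x0, x1):
    """Forward S-transform (integer Haar) on paired samples."""
    h = x0 - x1                 # integer difference
    l = x1 + (h >> 1)           # equals floor((x0 + x1) / 2)
    return l, h

def inv_s_transform(l, h):
    """Exact inverse: recompute the same floored term and undo it."""
    x1 = l - (h >> 1)
    x0 = x1 + h
    return x0, x1

x0 = np.random.randint(-128, 128, 8)
x1 = np.random.randint(-128, 128, 8)
l, h = s_transform(x0, x1)
r0, r1 = inv_s_transform(l, h)
assert np.array_equal(r0, x0) and np.array_equal(r1, x1)  # lossless
```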

410 citations

Journal ArticleDOI
07 Nov 2002
TL;DR: A tutorial-style review of the new JPEG2000 standard is provided, explaining the technology on which it is based and drawing comparisons with JPEG and other compression standards.
Abstract: JPEG2000 is the latest image compression standard to emerge from the Joint Photographic Experts Group (JPEG) working under the auspices of the International Organization for Standardization. Beyond offering superior compression performance to JPEG, JPEG2000 provides a whole new way of interacting with compressed imagery in a scalable and interoperable fashion. This paper provides a tutorial-style review of the new standard, explaining the technology on which it is based and drawing comparisons with JPEG and other compression standards. The paper also describes new work, exploiting the capabilities of JPEG2000 in client-server systems for efficient interactive browsing of images over the Internet.

275 citations