Author

William A. Pearlman

Bio: William A. Pearlman is an academic researcher from Rensselaer Polytechnic Institute. The author has contributed to research in the topics of data compression and set partitioning in hierarchical trees. The author has an h-index of 36 and has co-authored 202 publications receiving 12,924 citations. Previous affiliations of William A. Pearlman include Texas A&M University and the University of Wisconsin-Madison.


Papers
Book ChapterDOI
01 Jan 2006
TL;DR: This chapter proposes a three-dimensional set-partitioned embedded block coder for hyperspectral image compression whose 3D wavelet transform automatically exploits inter-band dependence; an integer-filter version enables lossy-to-lossless coding, while a floating-point version provides better performance for lossy representation.
Abstract: This chapter proposes a three-dimensional set-partitioned embedded block coder for hyperspectral image compression. The three-dimensional wavelet transform automatically exploits inter-band dependence. Two versions of the algorithm were implemented: the integer filter implementation enables lossy-to-lossless compression, and the floating point filter implementation provides better performance for lossy representation. A wavelet packet structure and bit shifting were applied in the integer filter implementation to make the transform approximately unitary.

160 citations
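
A side note on the lossy-to-lossless claim: it rests on the wavelet filter mapping integers to integers reversibly, so the same embedded bit stream decodes losslessly in full or lossily when truncated. Below is a minimal Python sketch of that property using the integer Haar (S) transform as a stand-in; the chapter's actual 3D filter bank and its bit-shift subband scaling are not reproduced here.

```python
# Integer-to-integer (lifting) wavelet step: exactly invertible on integers,
# which is what makes lossy-to-lossless coding from one bit stream possible.
# Illustration only; not the chapter's actual 3D filter bank.

def s_forward(x):
    """One level of the integer Haar (S) transform on an even-length list."""
    low, high = [], []
    for a, b in zip(x[0::2], x[1::2]):
        d = a - b                    # integer detail coefficient
        high.append(d)
        low.append(b + d // 2)       # integer average: floor((a + b) / 2)
    return low, high

def s_inverse(low, high):
    """Exact inverse of s_forward."""
    x = []
    for s, d in zip(low, high):
        a = s + (d + 1) // 2
        b = a - d
        x.extend([a, b])
    return x

signal = [12, 14, 7, 3, 0, -5, 9, 9]
lo, hi = s_forward(signal)
assert s_inverse(lo, hi) == signal   # perfect reconstruction: the lossless path
```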

Proceedings ArticleDOI
05 Jun 2000
TL;DR: A variant of the SPIHT image compression algorithm called no list SPIHT (NLS) is presented, which operates without linked lists and is suitable for a fast, simple hardware implementation.
Abstract: A variant of the SPIHT image compression algorithm called no list SPIHT (NLS) is presented. NLS operates without linked lists and is suited to a fast, simple hardware implementation. It has a fixed, predetermined memory requirement about 50% larger than that needed for the image alone. Instead of lists, a state table with four bits per coefficient keeps track of the set partitions and of what information has been encoded. NLS sparsely marks selected descendant nodes of insignificant trees in the state table in such a way that large groups of predictably insignificant pixels are easily identified and skipped during coding passes. The image data is stored in a one-dimensional recursive zig-zag array for computational efficiency and algorithmic simplicity. The performance of the algorithm on standard test images is nearly the same as that of SPIHT.

153 citations
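
The state-table idea is easy to make concrete. The Python sketch below replaces dynamic lists with a fixed flag array so that memory use is predetermined; it keeps only a single significance flag per coefficient and emits raw bits, whereas the actual NLS packs four bits per coefficient and adds the tree-skipping markers and zig-zag layout described above.

```python
import numpy as np

# Sketch of the listless idea: a fixed state array with flag bits per
# coefficient replaces dynamically grown lists, so memory use is known in
# advance. This toy coder keeps one "significant" flag and emits raw
# significance/sign/refinement bits; the real NLS packs four bits per
# coefficient and sparsely marks insignificant trees so they can be skipped.

SIG = 1  # flag: coefficient already known to be significant

def bitplane_pass(coeff, state, threshold, out):
    plane = threshold.bit_length() - 1         # bit position of this plane
    for i in range(coeff.size):
        if state[i] & SIG:                     # refinement bit for old entries
            out.append((abs(int(coeff[i])) >> plane) & 1)
        elif abs(coeff[i]) >= threshold:       # newly significant coefficient
            out.append(1)
            out.append(0 if coeff[i] >= 0 else 1)   # sign bit
            state[i] |= SIG
        else:
            out.append(0)                      # still insignificant

coeff = np.array([34, -3, 7, -20])
state = np.zeros(coeff.size, dtype=np.uint8)
bits = []
for t in (32, 16, 8, 4):
    bitplane_pass(coeff, state, t, bits)
print(bits)
```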

Patent
14 Sep 1995
TL;DR: In this article, a data compression technique includes a subband decomposition of a source image followed by coding of the coefficients of the resultant subband decomposition for storage and/or transmission.
Abstract: A data compression technique includes a subband decomposition of a source image followed by coding of the coefficients of the resultant subband decomposition for storage and/or transmission. During coding, three ordered lists are used, comprising a list of significant pixels (LSP), a list of insignificant pixels (LIP) and a list of insignificant sets of pixels (LIS). The pixels in the LIP are tested, and those that are significant at a current quantization level are moved to the LSP. Similarly, sets are sequentially evaluated following the LIS order, and when a set is found to be significant it is removed from the LIS and partitioned into new subsets. The new subsets with more than one element are added back to the end of the LIS, while the single-coordinate sets are added to the end of the LIP or to the end of the LSP, depending on whether they are insignificant or significant, respectively.

152 citations
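
For readers unfamiliar with the three lists, the Python sketch below walks through exactly the bookkeeping the patent describes: LIP pixels are tested and promoted to the LSP, LIS sets are evaluated in order, and significant sets are partitioned, with multi-element subsets returned to the end of the LIS and single pixels sent to the LIP or LSP. For brevity the sets here are rectangular blocks that split into quadrants; the patent's actual sets are spatial-orientation trees, and the refinement pass is omitted.

```python
import numpy as np

# LIP/LIS/LSP bookkeeping for one sorting pass. Sets are rectangular blocks
# that split into quadrants when significant; the patent's actual sets are
# spatial-orientation trees, and the refinement pass is omitted.

def significant(img, block, T):
    r, c, h, w = block
    return np.abs(img[r:r + h, c:c + w]).max() >= T

def quadrants(block):
    r, c, h, w = block
    hh, hw = h // 2, w // 2
    return [(r + dr, c + dc, hh, hw) for dr in (0, hh) for dc in (0, hw)]

def sorting_pass(img, LIP, LIS, LSP, T, out):
    for (r, c) in list(LIP):                 # test each insignificant pixel
        if abs(img[r, c]) >= T:
            out.append(1)
            LIP.remove((r, c))
            LSP.append((r, c))               # promoted to significant
        else:
            out.append(0)
    i = 0
    while i < len(LIS):                      # evaluate sets in LIS order
        if significant(img, LIS[i], T):
            out.append(1)
            for r, c, h, w in quadrants(LIS.pop(i)):
                if h == w == 1:              # single-coordinate set
                    if abs(img[r, c]) >= T:
                        out.append(1)
                        LSP.append((r, c))
                    else:
                        out.append(0)
                        LIP.append((r, c))
                else:
                    LIS.append((r, c, h, w)) # multi-element subset: end of LIS
        else:
            out.append(0)
            i += 1

img = np.array([[31, 4, 2, 1],
                [ 6, 5, 0, 3],
                [ 2, 1, 0, 0],
                [ 1, 0, 2, 1]])
LIP, LIS, LSP, bits = [], [(0, 0, 4, 4)], [], []
for T in (16, 8, 4):
    sorting_pass(img, LIP, LIS, LSP, T, bits)
print(LSP)     # pixels found significant, in discovery order
```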

01 Jan 2004
TL;DR: In this paper, an embedded, block-based image wavelet transform coding algorithm of low complexity, 3D-SPECK, is proposed for coding 3D volumetric image data.
Abstract: We propose an embedded, block-based image wavelet transform coding algorithm of low complexity. The Set Partitioned Embedded bloCK (SPECK) algorithm is modified and extended to three dimensions. The resultant algorithm, three-Dimensional Set Partitioned Embedded bloCK (3D-SPECK), efficiently encodes 3D volumetric image data by exploiting the dependencies in all dimensions. 3D-SPECK generates an embedded bit stream and therefore provides progressive transmission. We describe two implementations of the coding algorithm, one with an integer wavelet transform and one with a floating point wavelet transform; the former enables lossy and lossless decompression from the same bit stream, while the latter achieves better lossy performance. A wavelet packet structure and coefficient scaling are used to make the integer filter transform approximately unitary. The structure of hyperspectral images yields spectral responses that would seem ideal candidates for compression by 3D-SPECK, and we demonstrate that 3D-SPECK, a wavelet domain compression algorithm, preserves spectral profiles well. Compared with the lossless version of the benchmark JPEG2000 (multi-component), the lossless 3D-SPECK algorithm produces an average 3.0% decrease in compressed file size for Airborne Visible Infrared Imaging Spectrometer images, the typical hyperspectral imagery. We also compare the lossy implementation with state-of-the-art algorithms such as three-Dimensional Set Partitioning In Hierarchical Trees (3D-SPIHT) and JPEG2000, and conclude that this algorithm, in addition to being very flexible, retains all the desirable features of these algorithms, is highly competitive with 3D-SPIHT, and is better than JPEG2000 in compression efficiency.

118 citations
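
The core of the block coder is a recursive split-on-significance test, which the following Python sketch shows for a power-of-two cube: one bit dismisses an entire insignificant block, and a significant block recurses into its eight octants. The real 3D-SPECK's S/I set structure, sorted lists, and refinement passes are omitted, so this is only the octree skeleton.

```python
import numpy as np

# Octree split-on-significance skeleton: one bit dismisses a whole
# insignificant block; a significant block recurses into eight octants.
# 3D-SPECK's S/I sets, sorted lists, and refinement passes are omitted.

def code_block(vol, z, y, x, n, T, out):
    """Code the n*n*n block (n a power of two) at (z, y, x) against threshold T."""
    block = vol[z:z + n, y:y + n, x:x + n]
    if np.abs(block).max() < T:
        out.append(0)                  # single bit covers the whole block
        return
    out.append(1)
    if n == 1:
        out.append(0 if vol[z, y, x] >= 0 else 1)   # sign of significant pixel
        return
    h = n // 2
    for dz in (0, h):                  # recurse into the eight octants
        for dy in (0, h):
            for dx in (0, h):
                code_block(vol, z + dz, y + dy, x + dx, h, T, out)

rng = np.random.default_rng(0)
vol = (rng.standard_normal((8, 8, 8)) * 10).astype(int)
bits = []
code_block(vol, 0, 0, 0, 8, 32, bits)
print(len(bits), "bits emitted at threshold 32")
```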

Journal ArticleDOI
01 Aug 1999
TL;DR: A new adaptive windowing algorithm is proposed for speckle noise suppression which solves the problem of window size associated with the local statistics adaptive filters and is applied to both a simulated SAR image and an ERS-1 SAR image.
Abstract: Speckle noise usually occurs in synthetic aperture radar (SAR) images owing to the coherent processing of SAR data. The best-known image domain speckle filters are adaptive filters using local statistics such as the mean and standard deviation. Local statistics filters adapt the filter coefficients to data within a fixed running window, so, depending on the window size, there is a trade-off between the extent of speckle noise suppression and the ability to preserve fine detail. The authors propose a new adaptive windowing algorithm for speckle noise suppression which solves the window size problem associated with local statistics adaptive filters. In the algorithm, the window size is automatically adjusted to regional characteristics so as to suppress speckle noise as much as possible while preserving fine details: in homogeneous regions, suppression is strengthened by successively enlarging the window, while in fine detail regions the window is successively reduced so that edges and textures are preserved. The fixed-window filtering schemes and the proposed one are applied to both a simulated SAR image and an ERS-1 SAR image to demonstrate the excellent performance of the proposed adaptive windowing algorithm for speckle noise suppression.

113 citations
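
As a rough illustration of the adaptive-window idea, the Python sketch below grows the window at each pixel while the local coefficient of variation stays below a homogeneity threshold, then applies a Lee-type local-statistics estimate. The threshold, growth schedule, and noise model here are illustrative assumptions, not the authors' exact rule.

```python
import numpy as np

# Adaptive-window Lee-type filter: grow the window while the region looks
# homogeneous (low coefficient of variation), then apply a local-statistics
# estimate. Threshold, schedule, and noise model are illustrative only.

def adaptive_window_filter(img, noise_cv=0.25, max_half=7):
    img = img.astype(float)
    out = np.empty_like(img)
    H, W = img.shape
    for r in range(H):
        for c in range(W):
            half = 1                                    # start from 3x3
            while half < max_half:
                grown = img[max(r - half - 1, 0):r + half + 2,
                            max(c - half - 1, 0):c + half + 2]
                if grown.std() / (grown.mean() + 1e-9) > noise_cv:
                    break                               # detail: stop growing
                half += 1
            win = img[max(r - half, 0):r + half + 1,
                      max(c - half, 0):c + half + 1]
            mu, var = win.mean(), win.var()
            noise_var = (noise_cv * mu) ** 2            # multiplicative speckle
            gain = max(var - noise_var, 0.0) / (var + 1e-9)
            out[r, c] = mu + gain * (img[r, c] - mu)    # Lee-type estimate
    return out

speckled = np.random.default_rng(1).gamma(4.0, 25.0, size=(64, 64))
smoothed = adaptive_window_filter(speckled)             # flat areas smoothed more
```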


Cited by
Journal ArticleDOI
TL;DR: In this article, a structural similarity index is proposed for image quality assessment based on the degradation of structural information, and its promise is demonstrated through comparisons with both subjective ratings and state-of-the-art objective methods on a database of images compressed with JPEG and JPEG2000.
Abstract: Objective methods for assessing perceptual image quality traditionally attempted to quantify the visibility of errors (differences) between a distorted image and a reference image using a variety of known properties of the human visual system. Under the assumption that human visual perception is highly adapted for extracting structural information from a scene, we introduce an alternative complementary framework for quality assessment based on the degradation of structural information. As a specific example of this concept, we develop a structural similarity index and demonstrate its promise through a set of intuitive examples, as well as comparison to both subjective ratings and state-of-the-art objective methods on a database of images compressed with JPEG and JPEG2000. A MATLAB implementation of the proposed algorithm is available online at http://www.cns.nyu.edu/~lcv/ssim/.

40,609 citations
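
The index itself is compact. Below is a minimal single-window Python sketch using the paper's default constants (K1 = 0.01, K2 = 0.03, dynamic range L = 255); the published SSIM is computed over a sliding window and averaged into a mean score, which this sketch omits.

```python
import numpy as np

# Single-window SSIM with the paper's default constants. The published index
# is computed over a sliding window and averaged; that windowing is omitted.

def ssim_global(x, y, L=255, K1=0.01, K2=0.03):
    x, y = x.astype(float), y.astype(float)
    C1, C2 = (K1 * L) ** 2, (K2 * L) ** 2       # stabilizing constants
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + C1) * (2 * cov + C2)) / \
           ((mx ** 2 + my ** 2 + C1) * (vx + vy + C2))

ref = np.tile(np.arange(256, dtype=float), (64, 1))
noisy = ref + np.random.default_rng(2).normal(0, 10, ref.shape)
print(ssim_global(ref, ref))     # 1.0: identical images
print(ssim_global(ref, noisy))   # < 1.0: quality degraded by noise
```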

Journal ArticleDOI
08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: it seemed an odd beast at first, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Book
01 Jan 1998
TL;DR: A textbook tour of wavelet signal processing, from Fourier and discrete foundations through wavelet bases, wavelet packets, and local cosine bases, to approximation, estimation, and transform coding.
Abstract: Introduction to a Transient World. Fourier Kingdom. Discrete Revolution. Time Meets Frequency. Frames. Wavelet Zoom. Wavelet Bases. Wavelet Packet and Local Cosine Bases. An Approximation Tour. Estimations are Approximations. Transform Coding. Appendix A: Mathematical Complements. Appendix B: Software Toolboxes.

17,693 citations

Journal ArticleDOI
TL;DR: The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods.
Abstract: Embedded zerotree wavelet (EZW) coding, introduced by Shapiro (see IEEE Trans. Signal Processing, vol.41, no.12, p.3445, 1993), is a very effective and computationally simple technique for image compression. We offer an alternative explanation of the principles of its operation, so that the reasons for its excellent performance can be better understood. These principles are partial ordering by magnitude with a set partitioning sorting algorithm, ordered bit plane transmission, and exploitation of self-similarity across different scales of an image wavelet transform. Moreover, we present a new and different implementation based on set partitioning in hierarchical trees (SPIHT), which provides even better performance than our previously reported extension of EZW that surpassed the performance of the original EZW. The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods. In addition, the new coding and decoding procedures are extremely fast, and they can be made even faster, with only small loss in performance, by omitting entropy coding of the bit stream by the arithmetic code.

5,890 citations
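
The embedded property (truncate the stream anywhere and still decode a coarser version of the same image) follows from ordered bit-plane transmission. The Python sketch below strips away the trees and sorting and simply sends coefficient magnitude bits plane by plane, most significant first, so every prefix decodes to a progressively refined result.

```python
# Ordered bit-plane transmission: most significant bits of every coefficient
# are sent first, so any prefix of the stream decodes to a coarser version
# of the same data. Trees and sorting are stripped away; coefficients go in
# fixed order, and reconstruction is by plain truncation (a real embedded
# decoder would also center values within the remaining uncertainty interval).

def encode_bitplanes(coeffs, nplanes):
    signs = [0 if c >= 0 else 1 for c in coeffs]
    bits = []
    for p in reversed(range(nplanes)):           # MSB plane first
        for c in coeffs:
            bits.append((abs(c) >> p) & 1)
    return signs, bits

def decode_prefix(signs, bits, nplanes, nbits):
    n = len(signs)
    mags = [0] * n
    for i, b in enumerate(bits[:nbits]):
        p = nplanes - 1 - i // n                 # plane this bit belongs to
        mags[i % n] |= b << p
    return [m if s == 0 else -m for m, s in zip(mags, signs)]

coeffs = [23, -7, 14, 2]
signs, bits = encode_bitplanes(coeffs, nplanes=5)
for cut in (4, 12, 20):                          # decode ever-longer prefixes
    print(cut, decode_prefix(signs, bits, 5, cut))
# 4  -> [16, 0, 0, 0]     coarse
# 12 -> [20, -4, 12, 0]   refined
# 20 -> [23, -7, 14, 2]   exact
```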

Journal ArticleDOI
J.M. Shapiro
TL;DR: The embedded zerotree wavelet algorithm (EZW) is a simple, yet remarkably effective, image compression algorithm, having the property that the bits in the bit stream are generated in order of importance, yielding a fully embedded code.
Abstract: The embedded zerotree wavelet algorithm (EZW) is a simple, yet remarkably effective, image compression algorithm, having the property that the bits in the bit stream are generated in order of importance, yielding a fully embedded code. The embedded code represents a sequence of binary decisions that distinguish an image from the "null" image. Using an embedded coding algorithm, an encoder can terminate the encoding at any point, thereby allowing a target rate or target distortion metric to be met exactly. Also, given a bit stream, the decoder can cease decoding at any point in the bit stream and still produce exactly the same image that would have been encoded at the bit rate corresponding to the truncated bit stream. In addition to producing a fully embedded bit stream, EZW consistently produces compression results that are competitive with virtually all known compression algorithms on standard test images. Yet this performance is achieved with a technique that requires absolutely no training, no pre-stored tables or codebooks, and no prior knowledge of the image source. The EZW algorithm is based on four key concepts: (1) a discrete wavelet transform or hierarchical subband decomposition, (2) prediction of the absence of significant information across scales by exploiting the self-similarity inherent in images, (3) entropy-coded successive-approximation quantization, and (4) universal lossless data compression which is achieved via adaptive arithmetic coding.

5,559 citations
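
Concept (2), the zerotree, is the heart of EZW: if a coefficient and all of its descendants across scales are insignificant at the current threshold, a single symbol codes the whole tree. The Python sketch below assigns the dominant-pass symbols (POS, NEG, ZTR for zerotree root, IZ for isolated zero) on a toy 4x4 coefficient array; the subordinate (refinement) pass and the adaptive arithmetic coder are omitted, and the simple parent-child indexing is an illustrative convention rather than Shapiro's exact tree.

```python
import numpy as np

# Dominant-pass symbols of EZW: POS/NEG for significant coefficients, ZTR
# (zerotree root) when a coefficient and all descendants are insignificant,
# IZ (isolated zero) otherwise. Here parent (r, c) has children
# (2r, 2c)..(2r+1, 2c+1) at the next finer scale, a simplified convention.

def descendants_max(coeff, r, c):
    """Largest |coefficient| among all descendants of (r, c)."""
    H, W = coeff.shape
    best = 0.0
    for rr in (2 * r, 2 * r + 1):
        for cc in (2 * c, 2 * c + 1):
            if rr < H and cc < W and (rr, cc) != (r, c):   # (0,0) maps to itself
                best = max(best, abs(coeff[rr, cc]),
                           descendants_max(coeff, rr, cc))
    return best

def dominant_symbol(coeff, r, c, T):
    v = coeff[r, c]
    if abs(v) >= T:
        return "POS" if v >= 0 else "NEG"
    return "ZTR" if descendants_max(coeff, r, c) < T else "IZ"

coeff = np.array([[34, -10,  3,  1],
                  [ 8,   6, -2,  0],
                  [ 2,   1, 20,  1],
                  [ 0,  -1,  1,  0]])
for rc in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(rc, dominant_symbol(coeff, *rc, T=16))
# (0, 0) POS; (0, 1) ZTR; (1, 0) ZTR; (1, 1) IZ (descendant 20 is significant)
```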