Author

J. Sementilli

Bio: J. Sementilli is an academic researcher from Lockheed Martin Corporation. The author has contributed to research in the topics of image compression and lossy compression, has an h-index of 1, and has co-authored 1 publication receiving 59 citations.

Papers
Journal ArticleDOI
TL;DR: A new fully scalable image coder based on reversible integer wavelet transforms is presented, and the lossless and lossy performance of these transforms in the proposed coder is investigated; lossless compression is comparable to JPEG-LS.
Abstract: Reversible integer wavelet transforms allow both lossless and lossy decoding using a single bitstream. We present a new fully scalable image coder and investigate the lossless and lossy performance of these transforms in the proposed coder. The lossless compression performance of the presented method is comparable to JPEG-LS. The lossy performance is quite competitive with other efficient lossy compression methods.
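The key property the abstract relies on is that an integer wavelet transform is exactly invertible, so one bitstream serves both lossless and lossy decoding. As an illustrative sketch (not the authors' specific transform), the S-transform, an integer version of the Haar wavelet, shows how rounding inside the transform is undone rather than lost:

```python
# Hypothetical sketch: the S-transform, an integer Haar wavelet step.
# Illustrates reversibility; the paper's coder may use other filters.

def s_transform_forward(x0: int, x1: int) -> tuple[int, int]:
    """Map a pixel pair to (approximation, detail) using only integers."""
    d = x1 - x0          # detail (high-pass) coefficient
    s = x0 + (d >> 1)    # approximation (low-pass); >> 1 is floor division by 2
    return s, d

def s_transform_inverse(s: int, d: int) -> tuple[int, int]:
    """Exactly recover the original pair: the same rounding is recomputed and subtracted."""
    x0 = s - (d >> 1)
    x1 = x0 + d
    return x0, x1

# Round-trip check: sampled 8-bit pairs reconstruct exactly.
assert all(
    s_transform_inverse(*s_transform_forward(a, b)) == (a, b)
    for a in range(0, 256, 17) for b in range(0, 256, 17)
)
```

Because both directions compute the identical floor term, no information is discarded, which is what makes lossless decoding from the same bitstream possible.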

65 citations


Cited by
Journal ArticleDOI
TL;DR: Some of the most significant features of the standard are presented, such as region-of-interest coding, scalability, visual weighting, error resilience and file format aspects, and some comparative results are reported.
Abstract: One of the aims of the standardization committee has been the development of Part I, which could be used on a royalty- and fee-free basis. This is important for the standard to become widely accepted. The standardization process, coordinated by JTC1/SC29/WG1 of the ISO/IEC, has already produced the International Standard (IS) for Part I. In this article the structure of Part I of the JPEG 2000 standard is presented and performance comparisons with established standards are reported. This article is intended to serve as a tutorial for the JPEG 2000 standard. The main application areas and their requirements are given. The architecture of the standard follows, with descriptions of the tiling, multicomponent transformations, wavelet transforms, quantization and entropy coding. Some of the most significant features of the standard are presented, such as region-of-interest coding, scalability, visual weighting, error resilience and file format aspects. Finally, some comparative results are reported and the future parts of the standard are discussed.

1,842 citations

Journal Article
TL;DR: The aim of this paper is to propose a modified high-capacity image steganography technique that depends on wavelet transform with acceptable levels of imperceptibility and distortion in the cover image and a high level of overall security.

Abstract: Steganography is the art and science of concealing information in unremarkable cover media so as not to arouse an eavesdropper's suspicion. It is an application under the information security field. Being classified under information security, steganography is characterized by having a set of measures that rely on strengths and countermeasures (attacks) that are driven by weaknesses and vulnerabilities. Today, computer and network technologies provide easy-to-use communication channels for steganography. The aim of this paper is to propose a modified high-capacity image steganography technique that depends on wavelet transform with acceptable levels of imperceptibility and distortion in the cover image and a high level of overall security.

128 citations

Journal ArticleDOI
01 Nov 2000
TL;DR: In this paper, the authors survey some of the recent advances in lossless compression of continuous-tone images and discuss the modeling paradigms underlying the state-of-the-art algorithms, and the principles guiding their design.
Abstract: In this paper, we survey some of the recent advances in lossless compression of continuous-tone images. The modeling paradigms underlying the state-of-the-art algorithms, and the principles guiding their design, are discussed in a unified manner. The algorithms are described and experimentally compared.

111 citations

Journal ArticleDOI
TL;DR: Criteria for selecting optimal lifting factorizations of the wavelet filter polyphase matrix are proposed, the effects of finite-precision representation of the lifting coefficients on compression performance are analyzed, and a VLSI architecture for the IWT implementation is presented, capable of achieving very high frame rates with moderate gate complexity.
Abstract: This paper deals with the design and implementation of an image transform coding algorithm based on the integer wavelet transform (IWT). First of all, criteria are proposed for the selection of optimal factorizations of the wavelet filter polyphase matrix to be employed within the lifting scheme. The obtained results lead to the IWT implementations with very satisfactory lossless and lossy compression performance. Then, the effects of finite precision representation of the lifting coefficients on the compression performance are analyzed, showing that, in most cases, a very small number of bits can be employed for the mantissa keeping the performance degradation very limited. Stemming from these results, a VLSI architecture is proposed for the IWT implementation, capable of achieving very high frame rates with moderate gate complexity.
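The lifting scheme the abstract refers to factors a wavelet filter into alternating predict and update steps, each of which is trivially invertible in integer arithmetic. A hedged sketch using the reversible 5/3 filter (the one used in JPEG 2000 lossless mode; the paper's optimal factorizations may differ) on a 1-D even-length signal:

```python
# Illustrative sketch of the lifting scheme with the integer 5/3 wavelet.
# Predict: odd samples minus an average of their even neighbours.
# Update: even samples plus a rounded average of neighbouring details.
# Boundaries use symmetric extension, as in JPEG 2000.

def lift_53_forward(x):
    """One level of the 5/3 IWT on an even-length list of ints."""
    n = len(x)
    d = []                                            # high-pass (detail) band
    for i in range(1, n, 2):                          # predict step
        left = x[i - 1]
        right = x[i + 1] if i + 1 < n else x[i - 1]   # symmetric extension
        d.append(x[i] - (left + right) // 2)
    s = []                                            # low-pass (approximation) band
    for i in range(0, n, 2):                          # update step
        dl = d[i // 2 - 1] if i > 0 else d[0]         # symmetric extension
        dr = d[i // 2]
        s.append(x[i] + (dl + dr + 2) // 4)
    return s, d

def lift_53_inverse(s, d):
    """Exact inverse: undo the update step, then undo the predict step."""
    n = 2 * len(s)
    x = [0] * n
    for i in range(0, n, 2):
        dl = d[i // 2 - 1] if i > 0 else d[0]
        dr = d[i // 2]
        x[i] = s[i // 2] - (dl + dr + 2) // 4
    for i in range(1, n, 2):
        left = x[i - 1]
        right = x[i + 1] if i + 1 < n else x[i - 1]
        x[i] = d[i // 2] + (left + right) // 2
    return x
```

Because each lifting step only adds or subtracts a quantity recomputable from the other band, reversibility holds regardless of how coarsely the lifting coefficients are quantized, which is why the paper can truncate the coefficient mantissas with only limited performance loss.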

97 citations

Journal ArticleDOI
TL;DR: A novel scheme of scalable coding for encrypted images is proposed, in which a downsampled subimage and the Hadamard coefficients of each data set are quantized to reduce the data amount, and the original content can be reconstructed at progressively higher resolution as more bitstreams are received.
Abstract: This paper proposes a novel scheme of scalable coding for encrypted images. In the encryption phase, the original pixel values are masked by a modulo-256 addition with pseudorandom numbers that are derived from a secret key. After decomposing the encrypted data into a downsampled subimage and several data sets with a multiple-resolution construction, an encoder quantizes the subimage and the Hadamard coefficients of each data set to reduce the data amount. Then, the data of quantized subimage and coefficients are regarded as a set of bitstreams. At the receiver side, while a subimage is decrypted to provide the rough information of the original content, the quantized coefficients can be used to reconstruct the detailed content with an iteratively updating procedure. Because of the hierarchical coding mechanism, the principal original content with higher resolution can be reconstructed when more bitstreams are received.
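The encryption phase described above can be sketched in a few lines. The modulo-256 masking is taken directly from the abstract; the particular pseudorandom generator (SHA-256 in counter mode) is an assumption, since the paper only specifies pseudorandom numbers derived from a secret key:

```python
# Minimal sketch of the masking step: pixels are hidden by modulo-256
# addition with key-derived pseudorandom bytes. The keystream construction
# (SHA-256 counter mode) is an assumption, not the paper's exact generator.

import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Return n pseudorandom bytes derived deterministically from the key."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(pixels: list[int], key: bytes) -> list[int]:
    """Mask each 8-bit pixel value by modulo-256 addition."""
    ks = keystream(key, len(pixels))
    return [(p + k) % 256 for p, k in zip(pixels, ks)]

def decrypt(cipher: list[int], key: bytes) -> list[int]:
    """Subtract the same keystream modulo 256 to recover the pixels."""
    ks = keystream(key, len(cipher))
    return [(c - k) % 256 for c, k in zip(cipher, ks)]

pixels = [0, 37, 128, 255]
assert decrypt(encrypt(pixels, b"secret"), b"secret") == pixels
```

Note that modulo-256 addition commutes with much of the downstream processing on byte values, which is what lets the encoder quantize and code the encrypted data without knowing the key.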

80 citations