Conference

Data Compression Conference 

About: Data Compression Conference is an academic conference. The conference publishes mainly in the areas of data compression and lossless compression. Over its lifetime, the conference has produced 2737 publications, which have received 40913 citations.


Papers
Proceedings ArticleDOI
31 Mar 1996
TL;DR: LOCO-I as discussed by the authors combines the simplicity of Huffman coding with the compression potential of context models, thus "enjoying the best of both worlds." The algorithm is based on a simple fixed context model, which approaches the capability of the more complex universal context modeling techniques for capturing high-order dependencies.
Abstract: LOCO-I (low complexity lossless compression for images) is a novel lossless compression algorithm for continuous-tone images which combines the simplicity of Huffman coding with the compression potential of context models, thus "enjoying the best of both worlds." The algorithm is based on a simple fixed context model, which approaches the capability of the more complex universal context modeling techniques for capturing high-order dependencies. The model is tuned for efficient performance in conjunction with a collection of (context-conditioned) Huffman codes, which is realized with an adaptive, symbol-wise, Golomb-Rice code. LOCO-I attains, in one pass, and without recourse to the higher complexity arithmetic coders, compression ratios similar or superior to those obtained with state-of-the-art schemes based on arithmetic coding. In fact, LOCO-I is being considered by the ISO committee as a replacement for the current lossless standard in low-complexity applications.

625 citations
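
The abstract above names an adaptive, symbol-wise Golomb-Rice code as the low-complexity stand-in for context-conditioned Huffman tables. As a rough illustration only (not the LOCO-I/JPEG-LS implementation; the residual mapping and the fixed parameter k = 2 are assumptions made here for the example), a Golomb-Rice encoder can be sketched in a few lines of Python:

```python
def rice_encode(n: int, k: int) -> str:
    """Golomb-Rice code for a non-negative integer n with parameter k:
    the quotient n >> k in unary (q ones, then a zero), followed by the
    remainder as a plain k-bit binary number."""
    q, r = n >> k, n & ((1 << k) - 1)
    bits = "1" * q + "0"
    if k > 0:
        bits += format(r, f"0{k}b")
    return bits


def map_residual(e: int) -> int:
    """Fold a signed prediction residual into a non-negative integer
    (0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ...) so it can be Rice-coded."""
    return 2 * e if e >= 0 else -2 * e - 1


# Example: a few prediction residuals coded with k = 2 (k is illustrative;
# in practice the coder adapts k per context to the residual statistics).
for e in [0, -1, 3, -5]:
    print(f"{e:3d} -> {rice_encode(map_residual(e), k=2)}")
```

Golomb-Rice codes are a good match for the roughly geometric distribution of prediction residuals, which is why they can approach the performance of per-context Huffman tables at much lower complexity.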

Proceedings ArticleDOI
29 Mar 1999
TL;DR: This work introduces a new construction and practical framework for tackling the problem of distributed source coding based on the judicious incorporation of channel coding principles into this source coding problem and focuses in this paper on trellis-structured constructions of the framework to illustrate its utility.
Abstract: We address the problem of distributed source coding, i.e. compression of correlated sources that are not co-located and/or cannot communicate with each other to minimize their joint description cost. In this work we tackle the related problem of compressing a source that is correlated with another source which is available only at the decoder. In contrast to prior information-theoretic approaches, we introduce a new construction and practical framework for tackling the problem based on the judicious incorporation of channel coding principles into this source coding problem. We dub our approach as distributed source coding using syndromes (DISCUS). We focus in this paper on trellis-structured constructions of the framework to illustrate its utility. Simulation results confirm the power of DISCUS, opening up a new and exciting constructive playing-ground for the distributed source coding problem. For the distributed coding of correlated i.i.d. Gaussian sources that are noisy versions of each other with "correlation-SNR" in the range of 12 to 20 dB, the DISCUS method attains gains of 7-15 dB in SNR over the Shannon-bound using "naive" independent coding of the sources.

463 citations
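
The syndrome idea behind DISCUS can be illustrated in a much simpler setting than the paper's trellis-structured construction: with a (7,4) Hamming code, the encoder transmits only the 3-bit syndrome of x, and the decoder uses the side information y, assumed here to differ from x in at most one position, to pick the coset member closest to y. The sketch below is a minimal binning example under those assumptions, not the DISCUS construction itself.

```python
import numpy as np

# Parity-check matrix of the (7, 4) Hamming code: column i (1-based) is the
# 3-bit binary representation of i, so a single-bit difference at position i
# shows up as syndrome i.
H = np.array([[int(b) for b in format(i, "03b")] for i in range(1, 8)]).T  # 3 x 7


def encode(x: np.ndarray) -> np.ndarray:
    """Encoder: transmit only the 3-bit syndrome of x (7 bits -> 3 bits)."""
    return H @ x % 2


def decode(s_x: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Decoder: with side information y assumed to differ from x in at most
    one bit, recover x as the member of the coset with syndrome s_x that is
    closest to y."""
    s_diff = (s_x + H @ y) % 2               # syndrome of x XOR y (mod-2 arithmetic)
    pos = int("".join(map(str, s_diff)), 2)  # 0 means x == y
    x_hat = y.copy()
    if pos:
        x_hat[pos - 1] ^= 1                  # flip the single differing bit
    return x_hat


# Example: x and y agree except in one random position; 3 transmitted bits
# suffice to recover all 7 bits of x at the decoder.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 7)
y = x.copy()
y[rng.integers(7)] ^= 1
assert np.array_equal(decode(encode(x), y), x)
```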

Proceedings ArticleDOI
02 Apr 2002
TL;DR: It is shown that turbo codes can come close to the Slepian-Wolf bound in lossless distributed source coding in asymmetric scenario considered and the scheme also performs well for joint source-channel coding.
Abstract: We show that turbo codes can come close to the Slepian-Wolf bound in lossless distributed source coding. In the asymmetric scenario considered, X and Y are statistically dependent signals and X is encoded with no knowledge of Y. However, Y is known as side information at the decoder. We use a system based on turbo codes to send X at a rate close to H(X|Y). We apply our system to binary sequences and simulations show performance close to the information-theoretic limit. For distributed source coding of Gaussian sequences, our results show significant improvement over previous work. The scheme also performs well for joint source-channel coding.

455 citations
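
The rate target H(X|Y) mentioned above is easy to quantify in the binary case: if X and Y are related through a binary symmetric "virtual channel" with crossover probability p (a correlation model assumed here purely for illustration), the Slepian-Wolf limit for coding X with Y known only at the decoder is the binary entropy h(p), rather than the 1 bit/sample needed without side information. A small sketch:

```python
import math


def binary_entropy(p: float) -> float:
    """h(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)


# If X = Y XOR Z with Z ~ Bernoulli(p), then H(X|Y) = h(p): this is the rate
# a Slepian-Wolf code (turbo-based or otherwise) tries to approach for X.
# The p values below are arbitrary examples, not figures from the paper.
for p in (0.01, 0.05, 0.11):
    print(f"p = {p:.2f}  ->  H(X|Y) = {binary_entropy(p):.3f} bits/sample")
```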

Proceedings ArticleDOI
28 Mar 2000
TL;DR: JPEG-2000, as discussed by the authors, is an emerging standard for still image compression that defines the minimum compliant decoder and bitstream syntax, as well as optional, value-added extensions.
Abstract: JPEG-2000 is an emerging standard for still image compression. This paper provides a brief history of the JPEG-2000 standardization process, an overview of the standard, and some description of the capabilities provided by the standard. Part I of the JPEG-2000 standard specifies the minimum compliant decoder, while Part II describes optional, value-added extensions. Although the standard specifies only the decoder and bitstream syntax, in this paper we describe JPEG-2000 from the point of view of encoding. We take this approach, as we believe it is more amenable to a compact description more easily understood by most readers.

391 citations

Proceedings ArticleDOI
24 Mar 2010
TL;DR: Block-based random image sampling is coupled with a projection-driven compressed-sensing recovery that encourages sparsity in the domain of directional transforms simultaneously with a smooth reconstructed image, yielding images with quality that matches or exceeds that produced by a popular, yet computationally expensive, technique which minimizes total variation.
Abstract: Recent years have seen significant interest in the paradigm of compressed sensing (CS) which permits, under certain conditions, signals to be sampled at sub-Nyquist rates via linear projection onto a random basis while still enabling exact reconstruction of the original signal. As applied to 2D images, however, CS faces several challenges including a computationally expensive reconstruction process and the huge memory required to store the random sampling operator. Recently, several fast algorithms have been developed for CS reconstruction, while the latter challenge was addressed by Gan using a block-based sampling operation as well as projection-based Landweber iterations to accomplish fast CS reconstruction while simultaneously imposing smoothing with the goal of improving the reconstructed-image quality by eliminating blocking artifacts. In this technique, smoothing is achieved by interleaving Wiener filtering with the Landweber iterations, a process facilitated by the relatively simple implementation of the Landweber algorithm. In this work, we adopt Gan's basic framework of block-based CS sampling of images coupled with iterative projection-based reconstruction with smoothing. Our contribution lies in that we cast the reconstruction in the domain of recent transforms that feature a highly directional decomposition. These transforms---specifically, contourlets and complex-valued dual-tree wavelets---have shown promise to overcome deficiencies of widely-used wavelet transforms in several application areas. In their application to iterative projection-based CS recovery, we adapt bivariate shrinkage to their directional decomposition structure to provide sparsity-enforcing thresholding, while a Wiener-filter step encourages smoothness of the result. In experimental simulations, we find that the proposed CS reconstruction based on directional transforms outperforms equivalent reconstruction using common wavelet and cosine transforms. Additionally, the proposed technique usually matches or exceeds the quality of total-variation (TV) reconstruction, a popular approach to CS recovery for images whose gradient-based operation also promotes smoothing but runs several orders of magnitude slower than our proposed algorithm.

387 citations
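
The reconstruction loop described above alternates a projection-based Landweber update with sparsity-enforcing thresholding and smoothing. The sketch below keeps that block-based structure but substitutes a plain 2-D DCT hard threshold for the paper's directional transforms, bivariate shrinkage, and Wiener filtering; the block size, subrate, threshold, and iteration count are arbitrary assumptions, so treat it as a schematic rather than the authors' algorithm.

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(1)

B, subrate = 16, 0.3                     # block size and sampling subrate (assumed)
M = int(round(subrate * B * B))
# One random sampling operator reused for every block; orthonormal rows keep
# the unit-step Landweber update stable (it becomes a projection step).
Phi = np.linalg.qr(rng.standard_normal((B * B, M)))[0].T   # M x B^2


def to_blocks(img: np.ndarray) -> np.ndarray:
    h, w = img.shape
    return img.reshape(h // B, B, w // B, B).transpose(0, 2, 1, 3).reshape(-1, B * B)


def from_blocks(blocks: np.ndarray, shape: tuple) -> np.ndarray:
    h, w = shape
    return blocks.reshape(h // B, w // B, B, B).transpose(0, 2, 1, 3).reshape(h, w)


def sample(img: np.ndarray) -> np.ndarray:
    """Block-based CS sampling: project each B x B block with the same Phi."""
    return to_blocks(img) @ Phi.T


def reconstruct(meas: np.ndarray, shape: tuple, iters: int = 50, lam: float = 2.0) -> np.ndarray:
    """Projected-Landweber recovery with a hard threshold in the 2-D DCT domain
    standing in for directional transforms, shrinkage, and Wiener smoothing."""
    x = np.zeros((meas.shape[0], B * B))
    for _ in range(iters):
        x = x + (meas - x @ Phi.T) @ Phi     # Landweber / projection step, per block
        img = from_blocks(x, shape)
        coef = dctn(img, norm="ortho")
        coef[np.abs(coef) < lam] = 0.0       # sparsity-enforcing hard threshold
        img = idctn(coef, norm="ortho")
        x = to_blocks(img)
    return img


# Smoke test on a trivial image (not a benchmark).
truth = np.full((64, 64), 0.5)
rec = reconstruct(sample(truth), truth.shape)
print("max abs error:", float(np.abs(rec - truth).max()))
```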

Performance Metrics
Number of papers from the conference in previous years

Year    Papers
2023    76
2022    99
2021    51
2020    76
2019    111
2018    76