Topic

Data compression

About: Data compression is a research topic. Over the lifetime, 43,644 publications have been published within this topic, receiving 756,576 citations. The topic is also known as: source coding & bit-rate reduction.


Papers

Open access · Journal Article · DOI: 10.1109/TCSVT.2003.815165
Abstract: H.264/AVC is the newest video coding standard of the ITU-T Video Coding Experts Group and the ISO/IEC Moving Picture Experts Group. The main goals of the H.264/AVC standardization effort have been enhanced compression performance and provision of a "network-friendly" video representation addressing "conversational" (video telephony) and "nonconversational" (storage, broadcast, or streaming) applications. H.264/AVC has achieved a significant improvement in rate-distortion efficiency relative to existing standards. This article provides an overview of the technical features of H.264/AVC, describes profiles and applications for the standard, and outlines the history of the standardization process.


  • Fig. 1. Scope of video coding standardization.
  • Fig. 5. Progressive and interlaced frames and fields.
  • Fig. 2. Structure of H.264/AVC video encoder.
  • Fig. 18. Evolution of H.264/AVC from August 1999 to March 2003. Top: QCIF sequence Foreman coded at 10 Hz. Bottom: CIF sequence Tempete coded at 30 Hz. The legend in the top figure indicates the various versions run with typical settings.
  • Fig. 12. Segmentations of the macroblock for motion compensation. Top: segmentation of macroblocks; bottom: segmentation of 8×8 partitions.
  • (+ 6 more figures)

8,302 Citations


Journal Article · DOI: 10.1109/76.499834
Amir Said, William A. Pearlman
Abstract: Embedded zerotree wavelet (EZW) coding, introduced by Shapiro (see IEEE Trans. Signal Processing, vol.41, no.12, p.3445, 1993), is a very effective and computationally simple technique for image compression. We offer an alternative explanation of the principles of its operation, so that the reasons for its excellent performance can be better understood. These principles are partial ordering by magnitude with a set partitioning sorting algorithm, ordered bit plane transmission, and exploitation of self-similarity across different scales of an image wavelet transform. Moreover, we present a new and different implementation based on set partitioning in hierarchical trees (SPIHT), which provides even better performance than our previously reported extension of EZW that surpassed the performance of the original EZW. The image coding results, calculated from actual file sizes and images reconstructed by the decoding algorithm, are either comparable to or surpass previous results obtained through much more sophisticated and computationally complex methods. In addition, the new coding and decoding procedures are extremely fast, and they can be made even faster, with only small loss in performance, by omitting entropy coding of the bit stream by the arithmetic code.
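
As a rough illustration of the ordered bit-plane transmission on which SPIHT builds, the sketch below (a toy example, not the authors' set-partitioning coder) sends integer coefficient magnitudes most-significant bit-plane first, so the stream can be truncated anywhere and still decoded to the best approximation the received bits allow:

```python
# Toy sketch of ordered bit-plane transmission (the idea SPIHT refines with
# set partitioning); signs and entropy coding are ignored for brevity.

def encode_bitplanes(coeffs):
    """Yield (index, bit) pairs, highest bit-plane first."""
    if not coeffs:
        return
    n_planes = max(abs(c) for c in coeffs).bit_length()
    for plane in reversed(range(n_planes)):          # MSB plane first
        for i, c in enumerate(coeffs):
            yield i, (abs(c) >> plane) & 1

def decode_bitplanes(bits, n_coeffs, n_planes):
    """Rebuild magnitudes from however many bits were received."""
    recon = [0] * n_coeffs
    stream = iter(bits)
    for plane in reversed(range(n_planes)):
        for i in range(n_coeffs):
            try:
                _, bit = next(stream)
            except StopIteration:
                return recon                         # truncated stream: best so far
            recon[i] |= bit << plane
    return recon

if __name__ == "__main__":
    coeffs = [57, -3, 12, 0, 7, -25]                 # toy "wavelet coefficient" magnitudes
    n_planes = max(abs(c) for c in coeffs).bit_length()
    bits = list(encode_bitplanes(coeffs))
    # Half the stream gives a coarse embedded approximation; the full stream is exact.
    print(decode_bitplanes(bits[: len(bits) // 2], len(coeffs), n_planes))
    print(decode_bitplanes(bits, len(coeffs), n_planes))
```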


Topics: Set partitioning in hierarchical trees (67%), Data compression (56%), Entropy encoding (56%)

5,812 Citations


Journal Article · DOI: 10.1109/TIT.1977.1055714
Jacob Ziv, A. Lempel
Abstract: A universal algorithm for sequential data compression is presented. Its performance is investigated with respect to a nonprobabilistic model of constrained sources. The compression ratio achieved by the proposed universal code uniformly approaches the lower bounds on the compression ratios attainable by block-to-variable codes and variable-to-block codes designed to match a completely specified source.
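
As a rough sketch of the sliding-window idea behind this scheme (a toy illustration under simplifying assumptions, not the paper's parsing rules or its analysis), the following emits (offset, length, next-character) tokens by greedily matching against the most recent window of the input:

```python
# Toy LZ77-style sliding-window compressor for illustration only.
# Each token is (offset, length, next_char): the longest match found in the
# last `window` characters, followed by the first non-matching character.

def lz77_compress(data: str, window: int = 64):
    tokens, i = [], 0
    while i < len(data):
        best_off, best_len = 0, 0
        start = max(0, i - window)
        for j in range(start, i):                      # candidate match starts
            length = 0
            while (i + length < len(data)
                   and data[j + length] == data[i + length]
                   and length < window):
                length += 1
            if length > best_len:
                best_off, best_len = i - j, length
        nxt = data[i + best_len] if i + best_len < len(data) else ""
        tokens.append((best_off, best_len, nxt))
        i += best_len + 1
    return tokens

def lz77_decompress(tokens):
    out = []
    for off, length, nxt in tokens:
        for _ in range(length):
            out.append(out[-off])                      # copy from the window (overlaps allowed)
        if nxt:
            out.append(nxt)
    return "".join(out)

if __name__ == "__main__":
    text = "abracadabra abracadabra"
    tokens = lz77_compress(text)
    assert lz77_decompress(tokens) == text
    print(tokens)
```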


Topics: Data compression ratio (71%), Lossless compression (64%), Data compression (62%)

5,569 Citations


Journal Article · DOI: 10.1109/78.258085
J. M. Shapiro
Abstract: The embedded zerotree wavelet algorithm (EZW) is a simple, yet remarkably effective, image compression algorithm, having the property that the bits in the bit stream are generated in order of importance, yielding a fully embedded code. The embedded code represents a sequence of binary decisions that distinguish an image from the "null" image. Using an embedded coding algorithm, an encoder can terminate the encoding at any point, thereby allowing a target rate or target distortion metric to be met exactly. Also, given a bit stream, the decoder can cease decoding at any point in the bit stream and still produce exactly the same image that would have been encoded at the bit rate corresponding to the truncated bit stream. In addition to producing a fully embedded bit stream, EZW consistently produces compression results that are competitive with virtually all known compression algorithms on standard test images. Yet this performance is achieved with a technique that requires no training, no pre-stored tables or codebooks, and no prior knowledge of the image source. The EZW algorithm is based on four key concepts: (1) a discrete wavelet transform or hierarchical subband decomposition, (2) prediction of the absence of significant information across scales by exploiting the self-similarity inherent in images, (3) entropy-coded successive-approximation quantization, and (4) universal lossless data compression achieved via adaptive arithmetic coding.
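
Concept (2), the zerotree prediction, can be illustrated with a small sketch: a coefficient is a zerotree root when it and every descendant in the subband tree are insignificant at the current threshold, so the whole subtree can be signalled with a single symbol. The code below is a toy dominant-pass classifier over a hypothetical coefficient tree, not Shapiro's full coder:

```python
# Toy zerotree classification at one threshold; the parent -> children map is a
# made-up hierarchy, not a real wavelet decomposition.

def is_significant(c, threshold):
    return abs(c) >= threshold

def is_zerotree_root(node, coeffs, children, threshold):
    """True if `node` and all of its descendants are insignificant at `threshold`."""
    if is_significant(coeffs[node], threshold):
        return False
    return all(is_zerotree_root(child, coeffs, children, threshold)
               for child in children.get(node, []))

def dominant_pass(coeffs, children, roots, threshold):
    """Classify visited nodes as POS/NEG significant, zerotree root (ZTR), or
    isolated zero (IZ); subtrees under a ZTR are skipped entirely."""
    symbols = []
    stack = list(roots)
    while stack:
        node = stack.pop()
        c = coeffs[node]
        if is_significant(c, threshold):
            symbols.append((node, "POS" if c > 0 else "NEG"))
            stack.extend(children.get(node, []))
        elif is_zerotree_root(node, coeffs, children, threshold):
            symbols.append((node, "ZTR"))              # descendants are not coded
        else:
            symbols.append((node, "IZ"))
            stack.extend(children.get(node, []))
    return symbols

if __name__ == "__main__":
    coeffs = {0: 34, 1: -9, 2: 2, 3: 1, 4: -3, 5: 0, 6: 40, 7: 6}
    children = {0: [1, 2], 1: [3, 4], 2: [5], 6: [7]}  # toy hierarchy
    print(dominant_pass(coeffs, children, roots=[0, 6], threshold=16))
```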


5,503 Citations


Journal Article · DOI: 10.1145/103085.103089
Abstract: For the past few years, a joint ISO/CCITT committee known as JPEG (Joint Photographic Experts Group) has been working to establish the first international compression standard for continuous-tone still images, both grayscale and color. JPEG's proposed standard aims to be generic, to support a wide variety of applications for continuous-tone images. To meet the differing needs of many applications, the JPEG standard includes two basic compression methods, each with various modes of operation. A DCT-based method is specified for "lossy" compression, and a predictive method for "lossless" compression. JPEG features a simple lossy technique known as the Baseline method, a subset of the other DCT-based modes of operation. The Baseline method has been by far the most widely implemented JPEG method to date, and is sufficient in its own right for a large number of applications. This article provides an overview of the JPEG standard, and focuses in detail on the Baseline method.
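
The lossy core of the Baseline method, an 8×8 DCT followed by quantization against a table, can be sketched as below. This is an illustrative fragment only (no zigzag scan, no Huffman entropy coding, no chroma subsampling); the quantization table is the example luminance table commonly associated with the standard:

```python
# Toy sketch of JPEG Baseline's DCT + quantization step, not a complete codec.
import math

# Example luminance quantization table (informative annex of the standard).
Q = [
    [16, 11, 10, 16, 24, 40, 51, 61],
    [12, 12, 14, 19, 26, 58, 60, 55],
    [14, 13, 16, 24, 40, 57, 69, 56],
    [14, 17, 22, 29, 51, 87, 80, 62],
    [18, 22, 37, 56, 68, 109, 103, 77],
    [24, 35, 55, 64, 81, 104, 113, 92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103, 99],
]

def dct_8x8(block):
    """2-D DCT-II of one 8x8 block of level-shifted samples."""
    def c(k):
        return math.sqrt(0.5) if k == 0 else 1.0
    out = [[0.0] * 8 for _ in range(8)]
    for u in range(8):
        for v in range(8):
            s = 0.0
            for x in range(8):
                for y in range(8):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / 16)
                          * math.cos((2 * y + 1) * v * math.pi / 16))
            out[u][v] = 0.25 * c(u) * c(v) * s
    return out

def quantize(coeffs, table):
    """Lossy step: divide each DCT coefficient by its table entry and round."""
    return [[round(coeffs[u][v] / table[u][v]) for v in range(8)] for u in range(8)]

if __name__ == "__main__":
    # A smooth toy block of 8-bit samples, level-shifted by -128 before the DCT.
    pixels = [[(x * 8 + y * 4) % 256 for y in range(8)] for x in range(8)]
    shifted = [[p - 128 for p in row] for row in pixels]
    print(quantize(dct_8x8(shifted), Q))   # most high-frequency entries quantize to zero
```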


Topics: Lossless JPEG (75%), JPEG 2000 (72%), JPEG File Interchange Format (72%)

3,866 Citations


Performance Metrics

Number of papers in the topic in previous years:
  2022: 14
  2021: 810
  2020: 1,231
  2019: 1,322
  2018: 1,259
  2017: 1,293

Top Attributes


Topic's top 5 most impactful authors:
  • Aggelos K. Katsaggelos: 67 papers, 2.6K citations
  • Wen Gao: 59 papers, 1.3K citations
  • David Bull: 55 papers, 625 citations
  • Liang-Gee Chen: 51 papers, 1.2K citations
  • Feng Wu: 50 papers, 1.8K citations

Network Information

Related Topics (5):
  • Image compression: 23K papers, 369.1K citations (94% related)
  • Motion estimation: 31.2K papers, 699K citations (93% related)
  • Video processing: 25.8K papers, 374.5K citations (93% related)
  • Feature extraction: 111.8K papers, 2.1M citations (92% related)
  • Motion compensation: 21.3K papers, 370.6K citations (92% related)