Book Chapter

Coding Theorems for a Discrete Source With a Fidelity Criterion
Institute of Radio Engineers, International Convention Record, vol. 7, 1959.

Claude E. Shannon (Collected Papers, edited by Neil J. A. Sloane and Aaron D. Wyner)
pp. 325-350
TLDR
For a wide class of distortion measures and discrete sources of information there exists a function R(d) (depending on the particular distortion measure and source) which measures the equivalent rate R of the source (in bits per letter produced) when d is the allowed distortion level.
Abstract
Consider a discrete source producing a sequence of message letters from a finite alphabet. A single-letter distortion measure is given by a non-negative matrix (d_ij). The entry d_ij measures the "cost" or "distortion" if letter i is reproduced at the receiver as letter j. The average distortion of a communications system (source-coder-noisy channel-decoder) is taken to be d = Σ_{i,j} P_ij d_ij, where P_ij is the probability of i being reproduced as j. It is shown that there is a function R(d) that measures the "equivalent rate" of the source for a given level of distortion. For coding purposes where a level d of distortion can be tolerated, the source acts like one with information rate R(d). Methods are given for calculating R(d), and various properties discussed. Finally, generalizations to ergodic sources, to continuous sources, and to distortion measures involving blocks of letters are developed.

In this paper a study is made of the problem of coding a discrete source of information, given a fidelity criterion or a measure of the distortion of the final recovered message at the receiving point relative to the actual transmitted message. In a particular case there might be a certain tolerable level of distortion as determined by this measure. It is desired to so encode the information that the maximum possible signaling rate is obtained without exceeding the tolerable distortion level. This work is an expansion and detailed elaboration of ideas presented earlier [1], with particular reference to the discrete case. We shall show that for a wide class of distortion measures and discrete sources of information there exists a function R(d) (depending on the particular distortion measure and source) which measures, in a sense, the equivalent rate R of the source (in bits per letter produced) when d is the allowed distortion level. Methods will be given for evaluating R(d) explicitly in certain simple cases and for evaluating R(d) by a limiting process in more complex cases. The basic results are, roughly, that it is impossible to signal at a rate faster than C/R(d) (source letters per second) over a memoryless channel of capacity C (bits per second) with a distortion measure less than or equal to d. On the other hand, by sufficiently long block codes it is possible to approach as closely as desired the rate C/R(d) with distortion level d. Finally, some particular examples, using error probability per letter of message and other simple distortion measures, are worked out in detail.
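To make the role of R(d) concrete, here is a minimal numerical sketch, not taken from the paper (which predates the algorithm): the Blahut-Arimoto iteration applied to an equiprobable binary source with the error-frequency (Hamming) distortion, for which the closed form R(d) = 1 - H(d) is known. The source distribution, distortion matrix, and the helper name rate_distortion_point are illustrative assumptions.

```python
# Minimal sketch: Blahut-Arimoto iteration for numerically approximating R(d)
# of a memoryless source with a single-letter distortion matrix d_ij.
import numpy as np

def rate_distortion_point(p, d, s, iters=500):
    """Return (distortion, rate in bits/letter) for slope parameter s <= 0.

    p : source letter probabilities, shape (m,)
    d : distortion matrix d[i, j] = cost of reproducing letter i as j, shape (m, n)
    s : slope of the R(d) curve at the returned point (s <= 0)
    """
    m, n = d.shape
    q = np.full(n, 1.0 / n)          # reproduction-letter distribution, start uniform
    A = np.exp(s * d)                # tilting weights exp(s * d_ij)
    for _ in range(iters):
        c = q * A                    # unnormalised conditional q(j | i)
        c /= c.sum(axis=1, keepdims=True)
        q = p @ c                    # updated reproduction distribution
    D = np.sum(p[:, None] * c * d)                    # average distortion
    R = np.sum(p[:, None] * c * np.log2(c / q))       # mutual information in bits
    return D, R

# Example: binary source P(0) = P(1) = 0.5 with Hamming (error-frequency) distortion.
p = np.array([0.5, 0.5])
d = np.array([[0.0, 1.0],
              [1.0, 0.0]])
for s in (-8.0, -4.0, -2.0, -1.0):
    D, R = rate_distortion_point(p, d, s)
    print(f"s={s:5.1f}  d={D:.3f}  R(d)={R:.3f}")
```

Sweeping the slope parameter s traces the R(d) curve parametrically; for this source the printed points can be checked against the closed form 1 - H(d).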


Citations
Book

Wireless Communications

Proceedings Article

Wireless communications

TL;DR: This book aims to provide a chronology of the key events and individuals in the development of microelectronics technology over the past 50 years, identifying and naming some of those involved.
Book

Introduction to data compression

TL;DR: The author explains the development of the Huffman coding algorithm and some of the techniques used in its implementation, as well as some of its applications, including image compression based on the JBIG standard.
Journal ArticleDOI

Feedback Control Under Data Rate Constraints: An Overview

TL;DR: In this article, the authors review the results available in the literature on data-rate-limited control for linear systems and show how fundamental tradeoffs between the data rate and control goals, such as stability, mean entry times, and asymptotic state norms, emerge naturally.
Journal ArticleDOI

Rate-distortion methods for image and video compression

TL;DR: An overview of rate-distortion (R-D) based optimization techniques and their practical application to image and video coding is provided and two popular techniques for resource allocation are introduced, namely, Lagrangian optimization and dynamic programming.
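As a hedged illustration of the Lagrangian technique named in this summary (a toy sketch under assumed per-block operating points, not code from the cited paper): each coding unit independently picks the (rate, distortion) operating point minimizing D + λR, and sweeping λ trades total rate against total distortion.

```python
# Toy sketch of Lagrangian rate-distortion bit allocation: each unit picks the
# operating point minimising D + lambda * R; sweeping lambda trades rate for distortion.
from typing import List, Tuple

Point = Tuple[float, float]  # (rate in bits, distortion)

def allocate(units: List[List[Point]], lam: float) -> Tuple[float, float]:
    """Independently choose, for each unit, the (R, D) point minimising D + lam * R."""
    total_rate = total_dist = 0.0
    for points in units:
        rate, dist = min(points, key=lambda rd: rd[1] + lam * rd[0])
        total_rate += rate
        total_dist += dist
    return total_rate, total_dist

# Hypothetical operating points for three coding units (e.g. image blocks).
units = [
    [(1.0, 9.0), (2.0, 4.0), (4.0, 1.0)],
    [(0.5, 6.0), (1.5, 3.0), (3.0, 0.5)],
    [(1.0, 12.0), (2.5, 5.0), (5.0, 2.0)],
]
for lam in (0.5, 1.5, 4.0):
    R, D = allocate(units, lam)
    print(f"lambda={lam:4.1f}  total rate={R:4.1f} bits  total distortion={D:5.1f}")
```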