Journal ArticleDOI

Error exponent for source coding with a fidelity criterion

K. Marton
01 Mar 1974 - Vol. 20, Iss. 2, pp. 197-199
Abstract
For discrete memoryless sources with a single-letter fidelity criterion, we study the probability of the event that the distortion exceeds a level d, if for large block length the best code of given rate R > R(d) is used. Lower and upper exponential bounds are obtained, giving the asymptotically exact exponent, except possibly for a countable set of R values.
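For reference, the exponent studied here (often called Marton's exponent in later literature) is usually stated in the following form; the notation below is a standard reconstruction, not quoted from the paper itself:

```latex
% Sketch of the excess-distortion exponent (standard notation assumed):
% P is the source distribution, R(Q, d) the rate-distortion function of a
% candidate distribution Q, and D(Q \| P) the relative entropy (divergence).
F(R, d) = \min_{Q \,:\, R(Q, d) > R} D(Q \,\|\, P),
\qquad
\Pr\{ d(X^n, \hat{X}^n) > d \} \approx e^{-n\, F(R, d)} .
```

Intuitively, the dominant error event is that the source behaves like the closest (in divergence) distribution Q whose rate-distortion function exceeds the available rate R.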


Citations
Book

A First Course in Information Theory

TL;DR: This book provides the first comprehensive treatment of the theory of I-Measure, network coding theory, Shannon-type and non-Shannon-type information inequalities, and a relation between entropy and group theory.
Journal ArticleDOI

The method of types [information theory]

TL;DR: While the method of types is suitable primarily for discrete memoryless models, its extensions to certain models with memory are also discussed, and a wide selection of further applications are surveyed.
Journal ArticleDOI

Fixed-Length Lossy Compression in the Finite Blocklength Regime

TL;DR: For stationary memoryless sources with separable distortion, the minimum achievable rate is shown to be closely approximated by an expression involving the standard Gaussian complementary cumulative distribution function.
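A minimal sketch of how such a Gaussian (dispersion-style) approximation is evaluated, assuming the standard form R(n, d, ε) ≈ R(d) + sqrt(V(d)/n) · Q⁻¹(ε); the numerical values of R(d) and V(d) below are hypothetical placeholders, not taken from the cited paper:

```python
from math import sqrt
from statistics import NormalDist

def gaussian_rate_approx(rd: float, vd: float, n: int, eps: float) -> float:
    """Dispersion-style approximation to the minimum coding rate at
    blocklength n and excess-distortion probability eps.

    rd -- rate-distortion function value R(d), assumed given
    vd -- rate-dispersion value V(d), assumed given
    """
    # Q^{-1}(eps) = Phi^{-1}(1 - eps): inverse complementary Gaussian CDF
    q_inv = NormalDist().inv_cdf(1.0 - eps)
    return rd + sqrt(vd / n) * q_inv

# Hypothetical numbers: R(d) = 0.5, V(d) = 0.1, blocklength 1000, eps = 0.01
rate = gaussian_rate_approx(0.5, 0.1, 1000, 0.01)
```

As the blocklength n grows, the penalty term shrinks like 1/sqrt(n), recovering the asymptotic rate-distortion limit R(d).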
Journal ArticleDOI

Guessing subject to distortion

TL;DR: This work investigates the problem of guessing a random vector X within distortion level D and proposes an asymptotically optimal guessing scheme that is universal both with respect to the information source and the value of ρ.
Posted Content

Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities

TL;DR: In this article, a unified treatment of single and multi-user problems in Shannon's information theory is presented, where the error probabilities for various problems are bounded above by a non-vanishing constant and the spotlight is shone on achievable coding rates as functions of the growing blocklength.
References
Book

Information Theory and Reliable Communication

TL;DR: This chapter discusses Coding for Discrete Sources, Techniques for Coding and Decoding, and Source Coding with a Fidelity Criterion.
Journal ArticleDOI

Lower bounds to error probability for coding on discrete memoryless channels. II

TL;DR: The paper is presented in two parts: the first, appearing here, summarizes the major results and treats the case of high transmission rates in detail; the second, to appear in the subsequent issue, treats the cases of low transmission rates.
Journal ArticleDOI

Hypothesis testing and information theory

TL;DR: The testing of binary hypotheses is developed from an information-theoretic point of view, and the asymptotic performance of optimum hypothesis testers is derived in exact analogy to the asymptotic performance of optimum channel codes.
Journal ArticleDOI

A coding theorem for discrete-time sources

TL;DR: A new derivation of the source coding theorem for discrete-time sources is presented and it is shown that the classical random coding exponent also emerges as a critical quantity for source coding.