Journal ArticleDOI

A proof of the data compression theorem of Slepian and Wolf for ergodic sources (Corresp.)

Thomas M. Cover
01 Mar 1975 · Vol. 21, Iss. 2, pp. 226-228
TLDR
It is established that the Slepian-Wolf theorem is true without change for arbitrary ergodic processes \{(X_i,Y_i)\}_{i=1}^{\infty} and countably infinite alphabets.
Abstract
If \{(X_i, Y_i)\}_{i=1}^{\infty} is a sequence of independent identically distributed discrete random pairs with (X_i, Y_i) \sim p(x,y), Slepian and Wolf have shown that the X process and the Y process can be separately described to a common receiver at rates R_X and R_Y bits per symbol if R_X + R_Y > H(X,Y), R_X > H(X \mid Y), R_Y > H(Y \mid X). A simpler proof of this result is given. As a consequence, it is established that the Slepian-Wolf theorem holds without change for arbitrary ergodic processes \{(X_i,Y_i)\}_{i=1}^{\infty} and countably infinite alphabets. The extension to an arbitrary number of processes is immediate.
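The rate conditions in the abstract can be checked numerically for any joint pmf. The sketch below, using a hypothetical joint distribution (the numbers are illustrative, not from the paper), computes H(X,Y), H(X \mid Y), and H(Y \mid X) via the chain rule and verifies that a corner point of the Slepian-Wolf region satisfies all three inequalities:

```python
import numpy as np

# Hypothetical joint pmf p(x, y) for two correlated binary sources
# (illustrative values only; not taken from the paper).
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])

def H(probs):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    probs = np.asarray(probs).flatten()
    probs = probs[probs > 0]
    return -np.sum(probs * np.log2(probs))

H_XY = H(p)                  # joint entropy H(X,Y)
H_X = H(p.sum(axis=1))       # marginal entropy H(X)
H_Y = H(p.sum(axis=0))       # marginal entropy H(Y)

# Chain rule: H(X|Y) = H(X,Y) - H(Y), H(Y|X) = H(X,Y) - H(X)
H_X_given_Y = H_XY - H_Y
H_Y_given_X = H_XY - H_X

# A corner point of the Slepian-Wolf region: describe X at its
# conditional rate and let Y carry the remainder of the joint entropy.
R_X, R_Y = H_X_given_Y, H_XY - H_X_given_Y

achievable = (R_X >= H_X_given_Y and
              R_Y >= H_Y_given_X and
              R_X + R_Y >= H_XY)
```

For this pmf the corner point sits exactly on the boundary R_X + R_Y = H(X,Y), illustrating why separate encoding loses nothing in sum rate compared to joint encoding.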


Citations
Journal ArticleDOI

Capacity theorems for the relay channel

TL;DR: In this article, the capacity of the Gaussian relay channel was investigated, and a lower bound on the capacity of the general relay channel was established, where the dependence of the received symbols y and y_1 on the inputs x and x_1 is given by p(y, y_1 \mid x, x_1). In particular, the authors proved that if y is a degraded form of y_1, then C = \max_{p(x,x_1)} \min \{ I(X,X_1;Y),\, I(X;Y_1 \mid X_1) \}.

Capacity theorems for the relay channel

TL;DR: An achievable lower bound on the capacity of the general relay channel is established; superposition block Markov encoding is used to show achievability of C, and converses are established.
Book

Network Information Theory

TL;DR: In this article, a comprehensive treatment of network information theory and its applications is provided, which provides the first unified coverage of both classical and recent results, including successive cancellation and superposition coding, MIMO wireless communication, network coding and cooperative relaying.
Journal ArticleDOI

A new achievable rate region for the interference channel

TL;DR: A new achievable rate region for the general interference channel which extends previous results is presented and evaluated and the capacity of a class of Gaussian interference channels is established.
Journal ArticleDOI

Distributed source coding using syndromes (DISCUS): design and construction

TL;DR: This work addresses the problem of compressing correlated distributed sources, i.e., correlated sources which are not co-located or which cannot cooperate to directly exploit their correlation and provides a constructive practical framework based on algebraic trellis codes dubbed as DIstributed Source Coding Using Syndromes (DISCUS), that can be applicable in a variety of settings.
References
Journal ArticleDOI

A mathematical theory of communication

TL;DR: This final installment of the paper considers the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.
Journal ArticleDOI

Noiseless coding of correlated information sources

TL;DR: The minimum number of bits per character R_X and R_Y needed to encode these sequences so that they can be faithfully reproduced under a variety of assumptions regarding the encoders and decoders is determined.
Journal ArticleDOI

Source coding with side information and a converse for degraded broadcast channels

TL;DR: In Section II of the paper, a characterization of the capacity region for degraded broadcast channels (DBC's) is given, which was conjectured by Bergmans and is somewhat sharper than the one obtained by Gallager.
Journal ArticleDOI

The Basic Theorems of Information Theory

TL;DR: In this article, the authors describe the mathematical models upon which communication theory is based, and present in some detail an exposition and partial critique of C. E. Shannon's treatment of one such model.