Journal Article

On source coding with side information at the decoder

A.D. Wyner
01 May 1975, Vol. 21, Iss. 3, pp. 294-300
TLDR
The family of rate triples (R_0, R_1, R_2) for which this system can deliver essentially perfect reproductions of \{X_k\} and \{Y_k\} is characterized; the principal result is a characterization of this family via an information-theoretic minimization.
Abstract
Let \{(X_k, Y_k, V_k)\}_{k=1}^{\infty} be a sequence of independent copies of the triple (X,Y,V) of discrete random variables. We consider the following source coding problem with a side information network. This network has three encoders numbered 0, 1, and 2, the inputs of which are the sequences \{V_k\}, \{X_k\}, and \{Y_k\}, respectively. The output of encoder i is a binary sequence of rate R_i, i = 0,1,2. There are two decoders, numbered 1 and 2, whose task is to deliver essentially perfect reproductions of the sequences \{X_k\} and \{Y_k\}, respectively, to two distinct destinations. Decoder 1 observes the outputs of encoders 0 and 1, and decoder 2 observes the outputs of encoders 0 and 2. The sequence \{V_k\} and its binary encoding (by encoder 0) play the role of side information, which is available to the decoders only. We study the characterization of the family of rate triples (R_0,R_1,R_2) for which this system can deliver essentially perfect reproductions (in the usual Shannon sense) of \{X_k\} and \{Y_k\}. The principal result is a characterization of this family via an information-theoretic minimization. Two special cases are of interest. In the first, V = (X,Y), so that the encoding of \{V_k\} contains common information. In the second, Y \equiv 0, so that our problem becomes a generalization of the source coding problem with side information studied by Slepian and Wolf [3].
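As an illustrative aside (not part of the paper itself): the Slepian-Wolf setting that the second special case generalizes admits the well-known achievable-rate bounds R_X \geq H(X|Y), R_Y \geq H(Y|X), R_X + R_Y \geq H(X,Y). The sketch below computes these bounds numerically for a made-up toy joint distribution; the pmf values are invented for illustration only.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of a probability vector (zeros ignored)."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Hypothetical joint pmf of (X, Y) over a 2x2 alphabet (illustrative values)
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])

H_XY = entropy_bits(pxy.ravel())       # joint entropy H(X,Y)
H_X = entropy_bits(pxy.sum(axis=1))    # marginal entropy H(X)
H_Y = entropy_bits(pxy.sum(axis=0))    # marginal entropy H(Y)
H_X_given_Y = H_XY - H_Y               # chain rule: H(X|Y) = H(X,Y) - H(Y)
H_Y_given_X = H_XY - H_X               # chain rule: H(Y|X) = H(X,Y) - H(X)

# Slepian-Wolf bounds: R_X >= H(X|Y), R_Y >= H(Y|X), R_X + R_Y >= H(X,Y)
print(f"R_X >= {H_X_given_Y:.4f}, R_Y >= {H_Y_given_X:.4f}, "
      f"R_X + R_Y >= {H_XY:.4f}")
```

For this symmetric pmf each source alone would need 1 bit per symbol, while the joint-entropy bound on the sum rate is strictly below 2 bits, showing the gain from exploiting the correlation.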


Citations
Journal Article

Cooperative diversity in wireless networks: Efficient protocols and outage behavior

TL;DR: Using distributed antennas, this work develops and analyzes low-complexity cooperative diversity protocols that combat fading induced by multipath propagation in wireless networks, and characterizes their performance in terms of outage events and associated outage probabilities, which measure the robustness of the transmissions to fading.
Journal Article

Capacity theorems for the relay channel

TL;DR: In this article, the capacity of the Gaussian relay channel is investigated, and a lower bound to the capacity of the general relay channel is established, where the dependence of the received symbols upon the inputs is given by p(y, y_1 | x, x_1). In particular, the authors prove that if y is a degraded form of y_1, then C = \max_{p(x,x_1)} \min \{ I(X,X_1;Y),\ I(X;Y_1|X_1) \}.

Capacity theorems for the relay channel

TL;DR: An achievable lower bound to the capacity of the general relay channel is established; superposition block Markov encoding is used to show achievability of C, and converses are established.
Journal Article

The rate-distortion function for source coding with side information at the decoder

TL;DR: The quantity R^\ast(d) is determined, defined as the infimum of rates R such that communication is possible in the above setting at an average distortion level not exceeding d + \varepsilon.
Book

Network Information Theory

TL;DR: In this article, a comprehensive treatment of network information theory and its applications is provided, offering the first unified coverage of both classical and recent results, including successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying.
References
Book

Information Theory and Reliable Communication

TL;DR: This chapter discusses Coding for Discrete Sources, Techniques for Coding and Decoding, and Source Coding with a Fidelity Criterion.
Journal Article

Noiseless coding of correlated information sources

TL;DR: The minimum number of bits per character R_X and R_Y needed to encode these sequences so that they can be faithfully reproduced under a variety of assumptions regarding the encoders and decoders is determined.
Journal Article

The common information of two dependent random variables

TL;DR: The main result of the paper is contained in two theorems which show that C(X;Y) is i) the minimum R_0 such that a sequence of independent copies of (X,Y) can be efficiently encoded into three binary streams W_0, W_1, W_2 with rates R_0, R_1, R_2.
Journal Article

A proof of the data compression theorem of Slepian and Wolf for ergodic sources (Corresp.)

TL;DR: It is established that the Slepian-Wolf theorem is true without change for arbitrary ergodic processes \{(X_i,Y_i)\}_{i=1}^{\infty} and countably infinite alphabets.
Journal Article

Source coding for a simple network

TL;DR: This work considers the problem of source coding subject to a fidelity criterion for a simple network connecting a single source with two receivers via a common channel and two private channels and develops several upper and lower bounds that actually yield a portion of the desired region.