Journal ArticleDOI

The rate-distortion function for source coding with side information at the decoder

TLDR
The quantity $R^*(d)$ is determined, defined as the infimum of rates $R$ such that communication is possible in the above setting at an average distortion level not exceeding $d + \varepsilon$.
Abstract
Let $\{(X_k, Y_k)\}_{k=1}^{\infty}$ be a sequence of independent drawings of a pair of dependent random variables $X, Y$. Let us say that $X$ takes values in the finite set $\mathcal{X}$. It is desired to encode the sequence $\{X_k\}$ in blocks of length $n$ into a binary stream of rate $R$, which can in turn be decoded as a sequence $\{\hat{X}_k\}$, where $\hat{X}_k \in \hat{\mathcal{X}}$, the reproduction alphabet. The average distortion level is $(1/n)\sum_{k=1}^{n} E[D(X_k, \hat{X}_k)]$, where $D(x, \hat{x}) \geq 0$, $x \in \mathcal{X}$, $\hat{x} \in \hat{\mathcal{X}}$, is a preassigned distortion measure. The special assumption made here is that the decoder has access to the side information $\{Y_k\}$. In this paper we determine the quantity $R^*(d)$, defined as the infimum of rates $R$ such that (with $\varepsilon > 0$ arbitrarily small and with suitably large $n$) communication is possible in the above setting at an average distortion level (as defined above) not exceeding $d + \varepsilon$. The main result is that $R^*(d) = \inf [I(X;Z) - I(Y;Z)]$, where the infimum is with respect to all auxiliary random variables $Z$ (which take values in a finite set $\mathcal{Z}$) that satisfy: i) $Y, Z$ conditionally independent given $X$; ii) there exists a function $f: \mathcal{Y} \times \mathcal{Z} \rightarrow \hat{\mathcal{X}}$ such that $E[D(X, f(Y,Z))] \leq d$. Let $R_{X|Y}(d)$ be the rate-distortion function which results when the encoder as well as the decoder has access to the side information $\{Y_k\}$. In nearly all cases it is shown that when $d > 0$, then $R^*(d) > R_{X|Y}(d)$, so that knowledge of the side information at the encoder permits transmission of the $\{X_k\}$ at a given distortion level using a smaller transmission rate. This is in contrast to the situation treated by Slepian and Wolf [5] where, for arbitrarily accurate reproduction of $\{X_k\}$, i.e., $d = \varepsilon$ for any $\varepsilon > 0$, knowledge of the side information at the encoder does not allow a reduction of the transmission rate.
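
To make the main result concrete, here is a minimal numerical sketch (not from the paper) for a doubly symmetric binary source: $X \sim$ Bernoulli(1/2), $Y = X \oplus N$ with $N \sim$ Bernoulli($p$), and Hamming distortion. The auxiliary variable $Z$ is restricted to the output of a BSC($a$) test channel and the decoder is fixed to $f(y, z) = z$, so the search returns an achievable rate (an upper bound on $R^*(d)$) rather than the infimum over all admissible $Z$; the parameter values $p$, $d$, and the grid size are choices made for the example.

```python
# A minimal numerical sketch, not from the paper: it evaluates the objective
# I(X;Z) - I(Y;Z) from the main result for a doubly symmetric binary source
# (X ~ Bernoulli(1/2), Y = X xor N with N ~ Bernoulli(p), Hamming distortion),
# with the auxiliary Z restricted to the output of a BSC(a) test channel and
# the decoder fixed to f(y, z) = z.  The result is an achievable rate, i.e.
# an upper bound on R*(d), not the infimum over all admissible Z.
import numpy as np

def h2(q):
    """Binary entropy in bits, with h2(0) = h2(1) = 0."""
    q = np.clip(q, 1e-12, 1.0 - 1e-12)
    return -q * np.log2(q) - (1.0 - q) * np.log2(1.0 - q)

def wz_achievable_rate(p, d, grid=2001):
    """Grid search over BSC(a) test channels X -> Z with a <= d.

    Y - X - Z form a Markov chain (condition i) of the main result) and X is
    uniform, so
        I(X;Z) = 1 - h2(a),
        I(Y;Z) = 1 - h2(a*(1-p) + (1-a)*p),
    and the objective reduces to h2(a*(1-p) + (1-a)*p) - h2(a).
    The decoder f(y, z) = z gives E[D(X, f(Y,Z))] = a, hence the constraint
    a <= d (condition ii) of the main result).
    """
    best = float("inf")
    for a in np.linspace(0.0, min(d, 0.5), grid):
        rate = h2(a * (1.0 - p) + (1.0 - a) * p) - h2(a)
        best = min(best, rate)
    return best

if __name__ == "__main__":
    p, d = 0.25, 0.10          # example parameters, chosen for illustration
    r_upper = wz_achievable_rate(p, d)
    r_cond = h2(p) - h2(d)     # R_{X|Y}(d) = h2(p) - h2(d) for this source
    print(f"achievable rate without encoder side info: {r_upper:.4f} bit/symbol")
    print(f"R_X|Y(d) with encoder side info:           {r_cond:.4f} bit/symbol")
```

For $p = 0.25$ and $d = 0.1$ the search returns about 0.41 bit per symbol, while the conditional rate-distortion function for this source, $R_{X|Y}(d) = h_2(p) - h_2(d)$, is about 0.34 bit per symbol; the exact $R^*(d)$ lies between these two values and requires the full optimization over $Z$ described in the abstract.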



Citations
Journal ArticleDOI

Cooperative diversity in wireless networks: Efficient protocols and outage behavior

TL;DR: Using distributed antennas, this work develops and analyzes low-complexity cooperative diversity protocols that combat fading induced by multipath propagation in wireless networks, and characterizes their performance in terms of outage events and associated outage probabilities, which measure the robustness of the transmissions to fading.
Journal ArticleDOI

Broadcast channels with confidential messages

TL;DR: Given two discrete memoryless channels (DMCs) with a common input, a single-letter characterization is given of the achievable rate-equivocation triples, where $R_e$ is the equivocation rate, and the related source-channel matching problem is also settled.
Journal ArticleDOI

Cooperative strategies and capacity theorems for relay networks

TL;DR: The capacity results generalize broadly, including to multiantenna transmission with Rayleigh fading, single-bounce fading, certain quasi-static fading problems, cases where partial channel knowledge is available at the transmitters, and cases where local user cooperation is permitted.
Book

Network Information Theory

TL;DR: This book provides a comprehensive treatment of network information theory and its applications, offering the first unified coverage of both classical and recent results, including successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying.
Journal ArticleDOI

A survey on wireless multimedia sensor networks

TL;DR: Existing solutions and open research issues at the application, transport, network, link, and physical layers of the communication protocol stack are investigated, along with possible cross-layer synergies and optimizations.
References
Book

Information Theory and Reliable Communication

TL;DR: This book covers coding for discrete sources, techniques for coding and decoding, and source coding with a fidelity criterion.
Journal ArticleDOI

Noiseless coding of correlated information sources

TL;DR: The minimum numbers of bits per character $R_X$ and $R_Y$ needed to encode correlated source sequences so that they can be faithfully reproduced are determined under a variety of assumptions regarding the encoders and decoders.
Journal ArticleDOI

On source coding with side information at the decoder

TL;DR: The family of rate triples $(R_0, R_1, R_2)$ for which this system can deliver essentially perfect reproductions of $X$ and $Y$ is studied; the principal result is a characterization of this family via an information-theoretic minimization.
Journal ArticleDOI

A conditional entropy bound for a pair of discrete random variables

TL;DR: This paper concerns the function F, its properties, its calculation, and its applications to several problems in information theory.