Proceedings ArticleDOI

On function computation over a cascade network

01 Sep 2012-pp 472-476
TL;DR: The main result is an inner bound to the rate region of this problem which is tight when X - Y - Z forms a Markov chain.
Abstract: A transmitter has access to X, a relay has access to Y, and a receiver has access to Z and wants to compute a given function ƒ(X, Y, Z). How many bits must be transmitted from the transmitter to the relay and from the relay to the receiver so that the latter can reliably recover ƒ(X, Y, Z)? The main result is an inner bound to the rate region of this problem which is tight when X - Y - Z forms a Markov chain.
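The setup can be written out explicitly; the following is a sketch in standard block-coding notation (the encoder/decoder names g_1, g_2, h are mine, not the paper's):

```latex
% Cascade network: transmitter (observes X) -> relay (observes Y) -> receiver (observes Z).
% A length-n code at rates (R_1, R_2) consists of
%   g_1 : \mathcal{X}^n \to \{1, \dots, 2^{nR_1}\}                                  (transmitter -> relay)
%   g_2 : \{1, \dots, 2^{nR_1}\} \times \mathcal{Y}^n \to \{1, \dots, 2^{nR_2}\}    (relay -> receiver)
%   h   : \{1, \dots, 2^{nR_2}\} \times \mathcal{Z}^n \to \text{function estimates}
% and (R_1, R_2) is achievable when the receiver recovers f componentwise with
% vanishing error probability:
\Pr\bigl( h\bigl(g_2(g_1(X^n), Y^n), Z^n\bigr) \neq f^n(X^n, Y^n, Z^n) \bigr)
  \;\longrightarrow\; 0 \quad \text{as } n \to \infty,
% where f^n(x^n, y^n, z^n) = (f(x_1, y_1, z_1), \dots, f(x_n, y_n, z_n)).
```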
Citations
Journal ArticleDOI
TL;DR: This paper establishes the capacity region for a class of source coding function computation setups, where sources of information are available at the nodes of a tree and where a function of these sources must be computed at its root.
Abstract: This paper establishes the capacity region for a class of source coding function computation setups, where sources of information are available at the nodes of a tree and where a function of these sources must be computed at its root. The capacity region holds for any function as long as the sources’ joint distribution satisfies a certain Markov criterion. This criterion is met, in particular, when the sources are independent. This result recovers the capacity regions of several function computation setups. These include the point-to-point communication setting with arbitrary sources, the noiseless multiple access network with conditionally independent sources, and the cascade network with Markovian sources.

32 citations


Cites background from "On function computation over a casc..."

  • ...In addition to multiple access networks, function computation over cascade networks has been investigated in [9], [31], and [37], referenced here in increasing order of generality....

  • ...Tree networks generalize some previously investigated settings including point-to-point [26], multiple access [18], [29], and cascade (relay-assisted) [9], [31], [33], [37], and can be used as backbones for computing functions over general networks [20]....

Posted Content
TL;DR: A general inner bound to the above three-dimensional rate region is provided and shown to be tight in a number of interesting settings: the function is partially invertible, full cooperation, one-round point-to-point communication, two-round point-to-point communication, and cascade.
Abstract: A receiver wants to compute a function of two correlated sources separately observed by two transmitters. One of the transmitters may send a possibly private message to the other transmitter in a cooperation phase before both transmitters communicate to the receiver. For this network configuration this paper investigates both a function computation setup, wherein the receiver wants to compute a given function of the sources exactly, and a rate distortion setup, wherein the receiver wants to compute a given function within some distortion. For the function computation setup, a general inner bound to the rate region is established and shown to be tight in a number of cases: partially invertible functions, full cooperation between transmitters, one-round point-to-point communication, two-round point-to-point communication, and the cascade setup where the transmitters and the receiver are aligned. In particular it is shown that the ratio of the total number of transmitted bits without cooperation and the total number of transmitted bits with cooperation can be arbitrarily large. Furthermore, one bit of cooperation suffices to arbitrarily reduce the amount of information both transmitters need to convey to the receiver. For the rate distortion version, an inner bound to the rate region is exhibited which always includes, and sometimes strictly, the convex hull of Kaspi-Berger's related inner bounds. The strict inclusion is shown via two examples.

12 citations


Cites background from "On function computation over a casc..."

  • ...The general case was recently investigated in [19]....


Proceedings ArticleDOI
01 Jul 2012
TL;DR: In this article, a general inner bound to the above three-dimensional rate region is provided and shown to be tight in a number of interesting settings: the function is partially invertible, full cooperation, one-round point-to-point communication, two-round point-to-point communication, and cascade.
Abstract: A receiver wants to compute a function of two correlated sources separately observed by two transmitters. One of the transmitters is allowed to cooperate with the other transmitter by sending it some data before both transmitters convey information to the receiver. Assuming noiseless communication, what is the minimum number of bits that needs to be communicated by each transmitter to the receiver for a given number of cooperation bits? In this paper, first a general inner bound to the above three dimensional rate region is provided and shown to be tight in a number of interesting settings: the function is partially invertible, full cooperation, one-round point-to-point communication, two-round point-to-point communication, and cascade. Second, the related Kaspi-Berger rate distortion problem is investigated where the receiver now wants to recover the sources within some distortion. By using ideas developed for establishing the above inner bound, a new rate distortion inner bound is proposed. This bound always includes the time sharing of Kaspi-Berger's inner bounds and inclusion is strict in certain cases.

12 citations

Proceedings ArticleDOI
23 Dec 2013
TL;DR: This paper investigates a distributed function computation setting where the underlying network is a rooted directed tree and the root wants to compute a function of the sources of information available at the nodes of the network.
Abstract: This paper investigates a distributed function computation setting where the underlying network is a rooted directed tree and where the root wants to compute a function of the sources of information available at the nodes of the network. The main result provides the rate region for an arbitrary function under the assumption that the sources satisfy a general criterion. This criterion is satisfied, in particular, when the sources are independent.

8 citations


Cites background from "On function computation over a casc..."

  • ...Note that the tree network generalizes many previously investigated settings including point-to-point [6], multiple access [4], [7], and cascade [2], [10], [9]....

Journal ArticleDOI
TL;DR: The average number of bits that need to be conveyed to that end by each sender to the relay and by the relay to the receiver is studied, in the limit of multiple instances.
Abstract: Two remote senders observe $X$ and $Y$ , respectively, and can noiselessly send information via a common relay node to a receiver that observes $Z$ . The receiver wants to compute a function $f(X,Y,Z)$ of these possibly related observations, without error. We study the average number of bits that need to be conveyed to that end by each sender to the relay and by the relay to the receiver, in the limit of multiple instances. We relate these quantities to the entropy region of a probabilistic graph with respect to a Cartesian representation of its vertex set, which we define as a natural extension of graph entropy. General properties and bounds for the graph entropy region are derived, and mapped back to special cases of the distributed computing setup.
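The "natural extension of graph entropy" referred to here builds on Körner's graph entropy; for reference, the standard single-variable definition (my restatement, not a quote from the paper) is:

```latex
% Koerner graph entropy of a graph G with vertex distribution P_X:
H(G, P_X) \;=\; \min_{X \in W \in \Gamma(G)} I(W; X),
% where \Gamma(G) is the set of independent sets of G and the minimum is over
% joint distributions of (W, X) such that W is an independent set containing X.
```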

7 citations


Cites methods from "On function computation over a casc..."

  • ...We write f^n for the n-fold Cartesian product of f....

References
Journal ArticleDOI
17 Sep 1995
TL;DR: It is shown that if only the sender can transmit, the number of bits required is a conditional entropy of a naturally defined graph.
Abstract: A sender communicates with a receiver who wishes to reliably evaluate a function of their combined data. We show that if only the sender can transmit, the number of bits required is a conditional entropy of a naturally defined graph. We also determine the number of bits needed when the communicators exchange two messages. Reference is made to the results of rate distortion in evaluating the function of two random variables.
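The "conditional entropy of a naturally defined graph" in this abstract is what later literature calls conditional graph entropy; a sketch of the usual definition (my restatement):

```latex
% Conditional graph entropy of X given Y, for the characteristic graph G:
% vertices are the support of X; x and x' are connected iff they must be
% distinguished, i.e. f(x, y) \neq f(x', y) for some y with p(x, y)\, p(x', y) > 0.
H_G(X \mid Y) \;=\; \min_{\substack{W - X - Y \\ X \in W \in \Gamma(G)}} I(W; X \mid Y),
% where \Gamma(G) is the set of independent sets of G. The minimum one-message
% rate for computing f(X, Y) at the receiver is then H_G(X | Y).
```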

455 citations

Proceedings ArticleDOI
23 Oct 1995
TL;DR: It is shown that if only the sender can transmit, the number of bits required is a conditional entropy of a naturally defined graph.
Abstract: A sender communicates with a receiver who wishes to reliably evaluate a function of their combined data. We show that if only the sender can transmit, the number of bits required is a conditional entropy of a naturally defined graph. We also determine the number of bits needed when the communicators exchange two messages.

280 citations


"On function computation over a casc..." refers background in this paper

  • ...The probability of each of these two errors is shown to be negligible in [3] for n large enough....


  • ...Therefore, the problem reduces to two point-to-point problems which can be treated independently: a computation problem between the transmitter and the relay whose solution is given by [3], and a classical single source coding problem between the relay and the receiver....


  • ...Definition 4 (Conditional Graph Entropy [3])....


  • ...Later, Viswanathan [7] proposed a general rate region outer bound based on the point-to-point result of Orlitsky and Roche [3] and investigated the case where X − Y − Z forms a Markov chain [6]....

Proceedings ArticleDOI
28 Jun 2009
TL;DR: The general contribution toward understanding the limits of the cascade multiterminal source coding network is in the form of inner and outer bounds on the achievable rate region for satisfying a distortion constraint for an arbitrary distortion function d(x, y, z).
Abstract: We investigate distributed source coding of two correlated sources X and Y where messages are passed to a decoder in a cascade fashion. The encoder of X sends a message at rate R1 to the encoder of Y. The encoder of Y then sends a message to the decoder at rate R 2 based both on Y and on the message it received about X. The decoder's task is to estimate a function of X and Y. For example, we consider the minimum mean squared-error distortion when encoding the sum of jointly Gaussian random variables under these constraints. We also characterize the rates needed to reconstruct a function of X and Y losslessly. Our general contribution toward understanding the limits of the cascade multiterminal source coding network is in the form of inner and outer bounds on the achievable rate region for satisfying a distortion constraint for an arbitrary distortion function d(x, y, z). The inner bound makes use of a balance between two encoding tactics—relaying the information about X and recompressing the information about X jointly with Y. In the Gaussian case, a threshold is discovered for identifying which of the two extreme strategies optimizes the inner bound. Relaying outperforms recompressing the sum at the relay for some rate pairs if the variance of X is greater than the variance of Y.

64 citations


"On function computation over a casc..." refers background or methods in this paper

  • ...This problem was first considered by Cuff, Su, El-Gamal [1] for which they derived the rate region in the case where Z is a constant....


  • ...and this region is actually tight as shown by Cuff, Su, and El-Gamal [1]....


  • ...In the particular case where Z is a constant, the conditions of Theorem 1 become R_X ≥ H_{G_{X|Y}}(X|Y) and R_Y ≥ H(f(X,Y)), and this region is actually tight as shown by Cuff, Su, and El-Gamal [1]. (The notation U − V − W is used whenever the random variables (U, V, W) form a Markov chain.)...

Posted Content
TL;DR: A general inner bound to the above three-dimensional rate region is provided and shown to be tight in a number of interesting settings: the function is partially invertible, full cooperation, one-round point-to-point communication, two-round point-to-point communication, and cascade.
Abstract: A receiver wants to compute a function of two correlated sources separately observed by two transmitters. One of the transmitters may send a possibly private message to the other transmitter in a cooperation phase before both transmitters communicate to the receiver. For this network configuration this paper investigates both a function computation setup, wherein the receiver wants to compute a given function of the sources exactly, and a rate distortion setup, wherein the receiver wants to compute a given function within some distortion. For the function computation setup, a general inner bound to the rate region is established and shown to be tight in a number of cases: partially invertible functions, full cooperation between transmitters, one-round point-to-point communication, two-round point-to-point communication, and the cascade setup where the transmitters and the receiver are aligned. In particular it is shown that the ratio of the total number of transmitted bits without cooperation and the total number of transmitted bits with cooperation can be arbitrarily large. Furthermore, one bit of cooperation suffices to arbitrarily reduce the amount of information both transmitters need to convey to the receiver. For the rate distortion version, an inner bound to the rate region is exhibited which always includes, and sometimes strictly, the convex hull of Kaspi-Berger's related inner bounds. The strict inclusion is shown via two examples.

12 citations


"On function computation over a casc..." refers background in this paper

  • ...Finally, a similar problem has been considered in [5] for the case where Z is a constant and where there is an additional direct link between the transmitter and the receiver....


Proceedings ArticleDOI
13 Jun 2010
TL;DR: A correspondence between the problem of computing functions of data streams while employing limited memory in a standard information-theoretic framework and a functional source coding problem in cascade/line networks is established.
Abstract: We consider the problem of computing functions of data streams while employing limited memory in a standard information-theoretic framework. A streaming system with memory constraint has to observe a collection of sources X 1 ,X 2 ,…,X m sequentially, store synopses of the sources in memory, and compute a function of the sources based on the synopses. We establish a correspondence between this problem and a functional source coding problem in cascade/line networks. For the general functional source coding problem in cascade networks, we derive inner and outer bounds, and for distributions satisfying certain properties, we characterize the achievable rate-region exactly for the computation of any function. As a result of the correspondence we established, this result also characterizes the minimum amount of memory required to compute the function in a streaming system. We briefly discuss the implications of this result for the problem of distinct value computation.

12 citations


"On function computation over a casc..." refers background or methods in this paper

  • ...In this paper, first we extend the scheme used in [7] for arbitrary X, Y, and Z....

  • ...This inner bound is such that RY can always be taken to be equal to the RY lower bound derived by Viswanathan....


  • ...Later, Viswanathan [7] proposed a general rate region outer bound based on the point-to-point result of Orlitsky and Roche [3] and investigated the case where X − Y − Z forms a Markov chain [6]....

  • ...Interestingly, a general formulation of this problem is equivalent to the problem of function computation in a cascade setting as shown by Viswanathan [7]....
