Cascade multiterminal source coding
Summary
I. INTRODUCTION
- Distributed data collection, such as aggregating measurements in a sensor network, has been investigated from many angles [1].
- Computing functions of observations in a network has been considered in various other settings, such as the two-node back-and-forth setting of [2] and the multiple access channel setting in [3].
- In the cascade multiterminal network, the answer breaks down quite intuitively.
B. Rate-Distortion Region
- The goal is for X^n, Y^n, and Z^n to satisfy an average letter-by-letter distortion constraint D with high probability.
- A finite distortion function d(x, y, z) specifies the penalty incurred for any triple (x, y, z).
- The rate-distortion region R for a particular joint source distribution p0(x, y) and distortion function d is the closure of the set of achievable rate-distortion triples, given by EQUATION.
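As a concrete illustration of the constraint above, the average letter-by-letter distortion of a block is just the empirical mean of d over the n positions. The distortion function and sequences below are hypothetical examples, not the paper's specific choices.

```python
# Sketch: average per-letter distortion of a block of length n.
# The distortion function d and the sequences are illustrative assumptions.

def avg_distortion(xs, ys, zs, d):
    """Average letter-by-letter distortion (1/n) * sum_i d(x_i, y_i, z_i)."""
    n = len(xs)
    return sum(d(x, y, z) for x, y, z in zip(xs, ys, zs)) / n

# Example: penalize z whenever it fails to reproduce the binary sum x XOR y.
d_xor = lambda x, y, z: 0 if z == (x ^ y) else 1

xs = [0, 1, 1, 0]
ys = [1, 1, 0, 0]
zs = [1, 0, 0, 0]
print(avg_distortion(xs, ys, zs, d_xor))  # 0.25 (one mismatch in four letters)
```

A rate-distortion pair is then achievable when this empirical average falls below D with high probability as n grows.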
III. GENERAL INNER BOUND
- The cascade multiterminal source coding network presents an interesting dilemma.
- The second encoder could jointly compress the source sequence Y^n along with the auxiliary sequence, treating it as if it were also a random source sequence.
- However, much is known about the auxiliary sequence, such as the codebook it came from, so it can be summarized more efficiently than this approach would allow.
II. PROBLEM SPECIFICS
- This network relates to a channel setting that has been solved in its full generality.
- Berger and Tung [11] first considered the multiterminal source coding problem, where correlated sources are encoded separately with loss.
- The difference is that communication between the source encoders in this network replaces one of the direct channels to the decoder.
- In their setting, the decoder has side information, and the relay has access to a physically degraded version of it.
- Then the authors consider specific cases, such as encoding the sum of jointly Gaussian random variables, computing functions, and even coordinating actions.
IV. GENERAL OUTER BOUND
- Finally, Encoder 1 sends the bin numbers b_u(i) and b_v(j, i) to Encoder 2. Encoder 2 considers all codewords in C_u with bin number b_u(i) and finds that only u^n(i) is ε-jointly typical with y^n with respect to p(y, u).
- Finally, ε can be chosen small enough to satisfy the rate and distortion inequalities.
- The forwarded message keeps its sparse codebook intact, while the decoded and recompressed message enjoys the efficiency that comes with being bundled with Y.
- Theorem 3.1 (Inner bound).
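The binning-and-typicality decoding step above can be sketched directly: codewords are assigned bin numbers, and the decoder searches its bin for the unique codeword whose empirical joint type with y^n is close to p(y, u). The codebook, pmf, and ε below are illustrative assumptions, not the paper's constructions.

```python
# Sketch of binning-and-typicality decoding. The decoder searches the bin
# b_u(i) for the unique codeword whose empirical joint type with y^n is
# within eps of p(y, u). All concrete values here are assumed for illustration.

def empirical_joint(y_seq, u_seq):
    """Empirical joint type of the pair sequence ((y_1, u_1), ..., (y_n, u_n))."""
    n = len(y_seq)
    counts = {}
    for pair in zip(y_seq, u_seq):
        counts[pair] = counts.get(pair, 0) + 1
    return {pair: c / n for pair, c in counts.items()}

def jointly_typical(u_seq, y_seq, p_yu, eps):
    """True if the empirical type of (y^n, u^n) is within eps of p(y, u)."""
    emp = empirical_joint(y_seq, u_seq)
    support = set(emp) | set(p_yu)
    return all(abs(emp.get(s, 0.0) - p_yu.get(s, 0.0)) <= eps for s in support)

def decode_bin(codebook, bins, bin_number, y_seq, p_yu, eps):
    """Return the unique codeword in the bin jointly typical with y^n, else None."""
    hits = [u for u, b in zip(codebook, bins)
            if b == bin_number and jointly_typical(u, y_seq, p_yu, eps)]
    return hits[0] if len(hits) == 1 else None

# Tiny example: Y = U, so the typical codeword is the one matching y^n.
p_yu = {(0, 0): 0.5, (1, 1): 0.5}
codebook = [(0, 0, 1, 1), (1, 1, 0, 0)]
bins = [0, 0]  # both codewords land in bin 0
print(decode_bin(codebook, bins, 0, (0, 0, 1, 1), p_yu, 0.1))  # (0, 0, 1, 1)
```

With high probability only the true codeword in the bin is jointly typical with y^n, which is what makes the bin number a sufficient description.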
- Due to Markovity, the region has only one significant free parameter. All remaining quantities that define the region R_in are conditioned on U, including the final estimate at the decoder, since U is available to the decoder.
- Therefore, after fixing p(x, y, u), the authors can remove U entirely from the optimization problem by exploiting the idiosyncrasies of the jointly Gaussian distribution.
- This greatly reduces the dimensionality of the problem.
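The reduction rests on a standard fact about Gaussian conditioning: once p(x, y, u) is fixed, the conditional variance given U is a constant that does not depend on the realization of U, so U can be conditioned on and then dropped. A minimal sketch, with assumed illustrative variances and correlation:

```python
import math

# Standard jointly Gaussian conditioning fact: Var(X | U) is a deterministic
# constant, independent of the realization of U. The variances and rho below
# are assumed illustrative values, not the paper's parameters.

def conditional_variance(var_x, var_u, cov_xu):
    """Var(X | U) = Var(X) - Cov(X, U)^2 / Var(U) for jointly Gaussian (X, U)."""
    return var_x - cov_xu ** 2 / var_u

var_x, var_u, rho = 1.0, 1.0, 0.5
cov_xu = rho * math.sqrt(var_x * var_u)
print(conditional_variance(var_x, var_u, cov_xu))  # 0.75
```

Because this quantity is a fixed number rather than a random variable, every U-conditioned term in the region becomes a deterministic function of p(x, y, u), which is what shrinks the optimization.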
2) Outer bound:
- The outer bound Rout is optimized with Gaussian auxiliary random variables.
- The result is the following lower bound on distortion.
- This puts us in the recompress regime of the inner bound.
- From this the authors obtain a piecewise upper bound on the sum-rate-distortion function.
- The two encoding strategies employed are to either forward the message from Encoder 1 to the Decoder, or to use the message to construct an estimate X̂^n at Encoder 2 and then compress the vector sum X̂^n + Y^n and send it to the Decoder, but not both.
- The determining factor for deciding which method to use is a comparison of the rate R1 with a threshold of the form (1/2) log2(·). Case 1:
- The optimal rate region for computing functions of data in the standard multiterminal source coding network is currently an open problem [17].
- This optimization is carefully investigated in [6] and equated to a graph entropy problem.
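Both regimes above trade rate for distortion through the textbook quadratic-Gaussian rate-distortion function, which also produces thresholds of the form (1/2) log2(·). The sketch below uses only these standard relations; the paper's exact threshold expression is not reproduced here.

```python
import math

# Textbook quadratic-Gaussian rate-distortion relations, used only to make the
# kind of threshold comparison described above concrete. This is an assumed
# illustration, not the paper's exact expressions.

def rate_for_distortion(var, d):
    """R(D) = (1/2) log2(var / D) bits/sample for 0 < D <= var, else 0."""
    return 0.0 if d >= var else 0.5 * math.log2(var / d)

def distortion_for_rate(var, r):
    """D(R) = var * 2^(-2R): distortion achievable at rate R bits/sample."""
    return var * 2.0 ** (-2.0 * r)

# A unit-variance source at 1 bit/sample achieves distortion 0.25, and
# conversely distortion 0.25 requires 1 bit/sample.
print(distortion_for_rate(1.0, 1.0))   # 0.25
print(rate_for_distortion(1.0, 0.25))  # 1.0
```

Comparing R1 against such a logarithmic quantity decides whether forwarding the coarse description or recompressing the estimated sum yields the lower end-to-end distortion.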