scispace - formally typeset
Author

D. Vasudevan

Bio: D. Vasudevan is an academic researcher. The author has contributed to research on the topics of communication systems & distortion, has an h-index of 1, and has co-authored 1 publication receiving 48 citations.

Papers
Proceedings Article
01 Jan 2006
TL;DR: Inner and outer bounds on the rate-distortion region are provided for general discrete memoryless sources, and an equivalence is established between the well-known successive refinement problem and Yamamoto's cascade communication system, without relying on their rate-distortion characterizations.
Abstract: We investigate source coding in a cascade communication system consisting of an encoder, a relay and an end terminal, where both the relay and the end terminal wish to reconstruct source $X$ with certain fidelities. Additionally, side-informations $Z$ and $Y$ are available at the relay and the end terminal, respectively. The side-information $Z$ at the relay is a physically degraded version of side-information $Y$ at the end terminal. Inner and outer bounds for the rate-distortion region are provided in this work for general discrete memoryless sources. The rate-distortion region is characterized when the source and side-informations are jointly Gaussian and physically degraded. The doubly symmetric binary source is also investigated and the inner and outer bounds are shown to coincide in certain distortion regimes. A complete equivalence of the rate-distortion region is established between the problem being considered and the side-information scalable source coding problem, when there is no side-information at the relay. As a byproduct, the same equivalence can be established between the well-known successive refinement problem and Yamamoto's cascade communication system, without relying on their rate-distortion characterization.

49 citations
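The Gaussian characterization above rests on the textbook quadratic-Gaussian rate-distortion function R(D) = (1/2) log2(sigma^2 / D). As background, a minimal numerical sketch (the function name is ours, not from the paper):

```python
import math

def gaussian_rd(sigma2: float, d: float) -> float:
    """Quadratic-Gaussian rate-distortion function in bits per sample:
    R(D) = 0.5 * log2(sigma^2 / D) for 0 < D <= sigma^2;
    zero rate suffices once the allowed distortion reaches the variance."""
    if d >= sigma2:
        return 0.0
    return 0.5 * math.log2(sigma2 / d)

# Halving the target distortion costs exactly half a bit per sample.
r_half = gaussian_rd(1.0, 0.5)     # 0.5 bits
r_quarter = gaussian_rd(1.0, 0.25) # 1.0 bits
```

This single-terminal function is the building block that the cascade and side-information regions in the paper generalize.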


Cited by
Book
16 Jan 2012
TL;DR: This book gives a comprehensive treatment of network information theory and its applications, providing the first unified coverage of both classical and recent results, including successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying.
Abstract: This comprehensive treatment of network information theory and its applications provides the first unified coverage of both classical and recent results. With an approach that balances the introduction of new models and new coding techniques, readers are guided through Shannon's point-to-point information theory, single-hop networks, multihop networks, and extensions to distributed computing, secrecy, wireless communication, and networking. Elementary mathematical tools and techniques are used throughout, requiring only basic knowledge of probability, whilst unified proofs of coding theorems are based on a few simple lemmas, making the text accessible to newcomers. Key topics covered include successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying. Also covered are feedback and interactive communication, capacity approximations and scaling laws, and asynchronous and random access channels. This book is ideal for use in the classroom, for self-study, and as a reference for researchers and engineers in industry and academia.

2,442 citations

Journal ArticleDOI
TL;DR: This work asks what dependence can be established among the nodes of a communication network given the communication constraints, and develops elements of a theory of cooperation and coordination in networks.
Abstract: We develop elements of a theory of cooperation and coordination in networks. Rather than considering a communication network as a means of distributing information, or of reconstructing random processes at remote nodes, we ask what dependence can be established among the nodes given the communication constraints. Specifically, in a network with communication rates {Ri,j} between the nodes, we ask what is the set of all achievable joint distributions p(x1, ..., xm) of actions at the nodes of the network. Several networks are solved, including arbitrarily large cascade networks. Distributed cooperation can be the solution to many problems such as distributed games, distributed control, and establishing mutual information bounds on the influence of one part of a physical system on another.

289 citations

01 Jan 2009
TL;DR: This work develops elements of a theory of coordination in networks using tools from information theory and asks for the set of all possible joint distributions p(x1, ..., xm) of actions at the nodes of a network when rate-limited communication is allowed between the nodes.
Abstract: In this work, we develop elements of a theory of coordination in networks using tools from information theory. We ask questions of this nature: If three different tasks are to be performed in a shared effort between three people, but one of them is randomly assigned his responsibility, how much must he tell the others about his assignment? If two players of a multiplayer game wish to collaborate, how should they best use communication to generate their actions? More generally, we ask for the set of all possible joint distributions p(x1, ..., xm) of actions at the nodes of a network when rate-limited communication is allowed between the nodes. Several networks are solved, including arbitrarily large cascade networks. Distributed coordination can be the solution to many problems such as distributed games, distributed control, and establishing mutual information bounds on the physical influence of one part of a system on another.

75 citations
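In the simplest two-node instance of this coordination problem, the communication rate needed to generate actions Y empirically coordinated with a source X is governed by the mutual information I(X; Y). A small sketch computing that quantity for a discrete joint pmf (the helper name is ours):

```python
import math

def mutual_information(pxy):
    """I(X;Y) in bits for a joint pmf given as a dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# Perfectly correlated fair bits: coordinating Y with X takes a full 1 bit.
perfect = {(0, 0): 0.5, (1, 1): 0.5}
# Independent actions need no communication at all.
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
```

The cascade networks solved in the paper generalize this quantity to chains of rate-limited links.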

Proceedings ArticleDOI
28 Jun 2009
TL;DR: The main contribution toward understanding the limits of the cascade multiterminal source coding network takes the form of inner and outer bounds on the achievable rate region for satisfying a distortion constraint with an arbitrary distortion function d(x, y, z).
Abstract: We investigate distributed source coding of two correlated sources X and Y where messages are passed to a decoder in a cascade fashion. The encoder of X sends a message at rate R1 to the encoder of Y. The encoder of Y then sends a message to the decoder at rate R2 based both on Y and on the message it received about X. The decoder's task is to estimate a function of X and Y. For example, we consider the minimum mean squared error distortion when encoding the sum of jointly Gaussian random variables under these constraints. We also characterize the rates needed to reconstruct a function of X and Y losslessly. Our general contribution toward understanding the limits of the cascade multiterminal source coding network is in the form of inner and outer bounds on the achievable rate region for satisfying a distortion constraint for an arbitrary distortion function d(x, y, z). The inner bound makes use of a balance between two encoding tactics: relaying the information about X, and recompressing the information about X jointly with Y. In the Gaussian case, a threshold is discovered for identifying which of the two extreme strategies optimizes the inner bound. Relaying outperforms recompressing the sum at the relay for some rate pairs if the variance of X is greater than the variance of Y.

64 citations
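As a closed-form baseline for the Gaussian example above: the second encoder already knows Y exactly, so if it could pass Y to the decoder losslessly, the only remaining uncertainty in the sum X + Y would be Var(X | Y) = Var(X)(1 - rho^2), with the linear MMSE estimate (1 + rho * sqrt(Var X / Var Y)) * Y. A small sketch of this standard Gaussian fact (the function name is ours):

```python
import math

def sum_estimator_given_y(var_x, var_y, rho):
    """Zero-mean jointly Gaussian (X, Y) with correlation coefficient rho.
    The MMSE estimate of S = X + Y from Y alone is linear:
        E[S | Y] = (1 + rho * sqrt(var_x / var_y)) * Y,
    with residual error Var(X | Y) = var_x * (1 - rho^2).
    Returns (coefficient, mmse)."""
    coef = 1.0 + rho * math.sqrt(var_x / var_y)
    mmse = var_x * (1.0 - rho ** 2)
    return coef, mmse

# Independent X and Y: the best guess of X + Y from Y is just Y itself,
# and the error is all of Var(X).
coef, mmse = sum_estimator_given_y(2.0, 1.0, 0.0)
```

The paper's relaying-versus-recompressing threshold concerns what happens when the link rates R1 and R2 are finite rather than unlimited as in this baseline.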

Proceedings ArticleDOI
13 Jun 2010
TL;DR: In this article, the authors consider the optimality of source-channel separation in networks, and show that such a separation approach is optimal or approximately optimal for a large class of scenarios, namely, when the sources are mutually independent, and each source is needed only at one destination (or at multiple destinations at the same distortion level).
Abstract: We consider the optimality of source-channel separation in networks, and show that such a separation approach is optimal or approximately optimal for a large class of scenarios. More precisely, for lossy coding of memoryless sources in a network, when the sources are mutually independent, and each source is needed only at one destination (or at multiple destinations at the same distortion level), the separation approach is optimal; for the same setting but each source is needed at multiple destinations under a restricted class of distortion measures, the separation approach is approximately optimal, in the sense that the loss from optimum can be upper-bounded. The communication channels in the network are general, including various multiuser channels with finite memory and feedback, the sources and channels can have different bandwidths, and the sources can be present at multiple nodes.

60 citations
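To make the separation baseline concrete in the simplest point-to-point case: separately source-coding a memoryless Gaussian source and channel-coding over a link of capacity C bits per source sample achieves the distortion solving R(D) = C, namely D = sigma^2 * 2^(-2C), which is also the best possible in that setting. A minimal sketch (the function name is ours):

```python
def separation_distortion(sigma2: float, capacity_bits: float) -> float:
    """Distortion achieved by separate source/channel coding of a Gaussian
    source over a channel of capacity C bits per source sample:
    solve R(D) = 0.5 * log2(sigma2 / D) = C for D = sigma2 * 2**(-2C)."""
    return sigma2 * 2.0 ** (-2.0 * capacity_bits)

# One bit of capacity per sample quarters the distortion relative to sigma^2.
d = separation_distortion(1.0, 1.0)
```

The paper's contribution is showing when this separation architecture remains optimal, or provably near-optimal, in general networks rather than a single link.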