Journal ArticleDOI

The CEO problem [multiterminal source coding]

TLDR
There does not exist a finite value of R for which even infinitely many agents can make D arbitrarily small; in this isolated-agents case, the asymptotic behavior of the minimal error frequency is determined in the limit as L and then R tend to infinity.
Abstract
We consider a new problem in multiterminal source coding motivated by the following decentralized communication/estimation task. A firm's Chief Executive Officer (CEO) is interested in the data sequence {X(t)}, t = 1, 2, ..., which cannot be observed directly, perhaps because it represents tactical decisions by a competing firm. The CEO deploys a team of L agents who observe independently corrupted versions of {X(t)}. Because {X(t)} is only one among many pressing matters to which the CEO must attend, the combined data rate at which the agents may communicate information about their observations to the CEO is limited to, say, R bits per second. If the agents were permitted to confer and pool their data, then in the limit as L → ∞ they usually would be able to smooth out their independent observation noises entirely. They could then use their R bits per second to provide the CEO with a representation of {X(t)} with fidelity D(R), where D(·) is the distortion-rate function of {X(t)}. In particular, with such data pooling D can be made arbitrarily small if R exceeds the entropy rate H of {X(t)}. Suppose, however, that the agents are not permitted to convene, Agent i having to send data based solely on his own noisy observations {Y_i(t)}. We show that then there does not exist a finite value of R for which even infinitely many agents can make D arbitrarily small. Furthermore, in this isolated-agents case we determine the asymptotic behavior of the minimal error frequency in the limit as L and then R tend to infinity.
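The pooling claim in the abstract — that conferring agents can average out independent observation noises entirely as L → ∞ — can be illustrated with a minimal simulation. This sketch (not from the paper) assumes a binary source, independent binary symmetric channel noise at each agent, and a majority vote as the pooled estimator; all names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
p = 0.1          # each agent's observation noise (BSC crossover probability)
n = 100_000      # length of the binary source sequence {X(t)}
x = rng.integers(0, 2, n)

errors = {}
for L in (1, 5, 25, 125):
    # Each of the L agents sees an independently corrupted copy of x.
    noise = rng.random((L, n)) < p
    y = x ^ noise
    # Pooled estimate: majority vote over the L noisy observations.
    x_hat = (y.sum(axis=0) > L / 2).astype(int)
    errors[L] = (x_hat != x).mean()
    print(f"L = {L:3d}  error frequency ≈ {errors[L]:.5f}")
```

The error frequency of the pooled estimate falls rapidly toward zero as L grows, which is exactly the regime the isolated-agents result rules out: without pooling, no finite R makes D arbitrarily small even with infinitely many agents.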


Citations
Journal ArticleDOI

Cooperative strategies and capacity theorems for relay networks

TL;DR: The capacity results generalize broadly, including to multiantenna transmission with Rayleigh fading, single-bounce fading, certain quasi-static fading problems, cases where partial channel knowledge is available at the transmitters, and cases where local user cooperation is permitted.
Book

Network Information Theory

TL;DR: In this article, a comprehensive treatment of network information theory and its applications is provided, which provides the first unified coverage of both classical and recent results, including successive cancellation and superposition coding, MIMO wireless communication, network coding and cooperative relaying.
Journal ArticleDOI

Distributed source coding using syndromes (DISCUS): design and construction

TL;DR: This work addresses the problem of compressing correlated distributed sources, i.e., correlated sources which are not co-located or which cannot cooperate to directly exploit their correlation, and provides a constructive practical framework based on algebraic trellis codes, dubbed DIstributed Source Coding Using Syndromes (DISCUS), which is applicable in a variety of settings.
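The syndrome (binning) idea underlying DISCUS can be sketched with a toy example, assuming the (7,4) Hamming code in place of the trellis codes the paper actually uses: the encoder sends only the 3-bit syndrome of a 7-bit source block, and the decoder recovers the block from correlated side information that differs in at most one position. This is an illustrative sketch, not the construction from the paper.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column j is j in binary.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def encode(x):
    """Encoder: send only the 3-bit syndrome of the 7-bit source block x."""
    return H @ x % 2

def decode(s, y):
    """Decoder: find the unique word with syndrome s within Hamming
    distance 1 of the side information y."""
    d = (H @ y + s) % 2                      # syndrome mismatch of y vs. the coset
    if not d.any():
        return y.copy()                      # y itself has syndrome s
    pos = int(''.join(map(str, d)), 2) - 1   # mismatch equals the H column at the error
    x_hat = y.copy()
    x_hat[pos] ^= 1                          # flip the single disagreeing bit
    return x_hat
```

Here 3 bits are transmitted instead of 7, and the correlation with the side information makes up the difference — the same rate saving that Slepian-Wolf theory promises for separate encoding.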
Journal ArticleDOI

Nested linear/lattice codes for structured multiterminal binning

TL;DR: Nested codes are proposed, or more specifically, nested parity-check codes for the binary case and nested lattices in the continuous case, which connect network information theory with the rich areas of linear codes and lattice codes, and have strong potential for practical applications.
Journal ArticleDOI

Distributed source coding for sensor networks

TL;DR: In this article, the authors present an intensive discussion of two distributed source coding (DSC) techniques, namely Slepian-Wolf coding and Wyner-Ziv coding, and show that separate encoding is as efficient as joint encoding for lossless compression.
References
Journal ArticleDOI

Detection with distributed sensors

TL;DR: The extension of classical detection theory to the case of distributed sensors is discussed, based on the theory of statistical hypothesis testing, and theoretical results concerning the form of the optimal decision rule are presented.
Journal ArticleDOI

Entropy and the Central Limit Theorem

TL;DR: A strengthened central limit theorem for densities is established, showing monotone convergence in the sense of relative entropy.
Journal ArticleDOI

Decentralized Detection by a Large Number of Sensors

TL;DR: This work considers the decentralized detection problem, in which N independent, identical sensors transmit a finite-valued function of their observations to a fusion center which then decides which one of M hypotheses is true, and shows how the optimal number of sensors in each group may be determined by solving a mathematical programming problem.
Journal ArticleDOI

Hypothesis testing and information theory

TL;DR: The testing of binary hypotheses is developed from an information-theoretic point of view, and the asymptotic performance of optimum hypothesis testers is analyzed in exact analogy to the asymptotic performance of optimum channel codes.