Capacity theorems for the relay channel

Thomas M. Cover, Abbas El Gamal
IEEE Transactions on Information Theory, Vol. 25, pp. 572-584
TLDR
An achievable lower bound to the capacity of the general relay channel is established; superposition block Markov encoding is used to show achievability of the capacity C, and converses are established.
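For orientation, here is a hedged sketch of how these bounds are usually written in the later literature, with X the sender input, X_1 the relay input, Y_1 the relay observation, and Y the destination output; this is a summary form, not a verbatim transcription of the paper's theorems:

C \geq \max_{p(x,x_1)} \min\{\, I(X,X_1;Y),\ I(X;Y_1 \mid X_1) \,\}   (decode-and-forward lower bound)

C \leq \max_{p(x,x_1)} \min\{\, I(X,X_1;Y),\ I(X;Y,Y_1 \mid X_1) \,\}   (cut-set upper bound)

For the degraded relay channel, degradedness forces I(X;Y,Y_1 \mid X_1) = I(X;Y_1 \mid X_1), so the two expressions coincide and give the capacity.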
About
The article was published on 1979-09-01 and is open access. It has received 3,918 citations to date. The article focuses on the topics Relay channel and Channel capacity.


Citations
Proceedings Article

A Pragmatic Approach to Cooperative Communication

TL;DR: A more pragmatic approach to cooperative diversity, based on modern error-correction coding combined with phase dithering, is introduced; a modulation-constrained information rate analysis for this approach is presented, and it is shown that the loss relative to the ideal cooperative diversity scheme is minimal.
Proceedings Article

Cooperation vs. hierarchy: an information-theoretic comparison

TL;DR: The performance of source cooperation in a multi-access network is compared to that of using a wireless relay, and the use of a relay is shown to be advantageous for a variety of wireless fading channels.
Proceedings Article

Optimizing network-coded cooperative communications via joint session grouping and relay node selection

TL;DR: This paper proposes a distributed, online algorithm that offers a near-optimal solution to the joint grouping and relay node selection problem for NC-CC, and it shows that the distributed algorithm adapts well to online network dynamics.
Proceedings Article

Resource allocation games in interference relay channels

TL;DR: A game arises naturally in interference relay channels when the decode-and-forward (DF) protocol is used, even if the power allocation policy is fixed; this game is induced by each source node's decentralized choice of its degree of cooperation with the relay node.
Journal Article

Source-Channel Coding Theorems for the Multiple-Access Relay Channel

TL;DR: For correlated sources transmitted over fading Gaussian MARCs and MABRCs, the optimality of separation is proved for the first time, and conditions under which separation is optimal are presented.
References
Journal Article

Noiseless coding of correlated information sources

TL;DR: The minimum numbers of bits per character, R_X and R_Y, needed to encode correlated source sequences so that they can be faithfully reproduced are determined under a variety of assumptions regarding the encoders and decoders.
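As a reminder (a standard textbook restatement rather than a quotation from the paper), the Slepian-Wolf region for i.i.d. correlated sources X and Y, encoded separately and decoded jointly, is:

R_X \geq H(X \mid Y), \qquad R_Y \geq H(Y \mid X), \qquad R_X + R_Y \geq H(X,Y).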
Journal Article

Three-terminal communication channels

TL;DR: A coding theorem and a weak converse are proved; a necessary and sufficient condition for positive capacity is derived; and upper and lower bounds on the capacity are obtained, which coincide for channels with symmetric structure.
Journal Article

Source coding with side information and a converse for degraded broadcast channels

TL;DR: A characterization of the capacity region for degraded broadcast channels (DBCs) is given; this characterization was conjectured by Bergmans and is somewhat sharper than the one obtained by Gallager.
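For reference, the degraded broadcast channel capacity region in its usual single-letter form, with auxiliary random variable U and Y_2 a degraded version of Y_1; this follows the standard textbook statement and may differ in notation from the paper:

R_2 \leq I(U;Y_2), \qquad R_1 \leq I(X;Y_1 \mid U), \qquad \text{for some } p(u)\,p(x \mid u).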
Journal Article

A proof of the data compression theorem of Slepian and Wolf for ergodic sources (Corresp.)

TL;DR: It is established that the Slepian-Wolf theorem is true without change for arbitrary ergodic processes \{(X_i,Y_i)\}_{i=1}^{\infty} and countably infinite alphabets.
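In this ergodic setting the Slepian-Wolf conditions are read with per-symbol entropy rates (a hedged paraphrase, not a quotation), e.g.

R_X \geq \lim_{n \to \infty} \tfrac{1}{n} H(X_1,\dots,X_n \mid Y_1,\dots,Y_n),

and symmetrically for R_Y and the sum rate.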
Journal Article

On source coding with side information at the decoder

TL;DR: The family of rate triples (R_0, R_1, R_2) for which this system can deliver essentially perfect reproductions of X and Y is studied, and the principal result is a characterization of this family via an information-theoretic minimization.