
Capacity theorems for the relay channel

Thomas M. Cover, Abbas El Gamal
- IEEE Transactions on Information Theory, Vol. 25, pp. 572-584
TLDR
An achievable lower bound to the capacity of the general relay channel is established; superposition block Markov encoding is used to show achievability of the capacity C, and converses are established.
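For reference, the two central bounds of this paper are usually written as follows. This is a sketch in standard notation (assumed here, not quoted from this page): X is the sender's input, X_1 the relay's input, Y_1 the relay's observation, and Y the destination's observation.

\[
C \ge \max_{p(x,x_1)} \min\bigl\{ I(X,X_1;Y),\; I(X;Y_1 \mid X_1) \bigr\} \quad \text{(block Markov decode-and-forward)}
\]
\[
C \le \max_{p(x,x_1)} \min\bigl\{ I(X,X_1;Y),\; I(X;Y,Y_1 \mid X_1) \bigr\} \quad \text{(commonly called the cut-set bound)}
\]

For the degraded relay channel the two expressions coincide, which is how the paper obtains the capacity C mentioned above.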
About
The article was published on 1979-09-01 and is currently open access. It has received 3,918 citations to date. The article focuses on the following topics: relay channel and channel capacity.


Citations
Proceedings Article

Interference channel with a relay: Models, relaying strategies, bounds

TL;DR: Important scenarios under which signal relaying and interference forwarding are optimal are identified, including cognitive relaying, where the relay is unaware of the source codebooks, and out-of-band relaying; both hold under certain modified strong interference conditions, the latter under additional capacity constraints on the orthogonal relay links.
Journal Article

Single-User Broadcasting Protocols Over a Two-Hop Relay Fading Channel

TL;DR: Numerical results show that, for high signal-to-noise ratios (SNRs), the broadcast approach over an amplify-and-forward (AF) relay may achieve higher throughput gains than other numerically tractable relaying protocols.
Proceedings Article

Source and channel coding for cooperative relaying

TL;DR: The end-to-end distortion achieved by a half-duplex relay system, in which a continuous-amplitude source is transmitted over a quasi-static Rayleigh fading channel, is considered.
Journal Article

On distributed compression of linear functions

TL;DR: It is first shown that the state-of-the-art lattice scheme can be improved by including an additional linear binning stage, and an outer bound on the rate-distortion region and a separate lower bound on the optimal sum rate are established.
Journal Article

Compress-and-Forward Cooperative MIMO Relaying With Full Channel State Information

TL;DR: Simulation results show that CF can outperform decode-and-forward (DF) and approach capacity for realistic SNR values, which validates the performance of the proposed optimization procedure.
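For background, the compress-and-forward (CF) lower bound that this line of work builds on is commonly stated as below; the notation (X sender input, X_1 relay input, Y_1 relay observation, \hat{Y}_1 its quantized description, Y destination observation) is an assumption of this sketch.

\[
C \ge \sup I(X; \hat{Y}_1, Y \mid X_1) \quad \text{subject to} \quad I(X_1;Y) \ge I(Y_1; \hat{Y}_1 \mid X_1, Y),
\]

where the supremum is taken over joint distributions of the form p(x)\,p(x_1)\,p(\hat{y}_1 \mid y_1, x_1).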
References
Journal Article

Noiseless coding of correlated information sources

TL;DR: The minimum numbers of bits per character, R_X and R_Y, needed to encode a pair of correlated source sequences so that they can be faithfully reproduced are determined under a variety of assumptions regarding the encoders and decoders.
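For context, the best-known special case of this result is the Slepian-Wolf rate region for separate encoding and joint decoding of a pair of correlated sources (X, Y); the standard statement, sketched here for orientation rather than quoted from the paper, is

\[
R_X \ge H(X \mid Y), \qquad R_Y \ge H(Y \mid X), \qquad R_X + R_Y \ge H(X,Y).
\]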
Journal Article

Three-terminal communication channels

TL;DR: A coding theorem and a weak converse are proved, a necessary and sufficient condition for positive capacity is derived, and upper and lower bounds on the capacity are obtained; the bounds coincide for channels with symmetric structure.
Journal Article

Source coding with side information and a converse for degraded broadcast channels

TL;DR: In Section II of the paper, a characterization of the capacity region for degraded broadcast channels (DBCs) is given; it was conjectured by Bergmans and is somewhat sharper than the one obtained by Gallager.
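For context, the region in question (Bergmans' conjecture for a degraded broadcast channel X \to Y_1 \to Y_2, with Y_2 a degraded version of Y_1) is, in its standard form, the set of rate pairs (R_1, R_2) satisfying

\[
R_1 \le I(X; Y_1 \mid U), \qquad R_2 \le I(U; Y_2)
\]

for some auxiliary random variable U with U \to X \to Y_1 \to Y_2. This sketches the well-known statement and is not quoted from the cited paper.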
Journal Article

A proof of the data compression theorem of Slepian and Wolf for ergodic sources (Corresp.)

TL;DR: It is established that the Slepian-Wolf theorem is true without change for arbitrary ergodic processes {(X_i, Y_i)}, i = 1, 2, ..., and countably infinite alphabets.
Journal Article

On source coding with side information at the decoder

TL;DR: The principal result is a characterization, via an information-theoretic minimization, of the family of rate triples (R_0, R_1, R_2) for which this system can deliver essentially perfect reproductions of X and Y.