Interactive Schemes for the AWGN Channel with Noisy Feedback
Assaf Ben-Yishai, Ofer Shayevitz
TL;DR
In this article, the problem of communication over an additive white Gaussian noise (AWGN) channel with an AWGN feedback channel is studied, and a low-complexity, low-delay interactive scheme that operates close to capacity for a fixed bit error probability is proposed.

Abstract
We study the problem of communication over an additive white Gaussian noise (AWGN) channel with an AWGN feedback channel. When the feedback channel is noiseless, the classic Schalkwijk–Kailath (S-K) scheme is known to achieve capacity in a simple sequential fashion, while attaining reliability superior to non-feedback schemes. In this paper, we show how simplicity and reliability can be attained even when the feedback is noisy, provided that the feedback channel is sufficiently better than the feedforward channel. Specifically, we introduce a low-complexity, low-delay interactive scheme that operates close to capacity for a fixed bit error probability (e.g., $10^{-6}$). We then build on this scheme to provide two asymptotic constructions, one based on high-dimensional lattices and the other based on concatenated coding, that admit an error exponent significantly exceeding the best possible non-feedback exponent. Our approach is based on the interpretation of feedback transmission as a side-information problem, and employs an interactive modulo-lattice solution.
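The abstract's noiseless-feedback baseline, the Schalkwijk–Kailath scheme, is simple enough to sketch in a short simulation: the transmitter learns the receiver's current estimate over the (here noiseless) feedback link and sends back only the scaled estimation error, which the receiver folds in with an LMMSE update, so the estimation error variance shrinks geometrically per round. This is a minimal illustrative sketch, not the paper's construction; all parameters (power, noise variance, round count, PAM size) are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not values from the paper):
P = 1.0          # average power constraint on the forward channel
sigma2 = 0.1     # forward AWGN noise variance
n_rounds = 15    # total channel uses
M = 16           # PAM constellation size; the message is one of M points

# Message: a PAM point in [-0.5, 0.5)
msg = int(rng.integers(M))
theta = (msg + 0.5) / M - 0.5

# Round 1: transmit the message point, scaled to meet the power constraint
x = np.sqrt(12 * P) * theta                  # Var(theta) = 1/12 for a uniform message
y = x + rng.normal(scale=np.sqrt(sigma2))
theta_hat = y / np.sqrt(12 * P)              # receiver's initial estimate
err_var = sigma2 / (12 * P)                  # variance of theta_hat - theta

# Remaining rounds: the transmitter knows theta_hat via the noiseless
# feedback link and sends the scaled estimation error; the receiver
# applies an LMMSE correction.
for _ in range(n_rounds - 1):
    eps = theta_hat - theta
    x = np.sqrt(P / err_var) * eps           # scale the error to power P
    y = x + rng.normal(scale=np.sqrt(sigma2))
    gain = np.sqrt(P * err_var) / (P + sigma2)   # LMMSE coefficient for eps given y
    theta_hat -= gain * y
    err_var *= sigma2 / (P + sigma2)         # error variance shrinks geometrically

decoded = int(np.floor((theta_hat + 0.5) * M))
```

After 15 rounds the residual error variance is many orders of magnitude below the half-spacing of the constellation, so `decoded` recovers `msg`; the doubly-exponential reliability noted in the abstract comes from this geometric per-round shrinkage compounding over the blocklength.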
Citations
Book ChapterDOI
The Zero Error Capacity of a Noisy Channel
Neil J. A. Sloane, Aaron D. Wyner, et al.
TL;DR: It is shown that while the ordinary capacity of a memoryless channel with feedback is equal to that of the same channel without feedback, the zero-error capacity may be greater; a solution is given to the problem of evaluating C_{0F}.
Journal ArticleDOI
Deepcode: Feedback Codes via Deep Learning
TL;DR: In this article, the Gaussian noise channel with feedback was considered, and the first family of codes obtained via deep learning was presented, which significantly outperformed state-of-the-art codes designed over several decades of research.
References
Book
Information Theory and Reliable Communication
TL;DR: This chapter discusses Coding for Discrete Sources, Techniques for Coding and Decoding, and Source Coding with a Fidelity Criterion.
Book
Network Information Theory
Abbas El Gamal, Young-Han Kim
TL;DR: This book gives a comprehensive treatment of network information theory and its applications, offering the first unified coverage of both classical and recent results, including successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying.
Journal ArticleDOI
The zero error capacity of a noisy channel
TL;DR: It is shown that while the ordinary capacity of a memoryless channel with feedback is equal to that of the same channel without feedback, the zero-error capacity may be greater; a solution is given to the problem of evaluating C_{0F}.
Journal ArticleDOI
Probability of error for optimal codes in a Gaussian channel
TL;DR: Upper and lower bounds are found for the error probability in decoding with optimal codes and decoding systems for a continuous channel with additive Gaussian noise, subject to an average power limitation at the transmitter.
Journal ArticleDOI
A coding scheme for additive noise channels with feedback--I: No bandwidth constraint
J. Schalkwijk, Thomas Kailath
TL;DR: This paper presents a coding scheme that exploits the feedback to achieve considerable reductions in coding and decoding complexity and delay over what would be needed for comparable performance with the best known (simplex) codes for the one-way channel.