Open Access Journal Article (DOI)

Zero-Delay Joint Source-Channel Coding for a Bivariate Gaussian on a Gaussian MAC

TL;DR
Two distributed JSCC schemes for transmission of two correlated Gaussian memoryless sources over a Gaussian Multiple Access Channel are considered: one discrete scheme based on nested scalar quantization, and one hybrid discrete-analog scheme based on a scalar quantizer and a linear continuous mapping.
Abstract
In this paper, delay-free, low-complexity joint source-channel coding (JSCC) for transmission of two correlated Gaussian memoryless sources over a Gaussian Multiple Access Channel (GMAC) is considered. The main contributions of the paper are two distributed JSCC schemes: one discrete scheme based on nested scalar quantization, and one hybrid discrete-analog scheme based on a scalar quantizer and a linear continuous mapping. The proposed schemes show promising performance that improves with increasing correlation, and they are robust against variations in noise level. Both schemes also exhibit a constant gap to the performance upper bound when the channel signal-to-noise ratio becomes large.
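
The schemes are only described at a high level above. As a rough, illustrative aid, the Python sketch below sets up the corresponding toy experiment: a correlated bivariate Gaussian source observed by two separate zero-delay encoders, one using a uniform scalar quantizer and one a purely linear mapping (loosely mirroring the hybrid discrete-analog idea), one use of a Gaussian MAC per source sample, and a simple linear MMSE decoder. It is not the paper's nested-quantization or DQLC scheme; every parameter choice and the decoder are assumptions made for illustration only.

```python
# Toy sketch (not the authors' scheme): two correlated Gaussian sources,
# two separate zero-delay encoders, one shared Gaussian MAC use per sample.
# Encoder 1 uses a uniform scalar quantizer (discrete part), encoder 2 a
# linear/analog mapping. All parameters are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

n = 200_000          # source samples (one channel use each: zero delay)
rho = 0.9            # correlation between the two Gaussian sources
P = 1.0              # per-encoder transmit power constraint
sigma_n = 0.3        # channel noise standard deviation

# Correlated, zero-mean, unit-variance bivariate Gaussian source
cov = np.array([[1.0, rho], [rho, 1.0]])
x1, x2 = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# Encoder 1: uniform scalar quantizer, scaled to meet the power constraint
step = 1.0
q1 = step * np.round(x1 / step)
z1 = np.sqrt(P / np.mean(q1 ** 2)) * q1

# Encoder 2: purely linear (analog) mapping
z2 = np.sqrt(P) * x2

# Gaussian MAC: channel inputs add, plus noise
y = z1 + z2 + sigma_n * rng.standard_normal(n)

# Simple (suboptimal) decoder: linear MMSE estimate of each source from y,
# using empirical second-order statistics
def lmmse(target, obs):
    a = np.mean(target * obs) / np.mean(obs ** 2)
    return a * obs

mse1 = np.mean((x1 - lmmse(x1, y)) ** 2)
mse2 = np.mean((x2 - lmmse(x2, y)) ** 2)
print(f"empirical MSE: source 1 = {mse1:.3f}, source 2 = {mse2:.3f}")
```

Sweeping rho and sigma_n in this sketch gives a rough feel for the qualitative trends the abstract reports: performance improves as the correlation grows and degrades gracefully as the noise level rises.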


Citations
Posted Content

Zero-Delay Joint Source-Channel Coding for a Multivariate Gaussian on a Gaussian MAC

TL;DR: In this article, a nonlinear hybrid discrete-analog JSCC scheme based on distributed quantization and a linear continuous mapping, named the Distributed Quantizer Linear Coder (DQLC), was proposed.
Journal Article (DOI)

Zero-Delay Joint Source-Channel Coding Using Hybrid Digital-Analog Schemes in the Wyner-Ziv Setting

TL;DR: The proposed encoding scheme is based on hybrid digital-analog (HDA) transmission in a zero-delay fashion and results in better performance than the well-known inverse spiral mapping.
Journal Article (DOI)

On Joint Source-Channel Coding for a Multivariate Gaussian on a Gaussian MAC

TL;DR: The main contribution is a zero-delay JSCC scheme named the Distributed Quantizer Linear Coder (DQLC), which performs relatively close to the information-theoretic bounds, improves when the correlation among the sources increases, and does not level off as the signal-to-noise ratio (SNR) becomes large.
Journal Article (DOI)

Low-Delay Joint Source-Channel Mappings for the Gaussian MAC

TL;DR: The bivariate Gaussian multiterminal source coding problem with transmission over the Gaussian multiple-access channel is studied; the use of low-delay joint source-channel mappings is proposed, and it is shown how performance saturation can be overcome by optimizing the mappings.
Journal Article (DOI)

Deterministic Annealing-Based Optimization for Zero-Delay Source-Channel Coding in Networks

TL;DR: A powerful nonconvex optimization method is proposed, based on the concept of deterministic annealing, which is derived from information-theoretic principles and has been successfully employed in several problems including vector quantization, classification, and regression.
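
For readers unfamiliar with the technique named in this TL;DR, the following is a small, generic deterministic-annealing sketch for designing a 4-level scalar quantizer: soft Gibbs assignments at a temperature that is gradually lowered. It is a textbook-style illustration of the general idea only, not the algorithm of the cited paper; the training source, codebook size, cooling schedule, and perturbation step are all assumptions.

```python
# Generic deterministic-annealing (DA) sketch for a 4-level scalar quantizer:
# soft Gibbs assignments at temperature T, centroid updates, gradual cooling,
# and a small perturbation so merged codevectors can split. Illustrative only.

import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(5000)            # training samples from a unit-variance Gaussian
codebook = np.zeros(4)                   # reproduction levels, all starting at the mean

T = 2.0                                  # initial temperature
while T > 1e-3:
    codebook += 1e-3 * rng.standard_normal(codebook.size)   # symmetry-breaking perturbation
    for _ in range(20):                  # fixed-point iterations at this temperature
        d = (x[:, None] - codebook[None, :]) ** 2             # squared-error distortions
        p = np.exp(-(d - d.min(axis=1, keepdims=True)) / T)   # unnormalized Gibbs weights
        p /= p.sum(axis=1, keepdims=True)                      # soft assignments
        codebook = (p * x[:, None]).sum(axis=0) / p.sum(axis=0)  # centroid update
    T *= 0.8                             # cooling schedule

# Evaluate the resulting hard quantizer
idx = np.argmin((x[:, None] - codebook[None, :]) ** 2, axis=1)
mse = float(np.mean((x - codebook[idx]) ** 2))
print("levels:", np.round(np.sort(codebook), 3), " MSE:", round(mse, 4))
```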