Proceedings ArticleDOI

Global Optimality of Encoders and MSE Decoders for Communicating Unstable Markov Processes over Unstable Gaussian Recursive Models with Feedback: A Nonanticipative RDF Approach

07 Jul 2019 - pp. 3057-3061
TL;DR: In this article, the authors derived analogous results for communicating unstable Gaussian Markov processes over unstable MIMO Gaussian recursive models (G-RM) with memory, subject to an average cost of quadratic form.
Abstract: Shannon’s coding capacity of memoryless additive Gaussian noise (AGN) channels with noiseless feedback is known to be achieved by the Elias [1] coding scheme, which communicates a Gaussian RV $X \sim N\left({0,\sigma _X^2}\right)$ by transmitting, at each time, the error of its mean-square error (MSE) estimate formed from past channel outputs, and decodes it using an MSE decoder. Further, it is known that, among all encoders and decoders that minimize the MSE, the Elias encoder and decoder are globally optimal. In this paper we derive analogous results for communicating unstable Gaussian Markov processes over unstable multiple-input multiple-output (MIMO) Gaussian recursive models (G-RM) with memory, often called infinite impulse response (IIR) models, subject to an average cost of quadratic form. Unlike the memoryless AGN channel, however, certain conditions are required for such generalizations. Further, to show global optimality, we need to invoke the nonanticipatory entropy of Gorbunov and Pinsker [2], instead of the classical rate distortion function of the source. Another important observation is that a two-parameter coding scheme is needed, instead of the one-parameter coding scheme of memoryless AGN channels.
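Since the abstract leans on the Elias scheme, the following is a minimal simulation sketch of that scheme for a memoryless AGN channel with noiseless feedback: the transmitter repeatedly sends the (scaled) error of the receiver's running MMSE estimate of a Gaussian RV, and the receiver decodes with the conditional-mean (MSE) decoder. The variable names and parameters (sigma_X2, P, N, n) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the Elias feedback scheme over a memoryless AGN channel.
import numpy as np

rng = np.random.default_rng(0)

sigma_X2 = 1.0   # variance of the source RV X
P = 1.0          # per-use transmit power constraint
N = 1.0          # AGN channel noise variance
n = 10           # number of channel uses
trials = 200_000

X = rng.normal(0.0, np.sqrt(sigma_X2), trials)
X_hat = np.zeros(trials)   # receiver's MMSE estimate (known to the transmitter via feedback)
D = sigma_X2               # current MSE E[(X - X_hat)^2]

for _ in range(n):
    err = X - X_hat                                   # estimation error, known to the transmitter
    Xk = np.sqrt(P / D) * err                         # scale the error to meet the power constraint
    Yk = Xk + rng.normal(0.0, np.sqrt(N), trials)     # channel output
    # conditional-mean (MMSE) update: X_hat += E[err*Yk]/E[Yk^2] * Yk
    X_hat = X_hat + (np.sqrt(P * D) / (P + N)) * Yk
    D = D / (1.0 + P / N)                             # theoretical MSE recursion

print("theoretical MSE :", sigma_X2 / (1.0 + P / N) ** n)
print("empirical MSE   :", np.mean((X - X_hat) ** 2))
```

The MSE contracts by the factor $(1 + P/N)$ at every channel use, which is the familiar exponential decay $\sigma_X^2 (1+P/N)^{-n}$ achieved by the Elias scheme.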
References
Book
01 Oct 1987
TL;DR: Stochastic Processes; Linear Stochastic Systems; Estimation Theory; Stochastic Realization Theory; System Identification: Foundations and Basic Concepts.
Abstract: Stochastic Processes; Linear Stochastic Systems; Estimation Theory; Stochastic Realization Theory; System Identification: Foundations and Basic Concepts; Least Squares Parameter Estimation; Maximum Likelihood Estimation of Gaussian ARMAX and State-Space Systems; Minimum Prediction Error Identification Methods; Non-Stationary System Identification; Feedback, Causality, and Closed-Loop System Identification; Linear-Quadratic Stochastic Control; Stochastic Adaptive Control; Appendix 1: Probability Theory; Appendix 2: System Theory; Appendix 3: Harmonic Analysis.

728 citations


"Global Optimality of Encoders and M..." refers background in this paper

  • ...(c) By the well-known property of the MSE [11], and $\mu^{L,*}(\cdot)$ given in (a), the conditional mean minimizes the MSE....


  • ...(46) Proof: (a), (b) are easily verified from a slight generalization of the Kalman filter equations [11]....

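The excerpts above invoke two standard facts: the conditional mean minimizes the MSE, and in the linear-Gaussian case it is computed by the Kalman filter [11]. Below is a minimal, generic scalar Kalman filter sketch (textbook equations with illustrative parameters a, c, Q, R; not the paper's model or notation).

```python
# Scalar Kalman filter for x_{k+1} = a x_k + w_k, y_k = c x_k + v_k,
# with w_k ~ N(0, Q), v_k ~ N(0, R).  In this linear-Gaussian setting the
# filter output is the conditional mean E[x_k | y_1..y_k], i.e. the MMSE estimate.
import numpy as np

def kalman_filter(y, a=0.9, c=1.0, Q=0.1, R=1.0, x0=0.0, P0=1.0):
    x_hat, P = x0, P0
    estimates = []
    for yk in y:
        # predict
        x_pred = a * x_hat
        P_pred = a * P * a + Q
        # update (conditional mean given the new measurement)
        K = P_pred * c / (c * P_pred * c + R)   # Kalman gain
        x_hat = x_pred + K * (yk - c * x_pred)
        P = (1.0 - K * c) * P_pred
        estimates.append(x_hat)
    return np.array(estimates)

# quick usage on a simulated trajectory
rng = np.random.default_rng(1)
x, xs, ys = 0.0, [], []
for _ in range(200):
    x = 0.9 * x + rng.normal(0.0, np.sqrt(0.1))
    xs.append(x)
    ys.append(x + rng.normal(0.0, 1.0))
print("filter MSE:", np.mean((kalman_filter(np.array(ys)) - np.array(xs)) ** 2))
```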

Journal ArticleDOI
TL;DR: This paper presents a coding scheme that exploits the feedback to achieve considerable reductions in coding and decoding complexity and delay over what would be needed for comparable performance with the best known (simplex) codes for the one-way channel.
Abstract: In some communication problems, it is a good assumption that the channel consists of an additive white Gaussian noise forward link and an essentially noiseless feedback link. In this paper, we study channels where no bandwidth constraint is placed on the transmitted signals. Such channels arise in space communications. It is known that the availability of the feedback link cannot increase the channel capacity of the noisy forward link, but it can considerably reduce the coding effort required to achieve a given level of performance. We present a coding scheme that exploits the feedback to achieve considerable reductions in coding and decoding complexity and delay over what would be needed for comparable performance with the best known (simplex) codes for the one-way channel. Our scheme, which was motivated by the Robbins-Monro stochastic approximation technique, can also be used over channels where the additive noise is not Gaussian but is still independent from instant to instant. An extension of the scheme for channels with limited signal bandwidth is presented in a companion paper (Part II).
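As a companion to the Elias sketch earlier on this page, here is a stylized simulation of the digital-message variant this abstract describes: the message is mapped to one of M points, sent once, and the remaining channel uses iteratively refine the receiver's estimate through the noiseless feedback link before nearest-point decoding. The parameters and mapping (M, P, N, n) are illustrative assumptions, not the exact Schalkwijk-Kailath construction.

```python
# Stylized digital-message feedback scheme in the spirit of Schalkwijk-Kailath.
import numpy as np

rng = np.random.default_rng(0)
M, P, N, n, trials = 16, 1.0, 1.0, 10, 100_000

points = np.linspace(-1.0, 1.0, M)             # message constellation
g = np.sqrt(P / np.mean(points ** 2))          # first-use power scaling

msg = rng.integers(0, M, trials)
theta = points[msg]

# use 1: transmit the message point itself
Y1 = g * theta + rng.normal(0.0, np.sqrt(N), trials)
theta_hat = Y1 / g
D = N / g ** 2                                 # estimation-error variance (Gaussian, independent of theta)

# uses 2..n: Elias-style refinement of the receiver's estimation error
for _ in range(n - 1):
    err = theta_hat - theta                    # known to the transmitter via feedback
    Yk = np.sqrt(P / D) * err + rng.normal(0.0, np.sqrt(N), trials)
    theta_hat = theta_hat - (np.sqrt(P * D) / (P + N)) * Yk
    D = D / (1.0 + P / N)

decoded = np.argmin(np.abs(theta_hat[:, None] - points[None, :]), axis=1)
print("empirical error probability:", np.mean(decoded != msg))
```

Because the residual estimation error shrinks exponentially in the number of uses while the point spacing is fixed, the decoding error probability falls off very rapidly with n.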

579 citations


"Global Optimality of Encoders and M..." refers background in this paper

  • ...the probability of Maximum Likelihood (ML) decoding error at time n decreases doubly exponentially in (n+1), as [4]...


Journal ArticleDOI
TL;DR: An asymptotic equipartition theorem for nonstationary Gaussian processes is proved, and it is shown that the feedback capacity $C_{FB}$ in bits per transmission and the nonfeedback capacity $C$ satisfy $C \le C_{FB} \le C + \tfrac{1}{2}$ and $C_{FB} \le 2C$.
Abstract: The capacity of time-varying additive Gaussian noise channels with feedback is characterized. Toward this end, an asymptotic equipartition theorem for nonstationary Gaussian processes is proved. Then, with the aid of certain matrix inequalities, it is proved that the feedback capacity $C_{FB}$ in bits per transmission and the nonfeedback capacity $C$ satisfy $C \le C_{FB} \le C + \tfrac{1}{2}$ and $C_{FB} \le 2C$.
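For context, the $n$-block characterization underlying this result can be written (in assumed standard notation, not quoted from either paper) as an optimization over strictly lower-triangular feedback matrices $B$ and Gaussian input covariances $K_V \succeq 0$, with noise covariance $K_Z^{(n)}$ and inputs of the form $X^n = B Z^n + V^n$:

$$C_{n,FB} = \max_{\operatorname{tr}\left(B K_Z^{(n)} B^{\top} + K_V\right) \le nP} \; \frac{1}{2n} \log \frac{\det\left((B+I) K_Z^{(n)} (B+I)^{\top} + K_V\right)}{\det K_Z^{(n)}}.$$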

240 citations


"Global Optimality of Encoders and M..." refers methods in this paper

  • ...[10] T. Cover and S. Pombra, “Gaussian feedback capacity,” IEEE Transactions on Information Theory, vol. 35, no. 1, pp. 37–43, Jan. 1989....


  • ...To answer the question posed in Section I, we need the characterization of the $n$-Finite Transmission Feedback Information (FTFI) Capacity (called finite block length in [10]), and the feedback capacity [5], which we describe in this section....


  • ...We should emphasize that Kim [9] applied a one-parameter coding scheme, similar to (4), for the stationary ergodic version of the scalar Cover and Pombra [10] AGN channel with memory on the noise....


Journal ArticleDOI
TL;DR: This result shows that the celebrated Schalkwijk-Kailath coding achieves the feedback capacity for the first-order autoregressive moving-average Gaussian channel, positively answering a long-standing open problem studied by Butman, Tiernan-Schalkwijk, Wolfowitz, Ozarow, Ordentlich, Yang-Kavčić-Tatikonda, and others.
Abstract: The feedback capacity of additive stationary Gaussian noise channels is characterized as the solution to a variational problem in the noise power spectral density. When specialized to the first-order autoregressive moving-average noise spectrum, this variational characterization yields a closed-form expression for the feedback capacity. In particular, this result shows that the celebrated Schalkwijk-Kailath coding achieves the feedback capacity for the first-order autoregressive moving-average Gaussian channel, positively answering a long-standing open problem studied by Butman, Tiernan-Schalkwijk, Wolfowitz, Ozarow, Ordentlich, Yang-Kavčić-Tatikonda, and others. More generally, it is shown that a k-dimensional generalization of the Schalkwijk-Kailath coding achieves the feedback capacity for any autoregressive moving-average noise spectrum of order k. Simply put, the optimal transmitter iteratively refines the receiver's knowledge of the intended message. This development reveals intriguing connections between estimation, control, and feedback communication.
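A hedged sketch of the variational problem mentioned in the abstract, in assumed notation (noise power spectral density $S_Z$, strictly causal feedback filter $B$, average power $P$):

$$C_{FB} = \sup_{B \ \text{strictly causal}} \int_{-\pi}^{\pi} \frac{1}{2}\log\left|1 + B(e^{i\theta})\right|^{2}\, \frac{d\theta}{2\pi} \quad \text{subject to} \quad \int_{-\pi}^{\pi} \left|B(e^{i\theta})\right|^{2} S_Z(e^{i\theta})\, \frac{d\theta}{2\pi} \le P.$$

In this form the transmitter's role reduces to choosing a strictly causal filter of the noise, which is what the $k$-dimensional Schalkwijk-Kailath generalization implements for ARMA($k$) spectra.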

155 citations

Journal ArticleDOI
TL;DR: This well-known but surprising result is explained and simply derived here in terms of a result by Elias (1956) concerning the minimum mean-square distortion achievable in transmitting a single Gaussian random variable over multiple uses of the same Gaussian channel.
Abstract: Schalkwijk and Kailath (1966) developed a class of block codes for Gaussian channels with ideal feedback for which the probability of decoding error decreases as a second-order exponent in block length for rates below capacity. This well-known but surprising result is explained and simply derived here in terms of a result by Elias (1956) concerning the minimum mean-square distortion achievable in transmitting a single Gaussian random variable over multiple uses of the same Gaussian channel. A simple modification of the Schalkwijk-Kailath scheme is then shown to have an error probability that decreases with an exponential order which is linearly increasing with block length. In the infinite bandwidth limit, this scheme produces zero error probability using bounded expected energy at all rates below capacity. A lower bound on error probability for the finite bandwidth case is then derived in which the error probability decreases with an exponential order which is linearly increasing in block length at the same rate as the upper bound.
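A rough sketch of the connection described above, with constants suppressed (an illustration of the mechanism, not the paper's exact bound): Elias' result gives the minimum MSE for conveying a single Gaussian RV $X \sim N(0,\sigma_X^2)$ over $n$ uses of an AGN channel (power $P$, noise variance $N$) with feedback as

$$D_n = \frac{\sigma_X^2}{(1 + P/N)^n} = \sigma_X^2\, 2^{-2nC}, \qquad C = \tfrac{1}{2}\log_2(1 + P/N).$$

Mapping one of $M = 2^{nR}$ messages to points spaced $\Delta \propto 2^{-nR}$ apart, conveying the selected point so that the receiver's final estimate has variance of order $D_n$, and decoding to the nearest point gives an error probability of order $Q\!\left(\Delta/2\sqrt{D_n}\right) \approx Q\!\left(c\, 2^{n(C-R)}\right)$; since $Q(x) \le e^{-x^2/2}$, this decays doubly exponentially in the block length for every rate $R < C$.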

116 citations


"Global Optimality of Encoders and M..." refers background in this paper

  • ...Thus, no other encoder-decoder achieves a smaller MSE [3]....


  • ...Indeed, by (5), and letting $\Delta = \Sigma_n$, then [3]...
