Journal ArticleDOI

Approximation theory of output statistics

TLDR
The notion of resolvability of a channel is introduced: the number of random bits required per channel use to generate an input that approximates the output statistics of any given input process with arbitrary accuracy. A general formula for resolvability is obtained that holds regardless of the channel memory structure.
Abstract
Given a channel and an input process, we study the minimum randomness of those input processes whose output statistics approximate the original output statistics with arbitrary accuracy. We introduce the notion of resolvability of a channel, defined as the number of random bits required per channel use in order to generate an input that achieves arbitrarily accurate approximation of the output statistics for any given input process. We obtain a general formula for resolvability which holds regardless of the channel memory structure. We show that, for most channels, resolvability is equal to Shannon capacity. By-products of our analysis are a general formula for the minimum achievable (fixed-length) source coding rate of any finite-alphabet source, and a strong converse of the identification coding theorem, which holds for any channel that satisfies the strong converse of the channel coding theorem.
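As a quick sketch in standard information-spectrum notation (ours, not quoted from the paper): writing \bar{I}(X;Y) for the sup-information rate, i.e. the limsup in probability of the normalized information density \frac{1}{n}\log\frac{P_{Y^n|X^n}(Y^n|X^n)}{P_{Y^n}(Y^n)}, the general resolvability formula takes the form

S = \sup_{X} \bar{I}(X;Y),

and for channels satisfying the strong converse of the channel coding theorem this supremum coincides with the Shannon capacity, giving S = C.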


Citations
Journal ArticleDOI

A general formula for channel capacity

TL;DR: A formula for the capacity of arbitrary single-user channels without feedback is proved, and capacity is shown to equal the supremum, over all input processes, of the input-output inf-information rate, defined as the liminf in probability of the normalized information density.
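In the same notation as above, the capacity formula described in this TL;DR reads (a sketch; \underline{I}(X;Y) denotes the inf-information rate, the liminf in probability of the normalized information density):

C = \sup_{X} \underline{I}(X;Y),

with the supremum taken over all input processes X and no assumptions of memorylessness, stationarity, or ergodicity.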
Journal ArticleDOI

Min- and Max-Relative Entropies and a New Entanglement Monotone

TL;DR: The spectral divergence rates of the information spectrum approach are shown to be obtained from the smooth min- and max-relative entropies in the asymptotic limit.
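For orientation, the unsmoothed quantities are commonly defined as follows (our statement of standard definitions, not a quotation from the paper): for a state \rho and a positive semidefinite operator \sigma,

D_{\max}(\rho \| \sigma) = \log \min \{ \lambda : \rho \le \lambda\sigma \}, \qquad D_{\min}(\rho \| \sigma) = -\log \mathrm{Tr}(\Pi_\rho \sigma),

where \Pi_\rho projects onto the support of \rho; the smooth versions optimize these over states close to \rho, and their asymptotic limits recover the spectral divergence rates.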
Journal ArticleDOI

Strong converse for identification via quantum channels

TL;DR: In this article, the authors present a simple proof of the strong converse for identification via discrete memoryless quantum channels, based on a novel covering lemma that extends explicit large-deviation estimates to random variables taking values in self-adjoint operators on a Hilbert space.
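Loosely stated, in our notation rather than the paper's: if N(n, \lambda_1, \lambda_2) is the maximal number of messages identifiable over n channel uses with first- and second-kind error probabilities \lambda_1 and \lambda_2, the strong converse asserts that for all fixed \lambda_1 + \lambda_2 < 1,

\limsup_{n \to \infty} \frac{1}{n} \log \log N(n, \lambda_1, \lambda_2) \le C,

so the double-exponential growth rate of identifiable messages cannot exceed the channel capacity C even at constant error probabilities.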
Journal ArticleDOI

Information Spectrum Approach to Second-Order Coding Rate in Channel Coding

TL;DR: In this article, the second-order coding rate of channel coding is discussed for a general sequence of channels, and the optimum second-order transmission rate under a constant error constraint ε is obtained using the information spectrum method.
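For a discrete memoryless channel, results of this type specialize to the Gaussian approximation (a sketch in standard notation, with V the channel dispersion and \Phi the standard normal cdf; these symbols are our assumption, not the paper's):

\log M^*(n, \varepsilon) = nC + \sqrt{nV}\, \Phi^{-1}(\varepsilon) + o(\sqrt{n}),

so the optimum second-order coefficient under a fixed error constraint \varepsilon is \sqrt{V}\, \Phi^{-1}(\varepsilon).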
Journal ArticleDOI

Covert Communication Over Noisy Channels: A Resolvability Perspective

TL;DR: A coding scheme based on the principle of channel resolvability is developed, which proves that if the receiver's channel is better than the warden's channel, it is possible to communicate on the order of √n reliable and covert bits over n channel uses without a secret key.
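The square-root law behind this summary can be paraphrased as follows (our loose restatement, not the paper's exact theorem): with covertness measured by the relative entropy D(\hat{Q}^n \| Q_0^{\otimes n}) between the output distribution induced at the warden and the innocent no-transmission distribution, the reliably and covertly transmissible message size scales as

\log M^*(n) = \Theta(\sqrt{n}),

strictly sublinear in n, which is why channel resolvability, i.e. controlling the output statistics seen by the warden, is the natural proof tool.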
References
Book

Elements of information theory

TL;DR: The authors examine the role of entropy, inequalities, and randomness in the design and construction of codes.
Book

Information Theory: Coding Theorems for Discrete Memoryless Systems

TL;DR: This new edition presents unique discussions of information theoretic secrecy and of zero-error information theory, including the deep connections of the latter with extremal combinatorics.
Journal ArticleDOI

The common information of two dependent random variables

TL;DR: The main result of the paper is contained in two theorems which show that C(X;Y) is i) the minimum R_0 such that a sequence of independent copies of (X,Y) can be efficiently encoded into three binary streams W_0, W_1, W_2 with rates R_0, R_1, R_2.
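The common information C(X;Y) studied here has the single-letter characterization (standard form, restated from memory rather than quoted):

C(X;Y) = \min_{W : X - W - Y} I(X,Y; W),

where the minimum is over auxiliary random variables W that make X and Y conditionally independent.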
Journal ArticleDOI

Identification via channels

TL;DR: The authors' main finding is that any object among doubly exponentially many objects can be identified in blocklength n with arbitrarily small error probability via a discrete memoryless channel (DMC), if randomization can be used for the encoding procedure.
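In symbols (a sketch using standard notation): identification permits the number of messages to grow doubly exponentially in the blocklength,

N \approx 2^{2^{nR}} \quad \text{for any } R < C,

over n uses of a DMC with Shannon capacity C, provided the encoder randomizes; ordinary transmission coding, by contrast, supports only N \approx 2^{nR}.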