Journal Article
On functionals satisfying a data-processing theorem
Jacob Ziv, Moshe Zakai
TLDR
It is shown that the rate-distortion bound R(d) ≤ C remains true when -log x in the definition of mutual information is replaced by an arbitrary convex (∪) nonincreasing function satisfying some technical conditions.

Abstract:
It is shown that the rate-distortion bound R(d) ≤ C remains true when -log x in the definition of mutual information is replaced by an arbitrary convex (∪) nonincreasing function satisfying some technical conditions. Examples are given showing that, for certain choices of the convex function, the bounds obtained are better than the classical rate-distortion bounds.
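The replacement described in the abstract can be sketched in formulas (a notational sketch only; the paper's precise technical conditions on f are not reproduced here):

```latex
% Mutual information, written so that -\log x appears explicitly:
I(X;Y) = \mathbb{E}\left[ -\log \frac{p(X)\,p(Y)}{p(X,Y)} \right]
% Generalized functional: replace -\log x by a convex (\cup), nonincreasing f:
I_f(X;Y) = \mathbb{E}\left[ f\!\left( \frac{p(X)\,p(Y)}{p(X,Y)} \right) \right]
% Choosing f(x) = -\log x recovers I(X;Y); the generalized data-processing
% bound R_f(d) \le C_f then specializes to the classical R(d) \le C.
```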
Citations
Journal Article
A view of three decades of linear filtering theory
TL;DR: Developments in the theory of linear least-squares estimation over the last thirty years or so are outlined. Particular attention is paid to early mathematical work in the field and to more modern developments showing some of the many connections between least-squares filtering and other fields.
Journal Article
Collaborative in-network processing for target tracking
TL;DR: This paper studies two types of commonly used sensors, acoustic-amplitude sensors for target distance estimation and direction-of-arrival sensors for bearing estimation, and investigates how networks of such sensors can collaborate to extract useful information with minimal resource usage.
Journal Article
Information-theoretic analysis of interscale and intrascale dependencies between image wavelet coefficients
Juan Liu, Pierre Moulin
TL;DR: An information-theoretic analysis of statistical dependencies between image wavelet coefficients is presented. The analysis is consistent with empirical observations that coding schemes exploiting either interscale or intrascale dependencies alone perform very well, whereas taking both into account does not significantly improve coding performance.
Journal Article
Review: Divergence measures for statistical data processing – An annotated bibliography
TL;DR: This paper provides an annotated bibliography for investigations based on or related to divergence measures for statistical data processing and inference problems.
Book
Statistical Physics and Information Theory
TL;DR: This book discusses the random energy model and random coding, analysis tools and asymptotic methods, and the physical interpretation of information measures.
References
Book
Probability theory
TL;DR: These notes cover the basic definitions of discrete probability theory and then present some results, including Bayes' rule, the inclusion-exclusion formula, Chebyshev's inequality, and the weak law of large numbers.
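One of the results listed above, Chebyshev's inequality P(|X − μ| ≥ kσ) ≤ 1/k², can be checked empirically with a short sketch. The setup below (the sample mean of fair coin flips, with the parameter choices shown) is an illustrative assumption, not taken from the cited notes:

```python
import random

# Empirical check of Chebyshev's inequality for the sample mean of n fair
# coin flips: the mean has mu = 0.5 and sigma = sqrt(0.25 / n), and the
# inequality bounds P(|mean - mu| >= k * sigma) by 1 / k^2.
random.seed(0)
n, trials, k = 1000, 2000, 2.0
mu = 0.5
sigma = (0.25 / n) ** 0.5

exceed = 0
for _ in range(trials):
    mean = sum(random.random() < 0.5 for _ in range(n)) / n
    if abs(mean - mu) >= k * sigma:
        exceed += 1

# The empirical exceedance frequency should sit well below the 1/k^2 bound.
print(exceed / trials, "<=", 1 / k**2)
```

For this roughly Gaussian sample mean the true exceedance probability is near 0.05, far below the distribution-free bound of 0.25, which illustrates that Chebyshev's inequality is valid but loose.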
Journal Article
The Divergence and Bhattacharyya Distance Measures in Signal Selection
TL;DR: This partly tutorial paper compares the properties of the often-used divergence with a newer measure, the Bhattacharyya distance, which is often easier to evaluate and gives results that are at least as good as, and often better than, those given by the divergence.
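For discrete distributions, the Bhattacharyya distance mentioned above reduces to a one-line formula, D_B(p, q) = −ln Σ_i √(p_i q_i). The sketch below assumes this discrete form with made-up example distributions (the cited paper also treats continuous densities):

```python
import math

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance between two discrete distributions:
    D_B = -ln(sum_i sqrt(p_i * q_i))."""
    bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))  # Bhattacharyya coefficient
    return -math.log(bc)

# Hypothetical example distributions over three outcomes.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

d = bhattacharyya_distance(p, q)
print(round(d, 4))
```

The coefficient equals 1 (so the distance is 0) when the distributions coincide, and the distance grows as their overlap shrinks, which is what makes it useful as a signal-selection criterion.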