
Showing papers on "Pointwise mutual information published in 1999"


Journal ArticleDOI
TL;DR: Formulae for estimating the errors on observed information entropies and mutual informations are derived using the standard error analysis familiar to physicists, and their validity is demonstrated by numerical experiment.
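The error-propagation idea summarized above can be sketched numerically. The snippet below is a minimal illustration, not the paper's own derivation: it computes a plug-in entropy from a multinomial sample and attaches a first-order (delta-method) standard error of the kind such formulae produce; the particular variance expression used here is the standard multinomial result and is an assumption about the paper's exact formulae.

```python
import numpy as np

def entropy_with_error(counts):
    """Plug-in entropy of a discrete sample with a first-order error estimate.

    counts : observed bin counts (a multinomial sample of total size N).
    Returns (H, sigma_H) in nats. The variance is the usual delta-method
    result Var(H) ~ (1/N) * (sum p_i (ln p_i)^2 - H^2); the paper's exact
    formulae may differ (e.g. include explicit bias terms).
    """
    counts = np.asarray(counts, dtype=float)
    N = counts.sum()
    p = counts[counts > 0] / N
    H = -np.sum(p * np.log(p))
    var_H = (np.sum(p * np.log(p) ** 2) - H ** 2) / N
    return H, np.sqrt(max(var_H, 0.0))

# Example: entropy of a biased 4-sided die estimated from 1000 throws.
rng = np.random.default_rng(0)
sample = rng.choice(4, size=1000, p=[0.4, 0.3, 0.2, 0.1])
H, dH = entropy_with_error(np.bincount(sample, minlength=4))
print(f"H = {H:.3f} +/- {dH:.3f} nats")
```

A mutual information estimate is a combination of such entropies, so its error bar follows by the same propagation.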

272 citations


Journal ArticleDOI
TL;DR: A data-dependent nonparametric estimator of the mutual information based on Dobrushin's information theorem: the idea is to build a succession of finer and finer partitions made of nested hyperrectangles, and to stop the refinement process on any hyperrectangle as soon as local independence has been achieved.
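A hedged sketch of the partitioning idea described above (recursive refinement into nested rectangles, stopping where a cell looks locally independent) is given below for the 2-D case. The stopping test used here, a crude chi-square comparison of quadrant counts against an even split, is an illustrative stand-in and not the criterion from the paper.

```python
import numpy as np

def adaptive_mi(x, y, min_points=8, tol=3.84):
    """Rough 2-D mutual information estimate (nats) by recursive partitioning.

    Each rectangle is split at the marginal medians of the points it contains;
    refinement stops when a crude chi-square statistic on the four quadrant
    counts falls below `tol` (treated as local independence) or when too few
    points remain. A sketch of the general approach, not the paper's algorithm.
    """
    n = len(x)

    def recurse(idx):
        m = len(idx)
        if m < min_points:
            return [idx]
        xm, ym = np.median(x[idx]), np.median(y[idx])
        quads = [idx[(x[idx] <= xm) & (y[idx] <= ym)],
                 idx[(x[idx] <= xm) & (y[idx] > ym)],
                 idx[(x[idx] > xm) & (y[idx] <= ym)],
                 idx[(x[idx] > xm) & (y[idx] > ym)]]
        obs = np.array([len(q) for q in quads], dtype=float)
        expected = m / 4.0          # median splits make each margin ~50/50
        chi2 = np.sum((obs - expected) ** 2 / expected)
        if chi2 < tol:              # looks locally independent: stop refining
            return [idx]
        return [cell for q in quads for cell in recurse(q)]

    mi = 0.0
    for cell in recurse(np.arange(n)):
        if len(cell) == 0:
            continue
        p_xy = len(cell) / n
        # empirical marginal probabilities of the cell's x-range and y-range
        p_x = np.mean((x >= x[cell].min()) & (x <= x[cell].max()))
        p_y = np.mean((y >= y[cell].min()) & (y <= y[cell].max()))
        mi += p_xy * np.log(p_xy / (p_x * p_y))
    return mi

rng = np.random.default_rng(1)
x = rng.normal(size=4000)
y = 0.8 * x + 0.6 * rng.normal(size=4000)   # correlated Gaussian pair
print(adaptive_mi(x, y))
```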

76 citations


Journal ArticleDOI
TL;DR: This work considers optimal decentralized (or equivalently, quantized) detection for the Neyman–Pearson, Bayes, Ali–Silvey distance, and mutual (Shannon) information criteria, and shows that if the processes observed at the sensors are conditionally independent and identically distributed, and the criterion for optimization is either a member of a subclass of the Ali–Silvey distances or local mutual information, then the quantizers used at all of the sensors are identical.
Abstract: We consider optimal decentralized (or equivalently, quantized) detection for the Neyman–Pearson, Bayes, Ali–Silvey distance, and mutual (Shannon) information criteria. In all cases, it is shown that the optimal sensor decision rules are quantizers that operate on the likelihood ratio of the observations. We further show that randomized fusion rules are suboptimal for the mutual information criterion. We also show that if the processes observed at the sensors are conditionally independent and identically distributed, and the criterion for optimization is either a member of a subclass of the Ali–Silvey distances or local mutual information, then the quantizers used at all of the sensors are identical. We give an example to show that for the Neyman–Pearson and Bayes criteria this is not generally true. We discuss this last point in some detail and derive necessary conditions under which the assumption of identical sensor quantizer maps is reasonable.
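The key structural result quoted above is that the optimal sensor rules quantize the likelihood ratio. A minimal numerical illustration of that idea follows: for a binary hypothesis with Gaussian observations, a single threshold on the likelihood ratio is chosen by brute force to maximize the Shannon mutual information between the hypothesis and the one-bit sensor output. The densities, prior, and search grid are invented for the example.

```python
import numpy as np
from scipy.stats import norm

# Binary hypothesis: H0 ~ N(0,1), H1 ~ N(1,1), prior P(H1) = 0.5 (illustrative values).
p1 = 0.5
f0, f1 = norm(0, 1), norm(1, 1)

def mi_for_threshold(t):
    """I(H; U) in nats when the sensor sends U = 1{likelihood ratio > t}.
    For these Gaussians the likelihood ratio exp(x - 0.5) is monotone in x,
    so the rule is equivalently a threshold on x."""
    x_t = np.log(t) + 0.5          # x-threshold equivalent to LR threshold t
    q0 = f0.sf(x_t)                # P(U = 1 | H0)
    q1 = f1.sf(x_t)                # P(U = 1 | H1)
    pu1 = (1 - p1) * q0 + p1 * q1  # P(U = 1)
    mi = 0.0
    for ph, qh in [(1 - p1, q0), (p1, q1)]:
        for pu, quh in [(1 - pu1, 1 - qh), (pu1, qh)]:
            joint = ph * quh
            if joint > 0 and pu > 0:
                mi += joint * np.log(joint / (ph * pu))
    return mi

ts = np.exp(np.linspace(-3, 3, 301))           # candidate LR thresholds
best = max(ts, key=mi_for_threshold)
print(f"best LR threshold ~ {best:.2f}, MI = {mi_for_threshold(best):.3f} nats")
```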

69 citations


Proceedings ArticleDOI
15 Mar 1999
TL;DR: A large database of hand-labeled fluent speech is used to compute the mutual information between phoneme labels and a point of logarithmic energy in the time-frequency plane.
Abstract: In this paper we use mutual information to study the distribution in time and frequency of information relevant for phonetic classification. A large database of hand-labeled fluent speech is used to (a) compute the mutual information between phoneme labels and a point of logarithmic energy in the time-frequency plane and (b) compute the joint mutual information between phoneme labels and two points of logarithmic energy in the time-frequency plane.
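A back-of-the-envelope version of the quantity computed in (a), the mutual information between a discrete phoneme label and the log-energy at one time-frequency point, can be obtained by histogramming the energy values. The sketch below assumes labeled data in two parallel arrays; it is the standard plug-in estimate, not the paper's pipeline.

```python
import numpy as np

def label_feature_mi(labels, energies, n_bins=32):
    """Plug-in MI (nats) between a discrete label and one scalar feature.

    labels   : integer phoneme labels, one per frame.
    energies : log-energy values at a single time-frequency point.
    The feature is discretized into `n_bins` equal-width bins and
    I(L; E) = H(L) + H(E) - H(L, E) is computed from the joint histogram.
    """
    _, lab_idx = np.unique(labels, return_inverse=True)
    joint, _, _ = np.histogram2d(lab_idx, energies,
                                 bins=[lab_idx.max() + 1, n_bins])
    p = joint / joint.sum()

    def H(dist):
        dist = dist[dist > 0]
        return -np.sum(dist * np.log(dist))

    return H(p.sum(axis=1)) + H(p.sum(axis=0)) - H(p)

# Toy usage: two "phonemes" whose log-energies come from shifted Gaussians.
rng = np.random.default_rng(2)
labels = rng.integers(0, 2, size=5000)
energies = rng.normal(loc=labels * 1.5, scale=1.0)
print(label_feature_mi(labels, energies))
```

The joint mutual information in (b) is the same construction with a two-dimensional feature histogram in place of the one-dimensional one.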

35 citations


Journal ArticleDOI
TL;DR: Exact bounds and asymptotic behaviors are derived for the mutual information as a function of the data size and of some properties of the probability of the data given the parameter.
Abstract: […] learning tasks. The parameter is a possibly, but not necessarily, high-dimensional vector. We derive exact bounds and asymptotic behaviors for the mutual information as a function of the data size and of some properties of the probability of the data given the parameter. We compare these exact results with the predictions of replica calculations. We briefly discuss the universal properties of the mutual information as a function of data size.
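To make the quantity concrete, the mutual information between a parameter and N observations can be computed exactly in a toy model and watched grow with the data size. The example below uses a Bernoulli parameter with a uniform prior, evaluated on a grid; it only illustrates the object being bounded in the paper, not the paper's bounds or the replica predictions.

```python
import numpy as np
from scipy.stats import binom

def mi_param_data(N, grid=2001):
    """Exact-on-a-grid I(theta; D) in nats for D = N Bernoulli(theta) trials,
    theta ~ Uniform(0,1). The count k is a sufficient statistic, so
    I(theta; D) = I(theta; K) = H(K) - E_theta[H(K | theta)]."""
    thetas = np.linspace(1e-6, 1 - 1e-6, grid)
    ks = np.arange(N + 1)
    # pk_given_theta[i, k] = P(K = k | theta_i)
    pk_given_theta = binom.pmf(ks[None, :], N, thetas[:, None])
    pk = pk_given_theta.mean(axis=0)        # marginal P(K = k) under the uniform prior
    H_K = -np.sum(pk * np.log(pk, where=pk > 0, out=np.zeros_like(pk)))
    H_K_given_theta = -np.sum(
        pk_given_theta * np.log(pk_given_theta,
                                where=pk_given_theta > 0,
                                out=np.zeros_like(pk_given_theta)),
        axis=1).mean()
    return H_K - H_K_given_theta

for N in (1, 10, 100, 1000):
    print(N, round(mi_param_data(N), 3))    # grows roughly like 0.5 * log(N) for large N
```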

7 citations


Proceedings Article
29 Nov 1999
TL;DR: This paper uses mutual information to characterize the distributions of phonetic and speaker/channel information in a time-frequency space and shows how the phonetic information is locally spread and how the speaker/channel information is globally spread in time and frequency.
Abstract: In this paper, we use mutual information to characterize the distributions of phonetic and speaker/channel information in a time-frequency space. The mutual information (MI) between the phonetic label and one feature, and the joint mutual information (JMI) between the phonetic label and two or three features, are estimated. Miller's bias formulas for entropy and mutual information estimates are extended to include higher-order terms. The MI and the JMI for speaker/channel recognition are also estimated. The results are complementary to those for phonetic classification. Our results show how the phonetic information is locally spread and how the speaker/channel information is globally spread in time and frequency.
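The Miller bias correction mentioned in the abstract, in its simplest first-order form, adds (m - 1)/(2N) nats to a plug-in entropy estimate, where m is the number of occupied bins and N the sample count; the higher-order extension developed in the paper is not reproduced here. A minimal sketch of the first-order version:

```python
import numpy as np

def miller_entropy(counts):
    """Plug-in entropy (nats) plus the first-order Miller bias correction
    (m - 1) / (2N), where m is the number of occupied bins."""
    counts = np.asarray(counts, dtype=float)
    N = counts.sum()
    p = counts[counts > 0] / N
    return -np.sum(p * np.log(p)) + (len(p) - 1) / (2 * N)

def miller_mi(joint_counts):
    """Mutual information (nats) from a joint count table, with each entropy
    term Miller-corrected: I = H(X) + H(Y) - H(X, Y)."""
    joint_counts = np.asarray(joint_counts, dtype=float)
    return (miller_entropy(joint_counts.sum(axis=1))
            + miller_entropy(joint_counts.sum(axis=0))
            - miller_entropy(joint_counts))

# Toy 2x2 joint count table (hypothetical data).
print(miller_mi(np.array([[30, 5], [4, 21]])))
```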

5 citations