Open Access Journal Article (DOI)

Estimating mutual information.

Alexander Kraskov, Harald Stögbauer, Peter Grassberger
23 Jun 2004 · Physical Review E, Vol. 69, Iss. 6, Art. 066138
TL;DR: Presents two classes of improved estimators for mutual information M(X, Y) from samples of random points distributed according to some joint probability density μ(x, y), based on entropy estimates from k-nearest-neighbor distances.
Abstract
We present two classes of improved estimators for mutual information M(X, Y) from samples of random points distributed according to some joint probability density μ(x, y). In contrast to conventional estimators based on binning, they are based on entropy estimates from k-nearest-neighbor distances. This means that they are data efficient (with k = 1 we resolve structures down to the smallest possible scales), adaptive (the resolution is higher where data are more numerous), and have minimal bias. Indeed, the bias of the underlying entropy estimates is mainly due to nonuniformity of the density at the smallest resolved scale, typically giving systematic errors which scale as functions of k/N for N points. Numerically, we find that both families become exact for independent distributions, i.e., the estimator M(X, Y) vanishes (up to statistical fluctuations) if μ(x, y) = μ(x)μ(y). This holds for all tested marginal distributions and for all dimensions of x and y. In addition, we give estimators for redundancies between more than two random variables. We compare our algorithms in detail with existing algorithms. Finally, we demonstrate the usefulness of our estimators for assessing the actual independence of components obtained from independent component analysis (ICA), for improving ICA, and for estimating the reliability of blind source separation.
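To make the k-nearest-neighbor approach concrete, here is a minimal brute-force sketch of the first estimator family, Eq. (8) of the paper: I(X, Y) = ψ(k) + ψ(N) − ⟨ψ(nx + 1) + ψ(ny + 1)⟩, where ψ is the digamma function, eps(i) is the max-norm distance from point i to its k-th neighbor in the joint space, and nx(i), ny(i) count the points strictly within eps(i) in each marginal space. The function name and the O(N²) distance computation are our own simplifications for readability, not the authors' reference implementation:

    import numpy as np
    from scipy.special import digamma

    def ksg_mi(x, y, k=3):
        # x: (N, dx) array, y: (N, dy) array of paired samples.
        n = len(x)
        # Pairwise max-norm distances within each marginal space.
        dx = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=-1)
        dy = np.max(np.abs(y[:, None, :] - y[None, :, :]), axis=-1)
        dz = np.maximum(dx, dy)              # max-norm distance in the joint space
        np.fill_diagonal(dz, np.inf)         # a point is not its own neighbor
        eps = np.sort(dz, axis=1)[:, k - 1]  # distance to the k-th neighbor
        # Count points strictly closer than eps[i] in each marginal space,
        # excluding the point itself (the zero on the diagonal).
        nx = np.sum(dx < eps[:, None], axis=1) - 1
        ny = np.sum(dy < eps[:, None], axis=1) - 1
        return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

    # For independent samples the estimate fluctuates around zero,
    # which is the property the abstract highlights:
    rng = np.random.default_rng(0)
    print(ksg_mi(rng.normal(size=(1000, 1)), rng.normal(size=(1000, 1))))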


Citations

On Publishing the “Bioinformatics” Special Issue

TL;DR: Assesses medical technology in the context of commercialization through a Bioentrepreneur course, which addresses many issues unique to biomedical products.
Journal Article (DOI)

Representational Similarity Analysis – Connecting the Branches of Systems Neuroscience

TL;DR: A new experimental and data-analytical framework called representational similarity analysis (RSA) is proposed, in which multi-channel measures of neural activity are quantitatively related to each other, and to computational theory and behavior, by comparing representational dissimilarity matrices (RDMs).
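As a rough illustration of the RSA pipeline summarized above, the sketch below builds an RDM from condition-by-channel activity patterns and compares two RDMs by rank-correlating their upper triangles. The function names and the 1 − Pearson-r dissimilarity are common choices we assume here, not code from the cited paper:

    import numpy as np
    from scipy.stats import spearmanr

    def rdm(patterns):
        # patterns: (n_conditions, n_channels); dissimilarity = 1 - Pearson r
        # between the activity patterns of each pair of conditions.
        return 1.0 - np.corrcoef(patterns)

    def compare_rdms(rdm_a, rdm_b):
        # Rank-correlate the upper triangles (RDMs are symmetric, zero diagonal).
        iu = np.triu_indices_from(rdm_a, k=1)
        rho, _ = spearmanr(rdm_a[iu], rdm_b[iu])
        return rho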
Journal Article (DOI)

Detecting Novel Associations in Large Data Sets

TL;DR: A measure of dependence for two-variable relationships, the maximal information coefficient (MIC), which captures a wide range of associations, both functional and not, and for functional relationships provides a score that roughly equals the coefficient of determination of the data relative to the regression function.
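The full MIC computation searches over all grids with a dynamic program; the sketch below is a deliberately simplified stand-in that only scores equal-frequency grids, normalizing the binned mutual information by log2 min(kx, ky) under the kx·ky ≤ n^0.6 budget. It illustrates the definition rather than reproducing the published algorithm:

    import numpy as np

    def binned_mi(xb, yb, kx, ky):
        # Mutual information (bits) of integer labels xb in 0..kx-1, yb in 0..ky-1.
        joint = np.zeros((kx, ky))
        np.add.at(joint, (xb, yb), 1)
        pxy = joint / joint.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        nz = pxy > 0
        return np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))

    def mic_like(x, y):
        # Best normalized score over equal-frequency grids with kx * ky <= n**0.6.
        budget = max(int(len(x) ** 0.6), 4)
        best = 0.0
        for kx in range(2, budget // 2 + 1):
            for ky in range(2, budget // kx + 1):
                xb = np.searchsorted(np.quantile(x, np.linspace(0, 1, kx + 1))[1:-1], x)
                yb = np.searchsorted(np.quantile(y, np.linspace(0, 1, ky + 1))[1:-1], y)
                best = max(best, binned_mi(xb, yb, kx, ky) / np.log2(min(kx, ky)))
        return best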
Posted Content

Opening the Black Box of Deep Neural Networks via Information

TL;DR: This work demonstrates the effectiveness of the Information-Plane visualization of DNNs, showing that training time is dramatically reduced when more hidden layers are added and that the main advantage of the hidden layers is computational.
Journal Article (DOI)

Nonlinear multivariate analysis of neurophysiological signals

TL;DR: This work describes the multivariate linear methods most commonly used in neurophysiology, shows that they can be extended to assess the existence of nonlinear interdependence between signals, and describes nonlinear methods based on the concepts of phase synchronization, generalized synchronization, and event synchronization.
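Of the nonlinear measures listed, phase synchronization reduces to a particularly compact computation: extract instantaneous phases via the Hilbert transform and measure how tightly the phase difference is locked. A minimal sketch (the function name is ours; the band-pass filtering and surrogate testing that real analyses need are omitted):

    import numpy as np
    from scipy.signal import hilbert

    def mean_phase_coherence(s1, s2):
        # Instantaneous phases of the two analytic signals.
        phi1 = np.angle(hilbert(s1))
        phi2 = np.angle(hilbert(s2))
        # |<exp(i*(phi1 - phi2))>|: 1 = perfect phase locking, ~0 = none.
        return np.abs(np.mean(np.exp(1j * (phi1 - phi2))))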
References
Book

Elements of Information Theory

TL;DR: The authors examine the role of entropy, inequality, and randomness in the design and construction of codes in a rapidly changing environment.
Journal Article (DOI)

Numerical Recipes

Journal Article (DOI)

Independent coordinates for strange attractors from mutual information.

TL;DR: In this paper, the mutual information I is examined for a model dynamical system and for chaotic data from an experiment on the Belousov-Zhabotinskii reaction.
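The cited procedure chooses the delay for attractor reconstruction at the first minimum of the time-delayed mutual information I(x(t); x(t + τ)). A minimal sketch with a plain histogram estimator (the bin count is an arbitrary assumption; Fraser and Swinney used an adaptive partitioning instead):

    import numpy as np

    def histogram_mi(a, b, bins=32):
        # Binned mutual information (bits) between two 1-D series.
        joint = np.histogram2d(a, b, bins=bins)[0]
        pxy = joint / joint.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        nz = pxy > 0
        return np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))

    def embedding_delay(x, max_lag=100):
        # First lag at which I(x(t); x(t + lag)) stops decreasing.
        mis = [histogram_mi(x[:-lag], x[lag:]) for lag in range(1, max_lag + 1)]
        for i in range(1, len(mis)):
            if mis[i] > mis[i - 1]:
                return i          # mis[i - 1], i.e. lag i, is the first minimum
        return max_lag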
Journal Article (DOI)

Blind beamforming for non-Gaussian signals

TL;DR: In this paper, a computationally efficient technique for the blind estimation of directional vectors for beamforming, based on joint diagonalization of fourth-order cumulant matrices, is presented.