Elements of information theory
Citations
4,773 citations
Cites background or methods from "Elements of information theory"
...Using Jensen’s inequality for concave functions [4] we get...
[...]
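The excerpt above appeals to Jensen’s inequality for a concave function f, which states E[f(X)] ≤ f(E[X]). A minimal numeric check with f = log and a hypothetical uniform distribution over four values (the data are illustrative, not from the citing paper):

```python
import math

# Jensen's inequality for a concave f: E[f(X)] <= f(E[X]).
# Here f = log and X is uniform over the (hypothetical) values below.
xs = [1.0, 2.0, 4.0, 8.0]

mean_of_log = sum(math.log(x) for x in xs) / len(xs)   # E[log X]
log_of_mean = math.log(sum(xs) / len(xs))              # log E[X]

assert mean_of_log <= log_of_mean
print(mean_of_log, log_of_mean)
```

Equality holds only when X is degenerate, since log is strictly concave.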
...Mutual information is only one of a family of measures of statistical dependence or information redundancy (see appendix 3). We have experimented with , which can be shown to be a metric [4], and , the Entropy Correlation Coefficient [1]....
[...]
...It can be shown [4] that for a given mean and standard deviation...
[...]
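The result the excerpt points to is the maximum-entropy property of the Gaussian: among all densities with a given mean \(\mu\) and standard deviation \(\sigma\), the normal distribution maximizes differential entropy. Stated as in Cover and Thomas:

```latex
h(X) \;\le\; \tfrac{1}{2}\log\!\left(2\pi e \sigma^{2}\right),
\qquad \text{with equality iff } X \sim \mathcal{N}(\mu, \sigma^{2}).
```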
...The use of the much more general notion of Mutual Information (MI) or relative entropy [4, 16] to describe the dispersive behavior of the 2-D histogram has been proposed independently by Collignon et al. [3, 11] and by Viola et al. [20]....
[...]
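The excerpt describes mutual information as the relative entropy between the 2-D (joint) histogram and the product of its marginals. A sketch of MI computed directly from a joint histogram of counts (the histograms below are hypothetical, not from the citing papers):

```python
import math

def mutual_information(joint_counts):
    """Mutual information I(X;Y) in bits from a 2-D histogram of counts.

    I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) ),
    i.e. the relative entropy D( p(x,y) || p(x) p(y) ).
    """
    total = sum(sum(row) for row in joint_counts)
    # Marginal distributions from row and column sums.
    px = [sum(row) / total for row in joint_counts]
    py = [sum(joint_counts[i][j] for i in range(len(joint_counts))) / total
          for j in range(len(joint_counts[0]))]
    mi = 0.0
    for i, row in enumerate(joint_counts):
        for j, count in enumerate(row):
            if count > 0:  # 0 * log 0 contributes nothing
                pxy = count / total
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Deterministic (perfectly dependent) joint histogram: 1 bit of MI.
print(mutual_information([[50, 0], [0, 50]]))    # 1.0
# Independent joint histogram: 0 bits.
print(mutual_information([[25, 25], [25, 25]]))  # 0.0
```

In the registration setting the excerpts describe, the joint histogram would be built from co-occurring intensity values of two images, and the dispersion of that histogram falls (MI rises) as the images come into alignment.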
4,410 citations
Cites background from "Elements of information theory"
...Cover and Thomas (1991) show that for all 2 , d < (1)2 , then...
[...]