On Information and Sufficiency
Citations
14,825 citations
14,635 citations
Cites methods from "On Information and Sufficiency"
...Many UL methods are designed to maximize entropy-related, information-theoretic (Boltzmann, 1909; Kullback & Leibler, 1951; Shannon, 1948) objectives (e.g., Amari, Cichocki, & Yang, 1996; Barlow et al., 1989; Dayan & Zemel, 1995; Deco & Parra, 1997; Field, 1994; Hinton, Dayan, Frey, & Neal, 1995; Linsker,…...
[...]
...Many UL methods are designed to maximize entropy-related, information-theoretic (Boltzmann, 1909; Shannon, 1948; Kullback and Leibler, 1951) objectives (e....
[...]
10,141 citations
9,057 citations
Cites methods from "On Information and Sufficiency"
...Here the snapshot cost [28] is the Kullback-Leibler (KL) divergence [389] between the adjacency/similarity matrix at time t and the matrix describing the community structure of the graph at time t; the historical cost is the KL divergence between the matrices describing the community structure of the graph at times t − 1 and t....
[...]
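The excerpt above uses the Kullback-Leibler divergence as a cost between distributions derived from graph snapshots. As a minimal sketch (not the cited paper's implementation), the discrete KL divergence D(P ‖ Q) = Σᵢ pᵢ log(pᵢ/qᵢ) can be computed as follows, with hypothetical example distributions `p` and `q`:

```python
import math

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(P || Q) in nats.

    Terms with p_i == 0 contribute nothing (0 log 0 := 0); q_i must
    be positive wherever p_i > 0, otherwise the divergence is infinite.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0:
            if qi == 0:
                return float("inf")
            total += pi * math.log(pi / qi)
    return total

# Hypothetical example distributions, illustrating asymmetry:
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, p))  # identical distributions -> 0.0
print(kl_divergence(p, q))  # differs from kl_divergence(q, p)
```

Note that KL divergence is not a metric: it is asymmetric and does not satisfy the triangle inequality, which is why the original paper (and Jeffreys before it) also considered symmetrized variants.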
8,933 citations
Cites background from "On Information and Sufficiency"
...In 1951 S. Kullback and R. A. Leibler published a now-famous paper (Kullback and Leibler 1951) that quantified the meaning of "information" as related to R. A. Fisher's concept of sufficient statistics....
[...]
References
6,325 citations
"On Information and Sufficiency" refers background in this paper
...A special case of this divergence is Mahalanobis' generalized distance [13]....
[...]
3,392 citations
2,464 citations
2,292 citations
"On Information and Sufficiency" refers methods in this paper
...Jeffreys (par....
[...]
...The particular measure of divergence we use has been considered by Jeffreys ([10], [11]) in another connection....
[...]
2,183 citations
"On Information and Sufficiency" refers background in this paper
...We are also concerned with the statistical problem of discrimination ([3], [17]), by considering a measure of the "distance" or "divergence" between statistical populations ([1], [2], [13]) in terms of our measure of information....
[...]