Elements of Information Theory, 2/E
Citations
5,314 citations
Cites background from "Elements of Information Theory, 2/E..."
...Then the average of their log probabilities will be very close to H, the negative entropy (Cover and Thomas 2006), because H is simply the mean log probability: H = Σₓ q_l(x) ln(q_l(x)). Thus, for ‘‘typical’’ sites whose log probabilities are close to this mean, we obtain q_l(x) ≈ e^H....
[...]
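The excerpt above invokes the typical-set argument: by the law of large numbers, the sample average of ln q(x) over i.i.d. draws converges to the mean log probability H = Σₓ q(x) ln q(x) (the negative entropy), so a "typical" outcome has q(x) ≈ e^H. A minimal sketch of that convergence, using a hypothetical four-symbol distribution `q` (illustrative values, not from the cited paper):

```python
import math
import random

# Hypothetical site distribution q over four symbols (illustrative values).
q = {"A": 0.50, "C": 0.25, "G": 0.15, "T": 0.10}

# Negative entropy H = sum_x q(x) ln q(x), i.e. the mean log probability.
H = sum(p * math.log(p) for p in q.values())

# Sample many i.i.d. "sites"; the sample average of ln q(x) approaches H
# (the law of large numbers behind the typical-set argument).
random.seed(0)
symbols, weights = zip(*q.items())
draws = random.choices(symbols, weights=weights, k=200_000)
avg_log_prob = sum(math.log(q[x]) for x in draws) / len(draws)

# For "typical" x whose log probability is near H, q(x) is close to e^H.
print(H, avg_log_prob, math.exp(H))
```

With 200,000 draws the sample average lands within a few thousandths of H, which is why the excerpt can treat e^H as the probability of a typical site.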
3,105 citations
Cites background from "Elements of Information Theory, 2/E..."
...The concept of “information” was quantified and this provided a series of enormous breakthroughs affecting modern society (see Hobson and Cheng 1973, Guiasu 1977, Soofi 1994, Jessop 1995, and Cover and Thomas 2006 for background)....
[...]
991 citations
Cites background from "Elements of Information Theory, 2/E..."
...The MI between the two random variables is [19]...
[...]
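The excerpt references the mutual information (MI) between two random variables; the snippet's own equation was lost in extraction, but the standard definition from Cover and Thomas is I(X; Y) = Σₓ,ᵧ p(x, y) ln[p(x, y) / (p(x)p(y))]. A minimal sketch computing it for a hypothetical binary joint distribution (values chosen for illustration only):

```python
import math

# Illustrative joint distribution p(x, y) of two binary random variables
# (hand-picked values; not from the cited paper).
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y).
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# I(X; Y) = sum_{x,y} p(x,y) ln[ p(x,y) / (p(x) p(y)) ], in nats.
mi = sum(p * math.log(p / (p_x[x] * p_y[y]))
         for (x, y), p in p_xy.items() if p > 0)

print(mi)
```

MI is zero exactly when the joint factorizes as p(x, y) = p(x)p(y); the skew toward the diagonal in this example yields a strictly positive value.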
816 citations
Cites methods from "Elements of Information Theory, 2/E..."
...Finally, we calculate the entropy of this probability distribution [44] which then characterizes the extent of (non-Gaussian) fluctuations in the sequence of relative semitone pitch period variations....
[...]
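The excerpt above computes the entropy of an empirical probability distribution to characterize fluctuations in a sequence of relative semitone pitch-period variations. A minimal sketch of that step, using stand-in data (the sequence below is hypothetical, not from the cited paper):

```python
import math
from collections import Counter

def empirical_entropy(seq):
    """Shannon entropy (in nats) of the empirical distribution of seq."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Hypothetical sequence of relative semitone steps (stand-in data).
steps = [0, 1, -1, 0, 2, 0, 1, -1, 0, 0]
print(empirical_entropy(steps))
```

A narrow, peaked distribution of steps gives low entropy, while heavy-tailed (non-Gaussian) fluctuations spread probability over more values and drive the entropy up, which is the sense in which the excerpt uses it as a fluctuation measure.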
696 citations