
Showing papers on "Multivariate mutual information published in 1999"


01 Jan 1999
TL;DR: This paper addresses the problem of learning Bayesian network structures from data using an information-theoretic dependency-analysis approach; the empirical results show that the approach is efficient and reliable.
Abstract: This paper addresses the problem of learning Bayesian network structures from data by using an information-theoretic dependency-analysis approach. Based on our three-phase construction mechanism, two efficient algorithms have been developed. One of our algorithms deals with a special case where the node ordering is given; it requires only O(N^2) conditional independence (CI) tests and is correct given that the underlying model is DAG-Faithful [Spirtes et al., 1996]. The other algorithm deals with the general case and requires O(N^4) CI tests. It is correct given that the underlying model is monotone DAG-Faithful (see Section 4.4). A system based on these algorithms has been developed and distributed through the Internet. The empirical results show that our approach is efficient and reliable.

178 citations
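The CI tests in dependency-analysis algorithms like this one typically threshold an empirical (conditional) mutual information estimate: two variables are declared independent when the estimate falls below a small cutoff. A minimal sketch of the unconditional case in Python (the function name and threshold are illustrative, not taken from the paper's system):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information I(X;Y) in bits from paired discrete samples."""
    n = len(xs)
    px = Counter(xs)
    py = Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_xy = c / n
        mi += p_xy * math.log2(p_xy / ((px[x] / n) * (py[y] / n)))
    return mi

def ci_test(xs, ys, threshold=0.01):
    """Declare X and Y independent when the MI estimate is below a cutoff."""
    return mutual_information(xs, ys) < threshold

# A variable is maximally informative about an exact copy of itself...
xs = [0, 0, 1, 1] * 25
print(mutual_information(xs, xs))        # 1 bit for a fair binary variable
# ...and carries no information about a variable varying independently of it.
zs = [0, 1] * 50
print(ci_test(xs, zs))                   # True: estimated MI is ~0
```

A full implementation would condition on a separating set (summing the estimate over the conditioning variable's values) and would correct for the upward bias of plug-in MI estimates on finite samples, e.g. via a chi-squared significance test.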


Journal ArticleDOI
Chenguang Lu1
TL;DR: The mathematical foundations of the new information theory, the generalized communication model, information measures for semantic information and sensory information, and the coding meanings of generalized entropy and generalized mutual information are introduced.
Abstract: A generalized information theory is proposed as a natural extension of Shannon's information theory. It proposes that information comes from forecasts: the more precise and the more unexpected a forecast is, the more information it conveys. If the subjective forecast always conforms with objective facts, then the generalized information measure is equivalent to Shannon's information measure. The generalized communication model is consistent with Popper's model of knowledge evolution. The paper introduces the mathematical foundations of the new information theory, the generalized communication model, information measures for semantic and sensory information, and the coding meanings of generalized entropy and generalized mutual information. Assessments and optimizations of pattern recognition, prediction, and detection under the generalized information criterion are discussed. For economization of communication, a revised version of rate-distortion theory, rate-of-keeping-precision theory, which is a ...

35 citations
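One common way to formalize "information from a forecast" is the log-ratio of the probability a forecast assigns to an outcome against that outcome's prior probability. This is an illustrative reading of the abstract's claim, not necessarily the paper's exact measure; the function below is a hypothetical sketch:

```python
import math

def forecast_information(prior, posterior):
    """Bits conveyed when a forecast raises an outcome's probability from
    `prior` to `posterior` and the outcome then occurs.

    Illustrative log-ratio measure, not the paper's exact formula: more
    unexpected (low-prior) events yield more bits when correctly forecast,
    and confident wrong forecasts yield negative information."""
    return math.log2(posterior / prior)

# A confident, correct forecast of a rare event conveys many bits...
print(forecast_information(prior=0.01, posterior=0.9))   # ~6.49 bits
# ...while a forecast that lowered the true outcome's probability conveys
# negative information.
print(forecast_information(prior=0.5, posterior=0.1))    # ~-2.32 bits
```

Note the connection to the abstract's reduction claim: averaging such a log-ratio over the joint distribution of forecasts and facts, with the posterior matching the true conditional distribution, recovers Shannon mutual information.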