Topic: Information diagram

About: Information diagram is a research topic. Over its lifetime, 727 publications have been published within this topic, receiving 45,980 citations.


Papers

Journal Article · DOI: 10.1103/PHYSREV.106.620
Information Theory and Statistical Mechanics
E. T. Jaynes
15 Oct 1957 · Physical Review
Abstract: Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available. It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.


Topics: Statistical theory (67%), Statistical inference (62%), Principle of maximum entropy (59%)

11,158 Citations
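
As an illustration of the maximum-entropy estimate described in the abstract, the following minimal Python sketch solves the classic dice version of the problem: among all distributions over faces 1..6 with a given mean, the maximum-entropy one is exponential in the face value, p_i ∝ exp(λ·x_i). The target mean of 4.5 and the use of scipy's root finder are illustrative choices, not taken from the paper.

```python
# Maximum-entropy distribution over die faces 1..6 subject to E[x] = 4.5.
# A minimal sketch: the constrained maximization yields the Gibbs form
# p_i ∝ exp(lam * x_i); we solve for the Lagrange multiplier numerically.
import numpy as np
from scipy.optimize import brentq

x = np.arange(1, 7)          # die faces
target_mean = 4.5            # constraint (illustrative choice)

def mean_at(lam):
    """Mean of the Gibbs distribution p_i ∝ exp(lam * x_i)."""
    w = np.exp(lam * x)
    return (x * w).sum() / w.sum()

# Find the multiplier that matches the expected-value constraint.
lam = brentq(lambda l: mean_at(l) - target_mean, -10, 10)
p = np.exp(lam * x)
p /= p.sum()

print("p =", np.round(p, 4))                  # maximum-entropy pmf
print("entropy =", -(p * np.log(p)).sum())    # in nats
```

The resulting distribution is the least biased one consistent with the constraint: any other constrained distribution has strictly lower entropy.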


Open access · Book
01 Jan 1959
Topics: Joint entropy (81%), Information diagram (80%), Rényi entropy (79%)

7,226 Citations


Open access · Journal Article · DOI: 10.1103/PHYSREVLETT.85.461
Measuring Information Transfer
Thomas Schreiber
2000 · Physical Review Letters
Abstract: An information theoretic measure is derived that quantifies the statistical coherence between systems evolving in time. The standard time delayed mutual information fails to distinguish information that is actually exchanged from shared information due to common history and input signals. In our new approach, these influences are excluded by appropriate conditioning of transition probabilities. The resulting transfer entropy is able to distinguish effectively driving and responding elements and to detect asymmetry in the interaction of subsystems.


3,060 Citations
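
The following minimal Python sketch estimates the transfer entropy defined in the abstract for two discrete time series, in the simplest case of history length k = l = 1: T(Y→X) = Σ p(x_{t+1}, x_t, y_t) log[ p(x_{t+1}|x_t, y_t) / p(x_{t+1}|x_t) ]. The plug-in histogram estimator and the toy coupled series are illustrative choices, not from the paper.

```python
# Plug-in estimate of transfer entropy T(Y -> X) with history length 1.
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Estimate T(Y -> X) in nats from two equal-length discrete series."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))    # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))          # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))           # (x_{t+1}, x_t)
    singles = Counter(x[:-1])                        # x_t
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]           # p(x_{t+1} | x_t, y_t)
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]  # p(x_{t+1} | x_t)
        te += p_joint * np.log(p_cond_xy / p_cond_x)
    return te

# Toy example: x copies y with a one-step delay, so information flows
# from Y to X but not back; the estimator detects this asymmetry.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 10_000)
x = np.concatenate(([0], y[:-1]))                    # x_{t+1} = y_t
print("T(Y -> X) =", transfer_entropy(x, y))         # ~ log 2 ≈ 0.693
print("T(X -> Y) =", transfer_entropy(y, x))         # ~ 0
```

Unlike time-delayed mutual information, the conditioning on x_t excludes information shared through common history, which is why the reverse direction comes out near zero here.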


Open accessJournal ArticleDOI: 10.1109/TIT.1980.1056144
John E. Shore1, R. Johnson1Institutions (1)
Abstract: Jaynes's principle of maximum entropy and Kullback's principle of minimum cross-entropy (minimum directed divergence) are shown to be uniquely correct methods for inductive inference when new information is given in the form of expected values. Previous justifications use intuitive arguments and rely on the properties of entropy and cross-entropy as information measures. The approach here assumes that reasonable methods of inductive inference should lead to consistent results when there are different ways of taking the same information into account (for example, in different coordinate systems). This requirement is formalized as four consistency axioms. These are stated in terms of an abstract information operator and make no reference to information measures. It is proved that the principle of maximum entropy is correct in the following sense: maximizing any function but entropy will lead to inconsistency unless that function and entropy have identical maxima. In other words, given information in the form of constraints on expected values, there is only one distribution satisfying the constraints that can be chosen by a procedure that satisfies the consistency axioms; this unique distribution can be obtained by maximizing entropy. This result is established both directly and as a special case (uniform priors) of an analogous result for the principle of minimum cross-entropy. Results are obtained both for continuous probability densities and for discrete distributions.


1,664 Citations
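
As a sketch of the minimum cross-entropy principle the abstract describes: given a prior q and new information in the form of an expected value, the posterior that minimizes directed divergence from q is the exponentially tilted prior p_i ∝ q_i·exp(λ·x_i), which reduces to maximum entropy under a uniform prior, as the paper notes. The prior, constraint value, and solver below are illustrative choices, not from the paper.

```python
# Minimum cross-entropy posterior: tilt the prior q to satisfy E[x] = 4.
import numpy as np
from scipy.optimize import brentq

x = np.arange(1, 7)
q = np.array([0.3, 0.2, 0.2, 0.1, 0.1, 0.1])   # prior (illustrative)
target_mean = 4.0                               # new information: E[x] = 4

def mean_at(lam):
    """Mean of the tilted prior p_i ∝ q_i * exp(lam * x_i)."""
    w = q * np.exp(lam * x)
    return (x * w).sum() / w.sum()

lam = brentq(lambda l: mean_at(l) - target_mean, -20, 20)
p = q * np.exp(lam * x)
p /= p.sum()

# Directed divergence (KL) from the prior; this is the minimum over all
# distributions satisfying the constraint.
kl = (p * np.log(p / q)).sum()
print("posterior p =", np.round(p, 4))
print("E[x] =", (x * p).sum(), " KL(p||q) =", kl)
```

Setting q uniform recovers the maximum-entropy solution of the Jaynes example above, matching the paper's special-case result.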


Open access · Book
01 Jan 1957
Topics: Information diagram (50%)

1,349 Citations


Performance Metrics

No. of papers in the topic in previous years:

Year  Papers
2019  3
2018  4
2017  32
2016  42
2015  40
2014  46

Top Attributes


Topic's top 5 most impactful authors

Flemming Topsøe: 6 papers, 227 citations
Ehsan S. Soofi: 4 papers, 377 citations
Dagmar Markechová: 4 papers, 54 citations
Jianhua Dai: 4 papers, 143 citations
D.E. Boekee: 3 papers, 130 citations

Network Information
Related Topics (5)
Entropy (information theory): 23.2K papers, 472.2K citations (81% related)
Information theory: 8.7K papers, 421.5K citations (80% related)
Fuzzy logic: 151.2K papers, 2.3M citations (74% related)
Cluster analysis: 146.5K papers, 2.9M citations (73% related)
Pattern recognition (psychology): 26.1K papers, 722.8K citations (73% related)