Topic

Information theory

About: Information theory is a research topic. Over its lifetime, 8,771 publications have been published within this topic, receiving 421,502 citations.


Papers
Proceedings Article
01 Jan 1973
TL;DR: The classical maximum likelihood principle can be viewed as a method of asymptotically realizing an optimum estimate with respect to a very general information-theoretic criterion, an observation that extends the principle to answer many practical problems of statistical model fitting.
Abstract: In this paper it is shown that the classical maximum likelihood principle can be considered to be a method of asymptotic realization of an optimum estimate with respect to a very general information theoretic criterion. This observation suggests an extension of the principle that provides answers to many practical problems of statistical model fitting.
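The criterion that grew out of this observation is now known as the Akaike information criterion, AIC = 2k - 2 ln(L), where k is the number of fitted parameters and L the maximized likelihood; the model with the lower AIC is preferred. A minimal sketch of that comparison in Python, assuming Gaussian toy models and made-up data (none of which appear in the paper):

    import numpy as np

    def aic(k, log_likelihood):
        # Akaike information criterion: 2k - 2 ln(L)
        return 2 * k - 2 * log_likelihood

    # Toy data: 500 draws from N(1, 2^2)
    rng = np.random.default_rng(0)
    x = rng.normal(loc=1.0, scale=2.0, size=500)
    n = len(x)

    # Model 1: mean fixed at 0, variance fitted (k = 1)
    var1 = np.mean(x**2)
    ll1 = -0.5 * n * (np.log(2 * np.pi * var1) + 1)

    # Model 2: mean and variance both fitted (k = 2)
    var2 = np.var(x)
    ll2 = -0.5 * n * (np.log(2 * np.pi * var2) + 1)

    print("AIC, fixed-mean model :", aic(1, ll1))
    print("AIC, fitted-mean model:", aic(2, ll2))  # lower AIC is preferred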

18,539 citations

Journal Article
E. T. Jaynes
TL;DR: In this article, the authors consider statistical mechanics as a form of statistical inference rather than as a physical theory, and show that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle.
Abstract: Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available. It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.
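As a concrete instance of the maximum-entropy recipe the abstract describes: constraining only the mean energy yields the Boltzmann distribution p_i proportional to exp(-lambda * E_i), with the partition function as the normalizer and lambda fixed by the constraint. A minimal Python sketch, where the three energy levels and the target mean are invented example values:

    import numpy as np
    from scipy.optimize import brentq

    energies = np.array([0.0, 1.0, 2.0])  # assumed toy energy levels
    target_mean = 0.8                     # assumed constraint on <E>

    def constraint_gap(lam):
        # Boltzmann weights; their sum is the partition function Z(lambda)
        w = np.exp(-lam * energies)
        return np.sum(energies * w) / np.sum(w) - target_mean

    # Solve for the Lagrange multiplier that enforces <E> = target_mean
    lam = brentq(constraint_gap, -10.0, 10.0)
    p = np.exp(-lam * energies)
    p /= p.sum()  # dividing by Z normalizes the distribution
    print("max-entropy distribution:", p, " <E> =", p @ energies)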

12,099 citations

Book
06 Oct 2003
TL;DR: A fun and exciting textbook on the mathematics underpinning the most dynamic areas of modern science and engineering.

8,091 citations

Journal Article
Claude E. Shannon
01 Jan 1949
TL;DR: A method is developed for representing any communication system geometrically and a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect.
Abstract: A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two "function spaces," and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of "ideal" systems which transmit at this maximum rate are discussed. The equivalent number of binary digits per second for certain information sources is calculated.
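The best known of these maximum-rate formulas is the capacity of a band-limited channel with additive white Gaussian noise, C = W log2(1 + P/N) bits per second. A minimal Python sketch; the bandwidth and SNR below are arbitrary example values, not figures from the paper:

    import math

    def awgn_capacity(bandwidth_hz, signal_power, noise_power):
        # Shannon-Hartley: C = W * log2(1 + S/N), in bits per second
        return bandwidth_hz * math.log2(1 + signal_power / noise_power)

    # Example: 3 kHz telephone-style channel at 30 dB SNR (assumed values)
    snr_linear = 10 ** (30 / 10)
    print(awgn_capacity(3000.0, snr_linear, 1.0))  # ~29.9 kbit/s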

6,712 citations

Book
01 Jan 1968
TL;DR: This book covers Coding for Discrete Sources, Techniques for Coding and Decoding, and Source Coding with a Fidelity Criterion.
Abstract: Communication Systems and Information Theory. A Measure of Information. Coding for Discrete Sources. Discrete Memoryless Channels and Capacity. The Noisy-Channel Coding Theorem. Techniques for Coding and Decoding. Memoryless Channels with Discrete Time. Waveform Channels. Source Coding with a Fidelity Criterion. Index.
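The "measure of information" at the head of that table of contents is the source entropy, H = -sum_i p_i log2(p_i), which lower-bounds the average number of binary digits per symbol achievable by any lossless code for a discrete memoryless source. A minimal Python sketch with an invented four-symbol source:

    import math

    def entropy(probs):
        # H = -sum p * log2(p), in bits per source symbol
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Example source with four symbols (assumed probabilities)
    probs = [0.5, 0.25, 0.125, 0.125]
    print(entropy(probs))  # 1.75 bits: no lossless code can average fewer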

6,684 citations


Network Information
Related Topics (5)

Topic                            Papers    Citations   Relatedness
Communication channel            137.4K    1.7M        86%
Robustness (computer science)    94.7K     1.6M        83%
Matrix (mathematics)             105.5K    1.9M        83%
Cluster analysis                 146.5K    2.9M        82%
Optimization problem             96.4K     2.1M        82%
Performance
Metrics

No. of papers in the topic in previous years:

Year   Papers
2023   108
2022   222
2021   331
2020   393
2019   375
2018   358