
Journal ArticleDOI

Logical information theory: new logical foundations for information theory

01 Oct 2017-Logic Journal of the IGPL (Oxford Academic)-Vol. 25, Iss: 5, pp 806-835

TL;DR: The definition of Shannon entropy as well as the notions of joint, conditional, and mutual entropy as defined by Shannon can all be derived by a uniform transformation from the corresponding formulas of logical information theory, which provides the set-theoretic and measure-theoretic foundations for information theory.
Abstract: There is a new theory of information based on logic. The definition of Shannon entropy as well as the notions of joint, conditional, and mutual entropy as defined by Shannon can all be derived by a uniform transformation from the corresponding formulas of logical information theory. Information is first defined in terms of sets of distinctions without using any probability measure. When a probability measure is introduced, the logical entropies are simply the values of the (product) probability measure on the sets of distinctions. The compound notions of joint, conditional, and mutual entropies are obtained as the values of the measure, respectively, on the union, difference, and intersection of the sets of distinctions. These compound notions of logical entropy satisfy the usual Venn diagram relationships (e.g., inclusion-exclusion formulas) since they are values of a measure (in the sense of measure theory). The uniform transformation into the formulas for Shannon entropy is linear, so it explains the long-noted fact that the Shannon formulas satisfy the Venn diagram relations--as an analogy or mnemonic--since Shannon entropy is not a measure (in the sense of measure theory) on a given set. What is the logic that gives rise to logical information theory? Partitions are dual (in a category-theoretic sense) to subsets, and the logic of partitions was recently developed in a dual/parallel relationship to the Boolean logic of subsets (the latter being usually mis-specified as the special case of "propositional logic"). Boole developed logical probability theory as the normalized counting measure on subsets. Similarly, the normalized counting measure on partitions is logical entropy--when the partitions are represented as the set of distinctions that is the complement to the equivalence relation for the partition. In this manner, logical information theory provides the set-theoretic and measure-theoretic foundations for information theory. The Shannon theory is then derived by the transformation that replaces the counting of distinctions with the counting of the number of binary partitions (bits) it takes, on average, to make the same distinctions by uniquely encoding the distinct elements--which is why the Shannon theory perfectly dovetails into coding and communications theory.



Logical Information Theory:
New Logical Foundations for Information Theory
[Forthcoming in: Logic Journal of the IGPL]
David Ellerman
Philosophy Department,
University of California at Riverside
June 7, 2017
Abstract
There is a new theory of information based on logic. The definition of Shannon entropy as well as the notions of joint, conditional, and mutual entropy as defined by Shannon can all be derived by a uniform transformation from the corresponding formulas of logical information theory. Information is first defined in terms of sets of distinctions without using any probability measure. When a probability measure is introduced, the logical entropies are simply the values of the (product) probability measure on the sets of distinctions. The compound notions of joint, conditional, and mutual entropies are obtained as the values of the measure, respectively, on the union, difference, and intersection of the sets of distinctions. These compound notions of logical entropy satisfy the usual Venn diagram relationships (e.g., inclusion-exclusion formulas) since they are values of a measure (in the sense of measure theory). The uniform transformation into the formulas for Shannon entropy is linear, so it explains the long-noted fact that the Shannon formulas satisfy the Venn diagram relations--as an analogy or mnemonic--since Shannon entropy is not a measure (in the sense of measure theory) on a given set.
What is the logic that gives rise to logical information theory? Partitions are dual (in a category-theoretic sense) to subsets, and the logic of partitions was recently developed in a dual/parallel relationship to the Boolean logic of subsets (the latter being usually mis-specified as the special case of "propositional logic"). Boole developed logical probability theory as the normalized counting measure on subsets. Similarly, the normalized counting measure on partitions is logical entropy--when the partitions are represented as the set of distinctions that is the complement to the equivalence relation for the partition.
In this manner, logical information theory provides the set-theoretic and measure-theoretic foundations for information theory. The Shannon theory is then derived by the transformation that replaces the counting of distinctions with the counting of the number of binary partitions (bits) it takes, on average, to make the same distinctions by uniquely encoding the distinct elements--which is why the Shannon theory perfectly dovetails into coding and communications theory.
Key words: partition logic, logical entropy, Shannon entropy
Contents
1 Introduction
2 Logical information as the measure of distinctions
3 Duality of subsets and partitions
4 Classical subset logic and partition logic
5 Classical logical probability and logical entropy
6 Entropy as a measure of information
7 The dit-bit transform
8 Information algebras and joint distributions
9 Conditional entropies
9.1 Logical conditional entropy
9.2 Shannon conditional entropy
10 Mutual information
10.1 Logical mutual information
10.2 Shannon mutual information
11 Independent joint distributions
12 Cross-entropies and divergences
13 Summary of formulas and dit-bit transforms
14 Entropies for multivariate joint distributions
15 Logical entropy and some related notions
16 The statistical interpretation of Shannon entropy
17 Concluding remarks
1 Introduction
This paper develops the logical theory of information-as-distinctions. It can be seen as the application
of the logic of partitions [15] to information theory. Partitions are dual (in a category-theoretic sense)
to subsets. George Boole developed the notion of logical probability [7] as the normalized counting
measure on subsets in his logic of subsets. This paper develops the normalized counting measure on
partitions as the analogous quantitative treatment in the logic of partitions. The resulting measure is
a new logical derivation of an old formula measuring diversity and distinctions, e.g., Corrado Gini’s
index of mutability or diversity [19], that goes back to the early 20th century. In view of the idea
of information as being based on distinctions (see next section), I refer to this logical measure of
distinctions as "logical entropy".
This raises the question of the relationship of logical entropy to Claude Shannon’s entropy
([40]; [41]). The entropies are closely related since they are both ultimately based on the concept
of information-as-distinctions--but they represent two different ways to quantify distinctions. Logical
entropy directly counts the distinctions (as defined in partition logic) whereas Shannon entropy, in
effect, counts the minimum number of binary partitions (or yes/no questions) it takes, on average,
to uniquely determine or designate the distinct entities. Since that gives (in standard examples) a
binary code for the distinct entities, the Shannon theory is perfectly adapted for applications to the
theory of coding and communications.
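As a small numerical illustration of the contrast (the example is mine, not the paper's): for 2^n equiprobable outcomes, the Shannon count is the n yes/no questions needed to single out an outcome, while the logical count is the fraction of ordered pairs of outcomes that are distinctions. A minimal sketch in Python, using the standard formulas H(p) = sum_i p_i log2(1/p_i) and h(p) = 1 - sum_i p_i^2:

```python
from math import log2

def shannon_entropy(p):
    # Average number of binary partitions (bits) needed, on average,
    # to single out an outcome: H(p) = sum_i p_i * log2(1/p_i).
    return sum(pi * log2(1.0 / pi) for pi in p if pi > 0)

def logical_entropy(p):
    # Probability that two independent draws are distinct (make a "dit"):
    # h(p) = 1 - sum_i p_i^2.
    return 1.0 - sum(pi * pi for pi in p)

n = 3
p = [1.0 / 2**n] * 2**n            # 2^n = 8 equiprobable outcomes
print(shannon_entropy(p))          # 3.0: three yes/no questions
print(logical_entropy(p))          # 0.875: 7/8 of ordered pairs are distinctions
```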

The logical theory and the Shannon theory are also related in their compound notions of joint
entropy, conditional entropy, and mutual information. Logical entropy is a measure in the math-
ematical sense, so as with any measure, the compound formulas satisfy the usual Venn-diagram
relationships. The compound notions of Shannon entropy were defined so that they also satisfy
similar Venn diagram relationships. However, as various information theorists, principally Lorne
Campbell, have noted [9], Shannon entropy is not a measure (outside of the standard example of 2^n
equiprobable distinct entities, where it is the count n of the number of yes/no questions necessary to
uniquely determine or encode the distinct entities)--so one can conclude only that the "analogies pro-
vide a convenient mnemonic" [9, p. 112] in terms of the usual Venn diagrams for measures. Campbell
wondered if there might be a "deeper foundation" [9, p. 112] to clarify how the Shannon formulas
can be defined to satisfy the measure-like relations in spite of not being a measure. That question is
addressed in this paper by showing that there is a transformation of formulas that transforms each of
the logical entropy compound formulas into the corresponding Shannon entropy compound formula,
and the transform preserves the Venn diagram relationships that automatically hold for measures.
This "dit-bit transform" is heuristically motivated by showing how average counts of distinctions
("dits") can be converted into average counts of binary partitions ("bits").
Moreover, Campbell remarked that it would be "particularly interesting" and "quite significant"
if there were an entropy measure of sets so that joint entropy corresponded to the measure of the
union of sets, conditional entropy to the difference of sets, and mutual information to the intersection
of sets [9, p. 113]. Logical entropy precisely satisfies those requirements.
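A minimal sketch of that claim (the joint distribution below is made up for illustration): the compound logical entropies are computed from a joint distribution via h(X,Y) = 1 - sum p(x,y)^2, with the conditional and mutual notions obtained as the measure of the difference and intersection of the ditsets, and the three-region Venn decomposition h(X,Y) = h(X|Y) + h(Y|X) + m(X,Y) then holds exactly.

```python
# Sketch: compound logical entropies of a joint distribution and the exact
# Venn-diagram decomposition they satisfy as values of a measure.
# The joint distribution p(x, y) below is assumed purely for illustration.
p = {("a", 0): 0.3, ("a", 1): 0.2, ("b", 0): 0.1, ("b", 1): 0.4}

px, py = {}, {}
for (x, y), pr in p.items():
    px[x] = px.get(x, 0.0) + pr        # marginal p(x)
    py[y] = py.get(y, 0.0) + pr        # marginal p(y)

def h(dist):
    # Logical entropy: probability that two independent draws differ.
    return 1.0 - sum(pr * pr for pr in dist.values())

h_X, h_Y, h_XY = h(px), h(py), h(p)
h_X_given_Y = h_XY - h_Y               # measure of the ditset difference
h_Y_given_X = h_XY - h_X
m_XY = h_X + h_Y - h_XY                # measure of the ditset intersection

# The union decomposes into three disjoint Venn regions, exactly.
assert abs(h_XY - (h_X_given_Y + h_Y_given_X + m_XY)) < 1e-12
print(h_X, h_Y, h_XY, h_X_given_Y, m_XY)   # approx. 0.5, 0.48, 0.7, 0.22, 0.28
```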
2 Logical information as the measure of distinctions
There is now a widespread view that information is fundamentally about differences, distinguisha-
bility, and distinctions. As Charles H. Bennett, one of the founders of quantum information theory,
put it:
So information really is a very useful abstraction. It is the notion of distinguishability
abstracted away from what we are distinguishing, or from the carrier of information. [5,
p. 155]
This view even has an interesting history. In James Gleick's book, The Information: A History,
A Theory, A Flood, he noted the focus on differences in the seventeenth-century polymath, John
Wilkins, who was a founder of the Royal Society. In 1641, the year before Isaac Newton was born,
Wilkins published one of the earliest books on cryptography, Mercury or the Secret and Swift Mes-
senger, which not only pointed out the fundamental role of differences but noted that any (finite)
set of different things could be encoded by words in a binary code.
For in the general we must note, That whatever is capable of a competent Difference,
perceptible to any Sense, may be a sufficient Means whereby to express the Cogitations.
It is more convenient, indeed, that these Differences should be of as great Variety as the
Letters of the Alphabet; but it is sufficient if they be but twofold, because Two alone
may, with somewhat more Labour and Time, be well enough contrived to express all the
rest. [47, Chap. XVII, p. 69]
Wilkins explains that a five-letter binary code would be sufficient to code the letters of the alphabet
since 2^5 = 32.
Thus any two Letters or Numbers, suppose A:B. being transposed through five Places,
will yield Thirty Two Differences, and so consequently will superabundantly serve for
the Four and twenty Letters... . [47, Chap. XVII, p. 69]

As Gleick noted:
Any difference meant a binary choice. Any binary choice began the expressing of cogi-
tations. Here, in this arcane and anonymous treatise of 1641, the essential idea of infor-
mation theory poked to the surface of human thought, saw its shadow, and disappeared
again for [three] hundred years. [20, p. 161]
Thus counting distinctions [12] would seem the right way to measure information,¹ and that is the
measure which emerges naturally out of partition logic--just as finite logical probability emerges
naturally as the measure of counting elements in Boole's subset logic.
Although usually named after the special case of "propositional" logic, the general case is Boole's
logic of subsets of a universe U (the special case of U = 1 allows the propositional interpretation
since the only subsets are 1 and ∅ standing for truth and falsity). Category theory shows there is a
duality between sub-sets and quotient-sets (= partitions = equivalence relations), and that allowed
the recent development of the dual logic of partitions ([13], [15]). As indicated in the title of his
book, An Investigation of the Laws of Thought on which are founded the Mathematical Theories of
Logic and Probabilities [7], Boole also developed the normalized counting measure on subsets of a
finite universe U which was finite logical probability theory. When the same mathematical notion
of the normalized counting measure is applied to the partitions on a finite universe set U (when the
partition is represented as the complement of the corresponding equivalence relation on U × U) then
the result is the formula for logical entropy.
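A minimal sketch of that construction (the universe and partition below are assumed purely for illustration): the ditset is the complement in U × U of the partition's equivalence relation, and its normalized counting measure agrees with the block-size formula 1 - sum_B (|B|/|U|)^2.

```python
from itertools import product

# Sketch: the ditset of a partition as the complement in U x U of its
# equivalence relation, and logical entropy as the normalized counting
# measure of the ditset.  U and the partition are illustrative choices.
U = {1, 2, 3, 4, 5}
partition = [{1, 2}, {3}, {4, 5}]

indit = {(u, v) for B in partition for u, v in product(B, B)}   # equivalence relation
dit = set(product(U, U)) - indit                                # set of distinctions

h_counting = len(dit) / len(U) ** 2                             # |dit| / |U x U|
h_blocks = 1.0 - sum((len(B) / len(U)) ** 2 for B in partition)
print(h_counting, h_blocks)      # both equal 16/25 = 0.64 (up to float rounding)
```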
In addition to the philosophy of information literature [4], there is a whole sub-industry in
mathematics concerned with different notions of 'entropy' or 'information' ([2]; see [45] for a recent
'extensive' analysis) that is long on formulas and intuitive 'axioms' but short on interpretations.
Out of that plethora of definitions, logical entropy is the measure (in the technical sense of measure
theory) of information that arises out of partition logic just as logical probability theory arises out
of subset logic.
The logical notion of information-as-distinctions supports the view that the notion of information
is independent of the notion of probability and should be based on finite combinatorics. As Andrei
Kolmogorov put it:
Information theory must precede probability theory, and not be based on it. By the very
essence of this discipline, the foundations of information theory have a finite combinato-
rial character. [27, p. 39]
Logical information theory precisely fulfills Kolmogorov's criterion.² It starts simply with a set
of distinctions defined by a partition on a finite set U, where a distinction is an ordered pair of
elements of U in distinct blocks of the partition. Thus the "finite combinatorial" object is the
set of distinctions ("ditset") or information set ("infoset") associated with the partition, i.e., the
complement in U × U of the equivalence relation associated with the partition. To get a quantitative
measure of information, any probability distribution on U defines a product probability measure on
U × U, and the logical entropy is simply that probability measure of the information set.
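A minimal sketch of that last step (the point distribution and partition are assumed for illustration): the product measure assigns p(u)p(u') to each ordered pair, and summing it over the ditset gives the logical entropy, which equals 1 minus the sum of the squared block probabilities.

```python
from itertools import product

# Sketch: logical entropy as the product probability measure of the information
# set (ditset).  The distribution p on U and the partition are illustrative.
p = {"a": 0.5, "b": 0.2, "c": 0.2, "d": 0.1}
partition = [{"a", "b"}, {"c", "d"}]

block_of = {u: i for i, B in enumerate(partition) for u in B}
dit = {(u, v) for u, v in product(p, p) if block_of[u] != block_of[v]}

h_measure = sum(p[u] * p[v] for u, v in dit)                   # (p x p)(dit(pi))
h_blocks = 1.0 - sum(sum(p[u] for u in B) ** 2 for B in partition)
print(h_measure, h_blocks)                                     # both 0.42 here
```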
3 Duality of subsets and partitions
Logical entropy is to the logic of partitions as logical probability is to the Boolean logic of sub-
sets. Hence we will start with a brief review of the relationship between these two dual forms of
¹ This paper is about what Adriaans and van Benthem call "Information B: Probabilistic, information-theoretic,
measured quantitatively", not about "Information A: knowledge, logic, what is conveyed in informative answers" where
the connection to philosophy and logic is built-in from the beginning. Likewise, the paper is not about Kolmogorov-
style "Information C: Algorithmic, code compression, measured quantitatively." [4, p. 11]
² Kolmogorov had something else in mind, such as a combinatorial development of Hartley's log(n) on a set of n
equiprobable elements. [28]

Citations

Journal ArticleDOI
16 Feb 2018-Entropy
TL;DR: A new kind of entropy is proposed in product MV-algebras, namely the logical entropy and its conditional version; the logical cross entropy and logical divergence are also defined.
Abstract: In the paper we propose, using the logical entropy function, a new kind of entropy in product MV-algebras, namely the logical entropy and its conditional version. Fundamental characteristics of these quantities have been shown and subsequently, the results regarding the logical entropy have been used to define the logical mutual information of experiments in the studied case. In addition, we define the logical cross entropy and logical divergence for the examined situation and prove basic properties of the suggested quantities. To illustrate the results, we provide several numerical examples.

6 citations


Posted Content
David Ellerman
Abstract: The development of the new logic of partitions (= equivalence relations) dual to the usual Boolean logic of subsets, and its quantitative version as the new logical theory of information provide the basic mathematical concepts to describe distinctions/indistinctions, definiteness/indefiniteness, and distinguishability/indistinguishability. They throw some new light on the objective indefiniteness or literal interpretation of quantum mechanics (QM) advocated by Abner Shimony. This paper shows how the mathematics of QM is the math of indefiniteness and thus, literally and realistically interpreted, it describes an objectively indefinite reality at the quantum level. In particular, the mathematics of wave propagation is shown to also be the math of the evolution of indefinite states that do not change the degree of indistinctness between states. This corrects the historical wrong turn of seeing QM as "wave mechanics" rather than the mechanics of particles with indefinite/definite properties. For example, the so-called "wave-particle duality" for particles is the juxtaposition of the evolution of a particle having an indefinite position ("wave-like" behavior) with a particle having a definite position ("particle-like" behavior).

4 citations


Cites background from "Logical information theory: new log..."

  • ...Logical information theory is the foundational theory of information based on the intuitive idea of information as distinctions, differences, and distinguishability [14]....



Journal ArticleDOI
09 Aug 2018-Entropy
TL;DR: It is shown that the Tsallis entropy of order α, where α>1, has the property of sub-additivity, and it is proven that the proposed entropy measure is invariant under isomorphism of product MV-algebra dynamical systems.
Abstract: This paper is concerned with the mathematical modelling of Tsallis entropy in product MV-algebra dynamical systems. We define the Tsallis entropy of order α, where α ∈ (0, 1) ∪ (1, ∞), of a partition in a product MV-algebra and its conditional version, and we examine their properties. Among others, it is shown that the Tsallis entropy of order α, where α > 1, has the property of sub-additivity. This property allows us to define, for α > 1, the Tsallis entropy of a product MV-algebra dynamical system. It is proven that the proposed entropy measure is invariant under isomorphism of product MV-algebra dynamical systems.

3 citations



Journal ArticleDOI
11 Apr 2018-Entropy
TL;DR: It is shown that the Shannon entropy and the conditional Shannon entropy of fuzzy partitions can be derived from the R-norm entropy and conditional R-norm entropy of fuzzy partitions, respectively, as the limiting cases for R going to 1.
Abstract: In the presented article, we define the R-norm entropy and the conditional R-norm entropy of partitions of a given fuzzy probability space and study the properties of the suggested entropy measures. In addition, we introduce the concept of R-norm divergence of fuzzy P-measures and we derive fundamental properties of this quantity. Specifically, it is shown that the Shannon entropy and the conditional Shannon entropy of fuzzy partitions can be derived from the R-norm entropy and conditional R-norm entropy of fuzzy partitions, respectively, as the limiting cases for R going to 1; the Kullback–Leibler divergence of fuzzy P-measures may be inferred from the R-norm divergence of fuzzy P-measures as the limiting case for R going to 1. We also provide numerical examples that illustrate the results.

2 citations


References

Journal ArticleDOI
TL;DR: This final installment of the paper considers the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.
Abstract: In this final installment of the paper we consider the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now. To a considerable extent the continuous case can be obtained through a limiting process from the discrete case by dividing the continuum of messages and signals into a large but finite number of small regions and calculating the various parameters involved on a discrete basis. As the size of the regions is decreased these parameters in general approach as limits the proper values for the continuous case. There are, however, a few new effects that appear and also a general change of emphasis in the direction of specialization of the general results to particular cases.

60,029 citations


Book
Thomas M. Cover, Joy A. Thomas
01 Jan 1991-
TL;DR: The author examines the role of entropy, inequality, and randomness in the design of codes and the construction of codes in the rapidly changing environment.
Abstract: Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the First Edition. 1. Introduction and Preview. 1.1 Preview of the Book. 2. Entropy, Relative Entropy, and Mutual Information. 2.1 Entropy. 2.2 Joint Entropy and Conditional Entropy. 2.3 Relative Entropy and Mutual Information. 2.4 Relationship Between Entropy and Mutual Information. 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information. 2.6 Jensen's Inequality and Its Consequences. 2.7 Log Sum Inequality and Its Applications. 2.8 Data-Processing Inequality. 2.9 Sufficient Statistics. 2.10 Fano's Inequality. Summary. Problems. Historical Notes. 3. Asymptotic Equipartition Property. 3.1 Asymptotic Equipartition Property Theorem. 3.2 Consequences of the AEP: Data Compression. 3.3 High-Probability Sets and the Typical Set. Summary. Problems. Historical Notes. 4. Entropy Rates of a Stochastic Process. 4.1 Markov Chains. 4.2 Entropy Rate. 4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph. 4.4 Second Law of Thermodynamics. 4.5 Functions of Markov Chains. Summary. Problems. Historical Notes. 5. Data Compression. 5.1 Examples of Codes. 5.2 Kraft Inequality. 5.3 Optimal Codes. 5.4 Bounds on the Optimal Code Length. 5.5 Kraft Inequality for Uniquely Decodable Codes. 5.6 Huffman Codes. 5.7 Some Comments on Huffman Codes. 5.8 Optimality of Huffman Codes. 5.9 Shannon-Fano-Elias Coding. 5.10 Competitive Optimality of the Shannon Code. 5.11 Generation of Discrete Distributions from Fair Coins. Summary. Problems. Historical Notes. 6. Gambling and Data Compression. 6.1 The Horse Race. 6.2 Gambling and Side Information. 6.3 Dependent Horse Races and Entropy Rate. 6.4 The Entropy of English. 6.5 Data Compression and Gambling. 6.6 Gambling Estimate of the Entropy of English. Summary. Problems. Historical Notes. 7. Channel Capacity. 7.1 Examples of Channel Capacity. 7.2 Symmetric Channels. 7.3 Properties of Channel Capacity. 7.4 Preview of the Channel Coding Theorem. 7.5 Definitions. 7.6 Jointly Typical Sequences. 7.7 Channel Coding Theorem. 7.8 Zero-Error Codes. 7.9 Fano's Inequality and the Converse to the Coding Theorem. 7.10 Equality in the Converse to the Channel Coding Theorem. 7.11 Hamming Codes. 7.12 Feedback Capacity. 7.13 Source-Channel Separation Theorem. Summary. Problems. Historical Notes. 8. Differential Entropy. 8.1 Definitions. 8.2 AEP for Continuous Random Variables. 8.3 Relation of Differential Entropy to Discrete Entropy. 8.4 Joint and Conditional Differential Entropy. 8.5 Relative Entropy and Mutual Information. 8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information. Summary. Problems. Historical Notes. 9. Gaussian Channel. 9.1 Gaussian Channel: Definitions. 9.2 Converse to the Coding Theorem for Gaussian Channels. 9.3 Bandlimited Channels. 9.4 Parallel Gaussian Channels. 9.5 Channels with Colored Gaussian Noise. 9.6 Gaussian Channels with Feedback. Summary. Problems. Historical Notes. 10. Rate Distortion Theory. 10.1 Quantization. 10.2 Definitions. 10.3 Calculation of the Rate Distortion Function. 10.4 Converse to the Rate Distortion Theorem. 10.5 Achievability of the Rate Distortion Function. 10.6 Strongly Typical Sequences and Rate Distortion. 10.7 Characterization of the Rate Distortion Function. 10.8 Computation of Channel Capacity and the Rate Distortion Function. Summary. Problems. Historical Notes. 11. Information Theory and Statistics. 11.1 Method of Types. 11.2 Law of Large Numbers. 
11.3 Universal Source Coding. 11.4 Large Deviation Theory. 11.5 Examples of Sanov's Theorem. 11.6 Conditional Limit Theorem. 11.7 Hypothesis Testing. 11.8 Chernoff-Stein Lemma. 11.9 Chernoff Information. 11.10 Fisher Information and the Cram-er-Rao Inequality. Summary. Problems. Historical Notes. 12. Maximum Entropy. 12.1 Maximum Entropy Distributions. 12.2 Examples. 12.3 Anomalous Maximum Entropy Problem. 12.4 Spectrum Estimation. 12.5 Entropy Rates of a Gaussian Process. 12.6 Burg's Maximum Entropy Theorem. Summary. Problems. Historical Notes. 13. Universal Source Coding. 13.1 Universal Codes and Channel Capacity. 13.2 Universal Coding for Binary Sequences. 13.3 Arithmetic Coding. 13.4 Lempel-Ziv Coding. 13.5 Optimality of Lempel-Ziv Algorithms. Compression. Summary. Problems. Historical Notes. 14. Kolmogorov Complexity. 14.1 Models of Computation. 14.2 Kolmogorov Complexity: Definitions and Examples. 14.3 Kolmogorov Complexity and Entropy. 14.4 Kolmogorov Complexity of Integers. 14.5 Algorithmically Random and Incompressible Sequences. 14.6 Universal Probability. 14.7 Kolmogorov complexity. 14.9 Universal Gambling. 14.10 Occam's Razor. 14.11 Kolmogorov Complexity and Universal Probability. 14.12 Kolmogorov Sufficient Statistic. 14.13 Minimum Description Length Principle. Summary. Problems. Historical Notes. 15. Network Information Theory. 15.1 Gaussian Multiple-User Channels. 15.2 Jointly Typical Sequences. 15.3 Multiple-Access Channel. 15.4 Encoding of Correlated Sources. 15.5 Duality Between Slepian-Wolf Encoding and Multiple-Access Channels. 15.6 Broadcast Channel. 15.7 Relay Channel. 15.8 Source Coding with Side Information. 15.9 Rate Distortion with Side Information. 15.10 General Multiterminal Networks. Summary. Problems. Historical Notes. 16. Information Theory and Portfolio Theory. 16.1 The Stock Market: Some Definitions. 16.2 Kuhn-Tucker Characterization of the Log-Optimal Portfolio. 16.3 Asymptotic Optimality of the Log-Optimal Portfolio. 16.4 Side Information and the Growth Rate. 16.5 Investment in Stationary Markets. 16.6 Competitive Optimality of the Log-Optimal Portfolio. 16.7 Universal Portfolios. 16.8 Shannon-McMillan-Breiman Theorem (General AEP). Summary. Problems. Historical Notes. 17. Inequalities in Information Theory. 17.1 Basic Inequalities of Information Theory. 17.2 Differential Entropy. 17.3 Bounds on Entropy and Relative Entropy. 17.4 Inequalities for Types. 17.5 Combinatorial Bounds on Entropy. 17.6 Entropy Rates of Subsets. 17.7 Entropy and Fisher Information. 17.8 Entropy Power Inequality and Brunn-Minkowski Inequality. 17.9 Inequalities for Determinants. 17.10 Inequalities for Ratios of Determinants. Summary. Problems. Historical Notes. Bibliography. List of Symbols. Index.

42,928 citations


Book
01 Jan 2000-
Abstract: Part I Fundamental Concepts: 1 Introduction and overview 2 Introduction to quantum mechanics 3 Introduction to computer science Part II Quantum Computation: 4 Quantum circuits 5 The quantum Fourier transform and its application 6 Quantum search algorithms 7 Quantum computers: physical realization Part III Quantum Information: 8 Quantum noise and quantum operations 9 Distance measures for quantum information 10 Quantum error-correction 11 Entropy and information 12 Quantum information theory Appendices References Index

25,609 citations


01 Dec 2010-
TL;DR: This chapter discusses quantum information theory, public-key cryptography and the RSA cryptosystem, and the proof of Lieb's theorem.
Abstract: Part I. Fundamental Concepts: 1. Introduction and overview 2. Introduction to quantum mechanics 3. Introduction to computer science Part II. Quantum Computation: 4. Quantum circuits 5. The quantum Fourier transform and its application 6. Quantum search algorithms 7. Quantum computers: physical realization Part III. Quantum Information: 8. Quantum noise and quantum operations 9. Distance measures for quantum information 10. Quantum error-correction 11. Entropy and information 12. Quantum information theory Appendices References Index.

14,183 citations


Journal ArticleDOI
01 Sep 1950-Physics Today
TL;DR: The theory of communication is extended to include a number of new factors, in particular the effect of noise in the channel, and the savings possible due to the statistical structure of the original message anddue to the nature of the final destination of the information.
Abstract: The recent development of various methods of modulation such as PCM and PPM which exchange bandwidth for signal-to-noise ratio has intensified the interest in a general theory of communication. A basis for such a theory is contained in the important papers of Nyquist1 and Hartley2 on this subject. In the present paper we will extend the theory to include a number of new factors, in particular the effect of noise in the channel, and the savings possible due to the statistical structure of the original message and due to the nature of the final destination of the information. The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen since this is unknown at the time of design. If the number of messages in the set is finite then this number or any monotonic function of this number can be regarded as a measure of the information produced when one message is chosen from the set, all choices being equally likely. As was pointed out by Hartley the most natural choice is the logarithmic function. Although this definition must be generalized considerably when we consider the influence of the statistics of the message and when we have a continuous range of messages, we will in all cases use an essentially logarithmic measure. The logarithmic measure is more convenient for various reasons:

10,248 citations


Performance Metrics
No. of citations received by the Paper in previous years
Year  Citations
2021  1
2020  1
2019  2
2018  5
2017  1