Topic
Binary entropy function
About: The binary entropy function is a research topic. Over its lifetime, 1,987 publications have appeared within this topic, receiving 41,678 citations.
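For reference, the topic's namesake is the entropy of a Bernoulli(p) source, H(p) = -p log2 p - (1-p) log2 (1-p) bits. A minimal sketch in plain Python (standard library only; the function name is ours):

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits.

    Uses the standard convention 0*log(0) = 0, so H(0) = H(1) = 0.
    """
    if not 0.0 <= p <= 1.0:
        raise ValueError("p must lie in [0, 1]")
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

# H is concave and symmetric about p = 1/2, where it peaks at exactly 1 bit.
assert binary_entropy(0.5) == 1.0
```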
[Chart: papers published on a yearly basis]
Papers
TL;DR: In this book, the author develops large deviation theory for i.i.d. random variables, first with a finite state space and then for general random vectors, and applies it to models of statistical mechanics.
Abstract: I: Large Deviations and Statistical Mechanics.
I. Introduction to Large Deviations: I.1. Overview; I.2. Large Deviations for I.I.D. Random Variables with a Finite State Space; I.3. Levels-1 and 2 for Coin Tossing; I.4. Levels-1 and 2 for I.I.D. Random Variables with a Finite State Space; I.5. Level-3: Empirical Pair Measure; I.6. Level-3: Empirical Process; I.7. Notes; I.8. Problems.
II. Large Deviation Property and Asymptotics of Integrals: II.1. Introduction; II.2. Levels-1, 2, and 3 Large Deviations for I.I.D. Random Vectors; II.3. The Definition of Large Deviation Property; II.4. Statement of Large Deviation Properties for Levels-1, 2, and 3; II.5. Contraction Principles; II.6. Large Deviation Property for Random Vectors and Exponential Convergence; II.7. Varadhan's Theorem on the Asymptotics of Integrals; II.8. Notes; II.9. Problems.
III. Large Deviations and the Discrete Ideal Gas: III.1. Introduction; III.2. Physics Prelude: Thermodynamics; III.3. The Discrete Ideal Gas and the Microcanonical Ensemble; III.4. Thermodynamic Limit, Exponential Convergence, and Equilibrium Values; III.5. The Maxwell-Boltzmann Distribution and Temperature; III.6. The Canonical Ensemble and Its Equivalence with the Microcanonical Ensemble; III.7. A Derivation of a Thermodynamic Equation; III.8. The Gibbs Variational Formula and Principle; III.9. Notes; III.10. Problems.
IV. Ferromagnetic Models on ℤ: IV.1. Introduction; IV.2. An Overview of Ferromagnetic Models; IV.3. Finite-Volume Gibbs States on ℤ; IV.4. Spontaneous Magnetization for the Curie-Weiss Model; IV.5. Spontaneous Magnetization for General Ferromagnets on ℤ; IV.6. Infinite-Volume Gibbs States and Phase Transitions; IV.7. The Gibbs Variational Formula and Principle; IV.8. Notes; IV.9. Problems.
V. Magnetic Models on ℤ^D and on the Circle: V.1. Introduction; V.2. Finite-Volume Gibbs States on ℤ^D, D ≥ 1; V.3. Moment Inequalities; V.4. Properties of the Magnetization and the Gibbs Free Energy; V.5. Spontaneous Magnetization on ℤ^D, D ≥ 2, Via the Peierls Argument; V.6. Infinite-Volume Gibbs States and Phase Transitions; V.7. Infinite-Volume Gibbs States and the Central Limit Theorem; V.8. Critical Phenomena and the Breakdown of the Central Limit Theorem; V.9. Three Faces of the Curie-Weiss Model; V.10. The Circle Model and Random Waves; V.11. A Postscript on Magnetic Models; V.12. Notes; V.13. Problems.
II: Convexity and Proofs of Large Deviation Theorems.
VI. Convex Functions and the Legendre-Fenchel Transform: VI.1. Introduction; VI.2. Basic Definitions; VI.3. Properties of Convex Functions; VI.4. A One-Dimensional Example of the Legendre-Fenchel Transform; VI.5. The Legendre-Fenchel Transform for Convex Functions on ℝ^d; VI.6. Notes; VI.7. Problems.
VII. Large Deviations for Random Vectors: VII.1. Statement of Results; VII.2. Properties of I_W; VII.3. Proof of the Large Deviation Bounds for d = 1; VII.4. Proof of the Large Deviation Bounds for d ≥ 1; VII.5. Level-1 Large Deviations for I.I.D. Random Vectors; VII.6. Exponential Convergence and Proof of Theorem II.6.3; VII.7. Notes; VII.8. Problems.
VIII. Level-2 Large Deviations for I.I.D. Random Vectors: VIII.1. Introduction; VIII.2. The Level-2 Large Deviation Theorem; VIII.3. The Contraction Principle Relating Levels-1 and 2 (d = 1); VIII.4. The Contraction Principle Relating Levels-1 and 2 (d ≥ 2); VIII.5. Notes; VIII.6. Problems.
IX. Level-3 Large Deviations for I.I.D. Random Vectors: IX.1. Statement of Results; IX.2. Properties of the Level-3 Entropy Function; IX.3. Contraction Principles; IX.4. Proof of the Level-3 Large Deviation Bounds; IX.5. Notes; IX.6. Problems.
Appendix A: Probability: A.1. Introduction; A.2. Measurability; A.3. Product Spaces; A.4. Probability Measures and Expectation; A.5. Convergence of Random Vectors; A.6. Conditional Expectation, Conditional Probability, and Regular Conditional Distribution; A.7. The Kolmogorov Existence Theorem; A.8. Weak Convergence of Probability Measures on a Metric Space.
Appendix B: Proofs of Two Theorems in Section II.7: B.1. Proof of Theorem II.7.1; B.2. Proof of Theorem II.7.2.
Appendix C: Equivalent Notions of Infinite-Volume Measures for Spin Systems: C.1. Introduction; C.2. Two-Body Interactions and Infinite-Volume Gibbs States; C.3. Many-Body Interactions and Infinite-Volume Gibbs States; C.4. DLR States; C.5. The Gibbs Variational Formula and Principle; C.6. Solution of the Gibbs Variational Formula for Finite-Range Interactions on ℤ.
Appendix D: Existence of the Specific Gibbs Free Energy: D.1. Existence Along Hypercubes; D.2. An Extension.
List of Frequently Used Symbols. References. Author Index.
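The coin-tossing chapters connect this book directly to the binary entropy function: for fair coin tossing, the level-1 rate function is I(x) = ln 2 - H_e(x), with H_e the binary entropy in nats, so P(S_n/n ≈ x) decays like e^{-n I(x)}. A small numerical check of that decay rate (our illustration in standard-library Python, not code from the book):

```python
import math
from math import comb

def rate_fair_coin(x: float) -> float:
    """Level-1 rate function I(x) = x*ln(2x) + (1-x)*ln(2(1-x))
    for i.i.d. fair coin tosses, i.e. I(x) = ln 2 - H_e(x)."""
    def xlogx(t: float) -> float:
        return 0.0 if t == 0.0 else t * math.log(t)
    return math.log(2) + xlogx(x) + xlogx(1.0 - x)

# Compare -(1/n) * log P(S_n = x*n) with I(x) for x = 0.7:
n, x = 500, 0.7
k = int(n * x)
log_p = math.log(comb(n, k)) - n * math.log(2)  # exact binomial point mass
print(-log_p / n, rate_fair_coin(x))
# roughly 0.0888 vs 0.0823: equal up to an O(log(n)/n) Stirling correction
```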
1,626 citations
TL;DR: In this article, the authors use an exact local expansion of the entropy function to prove almost sure consistency and central limit theorems for three of the most commonly used discretized information estimators.
Abstract: We present some new results on the nonparametric estimation of entropy and mutual information. First, we use an exact local expansion of the entropy function to prove almost sure consistency and central limit theorems for three of the most commonly used discretized information estimators. The setup is related to Grenander's method of sieves and places no assumptions on the underlying probability measure generating the data. Second, we prove a converse to these consistency theorems, demonstrating that a misapplication of the most common estimation techniques leads to an arbitrarily poor estimate of the true information, even given unlimited data. This "inconsistency" theorem leads to an analytical approximation of the bias, valid in surprisingly small sample regimes and more accurate than the usual 1/N formula of Miller and Madow over a large region of parameter space. The two most practical implications of these results are negative: (1) information estimates in a certain data regime are likely contaminated by bias, even if "bias-corrected" estimators are used, and (2) confidence intervals calculated by standard techniques drastically underestimate the error of the most common estimation methods. Finally, we note a very useful connection between the bias of entropy estimators and a certain polynomial approximation problem. By casting bias calculation problems in this approximation theory framework, we obtain the best possible generalization of known asymptotic bias results. More interestingly, this framework leads to an estimator with some nice properties: the estimator comes equipped with rigorous bounds on the maximum error over all possible underlying probability distributions, and this maximum error turns out to be surprisingly small. We demonstrate the application of this new estimator on both real and simulated data.
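To make the objects concrete: the discretized estimators discussed here start from the plug-in (maximum-likelihood) estimate built on empirical bin frequencies, and the Miller-Madow correction adds (m - 1)/(2N) for m occupied bins, which is the 1/N-order bias term mentioned above. A hedged sketch (plain Python; the function names are ours, not the paper's):

```python
import math
from collections import Counter

def plugin_entropy(samples) -> float:
    """Plug-in (MLE) entropy estimate, in nats, from discrete samples."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def miller_madow_entropy(samples) -> float:
    """Plug-in estimate plus the Miller-Madow bias correction (m-1)/(2N),
    where m is the number of occupied bins and N the sample size."""
    m = len(set(samples))
    return plugin_entropy(samples) + (m - 1) / (2 * len(samples))

# The plug-in estimator is biased downward; the correction reduces, but
# does not eliminate, that bias (the point of the inconsistency results).
data = [0, 1, 1, 2, 0, 0, 1, 3, 2, 1]
print(plugin_entropy(data), miller_madow_entropy(data))
```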
1,451 citations
TL;DR: By minimizing the cross entropy between the image and its segmented version, this method provides an unbiased estimate, in an information-theoretic sense, of a binarized version of the image.
Abstract: The threshold selection problem is solved by minimizing the cross entropy between the image and its segmented version. The cross entropy is formulated on a pixel-by-pixel basis between the two images, and a computationally attractive algorithm employing the histogram is developed. Without making a priori assumptions about the population distribution, this method provides an unbiased estimate of a binarized version of the image in an information-theoretic sense.
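A compact way to see the criterion: for each candidate threshold t, split the histogram into below-t and at-or-above-t parts with mean gray levels mu_low(t) and mu_high(t); dropping a t-independent term, minimizing the cross entropy is equivalent to maximizing A_low*log(mu_low) + A_high*log(mu_high), where A is the total gray-level mass on each side. The sketch below is our exhaustive-search reading of that criterion, not the authors' algorithm (the split convention and names are ours):

```python
import math

def min_cross_entropy_threshold(hist) -> int:
    """Minimum cross-entropy threshold for a gray-level histogram.

    hist[g] is the count of pixels with gray level g (g >= 1, as in the
    paper's formulation, so g*log(g) is well defined). Minimizing the
    cross entropy is equivalent to maximizing
        A_low*log(mu_low) + A_high*log(mu_high),
    with A the gray-level mass and mu the mean gray level of each side.
    """
    levels = range(len(hist))
    best_t, best_score = None, -math.inf
    for t in range(1, len(hist)):
        n_low = sum(hist[g] for g in levels if g < t)
        n_high = sum(hist[g] for g in levels if g >= t)
        a_low = sum(g * hist[g] for g in levels if g < t)
        a_high = sum(g * hist[g] for g in levels if g >= t)
        if n_low == 0 or n_high == 0 or a_low == 0:
            continue  # one side empty: cross entropy undefined
        score = (a_low * math.log(a_low / n_low)
                 + a_high * math.log(a_high / n_high))
        if score > best_score:
            best_t, best_score = t, score
    return best_t

# Toy bimodal histogram over gray levels 0..7 (modes at 1-2 and 6-7):
hist = [0, 40, 30, 5, 2, 5, 30, 40]
print(min_cross_entropy_threshold(hist))  # -> 4, the valley between the modes
```

For production use, scikit-image's threshold_li implements an iterative version of this same minimum cross-entropy criterion rather than an exhaustive search.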
684 citations
TL;DR: In these lecture notes, the author describes recent progress in understanding the attractor mechanism and the entropy of extremal black holes via the entropy function formalism, and compares the statistical entropy of a class of quarter-BPS dyons in N = 4 string theories, expanded in inverse powers of the electric and magnetic charges, with a similar expansion of the corresponding black hole entropy.
Abstract: In these lecture notes we describe recent progress in our understanding of the attractor mechanism and the entropy of extremal black holes based on the entropy function formalism. We also describe a precise computation of the microscopic degeneracy of a class of quarter BPS dyons in \({\mathcal{N}=4}\) supersymmetric string theories, and compare the statistical entropy of these dyons, expanded in inverse powers of the electric and magnetic charges, with a similar expansion of the corresponding black hole entropy. This comparison is extended to include the contribution to the entropy from multi-centred black holes as well.
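For orientation, the entropy function of this formalism is defined on the near-horizon AdS_2 × S^{D-2} geometry of an extremal black hole. Schematically (our paraphrase of the standard definition, not a formula quoted from these notes):

```latex
% f: the Lagrangian density integrated over the horizon sphere, evaluated
% on the near-horizon configuration with electric fields e_i, electric
% charges q_i, and remaining near-horizon parameters collectively called u.
f(u, e) \;=\; \int_{S^{D-2}} \sqrt{-\det g}\;\mathcal{L}\,,
\qquad
\mathcal{E}(u, e; q) \;=\; 2\pi \bigl( e_i q_i - f(u, e) \bigr).
% Extremizing E over u and e gives the attractor equations; the Wald
% entropy of the black hole is the value of E at that extremum:
S_{\mathrm{BH}}(q) \;=\; \mathcal{E}\big|_{\partial\mathcal{E}/\partial u \,=\, \partial\mathcal{E}/\partial e \,=\, 0}\,.
```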
545 citations