Open Access Journal Article

Root-n consistent estimators of entropy for densities with unbounded support

Alexandre B. Tsybakov, Edward C. van der Meulen
01 Jan 1996 - Vol. 23, Iss. 1, pp. 75-83
TL;DR
In this paper, the authors consider a truncated version of the entropy estimator and prove the mean-square √n-consistency of this estimator for a class of densities with unbounded support, including the Gaussian density.
Abstract
We consider a truncated version of the entropy estimator and prove the mean-square √n-consistency of this estimator for a class of densities with unbounded support, including the Gaussian density.
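The abstract does not spell out the estimator's exact form, so the sketch below is only a hedged illustration: a Kozachenko-Leonenko nearest-neighbor entropy estimate in which very small distances are truncated, which is the general flavor of construction studied here. The function name, the truncation level eps, and the use of scipy are assumptions of this sketch, not the authors' exact scheme.

import numpy as np
from scipy.special import gamma as gamma_fn
from scipy.spatial import cKDTree

EULER_GAMMA = 0.5772156649015329

def kl_entropy_truncated(x, eps=1e-10):
    """Kozachenko-Leonenko nearest-neighbor entropy estimate with a
    crude truncation of very small distances (illustrative only, not
    the exact truncation scheme analyzed in the paper)."""
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    # k=2 because the nearest neighbor returned for each point is itself.
    dist, _ = cKDTree(x).query(x, k=2)
    rho = np.maximum(dist[:, 1], eps)  # truncation: guard against log(0)
    # Log-volume of the d-dimensional unit ball.
    log_vd = (d / 2) * np.log(np.pi) - np.log(gamma_fn(d / 2 + 1))
    return d * np.mean(np.log(rho)) + np.log(n - 1) + EULER_GAMMA + log_vd

# Sanity check: the standard Gaussian has entropy 0.5*log(2*pi*e) ~ 1.419.
rng = np.random.default_rng(0)
print(kl_entropy_truncated(rng.standard_normal(5000)))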



Citations
Journal Article

Estimating mutual information.

TL;DR: Two classes of improved estimators for the mutual information M(X,Y) of random points distributed according to some joint probability density μ(x,y) are presented, based on entropy estimates from k-nearest-neighbor distances.
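As a hedged illustration of the quoted idea, the sketch below implements the first of the two KSG estimators: psi(k) + psi(n) minus the average of psi(nx+1) + psi(ny+1), where the marginal neighbor counts are taken within each point's joint-space k-NN radius. The max-norm, the small strictness offset, and k=4 are choices of this sketch.

import numpy as np
from scipy.special import digamma
from scipy.spatial import cKDTree

def ksg_mutual_information(x, y, k=4):
    """KSG estimator (Kraskov et al., algorithm 1) of I(X;Y):
    psi(k) + psi(n) - <psi(nx+1) + psi(ny+1)>, with neighbor counts
    taken within the max-norm joint k-NN radius of each point."""
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])
    # Distance to the k-th neighbor in the joint space (self excluded).
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
    # Strictly-closer neighbor counts in each marginal space.
    nx = cKDTree(x).query_ball_point(x, eps - 1e-12, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, eps - 1e-12, p=np.inf, return_length=True) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

# Sanity check: for bivariate Gaussians, I = -0.5*log(1 - rho^2) ~ 0.511.
rng = np.random.default_rng(1)
x = rng.standard_normal(3000)
y = 0.8 * x + 0.6 * rng.standard_normal(3000)
print(ksg_mutual_information(x, y))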
Journal Article

Causality detection based on information-theoretic approaches in time series analysis

TL;DR: This paper provides a detailed overview of information-theoretic approaches for measuring causal influence in multivariate time series, with a focus on diverse approaches to entropy and mutual information estimation.

Nonparametric entropy estimation: An overview

TL;DR: This overview assumes that H(f) is well defined and finite; the concept of differential entropy was introduced in Shannon's original paper ([55]).
Book Chapter

Information Theoretic Learning

TL;DR: The assumption of Gaussianity for the measurement error, combined with the maximum likelihood principle, motivates the least-squares criterion for nonlinear regression problems; by viewing classification as a regression problem that estimates class posterior probabilities, least squares has also been employed to train neural networks and other classifier topologies to approximate correct labels.
Journal Article

Divergence Estimation for Multidimensional Densities Via k-Nearest-Neighbor Distances

TL;DR: A new universal estimator of divergence is proved to be asymptotically unbiased and mean-square consistent, and it is shown that the speed of convergence of the k-NN method can be further improved by an adaptive choice of k.
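A hedged sketch of the fixed-k form of such a k-NN divergence estimator follows; the adaptive choice of k credited to the paper is not reproduced here, and the helper name is illustrative.

import numpy as np
from scipy.spatial import cKDTree

def knn_divergence(x, y, k=1):
    """Fixed-k nearest-neighbor estimate of D(P||Q) from x ~ P, y ~ Q:
    (d/n) * sum(log(nu_k / rho_k)) + log(m / (n - 1)), where rho_k is
    the k-NN radius within the P-sample and nu_k the k-NN radius from
    each P-point to the Q-sample."""
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n, d = x.shape
    m = len(y)
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]  # k+1: self is included
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

# Sanity check: D(N(0,1) || N(1,1)) = 0.5 nats.
rng = np.random.default_rng(2)
print(knn_divergence(rng.standard_normal(5000), rng.standard_normal(5000) + 1.0))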
References
Journal Article

Entropy-Based Tests of Uniformity

TL;DR: The power properties of an entropy-based test of uniformity on [0, 1] are investigated and compared with those of other tests of uniformity.
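As a hedged illustration of an entropy-based uniformity test of this kind, the sketch below uses the classical Vasicek m-spacing entropy estimate with a Monte Carlo critical value; the sample size, m, and the 5% level are arbitrary choices of this sketch.

import numpy as np

def vasicek_entropy(x, m):
    """Vasicek m-spacing entropy estimate for a sample on [0, 1]:
    mean of log(n * (X(i+m) - X(i-m)) / (2m)) over the order
    statistics, with out-of-range indices clipped to the sample."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    padded = np.concatenate([np.full(m, x[0]), x, np.full(m, x[-1])])
    spacings = padded[2 * m:] - padded[:-2 * m]  # X(i+m) - X(i-m)
    return np.mean(np.log(n * spacings / (2 * m)))

# The uniform density maximizes entropy on [0, 1], so uniformity is
# rejected when the estimate falls below a Monte Carlo critical value.
rng = np.random.default_rng(3)
n, m = 200, 5
null_stats = [vasicek_entropy(rng.uniform(size=n), m) for _ in range(2000)]
crit = np.quantile(null_stats, 0.05)  # 5% critical value under the null
print(vasicek_entropy(rng.uniform(size=n), m) > crit)     # usually True
print(vasicek_entropy(rng.beta(2, 2, size=n), m) > crit)  # often False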
Journal Article

Sums of Functions of Nearest Neighbor Distances, Moment Bounds, Limit Theorems and a Goodness of Fit Test

TL;DR: The limiting behavior of sums of functions of nearest-neighbor distances is studied for an m-dimensional sample; a central limit theorem and moment bounds for such sums are established, together with an invariance principle for the empirical process of nearest-neighbor distances.
Journal Article

Density-free convergence properties of various estimators of entropy

TL;DR: Based on a random sample X1, …, Xn generated from f, this paper proposes two new nonparametric estimators for H(f) that are histogram-based in the sense that they involve a histogram-based density estimator fn.
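As a hedged illustration of what "histogram-based" means here, the sketch below computes a plug-in entropy estimate from a histogram density estimate; the binning rule is an arbitrary choice of this sketch, and the paper's two estimators differ in details not reproduced here.

import numpy as np

def histogram_entropy(x, bins=30):
    """Plug-in entropy estimate built from a histogram density estimate:
    H_hat = -sum_j p_j * log(p_j / w_j), with p_j = n_j / n the bin
    frequencies and w_j the bin widths."""
    counts, edges = np.histogram(np.asarray(x, float), bins=bins)
    widths = np.diff(edges)
    p = counts / counts.sum()
    nz = p > 0  # empty bins contribute nothing to the plug-in sum
    return -np.sum(p[nz] * np.log(p[nz] / widths[nz]))

# Sanity check: the standard Gaussian has entropy 0.5*log(2*pi*e) ~ 1.419.
rng = np.random.default_rng(4)
print(histogram_entropy(rng.standard_normal(10000)))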
Journal Article

On the logarithms of high-order spacings

Noel A. Cressie - 01 Jan 1976
TL;DR: The sum of the logarithms of the mth-order gaps is used as a test statistic for uniformity, and it is shown that its Pitman asymptotic relative efficiency, relative to the most powerful test symmetric in the first-order gaps, increases approximately linearly in m, even when m grows at a moderate rate with the sample size.
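A hedged sketch of a statistic of this type, the sum of logarithms of overlapping mth-order gaps for a sample on [0, 1], follows; the normalization is illustrative rather than necessarily Cressie's exact scaling.

import numpy as np

def log_m_gap_statistic(u, m):
    """Sum of logarithms of overlapping m-th order gaps of a sample on
    [0, 1], augmented with the endpoints 0 and 1. Smaller values
    indicate more uneven gaps, i.e. departure from uniformity."""
    u = np.sort(np.asarray(u, float))
    n = len(u)
    v = np.concatenate([[0.0], u, [1.0]])
    gaps = v[m:] - v[:-m]  # overlapping m-th order gaps
    return np.sum(np.log((n + 1) * gaps / m))

# Clustered samples produce uneven gaps and drive the statistic down.
rng = np.random.default_rng(5)
print(log_m_gap_statistic(rng.uniform(size=100), m=4))
print(log_m_gap_statistic(rng.beta(5, 5, size=100), m=4))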
Journal Article

On powerful distributional tests based on sample spacings

TL;DR: In this paper, it is shown that if m diverges at a slower rate than √n, then the commonly used sum-function will detect alternatives at distance (mn)^(-1/4) from the uniform.