Proceedings ArticleDOI
Kullback-Leibler divergence estimation of continuous distributions
Fernando Perez-Cruz
pp. 1666–1670
TL;DR: A method for estimating the KL divergence between continuous densities is presented and proved to converge almost surely; the divergence can be estimated either from the empirical cdf or from k-nearest-neighbour density estimation, which does not converge to the true measure for finite k.
Abstract
We present a method for estimating the KL divergence between continuous densities and prove that it converges almost surely. Divergence estimation is typically solved by estimating the densities first. Our main result shows this intermediate step is unnecessary: the divergence can be estimated either from the empirical cdf or from k-nearest-neighbour density estimation, which does not converge to the true measure for finite k. The convergence proof is based on describing the statistics of our estimator using waiting-time distributions, such as the exponential or Erlang. We illustrate the proposed estimators, show how they compare to existing methods based on density estimation, and outline how our divergence estimators can be used to solve the two-sample problem.
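The k-nearest-neighbour route described in the abstract can be sketched in a few lines of NumPy: compare, for each sample from p, the distance to its k-th nearest neighbour among the p-samples with the distance to its k-th nearest neighbour among the q-samples. The constants below follow the standard k-NN divergence formula (d/n) Σᵢ log(νₖ(xᵢ)/ρₖ(xᵢ)) + log(m/(n−1)); treat the exact normalisation as an assumption rather than a transcript of the paper's equations, and the function name `knn_kl_divergence` is ours.

```python
import numpy as np

def knn_kl_divergence(x, y, k=1):
    """Sketch of a k-NN estimator of D(p||q) from samples x ~ p, y ~ q.

    Assumes samples are distinct (duplicate points give zero distances
    and hence -inf logs). Brute-force distances: O(n^2) memory/time.
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n, d = x.shape
    m = y.shape[0]

    # rho_k: distance from each x_i to its k-th nearest neighbour in x \ {x_i}
    dxx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(dxx, np.inf)          # exclude the point itself
    rho = np.sort(dxx, axis=1)[:, k - 1]

    # nu_k: distance from each x_i to its k-th nearest neighbour in y
    dxy = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
    nu = np.sort(dxy, axis=1)[:, k - 1]

    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))
```

For x ~ N(0, 1) and y ~ N(1, 1) the true divergence is 0.5, which a few thousand samples recover to within a few hundredths; larger k trades variance for bias.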
Citations
Journal ArticleDOI
Dependable Structural Health Monitoring Using Wireless Sensor Networks
TL;DR: This work designs a dependable distributed WSN framework for SHM (called DependSHM), examines its ability to cope with sensor faults and constraints, and presents a distributed automated algorithm to detect such faults.
ReportDOI
Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual
Brian M. Adams,Mohamed S. Ebeida,Michael S. Eldred,John D. Jakeman,Laura Painton Swiler,John Adam Stephens,Dena M. Vigil,Tim Wildey,William J. Bohnhoff,John Eddy,Kenneth T. Hu,Keith R. Dalbey,Lara E Bauman,Patricia Diane Hough +13 more
TL;DR: This manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota’s iterative analysis capabilities.
Journal ArticleDOI
The Exceptional 2018 European Water Seesaw Calls for Action on Adaptation
Andrea Toreti,Alan Belward,Ignacio Perez-Dominguez,Gustavo Naumann,Jürg Luterbacher,Ottmar Cronie,Lorenzo Seguini,Giacinto Manfron,Raúl López-Lozano,Bettina Baruth,Maurits van den Berg,Frank Dentener,Andrej Ceglar,Thomas Chatzopoulos,Matteo Zampieri +14 more
TL;DR: In 2018, Europe experienced concurrent anomalies of b... in the spring/summer growing season, and the most important factors responsible for agricultural productivity variations were temperature and precipitation.
Journal ArticleDOI
Guiding New Physics Searches with Unsupervised Learning
Andrea De Simone,Thomas Jacques +1 more
TL;DR: A statistical test based on a test statistic that measures deviations between two samples, using a nearest-neighbours approach to estimate the local density ratio; the method is model-independent and non-parametric.
Journal ArticleDOI
Unsupervised 3D shape segmentation and co-segmentation via deep learning
TL;DR: A novel unsupervised deep-learning algorithm for automatically segmenting a single 3D shape or co-segmenting a family of 3D shapes, achieving better or comparable performance relative to state-of-the-art methods.
References
Statistical learning theory
TL;DR: Presents a method for determining the necessary and sufficient conditions for consistency of the learning process; the author covers function estimation from small data pools, the application of these estimates to real-life problems, and much more.
Book
Information Theory, Inference and Learning Algorithms
TL;DR: A fun and exciting textbook on the mathematics underpinning the most dynamic areas of modern science and engineering.
Journal ArticleDOI
Estimating mutual information.
TL;DR: Presents two classes of improved estimators for the mutual information M(X,Y) from samples of random points distributed according to some joint probability density μ(x,y), based on entropy estimates from k-nearest-neighbour distances.
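The k-NN entropy-based MI estimators referred to here follow the Kraskov–Stögbauer–Grassberger (KSG) construction: for each point, find the max-norm distance to its k-th nearest neighbour in the joint space, count the neighbours of each marginal falling strictly inside that radius, and combine the counts through digamma terms. A minimal NumPy sketch of the first KSG estimator (the function names and the digamma approximation are ours, not from the paper):

```python
import math
import numpy as np

def _digamma(x):
    # Asymptotic series plus upward recurrence; accurate for x >= 1.
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    return r + math.log(x) - 1.0 / (2 * x) - 1.0 / (12 * x**2) + 1.0 / (120 * x**4)

def ksg_mutual_information(x, y, k=3):
    """KSG (type-1) estimator: I = psi(k) + psi(N) - <psi(nx+1) + psi(ny+1)>."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)

    # Chebyshev (max-norm) distances in each marginal and in the joint space.
    dx = np.abs(x[:, None, :] - x[None, :, :]).max(axis=-1)
    dy = np.abs(y[:, None, :] - y[None, :, :]).max(axis=-1)
    dz = np.maximum(dx, dy)
    np.fill_diagonal(dz, np.inf)

    # eps_i: distance to the k-th nearest neighbour in the joint space.
    eps = np.sort(dz, axis=1)[:, k - 1]

    # Marginal neighbours strictly inside the eps_i ball (self excluded via -1).
    nx = (dx < eps[:, None]).sum(axis=1) - 1
    ny = (dy < eps[:, None]).sum(axis=1) - 1

    return (_digamma(k) + _digamma(n)
            - np.mean([_digamma(a + 1) + _digamma(b + 1) for a, b in zip(nx, ny)]))
```

For jointly Gaussian X, Y with correlation ρ the true value is −½ log(1 − ρ²), which a couple of thousand samples recover closely; k again trades variance for bias.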