Open Access Proceedings Article

A Kernelized Stein Discrepancy for Goodness-of-fit Tests

TLDR
A new discrepancy statistic for measuring differences between two probability distributions is derived by combining Stein's identity with reproducing kernel Hilbert space theory, yielding a new class of powerful goodness-of-fit tests that are widely applicable to complex and high-dimensional distributions.
Abstract
We derive a new discrepancy statistic for measuring differences between two probability distributions, based on combining Stein's identity with reproducing kernel Hilbert space theory. We apply our result to test how well a probabilistic model fits a set of observations, and derive a new class of powerful goodness-of-fit tests that are widely applicable to complex and high-dimensional distributions, even those with computationally intractable normalization constants. Both the theoretical and empirical properties of our methods are studied thoroughly.
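The discrepancy described in the abstract only requires the model's score function grad log p(x), which is why the normalization constant never enters. Below is a minimal sketch of an unbiased U-statistic estimator of a kernelized Stein discrepancy, not the authors' reference implementation; the RBF kernel, its bandwidth, and the standard-normal model used in the example are illustrative assumptions:

```python
import numpy as np

def ksd_u_stat(X, score_p, h=1.0):
    """Unbiased U-statistic estimate of a kernelized Stein discrepancy
    between a sample X (n x d, drawn from q) and a density p known only
    through its score function score_p(x) = grad log p(x).
    Uses an RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2))."""
    n, d = X.shape
    S = score_p(X)                               # n x d matrix of scores
    diff = X[:, None, :] - X[None, :, :]         # pairwise x_i - x_j
    r2 = np.sum(diff ** 2, axis=-1)              # squared distances
    K = np.exp(-r2 / (2 * h ** 2))               # kernel Gram matrix

    # Stein kernel u_p(x, y) = s(x)^T s(y) k + s(x)^T grad_y k
    #                          + grad_x k^T s(y) + tr(grad_x grad_y k)
    term1 = (S @ S.T) * K
    term2 = np.einsum('id,ijd->ij', S, diff) * (K / h ** 2)   # grad_y k = (x-y)/h^2 k
    term3 = -np.einsum('jd,ijd->ij', S, diff) * (K / h ** 2)  # grad_x k = -(x-y)/h^2 k
    term4 = (d / h ** 2 - r2 / h ** 4) * K

    U = term1 + term2 + term3 + term4
    np.fill_diagonal(U, 0.0)                     # drop i == j terms (unbiased)
    return U.sum() / (n * (n - 1))
```

For example, with p the standard normal (score s(x) = -x), the estimate is near zero for samples actually drawn from p and grows for mismatched samples, which is what the goodness-of-fit test thresholds.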



Citations
Posted Content

Generative Modeling by Estimating Gradients of the Data Distribution

TL;DR: A new generative model in which samples are produced via Langevin dynamics using gradients of the data distribution estimated with score matching; it allows flexible model architectures, requires neither sampling during training nor adversarial methods, and provides a learning objective that can be used for principled model comparisons.
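The sampling step this summary describes can be illustrated with unadjusted Langevin dynamics, which needs only the score grad log p(x). This is a toy sketch, not the paper's method: the Gaussian target, its closed-form score, the step size, and the chain length are all illustrative assumptions (the paper estimates the score with a neural network and anneals the noise):

```python
import numpy as np

def langevin_sample(score, x0, eps=0.1, n_steps=1000, rng=None):
    """Unadjusted Langevin dynamics: x <- x + (eps/2) * score(x) + sqrt(eps) * z,
    z ~ N(0, I). Given grad log p(x), this draws approximate samples from p
    without ever evaluating p's normalizing constant."""
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x + 0.5 * eps * score(x) + np.sqrt(eps) * rng.standard_normal(x.shape)
    return x

# Toy target: N(mu, I) in 2-D, whose score is simply mu - x.
mu = np.array([3.0, -1.0])
rng = np.random.default_rng(1)
samples = np.stack([langevin_sample(lambda x: mu - x, np.zeros(2), rng=rng)
                    for _ in range(300)])
```

After burn-in the chains concentrate around the target mean mu, even though the sampler was started far away and never saw a normalized density.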
Journal ArticleDOI

Bayesian Deep Convolutional Encoder-Decoder Networks for Surrogate Modeling and Uncertainty Quantification

TL;DR: This approach achieves state-of-the-art predictive accuracy and uncertainty quantification compared with other Bayesian neural network approaches, as well as with Gaussian processes and ensemble methods, even when the training data size is relatively small.
Journal ArticleDOI

Advances in Variational Inference

TL;DR: Variational inference (VI) as mentioned in this paper approximates a high-dimensional Bayesian posterior with a simpler variational distribution by solving an optimization problem, which has been successfully applied to various models and large-scale applications.
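The optimization problem this summary refers to is maximization of the evidence lower bound (ELBO). A minimal sketch on a toy conjugate model, where the ELBO is available in closed form; the model z ~ N(0,1), x | z ~ N(z,1), the Gaussian variational family, and the step sizes are illustrative assumptions, not content from the review:

```python
import numpy as np

def elbo(m, s, x):
    """Closed-form ELBO for z ~ N(0,1), x | z ~ N(z,1), with variational
    family q(z) = N(m, s^2): E_q[log p(x, z)] + entropy of q."""
    expected_log_joint = (-np.log(2 * np.pi)
                          - 0.5 * (m ** 2 + s ** 2)
                          - 0.5 * ((x - m) ** 2 + s ** 2))
    entropy = 0.5 * np.log(2 * np.pi * np.e * s ** 2)
    return expected_log_joint + entropy

# Maximize the ELBO over (m, log s) by plain gradient ascent.
x_obs = 1.0
m, log_s = 0.0, 0.0
for _ in range(2000):
    s = np.exp(log_s)
    grad_m = -m + (x_obs - m)          # d ELBO / d m
    grad_log_s = 1.0 - 2.0 * s ** 2    # d ELBO / d log s
    m += 0.01 * grad_m
    log_s += 0.01 * grad_log_s
```

Because the model is conjugate, the optimum can be checked against the exact posterior N(x/2, 1/2): the iterates converge to m = x_obs / 2 and s^2 = 1/2, i.e. VI recovers the true posterior when the variational family contains it.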
Proceedings Article

Adversarial feature matching for text generation

TL;DR: The authors proposed a framework for generating realistic text via adversarial training, which employs a long short-term memory network as the generator and a convolutional network as the discriminator.
Journal ArticleDOI

Kernel Mean Embedding of Distributions: A Review and Beyond

TL;DR: A comprehensive review of existing work and recent advances in Hilbert space embeddings of distributions, together with a discussion of the most challenging issues and open problems that could lead to new research directions.
References
Journal ArticleDOI

Reducing the Dimensionality of Data with Neural Networks

TL;DR: This article describes an effective way of initializing the weights that allows deep autoencoder networks to learn low-dimensional codes that work much better than principal components analysis as a tool for reducing the dimensionality of data.
Book

Probabilistic graphical models : principles and techniques

TL;DR: The framework of probabilistic graphical models, presented in this book, provides a general approach for causal reasoning and decision making under uncertainty, allowing interpretable models to be constructed and then manipulated by reasoning algorithms.
Book

Testing statistical hypotheses

TL;DR: Topics covered include the general decision problem, the probability background, uniformly most powerful tests, unbiasedness (theory and first applications, and applications to normal distributions), invariance, and linear hypotheses.
Journal ArticleDOI

A kernel two-sample test

TL;DR: This work proposes a framework for analyzing and comparing distributions, which is used to construct statistical tests that determine whether two samples are drawn from different distributions, and presents two distribution-free tests based on large deviation bounds for the maximum mean discrepancy (MMD).
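The MMD statistic mentioned in this summary has a standard unbiased estimator that drops the diagonal terms of the within-sample Gram matrices. A short sketch, not the authors' code; the RBF kernel and its bandwidth are illustrative assumptions:

```python
import numpy as np

def mmd2_unbiased(X, Y, h=1.0):
    """Unbiased estimate of the squared maximum mean discrepancy between
    samples X (m x d) and Y (n x d), with an RBF kernel of bandwidth h."""
    def gram(A, B):
        r2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
        return np.exp(-r2 / (2 * h ** 2))

    m, n = len(X), len(Y)
    Kxx, Kyy, Kxy = gram(X, X), gram(Y, Y), gram(X, Y)
    np.fill_diagonal(Kxx, 0.0)   # exclude i == j for unbiasedness
    np.fill_diagonal(Kyy, 0.0)
    return (Kxx.sum() / (m * (m - 1))
            + Kyy.sum() / (n * (n - 1))
            - 2.0 * Kxy.mean())
```

In a two-sample test, this statistic hovers around zero (it can be slightly negative, being unbiased) when X and Y come from the same distribution, and is clearly positive when they differ.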
Book ChapterDOI

A Class of Statistics with Asymptotically Normal Distribution

TL;DR: In this article, the authors study U-statistics, unbiased estimates of regular functionals of a population distribution formed by averaging a symmetric kernel function over all permutations (α1, ..., αm) of m different integers with 1 ≤ αi ≤ n, and show that such statistics are asymptotically normally distributed.
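A U-statistic of this form is easy to illustrate by brute-force enumeration; the variance kernel f(x, y) = (x - y)^2 / 2 used below is a standard textbook example, not a detail from Hoeffding's paper:

```python
import itertools
import numpy as np

def u_statistic(sample, f, m):
    """Hoeffding's U-statistic: the average of a kernel f of m arguments
    over all ordered m-tuples of distinct observations from the sample."""
    n = len(sample)
    vals = [f(*(sample[i] for i in idx))
            for idx in itertools.permutations(range(n), m)]
    return float(np.mean(vals))

# With kernel f(x, y) = (x - y)^2 / 2, the U-statistic reproduces the
# unbiased sample variance (the ddof=1 estimator) exactly.
rng = np.random.default_rng(0)
x = rng.standard_normal(30)
u = u_statistic(x, lambda a, b: 0.5 * (a - b) ** 2, m=2)
```

Enumerating all permutations costs O(n^m), so this sketch is only practical for small samples; in practice one uses the vectorized pairwise forms shown for MMD and the Stein discrepancy above.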