Journal ArticleDOI
A type theory for probability density functions
Sooraj Bhat, Ashish Agarwal, Richard Vuduc, Alexander G. Gray
Vol. 47, Iss. 1, pp. 545–556
TLDR
This work formalizes the first probabilistic language that simultaneously provides continuous probability distributions, the ability to naturally express custom probabilistic models, and probability density functions (PDFs); it serves as a foundational framework for extending these ideas to more general languages.
Abstract
There has been great interest in creating probabilistic programming languages to simplify the coding of statistical tasks; however, there still does not exist a formal language that simultaneously provides (1) continuous probability distributions, (2) the ability to naturally express custom probabilistic models, and (3) probability density functions (PDFs). This collection of features is necessary for mechanizing fundamental statistical techniques. We formalize the first probabilistic language that exhibits these features, and it serves as a foundational framework for extending the ideas to more general languages. Particularly novel are our type system for absolutely continuous (AC) distributions (those which permit PDFs) and our PDF calculation procedure, which calculates PDFs for a large class of AC distributions. Our formalization paves the way toward the rigorous encoding of powerful statistical reformulations.
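The PDF calculation the abstract alludes to relies on compositional rules from measure theory, such as the change-of-variables formula. As an illustrative sketch only (not the paper's actual procedure, and with hypothetical function names), the following Python computes the density of an affine transform of a standard normal distribution:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2) at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def affine_transform_pdf(pdf_x, a, b):
    """PDF of Y = a*X + b via the change-of-variables rule:
    pdf_Y(y) = pdf_X((y - b) / a) / |a|, for a != 0."""
    def pdf_y(y):
        return pdf_x((y - b) / a) / abs(a)
    return pdf_y

# Y = 2*X + 1 with X ~ N(0, 1) is distributed as N(1, 4),
# so the transformed density matches the direct one.
pdf_y = affine_transform_pdf(normal_pdf, a=2.0, b=1.0)
print(abs(pdf_y(0.5) - normal_pdf(0.5, mu=1.0, sigma=2.0)) < 1e-12)
```

Automating rules like this one (and its analogues for sums, products, and compositions) for exactly the distributions that admit densities is what the paper's type system for absolute continuity is designed to make safe.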
Citations
Book ChapterDOI
PSI: Exact Symbolic Inference for Probabilistic Programs
TL;DR: This paper presents PSI, a symbolic inference system that computes exact posterior distributions for probabilistic programs, automating the labor-intensive and error-prone manual process of exact inference.
Proceedings ArticleDOI
Uncertain<T>: a first-order type for uncertain data
TL;DR: This work gives a Bayesian-network semantics for computations and conditionals over uncertain data; the Uncertain<T> type and its operators encourage developers to expose and reason about uncertainty explicitly, controlling false positives and false negatives.
Book ChapterDOI
Measure transformer semantics for Bayesian machine learning
TL;DR: This work proposes a core functional calculus with primitives for sampling prior distributions and observing variables, and defines combinators for measure transformers, based on theorems in measure theory, to give a rigorous semantics to the core calculus.
Journal ArticleDOI
Measure Transformer Semantics for Bayesian Machine Learning
TL;DR: The Bayesian approach to machine learning amounts to computing posterior distributions of random variables from a probabilistic model of how the variables are related (that is, a prior distribution).
Proceedings ArticleDOI
Exact Bayesian inference by symbolic disintegration
TL;DR: This work presents the first method of computing a disintegration from a probabilistic program and an expression of a quantity to be observed, even when the observation has probability zero, and composes with other inference methods in a modular way-without sacrificing accuracy or performance.
References
Book
Pattern Recognition and Machine Learning
TL;DR: Probability Distributions, Linear Models for Regression, Linear Models for Classification, Neural Networks, Graphical Models, Mixture Models and EM, Sampling Methods, Continuous Latent Variables, and Sequential Data are studied.
BookDOI
Density estimation for statistics and data analysis
TL;DR: Covers the kernel method for multivariate data, three important methods, and density estimation in action.
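The kernel method referenced above can be sketched in a few lines. This is a generic illustration of Gaussian kernel density estimation, not code from the book:

```python
import math

def gaussian_kernel(u):
    """Standard normal kernel."""
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def kde(samples, h):
    """Kernel density estimate with bandwidth h:
    f_hat(x) = (1 / (n*h)) * sum_i K((x - x_i) / h)."""
    n = len(samples)
    def f_hat(x):
        return sum(gaussian_kernel((x - xi) / h) for xi in samples) / (n * h)
    return f_hat

# The estimate is largest near the data and decays away from it.
f = kde([0.0, 0.5, 1.0], h=0.5)
print(f(0.5) > f(5.0))
```

The bandwidth h controls the bias-variance trade-off: small h produces a spiky estimate, large h an over-smoothed one.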