scispace - formally typeset
Topic

Categorical distribution

About: Categorical distribution is a research topic. Over its lifetime, 782 publications have been published within this topic, receiving 36,673 citations.


Papers
Journal ArticleDOI
TL;DR: In this article, the authors consider a group of individuals who must act together as a team or committee, and assume that each individual in the group has his own subjective probability distribution for the unknown value of some parameter.
Abstract: Consider a group of individuals who must act together as a team or committee, and suppose that each individual in the group has his own subjective probability distribution for the unknown value of some parameter. A model is presented which describes how the group might reach agreement on a common subjective probability distribution for the parameter by pooling their individual opinions. The process leading to the consensus is explicitly described and the common distribution that is reached is explicitly determined. The model can also be applied to problems of reaching a consensus when the opinion of each member of the group is represented simply as a point estimate of the parameter rather than as a probability distribution.

3,527 citations
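The pooling process in the abstract above can be sketched with a DeGroot-style iteration, in which each individual repeatedly revises their distribution toward a weighted average of the group's current distributions. The weight matrix and opinions below are hypothetical illustrations, not values from the paper:

```python
# Sketch of iterative opinion pooling toward a consensus distribution,
# assuming each individual revises via a fixed weighted average of all
# current opinions (a DeGroot-style update). W is a hypothetical
# row-stochastic "trust" matrix; it is not taken from the paper.
import numpy as np

def pool_to_consensus(opinions, weights, iterations=100):
    """Repeatedly replace each row of opinions with a weighted average."""
    p = np.asarray(opinions, dtype=float)
    W = np.asarray(weights, dtype=float)
    for _ in range(iterations):
        p = W @ p
    return p

# Three individuals, each with a subjective distribution over two outcomes.
opinions = np.array([[0.9, 0.1],
                     [0.5, 0.5],
                     [0.2, 0.8]])
# Hypothetical row-stochastic weights (how much each person trusts the others).
weights = np.array([[0.5, 0.3, 0.2],
                    [0.2, 0.6, 0.2],
                    [0.3, 0.3, 0.4]])

consensus = pool_to_consensus(opinions, weights)
# With positive weights, every row converges to the same common distribution.
```

Because every entry of the weight matrix is positive, the iteration converges geometrically and all individuals end up holding the same pooled distribution.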

Proceedings Article
03 Nov 2016
TL;DR: Gumbel-Softmax, as presented in this paper, replaces the non-differentiable sample from a categorical distribution with a differentiable sample from a novel Gumbel-Softmax distribution, which has the essential property that it can be smoothly annealed into a categorical distribution.
Abstract: Categorical variables are a natural choice for representing discrete structure in the world. However, stochastic neural networks rarely use categorical latent variables due to the inability to backpropagate through samples. In this work, we present an efficient gradient estimator that replaces the non-differentiable sample from a categorical distribution with a differentiable sample from a novel Gumbel-Softmax distribution. This distribution has the essential property that it can be smoothly annealed into a categorical distribution. We show that our Gumbel-Softmax estimator outperforms state-of-the-art gradient estimators on structured output prediction and unsupervised generative modeling tasks with categorical latent variables, and enables large speedups on semi-supervised classification.

3,390 citations
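A minimal sketch of the sampling step described above: add Gumbel noise to the log-probabilities, then apply a temperature-controlled softmax. As the temperature anneals toward zero, the sample approaches a one-hot draw from the underlying categorical distribution. The logits, temperatures, and function name are illustrative assumptions, not the paper's reference implementation:

```python
# Gumbel-Softmax sampling sketch: perturb logits with Gumbel noise, then
# take a softmax at a chosen temperature. Low temperatures yield samples
# that are nearly one-hot (approaching a categorical draw).
import numpy as np

def gumbel_softmax_sample(logits, temperature, rng):
    # Gumbel(0, 1) noise via the inverse-CDF trick.
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / temperature
    y = y - y.max()                      # shift for numerical stability
    expy = np.exp(y)
    return expy / expy.sum()

rng = np.random.default_rng(0)
logits = np.log(np.array([0.2, 0.5, 0.3]))   # hypothetical class probabilities
soft = gumbel_softmax_sample(logits, temperature=1.0, rng=rng)    # smooth sample
hard = gumbel_softmax_sample(logits, temperature=0.01, rng=rng)   # near one-hot
```

The key property for training is that the sample is a deterministic, differentiable function of the logits given the noise, so gradients can flow through it.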

ReportDOI
01 May 1991
TL;DR: In this article, a class of priors known as Dirichlet measures is used for the distribution of a random variable X when it takes values in R^k.
Abstract: The parameter in a Bayesian nonparametric problem is the unknown distribution P of the observation X. A Bayesian uses a prior distribution for P, and after observing X, solves the statistical inference problem by using the posterior distribution of P, which is the conditional distribution of P given X. For Bayesian nonparametrics to be successful one needs a large class of priors for which posterior distributions can be easily calculated. Unless X takes values in a finite space, the unknown distribution P varies in an infinite-dimensional space. Thus one has to talk about measures on a complicated space, like the space of all probability measures on a large space. This has always required more careful attention to the attendant measure-theoretic problems. A class of priors known as Dirichlet measures have been used for the distribution of a random variable X when it takes values in R^k.

2,162 citations
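In the finite-support special case touched on in the abstract, the Dirichlet prior is conjugate to the categorical likelihood: the posterior over the unknown distribution P is again Dirichlet, with the prior parameters incremented by the observed category counts. A minimal sketch, with hypothetical alpha values and data:

```python
# Dirichlet-categorical conjugate update: posterior parameters are the
# prior parameters plus the observed counts per category.
import numpy as np

def dirichlet_posterior(alpha, observations, k):
    counts = np.bincount(observations, minlength=k)
    return np.asarray(alpha, dtype=float) + counts

alpha = [1.0, 1.0, 1.0]                 # symmetric prior over 3 categories
observations = [0, 2, 2, 1, 2]          # observed category indices
posterior = dirichlet_posterior(alpha, observations, k=3)
# posterior parameters: [2., 2., 4.]
posterior_mean = posterior / posterior.sum()   # Bayes estimate of P
```

This closed-form update is exactly the kind of easy posterior calculation the abstract identifies as the requirement for a successful nonparametric prior class.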

Journal ArticleDOI
TL;DR: In this article, the conditional distribution of the random measure, given the observations, is no longer that of a simple Dirichlet process, but can be described as being a mixture of Dirichlet processes.
Abstract: This paper extends Ferguson's result to cases where the random measure is a mixing distribution for a parameter which determines the distribution from which observations are made. The conditional distribution of the random measure, given the observations, is no longer that of a simple Dirichlet process, but can be described as being a mixture of Dirichlet processes. This paper gives a formal definition for these mixtures and develops several theorems about their properties, the most important of which is a closure property for such mixtures. Formulas for computing the conditional distribution are derived and applications to problems in bio-assay, discrimination, regression, and mixing distributions are given.

2,146 citations
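The partitions induced by a Dirichlet process, which underlie the mixtures discussed above, are often illustrated with the Chinese restaurant process: each new observation joins an existing cluster with probability proportional to its size, or starts a new one with probability proportional to the concentration parameter. The sketch below is a standard CRP sampler with hypothetical parameter choices, not code from the paper:

```python
# Chinese restaurant process sketch: sequentially seat n "customers",
# choosing an existing table with probability proportional to its
# occupancy and a new table with probability proportional to alpha.
import numpy as np

def chinese_restaurant_process(n, alpha, rng):
    """Return the table (cluster) index assigned to each of n customers."""
    tables = []            # customers currently seated at each table
    assignments = []
    for i in range(n):
        probs = np.array(tables + [alpha], dtype=float)
        probs /= i + alpha                     # normalize over i + alpha mass
        table = rng.choice(len(probs), p=probs)
        if table == len(tables):
            tables.append(1)                   # open a new table
        else:
            tables[table] += 1
        assignments.append(table)
    return assignments

rng = np.random.default_rng(1)
assignments = chinese_restaurant_process(50, alpha=1.0, rng=rng)
```

The "rich get richer" dynamics mean a small number of large clusters typically emerge, which is the mixing behavior exploited in Dirichlet process mixture models.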

Posted Content
TL;DR: It is shown that the Gumbel-Softmax estimator outperforms state-of-the-art gradient estimators on structured output prediction and unsupervised generative modeling tasks with categorical latent variables, and enables large speedups on semi-supervised classification.
Abstract: Categorical variables are a natural choice for representing discrete structure in the world. However, stochastic neural networks rarely use categorical latent variables due to the inability to backpropagate through samples. In this work, we present an efficient gradient estimator that replaces the non-differentiable sample from a categorical distribution with a differentiable sample from a novel Gumbel-Softmax distribution. This distribution has the essential property that it can be smoothly annealed into a categorical distribution. We show that our Gumbel-Softmax estimator outperforms state-of-the-art gradient estimators on structured output prediction and unsupervised generative modeling tasks with categorical latent variables, and enables large speedups on semi-supervised classification.

1,476 citations


Network Information
Related Topics (5)
Estimator
97.3K papers, 2.6M citations
83% related
Multivariate statistics
18.4K papers, 1M citations
81% related
Linear model
19K papers, 1M citations
80% related
Statistical hypothesis testing
19.5K papers, 1M citations
80% related
Inference
36.8K papers, 1.3M citations
80% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    1
2022    7
2021    17
2020    16
2019    9
2018    8