Topic

Latent variable model

About: Latent variable model is a research topic. Over the lifetime of this topic, 3589 publications have been published, receiving 235061 citations.


Papers
Journal ArticleDOI
TL;DR: In this article, the authors used log-multiplicative association models as latent variable models for discrete variables and showed that these models have desirable properties, including having schematic or graphical representations of the system of observed and unobserved variables, the log-multi...
Abstract: Associations between multiple discrete measures are often due to collapsing over other variables. When the variables collapsed over are unobserved and continuous, log-multiplicative association models, including log-linear models with linear-by-linear interactions for ordinal categorical data and extensions of Goodman's (1979, 1985) RC(M) association model for multiple nominal and/or ordinal categorical variables, can be used to study the relationship between the observed discrete variables and the unobserved continuous ones, and to study the unobserved variables. The derivation and use of log-multiplicative association models as latent variable models for discrete variables are presented in this paper. The models are based on graphical models for discrete and continuous variables where the variables follow a conditional Gaussian distribution. The models have many desirable properties, including having schematic or graphical representations of the system of observed and unobserved variables, the log-multi...

43 citations
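
As a rough illustration of one of the log-linear building blocks the abstract mentions, the sketch below fits a Poisson log-linear model with a single linear-by-linear (uniform association) term to a hypothetical ordinal two-way table using statsmodels. It is not the paper's latent-variable derivation; the counts and the integer scores are invented for the example.

```python
# Minimal sketch of a log-linear model with a linear-by-linear interaction
# (uniform association) for an ordinal two-way table -- one building block
# named in the abstract, not the paper's full latent-variable model.
# The 3x3 table and the integer scores below are made-up illustration data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Cross-classified counts for two ordinal variables (hypothetical data).
rows, cols = np.meshgrid(np.arange(1, 4), np.arange(1, 4), indexing="ij")
counts = np.array([[40, 25, 10],
                   [20, 30, 20],
                   [10, 25, 45]])
df = pd.DataFrame({
    "row": rows.ravel(),    # ordinal level of the first variable
    "col": cols.ravel(),    # ordinal level of the second variable
    "count": counts.ravel(),
})
df["uv"] = df["row"] * df["col"]  # product of integer scores

# Poisson log-linear model: main effects plus one linear-by-linear term.
model = smf.glm("count ~ C(row) + C(col) + uv",
                data=df, family=sm.families.Poisson()).fit()
print(model.summary())
print("estimated association parameter:", model.params["uv"])
```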

01 Jan 2001
TL;DR: A Bayesian learning scheme is proposed that uses the variational paradigm to learn the parameters of the model and estimate the source densities, and that uses Automatic Relevance Determination (ARD) to infer the number of latent dimensions.
Abstract: Independent Component Analysis (ICA) is an important tool for extracting structure from data. ICA is traditionally performed under a maximum likelihood scheme in a latent variable model and in the absence of noise. Although extensively utilised, maximum likelihood estimation has well-known drawbacks such as overfitting and sensitivity to local maxima. In this paper, we propose a Bayesian learning scheme that uses the variational paradigm to learn the parameters of the model and estimate the source densities, and that uses Automatic Relevance Determination (ARD) to infer the number of latent dimensions. We illustrate our method by separating a noisy mixture of images, estimating the noise, and correctly inferring the true number of sources.

43 citations
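
For context, the sketch below runs the conventional noise-free ICA baseline that the abstract contrasts with its variational Bayesian scheme, using scikit-learn's FastICA on a synthetic noisy mixture. The paper's variational learning and ARD-based inference of the number of sources are not reproduced here, and the signals are made up.

```python
# Minimal sketch of the conventional noise-free ICA baseline that the abstract
# contrasts with its variational Bayesian scheme. Uses scikit-learn's FastICA
# on a synthetic noisy mixture; the paper's ARD-based inference of the number
# of sources is not reproduced here.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_samples = 2000
t = np.linspace(0, 8, n_samples)

# Two independent sources (hypothetical signals standing in for images).
sources = np.column_stack([np.sin(2 * t), np.sign(np.cos(3 * t))])
mixing = rng.normal(size=(2, 2))                                        # unknown mixing matrix
observed = sources @ mixing.T + 0.05 * rng.normal(size=(n_samples, 2))  # noisy mixture

# FastICA assumes a noise-free model with a fixed number of components;
# the paper instead infers the number of latent dimensions via ARD.
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(observed)
print("recovered sources shape:", recovered.shape)
```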

Journal ArticleDOI
TL;DR: This article presents a further generalization of latent Markov models to allow for the analysis of rating data that are collected at arbitrary points in time, and shows that such duration analyses can provide valuable insights about chronometric features of emotions.
Abstract: Markov models provide a general framework for analyzing and interpreting time dependencies in psychological applications. Recent work extended Markov models to the case of latent states because frequently psychological states are not directly observable and subject to measurement error. This article presents a further generalization of latent Markov models to allow for the analysis of rating data that are collected at arbitrary points in time. This extension offers new ways of investigating change processes by focusing explicitly on the durations that are spent in latent states. In an experience sampling application the author shows that such duration analyses can provide valuable insights about chronometric features of emotions.

43 citations
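
To make the duration idea concrete, the sketch below simulates a hypothetical two-state continuous-time latent process with exponentially distributed sojourn times, observed through noisy ratings at irregular (experience-sampling-like) time points. The states, rates, and rating model are assumptions for illustration, not the article's estimation method.

```python
# Minimal sketch of the data-generating idea behind a continuous-time latent
# Markov model: latent states with exponentially distributed sojourn times,
# observed through noisy ratings at irregular time points.
# The two states, transition rates, and rating noise below are made up.
import numpy as np

rng = np.random.default_rng(1)
rates = {0: 0.5, 1: 1.0}       # exit rate per hour for each latent state
means = {0: 2.0, 1: 5.0}       # mean rating while in each state

# Simulate the latent trajectory as (state, entry_time) pairs up to T hours.
T, t, state, path = 24.0, 0.0, 0, []
while t < T:
    path.append((state, t))
    t += rng.exponential(1.0 / rates[state])   # duration spent in this state
    state = 1 - state                          # two-state chain: always switch

def state_at(time):
    """Latent state occupied at a given time point."""
    current = path[0][0]
    for s, entry in path:
        if entry <= time:
            current = s
    return current

# Ratings collected at arbitrary, irregular time points.
obs_times = np.sort(rng.uniform(0, T, size=15))
ratings = [rng.normal(means[state_at(tt)], 0.5) for tt in obs_times]
print(list(zip(np.round(obs_times, 2), np.round(ratings, 2))))
```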

Journal ArticleDOI
TL;DR: This paper introduces a more general approximate inference framework for conjugate-exponential family models, called latent-space variational Bayes (LSVB), and shows that LSVB gives better estimates of the model evidence and of the distribution over latent variables than the VBEM approach, although in practice the distribution over latent variables has to be approximated.
Abstract: Variational Bayesian expectation-maximization (VBEM), an approximate inference method for probabilistic models based on factorizing over latent variables and model parameters, has been a standard technique for practical Bayesian inference. In this paper, we introduce a more general approximate inference framework for conjugate-exponential family models, which we call latent-space variational Bayes (LSVB). In this approach, we integrate out model parameters in an exact way, leaving only the latent variables. It can be shown that the LSVB approach gives better estimates of the model evidence as well as the distribution over latent variables than the VBEM approach, but in practice, the distribution over latent variables has to be approximated. As a practical implementation, we present a first-order LSVB (FoLSVB) algorithm to approximate this distribution over latent variables. From this approximate distribution, one can estimate the model evidence and the posterior over model parameters. The FoLSVB algorithm is directly comparable to the VBEM algorithm and has the same computational complexity. We discuss how LSVB generalizes the recently proposed collapsed variational methods [20] to general conjugate-exponential families. Examples based on mixtures of Gaussians and mixtures of Bernoullis with synthetic and real-world data sets are used to illustrate some advantages of our method over VBEM.

42 citations
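
As a point of reference, the sketch below fits the kind of VBEM baseline the paper compares against, using scikit-learn's BayesianGaussianMixture (a variational Bayesian mixture of Gaussians) on invented three-cluster data. The LSVB/FoLSVB algorithm proposed in the paper is not implemented here.

```python
# Minimal sketch of the VBEM baseline the paper compares against: a variational
# Bayesian Gaussian mixture fitted with scikit-learn's BayesianGaussianMixture.
# The LSVB/FoLSVB algorithm itself is not implemented; the synthetic
# three-cluster data set is made up for illustration.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(2)
X = np.vstack([
    rng.normal(loc=[0, 0], scale=0.5, size=(200, 2)),
    rng.normal(loc=[4, 4], scale=0.5, size=(200, 2)),
    rng.normal(loc=[0, 5], scale=0.5, size=(200, 2)),
])

# VBEM factorizes over latent assignments and mixture parameters;
# a Dirichlet-process prior lets unneeded components be pruned.
vbem = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500,
    random_state=0,
).fit(X)
print("effective components:", np.sum(vbem.weights_ > 0.01))
print("lower bound on the evidence:", vbem.lower_bound_)
```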

Book ChapterDOI
01 Jan 2013

42 citations


Network Information
Related Topics (5)
Statistical hypothesis testing: 19.5K papers, 1M citations, 82% related
Inference: 36.8K papers, 1.3M citations, 81% related
Multivariate statistics: 18.4K papers, 1M citations, 80% related
Linear model: 19K papers, 1M citations, 80% related
Estimator: 97.3K papers, 2.6M citations, 78% related
Performance
Metrics
No. of papers in the topic in previous years
Year: Papers
2023: 75
2022: 143
2021: 137
2020: 185
2019: 142
2018: 159