Open Access · Journal Article · DOI

Estimating the Dimension of a Model

Gideon Schwarz
01 Mar 1978 · Vol. 6, Iss. 2, pp. 461–464
TLDR
In this paper, the problem of selecting one of a number of models of different dimensions is treated by finding its Bayes solution and evaluating the leading terms of its asymptotic expansion.
Abstract
The problem of selecting one of a number of models of different dimensions is treated by finding its Bayes solution and evaluating the leading terms of its asymptotic expansion. These terms are a valid large-sample criterion beyond the Bayesian context, since they do not depend on the a priori distribution.
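The criterion obtained from these leading terms is now widely known as the Bayesian information criterion (BIC): for a model with k free parameters, maximized log-likelihood L̂, and n observations, BIC = −2 ln L̂ + k ln n, and the model with the lowest BIC is preferred. A minimal sketch (the nested-Gaussian comparison below is a hypothetical example, not from the paper):

```python
import numpy as np

def bic(log_likelihood, k, n):
    """Bayesian information criterion: the k*ln(n) term penalizes dimension.

    log_likelihood: maximized log-likelihood of the model
    k: number of free parameters (the model's dimension)
    n: sample size
    """
    return -2.0 * log_likelihood + k * np.log(n)

# Hypothetical example: compare two nested Gaussian models on simulated data.
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=200)  # true mean is 1, not 0
n = x.size

# Model 1: mean fixed at 0, unknown variance (k = 1)
var1 = np.mean(x**2)
ll1 = -0.5 * n * (np.log(2 * np.pi * var1) + 1)

# Model 2: unknown mean and variance (k = 2)
var2 = np.var(x)
ll2 = -0.5 * n * (np.log(2 * np.pi * var2) + 1)

print(bic(ll1, 1, n), bic(ll2, 2, n))  # lower BIC is preferred
```

Because the data are generated with a nonzero mean, the extra parameter of Model 2 buys enough fit to outweigh its larger ln(n) penalty, so Model 2 attains the lower BIC here.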



Citations
Journal ArticleDOI

MEGA5: Molecular Evolutionary Genetics Analysis using Maximum Likelihood, Evolutionary Distance, and Maximum Parsimony Methods

TL;DR: The newest addition in MEGA5 is a collection of maximum likelihood (ML) analyses for inferring evolutionary trees, selecting best-fit substitution models, inferring ancestral states and sequences, and estimating evolutionary rates site-by-site.
Journal ArticleDOI

On the evaluation of structural equation models

TL;DR: In this article, structural equation models with latent variables are defined, critiqued, and illustrated, and an overall program for model evaluation is proposed based upon an interpretation of converging and diverging evidence.
Journal ArticleDOI

jModelTest 2: more models, new heuristics and parallel computing.

TL;DR: jModelTest 2 extends the original program with more substitution models, new heuristics, and parallel computing; by Diego Darriba, Guillermo L. Taboada, Ramón Doallo and David Posada.
Journal ArticleDOI

Bayesian measures of model complexity and fit

TL;DR: In this paper, the authors consider the problem of comparing complex hierarchical models in which the number of parameters is not clearly defined. They derive a measure pD of the effective number of parameters in a model as the difference between the posterior mean of the deviance and the deviance at the posterior means of the parameters of interest; pD is related to other information criteria and has an approximate decision-theoretic justification.
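The pD construction described above can be sketched directly from posterior draws. A minimal sketch, assuming a toy conjugate Gaussian model with known unit variance (the data, prior, and draw counts are illustrative assumptions, not from the cited paper); for this model pD should come out close to 1, the number of free parameters:

```python
import numpy as np

# Hypothetical data: Gaussian observations with unknown mean, known sigma = 1
rng = np.random.default_rng(1)
y = rng.normal(0.5, 1.0, size=50)

# With a flat prior, the posterior for the mean is N(ybar, 1/n); draw from it
theta_draws = rng.normal(y.mean(), 1 / np.sqrt(y.size), size=4000)

def deviance(theta):
    # Deviance = -2 * log-likelihood of y under N(theta, 1)
    return np.sum((y - theta) ** 2) + y.size * np.log(2 * np.pi)

dbar = np.mean([deviance(t) for t in theta_draws])  # posterior mean of the deviance
dhat = deviance(theta_draws.mean())                 # deviance at the posterior mean

p_d = dbar - dhat   # effective number of parameters (approx. 1 in this model)
dic = dbar + p_d    # the associated deviance information criterion (DIC)
```

Note that pD reduces here to n times the posterior variance of the mean, which is why it lands near 1 for this one-parameter model.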

Pattern Recognition and Machine Learning

TL;DR: Probability distributions and linear models for regression and classification are covered, along with a discussion of combining models, in the context of machine learning.