Open Access Journal Article (DOI)

Information Criteria for Discriminating Among Alternative Regression Models

Takamitsu Sawa
01 Nov 1978
Vol. 46, Iss. 6, pp. 1273-1291
TL;DR
In this paper, decision rules for discriminating among alternative regression models are proposed and mutually compared, based on the Akaike Information Criterion (AIC) as well as the Kullback-Leibler Information Criterion (KLIC).
Abstract
Some decision rules for discriminating among alternative regression models are proposed and mutually compared. They are essentially based on the Akaike Information Criterion as well as the Kullback-Leibler Information Criterion (KLIC): namely, the distance between a postulated model and the true unknown structure is measured by the KLIC. The proposed criteria combine parsimony of parameters with goodness of fit. Their relationships with conventional criteria are discussed in terms of a new concept of unbiasedness.
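As a concrete illustration of how such criteria trade parsimony against goodness of fit, the sketch below computes the Gaussian AIC, 2k - 2 ln L, for two competing OLS regressions. The data, model names, and helper function are illustrative assumptions, not from the paper:

```python
import numpy as np

def gaussian_aic(y, X):
    """AIC of an OLS regression under Gaussian errors: 2k - 2*lnL."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n          # ML estimate of the error variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    # k regression coefficients plus the variance parameter
    return 2 * (k + 1) - 2 * loglik

rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.normal(size=(2, n))
y = 1.0 + 2.0 * x1 + rng.normal(size=n)   # true model uses x1 only

X_small = np.column_stack([np.ones(n), x1])
X_big = np.column_stack([np.ones(n), x1, x2])   # adds an irrelevant regressor
print(gaussian_aic(y, X_small), gaussian_aic(y, X_big))
```

The extra regressor in `X_big` barely improves the likelihood, so its AIC pays the 2-per-parameter penalty with little compensating fit.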



Citations
Journal Article (DOI)

Bayesian measures of model complexity and fit

TL;DR: The authors consider the problem of comparing complex hierarchical models in which the number of parameters is not clearly defined. They derive a measure pD of the effective number of parameters in a model as the difference between the posterior mean of the deviance and the deviance at the posterior means of the parameters of interest; pD is related to other information criteria and has an approximate decision-theoretic justification.
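The pD construction summarized above can be sketched numerically for a toy conjugate-normal model. The data, posterior, and variable names are illustrative assumptions, not taken from the cited paper:

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(loc=3.0, scale=1.0, size=50)   # data, error variance known to be 1
n = len(y)

# Conjugate posterior for the mean under a flat prior: N(ybar, 1/n)
post = rng.normal(loc=y.mean(), scale=1 / np.sqrt(n), size=10_000)

def deviance(mu):
    # -2 * log-likelihood of a N(mu, 1) model for the sample y
    return np.sum((y - mu) ** 2) + n * np.log(2 * np.pi)

mean_dev = np.mean([deviance(m) for m in post])   # posterior mean of the deviance
dev_at_mean = deviance(post.mean())               # deviance at the posterior mean
p_d = mean_dev - dev_at_mean                      # effective number of parameters
dic = mean_dev + p_d
print(p_d, dic)
```

With one free parameter (the mean), pD comes out close to 1, as the definition intends.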
Journal Article (DOI)

Likelihood Ratio Tests for Model Selection and Non-Nested Hypotheses

Quang Vuong
01 Mar 1989
TL;DR: The authors propose simple, directional likelihood-ratio tests for discriminating and choosing between two competing models, whether the models are non-nested, overlapping, or nested, and whether both, one, or neither is misspecified.
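A minimal sketch of the basic Vuong statistic for strictly non-nested models, computed from per-observation log-likelihoods. The toy data and function name are assumptions, and the published test also includes parameter-count corrections not shown here:

```python
import numpy as np

def vuong_statistic(ll1, ll2):
    """Vuong-type test statistic for two non-nested models.

    ll1, ll2: per-observation log-likelihoods under models 1 and 2.
    Large positive values favor model 1, large negative values model 2;
    asymptotically N(0, 1) when both models are equally close to the truth.
    """
    m = np.asarray(ll1) - np.asarray(ll2)
    n = len(m)
    return np.sqrt(n) * m.mean() / m.std(ddof=1)

# Toy check: data from N(0, 1); model 1 is N(0, 1), model 2 is N(1, 1)
rng = np.random.default_rng(2)
y = rng.normal(size=500)
ll_m1 = -0.5 * (np.log(2 * np.pi) + y ** 2)
ll_m2 = -0.5 * (np.log(2 * np.pi) + (y - 1.0) ** 2)
z = vuong_statistic(ll_m1, ll_m2)
print(z)   # large positive: the data favor model 1
```

Swapping the two argument lists flips the sign, which is what makes the test directional.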
Journal Article (DOI)

Testing the number of components in a normal mixture

TL;DR: It is shown that the likelihood ratio statistic, based on the Kullback-Leibler information criterion, for testing the null hypothesis that a random sample is drawn from a k0-component normal mixture distribution against the alternative that it is drawn from a k1-component normal mixture distribution is asymptotically distributed as a weighted sum of independent chi-squared random variables with one degree of freedom, under general regularity conditions.
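The limiting null distribution described above, a weighted sum of independent chi-squared(1) variables, is easy to simulate once the weights are known. The weights below are purely hypothetical; in practice they depend on the fitted models and must be estimated:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical weights for the limiting distribution (illustrative only)
weights = np.array([1.0, 0.5, 0.25])

# Draw from sum_j w_j * chi2_1 by squaring independent standard normals
draws = rng.normal(size=(100_000, len(weights))) ** 2 @ weights

# Simulated 95% critical value for the likelihood ratio statistic
crit = np.quantile(draws, 0.95)
print(crit)
```

Comparing the observed likelihood ratio statistic against `crit` then gives an approximate 5%-level test.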
Posted Content

Cointegration and Tests of Present Value Models

TL;DR: The authors propose a cointegrated model in which a variable Y_t is proportional to the present value, with constant discount rate, of expected future values of a variable y_t, so that the "spread" S_t = Y_t - theta*y_t will be stationary for some theta whether or not y_t must be differenced to induce stationarity.
Journal Article (DOI)

Several tests for model specification in the presence of alternative hypotheses

TL;DR: In this paper, several procedures are proposed for testing the specification of an econometric model in the presence of one or more other models which purport to explain the same phenomenon.
References
Book Chapter (DOI)

Information Theory and an Extension of the Maximum Likelihood Principle

TL;DR: It is shown that the classical maximum likelihood principle can be viewed as a method of asymptotic realization of an optimum estimate with respect to a very general information-theoretic criterion.