Open Access Proceedings Article

Information Theory and an Extension of the Maximum Likelihood Principle

H. Akaike
- pp 267-281
TL;DR: The classical maximum likelihood principle can be viewed as a method of asymptotically realizing an optimum estimate with respect to a very general information-theoretic criterion, and extending the principle in this way provides answers to many practical problems of statistical model fitting.
Abstract
In this paper it is shown that the classical maximum likelihood principle can be considered to be a method of asymptotic realization of an optimum estimate with respect to a very general information theoretic criterion. This observation leads to an extension of the principle that provides answers to many practical problems of statistical model fitting.
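The "very general information theoretic criterion" in question is the expected Kullback-Leibler divergence between the true distribution and the fitted model, and the extension described here is what became AIC. As a sketch (the notation below is standard, not taken from this page), for a true density f and a model g_theta with k free parameters:

D(f \,\|\, g_\theta) = \int f(x) \log \frac{f(x)}{g_\theta(x)} \, dx, \qquad \mathrm{AIC} = -2 \log L(\hat{\theta}) + 2k.

Maximizing the likelihood asymptotically minimizes the divergence within a fixed model, and choosing the model with the smallest AIC approximately minimizes the expected divergence across models.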


Citations
Journal Article

A new look at the statistical model identification

TL;DR: In this article, a new estimate, the minimum information theoretic criterion estimate (MAICE), is introduced for the purpose of statistical identification; it is free from the ambiguities inherent in the application of conventional hypothesis-testing procedures.
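The MAICE procedure fits each candidate model by maximum likelihood and selects the one with the smallest AIC. A minimal sketch in Python, assuming Gaussian least-squares polynomial fits; the data, helper name, and candidate set below are illustrative, not from the paper:

import numpy as np

def poly_aic(x, y, degree):
    """AIC (up to an additive constant) for a least-squares polynomial fit."""
    n = len(y)
    resid = y - np.polyval(np.polyfit(x, y, degree), x)
    sigma2 = np.mean(resid ** 2)        # MLE of the Gaussian error variance
    k = degree + 2                      # polynomial coefficients plus the variance
    return n * np.log(sigma2) + 2 * k

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0.0, 0.1, x.size)
best_degree = min(range(6), key=lambda d: poly_aic(x, y, d))  # typically picks 2

For Gaussian errors the maximized log-likelihood depends on the data only through the residual variance, so n log(sigma2) + 2k is sufficient for comparing the candidates.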
Book

Neural networks for pattern recognition

TL;DR: This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition, and is designed as a text, with over 100 exercises, to benefit anyone involved in the fields of neural computation and pattern recognition.
Journal Article

Deep learning in neural networks

TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium; it reviews deep supervised learning, unsupervised learning, reinforcement learning and evolutionary computation, and indirect search for short programs encoding deep and large networks.
Journal Article

Bayesian measures of model complexity and fit

TL;DR: In this paper, the authors consider the problem of comparing complex hierarchical models in which the number of parameters is not clearly defined. They derive a measure pD of the effective number of parameters in a model as the difference between the posterior mean of the deviance and the deviance at the posterior means of the parameters of interest; the measure is related to other information criteria and has an approximate decision-theoretic justification.
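Writing D(theta) for the deviance, -2 log p(y | theta) up to a standardizing term, the quantities just described are (a sketch of the paper's definitions, with the bar denoting a posterior mean):

p_D = \overline{D(\theta)} - D(\bar{\theta}), \qquad \mathrm{DIC} = D(\bar{\theta}) + 2 p_D = \overline{D(\theta)} + p_D.

Both terms are easy to estimate from MCMC output, which is what makes the criterion practical for hierarchical models.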
Journal Article

Conditional heteroskedasticity in asset returns: a new approach

Daniel B. Nelson
- 01 Mar 1991
TL;DR: In this article, an exponential ARCH (EGARCH) model, an improvement over the widely used GARCH model, is proposed and used to study volatility changes and the risk premium on the CRSP Value-Weighted Market Index from 1962 to 1987.
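In one common parameterization of Nelson's EGARCH(1,1) (the symbols below are illustrative), the log of the conditional variance evolves as

\log \sigma_t^2 = \omega + \beta \log \sigma_{t-1}^2 + \alpha \left( |z_{t-1}| - \mathbb{E}|z_{t-1}| \right) + \gamma z_{t-1}, \qquad z_t = \varepsilon_t / \sigma_t.

Because the recursion is in logs, no positivity constraints on the coefficients are needed, and the gamma term lets negative and positive shocks of equal size move volatility by different amounts.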
References
Journal Article

A mathematical theory of communication

TL;DR: This final installment of the paper considers the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.
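The continuous case replaces the discrete entropy by the differential entropy; as a standard statement of the shift (not quoted from this page):

H(X) = -\sum_x p(x) \log p(x) \quad \longrightarrow \quad h(X) = -\int p(x) \log p(x) \, dx.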
Book

Spectral analysis and its applications

TL;DR: In this book, the authors develop the theory of spectral analysis of stationary time series and illustrate it with a range of practical applications.
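The book's central object is the estimated spectral density of a stationary series. As a minimal illustration, the raw periodogram (the book devotes much of its attention to smoothing and windowing such estimates) might be computed as follows; the function name is mine:

import numpy as np

def periodogram(x, dt=1.0):
    """Raw periodogram: a basic estimate of the spectral density of a series."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                       # remove the mean (zero-frequency term)
    n = len(x)
    spec = (dt / n) * np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(n, d=dt)
    return freqs, spec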
Journal Article

Theory of Statistical Estimation

TL;DR: It has been pointed out to me that some of the statistical ideas employed in the following investigation have never received a strictly logical definition and analysis, and it is desirable to set out for criticism the manner in which the logical foundations of these ideas may be established.
Book Chapter

Fitting autoregressive models for prediction

TL;DR: This is a preliminary report on a newly developed, simple and practical procedure for the statistical identification of predictors using autoregressive models for a stationary time series.
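The selection procedure referred to chooses the autoregressive order by Akaike's final prediction error (FPE). A minimal sketch, assuming a zero-mean stationary series and one common form of the criterion; the helper name is mine:

import numpy as np

def ar_fpe(x, p):
    """Final prediction error for a least-squares AR(p) fit (one common form)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    # design matrix of lagged values: column i holds x[t - i - 1] for target x[t]
    X = np.column_stack([x[p - i - 1 : N - i - 1] for i in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.mean((y - X @ coef) ** 2)   # residual (one-step error) variance
    return sigma2 * (N + p) / (N - p)       # inflation factor penalizes high orders

# e.g. pick the order minimizing FPE over a candidate range:
# best_p = min(range(1, 21), key=lambda p: ar_fpe(series, p))

Minimizing FPE trades the fall in residual variance against the (N + p)/(N - p) inflation factor, anticipating the AIC penalty of the paper above.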