Author
Hirotugu Akaike
Bio: Hirotugu Akaike is an academic researcher from the University of Hawaii. The author has contributed to research in the topics of autoregressive models and Bayesian statistics. The author has an h-index of 40 and has co-authored 89 publications receiving 79,485 citations.
Topics: Autoregressive model, Bayesian statistics, Bayes factor
Papers
Abstract: The history of the development of statistical hypothesis testing in time series analysis is reviewed briefly, and it is pointed out that the hypothesis testing procedure is not adequately defined as a procedure for statistical model identification. The classical maximum likelihood estimation procedure is reviewed, and a new estimate, the minimum information theoretic criterion (AIC) estimate (MAICE), designed for the purpose of statistical identification, is introduced. When there are several competing models, the MAICE is defined by the model and the maximum likelihood estimates of the parameters that give the minimum of AIC, defined by AIC = (-2) log(maximum likelihood) + 2 (number of independently adjusted parameters within the model). MAICE provides a versatile procedure for statistical model identification that is free from the ambiguities inherent in the application of conventional hypothesis testing procedures. The practical utility of MAICE in time series analysis is demonstrated with some numerical examples.
Topics: Likelihood function (61%), Akaike information criterion (61%), Statistical model (60%)
42,619 Citations
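The AIC formula quoted in the abstract lends itself to a direct computation. Below is a minimal Python sketch of the minimum-AIC selection rule; the model names, log-likelihoods, and parameter counts are invented for illustration and do not come from the paper:

```python
def aic(log_likelihood, n_params):
    """AIC = (-2) * log(maximum likelihood) + 2 * (number of independently
    adjusted parameters within the model), as defined in the abstract."""
    return -2.0 * log_likelihood + 2 * n_params

def maice(candidates):
    """MAICE: among competing fitted models, pick the one minimizing AIC.
    candidates: list of (name, maximized log-likelihood, parameter count)."""
    return min(candidates, key=lambda m: aic(m[1], m[2]))

# Hypothetical maximized log-likelihoods for three competing models.
candidates = [("AR(1)", -104.2, 2), ("AR(2)", -100.1, 3), ("AR(3)", -99.8, 4)]
best = maice(candidates)
print(best[0])  # AR(2): AIC 206.2 beats AR(1) at 212.4 and AR(3) at 207.6
```

Note how the penalty term 2k arbitrates between fit and complexity: AR(3) has the highest likelihood but loses to AR(2) once the extra parameter is charged for.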
01 Jan 1973
Abstract: In this paper it is shown that the classical maximum likelihood principle can be considered to be a method of asymptotic realization of an optimum estimate with respect to a very general information theoretic criterion. This observation shows an extension of the principle to provide answers to many practical problems of statistical model fitting.
Topics: Likelihood principle (64%), Likelihood function (59%), Bayesian information criterion (58%)
15,032 Citations
Abstract: The information criterion AIC was introduced to extend the method of maximum likelihood to the multimodel situation. It was obtained by relating the successful experience of the order determination of an autoregressive model to the determination of the number of factors in maximum likelihood factor analysis. The use of the AIC criterion in factor analysis is particularly interesting when it is viewed as the choice of a Bayesian model. This observation shows that the area of application of AIC can be much wider than the conventional i.i.d.-type models on which the original derivation of the criterion was based. The observation of the Bayesian structure of the factor analysis model leads us to the handling of the problem of improper solutions by introducing a natural prior distribution of factor loadings.
Topics: Bayesian information criterion (64%), Bayes factor (61%), Deviance information criterion (61%)
4,560 Citations
Abstract: This is a preliminary report on a newly developed, simple and practical procedure for the statistical identification of predictors using autoregressive models. The use of the autoregressive representation of a stationary time series (or the innovations approach) in the analysis of time series has recently been attracting the attention of many research workers, and it is expected that this time-domain approach will give answers to many problems, such as the identification of noisy feedback systems, which could not be solved by direct application of the frequency-domain approach [1], [2], [3], [9].
Topics: SETAR (67%), STAR model (65%), Autoregressive integrated moving average (63%)
2,343 Citations
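The autoregressive order-identification idea described above can be sketched with ordinary least-squares AR fits scored by a Gaussian AIC. Everything below (the function name, the simulated AR(2) series, the maximum order of 6) is illustrative and not taken from the paper, whose actual procedure differs in detail:

```python
import numpy as np

def ar_order_by_aic(x, max_order):
    """Fit AR(p) by least squares for p = 1..max_order and score each fit
    with a Gaussian AIC: N * log(sigma2_hat) + 2 * p, where N is the number
    of residuals and sigma2_hat is the ML estimate of the innovation
    variance. Returns (order minimizing AIC, dict of scores)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    scores = {}
    for p in range(1, max_order + 1):
        y = x[p:]  # targets x[t] for t = p..n-1
        # Lag matrix: column j-1 holds x[t-j].
        X = np.column_stack([x[p - j : n - j] for j in range(1, p + 1)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = np.mean((y - X @ coef) ** 2)
        scores[p] = len(y) * np.log(sigma2) + 2 * p
    return min(scores, key=scores.get), scores

# Simulated AR(2) series; low orders should score best.
rng = np.random.default_rng(0)
x = np.zeros(500)
e = rng.standard_normal(500)
for t in range(2, 500):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + e[t]
order, scores = ar_order_by_aic(x, 6)
```

Because the innovations are random, the selected order can vary from run to run; the point of the sketch is the scoring loop, not any particular outcome.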
Open access • Proceedings Article
01 Jan 1973
Topics: Extension (predicate logic) (52%), Information theory (51%)
2,038 Citations
Cited by
Abstract: This article examines the adequacy of the “rules of thumb” conventional cutoff criteria and several new alternatives for various fit indexes used to evaluate model fit in practice. Using a 2‐index presentation strategy, which includes using the maximum likelihood (ML)‐based standardized root mean squared residual (SRMR) and supplementing it with either Tucker‐Lewis Index (TLI), Bollen's (1989) Fit Index (BL89), Relative Noncentrality Index (RNI), Comparative Fit Index (CFI), Gamma Hat, McDonald's Centrality Index (Mc), or root mean squared error of approximation (RMSEA), various combinations of cutoff values from selected ranges of cutoff criteria for the ML‐based SRMR and a given supplemental fit index were used to calculate rejection rates for various types of true‐population and misspecified models; that is, models with misspecified factor covariance(s) and models with misspecified factor loading(s). The results suggest that, for the ML method, a cutoff value close to .95 for TLI, BL89, CFI, RNI, and G...
Topics: Cutoff (52%), Goodness of fit (51%)
63,509 Citations
Open access • Book
01 Jan 1995
Abstract: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?
Topics: Algorithmic learning theory (76%), Statistical learning theory (71%), Computational learning theory (71%)
38,164 Citations
Abstract: Summary: The program MODELTEST uses log likelihood scores to establish the model of DNA evolution that best fits the data. Availability: The MODELTEST package, including the source code and some documentation, is available at http://bioag.byu.edu/zoology/crandall_lab/modeltest.html. Contact: dp47@email.byu.edu.
19,721 Citations
Open access • Proceedings Article
01 Jan 1973
Abstract: In this paper it is shown that the classical maximum likelihood principle can be considered to be a method of asymptotic realization of an optimum estimate with respect to a very general information theoretic criterion. This observation shows an extension of the principle to provide answers to many practical problems of statistical model fitting.
Topics: Information theory (51%)
17,414 Citations