
Model selection

About: Model selection is a research topic. Over its lifetime, 14,339 publications have been published within this topic, receiving 786,214 citations. The topic is also known as: model order selection.


Open access · Journal Article · DOI: 10.1109/TAC.1974.1100705
Abstract: The history of the development of statistical hypothesis testing in time series analysis is reviewed briefly, and it is pointed out that the hypothesis testing procedure is not adequately defined as a procedure for statistical model identification. The classical maximum likelihood estimation procedure is reviewed, and a new estimate, the minimum information theoretic criterion (AIC) estimate (MAICE), designed for the purpose of statistical identification, is introduced. When there are several competing models, the MAICE is defined by the model, and the maximum likelihood estimates of its parameters, that give the minimum of AIC, defined by AIC = (-2)log(maximum likelihood) + 2(number of independently adjusted parameters within the model). MAICE provides a versatile procedure for statistical model identification that is free from the ambiguities inherent in the application of conventional hypothesis testing procedures. The practical utility of MAICE in time series analysis is demonstrated with numerical examples.
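The AIC formula and the minimum-AIC selection rule described above can be sketched in a few lines. The model names, log-likelihoods, and parameter counts below are hypothetical, chosen only to illustrate the mechanics:

```python
def aic(log_likelihood, k):
    """Akaike's criterion: AIC = -2*log(maximum likelihood) + 2*k,
    where k is the number of independently adjusted parameters."""
    return -2.0 * log_likelihood + 2.0 * k

# Hypothetical maximized log-likelihoods and parameter counts
# for three competing models (not taken from the paper)
candidates = {"AR(1)": (-120.3, 2), "AR(2)": (-118.9, 3), "AR(3)": (-118.7, 4)}
scores = {name: aic(ll, k) for name, (ll, k) in candidates.items()}
best = min(scores, key=scores.get)  # the MAICE: the model with minimum AIC
```

Note how the 2k penalty arbitrates the trade-off: AR(3) fits slightly better than AR(2) but pays for its extra parameter, so the minimum-AIC choice here is AR(2).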

Topics: Likelihood function (61%), Akaike information criterion (61%), Statistical model (60%)

42,619 Citations

Open access · Book
19 Jun 2013
Abstract: Contents: Introduction · Information and Likelihood Theory: A Basis for Model Selection and Inference · Basic Use of the Information-Theoretic Approach · Formal Inference From More Than One Model: Multi-Model Inference (MMI) · Monte Carlo Insights and Extended Examples · Statistical Theory and Numerical Results · Summary
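A standard tool in the multi-model inference framework this book covers is the Akaike weight, which converts a set of AIC values into relative model probabilities. A minimal sketch, with hypothetical AIC values:

```python
import math

def akaike_weights(aic_values):
    """Akaike weights: for each model i, compute the relative likelihood
    exp(-delta_i / 2), where delta_i = AIC_i - min(AIC), then normalize
    so the weights sum to 1 over the candidate set."""
    best = min(aic_values)
    rel = [math.exp(-0.5 * (a - best)) for a in aic_values]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC values for three candidate models
weights = akaike_weights([243.8, 244.6, 245.4])
```

The weights support inference from more than one model: rather than discarding all but the single best candidate, predictions can be averaged across models in proportion to their weights.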

Topics: Inference (66%), Statistical theory (58%), Model selection (56%)

35,811 Citations

Open access · Journal Article · DOI: 10.1214/AOS/1176344136
Abstract: The problem of selecting one of a number of models of different dimensions is treated by finding its Bayes solution, and evaluating the leading terms of its asymptotic expansion. These terms are a valid large-sample criterion beyond the Bayesian context, since they do not depend on the a priori distribution.
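The criterion this abstract describes is the Bayesian information criterion (BIC), whose penalty term grows with the sample size. A minimal sketch (the numbers in the comparison are hypothetical):

```python
import math

def bic(log_likelihood, k, n):
    """Schwarz's criterion: BIC = -2*log(maximum likelihood) + k*log(n).
    The k*log(n) term is the leading penalty from the asymptotic expansion
    of the Bayes solution; note that it does not involve the prior."""
    return -2.0 * log_likelihood + k * math.log(n)

# With n = 50 observations, each extra parameter costs log(50) ~ 3.9,
# versus AIC's constant cost of 2 -- so BIC selects smaller models
# as the sample grows.
penalty_gap = bic(-100.0, 4, 50) - bic(-100.0, 3, 50)
```

The design choice is the penalty: because log(n) eventually exceeds any constant, BIC is consistent (it picks the true model dimension with probability approaching 1), whereas AIC's fixed penalty favors predictive accuracy over consistency.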

Topics: Bayesian information criterion (57%), g-prior (55%), Bayes' theorem (55%)

35,659 Citations

Open access · Book
01 Jan 1970
Abstract: From the Publisher: This is a complete revision of a classic, seminal, and authoritative book that has been the model for most books on the topic written since 1970. It focuses on practical techniques throughout, rather than a rigorous mathematical treatment of the subject. It explores the building of stochastic (statistical) models for time series and their use in important areas of application: forecasting, model specification, estimation and checking, transfer function modeling of dynamic relationships, modeling the effects of intervention events, and process control. Features sections on: recently developed methods for model specification, such as canonical correlation analysis and the use of model selection criteria; results on testing for unit-root nonstationarity in ARIMA processes; the state space representation of ARMA models and its use for likelihood estimation and forecasting; score tests for model checking; and deterministic components and structural components in time series models and their estimation based on regression-time series model methods.

Topics: Moving-average model (59%), Model selection (56%), Decomposition of time series (56%)

19,748 Citations

Open access · Journal Article · DOI: 10.1162/153244303322753616
Isabelle Guyon, André Elisseeff
Abstract: Variable and feature selection have become the focus of much research in areas of application for which datasets with tens or hundreds of thousands of variables are available. These areas include text processing of internet documents, gene expression array analysis, and combinatorial chemistry. The objective of variable selection is three-fold: improving the prediction performance of the predictors, providing faster and more cost-effective predictors, and providing a better understanding of the underlying process that generated the data. The contributions of this special issue cover a wide range of aspects of such problems: providing a better definition of the objective function, feature construction, feature ranking, multivariate feature selection, efficient search methods, and feature validity assessment methods.
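Feature ranking, one of the aspects listed above, can be illustrated with a simple univariate filter: score each variable by its absolute correlation with the target and sort. The helper names and toy data below are illustrative, not from the paper:

```python
def pearson(xs, ys):
    """Plain Pearson correlation coefficient, standard library only."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (sx * sy)

def rank_features(columns, y):
    """Rank feature columns by |correlation| with the target, strongest first --
    a univariate filter, one of the simplest ranking criteria."""
    return sorted(columns, key=lambda name: abs(pearson(columns[name], y)),
                  reverse=True)

# Toy data: "signal" tracks the target linearly, "noise" alternates
cols = {"noise": [1.0, -1.0, 1.0, -1.0], "signal": [1.0, 2.0, 3.0, 4.0]}
ranking = rank_features(cols, [2.0, 4.0, 6.0, 8.0])
```

Such filters are fast but score variables one at a time; the multivariate selection and search methods surveyed in the issue address features that are only informative jointly.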

13,554 Citations


Top Attributes


Topic's top 5 most impactful authors

Gerda Claeskens

38 papers, 3.3K citations

David F. Hendry

29 papers, 2.2K citations

Claudia Czado

20 papers, 547 citations

Yuhong Yang

20 papers, 1K citations

Nizar Bouguila

17 papers, 207 citations

Network Information
Related Topics (5)
Bayesian probability

26.5K papers, 817.9K citations

96% related
Bayesian inference

22.4K papers, 820.4K citations

95% related
Nonparametric statistics

19.9K papers, 844.1K citations

94% related
Posterior probability

13.7K papers, 475K citations

94% related
Markov chain Monte Carlo

20.1K papers, 746.5K citations

94% related