Journal ArticleDOI

Selection of simplified models: ii. development of a model selection criterion based on mean squared error

TLDR
In this article, the authors propose a new criterion to help modellers select the simplified model with the lowest expected mean squared error (MSE), and compare its effectiveness with that of the Bayesian Information Criterion (BIC).
Abstract
Simplified models (SMs) with a reduced set of parameters are used in many practical situations, especially when the available data for parameter estimation are limited. A variety of candidate models are often considered during the model formulation, simplification, and parameter estimation processes. We propose a new criterion to help modellers select the best SM, so that predictions with lowest expected mean squared error can be obtained. The effectiveness of the proposed criterion for selecting simplified nonlinear univariate and multivariate models is demonstrated using Monte-Carlo simulations and is compared with the effectiveness of the Bayesian Information Criterion (BIC).
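The abstract compares the proposed MSE-based criterion against BIC as a baseline. As a point of reference only, the following is a minimal sketch of BIC-based selection between two candidate simplified models; the data-generating process, noise level, and candidate models here are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

def bic(rss, n, k):
    # Gaussian-error BIC up to additive constants:
    # n * ln(RSS/n) + k * ln(n); lower is better.
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(0)
n = 30
x = np.linspace(0.0, 1.0, n)
# Hypothetical "true" process: quadratic plus small noise.
y = 1.0 + 2.0 * x + 3.0 * x**2 + rng.normal(0.0, 0.1, n)

# Two candidate simplified models: linear (degree 1) vs quadratic (degree 2).
scores = {}
for degree in (1, 2):
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    rss = float(resid @ resid)
    scores[degree] = bic(rss, n, degree + 1)  # k = number of coefficients

best = min(scores, key=scores.get)
print(best)
```

With a strong quadratic term and small noise, the degree-2 candidate should achieve the lower BIC; the paper's contribution is a different, expected-MSE-based criterion that this sketch does not implement.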

Citations
Journal ArticleDOI

Mathematical modelling of chemical processes—obtaining the best model predictions and parameter estimates using identifiability and estimability procedures

TL;DR: In this article, the authors review techniques for assessing parameter identifiability and estimability, showing that estimating a reduced number of parameters can lead to better model predictions with lower mean squared error (MSE).
Journal ArticleDOI

Selection of optimal parameter set using estimability analysis and MSE-based model-selection criterion

TL;DR: In this article, a mean squared error (MSE)-based model selection criterion is used to determine the optimal number of parameters to estimate from the ranked parameter list, so that the most reliable model predictions can be obtained.
Journal ArticleDOI

Highly-selective CO2 conversion via reverse water gas shift reaction over the 0.5wt% Ru-promoted Cu/ZnO/Al2O3 catalyst

TL;DR: In this paper, the reverse water gas shift (RWGS) reaction over a 0.5% Ru-promoted 40% Cu/ZnO/Al2O3 catalyst is studied.
Journal ArticleDOI

Mean-Squared-Error Methods for Selecting Optimal Parameter Subsets for Estimation

TL;DR: In this paper, an orthogonalization algorithm combined with a mean squared error (MSE) based selection criterion has been used to rank parameters from most to least estimable and to determine the parameter subset that should be estimated to obtain the best predictions.
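The TL;DR above describes ranking parameters from most to least estimable using an orthogonalization algorithm. A generic Gram-Schmidt-style sketch of such a ranking on a scaled sensitivity matrix is shown below; the function name and toy matrix are hypothetical, and this is an illustration of the general technique, not the authors' exact algorithm or MSE-based cut-off criterion.

```python
import numpy as np

def rank_parameters(Z):
    """Rank columns of a scaled sensitivity matrix Z (observations x
    parameters) from most to least estimable: repeatedly pick the column
    with the largest residual norm, then remove its projection from the
    remaining columns."""
    R = np.asarray(Z, dtype=float).copy()
    remaining = list(range(R.shape[1]))
    ranked = []
    while remaining:
        best = max(remaining, key=lambda j: np.linalg.norm(R[:, j]))
        ranked.append(best)
        remaining.remove(best)
        nrm = np.linalg.norm(R[:, best])
        if not remaining or nrm == 0.0:
            continue
        v = R[:, best] / nrm
        for j in remaining:
            R[:, j] -= (v @ R[:, j]) * v  # remove component along chosen column
    return ranked

# Toy matrix: column 0 has the largest effect; column 2 is collinear with it.
Z = np.array([[3.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0]])
order = rank_parameters(Z)
print(order)
```

Collinear, low-magnitude columns end up last in the ranking, which is what makes such an ordering useful when paired with an MSE-based criterion for deciding how many of the ranked parameters to estimate.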
Journal ArticleDOI

Feature Importance of Stabilised Rammed Earth Components Affecting the Compressive Strength Calculated with Explainable Artificial Intelligence Tools

TL;DR: This work uses three machine learning regression tools, i.e., artificial neural networks, decision trees, and random forests, to predict compressive strength from the relative content of CSRE composite components (clay, silt, sand, gravel, cement, and water).
References
Book

Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach

TL;DR: The second edition of this book is unique in that it focuses on methods for making formal statistical inference from all the models in an a priori set (Multi-Model Inference).
Proceedings Article

Information Theory and an Extension of the Maximum Likelihood Principle

H. Akaike
TL;DR: The classical maximum likelihood principle can be considered to be a method of asymptotic realization of an optimum estimate with respect to a very general information theoretic criterion to provide answers to many practical problems of statistical model fitting.
Book ChapterDOI

Information Theory and an Extension of the Maximum Likelihood Principle

TL;DR: In this paper, it is shown that the classical maximum likelihood principle can be considered to be a method of asymptotic realization of an optimum estimate with respect to a very general information theoretic criterion.
Book

Introduction to Linear Regression Analysis

TL;DR: In this book, the authors cover simple and multiple linear regression, variable selection, multicollinearity, and robust regression, along with the validation of regression models.
Book

Model selection and multimodel inference
