
Verification and validation of computer simulation models

About: Verification and validation of computer simulation models is a research topic. Over its lifetime, 1,556 publications have been published within this topic, receiving 43,203 citations.


Papers
Proceedings ArticleDOI
08 Dec 2002
TL;DR: Practical techniques and guidelines for verifying and validating simulation models are outlined and examples of a number of typical situations where model developers may make inappropriate or inaccurate assumptions are provided.
Abstract: In this paper we outline practical techniques and guidelines for verifying and validating simulation models. The goal of verification and validation is a model that is accurate when used to predict the performance of the real-world system that it represents, or to predict the difference in performance between two scenarios or two model configurations. The process of verifying and validating a model should also lead to improving a model's credibility with decision makers. We provide examples of a number of typical situations where model developers may make inappropriate or inaccurate assumptions, and offer guidelines and techniques for carrying out verification and validation.

269 citations

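One basic validation technique in line with the guidelines above is comparing simulation output against measurements from the real-world system. The sketch below is a minimal, hypothetical illustration of that idea (not code from the paper): it builds a confidence interval on the difference between the simulated and observed mean of a performance measure, using made-up waiting-time data.

```python
# Minimal validation sketch: compare simulated vs. observed mean waiting time
# with a Welch confidence interval. All data below are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
real_waits = rng.normal(loc=10.0, scale=2.0, size=30)   # observed system output
sim_waits = rng.normal(loc=10.5, scale=2.0, size=30)    # independent simulation replications

diff = sim_waits.mean() - real_waits.mean()
v_sim = sim_waits.var(ddof=1) / len(sim_waits)
v_real = real_waits.var(ddof=1) / len(real_waits)
# Welch-Satterthwaite degrees of freedom for unequal variances.
dof = (v_sim + v_real) ** 2 / (v_sim ** 2 / (len(sim_waits) - 1)
                               + v_real ** 2 / (len(real_waits) - 1))
half_width = stats.t.ppf(0.975, dof) * np.sqrt(v_sim + v_real)

print(f"simulated minus observed mean: {diff:+.2f} +/- {half_width:.2f} (95% CI)")
# If the whole interval lies within a difference the decision makers consider
# acceptable, the model can be judged valid for this output measure.
```

What counts as an acceptable difference is agreed with the decision makers, which is part of how verification and validation build a model's credibility.
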
Journal ArticleDOI
TL;DR: Some of the current issues in forecast verification are addressed, some of the most recently developed verification techniques are reviewed, and some recommendations for future research are provided.
Abstract: Research and development of new verification strategies and reassessment of traditional forecast verification methods have received a great deal of attention from the scientific community in the last decade. This scientific effort has arisen from the need to respond to changes encompassing several aspects of the verification process, such as the evolution of forecasting systems, or the desire for more meaningful verification approaches that address specific forecast user requirements. Verification techniques that account for the spatial structure and the presence of features in forecast fields, and which are designed specifically for high-resolution forecasts, have been developed. The advent of ensemble forecasts has motivated the re-evaluation of some of the traditional scores and the development of new verification methods for probability forecasts. The expected climatological increase of extreme events and their potential socio-economic impacts have revitalized research studies addressing the challenges concerning extreme event verification. Verification issues encountered in the operational forecasting environment have been widely discussed, verification needs for different user communities have been identified, and models to assess the forecast value for specific users have been proposed. Proper verification practice and correct interpretation of verification statistics have been extensively promoted through recent publications and books, tutorials and workshops, and the development of open-source software and verification tools. This paper addresses some of the current issues in forecast verification, reviews some of the most recently developed verification techniques, and provides recommendations for future research. Copyright © 2008 Royal Meteorological Society and Crown in the right of Canada.

266 citations

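As a concrete example of the "traditional scores" for probability forecasts mentioned above, the Brier score is the mean squared difference between the forecast probability and the binary outcome. The sketch below, using hypothetical forecasts, also computes the corresponding skill score against a climatological reference; it is a generic illustration, not code from the paper.

```python
# Brier score and Brier skill score for probability forecasts (hypothetical data).
import numpy as np

forecast_prob = np.array([0.9, 0.7, 0.2, 0.1, 0.6, 0.8])  # forecast P(event)
observed = np.array([1, 1, 0, 0, 1, 0])                    # 1 if the event occurred

brier = np.mean((forecast_prob - observed) ** 2)            # 0 is a perfect score

# Reference forecast: always issue the climatological base rate.
base_rate = observed.mean()
brier_ref = np.mean((base_rate - observed) ** 2)
brier_skill = 1.0 - brier / brier_ref                       # 1 is perfect, <= 0 is no skill

print(f"Brier score: {brier:.3f}")
print(f"Brier skill score vs. climatology: {brier_skill:.3f}")
```
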
Journal ArticleDOI
TL;DR: The assumptions and derivations of seven analytical models are compared, and the difference between the analytical and simulation models is found to be significant.
Abstract: The purpose of this paper is to evaluate and compare past models of transfer lines. After a discussion of the nature of transfer lines and their stoppages, the assumptions and derivations of seven analytical models are compared. The major discrepancies in assumptions concern whether failures are time dependent or operation dependent. The comparison of derivations includes a translation of all results to a common set of symbols. The predictions of the models at various banking levels are observed to vary significantly. Model validation is discussed in two phases. First, the validity of the assumptions is tested against real data from a transfer line. The major problem is the failure of the real data to satisfy the assumptions of the model. Finally, the predictions of the analytical models are compared with those of a simulation model that uses the actual data. The difference between the analytical and simulation models was found to be significant.

264 citations

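To make the analytical-versus-simulation comparison concrete, the toy sketch below (not one of the seven models from the paper) runs a discrete-time Monte Carlo simulation of a two-machine transfer line with a finite buffer and shows how estimated throughput changes with the banking level. It assumes time-dependent failures with hypothetical failure and repair probabilities per cycle.

```python
# Toy two-machine transfer line with a finite buffer (banking), simulated in
# discrete time. Failure/repair probabilities per cycle are hypothetical.
import random

def simulate_line(buffer_cap, cycles=200_000, p_fail=0.05, p_repair=0.5, seed=1):
    rng = random.Random(seed)
    buffer_level = 0
    up = [True, True]        # machine 0 feeds the buffer, machine 1 drains it
    produced = 0
    for _ in range(cycles):
        # Time-dependent failures: a machine can fail whether or not it worked.
        for m in range(2):
            up[m] = (rng.random() >= p_fail) if up[m] else (rng.random() < p_repair)
        if up[1] and buffer_level > 0:           # downstream machine takes a part
            buffer_level -= 1
            produced += 1
        if up[0] and buffer_level < buffer_cap:  # upstream machine adds a part
            buffer_level += 1
    return produced / cycles                     # throughput in parts per cycle

for cap in (1, 2, 5, 10, 50):
    print(f"buffer capacity {cap:>3}: throughput ~ {simulate_line(cap):.3f}")
```

Replacing the hypothetical probabilities with estimates from real stoppage data, and comparing the resulting throughput against an analytical model's prediction, mirrors the two-phase validation the paper describes.
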
Journal ArticleDOI
TL;DR: The success of program verification as a generally applicable and completely reliable method for guaranteeing program performance is not even a theoretical possibility.
Abstract: The notion of program verification appears to trade upon an equivocation. Algorithms, as logical structures, are appropriate subjects for deductive verification. Programs, as causal models of those structures, are not. The success of program verification as a generally applicable and completely reliable method for guaranteeing program performance is not even a theoretical possibility.

262 citations

ReportDOI
TL;DR: In this article, the authors present guidelines for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation.

257 citations

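A small, generic illustration of what a "validation metric" quantifies (not necessarily the metric the report recommends): the difference between a code prediction and the experimental mean, with a confidence interval reflecting the estimated experimental measurement uncertainty. All numbers below are hypothetical.

```python
# Generic validation-metric sketch: model error vs. experimental uncertainty.
import numpy as np
from scipy import stats

y_code = 3.2                                    # hypothetical code prediction
y_exp = np.array([3.4, 3.6, 3.3, 3.7, 3.5])     # hypothetical repeated measurements

n = len(y_exp)
error = y_code - y_exp.mean()                   # estimated model error
half_width = stats.t.ppf(0.975, n - 1) * y_exp.std(ddof=1) / np.sqrt(n)

print(f"estimated error: {error:+.2f} +/- {half_width:.2f} (95% CI on the experimental mean)")
# A benchmark-style comparison reports this error and its uncertainty rather
# than a simple pass/fail, so the evidence can be weighed for the intended use.
```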

Network Information
Related Topics (5)
Scheduling (computing): 78.6K papers, 1.3M citations, 78% related
Software development: 73.8K papers, 1.4M citations, 78% related
Software: 130.5K papers, 2M citations, 77% related
Control system: 129K papers, 1.5M citations, 75% related
Robustness (computer science): 94.7K papers, 1.6M citations, 74% related
Performance Metrics
No. of papers in the topic in previous years:
Year  Papers
2023  6
2022  8
2021  15
2020  8
2019  23
2018  21