
Showing papers on "Verification and validation of computer simulation models" published in 2010


Journal ArticleDOI
TL;DR: This paper reviews the standard definitions of verification and validation in the context of engineering design and progresses to provide a coherent analysis and classification of these activities, from preliminary design, through design in the digital domain, to the physical verification and validation of products and processes.

239 citations


Journal Article
TL;DR: A model for verification and validation of scientific computing codes is proposed and designed, based on the procedures of application code design and the software development cycle, to provide an assessment method for verification and validation of such codes.
Abstract: Aiming at the development of application codes and the assessment of prediction capability in scientific computing, this paper proposes and designs a model for verification and validation of scientific computing codes, based on the procedures of application code design and the software development cycle. The relations of physical modeling and numerical simulation to verification and validation of scientific computing codes are explored, and methods and procedures for verification and validation are proposed. The motivation of this model is to provide an assessment method for verification and validation of scientific computing codes.

202 citations


Journal ArticleDOI
01 Jul 2010
TL;DR: The task of model validation will be discussed, with a focus on current techniques, and it is hoped that this review will encourage investigators to engage and adopt the verification and validation process in an effort to increase peer acceptance of computational biomechanics models.
Abstract: The topics of verification and validation have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. Verification and validation are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science, these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed verification and validation as they pertain to traditional solid and fluid mechanics, it is the intent of this paper to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed, with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the verification and validation process in an effort to increase peer acceptance of computational biomechanics models.

168 citations


Journal ArticleDOI
TL;DR: As the role of simulation modeling in population health is increasing and models are becoming more complex, there is a need for further improvements in model validation methodology and common standards for evaluating model credibility.
Abstract: Background: Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. Methods: We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of published model validation guidelines. Based on the review, we formulated a set of recommendations for gathering evidence of model credibility. Results: Evidence of model credibility derives from examining: 1) the process of model development, 2) the performance of a model, and 3) the quality of decisions based on the model. Many important issues in model validation are insufficiently addressed by current guidelines. These issues include a detailed evaluation of different data sources, graphical representation of models, computer programming, model calibration, between-model comparisons, sensitivity analysis, and predictive validity. The role of external data in model validation depends on the purpose of the model (e.g., decision analysis versus prediction). More research is needed on the methods of comparing the quality of decisions based on different models. Conclusion: As the role of simulation modeling in population health is increasing and models are becoming more complex, there is a need for further improvements in model validation methodology and common standards for evaluating model credibility.

113 citations
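
The abstract above distinguishes evidence from the model-development process, from model performance, and from decision quality. The sketch below is a minimal, hypothetical illustration of the performance strand only: checking predictive validity by comparing a toy simulation model's prevalence projections against external data that were not used to build it. The model, rates, and observed values are invented for illustration and are not from the paper.

```python
# Minimal sketch (not from the paper): predictive-validity check of a toy
# disease simulation model against hypothetical external prevalence data.
import numpy as np

def project_prevalence(target_year, incidence=0.012, remission=0.030,
                       base_year=2000, base_prevalence=0.05):
    """Toy annual-update prevalence projection from a hypothetical model."""
    prev = base_prevalence
    for _ in range(base_year, target_year):
        prev = prev + incidence * (1.0 - prev) - remission * prev
    return prev

# Hypothetical external (observed) prevalence estimates reserved for validation.
observed = {2005: 0.093, 2008: 0.114, 2010: 0.126}

predicted = {yr: project_prevalence(yr) for yr in observed}
errors = np.array([predicted[yr] - observed[yr] for yr in observed])

for yr in sorted(observed):
    print(f"{yr}: predicted {predicted[yr]:.3f} vs observed {observed[yr]:.3f}")
print(f"mean absolute error: {np.abs(errors).mean():.4f}")
# Whether this error is acceptable depends on the decision the model supports,
# which is why the framework also asks about decision quality, not just fit.
```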


Journal ArticleDOI
09 Jun 2010
TL;DR: HyperMoVal, an interactive approach designed to support multiple tasks related to model validation, significantly accelerates the identification of regression models and increases confidence in the overall engineering process.
Abstract: During the development of car engines, regression models that are based on machine learning techniques are increasingly important for tasks which require a prediction of results in real-time. While the validation of a model is a key part of its identification process, existing computation- or visualization-based techniques do not adequately support all aspects of model validation. The main contribution of this paper is an interactive approach called HyperMoVal that is designed to support multiple tasks related to model validation: 1) comparing known and predicted results, 2) analyzing regions with a bad fit, 3) assessing the physical plausibility of models also outside regions covered by validation data, and 4) comparing multiple models. The key idea is to visually relate one or more n-dimensional scalar functions to known validation data within a combined visualization. HyperMoVal lays out multiple 2D and 3D sub-projections of the n-dimensional function space around a focal point. We describe how linking HyperMoVal to other views further extends the possibilities for model validation. Based on this integration, we discuss steps towards supporting the entire workflow of identifying regression models. An evaluation illustrates a typical workflow in the application context of car-engine design and reports general feedback of domain experts and users of our approach. These results indicate that our approach significantly accelerates the identification of regression models and increases the confidence in the overall engineering process.

104 citations
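
HyperMoVal itself is an interactive visualization tool; the sketch below only mirrors, non-visually, two of the validation tasks it supports: comparing known and predicted results, and locating input regions with a bad fit. The measurement function, training data, and cubic-polynomial surrogate are hypothetical stand-ins, not anything from the paper.

```python
# Hypothetical sketch of two regression-model validation tasks:
# (1) compare known and predicted results, (2) locate regions with a bad fit.
import numpy as np

rng = np.random.default_rng(0)

def measure(x):
    """Hypothetical test-bed measurement: true response plus sensor noise."""
    return np.sin(2 * np.pi * x) + rng.normal(0.0, 0.05, np.shape(x))

# Identify a (deliberately imperfect) regression model from training data.
x_train = rng.uniform(0.0, 1.0, 300)
coeffs = np.polyfit(x_train, measure(x_train), deg=3)

# Task 1: compare known and predicted results on separate validation data.
x_val = np.linspace(0.0, 1.0, 200)
y_val = measure(x_val)
residuals = y_val - np.polyval(coeffs, x_val)
global_rmse = np.sqrt(np.mean(residuals ** 2))
print(f"validation RMSE: {global_rmse:.3f}")

# Task 2: locate input regions with a bad fit by binning residuals over x.
bins = np.linspace(0.0, 1.0, 11)
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (x_val >= lo) & (x_val < hi)
    rmse = np.sqrt(np.mean(residuals[mask] ** 2))
    flag = "  <-- locally poor fit" if rmse > 1.5 * global_rmse else ""
    print(f"x in [{lo:.1f}, {hi:.1f}): RMSE = {rmse:.3f}{flag}")
```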


Journal ArticleDOI
TL;DR: This paper will cover principles and practices for verification and validation including lessons learned from related fields.
Abstract: Dramatic progress in the scope and power of plasma simulations over the past decade has extended our understanding of these complex phenomena. However, as codes embody imperfect models for physical reality, a necessary step toward developing a predictive capability is demonstrating agreement, without bias, between simulations and experimental results. While comparisons between computer calculations and experimental data are common, there is a compelling need to make these comparisons more systematic and more quantitative. Tests of models are divided into two phases, usually called verification and validation. Verification is an essentially mathematical demonstration that a chosen physical model, rendered as a set of equations, has been accurately solved by a computer code. Validation is a physical process which attempts to ascertain the extent to which the model used by a code correctly represents reality within some domain of applicability, to some specified level of accuracy. This paper will cover principles and practices for verification and validation, including lessons learned from related fields.

96 citations
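
As a concrete, generic illustration of the verification phase defined above (a mathematical demonstration that the chosen equations are solved accurately), the sketch below checks a simple finite-difference solver against a manufactured exact solution and confirms the observed order of accuracy. This is standard code-verification practice in the V&V literature, not a procedure taken from this particular paper.

```python
# Generic code-verification sketch (not specific to plasma codes): solve
# u''(x) = f(x) with a second-order finite-difference scheme, compare with a
# manufactured exact solution, and confirm the observed order of accuracy.
import numpy as np

def solve_poisson_1d(n):
    """Solve u'' = f on (0,1), u(0)=u(1)=0, with u_exact = sin(pi x)."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    f = -np.pi ** 2 * np.sin(np.pi * x)          # manufactured source term
    # Assemble the standard tridiagonal second-difference operator.
    A = (np.diag(-2.0 * np.ones(n)) +
         np.diag(np.ones(n - 1), 1) +
         np.diag(np.ones(n - 1), -1)) / h ** 2
    u = np.linalg.solve(A, f)
    err = np.max(np.abs(u - np.sin(np.pi * x)))  # discretization error
    return h, err

(h1, e1), (h2, e2) = solve_poisson_1d(64), solve_poisson_1d(128)
order = np.log(e1 / e2) / np.log(h1 / h2)
print(f"observed order of accuracy ~ {order:.2f} (expected 2 for this scheme)")
```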


Proceedings ArticleDOI
06 Apr 2010
TL;DR: Deduction rules are proposed, and the verification of a model transformation that processes business process models is presented, showing how a reasoning system can use an assertion set to automatically derive additional assertions describing further properties of model transformations.
Abstract: Verification of models and model processing programs are inevitable in real-world model-based software development. Model transformation developers are interested in offline verification methods, when only the definition of the model transformation and the metamodels of the source and target languages are used to analyze the properties and no concrete input models are taken into account. Therefore, the results of the analysis hold for each output model not just particular ones, and we have to perform the analysis only once. Most often, formal verification of model transformations is performed manually or the methods can be applied only for a certain transformation or for the analysis of only a certain property. Previous work has presented a formalism to describe the characteristics of model transformations in separate formal expressions called assertions. This description is based on the first-order logic, therefore, if deduction rules are provided, a reasoning system can use an assertion set to automatically derive additional assertions describing additional properties of model transformations. In this paper, we propose deduction rules and present the verification of a model transformation of processing business process models.

96 citations
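
The paper's assertions are first-order formulas over metamodels; the toy sketch below only illustrates the underlying idea of deriving new assertions from an assertion set plus deduction rules by forward chaining, using propositional placeholders. All assertion and property names are hypothetical and not taken from the paper.

```python
# Toy stand-in (propositional, not the paper's first-order formalism) for
# deriving new assertions about a model transformation via forward chaining.
known = {
    "preserves_node_identity",        # assumed assertions about the transformation
    "preserves_edge_endpoints",
}

# Deduction rules: if all premises hold, the conclusion can be asserted.
rules = [
    ({"preserves_node_identity", "preserves_edge_endpoints"}, "preserves_graph_structure"),
    ({"preserves_graph_structure"}, "preserves_reachability"),
]

changed = True
while changed:                         # forward chaining to a fixed point
    changed = False
    for premises, conclusion in rules:
        if premises <= known and conclusion not in known:
            known.add(conclusion)
            changed = True

print(sorted(known))   # derived: preserves_graph_structure, preserves_reachability
```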



Journal ArticleDOI
01 May 2010
TL;DR: The game-theoretic approach enhances existing methods for the validation of discrete-event simulation models and techniques for simulation-based optimization by incorporating the inherent game setting of AC into the analysis.
Abstract: This paper presents a new game-theoretic approach toward the validation of discrete-event air combat (AC) simulation models and simulation-based optimization. In this approach, statistical techniques are applied for estimating games based on data produced by a simulation model. The estimation procedure is presented in cases involving games with both discrete and continuous decision variables. The validity of the simulation model is assessed by comparing the properties of the estimated games to actual practices in AC. These games are also applied for simulation-based optimization in a two-sided setting in which the action of the opponent is taken into account. In optimization, the estimated games enable the study of effectiveness of AC tactics as well as aircraft, weapons, and avionics configurations. The game-theoretic approach enhances existing methods for the validation of discrete-event simulation models and techniques for simulation-based optimization by incorporating the inherent game setting of AC into the analysis. It also provides a novel game-theoretic perspective to simulation metamodeling which is used to facilitate simulation analysis. The utilization of the game-theoretic approach is illustrated by analyzing simulation data obtained with an existing AC simulation model.

48 citations
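
One ingredient of the approach is estimating a game from data produced by the simulation model and then checking whether its properties match accepted air-combat practice. The sketch below shows that ingredient in a heavily simplified, hypothetical form: a stochastic stand-in simulator, a payoff matrix estimated by averaging replications, and a best-response face-validity check. It is not the paper's estimation procedure, and all tactics and numbers are invented.

```python
# Hypothetical sketch: estimate a discrete game from simulation replications
# and check whether best responses look plausible (a face-validity test).
import numpy as np

rng = np.random.default_rng(1)
blue_tactics = ["aggressive", "defensive"]
red_tactics = ["aggressive", "defensive"]

def simulate_engagement(b, r):
    """Stand-in for one stochastic simulation run; returns blue's payoff."""
    base = {("aggressive", "aggressive"): 0.0,
            ("aggressive", "defensive"): 0.3,
            ("defensive", "aggressive"): -0.2,
            ("defensive", "defensive"): 0.1}[(b, r)]
    return base + rng.normal(0.0, 0.1)

# Estimate blue's payoff matrix from replications (zero-sum assumed for brevity).
reps = 200
payoff = np.array([[np.mean([simulate_engagement(b, r) for _ in range(reps)])
                    for r in red_tactics] for b in blue_tactics])
print(np.round(payoff, 2))

# Face-validity check: blue's best response to each red tactic should match
# accepted practice; in this toy setup blue prefers "aggressive" against both.
for j, r in enumerate(red_tactics):
    print(f"best blue response to red {r}: {blue_tactics[int(np.argmax(payoff[:, j]))]}")
```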


Proceedings ArticleDOI
25 Jul 2010
TL;DR: Comparisons with measured data indicate the quality of the overall model; analysis of the differences demonstrates which subsystem component models need to be revalidated.
Abstract: Since models form the basis for most power system studies, power system model validation is an essential procedure for maintaining system security and reliability. The procedure may be viewed as a “top-down” approach to model verification; comparisons with measured data indicate the quality of the overall model. Analysis of the differences demonstrates which subsystem component models need to be revalidated. Numerous examples are presented to illustrate the use and importance of system model validation.

43 citations


Proceedings ArticleDOI
22 Dec 2010
TL;DR: An overview of procedures for Code Verification, Solution Verification and Validation is given; Code Verification aims to confirm, through error evaluation, that a given code correctly solves the equations of the model it contains.
Abstract: The maturing of CFD codes for practical calculations of complex turbulent flows implies the need to establish the credibility of the results by Verification & Validation. These two activities have different goals: Verification is a purely mathematical exercise that intends to show that we are “solving the equations right”, whereas Validation is a science/engineering activity that intends to show that we are “solving the right equations”. Verification is in fact composed of two different activities: Code Verification and Solution Verification. Code Verification intends to verify, by error evaluation, that a given code correctly solves the equations of the model it contains. On the other hand, Solution Verification intends to estimate the error of a given calculation, for which in general the exact solution is not known. Validation intends to estimate modelling errors by comparison with experimental data. The paper gives an overview of procedures for Code Verification, Solution Verification and Validation. Examples of the three types of exercises are presented for simple test cases, demonstrating the advantages of performing Verification and Validation exercises.
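
As an illustration of the Solution Verification activity described above, the sketch below performs the standard three-grid estimate of the observed order of convergence, a Richardson-extrapolated value, and a GCI-style uncertainty band for the fine-grid solution. The grid sizes and solution values are hypothetical, and the formulas are the generic textbook variant rather than this paper's exact procedure.

```python
# Generic solution-verification sketch: estimate the discretization error of a
# fine-grid solution from three systematically refined grids.
# The cell sizes and solution values below are hypothetical.
import math

h = [0.004, 0.008, 0.016]      # representative cell sizes: fine, medium, coarse
phi = [0.3210, 0.3250, 0.3365] # corresponding solution values (e.g. a drag coefficient)

r = h[1] / h[0]                # refinement ratio (assumed constant here)
# Observed order of convergence from the three solutions.
p = math.log(abs(phi[2] - phi[1]) / abs(phi[1] - phi[0])) / math.log(r)

# Richardson-extrapolated estimate of the grid-independent value.
phi_ext = phi[0] + (phi[0] - phi[1]) / (r ** p - 1.0)

# A GCI-style uncertainty for the fine-grid solution (safety factor 1.25).
gci_fine = 1.25 * abs(phi[0] - phi[1]) / (r ** p - 1.0)

print(f"observed order p     = {p:.2f}")
print(f"extrapolated value   = {phi_ext:.4f}")
print(f"fine-grid error band = +/- {gci_fine:.4f}")
```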

Journal ArticleDOI
TL;DR: DBSolve Optimum is a user-friendly simulation software package that simplifies the construction, verification, analysis and visualization of kinetic models and presents simulation results in a more comprehensible mode.
Abstract: Systems biology research and applications require creation, validation, extensive usage of mathematical models and visualization of simulation results by end-users. Our goal is to develop a novel method for visualization of simulation results and implement it in a simulation software package equipped with sophisticated mathematical and computational techniques for model development, verification and parameter fitting. We present the mathematical simulation workbench DBSolve Optimum, which is a significantly improved and extended successor of the well-known simulation software DBSolve5. A concept of "dynamic visualization" of simulation results has been developed and implemented in DBSolve Optimum. In the framework of this concept, graphical objects representing metabolite concentrations and reactions change their volume and shape in accordance with simulation results. This technique is applied to visualize both the kinetic response of the model and the dependence of its steady state on a parameter. The use of dynamic visualization is illustrated with a kinetic model of the Krebs cycle. DBSolve Optimum is a user-friendly simulation software package that simplifies the construction, verification, analysis and visualization of kinetic models. The dynamic visualization tool implemented in the software allows the user to animate simulation results and thereby present them in a more comprehensible mode. DBSolve Optimum and its built-in dynamic visualization module are free for both academic and commercial use. It can be downloaded directly from http://www.insysbio.ru.
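
For readers unfamiliar with kinetic models of the kind DBSolve Optimum constructs and animates, the sketch below integrates a trivial two-reaction mass-action model with SciPy. It is purely illustrative: it is neither DBSolve Optimum nor the paper's Krebs-cycle model, and the species and rate constants are invented.

```python
# Illustrative two-reaction mass-action kinetic model (hypothetical rates).
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 1.5, 0.4          # hypothetical rate constants: A -> B -> C

def rhs(t, y):
    a, b, c = y
    v1 = k1 * a            # rate of A -> B
    v2 = k2 * b            # rate of B -> C
    return [-v1, v1 - v2, v2]

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0], t_eval=np.linspace(0, 10, 6))
for t, (a, b, c) in zip(sol.t, sol.y.T):
    print(f"t={t:4.1f}  A={a:.3f}  B={b:.3f}  C={c:.3f}")
# A tool like DBSolve Optimum would additionally animate such trajectories on
# a diagram of the reaction network ("dynamic visualization").
```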

DOI
19 May 2010
TL;DR: A high-level view of the tools involved and of the successive transformations performed by the behavioral verification process for AADL models is given, leading to a set of conclusions about the integration of model-checking tools in an industrial development process.
Abstract: This paper details work undertaken in the scope of the Spices project concerning the behavioral verification of AADL models. We give a high-level view of the tools involved and describe the successive transformations performed by our verification process. We also report on an experiment carried out in order to evaluate our framework and give the first experimental results obtained on real-size models. This demonstrator models a network protocol in charge of data communications between an airplane and ground stations. From this study we draw a set of conclusions about the integration of model-checking tools in an industrial development process.

Journal ArticleDOI
TL;DR: This research addresses these problems and proposes a domain knowledge-based method for validating the relationships among behavior segments and the aggregate behavior that is unique to complex simulation models.

Journal ArticleDOI
TL;DR: The review of the literature showed that descriptive models are only applied to process design problems, and that analytical and computer simulation models are applied to all types of problems to approximately the same extent.
Abstract: Purpose – The purpose of this article is to find decision‐making models for the design and control of processes regarding patient flows, considering various problem types, and to find out how usable these models are for managerial decision making.Design/methodology/approach – A systematic review of the literature was carried out. Relevant literature from three databases was selected based on inclusion and exclusion criteria and the results were analyzed.Findings – A total of 68 articles were selected. Of these, 31 contained computer simulation models, ten contained descriptive models, and 27 contained analytical models. The review showed that descriptive models are only applied to process design problems, and that analytical and computer simulation models are applied to all types of problems to approximately the same extent. Only a few models have been validated in practice, and it seems that most models are not used for their intended purpose: to support management in decision making.Research limitations...


Journal ArticleDOI
TL;DR: The use of state-of-the-art V&V technology to support knowledge engineering for a timeline-based planning system called MrSPOCK is analyzed, and new functionalities have been added to perform both model validation and plan verification.
Abstract: To foster effective use of artificial intelligence planning and scheduling (P&S) technology [...]; moreover, they employ resolution processes designed to optimize the solution with respect to non-trivial evaluation functions. Knowledge engineering environments aim at simplifying direct access to the technology for people other than the original system designers, while the integration of validation and verification (V&V) capabilities in such environments may potentially enhance the users' trust in the technology. In this sense, V&V techniques may represent a complementary technology, with respect to P&S, that contributes to developing richer software environments to synthesize a new generation of robust problem-solving applications. The integration of V&V and P&S techniques in a knowledge engineering environment is the topic of this paper. In particular, it analyzes the use of state-of-the-art V&V technology to support knowledge engineering for a timeline-based planning system called MrSPOCK. The paper presents the application domain for which the automated solver has been developed, introduces the timeline-based planning ideas, and then describes the different possibilities for applying V&V to planning. It continues by describing the step of adding V&V functionalities around the specialized planner, MrSPOCK. New functionalities have been added to perform both model validation and plan verification. Lastly, a specific section describes the benefits as well as the performance of these functionalities.

Proceedings ArticleDOI
21 Apr 2010
TL;DR: It is argued that reluctance by system dynamics modelers to expose their models to formal validity procedures is the main problem, which leads to an exploration of formal validation procedures that are available but less explored in the system dynamics modeling 'repertoire'.
Abstract: Simulation models in general and system dynamics type simulation models in particular have become increasingly popular in the analysis of important policy issues in business organizations. The usefulness of these models is predicated on their ability to link patterns of behavior of a system to the underlying structures of the system. Despite their capabilities, the acceptance of system dynamics simulation models by the broader community of modelers and decision makers is limited. We argue that reluctance by the system dynamics modelers to expose their models to formal validity procedures is the main problem. This leads to an exploration of formal validity procedures available but less explored in system dynamics modeling 'repertoire'. An illustration of the application of tests for both the structural and behavior validity of a system dynamics simulation model follows. Finally, some conclusions on the increased appeal for simulation models for the wider community of model builders and users are drawn.
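
One family of formal behavior-validity tests used in the system dynamics literature compares simulated behavior with reference data and decomposes the error into interpretable components (Theil inequality statistics). The sketch below shows that decomposition on two hypothetical time series; it is an example of the kind of formal procedure the paper advocates, not its specific test suite.

```python
# Sketch of a formal behavior-validity test: decompose the mean squared error
# between simulated and observed behavior into bias, unequal-variation and
# unequal-covariation shares (Theil inequality statistics). Data are made up.
import numpy as np

observed  = np.array([10.0, 12.0, 15.0, 19.0, 24.0, 30.0, 37.0, 45.0])
simulated = np.array([10.5, 12.8, 15.9, 19.5, 24.8, 31.2, 38.5, 46.9])

err = simulated - observed
mse = np.mean(err ** 2)

s_sim, s_obs = simulated.std(), observed.std()
r = np.corrcoef(simulated, observed)[0, 1]

u_bias = (simulated.mean() - observed.mean()) ** 2 / mse   # systematic offset
u_var  = (s_sim - s_obs) ** 2 / mse                        # unequal variation
u_cov  = 2 * (1 - r) * s_sim * s_obs / mse                 # unequal covariation

print(f"MSE = {mse:.3f}")
print(f"bias share = {u_bias:.2f}, variance share = {u_var:.2f}, covariance share = {u_cov:.2f}")
# Error concentrated in the bias or variance shares points to systematic model
# error; error in the covariance share is usually considered less worrying.
```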

Proceedings ArticleDOI
25 Jul 2010
TL;DR: This paper proposes an alternative method for validating transients in the time domain, allowing rapid and objective quantification of simulation results.
Abstract: In computational electromagnetic simulations, most validation methods developed until now are intended for use in the frequency domain. However, frequency-domain EMC analysis of systems is often not sufficient to evaluate the immunity of current communication devices. Based on several studies, in this paper we propose an alternative method for validating transients in the time domain, allowing rapid and objective quantification of simulation results.
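
The abstract does not spell out the proposed metric, so the sketch below is only a generic illustration of quantifying agreement between measured and simulated transients in the time domain, using a relative peak error and a normalized cross-correlation on synthetic waveforms.

```python
# Generic time-domain comparison of a measured and a simulated transient.
# Both waveforms below are synthetic; this is not the paper's metric.
import numpy as np

t = np.linspace(0.0, 1e-6, 1000)                      # 1 microsecond window
measured  = np.exp(-t / 2e-7) * np.sin(2 * np.pi * 2e7 * t)
simulated = 0.95 * np.exp(-t / 1.8e-7) * np.sin(2 * np.pi * 2e7 * t + 0.05)

peak_error = abs(simulated.max() - measured.max()) / abs(measured.max())

num = np.sum(measured * simulated)
den = np.sqrt(np.sum(measured ** 2) * np.sum(simulated ** 2))
correlation = num / den                                # 1.0 means identical shape

print(f"relative peak error    = {peak_error:.3f}")
print(f"normalized correlation = {correlation:.3f}")
```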

Proceedings ArticleDOI
25 Jul 2010
TL;DR: This paper discusses various approaches to validation of power system models used in planning studies, focusing on present approaches and emerging methodologies, particularly for validation of power system components and equipment.
Abstract: This paper presents a discussion of the various approaches to model validation of power system models used in planning studies. It focuses on present approaches and emerging methodologies, particularly for validation of power system components/equipment. Results are presented for the current practice of component model validation (i.e. validation of single power plant models, or FACTS devices, etc.). Then it is shown how some of these techniques might be used to extend current approaches to consider system wide model validation. Finally a discussion is presented on future trends in planning and operational studies to meet the needs of the North American power system.

Proceedings ArticleDOI
Ping Xu, Zili Wang, Yue Li
17 Feb 2010
TL;DR: This paper defines the emphasis of PHM system analysis and assessment, ascertains the implementation elements and the simulated-environment framework of PHM system simulation-based validation, and elaborates two central elements of PHM system experimental validation: the experiment scheme and the experiment technique.
Abstract: Based on four key abilities of the Prognostics and Health Management (PHM) system, namely fault detection, fault isolation, fault prognosis and prognosis of remaining life, this paper analyzes and constructs a framework of PHM quantitative requirements and defines the emphasis of quantitative assessment of PHM abilities. Targeting these quantitative requirements and proceeding from the viewpoint of life-cycle assessment, the paper presents three methods of PHM system validation: validation based on analysis and assessment, validation based on simulation (including full simulation and semi-physical, hardware-in-the-loop simulation), and validation based on experimentation. The paper defines the emphasis of PHM system analysis and assessment, ascertains the implementation elements and the simulated-environment framework of PHM system simulation-based validation, and elaborates two central elements of PHM system experimental validation: the experiment scheme and the experiment technique. It thereby provides thoughts and methods for the quantitative validation and assessment of the rapidly developing and increasingly applied PHM system.

Patent
19 Feb 2010
TL;DR: In this paper, a system and method for testing robustness of a simulation model of a cyber-physical system includes computing a set of symbolic simulation traces for the simulation model for a continuous time system stored in memory.
Abstract: A system and method for testing robustness of a simulation model of a cyber-physical system includes computing a set of symbolic simulation traces for a simulation model for a continuous time system stored in memory, based on a discrete time simulation of given test inputs stored in memory. Simulation errors are accounted for due to at least one of numerical instabilities and numeric computations. The set of symbolic simulation traces are validated with respect to validation properties in the simulation model. Portions of the simulation model description are identified that are sources of the simulation errors.

Proceedings ArticleDOI
21 Jun 2010
TL;DR: A statistical end-to-end cost model is presented to capture the end- to-end latency and energy consumption of anycasting operation under a realistic wireless channel model and a series of preamble length control guidelines are proposed for low and extremely low duty-cycled WSNs.
Abstract: Anycasting has been proposed recently as an efficient communication method for asynchronous duty-cycled wireless sensor networks. However, the interdependencies between end-to-end communication cost and the anycasting design parameters have not been systematically studied. In this paper, a statistical end-to-end cost model is presented to capture the end-to-end latency and energy consumption of anycasting operation under a realistic wireless channel model. By exploring the relationship between the end-to-end cost efficiency and the forwarding decision dependent anycasting design parameters, two anycasting forwarding metrics are proposed for fully distributed forwarding decision. By exploring the relationship among the preamble length, the size of the forwarding set and the achievable end-to-end cost efficiency, a series of preamble length control guidelines are proposed for low and extremely low duty-cycled WSNs. According to our analytical results and simulation validation, the proposed forwarding metrics help reduce the end-to-end latency and energy consumption by about 55% for anycasting with moderate preamble length, compared with the existing heuristic forwarding metrics. The proposed preamble length control guidelines help reduce, by more than half, the end-to-end energy and latency costs in low and extremely-low duty-cycled WSNs.

Proceedings ArticleDOI
20 Sep 2010
TL;DR: A new framework that makes use of artificial intelligence and machine learning to generate and evolve models from partial descriptions and examples created by the model checking process; it is capable not only of integrated verification and evolution of abstract models, but also of reengineering partial models of a system.
Abstract: In software development, formal verification plays an important role in improving the quality and safety of products and processes. Model checking is a successful approach to verification, used both in academic research and industrial applications. One important improvement regarding the utilization of model checking is the development of automated processes to evolve models according to information obtained from verification. In this paper, we propose a new framework that makes use of artificial intelligence and machine learning to generate and evolve models from partial descriptions and examples created by the model checking process. This was implemented as a tool that is integrated with a model checker. Our work extends model checking to be applicable when an initial description of a system is not available, through observation of the actual behaviour of that system. The framework is capable not only of integrated verification and evolution of abstract models, but also of reengineering partial models of a system.

Journal ArticleDOI
TL;DR: In this article, a battery model for hybrid electric vehicle applications is presented, which is able to predict battery performance and thermal behavior in response to inputs such as drive and test profiles, including 40-season equivalent key life tests (KLTs).
Abstract: This paper describes a battery model for hybrid electric vehicle applications. The model is able to predict battery performance and thermal behavior in response to inputs such as drive and test profiles, including 40-season equivalent key life tests (KLTs). In order to balance dynamic driving simulation responsiveness and accuracy with computing efficiency over lifetime-duration simulations, the present model is built on a semi-empirical basis with sufficiently detailed descriptions of the multiple chemical-physical behaviors in effect during cycling and storage. A set of computing processes run concurrently to address real-time power capability, response, and ageing, which are represented by cumulative key stress factors. This paper focuses on model validation against actual KLT data, and it also describes the importance of KLTs for hybrid and electric battery implementation. The present model is also designed to offer functionality for incorporation into vehicle-level simulations in a Matlab/Simulink environment.

Journal ArticleDOI
TL;DR: A novel methodology is proposed that generates a model of the verification tests in a given test set using unsupervised support vector analysis and can be combined with a test generation method like constrained-random test generation to increase its effectiveness without making fundamental changes to the verification flow.
Abstract: Success of simulation-based functional verification depends on the quality and diversity of the verification tests that are simulated. The objective of test generation methods is to generate tests that exercise as much different functionality of the hardware designs as possible. In this paper, we propose a novel methodology that generates a model of the verification tests in a given test set using unsupervised support vector analysis. One potential application is to use this model to select tests that are likely to exercise functionality that has not been tested so far. Since this selection can be done before simulation, it can be used to filter redundant tests and reduce required simulation cycles. Our methodology can be combined with a test generation method like constrained-random test generation to increase its effectiveness without making fundamental changes to the verification flow. Experimental results based on application of the proposed methodology to the OpenSparc T1 processor are reported to demonstrate the practicality of our approach.
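
The sketch below illustrates the selection idea with a one-class SVM used as a stand-in for the paper's unsupervised support vector analysis: fit a model of the tests already simulated, then rank new candidate tests by how novel they look, so redundant tests can be filtered before simulation. The feature vectors are random placeholders for whatever features a real flow would extract from tests, and the paper's exact modeling technique may differ.

```python
# Hedged sketch of novelty-based test selection for simulation-based
# verification, using a one-class SVM as a stand-in model of "seen" tests.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(42)

# Features of tests already simulated (e.g. opcode mix, address-range stats).
simulated_tests = rng.normal(loc=0.0, scale=1.0, size=(300, 8))

# Newly generated candidate tests, some drawn from a shifted distribution.
candidates = np.vstack([
    rng.normal(0.0, 1.0, size=(20, 8)),     # similar to what was already run
    rng.normal(3.0, 1.0, size=(5, 8)),      # likely exercise new behaviour
])

model = OneClassSVM(kernel="rbf", nu=0.1, gamma="scale").fit(simulated_tests)
scores = model.decision_function(candidates)   # lower score = more novel

# Simulate the most novel candidates first; drop the most redundant ones.
order = np.argsort(scores)
print("most novel candidate indices:", order[:5])
```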

Proceedings ArticleDOI
17 Oct 2010
TL;DR: It is found that although there are some means available for verification, comprehensive methods still do not exist and that maintaining a bidirectional trace link between requirements, models and the generated deliverables is a promising approach to significantly facilitate the validation process.
Abstract: The utilisation of Domain-Specific Modelling (DSM) in software development has a significant positive impact on productivity. The productivity increase is caused by the utilisation of modelling languages and generators that are especially suitable for a specific problem domain instead of those designed for solution domains. The prerequisite for this significant productivity increase is that the languages and the automation function correctly. To ensure the suitability of the languages and tools, we need to be able to use the verification and validation (V&V) techniques in the context of DSM. In this position paper we study what V&V actually stands for in this particular context and what the current means are for performing V&V. We found that although there are some means available for verification, comprehensive methods still do not exist. For validation, we believe that maintaining a bidirectional trace link between requirements, models and the generated deliverables is a promising approach to significantly facilitate the validation process.

Proceedings ArticleDOI
15 Mar 2010
TL;DR: FAMVal, a validation architecture that supports the seamless integration of different validation techniques, is introduced using the plug-in based design of the modeling and simulation framework JAMES II.
Abstract: With the rising number and diversity of validation methods, the need emerges for a tool that supports easy exploitation of those methods. We introduce FAMVal, a validation architecture that supports the seamless integration of different validation techniques. We structure a validation experiment into the following tasks: specification of requirements, configuration of the model, model execution, observation, analysis, and evaluation. This structuring improves the flexibility of the approach by facilitating the combination of methods for different tasks. In addition to the overall architecture, basic components and their interactions are presented. The usage of FAMVal is illuminated by several validation experiments with a small chemical model. The architecture has been realized using the plug-in based design of the modeling and simulation framework JAMES II.

Journal ArticleDOI
TL;DR: The basic concepts and procedures of verification of the numerical accuracy of computed information with particular reference to a model problem in solid mechanics are outlined and illustrated by examples.
Abstract: This paper is concerned with the problem of verification of the numerical accuracy of computed information with particular reference to a model problem in solid mechanics. The basic concepts and procedures are outlined and illustrated by examples.

Proceedings ArticleDOI
22 Mar 2010
TL;DR: In this paper, the authors describe a new approach called continuous verification to detect design errors as quickly as possible by exploiting information from the software configuration management system and by combining dynamic and static verification to reduce the state space to be explored.
Abstract: The complexity of software in embedded systems has increased significantly over the last years so that software verification now plays an important role in ensuring the overall product quality. In this context, bounded model checking has been successfully applied to discover subtle errors, but for larger applications, it often suffers from the state space explosion problem. This paper describes a new approach called continuous verification to detect design errors as quickly as possible by exploiting information from the software configuration management system and by combining dynamic and static verification to reduce the state space to be explored. We also give a set of encodings that provide accurate support for program verification and use different background theories in order to improve scalability and precision in a completely automatic way. A case study from the telecommunications domain shows that the proposed approach improves the error-detection capability and reduces the overall verification time by up to 50%.