Showing papers on "Verification and validation of computer simulation models" published in 2002


Journal ArticleDOI
TL;DR: An extensive review of the literature in V&V in computational fluid dynamics (CFD) is presented, methods and procedures for assessing V&V are discussed, and a relatively new procedure for estimating experimental uncertainty is given that has proven more effective at estimating random and correlated bias errors in wind-tunnel experiments than traditional methods.

948 citations


ReportDOI
01 Mar 2002
TL;DR: This paper presents an extensive review of the literature in V and V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V and V, and develops a number of extensions to existing ideas.
Abstract: Verification and validation (V and V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V and V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V and V, and develops a number of extensions to existing ideas. The review of the development of V and V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V and V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized.

321 citations
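
The grid-refinement reasoning surveyed in this report is usually quantified with an observed order of accuracy and a Richardson extrapolation. As a worked illustration (standard textbook formulas, not an excerpt from the paper), for a quantity f computed on three systematically refined grids:

```latex
% Observed order of accuracy from solutions f_1 (fine), f_2 (medium), f_3 (coarse)
% obtained with a constant grid refinement ratio r (here r = 2):
\[
  p \;=\; \frac{\ln\!\left( \dfrac{f_3 - f_2}{f_2 - f_1} \right)}{\ln r}
\]
% Richardson extrapolation then estimates the grid-converged value, giving an
% error band for the fine-grid solution used in verification assessments:
\[
  f_{\text{exact}} \;\approx\; f_1 + \frac{f_1 - f_2}{r^{\,p} - 1}
\]
```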


Proceedings ArticleDOI
08 Dec 2002
TL;DR: Practical techniques and guidelines for verifying and validating simulation models are outlined and examples of a number of typical situations where model developers may make inappropriate or inaccurate assumptions are provided.
Abstract: In this paper we outline practical techniques and guidelines for verifying and validating simulation models. The goal of verification and validation is a model that is accurate when used to predict the performance of the real-world system that it represents, or to predict the difference in performance between two scenarios or two model configurations. The process of verifying and validating a model should also lead to improving a model's credibility with decision makers. We provide examples of a number of typical situations where model developers may make inappropriate or inaccurate assumptions, and offer guidelines and techniques for carrying out verification and validation.

269 citations
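
One common quantitative technique behind such guidelines is a paired confidence interval comparing real-system observations with matching model outputs. A minimal sketch with hypothetical throughput figures invented for illustration; this shows one standard approach, not code from the paper:

```python
# Sketch: paired-t confidence interval for comparing a simulation model
# against real-system data (hypothetical throughput figures, per scenario).
import math
import statistics

real_system = [42.1, 39.8, 44.0, 41.5, 40.2]   # observed performance
model_output = [41.3, 40.5, 43.2, 42.0, 39.6]  # matching simulation runs

diffs = [r - m for r, m in zip(real_system, model_output)]
n = len(diffs)
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)
t_crit = 2.776  # t(0.975, df=4); use scipy.stats.t.ppf for other sample sizes

half_width = t_crit * sd_d / math.sqrt(n)
lo, hi = mean_d - half_width, mean_d + half_width
print(f"95% CI for (real - model): [{lo:.2f}, {hi:.2f}]")
# If the interval contains 0, the data give no evidence of model inaccuracy
# at this confidence level (which is not the same as "proving" validity).
```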


Journal ArticleDOI
TL;DR: In this paper, an analytical verification and comparative diagnostic procedure was developed to test the ability of whole-building simulation programs to model the performance of unitary space-cooling equipment that is typically modeled using manufacturer design data presented as empirically derived performance maps.

95 citations


Journal ArticleDOI
TL;DR: This paper proposes a remedy to this state of affairs in the case of finite-state concurrent systems by describing an approach to developing customizable yet efficient verification tools.

41 citations


Journal ArticleDOI
TL;DR: The purpose of this paper is to examine how far simulation studies of maintenance systems are neglecting simulation-related statistical issues, which may leave simulation results suspicious and hard to explain.
Abstract: The purpose of this paper is to examine how far simulation studies of maintenance systems are neglecting simulation-related statistical issues. This negligence may leave simulation results suspicious and hard to explain. Several simulation factors are used to identify the strength of executed simulation experiments and to evaluate the level of clarity and reliability of the simulation models. The factors include purpose of simulation, simulation model, model assumptions, distribution and random variables, simulation languages and computers, program verification and model validation, design of experiment, and analysis of the output. For this purpose, the literature is reviewed and subjected to evaluation. It is observed that most papers define clearly their objectives, simulation languages, and model performance measures. However, verification, validation, experimental design, and output analysis are the most unclear factors.

37 citations


Journal ArticleDOI
01 Nov 2002
TL;DR: This paper suggests a modification of partial order reduction, allowing its combination with any BDD-based verification tool, and describes a co-verification methodology developed using these techniques jointly, suggesting that for moderate-size systems the method is ready for industrial application.
Abstract: Combining verification methods developed separately for software and hardware is motivated by the industry's need for a technology that would make formal verification of realistic software/hardware co-designs practical. We focus on techniques that have proved successful in each of the two domains: BDD-based symbolic model checking for hardware verification and partial order reduction for the verification of concurrent software programs. In this paper, we first suggest a modification of partial order reduction, allowing its combination with any BDD-based verification tool, and then describe a co-verification methodology developed using these techniques jointly. Our experimental results demonstrate the efficiency of this combined verification technique, and suggest that for moderate-size systems the method is ready for industrial application.

36 citations
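
The symbolic side of this combination boils down to a fixpoint reachability computation. A minimal sketch, using plain Python sets where a real tool would use BDDs, over a toy two-counter transition system invented for the example:

```python
# Sketch: symbolic-style reachability as a fixpoint
#   Reach_{i+1} = Reach_i ∪ Image(Reach_i)
# Python sets stand in for the BDDs a real tool would use; the transition
# relation (two bounded counters that may step independently) is a toy
# system, not one from the paper.

def image(states):
    """Successor states of a set of (x, y) counter valuations."""
    succ = set()
    for x, y in states:
        if x < 3:
            succ.add((x + 1, y))   # process 1 steps
        if y < 3:
            succ.add((x, y + 1))   # process 2 steps
    return succ

reach = {(0, 0)}            # initial state set
frontier = {(0, 0)}
while frontier:             # iterate until a fixpoint is reached
    frontier = image(frontier) - reach
    reach |= frontier

# A safety check: no reachable state violates the invariant x + y <= 6.
assert all(x + y <= 6 for x, y in reach)
print(f"{len(reach)} reachable states")
```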


Journal ArticleDOI
01 Apr 2002
TL;DR: An overview of the development of a vehicle dynamics simulation model for use on the National Highway Traffic Safety Administration's (NHTSA's) National Advanced Driving Simulator is provided in this paper.
Abstract: This paper provides an overview of the development of a vehicle dynamics simulation model for use on the National Highway Traffic Safety Administration's (NHTSA's) National Advanced Driving Simulator. The paper describes fundamental aspects of models used to represent rigid body chassis and suspension systems, powertrain, tyres, brakes, steering and aerodynamics. Representative data from laboratory measurements, instrumented field tests and simulation runs of a 1997 Jeep Cherokee Sport are presented to illustrate simulation development and validation efforts. A companion paper to this one also uses Jeep Cherokee data. Both papers highlight current capabilities and methodologies employed by two organizations that have worked, often collaboratively, to advance the state of the art of vehicle dynamics modelling and simulation validation. This paper features work done by NHTSA's Vehicle Research and Test Center; the companion paper reports on work done by Systems Technology, Incorporated (STI).

31 citations


Journal ArticleDOI
TL;DR: This paper presents a theorem prover-based verification technique as a supplementary validation measure for safety-related application domains and outlines core concepts of the standardized languages, their semantic embedding into higher order logic, and the verification approach.

25 citations


01 Jan 2002
TL;DR: In this article, the authors present a set of recommended guidelines for the development and application of traffic simulation models based on previously published information, interviews with practitioners, and results from successfully completed simulation projects.
Abstract: Traffic engineers and transportation planners are using traffic simulation models with greater frequency to plan and design future transportation facilities. However, the transportation profession has not established formal and consistent guidelines regarding the development and application of these models. The lack of such guidance or direction has led to conflicts between model users, inappropriate use of the models, and inaccurate results from the models. Many of these situations could be avoided if guidelines were available that address both model development and application. The purpose of this paper is to present an initial set of recommended guidelines for the development and application of traffic simulation models. The guidelines are based on previously published information, interviews with practitioners, and results from successfully completed simulation projects. Key issues addressed include the following: Calibration of model parameters for traffic control operation, traffic flow characteristics, and driver behavior; Validation guidelines for traffic flow measurement; and Multiple run requirements for simulation models. The guidelines contained in the paper require formal refinement, but have been proven successful in the field and were accepted by technical professionals as well as decision makers. Ideally, this paper will be used as a building block for more formal guidelines.
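
The "multiple run requirements" point can be made concrete with the standard sample-size calculation for stochastic simulation output: choose enough replications that the confidence-interval half-width on the measure of interest meets a target. A minimal sketch with made-up pilot-run travel times (not data from the paper):

```python
# Sketch: how many simulation replications are needed so that the 95%
# confidence-interval half-width on mean travel time is within a target?
# The pilot-run figures and the target precision are made up for illustration.
import math
import statistics

pilot_travel_times = [312.0, 298.5, 305.2, 321.7, 309.9, 300.4]  # seconds
target_half_width = 5.0    # desired precision, seconds
z = 1.96                   # normal approximation for the planning step

s = statistics.stdev(pilot_travel_times)
n_required = math.ceil((z * s / target_half_width) ** 2)
print(f"pilot std dev = {s:.2f} s, replications needed ≈ {n_required}")
# In practice the estimate is refined iteratively: run n_required replications,
# recompute s, and repeat until the achieved half-width meets the target.
```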

Book ChapterDOI
24 Jun 2002
TL;DR: This contribution considers validation concepts based on various chapters of Petri net theory and discusses how simulation is used for validation purposes and how the creation of runs can be performed in an efficient way.
Abstract: The analysis and verification of a Petri net model can only yield a valuable result if the model correctly captures the considered system and if the analyzed or verified properties reflect the actual requirements. So validation of both nets and specifications of desired properties is a first-class task in model-based system development. This contribution considers validation concepts based on various chapters of Petri net theory. A particular emphasis is on simulation-based validation. Simulation means construction of runs, which are high-level process nets in our approach. We discuss how simulation is used for validation purposes and how the creation of runs can be performed in an efficient way.
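
Constructing runs by simulation amounts to playing the token game on the net. A minimal sketch with a hypothetical producer/consumer place/transition net, invented for illustration (the chapter's runs are high-level process nets):

```python
# Sketch: constructing a run of a place/transition net by firing enabled
# transitions. The producer/consumer net below is a made-up example.
import random

# marking: tokens per place
marking = {"idle": 1, "buffer": 0, "ready": 1}

# transition -> (places consumed from, places produced to)
transitions = {
    "produce": ({"idle": 1}, {"idle": 1, "buffer": 1}),
    "consume": ({"buffer": 1, "ready": 1}, {"ready": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= k for p, k in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, k in pre.items():
        marking[p] -= k
    for p, k in post.items():
        marking[p] += k

run = []
for _ in range(10):                         # construct a bounded-length run
    choices = [t for t in transitions if enabled(t)]
    if not choices:
        break                               # deadlock: worth reporting during validation
    t = random.choice(choices)
    fire(t)
    run.append(t)

print("run:", " -> ".join(run), "| final marking:", marking)
```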

Journal ArticleDOI
TL;DR: The issues involved in the design and development of an Ultra-large-Scale Simulation Framework (USSF) are presented and the techniques employed in the framework to reduce and regulate the memory requirements of the simulations are described.

Journal ArticleDOI
TL;DR: Three activity-based white-box methods are presented for assisting a user in verifying and validating construction simulations; they can be used jointly to debug a simulation model, so as to confirm that the simulation is correctly conducted and the obtained results are valid.
Abstract: A simulation model must be verified to confirm that it correctly describes its intended real-world process under study; moreover, the simulation results obtained must be a valid representation of the process. This study presents three activity-based white-box methods for assisting a user in verifying and validating construction simulations. The first method reports a simulation by listing all activities in the chronological order of their executions, so that a user can contrast the simulated progress against the actual progress in the real world. The second method summarizes the operating counts and mean durations of all activities over the simulated time period, to enable a user to evaluate whether all activities have been executed correctly during simulation. The third method generates an activity cycle report for any selected resource entity, so that a user can examine whether the entity is moving in the correct logical and chronological order during simulation. The three methods can be used jointly to debug a simulation model, confirming that the simulation is correctly conducted and that the obtained results are valid.
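
A minimal sketch of the second report described above, computing operating counts and mean durations per activity from an event log; the log records and field layout are invented for illustration, not taken from the paper:

```python
# Sketch: summarising a simulation's activity log as operating counts and
# mean durations, in the spirit of the second reporting method above.
# The log records are hypothetical (activity, start, finish) tuples.
from collections import defaultdict

event_log = [
    ("load_truck",   0.0,  6.5),
    ("haul",         6.5, 18.0),
    ("load_truck",   8.0, 14.2),
    ("dump",        18.0, 20.5),
    ("haul",        14.2, 26.1),
]

durations = defaultdict(list)
for activity, start, finish in sorted(event_log, key=lambda e: e[1]):
    durations[activity].append(finish - start)

print(f"{'activity':<12}{'count':>7}{'mean duration':>16}")
for activity, ds in durations.items():
    print(f"{activity:<12}{len(ds):>7}{sum(ds)/len(ds):>16.2f}")
```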

Journal ArticleDOI
TL;DR: In this paper, the authors present adaptive finite-element methods as a means of achieving verification of codes and simulations (obtaining numerical solutions with controlled accuracy). Validation of these grid-independent solutions is then performed by comparing predictions to measurements.
Abstract: This paper presents adaptive finite-element methods as a means of achieving verification of codes and simulations (obtaining numerical solutions with controlled accuracy). Validation of these grid-independent solutions is then performed by comparing predictions to measurements. We adopt the standard accepted definitions of verification and validation (AIAA 1998; Roache 1998). Mesh adaptation is used to perform the systematic and rigorous grid-refinement studies required for both verification and validation in CFD. This ensures that discrepancies observed between predictions and measurements are due to deficiencies in the mathematical model of the flow. Issues in verification and validation are discussed. The paper presents examples of code verification by the method of manufactured solutions. Examples of successful and unsuccessful validation show that agreement with experiment is achieved only with a good mathematical model of the flow physics combined with accurate numerical solutions of the differential equations. The paper emphas...
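
The method of manufactured solutions mentioned here can be summarized with a textbook-style example (not one of the paper's cases): choose a smooth field, substitute it into the governing operator to obtain a source term, and run the code against that source so the exact solution is known in advance.

```latex
% Manufactured solution for the Poisson problem  -\nabla^2 u = f  on the unit square.
% Choose a smooth field in advance:
\[
  u_m(x, y) \;=\; \sin(\pi x)\,\sin(\pi y)
\]
% Substituting u_m into the operator yields the source term to impose:
\[
  f(x, y) \;=\; -\nabla^2 u_m \;=\; 2\pi^2 \sin(\pi x)\,\sin(\pi y)
\]
% Running the code with this f (and boundary data taken from u_m) gives a
% problem whose exact solution is u_m, so the discretization error
% \lVert u_h - u_m \rVert can be measured directly and its convergence rate
% checked against the scheme's formal order of accuracy.
```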

Journal ArticleDOI
TL;DR: A roadmap toward the development of a verification and validation procedure through the use of a flow library on the Web is discussed; the paper describes the experience of the web-based FLOWnet database and analyzes some representative test cases installed in it.
Abstract: The aspects of verification and validation of computational fluid dynamics must always be addressed with an emphasis on the quantification of the uncertainties due to the model assumptions (either physical or geometrical), and to the numerical and experimental approximations. The credibility of computational fluid dynamics can only be established by a rigorous process of verification and validation. Verification is the process of determining how accurately a computational simulation represents a given conceptual model. Validation establishes how accurately a simulation (i.e. the conceptual model) represents the phenomenon to be investigated. In the present paper we discuss a roadmap toward the development of a verification and validation procedure through the use of a flow library on the Web. In particular, we describe the experience of the web-based FLOWnet database and we analyze some representative test cases installed in the database.

Proceedings ArticleDOI
06 May 2002
TL;DR: The design of a generic, C-based multi-processor instruction set simulator framework, termed the "simulation bridge", facilitates highly accurate, yet efficient simulation and addresses the multiple key issues of execution control, synchronization, connectivity and communication.
Abstract: Multi-processor solutions in the embedded world are being designed to meet the ever-increasing computational demands of emerging applications. Such architectures comprise two or more processors (often a mix of general-purpose and digital signal processors) together with a rich peripheral mix to provide a high-performance computational platform. While there are many simulation solutions in the industry available to address system partitioning issues and the verification of HW-SW interactions in these complex systems, there are very few solutions targeted towards the SW application developers' needs. The primary concern of SW application developers is to debug and optimize their code. Hence, cycle accuracy and performance of the simulation solution become the key enablers. Desired observability and controllability of the models are additional considerations. Secondly, application developers are more comfortable with instruction-level simulation than with RTL or gate-level simulation. These specific requirements have a bearing on the choices in the simulation solutions. This paper describes the design of a generic, C-based multi-processor instruction set simulator framework in the context of the above parameters. This framework, termed the "simulation bridge", facilitates highly accurate, yet efficient simulation. The SimBridge performs clock-correct lock-step simulation of the models in the architecture using a global simulation engine that handles both intra-processor and inter-processor communication in a homogeneous fashion. It addresses the multiple key issues of execution control, synchronization, connectivity and communication. The paper concludes with the performance analysis of the SimBridge in an experimental test setup as well as in the Texas Instruments (TI) TMS320C54x-based simulators.
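
The clock-correct lock-step scheme can be pictured with a small sketch: a global engine advances every core model exactly one cycle per tick and delivers inter-processor messages between ticks. The Core class and mailbox protocol below are invented for illustration and are not the SimBridge API:

```python
# Sketch: a global engine stepping several core models in lock-step, one
# cycle per tick, with a simple mailbox for inter-processor messages.
# The Core class and mailbox scheme are hypothetical and do not reflect the
# SimBridge implementation described in the paper.

class Core:
    def __init__(self, name):
        self.name = name
        self.cycle = 0
        self.inbox = []
        self.outbox = []          # (destination, payload) pairs

    def step(self):
        """Execute one cycle: consume pending messages, occasionally send one."""
        self.cycle += 1
        while self.inbox:
            sender, payload = self.inbox.pop(0)
            print(f"[{self.cycle:3d}] {self.name} received {payload!r} from {sender}")
        if self.cycle % 4 == 0:   # toy behaviour: ping the other core periodically
            dest = "dsp" if self.name == "cpu" else "cpu"
            self.outbox.append((dest, f"ping@{self.cycle}"))

def run(cores, n_cycles):
    by_name = {c.name: c for c in cores}
    for _ in range(n_cycles):
        for core in cores:                    # every core advances exactly one cycle
            core.step()
        for core in cores:                    # deliver messages between ticks
            while core.outbox:
                dest, payload = core.outbox.pop(0)
                by_name[dest].inbox.append((core.name, payload))

run([Core("cpu"), Core("dsp")], n_cycles=12)
```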

Proceedings ArticleDOI
04 Mar 2002
TL;DR: This paper focuses on the evaluation of error models, used in test generation and in functional verification, when fault injection methodologies are used to evaluate the dependability of complex systems.
Abstract: Summary form only given. The combined effects of devices' increased complexity and reduced design cycle time create a testing problem: an increasingly large portion of the design time is devoted to testing and verification. Today's EDA tools, moving towards higher levels of abstraction, promise greater designer productivity, resulting in increased design complexity and size. In order to reduce the testing and verification time, different high-level approaches have been proposed in the literature. Most of these approaches are based on the definition of an error or fault model, applicable at a higher level of abstraction of the description of the system to be implemented. In this paper we concentrate our attention on the evaluation of error models, used in test generation and in functional verification. Evaluation of error models is also an important aspect when fault injection methodologies are used to evaluate the dependability of complex systems.
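
A toy sketch of fault-injection-style evaluation: inject single bit-flips into an operand of a reference computation and count which injected faults are visible at the output. The workload and classification below are made up for illustration and are not the error models studied in the paper:

```python
# Sketch: evaluating faults by injection. A single bit-flip is injected into
# one operand of a toy workload; faults that change the output relative to a
# golden run are counted as detected, the rest as silent (masked).
# The workload and bit width are made up for illustration.

def bit_flip(value, bit):
    return value ^ (1 << bit)

def workload(a, b):
    # only the low byte of `a` influences the result, so faults injected into
    # the high byte are masked and remain silent
    return (a & 0x00FF) + b

a, b = 1234, 5678
golden = workload(a, b)

detected = silent = 0
for bit in range(16):                     # inject one fault per bit position of `a`
    faulty = workload(bit_flip(a, bit), b)
    if faulty != golden:                  # compare against the golden run
        detected += 1
    else:
        silent += 1
print(f"injected 16 faults: {detected} detected, {silent} silent")
```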

Journal ArticleDOI
TL;DR: A seven-step method for validation of a simulation model is presented, emphasizing the following issues: determining when predictions should be considered valid, accomplishing validation on the basis of the available model and system data, and indicating potential system changes so that the model can be modified in real time.

Proceedings ArticleDOI
14 Apr 2002
TL;DR: A model of flow injection defined using Cell-DEVS showed, in validation, a margin of error within the expected values for the experiment, demonstrating how to employ the formalism in analyzing physical systems.
Abstract: Cell-DEVS is an extension to the DEVS formalism that allows the definition of cellular models. Complex physical systems can be defined using simple rules, reducing development effort. We present the definition of a model of flow injection using Cell-DEVS. The simulation validation results showed a margin of error within the expected values for the experiment, demonstrating how to employ the formalism in analyzing physical systems.
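
The "simple rules" style of cellular modelling can be pictured with a one-dimensional toy cell space whose local rule averages each cell with its neighbours; this is a generic sketch for illustration only, not the Cell-DEVS flow-injection model itself:

```python
# Sketch: a one-dimensional cell space updated by a simple local rule
# (each cell moves toward the mean of itself and its two neighbours).
# Generic rule-based cellular model for illustration; not the paper's model.

cells = [0.0] * 20
cells[10] = 1.0                      # an initial "injection" in the middle cell

def step(state):
    nxt = []
    for i, c in enumerate(state):
        left = state[i - 1] if i > 0 else c
        right = state[i + 1] if i < len(state) - 1 else c
        nxt.append((left + c + right) / 3.0)   # local averaging rule
    return nxt

for t in range(5):
    cells = step(cells)

print(" ".join(f"{c:.3f}" for c in cells))
# Validation would compare profiles like this one against experimental data,
# as the paper does for the flow-injection system.
```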

Dissertation
01 Nov 2002
TL;DR: This thesis aims to evaluate the effectiveness of a formal language (Finite State Process) and an automated verification tool (Labelled Transition System Analyser) at finding and resolving errors in design models of software.
Abstract: This thesis aims to evaluate the effectiveness of a formal language (Finite State Process) and an automated verification tool (Labelled Transition System Analyser) at finding and resolving errors in design models of software. FSP is used to model the Lift Problem from a specification refined by validation. The specification is mapped to a finite-state domain and tested for errors: in the mapping, in the understanding of the initial requirements, in the accuracy of the initial requirements, and in the concurrency properties of the identified co-operating entities. Exposition of errors refines (validates) the initial description and drives their resolution, giving rise to an evolutionary corrected model; upon exit of the iterative analysis, this is mapped to UML behavioural diagrams forming Implementation Specifications.

01 Jan 2002
TL;DR: By employing data from aero experimentation, qualitative and quantitative methods for validating the static and dynamic performance of the simulation model are presented.
Abstract: By employing data from aero experimentation, this paper presents qualitative and quantitative methods for the validation of the static and dynamic performance of the simulation model. Example simulation results have shown that an appropriate method can yield a better validation result when using the limited sample of aero experimentation data.

Proceedings ArticleDOI
08 Dec 2002
TL;DR: This paper presents a method for documenting simulation models with standardised notations; the documentation is adapted to the different users of the simulation model.
Abstract: The concept of life cycle simulation appeals to most production engineers. There is a problem with simulation models of manufacturing systems; they may be highly complex and time-consuming to develop. They embody considerable experience gained through the development process of the simulation model. It is therefore not enough to develop an accurate simulation model. The model must be understood, updated, re-used and inherited by others. One way to achieve these goals could be through the use of standardised documentation, which in turn explains the model and how it has been developed. This paper presents a method for documenting simulation models with standardised notations. The documentation is adapted to different users of the simulation model.

Journal ArticleDOI
TL;DR: This paper summarizes a standardized verification process for network traffic simulation models using virtual data on a simple network to confirm their fundamental functions.
Abstract: This paper summarizes a standardized verification process for network traffic simulation models. After a general introduction to the philosophy of verification, the detailed verification process and its application to several well-known simulation models are explained. "Verification" here means several examination tests of simulation models using virtual data on a simple network, so as to confirm their fundamental functions. In the course of model development, the developers have to examine whether the model performance is consistent with the specifications that they intend and also with well-established traffic engineering theory. Because of several constraints in putting the model specifications into computer programs, such as the discretization of time and space and the simplification of vehicle behaviors to some degree, the intended model specifications may not be fully achieved in a computer. Verification is strongly recommended before applying the models to a real network.
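
A miniature example of the kind of virtual-data test meant here: on a single toy link, check at every step that the simulated vehicle counts respect conservation. The link model and demand profile are invented for illustration, not drawn from the paper's test suite:

```python
# Sketch: a vehicle-conservation check on a single virtual link, in the
# spirit of verifying fundamental functions of a traffic simulation model
# before applying it to a real network. The toy link model and demand
# profile are invented for illustration.

capacity_per_step = 3                  # max vehicles that may leave per step
demand = [5, 2, 0, 4, 1, 0, 0, 0]      # arrivals per step (virtual data)

on_link = 0
total_in = total_out = 0
for step, arrivals in enumerate(demand):
    on_link += arrivals
    total_in += arrivals
    departures = min(on_link, capacity_per_step)
    on_link -= departures
    total_out += departures
    # verification check: vehicles are neither created nor destroyed
    assert total_in == total_out + on_link, f"conservation violated at step {step}"

print(f"in={total_in}, out={total_out}, still on link={on_link}  (conservation holds)")
```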

Proceedings ArticleDOI
02 Oct 2002
TL;DR: A formal mapping between static information models and dynamic models is presented to enable the reuse of process information when creating dynamic models for process control, and to enable simulation and verification to be conducted early in the development.
Abstract: A formal mapping between static information models and dynamic models is presented. The static information models are given according to an international standard for product, process and resource information exchange. The dynamic models are described as discrete event systems. The product, process and resource information is automatically converted into product routes and used for simulation, controller synthesis and verification. The focus in this paper is on resource booking problems. A high level language, which combines Petri nets and process algebra, is presented and used for specification of desired routes. A main implication of this methodology is to enable the reuse of process information when creating dynamic models for process control. This method also enables simulation and verification to be conducted early in the development.

Book ChapterDOI
25 Aug 2002
TL;DR: The ISILEIT project aims at the development of a seamless methodology for the integrated design, analysis, and validation of distributed production control systems, using suitable subsets of UML and SDL for their design.
Abstract: The specification of software for distributed production control systems is an error-prone task. The ISILEIT project aims at the development of a seamless methodology for the integrated design, analysis and validation of such embedded systems. Suitable subsets of UML and SDL for the design of such systems are therefore identified in a first step. The paper then focuses on how we use a series of formal semantics of our design language to enable the effective evaluation of software designs by means of validation and verification. We will further explain how the use of multiple Abstract State Machine meta-models permits simulation and model checking at different levels of abstraction.

Journal Article
TL;DR: In this paper, the authors modify the procedure verification approach by introducing two strategies that make use of detailed knowledge of procedures in order to reduce the complexity of model checking, which may improve the efficiency of procedure verification significantly and therefore scale up the applicability of the verification approach.
Abstract: Verification of operating procedures by model checking has been discussed in [11, 12]. As an execution of a procedure may affect or be affected by many processes, a model of the procedure with its related processes could be very large. We modify the procedure verification approach [11, 12] by introducing two strategies that make use of detailed knowledge of procedures in order to reduce the complexity of model checking. A case study demonstrates the potential advantages of the strategies and shows that the strategies may improve the efficiency of procedure verification significantly and therefore scale up the applicability of the verification approach.

Book ChapterDOI
10 Sep 2002
TL;DR: The procedure verification approach is modified by introducing two strategies that make use of detailed knowledge of procedures in order to reduce the complexity of model checking and scale up the applicability of the verification approach.
Abstract: Verification of operating procedures by model checking has been discussed in [11, 12]. As an execution of a procedure may affect or be affected by many processes, a model of the procedure with its related processes could be very large. We modify the procedure verification approach [11, 12] by introducing two strategies that make use of detailed knowledge of procedures in order to reduce the complexity of model checking. A case study demonstrates the potential advantages of the strategies and shows that the strategies may improve the efficiency of procedure verification significantly and therefore scale up the applicability of the verification approach.

Journal ArticleDOI
TL;DR: The validation procedures are organised into a hierarchical framework that introduces a structured approach to the validation process and increases its efficiency by reducing the time and resources spent on it.
Abstract: The most common procedures used for validating simulation models are briefly reviewed. The validation procedures are organised into a hierarchical framework that introduces a structured approach to the validation process. Application of the validation procedures according to the hierarchical framework results in a systematic approach to simulation model validation. A systematic approach increases the efficiency of the validation process by reducing the time and resources spent on it.

Journal ArticleDOI
TL;DR: The results obtained on a few case studies seem to indicate the feasibility of the proposed approach to the verification and performance evaluation of communication protocols and, in general, of entire computer networks based on VHDL modeling and simulation.
Abstract: A communication protocol usually represents a system whose behavior can be specified through a finite state machine. Finite state machines are often used to model digital systems in the context of logic synthesis and formal hardware verification. Therefore, sophisticated and efficient tools (for example, hardware simulators) to analyze this type of system do exist. In this paper, we propose an approach to the verification and performance evaluation of communication protocols and, in general, of entire computer networks based on VHDL modeling and simulation. The results we have obtained on a few case studies (some of which are reported in this paper) seem to indicate the feasibility of the method.
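
In the same finite-state-machine spirit, a small sketch: encode a toy stop-and-wait sender as an FSM and simulate it against random event sequences, checking that only declared states are reached. The protocol and transition table are invented for illustration and are not one of the paper's VHDL case studies:

```python
# Sketch: simulating a toy stop-and-wait sender modelled as a finite state
# machine and checking a simple property over many random runs. The FSM is
# hypothetical; a real study would use the protocol's actual state machine.
import random

# state -> {event: next_state}
fsm = {
    "idle":     {"send": "wait_ack"},
    "wait_ack": {"ack": "idle", "timeout": "wait_ack"},  # retransmit on timeout
}

def simulate(n_events, rng):
    state = "idle"
    for _ in range(n_events):
        events = list(fsm[state])
        event = rng.choice(events)
        state = fsm[state][event]
        assert state in fsm, f"undeclared state reached: {state}"
    return state

rng = random.Random(0)
final_states = {simulate(50, rng) for _ in range(1000)}
print("final states observed over 1000 runs:", final_states)
```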