
Showing papers on "Verification and validation of computer simulation models" published in 2013


Journal ArticleDOI
TL;DR: Three approaches to deciding model validity are described, two paradigms that relate verification and validation to the model development process are presented, and various validation techniques are defined.
Abstract: Verification and validation of simulation models are discussed in this paper. Three approaches to deciding model validity are described, two paradigms that relate verification and validation to the model development process are presented, and various validation techniques are defined.

1,425 citations


Journal ArticleDOI
TL;DR: This paper provides a broad, but not exhaustive, overview of the crowd motion simulation models of the last decades and argues that any model used for crowd simulation should be able to simulate most of the phenomena indicated in this paper.
Abstract: Currently, pedestrian simulation models are used to predict where, when and why hazardous high-density crowd movements arise. However, it is questionable whether models developed for low-density situations can be used to simulate high-density crowd movements. The objective of this paper is to assess existing pedestrian simulation models with respect to known crowd phenomena, in order to ascertain whether these models can indeed be used for the simulation of high-density crowds and to indicate any gaps in the field of pedestrian simulation modeling research. This paper provides a broad, but not exhaustive, overview of the crowd motion simulation models of the last decades. It is argued that any model used for crowd simulation should be able to simulate most of the phenomena indicated in this paper. In the paper, cellular automata, social force models, velocity-based models, continuum models, hybrid models, behavioral models and network models are discussed. The comparison shows that the models can roughly be divided into slow but highly precise microscopic modeling attempts and very fast but behaviorally questionable macroscopic modeling attempts. Both sets of models have their uses, which depend highly on the application the model was originally developed for. Yet for practical applications that need both precision and speed, the current pedestrian simulation models are inadequate.

407 citations
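As a rough illustration of the microscopic model families surveyed above, the following is a minimal sketch of one social-force update step in Python. It is not taken from the paper; the parameter values and the simple exponential repulsion term are illustrative stand-ins for a calibrated model.

```python
import numpy as np

def social_force_step(pos, vel, goals, dt=0.05, v0=1.3, tau=0.5,
                      A=2.0, B=0.3):
    """One explicit-Euler step of a bare-bones social force model.

    pos, vel, goals: (N, 2) arrays. v0 is the desired speed, tau the
    relaxation time, A/B the strength/range of pairwise repulsion.
    All parameter values here are illustrative, not calibrated.
    """
    n = len(pos)
    # Driving force: relax toward the desired velocity v0 * e_goal.
    e = goals - pos
    e /= np.linalg.norm(e, axis=1, keepdims=True) + 1e-9
    force = (v0 * e - vel) / tau
    # Pairwise exponential repulsion between pedestrians.
    for i in range(n):
        d = pos[i] - pos                 # vectors from others to i
        dist = np.linalg.norm(d, axis=1)
        dist[i] = np.inf                 # ignore self-interaction
        force[i] += np.sum(A * np.exp(-dist / B)[:, None]
                           * d / (dist[:, None] + 1e-9), axis=0)
    vel = vel + dt * force
    return pos + dt * vel, vel
```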


Journal ArticleDOI
TL;DR: An exhaustive review of the literature on the verification of model transformations is presented, analyzing these three components and taking a problem-based approach that exemplifies which aspects of interest can be verified on a model transformation and how this can be done.

69 citations


Book ChapterDOI
13 Jul 2013
TL;DR: It is shown how by combining Explicit Model Checking techniques and simulation it is possible to effectively carry out (bounded) System Level Formal Verification of large Hybrid Systems such as those defined using model-based tools like Simulink.
Abstract: We show how, by combining Explicit Model Checking techniques and simulation, it is possible to effectively carry out (bounded) System Level Formal Verification of large Hybrid Systems such as those defined using model-based tools like Simulink. We use an explicit model checker (namely, CMurphi) to generate all possible (finite horizon) simulation scenarios and then optimise the simulation of such scenarios by exploiting the ability of simulators to save and restore visited states. We show the feasibility of our approach by presenting experimental results on the verification of the fuel control system example in the Simulink distribution. To the best of our knowledge, this is the first time that (exhaustive) verification has been carried out for hybrid systems of such a size.

64 citations
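The key optimisation described above, exploiting a simulator's ability to save and restore visited states so that scenarios sharing a prefix are simulated only once, can be sketched as follows. The simulator API (step/save/restore) is a hypothetical stand-in for tool-specific calls:

```python
def simulate_all_scenarios(sim, scenarios):
    """Simulate a set of finite-horizon input scenarios, reusing work
    across scenarios that share a common prefix.

    Assumes `sim` exposes step(input), save() -> state handle, and
    restore(handle); `scenarios` is a collection of input tuples.
    Caching every prefix trades memory for speed; a real tool would
    prune the cache.
    """
    cache = {(): sim.save()}              # input prefix -> saved state
    for scenario in sorted(scenarios):    # sorting groups shared prefixes
        k = len(scenario)
        while scenario[:k] not in cache:  # longest already-simulated prefix
            k -= 1
        sim.restore(cache[scenario[:k]])
        for i in range(k, len(scenario)):  # simulate only the suffix
            sim.step(scenario[i])
            cache[scenario[:i + 1]] = sim.save()
        yield scenario, cache[scenario]    # final state, ready to check
```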


Book ChapterDOI
15 Mar 2013
TL;DR: The recent initiative by several European hydraulic research and engineering institutes to start producing validation documents for computational modeling software is described; these documents are to contain claims about model applicability and accuracy together with the available evidence for those claims.
Abstract: The recent initiative by several European hydraulic research and engineering institutes to start producing validation documents for computational modeling software is described. These documents are to contain claims about model applicability and accuracy, together with the available evidence for those claims. A brief discussion of recent literature on general aspects of model validation is included. We view model validation as a process of formulating and substantiating claims about model applicability and accuracy. This involves a broad range of activities: theoretical analysis of model assumptions, numerical analysis of discretization techniques, computational experiments on hypothetical test cases, comparisons between model results and laboratory measurements, case studies involving field applications, etc. A description and classification of distinct aspects of the validation process is introduced; terminology is carefully defined and included in a glossary.

45 citations


Journal ArticleDOI
TL;DR: Three different models used in simulation, namely system dynamics, agent-based simulation and discrete event simulation, are discussed in the context of the features, advantages, disadvantages and tools used in each simulation method.
Abstract: Simulation modeling is one of the methods commonly used in Operational Research to represent a real situation occurring in a system and to test scenarios based on different behaviors. In this paper we discuss three different models used in simulation: system dynamics, agent-based simulation and discrete event simulation. The aim of this paper is to compare these three methods in terms of the features, advantages, disadvantages and tools used in each simulation method. The comparison also includes a taxonomy-based classification of the simulation models. Throughout the paper, we review several software tools commonly used in simulation, such as Vensim, ProModel and AnyLogic.

42 citations


Proceedings ArticleDOI
30 Oct 2013
TL;DR: The presented work uses a branch of mathematics called model theory to investigate definitions of interoperability and composability and provides the implications for verification and validation procedures, model-based approaches, and simulation interoperability standards.
Abstract: Interoperability is generally defined as the ability to exchange data and to make use of these data within the receiving system. For information technology systems this definition makes perfect sense, as the exchange of data via common protocols in a shared infrastructure is the only way to make systems work with each other. For simulation systems, however, the exchange and use of data is necessary, but not sufficient. Simulation systems execute models, which represent purposeful abstractions and simplifications of a task-oriented reality. Meaningful interoperation of two systems requires the alignment of concepts represented in the underlying models. Composability ensures the alignment of these concepts by ensuring the consistent representation of interpretations of truth among all participating systems. The presented work uses a branch of mathematics called model theory to investigate definitions of interoperability and composability and provides the implications for verification and validation procedures, model-based approaches, and simulation interoperability standards.

35 citations


Proceedings ArticleDOI
08 Dec 2013
TL;DR: A graphical paradigm that shows how verification and validation are related to the model development process, and a flowchart that shows how verification and validation are part of the model development process, are presented and discussed.
Abstract: Model verification and validation are defined, and why model verification and validation are important is discussed. The three approaches to deciding model validity are described. A graphical paradigm that shows how verification and validation are related to the model development process, and a flowchart that shows how verification and validation are part of the model development process, are presented and discussed. A recommended procedure for verification and validation is given.

33 citations


Book ChapterDOI
09 Jun 2013
TL;DR: By combining abstract method calls, structured reuse in specification contracts, and caching of verification conditions, it is possible to detect reusability of contracts automatically via first-order reasoning and build a verification framework that is able to deal with code undergoing frequent changes.
Abstract: A major obstacle facing adoption of formal software verification is the difficulty of tracking changes in the target code and accommodating them in specifications and in verification arguments. We introduce abstract method calls, a new verification rule for method calls that can be used in most contract-based verification settings. By combining abstract method calls, structured reuse in specification contracts, and caching of verification conditions, it is possible to detect reusability of contracts automatically via first-order reasoning. This is the basis for a verification framework that is able to deal with code undergoing frequent changes.

33 citations
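The contract-reuse idea can be pictured, very loosely, as caching verification verdicts keyed by everything they depend on: the method body and the contracts of the methods it calls. The sketch below is our illustration of that caching principle only, not the paper's proof rule, and `verify` is a hypothetical callable standing in for a deductive verifier:

```python
import hashlib

def fingerprint(*texts):
    """Stable fingerprint of a method body plus the contracts it relies on."""
    h = hashlib.sha256()
    for t in texts:
        h.update(t.encode())
        h.update(b"\x00")
    return h.hexdigest()

class VCCache:
    """Skip re-verification when neither the method body nor the
    contracts of the methods it (abstractly) calls have changed."""

    def __init__(self, verify):
        self.verify = verify   # callable: (body, contracts) -> bool
        self.results = {}      # fingerprint -> cached verdict

    def check(self, body, callee_contracts):
        key = fingerprint(body, *sorted(callee_contracts))
        if key not in self.results:
            self.results[key] = self.verify(body, callee_contracts)
        return self.results[key]
```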


Book ChapterDOI
24 Sep 2013
TL;DR: Automatic generation of such tests would require “executing” the complex specifications typically used for verification (with unbounded quantification and other expressive constructs), something beyond the capabilities of standard testing tools.
Abstract: When program verification fails, it is often hard to understand what went wrong in the absence of concrete executions that expose parts of the implementation or specification responsible for the failure. Automatic generation of such tests would require “executing” the complex specifications typically used for verification (with unbounded quantification and other expressive constructs), something beyond the capabilities of standard testing tools.

24 citations


Journal ArticleDOI
TL;DR: A simplified probabilistic model for off-line signature verification is proposed that is able to predict the accuracy of a signature verification system based on just a few a priori known parameters, such as the cardinality and the quality of input samples.

Proceedings ArticleDOI
29 May 2013
TL;DR: The experimental results show that by leveraging the knowledge extracted from constrained-random simulation, this work can improve the test templates to activate the assertions that otherwise are difficult to activate by extensive simulation.
Abstract: This work proposes a methodology of knowledge extraction from constrained-random simulation data. Feature-based analysis is employed to extract rules describing the unique properties of novel assembly programs hitting special conditions. The knowledge learned can be reused to guide constrained-random test generation towards uncovered corners. The experiments are conducted based on the verification environment of a commercial processor design, in parallel with the on-going verification efforts. The experimental results show that by leveraging the knowledge extracted from constrained-random simulation, we can improve the test templates to activate the assertions that otherwise are difficult to activate by extensive simulation.
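Feature-based rule extraction of this kind is often done with interpretable classifiers. Below is a hypothetical scikit-learn sketch, with invented feature names and a synthetic "assertion hit" label, of how rules describing the programs that reach a special condition could be mined and read back into test-template constraints; it illustrates the idea, not the paper's actual flow:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Each row describes one constrained-random assembly program by its
# template features; `hit` marks whether it activated a target assertion.
rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(500, 3))                 # e.g. template knobs
hit = ((X[:, 0] == 3) & (X[:, 2] > 1)).astype(int)    # stand-in condition

tree = DecisionTreeClassifier(max_depth=3).fit(X, hit)
# The printed paths are human-readable rules that can be folded back
# into the test-template constraints to bias generation toward the hit.
print(export_text(tree, feature_names=["opmix", "deplen", "mode"]))
```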

Journal ArticleDOI
TL;DR: The lack of information regarding most simulation-based studies and their models restricts replication, makes the results overly specific and generalization hard, and compromises confidence in their validity.
Abstract: CONTEXT: Simulation-based studies have been used in different research areas in order to conduct computerized experiments with distinct purposes. However, it seems that Software Engineering simulation studies have been performed in a non-systematic way, using ad-hoc experimental design and analysis procedures, i.e., without defining a research protocol and with missing information when reporting results. OBJECTIVE: To characterize simulation-based studies and identify the common simulation strategies in Software Engineering. METHOD: To undertake a quasi-systematic review. Three online digital libraries (Scopus, Ei Compendex, and Web of Science) are used as sources of information. Information extracted from the primary studies is captured using a predefined form. Plotted charts and tables are used together with quantitative/qualitative approaches when possible to support data analysis. RESULTS: From 946 papers, 108 have been included, from which it is possible to identify 19 simulation approaches, 17 domains, 28 simulation model characteristics, 22 output analysis instruments, and 9 procedures for the verification and validation of simulation models in the Software Engineering context. The most dominant approach is System Dynamics, in the Software Project Management domain. Replication is not common practice. CONCLUSION: The lack of information regarding most of the simulation-based studies and their models restricts replication, making the results overly specific and generalization hard. Apart from that, it compromises confidence in their validity. More research and discussion are needed in the community to increase the reliability and use of simulation-based studies in Software Engineering.

Journal ArticleDOI
TL;DR: A summary is given of the development of increasingly sophisticated validation strategies, up to the present trend toward science-based validation approaches, in the field of simulation tools for nuclear reactor design.

Proceedings ArticleDOI
08 Dec 2013
TL;DR: This paper aims to answer the question of to what extent gaming simulation, given its loosely demarcated experimental features, can be used as an experimental research setting, and strives to improve gaming simulation for testing innovations that tackle social and technical elements of a system.
Abstract: Gaming simulation in the railway sector often uses the same conceptual model as computer simulation, and enables operators to interact with this model during a simulation run. Because of this, gaming simulation validation poses different challenges. This paper aims to answer the question of to what extent gaming simulation, given its loosely demarcated experimental features, can be used as an experimental research setting. Focusing on validity issues, we study five cases in which the Dutch railway sector used gaming simulation to test innovations in a controlled environment. The results show that in addition to traditional external validity issues, human game players inherently open up this controlled environment, bringing in many confounding variables. By signaling what the specific validity threats are, this paper strives to improve gaming simulation for testing innovations that tackle social and technical elements of a system.

Journal ArticleDOI
TL;DR: In this paper, a generic testing framework for agent-based simulation models to conduct validation and verification of models is presented, and its effectiveness is demonstrated by showing its applicability on a realistic agent-based simulation case study.
Abstract: Agent-based modelling and simulation (ABMS) has received increasing attention during the last decade. However, the weak validation and verification of agent-based simulation models makes ABMS hard to trust. There is no comprehensive tool set for verification and validation of agent-based simulation models that demonstrates whether inaccuracies exist and/or reveals the existing errors in the model. Moreover, on the practical side, many ABMS frameworks are in use. In this sense, we designed and developed a generic testing framework for agent-based simulation models to conduct validation and verification of models. This paper presents our testing framework in detail and demonstrates its effectiveness by showing its applicability on a realistic agent-based simulation case study.
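What model-level test cases in such a framework can look like may be easier to see with a toy example. The sketch below assumes a hypothetical agent-based model class exposing an `agents` list, a `step()` method and a `summary_statistic()`; it shows one invariant (verification-style) check and one statistical (validation-style) check in pytest style, and is ours rather than the paper's framework:

```python
import statistics

def test_population_conserved(model_cls, steps=100):
    """Verification-style invariant: agents are neither created nor lost."""
    model = model_cls(n_agents=50, seed=42)
    n0 = len(model.agents)
    for _ in range(steps):
        model.step()
        assert len(model.agents) == n0

def test_output_within_reference_band(model_cls, reference_mean, tol=0.1):
    """Validation-style check: replicated runs stay near reference data."""
    outputs = []
    for seed in range(30):                 # independent replications
        model = model_cls(n_agents=50, seed=seed)
        for _ in range(100):
            model.step()
        outputs.append(model.summary_statistic())
    assert abs(statistics.mean(outputs) - reference_mean) <= tol
```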

Journal ArticleDOI
01 Dec 2013
TL;DR: A Design Science approach is taken to demonstrate that the REA ontology, which provides a shared conceptual ground for these three model types, and its axioms, can help to build conceptually sound simulation models and identify the integration points between these models.
Abstract: This paper presents a framework for the integration of supply chain (or logistics/distribution), value chain (or financial), and business process (or operational/manufacturing) simulation models, which should facilitate assessing the impact of supply chain and operational changes on an enterprise's financial performance. A Design Science approach is taken to demonstrate that the REA ontology, which provides a shared conceptual ground for these three model types, and its axioms, which describe invariant conditions for value systems, can help to build conceptually sound simulation models and identify the integration points between these models. It is further shown how these three types of simulation models can be integrated into one value system model for discrete event simulation, making use of the ExSpecT simulation tool. With this ontology-based framework, simulation model builders should be able to scope their models better and define integration points with other models, which is expected to promote the (re)use of simulation models for different purposes (e.g., simulating logistical, operational and financial performance).
Highlights:
- Defining the scope of business process, supply and value chain simulation models
- Rephrasing the REA modeling axioms for each of these types of simulation models
- Identifying integration points between these models through the REA ontology
- Integrating these three types of models into a hierarchic value system model
- First application of the REA ontology for building discrete-event simulation models

Proceedings ArticleDOI
07 Jul 2013
TL;DR: This work illustrates the possibilities associated with the concept of the experimental frame in the domain of simulation, by detecting incompatibilities between the frame and parts of the model and developing a method to measure them.
Abstract: This work illustrates the possibilities associated with the concept of the experimental frame in the domain of simulation. The experimental frame, noted "EF", is used to define the environment in which a model will evolve. The EF and the model are extracted from specifications of the system under study. A formal language has been utilized for designing the EF and the model so that model checking techniques can be employed; we have chosen I/O automata. Applying the frame to the model exposes incompatibilities with parts of the model. In this contribution we detect these incompatibilities and measure them. Our method yields a set of metrics that evaluate the validity of simulations according to user-defined simulation goals.
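At its most elementary, checking a model against an experimental frame starts at the interface: every stimulus the frame can emit should be an input the model accepts, and every model output the frame wants to observe should be visible to it. The sketch below is a deliberately coarse illustration of such a compatibility metric over action labels; the structures and scoring are ours, far simpler than I/O-automata model checking:

```python
def interface_compatibility(frame, model):
    """Fraction of frame stimuli the model accepts, and fraction of
    model outputs the frame observes.

    `frame` and `model` are dicts with 'inputs'/'outputs' as sets of
    action labels (a deliberately coarse abstraction of I/O automata).
    """
    stimuli_ok = frame["outputs"] & model["inputs"]
    observ_ok = model["outputs"] & frame["inputs"]
    return (len(stimuli_ok) / max(len(frame["outputs"]), 1),
            len(observ_ok) / max(len(model["outputs"]), 1))
```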

Journal ArticleDOI
TL;DR: The model implemented in the equation-oriented environment proved to be suitable for real-time optimization (RTO) applications due to its robustness, fast convergence, and ability to represent the real process.
Abstract: The modeling, analysis, and simulation validation of an industrial-scale depropenizer column (Refinaria de Capuava, Maua, Sao Paulo) owned by Petrobras S.A. are carried out using equation-oriented and sequential modular approaches. The model implemented in the equation-oriented environment proved to be suitable for real-time optimization (RTO) applications due to its robustness, fast convergence, and ability to represent the real process. The analysis of the model allows better understanding of the process and establishment of boundaries to process specifications and the parameters updated in the RTO cycle; furthermore, it is shown that feed composition and column pressure drop are critical aspects in the model to represent the actual process.
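The contrast between the two approaches is easy to see on a toy recycle loop: the equation-oriented approach hands all balance equations to one simultaneous solver, while the sequential-modular approach sweeps the recycle stream until it converges. The sketch below (nothing like a full depropenizer model; all numbers invented) uses SciPy:

```python
from scipy.optimize import fsolve

# Toy flowsheet: feed F mixes with recycle R, a unit converts fraction x,
# a splitter recycles fraction s of the unconverted stream.
F, x, s = 100.0, 0.7, 0.5

def residuals(v):
    mixed, out, recycle = v
    return [mixed - (F + recycle),   # mixer balance
            out - (1 - x) * mixed,   # unit: unconverted flow
            recycle - s * out]       # splitter balance

# Equation-oriented: one simultaneous solve of the whole flowsheet.
mixed, out, recycle = fsolve(residuals, [F, F, 0.0])

# Sequential-modular: sweep the recycle stream until it converges.
r = 0.0
for _ in range(100):
    r_new = s * (1 - x) * (F + r)
    if abs(r_new - r) < 1e-10:
        break
    r = r_new
assert abs(r - recycle) < 1e-6       # both approaches agree
```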

Book ChapterDOI
TL;DR: This paper updates an earlier one that reviewed the literature on V&V of models and outlined future directions, and describes the experiences LANL researchers have had with the V&V of extensible logic models used to evaluate the efficacy of various technologies in countering national security threats.

Proceedings ArticleDOI
08 Dec 2013
TL;DR: Comparisons of hourly census in many areas of the real-world ED with those predicted from simulation, before and after a process intervention, provide evidence that a methodologically rigorous DES model, incorporating the method for model verification using SysML, is a valuable decision-making tool for judging the utility of interventions to improve patient flow.
Abstract: Continual improvements to the efficiency of patient flow in emergency departments (EDs) are necessary to meet patient demand. Discrete Event Simulation (DES) is commonly employed for this purpose, but validation and verification is daunting in that many stakeholders (clinicians, administrators, and engineers) need to understand the system's processes in a unified manner. Therefore, knowledge transfer between stakeholders requires a unified formal approach. We describe the use of the System Modeling Language (SysML) to this end, as well as the results of model validation obtained by comparing hourly census in many areas of the real-world ED with those predicted from simulation, before and after a process intervention. The accuracy of these comparisons provides evidence that a methodologically rigorous DES model, incorporating our method for model verification using SysML, is a valuable decision-making tool for judging the utility of interventions to improve patient flow.
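The validation pattern described, comparing hourly census from a DES model against real ED observations, can be sketched with SimPy. Everything below (arrival and treatment rates, bed capacity) is invented for illustration; a real ED model would be far richer:

```python
import random
import statistics
import simpy

def patient(env, beds):
    with beds.request() as bed:
        yield bed
        yield env.timeout(random.expovariate(1 / 2.5))  # ~2.5 h treatment

def arrivals(env, beds):
    while True:
        yield env.timeout(random.expovariate(4.0))      # ~4 arrivals/hour
        env.process(patient(env, beds))

def monitor(env, beds, census):
    while True:                                         # hourly snapshot
        census[int(env.now) % 24].append(beds.count)
        yield env.timeout(1)

random.seed(1)
env = simpy.Environment()
beds = simpy.Resource(env, capacity=10)
census = [[] for _ in range(24)]
env.process(arrivals(env, beds))
env.process(monitor(env, beds, census))
env.run(until=24 * 30)                                  # 30 simulated days

sim_hourly = [statistics.mean(h) for h in census]
# Validation step: compare sim_hourly against observed hourly census
# from ED logs (not shown), before and after the modelled intervention.
```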

Proceedings ArticleDOI
08 Dec 2013
TL;DR: This paper argues for a new approach to verification and validation that leverages two techniques from computer science, model checking and automated debugging, and that eventually will be able to yield feedback to SMEs in a familiar language.
Abstract: The process of developing, verifying and validating models and simulations should be straightforward. Unfortunately, following conventional development approaches can render a model design that appeared complete and robust into an incomplete, incoherent and invalid simulation during implementation. An alternative approach is for subject matter experts (SMEs) to employ formal methods to describe their models. However, formal methods are rarely used in practice due to their intimidating syntax and semantics rooted in mathematics. In this paper we argue for a new approach to verification and validation that leverages two techniques from computer science: (1) model checking and (2) automated debugging. The proposed vision offers an initial path to replace conventional simulation verification and validation methods with new automated analyses that eventually will be able to yield feedback to SMEs in a familiar language.

Journal ArticleDOI
TL;DR: The goal of the research is to support offline transformation analysis by automated methods, where offline means that only the definition of the program itself and the language definitions of its source and target models are used during the analysis; hence the results are independent of concrete source models and the analysis needs to be performed only once.
Abstract: Model processing programs are regularly used when working with models or synthesizing code from them; therefore, their verification has become an essential component of constructing reliable software in model-based software engineering. Models are usually formalized and visualized as graphs; therefore, model processing programs based on algebraic graph rewriting systems (such programs are called model transformations) are often applied, and their verification has become an important research area. The goal of our research is to support offline transformation analysis by automated methods, where offline means that only the definition of the program itself and the language definitions of its source and target models are used during the analysis. Therefore, the results are independent of concrete source models, and the analysis needs to be performed only once. Based on previous work, this paper provides the synthesis of a set of individual components and improves them to provide a complete verification solution: (i) a language is introduced to specify the properties to be verified; (ii) a formalism is given to describe model transformations in a declarative way; and (iii) automated algorithms are provided that can analyse the declarative transformations as well as the properties expressed by the language. Besides its theoretical basis, the implementation of a verification framework is presented, and its operation is illustrated on a case study. Although the formal verification of model transformation properties is algorithmically undecidable in general, our goal is to provide a practically usable, scoped framework that can largely facilitate the manual verification of model transformations.

Book ChapterDOI
06 May 2013
TL;DR: A prototypical version of a highly customisable approximate model checker is presented which was used in a range of experiments to verify properties of large scale models whose complexity prevents them from being amenable to conventional explicit or symbolic model checking.
Abstract: This paper focusses on the usefulness of approximate probabilistic model checking for the internal and external validation of large-scale agent-based simulations. We describe the translation of typical validation criteria into a variant of linear time logic. We further present a prototypical version of a highly customisable approximate model checker which we used in a range of experiments to verify properties of large scale models whose complexity prevents them from being amenable to conventional explicit or symbolic model checking.
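The core loop of an approximate probabilistic model checker is simple: sample enough independent simulation traces to estimate, within a chosen error and confidence, the probability that a trace satisfies a (bounded) temporal property. A generic sketch using the Chernoff-Hoeffding sample bound, with a toy random-walk model standing in for a large agent-based simulation:

```python
import math
import random

def approx_check(run_model, trace_property, eps=0.05, delta=0.01, seed=0):
    """Estimate P(property holds on a random finite trace) to within
    +/- eps with confidence 1 - delta (Chernoff-Hoeffding bound).

    run_model(rng) -> one sampled trace; trace_property(trace) -> bool.
    """
    n = math.ceil(math.log(2 / delta) / (2 * eps ** 2))
    rng = random.Random(seed)
    hits = sum(trace_property(run_model(rng)) for _ in range(n))
    return hits / n, n

# Example bounded "eventually" criterion on a toy random-walk model:
def run_model(rng, horizon=100):
    x, trace = 0, []
    for _ in range(horizon):
        x += rng.choice((-1, 1))
        trace.append(x)
    return trace

p, n = approx_check(run_model, lambda tr: any(v >= 10 for v in tr))
```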

Proceedings ArticleDOI
08 Dec 2013
TL;DR: The concept of a validation frame of reference is introduced; a taxonomy of the frames of reference found in the validation literature is built; and the creation of a unifying theoretical framework is proposed, to build a foundation for the development of a paradigm of common theoretical and practical understanding of validation across the M&S field.
Abstract: This paper introduces the concept of a validation frame of reference for modeling and simulation (M&S); builds a taxonomy of the frames of reference found in the validation literature; and proposes the creation of a unifying theoretical framework for M&S validation, to build a foundation for the development of a paradigm of common theoretical and practical understanding of validation across the M&S field.

Proceedings ArticleDOI
08 Dec 2013
TL;DR: A characterization approach to developing a V&V techniques catalog that packages the available techniques together with information about their application conditions is presented, along with a planning and tailoring strategy for project-specific selection of the appropriate V&V techniques from the established catalog according to the goals and characteristics of a simulation study.
Abstract: Conducting verification and validation (V&V) of simulation models is supported here by (1) a characterization approach to developing a V&V techniques catalog that packages the available techniques together with information about their application conditions, and (2) a planning and tailoring strategy for project-specific selection of the appropriate V&V techniques from the established catalog according to the goals and characteristics of a simulation study.

Dissertation
29 Nov 2013
TL;DR: In this article, the authors present a tool, the generator of deadlock-freeness theorems, to automatically construct these theorems for Event-B models.
Abstract: This thesis aims at the specification, verification and validation of safety-critical systems with formal methods, in particular with Event-B. We assessed the usability of Event-B through the development of platooning control algorithms, especially how it scaled up from a simplified 1D version to a more realistic 2D version. The critical analysis of the 1D platooning model uncovered some anomalous behaviors. The difficulty of expressing deadlock-freeness theorems in Event-B motivated us to develop a tool, the generator of deadlock-freeness theorems, to construct these theorems automatically. Our assessment confirmed that mathematical proofs are not sufficient to assure the correctness of a formal specification: a formal specification should also be validated. We believe that validation activities, like verification activities, should be associated with each refinement during the development. To do that, we need better validation tools. The state-of-the-art tools that can execute Event-B models fail on highly non-deterministic models. Therefore we designed and implemented a new execution tool, JeB, which is a JavaScript simulation framework for Event-B. JeB allows users to safely insert hand-coded pieces of code to supply deterministic computations where the automatic translation fails. To achieve this goal, we have defined a set of proof obligations which, when discharged, guarantee the correctness of the simulations with respect to the model.
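The theorems the generator constructs have the standard Event-B deadlock-freeness shape: under the machine invariant, at least one event guard must be enabled. Schematically (our notation):

```latex
% Deadlock-freeness for a machine with invariant I(v) and events
% e_1, ..., e_n guarded by G_1(v), ..., G_n(v):
\mathrm{DLF}:\quad I(v) \;\Rightarrow\; \bigvee_{i=1}^{n} G_i(v)
```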

ReportDOI
01 Sep 2013
TL;DR: The purpose of the present work is to develop methodology to roll up validation results to an application, and to assess the impact the validation hierarchy design has on the roll-up results.
Abstract: Suites of experiments are performed over a validation hierarchy to test computational simulation models for complex applications. Experiments within the hierarchy can be performed at different conditions and configurations than those for an intended application, with each experiment testing only part of the physics relevant for the application. The purpose of the present work is to develop methodology to roll up validation results to an application, and to assess the impact the validation hierarchy design has on the roll-up results. The roll-up is accomplished through the development of a meta-model that relates validation measurements throughout a hierarchy to the desired response quantities for the target application. The meta-model is developed using the computational simulation models for the experiments and the application. The meta-model approach is applied to a series of example transport problems that represent complete and incomplete coverage of the physics of the target application by the validation experiments.
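One simple realisation of such a meta-model, purely to fix ideas, is a linear regression fitted on simulated experiment responses and the simulated application response, then fed the actual validation measurements. The NumPy sketch below uses synthetic stand-in data; the report's meta-model and its uncertainty treatment are more sophisticated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated responses of 3 hierarchy experiments at 40 sampled model
# parameter sets, plus the corresponding simulated application response.
theta = rng.normal(size=(40, 3))
exp_responses = theta @ rng.normal(size=(3, 3)) \
    + 0.01 * rng.normal(size=(40, 3))
app_response = theta @ rng.normal(size=3)

# Meta-model: application response as a linear map of experiment responses.
W, *_ = np.linalg.lstsq(exp_responses, app_response, rcond=None)

# Roll-up: push the actual validation measurements through the meta-model
# to predict the application response from hierarchy-level evidence.
measurements = exp_responses[0] + 0.05   # stand-in measured values
predicted_app = measurements @ W
```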

Journal Article
TL;DR: In this paper, a fault diagnosis system for four common faults of the commercial aircraft engine was developed based on a bank of hybrid Kalman filters, and was evaluated using the Monte Carlo simulation method at both healthy and degraded conditions.
Abstract: An aircraft engine fault diagnosis system is of great significance for improving engine safety and reliability. Its performance evaluation indicators include the rates of fault detection, fault isolation and false alarm. A fault diagnosis system for four common faults of a commercial aircraft engine was developed based on a bank of hybrid Kalman filters. The performance evaluation indicators of the fault diagnosis system were evaluated using the Monte Carlo simulation method at both healthy and degraded conditions. The simulation results show that at these two conditions the fault detection rate of the system is above 98%, the fault isolation rate is above 90%, and the false alarm rate is below 1%. This demonstrates that the fault diagnosis system meets the needs of practical applications.
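The diagnosis scheme (a bank of Kalman filters, one per fault hypothesis, selecting the hypothesis whose filter best explains the measurements) together with Monte Carlo evaluation of the detection rate can be sketched on a scalar toy system. All models and numbers below are invented; the paper's hybrid filters and engine model are far more detailed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Scalar toy system x' = a x + w, y = x + v, with one filter per
# hypothesis (nominal + 2 fault modes that change the dynamics `a`).
A_HYP = {"nominal": 0.95, "fault1": 0.80, "fault2": 1.02}
Q, R = 0.01, 0.04

def kf_bank_diagnose(ys):
    """Return the hypothesis whose Kalman filter best explains the
    measurements (smallest normalized innovation energy)."""
    score = {}
    for name, a in A_HYP.items():
        x, p, s = 0.0, 1.0, 0.0
        for y in ys:
            x, p = a * x, a * a * p + Q        # predict
            innov, s_k = y - x, p + R          # innovation and its variance
            s += innov * innov / s_k           # accumulate evidence
            g = p / s_k                        # Kalman gain
            x, p = x + g * innov, (1 - g) * p  # update
        score[name] = s
    return min(score, key=score.get)

# Monte Carlo evaluation of the detection rate for `fault1`.
hits = 0
for _ in range(200):
    x, ys = 1.0, []
    for _ in range(100):
        x = A_HYP["fault1"] * x + rng.normal(0, Q ** 0.5)
        ys.append(x + rng.normal(0, R ** 0.5))
    hits += kf_bank_diagnose(ys) == "fault1"
rate = hits / 200
```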

Proceedings ArticleDOI
04 Aug 2013
TL;DR: This work aims to introduce the first necessary step, herein the creation of a domain ontology for formally characterizing reusable numerical models, to reduce the knowledge gap between model provider and user and to achieve a higher level of reuse.
Abstract: Complex product development processes are evolving towards simulation-driven design, which leads to many heterogeneous computational models and design teams that interact with each other. However, this interaction creates a bottleneck for communication and model reuse throughout the design process because, very often, the model providers (i.e., analysts) and model users (i.e., designers) do not have the same level of understanding. In addition, tools such as PDM (Product Data Management) or SDM (Simulation Data Management) treat numerical models as black-box documents and cannot access or link the parameters and variables of models. This poverty of semantics in terms of simulation logic and design leads to a lack of interoperability between the contributing disciplinary simulation components, herein called numerical models. To reinforce these semantics, it is necessary to create a semantically rich model characterization support to reduce the knowledge gap between model provider and user, and to achieve a higher level of reuse. This work introduces the first necessary step, the creation of a domain ontology for formally characterizing reusable numerical models. Based on this common vocabulary, in an automotive context, a Model Identity Card (MIC) is developed as an intermediate support which characterizes a model by five attributes: Physical Object, Interface, Methods, Means Usage, and Validation and Verification. The MIC is illustrated with a Vehicle Thermic Comfort model example, and a computer interface is developed to collect a series of representative MICs in a database.
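A minimal data-structure rendering of the MIC's five attributes might look like the following Python dataclass; the field types and example values are guesses for illustration, not the paper's schema:

```python
from dataclasses import dataclass, field

@dataclass
class ModelIdentityCard:
    """Sketch of the five MIC attributes named in the abstract."""
    physical_object: str                              # what the model represents
    interface: dict = field(default_factory=dict)     # exposed parameters/variables
    methods: list = field(default_factory=list)       # solution methods used
    means_usage: str = ""                             # intended usage context
    validation_verification: str = ""                 # V&V status and evidence

mic = ModelIdentityCard(
    physical_object="Vehicle thermic comfort",
    interface={"cabin_temp": "degC", "airflow": "m3/h"},
    methods=["lumped thermal network"],
)
```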