
Showing papers in "Journal of Simulation in 2011"


Journal ArticleDOI
TL;DR: This paper describes how R&C can be achieved by using a simulation conceptual model (CM) in a community of interest (COI) and presents how a CM developed for a COI can assist in R&C for the design of any type of large-scale complex M&S application in that COI.
Abstract: Reusability and composability (R&C) are two important quality characteristics that have been very difficult to achieve in the Modelling and Simulation (M&S) discipline. Reuse provides many technical and economical benefits. Composability has been increasingly crucial for M&S of a system of systems, in which disparate systems are composed with each other. The purpose of this paper is to describe how R&C can be achieved by using a simulation conceptual model (CM) in a community of interest (COI). We address R&C in a multifaceted manner covering many M&S areas (types). M&S is commonly employed where R&C are very much needed by many COIs. We present how a CM developed for a COI can assist in R&C for the design of any type of large-scale complex M&S application in that COI. A CM becomes an asset for a COI and offers significant economic benefits through its broader applicability and more effective utilization.

52 citations


Journal ArticleDOI
TL;DR: Based on a review of the literature on related topics, and on experience in applying simulation, some ‘warnings’ for the user community are compiled.
Abstract: Discrete-event simulation modelling is a powerful systems analysis tool. However, in practice, several mistakes can compromise a simulation study that might lead the decision maker to the wrong conclusion. Based on our review of the literature on related topics, and our experience in applying simulation, we have compiled some ‘warnings’ for the user community. These warnings are grouped into seven categories as follows: Data Collection, Model Building, Verification and Validation, Analysis, Simulation Graphics, Managing the Simulation Process, and Human Factors, Knowledge, and Abilities.

39 citations


Journal ArticleDOI
TL;DR: The paper scrutinizes possibilities and fundamental limits for such a balance with a focus on simulation model interoperability and ontology-driven development based on experiences with ontologies in military projects.
Abstract: In modelling and simulation, ontologies can be used for the formal definition of methods and techniques (methodological ontologies), as well as for the representation of parts of reality (referential ontologies), like manufacturing or military systems, for example. Such ontologies are two sided: they are both models of a certain body of knowledge and models for automated information processing and further implementation. The first function of ontologies as pre-images (models of) has a strong epistemic nature especially for referential ontologies since they try to capture pieces of the ‘semantic relations of the real world’. The second function as models for further processing, in contrast, is completely normative in nature—it is a specification of a ‘formal semantics’. Unfortunately, the ideal realization of ontologies as epistemic models differs from the normative ideal. As specifications, ontologies have to be as precise (unequivocal) as possible; as representations of reality, in contrast, they have to be as descriptive as possible, which may imply ambiguity and even inconsistency in some domains. Ontology processing is particularly challenging as balancing these ideals is a domain specific task. The paper scrutinizes possibilities and fundamental limits for such a balance with a focus on simulation model interoperability and ontology-driven development based on experiences with ontologies in military projects.

33 citations


Journal ArticleDOI
TL;DR: This paper uses OMG SysML™ to create an ontology implementation referred to as a domain-specific language, or DSL, for a class of simulation applications; the DSL is used to create a specific (conceptual) user model for a problem in the domain.
Abstract: The challenges in cost-effectively deploying simulation technology are well known. Two major challenges are creating an appropriate conceptual model and translating that conceptual model correctly into a computational model. Ontologies have been widely discussed as one mechanism for capturing modelling knowledge in a reusable form, making it effectively available in the conceptual and computational modelling phases. In this paper, we show how ontologies can be effectively deployed in simulation using recent innovations from systems engineering and software engineering. We use OMG SysML™ to create an ontology implementation referred to as a domain-specific language, or DSL, for a class of simulation applications; the DSL is used to create a specific (conceptual) user model for a problem in the domain. We then use model transformation to automate the translation to a computational simulation model. Two proof-of-concept implementations are described, one using a legacy simulation language, and another using an object-oriented simulation language.

32 citations


Journal ArticleDOI
TL;DR: ED performance increased substantially when the utilisation of physicians, treatment areas and critical bed levels was balanced, and an optimisation model was built to address the multi-criteria objectives.
Abstract: This paper analysed and optimised the patient flow of a public hospital emergency department (ED). The simulation model enabled a detailed analysis of variables in the system without the chaos and ...

25 citations


Journal ArticleDOI
TL;DR: The Automated Simulation Output Analyser identifies the warm-up period, estimates the number of replications, and/or analyses output from a single run, with the aim of providing the user with accurate and precise measures of their chosen output statistics.
Abstract: There are two key issues in assuring the accuracy of estimates of performance obtained from a simulation model. The first is the removal of any initialisation bias; the second is ensuring that enough output data are produced to obtain an accurate estimate of performance. Our aim is to produce an automated procedure for inclusion into commercial simulation software to address both of these issues. This paper describes the results of a 3-year project to produce such an analyser. Our Automated Simulation Output Analyser identifies the warm-up period, estimates the number of replications, and/or analyses output from a single run, with the aim of providing the user with accurate and precise measures of their chosen output statistics.
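To give a concrete feel for the replication side of such an analyser, the sketch below applies a generic confidence-interval precision rule: keep adding replications until the half-width of the confidence interval for the chosen output statistic falls within a target fraction of its mean. This is only an illustration of that general rule, not the published analyser's algorithm; `run_replication`, the 5% precision target and the replication cap are assumed names and values.

```python
import statistics
from math import sqrt
from scipy.stats import t  # Student-t quantile for the confidence-interval half-width

def replications_needed(run_replication, initial=5, precision=0.05, alpha=0.05, max_reps=200):
    """Add replications until the CI half-width is within `precision` of the mean.

    `run_replication(i)` is assumed to perform one independent replication and
    return the chosen output statistic. This is a generic precision rule, not
    the Automated Simulation Output Analyser's actual procedure.
    """
    data = [run_replication(i) for i in range(initial)]
    while True:
        n = len(data)
        mean = statistics.mean(data)
        half_width = t.ppf(1 - alpha / 2, n - 1) * statistics.stdev(data) / sqrt(n)
        if (mean != 0 and half_width / abs(mean) <= precision) or n >= max_reps:
            return n, mean, half_width          # precision reached (or replication cap hit)
        data.append(run_replication(n))         # otherwise run one more replication
```

In practice `run_replication` would wrap a single run of the simulation model with a fresh random-number stream and return the chosen output statistic.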

23 citations


Journal ArticleDOI
TL;DR: This paper models the spread of pandemic influenza in a local community, a university, using a differential equations-based compartmental model built on the preparedness plan of Arizona State University, one of the biggest universities in the world, and evaluates the mitigation policies.
Abstract: Pandemic influenza preparedness plans strongly focus on efficient mitigation strategies including social distancing, logistics and medical response. These strategies are formed by multiple decision makers before a pandemic outbreak and during the pandemic in local communities, states and nation-wide. In this paper, we model the spread of pandemic influenza in a local community, a university, and evaluate the mitigation policies. Since the development of an appropriate vaccine requires a significant amount of time and available antiviral quantities can only cover a relatively small proportion of the population, university decision makers will first focus on non-pharmaceutical interventions. These interventions include social distancing and isolation. The disease spread is modelled as a differential equations-based compartmental model. The system is simulated for multiple non-pharmaceutical interventions, such as social distancing (including suspending university operations and evacuating dorms) and isolation of infected individuals on campus. Although the model is built based on the preparedness plan of one of the biggest universities in the world, Arizona State University, it can easily be generalized for other colleges and universities. The policies and the decisions are tested by several simulation runs, and evaluations of the mitigation strategies are presented in the paper.
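As an illustration of the kind of compartmental formulation the abstract refers to, the sketch below integrates a generic SEIR model in which social distancing is represented as a multiplicative reduction of the transmission rate. The structure, parameter values and population size are placeholders chosen for illustration; this is not the authors' calibrated campus model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder parameters: transmission, incubation and recovery rates, plus a
# social-distancing factor that scales down transmission (1.0 = no intervention).
beta, sigma, gamma = 0.5, 1 / 3.0, 1 / 5.0
distancing = 0.6
N = 50_000                                   # illustrative campus population

def seir(t, y):
    s, e, i, r = y
    lam = distancing * beta * i / N          # force of infection under distancing
    return [-lam * s, lam * s - sigma * e, sigma * e - gamma * i, gamma * i]

y0 = [N - 10, 0, 10, 0]                      # start with 10 infectious people
sol = solve_ivp(seir, (0, 120), y0, t_eval=np.linspace(0, 120, 121))
peak_day = sol.t[np.argmax(sol.y[2])]
print(f"peak infections: {sol.y[2].max():.0f} on day {peak_day:.0f}")
```

Re-running the integration with different `distancing` values gives the kind of intervention comparison the paper performs with its own, more detailed model.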

21 citations


Journal ArticleDOI
TL;DR: A formal framework is defined for capturing and applying the knowledge needed to automatically generate system-level analysis models from system-level descriptive models, with the knowledge represented in the Object Management Group's Systems Modeling Language.
Abstract: During the systems design process, there are a multitude of analyses and computer simulations that are performed to evaluate a particular design or architecture. This paper focuses on automating this process by defining a formal framework for capturing and applying the knowledge needed to automatically generate system-level analysis models from system-level descriptive models. The framework builds on the similarities that exist between analytical and descriptive models when considered from a systems perspective, namely, as consisting of sub-systems or components and the interactions between them. The relationships between analytical and descriptive models are captured at the component level in multi-aspect component models (MAsCoMs). The information in MAsCoMs is represented formally in the Object Management Group's Systems Modeling Language and can then be applied automatically through the use of generic model transformations. The transformations apply to all models in a certain domain, such as dynamic simulation modelling. In this paper, the approach is demonstrated for a hydraulic system by generating a system-level dynamic simulation from a descriptive model of the hydraulic circuit.

21 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigate discrete-event and agent-based modeling and simulation approaches, whose joint application allows them to develop simulation models that are heterogeneous and more life-like, though this poses a new research question: when developing such simulation models one still has to abstract from the real world, ideally in such a way that the ‘essence’ of the system is still captured.
Abstract: Models to understand the impact of management practices on retail performance are often simplistic and assume low levels of noise and linearity. Of course, in real life, retail operations are dynamic, nonlinear and complex. To overcome these limitations, we investigate discrete-event and agent-based modeling and simulation approaches. The joint application of both approaches allows us to develop simulation models that are heterogeneous and more life-like, though poses a new research question: When developing such simulation models one still has to abstract from the real world, however, ideally in such a way that the ‘essence’ of the system is still captured. The question is how much detail is needed to capture this essence, as simulation models can be developed at different levels of abstraction. In the literature the appropriate level of abstraction for a particular case study is often more of an art than a science. In this paper, we aim to study this question more systematically by using a retail branch simulation model to investigate which level of model accuracy obtains meaningful results for practitioners. Our results show the effects of adding different levels of detail and we conclude that this type of study is very valuable to gain insight into what is really important in a model.

16 citations


Journal ArticleDOI
TL;DR: A method to model and simulate non-stationary, non-renewal arrival processes that depends only on the analyst setting intuitive and easily controllable parameters is introduced, suitable for assessing the impact of non-stationary, non-exponential, and non-independent arrivals on simulated performance when they are suspected.
Abstract: This paper introduces a method to model and simulate non-stationary, non-renewal arrival processes that depends only on the analyst setting intuitive and easily controllable parameters. Thus, it is suitable for assessing the impact of non-stationary, non-exponential, and non-independent arrivals on simulated performance when they are suspected. A specific implementation of the method is also described and provided for download.
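A common building block for the non-stationary part of such arrival processes is thinning: generate candidate arrivals at the maximum rate and accept each with probability λ(t)/λmax. The sketch below shows plain Lewis–Shedler thinning for a time-varying Poisson process only; the paper's method goes further by also handling non-exponential and correlated inter-arrival times, which this sketch does not attempt. The rate function, horizon and rate cap are invented for illustration.

```python
import math
import random

def rate(t):
    """Illustrative time-varying arrival rate (per minute) with a daily cycle."""
    return 2.0 + 1.5 * math.sin(2 * math.pi * t / 1440.0)

def thinned_arrivals(horizon, rate_fn, rate_max):
    """Non-homogeneous Poisson arrivals on [0, horizon) via Lewis-Shedler thinning."""
    t, times = 0.0, []
    while True:
        t += random.expovariate(rate_max)            # candidate from the bounding process
        if t >= horizon:
            return times
        if random.random() <= rate_fn(t) / rate_max:  # accept with probability rate(t)/rate_max
            times.append(t)

arrivals = thinned_arrivals(horizon=1440.0, rate_fn=rate, rate_max=3.5)
print(len(arrivals), "arrivals in one simulated day")
```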

14 citations


Journal ArticleDOI
A Beck1
TL;DR: This paper describes how simulation was adopted to model passenger flows in a new airport terminal, how the model is being used post-opening, and what impact the modelling has had and is still having.
Abstract: This paper is an expanded version of a presentation given at the final of the President's Medal during the Operational Research Society's 50th conference held in York. The paper describes how simula...

Journal ArticleDOI
TL;DR: A simulation queueing theory model with three input streams consisting of Emergency, Direct Elective and Direct Transfers is designed to examine the question of how bed counts impact access to acute care services in BC's hospitals and an optimization algorithm is developed to determine optimal bed allocations among hospital segments.
Abstract: In Canada, acute care refers to the in-hospital treatment of a disease during its initial phases (typically measured in days). Determining how bed availability impacts patient access to care is of great interest to the British Columbia (BC) Ministry of Health Services and to the health service delivery industry worldwide. In this article we discuss a simulation queueing theory model with three input streams consisting of Emergency, Direct Elective and Direct Transfers, designed to examine the question of how bed counts impact access to acute care services in BC's hospitals. We further develop an optimization algorithm to determine optimal bed allocations among hospital segments in order to minimize average patient wait time. The algorithm is demonstrated on an exemplar hospital.
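A rough analytical counterpart to this kind of model treats each hospital segment as an M/M/c queue and assigns additional beds greedily to whichever segment gains the largest reduction in expected wait. The sketch below illustrates only that marginal-allocation idea under Markovian assumptions; it is not the authors' simulation model or optimization algorithm, and the arrival rates, lengths of stay and bed budget are invented.

```python
from math import factorial

def erlang_c_wait(lam, mu, c):
    """Mean queueing delay in an M/M/c queue: arrival rate lam, service rate mu, c beds."""
    a = lam / mu                                     # offered load
    if c <= a:
        return float("inf")                          # unstable: not enough beds
    p_wait = (a**c / factorial(c)) * (c / (c - a)) / (
        sum(a**k / factorial(k) for k in range(c)) + (a**c / factorial(c)) * (c / (c - a))
    )
    return p_wait / (c * mu - lam)

# Illustrative segments: (arrivals per day, 1 / mean length of stay in days).
segments = {"Emergency": (12.0, 1 / 3.0), "Elective": (6.0, 1 / 4.0), "Transfer": (3.0, 1 / 5.0)}
beds = {name: int(lam / mu) + 1 for name, (lam, mu) in segments.items()}  # minimal stable start

for _ in range(20):                                  # allocate 20 extra beds greedily
    best = max(segments, key=lambda s: erlang_c_wait(*segments[s], beds[s])
                                       - erlang_c_wait(*segments[s], beds[s] + 1))
    beds[best] += 1
print(beds)
```

A heuristic like this can serve as a baseline against which a detailed simulation-plus-optimization study of the kind described above is compared.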

Journal ArticleDOI
TL;DR: Several experts in the field have recognized, during panel discussions at recent conferences, that the focus of effort needs to shift from the implementation aspects to the conceptual aspects of modelling and simulation as well.
Abstract: Two emerging trends in modelling and simulation (M&S) are beginning to dovetail in a potentially highly productive manner, namely conceptual modelling and semantic modelling. Conceptual modelling has existed for several decades, but its importance has risen to the forefront in the last decade (Taylor and Robinson, 2006; Robinson, 2007). Also, during the last decade, progress on the Semantic Web has begun to influence M&S, with the development of general modelling ontologies (Miller et al, 2004), as well as ontologies for modelling particular domains (Durak et al, 2006). An ontology, which is a formal specification of a conceptualization (Gruber, 1993), can be used to rigorously define a domain of discourse in terms of classes/concepts, properties/relationships and instances/individuals. For the Semantic Web, ontologies are typically specified using the Web Ontology Language (OWL). Although conceptual modelling is broader than just semantics (it includes additional issues such as pragmatics (Tolk et al, 2008)), progress in the Semantic Web and ontologies is certainly beneficial to conceptual modelling. Benefits are accrued in many ways, including the large knowledge bases being placed on the Web in numerous fields in which simulation studies are conducted and the powerful reasoning algorithms based on description logic being developed that allow the consistency of large specifications to be checked. Conceptual and semantic models are useful for developing executable simulations in general, but are particularly helpful in supporting composability and interoperability. Interoperability of simulation systems is concerned with the correctness of interactions among components in the simulation environment and builds on the composability of their underlying models. In order to fully utilize and share the underlying models, the interactions have to be made explicit, which requires well-documented conceptual models as well as their implementations. Several best practices and even standardized methods exist to support the integration of two or more simulations in order to provide a broader basis for M&S-based research, as envisioned in the 1996 National Science Foundation (NSF) Report on ‘Simulation-based Engineering Science’, which shows the potential of using simulation technology and methods to revolutionize engineering science. In addition, several experts in the field have recognized, during panel discussions at recent conferences, that the focus of effort needs to shift from the implementation aspects to the conceptual aspects of modelling and simulation as well. In all these discussions, the value and necessity of simulation integration efforts were recognized as a necessary part of interoperability and composability, but it was also recognized that these efforts are not sufficient. While the community agrees in principle on the necessity of unambiguous and machine-readable documentation of the conceptual component of M&S, the details of different approaches are currently not well aligned (Balci et al, 2010). However, one discussion topic is most often observed when it comes to conceptual modelling: the increased use of ontological means supporting precisely defined formal models that capture semantics, yet afford more flexibility in syntax. In particular, the experts of several related panels expressed their conviction that ontologies offer a means for enhancing composability and interoperability among models and simulations developed independently. The rationale emerging from these discussions in support of this belief is that an ontology is a formal specification of a conceptualization, which fulfills the requirements for a conceptual model.

Journal ArticleDOI
TL;DR: A high performance spreadsheet simulation system called S3 is presented, which adds the power of parallel computing on a Windows-based desktop grid to popular Excel models by using standard Web Services and a Service-Oriented Architecture.
Abstract: In this paper, a high performance spreadsheet simulation system called S3 is presented. Our approach is to add the power of parallel computing on a Windows-based desktop grid to popular Excel models by...

Journal ArticleDOI
TL;DR: Two agent-based models spanning the problem domain are described, which capture the key ideas of complexity within a PSO context, taking account of the complex interactions between the peacekeepers, civilians, insurgents and non-governmental organisations involved.
Abstract: This paper firstly discusses the modelling of Peace Support Operations (PSO) within the defence simulation modelling context. It then provides a summary background of the current relevant approaches in such modelling, taking account of the increasing complexity of the strategic environment, and the relevance of ideas from Complex Adaptive Systems theory. It goes on to describe the details of two agent-based models spanning the problem domain, which capture the key ideas of complexity, within a PSO context, taking account of the complex interactions between peacekeepers, civilians, insurgents and nongovernmental organisations involved.

Journal ArticleDOI
TL;DR: This paper applies an ontology-based simulation development methodology to fulfil the functional requirements of a trajectory simulation while targeting reuse through interoperability and composability, and demonstrates the approach to achieving composable and interoperable simulations through a case study.
Abstract: Trajectory simulation is a software module that computes the flight path and flight parameters of munitions. It is used throughout the engineering process, including simulations for studying the de...

Journal ArticleDOI
TL;DR: This work presents a methodology in which domain ontologies are rendered in the human-interface layer to support the construction of dynamic models and their corresponding visualizations, providing a novel and accessible way to author visualizations and a framework for creating new teaching tools that merge concrete and abstract knowledge.
Abstract: The purpose of an ontology is to structurally define knowledge about a topic. For the practice of simulation, ontologies have been demonstrated to be useful across a variety of techniques ranging f...

Journal ArticleDOI
TL;DR: Simulation experiments with models for large-scale wafer fabs are performed to evaluate multiple orders per job formation and release strategies, and a complementary analytical method is discussed for determining an acceptable number of FOUPs given a prescribed order release rate.
Abstract: In this paper, multiple orders per job type formation and release strategies are described for semiconductor wafer fabrication facilities (wafer fabs). Different orders are grouped into one job because orders of an individual customer very often fill only a portion of a Front-Opening Unified Pod (FOUP). After job formation, an FOUP is assigned to each job and is used to move the job throughout the wafer fab. We determine an acceptable number of FOUPs given a prescribed order release rate to find appropriate values for on-time delivery performance measures, cycle time, and throughput by discrete event simulation. On the other hand, given a prescribed number of FOUPs and target values for these performance measures, we look for an appropriate order release rate. Simulation experiments with models for large-scale wafer fabs are performed to solve these two problems. We also discuss a complementary analytical method to determine an appropriate number of FOUPs in some specific situations.
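The link between the two problems in the abstract is essentially Little's law: with one FOUP per job, the FOUP count plays the role of the work-in-process level, so WIP = throughput × cycle time ties the release rate, the cycle-time target and the number of FOUPs together. A back-of-the-envelope check along those lines, with invented numbers rather than the paper's data, looks like this:

```python
# Little's law: WIP = throughput * cycle_time.
# With one FOUP per job, the FOUP count bounds the number of jobs in process.
release_rate_per_day = 40.0        # prescribed order/job release rate (illustrative)
target_cycle_time_days = 12.0      # desired average cycle time (illustrative)

foups_needed = release_rate_per_day * target_cycle_time_days
print(f"FOUPs needed to sustain the release rate at the target cycle time: {foups_needed:.0f}")

# Conversely, a fixed FOUP count caps the sustainable release rate for a given cycle time.
available_foups = 400
max_release_rate = available_foups / target_cycle_time_days
print(f"Max release rate with {available_foups} FOUPs: {max_release_rate:.1f} jobs/day")
```

The simulation experiments in the paper refine this crude relationship by accounting for congestion, on-time delivery measures and fab-specific dynamics.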

Journal ArticleDOI
TL;DR: The required sample sizes for equivalence testing at a specified confidence are derived, and an indifference-zone procedure is developed to select only the best systems; the selected subset is guaranteed to contain all the best systems and none of the systems that deviate more than a specified amount from the best systems.
Abstract: This paper investigates the hypothesis testing of equivalence of mean of multiple systems using the range statistic. The procedure allows unequal variances among systems. Equivalence testing has many applications in screening for new product development. First, the required sample sizes for equivalence testing at a specified confidence are derived. We then develop an indifference-zone procedure to select only the best systems. For a specified confidence level P*, the selected subset is guaranteed to contain all the best systems and none of the systems that deviate more than a specified amount from the best systems. A table of critical constants is provided for implementing the procedure. An experimental performance evaluation demonstrates the validity and efficiency of the procedure.
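One common form of such a subset-selection rule (for a 'bigger is better' performance measure) keeps every system whose sample mean is within an allowance of the best observed mean, with the allowance depending on the system's own variability, the sample size and the indifference-zone parameter. The sketch below illustrates that general shape only; the constant `h` stands in for the tabulated critical constants of the published procedure, and the data are invented.

```python
import statistics
from math import sqrt

def select_subset(samples, delta, h):
    """Keep each system whose sample mean is within an allowance of the best.

    `samples` maps system name -> list of replicated outputs (bigger is better),
    `delta` is the indifference-zone parameter, and `h` stands in for a tabulated
    critical constant (illustrative value, not taken from the paper).
    """
    stats = {k: (statistics.mean(v), statistics.stdev(v), len(v)) for k, v in samples.items()}
    best_mean = max(m for m, _, _ in stats.values())
    keep = []
    for name, (mean, sd, n) in stats.items():
        allowance = max(0.0, h * sd / sqrt(n) - delta)   # variance-aware allowance per system
        if mean >= best_mean - allowance:
            keep.append(name)
    return keep

data = {"A": [10.2, 9.8, 10.5, 10.1], "B": [9.1, 9.4, 9.0, 9.3], "C": [10.0, 10.3, 9.9, 10.4]}
print(select_subset(data, delta=0.2, h=2.5))   # keeps A and C for this toy data, drops B
```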

Journal ArticleDOI
TL;DR: The Collaborative DEVS Modelling (CDM) approach and its realization based on the Computer Supported Collaborative Work (CSCW) and the Discrete Event System Specification (DEVS) concepts and technologies are proposed.
Abstract: Collaborative modelling enables dispersed users to develop component-based system models in group settings. A realization of such an approach requires coordinating and maintaining the causality of the users’ activities. We propose the Collaborative DEVS Modelling (CDM) approach and its realization based on the Computer Supported Collaborative Work (CSCW) and the Discrete Event System Specification (DEVS) concepts and technologies. The CSCW concepts are introduced into the DEVS modelling framework in order to support model development in virtual team settings. A set of modelling rules and tasks enabling collaborative, visual, and persistent model construction and synthesis is developed. To allow separate groups of modellers to develop models independently, the realization of CDM supports independent modelling sessions. An illustrative example is developed to demonstrate collaborative and incremental model development. The design of the CDM realization and future research are briefly described.

Journal ArticleDOI
TL;DR: This work describes a complementary approach that uses Web Ontology Language and Semantic Web Rule Language to capture information about the roles and capabilities required to complete a task, and the detailed attributes of candidate resources.
Abstract: Military training and testing events integrate a diverse set of live and simulated systems, most of which were built independently and weren’t specifically designed to work together. Data interoperability and service-oriented architecture approaches, while essential, do not provide a complete solution to ensuring that systems will be fully compatible in their interactions. We describe a complementary approach that uses Web Ontology Language and Semantic Web Rule Language to capture information about the roles and capabilities required to complete a task, and the detailed attributes of candidate resources. Our toolset applies automated reasoning to determine whether each candidate resource has the requisite capabilities and is compatible with other resources. If there are multiple candidates for a role, the reasoner ranks the relative goodness of each with respect to constraints and metrics that are appropriate for the specific task needs of the exercise or deployment. We include worked examples illustrating the kinds of information we capture about resources and how rules and constraints are applied to provide a nuanced assessment of their compatibility in a specific context.
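Stripped of the OWL/SWRL machinery, the core matching step pairs each required role's capabilities against candidate resources and ranks feasible candidates by how well they satisfy soft constraints. The sketch below uses plain Python sets and a toy scoring function purely to illustrate that matching-and-ranking idea; it does not reproduce the authors' ontologies, rules, or toolset, and all names and capabilities are invented.

```python
# Illustrative capability matching and ranking (not the authors' OWL/SWRL toolset).
required = {"role": "ground-radar-feed",
            "must_have": {"track-report", "dis-protocol"},
            "prefer": {"low-latency", "encrypted"}}

candidates = {
    "RadarSimA": {"track-report", "dis-protocol", "low-latency"},
    "RadarSimB": {"track-report", "hla-protocol"},
    "LiveRadarC": {"track-report", "dis-protocol", "encrypted"},
}

def feasible(caps):                      # hard constraints: every required capability present
    return required["must_have"] <= caps

def score(caps):                         # soft constraints: count of preferred capabilities met
    return len(required["prefer"] & caps)

ranked = sorted((name for name, caps in candidates.items() if feasible(caps)),
                key=lambda name: score(candidates[name]), reverse=True)
print(ranked)                            # RadarSimB is excluded; the rest are ranked by score
```

An ontology-based reasoner adds what this sketch cannot: inferred capabilities via class hierarchies and rule-derived compatibility judgements rather than literal set membership.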

Journal ArticleDOI
TL;DR: A model of an Internet radio service is developed and validated using information gathered from a real service, so that it reflects the real behaviour of the users and of the devices and protocols involved in the service.
Abstract: Internet radios have become one of the most popular services on the Internet nowadays. This popularity has motivated the interest of the scientific community and a lot of research has been carried out in order to improve and study these services. The goal of this paper is to develop a model of an Internet radio service in order to help service managers to predict future situations, avoiding problems in advance. This model has been developed and validated using the information gathered from a real service; thus, it reflects the real behaviour of the users, and the devices and the protocols involved in the service. Also, the model of the service has been integrated in a model of a cable network configured to work under real load conditions. The results of the simulations allow us to draw conclusions about the performance of the service in a real scenario.

Journal ArticleDOI
TL;DR: It is suggested that simulation modelling should follow world market reality and become more balanced towards a customer service perspective, by working on the deficiencies of the current manufacturing-centred model.
Abstract: This paper builds upon a real case to bring a marketing point of view to simulation modelling and its current business model. The case applies mainstream simulation methodologies up to the delivery of results, and a success model to facilitate the analysis of the case's impact and success. With the use of modern marketing principles we extended our case model to a balanced integrated framework that combines good simulation modelling practice with new marketing principles. We suggest that simulation modelling should follow world market reality and become more balanced towards a customer service perspective, by working on the deficiencies of the current manufacturing-centred model.

Journal ArticleDOI
TL;DR: The motivations, methods, and solution concepts of a knowledge-driven framework for simulation application integration are described, and opportunities provided by recent advances in knowledge representation and ontology development methods are outlined.
Abstract: This paper describes the motivations, methods, and solution concepts of a knowledge-driven framework for simulation application integration. First, the solution ideas are motivated by providing a characterization of the challenges associated with simulation integration at the semantic level. Next, opportunities provided by recent advances in knowledge representation and ontology development methods are outlined. The important role of ontologies in simulation integration is then briefly described. Next, a method for knowledge-driven simulation application integration is described in detail. Illustrative application examples are then outlined in order to provide a flavour of the practical value of the research. Finally, we outline the potential benefits of the research.