
Showing papers in "Journal of Simulation in 2013"


Journal ArticleDOI
TL;DR: Three approaches to deciding model validity are described, two paradigms that relate verification and validation to the model development process are presented, and various validation techniques are defined.
Abstract: Verification and validation of simulation models are discussed in this paper. Three approaches to deciding model validity are described, two paradigms that relate verification and validation to the...

1,425 citations


Journal ArticleDOI
TL;DR: A meta-model is adopted in which possibly mobile, interconnected and communicating agents work according to a set of chemical-like laws; the resulting Alchemist framework retains the performance of known Stochastic Simulation Algorithms for (bio)chemistry while being tailored to the specific features of complex and situated computational systems.
Abstract: In this paper we address the engineering of complex and emerging computational systems featuring situatedness, adaptivity and self-organisation, such as pervasive computing applications in which humans and devices, immersed in a highly mobile environment, opportunistically interact to provide and exploit information services. We adopt a meta-model in which possibly mobile, interconnected and communicating agents work according to a set of chemical-like laws. Building on this view, substantiated by recent research on pervasive computing systems, we present the Alchemist simulation framework, which retains the performance of known Stochastic Simulation Algorithms for (bio)chemistry while being tailored to the specific features of complex and situated computational systems.
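The performance claim above points to stochastic simulation algorithms (SSAs) in the Gillespie family. As background only, here is a minimal sketch of the classic Gillespie direct method over a toy set of chemical-like rules; it is not Alchemist's engine, and the species names, rate constants and rules are purely illustrative.

```python
import random

def gillespie_direct(state, reactions, t_end, seed=0):
    """Minimal Gillespie direct-method SSA over chemical-like rules.

    state     : dict species -> count
    reactions : list of (rate_fn(state) -> propensity, update_fn(state))
    """
    rng = random.Random(seed)
    t, trace = 0.0, [(0.0, dict(state))]
    while t < t_end:
        props = [rate(state) for rate, _ in reactions]
        total = sum(props)
        if total <= 0:                      # no reaction can fire any more
            break
        t += rng.expovariate(total)         # time to the next event
        r, acc = rng.uniform(0, total), 0.0
        for p, (_, update) in zip(props, reactions):
            acc += p
            if r <= acc:                    # pick a reaction proportionally to its propensity
                update(state)
                break
        trace.append((t, dict(state)))
    return trace

# Toy rule set (illustrative only): A + B -> C at rate k, C -> A + B at rate k2.
k, k2 = 0.005, 0.1
reactions = [
    (lambda s: k * s["A"] * s["B"],
     lambda s: s.update(A=s["A"] - 1, B=s["B"] - 1, C=s["C"] + 1)),
    (lambda s: k2 * s["C"],
     lambda s: s.update(A=s["A"] + 1, B=s["B"] + 1, C=s["C"] - 1)),
]
print(gillespie_direct({"A": 100, "B": 80, "C": 0}, reactions, t_end=5.0)[-1])
```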

101 citations


Journal ArticleDOI
TL;DR: The quantitative assessment of the uncertainties affecting computer simulation results is addressed, a topic of major interest in both industrial and scientific communities.
Abstract: Quantitative assessment of the uncertainties tainting the results of computer simulations is nowadays a major topic of interest in both industrial and scientific communities. One of the key...

99 citations


Journal ArticleDOI
TL;DR: The MS-SDF comprises the SE processes of requirements capture, conceptual modelling, and verification and validation (V&V), and extends them to M&S, and uses model theory as a deductive apparatus in order to develop the framework.
Abstract: Whether by design or by practice, systems engineering (SE) processes are used more and more often in Modeling and Simulation (M&S). While the two disciplines are very close, there are some differences that must be taken into account in order to successfully reuse practices from one community to another. In this paper, we introduce the M&S System Development Framework (MS-SDF) that unifies SE and M&S processes. The MS-SDF comprises the SE processes of requirements capture, conceptual modelling, and verification and validation (V&V), and extends them to M&S. We use model theory as a deductive apparatus in order to develop the MS-SDF. We discuss the benefits of the MS-SDF especially in the selection between federation development and multi-model approaches and the design of composable models and simulations. Lastly, a real-life application example of the framework is provided.

90 citations


Journal ArticleDOI
TL;DR: A framework for generating a design, of specified size, that is nearly orthogonal and nearly balanced for any mix of factor types (categorical, numerical discrete, and numerical continuous) and mix of factor levels is proposed.
Abstract: Designed experiments are powerful methodologies for gaining insights into the behaviour of complex simulation models. In recent years, many new designs have been created to address the large number of factors and complex response surfaces that often arise in simulation studies, but handling discrete-valued or qualitative factors remains problematic. We propose a framework for generating a design, of specified size, that is nearly orthogonal and nearly balanced for any mix of factor types (categorical, numerical discrete, and numerical continuous) and mix of factor levels. These new designs allow decision makers structured methods for trade-off analyses in situations that are not necessarily amenable to other methods for choosing alternatives, such as simulation optimization or ranking and selection approaches. These new designs also compare well to existing approaches for constructing custom designs for smaller experiments, and may also be of interest for exploring computer models in domains where fewer factors are involved.
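The paper's construction algorithm is not reproduced here, but the two properties it targets are easy to check for any candidate design matrix: near-orthogonality (small pairwise column correlations) and near-balance (factor levels occurring about equally often). The sketch below assumes NumPy and uses an arbitrary small matrix purely as a stand-in for a real design.

```python
import numpy as np

def max_abs_correlation(design):
    """Largest absolute pairwise correlation between design columns
    (near-orthogonality criterion: smaller is better)."""
    corr = np.corrcoef(design, rowvar=False)
    off_diag = corr[~np.eye(corr.shape[0], dtype=bool)]
    return float(np.max(np.abs(off_diag)))

def balance_imbalance(column):
    """Deviation from perfect balance: max |count - n/levels| over the levels."""
    values, counts = np.unique(column, return_counts=True)
    target = len(column) / len(values)
    return float(np.max(np.abs(counts - target)))

# Illustrative 8-run design with one categorical and two numerical factors.
rng = np.random.default_rng(1)
design = np.column_stack([
    np.array([0, 1, 2, 0, 1, 2, 0, 1]),            # categorical factor coded as levels
    rng.permutation(np.linspace(-1, 1, 8)),         # numerical continuous factor
    rng.permutation(np.repeat([1, 2, 3, 4], 2)),    # numerical discrete factor
])
print("max |corr| :", max_abs_correlation(design))
print("imbalance  :", balance_imbalance(design[:, 0]))
```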

29 citations


Journal ArticleDOI
TL;DR: Some approaches to teaching ABS that the authors have successfully used in a range of classes and workshops are reported on.
Abstract: Agent-based simulation is a relatively new modeling technique that is being widely used by many disciplines to model complex adaptive systems. Few full-length courses exist on agent-based modeling, and a standard curriculum has not yet been established, yet there is considerable demand to include agent-based modeling in simulation courses. Modelers often come to agent-based simulation (ABS) by way of self-study or attendance at tutorials and short courses. Although there is substantial overlap, there are many aspects of agent-based modeling that differ from discrete-event simulation and System Dynamics, including the applicable problem domains, the disciplines and backgrounds of students, and the underpinnings of its computational implementation. These factors make agent-based modeling difficult to include as an incremental add-on to existing simulation courses. This paper’s contribution is to report on some approaches to teaching ABS that the authors have successfully used in a range of classes and workshops.

24 citations


Journal ArticleDOI
TL;DR: This paper adopts an approach to self-organising coordination based on biochemical tuple spaces, and shows how it can be applied to the simulation of complex interaction patterns of intracellular signalling pathways.
Abstract: Modelling the interaction among system components is a fundamental issue in complex system simulation. Simulation frameworks based on coordination models, that is, frameworks that explicitly handle interaction, are well suited to complex system simulation; those based on nature-inspired coordination models, in particular, are well suited for the simulation of complex natural systems. In this paper, we adopt an approach to self-organising coordination based on biochemical tuple spaces, and show how it can be applied to the simulation of complex interaction patterns of intracellular signalling pathways. We first present the model and a general high-level architecture; then we develop and discuss a simple case study, a single signalling pathway from the complex network of the Ras signalling pathways.

23 citations


Journal ArticleDOI
TL;DR: This article extends the modelling framework of Robinson to include data requirements in 5W1H format, revision table, complexity description (that specifies the level of detail and scope of the model), and other features (such as input/output definitions, process description and so on).
Abstract: Although conceptual modelling is one of the most significant steps in the modelling process for discrete-event simulation, this step deserves greater attention in the conduct of practical projects. There are many advantages in developing a conceptual model such as less rework, the availability of documentation for post-simulation study, revision, auditing, and the possibility that someone other than the modellers will conduct the model implementation. The conceptual modelling product can be expressed in different ways depending on the modelling framework adopted. This article extends the modelling framework of Robinson. Some of the extensions addressed include data requirements in 5W1H (What? When? Where? Why? Who? How?) format, a revision table (in order to register the changes in the conceptual model document), a complexity description (that specifies the level of detail and scope of the model), and other features (such as input/output definitions, process description and so on). At the end of this article, we provide a real case taken from a multinational logistics company using our framework.
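To make the 5W1H data-requirement element of the extended framework concrete, a minimal illustrative record is sketched below; the dataclass, the field descriptions and the sample entry are placeholders of our own, not the template published in the article.

```python
from dataclasses import dataclass, asdict

@dataclass
class DataRequirement5W1H:
    """One data requirement documented in 5W1H form, as the extended
    conceptual-modelling framework suggests (field names are illustrative)."""
    what:  str   # which data item is needed
    when:  str   # collection period / frequency
    where: str   # source system or location
    why:   str   # modelling purpose the data supports
    who:   str   # person or role responsible for providing it
    how:   str   # collection or estimation method

req = DataRequirement5W1H(
    what="Pallet arrival times at the cross-dock",
    when="Two weeks of operation, recorded per shift",
    where="Warehouse management system export",
    why="Fit the arrival-process input distribution",
    who="Logistics data analyst (hypothetical role)",
    how="Automatic timestamp log, cleaned for duplicates",
)
print(asdict(req))
```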

23 citations


Journal ArticleDOI
TL;DR: In this paper, a generic testing framework for agent-based simulation models to conduct validation and verification of models is presented, and its effectiveness is demonstrated by showing its applicability on a realistic agent-based simulation case study.
Abstract: Agent-based modelling and simulation (ABMS) has received increasing attention during the last decade. However, the weak validation and verification of agent-based simulation models makes ABMS hard to trust. There is no comprehensive tool set for the verification and validation of agent-based simulation models, that is, one that demonstrates whether inaccuracies exist and/or reveals the existing errors in the model. Moreover, on the practical side, many ABMS frameworks are in use. For these reasons, we designed and developed a generic testing framework for agent-based simulation models to conduct validation and verification of models. This paper presents our testing framework in detail and demonstrates its effectiveness by showing its applicability on a realistic agent-based simulation case study.
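The testing framework itself is tool- and platform-specific, but one kind of check such a framework can automate is a model-level validation test that compares replicated simulation output against an empirical reference. The sketch below is a deliberately simple stand-in under our own assumptions, not the authors' framework.

```python
import statistics

def validate_output(model_runs, reference_mean, tolerance):
    """Toy model-level validation check: does the simulation's mean output fall
    within `tolerance` of an empirical reference value, allowing for sampling
    error?  (A stand-in for one test type a generic ABM testing framework
    might automate, not the paper's procedure.)"""
    mean = statistics.mean(model_runs)
    half_width = 1.96 * statistics.stdev(model_runs) / len(model_runs) ** 0.5
    ok = abs(mean - reference_mean) <= tolerance + half_width
    return ok, mean, half_width

# Hypothetical replicated outputs of an agent-based model vs. an observed value.
runs = [101.2, 98.7, 103.5, 99.1, 100.8, 102.3, 97.9, 100.4]
print(validate_output(runs, reference_mean=100.0, tolerance=2.0))
```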

18 citations


Journal ArticleDOI
TL;DR: This paper employs Multi-Agent System simulation to optimize the total system output for recovery from machine and/or conveyor failure cases and examines the economic interdependencies between the examined parameters and the benefits of using the agent paradigm to minimize the impact of the disrupting events on the dynamic system.
Abstract: The application of intelligent agent technologies is considered a promising approach to improve system performance in complex and changeable environments. Especially in the case of unforeseen events, for example machine breakdowns that usually lead to a deviation from the initial production schedule, a multi-agent approach can be used to enhance system flexibility and robustness. In this paper we apply this approach to revise and re-optimize the dynamic system schedule in response to unexpected events. We employ Multi-Agent System simulation to optimize the total system output (e.g., number of finished products) for recovery from machine and/or conveyor failure cases. Diverse types of failure classes (conveyor and machine failures), as well as durations of failures, are used to test a range of dispatching rules in combination with the All Rerouting re-scheduling policy, which showed supreme performance in our previous studies. In this context, the Critical Ratio rule, which includes the transportation time in the calculation for the selection of the next job, outperformed all other dispatching rules. We also analyzed the impact of diverse simulation parameters (such as number of pallets, class of conveyor failure and class of machine failure) on the system effectiveness. The presented research also illuminates the economic interdependencies between the examined parameters and the benefits of using the agent paradigm to minimize the impact of the disrupting events on the dynamic system.
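The Critical Ratio rule singled out above is, in its common textbook form, the ratio of time remaining until the due date to the work remaining; following the abstract, the sketch below also counts transportation time in the denominator. The job attributes and numbers are illustrative, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    due_date: float          # absolute due date
    processing_time: float   # remaining processing time on the machine
    transport_time: float    # conveyor transport time to reach the machine

def critical_ratio(job, now):
    """Critical Ratio in its common form, extended with transport time as the
    abstract describes: time remaining divided by work (incl. transport)
    remaining. Smaller ratios indicate more urgent jobs."""
    return (job.due_date - now) / (job.processing_time + job.transport_time)

def select_next_job(queue, now):
    """Pick the most urgent job (lowest critical ratio) from the queue."""
    return min(queue, key=lambda j: critical_ratio(j, now))

queue = [Job("J1", due_date=50, processing_time=8, transport_time=4),
         Job("J2", due_date=30, processing_time=6, transport_time=2),
         Job("J3", due_date=40, processing_time=12, transport_time=5)]
print(select_next_job(queue, now=10).name)
```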

18 citations


Journal ArticleDOI
TL;DR: An economic production quantity model for multi-item with storage space and budget constraints in a volume flexible manufacturing system is developed and the crisp nonlinear optimization problem is solved by Fuzzy simulation, Contractive Mapping Genetic Algorithm and Generalized Reduced Gradient technique.
Abstract: In this paper, a multi-item economic production quantity model with storage space and budget constraints in a volume-flexible manufacturing system is developed. Here it is assumed that the demand rate is constant up to a certain level of stock and after that it depends on the stock itself. The unit production cost is taken to be a function of the finite production rate involving labour cost and wear-and-tear expenditure. Here, the inventory costs, selling price, storage space and available budget are defined imprecisely. Using necessity measure theory, the imprecise problem is reduced to a deterministic problem. The necessity measure approach has been used for triangular and parabolic fuzzy numbers. Finally, the crisp nonlinear optimization problem is solved by fuzzy simulation, a Contractive Mapping Genetic Algorithm and the Generalized Reduced Gradient technique. The model is illustrated numerically and the results are compared.
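As a small illustration of the necessity-measure ingredient, the sketch below computes the necessity that a triangular fuzzy quantity stays below a crisp bound, using the standard possibility/necessity definitions; the paper's full chance-constrained formulation (and its parabolic fuzzy numbers) is not reproduced.

```python
def necessity_leq(tri, bound):
    """Necessity that a triangular fuzzy number (l, m, r) is <= `bound`:
    Nec(A <= b) = 1 - sup_{x > b} mu_A(x) under the standard
    possibility/necessity definitions (illustrative, not the paper's model)."""
    l, m, r = tri
    if bound <= m:
        return 0.0                   # the peak still lies at or above the bound
    if bound >= r:
        return 1.0                   # the whole support lies below the bound
    return (bound - m) / (r - m)     # linear interpolation on the right leg

# Fuzzy unit storage cost (l, m, r) checked against crisp per-unit budgets.
fuzzy_cost = (2.0, 3.0, 5.0)
for budget in (2.5, 3.5, 6.0):
    print(budget, necessity_leq(fuzzy_cost, budget))
```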

Journal ArticleDOI
TL;DR: An agent-based model (ABM) is presented designed to help understand European military tactics during the eighteenth century, in particular during the War of the Spanish Succession, to show that the choice of a particular firing system was not as important as most historians state.
Abstract: Computational models have been extensively used in military operations research, but they are rarely seen in military history studies. The introduction of this technique has potential benefits for the study of past conflicts. This paper presents an agent-based model (ABM) designed to help understand European military tactics during the eighteenth century, in particular during the War of the Spanish Succession. We use a computer simulation to evaluate the main variables that affect infantry performance in the battlefield, according to primary sources. The results show that the choice of a particular firing system was not as important as most historians state. In particular, it cannot be the only explanation for the superiority of Allied armies. The final discussion shows how ABM can be used to interpret historical data, and explores under which conditions the hypotheses generated from the study of primary accounts could be valid.

Journal ArticleDOI
TL;DR: A method is proposed that incorporates known dependencies between input variables into design selection for simulators and the benefits of this approach via a simulator for atmospheric dispersion are demonstrated.
Abstract: Many computer models or simulators have probabilistic dependencies between their input variables, which, if not accounted for during design selection, may result in a large number of simulator runs being required for analysis. We propose a method that incorporates known dependencies between input variables into design selection for simulators and demonstrate the benefits of this approach via a simulator for atmospheric dispersion. We quantify the benefit of the new techniques over standard space-filling and Monte Carlo simulation. The proposed methods are adaptations of computer-generated spread and coverage space-filling designs, with ‘distance’ between two input points redefined to include a weight function. This weight function reflects any known multivariate dependencies between input variables and prior information on the design region. The methods can include quantitative and qualitative variables, and different types of prior information. Novel graphical methods, adapted from fraction of design space plots, are used to assess and compare the designs.
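A rough sketch of the central idea, under our own simplifications: scale the distance between two candidate points by a weight evaluated at their midpoint, then pick points greedily by a maximin (spread) criterion so that regions of higher prior weight are filled more densely. The Gaussian weight function and the greedy selection are illustrative choices, not the paper's algorithm.

```python
import numpy as np

def weighted_distance(x, y, weight_fn):
    """Euclidean distance scaled by the weight of the midpoint, so that regions
    with higher prior weight are filled more densely (illustrative choice)."""
    w = weight_fn(0.5 * (x + y))
    return np.linalg.norm(x - y) * w

def greedy_maximin(candidates, n_points, weight_fn, start=0):
    """Greedy maximin (spread) selection under the weighted distance."""
    chosen = [candidates[start]]
    while len(chosen) < n_points:
        dists = [min(weighted_distance(c, p, weight_fn) for p in chosen)
                 for c in candidates]
        chosen.append(candidates[int(np.argmax(dists))])
    return np.array(chosen)

# Hypothetical prior knowledge: inputs concentrated near the origin.
weight = lambda z: np.exp(-0.5 * np.sum(z ** 2))
rng = np.random.default_rng(0)
candidates = rng.uniform(-2, 2, size=(200, 2))
design = greedy_maximin(candidates, n_points=10, weight_fn=weight)
print(design.round(2))
```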

Journal ArticleDOI
TL;DR: Currently available software development tools and the specific benefits and limitations that can be encountered when using them to develop VR surgical simulations are introduced and a detailed review of collision detection libraries that are central to achieving reliable haptic rendering is provided.
Abstract: Virtual reality (VR) surgical simulations are among the most difficult software applications to develop mainly because of the type of user interactions that they must support. Surgery typically includes precise cutting of often intricate structures. Modelling these structures and accurately simulating their response to user interaction requires many software components to effectively work in unison. Some of these components are readily available but are tailored to more common applications such as computer games or open-world simulations such as flight-simulators. This article explores the software libraries that are currently available to developers of VR surgical simulation software. Like computer games and other VR simulations, VR surgical simulations require real-time lighting and rendering systems and physics-based interactions. However, in addition they require haptic interaction with cut-able and deformable soft-tissue, a key requirement that is not supported by the majority of the available tools. In this article, we introduce currently available software development tools and the specific benefits and limitations that can be encountered when using them to develop VR surgical simulations. We also provide a detailed review of collision detection libraries that are central to achieving reliable haptic rendering.

Journal ArticleDOI
TL;DR: An overview is provided of the main simulation-based methodologies for developing multi-agent systems (MASs), along with ABMS application domains where the integration of AOSE and ABMS can benefit MAS development.
Abstract: This paper briefly surveys an emerging research area: the integration of agent-oriented software engineering (AOSE) and agent-based modelling and simulation (ABMS). Both AOSE and ABMS are well-established research areas in the agent-based computing domain. Specifically, this paper provides an overview of the main simulation-based methodologies for developing multi-agent systems (MASs) and describes interesting ABMS application domains where the integration of AOSE and ABMS can benefit MAS development.

Journal ArticleDOI
TL;DR: The use of Bayesian networks (BNs) as an exploratory metamodelling tool for supporting simulation studies conducted with stochastic simulation models containing multiple inputs and outputs is introduced.
Abstract: This paper introduces the use of Bayesian networks (BNs) as an exploratory metamodelling tool for supporting simulation studies conducted with stochastic simulation models containing multiple inputs and outputs. BN metamodels combine simulation data with available expert knowledge into a non-parametric description of the joint probability distribution of discrete random variables representing simulation inputs and outputs. The distributions of the inputs are determined based on expert knowledge and/or a real-world data source while the conditional distributions of the outputs are estimated from the simulation data. The exploratory use of the BN metamodels is an iterative process including the construction and validation of the BNs and allowing various analyses dealing with the dependencies among the inputs and the outputs, input uncertainty, and inverse reasoning. The results of these analyses are applied to guide and aid the utilization and interpretation of the simulation model under consideration. In addition, the analyses are used for studying the behaviour of the simulated system. The exploratory use is illustrated with an example involving a simulated queue.
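The flavour of a BN metamodel can be conveyed with a deliberately simple, library-free sketch: the input marginal comes from expert knowledge, the conditional distribution of a discretized output is estimated from simulation data, and the two are combined for probabilistic reasoning. The variables and numbers are hypothetical, and the authors' BN toolchain is not reproduced.

```python
from collections import Counter, defaultdict

def fit_output_cpt(samples):
    """Estimate P(output | input) from (input, output) simulation samples,
    both variables already discretized: a toy stand-in for learning the
    output node's conditional probability table in a BN metamodel."""
    counts = defaultdict(Counter)
    for x, y in samples:
        counts[x][y] += 1
    return {x: {y: c / sum(cnt.values()) for y, c in cnt.items()}
            for x, cnt in counts.items()}

# Expert-specified input marginal (illustrative) and simulated (input, output) pairs.
p_input = {"low": 0.3, "high": 0.7}
samples = [("low", "short"), ("low", "short"), ("low", "long"),
           ("high", "long"), ("high", "long"), ("high", "short")]
cpt = fit_output_cpt(samples)

# Reasoning with the metamodel: marginal probability that the output is "long".
p_long = sum(p_input[x] * cpt[x].get("long", 0.0) for x in p_input)
print(cpt, round(p_long, 3))
```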

Journal ArticleDOI
TL;DR: This paper considers the design of experiments for the validation of the fitted metamodel based on maximin Latin hypercubes and adds an extra criterion to be optimised based on the distances between the points in the validation and original designs.
Abstract: Metamodels (or emulators) are statistical tools for the analysis of large complex simulation models. They consist of a Gaussian (or second order) process (kriging) fitted to a designed set of simulator runs. Once an emulator has been built, it is important that it is validated against some independent runs of the simulator. This paper considers the design of experiments for the validation of the fitted metamodel. All the proposed designs are based on maximin Latin hypercubes and add an extra criterion to be optimised based on the distances between the points in the validation and original designs. Simulation experiments are carried out to determine how well each design performs against the alternative criteria.
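A hedged sketch of the combined criterion: draw random Latin hypercube candidates and keep the one that is both space-filling (large minimum distance among its own points) and far from the original training design. The equal weighting of the two distances below is our simplification, not the paper's optimised criterion.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Random Latin hypercube sample of n points in [0, 1]^d."""
    u = (rng.permutation(n).reshape(-1, 1) + rng.uniform(size=(n, d))) / n
    for j in range(1, d):
        u[:, j] = (rng.permutation(n) + rng.uniform(size=n)) / n
    return u

def min_dist(a, b=None):
    """Smallest pairwise distance within `a`, or between `a` and `b`."""
    if b is None:
        d = np.linalg.norm(a[:, None] - a[None, :], axis=-1)
        return d[np.triu_indices(len(a), k=1)].min()
    return np.linalg.norm(a[:, None] - b[None, :], axis=-1).min()

def validation_design(train, n_valid, n_candidates=500, seed=0):
    """Pick the candidate LHS that is space-filling AND far from the training
    runs (equal weighting of the two criteria, for illustration only)."""
    rng = np.random.default_rng(seed)
    best, best_score = None, -np.inf
    for _ in range(n_candidates):
        cand = latin_hypercube(n_valid, train.shape[1], rng)
        score = min_dist(cand) + min_dist(cand, train)
        if score > best_score:
            best, best_score = cand, score
    return best

train = np.random.default_rng(1).uniform(size=(20, 2))   # original (training) design
print(validation_design(train, n_valid=8).round(2))
```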

Journal ArticleDOI
TL;DR: The ratio of confidence interval width to the range of sample measures was found to be an indicator of the impact of statistical error on the quality of model fit, and the weighted least squares (WLS) method is suggested for stochastic regressions of simulations.
Abstract: Modelling and simulation environments that are stochastic in nature present a multitude of problems in the creation of meta-models, notably the reduction of the quality of fit due to statistical er...
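The abstract is truncated above, but the TL;DR points to weighted least squares for fitting metamodels to stochastic simulation output. A minimal sketch of that idea, assuming NumPy and a linear metamodel form of our own choosing: weight each design point by the inverse of the estimated variance of its sample mean.

```python
import numpy as np

def wls_fit(X, y_reps):
    """Weighted least squares for a stochastic simulation metamodel.

    X      : (n, p) matrix of input settings (an intercept is added here)
    y_reps : list of n arrays of replicated outputs at each design point
    Weights are 1 / Var(sample mean), estimated from the replications."""
    y_bar = np.array([r.mean() for r in y_reps])
    var_mean = np.array([r.var(ddof=1) / len(r) for r in y_reps])
    w = 1.0 / var_mean
    A = np.column_stack([np.ones(len(X)), X])
    W = np.diag(w)
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y_bar)
    return beta

# Toy data: output ~ 2 + 3x with noise whose variance grows with x.
rng = np.random.default_rng(0)
X = np.linspace(0, 1, 10).reshape(-1, 1)
y_reps = [2 + 3 * x + rng.normal(0, 0.1 + x, size=20) for x in X.ravel()]
print(wls_fit(X, y_reps))   # roughly [2, 3]
```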

Journal ArticleDOI
TL;DR: A simulation study on prioritizing patients for receiving scarce cadaveric liver donations shows that the proposed ranking formula and the mixture design approach allow patient prioritization in liver transplantation and allocation to be analysed systematically.
Abstract: In this article, we present a simulation study on prioritizing patients for receiving scarce cadaveric liver donations. We propose a ranking formula that combines the four criteria commonly used for prioritizing wait-list liver transplant candidates. We apply the proposed ranking formula to evaluate several system outcomes in a liver-allocation simulation model. For each outcome, we identify promising schemes that outperform the currently implemented scheme by analysing the response surfaces constructed with a mixture design of simulation experiments on the four criteria. We also show that it is unlikely to have a scheme that reduces pretransplant mortality and improves other system outcomes simultaneously. Finally, we conduct sensitivity analyses on the two subjective scalars in the ranking formula. Overall, our simulation study shows that the proposed ranking formula and the mixture design approach allow us to systematically analyse patient prioritization in liver transplantation and allocation.
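The ranking formula itself is not reproduced here, but the mixture-design idea can be sketched: candidate weightings of the four criteria are points on a simplex, and a standard simplex-lattice design enumerates them systematically. The criterion labels below are illustrative placeholders, not the paper's definitions.

```python
from itertools import combinations_with_replacement

def simplex_lattice(n_components, m):
    """{n, m} simplex-lattice mixture design: all weight vectors whose
    components are multiples of 1/m and sum to 1 (standard construction)."""
    points = []
    for combo in combinations_with_replacement(range(n_components), m):
        w = [combo.count(i) / m for i in range(n_components)]
        points.append(tuple(w))
    return sorted(set(points))

# Candidate weightings over four prioritization criteria (hypothetical labels).
criteria = ["criterion_1", "criterion_2", "criterion_3", "criterion_4"]
for w in simplex_lattice(4, 3)[:5]:
    print(dict(zip(criteria, w)))
```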

Journal ArticleDOI
TL;DR: The objective of this work is to formalize and to specify a part of the Agent-oriented Software Process for Engineering Complex Systems methodology (Problem and Agency Domains) for modeling the holarchy of the studied system by using a formal specification approach based on two formalisms: Petri Net and the Object-Z language.
Abstract: In complex systems, multiple aspects interact and influence each other, and a vast number of entities are present in the system. Traditional modeling and simulation techniques fail to capture interactions between loosely coupled aspects of a complex distributed system. The objective of this work is to formalize and to specify a part of the Agent-oriented Software Process for Engineering Complex Systems methodology (Problem and Agency Domains) for modeling the holarchy of the studied system by using a formal specification approach based on two formalisms: Petri Net and the Object-Z language. Such a specification style facilitates the modeling of complex systems with both structural and behavioural aspects. Our generic approach is illustrated by applying it to FIRA Robot Soccer and is validated with the Symbolic Analysis Laboratory framework.

Journal ArticleDOI
TL;DR: The use of trajectory sensitivity is described to illustrate how it can help differentiate between competing System Dynamics models and ascertain their most significant parameters.
Abstract: This paper describes the use of trajectory sensitivity and illustrates how it can help differentiate between competing System Dynamics models by ascertaining the most significant parameters. A comparison of trajectory simulation and eigenvalue sensitivity is made in the case of simple models, examining the sensitivity of non-linear systems to changes in parameters. These methods can be used in the validation of models, in designing systems that are robust, or in determining the principal cause of possible change in a given system. The two methods are applied to a case study of the simple SIR epidemic transmission model to illustrate the approach. The trajectory simulation method shows that system performance is most affected by the contact rate and infectivity around the period of maximum infection. A subsequent analysis of flu data for 1978 shows that the errors between the SIR model and the data are not accounted for by possible parameter variations.
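A minimal sketch of trajectory sensitivity for an SIR model of the kind described: integrate the model, perturb one parameter (contact rate or infectivity) slightly, and inspect the trajectory difference over time. Euler integration, the parameterisation of transmission as contact rate times infectivity, and all numbers are illustrative assumptions, not the paper's case study.

```python
import numpy as np

def sir_trajectory(contact_rate, infectivity, recovery=0.3,
                   s0=990, i0=10, n=1000, dt=0.1, t_end=60):
    """Euler-integrated SIR infected trajectory; transmission rate is
    contact_rate * infectivity (a common System Dynamics parameterisation,
    used here purely for illustration)."""
    beta = contact_rate * infectivity
    s, i = s0, i0
    infected = []
    for _ in np.arange(0, t_end, dt):
        new_inf = beta * s * i / n * dt
        new_rec = recovery * i * dt
        s, i = s - new_inf, i + new_inf - new_rec
        infected.append(i)
    return np.array(infected)

def trajectory_sensitivity(param_name, base, eps=1e-3, **kwargs):
    """Finite-difference sensitivity of the infected trajectory to one parameter."""
    lo = sir_trajectory(**{**kwargs, param_name: base})
    hi = sir_trajectory(**{**kwargs, param_name: base + eps})
    return (hi - lo) / eps

sens_contact = trajectory_sensitivity("contact_rate", base=2.0, infectivity=0.25)
sens_infect = trajectory_sensitivity("infectivity", base=0.25, contact_rate=2.0)
print("peak sensitivity to contact rate:", sens_contact.max().round(1))
print("peak sensitivity to infectivity :", sens_infect.max().round(1))
```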

Journal ArticleDOI
TL;DR: A basic overview of the relevant information for simulation studies in manufacturing and logistics can be given in a clearly arranged list.
Abstract: Information is the key resource for performing simulation studies in manufacturing and logistics planning. To build the simulation model and to subsequently collect the requested simulation results, this information must have the necessary level of quality. To evaluate the influence of information quality on the quality of simulation study results, it is first of all essential that the quality of information be assessable. However, to evaluate quality, it is also necessary to know which information in the context of simulation studies will be deemed as generally relevant. To clarify this question, a Delphi Study about information in simulation studies was conducted. Based on the results of this Delphi Study, a basic overview of the relevant information for simulation studies in manufacturing and logistics can be given in a clearly arranged list.

Journal ArticleDOI
TL;DR: This paper describes an efficient and online generator of the correlation structure of the fractional Gaussian noise process and describes its application in simulation studies.
Abstract: Detailed observations of communications networks have revealed singular statistical properties of the measurements, such as self-similarity, long-range dependence and heavy tails, which cannot be overlooked in modelling Internet traffic. The use of stochastic processes consistent with these properties has opened new research fields in network performance analysis and particularly in simulation studies, where the efficient synthetic generation of samples is one of the main topics. In this paper, we describe an efficient and online generator of the correlation structure of the fractional Gaussian noise process.
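The paper's efficient online generator is not reproduced here; as a simple offline baseline, the fGn autocovariance is known in closed form and an exact (but O(n^3)) sample can be drawn via a Cholesky factorisation of the covariance matrix, as sketched below with NumPy.

```python
import numpy as np

def fgn_autocovariance(k, hurst):
    """Autocovariance of unit-variance fractional Gaussian noise at lag k."""
    return 0.5 * (abs(k + 1) ** (2 * hurst)
                  - 2 * abs(k) ** (2 * hurst)
                  + abs(k - 1) ** (2 * hurst))

def fgn_sample(n, hurst, seed=0):
    """Exact fGn sample via Cholesky of the covariance matrix. This is a simple
    offline baseline, not the efficient online generator the paper describes."""
    lags = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    cov = fgn_autocovariance(lags, hurst)
    L = np.linalg.cholesky(cov)
    return L @ np.random.default_rng(seed).standard_normal(n)

x = fgn_sample(1024, hurst=0.8)
# Long-range dependence shows up as slowly decaying autocorrelations; the
# theoretical lag-1 value is 2**(2H - 1) - 1, about 0.52 for H = 0.8.
print(np.corrcoef(x[:-1], x[1:])[0, 1])
```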

Journal ArticleDOI
TL;DR: It is suggested that a level of assurance in models exchanged or co-developed internationally could be readily achieved by adopting the current UK guidelines, although with some modifications, along with the acceptance of a definition for fitness-for-purpose cast in the context of OR studies.
Abstract: Modelling and Simulation (M&S) have become an integral part of the evidence-based approach to decision making used by Defence departments around the world. A key part of M&S is the use of computer-...

Journal ArticleDOI
TL;DR: Four different heuristics for qualified worker selection for machines in discrete event simulation are analysed, including least number of qualifications (LENQ), and a heuristic that selects a worker with the lowest impact factor on the qualification pool (LIMP), which have the benefit of more closely modelling what happens in reality.
Abstract: In this paper, we analyse four different heuristics for qualified worker selection for machines in discrete event simulation. Conventional simulators simply select a capable worker randomly or from the top-of-the-stack (TOS) of candidates that are qualified to operate a machine, without considering the impact of removing that worker from the currently available qualification pool (qPool). To investigate the efficacy of this approach, we compare these random and TOS approaches with two other worker selection rules: least number of qualifications (LENQ), and a heuristic that selects a worker with the lowest impact factor on the qualification pool (LIMP). LIMP ranks workers based on their contribution to the qPool and the constrainedness of each of their qualifications. We apply LENQ to a simulation model of a real company and, compared with the Random heuristic, we observe a 44% reduction in the qualification resource constraint metric (RCMq) and a 2% reduction in the total lateness in sales-order satisfaction. For the LIMP heuristic, the RCMq reduction is 77%. However, LIMP yields no significant improvement in sales-order lateness over the simpler LENQ approach. The LENQ and LIMP heuristics also have the benefit of more closely modelling what happens in reality, as they are based on intuition that would be used in practice, rather than the random or simple TOS approach followed in conventional simulation.
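A minimal sketch contrasting the conventional random pick with the LENQ rule (the LIMP ranking depends on the paper's constrainedness measure and is not reproduced); the workers, machines and qualifications below are illustrative.

```python
import random

workers = {                      # worker -> set of machine qualifications (illustrative)
    "Ann":  {"lathe", "mill", "drill"},
    "Bob":  {"lathe"},
    "Cara": {"mill", "drill"},
}

def random_pick(machine, available):
    """Conventional approach: any qualified worker, chosen at random."""
    qualified = [w for w in available if machine in workers[w]]
    return random.choice(qualified) if qualified else None

def lenq_pick(machine, available):
    """LENQ rule: among qualified workers, take the one with the fewest
    qualifications, preserving versatile workers for other machines."""
    qualified = [w for w in available if machine in workers[w]]
    return min(qualified, key=lambda w: len(workers[w]), default=None)

available = {"Ann", "Bob", "Cara"}
print("random:", random_pick("lathe", available))
print("LENQ  :", lenq_pick("lathe", available))   # Bob: only one qualification
```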

Journal ArticleDOI
TL;DR: This paper develops a generic procedure for selection with constraints using relative performance measures as constraints, proposing that systems having each performance measure within a user-specified amount of the unknown best be considered feasible.
Abstract: This paper develops a generic procedure for selection with constraints. The bounds in current selection-with-constraints procedures are based on user-specified values of the underlying performance measures. In some cases, users have no a priori knowledge of what the values of the secondary performance measures may be; hence, the specified values may not be accurate. On the basis of the difference between comparison with a standard and comparison with a control, we propose using relative performance measures as constraints. That is, systems having each performance measure within a user-specified amount of the unknown best are considered feasible. An experimental performance evaluation demonstrates the validity and efficiency of the selection-with-constraints procedure.
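The statistical guarantees of the procedure are the substance of the paper; the feasibility notion itself can be sketched simply: a system is treated as feasible when every secondary measure lies within a user-specified amount of the best observed value for that measure. The sample means and deltas below are hypothetical.

```python
def relatively_feasible(systems, deltas, smaller_is_better=True):
    """Mark systems feasible when every secondary measure lies within delta
    of the best observed value for that measure (sample-mean sketch; the
    paper's procedure adds statistical guarantees that are omitted here).

    systems : dict name -> list of secondary-measure sample means
    deltas  : list of allowable gaps, one per secondary measure"""
    n_measures = len(deltas)
    best = [min(s[k] for s in systems.values()) if smaller_is_better
            else max(s[k] for s in systems.values()) for k in range(n_measures)]
    feasible = {}
    for name, measures in systems.items():
        gaps = [abs(m - b) for m, b in zip(measures, best)]
        feasible[name] = all(g <= d for g, d in zip(gaps, deltas))
    return feasible

# Hypothetical systems with two secondary measures (e.g., waiting time, cost).
systems = {"S1": [4.1, 10.2], "S2": [3.9, 12.5], "S3": [5.0, 10.0]}
print(relatively_feasible(systems, deltas=[0.5, 1.0]))
```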

Journal ArticleDOI
TL;DR: Despite the title of the special issue, all of the papers included in it concern themselves with output analysis rather than input modelling, which is obviously a less fashionable area.
Abstract: Despite the title of the special issue, all of the papers included in it concern themselves with output analysis rather than input modelling. Input modelling, which involves adequately representing the variability in the input variables to a simulation model, is obviously a less fashionable area. Any simulation project that involves generating numerical results from a simulation model needs to conduct some form of output analysis. Done well, output analysis will help to generate more information from fewer simulation runs, either by making use of metamodels or by employing clever experimental designs. Output analysis draws heavily on statistical ideas, and many of the papers in this special issue apply state-of-the-art statistics to the specific situation of simulation modelling.

The papers fall fairly naturally into two groups. The first group includes the papers by Bowman and Woods, Vieira Jr, Challenor, Damblin, and their co-authors, all of whom concentrate on experimental designs, which reduce the number of simulation runs needed. The second group of papers is more disparate, and includes the papers by Pousi, Turner, and their co-authors, which both discuss the estimation of metamodels, as well as Abel's paper on evaluating the quality of information needed for simulation studies. Challenor's paper on the validation of simulation models would fit well in either group but has slightly more in common with the experimental design work described by the other papers in the first group.

Experimental design has its origins in agricultural experiments (Fisher, 1925), but is widely used in simulation, where the complexity of the simulation models presents very different issues to those observed in physical experiments. Damblin and colleagues, and Bowman and Woods, both examine properties of space-filling designs. Here, as is also true of Challenor's paper on validation of metamodels, the aim is to test the original simulation model over the full range of input parameters. Damblin and colleagues specifically focus on an exploration space that is high-dimensional and compare a number of optimisation algorithms from the literature based on their convergence speed, robustness and space-filling properties. Bowman and Woods examine the situation where there are known dependencies between the model variables. The dependencies and prior information about the design region can be taken into account by using a weight function to redefine the distance between two design points. In contrast, Vieira Jr and colleagues describe experimental designs that can be used for trade-off analyses where a mix of factor types is allowed (categorical, numerical discrete and numerical continuous) and the simulation model is being used to make complex decisions based on a number of different measures, both quantitative and qualitative. These Nearly Orthogonal-and-Balanced designs are particularly useful for large-scale simulation studies and can significantly reduce the number of design points that are required.

Metamodels are defined to be simpler approximations of (usually complex) simulation models (Barton, 1998). Generally, a relatively simple relationship will be determined between the simulation outputs of interest and the important simulation inputs. Both Challenor's and Turner's papers are more concerned with the quality of the fitted metamodels. Challenor describes an effective experimental design to validate fitted metamodels. After the metamodel has been built and fitted to simulation data, it is important to validate it against some additional independent runs of the original simulation model. As is true when fitting the model, it is important to validate the model, as far as possible, over the full range of input parameters. Challenor suggests maximin Latin Hypercube Sampling designs that take into account the positions of the training points when positioning the validation points. This allows the validation to test whether the design is likely to be valid over the whole range of input parameters. Turner investigates how the number of exploratory runs impacts on the quality of the fit of a metamodel, where the quality is measured by the accuracy of the coverage of the mean and variance intervals. The authors suggest that the ratio of the confidence interval width to the range of the sample mean is an effective measure in determining the number of replications needed to fit a metamodel.

The work of Pousi et al adds an extra piece of information to the evaluation of these simulation metamodels, by combining the results of real experiments and/or expert opinion with simulation output. In this case, the metamodel takes the form of a Bayesian network, which retains complete information about the probability distributions of the simulation inputs and outputs. Abel uses a Delphi study that draws on the opinions of simulation experts from industry and academia to come up with a categorised list of information types in a simulation model, and evaluates their importance. This uses much more subjective information than the other papers in the special issue but continues the theme of determining the importance of different inputs to a simulation model.

It has been a great privilege to edit a special issue with so many high-quality papers, which manage to describe highly technical