
Showing papers in "Journal of Simulation in 2010"


Journal ArticleDOI
TL;DR: A brief introduction to ABMS is provided, the main concepts and foundations are illustrated, some recent applications across a variety of disciplines are discussed, and methods and toolkits for developing agent models are identified.
Abstract: Agent-based modelling and simulation (ABMS) is a relatively new approach to modelling systems composed of autonomous, interacting agents. Agent-based modelling is a way to model the dynamics of complex systems and complex adaptive systems. Such systems often self-organize and create emergent order. Agent-based models also include models of behaviour (human or otherwise) and are used to observe the collective effects of agent behaviours and interactions. The development of agent modelling tools, the availability of micro-data, and advances in computation have made possible a growing number of agent-based applications across a variety of domains and disciplines. This article provides a brief introduction to ABMS, illustrates the main concepts and foundations, discusses some recent applications across a variety of disciplines, and identifies methods and toolkits for developing agent models.

1,597 citations
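
To make the agent-based paradigm concrete, here is a minimal, self-contained sketch (ours, not the article's) of the core ABM loop: autonomous agents applying a simple local interaction rule, with an emergent global property measured afterwards. All names and parameters are illustrative.

```python
import random

# Minimal agent-based sketch: agents repeatedly average opinions with a random
# partner; global consensus emerges from a purely local rule.
class Agent:
    def __init__(self):
        self.opinion = random.random()

    def interact(self, other):
        # Local rule: both agents move halfway towards each other.
        mid = (self.opinion + other.opinion) / 2
        self.opinion = other.opinion = mid

random.seed(0)
agents = [Agent() for _ in range(100)]
for _ in range(2000):
    a, b = random.sample(agents, 2)
    a.interact(b)

spread = max(x.opinion for x in agents) - min(x.opinion for x in agents)
print(f"opinion spread after simulation: {spread:.4f}")  # near 0: emergent consensus
```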


Journal ArticleDOI
TL;DR: This work discusses why specificity dominates and why more generic approaches are rare in the DES literature, and classifies papers according to the areas of application evident in the literature, discussing the apparent lack of genericity.
Abstract: Discrete Event Simulation (DES) has been widely used in modelling health-care systems for many years, and a simple citation analysis shows that the number of papers published has increased markedly since 2004. Over the last 30 years several significant reviews of DES papers have been published, and we build on these to focus on the most recent era, with an interest in performance modelling within hospitals. As there are few papers that propose or illustrate general approaches, we classify papers according to the areas of application evident in the literature, discussing the apparent lack of genericity. There is considerable diversity in the objectives of reported studies and in the consequent level of detail; we discuss why specificity dominates and why more generic approaches are rare.

505 citations


Journal ArticleDOI
TL;DR: This paper captures the discussion that took place at the UK Operational Research Society's Simulation Workshop 2010 and addresses the key questions and opportunities regarding ABS that will face the OR community in the future.
Abstract: There has been much discussion about why agent-based simulation (ABS) is not as widely used in Operational Research (OR) as discrete-event simulation, despite its popularity in neighbouring disciplines such as Computer Science, the Social Sciences or Economics. To consider this issue, a plenary panel was organised at the UK Operational Research Society's Simulation Workshop 2010 (SW10). This paper captures the discussion that took place and addresses the key questions and opportunities regarding ABS that will face the OR community in the future.

363 citations


Journal ArticleDOI
TL;DR: This text by Michael North and Charles Macal should be reserved as a key reference text for anyone commencing their studies into agents and agent-based modelling, and its broad review of modelling approaches provides useful background information for the remainder of the book.
Abstract: This book, Managing Business Complexity by Michael North and Charles Macal, is not your typical agent-based modelling text. To date, it seems, most agent-based modelling texts fall into two broad categories: a collection of readings (such as from a conference), or a specific how-to text (for a specific methodology or software platform). Although serving a valued purpose, such works fail to provide a text targeted at the new, technically savvy audience of readers interested in the agent-based modelling paradigm who require a relatively complete exposition based on a broadly defined view of agents and agent-based models. While the landmark text by Epstein and Axtell (1996) remains a necessary initial reference for anyone commencing their studies into agents and agent-based modelling, this text by North and Macal should be reserved as a key reference text. As noted in an earlier review of this book (Van Dyke Parunak, 2007), the 15 chapters in the book fall into three categories, a categorization that works for this review as well. Chapters 1–5 provide foundational information about what agents are and what agent-based modelling is. Chapters 6–10 provide a series of 'how to' chapters pertaining to different components of the agent-based modelling paradigm. Finally, Chapters 11–14 cover topics related to modelling and simulation in general but tailored to agent-based modelling specifically. Chapter 15 is a concluding chapter. The authors take a much broader definition of agent-based than might be found in any other book on agents and agent-based modelling. As such, they cover a greater breadth of topics as compared to these other texts. Their target audience is not the accomplished agent-based modeller, even though there is a wealth of worthwhile information in the book for that modeller. As stated, the target audience is 'managers, analysts, and software developers in business and government'. To this, one should add 'or anyone interested in learning about agents, or those just getting started in research using agent models'. The caveat to this is that the book has sufficient technical depth to require some level of technical competence to comprehend the coverage and exploit the knowledge the book imparts. The first five chapters introduce the reader to agents and how agent-based models apply. This coverage includes defining the agent-based model paradigm (Chapter 2), defining agents (Chapter 3), providing some history of agent modelling (Chapter 4) and explaining where agent-based modelling fits (Chapter 5). While overall they present quite a complete background, the Chapter 5 coverage gets overly ambitious in that it tries to lay out the complete spectrum of modelling approaches. Although not really in line with the focus of the book, this full-spectrum coverage does help the newcomer to agent-based modelling better distinguish among modelling approaches. Further, this broad review of modelling approaches provides useful background information for the remainder of the book. Chapters 6–10 provide the 'how-to' portion of the book. Chapter 6 is one of the best chapters in the text and a wonderful addition to the agent-based modelling literature. While capturing agent behaviours and their interactions is a crucial part of the agent-based modelling methodology, rarely do texts provide a how-to associated with this knowledge engineering component. North and Macal correct ...

49 citations


Journal ArticleDOI
TL;DR: Various interventions designed to reduce MRSA transmission are embedded in the model, including admission and repeat screening tests, shorter test turnaround time, isolation, and decolonisation treatment.
Abstract: Hospital patients who are colonised with methicillin-resistant Staphylococcus aureus (MRSA) may transmit the bacteria to other patients. An agent-based simulation is designed to determine how the problem might be managed and the risk of transmission reduced. Most MRSA modelling studies have applied mathematical compartmental models or Monte Carlo simulations. In the agent-based model, each patient is identified on admission as being colonised or not, has a projected length of stay and may be more or less susceptible to colonisation. Patient states represent colonisation, detection, treatment, and location within the ward. MRSA transmission takes place between pairs of individuals in successive time slices. Various interventions designed to reduce MRSA transmission are embedded in the model, including admission and repeat screening tests, shorter test turnaround time, isolation, and decolonisation treatment. These interventions can be systematically evaluated by model experimentation.

48 citations
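
The pairwise, time-sliced transmission mechanism described above can be illustrated with a toy loop. The ward size and probabilities below are invented; admissions, discharges and the screening/isolation interventions are omitted, and nothing here reflects the study's calibrated model.

```python
import random

# Hypothetical parameters -- illustrative only.
P_COLONISED_ON_ADMISSION = 0.05
P_TRANSMIT = 0.001            # per contact pair, per time slice
N_BEDS, N_SLICES = 20, 24 * 30

random.seed(1)
ward = [random.random() < P_COLONISED_ON_ADMISSION for _ in range(N_BEDS)]

for _ in range(N_SLICES):
    # Transmission occurs between pairs of individuals in successive time slices:
    # the susceptible member of a mixed pair becomes colonised with P_TRANSMIT.
    for i in range(N_BEDS):
        for j in range(i + 1, N_BEDS):
            if ward[i] != ward[j] and random.random() < P_TRANSMIT:
                ward[i] = ward[j] = True

print(f"colonised at end of horizon: {sum(ward)}/{N_BEDS}")
```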


Journal ArticleDOI
TL;DR: Emphasis is given to the generalized beta distribution family, the Johnson translation system of distributions, and the Bézier distribution family because of the flexibility of these families to model a wide range of distributional shapes that arise in practical applications.
Abstract: Techniques are presented for modelling and then randomly sampling many of the continuous univariate probabilistic input processes that drive discrete-event simulation experiments. Emphasis is given to the generalized beta distribution family, the Johnson translation system of distributions, and the Bézier distribution family because of the flexibility of these families to model a wide range of distributional shapes that arise in practical applications. Methods are described for rapidly fitting these distributions to data or to subjective information (expert opinion) and for randomly sampling from the fitted distributions. Also discussed are applications ranging from pharmaceutical manufacturing and medical decision analysis to smart-materials research and health-care systems analysis.

35 citations
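
As a rough sketch of the fit-then-sample workflow (for the beta and Johnson families only; scipy provides no Bézier distribution), one could fit by maximum likelihood and then draw variates to drive a simulation. The data here are synthetic stand-ins for observed input data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.gamma(shape=2.0, scale=1.5, size=500)   # stand-in for observed data

# Fit a four-parameter beta and a Johnson S_U distribution, then sample.
beta_params = stats.beta.fit(data)        # (a, b, loc, scale)
jsu_params = stats.johnsonsu.fit(data)    # Johnson translation system, unbounded family

beta_sample = stats.beta.rvs(*beta_params, size=10_000, random_state=rng)
jsu_sample = stats.johnsonsu.rvs(*jsu_params, size=10_000, random_state=rng)
print(data.mean(), beta_sample.mean(), jsu_sample.mean())   # compare fitted means
```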


Journal ArticleDOI
TL;DR: The aim of the study was to optimise the number of beds available in order to minimise cancellations of Elective surgery and maintain an acceptable level of bed-occupancy in the Critical Care Unit of a large teaching hospital.
Abstract: This article focuses on the Critical Care Unit (CCU) of a large teaching hospital. The aim of the study was to optimise the number of beds available in order to minimise cancellations of Elective surgery and maintain an acceptable level of bed-occupancy. The CCU is where critically ill patients, who often require one-to-one nursing care, are cared for. The discrete event simulation model, built in Visual Basic for Applications for Excel, seeks to simulate the bed-occupancy of the CCU as well as monitoring any cancellations of Elective surgery. Several 'what-if' scenarios are run, including increasing bed numbers, 'ring-fencing' beds for Elective patients, reducing length of stay to account for delayed discharge and changing the scheduling of Elective surgery, and the results are reported.

34 citations
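
A heavily simplified sketch of such a bed-occupancy model, written with the simpy package rather than the authors' VBA-for-Excel implementation, might look as follows. The arrival rate, length of stay, elective/emergency mix and cancellation rule are all assumptions for illustration.

```python
import random
import simpy

ARRIVAL_MEAN_H, LOS_MEAN_H, N_BEDS = 10.0, 72.0, 8   # assumed values (hours, beds)

def arrivals(env, beds, stats):
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN_H))
        elective = random.random() < 0.5
        if elective and beds.count == beds.capacity:
            stats["cancelled"] += 1        # elective surgery cancelled: no free bed
        else:
            env.process(stay(env, beds))   # emergencies queue; electives admitted

def stay(env, beds):
    with beds.request() as req:
        yield req
        yield env.timeout(random.expovariate(1.0 / LOS_MEAN_H))

random.seed(42)
env = simpy.Environment()
beds = simpy.Resource(env, capacity=N_BEDS)
stats = {"cancelled": 0}
env.process(arrivals(env, beds, stats))
env.run(until=24 * 365)                    # simulate one year
print("electives cancelled over the year:", stats["cancelled"])
```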


Journal ArticleDOI
TL;DR: In this paper, a mathematical simulation was developed in SIMULINK to simulate an existing MCL and model validation was achieved by applying the physical MCL characteristics to the simulation and comparing the resulting pressure traces.
Abstract: Cardiovascular assist devices are tested in mock circulation loops (MCLs) prior to animal and clinical testing. These MCLs rely on characteristics such as pneumatic parameters to create pressure and flow, and pipe dimensions to replicate the resistance, compliance and fluid inertia of the natural cardiovascular system. A mathematical simulation was developed in SIMULINK to simulate an existing MCL. Model validation was achieved by applying the physical MCL characteristics to the simulation and comparing the resulting pressure traces. These characteristics were subsequently altered to improve, and thus predict, the performance of a more accurate physical system. The simulation successfully reproduced the behaviour of the physical mock circulation loop and proved to be a useful tool in the development of improved cardiovascular device test rigs.

32 citations
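
The lumped resistance/compliance behaviour such loops replicate is often summarised by a Windkessel model. Below is a minimal two-element sketch, C·dP/dt = Q_in(t) − P/R, with assumed, roughly physiological parameters; the paper's SIMULINK model of the full loop is considerably more detailed.

```python
import numpy as np
from scipy.integrate import solve_ivp

R, C, HR = 1.0, 1.5, 75 / 60     # mmHg·s/mL, mL/mmHg, beats/s -- assumed values

def q_in(t):
    # Pulsatile inflow: half-sine ejection during systole, zero in diastole.
    phase = (t * HR) % 1.0
    return 300 * np.sin(np.pi * phase / 0.35) if phase < 0.35 else 0.0

def dP_dt(t, P):
    return [(q_in(t) - P[0] / R) / C]

sol = solve_ivp(dP_dt, (0, 10), [80.0], max_step=0.005)
tail = sol.y[0][-500:]           # pressure trace once the transient has settled
print(f"pressure swing: {tail.min():.1f}-{tail.max():.1f} mmHg")
```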


Journal ArticleDOI
TL;DR: Some of the scientific developments in computers, complexity, and systems thinking that helped lead to the emergence of ABM are re-examined by shedding new light onto some old theories and connecting them to several key ABM principles of today.
Abstract: Agent-based modeling (ABM) has become a popular simulation analysis tool and has been used to examine systems from myriad domains. This article re-examines some of the scientific developments in computers, complexity, and systems thinking that helped lead to the emergence of ABM by shedding new light onto some old theories and connecting them to several key ABM principles of today. As is often the case, examining history can lead to insightful views about the past, present, and the future. Thus, themes from cellular automata and complexity, cybernetics and chaos, and complex adaptive systems are examined and placed in historical context to better establish the application, capabilities, understanding, and future of ABM.

30 citations


Journal ArticleDOI
TL;DR: It is demonstrated that mitigating initialization bias is not a matter of determining the most representative initial condition per se and that minimum mean-squared error is a more appropriate objective; this demonstration also argues against the efficiency of the replication/deletion approach to output analysis.
Abstract: In a comprehensive study of methods for mitigating the problem of the initial transient, Hoad et al (2009) conclude that MSER (White, 1997) is an efficient, effective, and robust truncation rule, appropriate for automation. Franklin and White (2008) suggest that MSER works because it minimizes an approximation to the mean-squared error in the estimated steady-state mean; Franklin et al (2009) offer empirical support for this suggestion. In this paper, we use the example of an M/M/1 queue to provide a clear restatement of the initialization problem in both the time and frequency domains, distinguishing between the biasing effects of initialization and autocorrelation. We demonstrate that mitigating initialization bias is not a matter of determining the most representative initial condition per se and that minimum mean-squared error is a more appropriate objective. This demonstration also argues against the efficiency of the replication/deletion approach to output analysis.

28 citations
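
For readers unfamiliar with MSER, a plain implementation of the truncation rule, applied here to M/M/1 waiting times generated by the Lindley recursion, might look like this. Restricting the search to the first half of the series is the usual convention; the traffic intensity is illustrative.

```python
import numpy as np

def mser_truncation(y):
    """Return the truncation point d minimising
    sum((y[d:] - mean(y[d:]))**2) / (n - d)**2 over the first half of the run."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    best_d, best_val = 0, np.inf
    for d in range(n // 2):
        tail = y[d:]
        val = np.sum((tail - tail.mean()) ** 2) / (n - d) ** 2
        if val < best_val:
            best_d, best_val = d, val
    return best_d

# M/M/1 waiting times via the Lindley recursion, started from an empty queue.
rng = np.random.default_rng(7)
lam, mu, wait, waits = 0.8, 1.0, 0.0, []
for _ in range(5000):
    wait = max(0.0, wait + rng.exponential(1 / mu) - rng.exponential(1 / lam))
    waits.append(wait)
print("MSER truncation point:", mser_truncation(waits))
```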


Journal ArticleDOI
TL;DR: This paper documents the tools already used in environmental supply chain analysis, investigates the potential use of discrete event simulation (DES) as a method of capturing the dynamic nature of modern SC design and operation, and illustrates the economic versus environmental trade-offs of alternative SC designs.
Abstract: Across the European Economic Area (EEA), more goods are being transported over longer distances more frequently than ever before. As a result, Greenhouse Gas emissions from transport increased overall by 28% between 1990 and 2006 for the 32 countries in the EEA. With incoming Kyoto regulations and public opinion resistant to heavy freight traffic, efficient freight transport has become a growing concern. This paper documents the tools already used in environmental supply chain (SC) analysis and investigates the potential use of discrete event simulation (DES) as a method of capturing the dynamic nature of modern SC design and operation. The paper reviews and analyses the use of quantitative analysis for supporting SC decision makers in choosing the most environmentally friendly SC design. This is done through the development and use of a DES model which, through the capture of dynamic input factors, illustrates the economic versus environmental trade-offs of alternative SC designs.

Journal ArticleDOI
TL;DR: An advanced procedure model for a structured, goal- and task-oriented information and data acquisition for the model-based analyses of LLN is proposed, which differs from other approaches by focussing on information acquisition rather than solely on data acquisition.
Abstract: Design, organisation and management of Large Logistics Networks (LLN) usually involve model-based analyses of the networks. The usefulness of such an analysis highly depends on the quality of the i...

Journal ArticleDOI
TL;DR: The preliminary findings indicate that there is still a robust discussion over which methods to include on the menu list of such a tool, and that there is an appetite for an accessible introduction to modelling methods.
Abstract: A project to take simulation and modelling techniques to healthcare practitioners in the form of a ‘selection tool’ is reported. We describe the processes by which a tool to enable decision-makers ...

Journal ArticleDOI
TL;DR: The concept and implementation of a generic interface for machine data acquisition into the simulation system d3FACT insight is described; the interface enables data transfer from a real production system into the simulation to initialize and update the simulation model.
Abstract: The need for high flexibility to react to market changes and customer demand is constantly growing for both the short- and long-term success of companies that want to succeed in global markets. The simulation of material flows in modern factories offers these companies the possibility to plan and optimize their shop floors in a fast and cost-efficient way and enables them to react to changes and malfunctions. The most expensive factor in such simulation experiments is the data capturing process from the actual factory. This paper describes the concept and implementation of a generic interface for machine data acquisition into the simulation system d3FACT insight. The interface enables data transfer from a real production system into the simulation to initialize and update the simulation model. Within this approach, the Devices Profile for Web Services specification is used. With the developed interface, simulation tools move one step closer to possible daily use.
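
The concrete interface cannot be reproduced from the abstract, but the general shape of such a data-acquisition adapter can be sketched. Every name below is hypothetical: this is not the d3FACT insight API and does not implement the DPWS specification.

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class MachineState:
    machine_id: str
    status: str          # e.g. "running", "down", "setup"
    queue_length: int

class DataSource(Protocol):
    def poll(self) -> list["MachineState"]: ...

def update_model(model: dict, source: DataSource) -> None:
    # Initialise or refresh the simulation model from live shop-floor data.
    for state in source.poll():
        model[state.machine_id] = state

class StubSource:
    """Stand-in for a web-service-backed source of machine data."""
    def poll(self):
        return [MachineState("M1", "running", 3), MachineState("M2", "down", 0)]

model = {}
update_model(model, StubSource())
print(model)
```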

Journal ArticleDOI
TL;DR: The main conclusion of this work is that it is possible to reduce the number of runs needed to find optimum values, while generating system knowledge capable of improving resource allocation.
Abstract: This paper presents a sensitivity analysis of discrete-event simulation models based on a twofold approach combining Design of Experiments (DOE) factorial designs and simulation routines. The aim of this sensitivity analysis is to reduce the number of factors used as optimization input via simulation. The advantage of reducing the input factors is that the optimum search can become very time-consuming as the number of factors increases. Two cases are used to illustrate the proposal: the first formed only of discrete variables, and the second presenting both discrete and continuous variables. The paper also shows the use of the Johnson transformation for experiments with non-normal response variables. The specific case of sensitivity analysis with a Poisson-distributed response was studied. Generally, discrete probability distributions lead to violation of the constant variance assumption, which is an important principle in DOE. Finally, a comparison between optimization conducted without planning and optimization based on sensitivity analysis results was carried out. The main conclusion of this work is that it is possible to reduce the number of runs needed to find optimum values, while generating system knowledge capable of improving resource allocation.
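
A minimal illustration of factorial screening for input reduction (not the authors' exact designs, and without their Johnson-transformation step): run a coded 2^3 design against a stand-in response and estimate main effects; factors with small effects would be dropped from the optimisation.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def simulate(x1, x2, x3):
    # Stand-in for a DES model: x2 is deliberately near-inert.
    return 10 + 4 * x1 - 0.2 * x2 + 3 * x1 * x3 + rng.normal(0, 0.5)

design = list(itertools.product([-1, 1], repeat=3))     # coded factor levels
responses = np.array([np.mean([simulate(*run) for _ in range(5)])   # 5 replications
                      for run in design])

X = np.array(design)
effects = 2 * X.T @ responses / len(design)   # main effect = 2 * contrast / N runs
for name, eff in zip(["x1", "x2", "x3"], effects):
    print(f"{name}: estimated main effect = {eff:+.2f}")
```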

Journal ArticleDOI
TL;DR: The GRASP heuristic is applied within a constraint-based simulation approach to improve construction schedules by observing well-established execution strategies, and the benefits are shown for the strategies Avoid Soiling and Human Strain Factor.
Abstract: In building engineering, different strategies can be used to schedule the execution processes of building projects. Currently, these strategies have not yet been sufficiently formalized and considered for construction scheduling. This paper presents a concept for modelling and simulating execution strategies by using Soft Constraint representations. In particular, the GRASP heuristic is applied within a constraint-based simulation approach to improve construction schedules by observing well-established execution strategies. The benefits of this concept are shown for the strategies Avoid Soiling and Human Strain Factor.
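
As a reminder of how the GRASP metaheuristic (greedy randomized adaptive search procedure) operates, here is a toy sketch on an invented sequencing problem: randomized greedy construction from a restricted candidate list, followed by swap-based local search. It is not the paper's constraint-based scheduling model.

```python
import random

JOBS = [(3, 5), (7, 2), (4, 9), (6, 4), (2, 8)]   # (duration, weight), made up

def cost(seq):
    t = total = 0
    for d, w in seq:
        t += d
        total += w * t          # total weighted completion time
    return total

def construct(alpha=0.4):
    remaining, seq = JOBS[:], []
    while remaining:
        remaining.sort(key=lambda j: j[0] / j[1])              # greedy: WSPT ratio
        rcl = remaining[:max(1, int(alpha * len(remaining)))]  # restricted candidate list
        job = random.choice(rcl)                               # randomised greedy pick
        remaining.remove(job)
        seq.append(job)
    return seq

def local_search(seq):
    improved = True
    while improved:
        improved = False
        for i in range(len(seq) - 1):
            cand = seq[:i] + [seq[i + 1], seq[i]] + seq[i + 2:]
            if cost(cand) < cost(seq):
                seq, improved = cand, True
    return seq

random.seed(3)
best = min((local_search(construct()) for _ in range(20)), key=cost)
print("best sequence cost found:", cost(best))
```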

Journal ArticleDOI
G. Mayer, Sven Spieckermann
TL;DR: Two cases from the German car manufacturer BMW Group are presented to illustrate the particular technological and organizational challenges related to the application of simulation models for almost a decade.
Abstract: There is a tendency to maintain and use some simulation models, once implemented, over a period of several years. This article discusses a couple of reasons for the increasing long-term use of models. Two cases from the German car manufacturer BMW Group are presented to illustrate the particular technological and organizational challenges related to the application of simulation models for almost a decade.

Journal ArticleDOI
TL;DR: This paper discusses the application of cluster-randomized intervention studies to simulation-based evaluation of surgical care policies, arguing that the methodological rigour of evaluative studies should be applied to the design and analysis of simulation experiments.
Abstract: In this paper we discuss the application of cluster-randomized intervention studies to simulation-based evaluation of surgical care policies, arguing that the methodological rigour of evaluative studies should be applied to the design and analysis of simulation experiments. We introduce a framework and study design to evaluate methods for improving the surgical care process with the use of patient flow models that simulate the steps in service delivery and response pathways for individual patients. Because patient-level outcomes in a given simulation run may be correlated, we suggest a cluster-randomized design of experiment for determining how many simulation runs are required and how input factors should vary across the runs. In such a design, simulation runs rather than simulated individuals are randomized to different study groups. For patient outcomes that vary more across simulation runs than within each run, we provide formulas to be adapted in sample size calculations to allow for clustering of responses. As an example, we report on the design and analysis of a simulation study comparing two methods of booking admission dates for patients who are to undergo elective surgery.
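
The flavour of such sample-size formulas is captured by the textbook design-effect adjustment; the paper's own formulas may differ in detail. With m simulated patients per run and intra-run correlation rho, the number of patients needed inflates by DEFF = 1 + (m − 1)·rho.

```python
import math

def adjusted_sample_size(n_independent: int, m: int, rho: float) -> int:
    """Patients needed when responses cluster within runs (standard design effect)."""
    deff = 1 + (m - 1) * rho
    return math.ceil(n_independent * deff)

# e.g. 400 patients suffice if independent; with 50 patients per run and rho = 0.05:
n = adjusted_sample_size(400, 50, 0.05)
print(n, "patients ->", math.ceil(n / 50), "simulation runs per study group")
```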

Journal ArticleDOI
TL;DR: A novel semi-automated methodology is developed that provides semi-automatic hypothesis testing for exploring an unexpected behaviour, and automatic identification of the statements in an agent-based simulation's source code that have the strongest influence on an unexpected behaviour.
Abstract: Unexpected behaviours in simulations require explanation, so that decision-makers and subject matter experts can separate valid behaviours from design or coding errors. Validation of unexpected behaviours requires accumulation of insight into the behaviour and the conditions under which it arises. Agent-based simulations are known for unexpected behaviours that emerge as the simulation executes. To facilitate user exploration, analysis, understanding and insight of unexpected behaviours, we have developed a novel semi-automated methodology, INSIGHT. INSIGHT provides: (1) semi-automatic hypothesis testing for exploring an unexpected behaviour, and (2) automatic identification of statements in an agent-based simulation's source code which have the strongest influence on an unexpected behaviour. INSIGHT is applicable to both deterministic and stochastic agent-based simulations and extends the state of the art in agent-based simulation analysis.

Journal ArticleDOI
TL;DR: This paper discusses the conceptual modelling and development of three types of model categories, called Fixed Facility Focus models, Variable Facility Focus models and Transient Entity Focus models, and explains their use to solve problems of a strategic nature related to fixed facilities, tactical problems related to variable facilities, and operational problems related to transient entities and their schedules.
Abstract: Simulation modelling has been widely used for the design and performance improvement of logistic terminal systems. One of the issues related to performance improvement studies of airport terminal systems is the need to develop and use many different models of the same terminal system for solving its problems. A possible approach to address this difficulty is some form of problem clustering and model clustering, that is, solving problems from a problem category using corresponding models from the model category. In this paper, we discuss the conceptual modelling and development of three types of model categories, called Fixed Facility Focus models, Variable Facility Focus models and Transient Entity Focus models, and explain their use to solve problems of a strategic nature related to fixed facilities, tactical problems related to variable facilities, and operational problems related to transient entities and their schedules.

Journal ArticleDOI
TL;DR: Thrombophilia testing was estimated to be cost-effective in most sub-groups; however, these results are subject to large uncertainty, and primary research is required before a definitive conclusion can be reached.
Abstract: A review of modelling work funded by the National Coordinating Centre for Health Technology Assessment (NCCHTA) was undertaken to quantify the use of discrete event simulation (DES) techniques in health economics. A case study, funded by the NCCHTA, estimating the cost-effectiveness of thrombophilia testing is presented. Thrombophilia may increase the risk of venous thromboembolism (VTE) which can be fatal; however, the preventative treatment, warfarin, is associated with an increased risk of haemorrhage, which is also potentially fatal. A DES model, populated from literature reviews and incorporating VTE events, haemorrhages and death was constructed. The most cost-effective duration of warfarin treatment (‘standard treatment’ of 3 or 6 months, 10 years, 20 years or lifelong) was estimated for patients with initial idiopathic VTE, sub-divided into age, gender, VTE type and known thrombophilia type groups. The primary goal was to ascertain, for each sub-group, whether the cost of thrombophilia tes...
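
Cost-effectiveness in such models is typically summarised by the incremental cost-effectiveness ratio (ICER), incremental cost divided by incremental effect; the figures below are invented for illustration.

```python
def icer(cost_new, effect_new, cost_std, effect_std):
    """Incremental cost per unit of effect (e.g. per QALY) of 'new' vs 'standard'."""
    return (cost_new - cost_std) / (effect_new - effect_std)

# Suppose the DES estimates mean discounted costs and QALYs per patient:
ratio = icer(cost_new=6200.0, effect_new=11.45, cost_std=4100.0, effect_std=11.32)
print(f"ICER: {ratio:,.0f} per QALY")   # compared against a willingness-to-pay threshold
```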

Journal ArticleDOI
TL;DR: The proposed aggregate simulation model can accurately predict the mean cycle time in a region around the workstation's operational product mix, and its applicability is demonstrated for a workstation in the Crolles2 wafer factory.
Abstract: Cycle Time-Throughput-Product mix (CT-TH-PM) surfaces give the mean cycle time as a function of throughput and product mix for manufacturing workstations. To generate the CT-TH-PM surface, detailed simulation models may be used. However, detailed models require much development time, and it may not be possible to estimate all model parameters. Instead, we propose an aggregate simulation model to generate a workstation's CT-TH-PM surface. In the aggregate model, the various workstation details are lumped into an 'effective process time' (EPT) distribution. The EPT distribution in the aggregate simulation model is estimated from arrival and departure data measured at the workstation in operation. We validate the proposed method using a simulation example representing a semiconductor workstation. We find that the method can accurately predict the mean cycle time in a region around the workstation's operational product mix. We also present an industry test case; the applicability of the method is demonstrated for a workstation in the Crolles2 wafer factory.
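
The basic single-machine flavour of EPT reconstruction from arrival/departure events can be sketched as below; the paper's multi-server, multi-product aggregation is more involved, so treat this as an illustration of the idea rather than their algorithm.

```python
def ept_realisations(arrivals, departures):
    """EPT of lot i = departure_i - max(arrival_i, departure_{i-1}): the interval the
    machine 'effectively' worked on the lot, lumping processing, setup and downtime."""
    epts, prev_dep = [], 0.0
    for a, d in zip(arrivals, departures):
        start = max(a, prev_dep)   # lot starts once it has arrived AND machine is free
        epts.append(d - start)
        prev_dep = d
    return epts

arrivals = [0.0, 1.0, 2.5, 6.0]
departures = [2.0, 3.5, 5.5, 7.2]
print(ept_realisations(arrivals, departures))   # [2.0, 1.5, 2.0, 1.2]
```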

Journal ArticleDOI
TL;DR: This paper first shows how behaviours at different levels can be specified and detected in a simulation using the complex event formalism, and applies partial least squares regression to frequencies of these behaviours to infer models predicting the global behaviour of the system from lower-level behaviours.
Abstract: This paper presents a novel approach towards showing how specific emergent multi-level behaviours in agent-based simulations (ABSs) can be quantified and used as the basis for inferring predictive models. We first show how behaviours at different levels can be specified and detected in a simulation using the complex event formalism. We then apply partial least squares regression to frequencies of these behaviours to infer models predicting the global behaviour of the system from lower-level behaviours. By comparing the mean predictive errors of models learned from different subsets of behavioural frequencies, we are also able to determine the relative importance of different types of behaviour and different resolutions. These methods are applied to ABSs of a novel agent-based model of cancer in the colonic crypt, with tumorigenesis as the global behaviour we wish to predict.
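
A minimal version of the inference step, regressing a global outcome on lower-level behaviour frequencies with partial least squares via scikit-learn and checking held-out predictive error, on synthetic data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_runs, n_behaviours = 60, 12
X = rng.poisson(20, size=(n_runs, n_behaviours)).astype(float)  # behaviour counts per run
y = 0.8 * X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 1.0, n_runs)  # global behaviour

pls = PLSRegression(n_components=3)
pls.fit(X[:40], y[:40])                     # train on the first 40 simulation runs
pred = pls.predict(X[40:]).ravel()
print(f"held-out MSE: {np.mean((pred - y[40:]) ** 2):.2f}")
```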

Journal ArticleDOI
TL;DR: A four-category framework (based on demographic and geographic properties) for introducing network-based simulation approaches to undergraduate students and novice researchers is described, along with a geospatial modelling approach that integrates a national commuting network as well as multi-scale contact structures.
Abstract: Recent and potential outbreaks of infectious diseases are triggering interest in predicting epidemic dynamics on a national scale and testing the efficacies of different combinations of public health policies. Network-based simulations are proving their worth as tools for addressing epidemiology and public health issues considered too complex for field investigations and questionnaire analyses. Universities and research centres are therefore using network-based simulations as teaching tools for epidemiology and public health students, but instructors are discovering that constructing appropriate network models and epidemic simulations are difficult tasks in terms of individual movement and contact patterns. In this paper we describe (a) a four-category framework (based on demographic and geographic properties) for introducing network-based simulation approaches to undergraduate students and novice researchers; (b) our experiences simulating the transmission dynamics of two infectious disease scenarios in Taiwan (HIV and influenza); (c) evaluation results indicating significant improvement in student knowledge of epidemic transmission dynamics and the efficacies of various public health policy suites; and (d) a geospatial modelling approach that integrates a national commuting network as well as multi-scale contact structures.
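
A teaching-scale analogue of such simulations, an SIR process on a small-world contact network, fits in a few lines with networkx; the graph and the per-step transmission/recovery probabilities are invented stand-ins for the national-scale structures described above.

```python
import random
import networkx as nx

random.seed(1)
G = nx.watts_strogatz_graph(n=500, k=6, p=0.1)   # stand-in contact network
state = {v: "S" for v in G}
for v in random.sample(list(G), 5):              # seed infections
    state[v] = "I"

BETA, GAMMA = 0.08, 0.05                         # assumed per-step probabilities
for day in range(200):
    for v in [u for u in G if state[u] == "I"]:
        for nbr in G.neighbors(v):
            if state[nbr] == "S" and random.random() < BETA:
                state[nbr] = "I"                 # transmission along a contact edge
        if random.random() < GAMMA:
            state[v] = "R"                       # recovery with immunity

print("final epidemic size:", sum(s == "R" for s in state.values()))
```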

Journal ArticleDOI
TL;DR: This article describes this procedure and shows with the help of a pilot study how the development of the productivity of an ageing workforce can be forecasted for a period of several years.
Abstract: Many industrial nations are faced with the question of the effects of the demographic development, particularly with respect to the employment of an ageing workforce. An increase in the share of ol...

Journal ArticleDOI
Hongchun Qu, Q. Zhu, H. Fu, Z. Lu
TL;DR: Simulation results demonstrate that VirtualEP can effectively deal with global nutrient allocation and with growth in response to variation in air temperature and solar radiation, as well as water and nitrogen stress, and that it provides vivid 3D visualization of these features.
Abstract: This paper presents VirtualEP, a novel simulator for eggplant growth, integrating Agent-Based Modelling technology and existing knowledge of plant physiology. VirtualEP simulates the growth and development of eggplant as an evolution of a dynamic branching network whose nodes are represented by Autonomous Virtual Organs (AVOs). Each AVO possesses an inbuilt data structure, states and functional rules so that it can autonomously perform physiological processes (e.g. photosynthesis, nutrient uptake, storage, mobilization and respiration) in response to environmental heterogeneity. A discrete implementation of the pressure-flow paradigm is incorporated to simulate carbon, water and nitrogen transport and allocation among AVOs. Simulation results demonstrate that VirtualEP can effectively deal with global nutrient allocation and with growth in response to variation in air temperature and solar radiation, as well as water and nitrogen stress. Moreover, VirtualEP can also provide vivid 3D visualization of these features.

Journal ArticleDOI
TL;DR: In this paper, the authors developed procedures for selecting a set of normal populations with unknown means and unknown variances such that the final subset of selected populations satisfies the following requirement: with probability at least P*, the selected subset will contain a population or "only and all" of those populations whose mean or means are within a value of d* from the smallest mean.
Abstract: This paper develops procedures for selecting a set of normal populations with unknown means and unknown variances such that the final subset of selected populations satisfies the following requirement: with probability at least P*, the selected subset will contain a population or ‘only and all’ of those populations whose mean or means are within a value of d* from the smallest mean. The size of the selected subset is random; however, at most m populations will be chosen. A restricted subset attempts to exclude populations that deviate more than d* from the smallest mean. Here P*, d*, and m are user-specified parameters. These procedures can be used when the unknown variances across populations are unequal. We then extend the sequentialized procedure to perform a selection with constraints. An experimental performance evaluation demonstrates the validity and efficiency of these restricted-subset-selection procedures.
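
A naive single-stage illustration of the subset-selection idea, keeping population i whenever its sample mean is within d* (plus a sampling allowance) of the best observed mean, is sketched below. The paper's sequential, restricted procedures are considerably more refined, and the allowance constant h is our assumption, not their tabulated value.

```python
import numpy as np

rng = np.random.default_rng(5)
true_means, sds = [10.0, 10.2, 11.5, 13.0], [1.0, 2.0, 1.0, 3.0]
samples = [rng.normal(mu, sd, 30) for mu, sd in zip(true_means, sds)]

d_star, h = 0.5, 1.0                         # d* from the user; h assumed here
means = [s.mean() for s in samples]
allow = [h * s.std(ddof=1) / np.sqrt(len(s)) for s in samples]   # sampling allowance
best = min(means)
subset = [i for i, (m, a) in enumerate(zip(means, allow)) if m <= best + d_star + a]
print("selected populations:", subset)       # unequal variances handled per-population
```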

Journal ArticleDOI
TL;DR: Taguchi's quality engineering concepts are valuable, but his statistical tools, including the signal-to-noise ratio, testing at an F value of four, and pooling-up techniques, are found to be inefficient tools for robust design.
Abstract: The alpha mistake (Type I error) of the Taguchi method is investigated for the smaller-the-better type quality characteristic with an L18 (2^1 × 3^7) array using simulation. The results showed that the Taguchi method might incorrectly identify some insignificant factors as significant. In conclusion, Taguchi's quality engineering concepts are valuable, but his statistical tools, including the signal-to-noise ratio, testing at an F value of four and pooling-up techniques, are found to be inefficient tools for robust design.
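
For reference, the statistic under scrutiny here, the smaller-the-better signal-to-noise ratio, is S/N = −10·log10((1/n)·Σ y_i²); a quick sketch:

```python
import numpy as np

def sn_smaller_the_better(y):
    """Taguchi S/N ratio for smaller-the-better responses (larger S/N is better)."""
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(y ** 2))

print(sn_smaller_the_better([2.0, 2.1, 1.9]))  # tight around 2 -> about -6.0 dB
print(sn_smaller_the_better([1.0, 3.0, 2.0]))  # same mean, more spread -> about -6.7 dB
```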