Book

Design and Analysis of Experiments with R

17 Dec 2014
TL;DR: Among other topics, the authors show how to create and analyze a two-factor factorial plan in R, including designs with fixed and random factors for estimating variance components.
Abstract: Contents (a Review and Exercises appear at the end of each chapter):
1. Introduction: Statistics and Data Collection; Beginnings of Statistically Planned Experiments; Definitions and Preliminaries; Purposes of Experimental Design; Types of Experimental Designs; Planning Experiments; Performing the Experiments; Use of R Software
2. Completely Randomized Designs with One Factor: Introduction; Replication and Randomization; A Historical Example; Linear Model for the Completely Randomized Design (CRD); Verifying Assumptions of the Linear Model; Analysis Strategies When Assumptions Are Violated; Determining the Number of Replicates; Comparison of Treatments after the F-Test
3. Factorial Designs: Introduction; Classical One-at-a-Time versus Factorial Plans; Interpreting Interactions; Creating a Two-Factor Factorial Plan in R; Analysis of a Two-Factor Factorial in R; Factorial Designs with Multiple Factors: the Completely Randomized Factorial Design (CRFD); Two-Level Factorials; Verifying Assumptions of the Model
4. Randomized Block Designs: Introduction; Creating a Randomized Complete Block (RCB) Design in R; Model for RCB; An Example of an RCB; Determining the Number of Blocks; Factorial Designs in Blocks; Generalized Complete Block Design; Two Block Factors; Latin Square Design (LSD)
5. Designs to Study Variances: Introduction; Random Sampling Experiments (RSE); One-Factor Sampling Designs; Estimating Variance Components; Two-Factor Sampling Designs (Factorial RSE); Nested SE; Staggered Nested SE; Designs with Fixed and Random Factors; Graphical Methods to Check Model Assumptions
6. Fractional Factorial Designs: Introduction to Completely Randomized Fractional Factorials (CRFF); Half Fractions of 2^k Designs; Quarter and Higher Fractions of 2^k Designs; Criteria for Choosing Generators for 2^(k-p) Designs; Augmenting Fractional Factorials; Plackett-Burman (PB) Screening Designs; Mixed-Level Fractional Factorials; Orthogonal Arrays (OA); Definitive Screening Designs
7. Incomplete and Confounded Block Designs: Introduction; Balanced Incomplete Block (BIB) Designs; Analysis of Incomplete Block Designs; Partially Balanced Incomplete Block (PBIB) and Balanced Treatment Incomplete Block (BTIB) Designs; Row Column Designs; Confounded 2^k and 2^(k-p) Designs; Confounding 3-Level and p-Level Factorial Designs; Blocking Mixed-Level Factorials and OAs; Partially Confounded Blocked Factorials
8. Split-Plot Designs: Introduction; Split-Plot Experiments with CRD in Whole Plots (CRSP); RCB in Whole Plots (RBSP); Analysis of Unreplicated 2^k Split-Plot Designs; 2^(k-p) Fractional Factorials in Split Plots (FFSP); Sample Size and Power Issues for Split-Plot Designs
9. Crossover and Repeated Measures Designs: Introduction; Crossover Designs (COD); Simple AB, BA Crossover Designs for Two Treatments; Crossover Designs for Multiple Treatments; Repeated Measures Designs; Univariate Analysis of Repeated Measures Designs
10. Response Surface Designs: Introduction; Fundamentals of Response Surface Methodology; Standard Designs for Second-Order Models; Creating Standard Response Surface Designs in R; Non-Standard Response Surface Designs; Fitting the Response Surface Model with R; Determining Optimum Operating Conditions; Blocked Response Surface (BRS) Designs; Response Surface Split-Plot (RSSP) Designs
11. Mixture Experiments: Introduction; Models and Designs for Mixture Experiments; Creating Mixture Designs in R; Analysis of Mixture Experiments; Constrained Mixture Experiments; Blocking Mixture Experiments; Mixture Experiments with Process Variables; Mixture Experiments in Split-Plot Arrangements
12. Robust Parameter Design Experiments: Introduction; Noise Sources of Functional Variation; Product Array Parameter Design Experiments; Analysis of Product Array Experiments; Single Array Parameter Design Experiments; Joint Modeling of Mean and Dispersion Effects
13. Experimental Strategies for Increasing Knowledge: Introduction; Sequential Experimentation; One-Step Screening and Optimization; An Example of Sequential Experimentation; Evolutionary Operation; Concluding Remarks
Back matter: Appendix (Brief Introduction to R); Answers to Selected Exercises; Bibliography; Index
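To give a flavor of the workflow the factorial chapters cover ("Creating a Two-Factor Factorial Plan in R", "Analysis of a Two-Factor Factorial in R"), here is a minimal base-R sketch with invented factors and a simulated response; it is not code from the book:

```r
# Minimal sketch (invented factors, simulated response; not the book's code):
# create and analyze a randomized 3 x 2 two-factor factorial with 2 replicates.
set.seed(42)
D <- expand.grid(temp = factor(c(160, 180, 200)),
                 cat  = factor(c("A", "B")))
D <- D[rep(seq_len(nrow(D)), each = 2), ]    # 2 replicates per cell
D <- D[sample(nrow(D)), ]                    # randomize the run order
D$yield <- 50 + rnorm(nrow(D), sd = 2)       # placeholder response

fit <- aov(yield ~ temp * cat, data = D)     # two-way ANOVA with interaction
summary(fit)
interaction.plot(D$temp, D$cat, D$yield)     # visualize the interaction
```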
Citations
Journal Article
TL;DR: Approaches to, and examples of, the application of statistical design of experiments to stem cell bioprocess optimization are discussed.
Abstract: “To consult the statistician after an experiment is finished is often merely to ask him to conduct a post mortem examination. He can perhaps say what the experiment died of.” – R.A. Fisher. While this idea is relevant across research scales, its importance becomes critical when dealing with the inherently large, complex, and expensive process of preparing material for cell-based therapies (CBTs). Effective and economically viable CBTs will depend on the establishment of optimized protocols for the production of the necessary cell types. Our ability to do this will depend in turn on the capacity to search efficiently through a multi-dimensional problem space of possible protocols in a timely and cost-effective manner. In this review we discuss approaches to, and illustrate examples of, the application of statistical design of experiments to stem cell bioprocess optimization.
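The multi-dimensional protocol search the review describes is usually attacked with screening designs of the kind cataloged in the book. As a minimal sketch (invented culture factors, not an example from the review), a 2^(4-1) fractional factorial can be generated with the FrF2 package:

```r
# Minimal sketch, not from the cited review: screening four hypothetical
# culture-protocol factors in 8 runs with a 2^(4-1) fractional factorial.
library(FrF2)   # CRAN package for regular two-level fractional factorials

design <- FrF2(nruns = 8, nfactors = 4,
               factor.names = list(SeedDensity  = c("low", "high"),
                                   GrowthFactor = c("low", "high"),
                                   O2           = c("5%", "20%"),
                                   MediaChange  = c("daily", "alternate")))
summary(design)   # prints the runs and the alias (confounding) structure
```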

16 citations


Cites background from "Design and Analysis of Experiments ..."

  • ...Statisticians have been giving serious thought to such issues for many decades, developing a field of research known as design of experiments (DOE) or experimental design [14]....


Journal Article
TL;DR: This article investigates the effects of five parameters on the energy and emissions performance of a natural draft biomass cookstove and finds that a central composite design (CCD) combined with response surface methodology (RSM) and the desirability function is an effective tool for performance optimization.
Abstract: Cookstove researchers are paying more attention to improving the combustion performance of biomass cookstoves, which requires an understanding of the effects of different parameters on pollutant emissions. Five such important parameters were identified: inlet area ratio, primary air ratio, pot gap, fuel surface-to-volume ratio, and pot diameter. The work presented here estimates the effects of these five parameters on the energy and emissions performance of a natural draft biomass cookstove. A prototype stove was tested to quantify the effects of these parameters on chosen performance measures: overall efficiency and CO and PM2.5 emissions per MJ of energy delivered to the pot. The data obtained from a proper sequence of designed experiments were coupled with response surface methodology (RSM) to fit second-order models for predicting the performance of the stove under different conditions. The RSM models were then used with the desirability function for robust parameter optimization of the stove, and confirmation trials were performed to validate the findings. The central composite design (CCD) combined with RSM and the desirability function was found to be an effective tool for performance optimization of a natural draft biomass cookstove.
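The CCD-plus-RSM workflow the abstract describes maps directly onto the rsm package in R. The following is a minimal sketch with two factors instead of the study's five and a simulated response; names and numbers are invented, not the authors' code:

```r
# Minimal sketch (2 factors, simulated response; not the authors' code):
# central composite design and second-order response surface fit.
library(rsm)    # CRAN package for response surface designs and models

des <- ccd(2, n0 = c(2, 2))    # CCD in two coded factors x1, x2
set.seed(1)
# placeholder response standing in for measured stove efficiency
des$eff <- with(des, 30 + 2*x1 - 3*x2 - 2*x1^2 - x2^2 - 1.5*x1*x2) +
  rnorm(nrow(des))

fit <- rsm(eff ~ SO(x1, x2), data = des)   # full second-order model
summary(fit)   # coefficients, lack-of-fit test, canonical analysis
```

A desirability-function step (for example via the desirability package) could then combine several such fitted responses into a single optimization target, as the authors do.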

15 citations

Journal Article
TL;DR: In this article, response surface methodology was applied to optimize the Ar-N2-CO2 ternary shielding gas for a nitrogen-containing filler metal in high-nitrogen stainless steel welding.
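A ternary gas blend is a mixture problem of the kind treated in the book's mixture-experiment chapters: the three proportions sum to one, so Scheffé-type models without an intercept are used. A minimal base-R sketch (simulated data, not the article's) follows:

```r
# Minimal sketch (simulated data, not the article's): a {3,2} simplex-lattice
# design for a three-component blend and a Scheffe quadratic model fit.
mix <- expand.grid(ar = seq(0, 1, 0.5), n2 = seq(0, 1, 0.5))
mix <- subset(mix, ar + n2 <= 1)
mix$co2 <- 1 - mix$ar - mix$n2    # proportions sum to 1

set.seed(7)
mix$quality <- with(mix, 10*ar + 8*n2 + 6*co2 + 4*ar*n2) +
  rnorm(nrow(mix), sd = 0.2)      # placeholder weld-quality response

# Scheffe quadratic: no intercept, linear blending terms plus pairwise products
# (6 runs, 6 parameters: saturated here; a real study would replicate)
fit <- lm(quality ~ -1 + ar + n2 + co2 + ar:n2 + ar:co2 + n2:co2, data = mix)
coef(fit)
```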

11 citations

01 Jan 2017
TL;DR: This study tests the hypothesis that process capability indices (PCIs) calculated lot by lot provide a better indicator of true process behavior than PCIs calculated from the mixed lots that often occur in typical production situations, and finds that decomposition does not increase the accuracy of the PCI.
Abstract: U.S. Food and Drug Administration (FDA) recalls of medical devices are at historically high levels despite efforts by manufacturers to meet stringent agency requirements to ensure quality and patient safety. A factor in the release of potentially dangerous devices might be the interpretation of nonnormal test data by statistically unsophisticated engineers. The purpose of this study was to test the hypothesis that testing by lot provides a better indicator of true process behavior than process capability indices (PCIs) calculated from the mixed lots that often occur in a typical production situation. The foundations of this research were in the prior work of Bertalanffy, Kane, Shewhart, and Taylor. The research questions examined whether lot traceability allows the decomposition of the combined distribution to permit more accurate calculation of the PCIs used to monitor medical device production. The study used simulated data; although the simulated data were random, the design was quasiexperimental because the simulated data were controlled through parameter selection. The results indicate that decomposition does not increase the accuracy of the PCI. The conclusion is that a systems approach using the PCI, additional statistical tools, and expert knowledge could yield more accurate results than decomposition alone. More accurate results could ensure the production of safer medical devices by correctly identifying noncapable processes (i.e., processes that may not produce required results), while also preventing needless waste of resources and delays in potentially life-saving technology reaching patients in cases where processes evaluate as noncapable when they are actually capable.
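The study's core comparison, pooled mixed-lot PCIs versus per-lot PCIs, can be mimicked in a few lines of R. This is a hypothetical sketch with simulated lots, not the dissertation's code:

```r
# Hypothetical sketch (simulated lots; not the dissertation's code):
# Cpk from pooled mixed-lot data versus Cpk computed lot by lot.
cpk <- function(x, lsl, usl) min(usl - mean(x), mean(x) - lsl) / (3 * sd(x))

set.seed(123)
lots <- list(lot1 = rnorm(50, mean = 10.0, sd = 0.1),
             lot2 = rnorm(50, mean = 10.3, sd = 0.1))   # second lot shifted
lsl <- 9.5; usl <- 10.5

cpk(unlist(lots), lsl, usl)               # pooled (mixed-lot) index
sapply(lots, cpk, lsl = lsl, usl = usl)   # decomposed, per-lot indices
```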

9 citations


Cites background or methods from "Design and Analysis of Experiments ..."

  • ...DOE has evolved into a complex area of study incorporating a wide variety of methods to optimize performance (Lawson, 2014; Montgomery, 2001)....


  • ...As previously mentioned, Liu and Chen (2006) applied a method using the Burr XII distribution to data simulated using the beta, gamma, and Weibull distributions....


Journal Article
TL;DR: In this article, optimal Bayesian crossover designs for generalized linear models are proposed that minimize the log determinant of the variance of the estimated treatment effects over all possible allocations of the n subjects to the treatment sequences.
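The optimality criterion in the TL;DR can be made concrete with a toy analogue. The sketch below uses a normal linear model with fixed effects only (the paper itself treats generalized linear models with Bayesian priors), and the helper name is invented:

```r
# Toy analogue (normal linear model, fixed effects only, no prior; the paper
# treats Bayesian designs for GLMs). 'logdet_criterion' is an invented name.
logdet_criterion <- function(n_AB, n_BA) {
  seqs <- c(rep("AB", n_AB), rep("BA", n_BA))
  # two rows (periods) per subject; columns: intercept, period, treatment
  X <- do.call(rbind, lapply(seqs, function(s) {
    trt <- if (s == "AB") c(0, 1) else c(1, 0)   # treatment B coded as 1
    cbind(1, period = c(0, 1), trt = trt)
  }))
  V <- solve(crossprod(X))   # var(beta-hat) up to sigma^2
  log(V["trt", "trt"])       # one treatment contrast: log det = log variance
}

# the balanced allocation of 10 subjects minimizes the criterion
sapply(1:9, function(k) logdet_criterion(k, 10 - k))
```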

8 citations