
Showing papers in "Technometrics in 1989"



Journal ArticleDOI
TL;DR: The issues of choice of stochastic-process model and computation of efficient designs are addressed, and applications are made to some chemical kinetics problems.
Abstract: A computer experiment generates observations by running a computer model at inputs x and recording the output (response) Y. Prediction of the response Y to an untried input is treated by modeling the systematic departure of Y from a linear model as a realization of a stochastic process. For given data (selected inputs and the computed responses), best linear prediction is used. The design problem is to select the inputs to predict efficiently. The issues of choice of stochastic-process model and computation of efficient designs are addressed, and applications are made to some chemical kinetics problems.

906 citations
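The best-linear-prediction approach summarized above is easy to sketch in code. The fragment below is a minimal illustration, assuming a constant (unknown) mean and a Gaussian correlation function with a fixed, known correlation parameter theta; the function names and the choice of correlation family are illustrative, not the paper's exact formulation.

```python
import numpy as np

def gaussian_corr(A, B, theta):
    """Gaussian (squared-exponential) correlation between input sites."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2 * theta).sum(axis=2)
    return np.exp(-d2)

def blup(X, y, x_new, theta):
    """Best linear unbiased predictor of the response at untried inputs,
    treating the response as a realization of a stochastic process with
    constant mean and known correlation parameters."""
    X, x_new = np.atleast_2d(X), np.atleast_2d(x_new)
    R = gaussian_corr(X, X, theta) + 1e-10 * np.eye(len(X))  # numerical jitter
    r = gaussian_corr(x_new, X, theta)
    ones = np.ones(len(X))
    R_inv_y = np.linalg.solve(R, y)
    R_inv_1 = np.linalg.solve(R, ones)
    mu_hat = ones @ R_inv_y / (ones @ R_inv_1)   # GLS estimate of the mean
    return mu_hat + r @ np.linalg.solve(R, y - mu_hat * ones)

# Toy usage: predict a deterministic "computer model" at an untried input.
X = np.array([[0.0], [0.25], [0.5], [0.75], [1.0]])
y = np.sin(2 * np.pi * X[:, 0])
print(blup(X, y, [[0.6]], theta=20.0))
```

Because the computer model is deterministic, the predictor interpolates the observed runs exactly (up to the jitter term), which is what makes this framework natural for computer experiments.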


Journal ArticleDOI
TL;DR: In this paper, a computationally simple method is presented for assessing the sizes of contrasts in unreplicated factorial and fractional factorial designs; the results are given in the original units of measurement, and this direct association with the data may make the analysis easier to explain.
Abstract: Box and Meyer (1986) introduced a method for assessing the sizes of contrasts in unreplicated factorial and fractional factorial designs. This is a useful technique, and an associated graphical display popularly known as a Bayes plot makes it even more effective. This article presents a competing technique that is also effective and is computationally simple. An advantage of the new method is that the results are given in terms of the original units of measurement. This direct association with the data may make the analysis easier to explain.

557 citations
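The "competing technique" described here is the pseudo standard error (PSE) now widely associated with this article (Lenth's method). Assuming that reading, here is a minimal sketch; the trimming constants follow the published method, but treat the helper name and the toy data as illustrative.

```python
import numpy as np

def pse(contrasts):
    """Pseudo standard error of the effect contrasts: a rough scale
    estimate s0 from all contrasts, then a re-estimate after trimming
    contrasts that look active (|c| >= 2.5 * s0)."""
    c = np.abs(np.asarray(contrasts, dtype=float))
    s0 = 1.5 * np.median(c)
    return 1.5 * np.median(c[c < 2.5 * s0])

# Toy usage: 15 contrasts from an unreplicated 2^4 design, 3 of them active.
rng = np.random.default_rng(1)
contrasts = np.concatenate([rng.normal(0.0, 1.0, 12), [8.0, -6.5, 5.0]])
print(pse(contrasts))   # close to the inert-contrast scale, not inflated by the active ones
```

Because the PSE is on the scale of the contrasts themselves, a contrast can be declared active when it exceeds a t-like multiple of the PSE, in the original units of measurement.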


Journal ArticleDOI
TL;DR: This article reviews the progress of RSM in the general areas of experimental design and analysis and indicates how its role has been affected by advances in other fields of applied statistics.
Abstract: Response surface methodology (RSM) is a collection of tools developed in the 1950s for the purpose of determining optimum operating conditions in applications in the chemical industry. This article reviews the progress of RSM in the general areas of experimental design and analysis and indicates how its role has been affected by advances in other fields of applied statistics. Current areas of research in RSM are highlighted, and areas for future research are discussed.

555 citations


Journal ArticleDOI
TL;DR: In this paper, a simple method is presented for fitting regression models that are nonlinear in the explanatory variables; despite its simplicity, the method has powerful characteristics that cause it to be competitive with and often superior to more sophisticated techniques, especially for small data sets in the presence of high noise.
Abstract: A simple method is presented for fitting regression models that are nonlinear in the explanatory variables. Despite its simplicity—or perhaps because of it—the method has some powerful characteristics that cause it to be competitive with and often superior to more sophisticated techniques, especially for small data sets in the presence of high noise.

453 citations


Journal ArticleDOI
TL;DR: This article places statistical constraints on economic models to provide designs that meet industry's demand for low process variability and long-term product quality, yielding a design the author calls an economic statistical design.
Abstract: Control charts are the primary tools of statistical process control. These charts may be designed by using a simple rule suggested by Shewhart, by a statistical criterion, or by an economic criterion. Each method has its advantages and disadvantages. In this article, I place statistical constraints on economic models to provide designs that meet industry's demand for low process variability and long-term product quality. This constrained economic model yields a design I call an economic statistical design. The model can be readily adapted to design any Shewhart-type control chart. In this article, I illustrate its use in the joint design of an X̄ chart and an R chart.

370 citations


Journal ArticleDOI
TL;DR: A wheeled toy vehicle including a drive assembly consisting of a monofilament line having one extremity connected to a manually operable control means and the opposite end connected to the running gear of the vehicle as mentioned in this paper.
Abstract: A wheeled toy vehicle including a drive assembly which comprises a monofilament line having one extremity connected to a manually operable control means and the opposite end connected to the running gear of the vehicle. The dimensions and configuration of the monofilament line are such as to transmit rotation of the line about its own longitudinal axis, caused by activation of the control means, directly to the running gear, which may comprise a drive axle and/or one or more drive wheels. Connecting means may attach the one extremity of the line to a predetermined outer portion of an axle or wheel by means of forming a socket therein correspondingly shaped to at least partially enclose a finger attached to the extremity of the line means cooperating therewith. Alternatively, a finger can be connected to the extremity of the drive axle and be disposed so as to be enclosed within a socket formed within a sleeve which is connected to the extremity of the line and comprises another embodiment of the connecting means.

357 citations


Journal ArticleDOI
TL;DR: This book discusses the development of Reliability Standards and Specifications, as well as techniques of Estimating Reliability at Design Stage, and the role of management in Reliability.
Abstract: Introduction, Definitions, and Relationships. The Role of Management in Reliability. Managing Reliability as a Process. Economics of Reliability. Design for Reliability. Failure Modes and Effects (FMEA) and Fault-Tree Analysis (FTA) (Success-Tree Analysis--STA). Reliability Specification and Goal Setting. Concurrent Engineering. Human-Centered Design. Reliability Information Collection and Analysis. Designing Experiments to Measure and Improve Reliability. Accelerated Testing. Failure Analysis System--Root Cause and Corrective Action. Physics of Failure. Maintainability and Reliability. Component Reliability. Thermal Management and Reliability of Electronics. Mechanical Stress and Analysis. Mechanical Reliability. Design for Mechanical Reliability. System Reliability. Software Reliability and the Development Process. Supplier Reliability and Quality Assurance. Techniques of Estimating Reliability at Design Stage. Mathematical and Statistical Methods and Models in Reliability and Life Studies. Life Distributions and Concepts. Graphical Analyses of Reliability Data. Appendices: A: Tables and Charts. B: Charts. C: Reliability Standards and Specifications.

254 citations


Journal ArticleDOI
TL;DR: In this article, the authors summarized recent work in optimal experimental design in nonlinear problems, in which the major difficulty in obtaining good or optimal designs is their dependence on the true value of the parameters.
Abstract: This article summarizes recent work in optimal experimental design in nonlinear problems, in which the major difficulty in obtaining good or optimal designs is their dependence on the true value of the parameters. This difficulty arises in problems with nonlinear models or with linear models in which interest lies in a nonlinear function of the parameters. Most approaches use a static design based on “prior” information about the parameters or a sequential procedure that takes advantage of the inflow of new information about them. The various versions of these methods are discussed, as are some of the consequent problems of inference. Some selected procedures are compared using simulation studies.

250 citations


Journal ArticleDOI
TL;DR: This entry is a review of the book Matrix Differential Calculus With Applications in Statistics and Econometrics.
Abstract: (1989). Matrix Differential Calculus With Applications in Statistics and Econometrics. Technometrics: Vol. 31, No. 4, pp. 501-502.

231 citations



Journal ArticleDOI
TL;DR: In this article, the robustness of the standard X̄ and R charting procedures is investigated; the most sensitive procedures for detecting the out-of-control state are those that plot a subgroup statistic that is sensitive to outliers (e.g., mean or range) but determine the control limits in a resistant fashion.
Abstract: When a process is first submitted to statistical quality control, a standard procedure is to collect 20–40 subgroups of about five units each and then to construct control charts, such as X̄ charts and R charts, with limits determined by the data. These control charts are then used to detect problems in control such as outliers or excess variability in subgroup means that may have a special cause. In this article, the robustness of these charting procedures is investigated. If the number of false alarms when the process is in control is held constant, the most sensitive procedures for detecting the out-of-control state are those that plot a subgroup statistic that is sensitive to outliers (e.g., mean or range) but determine the control limits in a resistant fashion. Ordinary charting procedures, such as the standard X̄ and R charts, perform less well, and the worst performance is turned in by procedures in which the subgroup statistics are themselves resistant (e.g., median charts). To illustrate the point that ...
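The winning recipe, a sensitive plotted statistic with resistantly determined limits, is easy to illustrate. The sketch below plots subgroup means but derives the center and spread from medians; the constant 1.349 converts an interquartile range to a normal-theory sigma. The helper name and the particular resistant estimators are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def resistant_xbar_limits(subgroups, k=3.0):
    """X-bar chart limits with a resistant center and scale: the center is
    the median of subgroup means, and the within-subgroup sigma comes from
    the median of subgroup interquartile ranges."""
    sub = np.asarray(subgroups, dtype=float)     # shape (n_subgroups, n)
    means = sub.mean(axis=1)
    center = np.median(means)
    # The IQR of a N(mu, sigma^2) sample estimates 1.349 * sigma.
    iqrs = np.subtract(*np.percentile(sub, [75, 25], axis=1))
    sigma_hat = np.median(iqrs) / 1.349
    half_width = k * sigma_hat / np.sqrt(sub.shape[1])
    return center - half_width, center + half_width

# Toy usage: one gross outlier should barely move the limits.
rng = np.random.default_rng(0)
data = rng.normal(10.0, 1.0, size=(25, 5))
data[3, 2] += 12.0
print(resistant_xbar_limits(data))
```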


Journal ArticleDOI
TL;DR: In this article, a plot of the spherical variance and the maximum and minimum prediction variances for locations on a sphere against the radius of the sphere is used to investigate and compare the prediction capabilities of certain response surface designs.
Abstract: Measures of the quality of prediction at locations on the surface of a hypersphere are presented. These measures are used to form a graphical method of assessing the overall prediction capability of an experimental design throughout the region of interest. A plot of the spherical variance and the maximum and minimum prediction variances for locations on a sphere against the radius of the sphere, a variance dispersion graph, is used to give a comprehensive picture of the behavior of the prediction variances throughout a region and hence of the quality of the predicted responses obtained with a particular design. Such plots are used to investigate and compare the prediction capabilities of certain response surface designs currently available to the researcher.
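The ingredients of a variance dispersion graph can be computed by brute force: evaluate the scaled prediction variance at many points on a sphere of each radius and record its minimum, average, and maximum. The sketch below does this by Monte Carlo over random directions (so the spherical variance is approximated by an average rather than evaluated analytically); all function names and the example design are illustrative.

```python
import numpy as np

def variance_dispersion(model_terms, design, radii, n_dirs=4000, seed=0):
    """Min/average/max of the scaled prediction variance
    n * f(x)' (F'F)^{-1} f(x) over points x on spheres of given radii,
    i.e., the quantities plotted against radius in a variance dispersion
    graph."""
    rng = np.random.default_rng(seed)
    F = np.vstack([model_terms(x) for x in design])
    n, k = design.shape
    M_inv = np.linalg.inv(F.T @ F)
    u = rng.normal(size=(n_dirs, k))
    u /= np.linalg.norm(u, axis=1, keepdims=True)  # directions on the unit sphere
    rows = []
    for r in radii:
        v = np.array([n * model_terms(x) @ M_inv @ model_terms(x) for x in r * u])
        rows.append((r, v.min(), v.mean(), v.max()))
    return rows

def quad2(x):
    """Full second-order response-surface model in two factors."""
    x1, x2 = x
    return np.array([1.0, x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

# Toy usage: a central composite design with two center runs.
a = np.sqrt(2.0)
ccd = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                [-a, 0], [a, 0], [0, -a], [0, a], [0, 0], [0, 0]], dtype=float)
for r, lo, avg, hi in variance_dispersion(quad2, ccd, radii=[0.5, 1.0, a]):
    print(f"r = {r:5.3f}   min = {lo:6.2f}   avg = {avg:6.2f}   max = {hi:6.2f}")
```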

Journal ArticleDOI
TL;DR: In this paper, the authors proposed a method for distinguishing an observational outlier from an innovational one using regression analysis techniques, and a four-step procedure for modeling time series in the presence of outliers.
Abstract: Some statistics used in regression analysis are considered for detection of outliers in time series. Approximations and asymptotic distributions of these statistics are considered. A method is proposed for distinguishing an observational outlier from an innovational one. A four-step procedure for modeling time series in the presence of outliers is also proposed, and an example is presented to illustrate the methodology.

Journal ArticleDOI
TL;DR: In this article, the authors describe the use of bootstrap methods for the problem of testing homogeneity of variances when means are not assumed equal or known, and show that the new resampling procedures compare favorably with older methods in terms of test validity and power.
Abstract: This article describes the use of bootstrap methods for the problem of testing homogeneity of variances when means are not assumed equal or known. The methods are new in this context and allow the use of normal-theory test statistics such as F = s₁²/s₂² without the normality assumption that is crucial for validity of critical values obtained from the F distribution. Both asymptotic analysis and Monte Carlo sampling show that the new resampling procedures compare favorably with older methods in terms of test validity and power.
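A minimal sketch of the idea: keep the normal-theory statistic F = s₁²/s₂² but calibrate it by resampling rather than by the F table. The particular resampling scheme below (pool the mean-centered observations and redraw both samples from the pool, so that equal variances hold by construction) is one plausible scheme; the article studies several variants, so treat this as illustrative.

```python
import numpy as np

def bootstrap_variance_test(x, y, n_boot=4999, seed=0):
    """Bootstrap p-value for H0: var(x) = var(y) when the means are
    unknown and possibly unequal.  Calibrates F = s_x^2 / s_y^2 against
    its resampling distribution under the null instead of the F table."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    f_obs = x.var(ddof=1) / y.var(ddof=1)
    pooled = np.concatenate([x - x.mean(), y - y.mean()])  # H0 holds here
    f_boot = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(pooled, size=len(x), replace=True)
        yb = rng.choice(pooled, size=len(y), replace=True)
        f_boot[b] = xb.var(ddof=1) / yb.var(ddof=1)
    p_hi = (1 + np.sum(f_boot >= f_obs)) / (n_boot + 1)
    p_lo = (1 + np.sum(f_boot <= f_obs)) / (n_boot + 1)
    return min(1.0, 2.0 * min(p_hi, p_lo))   # two-sided p-value

# Toy usage: heavy-tailed samples, where F-table critical values misbehave.
rng = np.random.default_rng(2)
print(bootstrap_variance_test(rng.standard_t(4, 30), rng.standard_t(4, 40)))
```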

Journal ArticleDOI
TL;DR: This book covers the mathematical notion of information, games of chance and probability theory, the teaching of probability theory, variations on a theme by Fibonacci, and the mathematical theory of trees.
Abstract: Foreword. On the Mathematical Notion of Information. Games of Chance and Probability Theory. Notes on the Teaching of Probability Theory. Variations on a Theme by Fibonacci. The Mathematical Theory of Trees.

Journal ArticleDOI
TL;DR: In this paper, maximum likelihood estimators for the mean of a population, based on censored samples that can be transformed to normality, are calculated via the expectation-maximization algorithm.
Abstract: The reporting procedures for potentially toxic pollutants are complicated by the fact that concentrations are measured using small samples that include a number of observations lying below some detection limit. Furthermore, there is often a small number of high concentrations observed in combination with a substantial number of low concentrations. This results in small, nonnormally distributed censored samples. This article presents maximum likelihood estimators for the mean of a population, based on censored samples that can be transformed to normality. The method estimates the optimal power transformation in the Box-Cox family by searching the censored-data likelihood. Maximum likelihood estimators for the mean in the transformed scale are calculated via the expectation-maximization algorithm. Estimates for the mean in the original scale are functions of the estimated mean and variance in the transformed population. Confidence intervals are computed using the delta method and the nonparametric percentile ...
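A sketch of the estimation problem under stated simplifications: the article uses the EM algorithm, whereas the fragment below maximizes the left-censored normal likelihood directly (via scipy) over a grid of Box-Cox powers. The function names, the grid, and the direct-maximization shortcut are illustrative assumptions, not the article's procedure.

```python
import numpy as np
from scipy import optimize, stats

def boxcox(y, lam):
    """Box-Cox power transformation."""
    return np.log(y) if lam == 0 else (y ** lam - 1.0) / lam

def neg_loglik(params, z_obs, z_lim, n_cens, lam, y_obs):
    """Minus the censored-data log-likelihood in the transformed scale:
    normal density for detected values, normal CDF at the transformed
    detection limit for the n_cens nondetects, plus the Box-Cox Jacobian."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    ll = stats.norm.logpdf(z_obs, mu, sigma).sum()
    ll += n_cens * stats.norm.logcdf((z_lim - mu) / sigma)
    ll += (lam - 1.0) * np.log(y_obs).sum()   # Jacobian of the transformation
    return -ll

def fit_censored_boxcox(y_obs, detection_limit, n_cens,
                        lambdas=np.linspace(-1.0, 1.0, 41)):
    """Search the censored-data likelihood over Box-Cox powers; returns
    (loglik, lambda, mu, sigma) at the best power."""
    best = None
    for lam in lambdas:
        z, z_lim = boxcox(y_obs, lam), boxcox(detection_limit, lam)
        res = optimize.minimize(neg_loglik, x0=[z.mean(), np.log(z.std())],
                                args=(z, z_lim, n_cens, lam, y_obs),
                                method="Nelder-Mead")
        if best is None or -res.fun > best[0]:
            best = (-res.fun, lam, res.x[0], np.exp(res.x[1]))
    return best

# Toy usage: lognormal concentrations with a detection limit of 0.4.
rng = np.random.default_rng(3)
y = rng.lognormal(0.0, 1.0, 50)
detected = y[y >= 0.4]
print(fit_censored_boxcox(detected, 0.4, n_cens=int((y < 0.4).sum())))
```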

Journal ArticleDOI
TL;DR: In this paper, failure-censored samples are used to design variables-sampling plans for lognormal- and Weibull-distributed lifetimes, and Monte Carlo simulations show that these plans meet the required risks when bias-corrected maximum likelihood estimators for the location and scale parameters are used.
Abstract: Variables-sampling plans are more efficient than attributes-sampling plans. If the quality characteristic is the life of a product, however, variables sampling can be very time consuming and expensive. To save time and money, tests can be terminated before all test units have failed. This article discusses the design of variables-sampling plans based on failure-censored samples. This method of design can be applied to lognormal-distributed and Weibull-distributed lifetimes. An example for lognormal-distributed lifetimes will be given. Although the method is based on asymptotic results, Monte Carlo simulations for lognormal and Weibull lifetimes show that the sampling plans meet the required risks when bias-corrected maximum likelihood estimators for the location and scale parameters are used. The advantage of failure-censored sampling plans is that they require a much smaller sample size than attributes-sampling plans and a greatly reduced test time when compared to complete variables-sampling plans.

Journal ArticleDOI
TL;DR: In this paper, a generalization of Harville's (1974) algorithm is given for generating nearly D-optimal block designs and is applied in a number of representative settings, and can be used in conjunction with qualitative treatments for generating many of the classical designs.
Abstract: We consider the problem of blocking response-surface and factorial designs when block sizes are prespecified, often rendering standard approaches inapplicable. A generalization of Harville's (1974) algorithm is given for generating nearly D-optimal block designs and is applied in a number of representative settings. The algorithm is time and space efficient and can be used in conjunction with qualitative treatments for generating many of the classical designs—balanced and partially balanced incomplete block designs, for example.
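The flavor of an exchange algorithm is easy to convey if blocking is set aside. The sketch below is a bare-bones random point exchange that accepts any swap increasing det(X'X); it is a simplified cousin of the Harville-style algorithm the article generalizes, not that algorithm itself, and all names are illustrative. The blocked version adds the constraint that swaps must keep the prespecified block sizes intact.

```python
import numpy as np

def d_optimal_exchange(candidates, n_runs, n_iter=2000, seed=0):
    """Greedy point exchange for a D-optimal exact design: start from a
    random n_runs-point design and accept any single-point swap with a
    candidate row that increases log det(X'X)."""
    rng = np.random.default_rng(seed)
    C = np.asarray(candidates, dtype=float)   # candidate model-matrix rows

    def logdet(ix):
        sign, val = np.linalg.slogdet(C[ix].T @ C[ix])
        return val if sign > 0 else -np.inf

    idx = rng.choice(len(C), size=n_runs, replace=False)
    best = logdet(idx)
    for _ in range(n_iter):
        trial = idx.copy()
        trial[rng.integers(n_runs)] = rng.integers(len(C))
        val = logdet(trial)
        if val > best:
            idx, best = trial, val
    return idx, best

# Toy usage: 6 runs for a quadratic in one factor over a 21-point grid.
x = np.linspace(-1.0, 1.0, 21)
F = np.column_stack([np.ones_like(x), x, x ** 2])
design, ld = d_optimal_exchange(F, n_runs=6)
print(sorted(x[design]), ld)   # runs pile up near -1, 0, and +1
```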



Journal ArticleDOI
TL;DR: The intended audience for this book, as stated by the author, is industrial practitioners involved in product or process experimentation and development; this review is oriented instead to readers with statistical experience who may wish to consider recommending Taguchi literature to statistically inexperienced colleagues.
Abstract: The intended audience for this book, as stated by the author, is "industrial practitioners (managers, engineers, and scientists) involved in product or process experimentation and development" (p. xi). The author notes the scarcity of information for nonstatisticians on Taguchi methods and states that the book should be "useful to the statistically inexperienced engineer who would have some difficulty understanding and utilizing a traditional text concerning designed experiments" (p. xii). Most readers of Technometrics do not fit this last description, so this review will be oriented to those with statistical experience who may wish to consider recommending Taguchi literature to statistically inexperienced colleagues. The scope of the book is limited to basic "cookbook" applications involving product design and process development (offline quality control) as opposed to process monitoring (on-line quality control), as presented by Taguchi. There are eight chapters and seven appendixes, as follows: 1. The Economics of Variation 2. Introduction to the Analysis of Variance 3. Introduction to Orthogonal Arrays 4. Multiple Level Experiments 5. Interpretation of Experimental Results 6. Special Designs 7. Attribute Data

Journal ArticleDOI
TL;DR: The assumptions of the on-line process control methods presented by Taguchi, Elsayed, and Hsiang (1989) are examined in this paper, where a method for approximating optimal control strategies is presented and compared with results obtained using Taguchi's method.
Abstract: The assumptions of the on-line process-control methods presented by Taguchi, Elsayed, and Hsiang (1989) are examined. Taguchi's method for obtaining control strategies is evaluated for the random-walk case. A method for approximating optimal control strategies is presented and compared with results obtained using Taguchi's method, with a new modification of Taguchi's method, and with simulation results.

Journal ArticleDOI
Emmanuel Yashchin
TL;DR: A class of weighted control schemes that generalizes the basic cumulative sum (CUSUM) technique is introduced, and representatives of this class are shown to have better run length characteristics with respect to drift in the level of a controlled process than does the classical CUSUM.
Abstract: A class of weighted control schemes that generalizes the basic cumulative sum (CUSUM) technique is introduced. The schemes of the first type, in which the weights represent information concomitant with the data, prove to be especially useful when handling charts corresponding to samples of varying sizes. The schemes of the second type are based on giving greater weight to more recent information. Representatives of this class are shown to have better run length characteristics with respect to drift in the level of a controlled process than does the classical CUSUM, while maintaining good sensitivity with respect to shifts. Analogous to the classical CUSUM scheme, they admit a dual graphical representation; that is, the scheme can be applied by means of a one- or two-sided decision interval or via a V mask. A special case of this type of scheme, designated the geometric CUSUM, is considered in detail. It can be viewed as a CUSUM-type counterpart of the exponentially weighted moving average.
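For intuition, the classical one-sided CUSUM and a geometrically discounted variant can be sketched in a few lines. The discounting recursion below, which downweights older information by a factor gamma each step, is an illustrative guess at the "greater weight to more recent information" idea, not Yashchin's exact geometric CUSUM; the constants are also assumptions.

```python
import numpy as np

def cusum(x, k, h):
    """Classical one-sided CUSUM: S_t = max(0, S_{t-1} + x_t - k);
    signal the first time S_t exceeds the decision interval h."""
    s = 0.0
    for t, xt in enumerate(x, start=1):
        s = max(0.0, s + xt - k)
        if s > h:
            return t
    return None

def discounted_cusum(x, k, h, gamma=0.9):
    """Geometrically weighted variant: past accumulation is discounted by
    gamma each step, so recent observations carry more weight
    (gamma = 1 recovers the classical scheme)."""
    s = 0.0
    for t, xt in enumerate(x, start=1):
        s = max(0.0, gamma * s + xt - k)
        if s > h:
            return t
    return None

# Toy usage: in-control data followed by a slow drift in the level.
rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(0.0, 1.0, 100),
                    rng.normal(0.02 * np.arange(100), 1.0)])
print(cusum(x, k=0.5, h=5.0), discounted_cusum(x, k=0.5, h=5.0))
```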

Journal ArticleDOI
TL;DR: Uses of rotation and animation in regression diagnostics are developed, including checking for interactions and normality, assessing the need to transform the data, and adding predictors to a model.
Abstract: We develop uses for two recently proposed types of dynamic displays—rotation and animation—in regression diagnostics. Some of the general issues that we address by using these displays include checking for interactions and normality, assessing the need to transform the data, and adding predictors to a model. Animation is used in probability plotting and as an aid to understanding the effects of adding variables to a model. Rotation is used for three-dimensional added-variable and residual plots, each of which may be effective for diagnosing the presence of an interaction.

Journal ArticleDOI
TL;DR: This entry is a review of the book Maximum Entropy and Bayesian Spectral Analysis and Estimation Problems.
Abstract: A review of the book Maximum Entropy and Bayesian Spectral Analysis and Estimation Problems.


Journal ArticleDOI
TL;DR: This entry is a review of the book Assignment Methods in Combinatorial Data Analysis.
Abstract: (1989). Assignment Methods in Combinatorial Data Analysis. Technometrics: Vol. 31, No. 2, pp. 272-273.