
Showing papers in "Technometrics in 1994"


Journal ArticleDOI
TL;DR: A book review of the monograph Statistical Models Based on Counting Processes.
Abstract: (1994). Statistical Models Based On Counting Process. Technometrics: Vol. 36, No. 1, pp. 111-112.

755 citations


Journal ArticleDOI
TL;DR: A book review of Simulation Modeling and Analysis (2nd Ed.).
Abstract: (1994). Simulation Modeling and Analysis (2nd Ed.). Technometrics: Vol. 36, No. 4, pp. 429-430.

467 citations


Journal ArticleDOI

450 citations



Reference BookDOI
TL;DR: A reference work on the integration of Geographic Information Systems (GIS).
Abstract: Keywords: integration of GIS.

385 citations


BookDOI
TL;DR: In this article, the authors present a monograph for students at the graduate level in biostatistics, statistics or other disciplines that collect longitudinal data, focusing on the state space approach that provides a convenient way to compute likelihoods using the Kalman filter.
Abstract: This monograph is written for students at the graduate level in biostatistics, statistics or other disciplines that collect longitudinal data. It concentrates on the state space approach that provides a convenient way to compute likelihoods using the Kalman filter.

344 citations



Journal ArticleDOI
TL;DR: In this paper, a conditional method of inference is used to derive exact confidence intervals for several life characteristics such as location, scale, quantiles, and reliability when the data are Type II progressively censored.
Abstract: A conditional method of inference is used to derive exact confidence intervals for several life characteristics such as location, scale, quantiles, and reliability when the data are Type II progressively censored. The method is shown to be feasible and practical, although a computer program may be required for its implementation. The method is applied for the purpose of illustration to the extreme-value and the one- and two-parameter exponential models. Prediction limits for the lifelength of future units are also discussed. An example consisting of data from an accelerated test on insulating fluid reported by Nelson is used for illustration and comparison.

329 citations
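Under Type II progressive censoring, a prescribed number of surviving units is withdrawn from test after each observed failure. As background to the article's setting (a sketch of the sampling scheme only, not the authors' conditional-inference method), a progressively censored sample from a one-parameter exponential model can be simulated via spacings: for the exponential, the gap to the next failure is exponential with mean scaled by the number of units still on test. Function and argument names are illustrative.

```python
import numpy as np

def progressive_type2_exp(theta, scheme, seed=None):
    """Simulate a progressively Type II censored sample from Exp(mean=theta).

    `scheme` is (R_1, ..., R_m): after the i-th observed failure, R_i
    surviving units are withdrawn. Total units on test: n = m + sum(scheme).
    """
    rng = np.random.default_rng(seed)
    n = len(scheme) + sum(scheme)
    times, t, on_test = [], 0.0, n
    for r in scheme:
        # next failure among the units still on test
        t += rng.exponential(theta / on_test)
        times.append(t)
        on_test -= 1 + r  # the failed unit plus r withdrawn units
    return times
```

For example, `progressive_type2_exp(100.0, [0, 1, 2])` simulates three ordered failure times from six units on test.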


Journal ArticleDOI

322 citations


Journal ArticleDOI
TL;DR: In this paper, run-length distributions of the special cause control chart were derived for correlated observations, given that the assignable cause to be detected is a shift in the process mean.
Abstract: We derive run-length distributions of the special-cause control chart proposed by Alwan and Roberts for correlated observations, given that the assignable cause to be detected is a shift in the process mean. Both recursive and closed-form solutions are derived for the run-length distribution, average run length (ARL), and standard deviation of the run length (SRL) for any AR(p) process, and approximate solutions are derived for the more general ARMA(p,q) processes. The expressions derived do not depend on the type of shift in the process mean. Numerical results are illustrated for the ARL and SRL of the ARMA(1,1) model, given that the shift in the mean is a step shift. These results show that the ARL and SRL of the special-cause control chart are relatively smaller when the process is negatively rather than positively autocorrelated. Regardless of the sign of the autocorrelation, the shape of the probability mass function of the run length reveals that the probability of detecting shifts very early is substantial.

302 citations
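The claim that the ARL is smaller under negative autocorrelation is easy to check by Monte Carlo. The sketch below (an illustrative simulation, not the authors' recursive or closed-form solutions) charts the one-step residuals e_t = x_t - phi*x_{t-1} of an AR(1) process with 3-sigma limits after a step shift of delta in the mean; in steady state the residual mean becomes delta*(1 - phi), which is larger in magnitude when phi < 0.

```python
import numpy as np

def arl_residual_chart(phi, delta, n_reps=1000, limit=3.0, seed=1):
    """Monte Carlo ARL of a 3-sigma chart on the one-step residuals of an
    AR(1) process whose mean has shifted by `delta` innovation-sigmas.
    The process is started at the shifted steady-state mean."""
    rng = np.random.default_rng(seed)
    total = 0
    for _ in range(n_reps):
        x_prev = delta
        t = 0
        while True:
            t += 1
            x = delta * (1 - phi) + phi * x_prev + rng.standard_normal()
            if abs(x - phi * x_prev) > limit:  # residual outside 3-sigma limits
                total += t
                break
            x_prev = x
    return total / n_reps
```

Running this for phi = -0.5 versus phi = +0.5 with delta = 1 shows a much shorter average run length in the negatively autocorrelated case, consistent with the abstract.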


Journal ArticleDOI
TL;DR: This article proposes a modification of the D-optimal approach that preserves the flexibility and ease of use of algorithmic designs while being more resistant to the biases caused by an incorrect model.
Abstract: D-optimal and other computer-generated experimental designs have been criticized for being too dependent on an assumed statistical model. To address this criticism, we introduce the notion of empirical models that have both primary and potential terms. Combining this idea with the Bayesian paradigm, this article proposes a modification of the D-optimal approach that preserves the flexibility and ease of use of algorithmic designs while being more resistant to the biases caused by an incorrect model. These designs provide a Bayesian justification for resolution IV designs. Several theoretical examples and a practical example from the literature demonstrate the advantages of the proposed method.

Journal ArticleDOI


Journal ArticleDOI
TL;DR: In this paper, the authors propose a methodology for designing experiments for degradation processes in which the amount of degradation over time levels off toward a plateau (maximum degradation) that is a function of stress.
Abstract: Traditionally, reliability assessment of new devices has been based on accelerated life tests. This approach is not practical for highly reliable devices, such as lasers, which are not likely to fail in experiments of reasonable length. An alternative approach is to monitor the devices for a period of time and assess their reliability from the changes in performance (degradation) observed during the experiment. In this article, we propose a methodology for designing experiments for degradation processes in which the amount of degradation over time levels off toward a plateau (maximum degradation) that is a function of stress. We provide (a) the stress levels for the experiment, (b) the proportion of devices to test at each stress level, (c) the times at which to measure the devices, and (d) the total number of devices to test. We apply the proposed methodology to an actual example.
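A common way to model degradation that levels off toward a plateau is an exponential approach to an asymptote. The form below is an illustrative assumption, not necessarily the article's model (where the plateau is additionally a function of stress):

```python
import numpy as np

def degradation_path(t, d_inf, rate):
    """Degradation at time t, approaching the plateau d_inf exponentially.
    In the article's setting d_inf would itself depend on the stress level."""
    return d_inf * (1.0 - np.exp(-rate * np.asarray(t, dtype=float)))
```

The path starts at zero, rises monotonically, and flattens toward `d_inf`, matching the "levels off toward a plateau" behavior described above.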



Journal ArticleDOI
TL;DR: A book review of Statistical Models for Causal Analysis.
Abstract: (1994). Statistical Models for Causal Analysis. Technometrics: Vol. 36, No. 3, pp. 327-328.

Journal ArticleDOI
TL;DR: A book review of An Introduction to Stochastic Modeling (2nd Ed.).
Abstract: (1994). An Introduction to Stochastic Modeling (2nd Ed.) Technometrics: Vol. 36, No. 4, pp. 428-429.

Journal ArticleDOI
TL;DR: In this paper, a new sensitivity test is proposed to efficiently estimate the parameters of the distribution of a latent continuous variable that cannot be measured directly, such as the detonation thresholds of explosive specimens.
Abstract: Sensitivity tests are often used to estimate the parameters associated with latent continuous variables that cannot be measured. For example, each explosive specimen has a threshold. The specimen will detonate if and only if an applied shock exceeds this value. Since there is no way to determine the threshold of an individual, specimens are tested at various levels to determine parameters of the population. A new test described here produces efficient estimates of the parameters of the distribution, even with limited prior knowledge. This test efficiently characterizes the entire distribution and desired percentiles of any population.
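For context, the classic up-and-down (Bruceton) procedure is the kind of sensitivity test such new designs improve on: raise the stress level after a non-response, lower it after a response. The sketch below simulates it with a normally distributed latent threshold; this is background, not the article's new test, and all names are illustrative.

```python
import numpy as np

def up_down_test(mu, sigma, start, step, n_trials, seed=0):
    """Classic up-and-down sensitivity test against a simulated
    N(mu, sigma) latent threshold. Returns the stress levels used
    and the binary outcomes observed."""
    rng = np.random.default_rng(seed)
    level = start
    levels, fires = [], []
    for _ in range(n_trials):
        threshold = rng.normal(mu, sigma)  # latent, never observed directly
        fired = level >= threshold         # specimen responds iff stress exceeds it
        levels.append(level)
        fires.append(fired)
        level += -step if fired else step  # step down after a response, up otherwise
    return levels, fires
```

The test levels cluster around the median threshold, which is why the simple procedure estimates central percentiles well but extreme percentiles poorly, motivating more efficient designs like the one described above.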

Journal ArticleDOI
TL;DR: In this article, the authors extend the existing maximum likelihood theory for test planning to the nonconstant σ model and present test plans for a large range of practical testing situations.
Abstract: Previous work on planning accelerated life tests has assumed that the scale parameter σ for a location-scale distribution of log lifetime remains constant over all stress levels. This assumption is, however, inappropriate for many applications, including accelerated tests for metal fatigue and certain electronic components. This article extends the existing maximum likelihood theory for test planning to the nonconstant σ model and presents test plans for a large range of practical testing situations. The test plans are optimum in that they minimize the asymptotic variance of the maximum likelihood estimator of a specified quantile at the design stress. The development and discussion in the article, as well as the theory given in the Appendix, applies to accelerated-life-test models in which the log time-to-failure can be modeled as a location-scale distribution. The test setup assumes simultaneous testing of units with time censoring. We give particular numerical results for the Weibull failure-time distribution.

Journal ArticleDOI
TL;DR: Understanding Variation: The Key to Managing Chaos by Donald J. Wheeler is a short, clearly written, and important work on variation that should be read by all technical communicators.


Journal ArticleDOI
TL;DR: The multivariate profile (MP) chart as mentioned in this paper is a control chart for simultaneous display of univariate and multivariate statistics that is designed to analyze and display extended structures of statistical process control data for various cases of grouping, reference distribution, and use of nominal specifications.
Abstract: The multivariate profile (MP) chart is a new control chart for simultaneous display of univariate and multivariate statistics. It is designed to analyze and display extended structures of statistical process control data for various cases of grouping, reference distribution, and use of nominal specifications. For each group of observations, the scaled deviations from reference values are portrayed together as a modified profile plot symbol. The vertical location of the symbol is determined by the multivariate distance of the vector of means from the reference values. The graphical display in the MP chart enjoys improved visual characteristics as compared with previously suggested methods. Moreover, the perceptual tasks required by the use of the MP chart provide higher accuracy in retrieving the quantitative information. This graphical display is used to display other combined univariate and multivariate statistics, such as measures of dispersion, principal components, and cumulative sums.
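The vertical location of an MP chart symbol is driven by a multivariate distance of the group mean vector from the reference values. A minimal Hotelling-type sketch, assuming a known in-control covariance matrix (the article's exact scaling may differ):

```python
import numpy as np

def mp_distance(group, mu0, sigma):
    """Hotelling-type distance of a group's mean vector from reference
    values mu0, given an assumed in-control covariance matrix sigma."""
    group = np.asarray(group, dtype=float)
    d = group.mean(axis=0) - np.asarray(mu0, dtype=float)
    return len(group) * d @ np.linalg.inv(sigma) @ d
```

A group whose mean equals the reference values has distance zero; larger distances push the profile symbol higher on the chart.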

Journal ArticleDOI
TL;DR: In this paper, a generalized p value is applied for testing hypotheses in two situations, testing the significance of a variance component in a general balanced mixed model when an exact F test does not exist and comparing randomeffects variance components in two independent balanced mixed models.
Abstract: The concept of a generalized p value, introduced by Tsui and Weerahandi, is applied for testing hypotheses in two situations, testing the significance of a variance component in a general balanced mixed model when an exact F test does not exist and comparing random-effects variance components in two independent balanced mixed models. Extensions to the unbalanced cases are also indicated. The proposed test is compared with the test based on the Satterthwaite approximation through their simulated Type I error probabilities. The simulations indicate that the test based on the generalized p value has Type I error probabilities less than the chosen significance level most of the time, whereas the Type I error probabilities of the Satterthwaite approximate test can be much larger than the significance level. The results are illustrated using two examples.


Journal ArticleDOI
Emmanuel Yashchin1
TL;DR: The article discusses the problem of monitoring the process level and variability by using the cumulative sum technique and some aspects of the implementation of this methodology are considered.
Abstract: This article discusses methods for monitoring a process in which the variance of the measurements is attributed to several known sources of variability. For example, in the case of integrated circuit fabrication one is typically interested in monitoring the process mean as well as the lot-to-lot, wafer-to-wafer-within-lot, and within-wafer components of variability. The article discusses the problem of monitoring the process level and variability by using the cumulative sum technique. Some aspects of the implementation of this methodology are also considered, and examples are given.
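As a reference point, a generic two-sided CUSUM for the process level (not the article's multiple-variance-components scheme) accumulates deviations from target beyond a slack value k and signals when either one-sided sum exceeds a decision limit h:

```python
def cusum_signal(x, target, k, h):
    """Two-sided CUSUM: return the index (1-based) of the first
    observation at which either one-sided sum exceeds h, or None."""
    s_hi = s_lo = 0.0
    for i, xi in enumerate(x):
        s_hi = max(0.0, s_hi + (xi - target) - k)  # detects upward shifts
        s_lo = max(0.0, s_lo - (xi - target) - k)  # detects downward shifts
        if s_hi > h or s_lo > h:
            return i + 1
    return None

# With k = 0.5 and h = 5, a persistent shift of 2 units adds 1.5 per
# observation, so the chart signals a few observations after the shift:
cusum_signal([0.0] * 10 + [2.0] * 10, target=0.0, k=0.5, h=5.0)  # signals at 14
```

The same accumulate-and-reset logic can be applied separately to statistics estimating each variance component, which is the spirit of the monitoring scheme discussed above.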

Journal ArticleDOI
TL;DR: In this article, the authors propose an additional rule to the usual definition of resolution that provides a conservative but more realistic assessment of the resolution of two-level fractional factorial designs.
Abstract: When two-level fractional factorial designs are blocked, the application of the standard definition of resolution requires careful consideration. Sometimes linear contrasts that superficially appear to be estimates of higher-order interaction effects are in reality estimates of first-order effects. Experimenters may therefore inadvertently choose designs that are of lower resolution than intended or unknowingly confound important effects. In this note, I discuss this subtle problem and propose an additional rule to the usual definition of resolution that provides a conservative but more realistic assessment of the resolution. With this more realistic characterization, experimenters are provided with a warning about possible confounding. I also show that my amendment to the definition of resolution may be useful when characterizing designs in which several two-level contrasts are combined to accommodate factors with four or more levels.