Author

James M. Lucas

Bio: James M. Lucas is an academic researcher from DuPont. The author has contributed to research in the topics of CUSUM and the Shewhart individuals control chart. He has an h-index of 17 and has co-authored 26 publications receiving 5,587 citations.

Papers
Journal Article
TL;DR: The recognition that an EWMA control scheme can be represented as a Markov chain allows its properties to be evaluated more easily and completely than has previously been done.

1,624 citations

Journal ArticleDOI
TL;DR: In this article, the authors evaluate the properties of an exponentially weighted moving average (EWMA) control scheme used to monitor the mean of a normally distributed process that may experience shifts away from the target value.
Abstract: Roberts (1959) first introduced the exponentially weighted moving average (EWMA) control scheme. Using simulation to evaluate its properties, he showed that the EWMA is useful for detecting small shifts in the mean of a process. The recognition that an EWMA control scheme can be represented as a Markov chain allows its properties to be evaluated more easily and completely than has previously been done. In this article, we evaluate the properties of an EWMA control scheme used to monitor the mean of a normally distributed process that may experience shifts away from the target value. A design procedure for EWMA control schemes is given. Parameter values not commonly used in the literature are shown to be useful for detecting small shifts in a process. In addition, several enhancements to EWMA control schemes are considered. These include a fast initial response feature that makes the EWMA control scheme more sensitive to start-up problems, a combined Shewhart EWMA that provides protection against both larg...

1,380 citations
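For readers unfamiliar with the mechanics described in this abstract, the following is a minimal sketch of an EWMA chart for individual observations. The smoothing weight and limit width used here are illustrative choices, not the design values tabulated in the paper, and the example data are invented.

```python
# Minimal EWMA control-chart sketch (illustrative parameters, not the paper's design tables).
import numpy as np

def ewma_chart(x, target, sigma, lam=0.1, L=2.7):
    """Return the EWMA statistic and exact (time-varying) control limits.

    x      : 1-D array of individual observations
    target : in-control process mean
    sigma  : in-control process standard deviation
    lam    : smoothing weight (small lam -> more sensitive to small shifts)
    L      : width of the control limits in sigma units
    """
    x = np.asarray(x, dtype=float)
    z = np.empty_like(x)
    prev = target                      # start the EWMA at the target value
    for i, xi in enumerate(x):
        prev = lam * xi + (1.0 - lam) * prev
        z[i] = prev
    i = np.arange(1, len(x) + 1)
    half_width = L * sigma * np.sqrt(lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * i)))
    return z, target - half_width, target + half_width

# Example: a 0.5-sigma shift in the mean after observation 20.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 1.0, 20), rng.normal(0.5, 1.0, 30)])
z, lcl, ucl = ewma_chart(data, target=0.0, sigma=1.0)
signals = np.flatnonzero((z > ucl) | (z < lcl))
print("first out-of-control signal at observation:", signals[0] + 1 if signals.size else None)
```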

Journal ArticleDOI
TL;DR: A group of practitioners and researchers discuss the role of parameter design and Taguchi's methodology for implementing it, including the research aimed at integrating parameter-design principles with well-established statistical techniques.
Abstract: It is more than a decade since Genichi Taguchi's ideas on quality improvement were introduced in the United States. His parameter-design approach for reducing variation in products and processes has generated a great deal of interest among both quality practitioners and statisticians. The statistical techniques used by Taguchi to implement parameter design have been the subject of much debate, however, and there has been considerable research aimed at integrating the parameter-design principles with well-established statistical techniques. On the other hand, Taguchi and his colleagues feel that these research efforts by statisticians are misguided and reflect a lack of understanding of the engineering principles underlying Taguchi's methodology. This panel discussion provides a forum for a technical discussion of these diverse views. A group of practitioners and researchers discuss the role of parameter design and Taguchi's methodology for implementing it. The topics covered include the importance of vari...

654 citations

Journal ArticleDOI
James M. Lucas
TL;DR: The Shewhart-CUSUM quality control scheme, which combines the key features of the Shewhart and CUSUM control procedures, is described and evaluated.
Abstract: The Shewhart-CUSUM quality control scheme which combines the key features of the Shewhart and CUSUM control procedures is described and evaluated. In this scheme the CUSUM feature will quickly detect small shifts from the goal while the addition of Shew..

412 citations
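A hedged sketch of the combined idea described above: a two-sided tabular CUSUM supplemented with a Shewhart check on each standardized observation. The parameter values (k, h, and the Shewhart limit) are placeholders, not the paper's recommended settings, and the restart-after-signal convention is just one common choice.

```python
# Sketch of a combined Shewhart-CUSUM check on standardized observations.
def shewhart_cusum(z_values, k=0.5, h=5.0, shewhart_limit=3.5):
    """Yield (index, reason) whenever the scheme signals.

    z_values : iterable of standardized observations (mean 0, std 1 in control)
    k        : CUSUM reference value (allowance)
    h        : CUSUM decision interval
    shewhart_limit : individual-observation limit for the Shewhart feature
    """
    s_hi = s_lo = 0.0
    for i, z in enumerate(z_values):
        s_hi = max(0.0, s_hi + z - k)   # upper CUSUM accumulates upward drift
        s_lo = max(0.0, s_lo - z - k)   # lower CUSUM accumulates downward drift
        if abs(z) > shewhart_limit:     # Shewhart feature: catches large single shifts
            yield i, "shewhart"
        if s_hi > h or s_lo > h:        # CUSUM feature: catches small sustained shifts
            yield i, "cusum"
            s_hi = s_lo = 0.0           # restart after a signal (one common convention)

# Example usage on standardized data:
# list(shewhart_cusum([0.2, 0.5, 1.1, 0.9, 1.3, 1.0, 1.2, 4.0]))
```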

Journal ArticleDOI
TL;DR: Design and implementation procedures for counted data CUSUM's (sometimes called CUSUM's for attributes) are described; these schemes are easy to design and implement and can be used to detect both increases and decreases in the count level.
Abstract: Cumulative Sum (CUSUM) control schemes are widely used in industry for process and measurement control. Most CUSUM applications have been for continuous variables. There have been fewer uses of CUSUM control schemes when the response is a count, such as the number of defects per unit or the occurrence of an accident. This article describes design and implementation procedures for counted data CUSUM's (these are sometimes called CUSUM's for attributes). These CUSUM's are easy to design and implement; they can be used to detect both increases and decreases in the count level. Enhancements to the CUSUM scheme, including the fast initial response (FIR) feature and the robust CUSUM, are discussed. These enhancements speed up the detection of changes in the count level and guard against the effects of atypical or outlier observations.

372 citations
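The abstract above describes counted-data CUSUMs with a fast initial response; the sketch below shows the basic recursion for detecting an upward shift in a count level. The reference value k, decision interval h, and head-start convention used here are illustrative assumptions, not the paper's design procedure.

```python
# Illustrative counted-data (Poisson) CUSUM with a fast-initial-response head start.
def count_cusum(counts, k, h, fir=True):
    """Return the index of the first signal for an upward shift in the count level, or None."""
    s = h / 2.0 if fir else 0.0        # FIR head start speeds detection of start-up problems
    for i, c in enumerate(counts):
        s = max(0.0, s + c - k)        # accumulate evidence that counts exceed the reference value
        if s >= h:
            return i
    return None

# Example: defect counts that rise from about 2 to about 5 per unit.
counts = [2, 1, 3, 2, 2, 4, 5, 4, 6, 5]
print(count_cusum(counts, k=3.0, h=6.0))
```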


Cited by
Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI
TL;DR: A unified framework for the design and the performance analysis of the algorithms for solving change detection problems and links with the analytical redundancy approach to fault detection in linear systems are established.
Abstract: This book is downloadable from http://www.irisa.fr/sisthem/kniga/. Many monitoring problems can be stated as the problem of detecting a change in the parameters of a static or dynamic stochastic system. The main goal of this book is to describe a unified framework for the design and the performance analysis of algorithms for solving these change detection problems. The book also contains the key mathematical background necessary for this purpose. Finally, links with the analytical redundancy approach to fault detection in linear systems are established. We call an abrupt change any change in the parameters of the system that occurs either instantaneously or at least very fast with respect to the sampling period of the measurements. Abrupt changes by no means refer only to changes of large magnitude; on the contrary, in most applications the main problem is to detect small changes. Moreover, in some applications the early warning of small, and not necessarily fast, changes is of crucial interest in order to avoid the economic or even catastrophic consequences that can result from an accumulation of such small changes. For example, small faults arising in the sensors of a navigation system can result, through the underlying integration, in serious errors in the estimated position of the plane. Another example is the early warning of small deviations from the normal operating conditions of an industrial process. The early detection of slight changes in the state of the process makes it possible to plan more adequately the periods during which the process should be inspected and possibly repaired, and thus to reduce operating costs.

3,830 citations
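As a concrete illustration of the change-detection framework summarized above, here is the log-likelihood-ratio form of a one-sided CUSUM for a shift in the mean of a Gaussian sequence. The means, variance, and threshold are assumed values for the sketch, not prescriptions from the book.

```python
# Log-likelihood-ratio form of the CUSUM change detector for a Gaussian mean shift mu0 -> mu1.
def cusum_llr(x, mu0, mu1, sigma, threshold):
    """Return (alarm_index, decision_statistic_trace); alarm_index is None if no alarm."""
    g, trace = 0.0, []
    for i, xi in enumerate(x):
        # Log-likelihood ratio of one observation under mu1 versus mu0.
        s = (mu1 - mu0) / sigma**2 * (xi - (mu0 + mu1) / 2.0)
        g = max(0.0, g + s)            # resetting at zero keeps only evidence since the suspected change
        trace.append(g)
        if g > threshold:
            return i, trace
    return None, trace
```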

Book
01 Jan 2004
TL;DR: This paper establishes the possibility of stable recovery under a combination of sufficient sparsity and favorable structure of the overcomplete system, and shows that similar stability is also available using the basis pursuit and matching pursuit algorithms.
Abstract: Overcomplete representations are attracting interest in signal processing theory, particularly due to their potential to generate sparse representations of signals. However, in general, the problem of finding sparse representations must be unstable in the presence of noise. This paper establishes the possibility of stable recovery under a combination of sufficient sparsity and favorable structure of the overcomplete system. Considering an ideal underlying signal that has a sufficiently sparse representation, it is assumed that only a noisy version of it can be observed. Assuming further that the overcomplete system is incoherent, it is shown that the optimally sparse approximation to the noisy data differs from the optimally sparse decomposition of the ideal noiseless signal by at most a constant multiple of the noise level. As this optimal-sparsity method requires heavy (combinatorial) computational effort, approximation algorithms are considered. It is shown that similar stability is also available using the basis pursuit and matching pursuit algorithms. Furthermore, it is shown that these methods result in sparse approximation of the noisy data that contains only terms also appearing in the unique sparsest representation of the ideal noiseless sparse signal.

2,365 citations
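To make the matching-pursuit step mentioned above concrete, here is a minimal greedy sparse-approximation sketch over an overcomplete dictionary. It assumes unit-norm dictionary columns, and the iteration cap and tolerance are illustrative; it is not the exact algorithm analyzed in the paper.

```python
# Minimal matching-pursuit sketch for sparse approximation in an overcomplete dictionary D.
# Columns of D are assumed to have unit norm.
import numpy as np

def matching_pursuit(D, y, max_atoms=10, tol=1e-6):
    """Greedy sparse approximation: return a coefficient vector x with y ~= D @ x."""
    residual = y.astype(float)
    x = np.zeros(D.shape[1])
    for _ in range(max_atoms):
        correlations = D.T @ residual            # inner products with every atom
        j = int(np.argmax(np.abs(correlations))) # pick the atom most correlated with the residual
        x[j] += correlations[j]
        residual = residual - correlations[j] * D[:, j]  # remove that atom's contribution
        if np.linalg.norm(residual) < tol:
            break
    return x
```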

Journal ArticleDOI
TL;DR: This paper surveys the existing application of statistical approximation techniques (metamodels) in engineering design, addresses the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes, and gives recommendations for the appropriate use of these techniques in given situations.
Abstract: The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today’s engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper, we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning and kriging. We survey their existing application in engineering design, and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations, and how common pitfalls can be avoided.

1,886 citations
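As a small illustration of the metamodeling idea surveyed above, the sketch below fits a quadratic response surface to a few runs of a stand-in "expensive" analysis function; the function, sample design, and evaluation point are invented for the example.

```python
# Hedged illustration of the metamodel idea: fit a cheap quadratic response surface
# to a handful of runs of an "expensive" analysis code (here a stand-in function).
import numpy as np

def expensive_analysis(x1, x2):
    # Placeholder for a costly simulation or analysis code.
    return (x1 - 0.3) ** 2 + 2.0 * (x2 + 0.1) ** 2 + 0.5 * x1 * x2

# Sample the design space (a small grid plays the role of a designed experiment).
pts = np.array([(a, b) for a in np.linspace(-1, 1, 4) for b in np.linspace(-1, 1, 4)])
y = np.array([expensive_analysis(a, b) for a, b in pts])

# Quadratic polynomial basis: 1, x1, x2, x1^2, x2^2, x1*x2.
X = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0] ** 2, pts[:, 1] ** 2, pts[:, 0] * pts[:, 1]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# The fitted surface can now stand in for the expensive code during optimization or exploration.
x_new = np.array([0.25, -0.15])
basis = np.array([1.0, x_new[0], x_new[1], x_new[0] ** 2, x_new[1] ** 2, x_new[0] * x_new[1]])
print("metamodel prediction:", basis @ coef, "true value:", expensive_analysis(*x_new))
```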