
Showing papers in "Technometrics in 1993"


Journal ArticleDOI
TL;DR: Fundamentals of Statistical Signal Processing: Estimation Theory is a seminal work in the field of statistical signal processing and has been used extensively in many applications.
Abstract: (1995). Fundamentals of Statistical Signal Processing: Estimation Theory. Technometrics: Vol. 37, No. 4, pp. 465-466.

14,342 citations


Journal ArticleDOI
TL;DR: A unified framework is described for the design and performance analysis of algorithms for solving change detection problems, and links are established with the analytical redundancy approach to fault detection in linear systems.
Abstract: This book is downloadable from http://www.irisa.fr/sisthem/kniga/. Many monitoring problems can be stated as the problem of detecting a change in the parameters of a static or dynamic stochastic system. The main goal of this book is to describe a unified framework for the design and the performance analysis of the algorithms for solving these change detection problems. The book also contains the key mathematical background necessary for this purpose. Finally, links with the analytical redundancy approach to fault detection in linear systems are established. We call abrupt change any change in the parameters of the system that occurs either instantaneously or at least very fast with respect to the sampling period of the measurements. Abrupt changes by no means refer to changes with large magnitude; on the contrary, in most applications the main problem is to detect small changes. Moreover, in some applications, the early warning of small - and not necessarily fast - changes is of crucial interest in order to avoid the economic or even catastrophic consequences that can result from an accumulation of such small changes. For example, small faults arising in the sensors of a navigation system can result, through the underlying integration, in serious errors in the estimated position of the aircraft. Another example is the early warning of small deviations from the normal operating conditions of an industrial process. The early detection of slight changes in the state of the process makes it possible to plan more adequately the periods during which the process should be inspected and possibly repaired, and thus to reduce operating costs.

3,830 citations
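One of the basic algorithms in this framework is the CUSUM test. Below is a minimal sketch of a two-sided CUSUM for detecting a shift in the mean of an independent Gaussian sequence; the target mean, shift size, and decision threshold are illustrative choices, not values from the book.

```python
import numpy as np

def cusum_alarm(x, mu0=0.0, delta=1.0, h=8.0):
    """Return the index of the first alarm (or None) of a two-sided CUSUM.

    The two statistics accumulate evidence for an upward or downward shift of
    size `delta` in the mean, relative to the in-control mean `mu0`.
    """
    k = delta / 2.0                      # reference value (allowance)
    s_hi = s_lo = 0.0
    for t, xt in enumerate(x):
        s_hi = max(0.0, s_hi + (xt - mu0) - k)
        s_lo = max(0.0, s_lo - (xt - mu0) - k)
        if s_hi > h or s_lo > h:
            return t
    return None

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 1.0, 200),    # in-control segment
                       rng.normal(1.0, 1.0, 100)])   # sustained mean shift at t = 200
print("first alarm at observation", cusum_alarm(data))
```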


Journal ArticleDOI
TL;DR: This book covers the representation and geometry of multivariate data and nonparametric density estimation, including histograms, frequency polygons, averaged shifted histograms, kernel density estimators, and the curse of dimensionality and dimension reduction.
Abstract: Representation and Geometry of Multivariate Data. Nonparametric Estimation Criteria. Histograms: Theory and Practice. Frequency Polygons. Averaged Shifted Histograms. Kernel Density Estimators. The Curse of Dimensionality and Dimension Reduction. Nonparametric Regression and Additive Models. Special Topics. Appendices. Indexes.

3,007 citations
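As a pointer to what the kernel density estimation material covers, here is a minimal sketch of a Gaussian-kernel density estimate with a normal-reference bandwidth; both the kernel and the bandwidth rule are illustrative defaults, not the book's recommendations.

```python
import numpy as np

def kde(x_grid, data, bandwidth=None):
    """Gaussian kernel density estimate of `data` evaluated on `x_grid`."""
    data = np.asarray(data, dtype=float)
    n = data.size
    if bandwidth is None:
        bandwidth = 1.06 * data.std(ddof=1) * n ** (-1.0 / 5.0)  # normal-reference rule
    u = (x_grid[:, None] - data[None, :]) / bandwidth
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (n * bandwidth * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(1)
sample = rng.normal(size=500)
grid = np.linspace(-4.0, 4.0, 201)
density = kde(grid, sample)
print(f"estimated density at 0: {density[100]:.3f}")  # should be near 1/sqrt(2*pi) ~ 0.399
```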


Journal ArticleDOI
TL;DR: This book presents the factor analytic model, methods of factor extraction (including the centroid method), orthogonal and oblique rotation, simple structure and other rotational criteria, confirmatory factor analysis, and structural equation models, with the development of the Comrey Personality Scales as an extended example of the use of factor analysis.
Abstract: Contents: Preface to the First Edition. Preface to the Second Edition. Introduction. The Factor Analytic Model. Factor Extraction by the Centroid Method. Methods of Factor Extraction. Orthogonal Hand Rotations. Oblique Hand Rotations. Simple Structure and Other Rotational Criteria. Planning the Standard Design Factor Analysis. Alternate Designs in Factor Analysis. Interpretation and Application of Factor Analytic Results. Development of the Comrey Personality Scales: An Example of the Use of Factor Analysis. Confirmatory Factor Analysis. Structural Equation Models. Computer Programs.

2,704 citations


Journal ArticleDOI
TL;DR: In this article, the authors examined partial least squares and principal components regression from a statistical perspective and compared them with other statistical methods intended for those situations, such as variable subset selection and ridge regression.
Abstract: Chemometrics is a field of chemistry that studies the application of statistical methods to chemical data analysis. In addition to borrowing many techniques from the statistics and engineering literatures, chemometrics itself has given rise to several new data-analytical methods. This article examines two methods commonly used in chemometrics for predictive modeling—partial least squares and principal components regression—from a statistical perspective. The goal is to try to understand their apparent successes and in what situations they can be expected to work well and to compare them with other statistical methods intended for those situations. These methods include ordinary least squares, variable subset selection, and ridge regression.

2,309 citations
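For readers who want to try the comparison, here is a hedged sketch that cross-validates principal components regression, partial least squares, and ridge regression on synthetic collinear data using scikit-learn; the data-generating model, number of components, and ridge penalty are illustrative choices, not those of the article.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
n, p = 80, 30
latent = rng.normal(size=(n, 3))                                    # few underlying factors
X = latent @ rng.normal(size=(3, p)) + 0.1 * rng.normal(size=(n, p))  # many collinear predictors
y = latent @ np.array([2.0, -1.0, 0.5]) + 0.5 * rng.normal(size=n)

models = {
    "PCR (5 components)": make_pipeline(PCA(n_components=5), LinearRegression()),
    "PLS (5 components)": PLSRegression(n_components=5),
    "Ridge (alpha=1)": Ridge(alpha=1.0),
}
for name, model in models.items():
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R^2 = {score:.3f}")
```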


Journal ArticleDOI
TL;DR: In this article, the authors developed statistical methods for using degradation measures to estimate a time-to-failure distribution for a broad class of degradation models, using a nonlinear mixed-effects model and developing methods based on Monte Carlo simulation to obtain point estimates and confidence intervals for reliability assessment.
Abstract: Some life tests result in few or no failures. In such cases, it is difficult to assess reliability with traditional life tests that record only time to failure. For some devices, it is possible to obtain degradation measurements over time, and these measurements may contain useful information about product reliability. Even with little or no censoring, there may be important practical advantages to analyzing degradation data. If failure is defined in terms of a specified level of degradation, a degradation model defines a particular time-to-failure distribution. Generally it is not possible to obtain a closed-form expression for this distribution. The purpose of this work is to develop statistical methods for using degradation measures to estimate a time-to-failure distribution for a broad class of degradation models. We use a nonlinear mixed-effects model and develop methods based on Monte Carlo simulation to obtain point estimates and confidence intervals for reliability assessment.

1,062 citations
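A minimal Monte Carlo sketch of the general idea, assuming a simple linear degradation path D(t) = beta0 + b*t with a random slope b across units and failure defined as crossing a fixed threshold; the path model, random-effects distribution, and threshold are all illustrative, and the article's nonlinear mixed-effects estimation and confidence-interval machinery are not reproduced here.

```python
import numpy as np

# Assumed degradation path for unit i: D_i(t) = beta0 + b_i * t, with a random
# slope b_i that varies from unit to unit; failure is the first time D_i(t)
# reaches the threshold Df. All numerical values are illustrative.
rng = np.random.default_rng(3)
n_sim = 100_000
beta0, Df = 0.0, 10.0
slopes = rng.lognormal(mean=np.log(0.05), sigma=0.3, size=n_sim)  # simulated random effects

failure_times = (Df - beta0) / slopes               # crossing time of each simulated path
t_grid = np.array([150.0, 200.0, 300.0])
F_hat = (failure_times[:, None] <= t_grid[None, :]).mean(axis=0)
for t, F in zip(t_grid, F_hat):
    print(f"estimated F({t:g}) = {F:.3f}")
```

In the article's setting the random-effects parameters are themselves estimated from the observed degradation measurements via a nonlinear mixed-effects model, and Monte Carlo resampling of those estimates is what yields point estimates and confidence intervals for the time-to-failure distribution.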


Journal ArticleDOI
TL;DR: In this article, a wide range of multiple time series models and methods are considered, including vector autoregressive, vector auto-regressive moving average, cointegrated and periodic processes as well as state space and dynamic simultaneous equations models.
Abstract: This graduate-level textbook deals with analyzing and forecasting multiple time series. It considers a wide range of multiple time series models and methods. The models include vector autoregressive, vector autoregressive moving average, cointegrated and periodic processes as well as state space and dynamic simultaneous equations models. Least squares, maximum likelihood and Bayesian methods are considered for estimating these models. Different procedures for model selection or specification are treated and a range of tests and criteria for evaluating the adequacy of a chosen model are introduced. The choice of point and interval forecasts as well as innovation accounting are presented as tools for structural analysis within the multiple time series context.

623 citations
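A minimal sketch of least-squares estimation for the simplest member of this model class, a bivariate VAR(1); the coefficient matrix and noise level are illustrative. For lag selection, diagnostics, and forecasting, a full implementation such as the VAR class in statsmodels would normally be used.

```python
import numpy as np

rng = np.random.default_rng(4)
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.3]])          # illustrative stable coefficient matrix
T = 500
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.5, size=2)   # VAR(1) simulation

Y, X = y[1:], y[:-1]                              # regress y_t on y_{t-1}
B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)     # multivariate least squares
print("estimated coefficient matrix:\n", B_hat.T)  # arranged like A_true
```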


Journal ArticleDOI
TL;DR: The best linear unbiased prediction procedure is analyzed within a Bayesian framework to predict Gaussian random fields in a way that appropriately accounts for uncertainty in the covariance function.
Abstract: This article is concerned with predicting for Gaussian random fields in a way that appropriately deals with uncertainty in the covariance function. To this end, we analyze the best linear unbiased prediction procedure within a Bayesian framework. Particular attention is paid to the treatment of parameters in the covariance structure and their effect on the quality, both real and perceived, of the prediction. These ideas are implemented using topographical data from Davis.

609 citations
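For context, this is what the plug-in version of the prediction looks like: a minimal simple-kriging sketch with a fixed exponential covariance and an assumed known mean. The article's point is precisely that treating the covariance parameters as known, as this sketch does, can misstate prediction uncertainty; the covariance model, its parameters, and the data below are illustrative.

```python
import numpy as np

def exp_cov(d, sill=1.0, range_=2.0):
    """Isotropic exponential covariance; the parameters are illustrative."""
    return sill * np.exp(-d / range_)

rng = np.random.default_rng(5)
sites = rng.uniform(0, 10, size=(30, 2))              # observation locations
z = np.sin(sites[:, 0]) + 0.1 * rng.normal(size=30)   # observed field values
z = z - z.mean()     # simple kriging assumes a known mean, taken here as the sample mean
target = np.array([[5.0, 5.0]])                       # prediction location

D = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=-1)
d0 = np.linalg.norm(sites - target, axis=-1)
C = exp_cov(D) + 1e-8 * np.eye(len(sites))            # jitter for numerical stability
c0 = exp_cov(d0)

weights = np.linalg.solve(C, c0)                      # simple-kriging weights
pred, var = weights @ z, exp_cov(0.0) - c0 @ weights  # plug-in prediction and variance
print(f"prediction {pred:.3f}, plug-in kriging variance {var:.3f}")
```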


Journal ArticleDOI
TL;DR: Handbook for Digital Signal Processing is the definitive source of detailed information on all important topics in modern digital signal processing and fills the needs of practicing engineers and designers of hardware, systems, and software.
Abstract: From the Publisher: Digital signal processing (DSP) revolutionized the electronics industry. Its flexibility, cost effectiveness, programmability, precision and broad range of applications - including applications in telecommunications, consumer electronics, radar, sonar, and more - have made it the methodology of choice over analog signal processing. Over the past two decades, advances in DSP technology have been so rapid and so massive that, until now, no single volume has offered comprehensive theoretical coverage of this fascinating field along with practical DSP applications. Handbook for Digital Signal Processing is the definitive source of detailed information on all important topics in modern digital signal processing. The only up-to-date handbook of its kind, it fills the needs of practicing engineers and designers of hardware, systems, and software. Written and edited by internationally known authorities on DSP, Handbook for Digital Signal Processing is supplemented with hundreds of informative tables and illustrations. For professional engineers, designers, and researchers in electronics and telecommunications, this book will be an indispensable reference, now and for years to come.

410 citations


ReportDOI
TL;DR: This article is concerned with the problem of predicting a deterministic response function y0 over a multidimensional domain T, given values of y0 and all of its first derivatives at a set of design sites (points) in T.
Abstract: This article is concerned with the problem of predicting a deterministic response function y0 over a multidimensional domain T, given values of y0 and all of its first derivatives at a set of design sites (points) in T. The intended application is to computer experiments in which y0 is an output from a computer model of a physical system and each point in T represents a particular configuration of the input parameters. It is assumed that the first derivatives are already available (e.g., from a sensitivity analysis) or can be produced by the code that implements the model. A Bayesian approach in which the random function that represents prior uncertainty about y0 is taken to be a stationary Gaussian stochastic process is used. The calculations needed to update the prior given observations of y0 and its first derivatives at the design sites are given and are illustrated in a small example. The issue of experimental design is also discussed, in particular the criterion of maximizing the reduction in entropy...
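A minimal one-dimensional sketch of the kind of calculation involved: Gaussian-process prediction that conditions on both function values and first derivatives at the design sites, using a squared-exponential correlation function. The kernel, length scale, and test function are illustrative choices, not those of the article, and the Bayesian treatment of the process parameters is omitted.

```python
import numpy as np

ell = 0.8  # length scale of the squared-exponential correlation (assumed)

def k(x, xp):         # cov(y(x), y(x'))
    return np.exp(-(x - xp) ** 2 / (2 * ell ** 2))

def k_dxp(x, xp):     # cov(y(x), y'(x')) = dk/dx'
    return k(x, xp) * (x - xp) / ell ** 2

def k_dx_dxp(x, xp):  # cov(y'(x), y'(x')) = d^2 k / (dx dx')
    return k(x, xp) * (1.0 / ell ** 2 - (x - xp) ** 2 / ell ** 4)

# Design sites with the response and its derivative (e.g. from a sensitivity analysis).
xd = np.array([0.0, 1.0, 2.0, 3.0])
f, df = np.sin(xd), np.cos(xd)        # stand-in "computer model" output and gradient

X, Xp = np.meshgrid(xd, xd, indexing="ij")
Kvv, Kvd, Kdd = k(X, Xp), k_dxp(X, Xp), k_dx_dxp(X, Xp)
K = np.block([[Kvv, Kvd], [Kvd.T, Kdd]]) + 1e-10 * np.eye(2 * xd.size)
obs = np.concatenate([f, df])

x_star = 1.5
k_star = np.concatenate([k(x_star, xd), k_dxp(x_star, xd)])  # cov(y(x*), observations)
pred = k_star @ np.linalg.solve(K, obs)                      # zero-mean GP posterior mean
print(f"predicted y({x_star}) = {pred:.4f}, true value = {np.sin(x_star):.4f}")
```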


Journal ArticleDOI
TL;DR: In this paper, a new class of supersaturated designs is constructed using half fractions of Hadamard matrices, which can investigate up to N − 2 factors in N/2 runs.
Abstract: Supersaturated designs are useful in situations in which the number of active factors is very small compared to the total number of factors being considered. In this article, a new class of supersaturated designs is constructed using half fractions of Hadamard matrices. When a Hadamard matrix of order N is used, such a design can investigate up to N – 2 factors in N/2 runs. Results are given for N ≤ 60. Extension to larger N is straightforward. These designs are superior to other existing supersaturated designs and are easy to construct. An example with real data is used to illustrate the ideas.
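A minimal sketch of the half-fraction construction, using the order-16 Sylvester Hadamard matrix available in SciPy for convenience (the article works with general Hadamard orders up to 60): keep the rows on which a chosen branching column equals +1, then drop the all-ones column and the branching column, leaving N - 2 = 14 balanced two-level factors in N/2 = 8 runs.

```python
import numpy as np
from scipy.linalg import hadamard

N = 16
H = hadamard(N)                     # entries +/-1, first column all ones
branch = H[:, -1]                   # branching column (an illustrative choice)
half = H[branch == 1]               # the N/2 runs where the branching column is +1
design = np.delete(half, [0, N - 1], axis=1)   # drop the all-ones and branching columns

print(design.shape)                             # (8, 14): 14 factors in 8 runs
print(np.abs(design.sum(axis=0)).max())         # 0: every factor column is balanced
```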

Journal ArticleDOI
TL;DR: A Technometrics review of the book Randomization and Monte Carlo Methods in Biology.
Abstract: (1993). Randomization and Monte Carlo Methods in Biology. Technometrics: Vol. 35, No. 1, pp. 96-97.

Journal ArticleDOI
TL;DR: A Technometrics review of the book Statistical Quality Design and Control.
Abstract: (1993). Statistical Quality Design and Control. Technometrics: Vol. 35, No. 3, pp. 331-332.

Journal ArticleDOI
TL;DR: In this article, Statistical Methods for Survival Data Analysis (SVMDA) is used to analyze survival data in the context of statistical methods for survival data analysis (SDFA).
Abstract: (1993). Statistical Methods for Survival Data Analysis. Technometrics: Vol. 35, No. 1, pp. 101-101.

Journal ArticleDOI
TL;DR: The book reviews section generally accepts for review only those books whose content and level reflect the general editorial policy of Technometrics; publishers are invited to send books for review to Eric R. Ziegel, Amoco Research Center, Mail Station F-1/C, P.O. Box 3011, Naperville, Illinois 60566-7011.
Abstract: The book reviews section generally accepts for review only those books whose content and level reflect the general editorial policy of Technometrics. Publishers are invited to send books for review to Eric R. Ziegel, Amoco Research Center, Mail Station F-1/C, P.O. Box 3011, Naperville, Illinois 60566-7011. Please include the price of the book. The opinions expressed in this section are those of the reviewers and do not necessarily reflect those of the editorial staff or the sponsoring societies. Listed prices were provided by the publisher when the books were received by Technometrics and may not be current.


Journal ArticleDOI
TL;DR: The generalized Pareto distribution (GPD) is a two-parameter family of distributions for modeling exceedances over a threshold; maximum likelihood estimators of the parameters are preferred because they are asymptotically normal and asymptotically efficient in many cases, and this article reduces the required two-dimensional numerical search to a one-dimensional one.
Abstract: The generalized Pareto distribution (GPD) is a two-parameter family of distributions that can be used to model exceedances over a threshold. Maximum likelihood estimators of the parameters are preferred, since they are asymptotically normal and asymptotically efficient in many cases. Numerical methods are required for maximizing the log-likelihood, however. This article investigates the properties of a reduction of the two-dimensional numerical search for the zeros of the log-likelihood gradient vector to a one-dimensional numerical search. An algorithm for computing the GPD maximum likelihood estimates based on this dimension reduction is given, along with its properties.
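A minimal sketch of this kind of dimension reduction in the common (shape, scale) parameterization: for a fixed value of theta = shape/scale the shape estimate has a closed form, so only theta requires a numerical search. The bounds, data, and optimizer below are illustrative, and this is a simplified stand-in for the article's algorithm rather than a reproduction of it.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import genpareto

x = genpareto.rvs(c=0.2, scale=2.0, size=500, random_state=0)   # simulated exceedances
n = len(x)

def neg_profile_loglik(theta):
    xi = np.mean(np.log1p(theta * x))      # closed-form shape MLE for fixed theta
    sigma = xi / theta                     # implied scale
    return -(-n * np.log(sigma) - n * xi - n)

eps = 1e-6
res = minimize_scalar(neg_profile_loglik, method="bounded",
                      bounds=(-1.0 / x.max() + eps, 5.0))       # one-dimensional search
theta_hat = res.x
xi_hat = np.mean(np.log1p(theta_hat * x))
sigma_hat = xi_hat / theta_hat
print(f"xi_hat = {xi_hat:.3f}, sigma_hat = {sigma_hat:.3f}")
```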

Journal ArticleDOI
TL;DR: The authors present a set of tools for laying out the generic technical issues and experimental features found in industrial experiments; the tools also help experimenters discuss complex trade-offs between practical limitations and statistical preferences in the experiment.
Abstract: Design of experiments and analysis of data from designed experiments are well-established methodologies in which statisticians are formally trained. Another critical and rarely taught skill is the planning that precedes designing an experiment. This article suggests a set of tools for presenting generic technical issues and experimental features found in industrial experiments. These tools are predesign experiment guide sheets to systematize the planning process and to produce organized written documentation. They also help experimenters discuss complex trade-offs between practical limitations and statistical preferences in the experiment. A case study involving the CNC (computer numerical control) machining of jet engine impellers is included.

BookDOI
TL;DR: Coverage spans theory, petroleum applications, and other applications in environment, hydrology, soil sciences, natural resources, human sciences, and mining.
Abstract: Theory. Petroleum. Other applications - environment, hydrology, soil sciences, natural resources, human sciences, mining.

Journal ArticleDOI
TL;DR: A textbook treatment of statistical process control covering basic distributions, process variation, Shewhart control charts and their extensions to one-at-a-time data, EWMA and moving average charts, cumulative sum techniques, the design of control charts for specification limits, control of discrete data processes, and sampling inspection by attributes and by variables.
Abstract: Part 1 Statistical process control: development of SPC, what SPC is and is not, online SPC methods, off-line process control, SPC methodology, other factors affecting the success of SPC. Part 2 Some basic distributions: attribute data, countable data, geometric distribution, the Normal distribution, distributions derived from the Normal distribution, application of results, testing for normality, the Normal approximation to the binomial distribution, Normal approximation to the Poisson distribution. Part 3 Process variation: reasons for process variation, types of process variation, some models for process variation, sampling error and measurement error. Part 5 Basic Shewhart control charts for continuous variables: control charts for average level, charts for control of (within group) process spread, the average run length (ARL), special problems, some theoretical results for Shewhart charts, charts for control of process spread. Part 6 Extensions to Shewhart charts for one-at-a-time data: one-at-a-time sampling, estimation of sigma for one-at-a-time data, details of further control charts for control of process average level, control of process spread, choice of charting method, practical use of Shewhart and moving average charts, properties of EWMA and MA charts. Part 7 Cumulative sum techniques for continuous variables: CuSum charts - for control of average level, for control of process spread, nomogram for CuSums. Part 8 Further theoretical results on control charts for continuous variables: the effect of departures from assumption on moments of x and s², Shewhart charts - Markov chain approach, Cumulative Sum charts, charts for control of process spread. Part 9 The design of control charts for specification limits: single specification limits - chart for means, double specification limits - high capability processes, an alternative approach. Part 10 Control of discrete data processes: Shewhart charts - for countable data (c and u), for attribute data (np and p), CuSum charts for countable data, for attribute data, comparison of Shewhart and CuSum schemes. Part 11 Sampling inspection: classification of inspection plans, some properties of sampling plans, methods of using sampling plans for attributes. Part 12 Inspection by variables: single specification limit - sigma known, sigma unknown, estimation of fraction non-conforming, single specification limit, double specification limit - sigma known, sigma unknown, multivariate sampling plans. Part 13 Standard sampling systems: statement of method for inspection by attributes, inspection by variables, International Standards for process and quality control. Part 14 Adaptive sampling plans: CSP-1 and the AOQL criterion, theory of CSP-1, the AEDL criterion. (Part contents).

Journal ArticleDOI
TL;DR: In this paper, the authors propose to replace the sequence of serially correlated observations by a sequence of independent and identically distributed observations for which the run-length characteristics of interest are roughly the same.
Abstract: This article discusses situations in which one is interested in evaluating the run-length characteristics of a cumulative sum control scheme when the underlying data show the presence of serial correlation. In practical applications, situations of this type are common in problems associated with monitoring such characteristics of data as forecasting errors, measures of model adequacy, and variance components. The discussed problem is also relevant in situations in which data transformations are used to reduce the magnitude of serial correlation. The basic idea of analysis involves replacing the sequence of serially correlated observations by a sequence of independent and identically distributed observations for which the run-length characteristics of interest are roughly the same. Applications of the proposed method for several classes of processes arising in the area of statistical process control are discussed in detail, and it is shown that it leads to approximations that can be considered acceptable in...
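To see why the issue matters, here is a minimal simulation sketch comparing the in-control average run length of a one-sided CUSUM under independent data and under AR(1) data with the same marginal variance. It simulates run lengths directly and is not the approximation method developed in the article; the chart parameters and AR coefficient are illustrative.

```python
import numpy as np
from scipy.signal import lfilter

def cusum_run_length(x, k=0.5, h=5.0):
    """Number of observations until a one-sided CUSUM (target mean 0) signals."""
    s = 0.0
    for t, xt in enumerate(x, start=1):
        s = max(0.0, s + xt - k)
        if s > h:
            return t
    return len(x)

def simulated_arl(phi, n_rep=500, n_max=20000, seed=7):
    """In-control ARL when observations follow an AR(1) process with unit marginal variance."""
    rng = np.random.default_rng(seed)
    lengths = []
    for _ in range(n_rep):
        e = rng.normal(scale=np.sqrt(1 - phi ** 2), size=n_max)
        x = lfilter([1.0], [1.0, -phi], e)       # AR(1): x[t] = phi*x[t-1] + e[t]
        lengths.append(cusum_run_length(x))
    return float(np.mean(lengths))

for phi in (0.0, 0.5):
    print(f"phi = {phi}: simulated in-control ARL approx. {simulated_arl(phi):.0f}")
```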

Journal ArticleDOI
TL;DR: A book on the confidence profile method for combining evidence, covering problem formulation, organizing and summarizing evidence, types of evidence and biases, influence diagrams, models for combining evidence, and an introduction to Bayes' formula, with worked example analyses and implementation issues.
Abstract: Part 1 Examples of problems - vignettes: setting up the analysis - formulating the problem, organizing the evidence, types of evidence, biases, summarizing the evidence, influence diagrams, models for combining evidence, introduction to Bayes' Formula. Part 2 Formulas of the confidence profile method: components of the confidence profile method, the general model, prior distributions, likelihood functions, bias functions, other functions, solution methods. Part 3 Solutions to the example problems: analysis of tamoxifen, analysis of HIP trial of breast cancer screening, analysis of Utrecht Project (DOM), analysis of breast cancer screening - seven controlled trials, analysis of Breast Cancer Detection Demonstration Project (BCDDP), analysis of t-PA, analysis of Anisoylated Plasminogen Streptokinase Activated Complex (APSAC), analysis of amlodipine, analysis of alcohol and breast cancer, analysis of screening for maple syrup urine disease, analysis of nitrogen dioxide and respiratory illness in children, analysis of surveillance of colorectal cancer patients, analysis of colon cancer screening trial. Part 4 Implementation issues: sensitivity analysis, research planning, data collection and reporting, test case. Conclusions: other meta-analysis methods, summary and conclusions.

Journal ArticleDOI
TL;DR: A Technometrics review of Combining Information: Statistical Issues and Opportunities for Research.
Abstract: (1993). Combining Information: Statistical Issues and Opportunities for Research. Technometrics: Vol. 35, No. 4, pp. 462-463.

Journal ArticleDOI
TL;DR: This book discusses experiments with both qualitative and quantitative factors, and the choice of a model and criteria for a good experiment, as well as the analysis of experiments.

Journal ArticleDOI
TL;DR: The theory of correspondence analysis, a numerical example, an exercise illustrating the theory, reading and interpretation of the listings, and cluster analysis by agglomerative hierarchical clustering.
Abstract: The theory of correspondence analysis; numerical example of correspondence analysis; exercise illustrating the theory; reading and interpretation of the listings; analysis and interpretation; cluster analysis - agglomerative hierarchical clustering.

Journal ArticleDOI
Eric R. Ziegel
TL;DR: An introduction to the essentially mathematical principles of survey sampling as they are applied in practice, intended for survey sampling theorists and practitioners.
Abstract: An introduction to the essentially mathematical principles of survey sampling as they are applied in practice. Intended for survey sampling theorists and practitioners, as a guide for those who may have to design and conduct a survey, and for those commissioning, organizing, and overseeing survey operations.

Journal ArticleDOI
TL;DR: A schema is presented for uniting traditional SPC and feedforward/feedback control into a system that exploits the strengths of both and discusses the theory and practice of such an approach, along with a consideration of research and technical issues that arise.
Abstract: Statistical process control (SPC) has traditionally been applied to processes in which successive observations would ideally be independent and identically distributed as a basis for achieving fundamental process improvement. Stochastic control, on the other hand, addresses situations in which observations are dynamically related over time; its intent is to run the existing process well, as opposed to improving it as such. A schema is presented for uniting traditional SPC and feedforward/feedback control into a system that exploits the strengths of both. Building on past work by MacGregor, Box, Astrom, and others, we discuss the theory and practice of such an approach, along with a consideration of research and technical issues that arise.