
Showing papers in "Technometrics in 1975"


Journal ArticleDOI
TL;DR: The Univariate Normal Linear Regression Model as discussed by the authors is a well-known model for regression analysis in economics and has been used extensively in the literature.
Abstract: Remarks on Inference in Economics. Principles of Bayesian Analysis with Selected Applications. The Univariate Normal Linear Regression Model. Special Problems in Regression Analysis. On Errors in the Variables. Analysis of Single Equation Nonlinear Models. Time Series Models: Some Selected Examples. Multivariate Regression Models. Simultaneous Equation Econometric Models. On Comparing and Testing Hypotheses. Analysis of Some Control Problems. Conclusion. Appendices. Bibliography. Indexes.

2,119 citations


Journal ArticleDOI
TL;DR: This entry is a review of the book Statistical Inference Under Order Restrictions, published in Technometrics.
Abstract: (1975). Statistical Inference Under Order Restrictions. Technometrics: Vol. 17, No. 1, pp. 139-140.

1,622 citations


Journal ArticleDOI
TL;DR: In this paper, the normal probability plot correlation coefficient is used as a test statistic for the composite hypothesis of normality; the proposed test statistic is conceptually simple, computationally convenient, and readily extendible to testing non-normal distributional hypotheses.
Abstract: This paper introduces the normal probability plot correlation coefficient as a test statistic in complete samples for the composite hypothesis of normality. The proposed test statistic is conceptually simple, is computationally convenient, and is readily extendible to testing non-normal distributional hypotheses. An empirical power study shows that the normal probability plot correlation coefficient compares favorably with 7 other normal test statistics. Percent points are tabulated for n = 3(1)50(5)100.
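The correlation statistic the abstract describes can be sketched directly: correlate the ordered sample with normal quantiles at uniform order-statistic medians. The median constants used below are the common Filliben-style approximation and should be treated as an assumption rather than a quote from the paper's tables.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = np.sort(rng.normal(loc=10.0, scale=2.0, size=50))

# Approximate uniform order-statistic medians, then map them through the
# normal quantile function to get plotting positions.
n = len(x)
m = np.zeros(n)
m[-1] = 0.5 ** (1.0 / n)
m[0] = 1.0 - m[-1]
i = np.arange(2, n)
m[1:-1] = (i - 0.3175) / (n + 0.365)
q = stats.norm.ppf(m)

# The test statistic is the plain correlation between ordered data and
# the normal plotting positions; values near 1 support normality.
r = np.corrcoef(x, q)[0, 1]
```

For non-normal data the points bend away from a straight line and r drops, which is what the tabulated percent points are used to judge.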

838 citations


Journal ArticleDOI
TL;DR: The Nelder-Mead simplex method for function minimization is a direct method requiring no derivatives as mentioned in this paper, where the objective function is evaluated at the vertices of a simplex, and movement is away from the poorest value.
Abstract: The Nelder-Mead simplex method for function minimization is a “direct” method requiring no derivatives. The objective function is evaluated at the vertices of a simplex, and movement is away from the poorest value. The process is adaptive, causing the simplexes to be continually revised to best conform to the nature of the response surface. The generality of the method is illustrated by using it to solve six problems appearing in the May 1973 issue of Technometrics.
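The method is available off the shelf in SciPy; a minimal sketch on the standard Rosenbrock test function (our choice of test problem, not one of the six Technometrics problems):

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock function: a classic test problem for derivative-free methods,
# with its minimum at (1, 1).
def rosenbrock(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

# Nelder-Mead uses only function values at the simplex vertices;
# no gradients are ever computed.
result = minimize(rosenbrock, x0=np.array([-1.2, 1.0]), method="Nelder-Mead",
                  options={"xatol": 1e-8, "fatol": 1e-8})
```

The simplex adapts its shape to the curved valley of this surface, which is exactly the adaptivity the abstract describes.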

387 citations


Journal ArticleDOI
TL;DR: The authors state the model and the design problem, review the major results of Kiefer and Wolfowitz on the theory of design, describe how the criterion has been extended to non-linear models, and discuss algorithms for constructing D-optimum designs.
Abstract: After stating the model and the design problem, we briefly present the results for regression design prior to the work of Kiefer and Wolfowitz. We then review the major results of Kiefer and Wolfowitz, particularly those on the theory of design, as well as the way the criterion has been extended to non-linear models. Finally, we discuss algorithms for constructing D-optimum designs.
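As a toy illustration of the D-criterion only (not the Kiefer-Wolfowitz equivalence theory), one can enumerate exact three-point designs for a one-factor quadratic model; the candidate grid and model are our own assumptions:

```python
import numpy as np
from itertools import combinations

# Candidate design points on [-1, 1] for the model E[y] = b0 + b1*x + b2*x^2.
candidates = np.linspace(-1, 1, 21)

def information_det(points):
    """D-criterion: determinant of X'X for the quadratic model matrix."""
    X = np.column_stack([np.ones_like(points), points, points ** 2])
    return np.linalg.det(X.T @ X)

# The grid is small enough to search exhaustively over 3-point designs;
# real D-optimum algorithms (Wynn, exchange) avoid this enumeration.
best = max(combinations(candidates, 3),
           key=lambda d: information_det(np.array(d)))
```

For this model the enumeration recovers the textbook answer of placing points at the two endpoints and the center of the interval.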

349 citations



Journal ArticleDOI
TL;DR: In this article, the authors propose several many outlier procedures to detect more than one outlier in a sample and compare several different procedures across various sample sizes.
Abstract: This article is concerned with “many outlier” procedures, i.e., procedures that can detect more than one outlier in a sample. Several many outlier procedures are proposed in Section 2 and via power comparisons in Section 3 are found to be much superior to one outlier procedures in detecting many outliers. We then compare several different many outlier procedures in Section 4 and find that the procedure based on the extreme studentized deviate (ESD) is slightly the best. Finally, 5%, 1% and .5% points are given for the ESD procedure for various sample sizes in Section 5.
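A minimal sketch of the ESD idea, assuming the usual recompute-after-removal scheme; the critical points tabulated in Section 5 are not reproduced here, so the sample data and the number of suspected outliers are purely illustrative.

```python
import numpy as np

def esd_statistics(x, k):
    """Sequence of extreme studentized deviates R_1..R_k: at each step,
    find the point farthest from the mean, record its studentized
    deviation, remove it, and repeat on the reduced sample."""
    x = np.asarray(x, dtype=float)
    r_seq = []
    for _ in range(k):
        dev = np.abs(x - x.mean())
        idx = int(np.argmax(dev))
        r_seq.append(dev[idx] / x.std(ddof=1))
        x = np.delete(x, idx)
    return r_seq

# Two planted outliers (9.5, 8.7) among otherwise tight values.
sample = np.array([2.1, 2.4, 1.9, 2.2, 2.0, 2.3, 1.8, 2.1, 9.5, 8.7])
r_values = esd_statistics(sample, k=3)
```

Each R_i would then be compared against the tabulated percentage points; recomputing the mean and standard deviation after each removal is what lets the procedure find a second outlier masked by the first.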

257 citations


Journal ArticleDOI
TL;DR: In this article, a test for the largest residual being an outlier is implemented through table development, and tables of critical values for tests at levels α = 0.10, 0.05, and 0.01 are included.
Abstract: Residuals from fit are often examined in regression analysis. A test suggested by Ellenberg [5] and Prescott [7] for the largest residual being an outlier is implemented through table development. Tables of critical values for tests at levels α = 0.10, 0.05, and 0.01 are included.

234 citations


Journal ArticleDOI
TL;DR: In this article, the general form of ridge regression proposed by Hoerl and Kennard is examined in the context of the iterative procedure they suggest for obtaining optimal estimators.
Abstract: The general form of ridge regression proposed by Hoerl and Kennard is examined in the context of the iterative procedure they suggest for obtaining optimal estimators. It is shown that a non-iterative, closed form solution is available for this procedure. The solution is found to depend upon certain convergence/divergence conditions which relate to the ordinary least squares estimators. Numerical examples are given.
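For context, the basic single-k ridge estimator that the Hoerl-Kennard iteration starts from can be sketched as follows; the data and variable names are illustrative, and the closed-form solution of the iteration itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 50, 4
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ beta_true + rng.normal(scale=0.1, size=n)

def ridge(X, y, k):
    """Ordinary ridge estimator (X'X + kI)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

beta_ols = ridge(X, y, 0.0)    # k = 0 recovers ordinary least squares
beta_ridge = ridge(X, y, 1.0)  # k > 0 shrinks the coefficients toward zero
```

The iterative procedure re-estimates k from the current coefficients and repeats; the article's point is that the limit of that iteration has a closed form tied to the least squares estimates.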

128 citations


Journal ArticleDOI
TL;DR: In this article, a production model for the current control of the number of defectives by an np-chart is proposed, and an algorithm for determining the most economic control parameters is presented.
Abstract: This paper proposes a production model for the current control of the number of defectives by an np-chart. An algorithm for determining the most economic control parameters is presented. An exceedingly simple procedure is suggested for designing control plans which approximate in cost the most economic control plans. This procedure is applicable at the workshop level.

120 citations


Journal ArticleDOI
TL;DR: In this paper, a cumulative sum technique which allows distribution-free tests is developed and applied to the problem of detecting a change in the median of a rainfall distribution.
Abstract: A cumulative sum technique which allows distribution-free tests is developed and applied to the problem of detecting a change in median of a rainfall distribution.

Journal ArticleDOI
TL;DR: In this article, the authors provide tables of percentage points of the asymptotic distribution of the one sample truncated Kolmogorov-Smirnov statistic for goodness of fit problems involving truncated or censored data, and indicate that the tables provide accurate critical values for sample sizes greater than 30.
Abstract: In this article we provide tables of percentage points of the asymptotic distribution of the one sample truncated Kolmogorov-Smirnov statistic. We discuss use of the tables in goodness of fit problems involving truncated or censored data and indicate that the tables provide accurate critical values for sample sizes greater than 30. We also discuss use of the tables in situations involving censored data and in two sample testing problems.
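The truncated statistic's critical values require the article's tables, but the untruncated one-sample statistic, shown here for contrast, is standard; the data and hypothesized distribution below are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(size=100)

# One-sample Kolmogorov-Smirnov test of the fully specified hypothesis
# that the data come from N(0, 1): the statistic is the largest gap
# between the empirical and hypothesized distribution functions.
ks = stats.kstest(x, "norm")
```

With truncation or censoring, the supremum is taken only over the observable range, which changes the null distribution and is why separate percentage-point tables are needed.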

Journal ArticleDOI
TL;DR: In this article, a test statistic for detecting outliers in linear models involving residuals standardized by their individual standard deviations is considered; it is suggested that its critical values are adequately approximated by upper bounds for the critical values of a similar test statistic involving residuals standardized by a constant standard deviation.
Abstract: A test statistic for detecting outliers in linear models involving residuals standardized by their individual standard deviations is considered and it is suggested that its critical values are adequately approximated by upper bounds for the critical values of a similar test statistic involving residuals standardized by a constant standard deviation. The test procedure is applicable to any linear model and does not require a re-analysis with the suspected outlier omitted or treated as a missing value. Two regression analyses are given to illustrate the procedure.
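The standardization the abstract refers to, residuals divided by their individual standard deviations, can be sketched as internally studentized residuals; the planted outlier, the data, and the 2.0 cutoff below are our own illustration, not the article's critical values.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 30
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)
y[5] += 5.0  # plant one gross outlier at observation 5

# Hat matrix H = X (X'X)^{-1} X'; each residual's own variance is
# s^2 (1 - h_ii), so the studentized residual is e_i / (s sqrt(1 - h_ii)).
H = X @ np.linalg.solve(X.T @ X, X.T)
e = y - H @ y
s2 = e @ e / (n - X.shape[1])
h = np.diag(H)
r = e / np.sqrt(s2 * (1 - h))

suspect = int(np.argmax(np.abs(r)))
```

No refit with the suspect omitted is needed, which matches the article's point that the procedure works on a single analysis of the full data.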

Journal ArticleDOI
TL;DR: In this article, the similarity between the minimum mean square error estimator and Hoer1 and Kennard's ridge regression was explored, and it was shown that the similarity can be further improved by using Hoer 1 and KG regression.
Abstract: In this paper we explore the similarity between the minimum mean square error estimator and Hoer1 and Kennard's ridge regression.

Journal ArticleDOI
TL;DR: In this paper, an algorithm for computing statistics for all possible subsets of variables for a discriminant analysis is proposed and a comparison with a stepwise procedure is also presented through two examples.
Abstract: An algorithm is proposed for computing statistics for all possible subsets of variables for a discriminant analysis. Optimal subsets of any given size can then be determined. A comparison with a stepwise procedure is also presented through two examples.

Journal ArticleDOI
Glen H. Lemon1
TL;DR: In this article, the authors develop maximum likelihood estimators for the three parameter Weibull distribution based on various left- and right-censored data situations, and the asymptotic variance-covariance matrix of the estimators is given.
Abstract: This study develops maximum likelihood estimators for the three parameter Weibull distribution based on various left- and right-censored data situations. For the case of single censoring from the left and progressive censoring from the right, the developed estimation procedure involves the simultaneous solution of two iterative equations compared to the arduous task of solving three simultaneous iterative equations as outlined by Harter and Moore [6]. For the two parameter Weibull [4], the above case results in an estimation procedure involving only one iterative equation. The asymptotic variance-covariance matrix of the estimators is given. Pivotal functions are provided whose distributions can be used for confidence interval estimation and for tests of hypotheses. An example is included which illustrates the estimation procedure involving left censored samples.

Journal ArticleDOI
TL;DR: In this paper, replacing the expected values of order statistics in the Shapiro-Francia (1972) W′ statistic with a simple approximation yields a statistic that is more suitable for machine computation; the authors show that it is equivalent to W′ and discuss its percentage points.
Abstract: Replacement by a simple approximation of the expected values of order statistics in the Shapiro-Francia (1972) W′ statistic yields a statistic that is more suitable for machine computation. We show that it is equivalent to W′ and discuss its percentage points.

Journal ArticleDOI
TL;DR: In this paper, a simple application using traffic accident data is presented, where Bayesian tests as well as a test based on the maximum likelihood estimate of γ are considered and their powers are compared by Monte Carlo methods.
Abstract: We consider tests based on one observation on each of N ≥ 2 random variables X1, …, XN to decide if the means μi of the Xi's are all equal against the one-sided alternative that a shift has occurred at some unknown point γ (i.e. μ1 = μ2 = … = μγ < μγ+1 = … = μN). The Xi's are considered to be normally distributed with a common unknown variance. Bayesian tests as well as a test based on the maximum likelihood estimate of γ are considered and their powers are compared by Monte Carlo methods. The exact distribution of a Bayesian test statistic is derived. A simple application using traffic accident data is presented.
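The maximum likelihood estimate of the shift point can be sketched by minimizing the pooled within-segment sum of squares over all split points, a standard reduction under normality with common variance; the simulated data below stand in for the traffic accident example, which is not reproduced here.

```python
import numpy as np

def ml_changepoint(x):
    """ML estimate of a single mean-shift point under normal errors with
    common variance: pick the split minimizing the pooled within-segment
    sum of squared deviations."""
    x = np.asarray(x, dtype=float)
    best_r, best_sse = None, np.inf
    for r in range(1, len(x)):
        left, right = x[:r], x[r:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_r, best_sse = r, sse
    return best_r

rng = np.random.default_rng(5)
# Mean shifts from 0 to 3 after observation 30.
x = np.concatenate([rng.normal(0.0, 1.0, 30), rng.normal(3.0, 1.0, 30)])
r_hat = ml_changepoint(x)
```

Minimizing the within-segment sum of squares is equivalent to maximizing the normal likelihood over the split point, since the variance is common to both segments.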

Journal ArticleDOI
TL;DR: In this paper, the development of experimental designs for constrained mixture systems in which the response can be described by a quadratic model is discussed, and it is shown that efficient designs can be constructed from a subset of the vertices and centroids of the feasible region.
Abstract: The development of experimental designs for constrained mixture systems in which the response can be described by a quadratic model is discussed. It is shown that efficient designs can be constructed from a subset of the vertices and centroids of the feasible region. For systems with five or more components, it is recommended that the Wynn and exchange algorithms be used to select a design from the available vertices and centroids. Illustrative examples are included.

Journal ArticleDOI
Marion R. Reynolds1
TL;DR: In this paper, an approximation to the average run length for cumulative sum control charts is derived using the analogy between this procedure and the sequential probability ratio test for normal observations, which does not require the normality assumption.
Abstract: An approximation to the average run length for cumulative sum control charts is derived using the analogy between this procedure and the sequential probability ratio test for normal observations. This approximation is also derived by using a Brownian motion approximation to the cumulative sum. The Brownian motion approximation does not require the normality assumption. The analytical expression for the average run length obtained from the approximation is then used to determine the optimal choice of parameters to minimize the average run length at a specified deviation from control, subject to a fixed average run length when in control.
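For comparison with such analytical approximations, the average run length of a one-sided tabular CUSUM can also be estimated by simulation; the reference value k and decision limit h below are conventional illustrative choices, not values from the article.

```python
import numpy as np

def run_length(rng, k, h, mean=0.0):
    """Run length of a one-sided CUSUM: accumulate (x - k), truncate at
    zero, and signal when the sum first exceeds the limit h."""
    s, t = 0.0, 0
    while True:
        t += 1
        s = max(0.0, s + rng.normal(loc=mean) - k)
        if s > h:
            return t

rng = np.random.default_rng(3)
# In-control ARL (process mean 0) vs out-of-control ARL (mean shifted by 1).
arl_in = np.mean([run_length(rng, k=0.5, h=4.0) for _ in range(200)])
arl_out = np.mean([run_length(rng, k=0.5, h=4.0, mean=1.0) for _ in range(200)])
```

The design problem the abstract describes is exactly the trade-off visible here: choose (k, h) to make arl_out small while holding arl_in at a specified large value.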


Journal ArticleDOI
TL;DR: In this article, a method of finding confidence bounds on the percentiles of the Weibull or extreme value distributions is presented, which arise from considering the distribution of parameter estimates given the observed value of an ancillary statistic, and are applicable whether the data are complete or Type II censored.
Abstract: A method of finding confidence bounds on the percentiles of the two-parameter Weibull or extreme-value distributions is presented. The procedures, which arise from considering the distribution of parameter estimates given the observed value of an ancillary statistic, do not require the construction of any tables, and are applicable whether the data are complete or Type II censored. The procedures also allow an investigation to be made into the adequacy of approximate confidence bounds based on an F approximation presented in Mann, Schafer and Singpurwalla [13], and those based on large sample theory for the maximum likelihood estimates.

Journal ArticleDOI
TL;DR: In this article, the normal distribution theory likelihood ratio statistic for the corresponding composite hypothesis is derived, a small sample F-test is shown to be conservative, and the asymptotic distribution of the likelihood ratio provides a large sample test of the MSE optimality of any restricted ridge family of solutions.
Abstract: Ridge analysis is of interest when some generalized ridge regression coefficient vectors are significantly more likely to have Mean Squared Error (MSE) optimality properties than is any uniformly shrunken version of the ordinary least squares estimator. The normal distribution theory likelihood ratio statistic for the corresponding composite hypothesis is derived, and a small sample F-test is shown to be conservative. The asymptotic distribution of the likelihood ratio provides a large sample test of the MSE optimality of any restricted ridge family of solutions. The likelihood approach for solution selection within a given family is then compared and contrasted with some suggestions of Mallows [8] and Allen [1] and with a new, non-stochastic criterion, SSCBC.

Journal ArticleDOI
TL;DR: In this paper, maximum likelihood estimators and estimators which utilize the first order statistic are derived for the three parameter Weibull distribution for the special case in which the shape parameter is known, a special case which includes the two parameter exponential distribution.
Abstract: In life and fatigue testing, multi-censored samples arise when at various stages of a test, some of the survivors are withdrawn from further observation. Sample specimens which remain after each stage of censoring continue to be observed until failure or until a subsequent stage of censoring. In this paper, maximum likelihood estimators and estimators which utilize the first order statistic are derived for the three parameter Weibull distribution. Estimators are also derived for the special case in which the shape parameter is known, a special case which includes the two parameter exponential distribution. An illustrative example is included.


Journal ArticleDOI
TL;DR: In this paper, two modified versions of the half-normal plot are presented. The modified versions incorporate corrections of a major flaw in the original version relating to critical values and a minor flaw relating to plotting positions.
Abstract: The original version of the half-normal plot [C. Daniel, Technometrics, 1 (1959), 311–341] is reviewed and two modified versions are introduced. The modified versions incorporate corrections of a major flaw in the original version relating to critical values and a minor flaw relating to plotting positions. The critical values given in this article control the probability error rate, the probability of at least one false positive in the analysis of an experiment, and are considerably different from those given by Daniel. One of the modified versions is more powerful than Daniel's original procedure. Another improvement made in the modified versions is that only the smallest 70% of the contrasts not declared significant are used to construct the final estimate of the error variance. This results in an appreciable reduction in the mean square error of this estimate. Examples are presented to illustrate the use of the various half-normal plot versions.
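The half-normal plotting positions, and the idea of estimating the error scale from only the smallest 70% of the contrasts, can be sketched as follows; the planted effect and the 3-sigma flagging cutoff are our own placeholders, not Daniel's or the modified critical values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
contrasts = rng.normal(0.0, 1.0, 15)
contrasts[0] = 6.0  # plant one large real effect among 15 contrasts

# Half-normal plotting positions: quantiles of |Z| at p_i = (i - 0.5)/n,
# obtained via Phi^{-1}((1 + p)/2).
abs_c = np.sort(np.abs(contrasts))
n = len(abs_c)
p = (np.arange(1, n + 1) - 0.5) / n
q = stats.norm.ppf((1.0 + p) / 2.0)

# Estimate the error scale from only the smallest 70% of the contrasts
# (regression through the origin), so real effects do not inflate it.
m = int(0.7 * n)
sigma_hat = abs_c[:m] @ q[:m] / (q[:m] @ q[:m])

# Null contrasts fall near the line through the origin with slope
# sigma_hat; points far above it at the upper end are candidate effects.
flagged = abs_c > 3.0 * sigma_hat  # illustrative cutoff only
```

Trimming to the smallest 70% is what gives the reduction in mean square error of the variance estimate mentioned in the abstract, since large real contrasts never enter the scale estimate.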

Journal ArticleDOI
TL;DR: Monte Carlo studies of a modification of the original version of the half-normal plot and four new versions are reported, indicating that the proportion of real contrasts declared significant is larger for one of the new versions than for the original version.
Abstract: Monte Carlo studies of a modification of the original version of the half-normal plot (Daniel, Technometrics, 1 (1959), 311–341) and four new versions are reported. Data representative of the 15 contrasts from a 2^(p–q), p – q = 4, factorial experiment are generated. Design parameters in the Main Simulation Study are the probability error rate, the number of real contrasts, and the size of the real contrasts. The critical values used by the various versions control the probability error rate. These critical values are considerably different from those given by Daniel. The Monte Carlo studies indicate that the detection rate, i.e., the proportion of real contrasts declared significant, is larger for one of the new versions than for the original version. The detection rate of all versions decreases drastically when the number of real contrasts present increases from one to two to four. Nomination procedures for analyzing single replication 24 factorial experiments have a smaller detection rate than the h...

Journal ArticleDOI
TL;DR: In this paper, a response surface criterion for design selection is developed for the case of estimation of a set of parametric functions rather than a single response; a specific application is emphasized, that in which interest centers on the partial derivatives of the response function.
Abstract: A response surface criterion for design selection is developed in this paper for the case of estimation of a set of parametric functions rather than a single response. A specific application is emphasized, that in which interest centers on the partial derivatives of the response function. Both bias and variance of the parametric functions are taken into consideration. For the case of the slope of the response function two situations are given special emphasis: the fitting of a first order model in the presence of a true second order structure, and the fitting of a second order function in the presence of a third order alternative. The traditional response surface bias and variance criteria arise as a special case for the situation in which a single parametric function is merely the ordinary response surface.

Journal ArticleDOI
TL;DR: In this paper, simple, closed form approximations for maximum likelihood estimates of the parameters of the Weibull or extreme value distribution are discussed, and the exact computation of constants required to calculate the estimates is presented, and simpler approximate methods are also provided.
Abstract: Simple, closed form approximations for maximum likelihood estimates of the parameters of the Weibull or extreme-value distribution are discussed. A method for the exact computation of constants required to calculate the estimates is presented, and simpler approximate methods are also provided. Some inference procedures for the parameters are also discussed.
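Where closed-form approximations are not needed, the Weibull likelihood equations can be solved numerically; SciPy's fitter is shown here as a baseline for the two-parameter model, with the location fixed at zero (the simulated data and true parameters are our own illustration).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Simulate a complete sample from a Weibull with shape 2 and scale 3.
data = stats.weibull_min.rvs(c=2.0, scale=3.0, size=500, random_state=rng)

# Numerical maximum likelihood fit; floc=0 pins the location parameter,
# reducing this to the two-parameter Weibull model.
shape, loc, scale = stats.weibull_min.fit(data, floc=0)
```

The appeal of the article's closed-form approximations is precisely avoiding this kind of iterative numerical solution while staying close to the maximum likelihood estimates.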

Journal ArticleDOI
TL;DR: In this paper, the statistical properties of residuals from additivity in two-way tables are investigated; the results are based mainly on empirical sampling and involve average values, correlation properties, and the use of the W-statistic (Shapiro and Wilk [24]) and of probability plotting methods.
Abstract: This paper deals with some of the statistical properties of, and methods of analysis for, conventional residuals from additivity in two-way tables. Attention is given to three cases: (i) normal fluctuations superimposed on an additive model; (ii) one outlier added to the condition described in (i); and (iii) two outliers superimposed on the condition in (i). The results are based mainly on empirical sampling and involve average values, correlation properties, the use of the W-statistic (Shapiro and Wilk [24]) and of probability plotting methods. Generally speaking, in the null case of no outliers, the residuals do behave much like a normal sample. When one outlier is present, the direct statistical treatment of residuals provides a complete basis for data-analytic judgments, especially through judicious use of probability plots. When two outliers are present, however, the resulting residuals will often not have any noticeable statistical peculiarities.