
Showing papers in "Applied Statistics in 1980"


Journal ArticleDOI
TL;DR: The technique set out in the paper, CHAID, is an offshoot of AID (Automatic Interaction Detection) designed for a categorized dependent variable with built-in significance testing, multi-way splits, and a new type of predictor which is especially useful in handling missing information.
Abstract: SUMMARY The technique set out in the paper, CHAID, is an offshoot of AID (Automatic Interaction Detection) designed for a categorized dependent variable. Some important modifications which are relevant to standard AID include: built-in significance testing with the consequence of using the most significant predictor (rather than the most explanatory), multi-way splits (in contrast to binary) and a new type of predictor which is especially useful in handling missing information.
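The core CHAID step, splitting on the most significant predictor rather than the most explanatory one, can be sketched as follows. This is a minimal illustration, not the published algorithm: category merging, multi-way splitting and significance adjustments are omitted, and missing values are simply retained as their own level, in the spirit of the new predictor type mentioned above.

```python
import numpy as np
from scipy.stats import chi2_contingency

def most_significant_predictor(y, predictors):
    """Pick the predictor whose cross-tabulation with the categorical
    response y gives the smallest chi-squared p-value. `predictors` maps
    a name to a list of category labels; None entries are kept as a
    'missing' level rather than dropped."""
    best = None
    for name, x in predictors.items():
        x = ["missing" if v is None else v for v in x]
        cats_x, cats_y = sorted(set(x)), sorted(set(y))
        table = np.zeros((len(cats_x), len(cats_y)))
        for xi, yi in zip(x, y):
            table[cats_x.index(xi), cats_y.index(yi)] += 1
        _, p, _, _ = chi2_contingency(table)
        if best is None or p < best[1]:
            best = (name, p)
    return best
```

Choosing the smallest p-value (instead of the largest explained sum of squares, as in AID) is what lets the procedure use multi-way tables and built-in significance testing.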

2,744 citations


Journal ArticleDOI
TL;DR: In this paper, the numerical inversion of the characteristic function is used to find the distribution of the ratio of two quadratic forms in independent normal variables; the accuracy is set by the user, a maximum error of 0.0001 being an appropriate value.
Abstract: pr(Q
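The inversion the TL;DR describes can be sketched with Imhof's (1961) formula: the ratio event {Q1/Q2 < r} is rewritten as {Q1 − r·Q2 < 0}, a single quadratic form in normal variables, whose distribution function is then obtained by numerically integrating a function derived from the characteristic function. The sketch below assumes central components with one degree of freedom each; the paper's own algorithm and error control may differ.

```python
import numpy as np
from scipy.integrate import quad

def imhof_cdf(x, lam):
    """Pr(sum_j lam_j * Z_j^2 < x) for independent N(0,1) Z_j, by
    Imhof-type numerical inversion of the characteristic function."""
    lam = np.asarray(lam, float)
    def integrand(u):
        theta = 0.5 * np.sum(np.arctan(lam * u)) - 0.5 * x * u
        rho = np.prod((1.0 + (lam * u) ** 2) ** 0.25)
        return np.sin(theta) / (u * rho)
    val, _ = quad(integrand, 0, np.inf, limit=200)
    return 0.5 - val / np.pi

def ratio_cdf(r, lam_num, lam_den):
    """Pr(Q1/Q2 < r), reduced to Pr(Q1 - r*Q2 < 0) for Q2 > 0."""
    lam = np.concatenate([np.asarray(lam_num, float),
                          -r * np.asarray(lam_den, float)])
    return imhof_cdf(0.0, lam)
```

Here `quad`'s absolute tolerance plays the role of the user-set maximum error mentioned in the TL;DR.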

514 citations



Journal ArticleDOI
TL;DR: In this paper, a method is suggested by which marginal third- or fourth-order moments may be combined to produce a statistic to test for multivariate normality.
Abstract: SUMMARY A method is suggested by which marginal third- or fourth-order moments may be combined to produce a statistic to test for multivariate normality.
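The idea of combining marginal moments can be sketched as follows: under normality each marginal skewness is approximately N(0, 6/n), so a sum of squares of standardised marginal skewnesses can be referred to a chi-squared distribution. Treating the margins as independent is a simplification made here for illustration; the paper's combination is more careful.

```python
import numpy as np
from scipy.stats import skew, chi2

def marginal_skewness_statistic(X):
    """Combine marginal third-order moments into one test statistic for
    multivariate normality: S = (n/6) * sum_j b1_j^2, referred to
    chi-squared on p df (margins treated as independent, an assumption
    made only for this sketch)."""
    X = np.asarray(X, float)
    n, p = X.shape
    b1 = skew(X, axis=0)                 # marginal sample skewnesses
    S = n / 6.0 * np.sum(b1 ** 2)
    return S, chi2.sf(S, df=p)
```

The same construction works for fourth-order moments with the variance 24/n of the marginal kurtosis excess.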

109 citations


Journal ArticleDOI
TL;DR: With this assumption, in the renal transplant application the intersection point, γ, corresponds to the time at which a rejection occurs; this parameter is of great interest, both in the treatment of individual patients and because inferences about γ for each of a number of patients permit inferences about variations in the rates of rejection.
Abstract: SUMMARY The switching straight lines model is

  y_i = α₁ + β₁ x_i + e_i,  i = 1, ..., τ,
  y_i = α₂ + β₂ x_i + e_i,  i = τ + 1, ..., T,   (1)

where the e_i are independently, normally distributed N(0, σ²), 1 ≤ τ < T, (α₁, β₁) ≠ (α₂, β₂), and all the parameters α₁, β₁, α₂, β₂, σ², τ are unknown. The two straight lines in (1) intersect at a point with x coordinate γ = (α₁ − α₂)/(β₂ − β₁). If we further assume that x₁ < x₂ < ... < x_T (in many applications, x will denote time), it is useful to distinguish two versions of (1) according to whether γ satisfies x_τ ≤ γ ≤ x_{τ+1} or not. When this constraint is assumed, we shall refer to the constrained case of switching straight lines. It has been found (see, for example, Knapp et al., 1977) that the constrained version provides a satisfactory statistical model for monitoring the function of renal transplants. Following a transplant, daily measurements are made of serum-creatinine, a substance which indicates the level of functioning of the patient's kidney, and a plot of the reciprocal of bodyweight-corrected serum-creatinine (y) is then made against time (x). Fig. 1 shows part of two typical plots obtained from patients undergoing treatment at the City Hospital, Nottingham. An increasing straight line indicates successful functioning of the transplanted kidney; a sudden switch to a decreasing straight line indicates that rejection has occurred. It might be argued that rejection is not an instantaneous process, so that the sharp intersection of the two lines is a fiction, and an approach assuming a less sharp transition, such as that of Bacon and Watts (1971), might be more reasonable. On present evidence, however, any transition period would appear to be usually very short compared with the interval between measurements, and so the switching straight line model provides a reasonable approximation. With this assumption, in the renal transplant application the intersection point, γ, corresponds to the time at which a rejection occurs.
This parameter is of very great interest, both in the treatment of individual patients and because inferences about γ for each of a number of patients permit inferences about daily and hourly variations in the rates of rejection. This, in turn, has implications for treatment and monitoring procedures in the hospital (see Knapp et al., 1979).
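A point-estimation sketch of the switching straight lines model: profile over the change index τ, fit each segment by ordinary least squares, and read off the x coordinate of the intersection of the two fitted lines. The paper's treatment is inferential rather than this simple least-squares analogue, which is given only for illustration.

```python
import numpy as np

def fit_switching_lines(x, y):
    """Least-squares fit of two straight lines switching at an unknown
    index tau; returns (tau, gamma), where gamma is the x coordinate of
    the intersection of the two fitted lines."""
    n = len(x)
    best = None
    for tau in range(2, n - 1):              # at least 2 points per segment
        rss, fits = 0.0, []
        for seg in (slice(0, tau), slice(tau, n)):
            A = np.column_stack([np.ones(len(x[seg])), x[seg]])
            coef, *_ = np.linalg.lstsq(A, y[seg], rcond=None)
            rss += np.sum((y[seg] - A @ coef) ** 2)
            fits.append(coef)
        if best is None or rss < best[0]:
            best = (rss, tau, fits)
    (a1, b1), (a2, b2) = best[2]
    gamma = (a1 - a2) / (b2 - b1)            # intersection of the two lines
    return best[1], gamma
```

In the renal transplant setting, x is time, y the reciprocal of corrected serum-creatinine, and gamma estimates the rejection time.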

95 citations


Journal ArticleDOI
TL;DR: A review of Statistical Techniques for Manpower Planning by David J. Bartholomew and Andrew F. Forbes (Chichester and New York, Wiley, 1979).
Abstract: Statistical Techniques for Manpower Planning. By David J. Bartholomew and Andrew F. Forbes. Chichester and New York, Wiley, 1979. xiii, 288 p. 23 cm. £14·75.

75 citations


Journal ArticleDOI
TL;DR: In this paper, a general model is proposed for response surface problems in which it is anticipated that the response on a particular unit will be affected by "overlap" effects from neighbouring units; it is assumed that only plots that are physically adjacent (horizontally, vertically or diagonally) mutually affect each other.
Abstract: SUMMARY A general model is suggested for response surface problems in which it is anticipated that the response on a particular unit will be affected by "overlap" effects from neighbouring units. Two numerical examples, one with constructed data and one with Rothamsted data on mildew control, illustrate use of the model. In both examples, it is assumed that only plots that are physically adjacent (horizontally, vertically or diagonally) mutually affect each other. In general, more or less complicated assumptions can be accommodated.
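The adjacency assumption can be made concrete by augmenting the design matrix with an "overlap" column: for each plot on a grid, the sum of treatment levels on the up to eight plots adjacent horizontally, vertically or diagonally. This is a one-treatment sketch of the general idea, not the paper's model.

```python
import numpy as np

def neighbour_design(treat, shape):
    """Design matrix [intercept, own treatment, neighbour overlap] for
    plots on an nr x nc grid; overlap is the sum of treatment levels on
    the (up to 8) horizontally, vertically or diagonally adjacent plots."""
    nr, nc = shape
    T = np.asarray(treat, float).reshape(nr, nc)
    P = np.zeros((nr + 2, nc + 2))           # zero-pad the field edge
    P[1:-1, 1:-1] = T
    overlap = np.zeros((nr, nc))
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr or dc:                     # skip the plot itself
                overlap += P[1 + dr:1 + dr + nr, 1 + dc:1 + dc + nc]
    return np.column_stack([np.ones(nr * nc), T.ravel(), overlap.ravel()])
```

Fitting the response on these columns by least squares separates the direct treatment effect from the spillover coefficient; more complicated neighbourhood assumptions just change which offsets are summed.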

68 citations


Journal ArticleDOI
Abstract: SUMMARY Experimental designs are presented for estimating an extreme percentage point of a logistic distribution when the observations are quantal responses and the location and scale parameters are unknown. The method is based on a prior distribution of the parameters and a predicted value of the posterior variance. The paper is an extension of an earlier article (Tsutakawa, 1972) for the case when the scale parameter is known.
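The quantity the designs target can be sketched directly: for a logistic tolerance distribution with location mu and scale sigma, the 100p-th percentage point is x_p = mu + sigma*log(p/(1-p)), and it can be estimated by maximum likelihood from quantal-response data. This sketch shows only the estimation target; the paper's contribution is the Bayesian choice of the dose levels themselves, which is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

def logistic_percentile_mle(dose, n_trials, n_resp, p):
    """ML estimate of the 100p-th percentage point of a logistic
    tolerance distribution from quantal responses (n_resp responders
    out of n_trials at each dose)."""
    dose, n_trials, n_resp = map(np.asarray, (dose, n_trials, n_resp))
    def negll(theta):
        mu, log_sigma = theta
        z = (dose - mu) / np.exp(log_sigma)
        pr = np.clip(1.0 / (1.0 + np.exp(-z)), 1e-12, 1 - 1e-12)
        return -np.sum(n_resp * np.log(pr)
                       + (n_trials - n_resp) * np.log(1 - pr))
    res = minimize(negll, x0=[dose.mean(), 0.0], method="Nelder-Mead")
    mu, sigma = res.x[0], np.exp(res.x[1])
    return mu + sigma * np.log(p / (1 - p))
```

For an extreme p the estimate is very sensitive to where the doses sit, which is exactly why the design question the paper addresses matters.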

65 citations


Journal ArticleDOI
TL;DR: Shrunken estimators in canonical variate analysis are shown to lead to improved stability of the resulting coefficients when the between-groups sum of squares for a particular principal component defined by the within-groups covariance or correlation matrix is small and the corresponding eigenvalue is also small.
Abstract: SUMMARY Shrunken estimators in canonical variate analysis are shown to lead to improved stability of the resulting coefficients when the between-groups sum of squares for a particular principal component defined by the within-groups covariance or correlation matrix is small and the corresponding eigenvalue is also small.
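A generic shrinkage sketch of canonical variate analysis: the coefficients solve the generalised eigenproblem B a = l W a, and adding a ridge term W + kI damps the contribution of small-eigenvalue directions of the within-groups matrix, which is where the instability described above arises. The paper derives specific shrinkage choices; the constant k here is just an illustrative tuning parameter.

```python
import numpy as np

def shrunken_canonical_variates(B, W, k):
    """Canonical variate coefficients from between-groups (B) and
    within-groups (W) matrices, with ridge-type shrinkage W + k*I.
    Returns eigenvalues (descending) and the coefficient columns."""
    Wk = W + k * np.eye(W.shape[0])
    L = np.linalg.cholesky(Wk)               # reduce B a = l Wk a ...
    Li = np.linalg.inv(L)
    M = Li @ B @ Li.T                        # ... to a symmetric problem
    evals, evecs = np.linalg.eigh(M)
    order = np.argsort(evals)[::-1]
    coeffs = Li.T @ evecs[:, order]          # columns = canonical vectors
    return evals[order], coeffs
```

With k = 0 this is ordinary canonical variate analysis; increasing k trades a little between-groups separation for stability of the coefficients.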

63 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigate problems connected with the appearance of surprising values in samples of univariate data that are circular or directional, and examine how their presence might be detected and describe possible tests of discordancy.
Abstract: SUMMARY In this paper we investigate problems connected with the appearance of surprising values in samples of univariate data that are circular or directional. We examine how their presence might be detected and describe possible tests of discordancy. The relative performance of the test procedures is investigated using Monte Carlo methods. A practical application of the discordancy tests is included.
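One way such a discordancy test can be built, in the spirit of the paper though not necessarily with one of its statistics, is an exclusion statistic: the largest relative gain in the mean resultant length R from deleting a single observation, calibrated by Monte Carlo under a fitted von Mises null. The concentration estimate below uses a standard approximation, an assumption of this sketch.

```python
import numpy as np

def resultant_length(theta):
    return np.hypot(np.cos(theta).sum(), np.sin(theta).sum()) / len(theta)

def est_kappa(R):
    """Standard approximation to the von Mises concentration given R."""
    if R < 0.53:
        return 2 * R + R ** 3 + 5 * R ** 5 / 6
    if R < 0.85:
        return -0.4 + 1.39 * R + 0.43 / (1 - R)
    return 1.0 / (R ** 3 - 4 * R ** 2 + 3 * R)

def discordancy_test(theta, n_sim=2000, seed=0):
    """Exclusion-type discordancy test for one suspect circular outlier:
    statistic = max_i (R_without_i - R) / (1 - R), with a Monte Carlo
    p-value under a von Mises null fitted to the data."""
    theta = np.asarray(theta, float)
    n = len(theta)
    def stat(t):
        R = resultant_length(t)
        R_drop = max(resultant_length(np.delete(t, i)) for i in range(len(t)))
        return (R_drop - R) / (1 - R)
    obs = stat(theta)
    mu = np.arctan2(np.sin(theta).sum(), np.cos(theta).sum())
    kappa = est_kappa(resultant_length(theta))
    rng = np.random.default_rng(seed)
    sims = [stat(rng.vonmises(mu, kappa, n)) for _ in range(n_sim)]
    return obs, np.mean([s >= obs for s in sims])   # Monte Carlo p-value
```

The Monte Carlo calibration mirrors the performance study described in the abstract: the null distribution of the statistic is not available in closed form, so it is simulated.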

58 citations



Journal ArticleDOI
TL;DR: In this paper, a simple scoring system is presented for controlling the mean of a normal distribution in the one- and two-sided cases, where corrective action is indicated whenever a sample mean falls outside "action limits" placed at μ₀ ± k₂σ/√n.
Abstract: SUMMARY Process control schemes using a simple scoring system are presented for controlling the mean of a normal distribution in the one- and two-sided cases. The average run lengths of the schemes are simpler to compute than those of cumulative sum schemes, and for small deviations in the process mean the schemes are considerably more sensitive than Shewhart schemes. Some examples and comparisons are given. In the basic Shewhart scheme, corrective action is indicated whenever a sample mean falls outside "action limits" placed at μ₀ ± k₂σ/√n. Page (1955) has suggested the inclusion of "warning limits" at μ₀ ± k₁σ/√n (k₁ < k₂), with the rule that the process is halted if r consecutive points fall between the warning and action limits, or a single point falls outside the action limits. In the cusum scheme (Page, 1954) for detecting increases in the mean from its target value, the cumulative sums of the differences between the sample means and some reference value K, Σ(X̄ᵢ − K), are plotted against sample number. If the cusum becomes negative, the cumulation is restarted, but if it reaches some value H (the decision interval), then corrective action is indicated. For two-sided control, it is necessary to test also for decreases in the mean, and this is done by operating a second cusum with reference value and decision interval −K and −H respectively. The schemes proposed in this paper assign a score of −1, +1 or 0 to the sample means according to whether they are "extreme negative", "extreme positive" or otherwise. In the two-sided case corrective action is indicated when the modulus of the cumulative score reaches some fixed value, which amounts to operating a cusum on the scores with zero reference value. In the one-sided case, a new decision rule is proposed.
Like the Shewhart scheme, both schemes have the attractive property that the average run length, ARL(μ), can be expressed as a simple function of the tail areas of the quality distribution, and the basic Shewhart scheme is in fact a special case. The advantage of these schemes is that they can be more sensitive to small deviations in the process mean than Shewhart schemes, at the expense of some efficiency for
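The two-sided scoring scheme can be simulated directly. The sketch below assumes standardised sample means, so the score limits sit at ±k1, and estimates the average run length by Monte Carlo; the particular values of k1 and h in the usage are illustrative, not taken from the paper.

```python
import numpy as np

def score_scheme_run_length(xbars, k1, h):
    """Run length of the two-sided scoring scheme: each standardised
    sample mean scores +1 if above k1, -1 if below -k1, 0 otherwise;
    action is signalled when |cumulative score| reaches h. Returns the
    1-based sample number of the signal, or None if no signal occurs."""
    score = np.where(xbars > k1, 1, np.where(xbars < -k1, -1, 0))
    c = np.cumsum(score)
    hits = np.nonzero(np.abs(c) >= h)[0]
    return hits[0] + 1 if len(hits) else None

def arl(shift, k1, h, n_rep=2000, n_max=10000, seed=0):
    """Monte Carlo average run length for a given shift in the mean."""
    rng = np.random.default_rng(seed)
    rls = []
    for _ in range(n_rep):
        rl = score_scheme_run_length(rng.normal(shift, 1, n_max), k1, h)
        rls.append(rl if rl else n_max)
    return float(np.mean(rls))
```

As the abstract notes, the ARL depends only on the tail areas beyond ±k1, which is why it is simpler to compute than a cusum ARL; the simulation makes the sensitivity gain for small shifts easy to see.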




Journal ArticleDOI
TL;DR: In this paper, two normalizing transformations of a Student's t variate are proposed, one is more accurate uniformly than any transformation previously given, while the other is very accurate locally at a prescribed deviate of the standard normal distribution.
Abstract: SUMMARY Two normalizing transformations of a Student's t variate are proposed. One is more accurate uniformly than any transformation previously given, while the other is very accurate locally at a prescribed deviate of the standard normal distribution. A number of authors have suggested transformations that normalize a Student's t variate; some of these are compared in Prescott (1974). Such transformations have several applications: they can be used in the construction of a single test statistic from a number of calculated t values; and they can be included in computer software packages so that, for instance, a t test can be carried out with the need for only a few critical values of the normal distribution to be stored. In view of this latter point, there is a particular need for a transformation that is accurate locally at a prescribed deviate of the standard normal distribution. In this paper we derive such a transformation along with another that is more accurate uniformly than any previously given. Both transformations, like their precursors, can be evaluated easily on a modest electronic calculator.
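For context, one classical normalizing transformation of t, the form usually attributed to Wallace (1959) and among those compared in Prescott (1974), can be evaluated and checked against the exact t distribution. The paper's two new transformations are not reproduced here; attributing this exact form to Wallace is an assumption of this sketch.

```python
import numpy as np
from scipy.stats import t as t_dist, norm

def wallace_normalize(t, n):
    """Classical normalizing transformation of a Student's t variate on
    n degrees of freedom: z = sign(t) * (8n+1)/(8n+3) * sqrt(n*ln(1+t^2/n)).
    norm.cdf(z) then approximates the t distribution function."""
    u = np.sqrt(n * np.log1p(t * t / n))
    return np.sign(t) * (8 * n + 1) / (8 * n + 3) * u
```

This is exactly the kind of calculator-friendly formula the abstract has in mind: a t test needs only stored normal critical values once t is mapped to z.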





Journal ArticleDOI
TL;DR: A variation on a standard analysis is described in which different sets of data conform to the same linear model but with individual constants of proportionality.
Abstract: A variation on a standard analysis is described in which different sets of data conform to the same linear model but with individual constants of proportionality.