
Showing papers on "Robustness (computer science) published in 1973"





Journal ArticleDOI
TL;DR: In this paper, a small-scale sampling experiment on the performance of robust estimators of location is reported, with the underlying populations taken to be normal with unsymmetric contamination.
Abstract: Results are reported of a small-scale sampling experiment on the performance of robust estimators of location. The underlying populations are taken to be normal, contaminated unsymmetrically. The relative behaviour of the estimators depends on which aspect of the distribution is to be estimated.

14 citations
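The kind of experiment this abstract describes can be sketched in a few lines. The estimators, sample size, contamination fraction, and contaminating distribution below are illustrative assumptions, not the paper's actual design: samples are drawn from a standard normal contaminated on one side by N(4, 1) observations, and the bias of three location estimators is compared.

```python
import random
import statistics

random.seed(1)

def trimmed_mean(xs, frac=0.1):
    # Drop the lowest and highest frac of the sorted sample, average the rest.
    xs = sorted(xs)
    k = int(len(xs) * frac)
    core = xs[k:len(xs) - k]
    return sum(core) / len(core)

def draw_sample(n=20, eps=0.1):
    # With probability eps an observation comes from the contaminating
    # distribution, shifted to the right (unsymmetric contamination).
    return [random.gauss(4.0, 1.0) if random.random() < eps
            else random.gauss(0.0, 1.0) for _ in range(n)]

reps = 2000
means, medians, trims = [], [], []
for _ in range(reps):
    s = draw_sample()
    means.append(statistics.fmean(s))
    medians.append(statistics.median(s))
    trims.append(trimmed_mean(s))

# Bias relative to the centre (0) of the uncontaminated component:
# the unprotected mean is pulled furthest toward the contaminant.
bias = {name: abs(statistics.fmean(v))
        for name, v in [("mean", means), ("median", medians), ("trim", trims)]}
print(bias)
```

Under this setup the sample mean shows the largest bias, while the median and trimmed mean resist the one-sided contamination, which is the qualitative behaviour such sampling experiments measure.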


Journal ArticleDOI
TL;DR: In this article, the authors describe the incorporation of surface-wave devices into a practical radar system; the capabilities and limitations of the technology are discussed, including its use in the generation and compression of the chirp waveforms employed in many modern radars.

8 citations


Journal ArticleDOI
TL;DR: In this article, the robustness of reliability predictions based on the exponential failure law was investigated under possible deviations within the Weibull family of failure distributions, and regions of robustness were provided for parallel systems of N identical components, N = 1(1)15.
Abstract: The robustness of reliability predictions based on the exponential failure law is investigated under possible deviations within the Weibull family of failure distributions. Regions of robustness are provided for parallel systems of N identical components, N = 1(1)15; i.e., regions in the space of the Weibull shape parameter, within which one may safely use the exponential prediction procedure and have no more than a prespecified error.

2 citations
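The comparison underlying this abstract is easy to illustrate. The sketch below is not the paper's procedure (which maps out whole regions of the shape parameter); it simply evaluates, for one hypothetical mission time and one assumed Weibull shape, the error incurred by predicting the reliability of an N-component parallel system with the exponential law (Weibull shape β = 1) when the true law is Weibull with β ≠ 1.

```python
import math

def weibull_reliability(t, beta, eta=1.0):
    # Component survival probability under a Weibull(beta, eta) failure law;
    # beta = 1 reduces to the exponential law.
    return math.exp(-((t / eta) ** beta))

def parallel_reliability(t, n, beta, eta=1.0):
    # A parallel system of n identical components survives unless all n fail.
    r = weibull_reliability(t, beta, eta)
    return 1.0 - (1.0 - r) ** n

t, n = 0.5, 3                                      # illustrative values
exact = parallel_reliability(t, n, beta=1.5)       # true Weibull behaviour
exp_pred = parallel_reliability(t, n, beta=1.0)    # exponential prediction
error = abs(exact - exp_pred)
print(exact, exp_pred, error)
```

A region of robustness in the paper's sense would be the set of shape parameters β for which this error stays below a prespecified bound.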



Proceedings ArticleDOI
01 Jan 1973
TL;DR: In this paper, the term analytic model refers to the set of computer system models based upon queuing theory; such models indicate rather clearly how various policies and variables interact to affect system performance, and the predictions of some coincide well with measured data.
Abstract: The use of simulation studies to investigate the behavior of computer hardware/software systems is well established, although in some sense analytic models are more desirable. Simulation is used in situations that are completely intractable to analytic machinery, or in which the essence is lost when the abstractions and simplifying assumptions prerequisite to the analytic technique are made. In this paper, the term analytic model refers to the set of computer system models based upon queuing theory. Many view the operating system as a network of resources, each potentially precipitating queues. Such a view allows the units passing through the network to request (employ) the devices in various combinations in a multiprogramming environment. The computing system is thus viewed as sequences of resource seizures and releases (processors, memory, and input-output equipment) with appropriate holding-time assumptions. In many cases, the models indicate rather clearly how various policies and variables interact to affect system performance, and the predictions of some coincide well with measured data.
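The simplest analytic result of the kind such queueing models rest on is the M/M/1 queue. The arrival and service rates below are hypothetical figures for a single resource (say, one CPU), not values from the paper:

```python
def mm1_metrics(lam, mu):
    # Steady-state metrics of an M/M/1 queue: Poisson arrivals at rate lam,
    # exponential service at rate mu; stable only when lam < mu.
    assert lam < mu, "queue is unstable unless lam < mu"
    rho = lam / mu          # server utilization
    n = rho / (1.0 - rho)   # mean number of jobs in the system
    t = 1.0 / (mu - lam)    # mean time a job spends in the system
    return rho, n, t

rho, n, t = mm1_metrics(lam=8.0, mu=10.0)
print(rho, n, t)   # Little's law holds: n == lam * t
```

Even this one-server formula shows how a policy variable (service rate) interacts nonlinearly with load: as utilization ρ approaches 1, the mean queue length ρ/(1 − ρ) grows without bound.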


01 Jan 1973
TL;DR: The effects of inaccuracy in the parameters of a hypothesized normal distribution on certain tests of symmetrical sample censoring are investigated.
Abstract: The effects of inaccuracy in the parameters of a hypothesized normal distribution on certain tests of symmetrical sample censoring are investigated. (From "Robustness of Certain Tests of Censoring of Extreme Sample Values" by N. L. Johnson, University of North Carolina at Chapel Hill. This research was supported in part by the U.S. Army Research Office, Durham, under Contract No. DAEC04-71-C-0042.)

01 Jan 1973
TL;DR: In this paper, the robustness of the following four tests for homogeneity of variance was investigated: Cochran's, Hartley's, Miller's Jackknife and Scheffe's.
Abstract: The robustness of the following four tests for homogeneity of variance was investigated: Cochran's, Hartley's, Miller's jackknife, and Scheffé's. Both Type I and Type II errors were determined under conditions of non-normal data and unequal sample sizes. All tests were on two samples, with sample sizes ranging from 10 to 30. The results are based on Monte Carlo calculations of one thousand points. Cochran's, Hartley's, and Miller's tests were found to perform well. Scheffé's test had poor power, but this may have been caused by the number of samples and their small size.
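A Monte Carlo study of this kind can be sketched for one of the four tests. For two groups, Hartley's F-max statistic is simply the ratio of the larger to the smaller sample variance; the sketch below uses a simulated critical value rather than the published tables, and the sample size and replication count are illustrative, not the study's:

```python
import random
import statistics

random.seed(2)

def fmax(x, y):
    # Hartley's F-max for two groups: larger over smaller sample variance.
    vx, vy = statistics.variance(x), statistics.variance(y)
    return max(vx, vy) / min(vx, vy)

n, reps = 10, 2000

# Simulate the null distribution (equal variances, normal data)
# to obtain an approximate 5% critical value.
null = sorted(fmax([random.gauss(0, 1) for _ in range(n)],
                   [random.gauss(0, 1) for _ in range(n)])
              for _ in range(reps))
crit = null[int(0.95 * reps)]

# Re-simulate under the null and count rejections: the empirical
# Type I error rate should sit near the nominal 5%.
rejections = sum(fmax([random.gauss(0, 1) for _ in range(n)],
                      [random.gauss(0, 1) for _ in range(n)]) > crit
                 for _ in range(reps))
print(rejections / reps)
```

Replacing the normal generator with a skewed or heavy-tailed one, or making the two sample sizes unequal, turns this into the robustness comparison the abstract describes.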

Journal ArticleDOI
TL;DR: In this paper, the effect of non-normality on tolerance limits is investigated through a simulation that estimates the confidence associated with tolerance limits constructed under normal theory when the underlying population density departs from normality in varying degrees.
Abstract: The effect of non-normality on tolerance limits is investigated through a simulation. This effect is measured by estimating the confidence associated with tolerance limits constructed under normal theory when the underlying population density departs from normality in varying degrees. The results are presented in tabular form for sample sizes n = 3, 5, 10, 20; departures in skewness of β1 = 0, .25, .50, 1.00; departures in kurtosis of β2 = 1.7, 2.0, 3.0, 4.0, 5.0; and population percentages of 90, 95, and 99, for a confidence coefficient of .90 under normal theory.
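The simulation this abstract describes can be sketched as follows. Two-sided normal-theory tolerance limits have the form x̄ ± k·s; here k is an illustrative constant (not a tabled tolerance factor), the skewed population is an exponential rather than one indexed by β1 and β2, and the replication count is arbitrary. The sketch estimates the confidence that the limits cover at least 90% of the population, first when the population really is normal and then when it is skewed:

```python
import math
import random
from statistics import NormalDist, fmean, stdev

random.seed(3)
n, reps, k, target = 20, 2000, 2.4, 0.90   # illustrative settings
nd = NormalDist()

def exp_cdf(x, rate=1.0):
    # CDF of the exponential population used as the skewed alternative.
    return 1.0 - math.exp(-rate * x) if x > 0 else 0.0

def confidence(draw, cdf):
    # Fraction of samples whose normal-theory limits xbar +/- k*s
    # actually cover at least `target` of the population.
    hits = 0
    for _ in range(reps):
        s = [draw() for _ in range(n)]
        m, sd = fmean(s), stdev(s)
        coverage = cdf(m + k * sd) - cdf(m - k * sd)
        hits += coverage >= target
    return hits / reps

conf_normal = confidence(lambda: random.gauss(0, 1), nd.cdf)
conf_skewed = confidence(lambda: random.expovariate(1.0), exp_cdf)
print(conf_normal, conf_skewed)
```

The drop from conf_normal to conf_skewed is the quantity the paper tabulates: the confidence actually attained by normal-theory limits erodes as the population departs from normality.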