
Showing papers on "Sampling (statistics)" published in 1969


Journal ArticleDOI
TL;DR: In this article, the problem of estimating the components of a mixture of two normal distributions, multivariate or otherwise, with common but unknown covariance matrices is examined, and the maximum likelihood equations are shown to be not unduly laborious to solve and the sampling properties of the resulting estimates are investigated.
Abstract: The problem of estimating the components of a mixture of two normal distributions, multivariate or otherwise, with common but unknown covariance matrices is examined. The maximum likelihood equations are shown to be not unduly laborious to solve and the sampling properties of the resulting estimates are investigated, mainly by simulation. Moment estimators, minimum χ² and Bayes estimators are discussed but they appear greatly inferior to maximum likelihood except in the univariate case, the inferiority lying either in the sampling properties of the estimates or in the complexity of the computation. The wider problems obtained by allowing the components in the mixture to have different covariance matrices, or by having more than two components in the mixture, are briefly discussed, as is the relevance of this problem to cluster analysis.

813 citations
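
The maximum likelihood fit described above can be illustrated with a small simulation. The sketch below is not the authors' computation: it uses the EM iteration for a two-component univariate normal mixture with a common variance, and all data and starting values are made up.

```python
import numpy as np

def em_two_normal_mixture(x, n_iter=200):
    """EM iterations for a two-component univariate normal mixture with common variance.

    Returns (mixing proportion, mean 1, mean 2, common variance).
    """
    x = np.asarray(x, dtype=float)
    # Crude starting values: split the sample around its quartiles.
    p, mu1, mu2, var = 0.5, np.quantile(x, 0.25), np.quantile(x, 0.75), x.var()
    for _ in range(n_iter):
        # E-step: posterior probability that each observation came from component 1
        # (the shared variance cancels from the normalising constants).
        d1 = p * np.exp(-0.5 * (x - mu1) ** 2 / var)
        d2 = (1 - p) * np.exp(-0.5 * (x - mu2) ** 2 / var)
        w = d1 / (d1 + d2)
        # M-step: update the proportion, the two means, and the pooled variance.
        p = w.mean()
        mu1 = np.sum(w * x) / np.sum(w)
        mu2 = np.sum((1 - w) * x) / np.sum(1 - w)
        var = np.sum(w * (x - mu1) ** 2 + (1 - w) * (x - mu2) ** 2) / len(x)
    return p, mu1, mu2, var

# Simulated check in the spirit of the paper's sampling experiments (values invented).
rng = np.random.default_rng(1)
sample = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(2.5, 1.0, 700)])
print(em_two_normal_mixture(sample))
```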


Book
01 Jan 1969
TL;DR: In this paper, an illustrated handbook outlining the methods, materials, variations, limitations, and applications that would be useful to anyone dealing with sediments is presented. Subjects include sedimentary peels, impregnation, radiography, laboratory instruments, sampling, and a variety of other methods.
Abstract: An illustrated handbook outlining the methods, materials, variations, limitations, and applications that would be useful to anyone dealing with sediments. Subjects include: sedimentary peels, impregnation, radiography, laboratory instruments, sampling, and a variety of other methods. -- AATA

197 citations



Journal ArticleDOI
TL;DR: In this paper, the Weibull process with an unknown scale parameter was examined as a model for Bayesian decision making, and the analysis was extended by treating both the shape and scale parameters as unknown.
Abstract: Previously, the Weibull process with an unknown scale parameter was examined as a model for Bayesian decision making. The analysis is extended by treating both the shape and scale parameters as unknown. It is not possible to find a family of continuous joint prior distributions on the two parameters that is closed under sampling, so a family of prior distributions is used that places continuous distributions on the scale parameter and discrete distributions on the shape parameter. Prior and posterior analyses are examined and seen to be no more difficult than for the case in which only the scale parameter is treated as unknown, but preposterior analysis and determination of optimal sampling plans are considerably more complicated in this case. To illustrate the use of the present model, an example is presented in which it is necessary to make probability statements about the mean life and reliability of a long-life component both before and after life testing.

177 citations
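
A minimal sketch of this kind of prior-to-posterior computation, assuming the Weibull is written as f(t) = βλt^(β−1)exp(−λt^β), a discrete prior on the shape β, and a conjugate Gamma prior on the scale λ for each shape; the support points, hyperparameters, and lifetimes below are all hypothetical.

```python
import numpy as np
from scipy.special import gammaln

# Discrete prior on the shape beta; Gamma(a, b) prior on the scale lam for each beta.
betas = np.array([0.5, 1.0, 1.5, 2.0, 3.0])        # hypothetical support points
prior_beta = np.full(len(betas), 1.0 / len(betas))
a, b = 2.0, 1.0                                     # hypothetical Gamma hyperparameters

t = np.array([0.8, 1.3, 2.1, 0.5, 1.7])            # hypothetical observed lifetimes
n, sum_log_t = len(t), np.sum(np.log(t))

# Log marginal likelihood of the data for each candidate shape, with lam integrated out.
s = np.array([np.sum(t ** bet) for bet in betas])
log_ml = (n * np.log(betas) + (betas - 1) * sum_log_t
          + a * np.log(b) + gammaln(a + n) - gammaln(a) - (a + n) * np.log(b + s))

post_beta = np.exp(log_ml - log_ml.max()) * prior_beta
post_beta /= post_beta.sum()

# Given the data and a shape value, the scale posterior stays Gamma(a + n, b + sum(t**beta)).
for bet, w, si in zip(betas, post_beta, s):
    print(f"beta={bet:.1f}  P(beta | data)={w:.3f}  lam | beta, data ~ Gamma({a + n:.0f}, {b + si:.2f})")
```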


Journal ArticleDOI
TL;DR: The probability distributions of population size are derived for populations living in randomly varying environments for both density-dependent and density-independent population growth.
Abstract: The probability distributions of population size are derived for populations living in randomly varying environments for both density-dependent and density-independent population growth. The effects of random variation in the rate of increase, the carrying capacity, and sampling variation in numbers are examined.

164 citations
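
A Monte Carlo sketch of the density-dependent case, assuming a discrete logistic step with a normally distributed rate of increase and carrying capacity; all parameter values are invented and the paper's analytical derivations are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

def population_distribution(n0=50.0, steps=100, replicates=5000,
                            r_mean=0.1, r_sd=0.05, k_mean=200.0, k_sd=20.0):
    """Simulate density-dependent growth with a random rate of increase and carrying capacity."""
    n = np.full(replicates, n0)
    for _ in range(steps):
        r = rng.normal(r_mean, r_sd, replicates)        # random rate of increase
        k = rng.normal(k_mean, k_sd, replicates)        # random carrying capacity
        n = np.maximum(n + r * n * (1.0 - n / k), 0.0)  # discrete logistic step
    return n

final = population_distribution()
print("mean", final.mean(), "sd", final.std(), "5th/95th percentiles", np.percentile(final, [5, 95]))
```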


Journal ArticleDOI
TL;DR: In this paper, the authors investigate whether data recorded on such limited score scales can be validly analyzed via F-tests, particularly when one or more of the hypotheses concern interactions between factors.
Abstract: Many educational and psychological measurements are made with instruments that yield only a limited number of score values. Ratings and course grades, for example, are frequently recorded on a five-point scale. The effectiveness of various educational procedures, such as those used in counseling, may be evaluated on a four-point scale. Pupil traits in non-academic areas are sometimes assessed on a three-point scale: "above average," "average," or "below average." In situations where change in status is measured, the assessment may be simply one of "Improved" or "No change." If such measures, numerically coded, are taken in an experiment and the investigator is concerned solely with means, he may be tempted to analyze the data via the techniques of analysis of variance. This is especially likely if there are several factors of interest in the experiment, and one or more of the hypotheses are concerned with interactions between factors. The issue is then raised, "Can data of this kind be validly analyzed via F-tests?" Investigation of this question was the primary purpose of this study.

150 citations
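
The question of F-test validity on coarse scales lends itself to a small simulation. The sketch below is not the study's design: it simply checks the empirical Type I error of a one-way ANOVA when a normal response is collapsed to a five-point scale, with hypothetical cut points and group sizes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def type1_rate_five_point(n_per_group=20, groups=4, reps=5000, alpha=0.05):
    """Empirical rejection rate of one-way ANOVA on five-point scores when all group means are equal."""
    cuts = [-1.5, -0.5, 0.5, 1.5]                  # hypothetical category boundaries
    rejections = 0
    for _ in range(reps):
        latent = rng.normal(0.0, 1.0, (groups, n_per_group))
        scores = np.digitize(latent, cuts) + 1     # integer scores 1..5
        _, p = stats.f_oneway(*scores)
        rejections += (p < alpha)
    return rejections / reps

print("empirical Type I error:", type1_rate_five_point())
```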


Journal ArticleDOI
TL;DR: In this paper, a superpopulation model is proposed for two-stage sampling from a finite population and the problem of estimating a linear function of the finite population elements is considered, where the superpopulation is assumed to be normal and the estimate is the mean of the posterior distribution.
Abstract: A superpopulation model is proposed for two-stage sampling from a finite population and we consider the problem of estimating a linear function of the finite population elements. We find the estimate with smallest mean squared error among linear estimates with bounded mean squared error without any assumption about the form of the super-population distribution. If the superpopulation is assumed to be normal this estimate is the mean of the posterior distribution. The estimate is compared with standard results for the special case of the finite population mean.

141 citations



Journal ArticleDOI
Eizi Kuno
TL;DR: In this paper, a simple method of sequential sampling is developed which makes it possible to secure automatically, without excess sampling, a predetermined level of precision for each of a series of required population estimates.
Abstract: A simple method of sequential sampling is developed which would make it automatically possible to secure, without excess sampling, a predetermined level of precision for a series of population estimates being required. It appears to have wide application to sampling field populations under various situations since it is simply based upon the relationship of variance to mean for which a comprehensive formula deduced for biological populations from the linearity in the regression of mean crowding on mean density could be adopted. Some problems that may arise in practical application of the method are also discussed.

123 citations
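
A sketch of the kind of stop line such a method produces, assuming Iwao's mean crowding regression m* = α + βm, which gives the variance-mean relation σ² = (α + 1)m + (β − 1)m², and a target relative precision D (standard error divided by mean); the regression coefficients, precision target, and quadrat counts below are all hypothetical.

```python
def kuno_stop_total(n, alpha, beta, d):
    """Cumulative count needed after n sampling units to reach relative precision d,
    under the variance-mean relation var = (alpha + 1) * m + (beta - 1) * m**2."""
    denom = d ** 2 - (beta - 1.0) / n
    return float("inf") if denom <= 0 else (alpha + 1.0) / denom

alpha, beta, d = 0.5, 1.1, 0.25                  # hypothetical coefficients and precision target
counts = [4, 6, 5, 7, 3, 6, 5, 4, 6, 5]          # hypothetical counts per sampling unit
total = 0
for n, c in enumerate(counts, start=1):
    total += c
    need = kuno_stop_total(n, alpha, beta, d)
    print(f"n={n:2d}  cumulative count={total:3d}  stop line={need:7.1f}")
    if total >= need:
        print("precision target reached; stop sampling")
        break
```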


Journal ArticleDOI
TL;DR: In this article, the authors consider whether observed variability represents relevant information or has a significant chance of being due to a combination of sampling and analytical errors, and conclude that analytical precision may matter less than sampling error in determining whether errors are significantly smaller than overall data variability.
Abstract: Analytical precision may be less important than sampling error as a factor governing whether or not errors are significantly smaller than overall data variability. Identical treatment of original and replicate samples, and collection of replicate samples at some 10 percent, but at least 30, of the sample sites of a survey should provide enough information to determine whether or not variability represents relevant information or has a significant chance of being due to a combination of sampling and analytical errors.

121 citations
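
One common way to make the comparison the abstract describes is a one-way analysis of variance on duplicate field samples, with the within-pair mean square estimating combined sampling plus analytical error. The sketch below uses made-up duplicate values; only the design (replicates at a subset of survey sites) comes from the abstract.

```python
import numpy as np
from scipy import stats

# Hypothetical duplicate field samples: one row per site, two analyses per site.
duplicates = np.array([
    [12.1, 11.8], [ 9.4, 10.1], [15.2, 14.7], [ 8.8,  9.5], [11.0, 11.3],
    [13.6, 12.9], [10.2, 10.8], [14.1, 13.5], [ 9.9,  9.6], [12.7, 13.2],
])

site_means = duplicates.mean(axis=1)
# Within-site mean square (1 degree of freedom per duplicate pair): sampling + analytical error.
within_ms = np.sum((duplicates - site_means[:, None]) ** 2) / duplicates.shape[0]
# Between-site mean square (each site mean is based on two analyses).
between_ms = 2.0 * site_means.var(ddof=1)

f = between_ms / within_ms
p = stats.f.sf(f, duplicates.shape[0] - 1, duplicates.shape[0])
print(f"between-site MS={between_ms:.2f}  error MS={within_ms:.2f}  F={f:.1f}  p={p:.4f}")
```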


Journal ArticleDOI
TL;DR: In this article, the authors use statistical correlation methods to define sampling requirements for precipitation measurement networks, finding that, for a minimum acceptance of 75% explained variance between sampling points, a gage spacing of 0.3 mi is needed for 1-min rain rates compared with 7.5 mi for total storm rainfall in summer storms.
Abstract: One approach to defining sampling requirements for precipitation measurement networks is through statistical correlation methods. Data from three dense raingage networks in Illinois were used with this method on rainfall measurements ranging from 1-min rates to total storm, monthly and seasonal amounts. Effects of rain type, synoptic storm type, and other factors on spatial correlations were studied. Correlation decay with distance, used to indicate sampling requirements, was greatest in thunderstorms, rainshowers and air mass storms. Conversely, minimum decay occurred with steady rain and the passage of low pressure centers. Seasonally, the decay rate is much greater in May–September storms than in cold season precipitation. Sampling requirements are extreme in measuring rainfall rates; thus, assuming a minimum acceptance of 75% explained variance between sampling points, a gage spacing of 0.3 mi is needed for 1-min rain rates compared with 7.5 mi for total storm rainfall in summer storms.
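
A sketch of the correlation-decay calculation, assuming an exponential decay of inter-gage correlation with distance, r(d) = exp(−d/d0); the correlations and distances below are invented, not the Illinois network data, and the 75% explained-variance criterion is taken from the abstract.

```python
import numpy as np

# Hypothetical inter-gage correlations of 1-min rain rates at several separations (miles).
distance = np.array([0.2, 0.5, 1.0, 2.0, 4.0, 8.0])
corr = np.array([0.93, 0.82, 0.68, 0.46, 0.22, 0.05])

# Fit r(d) = exp(-d / d0) by least squares on log r, then find the gage spacing at which
# explained variance r**2 falls to 0.75, i.e. r = sqrt(0.75).
slope = np.polyfit(distance, np.log(corr), 1)[0]
d0 = -1.0 / slope
spacing = -d0 * np.log(np.sqrt(0.75))
print(f"decay length d0 = {d0:.2f} mi; spacing for 75% explained variance = {spacing:.2f} mi")
```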

Journal ArticleDOI
TL;DR: In this paper, a midwater trawl based on that designed by Tucker (1951) is described with details of opening and closing equipment, acoustic release gear and depth telemetering pinger.
Abstract: A new midwater trawl based on that designed by Tucker (1951) is described with details of opening and closing equipment, acoustic release gear and depth telemetering pinger. An account is given of the method used in operating the trawl from R.R.S. ‘Discovery’.


Journal ArticleDOI
TL;DR: The authors used statistical and quantitative techniques to summarize a large body of numbers into a small collection of typical values and confirm the results of the analysis by using tests of statistical significance that help protect against sampling and measurement error.
Abstract: Students of politics use statistical and quantitative techniques to: summarize a large body of numbers into a small collection of typical values; confirm (and perhaps sanctify) the results of the analysis by using tests of statistical significance that help protect against sampling and measurement error; discover what's going on in their data and expose some new relationships; and inform their audience what's going on in the data.

Journal ArticleDOI
H.F. Dodge
TL;DR: Among the various statistical tools of quality control are the organized sets of sampling inspection plans and tables used in the acceptance sampling activities of producers and consumers.
Abstract: Among the various statistical tools of quality control are the organized sets of sampling inspection plans and tables used in the acceptance sampling activities of producers and consumers. Prominent among these are the LTPD, the AOQL, the AQL and the CS..
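
For a single sampling plan (sample size n, acceptance number c), the operating-characteristic curve and the average outgoing quality limit can be computed directly from the binomial distribution. The plan below is hypothetical, and the lot-size correction to the AOQ is ignored.

```python
from math import comb

def prob_accept(p, n, c):
    """OC curve: probability of accepting a lot of quality p under plan (n, c)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

n, c = 80, 2                                   # hypothetical single sampling plan
ps = [i / 1000 for i in range(1, 201)]
pa = [prob_accept(p, n, c) for p in ps]

# Average outgoing quality under rectifying inspection (rejected lots screened 100%).
aoq = [p * a for p, a in zip(ps, pa)]
aoql = max(aoq)
# LTPD-style point: the incoming quality that has only a 10% chance of acceptance.
ltpd = next(p for p, a in zip(ps, pa) if a <= 0.10)
print(f"AOQL ~ {aoql:.4f};  quality with Pa = 0.10 (LTPD) ~ {ltpd:.3f}")
```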

01 Jan 1969
TL;DR: In this article, it was found that under typical sampling conditions bottom disturbance was induced by the ship when the total water depth was less than about 6 m, and that during typical sampling procedures in deeper water the ship induces disturbance in the water column beneath it to a depth of at least 14 m.
Abstract: Tests with sampling equipment during 1967 and 1968 in Lake Ontario showed that some of the bottom samples were badly disturbed. As a result of these tests, further extensive trials of bottom-sampling equipment were conducted in Georgian Bay in 1968. These were designed to test the disturbance of the bottom sediments caused by ship movement and to test the efficiency and suitability of various types of sampling devices. It was found that under typical sampling conditions, bottom disturbance was induced by the ship when the total water depth was less than about 6 m. Also it was found that the ship, during typical sampling procedures in deeper water, induces disturbance in the water column, beneath it, to a depth of at least 14 m. Tests were conducted with 3 types of Gravity corer, 2 multiple corers, and 6 Grab samplers. These tests were designed to show their usefulness for both biological and geological sampling. All 3 gravity corers proved efficient for particular tasks. The most successful bottom grab samplers were found to be the Ponar grab and the Shipek bucket sampler.

Journal ArticleDOI
01 Jan 1969
TL;DR: One hundred species of amphibians (32) and reptiles (68) are estimated to occur on Barro Colorado Island on the basis of approximately 47 years of collecting; approximately 80 percent of the species recorded from the island had been collected by 1931, after only about a decade of sporadic, unsystematic collecting by various persons.
Abstract: One hundred species of amphibians (32) and reptiles (68) are estimated to occur on Barro Colorado Island, on the basis of approximately 47 years of collecting. The island is a seasonally wet, tropical forest locality in man-made Gatun Lake, central Panama. The faunal composition has not been static since the island's formation in 1912-1914. Some species have disappeared from the island whereas some others seem to be recent arrivals. Faunal change is at least partly correlated with vegetational succession, as old clearings change toward mature forest. The extirpation of certain "edge" species and their failure to have recolonized the laboratory clearing indicates that it is easier for a resident population to become extinct than for new colonization to occur. The sampling of such a complex, tropical herpetofauna is shown to be not so difficult as might be expected. Man-hours of collecting are plotted against percent of the herpetofauna for several collections, indicating that nearly one-half of the species can be collected in a few weeks of intensive effort in the rainy season. Approximately 80 percent of the species recorded from the island had been collected by 1931, after only about a decade of sporadic, unsystematic collecting by various persons. The generalization that tropical species have lower population densities than temperate species may not be valid for such groups as frogs and lizards but does seem true of snake faunas in low, humid forest regions. Snakes also are more difficult to collect in the tropics because of shifts in habits. There is a great expansion of tropical snakes into arboreal situations and a general avoidance (by all vertebrates) of rock and log microhabitats, which are frequently occupied by large arachnids. Small terrestrial snakes of lowland tropical forests tend either to be fossorial or to inhabit the leaf litter, where they are difficult to detect. Seasonal aggregations of snakes are rare in the wet tropics. Recent years have brought a greatly increased use of Barro Colorado Island by ecologists, physiologists, and other nontaxonomists, thus creating a need for lists of the currently accepted names of local animals. The present checklist of amphibians and reptiles brings up to date the nomenclature and included species of earlier lists (Dunn, 1931a, 1931b, 1949; Zetek, 1951), and also indicates departures in nomenclature from other papers, including Allee (1926), Allee and Torvik (1927), and Netting (1936). One hundred species of amphibians and reptiles are estimated to occur in this forested area of less than six square miles. This count probably is the most complete available of the herpetofauna of any humid locality on Charles W. Myers, Department of Herpetology, The American Museum of Natural History, New York, New York 10024. A. Stanley Rand, Smithsonian Tropical Research Institute, Box 2072, Balboa, Canal Zone. a tropical mainland, and is the result of collecting efforts of a number of persons over several decades. By reason of its relative completeness, the present list should be useful as a means of comparing the amphibian and reptile faunas of other areas less well known. In concluding paragraphs, we comment briefly on faunal changes and abundance and demonstrate that a fauna even as large and complex as this one can be effectively sampled within reasonable time limits. 
Barro Colorado (9° 10' North Latitude) is a former hilltop and the largest island in Gatun Lake, which was formed by the dredging of the Panama Canal and the damming of the Chagres River in 1912. The island is approximately 3600 acres in area (about 14.5 square kilometers), hilly, and has a maximum elevation of 452 feet above Gatun Lake, or 537 feet (164 meters) above sea level. Surface drainage is by small, clearwater streams with rocky beds that dry to isolated pools

Patent
24 Jan 1969
TL;DR: In this patent, the "Randomdec" apparatus is used for measuring the damping characteristics of a structure or system under random excitation, and a system is proposed to provide on-line damping ratio and period detection and display.
Abstract: Apparatus for measuring the damping characteristics of a structure or system during excitation by random forces or influences. The "Randomdec" apparatus is comprised of at least two parallel sampling circuits which perform time-sequential sampling operations on predetermined portions of a given input signal. The corresponding outputs of each sampling circuit are then added together at a plurality of output terminals. From each output terminal a sampling transient indicative of a point on the damping characteristic of the structure can be obtained. In accordance with the invention, a system is also disclosed utilizing this apparatus to provide an on-the-line damping ratio and period detection and display apparatus, including means for instantaneously observing the damping characteristics of the monitored structure and providing an output control signal which can be used to influence the characteristics of the structure or the forces applied thereto in a selected manner.
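
A software sketch of the random decrement idea behind the apparatus: segments of a randomly excited response are extracted each time the record crosses a trigger level and then averaged, and the average approximates a free-decay signature from which damping can be read. The simulated oscillator, trigger level, and segment length are all invented for illustration.

```python
import numpy as np

def random_decrement(signal, trigger, segment_len):
    """Average the signal segments that begin at each upward crossing of the trigger level."""
    segments = []
    for i in range(1, len(signal) - segment_len):
        if signal[i - 1] < trigger <= signal[i]:
            segments.append(signal[i:i + segment_len])
    return np.mean(segments, axis=0), len(segments)

# Hypothetical lightly damped oscillator under random forcing (semi-implicit Euler steps).
rng = np.random.default_rng(4)
dt, wn, zeta = 0.01, 2 * np.pi * 2.0, 0.02
x, v = np.zeros(20000), 0.0
for i in range(1, len(x)):
    a = -2 * zeta * wn * v - wn**2 * x[i - 1] + rng.normal(0.0, 50.0)
    v += a * dt
    x[i] = x[i - 1] + v * dt

signature, n_segments = random_decrement(x, trigger=x.std(), segment_len=400)
print(f"averaged {n_segments} segments; signature starts near the trigger level "
      f"({signature[0]:.3f} vs {x.std():.3f}) and decays like a free vibration")
```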


Journal ArticleDOI
TL;DR: Questions about the accuracy of library records, the behavior or attitudes of patrons, or the conditions of the books in the collection can often be answered by a random sampling study.
Abstract: Questions about the accuracy of library records, the behavior or attitudes of patrons, or the conditions of the books in the collection can often be answered by a random sampling study. Use of this time and money saving technique requires no special mathematical ability or statistical background. The concept of accuracy is discussed and a table is provided to simplify the determination of an appropriate sample size. A method of selecting a sample using random numbers is shown. Three examples illustrate the application of the technique to library problems.
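
A small sketch of the two calculations the abstract mentions: choosing a sample size for a target margin of error and drawing the sample with random numbers. The confidence level, margin, and collection size below are hypothetical.

```python
import math
import random

def sample_size_for_proportion(margin, z=1.96, p=0.5):
    """Conservative sample size for estimating a proportion to within +/- margin."""
    return math.ceil(z**2 * p * (1 - p) / margin**2)

def finite_population_correction(n, population):
    """Reduce the sample size when the population itself is small."""
    return math.ceil(n / (1 + (n - 1) / population))

# E.g. estimating the error rate in shelf-list records to within +/- 5 percentage points.
n = sample_size_for_proportion(0.05)
n_adj = finite_population_correction(n, population=12000)     # hypothetical collection size
records_to_pull = sorted(random.sample(range(1, 12001), n_adj))
print(n, n_adj, records_to_pull[:10])
```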

Journal ArticleDOI
TL;DR: In this article, the authors describe the nets used and the sampling procedures adopted to investigate patterns of vertical distribution in an oceanic pelagic fauna and discuss the limitations of the sampling.
Abstract: Details are given of the nets used and the sampling procedures adopted to investigate patterns of vertical distribution in an oceanic pelagic fauna. Ancillary gear and laboratory methods are described. The paper concludes with a discussion of the limitations of the sampling.

Journal ArticleDOI
TL;DR: The analysis of data for pattern in plant communities is a standard procedure in quantitative ecology, and the methods used are described by Greig-Smith (1964) and Kershaw (1964).
Abstract: The analysis of data for pattern in plant communities is a standard procedure in quantitative ecology, and the methods used are described by Greig-Smith (1964) and Kershaw (1964). The analysis depends upon a series of contiguous quadrats or groups of points, each series having a number of units which is a power of 2. One unit is conveniently termed 'Block size 1'. The data from adjacent pairs are combined to give 'Block size 2', from adjacent pairs of pairs to give 'Block size 4', and so on. The method of analysis is analogous to that of a multiple split-plot experiment, with the interest focused upon the estimated error variance of the block sizes (subplots). Most work has concentrated upon the analysis of parallel transects of samples, each transect having the number of units a multiple of 2. Kershaw (1957) experimented with a number of artificial communities, which consisted of clumps of 100% cover being laid out on graph paper. Both Kershaw and Greig-Smith, Kershaw & Anderson (1963) demonstrate the power of this method of analysis. Kershaw discussed the relation between the scale of the heterogeneity and the size of the basic sampling unit. Greig-Smith (1964, p. 88) quotes his results thus: 'If sample size, i.e. total number of grid units, is inadequate, the peaks tend to drift one block size to the right, owing to patches overlapping two blocks of their own size. This drift can be cancelled by an increase in sample size.' The problem of the drift of a peak to the right is important, since this weakens a potentially strong analysis. Both Kershaw's study and the published field studies have started transects or grids at a randomly selected point.
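
A sketch of the block-size analysis the abstract describes, assuming the usual hierarchical computation in which the mean square at block size b is taken from squared differences between adjacent blocks of that size; the transect data are simulated with clumps roughly four units across, so the mean square should peak near block size 4.

```python
import numpy as np

def block_variance_analysis(counts):
    """Hierarchical (Greig-Smith style) analysis of a transect whose length is a power of 2.
    Returns {block size: mean square between adjacent blocks of that size}."""
    x = np.asarray(counts, dtype=float)
    assert (len(x) & (len(x) - 1)) == 0, "transect length must be a power of 2"
    result, block, size = {}, x.copy(), 1
    while len(block) > 1:
        pairs = block.reshape(-1, 2)
        ss = np.sum((pairs[:, 0] - pairs[:, 1]) ** 2) / (2 * size)
        result[size] = ss / len(pairs)          # mean square; df = number of pairs
        block = pairs.sum(axis=1)               # combine adjacent blocks for the next size
        size *= 2
    return result

# Hypothetical transect of 32 quadrat counts with clumps about 4 units across.
rng = np.random.default_rng(5)
transect = rng.poisson(np.repeat(rng.choice([1.0, 8.0], size=8), 4))
for size, ms in block_variance_analysis(transect).items():
    print(f"block size {size:2d}: mean square {ms:6.2f}")
```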

Proceedings ArticleDOI
09 Dec 1969
TL;DR: A tutorial section is included for the purpose of giving the reader an intuitive feeling for the kinds of information contained in a diffraction pattern and how it relates to the original photographic imagery.
Abstract: This paper describes diffraction-pattern sampling as a basis for automatic pattern recognition in photographic imagery; it covers: diffraction-pattern generation, diffraction-pattern/image-area relationships, diffraction-pattern sampling, algorithm development, facility description, and experimental results which have been obtained over the last few years at General Motors' AC Electronics-Defense Research Laboratories in Santa Barbara, California. Sampling the diffraction pattern results in a sample-signature, a different one for each sampling geometry. The kinds of information obtainable from sample-signatures are described, and considerations for developing algorithms based on such information are discussed. A tutorial section is included for the purpose of giving the reader an intuitive feeling for the kinds of information contained in a diffraction pattern and how it relates to the original photographic imagery.
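
A common sampling geometry for diffraction patterns is a set of annular rings and angular wedges over the power spectrum; whether this matches the geometries used in the paper is an assumption. The sketch below computes such a sample-signature from the 2-D FFT of a small synthetic image.

```python
import numpy as np

def ring_wedge_signature(image, n_rings=4, n_wedges=8):
    """Sample the 2-D power spectrum with annular rings and angular wedges."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    h, w = power.shape
    y, x = np.mgrid[0:h, 0:w]
    r = np.hypot(y - h / 2, x - w / 2)
    theta = np.mod(np.arctan2(y - h / 2, x - w / 2), np.pi)   # spectrum is symmetric, use 180 deg
    r_edges = np.linspace(0, r.max() + 1e-9, n_rings + 1)
    t_edges = np.linspace(0, np.pi, n_wedges + 1)
    rings = [power[(r >= r_edges[i]) & (r < r_edges[i + 1])].sum() for i in range(n_rings)]
    wedges = [power[(theta >= t_edges[i]) & (theta < t_edges[i + 1])].sum() for i in range(n_wedges)]
    return np.array(rings + wedges)

# Hypothetical 64x64 "imagery": vertical stripes give energy concentrated in particular wedges.
img = np.tile((np.arange(64) % 8 < 4).astype(float), (64, 1))
print(ring_wedge_signature(img).round(1))
```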


Journal ArticleDOI
TL;DR: In this article, the bias and precision of the estimators for ED50 applying the dose-averaging formula and probit analysis are computed and relatively efficient sampling schemes are discussed.
Abstract: The up-and-down method in bioassay using multiple samples in each trial is investigated. The bias and precision of the estimators for ED50 applying the dose-averaging formula and probit analysis are computed and relatively efficient sampling schemes are discussed. The use of the method to estimate extreme percentage doses is developed. It is shown that the estimators from this method are generally more efficient than the non-sequential estimators and are as good as other comparable sequential estimators. Several numerical examples of the asymptotic distribution of the dose-levels from this sampling procedure are tabulated.
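
A small simulation of an up-and-down series with the dose-averaging estimate of the ED50. It is only a sketch: a logistic tolerance curve stands in for the probit model, and the dose ladder, slope, and run-in length are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)

def up_and_down(doses, true_ed50=1.0, slope=4.0, start=0, trials=30):
    """One up-and-down series on a fixed dose ladder: a response steps the dose down,
    a non-response steps it up. Returns the sequence of doses used."""
    i, path = start, []
    for _ in range(trials):
        dose = doses[i]
        path.append(dose)
        p = 1.0 / (1.0 + np.exp(-slope * (np.log(dose) - np.log(true_ed50))))
        responded = rng.random() < p
        i = max(i - 1, 0) if responded else min(i + 1, len(doses) - 1)
    return np.array(path)

ladder = np.geomspace(0.25, 4.0, 9)            # hypothetical dose levels, equally spaced in log dose
path = up_and_down(ladder)
ed50_hat = np.exp(np.mean(np.log(path[5:])))   # dose-averaging estimate, dropping a short run-in
print("dose-averaging ED50 estimate:", round(float(ed50_hat), 3))
```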

01 Jan 1969
TL;DR: The Hourglass program, conducted by the Marine Research Laboratory of the Florida Board of Conservation, was one of the few major systematic biological sampling programs undertaken on the continental shelf of the Gulf of Mexico.
Abstract: This paper describes in detail the rationale, cruise patterns, stations, gear, sampling procedures, and methods of specimen handling, and presents all the hydrographic data accumulated during the 28 months of the Hourglass program (August 1965 - November 1967). The Hourglass cruises were conducted by the Marine Research Laboratory of the Florida Board of Conservation and represent one of the few major systematic biological sampling programs undertaken on the continental shelf of the Gulf of Mexico. (50pp.)

Journal ArticleDOI
01 Sep 1969
TL;DR: In this article, a short description of the isokinetic sampling technique is given, which is applied to a dense solids-gas mixture, and an analysis is presented of the factors in (1) the measuring procedure, and (2) the geometry of the probe, that influence the accuracy with which gas and solids flux profiles can be measured.
Abstract: A short description of the isokinetic sampling technique is given. The technique, which in itself is not new, is here applied to a dense solids-gas mixture. An analysis is presented of the factors in (1) the measuring procedure, and (2) the geometry of the probe, that influence the accuracy with which gas and solids flux profiles can be measured. Finally, a few random examples are given of profiles as measured in our 0.3-m diameter riser.
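
The flux calculation behind such measurements is simple arithmetic: the local solids mass flux is the mass collected isokinetically divided by sampling time and probe opening area. The probe diameter, sampling times, and masses below are hypothetical.

```python
import math

def solids_flux(sample_mass_g, sampling_time_s, probe_diameter_m):
    """Local solids mass flux (kg/m^2/s) from an isokinetic sample; valid only when the
    probe inlet velocity matches the local gas velocity."""
    area = math.pi * (probe_diameter_m / 2.0) ** 2
    return (sample_mass_g / 1000.0) / (sampling_time_s * area)

# Hypothetical traverse across a 0.3-m riser: (radial position in m, mass collected in g) over 60 s.
samples = [(0.00, 18.4), (0.05, 22.1), (0.10, 30.6), (0.13, 41.0)]
for radius, mass in samples:
    print(f"r = {radius:.2f} m: flux = {solids_flux(mass, 60.0, 0.01):.1f} kg/m^2/s")
```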


Book ChapterDOI
01 Jan 1969
TL;DR: In this article, the author extends stimulus-sampling theory, previously developed for a finite number of responses, to situations involving a continuum of possible responses.
Abstract: The aim of the present investigation is to extend stimulus-sampling theory to situations involving a continuum of possible responses. The theory for a finite number of responses stems from the basic paper by Estes (1950); the present formulation will resemble most closely that given for the finite case in Suppes and Atkinson (1959). In a previous study (Suppes, 1959b) I was concerned with a corresponding extension of linear learning models, and several results of that study are, as we shall see, closely related to the present one.

Journal ArticleDOI
TL;DR: The initial sampling stage is the most critical part of any test procedure, and where representative samples are difficult to obtain, the enforcement of a code of practice in manufacture is preferable to attempts to control quality by subsequent testing.
Abstract: The testing of material in order to determine its conformity to a microbiological standard involves two main processes; selection of the samples of material to be tested, and the testing of this material. For the procedure to be reliable, the sampled material must be representative of the whole and the test must give a reliable result. The errors arising from the sampling stage are indicated in this paper, and sampling schemes which have been recommended for this purpose are discussed. The chief test methods (colony and total counts, dilution methods, presence/absence tests, and chemical methods) are then discussed, both from the point of view of their underlying statistical reliability and the assessment of operator errors. Finally it is concluded that the initial sampling stage is the most critical part of any test procedure and where representative samples are difficult to obtain, the enforcement of a code of practice in manufacture is preferable to attempts to control quality by subsequent testing.
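
The statistical reliability of a presence/absence scheme can be sketched directly: assuming organisms are randomly (Poisson) distributed, the chance that one sample is positive follows from the contamination level, and the chance that the whole lot passes follows from the binomial distribution. The scheme and contamination levels below are hypothetical.

```python
from math import comb, exp

def prob_lot_accepted(organisms_per_g, sample_g, n_samples, max_positive):
    """Probability that a lot passes a presence/absence scheme, assuming Poisson-distributed
    organisms so that P(one sample is positive) = 1 - exp(-m * w)."""
    p_pos = 1.0 - exp(-organisms_per_g * sample_g)
    return sum(comb(n_samples, k) * p_pos**k * (1 - p_pos)**(n_samples - k)
               for k in range(max_positive + 1))

# Hypothetical scheme: ten 25-g samples, lot rejected if more than one is positive.
for m in (0.0005, 0.002, 0.01):
    print(f"{m:.4f} organisms/g -> P(lot accepted) = {prob_lot_accepted(m, 25.0, 10, 1):.3f}")
```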