
Showing papers on "Sampling (statistics)" published in 1973


Book
01 Jan 1973
TL;DR: A survey research methods text covering the logic of survey sampling and probability sampling, a comparison of survey and other methods, and the ethics of survey research.
Abstract: PART I. THE SCIENTIFIC CONTEXT OF SURVEY RESEARCH. 1. The Logic of Science. The Traditional Perspective. The Debunking of Science. Science in Practice. What is Science? 2. Science and Social Science. The Search for Social Regularities. The Characteristics of Social Science. Methods of Social Scientific Research. 3. Survey Research as a Method of Social Science. A Brief History of Survey Research. The Scientific Characteristics of Survey Research. A Comparison of Survey and Other Methods. Is Survey Research Really Scientific? PART II. SURVEY RESEARCH DESIGN. 4. Types of Study Design. Purposes of Survey Research. Units of Analysis. Basic Survey Designs. Variations on Basic Designs. Choosing the Appropriate Design. 5. The Logic of Survey Sampling. The Logic of Probability Sampling. Sampling Concepts and Terminology. Probability Sampling Theory and Sampling Distribution. Populations and Sampling Frames. Types of Sampling Designs. Disproportionate Sampling and Weighting. Nonprobability Sampling. Nonsurvey Uses of Sampling Methods. 6. Examples of Sample Designs. Sampling University Students. Sampling Medical School Faculty. Sampling Episcopal Churchwomen. Sampling Oakland Households. 7. Conceptualization and Instrument Design. Logic of Conceptualization. An Operationalization Framework. Types of Data. Levels of Measurement. Guides to Question Construction. Measurement Quality. General Questionnaire Format. Ordering Questions in a Questionnaire. Instructions. Reproducing the Questionnaire. 8. Index and Scale Construction. Indexes Versus Scales. Index Construction. Scale Construction. Typologies. PART III. DATA COLLECTION. 9. Self-Administered Questionnaires. Mail Distribution and Return. Postal Options and Relative Costs. Monitoring Returns. Follow-up Mailings. Acceptable Response Rates. A Case Study. 10. Interview Surveys. Importance of the Interviewer. General Rules for Interviewing. Interviewer Training. The Interviewing Operation. 11. Data Processing. Computers in Survey Research. Coding. Codebook Construction. Coding and Data Entry Options. Precoding for Data Entry. Data Cleaning. 12. Pretests and Pilot Studies. Conducting Pretests. Conducting Pilot Studies. Evaluating Pretests and Pilot Studies. PART IV. SURVEY RESEARCH ANALYSIS. 13. The Logic of Measurement and Association. The Traditional Image. The Interchangeability of Indexes. Implications. 14. Constructing and Understanding Tables. Univariate Analysis. Subgroup Descriptions. Bivariate Analysis. Multivariate Analysis. 15. The Elaboration Model. History of the Elaboration Model. The Elaboration Paradigm. Elaboration and Ex Post Facto Hypothesizing. 16. Social Statistics. Descriptive Statistics. Inferential Statistics. 17. Advanced Multivariate Techniques. Regression Analysis. Path Analysis. Factor Analysis. Analysis of Variance. Discriminant Analysis. Log-Linear Models. 18. The Reporting of Survey Research. Some Basic Considerations. Organization of the Reports. Guidelines for Reporting Analysis. PART V. SURVEY RESEARCH IN SOCIAL CONTEXT. 19. The Ethics of Survey Research. Voluntary Participation. No Harm to Respondents. Anonymity and Confidentiality. Identifying Purpose and Sponsor. Analysis and Reporting. A Professional Code of Ethics. Ethics-Relevant Illustrations. 20. The Informed Survey Research Consumer. Research Design. Measurement. Sampling. Data Analysis. Data Reporting. APPENDICES. Appendix A. Table of Random Numbers. Appendix B. Estimated Sampling Error for a Binomial (95% Confidence Level). Appendix C. Distribution of Chi Square. Appendix D. Normal Curve Areas.

2,364 citations


Journal ArticleDOI
TL;DR: In this article, a least-squares prediction approach is applied to finite population sampling theory, focusing on characteristics of particular samples rather than on plans for choosing samples, and random sampling is considered in the light of these results.
Abstract: This is an application of a least-squares prediction approach to finite population sampling theory. One way in which this approach differs from the conventional one is its focus on characteristics of particular samples rather than on plans for choosing samples. Here we study samples in which many superpopulation models lead to the same optimal (BLU) estimator. Random sampling is considered in the light of these results.
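
As an illustration of the prediction approach, the sketch below (an assumption-laden example, not the paper's code) estimates a finite-population total under a simple ratio superpopulation model: the known auxiliary values of the non-sampled units are predicted from a slope fitted on the sample, which is the BLU estimator under that model. The population, sample size, and model parameters are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite population: auxiliary sizes x and a superpopulation model
# y_i = beta * x_i + eps_i with Var(eps_i) proportional to x_i (ratio model).
N = 1000
x = rng.uniform(1, 10, N)
y = 2.5 * x + rng.normal(0, np.sqrt(x))

# Draw a sample of n units (simple random sampling here, but the prediction
# estimator below depends only on which units ended up in the sample, not on
# the plan that selected them).
n = 100
s = rng.choice(N, size=n, replace=False)
in_sample = np.zeros(N, dtype=bool)
in_sample[s] = True

# BLU predictor of the population total under the ratio model: estimate beta
# by the ratio of sample totals, then predict the y-values of the non-sampled
# units from their known x-values.
beta_hat = y[in_sample].sum() / x[in_sample].sum()
T_hat = y[in_sample].sum() + beta_hat * x[~in_sample].sum()

print("true total:", y.sum().round(1), " predicted total:", T_hat.round(1))
```

Because the estimator depends only on the particular sample obtained, the same computation applies whether the sample was drawn at random or chosen purposively, which is the distinction the abstract emphasizes.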

199 citations


Journal ArticleDOI
01 Jun 1973-Talanta
TL;DR: The error in a determination of an element in a rock or mineral sample depends on the analytical error, the weight of sample analysed, and the nature and history of the laboratory sample, while subsampling errors can be controlled through the use of sampling constants.

183 citations


Journal ArticleDOI
TL;DR: In this article, the steady-state Kalman filter is used to obtain analytical expressions for the optimum position and velocity accuracy that can be achieved in a navigation system that measures position at uniform sampling intervals of T seconds through random noise with an rms value of σx.
Abstract: The Kalman filtering technique is used to obtain analytical expressions for the optimum position and velocity accuracy that can be achieved in a navigation system that measures position at uniform sampling intervals of T seconds through random noise with an rms value of σx. A one-dimensional dynamic model, with piecewise-constant acceleration assumed, is used in the analysis, in which analytic expressions for position and velocity accuracy (mean square), before and after observations, are obtained. The errors are maximum immediately before position measurements are made. The maximum position error, however, can be bounded by the inherent sensor error by use of a sufficiently high sampling rate, which depends on the sensor accuracy and acceleration level. The steady-state Kalman filter for realizing the optimum estimates consists of a double integrator, the initial conditions of which are reset at each observation.
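
The quantities described above are obtained analytically in the paper, but they can be checked numerically. The sketch below (with illustrative values for T, σx, and the acceleration level, not the paper's constants) iterates a one-dimensional position/velocity Kalman filter with piecewise-constant acceleration noise to steady state and reports the rms position error just before and just after each position fix.

```python
import numpy as np

# One-dimensional position/velocity model sampled every T seconds; the
# acceleration is modelled as piecewise-constant process noise (values are
# illustrative assumptions, not the paper's).
T = 1.0            # sampling interval, s
sigma_x = 5.0      # rms position-measurement noise, m
sigma_a = 0.5      # rms acceleration, m/s^2

F = np.array([[1.0, T], [0.0, 1.0]])                  # state transition
G = np.array([[0.5 * T**2], [T]])                     # acceleration input
Q = sigma_a**2 * (G @ G.T)                            # process noise covariance
H = np.array([[1.0, 0.0]])                            # position is measured
R = np.array([[sigma_x**2]])                          # measurement noise covariance

P = np.diag([100.0**2, 10.0**2])                      # initial uncertainty
for _ in range(200):                                  # iterate to steady state
    P = F @ P @ F.T + Q                               # error just BEFORE a fix
    P_before = P.copy()
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                    # steady-state gain
    P = (np.eye(2) - K @ H) @ P                       # error just AFTER a fix

print("rms position error before/after a fix: %.2f / %.2f m"
      % (np.sqrt(P_before[0, 0]), np.sqrt(P[0, 0])))
```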

123 citations


Book
01 Jan 1973
TL;DR: Topics covered range from an introductory discussion to the listing with expanded writeup of the computer program used to analyze the data, and an attempt has been made to keep the practitioner clearly in mind.
Abstract: Multiple matrix sampling is a psychometric procedure in which a set of test items is subdivided randomly into subtests of items, with each subtest administered to a different subgroup of examinees selected at random from the examinee population. Although each examinee receives only a proportion of the complete set of items, the statistical model employed permits the researcher to estimate the mean, variance and frequency distribution of test scores which would have been obtained by testing all examinees on all items. Contained herein is a detailed description of multiple matrix sampling. The topics covered range from an introductory discussion to the listing, with expanded writeup, of the computer program used to analyze the data. Throughout this Report an attempt has been made to keep the practitioner clearly in mind.
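
A minimal sketch of the basic estimation idea, with invented item difficulties and group sizes (the report's model also yields variances and frequency distributions, which are not reproduced here): items are split at random into subtests, each subtest is given to a random subgroup, and the mean total-test score is estimated as the number of items times the mean per-item proportion correct.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical item-response matrix: 1 if examinee j would answer item i
# correctly on the full test (unknown in practice; simulated here).
n_examinees, n_items, n_subtests = 600, 40, 4
p_item = rng.uniform(0.3, 0.9, n_items)              # item difficulties
full = (rng.random((n_examinees, n_items)) < p_item).astype(float)

# Multiple matrix sampling: items are divided at random into subtests and
# each subtest is administered to a different random subgroup of examinees.
item_groups = np.array_split(rng.permutation(n_items), n_subtests)
person_groups = np.array_split(rng.permutation(n_examinees), n_subtests)

# Each examinee answers only one subtest, yet the mean total-test score can
# be estimated as (number of items) x (mean proportion correct per item).
prop_correct = [full[np.ix_(pg, ig)].mean()
                for pg, ig in zip(person_groups, item_groups)]
est_mean_total = n_items * np.mean(prop_correct)

print("true mean total score:", full.sum(axis=1).mean().round(2))
print("matrix-sampling estimate:", est_mean_total.round(2))
```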

110 citations


Journal ArticleDOI
TL;DR: In this paper, the authors recognized three major lithological units in the main basin of Lake Huron: glacial till and bedrock, glaciolacustrine clay, and glacial sandstone.
Abstract: On the basis of extensive sampling and echo sounding, three major lithological units are recognized in the main basin of Lake Huron: (1) glacial till and bedrock; (2) glaciolacustrine clay; and (3)...

102 citations





Journal ArticleDOI
TL;DR: In this article, a skip-lot sampling plans for lot-inspection designated as type SkSP-2, where provision is made for skipping inspection of some of the lots when the quality of the submitted product is good.
Abstract: This article presents a system of skip-lot sampling plans for lot-inspection designated as type SkSP-2, where provision is made for skipping inspection of some of the lots when the quality of the submitted product is good. The operating characteristic p..
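
A rough sketch of the skip-lot idea follows; the clearance number i, skipping fraction f, and reference plan are illustrative assumptions rather than the article's specifications. Lots are inspected one by one under the reference plan until i consecutive lots are accepted, after which only a random fraction f of lots is inspected; a rejection returns the scheme to 100% inspection.

```python
import random

def skiplot_inspection(lots, i=4, f=0.25, reference_plan=None, seed=0):
    """Minimal sketch of skip-lot logic: 'reference_plan' returns True when an
    inspected lot is accepted.  After i consecutive acceptances only a random
    fraction f of lots is inspected; a rejection restores 100% inspection.
    (i, f, and the reference plan here are illustrative assumptions.)"""
    rng = random.Random(seed)
    skipping, run, decisions = False, 0, []
    for lot in lots:
        if skipping and rng.random() > f:
            decisions.append(("skipped, accepted", lot))   # not inspected
            continue
        if reference_plan(lot):                            # inspected, accepted
            run += 1
            if run >= i:
                skipping = True                            # quality good: skip
            decisions.append(("inspected, accepted", lot))
        else:                                              # rejection
            skipping, run = False, 0
            decisions.append(("inspected, rejected", lot))
    return decisions

# Usage: a reference plan that accepts a lot when a sample of 20 items
# contains at most 1 defective, each item defective with probability p.
def make_plan(p, rng=random.Random(1)):
    return lambda lot: sum(rng.random() < p for _ in range(20)) <= 1

print(skiplot_inspection(range(10), reference_plan=make_plan(0.02))[:5])
```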

79 citations


Journal ArticleDOI
TL;DR: In this paper, fundamental limits on estimating the amplitudes and phases of interference fringes at low light levels are shown to be set by the finite number of photoevents registered in the measurement, and the Cramer-Rao statistical error bounds on these estimates are derived.
Abstract: Fundamental limitations of estimating the amplitudes and phases of interference fringes at low light levels are determined by the finite number of photoevents registered in the measurement. By modeling the receiver as a spatial array of photon-counting detectors, results are obtained that permit specification of the minimum number of photoevents required for estimation of fringe parameters to a given accuracy. Both a discrete Fourier-transform estimator and an optimum joint maximum-likelihood estimator are considered. In addition, the Cramer–Rao statistical error bounds are derived, specifying the limiting performance of all unbiased estimators in terms of the collected light flux. The performance of the spatial sampling receiver is compared with that of an alternate technique for fringe-parameter estimation that uses a barred grid and temporal sampling of a moving fringe.
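
The discrete Fourier-transform estimator mentioned above can be illustrated with a small simulation (the detector count, fringe period, visibility, and flux are invented values): Poisson photoevent counts are generated across an array of photon-counting detectors, and the fringe visibility and phase are read off the appropriate DFT coefficient.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative assumption: a fringe with k full cycles across an array of M
# photon-counting detectors, mean total photoevent count N_bar, visibility V,
# and phase phi (these values are not from the paper).
M, k, N_bar, V, phi = 64, 3, 2000, 0.6, 0.8
m = np.arange(M)
intensity = (N_bar / M) * (1 + V * np.cos(2 * np.pi * k * m / M + phi))
counts = rng.poisson(intensity)                 # photoevents per detector

# Discrete Fourier-transform estimator of the fringe parameters: the k-th
# DFT coefficient of the counts has expectation (N_bar / 2) * V * exp(i*phi).
C_k = np.sum(counts * np.exp(-2j * np.pi * k * m / M))
V_hat = 2 * np.abs(C_k) / counts.sum()
phi_hat = np.angle(C_k)

print(f"visibility: true {V}, estimated {V_hat:.3f}")
print(f"phase:      true {phi}, estimated {phi_hat:.3f}")
```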



Journal ArticleDOI
TL;DR: In this paper, it is shown that Forsythe's method for the normal distribution can be adjusted so that the average number R of uniform deviates required drops to 2.53947 in spite of a shorter program.
Abstract: This article is an expansion of G. E. Forsythe's paper "Von Neumann's comparison method for random sampling from the normal and other distributions" (5). It is shown that Forsythe's method for the normal distribution can be adjusted so that the average number R of uniform deviates required drops to 2.53947 in spite of a shorter program. In a further series of algorithms, R is reduced to values close to 1 at the expense of larger tables. Extensive computational experience is reported which indicates that the new methods compare extremely well with known sampling algorithms for the normal distribution. The paper starts with a restatement of Forsythe's generalization of von Neumann's comparison method. A neater proof is given for the validity of the approach. The calculation of the expected number N of uniform deviates required is also done in a shorter way. Subsequently, this quantity N is considered in more detail for the special case of the normal distribution. It is shown that N may be decreased by means of suitable subdivisions of the range (0, ∞). In the central part (Sections 4 and 5), Forsythe's special algorithm for the normal distribution (called FS) is embedded in a series of sampling procedures which range from the table-free center-tail method (CT) through an improvement of FS (called FT) to algorithms that require longer tables (FL). For the transition from FT to FL, a ...
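
For readers unfamiliar with the comparison method, the sketch below implements von Neumann's original algorithm for the exponential distribution, the special case that Forsythe generalized; it uses only comparisons of uniform deviates, with no logarithms or square roots. It is an illustration of the underlying idea, not the paper's FS/FT/FL procedures for the normal distribution.

```python
import random

def exp_von_neumann(rng=random.Random(3)):
    """Von Neumann's comparison method for Exp(1), a minimal sketch of the
    algorithm that Forsythe generalized to other distributions."""
    k = 0                              # rejected rounds count the integer part
    while True:
        u = u_first = rng.random()
        n = 1
        while True:
            u_next = rng.random()
            if u_next > u:             # descending run of uniforms is broken
                break
            u, n = u_next, n + 1
        if n % 2 == 1:                 # odd run length: accept the fractional part
            return k + u_first
        k += 1                         # even run length: reject and start over

sample = [exp_von_neumann() for _ in range(100000)]
print("sample mean (should be near 1):", sum(sample) / len(sample))
```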

Journal ArticleDOI
TL;DR: In this article, a computer program is described that overcomes the limitations of two published programs for computing statistical parameters in grain size analysis; it uses all available data obtained from standard sedimentological analyses, regardless of phi interval, and computes the statistical parameters by direct integration of a linear approximation to the cumulative curve.
Abstract: A computer program has been written to overcome the limitations of two published programs that compute statistical parameters in grain size analysis. This program uses all available data, obtained from standard sedimentological analyses, regardless of phi interval, and computes statistical parameters by direct integration of a linear approximation to the cumulative curve. This has the advantage that the distribution curve is known and hence the limitations of the parameters obtained can be assessed. Furthermore, it is argued, with examples, that this approximation introduces no greater degree of inaccuracy than is already present in standard sampling and analytical procedures.
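
A minimal sketch of the integration idea (not the published program): with cumulative weight fractions known at arbitrary phi values, the cumulative curve is treated as piecewise linear, so each segment contributes the moments of a uniform distribution weighted by the fraction of sediment in that segment.

```python
import numpy as np

def moments_from_cumulative(phi, cum_frac):
    """Mean and standard deviation (in phi units) by direct integration of a
    piecewise-linear approximation to the cumulative curve.  'phi' are the
    measured size classes (any spacing) and 'cum_frac' the cumulative weight
    fractions at those sizes.  A sketch, not the article's program."""
    phi, cum_frac = np.asarray(phi, float), np.asarray(cum_frac, float)
    dF = np.diff(cum_frac)                       # weight fraction in each segment
    mid = 0.5 * (phi[:-1] + phi[1:])             # segment midpoints
    width = np.diff(phi)
    mean = np.sum(dF * mid)
    # On a linear CDF segment the grain size is uniformly distributed, so the
    # segment's second moment is mid^2 + width^2 / 12.
    second = np.sum(dF * (mid**2 + width**2 / 12.0))
    return mean, np.sqrt(second - mean**2)

# Usage with unevenly spaced phi classes (illustrative data):
phi = [-1.0, 0.0, 0.5, 1.0, 2.0, 4.0]
cum = [0.00, 0.10, 0.30, 0.60, 0.90, 1.00]
print(moments_from_cumulative(phi, cum))
```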



Journal ArticleDOI
TL;DR: In this paper, the authors investigate a class of statistical estimators and show that for typical auditing populations, the standard distribution theory is not always appropriate for statistical inference, and conclude that entirely new approaches may be required for statistical sampling in auditing.
Abstract: The traditional literature applying statistical sampling to auditing has recognized neither the special structure of auditing populations nor the unique environment in which this sampling occurs. Much of the literature is based on techniques developed for sample surveys. But the auditor typically has a great deal more information about his population than is available to the social scientist in sample surveys. Counterbalancing this, the auditor operates under much tighter precision requirements than the sample survey investigator. In this paper, we explore these issues in greater detail and show how the auditor must use statistical estimators which explicitly use all the auxiliary information available to him. We investigate a class of such estimators and show that, for typical auditing populations, the standard distribution theory is not always appropriate for statistical inference. Therefore, we conclude that entirely new approaches may be required for statistical sampling in auditing.
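
Two classical auxiliary-information estimators of the kind discussed, the ratio and difference estimators based on known book values, are sketched below on an invented audit population with a low error rate; the population, sample size, and error mechanism are illustrative assumptions. The skewed, mostly-zero error distribution typical of such populations is exactly what makes standard normal-theory inference for these estimators questionable.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical audit population: book values known for every account, audited
# (true) values observed only on the sample; about 3% of accounts are overstated.
N = 5000
book = rng.lognormal(mean=5, sigma=1, size=N)
error = np.where(rng.random(N) < 0.03, -rng.uniform(0, 0.5, N) * book, 0.0)
audited = book + error

n = 200
s = rng.choice(N, n, replace=False)
yb, ya = book[s], audited[s]

# Estimators of the total audited value that use the auxiliary book values:
mean_per_unit = N * ya.mean()                            # ignores book values
ratio = book.sum() * ya.sum() / yb.sum()                 # ratio estimator
difference = book.sum() + N * (ya - yb).mean()           # difference estimator

print("true total      :", audited.sum().round(0))
print("mean-per-unit   :", mean_per_unit.round(0))
print("ratio estimator :", ratio.round(0))
print("difference est. :", difference.round(0))
```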

Journal ArticleDOI
TL;DR: The behaviour of the mark-recapture estimators of Petersen, Bailey and Jolly and Seber is examined theoretically and empirically by means of simulation techniques, and the correlation between the parameter and its associated variance is shown to be significant for all the estimators.
Abstract: The behaviour of the mark-recapture estimators of Petersen, Bailey (triple catch) and Jolly and Seber is examined theoretically and empirically by means of simulation techniques. The correlation between the parameter [Formula: see text] and its associated variance is shown to be significant for all the estimators. This correlation makes the estimated variance an insensitive measure of the accuracy of the estimate except at very high sampling intensities. Such sampling intensities are rarely achieved in experimental work and so the method of mark-recapture must be considered of very limited use. At the sampling intensities necessary to give a coefficient of variation of less than 0.05 it does not seem likely that the correlation between [Formula: see text] and its variance will produce serious underestimation but the minimum confidence limits [Formula: see text] are recommended.
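
The simplest of the estimators examined, the two-sample Petersen estimate (shown here together with Chapman's bias-adjusted form), can be simulated directly; the population size and sample sizes below are invented, and the spread of the estimates illustrates how large the coefficient of variation remains at a modest sampling intensity.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two-sample mark-recapture simulation (values are illustrative assumptions).
N_true = 5000          # population size
M = 400                # animals marked in the first sample
C = 400                # animals caught in the second sample

n_rep = 2000
recaptures = rng.hypergeometric(M, N_true - M, C, size=n_rep)
recaptures = np.maximum(recaptures, 1)                  # avoid division by zero

petersen = M * C / recaptures                           # classical Petersen estimate
chapman = (M + 1) * (C + 1) / (recaptures + 1) - 1      # bias-adjusted form

for name, est in [("Petersen", petersen), ("Chapman", chapman)]:
    cv = est.std() / est.mean()
    print(f"{name:8s} mean {est.mean():7.0f}  CV {cv:.2f}  (true N = {N_true})")
```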

Journal ArticleDOI
TL;DR: Surprisingly, some of the methods that combine directory sampling with random digit dialing and some door-to-door listing to obtain various levels of sample quality are not only higher in quality but also less expensive than methods that use random dialing or door-to-door listing exclusively.
Abstract: Several recent publications [1-3] have discussed the use of random digit dialing to overcome the limitations of telephone directories due to unlisted numbers and new telephone installations. While these new procedures do not require the use of telephone directories, they do have problems of their own. In this article, we discuss methods for combining directory sampling with random digit dialing and some door-to-door listing to obtain various levels of sample quality. Surprisingly, some of the methods that use directories are not only higher in quality, but less expensive than methods that use random dialing or door-to-door listing exclusively.
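
One simple form of directory-assisted random digit dialing can be sketched as follows; the listed numbers are fictitious placeholders and the procedure is an illustration of the general idea, not the article's specific designs. Exchanges that appear in the directory are known to be in service, and appending random suffixes lets unlisted and newly installed numbers within those exchanges be reached.

```python
import random

rng = random.Random(6)

# Hypothetical directory listings (area code + 3-digit exchange + 4-digit line).
directory = ["312-555-0142", "312-555-3917", "312-867-2203", "312-867-9954"]

def directory_assisted_rdd(listings, n):
    """Sample an exchange that appears in the directory, then generate the last
    four digits at random.  A sketch of one way to combine directory sampling
    with random digit dialing, not the article's exact procedure."""
    exchanges = [num.rsplit("-", 1)[0] for num in listings]
    return [f"{rng.choice(exchanges)}-{rng.randrange(10000):04d}" for _ in range(n)]

print(directory_assisted_rdd(directory, 5))
```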


Journal ArticleDOI
TL;DR: In this article, a simple mathematical model describing the species-area relation is developed for the case in which discrete random samples are combined.
Abstract: A simple mathematical model describing the species-area relation was developed. This paper deals with the case in which discrete random samples are combined.

01 Jan 1973
TL;DR: Sampling is the process of choosing a representative portion of a population; sample surveys are often used instead of complete enumerations, or censuses, for the reasons discussed.
Abstract: "Sampling" is the process of choosing a representative portion of a population. It contrasts especially with the process of complete enumeration, in which every member of the defined population is included among those to be studied. Sample surveys, or surveys of a sample of the population, are often used instead of complete enumerations, or censuses, for the following reasons.

Journal ArticleDOI
TL;DR: The main pitfall of random sampling is that it does not indicate the reliability of the particular sample that was selected in the research as mentioned in this paper, and the randomization process provides no safeguards against the 5% of chance occasions in which a 95% confidence interval will be erroneous because the "luck of the draw" produced a substantially distorted sample.
Abstract: The previous paper [13] in this series was concerned with the advantages and pitfalls of using a randomization process in sampling from a parent population. By depending on chance alone, the randomized selection removes any element of human choice in the sample, and allows the results to be interpreted with inferences based on statistical probability. These statistical inferences are particularly useful when the parent population is being sampled to estimate the value of an unknown parameter, such as the mean of a selected variable. From the results found in the random sample, the investigator can demarcate a zone of values, called a “confidence interval,” and can have a specified “confidence level” for the probability that the true value of the parameter lies within the demarcated zone. The main pitfall of random sampling is that it does not indicate the reliability of the particular sample that was selected in the research. The randomization process provides no safeguards against the 5% of chance occasions in which a 95% confidence interval will be erroneous because the “luck of the draw” produced a substantially distorted sample.
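
The point about the 5% of chance occasions is easy to demonstrate by simulation; the parent population, sample size, and number of repetitions below are arbitrary choices for illustration. Roughly 5% of the 95% confidence intervals fail to cover the true mean, and nothing within any single sample signals whether it is one of the distorted ones.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative parent population and repeated random sampling from it.
population = rng.normal(50.0, 12.0, 100_000)
true_mean = population.mean()

n, n_samples, z = 50, 10_000, 1.96
misses = 0
for _ in range(n_samples):
    sample = rng.choice(population, n, replace=False)
    half_width = z * sample.std(ddof=1) / np.sqrt(n)
    if abs(sample.mean() - true_mean) > half_width:
        misses += 1      # the "luck of the draw" produced a distorted sample

print(f"fraction of 95% intervals missing the true mean: {misses / n_samples:.3f}")
```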

01 May 1973
TL;DR: In this paper, the authors investigated the accrual of information for the three parameter, bivariate exponential distribution of Marshall and Olkin (BVE) for several sampling plans.
Abstract: Estimation for the three-parameter bivariate exponential distribution of Marshall and Olkin (BVE) is investigated for several sampling plans. Sample data arising from a set of two-component systems on life test may be of three basic types depending on whether the systems on test are series, parallel, or parallel with observable causes of individual component failure. Series and parallel BVE sampling have been investigated previously, but here the theory is unified by characterizing the accrual of information, i.e., by specifying the densities of the additional random variables that become observable, in going from series to parallel to cause identifiable sampling. Using these results, a likelihood is computed for samples that may contain any combination of series, parallel, or cause identifiable subsamples. By slightly modifying the densities involved in the information accrual characterization, the likelihoods of time truncated parallel and cause identifiable samples are obtained. (Modified author abstract)
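
A sketch of the Marshall-Olkin BVE and of how the three data types relate to it (the rate parameters are invented for illustration): each component's lifetime is the first of the exponential shocks that affects it, so series sampling observes only min(X, Y), parallel sampling only max(X, Y), and cause-identifiable sampling records both lifetimes together with the failure cause.

```python
import numpy as np

rng = np.random.default_rng(8)

def bve_marshall_olkin(lam1, lam2, lam12, size):
    """Simulate the Marshall-Olkin bivariate exponential via its shock
    construction: independent exponential shocks hit component 1, component 2,
    or both, and each lifetime is the time of the first shock affecting it."""
    z1 = rng.exponential(1 / lam1, size)        # shock to component 1 only
    z2 = rng.exponential(1 / lam2, size)        # shock to component 2 only
    z12 = rng.exponential(1 / lam12, size)      # common shock to both
    return np.minimum(z1, z12), np.minimum(z2, z12)

x, y = bve_marshall_olkin(1.0, 2.0, 0.5, 100_000)

# Series systems observe min(X, Y) ~ Exp(lam1 + lam2 + lam12); parallel systems
# observe max(X, Y); cause-identifiable sampling records both lifetimes.
series_life = np.minimum(x, y)                  # mean should be near 1 / 3.5
print("mean series lifetime:", series_life.mean().round(3))
print("P(X == Y) (common-shock ties):", (x == y).mean().round(3))
```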

Journal ArticleDOI
TL;DR: An ultrarapid freeze-sampling method, originally developed by QUISTORFF for rat liver studies, was used in an investigation of brain lactic acid levels during electroshock seizures in immobilized, artificially ventilated rats.
Abstract: INCREASED cerebral metabolic activity during epileptic seizures accompanied by excess brain lactate formation has been amply demonstrated in animal experiments, where no control of gas tensions and acid-base parameters in the arterial blood was exercised (KLEIN and OMEN, 1947; GURDJIAN et al., 1947; KING et al., 1967). However, in a series of studies both in experimental animals and in man, no convincing evidence of cerebral lactate acidosis during seizure activity was found when sufficient oxygenation of the arterial blood was assured during controlled ventilation and oxygen breathing (PLUM et al., 1968; BERESFORD et al., 1969; COLLINS et al., 1970). In these studies, an increased cerebral blood flow was demonstrated, and as the cerebral hyperaemia observed during epileptic seizures has been assumed to be, at least in part, due to tissue lactic acidosis, the failure to find excess lactate in hyperaemic brains during seizure activity casts doubt upon a key argument in the theory of metabolic flow regulation. In a recent study employing relaxed, ventilated patients receiving electroconvulsive therapy, an increased output of lactate from brain together with an increased cerebral blood flow during seizures was demonstrated (BRODERSEN et al., 1973). Such different findings, all obtained in situations of sufficient oxygen supply, make the unsolved problem of lactic acid production in brain tissue during seizures a crucial one. Since the steady-state concentration of lactate in brain changes very rapidly after interruption of blood supply to the tissue (LOWRY et al., 1964), estimation of in vivo concentrations requires a sampling method that reduces the time from biopsy to cessation of metabolic activity to a minimum. In the present study an ultrarapid freeze-sampling method, originally developed by QUISTORFF (1972) for rat liver studies, was used. This technique, which allowed removal and freeze-clamping of about 20 per cent of the rat brain in 0.12 s, was used in an investigation of brain lactic acid levels during electroshock seizures in immobilized, artificially ventilated rats.

Journal ArticleDOI
TL;DR: Two methods of simulating continuous-data systems by sampled-data models are presented; one relies on the optimization of the sampled-data system with a quadratic performance index, and the other relies on a point-by-point comparison method.

Journal ArticleDOI
TL;DR: A new technique is presented by which the hologram image speckle can be reduced by spatially random sampling of the hologram aperture by means of a movable random mask.
Abstract: In this paper, we will illustrate a new technique by which the hologram image speckle can be reduced by spatially random sampling of the hologram aperture. The sampling procedure can be performed by a movable random mask. If the sampling function is made to be uncorrelated in subsequent spatial sampling, the speckle effect in the hologram image can be eliminated. However, an optimum sampling function may be difficult to obtain, which remains to be seen. This speckle reduction technique can be achieved only by some trade-off of the holographic resolution. A simple experimental confirmation of this technique is illustrated. Since one of the most severely limiting factors in applications of holography may be the speckle effect, this proposed technique should be added, together with the other existent techniques, to remedy this limiting factor.

Journal ArticleDOI
01 Sep 1973-Talanta
TL;DR: Application of this test to the results of determinations of manganese in human serum by a method of established precision led to the detection of airborne pollution of the serum during the sampling process.

Patent
20 Jun 1973
TL;DR: An apparatus for bacteriological sampling is described in which Petri dishes are automatically provided with the ingredients for performing bacteriological colony counts.
Abstract: An apparatus for bacteriological sampling in which Petri dishes are automatically provided with the ingredients for performing bacteriological colony counts. Sequentially diluted samples can be prepared by the apparatus.