
Showing papers on "Parametric statistics published in 1990"


Journal ArticleDOI
TL;DR: In this paper, the authors study the asymptotic properties of instrumental variable estimates of multivariate cointegrating regressions and allow for deterministic and stochastic regressors as well as quite general deterministic processes in the data generating mechanism.
Abstract: This paper studies the asymptotic properties of instrumental variable (IV) estimates of multivariate cointegrating regressions and allows for deterministic and stochastic regressors as well as quite general deterministic processes in the data-generating mechanism. It is found that IV regressions are consistent even when the instruments are stochastically independent of the regressors. This phenomenon, which contrasts with traditional theory for stationary time series, is a beneficial artifact of spurious regression theory whereby stochastic trends in the instruments ensure their relevance asymptotically. Problems of inference are also addressed and some promising new theoretical results are reported. These involve a class of Wald tests which are modified by semiparametric corrections for serial correlation and for endogeneity. The resulting test statistics, which we term fully modified Wald tests, have limiting χ2 distributions, thereby removing the obstacles to inference in cointegrated systems that were presented by the nuisance parameter dependencies in earlier work. Some simulation results are reported which seek to explore the sampling behaviour of our suggested procedures. These simulations compare our fully modified (semiparametric) methods with the parametric error-correction methodology that has been extensively used in recent empirical research and with conventional least squares regression. Both the fully modified and error-correction methods work well in finite samples and the sampling performance of each procedure confirms the relevance of asymptotic distribution theory, as distinct from super-consistency results, in discriminating between statistical methods.
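
As a rough illustration of the consistency claim (not of the fully modified Wald tests themselves), the following sketch estimates the cointegrating slope by IV using an instrument that is an independent random walk; the data-generating process and sample sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 2.0  # true cointegrating coefficient

def iv_estimate(T):
    """Simple IV slope estimate in the cointegrating regression y = beta*x + u."""
    x = np.cumsum(rng.normal(size=T))   # I(1) regressor (stochastic trend)
    z = np.cumsum(rng.normal(size=T))   # instrument: an INDEPENDENT random walk
    u = rng.normal(size=T)              # stationary equation error
    y = beta * x + u
    return np.sum(z * y) / np.sum(z * x)

# estimates concentrate around beta as T grows, despite the "irrelevant" instrument
for T in (100, 1000, 10000):
    print(T, np.mean([iv_estimate(T) for _ in range(200)]))
```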

3,945 citations


Book
Tony Lancaster
01 Jan 1990
TL;DR: This book develops models and inference methods for transition and duration data, covering covariates and the hazard function, parametric families of duration distributions, mixture models, and structural transition models, together with identifiability, fully parametric and limited-information inference, misspecification analysis, and residual analysis.
Abstract: Preface Part I. Model Building: 1. Some basic results 2. Covariates and the hazard function 3. Parametric families of duration distributions 4. Mixture models 5. Some important processes 6. Some structural transition models Part II. Inference: 7. Identifiability issues 8. Fully parametric inference 9. Limited information inference 10. Misspecification analysis 11. Residual analysis Appendix 1: The gamma function and distribution Appendix 2: Some properties of the Laplace transform Bibliography Index.

1,788 citations


Journal ArticleDOI
TL;DR: In this paper, a flexible parametric proportional hazards model is proposed, the specification being flexibly parametric in the sense that the baseline hazard is non-parametric while the effect of the covariates takes a particular functional form.
Abstract: In this paper we specify and estimate a flexible parametric proportional hazards model. The model specification is flexibly parametric in the sense that the baseline hazard is non-parametric while the effect of the covariates takes a particular functional form. We also add parametric heterogeneity to the underlying hazard model specification. We specify a flexible parametric proportional competing risks model which permits unrestricted correlation among the risks. Unemployment duration data are then analysed using the flexible parametric duration and competing risks specifications. We find an important effect arising from the exhaustion of unemployment insurance and significantly different hazards for the two types of risks, new jobs and recalls.
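
A minimal sketch of the core specification, a baseline hazard left unspecified alongside parametric covariate effects, using the lifelines Cox proportional hazards fitter on synthetic data; the covariates and data-generating process below are invented for illustration, and the paper's additional heterogeneity and competing-risks layers are omitted:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
ui = rng.integers(0, 2, n)                     # hypothetical covariate: UI receipt
age = rng.normal(35.0, 8.0, n)
# Weibull durations whose scale shifts with the covariates (proportional hazards)
scale = 20.0 * np.exp(0.5 * ui - 0.01 * age)
t = scale * rng.weibull(1.2, n)
c = rng.uniform(5.0, 60.0, n)                  # independent censoring times
df = pd.DataFrame({"duration": np.minimum(t, c),
                   "observed": (t <= c).astype(int),
                   "ui": ui, "age": age})

cph = CoxPHFitter()                            # baseline hazard left unspecified
cph.fit(df, duration_col="duration", event_col="observed")
cph.print_summary()                            # parametric covariate effects
```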

696 citations


Journal ArticleDOI
TL;DR: In this paper, a method for recovery of compact volumetric models for shape representation of single-part objects in computer vision is introduced, where the model recovery is formulated as a least-squares minimization of a cost function for all range points belonging to a single part.
Abstract: A method for recovery of compact volumetric models for shape representation of single-part objects in computer vision is introduced. The models are superquadrics with parametric deformations (bending, tapering, and cavity deformation). The input for the model recovery is three-dimensional range points. Model recovery is formulated as a least-squares minimization of a cost function for all range points belonging to a single part. During an iterative gradient descent minimization process, all model parameters are adjusted simultaneously, recovering the position, orientation, size, and shape of the model, such that most of the given range points lie close to the model's surface. A specific solution among several acceptable solutions, which are all minima in the parameter space, can be reached by constraining the search to a part of the parameter space. The many shallow local minima in the parameter space are avoided as a solution by using a stochastic technique during minimization. Results using real range data show that the recovered models are stable and that the recovery procedure is fast.
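
A sketch of the least-squares step under simplifying assumptions: an undeformed, axis-aligned superquadric fitted to synthetic range points with SciPy, using a volume-weighted inside-outside residual of the kind used in this literature; pose recovery, deformations, and the stochastic minimization are omitted:

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(p, pts):
    """Volume-weighted inside-outside residual of an axis-aligned superquadric."""
    a1, a2, a3, e1, e2 = p
    x, y, z = pts.T
    f = (np.abs(x / a1) ** (2 / e2) + np.abs(y / a2) ** (2 / e2)) ** (e2 / e1) \
        + np.abs(z / a3) ** (2 / e1)
    return np.sqrt(a1 * a2 * a3) * (f ** (e1 / 2) - 1.0)  # zero exactly on the surface

# synthetic "range points" on a 3 x 2 x 1 ellipsoid (e1 = e2 = 1), plus noise
rng = np.random.default_rng(0)
u = rng.uniform(-np.pi / 2, np.pi / 2, 400)
v = rng.uniform(-np.pi, np.pi, 400)
pts = np.stack([3 * np.cos(u) * np.cos(v),
                2 * np.cos(u) * np.sin(v),
                1 * np.sin(u)], axis=1)
pts += rng.normal(scale=0.01, size=pts.shape)

fit = least_squares(residuals, x0=[1, 1, 1, 1, 1], args=(pts,),
                    bounds=([0.1] * 5, [10, 10, 10, 2, 2]))
print(fit.x)  # ~ [3, 2, 1, 1, 1]
```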

596 citations


Journal ArticleDOI
TL;DR: In this paper, a non-parametric estimator for discrete response valuation experiments is proposed, where no distribution assumption is necessary and, in fact, the computations can be done on the back of an envelope.
Abstract: …environmental goods. The idea of asking the individual to accept or reject a given price for an environmental good-rather than asking him/her to state his/her reservation price exactly-was introduced in the literature by Bishop and Heberlein (1979). Hanemann (1984) showed how this particular technique of preference revelation could be integrated into economic theory by using the Random Utility Maximization (RUM) model. A salient feature of the studies published to date is the use of a specific parametric statistical model to derive descriptive measures like mean or median Willingness to Pay (WTP) from the discrete response valuation experiment. The purpose of this article is to introduce a non-parametric estimator suitable for discrete response valuation experiments. There are two significant advantages of the approach suggested here; first, no distribution assumption is necessary and, second, it is very simple to use-in fact, the computations can be done "on the back of an envelope." The paper is structured as follows: Section II outlines three different (parametric) approaches suggested in the literature, i.e., Bishop and Heberlein (1979), Hanemann (1984), and Cameron (1988). Section III introduces the non-parametric estimator and gives a brief description of the data used to illustrate the methods. Section IV compares the estimators, followed by concluding remarks.
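
One concrete reading of such a back-of-the-envelope estimator, under assumptions that go beyond the abstract: pool-adjacent-violators monotonization of the observed acceptance shares, then mean WTP as the area under the resulting step function truncated at the largest bid (the bids and shares are invented):

```python
import numpy as np

# hypothetical discrete-response data: bid levels and observed shares of "yes"
bids = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
yes = np.array([0.91, 0.78, 0.52, 0.61, 0.12])   # 0.61 violates monotonicity

def pava_decreasing(y):
    """Pool adjacent violators until the sequence is non-increasing."""
    vals, wts = list(map(float, y)), [1.0] * len(y)
    i = 0
    while i < len(vals) - 1:
        if vals[i] < vals[i + 1]:                # violation: pool the two blocks
            vals[i] = (wts[i] * vals[i] + wts[i + 1] * vals[i + 1]) / (wts[i] + wts[i + 1])
            wts[i] += wts[i + 1]
            del vals[i + 1], wts[i + 1]
            i = max(i - 1, 0)
        else:
            i += 1
    return np.repeat(vals, np.asarray(wts, dtype=int))

p = pava_decreasing(yes)                          # monotone acceptance ("survival") curve
# mean WTP as the area under the step function, truncated at the largest bid
grid = np.concatenate([[0.0], bids])
surv = np.concatenate([[1.0], p])
print(p, np.sum(surv[:-1] * np.diff(grid)))
```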

386 citations


Journal ArticleDOI
TL;DR: In this paper, the authors argue that a more fruitful approach to testing optimizing behavior is to measure the departure from optimization using the estimated objective function, and see whether this departure is significant in an economic sense.

327 citations


Posted Content
TL;DR: In this paper, the authors used semiparametric methods to reanalyze data on the labor supply of married women first studied by Thomas Mroz (1987) using parametric methods.
Abstract: Among the central theoretical developments in the econometric analysis of nonexperimental microeconomic data has been the analysis of selectivity bias. Following the work of James Heckman (1974), statistical techniques were developed in the 1970s to consistently estimate the parameters of these models. One potential drawback to the application of these techniques is their sensitivity to the assumed parametric distribution of the unobservable error terms in the model. In recent years, a number of estimation methods for selection models have been developed which do not impose parametric forms on error distributions; these methods are termed "semiparametric," since only part of the model of interest (the regression function) is parametrically specified. While the statistical theory of these semiparametric estimators has received much attention, practical applications of the methods are lacking (an exception being the paper by Joel Horowitz and George Neumann, 1987). In this paper, we use semiparametric methods to reanalyze data on the labor supply of married women first studied by Thomas Mroz (1987) using parametric methods. The object of this reanalysis is to determine whether Mroz's results are sensitive to his parametric assumptions.
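
For context, a sketch of the fully parametric two-step selection estimator whose distributional sensitivity motivates the semiparametric alternatives; joint normality of the errors, assumed below, is exactly what the semiparametric methods relax (data are synthetic):

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 2000
z = rng.normal(size=(n, 2))                       # selection-equation covariates
eps, u = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], n).T
work = z @ np.array([1.0, 0.8]) + eps > 0         # outcome observed only if True
wage = 1.0 + 2.0 * z[:, 0] + u                    # latent outcome equation (slope = 2)

# step 1: probit for selection, then the inverse Mills ratio
probit = sm.Probit(work.astype(int), sm.add_constant(z)).fit(disp=0)
xb = probit.fittedvalues                          # linear predictor
mills = norm.pdf(xb) / norm.cdf(xb)

# step 2: OLS on the selected sample, augmented with the Mills ratio term
X = sm.add_constant(np.column_stack([z[work, 0], mills[work]]))
print(sm.OLS(wage[work], X).fit().params)         # slope ~ 2 once corrected
```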

291 citations


Journal ArticleDOI
TL;DR: In this article, a parametric approach is proposed in order to introduce a well-defined metric on the class of invertible autoregressive integrated moving-average (ARIMA) models, namely the Euclidean distance between their autoregressive expansions.
Abstract: In a number of practical problems where clustering or choosing from a set of dynamic structures is needed, the introduction of a distance between the data is an early step in the application of multivariate statistical methods. In this paper a parametric approach is proposed in order to introduce a well-defined metric on the class of autoregressive integrated moving-average (ARIMA) invertible models as the Euclidean distance between their autoregressive expansions. Two case studies for clustering economic time series and for assessing the consistency of seasonal adjustment procedures are discussed. Finally, some related proposals are surveyed and some suggestions for further research are made.
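
A minimal sketch of such a metric, computing truncated AR(∞) expansions by recursion and taking the Euclidean distance between them; the truncation lag and the example models are arbitrary choices:

```python
import numpy as np

def pi_weights(phi, theta, k=50):
    """AR(inf) coefficients c_j with x_t = sum_j c_j x_{t-j} + a_t, for the
    invertible ARMA model (1 - sum_i phi_i B^i) x_t = (1 - sum_j theta_j B^j) a_t."""
    c = []
    for j in range(1, k + 1):
        pj = phi[j - 1] if j <= len(phi) else 0.0
        tj = theta[j - 1] if j <= len(theta) else 0.0
        conv = sum(theta[m - 1] * c[j - m - 1]
                   for m in range(1, min(j - 1, len(theta)) + 1))
        c.append(pj - tj + conv)
    return np.array(c)

def distance(model_a, model_b, k=50):
    """Euclidean distance between the truncated AR expansions of two models."""
    return float(np.linalg.norm(pi_weights(*model_a, k=k) - pi_weights(*model_b, k=k)))

# e.g. AR(1) with phi = 0.5 versus MA(1) with theta = 0.4
print(distance(([0.5], []), ([], [0.4])))
```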

269 citations


Journal ArticleDOI
01 Oct 1990
TL;DR: The Gibbs distribution on the ensemble of networks with a fixed architecture is derived, and the proposed formalism is applied to the problems of selecting an optimal architecture and predicting learning curves.
Abstract: A general statistical description of the problem of learning from examples is presented. Learning in layered networks is posed as a search in the network parameter space for a network that minimizes an additive error function over a set of statistically independent examples. By imposing the equivalence of the minimum error and the maximum likelihood criteria for training the network, the Gibbs distribution on the ensemble of networks with a fixed architecture is derived. The probability of correct prediction of a novel example can be expressed using the ensemble, serving as a measure of the network's generalization ability. The entropy of the prediction distribution is shown to be a consistent measure of the network's performance. The proposed formalism is applied to the problems of selecting an optimal architecture and the prediction of learning curves.
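
A toy rendering of the ensemble idea for a one-parameter "network", assuming a grid over parameter space and an arbitrary inverse temperature; the paper's derivation is far more general:

```python
import numpy as np

rng = np.random.default_rng(0)
# toy "network": a single-parameter perceptron y = sign(w * x); teacher w* = 1.3
x = rng.normal(size=40)
y = np.sign(1.3 * x)

w = np.linspace(-4, 4, 2001)                      # parameter-space grid
E = np.array([(np.sign(wi * x) != y).sum() for wi in w], dtype=float)
beta = 2.0                                        # inverse temperature
gibbs = np.exp(-beta * E)
gibbs /= gibbs.sum()                              # Gibbs ensemble over networks

# ensemble probability of correctly classifying a novel example
x0 = rng.normal()
p_correct = gibbs[np.sign(w * x0) == np.sign(1.3 * x0)].sum()
entropy = -(gibbs * np.log(gibbs + 1e-300)).sum()
print(p_correct, entropy)
```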

242 citations


Journal ArticleDOI
TL;DR: In this article, the problem of output tracking for a single-input single-output non-linear system in the presence of uncertainties is studied, and a control law is designed for minimum-phase non-linear systems which results in output tracking of a given bounded desired signal.
Abstract: The problem of output tracking for a single-input single-output non-linear system in the presence of uncertainties is studied. The notions of relative degree and minimum phase for non-linear systems are reviewed. Given a bounded desired tracking signal with bounded derivatives, a control law is designed for minimum-phase non-linear systems which results in tracking of this signal by the output. This control law is modified in the presence of uncertainties associated with the model vector fields to reduce the effects of these uncertainties on the tracking errors. Two types of uncertainties are considered: those satisfying a generalized matching condition but otherwise unstructured, and linear parametric uncertainties. It is shown that for systems with the first type of uncertainty, high-gain control laws can result in small tracking errors of O(ε), where ε is a small design parameter. An alternative scheme based on a variable structure control strategy is shown to yield zero tracking errors. Adaptive control te...
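
A toy simulation of the variable-structure idea for a relative-degree-one scalar plant with a matched uncertainty; the plant, bound, and gains are invented, and the paper's general non-linear setting is not reproduced:

```python
import numpy as np

# toy plant xdot = a*x + u with unknown |a| <= 1 (matched uncertainty),
# relative degree one; track y_d(t) = sin(t)
a_true = 0.7                                   # unknown to the controller
dt = 1e-3
x, err = 0.5, []
for ti in np.arange(0.0, 10.0, dt):
    yd, yd_dot = np.sin(ti), np.cos(ti)
    e = x - yd
    eta = abs(x) + 0.1                         # known bound: |a*x| <= |x| < eta
    u = yd_dot - 5.0 * e - eta * np.sign(e)    # variable-structure control law
    x += (a_true * x + u) * dt                 # Euler step of the true plant
    err.append(abs(e))
print(max(err[-1000:]))                        # tracking error driven near zero
```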

233 citations


Journal ArticleDOI
TL;DR: Local variations in attenuation, the center frequency and bandwidth of the transducer, and the distribution of scatterer sizes greatly influence the accuracy of estimates and the appearance of the image, thus demonstrating the importance of these factors in parametric image interpretation.

Journal ArticleDOI
TL;DR: In this article, a research strategy is suggested that separates the issue of inference from the problems of prediction or quantitative policy analysis of an empirical parametric model, and a new methodology that enables this is illustrated in a test for prima facie causality.
Abstract: A research strategy is suggested that separates the issue of inference from the problems of prediction or of quantitative policy analysis of an empirical parametric model and illustrates a new methodology that enables this in a test for prima facie causality. Unlike the conventional parametric test, the more powerful multiple rank F test is invariant to monotonic transformations of the variables and independent of the error distribution. Employing this test, the Wagnerian hypothesis, supported by conventional parametric analysis, is rejected and the conventional Keynesian theory is accepted. Copyright 1990 by MIT Press.
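
Not the paper's multiple rank F test, but a bivariate sketch of the key invariance property: an F statistic computed from ranks is unchanged by strictly monotonic transformations of the variables:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.standard_t(3, n)          # heavy-tailed errors

def rank_f(y, x):
    """F statistic from regressing the ranks of y on the ranks of x."""
    ry, rx = stats.rankdata(y), stats.rankdata(x)
    r = stats.pearsonr(rx, ry)[0]
    return (n - 2) * r**2 / (1 - r**2)

print(rank_f(y, x))
print(rank_f(np.exp(y), x**3 + x))                # identical: ranks are unchanged
```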

Journal ArticleDOI
TL;DR: In this article, a comparison of nonparametric regression curves is considered, where the authors assume that there are parametric transformations of the axes which map one curve into the other.
Abstract: The comparison of nonparametric regression curves is considered. It is assumed that there are parametric (possibly nonlinear) transformations of the axes which map one curve into the other. Estimation and testing of the parameters in the transformations are studied. The rate of convergence is $n^{-1/2}$ although the nonparametric components of the model typically have a rate slower than that. A statistic is provided for testing the validity of a given completely parametric model.
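
A sketch under simple assumptions (one-dimensional curves, a known parametric form a + bx for the axis map, Gaussian-kernel smoothers with a fixed bandwidth): estimate the transformation by matching the two smoothed curves:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def nw(xq, x, y, h=0.15):
    """Nadaraya-Watson smoother with a Gaussian kernel."""
    w = np.exp(-0.5 * ((xq[:, None] - x[None, :]) / h) ** 2)
    return w @ y / w.sum(axis=1)

m = lambda s: np.sin(2 * np.pi * s)
x1 = rng.uniform(0, 1, 300); y1 = m(x1) + 0.1 * rng.normal(size=300)
x2 = rng.uniform(0, 1, 300); y2 = m(0.1 + 0.8 * x2) + 0.1 * rng.normal(size=300)

grid = np.linspace(0.05, 0.95, 60)
def loss(p):
    a, b = p
    g = a + b * grid
    ok = (g > 0) & (g < 1)                      # stay on the first curve's support
    return np.mean((nw(grid[ok], x2, y2) - nw(g[ok], x1, y1)) ** 2)

print(minimize(loss, x0=[0.0, 1.0], method="Nelder-Mead").x)  # ~ (0.1, 0.8)
```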

Journal ArticleDOI
TL;DR: In this article, the authors review research on choosing between parametric and nonparametric statistics, along with 10 recently developed nonparametric techniques for testing interactions in experimental design, and show that these new techniques are robust, powerful, versatile, and easy to compute.
Abstract: Until recently the design of experiments in the behavioral and social sciences that focused on interaction effects demanded the use of the parametric analysis of variance. Yet, researchers have been concerned by the presence of nonnormally distributed variables. Although nonparametric statistics are recommended in these situations, researchers often rely on the robustness of parametric tests. Further, often it is assumed that nonparametric methods lack statistical power and that there is a paucity of techniques in more complicated research designs, such as in testing for interaction effects. This paper reviewed (a) research in the past decade and a half that addressed concerns in selecting parametric and nonparametric statistics and (b) 10 recently developed nonparametric techniques for the testing of interactions in experimental design. The review shows that these new techniques are robust, powerful, versatile, and easy to compute. An application of selected nonparametric techniques on fabricated data is...
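
A minimal sketch of one such device, the simple rank transform (an ordinary two-way ANOVA run on ranked data); the literature reviewed here also covers refinements, such as aligned-rank procedures, that behave better for interactions (data below are synthetic):

```python
import numpy as np
import pandas as pd
from scipy.stats import rankdata
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
a = np.repeat([0, 1], 40)                       # factor A, 80 observations
b = np.tile(np.repeat([0, 1], 20), 2)           # factor B, crossed with A
y = 0.5 * a + 0.5 * b + 0.8 * a * b + rng.exponential(1.0, 80)  # skewed errors

df = pd.DataFrame({"A": a, "B": b, "r": rankdata(y)})
model = smf.ols("r ~ C(A) * C(B)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))          # interaction row: C(A):C(B)
```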

Journal ArticleDOI
TL;DR: In this paper, a Monte Carlo method is employed to characterize distributions of parameter values calculated in nonlinear regression problems, yielding accurate estimates of confidence intervals and revealing errors of up to 2- and 3-fold in uncertainties calculated under standard parametric assumptions.
Abstract: A Monte Carlo method is employed to characterize distributions of parameter values calculated in nonlinear regression problems. Accurate estimates of confidence intervals are easily obtained. Two illustrative numerical examples are provided to compare the Monte Carlo uncertainty estimates with those derived by use of standard methods of parametric statistics. The customary assumptions that (1) the effects of covariances between pairs of the parameters can be ignored and (2) the distributions of the parameters are normal are shown to lead to significant errors, up to 2- and 3-fold in the calculated uncertainties. The Monte Carlo method is free from these assumptions and their associated errors.
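
A sketch of the Monte Carlo recipe under assumed specifics (an exponential-decay model, known noise level, parametric resampling from the fitted model): refit many synthetic data sets and read confidence intervals from the resulting parameter cloud:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
model = lambda t, a, k: a * np.exp(-k * t)        # hypothetical decay model
t = np.linspace(0, 5, 25)
sigma = 0.05
y = model(t, 2.0, 0.7) + rng.normal(0, sigma, t.size)

p_hat, _ = curve_fit(model, t, y, p0=[1.0, 1.0])

# Monte Carlo: refit many synthetic data sets generated from the fitted model
draws = np.array([
    curve_fit(model, t, model(t, *p_hat) + rng.normal(0, sigma, t.size), p0=p_hat)[0]
    for _ in range(1000)
])
print(np.percentile(draws, [2.5, 97.5], axis=0))  # intervals without normality
print(np.corrcoef(draws.T))                       # parameter covariances retained
```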

Proceedings ArticleDOI
13 May 1990
TL;DR: A kinematic modeling convention for robot manipulators is proposed which has completeness and parametric continuity (CPC) properties that make the CPC model particularly useful for robot calibration.
Abstract: A kinematic modeling convention for robot manipulators is proposed. The kinematic model is named for its completeness and parametric continuity (CPC) properties. Parametric continuity of the CPC model is achieved by adopting a singularity-free line representation consisting of four line parameters. Completeness is achieved through adding two link parameters to allow arbitrary placement of link coordinate frames. The transformations from the world frame to the base frame and from the last link frame to the tool frame can be modeled with the same modeling convention used for internal link transformations. Since all the redundant parameters in the CPC model can be systematically eliminated, a linearized robot error model can be constructed in which all error parameters are independent and span the entire geometric error space. The focus is on model construction, mappings between the CPC model and the Denavit-Hartenberg model, the study of the model properties, and its application to robot kinematic calibration.

Journal ArticleDOI
TL;DR: In this article, the authors compared the Fourier series, harmonic mean, minimum convex polygon, and two 95% ellipse home-range estimators using computer-simulated data with a known home-range area.
Abstract: We compared the Fourier series, harmonic mean, minimum convex polygon, and two 95% ellipse home-range estimators using computer-simulated data with a known home-range area. Data were generated to simulate home ranges with 1 or 2 centers of activity and with topographic barriers causing irregular shapes. Estimators performed with different precision and bias according to the type of data simulated and the number of observations taken. Overall, the harmonic mean was the least biased, but was also one of the least precise. Home-range estimators are only general measures of animal activity. J. WILDL. MANAGE. 54(2):310-315 Many statistical home-range estimators have been proposed. All have different underlying models (assumptions) providing different operating characteristics and therefore can produce dissimilar results for a specific set of data. Rarely have the statistical properties of these estimators been compared or have guidelines on the selection of a specific estimator for a particular set of data been proposed. Application of different estimators produces confusion in the interpretation of home-range estimates because some of the differences observed between studies are due to the estimators themselves, and not to the behavior of the animals being studied. We used computer-simulated data to address the differences among estimators by comparing 3 nonparametric estimators: harmonic mean (Dixon and Chapman 1980), Fourier series (Anderson 1982), and minimum convex polygon (Mohr 1947); and 2 parametric estimators: 95% ellipse estimators of Jennrich and Turner (1969) and Koeppl et al. (1975). Two types of animal movements were simulated: (1) an animal with a center of activity (e.g., a nesting bird) and (2) an animal with a uniform distribution (no center of activity). Data from a normal distribution were used to simulate center-of-activity movements, and data from a uniform distribution were used for movements without a center of activity. In addition, home ranges with irregular shapes, as would be caused by topographic barriers and/or habitat heterogeneity, were simulated. Advantages of using computer-simulated data are that a true home range is known, and thus bias and precision of each estimator can be evaluated. The statistical properties of an estimator cannot be determined with field telemetry data because true replicate observations cannot be constructed. Replicated, simulated data sets also allow powerful statistical comparisons of estimators.
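
A minimal sketch of two of the compared estimators on simulated locations: the minimum convex polygon area via a convex hull, and a Jennrich-Turner-style 95% ellipse area; the simulation settings are invented and much simpler than the paper's:

```python
import numpy as np
from scipy.spatial import ConvexHull
from scipy.stats import chi2

rng = np.random.default_rng(0)
# simulated locations: one bivariate-normal center of activity
pts = rng.normal(loc=[0.0, 0.0], scale=[1.0, 0.6], size=(50, 2))

mcp = ConvexHull(pts)                  # for 2-D input, .volume is the polygon area
S = np.cov(pts.T)                      # parametric 95% ellipse (Jennrich-Turner style)
ellipse_area = np.pi * chi2.ppf(0.95, df=2) * np.sqrt(np.linalg.det(S))
print("MCP area:", mcp.volume, " 95% ellipse area:", ellipse_area)
```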

Journal ArticleDOI
TL;DR: In this paper, an asymptotically-minimum-variance algorithm for estimating the MA (moving average) and ARMA (autoregressive moving average) parameters of non-Gaussian processes from sample high-order moments is given.
Abstract: A description is given of an asymptotically-minimum-variance algorithm for estimating the MA (moving-average) and ARMA (autoregressive moving-average) parameters of non-Gaussian processes from sample high-order moments. The algorithm uses the statistical properties (covariances and cross covariances) of the sample moments explicitly. A simpler alternative algorithm that requires only linear operations is also presented. The latter algorithm is asymptotically-minimum-variance in the class of weighted least-squares algorithms.
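
For orientation, the simplest closed-form relative of these estimators, not the minimum-variance version: for an MA(q) process driven by skewed noise, b(k) = C3(q, k) / C3(q, 0) in terms of third-order cumulants:

```python
import numpy as np

rng = np.random.default_rng(0)
b = np.array([1.0, -0.9, 0.385])            # true MA(2) coefficients, b(0) = 1
e = rng.exponential(1.0, 100000) - 1.0      # zero-mean, skewed (non-Gaussian) input
x = np.convolve(e, b)[:e.size]              # observed MA output

def c3(x, m, n):
    """Sample third-order cumulant C3(m, n) = E[x_t * x_{t+m} * x_{t+n}]."""
    T = x.size - max(m, n)
    return np.mean(x[:T] * x[m:m + T] * x[n:n + T])

q = 2
print([c3(x, q, k) / c3(x, q, 0) for k in range(q + 1)])   # ~ [1, -0.9, 0.385]
```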

Proceedings ArticleDOI
23 May 1990
TL;DR: In this article, a method is presented for parameter set estimation for a system which contains both parametric and nonparametric uncertainty; prior information is available about both types of uncertainty, but only the parametric type is further refined from the measured data.
Abstract: A method is presented for parameter set estimation for a system which contains both parametric and nonparametric uncertainty. Prior information is available about both types of uncertainty, but only the parametric type is further refined from the measured data.

Proceedings ArticleDOI
01 Sep 1990
TL;DR: An algorithm is developed to detect geometric collisions between pairs of time-dependent parametric surfaces; it works on surfaces that are continuous and have bounded derivatives, and covers objects that move or deform as a function of time.
Abstract: We develop an algorithm to detect geometric collisions between pairs of time-dependent parametric surfaces. The algorithm works on surfaces that are continuous and have bounded derivatives, and includes objects that move or deform as a function of time. The algorithm numerically solves for the parametric values corresponding to coincident points and near-misses between the surfaces of two parametric functions. Upper bounds on the parametric derivatives make it possible to guarantee the successful detection of collisions and near-misses; we describe a method to find the derivative bounds for many surface types. To compute collisions between new types of surfaces, the mathematical collision analysis is needed only once per surface type, rather than analyzing for each pair of surface types. The algorithm is hierarchical, first finding potential collisions over large volumes, and then refining the solution to smaller volumes. The user may specify the desired accuracy of the solution. A C-code implementation is described, with results for several non-bicubic and bicubic time-dependent parametric functions. An animation of the collision computation demonstrates collisions between complex parametric functions.
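
A toy rendering of the hierarchical idea, shrunk from surfaces to time-dependent parametric curves so it fits in a few lines; the objects, the crude global derivative bound L, and the tolerance are invented:

```python
import numpy as np

# hypothetical time-dependent parametric objects (curves, for brevity):
# a translating circle and a translating vertical segment, u, v, t in [0, 1]
p1 = lambda u, t: np.array([np.cos(2 * np.pi * u) + t, np.sin(2 * np.pi * u)])
p2 = lambda v, t: np.array([2.0 - t, 2.0 * v - 1.0])
L = 2 * np.pi + 4.0            # crude global bound on all parametric/time derivatives

def collisions(box=((0, 1), (0, 1), (0, 1)), tol=1e-2, hits=None):
    """Hierarchical search: prune any (u, v, t) box whose center distance cannot
    be cancelled by bounded derivatives over the box radius; else subdivide."""
    if hits is None:
        hits = []
    (u0, u1), (v0, v1), (t0, t1) = box
    uc, vc, tc = (u0 + u1) / 2, (v0 + v1) / 2, (t0 + t1) / 2
    r = max(u1 - u0, v1 - v0, t1 - t0) / 2
    d = np.linalg.norm(p1(uc, tc) - p2(vc, tc))
    if d - 2 * L * r > 0:      # guaranteed: no collision anywhere in this box
        return hits
    if r < tol:                # refined to the requested accuracy
        hits.append((uc, vc, tc, d))
        return hits
    for ub in ((u0, uc), (uc, u1)):
        for vb in ((v0, vc), (vc, v1)):
            for tb in ((t0, tc), (tc, t1)):
                collisions((ub, vb, tb), tol, hits)
    return hits

found = collisions()
print(len(found), min(h[3] for h in found))   # near-miss boxes and closest distance
```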

Journal ArticleDOI
TL;DR: In this article, a generalization of the small gain theorem for robust stability of a linear time-invariant dynamic system under perturbations of mixed type is presented, and the problem of calculating the exact structured and unstructured stability margins is then constructively solved.
Abstract: The problem of robust stability for linear time-invariant single-output control systems subject to both structured (parametric) and unstructured (H∞) perturbations is studied. A generalization of the small gain theorem which yields necessary and sufficient conditions for robust stability of a linear time-invariant dynamic system under perturbations of mixed type is presented. The solution involves calculating the H∞-norm of a finite number of extremal plants. The problem of calculating the exact structured and unstructured stability margins is then constructively solved. A feedback control system containing a linear time-invariant plant which is subject to both structured and unstructured perturbations is considered. The case where the system to be controlled is interval is treated, and a nonconservative, easily verifiable necessary and sufficient condition for robust stability is given. The solution is based on the extremal property of a finite number of line segments in the plant parameter space along which the points closest to instability are encountered.
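
A rough numerical sketch of the "finite number of extremal plants" idea, checking only the vertex plants of an invented interval family on a frequency grid; the paper's exact result also requires segments between vertices and handles the structured margin:

```python
import numpy as np

w = np.logspace(-2, 3, 4000)                      # frequency grid (rad/s)
s = 1j * w

def hinf_norm(num, den):
    """Grid approximation of the H-infinity norm of G(s) = num(s)/den(s)."""
    return np.abs(np.polyval(num, s) / np.polyval(den, s)).max()

# hypothetical interval plant 1 / (s^2 + a1 s + a0), a1 in [1, 2], a0 in [2, 4]
margins = [1.0 / hinf_norm([1.0], [1.0, a1, a0])
           for a1 in (1.0, 2.0) for a0 in (2.0, 4.0)]
print("unstructured margin over the vertex plants:", min(margins))
```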

Journal ArticleDOI
TL;DR: In this paper, a linear method and a modification to an existing linear method are proposed for consistent parameter estimation in measurement noise under the assumption that the system order is known, and both recursive closed-form and batch least-squares versions of the parameter estimators are presented.
Abstract: The problem of estimating the parameters of a moving average model from the cumulant statistics of the noisy observations of the system output is discussed. The system is driven by an independently identically distributed non-Gaussian sequence that is not observed. The noise is additive and may be colored and non-Gaussian. Following some existing linear parametric approaches to this problem, a linear method and a modification to an existing linear method are proposed for consistent parameter estimation in measurement noise under the assumption that the system order is known. Both recursive closed-form and batch least-squares versions of the parameter estimators are presented. The existing and the proposed linear methods utilize only a partial set of the relevant output statistics (this restriction being necessary to obtain a linear estimator), whereas there exist nonlinear methods that exploit a much larger set of output statistics. A simulation example where two existing linear methods and the two new methods are compared to two existing nonlinear methods is presented.

Proceedings ArticleDOI
03 Apr 1990
TL;DR: An acoustic-class-dependent technique for text-independent speaker identification on very short utterances is described, based on maximum-likelihood estimation of a Gaussian mixture model representation of speaker identity.
Abstract: An acoustic-class-dependent technique for text-independent speaker identification on very short utterances is described. The technique is based on maximum-likelihood estimation of a Gaussian mixture model representation of speaker identity. Gaussian mixtures are noted for their robustness as a parametric model and their ability to form smooth estimates of rather arbitrary underlying densities. Speaker model parameters are estimated using a special case of the iterative expectation-maximization (EM) algorithm, and a number of techniques are investigated for improving model robustness. The system is evaluated using a 12-speaker reference population from a conversational speech database. It achieves 80% average text-independent speaker identification performance for a 1-s test utterance length.
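
A sketch with scikit-learn's EM-trained GaussianMixture standing in for the paper's implementation, on invented 2-D features in place of real cepstra; identification is by average log-likelihood of a short test utterance:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def utterance(seed, n=400):
    """Toy 2-D 'cepstral' features: each speaker mixes 4 acoustic classes."""
    r = np.random.default_rng(seed)
    means = r.normal(scale=3.0, size=(4, 2))
    return np.vstack([r.normal(m, 1.0, size=(n // 4, 2)) for m in means])

# train one Gaussian mixture model per speaker via EM
models = {s: GaussianMixture(n_components=4, covariance_type="diag",
                             random_state=0).fit(utterance(s)) for s in range(3)}

# identify a very short test utterance by maximum average log-likelihood
test = utterance(1, n=40)                          # truly speaker 1
scores = {s: m.score(test) for s, m in models.items()}
print(max(scores, key=scores.get), scores)
```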

Journal ArticleDOI
TL;DR: Routine use of the bulb syringe and use of regulated suction only when indicated, and always below 100 mm Hg, are also important to avoid traumatic esophageal perforation in neonates.
Abstract: …when the standard operative maximum wall suction might inadvertently be used to suction the neonatal oropharynx and trauma might be induced. There is no easy way to know during cesarean sections if suction is causing esophageal trauma because of the usual blood-stained amniotic fluid at cesarean delivery. Careful intubation of the newborn by skilled personnel is fundamentally important. Routine use of the bulb syringe and use of regulated suction only when indicated, and always below 100 mm Hg, are also important to avoid traumatic esophageal perforation in neonates.

Journal ArticleDOI
TL;DR: A model of the echo formation process from tissues is suggested in order to relate microstructural features to ultrasonic spectral signatures; the model is tested by simulating practical situations, in order to prove its validity and to check the procedure for estimating model parameters from actual data.
Abstract: A model of the echo formation process from tissues is suggested, in order to relate microstructural features to ultrasonic spectral signatures. The tissue is modeled as a collection of ideal randomly distributed scatterers, filtered by a time-invariant system representing the measurement apparatus and the average properties of the scattering medium, to obtain the backscattered signal. The gamma distribution has been assumed to describe the process, because it offers a flexible approach to approximating the parametric regularity of the scatterers. According to the model the spectral characteristics of the backscattered signal are strictly correlated with the spatial architecture and can be related to the gamma distribution parameters, i.e. interdistance and order. The model has been tested by simulating practical situations, in order to prove its validity and to test the procedure for estimating model parameters from actual data. The correspondence of experimental results obtained from tissues of different degrees of regularity with the simulated results confirms the effectiveness of the model for tissue characterization studies.
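
A simulation sketch in the spirit of the model, with assumed numbers throughout (pulse center frequency, sampling rate, gamma order, and mean interdistance): gamma-spaced scatterers convolved with a time-invariant pulse, then examined in the spectral domain:

```python
import numpy as np

rng = np.random.default_rng(0)
fs, c = 50e6, 1540.0                       # sampling rate (Hz), sound speed (m/s)

# scatterer depths with gamma-distributed inter-distances (order sets regularity)
order, mean_gap = 4.0, 100e-6              # higher order -> more regular spacing
z = np.cumsum(rng.gamma(order, mean_gap / order, 600))

# echo impulse train filtered by a time-invariant pulse (the "system")
t = np.arange(0, 2 * z.max() / c, 1 / fs)
train = np.zeros_like(t)
idx = np.round(2 * z / c * fs).astype(int)
keep = idx < t.size
train[idx[keep]] += rng.normal(1.0, 0.2, keep.sum())   # random scattering strengths

tau = np.arange(-64, 64) / fs
pulse = np.exp(-0.5 * (tau / 0.2e-6) ** 2) * np.cos(2 * np.pi * 5e6 * tau)
rf = np.convolve(train, pulse, mode="same")

spec = np.abs(np.fft.rfft(rf)) ** 2
f = np.fft.rfftfreq(rf.size, 1 / fs)
print(f[spec.argmax()])   # spectral signature shifts with interdistance and order
```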

Journal ArticleDOI
TL;DR: In this paper, the authors analyse productive efficiency in about 400 local social insurance offices of the Swedish social insurance system for the period 1974-1984 and find that the efficiency is around 0.8 and that the differences between the approaches are supprisingly small.

Journal ArticleDOI
TL;DR: In this article, the authors employ the general framework of counting processes to check the validity of an assumed parametric model for survival data, and compare the nonparametric Nelson-Aalen plot with the estimated parametric cumulative hazard rate, with $A(t, \hat{\theta})$, the maximum likelihood estimator.
Abstract: To check the validity of an assumed parametric model for survival data, one may compare $\hat{A}(t)$, the nonparametric Nelson-Aalen plot of the cumulative hazard rate, with $A(t, \hat{\theta})$, the estimated parametric cumulative hazard rate, $\hat{\theta}$ being for example the maximum likelihood estimator. Convergence in distribution of $\sqrt n (\hat{A}(t) - A(t, \hat{\theta}))$ and more general processes is studied in the present paper, employing the general framework of counting processes, which allows for quite general models for life history data and for quite general censoring schemes. The results are applied to the construction of $\chi^2$-type statistics for goodness of fit. Cramer-von Mises and Kolmogorov-Smirnov type tests are presented in the case where the unknown parameter is one-dimensional. Power considerations are also included, and some optimality results are reached. Finally tests are constructed for the hypothesis that the unspecified hazard rate part in Cox's regression model follows a parametric form.
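
A minimal sketch of the comparison for an exponential model under random censoring (all settings invented): the Nelson-Aalen estimate against the fitted parametric cumulative hazard, summarized by a Kolmogorov-Smirnov-type statistic:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
t_true = rng.exponential(2.0, n)                 # latent survival times
cens = rng.uniform(0.0, 6.0, n)                  # independent censoring
obs = np.minimum(t_true, cens)
d = (t_true <= cens).astype(int)                 # event indicator

order = np.argsort(obs)
obs, d = obs[order], d[order]
at_risk = np.arange(n, 0, -1)                    # risk-set size at each ordered time
A_hat = np.cumsum(d / at_risk)                   # Nelson-Aalen cumulative hazard

theta_hat = obs.sum() / d.sum()                  # exponential MLE under censoring
A_par = obs / theta_hat                          # parametric cumulative hazard

# Kolmogorov-Smirnov-type discrepancy sqrt(n) * sup |A_hat - A(t, theta_hat)|
print(np.sqrt(n) * np.abs(A_hat - A_par).max())
```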

Journal ArticleDOI
TL;DR: In this paper, a Neyman-Rao, or effective scores, test is introduced and shown to be asymptotically equivalent to the classical likelihood ratio, Wald, and Rao tests.
Abstract: Summary Large-sample theory of estimation and of hypothesis testing is developed, at an intermediate (graduate textbook) level. Attention is confined to parametric models. Local analysis ('moving parameter') methods are used. For estimation, an information bound is found for the large-sample variance of regular estimates, regularity being a kind of large-sample unbiasedness. And near solution of the score equations yields estimates achieving these bounds. Regular estimates and efficient estimates are characterized. For hypothesis testing, various classical tests, including likelihood ratio, Wald, and Rao, are shown to be asymptotically equivalent, and efficient within a class of quadratic form (QF) tests. A Neyman-Rao, or effective scores, test is introduced; it too is equivalent to the classical tests. Regular (asymptotically similar, and with no local power against changes in nuisance parameters) QF tests, and efficient tests, are characterized. Many of the results are not entirely new; but the organization and presentation are.
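
A worked toy case of the equivalences, testing an exponential rate: the likelihood ratio, Wald, and Rao (score) statistics computed from the same sample agree closely and share a chi-square(1) calibration:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
x = rng.exponential(1.0, 200)                    # data; test H0: rate lam0 = 1
lam0, n = 1.0, x.size
lam_hat = 1.0 / x.mean()                         # MLE of the rate

loglik = lambda lam: n * np.log(lam) - lam * x.sum()
LR = 2 * (loglik(lam_hat) - loglik(lam0))
Wald = (lam_hat - lam0) ** 2 * n / lam_hat**2    # Fisher information n / lam^2
score = n / lam0 - x.sum()                       # d loglik / d lam at lam0
Rao = score**2 / (n / lam0**2)

print(LR, Wald, Rao)                             # close to one another under H0
print(chi2.sf(LR, df=1))                         # shared chi-square(1) calibration
```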

Patent
21 Aug 1990
TL;DR: In this article, a control system and methodology for defining and executing parametric test sequences using automated test equipment is presented, where the control system is divided into components which separate fixed, reusable information from information which is specific to particular tests.
Abstract: Disclosed is a control system and methodology used for defining and executing parametric test sequences using automated test equipment. The control system is divided into components which separate fixed, reusable information from information which is specific to particular tests. One component contains reference data which describes the configuration of the wafer being tested as well as specifications for the tests to be carried out. Another component contains a set of measurement algorithms that describe individual tests to be performed on generic types of devices or parametric test structures. Execution of a test is carried out by a general test program which retrieves stored reference and test definition information and supplies it to the measurement algorithms to enable them to perform measurements on specific devices in the user specified sequence. The general test program additionally routes the measurement results obtained from the algorithms to data files and/or networks, and summarizes the results in a standardized report format.
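
A toy sketch, with invented names and in Python rather than any language the patent specifies, of the separation it describes: fixed generic measurement algorithms, test definitions held as pure data, and a general program routing one to the other:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# fixed, reusable component: generic measurement algorithms per device type
def measure_resistor(params: Dict) -> float:
    # stand-in for driving the instrument; a real algorithm would force
    # params["force_current"] and measure the resulting voltage
    return 42.0

ALGORITHMS: Dict[str, Callable[[Dict], float]] = {"resistor": measure_resistor}

# test-specific component: wafer configuration and test specs held as pure data
@dataclass
class TestSpec:
    structure: str      # which parametric test structure on the wafer
    device_type: str    # selects a generic measurement algorithm
    params: Dict        # instrument settings for this particular test

def run_sequence(specs: List[TestSpec]) -> List[float]:
    """General test program: routes stored definitions to the algorithms."""
    return [ALGORITHMS[s.device_type](s.params) for s in specs]

print(run_sequence([TestSpec("VDP1", "resistor", {"force_current": 1e-3})]))
```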