
Showing papers on "Parametric statistics published in 1989"


Journal ArticleDOI
TL;DR: In this article, the authors derive the asymptotic distribution of the maximum partial likelihood estimator β̂ of the vector of regression coefficients β under a possibly misspecified Cox proportional hazards model.
Abstract: We derive the asymptotic distribution of the maximum partial likelihood estimator β̂ of the vector of regression coefficients β under a possibly misspecified Cox proportional hazards model. As in the parametric setting, this estimator β̂ converges to a well-defined constant vector β*. In addition, the random vector n^{1/2}(β̂ – β*) is asymptotically normal with mean 0 and with a covariance matrix that can be consistently estimated. The newly proposed robust covariance matrix estimator is similar to the so-called “sandwich” variance estimators that have been extensively studied for parametric cases. For many misspecified Cox models, the asymptotic limit β* or part of it can be interpreted meaningfully. In those circumstances, valid statistical inferences about the corresponding covariate effects can be drawn based on the aforementioned asymptotic theory of β̂ and the related results for the score statistics. Extensive studies demonstrate that the proposed robust tests and interval estimation procedures...
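The “sandwich” variance mentioned in the abstract has the same bread⁻¹·meat·bread⁻¹ shape in much simpler parametric settings. As an illustrative sketch (ordinary least squares with invented data, not the Cox partial likelihood of the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear model with heteroskedastic errors: the naive OLS covariance is
# misspecified here, but the sandwich estimator remains consistent.
n, p = 500, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, 2.0, -0.5])
y = X @ beta + rng.normal(scale=1.0 + np.abs(X[:, 1]))

# Least-squares fit and residuals.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat

# Sandwich: bread^{-1} @ meat @ bread^{-1}.
bread_inv = np.linalg.inv(X.T @ X)
meat = X.T @ (resid[:, None] ** 2 * X)
cov_sandwich = bread_inv @ meat @ bread_inv
robust_se = np.sqrt(np.diag(cov_sandwich))
print(robust_se)
```

The robust standard errors differ from the naive σ²(X′X)⁻¹ ones whenever the error variance varies with the covariates.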

2,466 citations


Journal ArticleDOI
TL;DR: It is shown that the recent maximum flow algorithm of Goldberg and Tarjan can be extended to solve an important class of such parametric maximum flow problems, at the cost of only a constant factor in its worst-case time bound.
Abstract: The classical maximum flow problem sometimes occurs in settings in which the arc capacities are not fixed but are functions of a single parameter, and the goal is to find the value of the parameter such that the corresponding maximum flow or minimum cut satisfies some side condition. Finding the desired parameter value requires solving a sequence of related maximum flow problems. In this paper it is shown that the recent maximum flow algorithm of Goldberg and Tarjan can be extended to solve an important class of such parametric maximum flow problems, at the cost of only a constant factor in its worst-case time bound. Faster algorithms for a variety of combinatorial optimization problems follow from the result.
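The idea of locating a parameter value by solving a sequence of related maximum flow problems can be sketched directly. The toy code below is not the Goldberg–Tarjan extension the paper describes; it simply runs a plain Edmonds–Karp max-flow routine inside a bisection over λ, assuming the max flow is nondecreasing in λ (graph and capacities are invented for illustration):

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp max flow on a dense capacity matrix."""
    n = len(cap)
    flow = [[0.0] * n for _ in range(n)]
    total = 0.0
    while True:
        # BFS for an augmenting path in the residual graph.
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return total
        # Bottleneck along the path, then augment.
        bottleneck = float('inf')
        v = t
        while v != s:
            u = parent[v]
            bottleneck = min(bottleneck, cap[u][v] - flow[u][v])
            v = u
        v = t
        while v != s:
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
        total += bottleneck

def smallest_lambda(build_caps, s, t, target, lo=0.0, hi=10.0, iters=50):
    """Bisection for the smallest lambda whose max flow reaches `target`
    (valid because the max flow is nondecreasing in lambda here)."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if max_flow(build_caps(mid), s, t) >= target:
            hi = mid
        else:
            lo = mid
    return hi

# Source arc capacities grow linearly with lambda; the rest are fixed.
def caps(lam):
    return [[0, 2 * lam, lam, 0],
            [0, 0, 1, 3],
            [0, 0, 0, 2],
            [0, 0, 0, 0]]

lam_star = smallest_lambda(caps, 0, 3, target=4.0)
print(lam_star)
```

Each bisection step re-solves a closely related max-flow instance from scratch; the point of the paper is that the Goldberg–Tarjan preflow algorithm can share work across the whole parametric sequence at only a constant-factor cost.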

659 citations


Book
01 Jan 1989
TL;DR: In this article, the authors investigate stability theory in terms of two different measures, treat the theory of a variety of inequalities, and demonstrate manifestations of the general Lyapunov method.
Abstract: Investigates stability theory in terms of two different measures, treats the theory of a variety of inequalities, and demonstrates manifestations of the general Lyapunov method. Also covers the importance of utilizing different forms of nonlinear variation of parametric formulae, constructive methods generated by monotone iterative technique and th

570 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a survey of diagnostic testing of models based on unit record data and propose some new tests based on comparisons of parametric estimators with nonparametric estimators that are consistent under certain forms of misspecification.
Abstract: This paper surveys the growing literature on diagnostic testing of models based on unit record data. We argue that while many of these tests are produced in a Lagrange multiplier framework they are often more readily derived, and more easily applied, if approached from the conditional moment testing view of Newey (1985) and Tauchen (1985). In addition we propose some new tests based on comparisons of parametric estimators with nonparametric estimators which are consistent under certain forms of misspecification. To illustrate the utility of the tests we employ them in the examination of some existing published studies.

403 citations


Journal ArticleDOI
TL;DR: The results obtained in studies of robust stability and stabilizability of control systems with parametric (structured) uncertainties are reviewed in this paper, where both the algebraic methods based upon characteristic equations and the methods using Lyapunov functions and Riccati equations are discussed and compared.
Abstract: The results obtained in studies of robust stability and stabilizability of control systems with parametric (structured) uncertainties are reviewed. Both the algebraic methods based upon characteristic equations and the methods using Lyapunov functions and Riccati equations are discussed and compared. In the context of algebraic methods, most promising are the Kharitonov-type approach and the optimization procedure of embedding a geometric figure of some kind inside the stability regions of the parameter space, maximizing its size using minimax or some other mathematical programming technique. In the framework of Lyapunov's direct method, the dominant approach has been a quadratic function estimation of stability regions in the parameter space. In large-scale systems, the concept of vector Lyapunov functions has been used with the possibility of choosing quadratic forms, norm-like functions, and their combinations.

307 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present the results of an experimental effort to generate squeezed microwave radiation using the phase-sensitive gain of a Josephson parametric amplifier, operated in both the doubly degenerate (four-photon) mode and the degenerate (three-photon) mode.
Abstract: We present the results of an experimental effort to generate squeezed microwave radiation using the phase-sensitive gain of a Josephson parametric amplifier. To facilitate the interpretation of the experimental results, we first present a discussion of the theory of microwave squeezing via Josephson parametric amplifiers. This is followed by a detailed description of the device fabricated for our experiment. Experimental results are then presented for the device used in both the doubly degenerate (four-photon) mode and the degenerate (three-photon) mode. We have observed parametric deamplification of signals by more than 8 dB. We have demonstrated squeezing of 4.2-K thermal noise. When operated at 0.1 K, the amplifier exhibits an excess noise of 0.28 K when referred to the input. This is smaller than the vacuum fluctuation noise level ħω/2k = 0.47 K. The amplifier is thus quieter than a linear phase-insensitive amplifier in principle can be.
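As a quick consistency check on the quoted vacuum fluctuation level, ħω/2k = 0.47 K can be inverted for the implied operating frequency. The frequency below is an inference from that number, not a figure stated in the abstract:

```python
from math import pi

hbar = 1.054571817e-34  # reduced Planck constant, J s
kB = 1.380649e-23       # Boltzmann constant, J/K

# Invert hbar*omega / (2*kB) = 0.47 K for omega, then convert to GHz.
T_vac = 0.47
omega = 2 * kB * T_vac / hbar
f_GHz = omega / (2 * pi) / 1e9
print(f_GHz)
```

The result lands in the ~20 GHz microwave band, consistent with a Josephson parametric amplifier experiment.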

247 citations


Journal ArticleDOI
TL;DR: In this paper, the role of differential geometry in generalizing results is indicated, further applications are mentioned, and geometrical methods in nonlinear regression are related to those developed for general parametric families.
Abstract: Geometrical foundations of asymptotic inference are described in simple cases, without the machinery of differential geometry. A primary statistical goal is to provide a deeper understanding of the ideas of Fisher and Jeffreys. The role of differential geometry in generalizing results is indicated, further applications are mentioned, and geometrical methods in nonlinear regression are related to those developed for general parametric families.

223 citations


Posted Content
TL;DR: In this article, the authors study the asymptotic properties of instrumental variable (IV) estimates of multivariate cointegrating regressions and find that IV regressions are consistent even when the instruments are stochastically independent of the regressors.
Abstract: This paper studies the asymptotic properties of instrumental variable (IV) estimates of multivariate cointegrating regressions. The framework of study is based on earlier work by Phillips and Durlauf (1986) and Park and Phillips (1988, 1989). In particular, the results in these papers are extended to allow for IV regressions that accommodate deterministic and stochastic regressors as well as quite general deterministic processes in the data generating mechanism. It is found that IV regressions are consistent even when the instruments are stochastically independent of the regressors. This phenomenon, which contrasts with traditional theory for stationary time series, is a beneficial artifact of spurious regression theory whereby stochastic trends in the instruments ensure their relevance asymptotically. Problems of inference are also addressed and some promising new theoretical results are reported. These involve a class of Wald tests which are modified by semiparametric corrections for serial correlation and for endogeneity. The resulting test statistics which we term fully modified Wald tests have limiting chi-squared distributions, thereby removing the obstacles to inference in cointegrated systems that were presented by the nuisance parameter dependencies in earlier work. Interestingly, IV methods themselves are insufficient to achieve this end and an endogeneity correction is still generally required, again in contrast to traditional theory. Our results therefore provide strong support for the conclusion reached by Hendry (1986) that there is no free lunch in estimating cointegrated systems. Some simulation results are reported which seek to explore the sampling behavior of our suggested procedures. These simulations compare our fully modified (semiparametric) methods with the parametric error correction methodology that has been extensively used in recent empirical research and with conventional least squares regression. 
Both the fully modified and error correction methods work well in finite samples and the sampling performance of each procedure confirms the relevance of asymptotic distribution theory, as distinct from superconsistency results, in discriminating between different statistical methods.

213 citations


Journal ArticleDOI
TL;DR: In this paper, a statistical model for step-stress accelerated life test is proposed from the point of view that a change of the stress has a multiplicative effect on the failure rate function over the remaining life.
Abstract: A statistical model for step-stress accelerated life test is motivated from the point of view that a change of the stress has a multiplicative effect on the failure rate function over the remaining life. Properties of the proposed model, including an interpretation in terms of the conditional reliability, and relationships with the existing models are discussed. For the parametric setting of a Weibull family representing the life distribution under a constant stress, maximum likelihood estimation of the parameters is investigated and the Fisher information matrix is derived. The proposed model is found to have certain analytical advantages over the cumulative exposure model that is commonly used in step-stress analysis. An extension of the model to include a regression structure and inferences for life under the use condition stress are briefly discussed.

213 citations


Journal ArticleDOI
TL;DR: In this paper, a unified approach to the asymptotic theory of alternative test criteria for testing parametric restrictions is provided; the discussion develops within a general framework that distinguishes whether or not the fitting function is asymptotically optimal, and allows the null and alternative hypotheses to be only approximations of the true model.
Abstract: In the context of covariance structure analysis, a unified approach to the asymptotic theory of alternative test criteria for testing parametric restrictions is provided. The discussion develops within a general framework that distinguishes whether or not the fitting function is asymptotically optimal, and allows the null and alternative hypotheses to be only approximations of the true model. Also, the equivalent of the information matrix, and the asymptotic covariance matrix of the vector of summary statistics, are allowed to be singular. When the fitting function is not asymptotically optimal, test statistics which have asymptotically a chi-square distribution are developed as a natural generalization of more classical ones. Issues relevant for power analysis, and the asymptotic theory of a testing related statistic, are also investigated.

209 citations


Journal ArticleDOI
TL;DR: A new theory of spectral analysis for non-linear systems is introduced which consists of estimating the parameters in a NARMAX model description of the system and then computing the generalised frequency response functions directly from the estimated model.

Journal ArticleDOI
TL;DR: In this paper, the parametric sensitivity of the most likely failure point is studied and a formula is derived for the derivative of the unit normal vector to the failure surface at this point.
Abstract: New insight into parametric sensitivity in first-order reliability theory is given by studying the parametric sensitivity of the most likely failure point. A formula is derived for the derivative of the unit normal vector to the failure surface at this point. The derivative of the first-order reliability index is also a result of the analysis, and a geometrical interpretation of this result is given. The derivatives find applications, e.g., within structural systems reliability and sensitivity analysis.

Journal ArticleDOI
TL;DR: The paper covers parametric representation and smoothness, parametric continuity, reparameterization and equivalent parameterization, beta-constraints, and arc-length parameterization.
Abstract: Some of the important basic results on geometric continuity of curves are presented in a self-contained manner. The paper covers parametric representation and smoothness, parametric continuity, reparameterization and equivalent parameterization, beta-constraints, and arc-length parameterization.

Journal ArticleDOI
TL;DR: In this paper, the MLE is characterized for the semiparametric model in which the truncation mechanism is parameterized, and the large-sample properties of the estimate are established.
Abstract: For randomly censored data, it is known that the maximum likelihood estimate (MLE) of the survival curve is not affected by parametric assumption on the censoring variable. The Kaplan-Meier (1958) estimate is the MLE for both nonparametric and semiparametric models. For randomly truncated data, the truncation product-limit estimate is the MLE for nonparametric models. This is not the case if the truncation mechanism is parameterized, however. Specifically, let X be a generic random variable and T be the truncation variable. If the distribution of T is parameterized and the distribution of X is left unspecified, it can be shown that the truncation product-limit estimate is not the MLE for this semiparametric model, even though it is for the fully nonparametric model. In this article the MLE is characterized for the semiparametric model, and the large-sample properties of the estimate are established. The results show that, unlike censoring, the parametric information from the truncation mechanism ...
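For the censoring case mentioned at the start of the abstract, the Kaplan–Meier product-limit estimate has a compact closed form. The sketch below implements it for a tiny invented sample; the paper's actual contribution, the MLE under parameterized truncation, is more involved:

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the survival curve S(t).
    events[i] is 1 for an observed failure, 0 for a censored time."""
    data = sorted(zip(times, events))
    n = len(data)
    s = 1.0
    curve = []  # (time, S(t)) at each distinct failure time
    i = 0
    while i < n:
        t = data[i][0]
        at_risk = n - i          # subjects with observed time >= t
        d = 0                    # failures exactly at t
        while i < n and data[i][0] == t:
            d += data[i][1]
            i += 1
        if d:
            s *= 1 - d / at_risk
            curve.append((t, s))
    return curve

# Tiny invented sample: failures at times 1, 2, 4; censored at 3, 5.
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
print(curve)
```

Each censored observation leaves the survival estimate unchanged but shrinks the risk set for later failure times, which is exactly why parametric assumptions on the censoring variable do not affect the MLE.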

Journal ArticleDOI
TL;DR: The robustness of the multivariate test of Gibbons, Ross, and Shanken (1986) to nonnormalities in the residual covariance matrix is examined after considering the relative performance of various tests of normality.
Abstract: The robustness of the multivariate test of Gibbons, Ross, and Shanken (1986) to nonnormalities in the residual covariance matrix is examined. After considering the relative performance of various tests of normality, simulation techniques are used to determine the effects of nonnormalities on the multivariate test. It is found that, where the sample nonnormalities are severe, the size and/or power of the test can be seriously misstated. However, it is also shown that these extreme sample values may overestimate the population parameters. Hence, we conclude that the multivariate test is reasonably robust with respect to typical levels of nonnormality. In traditional hypothesis testing, a nonrandom test maps the values of a random variable into a sample space dichotomized into regions where a hypothesis is either accepted or rejected. There are three possible outcomes from this process: (1) a correct decision, (2) a false rejection (Type I error), or (3) failure to reject the hypothesis when it is false (Type II error). Of the latter two types of error, an error of the first kind is usually considered less desirable. So typically, a level of significance is selected with low probability of Type I error (e.g., 0.05 or 0.01), and a test is chosen so as to maximize power (minimize probability of Type II error) for the specified level of Type I error. Knowledge of the relative level of these two errors is critical in assimilating the results of an experiment. For example, if the power of a test is equal to its significance level (i.e., a weak test), rejection of the null contains zero information. Additionally, in constructing parametric tests of hypotheses, it is necessary to assume some distribution for the underlying data. Consequently, when using parametric tests, rejection of the null is only equivalent to rejection of at least one of the underlying hypotheses (i.e., the null hypothesis or the distributional assumption). Interestingly, the size (significance level) and power of procedures used to test...

Journal ArticleDOI
TL;DR: In this article, a review of known results on prediction intervals for univariate distributions is presented, including results for parametric continuous and discrete distributions as well as those based on distribution-free methods.
Abstract: This review covers some known results on prediction intervals for univariate distributions. Results for parametric continuous and discrete distributions as well as those based on distribution-free methods are included. Prediction intervals based on Bayesian and sequential methods are not covered. Methods of construction of prediction intervals and other related problems are discussed.
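As a minimal example of the intervals being reviewed, the normal-theory prediction interval for one future observation is x̄ ± t·s·√(1 + 1/n). The sketch below substitutes a normal quantile for the exact Student-t quantile (a large-n simplification) and uses invented data:

```python
from statistics import NormalDist, mean, stdev

# Invented measurements; we want an interval for the NEXT observation,
# not for the mean -- hence the extra "+1" under the square root.
sample = [9.8, 10.1, 10.3, 9.9, 10.0, 10.2, 9.7, 10.4]
n = len(sample)
xbar, s = mean(sample), stdev(sample)

z = NormalDist().inv_cdf(0.975)       # ~1.96; exact theory uses t_{n-1}
half = z * s * (1 + 1 / n) ** 0.5
lo, hi = xbar - half, xbar + half
print(lo, hi)
```

Because it targets a single future draw rather than a parameter, the interval is wider than the corresponding confidence interval for the mean and does not shrink to zero as n grows.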

Journal ArticleDOI
TL;DR: A new algorithm for computing the multivariable stability margin to check the robust stability of feedback systems with real parametric uncertainty is proposed, which eliminates the need for the frequency search involved in the algorithm of [1,2,3].
Abstract: An alternative implementation of an algorithm for the computation of the multivariable stability margin to check the robust stability of feedback systems with real parametric uncertainty is proposed. This method eliminates the need for the frequency search by reducing it to the testing of a finite number of conditions. These conditions have a special structure that allows a significant improvement in the speed of computation.

Proceedings ArticleDOI
04 Jun 1989
TL;DR: Segmentation using boundary finding is enhanced both by considering the boundary as a whole and by using model-based shape information; the problem is solved as an optimization that finds the best match between the boundary, as defined by the parameter vector, and the image data.
Abstract: Segmentation using boundary finding is enhanced both by considering the boundary as a whole and by using model-based shape information. Flexible constraints, in the form of a probabilistic deformable model, are applied to the problem of segmenting natural objects whose diversity and irregularity of shape make them poorly represented in terms of fixed features of forms. The parametric model is based on the elliptic Fourier decomposition of the boundary. The segmentation problem is solved as an optimization problem, where the best match between the boundary (as defined by the parameter vector) and the image data is found. Initial experimentation shows good results on a variety of images.

Journal ArticleDOI
TL;DR: In this article, the effect of simulation order on the level accuracy and power of Monte Carlo tests has been discussed, and it is shown that if the level of a Monte Carlo test is known only nominally, not precisely, then the level error of the test is an order of magnitude less than that of the corresponding asymptotic test.
Abstract: We discuss the effect of simulation order on level accuracy and power of Monte Carlo tests, in a very general setting. Both parametric problems, with or without nuisance parameters, and nonparametric problems are treated by a single unifying argument. It is shown that if the level of a Monte Carlo test is known only nominally, not precisely, then the level error of a Monte Carlo test is an order of magnitude less than that of the corresponding asymptotic test. This result is available whenever the test statistic is asymptotically pivotal, even if the number of simulations is held fixed as the sample size n increases. It implies that Monte Carlo methods are a real alternative to asymptotic methods. We also show that, even if the number of simulations is held fixed, a Monte Carlo test is able to distinguish between the null hypothesis and alternative hypotheses distant n^{-1/2} from the null.
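A Monte Carlo test with a fixed simulation size B is simple to state in code. The sketch below (invented toy problem, not from the paper) tests H0: mean 0 for unit-variance normal data using the asymptotically pivotal statistic |x̄|, with the standard p-value (1 + #exceedances)/(B + 1):

```python
import random

def monte_carlo_p(stat, data, simulate_null, B=199, seed=1):
    """Monte Carlo p-value with a fixed number of simulations B."""
    rng = random.Random(seed)
    t_obs = stat(data)
    exceed = sum(stat(simulate_null(rng, len(data))) >= t_obs
                 for _ in range(B))
    return (1 + exceed) / (B + 1)

# Statistic and null sampler for H0: mean 0, unit variance.
stat = lambda xs: abs(sum(xs) / len(xs))
simulate_null = lambda rng, n: [rng.gauss(0, 1) for _ in range(n)]

rng = random.Random(42)
data = [rng.gauss(0.8, 1) for _ in range(50)]  # true mean 0.8: H0 false
p = monte_carlo_p(stat, data, simulate_null, B=199)
print(p)
```

With B = 199 the test has exact level at multiples of 1/200 under H0, which is the sense in which the Monte Carlo level error beats the asymptotic approximation even for small fixed B.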

Journal ArticleDOI
TL;DR: In this paper, the interpretation and properties of the non-linear frequency response functions are discussed and illustrated by example, and new methods of parametric spectral analysis for a wide class of non-linear systems are introduced.

Journal ArticleDOI
TL;DR: A formal methodology for IC parametric performance testing, called predictive subset testing, is presented; based on a statistical model of parametric process variation, it reduces test complexity and cost.
Abstract: A formal methodology for IC parametric performance testing, called predictive subset testing, is presented. It is based on a statistical model of parametric process variation. In this Monte-Carlo-based approach, a statistical process simulation is used together with circuit simulation to determine the joint probability distribution of a set of circuit performances. By evaluating the joint probability distribution, rather than assuming the performances to be independent, correlations that exist between them are used to reduce the number of performances that need to be explicitly tested. Once a subset of performances for explicit testing has been identified, regression models are constructed for the untested performances, and from the confidence intervals test limits are assigned for the tested performances. In this manner, the values of the untested performances within desired quality levels are predicted, reducing test complexity and cost.

Journal ArticleDOI
TL;DR: It is shown that the algorithms converge to the unknown characteristic in a pointwise manner and that the mean integrated square error converges to zero as the number of observations tends to infinity.
Abstract: The non-linearity in a discrete system governed by the Hammerstein functional is identified. The system is driven by a random white input signal and the output is disturbed by a random white noise. No parametric a priori information concerning the non-linearity is available and non-parametric algorithms are proposed. The algorithms are derived from the trigonometric as well as Hermite orthogonal series. It is shown that the algorithms converge to the unknown characteristic in a pointwise manner and that the mean integrated square error converges to zero as the number of observations tends to infinity. The rate of convergence is examined. A numerical example is also given.
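The trigonometric-series idea can be sketched for the static (memoryless) part alone: with a uniform white input on [-π, π], the Fourier coefficients of the unknown characteristic reduce to sample averages of the input-output pairs. Everything below (the test nonlinearity, noise level, truncation point) is invented for illustration:

```python
import math
import random

rng = random.Random(0)
g = lambda x: math.sin(x) + 0.5 * math.cos(2 * x)  # "unknown" non-linearity

# Uniform white input on [-pi, pi]; output disturbed by white noise.
n = 5000
xs = [rng.uniform(-math.pi, math.pi) for _ in range(n)]
ys = [g(x) + rng.gauss(0, 0.2) for x in xs]

K = 8  # series truncation; in the theory K grows with n

# Empirical Fourier coefficients: for uniform input, the integrals that
# define a_k and b_k become plain averages over the observed (x_i, y_i).
a = [sum(ys) / n] + [2 / n * sum(y * math.cos(k * t) for y, t in zip(ys, xs))
                     for k in range(1, K + 1)]
b = [0.0] + [2 / n * sum(y * math.sin(k * t) for y, t in zip(ys, xs))
             for k in range(1, K + 1)]

def g_hat(x):
    """Truncated orthogonal-series estimate of the non-linearity."""
    return a[0] + sum(a[k] * math.cos(k * x) + b[k] * math.sin(k * x)
                      for k in range(1, K + 1))

print(g_hat(1.0), g(1.0))
```

No parametric form for g is assumed anywhere; the estimate improves pointwise as n (and with it K) grows, which is the convergence mode the abstract describes.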

Journal ArticleDOI
TL;DR: In this article, simple formulae for determining the seismic demand in SDOF systems with natural periods in the medium and long-period range are proposed, which can be used to construct design spectra of the Newmark-Hall type.
Abstract: Based on the results of an extensive parametric study of elastic and inelastic response of SDOF systems, in which the most important structural parameters were varied and ground motions of very different characteristics were taken into account, simple formulae for determining the seismic demand in SDOF systems with natural periods in the medium- and long-period range are proposed. Seismic demand is expressed in terms of the mean values of maximum relative displacements and maximum input energy. These results can be used to provide rough estimates of structural behaviour when different damage models are applied. As well as this, the proposed formulae can be used to construct design spectra of the Newmark-Hall type.

Book ChapterDOI
TL;DR: In this article, the power of various tests for the random walk hypothesis against AR(1) alternatives when the sampling interval is allowed to vary is analyzed for a grid of values of the number of observations and the span of the data available (hence for various sampling intervals).
Abstract: This paper analyzes the power of various tests for the random walk hypothesis against AR(1) alternatives when the sampling interval is allowed to vary. The null and alternative hypotheses are set in terms of the parameters of a continuous time model. The discrete time representations are derived and it is shown how they depend on the sampling interval. The power is simulated for a grid of values of the number of observations and the span of the data available (hence for various sampling intervals). Various test statistics are considered among the following classes: (a) test for a unit root on the original series and (b) tests for randomness in the differenced series. Among class (b), we consider both parametric and nonparametric tests, the latter including tests based on the rank of the first-differenced series. The paper therefore not only provides information as to the relative power of these tests but also about their properties when the sampling interval varies. This work is an extension of Perron (1987) and Shiller and Perron (1985).

Journal ArticleDOI
TL;DR: In this article, the estimation of soil hydraulic and transport parameters from transient unsaturated flow and tracer experiments using a combined simulation-optimization approach is dealt with. Hydraulic properties are defined by a modified form of van Genuchten's (1980) parametric model for two-phase permeability-saturation-pressure relations, and transport properties are defined by an empirical parametric dispersion model.
Abstract: This paper deals with the estimation of soil hydraulic and transport parameters from transient unsaturated flow and tracer experiments using a combined simulation-optimization approach. Hydraulic properties are defined by a modified form of van Genuchten's (1980) parametric model for two-phase permeability-saturation-pressure relations, and transport properties are defined by an empirical parametric dispersion model. A nonlinear weighted least squares algorithm is used to estimate unknown model parameters by minimizing deviations between concentrations, water contents, and pressure heads obtained from hypothetical infiltration/redistribution/evaporation experiments, and those predicted by solving a numerical model of coupled unsaturated flow and transport. Simultaneous estimation of hydraulic and transport properties is found to yield smaller estimation errors for model parameters than sequential inversion of hydraulic properties from water content and pressure head data followed by inversion for transport properties from concentration data. Effects of random noise in data measurements, soil layering, and choice of improper parametric model on the parameter estimation process are discussed.

Journal ArticleDOI
TL;DR: Using a parametric approach, duality is presented for a minimax fractional programming problem that involves several ratios in the objective function.
Abstract: Using a parametric approach, duality is presented for a minimax fractional programming problem that involves several ratios in the objective function.
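The parametric approach to fractional programming can be illustrated in its classical single-ratio form (Dinkelbach's scheme): maximizing f/g is reduced to a sequence of parametric subproblems max f − λg. The feasible set and numbers below are invented, and the paper's minimax, several-ratio setting is more general than this sketch:

```python
# Dinkelbach-style parametric scheme for maximizing a single ratio f/g
# (with g > 0) over a small finite feasible set.
points = [(1.0, 2.0), (3.0, 2.5), (2.0, 1.0), (4.0, 5.0)]  # (f(x), g(x))

lam = 0.0
for _ in range(50):
    # Parametric subproblem: maximize f(x) - lam * g(x).
    f, g = max(points, key=lambda p: p[0] - lam * p[1])
    if abs(f - lam * g) < 1e-12:
        break          # subproblem optimum is 0: lam is the optimal ratio
    lam = f / g        # otherwise update lam and repeat

print(lam)
```

The iteration stops when the parametric subproblem's optimal value hits zero, at which point λ equals the maximal ratio; here that is 2.0, attained at the point (2.0, 1.0).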

Journal ArticleDOI
TL;DR: The paper reviews theory and approaches that enable one to treat model parameters as uncertain random variables in hydrologic modeling; several aspects of the topic are introduced but not treated in great mathematical detail.
Abstract: In most applications, parametric hydrologic models are treated as deterministic. That is, once the parameters are determined, the model always produces the same outputs for a given set of inputs. Parameters are generally treated as unknown constants; once they are estimated, they are taken as constant. In fact, model parameters are generally estimated from observed data or from relationships that have been derived from observed data. The observed data generally include some stochastic variables such as rainfall and runoff. Based on the premise that any function of a random variable is itself a random variable, this paper reviews theory and approaches enabling one to treat model parameters as uncertain random variables. The report is a review of parametric uncertainty in hydrologic modeling. Several aspects of the topic are introduced but not treated in great mathematical detail. Many references to additional work are given that will enable the interested reader to pursue various aspects of the topic in detail.

Journal ArticleDOI
TL;DR: In this article, it is shown that the relationship of flood magnitude to frequency of occurrence can be estimated from observed annual flood data either by the parametric method of fitting any of various theoretical distributions (e.g., Log-Pearson Type III) to the data, or by the nonparametric method, which does not require a distributional assumption.

Journal ArticleDOI
TL;DR: In this article, the reliability with respect to plastic collapse of rigid frame and truss structures is estimated by directional simulation, and the results from methods of identifying likely plastic mechanisms can be used to establish an importance sampling whereby the sample size can be reduced without loss of accuracy.
Abstract: The reliability with respect to plastic collapse of rigid frame and truss structures is estimated by directional simulation. This conditional expectation simulation method is especially attractive for this problem because the mechanical computation for each sample is the same as the computation required in a standard Monte Carlo simulation. For each simulation a mathematical programming problem must be solved. Under suitable conditions this programming problem simplifies to a linear one. It is shown how results from methods of identifying likely plastic mechanisms can be used to establish an importance sampling scheme whereby the sample size can be reduced without loss of accuracy. Finally, it is shown how parametric sensitivity can be estimated by directional simulation.

Journal ArticleDOI
TL;DR: A stochastic version of a knowledge space is developed, in which the knowledge states are considered as possible epochs in a subject's learning history, and an application of the model to artificial data is described, based on maximum likelihood methods.
Abstract: To capture the cognitive organization of a set of questions or problems pertaining to a body of information, Doignon and Falmagne have proposed, and analyzed in a number of papers, the concept of a knowledge space, that is, a distinguished collection of subsets of questions, representing the possible knowledge states. This collection of sets is assumed to satisfy a number of conditions. Since this concept is a deterministic one, the problem of empirical testing arises. A stochastic version of a knowledge space is developed in this paper, in which the knowledge states are considered as possible epochs in a subject's learning history. The knowledge space is decomposed as a union of a number of possible learning paths, called gradations. The model specifies how a subject is channelled through and progresses along a gradation. A probabilistic axiom of the “local independence” type relates the knowledge states to the observable responses. The predictions of this model are worked out in detail in the case of parametric assumptions involving gamma distributions. An application of the model to artificial data is described, based on maximum likelihood methods. The statistical analysis is shown to be capable of revealing the combinatoric core of the model.