
Showing papers on "Parametric statistics" published in 1997


Journal ArticleDOI
TL;DR: This basis function method (BFM) is compared with conventional nonlinear least squares estimation of parameters (NLM) and is shown to be more stable than NLM at the voxel level and is computationally much faster.

1,056 citations


Journal ArticleDOI
TL;DR: The occurrence of time‐locked activations is formulated in terms of the general linear model, i.e., multiple linear regression, which permits the use of established statistical techniques that correct for multiple comparisons in the context of spatially smooth and serially correlated data.
Abstract: We present a method for detecting event-related responses in functional magnetic resonance imaging (fMRI). The occurrence of time-locked activations is formulated in terms of the general linear model, i.e., multiple linear regression. This permits the use of established statistical techniques that correct for multiple comparisons in the context of spatially smooth and serially correlated data. Responses are modelled using event-related temporal basis functions. Inferences are then made about all components of the model, using the F-ratio at all voxels in the image, to produce a statistical parametric map (SPM{F}). This method allows for the experimental design to relate the timing of events to the acquisition of data to give a temporal resolution (with respect to the event-related response) far better than the scanning repeat time.

688 citations
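As a hedged illustration of the approach, the sketch below builds a design matrix from event-related temporal basis functions and computes a voxel-level F-statistic for the full model against an intercept-only model. The HRF shape, onsets, and data are synthetic placeholders, and the sketch ignores the serial-correlation corrections the paper addresses.

```python
import numpy as np
from scipy import stats

TR, n_scans = 2.0, 200
t_hrf = np.arange(0, 32, TR)

def hrf(t, peak=6.0):
    # Crude gamma-shaped haemodynamic response (illustrative only).
    h = (t / peak) ** 2 * np.exp(-t / peak)
    return h / h.max()

# Stick function for event onsets (seconds), sampled on the TR grid.
onsets = np.arange(10, 380, 30.0)
sticks = np.zeros(n_scans)
sticks[(onsets / TR).astype(int)] = 1.0

# Two temporal basis functions: canonical response and its derivative.
b1 = np.convolve(sticks, hrf(t_hrf))[:n_scans]
b2 = np.gradient(b1)
X = np.column_stack([b1, b2, np.ones(n_scans)])   # full model
X0 = np.ones((n_scans, 1))                        # reduced model (intercept)

# One simulated voxel time series: scaled response plus white noise.
rng = np.random.default_rng(0)
y = 2.0 * b1 + rng.normal(0.0, 1.0, n_scans)

def rss(M, v):
    beta, *_ = np.linalg.lstsq(M, v, rcond=None)
    r = v - M @ beta
    return r @ r

p1, p0 = X.shape[1], X0.shape[1]
F = ((rss(X0, y) - rss(X, y)) / (p1 - p0)) / (rss(X, y) / (n_scans - p1))
print(f"F({p1 - p0},{n_scans - p1}) = {F:.1f}, "
      f"p = {stats.f.sf(F, p1 - p0, n_scans - p1):.2e}")
```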


Journal ArticleDOI
TL;DR: This paper considers the adaptive robust control of a class of SISO nonlinear systems in a semi-strict feedback form and develops a systematic way to combine backstepping adaptive control with deterministic robust control.

671 citations


Book
01 Jan 1997
TL;DR: This chapter discusses duality Theory for Linear Optimization, a Polynomial Algorithm for the Skew-Symmetric Model, and Parametric and Sensitivity Analysis, as well as implementing Interior Point Methods.
Abstract: Partial table of contents: INTRODUCTION: THEORY AND COMPLEXITY. Duality Theory for Linear Optimization. A Polynomial Algorithm for the Skew-Symmetric Model. Solving the Canonical Problem. THE LOGARITHMIC BARRIER APPROACH. The Dual Logarithmic Barrier Method. Initialization. THE TARGET-FOLLOWING APPROACH. The Primal-Dual Newton Method. Application to the Method of Centers. MISCELLANEOUS TOPICS. Karmarkar's Projective Method. More Properties of the Central Path. Partial Updating. High-Order Methods. Parametric and Sensitivity Analysis. Implementing Interior Point Methods. Appendices. Bibliography. Indexes.

554 citations


Journal ArticleDOI
TL;DR: In this article, the Kullback-Leibler Information Criterion is used for weakly dependent data generating mechanisms, and conditions are derived under which the large sample properties of this estimator are similar to GMM, i.e., the estimator will be consistent and asymptotically normal, with the same covariance matrix as GMM.
Abstract: While optimally weighted GMM estimation has desirable large sample properties, its small sample performance is poor in some applications. We propose a computationally simple alternative, for weakly dependent data generating mechanisms, based on minimization of the Kullback-Leibler Information Criterion. Conditions are derived under which the large sample properties of this estimator are similar to GMM, i.e., the estimator will be consistent and asymptotically normal, with the same asymptotic covariance matrix as GMM. In addition, we propose overidentifying and parametric restrictions tests as alternatives to analogous GMM procedures.

545 citations
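The following is a minimal sketch of this kind of KLIC-minimizing (exponential-tilting) estimator for i.i.d. data with one overidentifying moment condition; the moment functions, toy data, and nested-optimization strategy are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = rng.normal(1.5, 1.0, size=500)

def g(theta):
    # n x 2 moment conditions: E[x] = theta, E[x^2] = theta^2 + 1.
    return np.column_stack([x - theta, x**2 - theta**2 - 1.0])

def inner(theta):
    # Inner (convex) problem: min over lambda of mean(exp(G @ lambda)).
    G = g(theta)
    res = minimize(lambda lam: np.mean(np.exp(G @ lam)),
                   np.zeros(2), method="BFGS")
    return res.fun

# Outer problem: the KLIC-type estimator minimizes the inner value.
res = minimize(lambda th: inner(th[0]), x0=[x.mean()], method="Nelder-Mead")
print(f"exponential-tilting estimate: {res.x[0]:.3f}")
```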


Journal ArticleDOI
TL;DR: In this paper, the authors describe parametric curve-fitting methods for modeling extreme fire insurance losses, which revolve around the generalized Pareto distribution and are supported by extreme value theory.
Abstract: Good estimates for the tails of loss severity distributions are essential for pricing or positioning high-excess loss layers in reinsurance. We describe parametric curve-fitting methods for modelling extreme historical losses. These methods revolve around the generalized Pareto distribution and are supported by extreme value theory. We summarize relevant theoretical results and provide an extensive example of their application to Danish data on large fire insurance losses.

528 citations
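A minimal peaks-over-threshold sketch in this spirit, assuming SciPy's genpareto and synthetic heavy-tailed "losses"; the threshold choice and data are illustrative, not the Danish fire data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
losses = stats.lomax.rvs(c=1.5, scale=10.0, size=2000, random_state=rng)

u = np.quantile(losses, 0.90)          # threshold for the tail
exceedances = losses[losses > u] - u

# Fit the generalized Pareto distribution to threshold exceedances.
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)
print(f"threshold u = {u:.2f}, GPD shape = {shape:.3f}, scale = {scale:.3f}")

# Implied estimate of a high quantile (e.g., the 99.9% loss level).
p, n, n_u = 0.999, len(losses), len(exceedances)
q = u + stats.genpareto.ppf(1 - (1 - p) * n / n_u, shape, scale=scale)
print(f"estimated {p:.1%} quantile: {q:.1f}")
```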


Book
01 Jan 1997
TL;DR: Preliminary data analysis basic concepts and definitions statistical properties of distributions probability models model estimation and testing regression and forecasting multivariate statistical methods extreme value analysis stochastic models of processes simulaton risk and reliability analysis reliability design decision methods.
Abstract: Preliminary data analysis basic concepts and definitions statistical properties of distributions probability models model estimation and testing regression and forecasting multivariate statistical methods extreme value analysis stochastic models of processes simulaton risk and reliability analysis reliability design decision methods mathematical addendum annotated references and related readings tables of parametric families of probability models annotated software for statistical data analysis

381 citations


Journal ArticleDOI
TL;DR: In this article, a new method for parametric uncertainty analysis of numerical geophysical models is presented, which approximates model response surfaces (functions of the model input parameters) using orthogonal polynomials whose weighting functions are the probability density functions of the uncertain input parameters.
Abstract: A new method for parametric uncertainty analysis of numerical geophysical models is presented. It approximates model response surfaces, which are functions of the model input parameters, using orthogonal polynomials whose weighting functions are the probability density functions (PDFs) of the uncertain input parameters. This approach has been applied to the uncertainty analysis of an analytical model of the direct radiative forcing by anthropogenic sulfate aerosols, which has nine uncertain parameters. The method is shown to generate PDFs of the radiative forcing which are very similar to the exact analytical PDF. Compared with the Monte Carlo method for this problem, the new method is a factor of 25 to 60 times faster, depending on the error tolerance, and exhibits an exponential decrease of error with increasing order of the approximation.

360 citations
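A one-parameter sketch of the idea, assuming a Gaussian uncertain input so that the matching orthogonal polynomials are (probabilists') Hermite polynomials; the toy model and node count are illustrative, not the nine-parameter sulfate-forcing model.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def model(p):
    return np.exp(-0.5 * p) * (1.0 + 0.3 * p**2)   # toy response surface

mu, sigma = 1.0, 0.25                # uncertain parameter ~ N(mu, sigma^2)

# Gauss-Hermite (probabilists') quadrature: weight function exp(-x^2/2).
nodes, weights = hermegauss(8)
weights = weights / weights.sum()    # normalize to a probability measure
vals = model(mu + sigma * nodes)
mean_q = weights @ vals
var_q = weights @ (vals - mean_q) ** 2

# Monte Carlo reference needs many more model evaluations.
rng = np.random.default_rng(3)
mc = model(rng.normal(mu, sigma, 100_000))
print(f"quadrature : mean={mean_q:.5f} var={var_q:.2e}  (8 model runs)")
print(f"monte carlo: mean={mc.mean():.5f} var={mc.var():.2e} (100000 runs)")
```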


Proceedings ArticleDOI
02 Nov 1997
TL;DR: A parametric statistical model for visual images in the wavelet transform domain is presented and the joint densities of coefficient magnitudes at adjacent spatial locations, adjacent orientations, and adjacent spatial scales are characterized.
Abstract: We present a parametric statistical model for visual images in the wavelet transform domain. We characterize the joint densities of coefficient magnitudes at adjacent spatial locations, adjacent orientations, and adjacent spatial scales. The model accounts for the statistics of a wide variety of visual images. As a demonstration of this, we used the model to design a progressive image encoder with state-of-the-art rate-distortion performance. We also show promising examples of image restoration and texture synthesis.

333 citations
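One of the joint statistics the model exploits, correlation between coefficient magnitudes at adjacent scales, can be probed with a few lines of code. The sketch below assumes the PyWavelets package and a synthetic edge image rather than the paper's test images.

```python
import numpy as np
import pywt

rng = np.random.default_rng(4)
# Piecewise-smooth synthetic image with a diagonal edge.
x = np.linspace(0, 1, 256)
img = (np.add.outer(x, x) > 1.0).astype(float) + 0.05 * rng.normal(size=(256, 256))

coeffs = pywt.wavedec2(img, "db4", level=3)
# Horizontal-detail bands at two adjacent scales.
child = np.abs(coeffs[2][0])       # finer scale (H detail)
parent = np.abs(coeffs[1][0])      # coarser scale (H detail)
# Upsample the parent band to align it with its children.
parent_up = np.kron(parent, np.ones((2, 2)))[:child.shape[0], :child.shape[1]]

r = np.corrcoef(child.ravel(), parent_up.ravel())[0, 1]
print(f"magnitude correlation across adjacent scales: r = {r:.2f}")
```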


Journal ArticleDOI
TL;DR: A new sampling technique is presented that generates and inverts the Hammersley points to provide a representative sample for multivariate probability distributions and is compared to a sample obtained from a Latin hypercube design by propagating it through a set of nonlinear functions.
Abstract: The basic setting of this article is that of parameter-design studies using data from computer models. A general approach to parameter design is introduced by coupling an optimizer directly with the computer simulation model using stochastic descriptions of the noise factors. The computational burden of these approaches can be extreme, however, and depends on the sample size used for characterizing the parametric uncertainties. In this article, we present a new sampling technique that generates and inverts the Hammersley points (a low-discrepancy design for placing n points uniformly in a k-dimensional cube) to provide a representative sample for multivariate probability distributions. We compare the performance of this to a sample obtained from a Latin hypercube design by propagating it through a set of nonlinear functions. The number of samples required to converge to the mean and variance is used as a measure of performance. The sampling technique based on the Hammersley points requires far fewer samples.

309 citations
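A hedged sketch of the construction: Hammersley points pair an equispaced first coordinate with radical-inverse (van der Corput) coordinates in successive prime bases, and the inversion step maps them through marginal quantile functions. Function names are illustrative.

```python
import numpy as np
from scipy import stats

def radical_inverse(i, base):
    # Van der Corput radical inverse of the integer i in the given base.
    inv, f = 0.0, 1.0 / base
    while i > 0:
        inv += f * (i % base)
        i //= base
        f /= base
    return inv

def hammersley(n, k, primes=(2, 3, 5, 7, 11, 13)):
    # n Hammersley points in the k-dimensional unit cube.
    pts = np.empty((n, k))
    pts[:, 0] = (np.arange(n) + 0.5) / n
    for d in range(1, k):
        pts[:, d] = [radical_inverse(i + 1, primes[d - 1]) for i in range(n)]
    return pts

# "Inversion": push the uniform points through marginal quantile functions,
# here three independent standard normals.
z = stats.norm.ppf(hammersley(200, 3))
print("means    :", z.mean(axis=0).round(3))
print("variances:", z.var(axis=0).round(3))
```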


Journal ArticleDOI
TL;DR: In this paper, the authors study the problem of H∞-control for linear systems with Markovian jumping parameters and parameter uncertainties, where the uncertainties are assumed to be real, time-varying, norm-bounded, and to appear in the state matrix.
Abstract: This paper studies the problem of H∞-control for linear systems with Markovian jumping parameters. The jumping parameters considered here are two separable continuous-time, discrete-state Markov processes, one appearing in the system matrices and one appearing in the control variable. Our attention is focused on the design of linear state feedback controllers such that both stochastic stability and a prescribed H∞-performance are achieved. We also deal with the robust H∞-control problem for linear systems with both Markovian jumping parameters and parameter uncertainties. The parameter uncertainties are assumed to be real, time-varying, norm-bounded, appearing in the state matrix. Both the finite-horizon and infinite-horizon cases are analyzed. We show that the control problems for linear Markovian jumping systems with and without parameter uncertainties can be solved in terms of the solutions to a set of coupled differential Riccati equations for the finite-horizon case or algebraic Riccati equations for the infinite-horizon case. In particular, robust H∞-controllers are also designed when the jumping rates have parameter uncertainties.
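For the infinite-horizon case, a hedged numerical sketch is possible: iterate mode-wise algebraic Riccati solves while holding the Markov coupling term fixed. The two-mode system below is a toy stand-in, the fixed-point scheme is one standard way to handle the coupling (not necessarily the paper's), and it solves a quadratic-cost stabilization variant rather than the full H∞ game equations.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Two-mode toy system: x' = A_i x + B_i u while the Markov chain is in mode i.
A = [np.array([[0.0, 1.0], [-1.0, -0.5]]),
     np.array([[0.0, 1.0], [-2.0, 0.2]])]
B = [np.array([[0.0], [1.0]])] * 2
Q = [np.eye(2)] * 2
R = [np.eye(1)] * 2
Lam = np.array([[-1.0, 1.0],      # transition-rate matrix (rows sum to 0)
                [2.0, -2.0]])

P = [np.zeros((2, 2)) for _ in range(2)]
for sweep in range(200):
    P_new = []
    for i in range(2):
        # Fold lam_ii P_i into A_i and the cross-mode coupling into Q_i,
        # then solve a standard ARE for this mode.
        coupling = sum(Lam[i, j] * P[j] for j in range(2) if j != i)
        Ai = A[i] + 0.5 * Lam[i, i] * np.eye(2)
        P_new.append(solve_continuous_are(Ai, B[i], Q[i] + coupling, R[i]))
    delta = max(np.abs(P_new[k] - P[k]).max() for k in range(2))
    P = P_new
    if delta < 1e-10:
        break

print("mode-1 coupled-Riccati solution:\n", P[0].round(4))
```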

Journal ArticleDOI
TL;DR: In this paper, a conditional Kolmogorov test of model specification for parametric models with covariates (regressors) is proposed, based on the goodness-of-fit test for distribution functions.
Abstract: This paper introduces a conditional Kolmogorov test of model specification for parametric models with covariates (regressors). The test is an extension of the Kolmogorov test of goodness-of-fit for distribution functions. The test is shown to have power against 1/√n local alternatives and all fixed alternatives to the null hypothesis. A parametric bootstrap procedure is used to obtain critical values for the test.
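A minimal sketch of the parametric-bootstrap step, shown for the simpler no-covariate case (a Kolmogorov statistic with estimated parameters); the paper's conditional statistic additionally involves the regressors.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
x = rng.normal(2.0, 1.5, size=100)

def ks_stat(sample):
    mu, sd = sample.mean(), sample.std(ddof=1)   # estimate under the null
    return stats.kstest(sample, "norm", args=(mu, sd)).statistic

t_obs, n = ks_stat(x), len(x)

# Parametric bootstrap: simulate from the fitted null, re-estimate, recompute.
B = 999
mu_hat, sd_hat = x.mean(), x.std(ddof=1)
t_boot = np.array([ks_stat(rng.normal(mu_hat, sd_hat, n)) for _ in range(B)])
p_value = (1 + np.sum(t_boot >= t_obs)) / (B + 1)
print(f"KS statistic = {t_obs:.3f}, bootstrap p-value = {p_value:.3f}")
```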

Proceedings ArticleDOI
Bin Yao1
10 Dec 1997
TL;DR: In this article, a general framework is proposed for the design of high-performance robust controllers based on the recently proposed adaptive robust control (ARC), which attenuates the effect of model uncertainties as much as possible while learning mechanisms such as parameter adaptation are used to reduce the model uncertainties.
Abstract: A general framework is proposed for the design of a new class of high-performance robust controllers based on the recently proposed adaptive robust control (ARC). Robust filter structures are used to attenuate the effect of model uncertainties as much as possible while learning mechanisms such as parameter adaptation are used to reduce the model uncertainties. Under the proposed general framework, a simple new ARC controller is also constructed for a class of nonlinear systems transformable to a semi-strict feedback form. The new design utilizes the popular discontinuous projection method in solving the conflicts between the deterministic robust control design and the adaptive control design. The controller achieves a guaranteed transient performance and a prescribed final tracking accuracy in the presence of both parametric uncertainties and uncertain nonlinearities while achieving asymptotic stability in the presence of parametric uncertainties without using a discontinuous control law or infinite-gain feedback.

Proceedings ArticleDOI
17 Jun 1997
TL;DR: This work presents a variant of the EM algorithm that can segment image sequences by fitting multiple smooth flow fields to the spatiotemporal data and shows how the estimation of a single smooth flow field can be performed in closed form, thus making the multiple model estimation computationally feasible.
Abstract: Grouping based on common motion, or "common fate" provides a powerful cue for segmenting image sequences. Recently a number of algorithms have been developed that successfully perform motion segmentation by assuming that the motion of each group can be described by a low dimensional parametric model (e.g. affine). Typically the assumption is that motion segments correspond to planar patches in 3D undergoing rigid motion. Here we develop an alternative approach, where the motion of each group is described by a smooth dense flow field and the stability of the estimation is ensured by means of a prior distribution on the class of flow fields. We present a variant of the EM algorithm that can segment image sequences by fitting multiple smooth flow fields to the spatiotemporal data. Using the method of Green's functions, we show how the estimation of a single smooth flow field can be performed in closed form, thus making the multiple model estimation computationally feasible. Furthermore, the number of models is estimated automatically using similar methods to those used in the parametric approach. We illustrate the algorithm's performance on synthetic and real image sequences.
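As a much-simplified caricature of the EM grouping idea, the sketch below fits two parametric (here, linear) models to 1-D data by alternating soft assignments and weighted least-squares fits; the smoothness prior, Green's-function solution, and automatic model-count selection are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.uniform(0, 1, 300)
true_labels = rng.integers(0, 2, 300)
y = np.where(true_labels == 0, 2 * x + 1, -x + 3) + 0.1 * rng.normal(size=300)

K, sigma = 2, 0.5                       # fixed residual scale for the E-step
theta = rng.normal(size=(K, 2))         # (slope, intercept) per model
A = np.column_stack([x, np.ones_like(x)])

for _ in range(50):
    # E-step: responsibilities from Gaussian residual likelihoods.
    pred = theta[:, 0] * x[:, None] + theta[:, 1]
    ll = -0.5 * ((y[:, None] - pred) / sigma) ** 2
    resp = np.exp(ll - ll.max(axis=1, keepdims=True))
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: weighted least squares for each parametric model.
    for k in range(K):
        sw = np.sqrt(resp[:, k])
        theta[k] = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)[0]

print("recovered (slope, intercept) pairs:\n", theta.round(2))
```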

Journal ArticleDOI
TL;DR: In this paper, a sampling technique is presented that generates and inverts the Hammersley points (an optimal design for placing n points uniformly on a k-dimensional cube) to provide a representative sample for multivariate probability distributions.
Abstract: The concept of robust design involves identification of design settings that make the product performance less sensitive to the effects of seasonal and environmental variations. This concept is discussed in this article in the context of batch distillation column design with feed stock variations, and internal and external uncertainties. Stochastic optimization methods provide a general approach to robust/parameter design as compared to conventional techniques. However, the computational burden of these approaches can be extreme and depends on the sample size used for characterizing the parametric variations and uncertainties. A novel sampling technique is presented that generates and inverts the Hammersley points (an optimal design for placing n points uniformly on a k-dimensional cube) to provide a representative sample for multivariate probability distributions. The example of robust batch-distillation column design illustrates that the new sampling technique offers significant computational savings and better accuracy.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed simple finite-sample size approximations for the distribution of quadratic forms in factorial designs under a normal heteroscedastic error structure.
Abstract: Linear rank statistics in nonparametric factorial designs are asymptotically normal and, in general, heteroscedastic. In a comprehensive simulation study, the asymptotic chi-squared law of the corresponding quadratic forms is shown to be a rather poor approximation of the finite-sample distribution. Motivated by this problem, we propose simple finite-sample size approximations for the distribution of quadratic forms in factorial designs under a normal heteroscedastic error structure. These approximations are based on an F distribution with estimated degrees of freedom that generalizes ideas of Patnaik and Box. Simulation studies show that the nominal level is maintained with high accuracy and in most cases the power is comparable to the asymptotic maximin Wald test. Data-driven guidelines are given to select the most appropriate test procedure. These ideas are finally transferred to nonparametric factorial designs where the same quadratic forms as in the parametric case are applied to the vector ...
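A small sketch of the Patnaik/Box moment-matching device these approximations generalize: a quadratic form Q = x'Tx with x ~ N(0, V) is approximated by g·χ²_f, with g and f chosen so the first two moments match. The matrices below are toy choices, not the paper's designs.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
V = np.diag([1.0, 4.0, 0.25, 9.0])         # strongly heteroscedastic
T = np.eye(4) - np.full((4, 4), 0.25)      # centering contrast matrix

TV = T @ V
g = np.trace(TV @ TV) / np.trace(TV)
f = np.trace(TV) ** 2 / np.trace(TV @ TV)  # estimated degrees of freedom

# Empirical upper 5% point of Q versus the two approximations.
x = rng.multivariate_normal(np.zeros(4), V, size=200_000)
Q = np.einsum("ij,jk,ik->i", x, T, x)
print("empirical 95% point:", np.quantile(Q, 0.95).round(3))
print("naive chi^2_3      :", stats.chi2.ppf(0.95, 3).round(3))
print("Box approximation  :", (g * stats.chi2.ppf(0.95, f)).round(3))
```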

Journal ArticleDOI
TL;DR: A notion of model uncertainty based on the closeness of input-output trajectories which is not tied to a particular uncertainty representation, such as additive, parametric, structured, etc. is pursued.
Abstract: This paper presents an approach to robustness analysis for nonlinear feedback systems. We pursue a notion of model uncertainty based on the closeness of input-output trajectories which is not tied to a particular uncertainty representation, such as additive, parametric, structured, etc. The basic viewpoint is to regard systems as operators on signal spaces. We present two versions of a global theory where stability is captured by induced norms or by gain functions. We also develop local approaches (over bounded signal sets) and give a treatment for systems with potential for finite-time escape. We compute the relevant stability margin for several examples and demonstrate robustness of stability for some specific perturbations, e.g., small-time delays. We also present examples of nonlinear control systems which have zero robustness margin and are destabilized by arbitrarily small gap perturbations. The paper considers the case where uncertainty is present in the controller as well as the plant and the generalization of the approach to the case where uncertainty occurs in several subsystems in an arbitrary interconnection.

Journal ArticleDOI
TL;DR: A simple method is presented that allows statistical inferences to be made about the significance of regional effects in statistical parametric maps when the approximate location of the effect is specified in advance using the spatial extent or volume of the nearest activated region.
Abstract: We present a simple method that allows statistical inferences to be made about the significance of regional effects in statistical parametric maps (SPMs) when the approximate location of the effect is specified in advance. The test can be thought of as analogous to assessing activations with uncorrected P values based on the height of SPMs but, in this instance, using the spatial extent or volume of the nearest activated region. The advantage of the current test is that it eschews a correction for multiple comparisons even though the exact location of the expected activation may not be known. Hum. Brain Mapping 5:133-136, 1997. © 1997 Wiley-Liss, Inc.

Journal ArticleDOI
TL;DR: A Bayesian framework that combines motion (optical flow) estimation and segmentation based on a representation of the motion field as the sum of a parametric field and a residual field is presented.
Abstract: We present a Bayesian framework that combines motion (optical flow) estimation and segmentation based on a representation of the motion field as the sum of a parametric field and a residual field. The parameters describing the parametric component are found by a least squares procedure given the best estimates of the motion and segmentation fields. The motion field is updated by estimating the minimum-norm residual field given the best estimate of the parametric field, under the constraint that the motion field be smooth within each segment. The segmentation field is updated to yield the minimum-norm residual field given the best estimate of the motion field, using Gibbsian priors. The solutions to successive optimization problems are obtained using the highest confidence first (HCF) or iterated conditional modes (ICM) optimization methods. Experimental results on real video are shown.

Journal ArticleDOI
TL;DR: In this paper, the generalized state-space averaging (GSSA) method is applied to power electronic converters and shown to work well only within specific converter topologies and parametric limits, where the required model approximation order is not determined by the number of components in the topology.
Abstract: Power electronic converters are periodic time-variant systems because of their switching operation. The generalized state-space averaging method is a way to model them as time-independent systems, defined by a unified set of differential equations capable of representing circuit waveforms. It can therefore be a convenient approach for designing controllers to be applied to switched converters. This brief shows that the generalized state-space averaging method works well only within specific converter topologies and parametric limits, and that the required model approximation order is not determined by the number of components in the topology. This point is illustrated with detailed examples from several basic dc/dc converter topologies.
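A minimal sketch of the lowest-order (index-0) generalized state-space average, which for an ideal buck converter reduces to the familiar averaged model; component values and the duty ratio are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

Vin, L, C, R, d = 12.0, 100e-6, 470e-6, 5.0, 0.5   # duty ratio d

def averaged(t, x):
    i, v = x                          # inductor current, capacitor voltage
    di = (d * Vin - v) / L            # index-0 (dc) coefficient dynamics
    dv = (i - v / R) / C
    return [di, dv]

sol = solve_ivp(averaged, (0, 0.02), [0.0, 0.0], max_step=1e-5)
print(f"steady state: v ≈ {sol.y[1, -1]:.2f} V (ideal d*Vin = {d * Vin:.1f} V)")
```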

Journal ArticleDOI
TL;DR: Simulation results show that the proposed robust nonlinear decentralized controller can greatly enhance the transient stability of the system regardless of the network parameters, operating points and fault locations.

Journal ArticleDOI
TL;DR: This paper proves that if the objective function of a GBLP is uniformly Lipschitz continuous in the lower level decision variable with respect to the upper level decision variable, then using certain uniform parametric error bounds as penalty functions gives single level problems equivalent to the GBLP.
Abstract: The generalized bilevel programming problem (GBLP) is a bilevel mathematical program where the lower level is a variational inequality. In this paper we prove that if the objective function of a GBLP is uniformly Lipschitz continuous in the lower level decision variable with respect to the upper level decision variable, then using certain uniform parametric error bounds as penalty functions gives single level problems equivalent to the GBLP. Several local and global uniform parametric error bounds are presented, and assumptions guaranteeing that they apply are discussed. We then derive Kuhn-Tucker-type necessary optimality conditions by using exact penalty formulations and nonsmooth analysis.

Journal ArticleDOI
TL;DR: The common statistical techniques employed to analyze survival data in public health research, including the Kaplan-Meier method for estimating the survival function and the Cox proportional hazards model to identify risk factors and to obtain adjusted risk ratios are reviewed.
Abstract: This paper reviews the common statistical techniques employed to analyze survival data in public health research. Due to the presence of censoring, the data are not amenable to the usual method of analysis. The improvement in statistical computing and wide accessibility of personal computers led to the rapid development and popularity of nonparametric over parametric procedures. The former required less stringent conditions. But, if the assumptions for parametric methods hold, the resulting estimates have smaller standard errors and are easier to interpret. Nonparametric techniques include the Kaplan-Meier method for estimating the survival function and the Cox proportional hazards model to identify risk factors and to obtain adjusted risk ratios. In cases where the assumption of proportional hazards is not tenable, the data can be stratified and a model fitted with different baseline functions in each stratum. Parametric modeling such as the accelerated failure time model also may be used. Hazard functions for the exponential, Weibull, gamma, Gompertz, lognormal, and log-logistic distributions are described. Examples from published literature are given to illustrate the various methods. The paper is intended for public health professionals who are interested in survival data analysis.
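A brief sketch of the nonparametric and semi-parametric steps of this workflow, assuming the lifelines package is available; the toy data and column names are illustrative.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "time":  [5, 8, 12, 14, 20, 21, 25, 30, 34, 40],  # follow-up (months)
    "event": [1, 1, 0, 1, 0, 1, 1, 0, 1, 0],          # 1=death, 0=censored
    "age":   [52, 60, 45, 70, 55, 66, 58, 49, 73, 61],
})

# Nonparametric survival curve (Kaplan-Meier).
kmf = KaplanMeierFitter()
kmf.fit(df["time"], event_observed=df["event"])
print(kmf.survival_function_.tail(3))

# Semi-parametric Cox model: adjusted hazard ratio for age.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "exp(coef)"]])
```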

Journal ArticleDOI
TL;DR: In this paper, a feed-forward back-propagation neural network based on a central composite rotatable experimental design is developed to model the WEDM process, and an autoregressive AR(3) model is used to describe its stochastic component.
Abstract: Wire electrical discharge machining (WEDM) technology has been widely used in conductive material machining. The WEDM process, which is a combination of electrodynamic, electromagnetic, thermodynamic, and hydrodynamic actions, exhibits a complex and stochastic nature. Its performance, in terms of surface finish and machining productivity, is affected by many factors. This paper attempts to optimize the process parametric combinations by modeling the process using artificial neural networks (ANN) and characterizes the WEDMed surface through time series techniques. A feed-forward back-propagation neural network based on a central composite rotatable experimental design is developed to model the machining process. Optimal parametric combinations are selected for the process. The periodic component of the surface texture is identified, and an autoregressive AR(3) model is used to describe its stochastic component.
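A minimal sketch of the AR(3) part of this characterization, assuming statsmodels and a synthetic stationary "profile" series in place of measured WEDM surface data.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(8)
n = 500
profile = np.zeros(n)                 # synthetic surface-height series
for t in range(3, n):
    profile[t] = (0.5 * profile[t - 1] - 0.3 * profile[t - 2]
                  + 0.1 * profile[t - 3] + rng.normal(0.0, 0.2))

fit = AutoReg(profile, lags=3).fit()
print("constant and AR coefficients:", fit.params.round(3))
```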

Journal ArticleDOI
TL;DR: In this article, a systematic method for parametric sensitivity analysis based on the Gaussian quadrature procedure for numerical integration is presented, and its results are compared with those of Harrison and Vinod and of Pagan and Shannon for a computable general equilibrium model due to Walley and Wigle.

Journal ArticleDOI
TL;DR: It is shown that, whilst both methods are capable of determining cluster validity for data sets in which clusters tend towards a multivariate Gaussian distribution, the parametric method inevitably fails for clusters which have a non-Gaussian structure whilst the scale-space method is more robust.

Journal ArticleDOI
TL;DR: In this paper, the authors propose a Bayesian nonparametric prior for the random effects to capture possible deviances in modality and skewness and explore the observed covariates' effect on the distribution of the mixed effects.
Abstract: This paper proposes Bayesian nonparametric mixing for some well-known and popular models. The distribution of the observations is assumed to contain an unknown mixed effects term which includes a fixed effects term, a function of the observed covariates, and an additive or multiplicative random effects term. Typically these random effects are assumed to be independent of the observed covariates and independent and identically distributed from a distribution from some known parametric family. This assumption may be suspect if either there is interaction between observed covariates and unobserved covariates or the fixed effects predictor of observed covariates is misspecified. Another cause for concern might be simply that the covariates affect more than just the location of the mixed effects distribution. As a consequence the distribution of the random effects could be highly irregular in modality and skewness leaving parametric families unable to model the distribution adequately. This paper therefore proposes a Bayesian nonparametric prior for the random effects to capture possible deviances in modality and skewness and to explore the observed covariates' effect on the distribution of the mixed effects.
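A tiny sketch of one common way to realize such a nonparametric prior: a (truncated) Dirichlet process drawn by stick-breaking, whose discrete realizations can be multimodal and skewed where a single parametric family cannot. The base measure and truncation level are illustrative assumptions, and the paper's own construction may differ.

```python
import numpy as np

rng = np.random.default_rng(11)
alpha, K = 2.0, 50                      # concentration, truncation level

beta = rng.beta(1, alpha, K)            # stick-breaking proportions
w = beta * np.concatenate([[1.0], np.cumprod(1 - beta)[:-1]])
atoms = rng.normal(0.0, 2.0, K)         # atoms drawn from a N(0, 4) base measure

# Sample 1000 random effects from the realized discrete mixing distribution.
effects = rng.choice(atoms, p=w / w.sum(), size=1000)
print("realized random-effects mean/sd:",
      effects.mean().round(2), effects.std().round(2))
```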

Journal ArticleDOI
TL;DR: An analytical study of the asymptotic differences between different members of the proposed family of goodness-of-fit statistics, together with an examination of approximations to the exact distribution of these statistics that are closer than the commonly used chi-squared distribution.
Abstract: In this paper we investigate the Jensen-Shannon parametric divergence for testing goodness of fit and for point estimation. Most of the work presented is an analytical study of the asymptotic differences between different members of the proposed family of goodness-of-fit statistics, together with an examination of approximations to the exact distribution of these statistics that are closer than the commonly used chi-squared distribution. Finally, the minimum Jensen-Shannon divergence estimates are introduced and compared with other well-known estimators by computer simulation.
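A compact sketch of minimum Jensen-Shannon estimation: choose the parameter that minimizes the JS divergence between observed cell frequencies and model cell probabilities. The binomial toy example stands in for the paper's simulation designs.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize_scalar
from scipy.spatial.distance import jensenshannon   # sqrt of the JS divergence

rng = np.random.default_rng(9)
x = rng.binomial(5, 0.37, size=400)
obs = np.bincount(x, minlength=6) / 400.0          # observed cell frequencies

def objective(theta):
    model = stats.binom.pmf(np.arange(6), 5, theta)
    return jensenshannon(obs, model) ** 2          # the JS divergence itself

res = minimize_scalar(objective, bounds=(0.01, 0.99), method="bounded")
print(f"minimum-JS estimate of theta: {res.x:.3f}")
```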

Journal ArticleDOI
TL;DR: A statistical investigation of subspace-based (4SID) system identification techniques, which use the structure of the extended observability matrix to obtain a system model estimate.

Journal ArticleDOI
TL;DR: For a wide range of distributions, concern about bias or imprecision of the estimates of the AUC should not be a major factor in choosing between the nonparametric and parametric approaches.
Abstract: Receiver operating characteristic (ROC) analysis, which yields indices of accuracy such as the area under the curve (AUC), is increasingly being used to evaluate the performances of diagnostic tests that produce results on continuous scales. Both parametric and nonparametric ROC approaches are available to assess the discriminant capacity of such tests, but there are no clear guidelines as to the merits of each, particularly with non-binormal data. Investigators may worry that when data are non-Gaussian, estimates of diagnostic accuracy based on a binormal model may be distorted. The authors conducted a Monte Carlo simulation study to compare the bias and sampling variability in the estimates of the AUCs derived from parametric and nonparametric procedures. Each approach was assessed in data sets generated from various configurations of pairs of overlapping distributions; these included the binormal model and non-binormal pairs of distributions where one or both pair members were mixtures of Gaussian (MG) distributions with different degrees of departures from binormality. The biases in the estimates of the AUCs were found to be very small for both parametric and nonparametric procedures. The two approaches yielded very close estimates of the AUCs and the corresponding sampling variability even when data were generated from non-binormal models. Thus, for a wide range of distributions, concern about bias or imprecision of the estimates of the AUC should not be a major factor in choosing between the nonparametric and parametric approaches.
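A small sketch of the two estimators being compared, the nonparametric (Mann-Whitney) AUC and the binormal AUC from sample moments, on one simulated data set.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
healthy = rng.normal(0.0, 1.0, 200)
diseased = rng.normal(1.2, 1.0, 200)

# Nonparametric AUC = P(diseased score > healthy score), via Mann-Whitney U.
u, _ = stats.mannwhitneyu(diseased, healthy, alternative="greater")
auc_np = u / (len(diseased) * len(healthy))

# Binormal AUC: Phi(a / sqrt(1 + b^2)) with a, b from sample moments.
a = (diseased.mean() - healthy.mean()) / diseased.std(ddof=1)
b = healthy.std(ddof=1) / diseased.std(ddof=1)
auc_bn = stats.norm.cdf(a / np.sqrt(1 + b**2))

print(f"nonparametric AUC = {auc_np:.3f}, binormal AUC = {auc_bn:.3f}")
```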