scispace - formally typeset

Showing papers on "Parametric statistics published in 2003"


Journal ArticleDOI
TL;DR: In this paper, the authors use experimental or observational data to estimate decision regimes that result in a maximal mean response, and make smooth parametric assumptions only on quantities that are directly relevant to the goal of estimating the optimal rules.
Abstract: Summary. A dynamic treatment regime is a list of decision rules, one per time interval, for how the level of treatment will be tailored through time to an individual’s changing status. The goal of this paper is to use experimental or observational data to estimate decision regimes that result in a maximal mean response. To explicate our objective and to state the assumptions, we use the potential outcomes model. The method proposed makes smooth parametric assumptions only on quantities that are directly relevant to the goal of estimating the optimal rules. We illustrate the methodology proposed via a small simulation.

922 citations


Journal ArticleDOI
TL;DR: The Laplace type estimators (LTEs) are a family of estimators that includes means and quantiles of quasi-posterior distributions defined as transformations of general (non-likelihood-based) statistical criterion functions, such as those in GMM, nonlinear IV, empirical likelihood, and minimum distance methods.

584 citations


Journal ArticleDOI
TL;DR: An algorithm is presented for fast elastic multidimensional intensity-based image registration with a parametric model of the deformation; it is computationally more efficient than other alternatives and capable of accepting expert hints in the form of soft landmark constraints.
Abstract: We present an algorithm for fast elastic multidimensional intensity-based image registration with a parametric model of the deformation. It is fully automatic in its default mode of operation. In the case of hard real-world problems, it is capable of accepting expert hints in the form of soft landmark constraints. Far fewer landmarks are needed and the results are far superior compared to pure landmark registration. Particular attention has been paid to the factors influencing the speed of this algorithm. The B-spline deformation model is shown to be computationally more efficient than other alternatives. The algorithm has been successfully used for several two-dimensional (2-D) and three-dimensional (3-D) registration tasks in the medical domain, involving MRI, SPECT, CT, and ultrasound image modalities. We also present experiments in a controlled environment, permitting an exact evaluation of the registration accuracy. Test deformations are generated automatically using a random hierarchical fractional wavelet-based generator.

526 citations


Journal ArticleDOI
TL;DR: It is shown that the finite-horizon robust optimal control law is a continuous piecewise affine function of the state vector and can be calculated by solving a sequence of multiparametric linear programs.
Abstract: For discrete-time uncertain linear systems with constraints on inputs and states, we develop an approach to determine state feedback controllers based on a min-max control formulation. Robustness is achieved against additive norm-bounded input disturbances and/or polyhedral parametric uncertainties in the state-space matrices. We show that the finite-horizon robust optimal control law is a continuous piecewise affine function of the state vector and can be calculated by solving a sequence of multiparametric linear programs. When the optimal control law is implemented in a receding horizon scheme, only a piecewise affine function needs to be evaluated on line at each time step. The technique computes the robust optimal feedback controller for a rather general class of systems with modest computational effort without needing to resort to gridding of the state-space.

516 citations


Posted Content
TL;DR: In this paper, the identification and estimation of hedonic models are studied in an additive version of the model, where technology and preferences are identified up to affine transformations from data on demand and supply in a single market.
Abstract: This paper considers the identification and estimation of hedonic models. We establish that in an additive version of the hedonic model, technology and preferences are generically identified up to affine transformations from data on demand and supply in a single hedonic market. For a very general parametric structure, preferences and technology are fully identified. This is true under a strong assumption of statistical independence of the error term. It is also true under the weaker assumption of mean independence of the error term. Much of the confusion in the empirical literature that claims that hedonic models estimated on data from a single market are fundamentally underidentified is based on linearizations that do not use all of the information in the model. The exact economic model that justifies widely used linear approximations has strange properties so the approximation is doubly poor. A semiparametric estimation method is proposed that is valid when a statistical independence assumption is valid. Alternatively, under the weaker condition of mean independence instrumental variables estimators can be applied to identify technology and preference parameters from a single market. They are justified by nonlinearities that are generic features of equilibrium in hedonic models.

502 citations


Journal ArticleDOI
TL;DR: In this article, parametric and non-parametric random-coefficient latent class models are proposed to relax the assumption that observations are independent; the models can be used for the analysis of data collected with complex sampling designs, data with a multilevel structure, and multiple-group data for more than a few groups.
Abstract: The latent class (LC) models that have been developed so far assume that observations are independent. Parametric and non-parametric random-coefficient LC models are proposed here, which will make it possible to modify this assumption. For example, the models can be used for the analysis of data collected with complex sampling designs, data with a multilevel structure, and multiple-group data for more than a few groups. An adapted EM algorithm is presented that makes maximum-likelihood estimation feasible. The new model is illustrated with examples from organizational, educational, and cross-national comparative research.

388 citations


Book
01 Mar 2003
TL;DR: This book surveys stress-strength models, covering their history, mathematical tools, and applications, together with theory and general estimation procedures for parametric point estimation, parametric statistical inference, and nonparametric methods, illustrated with examples and details on applications.
Abstract: Stress-strength models: history, mathematical tools and survey of applications; theory and general estimation procedures; parametric point estimation; parametric statistical inference; nonparametric methods; special cases and generalizations; examples and details on applications.

387 citations


Journal Article
TL;DR: In this article, the power of the parametric t-test for trend detection is estimated by Monte Carlo simulation for various probability distributions and compared with the non-parametric Mann-Kendall test.
Abstract: The existence of a trend in a hydrological time series is detected by statistical tests. The power of a test is the probability that it will reject a null hypothesis when it is false. In this study, the power of the parametric t-test for trend detection is estimated by Monte Carlo simulation for various probability distributions and compared with the power of the non-parametric Mann-Kendall test. The t-test has less power than the non-parametric test when the probability distribution is skewed. However, for moderately skewed distributions the power ratio is close to one. Annual streamflow records in various regions of Turkey are analyzed by the two tests to compare their powers in detecting a trend.
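The power comparison described above is straightforward to reproduce in outline. The sketch below estimates, by Monte Carlo simulation, the power of the parametric t-test (significance of a regression slope) and of the non-parametric Mann-Kendall test (Kendall's tau against the time index) for a linear trend in noise. It is a minimal illustration, not the paper's exact experimental design: the function name `power`, the Gumbel distribution standing in for "skewed noise", and all parameter values are our assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def power(trend, n=50, reps=2000, alpha=0.05, skewed=False):
    """Monte Carlo estimate of trend-detection power for two tests.

    Returns (t-test power, Mann-Kendall power). Illustrative sketch only;
    the Gumbel noise option is a stand-in for a skewed distribution.
    """
    t = np.arange(n)
    t_hits = mk_hits = 0
    for _ in range(reps):
        noise = rng.gumbel(size=n) if skewed else rng.normal(size=n)
        y = trend * t + noise
        # Parametric t-test: significance of the slope in a linear regression.
        if stats.linregress(t, y).pvalue < alpha:
            t_hits += 1
        # Mann-Kendall test: Kendall's tau of the series against time.
        if stats.kendalltau(t, y).pvalue < alpha:
            mk_hits += 1
    return t_hits / reps, mk_hits / reps
```

Under the null (`trend=0`) both rejection rates should sit near `alpha`; for a skewed noise distribution the Mann-Kendall rate typically pulls ahead as the trend grows, which is the pattern the paper reports.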

362 citations


Journal ArticleDOI
TL;DR: It is argued that methods for implementing the bootstrap with time‐series data are not as well understood as methods for data that are independent random samples, and there is a considerable need for further research.
Abstract: The chapter gives a review of the literature on bootstrap methods for time series data. It describes various possibilities on how the bootstrap method, initially introduced for independent random variables, can be extended to a wide range of dependent variables in discrete time, including parametric or nonparametric time series models, autoregressive and Markov processes, long range dependent time series and nonlinear time series, among others. Relevant bootstrap approaches, namely the intuitive residual bootstrap and Markovian bootstrap methods, the prominent block bootstrap methods as well as frequency domain resampling procedures, are described. Further, conditions for consistent approximations of distributions of parameters of interest by these methods are presented. The presentation is deliberately kept non-technical in order to allow for an easy understanding of the topic, indicating which bootstrap scheme is advantageous under a specific dependence situation and for a given class of parameters of interest. Moreover, the chapter contains an extensive list of relevant references for bootstrap methods for time series.
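Of the schemes the chapter surveys, the moving-block bootstrap is the easiest to sketch: instead of resampling individual observations (which destroys serial dependence), it resamples overlapping blocks and concatenates them. The following is a minimal illustration under our own naming and parameter choices, not code from the chapter.

```python
import numpy as np

rng = np.random.default_rng(1)

def moving_block_bootstrap(x, block_len, reps=1000, stat=np.mean):
    """Moving-block bootstrap: draw overlapping blocks of length `block_len`
    with replacement and concatenate them, so short-range dependence within
    each block is preserved. Illustrative sketch only."""
    n = len(x)
    blocks = np.array([x[i:i + block_len] for i in range(n - block_len + 1)])
    n_blocks = int(np.ceil(n / block_len))
    out = np.empty(reps)
    for r in range(reps):
        idx = rng.integers(0, len(blocks), size=n_blocks)
        resampled = np.concatenate(blocks[idx])[:n]  # truncate to length n
        out[r] = stat(resampled)
    return out

# Example: bootstrap standard error of the mean of an AR(1) series.
e = rng.normal(size=500)
x = np.empty(500)
x[0] = e[0]
for t in range(1, 500):
    x[t] = 0.6 * x[t - 1] + e[t]
se = moving_block_bootstrap(x, block_len=20).std()
```

For positively autocorrelated data such as this AR(1) series, the block bootstrap standard error is noticeably larger than the naive i.i.d. bootstrap would suggest, which is exactly the dependence the blocks are meant to capture.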

331 citations



Journal ArticleDOI
TL;DR: In this paper, the authors present the results of a parametric study conducted with a previously described three-dimensional, non-isothermal model of a polymer electrolyte membrane (PEM) fuel cell.

Journal ArticleDOI
TL;DR: A delay-independent sufficient condition for the existence of linear sliding surfaces is given in terms of linear matrix inequalities, based on which the corresponding reaching motion controller is developed.
Abstract: This note is devoted to robust sliding-mode control for time-delay systems with mismatched parametric uncertainties. A delay-independent sufficient condition for the existence of linear sliding surfaces is given in terms of linear matrix inequalities, based on which the corresponding reaching motion controller is also developed. The results are illustrated by an example.

Journal ArticleDOI
TL;DR: In this paper, curvature polynomials of arbitrary order are used as the assumed form of solution and these quadratures are readily linearized to express the necessary conditions for optimality.
Abstract: There are many situations for which a feasible nonholonomic motion plan must be generated immediately based on real-time perceptual information. Parametric trajectory representations limit computation because they reduce the search space for solutions (at the cost of potentially introducing suboptimality). The use of any parametric trajectory model converts the optimal control formulation into an equivalent nonlinear programming problem. In this paper, curvature polynomials of arbitrary order are used as the assumed form of solution. Polynomials sacrifice little in terms of spanning the set of feasible controls while permitting an expression of the general solution to the system dynamics in terms of decoupled quadratures. These quadratures are then readily linearized to express the necessary conditions for optimality. Resulting trajectories are convenient to manipulate and execute in vehicle controllers and they can be computed with a straightforward numerical procedure in real time. KEY WORDS—mobile robots, car-like robots, trajectory generation, curve generation, nonholonomic, clothoid, cornu spiral, optimal control

Journal ArticleDOI
Jushan Bai1
TL;DR: In this article, a nonparametric test for parametric conditional distributions of dynamic models is proposed, coupled with Khmaladze's martingale transformation, which is asymptotically distribution-free and has nontrivial power against root-n local alternatives.
Abstract: This paper proposes a nonparametric test for parametric conditional distributions of dynamic models. The test is of the Kolmogorov type coupled with Khmaladze's martingale transformation. It is asymptotically distribution-free and has nontrivial power against root-n local alternatives. The method is applicable for various dynamic models, including autoregressive and moving average models, generalized autoregressive conditional heteroskedasticity (GARCH), integrated GARCH, and general nonlinear time series regressions. The method is also applicable for cross-sectional models. Finally, we apply the procedure to testing conditional normality and the conditional t-distribution in a GARCH model for the NYSE equal-weighted returns.

01 Jan 2003
TL;DR: In October 1998, a post-Panamax C11 class containership encountered extreme weather and sustained extensive loss and damage to deck-stowed containers; the motions of the vessel during this storm event were investigated through a series of model tests and numerical analyses that confirmed the vessel's parametric rolling response in head seas at the time of the casualty.
Abstract: In October 1998 a post-Panamax C11 class containership encountered extreme weather and sustained extensive loss and damage to deck stowed containers. The motions of the vessel during this storm event were investigated through a series of model tests and numerical analyses that confirmed the vessel's parametric rolling response in head seas at the time of the casualty. These studies provide insight into the conditions in which post-Panamax containerships are likely to experience head sea parametric rolling, and the magnitude of motions and accelerations that can occur. The findings from this investigation are presented in this paper, together with discussion of how such extreme motions impact the design and application of container securing systems. Also outlined in the paper are recommendations for additional research needed to better understand the influence of vessel design and operational considerations on the propensity of post-Panamax containerships towards parametric rolling. This investigation and other recent studies demonstrate that parametric roll in extreme head or near head seas can occur when unfavorable tuning is combined with low roll damping (reduced speed) and large stability variations (governed by wavelength, wave height, general hull form, bow flare, and stern shapes). Parametric rolling is an unstable phenomenon, which can quickly generate large roll angles that are coupled with significant pitch motions. The rolling occurs in phase with pitch, and on containerships introduces high loads into the containers and their securing systems. It appears that post-Panamax containerships may be particularly prone to this behavior. This is an important issue considering the large number of these vessels scheduled for delivery in the next few years.

Journal ArticleDOI
01 Feb 2003
TL;DR: A comparative study of the predictive performances of neural network time series models for forecasting failures and reliability in engine systems shows that the radial basis function (RBF) neural network architecture is a viable alternative to the multilayer perceptron due to its shorter training time.
Abstract: This paper presents a comparative study of the predictive performances of neural network time series models for forecasting failures and reliability in engine systems. Traditionally, failure data analysis requires specifications of parametric failure distributions and justifications of certain assumptions, which are at times difficult to validate. On the other hand, the time series modeling technique using neural networks provides a promising alternative. Neural network modeling via feed-forward multilayer perceptron (MLP) suffers from local minima problems and long computation time. The radial basis function (RBF) neural network architecture is found to be a viable alternative due to its shorter training time. Illustrative examples using reliability testing and field data showed that the proposed model results in comparable or better predictive performance than traditional MLP model and the linear benchmark based on Box–Jenkins autoregressive-integrated-moving average (ARIMA) models. The effects of input window size and hidden layer nodes are further investigated. Appropriate design topologies can be determined via sensitivity analysis.
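The "shorter training time" advantage of the RBF architecture comes from a standard training shortcut: once the basis-function centers are fixed, the output weights solve a linear least-squares problem, with no gradient descent needed. The sketch below illustrates that idea for one-step-ahead forecasting from a lagged window; the function name `rbf_forecaster` and all hyperparameters are our illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf_forecaster(series, window=4, n_centers=10, width=1.0):
    """Fit a tiny RBF network for one-step-ahead forecasting.

    Centers are a random subset of training windows; output weights come
    from a single linear least-squares solve (the fast-training scheme).
    Illustrative sketch only.
    """
    # Build (lagged window -> next value) training pairs.
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    centers = X[rng.choice(len(X), size=n_centers, replace=False)]

    def design(A):
        # Gaussian basis: exp(-||a - c||^2 / (2 * width^2)) for each center c.
        d2 = ((A[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * width ** 2))

    w, *_ = np.linalg.lstsq(design(X), y, rcond=None)
    return lambda win: design(np.atleast_2d(win)) @ w

# Example: forecast the next value of a smooth periodic series.
series = np.sin(np.linspace(0, 20, 200))
forecast = rbf_forecaster(series)
next_val = forecast(series[-4:])
```

A full MLP fit to the same data would require iterative backpropagation; here the only expensive step is one `lstsq` call, which is the design choice the paper's timing comparison turns on.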

Journal ArticleDOI
TL;DR: A statistical objective performance analysis and detector parameter selection method is proposed: using detection results produced by different detector parameters, an estimated best edge map is obtained and utilized as an estimated ground truth (EGT).
Abstract: Subjective evaluation by human observers is usually used to analyze and select an edge detector parametric setup when real-world images are considered. We propose a statistical objective performance analysis and detector parameter selection, using detection results produced by different detector parameters. Using the correspondence between the different detection results, an estimated best edge map, utilized as an estimated ground truth (EGT), is obtained. This is done using both a receiver operating characteristics (ROC) analysis and a Chi-square test, and considers the trade off between information and noisiness in the detection results. The best edge detector parameter set (PS) is then selected by the same statistical approach, using the EGT. Results are demonstrated for several edge detection techniques, and compared to published subjective evaluation results. The method developed here suggests a general tool to assist in practical implementations of parametric edge detectors where an automatic process is required.

Proceedings Article
06 Jun 2003
TL;DR: In this paper, the four-sideband model of parametric amplifiers driven by two pump waves is reviewed and used to describe the conditions required to produce broad-bandwidth gain.
Abstract: Parametric amplifiers (PAs) are well-suited for optical communication systems. Not only can PAs provide high gain for arbitrary signal wavelengths, they can also conjugate the signals and convert their wavelengths. The four-sideband model of parametric amplifiers driven by two pump waves will be reviewed and used to describe the conditions required to produce broad-bandwidth gain. The flexibility of the two-pump architecture allows it to produce gain that is nearly independent of the signal polarization, and idlers whose spectral widths are comparable to that of the signal.


Journal ArticleDOI
01 Jul 2003
TL;DR: The experiments showed that a set of end-point and oscillatory behavior patterns are learned by self-organizing fixed points and limit cycle dynamics that form behavior primitives, and it was found that diverse novel behavior patterns can be generated by taking advantage of nonlinear effects that emerge from the distributed representation.
Abstract: This paper investigates how behavior primitives are self-organized in a neural network model utilizing a distributed representation scheme. The model is characterized by so-called parametric biases which adaptively modulate the encoding of different behavior patterns in a single recurrent neural net (RNN). Our experiments, using a real robot arm, showed that a set of end-point and oscillatory behavior patterns are learned by self-organizing fixed points and limit cycle dynamics that form behavior primitives. It was also found that diverse novel behavior patterns can be generated by modulating the parametric biases arbitrarily. Our analysis showed that such diversity in behavior generation emerges because a nonlinear map is self-organized between the space of parametric biases and that of the behavior patterns. The origin of the observed nonlinearity from the distributed representation is discussed.

Journal ArticleDOI
TL;DR: New methods are developed to analyze uncertainty about the parameters of a model, the lag specification, the serial correlation of shocks, and the effects of real time data in one coherent structure and suggest that the aggressiveness recently found in robust policy rules is likely to be caused by overemphasizing uncertainty about economic dynamics at low frequencies.
Abstract: Recently there has been a great deal of interest in studying monetary policy under model uncertainty. We develop new methods to analyze different sources of uncertainty in one coherent structure, which is useful for policy decisions. We show how to estimate the size of the uncertainty based on time series data, and how to incorporate this uncertainty in choosing policy. In particular, we develop a new approach for modeling uncertainty called model error modeling. The approach imposes additional structure on the errors of an estimated model, and builds a statistical description of the uncertainty around the model. We develop both parametric and nonparametric specifications of this approach, and use them to estimate uncertainty in a small model of the US economy. We then use our estimates to compute Bayesian and minimax robust policy rules, which are designed to perform well in the face of uncertainty.

Proceedings ArticleDOI
04 Jun 2003
TL;DR: The analysis yields several improvements over previous methods and opens up new possibilities, including the possibility of treating nonlinear vector fields and/or switching surfaces and parametric robustness analysis in a unified way.
Abstract: This paper presents a method for stability analysis of switched and hybrid systems using polynomial and piecewise polynomial Lyapunov functions. Computation of such functions can be performed using convex optimization, based on the sum of squares decomposition of multivariate polynomials. The analysis yields several improvements over previous methods and opens up new possibilities, including the possibility of treating nonlinear vector fields and/or switching surfaces and parametric robustness analysis in a unified way.

Journal ArticleDOI
TL;DR: A Multivariate Normal Distribution is proposed as a parametric statistical model of diffusion tensor data when magnitude MR images contain no artifacts other than Johnson noise, and this model is evaluated using Monte Carlo simulations of DT-MRI experiments.

Journal ArticleDOI
TL;DR: Simulation results indicate that the offline detection of feed rate sensitive corners improves parametric interpolation.
Abstract: Parametric interpolation has many advantages over linear interpolation in machining curves. Real time parametric interpolation research so far has addressed achieving a uniform feed rate, confined chord errors and jerk limited trajectory planning. However, simultaneous consideration of confined chord errors that respect the acceleration and deceleration capabilities of the machine has not been attempted. In this paper, the offline detection of feed rate sensitive corners is proposed. The velocity profile in these zones is planned so that chord errors are satisfied while simultaneously accommodating the machine's acceleration and deceleration limits. Outside the zone of the feed rate sensitive corners, the feed rate is planned using the Taylor approximation. Simulation results indicate that the offline detection of feed rate sensitive corners improves parametric interpolation. For real time interpolation, the parametric curve information can be augmented with the detected feed rate sensitive corners that are stored in 2×2 matrices.

Journal ArticleDOI
TL;DR: The World Health Report 2000 focuses on the performance of health-care systems around the globe and it is suggested that the WHO's estimation procedure is too narrow and that contextual information is hidden by the use of one method.
Abstract: The World Health Report 2000 focuses on the performance of health-care systems around the globe. The report uses efficiency measurement techniques to create a league table of health-care systems, highlighting good and bad performers. Efficiency is measured using panel data methods. This paper suggests that the WHO's estimation procedure is too narrow and that contextual information is hidden by the use of one method. This paper uses and validates a range of parametric and non-parametric empirical methods to measure efficiency using the WHO data. The rankings obtained are compared to the WHO league table and we demonstrate that there are trends and movements of interest within the league tables. We recommend that the WHO broaden its range of techniques in order to reveal this hidden information. Copyright © 2002 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: To exchange CAD models using the macro-parametric approach, a set of standard modeling commands should be defined and the process of developing the set is explained.
Abstract: The current version of STEP standard cannot exchange the parametric information of CAD models. Only pure boundary representations that cannot be parametrically edited are transferable [Geometric Modeling: Theory and Practice (1997)]. There are two approaches for the exchange of design intents such as parameters, features, and constraints. The first is an explicit approach based on constraints between pre-defined parameters and features. The second is a procedural approach based on the sequence of operations issued to construct the models. The authors have previously proposed a macro-parametric approach [International Journal of CAD/CAM 2 (2002) 23], which is a variation of the procedural approach. In this approach, CAD models can be exchanged in the form of macro files, which include the history of modeling commands. To exchange CAD models using the macro-parametric approach, a set of standard modeling commands should be defined. This paper introduces a set of standard commands and explains the process of developing the set.

Proceedings ArticleDOI
02 Jun 2003
TL;DR: Three novel path-based algorithms for statistical timing analysis and parametric yield prediction of digital integrated circuits are proposed and results in the face of statistical temperature and Vdd variations are presented.
Abstract: Uncertainty in circuit performance due to manufacturing and environmental variations is increasing with each new generation of technology. It is therefore important to predict the performance of a chip as a probabilistic quantity. This paper proposes three novel algorithms for statistical timing analysis and parametric yield prediction of digital integrated circuits. The methods have been implemented in the context of the EinsTimer static timing analyzer. Numerical results are presented to study the strengths and weaknesses of these complementary approaches. Across-the-chip variability continues to be accommodated by EinsTimer's "Linear Combination of Delay (LCD)" mode. Timing analysis results in the face of statistical temperature and Vdd variations are presented on an industrial ASIC part on which a bounded timing methodology leads to surprisingly wrong results.

Patent
22 Apr 2003
TL;DR: In this article, a psycho-acoustically motivated, parametric description of the spatial attributes of multichannel audio signals is proposed, which allows strong bitrate reductions in audio coders, since only one monaural signal has to be transmitted, combined with quantized spatial properties of the signal.
Abstract: In summary, this application describes a psycho-acoustically motivated, parametric description of the spatial attributes of multichannel audio signals. This parametric description allows strong bitrate reductions in audio coders, since only one monaural signal has to be transmitted, combined with (quantized) parameters which describe the spatial properties of the signal. The decoder can form the original amount of audio channels by applying the spatial parameters. For near-CD-quality stereo audio, a bitrate associated with these spatial parameters of 10 kbit/s or less seems sufficient to reproduce the correct spatial impression at the receiving end.

Journal ArticleDOI
TL;DR: In this article, a nonparametric, residual-based block bootstrap procedure is proposed in the context of testing for integrated (unit root) time series, which is based on weak assumptions on the dependence structure of the stationary process driving the random walk.
Abstract: A nonparametric, residual-based block bootstrap procedure is proposed in the context of testing for integrated (unit root) time series. The resampling procedure is based on weak assumptions on the dependence structure of the stationary process driving the random walk and successfully generates unit root integrated pseudo-series retaining the important characteristics of the data. It is more general than previous bootstrap approaches to the unit root problem in that it allows for a very wide class of weakly dependent processes and it is not based on any parametric assumption on the process generating the data. As a consequence the procedure can accurately capture the distribution of many unit root test statistics proposed in the literature. Large sample theory is developed and the asymptotic validity of the block bootstrap-based unit root testing is shown via a bootstrap functional limit theorem. Applications to some particular test statistics of the unit root hypothesis, i.e., least squares and Dickey-Fuller type statistics are given. The power properties of our procedure are investigated and compared to those of alternative bootstrap approaches to carry out the unit root test. Some simulations examine the finite sample performance of our procedure.

Journal ArticleDOI
TL;DR: The bootstrap has made a fundamental impact on how we carry out statistical inference in problems without analytic solutions; this fact is illustrated with examples and comments that emphasize the parametric bootstrap and hypothesis testing.
Abstract: The bootstrap has made a fundamental impact on how we carry out statistical inference in problems without analytic solutions. This fact is illustrated with examples and comments that emphasize the parametric bootstrap and hypothesis testing.
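The parametric bootstrap hypothesis test emphasized here follows a simple recipe: fit the null model to the data, simulate many datasets from the fitted model, and compare the observed test statistic to its simulated null distribution. The sketch below applies that recipe to an exponential null with sample skewness as the statistic; the null model, statistic, and function name are all our illustrative choices, not the article's examples.

```python
import numpy as np

rng = np.random.default_rng(3)

def parametric_bootstrap_pvalue(x, reps=2000):
    """Parametric bootstrap test of H0: data are Exponential(scale).

    Steps: estimate the null parameter (MLE), simulate `reps` datasets from
    the fitted null, and compare the observed statistic (sample skewness)
    to the simulated null distribution. Illustrative sketch only.
    """
    def skew(a):
        a = np.asarray(a, float)
        return ((a - a.mean()) ** 3).mean() / a.std() ** 3

    t_obs = skew(x)
    scale = x.mean()  # MLE of the exponential scale under H0
    n = len(x)
    t_boot = np.array(
        [skew(rng.exponential(scale, size=n)) for _ in range(reps)]
    )
    # Two-sided p-value: fraction of simulated statistics at least as far
    # from the null distribution's center as the observed one.
    return np.mean(np.abs(t_boot - t_boot.mean()) >= abs(t_obs - t_boot.mean()))
```

Because the null distribution of the statistic is built by simulation rather than derived analytically, the same recipe works unchanged for statistics and models with no closed-form sampling distribution, which is the "problems without analytic solutions" point of the article.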