
Showing papers on "Parametric model published in 2000"


01 May 2000
TL;DR: In this paper, a new class of multivariate models called dynamic conditional correlation (DCC) models is proposed; these have the flexibility of univariate GARCH models coupled with parsimonious parametric models for the correlations.
Abstract: Time-varying correlations are often estimated with multivariate GARCH models that are linear in squares and cross products of returns. A new class of multivariate models called dynamic conditional correlation (DCC) models is proposed. These have the flexibility of univariate GARCH models coupled with parsimonious parametric models for the correlations. They are not linear but can often be estimated very simply with univariate or two-step methods based on the likelihood function. It is shown that they perform well in a variety of situations and give sensible empirical results.
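A minimal numpy sketch of the DCC recursion described above (illustrative: the parameters a and b are fixed rather than estimated, and the standardized residuals that a real two-step fit would obtain from univariate GARCH models are simulated directly):

```python
import numpy as np

# Hypothetical standardized residuals: in a real two-step fit these would be
# returns divided by univariate GARCH(1,1) conditional volatilities.
rng = np.random.default_rng(0)
T, a, b = 500, 0.05, 0.90          # illustrative DCC parameters, not estimates
eps = rng.standard_normal((T, 2))
eps[:, 1] = 0.6 * eps[:, 0] + np.sqrt(1 - 0.6**2) * eps[:, 1]

Qbar = np.cov(eps, rowvar=False)    # unconditional correlation target
Q = Qbar.copy()
rho = np.empty(T)
for t in range(T):
    if t > 0:
        e = eps[t - 1][:, None]
        Q = (1 - a - b) * Qbar + a * (e @ e.T) + b * Q
    d = np.sqrt(np.diag(Q))
    R = Q / np.outer(d, d)          # conditional correlation matrix R_t
    rho[t] = R[0, 1]
print("mean conditional correlation:", rho.mean().round(3))
```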

1,327 citations


Journal ArticleDOI
TL;DR: A new approach to the correction of intensity inhomogeneities in magnetic resonance imaging (MRI) that significantly improves intensity-based tissue segmentation and overcomes limitations of methods based on homomorphic filtering is presented.
Abstract: Presents a new approach to the correction of intensity inhomogeneities in magnetic resonance imaging (MRI) that significantly improves intensity-based tissue segmentation. The distortion of the image brightness values by a low-frequency bias field impedes visual inspection and segmentation. The new correction method called parametric bias field correction (PABIC) is based on a simplified model of the imaging process, a parametric model of tissue class statistics, and a polynomial model of the inhomogeneity field. The authors assume that the image is composed of pixels assigned to a small number of categories with a priori known statistics. Further they assume that the image is corrupted by noise and a low-frequency inhomogeneity field. The estimation of the parametric bias field is formulated as a nonlinear energy minimization problem using an evolution strategy (ES). The resulting bias field is independent of the image region configurations and thus overcomes limitations of methods based on homomorphic filtering. Furthermore, PABIC can correct bias distortions much larger than the image contrast. Input parameters are the intensity statistics of the classes and the degree of the polynomial function. The polynomial approach combines bias correction with histogram adjustment, making it well suited for normalizing the intensity histogram of datasets from serial studies. The authors present simulations and a quantitative validation with phantom and test images. A large number of MR image data acquired with breast, surface, and head coils, both in two dimensions and three dimensions, have been processed and demonstrate the versatility and robustness of this new bias correction scheme.
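A toy one-dimensional analogue of the PABIC formulation (known class means, an additive quadratic bias field, and a general-purpose optimizer standing in for the paper's evolution strategy):

```python
import numpy as np
from scipy.optimize import minimize

# Toy 1-D analogue of PABIC: pixels from two tissue classes with known means,
# corrupted by a smooth polynomial bias. We estimate the polynomial by
# minimizing the distance of corrected intensities to the nearest class mean
# (a gradient-free stand-in for the paper's evolution strategy).
rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 400)
mus = np.array([50.0, 120.0])                    # a priori class means
true = rng.choice(mus, size=x.size)
bias = 10 * x + 8 * x**2                          # unknown low-order bias
img = true + bias + rng.normal(0, 2, x.size)      # additive bias assumed

def energy(c):
    corrected = img - np.polyval(c, x)
    return np.sum(np.min((corrected[:, None] - mus) ** 2, axis=1))

c_hat = minimize(energy, np.zeros(3), method="Nelder-Mead").x
print("estimated bias polynomial coefficients:", np.round(c_hat, 2))
```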

549 citations


Journal ArticleDOI
TL;DR: In this paper, a nonparametric model of the generalized mass, damping and stiffness matrices is proposed, which does not require identifying the uncertain local parameters and obviates construction of functions that map the domains of uncertain local parameter vectors into the generalized matrices.

499 citations


Journal ArticleDOI
TL;DR: A general nonparametric mixture model that extends models and improves estimation methods proposed by other researchers and extends Cox's proportional hazards regression model by allowing a proportion of event-free patients and investigating covariate effects on that proportion.
Abstract: Nonparametric methods have attracted less attention than their parametric counterparts for cure rate analysis. In this paper, we study a general nonparametric mixture model. The proportional hazards assumption is employed in modeling the effect of covariates on the failure time of patients who are not cured. The EM algorithm, the marginal likelihood approach, and multiple imputations are employed to estimate parameters of interest in the model. This model extends models and improves estimation methods proposed by other researchers. It also extends Cox's proportional hazards regression model by allowing a proportion of event-free patients and investigating covariate effects on that proportion. The model and its estimation method are investigated by simulations. An application to breast cancer data, including comparisons with previous analyses using a parametric model and an existing nonparametric model by other researchers, confirms the conclusions from the parametric model but not those from the existing nonparametric model.
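The EM machinery is easiest to see in a deliberately simplified parametric special case (constant cure fraction, exponential latency, no covariates), rather than the paper's nonparametric proportional hazards model; a sketch:

```python
import numpy as np

# Simplified cure model for illustration: cure fraction pi, exponential
# latency with rate lam for the uncured, right censoring at time c.
rng = np.random.default_rng(2)
n, pi_true, lam_true, c = 2000, 0.3, 0.5, 8.0
cured = rng.random(n) < pi_true
t = np.where(cured, np.inf, rng.exponential(1 / lam_true, n))
obs, delta = np.minimum(t, c), (t <= c).astype(float)

pi, lam = 0.5, 1.0                      # EM starting values
for _ in range(200):
    # E-step: posterior probability of being cured (zero if event observed)
    s = np.exp(-lam * obs)              # latency survival at censoring time
    w = np.where(delta == 1, 0.0, pi / (pi + (1 - pi) * s))
    # M-step: closed-form updates in this exponential special case
    pi = w.mean()
    lam = delta.sum() / ((1 - w) * obs).sum()
print(f"pi_hat={pi:.3f}, lam_hat={lam:.3f}")
```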

404 citations


Proceedings ArticleDOI
01 May 2000
TL;DR: In this paper, the role of parametric modeling as an application for the global computing grid is examined, and some heuristics which make it possible to specify soft real-time deadlines for large computational experiments are explored.
Abstract: This paper examines the role of parametric modeling as an application for the global computing grid, and explores some heuristics which make it possible to specify soft real-time deadlines for large computational experiments. We demonstrate the scheme with a case study utilizing the Globus toolkit running on the GUSTO testbed.

341 citations


Journal ArticleDOI
TL;DR: In this paper, the authors used classification and regression trees (CART), a recursive partitioning method that is free of distributional assumptions, to predict the distribution of three major oak species in California.
Abstract: The use of Generalized Linear Models (GLM) in vegetation analysis has been advocated to accommodate complex species response curves. This paper investigates the potential advantages of using classification and regression trees (CART), a recursive partitioning method that is free of distributional assumptions. We used multiple logistic regression (a form of GLM) and CART to predict the distribution of three major oak species in California. We compared two types of model: polynomial logistic regression models optimized to account for non-linearity and factor interactions, and simple CART-models. Each type of model was developed using learning data sets of 2085 and 410 sample cases, and assessed on test sets containing 2016 and 3691 cases, respectively. The responses of the three species to environmental gradients were varied and often non-homogeneous or context dependent. We tested the methods for predictive accuracy: CART-models performed significantly better than our polynomial logistic regression models in four of the six cases considered, and as well in the two remaining cases. CART also showed a superior ability to detect factor interactions. Insight gained from CART-models then helped develop improved parametric models. Although the probabilistic form of logistic regression results is more adapted to test theories about species responses to environmental gradients, we found that CART-models are intuitive, easy to develop and interpret, and constitute a valuable tool for modeling species distributions.
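The kind of comparison reported here is easy to reproduce on synthetic data; a sketch with scikit-learn (the "temperature x rainfall" gradients and the built-in interaction are hypothetical, not the paper's data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

# Synthetic "species presence" data with a factor interaction: presence
# depends on temperature only where rainfall is high.
rng = np.random.default_rng(3)
X = rng.uniform(0, 1, (3000, 2))                  # [temperature, rainfall]
p = np.where(X[:, 1] > 0.5, 1 / (1 + np.exp(-10 * (X[:, 0] - 0.5))), 0.1)
y = rng.random(3000) < p
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

glm = make_pipeline(PolynomialFeatures(2), LogisticRegression(max_iter=1000))
cart = DecisionTreeClassifier(max_depth=4, random_state=0)
for name, m in [("polynomial GLM", glm), ("CART", cart)]:
    m.fit(Xtr, ytr)
    print(name, "test accuracy:", round(m.score(Xte, yte), 3))
```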

341 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show that the monotonicity of the conditional hazard in traditional ACD models is both econometrically important and empirically invalid, and they introduce a more flexible parametric model which is easy to fit and performs well both in simulation studies and in practice.
Abstract: This paper shows that the monotonicity of the conditional hazard in traditional ACD models is both econometrically important and empirically invalid. To counter this problem we introduce a more flexible parametric model which is easy to fit and performs well both in simulation studies and in practice. In an empirical application to NYSE price duration processes, we show that non-monotonic conditional hazard functions are indicated for all stocks. Recently proposed specification tests for financial duration models clearly reject the standard ACD models, whereas the results for the new model are quite favorable.
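To see why the error distribution matters, compare hazard shapes; a quick numpy sketch (the Burr family is used here as one flexible choice allowing non-monotonicity, and the parameter values are illustrative):

```python
import numpy as np

# Hazard shapes for ACD error distributions: Weibull errors force a monotone
# conditional hazard, while a Burr distribution allows a hump-shaped one.
x = np.linspace(0.01, 5, 500)

def weibull_hazard(x, k):            # unit scale: h(x) = k * x**(k-1)
    return k * x ** (k - 1)

def burr_hazard(x, k, s2):           # h(x) = k*x**(k-1) / (1 + s2*x**k)
    return k * x ** (k - 1) / (1 + s2 * x ** k)

for name, h in [("Weibull k=0.8", weibull_hazard(x, 0.8)),
                ("Weibull k=1.5", weibull_hazard(x, 1.5)),
                ("Burr k=1.5, s2=1", burr_hazard(x, 1.5, 1.0))]:
    mono = np.all(np.diff(h) >= 0) or np.all(np.diff(h) <= 0)
    desc = "monotone" if mono else f"non-monotonic (peak near x={x[np.argmax(h)]:.2f})"
    print(f"{name}: {desc}")
```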

246 citations


Journal ArticleDOI
TL;DR: In this paper, the authors consider M-estimators of general parametric models and show that the component-wise asymptotic normality of the estimate remains valid if the dimension of the parameter space grows more slowly than some root of the sample size.

228 citations


Journal ArticleDOI
TL;DR: It is proposed that one such model, a (possibly over-fitted) cubic smoothing spline, may be used to define a suitable reference curve against which the fit of a parametric model may be checked, and a significance test is suggested for the purpose.
Abstract: Low-dimensional parametric models are well understood, straightforward to communicate to other workers, have very smooth curves and may easily be checked for consistency with background scientific knowledge or understanding. They should therefore be ideal tools with which to represent smooth relationships between a continuous predictor and an outcome variable in medicine and epidemiology. Unfortunately, a seriously restricted set of such models is used routinely in practical data analysis - typically, linear, quadratic or occasionally cubic polynomials, or sometimes a power or logarithmic transformation of a covariate. Since their flexibility is limited, it is not surprising that the fit of such models is often poor. Royston and Altman's recent work on fractional polynomials has extended the range of available functions. It is clearly crucial that the chosen final model fits the data well. Achieving a good fit with minimal restriction on the functional form has been the motivation behind the major recent research effort on non-parametric curve-fitting techniques. Here I propose that one such model, a (possibly over-fitted) cubic smoothing spline, may be used to define a suitable reference curve against which the fit of a parametric model may be checked. I suggest a significance test for the purpose and examine its type I error and power in a small simulation study. Several families of parametric models, including some with sigmoid curves, are considered. Their suitability in fitting regression relationships found in several real data sets is investigated. With all the example data sets, a simple parametric model can be found which fits the data approximately as well as a cubic smoothing spline, but without the latter's tendency towards artefacts in the fitted curve.
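One way to picture the proposed check, as a rough sketch (a generic F-type lack-of-fit comparison with hand-picked effective degrees of freedom and a known noise scale for the spline smoothing; not the paper's exact test):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.stats import f as f_dist

# Compare a low-dimensional parametric fit against a cubic smoothing spline
# reference curve via an F-type lack-of-fit statistic.
rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 3, 200))
y = np.log1p(x) + rng.normal(0, 0.1, x.size)      # true curve is logarithmic

X = np.column_stack([np.ones_like(x), x, x**2])   # quadratic parametric model
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
rss_param = np.sum((y - X @ beta) ** 2)

# smoothing level chosen from the (here known) noise variance
spline = UnivariateSpline(x, y, k=3, s=x.size * 0.1**2)
rss_spline = np.sum((y - spline(x)) ** 2)
df1, df2 = 10, x.size - 13                        # rough effective df (assumed)
F = ((rss_param - rss_spline) / df1) / (rss_spline / df2)
print(f"F={F:.2f}, p={f_dist.sf(F, df1, df2):.3f}")
```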

217 citations


Journal ArticleDOI
TL;DR: In this article, a semiparametric approach to smoothing sample extremes, based on local polynomial fitting of the generalized extreme value distribution and related models, is proposed, which is applied to data on extreme temperatures and on record times for the women's 3000 m race.
Abstract: Trends in sample extremes are of interest in many contexts, an example being environmental statistics. Parametric models are often used to model trends in such data, but they may not be suitable for exploratory data analysis. This paper outlines a semiparametric approach to smoothing sample extremes, based on local polynomial fitting of the generalized extreme value distribution and related models. The uncertainty of fits is assessed by using resampling methods. The methods are applied to data on extreme temperatures and on record times for the women's 3000 m race.
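A sketch of the local-likelihood idea for the GEV case (Gaussian kernel weights, a locally linear location parameter, and an illustrative bandwidth and starting values; the resampling-based uncertainty assessment is omitted):

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

# Local-likelihood smoothing of simulated annual maxima: at each target time
# t0, fit a GEV with locally linear location by maximizing a kernel-weighted
# log-likelihood; shape and scale are held locally constant.
rng = np.random.default_rng(5)
years = np.arange(1950, 2000)
maxima = genextreme.rvs(-0.1, loc=30 + 0.05 * (years - 1950), scale=2,
                        size=years.size, random_state=rng)

def local_gev_loc(t0, h=10.0):
    w = np.exp(-0.5 * ((years - t0) / h) ** 2)        # Gaussian kernel weights
    def nll(p):
        mu0, mu1, log_sig, xi = p
        mu = mu0 + mu1 * (years - t0)
        return -np.sum(w * genextreme.logpdf(maxima, -xi, loc=mu,
                                             scale=np.exp(log_sig)))
    res = minimize(nll, [maxima.mean(), 0.0, np.log(maxima.std()), 0.1],
                   method="Nelder-Mead")
    return res.x[0]                                   # smoothed location at t0

for t0 in (1955, 1975, 1995):
    print(t0, round(local_gev_loc(t0), 2))
```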

161 citations


Journal ArticleDOI
TL;DR: A nonparametric approach to estimating temporal trends when fitting parametric models to extreme values from a weakly dependent time series is suggested and the Gaussian distribution is shown to have special features that permit it to play a universal role as a "nominal" model for the marginal distribution.
Abstract: A topic of major current interest in extreme-value analysis is the investigation of temporal trends. For example, the potential influence of "greenhouse" effects may result in severe storms becoming gradually more frequent, or in maximum temperatures gradually increasing, with time. One approach to evaluating these possibilities is to fit, to data, a parametric model for temporal parameter variation, as well as a model describing the marginal distribution of data at any given point in time. However, structural trend models can be difficult to formulate in many circumstances, owing to the complex way in which different factors combine to influence data in the form of extremes. Moreover, it is not advisable to fit trend models without empirical evidence of their suitability. In this paper, motivated by datasets on windstorm severity and maximum temperature, we suggest a nonparametric approach to estimating temporal trends when fitting parametric models to extreme values from a weakly dependent time series. We illustrate the method through applications to time series where the marginal distributions are approximately Pareto, generalized-Pareto, extreme-value or Gaussian. We introduce time-varying probability plots to assess goodness of fit, we discuss local-likelihood approaches to fitting the marginal model within a window and we propose temporal cross-validation for selecting window width. In cases where both location and scale are estimated together, the Gaussian distribution is shown to have special features that permit it to play a universal role as a "nominal" model for the marginal distribution.

Journal ArticleDOI
TL;DR: The Bayesian classification, including the determination of a correspondence between unordered random features, is shown to be tractable, yielding a classification algorithm, a method for estimating error rates, and a tool for evaluating the performance sensitivity.
Abstract: A Bayesian approach is presented for model-based classification of images with application to synthetic-aperture radar. Posterior probabilities are computed for candidate hypotheses using physical features estimated from sensor data along with features predicted from these hypotheses. The likelihood scoring allows propagation of uncertainty arising in both the sensor data and object models. The Bayesian classification, including the determination of a correspondence between unordered random features, is shown to be tractable, yielding a classification algorithm, a method for estimating error rates, and a tool for evaluating the performance sensitivity. The radar image features used for classification are point locations with an associated vector of physical attributes; the attributed features are adopted from a parametric model of high-frequency radar scattering. With the emergence of wideband sensor technology, these physical features expand interpretation of radar imagery to access the frequency- and aspect-dependent scattering information carried in the image phase.

Journal ArticleDOI
TL;DR: In this article, a principal components-based approach known as Min/Max Autocorrelation Factor (MAF) is applied to the modeling of pore-size distributions in partially welded tuff.
Abstract: In many fields of the Earth Sciences, one is interested in the distribution of particle or void sizes within samples. Like many other geological attributes, size distributions exhibit spatial variability, and it is convenient to view them within a geostatistical framework, as regionalized functions or curves. Since they rarely conform to simple parametric models, size distributions are best characterized using their raw spectrum as determined experimentally in the form of a series of abundance measures corresponding to a series of discrete size classes. However, the number of classes may be large and the class abundances may be highly cross-correlated. In order to model the spatial variations of discretized size distributions using current geostatistical simulation methods, it is necessary to reduce the number of variables considered and to render them uncorrelated among one another. This is achieved using a principal components-based approach known as Min/Max Autocorrelation Factors (MAF). For a two-structure linear model of coregionalization, the approach has the attractive feature of producing orthogonal factors ranked in order of increasing spatial correlation. Factors consisting largely of noise and exhibiting pure nugget–effect correlation structures are isolated in the lower rankings, and these need not be simulated. The factors to be simulated are those capturing most of the spatial correlation in the data, and they are isolated in the highest rankings. Following a review of MAF theory, the approach is applied to the modeling of pore-size distributions in partially welded tuff. Results of the case study confirm the usefulness of the MAF approach for the simulation of large numbers of coregionalized variables.
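A numerical sketch of MAF on a one-dimensional transect: the factors are generalized eigenvectors of the lag-1 increment (variogram) matrix against the data covariance, so small eigenvalues correspond to the most spatially continuous factors (all data here are simulated, not the tuff case study):

```python
import numpy as np
from scipy.linalg import eigh

# Min/Max Autocorrelation Factors: generalized eigenproblem of the lag-1
# increment covariance G against the data covariance B.
rng = np.random.default_rng(6)
n, k = 500, 5
smooth = np.cumsum(rng.normal(size=(n, 2)), axis=0)      # spatially continuous
Z = smooth @ rng.normal(size=(2, k)) + rng.normal(0, 1, (n, k))  # plus nugget
Z -= Z.mean(0)

B = np.cov(Z, rowvar=False)
D = Z[1:] - Z[:-1]                                       # lag-1 increments
G = 0.5 * np.cov(D, rowvar=False)                        # variogram matrix
vals, vecs = eigh(G, B)                                  # ascending eigenvalues
factors = Z @ vecs
# lag-1 autocorrelation of each factor: low eigenvalue = high autocorrelation
ac = [np.corrcoef(f[:-1], f[1:])[0, 1] for f in factors.T]
print("eigenvalues:      ", np.round(vals, 3))
print("autocorrelations: ", np.round(ac, 3))
```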

Journal ArticleDOI
TL;DR: In this paper, the authors compare non-parametric and parametric approaches to the analysis of data in the form of replicated spatial point patterns in two or more experimental groups, and compare mean K-functions between experimental groups using a bootstrap testing procedure.
Abstract: The paper compares non-parametric (design-based) and parametric (model-based) approaches to the analysis of data in the form of replicated spatial point patterns in two or more experimental groups. Basic questions for data of this kind concern estimating the properties of the underlying spatial point process within each experimental group, and comparing the properties between groups. A non-parametric approach, building on work by Diggle et al. (1991), summarizes each pattern by an estimate of the reduced second moment measure or K-function (Ripley (1977)) and compares mean K-functions between experimental groups using a bootstrap testing procedure. A parametric approach fits particular classes of parametric model to the data, uses the model parameter estimates as summaries and tests for differences between groups by comparing fits with and without the assumption of common parameter values across groups. The paper discusses how either approach can be implemented in the specific context of a single-factor replicated experiment and uses simulations to show how the parametric approach can be more efficient when the underlying model assumptions hold, but potentially misleading otherwise.
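A compact sketch of the design-based side of the comparison (a naive K estimator on the unit square with no edge correction, and a label permutation standing in for the paper's bootstrap procedure):

```python
import numpy as np

# Compare mean K-functions between two groups of replicated point patterns.
rng = np.random.default_rng(7)

def k_hat(pts, r):
    # naive K estimate in the unit square: |A| = 1, no edge correction
    n = len(pts)
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    counts = np.array([(d[np.triu_indices(n, 1)] < ri).sum() for ri in r])
    return 2 * counts / (n * (n - 1))

r = np.linspace(0.02, 0.2, 10)
group_a = [rng.random((50, 2)) for _ in range(8)]                 # CSR patterns
group_b = [np.repeat(rng.random((10, 2)), 5, 0) +
           rng.normal(0, 0.02, (50, 2)) for _ in range(8)]        # clustered
K = np.array([k_hat(p, r) for p in group_a + group_b])

obs = np.abs(K[:8].mean(0) - K[8:].mean(0)).max()                 # test statistic
perm = []
for _ in range(999):
    idx = rng.permutation(16)
    perm.append(np.abs(K[idx[:8]].mean(0) - K[idx[8:]].mean(0)).max())
print("p-value:", (1 + (np.array(perm) >= obs).sum()) / 1000)
```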

Journal ArticleDOI
01 Sep 2000
TL;DR: In this article, the authors provide a description of current issues and considerations in parametric BRDF modeling and discuss the main requirements to be met in model formulation and application and examine the status of the discussion on potential biophysical interpretations of the derived semi-empirical model parameters.
Abstract: Data products comprising bidirectional reflectance distribution function (BRDF) and albedo parameters generated from multiangular satellite reflectance observations are entering into operational production. These data products, which represent an earth science data type that has not been available before, are based upon semiempirical modeling of the reflectance anisotropy of the earth's land surface. This paper provides a description of current issues and considerations in such parametric BRDF modeling and discusses the main requirements to be met in model formulation and application. Various linear kernel-driven and multiplicative BRDF models are considered with respect to model formulation, validation, and angular interpolation and extrapolation of the reflectance observations. The paper also examines the status of the discussion on potential biophysical interpretations of the derived semiempirical model parameters. This overview of parametric BRDF and albedo modeling hopefully will aid users of the n...
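Linear kernel-driven models of the kind discussed here have the form R = f_iso + f_vol*K_vol + f_geo*K_geo, so inversion from multiangular observations is ordinary least squares; a sketch with stand-in kernel values (operationally these come from, e.g., Ross-thick and Li-sparse kernel formulas evaluated at each sun/view geometry):

```python
import numpy as np

# Invert a linear kernel-driven BRDF model from noisy multiangular samples.
rng = np.random.default_rng(8)
n_obs = 12
K_vol = rng.uniform(-0.2, 0.6, n_obs)     # hypothetical volumetric kernel values
K_geo = rng.uniform(-1.5, 0.0, n_obs)     # hypothetical geometric kernel values
f_true = np.array([0.25, 0.15, 0.05])     # [f_iso, f_vol, f_geo]
A = np.column_stack([np.ones(n_obs), K_vol, K_geo])
refl = A @ f_true + rng.normal(0, 0.005, n_obs)   # simulated reflectances

f_hat, *_ = np.linalg.lstsq(A, refl, rcond=None)
print("retrieved kernel weights:", np.round(f_hat, 3))
# Albedo is then a fixed linear combination of the same retrieved weights.
```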

Journal ArticleDOI
TL;DR: In this paper, a new method for estimating parameters of non-linear parametric models that uses internal feedback to account for nonlinearities is presented, and the method estimates the linear frequency response matrix and nonlinear system parameters at forced and unforced degrees of freedom of general multiple-degree-of-freedom nonlinear systems simultaneously.

Journal ArticleDOI
TL;DR: This work reviews five nonstationary models that the authors regard as most useful and concludes that antedependence models should be given much greater consideration than they have historically received.
Abstract: An important theme of longitudinal data analysis in the past two decades has been the development and use of explicit parametric models for the data's variance-covariance structure. A variety of these models have been proposed, of which most are second-order stationary. A few are flexible enough to accommodate nonstationarity, i.e., nonconstant variances and/or correlations that are not a function solely of elapsed time between measurements. We review five nonstationary models that we regard as most useful: (1) the unstructured covariance model, (2) unstructured antedependence models, (3) structured antedependence models, (4) autoregressive integrated moving average and similar models, and (5) random coefficients models. We evaluate the relative strengths and limitations of each model, emphasizing when it is inappropriate or unlikely to be useful. We present three examples to illustrate the fitting and comparison of the models and to demonstrate that nonstationary longitudinal data can be modeled effectively and, in some cases, quite parsimoniously. In these examples, the antedependence models generally prove to be superior and the random coefficients models prove to be inferior. We conclude that antedependence models should be given much greater consideration than they have historically received.

Book
01 Jan 2000
TL;DR: This book develops sensitivity theory for parametric models of control systems, covering sensitivity functions and sensitivity equations for continuous and discontinuous systems (including Lyapunov's second method), sensitivity invariants, sensitivity of mathematical programming and variational problems, and applied problems such as identification and the synthesis of insensitive systems.
Abstract: Contents: Preface. Parametric Models: State Variables and Control System Parameters; Parametric Models of Control Systems; Sensitivity Functions and their Applications. Finite-Dimensional Continuous Systems: Finite-Dimensional Continuous Systems Depending on a Parameter; Second Lyapunov's Method in Sensitivity Theory; Sensitivity on Infinite Time Intervals; Sensitivity Analysis of Self-Oscillating Systems in the Time Domain; Sensitivity of Non-Autonomous Systems; Sensitivity of Solutions of Boundary Value Problems. Finite-Dimensional Discontinuous Systems: Sensitivity Equations for Finite Dimensional Discontinuous Systems; Sensitivity Equations for Relay Systems; Sensitivity Equations for Pulse and Relay-Pulse Systems. Discontinuous Systems Given by Operator Models: Operator Parametric Models of Control Systems; Operator Models of Discontinuous Systems; Sensitivity of Operator Models; Sensitivity Equations of Relay and Pulse Systems. Non-Time Characteristics: Sensitivity of Transfer Function and Frequency Response of Linear Systems; Sensitivity of Zeros and Poles; Sensitivity of Eigenvalues and Eigenvectors of Linear Time Invariant Control Systems; Sensitivity of Integral Quality Indices; Indirect Characteristics of Sensitivity Functions. Sensitivity Invariants: Sensitivity Invariants of Time Characteristics; Root and Transfer Function Sensitivity Invariants; Sensitivity Invariants of Frequency Responses; Sensitivity Invariants of Integral Estimates; Sensitivity Invariants for Gyroscopic Systems. Sensitivity of Mathematical Programming: Sensitivity of Linear Programming Problems; Sensitivity of Optimal Solutions of Non-Linear Programming Problems; Sensitivity of a Simplest Variational Problem; Sensitivity of Variational Problems with Flexible Boundaries and Corner Points; Sensitivity of Variational Problems on Conditional Extremum. Applied Sensitivity Problems: Direct and Inverse Problems of Sensitivity Theory; Identification of Dynamic Systems; Distribution of Parameter Tolerance; Synthesis of Insensitive Systems; Numerical Solution of Sensitivity Equations.
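The sensitivity functions that organize the book are simple to compute numerically: for dx/dt = f(x, theta), the sensitivity s = dx/dtheta solves the sensitivity equation ds/dt = (df/dx)s + df/dtheta, integrated alongside the state. A sketch for a scalar example (the system dx/dt = -theta*x is chosen here purely for illustration):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Sensitivity equation for dx/dt = -theta*x:  ds/dt = -theta*s - x.
theta, x0 = 0.8, 1.0

def rhs(t, y):
    x, s = y
    return [-theta * x, -theta * s - x]

sol = solve_ivp(rhs, (0, 5), [x0, 0.0], dense_output=True)
t = 2.0
x_t, s_t = sol.sol(t)
print("sensitivity via sensitivity equation:", s_t)
print("analytic  -t*x0*exp(-theta*t):       ", -t * x0 * np.exp(-theta * t))
```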

Journal ArticleDOI
TL;DR: In this article, the authors show that a non-linear time series can be represented by a deterministic time-dependent coefficient model without first specifying the nature of the nonlinearity.

Journal ArticleDOI
TL;DR: In this paper, the authors consider the role of econometrics in decision making under uncertainty, and they use Bayes procedures based on parametric models with approximate prior distributions.

Journal ArticleDOI
TL;DR: In this article, a nonparametric regression estimator is proposed that uses prior information on regression shape in the form of a parametric model, though it is not suitable for binary data.

Journal ArticleDOI
TL;DR: Previously proposed "distributed source" models for bearing estimation problems are generalized to a parametric spatio-temporal model for what is called "partially coherently distributed (PCD) sources", yielding an overall spatio-temporal channel model that is more realistic than formerly proposed models.
Abstract: The problem of using sensor array measurements to estimate the bearing of a radiating source surrounded by local scatterers is considered. The concept of "partial coherence" is introduced to account for temporal as well as spatial correlation effects often encountered in a Rayleigh fading-type propagation channel formed between a source and sensor array elements. A simple parametric model for temporal channel correlation is presented, yielding an overall spatio-temporal channel model that is more realistic than formerly proposed models (which assume either full or zero temporal channel correlation). Thus, previously proposed "distributed source" models for bearing estimation problems are generalized to a parametric spatio-temporal model for what is called "partially coherently distributed (PCD) sources". A study of the associated Cramer-Rao Bound (CRB) is undertaken for a simple but illustrative problem formulation. The inherent limitations in bearing estimation accuracy for this spatio-temporal problem are seen to lie between the cases of zero and full temporal correlation, becoming more severe as temporal channel correlation increases. In addition, the associated maximum likelihood estimators for source bearing are proposed, and their performance is compared with that predicted by the CRB.

01 Jan 2000
TL;DR: In this paper, two extensions of a parametric model are proposed, each one involving the score function of an alternative parametric model, and it is shown that the encompassing hypothesis is equivalent to standard conditions on the score of each of the extended models.
Abstract: Two extensions of a parametric model are proposed, each one involving the score function of an alternative parametric model. We show that the encompassing hypothesis is equivalent to standard conditions on the score of each of the extended models. The condition on the first extension gives rise to the standard score encompassing test, while the condition on the second extension induces a so-called reversed score encompassing test. A similar logic is applied to the likelihood ratio, generating a likelihood ratio and a reversed likelihood ratio encompassing test. The ensued test statistics can be based on simulations if certain calculations are too difficult to carry out analytically. We study the first order asymptotic properties of the proposed test statistics under general conditions.

Journal ArticleDOI
TL;DR: In this article, a comparison among cloudless-sky parameterization schemes for estimating radiant energy from available atmospheric parameters is presented, using data recorded at two different greenhouses in the Netherlands.

Proceedings ArticleDOI
10 Sep 2000
TL;DR: It is established that, for the proposed hierarchical parametric model, the Jacobian J may be easily constrained over the whole deformation field, by only constraining its values at a limited number of control points, yielding a fast topology preserving matching algorithm.
Abstract: We present a topology preserving image matching method, based on a hierarchical parametric model of the deformation map. The transformation is parameterized at different scales, using a decomposition of the deformation vector field over a sequence of nested subspaces, generated from a single compactly supported scaling function. To preserve topology, the positivity of the Jacobian of the continuous deformation is enforced. We establish that, for the proposed hierarchical parametric model, the Jacobian J may be easily constrained over the whole deformation field, by only constraining its values at a limited number of control points, yielding a fast topology preserving matching algorithm. Experimental results are presented both on simulated and real-world medical MR images.
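A one-dimensional analogue of the control-point argument (the paper works with multiscale spline spaces on 2-D/3-D images; here the displacement is piecewise linear, so the Jacobian is piecewise constant and checking finitely many values guarantees positivity everywhere):

```python
import numpy as np

# 1-D analogue of topology preservation: deformation phi(x) = x + u(x) with u
# piecewise linear over a knot grid. The Jacobian phi'(x) = 1 + u'(x) is then
# piecewise constant, so positivity on each knot interval (a finite check)
# implies positivity of the Jacobian over the whole field.
knots = np.linspace(0, 1, 11)
coef = 0.08 * np.sin(np.arange(11))          # displacement at each knot

slopes = np.diff(coef) / np.diff(knots)      # u' on each interval
jac = 1 + slopes                             # Jacobian per interval
print("min Jacobian over intervals:", jac.min().round(3),
      "-> topology preserved" if (jac > 0).all() else "-> folding!")
```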

Journal ArticleDOI
TL;DR: This paper proposes a unified approach to the estimation of regression parameters under double-sampling designs, in which a primary sample consisting of data on rough or proxy measures for the response and/or explanatory variables and a validation subsample consisting of the exact measurements are available.
Abstract: We propose a unified approach to the estimation of regression parameters under double-sampling designs, in which a primary sample consisting of data on the rough or proxy measures for the response and/or explanatory variables as well as a validation subsample consisting of data on the exact measurements are available. We assume that the validation sample is a simple random subsample from the primary sample. Our proposal utilizes a specific parametric model to extract the partial information contained in the primary sample. The resulting estimator is consistent even if such a model is misspecified, and it achieves higher asymptotic efficiency than the estimator based only on the validation data. Specific cases are discussed to illustrate the application of the estimator proposed.
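A toy illustration of why the proxy data help, in a regression-calibration style that is simpler than the paper's unified estimator (the variable names and the linear working model for E[X|W] are assumptions of this sketch):

```python
import numpy as np

# Double sampling: a proxy W of covariate X is observed for everyone; exact X
# only on a simple random validation subsample.
rng = np.random.default_rng(9)
n, n_val = 5000, 500
X = rng.normal(0, 1, n)
W = X + rng.normal(0, 0.8, n)               # noisy proxy measurement
Y = 1.0 + 2.0 * X + rng.normal(0, 1, n)
val = rng.choice(n, n_val, replace=False)   # random validation subsample

# Working model E[X|W] fitted on the validation sample...
a = np.polyfit(W[val], X[val], 1)
X_imp = np.polyval(a, W)                     # ...used to impute X everywhere
X_imp[val] = X[val]                          # keep exact values where known
beta = np.polyfit(X_imp, Y, 1)
print("slope using proxies naively:  ", np.polyfit(W, Y, 1)[0].round(3))
print("double-sampling slope estimate:", beta[0].round(3))
```

The naive slope is attenuated by the measurement error, while the calibrated estimate recovers the true value of 2 because the linear working model happens to hold here; the paper's point is that its estimator stays consistent even when such a working model is misspecified.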

Journal ArticleDOI
TL;DR: In this paper, the same data situations analyzed in the earlier work were reanalyzed using a complex nonlinear least-squares parametric method that has been employed to estimate discrete-line distributions since 1982 and continuous ones since 1993.

Journal ArticleDOI
TL;DR: A probabilistic framework for modeling single-trial functional magnetic resonance (fMR) images based on a parametric model for the hemodynamic response and Markov random field image models.
Abstract: Describes a probabilistic framework for modeling single-trial functional magnetic resonance (fMR) images based on a parametric model for the hemodynamic response and Markov random field (MRF) image models. The model is fitted to image data by maximizing a lower bound on the log likelihood. The result is an approximate maximum a posteriori estimate of the joint distribution over the model parameters and pixel labels. Examples show how this technique can be used to segment two-dimensional (2-D) fMR images, or parts thereof, into regions with different characteristics of their hemodynamic response.
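A minimal sketch of the parametric-HRF ingredient of such a framework, assuming a gamma-shaped response (a common choice used here for illustration; the paper's exact parametric form and the MRF segmentation layer are not reproduced):

```python
import numpy as np

# Build a single-trial regressor: stimulus onsets convolved with a gamma-shaped
# hemodynamic response function, then estimate the response amplitude.
def hrf(t, peak=6.0, a=6.0):
    return (t / peak) ** a * np.exp(-a * (t - peak) / peak)

tr, n_scans = 2.0, 100
t = np.arange(0, 32, tr)                      # HRF sampled at the TR
onsets = np.zeros(n_scans)
onsets[[10, 40, 70]] = 1                      # single-trial stimulus events
regressor = np.convolve(onsets, hrf(t))[:n_scans]

rng = np.random.default_rng(10)
voxel = 0.8 * regressor + rng.normal(0, 0.3, n_scans)  # simulated time course
beta = regressor @ voxel / (regressor @ regressor)     # least-squares amplitude
print("estimated response amplitude:", round(beta, 3))
```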

Journal ArticleDOI
TL;DR: In this paper, a decision rule that maximizes the minimum expected utility (or, equivalently, minimizes maximum risk) over a set of distributions is proposed, based on a concave program to find the least-favourable mixture of these distributions.
Abstract: Gilboa and Schmeidler (1989) develop a set of axioms for decision making under uncertainty. The axioms imply a utility function and a set of distributions such that the preference ordering is obtained by calculating expected utility with respect to each distribution in the set, and then taking the minimum of expected utility over the set. In a portfolio choice problem, the distributions are joint distributions for the data that will be available when the choice is made and for the future returns that will determine the value of the portfolio. The set of distributions could be generated by combining a parametric model with a set of prior distributions. We apply this framework to obtain a preference ordering over decision rules, which map the data into a choice. We seek a decision rule that maximizes the minimum expected utility (or, equivalently, minimizes maximum risk) over the set of distributions. An algorithm is provided for the case of a finite set of distributions. It is based on solving a concave programme to find the least-favourable mixture of these distributions. The minimax rule is a Bayes rule with respect to this least-favourable distribution. The minimax value is a lower bound for minimax risk relative to a larger set of distributions. An upper bound can be found by fixing a decision rule and calculating its maximum risk. We apply the algorithm to an estimation problem in an autoregressive, random-effects model for panel data. Copyright © 2000 John Wiley & Sons, Ltd.
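For a finite set of distributions, the least-favourable mixture can be found by linear programming (a special case of the concave program described above); a sketch with a hypothetical risk matrix, rows indexing decision rules and columns distributions:

```python
import numpy as np
from scipy.optimize import linprog

# Least-favourable mixture: weights lambda maximizing the Bayes risk
# min_k sum_d lambda_d * R[k, d]; the Bayes rule against that mixture is the
# minimax rule. The risk numbers below are made up for illustration.
R = np.array([[0.9, 0.2, 0.4],
              [0.3, 0.8, 0.5],
              [0.5, 0.5, 0.6]])
k, d = R.shape
# Variables: (lambda_1..lambda_d, v). Maximize v  <=>  minimize -v.
c = np.r_[np.zeros(d), -1.0]
A_ub = np.c_[-R, np.ones(k)]          # v - sum_d lambda_d R[k,d] <= 0, all k
b_ub = np.zeros(k)
A_eq = np.r_[np.ones(d), 0.0][None]   # mixture weights sum to one
res = linprog(c, A_ub, b_ub, A_eq, [1.0],
              bounds=[(0, None)] * d + [(None, None)])
lam, v = res.x[:d], -res.fun
print("least-favourable mixture:", np.round(lam, 3))
print("minimax value:", round(v, 3), " minimax rule:", np.argmin(R @ lam))
```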

Book ChapterDOI
01 Jan 2000
TL;DR: It is proposed that one consider sensitivity analysis by embedding standard parametric models in model extensions defined by replacing a parametric probability model with a nonparametric extension.
Abstract: We propose that one consider sensitivity analysis by embedding standard parametric models in model extensions defined by replacing a parametric probability model with a nonparametric extension. The non-parametric model could replace the entire probability model, or some level of a hierarchical model. Specifically, we define nonparametric extensions of a parametric probability model using Dirichlet process (DP) priors. Similar approaches have been used in the literature to implement formal model fit diagnostics (Carota et al., 1996).
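A sketch of the embedding idea, using a truncated stick-breaking construction of a DP prior centred at a parametric base measure G0 (standard normal here; the truncation level and concentration values are illustrative):

```python
import numpy as np

# Nonparametric extension via a Dirichlet process prior DP(alpha, G0), with
# the parametric model as base measure G0. Large alpha keeps random
# distributions close to G0; small alpha allows substantial departures,
# which is the embedding exploited for sensitivity analysis.
rng = np.random.default_rng(11)

def dp_draw(alpha, g0_sampler, n_atoms=500):
    betas = rng.beta(1, alpha, n_atoms)
    w = betas * np.cumprod(np.r_[1.0, 1 - betas[:-1]])   # stick-breaking weights
    return w / w.sum(), g0_sampler(n_atoms)              # truncated, renormalized

for alpha in (1.0, 100.0):
    w, atoms = dp_draw(alpha, lambda m: rng.normal(0, 1, m))
    mean = w @ atoms
    var = w @ (atoms - mean) ** 2
    print(f"alpha={alpha}: random-distribution mean={mean:+.2f}, var={var:.2f}")
```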