
Parametric statistics

About: Parametric statistics is a research topic. Over the lifetime, 39200 publications have been published within this topic receiving 765761 citations.


Papers
Journal ArticleDOI
TL;DR: In this paper, a method for recovery of compact volumetric models for shape representation of single-part objects in computer vision is introduced, where the model recovery is formulated as a least-squares minimization of a cost function for all range points belonging to a single part.
Abstract: A method for recovery of compact volumetric models for shape representation of single-part objects in computer vision is introduced. The models are superquadrics with parametric deformations (bending, tapering, and cavity deformation). The input for the model recovery is three-dimensional range points. Model recovery is formulated as a least-squares minimization of a cost function for all range points belonging to a single part. During an iterative gradient descent minimization process, all model parameters are adjusted simultaneously, recovering the position, orientation, size, and shape of the model, such that most of the given range points lie close to the model's surface. A specific solution among several acceptable solutions, which are all minima in the parameter space, can be reached by constraining the search to a part of the parameter space. The many shallow local minima in the parameter space are avoided as solutions by using a stochastic technique during minimization. Results using real range data show that the recovered models are stable and that the recovery procedure is fast.

596 citations
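The least-squares formulation above can be sketched with a simplified superquadric (superellipsoid) inside-outside function. This is a minimal illustration under stated assumptions, not the paper's exact method: pose parameters and the bending/tapering/cavity deformations are omitted, the residual is a simple function of the inside-outside value, and `scipy.optimize.least_squares` stands in for the paper's gradient descent with stochastic perturbations.

```python
import numpy as np
from scipy.optimize import least_squares

def superellipsoid_residuals(params, pts):
    # params: sizes a1, a2, a3 and shape exponents e1, e2 (pose omitted for brevity)
    a1, a2, a3, e1, e2 = params
    x, y, z = pts.T
    # superellipsoid inside-outside function; F = 1 for points on the surface
    f = (((x / a1) ** 2) ** (1 / e2) + ((y / a2) ** 2) ** (1 / e2)) ** (e2 / e1) \
        + ((z / a3) ** 2) ** (1 / e1)
    return f ** (e1 / 2) - 1.0  # zero for range points lying on the model surface

# synthetic "range points" on a unit sphere (a1 = a2 = a3 = 1, e1 = e2 = 1)
rng = np.random.default_rng(0)
d = rng.normal(size=(500, 3))
pts = d / np.linalg.norm(d, axis=1, keepdims=True)

# all model parameters are adjusted simultaneously from a deliberately wrong start
fit = least_squares(superellipsoid_residuals,
                    x0=[0.5, 0.5, 0.5, 0.8, 0.8],
                    args=(pts,),
                    bounds=([0.1] * 5, [10, 10, 10, 2, 2]))
print(np.round(fit.x, 2))  # should recover sizes near 1 and exponents near 1
```

Constraining the exponents through `bounds` plays the role the paper assigns to restricting the search to a part of the parameter space.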

Journal ArticleDOI
TL;DR: The development of a new course in the control curriculum dealing with the control of systems subject to parametric uncertainty is presented. The course is rich in theoretical content, easy to motivate from a practical standpoint, and requires just the right level of mathematics to be taught as a fundamental discipline to engineers and scientists.

593 citations

Journal ArticleDOI
TL;DR: This paper presents a method that uses the level sets of volumes to reconstruct the shapes of 3D objects from range data and presents an analytical characterization of the surface that maximizes the posterior probability, and presents a novel computational technique for level-set modeling, called the sparse-field algorithm.
Abstract: This paper presents a method that uses the level sets of volumes to reconstruct the shapes of 3D objects from range data. The strategy is to formulate 3D reconstruction as a statistical problem: find the surface which is most likely, given the data and some prior knowledge about the application domain. The resulting optimization problem is solved by an incremental process of deformation. We represent a deformable surface as the level set of a discretely sampled scalar function of three dimensions, i.e., a volume. Such level-set models have been shown to mimic conventional deformable surface models by encoding surface movements as changes in the greyscale values of the volume. The result is a voxel-based modeling technology that offers several advantages over conventional parametric models, including flexible topology, no need for reparameterization, concise descriptions of differential structure, and a natural scale space for hierarchical representations. This paper builds on previous work in both 3D reconstruction and level-set modeling. It presents a fundamental result in surface estimation from range data: an analytical characterization of the surface that maximizes the posterior probability. It also presents a novel computational technique for level-set modeling, called the sparse-field algorithm, which combines the advantages of a level-set approach with the computational efficiency and accuracy of a parametric representation. The sparse-field algorithm is more efficient than other approaches, and because it assigns the level set to a specific set of grid points, it positions the level-set model more accurately than the grid itself. These properties, computational efficiency and subcell accuracy, are essential when trying to reconstruct the shapes of 3D objects. Results are shown for the reconstruction of objects from sets of noisy and overlapping range maps.

593 citations

Journal ArticleDOI
TL;DR: The time-rescaling theorem may be used to develop goodness-of-fit tests for both parametric and histogram-based point process models of neural spike trains, and a proof using only elementary probability theory arguments is presented.
Abstract: Measuring agreement between a statistical model and a spike train data series, that is, evaluating goodness of fit, is crucial for establishing the model's validity prior to using it to make inferences about a particular neural system. Assessing goodness-of-fit is a challenging problem for point process neural spike train models, especially for histogram-based models such as peristimulus time histograms (PSTH) and rate functions estimated by spike train smoothing. The time-rescaling theorem is a well-known result in probability theory, which states that any point process with an integrable conditional intensity function may be transformed into a Poisson process with unit rate. We describe how the theorem may be used to develop goodness-of-fit tests for both parametric and histogram-based point process models of neural spike trains. We apply these tests in two examples: a comparison of PSTH, inhomogeneous Poisson, and inhomogeneous Markov interval models of neural spike trains from the supplementary eye field of a macaque monkey and a comparison of temporal and spatial smoothers, inhomogeneous Poisson, inhomogeneous gamma, and inhomogeneous inverse Gaussian models of rat hippocampal place cell spiking activity. To help make the logic behind the time-rescaling theorem more accessible to researchers in neuroscience, we present a proof using only elementary probability theory arguments. We also show how the theorem may be used to simulate a general point process model of a spike train. Our paradigm makes it possible to compare parametric and histogram-based neural spike train models directly. These results suggest that the time-rescaling theorem can be a valuable tool for neural spike train data analysis.

590 citations
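A minimal sketch of the time-rescaling goodness-of-fit test described above, assuming the simplest case of an inhomogeneous Poisson process with a known intensity; the particular intensity function, interval length, and seed are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(1)

# simulate an inhomogeneous Poisson "spike train" with intensity
# lam(t) = 5 + 4*sin(t) on [0, T] by thinning a rate-lam_max Poisson process
T, lam_max = 200.0, 9.0
lam = lambda t: 5.0 + 4.0 * np.sin(t)
cand = np.cumsum(rng.exponential(1.0 / lam_max, size=int(3 * lam_max * T)))
cand = cand[cand < T]
spikes = cand[rng.random(cand.size) < lam(cand) / lam_max]

# time-rescaling: with Lam(t) the integral of lam, the increments
# z_k = Lam(t_k) - Lam(t_{k-1}) are iid Exponential(1) when the model is correct
Lam = lambda t: 5.0 * t + 4.0 * (1.0 - np.cos(t))
z = np.diff(Lam(spikes))

# map the rescaled intervals to Uniform(0,1) and apply a
# Kolmogorov-Smirnov goodness-of-fit test
u = 1.0 - np.exp(-z)
stat, p = kstest(u, "uniform")
print(stat, p)
```

A misspecified intensity (for instance, dropping the sinusoidal term) would typically drive the KS statistic up and the p-value toward zero, which is how the test discriminates between candidate spike train models.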

Journal ArticleDOI
TL;DR: In this article, a quantum mechanical model for parametric interactions is used to evaluate the effect of the measuring (amplifying) process on the statistical properties of radiation, and it is shown that it allows a simultaneous determination of the phase and number of quanta of an electromagnetic wave with an accuracy which is limited only by the uncertainty principle.
Abstract: A quantum mechanical model for parametric interactions is used to evaluate the effect of the measuring (amplifying) process on the statistical properties of radiation. Parametric amplification is shown to be ideal in the sense that it allows a simultaneous determination of the phase and number of quanta of an electromagnetic wave with an accuracy which is limited only by the uncertainty principle. Frequency conversion via parametric processes is shown to be free of zero-point fluctuations.

590 citations


Network Information
Related Topics (5)
- Nonlinear system: 208.1K papers, 4M citations, 90% related
- Matrix (mathematics): 105.5K papers, 1.9M citations, 84% related
- Artificial neural network: 207K papers, 4.5M citations, 83% related
- Estimator: 97.3K papers, 2.6M citations, 83% related
- Differential equation: 88K papers, 2M citations, 83% related
Performance Metrics
Number of papers in the topic in previous years:

Year    Papers
2025    2
2024    2
2023    3,966
2022    7,822
2021    1,968
2020    2,033