
Showing papers on "Piecewise published in 1999"


Journal ArticleDOI
TL;DR: The approach exploits the gain-scheduling nature of fuzzy systems and results in stability conditions that can be verified via convex optimization over linear matrix inequalities, and special attention is given to the computational aspects of the approach.
Abstract: Presents an approach to stability analysis of fuzzy systems. The analysis is based on Lyapunov functions that are continuous and piecewise quadratic. The approach exploits the gain-scheduling nature of fuzzy systems and results in stability conditions that can be verified via convex optimization over linear matrix inequalities. Examples demonstrate the many improvements over analysis based on a single quadratic Lyapunov function. Special attention is given to the computational aspects of the approach and several methods to improve the computational efficiency are described.

775 citations
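The stability test described in this abstract reduces to linear matrix inequality feasibility. As a toy illustration only (not the paper's piecewise method, which joins several such conditions across fuzzy operating regions), the sketch below checks quadratic stability of a single linear subsystem x' = Ax in pure Python by solving the 2x2 Lyapunov equation AᵀP + PA = −I and testing that P is positive definite. The matrix A is an arbitrary stable example, not taken from the paper.

```python
def quadratic_lyapunov_2x2(A):
    """Solve A^T P + P A = -I for symmetric P = [[p11, p12], [p12, p22]].

    For a 2x2 system this is a 3x3 linear system in (p11, p12, p22),
    solved here by Cramer's rule.
    """
    a, b = A[0]
    c, d = A[1]
    # Rows: the (1,1), (1,2) and (2,2) entries of A^T P + P A = -I.
    M = [[2 * a, 2 * c, 0.0],
         [b, a + d, c],
         [0.0, 2 * b, 2 * d]]
    rhs = [-1.0, 0.0, -1.0]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    D = det3(M)
    sol = []
    for col in range(3):
        Mc = [row[:] for row in M]
        for r in range(3):
            Mc[r][col] = rhs[r]
        sol.append(det3(Mc) / D)
    p11, p12, p22 = sol
    # Positive definiteness via leading principal minors (Sylvester's criterion).
    stable = p11 > 0 and p11 * p22 - p12 * p12 > 0
    return (p11, p12, p22), stable

# x' = Ax with eigenvalues -1 and -2: quadratically stable.
P, ok = quadratic_lyapunov_2x2([[0.0, 1.0], [-2.0, -3.0]])
```

In the piecewise quadratic setting, one such matrix P is sought per region, with continuity conditions tying them together at the region boundaries; that coupling is what turns the problem into an LMI solved by convex optimization.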


Proceedings ArticleDOI
Jehee Lee1, Sung Yong Shin1
01 Jul 1999
TL;DR: This paper presents a technique for adapting existing motion of a human-like character to have the desired features that are specified by a set of constraints, and combines a hierarchical curve fitting technique with a new inverse kinematics solver.
Abstract: This paper presents a technique for adapting existing motion of a human-like character to have the desired features that are specified by a set of constraints. This problem can typically be formulated as a spacetime constraint problem. Our approach combines a hierarchical curve fitting technique with a new inverse kinematics solver. Using the kinematics solver, we can adjust the configuration of an articulated figure to meet the constraints in each frame. Through the fitting technique, the motion displacement of every joint at each constrained frame is interpolated and thus smoothly propagated to frames. We are able to adaptively add motion details to satisfy the constraints within a specified tolerance by adopting a multilevel B-spline representation, which also provides a speedup for the interpolation. The performance of our system is further enhanced by the new inverse kinematics solver. We present a closed-form solution to compute the joint angles of a limb linkage. This analytical method greatly reduces the burden of a numerical optimization to find the solutions for full degrees of freedom of a human-like articulated figure. We demonstrate that the technique can be used for retargeting a motion to compensate for geometric variations caused by both characters and environments. Furthermore, we can also use this technique for directly manipulating a motion clip through a graphical interface. CR Categories: I.3.7 [Computer Graphics]: Three-dimensional Graphics—Animation; G.1.2 [Numerical Analysis]: Approximation—Spline and piecewise polynomial approximation

551 citations


Proceedings ArticleDOI
23 Jun 1999
TL;DR: A probabilistic multiple-hypothesis framework for tracking highly articulated objects where the probability density of the tracker state is represented as a set of modes with piecewise Gaussians characterizing the neighborhood around these modes.
Abstract: This paper describes a probabilistic multiple-hypothesis framework for tracking highly articulated objects. In this framework, the probability density of the tracker state is represented as a set of modes with piecewise Gaussians characterizing the neighborhood around these modes. The temporal evolution of the probability density is achieved through sampling from the prior distribution, followed by local optimization of the sample positions to obtain updated modes. This method of generating hypotheses from state-space search does not require the use of discrete features unlike classical multiple-hypothesis tracking. The parametric form of the model is suited for high dimensional state-spaces which cannot be efficiently modeled using non-parametric approaches. Results are shown for tracking Fred Astaire in a movie dance sequence.

378 citations



Journal ArticleDOI
TL;DR: In this paper, the authors investigated time harmonic Maxwell equations in heterogeneous media, where the permeability μ and the permittivity ε are piecewise constant and the associated boundary value problem can be interpreted as a transmission problem.
Abstract: We investigate time harmonic Maxwell equations in heterogeneous media, where the permeability μ and the permittivity ε are piecewise constant. The associated boundary value problem can be interpreted as a transmission problem. In a very natural way the interfaces can have edges and corners. We give a detailed description of the edge and corner singularities of the electromagnetic fields.

293 citations


Journal ArticleDOI
TL;DR: The skeleton of a non-degenerate pluri-stable formal scheme is given in this article, where a colored polysimplicial set associated with a nondegenerate poly-stable fibration is presented.
Abstract: Contents: 0. Introduction; 1. Piecewise R_S-linear spaces; 2. R-colored polysimplicial sets; 3. R-colored polysimplicial sets of length l; 4. The skeleton of a nondegenerate pluri-stable formal scheme; 5. A colored polysimplicial set associated with a nondegenerate poly-stable fibration; 6. p-Adic analytic and piecewise linear spaces; 7. Strong local contractibility of smooth analytic spaces; 8. Cohomology with coefficients in the sheaf of constant functions.

290 citations


01 Jan 1999
TL;DR: This thesis develops efficient combinatorial optimization algorithms for several important classes of energy functions which incorporate everywhere smooth, piecewise constant, and piecewise smooth priors, and demonstrates the effectiveness of the approach on image restoration, stereo, and motion.
Abstract: Energy minimization is an elegant approach to computer vision. Vision problems usually have many solutions due to uncertainties in the imaging process and ambiguities in visual interpretation. The energy function encodes the problem constraints, and its minimum gives the optimal solution. Despite numerous advantages, this approach is severely limited by its high computational cost. The main contribution of my thesis lies in developing efficient combinatorial optimization algorithms for several important classes of energy functions which incorporate everywhere smooth, piecewise constant, and piecewise smooth priors. These priors assume, respectively, that the quantity to be estimated varies smoothly over its domain, consists of several pieces with constant values, or consists of several pieces with smoothly varying values. The algorithms rely on graph cuts as a powerful optimization technique. For a certain everywhere smooth prior we develop an algorithm which finds the exact minimum by computing a single graph cut. This method is most suitable for estimating quantities without discontinuities. But even when discontinuities exist, the method produces good results in certain cases. The running time is low-order polynomial. For several wide classes of piecewise smooth priors we develop two approximation algorithms (we show that exact minimization is NP-hard in these cases). These algorithms produce a local minimum in interesting large move spaces. Furthermore, one of them finds a solution within a known factor of the optimum. The algorithms are iterative and compute several graph cuts at each iteration. The running time at each iteration is effectively linear due to the special graph structure. In practice it takes just a few iterations to converge. Moreover, most of the progress happens during the first iteration. For a certain piecewise constant prior we adapt the algorithms developed for the piecewise smooth prior. One of them finds a solution within a factor of two of the optimum. In addition we develop a third algorithm which finds a local minimum in yet another move space. We demonstrate the effectiveness of our approach on image restoration, stereo, and motion. For data with ground truth, our methods significantly outperform standard methods.

282 citations
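The piecewise constant prior mentioned in this thesis is, in one dimension, the classical Potts model. While the thesis minimizes it with graph cuts on 2D image grids, on a 1D chain the same energy can be minimized exactly by dynamic programming; the sketch below is that swapped-in 1D illustration, not the thesis's algorithm, and its data, label set, and smoothness weight are made up for the example.

```python
def potts_1d(data, labels, lam):
    """Exactly minimize sum_i (data[i]-l_i)^2 + lam * sum_i [l_i != l_{i+1}]
    over labelings l by dynamic programming along the chain."""
    n, m = len(data), len(labels)
    cost = [[(data[0] - l) ** 2 for l in labels]]  # cost[i][k]: best energy ending in label k
    back = []
    for i in range(1, n):
        row, brow = [], []
        for k in range(m):
            # Cheapest predecessor label, paying lam on a label change.
            j = min(range(m), key=lambda j: cost[-1][j] + (lam if j != k else 0.0))
            row.append(cost[-1][j] + (lam if j != k else 0.0) + (data[i] - labels[k]) ** 2)
            brow.append(j)
        cost.append(row)
        back.append(brow)
    # Backtrack from the best final label.
    k = min(range(m), key=lambda k: cost[-1][k])
    out = [labels[k]]
    for brow in reversed(back):
        k = brow[k]
        out.append(labels[k])
    return out[::-1]

# A noisy step signal snaps to a piecewise constant labeling.
restored = potts_1d([0.0, 0.2, 0.1, 5.0, 4.9, 5.1], labels=[0.0, 5.0], lam=1.0)
```

The 2D version of this energy is NP-hard for three or more labels, which is why the thesis resorts to graph-cut approximation algorithms with known optimality factors.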


Journal ArticleDOI
TL;DR: In this paper, it was shown that the adiabatic theorem holds for the ground state of atoms in a quantized radiation field without the traditional gap condition. The general result gives no information on the rate at which the adiabatic limit is approached, but with additional spectral information one can also estimate this rate.
Abstract: We prove the adiabatic theorem for quantum evolution without the traditional gap condition. All that this adiabatic theorem needs is a (piecewise) twice differentiable finite dimensional spectral projection. The result implies that the adiabatic theorem holds for the ground state of atoms in quantized radiation field. The general result we prove gives no information on the rate at which the adiabatic limit is approached. With additional spectral information one can also estimate this rate.

258 citations


Journal ArticleDOI
TL;DR: In this article, a theoretical framework for the classification of border collision bifurcations is presented, which can help in explaining bifurcations in all systems that can be represented by two-dimensional piecewise smooth maps.
Abstract: Recent investigations on the bifurcations in switching circuits have shown that many atypical bifurcations can occur in piecewise smooth maps that cannot be classified among the generic cases like saddle-node, pitchfork, or Hopf bifurcations occurring in smooth maps. In this paper we first present experimental results to establish the need for the development of a theoretical framework and classification of the bifurcations resulting from border collision. We then present a systematic analysis of such bifurcations by deriving a normal form - the piecewise linear approximation in the neighborhood of the border. We show that there can be eleven qualitatively different types of border collision bifurcations depending on the parameters of the normal form, and these are classified under six cases. We present a partitioning of the parameter space of the normal form showing the regions where different types of bifurcations occur. This theoretical framework will help in explaining bifurcations in all systems, which can be represented by two-dimensional piecewise smooth maps.

255 citations
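The piecewise linear normal form analyzed in work of this kind acts on the plane with a different linear piece on either side of the border x = 0. A common parameterization (used here as an assumption, with arbitrary trace and determinant values not taken from the paper) is (x, y) → (τx + y + μ, −δx), with (τ, δ) switching at the border; the sketch iterates this map and converges to the fixed point born on the right side for μ > 0.

```python
def normal_form_step(x, y, mu, left=(0.3, 0.1), right=(0.5, 0.2)):
    """One step of the 2D border collision normal form:
    (x, y) -> (tau*x + y + mu, -delta*x), with the pair (tau, delta)
    chosen by the side of the border x = 0."""
    tau, delta = left if x <= 0.0 else right
    return tau * x + y + mu, -delta * x

# Iterate toward the right-side fixed point x* = mu / (1 - tau_R + delta_R),
# y* = -delta_R * x*; with these parameters the right-side eigenvalues are
# complex with modulus sqrt(0.2), so the orbit spirals in.
x, y = 1.0, 0.0
for _ in range(200):
    x, y = normal_form_step(x, y, mu=1.0)
```

Varying μ through zero and changing the (τ, δ) pairs reproduces the qualitatively different border collision scenarios the paper classifies.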


Proceedings ArticleDOI
01 Jun 1999
TL;DR: The novelty of the approach lies in the use of inter-image homographies to validate and best estimate the plane, and in the minimal initialization requirements: only a single 3D line with a textured neighbourhood is required to generate a plane hypothesis.
Abstract: A new method is described for automatically reconstructing 3D planar faces from multiple images of a scene. The novelty of the approach lies in the use of inter-image homographies to validate and best estimate the plane, and in the minimal initialization requirements: only a single 3D line with a textured neighbourhood is required to generate a plane hypothesis. The planar facets enable line grouping and also the construction of parts of the wireframe which were missed due to the inevitable shortcomings of feature detection and matching. The method allows a piecewise planar model of a scene to be built completely automatically, with no user intervention at any stage, given only the images and camera projection matrices as input. The robustness and reliability of the method are illustrated on several examples, from both aerial and interior views.

237 citations


Journal ArticleDOI
01 Jun 1999
TL;DR: In this article, a nonholonomic ground mobile base tracking an arbitrarily shaped continuous ground curve is considered, where the shape of the image curve is controllable only up to its "linear" curvature parameters.
Abstract: Theoretical and analytical aspects of the visual servoing problem have not received much attention. Furthermore, the problem of estimation from the vision measurements has been considered separately from the design of the control strategies. Instead of addressing the pose estimation and control problems separately, we attempt to characterize the types of control tasks which can be achieved using only quantities directly measurable in the image, bypassing the pose estimation phase. We consider the task of navigation for a nonholonomic ground mobile base tracking an arbitrarily shaped continuous ground curve. This tracking problem is formulated as one of controlling the shape of the curve in the image plane. We study the controllability of the system characterizing the dynamics of the image curve, and show that the shape of the image curve is controllable only up to its "linear" curvature parameters. We present stabilizing control laws for tracking piecewise analytic curves, and propose to track arbitrary curves by approximating them by piecewise "linear" curvature curves. Simulation results are given for these control schemes. Observability of the curve dynamics by using direct measurements from vision sensors as the outputs is studied and an extended Kalman filter is proposed to dynamically estimate the image quantities needed for feedback control from the actual noisy images.

BookDOI
01 Jan 1999
TL;DR: In this article, the authors propose a non-interior predictor-corrector path-following method for non-linear Dirichlet problems, based on the Smoothing Newton method.
Abstract: Preface. Solving Complementarity Problems by Means of a New Smooth Constrained Nonlinear Solver R. Andreani, J.M. Martinez. ε-Enlargements of Maximal Monotone Operators: Theory and Applications R.S. Burachik, et al. A Non-Interior Predictor-Corrector Path-Following Method for LCP J.V. Burke, S. Xu. Smoothing Newton Methods for Nonsmooth Dirichlet Problems X. Chen, et al. Frictional Contact Algorithms Based on Semismooth Newton Methods P.W. Christensen, J.-S. Pang. Well-Posed Problems and Error Bounds in Optimization S. Deng. Modeling and Solution Environments for MPEC: GAMS & MATLAB S.P. Dirkse, M.C. Ferris. Merit Functions and Stability for Complementarity Problems A. Fischer. Minimax and Triality Theory in Nonsmooth Variational Problems D.Y. Gao. Global and Local Superlinear Convergence Analysis of Newton-Type Methods for Semismooth Equations with Smooth Least Squares H. Jiang, D. Ralph. Inexact Trust-Region Methods for Nonlinear Complementarity Problems C. Kanzow, M. Zupke. Regularized Newton Methods for Minimization of Convex Quadratic Splines with Singular Hessians W. Li, J. Swetits. Regularized Linear Programs with Equilibrium Constraints O.L. Mangasarian. Reformulations of a Bicriterion Equilibrium Model P. Marcotte. A Smoothing Function and Its Applications J.-M. Peng. On the Local Super-Linear Convergence of a Matrix Secant Implementation of the Variable Metric Proximal Point Algorithm for Monotone Operators M. Qian, J.V. Burke. Reformulation of a Problem of Economic Equilibrium A.M. Rubinov, B.M. Glover. A Globally Convergent Inexact Newton Method for Systems of Monotone Equations M.V. Solodov, B.F. Svaiter. On the Limiting Behavior of the Trajectory of Regularized Solutions of a P0-Complementarity Problem R. Sznajder, M.S. Gowda. Analysis of a Non-Interior Continuation Method Based on Chen-Mangasarian Smoothing Functions for Complementarity Problems P. Tseng. A New Merit Function and a Descent Method for Semidefinite Complementarity Problems N. Yamashita, M. Fukushima. Numerical Experiments for a Class of Squared Smoothing Newton Methods for Box Constrained Variational Inequality Problems G. Zhou, et al.

Journal ArticleDOI
TL;DR: In this article, a control parameterization enhancing transform is introduced to convert approximate optimal control problems with variable switching times into equivalent standard control problems involving piecewise constant or piecewise linear control functions with pre-fixed switching times.
Abstract: Consider a general class of constrained optimal control problems in canonical form. Using the classical control parameterization technique, the time (planning) horizon is partitioned into several subintervals. The control functions are approximated by piecewise constant or piecewise linear functions with pre-fixed switching times. However, if the optimal control functions to be obtained are piecewise continuous, the accuracy of this approximation process greatly depends on how fine the partition is. On the other hand, the performance of any optimization algorithm used is limited by the number of decision variables of the problem. Thus, the time horizon cannot be partitioned into arbitrarily many subintervals to reach the desired accuracy. To overcome this difficulty, the switching points should also be taken as decision variables. This is the main motivation of the paper. A novel transform, to be referred to as the control parameterization enhancing transform, is introduced to convert approximate optimal control problems with variable switching times into equivalent standard optimal control problems involving piecewise constant or piecewise linear control functions with pre-fixed switching times. The transformed problems are essentially optimal parameter selection problems and hence are solvable by various existing algorithms. For illustration, two non-trivial numerical examples are solved using the proposed method.
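The transform described here trades variable switching times for fixed ones, so that what remains after the transform is an ordinary parameter vector of subinterval durations and control values. The sketch below shows only the bookkeeping half of that idea, as an illustration: evaluating a piecewise constant control u(t) from candidate durations and values (made-up numbers, not from the paper), as an optimizer would during each function evaluation.

```python
from bisect import bisect_right

def piecewise_constant_control(durations, values):
    """Build u(t) from subinterval durations and the constant control value
    taken on each subinterval; both are decision variables in the
    transformed (fixed-switching-time) problem."""
    if len(durations) != len(values):
        raise ValueError("need one control value per subinterval")
    switch = []  # cumulative switching times t_1 < t_2 < ... < T
    t = 0.0
    for d in durations:
        t += d
        switch.append(t)

    def u(t):
        if not 0.0 <= t <= switch[-1]:
            raise ValueError("t outside the planning horizon")
        # Right-continuous lookup of the active subinterval.
        return values[min(bisect_right(switch, t), len(values) - 1)]

    return u

# Three subintervals of durations 0.5, 1.5, 1.0 with controls 1, -1, 2.
u = piecewise_constant_control([0.5, 1.5, 1.0], [1.0, -1.0, 2.0])
```

In the transformed problem the durations live on a fixed grid in the new time variable, so a standard optimal parameter selection solver can adjust them without ever moving the switching points explicitly.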

Journal ArticleDOI
TL;DR: In this article, the authors considered the wave equation damped with a locally distributed nonlinear dissipation and showed that the energy of the system decays to zero with a precise decay rate.
Abstract: We consider the wave equation damped with a locally distributed nonlinear dissipation. We improve several earlier results of E. Zuazua and of M. Nakao in two directions: first, using the piecewise multiplier method introduced by K. Liu, we weaken the usual geometrical conditions on the localization of the damping. Then thanks to some new nonlinear integral inequalities, we eliminate the usual assumption on the polynomial growth of the feedback in zero and we show that the energy of the system decays to zero with a precise decay rate estimate.

Proceedings ArticleDOI
13 Sep 1999
TL;DR: This work presents an approach for 3D reconstruction of objects from a single image based on user-provided coplanarity, perpendicularity and parallelism constraints, used to calibrate the image and perform3D reconstruction.
Abstract: We present an approach for 3D reconstruction of objects from a single image. Obviously, constraints on the 3D structure are needed to perform this task. Our approach is based on user-provided coplanarity, perpendicularity and parallelism constraints. These are used to calibrate the image and perform 3D reconstruction. The method is described in detail and results are provided.

Journal ArticleDOI
TL;DR: This work presents an approach for the reconstruction and approximation of 3D CAD models from an unorganized collection of points that is flexible enough to permit interpolation of both smooth surfaces and sharp features, while placing few restrictions on the geometry or topology of the object.
Abstract: We present an approach for the reconstruction and approximation of 3D CAD models from an unorganized collection of points. Applications include rapid reverse engineering of existing objects for use in a virtual prototyping environment, including computer aided design and manufacturing. Our reconstruction approach is flexible enough to permit interpolation of both smooth surfaces and sharp features, while placing few restrictions on the geometry or topology of the object. Our algorithm is based on alpha-shapes to compute an initial triangle mesh approximating the surface of the object. A mesh reduction technique is applied to the dense triangle mesh to build a simplified approximation, while retaining important topological and geometric characteristics of the model. The reduced mesh is interpolated with piecewise algebraic surface patches which approximate the original points. The process is fully automatic, and the reconstruction is guaranteed to be homeomorphic and error bounded with respect to the original model when certain sampling requirements are satisfied. The resulting model is suitable for typical CAD modeling and analysis applications.

Journal ArticleDOI
TL;DR: A complete discussion of the optimal (global and local) order of convergence of piecewise polynomial collocation methods on graded grids for nonlinear Volterra integral equations with algebraic or logarithmic singularities in their kernels is presented.
Abstract: Second-kind Volterra integral equations with weakly singular kernels typically have solutions which are nonsmooth near the initial point of the interval of integration. Using an adaptation of the analysis originally developed for nonlinear weakly singular Fredholm integral equations, we present a complete discussion of the optimal (global and local) order of convergence of piecewise polynomial collocation methods on graded grids for nonlinear Volterra integral equations with algebraic or logarithmic singularities in their kernels.
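Graded grids of the kind used here cluster collocation points near the initial point t = 0, where the solution is nonsmooth; the standard construction is t_j = T(j/N)^r for a grading exponent r ≥ 1. A minimal sketch follows (in practice the choice of r is tied to the kernel singularity and the polynomial degree, which this sketch does not model):

```python
def graded_grid(T, N, r):
    """Graded mesh t_j = T * (j/N)**r for j = 0..N: for r > 1 the points
    cluster near t = 0, matching the solution's initial-point singularity;
    r = 1 recovers the uniform mesh."""
    return [T * (j / N) ** r for j in range(N + 1)]

mesh = graded_grid(T=1.0, N=4, r=2.0)  # clusters toward 0: first gap 1/16, last gap 7/16
```

Piecewise polynomial collocation on such a mesh restores the optimal convergence order that a uniform mesh loses to the boundary singularity.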

Journal ArticleDOI
TL;DR: In this paper, the authors consider the recovery of smooth region boundaries of piecewise constant coefficients of an elliptic PDE from data on the exterior boundary, where the values of the coefficients (a,b) are known a priori but the information about the geometry of the smooth region boundaries where a and b are discontinuous is missing.
Abstract: In this study we consider the recovery of smooth region boundaries of piecewise constant coefficients of an elliptic PDE, −∇·(a∇u) + bu = f, from data on the exterior boundary. The assumption made is that the values of the coefficients (a,b) are known a priori but the information about the geometry of the smooth region boundaries where a and b are discontinuous is missing. For the full characterization of (a,b) it is then sufficient to find the region boundaries separating different values of the coefficients. This leads to a nonlinear ill-posed inverse problem. In this study we propose a numerical algorithm that is based on the finite-element method and subdivision of the discretization elements. We formulate the forward problem as a mapping from a set of coefficients representing the boundary shapes to data on the exterior boundary, and derive the Jacobian of this forward mapping. Then an iterative algorithm which seeks a boundary configuration minimizing the residual norm between measured and predicted data is implemented. The method is illustrated first for a general elliptic PDE and then applied to optical tomography, where the goal is to find the diffusion and absorption coefficients of the object by transilluminating the object with visible or near-infrared light. Numerical test results for this specific application are given with synthetic data.

Posted Content
TL;DR: In this article, the authors presented versions of the proofs of two classic theorems of combinatorial topology, namely, the result that piecewise linearly homeomorphic simplicial complexes are related by stellar moves.
Abstract: Here are versions of the proofs of two classic theorems of combinatorial topology. The first is the result that piecewise linearly homeomorphic simplicial complexes are related by stellar moves. This is used in the proof, modelled on that of Pachner, of the second theorem. This states that moves from only a finite collection are needed to relate two triangulations of a piecewise linear manifold.

Journal ArticleDOI
TL;DR: The simplified results, which rely heavily on a careful dissection and improved understanding of the tangent cone of the feasible region of the program, bypass the combinatorial characterization that is intrinsic to B-stationarity.
Abstract: With the aid of some novel complementarity constraint qualifications, we derive some simplified primal-dual characterizations of a B-stationary point for a mathematical program with complementarity constraints (MPEC). The approach is based on a locally equivalent piecewise formulation of such a program near a feasible point. The simplified results, which rely heavily on a careful dissection and improved understanding of the tangent cone of the feasible region of the program, bypass the combinatorial characterization that is intrinsic to B-stationarity.

Journal ArticleDOI
TL;DR: Two algorithms are presented, one point-based and one element-based, that extract separation and attachment lines using eigenvalue analysis of a locally linear function, and it is shown that both algorithms detect open separation lines, a type of separation that is not captured by conventional vector field topology algorithms.
Abstract: Separation and attachment lines are topologically significant curves that exist on 2D surfaces in 3D vector fields. Two algorithms are presented, one point-based and one element-based, that extract separation and attachment lines using eigenvalue analysis of a locally linear function. Unlike prior techniques based on piecewise numerical integration, these algorithms use robust analytical tests that can be applied independently to any point in a vector field. The feature extraction is fully automatic and suited to the analysis of large-scale numerical simulations. The strengths and weaknesses of the two algorithms are evaluated using analytic vector fields and also results from computational fluid dynamics (CFD) simulations. We show that both algorithms detect open separation lines, a type of separation that is not captured by conventional vector field topology algorithms.

Proceedings ArticleDOI
23 Jun 1999
TL;DR: An explanation-based facial motion tracking algorithm based on a piecewise Bezier volume deformation model (PBVD) that takes the predefined action units as the initial articulation model and adaptively improves them during the tracking process to obtain a more realistic articulation models.
Abstract: Capturing real motions from video sequences is a powerful method for automatic building of facial articulation models. In this paper, we propose an explanation-based facial motion tracking algorithm based on a piecewise Bezier volume deformation model (PBVD). The PBVD is a suitable model both for the synthesis and the analysis of facial images. It is linear and independent of the facial mesh structure. With this model, basic facial movements, or action units, are interactively defined. By changing the magnitudes of these action units, animated facial images are generated. The magnitudes of these action units can also be computed from real video sequences using a model-based tracking algorithm. However, in order to customize the articulation model for a particular face, the predefined PBVD action units need to be adaptively modified. In this paper, we first briefly introduce the PBVD model and its application in facial animation. Then a multiresolution PBVD-based motion tracking algorithm is presented. Finally, we describe an explanation-based tracking algorithm that takes the predefined action units as the initial articulation model and adaptively improves them during the tracking process to obtain a more realistic articulation model. Experimental results on PBVD-based animation, model-based tracking, and explanation-based tracking are shown in this paper.

Journal ArticleDOI
TL;DR: An approach to solid-state electronic-structure calculations based on the finite-element method that combines the significant advantages of both real-space-grid and basis-oriented approaches and so promises to be particularly well suited for large, accurate calculations.
Abstract: We present an approach to solid-state electronic-structure calculations based on the finite-element method. In this method, the basis functions are strictly local, piecewise polynomials. Because the basis is composed of polynomials, the method is completely general and its convergence can be controlled systematically. Because the basis functions are strictly local in real space, the method allows for variable resolution in real space; produces sparse, structured matrices, enabling the effective use of iterative solution methods; and is well suited to parallel implementation. The method thus combines the significant advantages of both real-space-grid and basis-oriented approaches and so promises to be particularly well suited for large, accurate ab initio calculations. We develop the theory of our approach in detail, discuss advantages and disadvantages, and report initial results, including electronic band structures and details of the convergence of the method. © 1999 The American Physical Society

Journal ArticleDOI
TL;DR: It is shown that optimal discrete-valued control problems are equivalent to optimal control problems involving a new control function which is piecewise constant with pre-fixed switching points and can hence be readily solved by various existing algorithms.

Proceedings ArticleDOI
01 Nov 1999
TL;DR: A third-order finite volume scheme applicable to arbitrary 3D unstructured grids as well as Cartesian grids is presented, based on a piecewise discontinuous quadratic reconstruction of the data and a high-order flux integration using a Gauss quadrature rule.
Abstract: A critical evaluation of both the practical aspects and implementation strategies for higher order methods applied to unstructured and Cartesian meshes is presented. The aim of this paper is to present a third-order finite volume scheme applicable to arbitrary 3D unstructured grids as well as Cartesian grids. The method is based on a piecewise discontinuous quadratic reconstruction of the data and a high-order flux integration using a Gauss quadrature rule. Several results emphasize the improved accuracy with respect to second-order methods. In particular, pertinent aspects of the implementation for 3D in terms of memory requirements, efficiency, and performance are presented and conclusions are drawn as to the viability and appropriateness of such methods.

Journal ArticleDOI
TL;DR: In this article, the author's exact envelope function representation method is clarified and contrasted with that of the conventional method, and a simple example showing how to obtain correct operator ordering in electronic valence band Hamiltonians is worked out in detailed tutorial style.
Abstract: The increasing sophistication used in the fabrication of semiconductor nanostructures and in the experiments performed on them requires more sophisticated theoretical techniques than previously employed. The philosophy behind the author's exact envelope function representation method is clarified and contrasted with that of the conventional method. The significance of globally slowly varying envelope functions is explained. The difference between the envelope functions that appear in the author's envelope function representation and conventional envelope functions is highlighted and some erroneous statements made in the literature on the scope of envelope function methods are corrected. A perceived conflict between the standard effective mass Hamiltonian and the uncertainty principle is resolved demonstrating the limited usefulness of this principle in determining effective Hamiltonians. A simple example showing how to obtain correct operator ordering in electronic valence band Hamiltonians is worked out in detailed tutorial style. It is shown how the use of out of zone solutions to the author's approximate envelope function equations plays an essential role in their mathematically rigorous solution. In particular, a demonstration is given of the calculation of an approximate wavefunction for an electronic state in a one dimensional nanostructure with abrupt interfaces and disparate crystals using out of zone solutions alone. The author's work on the interband dipole matrix element for slowly varying envelope functions is extended to envelope functions without restriction. Exact envelope function equations are derived for multicomponent fields to emphasize that the author's method is a general one for converting a microscopic description to a mesoscopic one, applicable to linear partial differential equations with piecewise or approximately piecewise periodic coefficients. 
As an example, the method is applied to the derivation of approximate envelope function equations from the Maxwell equations for photonic nanostructures.
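The operator-ordering question the abstract refers to is the familiar ambiguity for a position-dependent effective mass: naively substituting m → m(z) in p²/2m does not fix where the derivatives act relative to 1/m(z). As a generic illustration (the standard symmetrized BenDaniel–Duke form, not necessarily the ordering the author derives), one Hermitian choice is

```latex
H \;=\; -\frac{\hbar^{2}}{2}\,\frac{d}{dz}\,\frac{1}{m(z)}\,\frac{d}{dz} \;+\; V(z),
```

which implies continuity of the envelope function $\psi(z)$ and of the probability-current-carrying combination $\frac{1}{m(z)}\frac{d\psi}{dz}$ at an abrupt interface.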

Journal ArticleDOI
TL;DR: In this article, the Girsanov transformation is used to derive estimates for the accuracy of piecewise approximations for one-sided and two-sided boundary crossing probabilities using repeated numerical integration.
Abstract: Using the Girsanov transformation, we derive estimates for the accuracy of piecewise approximations for one-sided and two-sided boundary crossing probabilities. We demonstrate that piecewise linear approximations can be calculated using repeated numerical integration. As an illustrative example we consider the case of one-sided and two-sided square-root boundaries, for which we also present analytical representations in the form of an infinite power series.
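The quantity being approximated here can be sketched with a simple Monte Carlo estimate: sample discrete Brownian paths and count how many exceed a boundary evaluated on a time grid, which is exactly a piecewise-linear view of the boundary. This is only an illustrative sketch, not the paper's repeated-numerical-integration scheme, and the boundary constant 1.5 is a hypothetical choice.

```python
import numpy as np

rng = np.random.default_rng(0)

def crossing_probability(boundary, T=1.0, n_steps=64, n_paths=100_000):
    """Monte Carlo estimate of P(sup_{t<=T} W_t >= b(t)) for Brownian
    motion W, with the boundary b sampled on a uniform time grid
    (i.e. a piecewise-linear approximation of the boundary)."""
    t = np.linspace(0.0, T, n_steps + 1)
    b = boundary(t)                                   # boundary at grid points
    dW = rng.normal(0.0, np.sqrt(T / n_steps), size=(n_paths, n_steps))
    W = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)], axis=1)
    return (W >= b).any(axis=1).mean()                # fraction that crossed

# square-root boundary b(t) = c * sqrt(1 + t), with c = 1.5 (hypothetical)
p = crossing_probability(lambda t: 1.5 * np.sqrt(1.0 + t))
```

The discrete-time estimate systematically misses crossings between grid points, so it is biased low; refining the grid (or the boundary-correction methods the paper analyzes) controls that error.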

Journal ArticleDOI
TL;DR: In this paper, the applicability of the sinc-approximating symmetrical piecewise nth-order polynomial kernels is investigated and it is concluded that while the improvement of cubic convolution over linear interpolation is significant, the use of higher order polynomials only yields marginal improvement.
Abstract: The reconstruction of images is an important operation in many applications. From sampling theory, it is well known that the sinc-function is the ideal interpolation kernel which, however, cannot be used in practice. In order to be able to obtain an acceptable reconstruction, both in terms of computational speed and mathematical precision, it is required to design a kernel that is of finite extent and resembles the sinc-function as much as possible. In this paper, the applicability of the sinc-approximating symmetrical piecewise nth-order polynomial kernels is investigated in satisfying these requirements. After the presentation of the general concept, kernels of first, third, fifth and seventh order are derived. An objective, quantitative evaluation of the reconstruction capabilities of these kernels is obtained by analyzing the spatial and spectral behavior using different measures, and by using them to translate, rotate, and magnify a number of real-life test images. From the experiments, it is concluded that while the improvement of cubic convolution over linear interpolation is significant, the use of higher order polynomials only yields marginal improvement.
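The third-order case mentioned in the abstract is the classic cubic convolution kernel: a symmetric piecewise cubic of support 4 that approximates the sinc function. A minimal sketch (the standard Keys kernel with a = -1/2, shown as a generic example rather than the paper's exact derivation):

```python
import numpy as np

def keys_cubic(x, a=-0.5):
    """Piecewise cubic convolution kernel (Keys, a = -1/2): a finite-support,
    sinc-approximating kernel defined on |x| <= 1 and 1 < |x| < 2."""
    x = np.abs(x)
    out = np.zeros_like(x, dtype=float)
    near = x <= 1
    far = (x > 1) & (x < 2)
    out[near] = (a + 2) * x[near]**3 - (a + 3) * x[near]**2 + 1
    out[far] = a * (x[far]**3 - 5 * x[far]**2 + 8 * x[far] - 4)
    return out

def interpolate(samples, t):
    """Reconstruct a uniformly sampled 1-D signal at fractional position t
    by convolving the 4 nearest samples with the kernel."""
    i = int(np.floor(t))
    idx = np.arange(i - 1, i + 3)      # 4-sample support around t
    w = keys_cubic(t - idx)            # kernel weights
    return float(np.dot(samples[idx], w))
```

With a = -1/2 the kernel is third-order accurate and reproduces quadratic signals exactly, which is the sense in which cubic convolution improves significantly on linear interpolation.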

Journal ArticleDOI
TL;DR: It is demonstrated that a single, a priori selected spline model recovers a variety of patterns of changes in hazard ratio and fits better than other models, especially when the changes are non-monotonic, as in the case of cancer stages.
Abstract: The authors compare the performance of different regression models for censored survival data in modeling the impact of prognostic factors on all-cause mortality in colon cancer. The data were for 1,951 patients, who were diagnosed in 1977-1991, recorded by the Registry of Digestive Tumors of Cote d'Or, France, and followed for up to 15 years. Models include the Cox proportional hazards model and its three generalizations that allow the hazard ratio to change over time: 1) the piecewise model, where the hazard ratio is a step function; 2) the model with interaction between a predictor and a parametric function of time; and 3) the non-parametric regression spline model. Results illustrate the importance of accounting for non-proportionality of hazards, and some advantages of flexible non-parametric modeling of time-dependent effects. The authors provide empirical evidence for the dependence of the results of piecewise and parametric models on arbitrary a priori choices regarding the number of time intervals and the specific parametric function, which may lead to biased estimates and low statistical power. The authors demonstrate that a single, a priori selected spline model recovers a variety of patterns of changes in hazard ratio and fits better than other models, especially when the changes are non-monotonic, as in the case of cancer stages.
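The contrast between models 1) and 3) can be sketched as two representations of a time-varying log hazard ratio beta(t): a step function over a priori intervals versus a smooth spline. This is only an illustration of the two functional forms; the cut points, knots, and coefficients below are hypothetical, not fitted to the paper's data.

```python
import numpy as np

def beta_piecewise(t, cuts=(2.0, 5.0), values=(1.2, 0.8, 0.3)):
    """Piecewise model: log hazard ratio is a step function over the
    a priori intervals [0, 2), [2, 5), [5, inf) (hypothetical cuts)."""
    return np.asarray(values)[np.searchsorted(cuts, t)]

def beta_spline(t, knots=(2.0, 5.0), coefs=(1.2, -0.2, 0.05, 0.01)):
    """Regression spline model: a smooth quadratic truncated-power-basis
    spline, with no step discontinuities at the knots."""
    t = np.asarray(t, dtype=float)
    basis = [np.ones_like(t), t,
             *(np.clip(t - k, 0, None)**2 for k in knots)]
    return np.stack(basis, axis=-1) @ np.asarray(coefs)
```

The piecewise form jumps at each cut point and its fit depends on where those cuts are placed, while the spline varies continuously, which is the flexibility the authors credit for recovering non-monotonic hazard-ratio patterns.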

Journal ArticleDOI
TL;DR: In a chemical kinetics calculation, a solution-mapping procedure is applied to parametrize the solution of the initial-value ordinary differential equation system as a set of algebraic polynomial equations, achieving a factor of 10 increase in computational efficiency.
Abstract: In a chemical kinetics calculation, a solution-mapping procedure is applied to parametrize the solution of the initial-value ordinary differential equation system as a set of algebraic polynomial equations. To increase the accuracy, the parametrization is done piecewise, dividing the multidimensional chemical composition space into hypercubes and constructing polynomials for each hypercube. A differential equation solver is used to provide the solution at selected points throughout a hypercube, and from these solutions the polynomial coefficients are determined. Factorial design methods are used to reduce the required number of computed points. The polynomial coefficients for each hypercube are stored in a data structure for subsequent reuse, since over the duration of a flame simulation it is likely that a particular set of concentrations and temperature will occur repeatedly at different times and positions. The method is applied to H2–air combustion using an 8-species reaction set. After N2 is added as an inert species and enthalpy is considered, this results in a 10-dimensional chemical composition space. To add the capability of using a variable time-step, time-step is added as an additional dimension, making an 11-dimensional space. Reactive fluid dynamical simulations of a 1-D laminar premixed flame and a 2-D turbulent non-premixed jet are performed. The results are compared to identical control runs which use an ordinary differential equation solver to calculate the chemical kinetic rate equations. The resulting accuracy is very good, and a factor of 10 increase in computational efficiency is attained.
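The tabulation idea above can be sketched as: partition the composition space into hypercubes, fit a low-order polynomial to the expensive solution inside each cell on first visit, store the coefficients keyed by cell index, and reuse them on later visits. This is a minimal 2-D sketch with a cheap stand-in function in place of a real kinetics ODE solve; the cell size, linear (rather than higher-order) fit, and corner-point design are illustrative simplifications.

```python
import numpy as np

CELL = 0.25      # hypercube edge length (illustrative)
table = {}       # cell index -> fitted polynomial coefficients

def slow_solution(x):
    """Stand-in for advancing the chemistry ODEs from state x."""
    return np.sin(x[0]) * np.exp(-x[1])

def fit_cell(key):
    """Fit a linear polynomial c0 + c1*x0 + c2*x1 to the solution
    evaluated at the cell's corner points."""
    corners = np.array([[i, j] for i in (0, 1) for j in (0, 1)], float)
    pts = (np.array(key) + corners) * CELL
    A = np.column_stack([np.ones(4), pts])            # basis: 1, x0, x1
    y = np.array([slow_solution(p) for p in pts])
    return np.linalg.lstsq(A, y, rcond=None)[0]

def tabulated_solution(x):
    """Look up (or build and cache) the polynomial for x's hypercube."""
    key = tuple((np.asarray(x) // CELL).astype(int))
    if key not in table:                              # fit once, reuse later
        table[key] = fit_cell(key)
    c = table[key]
    return c[0] + c[1] * x[0] + c[2] * x[1]
```

As in the paper, the payoff comes from reuse: once a region of composition space has been visited, subsequent queries in the same hypercube cost a polynomial evaluation instead of an ODE solve.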