Christa Cuchiero

Researcher at University of Vienna

Publications - 65
Citations - 1292

Christa Cuchiero is an academic researcher at the University of Vienna. She has contributed to research on the topics Affine transformation and Yield curve. She has an h-index of 17 and has co-authored 55 publications receiving 1004 citations. Her previous affiliations include the University of Paris and the Vienna University of Economics and Business.

Papers
Journal ArticleDOI

Affine Processes on Positive Semidefinite Matrices

TL;DR: This paper provides the mathematical foundation for stochastically continuous affine processes on the cone of positive semidefinite symmetric matrices, motivated by a large and growing range of applications in finance, including multi-asset option pricing with stochastic volatility and correlation structures.
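
The canonical example of such a matrix-valued affine process is the Wishart process. As a rough illustration only (a naive Euler-Maruyama discretization with hypothetical parameters, not the paper's construction), the sketch below simulates one path endpoint, projecting each step back onto the positive semidefinite cone to counter discretization error; the choice beta >= d + 1 is a sufficient condition for the continuous-time process to remain in the cone.

```python
import numpy as np

def simulate_wishart(x0, M, Q, beta, T, n_steps, rng):
    """Euler-Maruyama sketch of a Wishart process on the PSD cone:
    dX = (beta*Q.T@Q + M@X + X@M.T) dt + sqrt(X) dW Q + Q.T dW.T sqrt(X).
    Each step is projected back onto the PSD cone, since the naive
    scheme can leave it even when the continuous process cannot."""
    d = x0.shape[0]
    dt = T / n_steps
    X = x0.copy()
    for _ in range(n_steps):
        dW = rng.standard_normal((d, d)) * np.sqrt(dt)
        # matrix square root via eigendecomposition (X is symmetric PSD)
        w, V = np.linalg.eigh(X)
        sqrtX = V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T
        drift = beta * Q.T @ Q + M @ X + X @ M.T
        noise = sqrtX @ dW @ Q
        X = X + drift * dt + noise + noise.T
        # project back onto the PSD cone by clipping negative eigenvalues
        w, V = np.linalg.eigh((X + X.T) / 2)
        X = V @ np.diag(np.clip(w, 0.0, None)) @ V.T
    return X

rng = np.random.default_rng(0)
d = 2
x0 = np.eye(d)
M = -0.5 * np.eye(d)   # mean reversion (hypothetical choice)
Q = 0.3 * np.eye(d)    # volatility of volatility (hypothetical choice)
beta = d + 1           # sufficient for cone preservation in continuous time
print(simulate_wishart(x0, M, Q, beta, T=1.0, n_steps=1000, rng=rng))
```

The eigenvalue clipping is a crude fix; exact simulation schemes for Wishart processes exist but are beyond the scope of a sketch like this.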
Journal ArticleDOI

Polynomial processes and their applications to mathematical Finance

TL;DR: A class of Markov processes, called m-polynomial, is introduced, for which the calculation of (mixed) moments up to order m requires only the computation of matrix exponentials. This class contains affine processes, processes with quadratic diffusion coefficients, and Lévy-driven SDEs with affine vector fields.
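
As a concrete illustration of this moment formula, the sketch below (with hypothetical parameters) uses the CIR process, which is affine and hence polynomial. Restricted to polynomials of degree at most 2, its generator acts as a 3x3 matrix G on coefficient vectors, and E[p(X_t) | X_0 = x] = (e^{tG} c)(x) for a polynomial p with coefficients c; the result is checked against the well-known closed-form first moment of CIR.

```python
import numpy as np
from scipy.linalg import expm

# CIR process dX = kappa*(theta - X) dt + sigma*sqrt(X) dW is polynomial:
# its generator maps polynomials of degree <= m to polynomials of degree <= m.
kappa, theta, sigma = 1.5, 0.04, 0.3   # hypothetical parameters
x0, t = 0.06, 2.0

# Generator restricted to span{1, x, x^2}; column j holds the
# coefficients of (A x^j) in the basis (1, x, x^2).
G = np.array([
    [0.0, kappa * theta, 0.0],
    [0.0, -kappa,        2 * kappa * theta + sigma**2],
    [0.0, 0.0,           -2 * kappa],
])

# Moment formula: E[p(X_t) | X_0 = x] = (e^{tG} c)(x).
c = np.array([0.0, 1.0, 0.0])            # p(x) = x, i.e. the first moment
coeffs = expm(t * G) @ c
first_moment = coeffs @ np.array([1.0, x0, x0**2])

closed_form = theta + (x0 - theta) * np.exp(-kappa * t)
print(first_moment, closed_form)          # the two values agree
```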
Journal ArticleDOI

Deep Neural Networks, Generic Universal Interpolation, and Controlled ODEs

TL;DR: It is shown that universal interpolation holds for certain deep neural networks even if a large number of parameters are left untrained and instead chosen randomly. This lends theoretical support to the observation that training with random initialization can succeed even when most parameters remain largely unchanged throughout training.
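
The flavor of this result already shows up in a simple random-features setting (a much weaker statement than the paper's controlled-ODE framework): a one-hidden-layer network whose hidden weights are random and never trained can generically interpolate n data points exactly once the hidden width is at least n, with only the linear readout fitted. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

# Data to interpolate: n points in 1D
n = 10
x = np.linspace(-1.0, 1.0, n).reshape(-1, 1)
y = np.sin(3 * x).ravel()

# One hidden layer of width >= n; hidden weights and biases are random
# and never trained -- only the linear readout layer is fit.
width = 50
W = rng.standard_normal((1, width))
b = rng.standard_normal(width)
H = np.tanh(x @ W + b)                  # n x width feature matrix

# Least-squares readout; generically H has full row rank, so the
# solution interpolates the data exactly.
a, *_ = np.linalg.lstsq(H, y, rcond=None)

print(np.max(np.abs(H @ a - y)))        # ~ 0: exact interpolation
```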