scispace - formally typeset

Diyora Salimova

Researcher at ETH Zurich

Publications: 19
Citations: 366

Diyora Salimova is an academic researcher from ETH Zurich. The author has contributed to research on topics including the curse of dimensionality and stochastic partial differential equations. The author has an h-index of 10 and has co-authored 17 publications receiving 276 citations.

Papers
Journal ArticleDOI

A proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients

TL;DR: The paper proves that deep neural networks (DNNs) overcome the curse of dimensionality in the numerical approximation of Kolmogorov PDEs with constant diffusion and nonlinear drift coefficients.
Posted Content

Deep neural network approximations for Monte Carlo algorithms

TL;DR: The main result of this paper shows that if a function can be approximated by a suitable discrete approximation scheme without the curse of dimensionality, and if there exist DNNs that satisfy certain regularity properties and approximate this discrete approximation scheme, then the function itself can also be approximated by DNNs without the curse of dimensionality.
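As a toy illustration of a discrete approximation scheme whose error does not degrade with dimension, consider a plain Monte Carlo mean. This is a hypothetical example, not code from the paper: the statistical error decays like m**-0.5 in every dimension d.

```python
import numpy as np

def mc_estimate(f, d, m, seed=0):
    """Monte Carlo estimate of E[f(X)] for X ~ N(0, I_d).

    The statistical error decays like m**-0.5 regardless of the
    dimension d -- the sense in which Monte Carlo schemes avoid
    the curse of dimensionality.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((m, d))  # m independent samples in R^d
    return float(np.mean(f(x)))

# E[|X|^2 / d] = 1 in every dimension d, so the estimate stays
# accurate as d grows while the sample count m is held fixed.
for d in (1, 10, 1000):
    est = mc_estimate(lambda x: np.sum(x**2, axis=1) / x.shape[1], d, m=20000)
    print(d, est)
```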
Journal ArticleDOI

Strong convergence of full-discrete nonlinearity-truncated accelerated exponential Euler-type approximations for stochastic Kuramoto-Sivashinsky equations

TL;DR: This article presents a new explicit, easily implementable, fully discrete accelerated exponential Euler-type approximation scheme for stochastic partial differential equations (SPDEs) driven by additive space-time white noise with possibly non-globally monotone nonlinearities, such as stochastic Kuramoto-Sivashinsky equations.
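A minimal sketch of what a nonlinearity-truncated exponential Euler step can look like for a Fourier spectral discretization of a stochastic Kuramoto-Sivashinsky equation. This is an illustrative reconstruction with assumed domain, truncation threshold, and noise scaling, not the authors' exact scheme:

```python
import numpy as np

def ks_exp_euler_step(u, h, rng, trunc=100.0):
    """One nonlinearity-truncated exponential Euler step for
    u_t = -u_xxxx - u_xx - u*u_x + additive noise on [0, 2*pi],
    discretized by a Fourier spectral Galerkin method.
    Illustrative sketch only; parameter choices are assumptions.
    """
    n = u.size
    k = np.fft.fftfreq(n, d=1.0 / n)   # integer wavenumbers
    lam = k**2 - k**4                  # symbol of -d^4/dx^4 - d^2/dx^2
    lam_safe = np.where(lam == 0.0, 1.0, lam)
    u_hat = np.fft.fft(u)
    # Nonlinearity -u*u_x = -(1/2) d/dx (u^2), set to zero ("truncated")
    # whenever the solution norm exceeds the threshold.
    if np.sqrt(np.mean(u**2)) <= trunc:
        n_hat = -0.5j * k * np.fft.fft(u**2)
    else:
        n_hat = np.zeros_like(u_hat)
    # phi1(lam*h) = (exp(lam*h) - 1) / lam, with limit h as lam -> 0
    phi1 = np.where(lam == 0.0, h, np.expm1(lam * h) / lam_safe)
    # Per-mode variance of the exact stochastic convolution (schematic scaling).
    var = np.where(lam == 0.0, h, np.expm1(2.0 * lam * h) / (2.0 * lam_safe))
    noise_hat = np.fft.fft(rng.standard_normal(n)) * np.sqrt(var)
    u_hat_new = np.exp(lam * h) * u_hat + phi1 * n_hat + noise_hat
    return np.real(np.fft.ifft(u_hat_new))

# A few steps from a smooth initial condition.
x = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
u = np.cos(x)
rng = np.random.default_rng(0)
for _ in range(10):
    u = ks_exp_euler_step(u, h=1e-3, rng=rng)
```

Treating the fourth- and second-order linear terms exactly via the semigroup `exp(lam*h)` is what makes such schemes explicit yet stable, and the norm-based truncation is how non-globally monotone nonlinearities are handled.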
Journal ArticleDOI

A proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients

TL;DR: In this article, it was shown that the number of parameters used to describe the employed DNN grows at most polynomially in both the PDE dimension and the reciprocal of the prescribed approximation accuracy.
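The polynomial-growth claim can be made concrete with a simple parameter count. The widths below are hypothetical, chosen only for illustration: a fully connected network whose hidden widths scale linearly in the input dimension d has O(d^2) parameters.

```python
def dnn_param_count(widths):
    """Number of parameters (weights plus biases) in a fully
    connected network with the given list of layer widths."""
    return sum((w_in + 1) * w_out for w_in, w_out in zip(widths, widths[1:]))

# Hidden widths scaling linearly in the PDE dimension d (assumed
# architecture): the total parameter count grows like d**2, i.e.
# polynomially rather than exponentially in d.
for d in (10, 100, 1000):
    print(d, dnn_param_count([d, 2 * d, 2 * d, 1]))
```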