
Dennis Elbrächter

Researcher at University of Vienna

Publications: 14
Citations: 417

Dennis Elbrächter is an academic researcher at the University of Vienna. The author has contributed to research on the topics of artificial neural networks and activation functions. The author has an h-index of 8 and has co-authored 13 publications receiving 263 citations.

Papers
Posted Content

Deep Neural Network Approximation Theory

TL;DR: Deep networks provide exponential approximation accuracy, i.e., the approximation error decays exponentially in the number of nonzero weights in the network, for the multiplication operation, polynomials, sinusoidal functions, and certain smooth functions.
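The construction behind results of this kind is by now classical: the square function is approximated by telescoping compositions of a ReLU "hat" function, and multiplication then follows from the polarization identity. A minimal NumPy sketch of that idea, assuming inputs in [0, 1]; the function names and the depth parameter m are illustrative, not the paper's exact network:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Tent map on [0, 1], realizable by a single ReLU layer:
    # g(x) = 2*relu(x) - 4*relu(x - 1/2) + 2*relu(x - 1).
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1)

def square_approx(x, m):
    # Yarotsky-style approximation of x^2 on [0, 1]:
    # x^2 ~ x - sum_{k=1}^m g^(k)(x) / 4^k, with error <= 4^-(m+1),
    # where g^(k) is the k-fold composition of the hat function.
    s = np.asarray(x, dtype=float).copy()
    g = np.asarray(x, dtype=float).copy()
    for k in range(1, m + 1):
        g = hat(g)
        s = s - g / 4.0**k
    return s

def mult_approx(x, y, m):
    # Polarization: xy = 2*((x+y)/2)^2 - x^2/2 - y^2/2,
    # assuming x, y in [0, 1] so that (x + y)/2 stays in [0, 1].
    return (2 * square_approx((x + y) / 2, m)
            - square_approx(x, m) / 2
            - square_approx(y, m) / 2)

x, y = 0.3, 0.7
for m in (2, 4, 8):
    err = abs(mult_approx(x, y, m) - x * y)
    print(f"m = {m}: |error| = {err:.2e}")  # error bound decays like 4^-m
```

Each extra level of composition adds a fixed number of weights but cuts the error bound by a factor of 4, which is the exponential decay in network size that the TL;DR refers to.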
Journal Article

DNN Expression Rate Analysis of High-dimensional PDEs: Application to Option Pricing

TL;DR: It is proved that the solution to the d-variate option pricing problem can be approximated up to an ε-error by a deep ReLU network, and the techniques developed in the constructive proof are of independent interest in the analysis of the expressive power of deep neural networks for solution manifolds of PDEs in high dimension.
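Expression rate results of this type are typically stated as network size bounds that grow only polynomially in the dimension and the reciprocal accuracy, which is what breaks the curse of dimensionality. A schematic LaTeX rendering of such a bound, with generic placeholders (u_d for the d-variate price, Φ_{ε,d} for the approximating network, C, p, q for constants), not the paper's precise theorem:

```latex
% Schematic curse-of-dimensionality-free expression rate (placeholders,
% not the paper's exact statement): u_d is the d-variate option price,
% \Phi_{\varepsilon,d} a deep ReLU network, C, p, q > 0 constants.
\exists\, C, p, q > 0 \;\; \forall d \in \mathbb{N},\ \varepsilon \in (0,1):
\qquad
\bigl\| u_d - \Phi_{\varepsilon,d} \bigr\|_{L^\infty([a,b]^d)} \le \varepsilon,
\qquad
\operatorname{size}\bigl(\Phi_{\varepsilon,d}\bigr) \le C\, d^{p}\, \varepsilon^{-q}.
```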
Journal Article

Deep Neural Network Approximation Theory

TL;DR: In this paper, it was shown that deep networks are Kolmogorov-optimal approximants for unit balls in Besov spaces and modulation spaces, and that for sufficiently smooth functions finite-width deep networks require strictly smaller connectivity than finite-depth wide networks.
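Kolmogorov optimality here means that networks realize the best possible trade-off between accuracy and number of nonzero weights for the whole function class. A schematic statement of that notion, with M(f, ε) and γ*(𝒞) as generic symbols rather than the paper's exact definitions:

```latex
% Schematic notion of Kolmogorov optimality for a function class \mathcal{C}:
% M(f,\varepsilon) = minimal number of nonzero weights of a network
% approximating f to accuracy \varepsilon; \gamma^*(\mathcal{C}) = best
% exponent achievable by any encoder of the class (generic placeholders).
\sup_{f \in \mathcal{C}} M(f, \varepsilon)
  = O\bigl(\varepsilon^{-1/\gamma^{*}(\mathcal{C})}\bigr)
  \quad \text{as } \varepsilon \to 0 .
```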
Posted Content

Group testing for SARS-CoV-2 allows for up to 10-fold efficiency increase across realistic scenarios and testing strategies

TL;DR: There are significant efficiency gaps between different group testing strategies in realistic SARS-CoV-2 testing scenarios, highlighting the need for an informed choice of pooling protocol based on estimated prevalence, target specificity, and whether a high- or low-risk population is being tested.
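The simplest baseline that already exhibits large gains is two-stage Dorfman pooling: test pools of k samples at once, then retest individuals only in positive pools. A minimal sketch, assuming a perfectly sensitive and specific test and independent infections (the paper compares more elaborate strategies and accounts for imperfect tests):

```python
import numpy as np

def dorfman_tests_per_person(p, k):
    # Expected tests per person under two-stage Dorfman pooling:
    # 1/k pooled tests per person, plus k individual retests whenever
    # the pool is positive, which happens with probability 1 - (1 - p)^k.
    return 1.0 / k + 1.0 - (1.0 - p) ** k

for p in (0.001, 0.01, 0.05):
    ks = np.arange(2, 101)
    costs = dorfman_tests_per_person(p, ks)
    k_best = ks[np.argmin(costs)]
    gain = 1.0 / costs.min()
    print(f"prevalence {p:>5.1%}: optimal pool size {k_best:>2d}, "
          f"~{gain:.1f}x fewer tests than individual testing")
```

Even this baseline shows why the optimal pool size and the achievable savings depend so strongly on estimated prevalence: the gain shrinks rapidly as prevalence rises.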
Posted Content

The universal approximation power of finite-width deep ReLU networks

TL;DR: Finite-width deep ReLU neural networks yield rate-distortion optimal approximation of polynomials, windowed sinusoidal functions, one-dimensional oscillatory textures, and the Weierstrass function, a fractal function which is continuous but nowhere differentiable.
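To make the last approximation target concrete: the Weierstrass function is the lacunary cosine series W(x) = Σ_k a^k cos(b^k π x). A small NumPy snippet evaluating a truncated series, using the classical parameter conditions (0 < a < 1, b an odd integer, ab > 1 + 3π/2) rather than whatever parameter range the paper considers:

```python
import numpy as np

def weierstrass(x, a=0.5, b=13, n_terms=30):
    """Partial sum of W(x) = sum_k a^k cos(b^k * pi * x).

    The limit is continuous everywhere but nowhere differentiable when
    0 < a < 1, b is an odd integer, and a*b > 1 + 3*pi/2 (Weierstrass's
    original conditions; a=0.5, b=13 satisfies them).
    """
    x = np.atleast_1d(np.asarray(x, dtype=float))
    k = np.arange(n_terms, dtype=float)  # float to avoid integer overflow in b**k
    # Broadcast: rows index the series term k, columns index the points x.
    terms = a ** k[:, None] * np.cos(b ** k[:, None] * np.pi * x[None, :])
    return terms.sum(axis=0)

print(weierstrass(np.linspace(0.0, 1.0, 5)))
```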