
Neural network with unbounded activation functions is universal approximator

Sho Sonoda, +1 more
01 Sep 2017, Vol. 43, Iss. 2, pp. 233-268
TLDR
In this paper, the authors investigated the universal approximation property of neural networks with unbounded activation functions, such as the rectified linear unit (ReLU), and showed that the ReLU network can be analyzed by the ridgelet transform with respect to Lizorkin distributions.
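The universal approximation property summarized above can be illustrated numerically: a one-hidden-layer ReLU network with randomly drawn hidden weights, fit by least squares at the output layer, already approximates a smooth target closely. This is a minimal sketch only; the width, weight ranges, and target function are illustrative assumptions, not the paper's ridgelet-based construction.

```python
import numpy as np

# Minimal sketch: approximate sin(pi*x) on [-1, 1] with a one-hidden-layer
# ReLU network. Hidden weights are random; output weights are fit by least
# squares. Width and weight ranges are illustrative assumptions.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
target = np.sin(np.pi * x)

a = rng.uniform(-5.0, 5.0, 100)          # hidden slopes
b = rng.uniform(-5.0, 5.0, 100)          # hidden biases
H = np.maximum(0.0, np.outer(x, a) + b)  # ReLU features, shape (200, 100)

w, *_ = np.linalg.lstsq(H, target, rcond=None)  # output-layer weights
max_err = np.max(np.abs(H @ w - target))        # worst-case error on the grid
```

With 100 random features the piecewise-linear fit tracks the sine curve to within a few hundredths on this grid, consistent with the universality result the TLDR describes.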
About
This article was published in Applied and Computational Harmonic Analysis on 2017-09-01 and is currently open access. It has received 214 citations to date. The article focuses on the topics: Radon transform and Parseval's theorem.


Citations
Journal ArticleDOI

An overview of deep learning in medical imaging focusing on MRI

TL;DR: In this article, the authors give a short overview of recent advances, and some associated challenges, in machine learning applied to medical image processing and image analysis, and provide a starting point for people interested in experimenting with, and perhaps contributing to, machine learning for medical imaging.
Journal ArticleDOI

Deep learning classifiers for hyperspectral imaging: A review

TL;DR: A comprehensive review of the current state of the art in deep learning (DL) for hyperspectral image (HSI) classification is provided, analyzing the strengths and weaknesses of the most widely used classifiers in the literature and giving an exhaustive comparison of the discussed techniques.
Posted Content

On the Spectral Bias of Neural Networks

TL;DR: This work shows that deep ReLU networks are biased towards low-frequency functions and studies the robustness of the frequency components under parameter perturbation, developing the intuition that parameters must be finely tuned to express high-frequency functions.
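The low-frequency bias has a simple spectral counterpart: a single ReLU feature is piecewise linear, so its discrete spectrum is dominated by low frequencies, and many units must cancel precisely to build high-frequency content. The sketch below checks this for one feature; the particular slope, bias, and grid are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: the power spectrum of one ReLU feature max(0, 3x - 0.5)
# on a uniform grid concentrates in the lowest frequency bins.
x = np.linspace(-1.0, 1.0, 1024, endpoint=False)
unit = np.maximum(0.0, 3.0 * x - 0.5)

spec = np.abs(np.fft.rfft(unit)) ** 2        # power per frequency bin
low_fraction = spec[:16].sum() / spec.sum()  # share of power in lowest 16 bins
```

The lowest handful of bins carries the overwhelming majority of the power, matching the intuition that high frequencies are hard for individual ReLU units to express.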
Journal ArticleDOI

A gentle introduction to deep learning in medical image processing

TL;DR: A gentle introduction to deep learning in medical image processing is given, proceeding from theoretical foundations to applications and covering general reasons for the popularity of deep learning, including several major breakthroughs in computer science.
References
Proceedings Article

ImageNet Classification with Deep Convolutional Neural Networks

TL;DR: A deep convolutional neural network consisting of five convolutional layers, some of which are followed by max-pooling layers, and three fully-connected layers with a final 1000-way softmax achieved state-of-the-art image classification performance, as discussed by the authors.
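The spatial sizes in such a stacked conv/pool architecture follow the standard arithmetic out = (in - k + 2p) // s + 1. The sketch below applies it with commonly cited AlexNet-style first-stage numbers, used here only as an illustrative assumption.

```python
def conv_out(size, kernel, stride, pad=0):
    """Output spatial size of a convolution (or pooling) layer."""
    return (size - kernel + 2 * pad) // stride + 1

# Commonly cited AlexNet-style first stages (illustrative assumption):
c1 = conv_out(227, kernel=11, stride=4)  # 11x11 conv, stride 4 -> 55
p1 = conv_out(c1, kernel=3, stride=2)    # 3x3 max-pool, stride 2 -> 27
```

The same helper chains through any such stack, which is why layer-by-layer size tables for these networks can be reproduced from the kernel, stride, and padding alone.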
Book

Functional analysis

Walter Rudin
Book

Singular Integrals and Differentiability Properties of Functions.

TL;DR: Stein's seminal monograph is considered among the most influential mathematics texts of the last thirty-five years and is widely used as a reference for singular integrals and the differentiability properties of functions.
Proceedings Article

Deep Sparse Rectifier Neural Networks

TL;DR: This paper shows that rectifying neurons are an even better model of biological neurons and yield equal or better performance than hyperbolic tangent networks, in spite of the hard non-linearity and non-differentiability at zero.
Book

Functional Analysis, Sobolev Spaces and Partial Differential Equations

Haim Brezis
TL;DR: In this article, the theory of conjugate convex functions is introduced, and the Hahn-Banach theorem and the closed graph theorem are discussed, along with the variational formulation of boundary value problems in one dimension.