Journal ArticleDOI

Approximation by neural networks is not continuous

TLDR
In a Banach space X satisfying mild conditions, for any infinite, linearly independent subset G there is no continuous best approximation map from X to the n-span, span_n G.
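The discontinuity phenomenon can be seen already in a toy finite-dimensional sketch (not taken from the paper): with a dictionary G = {g1, g2} in R^2, the set span_1 G is the union of two lines, and the metric projection onto that union jumps as the input crosses the set of points equidistant from both lines. All names below are illustrative.

```python
import numpy as np

# Hypothetical dictionary elements; span_1 G is the union of the
# two lines R*g1 and R*g2 (a non-convex set).
g1 = np.array([1.0, 0.0])
g2 = np.array([0.0, 1.0])

def best_1term(x):
    """Best approximation of x from span{g1} ∪ span{g2} (closest point)."""
    p1 = (x @ g1) * g1  # orthogonal projection onto the line R*g1
    p2 = (x @ g2) * g2  # orthogonal projection onto the line R*g2
    return p1 if np.linalg.norm(x - p1) <= np.linalg.norm(x - p2) else p2

# Two nearby inputs on opposite sides of the diagonal receive
# far-apart best approximations, so x -> best_1term(x) cannot be
# continuous.
a = best_1term(np.array([1.0, 0.999]))  # lands on the g1 axis
b = best_1term(np.array([0.999, 1.0]))  # lands on the g2 axis
```

Here `a` is approximately (1, 0) while `b` is approximately (0, 1), even though the inputs are close; this mirrors, in the simplest setting, why best approximation from variable-basis (n-term) sets such as neural networks admits no continuous selection.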
About
This article was published in Neurocomputing on 1999-11-01 and has received 44 citations to date. It focuses on the topics: Subspace topology & Banach space.


Citations
Journal ArticleDOI

Approximation theory of the MLP model in neural networks

TL;DR: This survey discusses various approximation-theoretic problems that arise in the multilayer feedforward perceptron (MLP) model in neural networks.
Journal ArticleDOI

Error bounds for approximations with deep ReLU networks

TL;DR: It is proved that deep ReLU networks approximate smooth functions more efficiently than shallow networks, and adaptive depth-6 network architectures that are more efficient than the standard shallow architecture are described.
Posted Content

Error bounds for approximations with deep ReLU networks

TL;DR: In this paper, the expressive power of shallow and deep neural networks with piecewise linear activation functions was studied and upper and lower bounds for the network complexity in the setting of approximations in Sobolev spaces were established.
MonographDOI

Recurrent Neural Networks for Prediction

TL;DR: Within this text neural networks are considered as massively interconnected nonlinear adaptive filters.
Journal ArticleDOI

Comparison of worst case errors in linear and neural network approximation

TL;DR: In the context of nonlinear approximation by fixed versus variable basis functions, a theoretical framework is developed for describing sets of multivariable functions whose worst-case errors in linear approximation are larger than those in approximation by neural networks.
References
Book

Approximation of Functions

TL;DR: Studies the approximation of functions of one variable by polynomials of best approximation and by linear operators, including how the degree of approximation depends on the differentiability of the function.
Book

n-Widths in Approximation Theory

Allan Pinkus
TL;DR: Studies n-widths in approximation theory, including Tchebycheff systems and total positivity, and the existence of optimal subspaces for the Kolmogorov n-width d_n.
Book

Best Approximation in Normed Linear Spaces by Elements of Linear Subspaces

Ivan Singer
TL;DR: Studies best approximation of elements of a normed linear space by elements of finite-dimensional linear subspaces.