Journal ArticleDOI

Approximation, dimension reduction, and nonconvex optimization using linear superpositions of Gaussians

A. Saha, +2 more
01 Oct 1993
Vol. 42, Iss. 10, pp. 1222-1233
TLDR
The authors derive some key properties of RBF networks that provide suitable grounds for implementing efficient search strategies for nonconvex optimization within the same framework.
Abstract
This paper concerns neural network approaches to function approximation and optimization using linear superpositions of Gaussians (popularly known as radial basis function (RBF) networks). The problem of function approximation is one of estimating an underlying function f given samples {(y_i, x_i); i = 1, 2, ..., n}, with y_i = f(x_i). When the dimension of the input is high and the number of samples is small, estimating the function becomes difficult owing to the sparsity of samples in local regions. The authors find that this problem of high dimensionality can be overcome to some extent by using linear transformations of the input in the Gaussian kernels. Such transformations induce intrinsic dimension reduction and can be exploited for identifying key factors of the input and for the phase-space reconstruction of dynamical systems, without explicitly computing the dimension and delay. The authors present a generalization that uses multiple linear projections onto scalars and successive RBF networks (MLPRBF) to estimate the function from these scalar values. They derive some key properties of RBF networks that provide suitable grounds for implementing efficient search strategies for nonconvex optimization within the same framework.
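To make the construction concrete, here is a minimal NumPy sketch of an approximator of the form f_hat(x) = sum_k w_k exp(-||Ax - c_k||^2 / (2 s_k^2)), where the linear map A inside the Gaussian kernels plays the dimension-reducing role the abstract describes. The function names, the fixed-projection assumption, and the least-squares fit of the output weights are illustrative choices, not the paper's training procedure; the closed-form gradient at the end hints at why such surrogates lend themselves to gradient-based search in nonconvex optimization.

```python
import numpy as np

def design_matrix(X, A, centers, widths):
    """Gaussian activations on linearly projected inputs:
    Phi[i, k] = exp(-||A x_i - c_k||^2 / (2 s_k^2))."""
    Z = X @ A.T                                    # (n, m) projected inputs
    d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths ** 2))       # (n, K) activations

def fit_weights(X, y, A, centers, widths):
    """Least-squares fit of the output weights w, with A, c_k, s_k held
    fixed (one simple option; the paper's actual algorithm may differ)."""
    Phi = design_matrix(X, A, centers, widths)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict(X, A, centers, widths, w):
    return design_matrix(X, A, centers, widths) @ w

def gradient(x, A, centers, widths, w):
    """Analytic gradient of the fitted surrogate at a single point x;
    having this in closed form is what makes gradient-based search on
    the RBF surface cheap."""
    z = A @ x
    diff = z[None, :] - centers                    # (K, m)
    phi = np.exp(-(diff ** 2).sum(axis=1) / (2.0 * widths ** 2))
    coeff = -w * phi / widths ** 2                 # (K,)
    return A.T @ (coeff @ diff)                    # chain rule through z = A x
```

As a toy usage pattern: with inputs X of shape (n, d), a projection A of shape (m, d) with m < d, and K centers in the projected space, one could fit w once and then follow -gradient(x, ...) from several random starts to search the surrogate for minima. This is a sketch under those assumptions, not the authors' procedure.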


Citations
Journal ArticleDOI

Survey of modeling and optimization strategies to solve high-dimensional design problems with computationally-expensive black-box functions

TL;DR: A survey of modeling and optimization strategies that may help solve High-dimensional, Expensive (computationally), Black-box (HEB) problems; two promising approaches for HEB problems are identified.
Dissertation

Constructive algorithms for structure learning in feedforward neural networks

Tin Yau Kwok
TL;DR: This survey paper first describes the general issues in constructive algorithms, with special emphasis on the search strategy, and then presents a taxonomy based on differences in the state-transition mapping, the training algorithm, and the network architecture.
Journal ArticleDOI

Constructive algorithms for structure learning in feedforward neural networks for regression problems

TL;DR: This paper surveys constructive algorithms for structure learning in feedforward neural networks for regression problems, formulating the whole problem as a state-space search with special emphasis on the search strategy.
Journal ArticleDOI

Objective functions for training new hidden units in constructive neural networks

TL;DR: The aim is to derive a class of objective functions that, together with the corresponding weight updates, can be computed in O(N) time, where N is the number of training patterns.
Journal ArticleDOI

Robust radial basis function neural networks

TL;DR: Compared with traditional RBF networks, the proposed network demonstrates the following advantages: (1) better approximation of the underlying function; (2) faster learning; (3) smaller network size; (4) higher robustness to outliers.
References
Journal ArticleDOI

Multilayer feedforward networks are universal approximators

TL;DR: It is rigorously established that standard multilayer feedforward networks with as few as one hidden layer using arbitrary squashing functions are capable of approximating any Borel measurable function from one finite-dimensional space to another to any desired degree of accuracy, provided sufficiently many hidden units are available.
Book ChapterDOI

Learning internal representations by error propagation

TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Journal ArticleDOI

Approximation by superpositions of a sigmoidal function

TL;DR: It is demonstrated that finite linear combinations of compositions of a fixed, univariate function and a set of affine functionals can uniformly approximate any continuous function of n real variables with support in the unit hypercube.
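In symbols, the approximating family in this result consists of finite sums of the form below, following the standard statement of Cybenko's theorem: sigma is the fixed univariate (sigmoidal) function and the affine functionals are the maps x -> y_j^T x + theta_j.

```latex
G(x) = \sum_{j=1}^{N} \alpha_j \, \sigma\!\left(y_j^{\top} x + \theta_j\right),
\qquad x \in [0,1]^{n}
```

Such sums are dense in C([0,1]^n): for any continuous f on the unit hypercube and any epsilon > 0, there exists a G of this form with |G(x) - f(x)| < epsilon for all x in the cube.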