
Hierarchical RBF

About: Hierarchical RBF is a research topic. Over its lifetime, 599 publications have been published within this topic, receiving 21,653 citations.


Papers

Open access Journal Article
28 Mar 1988-Complex Systems
Abstract: The relationship between 'learning' in adaptive layered networks and the fitting of data with high-dimensional surfaces is discussed. This leads naturally to a picture of 'generalization' in terms of interpolation between known data points and suggests a rational approach to the theory of such networks. A class of adaptive networks is identified which makes the interpolation scheme explicit. This class has the property that learning is equivalent to the solution of a set of linear equations. These networks thus represent nonlinear relationships while having a guaranteed learning rule.


Topics: Interpolation (65%), Linear interpolation (62%), Bilinear interpolation (61%)

3,464 Citations
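The central point of the abstract above can be made concrete with a short sketch (my construction, not code from the paper): for a radial-basis-function interpolant, "learning" reduces to solving one set of linear equations, Phi @ w = y. The Gaussian kernel and the width value here are illustrative assumptions.

```python
import numpy as np

def rbf_interpolate(x_train, y_train, width):
    """Solve for Gaussian-RBF weights so the interpolant hits every data point."""
    d2 = (x_train[:, None] - x_train[None, :]) ** 2   # pairwise squared distances
    phi = np.exp(-d2 / (2.0 * width ** 2))            # interpolation matrix
    return np.linalg.solve(phi, y_train)              # learning = linear solve

def rbf_predict(x_new, x_train, w, width):
    d2 = (x_new[:, None] - x_train[None, :]) ** 2
    return np.exp(-d2 / (2.0 * width ** 2)) @ w

x = np.linspace(0.0, 1.0, 8)
y = np.sin(2 * np.pi * x)
w = rbf_interpolate(x, y, width=0.3)

# The fitted network passes exactly through the known data points;
# values elsewhere are interpolated between them.
print(np.allclose(rbf_predict(x, x, w, width=0.3), y))
```

Because the design matrix is square and (for distinct points) invertible, the learning rule is guaranteed: no iterative training is needed.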


Journal ArticleDOI: 10.1162/NECO.1991.3.2.246
Jooyoung Park, Irwin W. Sandberg (1 institution)
01 Jun 1991-Neural Computation
Abstract: There have been several recent studies concerning feedforward networks and the problem of approximating arbitrary functionals of a finite number of real variables. Some of these studies deal with cases in which the hidden-layer nonlinearity is not a sigmoid. This was motivated by successful applications of feedforward networks with nonsigmoidal hidden-layer units. This paper reports on a related study of radial-basis-function (RBF) networks, and it is proved that RBF networks having one hidden layer are capable of universal approximation. Here the emphasis is on the case of typical RBF networks, and the results show that a certain class of RBF networks with the same smoothing factor in each kernel node is broad enough for universal approximation.


Topics: Hierarchical RBF (62%), Activation function (58%), Kernel (statistics) (51%)

3,344 Citations
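A minimal sketch of the "typical" RBF network the abstract describes: one hidden layer of Gaussian kernel nodes that all share the same smoothing factor. The uniform grid of centers and the least-squares output weights are my assumptions, made to keep the example short; the point is that even a non-smooth target is approximated well.

```python
import numpy as np

def design(x, centers, sigma):
    """Hidden-layer outputs: one Gaussian kernel node per center, shared sigma."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * sigma ** 2))

x = np.linspace(-1.0, 1.0, 200)
target = np.abs(x)                      # a non-smooth target function
centers = np.linspace(-1.0, 1.0, 25)    # assumed: centers on a uniform grid
sigma = 0.1                             # the same smoothing factor in each node

# Output weights by least squares.
w, *_ = np.linalg.lstsq(design(x, centers, sigma), target, rcond=None)
err = np.max(np.abs(design(x, centers, sigma) @ w - target))
print(err)  # the uniform error is already small with 25 shared-width nodes
```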


Journal ArticleDOI: 10.1109/72.80341
Sheng Chen, Colin F. N. Cowan, Peter Grant (1 institution)
Abstract: The radial basis function network offers a viable alternative to the two-layer neural network in many applications of signal processing. A common learning algorithm for radial basis function networks is based on first choosing randomly some data points as radial basis function centers and then using singular-value decomposition to solve for the weights of the network. Such a procedure has several drawbacks, and, in particular, an arbitrary selection of centers is clearly unsatisfactory. The authors propose an alternative learning procedure based on the orthogonal least-squares method. The procedure chooses radial basis function centers one by one in a rational way until an adequate network has been constructed. In the algorithm, each selected center maximizes the increment to the explained variance or energy of the desired output and does not suffer numerical ill-conditioning problems. The orthogonal least-squares learning strategy provides a simple and efficient means for fitting radial basis function networks. This is illustrated using examples taken from two different signal processing applications.


Topics: Radial basis function network (70%), Basis function (62%), Radial basis function (57%)

3,300 Citations
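A simplified sketch of the greedy center-selection idea described above. The paper scores candidates through an orthogonal (OLS) decomposition; here, as an assumption made for clarity, each candidate is scored by a full least-squares refit instead. This is slower but applies the same selection criterion: at each step, keep the center giving the largest increment to the explained variance of the desired output.

```python
import numpy as np

def design(x, centers, sigma):
    c = np.asarray(centers)
    return np.exp(-((x[:, None] - c[None, :]) ** 2) / (2 * sigma ** 2))

def select_centers(x, y, n_centers, sigma):
    """Greedy forward selection: every data point is a candidate center."""
    chosen, candidates = [], list(range(len(x)))
    for _ in range(n_centers):
        best, best_rss = None, np.inf
        for c in candidates:
            phi = design(x, x[chosen + [c]], sigma)
            w, *_ = np.linalg.lstsq(phi, y, rcond=None)
            rss = np.sum((y - phi @ w) ** 2)   # unexplained output energy
            if rss < best_rss:
                best, best_rss = c, rss
        chosen.append(best)
        candidates.remove(best)
    return x[chosen]

x = np.linspace(0.0, 1.0, 40)
rng = np.random.default_rng(0)
y = np.sin(2 * np.pi * x) + 0.01 * rng.standard_normal(40)

centers = select_centers(x, y, n_centers=6, sigma=0.2)
w, *_ = np.linalg.lstsq(design(x, centers, sigma=0.2), y, rcond=None)
rss = np.sum((y - design(x, centers, sigma=0.2) @ w) ** 2)
print(rss)  # residual energy left after six greedily chosen centers
```

The refit-per-candidate loop costs O(candidates) least-squares solves per step; the paper's orthogonalisation gets the same selections far more cheaply.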


Proceedings ArticleDOI: 10.1145/383259.383266
J. C. Carr, Rick Beatson, J. B. Cherrie, T. J. Mitchell, +3 more (1 institution)
01 Aug 2001
Abstract: We use polyharmonic Radial Basis Functions (RBFs) to reconstruct smooth, manifold surfaces from point-cloud data and to repair incomplete meshes. An object's surface is defined implicitly as the zero set of an RBF fitted to the given surface data. Fast methods for fitting and evaluating RBFs allow us to model large data sets, consisting of millions of surface points, by a single RBF, previously an impossible task. A greedy algorithm in the fitting process reduces the number of RBF centers required to represent a surface and results in significant compression and further computational advantages. The energy-minimisation characterisation of polyharmonic splines results in a "smoothest" interpolant. This scale-independent characterisation is well suited to reconstructing surfaces from non-uniformly sampled data. Holes are smoothly filled and surfaces smoothly extrapolated. We use a non-interpolating approximation when the data is noisy. The functional representation is in effect a solid model, which means that gradients and surface normals can be determined analytically. This helps generate uniform meshes, and we show that the RBF representation has advantages for mesh simplification and remeshing applications. Results are presented for real-world rangefinder data.


Topics: Hierarchical RBF (61%), Polyharmonic spline (58%), Polygon mesh (51%)

1,841 Citations
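A toy 2-D sketch of the implicit-surface idea from the abstract above, with a unit circle standing in for a scanned 3-D surface and a Gaussian kernel standing in for the paper's polyharmonic RBFs (both are assumptions made to keep the linear algebra short). On-curve points are assigned value 0, points offset along the normals get plus or minus the offset distance, and the fitted RBF's zero level set then implicitly represents the curve.

```python
import numpy as np

SIGMA = 0.3

def kernel(r):
    # Assumption: Gaussian kernel, not the paper's polyharmonic splines.
    return np.exp(-(r ** 2) / (2 * SIGMA ** 2))

t = np.linspace(0, 2 * np.pi, 30, endpoint=False)
on = np.c_[np.cos(t), np.sin(t)]      # samples on the unit circle
normals = on                          # outward normals of the unit circle
pts = np.vstack([on, on + 0.1 * normals, on - 0.1 * normals])
vals = np.concatenate([np.zeros(30), np.full(30, 0.1), np.full(30, -0.1)])

# Fit the RBF weights; a tiny ridge keeps the system well conditioned.
r = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
w = np.linalg.solve(kernel(r) + 1e-8 * np.eye(len(pts)), vals)

def implicit(q):
    """Evaluate the implicit function: ~0 on the curve, + outside, - inside."""
    r = np.linalg.norm(q[:, None, :] - pts[None, :, :], axis=-1)
    return kernel(r) @ w

# Query a point just outside and a point just inside the circle.
print(implicit(np.array([[1.05, 0.0], [0.95, 0.0]])))
```

Because the representation is a smooth function rather than a mesh, gradients (and hence surface normals) could be obtained analytically by differentiating the kernel.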


Open access Journal ArticleDOI: 10.1109/TNN.2002.1000134
Abstract: A general and efficient design approach using a radial basis function (RBF) neural classifier to cope with small training sets of high dimension, which is a problem frequently encountered in face recognition, is presented. In order to avoid overfitting and reduce the computational burden, face features are first extracted by the principal component analysis (PCA) method. Then, the resulting features are further processed by the Fisher's linear discriminant (FLD) technique to acquire lower-dimensional discriminant patterns. A novel paradigm is proposed whereby data information is encapsulated in determining the structure and initial parameters of the RBF neural classifier before learning takes place. A hybrid learning algorithm is used to train the RBF neural networks so that the dimension of the search space is drastically reduced in the gradient paradigm. Simulation results conducted on the ORL database show that the system achieves excellent performance both in terms of error rates of classification and learning efficiency.


  • Fig. 1. Schematic diagram of RBF neural classifier for small training sets of high dimension.
  • Fig. 7. The ORL face database.
  • Fig. 11. An example of incorrect classification: (a) training images and (b) test images.
  • Table XIV. Generalization errors for testing data by different initial widths (the result is the sum of six simulations).
  • Table XV. Generalization errors for testing data by the ORBF method.

614 Citations
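The pipeline from the abstract above can be sketched on synthetic two-class data (the ORL faces, the network sizes, and the hybrid learning algorithm are not reproduced here; every dimension below is an illustrative assumption): PCA compresses the high-dimensional inputs, a Fisher discriminant projects them to a discriminative feature, and a small Gaussian-RBF classifier is trained on that feature.

```python
import numpy as np

rng = np.random.default_rng(1)
# Two classes of 100-dimensional "images", only 10 samples each
# (small training set, high dimension).
X = np.vstack([rng.standard_normal((10, 100)),
               rng.standard_normal((10, 100)) + 1.5])
y = np.array([0] * 10 + [1] * 10)

# PCA: project the centered data onto its top 5 principal components.
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
Z = (X - mu) @ Vt[:5].T

# Fisher's linear discriminant for two classes: w = Sw^-1 (m1 - m0).
m0, m1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
Sw = np.cov(Z[y == 0].T) + np.cov(Z[y == 1].T)
f = Z @ np.linalg.solve(Sw, m1 - m0)
f = (f - f.mean()) / f.std()          # standardise the 1-D feature

# RBF classifier: one Gaussian unit per class mean, linear output weights.
centers = np.array([f[y == 0].mean(), f[y == 1].mean()])
phi = np.exp(-((f[:, None] - centers[None, :]) ** 2) / 2.0)
W = np.linalg.lstsq(phi, np.eye(2)[y], rcond=None)[0]
acc = ((phi @ W).argmax(axis=1) == y).mean()
print(acc)  # training accuracy on the toy data
```

Placing the RBF centers from the data (here, at the class means) before any gradient training is the spirit of the paper's initialisation: the search space left for learning is drastically reduced.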


Performance Metrics
No. of papers in the topic in previous years

Year    Papers
2018    3
2017    10
2016    14
2015    16
2014    24
2013    30

Top Attributes


Topic's top 5 most impactful authors

Yuehui Chen

6 papers, 38 citations

Nicolaos B. Karayiannis

5 papers, 310 citations

Siti Mariyam Shamsuddin

5 papers, 116 citations

Ireneusz Czarnowski

3 papers, 19 citations

Roman Neruda

3 papers, 10 citations

Network Information
Related Topics (5)
CURE data clustering algorithm

13.7K papers, 461.2K citations

80% related
Fuzzy clustering

23.2K papers, 601.2K citations

78% related
Scale-space segmentation

26.7K papers, 599.6K citations

78% related
Feature extraction

111.8K papers, 2.1M citations

77% related
Feature vector

48.8K papers, 954.4K citations

77% related