Journal ArticleDOI
A fast training algorithm for neural networks
Jarosław Bilski, Leszek Rutkowski +1 more
TL;DR: The recursive least squares method (RLS) is derived for the learning of multilayer feedforward neural networks, and simulation results indicate a fast learning process in comparison to the classical and momentum backpropagation (BP) algorithms.
Abstract: The recursive least squares method (RLS) is derived for the learning of multilayer feedforward neural networks. Simulation results on the XOR, 4-2-4 encoder, and function approximation problems indicate a fast learning process in comparison to the classical and momentum backpropagation (BP) algorithms.
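To illustrate the idea behind RLS-based training, here is a minimal sketch of the standard recursive least squares update applied to a single linear neuron. This is an illustrative assumption, not the paper's exact multilayer derivation; the function name `rls_train` and the hyperparameters `lam` (forgetting factor) and `delta` (initial inverse-correlation scale) are hypothetical choices.

```python
import numpy as np

def rls_train(X, d, lam=0.99, delta=100.0):
    """Sketch of RLS training for one linear neuron (illustrative only)."""
    n = X.shape[1]
    w = np.zeros(n)            # weight vector
    P = np.eye(n) * delta      # estimate of the inverse input correlation matrix
    for x, target in zip(X, d):
        Px = P @ x
        k = Px / (lam + x @ Px)        # gain vector
        e = target - w @ x             # a priori output error
        w = w + k * e                  # weight update
        P = (P - np.outer(k, Px)) / lam  # inverse-correlation update
    return w

# Usage: recover the weights of y = 2*x1 - x2 + 1 (bias folded in as a
# constant input) from noiseless samples.
rng = np.random.default_rng(0)
X = np.hstack([rng.standard_normal((200, 2)), np.ones((200, 1))])
d = 2 * X[:, 0] - X[:, 1] + 1
w = rls_train(X, d)
print(np.round(w, 2))
```

Unlike gradient-descent BP, each RLS step solves a running least-squares problem exactly, which is the source of the fast convergence reported in the paper; the multilayer case additionally requires propagating desired signals to the hidden layers.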
Citations
Journal ArticleDOI
On Adaptive Learning Rate That Guarantees Convergence in Feedforward Networks
TL;DR: The investigations in this paper help to better understand the learning procedure of feedforward neural networks in terms of adaptive learning rate, convergence speed, and local minima.
Journal ArticleDOI
Asynchronous Event-Based Binocular Stereo Matching
TL;DR: It is shown that matching on the timing of visual events, combined with geometric constraints based on the distance to the epipolar lines, provides a new solution to the real-time computation of 3-D structure.
Journal ArticleDOI
Data-Driven MFAC for a Class of Discrete-Time Nonlinear Systems With RBFNN
Yuanming Zhu, Zhongsheng Hou +1 more
TL;DR: A novel model-free adaptive control method is proposed for a class of discrete-time single input single output (SISO) nonlinear systems, where the equivalent dynamic linearization technique is used on the ideal nonlinear controller.
Journal ArticleDOI
Predicting Chaotic Time Series Using Recurrent Neural Network
Jia-Shu Zhang, Xian-Ci Xiao +1 more
TL;DR: Numerical results show that the proposed RNN is a very powerful tool for predicting chaotic time series.
Journal ArticleDOI
Adaptive Computation Algorithm for RBF Neural Network
Honggui Han, Junfei Qiao +1 more
TL;DR: A novel learning algorithm, based on an adaptive computation algorithm, is proposed for nonlinear modelling and identification with radial basis function neural networks; simulation results demonstrate its effectiveness.
References
Book
Adaptive Filter Theory
TL;DR: In this paper, the authors present the recursive least-squares (RLS) adaptive filter, derived from the Kalman filter, which serves as the unifying basis for the family of RLS filters.
Journal ArticleDOI
A fast new algorithm for training feedforward neural networks
R.S. Scalero, N. Tepedelenlioglu +1 more
TL;DR: A fast new algorithm is presented for training multilayer perceptrons as an alternative to back-propagation; it reduces the required training time considerably and overcomes many of the shortcomings of conventional back-propagation algorithms.
Journal ArticleDOI
An adaptive least squares algorithm for the efficient training of artificial neural networks
TL;DR: In this paper, a novel learning algorithm is developed for the training of multilayer feedforward neural networks, based on a modification of the Marquardt-Levenberg least-squares optimization method.
Book ChapterDOI
Fast learning algorithms for neural networks
TL;DR: A generalized criterion for the training of feedforward neural networks is proposed, which leads to a variety of fast learning algorithms for single-layered as well as multilayered neural networks.
Related Papers (5)
On optimal global rate of convergence of some nonparametric identification procedures
Leszek Rutkowski, E. Rafajlowicz +1 more