Open Access · Journal Article · DOI

A new optimized GA-RBF neural network algorithm

TL;DR
A new optimized RBF neural network algorithm based on a genetic algorithm (the GA-RBF algorithm) is proposed; it uses a genetic algorithm to optimize both the weights and the structure of the RBF neural network through a new hybrid encoding scheme that optimizes the two simultaneously.
Abstract
When confronting complex problems, the radial basis function (RBF) neural network has the advantages of adaptivity and self-learning ability, but it is difficult to determine the number of hidden-layer neurons, and the learning of the weights from the hidden layer to the output layer is weak; these deficiencies easily degrade learning ability and recognition precision. To address this problem, we propose a new optimized RBF neural network algorithm based on a genetic algorithm (the GA-RBF algorithm), which uses a genetic algorithm to optimize both the weights and the structure of the RBF neural network through a new hybrid encoding scheme: binary encoding represents the number of hidden-layer neurons, and real-valued encoding represents the connection weights, so that the hidden-layer size and the connection weights are optimized simultaneously. Because the genetic optimization of the connection weights is not complete on its own, we apply the least mean square (LMS) algorithm for further learning, finally obtaining the new algorithm model. Testing the new algorithm on two UCI standard data sets shows that it improves operating efficiency on complex problems and also improves recognition precision, which demonstrates that the new algorithm is valid.
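The scheme described in the abstract can be illustrated with a minimal numpy sketch: each chromosome pairs a binary mask (which hidden neurons are active) with real-valued output weights, a genetic algorithm evolves both together, and LMS gradient steps then finish the weight learning. All function names, operators (one-point crossover on the mask, blend crossover plus Gaussian mutation on the weights), hyperparameters, and the toy data below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_outputs(X, centers, width, mask, weights):
    # Gaussian hidden-layer activations, masked to the active neurons.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    H = np.exp(-d2 / (2 * width ** 2)) * mask      # (n_samples, n_hidden)
    return H @ weights, H

def fitness(X, y, centers, width, chrom):
    mask, weights = chrom
    pred, _ = rbf_outputs(X, centers, width, mask, weights)
    return -np.mean((pred - y) ** 2)               # negative MSE: higher is better

def ga_train(X, y, centers, width, pop=20, gens=30):
    # Hybrid encoding: binary mask (structure) + real vector (weights).
    n_h = len(centers)
    population = [(rng.integers(0, 2, n_h).astype(float), rng.normal(0.0, 1.0, n_h))
                  for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(population, key=lambda c: fitness(X, y, centers, width, c),
                        reverse=True)
        elite = scored[: pop // 2]                 # elitist selection
        children = []
        while len(elite) + len(children) < pop:
            i, j = rng.choice(len(elite), size=2, replace=False)
            (m1, w1), (m2, w2) = elite[i], elite[j]
            cut = int(rng.integers(1, n_h))
            mask = np.concatenate([m1[:cut], m2[cut:]])          # one-point crossover
            weights = (w1 + w2) / 2 + rng.normal(0.0, 0.1, n_h)  # blend + mutation
            if rng.random() < 0.1:                               # bit-flip mutation
                k = int(rng.integers(n_h))
                mask[k] = 1.0 - mask[k]
            children.append((mask, weights))
        population = elite + children
    return max(population, key=lambda c: fitness(X, y, centers, width, c))

def lms_refine(X, y, centers, width, chrom, lr=0.05, epochs=300):
    # The GA's weight optimization is not complete; LMS steps finish the learning.
    mask, weights = chrom
    weights = weights.copy()
    for _ in range(epochs):
        pred, H = rbf_outputs(X, centers, width, mask, weights)
        weights -= lr * H.T @ (pred - y) / len(y)
    return mask, weights

# Toy demo: fit y = sin(3x) with 8 candidate Gaussian centers.
X = np.linspace(-1.0, 1.0, 40)[:, None]
y = np.sin(3.0 * X[:, 0])
centers = np.linspace(-1.0, 1.0, 8)[:, None]
best = ga_train(X, y, centers, width=0.4)
mask, weights = lms_refine(X, y, centers, 0.4, best)
pred, _ = rbf_outputs(X, centers, 0.4, mask, weights)
mse = float(np.mean((pred - y) ** 2))
```

The division of labor mirrors the paper's argument: the GA explores discrete structure (which neurons to keep) where gradient methods cannot, while LMS handles the fine-grained weight tuning that the GA leaves incomplete.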



Citations
Book Chapter · DOI

Evolving Radial Basis Function Networks Using Moth–Flame Optimizer

Abstract: This book chapter proposes a new training algorithm for Radial Basis Function networks (RBFNs) using a recently proposed optimization algorithm called the Moth–Flame Optimizer (MFO). After formulating MFO as an RBFN trainer, seven standard binary classification problems are employed as case studies. The MFO-based trainer is compared with Particle Swarm Optimization (PSO), the Genetic Algorithm (GA), the Bat Algorithm (BA), and newrb. The results show that the proposed trainer achieves superior results on the majority of the case studies, and the observed convergence behavior shows that it also benefits from accelerated convergence.
Journal Article · DOI

A Review of Epidemic Forecasting Using Artificial Neural Networks

TL;DR: The selection of a forecasting tool is critical to the precision of an epidemic forecast; hence, a working guide to choosing appropriate tools will help reduce inconsistency and imprecision in forecasting epidemic size in populations.
Journal Article · DOI

Time Series Prediction Using Radial Basis Function Neural Network

TL;DR: A radial basis function neural network (RBFNN) is proposed as the best model for predicting daily network traffic, and the results show that it yields the smallest MSE value, indicating good predictive accuracy.
Journal Article · DOI

Radial basis function neural network for 2 satisfiability programming

TL;DR: An integrated 2-Satisfiability radial basis function neural network (RBFNN-2SAT) is presented, and results obtained from computer simulation showed that RBFNN-2SATHT outperformed RBFNN-2SATNT.
Journal Article · DOI

RBF neural network modeling approach using PCA based LM–GA optimization for coke furnace system

TL;DR: Dimensionality reduction of the normalized input variables that affect the outputs is first carried out using principal component analysis (PCA); a radial basis function (RBF) neural network model is then established, and a genetic algorithm is introduced to train the centers, widths, and weights to improve modeling accuracy.
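The PCA front end in the pipeline this summary describes can be sketched in a few lines of numpy: normalize the inputs, take an SVD, and keep enough components to explain most of the variance before the RBF model sees the data. The synthetic data, the 95% variance threshold, and all names here are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical process data: 6 raw inputs, the last two near-duplicates of the
# first two, so PCA should discover a lower effective dimension.
X_raw = rng.normal(size=(200, 4))
X_raw = np.hstack([X_raw, X_raw[:, :2] + 0.01 * rng.normal(size=(200, 2))])

X = (X_raw - X_raw.mean(axis=0)) / X_raw.std(axis=0)      # normalize first
U, S, Vt = np.linalg.svd(X, full_matrices=False)          # PCA via SVD
explained = S ** 2 / (S ** 2).sum()                       # variance ratios
k = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1  # keep 95% of variance
Z = X @ Vt[:k].T                                          # reduced inputs for the RBF model
```

Shrinking the input dimension this way reduces the number of RBF centers and widths the genetic algorithm must subsequently train, which is the motivation the summary gives for putting PCA first.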
References
Journal Article · DOI

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result, the hidden units come to represent important features of the task domain.
Journal Article · DOI

Evolving artificial neural networks

TL;DR: It is shown, through an extensive literature review, that combinations of ANNs and EAs can lead to significantly better intelligent systems than relying on ANNs or EAs alone.
Journal Article · DOI

Regularization in the selection of radial basis function centers

TL;DR: An efficient implementation of RFS, into which either delete-1 or generalized cross-validation can be incorporated, is described, and a re-estimation formula for the regularization parameter is also discussed.
Journal Article · DOI

Evolutionary artificial neural networks: a review

TL;DR: The advantages and disadvantages of using EAs to optimize ANNs are explained, and the basic theories and algorithms for optimizing the weights, the network architecture, and the learning rules are provided.
Journal Article · DOI

A review of genetic algorithms applied to training radial basis function networks

TL;DR: A brief overview of feedforward ANNs and GAs is given, followed by a review of the current state of research on applying evolutionary techniques to training RBF networks.