Journal ArticleDOI

Let a biogeography-based optimizer train your Multi-Layer Perceptron

Seyedali Mirjalili, +2 more
- 01 Jun 2014 - 
- Vol. 269, pp. 188-209
TLDR
Training MLPs with Biogeography-Based Optimization (BBO) is significantly better than current heuristic learning algorithms and back-propagation (BP), and the results show that BBO provides very competitive results in comparison with the Extreme Learning Machine (ELM).
About
This article was published in Information Sciences on 2014-06-01 and has received 261 citations to date. The article focuses on the topics: Multilayer perceptron & Perceptron.
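The approach the article describes can be sketched as a population-based trainer: each candidate solution (a "habitat" in BBO terms) is a flattened MLP weight vector, fitness is the network's error, and BBO's migration and mutation operators evolve the population. Below is a minimal Python sketch under stated assumptions; the XOR task, network size, rates, and fitness function are illustrative choices, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR (illustrative; the paper uses standard benchmark datasets).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

H = 4                        # hidden units (assumed)
DIM = 2 * H + H + H + 1      # input weights + hidden biases + output weights + output bias

def mse(w):
    """Decode a flat weight vector into a 2-H-1 MLP and return its MSE."""
    W1 = w[:2 * H].reshape(2, H)
    b1 = w[2 * H:3 * H]
    W2 = w[3 * H:4 * H]
    b2 = w[4 * H]
    hidden = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))   # sigmoid output
    return np.mean((out - y) ** 2)

POP, GEN = 40, 300
pop = rng.uniform(-2, 2, (POP, DIM))

for _ in range(GEN):
    fit = np.array([mse(w) for w in pop])
    pop = pop[np.argsort(fit)]              # best (lowest MSE) habitat first
    # Linear immigration/emigration rates by rank: good habitats emigrate more,
    # poor habitats immigrate more.
    ranks = np.arange(POP)
    immigration = ranks / (POP - 1)
    emigration = 1 - immigration
    new_pop = pop.copy()
    for i in range(1, POP):                 # keep the elite habitat unchanged
        for d in range(DIM):
            if rng.random() < immigration[i]:
                # Roulette-wheel choice of an emigrating habitat for this feature.
                src = rng.choice(POP, p=emigration / emigration.sum())
                new_pop[i, d] = pop[src, d]
            if rng.random() < 0.02:         # mutation (rate assumed)
                new_pop[i, d] = rng.uniform(-2, 2)
    pop = new_pop

best = min(pop, key=mse)
print(f"final MSE: {mse(best):.4f}")
```

Elitism (leaving the best habitat untouched each generation) guarantees the best fitness never degrades, which is why a derivative-free trainer like this can still converge reliably on a small task.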


Citations
Journal ArticleDOI

Optimizing connection weights in neural networks using the whale optimization algorithm

TL;DR: The qualitative and quantitative results prove that the proposed WOA-based trainer is able to outperform the current algorithms on the majority of datasets in terms of both local optima avoidance and convergence speed.
Proceedings ArticleDOI

Elephant Herding Optimization

TL;DR: A new kind of swarm-based metaheuristic search method, called Elephant Herding Optimization (EHO), is proposed for solving optimization tasks; it is inspired by the herding behavior of elephant groups.
Journal ArticleDOI

How effective is the Grey Wolf optimizer in training multi-layer perceptrons

TL;DR: The statistical results prove that the GWO algorithm provides very competitive results in terms of improved local optima avoidance, and that the proposed trainer achieves a high level of accuracy in classification and function approximation.
Journal ArticleDOI

Dendritic Neuron Model With Effective Learning Algorithms for Classification, Approximation, and Prediction

TL;DR: Six learning algorithms, including biogeography-based optimization, particle swarm optimization, genetic algorithm, ant colony optimization, evolutionary strategy, and population-based incremental learning, are used to train a new dendritic neuron model (DNM), and these algorithms are shown to make the DNM more powerful in solving classification, approximation, and prediction problems.
Journal ArticleDOI

Biogeography-based optimisation with chaos

TL;DR: This study utilises ten chaotic maps to enhance the performance of the biogeography-based optimisation algorithm and demonstrates that the combination of chaotic selection and emigration operators results in the highest performance.
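As a concrete illustration of the kind of map such hybrids use, the logistic map is a standard chaotic map: it generates a deterministic but non-repeating sequence in (0, 1) that can stand in for the uniform random draws in BBO's selection and emigration operators. A minimal sketch follows; the parameters (x0 = 0.7, r = 4.0) are the usual textbook choice, not necessarily those of the study.

```python
def logistic_map(x0=0.7, r=4.0, n=10):
    """Generate n values of the logistic map x <- r * x * (1 - x)."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

seq = logistic_map()
print(seq[:3])  # [0.84, 0.5376, 0.99434496]
```

At r = 4.0 the map is fully chaotic, so the sequence covers (0, 1) densely while remaining reproducible from the seed x0, a property chaotic-metaheuristic studies exploit in place of a pseudo-random generator.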
References
Journal ArticleDOI

Multilayer feedforward networks are universal approximators

TL;DR: It is rigorously established that standard multilayer feedforward networks with as few as one hidden layer using arbitrary squashing functions are capable of approximating any Borel measurable function from one finite dimensional space to another to any desired degree of accuracy, provided sufficiently many hidden units are available.
Journal ArticleDOI

A logical calculus of the ideas immanent in nervous activity

TL;DR: In this article, it is shown that many particular choices among possible neurophysiological assumptions are equivalent, in the sense that for every net behaving under one assumption, there exists another net which behaves under another and gives the same results, although perhaps not in the same time.
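The McCulloch-Pitts model underlying this result is a threshold unit: a neuron fires (outputs 1) if and only if its weighted input sum reaches a threshold. A minimal sketch, where the weights and thresholds chosen to realize AND and OR are illustrative:

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fires iff the weighted sum reaches the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# Boolean gates as single McCulloch-Pitts neurons (illustrative parameters).
AND = lambda a, b: mp_neuron([a, b], [1, 1], 2)
OR = lambda a, b: mp_neuron([a, b], [1, 1], 1)

print(AND(1, 1), AND(1, 0), OR(1, 0), OR(0, 0))  # 1 0 1 0
```

The paper's equivalence results concern nets built from exactly such units: different threshold/weight assignments can realize the same input-output behavior.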
Book ChapterDOI

Individual Comparisons by Ranking Methods

TL;DR: The comparison of two treatments generally falls into one of the following two categories: (a) a number of unpaired replications for each of the two treatments, or (b) a series of paired comparisons, some of which may be positive and some negative.
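This is the paired-comparison test that metaheuristic papers such as the one above typically apply to compare two trainers over the same set of benchmark runs. A minimal sketch with SciPy's `scipy.stats.wilcoxon`; the error values below are made-up illustrative numbers, not results from the paper.

```python
from scipy.stats import wilcoxon

# Final errors of two trainers over the same 10 runs (illustrative numbers).
bbo_errors = [0.020, 0.031, 0.018, 0.040, 0.026, 0.033, 0.022, 0.029, 0.037, 0.024]
bp_errors = [0.030, 0.042, 0.030, 0.053, 0.040, 0.048, 0.038, 0.046, 0.055, 0.043]

# Paired (signed-rank) test: each run is a matched pair, so differences are ranked.
stat, p = wilcoxon(bbo_errors, bp_errors)
print(f"W = {stat}, p = {p:.4f}")
```

Because every difference here favors the first trainer, the signed-rank statistic is 0 and the exact two-sided p-value is 2/2^10 ≈ 0.002, i.e. significant at the usual 0.05 level.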
Journal ArticleDOI

No free lunch theorems for optimization

TL;DR: A framework is developed to explore the connection between effective optimization algorithms and the problems they are solving and a number of "no free lunch" (NFL) theorems are presented which establish that for any algorithm, any elevated performance over one class of problems is offset by performance over another class.