
Inertia Weight Strategies in Particle Swarm Optimization

¹J. C. Bansal, ²P. K. Singh, ³Mukesh Saraswat, ⁴Abhishek Verma, ⁵Shimpi Singh Jadon, ⁶,⁷Ajith Abraham
¹,²,³,⁴,⁵ABV-Indian Institute of Information Technology & Management, Gwalior, India
⁶Machine Intelligence Research Labs (MIR Labs), USA
⁷VSB Technical University of Ostrava, Czech Republic
¹jcbansal@gmail.com, ²pksingh7@gmail.com, ³saraswatmukesh@gmail.com, ⁴abhishekverma.cs@gmail.com, ⁵shimpisingh2k6@gmail.com, ⁶ajith.abraham@ieee.org
Abstract: Particle Swarm Optimization (PSO) is a popular heuristic search algorithm inspired by the social learning of birds and fish. It is a swarm intelligence technique for optimization developed by Eberhart and Kennedy [1] in 1995. Inertia Weight is an important parameter in PSO, which significantly affects convergence and the exploration-exploitation trade-off in the PSO process. Since the inception of Inertia Weight in PSO, a large number of variations of the Inertia Weight strategy have been proposed. In order to identify one or more Inertia Weight strategies that are more efficient than the others, this paper studies 15 relatively recent and popular Inertia Weight strategies and compares their performance on five optimization test problems.
Keywords: Particle Swarm Optimization; Inertia Weight; Convergence.
I. INTRODUCTION
Particle Swarm Optimization (PSO) is an optimization technique inspired by the social behavior of bird flocking and fish schooling in search of food. The technique was originally designed and developed by Eberhart and Kennedy [1]. The prominent features of PSO are its easy implementation, robustness to control parameters, and computational efficiency compared with other existing heuristic algorithms, such as the genetic algorithm, on continuous problems. PSO can be applied to non-differentiable, non-linear, huge-search-space problems and gives good results with high efficiency. In PSO, instead of using the more traditional genetic operators, each particle modifies its movement according to its own experience and the experience of its neighboring particles. The two equations used in PSO are the position update equation and the velocity update equation, which are applied in each iteration of the PSO algorithm to converge towards the optimum solution. For an n-dimensional search space, the i-th particle of the swarm is represented by an n-dimensional vector X_i = (x_i1, x_i2, ..., x_in)^T. The velocity of this particle is represented by another n-dimensional vector V_i = (v_i1, v_i2, ..., v_in)^T. The previously best visited position of the i-th particle is denoted by P_i = (p_i1, p_i2, ..., p_in)^T, and g is the index of the best particle in the swarm. The velocity of the i-th particle is updated using the velocity update equation given by (1) and the position is updated using (2).

v_id = v_id + c1 r1 (p_id - x_id) + c2 r2 (p_gd - x_id)    (1)

x_id = x_id + v_id    (2)

where d = 1, 2, ..., n represents the dimension and i = 1, 2, ..., S represents the particle index. S is the size of the swarm, and c1 and c2 are constants, called the cognitive and social scaling parameters, respectively (usually c1 = c2; r1 and r2 are random numbers drawn from a uniform distribution).
Equations (1) and (2) define the classical version of the PSO algorithm. A constant, V_max, was introduced to arbitrarily limit the velocities of the particles and improve the resolution of the search. The maximum velocity V_max serves as a constraint to control the global exploration ability of the particle swarm. Further, the concept of an Inertia Weight was developed by Shi and Eberhart [2] in 1998 to better control exploration and exploitation. The motivation was to be able to eliminate the need for V_max. The resulting velocity update equation becomes:

v_id = w v_id + c1 r1 (p_id - x_id) + c2 r2 (p_gd - x_id)    (3)
Since a particle's previous velocity has a large effect on the balance between exploration and exploitation of the swarm, the Inertia Weight (w) is used to control the velocity. In this paper, Inertia Weight strategies for PSO are reviewed, and experiments are carried out on five basic benchmark optimization functions to compare different strategies for setting the Inertia Weight.
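To make equations (1)-(3) concrete, the following is a minimal, self-contained sketch of a global-best PSO in Python. This is illustrative code, not the authors' C++ implementation: the parameter defaults follow the paper's settings (swarm of 50, c1 = c2 = 2, w = 0.7), and a simple velocity clamp stands in for the V_max constraint discussed above.

```python
import random

def pso(f, dim=10, swarm=50, iters=1000, w=0.7, c1=2.0, c2=2.0,
        lo=-5.12, hi=5.12):
    """Minimal global-best PSO: velocity update (3), position update (2)."""
    vmax = (hi - lo) / 2.0                      # velocity clamp, akin to V_max
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    V = [[0.0] * dim for _ in range(swarm)]
    P = [x[:] for x in X]                       # personal best positions P_i
    pbest = [f(x) for x in X]                   # personal best fitness values
    g = min(range(swarm), key=lambda i: pbest[i])   # global best index 'g'
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                v = (w * V[i][d]                         # inertia term
                     + c1 * r1 * (P[i][d] - X[i][d])     # cognitive term
                     + c2 * r2 * (P[g][d] - X[i][d]))    # social term
                V[i][d] = max(-vmax, min(vmax, v))       # clamp to [-vmax, vmax]
                X[i][d] += V[i][d]                       # position update (2)
            fx = f(X[i])
            if fx < pbest[i]:                  # improve personal best
                pbest[i], P[i] = fx, X[i][:]
                if fx < pbest[g]:              # improve global best
                    g = i
    return P[g], pbest[g]

# Quick check on the 5-dimensional Sphere function
best_x, best_f = pso(lambda x: sum(v * v for v in x), dim=5, iters=200)
```

Any of the Inertia Weight strategies surveyed below can be dropped into this loop by replacing the fixed `w` with a value recomputed each iteration.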
II. DIFFERENT INERTIA WEIGHT STRATEGIES FOR PARTICLE SWARM OPTIMIZATION
Inertia Weight plays a key role in providing a balance between the exploration and exploitation processes. The Inertia Weight determines the contribution rate of a particle's previous velocity to its velocity at the current time step. The basic PSO, presented by Eberhart and Kennedy in 1995 [1], has no Inertia Weight. In 1998, Shi and Eberhart [2] first presented the concept of Inertia Weight by introducing a Constant Inertia Weight. They stated that a large Inertia Weight facilitates a global search while a small Inertia Weight facilitates a local search. Further, dynamic adjustment of the Inertia Weight was introduced by many researchers, which can increase the capabilities of PSO. A review of Inertia Weight strategies in PSO is given chronologically in the subsequent paragraphs.

640  978-1-4577-1123-7/11/$26.00 © 2011 IEEE
Eberhart and Shi [3] proposed a Random Inertia Weight strategy and found experimentally that this strategy increases the convergence of PSO in the early iterations of the algorithm. The Linearly Decreasing strategy [6] enhances the efficiency and performance of PSO. It has been found experimentally that decreasing the Inertia Weight from 0.9 to 0.4 provides excellent results. In spite of its ability to converge to the optimum, it tends to get trapped in local optima when solving multimodal (many-peaked) functions.
In the Global-Local Best Inertia Weight strategy [9], the Inertia Weight is a function of the local best and global best of the particles in each generation. It takes neither a constant value nor a linearly decreasing time-varying value. To overcome the weakness of premature convergence to local minima, an Adaptive Inertia Weight strategy [4] was proposed to improve the searching capability of PSO. It controls the population diversity by adaptive adjustment of the Inertia Weight.
Fayek et al. [11] introduced an optimized Particle Swarm technique (PSOSA) that uses Simulated Annealing to optimize the Inertia Weight, and tested the approach on an urban planning problem. The proposed technique performs much better in terms of convergence speed, as well as scalability to the increased load of a growing number of blocks to be fitted in the urban planning problem.
Chen et al. [13] presented two Natural Exponent Inertia Weight strategies, which are based on the basic idea of a decreasing Inertia Weight. Experimentally, these two new strategies converge faster than the linear one during the early stage of the search process and provide better results for most continuous optimization problems. Using the merits of chaotic optimization, a Chaotic Inertia Weight was proposed by Feng et al. [7]. A comparison between CRIW PSO and RIW PSO was carried out, and it was found that CRIW PSO performs excellently; it alternates between a rough search stage and a minute search stage throughout its evolutionary process.
Malik et al. [5] presented a Sigmoid Increasing Inertia Weight. They found that the sigmoid function contributes to attaining a minimum fitness value, while a Linearly Increasing Inertia Weight contributes to quick convergence. They therefore combined the sigmoid function and the Linearly Increasing Inertia Weight to obtain SIIW, which produces a great improvement in quick convergence and aggressive movement narrowing towards the solution region. The Oscillating Inertia Weight [8] periodically alternates between global and local search waves; the conclusion drawn was that this strategy is generally competitive and, in some cases, outperforms the others, particularly in terms of convergence speed.
Gao et al. [14] proposed a new PSO algorithm that combines a Logarithm Decreasing Inertia Weight with a chaos mutation operator. The Logarithm Decreasing Inertia Weight can improve the convergence speed, while the chaos mutation can enhance the ability to jump out of local optima. In order to overcome the premature convergence and late-period oscillation of the standard PSO, an improved PSO using an Exponent Decreasing Inertia Weight and a stochastic mutation has been proposed by Gao et al. [12]; it applies the Exponent Decreasing Inertia Weight along with a stochastic piecewise mutation of the current global optimal particle during the run, thus strengthening the ability to escape local optima.
The summary of different Inertia Weight strategies is
tabulated in Table 1 along with the required constraints.
III. EXPERIMENTAL RESULTS
To suggest a better strategy for a user of PSO with Inertia
Weight, experiments have been carried out for 15 different
Inertia Weight strategies over five optimization test
problems.
A. Parameter Settings
The swarm size is taken to be 50. The number of decision variables is fixed at 10 for each experiment. The termination criterion is set to "no improvement observed for 200 iterations (similar fitness value achieved for 200 consecutive iterations)". For those strategies which require a maximum number of iterations, 1000 iterations are used. To average out the effect of the choice of initial population, 30 simulations are performed. The values of the acceleration parameters c1 and c2 are both taken equal to 2. Five different test optimization functions are used for the experiments; these functions are shown in Table 2 along with their range of search space. For implementing these 15 strategies in PSO, C++ code was developed and compiled with the Dev-C++ compiler.
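The stall-based termination criterion described above can be sketched as a small helper. This is hypothetical illustrative code, not from the paper; the function name, window size, and tolerance are assumptions.

```python
def stalled(best_history, window=200, tol=1e-12):
    """True once the best fitness has not improved for `window` consecutive iterations."""
    if len(best_history) <= window:
        return False          # not enough history to judge stagnation yet
    # compare the best value `window` iterations ago with the current best
    return best_history[-window - 1] - best_history[-1] < tol

# illustrative best-so-far traces (monotonically non-increasing)
still_improving = [1.0 / (t + 1) for t in range(300)]
flat = [1.0, 0.5] + [0.25] * 250
```

In the main loop, the best fitness found so far would be appended each iteration and the loop exited once `stalled(...)` returns True.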
2011 Third World Congress on Nature and Biologically Inspired Computing 641

TABLE 1. DIFFERENT INERTIA WEIGHT STRATEGIES

Sr. No. | Name of Inertia Weight | Formula of Inertia Weight | Reference
1 | Constant Inertia Weight | w = c; c = 0.7 (considered for experiments) | [2]
2 | Random Inertia Weight | w = 0.5 + rand()/2 | [3]
3 | Adaptive Inertia Weight | w_i(t+1) = w(0) + (w(n_t) - w(0)) (e^(m_i(t)) - 1)/(e^(m_i(t)) + 1), where m_i(t) = (gbest(t) - pbest_i(t))/(gbest(t) + pbest_i(t)) | [4]
4 | Sigmoid Increasing Inertia Weight | w_k = (w_start - w_end)/(1 + e^(-u(k - n gen))) + w_end, u = 10^(log(gen) - 2) | [5]
5 | Sigmoid Decreasing Inertia Weight | w_k = (w_start - w_end)/(1 + e^(u(k - n gen))) + w_end, u = 10^(log(gen) - 2) | [5]
6 | Linear Decreasing Inertia Weight | w_k = w_max - (w_max - w_min) k / iter_max | [6]
7 | Chaotic Inertia Weight | z = 4z(1 - z); w = (w1 - w2) (MAXiter - iter)/MAXiter + w2 z | [7]
8 | Chaotic Random Inertia Weight | z = 4z(1 - z); w = 0.5 rand() + 0.5 z | [7]
9 | Oscillating Inertia Weight | w(t) = (w_min + w_max)/2 + (w_max - w_min)/2 cos(2 pi t / T) | [8]
10 | Global-Local Best Inertia Weight | w_i = 1.1 - gbest_i / pbest_i | [9]
11 | Simulated Annealing Inertia Weight | w_k = w_min + (w_max - w_min) lambda^(k-1), lambda = 0.95 | [11]
12 | Natural Exponent Inertia Weight Strategy (e1-PSO) | w_k = w_min + (w_max - w_min) e^(-k/(MAXiter/10)) | [13]
13 | Natural Exponent Inertia Weight Strategy (e2-PSO) | w_k = w_min + (w_max - w_min) e^(-(k/(MAXiter/4))^2) | [13]
14 | Logarithm Decreasing Inertia Weight | w_t = w_max + (w_min - w_max) log10(a + 10t/T_max) | [14]
15 | Exponent Decreasing Inertia Weight | w_t = (w_max - w_min - d1) exp(1/(1 + d2 t / t_max)) | [12]
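Several of the Inertia Weight schedules surveyed above can be sketched as plain functions of the iteration counter. The sketch below assumes w_max = 0.9, w_min = 0.4 and a 1000-iteration budget; the function and constant names are illustrative, not from the paper.

```python
import math
import random

MAX_ITER, W_MAX, W_MIN = 1000, 0.9, 0.4   # assumed bounds: w in [0.4, 0.9]

def w_constant(t):
    # Constant Inertia Weight [2]: w = c = 0.7
    return 0.7

def w_random(t):
    # Random Inertia Weight [3]: w = 0.5 + rand()/2, mean 0.75
    return 0.5 + random.random() / 2.0

def w_linear_decreasing(t):
    # Linear Decreasing Inertia Weight [6]: ramp from W_MAX down to W_MIN
    return W_MAX - (W_MAX - W_MIN) * t / MAX_ITER

def w_chaotic(t, z):
    # Chaotic Inertia Weight [7]: logistic map z <- 4z(1-z) modulates a
    # decreasing ramp; returns (w, z) so the caller carries the chaotic state
    z = 4.0 * z * (1.0 - z)
    return (W_MAX - W_MIN) * (MAX_ITER - t) / MAX_ITER + W_MIN * z, z

def w_oscillating(t, period=250.0):
    # Oscillating Inertia Weight [8]: cosine wave between W_MIN and W_MAX
    mid, amp = (W_MAX + W_MIN) / 2.0, (W_MAX - W_MIN) / 2.0
    return mid + amp * math.cos(2.0 * math.pi * t / period)

def w_natural_exponent_e1(t):
    # Natural Exponent Inertia Weight (e1-PSO) [13]: exponential decay
    return W_MIN + (W_MAX - W_MIN) * math.exp(-t / (MAX_ITER / 10.0))
```

Each function maps the current iteration t to a weight, so a PSO loop can simply recompute w before applying the velocity update (3).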

TABLE 2. DIFFERENT FUNCTIONS FOR SIMULATIONS

Function Name | Objective Function | Search Space | Optimal Function Value
Sphere | f(x) = sum_{i=1..n} x_i^2 | [-5.12, 5.12] | 0
Griewank | f(x) = 1 + (1/4000) sum_{i=1..n} x_i^2 - prod_{i=1..n} cos(x_i / sqrt(i)) | [-600, 600] | 0
Rosenbrock | f(x) = sum_{i=1..n-1} [100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2] | [-30, 30] | 0
Rastrigin | f(x) = sum_{i=1..n} [x_i^2 - 10 cos(2 pi x_i) + 10] | [-5.12, 5.12] | 0
Ackley | f(x) = -20 exp(-0.2 sqrt((1/n) sum x_i^2)) - exp((1/n) sum cos(2 pi x_i)) + 20 + e | [-30, 30] | 0
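The benchmark functions of Table 2 can be written down directly from their standard definitions; the Python names below are illustrative.

```python
import math

def sphere(x):
    # minimum 0 at x = (0, ..., 0)
    return sum(v * v for v in x)

def griewank(x):
    # minimum 0 at x = (0, ..., 0)
    s = sum(v * v for v in x) / 4000.0
    p = math.prod(math.cos(v / math.sqrt(i + 1)) for i, v in enumerate(x))
    return 1.0 + s - p

def rosenbrock(x):
    # minimum 0 at x = (1, ..., 1)
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1.0) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    # minimum 0 at x = (0, ..., 0)
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def ackley(x):
    # minimum 0 at x = (0, ..., 0)
    n = len(x)
    a = -20.0 * math.exp(-0.2 * math.sqrt(sum(v * v for v in x) / n))
    b = -math.exp(sum(math.cos(2.0 * math.pi * v) for v in x) / n)
    return a + b + 20.0 + math.e
```

Sphere is unimodal, Rosenbrock has a narrow curved valley, and Griewank, Rastrigin, and Ackley are highly multimodal, which is why they stress the exploration side of the Inertia Weight trade-off.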
B. Experimental Results and Analysis
The result analysis is done using three different criteria: average error, average number of iterations, and minimum error obtained over all the simulations. Table 3 shows the average error obtained, and the corresponding box plots are given in Figure 1. It is observed that, in the case of the Rosenbrock function, most of the Inertia Weight strategies produce poor results in comparison to all the other test functions considered. From Table 3 and Figure 1, it is obvious that the Chaotic Inertia Weight strategy is the best from the point of view of accuracy, while the Chaotic Random Inertia Weight strategy is the worst among all the considered strategies. The average number of iterations required to produce the results (with no improvement for up to 200 iterations) is tabulated in Table 4, and the corresponding box plots are given in Figure 2. From Figure 2, it is found that the minimum average number of iterations is taken by the Random Inertia Weight and the maximum is taken by the Constant Inertia Weight. Table 5 presents the minimum error obtained over all simulations in each case considered above, and the corresponding box plots are given in Figure 3. It is clear from the data represented in Figure 3 that the Constant and Linear Decreasing Inertia Weights produce near-optimum results in comparison to the other methods. The summary of observations is given in Table 6.
IV. CONCLUSIONS
This paper presents a comparative study of 15 strategies for setting the Inertia Weight in the Particle Swarm Optimization algorithm. A set of five of the most common optimization test problems and three criteria for comparison have been considered. As an overall outcome of the experiments carried out in this paper, the Chaotic Inertia Weight is the best strategy for accuracy, while the Random Inertia Weight strategy is the best for efficiency.

TABLE 3. AVERAGE ERROR VALUE OF DIFFERENT INERTIA WEIGHT STRATEGIES FOR DIFFERENT TEST PROBLEMS
Inertia Weight Strategy | Sphere | Griewank | Rosenbrock | Rastrigin | Ackley
Constant 0 0.0660 87.1177 0.9959 3.76E-15
Random 16.2943 84.7315 49419.74 99.8390 18.4242
Adaptive 5.1986 16.5427 4525.81 77.6964 13.3161
Sigmoid increasing 6.6605 29.4044 3138.98 61.6105 13.6855
Sigmoid decreasing 28.7272 83.9782 11677.4 85.4881 18.1589
Linear decreasing 7.01E-81 0.0691 6.1676 39.7121 2.94E-15
Chaotic 5.48E-81 0.0913 3.6398 3.2203 3.41E-15
Chaotic random 15.6258 91.7888 68234.573 85.3247 17.7203
Oscillating 0 0.0562 232.9622 2.8883 3.76E-15
Global-local best 19.8213 54.2368 67725.4 78.9104 17.9878
Simulated annealing 0 0.0669 44.077 4.183 2.94E-15
Natural exponent (e1-PSO) 0 0.0752 4.7199 2.7223 3.05E-15
Natural exponent (e2-PSO) 5.0885 30.6310 631.2029 31.8677 12.1880
Logarithm decreasing 2.34E-36 0.0859 3.7097 4.9466 4.36E-15
Exponent decreasing 0 0.0717 3.5584 3.6519 2.94E-15
TABLE 4. AVERAGE NUMBER OF ITERATIONS OF DIFFERENT INERTIA WEIGHT STRATEGIES FOR DIFFERENT TEST PROBLEMS
Inertia Weight Strategy | Sphere | Griewank | Rosenbrock | Rastrigin | Ackley
Constant 27611.9 3236.77 11512 3097.13 2853.97
Random 202.13 202.43 202 201.93 202
Adaptive 306 281.667 320.067 297.033 292.1
Sigmoid increasing 469.37 265.07 419 319.6 352.5
Sigmoid decreasing 205.57 206.73 205 210.4 203.37
Linear decreasing 2278.3 1460.767 1573.4 526.533 1254.9
Chaotic 2404.9 1345.733 2469.767 1121 1012.633
Chaotic random 202.2 202 202.57 201.9 202.43
Oscillating 13383.67 2207.9 26406.73 1535.133 1472.9
Global-local best 224.67 220.93 223 239.67 248.5
Simulated annealing 4245.8 821.43 2017.6 712.7 642.87
Natural exponent (e1-PSO) 4390.5 1043.9 1231.2 928.27 790.5
Natural exponent (e2-PSO) 1256.133 380.2 2248.667 531.3667 332.6
Logarithm decreasing 1573.1 822.87 1033.7 671 681.57
Exponent decreasing 3627.9 870.9 1958.4 754.17 764.83


References
[1] J. Kennedy and R. C. Eberhart, "Particle Swarm Optimization," in Proceedings of the IEEE International Conference on Neural Networks, 1995.
[2] Y. Shi and R. C. Eberhart, "A Modified Particle Swarm Optimizer," in Proceedings of the IEEE International Conference on Evolutionary Computation, 1998.
[3] R. C. Eberhart and Y. Shi, "Tracking and Optimizing Dynamic Systems with Particle Swarms," in Proceedings of the IEEE Congress on Evolutionary Computation, 2001.
[4] Y. Shi and R. C. Eberhart, "Empirical Study of Particle Swarm Optimization," in Proceedings of the IEEE Congress on Evolutionary Computation, 1999.
[6] J. Xin, G. Chen, and Y. Hai, "A Particle Swarm Optimizer with Multi-Stage Linearly-Decreasing Inertia Weight," in Proceedings of the International Joint Conference on Computational Sciences and Optimization, 2009.
[14] Y. Gao, X. An, and J. Liu, "A Particle Swarm Optimization Algorithm with Logarithm Decreasing Inertia Weight and Chaos Mutation," in Proceedings of the International Conference on Computational Intelligence and Security, 2008.
