
Showing papers on "Particle swarm optimization published in 2002"


Proceedings ArticleDOI
06 Aug 2002
TL;DR: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced; the evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed.
Abstract: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced. The evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed. Benchmark testing of the paradigm is described, and applications, including nonlinear function optimization and neural network training, are proposed. The relationships between particle swarm optimization and both artificial life and genetic algorithms are described.
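
As a reading aid, here is a minimal Python sketch of the velocity/position update scheme this line of work popularized. The inertia weight w and the coefficient values are illustrative defaults from later standard formulations, not parameters taken from this particular paper.

```python
import random

def pso(f, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best PSO sketch (parameter values are illustrative)."""
    lo, hi = bounds
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]                   # best position seen by each particle
    pbest_val = [f(xi) for xi in x]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # best position seen by the swarm

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])   # cognitive pull
                           + c2 * r2 * (gbest[d] - x[i][d]))     # social pull
                x[i][d] += v[i][d]
            val = f(x[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = x[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = x[i][:], val
    return gbest, gbest_val

# Example: minimize the sphere function in 5 dimensions.
print(pso(lambda p: sum(t * t for t in p), dim=5, bounds=(-10, 10)))
```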

35,104 citations


Journal ArticleDOI
TL;DR: This paper analyzes a particle's trajectory as it moves in discrete time, then progresses to the view of it in continuous time, leading to a generalized model of the algorithm, containing a set of coefficients to control the system's convergence tendencies.
Abstract: The particle swarm is an algorithm for finding optimal regions of complex search spaces through the interaction of individuals in a population of particles. This paper analyzes a particle's trajectory as it moves in discrete time (the algebraic view), then progresses to the view of it in continuous time (the analytical view). A five-dimensional depiction is developed, which describes the system completely. These analyses lead to a generalized model of the algorithm, containing a set of coefficients to control the system's convergence tendencies. Some results of the particle swarm optimizer, implementing modifications derived from the analysis, suggest methods for altering the original algorithm in ways that eliminate problems and increase the ability of the particle swarm to find optima of some well-studied test functions.
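
One widely quoted outcome of this trajectory analysis is the constriction-coefficient form of the update. As a hedged sketch (the paper derives a whole family of coefficient sets; this is only the most commonly cited special case):

```latex
v_{t+1} = \chi \left[ v_t + c_1 r_1 (p_i - x_t) + c_2 r_2 (p_g - x_t) \right],
\qquad x_{t+1} = x_t + v_{t+1},
\qquad \chi = \frac{2}{\left| 2 - \varphi - \sqrt{\varphi^2 - 4\varphi} \right|},
\quad \varphi = c_1 + c_2 > 4 .
```

For the common choice φ = 4.1 this gives χ ≈ 0.7298, which is why that constant appears so often in later PSO implementations.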

8,287 citations


Proceedings ArticleDOI
12 May 2002
TL;DR: This paper proposes extending the heuristic called "particle swarm optimization" (PSO) to multiobjective optimization problems; the approach maintains previously found nondominated vectors in a global repository that other particles later use to guide their own flight.
Abstract: This paper introduces a proposal to extend the heuristic called "particle swarm optimization" (PSO) to deal with multiobjective optimization problems. Our approach uses the concept of Pareto dominance to determine the flight direction of a particle and it maintains previously found nondominated vectors in a global repository that is later used by other particles to guide their own flight. The approach is validated using several standard test functions from the specialized literature. Our results indicate that our approach is highly competitive with current evolutionary multiobjective optimization techniques.
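
To make the repository idea concrete, here is a minimal Python sketch of Pareto dominance and a nondominated archive, assuming minimization. Leader selection is simplified to a uniform draw from the repository; the paper's actual selection rule is not reproduced here.

```python
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_repository(repo, candidate):
    """Keep only nondominated solutions; candidate is a (position, objectives) pair."""
    _, f_new = candidate
    if any(dominates(f_old, f_new) for _, f_old in repo):
        return repo                                   # dominated: discard candidate
    repo = [(x, f) for x, f in repo if not dominates(f_new, f)]
    repo.append(candidate)
    return repo

def pick_leader(repo):
    """Simplified leader choice: uniform pick from the archive."""
    return random.choice(repo)[0]
```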

1,842 citations


Proceedings ArticleDOI
12 May 2002
TL;DR: The effects of various population topologies on the particle swarm algorithm were systematically investigated and it was discovered that previous assumptions may not have been correct.
Abstract: The effects of various population topologies on the particle swarm algorithm were systematically investigated. Random graphs were generated to specifications, and their performance on several criteria was compared. What makes a good population structure? We discovered that previous assumptions may not have been correct.
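
For readers unfamiliar with what a "population topology" changes in the algorithm, the hypothetical helpers below show how the neighbourhood best is computed under a ring (lbest) versus a fully connected (gbest) structure; the random graphs studied in the paper would simply be other neighbour sets.

```python
def neighborhood_best(i, neighbors, pbest_val):
    """Index of the best personal best among particle i's neighbors (including i)."""
    members = neighbors[i] | {i}
    return min(members, key=lambda j: pbest_val[j])

def ring_topology(n, k=1):
    """Each particle is connected to its k nearest indices on a ring (lbest)."""
    return {i: {(i + d) % n for d in range(-k, k + 1) if d != 0} for i in range(n)}

def star_topology(n):
    """Every particle sees every other particle (gbest)."""
    return {i: set(range(n)) - {i} for i in range(n)}
```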

1,589 citations


Dissertation
01 Jan 2002
TL;DR: This thesis presents a theoretical model that can be used to describe the long-term behaviour of the Particle Swarm Optimiser and results are presented to support the theoretical properties predicted by the various models, using synthetic benchmark functions to investigate specific properties.
Abstract: Many scientific, engineering and economic problems involve the optimisation of a set of parameters. These problems include examples like minimising the losses in a power grid by finding the optimal configuration of the components, or training a neural network to recognise images of people's faces. Numerous optimisation algorithms have been proposed to solve these problems, with varying degrees of success. The Particle Swarm Optimiser (PSO) is a relatively new technique that has been empirically shown to perform well on many of these optimisation problems. This thesis presents a theoretical model that can be used to describe the long-term behaviour of the algorithm. An enhanced version of the Particle Swarm Optimiser is constructed and shown to have guaranteed convergence on local minima. This algorithm is extended further, resulting in an algorithm with guaranteed convergence on global minima. A model for constructing cooperative PSO algorithms is developed, resulting in the introduction of two new PSO-based algorithms. Empirical results are presented to support the theoretical properties predicted by the various models, using synthetic benchmark functions to investigate specific properties. The various PSO-based algorithms are then applied to the task of training neural networks, corroborating the results obtained on the synthetic benchmark functions.

1,498 citations


Journal ArticleDOI
TL;DR: A Composite PSO, in which the heuristic parameters of PSO are controlled by a Differential Evolution algorithm during the optimization, is described, and results for many well-known and widely used test functions are given.
Abstract: This paper presents an overview of our most recent results concerning the Particle Swarm Optimization (PSO) method. Techniques for the alleviation of local minima, and for detecting multiple minimizers are described. Moreover, results on the ability of the PSO in tackling Multiobjective, Minimax, Integer Programming and ℓ1 errors-in-variables problems, as well as problems in noisy and continuously changing environments, are reported. Finally, a Composite PSO, in which the heuristic parameters of PSO are controlled by a Differential Evolution algorithm during the optimization, is described, and results for many well-known and widely used test functions are given.

1,436 citations


Journal ArticleDOI
TL;DR: In this paper, an evolutionary-based approach to solving the optimal power flow (OPF) problem is presented. The proposed approach has been examined and tested on the standard IEEE 30-bus test system with different objectives reflecting fuel cost minimization, voltage profile improvement, and voltage stability enhancement.

1,209 citations


Journal ArticleDOI
TL;DR: In this paper, a novel evolutionary algorithm-based approach to optimal design of multimachine power system stabilizers (PSSs) is proposed, which employs the particle swarm optimization (PSO) technique to search for optimal settings of PSS parameters.
Abstract: In this paper, a novel evolutionary algorithm-based approach to optimal design of multimachine power system stabilizers (PSSs) is proposed. The proposed approach employs the particle swarm optimization (PSO) technique to search for optimal settings of PSS parameters. Two eigenvalue-based objective functions to enhance system damping of electromechanical modes are considered. The robustness of the proposed approach to the initial guess is demonstrated. The performance of the proposed PSO-based PSS (PSOPSS) under different disturbances, loading conditions, and system configurations is tested and examined for different multimachine power systems. Eigenvalue analysis and nonlinear simulation results show the effectiveness of the proposed PSOPSSs in damping out local as well as interarea modes of oscillation and in working effectively over a wide range of loading conditions and system configurations. In addition, the potential and superiority of the proposed approach over the conventional approaches are demonstrated.

684 citations


Proceedings ArticleDOI
11 Mar 2002
TL;DR: Critical aspects of the VEGA approach for Multiobjective Optimization using Genetic Algorithms are adapted to the PSO framework in order to develop a multi-swarm PSO that can cope effectively with MO problems.
Abstract: This paper constitutes a first study of the Particle Swarm Optimization (PSO) method in Multiobjective Optimization (MO) problems. The ability of PSO to detect Pareto Optimal points and capture the shape of the Pareto Front is studied through experiments on well-known non-trivial test functions. The Weighted Aggregation technique with fixed or adaptive weights is considered. Furthermore, critical aspects of the VEGA approach for Multiobjective Optimization using Genetic Algorithms are adapted to the PSO framework in order to develop a multi-swarm PSO that can cope effectively with MO problems. Conclusions are derived and ideas for further research are proposed.

674 citations


Proceedings ArticleDOI
12 May 2002
TL;DR: This paper presents a particle swarm optimization algorithm modified by using a dynamic neighborhood strategy, new particle memory updating, and one-dimension optimization to deal with multiple objectives for multiobjective optimization problems.
Abstract: This paper presents a particle swarm optimization (PSO) algorithm for multiobjective optimization problems. PSO is modified by using a dynamic neighborhood strategy, new particle memory updating, and one-dimension optimization to deal with multiple objectives. Several benchmark cases were tested and showed that PSO could efficiently find multiple Pareto optimal solutions.

671 citations


Journal ArticleDOI
TL;DR: The effectiveness of the proposed PSO-based algorithm is demonstrated by comparing it with the genetic algorithm, a well-known population-based probabilistic heuristic, on randomly generated task interaction graphs.

Journal ArticleDOI
TL;DR: In attaining the approximate region of the optimum, the implementation suggests that the PSOA is superior to the GA and comparable to gradient-based algorithms.
Abstract: Shape and size optimization problems in structural design are addressed using the particle swarm optimization algorithm (PSOA). In our implementation of the PSOA, the social behaviour of birds is mimicked. Individual birds exchange information about their position, velocity and fitness, and the behaviour of the flock is then influenced to increase the probability of migration to regions of high fitness. New operators in the PSOA, namely the elite velocity and the elite particle, are introduced.

Journal ArticleDOI
TL;DR: In this article, a hybrid particle swarm optimization (HPSO) was proposed for a practical distribution state estimation, which considers nonlinear characteristics of the practical equipment and actual limited measurements in distribution systems.
Abstract: This paper proposes a hybrid particle swarm optimization (HPSO) for a practical distribution state estimation. The proposed method considers nonlinear characteristics of the practical equipment and actual limited measurements in distribution systems. The method can estimate load and distributed generation output values at each node by minimizing the difference between measured and calculated voltages and currents. The feasibility of the proposed method is demonstrated and compared with an original particle swarm optimization-based method on practical distribution system models. Effectiveness of the constriction factor approach of particle swarm optimization is also investigated. The results indicate the applicability of the proposed state estimation method to the practical distribution systems.

01 Jan 2002
TL;DR: Particle Swarm Optimization is shown to be an efficient and general approach for solving most nonlinear optimization problems with nonlinear inequality constraints, with a feasibility-preserving strategy employed to deal with constraints.
Abstract: This paper presents a Particle Swarm Optimization (PSO) algorithm for constrained nonlinear optimization problems. In PSO, the potential solutions, called particles, are "flown" through the problem space by learning from the current optimal particle and from their own memory. In this paper, a feasibility-preserving strategy is employed to deal with constraints. PSO is started with a group of feasible solutions, and a feasibility function is used to check whether newly explored solutions satisfy all the constraints. All particles keep only feasible solutions in their memory. Eleven test cases were tested and showed that PSO is an efficient and general approach for solving most nonlinear optimization problems with nonlinear inequality constraints.
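
A minimal sketch of the feasibility-preserving strategy as described in the abstract: start from feasible points and let personal memories accept only feasible improvements. The rejection-sampling initializer and the g(x) <= 0 constraint convention are assumptions for illustration.

```python
import random

def feasible(x, constraints):
    """All inequality constraints g(x) <= 0 must hold."""
    return all(g(x) <= 0 for g in constraints)

def init_feasible_swarm(n, dim, bounds, constraints, max_tries=100000):
    """Rejection-sample until n feasible starting positions are found."""
    lo, hi = bounds
    swarm = []
    for _ in range(max_tries):
        x = [random.uniform(lo, hi) for _ in range(dim)]
        if feasible(x, constraints):
            swarm.append(x)
            if len(swarm) == n:
                return swarm
    raise RuntimeError("could not sample enough feasible points")

def update_memory(x, fx, pbest, pbest_val, constraints):
    """Personal bests only ever store feasible points, as in the described strategy."""
    if feasible(x, constraints) and fx < pbest_val:
        return x[:], fx
    return pbest, pbest_val
```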

Proceedings ArticleDOI
06 Oct 2002
TL;DR: This paper introduces a new Particle Swarm Optimisation (PSO) algorithm with strong local convergence properties, which performs much better with a smaller number of particles, compared to the original PSO.
Abstract: This paper introduces a new Particle Swarm Optimisation (PSO) algorithm with strong local convergence properties. The new algorithm performs much better with a smaller number of particles, compared to the original PSO. This property is desirable when designing a niching PSO algorithm.

Proceedings ArticleDOI
07 Aug 2002
TL;DR: The foundations and performance of the two algorithms when applied to the design of a profiled corrugated horn antenna are investigated, as is the possibility of hybridizing the two algorithms.
Abstract: Genetic algorithms (GA) have proven to be a useful method of optimization for difficult and discontinuous multidimensional engineering problems. A new method of optimization, particle swarm optimization (PSO), is able to accomplish the same goal as GA optimization in a new and faster way. The purpose of this paper is to investigate the foundations and performance of the two algorithms when applied to the design of a profiled corrugated horn antenna. Also investigated is the possibility of hybridizing the two algorithms.

Proceedings ArticleDOI
12 May 2002
TL;DR: An adaptive PSO is introduced that automatically tracks various changes in a dynamic system; re-randomization is introduced to respond to the dynamic changes.
Abstract: This paper introduces an adaptive PSO, which automatically tracks various changes in a dynamic system. Different environment detection and response techniques are tested on the parabolic and Rosenbrock benchmark functions, and re-randomization is introduced to respond to the dynamic changes. Performance on the benchmark functions with various severities is analyzed.
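
A hedged sketch of one detection-and-response pattern consistent with this description: re-evaluate the remembered best to detect an environment change, then re-randomize part of the swarm and reset outdated memories. The detection tolerance and the re-randomized fraction are illustrative assumptions, not values from the paper.

```python
import random

def detect_change(f, gbest, gbest_val, tol=1e-9):
    """Re-evaluate the remembered best; a changed value signals a changed environment."""
    return abs(f(gbest) - gbest_val) > tol

def respond_to_change(swarm, pbest, pbest_val, f, bounds, fraction=0.5):
    """Re-randomize a fraction of the particles and refresh all memories."""
    lo, hi = bounds
    n, dim = len(swarm), len(swarm[0])
    for i in random.sample(range(n), int(fraction * n)):
        swarm[i] = [random.uniform(lo, hi) for _ in range(dim)]
    for i in range(n):
        pbest[i] = swarm[i][:]          # outdated memories are discarded
        pbest_val[i] = f(swarm[i])
    return swarm, pbest, pbest_val
```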

Journal ArticleDOI
TL;DR: A new optimization algorithm to solve multiobjective design optimization problems based on behavioral concepts similar to those of a real swarm is presented; results indicate that the swarm algorithm is capable of generating an extended Pareto front with significantly fewer function evaluations than the nondominated sorting genetic algorithm (NSGA).
Abstract: This paper presents a new optimization algorithm to solve multiobjective design optimization problems based on behavioral concepts similar to those of a real swarm. The individuals of a swarm update their flying direction through communication with their neighboring leaders with an aim to collectively attain a common goal. The success of the swarm is attributed to three fundamental processes: identification of a set of leaders, selection of a leader for information acquisition, and finally a meaningful information transfer scheme. The proposed algorithm mimics the above behavioral processes of a real swarm. The algorithm employs a multilevel sieve to generate a set of leaders, a probabilistic crowding radius-based strategy for leader selection and a simple generational operator for information transfer. Two test problems, one with a discontinuous Pareto front and the other with a multi-modal Pareto front, are solved to illustrate the capabilities of the algorithm in handling mathematically complex problems. ...

Proceedings ArticleDOI
12 May 2002
TL;DR: Three variants of PSO are compared with the widely used branch and bound technique, on several integer programming test problems and results indicate that PSO handles efficiently such problems, and in most cases it outperforms the branch and Bound technique.
Abstract: The investigation of the performance of the particle swarm optimization (PSO) method in integer programming problems, is the main theme of the present paper. Three variants of PSO are compared with the widely used branch and bound technique, on several integer programming test problems. Results indicate that PSO handles efficiently such problems, and in most cases it outperforms the branch and bound technique.
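
The abstract does not spell out how the real-valued particles are mapped to integers; one straightforward option, shown below as an assumption rather than the paper's exact variants, is to evaluate each particle at the nearest integer lattice point while it flies in continuous space.

```python
def rounded(f):
    """Wrap an integer objective so a continuous PSO can optimize it: the particle
    moves in continuous space but is evaluated at the nearest integer lattice point."""
    return lambda x: f([int(round(t)) for t in x])

# e.g. minimize an integer quadratic with the continuous PSO sketched earlier:
# pso(rounded(lambda z: sum((zi - 3) ** 2 for zi in z)), dim=4, bounds=(-10, 10))
```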


Proceedings ArticleDOI
12 May 2002
TL;DR: This paper introduces spatial extension to particles in the PSO model in order to overcome premature convergence in iterative optimisation and shows that the SEPSO indeed managed to keep diversity in the search space and yielded superior results.
Abstract: In this paper, we introduce spatial extension to particles in the PSO model in order to overcome premature convergence in iterative optimisation. The standard PSO and the new model (SEPSO) are compared w.r.t. performance on well-studied benchmark problems. We show that the SEPSO indeed managed to keep diversity in the search space and yielded superior results.

Proceedings ArticleDOI
06 Oct 2002
TL;DR: The fundamentals of the method are described, and an application to the problem of loss minimization and voltage control is presented, with very good results.
Abstract: This paper presents a new optimization model, EPSO (evolutionary particle swarm optimization), inspired by both evolutionary algorithms and particle swarm optimization algorithms. The fundamentals of the method are described, and an application to the problem of loss minimization and voltage control is presented, with very good results.

Journal ArticleDOI
TL;DR: The Intelligent Particle Swarm Optimization (IPSO) algorithm as mentioned in this paper uses concepts such as group experiences, unpleasant memories (tabu to be avoided), local landscape models based on virtual neighbors, and memetic replication of successful behavior parameters.
Abstract: The paper describes a new stochastic heuristic algorithm for global optimization. The new optimization algorithm, called intelligent-particle swarm optimization (IPSO), offers more intelligence to particles by using concepts such as: group experiences, unpleasant memories (tabu to be avoided), local landscape models based on virtual neighbors, and memetic replication of successful behavior parameters. The new individual complexity is amplified at the group level and consequently generates a more efficient optimization procedure. A simplified version of the IPSO algorithm was implemented and compared with the classical PSO algorithm for a simple test function and for the Loney's solenoid.

Proceedings ArticleDOI
07 Aug 2002
TL;DR: Particle swarm is an optimization paradigm for real-valued functions, based on the social dynamics of group interaction, and its application to the training of neural networks is proposed.
Abstract: Particle swarm is an optimization paradigm for real-valued functions, based on the social dynamics of group interaction. We propose its application to the training of neural networks. Comparative tests were carried out, for classification and regression tasks.

01 Jan 2002
TL;DR: The paper describes a new stochastic heuristic algorithm for global optimization, called intelligent-particle swarm optimization (IPSO), which offers more intelligence to particles by using concepts such as: group experiences, unpleasant memories, local landscape models based on virtual neighbors, and memetic replication of successful behavior parameters.

Proceedings ArticleDOI
12 May 2002
TL;DR: A dissipative particle swarm optimization is developed according to the self-organization of dissipative structures, in which negative entropy is introduced to construct an open dissipative system far from equilibrium, so as to drive the irreversible evolution process toward better fitness.
Abstract: A dissipative particle swarm optimization is developed according to the self-organization of dissipative structures. Negative entropy is introduced to construct an open dissipative system that is far from equilibrium, so as to drive the irreversible evolution process toward better fitness. Testing on two multimodal functions indicates that it improves performance effectively.
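
A speculative sketch of what such a "negative entropy" injection can look like in code: after the normal update, velocity and/or location are chaotically re-randomized with small probabilities so the swarm never fully equilibrates. The probabilities cv and cl and the exact re-initialization rule are assumptions, not taken from the abstract.

```python
import random

def inject_negative_entropy(x, v, bounds, v_max, cv=0.001, cl=0.002):
    """With small probabilities, re-randomize velocity and/or location per dimension,
    keeping the swarm an open, far-from-equilibrium system (cv, cl are illustrative)."""
    lo, hi = bounds
    for d in range(len(x)):
        if random.random() < cv:
            v[d] = random.uniform(-v_max, v_max)
        if random.random() < cl:
            x[d] = random.uniform(lo, hi)
    return x, v
```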

Proceedings ArticleDOI
12 May 2002
TL;DR: A new meta-heuristic (EPSO), built by putting together the best features of evolution strategies (ES) and particle swarm optimization (PSO), is presented, along with applications in opto-electronics and in power systems.
Abstract: This paper presents a new meta-heuristic (EPSO) built by putting together the best features of evolution strategies (ES) and particle swarm optimization (PSO). Examples of the superiority of EPSO over classical PSO are reported. The paper also describes the application of EPSO to real-world problems, including an application in opto-electronics and another in power systems.
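
A rough, speculative sketch of the ES-plus-PSO recipe suggested by this description: replicate each particle, mutate its movement weights ES-style, move the replicas with a PSO-like rule, and keep the best. The mutation operator, replica count, and selection step here are assumptions for illustration only.

```python
import random

def mutate_weights(weights, tau=0.2):
    """ES-style self-adaptation: perturb each movement weight with Gaussian noise
    (the exact mutation scheme is an assumption)."""
    return [w * (1.0 + tau * random.gauss(0.0, 1.0)) for w in weights]

def epso_step(particle, replicas, move, f):
    """Replicate a particle, mutate each replica's weights, move them, keep the best.
    `move` is any PSO-like movement rule taking (position, velocity, weights)."""
    candidates = []
    for _ in range(replicas):
        w = mutate_weights(particle["weights"])
        x = move(particle["x"], particle["v"], w)
        candidates.append({"x": x, "v": particle["v"], "weights": w, "val": f(x)})
    return min(candidates, key=lambda c: c["val"])
```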


Proceedings Article
09 Jul 2002
TL;DR: Two novel particle swarm optimization algorithms are used to track and optimize a 3-dimensional parabolic benchmark function where the optimum location changes randomly and with high severity.
Abstract: Two novel particle swarm optimization (PSO) algorithms are used to track and optimize a 3-dimensional parabolic benchmark function where the optimum location changes randomly and with high severity. The new algorithms are based on an analogy of electrostatic energy with charged particles. For comparison, the same experiment is performed with a conventional PSO algorithm. It is found that the best strategy for this particular problem involves a combination of neutral and charged particles.
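
A hedged sketch of the electrostatic analogy: charged particles add a Coulomb-like repulsion term, accumulated over the other charged particles, to their velocity update, while neutral particles follow the conventional rule. The cutoff radii and scaling below are illustrative assumptions.

```python
import math

def repulsion(x_i, x_j, q_i, q_j, r_core=1.0, r_limit=100.0):
    """Coulomb-like repulsion exerted on particle i by particle j (zero if either
    charge is zero or the particles are far apart); cutoff radii are illustrative."""
    diff = [a - b for a, b in zip(x_i, x_j)]
    r = math.sqrt(sum(d * d for d in diff)) or 1e-12
    if q_i == 0 or q_j == 0 or r >= r_limit:
        return [0.0] * len(x_i)
    r_eff = max(r, r_core)                   # clamp to avoid the singularity at r -> 0
    scale = q_i * q_j / (r_eff ** 2 * r)     # magnitude q_i*q_j/r^2, direction diff/r
    return [scale * d for d in diff]
```

The resulting vector would be summed over all other charged particles and added to the standard velocity update for each charged particle.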

Proceedings ArticleDOI
M. Lovbjerg, T. Krink
12 May 2002
TL;DR: Self-organized criticality (SOC) can help control the PSO and add diversity; extending the PSO with SOC seems promising, yielding faster convergence and better solutions.
Abstract: Particle swarm optimisers (PSOs) show potential in function optimisation, but still have room for improvement. Self-organized criticality (SOC) can help control the PSO and add diversity. Extending the PSO with SOC seems promising, yielding faster convergence and better solutions.