Author

Yann Cooren

Bio: Yann Cooren is an academic researcher from the University of Paris. The author has contributed to research on topics including particle swarm optimization and metaheuristics. The author has an h-index of 8 and has co-authored 14 publications receiving 324 citations.

Papers
Journal ArticleDOI
TL;DR: The practical suitability of PSO for solving both mono-objective and multiobjective discrete optimization problems is shown, as is its aptness for optimizing circuit problems that are difficult in terms of numbers of parameters and constraints.
Abstract: This paper details the Particle Swarm Optimization (PSO) technique for the optimal design of analog circuits. The practical suitability of PSO for solving both mono-objective and multiobjective discrete optimization problems is shown. Two application examples are presented: maximizing the voltage gain of a low noise amplifier for the UMTS standard, and computing the Pareto front of a bi-objective problem, namely maximizing the high current cut-off frequency and minimizing the parasitic input resistance of a second generation current conveyor. The aptness of PSO for optimizing circuit problems that are difficult in terms of numbers of parameters and constraints is shown.
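At the core of such a circuit-sizing loop is the canonical global-best PSO update. The following is a minimal generic sketch in Python; the toy sphere objective and all parameter values are illustrative and are not the circuit models or settings used in the paper.

```python
import random

def pso(objective, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best PSO minimizing `objective` over a box domain."""
    lo, hi = bounds
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]                      # personal best positions
    pbest_val = [objective(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # velocity update: inertia + cognitive + social terms
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = min(max(X[i][d] + V[i][d], lo), hi)
            val = objective(X[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = X[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = X[i][:], val
    return gbest, gbest_val

if __name__ == "__main__":
    sphere = lambda x: sum(v * v for v in x)       # toy objective, not a circuit model
    print(pso(sphere, dim=5, bounds=(-10.0, 10.0)))
```

In a circuit-sizing setting the objective would call a simulator on the candidate device sizes, with constraints handled by penalties or rejection.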

156 citations

Journal ArticleDOI
TL;DR: A global study of the behavior of TRIBES under several conditions is performed in order to determine strengths and drawbacks of this adaptive PSO algorithm.
Abstract: This paper presents a study of the performance of TRIBES, an adaptive particle swarm optimization algorithm. Particle Swarm Optimization (PSO) is a biologically inspired optimization method. Recently, researchers have used it effectively in solving various optimization problems. However, like most optimization heuristics, PSO suffers from the drawback of being greatly influenced by the selection of its parameter values. Thus, the common belief is that the performance of a PSO algorithm is directly related to the tuning of such parameters. Usually, such tuning is a lengthy, time-consuming and delicate process. A new adaptive PSO algorithm called TRIBES avoids manual tuning by defining adaptation rules which aim at automatically changing the particles' behaviors as well as the topology of the swarm. In TRIBES, the topology is changed according to the swarm's behavior, and the displacement strategies are chosen according to the particles' performance. A comparative study carried out on a large set of benchmark functions shows that the performance of TRIBES is competitive with most other similar PSO algorithms that need manual tuning of parameters. The performance evaluation of TRIBES follows the testing procedure introduced during the 2005 IEEE Conference on Evolutionary Computation. The main objective of the present paper is to perform a global study of the behavior of TRIBES under several conditions, in order to determine strengths and drawbacks of this adaptive algorithm.
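The adaptation principle summarized here, each particle choosing its displacement strategy from its recent performance, can be illustrated with the hedged sketch below; the status labels and strategy names are simplified placeholders, not Clerc's exact TRIBES rules.

```python
def choose_strategy(history):
    """Pick a move strategy from the last two improvement outcomes.

    `history` is a list of booleans: True if the particle improved its
    personal best on that iteration. The mapping below is illustrative,
    not the exact rule set of TRIBES.
    """
    recent = history[-2:]
    if len(recent) == 2 and all(recent):
        return "local_search"        # performing well: exploit locally
    if any(recent):
        return "pivot"               # mixed results: combine known good positions
    return "random_relocation"       # stagnating: explore elsewhere

print(choose_strategy([True, True]))    # -> 'local_search'
print(choose_strategy([False, False]))  # -> 'random_relocation'
```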

65 citations

Journal ArticleDOI
TL;DR: Results show that MO-TRIBES is a promising alternative for tackling multiobjective problems without the constraint of parameter fitting.
Abstract: This paper presents MO-TRIBES, an adaptive multiobjective Particle Swarm Optimization (PSO) algorithm. Metaheuristics have the drawback of being very dependent on their parameter values: performance is strongly related to the fitting of those parameters, and such tuning is usually a lengthy, time-consuming and delicate process. The aim of this paper is to present and evaluate MO-TRIBES, an adaptive algorithm designed for multiobjective optimization that avoids the parameter-fitting step. A global description of TRIBES and a comparison with other algorithms are provided. Using an adaptive algorithm means that adaptation rules must be defined: the swarm's structure and the particles' displacement strategies are modified during the process according to the behavior of the tribes. The final solutions are chosen using the Pareto dominance criterion, and rules based on crowding distance have been incorporated in order to maintain diversity along the Pareto front. Preliminary simulations are provided and compared with the best known algorithms. These results show that MO-TRIBES is a promising alternative for tackling multiobjective problems without the constraint of parameter fitting.
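The two selection mechanisms named in the abstract, Pareto dominance and crowding distance, are standard; a minimal sketch, assuming every objective is to be minimized, is given below. It is illustrative and not MO-TRIBES's actual implementation.

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def crowding_distance(front):
    """Crowding distance of each objective vector in a non-dominated front."""
    n = len(front)
    dist = [0.0] * n
    if n == 0:
        return dist
    m = len(front[0])
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        dist[order[0]] = dist[order[-1]] = float("inf")   # keep boundary solutions
        span = front[order[-1]][k] - front[order[0]][k]
        if span == 0:
            continue
        for j in range(1, n - 1):
            dist[order[j]] += (front[order[j + 1]][k] - front[order[j - 1]][k]) / span
    return dist
```

Solutions with larger crowding distance lie in sparser regions of the front and are typically preferred when the archive must be pruned.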

34 citations

Proceedings ArticleDOI
01 Dec 2007
TL;DR: In this paper, the particle swarm optimization metaheuristic was used for optimally sizing CMOS positive second generation current conveyors (CCII+) to improve static and dynamic performances.
Abstract: This brief paper deals with using the particle swarm optimization metaheuristic for optimally sizing CMOS positive second generation current conveyors (CCII+). Both static and dynamic performances are improved. A Pareto front is generated while minimizing the parasitic X-port input resistance RX and maximizing the current high cut-off frequency fci. The translinear implementation in CMOS technology is presented. The boundaries of the generated Pareto front are 400 Ω and 2 GHz for RX and fci, respectively. SPICE simulation results are presented to validate the obtained sizing.
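Extracting the bi-objective front described here (minimize RX, maximize fci) amounts to a non-dominated filter over the simulated design points; the sketch below uses made-up sample values, not results from the paper.

```python
def pareto_front(points):
    """Keep (rx, fci) pairs for which no other point has lower-or-equal rx
    and higher-or-equal fci; rx is minimized, fci is maximized."""
    front = []
    for p in points:
        rx, fci = p
        dominated = any(q[0] <= rx and q[1] >= fci and q != p for q in points)
        if not dominated:
            front.append(p)
    return front

# Illustrative design points (ohms, Hz), not values from the paper.
designs = [(400.0, 2.0e9), (550.0, 2.1e9), (350.0, 1.2e9), (600.0, 1.5e9)]
print(pareto_front(designs))   # the (600 ohm, 1.5 GHz) point is dominated
```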

21 citations

Book ChapterDOI
01 Jan 2008
TL;DR: This chapter presents two ways of improving TRIBES, a parameter-free Particle Swarm Optimization (PSO) algorithm: choosing a new way of initializing the particles and hybridizing it with an Estimation of Distribution Algorithm (EDA).
Abstract: This chapter presents two ways of improving TRIBES, a parameter-free Particle Swarm Optimization (PSO) algorithm. PSO requires the tuning of a set of parameters, and the performance of the algorithm is strongly linked to the values given to this set. However, finding the optimal set of parameters is a very hard and time-consuming problem. Clerc therefore worked out TRIBES, a totally adaptive algorithm that avoids parameter fitting. Experimental results are encouraging but still fall short of those of many other algorithms. The purpose of this chapter is to demonstrate how TRIBES can be improved by choosing a new way of initializing the particles and by hybridizing it with an Estimation of Distribution Algorithm (EDA). These two improvements aim at allowing the algorithm to explore the search space as widely as possible and to avoid premature convergence to a local optimum. The results obtained show that the proposed algorithm performs as well as or better than the other algorithms compared.
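One common form of EDA hybridization, fitting a univariate Gaussian model to the best particles and sampling new positions from it, is sketched below as an illustration; the chapter's exact probabilistic model and injection schedule may differ.

```python
import random

def eda_resample(positions, fitnesses, n_new, elite_frac=0.3):
    """Fit a univariate Gaussian to the elite particles and sample new positions.

    A generic Estimation of Distribution step (lower fitness is better); the
    exact model used in the chapter may differ.
    """
    n_elite = max(2, int(len(positions) * elite_frac))
    ranked = sorted(zip(positions, fitnesses), key=lambda t: t[1])
    elite = [p for p, _ in ranked[:n_elite]]
    dim = len(elite[0])
    # Per-dimension mean and standard deviation of the elite set.
    mus = [sum(e[d] for e in elite) / n_elite for d in range(dim)]
    sigmas = [(sum((e[d] - mus[d]) ** 2 for e in elite) / n_elite) ** 0.5
              for d in range(dim)]
    return [[random.gauss(mus[d], sigmas[d]) for d in range(dim)]
            for _ in range(n_new)]
```

The sampled positions can replace stagnating particles, biasing the swarm toward promising regions while keeping some spread.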

19 citations


Cited by
Journal ArticleDOI
TL;DR: This paper reviews recent studies on the Particle Swarm Optimization (PSO) algorithm and presents some potential areas for future study.
Abstract: This paper reviews recent studies on the Particle Swarm Optimization (PSO) algorithm. The review has been focused on high impact recent articles that have analyzed and/or modified PSO algorithms. This paper also presents some potential areas for future study.

532 citations

Journal ArticleDOI
01 Jun 2012
TL;DR: A novel algorithm, called the self-learning particle swarm optimizer (SLPSO), is presented for global optimization problems; it enables a particle to choose the optimal strategy according to its own local fitness landscape.
Abstract: Particle swarm optimization (PSO) has been shown to be an effective tool for solving global optimization problems. So far, most PSO algorithms use a single learning pattern for all particles, which means that all particles in a swarm use the same strategy. This monotonic learning pattern may leave a particular particle without the flexibility to deal with different complex situations. This paper presents a novel algorithm, called the self-learning particle swarm optimizer (SLPSO), for global optimization problems. In SLPSO, each particle has a set of four strategies to cope with different situations in the search space. The cooperation of the four strategies is implemented by an adaptive learning framework at the individual level, which enables a particle to choose the optimal strategy according to its own local fitness landscape. The experimental study on a set of 45 test functions and two real-world problems shows that SLPSO has superior performance in comparison with several other peer algorithms.
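The adaptive learning framework at the individual level can be pictured as a per-particle roulette wheel whose probabilities track each strategy's recent success; the sketch below mirrors that general mechanism only, not SLPSO's exact update rules.

```python
import random

class StrategySelector:
    """Per-particle roulette-wheel selection over candidate strategies.

    Probabilities are nudged toward strategies that recently improved the
    particle; a generic mechanism in the spirit of SLPSO's adaptive
    framework, not its exact formulation.
    """
    def __init__(self, strategies, learning_rate=0.1):
        self.strategies = strategies
        self.prob = {s: 1.0 / len(strategies) for s in strategies}
        self.lr = learning_rate

    def pick(self):
        r, acc = random.random(), 0.0
        for s, p in self.prob.items():
            acc += p
            if r <= acc:
                return s
        return self.strategies[-1]

    def update(self, strategy, improved):
        # Reward the strategy if it improved the particle, then renormalize.
        self.prob[strategy] += self.lr if improved else -self.lr * self.prob[strategy]
        self.prob[strategy] = max(self.prob[strategy], 0.01)   # keep every strategy alive
        total = sum(self.prob.values())
        for s in self.prob:
            self.prob[s] /= total

sel = StrategySelector(["exploit", "converge", "explore", "jump"])  # placeholder names
s = sel.pick()
sel.update(s, improved=True)
```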

348 citations

Journal ArticleDOI
TL;DR: In this algorithm, a reinforced memory strategy is designed to update the local leaders of particles in order to avoid the degradation of outstanding genes in the particles, and a uniform combination is proposed to balance the local exploitation and the global exploration of the algorithm.

207 citations

Journal ArticleDOI
TL;DR: A novel method, named parallel cell coordinate system (PCCS), is proposed to assess the evolutionary environment including density, rank, and diversity indicators based on the measurements of parallel cell distance, potential, and distribution entropy, respectively.
Abstract: Managing convergence and diversity is essential in the design of multiobjective particle swarm optimization (MOPSO) in search of an accurate and well distributed approximation of the true Pareto-optimal front. Largely due to its fast convergence, particle swarm optimization incurs a rapid loss of diversity during the evolutionary process. Many mechanisms have been proposed in existing MOPSOs in terms of leader selection, archive maintenance, and perturbation to tackle this deficiency. However, few MOPSOs are designed to dynamically adjust the balance in exploration and exploitation according to the feedback information detected from the evolutionary environment. In this paper, a novel method, named parallel cell coordinate system (PCCS), is proposed to assess the evolutionary environment including density, rank, and diversity indicators based on the measurements of parallel cell distance, potential, and distribution entropy, respectively. Based on PCCS, strategies proposed for selecting global best and personal best, maintaining archive, adjusting flight parameters, and perturbing stagnation are integrated into a self-adaptive MOPSO (pccsAMOPSO). The comparative experimental results show that the proposed pccsAMOPSO outperforms the other eight state-of-the-art competitors on ZDT and DTLZ test suites in terms of the chosen performance metrics. An additional experiment for density estimation in MOPSO illustrates that the performance of PCCS is superior to that of adaptive grid and crowding distance in terms of convergence and diversity.
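The parallel cell coordinates themselves are obtained by normalizing each objective over the archive's range and discretizing it into K integer cells; the sketch below shows that mapping only and omits the density, rank, and entropy indicators built on top of it. The formula is a reconstruction from the description above, not taken verbatim from the paper.

```python
import math

def parallel_cell_coordinates(objectives, k_cells):
    """Map each objective vector in an archive to integer cell coordinates.

    Each objective is normalized over the archive's range and discretized
    into `k_cells` cells in [1, k_cells]; a sketch of the PCCS mapping only.
    """
    m = len(objectives[0])
    mins = [min(o[j] for o in objectives) for j in range(m)]
    maxs = [max(o[j] for o in objectives) for j in range(m)]
    coords = []
    for o in objectives:
        row = []
        for j in range(m):
            span = maxs[j] - mins[j]
            if span == 0:
                row.append(1)                     # degenerate objective: single cell
            else:
                c = math.ceil(k_cells * (o[j] - mins[j]) / span)
                row.append(min(max(c, 1), k_cells))
        coords.append(row)
    return coords

print(parallel_cell_coordinates([(0.1, 3.0), (0.4, 1.0), (0.9, 2.0)], k_cells=5))
```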

200 citations
