Optimization of Analog RF Circuit parameters using randomness in particle swarm optimization
01 Dec 2011, pp. 274-278
Citations
TL;DR: An adaptive particle swarm optimization algorithm based on a directed weighted complex network (DWCNPSO) is proposed that effectively avoids premature convergence and converges faster than comparable algorithms.
Abstract: The disadvantages of the particle swarm optimization (PSO) algorithm are that it easily falls into local optima in high-dimensional spaces and converges slowly during the iterative process. To deal with these problems, an adaptive particle swarm optimization algorithm based on a directed weighted complex network (DWCNPSO) is proposed. Particles are scattered uniformly over the search space by using the topology of a small-world network to initialize the particle positions. At the same time, an evolutionary mechanism of the directed dynamic network is employed to make the particles evolve into a scale-free network whose in-degree obeys a power-law distribution. The proposed method not only improves the diversity of the algorithm but also prevents particles from falling into local optima. The simulation results indicate that the proposed algorithm can effectively avoid the premature convergence problem. Compared with other algorithms, its convergence rate is faster.
46 citations
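For orientation, below is a minimal sketch of how a network topology can drive the social term of a PSO update, in the spirit of the DWCNPSO idea above. It is not the DWCNPSO algorithm itself (which builds and adapts the directed weighted network during the run); the topology is simply supplied as an adjacency list, and all names and parameter values are illustrative.

```python
# Illustrative sketch only: a PSO step whose social attractor is the best
# personal-best among each particle's in-neighbours in a directed graph.
# DWCNPSO adapts this network during the run; here it is given as input.
import numpy as np

def neighbourhood_best(pbest, pbest_fit, in_neighbours):
    """Best-known position among each particle's in-neighbours (and itself)."""
    best = np.empty_like(pbest)
    for i, nbrs in enumerate(in_neighbours):
        candidates = list(nbrs) + [i]
        j = min(candidates, key=lambda k: pbest_fit[k])  # minimisation assumed
        best[i] = pbest[j]
    return best

def pso_step(x, v, pbest, pbest_fit, in_neighbours, w=0.7, c1=1.5, c2=1.5):
    """One velocity/position update using the network-defined social term."""
    social = neighbourhood_best(pbest, pbest_fit, in_neighbours)
    r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (social - x)
    return x + v, v
```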
TL;DR: A general form of PSO algorithms is considered and the asymptotic properties of the algorithms are analyzed using stochastic approximation methods, proving that a suitably scaled sequence of swarms converges to the solution of an ordinary differential equation.
Abstract: Recently, much progress has been made on particle swarm optimization (PSO). A number of works have been devoted to analyzing the convergence of the underlying algorithms. Nevertheless, in most cases rather simplified hypotheses are used. For example, it is often assumed that the swarm has only one particle. In addition, more often than not, the variables and the points of attraction are assumed to remain constant throughout the optimization process. In reality, such assumptions are often violated. Moreover, to the best of our knowledge, no rigorous convergence-rate results are available to date for the particle swarm. In this paper, we consider a general form of PSO algorithms and analyze the asymptotic properties of the algorithms using stochastic approximation methods. We introduce four coefficients and rewrite the PSO procedure as a stochastic-approximation-type iterative algorithm. Then we analyze its convergence using the weak convergence method. It is proved that a suitably scaled sequence of swarms converges to the solution of an ordinary differential equation. We also establish certain stability results. Moreover, convergence rates are ascertained by using the weak convergence method. A centered and scaled sequence of the estimation errors is shown to have a diffusion limit.
39 citations
Cites background from "Optimization of Analog RF Circuit p..."
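As a rough orientation for the stochastic-approximation viewpoint above, the generic template of the ODE method is sketched below; this is the standard setup, not the paper's exact four-coefficient formulation.

```latex
% Generic stochastic-approximation template (ODE method); z_n collects the
% swarm's positions and velocities, eps is a small step size, and xi_n is the
% zero-mean noise contributed by the random coefficients.
\[
  z_{n+1} = z_n + \varepsilon \bigl( \bar{b}(z_n) + \xi_n \bigr)
\]
% The weak-convergence argument shows that the interpolated process
% z^{eps}(t) = z_{\lfloor t/eps \rfloor} converges, as eps -> 0, to the
% solution of the ordinary differential equation
\[
  \dot{z} = \bar{b}(z)
\]
```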
TL;DR: It is shown that the yield for the rest of the population can be estimated from the FCM membership degrees and the RIs' yield values alone; the new method was applied to two real circuit-sizing optimization problems and the results were compared to the exhaustive approach.
Abstract: This paper presents fuzzy c-means-based yield estimation (FUZYE), a methodology that reduces the time impact caused by Monte Carlo (MC) simulations in the context of analog integrated circuit (IC) yield estimation, enabling it for yield optimization with population-based algorithms, e.g., the genetic algorithm (GA). MC analysis is the most general and reliable technique for yield estimation, yet the considerable amount of time it requires has discouraged its adoption in population-based optimization tools. The proposed methodology reduces the total number of MC simulations required, since, at each GA generation, the population is clustered using a fuzzy c-means (FCM) technique and only the representative individual (RI) from each cluster is subject to MC simulations. This paper shows that the yield for the rest of the population can be estimated from the FCM membership degrees and the RIs' yield values alone. The new method was applied to two real circuit-sizing optimization problems and the obtained results were compared to the exhaustive approach, where all individuals of the population are subject to MC analysis. The FCM approach presents a reduction of 89% in the total number of MC simulations when compared to the exhaustive MC analysis over the full population. Moreover, a k-means-based clustering algorithm was also tested and compared with the proposed FUZYE, with the latter showing an improvement of up to 13% in yield estimation accuracy.
14 citations
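A minimal sketch of the membership-weighted estimation idea described in the abstract is shown below; the exact FUZYE weighting is defined in the paper, so the normalization and the names used here are assumptions.

```python
# Illustrative sketch (not the exact FUZYE formulation): only one
# representative individual (RI) per fuzzy cluster is sent to Monte Carlo;
# U[i, k] is the membership degree of individual i in cluster k, and
# ri_yield[k] is the MC-estimated yield of cluster k's RI.
import numpy as np

def estimate_population_yield(U, ri_yield):
    """Membership-weighted combination of the representative individuals' yields."""
    U = np.asarray(U, dtype=float)
    ri_yield = np.asarray(ri_yield, dtype=float)
    weights = U / U.sum(axis=1, keepdims=True)   # normalise memberships per individual
    return weights @ ri_yield                    # shape: (n_individuals,)

# Example: 3 individuals, 2 clusters whose RIs achieved 92% and 78% yield.
U = [[0.9, 0.1], [0.2, 0.8], [0.5, 0.5]]
print(estimate_population_yield(U, [0.92, 0.78]))   # ≈ [0.906, 0.808, 0.85]
```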
Book
20 Feb 2015
TL;DR: This work addresses the research and development (R&D) of an innovative optimization kernel applied to analog integrated circuit (IC) design by enhancing AIDA-C with a new multi-objective multi-constraint optimization kernel.
Abstract: This work addresses the research and development (R&D) of an innovative optimization kernel applied to analog integrated circuit (IC) design. In particular, the focus of this work is AIDA-CMK, which enhances AIDA-C with a new multi-objective, multi-constraint optimization kernel. AIDA-C is the circuit optimizer component of AIDA, an electronic design automation framework fully developed in-house. The proposed solution implements three approaches to multi-objective, multi-constraint optimization, namely, an evolutionary approach with NSGA-II, a swarm intelligence approach with MOPSO and a stochastic hill-climbing approach with MOSA. Moreover, the implemented kernels allow easy hybridization among them, transforming a simple NSGA-II optimization kernel into a more evolved and versatile multi-algorithm hybrid kernel. The three multi-objective optimization approaches were validated on the CEC2009 constrained multi-objective benchmarks and tested on real analog IC design problems. The achieved results were compared in terms of performance, using statistics obtained from multiple independent runs. Finally, some hybrid approaches were also experimented with, giving a foretaste of the wide range of opportunities to explore in future work.
7 citations
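A hypothetical sketch of what an interchangeable multi-objective kernel interface of this kind might look like is given below; AIDA-CMK's actual classes and method names are not given in the abstract, so everything here is illustrative only.

```python
# Hypothetical interface sketch; all names below are assumptions, not AIDA's API.
from abc import ABC, abstractmethod

class OptimizationKernel(ABC):
    """Common contract so NSGA-II-, MOPSO- and MOSA-style kernels are swappable."""

    @abstractmethod
    def ask(self):
        """Return the next batch of candidate circuit sizings to evaluate."""

    @abstractmethod
    def tell(self, candidates, objectives, constraints):
        """Feed simulator results back so the kernel can update its state."""

def optimize(kernel: OptimizationKernel, evaluate, generations: int):
    """Generic loop that works with any kernel implementing the contract."""
    for _ in range(generations):
        candidates = kernel.ask()
        objectives, constraints = evaluate(candidates)
        kernel.tell(candidates, objectives, constraints)
```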
TL;DR: An innovative combination of principal component analysis (PCA) and evolutionary computation is used to increase the optimizer's efficiency, reaching wider solution sets and, in some cases, solution sets that are almost 3 times better in terms of hypervolume.
Abstract: State-of-the-art design of analog and radio frequency integrated circuits is often accomplished using sizing optimization. In this paper, an innovative combination of principal component analysis (PCA) and evolutionary computation is used to increase the optimizer's efficiency. The adopted NSGA-II optimization kernel is improved by applying the genetic operators of mutation and crossover in a transformed design space, obtained from the latest set of solutions (the parents) using PCA. By applying crossover and mutation to variables that are projections onto the principal components, the optimization moves more effectively, finding, in the same amount of time, solutions with better performance than the standard NSGA-II optimization kernel. The proposed method was validated in the optimization of two widely used analog circuits, an amplifier and a voltage controlled oscillator, reaching wider solution sets and, in some cases, solution sets that are almost 3 times better in terms of hypervolume.
4 citations
Cites methods from "Optimization of Analog RF Circuit p..."
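A minimal sketch of the PCA-space variation idea follows, assuming scikit-learn's PCA is available; the crossover and mutation shown are simplified placeholders, not the paper's exact operators.

```python
# Sketch: rotate the parents onto principal components, apply variation there,
# then map the offspring back to the original design variables.
import numpy as np
from sklearn.decomposition import PCA

def pca_variation(parents, mutation_scale=0.1, rng=None):
    """Apply crossover and mutation in the PCA-rotated design space."""
    rng = rng or np.random.default_rng()
    pca = PCA()                                   # rotation onto principal components
    scores = pca.fit_transform(np.asarray(parents, dtype=float))

    # Uniform crossover between randomly paired parents, in PCA coordinates.
    partners = rng.permutation(len(scores))
    mask = rng.random(scores.shape) < 0.5
    children = np.where(mask, scores, scores[partners])

    # Gaussian mutation, scaled per component by the parents' own spread.
    children += rng.normal(0.0, mutation_scale, children.shape) * np.sqrt(pca.explained_variance_)

    return pca.inverse_transform(children)        # back to the original variables
```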
References
06 Aug 2002
TL;DR: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced, the evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed.
Abstract: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced. The evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed. Benchmark testing of the paradigm is described, and applications, including nonlinear function optimization and neural network training, are proposed. The relationships between particle swarm optimization and both artificial life and genetic algorithms are described.
32,237 citations
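For reference, a minimal global-best PSO loop is sketched below; the inertia weight and the parameter values shown are later, commonly used defaults rather than part of the original formulation.

```python
# Minimal global-best PSO sketch; parameter values are illustrative defaults.
import numpy as np

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, (n_particles, lo.size))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Example: minimise the sphere function in 5 dimensions.
best_x, best_f = pso(lambda z: np.sum(z**2), ([-5.0]*5, [5.0]*5))
```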
TL;DR: This paper analyzes a particle's trajectory as it moves in discrete time, then progresses to the view of it in continuous time, leading to a generalized model of the algorithm, containing a set of coefficients to control the system's convergence tendencies.
Abstract: The particle swarm is an algorithm for finding optimal regions of complex search spaces through the interaction of individuals in a population of particles. This paper analyzes a particle's trajectory as it moves in discrete time (the algebraic view), then progresses to the view of it in continuous time (the analytical view). A five-dimensional depiction is developed, which describes the system completely. These analyses lead to a generalized model of the algorithm, containing a set of coefficients to control the system's convergence tendencies. Some results of the particle swarm optimizer, implementing modifications derived from the analysis, suggest methods for altering the original algorithm in ways that eliminate problems and increase the ability of the particle swarm to find optima of some well-studied test functions.
7,683 citations
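The constriction-coefficient update usually associated with this analysis can be written as follows (standard values shown; the paper derives a broader family of models):

```latex
\[
  v_{t+1} = \chi\Bigl(v_t + \varphi_1 r_1 (p - x_t) + \varphi_2 r_2 (g - x_t)\Bigr),
  \qquad
  \chi = \frac{2\kappa}{\bigl|\,2 - \varphi - \sqrt{\varphi^2 - 4\varphi}\,\bigr|},
  \quad \varphi = \varphi_1 + \varphi_2 > 4 .
\]
% With the common choice kappa = 1 and phi = 4.1 this gives chi ~ 0.7298.
```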
TL;DR: It is concluded that the best approach is to use the constriction factor while limiting the maximum velocity Vmax to the dynamic range of the variable Xmax on each dimension.
Abstract: The performance of particle swarm optimization using an inertia weight is compared with performance using a constriction factor. Five benchmark functions are used for the comparison. It is concluded that the best approach is to use the constriction factor while limiting the maximum velocity Vmax to the dynamic range of the variable Xmax on each dimension. This approach provides performance on the benchmark functions superior to any other published results known by the authors.
2,741 citations
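A small sketch of the recommended combination follows, with the velocity clamp tied to each variable's range (one common reading of the Vmax-equals-dynamic-range recommendation); the parameter values are the usual constriction defaults.

```python
# Constriction-factor update plus per-dimension velocity clamping.
import numpy as np

def constricted_step(x, v, pbest, gbest, xmin, xmax,
                     chi=0.7298, phi1=2.05, phi2=2.05):
    r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
    v = chi * (v + phi1 * r1 * (pbest - x) + phi2 * r2 * (gbest - x))
    vmax = xmax - xmin                  # Vmax set to each variable's dynamic range
    v = np.clip(v, -vmax, vmax)
    return x + v, v
```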
TL;DR: The particle swarm optimization algorithm is analyzed using standard results from dynamic system theory, and graphical parameter selection guidelines are derived that yield performance on benchmark functions superior to previously published results.
Abstract: The particle swarm optimization algorithm is analyzed using standard results from the dynamic system theory. Graphical parameter selection guidelines are derived. The exploration-exploitation tradeoff is discussed and illustrated. Examples of performance on benchmark functions superior to previously published results are given.
2,399 citations
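The kind of reduced deterministic model that underlies such graphical guidelines can be written as follows (random coefficients replaced by fixed values a and b; this is a common simplification, not necessarily the paper's exact notation):

```latex
% p denotes the attractor combining personal and global bests.
\[
  x_{t+1} = (1 + a - b)\,x_t - a\,x_{t-1} + b\,p
\]
% Both roots of the characteristic polynomial lambda^2 - (1 + a - b) lambda + a
% lie inside the unit circle, so the trajectory settles on p, exactly when
\[
  -1 < a < 1, \qquad 0 < b < 2\,(1 + a).
\]
```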
TL;DR: This letter presents a formal stochastic convergence analysis of the standard particle swarm optimization (PSO) algorithm, which involves randomness.
Abstract: This letter presents a formal stochastic convergence analysis of the standard particle swarm optimization (PSO) algorithm, which involves randomness. By regarding each particle's position on each evolutionary step as a stochastic vector, the standard PSO algorithm determined by the non-negative real parameter tuple {ω, c1, c2} is analyzed using stochastic process theory. The stochastic convergence condition of the particle swarm system and the corresponding parameter selection guidelines are derived.
393 citations
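To fix notation for the tuple {ω, c1, c2}, the analyzed update and the widely quoted first-order (expected-value) stability region are sketched below; the letter's own stochastic conditions are derived in the paper and may be stricter.

```latex
% Standard update with positions treated as stochastic vectors
% (r_1, r_2 uniform on [0,1]).
\[
  v_{t+1} = \omega v_t + c_1 r_1 (p - x_t) + c_2 r_2 (g - x_t),
  \qquad
  x_{t+1} = x_t + v_{t+1}
\]
% Widely quoted first-order (expected-value) stability region:
\[
  -1 < \omega < 1, \qquad 0 < c_1 + c_2 < 4\,(1 + \omega).
\]
```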