Topic

Genetic algorithm

About: Genetic algorithm is a research topic. Over the lifetime, 67,538 publications have been published within this topic, receiving 1,232,117 citations. The topic is also known as: optimize problem & GA.


Papers
Journal ArticleDOI
TL;DR: This paper suggests a non-dominated sorting-based MOEA, called NSGA-II (Non-dominated Sorting Genetic Algorithm II), which alleviates all of the above three difficulties, and modifies the definition of dominance in order to solve constrained multi-objective problems efficiently.
Abstract: Multi-objective evolutionary algorithms (MOEAs) that use non-dominated sorting and sharing have been criticized mainly for: (1) their O(MN^3) computational complexity (where M is the number of objectives and N is the population size); (2) their non-elitism approach; and (3) the need to specify a sharing parameter. In this paper, we suggest a non-dominated sorting-based MOEA, called NSGA-II (Non-dominated Sorting Genetic Algorithm II), which alleviates all of the above three difficulties. Specifically, a fast non-dominated sorting approach with O(MN^2) computational complexity is presented. Also, a selection operator is presented that creates a mating pool by combining the parent and offspring populations and selecting the best N solutions (with respect to fitness and spread). Simulation results on difficult test problems show that NSGA-II is able, for most problems, to find a much better spread of solutions and better convergence near the true Pareto-optimal front compared to the Pareto-archived evolution strategy and the strength-Pareto evolutionary algorithm - two other elitist MOEAs that pay special attention to creating a diverse Pareto-optimal front. Moreover, we modify the definition of dominance in order to solve constrained multi-objective problems efficiently. Simulation results of the constrained NSGA-II on a number of test problems, including a five-objective, seven-constraint nonlinear problem, are compared with another constrained multi-objective optimizer, and the much better performance of NSGA-II is observed.

37,111 citations
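The fast non-dominated sorting step that the abstract credits with the O(MN^2) complexity can be sketched as follows. This is an illustrative Python sketch under a minimization convention, not the authors' reference implementation; the `dominates` helper is an assumption about how objective vectors are compared.

```python
def dominates(a, b):
    """Assumed Pareto dominance for minimization: a is no worse than b in every
    objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_non_dominated_sort(objectives):
    """Partition solutions (given as objective vectors) into successive Pareto fronts."""
    n = len(objectives)
    dominated_by = [[] for _ in range(n)]   # indices that solution i dominates
    domination_count = [0] * n              # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(objectives[i], objectives[j]):
                dominated_by[i].append(j)
            elif dominates(objectives[j], objectives[i]):
                domination_count[i] += 1
        if domination_count[i] == 0:
            fronts[0].append(i)             # nobody dominates i: first front
    k = 0
    while fronts[k]:
        next_front = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                domination_count[j] -= 1
                if domination_count[j] == 0:
                    next_front.append(j)
        fronts.append(next_front)
        k += 1
    return fronts[:-1]                      # drop the trailing empty front
```

The pairwise comparison loop over N solutions and M objectives is where the quoted O(MN^2) bound comes from.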

Journal ArticleDOI
TL;DR: Experimental results have demonstrated that MOEA/D with simple decomposition methods outperforms or performs similarly to MOGLS and NSGA-II on multiobjective 0-1 knapsack problems and continuous multiobjective optimization problems.
Abstract: Decomposition is a basic strategy in traditional multiobjective optimization. However, it has not yet been widely used in multiobjective evolutionary optimization. This paper proposes a multiobjective evolutionary algorithm based on decomposition (MOEA/D). It decomposes a multiobjective optimization problem into a number of scalar optimization subproblems and optimizes them simultaneously. Each subproblem is optimized by only using information from its several neighboring subproblems, which makes MOEA/D have lower computational complexity at each generation than MOGLS and nondominated sorting genetic algorithm II (NSGA-II). Experimental results have demonstrated that MOEA/D with simple decomposition methods outperforms or performs similarly to MOGLS and NSGA-II on multiobjective 0-1 knapsack problems and continuous multiobjective optimization problems. It has been shown that MOEA/D using objective normalization can deal with disparately-scaled objectives, and MOEA/D with an advanced decomposition method can generate a set of very evenly distributed solutions for 3-objective test instances. The ability of MOEA/D to work with a small population, as well as its scalability and sensitivity, have also been experimentally investigated in this paper.

6,657 citations
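The decomposition idea from the MOEA/D abstract can be illustrated with the Tchebycheff scalarization, one of the decomposition approaches discussed in that line of work. The routine below is a simplified sketch of the neighborhood replacement step, assuming precomputed weight vectors, a neighbor list per subproblem, and an ideal reference point z*; it is not the paper's exact procedure.

```python
def tchebycheff(objective_values, weights, z_star):
    """Tchebycheff scalarization g(x | w, z*) = max_i w_i * |f_i(x) - z*_i|.
    Each weight vector turns the multiobjective problem into one scalar subproblem."""
    return max(w * abs(f - z) for w, f, z in zip(weights, objective_values, z_star))

def moead_neighborhood_update(pop, objs, weights, neighbors, k, child, child_obj, z_star):
    """Sketch of the update for subproblem k after producing a child solution:
    refresh the ideal point, then let the child replace any neighboring
    subproblem's incumbent that it scalarizes better than."""
    for i, v in enumerate(child_obj):          # update the ideal point z*
        z_star[i] = min(z_star[i], v)
    for j in neighbors[k]:                     # only neighboring subproblems are touched
        if tchebycheff(child_obj, weights[j], z_star) <= tchebycheff(objs[j], weights[j], z_star):
            pop[j], objs[j] = child, child_obj
```

Because each child is compared only against the incumbents of a few neighboring subproblems, the per-generation cost stays low, which is the source of the complexity advantage over MOGLS and NSGA-II mentioned in the abstract.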

Book ChapterDOI
18 Sep 2000
TL;DR: Simulation results on five difficult test problems show that the proposed NSGA-II, in most problems, is able to find a much better spread of solutions and better convergence near the true Pareto-optimal front compared to PAES and SPEA--two other elitist multi-objective EAs which pay special attention towards creating a diverse Pareto-optimal front.
Abstract: Multi-objective evolutionary algorithms which use non-dominated sorting and sharing have been mainly criticized for their (i) O(MN^3) computational complexity (where M is the number of objectives and N is the population size), (ii) non-elitism approach, and (iii) the need for specifying a sharing parameter. In this paper, we suggest a non-dominated sorting based multi-objective evolutionary algorithm (we called it the Non-dominated Sorting GA-II or NSGA-II) which alleviates all the above three difficulties. Specifically, a fast non-dominated sorting approach with O(MN^2) computational complexity is presented. Second, a selection operator is presented which creates a mating pool by combining the parent and child populations and selecting the best (with respect to fitness and spread) N solutions. Simulation results on five difficult test problems show that the proposed NSGA-II, in most problems, is able to find a much better spread of solutions and better convergence near the true Pareto-optimal front compared to PAES and SPEA--two other elitist multi-objective EAs which pay special attention towards creating a diverse Pareto-optimal front. Because of NSGA-II's low computational requirements, elitist approach, and parameter-less sharing approach, NSGA-II should find increasing applications in the years to come.

4,878 citations
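The elitist selection operator described in both NSGA-II abstracts (merge the parent and child populations, then keep the best N with respect to fitness and spread) might look roughly like the sketch below. The crowding-distance function is the spread measure commonly associated with NSGA-II and is an assumption here, as is the convention that `fronts` comes from a non-dominated sort such as the one sketched earlier.

```python
def crowding_distance(front_objs):
    """Assumed spread measure: per-objective normalized gap between a solution's
    nearest neighbors along each objective; boundary solutions get infinity."""
    n, m = len(front_objs), len(front_objs[0])
    dist = [0.0] * n
    for obj in range(m):
        order = sorted(range(n), key=lambda i: front_objs[i][obj])
        lo, hi = front_objs[order[0]][obj], front_objs[order[-1]][obj]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi == lo:
            continue
        for pos in range(1, n - 1):
            gap = front_objs[order[pos + 1]][obj] - front_objs[order[pos - 1]][obj]
            dist[order[pos]] += gap / (hi - lo)
    return dist

def select_next_population(objs, fronts, n):
    """Fill the next population front by front from the merged parent+child pool;
    the last partially fitting front is truncated by descending crowding distance."""
    survivors = []
    for front in fronts:
        if len(survivors) + len(front) <= n:
            survivors.extend(front)
        else:
            dist = crowding_distance([objs[i] for i in front])
            order = sorted(range(len(front)), key=lambda p: dist[p], reverse=True)
            survivors.extend(front[p] for p in order[: n - len(survivors)])
            break
    return survivors  # indices into the merged population
```

Keeping the combined parent and child populations in the pool is what makes the scheme elitist: a good parent can only be displaced by a solution that ranks at least as well.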

Proceedings ArticleDOI
05 Jul 1995
TL;DR: Culling is near optimal for this problem, highly noise tolerant, and the best known approach in some regimes; new large deviation bounds on a submartingale view of the algorithm enable the authors to determine its running time.
Abstract: We analyze the performance of a Genetic Type Algorithm we call Culling and a variety of other algorithms on a problem we refer to as ASP. Culling is near optimal for this problem, highly noise tolerant, and the best known approach in some regimes. We show that the problem of learning the Ising perceptron is reducible to noisy ASP. These results provide an example of a rigorous analysis of GA's and give insight into when and how GA's can beat competing methods. To analyze the genetic algorithm, we view it as a special type of submartingale. We prove some new large deviation bounds on this submartingale which enable us to determine the running time of the algorithm.

4,520 citations
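The abstract gives few algorithmic details, so the following is only a generic reading of the culling idea: repeatedly generate a batch of candidates and retain just the highest-scoring fraction, discarding the rest. The batch size, retention count, mutation operator, and the one-max fitness in the toy run are all placeholders, not the paper's ASP setup.

```python
import random

def cull_step(population, fitness, mutate, batch_size, keep):
    """One generation of a culling-style search: produce a batch of candidates
    from the current population and keep only the `keep` best (cull the rest)."""
    candidates = [mutate(random.choice(population)) for _ in range(batch_size)]
    candidates.sort(key=fitness, reverse=True)
    return candidates[:keep]

def flip_one_bit(bits):
    """Placeholder mutation: flip a single randomly chosen bit."""
    i = random.randrange(len(bits))
    return bits[:i] + (1 - bits[i],) + bits[i + 1:]

# Toy run on a placeholder one-max fitness (not the paper's ASP problem).
pop = [tuple(random.randint(0, 1) for _ in range(20)) for _ in range(30)]
for _ in range(50):
    pop = cull_step(pop, fitness=sum, mutate=flip_one_bit, batch_size=120, keep=30)
print(max(sum(x) for x in pop))
```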


Network Information
Related Topics (5)
Artificial neural network: 207K papers, 4.5M citations, 92% related
Fuzzy logic: 151.2K papers, 2.3M citations, 92% related
Optimization problem: 96.4K papers, 2.1M citations, 91% related
Cluster analysis: 146.5K papers, 2.9M citations, 88% related
Feature extraction: 111.8K papers, 2.1M citations, 87% related
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2024    1
2023    1,703
2022    4,251
2021    2,801
2020    2,986
2019    3,103