Topic

Learnable Evolution Model

About: Learnable Evolution Model is a research topic. Over its lifetime, 146 publications have been published within this topic, receiving 21,803 citations.


Papers
Book
01 Jan 1992
TL;DR: A book-length treatment of genetic algorithms and evolution programs, covering how and why GAs work, evolution strategies, constraint handling, discrete problems such as transportation and the traveling salesman problem, connections to machine learning, and a hierarchy of evolution programs and heuristics.
Abstract: Contents: 1. GAs: What Are They? 2. GAs: How Do They Work? 3. GAs: Why Do They Work? 4. GAs: Selected Topics. 5. Binary or Float? 6. Fine Local Tuning. 7. Handling Constraints. 8. Evolution Strategies and Other Methods. 9. The Transportation Problem. 10. The Traveling Salesman Problem. 11. Evolution Programs for Various Discrete Problems. 12. Machine Learning. 13. Evolutionary Programming and Genetic Programming. 14. A Hierarchy of Evolution Programs. 15. Evolution Programs and Heuristics. 16. Conclusions. Appendices A-D. References.

12,212 citations
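As a companion to this entry, a minimal genetic-algorithm loop of the kind the book develops (selection, crossover, mutation over a population) might look like the Python sketch below; the OneMax objective and all parameter values are illustrative assumptions, not taken from the book.

```python
import random

# Minimal genetic-algorithm sketch (parameters are illustrative assumptions).
# Maximizes the number of 1-bits in a bitstring ("OneMax" toy objective).
POP_SIZE, GENOME_LEN, GENERATIONS = 50, 20, 100
CROSSOVER_RATE, MUTATION_RATE = 0.9, 0.01

def fitness(genome):
    return sum(genome)

def tournament(pop, k=3):
    # Pick the fittest of k randomly drawn individuals.
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    # One-point crossover, applied with probability CROSSOVER_RATE.
    if random.random() < CROSSOVER_RATE:
        point = random.randrange(1, GENOME_LEN)
        return a[:point] + b[point:]
    return a[:]

def mutate(genome):
    # Flip each bit independently with probability MUTATION_RATE.
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in genome]

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
       for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP_SIZE)]
best = max(pop, key=fitness)
print(f"best fitness: {fitness(best)} / {GENOME_LEN}")
```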

Journal ArticleDOI
TL;DR: In this paper, a method is described for converting a weak learning algorithm into one that achieves arbitrarily high accuracy, showing that weak and strong learnability are equivalent.
Abstract: This paper addresses the problem of improving the accuracy of a hypothesis output by a learning algorithm in the distribution-free (PAC) learning model. A concept class is learnable (or strongly learnable) if, given access to a source of examples of the unknown concept, the learner with high probability is able to output a hypothesis that is correct on all but an arbitrarily small fraction of the instances. The concept class is weakly learnable if the learner can produce a hypothesis that performs only slightly better than random guessing. In this paper, it is shown that these two notions of learnability are equivalent. A method is described for converting a weak learning algorithm into one that achieves arbitrarily high accuracy. This construction may have practical applications as a tool for efficiently converting a mediocre learning algorithm into one that performs extremely well. In addition, the construction has some interesting theoretical consequences, including a set of general upper bounds on the complexity of any strong learning algorithm as a function of the allowed error ε.

3,678 citations
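The weak-to-strong conversion this paper established is the theoretical root of boosting. The sketch below illustrates the idea with AdaBoost-style reweighting over threshold stumps, a later and widely used realization rather than the paper's original recursive-majority construction; the 1-D data, weak learner, and round count are illustrative assumptions.

```python
import math
import random

# Boosting sketch: combine many weak hypotheses into a strong one.
# Weak learner: the best threshold "stump" on 1-D data.

def weak_learner(xs, ys, weights):
    # Pick the (threshold, sign) stump with the lowest weighted error.
    best = None
    for t in xs:
        for sign in (1, -1):
            preds = [sign if x >= t else -sign for x in xs]
            err = sum(w for w, p, y in zip(weights, preds, ys) if p != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    return best  # (weighted error, threshold, sign)

def boost(xs, ys, rounds=10):
    n = len(xs)
    weights = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, t, sign = weak_learner(xs, ys, weights)
        err = max(err, 1e-10)                      # guard against zero error
        alpha = 0.5 * math.log((1 - err) / err)    # weak edge -> vote weight
        ensemble.append((alpha, t, sign))
        # Reweight: misclassified examples get more mass next round.
        weights = [w * math.exp(-alpha * y * (sign if x >= t else -sign))
                   for w, x, y in zip(weights, xs, ys)]
        z = sum(weights)
        weights = [w / z for w in weights]
    return ensemble

def predict(ensemble, x):
    score = sum(a * (s if x >= t else -s) for a, t, s in ensemble)
    return 1 if score >= 0 else -1

xs = [random.uniform(0, 1) for _ in range(100)]
ys = [1 if x > 0.5 else -1 for x in xs]
h = boost(xs, ys)
accuracy = sum(predict(h, x) == y for x, y in zip(xs, ys)) / len(xs)
print(f"training accuracy: {accuracy:.2f}")
```

Each round the reweighting forces the next stump to focus on the examples the current ensemble gets wrong, which is how a learner with only a slight edge over random guessing is driven to arbitrarily high accuracy.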

Journal ArticleDOI
Yaochu Jin
TL;DR: This paper provides a concise overview of the history and recent developments in surrogate-assisted evolutionary computation and suggests a few future trends in this research area.
Abstract: Surrogate-assisted (or meta-model-based) evolutionary computation uses efficient computational models, often known as surrogates or meta-models, to approximate the fitness function in evolutionary algorithms. Research on surrogate-assisted evolutionary computation began over a decade ago and has received considerable and increasing interest in recent years. Notably, surrogate-assisted evolutionary computation has found successful applications not only in solving computationally expensive single- or multi-objective optimization problems, but also in addressing dynamic, constrained, and multi-modal optimization problems. This paper provides a concise overview of the history and recent developments in surrogate-assisted evolutionary computation and suggests a few future trends in this research area.

1,072 citations
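The basic loop the survey covers, in which a cheap model pre-screens candidates so that only the most promising offspring reach the expensive true fitness function, can be sketched as follows; the 1-nearest-neighbour surrogate, sphere objective, and parameter values are illustrative assumptions, not constructions from the paper.

```python
import random

def expensive_fitness(x):
    # Stand-in for a costly simulation (minimization; sphere function).
    return sum(xi ** 2 for xi in x)

def surrogate(x, archive):
    # 1-NN surrogate: predict the fitness of the nearest evaluated point.
    nearest = min(archive,
                  key=lambda rec: sum((a - b) ** 2 for a, b in zip(rec[0], x)))
    return nearest[1]

DIM, POP, OFFSPRING, GENS = 5, 10, 40, 30
parents = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
archive = [(p, expensive_fitness(p)) for p in parents]  # true evaluations

for _ in range(GENS):
    # Generate offspring by Gaussian mutation of random parents.
    offspring = [[xi + random.gauss(0, 0.5) for xi in random.choice(parents)]
                 for _ in range(OFFSPRING)]
    # Pre-screen with the surrogate; truly evaluate only the top POP candidates.
    offspring.sort(key=lambda x: surrogate(x, archive))
    evaluated = [(x, expensive_fitness(x)) for x in offspring[:POP]]
    archive.extend(evaluated)
    # (mu + lambda)-style survival over everything truly evaluated so far.
    pool = sorted(archive, key=lambda rec: rec[1])[:POP]
    parents = [rec[0] for rec in pool]

print("best true fitness:", min(f for _, f in archive))
```

The design choice to spend expensive evaluations only on surrogate-ranked survivors is what makes the approach attractive for the computationally expensive problems the survey emphasizes.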

MonographDOI
01 Jan 2000

728 citations

Journal ArticleDOI
Michael Kearns
TL;DR: This paper formalizes a new but related model of learning from statistical queries, and demonstrates the generality of the statistical query model, showing that practically every class learnable in Valiant's model and its variants can also be learned in the new model (and thus can be learned in the presence of noise).
Abstract: In this paper, we study the problem of learning in the presence of classification noise in the probabilistic learning model of Valiant and its variants. In order to identify the class of "robust" learning algorithms in the most general way, we formalize a new but related model of learning from statistical queries. Intuitively, in this model a learning algorithm is forbidden to examine individual examples of the unknown target function, but is given access to an oracle providing estimates of probabilities over the sample space of random examples. One of our main results shows that any class of functions learnable from statistical queries is in fact learnable with classification noise in Valiant's model, with a noise rate approaching the information-theoretic barrier of 1/2. We then demonstrate the generality of the statistical query model, showing that practically every class learnable in Valiant's model and its variants can also be learned in the new model (and thus can be learned in the presence of noise). A notable exception to this statement is the class of parity functions, which we prove is not learnable from statistical queries, and for which no noise-tolerant algorithm is known.

662 citations
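A toy illustration of the statistical-query model the paper formalizes: the learner below never inspects individual labelled examples, only probability estimates returned by an oracle (simulated here by sample averages over noisy data). The single-relevant-attribute target class and all parameters are illustrative assumptions, not the paper's constructions.

```python
import random

NOISE = 0.2          # classification noise rate eta < 1/2
RELEVANT = 3         # hidden target: f(x) = x[RELEVANT]
N_FEATURES = 10

def noisy_example():
    x = [random.randint(0, 1) for _ in range(N_FEATURES)]
    label = x[RELEVANT]
    if random.random() < NOISE:
        label ^= 1   # flip the label with probability eta
    return x, label

def stat_oracle(chi, tolerance):
    # Estimate E[chi(x, label)] to within the tolerance by sampling.
    # A real SQ oracle hides the individual samples from the learner entirely.
    n = int(1 / tolerance ** 2)
    return sum(chi(*noisy_example()) for _ in range(n)) / n

# SQ learner: ask, for each attribute i, how often x[i] agrees with the label.
# Under noise rate eta the agreement is about 1 - eta for the relevant
# attribute and about 1/2 elsewhere, so the argmax still finds the target;
# this is the sense in which SQ algorithms tolerate classification noise.
agreements = [stat_oracle(lambda x, y, i=i: x[i] == y, 0.05)
              for i in range(N_FEATURES)]
guess = max(range(N_FEATURES), key=lambda i: agreements[i])
print(f"recovered attribute {guess} (true: {RELEVANT})")
```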


Network Information
Related Topics (5)
Genetic algorithm: 67.5K papers, 1.2M citations, 76% related
Fuzzy set: 44.4K papers, 1.1M citations, 72% related
Artificial neural network: 207K papers, 4.5M citations, 71% related
Fuzzy logic: 151.2K papers, 2.3M citations, 70% related
Support vector machine: 73.6K papers, 1.7M citations, 68% related
Performance Metrics
Number of papers in the topic in previous years:

Year: Papers
2021: 1
2020: 1
2019: 2
2017: 4
2016: 4
2015: 4