Topic

Evaluation function

About: Evaluation function is a research topic. Over its lifetime, 2,319 publications have been published within this topic, receiving 24,379 citations. The topic is also known as: heuristic evaluation function & static evaluation function.


Papers
Book ChapterDOI
01 Mar 1991
TL;DR: Two improvements to the CN2 algorithm are described: the use of the Laplacian error estimate as an alternative evaluation function, and the generation of unordered as well as ordered rules.
Abstract: The CN2 algorithm induces an ordered list of classification rules from examples using entropy as its search heuristic. In this short paper, we describe two improvements to this algorithm. Firstly, we present the use of the Laplacian error estimate as an alternative evaluation function and secondly, we show how unordered as well as ordered rules can be generated. We experimentally demonstrate significantly improved performances resulting from these changes, thus enhancing the usefulness of CN2 as an inductive tool. Comparisons with Quinlan's C4.5 are also made.

934 citations
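The switch from entropy to the Laplacian error estimate changes only the rule-scoring step of CN2. Below is a minimal sketch of both scoring functions, assuming a rule is summarized by the class counts of the examples it covers; the two-class example and the function names are illustrative, not taken from the paper.

```python
import math

def entropy_score(class_counts):
    """Entropy of the class distribution covered by a rule (lower is better)."""
    total = sum(class_counts)
    if total == 0:
        return float("inf")
    probs = [c / total for c in class_counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

def laplace_accuracy(class_counts, predicted_class):
    """Laplace accuracy estimate, (n_c + 1) / (n + k), where n_c is the count of
    the predicted class, n the total coverage and k the number of classes.
    The Laplacian error estimate is 1 minus this value; higher accuracy is better."""
    n = sum(class_counts)
    k = len(class_counts)
    return (class_counts[predicted_class] + 1) / (n + k)

# Example: a rule covering 18 positive and 2 negative examples.
counts = [18, 2]
print(entropy_score(counts))        # ~0.469 bits
print(laplace_accuracy(counts, 0))  # (18 + 1) / (20 + 2) ~= 0.864
```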

Journal ArticleDOI
TL;DR: The extensive use of the uncertainty information of predictions for screening the candidate solutions makes it possible to significantly reduce the computational cost of single- and multiobjective EA.
Abstract: This paper presents and analyzes in detail an efficient search method based on evolutionary algorithms (EA) assisted by local Gaussian random field metamodels (GRFM). It is created for use in optimization problems with one (or many) computationally expensive evaluation function(s). The role of GRFM is to predict objective function values for new candidate solutions by exploiting information recorded during previous evaluations. Moreover, GRFM are able to provide estimates of the confidence of their predictions. Predictions and their confidence intervals provided by GRFM are used by the metamodel-assisted EA, which selects the promising members in each generation and carries out exact, costly evaluations only for them. The extensive use of the uncertainty information of predictions for screening the candidate solutions makes it possible to significantly reduce the computational cost of single- and multiobjective EA. This is demonstrated in the paper by means of mathematical test cases and a multipoint airfoil design in aerodynamics.

639 citations
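The core idea of the metamodel-assisted EA is to rank offspring with a cheap surrogate that also reports its prediction uncertainty, and to spend exact evaluations only on the most promising members. The sketch below illustrates that pre-screening step using a Gaussian-process regressor as a stand-in for the local GRFM; the lower-confidence-bound criterion, the toy objective, and all names are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_eval(x):
    # Placeholder for the costly objective (e.g., a CFD run in airfoil design).
    return float(np.sum(x ** 2))

rng = np.random.default_rng(0)
archive_X = rng.uniform(-2, 2, size=(30, 2))              # previously evaluated points
archive_y = np.array([expensive_eval(x) for x in archive_X])

def prescreen(offspring, n_exact=3, kappa=1.0):
    """Fit a surrogate on the archive, then pick the offspring with the best
    lower confidence bound (mean - kappa * std) for exact, costly evaluation."""
    gp = GaussianProcessRegressor(normalize_y=True).fit(archive_X, archive_y)
    mean, std = gp.predict(offspring, return_std=True)
    lcb = mean - kappa * std                  # optimistic estimate for minimization
    chosen = np.argsort(lcb)[:n_exact]        # only these get the expensive call
    return [(int(i), expensive_eval(offspring[i])) for i in chosen]

offspring = rng.uniform(-2, 2, size=(20, 2))
print(prescreen(offspring))
```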

Journal ArticleDOI
TL;DR: Computational results show that the genetic algorithm heuristic is able to find optimal and near-optimal solutions that are, on average, less than 0.01% from optimality.

510 citations

Journal ArticleDOI
TL;DR: A unified extension of the basic method to predict not only the network structure but also its dynamics using a Genetic Algorithm and an S-system formalism is proposed and successfully inferred the dynamics of a small genetic network constructed with 60 parameters for 5 network variables and feedback loops using only time-course data of gene expression.
Abstract: Motivation: The modeling of system dynamics of genetic networks, metabolic networks or signal transduction cascades from time-course data is formulated as a reverse problem. Previous studies focused on the estimation of only network structures, and they were ineffective in inferring a network structure with feedback loops. We previously proposed a method to predict not only the network structure but also its dynamics using a Genetic Algorithm (GA) and an S-system formalism. However, it could predict only a small number of parameters and could rarely obtain essential structures. In this work, we propose a unified extension of the basic method. Notable improvements are as follows: (1) an additional term in its evaluation function that aims at eliminating futile parameters; (2) a crossover method called Simplex Crossover (SPX) to improve its optimization ability; and (3) a gradual optimization strategy to increase the number of predictable parameters. Results: The proposed method is implemented as a C program called PEACE1 (Predictor by Evolutionary Algorithms and Canonical Equations 1). Its performance was compared with the basic method. The comparison showed that: (1) the convergence rate increased about 5-fold; (2) the optimization speed was raised about 1.5-fold; and (3) the number of predictable parameters was increased about 5-fold. Moreover, we successfully inferred the dynamics of a small genetic network constructed with 60 parameters for 5 network variables and feedback loops using only time-course data of gene expression.

430 citations
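In the S-system formalism, each variable's rate of change is the difference of two power-law terms, and the GA's evaluation function compares simulated and observed time courses. The sketch below shows such a fitness function with an added penalty that drives small kinetic orders toward zero, which is the role the abstract assigns to the extra term; the penalty form, the genome layout, and all names are assumptions for illustration, not PEACE1's actual code.

```python
import numpy as np
from scipy.integrate import odeint

def s_system(x, t, alpha, beta, g, h):
    """S-system dynamics: dx_i/dt = alpha_i * prod_j x_j^g_ij - beta_i * prod_j x_j^h_ij."""
    x = np.maximum(x, 1e-8)                      # keep the power laws well-defined
    return alpha * np.prod(x ** g, axis=1) - beta * np.prod(x ** h, axis=1)

def unpack(params, n):
    """Decode a flat GA genome into rate constants alpha, beta and kinetic orders g, h."""
    alpha, beta = params[:n], params[n:2 * n]
    g = params[2 * n:2 * n + n * n].reshape(n, n)
    h = params[2 * n + n * n:].reshape(n, n)
    return alpha, beta, g, h

def fitness(params, t_obs, x_obs, penalty_weight=0.1):
    """Squared relative error between simulated and observed time courses,
    plus a penalty on kinetic orders that pushes futile parameters toward zero."""
    n = x_obs.shape[1]
    alpha, beta, g, h = unpack(params, n)
    x_sim = odeint(s_system, x_obs[0], t_obs, args=(alpha, beta, g, h))
    error = np.sum(((x_sim - x_obs) / x_obs) ** 2)
    penalty = penalty_weight * (np.sum(np.abs(g)) + np.sum(np.abs(h)))
    return error + penalty                       # the GA minimizes this value
```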

Book ChapterDOI
01 Jan 2004
TL;DR: Classic problem-solving methods such as dynamic programming, branch and bound, and local search, along with modern heuristics like simulated annealing and tabu search, are reviewed; some are deterministic, while others incorporate random variation into the search for optimal solutions.
Abstract: In the previous three chapters we discussed various classic problem-solving methods, including dynamic programming, branch and bound, and local search algorithms, as well as some modern heuristic methods like simulated annealing and tabu search. Some of these techniques were seen to be deterministic. Essentially you “turn the crank” and out pops the answer. For these methods, given a search space and an evaluation function, some would always return the same solution (e.g., dynamic programming), while others could generate different solutions based on the initial configuration or starting point (e.g., a greedy algorithm or the hill-climbing technique). Still other methods were probabilistic, incorporating random variation into the search for optimal solutions. These methods (e.g., simulated annealing) could return different final solutions even when given the same initial configuration. No two trials with these algorithms could be expected to take exactly the same course. Each trial is much like a person’s fingerprint: although there are broad similarities across fingerprints, no two are exactly alike.

416 citations
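The contrast drawn above can be made concrete with two searches that share the same evaluation function: hill climbing always moves to the best neighbor and returns the same answer from the same starting point, while simulated annealing sometimes accepts worse moves at random and can end differently on every run. The sketch below is a toy illustration of that contrast; the one-dimensional objective and all parameter choices are made up for the example.

```python
import math
import random

def evaluate(x):
    """Toy evaluation function with several local optima (we maximize it)."""
    return math.sin(3 * x) + 0.5 * math.cos(7 * x) - 0.05 * x * x

def hill_climb(x, step=0.05, iters=1000):
    """Deterministic: from a given start, the trajectory is always the same."""
    for _ in range(iters):
        best = max((x - step, x, x + step), key=evaluate)
        if best == x:
            break
        x = best
    return x

def simulated_annealing(x, temp=1.0, cooling=0.995, iters=1000):
    """Stochastic: worse moves are sometimes accepted, so runs can differ."""
    for _ in range(iters):
        candidate = x + random.uniform(-0.5, 0.5)
        delta = evaluate(candidate) - evaluate(x)
        if delta > 0 or random.random() < math.exp(delta / temp):
            x = candidate
        temp *= cooling
    return x

start = 2.0
print(hill_climb(start), hill_climb(start))                    # identical results
print(simulated_annealing(start), simulated_annealing(start))  # usually different
```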


Network Information
Related Topics (5)
Fuzzy logic: 151.2K papers, 2.3M citations, 84% related
Artificial neural network: 207K papers, 4.5M citations, 81% related
Optimization problem: 96.4K papers, 2.1M citations, 81% related
Software: 130.5K papers, 2M citations, 81% related
Cluster analysis: 146.5K papers, 2.9M citations, 80% related
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    3
2022    11
2021    61
2020    92
2019    128
2018    108