Topic

Best-first search

About: Best-first search is a graph search algorithm that explores a graph by expanding the most promising node according to an evaluation function. Over the lifetime, 3,331 publications have been published within this topic, receiving 102,191 citations.
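To make the idea concrete, here is a minimal sketch of greedy best-first search in Python, assuming an explicit adjacency-list graph and a user-supplied heuristic h(node). The graph, node names, and heuristic below are illustrative, not drawn from any of the papers on this page.

```python
import heapq

def best_first_search(graph, start, goal, h):
    """Expand the frontier node with the lowest heuristic value first."""
    frontier = [(h(start), start)]          # priority queue keyed on h
    came_from = {start: None}
    while frontier:
        _, node = heapq.heappop(frontier)
        if node == goal:                    # reconstruct the path on success
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for neighbor in graph.get(node, ()):
            if neighbor not in came_from:   # visit each node at most once
                came_from[neighbor] = node
                heapq.heappush(frontier, (h(neighbor), neighbor))
    return None                             # goal unreachable

# Toy usage: nodes are 2-D points, h is straight-line distance to the goal.
graph = {(0, 0): [(1, 0), (0, 1)], (1, 0): [(2, 0)], (0, 1): [(2, 0)], (2, 0): []}
goal = (2, 0)
print(best_first_search(graph, (0, 0), goal,
                        lambda n: ((n[0] - goal[0])**2 + (n[1] - goal[1])**2) ** 0.5))
```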


Papers
Journal Article
TL;DR: This paper shows empirically and theoretically that randomly chosen trials are more efficient for hyper-parameter optimization than trials on a grid, and argues that random search is a natural baseline against which to judge progress in the development of adaptive (sequential) hyper-parameter optimization algorithms.
Abstract: Grid search and manual search are the most widely used strategies for hyper-parameter optimization. This paper shows empirically and theoretically that randomly chosen trials are more efficient for hyper-parameter optimization than trials on a grid. Empirical evidence comes from a comparison with a large previous study that used grid search and manual search to configure neural networks and deep belief networks. Compared with neural networks configured by a pure grid search, we find that random search over the same domain is able to find models that are as good or better within a small fraction of the computation time. Granting random search the same computational budget, random search finds better models by effectively searching a larger, less promising configuration space. Compared with deep belief networks configured by a thoughtful combination of manual search and grid search, purely random search over the same 32-dimensional configuration space found statistically equal performance on four of seven data sets, and superior performance on one of seven. A Gaussian process analysis of the function from hyper-parameters to validation set performance reveals that for most data sets only a few of the hyper-parameters really matter, but that different hyper-parameters are important on different data sets. This phenomenon makes grid search a poor choice for configuring algorithms for new data sets. Our analysis casts some light on why recent "High Throughput" methods achieve surprising success--they appear to search through a large number of hyper-parameters because most hyper-parameters do not matter much. We anticipate that growing interest in large hierarchical models will place an increasing burden on techniques for hyper-parameter optimization; this work shows that random search is a natural baseline against which to judge progress in the development of adaptive (sequential) hyper-parameter optimization algorithms.

6,935 citations
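The paper's core argument is easy to reproduce in miniature: when only a few hyper-parameters matter, a grid with a fixed trial budget wastes evaluations on repeated values of the important dimension, while random search covers it more densely. The sketch below is an illustrative toy; the objective validation_error and the two hyper-parameters are invented, not taken from the paper.

```python
import random

def validation_error(lr, momentum):
    # Toy objective: performance depends almost entirely on the learning rate.
    return (lr - 0.1) ** 2 + 1e-6 * momentum

def grid_search(n_per_axis):
    lrs = [i / n_per_axis for i in range(1, n_per_axis + 1)]
    moms = [i / n_per_axis for i in range(1, n_per_axis + 1)]
    return min(validation_error(lr, m) for lr in lrs for m in moms)

def random_search(n_trials, rng):
    return min(validation_error(rng.random(), rng.random()) for _ in range(n_trials))

rng = random.Random(1)
# Same budget of 16 trials: the grid tries only 4 distinct learning rates,
# while random search tries 16, so it typically lands far closer to lr = 0.1.
print("grid  :", grid_search(4))        # 16 evaluations, 4 distinct lr values
print("random:", random_search(16, rng))
```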

Journal Article (DOI)
TL;DR: A new optimization algorithm based on the law of gravity and mass interactions is introduced and the obtained results confirm the high performance of the proposed method in solving various nonlinear functions.

5,501 citations
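As a rough illustration of the TL;DR, here is a sketch of a gravitational-search-style optimizer in which candidate solutions act as masses that attract one another in proportion to their fitness. The constants G0 and alpha, the box bounds, and the test function are assumptions for illustration, not parameters from the paper.

```python
import math
import random

def gsa_minimize(f, dim, n_agents=20, iters=200, lo=-5.0, hi=5.0, seed=0):
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_agents)]
    V = [[0.0] * dim for _ in range(n_agents)]
    G0, alpha = 100.0, 20.0                       # illustrative constants
    best_x, best_f = None, float("inf")
    for t in range(iters):
        fits = [f(x) for x in X]
        b, w = min(fits), max(fits)
        if b < best_f:
            best_f, best_x = b, list(X[fits.index(b)])
        # Heavier mass = better fitness (for minimization).
        m = [(w - fi) / (w - b + 1e-12) for fi in fits]
        M = [mi / (sum(m) + 1e-12) for mi in m]
        G = G0 * math.exp(-alpha * t / iters)     # gravity decays over time
        for i in range(n_agents):
            acc = [0.0] * dim
            for j in range(n_agents):
                if i == j:
                    continue
                R = math.dist(X[i], X[j])
                for d in range(dim):
                    # Dividing force by M_i leaves G * M_j / R as acceleration.
                    acc[d] += rng.random() * G * M[j] * (X[j][d] - X[i][d]) / (R + 1e-12)
            for d in range(dim):
                V[i][d] = rng.random() * V[i][d] + acc[d]
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
    return best_x, best_f

# Toy usage on the sphere function, whose minimum is 0 at the origin.
print(gsa_minimize(lambda x: sum(v * v for v in x), dim=2))
```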

Journal Article (DOI)
TL;DR: This chapter presents the basic schemes of VNS and some of its extensions, and presents five families of applications in which VNS has proven to be very successful.

3,572 citations
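The basic VNS scheme the chapter presents can be sketched in a few lines: shake the incumbent within progressively larger neighborhoods, apply a local search, and recentre (restarting from the first neighborhood) whenever an improvement is found. The continuous test problem, step sizes, and neighborhood definition below are illustrative assumptions, not taken from the chapter.

```python
import math
import random

def local_search(f, x, step=0.01, iters=200, rng=random):
    """First-improvement descent using random coordinate perturbations."""
    x, fx = list(x), f(x)
    for _ in range(iters):
        y = [xi + rng.uniform(-step, step) for xi in x]
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
    return x, fx

def vns(f, x0, k_max=5, rounds=100, seed=0):
    rng = random.Random(seed)
    x, fx = local_search(f, x0, rng=rng)
    for _ in range(rounds):
        k = 1
        while k <= k_max:
            # Shaking: jump to a random point in the k-th neighborhood
            # (larger k = larger perturbation radius).
            y = [xi + rng.uniform(-0.5 * k, 0.5 * k) for xi in x]
            y, fy = local_search(f, y, rng=rng)
            if fy < fx:
                x, fx = y, fy   # improvement: recentre and restart at k = 1
                k = 1
            else:
                k += 1          # no improvement: widen the neighborhood
    return x, fx

# Toy usage on a multimodal function with global minimum 0 at the origin.
f = lambda x: sum(v * v - 2.0 * math.cos(3.0 * v) + 2.0 for v in x)
print(vns(f, [3.0, -3.0]))
```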

Journal Article
TL;DR: A real-coded crossover operator is developed whose search power is similar to that of the single-point crossover used in binary-coded GAs; SBX is found to be particularly useful in problems having multiple optimal solutions with a narrow global basin and in problems where the lower and upper bounds of the global optimum are not known a priori.
Abstract: The success of binary-coded genetic algorithms (GAs) in problems having discrete search spaces largely depends on the coding used to represent the problem variables and on the crossover operator that propagates building blocks from parent strings to children strings. In solving optimization problems having continuous search spaces, binary-coded GAs discretize the search space by using a coding of the problem variables in binary strings. However, the coding of real-valued variables in finite-length strings causes a number of difficulties: inability to achieve arbitrary precision in the obtained solution, fixed mapping of problem variables, the inherent Hamming cliff problem associated with binary coding, and processing of Holland's schemata in continuous search spaces. Although a number of real-coded GAs have been developed to solve optimization problems having a continuous search space, the search powers of these crossover operators are not adequate. In this paper, the search power of a crossover operator is defined in terms of the probability of creating an arbitrary child solution from a given pair of parent solutions. Motivated by the success of binary-coded GAs in discrete search space problems, we develop a real-coded crossover operator (which we call the simulated binary crossover, or SBX) whose search power is similar to that of the single-point crossover used in binary-coded GAs. Simulation results on a number of real-valued test problems of varying difficulty and dimensionality suggest that real-coded GAs with the SBX operator are able to perform as well as or better than binary-coded GAs with the single-point crossover. SBX is found to be particularly useful in problems having multiple optimal solutions with a narrow global basin and in problems where the lower and upper bounds of the global optimum are not known a priori. Further, a simulation on a two-variable blocked function shows that the real-coded GA with SBX works as suggested by Goldberg.

2,702 citations
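The SBX operator itself is compact: a spread factor beta is drawn from a polynomial distribution controlled by a distribution index eta, and the two children are placed symmetrically about the parents' midpoint, so offspring near the parents are most likely — the property the paper's notion of search power captures. A minimal sketch follows; the value eta = 2 is an illustrative choice, not prescribed by the paper.

```python
import random

def sbx(p1, p2, eta=2.0, rng=random):
    """Return two children from two real-valued parent vectors."""
    c1, c2 = [], []
    for x1, x2 in zip(p1, p2):
        u = rng.random()
        if u <= 0.5:
            beta = (2.0 * u) ** (1.0 / (eta + 1.0))
        else:
            beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
        # Children are symmetric about the parents' midpoint; larger eta
        # concentrates them more tightly around the parents.
        c1.append(0.5 * ((1.0 + beta) * x1 + (1.0 - beta) * x2))
        c2.append(0.5 * ((1.0 - beta) * x1 + (1.0 + beta) * x2))
    return c1, c2

# Toy usage: most offspring stay close to the parent values.
rng = random.Random(0)
print(sbx([0.0, 1.0], [1.0, 3.0], rng=rng))
```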

Journal Article (DOI)
TL;DR: A novel search strategy is introduced that combines hill-climbing with systematic search, and it is shown how other powerful heuristic information can be extracted and used to prune the search space.
Abstract: We describe and evaluate the algorithmic techniques that are used in the FF planning system. Like the HSP system, FF relies on forward state space search, using a heuristic that estimates goal distances by ignoring delete lists. Unlike HSP's heuristic, our method does not assume facts to be independent. We introduce a novel search strategy that combines hill-climbing with systematic search, and we show how other powerful heuristic information can be extracted and used to prune the search space. FF was the most successful automatic planner at the recent AIPS-2000 planning competition. We review the results of the competition, give data for other benchmark domains, and investigate the reasons for the runtime performance of FF compared to HSP.

1,994 citations
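The "hill-climbing combined with systematic search" strategy is commonly known as enforced hill-climbing: from the current state, run a breadth-first search until a state with a strictly better heuristic value is found, then commit to it. The sketch below illustrates the idea on a toy state space; it is not FF's actual implementation, and the successor function and heuristic are assumptions.

```python
from collections import deque

def enforced_hill_climbing(start, successors, h, is_goal):
    state, h_state = start, h(start)
    while not is_goal(state):
        # Systematic phase: breadth-first search for any strictly better state.
        frontier, seen = deque([state]), {state}
        improved = None
        while frontier and improved is None:
            s = frontier.popleft()
            for t in successors(s):
                if t in seen:
                    continue
                seen.add(t)
                if h(t) < h_state:
                    improved = t        # escape the plateau or local minimum
                    break
                frontier.append(t)
        if improved is None:
            return None                 # dead end: no better state reachable
        state, h_state = improved, h(improved)
    return state

# Toy usage: walk on the integers toward 0, heuristic = distance to 0.
print(enforced_hill_climbing(7, lambda s: [s - 1, s + 1],
                             abs, lambda s: s == 0))
```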


Network Information
Related Topics (5)
Genetic algorithm: 67.5K papers, 1.2M citations (86% related)
Optimization problem: 96.4K papers, 2.1M citations (85% related)
Scheduling (computing): 78.6K papers, 1.3M citations (82% related)
Robustness (computer science): 94.7K papers, 1.6M citations (82% related)
Artificial neural network: 207K papers, 4.5M citations (82% related)
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    3
2022    26
2021    9
2020    15
2019    17
2018    22