Proceedings ArticleDOI

β-Hill Climbing Algorithm for Sudoku Game

TL;DR: In this article, the β-hill climbing algorithm, an extended version of hill climbing that escapes local optima via a stochastic operator called the β-operator, is used to solve a Sudoku puzzle in 19 iterations and 2 seconds.
Abstract: In this paper, the β-hill climbing algorithm, a recent local search-based metaheuristic, is tailored to the Sudoku puzzle. β-hill climbing is an extended version of the hill climbing algorithm that is able to escape local optima using a stochastic operator called the β-operator. The Sudoku puzzle is a popular game formulated here as an optimization problem whose exact solution is sought. Several Sudoku puzzle examples are used for the evaluation process. The parameters of β-hill climbing are also studied to determine the best configuration for this game. With its best parameter configuration, β-hill climbing is able to find the solution of a Sudoku puzzle in 19 iterations and 2 seconds.
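To make the idea concrete, the sketch below shows one way a β-operator can be wired into a hill climber for Sudoku. The fitness function (counting rule violations), the single-cell neighbourhood move, the acceptance rule, and all parameter values are illustrative assumptions, not the exact design used in the paper:

```python
import random

def conflicts(grid):
    """Count rule violations (missing digits) across all rows, columns and 3x3 boxes."""
    units = [[grid[r][c] for c in range(9)] for r in range(9)]                 # rows
    units += [[grid[r][c] for r in range(9)] for c in range(9)]                # columns
    units += [[grid[br + r][bc + c] for r in range(3) for c in range(3)]
              for br in range(0, 9, 3) for bc in range(0, 9, 3)]               # 3x3 boxes
    return sum(9 - len(set(u)) for u in units)

def beta_hill_climbing_sudoku(puzzle, beta=0.05, max_iters=100_000):
    """puzzle: 9x9 list of lists, 0 marks an empty cell; given clues are never changed."""
    free = [(r, c) for r in range(9) for c in range(9) if puzzle[r][c] == 0]
    best = [row[:] for row in puzzle]
    for r, c in free:
        best[r][c] = random.randint(1, 9)             # random initial assignment of free cells
    best_cost = conflicts(best)
    for _ in range(max_iters):
        if best_cost == 0 or not free:
            break                                      # cost 0 means no violations: exact solution
        cand = [row[:] for row in best]
        r, c = random.choice(free)
        cand[r][c] = random.randint(1, 9)              # neighbourhood move: perturb one free cell
        for r, c in free:                              # beta-operator: each free cell is reassigned
            if random.random() < beta:                 # at random with probability beta
                cand[r][c] = random.randint(1, 9)
        cost = conflicts(cand)
        if cost <= best_cost:                          # accept non-worsening candidates
            best, best_cost = cand, cost
    return best, best_cost
```

The β-operator is what distinguishes this from plain hill climbing: even when no neighbouring move improves the cost, random reassignments with probability β let the search jump out of local optima.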
Citations
Journal ArticleDOI
TL;DR: The proposed βHCWT, a hybrid of the β-hill climbing metaheuristic and the wavelet transform (WT), is presented as a new method for denoising electrocardiogram (ECG) signals; it demonstrated outstanding performance in removing noise from non-stationary signals, and the quality of the output signal was deemed favorable for medical diagnosis.

68 citations


Cites methods from "β-Hill Climbing Algorithm for Sudok..."

  • ...Nevertheless, given its many advantages, the β-hill climbing algorithm has been successfully applied to solve many optimization problems, such as in Sudoku [3], text clustering [1], multiple-reservoir scheduling [6], and signal processing [9, 8]....


Journal ArticleDOI
TL;DR: The grey wolf optimizer (GWO), a swarm intelligence method, is hybridized with a local search algorithm to improve its convergence properties; the resulting method proves to be a powerful approach for the ELD problem and for similar problems in the power system domain.
Abstract: Economic load dispatch (ELD) is a crucial problem in the power system, tackled by distributing the required generation power across a set of units so as to minimize the required fuel cost. This distribution is subject to two main constraints: an equality constraint on the power balance and inequality constraints on the power outputs. In the optimization context, ELD is formulated as a non-convex, nonlinear, constrained optimization problem that cannot easily be solved using calculus-based techniques. Several optimization algorithms have been adapted to it, and, due to the complex nature of the ELD search space, their theoretical concepts have been modified or hybridized. In this paper, the grey wolf optimizer (GWO), a swarm intelligence method, is hybridized with the β-hill climbing optimizer (βHC), a local search algorithm, to improve convergence properties. GWO is very powerful in wide search, while βHC is very powerful in deep search. By combining wide and deep search ability in a single optimization framework, the balance between exploration and exploitation is correctly managed. The proposed hybrid algorithm, named β-GWO, is evaluated using five different test cases of ELD problems: 3 generating units with 850 MW; 13 generating units with 1800 MW; 13 generating units with 2520 MW; 40 generating units with 10,500 MW; and 80 generating units with 21,000 MW. β-GWO is compared against 49 methods, and the results it obtains outperform the others in most test cases. In conclusion, the proposed β-GWO proves to be a powerful method for the ELD problem and for similar problems in the power system domain.
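For context, a standard textbook formulation consistent with the ELD description above is shown below; the quadratic fuel-cost coefficients a_i, b_i, c_i, the demand P_D, and the transmission loss P_L are generic symbols, not values taken from the paper:

```latex
\min_{P_1,\dots,P_n} \; F \;=\; \sum_{i=1}^{n} \left( a_i P_i^{2} + b_i P_i + c_i \right)
\quad \text{subject to} \quad
\sum_{i=1}^{n} P_i \;=\; P_D + P_L,
\qquad
P_i^{\min} \,\le\, P_i \,\le\, P_i^{\max}, \quad i = 1,\dots,n.
```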

47 citations

Journal ArticleDOI
TL;DR: The proposed local search-based method, which uses an intelligent stochastic operator called the β-operator to escape the trap of local optima, produces a very close-to-optimum result for almost all of the tested ELD systems and the best result for one of them.
Abstract: In this paper, the problem of economic load dispatch (ELD) is tackled using a recently introduced local search-based method called the β-hill climbing optimizer. In a power system, the ELD problem is tackled by arranging the outputs of a set of generation units in a specific order to minimize the cost of the operating fuel and to match the power system load demand. This goal is achieved by satisfying all the power balance equality constraints and power output inequality constraints. β-hill climbing is a new local search algorithm that uses an intelligent stochastic operator called the β-operator to escape the trap of local optima. The proposed method is evaluated using five real-world ELD benchmarks that vary in complexity and size. A sensitivity analysis of the method's parameters is conducted based on eight different convergence cases. The evaluation results are compared with 45 state-of-the-art methods on the same ELD benchmarks. Interestingly, the proposed method produces a very close-to-optimum result for almost all the tested ELD systems and the best result for one of them.

40 citations

Journal ArticleDOI
01 Jan 2019
TL;DR: The proposed adaptive β-hill climbing is able to achieve the best results on 10 out of 23 test functions, and its results are very competitive with the other methods.
Abstract: In this paper, an adaptive version of β-hill climbing is proposed. In the original β-hill climbing, two control parameters are used to strike the right balance between local, nearby exploitation and global, wide-range exploration during the search: 𝒩 and β, respectively. Conventionally, these two parameters require an intensive study to find their suitable values. In order to yield an easy-to-use optimization method, this paper proposes an efficient adaptive strategy that sets these two parameters in a deterministic way. The proposed adaptive method is evaluated against 23 global optimization functions. A selectivity analysis to determine the optimal progression of 𝒩 and β during the search is carried out. Furthermore, the behavior of the adaptive version is analyzed on problems with different complexity levels. For comparative evaluation, the adaptive version is first compared with the original one as well as with other local search-based methods and other well-regarded methods on the same benchmark functions. Interestingly, the results produced are very competitive with the other methods. In a nutshell, the proposed adaptive β-hill climbing is able to achieve the best results on 10 out of 23 test functions. For further validation, the test functions established in IEEE CEC2015 are used with various scaling values. The comparative results show the viability of the proposed adaptive method.
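The abstract does not reproduce the actual adaptation rule. Purely as an illustration of what a deterministic schedule for 𝒩 (the neighbourhood bandwidth) and β could look like, the sketch below decays both parameters with the iteration counter so the search shifts from exploration to exploitation; every name and constant here is an assumption, not the rule proposed in the cited paper:

```python
def adaptive_parameters(iteration, max_iterations,
                        n_max=1.0, n_min=1e-3, beta_max=0.5, beta_min=1e-3):
    """Illustrative deterministic decay of the bandwidth N and the mutation rate beta.
    This is an assumed schedule, not the one proposed in the cited paper."""
    t = iteration / max_iterations                   # search progress in [0, 1]
    n = n_max - (n_max - n_min) * t                  # linear decay of the neighbourhood bandwidth
    beta = beta_max * (beta_min / beta_max) ** t     # exponential decay of beta
    return n, beta
```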

37 citations

Journal ArticleDOI
TL;DR: The obtained results show that the proposed binary β-hill climbing optimizer outperforms the other comparative local search methods in terms of classification accuracy on 16 out of 22 datasets, and also overcomes other comparative metaheuristic approaches in terms of classification accuracy.
Abstract: Feature selection is an essential stage in many data mining and machine learning applications; it finds a proper subset of features within a set of irrelevant, redundant, noisy and high-dimensional data. This dimensionality reduction is a vital task for increasing classification accuracy and thus reducing processing time. An optimization algorithm can be applied to tackle the feature selection problem. In this paper, a β-hill climbing optimizer is applied to solve the feature selection problem. β-hill climbing was recently introduced as a local search-based algorithm that can obtain pleasing solutions for different optimization problems. In order to tailor β-hill climbing for feature selection, it has to be adapted to work in a binary context. The S-shaped transfer function is used to transform the data into a binary representation. A set of 22 de facto benchmark real-world datasets is used to evaluate the proposed algorithm. The effect of the β-hill climbing parameters on the convergence rate is studied in terms of accuracy, number of features, fitness values, and computational time. Furthermore, the proposed method is compared against three local search methods and ten metaheuristic methods. The obtained results show that the proposed binary β-hill climbing optimizer outperforms the other comparative local search methods in terms of classification accuracy on 16 out of 22 datasets. Furthermore, it overcomes the other comparative metaheuristic approaches in terms of classification accuracy on 7 out of 22 datasets. The obtained results prove the efficiency of the proposed binary β-hill climbing optimizer.
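A minimal sketch of the S-shaped transfer step described in the abstract: a continuous component is passed through a sigmoid and the result is used as the probability of selecting the corresponding feature. The function and variable names are illustrative, not taken from the paper:

```python
import math
import random

def s_shaped(x):
    """S-shaped (sigmoid) transfer function mapping a real value to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def binarize(solution):
    """Map a continuous candidate solution to a binary feature mask:
    feature i is selected when a uniform draw falls below S(x_i)."""
    return [1 if random.random() < s_shaped(x) else 0 for x in solution]
```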

24 citations

References
Journal ArticleDOI
13 May 1983-Science
TL;DR: There is a deep and useful connection between statistical mechanics and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters), and a detailed analogy with annealing in solids provides a framework for optimization of very large and complex systems.
Abstract: There is a deep and useful connection between statistical mechanics (the behavior of systems with many degrees of freedom in thermal equilibrium at a finite temperature) and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters). A detailed analogy with annealing in solids provides a framework for optimization of the properties of very large and complex systems. This connection to statistical mechanics exposes new information and provides an unfamiliar perspective on traditional optimization problems and methods.
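A minimal, generic sketch of the annealing idea described above: candidate moves are always accepted when they improve the cost and otherwise accepted with a temperature-dependent (Metropolis) probability, with the temperature gradually lowered. The cooling schedule and parameter values are illustrative assumptions:

```python
import math
import random

def simulated_annealing(cost, neighbour, x0, t0=1.0, alpha=0.95,
                        steps_per_temp=100, t_min=1e-4):
    """Generic simulated annealing loop: cost() is minimised, neighbour() proposes a move.
    Worse moves are accepted with the Metropolis probability exp(-delta / T)."""
    x, fx = x0, cost(x0)
    t = t0
    while t > t_min:
        for _ in range(steps_per_temp):
            y = neighbour(x)
            fy = cost(y)
            delta = fy - fx
            if delta <= 0 or random.random() < math.exp(-delta / t):
                x, fx = y, fy                # accept improving or occasionally worsening moves
        t *= alpha                           # geometric cooling schedule
    return x, fx
```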

41,772 citations

Journal ArticleDOI
TL;DR: Four key areas of Integer programming are examined from a framework that links the perspectives of artificial intelligence and operations research, and each has characteristics that appear usefully relevant to developments on the horizon.

3,985 citations


"β-Hill Climbing Algorithm for Sudok..." refers methods in this paper

  • ...Several extensions to hill climbing have been developed to include exploration-oriented operators, such as simulated annealing [4], tabu search [5], variable neighbourhood search [6], iterated local search [7], and GRASP [8]....


Journal ArticleDOI
TL;DR: This paper defines the various components comprising a GRASP and demonstrates, step by step, how to develop such heuristics for combinatorial optimization problems.
Abstract: Today, a variety of heuristic approaches are available to the operations research practitioner. One methodology that has a strong intuitive appeal, a prominent empirical track record, and is trivial to efficiently implement on parallel processors is GRASP (Greedy Randomized Adaptive Search Procedures). GRASP is an iterative randomized sampling technique in which each iteration provides a solution to the problem at hand. The incumbent solution over all GRASP iterations is kept as the final result. There are two phases within each GRASP iteration: the first intelligently constructs an initial solution via an adaptive randomized greedy function; the second applies a local search procedure to the constructed solution in hope of finding an improvement. In this paper, we define the various components comprising a GRASP and demonstrate, step by step, how to develop such heuristics for combinatorial optimization problems. Intuitive justifications for the observed empirical behavior of the methodology are discussed. The paper concludes with a brief literature review of GRASP implementations and mentions two industrial applications.
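A minimal sketch of the two-phase GRASP iteration described above: each iteration builds a solution with a randomized greedy construction restricted to a candidate list, then improves it by local search, keeping the best incumbent. The arguments greedy_value, local_search, and cost are problem-specific placeholders, not interfaces from the cited paper:

```python
import random

def grasp(candidates, greedy_value, local_search, cost, iterations=100, alpha=0.3):
    """Generic GRASP loop for a minimisation problem built from a set of candidate elements."""
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        remaining, solution = list(candidates), []
        while remaining:                                            # phase 1: randomized greedy construction
            values = {c: greedy_value(solution, c) for c in remaining}
            lo, hi = min(values.values()), max(values.values())
            threshold = lo + alpha * (hi - lo)
            rcl = [c for c in remaining if values[c] <= threshold]  # restricted candidate list
            pick = random.choice(rcl)                               # random pick among good candidates
            solution.append(pick)
            remaining.remove(pick)
        solution = local_search(solution)                           # phase 2: local improvement
        c = cost(solution)
        if c < best_cost:
            best, best_cost = solution, c                           # keep the incumbent solution
    return best, best_cost
```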

2,370 citations

Journal ArticleDOI
TL;DR: In this article, a simple and effective metaheuristic for combinatorial and global optimization, called variable neighborhood search (VNS), is presented, which can easily be implemented using any local search algorithm as a subroutine.

1,824 citations

Journal ArticleDOI
TL;DR: The components and concepts used in various metaheuristics are outlined in order to analyze their similarities and differences, and the classification adopted in this paper differentiates between single-solution-based metaheuristics and population-based metaheuristics.

1,343 citations


"β-Hill Climbing Algorithm for Sudok..." refers methods in this paper

  • ...This kind of algorithm is called an optimization algorithm [1]....
