
Showing papers on "Best-first search published in 2016"


01 Jan 2016
TL;DR: Algorithm selection is concerned with selecting the best algorithm to solve a given problem on a case-by-case basis, as discussed by the authors. The topic has become especially relevant in the last decade, as researchers increasingly investigate how to identify the most suitable existing algorithm for solving a problem instead of developing new algorithms.
Abstract: The Algorithm Selection Problem is concerned with selecting the best algorithm to solve a given problem on a case-by-case basis. It has become especially relevant in the last decade, as researchers are increasingly investigating how to identify the most suitable existing algorithm for solving a problem instead of developing new algorithms. This survey presents an overview of this work focusing on the contributions made in the area of combinatorial search problems, where Algorithm Selection techniques have achieved significant performance improvements. We unify and organise the vast literature according to criteria that determine Algorithm Selection systems in practice. The comprehensive classification of approaches identifies and analyses the different directions from which Algorithm Selection has been approached. This chapter contrasts and compares different methods for solving the problem as well as ways of using these solutions.

283 citations


Journal ArticleDOI
TL;DR: This Hybrid Harmony Search (HHS) algorithm follows a new approach to improvisation: while retaining the HS algorithm's Harmony Memory and pitch adjustment functions, it replaces the HS randomization function with Global-best Particle Swarm Optimization (PSO) search and neighbourhood search.

121 citations


Journal ArticleDOI
TL;DR: In this paper, a knowledge-guided fruit fly optimisation algorithm (KGFOA) with a new encoding scheme is proposed to solve the dual-resource constrained flexible job shop scheduling problem (DRCFJSP) with makespan minimisation criterion.
Abstract: Unlike classical job shop scheduling, the dual-resource constrained flexible job-shop scheduling problem (DRCFJSP) must handle job sequencing, machine assignment and worker assignment all together. In this paper, a knowledge-guided fruit fly optimisation algorithm (KGFOA) with a new encoding scheme is proposed to solve the DRCFJSP with the makespan minimisation criterion. In the KGFOA, two types of permutation-based search operators are used to perform the smell-based search for job sequencing and resource (machine and worker) assignment, respectively. To enhance the search capability, a knowledge-guided search stage is incorporated into the KGFOA with two new search operators particularly designed for adjusting the operation sequence and the resource assignment, respectively. Due to the combination of the knowledge-guided search and the smell-based search, global exploration and local exploitation can be balanced. Besides, the effect of the parameter setting of the KGFOA is investigated and numerica...

97 citations


Journal ArticleDOI
TL;DR: A novel algorithm combining the capabilities of chaotic maps and the golden section search method to solve nonlinear optimization problems; it performs effectively on engineering applications such as the gear train design problem.

89 citations


Proceedings ArticleDOI
27 Jun 2016
TL;DR: This paper presents a compact coding solution for efficient search, with a focus on the quantization approach, which has already shown superior performance over hashing solutions in single-modal similarity search.
Abstract: Cross-modal similarity search is a problem about designing a search system supporting querying across content modalities, e.g., using an image to search for texts or using a text to search for images. This paper presents a compact coding solution for efficient search, with a focus on the quantization approach, which has already shown superior performance over hashing solutions in single-modal similarity search. We propose a cross-modal quantization approach, which is among the early attempts to introduce quantization into cross-modal search. The major contribution lies in jointly learning the quantizers for both modalities through aligning the quantized representations for each pair of image and text belonging to a document. In addition, our approach simultaneously learns the common space for both modalities in which quantization is conducted to enable efficient and effective search using the Euclidean distance computed in the common space with fast distance table lookup. Experimental results compared with several competitive algorithms over three benchmark datasets demonstrate that the proposed approach achieves state-of-the-art performance.
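The fast distance table lookup mentioned above is the standard asymmetric-distance mechanism used in quantization-based search. A minimal sketch of that generic mechanism (not the paper's joint cross-modal learning; the function names and codebook layout are illustrative assumptions):

```python
import numpy as np

def build_distance_table(query, codebooks):
    """Precompute squared distances from each query sub-vector to every codeword
    of the corresponding codebook (one table per sub-space)."""
    sub_queries = np.split(query, len(codebooks))
    return [((cb - q) ** 2).sum(axis=1) for q, cb in zip(sub_queries, codebooks)]

def lookup_distances(codes, tables):
    """Approximate the squared Euclidean distance of every database item to the
    query by summing table entries addressed by the item's quantization codes."""
    # codes: (N, M) integer array of codeword indices per item and sub-space
    return sum(tables[m][codes[:, m]] for m in range(codes.shape[1]))
```

With the codebooks in memory, each query needs only one small table build plus N·M integer-indexed lookups, which is what makes quantization codes suitable for fast exhaustive scans.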

59 citations


Proceedings Article
12 Jun 2016
TL;DR: This paper defines and investigates the Lazy Shortest Path class of algorithms which is differentiated by the choice of an edge selector function, and shows that several algorithms in the literature are equivalent to this lazy algorithm for appropriate choice of this selector.
Abstract: While the shortest path problem has myriad applications, the computational efficiency of suitable algorithms depends intimately on the underlying problem domain. In this paper, we focus on domains where evaluating the edge weight function dominates algorithm running time. Inspired by approaches in robotic motion planning, we define and investigate the Lazy Shortest Path class of algorithms which is differentiated by the choice of an edge selector function. We show that several algorithms in the literature are equivalent to this lazy algorithm for appropriate choice of this selector. Further, we propose various novel selectors inspired by sampling and statistical mechanics, and find that these selectors outperform existing algorithms on a set of example problems.
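A minimal sketch of the Lazy Shortest Path loop described above, assuming a directed networkx graph whose edges carry an optimistic 'est' weight and an expensive true_weight(u, v) oracle; the selector shown (evaluate the first unevaluated edge on the candidate path) is only one of the choices the paper studies:

```python
import networkx as nx

def lazy_shortest_path(G, source, target, true_weight, selector):
    """LazySP-style loop: plan on estimated weights, then evaluate only the
    edges chosen by `selector` until the best candidate path is fully evaluated."""
    evaluated = set()
    while True:
        path = nx.shortest_path(G, source, target, weight='est')
        path_edges = list(zip(path, path[1:]))
        unevaluated = [e for e in path_edges if e not in evaluated]
        if not unevaluated:
            return path                         # candidate path is fully evaluated
        for u, v in selector(unevaluated):
            G[u][v]['est'] = true_weight(u, v)  # replace the estimate with the true cost
            evaluated.add((u, v))

def select_forward(unevaluated_edges):
    """Simplest edge selector: evaluate the first unevaluated edge on the path."""
    return [unevaluated_edges[0]]
```

The whole algorithm family differs only in `selector`; sampling- and statistics-based selectors plug into the same loop.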

59 citations


Proceedings ArticleDOI
01 Aug 2016
TL;DR: A novel chaotic Kbest gravitational search algorithm is proposed that uses a chaotic model in Kbest to balance exploration and exploitation non-linearly; it shows a better convergence rate at later iterations with high precision and does not get trapped in local optima.
Abstract: The gravitational search algorithm is a popular adaptive search algorithm among nature-inspired algorithms and has been successfully used for optimizing many real-world problems. The gravitational search algorithm uses Newton's law of gravity to find the optimal solution. The performance of the gravitational search algorithm is controlled by its exploration and exploitation capabilities, and Kbest is one of the parameters that controls this trade-off. In this paper, a novel chaotic Kbest gravitational search algorithm is proposed that uses a chaotic model in Kbest to balance exploration and exploitation non-linearly. The proposed algorithm shows a better convergence rate at later iterations with high precision and does not get trapped in local optima. The experimental results validate the improved performance of the proposed algorithm.
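The abstract does not spell out the chaotic model, so the following is only a plausible illustration of the idea, not the paper's method: perturb the usual linearly decreasing Kbest schedule of GSA with a logistic-map sequence so that the exploration/exploitation trade-off varies non-linearly over iterations.

```python
def chaotic_kbest_schedule(pop_size, max_iters, x0=0.7, r=4.0):
    """Illustrative (assumed) chaotic Kbest schedule: a logistic map modulates
    the standard linear decrease of Kbest from pop_size down to 1."""
    x, schedule = x0, []
    for t in range(max_iters):
        x = r * x * (1.0 - x)                                          # logistic map in (0, 1)
        linear = pop_size - (pop_size - 1) * t / max(1, max_iters - 1)  # classic GSA schedule
        schedule.append(max(1, round(x * linear)))
    return schedule
```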

54 citations


Journal ArticleDOI
TL;DR: To improve the exploitation ability of the HS algorithm in its later stages and provide a globally optimal solution, a memetic algorithm combining Harmony Search and Random Search is presented to solve the unit commitment problem of electric power systems.

54 citations


Proceedings ArticleDOI
16 May 2016
TL;DR: To enable graph edit similarity computation on larger and distant graphs, CSI_GED is presented, a novel edge-based mapping method for computing graph edit distance through common sub-structure isomorphisms enumeration that outperforms the state-of-the-art indexing-based methods by over two orders of magnitude.
Abstract: Graph similarity is a basic and essential operation in many applications. In this paper, we are interested in computing graph similarity based on edit distance. Existing graph edit distance computation methods adopt the best-first search paradigm A*. These methods are bounded in both time and space. In practice, they can compute the edit distance of graphs containing at most 12 vertices. To enable graph edit similarity computation on larger and distant graphs, we present CSI_GED, a novel edge-based mapping method for computing graph edit distance through common sub-structure isomorphisms enumeration. CSI_GED utilizes backtracking search combined with a number of heuristics to reduce memory requirements and quickly prune away a large portion of the mapping search space. Experiments show that CSI_GED is highly efficient for computing the edit distance on small as well as large and distant graphs. Furthermore, we evaluated CSI_GED as a stand-alone graph edit similarity search query method. The experiments show that CSI_GED is effective and scalable, and outperforms the state-of-the-art indexing-based methods by over two orders of magnitude.

49 citations


Journal ArticleDOI
TL;DR: Empirical results with a large number of randomly generated problem instances involving large part sizes varying from 200 to 500 under different operating conditions are compared with two well-known algorithms in the literature and demonstrate the effectiveness of the proposed cuckoo search algorithm.
Abstract: The paper addresses the problem of 2-machine robotic cell scheduling of a one-unit cycle with sequence-dependent setup times and different loading/unloading times of the parts. As an alternative metaheuristic, the cuckoo search algorithm has recently attracted growing interest from researchers. It has the capability to search globally as well as locally and to converge to the global optimum by exploring the search space more efficiently, due to its global random walk governed by Levy flights rather than a standard isotropic random walk. In this study, a discrete cuckoo search algorithm is proposed to determine the sequence of robot moves along with the sequence of parts so that the cycle time is minimized. In the proposed algorithm, a fractional scaling factor based procedure is presented to determine the step length of the Levy flight distribution in discrete form, and then, using this step length, two neighborhood search techniques, interchange and cyclical shift, are applied to the current solution to obtain improved solutions. A response surface methodology based on a desirability function is used to enhance the convergence speed of the proposed algorithm. Also, a design of experiments is employed to tune the operating parameters of the algorithm. Finally, empirical results with a large number of randomly generated problem instances involving large part sizes varying from 200 to 500 under different operating conditions are compared with two well-known algorithms in the literature and demonstrate the effectiveness of the proposed algorithm.
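For illustration, the Levy-flight step length commonly used in cuckoo search can be drawn with Mantegna's algorithm, and the two neighborhood moves named above are simple permutation operations; the paper's fractional scaling procedure for discretizing the step is not reproduced here.

```python
import math, random

def levy_step(beta=1.5):
    """Mantegna's algorithm for a Levy-distributed step length (standard in
    cuckoo search); `beta` is the stability index."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def interchange(seq, i, j):
    """Swap the elements at positions i and j of a part/move sequence."""
    s = list(seq)
    s[i], s[j] = s[j], s[i]
    return s

def cyclical_shift(seq, k):
    """Rotate the sequence by k positions."""
    k %= len(seq)
    return list(seq[k:]) + list(seq[:k])
```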

44 citations


Proceedings ArticleDOI
14 Jun 2016
TL;DR: The results show that the best algorithm is not only able to generate progressively-refined feasible solutions, but it also finds the optimal solution with at least two orders of magnitude acceleration over the state-of-the-art algorithm, using much less memory.
Abstract: The Group Steiner Tree (GST) problem is a fundamental problem in the database area that has been successfully applied to keyword search in relational databases and team search in social networks. The state-of-the-art algorithm for the GST problem is a parameterized dynamic programming (DP) algorithm, which finds the optimal tree in $O(3^k n + 2^k(n \log n + m))$ time, where k is the number of given groups, and m and n are the numbers of edges and nodes of the graph, respectively. The major limitations of the parameterized DP algorithm are twofold: (i) it is intractable even for very small values of k (e.g., k=8) in large graphs due to its exponential complexity, and (ii) it cannot generate a solution until the algorithm has completed its entire execution. To overcome these limitations, we propose an efficient and progressive GST algorithm in this paper, called PrunedDP. It is based on newly-developed optimal-tree decomposition and conditional tree merging techniques. The proposed algorithm not only drastically reduces the search space of the parameterized DP algorithm, but it also produces progressively-refined feasible solutions during algorithm execution. To further speed up the PrunedDP algorithm, we propose a progressive A*-search algorithm, based on several carefully-designed lower-bounding techniques. We conduct extensive experiments to evaluate our algorithms on several large scale real-world graphs. The results show that our best algorithm is not only able to generate progressively-refined feasible solutions, but it also finds the optimal solution with at least two orders of magnitude acceleration over the state-of-the-art algorithm, using much less memory.
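For context, here is a hedged sketch of the baseline parameterized DP that PrunedDP improves on: Dijkstra-style processing of states (v, S), where S is a subset of groups, with edge-growth and subtree-merge transitions (the merge over subsets is where the $3^k$ factor comes from). This is an illustrative reconstruction, not the paper's PrunedDP or progressive A* variant.

```python
import heapq

def group_steiner_dp(adj, groups):
    """Baseline DP for the Group Steiner Tree problem: cost[(v, S)] is the
    cheapest tree rooted at v that touches every group in subset S.
    adj: {v: [(u, w), ...]} adjacency list; groups: list of vertex iterables."""
    k = len(groups)
    FULL = (1 << k) - 1
    INF = float('inf')
    cost, pq = {}, []
    for g, members in enumerate(groups):            # base case: one group, zero cost
        for v in members:
            if cost.get((v, 1 << g), INF) > 0:
                cost[(v, 1 << g)] = 0
                heapq.heappush(pq, (0, v, 1 << g))
    while pq:
        c, v, S = heapq.heappop(pq)
        if c > cost.get((v, S), INF):
            continue                                # stale queue entry
        if S == FULL:
            return c                                # first full subset popped is optimal
        comp = FULL & ~S
        T = comp
        while T:                                    # merge with a disjoint subtree rooted at v
            other = cost.get((v, T), INF)
            if other < INF and c + other < cost.get((v, S | T), INF):
                cost[(v, S | T)] = c + other
                heapq.heappush(pq, (c + other, v, S | T))
            T = (T - 1) & comp
        for u, w in adj.get(v, []):                 # grow the tree along one edge
            if c + w < cost.get((u, S), INF):
                cost[(u, S)] = c + w
                heapq.heappush(pq, (c + w, u, S))
    return INF
```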

Journal ArticleDOI
TL;DR: Two methods are presented: directed search (DS) descent, which seeks improvements of the given model, and a novel continuation method (DS continuation), which allows searching along the Pareto set of a given MOP.
Abstract: We propose a new iterative search procedure for the numerical treatment of unconstrained multi-objective optimization problems (MOPs) which steers the search along a predefined direction given in objective space. Based on this idea we present two methods: directed search (DS) descent, which seeks improvements of the given model, and a novel continuation method (DS continuation), which allows searching along the Pareto set of a given MOP. One advantage of both methods is that they can be realized with and without gradient information, and if neighborhood information is available the computation of the search direction essentially comes for free. The latter makes our algorithms interesting candidates for local search engines within memetic strategies. Further, the approach can be used to gain some interesting insights into the nature of multi-objective stochastic local search which may explain one facet of the success of multi-objective evolutionary algorithms (MOEAs). Finally, we demonstrate the strength of the method both as a standalone algorithm and as a local search engine within a MOEA.
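As a reference for how the predefined direction steers the search (our reading of the directed search idea, hedged since the abstract gives no formulas): a decision-space direction $\nu$ is sought so that the objective values move along a chosen direction $d$ in objective space,

```latex
J(x)\,\nu = d,
\qquad
J(x) = \begin{pmatrix} \nabla f_1(x)^{\top} \\ \vdots \\ \nabla f_k(x)^{\top} \end{pmatrix},
\qquad
\nu = J(x)^{+} d = J(x)^{\top}\bigl(J(x)J(x)^{\top}\bigr)^{-1} d,
```

where $J(x)^{+}$ is the Moore-Penrose pseudoinverse (the explicit form holds when $J(x)$ has full row rank); gradient-free realizations approximate $J(x)\nu$ with finite differences taken over neighboring samples, which is why the direction can come nearly for free when neighborhood information is available.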

Posted Content
TL;DR: In this article, the authors define and investigate the Lazy Shortest Path class of algorithms, which is differentiated by the choice of an edge selector function, and propose various novel selectors inspired by sampling and statistical mechanics, and find that these selectors outperform existing algorithms on a set of example problems.
Abstract: While the shortest path problem has myriad applications, the computational efficiency of suitable algorithms depends intimately on the underlying problem domain. In this paper, we focus on domains where evaluating the edge weight function dominates algorithm running time. Inspired by approaches in robotic motion planning, we define and investigate the Lazy Shortest Path class of algorithms which is differentiated by the choice of an edge selector function. We show that several algorithms in the literature are equivalent to this lazy algorithm for appropriate choice of this selector. Further, we propose various novel selectors inspired by sampling and statistical mechanics, and find that these selectors outperform existing algorithms on a set of example problems.

Journal ArticleDOI
Shuai Ma, Jia Li, Chunming Hu, Xuelian Lin, Jinpeng Huai
TL;DR: In this article, the authors argue that big graph search is the paradigm filling the gap left by traditional relational and XML models, and give an analysis of graph search from an evolutionary point of view, followed by evidence from both industry and academia.
Abstract: On one hand, compared with traditional relational and XML models, graphs have more expressive power and are widely used today. On the other hand, various applications of social computing trigger the pressing need for a new search paradigm. In this article, we argue that big graph search is the one filling this gap. We first introduce the application of graph search in various scenarios. We then formalize the graph search problem, and give an analysis of graph search from an evolutionary point of view, followed by evidence from both industry and academia. After that, we analyze the difficulties and challenges of big graph search. Finally, we present three classes of techniques towards big graph search: query techniques, data techniques and distributed computing techniques.

Journal ArticleDOI
TL;DR: A quiescently consistent lock-free priority queue based on a multi-dimensional list that guarantees a worst-case search time of O(log N) for a key universe of size N is proposed; it achieves an average of 50 percent speedup over state-of-the-art approaches under high concurrency.
Abstract: The throughput of concurrent priority queues is pivotal to multiprocessor applications such as discrete event simulation, best-first search and task scheduling. Existing lock-free priority queues are mostly based on skiplists, which probabilistically create shortcuts in an ordered list for fast insertion of elements. The use of skiplists eliminates the need for global rebalancing in balanced search trees and ensures logarithmic sequential search time on average, but the worst-case performance is linear with respect to the input size. In this paper, we propose a quiescently consistent lock-free priority queue based on a multi-dimensional list that guarantees a worst-case search time of $\mathcal{O}(\log N)$ for a key universe of size $N$. The novel multi-dimensional list (MDList) is composed of nodes that contain multiple links to child nodes arranged by their dimensionality. The insertion operation works by first injectively mapping the scalar key to a high-dimensional vector, then uniquely locating the target position by using the vector as coordinates. Nodes in MDList are ordered by their coordinate prefixes, and the ordering property of the data structure is readily maintained during insertion without rebalancing or randomization. In our experimental evaluation using a micro-benchmark, our priority queue achieves an average of 50 percent speedup over state-of-the-art approaches under high concurrency.
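A minimal sketch of the injective key-to-coordinate mapping described above: interpreting the scalar key as D digits in a fixed base makes lexicographic order on coordinate prefixes coincide with numeric order on keys (the parameter names are illustrative, not the paper's API).

```python
def key_to_vector(key, dims, base):
    """Injectively map a scalar key in [0, base**dims) to a `dims`-dimensional
    coordinate vector (most significant digit first), so that lexicographic
    order of the vectors equals numeric order of the keys."""
    if not 0 <= key < base ** dims:
        raise ValueError("key outside the supported universe")
    return [(key // base ** d) % base for d in range(dims - 1, -1, -1)]

# Example: an 8-dimensional mapping for a 32-bit key universe (base 16).
# key_to_vector(0x12345678, dims=8, base=16) -> [1, 2, 3, 4, 5, 6, 7, 8]
```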

Journal ArticleDOI
TL;DR: A novel hybrid meta-heuristic optimization approach, called the cuckoo search algorithm with global harmony search (CSGHS), is proposed in this paper to solve 0–1 knapsack problems (KP) more effectively.
Abstract: Cuckoo search (CS) is a novel biologically inspired algorithm and has been widely applied to many fields. Although some binary-coded CS variants have been developed to solve 0–1 knapsack problems, the search accuracy and the convergence speed still need further improvement. Based on an analysis of the shortcomings of the standard CS and the advantages of the global harmony search (GHS), a novel hybrid meta-heuristic optimization approach, called the cuckoo search algorithm with global harmony search (CSGHS), is proposed in this paper to solve 0–1 knapsack problems (KP) more effectively. In CSGHS, it is the combination of the exploration of GHS and the exploitation of CS that makes the CSGHS efficient and effective. The experiments conducted demonstrate that the CSGHS generally outperformed the binary cuckoo search, the binary shuffled frog-leaping algorithm and the binary differential evolution in terms of search accuracy and convergence speed. Therefore, the proposed hybrid algorithm is...

Journal ArticleDOI
TL;DR: The experimental results show that the proposed algorithm can provide a result for the deployment problem that is significantly better, in terms of quality, than those provided by the state-of-the-art metaheuristic algorithms evaluated in this study.

Journal ArticleDOI
TL;DR: The proposed algorithm balances the global exploration of the cuckoo search algorithm and the deep exploitation of the Nelder–Mead method in order to solve integer and minimax optimization problems.
Abstract: The cuckoo search algorithm is a promising metaheuristic population-based method. It has been applied to solve many real-life problems. In this paper, we propose a new cuckoo search algorithm by combining the cuckoo search algorithm with the Nelder–Mead method in order to solve integer and minimax optimization problems. We call the proposed algorithm the hybrid cuckoo search and Nelder–Mead method (HCSNM). HCSNM starts the search by applying the standard cuckoo search for a number of iterations; then the best obtained solution is passed to the Nelder–Mead algorithm as an intensification process in order to accelerate the search and overcome the slow convergence of the standard cuckoo search algorithm. The proposed algorithm balances the global exploration of the cuckoo search algorithm and the deep exploitation of the Nelder–Mead method. We test the HCSNM algorithm on seven integer programming problems and ten minimax problems and compare it against eight algorithms for solving integer programming problems and seven algorithms for solving minimax problems. The experimental results show the efficiency of the proposed algorithm and its ability to solve integer and minimax optimization problems in reasonable time.
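A hedged sketch of the two-phase structure described above, with a plain random-sampling stand-in for the cuckoo-search phase (illustrative only, not the paper's CS implementation) and SciPy's Nelder-Mead for the intensification step:

```python
import numpy as np
from scipy.optimize import minimize

def global_phase(objective, bounds, iters, rng):
    """Stand-in for the cuckoo-search phase: keep the best of `iters`
    uniformly sampled points within the bounds."""
    lo, hi = np.array(bounds, dtype=float).T
    best_x, best_f = None, np.inf
    for _ in range(iters):
        x = rng.uniform(lo, hi)
        f = objective(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x

def hcsnm_style(objective, bounds, global_iters=200, seed=0):
    """Two-phase hybrid in the spirit of HCSNM: a global search phase followed
    by Nelder-Mead intensification started from the incumbent solution."""
    rng = np.random.default_rng(seed)
    x0 = global_phase(objective, bounds, global_iters, rng)
    result = minimize(objective, x0, method='Nelder-Mead')
    return result.x, result.fun

# Example: sphere function in 5 dimensions.
# x_best, f_best = hcsnm_style(lambda x: float(np.sum(x**2)), [(-5, 5)] * 5)
```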

Proceedings ArticleDOI
01 Jul 2016
TL;DR: A classical search technique, variable neighborhood search (VNS), is incorporated into BA as a local search tool, and the experimental results imply that VNBA holds a clear advantage over the basic BA.
Abstract: The bat algorithm (BA) is a recently proposed bio-inspired metaheuristic algorithm inspired by the echolocation of bats in nature. Several experimental results have demonstrated the effectiveness and performance of BA. However, BA may occasionally fail to find the global optimal solution. In this paper, a classical search technique, called variable neighborhood search (VNS), is incorporated into BA as a local search tool. An improved version of BA, namely the variable neighborhood bat algorithm (VNBA), is thus proposed. In VNBA, the classic BA, as a global search tool, searches the whole space globally, and this can significantly shrink the search space. Subsequently, VNS, as a local search tool, is applied to find the final best solution within the small promising area. After that, VNBA is benchmarked on sixteen standard benchmark functions. The experimental results imply that VNBA holds a clear advantage over the basic BA.
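For illustration, a generic continuous VNS local-search routine of the kind grafted onto BA; the neighborhood structures used here (Gaussian perturbations of increasing radius) are assumptions for the sketch, not the paper's definitions.

```python
import numpy as np

def vns_local_search(objective, x, radii=(0.01, 0.05, 0.1), tries=20,
                     max_rounds=200, seed=0):
    """Generic VNS loop: sample within neighborhoods of increasing size and
    restart from the smallest neighborhood whenever an improvement is found."""
    rng = np.random.default_rng(seed)
    best_x = np.asarray(x, dtype=float)
    best_f = objective(best_x)
    k, rounds = 0, 0
    while k < len(radii) and rounds < max_rounds:
        rounds += 1
        improved = False
        for _ in range(tries):
            cand = best_x + rng.normal(0.0, radii[k], size=best_x.shape)
            f = objective(cand)
            if f < best_f:
                best_x, best_f, improved = cand, f, True
                break
        k = 0 if improved else k + 1     # widen the neighborhood only on failure
    return best_x, best_f
```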

Journal ArticleDOI
TL;DR: A novel multiobjective memetic search algorithm (MMSA) is proposed to solve the MOPFSSP with makespan and total flowtime, providing better solutions than several state-of-the-art algorithms.
Abstract: The multiobjective permutation flowshop scheduling problem (MOPFSSP) is one of the most popular machine scheduling problems with extensive engineering relevance to manufacturing systems. There have been many attempts at solving the MOPFSSP using heuristic and meta-heuristic methods, such as evolutionary algorithms. In this paper, a novel multiobjective memetic search algorithm (MMSA) is proposed to solve the MOPFSSP with makespan and total flowtime. First, a problem-specific Nawaz–Enscore–Ham heuristic is used to initialize the population to enhance the quality of the initial solution. Second, a global search embedded with a perturbation operation is used to improve the solutions of the entire population. Then, a single insert-based local search is used to improve each individual, and a further local search strategy is used to find a better solution for any individual not improved by the single insert-based local search. The performance of our proposed algorithm is validated and compared with four state-of-the-art algorithms on a number of benchmark problems. The experimental results show that the proposed MMSA provides better solutions than several state-of-the-art algorithms.
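The Nawaz-Enscore-Ham (NEH) initialization mentioned above is a standard constructive heuristic; here is a minimal single-objective (makespan) sketch for illustration, whereas the paper applies it in a multiobjective setting.

```python
def flowshop_makespan(perm, p):
    """Makespan of a permutation flowshop; p[j][m] is the processing time of
    job j on machine m."""
    machines = len(p[0])
    completion = [0.0] * machines
    for j in perm:
        completion[0] += p[j][0]
        for m in range(1, machines):
            completion[m] = max(completion[m], completion[m - 1]) + p[j][m]
    return completion[-1]

def neh(p):
    """Standard NEH heuristic: order jobs by decreasing total processing time,
    then insert each job at the position minimizing the partial makespan."""
    order = sorted(range(len(p)), key=lambda j: -sum(p[j]))
    seq = []
    for j in order:
        seq = min((seq[:i] + [j] + seq[i:] for i in range(len(seq) + 1)),
                  key=lambda s: flowshop_makespan(s, p))
    return seq, flowshop_makespan(seq, p)
```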

Journal ArticleDOI
TL;DR: A novel hybrid two-stage optimization algorithm integrating an improved backtracking search optimization algorithm (IBSA) with the hp-adaptive Gauss pseudo-spectral methods (hpGPM) is proposed to find a global optimum more quickly and accurately.

Proceedings ArticleDOI
01 Sep 2016
TL;DR: In this paper, a single-phase sequential and adaptive search algorithm is proposed and shown to achieve the best possible targeting rate and error exponent among all adaptive search algorithms.
Abstract: Consider a target search problem on a unit interval where, at any given time, an agent can choose a region to probe for the presence of the target in that region. The measurement noise is assumed to increase with the size of the search region the agent chooses. In this paper, a single-phase sequential and adaptive search algorithm is proposed and shown to achieve the best possible targeting rate and error exponent among all adaptive search algorithms. The proposed algorithm simply applies a low-complexity sorting operation to the posterior of the target and then picks locations with larger posterior until the probability that the search region contains the target is closest to half.
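A minimal sketch of the region-selection rule described above, under the simplifying assumption of a discretized posterior over grid cells:

```python
def choose_search_region(posterior):
    """Pick locations in decreasing order of posterior probability, stopping
    when the accumulated mass is as close to 1/2 as possible."""
    order = sorted(range(len(posterior)), key=lambda i: -posterior[i])
    region, mass = [], 0.0
    for i in order:
        if abs(mass + posterior[i] - 0.5) <= abs(mass - 0.5):
            region.append(i)          # adding this cell moves the mass closer to 1/2
            mass += posterior[i]
        else:
            break
    return region, mass

# Example: posterior over 5 cells.
# choose_search_region([0.4, 0.3, 0.15, 0.1, 0.05]) -> ([0], 0.4)
```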

Journal ArticleDOI
01 Apr 2016
TL;DR: A Preferential Local Search mechanism to further fine-tune the global optimal solutions and an adaptive weight mechanism for combining multiple objectives are introduced; the resulting algorithm produces better solutions when compared with NSGA-II, SPEA2, and traditional memetic algorithms with fixed local search steps.
Abstract: Evolutionary multi-objective optimization algorithms are generally employed to generate Pareto optimal solutions by exploring the search space. To enhance performance, exploration by global search can be complemented with exploitation by combining it with local search. In this paper, we address the issues in integrating local search with global search, such as: how to select individuals for local search; how deep the local search should go; and how to combine multiple objectives into a single objective for local search. We introduce a Preferential Local Search mechanism to further fine-tune the global optimal solutions and an adaptive weight mechanism for combining multiple objectives. These ideas have been integrated into NSGA-II to arrive at a new memetic algorithm for solving multi-objective optimization problems. The proposed algorithm has been applied to a suite of constrained and unconstrained multi-objective benchmark functions. The performance was analyzed by computing different metrics such as Generational distance, Spread, Max spread, and HyperVolume Ratio for the test suite functions. Statistical tests applied to the results suggest that the proposed algorithm outperforms state-of-the-art multi-objective algorithms like NSGA-II and SPEA2. To study the performance of our algorithm on a real-world application, Economic Emission Load Dispatch was also taken up for validation. The performance was studied with the help of measures such as Hypervolume and Set Coverage Metrics. Experimental results substantiate that our algorithm has the capability to solve real-world problems like Economic Emission Load Dispatch and produces better solutions when compared with NSGA-II, SPEA2, and traditional memetic algorithms with fixed local search steps.

Journal ArticleDOI
TL;DR: Results show that deBILS can probe promising neighborhoods for each node of a TAN and offers substantially higher search speed and solution quality than not only ordinary BILS but also the genetic algorithm and scatter search algorithm.
Abstract: This paper studies the optimization problem of the topological active net (TAN), which is often seen in image segmentation and shape modeling. A TAN is a topological structure containing many nodes, whose positions must be optimized while a predefined topology is maintained. TAN optimization is often time-consuming, and even constructing a single solution is hard. Such a problem is usually approached by a “best improvement local search” (BILS) algorithm based on deterministic search (DS), which is inefficient because it spends too much effort on nonpromising probing. In this paper, we propose the use of micro-differential evolution (DE) to replace DS in BILS for improved directional guidance. The resultant algorithm is termed deBILS. Its micro-population efficiently utilizes historical information for potentially promising search directions and hence improves efficiency in probing. Results show that deBILS can probe promising neighborhoods for each node of a TAN. Experimental tests verify that deBILS offers substantially higher search speed and solution quality than not only ordinary BILS but also the genetic algorithm and scatter search algorithm.

Journal ArticleDOI
TL;DR: A novel memetic algorithm (MA) called cooperative particle swarm optimizer-modified harmony search (CPSO-MHS) is proposed in this paper, where the CPSO is used for local search and the MHS for global search, and good performance is demonstrated in solving multimodal nonseparable problems.
Abstract: Avoiding entrapment in local optima is a big challenge, especially when facing high-dimensional nonseparable problems where the interdependencies among vector elements are unknown. In order to improve the performance of the optimization algorithm, a novel memetic algorithm (MA) called cooperative particle swarm optimizer-modified harmony search (CPSO-MHS) is proposed in this paper, where the CPSO is used for local search and the MHS for global search. The CPSO, as a local search method, uses 1-D swarms to search each dimension separately and thus converges fast. Besides, it can obtain global optimum elements according to our experimental results and analyses. MHS implements the global search by recombining different vector elements and extracting global optimum elements. The interaction between local search and global search creates a set of local search zones, where global optimum elements reside within the search space. The CPSO-MHS algorithm is tested and compared with seven other optimization algorithms on a set of 28 standard benchmarks. Meanwhile, some MAs are also compared according to the results derived directly from their corresponding references. The experimental results demonstrate the good performance of the proposed CPSO-MHS algorithm in solving multimodal nonseparable problems.

Journal ArticleDOI
TL;DR: This algorithm involves three stages, namely greedy selection, local improvement, and randomized improvement; it uses a heuristic approach to construct solutions based on an improved scoring rule and the least-waste-first strategy.

Proceedings ArticleDOI
07 Jul 2016
TL;DR: The study of selective search is extended using a fine-grained simulation investigating selective search efficiency in a parallel query processing environment; the difference in efficiency when term-based and sample-based resource selection algorithms are used; and the effect of two policies for assigning index shards to machines.
Abstract: Simulation and analysis have shown that selective search can reduce the cost of large-scale distributed information retrieval. By partitioning the collection into small topical shards, and then using a resource ranking algorithm to choose a subset of shards to search for each query, fewer postings are evaluated. Here we extend the study of selective search using a fine-grained simulation investigating: selective search efficiency in a parallel query processing environment; the difference in efficiency when term-based and sample-based resource selection algorithms are used; and the effect of two policies for assigning index shards to machines. Results obtained for two large datasets and four large query logs confirm that selective search is significantly more efficient than conventional distributed search. In particular, we show that selective search is capable of both higher throughput and lower latency in a parallel environment than is exhaustive search.
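As one concrete example of the sample-based resource selection the study compares against term-based selection, a ReDDE-style scorer can be sketched as follows (the names and weighting are illustrative assumptions, not the paper's implementation):

```python
from collections import defaultdict

def sample_based_shard_ranking(sample_results, doc_to_shard,
                               shard_weights=None, top_n=100):
    """Score shards by how many of the central-sample top results they own
    (a ReDDE-style sample-based resource selection, sketched for illustration).
    sample_results: ranked doc ids from the sample index; doc_to_shard: mapping
    from sampled doc id to its source shard; shard_weights: optional scale
    factors, e.g. shard_size / sample_size per shard."""
    scores = defaultdict(float)
    for doc_id in sample_results[:top_n]:
        shard = doc_to_shard[doc_id]
        scores[shard] += (shard_weights or {}).get(shard, 1.0)
    return sorted(scores.items(), key=lambda kv: -kv[1])
```

Only the top-ranked shards from this list are then searched, which is where the throughput and latency gains reported above come from.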

Journal ArticleDOI
Xu Zhou, Yanheng Liu, Bin Li
TL;DR: The experimental results show that the proposed multi-objective discrete cuckoo search algorithm with local search (MDCL) for community detection performs better than other algorithms and can discover higher-quality community structures without prior information.
Abstract: Detecting communities is a challenging task in analyzing networks. Solving the community detection problem by evolutionary algorithms has been a heated topic in recent years. In this paper, a multi-objective discrete cuckoo search algorithm with local search (MDCL) for community detection is proposed. To the best of our knowledge, this is the first time the cuckoo search algorithm has been applied to community detection. Two objective functions, termed negative ratio association and ratio cut, are to be minimized. These two functions can break through the modularity limitation. In the proposed algorithm, the nest location updating strategy and the abandon operator of the cuckoo are redefined in discrete form. A local search strategy and a clone operator are proposed to obtain the optimal initial population. The experimental results on synthetic and real-world networks show that the proposed algorithm performs better than other algorithms and can discover higher-quality community structures without prior information.
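For reference, the two objectives named above are usually defined as follows in the community-detection literature (standard forms included as a hedged reminder, not necessarily the paper's exact notation): for a partition $C_1,\dots,C_m$ of the node set,

```latex
\text{NRA} = -\sum_{i=1}^{m} \frac{L(C_i, C_i)}{|C_i|},
\qquad
\text{RC} = \sum_{i=1}^{m} \frac{L(C_i, \overline{C_i})}{|C_i|},
```

where $L(A,B)$ sums the edge weights between node sets $A$ and $B$ and $\overline{C_i}$ is the complement of $C_i$. Minimizing the negative ratio association favors internally dense communities, while minimizing the ratio cut favors weak inter-community connections, which is why optimizing both together sidesteps the modularity limitation mentioned above.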

Proceedings ArticleDOI
16 May 2016
TL;DR: A*-Connect uses a fast approximation of the classic front-to-front heuristic from the literature to lead the forward and backward searches towards each other, while retaining theoretical guarantees on completeness and bounded suboptimality.
Abstract: The benefits of bidirectional planning over the unidirectional version are well established for motion planning in high-dimensional configuration spaces. While bidirectional approaches have been employed with great success in the context of sampling-based planners such as in RRT-Connect, they have not enjoyed popularity amongst search-based methods such as A*. The systematic nature of search-based algorithms, which often leads to consistent and high-quality paths, also enforces strict conditions for the connection of forward and backward searches. Admissible heuristics for the connection of forward and backward searches have been developed, but their computational complexity is a deterrent. In this work, we leverage recent advances in search with inadmissible heuristics to develop an algorithm called A*-Connect, much in the spirit of RRT-Connect. A*-Connect uses a fast approximation of the classic front-to-front heuristic from the literature to lead the forward and backward searches towards each other, while retaining theoretical guarantees on completeness and bounded suboptimality. We validate A*-Connect on manipulation as well as navigation domains, comparing with popular sampling-based methods as well as state-of-the-art bidirectional search algorithms. Our results indicate that A*-Connect can provide several times speedup over unidirectional search while maintaining high solution quality.

Journal ArticleDOI
TL;DR: It is observed that the proposed vector tabu search method outperforms its ancestors in both convergence performance and solution quality.
Abstract: A vector tabu search algorithm encapsulating a new updating mechanism for current states and a directed search phase is proposed to enhance its ability to search for Pareto-optimal solutions. The new updating mechanism quantitatively considers both the number of improved objectives and the amount of improvement in a specified objective of a multiobjective design problem. The directed search phase uses some desired directions, a priori knowledge about the objective space, as the moving direction to efficiently find improved solutions without any gradient computation procedure. The numerical results on both high- and low-frequency inverse problems are reported to demonstrate the pros and cons of the proposed algorithm. It is observed that the proposed vector tabu search method outperforms its ancestors in both convergence performance and solution quality.